Journal of Systemics, Cybernetics and Informatics

 ISSN: 1690-4524 (Online)



TABLE OF CONTENTS





A Strategy for Improving Dynamic Composability: Ontology-driven Introspective Agent Architectures
Levent Yilmaz
Pages: 1-9
ABSTRACT:
Seamless composability of disparate simulations within a system-of-systems context is challenging. Large, complex simulations must respond to changing technology, environments, and objectives. The problem is exacerbated when dynamic extensibility and adaptability are significant concerns in an application domain. Simulations that are dynamically extensible as well as composable require principled designs that facilitate engineering interoperability and composability in the first place. The role of ontologies and their axiomatization is discussed as a potential strategy for improving dynamic composability.


Researches on Application of GPS to Earthquake Monitoring and Prediction
Wanju BO, Liangqian GUO, Guohua YANG, Xuesong DU
Pages: 10-15
ABSTRACT:
The earliest research on the application of GPS to earthquake monitoring and prediction in China began in the 1980s, and it was limited to learning the relevant technology from other countries and running tests with a small amount of equipment. As data-processing software improved and hardware prices fell, several local GPS networks were gradually set up by the end of the 1990s, after which more systematic GPS monitoring, data processing, and application research were carried out. In this paper, three examples of the application of GPS to earthquake monitoring and prediction are presented.


WiMAX Experimentation and Verification in Field in Italy
Jonathan Buschmann, Francesco Vallone, Paris Dettorre, Andrea Valle, Gianluca Allegrezza
Pages: 16-20
ABSTRACT:
Under the auspices of the Italian Communications Ministry, in a trial program directed by the Ugo Bordoni Foundation, Ericsson performed extensive field tests to verify and measure the performance of IEEE 802.16-2004 compliant equipment in the 3.5 GHz band. The trial was strictly technological in nature, with the goals of validating the claimed technical characteristics of this technology and determining the behavior of a point-to-multipoint technology in this frequency band. The experimentation was carried out with equipment from Airspan, a global partner of Ericsson. Measurements were made at both the radio bearer layer and the packet data layer. At the radio layer, coverage and propagation were measured and characterized. Studies are under way using these results to develop a propagation model of the 802.16-2004 signal in the 3.5 GHz band. In addition, inter-cell interference was measured. At the packet layer, throughput, packet loss, and jitter were measured under varying field conditions, including various types of terrain and differing levels of clutter. Standard service functionality and performance were verified and measured with particular attention to the QoS support defined in the 802.16-2004 standard.


Direct Tests on Individual Behaviour in Small Decision-Making Problems
Takemi Fujikawa
Pages: 21-26
ABSTRACT:
This paper provides an empirical and experimental analysis of individual decision making in small decision-making problems through a series of laboratory experiments. Two experimental treatments with binary small decision-making problems are implemented: (1) the search treatment, in which the payoff distribution is unknown to the decision makers, and (2) the choice treatment, in which the payoff distribution is known. A first observation is that in the search treatment the tendency to select the best reply to past performance, together with misestimation of the payoff distribution, can lead to robust deviations from expected-value maximisation. A second observation concerns choice problems with two options of the same expected value: one option is riskier, with larger payoff variability; the other is moderate, with smaller payoff variability. The experimental results show that the more often the decision makers choose the risky option, the higher the points they tend to achieve ex post. Finally, I investigate the exploration tendency. A comparison of results between the search treatment and the choice treatment reveals that the additional information available to the decision makers enhances expected-value maximisation.
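To make the two treatments concrete, the following minimal Python sketch (not the paper's code; the payoff numbers are invented so that both options share the same expected value) contrasts a search-treatment agent that best-replies to observed sample means with a choice-treatment agent that knows the distribution and maximises expected value:

import random
from statistics import mean

def payoff(option):
    # Assumed payoffs (illustrative only): both options have expected value 3.2.
    if option == "risky":
        return 32 if random.random() < 0.1 else 0   # rare large payoff, high variance
    return 3.2                                      # safe: pays its expected value

def search_treatment(trials=100):
    # Distribution unknown: best-reply to past sample means. The rare large
    # payoff is easily underestimated, so the agent can drift to "safe".
    history = {"risky": [payoff("risky")], "safe": [payoff("safe")]}
    total = 0.0
    for _ in range(trials):
        choice = max(history, key=lambda o: mean(history[o]))
        reward = payoff(choice)
        history[choice].append(reward)
        total += reward
    return total

def choice_treatment(trials=100):
    # Distribution known: an expected-value maximiser is indifferent here,
    # so take the risky option throughout for comparison.
    return sum(payoff("risky") for _ in range(trials))

random.seed(1)
print(search_treatment(), choice_treatment())

Under these assumed payoffs, the search-treatment agent's reliance on small samples is what produces the deviation from expected-value maximisation described in the abstract.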


MetaBlast! Virtual Cell: A Pedagogical Convergence between Game Design and Science Education
Anson Call, Steven Herrnstadt, Eve Wurtele, Julie Dickerson, Diane Bassham
Pages: 27-31
ABSTRACT:
Virtual Cell is a game-design solution to a specific scientific and educational problem: namely, how to make advanced, university-level plant biology instruction at the molecular and anatomical levels an exciting, efficient learning experience. The advanced technologies of 3D modeling and animation, computer programming, and game design are united and tempered with strong scientific guidance for accuracy and with art direction, yielding a powerful visual and audio simulation. The strength of intense gaming as a powerful tool for aiding memory, logic, and problem solving has recently become well recognized. Virtual Cell will provide a unique gaming experience while transparently teaching scientifically accurate facts and concepts about, in this case, a soybean plant’s inner workings and dependent mechanisms on multiple scales and levels of complexity. Virtual Cell (hereafter referred to as VC) may in the future prove to be a reference for other scientific/educational endeavors as scientists compete for a more prominent mind share among average citizens. This paper discusses the difficulties of developing VC, its structure, and its intended game and educational goals, along with additional benefits to both the sciences and the gaming industry.


Sweet Spot Control of 1:2 Array Antenna using A Modified Genetic Algorithm
Kyo-Hwan HYUN, Kyung-Kwon JUNG, Ki-Hwan EOM
Pages: 32-37
ABSTRACT:
This paper presents a novel scheme that quickly searches for the sweet spot of a 1:2 antenna array and locks on to it for high-speed millimeter-wave transmission when communication with another antenna array is disconnected. The proposed method utilizes a modified genetic algorithm, which selects a superior initial population through preprocessing in order to avoid the local-optimum problem of a plain genetic algorithm. TDD (Time Division Duplex) is utilized as the transfer method and as the data controller for the antenna. Once the initial communication is completed for the specified number of individuals, no antenna data are transmitted until each station runs the GA to produce the next generation. After reproduction, the individuals of the next generation become the data, and communication between the stations resumes. Simulation results for 1:1 and 1:2 antenna arrays and experimental results for a 1:1 antenna array confirmed the efficiency of the proposed method. Genes of 8 bits, 16 bits, and 16-bit split were tested; the 16-bit split gene performs similarly to the 16-bit gene, while the antenna gene itself is 8 bits.
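The following minimal Python sketch (not the authors' implementation; the fitness function, sweet-spot value, and GA settings are assumptions) illustrates the key modification described above: oversampling random candidates and keeping only the fittest as the initial population before running an otherwise standard GA over an 8-bit gene:

import random

GENE_BITS = 8
SWEET_SPOT = 0b10110101  # hypothetical optimum antenna setting

def fitness(gene):
    # Stand-in for measured link quality: closer to the sweet spot is better.
    return GENE_BITS - bin(gene ^ SWEET_SPOT).count("1")

def preprocessed_population(size, oversample=4):
    # The modification: draw size*oversample random genes, keep the best `size`.
    pool = [random.getrandbits(GENE_BITS) for _ in range(size * oversample)]
    return sorted(pool, key=fitness, reverse=True)[:size]

def crossover(a, b):
    point = random.randint(1, GENE_BITS - 1)
    mask = (1 << point) - 1
    return (a & mask) | (b & ~mask)

def mutate(gene, rate=0.05):
    for bit in range(GENE_BITS):
        if random.random() < rate:
            gene ^= 1 << bit
    return gene

def run_ga(pop_size=8, generations=20):
    pop = preprocessed_population(pop_size)
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]
        children = [mutate(crossover(*random.sample(parents, 2)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=fitness)

random.seed(0)
print(f"best gene: {run_ga():08b}")

A better-than-random initial population shrinks the number of generations, and hence the number of TDD exchanges, needed to lock on.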


DAIDS: a Distributed, Agent-based Information Dissemination System
Pete Haglich, Mike Kopack, David Van Brackle
Pages: 38-45
ABSTRACT:
The Distributed Agent-Based Information Dissemination System (DAIDS) concept was motivated by the need to share information among the members of a military tactical team in an atmosphere of extremely limited or intermittent bandwidth. The DAIDS approach recognizes that in many cases communications limitations will preclude the complete sharing of all tactical information between the members of the tactical team. Communications may be limited by obstructions to the line of sight between platforms, electronic warfare, environmental conditions, or simply contention from other users of the bandwidth. Since it may not be possible to achieve a complete information exchange, it is important to prioritize transmissions so that the most critical information from the standpoint of the recipient is disseminated first. The challenge is to determine which elements of information are the most important to each teammate. The key innovation of the DAIDS concept is the use of software proxy agents to represent the information needs of the recipient. The DAIDS approach uses these proxy agents to evaluate the content of a message according to the context and information needs of the recipient platform (the agent’s principal) and to prioritize the message for dissemination. In our research we implemented this approach and demonstrated that it reduces transmission times for critical tactical reports by up to a factor of 30 under severe bandwidth limitations.
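A minimal Python sketch of the proxy-agent idea (not the DAIDS implementation; the message fields, topics, and weights are invented): an agent on the sending platform scores each queued message against its principal's declared information needs, and the sender transmits in descending priority order:

from dataclasses import dataclass

@dataclass
class Message:
    topic: str
    region: str
    payload: str

@dataclass
class ProxyAgent:
    # Stands in for one recipient (the agent's principal) on the sender's side.
    principal: str
    topic_weights: dict    # e.g. {"enemy_contact": 1.0, "logistics": 0.2}
    area_of_interest: set  # regions the principal currently cares about

    def priority(self, msg: Message) -> float:
        score = self.topic_weights.get(msg.topic, 0.1)
        if msg.region in self.area_of_interest:
            score *= 2.0   # reports inside the principal's area matter more
        return score

def dissemination_order(agent: ProxyAgent, queue: list) -> list:
    # Highest-priority messages are sent first under limited bandwidth.
    return sorted(queue, key=agent.priority, reverse=True)

agent = ProxyAgent("unit_b", {"enemy_contact": 1.0, "logistics": 0.2}, {"sector_4"})
queue = [Message("logistics", "sector_9", "fuel state"),
         Message("enemy_contact", "sector_4", "contact report")]
for m in dissemination_order(agent, queue):
    print(m.topic, m.region)

The design point is that the scoring runs on the sender, so each recipient's needs shape the transmission order without any extra round trips over the constrained link.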



Nonparametric Comparison of Two Dynamic Parameter Setting Methods in a Meta-Heuristic Approach
Seyhun HEPDOGAN, Reinaldo Moraga, Gail DePuy, Gary Whitehouse
Pages: 46-52
ABSTRACT:
Meta-heuristics are commonly used to solve combinatorial problems in practice. Many approaches provide very good quality solutions in a short amount of computational time; however, most meta-heuristics use parameters to tune their performance for particular problems, and selecting these parameters before solving the problem can require considerable time. This paper investigates the problem of setting parameters using a typical meta-heuristic called Meta-RaPS (Metaheuristic for Randomized Priority Search). Meta-RaPS is a promising meta-heuristic optimization method that has been applied to different types of combinatorial optimization problems and has achieved very good performance compared to other meta-heuristic techniques. To solve a combinatorial problem, Meta-RaPS uses two well-defined stages at each iteration: construction and local search. After a number of iterations, the best solution is reported. Meta-RaPS performance depends on the fine-tuning of two main parameters, the priority percentage and the restriction percentage, which are used during the construction stage. This paper presents two different dynamic parameter-setting methods for Meta-RaPS, which tune the parameters while a solution is being found. To compare the two approaches, nonparametric statistical methods are used, since the solutions are not normally distributed. Results from both dynamic parameter-setting methods are reported.
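To illustrate how the two parameters interact in the construction stage, here is a minimal Python sketch on a 0-1 knapsack instance (not the authors' code; the value/weight priority rule, instance data, and parameter values are assumptions). p is the priority percentage (chance of greedily taking the best item) and r is the restriction percentage (width of the candidate list otherwise):

import random

def construct(items, capacity, p=0.6, r=0.2):
    """items: list of (value, weight). Returns a feasible solution (item indices)."""
    remaining = list(range(len(items)))
    load, solution = 0, []
    while True:
        feasible = [i for i in remaining if load + items[i][1] <= capacity]
        if not feasible:
            break
        # Priority rule: value-to-weight ratio, best first.
        feasible.sort(key=lambda i: items[i][0] / items[i][1], reverse=True)
        if random.random() < p:
            pick = feasible[0]                       # take the top-priority item
        else:
            best = items[feasible[0]][0] / items[feasible[0]][1]
            cand = [i for i in feasible
                    if items[i][0] / items[i][1] >= (1 - r) * best]
            pick = random.choice(cand)               # restricted candidate list
        solution.append(pick)
        load += items[pick][1]
        remaining.remove(pick)
    return solution

random.seed(0)
items = [(60, 10), (100, 20), (120, 30), (40, 15)]
print(construct(items, capacity=50))

A dynamic parameter-setting variant, in the spirit of the paper, would adjust p and r between iterations based on the quality of recent solutions rather than fixing them in advance.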


Security Modeling on the Supply Chain Networks
Marn-Ling Shing, Chen-Chi Shing, Kuo Lane Chen, Huei Lee
Pages: 53-58
ABSTRACT:
In order to keep the price down, a purchaser sends out requests for quotation to a group of suppliers in a supply chain network. The purchaser then chooses the supplier with the best combination of price and quality. A potential supplier will try to collect information about the other suppliers so that he or she can offer the best bid to the purchaser. Confidentiality therefore becomes an important consideration in the design of a supply chain network. Chen et al. have proposed applying the Bell-LaPadula model to the design of a secured supply chain network. In the Bell-LaPadula model, a subject can have one of several security clearances and an object one of several security classifications. All possible (security clearance, classification) pairs in the Bell-LaPadula model can be thought of as distinct states in a Markov chain model. This paper extends the work of Chen et al., provides more details on the Markov chain model, and illustrates how to use it to monitor security state transitions in the supply chain network.
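A minimal Python sketch of the state construction (the transition probabilities below are randomly generated placeholders, not the paper's model): each Markov state is a (clearance, classification) pair, a row-stochastic matrix P gives the transition probabilities between security states, and the stationary distribution indicates where the network spends its time in the long run:

import numpy as np

levels = ["unclassified", "confidential", "secret"]
states = [(clr, cls) for clr in levels for cls in levels]  # 9 Markov states
n = len(states)

rng = np.random.default_rng(0)
P = rng.random((n, n))
P /= P.sum(axis=1, keepdims=True)   # each row must sum to 1

# Stationary distribution pi solves pi P = pi: the left eigenvector of P
# for eigenvalue 1, normalised to sum to 1.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
pi /= pi.sum()

for state, prob in zip(states, pi):
    print(state, round(float(prob), 3))

Monitoring then amounts to estimating P from observed transitions and flagging movement toward states that violate the Bell-LaPadula "no read up, no write down" rules.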


Generation Methods for Multidimensional Knapsack Problems and their Implications
Raymond R. Hill, Chaitr Hiremath
Pages: 59-64
ABSTRACT:
Although a variety of heuristics have been developed and applied to variants of the binary knapsack problem, the testing of these heuristics is often based on poorly defined test problems. This paper reviews the various types of knapsack problems, considers how test problems have been generated, and shows via empirical results the implications of using poorly formed test problems for empirical testing.
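For context, the following Python sketch shows two common ways of generating multidimensional knapsack test instances (the parameter ranges and the capacity tightness ratio are assumptions, not the paper's specification). Uncorrelated instances draw values and weights independently; correlated instances tie an item's value to its weights, which generally yields harder instances:

import random

def uncorrelated(n_items, n_constraints, lo=1, hi=1000):
    weights = [[random.randint(lo, hi) for _ in range(n_items)]
               for _ in range(n_constraints)]
    values = [random.randint(lo, hi) for _ in range(n_items)]
    return values, weights

def correlated(n_items, n_constraints, lo=1, hi=1000, noise=100):
    weights = [[random.randint(lo, hi) for _ in range(n_items)]
               for _ in range(n_constraints)]
    # Value of item j is its average weight plus bounded noise.
    values = [sum(w[j] for w in weights) // n_constraints
              + random.randint(0, noise) for j in range(n_items)]
    return values, weights

def capacities(weights, tightness=0.5):
    # Each constraint's capacity is a fixed fraction of its total weight.
    return [int(tightness * sum(row)) for row in weights]

random.seed(0)
vals, wts = correlated(n_items=10, n_constraints=3)
print(vals, capacities(wts))

Which generator is used, and how capacities are set, strongly affects how hard the instances are, which is why heuristic results on poorly formed test problems can mislead.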


Applying the Levels of Conceptual Interoperability Model in Support of Integratability, Interoperability, and Composability for System-of-Systems Engineering
Andreas Tolk, Saikou Diallo, Charles Turnitsa
Pages: 65-74
ABSTRACT:
The Levels of Conceptual Interoperability Model (LCIM) was developed to cope with the different layers of interoperation among modeling and simulation applications. It introduced the technical, syntactic, semantic, pragmatic, dynamic, and conceptual layers of interoperation and showed how they relate to the ideas of integratability, interoperability, and composability. The model has been successfully applied in various domains of systems, cybernetics, and informatics.


Nano Indentation Inspection of the Mechanical Properties of Gold Nitride Thin Films
Armen Verdyan, Ya. M. Soifer, Jacob Azoulay, Maurizio Martino, A. P. Caricato, T. Tunno, Francesco Romano, D. Valerini
Pages: 75-80
ABSTRACT:
The morphology and the local mechanical properties of gold nitride thin films were studied by atomic force microscopy (AFM). Gold nitride films were deposited for the first time on a silicon substrate, without any buffer layer, at room temperature by reactive pulsed laser ablation deposition (RPLD). The films were fabricated on (100) Si wafers with the RPLD technique, in which a KrF excimer laser was used to ablate a gold target in an N2 atmosphere (0.1 Pa-100 Pa) at ambient temperature. Scanning electron microscopy (SEM) and atomic force microscopy inspections showed that the films were flat, with rms roughness in the range of 3.6 nm-35.1 nm depending on the deposition pressure. Rutherford backscattering spectrometry (RBS) and energy dispersion spectroscopy (EDS), used to determine the nitrogen concentration in the films, revealed a composition close to Au3N. The film


Obstacle of Team Teaching and Collaborative Learning in Information Security
Marn-Ling Shing, Chen-Chi Shing, Kuo Lane Chen, Huei Lee
Pages: 81-86
ABSTRACT:
The field of information security includes diverse, highly technical topics such as network security and computer forensics. In addition, information forensics requires a background in criminology, and information security also includes non-technical content such as information ethics and security law. Because of this diverse nature, Shing et al. have proposed the use of team teaching and collaborative learning for information security classes. Although team teaching appears effective for information security, in practice it faces several challenges. The Purdue case mentioned in Shing's paper had funding support from the National Security Agency (NSA); such resources may not be available to an instructor at a typical university. In addition, many of the obstacles are administrative. For example, how are teaching evaluations computed when there are multiple instructors for a single course? How will the instructors of a computer forensics class prepare students with diverse backgrounds (criminal justice majors and information technology majors) before they take the same class? The paper surveyed approximately 25 students at a university in Virginia concerning their satisfaction with team teaching. Finally, it describes ways to meet these challenges.