A Dynamic Model for the Hepatitis B Virus Infection Changjiang Long, Huan Qi, Sheng-He Huang Pages: 1-5
ABSTRACT: According to the pathogenesis of hepatitis B, a mathematical model describing the relationship between the hepatitis B virus (HBV) and the cellular immune response to the infection is built on Nowak’s population-dynamics model of immune responses to persistent viruses. The model has two possible equilibrium states: complete recovery (HBV is eliminated from the body entirely) and a state in which uninfected and infected hepatocytes coexist. The stability condition of each equilibrium point is discussed. Different sets of parameters satisfying the different conditions are used in simulation, and the results show that the model can reproduce the wide variety of clinical manifestations of infection: acute hepatitis, fulminant hepatitis, acute hepatitis turning chronic, chronic hepatitis without an acute phase, recurring hepatitis, and so on. Both immunomics and infectomics may be involved in the underlying mechanisms. The model suggests that a rapid and vigorous CTL response is required for resolution of HBV infection.
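To make the dynamics concrete, here is a minimal Python sketch of Nowak’s basic virus/CTL population-dynamics model on which the paper builds; the equations follow the standard Nowak–Bangham form, and all parameter values and initial conditions are illustrative assumptions, not the authors’ fitted HBV model:

    import numpy as np
    from scipy.integrate import odeint

    def nowak_ctl(state, t, lam, d, beta, a, p, k, u, c, b):
        # x: uninfected hepatocytes, y: infected hepatocytes,
        # v: free virus, z: CTLs (cytotoxic T lymphocytes)
        x, y, v, z = state
        dx = lam - d * x - beta * x * v          # production, death, infection
        dy = beta * x * v - a * y - p * y * z    # infection, death, CTL killing
        dv = k * y - u * v                       # virion production and clearance
        dz = c * y * z - b * z                   # CTL expansion and decay
        return [dx, dy, dv, dz]

    t = np.linspace(0, 500, 5000)
    sol = odeint(nowak_ctl, [1e3, 0.0, 1.0, 1.0], t,
                 args=(10.0, 0.01, 5e-4, 0.5, 1.0, 50.0, 5.0, 0.05, 0.5))

Depending on the parameters, trajectories settle either into the virus-free equilibrium or into the coexistence equilibrium, mirroring the two equilibrium states discussed in the abstract.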
A Novel Planar Microstrip Antenna Design for UHF RFID Madhuri Eunni, Mutharasu Sivakumar, Daniel Deavours Pages: 6-10
ABSTRACT: Passive UHF RFID tags generally do not work well near metal or water. Microstrip antennas offer a potential solution, but suffer from manufacturing complexity because of the need for a via or some other reference to ground. We present a new antenna and matching-circuit design using a balanced feed that eliminates any reference to ground and thus simplifies the antenna’s construction.
An Ad Hoc Adaptive Hashing Technique for Non-Uniformly Distributed IP Address Lookup in Computer Networks Christopher Martinez, Wei-Ming Lin Pages: 11-17
ABSTRACT: Hashing algorithms have long been widely adopted to design fast address look-up processes, which involve a search through a large database to find the record associated with a given key. Hashing algorithms transform a key inside each target datum into a hash value, in the hope that the hashing renders the database uniformly distributed with respect to this new hash value. The closer the final distribution is to uniform, the less search time is required when a query is made. When the database is already key-wise uniformly distributed, any regular hashing algorithm, such as bit extraction or bit-group XOR, easily leads to a statistically near-perfect uniform distribution after hashing. On the other hand, if records in the database are not uniformly distributed, as in almost all known practical applications, then different regular hash functions lead to very different performance. When the target database has a key with a highly skewed distribution, the performance delivered by regular hashing algorithms usually falls far short of what is desired. This paper aims at designing a hashing algorithm that achieves the highest probability of producing a uniformly distributed hash result from a non-uniformly distributed database. An analytical pre-process on the original database is first performed to extract critical information that significantly benefits the design of a better hashing algorithm. This process includes sorting the bits of the key to prioritize their use in the XOR hashing sequence, in simple bit extraction, or in a combination of both. Such an ad hoc hash design is critical for adapting to real-time situations in which a changing (and/or expanding) database has an irregular, non-uniform distribution. Simulation results show significant improvement on both randomly generated and real data.
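The bit-prioritization idea can be sketched as follows: score each key bit by how evenly it splits the database (its entropy), then build the hash index from the highest-scoring bits by XOR-folding. This is only an illustrative sketch under assumed details; the entropy scoring rule and the round-robin fold order are our choices, not the authors’ exact procedure:

    import math
    from collections import Counter

    def bit_entropy(keys, bit):
        # Entropy of one key bit: 1.0 for a perfect 50/50 split, 0.0 if constant.
        p = sum((k >> bit) & 1 for k in keys) / len(keys)
        if p in (0.0, 1.0):
            return 0.0
        return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

    def adaptive_xor_hash(keys, key_bits=32, index_bits=10):
        # Rank bits by entropy, then deal them round-robin into index slots.
        ranked = sorted(range(key_bits), key=lambda b: -bit_entropy(keys, b))
        groups = [ranked[i::index_bits] for i in range(index_bits)]
        def h(key):
            idx = 0
            for i, grp in enumerate(groups):
                bit = 0
                for b in grp:                 # XOR-fold the bits assigned to slot i
                    bit ^= (key >> b) & 1
                idx |= bit << i
            return idx
        return h

    keys = [hash("10.0.%d.%d" % (i // 256, i % 256)) & 0xFFFFFFFF
            for i in range(5000)]
    h = adaptive_xor_hash(keys)
    bucket_sizes = Counter(h(k) for k in keys)   # flatter is better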
An Adaptive Process Allocation Scheme in Grid Environment Tibor Gyires Pages: 18-26
ABSTRACT: The Grid is an interconnected set of distributed compute servers. An application running on the Grid consists of processes, which can be executed in parallel or sequentially. An application can specify application-level and network-level Quality of Service parameters, including the number of processors, memory, special software, network bandwidth, delay, jitter, packet loss, etc. We investigate the question of which processes should be allocated to which compute servers so that the allocation collectively satisfies the application’s resource requirements while optimizing performance and cost. We describe a protocol that identifies those compute servers that can execute the application with minimal cost and provide the required application-level and network-level Quality of Service.
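As a toy illustration of the allocation question (not the authors’ protocol), a greedy allocator might assign each process to the cheapest server that still satisfies its resource requirements; all field names and the cost model here are assumptions:

    def allocate(processes, servers):
        # processes: list of dicts like {"cpu": 2, "mem": 4}
        # servers:   list of dicts like {"cpu": 8, "mem": 16, "cost": 3.0}
        assignment = {}
        for i, proc in enumerate(processes):
            feasible = [s for s in servers
                        if s["cpu"] >= proc["cpu"] and s["mem"] >= proc["mem"]]
            if not feasible:
                raise RuntimeError("no server satisfies process %d" % i)
            best = min(feasible, key=lambda s: s["cost"])
            best["cpu"] -= proc["cpu"]    # reserve the resources
            best["mem"] -= proc["mem"]
            assignment[i] = best
        return assignment

A real Grid scheduler would also have to honor the network-level parameters (bandwidth, delay, jitter, loss) between the chosen servers, which is what makes the problem hard.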
Comparison of Various Feature Decorrelation Techniques in Automatic Speech Recognition J.V. Psutka, Ludek Muller Pages: 27-30
ABSTRACT: The design of an optimum front-end module for an automatic speech recognition system is still a major effort of many research teams all over the world. This paper aims to contribute to these discussions. It focuses on feature decorrelation techniques based on the Maximum Likelihood Linear Transform (MLLT) applied at different levels of matrix clustering. A comparison of MLLT with other decorrelation techniques is also discussed.
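For readers unfamiliar with MLLT, its objective can be sketched numerically: find a square transform A maximizing N * log|det A| - 0.5 * sum_j N_j * log|diag(A S_j A^T)| over class (e.g. HMM-state) covariances S_j. The following Python sketch optimizes that objective on toy covariances; it illustrates the criterion only and is not the paper’s implementation:

    import numpy as np
    from scipy.optimize import minimize

    def neg_mllt(a_flat, covs, counts):
        d = covs[0].shape[0]
        A = a_flat.reshape(d, d)
        sign, logdet = np.linalg.slogdet(A)
        if sign <= 0:
            return 1e12                    # keep the search away from singular A
        val = counts.sum() * logdet
        for S, n in zip(covs, counts):
            diag = np.diag(A @ S @ A.T)    # per-dimension variances after transform
            val -= 0.5 * n * np.sum(np.log(diag))
        return -val

    rng = np.random.default_rng(0)
    d, covs = 4, []
    for _ in range(3):                     # three toy "class" covariances
        M = rng.normal(size=(d, d))
        covs.append(M @ M.T + d * np.eye(d))
    counts = np.array([100.0, 80.0, 120.0])
    res = minimize(neg_mllt, np.eye(d).ravel(), args=(covs, counts),
                   method="Nelder-Mead", options={"maxiter": 20000})
    A = res.x.reshape(d, d)                # estimated decorrelating transform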
Cross Decomposition of the Degree-Constrained Minimum Spanning Tree Problem Han-Suk Sohn, Dennis Bricker Pages: 31-34
ABSTRACT: As computer communication networks become a prevalent part of our daily lives, the importance of efficient network design becomes more evident. One of the critical issues in the design process is the topological design problem involved in establishing a centralized data communication network with high performance at low cost. This problem can be formulated as a degree-constrained minimum spanning tree problem, which has been shown to be NP-hard. Because it commonly appears as a subproblem in the design of centralized data communication networks, the development of effective algorithms has received much attention in the research literature. To solve the degree-constrained minimum spanning tree problem effectively, this paper proposes a solution algorithm based on cross decomposition. Computational results are analyzed to demonstrate the effectiveness of the proposed algorithm, which shows great promise for the design of centralized data communication networks.
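To make the underlying problem concrete, the sketch below builds a spanning tree with a degree-bounded variant of Kruskal’s algorithm. This is only an illustrative heuristic; it can get stuck or miss the optimum, whereas the paper’s contribution is a cross-decomposition algorithm:

    def dcmst_kruskal(n, edges, max_degree):
        # edges: list of (weight, u, v) with vertices numbered 0..n-1
        parent = list(range(n))
        def find(x):
            while parent[x] != x:
                parent[x] = parent[parent[x]]   # path halving
                x = parent[x]
            return x
        degree = [0] * n
        tree = []
        for w, u, v in sorted(edges):
            ru, rv = find(u), find(v)
            if ru != rv and degree[u] < max_degree and degree[v] < max_degree:
                parent[ru] = rv
                degree[u] += 1
                degree[v] += 1
                tree.append((u, v, w))
        return tree if len(tree) == n - 1 else None   # None: heuristic got stuck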
Data and Communication Security Sadeq ALHAMOUZ Pages: 35-38
ABSTRACT: The regional initiative was presented by the United Nations Economic and Social Commission for Western Asia in preparation for the World Summit in December 2003. The initiative in itself, set apart from regional troubles and differences among the Arab countries and other countries in the region, is a good and noble one. However, given such differences and the lack of trust, security should be the first issue tackled and resolved. This paper looks at the tools and techniques presently available and suggests alternatives where possible.
Development of a Multi-fingered Mechanical Hand with Envelope Grasp Capability for Various Shapes of Parts Nobuaki IMAMURA Pages: 39-43
ABSTRACT: Recently, factory automation and labor saving have created a need for a single robot hand mechanism that can hold and grasp objects of various shapes and sizes. In response to these needs, we have developed an articulated mechanical hand with an envelope grasp capability. Since the developed hand can mechanically envelop and grasp an object, it can be used easily and widely in factories where parts of various shapes and sizes must be handled. This paper describes the detailed design of the newly developed articulated mechanical hand with envelope grasp capability, which introduces GaeaDrive®, and its motion mechanism.
DFA on Cardiac Rhythm: Fluctuation of the Heartbeat Interval Contains Useful Information for the Risk of Mortality in Both Animal Models and Humans Toru Yazawa, Katsunori Tanaka, Tomoo Katsuyama Pages: 44-49
ABSTRACT: We analyzed heartbeat intervals to test whether detrended fluctuation analysis (DFA) can distinguish a sick condition of the cardiac control network from a healthy one. Healthy hearts exhibited exponents ranging from 0.8 to 1.0 in both animal models and humans. In sick animal models, the exponents declined into a very low range as natural death approached (~0.6 at the end). Other models, which had a myocardial injury, exhibited extremely high exponents (~1.4); the high exponent was maintained until they died. Human arrhythmic hearts exhibited low exponents (~0.7). A human subject with an abnormally high heart rate exhibited high exponents (as high as 1.4). A transplanted human heart, which has no nervous controls, exhibited an exponent of 1.2. The fluctuation of the heartbeat interval thus contains information about the risk of cardiac cessation or mortality.
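The exponent in question is the standard DFA scaling exponent: integrate the mean-removed interval series, detrend it within windows of several sizes, and take the log-log slope of the residual fluctuation against window size. A minimal first-order DFA sketch follows; the window sizes and fit range are illustrative choices, not the authors’ settings:

    import numpy as np

    def dfa_alpha(intervals, windows=(4, 8, 16, 32, 64)):
        y = np.cumsum(intervals - np.mean(intervals))   # integrated profile
        flucts = []
        for n in windows:
            rms = []
            for s in range(len(y) // n):
                seg = y[s * n:(s + 1) * n]
                t = np.arange(n)
                trend = np.polyval(np.polyfit(t, seg, 1), t)   # linear detrend
                rms.append(np.sqrt(np.mean((seg - trend) ** 2)))
            flucts.append(np.mean(rms))
        slope, _ = np.polyfit(np.log(windows), np.log(flucts), 1)
        return slope

    rng = np.random.default_rng(1)
    print(dfa_alpha(rng.normal(1.0, 0.05, 4096)))   # white noise gives alpha ~ 0.5

On this scale, the healthy range of 0.8 to 1.0 reported above sits between uncorrelated noise (0.5) and 1/f-like fluctuation (1.0).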
Diffusion of IP Telephony in Undergraduate Private Colleges Patrick C. Olson, PhD, Donna M. Schaeffer, PhD, Blair Simmons, Aaron McDonald Pages: 50-53
ABSTRACT: Over the summer of 2000, Menlo College implemented enterprise-wide Internet Protocol (IP) Telephony (Voice over Internet Protocol [VoIP]). More than five years have passed, and analysts are predicting that in the near future the only available Private Branch Exchange (PBX) solutions will be VoIP [4]. In view of this market trend, the diffusion of this technology among undergraduate private colleges seems slow. The Menlo College implementation is very successful, but has the concept diffused to other institutions? What factors influence the diffusion of this technology to other institutions? This paper examines the status of VoIP at these institutions.
Evaluation and Classification of the Semantic Congruence of Information in Concurrent Product Development Heiko Gsell Pages: 54-60
ABSTRACT: Not available
Modelling SDL, Modelling Languages Michael Piefel, Markus Scheidgen Pages: 61-67
ABSTRACT: Today’s software systems are too complex to implement and model using only one language. As a result, modern software engineering uses different languages for different levels of abstraction and different system aspects. Handling an increasing number of related or integrated languages has thus become one of the most challenging tasks in tool development.
We use object-oriented metamodelling to describe languages. Object orientation allows us to derive abstract, reusable concept definitions (concept classes) from existing languages. This language definition technique concentrates on semantic abstractions rather than syntactic peculiarities. We present a set of common concept classes that describe the structure, behaviour, and data aspects of high-level modelling languages. Our models contain syntax modelling using the OMG MOF as well as static semantic constraints written in OMG OCL.
We derive metamodels for subsets of SDL and UML from these common concepts, and we show for parts of these languages that they can be modelled and related to each other through the same abstract concepts.
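As a rough, language-neutral illustration of concept classes (rendered here in Python rather than MOF, with invented names), a shared abstract concept can be specialized by SDL- and UML-specific metamodel elements:

    class Concept:                      # abstract reusable concept definition
        pass

    class StateMachine(Concept):        # common behaviour concept
        def __init__(self, states, transitions):
            self.states = states
            self.transitions = transitions

    class SdlProcess(StateMachine):     # SDL-specific specialization
        def __init__(self, states, transitions, signal_queue):
            super().__init__(states, transitions)
            self.signal_queue = signal_queue    # SDL processes consume signals

    class UmlStateMachine(StateMachine):        # UML-specific specialization
        def __init__(self, states, transitions, regions):
            super().__init__(states, transitions)
            self.regions = regions              # UML allows orthogonal regions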
A New Algorithm to Reduce the Number of Computing Steps in the Reliability Formula of Weighted-k-out-of-n Systems Tatsunari Ohkura, Ventsi Rumchev Pages: 68-71
ABSTRACT: In the disjoint-products version of reliability analysis of weighted-k-out-of-n systems, it is necessary to determine the order in which the weights of components are considered. The k-out-of-n:G(F) system consists of n components; each component has its own probability and a positive integer weight such that the system is operational (failed) if and only if the total weight of the operational (failed) components is at least k. This paper presents a method that computes the reliability in O(nk) time and O(nk) memory space. The proposed method expresses the system reliability in fewer product terms than those already published.
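One standard way to obtain an O(nk)-time computation, shown here only to make the complexity claim concrete (the paper’s disjoint-products formulation is different), is a dynamic program over components in which the accumulated working weight is capped at k:

    def weighted_k_out_of_n_G(weights, probs, k):
        # dp[w] = probability that the total weight of working components seen
        # so far equals w, with any total >= k lumped into the bucket w = k.
        dp = [0.0] * (k + 1)
        dp[0] = 1.0
        for wi, pi in zip(weights, probs):
            new = [0.0] * (k + 1)
            for w in range(k + 1):
                if dp[w]:
                    new[w] += dp[w] * (1.0 - pi)        # component fails
                    new[min(k, w + wi)] += dp[w] * pi   # component works
            dp = new
        return dp[k]    # P(total working weight >= k) = system reliability

    # Example: weights 2, 1, 3 with reliabilities 0.9, 0.8, 0.95 and k = 3.
    print(weighted_k_out_of_n_G([2, 1, 3], [0.9, 0.8, 0.95], 3))

This sketch runs in O(nk) time and only O(k) memory, matching the time bound cited in the abstract.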
Reuse-centric Requirements Analysis with Task Models, Scenarios, and Critical Parameters Cyril Montabert, D. Scott McCrickard Pages: 72-78
ABSTRACT: This paper outlines a requirements-analysis process that unites task models, scenarios, and critical parameters to exploit and generate reusable knowledge at the requirements phase. Through a critical-parameter-based approach to task modeling, the process establishes an integrative, formalized model derived from scenarios that can be used for requirements characterization. Furthermore, not only can this entity serve as an interface to a knowledge repository relying on a critical-parameter-based taxonomy to support reuse, but its characterization in terms of critical parameters also allows the model to constitute a broader reuse solution. We discuss our vision for a user-centric and reuse-centric approach to requirements analysis, present previous efforts along this line of work, and describe the revisions made to extend the reuse potential and effectiveness of a previous iteration of a requirements tool implementing this process. Finally, the paper describes the sequence and nature of the activities involved in conducting our proposed requirements-analysis technique, concluding with a preview of ongoing work that will explore the feasibility of designers using our approach.
Risk Assessment and Management for Long-Term Storage of CO2 in Geologic Formations — United States Department of Energy R&D Dawn Deel, Kanwal Mahajan, Christopher Mahoney, Howard McIlvried, Rameshwar Srivastava Pages: 79-84
ABSTRACT: Concern about increasing atmospheric concentrations of carbon dioxide (CO2) and other greenhouse gases (GHG) and their impact on the earth’s climate has grown significantly over the last decade. Many countries, including the United States, wrestle with balancing economic development and meeting critical near-term environmental goals while minimizing long-term environmental risks. One promising solution to the buildup of GHGs in the atmosphere, being pursued by the U.S. Department of Energy’s (DOE) National Energy Technology Laboratory (NETL) and its industrial and academic partners, is carbon sequestration—a process of permanent storage of CO2 emissions in underground geologic formations, thus avoiding CO2 release to the atmosphere. This option looks particularly attractive for point source emissions of GHGs, such as fossil fuel fired power plants. CO2 would be captured, transported to a sequestration site, and injected into an appropriate geologic formation. However, sequestration in geologic formations cannot achieve a significant role in reducing GHG emissions unless it is acceptable to stakeholders, regulators, and the general public, i.e., unless the risks involved are judged to be acceptable.
One tool that can be used to achieve acceptance of geologic sequestration of CO2 is risk assessment, which is a proven method to objectively manage hazards in facilities such as oil and natural gas fields, pipelines, refineries, and chemical plants. Although probabilistic risk assessment (PRA) has been applied in many areas, its application to geologic CO2 sequestration is still in its infancy.
The most significant risk from geologic carbon sequestration is leakage of CO2. Two types of CO2 releases are possible—atmospheric and subsurface. High concentrations of CO2 caused by a release to the atmosphere would pose health risks to humans and animals, and any leakage of CO2 back into the atmosphere negates the effort expended to sequester the CO2. Subsurface risks, attributable to subsurface releases, arise from the displacement of fluids by the injected CO2 that could damage nearby hydrocarbon resources or trigger small seismic events. There is also the potential for sequestered CO2 to leak into non-saline formations, which could cause problems with potable uses of this water. However, overall, risks from CO2 sequestration are believed to be small.
Implementation of CO2 sequestration is being approached in phases. The DOE is currently sponsoring a series of pilot tests to generate important data that will elucidate the risks involved in geologic sequestration and lead to the development of risk management protocols. This phased approach should ensure that potential sources of leakage are identified, consequences are quantified, events with the potential to cause harm are analyzed to estimate their frequency and associated risk, and safeguards are put in place to further reduce risks for an operation for which risks already appear to be low.
Unleashing the Power of Networks - Case Study Guillermo Velasquez, Peggy Odem Pages: 85-91
ABSTRACT: As market forces continue to push the envelope of productivity and performance, developing a well-trained and highly skilled work force is considered one of the most important business differentiators in the marketplace. A recent survey [1] indicates that informal training accounts for over 70% of all the training individuals receive on the job. These data emphasize the importance of having a training system in place that can fulfill the needs of the work force in a timely manner. Halliburton Energy Services has developed a system of communities of practice to strengthen organizational and individual development. This paper discusses how this training system is transforming the company’s culture and the way it does business.
A Security Mechanism for Library Management Systems Using Low-Cost RFID Tags V. Nagalakshmi, I. Rameshbabu, D. Lalithabhaskari Pages: 92-96
ABSTRACT: Radio Frequency Identification (RFID) systems will become pervasive in our daily lives due to their low cost and ease of use [1]. This paper presents a methodology for using low-cost RFID tags in a library management system to protect books from unauthorized removal and use. Every object to be identified in an RFID system is physically labeled with a tag; in the proposed method, each book, magazine, or CD is identified by its RFID tag. Whenever a book is issued to a user of the library, the RFID reader captures the information on that book, compares it with the related information in the Online Public Access Catalogue (OPAC), and issues the book to the user depending on his or her identity in the database. If an unauthorized person attempts to take a book, the RFID reader immediately detects that the book was not issued. If anybody tries to remove the tag, the book can be traced with the help of a location device.
Keywords: RFID, OPAC, Unauthorized user, tag, Reader.
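A toy sketch of the described issue/gate-check flow (the data structures and method names are invented for illustration; the actual system works against the OPAC database):

    class LibraryRFID:
        def __init__(self, opac):
            # opac maps tag_id -> {"title": ..., "issued_to": None or user id}
            self.opac = opac

        def issue(self, tag_id, user_id):
            record = self.opac.get(tag_id)
            if record is None:
                raise KeyError("tag not in OPAC")
            record["issued_to"] = user_id

        def gate_check(self, tag_id):
            # True if the item may pass the exit gate: known tag, properly issued.
            record = self.opac.get(tag_id)
            return record is not None and record["issued_to"] is not None

    opac = {"TAG001": {"title": "Networks", "issued_to": None}}
    lib = LibraryRFID(opac)
    assert not lib.gate_check("TAG001")   # not issued: alarm at the gate
    lib.issue("TAG001", "user42")
    assert lib.gate_check("TAG001")       # issued: may pass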