Energy Efficient Position-Based Three Dimensional Routing for Wireless Sensor Networks Jeongdae Kim, Daeyoung Kim Pages: 1-5
ABSTRACT: In this paper, we focus on an energy efficient position-based three dimensional (3D) routing algorithm that uses distance information, which governs transmission power consumption between nodes, as its routing metric. In wireless sensor networks, energy efficiency is one of the primary objectives of research. In addition, recent interest in sensor networks has extended to the need to understand how to design networks in a 3D space. Generally, most wireless sensor networks are based on two dimensional (2D) designs. However, in reality, such networks operate in a 3D space. Since 2D designs are simpler and easier to implement than 3D designs for routing algorithms in wireless sensor networks, the 2D assumption is somewhat justified and usually does not lead to major inaccuracies. However, in some applications, such as airborne-to-terrestrial sensor networks or sensor networks deployed in mountainous terrain, taking 3D designs into consideration is reasonable. In this paper, we propose the Minimum Sum of Square distance (MSoS) algorithm as an energy efficient position-based three dimensional routing algorithm. In addition, we evaluate and compare the performance of the proposed routing algorithm with other algorithms through simulation. Finally, the results of the simulation show that the proposed routing algorithm is more energy efficient than the other algorithms in a 3D space.
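The abstract does not spell out the MSoS metric in detail; a plausible reading is a greedy next-hop rule that minimizes the sum of squared hop distances, since radio transmission energy grows at least quadratically with distance. The sketch below is illustrative under that assumption; the function names and the "must make progress toward the destination" condition are not from the paper.

```python
def sq_dist(a, b):
    """Squared Euclidean distance between two 3D points."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def msos_next_hop(current, destination, neighbors):
    """Greedy next-hop choice: pick the neighbor minimizing the sum of
    squared distances current->neighbor and neighbor->destination, a
    proxy for transmission energy. Only neighbors strictly closer to
    the destination are considered, so the route always makes progress."""
    candidates = [n for n in neighbors
                  if sq_dist(n, destination) < sq_dist(current, destination)]
    if not candidates:
        return None  # local minimum: no neighbor makes progress
    return min(candidates,
               key=lambda n: sq_dist(current, n) + sq_dist(n, destination))
```

For example, from (0, 0, 0) toward (10, 0, 0), a neighbor at (5, 0, 0) beats one at (2, 0, 0) because 25 + 25 < 4 + 64, even though both make progress.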
Hydrodynamic and Mass Transfer Model Adjusted to Sulphur Dioxide Absorption in Water Rosa-Hilda Chavez, Javier de J. Guadarrama Pages: 6-11
ABSTRACT: In this work we report experimental results at loading points and compare them with hydrodynamic and mass transfer model predictions, in order to determine the adjusted parameters and to establish the relationship between a two-phase countercurrent flow and the geometry of the packed bed of the column. The packed bed is essential for the design of rectification and absorption columns. A study of hydrodynamic processes was carried out in an absorption column of 0.252 m diameter with stainless steel gauze corrugated sheet packing, using air-water and SO2-water systems. The experimental results include capacity, liquid hold-up and composition. The absorption tests produced a total of 48 data points. The average deviation of the measured values of liquid hold-up from the predicted values is three times higher than the experimental data.
Knowledge Generation as Natural Computation Gordana Dodig-Crnkovic Pages: 12-16
ABSTRACT: Knowledge generation can be naturalized by adopting a computational model of cognition and an evolutionary approach. In this framework, knowledge is seen as the result of the structuring of input data (data → information → knowledge) by an interactive computational process going on in the agent during its adaptive interplay with the environment, which clearly presents a developmental advantage by increasing the agent's ability to cope with the dynamics of a situation. This paper addresses the mechanism of knowledge generation, a process that may be modeled as natural computation in order to be better understood and improved.
The Fuzzy MCDM Algorithms for the M&A Due Diligence Chung-Tsen Tsao Pages: 17-22
ABSTRACT: An M&A due diligence is the process in which one of the parties to a transaction investigates the other in order to judge whether to go forward with the transaction on the terms proposed. It encompasses missions in three phases: searching for and preliminarily screening potential candidates, evaluating the candidates and deciding on the target, and assisting post-transaction integration. This work suggests using a Fuzzy Multiple Criteria Decision Making (Fuzzy MCDM) approach and develops detailed algorithms to carry out the second-phase task. The MCDM approach facilitates the analysis and integration of information from different aspects and criteria. The theory of fuzzy sets makes it possible to include qualitative information in addition to quantitative information. In the developed algorithms the evaluators' subjective judgments are expressed in linguistic terms, which reflect human intuitive thought better than quantitative scores. These linguistic judgments are transformed into fuzzy numbers and subsequently synthesized with quantitative financial figures. The candidates can be ranked after defuzzification. The acquiring firm can then work out a more specific study, including pricing and costing, of the priority candidates so as to decide on the target.
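The linguistic-to-fuzzy pipeline described above can be sketched in a few lines. The five-level scale, the triangular fuzzy numbers assigned to it, and the centroid defuzzification below are common conventions in the Fuzzy MCDM literature, not necessarily the exact scale or algorithm used in this paper.

```python
# Hypothetical 5-level linguistic scale mapped to triangular fuzzy numbers (a, b, c).
SCALE = {
    "very poor": (0.00, 0.00, 0.25),
    "poor":      (0.00, 0.25, 0.50),
    "fair":      (0.25, 0.50, 0.75),
    "good":      (0.50, 0.75, 1.00),
    "very good": (0.75, 1.00, 1.00),
}

def fuzzy_average(terms):
    """Aggregate several evaluators' linguistic judgments by averaging
    their triangular fuzzy numbers component-wise."""
    tfns = [SCALE[t] for t in terms]
    k = len(tfns)
    return tuple(sum(component) / k for component in zip(*tfns))

def defuzzify(tfn):
    """Centroid defuzzification: for a triangular fuzzy number (a, b, c),
    the centroid is (a + b + c) / 3."""
    return sum(tfn) / 3

# Rank candidates by the crisp score of their aggregated judgments.
candidates = {"A": ["good", "very good"], "B": ["fair", "good"]}
ranking = sorted(candidates,
                 key=lambda c: defuzzify(fuzzy_average(candidates[c])),
                 reverse=True)
```

In practice these crisp scores would then be synthesized with the quantitative financial figures before the final ranking, as the abstract describes.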
The Kernel Estimation in Biosystems Engineering Esperanza Ayuga Téllez, Mª Angeles Grande Ortiz, Concepción González García, Angel Julián Martín Fernández, Ana Isabel García García Pages: 23-27
ABSTRACT: In many fields of biosystems engineering it is common to find works in which the statistical information analysed violates the basic hypotheses required by conventional forecasting methods. For those situations it is necessary to find alternative methods that allow statistical analysis despite those infringements.
Non-parametric function estimation includes methods that fit a target function locally, using data from a small neighbourhood of the point. Weak assumptions, such as continuity and differentiability of the target function, are used rather than an a priori assumption about the global shape of the target function (e.g., linear or quadratic).
In this paper a few basic decision rules are enunciated for the application of the non-parametric estimation method. These statistical rules are the first step towards building a user-method interface for the consistent application of kernel estimation by non-expert users. To reach this aim, univariate and multivariate estimation methods and density functions were analysed, as well as regression estimators. In some cases the models to be applied in different situations, based on simulations, were defined. Different biosystems engineering applications of kernel estimation are also analysed in this review.
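As a concrete instance of the local, weak-assumption fitting described above, a minimal univariate Gaussian kernel density estimator looks as follows. The automatic bandwidth here is Silverman's rule of thumb, one common default; the paper's own decision rules for bandwidth and kernel choice may differ.

```python
import numpy as np

def gaussian_kde(data, x, bandwidth=None):
    """Univariate kernel density estimate with a Gaussian kernel.
    If no bandwidth is given, use Silverman's rule of thumb:
    h = 1.06 * s * n^(-1/5), with s the sample standard deviation."""
    data = np.asarray(data, dtype=float)
    n = data.size
    if bandwidth is None:
        bandwidth = 1.06 * data.std(ddof=1) * n ** (-1 / 5)
    # Place one scaled Gaussian kernel on every observation and sum
    # their contributions at each evaluation point x.
    u = (np.asarray(x, dtype=float)[:, None] - data[None, :]) / bandwidth
    return np.exp(-0.5 * u ** 2).sum(axis=1) / (n * bandwidth * np.sqrt(2 * np.pi))
```

Because each kernel integrates to one, the resulting density estimate also integrates to one, with no global shape (linear, quadratic, ...) assumed in advance.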
Using Computers for Assessment of Facial Features and Recognition of Anatomical Variants that Result in Unfavorable Rhinoplasty Outcomes Tarik Ozkul, Murat Haluk Ozkul Pages: 28-35
ABSTRACT: Rhinoplasty and facial plastic surgery are among the most frequently performed surgical procedures in the world. Although the underlying anatomical features of the nose and face are very well known, performing a successful facial surgery requires not only surgical skills but also aesthetic talent from the surgeon. Sculpting facial features surgically in correct proportions, to end up with an aesthetically pleasing result, is highly difficult. To further complicate the matter, some patients may have anatomical features that affect the rhinoplasty outcome negatively. If they go undetected, these anatomical variants jeopardize the surgery, causing unexpected rhinoplasty outcomes. In this study, a model is developed with the aid of artificial intelligence tools, which analyses the facial features of the patient from a photograph and generates an index of "appropriateness" of the facial features and an index of the existence of anatomical variants that affect rhinoplasty negatively. The software tool developed is intended to detect the variants and warn the surgeon before the surgery. Another purpose of the tool is to generate an objective score to assess the outcome of the surgery.
A Real-Time Intrusion Detection System using Data Mining Technique Fang-Yie Leu, Kai-Wei Hu Pages: 36-41
ABSTRACT: Presently, most computers authenticate a user ID and password before users can log in to these systems. However, danger soon follows if the two items become known to hackers. In this paper, we propose a system, named the Intrusion Detection and Identification System (IDIS), which builds a profile for each user in an intranet to keep track of his or her usage habits as forensic features, with which IDIS can identify who the underlying user in the intranet is. Our experimental results show that the recognition accuracy for students of a computer science department is up to 98.99%.
Management and Communication of the Companies’ Knowledge; Guidelines for Intellectual Capital Statement Justyna Fijalkowska Pages: 42-47
ABSTRACT: This paper aims at analyzing the development of guidelines on Intellectual Capital Statement, providing a comparison of them and presenting their importance within the knowledge management process of the today’s companies.
We have entered the Knowledge Era, in which the basic economic resources are no longer financial capital, physical resources, or labor, but knowledge, also called intellectual capital (IC). Many analysts and investors demand more information, and they highlight the gap that exists between the information found in companies' annual reports and the financial information regarding the intangible part of the company requested by the market. The knowledge of the company should be measured and the effects communicated, as measurement without any further action makes no sense. An intellectual capital statement seems an appropriate tool for that, and it becomes an integral part of the knowledge management of the modern enterprise. This kind of statement emphasizes the role of IC in relation to value creation and communicates how knowledge resources are managed in firms within their strategic objectives.
This paper compares different approaches to IC statement preparation: it underlines similarities and differences concerning the scope, methodology and terminology used, and the ensuing consequences. It raises significant implications for company managers, researchers and policy makers.
Spoken Language Understanding Software for Language Learning Hassan Alam, Aman Kumar, Fuad Rahman, Rachmat Hartono, Yuliya Tarnikova Pages: 48-51
ABSTRACT: In this paper we describe a preliminary, work-in-progress Spoken Language Understanding Software (SLUS) with tailored feedback options, which uses an interactive spoken language interface to teach Iraqi Arabic and culture to second language learners. The SLUS analyzes input speech by the second language learner and grades it for correct pronunciation in terms of supra-segmental and rudimentary segmental errors such as missing consonants. We evaluated this software on training data with the help of two native speakers, and found that the software achieved an accuracy of around 70% in the law-and-order domain. For future work, we plan to develop similar systems for multiple languages.
Material Discriminated X-Ray CT System by Using New X-Ray Imager with Energy Discriminate Function Toru Aoki, Takuya Nakashima, Hisashi Morii, Yoichiro Neo, Hidenori Mimura Pages: 52-55
ABSTRACT: A material-discriminated X-ray CT system has been constructed using a conventional X-ray tube (white X-ray source) and a photon-counting X-ray imager, as an application of energy-band detection. We have already reported a material-identifying X-ray CT using the K-shell edge method elsewhere. In this report, the principle of material discrimination is the separation of electron density and atomic number from the attenuation coefficient mapping in the reconstructed X-ray CT image, by a two-wavelength X-ray CT method that uses a white X-ray source and an energy-discriminating X-ray imager in place of the two-monochrome-X-ray-source method.
The measurement phantom was prepared as four kinds of material rods (carbon (C), iron (Fe), copper (Cu) and titanium (Ti) rods of 3 mm diameter) inside an aluminum (Al) rod of 20 mm diameter. We could observe a material-discriminated reconstructed X-ray CT image; however, the discrimination properties were not as good as with the two-monochrome-X-ray method. This result can be explained by X-ray scattering, beam hardening and related effects of the white X-ray source, which do not arise in the two-monochrome X-ray CT method. However, since our developed CdTe imager can detect five energy bands at the same time, we can use multi-band analysis to decrease the least-squares error margin. We will be able to obtain higher separation in the atomic number mapping of the reconstructed X-ray CT image by using this system.
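The separation of electron density and atomic number from attenuation measured at two energies is commonly done by decomposing the attenuation coefficient into a photoelectric term (which tracks atomic number, falling roughly as E^-3) and a Compton term (which tracks electron density, following the Klein-Nishina energy dependence). The sketch below illustrates that standard two-basis decomposition per voxel; the basis functions, energies and coefficient values are illustrative and not taken from this paper.

```python
import numpy as np

def kn(E):
    """Klein-Nishina energy dependence of Compton scattering,
    with E in keV (a = E / 511 keV)."""
    a = E / 511.0
    return ((1 + a) / a**2 * (2 * (1 + a) / (1 + 2 * a) - np.log(1 + 2 * a) / a)
            + np.log(1 + 2 * a) / (2 * a) - (1 + 3 * a) / (1 + 2 * a) ** 2)

def decompose(mu1, mu2, E1, E2):
    """Solve the 2x2 linear system mu(E) = a_p * E**-3 + a_c * kn(E)
    from attenuation measured at two energies: a_p tracks atomic number
    (photoelectric), a_c tracks electron density (Compton)."""
    A = np.array([[E1 ** -3, kn(E1)],
                  [E2 ** -3, kn(E2)]])
    a_p, a_c = np.linalg.solve(A, np.stack([mu1, mu2]))
    return a_p, a_c
```

With five energy bands instead of two, the same model becomes an overdetermined system solved by least squares, which is one way to read the abstract's remark about multi-band analysis reducing the least-squares error margin.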
(e-) Mind Thinking with e-Um Damjan Kobal, Blaž Zmazek Pages: 56-59
ABSTRACT: Modern technology has opened up many new possibilities in learning. Unfortunately, the uncritical use of technology can also be damaging. In promoting productive and comprehensive IT learning, the essential issue lies in the capability of the teacher and the IT material to use the computer to promote the basic cognitive aspects of learning, and not only to manipulate the learner into remaining motivated. Motivation is productive only if used with a focus on knowledge and understanding. Especially in mathematics, the concepts we try to teach are simple and logical, but often abstract. Smart use of computers can motivate these abstract concepts through intuitive simulations and animations, as well as provide a sophisticated but simple insight into the causality of mathematical thinking. Thus, we argue that the preparation of good e-learning materials requires an almost contemplative focus on what we want to communicate, in order not to overwhelm the student with the many effects that the technology offers.
The concept and the vision of the E-um project have been based on the above premises, with a comprehensive system of simple technical, mathematical and didactical guidelines, together with a dynamic and creative system of permanent self-evaluation and control.
To support these premises, a new software package based on the Exe open source system has been developed. In order to provide an adequate technical framework for our conceptual ideas, new emerging technologies, with an emphasis on writing mathematical texts, were used.
Detrended Fluctuation Analysis on Cardiac Pulses in Both, Animal Models and Humans: A Computation for an Early Prognosis of Cardiovascular Disease Toru Yazawa, Katsunori Tanaka, Tomoyuki Nagaoka, Tomoo Katsuyama Pages: 60-64
ABSTRACT: We analyzed heartbeat intervals by detrended fluctuation analysis (DFA) in animal models and humans. In the models, the myocardium of the healthy heart contracted regularly. The deteriorated heart model, however, showed alternating beats, so-called "alternans." The DFA revealed that a heart exhibiting "alternans" has a declined scaling exponent (~0.5). In humans, hearts that had "alternans" also showed a low scaling exponent (~0.6). We consider that the coexistence of "alternans" and a low scaling exponent can be a risk marker in predictive and preventive diagnosis, supporting the idea that "alternans" can be a harbinger of sudden death.
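The scaling exponent the abstract refers to comes from the standard DFA procedure: integrate the mean-subtracted interval series, detrend it in windows of several sizes, and fit the slope of log fluctuation versus log window size. The sketch below implements that generic procedure with first-order (linear) detrending and an arbitrary choice of window sizes; it is not the authors' specific analysis pipeline.

```python
import numpy as np

def dfa_alpha(intervals, scales=(4, 8, 16, 32, 64)):
    """Detrended fluctuation analysis: return the scaling exponent alpha
    of a series of heartbeat intervals. Alpha near 0.5 indicates
    uncorrelated (white-noise-like) fluctuations; healthy heartbeat
    series are typically reported closer to 1.0."""
    x = np.asarray(intervals, dtype=float)
    y = np.cumsum(x - x.mean())              # integrated, mean-subtracted profile
    fluct = []
    for n in scales:
        segments = len(y) // n
        f2 = []
        for i in range(segments):
            seg = y[i * n:(i + 1) * n]
            t = np.arange(n)
            trend = np.polyval(np.polyfit(t, seg, 1), t)  # linear detrend
            f2.append(np.mean((seg - trend) ** 2))
        fluct.append(np.sqrt(np.mean(f2)))   # RMS fluctuation F(n)
    # Alpha is the slope of log F(n) versus log n.
    return np.polyfit(np.log(scales), np.log(fluct), 1)[0]
```

On an uncorrelated random series this returns a value near 0.5, matching the exponent the abstract associates with the deteriorated, alternans-exhibiting heart.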
European Environment Agency Developments of Land and Ecosystem Accounts: General Overview Agnieszka Romanowicz, Franz Daffner, Jean-Louis Weber, Ronan Uhel Pages: 65-70
ABSTRACT: The European Environment Agency has started the implementation
of a programme of land use and ecosystem accounts, following the
System of Environmental and Economic Accounts (SEEA)
guidelines of the United Nations. The purpose is to integrate
information across the various ecosystem components and to support
further assessments and modeling of these components and their
interactions with economic and social developments. This
programme reflects the increasing demand for environmental policy
integration in Europe, both vertically through thematic policies as
well as horizontally across policies in those sectors that contribute
most to environmental impacts. The construction of land and
ecosystem accounts is now feasible due to continuous improvements
in monitoring, collecting and processing data and progress with the
development of statistical methods that facilitate data assimilation
and integration. The accounts are based on explicit spatial patterns
provided by comprehensive land cover accounts that can be upscaled
and downscaled using a 1km² grid to any type of administrative
region or ecosystem zone (e.g., river basin catchments, coastal zones
or bio-geographic areas). Land cover accounts have been produced for 24 countries in Europe and were published in an EEA report in 2006.
Identification of Acceptable Restoration Strategies Seung-Tae Cha, Nam-Ho Lee, Eung-Bo Shim, Jeong-Hoon Shin, Hyun-Il Son, Soo-Chul Nam Pages: 71-76
ABSTRACT: In recent years, we have seen several catastrophic cascading failures of power systems throughout the world. Power system breakups and blackouts are rare events. However, when they occur, the effects on utilities and the general population can be quite severe. To prevent or reduce cascading sequences of events caused by various reasons, KEPRI is researching innovative strategies that will significantly reduce the vulnerability of the power system and ensure successful restoration of service to customers. This paper describes restoration guidelines and recommendations for the KEPS simulator, which allows power system operators and planners to simulate and plan restoration events in an interactive mode. The KEPS simulator provides a list of restoration events prioritized according to restoration rules and a list of priority loads. Further, the paper draws on research using information from a Jeju case study.
ILearning and EHomeStudy: Multimedia Training and Assessments for Field Survey Staff Charles Loftis, Nanthini Ganapathi Pages: 77-81
ABSTRACT: Survey data collection projects strive to collect high quality data from survey respondents. The quality of the data collected is greatly dependent upon the effectiveness of field interviewers (FIs) in conducting in-person screenings and interviews. Training FIs and subsequently assessing their knowledge of project protocol, methods and interviewing techniques is critical to the overall success of any data collection effort. For large surveys, as the number of FIs increases, the cost of in-person training can become prohibitively large.
As a cost-effective solution to increase the quality of the field data, we developed a suite of web- and media-based training and assessment tools called iLearning and eHomeStudy for training field staff. Besides saving the project costs associated with in-person training, we are also able to provide refresher trainings throughout the year. This application also enables FIs to view standardized training courses at their convenience and at their own pace. This paper describes the technical details, key features and benefits of this application suite, and also includes some details on user satisfaction and future directions.
Industry-Academy Research Framework on Electronics Hardware Innovations Pauliina Mansikkamäki, Matti Mäntysalo, Markku Kivikoski, Seppo Pienimaa, Reijo Paajanen Pages: 82-88
ABSTRACT: New technologies need to be brought to market on an ever-accelerating schedule in order to design and fabricate devices that fulfill consumers' expectations. An industry-academy collaborative working mode is a very efficient way to accelerate and diversify the progression of novel technological solutions, to educate new multidisciplinary professionals, and to serve the function of new business incubation. This type of long-term research activity strengthens the position of research groups from small countries in international competition.
An Intersecting Cortical Model Based Framework for Human Face Recognition Ahmed G. Mahgoub, Amira A. Ebeid, Hossam-El-Deen M. Abdel-Baky, El-Sayed A. El-Badawy Pages: 89-93
ABSTRACT: This paper introduces a novel method for human face
recognition based on a simplified approach for the Pulse
Coupled Neural Network (PCNN) Algorithm. The face
image is introduced to the Intersecting Cortical Model
(ICM) to be iterated 200 times, and then the time signals
for the faces are compared to make a decision.
Experimental results for human face recognition confirm
that the proposed method lends itself to higher
classification accuracy relative to existing techniques.
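The ICM the abstract describes is a two-equation simplification of the PCNN: an internal state that decays while accumulating stimulus and neighbor coupling, and a dynamic threshold that jumps when a neuron fires. The per-image "time signal" is the count of firing neurons at each iteration. The sketch below follows that generic formulation; the parameter values (f, g, h), the 4-neighbor coupling kernel, and the initialization are illustrative assumptions, not the paper's exact settings.

```python
import numpy as np

def icm_time_signal(img, iterations=200, f=0.9, g=0.8, h=20.0):
    """Intersecting Cortical Model: iterate the coupled state and
    threshold arrays and return the time signal G[n], the number of
    neurons firing at each step. Requires g < f so thresholds decay
    faster than states, producing periodic firing waves."""
    S = img.astype(float) / img.max()    # normalized stimulus
    F = np.zeros_like(S)                 # internal neuron state
    T = np.ones_like(S)                  # dynamic threshold
    signal = []
    for _ in range(iterations):
        # Couple each neuron to its 4 neighbors via the last firing pattern,
        # implemented here with shifted copies of the binary output.
        Y = (F > T).astype(float)
        W = sum(np.roll(Y, shift, axis=ax) for shift in (1, -1) for ax in (0, 1))
        F = f * F + S + W                # state: decay + stimulus + coupling
        Y = (F > T).astype(float)        # fire where state exceeds threshold
        T = g * T + h * Y                # threshold: decay, jump after firing
        signal.append(Y.sum())
    return np.array(signal)
```

Two face images can then be compared by a distance between their 200-sample time signals; the abstract does not specify which distance measure is used for the final decision.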