Accelerating Image Based Scientific Applications using Commodity Video Graphics Adapters Randy P. Broussard, Robert W. Ives Pages: 1-5
ABSTRACT: The processing power available in current video graphics cards
is approaching supercomputer levels. State-of-the-art graphics
processing units (GPUs) offer computational performance in
the range of 1.0-1.1 trillion floating-point operations per second
(1.0-1.1 teraflops). Making this processing power accessible to
the scientific community would benefit many fields of research.
This research takes a relatively computationally expensive
image-based iris segmentation algorithm and hosts it on a GPU
using the High Level Shader Language which is part of DirectX
9.0. The selected segmentation algorithm uses basic image
processing techniques such as image inversion, value squaring,
thresholding, dilation, erosion and a computationally intensive
local kurtosis (fourth central moment) calculation. Strengths and
limitations of the DirectX rendering pipeline are discussed. The
primary source of the graphical processing power, the pixel or
fragment shader, is discussed in detail. Impressive acceleration
results were obtained. The iris segmentation algorithm was
accelerated by a factor of 40 over the highly optimized C++
version hosted on the computer’s central processing unit. Some
parts of the algorithm ran more than 100 times
faster than their C++ counterparts. GPU programming details
and HLSL code samples are presented as part of the
acceleration discussion.
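The HLSL shader code referred to in the abstract is not reproduced in this listing. As a rough illustration of the most computationally intensive step it names, the sketch below computes a local kurtosis (fourth central moment) image on the CPU with NumPy/SciPy; the 7x7 window size is an assumption, not the paper's value.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def local_kurtosis(image, size=7):
    """Local kurtosis (fourth central moment over squared variance)
    in a size x size neighbourhood of every pixel. The window size
    is an illustrative assumption, not the paper's value."""
    img = image.astype(np.float64)
    m1 = uniform_filter(img, size)                      # local mean
    m2 = uniform_filter(img ** 2, size) - m1 ** 2       # local variance
    m4 = (uniform_filter(img ** 4, size)
          - 4 * m1 * uniform_filter(img ** 3, size)
          + 6 * m1 ** 2 * uniform_filter(img ** 2, size)
          - 3 * m1 ** 4)                                # fourth central moment
    return m4 / np.maximum(m2 ** 2, 1e-12)              # guard against division by zero

# Example on a random image; on the GPU each pixel would be one shader invocation.
kurt = local_kurtosis(np.random.rand(256, 256))
```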
Prospecting for Sustainable Investment Possibilities in Financial Markets Viktorija Stasytyte, Aleksandras Vytautas Rutkauskas Pages: 6-11
ABSTRACT: The main objective of the paper is to analyse the
author's proposed model, which captures the stochasticity of stock
prices and currency exchange rate markets, and to discuss its
application to research on the investor's possibilities in those
markets. The paper is grounded on the hypothesis that the
profitability ratios of stocks traded on the market are stratified;
that is, they concentrate into certain groups in the
risk-profitability plane. If the hypothesis is confirmed overall,
a constructive scheme for researching the investor's possibilities
in the exchange and capital markets would emerge, and efficient
investment strategies could be developed.
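The author's model is not given in this listing. Purely as a hypothetical illustration of the stratification hypothesis, the sketch below places each asset in the risk-profitability plane (standard deviation versus mean of its profitability-ratio series) and groups the points with k-means; the data layout and the number of groups are assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans

def stratify(returns, n_groups=3):
    """Group assets in the risk-profitability plane.
    `returns` is an (n_assets, n_periods) array of profitability
    ratios; the number of groups is an assumption."""
    mean = returns.mean(axis=1)                  # profitability
    risk = returns.std(axis=1)                   # risk
    points = np.column_stack([risk, mean])
    labels = KMeans(n_clusters=n_groups, n_init=10).fit_predict(points)
    return points, labels

# Example on synthetic return series for 50 assets over 250 periods.
pts, groups = stratify(np.random.default_rng(0).normal(0.01, 0.05, (50, 250)))
```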
Requirements Content Goodness and Complexity Measurement Based On NP Chunks Chao Y. Din Pages: 12-18
ABSTRACT: In a typical software development project, a requirements
document summarizes the results of the requirements analysis
and becomes the basis for subsequent software development.
In many cases, the quality of the requirements documents
dictates the success of the software development. The need for
determining the quality of requirements documents is
particularly acute when the target applications are large,
complicated, and mission critical. The purpose of this research
is to develop quality indicators to indicate the quality of
requirements statements in a requirements document. To
achieve the goal, the goodness properties of the requirements
statements are adopted to represent the quality of requirements
statements. A suite of complexity metrics of requirements
statements is proposed as the quality indicators and is
developed based upon research of noun phrase (NP) chunks. A
two-phase empirical case study is performed to evaluate the
use of the proposed metrics. By focusing upon the
complexity metrics based on NP chunks, the research aided in the
development of complexity indicators for low-quality
requirements documents.
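The paper's metric suite is not reproduced in this listing. The sketch below shows one hypothetical indicator in the same spirit: counting NP chunks per requirement statement with NLTK's regular-expression chunker. The chunk grammar and the choice of NLTK are assumptions, not the author's tooling.

```python
import nltk  # requires the 'punkt' and 'averaged_perceptron_tagger' data packages

# Assumed chunk grammar: optional determiner, adjectives, then one or more nouns.
chunker = nltk.RegexpParser("NP: {<DT>?<JJ>*<NN.*>+}")

def np_chunk_count(statement):
    """Count NP chunks in one requirement statement -- a simple
    stand-in for the paper's complexity indicators."""
    tagged = nltk.pos_tag(nltk.word_tokenize(statement))
    tree = chunker.parse(tagged)
    return sum(1 for t in tree.subtrees() if t.label() == "NP")

# Denser statements (more NP chunks) are taken as harder to read and verify.
print(np_chunk_count("The system shall log every failed authentication attempt."))
```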
The Science of Structural Revolutions William P. Graf Pages: 19-24
ABSTRACT: A perspective on the very human process by which scientific
paradigms change can help point the path forward in any
science, or in an applied science, such as Structural
Engineering. Understanding this process of change, we can
examine earthquake engineering, seismic building codes and
theories of structural engineering for earthquake loads.
When we take this perspective, we recognize that Structural
Engineering for earthquake resistance is in the midst of a
number of revolutions, from paradigms embodied in current
building codes in which earthquake demands are associated
with forces, to a new paradigm in which earthquake demands
are re-envisioned as resulting from structural displacements or
drift. The new paradigm is embodied in the current national
standard for the seismic rehabilitation of existing structures,
ASCE 41 [2] and the emerging standards for performance-based
earthquake engineering (PBEE). Associated with this is the shift
from design oriented towards life-safety to design for a range of
performance objectives, such as life-safety, damage reduction,
or immediate occupancy.
With this perspective, we further recognize deficiencies in
research and development. We have failed to systematically use
the experimental and computational tools we possess to fill in
the gaps of scientific knowledge. We have not developed and
deployed appropriate frameworks to collect and share ideas and
results. As one example, the formulation of performance-based
codes now outstrips the knowledge-base needed to ensure that
structures designed by the new tools will meet their
performance objectives.
Multigraph Decomposition Into Multigraphs With Two Underlying Edges Miri Priesler, Michael Tarsi Pages: 25-32
ABSTRACT: Due to some intractability considerations, reasonable
formulation of necessary and sufficient conditions for decomposability
of a general multigraph G into a fixed connected multigraph H is
probably not feasible if the underlying simple graph of H has three or
more edges. We study the case where H consists of two underlying
edges. We present necessary and sufficient conditions for
H-decomposability of G, which hold when certain size parameters of G
lie within bounds that depend on the multiplicities of the two
edges of H. We also show this result to be "tight" in the sense that even
a slight deviation of these size parameters from the given bounds results
in intractability of the corresponding decision problem.
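The authors' necessary and sufficient conditions are not stated in this listing. The sketch below checks only one obvious necessary counting condition, assuming H has two underlying edges with multiplicities m1 and m2: every copy of H consumes m1 + m2 edges of G.

```python
def counting_condition(g_edges, m1, m2):
    """Necessary (but not sufficient) condition for decomposing a
    multigraph G into copies of H, where H has two underlying edges
    with multiplicities m1 and m2: the total number of edges of G,
    counted with multiplicity, must be divisible by m1 + m2.
    g_edges lists each edge of G once per unit of multiplicity."""
    return len(g_edges) % (m1 + m2) == 0

# Example: 6 parallel a-b edges and 4 parallel b-c edges; H has a double
# and a triple edge, so each copy uses 5 edges and 10 % 5 == 0 holds.
G = [("a", "b")] * 6 + [("b", "c")] * 4
print(counting_condition(G, m1=2, m2=3))   # True, yet decomposability is not implied
```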
Computer Assisted Testing of Spoken English: A Study of the SFLEP College English Oral Test System in China John Lowe, Xin Yu Pages: 33-38
ABSTRACT: This paper reports on the on-going evaluation of a
computer-assisted system (CEOTS) for assessing
spoken English skills among Chinese university students.
This system is being developed to deal with the negative
backwash effects of the present system of assessment of
speaking skills, which is available to only a tiny minority.
We present data from a survey of students at the
developing institution (USTC), with follow-up interviews
and further interviews with English language teachers, to
gauge the reactions to the test and its impact on language
learning. We identify the key issue as being one of
validity, with a tension existing between construct and
consequential validities of the existing system and of
CEOTS. We argue that a computer-based system seems to
offer the only solution to the negative backwash problem
but the development of the technology required to meet
current construct validity demands makes this a very
long-term prospect. We suggest that a compromise between the
competing forms of validity must therefore be accepted,
probably well before a computer-based system can deliver
the level of interaction with the examinees that would
emulate the present face-to-face mode.
A Method for Knowledge Management and Communication Within and Across Multidisciplinary Teams Don Flynn, Erin Brown, Rebekah Krieg Pages: 39-44
ABSTRACT: The use of knowledge management (KM) and communication
tools in an applied scientific arena where research is performed
and knowledge must be managed within and across
multidisciplinary teams and organizations is a challenge.
Teams of scientists and engineers from up to 17 different
technical specialties required knowledge management tools for
developing multiple environmental impact statements under
challenging circumstances. Factors that contributed to the
success of the KM tools included 1) pairing of project staff with
Knowledge Systems staff to determine system requirements, 2)
the use of the tools by the team as they were being developed
thus allowing many opportunities for feedback and interaction,
3) developing the tools to approximate the overall project
structure and work flow already in place, 4) providing
immediate assistance to the project team as they learned to use
the new KM tools, and 5) replacing earlier practices with the
new KM approach by “burning the bridges” to past practices
after the team had learned to use the new KM tools.
Improved Accuracy of Nonlinear Parameter Estimation with LAV and Interval Arithmetic Methods Humberto Muñoz, Nigel Gwee Pages: 45-50
ABSTRACT: The reliable solution of nonlinear parameter
estimation problems is an important computational
problem in many areas of science and engineering,
including such applications as real time optimization.
Its goal is to estimate accurate model parameters that
provide the best fit to measured data, despite small-scale
noise in the data or occasional large-scale measurement
errors (outliers). In general, the estimation techniques
are based on some kind of least squares or maximum
likelihood criterion, and these require the solution of a
nonlinear and non-convex optimization problem. Classical
solution methods for these problems are local methods,
and may not be reliable for finding the global optimum,
with no guarantee the best model parameters have been
found. Interval arithmetic can be used to compute
completely and reliably the global optimum for the
nonlinear parameter estimation problem. Finally,
experimental results will compare the least squares, l2,
and the least absolute value, l1, estimates using interval
arithmetic in a chemical engineering application.
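The interval-arithmetic method and the chemical engineering model are not reproduced in this listing. The sketch below only contrasts the l2 (least squares) and l1 (least absolute value) criteria on a made-up exponential model with one outlier, using a local Nelder-Mead search rather than the reliable global method the paper describes.

```python
import numpy as np
from scipy.optimize import minimize

def model(params, x):
    a, b = params
    return a * np.exp(b * x)        # illustrative model, not the paper's

def l2_loss(params, x, y):
    return np.sum((y - model(params, x)) ** 2)      # least squares

def l1_loss(params, x, y):
    return np.sum(np.abs(y - model(params, x)))     # least absolute value

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 30)
y = model((2.0, 1.5), x) + 0.02 * rng.standard_normal(30)
y[5] += 2.0                                         # one large outlier

for loss in (l2_loss, l1_loss):
    fit = minimize(loss, x0=(1.0, 1.0), args=(x, y), method="Nelder-Mead")
    print(loss.__name__, fit.x)     # the l1 estimate is far less affected by the outlier
```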
Measurement of Periodical Contraction of Cultured Muscle Tube with Laser Shigehiro Hashimoto, Shuichi Mochizuki, Jun Takase, Daisuke Inoue Pages: 51-55
ABSTRACT: Periodical contraction of a cultured muscle tube has been
measured with a laser in vitro. C2C12 (mouse myoblast cell line)
was cultured with High-glucose Dulbecco’s Modified Eagle’s
Medium on a dish to make muscle tubes. Differentiation from
myoblasts to myotubes was induced with an additional horse
serum. Repetitive contraction of the tube was generated by
electric pulses of amplitude below sixty volts and one-millisecond
width, applied through platinum electrodes, and
observed with a phase-contrast microscope. A laser beam of
632.8 nm wavelength was restricted to 0.096 mm diameter, and
applied to the muscle tubes on the bottom of the culture dish.
Fluctuating intensity of the transmitted laser beam through the
periodically contracting muscle tubes was measured, and its
spectrum was analyzed. The analyzed data show that the
repetitive contraction is synchronized with the stimulation by the
periodic electric pulses between 0.2 s and 2 s.
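As a minimal sketch of the spectrum analysis step, the function below estimates the dominant contraction frequency of a fluctuating transmitted-intensity signal with an FFT; the sampling rate and the test signal are assumptions.

```python
import numpy as np

def dominant_frequency(intensity, fs):
    """Return the frequency (Hz) of the largest peak in the amplitude
    spectrum of the transmitted-intensity signal sampled at fs Hz."""
    signal = intensity - np.mean(intensity)              # remove the DC component
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return freqs[np.argmax(spectrum[1:]) + 1]            # skip the zero-frequency bin

# Example: a 1 Hz contraction (pulse periods of 0.2-2 s correspond to 0.5-5 Hz).
t = np.arange(0.0, 10.0, 0.01)
print(dominant_frequency(np.sin(2 * np.pi * 1.0 * t), fs=100))   # ~1.0
```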
A kind of discussing method of information contents taking account of the YUBITSUKIYI system embedded into the Life Support System Masahiro Aruga, Shuichiro Ono, Kiyotaka Takagi, Shuichi Kato Pages: 56-61
ABSTRACT: A kind of discussing method of information contents taking account of the YUBITSUKIYI system embedded into the Life Support System
New method for the failure probability of strict circular consecutive–k–out–of–n:F system Yoichi Higashiyama, Ventsi Rumchev Pages: 62-65
ABSTRACT: New method for the failure probability of strict circular consecutive–k–out–of–n:F system
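The abstract and the authors' new method are not included in this listing. The sketch below only encodes the standard definition of a circular consecutive-k-out-of-n:F system by brute-force enumeration, which can serve as a reference check for small n.

```python
from itertools import product

def circular_failure_probability(n, k, p):
    """Failure probability of a circular consecutive-k-out-of-n:F
    system with i.i.d. component failure probability p: the system
    fails iff at least k consecutive components (in circular order)
    have failed. Exponential in n -- reference check only."""
    total = 0.0
    for state in product((0, 1), repeat=n):              # 1 marks a failed component
        if any(all(state[(i + j) % n] for j in range(k)) for i in range(n)):
            prob = 1.0
            for s in state:
                prob *= p if s else 1.0 - p
            total += prob
    return total

print(circular_failure_probability(n=6, k=2, p=0.1))
```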
A planar parallel 3-DOF cable-driven haptic interface Clément Gosselin, Régis Poulin, Denis Laurendeau Pages: 66-71
ABSTRACT: In this paper, a cable-driven planar parallel haptic interface is
presented. First, the velocity equations are derived and the forces in
the cables are obtained by the principle of virtual work. Then, an
analysis of the wrench-closure workspace is performed and a
geometric arrangement of the cables is proposed. Control issues are
then discussed and a control scheme is presented. The calibration
of the attachment points is also discussed. Finally, the prototype
is described and experimental results are provided.
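The prototype's geometry and control scheme are not reproduced here. Under an assumed placeholder geometry, the sketch below illustrates the virtual-work relation between cable tensions and the planar wrench, recovering nonnegative tensions with a non-negative least-squares solve because cables can only pull.

```python
import numpy as np
from scipy.optimize import nnls

def cable_tensions(anchors, attachments, wrench):
    """Nonnegative cable tensions producing a desired planar wrench
    (fx, fy, mz). Column i of the structure matrix is [u_i; b_i x u_i],
    with u_i the unit vector from platform attachment b_i toward the
    base anchor (the direction of pull), as given by the principle of
    virtual work. The geometry below is a placeholder, not the prototype's."""
    cols = []
    for a, b in zip(anchors, attachments):
        u = (a - b) / np.linalg.norm(a - b)                    # cable direction
        cols.append([u[0], u[1], b[0] * u[1] - b[1] * u[0]])   # force and moment
    A = np.array(cols).T                                       # 3 x n structure matrix
    tensions, _ = nnls(A, np.asarray(wrench, dtype=float))     # tensions must be >= 0
    return tensions

anchors = [np.array(p, float) for p in [(1, 1), (-1, 1), (-1, -1), (1, -1)]]
attach = [np.array(p, float) for p in [(0.1, 0.1), (-0.1, 0.1), (-0.1, -0.1), (0.1, -0.1)]]
print(cable_tensions(anchors, attach, wrench=(1.0, 0.5, 0.0)))
```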
Missing Data Estimation using Principle Component Analysis and Autoassociative Neural Networks Jaisheel Mistry, Fulufhelo V. Nelwamondo, Tshilidzi Marwala Pages: 72-79
ABSTRACT: Three new methods for estimating missing data in a
database using Neural Networks, Principal Component Analysis
and Genetic Algorithms are presented. The proposed methods
are tested on a set of data obtained from the South African
Antenatal Survey. The data is a collection of demographic
properties of patients. The proposed methods use Principal
Component Analysis to remove redundancies and reduce the
dimensionality in the data. Variations of autoassociative Neural
Networks are used to further reduce the dimensionality of the
data. A Genetic Algorithm is then used to find the missing data
by optimizing the error function of the three variants of the
Autoencoder Neural Network. The proposed system was tested
on data with 1 to 6 missing fields in a single record, and
the accuracy of the estimated values was calculated and
recorded. All methods are as accurate as a conventional
feedforward neural network structure; however, the
newly proposed methods employ neural network architectures
that have fewer hidden nodes.
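The trained autoassociative networks and the genetic algorithm settings are not given in this listing. The sketch below shows the core estimation loop under assumptions: known fields are held fixed while the missing fields are searched for values that minimise the network's reconstruction error, with SciPy's differential evolution standing in for the genetic algorithm and a toy callable standing in for the trained network.

```python
import numpy as np
from scipy.optimize import differential_evolution

def estimate_missing(record, missing_idx, autoencoder, bounds):
    """Estimate missing fields of one record by choosing the values
    that minimise the autoassociative network's reconstruction error.
    `autoencoder` is any callable mapping a full record to its
    reconstruction (a trained network in the paper)."""
    def error(candidate):
        full = record.copy()
        full[missing_idx] = candidate                      # fill the unknown fields
        return np.sum((full - autoencoder(full)) ** 2)     # reconstruction error
    result = differential_evolution(error, bounds=[bounds[i] for i in missing_idx])
    return result.x

toy_net = lambda x: 0.9 * x        # toy stand-in for a trained autoencoder
record = np.array([0.2, 0.5, np.nan, 0.7])
print(estimate_missing(record, missing_idx=[2], autoencoder=toy_net,
                       bounds=[(0.0, 1.0)] * 4))
```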
TRManager – Technical Risk Manager Mark A. Gregory, Christopher White Pages: 80-85
ABSTRACT: This paper presents research into the development of a new
information management technique called Technical Risk
Manager. Project management involves the use of processes and
information management techniques to aid decision making in
the pursuit of project success. Project success may be achieved
by meeting time, cost or performance criteria. Current project
management practices focus on achieving time and cost project
success criteria by using three information management
techniques developed in the 1950s: Gantt, PERT and Critical
Path Method. Technical Risk Manager has been developed to
provide an information management technique that may be used
to aid project management decision making in the pursuit of
achieving the performance project success criteria.
Handling Undiscovered Vulnerabilities Using a Provenance Network Amrit’anshu Thakur, Rayford Vaughn, Valentine Anantharaj Pages: 86-91
ABSTRACT: This paper elaborates on a novel approach to
preventing exploits of vulnerabilities that remain undiscovered
during the testing phase of a system’s development lifecycle. The
combination of predicted usage patterns, a Provenance network
model and a clustering methodology provides a secure failure
mechanism for both known and unknown security issues within
the system. The paper also addresses the requisite supporting
infrastructure and deployment issues related to the model. The
idea is to approach the growing problem of newer and more
complex vulnerabilities in an ever more intricate and vast set
of systems using a generic software state mapping procedure
for recognizable (and thus the complementary unrecognizable)
patterns to judge the stability at each step in an operation
sequence. Abstracting these vulnerabilities at a higher level thus
provides a generic technique to classify and handle such
concerns in the future and in turn prevent exploits before a
corrective patch is released.
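The provenance network model and the state-mapping procedure are not detailed in this listing. The sketch below is a hypothetical reading of the clustering idea only: learn clusters of software-state vectors observed under predicted usage, and treat a state far from every cluster centre as unrecognised, triggering a secure-failure path. The feature encoding, cluster count and distance threshold are all assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans

class StateMonitor:
    """Flag software states that do not match any cluster learned
    from states observed under predicted usage patterns."""
    def __init__(self, normal_states, n_clusters=5, threshold=1.0):
        self.model = KMeans(n_clusters=n_clusters, n_init=10).fit(normal_states)
        self.threshold = threshold

    def is_recognised(self, state):
        distances = np.linalg.norm(self.model.cluster_centers_ - state, axis=1)
        return distances.min() <= self.threshold

# Usage: halt the operation sequence (fail securely) on an unrecognised state.
monitor = StateMonitor(np.random.rand(200, 8))
if not monitor.is_recognised(np.full(8, 10.0)):
    print("unrecognised state -- fail securely")
```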
Evaluation of the Performance of Vortex Generators on the DU 91-W2-250 Profile using Stereoscopic PIV Clara Marika Velte, Martin Otto Lavér Hansen, Knud Erik Meyer, Peter Fuglsang Pages: 92-96
ABSTRACT: Stereoscopic PIV measurements investigating the
effect of Vortex Generators on the lift force near
stall and on glide ratio at best aerodynamic
performance have been carried out in the LM
Glasfiber wind tunnel on a DU 91-W2-250 profile.
Measurements at two Reynolds numbers were
analyzed: Re = 0.9·10⁶ and 2.4·10⁶. The results show
that one can resolve the longitudinal vortex
structures generated by the devices and that mixing
is created close to the wall, transferring high
momentum fluid into the near wall region. It is also
seen that the vortex generators can successfully
obstruct separation near stall.