Attrition of Knowledge Workforce in Healthcare in Northern parts of India – Health Information Technology as a Plausible Retention Strategy Indrajit Bhattacharya, Anandhi Ramachandran, R.K. Suri, S.L. Gupta Pages: 1-11
ABSTRACT: Faced with a global shortage of skilled health workers due to attrition, countries are struggling to build and maintain an optimal knowledge workforce in healthcare for delivering quality healthcare services. The forces that drive healthcare professionals' turnover need to be addressed before a competent, uniformly adoptable strategy can be proposed to mitigate the problem. In this study we investigate the effects of socio-demographic characteristics on attrition of the healthcare knowledge workforce in northern parts of India, a region spanning a wide rural-urban gradient, taking into account both public and private healthcare organizations.
For this purpose, a healthcare professional attrition tracking survey (HATS) was designed. Data were collected from a random sample of 807 respondents consisting of doctors, nurses, paramedics and administrators to explore the relationships between the various factors acting as antecedents of a healthcare professional's job satisfaction, commitment and intention to stay in the job. Structured questionnaires were used as the data collection tools.
Descriptive statistics, factor analysis and path analysis were carried out using multiple regression and correlation to propose a model that best explains the theoretical assumptions about the factors leading to attrition. Six factors of attrition, namely compensation and perks, work-life balance, sense of accomplishment, workload, need for automation and technology improvement, and substandard nature of work, were identified as the main factors, with a data reliability of 0.809. It was also found that the intention to shift is a major determinant of attrition and is in turn affected by the job satisfaction dimensions. Based on the survey responses and analysis, a plausible strategy of implementing information technology to increase worker motivation, job satisfaction and commitment, and thereby reduce attrition, is proposed.
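The reliability figure of 0.809 reported above is presumably an internal-consistency coefficient such as Cronbach's alpha. As an illustrative aside (not from the paper), a minimal sketch of that computation on synthetic survey data:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # variance of each survey item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Synthetic data: 807 respondents (matching the sample size above) answering
# six correlated items driven by one shared "satisfaction" factor.
rng = np.random.default_rng(0)
latent = rng.normal(size=(807, 1))
scores = latent + 0.8 * rng.normal(size=(807, 6))
alpha = cronbach_alpha(scores)
print(round(alpha, 3))   # a value near 1 indicates high internal consistency
```

The respondent count is the only number taken from the abstract; the item structure and noise level are invented for the sketch.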
A Study on the application of Data Mining Methods in the analysis of Transcripts Luis Raunheitte, Rubens De Camargo, Takato Kurihara, Alan Heitokotter, Juvenal J. Duarte Pages: 12-17
ABSTRACT: Schools have always had an essential role in forming students' intellect; however, the constant incorporation of knowledge to improve the techniques and technologies used in the production of goods and services has created a major demand for highly qualified professionals and, in order to meet that need, the teaching process must understand and adapt to the profile of the students. The transcript is the document most widely used to measure a student's performance. Its digital storage, combined with data mining methodologies, can contribute not only to the analysis of performance, but also to the identification of significant information about students.
CUYT Design Framework Model (CDFM) Joseph Bowman Pages: 18-23
ABSTRACT: This article discusses the significance and relevance of program design research in informal
environments in urban communities. Design and project based research initiated through the
Center for Urban Youth and Technology (CUYT) framework has produced several Science,
Technology, Engineering, Arts, and Mathematics (STEAM) projects in urban communities in
New York State. Elements of our design framework model will be explored and defined.
Connections between project design, action research, and intercultural research will be presented,
and the process of design to implementation will be explained. CUYT interventions will be
included to provide examples of how the model works in practice. The action research based
project Information Technology and Cities (ITC) and the Institute for Nanoscale Technology and
Youth (INTY) program demonstrate the influence of the CUYT Design Framework Model
(CDFM). The current conceptual CDFM will be discussed and reviewed.
Time Domain Modeling Of A Band-Notched Antenna For UWB Applications S. Mridula, Binu Paul, P. Mythili, P. Mohanan Pages: 24-28
ABSTRACT: The time domain modeling of a coplanar waveguide (CPW) fed band-notched antenna for Ultra Wideband (UWB) applications is presented. The annular ring antenna has a dimension of 36x36 mm2 when printed on a substrate of dielectric constant 4.4 and thickness 1.6 mm. The uniplanar nature and compact structure of the antenna make it apt for modular design. A crescent-shaped slot provides a notch in the 5.2-5.8 GHz frequency band to avoid interference with Wireless Local Area Networks (WLAN). Pulse distortion is insignificant in the operating band, as verified by the measured antenna performance, which shows high signal fidelity and virtually constant group delay.
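Signal fidelity in UWB work is commonly quantified as the peak of the normalized cross-correlation between the transmitted and received pulses. A hedged sketch of that metric (the pulse shape and delay below are illustrative, not the antenna's measured response):

```python
import numpy as np

def fidelity(tx: np.ndarray, rx: np.ndarray) -> float:
    """Peak of the normalized cross-correlation between transmitted and
    received pulses; 1.0 means the pulse shape is perfectly preserved."""
    tx = tx / np.sqrt(np.sum(tx ** 2))
    rx = rx / np.sqrt(np.sum(rx ** 2))
    return float(np.max(np.correlate(tx, rx, mode="full")))

t = np.linspace(-1.0, 1.0, 400)
pulse = np.exp(-(t / 0.15) ** 2) * np.cos(40 * t)  # modulated Gaussian pulse
delayed = np.roll(pulse, 30)                       # pure delay, no distortion
print(round(fidelity(pulse, delayed), 3))          # close to 1.0
```

A fidelity near 1 together with flat group delay is what "insignificant pulse distortion" means in practice.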
Virtual Sambor Prei Kuk: An Interactive Learning Tool Daniel Michon, Yehuda Kalay Pages: 29-37
ABSTRACT: MUVEs (Multi-User Virtual Environments) are a new medium for researching the genesis and evolution of sites of cultural significance. MUVEs can model both the tangible and intangible heritage of a site, allowing the user to obtain a more dynamic understanding of the culture. This paper illustrates a cultural heritage project which captures and communicates the interplay of context (geography), content (architecture and artifacts) and temporal activity (rituals and everyday life), leading to a unique digital archive of the tangible and intangible heritage of the temple complex at Sambor Prei Kuk, Cambodia, circa seventh-eighth century CE. The MUVE provides a platform for experiencing the woven tangible and intangible cultural heritage. This, we argue, turns static space into meaningful place. Further, this kind of digital model has the potential to bring together Jean Lave and Etienne Wenger's theorizing on the importance of community in education and more recent theorizing by James Paul Gee on the impact of Virtual Worlds on learning.
Individualising Media Practice Education Using a Feedback Loop and
Instructional Videos Within an eLearning Environment. Trevor Harris Pages: 38-44
ABSTRACT: This paper explores the development and impact of the author’s
TELE (Technology Enhanced Learning Environment) action
research project for individualising media practice education.
In the latest iteration of the classroom methodologies employed to develop high-level skills in media production, the author has combined an interactive eLearning approach with instructional videos and, crucially, an individual feedback loop, in order to widen access to the curriculum and create a more efficient teaching and learning environment. The focus therefore
is on student engagement and organisational efficiencies as a
result of the research.
It should be noted that there has been no funding attached to
this work, nor are there any institutional imperatives or other
stakeholder involvement in this research. This project has been
undertaken by the author as an evolutionary development of the
various methodologies developed, cognisant of the increased
technology literacy of the student cohort. The educational
benefit of bringing video instruction into the curriculum as
part of the project is examined as a creative pedagogy of direct
benefit to students rather than as a subliminal marketing tool that
other systems are often used for.
Over 16K words of written data were collected during the project, and these are analysed both quantitatively and qualitatively with reference to the initial objectives of the research.
A Participatory Geographic Information System (PGIS) Utilizing the GeoWeb 2.0: Filling the Gaps of the Marcellus Shale Natural Gas Industry Drew Michanowicz, Samantha Malone, Matthew Kelso, Kyle Ferrar, Charles Christen, Conrad Dan Volz Pages: 45-53
ABSTRACT: The application of neocartography, specifically through the Web 2.0, is a new phase of participatory geographic information system (PGIS) research. Neocartography includes the encouragement of non-expert participation through visual design (e.g., map layering), and knowledge discovery via the Web. To better understand the challenges from an increase in natural gas extraction in the Marcellus Shale region of the United States, a GeoWeb 2.0 platform titled FracTracker (FracTracker.org) that relies upon PGIS and neocartography was created and implemented in June 2010. FracTracker focuses on data-to-information translation to stimulate capacity building for a range of user types by leveraging the immense benefits of a spatial component. The main features of FracTracker are the ability to upload and download geospatial data as various file types, visualize data through thematic mapping and charting tools, and learn about and share drilling experiences. In less than 2 years, 2,440 registered users have effectively participated in creating 956 maps or "snapshots" using 399 available datasets. FracTracker demonstrates that participatory, interoperable GeoWebs can be utilized to help understand and localize related impacts of complex systems, such as the extractive energy industry.
Information Risk Management: Qualitative or Quantitative?
Cross industry lessons from medical and financial fields Upasna Saluja, Norbik Bashah Idris Pages: 54-59
ABSTRACT: Enterprises across the world are taking a hard look at their risk management practices. A number of qualitative and quantitative models and approaches are employed by risk practitioners to keep risk in check. As a norm, most organizations end up choosing the more flexible, easier-to-deploy and customizable qualitative models of risk assessment. In practice, such models often call upon practitioners to make qualitative judgments on a relative rating scale, which leaves considerable room for errors, biases and subjectivity. Under the quantitative risk analysis approach, on the other hand, risk is estimated using numerical measures of some kind. Medical risk management models lend themselves as ideal candidates for deriving lessons for Information Security Risk Management: the considerably developed understanding of risk in the medical field, especially Survival Analysis, can be applied to the risks that information infrastructures face. Similarly, the financial risk management discipline prides itself on perhaps the most quantifiable of risk models, those for Market Risk and Credit Risk. Information Security Risk Management can make risk measurement more objective and quantitative by referring to the Credit Risk approach. During the recent financial crisis, many investors and financial institutions lost money or went bankrupt because they did not apply the basic principles of risk management. The crisis thus provides some valuable lessons for information risk management.
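To make the cross-industry analogy concrete, here is a hedged sketch (all figures hypothetical, not from the paper) of two of the borrowed quantities: the standard credit-risk expected-loss formula EL = PD x LGD x EAD, and a constant-hazard survival probability of the kind Survival Analysis provides:

```python
import math

def expected_loss(pd: float, lgd: float, ead: float) -> float:
    """Standard credit-risk expected loss: PD x LGD x EAD."""
    return pd * lgd * ead

def survival_prob(hazard_rate: float, years: float) -> float:
    """Survival-analysis view: probability of no incident over a horizon,
    assuming a constant hazard rate."""
    return math.exp(-hazard_rate * years)

el = expected_loss(pd=0.02, lgd=0.6, ead=1_000_000)  # hypothetical exposure
p_ok = survival_prob(hazard_rate=0.02, years=3)      # chance of 3 incident-free years
print(round(el), round(p_ok, 3))
```

Replacing "default" with "security breach" and "exposure" with "asset value" is the kind of quantitative translation the abstract argues for.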
No Problem? No Research, Little Learning ... Big Problem! Fernando Ornelas Marques, Maria Teresa Marques Pages: 60-62
ABSTRACT: The motivation to carry out this study stemmed from the generalized perception that today's youth lack the skills for the 21st century, especially high-level competences like critical thinking, problem solving and autonomy. Several tools
can help to improve these competences (e.g. the SCRATCH
programming language), but, as researchers and educators, we
are mostly concerned with the skill to recognize problems.
What if we do not find problems to solve? What if we do not
even feel the need to find or solve problems? The problem is to
recognize the problem; the next step is to equate the problem;
finally we have to feel the need to solve it. No need? No
invention. Recognizing a problem is probably the biggest
problem of everyday life, because we are permanently faced
with problems (many ill-defined problems), which we need to
identify, equate and solve.
Corporate Governance Best Practice and Stock Performance: Case of CEE
Companies Julia Bistrova, Natalja Lace Pages: 63-69
ABSTRACT: Corporate governance (CG) becomes a very essential factor to
consider prior to investing in the company. A number of studies
proved its importance on the developed equity markets.
However, corporate governance should intuitively gain even more importance where uncertainty is high because of an unstable environment. In order to assess the influence of
corporate governance quality on Central and Eastern European
companies' stock performance, the CG assessment model,
which includes 21 evaluation criteria, was developed. Based on
the model rating, the companies with the highest CG quality
(top 25%) outperformed companies with the worst CG quality
(bottom 25%) by 0.98% on a monthly basis during the period of
2008-2010. The study demonstrates that companies with good CG quality are able to offer lower risk.
Human Walk Modeled by PCPG to Control a Lower Limb
Neuroprosthesis by High-Level Commands Matthieu Duvinage, Thierry Castermans, Rene Jimenez-Fabian, Thomas Hoellinger, Mathieu Petieau, Olivier Verlinden, Guy Cheron, Thierry Dutoit Pages: 70-80
ABSTRACT: Current active leg prostheses do not integrate the most recent
advances in Brain-Computer Interfaces (BCI) and bipedal
robotics. Moreover, their actuators are seldom driven by the
subject’s intention.
This paper summarizes our current results in the field of human gait rehabilitation. In a first prototype, the
main focus was on people suffering from foot drop problems,
i.e. people who are unable to lift their feet. However, current
work is focusing on a full active ankle orthosis.
The approach is threefold: a BCI system, a gait model and
an orthosis. Thanks to the BCI system, patients are able to
generate high-level commands. Typically, a command could
represent a speed modification. Then, a gait model based on
a programmable central pattern generator is used to generate
the adequate kinematics. Finally, the orthosis tracks these kinematics while the foot is in the air, and mimics a spring when the foot is on the ground.
Modeling Tools for Drilling, Reservoir Navigation, and Formation Evaluation Sushant Dutta, Fei Le, Alexandre Bespalov, Arcady Reiderman, Michael Rabinovich Pages: 81-87
ABSTRACT: The oil and gas industry routinely uses borehole tools for measuring
or logging rock and fluid properties of geologic formations to
locate hydrocarbons and maximize their production. Pore fluids
in formations of interest are usually hydrocarbons or water. Resistivity
logging is based on the fact that oil and gas have a substantially
higher resistivity than water. The first resistivity log
was acquired in 1927, and resistivity logging is still the foremost
measurement used for drilling and evaluation. However, the acquisition and interpretation of resistivity logging data have grown in complexity over the years.
Resistivity logging tools operate in a wide range of frequencies
(from DC to GHz) and encounter extremely high (several orders
of magnitude) conductivity contrast between the metal mandrel
of the tool and the geologic formation. Typical challenges
include arbitrary angles of tool inclination, full tensor electric
and magnetic field measurements, and interpretation of complicated
anisotropic formation properties. These challenges combine
to form some of the most intractable computational electromagnetic
problems in the world. Reliable, fast, and convenient
numerical modeling of logging tool responses is critical
for tool design, sensor optimization, virtual prototyping,
and log data inversion. This spectrum of applications necessitates
both depth and breadth of modeling software, from blazing-fast one-dimensional (1-D) modeling codes to advanced three-dimensional (3-D) modeling software, and from in-house developed codes to commercial modeling packages.
In this paper, with the help of several examples, we demonstrate
our approach for using different modeling software to address
different drilling and evaluation applications. In one example,
fast 1-D modeling provides proactive geosteering information
from a deep-reading azimuthal propagation resistivity measurement.
In the second example, a 3-D model with multiple vertical
resistive fractures successfully explains the unusual curve separations
of an array laterolog tool in a shale-gas formation. The third
example uses two-dimensional (2-D) and 3-D modeling to prove
the efficacy of a new borehole technology for reservoir monitoring.
The Simulation and Analysis of the Closed Die Hot Forging Process by
A Computer Simulation Method Dipakkumar Gohil Pages: 88-93
ABSTRACT: The objective of this research work is to study the variation of
various parameters such as stress, strain, temperature, force, etc.
during the closed die hot forging process. A computer simulation modeling approach has been adopted to transform the theoretical aspects into a computer algorithm which is used to simulate and analyze the closed die hot forging process. For the purpose of process study, the entire deformation process has been appropriately divided into a finite number of steps, and the output values have been computed at each deformation step. The results of the simulation
have been graphically represented and suitable corrective
measures are also recommended, if the simulation results do not
agree with the theoretical values. This computer simulation approach would significantly improve productivity and reduce the energy consumption of the overall process for components manufactured by closed die forging, and contribute to efforts to reduce global warming.
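The stepwise computation the abstract describes can be sketched as follows for the simpler case of open-die upsetting of a cylindrical billet; the power-law flow-stress constants and geometry are assumed for illustration and are not taken from the paper:

```python
import math

# Assumed power-law flow stress (sigma = K * strain**n) and billet geometry;
# these constants are illustrative, not taken from the paper.
K, n = 500.0, 0.15            # MPa, strain-hardening exponent
h0, r0 = 100.0, 25.0          # initial billet height and radius, mm
steps, reduction = 20, 0.4    # 40% total height reduction in 20 steps

volume = math.pi * r0 ** 2 * h0
for i in range(1, steps + 1):
    h = h0 * (1 - reduction * i / steps)   # billet height after this step
    strain = math.log(h0 / h)              # true compressive strain
    stress = K * strain ** n               # flow stress at this strain, MPa
    area = volume / h                      # cross-section (volume constancy)
    force = stress * area / 1000.0         # forging force, kN
print(round(strain, 3), round(force, 1))   # values at the final step
```

Recording strain, stress, and force at each step, as the loop does, is what allows the graphical representation of parameter variation described above.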
American Depositary: A Case Study for Brazilian Market André Machado Caldeira, Giovanna Lamastra Pacheco, Walter Gassenferth, Maria Augusta Soares Machado Pages: 94-99
ABSTRACT: Specialists often question market efficiency. Some works
suggest arbitrage opportunities in several financial operations.
Such opportunities can be explained mainly by information
asymmetry, since pricing in the stock market is directly linked to information; therefore, the investor who gains access to such information soonest has a competitive advantage. The
objective of this paper is to verify the existence of arbitrage
opportunities via ADRs, traded in the American market, and
their respective stocks, which are traded in the domestic
market. Through a case study conducted with four companies,
not considering transaction costs, arbitrage opportunity windows were found. Among the companies studied, two had frequent arbitrage opportunities, and for one of them the arbitrage opportunity can be modeled by a time series model.
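The parity comparison behind such ADR arbitrage can be sketched as follows; the prices, the shares-per-ADR ratio, and the exchange rate below are hypothetical, not data from the case study:

```python
def adr_parity_gap(adr_usd: float, local_price_brl: float,
                   shares_per_adr: float, brl_per_usd: float) -> float:
    """Percentage premium of an ADR over its local-share equivalent; a gap
    wider than transaction costs signals a potential arbitrage window."""
    equivalent_usd = shares_per_adr * local_price_brl / brl_per_usd
    return 100.0 * (adr_usd - equivalent_usd) / equivalent_usd

# Hypothetical quotes: a 2-share ADR at USD 31.50, the local share at
# BRL 60.00, and an exchange rate of 4 BRL per USD.
gap = adr_parity_gap(adr_usd=31.50, local_price_brl=60.00,
                     shares_per_adr=2.0, brl_per_usd=4.00)
print(f"{gap:+.2f}%")   # a 5% premium on the ADR side
```

When the gap exceeds the round-trip transaction costs, buying the cheap leg and selling the expensive one locks in the difference.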
Monte Carlo Numerical Models for Nuclear Logging Applications Fusheng Li, Xiaogang Han Pages: 100-103
ABSTRACT: Nuclear logging is one of the most important logging services
provided by many oil service companies. The main parameters
of interest are formation porosity, bulk density, and natural
radiation. Other services are also provided from using complex
nuclear logging tools, such as formation lithology/mineralogy,
etc. Some parameters can be measured by using neutron
logging tools and some can only be measured by using a
gamma ray tool.
To understand the response of nuclear logging tools, the
neutron transport/diffusion theory and photon diffusion theory
are needed. Unfortunately, for most cases there are no analytical
answers if complex tool geometry is involved. For many years,
Monte Carlo numerical models have been used by nuclear
scientists in the well logging industry to address these
challenges. The models have been widely employed in the
optimization of nuclear logging tool design, and the
development of interpretation methods for nuclear logs. They
have also been used to predict the response of nuclear logging
systems for forward simulation problems. In this case, the
system parameters including geometry, materials and nuclear
sources, etc., are pre-defined and the transportation and
interactions of nuclear particles (such as neutrons, photons
and/or electrons) in the regions of interest are simulated
according to detailed nuclear physics theory and their nuclear
cross-section data (probability of interacting). Then the
deposited energies of particles entering the detectors are
recorded and tallied, and the tool responses for that scenario are generated. A general-purpose code named Monte Carlo N-Particle (MCNP) has been the industry standard for some time.
In this paper, we briefly introduce the fundamental principles of
Monte Carlo numerical modeling and review the physics of
MCNP. Some of the latest developments of Monte Carlo
Models are also reviewed. A variety of examples are presented
to illustrate the uses of Monte Carlo numerical models for the
development of major nuclear logging tools, including
compensated neutron porosity, compensated density, natural
gamma ray and a nuclear geo-mechanical tool.
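As a toy illustration of the Monte Carlo principle behind codes like MCNP (this sketch is ours and vastly simpler than real neutron/photon transport): sample exponential free paths through a homogeneous slab and compare the transmitted fraction with the analytic Beer-Lambert prediction exp(-mu*d):

```python
import math
import random

def transmitted_fraction(mu: float, thickness: float,
                         n: int = 200_000, seed: int = 42) -> float:
    """Toy Monte Carlo: fraction of particles crossing a slab without any
    interaction, sampling exponential free paths (mean free path 1/mu)."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n) if rng.expovariate(mu) > thickness)
    return hits / n

mu, d = 0.5, 2.0              # attenuation coefficient (1/cm) and slab (cm)
mc = transmitted_fraction(mu, d)
analytic = math.exp(-mu * d)  # Beer-Lambert prediction
print(round(mc, 3), round(analytic, 3))
```

Real transport codes add scattering, energy dependence, 3-D geometry, and detector tallies, but the core idea, sampling particle histories from cross-section probabilities and tallying outcomes, is the same.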
Electronic Algebra and Calculus Tutor Larissa Fradkin, Victor Zernov Pages: 104-110
ABSTRACT: Modern undergraduates join science and engineering courses with a poorer mathematical background than most contemporaries of the current faculty had when they were freshers. The problem is particularly acute in the United Kingdom but, as more and more countries adopt less resource-intensive models of teaching, the problem is spreading. University
tutors and lecturers spend more and more time covering the
basics. However, most of them still rely on traditional
methods of delivery which presuppose that learners have a
good memory and considerable time to practice, so that they
can memorize disjointed facts and discover for themselves
various connections between the underlying concepts. These
suppositions are particularly unrealistic when dealing with a
large number of undergraduates who are ordinary learners
with a limited mathematics background. The first author has developed a teaching system that allows such adult learners to achieve relatively deep learning of mathematics, and remarkably quickly, through a teacher-guided (often called Socratic) dialog, which aims at the frequent reinforcement of basic mathematical abstractions through Eulerian sequencing. These ideas have been applied to create a prototype of a Cognitive Mathematics Tutoring System aimed at teaching basic mathematics to University freshers: an electronic Personal Algebra and Calculus Tutor (e-PACT).
Environmental Knowledge as Design Development Agent Buthayna Hasan Eilouti Pages: 111-121
ABSTRACT: By linking knowledge learnt from nature with concepts of man-made product generation, Biomimetics represents an application of environmental knowledge to engineering design. In this paper, an exploratory approach based on the concepts of Biomimetics and their potential transformation into design processing agents is introduced. The approach is tested in a project that applies knowledge inspired by organisms in their natural biomes to the derivation of architectural designs. The project was implemented in a digital architectural design studio in order to model metamorphosis and simulate the adaptation of products. The results of the project implementation seem to encourage adopting its associative environmental problem-solving techniques and inter-disciplinary methods as alternatives or complements to conventional function- or form-oriented problem-solving approaches.