Information Management Processes for Extraction of Student Dropout Indicators in Courses in Distance Mode Renata Maria Abrantes Baracho, Paloma de Albuquerque Diesel Pages: 1-6
ABSTRACT: This research addresses the use of information management
processes in order to extract student dropout indicators in distance
mode courses. Distance education in Brazil aims to facilitate access to
information. The MEC (Ministry of Education) announced, in the
second semester of 2013, that the main obstacles faced by institutions
offering courses in this mode were students dropping out and the
resistance of both educators and students to this mode. The research
used a mixed methodology, qualitative and quantitative, to obtain
student dropout indicators. The factors found and validated in this
research were: the lack of interest from students, insufficient training
in the use of the virtual learning environment for students, structural
problems in the schools that were chosen to offer the course, students
without e-mail addresses, incoherent answers to course activities, and
lack of knowledge on the part of the student when using the computer tools.
The scenario considered was a course offered in distance mode called
Aluno Integrado (Integrated Student).
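As a rough illustration of how the validated indicators above could be operationalized in an information management process, the sketch below flags at-risk students with simple rules. The field names and thresholds are hypothetical, not taken from the study.

```python
# Hypothetical rule-based flagging of dropout risk; field names and
# thresholds are illustrative assumptions, not the study's actual data model.
from dataclasses import dataclass

@dataclass
class StudentRecord:
    has_email: bool            # indicator: students without e-mail
    vle_training_hours: float  # indicator: insufficient VLE training
    coherent_answers: bool     # indicator: incoherent answers to activities
    logins_last_30_days: int   # proxy for lack of interest

def dropout_indicators(s: StudentRecord) -> list:
    """Return the list of dropout indicators triggered for one student."""
    flags = []
    if not s.has_email:
        flags.append("no e-mail address")
    if s.vle_training_hours < 2:
        flags.append("insufficient training in the virtual learning environment")
    if not s.coherent_answers:
        flags.append("incoherent answers to course activities")
    if s.logins_last_30_days == 0:
        flags.append("no recent access (possible lack of interest)")
    return flags

if __name__ == "__main__":
    student = StudentRecord(has_email=False, vle_training_hours=1.0,
                            coherent_answers=True, logins_last_30_days=0)
    print(dropout_indicators(student))
```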
Tools for Teaching Mathematical Functions and Geometric Figures to Tactile Visualization through a Braille Printer for Visual Impairment People Lorena León, Luiz Cesar Martini, Cristhian Moreno-Chaparro Pages: 7-10
ABSTRACT: In this article, we present the features and facilities offered by two new computer programs developed for the treatment and generation of geometric figures and mathematical functions, rendered through a Braille printer for visually impaired people. The programs are fully accessible: users with complete visual impairment can operate the systems via shortcut keys and a speech synthesizer. The system issues sound messages that accompany the user throughout the process of generating geometrical figures or processing a mathematical function. Finally, the results are displayed as a tactile visualization for the visually impaired person, so that they can complete their geometry and mathematics studies.
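The article does not reproduce its implementation; as a rough illustration of the kind of processing such a tool performs, the sketch below rasterizes a mathematical function onto a coarse dot grid, the sort of intermediate representation a Braille embosser could turn into a tactile graphic. The grid dimensions and textual output are assumptions.

```python
# Illustrative sketch (not the authors' code): rasterize y = f(x) onto a
# coarse dot grid that could drive a tactile/Braille rendering.
import math

def rasterize(f, x_min, x_max, cols=30, rows=15):
    xs = [x_min + (x_max - x_min) * i / (cols - 1) for i in range(cols)]
    ys = [f(x) for x in xs]
    y_min, y_max = min(ys), max(ys)
    grid = [[" "] * cols for _ in range(rows)]
    for col, y in enumerate(ys):
        # Map the function value to a row index (top row = maximum value).
        row = round((y_max - y) / ((y_max - y_min) or 1) * (rows - 1))
        grid[row][col] = "o"          # "o" marks a raised dot
    return "\n".join("".join(r) for r in grid)

if __name__ == "__main__":
    print(rasterize(math.sin, 0, 2 * math.pi))
```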
A Project-Based Language Learning Model for Improving the Willingness to Communicate of EFL Students Ibrahim Farouck Pages: 11-18
ABSTRACT: Anxiety and inadequate motivation due to misapplication of some language teaching methodologies and learning materials have been shown to affect the Willingness to Communicate of students in EFL programs. This study used a Project-Based Language Learning model to improve learning motivation and content relevance. Students were grouped into pairs to conduct fieldwork activities on their chosen topics and learned the English language that was suitable for describing their activities and outcomes. They interacted with content and peers through Web 2.0 environments. In the classroom, they engaged in communicative tasks in a jigsaw format and presented their projects, where their peers used an online rubric and forum to give feedback. They also participated in a speech contest with peers outside their class or from another university in order to broaden their confidence. Findings from this study show that students were able to develop the language and evaluation skills for presentation. Additionally, they reported a reduction in communication anxiety.
Educating Future Coders with a Holistic ICT Curriculum and New Learning Solutions Pia Niemelä, Cristiano Di Flora, Martti Helevirta, Ville Isomöttönen Pages: 19-23
ABSTRACT: Technology orientation and coding are gaining momentum in Finnish curriculum planning for primary and secondary school. However, according to the existing plans, the scope of ICT teaching is limited to practical topics, e.g., drilling basic control structures (if-then-else, for, while), without addressing the high-level epistemological view of ICT. This paper proposes some key extensions to such plans, aimed at highlighting the epistemological factors of teaching rather than concrete means of strengthening the craftsmanship of coding. The proposed approach stems from qualitative data collected by interviewing ICT professionals (N=7, 4 males, 3 females) who have gained experience of industry needs over their careers (avg=11.3 y, s=3.9 y). This work illustrates a holistic model of ICT teaching and suggests a set of new methods and tools.
Challenges with Ethical Behavior and Accountability in Leadership Laura Thompson Pages: 24-29
ABSTRACT: In terms of purpose, accountability systems are designed to
apply governance, and in some cases, legislate rules, in order
to impact the quality of the end result, or control the behavior
of people and their environments [19]. The rules within
accountability systems are usually implicit, intrinsic, very
detailed, and fully known by only a few people. Education and
levels of leadership are some of the main factors leading to
breakdown of communication and accountability within
organizational structures. However, business intelligence tools
such as knowledge management [11] make it easier to access,
capture, and share information and to make decisions on
accountability within organizations.
Strategic misalignment occurs when decisions are
made without communication or ethical standards [13]. To
address the challenges associated with accountability in for-profit
and non-profit organizations, a sequential explanatory mixed
methods design was employed, along with action research.
Participants of the study were interviewed and asked seven
qualitative questions in an effort to explain the quantitative
results. The process of gathering and consolidating the qualitative
results took approximately six months. Three main
classifications of accountability systems were derived from the
interviews: personal accountability, financial accountability,
and organizational accountability [8]. To ensure the credibility
of the findings in the qualitative analysis, a framework for
additional, more rigorous study is presented here.
Biotreatment of Slaughterhouse Wastewater Accompanied with Sustainable Electricity Generation in Microbial Fuel Cell Zainab Z. Ismail, Ali J. Mohammed Pages: 30-35
ABSTRACT: This study aimed to investigate the performance of a microbial fuel cell (MFC) for simultaneous bioremediation of slaughterhouse wastewater and sustainable power generation. For the first time, an integrated system of tubular-type microbial fuel cell (MFC) was used in this study. The MFC consisted of three concentric Plexiglas tubes: the inner tube was the anaerobic anodic compartment, the mid tube was the aerobic biocathodic chamber, and the outer tube acted as an aerobic bioreactor for an extended nitrification process. The MFC system was connected to a complementary external anaerobic bioreactor for the denitrification process. The microbial fuel cell was inoculated with freshly collected activated sludge and was continuously fueled with simulated slaughterhouse wastewater. Results revealed that the removal efficiency of the chemical oxygen demand (COD) was up to 99%, and the power generation was 165 mW/m2. Also, results demonstrated that the maximum removal of NO3- via the denitrification process in the final effluent was 94.7% when the initial concentration of NO3- in the effluent of the extended bioreactor was 15.2 mg/L. Approximately complete recovery of nitrogen gas was obtained in the complementary external anaerobic bioreactor. These results indicate that the MFC could be a promising approach for slaughterhouse wastewater bioremediation and renewable power generation.
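The two headline figures follow from standard MFC calculations (removal efficiency from influent/effluent concentrations, power density from P = V·I over electrode area). The sketch below reproduces numbers of the same magnitude; the voltage, current, electrode area, and COD values used here are hypothetical, not the paper's measurements.

```python
# Back-of-the-envelope check of the headline figures; input values are
# hypothetical, only the standard formulas are fixed.
def cod_removal_efficiency(cod_in_mg_l, cod_out_mg_l):
    """Percentage COD removed between influent and effluent."""
    return 100.0 * (cod_in_mg_l - cod_out_mg_l) / cod_in_mg_l

def power_density_mw_per_m2(voltage_v, current_ma, electrode_area_m2):
    # P = V * I; with current in mA the product is already in mW.
    return voltage_v * current_ma / electrode_area_m2

if __name__ == "__main__":
    print(cod_removal_efficiency(1500, 15))          # 99.0 % removal
    print(power_density_mw_per_m2(0.55, 3.0, 0.01))  # 165.0 mW/m2
```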
Towards a Predictive Model of Academic Performance Using Data Mining in the UTN - FRRe David L. La Red Martínez, Marcelo Karanik, Mirtha Giovannini, Reinaldo Scappini Pages: 36-41
ABSTRACT: Students completing the courses required to become an Engineer in Information Systems at the Resistencia Regional Faculty, National Technological University, Argentina (UTN-FRRe), face the challenge of attending classes and fulfilling course regularization requirements, often for correlative courses. Such is the case of the freshman course Algorithms and Data Structures: it must be regularized in order for students to be able to attend several second- and third-year courses. Based on the results of the project entitled “Profiling of students and academic performance through the use of data mining”, 25/L059 - UTI1719, carried out in the aforementioned course (2013-2015), a new project has started, which takes the descriptive analysis (what happened) as a starting point and applies advanced analytics to explain why it happened, what will happen, and how it can be addressed. Different data mining tools will be used for the study: clustering, neural networks, Bayesian networks, decision trees, regression and time series, among others. These tools allow different results to be obtained from different perspectives for the given problem. In this way, potentially problematic situations can be detected at the beginning of courses, and the necessary measures can be taken to address them. Thereby, the aim of this project is to identify students who are at risk of abandoning the degree program, in order to give them special support and avoid that outcome. Decision trees are mainly used as the predictive classification technique.
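Since decision trees are named as the main predictive technique, a minimal sketch of that step is shown below. It assumes scikit-learn is available; the feature names and toy training data are illustrative, not the project's dataset.

```python
# Illustrative sketch of the predictive step: a decision tree that flags
# at-risk students. Features and training data are made-up assumptions.
from sklearn.tree import DecisionTreeClassifier

# Features: [attendance_rate, assignments_submitted, midterm_score]
X_train = [
    [0.95, 8, 7.5],
    [0.40, 2, 3.0],
    [0.80, 6, 6.0],
    [0.30, 1, 2.5],
    [0.70, 5, 5.5],
    [0.20, 0, 1.0],
]
y_train = [0, 1, 0, 1, 0, 1]  # 1 = at risk of abandoning the program

model = DecisionTreeClassifier(max_depth=2, random_state=0)
model.fit(X_train, y_train)

print(model.predict([[0.35, 1, 2.0]]))  # -> [1] (at risk) with this toy data
```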
SIGMATA: Storage Integrity Guaranteeing Mechanism against Tampering Attempts for Video Event Data Recorders Hyuckmin Kwon, Seulbae Kim, Heejo Lee Pages: 42-47
ABSTRACT: The usage and market size of video event data recorders (VEDRs), also known as car black boxes, are rapidly increasing. Since VEDRs can provide more visual information about car accident situations than any other device that is currently used for accident investigations (e.g., closed-circuit television), the integrity of the VEDR contents is important to any meaningful investigation. Researchers have focused on file system integrity or photographic approaches to integrity verification. However, unlike other general data, the video data in VEDRs exhibit a unique I/O behavior in that the videos are stored chronologically. In addition, the owners of VEDRs can manipulate unfavorable scenes after accidents to conceal their recorded behavior. Since prior art does not consider the time relationship between the frames and fails to discover frame-wise forgery, a more detailed integrity assurance is required. In this paper, we focus on the development of a frame-wise forgery detection mechanism that resolves the limitations of previous mechanisms. We introduce SIGMATA, a novel storage integrity guaranteeing mechanism against tampering attempts for VEDRs. We describe its operation, demonstrate its effectiveness for detecting possible frame-wise forgery, and compare it with existing mechanisms. The results show that existing mechanisms fail to detect any frame-wise forgery, while our mechanism thoroughly detects every frame-wise forgery. We also evaluate its computational overhead using real VEDR videos. The results show that SIGMATA indeed discovers frame-wise forgery attacks effectively and efficiently, with an encoding overhead of less than 1.5 milliseconds per frame.
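As a generic illustration of the frame-wise integrity idea described above (not SIGMATA's actual construction, which is specified in the paper), the sketch below chains each frame's digest to its predecessor so that modifying, inserting, or dropping any frame breaks verification.

```python
# Generic frame-wise integrity via hash chaining; illustrative only,
# not SIGMATA's exact mechanism.
import hashlib

def chain_frames(frames):
    """Return one chained digest per frame: h_i = SHA-256(h_{i-1} || frame_i)."""
    digests, prev = [], b""
    for frame in frames:
        prev = hashlib.sha256(prev + frame).digest()
        digests.append(prev.hex())
    return digests

def verify(frames, digests):
    return chain_frames(frames) == digests

if __name__ == "__main__":
    video = [b"frame-0", b"frame-1", b"frame-2"]
    stored = chain_frames(video)
    video[1] = b"tampered"        # frame-wise forgery
    print(verify(video, stored))  # False: tampering detected
```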
Hypertextuality in the Alexander von Humboldt Digital Library Detlev Doherr, Andreas Jankowski Pages: 48-53
ABSTRACT: To do justice to the legacy of Alexander von Humboldt, a 19th-century German scientist and explorer, an information and knowledge management system is required to preserve the author’s original intent and promote an awareness of all his relevant works. Although all of Humboldt's works can be found on the internet as digitized papers, the complexity and internal interconnectivity of the writings are not very transparent. Humboldt's concepts of interaction cannot be adequately represented by digitized papers or scanned documents alone.
The Humboldt Portal is an attempt to create a new generation of digital libraries, providing a new form of interaction and synthesis between humanistic texts and scientific observation. The digital version of his documents supplies dynamic links to sources, maps, images, graphs and relevant texts in accordance with his visions, because “everything is interconnectedness”.
The Time Diagram Control Approach for the Dynamic Representation of Time-Oriented Data Rolf Dornberger, Darjan Hil, Johann Wittwer, Pascal Bürgy Pages: 54-60
ABSTRACT: The dynamic representation of time-oriented data on small-screen devices is of increasing importance. Most solution approaches apply issue-specific requirements based on established desktop technologies. Applied to mobile devices with small multi-touch displays, such approaches often lead to limited usability. In particular, time-dependent data can only be fragmentarily visualized due to the limited screen size. Instead of the visualization reducing the complexity of the data, the interpretation of the data becomes more complex. This paper proposes the Time Diagram Control (TDC) approach, a new way of representing time-based diagrams on small-screen devices. The TDC uses a principle of cybernetics to integrate the user in the visualization process and thus reduce complexity. TDC focuses on simplicity of design by providing only 2D temporal line diagrams with a dynamic zooming function that works via standard multi-touch controls. By involving the user in a continuous loop of refining the visualization, TDC allows data of different temporal granularities to be compared without losing the overall context of the presented data. The TDC approach ensures constant information reliability on small-screen devices.
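A small sketch of the kind of zoom-dependent reduction such a control needs is shown below: only the points inside the visible time window are kept, bucketed to the available pixel columns. The function and its parameters are assumptions for illustration, not the TDC implementation.

```python
# Minimal zoom-window downsampling sketch for a small-screen time diagram.
def downsample(series, t_start, t_end, columns):
    """series: list of (t, value); return at most `columns` (t, value) pairs."""
    visible = [(t, v) for t, v in series if t_start <= t <= t_end]
    if len(visible) <= columns:
        return visible
    bucket = len(visible) / columns
    out = []
    for i in range(columns):
        chunk = visible[int(i * bucket):int((i + 1) * bucket)]
        if chunk:
            out.append(max(chunk, key=lambda p: p[1]))  # keep each bucket's peak
    return out

if __name__ == "__main__":
    data = [(t, (t % 17) * 0.1) for t in range(10_000)]
    print(len(downsample(data, 2_000, 8_000, columns=320)))  # <= 320 points
```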
Q-Learning Multi-Objective Sequential Optimal Sensor Parameter Weights Raquel Cohen, Mark Rahmes, Kevin Fox, George Lemieux Pages: 61-66
ABSTRACT: The goal of our solution is to deliver trustworthy decision making
analysis tools which evaluate situations and potential impacts of
such decisions through acquired information and add efficiency for
continuing mission operations and analyst information. We discuss
the use of cooperation in modeling and simulation and show
quantitative results for design choices in resource allocation. The
key contribution of our paper is to combine remote sensing decision
making with Nash Equilibrium for sensor parameter weighting
optimization. By calculating all Nash Equilibrium possibilities per
period, optimization of sensor allocation is achieved for overall
higher system efficiency. Our tool provides insight into the
most important or optimal weights for sensor parameters and can be
used to tune those weights efficiently.
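The abstract's core combination is Nash Equilibrium applied to sensor parameter weighting. As a rough illustration (not the paper's model), the sketch below enumerates the pure-strategy Nash equilibria of a small two-player game whose strategies stand in for candidate sensor weight profiles; the payoff matrices are made up.

```python
# Enumerate pure-strategy Nash equilibria of a small bimatrix game;
# payoffs are hypothetical stand-ins for sensor weighting utilities.
def pure_nash(payoff_a, payoff_b):
    """Return (row, col) pairs where neither player gains by deviating."""
    rows, cols = len(payoff_a), len(payoff_a[0])
    eq = []
    for r in range(rows):
        for c in range(cols):
            best_row = all(payoff_a[r][c] >= payoff_a[r2][c] for r2 in range(rows))
            best_col = all(payoff_b[r][c] >= payoff_b[r][c2] for c2 in range(cols))
            if best_row and best_col:
                eq.append((r, c))
    return eq

if __name__ == "__main__":
    # Strategy index = choice of sensor weighting profile (illustrative values).
    A = [[3, 1], [2, 4]]    # payoffs for player 1
    B = [[2, 1], [1, 3]]    # payoffs for player 2
    print(pure_nash(A, B))  # [(0, 0), (1, 1)]
```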
Multi-User Virtual Reality Therapy for Post-Stroke Hand Rehabilitation at Home Daria Tsoupikova, Kristen Triandafilou, Greg Rupp, Fabian Preuss, Derek Kamper Pages: 67-71
ABSTRACT: Our paper describes the development of a novel multi-user
virtual reality (VR) system for post-stroke rehabilitation that
can be used independently in the home to improve upper
extremity motor function. This is the pre-clinical phase of an
ongoing collaborative, interdisciplinary research project at the
Rehabilitation Institute of Chicago involving a team of
engineers, researchers, occupational therapists and artists. This
system was designed for creative collaboration within a virtual
environment to increase patients’ motivation, further their
engagement, and alleviate the impact of social isolation
following stroke. This is a low-cost system adapted to everyday
environments and designed to run on a personal computer that
combines three VR environments with audio integration,
wireless Kinect tracking and hand motion tracking sensors.
Three different game exercises for this system were developed
to encourage repetitive task practice, collaboration and
competitive interaction. The system is currently being tested
with 15 subjects in three settings: a multi-user VR, a single-user
VR and at a tabletop with standard exercises to examine the
level of engagement and to compare resulting functional
performance across methods. We hypothesize that stroke
survivors will become more engaged in therapy when training
with a multi-user VR system and this will translate into greater
gains.
Adding agility to Enterprise Process and Data Engineering Sergey Zykov, Pavel Shapkin, Nikolay Kazantsev, Vladimir Roslovtsev Pages: 72-77
ABSTRACT: Managing the development of large and complex enterprise architectures is a key problem in enterprise engineering. Nowadays, one of the most pressing topics in the enterprise context is real-time system agility. The paper discusses an appropriate general architecture pattern and provides insights into how a dynamic process management environment could be built. We survey general enterprise software architecture and current agility problems. We introduce a special component called a process knowledge base and justify its crucial role in achieving agility within the enterprise. We study both the architecture of the process knowledge base and the formal basis for its implementation, which relies upon type theory.
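A very rough sketch of the type-theoretic idea behind such a process knowledge base is shown below: process steps declare input and output types, and composition is only allowed when the types line up. The step names, types, and composition rule are hypothetical illustrations, not the authors' formalism.

```python
# Toy typed process composition; illustrative of the general idea only.
from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class Step:
    name: str
    in_type: type
    out_type: type
    run: Callable[[Any], Any]

def compose(a: Step, b: Step) -> Step:
    """Chain two steps only if a's output type matches b's input type."""
    if a.out_type is not b.in_type:
        raise TypeError(f"cannot chain {a.name} -> {b.name}: "
                        f"{a.out_type.__name__} != {b.in_type.__name__}")
    return Step(f"{a.name};{b.name}", a.in_type, b.out_type,
                lambda x: b.run(a.run(x)))

if __name__ == "__main__":
    parse = Step("parse_order", str, dict, lambda s: {"order": s})
    price = Step("price_order", dict, float, lambda d: 42.0)
    pipeline = compose(parse, price)  # type-checked at composition time
    print(pipeline.run("order#1"))    # 42.0
```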
Automatic Parallelization Tool: Classification of Program Code for Parallel Computing Mustafa Basthikodi, Waseem Ahmed Pages: 78-82
ABSTRACT: Performance growth of single-core processors came to a halt in the past decade, but was re-enabled by the introduction of parallelism in processors. Multicore architectures, along with graphics processing units, have broadly expanded the available parallelism. A few compilers have been updated to address the developing challenges of synchronization and threading. Appropriate program and algorithm classification can greatly help software engineers identify opportunities for effective parallelization. In the present work we investigate current species-based classification of algorithms; related work on classification is discussed, along with a comparison of the issues that challenge classification. A set of algorithms is chosen whose structures match the different issues and perform the given tasks. We tested these algorithms using existing automatic species extraction tools along with the Bones compiler. We added functionality to the existing tool, providing a more detailed characterization. The contributions of our work include support for pointer arithmetic, conditional and incremental statements, user-defined types, constants, and mathematical functions. With this, we can retain significant information that is not captured by the original species of algorithms. We implemented these extensions in the tool, enabling automatic characterization of program code.
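To illustrate what "species"-style classification means in practice (this is a toy stand-in, not the Bones tool or the authors' extraction tool), the sketch below labels a loop as an element-wise map or a reduction from its abstract syntax tree; the labels and detection rules are simplified assumptions.

```python
# Toy loop classifier: element-wise (parallelizable map) vs. reduction.
import ast

ELEMENTWISE = "element -> element (parallelizable map)"
REDUCTION = "chunk -> element (reduction)"

def classify_loop(src: str) -> str:
    """Classify the first for-loop in `src` by its write pattern."""
    loop = next(n for n in ast.walk(ast.parse(src)) if isinstance(n, ast.For))
    for node in ast.walk(loop):
        if isinstance(node, ast.AugAssign) and isinstance(node.target, ast.Name):
            return REDUCTION          # accumulates into a scalar
        if isinstance(node, ast.Assign) and isinstance(node.targets[0], ast.Subscript):
            return ELEMENTWISE        # writes out[i] per iteration
    return "unclassified"

if __name__ == "__main__":
    print(classify_loop("for i in range(n):\n    out[i] = a[i] * 2\n"))
    print(classify_loop("for i in range(n):\n    total += a[i]\n"))
```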
Securing Information Systems in an Uncertain World Enterprise Level Security (Invited Paper) William R. Simpson Pages: 83-90
ABSTRACT: Increasing threat intrusions to enterprise computing systems have led to a formulation of guarded enterprise systems. The approach was to put in place steel gates and prevent hostile entities from entering the enterprise domain. The current complexity level has made the fortress approach to security implemented throughout the defense, banking, and other high-trust industries unworkable. The alternative security approach presented in this paper is the result of a concentrated fourteen-year program of pilots and research. Its distributed approach has no need for passwords or accounts and derives from a set of tenets that form the basic security model requirements. At each step in the process it determines identities and claims for access and privileges. These techniques are resilient, secure, extensible, and scalable. They are currently being implemented for a major enterprise, and are a candidate for other enterprise security approaches. This paper discusses the Enterprise Level Security architecture, a web-based security architecture designed to select and incorporate technology into a cohesive set of policies and rules for an enterprise information system. The paper discusses the history, theoretical underpinnings, implementation decisions, current status, and future plans for expansion of capabilities and scale.
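The abstract's central pattern is access driven by identity and claims rather than passwords or accounts. As a generic illustration of that pattern only (not the ELS implementation described in the paper), the sketch below grants access based on a signed claims token, using a simple HMAC signature as a stand-in for enterprise key material.

```python
# Generic claims-based access check; illustrative stand-in, not ELS itself.
import hashlib, hmac, json

SECRET = b"demo-shared-secret"   # stand-in for real enterprise key material

def issue_token(claims: dict) -> str:
    """Serialize the claims and append an HMAC-SHA256 signature."""
    body = json.dumps(claims, sort_keys=True)
    sig = hmac.new(SECRET, body.encode(), hashlib.sha256).hexdigest()
    return f"{body}.{sig}"

def authorize(token: str, required_claim: str) -> bool:
    body, _, sig = token.rpartition(".")
    expected = hmac.new(SECRET, body.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return False               # tampered or forged token
    return required_claim in json.loads(body).get("claims", [])

if __name__ == "__main__":
    token = issue_token({"subject": "analyst-17", "claims": ["read:reports"]})
    print(authorize(token, "read:reports"))        # True
    print(authorize(token + "x", "read:reports"))  # False: bad signature
```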