15 results for 080401 Coding and Information Theory
in Digital Commons at Florida International University
Abstract:
This research pursued the conceptualization and real-time verification of a system that allows a computer user to control the cursor of a computer interface without using his/her hands. The target user groups for this system are individuals who are unable to use their hands due to spinal dysfunction or other afflictions, and individuals who must use their hands for higher priority tasks while still requiring interaction with a computer.

The system receives two forms of input from the user: Electromyogram (EMG) signals from muscles in the face and point-of-gaze coordinates produced by an Eye Gaze Tracking (EGT) system. In order to produce reliable cursor control from the two forms of user input, the development of this EMG/EGT system addressed three key requirements: an algorithm was created to accurately translate EMG signals due to facial movements into cursor actions, a separate algorithm was created that recognized an eye gaze fixation and provided an estimate of the associated eye gaze position, and an information fusion protocol was devised to efficiently integrate the outputs of these algorithms.

Experiments were conducted to compare the performance of EMG/EGT cursor control to EGT-only control and mouse control. These experiments took the form of two different types of point-and-click trials. The data produced by these experiments were evaluated using statistical analysis, Fitts' Law analysis and target re-entry (TRE) analysis.

The experimental results revealed that though EMG/EGT control was slower than EGT-only and mouse control, it provided effective hands-free control of the cursor without a spatial accuracy limitation, and it also facilitated a reliable click operation. This combination of qualities is not possessed by either EGT-only or mouse control, making EMG/EGT cursor control a unique and practical alternative for a user's cursor control needs.
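The Fitts' Law analysis mentioned above rates each point-and-click trial by an index of difficulty and a throughput. A minimal sketch of that computation, using the common Shannon formulation (the function names and parameters are illustrative, not taken from the dissertation):

```python
import math

def fitts_index_of_difficulty(distance, width):
    """Shannon formulation of Fitts' index of difficulty, in bits:
    ID = log2(D/W + 1), for movement distance D to a target of width W."""
    return math.log2(distance / width + 1)

def throughput(distance, width, movement_time):
    """Throughput in bits/s for one point-and-click trial:
    index of difficulty divided by the observed movement time."""
    return fitts_index_of_difficulty(distance, width) / movement_time
```

For example, a 7 cm movement to a 1 cm target has an index of difficulty of 3 bits; if the trial takes 1.5 s, the throughput is 2 bits/s.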
Abstract:
With the advent of peer-to-peer networks, and more importantly sensor networks, the desire to extract useful information from continuous and unbounded streams of data has become more prominent. For example, in tele-health applications, sensor-based data streaming systems are used to continuously and accurately monitor Alzheimer's patients and their surrounding environment. Typically, the requirements of such applications necessitate the cleaning and filtering of continuous, corrupted and incomplete data streams gathered wirelessly in dynamically varying conditions. Yet, existing data stream cleaning and filtering schemes are incapable of capturing the dynamics of the environment while simultaneously suppressing the losses and corruption introduced by uncertain environmental, hardware, and network conditions. Consequently, existing data cleaning and filtering paradigms are being challenged. This dissertation develops novel schemes for cleaning data streams received from a wireless sensor network operating under non-linear and dynamically varying conditions. The study establishes a paradigm for validating spatio-temporal associations among data sources to enhance data cleaning. To simplify the complexity of the validation process, the developed solution maps the requirements of the application onto a geometrical space and identifies the potential sensor nodes of interest. Additionally, this dissertation models a wireless sensor network data reduction system by ascertaining that segregating data adaptation and prediction processes will augment the data reduction rates. The schemes presented in this study are evaluated using simulation and information theory concepts. The results demonstrate that dynamic conditions of the environment are better managed when validation is used for data cleaning. They also show that when a fast-converging adaptation process is deployed, data reduction rates are significantly improved.
Targeted applications of the developed methodology include machine health monitoring, tele-health, environment and habitat monitoring, intermodal transportation and homeland security.
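The idea of prediction-driven data reduction can be illustrated with a last-value-prediction filter: the node transmits a reading only when the prediction misses by more than a tolerance. This is a generic dual-prediction baseline for sketching purposes, not the scheme developed in the dissertation:

```python
def reduce_stream(readings, threshold):
    """Prediction-based data reduction sketch: transmit a reading only
    when the last-value prediction errs by more than `threshold`.
    Returns the (index, value) pairs that would actually be sent."""
    transmitted = []
    last = None  # last transmitted value, shared with the sink
    for t, value in enumerate(readings):
        if last is None or abs(value - last) > threshold:
            transmitted.append((t, value))
            last = value
    return transmitted
```

For instance, with readings [20.0, 20.1, 20.2, 21.5, 21.6] and a threshold of 0.5, only two of the five readings are transmitted, a 60% reduction; a faster-converging predictor would suppress even more traffic.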
Abstract:
The purpose of this study was to test Lotka’s law of scientific publication productivity using the methodology outlined by Pao (1985), in the field of Library and Information Studies (LIS). Lotka’s law has been sporadically tested in the field over the past 30+ years, but the results of these studies are inconclusive due to the varying methods employed by the researchers.

A data set of 1,856 citations found using the ISI Web of Knowledge databases was studied. The values of n and c were calculated to be 2.1 and 0.6418 (64.18%) respectively. The Kolmogorov-Smirnov (K-S) one sample goodness-of-fit test was conducted at the 0.10 level of significance. The Dmax value is 0.022758 and the calculated critical value is 0.026562. It was determined that the null hypothesis stating that there is no difference in the observed distribution of publications and the distribution obtained using Lotka’s and Pao’s procedure could not be rejected.

This study finds that literature in the field of Library and Information Studies does conform to Lotka’s law with reliable results. As a result, Lotka’s law can be used in LIS as a standardized means of measuring author publication productivity which will lead to findings that are comparable on many levels (e.g., department, institution, national). Lotka’s law can be employed as an empirically proven analytical tool to establish publication productivity benchmarks for faculty and faculty librarians. Recommendations for further study include (a) exploring the characteristics of the high and low producers; (b) finding a way to successfully account for collaborative contributions in the formula; and (c) a detailed study of institutional policies concerning publication productivity and its impact on the appointment, tenure and promotion process of academic librarians.
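Pao's procedure compares the observed author-productivity distribution against the generalized Lotka distribution f(x) = c / x^n and measures the largest cumulative gap with the K-S statistic. A rough sketch using the n = 2.1 and c = 0.6418 reported above (the function names and the input format are illustrative, not from the study):

```python
def lotka_expected(x, n=2.1, c=0.6418):
    """Expected proportion of authors with exactly x publications under
    the generalized Lotka law f(x) = c / x**n, with the study's n and c."""
    return c / x ** n

def ks_dmax(observed_props, n=2.1, c=0.6418):
    """K-S statistic D: the largest gap between the cumulative observed
    and cumulative expected author proportions (sketch of Pao's test).
    observed_props[i] is the observed share of authors with i+1 papers."""
    dmax, cum_obs, cum_exp = 0.0, 0.0, 0.0
    for x, obs in enumerate(observed_props, start=1):
        cum_obs += obs
        cum_exp += lotka_expected(x, n, c)
        dmax = max(dmax, abs(cum_obs - cum_exp))
    return dmax
```

The fit is accepted when D falls below the critical value for the chosen significance level, which is how the study's Dmax of 0.022758 against a critical value of 0.026562 leads to non-rejection.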
Abstract:
This paper analyzes how José López’s participatory action research and transformational learning theory address the oppressed Puerto Rican experience. The paper examines the historical experience of colonialism, explains these two theories, and explores López’s adult education work in the Puerto Rican community using participatory action research and transformational learning.
Abstract:
Positive psychology has garnered great attention toward understanding how individuals develop personal resources to enhance well-being and flow. Barbara Fredrickson’s broaden-and-build theory suggests that when individuals imbue various personal resources with more positive affect, they are more likely to develop greater resilient assets as a result.
Abstract:
A possible gap exists between what parents and preschool providers know concerning children's readiness for school and what they should know when compared to teacher expectations. Students are experiencing difficulty in early schooling as a result of this gap in perspectives. This study's purpose was to describe, explain, and analyze the perspectives of parents, teachers, and preschool providers concerning school readiness. The qualitative strategy of interviewing was used with six parents, six teachers, and two preschool provider participants. Interview transcripts, field notes, member checking, and document analysis were used to interpret data and support findings. Categorization and coding organized data and aided in theory development.

Major findings of the study include: (a) All participant groups stress social skills, communication skills, and enthusiasm as most valuable for school readiness; (b) All participant groups agree parents have primary responsibility for readiness preparation; (c) Many participants suggest variables concerning family, economics, and home life contribute to a lack of readiness; (d) Parents place greater value on academic skills than teachers or preschool providers; (e) Preschool programs are identified as having the potential to significantly influence readiness; (f) Communicating, providing positive learning experiences, and providing preschool experience are valuable ways to prepare students for school, yet differences were found in the types of experiences noted; (g) Participant perspectives indicate that informing parents of readiness expectations is of major importance, and they offer suggestions to accomplish this goal such as using public libraries and pediatrician offices as houses for written information and having kindergarten teachers make presentations at preschools.

This study concludes that parents and preschool providers do have knowledge concerning readiness for school. They may not, however, be in a position to carry out their responsibilities due to the intervening variables that inhibit the amount of time, interaction, and communication they have with the children in their care. This study discloses the beliefs of parents and preschool providers that children are ready for school, while teachers conclude that many children are not ready. Suggestions for readiness preparation and information dissemination are significant findings that offer implications for practice and future study.
Abstract:
This research investigated the effectiveness and efficiency of structured writing as compared to traditional nonstructured writing as a teaching and learning strategy in a training session for teachers.

Structured writing is a method of identifying, interrelating, sequencing, and graphically displaying information on fields of a page or computer. It is an alternative for improving training and educational outcomes by providing an effective and efficient documentation methodology.

The problem focuses on the contradiction between: (a) the supportive research and theory to modify traditional methods of written documents and information presentation and (b) the existing paradigm to continue with traditional communication methods.

A MANOVA was used to determine significant difference between a control and an experimental group in a posttest-only experimental design. The experimental group received the treatment of structured writing materials during a training session. Two variables were analyzed: (a) effectiveness, measured by correct items on a posttest, and (b) efficiency, measured by time spent on the test.

The quantitative data showed a difference for the experimental group on the two dependent variables. The experimental group completed the posttest in 2 minutes less time while scoring 1.5 more items correct. An interview with the training facilitators revealed that the structured writing materials were "user friendly."
Abstract:
The primary purpose of this research is to study the linkage between perceived job design characteristics and information system environment characteristics before and after the replacement of a legacy information system with a new type of information system (referred to as an Enterprise Resource Planning or ERP system). A public state university implementing an academic version of an ERP system was selected for the study. Three survey instruments were used to examine the perception of the information system, the job characteristics, and the organizational culture before and after the system implementation. The research participants included two large departments resulting in a sample of 130 workers. Research questions were analyzed using multivariate procedures including factor analysis, path analysis, step-wise regression, and matched-pair analysis.

Results indicated that the ERP system has introduced new elements into the working environment that have changed the perception of how the job design characteristics and organization culture dimensions are viewed by the workers. The understanding of how the perceived system characteristics align with an individual's perceived job design characteristics is supported by each of the system characteristics correlating significantly in the proposed direction. Stronger support for this relationship becomes visible in the causal flow of the effects seen in the path diagram and in the step-wise regression. The alignment of perceived job design characteristics with dimensions of organizational culture is not as strong as the literature suggests. Although there are significant correlations between the job and culture variables, only one relationship can be seen in the causal flow.

This research has demonstrated that system characteristics of ERP do contribute to the perception of change in an organization and do support organizational culture behaviors and job characteristics.
Abstract:
The field of chemical kinetics is an exciting and active one. The prevailing theories make a number of simplifying assumptions that do not always hold in actual cases. Another current problem concerns the development of efficient numerical algorithms for solving the master equations that arise in the description of complex reactions. The objective of the present work is to furnish a completely general and exact theory of reaction rates, in a form reminiscent of transition state theory, valid for all fluid phases, and also to develop a computer program that can solve complex reactions by finding the concentrations of all participating substances as a function of time. To do so, the full quantum scattering theory is used to derive the exact rate law, and the resulting cumulative reaction probability is then put into several equivalent forms that take into account all relativistic effects where applicable, including one that is strongly reminiscent of transition state theory but includes corrections from scattering theory. Finally, two programs, one for solving complex reactions and the other for solving first-order linear kinetic master equations, have been developed and tested on simple applications.
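In the simplest case, the concentration-vs-time problem reduces to integrating a first-order rate law dc/dt = -kc, whose analytic solution is c(t) = c0·exp(-kt). A minimal forward-Euler sketch of that integration (illustrative only; the dissertation's solver handles coupled master equations for complex reactions):

```python
import math

def integrate_first_order(k, c0, t_end, dt=1e-4):
    """Forward-Euler integration of the first-order rate law
    dc/dt = -k * c, starting from concentration c0 at t = 0.
    Returns the concentration at t_end."""
    c, t = c0, 0.0
    while t < t_end:
        c += -k * c * dt  # Euler step: c(t+dt) ~ c(t) + dt * dc/dt
        t += dt
    return c
```

With k = 1 and c0 = 1, the numerical result at t = 1 closely tracks the analytic value exp(-1); shrinking the step size dt tightens the agreement, which is the usual check before trusting the integrator on a coupled system.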
Abstract:
This study evaluated the early development and pilot-testing of Project IMPACT, a case management intervention for victims of stalking. The Design and Development framework (Rothman & Thomas, 1994) was used as a guide for program development and evaluation. Nine research questions examined the processes and outcomes associated with program implementation.

The sample included all 36 clients who participated in Project IMPACT between February of 2000 and June of 2001, as well as the victim advocates who provided them with services. Quantitative and qualitative data were drawn from client case files, participant observation field notes and interview transcriptions. Quantitative data were entered into three databases where: (1) clients were the units of analysis (n = 36), (2) services were the units of analysis (n = 1146), and (3) goals were the units of analysis (n = 149). These data were analyzed using descriptive statistics, Pearson's Chi-square, Spearman's Rho, Phi, Cramer's V, Wilcoxon's Matched-Pairs Signed-Rank Test and McNemar's Test Statistic. Qualitative data were reduced via open, axial and selective coding methods. Grounded theory and case study frameworks were utilized to analyze these data.

Results showed that most clients noted an improved sense of well-being and safety, although residual symptoms of trauma remained for numerous individuals. Stalkers appeared to respond to criminal and civil justice-based interventions by reducing violent and threatening behaviors; however, covert behaviors continued. The study produced findings that provided preliminary support for the use of several intervention components including support services, psycho-education, safety planning, and boundary spanning. The psycho-education and safety planning in particular seemed to help clients cognitively reframe their perceptions of the stalking experience and gain a sense of increased safety and well-being. A 65% level of satisfactory goal achievement was observed overall, although goals involving justice-based organizations were associated with lower achievement. High service usage was related to low-income clients and those lacking in social support. Numerous inconsistencies in program implementation were found to be associated with the skills and experiences of victim advocates. Thus, recommendations were made to further refine, develop and evaluate the intervention.
Abstract:
In outsourcing relationships with China, the Electronic Manufacturing (EM) and Information Technology Services (ITS) industry in Taiwan may possess such advantages as the continuing growth of its production value, a complete manufacturing supply chain, low production cost, a large-scale Chinese market, and linguistic and cultural similarity, compared to outsourcing to other countries. Nevertheless, the Council for Economic Planning and Development of Executive Yuan (CEPD) found that Taiwan's IT services outsourcing to China is subject to certain constraints and might not be as successful as EM outsourcing (Aggarwal, 2003; CEPD, 2004a; CIER, 2003; Einhorn and Kriplani, 2003; Kumar and Zhu, 2006; Li and Gao, 2003; MIC, 2006). Some studies examined this issue, but failed to (1) provide statistical evidence about lower prevalence rates of IT services outsourcing, and (2) clearly explain the lower prevalence rates of IT services outsourcing by identifying similarities and differences between both types of outsourcing contexts. This research seeks to fill that gap and possibly provide potential strategic guidelines to ITS firms in Taiwan. This study adopts Transaction Cost Economics (TCE) as the theoretical basis. The basic premise is that different types of outsourcing activities may incur differing transaction costs and realize varying degrees of outsourcing success due to differential attributes of the transactions in the outsourcing process.
Using primary data gathered from questionnaire surveys of ninety-two firms, the results from exploratory analysis and binary logistic regression indicated that (1) when outsourcing to China, Taiwanese firms' ITS outsourcing tends to have a higher level of asset specificity, uncertainty and technical skills relative to EM outsourcing, and these features indirectly reduce firms' outsourcing prevalence rates via their direct positive impacts on transaction costs; (2) Taiwanese firms' ITS outsourcing tends to have a lower level of transaction structurability relative to EM outsourcing, and this feature indirectly increases firms' outsourcing prevalence rates via its direct negative impacts on transaction costs; (3) frequency does influence firms' transaction costs in ITS outsourcing positively, but does not affect their outsourcing prevalence rates; (4) relatedness does influence firms' transaction costs positively and prevalence rates negatively in ITS outsourcing, but its impacts on the prevalence rates are not caused by the mediation effects of transaction costs; and (5) firm size of the outsourcing provider does not affect firms' transaction costs, but does affect their outsourcing prevalence rates in ITS outsourcing directly and positively. Using primary data gathered from face-to-face interviews of executives from seven firms, the results from inductive analysis indicated that (1) IT services outsourcing has lower prevalence rates than EM outsourcing, and (2) this result is mainly attributed to Taiwan's core competence in manufacturing and management and the higher overall transaction costs of IT services outsourcing. Specifically, there is not much difference between both types of outsourcing context in the transaction characteristic of reputation and most aspects of overall comparison. Although there are some differences in firm size of the outsourcing provider, the difference does not have an apparent impact on firms' overall transaction costs.
The medium or above medium difference in the transaction characteristics of asset specificity, uncertainty, frequency, technical skills, transaction structurability, and relatedness has caused higher overall transaction costs for IT services outsourcing. This higher cost might cause lower prevalence rates for ITS outsourcing relative to EM outsourcing. Overall, the interview results are consistent with the statistical analyses and provide support to my expectation that in outsourcing to China, Taiwan's electronic manufacturing firms do have lower prevalence rates of IT services outsourcing relative to EM outsourcing due to higher transaction costs caused by certain attributes. To solve this problem, firms' management should aim at identifying alternative strategies and strive to reduce their overall transaction costs of IT services outsourcing by initiating appropriate strategies which fit their environment and needs.
Abstract:
This research, conducted in 2006-2008, examines the ways in which various groups involved with the marine resources of Seward, Alaska, construct attitudes towards the environment. Participant observation and semi-structured interviews are used to assess how commercial halibut fishers, tour boat operators, local residents and government officials understand the marine environment based on their previous experiences. This study also explores how ideologies relate to the current practices of each group. Two theories orient the analyses: the first, cultural modeling, provided a theoretical and methodological framework for pursuing a more comprehensive analysis of resource management; the second, the Theory of Reasoned Action (Ajzen and Fishbein 1980), guided the analysis of the ways in which each participant’s ideology towards the marine environment relates to their practice. Aside from contributing to a better understanding of a coastal community’s ideologies and practices, this dissertation sought to better understand the role of ecological ideologies and behaviors in fisheries management. The research illustrates certain domains where ideologies and practices concerning Pacific halibut and the marine environment differ among commercial fishers, government and management officials, tour boat operators, and residents of Seward, AK. These differences offer insights into how future collaborative efforts between government officials, managers and local marine resource users might better incorporate local ideology into management, and provide ecological information to local marine resource users in culturally appropriate ways.