8 results for GNSS, Ambiguity resolution, Regularization, Ill-posed problem, Success probability

in Digital Commons at Florida International University


Relevance:

100.00%

Publisher:

Abstract:

The high velocities of free atoms associated with thermal motion, together with the atoms' velocity distribution, impose the ultimate limitation on the precision of ultrahigh-resolution spectroscopy. A sample consisting of low-velocity atoms would provide a substantial improvement in spectroscopic resolution. To overcome the problem of thermal motion, atomic physicists have pursued two goals: first, the reduction of the thermal motion (cooling); and second, the confinement of the atoms by means of electromagnetic fields (trapping). Cooling, carried sufficiently far, eliminates the motional problems, whereas trapping allows for long observation times. In this work the laser cooling and trapping of an argon atomic beam will be discussed. The experiments involve time-of-flight spectroscopy of metastable argon atoms. Laser deceleration, or cooling, of the atoms is achieved by counter-propagating laser photons against an atomic beam of metastable atoms. The Doppler shift problem is solved by using a spatially varying magnetic field along the beam path to Zeeman-shift the atomic resonance frequency so as to keep the atoms in resonance with a fixed-frequency cooling laser. For the trapping experiments a magneto-optical trap (MOT) will be used. The MOT is formed by three pairs of counter-propagating laser beams with mutually opposite circular polarizations, with a frequency tuned slightly below the center of the atomic resonance, superimposed on a magnetic quadrupole field.
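
As a generic textbook illustration of the Zeeman-compensation idea described above (a hedged sketch only; the specific transition, sign conventions, and parameters of this experiment are not given in the abstract), the slower keeps a decelerating atom on resonance by matching the Zeeman shift of the transition to the changing Doppler shift:

```latex
% Resonance condition for an atom of velocity v(z) moving against a laser of
% wavevector k and frequency \omega_L, with effective magnetic moment \mu' of the transition:
\omega_L + k\,v(z) = \omega_0 + \frac{\mu' B(z)}{\hbar}
% For a constant deceleration a, v(z) = v_0\sqrt{1 - z/L_0} with L_0 = v_0^2/(2a),
% which gives the familiar square-root field profile of a Zeeman slower:
B(z) = B_{\mathrm{bias}} + \frac{\hbar k v_0}{\mu'}\,\sqrt{1 - \frac{z}{L_0}}
```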

Relevance:

40.00%

Publisher:

Abstract:

This dissertation develops a new mathematical approach that overcomes the effect of a data-processing phenomenon known as “histogram binning,” inherent to flow cytometry data. A real-time procedure is introduced to prove the effectiveness and fast implementation of the approach on real-world data. The histogram binning effect is a dilemma posed by two seemingly antagonistic developments: (1) flow cytometry data in its histogram form is extended in its dynamic range to improve its analysis and interpretation, and (2) the inevitable dynamic-range extension introduces an unwelcome side effect, the binning effect, which skews the statistics of the data, consequently undermining the accuracy of the analysis and the eventual interpretation of the data. Researchers in the field have contended with this dilemma for many years, resorting either to hardware approaches, which are rather costly and have inherent calibration and noise effects, or to software techniques based on filtering out the binning effect without successfully preserving the statistical content of the original data. The mathematical approach introduced in this dissertation is so appealing that a patent application has been filed. The contribution of this dissertation is an incremental scientific innovation based on a mathematical framework that will allow researchers in the field of flow cytometry to improve the interpretation of data, knowing that its statistical meaning has been faithfully preserved for optimized analysis. Furthermore, with the same mathematical foundation, proof of the origin of this inherent artifact is provided. These results are unique in that new mathematical derivations are established to define and solve the critical problem of the binning effect faced at the experimental assessment level, providing a data platform that preserves its statistical content. In addition, a novel method for accumulating the log-transformed data was developed. This new method uses the properties of transformations of statistical distributions to accumulate the output histogram in a non-integer and multi-channel fashion. Although the mathematics of this new mapping technique seems intricate, the concise nature of the derivations allows for an implementation procedure that lends itself to a real-time implementation using lookup tables, a task that is also introduced in this dissertation.
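
The dissertation's actual derivations are not reproduced in the abstract, but the following minimal Python sketch illustrates the general idea of accumulating log-transformed events into a histogram in a non-integer, multi-channel fashion. All names, the channel count, the dynamic range, and the linear-interpolation weighting are illustrative assumptions, not the author's method: each event's log-transformed value generally falls between two output channels, so its count is split between them rather than rounded to a single bin.

```python
import numpy as np

def accumulate_log_histogram(values, n_channels=1024, max_value=262144.0):
    """Accumulate linear-scale events into a log-scale histogram using
    fractional (non-integer) channel assignment.

    Illustrative sketch only: the weighting scheme here (linear interpolation
    between the two neighbouring channels) is an assumption, not the
    dissertation's derivation.
    """
    hist = np.zeros(n_channels)
    vals = np.asarray(values, dtype=float)
    vals = vals[vals > 0]

    # Map each positive linear value to a fractional log-scale channel position.
    pos = (n_channels - 1) * np.log(vals) / np.log(max_value)
    pos = np.clip(pos, 0, n_channels - 1)

    lo = np.floor(pos).astype(int)            # lower neighbouring channel
    hi = np.minimum(lo + 1, n_channels - 1)   # upper neighbouring channel
    frac = pos - lo                           # distance above the lower channel

    # Split each event's unit count between the two neighbouring channels.
    np.add.at(hist, lo, 1.0 - frac)
    np.add.at(hist, hi, frac)
    return hist
```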

Relevance:

40.00%

Publisher:

Abstract:

This dissertation develops a new mathematical approach that overcomes the effect of a data-processing phenomenon known as "histogram binning," inherent to flow cytometry data. A real-time procedure is introduced to prove the effectiveness and fast implementation of the approach on real-world data. The histogram binning effect is a dilemma posed by two seemingly antagonistic developments: (1) flow cytometry data in its histogram form is extended in its dynamic range to improve its analysis and interpretation, and (2) the inevitable dynamic-range extension introduces an unwelcome side effect, the binning effect, which skews the statistics of the data, consequently undermining the accuracy of the analysis and the eventual interpretation of the data. Researchers in the field have contended with this dilemma for many years, resorting either to hardware approaches, which are rather costly and have inherent calibration and noise effects, or to software techniques based on filtering out the binning effect without successfully preserving the statistical content of the original data. The mathematical approach introduced in this dissertation is so appealing that a patent application has been filed. The contribution of this dissertation is an incremental scientific innovation based on a mathematical framework that will allow researchers in the field of flow cytometry to improve the interpretation of data, knowing that its statistical meaning has been faithfully preserved for optimized analysis. Furthermore, with the same mathematical foundation, proof of the origin of this inherent artifact is provided. These results are unique in that new mathematical derivations are established to define and solve the critical problem of the binning effect faced at the experimental assessment level, providing a data platform that preserves its statistical content. In addition, a novel method for accumulating the log-transformed data was developed. This new method uses the properties of transformations of statistical distributions to accumulate the output histogram in a non-integer and multi-channel fashion. Although the mathematics of this new mapping technique seems intricate, the concise nature of the derivations allows for an implementation procedure that lends itself to a real-time implementation using lookup tables, a task that is also introduced in this dissertation.
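
This duplicate listing also mentions the lookup-table, real-time side of the method. A hedged sketch of how such a table could be precomputed (purely illustrative; the table contents, 18-bit ADC range, and interpolation rule are assumptions, not the dissertation's procedure): since raw events arrive as bounded integer ADC codes, the fractional channel position and the two split weights can be computed once per possible code and reused for every event.

```python
import numpy as np

def build_lookup_table(adc_max=262144, n_channels=1024):
    """Precompute, for every possible integer ADC code, the pair of output
    channels and the fractional weights used to split the event between them.
    Illustrative assumption: linear interpolation on a log scale."""
    codes = np.arange(1, adc_max + 1, dtype=float)
    pos = (n_channels - 1) * np.log(codes) / np.log(adc_max)
    lo = np.floor(pos).astype(np.int32)
    hi = np.minimum(lo + 1, n_channels - 1)
    w_hi = (pos - lo).astype(np.float32)
    return lo, hi, 1.0 - w_hi, w_hi

def accumulate_events(event_codes, table, n_channels=1024):
    """Real-time-style accumulation: one table lookup and two adds per event."""
    lo, hi, w_lo, w_hi = table
    hist = np.zeros(n_channels, dtype=np.float64)
    idx = np.asarray(event_codes, dtype=np.int64) - 1   # codes assumed to start at 1
    np.add.at(hist, lo[idx], w_lo[idx])
    np.add.at(hist, hi[idx], w_hi[idx])
    return hist
```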

Relevance:

30.00%

Publisher:

Abstract:

The problem investigated was the negative effect on a university student's ability to successfully complete a course in religious studies resulting from conflict between the methodologies and objectives of religious studies and the student's system of beliefs. Using Festinger's theory of cognitive dissonance as a theoretical framework, it was hypothesized that completing a course with a high level of success would be negatively affected by (1) failure to accept the methodologies and objectives of religious studies (methodology), (2) holding beliefs about religion that had potential conflicts with those methodologies and objectives (beliefs), (3) extrinsic religiousness, and (4) dogmatism. The causal-comparative method was used. The independent variables were measured with four scales employing Likert-type items. An 8-item scale to measure acceptance of the methodologies and objectives of religious studies and a 16-item scale to measure the holding of beliefs about religion having potential conflict with those methodologies were developed for this study. These scales, together with a 20-item form of Rokeach's Dogmatism Scale and Feagin's 12-item Religious Orientation Scale to measure extrinsic religiousness, were administered to 144 undergraduate students enrolled in randomly selected religious studies courses at Florida International University. Level of success was determined by course grade, with the 27% of students receiving the highest grades classified as highly successful and the 27% receiving the lowest grades classified as not highly successful. A stepwise discriminant analysis produced a single significant function, with methodology and dogmatism as the discriminants. Methodology was the principal discriminating variable. Beliefs and extrinsic religiousness failed to discriminate significantly. It was concluded that failing to accept the methodologies and objectives of religious studies and being highly dogmatic have significant negative effects on a student's success in a religious studies course. Recommendations were made for teaching practices that diminish these negative effects.
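
For readers unfamiliar with the analysis, the following is a minimal sketch of a (non-stepwise) discriminant analysis of this general kind, using hypothetical column names and scikit-learn rather than whatever software the study actually used; it is not the study's analysis.

```python
import pandas as pd
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Hypothetical layout: one row per student, four scale scores plus a success
# indicator derived from course grade (top 27% vs. bottom 27% of grades).
df = pd.read_csv("religious_studies_scales.csv")  # assumed file name
predictors = ["methodology", "beliefs", "extrinsic_religiousness", "dogmatism"]

lda = LinearDiscriminantAnalysis()
lda.fit(df[predictors], df["highly_successful"])

# Inspect which variables carry weight in the single discriminant function.
print(dict(zip(predictors, lda.coef_[0])))
print("classification accuracy:", lda.score(df[predictors], df["highly_successful"]))
```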

Relevance:

30.00%

Publisher:

Abstract:

Today, many organizations are turning to new approaches to building and maintaining information systems (I/S) to cope with a highly competitive business environment. Current anecdotal evidence indicates that the approaches being used improve the effectiveness of software development by encouraging active user participation throughout the development process. Unfortunately, very little is known about how the use of such approaches enhances the ability of team members to develop I/S that are responsive to changing business conditions. Drawing from predominant theories of organizational conflict, this study develops and tests a model of conflict among members of a development team. The model proposes that development approaches provide the relevant context conditioning the management and resolution of conflict in software development, which, in turn, are crucial for the success of the development process. Empirical testing of the model was conducted using data collected through a combination of interviews with I/S executives and surveys of team members and business users at nine organizations. Results of path analysis provide support for the model's main prediction that integrative conflict management and distributive conflict management can contribute to I/S success by influencing differently the manifestation and resolution of conflict in software development. Further, analyses of variance indicate that object-oriented development, when compared to rapid and structured development, appears to produce the lowest levels of conflict management, conflict resolution, and I/S success. The proposed model and findings suggest academic implications for understanding the effects of different conflict management behaviors on software development outcomes, and practical implications for better managing the software development process, especially in user-oriented development environments.
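
As a purely illustrative aside (variable names and the path structure below are hypothetical, not the study's actual model or estimation), path analysis of the kind reported here can be approximated as a set of chained regressions, where each equation's coefficients estimate the direct paths into one endogenous variable.

```python
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("team_survey.csv")  # hypothetical survey data, one row per team

def paths_into(outcome, predictors):
    """Estimate the direct paths into `outcome` with an OLS regression."""
    X = sm.add_constant(df[predictors])
    return sm.OLS(df[outcome], X).fit()

# Illustrative path structure: conflict management styles -> conflict resolution -> I/S success.
m1 = paths_into("conflict_resolution",
                ["integrative_conflict_mgmt", "distributive_conflict_mgmt"])
m2 = paths_into("is_success",
                ["integrative_conflict_mgmt", "distributive_conflict_mgmt", "conflict_resolution"])

for m in (m1, m2):
    print(m.params)
    print(m.pvalues)
```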

Relevance:

30.00%

Publisher:

Abstract:

This dissertation develops a process improvement method for service operations based on the Theory of Constraints (TOC), a management philosophy that has been shown to be effective in manufacturing for decreasing work-in-process (WIP) and improving throughput. While TOC has enjoyed much attention and success in the manufacturing arena, its application to services in general has been limited. The contribution to industry and knowledge is a method for improving global performance measures based on TOC principles. The method proposed in this dissertation is tested using discrete-event simulation of the service-factory scenario of airline turnaround operations. To evaluate the method, a simulation model of the aircraft turn operations of a U.S.-based carrier was built and validated using actual data from airline operations. The model was then adjusted to reflect an application of the Theory of Constraints for determining how to deploy the scarce resource of ramp workers. The results indicate that, given slight modifications to TOC terminology and the development of a method for constraint identification, the Theory of Constraints can be applied successfully to services. Bottlenecks in services must be defined as those processes for which the process rates and the amount of work remaining are such that completing the process will not be possible without an increase in the process rate. The bottleneck ratio is used to determine to what degree a process is a constraint. Simulation results also suggest that redefining performance measures to reflect a global business perspective of reducing costs related to specific flights, rather than the local-optimum operational approach of turning all aircraft quickly, results in significant savings to the company. Simulated savings to the airline's annual operating costs equaled 30% of possible current expenses for misconnecting passengers, with a modest increase in worker utilization achieved through a more efficient heuristic of deploying workers to the highest-priority tasks. This dissertation contributes to the literature on service operations by describing a dynamic, adaptive dispatch approach, based on the management philosophy of the Theory of Constraints, for managing service factory operations similar to airline turnaround operations.
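
The abstract does not give the formal definition of the bottleneck ratio, so the sketch below is only one plausible reading of the verbal definition (work remaining relative to what the current process rate can complete in the time available); the names, data, and priority rule are all illustrative assumptions, not the dissertation's method.

```python
from dataclasses import dataclass

@dataclass
class TurnProcess:
    name: str
    work_remaining: float   # e.g., remaining bags to load
    process_rate: float     # work units per minute at current staffing
    time_remaining: float   # minutes until scheduled departure

def bottleneck_ratio(p: TurnProcess) -> float:
    """Illustrative reading of the verbal definition: a ratio above 1 means the
    process cannot finish at its current rate in the time remaining, i.e. it
    constrains the turn unless its rate is increased."""
    return p.work_remaining / (p.process_rate * p.time_remaining)

def dispatch_ramp_workers(processes, workers_available):
    """Toy priority dispatch: send scarce ramp workers to the processes with
    the highest bottleneck ratios first."""
    ranked = sorted(processes, key=bottleneck_ratio, reverse=True)
    return ranked[:workers_available]

turns = [
    TurnProcess("bag_loading_flight_101", 80, 5.0, 12),
    TurnProcess("cabin_cleaning_flight_101", 1, 0.2, 12),
    TurnProcess("bag_loading_flight_205", 40, 5.0, 25),
]
for p in dispatch_ramp_workers(turns, workers_available=2):
    print(p.name, round(bottleneck_ratio(p), 2))
```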

Relevance:

30.00%

Publisher:

Abstract:

Hospitals and healthcare facilities in the United States are facing serious shortages of medical laboratory personnel, which, if not addressed, stand to negatively impact patient care. The problem is compounded by a reduction in the number of academic programs and a resulting decrease in the number of graduates available to keep up with increasing industry demand. Given these challenges, the purpose of this study was to identify predictors of success for students in a selected 2-year Medical Laboratory Technology Associate in Science Degree Program. This study examined five academic factors (College Placement Test Math and Reading scores, Cumulative GPA, Science GPA, and Professional [first-semester laboratory courses] GPA) and demographic data to determine whether any of these factors could predict program completion. The researcher examined academic records for a 10-year period (N = 158). Using a retrospective model, the correlational analysis between the variables and completion revealed a significant relationship (p < .05) for CGPA, SGPA, CPT Math, and PGPA, indicating that students with higher CGPA, SGPA, CPT Math, and PGPA were more likely to complete their degree in 2 years. Binary logistic regression analysis with the same academic variables revealed that PGPA was the best predictor of program completion (p < .001). Additionally, the findings in this study are consistent with the academic component of the Bean and Metzner Conceptual Model of Nontraditional Student Attrition, which points to academic outcome variables such as GPA as affecting attrition. Thus, the findings in this study are important to students and educators in the field of Medical Laboratory Technology, since PGPA is a predictor that can be used to provide early in-program intervention to at-risk students, thus increasing the chances of successful, timely completion.
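
A minimal sketch of the kind of binary logistic regression reported here, with hypothetical column names (this is not the study's data or exact model specification):

```python
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("mlt_program_records.csv")  # hypothetical file of the N = 158 records
predictors = ["cpt_math", "cpt_reading", "cgpa", "sgpa", "pgpa"]

X = sm.add_constant(df[predictors])
y = df["completed_in_2_years"]   # 1 = completed on time, 0 = did not

model = sm.Logit(y, X).fit()
print(model.summary())           # per the study, PGPA was the strongest predictor
```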

Relevance:

30.00%

Publisher:

Abstract:

For the past several years, U.S. colleges and universities have faced increased pressure to improve retention and graduation rates. At the same time, educational institutions have placed a greater emphasis on the importance of enrolling more students in STEM (science, technology, engineering, and mathematics) programs and producing more STEM graduates. The resulting problem faced by educators involves finding new ways to support the success of STEM majors, regardless of their pre-college academic preparation. The purpose of my research study was to use first-year STEM majors’ math SAT scores, unweighted high school GPA, math placement test scores, and the highest level of math taken in high school to develop models for predicting which students were likely to pass their first math and science courses. In doing so, the study aimed to provide a strategy for addressing the challenge of improving the passing rates of first-year students attempting STEM-related courses. The study sample included 1018 first-year STEM majors who had entered the same large, public, urban, Hispanic-serving research university in the Southeastern U.S. between 2010 and 2012. The research design involved the use of hierarchical logistic regression to determine the significance of using the four independent variables to develop models for predicting success in math and science. The results indicated that the overall model of predictors (which included all four predictor variables) was statistically significant both for predicting which students passed their first math course and for predicting which students passed their first science course. Individually, all four predictor variables were found to be statistically significant for predicting those who passed math, with unweighted high school GPA and the highest math taken in high school accounting for the largest amount of unique variance; those two variables also improved the regression model’s percentage of correct predictions for that dependent variable. The only variable found to be statistically significant for predicting those who passed science was the students’ unweighted high school GPA. Overall, the results of my study are offered as a contribution to the literature on predicting first-year student success, especially within the STEM disciplines.
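
Hierarchical (blockwise) logistic regression of the sort described here enters predictors in ordered blocks and tests whether each block improves model fit. The sketch below is hedged and hypothetical: the column names, coding of the predictors, and block order are assumptions, since the abstract does not state them.

```python
import pandas as pd
import statsmodels.api as sm
from scipy import stats

df = pd.read_csv("stem_first_year.csv")  # hypothetical data for the 1018 students

def fit_logit(outcome, predictors):
    X = sm.add_constant(df[predictors])
    return sm.Logit(df[outcome], X).fit(disp=False)

# Block 1: high school record; Block 2: add test scores (the order is illustrative).
block1 = ["hs_gpa_unweighted", "highest_hs_math"]
block2 = block1 + ["sat_math", "math_placement_score"]

m1 = fit_logit("passed_first_math", block1)
m2 = fit_logit("passed_first_math", block2)

# Likelihood-ratio test: does the second block add significant predictive value?
lr = 2 * (m2.llf - m1.llf)
p = stats.chi2.sf(lr, df=len(block2) - len(block1))
print(f"LR = {lr:.2f}, p = {p:.4f}")
```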