936 results for Interpolation accuracy


Relevance:

20.00%

Publisher:

Abstract:

ACM Computing Classification System (1998): G.1.1, G.1.2.

Relevance:

20.00%

Publisher:

Abstract:

2000 Mathematics Subject Classification: 46B50, 46B70, 46G12.

Relevance:

20.00%

Publisher:

Abstract:

Objective: To test the practicality and effectiveness of cheap, ubiquitous, consumer-grade smartphones to discriminate Parkinson's disease (PD) subjects from healthy controls, using self-administered tests of gait and postural sway. Background: Existing tests for the diagnosis of PD are based on subjective neurological examinations performed in-clinic. Objective movement symptom severity data, collected using widely-accessible technologies such as smartphones, would enable the remote characterization of PD symptoms based on self-administered behavioral tests. Smartphones, when backed up by interviews using web-based videoconferencing, could make it feasible for expert neurologists to perform diagnostic testing on large numbers of individuals at low cost. However, to date, the compliance rate of testing using smartphones has not been assessed. Methods: We conducted a one-month controlled study with twenty participants, comprising 10 PD subjects and 10 controls. All participants were provided identical LG Optimus S smartphones, capable of recording tri-axial acceleration. Using these smartphones, patients conducted self-administered, short (less than 5 minutes), controlled gait and postural sway tests. We analyzed a wide range of summary measures of gait and postural sway from the accelerometry data. Using statistical machine learning techniques, we identified discriminating patterns in the summary measures in order to distinguish PD subjects from controls. Results: Compliance was high: all 20 participants performed an average of 3.1 tests per day for the duration of the study. Using this test data, we demonstrated cross-validated sensitivity of 98% and specificity of 98% in discriminating PD subjects from healthy controls. Conclusions: Using consumer-grade smartphone accelerometers, it is possible to distinguish PD from healthy controls with high accuracy. Since these smartphones are inexpensive (around $30 each) and easily available, and the tests are highly non-invasive and objective, we envisage that this kind of smartphone-based testing could radically increase the reach and effectiveness of experts in diagnosing PD.
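The pipeline described above lends itself to a compact illustration. Below is a minimal sketch, in Python, of one plausible reading of the approach: compute a handful of summary measures from each tri-axial accelerometry trace, then estimate cross-validated sensitivity and specificity with an off-the-shelf classifier. The specific features and the random-forest classifier are illustrative assumptions, not the authors' published feature set or model.

```python
# Illustrative sketch only: summary measures from tri-axial accelerometry,
# then cross-validated classification of PD subjects vs. controls.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_predict

def summary_features(acc):
    """acc: (n_samples, 3) tri-axial acceleration trace for one test."""
    mag = np.linalg.norm(acc, axis=1)                    # overall movement magnitude
    return np.array([
        mag.mean(), mag.std(),                           # average level and variability
        np.abs(np.diff(mag)).mean(),                     # mean jerk proxy
        np.percentile(mag, 95) - np.percentile(mag, 5),  # dynamic range
    ])

def evaluate(traces, labels):
    """traces: list of (n_samples, 3) arrays; labels: 1 = PD, 0 = control."""
    X = np.vstack([summary_features(t) for t in traces])
    y = np.asarray(labels)
    pred = cross_val_predict(RandomForestClassifier(n_estimators=200), X, y, cv=5)
    sensitivity = (pred[y == 1] == 1).mean()
    specificity = (pred[y == 0] == 0).mean()
    return sensitivity, specificity
```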

Relevance:

20.00%

Publisher:

Abstract:

Heterogeneous datasets arise naturally in most applications due to the use of a variety of sensors and measuring platforms. Such datasets can be heterogeneous in terms of error characteristics and sensor models. Treating such data is most naturally accomplished using a Bayesian or model-based geostatistical approach; however, such methods generally scale rather badly with the size of the dataset and require computationally expensive Monte Carlo based inference. Recently, within the machine learning and spatial statistics communities, many papers have explored the potential of reduced-rank representations of the covariance matrix, often referred to as projected or fixed-rank approaches. In such methods the covariance function of the posterior process is represented by a reduced-rank approximation chosen such that there is minimal information loss. In this paper a sequential Bayesian framework for inference in such projected processes is presented. The observations are considered one at a time, which avoids the need for the high-dimensional integrals typically required in a Bayesian approach. A C++ library, gptk, which is part of the INTAMAP web service, is introduced; it implements projected, sequential estimation and adds several novel features. In particular, the library includes the ability to use a generic observation operator, or sensor model, to permit data fusion. It is also possible to cope with a range of observation error characteristics, including non-Gaussian observation errors. Inference for the covariance parameters is explored, including the impact of the projected process approximation on likelihood profiles. We illustrate the projected sequential method in application to synthetic and real datasets. Limitations and extensions are discussed. © 2010 Elsevier Ltd.
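Below is a minimal sketch of the sequential projected-process scheme, written in Python rather than C++ for brevity, and an illustration of the general idea rather than the gptk implementation: the latent function is summarized by m inducing values u ~ N(0, K_mm), and each observation is absorbed through a rank-one Gaussian (Kalman-style) update, so no high-dimensional integral is ever formed. The RBF kernel and all function names are our assumptions.

```python
# Sketch of sequential inference in a projected (reduced-rank) GP; not gptk.
import numpy as np

def rbf(A, B, ell=1.0):
    """Squared-exponential kernel between row-wise input sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ell**2)

def sequential_projected_gp(X, y, Z, noise_var=0.1, ell=1.0):
    """X: (n, d) inputs, y: (n,) observations, Z: (m, d) inducing inputs."""
    Kmm = rbf(Z, Z, ell) + 1e-8 * np.eye(len(Z))
    Kmm_inv = np.linalg.inv(Kmm)
    mu = np.zeros(len(Z))             # posterior mean of inducing values u
    Sigma = Kmm.copy()                # posterior covariance of u
    for x, t in zip(X, y):
        k = rbf(Z, x[None, :], ell).ravel()
        phi = Kmm_inv @ k             # projection weights for this input
        m_pred = phi @ mu
        v_pred = phi @ Sigma @ phi + noise_var  # point correction term omitted
        gain = Sigma @ phi / v_pred   # Kalman-style rank-one update
        mu = mu + gain * (t - m_pred)
        Sigma = Sigma - np.outer(gain, gain) * v_pred
    return mu, Sigma
```

A generic observation operator, as mentioned above for data fusion, would slot in by replacing phi with the operator applied to the projection, leaving the rank-one update unchanged.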

Relevance:

20.00%

Publisher:

Abstract:

Large-scale mechanical products, such as aircraft and rockets, consist of large numbers of small components, which introduces additional difficulty for assembly accuracy and error estimation. Planar surfaces, as key product characteristics, are usually utilised for positioning small components in the assembly process. This paper focuses on assembly accuracy analysis of small components with planar surfaces in large-volume products. To evaluate the accuracy of the assembly system, an error propagation model for measurement error and fixture error is proposed, based on the assumption that all errors are normally distributed. In this model, a general coordinate vector is adopted to represent the position of the components. The error transmission functions are simplified into a linear model, and the coordinates of the reference points are composed of a theoretical value and a random error. The installation of a Head-Up Display is taken as an example to analyse the assembly error of small components based on the propagation model. The result shows that the final positioning accuracy is mainly determined by the measurement error of the planar surface on the small components. To reduce the uncertainty of the plane measurement, an evaluation index for the measurement strategy is presented. This index reflects the distribution of the sampling point set and can be calculated from an inertia moment matrix. Finally, a practical application is introduced to validate the evaluation index.
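One plausible reading of the proposed evaluation index is sketched below: the distribution of a sampling point set on a nominal plane is summarized by its second-moment (inertia) matrix, whose in-plane eigenvalues indicate how well the points span the surface. The index definition here is our assumption for illustration; the paper's exact formulation may differ.

```python
# Hedged sketch of an inertia-matrix coverage index for a probed planar surface.
import numpy as np

def sampling_inertia_index(points):
    """points: (n, 3) coordinates of probed points on a nominal plane."""
    P = points - points.mean(axis=0)     # centre the point set
    J = P.T @ P / len(P)                 # 3x3 second-moment (inertia) matrix
    eigvals = np.linalg.eigvalsh(J)      # ascending eigenvalues
    # For points lying on a plane, eigvals[0] (normal direction) is near zero;
    # eigvals[1] is the weaker in-plane spread, a conservative coverage index.
    return eigvals[1]
```

On this reading, a sampling strategy with a larger index spreads its points in all in-plane directions, which should reduce the uncertainty of the fitted plane.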

Relevance:

20.00%

Publisher:

Abstract:

Correct specification of the simple location quotients used in regionalizing the national direct requirements table is essential to the accuracy of regional input-output multipliers. The purpose of this research is to examine the relative accuracy of these multipliers when earnings, employment, number of establishments, and payroll data specify the simple location quotients.

For each specification type, I derive a column of total output multipliers and a column of total income multipliers. These multipliers are based on the 1987 benchmark input-output accounts of the U.S. economy and 1988-1992 state of Florida data.

Error sign tests and Standardized Mean Absolute Deviation (SMAD) statistics indicate that the output multiplier estimates overestimate the output multipliers published by the Department of Commerce, Bureau of Economic Analysis (BEA) for the state of Florida. In contrast, the income multiplier estimates underestimate the BEA's income multipliers. For a given multiplier type, Spearman rank correlation analysis shows that the multiplier estimates and the BEA multipliers have statistically different rank orderings of row elements. The above tests also find no significant differences, in either size or ranking distributions, among the vectors of multiplier estimates.
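For readers unfamiliar with the mechanics, the sketch below illustrates the standard simple-location-quotient (SLQ) regionalization that the four specifications above feed into: an SLQ per industry scales down national direct-requirements coefficients wherever the region is under-represented, and column sums of the Leontief inverse of the regionalized table give total output multipliers. Function names are ours.

```python
# Illustrative sketch of SLQ regionalization and total output multipliers.
import numpy as np

def simple_location_quotients(regional, national):
    """regional, national: per-industry activity measures; any of the four
    specifications studied above (earnings, employment, establishments,
    payroll) can be passed in."""
    reg_share = regional / regional.sum()
    nat_share = national / national.sum()
    return reg_share / nat_share

def regionalize(A_national, slq):
    """A_national: (k, k) national direct requirements, row i = supplying
    industry. Rows are scaled by min(SLQ_i, 1), the standard SLQ adjustment."""
    scale = np.minimum(slq, 1.0)
    return A_national * scale[:, None]

def total_output_multipliers(A_regional):
    # Column sums of the Leontief inverse are the total output multipliers.
    L = np.linalg.inv(np.eye(A_regional.shape[0]) - A_regional)
    return L.sum(axis=0)
```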

Relevance:

20.00%

Publisher:

Abstract:

The Intoxilyzer 5000 was tested for calibration-curve linearity at ethanol vapor concentrations between 0.020 and 0.400 g/210L and showed excellent linearity. Calibration error using reference solutions outside of the allowed concentration range, response to the same ethanol reference solution at different temperatures between 34 and 38°C, and response to eleven chemicals potentially found in human breath, ten mixtures of two chemicals at a time, and one mixture of four chemicals were evaluated. Potential interferents were chosen on the basis of their infrared signatures and the concentration range of solutions corresponding to the non-lethal blood concentration range of various volatile organic compounds reported in the literature. The results of this study indicate that the instrument calibrates with solutions outside the allowed range, up to ±10% of the target value. Headspace FID dual-column GC analysis was used to confirm the concentrations of the solutions. Increasing the temperature of the reference solution from 34 to 38°C resulted in linear increases in the instrument's recorded ethanol readings, with an average increase of 6.25% per °C. Of the eleven chemicals studied, six (isopropanol, toluene, methyl ethyl ketone, trichloroethylene, acetaldehyde, and methanol) could reasonably interfere with the test at reported non-lethal blood concentration ranges; mixtures of those six chemicals showed linear additive results, with a combined effect of as much as a 0.080 g/210L reading (Florida's legal limit) without any ethanol present.
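The reported temperature sensitivity implies a simple linear relation. The sketch below is a worked illustration of that single figure (a 6.25% increase in recorded reading per °C above the 34°C reference); it is our arithmetic, not part of the study's analysis.

```python
# Worked illustration of the reported 6.25%/°C linear temperature effect.
def temperature_inflated_reading(true_reading, solution_temp_c,
                                 ref_temp_c=34.0, slope_per_c=0.0625):
    """Expected instrument reading when the reference solution runs warm."""
    return true_reading * (1.0 + slope_per_c * (solution_temp_c - ref_temp_c))

# A true 0.080 g/210L sample read at 38°C: 0.080 * 1.25 = 0.100 g/210L.
print(temperature_inflated_reading(0.080, 38.0))
```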

Relevance:

20.00%

Publisher:

Abstract:

This study examined the effect of schemas on consistency and accuracy of memory across interviews, providing theoretical hypotheses explaining why inconsistencies may occur. The design manipulated schema-typicality of items (schema-typical and atypical), question format (free-recall, cued-recall, and recognition), and retention interval (immediate/2 week and 2 week/4 week). Consistency, accuracy, and experiential quality of memory were measured.

All independent variables affected accuracy and experiential quality of memory, while question format was the only variable affecting consistency. These results challenge the commonly held notion in the legal arena that consistency is a proxy for accuracy. The study also demonstrates that other variables, such as item-typicality and retention interval, have different effects on consistency and accuracy in memory.

Relevance:

20.00%

Publisher:

Abstract:

Historical accuracy is only one of the components of a scholarly college textbook used to teach the history of jazz music. Textbooks in this field should include accurate ethnic representation of the most important musical figures, as jazz is considered the only original American art form. As colleges and universities celebrate diversity, it is important that jazz history be accurate and complete.

The purpose of this study was to examine the content of the jazz history textbooks most commonly used at American colleges and universities. This qualitative study utilized grounded and textual analysis to explore ethnic representation in these texts. The methods used were modeled after the work of Kane and Selden, each of whom conducted a content analysis focused on a limited field of study. This study focused on key jazz artists and composers whose work was created in the periods of early jazz (1915-1930), swing (1930-1945), and modern jazz (1945-1960).

This study considered jazz notables within the texts in terms of ethnic representation, the authors' use of language, contributions to the jazz canon, and place in the standard jazz repertoire. Appropriate historical sections of the selected texts were reviewed and coded using predetermined rubrics. Data were aggregated into categories and then analyzed according to the character assigned to the key jazz personalities noted in the text, as well as the comparative standing afforded each personality.

The results of this study demonstrate that particular key African-American jazz artists and composers occupy a significant place in these texts, while other significant individuals representing other ethnic groups are consistently overlooked. This finding suggests that while America and the world celebrate American jazz as musically great and socially significant, many ethnic contributors go unmentioned, resulting in a less than complete picture of the evolution of this American art form.

Relevance:

20.00%

Publisher:

Abstract:

This dissertation establishes the foundation for a new 3-D visual interface integrating Magnetic Resonance Imaging (MRI) with Diffusion Tensor Imaging (DTI). The need for such an interface is critical for understanding brain dynamics, and for providing more accurate diagnosis of key brain dysfunctions in terms of neuronal connectivity.

This work involved two research fronts: (1) the development of new image processing and visualization techniques to accurately establish relational positioning of neuronal fiber tracts and key landmarks in 3-D brain atlases, and (2) the obligation to address the computational requirements such that processing time remains within the practical bounds of clinical settings. The system was evaluated using data from thirty patients and volunteers at the Brain Institute at Miami Children's Hospital.

Innovative visualization mechanisms allow, for the first time, white matter fiber tracts to be displayed alongside key anatomical structures within accurately registered 3-D semi-transparent images of the brain.

The segmentation algorithm is based on the calculation of mathematically-tuned thresholds and region-detection modules. The uniqueness of the algorithm is its ability to perform fast and accurate segmentation of the ventricles. In contrast to manual selection of the ventricles, which averaged over 12 minutes, the segmentation algorithm averaged less than 10 seconds in execution.

The registration algorithm searches for and compares MR with DT images of the same subject, with derived correlation measures quantifying the resulting accuracy. Overall, the images were 27% more correlated after registration, while registration, interpolation, and re-slicing of the images across all dimensions together executed in an average of 1.5 seconds.

This interface was fully embedded into a fiber-tracking software system in order to establish an optimal research environment. This highly integrated 3-D visualization system reached a practical level that makes it ready for clinical deployment.
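As an illustration of the kind of correlation measure described above, the sketch below computes a normalized cross-correlation between an MR volume and a DT-derived volume before and after registration, and reports the relative gain. The dissertation's exact measure is not specified here, so this is an assumption on our part.

```python
# Hedged sketch: correlation-based registration quality, not the thesis code.
import numpy as np

def normalized_correlation(vol_a, vol_b):
    """vol_a, vol_b: same-shape 3-D intensity volumes."""
    a = vol_a - vol_a.mean()
    b = vol_b - vol_b.mean()
    return (a * b).sum() / np.sqrt((a**2).sum() * (b**2).sum())

def registration_gain(mr, dt_before, dt_after):
    """Percentage improvement in MR/DT correlation after registration."""
    before = normalized_correlation(mr, dt_before)
    after = normalized_correlation(mr, dt_after)
    return 100.0 * (after - before) / abs(before)
```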

Relevance:

20.00%

Publisher:

Abstract:

Historically, memory has been evaluated by examining how much is remembered; a more recent conception of memory, however, focuses on the accuracy of memories. Under this accuracy-oriented conception, unlike the quantity-oriented approach, memory does not always deteriorate over time. A possible explanation for this seemingly surprising finding lies in the metacognitive processes of monitoring and control. Use of these processes allows people to withhold responses of which they are unsure, or to adjust the precision of responses to a level broad enough to be correct. The ability to accurately report memories has implications for investigators who interview witnesses to crimes, and for those who evaluate witness testimony.

This research examined the amount of information provided, and the accuracy and precision of responses, during immediate and delayed interviews about a videotaped mock crime. The interview format was manipulated such that either a single free narrative response was elicited, or a series of yes/no or cued questions was asked. Instructions provided by the interviewer indicated that participants should stress either being informative or being accurate. The interviews were then transcribed and scored.

Results indicate that accuracy rates remained stable and high after a one-week delay. Compared to those interviewed immediately, participants interviewed after a delay provided less information and less precise responses. Participants in the free narrative condition were the most accurate. Participants in the cued questions condition provided the most precise responses. Participants in the yes/no questions condition were the most likely to say “I don’t know”. The results indicate that people are able to monitor their memories and modify their reports to maintain high accuracy. When control over precision was not possible, as in the yes/no condition, people said “I don’t know” to maintain accuracy; when withholding responses and adjusting precision were both possible, people used both methods. It seems that concerns that memories reported after a long retention interval might be inaccurate are unfounded.

Relevance:

20.00%

Publisher:

Abstract:

This study investigated the effects of self-monitoring on the homework completion and accuracy rates of four fourth-grade students with disabilities in an inclusive general education classroom. A multiple-baseline-across-subjects design was utilized to examine four dependent variables: completion of spelling homework, accuracy of spelling homework, completion of math homework, and accuracy of math homework. Data were collected and analyzed during baseline, three phases of intervention, and maintenance.

Throughout baseline and all phases, participants followed typical classroom procedures, bringing their homework to school each day and giving it to the general education teacher. During Phase I of the intervention, participants self-monitored with a daily sheet at home and on the computer at school in the morning using KidTools (Fitzgerald & Koury, 2003), a student-friendly self-monitoring program. They also participated in brief daily conferences with the investigator, their special education teacher, to review their self-monitoring sheets. Phase II followed the same steps except that conferencing was reduced to two days a week, randomly selected by the researcher; in Phase III, conferencing occurred on one random day a week. Maintenance data were taken over a two-to-three week period after the end of the intervention.

Results of this study demonstrated that self-monitoring substantially improved the spelling and math homework completion and accuracy rates of students with disabilities in an inclusive general education classroom. On average, completion and accuracy rates were highest, relative to baseline, in Phase III. Self-monitoring led to higher percentages of completion and accuracy during each phase of the intervention compared to baseline, and group percentages also rose slightly during maintenance. These results suggest that self-monitoring leads to short-term maintenance of spelling and math homework completion and accuracy.

This study adds to the existing literature by investigating the effects of self-monitoring of homework for students with disabilities included in general education classrooms. Future research should consider selecting participants with other demographic characteristics, using peers instead of the teacher for conferencing, and applying self-monitoring to other academic subjects (e.g., science, history). Additionally, future research could investigate the effects of each of the two self-monitoring components used alone, with or without conferencing.

Relevance:

20.00%

Publisher:

Abstract:

Hotel feasibility studies are critical in decisions about hotel construction, sales, and refinancing. Discrepancies have been reported between forecasted results and actual operating results. The author, with funding provided by the Hilton Corporation, examines whether such studies understate or overstate occupancy, average rate, and net income.

Relevance:

20.00%

Publisher:

Abstract:

Corporate executives closely monitor the accuracy of their hotels' occupancy forecasts, since important decisions are based upon these predictions. This study lists the criteria for selecting an appropriate error measure. It discusses several evaluation methods, focusing on statistical significance tests, and demonstrates the use of two adequate evaluation methods: the Mincer-Zarnowitz efficiency test and Wilcoxon's non-parametric matched-pairs signed-ranks test.
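Both recommended tests are straightforward to run. The sketch below shows one plausible implementation in Python: a Mincer-Zarnowitz efficiency regression (actual occupancy on forecast, jointly testing intercept 0 and slope 1) and a Wilcoxon matched-pairs signed-ranks comparison of two competing forecasts' absolute errors. Variable names are illustrative, and this is our reading of the methods rather than the article's code.

```python
# Hedged sketch of the two evaluation methods named above.
import numpy as np
import statsmodels.api as sm
from scipy.stats import wilcoxon

def mincer_zarnowitz(actual, forecast):
    """Regress actual on forecast; an efficient forecast has intercept 0, slope 1."""
    X = sm.add_constant(np.asarray(forecast, dtype=float))
    fit = sm.OLS(np.asarray(actual, dtype=float), X).fit()
    joint = fit.f_test("const = 0, x1 = 1")   # joint efficiency hypothesis
    return fit.params, float(joint.pvalue)

def wilcoxon_forecast_comparison(actual, forecast_a, forecast_b):
    """Matched-pairs signed-ranks test on the two methods' absolute errors."""
    err_a = np.abs(np.asarray(actual) - np.asarray(forecast_a))
    err_b = np.abs(np.asarray(actual) - np.asarray(forecast_b))
    stat, p = wilcoxon(err_a, err_b)
    return stat, p
```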

Relevance:

20.00%

Publisher:

Abstract:

Hearing of the news of the death of Diana, Princess of Wales, in a traffic accident is taken as an analogue for being a percipient but uninvolved witness to a crime, or a witness to another person's sudden confession to some illegal act. Such an event (known in the literature as a “reception event”) has previously been hypothesized to cause one to form a special type of memory commonly known as a “flashbulb memory” (FB) (Brown and Kulik, 1977). FBs are hypothesized to be especially resilient against forgetting, highly detailed (including peripheral details), clear, and inspiring great confidence in the individual as to their accuracy. FBs are dependent for their formation upon surprise, emotional valence, and impact, or consequentiality to the witness, of the initiating event. FBs are thought to be enhanced by frequent rehearsal. FBs are very important in the context of criminal investigation and litigation in that investigators and jurors usually place great store in witnesses who claim to have a clear and complete recollection of an event and who express this confidently, regardless of their actual accuracy. Therefore, the lives, or at least the freedom, of criminal defendants, and the fortunes of civil litigants, hang on the testimony of witnesses professing to have FBs.

In this study, which includes a large and diverse sample (N = 305), participants were surveyed within 2-4 days after hearing of the fatal accident, and again at intervals of 2 and 4 weeks, and 6, 12, and 18 months. Contrary to the FB hypothesis, I found that participants' FBs degraded over time, beginning at least as early as two weeks post-event. At about 12 months the memory trace stabilized, resisting further degradation. Repeated interviewing did not have any negative effect upon accuracy, contrary to concerns in the literature. Analysis by correlation and regression indicated no effect or predictive power of participant age, emotionality, confidence, or student status on accuracy of recall; nor was participant confidence in accuracy predicted by emotional impact as hypothesized. Results also indicate that, contrary to the notions of investigators and jurors, witnesses become more inaccurate over time regardless of their confidence in their memories, even for highly emotional events.