881 results for Accuracy.
Abstract:
The C++ class library C-XSC for scientific computing has been extended in version 2.3.0 with the ability to compute scalar products with selectable accuracy. In previous versions, scalar products were always computed exactly with the help of the so-called long accumulator. Additionally, optimized floating-point computation of matrix and vector operations using BLAS routines was added in C-XSC version 2.4.0. In this article the algorithms used and their implementations, as well as some potential pitfalls during compilation, are described in more detail. Additionally, the theoretical background of the employed DotK algorithm and the necessary modifications to the concrete implementation in C-XSC are briefly explained. Run-time tests and numerical examples are presented as well.
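The DotK family of algorithms is only named in the abstract; as background, a minimal Python sketch of the K = 2 case (a compensated dot product in the style of Ogita, Rump, and Oishi, not the actual C-XSC implementation) might look like:

```python
def two_sum(a, b):
    # Error-free transformation (Knuth): a + b = s + e exactly
    s = a + b
    t = s - a
    e = (a - (s - t)) + (b - t)
    return s, e

def split(a):
    # Dekker's splitting for 53-bit doubles (may overflow for
    # |a| near the top of the double range)
    c = 134217729.0 * a  # 2**27 + 1
    high = c - (c - a)
    return high, a - high

def two_product(a, b):
    # Error-free transformation: a * b = p + e exactly
    p = a * b
    a_hi, a_lo = split(a)
    b_hi, b_lo = split(b)
    e = a_lo * b_lo - (((p - a_hi * b_hi) - a_lo * b_hi) - a_hi * b_lo)
    return p, e

def dot2(x, y):
    # Compensated dot product: result as accurate as if computed
    # in twice the working precision, then rounded once
    p, s = two_product(x[0], y[0])
    for xi, yi in zip(x[1:], y[1:]):
        h, r = two_product(xi, yi)
        p, q = two_sum(p, h)
        s += q + r
    return p + s
```

On ill-conditioned data such as [1, 1e100, 1, -1e100] · [1, 1, 1, 1], a naive loop returns 0.0 while the compensated version recovers the exact result 2.0.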
Abstract:
Implementation of a Monte Carlo simulation for the solution of population balance equations (PBEs) requires choice of the initial sample number (N0), the number of replicates (M), and the number of bins for probability distribution reconstruction (n). It is found that the squared Hellinger distance, H2, is a useful measure of the accuracy of a Monte Carlo (MC) simulation, and can be related directly to N0, M, and n. Asymptotic approximations of H2 are deduced and tested for both one-dimensional (1-D) and 2-D PBEs with coalescence. The central processing unit (CPU) cost, C, is found to follow a power-law relationship, C = a·M·N0^b, with the CPU cost index, b, indicating the weighting of N0 in the total CPU cost. n must be chosen to balance accuracy and resolution. For fixed n, M × N0 determines the accuracy of the MC prediction; if b > 1, then the optimal solution strategy uses multiple replicates and a small sample size. Conversely, if 0 < b < 1, one replicate and a large initial sample size are preferred. © 2015 American Institute of Chemical Engineers AIChE J, 61: 2394–2402, 2015
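The abstract does not define H2; for two discrete distributions reconstructed on the same n bins, the squared Hellinger distance takes a standard form, sketched here in Python (a generic helper, not the paper's code):

```python
import math

def squared_hellinger(p, q):
    """Squared Hellinger distance between two discrete
    distributions given as equal-length probability vectors:
    H2 = 0.5 * sum_i (sqrt(p_i) - sqrt(q_i))**2, in [0, 1]."""
    return 0.5 * sum((math.sqrt(pi) - math.sqrt(qi)) ** 2
                     for pi, qi in zip(p, q))
```

H2 is 0 for identical distributions and 1 for distributions with disjoint support, which makes it a convenient bounded accuracy measure for comparing an MC-reconstructed histogram against a reference.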
Abstract:
Mikhail M. Konstantinov, Petko H. Petkov - The possible catastrophic effects of the incorrect use of finite floating-point machine arithmetic are considered. Unfortunately, this topic is not always sufficiently well understood by students of applied and computational mathematics, and the situation in engineering and economics programmes is by no means better. To close this educational gap, we examine here the main culprits behind the loss of accuracy in numerical computer calculations. We hope that the presented results will help students and lecturers better understand, and accordingly avoid, the main factors that can destroy the accuracy of computer numerical calculations. The latter is not a minor matter: numerical catastrophes sometimes become real ones, with major damage and human casualties.
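The abstract does not enumerate the culprits; one classic example is catastrophic cancellation in the quadratic formula, sketched below in Python (a standard textbook illustration, not taken from the paper):

```python
import math

def quadratic_roots_naive(a, b, c):
    # Textbook formula: subtracts nearly equal numbers
    # when b*b >> 4*a*c, destroying significant digits
    d = math.sqrt(b * b - 4 * a * c)
    return (-b + d) / (2 * a), (-b - d) / (2 * a)

def quadratic_roots_stable(a, b, c):
    # Avoid cancellation: compute the larger-magnitude root
    # first, then recover the other from the product c/a
    d = math.sqrt(b * b - 4 * a * c)
    q = -0.5 * (b + math.copysign(d, b))
    return q / a, c / q

# For x**2 - 1e8*x + 1 = 0 the exact small root is about 1e-8;
# the naive formula loses most of its significant digits there,
# while the stable variant keeps nearly full precision.
```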
Abstract:
A method of accurately controlling the position of a mobile robot using an external large volume metrology (LVM) instrument is presented in this article. By utilising an LVM instrument such as a laser tracker or indoor GPS (iGPS), many of the most difficult problems in mobile robot navigation can be simplified or avoided. Using the real-time position information from the laser tracker, a very simple navigation algorithm, and a low-cost robot, 5 mm repeatability was achieved over a volume of 30 m radius. A surface digitisation scan of a wind turbine blade section was also demonstrated, illustrating possible applications of the method for manufacturing processes. Further, iGPS guidance of a small KUKA omni-directional robot has been demonstrated, and a full-scale prototype system is being developed in cooperation with KUKA Robotics, UK. © 2011 Taylor & Francis.
Abstract:
The accuracy of many aerospace structures is limited by the accuracy of the assembly tooling, which is in turn limited by the accuracy of the measurements used to set the tooling. Further loss of accuracy results from differing rates of thermal expansion for the components and tooling. This paper describes improved tooling designs and setting processes which have the potential to significantly improve the accuracy of aerospace structures. The most advanced solution described uses environmentally isolated interferometer networks embedded within the tooling, combined with active compensation of component pick-ups. This would eliminate environmental effects on measurements while also allowing compensation for thermal expansion. A more immediately realizable solution is the adjustment of component pick-ups using micrometer jacking screws, allowing multilateration to be employed during the final stages of the setting process to generate the required offsets. Copyright © 2011 SAE International.
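Multilateration is mentioned only in passing; as background, a minimal 2-D sketch (three reference stations and a closed-form linearisation, an illustrative assumption rather than the SAE setting process) might look like:

```python
def multilaterate_2d(beacons, ranges):
    """Solve for (x, y) from measured distances to three known
    2-D reference points by linearising the range equations
    (subtracting the first from the others) and applying
    Cramer's rule to the resulting 2x2 system."""
    (x1, y1), (x2, y2), (x3, y3) = beacons
    r1, r2, r3 = ranges
    # Linear system A [x, y]^T = b after cancelling x^2 + y^2
    a11, a12 = 2 * (x1 - x2), 2 * (y1 - y2)
    a21, a22 = 2 * (x1 - x3), 2 * (y1 - y3)
    k1 = x1 * x1 + y1 * y1
    b1 = r2 * r2 - r1 * r1 - (x2 * x2 + y2 * y2) + k1
    b2 = r3 * r3 - r1 * r1 - (x3 * x3 + y3 * y3) + k1
    det = a11 * a22 - a12 * a21
    return ((b1 * a22 - b2 * a12) / det,
            (a11 * b2 - a21 * b1) / det)
```

In practice more than three stations are used and the overdetermined system is solved by least squares, which is what makes the redundant-measurement setting process self-checking.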
Abstract:
Objective: To test the practicality and effectiveness of cheap, ubiquitous, consumer-grade smartphones in discriminating Parkinson's disease (PD) subjects from healthy controls, using self-administered tests of gait and postural sway. Background: Existing tests for the diagnosis of PD are based on subjective neurological examinations performed in-clinic. Objective movement symptom severity data, collected using widely accessible technologies such as smartphones, would enable the remote characterization of PD symptoms based on self-administered behavioral tests. Smartphones, when backed up by interviews using web-based videoconferencing, could make it feasible for expert neurologists to perform diagnostic testing on large numbers of individuals at low cost. However, to date, the compliance rate of testing using smartphones has not been assessed. Methods: We conducted a one-month controlled study with twenty participants, comprising 10 PD subjects and 10 controls. All participants were provided identical LG Optimus S smartphones, capable of recording tri-axial acceleration. Using these smartphones, patients conducted self-administered, short (less than 5 minutes), controlled gait and postural sway tests. We analyzed a wide range of summary measures of gait and postural sway from the accelerometry data. Using statistical machine learning techniques, we identified discriminating patterns in the summary measures in order to distinguish PD subjects from controls. Results: Compliance was high: all 20 participants performed an average of 3.1 tests per day for the duration of the study. Using this test data, we demonstrated cross-validated sensitivity of 98% and specificity of 98% in discriminating PD subjects from healthy controls. Conclusions: Using consumer-grade smartphone accelerometers, it is possible to distinguish PD from healthy controls with high accuracy.
Since these smartphones are inexpensive (around $30 each) and easily available, and the tests are highly non-invasive and objective, we envisage that this kind of smartphone-based testing could radically increase the reach and effectiveness of experts in diagnosing PD.
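The study's actual summary measures are not listed in the abstract; purely as an illustration, generic measures of this kind (the names and formulas below are the editor's assumptions, not the study's feature set) can be computed from raw tri-axial samples:

```python
import math

def summary_measures(ax, ay, az, fs=100.0):
    """Illustrative summary measures from one tri-axial
    accelerometry recording sampled at fs Hz: mean and standard
    deviation of the resultant magnitude, plus mean absolute
    jerk (rate of change of the magnitude)."""
    mag = [math.sqrt(x * x + y * y + z * z)
           for x, y, z in zip(ax, ay, az)]
    n = len(mag)
    mean = sum(mag) / n
    sd = math.sqrt(sum((m - mean) ** 2 for m in mag) / n)
    # Mean absolute jerk: first difference of magnitude times fs
    jerk = sum(abs(b - a) for a, b in zip(mag, mag[1:])) * fs / (n - 1)
    return {"mean": mean, "sd": sd, "jerk": jerk}
```

Feature vectors like this, one per test recording, are what a statistical classifier would then be trained on.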
Abstract:
Large-scale mechanical products, such as aircraft and rockets, consist of large numbers of small components, which introduces additional difficulty for assembly accuracy and error estimation. Planar surfaces, as key product characteristics, are usually utilised for positioning small components in the assembly process. This paper focuses on assembly accuracy analysis of small components with planar surfaces in large-scale volume products. To evaluate the accuracy of the assembly system, an error propagation model for measurement error and fixture error is proposed, based on the assumption that all errors are normally distributed. In this model, the general coordinate vector is adopted to represent the position of the components. The error transmission functions are simplified into a linear model, and the coordinates of the reference points are composed of a theoretical value and a random error. The installation of a Head-Up Display is taken as an example to analyse the assembly error of small components based on the propagation model. The result shows that the final coordination accuracy is mainly determined by the measurement error of the planar surface on small components. To reduce the uncertainty of the plane measurement, an evaluation index of the measurement strategy is presented. This index reflects the distribution of the sampling point set and can be calculated from an inertia moment matrix. Finally, a practical application is introduced to validate the evaluation index.
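With the error transmission functions linearised and all errors normally distributed, output uncertainty follows the standard covariance propagation rule, sketched here in Python (a generic illustration of that rule, not the paper's model):

```python
def propagate_covariance(J, cov_x):
    """Linearised error propagation: for y = J x with input
    covariance cov_x, the output covariance is J cov_x J^T.
    J is rows x n, cov_x is n x n, both as nested lists."""
    rows, n = len(J), len(cov_x)
    # JS = J * cov_x
    JS = [[sum(J[i][k] * cov_x[k][j] for k in range(n))
           for j in range(n)] for i in range(rows)]
    # result = JS * J^T
    return [[sum(JS[i][k] * J[j][k] for k in range(n))
             for j in range(rows)] for i in range(rows)]
```

For example, a quantity that is the sum of two independent unit-variance errors (J = [[1, 1]]) comes out with variance 2, as expected.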
Abstract:
Correct specification of the simple location quotients used in regionalizing the national direct requirements table is essential to the accuracy of regional input-output multipliers. The purpose of this research is to examine the relative accuracy of these multipliers when earnings, employment, number of establishments, and payroll data specify the simple location quotients. For each specification type, I derive a column of total output multipliers and a column of total income multipliers. These multipliers are based on the 1987 benchmark input-output accounts of the U.S. economy and 1988-1992 state of Florida data. Error sign tests and Standardized Mean Absolute Deviation (SMAD) statistics indicate that the output multiplier estimates overestimate the output multipliers published by the Department of Commerce-Bureau of Economic Analysis (BEA) for the state of Florida. In contrast, the income multiplier estimates underestimate the BEA's income multipliers. For a given multiplier type, the Spearman rank correlation analysis shows that the multiplier estimates and the BEA multipliers have statistically different rank orderings of row elements. The above tests also find no significant differences, in either size or ranking distributions, among the vectors of multiplier estimates.
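The simple location quotient (SLQ) technique referenced here has a standard form: the SLQ for sector i is the regional share of some activity measure (earnings, employment, establishments, or payroll) divided by the national share, and national coefficients are scaled by min(SLQ, 1). A minimal Python sketch (sector names and figures are invented for illustration):

```python
def simple_location_quotient(region_activity, national_activity):
    """SLQ for each sector: regional share of activity divided
    by the national share, for any activity measure."""
    r_total = sum(region_activity.values())
    n_total = sum(national_activity.values())
    return {i: (region_activity[i] / r_total) /
               (national_activity[i] / n_total)
            for i in region_activity}

def regionalize(a_national, slq):
    """Scale national direct-requirements coefficients a_ij by
    min(SLQ_i, 1): sectors at least as concentrated regionally
    as nationally keep the national coefficient; under-
    represented supplying sectors are scaled down."""
    return {(i, j): min(slq[i], 1.0) * a
            for (i, j), a in a_national.items()}
```

Because each activity measure yields different SLQ values, each specification yields a different regionalized table, which is exactly what drives the multiplier comparisons in the study.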
Abstract:
The Intoxilyzer 5000 was tested for calibration curve linearity for ethanol vapor concentrations between 0.020 and 0.400 g/210 L, with excellent linearity. Calibration error using reference solutions outside of the allowed concentration range, response to the same ethanol reference solution at different temperatures between 34 and 38 °C, and the response to eleven chemicals, ten mixtures of two at a time, and one mixture of four chemicals potentially found in human breath were evaluated. Potential interferents were chosen on the basis of their infrared signatures and the concentration range of solutions corresponding to the non-lethal blood concentration range of various volatile organic compounds reported in the literature. The results of this study indicate that the instrument calibrates with solutions outside the allowed range up to ±10% of the target value. Headspace FID dual-column GC analysis was used to confirm the concentrations of the solutions. Increasing the temperature of the reference solution from 34 to 38 °C resulted in linear increases in instrument-recorded ethanol readings, with an average increase of 6.25%/°C. Of the eleven chemicals studied during this experiment, six (isopropanol, toluene, methyl ethyl ketone, trichloroethylene, acetaldehyde, and methanol) could reasonably interfere with the test at non-lethal reported blood concentration ranges; mixtures of those six chemicals showed linear additive results, with a combined effect of as much as a 0.080 g/210 L reading (Florida's legal limit) without any ethanol present.
Abstract:
This study examined the effect of schemas on the consistency and accuracy of memory across interviews, providing theoretical hypotheses explaining why inconsistencies may occur. The design manipulated schema-typicality of items (schema-typical and atypical), question format (free recall, cued recall, and recognition), and retention interval (immediate/2 weeks and 2 weeks/4 weeks). Consistency, accuracy, and experiential quality of memory were measured. All independent variables affected accuracy and experiential quality of memory, while question format was the only variable affecting consistency. These results challenge the commonly held notion in the legal arena that consistency is a proxy for accuracy. The study also demonstrates that other variables, such as item-typicality and retention interval, have different effects on consistency and accuracy in memory.
Abstract:
Historical accuracy is only one of the components of a scholarly college textbook used to teach the history of jazz music. Textbooks in this field should include accurate ethnic representation of the most important musical figures, as jazz is considered the only original American art form. As colleges and universities celebrate diversity, it is important that jazz history be accurate and complete. The purpose of this study was to examine the content of the jazz history textbooks most commonly used at American colleges and universities. This qualitative study utilized grounded and textual analysis to explore the existence of ethnic representation in these texts. The methods used were modeled after the work of Kane and Selden, each of whom conducted a content analysis focused on a limited field of study. This study is focused on key jazz artists and composers whose work was created in the periods of early jazz (1915-1930), swing (1930-1945), and modern jazz (1945-1960). This study considered jazz notables within the texts in terms of ethnic representation, the authors' use of language, contributions to the jazz canon, and place in the standard jazz repertoire. Appropriate historical sections of the selected texts were reviewed and coded using predetermined rubrics. Data were then aggregated into categories and analyzed according to the character assigned to the key jazz personalities noted in the text, as well as the comparative standing afforded each personality. The results of this study demonstrate that particular key African-American jazz artists and composers occupy a significant place in these texts while other significant individuals representing other ethnic groups are consistently overlooked.
This finding suggests that while America and the world celebrate the product of American jazz as musically great and socially significant, many ethnic contributors are not mentioned, the result being a less than complete picture of the evolution of this American art form.
Abstract:
Historically, memory has been evaluated by examining how much is remembered; however, a more recent conception of memory focuses on the accuracy of memories. When using this accuracy-oriented conception of memory, unlike with the quantity-oriented approach, memory does not always deteriorate over time. A possible explanation for this seemingly surprising finding lies in the metacognitive processes of monitoring and control. Use of these processes allows people to withhold responses of which they are unsure, or to adjust the precision of responses to a level that is broad enough to be correct. The ability to accurately report memories has implications for investigators who interview witnesses to crimes and for those who evaluate witness testimony. This research examined the amount of information provided, and the accuracy and precision of responses given, during immediate and delayed interviews about a videotaped mock crime. The interview format was manipulated such that either a single free narrative response was elicited, or a series of either yes/no or cued questions was asked. Instructions provided by the interviewer indicated to the participants that they should stress either being informative or being accurate. The interviews were then transcribed and scored. Results indicate that accuracy rates remained stable and high after a one-week delay. Compared to those interviewed immediately, participants interviewed after a delay provided less information and responses that were less precise. Participants in the free narrative condition were the most accurate. Participants in the cued questions condition provided the most precise responses. Participants in the yes/no questions condition were most likely to say "I don't know". The results indicate that people are able to monitor their memories and modify their reports to maintain high accuracy. When control over precision was not possible, such as in the yes/no condition, people said "I don't know" to maintain accuracy.
However, when withholding responses and adjusting precision were both possible, people utilized both methods. It seems that concerns that memories reported after a long retention interval might be inaccurate are unfounded.
Abstract:
This study investigated the effects of self-monitoring on the homework completion and accuracy rates of four fourth-grade students with disabilities in an inclusive general education classroom. A multiple-baseline-across-subjects design was utilized to examine four dependent variables: completion of spelling homework, accuracy of spelling homework, completion of math homework, and accuracy of math homework. Data were collected and analyzed during baseline, three phases of intervention, and maintenance. Throughout baseline and all phases, participants followed typical classroom procedures, brought their homework to school each day, and gave it to the general education teacher. During Phase I of the intervention, participants self-monitored with a daily sheet at home and on the computer at school in the morning using KidTools (Fitzgerald & Koury, 2003), a student-friendly self-monitoring program. They also participated in brief daily conferences to review their self-monitoring sheets with the investigator, their special education teacher. Phase II followed the same steps except that conferencing was reduced to two days a week, randomly selected by the researcher; in Phase III conferencing was one random day a week. Maintenance data were taken over a two-to-three-week period after the end of the intervention. Results of this study demonstrated that self-monitoring substantially improved the spelling and math homework completion and accuracy rates of students with disabilities in an inclusive general education classroom. On average, completion and accuracy rates were highest over baseline in Phase III. Self-monitoring led to higher percentages of completion and accuracy during each phase of the intervention compared to baseline, and group percentages also rose slightly during maintenance. Therefore, the results suggest that self-monitoring leads to short-term maintenance of spelling and math homework completion and accuracy.
This study adds to the existing literature by investigating the effects of self-monitoring of homework for students with disabilities included in general education classrooms. Future research should consider selecting participants with other demographic characteristics, using peers for conferencing instead of the teacher, and the use of self-monitoring with other academic subjects (e.g., science, history). Additionally, future research could investigate the effects of each of the two self-monitoring components used alone, with or without conferencing.
Abstract:
Hotel feasibility studies are critical in decisions about hotel construction, sales, and refinancing. Discrepancies have been reported between forecasted results and actual operating results. The author, with funding provided by the Hilton Corporation, examines whether such studies understate or overstate occupancy, average rate, and net income.