945 results for ERROR AUTOCORRELATION


Relevance: 20.00%

Abstract:

The concept of measurement-enabled production is based on integrating metrology systems into production processes, and it has generated significant interest in industry due to its potential to increase process capability and accuracy, which in turn reduces production times and eliminates defective parts. One of the most promising methods of integrating metrology into production is the use of external metrology systems to compensate machine tool errors in real time. This paper describes the development and experimental performance evaluation of a low-cost, laser-tracker-assisted prototype three-axis machine tool. Real-time corrections of the machine tool's absolute volumetric error have been achieved. As a result, significant increases in static repeatability and accuracy have been demonstrated, allowing the low-cost three-axis machine tool to reliably reach static positioning accuracies below 35 μm throughout its working volume without any prior calibration or error mapping. This is a significant technical development that demonstrates the feasibility of the proposed methods and can have wide-scale industrial applications by enabling low-cost machine tools of modest structural rigidity, deployed flexibly as end-effectors of robotic automation, to achieve positional accuracies that were previously the preserve of large, high-precision machine tools.
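The correction scheme described above can be sketched as a simple feedback step: the external tracker measures the tool's actual position, and the deviation from the commanded position is subtracted from the next command. The function and the numbers below are purely illustrative, not the paper's implementation:

```python
def compensate(commanded, measured, gain=1.0):
    """Return a corrected command that offsets the observed volumetric
    error (measured - commanded), per axis. `gain` < 1 would damp the
    correction; both the function and the gain are hypothetical."""
    error = [m - c for c, m in zip(commanded, measured)]
    return [c - gain * e for c, e in zip(commanded, error)]

# Illustrative values: the tool overshoots on X and Z, undershoots on Y
commanded = [100.000, 50.000, 25.000]   # mm
measured  = [100.020, 49.985, 25.010]   # mm, as reported by the tracker
corrected = compensate(commanded, measured)
```

In a real-time loop this step would run continuously, so residual error shrinks toward the tracker's own measurement uncertainty rather than the machine's native accuracy.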

Relevance: 20.00%

Abstract:

We present three jargonaphasic patients who made phonological errors in naming, repetition and reading. We analyse target/response overlap using statistical models to answer three questions: 1) Is there a single phonological source for errors or two sources, one for target-related errors and a separate source for abstruse errors? 2) Can correct responses be predicted by the same distribution used to predict errors or do they show a completion boost (CB)? 3) Is non-lexical and lexical information summed during reading and repetition? The answers were clear. 1) Abstruse errors did not require a separate distribution created by failure to access word forms. Abstruse and target-related errors were the endpoints of a single overlap distribution. 2) Correct responses required a special factor, e.g., a CB or lexical/phonological feedback, to preserve their integrity. 3) Reading and repetition required separate lexical and non-lexical contributions that were combined at output.

Relevance: 20.00%

Abstract:

We demonstrate an accurate BER estimation method for QPSK CO-OFDM transmission based on the probability density function of the received QPSK symbols. Using a 112 Gb/s QPSK CO-OFDM transmission as an example, we show that this method offers the most accurate estimate of the system's performance in comparison with other known approaches.
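As a rough illustration of PDF-based BER estimation, the sketch below fits a Gaussian to the in-phase components of the received symbols and evaluates the tail probability beyond the decision threshold. The Gaussian-noise assumption and all values are simplifications for illustration, not the estimator developed in the paper:

```python
import math
import random

def estimate_ber_from_pdf(symbols_i, decision_threshold=0.0):
    """Estimate the BER of one QPSK quadrature from the sample mean and
    standard deviation of the received in-phase components, assuming
    approximately Gaussian noise (a simplifying assumption)."""
    n = len(symbols_i)
    mu = sum(symbols_i) / n
    var = sum((x - mu) ** 2 for x in symbols_i) / (n - 1)
    sigma = math.sqrt(var)
    # Tail probability that noise pushes a symbol across the threshold
    return 0.5 * math.erfc(abs(mu - decision_threshold) / (sigma * math.sqrt(2)))

# Synthetic received constellation rail: mean 1.0, noise sigma 0.3
random.seed(0)
rx = [1.0 + random.gauss(0.0, 0.3) for _ in range(10_000)]
ber = estimate_ber_from_pdf(rx)
```

Estimating BER from the distribution, rather than counting bit errors directly, is what makes very low error rates measurable from a limited number of received symbols.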

Relevance: 20.00%

Abstract:

Purpose: To ascertain the level of agreement between intra-operative refraction using a prototype surgical Hartmann-Shack aberrometer and subjective refraction a month later. Methods: Fifty-four consecutive patients had their pseudophakic refraction measured with the aberrometer intra-operatively at the end of their cataract surgery. A masked optometrist performed subjective refraction 4 weeks later. The two sets of data were then analysed for correlation. Results: The mean spherical equivalent was −0.14 ± 0.37 D (range: −1.41 to +1.72 D) with the prototype aberrometer and −0.34 ± 0.32 D (−1.64 to +1.88 D) with subjective refraction. The measurements were strongly positively correlated (r = +0.81, p < 0.01). In 84.3% of cases the two measurements were within 0.50 D of each other. Conclusion: The aberrometer can verify the intended refractive status of the eye intraoperatively and so avoid a refractive surprise; it is a useful tool for real-time assessment of the ocular refractive status.
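Agreement between two refraction measurements of this kind is typically summarized by the mean difference and the proportion of eyes falling within a clinical limit such as 0.50 D. The sketch below uses invented values, not the study's data:

```python
def agreement_stats(aberrometer, subjective, limit=0.50):
    """Mean paired difference (D) and proportion of eyes whose two
    measurements agree within `limit` dioptres. Values passed in
    below are illustrative, not the study's measurements."""
    diffs = [a - s for a, s in zip(aberrometer, subjective)]
    mean_diff = sum(diffs) / len(diffs)
    within = sum(abs(d) <= limit for d in diffs) / len(diffs)
    return mean_diff, within

# Hypothetical spherical equivalents (D) for five eyes
abr  = [-0.25,  0.00, -0.50, 0.75, -1.00]
subj = [-0.50, -0.25, -0.75, 0.50, -0.75]
mean_diff, within = agreement_stats(abr, subj)
```

A systematic mean difference (here the aberrometer reading less myopic than subjective refraction, as in the study's −0.14 vs −0.34 D) can be corrected with a fixed offset; the within-limit proportion is the clinically meaningful agreement figure.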

Relevance: 20.00%

Abstract:

Pairwise comparison is a popular assessment method, either for deriving criteria weights or for evaluating alternatives according to a given criterion. In real-world applications, consistency of the comparisons rarely holds: intransitivity can occur. The aim of the paper is to discuss the relationship between the consistency of the decision maker (described by the error-free property) and the consistency of the pairwise comparison matrix (PCM). The concept of the error-free matrix is used to demonstrate that consistency of the PCM is not a sufficient condition for the error-free property of the decision maker. Informed and uninformed decision makers are defined. In the first stage of an assessment method, a consistent or near-consistent matrix should be achieved: detecting, measuring and improving consistency are part of any procedure with both types of decision makers. In the second stage, additional information is needed to reveal the decision maker's real preferences. Interactive questioning procedures are recommended to reach that goal.
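One widely used way to detect and measure the inconsistency of a PCM is Saaty's consistency index, CI = (λmax − n)/(n − 1), which is zero exactly when the matrix is consistent. The sketch below is a generic illustration of that measure, not necessarily the one used in the paper:

```python
def consistency_index(pcm):
    """Saaty-style consistency index (lambda_max - n) / (n - 1) of a
    positive reciprocal pairwise comparison matrix; lambda_max is
    approximated by power iteration. Consistent matrices give 0."""
    n = len(pcm)
    v = [1.0 / n] * n
    lam = float(n)
    for _ in range(200):
        w = [sum(pcm[i][j] * v[j] for j in range(n)) for i in range(n)]
        lam = sum(w)              # v sums to 1, so sum(A v) ~ lambda_max
        v = [x / lam for x in w]
    return (lam - n) / (n - 1)

# Consistent matrix built from weights (1, 2, 4): entry [i][j] = w_i / w_j
consistent = [[1.0, 0.5, 0.25],
              [2.0, 1.0, 0.5],
              [4.0, 2.0, 1.0]]

# Intransitive judgements: A beats B, B beats C, yet C beats A
intransitive = [[1.0, 2.0, 0.5],
                [0.5, 1.0, 2.0],
                [2.0, 0.5, 1.0]]

ci_ok = consistency_index(consistent)
ci_bad = consistency_index(intransitive)
```

The intransitive example shows the paper's point in miniature: the index flags the cycle, but a CI of zero would still not prove the decision maker error-free.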

Relevance: 20.00%

Abstract:

The author investigates the impact of the financial crisis that started in 2008 on the forecasting error for earnings per share (EPS). There is plentiful evidence from the 1980s onward that analysts give systematically more favourable values in their EPS forecasts than reality delivers, i.e. they are generally optimistic. Other investigations have supported the idea that the EPS forecasting error grows under uncertain environmental circumstances, while further research shows that analysts under-react to negative information in their forecasts. The financial crisis brought a myriad of negative information for analysts to consider in such forecasts, while also increasing the level of uncertainty for the entire economy. The article investigates the impact of the financial crisis on the EPS forecasting error, distinguishing the period when the crisis was merely negative news from the period when, as its effect, uncertainty had grown significantly across the entire economy.
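A common convention in this literature is to scale the signed EPS forecast error by the share price, so that positive values indicate analyst optimism. The function and figures below are an illustrative sketch of that convention, not the author's dataset or exact definition:

```python
def eps_forecast_error(forecast, actual, price):
    """Signed EPS forecast error scaled by share price (one common
    convention in the forecasting literature; other papers scale by
    |actual EPS|). Positive values indicate analyst optimism."""
    return (forecast - actual) / price

# Hypothetical firm-quarters: (forecast EPS, actual EPS, share price)
errors = [eps_forecast_error(f, a, p) for f, a, p in [
    (2.10, 1.80, 30.0),   # optimistic forecast -> positive error
    (1.50, 1.60, 40.0),   # pessimistic forecast -> negative error
]]
```

Averaging such errors across firms, separately for the "negative news" and "heightened uncertainty" sub-periods, is the kind of comparison the study's question calls for.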

Relevance: 20.00%

Abstract:

The purpose of this study was to identify the effects of active dehydration on balance in euthermic individuals, employing the Balance Error Scoring System (BESS). The results indicate that dehydration significantly impairs balance.

Relevance: 20.00%

Abstract:

This study investigates the impact of a combined treatment of Systematic Error Correction and Repeated Reading on reading rate and errors for 18-year-olds with undiagnosed reading difficulties on a Caribbean island. In addition to direct daily measures of reading accuracy, the Reading Self-Perception Scale was administered to determine whether the intervention was associated with changes in the way students perceive themselves as readers.

Relevance: 20.00%

Abstract:

This study determined the levels of algebra problem-solving skill at which worked examples promoted learning of further problem-solving skill and reduction of cognitive load in college developmental algebra students. Problem-solving skill was objectively measured as error production; cognitive load was subjectively measured as perceived mental effort. Sixty-three subjects were pretested, received homework of worked examples or massed problem solving, and were posttested. Univariate ANCOVAs (covariate = previous grade) were performed on the practice and posttest data. The factors used in the analysis were practice strategy (worked examples vs. massed problem solving) and algebra problem-solving skill (low vs. moderate vs. high). Students in the practice phase who studied worked examples exhibited (a) fewer errors and reduced cognitive load at moderate skill; (b) neither fewer errors nor reduced cognitive load at low skill; and (c) only reduced cognitive load at high skill. In the posttest, only cognitive load was reduced. The results suggested that worked examples be emphasized for developmental students with moderate problem-solving skill. Areas for further research were discussed.

Relevance: 20.00%

Abstract:

This study analyzed three fifth-grade students' misconceptions and error patterns when working with equivalence, addition, and subtraction of fractions. The findings revealed that students used both conceptual and procedural knowledge to solve the problems. They used pictures, gave examples, and made connections to other mathematical concepts and to daily-life topics. Error patterns found include adding and subtracting numerators and denominators directly, and errors in finding the greatest common factor.
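The most prominent error pattern reported, adding numerators and denominators directly, can be contrasted with the correct computation in a small illustrative sketch:

```python
from fractions import Fraction

def erroneous_add(a, b, c, d):
    """The misconception described above: computing a/b + c/d as
    (a + c) / (b + d) instead of finding a common denominator."""
    return Fraction(a + c, b + d)

correct = Fraction(1, 2) + Fraction(1, 3)   # common denominator: 5/6
wrong = erroneous_add(1, 2, 1, 3)           # the error pattern: 2/5
```

The contrast makes the conceptual gap visible: the erroneous result 2/5 is smaller than either addend 1/2, which a student reasoning about fraction magnitude could catch.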

Relevance: 20.00%

Abstract:

This paper presents an experimental study of the sensitivity to 15-MeV neutrons of Advanced Low Power SRAMs (A-LPSRAM) at a low bias voltage slightly above the threshold value that allows data retention. This family of memories is characterized by a 3D structure that minimizes the area penalty and copes with latchups, as well as by integrated capacitors that hinder the occurrence of single-event upsets. In low-voltage static tests, classical single-event upsets were a minor source of errors, but other unexpected phenomena, such as clusters of bitflips and hard errors, turned out to be the origin of hundreds of bitflips. Besides, no errors were observed in dynamic tests at nominal voltage. This behavior is clearly different from that of standard bulk CMOS SRAMs, where thousands of errors have been reported.

Relevance: 20.00%

Abstract:

Trials in a temporal two-interval forced-choice discrimination experiment consist of two sequential intervals presenting stimuli that differ from one another as to magnitude along some continuum. The observer must report in which interval the stimulus had a larger magnitude. The standard difference model from signal detection theory analysis posits that order of presentation should not affect the results of the comparison, something known as the balance condition (J.-C. Falmagne, 1985, in Elements of Psychophysical Theory). But empirical data prove otherwise and consistently reveal what Fechner (1860/1966, in Elements of Psychophysics) called time-order errors, whereby the magnitude of the stimulus presented in one of the intervals is systematically underestimated relative to the other. Here we discuss sensory factors (temporary desensitization) and procedural glitches (short interstimulus or intertrial intervals and response bias) that might explain the time-order error, and we derive a formal model indicating how these factors make observed performance vary with presentation order despite a single underlying mechanism. Experimental results are also presented illustrating the conventional failure of the balance condition and testing the hypothesis that time-order errors result from contamination by the factors included in the model.
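The standard difference model, extended with a constant order-dependent shift, can be sketched as follows. The additive form of the shift is one simple modelling choice for illustration, not necessarily the formal model derived in the paper:

```python
import math

def p_second_larger(mu1, mu2, sigma=1.0, toe=0.0):
    """Standard difference model with an additive time-order error term.
    The observer judges the second stimulus larger when the difference of
    two noisy observations (each with s.d. sigma) exceeds a criterion
    shifted by `toe`. With toe = 0 the balance condition holds."""
    z = (mu2 - mu1 - toe) / (sigma * math.sqrt(2.0))
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Without a time-order error, swapping the intervals complements the
# response probability (the balance condition) ...
p_ab = p_second_larger(1.0, 1.5)
p_ba = p_second_larger(1.5, 1.0)

# ... but a constant order-dependent shift breaks that symmetry
q_ab = p_second_larger(1.0, 1.5, toe=0.3)
q_ba = p_second_larger(1.5, 1.0, toe=0.3)
```

With toe = 0 the two probabilities sum to exactly 1; with a nonzero shift they do not, which is precisely the empirical failure of the balance condition the abstract describes.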

Relevance: 20.00%

Abstract:

Topological quantum error correction codes are currently among the most promising candidates for efficiently dealing with the decoherence effects inherently present in quantum devices. Numerically, their theoretical error threshold can be calculated by mapping the underlying quantum problem to a related classical statistical-mechanical spin system with quenched disorder. Here, we present results for the general fault-tolerant regime, where we consider both qubit and measurement errors. However, unlike in previous studies, here we vary the strength of the different error sources independently. Our results highlight peculiar differences between toric and color codes. This study complements previous results published in New J. Phys. 13, 083006 (2011).

Relevance: 20.00%

Abstract:

Performing experiments on small-scale quantum computers is certainly a challenging endeavor. Many parameters need to be optimized to achieve high-fidelity operations. This can be done efficiently for operations acting on single qubits, as errors can be fully characterized. For multiqubit operations, though, this is no longer the case, as in the most general case, analyzing the effect of the operation on the system requires a full state tomography for which resources scale exponentially with the system size. Furthermore, in recent experiments, additional electronic levels beyond the two-level system encoding the qubit have been used to enhance the capabilities of quantum-information processors, which additionally increases the number of parameters that need to be controlled. For the optimization of the experimental system for a given task (e.g., a quantum algorithm), one has to find a satisfactory error model and also efficient observables to estimate the parameters of the model. In this manuscript, we demonstrate a method to optimize the encoding procedure for a small quantum error correction code in the presence of unknown but constant phase shifts. The method, which we implement here on a small-scale linear ion-trap quantum computer, is readily applicable to other AMO platforms for quantum-information processing.

Relevance: 20.00%

Abstract:

Funding — Forest Enterprise Scotland and the University of Aberdeen provided funding for the project. The Carnegie Trust supported the lead author, E. McHenry, in this research through the award of a tuition fees bursary.