942 results for errors and erasures decoding


Relevance:

40.00%

Publisher:

Abstract:

Due to copyright restrictions, only available for consultation at Aston University Library and Information Services with prior arrangement.

Relevance:

40.00%

Publisher:

Abstract:

Phonological tasks are highly predictive of reading development, but their complexity obscures the underlying mechanisms driving this association. Three key components are hypothesised to drive the relationship between phonological tasks and reading: (a) the linguistic nature of the stimuli, (b) the phonological complexity of the stimuli, and (c) the production of a verbal response. We isolated the contributions of the stimulus and response components separately by creating latent variables to represent specially designed tasks matched for procedure. These tasks were administered to 570 six- to seven-year-old children along with standardised tests of regular-word and non-word reading. A structural equation model in which tasks were grouped by stimulus revealed that the linguistic nature and the phonological complexity of the stimulus each predicted unique variance in decoding, over and above matched comparison tasks without these components. An alternative model, grouped by response mode, showed that the production of a verbal response was a unique predictor of decoding beyond matched tasks without a verbal response. In summary, we found that multiple factors contributed to reading development, supporting multivariate models over those that prioritise single factors. More broadly, we demonstrate the value of combining matched task designs with latent-variable modelling to deconstruct the components of complex tasks.
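
A minimal sketch of the latent-variable approach described above, using the semopy structural equation modelling package for Python. The variable names (ling1-ling3, comp1-comp3, decoding) and the simulated scores are hypothetical stand-ins for the study's matched task battery, not its data.

import numpy as np
import pandas as pd
import semopy

rng = np.random.default_rng(0)
n = 570  # same sample size as the study, purely for flavour

# Simulate two latent stimulus factors and task scores that load on them.
linguistic = rng.normal(size=n)
complexity = rng.normal(size=n)
noise = lambda: rng.normal(scale=0.5, size=n)
df = pd.DataFrame({
    "ling1": linguistic + noise(), "ling2": linguistic + noise(),
    "ling3": linguistic + noise(), "comp1": complexity + noise(),
    "comp2": complexity + noise(), "comp3": complexity + noise(),
    "decoding": 0.6 * linguistic + 0.4 * complexity + noise(),
})

# Measurement part (=~) groups tasks by stimulus; structural part (~) asks
# whether each latent factor predicts unique variance in decoding.
desc = """
Linguistic =~ ling1 + ling2 + ling3
Complexity =~ comp1 + comp2 + comp3
decoding ~ Linguistic + Complexity
"""
model = semopy.Model(desc)
model.fit(df)
print(model.inspect())  # loadings and structural coefficients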

Relevance:

40.00%

Publisher:

Abstract:

We compared reading acquisition in English and Italian children up to late primary school, analyzing RTs and errors as a function of various psycholinguistic variables and of changes due to experience. Our results show that reading becomes progressively more reliant on larger processing units with age, but that this is modulated by the consistency of the language. In English, an inconsistent orthography, reliance on larger units occurs earlier, as demonstrated by faster RTs, a stronger effect of lexical variables, and the absence of a length effect by fifth grade. However, not all English children master this mode of processing, which yields larger inter-individual variability. In Italian, a consistent orthography, reliance on larger units occurs later and is less pronounced. This is demonstrated by larger length effects, which remain significant even in older children, and by larger effects of a global factor (related to the speed of orthographic decoding) explaining changes in performance across ages. Our results show the importance of considering not only overall performance but also inter-individual variability and variability between conditions when interpreting cross-linguistic differences.

Relevance:

40.00%

Publisher:

Abstract:

Starting from the failures of the market and of government, the author identifies the failures of the third system aimed at attaining the common good: ethical responsibility. Using a statistical analogy, she identifies as a Type I error the case in which ethics is not taken into account even though it is needed (the null hypothesis is rejected although it is true). She treats the use of ethics to increase profit as a Type II error: it misleads stakeholders and thereby opens an even wider path to opportunistic business activity (the null hypothesis is accepted although it is false). In her view, the three systems, the hand of the market, the government and ethical management, not only complement but also mutually correct one another. This correction works fairly generally for Type I errors; resolving Type II errors, however, requires redefining the core principles of economic life: a new, more holistic economics in place of self-interest and one-dimensional performance evaluation.

Relevance:

40.00%

Publisher:

Abstract:

The spill-over of the global financial crisis has uncovered the weaknesses in the governance of the EMU. As one of the most open economies in Europe, Hungary has suffered from the ups and downs of the global and European crisis and its mismanagement, and domestic policy blunders have complicated the situation. This paper examines how Hungary has withstood the eurozone crisis. It also addresses the questions of whether the country has converged with or diverged from EMU membership, whether joining the EMU is still a good idea for Hungary, and whether the measures taken to ward off the crisis have actually helped to meet the challenge of growth.

Relevance:

40.00%

Publisher:

Abstract:

This research pursued the conceptualization, implementation, and verification of a system that enhances digital information displayed on an LCD panel for users with visual refractive errors. The target user groups for this system are individuals with moderate to severe visual aberrations for whom conventional means of compensation, such as glasses or contact lenses, do not improve vision. This research is based on a priori knowledge of the user's visual aberration, as measured by a wavefront analyzer. With this information it is possible to generate images that, when displayed to the user, counteract the visual aberration. The method described in this dissertation advances techniques for providing such compensation by integrating spatial information into the image, as a means to eliminate some of the shortcomings inherent in display devices such as monitors or LCD panels. Additionally, physiological considerations are discussed and integrated into the method. To provide a realistic sense of the performance of the methods described, they were tested by mathematical simulation in software, with a single-lens high-resolution CCD camera that models an aberrated eye, and finally with human subjects having various forms of visual aberrations. The data collected from these experiments were evaluated using statistical analysis. The experimental results revealed that the pre-compensation method produced a statistically significant improvement in vision on all of the systems. Although significant, the improvement was not as large as expected in the human-subject tests. Further analysis suggests that, even under the controlled conditions employed for testing with human subjects, the characterization of the eye may be changing; this would require real-time monitoring of relevant variables (e.g., pupil diameter) and continuous adjustment of the pre-compensation process to yield maximum viewing enhancement.
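
A minimal sketch, under strong simplifying assumptions, of the pre-compensation idea described above: if the eye blurs the displayed image by convolution with a known point-spread function (PSF), displaying a Wiener-deconvolved image partially cancels that blur. This is not the dissertation's actual method; the Gaussian PSF is a hypothetical stand-in for one derived from wavefront-analyzer measurements, and the clipping step illustrates the display dynamic-range limitation the dissertation addresses.

import numpy as np

def gaussian_psf(size, sigma):
    """Toy point-spread function standing in for a measured ocular aberration."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    psf = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
    return psf / psf.sum()

def psf_to_otf(psf, shape):
    """Embed the PSF in a full-size array centred on the origin and FFT it."""
    pad = np.zeros(shape)
    pad[:psf.shape[0], :psf.shape[1]] = psf
    pad = np.roll(pad, (-(psf.shape[0] // 2), -(psf.shape[1] // 2)), axis=(0, 1))
    return np.fft.fft2(pad)

def precompensate(image, psf, k=0.01):
    """Wiener inverse filter: pre-boost the frequencies the eye attenuates."""
    H = psf_to_otf(psf, image.shape)
    wiener = np.conj(H) / (np.abs(H) ** 2 + k)   # k regularises near-zeros of H
    out = np.real(np.fft.ifft2(np.fft.fft2(image) * wiener))
    return np.clip(out, 0.0, 1.0)  # a display cannot show values outside [0, 1]

# Simulate the loop: pre-compensate, then let the "eye" (the PSF) blur it back.
rng = np.random.default_rng(1)
image = rng.random((128, 128))
psf = gaussian_psf(15, sigma=2.0)
displayed = precompensate(image, psf)
perceived = np.real(np.fft.ifft2(np.fft.fft2(displayed) * psf_to_otf(psf, image.shape)))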

Relevance:

40.00%

Publisher:

Abstract:

The Last Interglacial (LIG, 129-116 thousand years before present, ka) represents a test bed for climate-model feedbacks in warmer-than-present high-latitude regions. However, mainly because aligning palaeoclimatic archives of different types and from different parts of the world is not trivial, a spatio-temporal picture of LIG temperature changes is difficult to obtain. Here, we have selected 47 polar ice-core and sub-polar marine sediment records and developed a strategy to align them onto the recent AICC2012 ice-core chronology. We provide the first compilation of high-latitude temperature changes across the LIG associated with a coherent temporal framework built between ice-core and marine sediment records. Our new data synthesis highlights non-synchronous maximum temperature changes between the two hemispheres, with the Southern Ocean and Antarctic records showing an early warming compared to North Atlantic records. We also observe that warmer-than-present-day conditions persist for a longer period in southern high latitudes than in northern high latitudes. Finally, the amplitude of temperature change recorded at the onset and demise of the LIG is larger at high northern latitudes than at high southern latitudes. We have also compiled four data-based time slices with temperature anomalies (relative to present-day conditions) at 115 ka, 120 ka, 125 ka and 130 ka, and quantitatively estimated temperature uncertainties that include relative dating errors. This provides an improved benchmark for more robust model-data comparison. The surface temperatures simulated by two general circulation models (CCSM3 and HadCM3) for 130 ka and 125 ka are compared with the corresponding time-slice syntheses. This comparison shows that the models predict warmer-than-present conditions earlier than documented in the North Atlantic, while neither model reproduces the reconstructed early Southern Ocean and Antarctic warming. Our results highlight the importance of producing a sequence of time slices rather than a single time slice averaging LIG climate conditions.
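
A minimal sketch, with made-up numbers, of one way to propagate relative dating errors into a time-slice estimate like those described above: perturb a record's age scale within its dating uncertainty, re-read the temperature at the target age, and take the spread of the results as the dating-error contribution. The actual compilation and AICC2012 alignment are far more involved.

import numpy as np

rng = np.random.default_rng(2)

# One synthetic record: a warm LIG anomaly on its own age model (ages in ka).
ages = np.linspace(110.0, 135.0, 120)
temps = 2.0 * np.exp(-((ages - 126.0) / 4.0) ** 2)
age_sigma = 1.5          # 1-sigma relative dating error, ka (illustrative)
target = 125.0           # time slice of interest, ka

# Monte Carlo: shift the whole age scale within its error and interpolate the
# temperature at the target age each time.
samples = np.array([
    np.interp(target, ages + rng.normal(scale=age_sigma), temps)
    for _ in range(5000)
])
print(f"anomaly at {target:.0f} ka: {samples.mean():.2f} "
      f"+/- {samples.std():.2f} degC (dating error only)")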

Relevance:

40.00%

Publisher:

Abstract:

Topological quantum error correction codes are currently among the most promising candidates for efficiently dealing with the decoherence effects inherently present in quantum devices. Numerically, their theoretical error threshold can be calculated by mapping the underlying quantum problem onto a related classical statistical-mechanical spin system with quenched disorder. Here we present results for the general fault-tolerant regime, in which both qubit and measurement errors are considered. Unlike previous studies, however, we vary the strengths of the different error sources independently. Our results highlight peculiar differences between toric and color codes. This study complements previous results published in New J. Phys. 13, 083006 (2011).
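
A minimal sketch of the classical mapping mentioned above, not the authors' code: for the toric code with independent qubit errors of probability p, the threshold calculation reduces to studying a two-dimensional random-bond Ising model with a fraction p of antiferromagnetic bonds along the Nishimori line. The Metropolis sampler below estimates the magnetisation (order parameter) at a given (p, T); the scan over p and the finite-size scaling needed to locate the threshold are omitted.

import numpy as np

def metropolis(L, p, T, sweeps, rng):
    """Metropolis sampling of a 2D +/-J random-bond Ising model."""
    spins = rng.choice([-1, 1], size=(L, L))
    # Quenched couplings: right/down bonds, flipped to -1 with probability p.
    Jx = np.where(rng.random((L, L)) < p, -1, 1)
    Jy = np.where(rng.random((L, L)) < p, -1, 1)
    for _ in range(sweeps):
        for i in range(L):
            for j in range(L):
                # Local field from the four neighbours (periodic boundaries).
                h = (Jx[i, j] * spins[i, (j + 1) % L]
                     + Jx[i, j - 1] * spins[i, j - 1]
                     + Jy[i, j] * spins[(i + 1) % L, j]
                     + Jy[i - 1, j] * spins[i - 1, j])
                dE = 2 * spins[i, j] * h
                if dE <= 0 or rng.random() < np.exp(-dE / T):
                    spins[i, j] *= -1
    return abs(spins.mean())  # magnetisation as the order parameter

rng = np.random.default_rng(3)
p = 0.05
T_nishimori = 2.0 / np.log((1 - p) / p)  # Nishimori line: e^{-2J/T} = p/(1-p)
print(metropolis(16, p, T_nishimori, sweeps=200, rng=rng))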

Relevance:

40.00%

Publisher:

Abstract:

Despite its importance in the global climate system, the central Arctic Ocean largely lacks age-calibrated marine geologic records reflecting the evolution of glacial cycles through the Pleistocene. This is especially true for sediments older than 200 ka. Three sites cored during the Integrated Ocean Drilling Program's Expedition 302, the Arctic Coring Expedition (ACEX), provide a 27 m continuous sedimentary section from the Lomonosov Ridge in the central Arctic Ocean. Two key biostratigraphic datums and constraints from the magnetic inclination data are used to anchor the chronology of these sediments back to the base of the Cobb Mountain subchron (1215 ka). Beyond 1215 ka, the two best-fitting geomagnetic models are used to investigate the nature of cyclostratigraphic change. Within this chronology we show that bulk and mineral magnetic properties of the sediments vary at the predicted Milankovitch frequencies. These cyclic variations record "glacial" and "interglacial" modes of sediment deposition on the Lomonosov Ridge, as evident in studies of ice-rafted debris and of stable isotopic and faunal assemblages for the last two glacial cycles, and were used to tune the age model. Potential errors, which largely arise from uncertainties in the nature of downhole paleomagnetic variability and from the choice of a tuning target, are handled by defining an error envelope based on the best-fitting cyclostratigraphic and geomagnetic solutions.
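
A minimal sketch, on synthetic data, of the kind of cyclostratigraphic check described above: compute the periodogram of a rock-magnetic time series on its age model and confirm that power concentrates at the Milankovitch periods (roughly 100, 41 and 23 kyr). Real records are unevenly sampled, so a Lomb-Scargle periodogram would normally replace the plain FFT used here.

import numpy as np

rng = np.random.default_rng(4)

# Synthetic "magnetic property" series at 1 kyr steps back to the base of the
# Cobb Mountain subchron, built from the three orbital periods plus noise.
t = np.arange(0.0, 1215.0, 1.0)  # ka
series = (np.sin(2 * np.pi * t / 100.0)          # eccentricity band
          + 0.7 * np.sin(2 * np.pi * t / 41.0)   # obliquity
          + 0.5 * np.sin(2 * np.pi * t / 23.0)   # precession
          + rng.normal(scale=0.5, size=t.size))

freqs = np.fft.rfftfreq(t.size, d=1.0)           # cycles per kyr
power = np.abs(np.fft.rfft(series - series.mean())) ** 2
top3 = freqs[np.argsort(power)[-3:]]
print("dominant periods (kyr):", sorted(np.round(1.0 / top3, 1), reverse=True))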

Relevance:

40.00%

Publisher:

Abstract:

The preparation and administration of medications is one of the most common and relevant functions of nurses, demanding great responsibility. Incorrect administration of medication currently constitutes a serious problem in health services and is considered one of the main adverse events suffered by hospitalized patients. Objectives: To identify the major errors in the preparation and administration of medication by nurses in hospitals, and to determine which factors lead to such errors. Methods: A systematic review of the literature. The inclusion criteria were: original scientific papers, complete, published from 2011 to May 2016 in the SciELO and LILACS databases, conducted in a hospital environment, addressing errors in the preparation and administration of medication by nurses, and written in Portuguese. After applying the inclusion criteria, a sample of 7 articles was obtained. Results: The main errors identified in the preparation and administration of medication were wrong dose (71.4%), wrong time (71.4%), inadequate dilution (57.2%), incorrect patient selection (42.8%), and wrong route (42.8%). The factors most commonly reported by the nursing staff as causes of error were lack of human resources (57.2%), inappropriate locations for the preparation of medication (57.2%), noise and low lighting at the preparation location (57.2%), untrained professionals (42.8%), fatigue and stress (42.8%), and inattention (42.8%). Conclusions: The literature shows a high error rate in the preparation and administration of medication, for various reasons, making it important that preventive measures be implemented.

Relevance:

40.00%

Publisher:

Abstract:

We provide a comprehensive study of out-of-sample forecasts for the EUR/USD exchange rate based on multivariate macroeconomic models and forecast combinations. In addition to standard loss-minimization measures, we use profit-maximization measures based on directional accuracy and trading strategies. When comparing predictive accuracy and profit measures, tests that are free of data-snooping bias are used. The results indicate that forecast combinations, in particular those based on principal components of the individual forecasts, improve on benchmark trading strategies, although the excess return per unit of deviation is limited.
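
A minimal sketch, with simulated forecasts rather than the paper's macroeconomic models, of a principal-components forecast combination scored by directional accuracy. The rescaling regression is run in-sample purely for illustration; the study's out-of-sample design and data-snooping-robust tests are not reproduced.

import numpy as np

rng = np.random.default_rng(5)
T, M = 200, 6                                # periods and candidate models
returns = rng.normal(scale=0.01, size=T)     # toy EUR/USD log returns
forecasts = returns[:, None] + rng.normal(scale=0.02, size=(T, M))  # noisy models

# First principal component of the forecast panel summarises the common signal.
F = forecasts - forecasts.mean(axis=0)
_, _, Vt = np.linalg.svd(F, full_matrices=False)
pc1 = F @ Vt[0]

# Rescale the component onto the target by OLS (in-sample, for illustration).
beta = (pc1 @ returns) / (pc1 @ pc1)
combined = beta * pc1

# Directional accuracy: how often the combined forecast gets the sign right.
print(f"directional accuracy: {np.mean(np.sign(combined) == np.sign(returns)):.2%}")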

Relevance:

40.00%

Publisher:

Abstract:

Introduction: Since 2005, the workload of community pharmacists in England has increased, with a concomitant increase in stress and work pressure. However, it is unclear how these factors affect the ability of community pharmacists to ensure accuracy during the dispensing process. This research seeks to extend our understanding of the nature, outcomes, and predictors of dispensing errors. Methodology: A retrospective analysis of a purposive sample of incident report forms (IRFs) from the database of a pharmacist indemnity insurance provider was conducted. Data collected included type of error, degree of harm caused, pharmacy and pharmacist demographics, and possible contributory factors. Results: In total, 339 files from UK community pharmacies, dating from June 2006 to November 2011, were retrieved from the database. Incorrect item (45.1%, n = 153/339) followed by incorrect strength (24.5%, n = 83/339) were the most common forms of error. Almost half (41.6%, n = 147/339) of the patients suffered some form of harm, ranging from minor harm (26.7%, n = 87/339) to death (0.3%, n = 1/339). Insufficient staff (51.6%, n = 175/339), similar packaging (40.7%, n = 138/339), and the pharmacy being busier than normal (39.5%, n = 134/339) were identified as key contributory factors. Cross-tabular analysis against the final-accuracy-check variable revealed significant associations with pharmacy location (P < 0.024), dispensary layout (P < 0.025), insufficient staff (P < 0.019), and being busier than normal (P < 0.005). Conclusion: The results provide an overview of some of the individual, organisational, and technical factors at play at the time of a dispensing error and highlight the need to examine further the relationships between these factors and dispensing-error occurrence.
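
A minimal sketch of the cross-tabular analysis reported above, using scipy's chi-square test of independence on a hypothetical 2x2 table (final accuracy check performed or not, by whether the pharmacy was busier than normal); the counts are invented, not the study's data.

import numpy as np
from scipy.stats import chi2_contingency

# Rows: pharmacy busier than normal (yes/no); columns: final accuracy check
# performed (yes/no). Counts are illustrative only.
table = np.array([[60, 74],
                  [120, 85]])
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")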