848 results for "Errors and omission"
Abstract:
The spill-over of the global financial crisis has uncovered the weaknesses in the governance of the EMU. As one of the most open economies in Europe, Hungary has suffered from the ups and downs of the global and European crisis and its mismanagement. Domestic policy blunders have complicated the situation. This paper examines how Hungary has withstood the ups and downs of the eurozone crisis. It also addresses the questions of whether the country has converged with or diverged from EMU membership, whether joining the EMU is still a good idea for Hungary, and whether the measures taken to ward off the crisis have actually helped to face the challenge of growth.
Abstract:
This research pursued the conceptualization, implementation, and verification of a system that enhances digital information displayed on an LCD panel for users with visual refractive errors. The target user group for this system is individuals with moderate to severe visual aberrations for whom conventional means of compensation, such as glasses or contact lenses, do not improve vision. This research is based on a priori knowledge of the user's visual aberration, as measured by a wavefront analyzer. With this information it is possible to generate images that, when displayed to the user, counteract his or her visual aberration. The method described in this dissertation advances the development of techniques for providing such compensation by integrating spatial information in the image as a means of eliminating some of the shortcomings inherent in using display devices such as monitors or LCD panels. Additionally, physiological considerations are discussed and integrated into the method. In order to provide a realistic sense of the performance of the methods described, they were tested by mathematical simulation in software, with a single-lens high-resolution CCD camera that models an aberrated eye, and finally with human subjects having various forms of visual aberrations. Experiments were conducted on these systems and the data collected were evaluated using statistical analysis. The experimental results revealed that the pre-compensation method produced a statistically significant improvement in vision on all of the systems. Although significant, the improvement was not as large as expected for the human subject tests. Further analysis suggests that, even under the controlled conditions employed for testing with human subjects, the characterization of the eye may be changing. This would require real-time monitoring of relevant variables (e.g. pupil diameter) and continuous adjustment of the pre-compensation process to yield maximum viewing enhancement.
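As an illustrative sketch only (the dissertation's exact algorithm is not described in this abstract), one common way to realise such pre-compensation is to derive the eye's point spread function (PSF) from the measured wavefront aberration and apply a regularised inverse (Wiener-style) filter to the image before display. All names, parameters and the stand-in Gaussian PSF below are assumptions, not the author's implementation:

```python
# Hypothetical sketch: pre-compensate an image for a known point spread
# function (PSF) of an aberrated eye via Wiener-style inverse filtering.
# The Gaussian PSF below is a stand-in; in practice the PSF would be derived
# from the wavefront-analyzer measurement of the user's aberration.
import numpy as np

def gaussian_psf(size, sigma):
    """Stand-in PSF; a real one would come from the measured wavefront."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    psf = np.exp(-(xx ** 2 + yy ** 2) / (2 * sigma ** 2))
    return psf / psf.sum()

def psf_otf(psf, shape):
    """Embed the PSF in a full-size array with its centre at (0, 0) so that
    its FFT gives the optical transfer function."""
    full = np.zeros(shape)
    r, c = psf.shape
    full[:r, :c] = psf
    return np.fft.fft2(np.roll(full, (-(r // 2), -(c // 2)), axis=(0, 1)))

def precompensate(image, psf, reg=1e-2):
    """Choose the displayed image so that, after blurring by the eye's PSF,
    it approximates the intended image (regularised inverse filter)."""
    H = psf_otf(psf, image.shape)
    G = np.conj(H) / (np.abs(H) ** 2 + reg)
    out = np.real(np.fft.ifft2(np.fft.fft2(image) * G))
    return np.clip(out, 0.0, 1.0)          # respect the display's dynamic range

if __name__ == "__main__":
    image = np.random.rand(256, 256)       # placeholder content
    psf = gaussian_psf(31, sigma=3.0)
    displayed = precompensate(image, psf)
    # What the aberrated eye would perceive when viewing the displayed image:
    perceived = np.real(np.fft.ifft2(np.fft.fft2(displayed) * psf_otf(psf, image.shape)))
```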
Abstract:
The Last Interglacial (LIG, 129-116 thousand years before present, ka) represents a test bed for climate model feedbacks in warmer-than-present high-latitude regions. However, mainly because aligning palaeoclimatic archives of different types and from different parts of the world is not trivial, a spatio-temporal picture of LIG temperature changes is difficult to obtain. Here, we have selected 47 polar ice core and sub-polar marine sediment records and developed a strategy to align them onto the recent AICC2012 ice core chronology. We provide the first compilation of high-latitude temperature changes across the LIG associated with a coherent temporal framework built between ice core and marine sediment records. Our new data synthesis highlights non-synchronous maximum temperature changes between the two hemispheres, with the Southern Ocean and Antarctic records showing an early warming compared to North Atlantic records. We also observe that warmer-than-present-day conditions persist for a longer period in southern high latitudes than in northern high latitudes. Finally, the amplitude of temperature change recorded at the onset and the demise of the LIG is larger at high northern latitudes than at high southern latitudes. We have also compiled four data-based time slices with temperature anomalies (relative to present-day conditions) at 115 ka, 120 ka, 125 ka and 130 ka and quantitatively estimated temperature uncertainties that include relative dating errors. This provides an improved benchmark for performing more robust model-data comparison. The surface temperature simulated by two General Circulation Models (CCSM3 and HadCM3) for 130 ka and 125 ka is compared to the corresponding time-slice data synthesis. This comparison shows that the models predict warmer-than-present conditions earlier than documented in the North Atlantic, while neither model is able to produce the reconstructed early Southern Ocean and Antarctic warming. Our results highlight the importance of producing a sequence of time slices rather than a single time slice averaging the LIG climate conditions.
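As a rough illustration of how a data-based time-slice anomaly with dating-error propagation might be assembled (the paper's actual alignment onto AICC2012 and its uncertainty treatment are more involved), the sketch below treats a record as (age, anomaly) points with an assumed 1-sigma dating error and Monte Carlo-perturbs the age scale; all values are placeholders:

```python
# Hypothetical sketch: estimate a temperature anomaly at a target age
# (e.g. 125 ka) from one record, propagating a relative dating error by
# Monte Carlo perturbation of the record's age scale.
import numpy as np

rng = np.random.default_rng(0)

def timeslice_anomaly(ages_ka, anomaly, age_sigma_ka, target_ka, n=2000):
    """Return mean and standard deviation of the anomaly at target_ka,
    shifting the whole age scale by a random dating error in each draw."""
    draws = np.empty(n)
    for i in range(n):
        shifted = ages_ka + rng.normal(0.0, age_sigma_ka)   # dating error
        draws[i] = np.interp(target_ka, shifted, anomaly)
    return draws.mean(), draws.std()

# Toy record: warmer-than-present plateau between ~128 and ~120 ka.
ages = np.arange(110, 135, 1.0)
anom = np.where((ages > 119) & (ages < 129), 2.0, 0.0)
mean_125, sd_125 = timeslice_anomaly(ages, anom, age_sigma_ka=2.0, target_ka=125.0)
```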
Abstract:
Topological quantum error correction codes are currently among the most promising candidates for efficiently dealing with the decoherence effects inherently present in quantum devices. Numerically, their theoretical error threshold can be calculated by mapping the underlying quantum problem to a related classical statistical-mechanical spin system with quenched disorder. Here, we present results for the general fault-tolerant regime, where we consider both qubit and measurement errors. However, unlike in previous studies, here we vary the strength of the different error sources independently. Our results highlight peculiar differences between toric and color codes. This study complements previous results published in New J. Phys. 13, 083006 (2011).
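A minimal sketch of the fault-tolerant error model described here, assuming a toric code with qubits on the edges of an L x L periodic lattice: data-qubit flips occur with probability p and syndrome readouts are wrong with an independent probability q, so the two error strengths can be varied separately. A decoder (e.g. minimum-weight matching over a syndrome history) would act on such samples; that step, and the statistical-mechanical mapping itself, are omitted:

```python
# Hypothetical sketch: sample one fault-tolerant error round on an L x L
# toric code, with independent strengths for qubit flips (p) and for
# syndrome-measurement errors (q), and return the observed syndrome.
import numpy as np

rng = np.random.default_rng(1)

def sample_round(L, p, q):
    # One data qubit lives on every horizontal and every vertical edge.
    h_err = rng.random((L, L)) < p          # X errors on horizontal edges
    v_err = rng.random((L, L)) < p          # X errors on vertical edges

    # True plaquette syndrome: parity of flipped qubits on the four edges
    # bounding each face (periodic boundaries).
    syndrome = (h_err
                ^ np.roll(h_err, -1, axis=0)   # bottom edge of the face
                ^ v_err
                ^ np.roll(v_err, -1, axis=1))  # right edge of the face

    # Faulty measurement: each syndrome bit is read out wrongly with prob q.
    meas_err = rng.random((L, L)) < q
    return syndrome ^ meas_err

observed = sample_round(L=16, p=0.05, q=0.02)
defect_density = observed.mean()
```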
Abstract:
Despite its importance in the global climate system, age-calibrated marine geologic records reflecting the evolution of glacial cycles through the Pleistocene are largely absent from the central Arctic Ocean. This is especially true for sediments older than 200 ka. Three sites cored during the Integrated Ocean Drilling Program's Expedition 302, the Arctic Coring Expedition (ACEX), provide a 27 m continuous sedimentary section from the Lomonosov Ridge in the central Arctic Ocean. Two key biostratigraphic datums and constraints from the magnetic inclination data are used to anchor the chronology of these sediments back to the base of the Cobb Mountain subchron (1215 ka). Beyond 1215 ka, two best-fitting geomagnetic models are used to investigate the nature of cyclostratigraphic change. Within this chronology we show that bulk and mineral magnetic properties of the sediments vary at predicted Milankovitch frequencies. These cyclic variations record "glacial" and "interglacial" modes of sediment deposition on the Lomonosov Ridge, as evident in studies of ice-rafted debris and of stable isotopic and faunal assemblages for the last two glacial cycles, and were used to tune the age model. Potential errors, which largely arise from uncertainties in the nature of downhole paleomagnetic variability and in the choice of a tuning target, are handled by defining an error envelope that is based on the best-fitting cyclostratigraphic and geomagnetic solutions.
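As an illustration of checking a sediment-property series on the tuned age model for Milankovitch periodicities (not the study's actual data or workflow), a Lomb-Scargle periodogram handles unevenly spaced ages; the toy series below has 100-kyr and 41-kyr cycles built in:

```python
# Hypothetical sketch: look for spectral power near Milankovitch periods
# (~100, 41, 23 kyr) in an unevenly sampled magnetic-property series using a
# Lomb-Scargle periodogram.  The series and ages below are synthetic.
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(2)

age_ka = np.sort(rng.uniform(0, 1200, 600))          # unevenly spaced ages
signal = (np.sin(2 * np.pi * age_ka / 100.0)         # toy 100-kyr cycle
          + 0.5 * np.sin(2 * np.pi * age_ka / 41.0)  # toy 41-kyr cycle
          + 0.3 * rng.standard_normal(age_ka.size))  # noise

periods_kyr = np.linspace(15, 150, 400)
ang_freq = 2 * np.pi / periods_kyr                   # lombscargle expects rad/kyr
power = lombscargle(age_ka, signal - signal.mean(), ang_freq)
strongest_periods = periods_kyr[np.argsort(power)[-3:]]
```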
Abstract:
The preparation and administration of medications is one of the most common and relevant functions of nurses, demanding great responsibility. Incorrect administration of medication currently constitutes a serious problem in health services and is considered one of the main adverse events suffered by hospitalized patients. Objectives: To identify the major errors in the preparation and administration of medication by nurses in hospitals, and to determine which factors lead to errors in the preparation and administration of medication. Methods: A systematic review of the literature. The inclusion criteria were: original scientific papers, available in full text, published from 2011 to May 2016 in the SciELO and LILACS databases, carried out in a hospital environment, addressing errors in the preparation and administration of medication by nurses, and written in Portuguese. After application of the inclusion criteria, a sample of 7 articles was obtained. Results: The main errors identified in the preparation and administration of medication were wrong dose (71.4%), wrong time (71.4%), inadequate dilution (57.2%), incorrect patient selection (42.8%) and incorrect route of administration (42.8%). The factors most commonly reported by the nursing staff as causes of error were lack of human resources (57.2%), inappropriate locations for the preparation of medication (57.2%), noise and low lighting at the preparation location (57.2%), untrained professionals (42.8%), fatigue and stress (42.8%) and inattention (42.8%). Conclusions: The literature shows a high error rate in the preparation and administration of medication, for various reasons, making it important that preventive measures be implemented.
Abstract:
We provide a comprehensive study of out-of-sample forecasts for the EUR/USD exchange rate based on multivariate macroeconomic models and forecast combinations. We use profit-maximization measures based on directional accuracy and trading strategies in addition to standard loss-minimization measures. When comparing predictive accuracy and profit measures, tests robust to data-snooping bias are used. The results indicate that forecast combinations, in particular those based on principal components of forecasts, help to improve over benchmark trading strategies, although the excess return per unit of deviation is limited.
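One common way to build a principal-components forecast combination is sketched below with placeholder data (the paper's exact construction, sample and evaluation scheme are not given in this abstract): the panel of individual forecasts is projected onto its first principal component and mapped back to the target's scale by a simple regression, which in a genuine out-of-sample exercise would be estimated on a training window only:

```python
# Hypothetical sketch: combine several competing EUR/USD forecasts through the
# first principal component of the forecast panel.  All values are synthetic.
import numpy as np

rng = np.random.default_rng(3)

T, M = 200, 6                              # 200 periods, 6 individual models
truth = rng.standard_normal(T).cumsum() * 0.01 + 1.10
forecasts = truth[:, None] + 0.02 * rng.standard_normal((T, M))

# First principal component of the (demeaned) forecast panel.
F = forecasts - forecasts.mean(axis=0)
_, _, vt = np.linalg.svd(F, full_matrices=False)
pc1 = F @ vt[0]                            # scores on the first component

# Map the component back onto the target's scale with a simple regression.
# (In practice this fit would use only a training window, not the full sample.)
beta = np.polyfit(pc1, truth, 1)
combined = np.polyval(beta, pc1)

rmse_combined = np.sqrt(np.mean((combined - truth) ** 2))
rmse_best_single = min(np.sqrt(np.mean((forecasts[:, m] - truth) ** 2))
                       for m in range(M))
```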
Abstract:
Introduction: Since 2005, the workload of community pharmacists in England has increased, with a concomitant increase in stress and work pressure. However, it is unclear how these factors are affecting the ability of community pharmacists to ensure accuracy during the dispensing process. This research seeks to extend our understanding of the nature, outcome, and predictors of dispensing errors. Methodology: A retrospective analysis of a purposive sample of incident report forms (IRFs) from the database of a pharmacist indemnity insurance provider was conducted. Data collected included: type of error, degree of harm caused, pharmacy and pharmacist demographics, and possible contributory factors. Results: In total, 339 files from UK community pharmacies were retrieved from the database. The files dated from June 2006 to November 2011. Incorrect item (45.1%, n = 153/339) followed by incorrect strength (24.5%, n = 83/339) were the most common forms of error. Almost half (41.6%, n = 147/339) of the patients suffered some form of harm, ranging from minor harm (26.7%, n = 87/339) to death (0.3%, n = 1/339). Insufficient staff (51.6%, n = 175/339), similar packaging (40.7%, n = 138/339) and the pharmacy being busier than normal (39.5%, n = 134/339) were identified as key contributory factors. Cross-tabular analysis against the final accuracy check variable revealed significant associations with the pharmacy location (P < 0.024), dispensary layout (P < 0.025), insufficient staff (P < 0.019), and busier than normal (P < 0.005) variables. Conclusion: The results provide an overview of some of the individual, organisational and technical factors at play at the time of a dispensing error and highlight the need to examine further the relationships between these factors and dispensing error occurrence.
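For illustration, the kind of cross-tabular association reported above can be tested with a chi-square test of independence; the 2 x 2 counts below are placeholders, not the study's data:

```python
# Hypothetical sketch: test for association between "insufficient staff" and
# whether a final accuracy check was performed, using a chi-square test of
# independence on a 2 x 2 contingency table.  Counts are placeholders.
import numpy as np
from scipy.stats import chi2_contingency

#                   accuracy check done   no accuracy check
table = np.array([[        60,                  115        ],   # insufficient staff
                  [        95,                   69        ]])  # adequate staff

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p_value:.3f}")
```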
Abstract:
Human radiosensitivity is a quantitative trait that generally follows a binomial distribution. Individual radiosensitivity, however, may deviate significantly from the mean (by 2-3 standard deviations). Thus, the same dose of radiation may result in different levels of genotoxic damage (commonly measured as chromosome aberration rates) in different individuals. There is a significant genetic component in individual radiosensitivity. It is related to carriership of variant alleles of various single-nucleotide polymorphisms (most of them in genes coding for proteins that function in DNA damage identification and repair); carriership of different numbers of alleles producing cumulative effects; amplification of gene copies coding for proteins responsible for radioresistance; mobile genetic elements; and others. Other factors influencing individual radioresistance include the radioadaptive response; the bystander effect; levels of endogenous substances with radioprotective and antimutagenic properties; and environmental factors such as lifestyle and diet, physical activity, psychoemotional state, hormonal state, certain drugs, and infections. These factors may have radioprotective or sensitising effects. There are evidently many factors that may significantly modulate the biological effects of ionising radiation. Thus, conventional methodologies for biodosimetry (specifically, cytogenetic methods) may produce significant errors if personal traits that may affect radioresistance are not accounted for.
Abstract:
Person tracking systems to date have either relied on motion detection or optical flow as a basis for person detection and tracking. As yet, systems have not been developed that utilise both these techniques. We propose a person tracking system that uses both, made possible by a novel hybrid optical flow-motion detection technique that we have developed. This provides the system with two methods of person detection, helping to avoid missed detections and the need to predict position, which can lead to errors in tracking and mistakes when handling occlusion situations. Our results show that our system is able to track people accurately, with an average error less than four pixels, and that our system outperforms the current CAVIAR benchmark system.
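A minimal sketch of the general idea of fusing motion detection with dense optical flow so that a person can be picked up by either cue, using standard OpenCV building blocks (background subtraction plus Farneback flow); this is not the authors' specific hybrid technique, and the video path and thresholds are placeholders:

```python
# Hypothetical sketch: combine background-subtraction motion masks with dense
# optical flow so a moving person is detected when either cue fires.
import cv2
import numpy as np

cap = cv2.VideoCapture("input.avi")                  # placeholder video path
bg_sub = cv2.createBackgroundSubtractorMOG2()
ok, prev = cap.read()
if not ok:
    raise SystemExit("could not read video")
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    motion_mask = bg_sub.apply(frame)                        # cue 1: motion
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    flow_mag = np.linalg.norm(flow, axis=2)                  # cue 2: flow
    flow_mask = (flow_mag > 1.0).astype(np.uint8) * 255

    combined = cv2.bitwise_or(motion_mask, flow_mask)        # either cue fires
    combined = cv2.morphologyEx(combined, cv2.MORPH_OPEN,
                                np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(combined, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    people = [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) > 500]

    prev_gray = gray
```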
Error, Bias, and Long-Branch Attraction in Data for Two Chloroplast Photosystem Genes in Seed Plants
Abstract:
Sequences of two chloroplast photosystem genes, psaA and psbB, together comprising about 3,500 bp, were obtained for all five major groups of extant seed plants and several outgroups among other vascular plants. Strongly supported, but significantly conflicting, phylogenetic signals were obtained in parsimony analyses from partitions of the data into first and second codon positions versus third positions. In the former, both genes supported monophyletic gymnosperms, with Gnetales closely related to certain conifers. In the latter, Gnetales are inferred to be the sister group of all other seed plants, with gymnosperms paraphyletic. None of the data supported the modern "anthophyte hypothesis," which places Gnetales as the sister group of flowering plants. A series of simulation studies were undertaken to examine the error rate for parsimony inference. Three kinds of errors were examined: random error, systematic bias (both properties of finite data sets), and statistical inconsistency owing to long-branch attraction (an asymptotic property). Parsimony reconstructions were extremely biased for third-position data for psbB. Regardless of the true underlying tree, a tree in which Gnetales are sister to all other seed plants was likely to be reconstructed for these data. None of the combinations of genes or partitions permits the anthophyte tree to be reconstructed with high probability. Simulations of progressively larger data sets indicate the existence of long-branch attraction (statistical inconsistency) for third-position psbB data if either the anthophyte tree or the gymnosperm tree is correct. This is also true for the anthophyte tree using either psaA third positions or psbB first and second positions. A factor contributing to bias and inconsistency is extremely short branches at the base of the seed plant radiation, coupled with extremely high rates in Gnetales and non-seed-plant outgroups.
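The long-branch attraction effect described above can be illustrated with a small simulation (a generic Felsenstein-zone example, not the paper's seed-plant analysis): characters evolve under a symmetric two-state model on a four-taxon tree whose two long branches are not sisters, and parsimony-informative site patterns are tallied:

```python
# Hypothetical sketch: parsimony long-branch attraction on a 4-taxon tree with
# a symmetric two-state model.  Taxa A and C sit on long branches but are NOT
# sisters in the true tree ((A,B),(C,D)); parsimony nevertheless tends to
# group A with C as the data grow.  Branch lengths are illustrative only.
import numpy as np

rng = np.random.default_rng(4)

def evolve(state, branch_len, n):
    """Two-state symmetric model: P(change) = (1 - exp(-2t)) / 2."""
    p_change = 0.5 * (1.0 - np.exp(-2.0 * branch_len))
    flips = rng.random(n) < p_change
    return state ^ flips

def simulate(n_sites, long=1.5, short=0.05):
    x = rng.integers(0, 2, n_sites).astype(bool)   # ancestor of A and B
    y = evolve(x, short, n_sites)                  # ancestor of C and D
    A, B = evolve(x, long, n_sites), evolve(x, short, n_sites)
    C, D = evolve(y, long, n_sites), evolve(y, short, n_sites)
    # Each parsimony-informative pattern on 4 taxa supports one topology.
    ab_cd = np.sum((A == B) & (C == D) & (A != C))   # the true tree
    ac_bd = np.sum((A == C) & (B == D) & (A != B))   # long-branch artefact
    ad_bc = np.sum((A == D) & (B == C) & (A != B))
    return ab_cd, ac_bd, ad_bc

counts = simulate(100_000)
# With long A and C branches, ac_bd typically exceeds ab_cd, so parsimony
# settles on the wrong tree as sites are added (statistical inconsistency).
```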
Abstract:
This paper reports on the performance of fifty-eight 11- to 12-year-olds on a spatial visualization task and a spatial orientation task. The students completed these tasks and explained their thinking during individual interviews. The qualitative data were analysed to inform pedagogical content knowledge for spatial activities. The study revealed that “matching” or “matching and eliminating” were the typical strategies that students employed on these spatial tasks. However, errors in making associations between parts of the same or different shapes were noted. Students also experienced general difficulties with visual memory and with using language to explain their thinking. The students’ specific difficulties in spatial visualization related to obscured items, the perspective used, and the placement and orientation of shapes.
Abstract:
Risks and uncertainties are inevitable in engineering projects and infrastructure investments. Decisions about investment in infrastructure, such as maintenance, rehabilitation and construction works, can pose risks and may generate significant impacts on social, cultural, environmental and other related issues. This report presents the results of a literature review of current practice in identifying, quantifying and managing risks, and predicting impacts, as part of the planning and assessment process for infrastructure investment proposals. In assessing proposals for investment in infrastructure, it is necessary to consider social, cultural and environmental risks and impacts to the overall community, as well as financial risks to the investor. The report defines and explains the concepts of risk and uncertainty, and describes the three main methodological approaches to the analysis of risk and uncertainty in investment planning for infrastructure: examining a range of scenarios or options, sensitivity analysis, and a statistical probability approach, listed here in order of increasing merit and complexity. Forecasts of costs, benefits and community impacts of infrastructure are recognised as central aspects of developing and assessing investment proposals, and increasingly complex modelling techniques are being used for investment evaluation. The literature review identified forecasting errors as the major cause of risk. The report contains a summary of the broad nature of decision-making tools used by governments and other organisations in Australia, New Zealand, Europe and North America, and shows their overall approach to risk assessment in assessing public infrastructure proposals. While there are established techniques to quantify financial and economic risks, quantification is far less developed for political, social and environmental risks and impacts. For risks that cannot be readily quantified, assessment techniques commonly include classification or rating systems for likelihood and consequence; the report outlines the system used by the Australian Defence Organisation and in the Australian Standard on risk management. After each risk is identified and quantified or rated, consideration can be given to reducing the risk and to managing any remaining risk as part of the scope of the project. The literature review identified the use of risk mapping techniques by a North American chemical company and by the Australian Defence Organisation. The review has enabled a risk assessment strategy to be developed, and will underpin an examination of the feasibility of developing a risk assessment capability using a probability approach.
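For illustration, the statistical probability approach mentioned above is often implemented as a Monte Carlo analysis of net present value; the sketch below uses placeholder cost and benefit distributions, not figures from the report:

```python
# Hypothetical sketch of the statistical-probability approach to investment
# risk: sample uncertain costs and benefits from assumed distributions and
# read off the probability that the project's net present value is negative.
# All distributions and figures below are placeholders.
import numpy as np

rng = np.random.default_rng(5)
n = 100_000

capital_cost = rng.lognormal(mean=np.log(50e6), sigma=0.2, size=n)   # $
annual_benefit = rng.normal(loc=6e6, scale=1.5e6, size=n)            # $/yr
years, discount = 30, 0.07
annuity = (1 - (1 + discount) ** -years) / discount                  # PV factor

npv = annual_benefit * annuity - capital_cost
prob_loss = np.mean(npv < 0)                 # chance the project loses money
p10, p50, p90 = np.percentile(npv, [10, 50, 90])
```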