915 results for "Error judicial"


Relevance: 20.00%

Abstract:

This paper introduces an extended hierarchical task analysis (HTA) methodology devised to evaluate and compare user interfaces on volumetric infusion pumps. The pumps were studied along the dimensions of overall usability and propensity for generating human error. With HTA as our framework, we analyzed six pumps on a variety of common tasks using Norman's action theory. The introduced evaluation method divides the problem space between the external world of the device interface and the user's internal cognitive world, allowing potential user errors to be predicted at the human-device level. As an example, one detailed analysis is provided, comparing two different pumps on two separate tasks. The results demonstrate the inherent variation among infusion pumps in hospital use today, variation that is often the cause of usage errors. The reported methodology is a useful tool for evaluating human performance and predicting potential user errors with infusion pumps and other simple medical devices.
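The abstract does not reproduce the analysis itself. As a rough illustration of the kind of decomposition an HTA involves, the sketch below models a pump-programming task as a tree of subtasks and walks it, collecting steps annotated as likely slip points. All task names and error annotations are hypothetical, not taken from the paper.

```python
# Minimal sketch of a hierarchical task analysis (HTA) tree for an
# infusion-pump task. Task names and error annotations are invented
# illustrations, not the paper's actual analysis.

def collect_error_prone_steps(task, path=()):
    """Depth-first walk of an HTA tree, returning the paths of steps
    flagged as likely sources of user error."""
    path = path + (task["name"],)
    flagged = [path] if task.get("error_prone") else []
    for sub in task.get("subtasks", []):
        flagged.extend(collect_error_prone_steps(sub, path))
    return flagged

set_rate = {
    "name": "program infusion rate",
    "subtasks": [
        {"name": "select rate field"},
        {"name": "enter value on keypad", "error_prone": True},  # decimal-point slips
        {"name": "confirm entry", "error_prone": True},          # skipped confirmation
    ],
}

steps = collect_error_prone_steps(set_rate)
```

Comparing two pumps then amounts to building one such tree per device for the same clinical task and contrasting where the flagged steps fall.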

Relevance: 20.00%

Abstract:

Is the solution to medical errors medical or cognitive? In this AMIA 2001 panel on medical error, we argued that medical error is primarily an issue for cognitive science and engineering, not for medicine, although knowledge of the practice of medicine is essential for the research and prevention of medical errors. The three panelists presented studies demonstrating that cognitive research is the foundation both for theories of medical error and for error-reduction interventions.

Relevance: 20.00%

Abstract:

Healthcare has been slow in using human factors principles to reduce medical errors. The Center for Devices and Radiological Health (CDRH) recognizes that a lack of attention to human factors during product development may lead to errors that have the potential for patient injury, or even death. In response to the need to reduce medication errors, the National Coordinating Council for Medication Error Reporting and Prevention (NCC MERP) released a taxonomy that provides a standard language for reporting medication errors. This project maps the NCC MERP taxonomy of medication errors to MedWatch reports of medical errors involving infusion pumps. Of particular interest are human factors associated with medical device errors. The NCC MERP taxonomy is limited in mapping information from MedWatch because of its focus on the medical device and the format of reporting.
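One field of the NCC MERP taxonomy is a harm-severity index (Categories A-I, from circumstances capable of causing error up to patient death). As a hedged sketch of the kind of mapping the project describes, the snippet below classifies a hypothetical device-error report into a severity category; the report fields and decision rules are invented for illustration and are not the official NCC MERP algorithm.

```python
# NCC MERP harm-severity index (Categories A-I), paraphrased.
NCC_MERP_SEVERITY = {
    "A": "circumstances with the capacity to cause error",
    "B": "error occurred but did not reach the patient",
    "C": "error reached the patient, no harm",
    "D": "error reached the patient, monitoring required",
    "E": "temporary harm, intervention required",
    "F": "temporary harm, hospitalization required",
    "G": "permanent patient harm",
    "H": "intervention required to sustain life",
    "I": "patient death",
}

def classify_report(reached_patient, harm, died=False):
    """Toy severity mapper for a device-error report.
    The decision rules are illustrative only."""
    if died:
        return "I"
    if not reached_patient:
        return "B"
    if harm == "none":
        return "C"
    if harm == "temporary":
        return "E"
    return "G"

category = classify_report(reached_patient=True, harm="none")
```

The mapping difficulty the abstract reports arises because MedWatch narratives often lack exactly the fields (patient reach, harm level) such a classifier needs.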

Relevance: 20.00%

Abstract:

It is becoming clear that, if we are to reduce the rate of medical errors, it will have to be done at the level of the practicing physician. The purpose of this project was to survey physicians in Alabama concerning their perceptions of medical error and to obtain their thoughts and wishes regarding medical education in the area of medical errors. The information will be used in the development of a physician education program.

Relevance: 20.00%

Abstract:

Next-generation sequencing (NGS) is a valuable tool for the detection and quantification of HIV-1 variants in vivo. However, these technologies require detailed characterization and control of artificially induced errors to be applicable for accurate haplotype reconstruction. To investigate the occurrence of substitutions, insertions, and deletions at the individual steps of RT-PCR and NGS, 454 pyrosequencing was performed on amplified and non-amplified HIV-1 genomes. Artificial recombination was explored by mixing five different HIV-1 clonal strains (5-virus-mix) and applying different RT-PCR conditions followed by 454 pyrosequencing. Error rates ranged from 0.04% to 0.66% and were similar in amplified and non-amplified samples. Discrepancies were observed between forward and reverse reads, indicating that most errors were introduced during the pyrosequencing step. Using the 5-virus-mix, non-optimized, standard RT-PCR conditions introduced artificial recombinants in at least 30% of the reads, which subsequently led to an underestimation of true haplotype frequencies. We minimized the fraction of recombinants to 0.9-2.6% with optimized, artifact-reducing RT-PCR conditions. This approach enabled correct haplotype reconstruction and frequency estimations consistent with reference data obtained by single-genome amplification. RT-PCR conditions are crucial for correct frequency estimation and analysis of haplotypes in heterogeneous virus populations. We developed an RT-PCR procedure that generates NGS data suitable for reliable haplotype reconstruction and quantification.
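The recombination artifact can be illustrated with a toy haplotype check: given the sequences of two parental strains, a read whose calls at informative positions switch between parents at least once is a candidate PCR recombinant. The sequences below are invented for illustration; real pipelines work on aligned reads against known clonal references.

```python
def parent_assignments(read, parent_a, parent_b):
    """Assign each informative position (where the parents differ) to
    parent 'A' or 'B'; bases matching neither parent (sequencing
    errors) are skipped."""
    calls = []
    for r, a, b in zip(read, parent_a, parent_b):
        if a == b:
            continue  # uninformative position
        if r == a:
            calls.append("A")
        elif r == b:
            calls.append("B")
    return calls

def is_recombinant(read, parent_a, parent_b):
    calls = parent_assignments(read, parent_a, parent_b)
    # Any switch between parental assignments marks a candidate recombinant.
    return any(x != y for x, y in zip(calls, calls[1:]))

A = "ACGTACGT"          # toy parental strain A
B = "ATGTATGA"          # toy parental strain B
chimera = "ACGTATGA"    # matches A on the left, B on the right
```

Counting reads for which `is_recombinant` is true across a run gives the recombinant fraction the abstract reports (30% vs. 0.9-2.6% under the two RT-PCR regimes).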

Relevance: 20.00%

Abstract:

Intensity modulated radiation therapy (IMRT) is a technique that delivers a highly conformal dose distribution to a target volume while attempting to maximally spare the surrounding normal tissues. IMRT is a common treatment modality for head and neck (H&N) cancers, and the presence of many critical structures in this region requires accurate treatment delivery. The Radiological Physics Center (RPC) acts as both a remote and an on-site quality assurance agency that credentials institutions participating in clinical trials. To date, about 30% of all IMRT participants have failed the RPC's remote audit using the IMRT H&N phantom. The purpose of this project was to evaluate possible causes of the H&N IMRT delivery errors observed by the RPC, specifically IMRT treatment plan complexity and the use of improper dosimetry data from machines that were thought to be matched but in reality were not. Eight H&N IMRT plans with a range of complexity, defined by total MU (1460-3466), number of segments (54-225), and modulation complexity score (MCS) (0.181-0.609), were created in Pinnacle v.8m. These plans were delivered to the RPC's H&N phantom on a single Varian Clinac. One of the IMRT plans (1851 MU, 88 segments, MCS = 0.469) was equivalent to the median H&N plan from 130 previous RPC H&N phantom irradiations. This average IMRT plan was also delivered on four matched Varian Clinac machines, and the dose distribution was calculated using a different 6 MV beam model. Radiochromic film and TLD within the phantom were used to analyze the dose profiles and absolute doses, respectively. The measured and calculated dose distributions were compared to evaluate dosimetric accuracy. All deliveries met the RPC acceptance criteria of ±7% absolute dose difference and 4 mm distance-to-agreement (DTA). Additionally, gamma index analysis was performed for all deliveries using both ±7%/4 mm and ±5%/3 mm criteria.
Increasing the treatment plan complexity by varying the total MU, the number of segments, or the MCS showed no clear trend toward increased dosimetric error as determined by absolute dose difference, DTA, or gamma index. Varying the delivery machine and the beam model (a Clinac 6EX 6 MV beam model vs. a Clinac 21EX 6 MV model) likewise showed no clear trend toward increased dosimetric error using the same criteria.
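The gamma index combines the two acceptance criteria into one pass/fail metric: a measured point passes when, minimized over all reference points, sqrt((dose difference / dose tolerance)^2 + (distance / DTA tolerance)^2) <= 1. The 1-D sketch below uses the ±7%/4 mm criterion from the abstract; the dose profile values are invented for illustration.

```python
import math

def gamma_1d(measured, reference, positions, dose_tol=0.07, dta_tol=4.0):
    """1-D gamma index per measured point: the minimum over all
    reference points of the combined dose-difference / distance
    penalty. Dose tolerance is taken as a fraction of the reference
    maximum (a global gamma)."""
    d_norm = dose_tol * max(reference)
    gammas = []
    for xm, dm in zip(positions, measured):
        g = min(
            math.sqrt(((dm - dr) / d_norm) ** 2 + ((xm - xr) / dta_tol) ** 2)
            for xr, dr in zip(positions, reference)
        )
        gammas.append(g)
    return gammas

positions = [0.0, 2.0, 4.0, 6.0]      # mm, hypothetical profile
reference = [1.00, 0.98, 0.50, 0.05]  # calculated relative dose
measured  = [1.02, 0.97, 0.55, 0.06]  # film-measured relative dose

gammas = gamma_1d(measured, reference, positions)
pass_rate = sum(g <= 1.0 for g in gammas) / len(gammas)
```

In clinical audits the same computation runs in 2-D over film planes, and the pass rate under each criterion is the quantity compared across deliveries.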

Relevance: 20.00%

Abstract:

Scholars have increasingly theorized, and debated, the decision by states to create and delegate authority to international courts, as well as the subsequent autonomy and behavior of those courts, with principal–agent and trusteeship models disagreeing on the nature and extent of states' influence on international judges. This article formulates and tests a set of principal–agent hypotheses about the ways in which, and the conditions under which, member states are able to use their powers of judicial nomination and appointment to influence the endogenous preferences of international judges. The empirical analysis surveys the record of all judicial appointments to the Appellate Body (AB) of the World Trade Organization over a 15-year period. We present a view of an AB appointment process that, far from representing a pure search for expertise, is deeply politicized and offers member-state principals opportunities to influence AB members ex ante and possibly ex post. We further demonstrate that the AB nomination process has become progressively more politicized over time as member states, responding to earlier and controversial AB decisions, became far more concerned about judicial activism and more interested in the substantive opinions of AB candidates, systematically championing candidates whose views on key issues most closely approached their own, and opposing candidates perceived to be activist or biased against their substantive preferences. Although specific to the WTO, our theory and findings have implications for the judicial politics of a large variety of global and regional international courts and tribunals.

Relevance: 20.00%

Abstract:

RATIONALE: In biomedical journals, authors sometimes use the standard error of the mean (SEM) for data description, which has been called inappropriate or incorrect. OBJECTIVE: To assess the frequency of incorrect use of the SEM in articles in three selected cardiovascular journals. METHODS AND RESULTS: All original journal articles published in 2012 in Cardiovascular Research, Circulation: Heart Failure, and Circulation Research were assessed by two assessors for inappropriate use of the SEM when providing descriptive information on empirical data. We also assessed whether the authors state in the methods section that the SEM will be used for data description. Of 441 articles included in this survey, 64% (282 articles) contained at least one instance of incorrect use of the SEM, with two journals having a prevalence above 70% and Circulation: Heart Failure having the lowest value (27%). In 81% of articles with incorrect use of the SEM, the authors had explicitly stated that they use the SEM for data description, and in 89%, SEM bars were also used instead of 95% confidence intervals. Basic science studies had a 7.4-fold higher level of inappropriate SEM use (74%) than clinical studies (10%). LIMITATIONS: The selection of the three cardiovascular journals was based on a subjective initial impression of inappropriate SEM use; the observed results are not representative of all cardiovascular journals. CONCLUSION: In three selected cardiovascular journals we found a high prevalence of inappropriate SEM use, together with explicit methods statements declaring its use for data description, especially in basic science studies. To improve this situation, these and other journals should provide clear instructions to authors on how to report descriptive information on empirical data.
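The core of the complaint is arithmetic: the SEM is the standard deviation divided by sqrt(n), so it shrinks with sample size and understates the spread of the data, while a 95% confidence interval of the mean is roughly mean ± 1.96 × SEM. A minimal illustration with invented data:

```python
import math
import statistics

data = [4.1, 5.0, 3.8, 4.6, 5.3, 4.2, 4.9, 4.4]  # hypothetical measurements
n = len(data)

mean = statistics.mean(data)
sd = statistics.stdev(data)          # describes the spread of the data
sem = sd / math.sqrt(n)              # describes the precision of the mean
ci95 = (mean - 1.96 * sem, mean + 1.96 * sem)  # normal approximation

# SEM bars are sqrt(n) times narrower than SD bars, which is why using
# them for data description makes variability look smaller than it is.
```

Reporting SD (or a confidence interval, clearly labeled) instead of bare SEM bars is the correction the authors call for.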

Relevance: 20.00%

Abstract:

Medical errors, in particular those resulting in harm, pose a serious situation for patients ("first victims") and the healthcare workers involved ("second victims") and can have long-lasting and distressing consequences. To prevent a second traumatization, appropriate and empathic interaction with all persons involved is essential, in addition to error analysis. Patients share a nearly universal, broad preference for a complete disclosure of incidents, regardless of age, gender, or education. This includes the personal, timely, and unambiguous disclosure of the adverse event, information relating to the event, its causes and consequences, and an apology and sincere expression of regret. While the majority of healthcare professionals generally support an honest and open disclosure of adverse events, they also face various barriers that impede disclosure (e.g., fear of legal consequences). Despite its essential importance, disclosure of adverse events in practice occurs in ways that are rarely acceptable to patients and their families. The staff involved often experience acute distress and an intense emotional response to the event, which may become chronic and increase the risk of depression, burnout, and post-traumatic stress disorders. Communication with peers is vital for people to be able to cope constructively and protectively with harmful errors. Survey studies among healthcare workers show, however, that they often do not receive sufficient individual and institutional support. Healthcare organizations should prepare for medical errors and harmful events and implement a communication plan and a support system that covers the requirements and different needs of patients and the staff involved.

Relevance: 20.00%

Abstract:

Approximate models (proxies) can be employed to reduce the computational costs of estimating uncertainty. The price to pay is that the approximations introduced by the proxy model can bias the estimation. To avoid this problem and ensure a reliable uncertainty quantification, we propose to combine functional data analysis and machine learning to build error models that allow us to obtain an accurate prediction of the exact response without solving the exact model for all realizations. We build the relationship between proxy and exact model on a learning set of geostatistical realizations for which both exact and approximate solvers are run. Functional principal component analysis (FPCA) is used to investigate the variability in the two sets of curves and to reduce the dimensionality of the problem while maximizing the retained information. Once obtained, the error model can be used to predict the exact response of any realization on the basis of the proxy response alone. This methodology is purpose-oriented, as the error model is constructed directly for the quantity of interest rather than for the state of the system. Also, the dimensionality reduction performed by FPCA provides a diagnostic of the quality of the error model, assessing the informativeness of the learning set and the fidelity of the proxy to the exact model. The possibility of obtaining a prediction of the exact response for any newly generated realization suggests that the methodology can be effectively used beyond the context of uncertainty quantification, in particular for Bayesian inference and optimization.
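The paper's error model couples FPCA with machine learning; as a much-simplified stand-in for that idea, the sketch below fits, at each output point, a least-squares linear correction from proxy to exact responses over a small learning set, then predicts the exact curve of a new realization from its proxy curve alone. The data and the pointwise-linear form are illustrative assumptions, not the paper's method.

```python
def fit_pointwise_correction(proxy_set, exact_set):
    """For each output index, fit exact ~ a + b * proxy by least
    squares across the learning-set realizations."""
    coeffs = []
    n = len(proxy_set)
    for j in range(len(proxy_set[0])):
        xs = [run[j] for run in proxy_set]
        ys = [run[j] for run in exact_set]
        mx, my = sum(xs) / n, sum(ys) / n
        b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
        coeffs.append((my - b * mx, b))
    return coeffs

def predict_exact(proxy_curve, coeffs):
    """Predict the exact response from the proxy response alone."""
    return [a + b * x for (a, b), x in zip(coeffs, proxy_curve)]

# Toy learning set: the proxy consistently underestimates the exact
# response (here exact = 2 * proxy + offset at each output point).
proxy_set = [[0.0, 1.0, 2.0], [1.0, 2.0, 3.0], [2.0, 3.0, 4.0]]
exact_set = [[0.5, 2.1, 4.1], [2.5, 4.1, 6.1], [4.5, 6.1, 8.1]]

coeffs = fit_pointwise_correction(proxy_set, exact_set)
corrected = predict_exact([3.0, 4.0, 5.0], coeffs)
```

In the paper's setting, the regression is performed on FPCA scores of the discrepancy curves rather than pointwise values, which is what keeps the error model low-dimensional.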