Abstract:
Medication errors, one of the most frequent types of medical errors, are a common cause of patient harm in hospital systems today. Bedside nurses are positioned to encounter many of these errors, since they are present at both the start of the medication process (ordering/prescribing) and its end (administration). One recommendation of the Institute of Medicine (IOM) report "To Err is Human" was for organizations to identify and learn from medical errors through event reporting systems. While many organizations have reporting systems in place, research studies report significant underreporting by nurses. A systematic review of the literature was performed to identify factors contributing to the reporting and non-reporting of medication errors by nurses at the bedside.

Articles included in the literature review were primary or secondary studies, dated January 1, 2000 to July 2009, related to nursing medication error reporting. All 634 articles were reviewed using an algorithm developed to standardize the review process and filter out those that did not meet the study criteria. In addition, 142 article bibliographies were reviewed to find studies not captured by the original literature search.

After reviewing the 634 articles and the additional 108 articles discovered in the bibliography review, 41 articles met the study criteria and were included in the systematic review results.

Fear of punitive reactions to medication errors was a frequent barrier to error reporting. Nurses fear reactions from their leadership, peers, patients and their families, nursing boards, and the media. Anonymous reporting systems, and departments or organizations with a strong safety culture in place, helped to encourage the reporting of medication errors by nursing staff.

Many of the studies included in this review do not yield generalizable results: most took place in single institutions or organizations with limited sample sizes. Stronger studies with larger samples, using validated data collection methods, are needed to establish firmer correlations between safety culture and nurse error reporting.
Abstract:
A large number of ridge regression estimators have been proposed and used with little knowledge of their true distributions. Because of this lack of knowledge, these estimators cannot be used to test hypotheses or to form confidence intervals.

This paper presents a basic technique for deriving the exact distribution functions for a class of generalized ridge estimators. The technique is applied to five prominent generalized ridge estimators, and graphs of the resulting distribution functions are presented. The actual behavior of these estimators is found to be considerably different from the behavior generally assumed for ridge estimators.

The paper also uses the derived distributions to examine the mean squared error properties of the estimators, and presents a technique for developing confidence intervals based on the generalized ridge estimators.
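For context, a generalized ridge estimator replaces the single ridge constant of ordinary ridge regression with a matrix K of shrinkage constants, giving beta_hat(K) = (X'X + K)^(-1) X'y. The R sketch below illustrates this basic form on hypothetical data; the choices of K are purely illustrative and are not the five estimators analyzed in the paper.

```r
# Minimal sketch of the generalized ridge family (hypothetical data).
set.seed(1)
n <- 50; p <- 3
X <- matrix(rnorm(n * p), n, p)
beta <- c(2, -1, 0.5)
y <- X %*% beta + rnorm(n)

generalized_ridge <- function(X, y, K) {
  # K is a p x p (usually diagonal) matrix of ridge constants
  solve(crossprod(X) + K, crossprod(X, y))
}

generalized_ridge(X, y, diag(0, p))    # K = 0 recovers ordinary least squares
generalized_ridge(X, y, diag(0.5, p))  # positive constants shrink the estimates
```

Setting K = 0 recovers ordinary least squares; positive entries in K shrink the coefficient estimates, which is the behavior whose exact distribution the paper derives.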
Abstract:
Errors in the administration of medication represent a significant loss of medical resources and pose life-altering or life-threatening risks to patients. This paper considered the question: what impact do Computerized Physician Order Entry (CPOE) systems have on medication errors in the hospital inpatient environment? Previous reviews have examined evidence of the impact of CPOE on medication errors but have come to ambiguous conclusions about the effects of CPOE and decision support systems (DSS). Forty-three papers were identified. Thirty-one demonstrated a significant reduction in prescribing error rates for all or some drug types; decreases in minor errors were most often reported. Several studies reported increases in the rate of duplicate orders and failures to remove contraindicated drugs, often attributed to inappropriate design or to an inability to operate the system properly. The evidence that CPOE reduces medication administration errors is compelling, though it is limited by modest study sample sizes and designs.
Abstract:
Next-generation sequencing (NGS) technology has become a prominent tool in biological and biomedical research. However, NGS data analysis, such as de novo assembly, mapping, and variant detection, is far from mature, and the high sequencing error rate is one of the major problems. To minimize the impact of sequencing errors, we developed a highly robust and efficient method, MTM, to correct errors in NGS reads. We demonstrated the effectiveness of MTM on both single-cell data with highly non-uniform coverage and normal data with uniformly high coverage, showing that MTM's performance does not depend on the coverage of the sequencing reads. MTM was also compared with Hammer and Quake, the best methods for correcting non-uniform and uniform data, respectively. For non-uniform data, MTM outperformed both Hammer and Quake; for uniform data, MTM performed better than Quake and comparably to Hammer. By correcting errors with MTM, the quality of downstream analyses, such as mapping and SNP detection, was improved. SNP calling is a major application of NGS technologies; however, the existence of sequencing errors complicates this process, especially at low coverage (…)
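The abstract does not detail MTM's algorithm, but a common ingredient of NGS error correctors such as Quake is the k-mer spectrum: k-mers that occur only rarely across all reads are likely to contain sequencing errors. The R sketch below illustrates that generic idea on toy reads; it is not MTM's published method.

```r
# Generic k-mer-spectrum idea (NOT MTM's algorithm): rare k-mers are
# flagged as probable sequencing errors.
reads <- c("ACGTACGTAC", "ACGTACGTAC", "ACGTACCTAC")  # third read has one error
k <- 5
kmers <- unlist(lapply(reads, function(r)
  substring(r, 1:(nchar(r) - k + 1), k:nchar(r))))     # all k-mers in all reads
counts <- table(kmers)
names(counts)[counts < 2]   # singleton k-mers: candidates for correction
```

Real correctors then replace a flagged k-mer with a high-frequency neighbor within a small edit distance; that search step is omitted here for brevity.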
Abstract:
Over the last decade, adverse events and medical errors have become a main focus of quality and safety standards in the U.S. healthcare system (Weinstein & Henderson, 2009). When a medical error occurs, the disclosure of the error and the practices surrounding it become a focal point of the healthcare process. Patients and family members who have experienced a medical error may be able to provide knowledge and insight on how to improve the disclosure process. However, patients and family members are not typically involved in the disclosure process, so their experiences go unnoticed.

The purpose of this research was to explore how best to include patients and family members in the disclosure process following a medical error. The research consisted of 28 qualitative interviews from three stakeholder groups: hospital administrators, clinical service providers, and patients and family members. Participants were asked for their ideas and suggestions on how best to include patients and family members in the disclosure process. Framework Analysis was used to analyze the data and identify prevalent themes related to the primary research question. A secondary aim was to index categories created from the collected interviews. Data were drawn from the Texas Disclosure and Compensation Study, with Dr. Eric Thomas as Principal Investigator; full acknowledgement for access to these data is given to Dr. Thomas.

The themes revealed that each stakeholder group was interested in and open to including patients and family members in the disclosure process, and that disclosure should not be a "one-way" avenue. The themes yielded many suggestions on how best to include patients and family members in the disclosure of a medical error, and the secondary aims revealed several ways to assess the ideas and suggestions given by the stakeholders. Overall, acceptability of obtaining the perspective of patients and family members was the most common theme, and comparison of the stakeholder groups indicated that including patients and family members would help improve hospital disclosure practices.

Conclusions included a list of recommendations and measurable strategies that could give hospitals key stakeholder insights on how to improve their disclosure processes. Sharing patients' and family members' experiences with healthcare providers can encourage a culture shift in which patients are valued as active participants in hospital practices. To my knowledge, this research is the first of its kind, and it moves the disclosure conversation forward in a direction of patient and family member inclusion that will assist in improving disclosure practices. Future research should implement and evaluate the success of the various inclusion strategies.
Abstract:
Coastal managers require reliable spatial data on the extent and timing of potential coastal inundation, particularly in a changing climate. Most sea level rise (SLR) vulnerability assessments are undertaken using the easily implemented bathtub approach, where areas adjacent to the sea and below a given elevation are mapped using a deterministic line dividing potentially inundated from dry areas. This method only requires elevation data, usually in the form of a digital elevation model (DEM). However, inherent errors in the DEM and in the spatial analysis of the bathtub model propagate into the inundation mapping. The aim of this study was to assess the impacts of spatially variable and spatially correlated elevation errors in high-spatial-resolution DEMs on coastal inundation mapping. Elevation errors were best modelled using regression-kriging. This geostatistical model takes the spatial correlation of elevation errors into account, which has a significant impact on analyses that include spatial interactions, such as inundation modelling. The spatial variability of elevation errors was partially explained by land cover and terrain variables. Elevation errors were simulated using sequential Gaussian simulation, a Monte Carlo probabilistic approach. Each of 1,000 simulated error surfaces was added to the original DEM, and the result was reclassified using a hydrologically correct bathtub method. The probability of inundation under a scenario combining a 1-in-100-year storm event with a 1 m SLR was calculated by counting the proportion of the 1,000 simulations in which a location was inundated. This probabilistic approach can be used in a risk-averse decision-making process by planning for scenarios with different probabilities of occurrence. For example, results showed that at a 1% exceedance probability, the inundated area was approximately 11% larger than that mapped using the deterministic bathtub approach. The probabilistic approach provides visually intuitive maps that convey the uncertainties inherent to spatial data and analysis.
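As a rough illustration of the probabilistic mapping step, the R sketch below adds simulated error surfaces to a toy DEM, applies a simple threshold (bathtub) rule, and counts the proportion of simulations in which each cell floods. Unlike the study, it uses spatially independent Gaussian noise rather than regression-kriging with sequential Gaussian simulation, so the spatial correlation of errors, which the paper shows to be important, is not reproduced, and the hydrological connectivity check is omitted.

```r
# Toy Monte Carlo bathtub inundation probability (illustrative only).
set.seed(42)
dem <- matrix(runif(100, 0, 3), 10, 10)   # toy 10 x 10 DEM, metres
water_level <- 1.5                        # assumed SLR + surge level, metres
n_sim <- 1000
sigma <- 0.15                             # assumed DEM error sd, metres

inundated <- matrix(0, 10, 10)
for (i in seq_len(n_sim)) {
  dem_i <- dem + matrix(rnorm(100, 0, sigma), 10, 10)  # one error realization
  inundated <- inundated + (dem_i <= water_level)      # simple bathtub rule
}
prob_inundation <- inundated / n_sim        # per-cell probability of flooding
mean(prob_inundation >= 0.01)               # area inundated at >= 1% probability
```

Comparing the area flooded at a chosen exceedance probability against the deterministic map is exactly the kind of contrast reported above (the 11% example).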
Abstract:
Secchi depth is a measure of water transparency. In the Baltic Sea region, Secchi depth maps are used to assess eutrophication and as input for habitat models. Due to their spatial and temporal coverage, satellite data would be the most suitable data source for such maps, but the Baltic Sea's optical properties are so different from those of the open ocean that globally calibrated standard models suffer from large errors. Regional predictive models that take the Baltic Sea's special optical properties into account are thus needed. This paper tests how accurately generalized linear models (GLMs) and generalized additive models (GAMs) with MODIS/Aqua and auxiliary data as inputs can predict Secchi depth at a regional scale. It uses cross-validation to test the prediction accuracy of hundreds of GAMs and GLMs with up to 5 input variables. A GAM with 3 input variables (chlorophyll a, remote sensing reflectance at 678 nm, and long-term mean salinity) made the most accurate predictions. Tested against field observations not used for model selection and calibration, the best model's mean absolute error (MAE) for daily predictions was 1.07 m (22%), more than 50% lower than that of other publicly available Baltic Sea Secchi depth maps. The MAE for predicting monthly averages was 0.86 m (15%). The proposed model selection process was thus able to find a regional model with good prediction accuracy. It could also be useful for finding predictive models of environmental variables other than Secchi depth, using data from other satellite sensors, and for other regions where non-standard remote sensing models are needed for prediction and mapping. Annual and monthly mean Secchi depth maps for 2003-2012 accompany this paper as Supplementary material.
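As an illustration of the selection procedure, the R sketch below cross-validates one candidate GAM of the kind the paper screens, using mgcv and prediction MAE as the criterion. The data frame is a stand-in for the MODIS/Aqua matchup data; the variable names and smooth terms are assumptions for illustration, not the published model.

```r
# Cross-validated MAE for one candidate GAM (toy stand-in data).
library(mgcv)

set.seed(1)
d <- data.frame(secchi = rlnorm(200, 1, 0.3),   # hypothetical field Secchi, m
                chl    = rlnorm(200, 0, 0.5),   # hypothetical chlorophyll a
                rrs678 = runif(200, 0, 0.01),   # hypothetical Rrs at 678 nm
                sal    = runif(200, 5, 15))     # hypothetical mean salinity

folds <- sample(rep(1:5, length.out = nrow(d)))  # 5-fold cross-validation
cv_mae <- sapply(1:5, function(f) {
  fit <- gam(secchi ~ s(chl) + s(rrs678) + s(sal), data = d[folds != f, ])
  mean(abs(d$secchi[folds == f] - predict(fit, d[folds == f, ])))
})
mean(cv_mae)   # selection criterion: lower cross-validated MAE wins
```

Repeating this over every candidate formula (hundreds of GLMs and GAMs with up to 5 inputs) and keeping the lowest cross-validated MAE is the essence of the selection process described above.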
Abstract:
The climate during the Cenozoic era changed in several steps from ice-free poles and warm conditions to ice-covered poles and cold conditions. Since the 1950s, a body of information on ice volume and temperature changes has been built up, predominantly from measurements of the oxygen isotopic composition of shells of benthic foraminifera collected from marine sediment cores. The statistical methodology of time series analysis has also evolved, allowing more information to be extracted from these records. Here we provide a comprehensive view of Cenozoic climate evolution by means of a coherent and systematic application of time series analytical tools to each record from a compilation spanning the interval from 4 to 61 Myr ago. We quantitatively describe several prominent features of the oxygen isotope record, taking into account the various sources of uncertainty (including measurement error, proxy noise, and dating errors). The estimated transition times and amplitudes allow us to assess causal climatological-tectonic influences on the following known features of the Cenozoic oxygen isotopic record: the Paleocene-Eocene Thermal Maximum, the Eocene-Oligocene Transition, the Oligocene-Miocene Boundary, and the Middle Miocene Climate Optimum. We further describe and causally interpret the following features: the Paleocene-Eocene warming trend, the two-step, long-term Eocene cooling, and the changes within the most recent interval (Miocene-Pliocene). We review the scope and methods of constructing Cenozoic stacks of benthic oxygen isotope records and present two new latitudinal stacks, which capture, in addition to global ice volume, bottom water temperatures at low (less than 30°) and high latitudes. The review concludes by identifying future directions for data collection, statistical method development, and climate modeling.
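One way to estimate a transition time and amplitude of the kind quantified here is least-squares ramp fitting, in which a record is modelled as two constant levels joined by a linear ramp. The R sketch below fits such a ramp to a synthetic noisy series by direct optimization; it is a generic illustration, not the paper's full uncertainty treatment (which also accounts for proxy noise and dating errors).

```r
# Ramp fit: two constant levels (a0, a1) joined linearly between t0 and t1.
set.seed(3)
t <- seq(0, 10, by = 0.1)
ramp <- function(t, t0, t1, a0, a1)
  ifelse(t < t0, a0,
         ifelse(t > t1, a1, a0 + (a1 - a0) * (t - t0) / (t1 - t0)))
y <- ramp(t, 4, 6, 0, 1) + rnorm(length(t), 0, 0.1)   # synthetic noisy record

sse <- function(p) sum((y - ramp(t, p[1], p[2], p[3], p[4]))^2)
fit <- optim(c(3, 7, -0.2, 1.2), sse)   # Nelder-Mead least squares
fit$par   # estimated transition start/end times and the two levels
```

The fitted (t1 - t0) gives the transition duration and (a1 - a0) its amplitude, the two quantities used above to compare features of the Cenozoic record.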
Abstract:
Core-top samples from different ocean basins have been analyzed to refine our current understanding of the sensitivity of benthic foraminiferal calcite magnesium/calcium (Mg/Ca) to bottom water temperatures (BWT). Benthic foraminifera collected from Hawaii, Little Bahama Bank, Sea of Okhotsk, Gulf of California, NE Atlantic, Ceara Rise, Sierra Leone Rise, the Ontong Java Plateau, and the Southern Ocean, covering a temperature range of 0.8 to 18°C, were used to revise the Cibicidoides Mg/Ca-temperature calibration. The Mg/Ca-BWT relationship of three common Cibicidoides species is described by the exponential equation Mg/Ca = (0.867 ± 0.049) exp[(0.109 ± 0.007) × BWT] (stated errors are 95% CI). The temperature sensitivity is very similar to a previously published calibration. However, the revised calibration has a significantly different preexponential constant, resulting in different predicted absolute temperatures. We attribute this difference in the preexponential constant to an analytical issue of accuracy. Some genera, notably Uvigerina, show apparently lower temperature sensitivity than others, suggesting that the use of constant offsets to account for vital effects in Mg/Ca may not be appropriate. Downcore Mg/Ca reproducibility, as determined on replicate foraminiferal samples, is typically better than 0.1 mmol/mol (2 S.E.). Thus, considering the errors associated with the Cibicidoides calibration and the downcore reproducibility, BWT may be estimated to within ±1°C. Application of the revised core-top Mg/Ca-BWT data to Cenozoic foraminiferal Mg/Ca suggests that seawater Mg/Ca was not more than 35% lower than today in the ice-free ocean at 50 Ma.
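For application to downcore records, the calibration can be inverted to estimate BWT from a measured Mg/Ca ratio: BWT = ln(Mg/Ca / 0.867) / 0.109. A minimal R sketch using the central calibration values only (the ±1°C uncertainty quoted above is not propagated here):

```r
# Invert the revised calibration Mg/Ca = 0.867 * exp(0.109 * BWT)
# to estimate bottom water temperature (error propagation omitted).
bwt_from_mgca <- function(mgca) log(mgca / 0.867) / 0.109

bwt_from_mgca(1.2)   # Mg/Ca of 1.2 mmol/mol -> ~3 degrees C
```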
Abstract:
We present measurements of the maximum diameter of the planktonic foraminifer Neogloboquadrina pachyderma sin. from six sediment cores (Ocean Drilling Program sites 643, 644, 907, 909, 985 and 987) from the Norwegian-Greenland Sea. Our data show a distinct net increase in the mean shell size of N. pachyderma sin. at all sites during the last 1.3 Myr, with the largest shell sizes reached after 0.4 Ma. External factors such as glacial-interglacial variability and carbonate dissolution alone cannot account for the observed variation in mean shell size. We consider the observed shell size increase to mirror an evolutionary trend towards better adaptation of N. pachyderma sin. to the cold-water environment after 1.1-1.0 Ma. The Mid-Pleistocene climate shift and the associated change in the amplitude and frequency of glacial-interglacial fluctuations probably triggered the evolution of this planktonic foraminifer. Oxygen and carbon stable isotope analyses of different shell size classes indicate that the observed size increase cannot be explained by the functional concept that larger shells promote higher sinking velocities during gametogenesis. The evolutionary adaptation of N. pachyderma sin. to the cold-water habitat has significant implications for paleoceanographic reconstructions. Carbonate sedimentation at the highest latitudes is highly dependent on the presence of this species; in the Norwegian-Greenland Sea, carbonate-poor intervals before 1.1 Ma are therefore not necessarily related to severe glacial conditions, but are probably attributable to the absence of this not yet polar-adapted species. Furthermore, transfer function and modern analog techniques used for reconstructing surface water conditions at high latitudes could contain a large range of errors if applied to samples older than 1.1-1.0 Ma.
Abstract:
DNA extraction was carried out as described on the MICROBIS project pages (http://icomm.mbl.edu/microbis) using a commercially available extraction kit. We amplified the hypervariable regions V4-V6 of archaeal and bacterial 16S rRNA genes using PCR and several sets of forward and reverse primers (http://vamps.mbl.edu/resources/primers.php). Massively parallel tag sequencing of the PCR products was carried out on a 454 Life Sciences GS FLX sequencer at the Marine Biological Laboratory, Woods Hole, MA, following the same experimental conditions for all samples.

Sequence reads were submitted to a rigorous quality control procedure based on mothur v30 (doi:10.1128/AEM.01541-09), including denoising of the flowgrams using an algorithm based on PyroNoise (doi:10.1038/nmeth.1361), removal of PCR errors, and a chimera check using uchime (doi:10.1093/bioinformatics/btr381). The reads were taxonomically assigned according to the SILVA taxonomy (SSURef v119, 07-2014; doi:10.1093/nar/gks1219) implemented in mothur and clustered at 98% ribosomal RNA gene V4-V6 sequence identity.

V4-V6 amplicon sequence abundance tables were standardized to account for unequal sampling effort by randomly choosing 1000 (Archaea) and 2300 (Bacteria) sequences without replacement in mothur, and then used to calculate inverse Simpson diversity indices and Chao1 richness (doi:10.2307/4615964). Bray-Curtis dissimilarities (doi:10.2307/1942268) between all samples were calculated and used for 2-dimensional non-metric multidimensional scaling (NMDS) ordinations with 20 random starts (doi:10.1007/BF02289694). Stress values below 0.2 indicated that the multidimensional dataset was well represented by the 2D ordination. NMDS ordinations were compared and tested using Procrustes correlation analysis (doi:10.1007/BF02291478). All analyses were carried out in the R statistical environment with the packages vegan (available at: http://cran.r-project.org/package=vegan) and labdsv (available at: http://cran.r-project.org/package=labdsv), as well as with custom R scripts.

Operational taxonomic units at 98% sequence identity (OTU0.03) that occurred only once in the whole dataset were termed absolute single sequence OTUs (SSOabs; doi:10.1038/ismej.2011.132). OTU0.03 sequences that occurred only once in at least one sample, but may occur more often in other samples, were termed relative single sequence OTUs (SSOrel). SSOrel are particularly interesting for community ecology, since they comprise rare organisms that might become abundant when conditions change. 16S rRNA amplicons and metagenomic reads have been deposited in the Sequence Read Archive under SRA project accession number SRP042162.
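As a minimal illustration of the ordination step, the R sketch below computes Bray-Curtis dissimilarities and a 2-D NMDS with vegan on a toy OTU table; in the study this step operated on the subsampled mothur abundance tables described above.

```r
# Bray-Curtis dissimilarities and 2-D NMDS with vegan (toy OTU table).
library(vegan)

set.seed(7)
otu <- matrix(rpois(60, 5), nrow = 6)    # 6 samples x 10 OTUs (toy counts)
bc  <- vegdist(otu, method = "bray")     # Bray-Curtis dissimilarities
ord <- metaMDS(bc, k = 2, trymax = 20)   # 2-D NMDS with 20 random starts
ord$stress                               # values below 0.2 indicate a fair fit
```

vegan's procrustes() and protest() functions provide the Procrustes comparison of ordinations mentioned above.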
Abstract:
This paper describes part of the history of Chinese rural migration to urban industrial areas. Using a case study of a township in Sichuan, the author examines a type of rural development that she defines as a "bottom-up" strategy of regional development. Different types of social mobility are observed in the case study, and over its long history, migration in the township has offered diverse means of social mobility to the local peasants. The paper concludes by considering the diversity and limits of Chinese social mobility at this stage.