970 results for Setup errors
Abstract:
Spreading the PSF over a fairly large number of pixels is an increasingly common observing technique for reaching extremely precise photometry, as in the search for and characterization of exoplanets via transit observations. A top-hat PSF profile helps minimize the error contribution due to uncertainty in the knowledge of the detector flat field. This work was carried out during the recent design study for the ESA small mission CHEOPS. Because perfect flat-fielding information is unavailable, the CHEOPS optics are required to spread the light of a source over a well-defined angular area, as uniformly as possible. This must be accomplished while still retaining the features of a true focal plane on the detector: angular displacement on the focal plane is fully preserved, and several stars in a field appear separated as long as their distance exceeds the spreading size. An obvious approach is to apply a defocus, but the presence of an intermediate pupil plane in the Back End Optics makes it attractive to introduce there an optical device that spreads the light in a well-defined manner while retaining the direction of the chief ray hitting it. This can be accomplished with a holographic diffuser or with a lenslet array. Both techniques implement the concept of segmenting the pupil into several sub-zones, each spreading light to a well-defined angle. We present experimental results on how to deliver such a PSF profile by means of a holographic diffuser and a lenslet array. Both devices are located in an intermediate pupil plane of a properly scaled laboratory setup mimicking the CHEOPS optical design configuration. © 2014 Society of Photo-Optical Instrumentation Engineers (SPIE).
Abstract:
BACKGROUND There is limited research on anaesthesiologists' attitudes and experiences regarding medical error communication, particularly concerning disclosing errors to patients. OBJECTIVE To characterise anaesthesiologists' attitudes and experiences regarding disclosing errors to patients and reporting errors within the hospital, and to examine factors influencing their willingness to disclose or report errors. DESIGN Cross-sectional survey. SETTING The departments of anaesthesia of Switzerland's five university hospitals in 2012/2013. PARTICIPANTS Two hundred and eighty-one clinically active anaesthesiologists. MAIN OUTCOME MEASURES Anaesthesiologists' attitudes and experiences regarding medical error communication. RESULTS The overall response rate of the survey was 52% (281/542). Respondents broadly endorsed disclosing harmful errors to patients (100% for serious errors, 77% for minor errors, 19% for near misses), but also reported factors that might make them less likely to actually disclose such errors. Only 12% of respondents had previously received training on how to disclose errors to patients, although 93% were interested in receiving such training. Overall, 97% of respondents agreed that serious errors should be reported, but willingness to report minor errors (74%) and near misses (59%) was lower. Respondents were more likely to strongly agree that serious errors should be reported if they also thought that their hospital would implement systematic changes after errors were reported (odds ratio, 2.097; 95% confidence interval, 1.16 to 3.81). Significant differences in attitudes between departments regarding error disclosure and reporting were noted. CONCLUSION Willingness to disclose or report errors varied widely between hospitals. Thus, heads of department and hospital chiefs need to be aware of the importance of local culture when it comes to error communication.
Error disclosure training and improving feedback on how error reports are being used to improve patient safety may also be important steps in increasing anaesthesiologists' communication of errors.
Abstract:
Charcoal particles in pollen slides are often abundant, and analysts are thus faced with the problem of setting the minimum counting sum as small as possible in order to save time. We analysed the reliability of charcoal-concentration estimates based on different counting sums, using simulated low- to high-count samples. Bootstrap simulations indicate that the variability of inferred charcoal concentrations increases progressively with decreasing sums. Below 200 items (i.e., the sum of charcoal particles and exotic marker grains), reconstructed fire incidence is either too high or too low. Statistical comparisons show that the means of bootstrap simulations stabilize after 200 counts. Moreover, a count of 200-300 items is sufficient to produce a charcoal-concentration estimate with less than ±5% error, compared with high-count samples of 1000 items, for charcoal/marker-grain ratios of 0.1-0.91. If, however, this ratio is extremely high or low (> 0.91 or < 0.1) and such samples are frequent, we suggest reducing or adding marker grains prior to new sample processing.
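The core idea of the bootstrap argument above can be sketched as follows. This is a minimal illustration, not the authors' simulation code: the function name, the assumed charcoal fraction, and the trial count are all hypothetical.

```python
import random

def simulate_ratio(total_items, charcoal_fraction, trials=2000, seed=1):
    """Bootstrap the charcoal/marker-grain ratio for a given counting sum.

    Each simulated sample draws `total_items` counted objects, each of
    which is a charcoal particle with probability `charcoal_fraction`
    and an exotic marker grain otherwise. Returns the mean and standard
    deviation of the ratio estimates across the trials."""
    rng = random.Random(seed)
    ratios = []
    for _ in range(trials):
        charcoal = sum(rng.random() < charcoal_fraction for _ in range(total_items))
        markers = total_items - charcoal
        if markers:  # skip degenerate draws with no marker grains
            ratios.append(charcoal / markers)
    mean = sum(ratios) / len(ratios)
    var = sum((r - mean) ** 2 for r in ratios) / (len(ratios) - 1)
    return mean, var ** 0.5

# Variability of the inferred ratio grows as the counting sum shrinks,
# which is the effect driving the 200-item recommendation.
_, sd_200 = simulate_ratio(200, 0.5)
_, sd_50 = simulate_ratio(50, 0.5)
```

With a fixed charcoal fraction, the spread of the ratio estimate for a 50-item sum is roughly twice that for a 200-item sum, matching the abstract's point that small sums make reconstructed fire incidence unreliable.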
Abstract:
Osteoporotic proximal femur fractures are caused by low-energy trauma, typically a fall on the hip from standing height. Finite element simulations, widely used to predict the fracture load of femora in a fall, usually include neither mass-related inertial effects nor the viscous part of bone's material behavior. The aim of this study was to elucidate whether quasi-static non-linear homogenized finite element analyses can predict the in vitro mechanical properties of proximal femora assessed in dynamic drop tower experiments. The case-specific numerical models of thirteen femora predicted strength (R² = 0.84, SEE = 540 N, 16.2%), stiffness (R² = 0.82, SEE = 233 N/mm, 18.0%) and fracture energy (R² = 0.72, SEE = 3.85 J, 39.6%), and provided fair qualitative matches with the fracture patterns. The influence of material anisotropy was negligible for all predictions. These results suggest that quasi-static homogenized finite element analysis may be used to predict the mechanical properties of proximal femora in the dynamic sideways-fall situation.
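The R² and SEE statistics reported above summarize how well predicted values track measured ones. A sketch of their standard definitions, with purely hypothetical numbers (the study's actual data are not reproduced here):

```python
def r_squared_and_see(measured, predicted):
    """Coefficient of determination and standard error of the estimate
    for paired measured/predicted values. SEE is computed here as the
    residual standard error with n - 2 degrees of freedom; the paper's
    exact regression setup may differ."""
    n = len(measured)
    mean_m = sum(measured) / n
    ss_tot = sum((m - mean_m) ** 2 for m in measured)
    ss_res = sum((m - p) ** 2 for m, p in zip(measured, predicted))
    r2 = 1.0 - ss_res / ss_tot
    see = (ss_res / (n - 2)) ** 0.5
    return r2, see

# Hypothetical femoral strength values in newtons, for illustration only.
measured = [3200.0, 2800.0, 4100.0, 3600.0, 2500.0]
predicted = [3050.0, 2900.0, 3900.0, 3700.0, 2600.0]
r2, see = r_squared_and_see(measured, predicted)
```

Quoting SEE both in absolute units (N, N/mm, J) and as a percentage of the mean, as the abstract does, makes the three quantities comparable despite their different scales.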
Abstract:
We report on the development of a neutron tomography setup at the instrument for prompt gamma-ray activation analysis (PGAA) at the Maier-Leibnitz Zentrum (MLZ). The recent developments are driven by the idea of combining the spatial information obtained with neutron tomography with the elemental information determined with PGAA, i.e. of merging both techniques into an investigative technique called prompt gamma activation imaging (PGAI). At the PGAA instrument, a cold neutron flux of up to 6 × 10¹⁰ cm⁻² s⁻¹ (thermal equivalent) is available in the focus of an elliptically tapered neutron guide. In the reported experiments, the divergence of the neutron beam was investigated, the resolution of the installed detector system was tested, and a proof-of-principle tomography experiment was performed. In our study a formerly used camera box was upgraded with a better camera, and an optical resolution of 8 line pairs/mm was achieved. The divergence of the neutron beam was measured by a systematic scan along the beam axis. Based on the acquired data, a neutron imaging setup with an L/D ratio of 200 was installed. The resolution of the setup was tested with a gadolinium test target and different scintillator screens. The test target was irradiated at two positions to determine the maximum resolution and the resolution at the actual sample position. The performance of the installed tomography setup was demonstrated by a tomography experiment on an electric amplifier tube.
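The L/D collimation ratio quoted above sets the geometric blur of a neutron imaging setup via the standard relation d = z / (L/D), where z is the sample-to-detector distance. A minimal sketch; the 20 mm stand-off is an assumed value, not one from the abstract:

```python
def geometric_blur_mm(l_over_d, sample_to_detector_mm):
    """Geometric unsharpness d = z / (L/D): blur at the detector grows
    with the sample-to-detector distance z and shrinks as the
    collimation ratio L/D increases."""
    return sample_to_detector_mm / l_over_d

# With the reported L/D = 200 and an assumed 20 mm sample-to-detector
# distance, the geometric blur is 0.1 mm.
blur = geometric_blur_mm(200, 20.0)
```

For comparison, the reported optical resolution of 8 line pairs/mm corresponds to resolving features of about 1/(2 × 8) ≈ 0.06 mm, so at larger stand-off distances the geometric blur, not the camera, limits the achievable resolution.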
Abstract:
Studies have shown that the discriminability of successive time intervals depends on the presentation order of the standard (St) and the comparison (Co) stimuli. Also, this order affects the point of subjective equality. The first effect is here called the standard-position effect (SPE); the latter is known as the time-order error. In the present study, we investigated how these two effects vary across interval types and standard durations, using Hellström’s sensation-weighting model to describe the results and relate them to stimulus comparison mechanisms. In Experiment 1, four modes of interval presentation were used, factorially combining interval type (filled, empty) and sensory modality (auditory, visual). For each mode, two presentation orders (St–Co, Co–St) and two standard durations (100 ms, 1,000 ms) were used; half of the participants received correctness feedback, and half of them did not. The interstimulus interval was 900 ms. The SPEs were negative (i.e., a smaller difference limen for St–Co than for Co–St), except for the filled-auditory and empty-visual 100-ms standards, for which a positive effect was obtained. In Experiment 2, duration discrimination was investigated for filled auditory intervals with four standards between 100 and 1,000 ms, an interstimulus interval of 900 ms, and no feedback. Standard duration interacted with presentation order, here yielding SPEs that were negative for standards of 100 and 1,000 ms, but positive for 215 and 464 ms. Our findings indicate that the SPE can be positive as well as negative, depending on the interval type and standard duration, reflecting the relative weighting of the stimulus information, as is described by the sensation-weighting model.
Abstract:
The surface of Mars hosts many regions displaying polygonal crack patterns that have been identified as potential desiccation cracks. These regions lie mostly within Noachian-aged terrains and are closely associated with phyllosilicate occurrences, smectites in particular. We have built a laboratory setup that allows us to carry out desiccation experiments on Mars-analog materials in an effort to constrain the physical and chemical properties of sediments that display polygonal cracks. The setup is complemented by a pre-existing simulation chamber that enables the investigation of the spectral and photometric properties of analog materials under Mars-like conditions. The initial experiments show that (1) crack patterns are visible in smectite-bearing materials at concentrations down to ~10% smectite by weight, (2) chlorides, and potentially other salts, delay the onset of cracking and may even prevent it entirely, and (3) the polygonal patterns, while indicative of the presence of phyllosilicates, cannot be used to differentiate between various phyllosilicate-bearing deposits. However, their size scale and morphology yield important information regarding the thickness of the deposits and the hydrological conditions at the time of formation. Furthermore, the complementary spectral measurements for some of the analog samples show that crack patterns may develop in materials with smectite concentrations so low that they would not be expected to be identified by remote-sensing instruments. This may explain the presence of polygonal patterns on Mars in sediments that lack spectral confirmation of phyllosilicates. © 2015 Elsevier Ltd. All rights reserved.
Abstract:
Sentinel-5 (S5) and its precursor (S5P) are future European satellite missions aiming at global monitoring of methane (CH4) column-average dry air mole fractions (XCH4). The spectrometers to be deployed onboard the satellites record spectra of sunlight backscattered from the Earth's surface and atmosphere. In particular, they exploit CH4 absorption in the shortwave infrared spectral range around 1.65 μm (S5 only) and 2.35 μm (both S5 and S5P) wavelength. Given an accuracy goal of better than 2% for XCH4 delivered on regional scales, assessment and reduction of potential sources of systematic error, such as spectroscopic uncertainties, is crucial. Here, we investigate how spectroscopic errors propagate into retrieval errors on the global scale. To this end, absorption spectra from a ground-based Fourier transform spectrometer (FTS) operating at very high spectral resolution serve as an estimate of the quality of the spectroscopic parameters. Feeding the FTS fitting residuals as a perturbation into a global ensemble of simulated S5- and S5P-like spectra at relatively low spectral resolution, XCH4 retrieval errors exceed 0.6% in large parts of the world and show systematic correlations on regional scales, calling for improved spectroscopic parameters.
Abstract:
Height of instrument (HI) blunders in GPS measurements cause position errors. These errors can be purely vertical, purely horizontal, or a mixture of both. There are different error regimes depending on whether both the base and the rover have HI blunders, only the base has an HI blunder, or only the rover has an HI blunder. The resulting errors are on the order of 30 cm for receiver separations of 1000 km and an HI blunder of 2 m. Given the complicated nature of the errors, we believe it would be difficult, if not impossible, to detect such errors by visual inspection. This underlines the necessity of entering GPS HIs correctly.
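The quoted magnitude is consistent with a simple geometric picture: over a long baseline, the local "up" directions at base and rover diverge by the angle baseline/R, so a vertical HI blunder at one station leaks into the other station's horizontal coordinates. This back-of-the-envelope sketch is our reading of the scaling, not the paper's error model:

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius, metres

def hi_blunder_horizontal_error_m(hi_blunder_m, baseline_m):
    """Rough cross-check of the quoted error magnitude: a vertical HI
    blunder projects into the horizontal by roughly
    hi_blunder * sin(baseline / R) because the two stations' vertical
    axes are tilted with respect to each other by baseline / R."""
    return hi_blunder_m * math.sin(baseline_m / EARTH_RADIUS_M)

# A 2 m HI blunder over a 1000 km baseline gives ~0.31 m, matching the
# order of magnitude quoted in the abstract.
err = hi_blunder_horizontal_error_m(2.0, 1_000_000.0)
```

The same scaling explains why such blunders are benign over short baselines: at 10 km the leaked horizontal error is only about 3 mm.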
Abstract:
Because the goal of radiation therapy is to deliver a lethal dose to the tumor, accurate information on the location of the tumor is needed. Margins are placed around the tumor to account for variations in its daily position. If tumor motion and patient setup uncertainties can be reduced, the margins that account for such uncertainties in tumor location can be reduced, allowing dose escalation, which in turn could potentially improve survival rates. In the first part of this study, we monitor the location of fiducials implanted in the periphery of lung tumors to determine the extent of non-gated and gated fiducial motion, and to quantify patient setup uncertainties. In the second part we determine where the tumor is when different methods of image-guided patient setup and respiratory gating are employed. In the final part we develop, validate, and implement a technique in which patient setup uncertainties are reduced by aligning patients based upon fiducial locations in projection images. Results from the first part indicate that respiratory gating reduces fiducial motion relative to motion during normal respiration, and that setup uncertainties are large when patients are aligned each day using externally placed skin marks. Results from the second part indicate that current margins accounting for setup uncertainty and tumor motion leave less than 2% of the tumor outside of the planning target volume (PTV) when the patient is aligned using skin marks. In addition, we found that if respiratory gating is to be used, it is most effective in conjunction with image-guided patient setup. In the third part, we successfully developed, validated, and implemented on a patient a technique for aligning a moving target prior to treatment to reduce the uncertainties in tumor location. In conclusion, setup uncertainties and tumor motion are a significant problem when treating tumors located within the thoracic region. Image-guided patient setup in conjunction with treatment delivery using respiratory gating reduces these uncertainties in tumor location. In doing so, the margins used to generate the PTV can be reduced, which may allow for dose escalation to the tumor.
Abstract:
Medication errors, one of the most frequent types of medical errors, are a common cause of patient harm in hospital systems today. Nurses at the bedside are in a position to encounter many of these errors, since they are present at the start of the process (ordering/prescribing) and at its end (administration). One of the recommendations of the IOM (Institute of Medicine) report "To Err is Human" was for organizations to identify and learn from medical errors through event reporting systems. While many organizations have reporting systems in place, research studies report a significant amount of underreporting by nurses. A systematic review of the literature was performed to identify contributing factors related to the reporting and non-reporting of medication errors by nurses at the bedside. Articles included in the literature review were primary or secondary studies, dated January 1, 2000 – July 2009, related to nursing medication error reporting. All 634 articles were reviewed with an algorithm developed to standardize the review process and help filter out those that did not meet the study criteria. In addition, 142 article bibliographies were reviewed to find additional studies that were not found in the original literature search. After reviewing the 634 articles and the additional 108 articles discovered in the bibliography review, 41 articles met the study criteria and were used in the systematic literature review results. Fear of punitive reactions to medication errors was a frequent barrier to error reporting. Nurses fear reactions from their leadership, peers, patients and their families, nursing boards, and the media. Anonymous reporting systems and departments/organizations with a strong safety culture in place helped to encourage the reporting of medication errors by nursing staff. Many of the studies included in this literature review do not allow results that can be generalized: the majority took place in single institutions/organizations with limited sample sizes. Stronger studies with larger sample sizes, utilizing validated data collection methods, need to be performed to determine stronger correlations between safety culture and nurse error reporting.
Abstract:
A large number of ridge regression estimators have been proposed and used with little knowledge of their true distributions. Because of this lack of knowledge, these estimators cannot be used to test hypotheses or to form confidence intervals. This paper presents a basic technique for deriving the exact distribution functions for a class of generalized ridge estimators. The technique is applied to five prominent generalized ridge estimators, and graphs of the resulting distribution functions are presented. The actual behavior of these estimators is found to be considerably different from the behavior generally assumed for ridge estimators. The paper also uses the derived distributions to examine the mean squared error properties of the estimators, and presents a technique for developing confidence intervals based on the generalized ridge estimators.
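For context, the generalized ridge estimator discussed above has the closed form b = (X'X + K)⁻¹X'y with a diagonal shrinkage matrix K; setting K = 0 recovers ordinary least squares. A minimal two-regressor sketch with made-up data (the paper's five specific estimators differ in how K is chosen, which is not reproduced here):

```python
def ridge_estimate_2d(X, y, k1, k2):
    """Generalized ridge estimator b = (X'X + K)^{-1} X'y for two
    regressors, with K = diag(k1, k2). k1 = k2 = 0 gives ordinary
    least squares; positive shrinkage constants trade bias for
    reduced variance."""
    # Accumulate X'X and X'y for the 2-column design matrix.
    s11 = sum(r[0] * r[0] for r in X)
    s12 = sum(r[0] * r[1] for r in X)
    s22 = sum(r[1] * r[1] for r in X)
    t1 = sum(r[0] * yi for r, yi in zip(X, y))
    t2 = sum(r[1] * yi for r, yi in zip(X, y))
    # Invert the 2x2 matrix (X'X + K) explicitly.
    a, b, c, d = s11 + k1, s12, s12, s22 + k2
    det = a * d - b * c
    return ((d * t1 - b * t2) / det, (a * t2 - c * t1) / det)

# y is generated exactly as 2*x1 + 3*x2, so OLS (k = 0) recovers the
# coefficients; positive k shrinks the coefficient vector's norm.
X = [(1.0, 0.0), (0.0, 1.0), (1.0, 1.0), (2.0, 1.0)]
y = [2.0, 3.0, 5.0, 7.0]
beta_ols = ridge_estimate_2d(X, y, 0.0, 0.0)
beta_ridge = ridge_estimate_2d(X, y, 1.0, 1.0)
```

The paper's point is that because K typically depends on the data, the resulting estimator's distribution is not the simple shifted-normal form often assumed, which is what makes the derived exact distributions useful.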
Abstract:
Errors in the administration of medication represent a significant loss of medical resources and pose life-altering or life-threatening risks to patients. This paper considered the question: what impact do Computerized Physician Order Entry (CPOE) systems have on medication errors in the hospital inpatient environment? Previous reviews have examined evidence of the impact of CPOE on medication errors, but have come to ambiguous conclusions as to the impact of CPOE and decision support systems (DSS). Forty-three papers were identified. Thirty-one demonstrated a significant reduction in prescribing error rates for all or some drug types; decreases in minor errors were most often reported. Several studies reported increases in the rate of duplicate orders and failures to remove contraindicated drugs, often attributed to inappropriate design or to an inability to operate the system properly. The evidence on the effectiveness of CPOE in reducing medication administration errors is compelling, though it is limited by modest study sample sizes and designs.
Abstract:
Next-generation sequencing (NGS) technology has become a prominent tool in biological and biomedical research. However, NGS data analysis, such as de novo assembly, mapping, and variant detection, is far from mature, and the high sequencing error rate is one of the major problems. To minimize the impact of sequencing errors, we developed a highly robust and efficient method, MTM, to correct the errors in NGS reads. We demonstrated the effectiveness of MTM on both single-cell data with highly non-uniform coverage and normal data with uniformly high coverage, showing that MTM's performance does not rely on the coverage of the sequencing reads. MTM was also compared with Hammer and Quake, the best methods for correcting non-uniform and uniform data, respectively. For non-uniform data, MTM outperformed both Hammer and Quake. For uniform data, MTM showed better performance than Quake and results comparable to Hammer. By making better error corrections with MTM, the quality of downstream analysis, such as mapping and SNP detection, was improved. SNP calling is a major application of NGS technologies. However, the existence of sequencing errors complicates this process, especially for the low coverage (