Abstract:
Objective: To determine methadone plasma trough and peak concentrations in patients presenting with opiate withdrawal symptoms after introduction of nevirapine or efavirenz. To describe the disappearance of these symptoms after methadone titration based on plasma concentrations rather than on the symptoms. Methods: Nine patients undergoing highly active antiretroviral therapy (HAART) and either nevirapine or efavirenz treatment were monitored daily for opiate withdrawal in a specialized drug addiction center. Methadone dose was titrated daily, and plasma concentrations were measured. The data are retrospective (case series). Results: Several patients complained of symptoms such as nausea, vomiting, accelerated intestinal transit, or insomnia. Even after methadone titration based on clinical symptoms, patients and health-care providers trained in infectious disease did not recognize these as withdrawal symptoms and attributed them to side effects of HAART or to anxiety. Methadone plasma trough concentration showed low levels of (R)- and (R,S)-methadone. Further methadone dose adjustment according to plasma level resulted in the disappearance of these withdrawal symptoms. The daily methadone dose was split when the peak/trough (R)-methadone ratio was more than 2. Conclusions: When introducing efavirenz or nevirapine to patients undergoing methadone treatment, withdrawal symptoms should be monitored, especially those such as insomnia, vomiting, or nausea. Methadone plasma trough and peak measurements can be of value in preventing unnecessary side effects of HAART.
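The abstract's dose-splitting criterion (split the daily dose when the peak/trough (R)-methadone ratio exceeds 2) can be sketched as a simple check. This is a minimal illustration only; the function name, argument units, and error handling are assumptions, not the authors' implementation:

```python
def should_split_dose(peak_ng_ml, trough_ng_ml, threshold=2.0):
    """Flag a once-daily methadone dose for splitting when the
    (R)-methadone peak/trough concentration ratio exceeds the threshold."""
    if trough_ng_ml <= 0:
        raise ValueError("trough concentration must be positive")
    return (peak_ng_ml / trough_ng_ml) > threshold
```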
Abstract:
We propose a method to evaluate cyclical models which does not require knowledge of the DGP and the exact empirical specification of the aggregate decision rules. We derive robust restrictions in a class of models; use some to identify structural shocks and others to evaluate the model or contrast sub-models. The approach has good size and excellent power properties, even in small samples. We show how to examine the validity of a class of models, sort out the relevance of certain frictions, evaluate the importance of an added feature, and indirectly estimate structural parameters.
Abstract:
The growing demand and the degree of patient care in oncological outpatient services, as well as the complexity of treatment have had an impact on the workload of nurses. This study aimed at measuring the workload and productivity of nurses in an oncological outpatient service. An observational study using a work sampling technique was conducted and included seven nurses working in an oncological outpatient service in the south-eastern region of Brazil. A total of 1,487 intervention or activity samples were obtained. Nurses used 43.2% of their time on indirect care, 33.2% on direct care, 11.6% on associated activities, and 12% on personal activities. Their mean productivity was 88.0%. The findings showed that nurses in this service spend most of their time in indirect care activities. Moreover, the productivity index in this study was above that recommended in the literature.
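The work sampling technique above estimates how time is distributed by counting which activity category is observed at each sampled instant; the 88% productivity index is consistent with defining productivity as the share of time not spent on personal activities. A minimal sketch under those assumptions (function names and category labels are illustrative):

```python
from collections import Counter

def activity_proportions(samples):
    """Work sampling: each sample is the activity category observed at
    a random instant; category proportions estimate the time distribution."""
    counts = Counter(samples)
    total = sum(counts.values())
    return {category: n / total for category, n in counts.items()}

def productivity(proportions, personal_key="personal"):
    """Productivity index: share of time not spent on personal activities."""
    return 1.0 - proportions.get(personal_key, 0.0)
```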
Abstract:
BACKGROUND: Measurement of plasma renin is important for the clinical assessment of hypertensive patients. The most common methods for measuring plasma renin are the plasma renin activity (PRA) assay and the renin immunoassay. The clinical application of renin inhibitor therapy has thrown into focus the differences in information provided by activity assays and immunoassays for renin and prorenin measurement and has drawn attention to the need for precautions to ensure their accurate measurement. CONTENT: Renin activity assays and immunoassays provide related but different information. Whereas activity assays measure only active renin, immunoassays measure both active and inhibited renin. Particular care must be taken in the collection and processing of blood samples and in the performance of these assays to avoid errors in renin measurement. Both activity assays and immunoassays are susceptible to renin overestimation due to prorenin activation. In addition, activity assays performed with peptidase inhibitors may overestimate the degree of inhibition of PRA by renin inhibitor therapy. Moreover, immunoassays may overestimate the reactive increase in plasma renin concentration in response to renin inhibitor therapy, owing to the inhibitor promoting conversion of prorenin to an open conformation that is recognized by renin immunoassays. CONCLUSIONS: The successful application of renin assays to patient care requires that the clinician and the clinical chemist understand the information provided by these assays and the precautions necessary to ensure their accuracy.
Abstract:
The vast territories that have been radioactively contaminated during the 1986 Chernobyl accident provide a substantial data set of radioactive monitoring data, which can be used for the verification and testing of the different spatial estimation (prediction) methods involved in risk assessment studies. Using the Chernobyl data set for such a purpose is motivated by its heterogeneous spatial structure (the data are characterized by large-scale correlations, short-scale variability, spotty features, etc.). The present work is concerned with the application of the Bayesian Maximum Entropy (BME) method to estimate the extent and the magnitude of the radioactive soil contamination by 137Cs due to the Chernobyl fallout. The powerful BME method allows rigorous incorporation of a wide variety of knowledge bases into the spatial estimation procedure leading to informative contamination maps. Exact measurements ("hard" data) are combined with secondary information on local uncertainties (treated as "soft" data) to generate science-based uncertainty assessment of soil contamination estimates at unsampled locations. BME describes uncertainty in terms of the posterior probability distributions generated across space, whereas no assumption about the underlying distribution is made and non-linear estimators are automatically incorporated. Traditional estimation variances based on the assumption of an underlying Gaussian distribution (analogous, e.g., to the kriging variance) can be derived as a special case of the BME uncertainty analysis. The BME estimates obtained using hard and soft data are compared with the BME estimates obtained using only hard data. The comparison involves both the accuracy of the estimation maps using the exact data and the assessment of the associated uncertainty using repeated measurements. Furthermore, a comparison of the spatial estimation accuracy obtained by the two methods was carried out using a validation data set of hard data.
Finally, a separate uncertainty analysis was conducted that evaluated the ability of the posterior probabilities to reproduce the distribution of the raw repeated measurements available in certain populated sites. The analysis provides an illustration of the improvement in mapping accuracy obtained by adding soft data to the existing hard data and, in general, demonstrates that the BME method performs well both in terms of estimation accuracy and estimation error assessment, which are both useful features for the Chernobyl fallout study.
Abstract:
A crucial method for investigating patients with coronary artery disease (CAD) is the calculation of the left ventricular ejection fraction (LVEF). It is, consequently, imperative to precisely estimate the value of LVEF, a process that can be done with myocardial perfusion scintigraphy. Therefore, the present study aimed to establish and compare the estimation performance of the quantitative parameters of the reconstruction methods filtered backprojection (FBP) and ordered-subset expectation maximization (OSEM). METHODS: A beating-heart phantom with known values of end-diastolic volume, end-systolic volume, and LVEF was used. Quantitative gated SPECT/quantitative perfusion SPECT software was used to obtain these quantitative parameters in a semiautomatic mode. The Butterworth filter was used in FBP, with the cutoff frequencies between 0.2 and 0.8 cycles per pixel combined with the orders of 5, 10, 15, and 20. Sixty-three reconstructions were performed using 2, 4, 6, 8, 10, 12, and 16 OSEM subsets, combined with several iterations: 2, 4, 6, 8, 10, 12, 16, 32, and 64. RESULTS: With FBP, the values of the end-diastolic, end-systolic, and stroke volumes rise as the cutoff frequency increases, whereas the value of LVEF diminishes. The same pattern is verified with OSEM reconstruction. However, with OSEM there is a more precise estimation of the quantitative parameters, especially with the combinations 2 iterations × 10 subsets and 2 iterations × 12 subsets. CONCLUSION: The OSEM reconstruction presents better estimations of the quantitative parameters than does FBP. This study recommends the use of 2 iterations with 10 or 12 subsets for OSEM and a cutoff frequency of 0.5 cycles per pixel with the orders 5, 10, or 15 for FBP as the best estimations for the left ventricular volumes and ejection fraction quantification in myocardial perfusion scintigraphy.
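The iterations × subsets trade-off discussed above comes from the multiplicative OSEM update, which applies the EM correction using only a subset of projections at each sub-step. A toy one-dimensional numpy sketch of that update follows; the system matrix, subset scheme, and dimensions are illustrative assumptions, not the gated-SPECT implementation used in the study:

```python
import numpy as np

def osem(y, A, n_iters=2, n_subsets=2, eps=1e-12):
    """Ordered-subset expectation maximization for y ≈ A @ x.

    Toy sketch: y is the measured projection vector, A the system
    matrix, and each subset contributes one multiplicative EM update.
    """
    n_bins, n_pix = A.shape
    x = np.ones(n_pix)  # flat non-negative initial image
    subsets = [np.arange(s, n_bins, n_subsets) for s in range(n_subsets)]
    for _ in range(n_iters):
        for idx in subsets:
            As = A[idx]
            sens = As.sum(axis=0) + eps       # subset sensitivity A_s^T 1
            ratio = y[idx] / (As @ x + eps)   # measured / estimated projections
            x = x * (As.T @ ratio) / sens     # multiplicative EM update
    return x
```

With noiseless, consistent data the true image is a fixed point of every subset update, which is why more subsets trade per-iteration cost for faster early convergence.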
Abstract:
The universal standard goniometer is an essential tool for measuring joint range of motion (ROM). In this era of technological advances and increasing smartphone use, new measurement tools are appearing in the form of dedicated smartphone applications. This article compares the iOS application "Knee Goniometer" with the universal standard goniometer for assessing knee ROM. To our knowledge, this is the first study to use a goniometer application in a clinical context. The purpose of this study is to determine whether this application could be used in clinical practice.
Abstract:
This paper explores biases in the elicitation of utilities under risk and the contribution that generalizations of expected utility can make to the resolution of these biases. We used five methods to measure utilities under risk and found clear violations of expected utility. Of the theories studied, prospect theory was most consistent with our data. The main improvement of prospect theory over expected utility was in comparisons between a riskless and a risky prospect (riskless-risk methods). We observed no improvement over expected utility in comparisons between two risky prospects (risk-risk methods). One explanation for why we found no improvement of prospect theory over expected utility in risk-risk methods may be that there was less overweighting of small probabilities in our study than has commonly been observed.
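The "overweighting of small probabilities" mentioned above is typically modeled in prospect theory with an inverse-S-shaped probability weighting function. A minimal sketch using the Tversky-Kahneman (1992) form; the parameter value γ = 0.61 is their published estimate and is an assumption here, not a value from this study:

```python
def tk_weight(p, gamma=0.61):
    """Tversky-Kahneman probability weighting function.

    Inverse-S shape: small probabilities are overweighted (w(p) > p),
    moderate-to-large probabilities are underweighted (w(p) < p).
    """
    return p**gamma / (p**gamma + (1 - p) ** gamma) ** (1 / gamma)
```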
Abstract:
We compare behavior in modified dictator games with and without role uncertainty. Subjects choose between a selfish action, a costly surplus-creating action (altruistic behavior), and a costly surplus-destroying action (spiteful behavior). While costly surplus-creating actions are the most frequent under role uncertainty (64%), selfish actions become the most frequent without role uncertainty (69%). Also, the frequency of surplus-destroying choices is negligible with role uncertainty (1%) but not so without it (11%). A classification of subjects into four different types of interdependent preferences (Selfish, Social Welfare maximizing, Inequity Averse, and Competitive) shows that the use of role uncertainty overestimates the prevalence of Social Welfare maximizing preferences in the subject population (from 74% with role uncertainty to 21% without it) and underestimates Selfish and Inequity Averse preferences. An additional treatment, in which subjects undertake an understanding test before participating in the experiment with role uncertainty, shows that the vast majority of subjects (93%) correctly understand the payoff mechanism with role uncertainty, and yet surplus-creating actions were most frequent. Our results warn against the use of role uncertainty in experiments that aim to measure the prevalence of interdependent preferences.
Abstract:
Unemployment rates in developed countries have recently reached levels not seen in a generation, and workers of all ages are facing increasing probabilities of losing their jobs and considerable losses in accumulated assets. These events likely increase the reliance that most older workers will have on public social insurance programs, exactly at a time when public finances are suffering from a large drop in contributions. Our paper explicitly accounts for employment uncertainty and unexpected wealth shocks, something that has been relatively overlooked in the literature but has grown in importance in recent years. Using administrative and household-level data, we empirically characterize a life-cycle model of retirement and claiming decisions in terms of the employment, wage, health, and mortality uncertainty faced by individuals. Our benchmark model explains with great accuracy the strikingly high proportion of individuals who claim benefits exactly at the Early Retirement Age, while still explaining the increased claiming hazard at the Normal Retirement Age. We also discuss some policy experiments and their interplay with employment uncertainty. Additionally, we analyze the effects of negative wealth shocks on the labor supply and claiming decisions of older Americans. Our results can explain why early claiming has remained very high in recent years even as early retirement penalties have increased substantially compared with previous periods, and why labor force participation has remained quite high for older workers even in the midst of the worst employment crisis in decades.
Abstract:
A new method of measuring joint angle using a combination of accelerometers and gyroscopes is presented. The method proposes a minimal sensor configuration with one sensor module mounted on each segment. The model is based on estimating the acceleration of the joint center of rotation by placing a pair of virtual sensors on the adjacent segments at the center of rotation. In the proposed technique, joint angles are found without the need for integration, so absolute angles can be obtained which are free from any source of drift. The model considers anatomical aspects and is personalized for each subject prior to each measurement. The method was validated by measuring knee flexion-extension angles of eight subjects, walking at three different speeds, and comparing the results with a reference motion measurement system. The results are very close to those of the reference system, presenting very small errors (rms = 1.3 deg, mean = 0.2 deg, SD = 1.1 deg) and excellent correlation coefficients (0.997). The algorithm provides joint angles in real time and is ready for use in gait analysis. Technically, the system is portable, easily mountable, and can be used for long term monitoring without hindrance to natural activities.
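The key idea above, obtaining an absolute, drift-free angle by comparing vector measurements from the two segments rather than integrating angular velocity, can be illustrated with a simple vector-angle computation. This is only a sketch of that core idea; the paper's full method additionally uses gyroscopes, virtual sensors at the joint centre, and anatomical personalization, and the function and its inputs here are illustrative assumptions:

```python
import numpy as np

def joint_angle_deg(v_proximal, v_distal):
    """Angle between two segment-fixed vectors (e.g., accelerations
    estimated at the joint centre). No integration is involved, so the
    result carries no drift."""
    a = np.asarray(v_proximal, dtype=float)
    b = np.asarray(v_distal, dtype=float)
    cos_angle = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))
```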