193 results for Measurement uncertainty

at Université de Lausanne, Switzerland


Relevance:

60.00%

Publisher:

Abstract:

The 4πβ-γ coincidence counting method and its close relatives are widely used for the primary standardization of radioactivity. Both the general formalism and specific implementation of these methods have been well-documented. In particular, previous papers contain the extrapolation equations used for various decay schemes, methods for determining model parameters and, in some cases, tabulated uncertainty budgets. Two things often lacking from experimental reports are the rationale for estimating uncertainties in a specific way and the details of exactly how a specific component of uncertainty was estimated. Furthermore, correlations among the components of uncertainty are rarely mentioned. To fill in these gaps, the present article shares best practices from a few practitioners of this craft. We explain, and demonstrate with examples, how these approaches can be used to estimate the uncertainty of the reported massic activity. We describe uncertainties due to measurement variability, extrapolation functions, dead-time and resolving-time effects, gravimetric links, and nuclear and atomic data. Most importantly, a thorough understanding of the measurement system and its response to the decay under study can be used to derive a robust estimate of the measurement uncertainty.
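
The extrapolation step at the heart of these methods can be illustrated with synthetic data: the coincidence-corrected rate N_beta·N_gamma/N_c is fitted against the inefficiency parameter (1 − eff)/eff and extrapolated to zero inefficiency, where the intercept estimates the activity. A minimal sketch, assuming noiseless synthetic rates; the function name is illustrative, and real budgets involve dead-time, resolving-time and background corrections omitted here:

```python
import numpy as np

def extrapolate_activity(n_beta, n_gamma, n_coinc):
    """Linear efficiency extrapolation for 4πβ-γ coincidence counting.

    The coincidence-corrected rate N_beta * N_gamma / N_c is fitted
    against the inefficiency parameter (1 - eff) / eff, with
    eff = N_c / N_gamma.  The intercept at zero inefficiency estimates
    the activity; the fit covariance gives its statistical uncertainty.
    """
    eff = n_coinc / n_gamma                      # beta-channel efficiency
    x = (1.0 - eff) / eff                        # inefficiency parameter
    y = n_beta * n_gamma / n_coinc               # coincidence-corrected rate
    coeffs, cov = np.polyfit(x, y, 1, cov=True)  # linear fit with covariance
    slope, intercept = coeffs
    u_intercept = np.sqrt(cov[1, 1])             # standard uncertainty of N0
    return intercept, u_intercept

# Synthetic example: true activity 1000 s^-1, efficiency varied 0.70..0.95,
# with a mild linear dependence on the inefficiency parameter
eff = np.linspace(0.70, 0.95, 8)
n0 = 1000.0
n_gamma = np.full_like(eff, 5000.0)
n_coinc = eff * n_gamma
n_beta = n0 * (1 + 0.05 * (1 - eff) / eff) * n_coinc / n_gamma
a, u = extrapolate_activity(n_beta, n_gamma, n_coinc)   # a ≈ 1000.0
```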

Relevance:

30.00%

Publisher:

Abstract:

In clinical practice, physicians are confronted with a multitude of definitions and treatment goals for arterial hypertension, depending on the diagnostic method used (e.g. office, home and ambulatory blood pressure measurement) and the underlying disease. The historical background and evidence behind these different blood pressure thresholds are discussed in this article, as well as some recent treatment guidelines. Finally, the "J curve" debate, namely the possible risks associated with an excessive blood pressure reduction, is discussed.

Relevance:

30.00%

Publisher:

Abstract:

Electrical impedance tomography (EIT) allows the measurement of intra-thoracic impedance changes related to cardiovascular activity. As a safe and low-cost imaging modality, EIT is an appealing candidate for non-invasive and continuous haemodynamic monitoring. EIT has recently been shown to allow the assessment of aortic blood pressure via the estimation of the aortic pulse arrival time (PAT). However, finding the aortic signal within EIT image sequences is a challenging task: the signal has a small amplitude and is difficult to locate due to the small size of the aorta and the inherent low spatial resolution of EIT. To detect the aortic signal as reliably as possible, our objective was to understand the effect of EIT measurement settings (electrode belt placement, reconstruction algorithm). This paper investigates the influence of three transversal belt placements and two commonly-used difference reconstruction algorithms (Gauss-Newton and GREIT) on the measurement of aortic signals in view of aortic blood pressure estimation via EIT. A magnetic resonance imaging-based three-dimensional finite element model of the haemodynamic bio-impedance properties of the human thorax was created. Two simulation experiments were performed with the aim of (1) evaluating the timing error in aortic PAT estimation and (2) quantifying the strength of the aortic signal in each pixel of the EIT image sequences. Both experiments reveal better performance for images reconstructed with Gauss-Newton (with a noise figure of 0.5 or above) and a belt placement at the height of the heart or higher. According to the noise-free scenarios simulated, the uncertainty in the analysis of the aortic EIT signal is expected to induce blood pressure errors of at least ± 1.4 mmHg.
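
PAT extraction itself can be sketched with one common definition: the instant of steepest upslope of the pulse waveform relative to a cardiac reference event. This is a sketch of the general technique, not the paper's exact detection rule, and the synthetic pulse below is illustrative:

```python
import numpy as np

def pulse_arrival_time(signal, fs, t_ref=0.0):
    """Estimate pulse arrival time (PAT) as the instant of steepest
    upslope of a pressure/impedance pulse, relative to a cardiac
    reference event (e.g. the ECG R-peak) at time t_ref.  The
    max-derivative criterion is one common PAT definition; the paper's
    actual criterion may differ."""
    d = np.gradient(signal)             # first derivative of the pulse
    t_foot = np.argmax(d) / fs          # time of steepest upslope (s)
    return t_foot - t_ref

# Synthetic aortic pulse: sigmoid upstroke centred at 120 ms, fs = 1 kHz
fs = 1000.0
t = np.arange(0, 0.5, 1 / fs)
pulse = 1.0 / (1.0 + np.exp(-(t - 0.120) / 0.010))
pat = pulse_arrival_time(pulse, fs)     # ≈ 0.120 s
```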

Relevance:

30.00%

Publisher:

Abstract:

This chapter presents possible uses and examples of Monte Carlo methods for the evaluation of uncertainties in the field of radionuclide metrology. The method is already well documented in GUM supplement 1, but here we present a more restrictive approach, where the quantities of interest calculated by the Monte Carlo method are estimators of the expectation and standard deviation of the measurand, and the Monte Carlo method is used to propagate the uncertainties of the input parameters through the measurement model. This approach is illustrated by an example of the activity calibration of a 103Pd source by liquid scintillation counting and the calculation of a linear regression on experimental data points. An electronic supplement presents some algorithms which may be used to generate random numbers with various statistical distributions, for the implementation of this Monte Carlo calculation method.
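
The restrictive approach described here reduces to a few lines of code: sample each input quantity from its assigned distribution, propagate the samples through the measurement model, and take the sample mean and standard deviation as estimators of the expectation and standard uncertainty of the measurand. A minimal sketch with a hypothetical measurement model (the rate, efficiency and mass values are illustrative, not the 103Pd example):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200_000                          # number of Monte Carlo trials

# Hypothetical measurement model: massic activity a = R / (eps * m),
# with counting rate R, detection efficiency eps and source mass m.
R   = rng.normal(1.50e4, 0.02e4, n)  # counts/s, Gaussian
eps = rng.uniform(0.948, 0.952, n)   # efficiency, rectangular distribution
m   = rng.normal(0.1000, 0.0002, n)  # g, Gaussian

a = R / (eps * m)                    # propagate through the model
a_hat = a.mean()                     # estimator of the expectation
u_a = a.std(ddof=1)                  # estimator of the standard deviation
```

The combined relative uncertainty emerging from the simulation (about 1.35 % here) matches what analytic propagation of the three components would give, which is a useful consistency check for simple models.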

Relevance:

30.00%

Publisher:

Abstract:

This paper discusses basic theoretical strategies used to deal with measurement uncertainties arising from different experimental situations. It attempts to indicate the most appropriate method of obtaining a reliable estimate of the quantity to be evaluated, depending on the characteristics of the data available. The theoretical strategies discussed are supported by experimental detail, and the conditions and results have been taken from examples in the field of radionuclide metrology. Special care regarding the correct treatment of covariances is emphasized because of the unreliability of the results obtained if these are neglected.
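
The importance of covariances can be made concrete with a generalised least-squares mean of two correlated measurements; the numbers below are illustrative:

```python
import numpy as np

def correlated_weighted_mean(x, cov):
    """Best linear unbiased estimate of a quantity measured several
    times with correlated uncertainties (generalised least squares).
    Ignoring the off-diagonal covariance terms can bias both the
    estimate and, especially, its stated uncertainty."""
    x = np.asarray(x, float)
    cov = np.asarray(cov, float)
    w = np.linalg.solve(cov, np.ones_like(x))   # V^-1 @ 1
    w /= w.sum()                                # normalised weights
    mean = w @ x
    u = np.sqrt(w @ cov @ w)                    # uncertainty of the mean
    return mean, u

# Two measurements sharing a common systematic component (illustrative):
x = [10.0, 10.4]
cov = [[0.04, 0.02],      # u1^2 = 0.04, covariance = 0.02
       [0.02, 0.09]]      # u2^2 = 0.09
m, u = correlated_weighted_mean(x, cov)
```

Dropping the 0.02 covariance terms in this example would understate the uncertainty of the mean, which is precisely the failure mode the paper warns against.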

Relevance:

30.00%

Publisher:

Abstract:

Pressurized re-entrant (or 4π) ionization chambers (ICs) connected to current-measuring electronics are used for activity measurements of photon emitting radionuclides and some beta emitters in the fields of metrology and nuclear medicine. As a secondary method, these instruments need to be calibrated with appropriate activity standards from primary or direct standardization. The use of these instruments over 50 years has been well described in numerous publications, such as the Monographie BIPM-4 and the special issue of Metrologia on radionuclide metrology (Ratel 2007 Metrologia 44 S7-16; Schrader 1997 Activity Measurements With Ionization Chambers (Monographie BIPM-4); Schrader 2007 Metrologia 44 S53-66; Cox et al 2007 Measurement Modelling of the International Reference System (SIR) for Gamma-Emitting Radionuclides (Monographie BIPM-7)). The present work describes, in a first part, the principles of activity measurements, calibrations and impurity corrections using pressurized ionization chambers, and, in a second part, the uncertainty analysis, illustrated with example uncertainty budgets from routine source calibration as well as from an international reference system (SIR) measurement.
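
The measurement principle, net current divided by a calibration factor traceable to a primary standard, with a correction for the current contributed by known radionuclidic impurities, can be sketched as follows; the function name and all numerical values are illustrative:

```python
def activity_from_current(i_meas, cal_factor, impurities=()):
    """Activity of a source measured in a pressurised ionisation
    chamber (secondary method).  The measured current is divided by a
    calibration factor (current per unit activity, obtained from a
    primary standard) after subtracting the current contributed by
    known radionuclidic impurities.

    impurities: iterable of (activity_Bq, cal_factor_A_per_Bq) pairs.
    """
    i_imp = sum(act * k for act, k in impurities)   # impurity current, A
    return (i_meas - i_imp) / cal_factor            # activity, Bq

# Illustrative numbers: ~5 MBq source with a small impurity correction
i = 5.00e-12                                        # measured current, A
k_main = 1.00e-18                                   # A/Bq, main nuclide
a = activity_from_current(i, k_main, impurities=[(1.0e4, 2.0e-19)])
```

In a real uncertainty budget, the uncertainties of the current, the calibration factor, the impurity activities and the decay corrections would each appear as components to be combined.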

Relevance:

20.00%

Publisher:

Abstract:

PURPOSE: To evaluate a diagnostic strategy for pulmonary embolism that combined clinical assessment, plasma D-dimer measurement, lower limb venous ultrasonography, and helical computed tomography (CT). METHODS: A cohort of 965 consecutive patients presenting to the emergency departments of three general and teaching hospitals with clinically suspected pulmonary embolism underwent sequential noninvasive testing. Clinical probability was assessed by a prediction rule combined with implicit judgment. All patients were followed for 3 months. RESULTS: A normal D-dimer level (<500 microg/L by a rapid enzyme-linked immunosorbent assay) ruled out venous thromboembolism in 280 patients (29%), and finding a deep vein thrombosis by ultrasonography established the diagnosis in 92 patients (9.5%). Helical CT was required in only 593 patients (61%) and showed pulmonary embolism in 124 patients (12.8%). Pulmonary embolism was considered ruled out in the 450 patients (46.6%) with a negative ultrasound and CT scan and a low-to-intermediate clinical probability. The 8 patients with a negative ultrasound and CT scan despite a high clinical probability proceeded to pulmonary angiography (positive: 2; negative: 6). Helical CT was inconclusive in 11 patients (pulmonary embolism: 4; no pulmonary embolism: 7). The overall prevalence of pulmonary embolism was 23%. Patients classified as not having pulmonary embolism were not anticoagulated during follow-up and had a 3-month thromboembolic risk of 1.0% (95% confidence interval: 0.5% to 2.1%). CONCLUSION: A noninvasive diagnostic strategy combining clinical assessment, D-dimer measurement, ultrasonography, and helical CT yielded a diagnosis in 99% of outpatients suspected of pulmonary embolism, and appeared to be safe, provided that CT was combined with ultrasonography to rule out the disease.
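
The sequential strategy evaluated above amounts to a decision algorithm, which can be written out explicitly. This is a sketch of the published rules for illustration only, not clinical guidance:

```python
def pe_workup(d_dimer, clinical_prob, us_dvt=None, ct=None):
    """Sequential noninvasive strategy for suspected pulmonary embolism,
    following the abstract: D-dimer first, then lower limb venous
    ultrasonography, then helical CT, with angiography reserved for
    discordant high-probability cases.

    d_dimer: plasma level in microg/L
    clinical_prob: 'low', 'intermediate' or 'high'
    us_dvt: True if ultrasonography shows a deep vein thrombosis
    ct: 'positive', 'negative' or 'inconclusive' helical CT result
    """
    if d_dimer < 500:
        return "PE ruled out"                 # normal D-dimer
    if us_dvt:
        return "PE diagnosed"                 # DVT found on ultrasound
    if ct == "positive":
        return "PE diagnosed"
    if ct == "negative":
        if clinical_prob in ("low", "intermediate"):
            return "PE ruled out"             # negative US + negative CT
        return "pulmonary angiography"        # high clinical probability
    return "further testing"                  # inconclusive CT
```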

Relevance:

20.00%

Publisher:

Abstract:

The traditional basis for assessing the effect of antihypertensive therapy is the blood pressure reading taken by a physician. However, several recent trials have been designed to evaluate the blood pressure lowering effect of various therapeutic agents during the patients' normal daytime activities, using a portable, semi-automatic blood pressure recorder. The results have shown that in a given patient, blood pressure measured at the physician's office often differs greatly from that prevailing during the rest of the day. This is true in both treated and untreated hypertensive patients. The difference between office and ambulatory recorded pressures cannot be predicted from blood pressure levels measured by the physician. Therefore, a prospective study was carried out in patients with diastolic blood pressures that were uncontrolled at the physician's office despite antihypertensive therapy. The purpose was to evaluate the response of recorded ambulatory blood pressure to treatment adjustments aimed at reducing office blood pressure below a pre-set target level. Only patients with high ambulatory blood pressures at the outset appeared to benefit from further changes in therapy. Thus, ambulatory blood pressure monitoring can be used to identify those patients who remain hypertensive only when facing the physician, despite antihypertensive therapy. Ambulatory monitoring could thus help to evaluate the efficacy of antihypertensive therapy and allow individual treatment.

Relevance:

20.00%

Publisher:

Abstract:

We propose a new method, based on inertial sensors, to automatically measure at high frequency the durations of the main phases of ski jumping (i.e. take-off release, take-off, and early flight). The kinematics of the ski jumping movement were recorded by four inertial sensors, attached to the thigh and shank of junior athletes, for 40 jumps performed in indoor conditions and 36 jumps in field conditions. An algorithm was designed to detect temporal events from the recorded signals and to estimate the duration of each phase. These durations were evaluated against a reference camera-based motion capture system and by trainers conducting video observations. The precision for the take-off release and take-off durations (indoor < 39 ms, outdoor = 27 ms) can be considered technically valid for performance assessment. The errors for early flight duration (indoor = 22 ms, outdoor = 119 ms) were comparable to the trainers' variability and should be interpreted with caution. No significant changes in the error were noted between indoor and outdoor conditions, and individual jumping technique did not influence the errors for take-off release and take-off. Therefore, the proposed system can provide valuable information for performance evaluation of ski jumpers during training sessions.
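
Although the paper's detection algorithm is not reproduced here, the general pattern, detecting threshold crossings in an inertial signal and differencing the event times to obtain phase durations, can be sketched as follows; the signal shape, sampling rate and threshold are all illustrative:

```python
import numpy as np

def detect_events(signal, fs, threshold):
    """Return the times (s) at which a signal crosses a threshold
    upwards -- a simple stand-in for the event-detection rules used to
    segment take-off release, take-off and early flight.  The paper's
    actual algorithm and thresholds are not reproduced here."""
    s = np.asarray(signal, float)
    idx = np.where((s[:-1] < threshold) & (s[1:] >= threshold))[0] + 1
    return idx / fs

def phase_durations(event_times):
    """Durations of the phases bounded by consecutive detected events."""
    return np.diff(event_times)

# Synthetic 500 Hz gyroscope-like trace with two events 0.30 s apart
fs = 500.0
t = np.arange(0, 1.0, 1 / fs)
sig = np.exp(-((t - 0.40) / 0.02) ** 2) + np.exp(-((t - 0.70) / 0.02) ** 2)
events = detect_events(sig, fs, threshold=0.5)
dur = phase_durations(events)        # one phase, ≈ 0.30 s
```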

Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND: The Adolescent Drug Abuse Diagnosis (ADAD) and Health of Nation Outcome Scales for Children and Adolescents (HoNOSCA) are both measures of outcome for adolescent mental health services. AIMS: To compare the ADAD with HoNOSCA, and to examine their clinical usefulness. METHODS: Comparison of the ADAD and HoNOSCA outcome measures of 20 adolescents attending a psychiatric day care unit. RESULTS: ADAD change was positively correlated with HoNOSCA change. HoNOSCA assesses the clinic's day-care programme more positively than the ADAD. The ADAD detects a group for which the mean score remains unchanged, whereas HoNOSCA does not. CONCLUSIONS: Good convergent validity emerges between the two assessment tools. The ADAD allows an evidence-based assessment and generally enables better subject discrimination than HoNOSCA. HoNOSCA gives a less refined evaluation but is more economical in time and possibly more sensitive to change. Both assessment tools provide useful information and enabled the Day-care Unit for Adolescents to rethink the process of care and of outcome, which benefited both the institution and the patients.

Relevance:

20.00%

Publisher:

Abstract:

The aim of this study was to determine whether breath 13CO2 measurements could be used to assess compliance to a diet containing carbohydrates naturally enriched in 13C. The study was divided into two periods: Period 1, a 4-day baseline with carbohydrates of low 13C/12C ratio, and Period 2, a 5-day isocaloric diet with carbohydrates of high 13C/12C ratio (corn, cane sugar, pineapple, millet). Respiratory gas exchange (by indirect calorimetry), urinary nitrogen excretion and breath 13CO2 were measured every morning in post-absorptive conditions, both in the resting state and during a 45-min low-intensity exercise (walking on a treadmill). The subjects were 10 healthy lean women (BMI 20.4 +/- 1.7 kg/m2, body fat 24.4 +/- 1.3%). The 13C enrichment of oxidized carbohydrate and breath 13CO2 was compared to the enrichment of the exogenous dietary carbohydrates. At rest, the enrichment of oxidized carbohydrate increased significantly after one day of the 13C-enriched diet and reached a steady value (103 +/- 16%) similar to the enrichment of the exogenous carbohydrates. During exercise, the 13C enrichment of oxidized carbohydrate remained significantly lower (68 +/- 17%) than that of dietary carbohydrates. Compliance to a diet with a high content of carbohydrates naturally enriched in 13C may therefore be assessed from the measurement of breath 13CO2 enrichment combined with respiratory gas exchange in resting, post-absorptive conditions.
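
The comparison reported above reduces to a simple ratio: the shift of the oxidised-carbohydrate 13C signature from its pre-diet baseline, divided by the corresponding shift of the dietary carbohydrate. A sketch with illustrative delta-13C values (not the study's data):

```python
def relative_enrichment(delta_oxidised, delta_baseline, delta_diet):
    """Percentage 13C enrichment of the oxidised carbohydrate relative
    to the dietary carbohydrate, both expressed as shifts from the
    pre-diet baseline.  A sketch of the comparison reported in the
    abstract; the delta-13C values used below are illustrative."""
    return 100.0 * (delta_oxidised - delta_baseline) / (delta_diet - delta_baseline)

# Illustrative delta-13C values (per mil vs PDB):
at_rest = relative_enrichment(-14.0, -25.0, -14.3)    # ~103 %, full compliance
exercise = relative_enrichment(-17.7, -25.0, -14.3)   # ~68 %, diluted by other substrates
```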

Relevance:

20.00%

Publisher:

Abstract:

Background and aims: Family-centred care is an expected standard in PICU, yet parent-reported outcomes are rarely measured. The Dutch validated EMPATHIC questionnaire provides accurate measures of parental perceptions of family-centred care in PICU. A French version would provide an important resource for quality control and benchmarking with other PICUs. The study aimed to translate the EMPATHIC questionnaire and assess its cultural adaptation into French. Methods: In September 2012, following approval from the developer, translation and cultural adaptation were performed using a structured method (Wild et al. 2005). This included forward-backward translation and reconciliation by an official translator, harmonization assessed by the research team, and cognitive debriefing with the target user population. In this last step, a convenience sample of parents with PICU experience assessed the comprehensibility and cultural relevance of the 65-item French EMPATHIC questionnaire. The PICUs in Lausanne, Switzerland and Lille, France participated. Results: Seventeen parents, including 13 native French speakers and 4 speakers of French as a second language, tested the cognitive equivalence and cultural relevance of the French EMPATHIC questionnaire. The mean agreement for comprehensibility of all 65 items reached 90.2%. Three items fell below the 80% agreement cut-off and were revised for inclusion in the final French version. Conclusions: The translation and cultural adaptation highlighted a few cultural differences that did not interfere with the main construct of the EMPATHIC questionnaire. Reliability and validity testing with a new sample of parents is needed to strengthen the psychometric properties of the French EMPATHIC questionnaire.

Relevance:

20.00%

Publisher:

Abstract:

1. Species distribution modelling is used increasingly in both applied and theoretical research to predict how species are distributed and to understand attributes of species' environmental requirements. In species distribution modelling, various statistical methods are used that combine species occurrence data with environmental spatial data layers to predict the suitability of any site for that species. While the number of data sharing initiatives involving species' occurrences in the scientific community has increased dramatically over the past few years, various data quality and methodological concerns related to using these data for species distribution modelling have not been addressed adequately. 2. We evaluated how uncertainty in georeferences and associated locational error in occurrences influence species distribution modelling using two treatments: (1) a control treatment where models were calibrated with original, accurate data and (2) an error treatment where data were first degraded spatially to simulate locational error. To incorporate error into the coordinates, we moved each coordinate with a random number drawn from the normal distribution with a mean of zero and a standard deviation of 5 km. We evaluated the influence of error on the performance of 10 commonly used distributional modelling techniques applied to 40 species in four distinct geographical regions. 3. Locational error in occurrences reduced model performance in three of these regions; relatively accurate predictions of species distributions were possible for most species, even with degraded occurrences. Two species distribution modelling techniques, boosted regression trees and maximum entropy, were the best performing models in the face of locational errors. The results obtained with boosted regression trees were only slightly degraded by errors in location, and the results obtained with the maximum entropy approach were not affected by such errors. 4. Synthesis and applications. 
To use the vast array of occurrence data that exists currently for research and management relating to the geographical ranges of species, modellers need to know the influence of locational error on model quality and whether some modelling techniques are particularly robust to error. We show that certain modelling techniques are particularly robust to a moderate level of locational error and that useful predictions of species distributions can be made even when occurrence data include some error.
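
The error treatment described in point 2, displacing each occurrence coordinate with Gaussian noise of zero mean and 5 km standard deviation, can be reproduced in a few lines; the coordinate values and the equal-area projection are illustrative assumptions:

```python
import numpy as np

def degrade_coordinates(xy_km, sd_km=5.0, seed=0):
    """Simulate locational error in species occurrence records by
    adding independent Gaussian noise (mean 0, sd 5 km, as in the
    error treatment described above) to projected x/y coordinates.
    Coordinates are assumed to be in km in an equal-area projection."""
    rng = np.random.default_rng(seed)
    xy = np.asarray(xy_km, float)
    return xy + rng.normal(0.0, sd_km, size=xy.shape)

# Illustrative occurrence records (projected x/y in km):
occurrences = np.array([[512.3, 4100.7], [530.9, 4088.2], [498.1, 4120.5]])
noisy = degrade_coordinates(occurrences)
shift = np.linalg.norm(noisy - occurrences, axis=1)  # displacement per record
```

Models calibrated on `occurrences` versus `noisy` then give the control and error treatments compared in the study.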