987 results for Efficiency Measurement
Abstract:
In epidemiologic studies, measurement error in dietary variables often attenuates the association between dietary intake and disease occurrence. Regression calibration is commonly used to adjust for this attenuation, but it requires unbiased reference measurements. Short-term reference measurements for foods that are not consumed daily contain excess zeroes that pose challenges for the calibration model. We adapted the two-part regression calibration model, initially developed for multiple replicates of the reference measurement per individual, to a single-replicate setting. We showed how to handle excess zero reference measurements with a two-step modeling approach, how to explore heteroscedasticity in the consumed amount with a variance-mean graph, how to explore nonlinearity with the generalized additive modeling (GAM) and empirical logit approaches, and how to select covariates for the calibration model. The performance of the two-part calibration model was compared with its one-part counterpart. We used vegetable intake and mortality data from the European Prospective Investigation into Cancer and Nutrition (EPIC) study, in which reference measurements were taken with 24-hour recalls. For each of the three vegetable subgroups assessed separately, correcting for error with an appropriately specified two-part calibration model resulted in roughly a threefold increase in the strength of the association with all-cause mortality, as measured by the log hazard ratio. We further found that the standard way of including covariates in the calibration model can lead to overfitting the two-part calibration model, and that the extent of error adjustment is influenced by the number and forms of the covariates in the calibration model. For episodically consumed foods, we advise researchers to pay special attention to the response distribution, nonlinearity, and covariate inclusion when specifying the calibration model.
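A minimal sketch of the two-part calibration idea in Python, using simulated data: a logistic part models the probability that the food was consumed on the recall day, a linear part models the log consumed amount among consumers, and the two predictions are combined into a calibrated intake that would then replace the error-prone intake in the disease model. All variable names (`ffq`, `age`, `consumed`, `amount`) and the data are illustrative, not EPIC data or the authors' code.

```python
# Two-part calibration sketch for an episodically consumed food, with a single
# 24-hour-recall reference measurement per person. Data are simulated.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5000
ffq = rng.gamma(shape=2.0, scale=50.0, size=n)   # error-prone self-reported intake
age = rng.normal(60, 8, size=n)                  # example covariate

# Simulated 24-hour recall: zero on non-consumption days, positive amount otherwise
p_consume = 1.0 / (1.0 + np.exp(-(-1.0 + 0.01 * ffq)))
consumed = rng.binomial(1, p_consume)
amount = np.where(consumed == 1,
                  np.exp(3.0 + 0.004 * ffq + rng.normal(0, 0.5, n)),
                  0.0)

X = sm.add_constant(np.column_stack([ffq, age]))

# Part 1: probability of consumption (logistic regression on all subjects)
part1 = sm.GLM(consumed, X, family=sm.families.Binomial()).fit()

# Part 2: consumed amount among consumers (log scale to tame heteroscedasticity)
pos = consumed == 1
part2 = sm.OLS(np.log(amount[pos]), X[pos]).fit()

# Calibrated intake = P(consume) * E[amount | consumed], with Duan smearing
smear = np.mean(np.exp(part2.resid))
pred_intake = part1.predict(X) * np.exp(part2.predict(X)) * smear
print(pred_intake[:5])
```

The calibrated intake would then enter the Cox model for all-cause mortality in place of the self-reported intake, which is the regression-calibration step described above.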
Abstract:
BACKGROUND: The elderly population is particularly at risk of developing vitamin B12 deficiency. Serum cobalamin does not necessarily reflect B12 status, the determination of methylmalonic acid is not available in all laboratories, and sensitivity issues for holotranscobalamin and the low specificity of total homocysteine limit their utility. The aim of the present study was to establish a diagnostic algorithm using a combination of these markers in place of a single measurement. METHODS: We compared the diagnostic efficiency of these markers for the detection of vitamin B12 deficiency in a population (n = 218) of institutionalized elderly people (median age 80 years). Biochemical, haematological and morphological data were used to categorize participants with or without vitamin B12 deficiency. RESULTS: In receiver operating characteristic curves for the detection of vitamin B12 deficiency using single measurements, serum folate had the greatest area under the curve (0.87) and homocysteine the lowest (0.67). The best specificity was observed for erythrocyte folate and methylmalonic acid (100% for both), but their sensitivity was very low (17% and 53%, respectively). The highest sensitivity was observed for homocysteine (81%) and serum folate (74%). When these markers were combined, starting with serum and erythrocyte folate, followed by holotranscobalamin and ending with methylmalonic acid measurement, the overall sensitivity and specificity of the algorithm were 100% and 90%, respectively. CONCLUSION: The proposed algorithm, which combines erythrocyte folate, serum folate, holotranscobalamin and methylmalonic acid but eliminates B12 and tHcy measurements, is a useful alternative for vitamin B12 deficiency screening in an institutionalized elderly cohort.
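The cascade can be written as a small decision function. This is only a structural sketch: the abstract reports neither the cutoff values nor the exact branching rules, so the predicates, the short-circuiting order within each step, and the notion of "abnormal" are all assumptions made for illustration.

```python
# Structural sketch of the stepwise screening cascade: serum/erythrocyte folate
# first, then holotranscobalamin, then methylmalonic acid. Branching and
# "abnormal" definitions are placeholders, not the study's decision rules.
def b12_screen(folate_abnormal: bool,
               rbc_folate_abnormal: bool,
               holotc_abnormal: bool,
               mma_abnormal: bool) -> bool:
    """Return True if the cascade flags possible vitamin B12 deficiency."""
    # Step 1: serum and erythrocyte folate act as the entry test
    if not (folate_abnormal or rbc_folate_abnormal):
        return False
    # Step 2: holotranscobalamin, the metabolically active B12 fraction
    if holotc_abnormal:
        return True
    # Step 3: methylmalonic acid as the confirmatory marker
    return mma_abnormal
```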
Abstract:
Gumbel analyses were carried out on rainfall time series at 151 locations in Switzerland for four different 30-year periods in order to estimate the daily extreme precipitation with a return period of 100 years. These estimates were compared with the maximum daily values measured during the last 100 years (1911-2010) to test the efficiency of the analyses. The comparison shows that the analyses give good results for 50 to 60% of the locations in the country when based on the 1961-1990 and 1980-2010 time series. On the other hand, daily precipitation with a return period of 100 years is underestimated at most locations when based on the 1931-1960 and especially the 1911-1940 series. This underestimation results from the increase in maximum daily precipitation recorded from 1911 to 2010 at 90% of locations in Switzerland.
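A compact sketch of one such Gumbel analysis in Python, under the assumption that the input is a 30-year series of annual daily-precipitation maxima (simulated here, not a Swiss station record):

```python
# Fit a Gumbel distribution to 30 annual maxima and read off the 100-year
# return level. The series below is simulated for illustration.
import numpy as np
from scipy.stats import gumbel_r

annual_max_mm = gumbel_r.rvs(loc=55.0, scale=12.0, size=30, random_state=42)

loc, scale = gumbel_r.fit(annual_max_mm)           # maximum-likelihood fit
T = 100.0                                          # return period (years)
x_T = gumbel_r.isf(1.0 / T, loc=loc, scale=scale)  # 100-year return level
# Closed form: x_T = loc - scale * np.log(-np.log(1.0 - 1.0 / T))
print(f"Estimated 100-year daily precipitation: {x_T:.1f} mm")
```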
Abstract:
OBJECTIVE: Home blood pressure (BP) monitoring is recommended by several clinical guidelines and has been shown to be feasible in elderly persons. Wrist manometers have recently been proposed for such home BP measurement, but their accuracy has not previously been assessed in elderly patients. METHODS: Forty-eight participants (33 women and 15 men, mean age 81.3±8.0 years) had their BP measured in a sitting position, in random order, with a wrist device equipped with a position sensor and with an arm device. RESULTS: Average BP measurements were consistently lower with the wrist device than with the arm device for systolic BP (means±SD: 120.1±2.2 vs. 130.5±2.2 mmHg, P<0.001) and diastolic BP (66.0±1.3 vs. 69.7±1.3 mmHg, P<0.001). Moreover, a difference of 10 mmHg or more between the arm and wrist devices was observed in 54.2% of systolic and 18.8% of diastolic measurements. CONCLUSION: Compared with the arm device, the wrist device with position sensor systematically underestimated both systolic and diastolic BP. The magnitude of the difference is clinically significant and calls into question the use of the wrist device to monitor BP in elderly persons. This study points to the need to validate BP measuring devices in all age groups, including elderly persons.
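The paired comparison behind these figures can be sketched as follows; the readings are simulated and the variable names (`sbp_arm`, `sbp_wrist`) are illustrative, not the study data:

```python
# Paired comparison of two devices: mean difference, paired t-test, and the
# share of readings where arm and wrist differ by 10 mmHg or more.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n = 48
sbp_arm = rng.normal(130, 15, size=n)             # arm-device systolic readings
sbp_wrist = sbp_arm - rng.normal(10, 8, size=n)   # wrist readings, systematically lower

diff = sbp_arm - sbp_wrist
t_stat, p_value = stats.ttest_rel(sbp_arm, sbp_wrist)
share_ge_10 = np.mean(np.abs(diff) >= 10)
print(f"mean arm-wrist difference: {diff.mean():.1f} mmHg (p = {p_value:.3g}); "
      f"{100 * share_ge_10:.0f}% of pairs differ by >= 10 mmHg")
```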
Abstract:
This paper analyzes the optimal behavior of farmers in the presence of direct payments and uncertainty. In an empirical analysis for Switzerland, it confirms previously obtained theoretical results and quantifies the magnitude of the predicted effects. The results show that direct payments increase agricultural production by between 3.7% and 4.8%. As an alternative to direct payments, the production effect of tax reductions is evaluated in order to determine its magnitude. The empirical analysis corroborates the theoretical results in the literature and demonstrates that tax reductions are also distorting, but to a substantially lesser degree if losses are not offset. However, tax reductions, regardless of whether losses are offset, lead to higher government spending than pure direct payments.
Abstract:
BACKGROUND Low back pain and its associated incapacitating effects constitute an important healthcare and socioeconomic problem, as well as being one of the main causes of disability among adults of working age. The prevalence of non-specific low back pain is very high among the general population, and 60-70% of adults are believed to have suffered this problem at some time. Nevertheless, few randomised clinical trials have addressed the efficacy and efficiency of acupuncture for acute low back pain. The present study is intended to assess the efficacy of acupuncture for acute low back pain, in terms of the improvement reported on the Roland Morris Questionnaire (RMQ) of low back pain incapacity, to estimate the specific and non-specific effects produced by the technique, and to carry out a cost-effectiveness analysis. METHODS/DESIGN A randomised, controlled, four-arm, multicentre prospective study comparing semi-standardised real acupuncture, sham acupuncture (acupuncture at non-specific points), placebo acupuncture and conventional treatment. Patients are blinded to the real, sham and placebo acupuncture treatments. Patients in the sample present symptoms of non-specific acute low back pain with a history of 2 weeks or less, and will be selected from working-age patients, whether in paid employment or not, referred by general practitioners from primary healthcare clinics to the four clinics participating in this study. To assess the primary and secondary outcome measures, patients will be asked to fill in a questionnaire before randomisation and again at 3, 12 and 48 weeks after starting treatment. The primary outcome measure will be clinically relevant improvement (CRI) at 3 weeks after randomisation. We define CRI as a reduction of 35% or more in the RMQ score. DISCUSSION This study is intended to obtain further evidence on the effectiveness of acupuncture for acute low back pain and to isolate the specific and non-specific effects of the treatment.
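Written out, and assuming the 35% refers to a relative reduction from the baseline score (the protocol text does not state this explicitly), the primary endpoint criterion is

\[
\text{CRI at 3 weeks} \iff \frac{\mathrm{RMQ}_{0} - \mathrm{RMQ}_{3\mathrm{w}}}{\mathrm{RMQ}_{0}} \ge 0.35,
\]

where RMQ_0 is the score before randomisation and RMQ_3w the score 3 weeks after randomisation.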
Abstract:
Using H-2Kd-restricted CTL clones, which are specific for a photoreactive derivative of the Plasmodium berghei circumsporozoite peptide PbCS(252-260) (SYIPSAEKI) and permit assessment of TCR-ligand interactions by TCR photoaffinity labeling, we have previously identified several peptide derivative variants for which TCR-ligand binding and the efficiency of Ag recognition deviated by fivefold or more. Here we report that the functional CTL response (cytotoxicity and IFN-gamma production) correlated with the rate of TCR-ligand complex dissociation, but not the avidity of TCR-ligand binding. While peptide antagonists exhibited very rapid TCR-ligand complex dissociation, slightly slower dissociation was observed for strong agonists. Conversely and surprisingly, weak agonists typically displayed slower dissociation than the wild-type agonists. Acceleration of TCR-ligand complex dissociation by blocking CD8 participation in TCR-ligand binding increased the efficiency of Ag recognition in cases where dissociation was slow. In addition, permanent TCR engagement by TCR-ligand photocross-linking completely abolished sustained intracellular calcium mobilization, which is required for T cell activation. These results indicate that the functional CTL response depends on the frequency of serial TCR engagement, which, in turn, is determined by the rate of TCR-ligand complex dissociation.
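The kinetic argument can be made explicit with a simple first-order dissociation model (an assumption for illustration, not a formula given in the abstract): the mean dwell time of a TCR on its ligand is

\[
\tau = \frac{1}{k_{\mathrm{off}}}, \qquad t_{1/2} = \frac{\ln 2}{k_{\mathrm{off}}},
\]

so a faster off-rate shortens each engagement and, for a fixed contact time, allows a given ligand to engage more TCRs serially, whereas very slow dissociation (or permanent cross-linking, as above) limits serial engagement.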
Abstract:
Aortic stiffness is an independent predictor of cardiovascular risk. Different methods for determining pulse wave velocity (PWV) are used, among which the most common are mechanical methods such as SphygmoCor or Complior, which require specific devices and are limited by the technical difficulty of obtaining measurements. Doppler guided by 2D ultrasound is a good alternative to these methods. We studied 40 patients (29 male, aged 21 to 82 years), comparing the Complior method with Doppler. Agreement between the two devices was high (R = 0.91; 95% CI 0.84-0.95). The reproducibility analysis revealed no intra- or interobserver differences. Based on these results, we conclude that Doppler ultrasound is a reliable and reproducible alternative to other established methods for the measurement of aortic PWV.
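Both Complior and the Doppler-based approach estimate the same quantity; by the standard definition (not specific to this study),

\[
\mathrm{PWV} = \frac{\Delta x}{\Delta t},
\]

where Δx is the distance between the two recording sites (e.g., carotid and femoral) and Δt is the pulse transit time between them.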
Abstract:
Epidemiological studies have shown that obesity is associated with chronic kidney disease and end-stage renal disease. These studies have used creatinine-derived equations to estimate glomerular filtration rate (GFR) and have indexed GFR to body surface area (BSA). However, both the use of equations that rely on creatinine as a surrogate marker of glomerular filtration and the indexation of GFR for BSA can be questioned in the obese population. First, these equations lack precision when compared with gold-standard GFR measurements such as inulin clearance; second, indexing GFR to 1.73 m² of BSA leads to a systematic underestimation of GFR relative to absolute GFR in obese patients, whose BSA usually exceeds 1.73 m². Obesity is also associated with pathophysiological changes that can affect the pharmacokinetics of drugs. The effect of obesity on both renal function and drug pharmacokinetics raises the issue of correct drug dosage in obese individuals. This may be particularly relevant for drugs that have a narrow therapeutic range or are excreted by the kidney.
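A short worked illustration of the indexation issue, using the DuBois BSA formula and hypothetical patient values (not data from the review):

```python
# Why indexing GFR to 1.73 m^2 of BSA understates renal function in obesity.
# The DuBois formula and the 1.73 m^2 convention are standard; the patient
# values below are hypothetical.
def bsa_dubois(weight_kg: float, height_cm: float) -> float:
    """Body surface area (m^2) by the DuBois & DuBois formula."""
    return 0.007184 * weight_kg**0.425 * height_cm**0.725

absolute_gfr = 120.0                          # mL/min, e.g. from an inulin clearance
bsa = bsa_dubois(weight_kg=120.0, height_cm=170.0)
indexed_gfr = absolute_gfr * 1.73 / bsa       # mL/min per 1.73 m^2
print(f"BSA = {bsa:.2f} m^2 -> indexed GFR = {indexed_gfr:.0f} mL/min/1.73 m^2 "
      f"vs absolute GFR = {absolute_gfr:.0f} mL/min")
```

With a BSA well above 1.73 m², the indexed value sits well below the absolute one, which is the systematic underestimation described above.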
Abstract:
It is generally accepted that financial markets are efficient in the long run, although there may be some deviations in the short run. It is also accepted that a good portfolio manager is one who beats the market persistently over time; this type of manager could not exist if markets were perfectly efficient. Accordingly, in a purely efficient market we should find that managers know they cannot beat the market, so they would undertake only purely passive management strategies. Assuming a certain degree of inefficiency in the short run, a market may show some managers who try to beat the market by undertaking active strategies. From Fama's efficient markets theory we can state that these active managers may beat the market occasionally, although they will not be able to significantly enhance their performance in the long run. In an inefficient market, on the other hand, we would expect a higher level of activity, related to the higher probability of beating the market. In this paper we pursue two objectives: first, we set out a basis for analysing the level of efficiency in an asset investment funds market by measuring performance, the activity of strategies, and their persistence for a certain group of funds during the period of study; second, we analyse individual performance persistence in order to determine the existence of skilled managers. The CAPM is taken as the theoretical background, and the use of the Sharpe ratio as a suitable performance measure in a limited-information environment leads to a proposal for measuring group performance. The empirical study uses quarterly data for the period 1999-2007 for the whole population of the Spanish asset investment funds market, provided by the CNMV (Comisión Nacional del Mercado de Valores). This period of study was chosen to ensure a wide enough range of efficient-market observation, providing a proper basis for comparison with the following period. As a result, we develop a model that allows us to measure efficiency in a given asset mutual funds market, based on the level of strategy activity undertaken by managers. We also observe persistence in individual performance for a certain group of funds.
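The two standard building blocks referred to above, written out: the Sharpe ratio of a fund p and the CAPM pricing relation,

\[
S_p = \frac{\bar{R}_p - R_f}{\sigma_p}, \qquad E[R_i] = R_f + \beta_i \left( E[R_m] - R_f \right),
\]

where R_f is the risk-free rate, σ_p the standard deviation of the fund's excess return, and β_i the fund's sensitivity to the market return R_m.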
Abstract:
To evaluate the severity of airway pathologies, quantitative dimensioning of the airways is of utmost importance. Endoscopic vision gives a projective image, so no true scaling information can be deduced from it directly. In this article, an approach based on an interferometric setup, a low-coherence laser source and a standard rigid endoscope is presented and applied to measurements of hollow samples. More generally, the use of the low-coherence interferometric setup detailed here could be extended to any other endoscopy-related field, e.g., gastroscopy, arthroscopy and other medical or industrial applications where three-dimensional topology is required. The setup design, with a multiple-fibre illumination system, is presented. The method's ability to operate on biological samples is demonstrated through measurements on ex vivo pig bronchi.
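The depth discrimination attainable with a low-coherence source is set by its coherence length; for a source with a Gaussian spectrum the standard expression (a general result, not a figure taken from this article) is

\[
\Delta z = \frac{2 \ln 2}{\pi} \, \frac{\lambda_0^2}{\Delta\lambda},
\]

where λ0 is the centre wavelength and Δλ the spectral bandwidth (FWHM), so a broader source yields finer axial resolution.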
Abstract:
Perfusion CT studies of regional cerebral blood flow (rCBF), involving sequential acquisition of cerebral CT sections during IV contrast material administration, have classically been performed at 120 kVp. We hypothesized that using 80 kVp would yield the same image quality while significantly lowering the patient's radiation dose, and we evaluated this assumption. In five patients undergoing a cerebral CT survey, one section level was imaged at 120 kVp and at 80 kVp, before and after IV administration of iodinated contrast material. The four cerebral CT sections obtained in each patient were analyzed with special attention to contrast, noise, and radiation dose. Contrast enhancement at 80 kVp was significantly increased (P < .001), as was the contrast between gray matter and white matter after contrast administration (P < .001). Mean noise at 80 kVp was not statistically different (P = .042). Finally, performing perfusion CT studies at 80 kVp while keeping mAs constant lowers the radiation dose by a factor of 2.8. We thus conclude that 80 kVp acquisition of perfusion CT studies of rCBF will result in increased contrast enhancement and should improve rCBF analysis, with reduced patient irradiation.
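The measured factor of 2.8 is consistent with the usual rule of thumb that, at constant mAs, CT dose scales roughly with the tube potential raised to a power between 2 and 3 (an approximation, not a figure from the article):

\[
\left(\tfrac{120}{80}\right)^{2} = 2.25, \qquad \left(\tfrac{120}{80}\right)^{2.5} \approx 2.76, \qquad \left(\tfrac{120}{80}\right)^{3} \approx 3.38,
\]

a range that brackets the reported dose reduction.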
Abstract:
Aim: The psychoanalytic theories of Bion, Anzieu, Berger and Gibello postulate that the development of thinking depends upon the formation of a psychic space. This thinking space has its origin in the body and in our interpersonal relations. This study aims to validate this psychodynamic hypothesis. Method: A group of 8- to 14-year-old children participated in the research. The presence of a thinking space was operationalized by the "barrier" and "penetration" scores on Fisher and Cleveland's Rorschach scales, and intellectual efficiency was measured using a short version of the WISC-IV. Results: Extreme scores on the "barrier" and "penetration" variables predicted a lower intellectual level than average scores on the same variables. Conclusion: The development of thinking and personality are undoubtedly linked, and the "barrier" and "penetration" variables are useful measures when evaluating the development of a space for thought.