964 results for Methods Time Measurement (MTM)
Abstract:
BACKGROUND: The hospital readmission rate has been proposed as an important outcome indicator computable from routine statistics. However, most commonly used measures raise conceptual issues. OBJECTIVES: We sought to evaluate the usefulness of a computerized algorithm for identifying avoidable readmissions on the basis of minimum bias, criterion validity, and measurement precision. RESEARCH DESIGN AND SUBJECTS: A total of 131,809 hospitalizations of patients discharged alive from 49 hospitals were used to compare the predictive performance of risk adjustment methods. A subset of a random sample of 570 medical records of discharge/readmission pairs in 12 hospitals was reviewed to estimate the predictive value of the screening of potentially avoidable readmissions. MEASURES: Potentially avoidable readmissions, defined as readmissions related to a condition of the previous hospitalization, not expected as part of a program of care, and occurring within 30 days after the previous discharge, were identified by a computerized algorithm. Unavoidable readmissions were treated as censored events. RESULTS: A total of 5.2% of hospitalizations were followed by a potentially avoidable readmission, 17% of them in a different hospital. The predictive value of the screen was 78%; 27% of screened readmissions were judged clearly avoidable. The correlations between the hospital rate of clearly avoidable readmissions and the rate of all readmissions, the rate of potentially avoidable readmissions, and the ratio of observed to expected readmissions were 0.42, 0.56, and 0.66, respectively. Adjustment models using clinical information performed better. CONCLUSION: Adjusted rates of potentially avoidable readmissions are scientifically sound enough to warrant their inclusion in hospital quality surveillance.
Abstract:
Objectives: To test whether the time of day significantly influences the occurrence of type 4A myocardial infarction in elective patients undergoing percutaneous coronary intervention (PCI). Background: Recent studies have suggested an influence of circadian rhythms on myocardial infarction size and mortality among patients with ST-elevation myocardial infarction. The aim of this study was to investigate whether periprocedural myocardial infarction (PMI) is influenced by the time of day in elective patients undergoing PCI. Methods: All consecutive patients undergoing elective PCI between 2007 and 2011 at our institutions with known post-interventional troponin were retrospectively included. Patients (n = 1021) were divided into two groups according to the starting time of the PCI: the morning group (n = 651), between 07:00 and 11:59, and the afternoon group (n = 370), between 12:00 and 18:59. Baseline and procedural characteristics, as well as clinical outcome defined as the occurrence of PMI, were compared between groups. To limit selection bias, all analyses were repeated in 308 pairs obtained by propensity score (PS) matching. Results: In the overall population, the rate of PMI was significantly lower in the morning group than in the afternoon group (20% vs. 30%, p < 0.001). This difference remained statistically significant after PS matching (21% vs. 29%, p = 0.03). Multivariate analysis showed that being treated in the afternoon independently increased the risk of PMI, with an odds ratio of 2.0 (95% CI: 1.1-3.4; p = 0.02). Conclusions: This observational PS-matched study suggests that the timing of an elective PCI influences the rate of PMI.
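To illustrate the matching step, here is a minimal sketch of greedy 1:1 nearest-neighbour propensity score matching with a caliper, one common way such pairs are formed; the abstract does not specify the exact matching algorithm used, and all names and parameters below are illustrative assumptions.

import numpy as np
from sklearn.linear_model import LogisticRegression

def ps_match(X, treated, caliper=0.05):
    # Propensity score: probability of afternoon treatment given covariates.
    ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]
    t_idx = np.where(treated == 1)[0]
    c_idx = list(np.where(treated == 0)[0])
    pairs = []
    for i in t_idx:
        if not c_idx:
            break
        d = np.abs(ps[c_idx] - ps[i])
        j = int(np.argmin(d))
        if d[j] <= caliper:                  # accept only sufficiently close matches
            pairs.append((i, c_idx.pop(j)))  # match without replacement
    return pairs                             # list of (treated, control) index pairs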
Abstract:
BACKGROUND: Measurement of plasma renin is important for the clinical assessment of hypertensive patients. The most common methods for measuring plasma renin are the plasma renin activity (PRA) assay and the renin immunoassay. The clinical application of renin inhibitor therapy has thrown into focus the differences in information provided by activity assays and immunoassays for renin and prorenin measurement and has drawn attention to the need for precautions to ensure their accurate measurement. CONTENT: Renin activity assays and immunoassays provide related but different information. Whereas activity assays measure only active renin, immunoassays measure both active and inhibited renin. Particular care must be taken in the collection and processing of blood samples and in the performance of these assays to avoid errors in renin measurement. Both activity assays and immunoassays are susceptible to renin overestimation due to prorenin activation. In addition, activity assays performed with peptidase inhibitors may overestimate the degree of inhibition of PRA by renin inhibitor therapy. Moreover, immunoassays may overestimate the reactive increase in plasma renin concentration in response to renin inhibitor therapy, owing to the inhibitor promoting conversion of prorenin to an open conformation that is recognized by renin immunoassays. CONCLUSIONS: The successful application of renin assays to patient care requires that the clinician and the clinical chemist understand the information provided by these assays and the precautions necessary to ensure their accuracy.
Abstract:
OBJECTIVES: (1) To evaluate the changes in surface roughness and gloss after simulated toothbrushing of 9 composite materials and 2 ceramic materials in relation to brushing time and load in vitro; (2) to assess the relationship between surface gloss and surface roughness. METHODS: Eight flat specimens each of the composite materials (microfilled: Adoro, Filtek Supreme, Heliomolar; microhybrid: Four Seasons, Tetric EvoCeram; hybrid: Compoglass F, Targis, Tetric Ceram; macrohybrid: Grandio) and two ceramic materials (IPS d.SIGN and IPS Empress, polished) were fabricated according to the manufacturer's instructions and optimally polished with up to 4000-grit SiC. The specimens were subjected to a toothbrushing (TB) simulation device (Willytec) with rotating movements, toothpaste slurry, and three different loads (100 g/250 g/350 g). At hourly intervals from 1 h to 10 h of TB, mean surface roughness Ra was measured with an optical sensor and surface gloss (Gl) with a glossmeter. Statistical analysis was performed on log-transformed Ra data applying two-way ANOVA to evaluate the interaction between load and material and between load and brushing time. RESULTS: There was a significant interaction between material and load as well as between load and brushing time (p<0.0001). The microhybrid and hybrid materials demonstrated more surface deterioration with higher loads, whereas with the microfilled resins Heliomolar and Adoro the opposite was true. For the ceramic materials, little or no deterioration was observed over time, independent of the load. The ceramic materials and 3 of the composite materials (for roughness) showed no further deterioration after 5 h of toothbrushing. Mean surface gloss was the parameter that discriminated best between the materials, followed by mean surface roughness Ra. There was a strong correlation between surface gloss and surface roughness for all the materials except the ceramics. The evaluation of the deterioration curves of individual specimens revealed a more or less synchronous course, hinting at specific external conditions rather than showing the true variability in relation to the tested material. SIGNIFICANCE: The surface roughness and gloss of dental materials change with brushing time and load, and thus result in different material rankings. Apart from Grandio, the hybrid composite resins were more prone to surface changes than the microfilled composites. The deterioration potential of a composite material can be quickly assessed by measuring surface gloss. For this purpose, a brushing time of 10 h (=72,000 strokes) is needed. In further comparative studies, specimens of different materials should be tested in one series to estimate the true variability.
Abstract:
The radioactive concentrations of (166m)Ho, (134)Cs and (133)Ba solutions have been standardised using a 4πβ-4πγ coincidence counting system we have recently set up. Detection in the beta channel is performed using various geometries of a UPS-89 plastic scintillator optically coupled to a selected low-noise 1 in. diameter photomultiplier tube. The light-tight thin capsule that encloses this beta detector is housed within the well of a 5 in. × 5 in. NaI(Tl) monocrystal detector. The beta detection efficiency can be varied either by optical filtering or by electronic discrimination when the electrons lose all their energy in the plastic scintillator. This 4πβ-4πγ coincidence system improves on our 4πβ(PC)-γ system in that its sample preparation is less labour intensive, it yields larger beta- and gamma-counting efficiencies, thus enabling the standardisation of low-activity sources with good statistics in reasonable time, and it makes standardising short-lived radionuclides easier. The resulting radioactive concentrations of (166m)Ho, (134)Cs and (133)Ba are found to agree with those measured with other primary measurement methods, thus validating our 4πβ-4πγ coincidence counting system.
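For context, the idealised relations behind coincidence counting (standard textbook equations, not taken from this abstract): with source activity $N_0$ and detection efficiencies $\varepsilon_\beta$ and $\varepsilon_\gamma$,

\[
N_\beta = N_0\,\varepsilon_\beta, \qquad
N_\gamma = N_0\,\varepsilon_\gamma, \qquad
N_c = N_0\,\varepsilon_\beta\,\varepsilon_\gamma
\quad\Longrightarrow\quad
N_0 = \frac{N_\beta\,N_\gamma}{N_c},
\]

so the activity follows from the three count rates without knowing the efficiencies themselves; in practice $\varepsilon_\beta$ is varied (here by optical filtering or electronic discrimination) and the result extrapolated towards $\varepsilon_\beta \to 1$ to correct for decay-scheme effects.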
Abstract:
A crucial method for investigating patients with coronary artery disease (CAD) is the calculation of the left ventricular ejection fraction (LVEF). It is consequently imperative to estimate the value of LVEF precisely, which can be done with myocardial perfusion scintigraphy. The present study therefore aimed to establish and compare the estimation performance of the quantitative parameters of two reconstruction methods: filtered backprojection (FBP) and ordered-subset expectation maximization (OSEM). METHODS: A beating-heart phantom with known values of end-diastolic volume, end-systolic volume, and LVEF was used. Quantitative gated SPECT/quantitative perfusion SPECT software was used to obtain these quantitative parameters in a semiautomatic mode. The Butterworth filter was used in FBP, with cutoff frequencies between 0.2 and 0.8 cycles per pixel combined with orders of 5, 10, 15, and 20. Sixty-three reconstructions were performed using 2, 4, 6, 8, 10, 12, and 16 OSEM subsets, combined with several numbers of iterations: 2, 4, 6, 8, 10, 12, 16, 32, and 64. RESULTS: With FBP, the values of the end-diastolic, end-systolic, and stroke volumes rise as the cutoff frequency increases, whereas the value of LVEF diminishes. The same pattern is observed with OSEM reconstruction. However, OSEM gives a more precise estimation of the quantitative parameters, especially with the combinations of 2 iterations × 10 subsets and 2 iterations × 12 subsets. CONCLUSION: OSEM reconstruction presents better estimations of the quantitative parameters than does FBP. This study recommends the use of 2 iterations with 10 or 12 subsets for OSEM, and a cutoff frequency of 0.5 cycles per pixel with order 5, 10, or 15 for FBP, as the best estimations for left ventricular volume and ejection fraction quantification in myocardial perfusion scintigraphy.
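To make the FBP filter parameters concrete, here is a minimal sketch of a Butterworth low-pass window as commonly defined in nuclear-medicine reconstruction; the exact convention used by the software above is an assumption, and conventions differ between packages.

import numpy as np

def butterworth_window(f, cutoff=0.5, order=10):
    # One common convention: B(f) = 1 / sqrt(1 + (f/fc)^(2n)), with f in
    # cycles/pixel; cutoff (fc) and order (n) correspond to the
    # 0.2-0.8 cycles/pixel and 5-20 ranges studied above.
    return 1.0 / np.sqrt(1.0 + (f / cutoff) ** (2 * order))

# Example: windowed ramp filter prior to backprojection
f = np.linspace(0.0, 0.5, 64)            # up to Nyquist = 0.5 cycles/pixel
filt = f * butterworth_window(f, cutoff=0.5, order=10)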
Abstract:
The universal standard goniometer is an essential tool for measuring joint range of motion (ROM). In this era of technological advances and increasing smartphone use, new measurement tools are appearing as dedicated smartphone applications. This article compares the iOS application "Knee Goniometer" with the universal standard goniometer for assessing knee ROM. To our knowledge, this is the first study to use a goniometer application in a clinical context. The purpose of this study is to determine whether this application could be used in clinical practice.
Abstract:
Cannabis use among adolescents and young adults has become a major public health challenge. Several European countries are currently developing short screening instruments to identify 'problematic' forms of cannabis use in general population surveys. One such instrument is the Cannabis Use Disorders Identification Test (CUDIT), a 10-item questionnaire based on the Alcohol Use Disorders Identification Test. Previous research found that some CUDIT items did not perform well psychometrically. In the interests of improving the psychometric properties of the CUDIT, this study replaces the poorly performing items with new items that specifically address cannabis use. Analyses are based on a sub-sample of 558 recent cannabis users from a representative population sample of 5722 individuals (aged 13-32) who were surveyed in the 2007 Swiss Cannabis Monitoring Study. Four new items were added to the original CUDIT. Psychometric properties of all 14 items, as well as the dimensionality of the supplemented CUDIT were then examined using Item Response Theory. Results indicate the unidimensionality of CUDIT and an improvement in its psychometric performance when three original items (usual hours being stoned; injuries; guilt) are replaced by new ones (motives for using cannabis; missing out leisure time activities; difficulties at work/school). However, improvements were limited to cannabis users with a high problem score. For epidemiological purposes, any further revision of CUDIT should therefore include a greater number of 'easier' items.
Abstract:
We present simple procedures for the prediction of a real-valued sequence. The algorithms are based on a combination of several simple predictors. We show that if the sequence is a realization of a bounded stationary and ergodic random process, then the average of squared errors converges, almost surely, to that of the optimum, given by the Bayes predictor. We offer an analogous result for the prediction of stationary Gaussian processes.
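A minimal sketch of one standard way to combine simple predictors under squared loss, the exponentially weighted average forecaster; the abstract does not state the paper's exact aggregation scheme, so this is an illustrative stand-in with hypothetical names.

import numpy as np

def aggregate_predict(expert_preds, y, eta=0.5):
    # expert_preds: (T, K) forecasts of K simple predictors; y: (T,) targets.
    # At each step, experts are weighted by exp(-eta * cumulative squared loss).
    T, K = expert_preds.shape
    losses = np.zeros(K)
    out = np.zeros(T)
    for t in range(T):
        w = np.exp(-eta * losses)
        w /= w.sum()                               # normalised weights
        out[t] = w @ expert_preds[t]               # convex combination
        losses += (expert_preds[t] - y[t]) ** 2    # update cumulative losses
    return out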
Abstract:
This paper explores biases in the elicitation of utilities under risk and the contribution that generalizations of expected utility can make to the resolution of these biases. We used five methods to measure utilities under risk and found clear violations of expected utility. Of the theories studied, prospect theory was most consistent with our data. The main improvement of prospect theory over expected utility was in comparisons between a riskless and a risky prospect (riskless-risk methods). We observed no improvement over expected utility in comparisons between two risky prospects (risk-risk methods). An explanation of why we found no improvement of prospect theory over expected utility in risk-risk methods may be that there was less overweighting of small probabilities in our study than has commonly been observed.
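For reference, the parametric forms of Tversky and Kahneman (1992) that are commonly used when fitting prospect theory; these standard functional forms are not quoted from the paper itself:

\[
v(x) = \begin{cases} x^{\alpha} & x \ge 0 \\ -\lambda\,(-x)^{\beta} & x < 0 \end{cases},
\qquad
w(p) = \frac{p^{\gamma}}{\bigl(p^{\gamma} + (1-p)^{\gamma}\bigr)^{1/\gamma}},
\]

where $\lambda > 1$ captures loss aversion and $\gamma < 1$ yields the overweighting of small probabilities mentioned above.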
Abstract:
Purpose: To describe the evolution of retinal thickness in eyes affected with acute anterior uveitis (AAU) during follow-up and to assess its correlation with the severity of inflammatory activity in the anterior chamber. Methods: Design: prospective cohort study. Setting: institutional study. Patient population: 72 eyes (affected and fellow eyes) of 36 patients. Observation procedure: patients were followed daily until the beginning of resolution of inflammatory activity and weekly thereafter. Optical coherence tomography and laser flare photometry were performed at each visit. Treatment consisted of topical corticosteroids. Main outcome measures: retinal thickness of affected eyes, difference in retinal thickness between affected and fellow eyes and their evolution over time, and the association between maximal retinal thickness and initial laser flare photometry. Results: The difference in retinal thickness between affected and fellow eyes became significant on average seven days from baseline and remained so throughout follow-up (p<0.001). There was a steep increase in retinal thickness of affected eyes, followed by a progressive decrease after reaching a peak value. The maximal difference in retinal thickness between affected and fellow eyes was observed between 17 and 25 days from baseline and exhibited a strong positive correlation with initial laser flare photometry values (p=0.015). Conclusions: Retinal thickness in eyes affected with AAU shows a steep increase over 3 to 4 weeks and then gradually decreases. Severity of inflammation at baseline predicts the amount of retinal thickening in affected eyes. A characteristic pattern of temporal response of retinal anatomy to inflammatory stimuli seems to emerge.
Abstract:
Confidence intervals in econometric time series regressions suffer from notorious coverage problems. This is especially true when the dependence in the data is noticeable and sample sizes are small to moderate, as is often the case in empirical studies. This paper suggests using the studentized block bootstrap and discusses practical issues, such as the choice of the block size. A particular data-dependent rule is proposed to automate this choice. As a side note, it is pointed out that symmetric confidence intervals are preferred over equal-tailed ones, since they exhibit improved coverage accuracy. The improvements in small sample performance are supported by a simulation study.
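A minimal sketch of a symmetric studentized interval from a circular block bootstrap, shown for the mean of a dependent series; the block size is fixed here for illustration, whereas the paper proposes a data-dependent choice, and all names are illustrative.

import numpy as np

def block_bootstrap_ci(x, block=10, n_boot=2000, alpha=0.05, seed=None):
    # Resample whole blocks (circularly) to preserve serial dependence,
    # studentize each replicate, and form a symmetric interval from |t*|.
    rng = np.random.default_rng(seed)
    n = len(x)
    xbar = x.mean()
    se = x.std(ddof=1) / np.sqrt(n)
    tstats = np.empty(n_boot)
    for b in range(n_boot):
        starts = rng.integers(0, n, size=int(np.ceil(n / block)))
        idx = (starts[:, None] + np.arange(block)) % n   # circular blocks
        xs = x[idx.ravel()[:n]]
        se_s = xs.std(ddof=1) / np.sqrt(n)
        tstats[b] = (xs.mean() - xbar) / se_s
    q = np.quantile(np.abs(tstats), 1 - alpha)           # symmetric critical value
    return xbar - q * se, xbar + q * se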
Abstract:
This paper provides a method to estimate time-varying coefficient structural VARs which are non-recursive and potentially overidentified. The procedure allows for linear and non-linear restrictions on the parameters, maintains the multi-move structure of standard algorithms and can be used to estimate structural models with different identification restrictions. We study the transmission of monetary policy shocks and compare the results with those obtained with traditional methods.
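A heavily simplified sketch of the state-space idea behind time-varying coefficients, shown for a single equation with random-walk parameters filtered by the Kalman recursions; the paper's full non-recursive, overidentified SVAR machinery is much richer, and the noise covariances are assumed known here.

import numpy as np

def kalman_tvp(y, X, Q, R):
    # Filters y_t = X_t' b_t + e_t with b_t = b_{t-1} + u_t,
    # where Var(e_t) = R (scalar) and Var(u_t) = Q (k x k).
    T, k = X.shape
    b = np.zeros(k)
    P = np.eye(k)
    path = np.zeros((T, k))
    for t in range(T):
        P = P + Q                       # predict: coefficients drift
        x = X[t]
        S = x @ P @ x + R               # innovation variance
        K = P @ x / S                   # Kalman gain
        b = b + K * (y[t] - x @ b)      # update coefficient estimate
        P = P - np.outer(K, x @ P)
        path[t] = b
    return path                         # filtered coefficient paths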
Abstract:
In this paper we propose a general technique to develop first- and second-order closed-form approximation formulas for short-time options with random strikes. Our method is based on Malliavin calculus techniques and allows us to obtain simple closed-form approximation formulas depending on the derivative operator. The numerical analysis shows that these formulas are extremely accurate and improve some previous approaches to two-asset and three-asset spread options, such as Kirk's formula or the decomposition method presented in Alòs, Eydeland and Laurence (2011).
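For comparison, a sketch of Kirk's classical approximation for a two-asset spread call on forwards F1 and F2 with strike K, the benchmark formula mentioned above; the paper's own Malliavin-based formulas are not reproduced here.

import numpy as np
from scipy.stats import norm

def kirk_spread_call(F1, F2, K, T, r, s1, s2, rho):
    # Kirk's approximation: treat F2 + K as lognormal with an effective vol.
    w = F2 / (F2 + K)
    sk = np.sqrt(s1**2 - 2 * rho * s1 * s2 * w + (s2 * w) ** 2)
    d1 = (np.log(F1 / (F2 + K)) + 0.5 * sk**2 * T) / (sk * np.sqrt(T))
    d2 = d1 - sk * np.sqrt(T)
    return np.exp(-r * T) * (F1 * norm.cdf(d1) - (F2 + K) * norm.cdf(d2))

# Example (illustrative numbers):
# kirk_spread_call(F1=110, F2=100, K=5, T=0.5, r=0.02, s1=0.3, s2=0.25, rho=0.6)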
Abstract:
A new method of measuring joint angle using a combination of accelerometers and gyroscopes is presented. The method proposes a minimal sensor configuration with one sensor module mounted on each segment. The model is based on estimating the acceleration of the joint center of rotation by placing a pair of virtual sensors on the adjacent segments at the center of rotation. In the proposed technique, joint angles are found without the need for integration, so absolute angles can be obtained which are free from any source of drift. The model considers anatomical aspects and is personalized for each subject prior to each measurement. The method was validated by measuring the knee flexion-extension angles of eight subjects walking at three different speeds and comparing the results with a reference motion measurement system. The results are very close to those of the reference system, presenting very small errors (rms = 1.3 deg, mean = 0.2 deg, SD = 1.1 deg) and excellent correlation coefficients (0.997). The algorithm provides joint angles in real time and is ready for use in gait analysis. Technically, the system is portable, easily mountable, and can be used for long-term monitoring without hindrance to natural activities.
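As a simplified illustration of obtaining a drift-free joint angle without integrating gyroscope data, the sketch below computes knee flexion-extension as the difference of segment inclinations derived from accelerometer gravity components. This is a quasi-static simplification, not the paper's method, which additionally estimates and removes the joint-center acceleration via virtual sensors.

import numpy as np

def knee_angle_quasistatic(acc_thigh, acc_shank):
    # acc_thigh, acc_shank: (N, 2) arrays of sagittal-plane (a_x, a_z)
    # accelerometer readings on each segment. Valid only when segment
    # acceleration is small relative to gravity.
    incl_thigh = np.arctan2(acc_thigh[:, 0], acc_thigh[:, 1])
    incl_shank = np.arctan2(acc_shank[:, 0], acc_shank[:, 1])
    return np.degrees(incl_shank - incl_thigh)   # flexion-extension, in degrees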