Abstract:
Quality management is a well-developed and widely used approach within industry for gaining competitive edge and increased market share. It is a new management approach for schools, which are now applying it without the culture or experience of its evolution. Industrially based quality management systems and excellence models have been developed. These excellence models and frameworks are based on the principles and concepts of TQM, which are recognised as essential elements of high-performing organisations. Schools are complex social institutions that provide a service. As in any other service industry, the customers of education expect and demand a better service, or else they will go elsewhere. Schools are beginning to reform and change to adapt to such demands; in Ireland, this has been reflected in the Education Act, 1998. It is now the right time to develop a quality management system specifically for schools. The existing industrial excellence models have been modified for use in the private and public sectors, and some have been specifically tailored for education. The problem with such models is that they are still too sophisticated, and their language too industrial, for schools. This thesis develops an Excellence Model for second-level schools and provides guidance and school-specific tools for its implementation.
Abstract:
Background: Appropriateness criteria for nuclear imaging exams were created by the American College of Cardiology (ACC) and the American Society of Nuclear Cardiology (ASNC) to allow the rational use of tests. Little is known about whether these criteria have been followed in clinical practice. Objective: To evaluate whether the medical indications for myocardial perfusion scintigraphy (MPS) in a private nuclear medicine service of a tertiary cardiology hospital met the appropriateness criteria proposed by the American medical societies in 2005 and 2009, and to compare the level of indication under both. Methods: We included the records of 383 patients who underwent MPS from November 2008 to February 2009. Demographic characteristics, patient origin, coronary risk factors, time since medical graduation, and the appropriateness of the medical indications were studied. The criteria were evaluated by two independent physicians and, in doubtful cases, defined by a medical expert in MPS. Results: Mean age was 65 ± 12 years. Of the 367 records reviewed, 236 (64.3%) studies were performed in men and 75 (20.4%) in inpatients. Under the ACC 2005 criteria, 255 (69.5%) indications were considered appropriate and 13 (3.5%) inappropriate. Under the ACC 2009 criteria, 249 (67.8%) were considered appropriate and 13 (5.2%) inappropriate. Conclusions: We observed a high rate of adequacy of medical indications for MPS. Compared with the 2005 version, the 2009 version did not change the results.
Abstract:
Background: Physiological reflexes modulated primarily by the vagus nerve allow the heart to decelerate and accelerate rapidly after a deep inspiration followed by rapid movement of the limbs. This is the physiologically and pharmacologically validated basis for the 4-second exercise test (4sET) used to assess vagal modulation of cardiac chronotropism. Objective: To present reference data for 4sET in healthy adults. Methods: After applying strict clinical inclusion/exclusion criteria, 1,605 healthy adults (61% men) aged between 18 and 81 years were evaluated with 4sET between 1994 and 2014. Using 4sET, the cardiac vagal index (CVI) was obtained by calculating the ratio between the durations of two RR intervals in the electrocardiogram: 1) after a 4-s rapid and deep breath, immediately before pedaling, and 2) at the end of a rapid and resistance-free 4-s pedaling exercise. Results: CVI varied inversely with age (r = -0.33, p < 0.01), and the intercepts and slopes of the linear regressions between CVI and age were similar for men and women (p > 0.05). Considering the heteroscedasticity and the asymmetry of the distribution of CVI values according to age, we chose to express the reference values as percentiles for eight age groups (years): 18–30, 31–40, 41–45, 46–50, 51–55, 56–60, 61–65, and 66+, obtaining progressively lower median CVI values ranging from 1.63 to 1.24. Conclusion: The availability of CVI percentiles for different age groups should promote the clinical use of 4sET, which is a simple and safe procedure for the evaluation of vagal modulation of cardiac chronotropism.
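The ratio and the age banding described in this abstract can be sketched in a few lines of Python; this is a minimal illustration under stated assumptions (CVI taken as the pre-pedaling RR interval divided by the end-of-pedaling RR interval, age bands copied from the abstract), and the function and variable names are ours, not the study's.

```python
# Hypothetical sketch of the CVI computation; names are illustrative.
# Age bands follow the abstract; (66, None) stands for the open "66+" band.
AGE_BANDS = [(18, 30), (31, 40), (41, 45), (46, 50),
             (51, 55), (56, 60), (61, 65), (66, None)]

def cardiac_vagal_index(rr_before_ms: float, rr_after_ms: float) -> float:
    """CVI = RR interval before pedaling / RR interval at end of pedaling."""
    return rr_before_ms / rr_after_ms

def age_band(age: int):
    """Return the reference age band used to look up CVI percentiles."""
    for lo, hi in AGE_BANDS:
        if lo <= age and (hi is None or age <= hi):
            return (lo, hi)
    raise ValueError("age outside the 18+ reference range")
```

A vagally intact subject whose RR interval shortens during pedaling yields a CVI above 1, consistent with the median values of 1.63 down to 1.24 reported above.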
Abstract:
Introduction: Coronary computed tomography angiography (CCTA) allows non-invasive coronary artery disease (CAD) phenotyping. There are still some uncertainties regarding the impact this knowledge has on the clinical care of patients. Objective: To determine whether CAD phenotyping by CCTA influences clinical decision making, through the prescription of cardiovascular drugs and their impact on non-LDL cholesterol (NLDLC) levels. Methods: We analysed consecutive patients submitted to CCTA from 2008 to 2011, without a previous diagnosis of CAD, who had two serial measures of NLDLC, one up to 3 months before CCTA and the second from 3 to 6 months after. Results: A total of 97 patients were included, of whom 69% were men; mean age was 64 ± 12 years. CCTA revealed that 18 (18%) patients had no CAD, 38 (39%) had non-obstructive (< 50%) lesions, and 41 (42%) had at least one obstructive (≥ 50%) lesion. NLDLC was similar at baseline between the groups (138 ± 52 mg/dL vs. 135 ± 42 mg/dL vs. 131 ± 44 mg/dL, respectively, p = 0.32). We found a significant reduction in NLDLC among patients with obstructive lesions (-18%, p = 0.001). We also found a positive relationship between intensification of clinical treatment with aspirin and cholesterol-reducing drugs and the severity of CAD. Conclusion: Our data suggest that CCTA results were used to titrate cardiovascular clinical treatment, with particular intensification seen in patients with obstructive (≥ 50%) CAD.
Abstract:
Abstract Background: There are sparse data on the performance of different types of drug-eluting stents (DES) in acute, real-life settings. Objective: The aim of the study was to compare the safety and efficacy of first- versus second-generation DES in patients with acute coronary syndromes (ACS). Methods: This all-comer registry enrolled consecutive patients diagnosed with ACS and treated with percutaneous coronary intervention with implantation of first- or second-generation DES, with one-year follow-up. The primary efficacy endpoint was defined as major adverse cardiac and cerebrovascular events (MACCE), a composite of all-cause death, nonfatal myocardial infarction, target-vessel revascularization, and stroke. The primary safety outcome was definite stent thrombosis (ST) at one year. Results: Of the total of 1916 patients enrolled in the registry, 1328 were diagnosed with ACS. Of these, 426 were treated with first- and 902 with second-generation DES. There was no significant difference in the incidence of MACCE between the two types of DES at one year. The rate of acute and subacute ST was higher with first- vs. second-generation DES (1.6% vs. 0.1%, p < 0.001, and 1.2% vs. 0.2%, p = 0.025, respectively), but there was no difference regarding late ST (0.7% vs. 0.2%, p = 0.18) or gastrointestinal bleeding (2.1% vs. 1.1%, p = 0.21). In Cox regression, first-generation DES was an independent predictor of cumulative ST (HR 3.29 [1.30-8.31], p = 0.01). Conclusions: In an all-comer registry of ACS, the one-year rate of MACCE was comparable between groups treated with first- and second-generation DES. The use of first-generation DES was associated with higher rates of acute and subacute ST and was an independent predictor of cumulative ST.
Abstract:
We review recent likelihood-based approaches to modeling the demand for medical care. A semi-nonparametric model along the lines of Cameron and Johansson's Poisson polynomial model, but using a negative binomial baseline model, is introduced. We apply these models, as well as semiparametric Poisson, hurdle semiparametric Poisson, and finite mixtures of negative binomial models, to six measures of health care usage taken from the Medical Expenditure Panel Survey. We conclude that most of the models lead to statistically similar results, both in terms of information criteria and of conditional and unconditional prediction. This suggests that applied researchers may not need to be overly concerned with the choice among these models when analyzing data on health care demand.
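As a small, hedged illustration of the negative binomial baseline mentioned above, the sketch below implements the standard NB2 log-likelihood (mean mu, overdispersion alpha, so that Var = mu + alpha·mu²). The parameterization is the conventional NB2 one, not taken from the paper, and the function names are ours.

```python
import math

def negbin_logpmf(y: int, mu: float, alpha: float) -> float:
    """Log P(Y = y) under an NB2 negative binomial model.

    mu is the conditional mean, alpha the overdispersion parameter
    (Var = mu + alpha * mu**2); r = 1/alpha is the gamma "size".
    """
    r = 1.0 / alpha
    return (math.lgamma(y + r) - math.lgamma(r) - math.lgamma(y + 1)
            + r * math.log(r / (r + mu))
            + y * math.log(mu / (r + mu)))

def negbin_loglik(counts, mu: float, alpha: float) -> float:
    """Sample log-likelihood: the quantity maximized when fitting the model."""
    return sum(negbin_logpmf(y, mu, alpha) for y in counts)
```

As alpha shrinks toward zero the NB2 model approaches the Poisson, which is why the negative binomial serves naturally as an overdispersed baseline for count data such as doctor visits.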
Abstract:
Species distribution models (SDMs) are increasingly used to predict environmentally induced range shifts in the habitats of plant and animal species. Consequently, SDMs are valuable tools for scientifically based conservation decisions. The aims of this paper are (1) to identify important drivers of butterfly species persistence or extinction, and (2) to analyse the responses of endangered butterfly species of dry grasslands and wetlands to likely future landscape changes in Switzerland. Future land use was represented by four scenarios describing: (1) ongoing land use changes as observed at the end of the last century; (2) a liberalisation of the agricultural markets; (3) slightly lowered agricultural production; and (4) strongly lowered agricultural production. Two model approaches were applied. The first (logistic regression with principal components) explains which environmental variables have a significant impact on species presence (and absence). The second (predictive SDM) is used to project species distributions under current and likely future land uses. The explanatory analyses reveal that four principal components related to urbanisation, abandonment of open land, and intensive agricultural practices, as well as two climate parameters, are the primary drivers of species occurrence (decline). The scenario analyses show that lowered agricultural production is likely to favour dry grassland species owing to an increase in non-intensively used land, open-canopy forests, and overgrown areas. In the liberalisation scenario, dry grassland species show a decrease in abundance owing to a strong increase in forested patches. Wetland butterfly species would decrease under all four scenarios as their habitats become overgrown.
Abstract:
Critically ill patients depend on artificial nutrition for the maintenance of their metabolic functions and lean body mass, as well as for limiting underfeeding-related complications. Current guidelines recommend enteral nutrition (EN), possibly within the first 48 hours, as the best way to provide nutrients and prevent infections. EN may be difficult to implement or may be contraindicated in some patients, such as those presenting anatomic intestinal continuity problems or splanchnic ischemia. A series of contradictory trials regarding the best route and timing of feeding has left the medical community with great uncertainty regarding the place of parenteral nutrition (PN) in critically ill patients. Many of the deleterious effects attributed to PN result from inadequate indications or from overfeeding. The latter is due firstly to the easier delivery of nutrients by PN compared with EN, which increases the risk of overfeeding, and secondly to the use of approximate energy targets, generally based on predictive equations: these equations are static and inaccurate in about 70% of patients. Such high uncertainty about requirements compromises attempts at conducting nutrition trials without indirect calorimetry support, because the results cannot be trusted; indeed, underfeeding and overfeeding are equally deleterious. Individualized therapy is required. A pragmatic approach to feeding is proposed: first, attempt EN whenever and as early as possible; then use indirect calorimetry if available, and monitor delivery and response to feeding; finally, consider combining EN with PN in case of insufficient EN from day 4 onwards.
Abstract:
Many studies, based on either an experimental or an epidemiological approach, have shown that the ability to drive is impaired when the driver is under the influence of cannabis. The baseline performance of heavy users remains impaired even after several weeks of abstinence. Symptoms of cannabis abuse and dependence are generally considered incompatible with safe driving. Recently, it has been shown that traffic safety can be increased by reporting long-term unfit drivers to the driver licensing authorities and referring the cases for further medical assessment. Evaluation of the frequency of cannabis use is a prerequisite for a reliable medical assessment of fitness to drive. In a previous paper we advocated the use of two thresholds based on the 11-nor-9-carboxy-Δ9-tetrahydrocannabinol (THCCOOH) concentration in whole blood to help distinguish occasional cannabis users (≤ 3 μg/L) from heavy regular smokers (≥ 40 μg/L). These criteria were established on the basis of results obtained in a placebo-controlled cannabis smoking study carried out with two groups of young male volunteers; the first group was characterized by heavy use (≥ 10 joints/month), while the second was made up of occasional users smoking at most 1 joint/week. However, to date, these cutoffs have not been adequately assessed under real conditions. Their validity can now be evaluated and confirmed with 146 real cases of traffic offenders in which the whole-blood cannabinoid concentrations and the frequency of cannabis use are known. The two thresholds were not challenged by the presence of ethanol (40% of cases) or of other therapeutic and illegal drugs (24%). Thus, we propose the following procedure, which can be very useful in the Swiss context but also in other countries with similar traffic policies: if the whole-blood THCCOOH concentration is higher than 40 μg/L, traffic offenders must be directed first and foremost toward medical assessment of their fitness to drive. This evaluation is not recommended if the THCCOOH concentration is lower than 3 μg/L and the self-rated frequency of cannabis use is less than once a week. A THCCOOH level between these two thresholds cannot be reliably interpreted; in such cases, further medical assessment and follow-up of fitness to drive are also suggested, but with lower priority.
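The two-threshold procedure proposed above is simple enough to state as code. The sketch below follows the decision logic of the abstract (cutoffs of 3 and 40 μg/L whole-blood THCCOOH plus the self-rated frequency); the function name and return labels are our own, illustrative choices.

```python
# Hypothetical encoding of the two-threshold rule; labels are illustrative.
def assess_fitness_to_drive(thccooh_ug_l: float,
                            uses_less_than_weekly: bool) -> str:
    """Map a whole-blood THCCOOH level to the suggested follow-up."""
    if thccooh_ug_l > 40:
        # Above the heavy-regular-smoker threshold: priority referral.
        return "priority medical assessment"
    if thccooh_ug_l < 3 and uses_less_than_weekly:
        # Occasional-user range with low self-rated use: no assessment.
        return "no assessment recommended"
    # Between the thresholds the level cannot be reliably interpreted.
    return "assessment and follow-up, lower priority"
```

Note that the intermediate band is deliberately conservative: a level between 3 and 40 μg/L, or a low level with higher self-rated use, still triggers follow-up, only at lower priority.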
Abstract:
BACKGROUND: The number of requests to pre-hospital emergency medical services (PEMS) has increased in Europe over the last 20 years, but the epidemiology of PEMS interventions has been little investigated. The aim of this analysis was to describe time trends in PEMS activity in a region of western Switzerland. METHODS: We used data routinely and prospectively collected for PEMS interventions in the Canton of Vaud, Switzerland, from 2001 to 2010. This Swiss canton comprises approximately 10% of the whole Swiss population. RESULTS: We observed a 40% increase in the number of requests to PEMS between 2001 and 2010. The overall rate of requests was 35/1000 inhabitants for ambulance services and 10/1000 for medical interventions (SMUR), with the highest rates among people aged ≥ 80. The most frequent reasons for intervention were medical problems, predominantly unconsciousness, chest pain, respiratory distress, or cardiac arrest, whereas severe trauma interventions decreased over time. Overall, 89% of patients were alive after 48 h. The 48-h survival rate increased steadily for cardiac arrest and myocardial infarction. CONCLUSION: Routine prospective data collection on prehospital emergency interventions and monitoring of activity proved feasible over time. Our results add to the understanding of the determinants of PEMS use and need to be considered when planning the use of emergency health services in the near future. A more comprehensive analysis of the quality of services and of patient safety, supported by indicators, is also required and might help to develop prehospital emergency services and new processes of care.
Abstract:
Background/Aims: Cognitive dysfunction after medical treatment is increasingly being recognized. Studies on this topic require repeated cognitive testing within a short time. However, with repeated testing, practice effects must be expected. We quantified practice effects in a demographically corrected summary score of a neuropsychological test battery repeatedly administered to healthy elderly volunteers. Methods: The Consortium to Establish a Registry for Alzheimer's Disease (CERAD) Neuropsychological Assessment Battery (for which a demographically corrected summary score was developed), phonemic fluency tests, and trail-making tests were administered in healthy volunteers aged 65 years or older on days 0, 7, and 90. This battery allows calculation of a demographically adjusted continuous summary score. Results: Significant practice effects were observed in the CERAD total score and in the word list (learning and recall) subtest. Based on these volunteer data, we developed a threshold for diagnosis of postoperative cognitive dysfunction (POCD) with the CERAD total score. Conclusion: Practice effects with repeated administration of neuropsychological tests must be accounted for in the interpretation of such tests. Ignoring practice effects may lead to an underestimation of POCD. The usefulness of the proposed demographically adjusted continuous score for cognitive function will have to be tested prospectively in patients.
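A threshold that corrects for practice effects, as developed in this study, can be sketched in the spirit of a reliable change index (RCI). The abstract does not report its exact formula, so the mean practice effect, the SD of the test-retest difference, and the z-cutoff below are illustrative assumptions, and the function names are ours.

```python
# Hedged sketch of a practice-effect-adjusted change threshold (RCI-style);
# all parameters below are illustrative, not the study's published values.
def reliable_change_index(score_pre: float, score_post: float,
                          mean_practice_effect: float,
                          sd_diff: float) -> float:
    """Standardized change score corrected for the expected practice gain."""
    return (score_post - score_pre - mean_practice_effect) / sd_diff

def suggests_pocd(score_pre: float, score_post: float,
                  mean_practice_effect: float, sd_diff: float,
                  z_cutoff: float = -1.96) -> bool:
    """Flag a decline larger than expected from retesting alone."""
    z = reliable_change_index(score_pre, score_post,
                              mean_practice_effect, sd_diff)
    return z <= z_cutoff
```

The key point made in the abstract is visible in the formula: without subtracting the mean practice effect, a patient whose score merely fails to improve would look stable, and true postoperative decline would be underestimated.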
Abstract:
INTRODUCTION: Assessing motivation for change is deemed an important step in the treatment process that allows further refinement of the intervention in motivational interviewing (MI) and brief MI (BMI) adaptations. During MI (and BMI) sessions, motivation for change is expressed by the client as "change talk", i.e. all statements inclined toward or away from change. We tested the predictive validity of the Change Questionnaire, a 12-item instrument assessing motivation to change, with respect to hazardous tobacco and alcohol use. METHODS: As part of the baseline measurements for a randomized controlled trial of multi-substance BMI at the Lausanne recruitment center (army conscription is mandatory in Switzerland for males at age 20, and thus provides a unique opportunity to address a non-clinical and largely representative sample of young men), 213 participants completed the questionnaire on tobacco and 95 on alcohol, and were followed up six months later. The overall Change Questionnaire score and its six subscales (Desire, Ability, Reasons, Need, Commitment, and Taking steps) were used as predictors of hazardous tobacco use (defined as daily smoking) and hazardous alcohol use (defined as more than one occasion with six standard drinks or more per month, and/or more than 21 standard drinks per week) in bivariate logistic regression models at follow-up. RESULTS: Higher overall Change scores were significant predictors of decreased risk of hazardous tobacco (odds ratio [OR] = 0.83, p = 0.046) and alcohol (OR = 0.76, p = 0.03) use. Several sub-dimensions were associated with the outcomes in bivariate analyses. Using a principal components analysis to reduce the number of predictors for multivariate models, we obtained two components. The first, 'Ability to change', was strongly related to change in hazardous tobacco use (OR = 0.54, p < 0.001); the second, which we interpreted as 'Other change language dimensions', was significantly related to change in hazardous alcohol use (OR = 0.81, p = 0.05). CONCLUSIONS: The present findings lend initial support to the predictive validity of the Change Questionnaire for hazardous tobacco and alcohol use, making it an interesting and potentially useful tool for assessing motivation to change among young males.