1000 results for impact of quotas
Abstract:
PURPOSE: Adequate empirical antibiotic dose selection for critically ill burn patients is difficult due to extreme variability in drug pharmacokinetics. Therapeutic drug monitoring (TDM) may aid antibiotic prescription and implementation of initial empirical antimicrobial dosage recommendations. This study evaluated how gradual TDM introduction altered empirical dosages of meropenem and imipenem/cilastatin in our burn ICU. METHODS: Imipenem/cilastatin and meropenem use and daily empirical dosage at a five-bed burn ICU were analyzed retrospectively. Data for all burn admissions between 2001 and 2011 were extracted from the hospital's computerized information system. For each patient receiving a carbapenem, episodes of infection were reviewed and scored according to predefined criteria. Carbapenem trough serum levels were characterized. Prior to May 2007, TDM was available only by special request. Real-time carbapenem TDM was introduced in June 2007; it was initially available weekly and has been available 4 days a week since 2010. RESULTS: Of 365 patients, 229 (63%) received antibiotics (109 received carbapenems). Of 23 TDM determinations for imipenem/cilastatin, none exceeded the predefined upper limit and 11 (47.8%) were insufficient; the number of TDM requests was correlated with daily dose (r=0.7). Similar numbers of inappropriate meropenem trough levels (30.4%) were below and above the upper limit. Real-time TDM introduction increased the empirical dose of imipenem/cilastatin, but not meropenem. CONCLUSIONS: Real-time carbapenem TDM availability significantly altered the empirical daily dosage of imipenem/cilastatin at our burn ICU. Further studies are needed to evaluate the individual impact of TDM-based antibiotic adjustment on infection outcomes in these patients.
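To illustrate the correlation statistic reported above (number of TDM requests vs. empirical daily dose, r = 0.7), a minimal Python sketch follows; the counts and doses are hypothetical placeholders, not study data.

from scipy.stats import pearsonr

# Hypothetical yearly TDM request counts and mean empirical daily doses (g/day)
tdm_requests = [2, 4, 7, 11, 15, 19, 24]
daily_dose_g = [2.0, 2.0, 2.5, 3.0, 3.0, 3.5, 4.0]

# Pearson correlation, the statistic behind the reported r = 0.7
r, p_value = pearsonr(tdm_requests, daily_dose_g)
print(f"Pearson r = {r:.2f}, p = {p_value:.3f}")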
Abstract:
[spa] Spain was not admitted to the European Economic Community during the Franco regime, for political reasons. Joining the European Community in January 1986 was the final step toward the definitive consolidation of democracy in Spain and the opening of its economy. The results of twenty-five years of EU membership have translated into an unprecedented impetus for modernization and progress. Spain adopted the "Acervo Comunitario" (the Community acquis) and drew considerable benefits from its integration into the Community: eliminating barriers, following the common policies, receiving European funds, and adopting the common European currency. From a level of 60% of the average European per capita income in 1986, the current level, even with the crisis that erupted in 2008, stands at around 105 percent. The last three years have been different and difficult as a consequence of the severe economic and financial crisis. In this context, this paper analyzes how successive Spanish governments have organized economic governance to adapt it to the quantitative and qualitative changes that have occurred in European integration.
Abstract:
Maximum entropy modeling (Maxent) is a widely used algorithm for predicting species distributions across space and time. Properly assessing the uncertainty in such predictions is non-trivial and requires validation with independent datasets. Notably, model complexity (the number of model parameters) remains a major concern in relation to overfitting and, hence, the transferability of Maxent models. An emerging approach is to validate the cross-temporal transferability of model predictions using paleoecological data. In this study, we assess the effect of model complexity on the performance of Maxent projections across time, using two European plant species (Alnus glutinosa (L.) Gaertn. and Corylus avellana L.) with an extensive late Quaternary fossil record in Spain as a case study. We fit 110 models with different levels of complexity under present-day conditions and tested model performance using AUC (area under the receiver operating characteristic curve) and AICc (corrected Akaike Information Criterion) through the standard procedure of randomly partitioning current occurrence data. We then compared these results to an independent validation by projecting the models to mid-Holocene (6000 years before present) climatic conditions in Spain to assess their ability to predict fossil pollen presence-absence and abundance. We find that calibrating Maxent models with default settings results in overly complex models. While model performance increased with model complexity when predicting current distributions, it was highest at intermediate complexity when predicting mid-Holocene distributions. Hence, models of intermediate complexity offered the best trade-off for predicting species distributions across time. Reliable temporal model transferability is especially relevant for forecasting species distributions under future climate change. Consequently, species-specific model tuning should be used to find the best modeling settings to control for complexity, notably with paleoecological data to independently validate model projections. For cross-temporal projections of species distributions where paleoecological data are not available, models of intermediate complexity should be selected.
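The AICc-based complexity control described above can be made concrete with a short Python sketch. The formulas are the standard ones (AIC = 2k - 2 ln L; AICc = AIC + 2k(k+1)/(n-k-1)); the log-likelihoods, parameter counts, and hold-out labels below are hypothetical, not the study's fitted Maxent models.

import numpy as np
from sklearn.metrics import roc_auc_score

def aicc(log_likelihood: float, k: int, n: int) -> float:
    # Corrected Akaike Information Criterion: AIC plus a small-sample penalty
    aic = 2 * k - 2 * log_likelihood
    return aic + (2 * k * (k + 1)) / (n - k - 1)

# Hypothetical candidate models: (log-likelihood, number of parameters)
candidates = {"simple": (-310.0, 4), "intermediate": (-290.0, 12), "complex": (-285.0, 40)}
n_occurrences = 120  # hypothetical number of occurrence records

for name, (ll, k) in candidates.items():
    print(name, round(aicc(ll, k, n_occurrences), 1))

# AUC on a random hold-out partition of occurrence data (hypothetical values)
y_true = np.array([1, 1, 0, 1, 0, 0, 1, 0])
y_score = np.array([0.9, 0.7, 0.4, 0.8, 0.3, 0.5, 0.6, 0.2])
print("AUC =", roc_auc_score(y_true, y_score))

Under these toy numbers the complex model wins on raw likelihood but the intermediate one wins on AICc, mirroring the trade-off the study reports.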
Abstract:
Introduction: Frequent emergency department (ED) users are often vulnerable patients with many risk factors affecting their quality of life (QoL). The aim of this study was to examine to what extent a case management intervention improved frequent ED users' QoL. Methods: Data were part of a randomized controlled trial designed to improve frequent ED users' QoL at Lausanne University Hospital. A total of 194 frequent ED users (≥5 attendances during the previous 12 months; ≥18 years of age) were randomly assigned to the control or the intervention group. Participants in the intervention group received a case management intervention (i.e., counseling and assistance concerning social determinants of health, substance-use disorders, and access to the health-care system). QoL was evaluated using the WHOQOL-BREF at baseline and twelve months later. Four dimensions of QoL were retained: physical health, psychological health, social relationships, and environment, with scores ranging from 0 (low QoL) to 100 (high QoL).
Abstract:
Clinical simulation, as a training and learning method, allows participants to experience the representation of a real event with the aim of acquiring knowledge, skills, and attitudes. Filming the scenario is a useful tool for reviewing the decisions taken and the actions performed, in order to highlight strengths, weaknesses, and areas for improvement. This article describes a study carried out by a group of second-year nursing students that evaluates whether being filmed during clinical simulation influences performance, and in particular whether it leads to more errors.
Abstract:
PURPOSE: Laparoscopic surgery presents specific challenges, such as the reduction of a three-dimensional anatomic environment to two dimensions. The aim of this study was to investigate the impact of the loss of the third dimension on laparoscopic virtual reality (VR) performance. METHODS: We compared a group of examinees with impaired stereopsis (group 1, n = 28) to a group with accurate stereopsis (group 2, n = 29). The primary outcome was the difference between the mean total score (MTS) of all tasks taken together and the performance in task 3 (eye-hand coordination), which was a priori considered to be the most dependent on intact stereopsis. RESULTS: The MTS and performance in task 3 tended to be slightly, but not significantly, better in group 2 than in group 1 [MTS: -0.12 (95 % CI -0.32, 0.08; p = 0.234); task 3: -0.09 (95 % CI -0.29, 0.11; p = 0.385)]. The difference in MTS between group 2 under simulated impaired stereopsis (an eye patch attached to the non-dominant eye in the second run) and the first run of group 1 was not significant (MTS: p = 0.981; task 3: p = 0.527). CONCLUSION: We were unable to demonstrate an impact of examinees' impaired stereopsis on laparoscopic VR performance. Individuals with accurate stereopsis seem to be able to compensate for the loss of the third dimension in laparoscopic VR simulations.
Abstract:
A smoke-free law came into effect in Spain on 1 January 2006, affecting all enclosed workplaces except hospitality venues, whose proprietors can choose among a totally smoke-free policy, a partial restriction with designated smoking areas, or no restriction on smoking on the premises. We aimed to evaluate the impact of the law among hospitality workers by assessing second-hand smoke (SHS) exposure and the frequency of respiratory symptoms before and one year after the ban.
Abstract:
The impact of round-the-clock cerebrospinal fluid (CSF) Gram staining on overnight empirical therapy for suspected central nervous system (CNS) infections was investigated. All consecutive overnight CSF Gram stains between 2006 and 2011 were included. The impact of a positive or a negative test on empirical therapy was evaluated and compared to other clinical and biological indications based on institutional guidelines. Bacterial CNS infection was documented in 51/241 suspected cases. The overnight CSF Gram stain was positive in 24/51. Upon validation, there were two false-positive and one false-negative results. The sensitivity and specificity were 41% and 99%, respectively. All but one patient had indications for empirical therapy other than the Gram stain alone. Upon obtaining the Gram result, empirical therapy was modified in 7/24 cases, including the addition of an appropriate agent (1), the addition of unnecessary agents (3), and the simplification of unnecessary combination therapy (3/11). Among 74 cases with a negative CSF Gram stain and without a formal indication for empirical therapy, antibiotics were withheld in only 29. Round-the-clock CSF Gram staining had a low impact on overnight empirical therapy for suspected CNS infections and was associated with several misinterpretation errors. Clinicians showed little confidence in CSF direct examination for simplifying or withholding therapy before definitive microbiological results.
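The sensitivity and specificity quoted above follow from the standard 2x2 definitions; the sketch below reconstructs counts consistent with the reported 41% and 99% (51 documented infections, 190 suspected cases without documented infection, 2 false positives). These counts are an assumption for illustration, not the study's raw table.

def sensitivity(tp: int, fn: int) -> float:
    # Fraction of documented infections detected by the Gram stain
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    # Fraction of non-infected cases with a negative Gram stain
    return tn / (tn + fp)

# Reconstructed counts (assumption): 21/51 ≈ 41%, 188/190 ≈ 99%
tp, fn = 21, 30
tn, fp = 188, 2

print(f"sensitivity = {sensitivity(tp, fn):.0%}")  # 41%
print(f"specificity = {specificity(tn, fp):.0%}")  # 99%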
Abstract:
AIMS: Published incidences of acute mountain sickness (AMS) vary widely. The reasons for this variation, and the predictive factors of AMS, are not well understood. We aimed to identify predictive factors associated with the occurrence of AMS and to test the hypothesis that study design is an independent predictive factor of AMS incidence. We performed a systematic search (Medline, bibliographies) for relevant articles in English or French, up to April 28, 2013. Studies of any design reporting on AMS incidence in humans without prophylaxis were selected. Data on incidence and potential predictive factors were extracted by two reviewers and cross-checked by four reviewers. Associations between predictive factors and AMS incidence were sought through bivariate and multivariate analyses for each study design separately. The association between AMS incidence and study design was assessed using multiple linear regression. RESULTS: We extracted data on 53,603 subjects from 34 randomized controlled trials, 44 cohort studies, and 33 cross-sectional studies. In randomized trials, the median AMS incidence without prophylaxis was 60% (range, 16%-100%); mode of ascent and population were significantly associated with AMS incidence. In cohort studies, the median AMS incidence was 51% (0%-100%); geographical location was significantly associated with AMS incidence. In cross-sectional studies, the median AMS incidence was 32% (0%-68%); mode of ascent and maximum altitude were significantly associated with AMS incidence. In a multivariate analysis, study design (p=0.012), mode of ascent (p=0.003), maximum altitude (p<0.001), population (p=0.002), and geographical location (p<0.001) were significantly associated with AMS incidence. Age, sex, speed of ascent, duration of exposure, and history of AMS were inconsistently reported and therefore not analyzed further. CONCLUSIONS: Reported incidences and identifiable predictive factors of AMS depend on study design.
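The multivariate step described above (AMS incidence regressed on study design and other factors) can be sketched with statsmodels; the study-level rows below are hypothetical placeholders, not the review's extracted data, and the variable names are illustrative.

import pandas as pd
import statsmodels.formula.api as smf

# Toy study-level dataset: one row per study (all values hypothetical)
df = pd.DataFrame({
    "incidence": [0.60, 0.51, 0.32, 0.45, 0.70, 0.25, 0.55, 0.38],
    "design": ["rct", "cohort", "cross_sectional", "cohort",
               "rct", "cross_sectional", "rct", "cohort"],
    "max_altitude_m": [4500, 5200, 3800, 4900, 5500, 3500, 5000, 4200],
    "ascent": ["active", "passive", "active", "active",
               "passive", "active", "passive", "active"],
})

# Multiple linear regression of AMS incidence on candidate predictive factors
model = smf.ols("incidence ~ C(design) + max_altitude_m + C(ascent)", data=df).fit()
print(model.summary())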
Abstract:
Clinical trials today are conducted in multiple countries to enhance patient recruitment and improve trial efficiency. However, demographic and cultural diversity may contribute to variations in study outcomes. Here we conducted post-hoc analyses of a placebo-controlled study of ziprasidone and haloperidol for the treatment of acute mania to address the demographic, dosing, and outcome disparities in India, Russia, and the USA. We compared the baseline characteristics, outcomes, and discontinuations in patients and explored the relationship between the outcome measures across these countries. We found substantial differences in subjects' baseline characteristics, administered dosage, and disease severity in India compared to the USA and Russia. Conversely, US subjects had a higher placebo response compared to subjects in Russia and India. These results are probably due to demographic differences in patient populations and psychiatric clinical practice across countries. While we offer initial ideas to address the disparities identified in this analysis, it is clear that further research to improve our understanding of geographical differences is essential to ensure globally applicable results for clinical trials in psychiatry.
Abstract:
BACKGROUND: The impact of early treatment with immunomodulators (IM) and/or TNF antagonists on bowel damage in Crohn's disease (CD) patients is unknown. AIM: To assess whether 'early treatment' with IM and/or TNF antagonists, defined as treatment initiation within 2 years of the date of CD diagnosis, was associated with the development of fewer disease complications compared to 'late treatment', defined as treatment initiation more than 2 years after CD diagnosis. METHODS: Data from the Swiss IBD Cohort Study were analysed. The following outcomes were assessed using Cox proportional hazards modelling: bowel strictures, perianal fistulas, internal fistulas, intestinal surgery, perianal surgery, and any of the aforementioned complications. RESULTS: The 'early treatment' group of 292 CD patients was compared to the 'late treatment' group of 248 CD patients. We found that 'early treatment' with IM or TNF antagonists alone was associated with a reduced risk of bowel strictures [hazard ratio (HR) 0.496, P = 0.004 for IM; HR 0.276, P = 0.018 for TNF antagonists]. Furthermore, 'early treatment' with IM was associated with a reduced risk of undergoing intestinal surgery (HR 0.322, P = 0.005) and perianal surgery (HR 0.361, P = 0.042), as well as of developing any complication (HR 0.567, P = 0.006). CONCLUSIONS: Treatment with immunomodulators or TNF antagonists within the first 2 years of CD diagnosis was associated with a reduced risk of developing bowel strictures, compared to initiating these drugs more than 2 years after diagnosis. Furthermore, early immunomodulator treatment was associated with a reduced risk of intestinal surgery, perianal surgery, and any complication.
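A minimal sketch of the Cox proportional hazards analysis named above, using the lifelines library; the patient rows and column names are hypothetical, not Swiss IBD Cohort Study data.

import pandas as pd
from lifelines import CoxPHFitter

# Toy patient-level dataset (hypothetical): follow-up time to bowel stricture
# in months, an event indicator, and an early-treatment flag (1 = IM/TNF
# antagonist started within 2 years of diagnosis)
df = pd.DataFrame({
    "months_to_stricture": [12, 30, 45, 8, 60, 24, 50, 15],
    "stricture": [1, 0, 1, 1, 0, 1, 0, 0],
    "early_treatment": [0, 1, 1, 0, 1, 0, 1, 0],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="months_to_stricture", event_col="stricture")
cph.print_summary()  # exp(coef) for early_treatment is the hazard ratio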