979 results for 30-DAY
Abstract:
Aims: The recent availability of the novel oral anticoagulants (NOACs) may have led to a change in the anticoagulation regimens of patients referred for catheter ablation of atrial fibrillation (AF). Preliminary data exist for dabigatran, but information on the safety and efficacy of rivaroxaban in this setting is currently scarce. Methods and results: Of the 556 consecutive eligible patients (age 61.0 ± 9.6 years; 74.6% men; 61.2% paroxysmal AF) undergoing AF catheter ablation in our centre (October 2012 to September 2013) and enrolled in a systematic, standardized 30-day follow-up, 192 were on vitamin K antagonists (VKAs), 188 on rivaroxaban, and 176 on dabigatran. Peri-procedural mortality and significant systemic or pulmonary thromboembolism (efficacy outcome), as well as bleeding events (safety outcome), during the 30 days following ablation were evaluated according to anticoagulation regimen. Over this 12-month interval, use of the NOACs in this population rose from <10% to 70%. Overall, event rates were low, with no significant differences between regimens: thrombo-embolic events occurred in 1.3% (VKA 2.1%; rivaroxaban 1.1%; dabigatran 0.6%; P = 0.410), major bleeding in 2.3% (VKA 4.2%; rivaroxaban 1.6%; dabigatran 1.1%; P = 0.112), and minor bleeding in 1.4% (VKA 2.1%; rivaroxaban 1.6%; dabigatran 0.6%; P = 0.464). No fatal events were observed. Conclusion: The use of NOACs in patients undergoing catheter ablation of AF evolved rapidly (seven-fold) over 1 year. These preliminary data suggest that rivaroxaban and dabigatran are effective and safe in the setting of AF catheter ablation, compared with traditional VKAs.
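A minimal sketch of the kind of between-regimen event-rate comparison summarized above. The event counts are back-calculated from the quoted percentages and group sizes, so they are approximations rather than the study's raw data, and with counts this small a Fisher's exact test would often be preferred over the chi-squared test shown here.

# Approximate re-creation of the 30-day thrombo-embolic comparison.
from scipy.stats import chi2_contingency

groups = {"VKA": (192, 4), "rivaroxaban": (188, 2), "dabigatran": (176, 1)}  # (n, events), back-calculated

# Build a 2 x 3 contingency table: rows = event / no event, columns = regimen.
events = [e for (_, e) in groups.values()]
no_events = [n - e for (n, e) in groups.values()]

chi2, p, dof, _ = chi2_contingency([events, no_events])
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")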
Abstract:
We present in situ microelectrode measurements of sediment formation factor and porewater oxygen and pH from six stations in the North Atlantic ranging in depth from 2159 to 5380 m. A numerical model of the oxygen data indicates that fluxes of oxygen to the sediments are as much as an order of magnitude higher than benthic chamber flux measurements previously reported in the same area. Model results require dissolution driven by metabolic CO2 production within the sediments to explain the pH data; even at the station with the most undersaturated bottom waters, >60% of the calcite dissolution occurs in response to metabolic CO2. Aragonite dissolution alone cannot provide the observed buffering of porewater pH, even at the shallowest station. A sensitivity test of the model that accounts for uncertainties in the bottom water saturation state and in the stoichiometry between oxygen consumption and CO2 production during respiration constrains the dissolution rate constant for calcite to between 3 and 30% day⁻¹, in agreement with earlier in situ determinations of the rate constant. Model results predict that over 35% of the calcium carbonate rain to these sediments dissolves at all stations, a prediction confirmed by sediment trap and CaCO3 accumulation data.
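For orientation, the calcite dissolution kinetics referred to here are commonly written as a rate law of the form below; the reaction order and the exact formulation used in the study's porewater model may differ, and the quoted 3 to 30% day⁻¹ range corresponds to a rate constant of roughly 0.03 to 0.30 day⁻¹.

\[
R_{\mathrm{diss}} = k_c \,\bigl(1 - \Omega_c\bigr)^{n},
\qquad
\Omega_c = \frac{[\mathrm{Ca}^{2+}]\,[\mathrm{CO}_3^{2-}]}{K'_{\mathrm{sp}}},
\qquad
k_c \approx 0.03\ \text{to}\ 0.30~\mathrm{day}^{-1}
\]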
Abstract:
Orthotopic liver retransplantation (re-OLT) is highly controversial. The objectives of this study were to determine the validity of a recently developed United Network for Organ Sharing (UNOS) multivariate model using an independent cohort of patients undergoing re-OLT outside the United States, to determine whether incorporation of other variables that were incomplete in the UNOS registry would provide additional prognostic information, to develop new models combining data sets from both cohorts, and to evaluate the validity of the Model for End-Stage Liver Disease (MELD) in patients undergoing re-OLT. Two hundred eighty-one adult patients undergoing re-OLT (between 1986 and 1999) at 6 foreign transplant centers comprised the validation cohort. We found good agreement between actual survival and predicted survival in the validation cohort; 1-year patient survival rates in the low-, intermediate-, and high-risk groups (as assigned by the original UNOS model) were 72%, 68%, and 36%, respectively (P < .0001). In the patients for whom the international normalized ratio (INR) of prothrombin time was available, MELD correlated with outcome following re-OLT; the median MELD scores for patients surviving at least 90 days compared with those dying within 90 days were 20.75 versus 25.9, respectively (P = .004). Utilizing both patient cohorts (n = 979), a new model, based on recipient age, total serum bilirubin, creatinine, and interval to re-OLT, was constructed (whole-model χ² = 105, P < .0001). Using the c-statistic with 30-day, 90-day, 1-year, and 3-year mortality as the end points, the areas under the receiver operating characteristic (ROC) curves for 4 different models were compared. In conclusion, prospective validation and use of these models as adjuncts to clinical decision making in the management of patients being considered for re-OLT are warranted.
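A hedged illustration of the modelling step described above: fitting a logistic model on the four predictors named in the abstract and estimating the c-statistic (area under the ROC curve). The data below are random placeholders generated only to make the example runnable; nothing here reproduces the study's coefficients or discrimination.

# Illustrative sketch, not the study's data or fitted model.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 979  # combined cohort size reported in the abstract

# Hypothetical predictor values, drawn only to make the example runnable.
X = np.column_stack([
    rng.normal(50, 10, n),        # recipient age (years)
    rng.lognormal(1.5, 0.8, n),   # total serum bilirubin (mg/dL)
    rng.lognormal(0.2, 0.5, n),   # creatinine (mg/dL)
    rng.integers(1, 2000, n),     # interval to re-OLT (days)
])
y = rng.integers(0, 2, n)         # mortality indicator (placeholder)

model = LogisticRegression(max_iter=1000).fit(X, y)
c_stat = roc_auc_score(y, model.predict_proba(X)[:, 1])
print(f"c-statistic = {c_stat:.2f}")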
Abstract:
Background: Hospital performance reports based on administrative data should distinguish differences in quality of care between hospitals from case-mix-related variation and random error effects. A study was undertaken to determine which of 12 diagnosis-outcome indicators measured across all hospitals in one state had significant risk-adjusted systematic (or special cause) variation (SV), suggesting differences in quality of care. For those that did, we determined whether SV persists within hospital peer groups, whether indicator results correlate at the individual hospital level, and how many adverse outcomes would be avoided if all hospitals achieved indicator values equal to the best performing 20% of hospitals. Methods: All patients admitted during a 12 month period to 180 acute care hospitals in Queensland, Australia with heart failure (n = 5745), acute myocardial infarction (AMI) (n = 3427), or stroke (n = 2955) were entered into the study. Outcomes comprised in-hospital deaths, long hospital stays, and 30 day readmissions. Regression models produced standardised, risk-adjusted, diagnosis-specific outcome event ratios for each hospital. Systematic and random variation in ratio distributions for each indicator were then apportioned using hierarchical statistical models. Results: Only five of 12 (42%) diagnosis-outcome indicators showed significant SV across all hospitals (long stays and same diagnosis readmissions for heart failure; in-hospital deaths and same diagnosis readmissions for AMI; and in-hospital deaths for stroke). Significant SV was only seen for two indicators within hospital peer groups (same diagnosis readmissions for heart failure in tertiary hospitals and in-hospital mortality for AMI in community hospitals). Only two pairs of indicators showed significant correlation. If all hospitals emulated the best performers, at least 20% of AMI and stroke deaths, heart failure long stays, and heart failure and AMI readmissions could be avoided. Conclusions: Diagnosis-outcome indicators based on administrative data require validation as markers of significant risk-adjusted SV. Validated indicators allow quantification of realisable outcome benefits if all hospitals achieved best performer levels. The overall level of quality of care within single institutions cannot be inferred from the results of one or a few indicators.
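A minimal sketch of the standardised outcome event ratio (observed events divided by risk-adjusted expected events) that underlies the indicator analysis above. The hospital assignments and predicted risks are invented placeholders; the study derived expected risks from diagnosis-specific regression models and then partitioned systematic and random variation with hierarchical models, which this sketch does not attempt.

import numpy as np

rng = np.random.default_rng(1)
n_patients = 500
hospital = rng.integers(0, 5, n_patients)          # hospital ID per admission (placeholder)
p_expected = rng.uniform(0.05, 0.25, n_patients)   # risk-adjusted predicted probability of the outcome
observed = rng.binomial(1, p_expected)             # observed outcome (e.g. in-hospital death)

for h in range(5):
    mask = hospital == h
    oe_ratio = observed[mask].sum() / p_expected[mask].sum()  # observed / expected
    print(f"hospital {h}: O/E = {oe_ratio:.2f}")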
Multisite, quality-improvement collaboration to optimise cardiac care in Queensland public hospitals
Abstract:
Objective: To evaluate changes in quality of in-hospital care of patients with either acute coronary syndromes (ACS) or congestive heart failure (CHF) admitted to hospitals participating in a multisite quality improvement collaboration. Design: Before-and-after study of changes in quality indicators measured on representative patient samples between June 2001 and January 2003. Setting: Nine public hospitals in Queensland. Study populations: Consecutive or randomly selected patients admitted to study hospitals during the baseline period (June 2001 to January 2002; n = 807 for ACS, n = 357 for CHF) and post-intervention period (July 2002 to January 2003; n = 717 for ACS, n = 220 for CHF). Intervention: Provision of comparative baseline feedback at a facilitative workshop, combined with hospital-specific quality-improvement interventions supported by on-site quality officers and a central program management group. Main outcome measure: Changes in process-of-care indicators between baseline and post-intervention periods. Results: Compared with baseline, more patients with ACS in the post-intervention period received therapeutic heparin regimens (84% v 72%; P < 0.001), angiotensin-converting enzyme inhibitors (64% v 56%; P = 0.02), lipid-lowering agents (72% v 62%; P < 0.001), early coronary angiography (52% v 39%; P < 0.001), in-hospital cardiac counselling (65% v 43%; P < 0.001), and referral to cardiac rehabilitation (15% v 5%; P < 0.001). The proportion of patients with CHF receiving β-blockers also increased (52% v 34%; P < 0.001), with fewer patients receiving deleterious agents (13% v 23%; P = 0.04). The same-cause 30-day readmission rate decreased from 7.2% to 2.4% (P = 0.02) in patients with CHF. Conclusion: Quality-improvement interventions conducted as multisite collaborations may improve in-hospital care of acute cardiac conditions within relatively short time frames.
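As a hedged illustration, one of the before-and-after comparisons above (therapeutic heparin in ACS patients) can be reproduced approximately from the reported percentages and sample sizes with a simple chi-squared test; the counts below are reconstructed, not the study's raw data.

from scipy.stats import chi2_contingency

baseline_n, post_n = 807, 717
baseline_yes = round(0.72 * baseline_n)   # ~72% received therapeutic heparin at baseline
post_yes = round(0.84 * post_n)           # ~84% post-intervention

table = [
    [baseline_yes, baseline_n - baseline_yes],
    [post_yes, post_n - post_yes],
]
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, p = {p:.4g}")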
Abstract:
Objectives: To re-examine interhospital variation in 30 day survival after acute myocardial infarction (AMI) 10 years on, to see whether the appointment of new cardiologists and their involvement in emergency care has improved outcome after AMI. Design: Retrospective cohort study. Setting: Acute hospitals in Scotland. Participants: 61 484 patients with a first AMI over two time periods: 1988-1991 and 1998-2001. Main outcome measures: 30 day survival. Results: Between 1988 and 1991, median 30 day survival was 79.2% (interhospital range 72.1-85.1%). The difference between highest and lowest was 13.0 percentage points (age and sex adjusted, 12.1 percentage points). Between 1998 and 2001, median survival rose to 81.6% (and the range narrowed to 78.0-85.6%), a difference of 7.6 (adjusted 8.8) percentage points. Admission hospital was an independent predictor of outcome at 30 days during both time periods (p < 0.001). Over the period 1988-1991, the odds ratio for death ranged between hospitals from 0.71 (95% confidence interval (CI) 0.58 to 0.88) to 1.50 (95% CI 1.19 to 1.89), and for the period 1998-2001 from 0.82 (95% CI 0.60 to 1.13) to 1.46 (95% CI 1.07 to 1.99). The adjusted risk of death was significantly higher than average in nine of 26 hospitals between 1988 and 1991 but in only two hospitals between 1998 and 2001. Conclusions: The average 30 day case fatality rate after admission with an AMI has fallen substantially over the past 10 years in Scotland. Between-hospital variation is also considerably smaller, owing to better survival in the previously poorly performing hospitals. This suggests that the greater involvement of cardiologists in the management of AMI has paid dividends.
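A small worked example of the odds-ratio-with-confidence-interval calculation reported above, using the standard log-odds approximation; the 2 x 2 cell counts are hypothetical and are not taken from the Scottish registry data.

import math

# 2 x 2 table: [deaths, survivors] for one index hospital versus all others (hypothetical counts).
a, b = 120, 480      # index hospital: deaths, survivors
c, d = 9000, 40000   # all other hospitals: deaths, survivors

or_ = (a * d) / (b * c)
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
lo = math.exp(math.log(or_) - 1.96 * se_log_or)
hi = math.exp(math.log(or_) + 1.96 * se_log_or)
print(f"OR = {or_:.2f} (95% CI {lo:.2f} to {hi:.2f})")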
Abstract:
Post-operative infections resulting from total hip arthroplasty are caused by bacteria such as Staphylococcus aureus and Pseudomonas aeruginosa entering the wound perioperatively or by haematogenous spread from distant loci of infection. They can endanger patient health and require expensive surgical revision procedures. Gentamicin-impregnated poly(methyl methacrylate) bone cement is traditionally used for treatment but is often removed because it harbours bacterial growth, while bacterial resistance to gentamicin is increasing. The aim of this work was to encapsulate the antibiotics vancomycin, ciprofloxacin and rifampicin within sustained-release microspheres composed of the biodegradable polymer poly(dl-lactide-co-glycolide) [PLCG] 75:25. Topical administration to the wound in hydroxypropylmethylcellulose gel should achieve high local antibiotic concentrations, while the two-week in vivo half-life of PLCG 75:25 removes the need for expensive surgical retrieval operations. Unloaded and 20% w/w antibiotic-loaded PLCG 75:25 microspheres were fabricated using a water-in-oil emulsification with solvent evaporation technique. Microspheres were spherical in shape with a honeycomb-like internal matrix and showed reproducible physical properties. The kinetics of in vitro antibiotic release into newborn calf serum (NCS) and Hank's balanced salt solution (HBSS) at 37°C were measured using a radial diffusion assay. Generally, the day-to-day concentration of each antibiotic released into NCS over a 30 day period exceeded that required to kill S. aureus and Ps. aeruginosa. Only limited microsphere biodegradation had occurred after 30 days of in vitro incubation in NCS and HBSS at 37°C. The moderate in vitro cytotoxicity of 20% w/w antibiotic-loaded microspheres to cultured 3T3-L1 cells was antibiotic induced. In conclusion, the data generated indicate the potential of 20% w/w antibiotic-loaded microspheres to improve present treatment regimens for infections occurring after total hip arthroplasty, such that future work should focus on gaining industrial collaboration for commercial exploitation.
Abstract:
The research examines the deposition of airborne particles containing heavy metals and investigates methods that can be used to identify their sources. The research focuses on lead and cadmium because these two metals are of growing public and scientific concern on environmental health grounds. The research consists of three distinct parts. The first is the development and evaluation of a new deposition measurement instrument - the deposit cannister - designed specifically for large-scale surveys in urban areas. The deposit cannister is designed to be cheap, robust, and versatile and therefore to permit comprehensive high-density urban surveys. The siting policy reduces contamination from locally resuspended surface dust. The second part of the research involved detailed surveys of heavy metal deposition in Walsall, West Midlands, using the new high-density measurement method. The main survey, conducted over a six-week period in November-December 1982, provided 30-day samples of deposition at 250 different sites. The results have been used to examine the magnitude and spatial variability of deposition rates in the case-study area, and to evaluate the performance of the measurement method. The third part of the research was a 'source-identification' exercise. The methods used were receptor models - factor analysis and cluster analysis - and a predictive source-based deposition model. The results indicate that there are six main source processes contributing to deposition of metals in the Walsall area: coal combustion, vehicle emissions, ironfounding, copper refining, and two general industrial/urban processes. A source-based deposition model was calibrated using factor scores for one source factor as the dependent variable, rather than metal deposition rates, thus avoiding problems traditionally encountered in calibrating models in complex multi-source areas. Empirical evidence supports the hypothesised association of this factor with emissions of metals from the ironfoundry industry.
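A conceptual sketch of the receptor-modelling step described above: factor analysis applied to a sites-by-metals deposition matrix to extract common source factors. The data are random placeholders rather than the Walsall survey measurements, and the choice of six components simply mirrors the number of source processes reported.

import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(2)
n_sites, n_metals = 250, 8            # 250 sampling sites, several measured metals (Pb, Cd, ...)
X = rng.lognormal(0.0, 1.0, (n_sites, n_metals))  # placeholder deposition rates

fa = FactorAnalysis(n_components=6)   # six source factors, as in the study
scores = fa.fit_transform(np.log(X))  # factor scores per site (could feed a deposition model)
loadings = fa.components_             # metal loadings per factor
print(scores.shape, loadings.shape)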
Abstract:
The hydrologic regime of Shark Slough, the most extensive long-hydroperiod marsh in Everglades National Park, is largely controlled by the location, volume, and timing of water delivered to it through several control structures from the Water Conservation Areas north of the Park. Where natural or anthropogenic barriers to water flow are present, water management practices in this highly regulated system may result in an uneven distribution of water in the marsh, which may impact regional vegetation patterns. In this paper, we use data from 569 sampling locations along five cross-Slough transects to examine regional vegetation distribution, and to test and describe the association of marsh vegetation with several hydrologic and edaphic parameters. Analysis of vegetation:environment relationships yielded estimates of both mean and variance in soil depth, as well as annual hydroperiod, mean water depth, and 30-day maximum water depth within each cover type during the 1990s. We found that rank abundances of the three major marsh cover types (Tall Sawgrass, Sparse Sawgrass, and Spikerush Marsh) were identical in all portions of Shark Slough, but regional trends in the relative abundance of individual communities were present. Analysis also indicated clear and consistent differences in the hydrologic regime of the three marsh cover types, with hydroperiod and water depth increasing in the order Tall Sawgrass < Sparse Sawgrass < Spikerush Marsh. In contrast, soil depth decreased in the same order. Locally, these differences were quite subtle; within a management unit of Shark Slough, mean annual values for the two water depth parameters varied less than 15 cm among types, and hydroperiods varied by 65 days or less. More significantly, regional variation in hydrology equaled or exceeded the variation attributable to cover type within a small area. For instance, estimated hydroperiods for Tall Sawgrass in Northern Shark Slough were longer than for Spikerush Marsh in any of the other regions. Although some of this regional variation may reflect a natural gradient within the Slough, a large proportion is the result of compartmentalization due to current water management practices within the marsh. We conclude that hydroperiod and water depth are the most important influences on vegetation within management units, and attribute larger-scale differences in vegetation pattern to the interactions among soil development, hydrology, and fire regime in this pivotal portion of the Everglades.
Abstract:
Ocean acidification is predicted to have widespread implications for marine bivalve mollusks. While our understanding of its impact on their physiological and behavioral responses is increasing, little is known about their reproductive responses under future scenarios of anthropogenic climate change. In this study, we examined the physiological energetics of the Manila clam Ruditapes philippinarum exposed to CO2-induced seawater acidification during gonadal maturation. Three recirculating systems filled with 600 L of seawater were manipulated to three pH levels (8.0, 7.7, and 7.4) corresponding to control and projected pH levels for 2100 and 2300. In each system, temperature was gradually increased ca. 0.3 °C per day from 10 to 20 °C for 30 days and maintained at 20 °C for the following 40 days. Irrespective of seawater pH levels, clearance rate (CR), respiration rate (RR), ammonia excretion rate (ER), and scope for growth (SFG) increased after a 30-day stepwise warming protocol. When seawater pH was reduced, CR, ratio of oxygen to nitrogen, and SFG significantly decreased concurrently, whereas ammonia ER increased. RR was virtually unaffected under acidified conditions. Neither temperature nor acidification showed a significant effect on food absorption efficiency. Our findings indicate that energy is allocated away from reproduction under reduced seawater pH, potentially resulting in an impaired or suppressed reproductive function. This interpretation is based on the fact that spawning was induced in only 56% of the clams grown at pH 7.4. Seawater acidification can therefore potentially impair the physiological energetics and spawning capacity of R. philippinarum.
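For orientation, the scope-for-growth calculation implied by the measured rates above is an energy budget of the form SFG = absorbed energy minus (respiration plus excretion). The sketch below uses commonly cited energy-conversion factors and invented input values; it is not the study's computation.

def scope_for_growth(clearance_l_h, food_energy_j_l, absorption_eff,
                     respiration_mg_o2_h, excretion_mg_nh4n_h):
    absorbed = clearance_l_h * food_energy_j_l * absorption_eff  # A, energy absorbed from food (J/h)
    respired = respiration_mg_o2_h * 14.06    # commonly used ~14 J per mg O2 consumed
    excreted = excretion_mg_nh4n_h * 24.87    # commonly used ~25 J per mg NH4-N excreted
    return absorbed - (respired + excreted)   # SFG (J/h)

# Hypothetical clam: CR 2 L/h, food 10 J/L, AE 0.6, RR 0.5 mg O2/h, ER 0.05 mg NH4-N/h.
print(round(scope_for_growth(2.0, 10.0, 0.6, 0.5, 0.05), 2))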
Abstract:
The Tara Oceans Expedition (2009-2013) sampled the world's oceans on board a 36 m long schooner, collecting environmental data and organisms from viruses to planktonic metazoans for later analyses using modern sequencing and state-of-the-art imaging technologies. Tara Oceans data are particularly suited to studying the genetic, morphological and functional diversity of plankton. The present data set provides environmental context for all samples from the Tara Oceans Expedition (2009-2013), describing water column features at the sampling locations. Based on in situ measurements of... at the...
Abstract:
Introduction: The production of KPC (Klebsiella pneumoniae carbapenemase) has become an important mechanism of carbapenem resistance among Enterobacteriaceae strains. In Brazil, KPC is already widespread and its incidence has increased significantly, reducing treatment options. The "perfect storm" combination of the absence of new drug development and the emergence of multidrug-resistant strains has forced a return to older, more toxic drugs such as polymyxins. Aims: To determine the occurrence of carbapenemase-producing strains among carbapenem-resistant Enterobacteriaceae isolated from patients with nosocomial infection/colonization from September 2014 to August 2015, and to determine the risk factors associated with 30-day mortality and the impact of inappropriate therapy. Materials and Methods: We performed a case-control study to assess the risk factors (comorbidities, invasive procedures, and inappropriate antimicrobial therapy) associated with 30-day mortality, considering the first episode of infection in 111 patients. The resistance genes blaKPC, blaIMP, blaVIM and blaNDM-1 were detected by polymerase chain reaction. Molecular typing of the strains involved in the outbreak was performed by pulsed-field gel electrophoresis. Polymyxin resistance was confirmed by the broth microdilution method. Results: 188 episodes of carbapenem-resistant Enterobacteriaceae infection/colonization were detected; of these, 122 strains were recovered from the hospital laboratory. The presence of the blaKPC gene was confirmed in the majority (74.59%) of these isolates. The blaIMP, blaVIM and blaNDM-1 genes were not detected. K. pneumoniae was the most frequent microorganism (77.13%), primarily responsible for urinary tract infections (21.38%) and for infections in patients in the Intensive Care Unit (ICU) (61.38%). Multivariate analysis identified dialysis and bloodstream infection as independent predictors of mortality. The Kaplan-Meier curve showed a lower probability of survival in the group of patients receiving inappropriate antibiotic therapy. Antimicrobial use in the adult ICU varied during the study period, but no positive correlation between the increased incidence of strains and consumption was observed. In May and July 2015, the occurrence rates of carbapenem-resistant, KPC-producing Enterobacteriaceae per 1000 patient-days were higher than the established control limit, confirming two outbreaks: the first caused by colistin-susceptible KPC-producing K. pneumoniae isolates with a polyclonal profile, and the second by a dominant clone of colistin-resistant (≥ 32 μg/mL) KPC-producing K. pneumoniae. Cross-transmission between patients was evident from the temporal and spatial relationships observed in the second outbreak, since some patients occupied the same bed, pointing to problems in hand hygiene adherence among healthcare workers and inadequate terminal disinfection of the environment. The outbreak was contained when the ICU was closed to new admissions. Conclusions: The study showed endemicity of KPC-producing K. pneumoniae in the adult ICU, progressing to an epidemic monoclonal expansion driven by very high consumption of carbapenems and polymyxins and facilitated by failures in control measures in the unit.
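A hedged sketch of the survival comparison mentioned above (Kaplan-Meier curves for appropriate versus inappropriate therapy, censored at 30 days), using the lifelines library; the durations and event indicators are invented placeholders, not the study's patient data.

import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(3)
t_appropriate = rng.integers(1, 31, 60)     # days to death or censoring (placeholder)
e_appropriate = rng.integers(0, 2, 60)      # 1 = died within follow-up
t_inappropriate = rng.integers(1, 31, 51)
e_inappropriate = rng.integers(0, 2, 51)

km = KaplanMeierFitter()
km.fit(t_appropriate, event_observed=e_appropriate, label="appropriate therapy")
print(km.survival_function_.tail(1))        # estimated survival at 30 days

result = logrank_test(t_appropriate, t_inappropriate,
                      event_observed_A=e_appropriate,
                      event_observed_B=e_inappropriate)
print(f"log-rank p = {result.p_value:.3f}")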
Abstract:
BACKGROUND: It is unclear whether diagnostic protocols based on cardiac markers to identify low-risk chest pain patients suitable for early release from the emergency department can be applied to patients older than 65 years or with traditional cardiac risk factors. METHODS AND RESULTS: In a single-center retrospective study of 231 consecutive patients with high-risk factor burden in which a first cardiac troponin (cTn) level was measured in the emergency department and a second cTn sample was drawn 4 to 14 hours later, we compared the performance of a modified 2-Hour Accelerated Diagnostic Protocol to Assess Patients with Chest Pain Using Contemporary Troponins as the Only Biomarker (ADAPT) rule to a new risk classification scheme that identifies patients as low risk if they have no known coronary artery disease, a nonischemic electrocardiogram, and 2 cTn levels below the assay's limit of detection. Demographic and outcome data were abstracted through chart review. The median age of our population was 64 years, and 75% had a Thrombolysis In Myocardial Infarction (TIMI) risk score ≥2. Using our risk classification rule, 53 (23%) patients were low risk, with a negative predictive value for 30-day cardiac events of 98%. Applying a modified ADAPT rule to our cohort, 18 (8%) patients were identified as low risk, with a negative predictive value of 100%. In a sensitivity analysis, the negative predictive value of our risk algorithm did not change when we relied only on an undetectable baseline cTn and eliminated the second cTn assessment. CONCLUSIONS: If confirmed in prospective studies, this less-restrictive risk classification strategy could be used to safely identify chest pain patients with more traditional cardiac risk factors for early emergency department release.
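A simple illustration of how the quoted negative predictive value is obtained for the low-risk group; the event count is back-calculated from the abstract's 53 low-risk patients and 98% NPV, so it is approximate.

def negative_predictive_value(true_negatives, false_negatives):
    # NPV = proportion of patients classified low risk who truly had no event.
    return true_negatives / (true_negatives + false_negatives)

low_risk = 53                      # patients ruled low risk by the new scheme
events_in_low_risk = 1             # ~2% of 53 had a 30-day cardiac event (approximate)
npv = negative_predictive_value(low_risk - events_in_low_risk, events_in_low_risk)
print(f"NPV = {npv:.1%}")          # ~98%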
Abstract:
OBJECTIVE: The Thrombolysis in Myocardial Infarction (TIMI) risk score is a validated tool for risk stratification in acute coronary syndrome. We hypothesized that the TIMI risk score would be able to risk stratify patients evaluated in an observation unit for acute coronary syndrome. METHODS: Retrospective cohort study of consecutive adult patients placed in the observation unit of an urban academic hospital emergency department with an average annual census of 65,000, between 2004 and 2007. Exclusion criteria included elevated initial cardiac biomarkers, ST segment changes on ECG, unstable vital signs, or unstable arrhythmias. A composite of significant coronary artery disease (CAD) indicators, including diagnosis of myocardial infarction, percutaneous coronary intervention, coronary artery bypass surgery, or death within 30 days and 1 year, was abstracted via chart review and financial record query. The entire cohort was stratified by TIMI risk score (0-7), and composite event rates with 95% confidence intervals were calculated. RESULTS: In total, 2228 patients were analyzed. Average age was 54.5 years, and 42.0% were male. The overall median TIMI risk score was 1. Eighty (3.6%) patients had 30-day and 119 (5.3%) had 1-year CAD indicators. There was a trend toward an increasing rate of composite CAD indicators at 30 days and 1 year with increasing TIMI score, ranging from an event rate of 1.2% at 30 days and 1.9% at 1 year for a TIMI score of 0, to 12.5% at 30 days and 21.4% at 1 year for TIMI ≥ 4. CONCLUSIONS: In an observation unit cohort, the TIMI risk score is able to risk stratify patients into low-, moderate-, and high-risk groups.
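A short sketch of the event-rate-with-95%-confidence-interval calculation used to build the TIMI strata above; the per-stratum counts are illustrative placeholders chosen only to mimic the reported pattern, not the study's observed values.

from statsmodels.stats.proportion import proportion_confint

strata = {
    "TIMI 0": (6, 500),      # (30-day composite events, patients) - hypothetical
    "TIMI 1": (15, 700),
    "TIMI >=4": (25, 200),
}

for label, (events, n) in strata.items():
    rate = events / n
    lo, hi = proportion_confint(events, n, alpha=0.05, method="wilson")
    print(f"{label}: {rate:.1%} (95% CI {lo:.1%} to {hi:.1%})")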
Abstract:
The establishment and control of oxygen levels in packs of oxygen-sensitive food products such as cheese is imperative in order to maintain product quality over a determined shelf life. Oxygen sensors quantify oxygen concentrations within packaging using a reversible optical measurement process, and this non-destructive approach allows the entire supply chain to be monitored and can assist in pinpointing problems with product packaging. This study was carried out in a commercial cheese packaging plant and involved the insertion of 768 sensors into 384 flow-wrapped cheese packs (two sensors per pack) that were flushed with 100% carbon dioxide prior to sealing. The cheese blocks were randomly assigned to two different storage groups to assess the effects of package quality, packaging process efficiency, and handling and distribution on package containment. Results demonstrated that oxygen levels increased in both experimental groups over the 30-day assessment period. The group subjected to a simulated industrial distribution route and the handling procedures of commercially retailed cheese exhibited the highest level of oxygen detected on every day examined and experienced the highest rate of package failure. The study concluded that fluctuating storage conditions, product movement associated with distribution activities, and the possible presence of cheese-derived contaminants such as calcium lactate crystals were the chief contributors to package failure.