952 results for Reduced-Impact Logging


Relevance: 30.00%

Abstract:

Newer antiepileptic drugs (AEDs) are increasingly prescribed and seem to have efficacy comparable to that of the classical AEDs; however, their impact on status epilepticus (SE) prognosis has received little attention. In our prospective SE database (2006-2010), we assessed the use of older versus newer AEDs (levetiracetam, pregabalin, topiramate, lacosamide) over time and its relationship to outcome (return to clinical baseline conditions, new handicap, or death). Newer AEDs were used more often toward the end of the study period (42% of episodes versus 30%). After adjustment for SE etiology, SE severity score, and number of compounds needed to terminate SE, newer AEDs were independently related to a reduced likelihood of return to baseline (p<0.001) but not to increased mortality. These findings seem consistent with recent findings on refractory epilepsy. Also, in view of the higher price of the newer AEDs, well-designed, prospective assessments analyzing their impact on efficacy and tolerability in patients with SE appear mandatory.

Relevance: 30.00%

Abstract:

BACKGROUND/OBJECTIVES: A smoking law was passed by the Spanish Parliament in December 2005 and came into force on 1 January 2006. The law bans smoking in all indoor workplaces but only in some hospitality venues, because owners are allowed to establish a smoking zone (venues >100 m²) or to allow smoking without restrictions (venues <100 m²). The objective of the study is to assess the impact of the Spanish smoking law on exposure to secondhand smoke (SHS) in enclosed workplaces, including hospitality venues. MATERIALS AND METHODS: The study design is a before-and-after evaluation. We studied workplaces and hospitality venues from eight different regions of Spain. We took repeated samples of vapor-phase nicotine concentration in 398 premises, including private offices (162), public administration offices (90), university premises (43), bars and restaurants (79), and discotheques and pubs (24). RESULTS: In the follow-up period, SHS levels were markedly reduced in indoor offices. The median decrease in nicotine concentration ranged from 60.0% in public premises to 97.4% in private areas. Nicotine concentrations were also markedly reduced in bars and restaurants that became smoke-free (96.7%) and in the no-smoking zones of venues with separate spaces for smokers (88.9%). We found no significant changes in smoking zones or in premises allowing smoking, including discotheques and pubs. CONCLUSIONS: Overall, this study shows the positive impact of the law on reducing SHS in indoor workplaces. However, SHS was substantially reduced only in bars and restaurants that became smoke-free. Most hospitality workers continue to be exposed to very high levels of SHS. Therefore, a 100% smoke-free policy for all hospitality venues is required.

Relevance: 30.00%

Abstract:

We analyse the variations in tsunami propagation and impact along a straight coastline caused by the presence of a submarine canyon incised in the continental margin. For ease of calculation we assume that the shoreline and the shelf edge are parallel and that the incident wave approaches them normally. A total of 512 synthetic scenarios have been computed by combining the bathymetry of a continental margin incised by a parameterised single canyon and the incident tsunami waves. The margin bathymetry, the canyon and the tsunami waves have been generated using mathematical functions (e.g. Gaussian). Canyon parameters analysed are: (i) incision length into the continental shelf, which for a constant shelf width relates directly to the distance from the canyon head to the coast, (ii) canyon width, and (iii) canyon orientation with respect to the shoreline. Tsunami wave parameters considered are period and sign. The COMCOT tsunami model from Cornell University was applied to propagate the waves across the synthetic bathymetric surfaces. Five simulations of tsunami propagation over a non-canyoned margin were also performed for reference. The analysis of the results reveals a strong variation of tsunami arrival times and amplitudes reaching the coastline when a tsunami wave travels over a submarine canyon, with changing maximum height location and alongshore extension. In general, the presence of a submarine canyon shortens the arrival time at the shoreline but prevents wave build-up just over the canyon axis. This leads to a decrease in tsunami amplitude at the coastal stretch located just shoreward of the canyon head, which results in a lower run-up in comparison with a non-canyoned margin. Conversely, an increased wave build-up occurs on both sides of the canyon head, generating two coastal stretches with an enhanced run-up.
These aggravated or reduced tsunami effects are modified by (i) proximity of the canyon tip to the coast, amplifying the wave height, (ii) canyon width, enlarging the areas with lower and higher maximum wave height along the coastline, and (iii) canyon obliquity with respect to the shoreline and shelf edge, increasing wave height shoreward of the leeward flank of the canyon. Moreover, the presence of a submarine canyon near the coast produces a variation of wave energy along the shore, eventually resulting in edge waves shoreward of the canyon head. Edge waves subsequently spread out alongshore, reaching significant amplitudes especially when coupling with tsunami secondary waves occurs. Model results have been ground-truthed using the actual bathymetry of the Blanes Canyon area in the North Catalan margin. This paper underlines the presence, morphology and orientation of submarine canyons as a determining factor in tsunami propagation and impact, which could prevail over other effects deriving from coastal configuration.

Relevance: 30.00%

Abstract:

BACKGROUND AND OBJECTIVE: Key factors of Fast Track (FT) programs are fluid restriction and epidural analgesia (EDA). We aimed to challenge the preconception that the combination of fluid restriction and EDA might induce hypotension and renal dysfunction. METHODS: A recent randomized trial (NCT00556790) showed reduced complications after colectomy in FT patients compared with standard care (SC). Patients with an effective EDA were compared with regard to hemodynamics and renal function. RESULTS: 61/76 FT patients and 59/75 patients in the SC group had an effective EDA. Both groups were comparable regarding demographics and surgery-related characteristics. FT patients received significantly less i.v. fluid intraoperatively (1900 mL [range 1100-4100] versus 2900 mL [1600-5900], P < 0.0001) and postoperatively (700 mL [400-1500] versus 2300 mL [1800-3800], P < 0.0001). Intraoperatively, 30 FT compared with 19 SC patients needed colloids or vasopressors, but the difference was not statistically significant (P = 0.066). Postoperative requirements were low in both groups (3 versus 5 patients; P = 0.487). Pre- and postoperative values for creatinine, hematocrit, sodium, and potassium were similar, and no patient in either group developed renal dysfunction. Only one of 82 patients with an EDA and no bladder catheter had urinary retention. Overall, FT patients had fewer postoperative complications (6 versus 20 patients; P = 0.002) and a shorter median hospital stay (5 d [2-30] versus 9 d [6-30]; P < 0.0001) compared with the SC group. CONCLUSIONS: Fluid restriction and EDA in FT programs are not associated with clinically relevant hemodynamic instability or renal dysfunction.

Relevance: 30.00%

Abstract:

OBJECTIVE: Critically ill patients are at high risk of malnutrition. Insufficient nutritional support remains a widespread problem despite guidelines. The aim of this study was to measure the clinical impact of a two-step interdisciplinary quality nutrition program. DESIGN: Prospective interventional study over three periods (A, baseline; B and C, intervention periods). SETTING: Mixed intensive care unit within a university hospital. PATIENTS: Five hundred seventy-two patients (age 59 ± 17 yrs) requiring >72 hrs of intensive care unit treatment. INTERVENTION: Two-step quality program: 1) bottom-up implementation of a feeding guideline; and 2) additional presence of an intensive care unit dietitian. The nutrition protocol was based on the European guidelines. MEASUREMENTS AND MAIN RESULTS: Anthropometric data, intensive care unit severity scores, energy delivery, and cumulated energy balance (daily, day 7, and discharge), feeding route (enteral, parenteral, combined, none, oral), length of intensive care unit and hospital stay, and mortality were collected. Altogether 5800 intensive care unit days were analyzed. Patients in period A were healthier, with lower Simplified Acute Physiologic Scale scores and a lower proportion of "rapidly fatal" McCabe scores. Energy delivery and balance increased gradually: the impact was particularly marked on the cumulated energy deficit at day 7, which improved from -5870 kcal to -3950 kcal (p < .001). Feeding technique changed significantly, with a progressive increase in days with nutrition therapy (A: 59% of days, B: 69%, C: 71%, p < .001); use of enteral nutrition increased from A to B (stable in C), and days on combined and parenteral nutrition increased progressively. Oral energy intakes were low (mean: 385 kcal/day, 6 kcal/kg per day). Hospital mortality increased with severity of condition in periods B and C. CONCLUSION: A bottom-up protocol improved nutritional support.
The presence of the intensive care unit dietitian provided significant additional progress, related to earlier introduction and route of feeding, and achieved an overall better early energy balance.
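
The cumulated energy balance reported in this study is a running sum of daily delivered minus prescribed energy; a minimal sketch of that bookkeeping (the daily values and the 1800 kcal/day target below are hypothetical, not data from the study):

```python
def cumulative_energy_balance(delivered_kcal, target_kcal):
    """Running sum of daily (delivered - prescribed target) energy, in kcal.

    A negative running value is a cumulated energy deficit."""
    balance, trajectory = 0.0, []
    for delivered, target in zip(delivered_kcal, target_kcal):
        balance += delivered - target
        trajectory.append(balance)
    return trajectory

# Hypothetical first ICU week: target 1800 kcal/day, delivery ramping up
delivered = [400, 800, 1100, 1400, 1600, 1700, 1800]
target = [1800] * 7
day7_deficit = cumulative_energy_balance(delivered, target)[-1]  # -3800.0 kcal
```

The day-7 value of this trajectory is the "cumulated energy deficit on day 7" that the intervention improved.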

Relevance: 30.00%

Abstract:

BACKGROUND: Therapeutic hypothermia following hypoxic ischaemic encephalopathy (HIE) in term infants was introduced in Switzerland in 2005. Initial documentation of perinatal and resuscitation details was poor and neuromonitoring insufficient. In 2011, a National Asphyxia and Cooling Register was introduced. AIMS: To compare the management of cooled infants before and after the introduction of the register with respect to documentation, neuromonitoring and cooling methods, and to evaluate temperature variability between cooling methods. STUDY DESIGN: Data on infants cooled before the register was in place (first time period: 2005-2010) and afterwards (second time period: 2011-2012) were collected with a case report form. RESULTS: 150 infants were cooled during the first time period and 97 during the second. Most infants were cooled passively or passively with gel packs during both time periods (82% in 2005-2010 vs 70% in 2011-2012); however, more infants were cooled actively during the second time period (18% versus 30%). Overall, there was a significant reduction in temperature variability (p < 0.001) between the two time periods. A significantly higher proportion of temperature measurements within the target range (72% versus 77%, p < 0.001), fewer measurements above (24% versus 7%, p < 0.001) and more measurements below the target range (4% versus 16%, p < 0.001) were recorded during the second time period. Neuromonitoring improved after the introduction of the cooling register. CONCLUSION: The management of infants with HIE has improved since the introduction of the register: temperature variability was reduced, and more temperature measurements in the target range and fewer above it were observed. Neuromonitoring has improved; however, imaging should be performed more often.
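
The target-range percentages above amount to classifying each core-temperature reading against the target band; a sketch, assuming the conventional therapeutic-hypothermia target of 33.0-34.0 °C (the range default and the readings below are illustrative, not register data):

```python
def classify_temperatures(temps_c, lo=33.0, hi=34.0):
    """Return the fractions of core-temperature readings that fall below,
    within, and above the target range [lo, hi] in degrees Celsius."""
    n = len(temps_c)
    below = sum(t < lo for t in temps_c) / n
    within = sum(lo <= t <= hi for t in temps_c) / n
    above = sum(t > hi for t in temps_c) / n
    return below, within, above

# Illustrative readings from one cooled infant
readings = [33.4, 33.8, 34.3, 33.1, 32.8, 33.6, 33.9, 34.0]
below, within, above = classify_temperatures(readings)  # 0.125, 0.75, 0.125
```

Aggregating these fractions over all measurements in a period gives figures comparable to the 72% versus 77% in-range proportions quoted above.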

Relevance: 30.00%

Abstract:

BACKGROUND & AIMS: Trace elements (TE) are involved in the immune and antioxidant defences, which are of particular importance during critical illness. Determining plasma TE levels is costly. The present quality control study aimed at assessing the economic impact of computer-reminded blood sampling versus risk-guided, on-demand monitoring of plasma concentrations of selenium, copper, and zinc. METHODS: Retrospective analysis of 2 cohorts of patients admitted during 6-month periods in 2006 and 2009 to the ICU of a university hospital. INCLUSION CRITERIA: receiving intravenous micronutrient supplements and/or having a TE sampling during the ICU stay. The TE samplings were triggered by computerized reminder in 2006 versus guided by nutritionists in 2009. RESULTS: During the 2 periods, 636 of 2406 consecutive admissions met the inclusion criteria, representing 29.7% and 24.9% of the periods' admissions, respectively. The 2009 patients had higher SAPS2 scores (p = 0.02) and lower BMI compared to 2006 (p = 0.007). The number of laboratory determinations was drastically reduced in 2009, particularly during the first week, despite the higher severity of the cohort, resulting in a 55% cost reduction. CONCLUSIONS: Monitoring of TE concentrations guided by a nutritionist reduced the sampling frequency and targeted the sickest, high-risk patients requiring adaptation of the nutritional prescription. This control led to a cost reduction compared with an automated sampling prescription.

Relevance: 30.00%

Abstract:

The epidermis on leaves protects plants from pathogen invasion and provides a waterproof barrier. It consists of a layer of cells surrounded by thick cell walls, which are partially impregnated with highly hydrophobic cuticular components. We show that Arabidopsis T-DNA insertion mutants of REDUCED WALL ACETYLATION 2 (rwa2), previously identified as having reduced O-acetylation of both pectins and hemicelluloses, exhibit a pleiotropic phenotype on the leaf surface. The cuticle layer appeared diffuse and was significantly thicker, and the underlying cell wall layer was interspersed with electron-dense deposits. A large number of trichomes were collapsed, and the surface permeability of the leaves was enhanced in rwa2 compared to the wild type. A massive reprogramming of the transcriptome was observed in rwa2 compared to the wild type, including a coordinated up-regulation of genes involved in responses to abiotic stress, particularly detoxification of reactive oxygen species and defense against microbial pathogens (e.g., lipid transfer proteins, peroxidases). Accordingly, peroxidase activities were found to be elevated in rwa2 compared to the wild type. These results indicate that cell wall acetylation is essential for maintaining the structural integrity of the leaf epidermis, and that reduction of cell wall acetylation leads to global stress responses in Arabidopsis.

Relevance: 30.00%

Abstract:

Background: In most emergency departments (ED) in developed countries, a subset of patients visits the ED frequently. Despite their small numbers, these patients account for a disproportionately high share of all ED visits and use a significant proportion of healthcare resources, placing a heavy economic burden on hospital and healthcare system budgets. To improve the management of these patients, the University Hospital of Lausanne, Switzerland implemented a case management (CM) intervention between May 2012 and July 2013. In this randomized controlled trial, 250 frequent ED users (>5 visits during the previous 12 months) were allocated to either the CM group or the standard ED care (SC) group and followed up for 12 months. The first result was that the CM intervention significantly reduced ED visits. The present study examined whether the CM intervention also reduced the costs generated by frequent ED users, not only from the hospital perspective but also from the healthcare system perspective. Methods: Cost data were obtained from the hospital's analytical accounting system and from health insurers. Multivariate linear models including a fixed 'group' effect, socio-demographic characteristics, and health-related variables were run.

Relevance: 30.00%

Abstract:

Dioxins and furans (PCDD/Fs) are highly toxic substances formed in post-combustion zones in furnaces. PCDD/F emissions are regulated by a waste incineration directive which also covers co-incineration plants. Several observations of dioxin and furan enhancements in wet scrubbers have been reported previously. This is thought to be due to the so-called "memory effect", which occurs when dioxins and furans absorb into plastic material in scrubbers and desorb when ambient conditions change significantly. At the co-incineration plant studied here, dioxins and furans are controlled with a wet scrubber whose tower packing is made of plastic in which activated carbon particles are dispersed. This should avoid the memory effect and act as a dioxin and furan sink, since dioxins and furans are absorbed irreversibly into the packing material. In this case, however, the tower packing in the scrubber is covered with a white layer that has been found to be mainly aluminium. The aim of this thesis was to determine the aluminium balance and the dioxin and furan behaviour in the scrubber and, thus, the impact that the fouling has on dioxin and furan removal. The source of the aluminium, the reasons for fouling, and further actions to minimize its impact on dioxin and furan removal were also to be identified. Measurements in various media around the scrubber and in fuels were made, and a PCDD/F profile analysis and mass balance calculations were carried out. PCDD/F content decreased in the scrubber. The removed PCDD/F was not discharged into the scrubbing water. The removal mechanism seems to work in spite of the fouling, at least at low PCDD/F loads. Most of the PCDD/F in the excess water originates from the Kymijoki River, which is used as feed water in the scrubber. The fouling turned out to consist mainly of aluminium hydroxides, and sludge combusted in the furnace was found to be a significant source of aluminium. Ways to minimize the fouling would be adjusting the pH to a proper level, installing a mechanical filter to catch loose material from the scrubbing water, and reducing the aluminium content of the sludge.

Relevance: 30.00%

Abstract:

BACKGROUND: The impact of early treatment with immunomodulators (IM) and/or TNF antagonists on bowel damage in Crohn's disease (CD) patients is unknown. AIM: To assess whether 'early treatment' with IM and/or TNF antagonists, defined as treatment within a 2-year period from the date of CD diagnosis, was associated with the development of fewer disease complications when compared to 'late treatment', defined as treatment initiation more than 2 years after CD diagnosis. METHODS: Data from the Swiss IBD Cohort Study were analysed. The following outcomes were assessed using Cox proportional hazard modelling: bowel strictures, perianal fistulas, internal fistulas, intestinal surgery, perianal surgery and any of the aforementioned complications. RESULTS: The 'early treatment' group of 292 CD patients was compared to the 'late treatment' group of 248 CD patients. We found that 'early treatment' with IM or TNF antagonists alone was associated with reduced risk of bowel strictures [hazard ratio (HR) 0.496, P = 0.004 for IM; HR 0.276, P = 0.018 for TNF antagonists]. Furthermore, 'early treatment' with IM was associated with reduced risk of undergoing intestinal surgery (HR 0.322, P = 0.005) and perianal surgery (HR 0.361, P = 0.042), as well as of developing any complication (HR 0.567, P = 0.006). CONCLUSIONS: Treatment with immunomodulators or TNF antagonists within the first 2 years of CD diagnosis was associated with reduced risk of developing bowel strictures, when compared to initiating these drugs >2 years after diagnosis. Furthermore, early immunomodulator treatment was associated with reduced risk of intestinal surgery, perianal surgery and any complication.

Relevance: 30.00%

Abstract:

BACKGROUND: The impact of early valve surgery (EVS) on the outcome of Staphylococcus aureus (SA) prosthetic valve infective endocarditis (PVIE) is unresolved. The objective of this study was to evaluate the association between EVS, performed within the first 60 days of hospitalization, and outcome of SA PVIE within the International Collaboration on Endocarditis-Prospective Cohort Study. METHODS: Participants were enrolled between June 2000 and December 2006. Cox proportional hazards modeling that included surgery as a time-dependent covariate and propensity adjustment for likelihood to receive cardiac surgery was used to evaluate the impact of EVS and 1-year all-cause mortality on patients with definite left-sided S. aureus PVIE and no history of injection drug use. RESULTS: EVS was performed in 74 of the 168 (44.3%) patients. One-year mortality was significantly higher among patients with S. aureus PVIE than in patients with non-S. aureus PVIE (48.2% vs 32.9%; P = .003). Staphylococcus aureus PVIE patients who underwent EVS had a significantly lower 1-year mortality rate (33.8% vs 59.1%; P = .001). In multivariate, propensity-adjusted models, EVS was not associated with 1-year mortality (risk ratio, 0.67 [95% confidence interval, .39-1.15]; P = .15). CONCLUSIONS: In this prospective, multinational cohort of patients with S. aureus PVIE, EVS was not associated with reduced 1-year mortality. The decision to pursue EVS should be individualized for each patient, based upon infection-specific characteristics rather than solely upon the microbiology of the infection causing PVIE.
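
Modelling surgery as a time-dependent covariate, as this study does, avoids immortal-time bias: a patient contributes "unoperated" person-time until the day of surgery and "operated" person-time afterwards. A minimal counting-process sketch of that data expansion (field names and values are illustrative, not the study's dataset):

```python
def split_at_surgery(patient_id, follow_up_days, died, surgery_day=None):
    """Expand one patient into counting-process rows
    (id, start, stop, event, post_surgery) so that surgery can enter a
    Cox model as a time-dependent covariate. The death event is attached
    only to the interval in which it occurred."""
    if surgery_day is None or surgery_day >= follow_up_days:
        return [(patient_id, 0, follow_up_days, died, 0)]
    return [
        (patient_id, 0, surgery_day, 0, 0),                  # before surgery
        (patient_id, surgery_day, follow_up_days, died, 1),  # after surgery
    ]

rows = split_at_surgery("pt01", follow_up_days=365, died=1, surgery_day=20)
```

Fitting a Cox model on rows of this shape (with start/stop intervals) is what allows the hazard ratio for surgery to be estimated without crediting pre-surgical survival to the surgical group.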

Relevance: 30.00%

Abstract:

Exiting from the largely sterile environment of the womb, the neonatal immune system is not fully mature, and thus neonatal immune cells must simultaneously mount responses against environmental stimuli while maturing. This dynamic process of immune maturation is driven by a variety of cell-intrinsic and extrinsic factors. Recent studies have focused on some of these factors and have shed light on the mechanisms by which they drive immune maturation. We review the interactions and consequences of immune maturation during the pre- and perinatal period. We discuss environmental signals in early life that are needed for healthy immune homeostasis, and highlight detrimental factors that can set an individual on a path towards disease. This early-life period of immune maturation could hold the key to strategies for setting individuals on trajectories towards health and reduced disease susceptibility.

Relevance: 30.00%

Abstract:

Two enoxaparin dosage regimens are used as comparators to evaluate new anticoagulants for thromboprophylaxis in patients undergoing major orthopaedic surgery, but so far no satisfactory direct comparison between them has been published. Our objective was to compare the efficacy and safety of enoxaparin 3,000 anti-Xa IU twice daily and enoxaparin 4,000 anti-Xa IU once daily in this clinical setting by indirect comparison meta-analysis, using Bucher's method. We selected randomised controlled trials comparing another anticoagulant, placebo (or no treatment) with either enoxaparin regimen for venous thromboembolism prophylaxis after hip or knee replacement or hip fracture surgery, provided that the second regimen was assessed elsewhere versus the same comparator. Two authors independently evaluated study eligibility, extracted the data, and assessed the risk of bias. The primary efficacy outcome was the incidence of venous thromboembolism. The main safety outcome was the incidence of major bleeding. Overall, 44 randomised comparisons in 56,423 patients were selected, 35 being double-blind (54,117 patients). Compared with enoxaparin 4,000 anti-Xa IU once daily, enoxaparin 3,000 anti-Xa IU twice daily was associated with a reduced risk of venous thromboembolism (relative risk [RR]: 0.53, 95% confidence interval [CI]: 0.40 to 0.69), but an increased risk of major bleeding (RR: 2.01, 95% CI: 1.23 to 3.29). In conclusion, when interpreting the benefit-risk ratio of new anticoagulant drugs versus enoxaparin for thromboprophylaxis after major orthopaedic surgery, the apparently greater efficacy but higher bleeding risk of the twice-daily 3,000 anti-Xa IU enoxaparin regimen compared to the once-daily 4,000 anti-Xa IU regimen should be taken into account.
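
Bucher's adjusted indirect comparison combines the two direct estimates on the log relative-risk scale, with the variances adding. A sketch under the usual normal-approximation assumptions (the input values below are illustrative placeholders, not the meta-analysis estimates):

```python
import math

def bucher_indirect(rr_ac, ci_ac, rr_bc, ci_bc, z=1.96):
    """Bucher's adjusted indirect comparison of treatment A versus B
    through a common comparator C, on the log relative-risk scale.
    ci_* are (lower, upper) 95% confidence limits of the direct RRs;
    standard errors are back-calculated from the CI widths."""
    se_ac = (math.log(ci_ac[1]) - math.log(ci_ac[0])) / (2 * z)
    se_bc = (math.log(ci_bc[1]) - math.log(ci_bc[0])) / (2 * z)
    log_rr = math.log(rr_ac) - math.log(rr_bc)
    se = math.sqrt(se_ac ** 2 + se_bc ** 2)  # variances add on the log scale
    return (math.exp(log_rr),
            (math.exp(log_rr - z * se), math.exp(log_rr + z * se)))

# Illustrative inputs only: A vs C and B vs C direct estimates
rr, (lo, hi) = bucher_indirect(0.5, (0.25, 1.0), 1.0, (0.5, 2.0))
```

Because the two variances add, the indirect CI is always wider than either direct one, which is why indirect comparisons such as this meta-analysis require more events to reach significance than head-to-head trials would.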

Relevance: 30.00%

Abstract:

Infectious diseases after solid organ transplantation (SOT) are a significant cause of morbidity and of reduced allograft and patient survival; however, the influence of infection on the development of chronic allograft dysfunction has not been completely delineated. Some viral infections appear to affect allograft function by inducing both direct tissue damage and immunologically mediated injury, including acute rejection. In particular, this has been observed for cytomegalovirus (CMV) infection in all SOT recipients, for BK virus infection in kidney transplant recipients, for community-acquired respiratory viruses in lung transplant recipients, and for hepatitis C virus in liver transplant recipients. The impact of bacterial and fungal infections is less clear, but bacterial urinary tract infections and respiratory tract colonization by Pseudomonas aeruginosa and Aspergillus spp. appear to be correlated with higher rates of chronic allograft dysfunction in kidney and lung transplant recipients, respectively. Evidence supports the beneficial effect of antiviral prophylaxis for CMV in improving allograft function and survival in SOT recipients. Nevertheless, there is still a need for prospective interventional trials assessing the potential effects of preventive and therapeutic strategies against bacterial and fungal infection for reducing or delaying the development of chronic allograft dysfunction.