594 results for nosocomial diarrhoea
Abstract:
Giardia lamblia is an intestinal protozoan parasite infecting humans and various other mammalian hosts. The most important clinical signs of giardiasis are diarrhoea and malabsorption. Giardia lamblia is able to undergo continuous antigenic variation of its major surface antigen, named VSP (variant surface protein). While intestinal antibodies, and more specifically anti-VSP IgA antibodies, were proven to be involved in modulating antigenic variation of the parasite, the participation of the local antibody response in control of the infection is still controversial. Conversely, previous studies based on experimental infections in mice showed that cellular immune mechanisms are essential for elimination of the parasite from its intestinal habitat. Furthermore, recent data indicated that inflammatory mast cells have the potential to interfere, directly or indirectly, with duodenal growth of G. lamblia trophozoites. However, this finding was challenged by other reports, which did not find a correlation between intestinal inflammation and resistance to infection. Since intestinal infiltration of inflammatory cells and/or CD8+ T-cells was demonstrated to coincide with villus shortening and crypt hyperplasia, immunological reactions were considered a potential factor in the pathogenesis of giardiasis. The contribution of physiological factors to pathogenesis was essentially assessed in vitro by co-cultivation of G. lamblia trophozoites with epithelial cell lines. Using this in vitro model, molecular (through surface lectins) and mechanical (through the ventral disk) adhesion of trophozoites to the epithelium was shown to be crucial for increased epithelial permeability.
This phenomenon, as well as other Giardia-induced intestinal abnormalities such as loss of intestinal brush border surface area, villus flattening, inhibition of disaccharidase activities, and eventually also overgrowth of the enteric bacterial flora, seems to be involved in the pathophysiology of giardiasis. However, it remains to be elucidated whether at least part of these pathological effects is causatively linked to the clinical manifestation of the disease.
Abstract:
International travel continues to increase in frequency. Health care providers need a wide understanding of the spectrum of travel-related diseases and their management. This retrospective study analyses the demographic and clinical data of 360 travellers returning from the tropics who presented to an outpatient clinic at a tertiary hospital between 2003 and 2007. The aim of this study was to analyse the frequency of presenting symptoms and diseases in ill returning travellers and to correlate them with the areas visited and the duration and purpose of travel. The main symptoms during travel were diarrhoea (n = 200, 56%) and fever (n = 124, 34%). Travellers not visiting friends and relatives but with close contact to the local population were at more than two-fold increased risk of diarrhoea (odds ratio [OR] 2.5; 95% confidence interval [CI] 1.1-6.0; p = 0.03) and fever (OR 2.4; 95% CI 1.1-5.3; p = 0.02) compared with tourist travellers. Travellers visiting friends and relatives (VFR) were not at increased risk of diarrhoea (OR 0.6; 95% CI 0.3-1.3; p = 0.17) or fever (OR 1.5; 95% CI 0.7-3.4; p = 0.28). Thirty-two percent of all travellers (n = 115) were diagnosed with a specific pathogen. Malaria (6%), giardiasis (6%) and amoebiasis (4%) were the most frequent specific diagnoses. The odds of malaria as the cause of the presenting illness were lower among travellers reporting pre-travel advice. Specific antimicrobial treatment was required in around one third of the patients.
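The odds ratios quoted in abstracts like this one come from 2x2 contingency tables. As a reminder of the arithmetic behind such figures, here is a minimal Python sketch of the point estimate and Wald confidence interval; the counts below are hypothetical, not taken from the study:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for a 2x2 table and its Wald 95% CI.

    Table layout:
        a = exposed cases,     b = exposed non-cases
        c = non-exposed cases, d = non-exposed non-cases
    """
    or_ = (a * d) / (b * c)
    # Standard error of ln(OR) from the cell counts
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: 30 of 40 close-contact travellers and
# 170 of 320 tourist travellers reported diarrhoea.
or_, lo, hi = odds_ratio_ci(30, 10, 170, 150)
print(f"OR = {or_:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```

A CI whose lower bound exceeds 1 corresponds to the "significantly increased risk" language used above.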
Abstract:
Background Estimates of the disease burden due to multiple risk factors can show the potential gain from combined preventive measures, but few such investigations have been attempted, and none on a global scale. Our aim was to estimate the potential health benefits from removal of multiple major risk factors. Methods We assessed the burden of disease and injury attributable to the joint effects of 20 selected leading risk factors in 14 epidemiological subregions of the world. We estimated population attributable fractions, defined as the proportional reduction in disease or mortality that would occur if exposure to a risk factor were reduced to an alternative level, from data on risk factor prevalence and hazard size. For every disease, we estimated joint population attributable fractions for multiple risk factors, by age and sex, from the direct contributions of individual risk factors. To obtain the direct hazards, we reviewed publications and re-analysed cohort data to account for the part of each hazard that is mediated through other risks. Results Globally, an estimated 47% of premature deaths and 39% of total disease burden in 2000 resulted from the joint effects of the risk factors considered. These risks caused a substantial proportion of important diseases, including diarrhoea (92-94%), lower respiratory infections (55-62%), lung cancer (72%), chronic obstructive pulmonary disease (60%), ischaemic heart disease (83-89%), and stroke (70-76%). Removal of these risks would have increased global healthy life expectancy by 9.3 years (17%), ranging from 4.4 years (6%) in the developed countries of the western Pacific to 16.1 years (43%) in parts of sub-Saharan Africa. Interpretation Removal of major risk factors would not only increase healthy life expectancy in every region but also reduce some of the differences between regions. The potential for disease prevention and health gain from tackling major known risks simultaneously would be substantial.
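The population attributable fractions described in the Methods are conventionally computed with Levin's formula, and joint fractions for multiple direct hazards are combined multiplicatively rather than summed (a simple sum could exceed 100%). A minimal sketch of that arithmetic, using hypothetical prevalences and relative risks rather than the study's data:

```python
def paf(prevalence, rr):
    """Population attributable fraction for one risk factor
    (Levin's formula): PAF = p(RR - 1) / (1 + p(RR - 1))."""
    x = prevalence * (rr - 1)
    return x / (1 + x)

def joint_paf(pafs):
    """Joint PAF for independent direct hazards, combined
    multiplicatively: 1 - prod(1 - PAF_i)."""
    remaining = 1.0
    for p in pafs:
        remaining *= 1 - p  # burden left after removing each risk
    return 1 - remaining

# Hypothetical: three risks with prevalences 30%, 15%, 50% and
# relative risks 2.0, 3.0, 1.5 for the same disease.
fractions = [paf(0.30, 2.0), paf(0.15, 3.0), paf(0.50, 1.5)]
print(joint_paf(fractions))
```

Note that the joint fraction is smaller than the sum of the individual fractions, which is why joint estimates like the 47% of premature deaths quoted above cannot be obtained by simple addition.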
Abstract:
The purpose of this study was to prospectively examine the effectiveness and tolerability of a simple radiotherapy technique for the palliation of symptomatic liver metastases. Twenty-eight patients with symptomatic liver metastases were enrolled from seven centres, and received targeted (partial or whole) liver irradiation consisting of 10 Gy in two fractions over 2 days. Symptoms at baseline were hepatic pain (27 patients), abdominal distension (19), night sweats (12), nausea (18) and vomiting (8). Twenty-two patients (76%) had failed previous treatment with chemotherapy, hormonal therapy and/or high-dose steroids. Symptoms and potential toxicities were prospectively assessed at the time of treatment, then 2, 6 and 10 weeks later. Individual symptom response rates were 53-66% at 2 weeks. Partial or complete global symptomatic responses were noted in 15 patients (54%) overall. The treatment was well tolerated, with two patients (7%) experiencing grade 3 toxicity (one vomiting and one diarrhoea); however, four patients reported temporary worsening of pain shortly after treatment. This simple and well-tolerated treatment achieves useful palliation.
Abstract:
The aim of this review is to analyse critically the recent literature on the clinical pharmacokinetics and pharmacodynamics of tacrolimus in solid organ transplant recipients. Dosage and target concentration recommendations for tacrolimus vary from centre to centre, and large pharmacokinetic variability makes it difficult to predict what concentration will be achieved with a particular dose or dosage change. Therapeutic ranges have not been based on statistical approaches. The majority of pharmacokinetic studies have involved intense blood sampling in small homogeneous groups in the immediate post-transplant period. Most have used nonspecific immunoassays and provide little information on pharmacokinetic variability. Demographic investigations seeking correlations between pharmacokinetic parameters and patient factors have generally looked at one covariate at a time and have involved small patient numbers. Factors reported to influence the pharmacokinetics of tacrolimus include the patient group studied, hepatic dysfunction, hepatitis C status, time after transplantation, patient age, donor liver characteristics, recipient race, haematocrit and albumin concentrations, diurnal rhythm, food administration, corticosteroid dosage, diarrhoea and cytochrome P450 (CYP) isoenzyme and P-glycoprotein expression. Population analyses are adding to our understanding of the pharmacokinetics of tacrolimus, but such investigations are still in their infancy. A significant proportion of model variability remains unexplained. Population modelling and Bayesian forecasting may be improved if CYP isoenzymes and/or P-glycoprotein expression could be considered as covariates. Reports have been conflicting as to whether low tacrolimus trough concentrations are related to rejection. Several studies have demonstrated a correlation between high trough concentrations and toxicity, particularly nephrotoxicity. 
The best predictor of pharmacological effect may be drug concentrations in the transplanted organ itself. Researchers have started to question current reliance on trough measurement during therapeutic drug monitoring, with instances of toxicity and rejection occurring when trough concentrations are within 'acceptable' ranges. The correlation between blood concentration and drug exposure can be improved by use of non-trough timepoints. However, controversy exists as to whether this will provide any great benefit, given the added complexity in monitoring. Investigators are now attempting to quantify the pharmacological effects of tacrolimus on immune cells through assays that measure in vivo calcineurin inhibition and markers of immunosuppression such as cytokine concentrations. To date, no studies have correlated pharmacodynamic marker assay results with immunosuppressive efficacy, as determined by allograft outcome, or investigated the relationship between calcineurin inhibition and drug adverse effects. Little is known about the magnitude of the pharmacodynamic variability of tacrolimus.
Abstract:
A model was developed in dogs to determine the impact of oral enrofloxacin administration on the indigenous coliform population in the gastrointestinal tract and subsequent predisposition to colonization by a strain of multidrug-resistant Escherichia coli (MDREC). Dogs given a daily oral dose of 5 mg enrofloxacin kg(-1) for 21 consecutive days showed a significant decline in faecal coliforms to levels below detectable limits within 72 h of administration. Subsequently, faecal coliforms remained suppressed throughout the period of enrofloxacin dosing. Upon termination of antibiotic administration, the number of excreted faecal coliforms slowly returned, over an 8-day period, to levels comparable to those seen prior to antibiotic treatment. Enrofloxacin-treated dogs were more effectively colonized by MDREC, evidenced by a significantly increased count of MDREC in the faeces (7.1 +/- 1.5 log(10) g(-1)) compared with non-antibiotic-treated dogs (5.2 +/- 1.2; P = 0.003). Furthermore, antibiotic treatment also sustained a significantly longer period of MDREC excretion in the faeces (26.8 +/- 10.5 days) compared with animals not treated with enrofloxacin (8.5 +/- 5.4 days; P = 0.0215). These results confirm the importance of sustained delivery of an antimicrobial agent in maintaining and expanding the colonization potential of drug-resistant bacteria in vivo, achieved in part by reducing the competing commensal coliforms in the gastrointestinal tract to below detectable levels in the faeces. Without in vivo antimicrobial selection pressure, commensal coliforms dominated the gastrointestinal tract at the expense of the MDREC population. Conceivably, the model developed could be used to test the efficacy of novel non-antibiotic strategies aimed at monitoring and controlling gastrointestinal colonization by multidrug-resistant members of the Enterobacteriaceae that cause nosocomial infections.
Abstract:
This prospective study evaluated serum procalcitonin (PCT) and C-reactive protein (CRP) as markers for systemic inflammatory response syndrome (SIRS)/sepsis and mortality in patients with traumatic brain injury and subarachnoid haemorrhage. Sixty-two patients were followed for 7 days. Serum PCT and CRP were measured on days 0, 1, 4, 5, 6 and 7. Seventy-seven per cent of patients with traumatic brain injury and 83% with subarachnoid haemorrhage developed SIRS or sepsis (P = 0.75). Baseline PCT and CRP were elevated in 35% and 55% of patients respectively (P = 0.03). There was a statistically non-significant step-wise increase in serum PCT levels from no SIRS (0.4 +/- 0.6 ng/ml) to SIRS (3.05 +/- 9.3 ng/ml) to sepsis (5.5 +/- 12.5 ng/ml). A similar trend was noted in baseline PCT in patients with mild (0.06 +/- 0.9 ng/ml), moderate (0.8 +/- 0.7 ng/ml) and severe head injury (1.2 +/- 1.9 ng/ml). Such a gradation was not observed with serum CRP. There was a non-significant trend towards baseline PCT being a better marker of hospital mortality than baseline CRP (ROC-AUC 0.56 vs 0.31 respectively). This is the first prospective study to document the high incidence of SIRS in neurosurgical patients. In our study, serum PCT appeared to correlate with severity of traumatic brain injury and mortality. However, it could not reliably distinguish between SIRS and sepsis in this cohort. This is in part because baseline PCT elevation seemed to correlate with severity of injury. Only a small proportion of patients developed sepsis, thus necessitating a larger sample size to demonstrate the diagnostic usefulness of serum PCT as a marker of sepsis. Further clinical trials with larger sample sizes are required to confirm any potential role of PCT as a sepsis and outcome indicator in patients with head injuries or subarachnoid haemorrhage.
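The ROC-AUC figures quoted above have a simple probabilistic reading: the chance that a randomly chosen patient who died had a higher marker value than a randomly chosen survivor (0.5 means no discrimination, which is why an AUC of 0.31 indicates a marker performing worse than chance). A minimal sketch of that rank-based computation, using hypothetical PCT values rather than the study's data:

```python
def roc_auc(pos, neg):
    """ROC area via the Mann-Whitney U statistic: the probability
    that a randomly chosen positive (e.g. died) scores higher than
    a randomly chosen negative (e.g. survived); ties count as 0.5."""
    wins = 0.0
    for p in pos:
        for n in neg:
            if p > n:
                wins += 1
            elif p == n:
                wins += 0.5
    return wins / (len(pos) * len(neg))

# Hypothetical baseline PCT values in ng/ml (not the study's data):
died = [5.5, 1.2, 0.8]
survived = [0.4, 0.8, 0.1, 0.6]
print(roc_auc(died, survived))
```

The O(n*m) pairwise loop is fine at this scale; libraries such as scikit-learn compute the same quantity from sorted ranks for large samples.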
Abstract:
The manner in which elements of clinical history, physical examination and investigations influence subjectively assessed illness severity and outcome prediction is poorly understood. This study investigates the relationship between clinician-assessed and objectively assessed illness severity, and the factors influencing clinicians' diagnostic confidence and illness severity rating for ventilated patients with suspected pneumonia in the intensive care unit (ICU). A prospective study of fourteen ICUs included all ventilated admissions with a clinical diagnosis of pneumonia. Data collection included pneumonia type - community-acquired (CAP), hospital-acquired (HAP) and ventilator-associated (VAP) - clinician-determined illness severity (CDIS), diagnostic methods, clinical diagnostic confidence (CDC), microbiological isolates and antibiotic use. For 476 episodes of pneumonia (48% CAP, 24% HAP, 28% VAP), CDC was greatest for CAP (64% CAP, 50% HAP and 49% VAP, P < 0.01) or when pneumonia was considered life-threatening (84% high CDC, 13% medium CDC and 3% low CDC, P < 0.001). Life-threatening pneumonia was predicted by worsening gas exchange (OR 4.8, 95% CI 2.3-10.2, P < 0.001), clinical signs of consolidation (OR 2.0, 95% CI 1.2-3.2, P < 0.01) and the Sepsis-Related Organ Failure Assessment (SOFA) score (OR 1.1, 95% CI 1.1-1.2, P < 0.001). Diagnostic confidence increased with CDIS (OR 16.3, 95% CI 8.4-31.4, P < 0.001), definite pathogen isolation (OR 3.3, 95% CI 2.0-5.6) and clinical signs of consolidation (OR 2.1, 95% CI 1.3-3.3, P = 0.001). Although the CDIS, SOFA score and the Simplified Acute Physiology Score (SAPS II) were all associated with mortality, the SAPS II score was the best predictor of mortality (P = 0.02). Diagnostic confidence for pneumonia is moderate but increases with more classical presentations. A small set of clinical parameters influences subjective assessment. Objective assessment using SAPS II scoring is a better predictor of mortality.
Abstract:
Nosocomial transmission of methicillin-resistant Staphylococcus aureus (MRSA) to patients with cystic fibrosis (CF) frequently results in chronic respiratory tract carriage. This is an increasing problem, adds to the burden of glycopeptide antibiotic use in hospitals, and represents a relative contraindication to lung transplantation. The aim of this study was to determine whether it is possible to eradicate MRSA with prolonged oral combination antibiotics, and whether this treatment is associated with improved clinical status. Adult CF patients (six male, one female) with chronic MRSA infection were treated for six months with rifampicin and sodium fusidate. Outcome data were examined for six months before treatment, on treatment and after treatment. The patients had a mean age of 29.3 (standard deviation = 6.3) years and FEV1 of 36.1% (standard deviation = 12.7) predicted. The mean duration of MRSA isolation was 31 months. The MRSA isolates identified in these patients were of the same lineage as the known endemic strain at the hospital when assessed by pulsed-field gel electrophoresis. Five of the seven had no evidence of MRSA during, and for at least six months after, rifampicin and sodium fusidate. The proportion of sputum samples positive for MRSA was lower during the six months of treatment (0.13) and after treatment (0.19) compared with before treatment (0.85) (P < 0.0001). There was a reduction in the number of days of intravenous antibiotics per six months, with 20.3 +/- 17.6 on treatment compared with 50.7 before treatment and 33.0 after treatment (P = 0.02). There was no change in lung function. Gastrointestinal side effects occurred in three patients, but led to therapy cessation in only one. Despite the use of antibiotics with anti-staphylococcal activity for treatment of respiratory exacerbations, MRSA infection persists. MRSA can be eradicated from the sputum of patients with CF and chronic MRSA carriage by using rifampicin and sodium fusidate for six months.
This finding was associated with a significant reduction in the duration of intravenous antibiotic treatment during therapy. (C) 2003 The Hospital Infection Society. Published by Elsevier Ltd. All rights reserved.
Abstract:
This study of ventilated patients investigated pneumonia risk factors and outcome predictors in 476 episodes of pneumonia (48% community-acquired pneumonia, 24% hospital-acquired pneumonia, 28% ventilator-associated pneumonia) using a prospective survey in 14 intensive care units within Australia and New Zealand. For community-acquired pneumonia, mortality increased with immunosuppression (OR 5.32, 95% CI 1.58-17.99, P < 0.01), clinical signs of consolidation (OR 2.43, 95% CI 1.09-5.44, P = 0.03) and Sepsis-Related Organ Failure Assessment (SOFA) scores (OR 1.19, 95% CI 1.08-1.30, P < 0.001), but improved if appropriate antibiotic changes were made within three days of intensive care unit admission (OR 0.42, 95% CI 0.20-0.86, P = 0.02). For hospital-acquired pneumonia, immunosuppression (OR 6.98, 95% CI 1.16-42.2, P = 0.03) and non-metastatic cancer (OR 3.78, 95% CI 1.20-11.93, P = 0.02) were the principal mortality predictors. Alcoholism (OR 7.80, 95% CI 1.20-1750, P < 0.001), high SOFA scores (OR 1.44, 95% CI 1.20-1.75, P = 0.001) and the isolation of high-risk organisms including Pseudomonas aeruginosa, Acinetobacter spp., Stenotrophomonas spp. and methicillin-resistant Staphylococcus aureus (OR 4.79, 95% CI 1.43-16.03, P = 0.01) were associated with increased mortality in ventilator-associated pneumonia. The use of non-invasive ventilation was independently protective against mortality for patients with community-acquired and hospital-acquired pneumonia (OR 0.35, 95% CI 0.18-0.68, P = 0.002). Mortality was similar for patients requiring both invasive and non-invasive ventilation and for those requiring non-invasive ventilation alone (21% compared with 20% respectively, P = 0.56). Pneumonia risks and mortality predictors in Australian and New Zealand ICUs vary with pneumonia type.
A history of alcoholism is a major risk factor for mortality in ventilator-associated pneumonia, greater in magnitude than the mortality effect of immunosuppression in hospital-acquired or community-acquired pneumonia. Non-invasive ventilation is associated with reduced ICU mortality. For community-acquired pneumonia patients, clinical signs of consolidation worsen mortality, while rationalising antibiotic therapy within three days of ICU admission improves it.
Abstract:
The clinical use of potent, well-tolerated, broad-spectrum antibiotics has been paralleled by the development of resistance in bacteria, and the prevalence of highly resistant bacteria in some intensive care units is despairingly commonplace. The intensive care community faces the realistic prospect of untreatable nosocomial infections and should be searching for new approaches to diagnose and manage resistant bacteria. In this review, we discuss some of the relevant underlying biology, with a particular focus on genetic transfer vehicles and the relationship of selection pressure to their movements. It is an attempt to demystify the relevant language and concepts for the anaesthetist and intensivist, to explain some of the reasons for the emergence of resistance in bacteria, and to provide a contextual basis for discussion of management approaches such as selective decontamination and antibiotic cycling.
Abstract:
Background Our aim was to calculate the global burden of disease and risk factors for 2001, to examine regional trends from 1990 to 2001, and to provide a starting point for the analysis of the Disease Control Priorities Project (DCPP). Methods We calculated mortality, incidence, prevalence, and disability-adjusted life years (DALYs) for 136 diseases and injuries, for seven income/geographic country groups. To assess trends, we re-estimated all-cause mortality for 1990 with the same methods as for 2001. We estimated mortality and disease burden attributable to 19 risk factors. Findings About 56 million people died in 2001. Of these, 10.6 million were children, 99% of whom lived in low- and middle-income countries. More than half of child deaths in 2001 were attributable to acute respiratory infections, measles, diarrhoea, malaria, and HIV/AIDS. The ten leading diseases for global disease burden were perinatal conditions, lower respiratory infections, ischaemic heart disease, cerebrovascular disease, HIV/AIDS, diarrhoeal diseases, unipolar major depression, malaria, chronic obstructive pulmonary disease, and tuberculosis. There was a 20% reduction in global disease burden per head due to communicable, maternal, perinatal, and nutritional conditions between 1990 and 2001. Almost half the disease burden in low- and middle-income countries is now from non-communicable diseases (disease burden per head in Sub-Saharan Africa and the low- and middle-income countries of Europe and Central Asia increased between 1990 and 2001). Undernutrition remains the leading risk factor for health loss. An estimated 45% of global mortality and 36% of global disease burden are attributable to the joint hazardous effects of the 19 risk factors studied. Uncertainty in all-cause mortality estimates ranged from around 1% in high-income countries to 15-20% in Sub-Saharan Africa. Uncertainty was larger for mortality from specific diseases, and for incidence and prevalence of non-fatal outcomes.
Interpretation Despite uncertainties about mortality and burden of disease estimates, our findings suggest that substantial gains in health have been achieved in most populations, countered by the HIV/AIDS epidemic in Sub-Saharan Africa and setbacks in adult mortality in countries of the former Soviet Union. Our results on major disease, injury, and risk factor causes of loss of health, together with information on the cost-effectiveness of interventions, can assist in accelerating progress towards better health and reducing the persistent differentials in health between poor and rich countries.
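The DALY metric underlying these burden estimates combines years of life lost to premature mortality (YLL) with years lived with disability (YLD). A simplified, undiscounted sketch of that sum with hypothetical inputs (the GBD studies additionally apply discounting and, in some revisions, age weighting):

```python
def dalys(deaths, life_exp_at_death, incident_cases,
          disability_weight, duration_years):
    """Undiscounted DALYs = YLL + YLD.

    YLL = deaths x standard life expectancy at the age of death
    YLD = incident cases x disability weight x average duration
    """
    yll = deaths * life_exp_at_death
    yld = incident_cases * disability_weight * duration_years
    return yll + yld

# Hypothetical figures for one disease in one country group:
# 10,000 deaths at a mean remaining life expectancy of 35 years,
# plus 400,000 non-fatal episodes (weight 0.1, lasting ~1 week).
print(dalys(10_000, 35.0, 400_000, 0.1, 0.02))
```

The example illustrates why mortality dominates the DALY total for a condition with short, mild non-fatal episodes, while chronic disabling conditions accumulate burden through the YLD term instead.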
Abstract:
The contribution of enterotoxigenic Escherichia coli (ETEC) to pre-weaning diarrhoea was investigated over a six-month period at five selected commercial piggeries (CPs) in north Vietnam with at least 100 sows each. Diarrhoea was found to affect 71.5% of the litters born during the period of study. Of 406 faecal specimens submitted for bacteriological culture, 200 (49.3%) yielded a heavy pure culture of E. coli and 126 (31%) were confirmed by PCR to carry at least one of eight porcine ETEC virulence genes. ETEC was responsible for 43% of cases of diarrhoea in neonatal pigs during the first 4 days of life and 23.9% of the remaining cases up until the age of weaning. Pathotypes were determined by PCR for the 126 ETEC isolates together with 44 ETEC isolates obtained from village pigs (VPs) raised by smallholder farmers. The CP isolates belonged to five pathotypes, four of which were also identified in VP isolates. Haemolytic serogroup O149:K91 isolates belonging to F4/STa/STb/LT were most commonly identified in both CPs (33% of isolates) and VPs (45.5%). Other combinations identified in both production systems included O64 (F5/STa), O101 (F4/STa/STb) and O-nontypable (F-/STb). A high proportion of CP isolates (22.3%) possessed all three enterotoxin genes (STa/STb/LT), lacked the genes for all five tested fimbriae (F4, F5, F6, F41 and F18) and belonged to serogroup O8. These unusual O8 F- isolates were haemolytic and were isolated from diarrhoeic piglets of all ages at each CP, suggesting that they have pathogenic potential.
Abstract:
Heat stroke is a life-threatening condition that can be fatal if not appropriately managed. Although heat stroke has been recognised as a medical condition for centuries, a universally accepted definition of heat stroke is lacking and the pathology of heat stroke is not fully understood. Information derived from autopsy reports and the clinical presentation of patients with heat stroke indicates that hyperthermia, septicaemia, central nervous system impairment and cardiovascular failure play important roles in the pathology of heat stroke. The current models of heat stroke advocate that heat stroke is triggered by hyperthermia but is driven by endotoxaemia. Endotoxaemia triggers the systemic inflammatory response, which can lead to systemic coagulation and haemorrhage, necrosis, cell death and multi-organ failure. However, the current heat stroke models cannot fully explain the discrepancies in high core temperature (Tc) as a trigger of heat stroke within and between individuals. Research on the concept of critical Tc as a limitation to endurance exercise implies that a high Tc may function as a signal to trigger the protective mechanisms against heat stroke. Athletes undergoing a period of intense training are subjected to a variety of immune and gastrointestinal (GI) disturbances. The immune disturbances include the suppression of immune cells and their functions, suppression of cell-mediated immunity, translocation of lipopolysaccharide (LPS), suppression of anti-LPS antibodies, increased macrophage activity due to muscle tissue damage, and increased concentration of circulating inflammatory and pyrogenic cytokines. Common symptoms of exercise-induced GI disturbances include diarrhoea, vomiting, gastrointestinal bleeding, and cramps, which may increase gut-related LPS translocation.
This article discusses the current evidence that supports the argument that these exercise-induced immune and GI disturbances may contribute to the development of endotoxaemia and heat stroke. When endotoxaemia can be tolerated or prevented, continuing exercise and heat exposure will elevate Tc to a higher level (> 42 degrees C), where heat stroke may occur through the direct thermal effects of heat on organ tissues and cells. We also discuss the evidence suggesting that heat stroke may occur through endotoxaemia (heat sepsis), the primary pathway of heat stroke, or hyperthermia, the secondary pathway of heat stroke. The existence of these two pathways of heat stroke and the contribution of exercise-induced immune and GI disturbances in the primary pathway of heat stroke are illustrated in the dual pathway model of heat stroke. This model of heat stroke suggests that prolonged intense exercise suppresses anti-LPS mechanisms, and promotes inflammatory and pyrogenic activities in the pathway of heat stroke.