983 results for Reactional episodes
Abstract:
Introduction Activated protein C (APC) deficiency is prevalent in patients with severe sepsis and septic shock. The aim of the study was to relate anticoagulant activity, evaluated by APC, to other coagulation parameters, adjusted for 28-day mortality.
Methods A cohort study of 150 critically ill adults. Age, sex, sources of infection and coagulation markers within <24 hours from severe sepsis or septic shock onset, defined according to Surviving Sepsis Campaign (SSC) criteria, were studied. We analyzed APC activity using a hemostasis laboratory analyzer (BCS® XP; Siemens). A descriptive and comparative statistical analysis was performed using SPSS version 15.0 (SPSS Inc., Chicago, IL, USA).
Results We analyzed 150 consecutive episodes of severe sepsis (16%) or septic shock (84%) admitted to the ICU. The median age of the study sample was 64 (interquartile range (IQR): 22.3
Abstract:
Background. Transient global amnesia (TGA) is a syndrome of sudden, unexplained isolated short-term memory loss. In the majority of TGA cases, no cause can be identified, and neuroimaging, CSF studies and EEG are usually normal. We present a patient with TGA associated with a small acute infarct at the cingulate gyrus. Case Report. The patient, a 62-year-old man, developed two episodes of TGA. He had hypertension and hypercholesterolemia. He was found to have a small acute ischemic stroke (15 mm maximal diameter) in the right cingulate gyrus, diagnosed on brain magnetic resonance imaging. No lesions involving other limbic system structures such as the thalamus, fornix, corpus callosum, or hippocampus were seen. The remainder of the examination was normal. Conclusion. Unilateral ischemic lesions of limbic system structures may result in TGA. It should be borne in mind that TGA can be a clinical manifestation associated with cingulate gyrus infarction.
Abstract:
Early Cretaceous life and the environment were strongly influenced by the accelerated breakup of Pangaea, which was associated with the formation of a multitude of rift basins, intensified spreading, and important volcanic activity on land and in the sea. These processes likely interacted with greenhouse conditions, and Early Cretaceous climate oscillated between "normal" greenhouse, predominantly arid conditions, and intensified greenhouse, predominantly humid conditions. Arid conditions were important during the latest Jurassic and early Berriasian, the late Barremian, and partly also during the late Aptian. Humid conditions were particularly intense and widespread during shorter episodes of environmental change (EECs): the Valanginian Weissert, the latest Hauterivian Faraoni, the latest Barremian-earliest Aptian Taxy, the early Aptian Selli, the early late Aptian Fallot and the late Aptian-early Albian Paquier episodes. Arid conditions were associated with evaporation, low biogeochemical weathering rates, low nutrient fluxes, and partly stratified oceans, leading to oxygen depletion and enhanced preservation of laminated, organic-rich mud (LOM). Humid conditions enabled elevated biogeochemical weathering rates and nutrient fluxes, important runoff and the buildup of freshwater lids in proximal basins, intensified oceanic and atmospheric circulation, widespread upwelling and phosphogenesis, important primary productivity and enhanced preservation of LOM in expanded oxygen-minimum zones. The transition from arid to humid climates may have been associated with the net transfer of water to the continent owing to the infill of dried-out groundwater reservoirs in internally drained inland basins. This resulted in shorter-term sea-level fall, which was followed by sea-level rise. These sea-level changes and the influx of freshwater into the ocean may have influenced oxygen-isotope signatures. Climate change preceding and during the Early Cretaceous EECs may have been rapid, but in general, the EECs had a "pre"-history, during which the stage was set for environmental change. Negative feedback on the climate through increased marine LOM preservation was unlikely, because of the low overall organic-carbon accumulation rates during these episodes. Life and climate co-evolved during the Early Cretaceous. Arid conditions may have affected continental life, such as across the Tithonian/Berriasian boundary. Humid conditions and the corresponding tendency to develop dys- to anaerobic conditions in deeper ocean waters led to phases of accelerated extinction in the oceans, but may have led to more luxuriant vegetation cover on continents, such as during the Valanginian, to the benefit of herbivores. During Early Cretaceous EECs, reef systems and carbonate platforms in general were particularly vulnerable. They were the first to disappear and the last to recover, often only after several million years.
Abstract:
Adolescence, defined as a transition phase toward autonomy and independence, is a natural time of learning and adjustment, particularly in the setting of long-term goals and personal aspirations. It also is a period of heightened sensation seeking, including risk taking and reckless behaviors, which is a major cause of morbidity and mortality among teenagers. Recent observations suggest that a relative immaturity in frontal cortical neural systems may underlie the adolescent propensity for uninhibited risk taking and hazardous behaviors. However, converging preclinical and clinical studies do not support a simple model of frontal cortical immaturity, and there is substantial evidence that adolescents engage in dangerous activities, including drug abuse, despite knowing and understanding the risks involved. Therefore, a current consensus considers that much brain development during adolescence occurs in brain regions and systems that are critically involved in the perception and evaluation of risk and reward, leading to important changes in social and affective processing. Hence, rather than naive, immature and vulnerable, the adolescent brain, particularly the prefrontal cortex, should be considered as prewired for expecting novel experiences. In this perspective, thrill seeking may not represent a danger but rather a window of opportunities permitting the development of cognitive control through multiple experiences. However, if the maturation of brain systems implicated in self-regulation is contextually dependent, it is important to understand which experiences matter most. In particular, it is essential to unveil the underpinning mechanisms by which recurrent adverse episodes of stress or unrestricted access to drugs can shape the adolescent brain and potentially trigger life-long maladaptive responses.
Abstract:
BACKGROUND Missed, delayed or incorrect diagnoses are considered to be diagnostic errors. The aim of this paper is to describe the methodology of a study to analyse cognitive aspects of the process by which primary care (PC) physicians diagnose dyspnoea. It examines the possible links between the use of heuristics, suboptimal cognitive acts and diagnostic errors, using Reason's taxonomy of human error (slips, lapses, mistakes and violations). The influence of situational factors (professional experience, perceived overwork and fatigue) is also analysed. METHODS Cohort study of new episodes of dyspnoea in patients receiving care from family physicians and residents at PC centres in Granada (Spain). With an initial expected diagnostic error rate of 20%, and a sampling error of 3%, 384 episodes of dyspnoea are calculated to be required. In addition to filling out the electronic medical record of the patients attended, each physician fills out 2 specially designed questionnaires about the diagnostic process performed in each case of dyspnoea. The first questionnaire includes questions on the physician's initial diagnostic impression, the 3 most likely diagnoses (in order of likelihood), and the diagnosis reached after the initial medical history and physical examination. It also includes items on the physicians' perceived overwork and fatigue during patient care. The second questionnaire records the confirmed diagnosis once it is reached. The complete diagnostic process is peer-reviewed to identify and classify the diagnostic errors. The possible use of heuristics of representativeness, availability, and anchoring and adjustment in each diagnostic process is also analysed. Each audit is reviewed with the physician responsible for the diagnostic process. Finally, logistic regression models are used to determine if there are differences in the diagnostic error variables based on the heuristics identified. DISCUSSION This work sets out a new approach to studying the diagnostic decision-making process in PC, taking advantage of new technologies which allow immediate recording of the decision-making process.
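As a rough check on the sample-size arithmetic, the standard formula for estimating a proportion is n = z²·p·(1−p)/e². A minimal Python sketch follows; the 95% confidence level is an assumption (the abstract does not state it), and under this formula the reported 384 episodes correspond to a margin of error of about 4% rather than the stated 3%.

```python
from math import ceil
from statistics import NormalDist

def proportion_sample_size(p: float, e: float, confidence: float = 0.95) -> int:
    """n = z^2 * p * (1 - p) / e^2 for estimating a proportion p
    with absolute margin of error e."""
    z = NormalDist().inv_cdf(1 - (1 - confidence) / 2)  # z = 1.96 at 95%
    return ceil(z ** 2 * p * (1 - p) / e ** 2)

# With the abstract's 20% expected diagnostic error rate:
print(proportion_sample_size(0.20, 0.04))  # 385, i.e. ~the 384 reported
print(proportion_sample_size(0.20, 0.03))  # 683 at a strict 3% margin
```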
Abstract:
INTRODUCTION For critically ill patients, enteral immunonutrition results in notable reductions in infections and in length of hospital stay, but not in mortality, raising the question of whether this relates to the heterogeneous nature of critically ill patients or to the altered absorption of specific nutrients within the immunonutrient mix (e.g. iron). Immune-associated functional iron deficiency (FID) is not only one of the many causes of anaemia in the critically ill, but also a cause of inappropriate immune response, leading to a longer duration of episodes of systemic inflammatory response syndrome and poor outcome. OBJECTIVE This prospective cross-sectional study was undertaken to assess the prevalence of FID in critically ill patients during their stay in the intensive care unit (ICU), in order to identify the population of patients most likely to benefit from iron therapy. METHOD Full blood cell counts, including reticulocytes (RETIC), serum iron (SI), transferrin levels (TRF) and saturation (satTRF), serum transferrin receptor (sTfR), ferritin (FRT) and C-reactive protein (CRP) were measured in venous blood samples from 131 random patients admitted to the ICU for at least 24 h (length of ICU stay, LIS; min: 1 day; max: 38 days). RESULTS Anaemia (Hb < 12 g/dL) was present in 76% of the patients (Hb < 10 g/dL in 33%), hypoferremia (SI < 45 microg/dL) in 69%; satTRF < 20% in 53%; FRT < 100 ng/mL in 23%; sTfR > 2.3 mg/dL in 13%; and CRP > 0.5 mg/dL in 88%. Statistically significant correlations (Pearson's r; *p < 0.05, **p < 0.01) were obtained for serum CRP levels and WBC**, Hb*, TRF**, satTRF*, and FRT**. There was also a strong correlation between TRF and FRT (-0.650**), but not between FRT and satTRF or SI. LIS correlated with Hb*, CRP**, TRF*, satTRF* and FRT**. CONCLUSIONS A large proportion of critically ill patients admitted to the ICU presented the typical functional iron deficiency (FID) of acute inflammation-related anaemia (AIRA). This FID correlated with inflammatory status and length of ICU stay. However, 21% of the ICU patients with AIRA had an associated real iron deficiency (satTRF < 20%, FRT < 100 and sTfR > 2.3). Since oral iron supplementation seems to be ineffective, all these patients might benefit from IV iron therapy for correction of real or functional iron deficiency, which in turn might help to ameliorate their inflammatory status.
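To make the quoted cut-offs concrete, here is a minimal Python sketch; how the thresholds combine into categories is an interpretation of the abstract for illustration, not the authors' published algorithm.

```python
def iron_status(sat_trf: float, ferritin: float, stfr: float) -> str:
    """Classify iron status from transferrin saturation (%), ferritin (ng/mL)
    and soluble transferrin receptor (mg/dL), using the abstract's cut-offs.
    The combination rules are an assumption for illustration."""
    if sat_trf < 20 and ferritin < 100 and stfr > 2.3:
        return "functional plus real iron deficiency"   # the 21% subgroup
    if sat_trf < 20:
        return "functional iron deficiency (typical of AIRA)"
    return "no iron deficiency by these criteria"

print(iron_status(sat_trf=15, ferritin=80, stfr=3.0))   # real + functional
print(iron_status(sat_trf=15, ferritin=250, stfr=1.0))  # functional only
```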
Abstract:
BACKGROUND While pain is frequently associated with unipolar depression, few studies have investigated the link between pain and bipolar depression. In the present study we estimated the prevalence and characteristics of pain among patients with bipolar depression treated by psychiatrists in their regular clinical practice. The study was designed to identify factors associated with the manifestation of pain in these patients. METHODS Patients diagnosed with bipolar disorder (n=121) were selected to participate in a cross-sectional study in which DSM-IV-TR criteria were employed to identify depressive episodes. The patients were asked to describe any pain experienced during the study, and in the 6 weeks beforehand, by means of a Visual Analogue Scale (VAS). RESULTS Over half of the bipolar depressed patients (51.2%, 95% CI: 41.9%-60.6%), and two thirds of the women, experienced concomitant pain. The pain was of moderate to severe intensity and prolonged duration, and it occurred at multiple sites, significantly limiting the patients' everyday activities. The most important factors associated with the presence of pain were older age, sleep disorders and delayed diagnosis of bipolar disorder. CONCLUSIONS Chronic pain is common in bipolar depressed patients, and it is related to sleep disorders and delayed diagnosis of their disorder. More attention should be paid to the presence of pain in bipolar depressed patients, in order to achieve more accurate diagnoses and to provide better treatment options.
Abstract:
INTRODUCTION: The onset of post-transplant diabetes mellitus (PTDM) among kidney recipients is associated with an increased risk of graft failure and high rates of morbidity and mortality. Minimizing the risk of PTDM is a priority for improving long-term survival rates. AIMS: This study aims to assess the prevalence of PTDM in a renal transplant patient population, to identify risk factors, and to assess graft and patient survival. METHODS: The sample consisted of 112 renal transplant patients (69 men and 43 women) who attended the post-transplant clinic over five years. The following were analyzed as potential risk factors for PTDM: age, sex, body mass index (BMI), obesity, HCV, hypertension, dyslipidemia, total cholesterol (TC), serum triglycerides and immunosuppressive therapy (cyclosporine, tacrolimus, mycophenolate mofetil and sirolimus); the prevalence of acute rejection episodes was also evaluated. RESULTS: The prevalence of PTDM was 24.2%, compared with 85 patients (75.8%) with normal glucose levels. PTDM patients showed a higher BMI, a higher percentage of overweight and dyslipidemia, and higher total cholesterol and triglyceride levels, and a greater percentage of them had received mycophenolate mofetil. CONCLUSIONS: There is a high incidence of PTDM in kidney recipients, which underlines the importance of weight control and strict monitoring of all identified risk factors, as well as of minimizing the doses of immunosuppressive therapies, to prevent the onset of PTDM.
Abstract:
BACKGROUND & AIMS Hy's Law, which states that hepatocellular drug-induced liver injury (DILI) with jaundice indicates a serious reaction, is used widely to determine risk for acute liver failure (ALF). We aimed to optimize the definition of Hy's Law and to develop a model for predicting ALF in patients with DILI. METHODS We collected data from 771 patients with DILI (805 episodes) from the Spanish DILI registry, from April 1994 through August 2012. We analyzed data collected at DILI recognition and at the time of peak levels of alanine aminotransferase (ALT) and total bilirubin (TBL). RESULTS Of the 771 patients with DILI, 32 developed ALF. Hepatocellular injury, female sex, high levels of TBL, and a high ratio of aspartate aminotransferase (AST):ALT were independent risk factors for ALF. We compared 3 ways to use Hy's Law to predict which patients would develop ALF; all included TBL greater than 2-fold the upper limit of normal (×ULN) and either ALT level greater than 3 × ULN, a ratio (R) value (ALT × ULN/alkaline phosphatase × ULN) of 5 or greater, or a new ratio (nR) value (ALT or AST, whichever produced the highest ×ULN/ alkaline phosphatase × ULN value) of 5 or greater. At recognition of DILI, the R- and nR-based models identified patients who developed ALF with 67% and 63% specificity, respectively, whereas use of only ALT level identified them with 44% specificity. However, the level of ALT and the nR model each identified patients who developed ALF with 90% sensitivity, whereas the R criteria identified them with 83% sensitivity. An equal number of patients who did and did not develop ALF had alkaline phosphatase levels greater than 2 × ULN. An algorithm based on AST level greater than 17.3 × ULN, TBL greater than 6.6 × ULN, and AST:ALT greater than 1.5 identified patients who developed ALF with 82% specificity and 80% sensitivity. CONCLUSIONS When applied at DILI recognition, the nR criteria for Hy's Law provides the best balance of sensitivity and specificity whereas our new composite algorithm provides additional specificity in predicting the ultimate development of ALF.
Abstract:
Aims Achievement of good metabolic control in Type 1 diabetes is a difficult task in routine diabetes care. Education-based flexible intensified insulin therapy has the potential to meet therapeutic targets while limiting the risk of severe hypoglycaemia. We evaluated metabolic control and the rate of severe hypoglycaemia in real-life clinical practice in a centre using flexible intensified insulin therapy as standard of care since 1990. Methods Patients followed for Type 1 diabetes (n = 206) or other causes of absolute insulin deficiency (n = 17) in our outpatient clinic were analysed in a cross-sectional study. Mean age (± standard deviation) was 48.9 ± 15.7 years, with a diabetes duration of 21.4 ± 14.4 years. Outcome measures were HbA1c and frequency of severe hypoglycaemia. Results Median HbA1c was 7.1% (54 mmol/mol) [interquartile range 6.6-7.8% (51-62 mmol/mol)]; good or acceptable metabolic control with HbA1c < 7.0% (53 mmol/mol) or 7.5% (58 mmol/mol) was reached in 43.5% and 64.6% of the patients, respectively. The frequency of severe hypoglycaemic episodes was 15 per 100 patient-years; 72.3% of the patients did not experience any such episodes during the past 5 years. Conclusions Good or acceptable metabolic control is achievable in the majority of patients with Type 1 diabetes or other causes of absolute insulin deficiency in routine diabetes care, while limiting the risk of severe hypoglycaemia.
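The paired percentage and mmol/mol figures follow the standard IFCC-NGSP master equation, HbA1c (mmol/mol) = 10.929 × (HbA1c % − 2.15). A small sketch reproducing the abstract's values:

```python
def hba1c_to_mmol_per_mol(hba1c_percent: float) -> float:
    """IFCC-NGSP master equation for converting HbA1c % to mmol/mol."""
    return 10.929 * (hba1c_percent - 2.15)

for pct in (7.0, 7.1, 7.5):
    print(f"{pct}% -> {hba1c_to_mmol_per_mol(pct):.0f} mmol/mol")
# 7.0% -> 53, 7.1% -> 54, 7.5% -> 58, matching the figures in the abstract
```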
Abstract:
Leprosy inflammatory episodes [type 1 (T1R) and type 2 (T2R) reactions] represent the major cause of irreversible nerve damage. Leprosy serology is known to be influenced by the patient's bacterial index (BI), with higher positivity in multibacillary (MB) patients, and specific multidrug therapy (MDT) reduces antibody production. This study used ELISA to evaluate antibody responses to the Infectious Disease Research Institute's leprosy diagnostic-1 (LID-1) fusion protein and to phenolic glycolipid I (PGL-I) in 100 paired serum samples from 50 MB patients, collected in the presence/absence of reactions and in nonreactional patients before/after MDT. Patients who presented T2R had a median BI of 3+, while MB patients with T1R and nonreactional patients had a median BI of 2.5+ (p > 0.05). Anti-LID-1 and anti-PGL-I antibodies declined in patients diagnosed during T1R (p < 0.05). Anti-LID-1 levels waned in MB patients with T2R at diagnosis and in nonreactional MB patients (p < 0.05). Higher anti-LID-1 levels were seen in patients with T2R at diagnosis (vs. patients with T1R at diagnosis, p = 0.008; vs. nonreactional patients, p = 0.020) and in patients with T2R during MDT (vs. nonreactional MB patients, p = 0.020). In MB patients, high and persistent anti-LID-1 antibody levels might be a useful tool for clinicians to predict which patients are more susceptible to developing leprosy T2R.
Abstract:
Dapagliflozin is a new oral antidiabetic agent whose mechanism of action increases renal glucose excretion, independently of insulin secretion or insulin action. The efficacy of dapagliflozin is dependent on renal function. The use of dapagliflozin has been licensed to improve glycaemic control in patients with type 2 diabetes mellitus as:
- monotherapy, when diet and exercise alone do not provide adequate glycaemic control in patients for whom the use of metformin is considered inappropriate due to intolerance;
- add-on combination therapy with other glucose-lowering agents, including insulin, when these, together with diet and exercise, do not provide adequate glycaemic control.
Funding has been restricted to the use of dapagliflozin, subject to prior approval, as dual therapy in combination with metformin. This report aims to assess the efficacy and safety of dapagliflozin in the treatment of type 2 diabetes mellitus, rate its added therapeutic value and identify its current place in therapy. A systematic literature search was carried out for the purpose of this evaluation, using the PubMed, Embase, Cochrane and IDIS databases as well as other secondary sources of evidence-based medicine, therapeutic bulletins and national and international drug agencies. Following critical reading and analysis of the selected articles, a summary of the available scientific evidence was made using Scottish Intercollegiate Guidelines Network (SIGN) criteria. Only one randomised clinical trial, out of the ten trials found, was considered to provide a suitable comparison (versus dual therapy in combination with the sulfonylurea glipizide in patients inadequately controlled with metformin, diet and exercise). No trials have evaluated variables of relevance to patients, except for safety variables. The main efficacy variable in the trials was the change from baseline in HbA1c, except for one study which evaluated the change from baseline in total body weight as the main variable. Baseline characteristics of the patients enrolled in the trials differ significantly from those of the population with diabetes in our setting, which tends to be older and to have a longer history of type 2 diabetes mellitus. The major limitation of dapagliflozin derives from its mechanism of action, since its efficacy decreases as renal function declines. The use of dapagliflozin is not recommended in patients with moderate to severe renal impairment (CrCl < 60 ml/min or GFR < 60 ml/min/1.73 m2), nor in elderly patients, in whom a decrease in renal function can be expected. The assessment of safety includes the incidence and rate of discontinuations due to adverse events, episodes of hypoglycaemia, signs or symptoms of genital and urinary tract infections, dehydration, hypovolaemia and hypotension. Further pharmacoepidemiological studies are to be carried out to clarify the long-term effects of dapagliflozin on renal function and its potential effect on the development of breast and bladder tumours. Dapagliflozin as monotherapy has not been evaluated against adequate comparators (sulfonylureas, pioglitazone, gliptins). In combination therapy with metformin, the efficacy of dapagliflozin was shown to be non-inferior to glipizide plus metformin, resulting in a mean reduction of 0.52% in HbA1c, with a difference of 0.00 between the two groups (95% CI: -0.11 to 0.11). There are no comparative data against other second-line treatment options.
As shown in the studies, the overall incidence of adverse events with dapagliflozin as monotherapy (21.5%) was similar to that observed with placebo, and greater than that observed with metformin (15.4%). Hypoglycaemia of any type was the most frequently reported adverse event. The incidence of severe hypoglycaemic events observed in most of the studies was low. The overall incidence of adverse events in the study that compared dapagliflozin plus metformin against glipizide plus metformin was similar for both groups (27%), and the incidence of hypoglycaemic events with dapagliflozin (3.5%) was significantly lower than that observed with glipizide (40.8%). Reductions in body weight of about 2 to 3 kg and a slight decrease in blood pressure (1 to 5 mmHg) have been observed in all studies in the groups treated with dapagliflozin together with diet and exercise. The dosing scheme (every 24 hours) is similar to that of other oral antidiabetic agents, and its cost is similar to that of the gliptins and higher than that of sulfonylureas or generic pioglitazone. Funding has been limited to the use of dapagliflozin as a dual-therapy regimen in combination with metformin, as an option for patients with contraindication or intolerance to sulfonylureas (such as those experiencing frequent hypoglycaemic events or weight-associated risks), as long as they are under 75 years of age and have no moderate to severe renal impairment. In the light of the above, we consider that dapagliflozin offers no therapeutic innovation in the therapy of type 2 diabetes mellitus over the other therapeutic alternatives available.
Abstract:
The impact of antimicrobial resistance on clinical outcomes is the subject of ongoing investigations, although uncertainty remains about its contribution to mortality. We investigated the impact of carbapenem resistance on mortality in Pseudomonas aeruginosa bacteremia in a prospective multicenter (10 teaching hospitals) observational study of patients with monomicrobial bacteremia followed up for 30 days after the onset of bacteremia. The adjusted influence of carbapenem resistance on mortality was studied by using Cox regression analysis. Of 632 episodes, 487 (77%) were caused by carbapenem-susceptible P. aeruginosa (CSPA) isolates, and 145 (23%) were caused by carbapenem-resistant P. aeruginosa (CRPA) isolates. The median incidence density of nosocomial CRPA bacteremia was 2.3 episodes per 100,000 patient-days (95% confidence interval [CI], 1.9 to 2.8). The regression demonstrated a time-dependent effect of carbapenem resistance on mortality as well as a significant interaction with the Charlson index: the deleterious effect of carbapenem resistance on mortality decreased with higher Charlson index scores. The impact of resistance on mortality was statistically significant only from the fifth day after the onset of the bacteremia, reaching its peak values at day 30 (adjusted hazard ratio for a Charlson score of 0 at day 30, 9.9 [95% CI, 3.3 to 29.4]; adjusted hazard ratio for a Charlson score of 5 at day 30, 2.6 [95% CI, 0.8 to 8]). This study clarifies the relationship between carbapenem resistance and mortality in patients with P. aeruginosa bacteremia. Although resistance was associated with a higher risk of mortality, the study suggested that this deleterious effect may not be as great during the first days of the bacteremia or in the presence of comorbidities.
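The reported attenuation of the resistance effect at higher Charlson scores is the kind of pattern an interaction term captures in a Cox model. A minimal sketch on synthetic data using the lifelines library; the column names, effect sizes and data are illustrative assumptions, not the study dataset:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Synthetic stand-in for the cohort; not the study data.
rng = np.random.default_rng(0)
n = 632
df = pd.DataFrame({
    "crpa": rng.binomial(1, 0.23, n),     # carbapenem-resistant isolate
    "charlson": rng.integers(0, 6, n),    # comorbidity index
})
# Hazard rises with resistance but the effect shrinks as Charlson grows.
hazard = 0.03 * np.exp(0.9 * df["crpa"] - 0.15 * df["crpa"] * df["charlson"])
time = rng.exponential(1 / hazard)
df["time"] = np.minimum(time, 30)         # 30-day follow-up, censored at day 30
df["death"] = (time <= 30).astype(int)

# The crpa x charlson interaction lets the effect of resistance
# vary with comorbidity, as in the study's adjusted Cox analysis.
df["crpa_x_charlson"] = df["crpa"] * df["charlson"]

CoxPHFitter().fit(df, duration_col="time", event_col="death").print_summary()
```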
Abstract:
The impact of the adequacy of empirical therapy on outcome for patients with bloodstream infections (BSI) is key for determining whether adequate empirical coverage should be prioritized over other, more conservative approaches. Recent systematic reviews outlined the need for new studies in the field, using improved methodologies. We assessed the impact of inadequate empirical treatment on the mortality of patients with BSI in the present-day context, incorporating recent methodological recommendations. A prospective multicenter cohort including all BSI episodes in adult patients was performed in 15 hospitals in Andalucía, Spain, over a 2-month period in 2006 to 2007. The main outcome variables were 14- and 30-day mortality. Adjusted analyses were performed by multivariate analysis and propensity score-based matching. Eight hundred one episodes were included. Inadequate empirical therapy was administered in 199 (24.8%) episodes; mortality at days 14 and 30 was 18.55% and 22.6%, respectively. After controlling for age, Charlson index, Pitt score, neutropenia, source, etiology, and presentation with severe sepsis or shock, inadequate empirical treatment was associated with increased mortality at days 14 and 30 (odds ratios [ORs], 2.12 and 1.56; 95% confidence intervals [95% CI], 1.34 to 3.34 and 1.01 to 2.40, respectively). The adjusted ORs after a propensity score-based matched analysis were 3.03 and 1.70 (95% CI, 1.60 to 5.74 and 0.98 to 2.98, respectively). In conclusion, inadequate empirical therapy is independently associated with increased mortality in patients with BSI. Programs to improve the quality of empirical therapy in patients with suspicion of BSI and optimization of definitive therapy should be implemented.
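Propensity score-based matching of the kind used here can be sketched in a few lines: model the probability of receiving inadequate empirical therapy given confounders, then pair each exposed episode with the nearest-scoring unexposed one. The variables and data below are synthetic illustrations, not the study's:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(1)
n = 801
X = rng.normal(size=(n, 4))                      # confounders (age, Pitt score, ...)
t = rng.binomial(1, 1 / (1 + np.exp(-X[:, 0])))  # inadequate empirical therapy flag
y = rng.binomial(1, 1 / (1 + np.exp(-(-1.5 + 0.6 * t + X[:, 1]))))  # 30-day death

# 1. Propensity score: P(inadequate therapy | confounders).
ps = LogisticRegression().fit(X, t).predict_proba(X)[:, 1]

# 2. 1:1 nearest-neighbour matching on the propensity score.
treated = np.flatnonzero(t == 1)
control = np.flatnonzero(t == 0)
nn = NearestNeighbors(n_neighbors=1).fit(ps[control, None])
_, idx = nn.kneighbors(ps[treated, None])
matched = control[idx.ravel()]

# 3. Crude mortality comparison within the matched sample.
print("inadequate therapy mortality:", y[treated].mean())
print("matched adequate therapy mortality:", y[matched].mean())
```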
Abstract:
BACKGROUND Complicated pyelonephritis (cPN), a common cause of hospital admission, is still a poorly understood entity, given the difficulty involved in its correct definition. The aim of this study was to analyze the main epidemiological, clinical, and microbiological characteristics of cPN and its prognosis in a large cohort of patients. METHODS We conducted a prospective, observational study including 1325 consecutive patients older than 14 years diagnosed with cPN and admitted to a tertiary university hospital between 1997 and 2013. After analyzing the main demographic, clinical and microbiological data, covariates found to be associated with attributable mortality in univariate analysis were included in a multivariate logistic regression model. RESULTS Of the 1325 patients, 689 (52%) were men and 636 (48%) women; the median age was 63 years (interquartile range [IQR] 46.5-73). Nine hundred and forty patients (70.9%) had functional or structural abnormalities of the urinary tract, 215 (16.2%) were immunocompromised, 152 (11.5%) had undergone previous urinary tract instrumentation, and 196 (14.8%) had a long-term bladder catheter, nephrostomy tube or ureteral catheter. Urine culture was positive in 813 (67.7%) of the 1251 patients in whom it was done, and of the 1032 patients who had a blood culture, 366 (34%) had bacteraemia. Escherichia coli was the causative agent in 615 episodes (67%), Klebsiella spp. in 73 (7.9%) and Proteus spp. in 61 (6.6%). Fourteen point one percent of Gram-negative bacilli (GNB) isolates were ESBL producers. In total, 343 patients (25.9%) developed severe sepsis and 165 (12.5%) septic shock. Crude mortality was 6.5% and attributable mortality was 4.1%. Multivariate analysis showed that age >75 years (OR 2.77; 95% CI, 1.35-5.68), immunosuppression (OR 3.14; 95% CI, 1.47-6.70), and septic shock (OR 58.49; 95% CI, 26.6-128.5) were independently associated with attributable mortality. CONCLUSIONS cPN generates high morbidity and mortality and likely considerable consumption of healthcare resources. This study highlights the factors directly associated with mortality, though further studies are needed in the near future aimed at identifying subgroups of low-risk patients suitable for outpatient management.
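The odds ratios and confidence intervals quoted above are the exponentiated coefficients of a multivariate logistic regression. A minimal sketch on synthetic data using statsmodels; the variable names and effect sizes are assumptions for illustration, not the study dataset:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Synthetic stand-in for the cohort; not the study data.
rng = np.random.default_rng(2)
n = 1325
df = pd.DataFrame({
    "age_gt_75": rng.binomial(1, 0.25, n),
    "immunosuppression": rng.binomial(1, 0.16, n),
    "septic_shock": rng.binomial(1, 0.125, n),
})
lp = -4 + 1.0 * df["age_gt_75"] + 1.1 * df["immunosuppression"] + 4.0 * df["septic_shock"]
df["death"] = rng.binomial(1, 1 / (1 + np.exp(-lp)))

X = sm.add_constant(df[["age_gt_75", "immunosuppression", "septic_shock"]])
res = sm.Logit(df["death"], X).fit(disp=0)

# Odds ratios and 95% CIs are exp(coefficient) and exp(CI bounds).
print(np.exp(res.params))
print(np.exp(res.conf_int()))
```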