942 results for Time of injury
Abstract:
Introduction: Spinal cord injuries carry with them the problem of disability; their sequelae interfere with quality of life and with the functional independence of the affected individuals, and limitations that persist throughout life are common. The present study aims to characterize subjects who sustained a spinal cord injury more than one year earlier with respect to their quality of life and functional independence, as well as the relationship between the sociodemographic variables (gender, age, marital status, education and profession) and the clinical variables under study (level, type, etiology and date of injury). Materials and Methods: This work is a descriptive, exploratory, observational study with a quantitative approach. The sample consisted of adults with spinal cord injury recruited through Associação Salvador and Associação Vida Independente. The instruments used in the study were a sociodemographic and clinical questionnaire, the Functional Independence Measure (FIM) scale (Granger, Hamilton, Keith, Zielezny, & Sherwin, 1986) and the SF-36 scale (Ware & Sherbourne, 1992). Results and Discussion: The analyses showed satisfactory values. Functional independence and its physical and sociocognitive dimensions showed statistically significant results in relation to the neurological level variable. Regarding quality of life, statistically significant results were found between the emotional role dimension and education, between physical role and profession, and between vitality and age. Satisfactory values were also found between all dimensions and neurological level, between social functioning and date of injury and, finally, between etiology and social functioning, vitality, pain, and general and mental health. Conclusion: The results of this study reinforce the view that, although individuals experience a traumatic event and negative aspects after spinal cord injury, there are different levels of perceived quality of life and functional independence, and these vary as a function of certain sociodemographic and clinical variables. The study's limitations are also presented and proposals for future work are discussed.
Abstract:
BACKGROUND: Reverse transcription quantitative real-time polymerase chain reaction (RT-qPCR) is a widely used, highly sensitive laboratory technique to rapidly and easily detect, identify and quantify gene expression. Reliable RT-qPCR data necessitate accurate normalization with validated control genes (reference genes) whose expression is constant in all studied conditions, and this stability has to be demonstrated. We performed a literature search for studies using quantitative or semi-quantitative PCR in the rat spared nerve injury (SNI) model of neuropathic pain to verify whether any reference genes had previously been validated. We then analyzed the stability over time of 7 reference genes commonly used in the nervous system, specifically in the spinal cord dorsal horn and the dorsal root ganglion (DRG). These were: actin beta (Actb), glyceraldehyde-3-phosphate dehydrogenase (GAPDH), ribosomal proteins 18S (18S), L13a (RPL13a) and L29 (RPL29), hypoxanthine phosphoribosyltransferase 1 (HPRT1) and hydroxymethylbilane synthase (HMBS). We compared the candidate genes and established a stability ranking using the geNorm algorithm. Finally, we assessed the number of reference genes necessary for accurate normalization in this neuropathic pain model. RESULTS: We found GAPDH, HMBS, Actb, HPRT1 and 18S cited as reference genes in the literature on studies using the SNI model. Only HPRT1 and 18S had previously been demonstrated to be stable in RT-qPCR arrays. All the genes tested in this study using the geNorm algorithm presented gene stability values (M-values) acceptable enough to qualify them as potential reference genes in both the DRG and the spinal cord. Using the coefficient of variation, 18S failed the 50% cut-off with a value of 61% in the DRG. The two most stable genes in the dorsal horn were RPL29 and RPL13a; in the DRG they were HPRT1 and Actb. Using a 0.15 cut-off for pairwise variations, we found that any pair of stable reference genes was sufficient for the normalization process. CONCLUSIONS: In the rat SNI model, we validated and ranked Actb, RPL29, RPL13a, HMBS, GAPDH, HPRT1 and 18S as good reference genes in the spinal cord. In the DRG, 18S did not fulfill the stability criteria. The combination of any two stable reference genes was sufficient to provide accurate normalization.
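The geNorm ranking described above is straightforward to reproduce. The following is a minimal Python sketch, not the authors' code: it assumes a matrix of relative expression quantities (samples × genes) and computes each candidate's M-value as the mean standard deviation of its log2 expression ratios with every other candidate, where a lower M means a more stable gene. The expression values and gene list are hypothetical.

```python
import numpy as np

def genorm_m_values(expr: np.ndarray, genes: list[str]) -> dict[str, float]:
    """Compute geNorm M-values for candidate reference genes.

    expr: relative expression quantities, shape (n_samples, n_genes).
    For gene j, M_j is the mean, over all other genes k, of the standard
    deviation across samples of log2(expr_j / expr_k).
    """
    log_expr = np.log2(expr)
    n_genes = expr.shape[1]
    m = {}
    for j in range(n_genes):
        # pairwise variation V_jk = sd over samples of the log2 ratio j/k
        sds = [np.std(log_expr[:, j] - log_expr[:, k], ddof=1)
               for k in range(n_genes) if k != j]
        m[genes[j]] = float(np.mean(sds))
    return m

# Hypothetical relative quantities: 4 samples x 3 candidate genes
expr = np.array([[1.00, 0.90, 2.1],
                 [1.10, 1.00, 0.7],
                 [0.95, 0.88, 1.5],
                 [1.05, 1.02, 0.4]])
ranking = sorted(genorm_m_values(expr, ["Actb", "RPL13a", "18S"]).items(),
                 key=lambda kv: kv[1])
print(ranking)  # most stable (lowest M) first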
Abstract:
Percutaneous transluminal coronary angioplasty is a frequently used interventional technique to reopen arteries that have narrowed because of atherosclerosis. Restenosis, or renarrowing of the artery shortly after angioplasty, is a major limitation to the success of the procedure and is due mainly to smooth muscle cell accumulation in the artery wall at the site of balloon injury. In the present study, we demonstrate that the antiangiogenic sulfated oligosaccharide PI-88 inhibits primary vascular smooth muscle cell proliferation and reduces intimal thickening 14 days after balloon angioplasty of rat and rabbit arteries. PI-88 reduced heparan sulfate content in the injured artery wall and prevented the change in smooth muscle phenotype. However, the mechanism of PI-88 inhibition was not merely confined to the antiheparanase activity of this compound. PI-88 blocked extracellular signal-regulated kinase-1/2 (ERK1/2) activity within minutes of smooth muscle cell injury. It facilitated FGF-2 release from uninjured smooth muscle cells in vitro, and super-released FGF-2 after injury while inhibiting ERK1/2 activation. PI-88 inhibited the decrease in levels of FGF-2 protein in the rat artery wall within 8 minutes of injury. PI-88 also blocked injury-inducible ERK phosphorylation, without altering the clotting time in these animals. Optical biosensor studies revealed that PI-88 potently inhibited (Ki = 10.3 nmol/L) the interaction of FGF-2 with heparan sulfate. These findings show for the first time the capacity of this sulfated oligosaccharide to directly bind FGF-2, block cellular signaling and proliferation in vitro, and inhibit injury-induced smooth muscle cell hyperplasia in two animal models. As such, this study demonstrates a new role for PI-88 as an inhibitor of intimal thickening after balloon angioplasty. The full text of this article is available online at http://www.circresaha.org.
Abstract:
OBJECTIVE: A new nerve transfer technique using a healthy fascicle of the posterior cord for suprascapular nerve reconstruction is presented. This technique was used in a patient with posttraumatic brachial plexopathy resulting in an upper trunk injury, with proximal root stumps unavailable for grafting, associated with multiple nerve dysfunction. CLINICAL PRESENTATION: A 45-year-old man sustained a right brachial plexus injury in a bicycle accident. Clinical evaluation and electromyography indicated upper trunk involvement. Trapezius muscle function and triceps strength were normal on physical examination. INTERVENTION: The patient underwent a combined supra- and infraclavicular approach to the brachial plexus. A neuroma-in-continuity of the upper trunk and fibrotic C5 and C6 roots were identified. Electrical stimulation of the phrenic and spinal accessory nerves produced no response. The suprascapular nerve was dissected from the upper trunk, transected, and rerouted to the infraclavicular fossa. A healthy fascicle of the posterior cord to the triceps muscle was transferred to the suprascapular nerve. At the 1-year follow-up evaluation, arm abduction against gravity and external rotation reached 40 and 34 degrees, respectively. CONCLUSION: The posterior cord can be used as a source of a donor fascicle to the suprascapular nerve after its infraclavicular relocation. This new intraplexal nerve transfer could be applied in patients with an isolated injury of the upper trunk and concomitant lesions of the extraplexal nerve donors usually used for reinnervation of the suprascapular nerve.
Abstract:
Background: Laparoscopic cholecystectomy (LC) has become the first-line surgical treatment of calculous gall-bladder disease, and its benefits over open cholecystectomy are well known. In the early years of LC, the higher rate of bile duct injuries compared with open cholecystectomy was attributed to the 'learning curve' and was expected to dissipate with increased experience. The purpose of the present paper was to review a tertiary referral unit's experience of bile duct injuries induced by LC. Methods: A retrospective analysis was performed on all patients referred for management of an iatrogenic bile duct injury from 1981 to 2000. For injuries sustained at LC, details of the time between LC and recognition of the injury, the time from injury to definitive repair, the type of injury, the use of intraoperative cholangiography (IOC), the definitive repair and the postoperative outcome were recorded. The type of injury sustained at open cholecystectomy was similarly classified to allow the severity of injury to be compared. Results: There were 131 patients referred for management of an iatrogenic bile duct injury that occurred at open cholecystectomy (n = 62), liver resection (n = 5) or LC (n = 64). Only 39% of bile duct injuries were recognized at the time of LC. Following conversion to open operation, half the subsequent procedures were considered inappropriate. When the injury was not recognized during LC, 70% of patients developed bile leak/peritonitis; almost half of these were referred, whereas the rest underwent a variety of operative procedures by the referring surgeon. The remainder developed jaundice or abnormal liver function tests and cholangitis. An IOC was performed in 43% of cases, but failed to identify an injury in two-thirds of patients. The bile duct injuries that occurred at LC were of greater severity than those at open cholecystectomy. Following definitive repair, there was one death (1.6%). Ninety-two per cent of patients had an uncomplicated recovery, and there was one late stricture requiring surgical revision. Conclusions: The early prediction that the rate of injury during LC would decline substantially with increased experience has not been fulfilled. Bile duct injury that occurs at LC is of greater severity than that at open cholecystectomy, and is recognized during LC in less than half of cases. Evidence is accruing that the use of cholangiography reduces the risk and severity of injury and, when correctly interpreted, increases the chance of recognition of bile duct injury during the procedure. Prevention is the key but, should an injury occur, referral to a specialist in biliary reconstructive surgery is indicated.
Abstract:
Dysfunction in the motor system is a feature of persistent whiplash-associated disorders. Little is known about motor dysfunction in the early stages following injury and about its progress in those persons who recover and those who develop persistent symptoms. This study prospectively measured motor system function (cervical range of movement (ROM), joint position error (JPE) and activity of the superficial neck flexors (EMG) during a test of cranio-cervical flexion) as well as a measure of fear of re-injury (TAMPA) in 66 whiplash subjects within 1 month of injury and then at 2 and 3 months post injury. Subjects were classified at 3 months post injury using scores on the neck disability index into three groups: recovered, persistent mild symptoms, and moderate/severe symptoms. Motor system function was also measured in 20 control subjects. All whiplash groups demonstrated decreased ROM and increased EMG (compared to controls) at 1 month post injury. The ROM deficit persisted in the group with moderate/severe symptoms but returned to within normal limits by 3 months in those who had recovered or who reported persistent mild pain. Increased EMG persisted for 3 months in all whiplash groups. Only the moderate/severe group showed greater JPE within 1 month of injury, which remained unchanged at 3 months. TAMPA scores of the moderate/severe group were higher than those of the other two groups. The differences in TAMPA did not impact on ROM, EMG or JPE. This study identifies, for the first time, deficits in the motor system, as early as 1 month post whiplash injury, that persisted not only in those reporting moderate/severe symptoms at 3 months but also in subjects who recovered and those with persistent mild symptoms. (C) 2002 International Association for the Study of Pain. Published by Elsevier Science B.V. All rights reserved.
Abstract:
BACKGROUND & AIMS: Hy's Law, which states that hepatocellular drug-induced liver injury (DILI) with jaundice indicates a serious reaction, is used widely to determine risk for acute liver failure (ALF). We aimed to optimize the definition of Hy's Law and to develop a model for predicting ALF in patients with DILI. METHODS: We collected data from 771 patients with DILI (805 episodes) from the Spanish DILI registry, from April 1994 through August 2012. We analyzed data collected at DILI recognition and at the time of peak levels of alanine aminotransferase (ALT) and total bilirubin (TBL). RESULTS: Of the 771 patients with DILI, 32 developed ALF. Hepatocellular injury, female sex, high levels of TBL, and a high ratio of aspartate aminotransferase (AST):ALT were independent risk factors for ALF. We compared 3 ways to use Hy's Law to predict which patients would develop ALF; all included TBL greater than 2-fold the upper limit of normal (×ULN) and either an ALT level greater than 3 × ULN, a ratio (R) value (ALT in ×ULN / alkaline phosphatase in ×ULN) of 5 or greater, or a new ratio (nR) value (ALT or AST, whichever produced the higher ×ULN value, divided by alkaline phosphatase in ×ULN) of 5 or greater. At recognition of DILI, the R- and nR-based models identified patients who developed ALF with 67% and 63% specificity, respectively, whereas use of only the ALT level identified them with 44% specificity. However, the ALT level and the nR model each identified patients who developed ALF with 90% sensitivity, whereas the R criteria identified them with 83% sensitivity. An equal number of patients who did and did not develop ALF had alkaline phosphatase levels greater than 2 × ULN. An algorithm based on an AST level greater than 17.3 × ULN, TBL greater than 6.6 × ULN, and AST:ALT greater than 1.5 identified patients who developed ALF with 82% specificity and 80% sensitivity. CONCLUSIONS: When applied at DILI recognition, the nR criteria for Hy's Law provide the best balance of sensitivity and specificity, whereas our new composite algorithm provides additional specificity in predicting the ultimate development of ALF.
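The R, nR and composite criteria reported above translate directly into code. The Python sketch below uses only the thresholds stated in the abstract; all inputs are expressed as multiples of the upper limit of normal (×ULN), and the function names and example values are hypothetical illustrations, not the registry's actual implementation.

```python
def new_ratio_hys_law(alt_xuln: float, ast_xuln: float,
                      alp_xuln: float, tbl_xuln: float) -> bool:
    """nR-based Hy's Law check, as described in the abstract.

    nR = max(ALT, AST) / ALP, each expressed in xULN; positive when
    TBL > 2 xULN and nR >= 5.
    """
    nR = max(alt_xuln, ast_xuln) / alp_xuln
    return tbl_xuln > 2 and nR >= 5

def composite_alf_algorithm(ast_xuln: float, tbl_xuln: float,
                            ast_alt_ratio: float) -> bool:
    """Composite algorithm from the abstract (82% specificity, 80% sensitivity)."""
    return ast_xuln > 17.3 and tbl_xuln > 6.6 and ast_alt_ratio > 1.5

# Hypothetical patient: ALT 8 xULN, AST 12 xULN, ALP 1.5 xULN, TBL 3 xULN
print(new_ratio_hys_law(8, 12, 1.5, 3))      # True: nR = 12/1.5 = 8 >= 5, TBL > 2
print(composite_alf_algorithm(12, 3, 12/8))  # False: AST and TBL below cutoffs
```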
Abstract:
Monitoring and management of intracranial pressure (ICP) and cerebral perfusion pressure (CPP) is a standard of care after traumatic brain injury (TBI). However, the pathophysiology of so-called secondary brain injury, i.e., the cascade of potentially deleterious events that occur in the early phase following the initial cerebral insult after TBI, is complex, involving a subtle interplay between cerebral blood flow (CBF), oxygen delivery and utilization, and the supply of the main cerebral energy substrate (glucose) to the injured brain. Regulation of this interplay depends on the type of injury and may vary individually and over time. In this setting, patient management can be a challenging task, where standard ICP/CPP monitoring may become insufficient to prevent secondary brain injury. Growing clinical evidence demonstrates that so-called multimodal brain monitoring, including brain tissue oxygen (PbtO2), cerebral microdialysis and transcranial Doppler among others, might help to optimize CBF and the delivery of oxygen and energy substrate at the bedside, thereby improving the management of secondary brain injury. Looking beyond ICP and CPP, and applying a multimodal therapeutic approach to optimize CBF, oxygen delivery, and brain energy supply may eventually improve the overall care of patients with head injury. This review summarizes some of the important pathophysiological determinants of secondary cerebral damage after TBI and discusses novel approaches to optimize CBF and provide adequate oxygen and energy supply to the injured brain using multimodal brain monitoring.
Abstract:
Introduction: Low brain tissue oxygen pressure (PbtO2) is associated with worse outcome in patients with severe traumatic brain injury (TBI). However, it is unclear whether brain tissue hypoxia is merely a marker of injury severity or a predictor of prognosis, independent of intracranial pressure (ICP) and injury severity. Hypothesis: We hypothesized that brain tissue hypoxia was an independent predictor of outcome in patients with severe TBI, irrespective of elevated ICP and of the severity of cerebral and systemic injury. Methods: This observational study was conducted at the Neurological ICU, Hospital of the University of Pennsylvania, an academic level I trauma center. Patients admitted with severe TBI who had PbtO2 and ICP monitoring were included in the study. PbtO2, ICP, mean arterial pressure (MAP) and cerebral perfusion pressure (CPP = MAP-ICP) were monitored continuously and recorded prospectively every 30 min. Using linear interpolation, the duration and cumulative dose (area under the curve, AUC) of brain tissue hypoxia (PbtO2 <15 mm Hg), elevated ICP >20 mm Hg and low CPP <60 mm Hg were calculated, and the association with outcome at hospital discharge, dichotomized as good (Glasgow Outcome Score [GOS] 4-5) vs. poor (GOS 1-3), was analyzed. Results: A total of 103 consecutive patients, monitored for an average of 5 days, were studied. Brain tissue hypoxia was observed in 66 (64%) patients even when ICP was <20 mm Hg and CPP >60 mm Hg (72 +/- 39% and 49 +/- 41% of brain hypoxic time, respectively). Compared with patients with good outcome, those with poor outcome had a longer duration of brain hypoxia (1.7 +/- 3.7 vs. 8.3 +/- 15.9 hrs, P<0.01), as well as a longer duration (11.5 +/- 16.5 vs. 21.6 +/- 29.6 hrs, P=0.03) and a greater cumulative dose (56 +/- 93 vs. 143 +/- 218 mm Hg*hrs, P<0.01) of elevated ICP. By multivariable logistic regression, admission Glasgow Coma Scale (OR 0.83, 95% CI: 0.70-0.99, P=0.04), Marshall CT score (OR 2.42, 95% CI: 1.42-4.11, P<0.01), APACHE II (OR 1.20, 95% CI: 1.03-1.43, P=0.03), and the duration of brain tissue hypoxia (OR 1.13, 95% CI: 1.01-1.27, P=0.04) were all significantly associated with poor outcome. No independent association was found between the AUC for elevated ICP and outcome (OR 1.01, 95% CI 0.97-1.02, P=0.11) in our prospective cohort. Conclusions: In patients with severe TBI, brain tissue hypoxia is frequent, despite normal ICP and CPP, and is associated with poor outcome, independent of intracranial hypertension and the severity of cerebral and systemic injury. Our findings indicate that PbtO2 is a strong physiologic prognostic marker after TBI. Further study is warranted to examine whether PbtO2-directed therapy improves outcome in severely head-injured patients.
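The duration and cumulative-dose (AUC) calculations described in the methods can be illustrated with a short sketch. The Python below is a simplified version assuming a regularly sampled PbtO2 series recorded every 30 minutes: it linearly interpolates between samples and splits segments at threshold crossings, mirroring the linear-interpolation AUC approach the abstract describes. The signal values are invented.

```python
import numpy as np

def hypoxia_burden(t_hours, pbto2, threshold=15.0):
    """Duration (h) and cumulative dose (mmHg*h) of PbtO2 below a threshold.

    Linearly interpolates between samples, splitting each segment at the
    point where the signal crosses the threshold, then accumulates time
    and trapezoidal area for the sub-threshold portions.
    """
    duration = dose = 0.0
    for i in range(len(t_hours) - 1):
        t0, t1 = t_hours[i], t_hours[i + 1]
        v0, v1 = pbto2[i], pbto2[i + 1]
        if (v0 - threshold) * (v1 - threshold) < 0:
            # segment straddles the threshold: split at the crossing time
            tc = t0 + (threshold - v0) * (t1 - t0) / (v1 - v0)
            segs = [(t0, v0, tc, threshold), (tc, threshold, t1, v1)]
        else:
            segs = [(t0, v0, t1, v1)]
        for a, va, b, vb in segs:
            if va <= threshold and vb <= threshold:
                duration += b - a
                dose += (b - a) * ((threshold - va) + (threshold - vb)) / 2
    return duration, dose

t = np.arange(0, 4.5, 0.5)  # 30-min sampling over 4 h (hypothetical)
p = np.array([22, 18, 14, 10, 12, 16, 20, 13, 17.0])
print(hypoxia_burden(t, p))
```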
Abstract:
Urinary indices are classically believed to allow differentiation of transient (or pre-renal) acute kidney injury (AKI) from persistent (or acute tubular necrosis) AKI. However, the data validating urinalysis in critically ill patients are weak. In the previous issue of Critical Care, Pons and colleagues demonstrate in a multicenter observational study that the fractional excretions of sodium and urea, as well as urine-to-plasma ratios, performed poorly as diagnostic tests to separate these entities. This study confirms the limited diagnostic and prognostic ability of urine testing. Together with other studies, it raises more fundamental questions about the value, meaning and pathophysiologic validity of the pre-renal AKI paradigm, and suggests that AKI (like all other forms of organ injury) is a continuum of injury that cannot be neatly divided into functional (pre-renal or transient) and structural (acute tubular necrosis or persistent) forms.
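For reference, the urinary indices the study evaluated are simple bedside formulas. The sketch below computes a fractional excretion using the standard formula FE_x = (U_x × P_cr) / (P_x × U_cr) × 100; the example values are hypothetical, and the classical "FENa < 1% means pre-renal" reading is exactly the dichotomy this commentary calls into question.

```python
def fractional_excretion(u_x: float, p_x: float,
                         u_cr: float, p_cr: float) -> float:
    """Fractional excretion of a solute x, in percent.

    FE_x = (U_x * P_cr) / (P_x * U_cr) * 100, with urine (U) and plasma (P)
    concentrations of the solute and of creatinine (cr); the units of the
    solute pair and of the creatinine pair each cancel out.
    """
    return (u_x * p_cr) / (p_x * u_cr) * 100

# Hypothetical values: urine Na 20 and plasma Na 140 mmol/L,
# urine creatinine 100 and plasma creatinine 1.0 mg/dL
fena = fractional_excretion(u_x=20, p_x=140, u_cr=100, p_cr=1.0)
print(f"FENa = {fena:.2f}%")  # 0.14%
```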
Abstract:
OBJECTIVES: We sought to develop an automated methodology for the continuous updating of optimal cerebral perfusion pressure (CPPopt) for patients after severe traumatic head injury, using continuous monitoring of cerebrovascular pressure reactivity. We then validated the CPPopt algorithm by determining the association between outcome and the deviation of actual CPP from CPPopt. DESIGN: Retrospective analysis of prospectively collected data. SETTING: Neurosciences critical care unit of a university hospital. PATIENTS: A total of 327 traumatic head-injury patients admitted between 2003 and 2009 with continuous monitoring of arterial blood pressure and intracranial pressure. MEASUREMENTS AND MAIN RESULTS: Arterial blood pressure, intracranial pressure, and CPP were continuously recorded, and the pressure reactivity index was calculated online. Outcome was assessed at 6 months. An automated curve-fitting method was applied to determine the CPP at the minimum value of the pressure reactivity index (CPPopt). A time trend of CPPopt was created using a moving 4-hr window, updated every minute. Identification of CPPopt was, on average, feasible during 55% of the whole recording period. Patient outcome correlated with the continuously updated difference between median CPP and CPPopt (chi-square=45, p<.001; outcome dichotomized into fatal and nonfatal). Mortality was associated with relative "hypoperfusion" (CPP<CPPopt), severe disability with "hyperperfusion" (CPP>CPPopt), and favorable outcome was associated with smaller deviations of CPP from the individualized CPPopt. While deviations from global target CPP values of 60 mm Hg and 70 mm Hg were also related to outcome, these relationships were less robust. CONCLUSIONS: Real-time CPPopt could be identified during the recording time of the majority of patients. Patients with a median CPP close to CPPopt were more likely to have a favorable outcome than those in whom median CPP was widely different from CPPopt. Deviations from the individualized CPPopt were more predictive of outcome than deviations from a common target CPP. CPP management to optimize cerebrovascular pressure reactivity should be the subject of a future clinical trial in severe traumatic head-injury patients.
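The automated CPPopt determination can be sketched as follows. This simplified Python version, assuming minute-by-minute CPP and PRx samples from a 4-hour window, bins PRx by CPP, fits a parabola to the bin means and returns the CPP at the fitted minimum. The published method's curve-fitting and data-quality checks are more elaborate, and the simulated data are hypothetical.

```python
import numpy as np

def cpp_opt(cpp: np.ndarray, prx: np.ndarray, bin_width: float = 5.0):
    """Estimate CPPopt from one window of CPP and PRx samples.

    Bins PRx by CPP (e.g. 5 mmHg bins), fits a quadratic to the bin means
    and returns the CPP at the fitted minimum, or None when no U-shaped
    relationship can be identified -- which is why CPPopt was obtainable
    in only part of the recording time.
    """
    edges = np.arange(np.floor(cpp.min()), np.ceil(cpp.max()) + bin_width, bin_width)
    centers, means = [], []
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (cpp >= lo) & (cpp < hi)
        if mask.sum() >= 5:          # require enough samples per bin
            centers.append((lo + hi) / 2)
            means.append(prx[mask].mean())
    if len(centers) < 3:
        return None                  # not enough bins for a fit
    a, b, c = np.polyfit(centers, means, 2)
    if a <= 0:
        return None                  # no U-shape: minimum is undefined
    return -b / (2 * a)

rng = np.random.default_rng(0)
cpp = rng.uniform(55, 95, 480)       # ~4 h of minute-by-minute data
prx = 0.002 * (cpp - 75) ** 2 - 0.2 + rng.normal(0, 0.05, cpp.size)
print(cpp_opt(cpp, prx))             # ~75 mmHg for this simulated patient
```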
Abstract:
The rat models currently employed for studies of nerve regeneration present distinct disadvantages. We propose a new technique of stretch-induced nerve injury, used here to evaluate the influence of gabapentin (GBP) on nerve regeneration. Male Wistar rats (300 g; n=36) underwent surgery and exposure of the median nerve in the right forelimb, either with or without nerve injury. The technique was performed using distal and proximal clamps separated by a distance of 2 cm and a sliding distance of 3 mm. The nerve was compressed and stretched for 5 s until the bands of Fontana disappeared. The animals were evaluated with respect to functional, biochemical and histological parameters. Stretching of the median nerve led to complete loss of motor function for up to 12 days after the lesion (P<0.001), compared to non-injured nerves, as assessed by the grasping test. Grasping force in the nerve-injured animals did not return to control values up to 30 days after surgery (P<0.05). Nerve injury also increased the time to sensory recovery, as assessed by the electrical and mechanical stimulation tests. Treatment of the animals with GBP improved the morphometric parameters of median nerve cross-sections compared with the operated vehicle group, as observed in the area occupied by myelinated fibers and connective tissue (P<0.001), in the density of myelinated fibers/mm2 (P<0.05) and in the amount of degeneration fragments (P<0.01). Stretch-induced nerve injury thus appears to be a simple and relevant model for evaluating nerve regeneration.
Abstract:
The role of protein kinase C (PKC) activation in ischemic preconditioning remains controversial. Since diacylglycerol is the endogenous activator of PKC and as such might be expected to be cardioprotective, we investigated whether: (i) the diacylglycerol analog 1,2-dioctanoyl-sn-glycerol (DOG) can protect against injury during ischemia and reperfusion; (ii) any effect is mediated via PKC activation; and (iii) the outcome is influenced by the time of administration. Isolated rat hearts were perfused with buffer at 37°C and paced at 400 bpm. In Study 1, hearts (n=6/group) were subjected to one of the following: (1) 36 min aerobic perfusion (controls); (2) 20 min aerobic perfusion plus ischemic preconditioning (3 min ischemia/3 min reperfusion + 5 min ischemia/5 min reperfusion); (3) aerobic perfusion with buffer containing DOG (10 μM) given as a substitute for ischemic preconditioning; (4) aerobic perfusion with DOG (10 μM) during the last 2 min of aerobic perfusion. All hearts were then subjected to 35 min of global ischemia and 40 min of reperfusion. A further group (5) was perfused with DOG (10 μM) for the first 2 min of reperfusion. Ischemic preconditioning improved postischemic recovery of LVDP from 24±3% in controls to 71±2% (P<0.05). Recovery of LVDP was also enhanced by DOG when given just before ischemia (54±4%); however, DOG had no effect on the recovery of LVDP when used as a substitute for ischemic preconditioning (22±5%) or when given during reperfusion (29±6%). In Study 2, the first four groups of Study 1 were repeated (n=4–5/group) without imposing the periods of ischemia and reperfusion; instead, hearts were taken for the measurement of PKC activity (pmol/min/mg protein±SEM). PKC activity after 36 min in groups (1), (2), (3) and (4) was 332±102, 299±63, 521±144, and 340±113, respectively, and the membrane:cytosolic PKC activity ratio was 5.6±1.5, 5.3±1.8, 6.6±2.7, and 3.9±2.1 (P=NS in each instance). In conclusion, DOG is cardioprotective but, under the conditions of the present study, less so than ischemic preconditioning; furthermore, the protection does not appear to require PKC activation prior to ischemia.
Abstract:
Objectives: The purpose of this study was to determine whether intra-abdominal pressure (IAP) can predict acute kidney injury (AKI) in the postoperative period of abdominal surgeries and, if so, to establish its cutoff value. Patients and methods: A prospective observational study was conducted from January 2010 to March 2011 in the Intensive Care Units (ICUs) of the University Hospital of Botucatu Medical School, UNESP. Consecutive patients undergoing abdominal surgery were included in the study. An initial evaluation, at admission to the ICU, was performed to obtain demographic, clinical, surgical and therapeutic data. IAP was measured by the intravesical method, four times per day, and renal function was evaluated during the patient's stay in the ICU until discharge, death or occurrence of AKI. Results: A total of 60 patients were evaluated; 16 patients developed intra-abdominal hypertension (IAH), 45 developed an abnormal IAP (>7 mmHg) and 26 developed AKI. The first IAP at the time of admission to the ICU was able to predict the occurrence of AKI (area under the receiver-operating characteristic curve 0.669; p=0.029), with the best cutoff point (by the Youden index method) being >=7.68 mmHg, yielding a sensitivity of 87% and a specificity of 46%. Serial assessment of this parameter did not add prognostic value to the initial evaluation. Conclusion: IAH was frequent in patients undergoing abdominal surgeries during the ICU stay, and IAP predicted the occurrence of AKI. Serial assessments of IAP did not provide better discriminatory power than the initial evaluation.
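The Youden index cutoff selection used in the study is easy to reproduce. A minimal sketch, assuming a continuous predictor (the first IAP at ICU admission, in mmHg) and a binary AKI outcome, scans candidate cutoffs and returns the one maximizing J = sensitivity + specificity - 1; all data below are hypothetical.

```python
import numpy as np

def youden_cutoff(values, labels):
    """Pick the cutoff maximizing the Youden index J = sens + spec - 1.

    values: continuous predictor (e.g. first IAP at ICU admission, mmHg);
    labels: 1 if the patient developed AKI, 0 otherwise. A patient is
    predicted positive when value >= cutoff, as in the study's
    >=7.68 mmHg threshold.
    """
    values = np.asarray(values, dtype=float)
    labels = np.asarray(labels, dtype=int)
    best_j, best_cut = -1.0, None
    for cut in np.unique(values):
        pred = values >= cut
        sens = (pred & (labels == 1)).sum() / max((labels == 1).sum(), 1)
        spec = (~pred & (labels == 0)).sum() / max((labels == 0).sum(), 1)
        j = sens + spec - 1
        if j > best_j:
            best_j, best_cut = j, cut
    return best_cut, best_j

# Hypothetical IAP values (mmHg) and AKI outcomes
iap = [5.2, 6.8, 7.7, 8.1, 9.4, 10.2, 6.1, 7.9, 11.0, 5.9]
aki = [0,   0,   1,   1,   1,  1,    0,   1,   1,    0]
print(youden_cutoff(iap, aki))
```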
Abstract:
Peritoneal dialysis (PD) should be considered a suitable method of renal replacement therapy in acute kidney injury (AKI) patients. This study is the largest cohort providing patient characteristics, clinical practice patterns and their relationship to outcomes in a developing country. Its objective was to describe the main determinants of patient and technique survival, including trends over time, in PD treatment of AKI patients. This was a Brazilian prospective cohort study in which all adult AKI patients on PD were studied from January 2004 to January 2014. For comparison purposes, patients were divided into 2 groups according to the year of treatment: 2004-2008 and 2009-2014. Patient survival and technique failure (TF) were analyzed using the competing risk model of Fine and Gray. A total of 301 patients were included; 51 (16.9%) were transferred to hemodialysis during the study period. The main cause of TF was mechanical complication (47%), followed by peritonitis (41.2%). TF changed over the study period: compared to 2004-2008, patients treated in 2009-2014 had a lower risk of TF (RR 0.86, 95% CI 0.77-0.96), and three independent risk factors for TF were identified: treatment period, sepsis and age >65 years. There were 180 deaths (59.8%) during the study. Death was the leading cause of dropout (77.9% of all cases), mainly from sepsis (58.3%), followed by cardiovascular disease (36.1%). Overall patient survival was 41% at 30 days. Patient survival improved across study periods: compared to 2004-2008, patients treated in 2009-2014 had a lower risk of death (RR 0.87, 95% CI 0.79-0.98). The independent risk factors for mortality were sepsis, age >70 years, ATN-ISS >0.65 and positive fluid balance. In conclusion, we observed an improvement in patient survival and technique survival over the years, even after correction for several confounders and using a competing risk approach.
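The competing-risk framing used in the study (death competing with technique failure) can be illustrated with a nonparametric cumulative incidence estimator. The sketch below is an Aalen-Johansen style calculation of the cumulative incidence function, the quantity that the Fine and Gray model targets; it is not the Fine and Gray regression itself, which would normally come from a dedicated statistics package, and the follow-up data are invented.

```python
import numpy as np

def cumulative_incidence(times, events, cause=1):
    """Nonparametric cumulative incidence of one event type under competing risks.

    times:  follow-up time for each patient;
    events: 0 = censored, 1 = event of interest (e.g. technique failure),
            2 = competing event (e.g. death).
    At each event time, the cause-specific hazard is weighted by the
    all-cause survival just before that time (Aalen-Johansen estimator).
    """
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=int)
    order = np.argsort(times)
    times, events = times[order], events[order]
    surv = 1.0   # all-cause survival just before the current time
    cif = 0.0
    out = []
    for t in np.unique(times):
        at_risk = (times >= t).sum()
        d_any = ((times == t) & (events > 0)).sum()
        d_cause = ((times == t) & (events == cause)).sum()
        cif += surv * d_cause / at_risk
        surv *= 1 - d_any / at_risk
        out.append((t, cif))
    return out

# Hypothetical PD-in-AKI follow-up (days): 1 = technique failure, 2 = death
times = [3, 5, 5, 8, 10, 12, 15, 20, 25, 30]
events = [2, 1, 2, 0, 2, 1, 2, 0, 2, 0]
for t, c in cumulative_incidence(times, events, cause=1):
    print(f"day {t:>4.0f}: CIF(technique failure) = {c:.3f}")
```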