Abstract:
INTRODUCTION Although several parameters have been proposed to predict the hemodynamic response to fluid expansion in critically ill patients, most of them are invasive or require special monitoring devices. The aim of this study was to determine whether noninvasive evaluation of the respiratory variation in brachial artery peak velocity (Vpeakbrach), measured using Doppler ultrasound, could predict fluid responsiveness in mechanically ventilated patients. METHODS We conducted a prospective clinical study in a 17-bed multidisciplinary ICU and included 38 mechanically ventilated patients for whom fluid administration was planned because of acute circulatory failure. Volume expansion (VE) was performed with 500 mL of a synthetic colloid. Patients were classified as responders if the stroke volume index (SVi) increased ≥15% after VE. The respiratory variation in Vpeakbrach (ΔVpeakbrach) was calculated as the difference between the maximum and minimum values of Vpeakbrach over a single respiratory cycle, divided by the mean of the two values and expressed as a percentage. Radial arterial pulse pressure variation (ΔPPrad) and stroke volume variation measured using the FloTrac/Vigileo system (ΔSVVigileo) were also calculated. RESULTS VE increased SVi by ≥15% in 19 patients (responders). At baseline, ΔVpeakbrach, ΔPPrad and ΔSVVigileo were significantly higher in responders than in nonresponders (14% vs. 8%; 18% vs. 5%; 13% vs. 8%; P < 0.0001, respectively). A ΔVpeakbrach value >10% predicted fluid responsiveness with a sensitivity of 74% and a specificity of 95%. A ΔPPrad value >10% and a ΔSVVigileo value >11% predicted volume responsiveness with sensitivities of 95% and 79%, and specificities of 95% and 89%, respectively.
CONCLUSIONS Respiratory variations in brachial artery peak velocity could be a feasible tool for the noninvasive assessment of fluid responsiveness in patients with mechanical ventilatory support and acute circulatory failure. TRIAL REGISTRATION ClinicalTrials.gov ID: NCT00890071.
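The ΔVpeakbrach definition given in the abstract is straightforward arithmetic: the maximum-minus-minimum peak velocity over one respiratory cycle, divided by the mean of the two values, as a percentage. A minimal sketch, using illustrative velocities rather than values from the study:

```python
def respiratory_variation(v_max, v_min):
    """Respiratory variation over one respiratory cycle, as defined in
    the abstract: (max - min) / mean of the two values, in percent."""
    mean = (v_max + v_min) / 2.0
    return 100.0 * (v_max - v_min) / mean

# Illustrative values only (not study data): a brachial peak velocity
# swinging between 45 and 55 cm/s yields a 20% variation, above the
# reported 10% responsiveness threshold.
print(respiratory_variation(55.0, 45.0))  # 20.0
```

The same formula applies to ΔPPrad and ΔSVVigileo, with pulse pressure or stroke volume in place of peak velocity.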
Abstract:
INTRODUCTION Refractory septic shock has a dismal prognosis despite aggressive therapy. The purpose of the present study was to report the effects of terlipressin (TP) as rescue treatment in children with catecholamine-refractory hypotensive septic shock. METHODS We prospectively registered children with severe septic shock and hypotension resistant to standard intensive care, including high-dose catecholamines, who received compassionate therapy with TP in nine pediatric intensive care units in Spain over a 12-month period. The TP dose was 0.02 mg/kg every four hours. RESULTS Sixteen children (age range, 1 month to 13 years) were included. The cause of sepsis was meningococcal in eight cases, Staphylococcus aureus in two cases, and unknown in six cases. At inclusion, the median (range) Pediatric Logistic Organ Dysfunction score was 23.5 (12-52) and the median (range) Pediatric Risk of Mortality score was 24.5 (16-43). All children had been treated with a combination of at least two catecholamines at high infusion rates. TP treatment induced a rapid and sustained improvement in mean arterial blood pressure that allowed reduction of the catecholamine infusion rate after one hour in 14 of 16 patients. The mean (range) arterial blood pressure 30 minutes after TP administration increased from 50.5 (37-93) to 77 (42-100) mmHg (P < 0.05). The noradrenaline infusion rate 24 hours after TP treatment decreased from 2 (1-4) to 1 (0-2.5) μg/kg/min (P < 0.05). Seven patients survived the sepsis episode. The causes of death were refractory shock in three cases, withdrawal of therapy in two cases, refractory arrhythmia in three cases, and multiorgan failure in one case. Four of the survivors had sequelae: major amputations (lower limbs and hands) in one case, minor amputations (fingers) in two cases, and minor neurological deficit in one case.
CONCLUSION TP is an effective vasopressor agent that could be an alternative or complementary therapy in children with refractory vasodilatory septic shock. The addition of TP to high doses of catecholamines, however, can induce excessive vasoconstriction. Additional studies are needed to define the safety profile and the clinical effectiveness of TP in children with septic shock.
Abstract:
BACKGROUND. Higher levels of initial DNA damage and lower levels of radiation-induced apoptosis in peripheral blood lymphocytes have each been associated with an increased risk of developing late radiation-induced toxicity. It has recently been reported that these two predictive tests are inversely related. The aim of the present study was to investigate the combined role of both tests in relation to clinical radiation-induced toxicity in a set of breast cancer patients treated with high-dose hyperfractionated radical radiotherapy. METHODS. Peripheral blood lymphocytes were taken from 26 consecutive patients with locally advanced breast carcinoma treated with high-dose hyperfractionated radical radiotherapy. Acute and late cutaneous and subcutaneous toxicity was evaluated using the Radiation Therapy Oncology Group morbidity scoring schema. The mean follow-up of survivors (n = 13) was 197.23 months. Radiosensitivity of lymphocytes was quantified as the initial number of DNA double-strand breaks (DSB) induced per Gy and per DNA unit (200 Mbp). Radiation-induced apoptosis (RIA) at 1, 2 and 8 Gy was measured by flow cytometry using annexin V/propidium iodide. RESULTS. The mean DSB/Gy/DNA unit obtained was 1.70 ± 0.83 (range 0.63-4.08; median, 1.46). Radiation-induced apoptosis increased with radiation dose (median 12.36, 17.79 and 24.83 for 1, 2 and 8 Gy, respectively). We observed that "expected resistant patients" (DSB values lower than 1.78 DSB/Gy per 200 Mbp and RIA values over 9.58, 14.40 or 24.83 for 1, 2 and 8 Gy, respectively) were at low risk of suffering severe subcutaneous late toxicity (HR 0.223, 95% CI 0.073-0.678, P = 0.008; HR 0.206, 95% CI 0.063-0.677, P = 0.009; HR 0.239, 95% CI 0.062-0.929, P = 0.039, for RIA at 1, 2 and 8 Gy, respectively) in multivariate analysis. CONCLUSIONS.
A radiation-resistant profile is proposed: in our series, patients who presented lower levels of initial DNA damage and higher levels of radiation-induced apoptosis were at low risk of suffering severe subcutaneous late toxicity after clinical treatment at high radiation doses. However, given the small sample size, prospective studies with larger numbers of patients are needed to validate these results.
Abstract:
INTRODUCTION The human host immune response following infection with the new variant of the A/H1N1 pandemic influenza virus (nvH1N1) is poorly understood. Here we used systemic cytokine and antibody levels to evaluate differences in the early immune response of mild and severe patients infected with nvH1N1. METHODS We profiled 29 cytokines and chemokines and evaluated haemagglutination inhibition activity as quantitative and qualitative measurements of the host immune response in serum obtained during the first five days after symptom onset in two cohorts of nvH1N1-infected patients. Severe patients (n = 20) required hospitalization due to respiratory insufficiency (10 of them were admitted to the intensive care unit), while mild patients (n = 15) had exclusively flu-like symptoms. A group of healthy donors was included as a control (n = 15). Differences in mediator levels between groups were assessed using the nonparametric Mann-Whitney U test. Association between variables was determined by calculating the Spearman correlation coefficient. Viral load was measured in serum using real-time PCR targeting the neuraminidase gene. RESULTS Increased levels of innate-immunity mediators (IP-10, MCP-1, MIP-1beta) and the absence of anti-nvH1N1 antibodies characterized the early response to nvH1N1 infection in both hospitalized and mild patients. High systemic levels of type-II interferon (IFN-gamma), and of a group of mediators involved in the development of T-helper 17 (IL-8, IL-9, IL-17, IL-6) and T-helper 1 (TNF-alpha, IL-15, IL-12p70) responses, were found exclusively in hospitalized patients. IL-15, IL-12p70 and IL-6 constituted a hallmark of critical illness in our study. A significant inverse association was found between IL-6, IL-8 and PaO2 in critical patients.
CONCLUSIONS While infection with nvH1N1 induces a typical innate response in both mild and severe patients, severe disease with respiratory involvement is characterized by early secretion of Th17 and Th1 cytokines, which are usually associated with cell-mediated immunity but are also commonly linked to the pathogenesis of autoimmune/inflammatory diseases. The exact role of Th1 and Th17 mediators in the evolution of mild and severe nvH1N1 disease merits further investigation to determine whether these cytokines play a detrimental or beneficial role in severe illness.
Abstract:
INTRODUCTION Both higher and lower cerebral perfusion pressure (CPP) thresholds have been proposed to improve brain tissue oxygen pressure (PtiO2) and outcome. We studied the distribution of hypoxic PtiO2 samples at different CPP thresholds, using prospective multimodality monitoring in patients with severe traumatic brain injury. METHODS This is a prospective observational study of 22 severely head-injured patients admitted to a neurosurgical critical care unit, from whom multimodality data were collected during standard management directed at improving intracranial pressure, CPP and PtiO2. Local PtiO2 was continuously measured in uninjured areas, and snapshot samples were collected hourly and analyzed in relation to simultaneous CPP. Other variables that influence tissue oxygen availability, mainly arterial oxygen saturation, end-tidal carbon dioxide, body temperature and effective hemoglobin, were also monitored and kept stable in order to avoid non-ischemic hypoxia. RESULTS Half of the PtiO2 samples indicated a risk of hypoxia (defined as PtiO2 ≤ 15 mmHg) when CPP was below 60 mmHg; this percentage decreased to 25% when CPP was between 60 and 70 mmHg, and to 10% when CPP was above 70 mmHg (P < 0.01). CONCLUSION Our study indicates that the risk of brain tissue hypoxia in severely head-injured patients may be substantial when CPP is below the commonly recommended threshold of 60 mmHg, remains elevated when CPP is slightly above this threshold, and decreases at higher CPP values.
Abstract:
INTRODUCTION Genetic variation may influence clinical outcomes in patients with sepsis. The present study was conducted to evaluate the impact of three polymorphisms on mortality after adjusting for confounding variables, and to assess the factors involved in progression of the inflammatory response in septic patients. METHODS The inception cohort study included all Caucasian adults admitted to the hospital with sepsis. Sepsis severity, microbiological information and clinical variables were recorded. Three polymorphisms were identified in all patients by PCR: the tumour necrosis factor (TNF)-alpha -308 promoter polymorphism, the polymorphism in the first intron of the TNF-beta gene, and the IL-10 -1082 promoter polymorphism. Patients included in the study were followed up for 90 days after hospital admission. RESULTS A group of 224 patients was enrolled in the present study. We did not find a significant association between any of the three polymorphisms and mortality or a worsening inflammatory response. By multivariate logistic regression analysis, only two factors were independently associated with mortality: the Acute Physiology and Chronic Health Evaluation (APACHE) II score and delayed initiation of adequate antibiotic therapy. In septic shock patients (n = 114), delayed initiation of adequate antibiotic therapy was the only independent predictor of mortality. Risk factors for impairment of the inflammatory response were APACHE II score, positive blood culture and delayed initiation of adequate antibiotic therapy. CONCLUSION This study emphasizes that prompt and adequate antibiotic therapy is the cornerstone of therapy in sepsis. The three polymorphisms evaluated in the present study do not appear to influence the outcome of patients admitted to the hospital with sepsis.
Abstract:
INTRODUCTION Metastases are detected in 20% of patients with solid tumours at diagnosis and in a further 30% after diagnosis. Radiation therapy (RT) has proven effective in bone metastases (BM) and brain metastases (BrM). The objective of this study was to analyze the variability of RT utilization rates in clinical practice and the accessibility of medical technology in our region. PATIENTS AND METHODS We reviewed the clinical records and RT treatment sheets of all patients undergoing RT for BM and/or BrM during 2007 in the 12 public hospitals of an autonomous region of Spain. Data were gathered on hospital type, patient type and RT treatment characteristics. Calculation of the rate of RT use was based on the cancer incidence and the number of RT treatments for BM, BrM and all cancer sites. RESULTS Of the 9319 patients undergoing RT during 2007 for cancer at any site, 1242 (13.3%; inter-hospital range, 26.3%) received RT for BM (n = 744) or BrM (n = 498). These 1242 patients represented 79% of all RT treatments with palliative intent, and the most frequent primary tumours were in the lung, breast, prostate or digestive system. No significant differences between the BM and BrM groups were observed in mean age (62 vs. 59 years), gender (approximately 64% male and 36% female in both), performance status (ECOG 0-1 in 70% vs. 71%), mean distance from hospital (36 vs. 28.6 km) or time from consultation to RT treatment (13 vs. 14.3 days). RT regimens differed among hospitals and between patient groups: 10 × 300 cGy, 5 × 400 cGy and 1 × 800 cGy were applied in 32%, 27% and 25%, respectively, of BM patients, whereas 10 × 300 cGy was used in 49% of BrM patients. CONCLUSIONS Palliative RT use in BM and BrM is high and close to the expected rate, unlike the global rate of RT application for all cancers in our setting. Differences in RT schedules among hospitals may reflect variability in clinical practice among medical teams.
Abstract:
BACKGROUND In cervical postoperative radiotherapy, the target volume is usually the same as the extension of the previous dissection. We evaluated a protocol of selective irradiation according to the risk estimated for each dissected lymph node level. METHODS Eighty patients with oral/oropharyngeal cancer were included in this prospective clinical study between 2005 and 2008. Patients underwent surgery of the primary tumor and cervical dissection, with identification of positive nodal levels, followed by selective postoperative radiotherapy. Three types of selective nodal clinical target volume (CTV) were defined: CTV0, CTV1 and CTV2, with subclinical disease risks of <10%, 10-25% and >25%, and prescribed radiation doses of <35 Gy, 50 Gy and 66-70 Gy, respectively. The localization of nodal failure was categorized as in-field, marginal, or outside the irradiated field. RESULTS A consistent pattern of cervical infiltration was observed in 97% of positive dissections. Lymph node failure occurred within a high-risk irradiated area (CTV1-CTV2) in 12 patients, in a marginal area (CTV1/CTV0) in 1 patient, and in a non-irradiated low-risk area (CTV0) in 2 patients. The volume of selective lymph node irradiation was below the standard radiation volume in 33 patients (mean of 118.6 cc per patient). This decrease in irradiated volume was associated with greater treatment compliance and reduced secondary toxicity. The three-year actuarial nodal control rate was 80%. CONCLUSION This selective postoperative neck irradiation protocol was associated with a failure pattern similar to that observed after standard neck irradiation, and achieved a significant reduction in target volume and secondary toxicity.
Abstract:
INTRODUCTION: Evidence-based recommendations are needed to guide the acute management of the bleeding trauma patient. When these recommendations are implemented, patient outcomes may be improved. METHODS: The multidisciplinary Task Force for Advanced Bleeding Care in Trauma was formed in 2005 with the aim of developing a guideline for the management of bleeding following severe injury. This document represents an updated version of the guideline published by the group in 2007 and updated in 2010. Recommendations were formulated using a nominal group process, the Grading of Recommendations Assessment, Development and Evaluation (GRADE) hierarchy of evidence, and a systematic review of published literature. RESULTS: Key changes encompassed in this version of the guideline include new recommendations on the appropriate use of vasopressors and inotropic agents, and reflect an awareness of the growing number of patients in the population at large treated with antiplatelet agents and/or oral anticoagulants. The current guideline also includes recommendations and a discussion of thromboprophylactic strategies for all patients following traumatic injury. The most significant addition is a new section that discusses the need for every institution to develop, implement and adhere to an evidence-based clinical protocol to manage traumatically injured patients. The remaining recommendations have been re-evaluated and graded based on literature published since the last edition of the guideline. Consideration was also given to changes in clinical practice that have taken place during this time period as a result of both new evidence and changes in the general availability of relevant agents and technologies. CONCLUSIONS: A comprehensive, multidisciplinary approach to trauma care, together with mechanisms to ensure that established protocols are consistently implemented, will ensure a uniform and high standard of care across Europe and beyond.
Abstract:
According to the World Health Organization, traumatic injuries worldwide are responsible for over 5 million deaths annually. Post-traumatic bleeding caused by traumatic injury-associated coagulopathy is the leading cause of potentially preventable death among trauma patients. Despite these facts, awareness of this problem is insufficient and treatment options are often unclear. The STOP the Bleeding Campaign therefore aims to increase awareness of the phenomenon of post-traumatic coagulopathy and its appropriate management by publishing European guidelines for the management of the bleeding trauma patient, by promoting and monitoring the implementation of these guidelines and by preparing promotional and educational material, organising activities and developing health quality management tools. The campaign aims to reduce the number of patients who die within 24 hours after arrival in the hospital due to exsanguination by a minimum of 20% within the next 5 years.
Abstract:
INTRODUCTION Evidence-based recommendations are needed to guide the acute management of the bleeding trauma patient, which when implemented may improve patient outcomes. METHODS The multidisciplinary Task Force for Advanced Bleeding Care in Trauma was formed in 2005 with the aim of developing a guideline for the management of bleeding following severe injury. This document presents an updated version of the guideline published by the group in 2007. Recommendations were formulated using a nominal group process, the Grading of Recommendations Assessment, Development and Evaluation (GRADE) hierarchy of evidence, and a systematic review of published literature. RESULTS Key changes encompassed in this version of the guideline include new recommendations on coagulation support and monitoring and on the appropriate use of local haemostatic measures, tourniquets, calcium and desmopressin in the bleeding trauma patient. The remaining recommendations have been re-evaluated and graded based on literature published since the last edition of the guideline. Consideration was also given to changes in clinical practice that have taken place during this time period as a result of both new evidence and changes in the general availability of relevant agents and technologies. CONCLUSIONS This guideline provides an evidence-based multidisciplinary approach to the management of critically injured bleeding trauma patients.
Abstract:
Much medical research is observational. The reporting of observational studies is often of insufficient quality. Poor reporting hampers the assessment of the strengths and weaknesses of a study and the generalisability of its results. Taking into account empirical evidence and theoretical considerations, a group of methodologists, researchers, and editors developed the Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) recommendations to improve the quality of reporting of observational studies. The STROBE Statement consists of a checklist of 22 items, which relate to the title, abstract, introduction, methods, results and discussion sections of articles. Eighteen items are common to cohort studies, case-control studies and cross-sectional studies and four are specific to each of the three study designs. The STROBE Statement provides guidance to authors about how to improve the reporting of observational studies and facilitates critical appraisal and interpretation of studies by reviewers, journal editors and readers. This explanatory and elaboration document is intended to enhance the use, understanding, and dissemination of the STROBE Statement. The meaning and rationale for each checklist item are presented. For each item, one or several published examples and, where possible, references to relevant empirical studies and methodological literature are provided. Examples of useful flow diagrams are also included. The STROBE Statement, this document, and the associated Web site (http://www.strobe-statement.org/) should be helpful resources to improve reporting of observational research.
Abstract:
CONTEXT: Previous studies may have underestimated the contribution of health behaviors to social inequalities in mortality because health behaviors were assessed only at the baseline of the study. OBJECTIVE: To examine the role of health behaviors in the association between socioeconomic position and mortality and compare whether their contribution differs when assessed at only 1 point in time with that assessed longitudinally through the follow-up period. DESIGN, SETTING, AND PARTICIPANTS: Established in 1985, the British Whitehall II longitudinal cohort study includes 10 308 civil servants, aged 35 to 55 years, living in London, England. Analyses are based on 9590 men and women followed up for mortality until April 30, 2009. Socioeconomic position was derived from civil service employment grade (high, intermediate, and low) at baseline. Smoking, alcohol consumption, diet, and physical activity were assessed 4 times during the follow-up period. MAIN OUTCOME MEASURES: All-cause and cause-specific mortality. RESULTS: A total of 654 participants died during the follow-up period. In the analyses adjusted for sex and year of birth, those with the lowest socioeconomic position had 1.60 times higher risk of death from all causes than those with the highest socioeconomic position (a rate difference of 1.94/1000 person-years). This association was attenuated by 42% (95% confidence interval [CI], 21%-94%) when health behaviors assessed at baseline were entered into the model and by 72% (95% CI, 42%-154%) when they were entered as time-dependent covariates. The corresponding attenuations were 29% (95% CI, 11%-54%) and 45% (95% CI, 24%-79%) for cardiovascular mortality and 61% (95% CI, 16%-425%) and 94% (95% CI, 35%-595%) for noncancer and noncardiovascular mortality. 
The difference between the baseline-only and repeated assessments of health behaviors was mostly due to an increased explanatory power of diet (from 7% to 17% for all-cause mortality), physical activity (from 5% to 21% for all-cause mortality), and alcohol consumption (from 3% to 12% for all-cause mortality). The role of smoking, the strongest mediator in these analyses, did not change when using baseline or repeated assessments (from 32% to 35% for all-cause mortality). CONCLUSION: In a civil service population in London, England, there was an association between socioeconomic position and mortality that was substantially accounted for by adjustment for health behaviors, particularly when the behaviors were assessed repeatedly.
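The attenuation percentages above express how much of the excess mortality risk is explained once the behavioral mediators enter the model. Assuming the conventional formula 100 × (HR_unadjusted − HR_adjusted) / (HR_unadjusted − 1) — an assumption for illustration; the paper's exact method may differ — a minimal sketch:

```python
def percent_attenuation(hr_unadjusted, hr_adjusted):
    """Share of the excess hazard explained by added covariates, assuming
    the conventional formula 100 * (HR_unadj - HR_adj) / (HR_unadj - 1).
    This is an illustrative convention, not necessarily the study's method."""
    return 100.0 * (hr_unadjusted - hr_adjusted) / (hr_unadjusted - 1.0)

# With the reported baseline hazard ratio of 1.60, a 42% attenuation would
# correspond to an adjusted hazard ratio of about 1.35 (hypothetical value,
# back-calculated for illustration).
print(round(percent_attenuation(1.60, 1.348), 1))  # 42.0
```

Under this convention, full mediation (adjusted HR = 1) gives 100% attenuation, and no change in the HR gives 0%.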
Abstract:
BACKGROUND: Cytomegalovirus (CMV) retinitis is a major cause of visual impairment and blindness among patients with uncontrolled HIV infections. Whereas polymorphisms in interferon-lambda 3 (IFNL3, previously named IL28B) strongly influence the clinical course of hepatitis C, few studies examined the role of such polymorphisms in infections due to viruses other than hepatitis C virus. OBJECTIVES: To analyze the association of newly identified IFNL3/4 variant rs368234815 with susceptibility to CMV-associated retinitis in a cohort of HIV-infected patients. DESIGN AND METHODS: This retrospective longitudinal study included 4884 white patients from the Swiss HIV Cohort Study, among whom 1134 were at risk to develop CMV retinitis (CD4 nadir <100 /μl and positive CMV serology). The association of CMV-associated retinitis with rs368234815 was assessed by cumulative incidence curves and multivariate Cox regression models, using the estimated date of HIV infection as a starting point, with censoring at death and/or lost follow-up. RESULTS: A total of 40 individuals among 1134 patients at risk developed CMV retinitis. The minor allele of rs368234815 was associated with a higher risk of CMV retinitis (log-rank test P = 0.007, recessive mode of inheritance). The association was still significant in a multivariate Cox regression model (hazard ratio 2.31, 95% confidence interval 1.09-4.92, P = 0.03), after adjustment for CD4 nadir and slope, HAART and HIV-risk groups. CONCLUSION: We reported for the first time an association between an IFNL3/4 polymorphism and susceptibility to AIDS-related CMV retinitis. IFNL3/4 may influence immunity against viruses other than HCV.