948 results for neuropathic severity


Relevance: 20.00%

Abstract:

OBJECTIVES Previous studies concluded that haemorrhage is one of the most accurate prognostic factors for mortality in leptospirosis. Therefore, endothelial cell activation was investigated in relation to disease severity in severe leptospirosis. METHODS Prospective cohort study of severe leptospirosis patients. Plasma levels of sE-selectin and von Willebrand factor (VWF) were determined. Subsequently, an in vitro endothelial cell model was used to assess endothelial activation after exposure to virulent Leptospira. Finally, immune activation, as a potential contributing factor to endothelial cell activation, was determined by soluble IL-2 receptor (sIL-2r) and soluble Fas ligand (sFasL) levels. RESULTS Plasma levels of sE-selectin and VWF were strongly increased in patients compared to healthy controls. Furthermore, sE-selectin was significantly elevated in survivors compared to non-survivors (203 ng/ml vs. 157 ng/ml, p < 0.05). Endothelial cells exposed to virulent Leptospira showed increased VWF expression, whereas E-selectin and ICAM-1 expression did not change. Immunohistochemistry revealed the presence of intracellular Leptospira, and qPCR suggested replication. In vivo analysis showed that increased levels of sFasL and sIL-2r were both strongly associated with mortality. Furthermore, sIL-2r levels were increased in patients who developed bleeding and correlated significantly with the duration of hospital stay. DISCUSSION Markers of endothelial and immune activation were associated with disease severity in leptospirosis patients.

Relevance: 20.00%

Abstract:

BACKGROUND Chronic postsurgical pain (CPSP) is an important clinical problem. Prospective studies of the incidence, characteristics and risk factors of CPSP are needed. OBJECTIVES The objective of this study was to evaluate the incidence and risk factors of CPSP. DESIGN A multicentre, prospective, observational trial. SETTING Twenty-one hospitals in 11 European countries. PATIENTS Three thousand one hundred and twenty patients undergoing surgery and enrolled in the European registry PAIN OUT. MAIN OUTCOME MEASURES Pain-related outcome was evaluated on the first postoperative day (D1) using a standardised pain outcome questionnaire. Review at 6 and 12 months via e-mail or telephone interview used the Brief Pain Inventory (BPI) and the DN4 (Douleur Neuropathique four questions). The primary endpoint was the incidence of moderate to severe CPSP (numeric rating scale, NRS ≥3/10) at 12 months. RESULTS Complete data were available for 1044 patients at 6 months and for 889 patients at 12 months. At 12 months, the incidence of moderate to severe CPSP was 11.8% (95% CI 9.7 to 13.9) and of severe pain (NRS ≥6) 2.2% (95% CI 1.2 to 3.3). Signs of neuropathic pain were recorded in 35.4% (95% CI 23.9 to 48.3) and 57.1% (95% CI 30.7 to 83.4) of patients with moderate and severe CPSP, respectively. Functional impairment (BPI) at 6 and 12 months increased with the severity of CPSP (P < 0.01) and the presence of neuropathic characteristics (P < 0.001). Multivariate analysis identified orthopaedic surgery, preoperative chronic pain and percentage of time in severe pain on D1 as risk factors. A 10% increase in the percentage of time in severe pain was associated with a 30% increase in CPSP incidence at 12 months. CONCLUSION The collection of data on CPSP was feasible within the European registry PAIN OUT. The incidence of moderate to severe CPSP at 12 months was 11.8%. Functional impairment was associated with CPSP severity and neuropathic characteristics.
Risk factors for CPSP in the present study were chronic preoperative pain, orthopaedic surgery and percentage of time in severe pain on D1. TRIAL REGISTRATION Clinicaltrials.gov identifier: NCT01467102.
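The 95% confidence interval reported above for the 11.8% incidence can be reproduced with a standard Wald interval for a proportion. A minimal sketch, assuming roughly 105 of the 889 twelve-month completers had moderate to severe CPSP (the exact count is not given in the abstract):

```python
import math

def wald_ci(p_hat: float, n: int, z: float = 1.96) -> tuple[float, float]:
    """Wald confidence interval for a proportion: p ± z * sqrt(p(1-p)/n)."""
    se = math.sqrt(p_hat * (1 - p_hat) / n)
    return p_hat - z * se, p_hat + z * se

# Assumed count: ~105 of 889 patients with moderate to severe CPSP (≈11.8%)
p = 105 / 889
lo, hi = wald_ci(p, 889)
print(f"{p:.1%} (95% CI {lo:.1%} to {hi:.1%})")  # 11.8% (95% CI 9.7% to 13.9%)
```

With these assumed inputs the interval works out to 9.7% to 13.9%, matching the CI reported in the abstract.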

Relevance: 20.00%

Abstract:

OBJECTIVE The development of peripheral artery disease is affected by the presence of cardiovascular risk factors. It is unclear whether particular risk factors lead to different clinical stages of peripheral artery disease. The aim of this retrospective cross-sectional study was to assess the association of cardiovascular risk factors with the presence of critical limb ischaemia. METHODS The study cohort was derived from a consecutive registry of patients undergoing endovascular therapy in a tertiary referral centre between January 2000 and April 2014. Patients undergoing first-time endovascular intervention for chronic peripheral artery disease of the lower extremities were included. Univariate and multivariate logistic regression models were used to assess the association of age, sex, diabetes mellitus, hypertension, dyslipidaemia, smoking, and renal insufficiency with critical limb ischaemia vs. intermittent claudication. RESULTS A total of 3406 patients were included in the study (mean age 71.7 ± 11.8 years, 2075 [61%] male). There was a significant association of age (OR 1.67, 95%-CI 1.53-1.82, p < 0.001), male gender (OR 1.23, 95%-CI 1.04-1.47, p = 0.016), diabetes (OR 1.99, 95%-CI 1.68-2.36, p < 0.001) and renal insufficiency (OR 1.62, 95%-CI 1.35-1.96, p < 0.001) with the likelihood of critical limb ischaemia. Smoking was associated with intermittent claudication rather than critical limb ischaemia (OR 0.78, 95%-CI 0.65-0.94, p = 0.010), while hypertension and dyslipidaemia did not show an association with critical limb ischaemia. CONCLUSIONS In peripheral artery disease patients undergoing first-time endovascular treatment, age, male gender, diabetes, and renal insufficiency were the strongest predictors for the presence of critical limb ischaemia.
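Odds ratios and confidence intervals like those reported above are obtained by exponentiating logistic-regression coefficients. A minimal sketch; the coefficient and standard error below are back-calculated from the published diabetes OR (1.99, 95%-CI 1.68-2.36) for illustration and are not the study's actual model output:

```python
import math

def odds_ratio_with_ci(beta: float, se: float, z: float = 1.96):
    """Exponentiate a logistic-regression coefficient and its CI bounds."""
    return math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se)

# beta = ln(OR); se back-calculated from the reported CI (assumption)
beta = math.log(1.99)
se = (math.log(2.36) - math.log(1.68)) / (2 * 1.96)
or_, lo, hi = odds_ratio_with_ci(beta, se)
print(f"OR {or_:.2f}, 95% CI {lo:.2f}-{hi:.2f}")  # OR 1.99, 95% CI 1.68-2.36
```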

Relevance: 20.00%

Abstract:

BACKGROUND Oesophageal clearance has scarcely been studied. AIMS Oesophageal clearance in endoscopy-negative heartburn was assessed to detect differences in bolus clearance time among patients sub-grouped according to impedance-pH findings. METHODS In 118 consecutive endoscopy-negative heartburn patients, impedance-pH monitoring was performed off therapy. Acid exposure time, number of refluxes, baseline impedance, post-reflux swallow-induced peristaltic wave index and both automated and manual bolus clearance time were calculated. Patients were sub-grouped into pH/impedance positive (abnormal acid exposure and/or number of refluxes) and pH/impedance negative (normal acid exposure and number of refluxes), the former further subdivided on the basis of abnormal/normal acid exposure time (pH+/-) and abnormal/normal number of refluxes (impedance+/-). RESULTS A poor correlation (r = 0.35) between automated and manual bolus clearance time was found. Manual bolus clearance time progressively decreased from pH+/impedance+ (42.6 s) to pH+/impedance- (27.1 s), pH-/impedance+ (17.8 s) and pH-/impedance- (10.8 s). There was an inverse correlation between manual bolus clearance time and both baseline impedance and the post-reflux swallow-induced peristaltic wave index, and a direct correlation between manual bolus clearance time and acid exposure time. A manual bolus clearance time of 14.8 s had an accuracy of 93% in differentiating pH/impedance-positive from pH/impedance-negative patients. CONCLUSIONS When measured manually, bolus clearance time reflects reflux severity, confirming the pathophysiological relevance of oesophageal clearance in reflux disease.
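As a consistency check, the 14.8 s cutoff reported above separates the sub-group mean clearance times cleanly: only the pH-/impedance- mean falls below it. A minimal sketch using only the four group means quoted in the abstract, not patient-level data:

```python
CUTOFF_S = 14.8  # reported manual bolus clearance time cutoff (seconds)

# Mean manual bolus clearance times per sub-group, from the abstract
group_means = {
    "pH+/impedance+": 42.6,
    "pH+/impedance-": 27.1,
    "pH-/impedance+": 17.8,
    "pH-/impedance-": 10.8,
}

# Classify each sub-group mean against the cutoff
classified_positive = {g: t > CUTOFF_S for g, t in group_means.items()}
# Only the pH-/impedance- group mean falls below the cutoff
```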

Relevance: 20.00%

Abstract:

BACKGROUND: A lack of adaptive coping and enhanced maladaptive coping with stress and negative emotions are implicated in many psychopathological disorders. We describe the development of a new scale to investigate the relative contribution of different coping styles to psychopathology in a large population sample. We hypothesized that the magnitude of the expected positive correlation between maladaptive coping and psychopathology would be stronger than the expected negative correlation between adaptive coping and psychopathology. We also examined whether distinct coping style patterns emerge for different psychopathological syndromes. METHODS: A total of 2200 individuals from the general population participated in an online survey. The Patient Health Questionnaire-9 (PHQ-9), the Obsessive-Compulsive Inventory revised (OCI-R) and the Paranoia Checklist were administered along with a novel instrument, the Maladaptive and Adaptive Coping Styles (MAX) questionnaire. Participants were reassessed six months later. RESULTS: MAX consists of three dimensions representing adaptive coping, maladaptive coping and avoidance. Similar response patterns emerged across all psychopathological syndromes. Maladaptive coping was more strongly related to psychopathology than adaptive coping, both cross-sectionally and longitudinally. The overall number of coping styles adopted by an individual predicted greater psychopathology. Mediation analysis suggests that a mild positive relationship between adaptive and certain maladaptive styles (emotional suppression) partially accounts for the attenuated relationship between adaptive coping and depressive symptoms. LIMITATIONS: Results should be replicated in a clinical population. CONCLUSIONS: Results suggest that maladaptive and adaptive coping styles are not reciprocal. Reducing maladaptive coping seems to be more important for outcome than enhancing adaptive coping.
The study supports transdiagnostic approaches advocating that maladaptive coping is a common factor across different psychopathologies.

Relevance: 20.00%

Abstract:

Streptococcus pneumoniae bacteria can be characterized into over 90 serotypes according to the composition of their polysaccharide capsules. Some serotypes are common in nasopharyngeal carriage whereas others are associated with invasive disease, but when carriage serotypes do invade, disease is often particularly severe. It is unknown whether disease severity is due directly to the capsule type or to other virulence factors. Here, we used a clinical pneumococcal isolate and its capsule-switch mutants to determine the effect of capsule, in isolation from the genetic background, on the severity of meningitis in an infant rat model. We found that possession of a capsule was essential for causing meningitis. Serotype 6B caused significantly more mortality than 7F, and this correlated with increased capsule thickness in the cerebrospinal fluid (CSF), a stronger inflammatory cytokine response in the CSF and ultimately more cortical brain damage. We conclude that capsule type has a direct effect on meningitis severity. This is an important consideration in the current era of vaccination targeting a subset of capsule types, which causes serotype replacement.

Relevance: 20.00%

Abstract:

Gastrointestinal (GI) protein loss, due to lymphangiectasia or chronic inflammation, can be challenging to diagnose. This study evaluated the diagnostic accuracy of serum and fecal canine α1-proteinase inhibitor (cα1PI) concentrations to detect crypt abscesses and/or lacteal dilation in dogs. Serum and fecal cα1PI concentrations were measured in 120 dogs undergoing GI tissue biopsies and were compared between dogs with and without crypt abscesses/lacteal dilation. Sensitivity and specificity were calculated for dichotomous outcomes. Serial serum cα1PI concentrations were also evaluated in 12 healthy corticosteroid-treated dogs. Serum cα1PI and albumin concentrations were significantly lower in dogs with crypt abscesses and/or lacteal dilation than in those without (both P < 0.001), and more severe lesions were associated with lower serum cα1PI concentrations, higher 3-day mean fecal cα1PI concentrations, and lower serum/fecal cα1PI ratios. Serum and fecal cα1PI, and their ratios, distinguished dogs with moderate or severe GI crypt abscesses/lacteal dilation from dogs with mild or no such lesions with moderate sensitivity (56-92%) and specificity (67-81%). Serum cα1PI concentrations increased during corticosteroid administration. We conclude that serum and fecal cα1PI concentrations reflect the severity of intestinal crypt abscesses/lacteal dilation in dogs. Due to its specificity for the GI tract, measurement of fecal cα1PI appears to be superior to serum cα1PI for diagnosing GI protein loss in dogs. In addition, the serum/fecal cα1PI ratio has improved accuracy in hypoalbuminemic dogs, but serum cα1PI concentrations should be interpreted carefully in corticosteroid-treated dogs.
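Sensitivity and specificity, as reported above, are simple ratios over a 2×2 table of test results against biopsy findings. A minimal sketch with hypothetical counts chosen to land inside the reported ranges (56-92% and 67-81%); the actual 2×2 tables are not given in the abstract:

```python
def sensitivity_specificity(tp: int, fn: int, tn: int, fp: int) -> tuple[float, float]:
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts for illustration only
sens, spec = sensitivity_specificity(tp=23, fn=2, tn=77, fp=18)
print(f"sensitivity {sens:.0%}, specificity {spec:.0%}")  # sensitivity 92%, specificity 81%
```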

Relevance: 20.00%

Abstract:

BACKGROUND Catechol-O-methyltransferase (COMT) initiates dopamine degradation. Its activity is mainly determined by a single nucleotide polymorphism in the COMT gene (Val158Met, rs4680), separating high (Val/Val, COMT(HH)), intermediate (Val/Met, COMT(HL)) and low metabolizers (Met/Met, COMT(LL)). We investigated dopaminergic denervation in the striatum in Parkinson's disease (PD) patients according to COMT rs4680 genotype. METHODS Patients with idiopathic PD were assessed for motor severity (UPDRS-III rating scale in the OFF state) and dopaminergic denervation using [123I]-FP-CIT SPECT imaging, and were genotyped for the COMT rs4680 polymorphism. [123I]-FP-CIT binding potential (BP) for each voxel was defined by the ratio of tracer binding in the region of interest (striatum, caudate nucleus and putamen) to that in a region of non-specific activity. Genotyping was performed using a TaqMan® SNP genotyping assay. We used a regression model to evaluate the effect of COMT genotype on the BP in the striatum and its sub-regions. RESULTS Genotype distribution was: 11 (27.5%) COMT(HH), 26 (65%) COMT(HL) and 3 (7.5%) COMT(LL). There were no significant differences in disease severity, treatments, or motor scores between genotypes. When adjusted for clinical severity, gender and age, low and intermediate metabolizers showed significantly higher rates of striatal denervation (COMT(HL+LL) BP = 1.32 ± 0.04) than high metabolizers (COMT(HH) BP = 1.6 ± 0.08; F(1,34) = 9.0, p = 0.005). Striatal sub-regions showed similar results. BP and UPDRS-III motor scores were significantly correlated (r = 0.44, p = 0.04). There was a gender effect, but no gender-genotype interaction. CONCLUSIONS Striatal denervation differs according to the COMT Val158Met polymorphism. COMT activity may play a role as a compensatory mechanism in PD motor symptoms.

Relevance: 20.00%

Abstract:

Predicting the timing and amount of tree mortality after a forest fire is of paramount importance for post-fire management decisions, such as salvage logging or reforestation. Such knowledge is particularly needed in mountainous regions, where forest stands often serve as protection against natural hazards (e.g., snow avalanches, rockfalls, landslides). In this paper, we focus on the drivers and timing of mortality in fire-injured beech trees (Fagus sylvatica L.) in mountain regions. We studied beech forests in the southwestern European Alps that burned between 1970 and 2012. The results show that beech trees, which lack fire-resistance traits, experience increased mortality within the first two decades post-fire, with a timing and amount strongly related to burn severity. Beech mortality is fast and ubiquitous in high-severity sites, whereas small-diameter (DBH <12 cm) and intermediate-diameter (DBH 12–36 cm) trees are at higher risk of dying in moderate-severity sites. Large-diameter trees mostly survive, representing a crucial ecological legacy for beech regeneration. In low-severity sites, mortality remains low, at a level similar to unburnt beech forests. Beech tree diameter, the presence of fungal infestation and elevation are the most significant drivers of mortality. The risk of dying increases toward higher elevations and is higher for small-diameter than for large-diameter trees. In the case of secondary fungal infestation, beech generally faces a higher risk of dying. Interestingly, the fungi that initiate post-fire tree mortality differ from those occurring after mechanical injury. From a management point of view, the insights into the controls of post-fire mortality provided by this study should help in planning post-fire silvicultural measures in montane beech forests.

Relevance: 20.00%

Abstract:

Background. Clostridium difficile infection is one of the major causes of antibiotic-associated diarrhea and colitis in the United States. Currently, there is a dearth of literature on the differences in risk factors and outcomes between patients with infection due to the hypervirulent strain and those with non-hypervirulent strains. The objective of this study was to determine the relationship between C. difficile toxin type and clinical features, severity and outcome in patients with C. difficile diarrhea. Methods. The case group included 37 patients who had infections due to the hypervirulent strain (tcdC deletion) and the control group included 55 patients with other toxin types (toxin A, B, binary toxin). A univariate analysis was performed, followed by a multivariable logistic regression analysis to assess the differences between cases and controls. Results. In the multivariate analysis, we found that being male was a protective factor for developing infection due to the hypervirulent strain [OR 0.33; 95% CI 0.12-0.90]. The hypervirulent group also had worse clinical and economic outcomes, although the differences were small and nonsignificant. Conclusions. There is likely no predictive risk factor for acquiring infection due to the hypervirulent strain, and acquisition may be more closely linked to the infection control practices of individual hospitals or the location of patients. Hence, better infection control practices may prove helpful in decreasing the overall disease burden and thus improve patient outcomes.

Relevance: 20.00%

Abstract:

Clostridium difficile is the most important and common cause of hospital-acquired diarrhea. Toxins A and B are two important protein toxins responsible for C. difficile disease. This systematic review was undertaken to summarize the association between the severity of C. difficile disease and different toxin types. Only five studies met the inclusion criteria. Only two studies reported statistically significant results, finding that C. difficile disease was more severe in patients with binary toxin genes. The other three studies did not report significant findings, but their authors stated that the studies were too small to detect a true association. The main difference between the studies that detected an association and those that did not was sample size. Well-designed, large-scale studies are needed to establish the relationship between severe disease and toxin types.

Relevance: 20.00%

Abstract:

Purpose. To determine which symptoms are the most reported, occur most frequently, have the greatest severity, and cause the most bother for hemodialysis (HD) patients, and to determine whether the symptoms experienced differ between the first (HD 1) and second (HD 2) treatments of the week. Design. An observational, comparative design was used to determine participants' HD symptom experience on HD 1 and HD 2, and the effect of the symptom experience on quality of life (QOL). One hundred subjects were recruited from five dialysis centers. Methods. The adapted Dialysis Frequency, Severity and Symptom Burden Index (DFSSBI) and the Medical Outcomes Study Short Form 36 (MOS SF-36) were administered (N = 99) on HD 1, and the DFSSBI again on HD 2. Data were analyzed for significant differences in symptom experience scores between HD 1 and HD 2 and in relation to QOL, gender and age. Results. Of 31 symptoms assessed, respondents reported an average of 9.69 symptoms on HD 1 and 7.51 symptoms on HD 2. Overall, more symptoms were reported, and symptoms were more frequent, severe and bothersome, on HD 1, when the level of metabolic waste is highest. The most reported symptoms included tiredness, dry skin, difficulty falling asleep, itching, numbness/tingling, difficulty staying asleep, decreased interest in sex, and bone/joint pain. Females scored consistently higher than males in the four symptom dimensions. Respondents scored about the same as the population norm (50) on the physical component summary of the MOS SF-36 and higher than the norm (65.23) on the mental component summary. Conclusion. The study findings highlight that hemodialysis patients experience multiple symptoms that can be frequent, severe, and bothersome. Interventions should be developed and tested to reduce symptom burden and improve QOL.