Abstract:
Background: Inhibition of the c-Jun N-terminal kinase (JNK) pathway by the TAT-coupled peptide XG-102 (formerly D-JNKI1) induces strong neuroprotection in ischemic stroke in rodents. We investigated the effect of JNK inhibition in intracerebral hemorrhage (ICH). Methods: Three hours after induction of ICH by intrastriatal collagenase injection in mice, the animals received an intravenous injection of 100 µg/kg of XG-102. The neurological outcome was assessed daily and the mice were sacrificed at 6 h, 1, 2 or 5 days after ICH. Results: XG-102 administration significantly improved the neurological outcome at 1 day (p < 0.01). The lesion volume was significantly decreased after 2 days (29 ± 11 vs. 39 ± 5 mm³ in vehicle-treated animals, p < 0.05). There was also decreased hemispheric swelling (14 ± 13 vs. 26 ± 9% in vehicle-treated animals, p < 0.05), correlating with increased aquaporin 4 expression. Conclusions: XG-102 attenuates cerebral edema and functional impairment at early time points in ICH. The beneficial effects observed with XG-102 in ICH, as well as in ischemic stroke, open the possibility of treating stroke patients rapidly, before imaging, thereby saving precious time.
Abstract:
According to the most widely accepted Cattell-Horn-Carroll (CHC) model of intelligence measurement, each subtest score of the Wechsler Adult Intelligence Scale (3rd ed.; WAIS-III) should reflect both 1st- and 2nd-order factors (i.e., 4 or 5 broad abilities and 1 general factor). To disentangle the contribution of each factor, we applied a Schmid-Leiman orthogonalization transformation (SLT) to the standardization data published in the French technical manual for the WAIS-III. Results showed that the general factor accounted for 63% of the common variance and that the specific contributions of the 1st-order factors were weak (4.7%-15.9%). We also addressed this issue using confirmatory factor analysis. Results indicated that the bifactor model (with 1st-order group factors and a general factor) fit the data better than did the traditional higher-order structure. Models based on the CHC framework were also tested. Results indicated that a higher-order CHC model showed a better fit than did the classical 4-factor model; however, the WAIS bifactor structure was the most adequate. We recommend that users not discount the Full Scale IQ when interpreting the index scores of the WAIS-III, because the general factor accounts for the bulk of the common variance in the French WAIS-III. The 4 index scores cannot be considered to reflect only broad abilities, because they include a strong contribution of the general factor.
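To make the Schmid-Leiman logic concrete, the sketch below applies the transformation to an invented higher-order solution (the loading matrix F and second-order loadings g are hypothetical, not the French WAIS-III values): the general-factor loadings are the product of first- and second-order loadings, and the group factors keep only the residual part.

```python
import numpy as np

# Hypothetical first-order loadings of 8 subtests on 2 group factors
F = np.array([
    [0.80, 0.00], [0.70, 0.00], [0.75, 0.00], [0.60, 0.00],
    [0.00, 0.70], [0.00, 0.80], [0.00, 0.65], [0.00, 0.60],
])
# Hypothetical second-order loadings of the group factors on g
g = np.array([0.85, 0.80])

general = F @ g                      # general-factor loadings after SLT
residual = F * np.sqrt(1.0 - g**2)   # residualized group-factor loadings

common = (general**2).sum() + (residual**2).sum()
print((general**2).sum() / common)   # share of common variance due to g
```

With these invented loadings the general factor carries roughly two thirds of the common variance, the same order of magnitude as the 63% reported for the French standardization data.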
Nimesulide, a cyclooxygenase-2 preferential inhibitor, impairs renal function in the newborn rabbit.
Abstract:
Tocolysis with nonsteroidal anti-inflammatory drugs (NSAIDs) has been widely accepted for several years. Recently, the use of the cyclooxygenase-2 (COX2) preferential NSAID nimesulide has been proposed. However, data reporting neonatal acute renal failure or irreversible end-stage renal failure after maternal ingestion of nimesulide question the safety of this drug for the fetus and the neonate. This study was therefore designed to define the renal effects of nimesulide in newborn rabbits. Experiments were performed in 28 newborn rabbits. Renal function and hemodynamic parameters were measured using inulin and para-aminohippuric acid clearances as markers of GFR and renal blood flow, respectively. After a control period, nimesulide 2, 20, or 200 µg/kg was given as an i.v. bolus, followed by a 0.05, 0.5, or 5 µg·kg⁻¹·min⁻¹ infusion. Nimesulide administration induced a significant dose-dependent increase in renal vascular resistance (29, 37, and 92%, respectively), with a concomitant decrease in diuresis (-5, -23, and -44%), GFR (-12, -23, and -47%), and renal blood flow (-23, -23, and -48%). These results contrast with recent reports claiming that selective COX2 inhibition could be safer for the kidney than nonselective NSAIDs. These experiments confirm that prostaglandins, by maintaining renal vasodilation, play a key role in the delicate balance regulating neonatal GFR. We conclude that COX2-selective/preferential inhibitors should be prescribed with the same caution as nonselective NSAIDs during pregnancy and in the neonatal period.
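For reference, the clearance measurements mentioned in the methods follow the standard formula C = U·V/P (urine concentration times urine flow rate divided by plasma concentration); inulin clearance estimates GFR, and para-aminohippurate clearance estimates effective renal plasma flow, from which renal blood flow is derived. A minimal sketch with illustrative values, not the experimental data:

```python
def clearance(urine_conc, urine_flow, plasma_conc):
    """Renal clearance C = U*V/P, in mL/min when both concentrations
    share the same unit and urine_flow is in mL/min."""
    return urine_conc * urine_flow / plasma_conc

# Illustrative numbers only, chosen to give plausible newborn-scale values
gfr = clearance(urine_conc=12.0, urine_flow=0.02, plasma_conc=0.5)    # inulin
erpf = clearance(urine_conc=30.0, urine_flow=0.02, plasma_conc=0.25)  # PAH
print(f"GFR ~ {gfr:.2f} mL/min, ERPF ~ {erpf:.2f} mL/min")
```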
Abstract:
A new, investigational, parenteral form of sparfloxacin was compared with ceftriaxone in the treatment of experimental endocarditis caused by one of three penicillin-susceptible streptococci or one penicillin-resistant streptococcus. Both drugs have prolonged half-lives in serum, allowing once-daily administration to humans. Sparfloxacin had relatively low MICs (0.25 to 0.5 mg/liter) for all four organisms and was also at least eight times more effective than other quinolones against 21 additional streptococcal isolates recovered from patients with bacteremia. Ceftriaxone MICs were 0.032 to 0.064 mg/liter for the penicillin-susceptible strains and 2 mg/liter for the resistant isolate. Both antibiotics produced moderate bacterial killing in vitro. Rats with catheter-induced aortic vegetations were inoculated with 10⁷ CFU of the test organisms. Antibiotic treatment was started 48 h later and lasted either 3 or 5 days. The drugs were injected at doses which mimicked the kinetics in human serum produced by one intravenous injection of 400 mg of sparfloxacin (i.e., the daily dose expected to be given to human adults) and 2 g of ceftriaxone. Both antibiotics significantly decreased the bacterial densities in the vegetations. However, sparfloxacin was slower than ceftriaxone in eradicating valvular infection caused by penicillin-susceptible bacteria. While this difference was quite marked after 3 days of therapy, it tended to vanish when treatment was prolonged to 5 days. In contrast, sparfloxacin was very effective against the penicillin-resistant isolate, an organism against which ceftriaxone therapy failed in vivo. No sparfloxacin-resistant mutant was selected during therapy. Thus, in the present experimental setting, this new, investigational, parenteral form of sparfloxacin was effective against severe infections caused by both penicillin-susceptible and penicillin-resistant streptococci.
Abstract:
BACKGROUND AND AIMS: Critically ill patients with complicated evolution are frequently hypermetabolic, catabolic, and at risk of underfeeding. This study aimed to assess the relationship between energy balance and outcome in critically ill patients. METHODS: Prospective observational study conducted in consecutive patients staying ≥5 days in the surgical ICU of a university hospital. Demographic data, time to feeding, route, energy delivery, and outcome were recorded. Energy balance was calculated as energy delivery minus target. Data are expressed as means ± SD; linear regressions were performed between energy balance and outcome variables. RESULTS: Forty-eight patients aged 57 ± 16 years were investigated; complete data are available for 669 days. Mechanical ventilation lasted 11 ± 8 days, ICU stay was 15 ± 9 days, and 30-day mortality was 38%. Time to feeding was 3.1 ± 2.2 days. Enteral nutrition was the most frequent route (433 days). Mean daily energy delivery was 1090 ± 930 kcal. Combining enteral and parenteral nutrition achieved the highest energy delivery. Cumulated energy balance was -12,600 ± 10,520 kcal and correlated with complications (P < 0.001), already after 1 week. CONCLUSION: Negative energy balances correlated with an increasing number of complications, particularly infections. Energy debt appears to be a promising tool for nutritional follow-up and should be further tested. Delaying the initiation of nutritional support exposes patients to energy deficits that cannot be compensated later on.
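The energy-balance bookkeeping described in the methods is simply a running sum of daily delivery minus the daily target; a minimal sketch with invented figures (the targets and deliveries below are not study data):

```python
# Hypothetical daily targets and deliveries for one patient, in kcal
targets   = [1800, 1800, 1800, 1800, 1800]
delivered = [   0,  400,  900, 1200, 1500]

daily_balance = [d - t for d, t in zip(delivered, targets)]
cumulated = sum(daily_balance)
print(daily_balance)   # [-1800, -1400, -900, -600, -300]
print(cumulated)       # -5000 kcal of energy debt after 5 days
```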
Abstract:
Late treatment of invasive candidiasis (IC) results in severe complications and high mortality. New tools are needed for early diagnosis. We conducted a retrospective study to assess the diagnostic utility of mannan antigenemia (Mn) and antimannan antibodies (anti-Mn) in neutropenic cancer patients at high risk for candidiasis. Twenty-eight patients with IC (based on European Organization for Research and Treatment of Cancer and Mycoses Study Group definitions) and 25 controls were studied. Mn and anti-Mn were positive (≥0.25 ng/mL and ≥5 AU/mL, respectively) in 25/28 (89%) patients with candidiasis and in 4/25 (16%) controls: sensitivity, 89%; specificity, 84%; positive predictive value, 86%; negative predictive value, 88%. In patients with hepatosplenic lesions, assessing Mn/anti-Mn shortened the median time to diagnosis of candidiasis compared with imaging (9 versus 25 days after fever onset as the first sign of infection; P < 0.001). Candidiasis was diagnosed before neutrophil recovery in 78% and 11% of cases with Mn/anti-Mn and radiology, respectively (P < 0.001). Mn and anti-Mn may be useful for early noninvasive diagnosis of IC.
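The reported accuracy figures follow directly from the 2×2 counts given above (25 of 28 patients and 4 of 25 controls tested positive); a short check:

```python
tp, fn = 25, 3   # invasive candidiasis cases: test positive / test negative
fp, tn = 4, 21   # controls: test positive / test negative

sensitivity = tp / (tp + fn)   # 25/28 ~ 0.89
specificity = tn / (tn + fp)   # 21/25 = 0.84
ppv = tp / (tp + fp)           # 25/29 ~ 0.86
npv = tn / (tn + fn)           # 21/24 ~ 0.88
print(sensitivity, specificity, ppv, npv)
```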
Abstract:
The phylogeographic population structure of Mycobacterium tuberculosis suggests local adaptation to sympatric human populations. We hypothesized that HIV infection, which induces immunodeficiency, will alter the sympatric relationship between M. tuberculosis and its human host. To test this hypothesis, we performed a nine-year nationwide molecular-epidemiological study of HIV-infected and HIV-negative patients with tuberculosis (TB) between 2000 and 2008 in Switzerland. We analyzed 518 TB patients, of whom 112 (21.6%) were HIV-infected and 233 (45.0%) were born in Europe. We found that among European-born TB patients, recent transmission was more likely to occur in sympatric compared to allopatric host-pathogen combinations (adjusted odds ratio [OR] 7.5, 95% confidence interval [95% CI] 1.21 to infinity, p = 0.03). HIV infection was significantly associated with TB caused by an allopatric (as opposed to sympatric) M. tuberculosis lineage (OR 7.0, 95% CI 2.5-19.1, p < 0.0001). This association remained when adjusting for frequent travelling, contact with foreigners, age, sex, and country of birth (adjusted OR 5.6, 95% CI 1.5-20.8, p = 0.01). Moreover, it became stronger with greater immunosuppression, as defined by CD4 T-cell depletion, and was not the result of increased social mixing in HIV-infected patients. Our observation was replicated in a second independent panel of 440 M. tuberculosis strains collected during a population-based study in the Canton of Bern between 1991 and 2011. In summary, these findings support a model for TB in which the stable relationship between the human host and its locally adapted M. tuberculosis is disrupted by HIV infection.
Abstract:
BACKGROUND: This study evaluated the impact of preoperative impaired left ventricular ejection fraction (EF) on short-term survival in octogenarians following coronary bypass surgery. METHODS: A total of 147 octogenarians (mean age 82.1 ± 1.9 years) with coronary artery disease underwent elective coronary artery bypass grafting between January 2000 and December 2009. Patients were stratified into Group I (n = 59) with EF > 50%, Group II (n = 59) with EF between 30% and 50%, and Group III (n = 29) with EF < 30%. RESULTS: There was no difference among the three groups regarding the incidence of COPD, renal failure, congestive heart failure, diabetes, and preoperative cerebrovascular events. Postoperative atrial fibrillation was the sole independent predictive factor for in-hospital mortality (odds ratio (OR), 18.1); in-hospital mortality was 8.5% in Group I, 15.3% in Group II and 10.3% in Group III. Independent predictive factors for mortality during follow-up were: a decrease of EF by more than 5% during follow-up (OR, 5.2), use of the left internal mammary artery as a free graft (OR, 18.1), and an EF at follow-up lower than 40% (OR, 4.8). CONCLUSIONS: These results suggest acceptable in-hospital as well as short-term mortality in octogenarians with impaired EF following coronary artery bypass grafting (CABG), comparable to recent literature in which the mortality of younger patients was up to 15% and short-term mortality up to 40%, respectively. Accordingly, we can also state that in an octogenarian cohort with impaired EF, CABG is a viable treatment with acceptable mortality.
Abstract:
PATIENTS: All neonates admitted between January 2002 and December 2007 and treated by nCPAP were eligible. METHODS: Patients' noses were monitored during nCPAP. Nasal trauma was classified into three stages: (I) persistent erythema; (II) superficial ulceration; and (III) necrosis. RESULTS: 989 neonates were enrolled. Mean gestational age was 34 weeks (SD 4) and mean birth weight 2142 g (SD 840). Nasal trauma was reported in 420 (42.5%) patients and was of stage I, II and III in 371 (88.3%), 46 (11%) and 3 (0.7%) patients, respectively. The incidence and severity of trauma were inversely correlated with gestational age and birth weight. The risk of nasal trauma was greater in neonates <32 weeks of gestational age (OR 2.48, 95% CI 1.59 to 3.86), weighing <1500 g at birth (OR 2.28, 95% CI 1.43 to 3.64), treated >5 days by nCPAP (OR 5.36, 95% CI 3.82 to 7.52), or staying >14 days in the NICU (OR 1.67, 95% CI 1.22 to 2.28). Most cases of nasal trauma (90%) appeared during the first 6 days of nCPAP. Persistent visible scars were present in two cases. CONCLUSIONS: Nasal trauma is a frequent complication of nCPAP, especially in preterm neonates, but long-term cosmetic sequelae are very rare. This study provides a description of nasal trauma and proposes a simple staging system, which could serve as a basis for developing strategies for the prevention and treatment of this iatrogenic event.
Abstract:
BACKGROUND: Prevention of cardiovascular disease (CVD) at the individual level should rely on the assessment of absolute risk using population-specific risk tables. OBJECTIVE: To compare the predictive accuracy of the original and the calibrated SCORE functions regarding 10-year cardiovascular risk in Switzerland. DESIGN: Cross-sectional, population-based study (5773 participants aged 35-74 years). METHODS: The SCORE equation for low-risk countries was calibrated based on the Swiss CVD mortality rates and on the CVD risk factor levels of the study sample. The predicted number of CVD deaths after a 10-year period was computed from the original and the calibrated equations and from the observed cardiovascular mortality for 2003. RESULTS: According to the original and calibrated functions, 16.3 and 15.8% of men and 8.2 and 8.9% of women, respectively, had a 10-year CVD risk ≥5%. The concordance correlation coefficient between the two functions was 0.951 for men and 0.948 for women (both P < 0.001). Both risk functions adequately predicted the 10-year cumulative number of CVD deaths: in men, 71 (original) and 74 (calibrated) deaths versus 73 deaths when using the CVD mortality rates; in women, 44 (original), 45 (calibrated) and 45 (CVD mortality rates), respectively. Compared with the original function, the calibrated function classified more women and fewer men as high risk. Moreover, the calibrated function gave better risk estimates among participants aged over 65 years. CONCLUSION: The original SCORE function adequately predicts CVD death in Switzerland, particularly for individuals aged less than 65 years. The calibrated function provides more reliable estimates for older individuals.
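One common recalibration idea is to rescale the baseline survival of a risk equation so that the mean predicted risk matches the observed national mortality. The sketch below illustrates that idea under assumed numbers; it is not the exact SCORE calibration procedure, and the linear predictors and mortality target are invented:

```python
import numpy as np

rng = np.random.default_rng(0)
lp = rng.normal(0.0, 0.5, size=5773)   # hypothetical centred linear predictors

def risk(lp, s0):
    # 10-year risk of CVD death under a proportional-hazards model
    # with baseline 10-year survival s0
    return 1.0 - s0 ** np.exp(lp)

target = 0.012          # hypothetical observed 10-year CVD mortality
lo, hi = 0.5, 0.999999  # bracket for the calibrated baseline survival
for _ in range(60):     # bisection: mean risk falls as s0 rises
    mid = (lo + hi) / 2
    if risk(lp, mid).mean() > target:
        lo = mid        # predicted risk too high -> raise baseline survival
    else:
        hi = mid
s0_cal = (lo + hi) / 2
print(s0_cal, risk(lp, s0_cal).mean())  # mean predicted risk ~ target
```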
Abstract:
PURPOSE: The recent increase in drug-resistant micro-organisms complicates the management of hospital-acquired bloodstream infections (HA-BSIs). We investigated the epidemiology of HA-BSI and evaluated the impact of drug resistance on outcomes of critically ill patients, controlling for patient characteristics and infection management. METHODS: A prospective, multicentre non-representative cohort study was conducted in 162 intensive care units (ICUs) in 24 countries. RESULTS: We included 1,156 patients [mean ± standard deviation (SD) age, 59.5 ± 17.7 years; 65% males; mean ± SD Simplified Acute Physiology Score (SAPS) II, 50 ± 17] with HA-BSIs, of which 76% were ICU-acquired. Median time to diagnosis was 14 [interquartile range (IQR), 7-26] days after hospital admission. Polymicrobial infections accounted for 12% of cases. Among monomicrobial infections, 58.3% were gram-negative, 32.8% gram-positive, 7.8% fungal and 1.2% due to strict anaerobes. Overall, 629 (47.8%) isolates were multidrug-resistant (MDR), including 270 (20.5%) extensively drug-resistant (XDR) and 5 (0.4%) pan-drug-resistant (PDR). Micro-organism distribution and MDR occurrence varied significantly (p < 0.001) by country. The 28-day all-cause fatality rate was 36%. In the multivariable model including micro-organism, patient and centre variables, independent predictors of 28-day mortality included an MDR isolate [odds ratio (OR), 1.49; 95% confidence interval (95% CI), 1.07-2.06], an uncontrolled infection source (OR, 5.86; 95% CI, 2.5-13.9) and time to adequate treatment (before day 6 after blood culture collection versus never: OR, 0.38; 95% CI, 0.23-0.63; from day 6 onwards versus never: OR, 0.20; 95% CI, 0.08-0.47). CONCLUSIONS: MDR and XDR bacteria (especially gram-negative) are common in HA-BSIs in critically ill patients and are associated with increased 28-day mortality. Intensified efforts to prevent HA-BSIs and to optimize their management through adequate source control and antibiotic therapy are needed to improve outcomes.
Abstract:
BACKGROUND: Treatment strategies for acute basilar artery occlusion (BAO) are based on case series and data that have been extrapolated from stroke intervention trials in other cerebrovascular territories, and information on the efficacy of different treatments in unselected patients with BAO is scarce. We therefore assessed outcomes and differences in treatment response after BAO. METHODS: The Basilar Artery International Cooperation Study (BASICS) is a prospective, observational registry of consecutive patients who presented with an acute symptomatic and radiologically confirmed BAO between November 1, 2002, and October 1, 2007. Stroke severity at time of treatment was dichotomised as severe (coma, locked-in state, or tetraplegia) or mild to moderate (any deficit that was less than severe). Outcome was assessed at 1 month. Poor outcome was defined as a modified Rankin scale score of 4 or 5, or death. Patients were divided into three groups according to the treatment they received: antithrombotic treatment only (AT), which comprised antiplatelet drugs or systemic anticoagulation; primary intravenous thrombolysis (IVT), including subsequent intra-arterial thrombolysis; or intra-arterial therapy (IAT), which comprised thrombolysis, mechanical thrombectomy, stenting, or a combination of these approaches. Risk ratios (RR) for treatment effects were adjusted for age, the severity of neurological deficits at the time of treatment, time to treatment, prodromal minor stroke, location of the occlusion, and diabetes. FINDINGS: 619 patients were entered in the registry. 27 patients were excluded from the analyses because they did not receive AT, IVT, or IAT, and all had a poor outcome. Of the 592 patients who were analysed, 183 were treated with only AT, 121 with IVT, and 288 with IAT. Overall, 402 (68%) of the analysed patients had a poor outcome. No statistically significant superiority was found for any treatment strategy. Compared with outcome after AT, patients with a mild-to-moderate deficit (n=245) had about the same risk of poor outcome after IVT (adjusted RR 0.94, 95% CI 0.60-1.45) or after IAT (adjusted RR 1.29, 0.97-1.72) but had a worse outcome after IAT compared with IVT (adjusted RR 1.49, 1.00-2.23). Compared with AT, patients with a severe deficit (n=347) had a lower risk of poor outcome after IVT (adjusted RR 0.88, 0.76-1.01) or IAT (adjusted RR 0.94, 0.86-1.02), whereas outcomes were similar after treatment with IAT or IVT (adjusted RR 1.06, 0.91-1.22). INTERPRETATION: Most patients in the BASICS registry received IAT. Our results do not support unequivocal superiority of IAT over IVT, and the efficacy of IAT versus IVT in patients with an acute BAO needs to be assessed in a randomised controlled trial. FUNDING: Department of Neurology, University Medical Center Utrecht.
Abstract:
BACKGROUND: Good adherence to antiretroviral therapy (ART) is critical for successful HIV treatment. However, some patients remain virologically suppressed despite suboptimal adherence. We hypothesized that this could result from host genetic factors influencing drug levels. METHODS: Eligible individuals were Caucasians treated with efavirenz (EFV) and/or boosted lopinavir (LPV/r) with self-reported poor adherence, defined as missing doses of ART at least weekly for more than 6 months. Participants were genotyped for single nucleotide polymorphisms (SNPs) in candidate genes previously reported to decrease the clearance of EFV (rs3745274, rs35303484, rs35979566 in CYP2B6) and of LPV/r (rs4149056 in SLCO1B1, rs6945984 in CYP3A, rs717620 in ABCC2). Viral suppression was defined as having HIV-1 RNA <400 copies/ml throughout the study period. RESULTS: From January 2003 until May 2009, 37 individuals on EFV (28 suppressed and 9 not suppressed) and 69 on LPV/r (38 suppressed and 31 not suppressed) were eligible. The poor-adherence period lasted a median of 32 weeks, with 18.9% of EFV and 20.3% of LPV/r patients reporting missed doses on a daily basis. The tested SNPs were not determinants of viral suppression. Reporting missing >1 dose/week was associated with a lower probability of viral suppression compared with missing 1 dose/week (EFV: odds ratio (OR) 0.11, 95% confidence interval (CI): 0.01-0.99; LPV/r: OR 0.29, 95% CI: 0.09-0.94). In both groups, the probability of remaining suppressed increased with the duration of continuous suppression prior to the poor-adherence period (EFV: OR 3.40, 95% CI: 0.62-18.75; LPV/r: OR 5.65, 95% CI: 1.82-17.56). CONCLUSIONS: The investigated genetic variants did not play a significant role in the sustained viral suppression of individuals with suboptimal adherence. The risk of failure decreased with a longer duration of prior viral suppression in this population.
Abstract:
Objective: To investigate personality traits in patients with Alzheimer disease compared with mentally healthy control subjects. We compared both current personality characteristics, using structured interviews, and current versus previous personality traits as assessed by proxies. Method: Fifty-four patients with mild Alzheimer disease and 64 control subjects described their personality traits using the Structured Interview for the Five-Factor Model. Family members filled in the Revised NEO Personality Inventory, Form R, to rate the participants' current personality traits compared with those of 5 years before the estimated beginning of Alzheimer disease, or 5 years earlier for the control subjects. Results: After controlling for age, the Alzheimer disease group presented significantly higher scores than normal control subjects on current neuroticism and significantly lower scores on current extraversion, openness, and conscientiousness, while no significant difference was observed on agreeableness. A similar profile, though less accentuated, was observed when considering personality traits as the patients' proxies remembered them. Diachronic personality assessment again showed significant differences between the 2 groups for the same 4 domains, with important personality changes only in the Alzheimer disease group. Conclusions: Group comparison and retrospective personality evaluation are convergent. Significant personality changes follow a specific trend in patients with Alzheimer disease and contrast with the stability of personality profiles generally observed in mentally healthy people throughout their lives. Whether the personality assessment 5 years before the current status corresponds to an early sign of Alzheimer disease or to real premorbid personality differences in people who later develop Alzheimer disease requires longitudinal studies.
Abstract:
OBJECTIVES: The objectives were to identify the social and medical factors associated with frequent emergency department (ED) use and to determine whether frequent users were more likely to have a combination of these factors in a universal health insurance system. METHODS: This was a retrospective chart review case-control study comparing randomized samples of frequent users and nonfrequent users at the Lausanne University Hospital, Switzerland. The authors defined frequent users as patients with four or more ED visits within the previous 12 months. Adult patients who visited the ED between April 2008 and March 2009 (study period) were included, and patients leaving the ED without medical discharge were excluded. For each patient, the first ED electronic record within the study period was considered for data extraction. Along with basic demographics, variables of interest included social (employment or housing status) and medical (ED primary diagnosis) characteristics. Significant social and medical factors were used to construct a logistic regression model to determine factors associated with frequent ED use. In addition, combinations of social and medical factors were examined. RESULTS: A total of 359 of 1,591 frequent and 360 of 34,263 nonfrequent users were selected. Frequent users accounted for less than a twentieth of all ED patients (4.4%), but for 12.1% of all visits (5,813 of 48,117), with a maximum of 73 ED visits. There was no difference in age or sex, but more frequent users had a nationality other than Swiss or European (n = 117 [32.6%] vs. n = 83 [23.1%], p = 0.003). Adjusted multivariate analysis showed that social and specific medical vulnerability factors most increased the risk of frequent ED use: being under guardianship (adjusted odds ratio [OR] = 15.8; 95% confidence interval [CI] = 1.7 to 147.3), living closer to the ED (adjusted OR = 4.6; 95% CI = 2.8 to 7.6), being uninsured (adjusted OR = 2.5; 95% CI = 1.1 to 5.8), being unemployed or dependent on government welfare (adjusted OR = 2.1; 95% CI = 1.3 to 3.4), the number of psychiatric hospitalizations (adjusted OR = 4.6; 95% CI = 1.5 to 14.1), and the use of five or more clinical departments over 12 months (adjusted OR = 4.5; 95% CI = 2.5 to 8.1). Having two of the four social factors increased the odds of frequent ED use (adjusted OR = 5.4; 95% CI = 2.9 to 9.9), and similar results were found for medical factors (adjusted OR = 7.9; 95% CI = 4.6 to 13.4). A combination of social and medical factors was markedly associated with frequent ED use, as frequent users were 10 times more likely to have three of them (out of a total of eight factors; 95% CI = 5.1 to 19.6). CONCLUSIONS: Frequent users accounted for a moderate proportion of visits to the Lausanne ED. Social and medical vulnerability factors were associated with frequent ED use. In addition, frequent users were more likely than other patients to have both social and medical vulnerabilities. Case management strategies might address the vulnerability factors of frequent users to prevent inequities in health care and related costs.
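As an aside, the unadjusted odds ratio for non-Swiss/non-European nationality can be reproduced from the counts quoted above (117 of 359 frequent users vs. 83 of 360 nonfrequent users); the sketch uses the standard Woolf confidence interval:

```python
from math import exp, log, sqrt

a, b = 117, 359 - 117   # frequent users: exposed / unexposed
c, d = 83, 360 - 83     # nonfrequent users: exposed / unexposed

odds_ratio = (a * d) / (b * c)
se = sqrt(1/a + 1/b + 1/c + 1/d)          # SE of log(OR), Woolf method
ci = (exp(log(odds_ratio) - 1.96 * se),
      exp(log(odds_ratio) + 1.96 * se))
print(f"OR = {odds_ratio:.2f}, 95% CI = {ci[0]:.2f} to {ci[1]:.2f}")
# OR ~ 1.61 (1.16 to 2.24), consistent with the reported p = 0.003
```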