Abstract:
A cohort of 123 adult contacts was followed for 18-24 months (86 completed the follow-up) to compare conversion and reversion rates based on two serial measures of QuantiFERON (QFT) and tuberculin skin test (TST) (PPD from TUBERSOL, Aventis Pasteur, Canada) for diagnosing latent tuberculosis (TB) in household contacts of TB patients using conventional (C) and borderline zone (BZ) definitions. Questionnaires were used to obtain information regarding TB exposure, TB risk factors and socio-demographic data. QFT (IU/mL) conversion was defined as <0.35 to ≥0.35 (C) or <0.35 to >0.70 (BZ) and reversion was defined as ≥0.35 to <0.35 (C) or ≥0.35 to <0.20 (BZ); TST (mm) conversion was defined as <5 to ≥5 (C) or <5 to >10 (BZ) and reversion was defined as ≥5 to <5 (C). The QFT conversion and reversion rates were 10.5% and 7% with the C definitions and 8.1% and 4.7% with the BZ definitions, respectively. The TST rates were higher than the QFT rates, especially with the C definitions (conversion 23.3%, reversion 9.3%). The QFT conversion and reversion rates were higher for TST ≥5 mm; for TST, both rates were lower for QFT <0.35 IU/mL. No risk factors were associated with the probability of converting or reverting. The inconsistency and apparent randomness of serial testing results are confusing and add to the limitations of these tests and definitions for the follow-up of close TB contacts.
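The serial-test definitions quoted above are fully numeric, so they can be expressed directly in code. The sketch below is illustrative only (the function name and structure are assumptions, not from the study); it classifies a pair of serial QFT results under the conventional (C) versus borderline-zone (BZ) definitions.

```python
# Hypothetical sketch of the QFT serial-test definitions quoted in the
# abstract; names and structure are illustrative, not the study's code.

def qft_status(baseline, followup, borderline_zone=False):
    """Classify a pair of serial QFT results (IU/mL).

    Conventional (C): conversion <0.35 -> >=0.35, reversion >=0.35 -> <0.35.
    Borderline zone (BZ): conversion <0.35 -> >0.70, reversion >=0.35 -> <0.20.
    """
    if borderline_zone:
        if baseline < 0.35 and followup > 0.70:
            return "conversion"
        if baseline >= 0.35 and followup < 0.20:
            return "reversion"
    else:
        if baseline < 0.35 and followup >= 0.35:
            return "conversion"
        if baseline >= 0.35 and followup < 0.35:
            return "reversion"
    return "stable"

print(qft_status(0.10, 0.50))                        # conventional conversion
print(qft_status(0.10, 0.50, borderline_zone=True))  # 0.50 <= 0.70, so stable
```

The BZ definitions deliberately require a larger jump across the 0.35 cut-off, which is why the abstract's BZ conversion and reversion rates are lower than the conventional ones.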
Abstract:
The CIAOW study (Complicated intra-abdominal infections worldwide observational study) is a multicenter observational study conducted in 68 medical institutions worldwide during a six-month study period (October 2012-March 2013). The study included patients older than 18 years undergoing surgery or interventional drainage to address complicated intra-abdominal infections (IAIs). 1,898 patients with a mean age of 51.6 years (range 18-99) were enrolled in the study; 777 (41%) were women and 1,121 (59%) were men. Among these patients, 1,645 (86.7%) had community-acquired IAIs while the remaining 253 (13.3%) had healthcare-associated infections. Intraperitoneal specimens were collected from 1,190 (62.7%) of the enrolled patients. 827 patients (43.6%) had generalized peritonitis while 1,071 (56.4%) had localized peritonitis or abscesses. The overall mortality rate was 10.5% (199/1,898). According to stepwise multivariate analysis (PR = 0.005 and PE = 0.001), several criteria were found to be independent predictors of mortality, including patient age (OR = 1.1; 95% CI = 1.0-1.1; p < 0.0001), small bowel perforation (OR = 2.8; 95% CI = 1.5-5.3; p < 0.0001), a delayed initial intervention (a delay exceeding 24 hours) (OR = 1.8; 95% CI = 1.5-3.7; p < 0.0001), ICU admission (OR = 5.9; 95% CI = 3.6-9.5; p < 0.0001) and patient immunosuppression (OR = 3.8; 95% CI = 2.1-6.7; p < 0.0001).
Abstract:
BACKGROUND: The impact of preoperative impaired left ventricular ejection fraction (EF) on short-term survival of octogenarians following coronary bypass surgery was evaluated in this study. METHODS: A total of 147 octogenarians (mean age 82.1 ± 1.9 years) with coronary artery disease underwent elective coronary artery bypass grafting between January 2000 and December 2009. Patients were stratified into Group I (n = 59) with EF >50%, Group II (n = 59) with EF between 30% and 50%, and Group III (n = 29) with EF <30%. RESULTS: There was no difference among the three groups regarding the incidence of COPD, renal failure, congestive heart failure, diabetes, and preoperative cerebrovascular events. Postoperative atrial fibrillation was the sole independent predictive factor for in-hospital mortality (odds ratio (OR), 18.1); in-hospital mortality was 8.5% in Group I, 15.3% in Group II and 10.3% in Group III. Independent predictive factors for mortality during follow-up were: a decrease of EF during follow-up by more than 5% (OR, 5.2), use of the left internal mammary artery as a free graft (OR, 18.1), and a follow-up EF lower than 40% (OR, 4.8). CONCLUSIONS: The results suggest acceptable in-hospital as well as short-term mortality in octogenarians with impaired EF following coronary artery bypass grafting (CABG), comparable to recent literature in which the mortality of younger patients was up to 15% and short-term mortality up to 40%. Accordingly, in an octogenarian cohort with impaired EF, CABG is a viable treatment with acceptable mortality.
Abstract:
During endovascular procedures in ischaemic stroke, intra-procedural information on brain damage would help guide the decision of whether to attempt arterial recanalisation. METHODS: Blood-gas parameters were studied in blood drawn during endovascular recanalisation procedures. Samples were obtained proximal and distal to the occlusion, and blood-gas analysis was performed immediately. RESULTS: The analysis showed significant differences between pre-occlusion and post-occlusion samples in the partial pressure of oxygen (pre-PaO2 78.9 ± 16.3 mmHg vs. 73.4 ± 14.9 mmHg, p < 0.001). A ROC curve determined that a post-PaO2 > 70 mmHg predicts clinical improvement. Patients with post-PaO2 > 70 mmHg achieved better functional independence (median mRS 3 vs. 6, p = 0.024). In the multivariate analysis, the only independent predictor of clinical improvement was post-PaO2 > 70 mmHg (OR 5.21, 95% CI 1.38-67.24, p = 0.013). CONCLUSION: Obtaining blood samples from injured tissue distal to the occlusion is feasible. This information may be used to predict clinical course and to support decisions during the procedure.
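The ROC-derived cut-off reported above (post-PaO2 greater than 70 mmHg) can be illustrated with a minimal, hand-rolled threshold search. The code below is a hypothetical sketch on invented toy data, not the study's analysis: it scans candidate thresholds and keeps the one maximizing Youden's J = sensitivity + specificity - 1, a common way to pick a single operating point on a ROC curve.

```python
# Illustrative sketch (not the study's code) of choosing a diagnostic
# cut-off by maximizing Youden's J over observed values.

def best_cutoff(values, improved):
    """values: post-occlusion PaO2 per patient; improved: 1/0 outcome.

    Returns the threshold t (test positive when value > t) with the
    highest J = sensitivity + specificity - 1.
    """
    best_t, best_j = None, -1.0
    for t in sorted(set(values)):
        tp = sum(1 for v, y in zip(values, improved) if v > t and y == 1)
        fn = sum(1 for v, y in zip(values, improved) if v <= t and y == 1)
        tn = sum(1 for v, y in zip(values, improved) if v <= t and y == 0)
        fp = sum(1 for v, y in zip(values, improved) if v > t and y == 0)
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        j = sens + spec - 1.0
        if j > best_j:
            best_t, best_j = t, j
    return best_t, best_j

# Toy data in which improvement clusters above ~70 mmHg.
pao2 = [60, 65, 68, 70, 72, 75, 80, 85]
improved = [0, 0, 0, 0, 1, 1, 1, 1]
print(best_cutoff(pao2, improved))  # -> (70, 1.0)
```

On real data the classes overlap, so the maximal J is well below 1 and the chosen cut-off trades sensitivity against specificity.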
Abstract:
PATIENTS: All neonates admitted between January 2002 and December 2007 and treated with nCPAP were eligible. METHODS: Patients' noses were monitored during nCPAP. Nasal trauma was graded in three stages: (I) persistent erythema; (II) superficial ulceration; and (III) necrosis. RESULTS: 989 neonates were enrolled. Mean gestational age was 34 weeks (SD 4) and mean birth weight 2142 g (SD 840). Nasal trauma was reported in 420 (42.5%) patients and was of stage I, II and III in 371 (88.3%), 46 (11%) and 3 (0.7%) patients, respectively. The incidence and severity of trauma were inversely correlated with gestational age and birth weight. The risk of nasal trauma was greater in neonates <32 weeks of gestational age (OR 2.48, 95% CI 1.59 to 3.86), weighing <1500 g at birth (OR 2.28, 95% CI 1.43 to 3.64), treated >5 days with nCPAP (OR 5.36, 95% CI 3.82 to 7.52), or staying >14 days in the NICU (OR 1.67, 95% CI 1.22 to 2.28). Most cases of nasal trauma (90%) appeared during the first 6 days of nCPAP. Persistent visible scars were present in two cases. CONCLUSIONS: Nasal trauma is a frequent complication of nCPAP, especially in preterm neonates, but long-term cosmetic sequelae are very rare. This study provides a description of nasal trauma and proposes a simple staging system, which could serve as a basis for developing strategies for the prevention and treatment of this iatrogenic event.
Abstract:
BACKGROUND: Prevention of cardiovascular disease (CVD) at the individual level should rely on the assessment of absolute risk using population-specific risk tables. OBJECTIVE: To compare the predictive accuracy of the original and the calibrated SCORE functions regarding 10-year cardiovascular risk in Switzerland. DESIGN: Cross-sectional, population-based study (5773 participants aged 35-74 years). METHODS: The SCORE equation for low-risk countries was calibrated based on the Swiss CVD mortality rates and on the CVD risk factor levels from the study sample. The predicted number of CVD deaths after a 10-year period was computed from the original and the calibrated equations and from the observed cardiovascular mortality for 2003. RESULTS: According to the original and calibrated functions, 16.3 and 15.8% of men and 8.2 and 8.9% of women, respectively, had a 10-year CVD risk ≥5%. Concordance correlation coefficient between the two functions was 0.951 for men and 0.948 for women, both P<0.001. Both risk functions adequately predicted the 10-year cumulative number of CVD deaths: in men, 71 (original) and 74 (calibrated) deaths for 73 deaths when using the CVD mortality rates; in women, 44 (original), 45 (calibrated) and 45 (CVD mortality rates), respectively. Compared to the original function, the calibrated function classified more women and fewer men at high-risk. Moreover, the calibrated function gave better risk estimates among participants aged over 65 years. CONCLUSION: The original SCORE function adequately predicts CVD death in Switzerland, particularly for individuals aged less than 65 years. The calibrated function provides more reliable estimates for older individuals.
Abstract:
PURPOSE: The recent increase in drug-resistant micro-organisms complicates the management of hospital-acquired bloodstream infections (HA-BSIs). We investigated the epidemiology of HA-BSI and evaluated the impact of drug resistance on outcomes of critically ill patients, controlling for patient characteristics and infection management. METHODS: A prospective, multicentre non-representative cohort study was conducted in 162 intensive care units (ICUs) in 24 countries. RESULTS: We included 1,156 patients [mean ± standard deviation (SD) age, 59.5 ± 17.7 years; 65 % males; mean ± SD Simplified Acute Physiology Score (SAPS) II score, 50 ± 17] with HA-BSIs, of which 76 % were ICU-acquired. Median time to diagnosis was 14 [interquartile range (IQR), 7-26] days after hospital admission. Polymicrobial infections accounted for 12 % of cases. Among monomicrobial infections, 58.3 % were gram-negative, 32.8 % gram-positive, 7.8 % fungal and 1.2 % due to strict anaerobes. Overall, 629 (47.8 %) isolates were multidrug-resistant (MDR), including 270 (20.5 %) extensively resistant (XDR), and 5 (0.4 %) pan-drug-resistant (PDR). Micro-organism distribution and MDR occurrence varied significantly (p < 0.001) by country. The 28-day all-cause fatality rate was 36 %. In the multivariable model including micro-organism, patient and centre variables, independent predictors of 28-day mortality included MDR isolate [odds ratio (OR), 1.49; 95 % confidence interval (95 %CI), 1.07-2.06], uncontrolled infection source (OR, 5.86; 95 %CI, 2.5-13.9) and timing to adequate treatment (before day 6 since blood culture collection versus never, OR, 0.38; 95 %CI, 0.23-0.63; since day 6 versus never, OR, 0.20; 95 %CI, 0.08-0.47). CONCLUSIONS: MDR and XDR bacteria (especially gram-negative) are common in HA-BSIs in critically ill patients and are associated with increased 28-day mortality. Intensified efforts to prevent HA-BSIs and to optimize their management through adequate source control and antibiotic therapy are needed to improve outcomes.
Abstract:
BACKGROUND: Treatment strategies for acute basilar artery occlusion (BAO) are based on case series and data that have been extrapolated from stroke intervention trials in other cerebrovascular territories, and information on the efficacy of different treatments in unselected patients with BAO is scarce. We therefore assessed outcomes and differences in treatment response after BAO. METHODS: The Basilar Artery International Cooperation Study (BASICS) is a prospective, observational registry of consecutive patients who presented with an acute symptomatic and radiologically confirmed BAO between November 1, 2002, and October 1, 2007. Stroke severity at time of treatment was dichotomised as severe (coma, locked-in state, or tetraplegia) or mild to moderate (any deficit that was less than severe). Outcome was assessed at 1 month. Poor outcome was defined as a modified Rankin scale score of 4 or 5, or death. Patients were divided into three groups according to the treatment they received: antithrombotic treatment only (AT), which comprised antiplatelet drugs or systemic anticoagulation; primary intravenous thrombolysis (IVT), including subsequent intra-arterial thrombolysis; or intra-arterial therapy (IAT), which comprised thrombolysis, mechanical thrombectomy, stenting, or a combination of these approaches. Risk ratios (RR) for treatment effects were adjusted for age, the severity of neurological deficits at the time of treatment, time to treatment, prodromal minor stroke, location of the occlusion, and diabetes. FINDINGS: 619 patients were entered in the registry. 27 patients were excluded from the analyses because they did not receive AT, IVT, or IAT, and all had a poor outcome. Of the 592 patients who were analysed, 183 were treated with only AT, 121 with IVT, and 288 with IAT. Overall, 402 (68%) of the analysed patients had a poor outcome. No statistically significant superiority was found for any treatment strategy. 
Compared with outcome after AT, patients with a mild-to-moderate deficit (n=245) had about the same risk of poor outcome after IVT (adjusted RR 0.94, 95% CI 0.60-1.45) or after IAT (adjusted RR 1.29, 0.97-1.72) but had a worse outcome after IAT compared with IVT (adjusted RR 1.49, 1.00-2.23). Compared with AT, patients with a severe deficit (n=347) had a lower risk of poor outcome after IVT (adjusted RR 0.88, 0.76-1.01) or IAT (adjusted RR 0.94, 0.86-1.02), whereas outcomes were similar after treatment with IAT or IVT (adjusted RR 1.06, 0.91-1.22). INTERPRETATION: Most patients in the BASICS registry received IAT. Our results do not support unequivocal superiority of IAT over IVT, and the efficacy of IAT versus IVT in patients with an acute BAO needs to be assessed in a randomised controlled trial. FUNDING: Department of Neurology, University Medical Center Utrecht.
Abstract:
BACKGROUND: Good adherence to antiretroviral therapy (ART) is critical for successful HIV treatment. However, some patients remain virologically suppressed despite suboptimal adherence. We hypothesized that this could result from host genetic factors influencing drug levels. METHODS: Eligible individuals were Caucasians treated with efavirenz (EFV) and/or boosted lopinavir (LPV/r) with self-reported poor adherence, defined as missing doses of ART at least weekly for more than 6 months. Participants were genotyped for single nucleotide polymorphisms (SNPs) in candidate genes previously reported to decrease EFV (rs3745274, rs35303484, rs35979566 in CYP2B6) and LPV/r clearance (rs4149056 in SLCO1B1, rs6945984 in CYP3A, rs717620 in ABCC2). Viral suppression was defined as having HIV-1 RNA <400 copies/ml throughout the study period. RESULTS: From January 2003 until May 2009, 37 individuals on EFV (28 suppressed and 9 not suppressed) and 69 on LPV/r (38 suppressed and 31 not suppressed) were eligible. The poor adherence period was a median of 32 weeks with 18.9% of EFV and 20.3% of LPV/r patients reporting missed doses on a daily basis. The tested SNPs were not determinant for viral suppression. Reporting missing >1 dose/week was associated with a lower probability of viral suppression compared to missing ≤1 dose/week (EFV: odds ratio (OR) 0.11, 95% confidence interval (CI): 0.01-0.99; LPV/r: OR 0.29, 95% CI: 0.09-0.94). In both groups, the probability of remaining suppressed increased with the duration of continuous suppression prior to the poor adherence period (EFV: OR 3.40, 95% CI: 0.62-18.75; LPV/r: OR 5.65, 95% CI: 1.82-17.56). CONCLUSIONS: The investigated genetic variants did not play a significant role in the sustained viral suppression of individuals with suboptimal adherence. Risk of failure decreased with longer duration of viral suppression in this population.
Abstract:
Objective: To investigate personality traits in patients with Alzheimer disease, compared with mentally healthy control subjects. We compared both current personality characteristics using structured interviews as well as current and previous personality traits as assessed by proxies. Method: Fifty-four patients with mild Alzheimer disease and 64 control subjects described their personality traits using the Structured Interview for the Five-Factor Model. Family members filled in the Revised NEO Personality Inventory, Form R, to evaluate their proxies' current personality traits, compared with 5 years before the estimated beginning of Alzheimer disease or 5 years before the control subjects. Results: After controlling for age, the Alzheimer disease group presented significantly higher scores than normal control subjects on current neuroticism, and significantly lower scores on current extraversion, openness, and conscientiousness, while no significant difference was observed on agreeableness. A similar profile, though less accentuated, was observed when considering personality traits as the patients' proxies remembered them. Diachronic personality assessment showed again significant differences between the 2 groups for the same 4 domains, with important personality changes only for the Alzheimer disease group. Conclusions: Group comparison and retrospective personality evaluation are convergent. Significant personality changes follow a specific trend in patients with Alzheimer disease and contrast with the stability generally observed in mentally healthy people in their personality profile throughout their lives. Whether or not the personality assessment 5 years before the current status corresponds to an early sign of Alzheimer disease or real premorbid personality differences in people who later develop Alzheimer disease requires longitudinal studies.
Abstract:
The objectives of this study were to determine the prevalence of anal incontinence (AI) among adults in the city of Pouso Alegre (Minas Gerais, Brazil) and to identify demographic and clinical factors predictive of its presence. This epidemiological study used stratified cluster sampling; the final sample comprised 519 individuals aged >18 years, in adequate physical and mental condition, living in 341 randomly selected households in the urban area. Prevalences were standardized by sex and age, resulting in 7.0% for AI, both overall and for men and women separately. In the final logistic regression model, number of children (OR = 5.1; p < 0.001), hemorrhoidal disease (OR = 4.4; p < 0.001) and cystocele (OR = 3.0; p < 0.001) were associated with the presence of AI. The study characterized the epidemiology of AI and may contribute to the development of public policies aimed at primary and secondary prevention and at treatment, initially at the municipal level.
Abstract:
OBJECTIVES: The objectives were to identify the social and medical factors associated with emergency department (ED) frequent use and to determine if frequent users were more likely to have a combination of these factors in a universal health insurance system. METHODS: This was a retrospective chart review case-control study comparing randomized samples of frequent users and nonfrequent users at the Lausanne University Hospital, Switzerland. The authors defined frequent users as patients with four or more ED visits within the previous 12 months. Adult patients who visited the ED between April 2008 and March 2009 (study period) were included, and patients leaving the ED without medical discharge were excluded. For each patient, the first ED electronic record within the study period was considered for data extraction. Along with basic demographics, variables of interest included social (employment or housing status) and medical (ED primary diagnosis) characteristics. Significant social and medical factors were used to construct a logistic regression model, to determine factors associated with frequent ED use. In addition, the combination of social and medical factors was examined. RESULTS: A total of 359 of 1,591 frequent and 360 of 34,263 nonfrequent users were selected. Frequent users accounted for less than a 20th of all ED patients (4.4%), but for 12.1% of all visits (5,813 of 48,117), with a maximum of 73 ED visits. There was no difference in age or sex, but more frequent users had a nationality other than Swiss or European (n = 117 [32.6%] vs. n = 83 [23.1%], p = 0.003). Adjusted multivariate analysis showed that social and specific medical vulnerability factors most increased the risk of frequent ED use: being under guardianship (adjusted odds ratio [OR] = 15.8; 95% confidence interval [CI] = 1.7 to 147.3), living closer to the ED (adjusted OR = 4.6; 95% CI = 2.8 to 7.6), being uninsured (adjusted OR = 2.5; 95% CI = 1.1 to 5.8), being unemployed or dependent on government welfare (adjusted OR = 2.1; 95% CI = 1.3 to 3.4), the number of psychiatric hospitalizations (adjusted OR = 4.6; 95% CI = 1.5 to 14.1), and the use of five or more clinical departments over 12 months (adjusted OR = 4.5; 95% CI = 2.5 to 8.1). Having two of four social factors increased the odds of frequent ED use (adjusted OR = 5.4; 95% CI = 2.9 to 9.9), and similar results were found for medical factors (adjusted OR = 7.9; 95% CI = 4.6 to 13.4). A combination of social and medical factors was markedly associated with frequent ED use, as frequent users were 10 times more likely to have three of them (on a total of eight factors; 95% CI = 5.1 to 19.6). CONCLUSIONS: Frequent users accounted for a moderate proportion of visits at the Lausanne ED. Social and medical vulnerability factors were associated with frequent ED use. In addition, frequent users were more likely to have both social and medical vulnerabilities than were other patients. Case management strategies might address the vulnerability factors of frequent users to prevent inequities in health care and related costs.
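Several abstracts in this collection report odds ratios with Wald 95% confidence intervals. As a generic illustration (the counts below are invented, not the Lausanne data), an unadjusted OR and its CI can be computed from a 2x2 exposure-by-outcome table:

```python
import math

# Generic sketch of an unadjusted odds ratio with a Wald 95% CI.
# a/b = exposed cases/controls, c/d = unexposed cases/controls.
# Assumes all four cells are non-zero; a 0.5 continuity correction
# is a common workaround when a cell is zero.

def odds_ratio_ci(a, b, c, d, z=1.96):
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Invented example: 40/319 frequent users vs. 20/340 nonfrequent users
# with some exposure of interest.
or_, lo, hi = odds_ratio_ci(40, 319, 20, 340)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

The adjusted ORs quoted in the abstract come from multivariate logistic regression rather than a single 2x2 table, but the exponentiation of a log-scale estimate and its standard error works the same way there.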
Abstract:
The aim of this study was to compare perinatal outcomes of women aged 35 years or older with those of women aged 20 to 34 years. This retrospective study was based on the obstetric records of 1,255 women who gave birth at the only hospital of Sarandi, Paraná, Brazil, from January 2007 to December 2008. The variables analyzed were marital status, schooling, gestational age, type of delivery, birth weight, Apgar score at the 1st and 5th minutes, and fetal deaths. In logistic regression, advanced maternal age was significantly associated with cesarean delivery (OR 1.23, 95% CI 0.19-0.44) and with a 5th-minute Apgar score below 7 (OR 5.78, 95% CI 0.74-2.76). These results highlight the risks of complications in pregnant women aged 35 years or older and the need for counseling of women who intend to postpone pregnancy.
Abstract:
Genotypic and phenotypic tolerance was studied in penicillin treatment of experimental endocarditis due to nontolerant and tolerant Streptococcus gordonii and to their backcross transformants. The organisms were matched for in vitro and in vivo growth rates. Rats with aortic endocarditis were treated for 3 or 5 days, starting 12, 24, or 48 h after inoculation. When started at 12 h, during fast intravegetation growth, 3 days of treatment cured 80% of rats infected with the nontolerant parent compared with <30% of those infected with the tolerant derivative (P < .005). When started at 24 or 48 h, once intravegetation growth had reached a plateau, 3 days of treatment failed against both bacteria. However, a significant difference between the 2 organisms was restored when treatment was extended to 5 days. Thus, genotypic tolerance conferred a survival advantage in both fast- and slow-growing bacteria, demonstrating that the in vitro-defined tolerant phenotype also carried the risk of treatment failure in vivo.
Abstract:
BACKGROUND: Leprosy is characterized by a spectrum of clinical manifestations that depend on the type of immune response against the pathogen. Patients may undergo immunological changes known as "reactional states" (reversal reaction and erythema nodosum leprosum) that result in major clinical deterioration. The goal of the present study was to assess the effect of Toll-like receptor 2 (TLR2) polymorphisms on susceptibility to and clinical presentation of leprosy. METHODS: Three polymorphisms in TLR2 (597C-->T, 1350T-->C, and a microsatellite marker) were analyzed in 431 Ethiopian patients with leprosy and 187 control subjects. The polymorphism-associated risk of developing leprosy, lepromatous (vs. tuberculoid) leprosy, and leprosy reactions was assessed by multivariate logistic regression models. RESULTS: The microsatellite and the 597C-->T polymorphisms both influenced susceptibility to reversal reaction. Although the 597T allele had a protective effect (odds ratio [OR], 0.34 [95% confidence interval {CI}, 0.17-0.68]; P= .002 under the dominant model), homozygosity for the 280-bp allelic length of the microsatellite strongly increased the risk of reversal reaction (OR, 5.83 [95% CI, 1.98-17.15]; P= .001 under the recessive model). These associations were consistent among 3 different ethnic groups. CONCLUSIONS: These data suggest a significant role for TLR-2 in the occurrence of leprosy reversal reaction and provide new insights into the immunogenetics of the disease.
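The dominant and recessive models mentioned in the results are simply two ways of coding a biallelic genotype for regression. The sketch below is illustrative (names and structure are assumptions, not the study's code): under a dominant model carrying at least one risk allele scores 1 (as for the protective 597T effect), while under a recessive model only homozygotes score 1 (as for the 280-bp microsatellite allele).

```python
# Hypothetical sketch of dominant vs. recessive genotype coding for a
# biallelic marker; names are illustrative, not from the study.

def code_genotype(genotype, risk_allele, model):
    """genotype: a pair of alleles, e.g. ('C', 'T').

    Dominant: >=1 copy of the risk allele -> 1.
    Recessive: 2 copies of the risk allele -> 1.
    """
    n = sum(1 for allele in genotype if allele == risk_allele)
    if model == "dominant":
        return 1 if n >= 1 else 0
    if model == "recessive":
        return 1 if n == 2 else 0
    raise ValueError("model must be 'dominant' or 'recessive'")

print(code_genotype(("C", "T"), "T", "dominant"))   # 1: heterozygote counts
print(code_genotype(("C", "T"), "T", "recessive"))  # 0: not homozygous
```

The coded 0/1 variable then enters the logistic regression, and the reported OR is the exponentiated coefficient for that variable.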