89 results for LARGE COHORT
Abstract:
BACKGROUND: Persons with human immunodeficiency virus (HIV) risk behaviors are excluded from donation to reduce the risk of transfusion-transmitted infection. Persons donating to be tested for HIV may therefore deny risk behaviors. STUDY DESIGN AND METHODS: A random sample of donors completed a survey on motivations, knowledge, and attitudes on the screening process. Donors were considered test seekers if they agreed with two statements: "I think that blood donation is a good, fast, and anonymous way to get my blood tested" and "I donate to get my test results." This study was conducted from June to November 2006 at the largest blood bank in Sao Paulo, Brazil. RESULTS: Of 3061 participants, 208 (7%) were test seekers. They tended to be male and had a lower educational level. They were more likely to have incorrect knowledge about blood safety (e.g., not knowing that a unit can test antibody negative and still transmit infection, 60% vs. 42%, p = 0.02), express dissatisfaction with screening questions (e.g., feeling that important questions were not asked, 14% vs. 5%, p < 0.01), and concur that donors do not answer questions truthfully (e.g., donors have more sexual partners than they admit, 29% vs. 18%, p < 0.01). Test seekers were more likely to believe that it is acceptable to donate blood to get tested for HIV (41% vs. 10%, p < 0.01). CONCLUSIONS: Test-seeking motivation, coupled with low knowledge of window period risk, is counter to improving blood safety and to donor prevention needs. Donor education needs to be improved along with availability of appropriate HIV counseling and testing.
Abstract:
Objective. To evaluate the beneficial effect of antimalarial treatment on lupus survival in a large, multiethnic, international longitudinal inception cohort. Methods. Socioeconomic and demographic characteristics, clinical manifestations, classification criteria, laboratory findings, and treatment variables were examined in patients with systemic lupus erythematosus (SLE) from the Grupo Latino Americano de Estudio del Lupus Eritematoso (GLADEL) cohort. The diagnosis of SLE, according to the American College of Rheumatology criteria, was assessed within 2 years of cohort entry. Cause of death was classified as active disease, infection, cardiovascular complications, thrombosis, malignancy, or other cause. Patients were subdivided by antimalarial use, grouped according to those who had received antimalarial drugs for at least 6 consecutive months (user) and those who had received antimalarial drugs for <6 consecutive months or who had never received antimalarial drugs (nonuser). Results. Of the 1,480 patients included in the GLADEL cohort, 1,141 (77%) were considered antimalarial users, with a mean duration of drug exposure of 48.5 months (range 6-98 months). Death occurred in 89 patients (6.0%). A lower mortality rate was observed in antimalarial users compared with nonusers (4.4% versus 11.5%; P < 0.001). Seventy patients (6.1%) had received antimalarial drugs for 6-11 months, 146 (12.8%) for 1-2 years, and 925 (81.1%) for >2 years. Mortality rates among users by duration of antimalarial treatment (per 1,000 person-months of followup) were 3.85 (95% confidence interval [95% CI] 1.41-8.37), 2.7 (95% CI 1.41-4.76), and 0.54 (95% CI 0.37-0.77), respectively, while for nonusers, the mortality rate was 3.07 (95% CI 2.18-4.20) (P for trend < 0.001). After adjustment for potential confounders in a Cox regression model, antimalarial use was associated with a 38% reduction in the mortality rate (hazard ratio 0.62, 95% CI 0.39-0.99). Conclusion. Antimalarial drugs were shown to have a protective effect, possibly in a time-dependent manner, on SLE survival. These results suggest that the use of antimalarial treatment should be recommended for patients with lupus.
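As a point of reference for the figures quoted above, the following is a minimal worked restatement of the person-time mortality rate and of how a hazard ratio maps to a percentage reduction; it uses only standard definitions and the values already reported in the abstract.

```latex
% Mortality rate per 1,000 person-months of follow-up (standard person-time definition)
\[
  \text{rate} = \frac{\text{number of deaths}}{\sum_{i} \text{person-months of follow-up}_i} \times 1000
\]
% The adjusted hazard ratio reported above corresponds to the quoted 38% reduction:
\[
  \text{HR} = 0.62 \quad\Rightarrow\quad \text{relative reduction} = 1 - 0.62 = 0.38 \;(38\%)
\]
```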
Abstract:
The aim of this study was to evaluate a prognostic score for AIDS-related lymphoma (ARL). A retrospective study of 104 patients with ARL treated between January 1999 and December 2007 was conducted. Diffuse large B-cell lymphoma (DLBC) was the most frequently observed histological type (79.8%). The median CD4 lymphocyte count at lymphoma diagnosis was 125 cells per microliter. Treatment response could be evaluated in 83 (79.8%) patients, and 38 (45.8%) reached complete remission (CR); the overall response rate was 51.8% (95% CI = 38.5-65.1%). After a median follow-up of 48 months, the 4-year overall survival (OS) rate among all patients was 35.8%, with a median survival time of 9.7 months (95% CI = 5.5-13.9 months). The survival risk factors identified in multivariate analysis (previous AIDS and high-intermediate/high international prognostic index (IPI)) were combined to construct a risk score, which divided the whole patient population into three distinct groups: low, intermediate, and high risk. When this score was applied to DLBC patients, a clear distinction in response rates and in OS could be demonstrated. Median disease-free survival (DFS) for patients who achieved CR was not reached, and DFS at 4 years was 83.0%. Our results show that the reduced OS observed could be explained by the poor immune status and advanced stage of disease seen in our population of HIV-positive patients. Further studies will be needed to clarify the role of different treatment approaches for ARL in the setting of marked immunosuppression and to identify a group of patients in whom intensive therapy could be performed with curative intent.
Abstract:
Aims: To evaluate the IL1RN polymorphism as a possible marker for Rheumatic Fever (RF) susceptibility or disease severity. Methods: The genotypes of 84 RF patients (Jones criteria) and 84 normal race-matched controls were determined through analysis of the number of 86-bp tandem repeats in the second intron of IL1RN. The DNA was extracted from peripheral-blood leukocytes and amplified with specific primers. Clinical manifestations of RF were obtained through a standardized questionnaire and an extensive chart review. Carditis was defined as a new-onset cardiac murmur perceived by a trained physician, with corresponding valvular regurgitation or stenosis on echocardiogram. Carditis was classified as severe in the presence of congestive heart failure or upon the indication for cardiac surgery. The statistical association among the genotypes, RF and its clinical manifestations was determined. Results: The presence of allele 1 and the genotype A1/A1 were found less frequently among patients with severe carditis when compared to patients without this manifestation (OR = 0.11, p = 0.031; OR = 0.092, p = 0.017). Neither allele 1 nor allele 2 was associated with the presence of RF (p = 0.188 and p = 0.106), overall carditis (p = 0.578 and p = 0.767), polyarthritis (p = 0.343 and p = 0.313) or chorea (p = 0.654 and p = 0.633). Conclusion: In the Brazilian population, the polymorphism of the IL-1ra gene is a relevant factor for rheumatic heart disease severity. (C) 2009 Elsevier Ltd. All rights reserved.
Abstract:
Background: Although routinely administered, definitive evidence for the benefits of prophylactic antibiotics before the implantation of permanent pacemakers and implantable cardioverter-defibrillators from a large double-blinded placebo-controlled trial is lacking. The purpose of this study was to determine whether prophylactic antibiotic administration reduces the incidence of infection related to device implantation. Methods and Results: This double-blinded study included 1000 consecutive patients who presented for primary device (pacemaker or implantable cardioverter-defibrillator) implantation or generator replacement, randomized in a 1:1 fashion to prophylactic antibiotics or placebo. Intravenous administration of 1 g of cefazolin (group 1) or placebo (group 2) was performed immediately before the procedure. Follow-up was performed 10 days, 1, 3, and 6 months after discharge. The primary end point was any evidence of infection at the surgical incision (pulse generator pocket) or systemic infection related to the procedure. The safety committee interrupted the trial after 649 patients were enrolled because of a significant difference in favor of the antibiotic arm (group 1: 2 of 314 patients infected, 0.63%; group 2: 11 of 335, 3.28%; RR=0.19; P=0.016). The following risk factors were positively correlated with infection by univariate analysis: nonuse of preventive antibiotic (P=0.016); implant procedure (versus generator replacement; P=0.02); presence of postoperative hematoma (P=0.03); and procedure duration (P=0.009). Multivariable analysis identified nonuse of antibiotic (P=0.037) and postoperative hematoma (P=0.023) as independent predictors of infection. Conclusions: Antibiotic prophylaxis significantly reduces infectious complications in patients undergoing implantation of pacemakers or cardioverter-defibrillators. (Circ Arrhythmia Electrophysiol. 2009;2:29-34.)
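The relative risk quoted in the interim analysis can be checked directly from the two event counts given above; the short calculation below is only a restatement of those reported numbers.

```latex
% Crude relative risk from the reported interim counts (antibiotic arm vs. placebo arm)
\[
  \mathrm{RR} = \frac{2/314}{11/335} = \frac{0.0064}{0.0328} \approx 0.19
\]
% i.e., roughly an 81% relative reduction in device-related infection in the antibiotic arm
```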
Abstract:
PURPOSE: To evaluate the impact of atypical retardation patterns (ARPs) on detection of progressive retinal nerve fiber layer (RNFL) loss using scanning laser polarimetry with variable corneal compensation (VCC). DESIGN: Observational cohort study. METHODS: The study included 377 eyes of 221 patients with a median follow-up of 4.0 years. Images were obtained annually with the GDx VCC (Carl Zeiss Meditec Inc, Dublin, California, USA), along with optic disc stereophotographs and standard automated perimetry (SAP) visual fields. Progression was determined by the Guided Progression Analysis software for SAP and by masked assessment of stereophotographs by expert graders. The typical scan score (TSS) was used to quantify the presence of ARPs on GDx VCC images. Random coefficients models were used to evaluate the relationship between ARP and RNFL thickness measurements over time. RESULTS: Thirty-eight eyes (10%) showed progression over time on visual fields, stereophotographs, or both. Changes in TSS from baseline were significantly associated with changes in RNFL thickness measurements in both progressing and nonprogressing eyes. Each 1-unit increase in TSS was associated with a 0.19-µm decrease in RNFL thickness measurement (P < .001) over time. CONCLUSIONS: ARPs had a significant effect on detection of progressive RNFL loss with the GDx VCC. Eyes with large amounts of atypical patterns, large fluctuations in these patterns over time, or both may show changes in measurements that falsely appear to be glaucomatous progression or that mask true changes in the RNFL. (Am J Ophthalmol 2009;148:155-163. (C) 2009 by Elsevier Inc. All rights reserved.)
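The random coefficients analysis described above is, in essence, a linear mixed model with eye-specific intercepts and slopes over time. The sketch below is a generic illustration of that kind of model in Python with statsmodels, not the authors' actual analysis; the data frame and column names (eye_id, years, rnfl, tss_change) are hypothetical.

```python
# Illustrative random-coefficients (mixed-effects) model, not the study's own code.
# Assumes a long-format DataFrame `df` with hypothetical columns:
#   eye_id     - identifier for each eye (grouping factor)
#   years      - time since the baseline visit
#   rnfl       - RNFL thickness measurement at that visit (micrometers)
#   tss_change - change in typical scan score (TSS) from baseline at that visit
import statsmodels.formula.api as smf

def fit_random_coefficients(df):
    # Fixed effects for time and TSS change; random intercept and random slope
    # for time within each eye (the "random coefficients" part of the model).
    model = smf.mixedlm(
        "rnfl ~ years + tss_change",
        data=df,
        groups=df["eye_id"],
        re_formula="~years",
    )
    result = model.fit()
    # The coefficient on `tss_change` plays the role of the effect reported in the
    # abstract (change in RNFL thickness, in micrometers, per 1-unit change in TSS).
    return result

# Example usage (df must be supplied by the caller):
# print(fit_random_coefficients(df).summary())
```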
Abstract:
Methods: We pooled data from 17 case-control studies including 12 716 cases and 17 438 controls. Odds ratios (ORs) and 95% confidence intervals (CIs) were estimated for associations between body mass index (BMI) at different ages and head and neck cancer (HNC) risk, adjusted for age, sex, centre, race, education, tobacco smoking and alcohol consumption. Results: Adjusted ORs (95% CIs) were elevated for people with BMI at reference (date of diagnosis for cases and date of selection for controls) < 18.5 kg/m² (2.13, 1.75-2.58) and reduced for BMI > 25.0-30.0 kg/m² (0.52, 0.44-0.60) and BMI >= 30 kg/m² (0.43, 0.33-0.57), compared with BMI > 18.5-25.0 kg/m². These associations did not differ by age, sex, tumour site or control source. Although the increased risk among people with BMI < 18.5 kg/m² was not modified by tobacco smoking or alcohol drinking, the inverse association for people with BMI > 25 kg/m² was present only in smokers and drinkers. Conclusions: In our large pooled analysis, leanness was associated with increased HNC risk regardless of smoking and drinking status, although reverse causality cannot be excluded. The reduced risk among overweight or obese people may indicate that body size is a modifier of the risk associated with smoking and drinking. Further clarification may be provided by analyses of prospective cohort and mechanistic studies.
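As a rough illustration of how adjusted odds ratios of this kind are obtained, the sketch below fits an unconditional logistic regression with BMI categories and the listed confounders, and exponentiates the coefficients. It is a generic example under stated assumptions, not the consortium's pooling methodology, and all column names (case, bmi_cat, age, sex, centre, and so on) are hypothetical.

```python
# Illustrative adjusted odds-ratio estimation via logistic regression.
# Not the pooled-analysis code from the study; column names are hypothetical.
# Assumes `df` has: case (0/1), bmi_cat (categorical: "<18.5", "18.5-25", "25-30",
# ">=30"), and confounders age, sex, centre, race, education, smoking, alcohol.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def adjusted_odds_ratios(df: pd.DataFrame) -> pd.DataFrame:
    # Reference BMI category 18.5-25, mirroring the abstract's comparison group.
    formula = (
        "case ~ C(bmi_cat, Treatment(reference='18.5-25')) "
        "+ age + C(sex) + C(centre) + C(race) + C(education) "
        "+ C(smoking) + C(alcohol)"
    )
    fit = smf.logit(formula, data=df).fit(disp=False)
    return pd.DataFrame({
        "OR": np.exp(fit.params),             # exponentiated coefficients = odds ratios
        "CI_low": np.exp(fit.conf_int()[0]),  # 95% confidence limits on the OR scale
        "CI_high": np.exp(fit.conf_int()[1]),
    })

# Example usage (df must be supplied by the caller):
# print(adjusted_odds_ratios(df).filter(like="bmi_cat", axis=0))
```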
Abstract:
Background: Recent studies support an important role for human papillomavirus (HPV) in a subgroup of head and neck squamous cell carcinomas (HNSCC). We have evaluated the HPV deoxyribonucleic acid (DNA) prevalence as well as the association between serological response to HPV infection and HNSCC in two distinct populations from Central Europe (CE) and Latin America (LA). Methods: Cases (n = 2214) and controls (n = 3319) were recruited from 1998 to 2003, using a similar protocol including a questionnaire and blood sample collection. Tumour DNA from 196 fresh tissue biopsies was analysed for multiple HPV types, followed by an HPV type-specific polymerase chain reaction (PCR) protocol targeting the E7 gene of HPV 16. Using multiplex serology, serum samples were analysed for antibodies to 17 HPV types. Statistical analysis included the estimation of adjusted odds ratios (ORs) and the respective 95% confidence intervals (CIs). Results: HPV16 E7 DNA prevalence among cases was 3.1% (6/196), including 4.4% in the oropharynx (3/68), 3.8% in the hypopharynx/larynx (3/78) and 0% among 50 cases of oral cavity carcinomas. Positivity for both HPV16 E6 and E7 antibodies was associated with a very high risk of oropharyngeal cancer (OR = 179, 95% CI 35.8-899) and hypopharyngeal/laryngeal cancer (OR = 14.9, 95% CI 2.92-76.1). Conclusions: A very low prevalence of HPV DNA and serum antibodies was observed among cases in both CE and LA. The proportion of head and neck cancer caused by HPV may vary substantially between different geographical regions, and studies designed to evaluate the impact of HPV vaccination on HNSCC need to consider this heterogeneity.
Abstract:
Objective: Using longitudinal and prospective measures of trauma during childhood, the authors assessed the risk of developing psychotic symptoms associated with maltreatment, bullying, and accidents in a nationally representative U.K. cohort of young twins. Method: Data were from the Environmental Risk Longitudinal Twin Study, which follows 2,232 twin children and their families. Mothers were interviewed during home visits when children were ages 5, 7, 10, and 12 on whether the children had experienced maltreatment by an adult, bullying by peers, or involvement in an accident. At age 12, children were asked about bullying experiences and psychotic symptoms. Children's reports of psychotic symptoms were verified by clinicians. Results: Children who experienced maltreatment by an adult (relative risk=3.16, 95% CI=1.92-5.19) or bullying by peers (relative risk=2.47, 95% CI=1.74-3.52) were more likely to report psychotic symptoms at age 12 than were children who did not experience such traumatic events. The higher risk for psychotic symptoms was observed whether these events occurred early in life or later in childhood. The risk associated with childhood trauma remained significant in analyses controlling for children's gender, socioeconomic deprivation, and IQ; for children's early symptoms of internalizing or externalizing problems; and for children's genetic liability to developing psychosis. In contrast, the risk associated with accidents was small (relative risk=1.47, 95% CI=1.02-2.13) and inconsistent across ages. Conclusions: Trauma characterized by intention to harm is associated with children's reports of psychotic symptoms. Clinicians working with children who report early symptoms of psychosis should inquire about traumatic events such as maltreatment and bullying.
Abstract:
Context: It has been reported that childhood psychotic symptoms are common in the general population and may signal neurodevelopmental processes that lead to schizophrenia. However, it is not clear whether these symptoms are associated with the same extensive risk factors established for adult schizophrenia. Objective: To examine the construct validity of children's self-reported psychotic symptoms by testing whether these symptoms share the risk factors and clinical features of adult schizophrenia. Design: Prospective, longitudinal cohort study of a nationally representative birth cohort in Great Britain. Participants: A total of 2232 twelve-year-old children followed up since age 5 years (retention, 96%). Main Outcome Measure: Children's self-reported hallucinations and delusions. Results: Children's psychotic symptoms are familial and heritable and are associated with social risk factors (eg, urbanicity); cognitive impairments at age 5; home-rearing risk factors (eg, maternal expressed emotion); behavioral, emotional, and educational problems at age 5; and comorbid conditions, including self-harm. Conclusions: The results provide a comprehensive picture of the construct validity of children's self-reported psychotic symptoms. For researchers, the findings indicate that children who have psychotic symptoms can be recruited for neuroscience research to determine the pathogenesis of schizophrenia. For clinicians, the findings indicate that psychotic symptoms in childhood are often a marker of an impaired developmental process and should be actively assessed.
Abstract:
Objective: To evaluate whether including children with onset of symptoms between ages 7 and 12 years in the ADHD diagnostic category would: (a) increase the prevalence of the disorder at age 12, and (b) change the clinical and cognitive features, impairment profile, and risk factors for ADHD compared with findings in the literature based on the DSM-IV definition of the disorder. Method: A birth cohort of 2,232 British children was prospectively evaluated at ages 7 and 12 years for ADHD using information from mothers and teachers. The prevalence of diagnosed ADHD at age 12 was evaluated with and without the inclusion of individuals who met the DSM-IV age-of-onset criterion through mothers' or teachers' reports of symptoms at age 7. Children with onset of ADHD symptoms before versus after age 7 were compared on their clinical and cognitive features, impairment profile, and risk factors for ADHD. Results: Extending the age-of-onset criterion to age 12 resulted in a negligible increase of 0.1% in ADHD prevalence by age 12 years. Children who first manifested ADHD symptoms between ages 7 and 12 did not present correlates or risk factors that were significantly different from those of children who manifested symptoms before age 7. Conclusions: Results from this prospective birth cohort suggest that adults who are able to report symptom onset by age 12 also had symptoms by age 7, even if they are not able to report them. The data suggest that the prevalence estimate, correlates, and risk factors of ADHD will not be affected if the new diagnostic scheme extends the age-of-onset criterion to age 12. J. Am. Acad. Child Adolesc. Psychiatry, 2010;49(3):210-216.
Abstract:
Background: Mucosal leishmaniasis is caused mainly by Leishmania braziliensis and occurs months or years after the cutaneous lesions. This progressive disease destroys cartilage and osseous structures of the face, pharynx and larynx. Objective and methods: The aim of this study was to analyse the association of clinical and epidemiological findings, diagnosis and treatment with the outcome and recurrence of mucosal leishmaniasis, using a binary logistic regression model, in 140 patients with mucosal leishmaniasis from a Brazilian centre. Results: The median age of patients was 57.5 years, and systemic arterial hypertension was the most prevalent secondary disease found in patients with mucosal leishmaniasis (43%). Diabetes, chronic nephropathy, viral hepatitis, allergy and coagulopathy were each found in less than 10% of patients. Human immunodeficiency virus (HIV) infection was found in 7 of 140 patients (5%). Rhinorrhea (47%) and epistaxis (75%) were the most common symptoms. N-methyl-glucamine showed a cure rate of 91% and a recurrence rate of 22%. Pentamidine showed a similar cure rate (91%) and recurrence rate (25%). Fifteen patients received itraconazole, with a cure rate of 73% and a recurrence rate of 18%. Amphotericin B was used in 30 patients, with a response rate of 82% and a recurrence rate of 7%. The binary logistic regression analysis demonstrated that systemic arterial hypertension and HIV infection were associated with failure of treatment (P < 0.05). Conclusion: The current first-line mucosal leishmaniasis therapy achieves adequate cure rates but later recurrence. HIV infection and systemic arterial hypertension should be investigated before starting treatment of mucosal leishmaniasis. Conflicts of interest: The authors are not part of any associations or commercial relationships that might represent conflicts of interest in the writing of this study (e.g. pharmaceutical stock ownership, consultancy, advisory board membership, relevant patents, or research funding).
Abstract:
Background and Aim: It is unclear to what extent diabetes modulates the ageing-related adaptations of cardiac geometry and function. Methods and Results: We examined 1005 adults, aged 25-74 years, from a population-based survey at baseline in 1994/5 and at follow-up in 2004/5. We compared persistently non-diabetic individuals (ND; no diabetes at baseline or at follow-up, n = 833) with incident diabetics (ID; non-diabetic at baseline and diabetic at follow-up, n = 36) and with prevalent diabetics (PD; diabetes at both the baseline and follow-up examinations, n = 21). Left ventricular (LV) geometry and function were evaluated by echocardiography. Statistical analyses were performed with multivariate linear regression models. Over ten years the PD group displayed a significantly stronger relative increase of LV mass (+9.34% vs. +23.7%), mediated by a more pronounced increase of LV end-diastolic diameter (+0% vs. +6.95%), compared to the ND group. In parallel, LA diameter increased (+4.50% vs. +12.7%), whereas ejection fraction decreased (+3.02% vs. -4.92%), more markedly in the PD group. Moreover, at the follow-up examination the PD and ID groups showed significantly worse diastolic function, indicated by a higher E/Em ratio compared with the ND group (11.6 and 11.8 vs. 9.79, respectively). Conclusions: Long-standing diabetes was associated with an acceleration of age-related changes of left ventricular geometry, culminating in an eccentric remodelling of the left ventricle. Likewise, echocardiographic measures of systolic and diastolic ventricular function deteriorated more rapidly in individuals with diabetes. (C) 2009 Elsevier B.V. All rights reserved.
Abstract:
The aim of this study was to investigate the changes in clinical pattern and therapeutic measures in leptospirosis-associated acute kidney injury, in a retrospective study of 318 patients in Brazil. Patients were divided according to the time of admission: 1985-1996 (group I) and 1997-2010 (group II). Patients were younger in group I (36 +/- 13 versus 41 +/- 16 years, P = 0.005), and the frequency of oliguria increased (21% in group I versus 41% in group II, P = 0.014). A higher frequency of lung manifestations was observed in group II (P < 0.0001). Despite the increased severity, there was a significant reduction in mortality (20% in group I versus 12% in group II, P = 0.03). Mortality was associated with advanced age, low diastolic blood pressure, oliguria, arrhythmia, and peritoneal dialysis, and there was a trend toward lower mortality with penicillin administration. Leptospirosis is now occurring in an older population, with a higher frequency of oliguria and lung manifestations. However, mortality is decreasing, which may be the result of changes in treatment.
Abstract:
Background and study aims: In many patients, percutaneous endoscopic gastrostomy (PEG) can be limited by digestive tract stenosis. PEG placement using an introducer is the safest alternative for this group of patients, but the available devices are difficult to implement and require smaller-caliber tubes. The aim of this study was to evaluate the modification of an introducer technique device for PEG placement with regard to the following: procedure feasibility, possibility of using a 20-Fr balloon gastrostomy tube, tube-related function and problems, complications, procedure safety, and mortality. Patients and methods: Between March 2007 and February 2008, 30 consecutive patients with head and neck malignancies underwent introducer PEG placement with the modified device and gastropexy. Each patient was evaluated for 60 days after the procedure for the success of the procedure, infection, pain, complications, mortality, and problems with the procedure. Results: The procedure was successful in all cases with no perioperative complications. No signs of stomal infection were observed using the combined infection score. The majority of patients experienced mild-to-moderate pain both in the immediate postoperative period and at 72 hours. One major early complication (3.3%) and two minor complications (6.7%) were observed. No procedure-related deaths occurred during the first 60 days after the procedure. Conclusion: The device modification for PEG using the introducer technique is feasible, safe, and efficient in outpatients with obstructive head and neck cancer. In this series, it allowed the use of a larger-caliber tube with low complication rates and no procedure-related mortality.