872 results for Prospective Cohort
Abstract:
IMPORTANCE: Depression and obesity are 2 prevalent disorders that have repeatedly been shown to be associated. However, the mechanisms and temporal sequence underlying this association are poorly understood. OBJECTIVE: To determine whether the subtypes of major depressive disorder (MDD; melancholic, atypical, combined, or unspecified) are predictive of adiposity in terms of the incidence of obesity and changes in body mass index (calculated as weight in kilograms divided by height in meters squared), waist circumference, and fat mass. DESIGN, SETTING, AND PARTICIPANTS: This prospective population-based cohort study, CoLaus (Cohorte Lausannoise)/PsyCoLaus (Psychiatric arm of the CoLaus Study), with 5.5 years of follow-up, included 3054 randomly selected residents (mean age, 49.7 years; 53.1% women) of the city of Lausanne, Switzerland (according to the civil register), aged 35 to 66 years in 2003, who accepted the physical and psychiatric baseline evaluations and the physical follow-up evaluation. EXPOSURES: Depression subtypes according to the DSM-IV. Diagnostic criteria at baseline and follow-up, as well as sociodemographic characteristics, lifestyle (alcohol and tobacco use and physical activity), and medication, were elicited using the semistructured Diagnostic Interview for Genetic Studies. MAIN OUTCOMES AND MEASURES: Changes in body mass index, waist circumference, and fat mass during the follow-up period, expressed as a percentage of the baseline value, and the incidence of obesity during follow-up among participants who were not obese at baseline. Weight, height, waist circumference, and body fat (bioimpedance) were measured at baseline and follow-up by trained field interviewers. RESULTS: Only participants with the atypical subtype of MDD at baseline showed a greater increase in adiposity during follow-up than participants without MDD. The associations between this MDD subtype and body mass index (β = 3.19; 95% CI, 1.50-4.88), incidence of obesity (odds ratio, 3.75; 95% CI, 1.24-11.35), waist circumference in both sexes (β = 2.44; 95% CI, 0.21-4.66), and fat mass in men (β = 16.36; 95% CI, 4.81-27.92) remained significant after adjustment for a wide range of possible confounders. CONCLUSIONS AND RELEVANCE: The atypical subtype of MDD is a strong predictor of obesity. This emphasizes the need to identify individuals with this subtype of MDD in both clinical and research settings. Therapeutic measures to diminish the consequences of increased appetite during depressive episodes with atypical features are advocated.
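Since the abstract defines body mass index directly (weight in kilograms divided by height in meters squared), a minimal Python sketch of that calculation follows; the example weight and height are hypothetical and not taken from the study.

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight in kilograms divided by height in meters squared."""
    return weight_kg / height_m ** 2

# Hypothetical participant: 82 kg and 1.75 m tall.
print(round(bmi(82, 1.75), 1))  # 26.8; obesity is conventionally defined as BMI >= 30
```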
Abstract:
In contrast to some extensively examined food mutagens, for example aflatoxins, N-nitrosamines and heterocyclic amines, some other food contaminants, in particular polycyclic aromatic hydrocarbons (PAH) and other aromatic compounds, have received less attention. Exploring the relationships between dietary habits and the levels of biomarkers related to exposure to aromatic compounds is therefore highly relevant. We investigated, in the European Prospective Investigation into Cancer and Nutrition (EPIC) cohort, the association between dietary items (food groups and nutrients) and aromatic DNA adducts and 4-aminobiphenyl-Hb adducts. Both types of adducts are biomarkers of carcinogen exposure and possibly of cancer risk, and were measured in leucocytes and erythrocytes, respectively, of 1086 (DNA adducts) and 190 (Hb adducts) non-smokers. An inverse, statistically significant association was found between DNA adduct levels and intakes of dietary fibre (P = 0.02), vitamin E (P = 0.04) and alcohol (P = 0.03), but not with other nutrients or food groups. Inverse associations of fibre intake, fruit intake and BMI with 4-aminobiphenyl-Hb adducts (P = 0.03, 0.04 and 0.03, respectively) were also observed. After multivariate regression analysis these inverse correlations remained statistically significant, except for the correlation between adducts and fruit intake. The present study suggests that fibre intake in the usual range can modify the level of DNA or Hb aromatic adducts, but such a role seems to be quantitatively modest. Fibre could reduce the formation of DNA adducts in several ways: by diluting potential food mutagens and carcinogens in the gastrointestinal tract, by speeding their transit through the colon and by binding carcinogenic substances.
Abstract:
Tree nuts, peanuts and seeds are nutrient-dense foods whose intake has been shown to be associated with reduced risk of some chronic diseases. They are regularly consumed in European diets either whole, in spreads or from hidden sources (e.g. commercial products). However, little is known about their intake profiles or about differences in consumption between European countries or geographic regions. The objective of this study was to analyse the population mean intake and average portion sizes in subjects reporting intake of nuts and seeds consumed whole, derived from hidden sources or from spreads. Data were obtained from standardised 24-hour dietary recalls collected from 36 994 subjects in 10 different countries that are part of the European Prospective Investigation into Cancer and Nutrition (EPIC). Overall, for nuts and seeds consumed whole, the percentage of subjects reporting intake on the day of the recall was: tree nuts = 4.4%, peanuts = 2.3% and seeds = 1.3%. The data show a clear northern (Sweden: mean intake = 0.15 g/d, average portion size = 15.1 g/d) to southern (Spain: mean intake = 2.99 g/d, average portion size = 34.7 g/d) European gradient of whole tree nut intake. The three most popular tree nuts were walnuts, almonds and hazelnuts, in that order. In general, tree nuts were more widely consumed than peanuts or seeds. Among subjects reporting intake, men consumed a significantly larger average portion of tree nuts (28.5 v. 23.1 g/d, P<0.01) and peanuts (46.1 v. 35.1 g/d, P<0.01) than women. These data may be useful in devising research initiatives and health policy strategies based on the intake of this food group.
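The distinction between the population mean intake (averaged over all subjects, most of whom report no nuts on the recall day) and the average portion size (averaged over consumers only) explains why Sweden can show a 0.15 g/d mean intake alongside a 15.1 g/d portion size. A small sketch with entirely hypothetical recall data illustrates the two denominators:

```python
import numpy as np

# Hypothetical single-day recall amounts (g) for tree nuts; only one of ten subjects reports intake.
recall_g = np.array([0, 0, 0, 0, 0, 0, 0, 0, 0, 15.1])

population_mean = recall_g.mean()          # averaged over all subjects, consumers and non-consumers
consumers = recall_g[recall_g > 0]
average_portion = consumers.mean()         # averaged over subjects reporting intake only
pct_consumers = 100 * consumers.size / recall_g.size

print(population_mean, average_portion, pct_consumers)  # 1.51 g/d, 15.1 g/d, 10.0 %
```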
Abstract:
BACKGROUND. A growing body of research suggests that prenatal exposure to air pollution may be harmful to fetal development. We assessed the association between exposure to air pollution during pregnancy and anthropometric measures at birth in four areas within the Spanish Children's Health and Environment (INMA) mother and child cohort study. METHODS. Exposure to ambient nitrogen dioxide (NO2) and benzene was estimated for the residence of each woman (n = 2,337) for each trimester and for the entire pregnancy. Outcomes included birth weight, length, and head circumference. The association between residential outdoor air pollution exposure and birth outcomes was assessed with linear regression models controlled for potential confounders. We also performed sensitivity analyses for the subset of women who spent more time at home during pregnancy. Finally, we performed a combined analysis with meta-analysis techniques. RESULTS. In the combined analysis, an increase of 10 µg/m3 in NO2 exposure during pregnancy was associated with a change in birth length of -0.9 mm [95% confidence interval (CI), -1.8 to -0.1 mm]. For the subset of women who spent ≥ 15 hr/day at home, the association was stronger (-0.16 cm; 95% CI, -0.27 to -0.04). For this same subset of women, a reduction of 22 g in birth weight was associated with each 10-µg/m3 increase in NO2 exposure in the second trimester (95% CI, -45.3 to 1.9). We observed no significant relationship between benzene levels and birth outcomes. CONCLUSIONS. NO2 exposure was associated with reductions in both length and weight at birth. This association was clearer for the subset of women who spent more time at home.
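The effect estimates are reported per 10 µg/m³ of NO2; if a linear regression is fitted with NO2 in µg/m³, the per-10-unit estimate is simply the coefficient (and its confidence limits) multiplied by 10. A sketch with a hypothetical adjusted coefficient chosen to reproduce the birth-length figure quoted above:

```python
# Hypothetical adjusted regression coefficient for birth length, in mm per 1 µg/m³ of NO2.
beta_per_1 = -0.09
ci_per_1 = (-0.18, -0.01)

# Rescale to the per-10 µg/m³ contrast used in the abstract.
beta_per_10 = 10 * beta_per_1                   # -0.9 mm
ci_per_10 = tuple(10 * b for b in ci_per_1)     # (-1.8, -0.1) mm
print(beta_per_10, ci_per_10)
```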
Abstract:
BACKGROUND Previous studies have demonstrated the efficacy of treatment for latent tuberculosis infection (TLTBI) in persons infected with the human immunodeficiency virus, but few studies have investigated the operational aspects of implementing TLTBI in the co-infected population. The study objectives were to describe eligibility for TLTBI as well as treatment prescription, initiation and completion in an HIV-infected Spanish cohort, and to investigate factors associated with treatment completion. METHODS Subjects were prospectively identified between 2000 and 2003 at ten HIV hospital-based clinics in Spain. Data were obtained from clinical records. Associations were measured using the odds ratio (OR) and its 95% confidence interval (95% CI). RESULTS A total of 1242 subjects were recruited and 846 (68.1%) were evaluated for TLTBI. Of these, 181 (21.4%) were eligible for TLTBI, either because they were tuberculin skin test (TST) positive (121) or because their TST was negative or unknown but they were known contacts of a TB case or had impaired immunity (60). Of the patients eligible for TLTBI, 122 (67.4%) initiated TLTBI: 99 (81.1%) were treated with isoniazid for 6, 9 or 12 months, and 23 (18.9%) with short-course regimens including rifampin plus isoniazid and/or pyrazinamide. In total, 70 patients (57.4%) completed treatment, 39 (32.0%) defaulted, 7 (5.7%) interrupted treatment because of adverse effects, 2 developed TB, 2 died, and 2 moved away. Treatment completion was associated with having acquired HIV infection through heterosexual sex as compared to intravenous drug use (OR: 4.6; 95% CI: 1.4-14.7) and with having taken rifampin and pyrazinamide for 2 months as compared to isoniazid for 9 months (OR: 8.3; 95% CI: 2.7-24.9). CONCLUSIONS Only a minority of HIV-infected patients eligible for TLTBI actually start and complete a course of treatment. Obstacles to successful implementation of this intervention need to be addressed.
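The associations above are expressed as odds ratios; as a reminder of how such a ratio is derived from a 2x2 table of treatment completion by exposure group, here is a sketch with entirely hypothetical counts (not the study's data):

```python
# Completed / did not complete TLTBI, by HIV transmission group (hypothetical counts).
het_completed, het_not = 30, 10       # heterosexual transmission
idu_completed, idu_not = 40, 60       # intravenous drug use (reference group)

odds_ratio = (het_completed / het_not) / (idu_completed / idu_not)
print(odds_ratio)  # 4.5: odds of completion are 4.5 times higher in the heterosexual group
```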
Abstract:
INTRODUCTION Higher and lower cerebral perfusion pressure (CPP) thresholds have been proposed to improve brain tissue oxygen pressure (PtiO2) and outcome. We studied the distribution of hypoxic PtiO2 samples at different CPP thresholds, using prospective multimodality monitoring in patients with severe traumatic brain injury. METHODS This is a prospective observational study of 22 severely head-injured patients admitted to a neurosurgical critical care unit, from whom multimodality data were collected during standard management directed at improving intracranial pressure, CPP and PtiO2. Local PtiO2 was continuously measured in uninjured areas, and snapshot samples were collected hourly and analyzed in relation to the simultaneous CPP. Other variables that influence tissue oxygen availability, mainly arterial oxygen saturation, end-tidal carbon dioxide, body temperature and effective hemoglobin, were also monitored and kept stable in order to avoid non-ischemic hypoxia. RESULTS Half of the PtiO2 samples were at risk of hypoxia (defined as a PtiO2 of 15 mmHg or less) when CPP was below 60 mmHg; this percentage decreased to 25% when CPP was between 60 and 70 mmHg and to 10% when CPP was above 70 mmHg (p < 0.01). CONCLUSION Our study indicates that the risk of brain tissue hypoxia in severely head-injured patients can be high when CPP is below the commonly recommended threshold of 60 mmHg, remains elevated when CPP is only slightly above that threshold, and decreases at higher CPP values.
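The analysis amounts to binning hourly snapshot samples by simultaneous CPP and counting the share with PtiO2 at or below the 15 mmHg hypoxia threshold. A minimal sketch of that tabulation, using hypothetical samples rather than the study's data:

```python
from collections import defaultdict

# Hypothetical hourly snapshots of (CPP in mmHg, PtiO2 in mmHg).
samples = [(55, 12), (58, 14), (65, 18), (64, 13), (75, 22), (78, 17)]

def cpp_band(cpp):
    if cpp < 60:
        return "< 60 mmHg"
    return "60-70 mmHg" if cpp <= 70 else "> 70 mmHg"

counts = defaultdict(lambda: [0, 0])      # band -> [hypoxic samples, total samples]
for cpp, ptio2 in samples:
    band = cpp_band(cpp)
    counts[band][1] += 1
    counts[band][0] += ptio2 <= 15        # hypoxia threshold used in the study

for band, (hypoxic, total) in counts.items():
    print(f"CPP {band}: {100 * hypoxic / total:.0f}% of samples hypoxic")
```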
Abstract:
BACKGROUND. Ritonavir-boosted saquinavir (SQVr) is nowadays regarded as an alternative antiretroviral drug, probably because of several drawbacks, such as its high pill burden, twice-daily dosing and the requirement for 200 mg of ritonavir when given at the current standard 1000/100 mg bid dosing. Several once-daily SQVr dosing schemes have been studied with the old 200 mg SQV formulations in an attempt to overcome some of these disadvantages. SQV 500 mg strength tablets became available at the end of 2005, facilitating a once-daily regimen with fewer pills, although experience with this formulation is still very limited. METHODS. Prospective, multicentre study in which the efficacy, safety and pharmacokinetics of a regimen of once-daily SQVr 1500/100 mg plus 2 NRTIs were evaluated under routine clinical care conditions in either antiretroviral-naïve patients or those with no previous history of antiretroviral treatments and/or genotypic resistance tests suggesting SQV resistance. Plasma SQV trough levels were measured by HPLC-UV. RESULTS. Five hundred and fourteen Caucasian patients were included (47.2% coinfected with hepatitis C and/or B virus; 7.8% with cirrhosis). Efficacy at 52 weeks (plasma HIV RNA <50 copies/ml) was 67.7% (95% CI: 63.6-71.7%) by intention-to-treat and 92.2% (95% CI: 89.8-94.6%) by on-treatment analysis. The reasons for failure were: dropout or loss to follow-up (18.4%), virological failure (7.8%), adverse events (3.1%), and other reasons (4.6%). The high dropout rate may be explained by enrollment and follow-up under routine clinical care conditions and by a population with a significant number of drug users. The median SQV Cmin (n = 49) was 295 ng/ml (range, 53-2172). The only variable associated with virological failure in the multivariate analysis was adherence (OR: 3.36; 95% CI: 1.51-7.46; p = 0.003). CONCLUSIONS. Our results suggest that SQVr (1500/100 mg) once daily plus 2 NRTIs is an effective regimen, with no severe clinical adverse events or hepatotoxicity, few lipid changes, and no interactions with methadone. All these factors, together with its once-daily administration, make this regimen an appropriate option for patients with no SQV resistance-associated mutations.
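The gap between the intention-to-treat (67.7%) and on-treatment (92.2%) efficacy figures reflects nothing more than the denominator: ITT counts every enrolled patient (dropouts as failures), whereas the on-treatment analysis keeps only patients who remained on the regimen with data. A back-of-the-envelope sketch using the numbers quoted above; the intermediate counts are rounded approximations, not figures reported by the study:

```python
enrolled = 514
responders = round(0.677 * enrolled)            # ≈ 348 patients with HIV RNA < 50 copies/ml at week 52

itt_rate = responders / enrolled                # dropouts and losses to follow-up count as failures
on_treatment_n = round(responders / 0.922)      # ≈ 377 patients still on treatment and evaluable
ot_rate = responders / on_treatment_n

print(f"ITT {itt_rate:.1%}, on-treatment {ot_rate:.1%}")  # ~67.7% vs ~92.3%
```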
Abstract:
Discordances exist among epidemiological studies regarding the association between nutrient intake and death and disease. We evaluated the social and health profile of persons who consumed olive oil in a prospective population-based cohort study (the Pizarra study) with a 6-year follow-up. A food frequency questionnaire and a 7 d quantitative questionnaire were administered to 538 persons. The type of oil used in food preparation was determined by direct measurement of the fatty acids in samples obtained from the kitchens of the participants at baseline and after the 6-year follow-up. The fatty acid composition of the serum phospholipids was used as an endogenous marker of the type of oil consumed. Total fat intake accounted for a mean of 40% of energy (at baseline and after follow-up). The concordance in MUFA intake over the study period was high. The fatty acid composition of the serum phospholipids was significantly associated with the type of oil consumed and with fish intake. The concentration of polar compounds and polymers, indicative of oil degradation, was greater in oils from kitchens where sunflower oil or refined olive oil was used, in oils used for deep frying and in oils that had been reused for frying five times or more. Consumption of olive oil was directly associated with educational level. Part of the discordance found in epidemiological studies of diet and health may be due to the handling of oils during food preparation. The intake of olive oil is associated with other healthy habits.
Abstract:
BACKGROUND Waist circumference (WC) is a simple and reliable measure of fat distribution that may add to the prediction of type 2 diabetes (T2D), but previous studies have been too small to reliably quantify the relative and absolute risk of future diabetes by WC at different levels of body mass index (BMI). METHODS AND FINDINGS The prospective InterAct case-cohort study was conducted in 26 centres in eight European countries and consists of 12,403 incident T2D cases and a stratified subcohort of 16,154 individuals from a total cohort of 340,234 participants with 3.99 million person-years of follow-up. We used Prentice-weighted Cox regression and random-effects meta-analysis methods to estimate hazard ratios for T2D. Kaplan-Meier estimates of the cumulative incidence of T2D were calculated. BMI and WC were each independently associated with T2D, with WC being a stronger risk factor in women than in men. Risk increased across groups defined by BMI and WC; compared to low normal-weight individuals (BMI 18.5-22.4 kg/m²) with a low WC (<94/80 cm in men/women), the hazard ratio of T2D was 22.0 (95% confidence interval 14.3; 33.8) in men and 31.8 (25.2; 40.2) in women with grade 2 obesity (BMI ≥35 kg/m²) and a high WC (>102/88 cm). Among the large group of overweight individuals, WC measurement was highly informative and facilitated the identification of a subgroup of overweight people with high WC whose 10-y T2D cumulative incidence (men, 70 per 1,000 person-years; women, 44 per 1,000 person-years) was comparable to that of the obese group (50-103 per 1,000 person-years in men and 28-74 per 1,000 person-years in women). CONCLUSIONS WC is independently and strongly associated with T2D, particularly in women, and should be more widely measured for risk stratification. If targeted measurement is necessary for reasons of resource scarcity, measuring WC in overweight individuals may be an effective strategy, since it identifies a high-risk subgroup of individuals who could benefit from individualised preventive action.
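The risk groups are defined by crossing BMI bands with sex-specific waist-circumference cut-offs (94/80 cm and 102/88 cm for men/women). A sketch of that classification using the cut-offs quoted in the abstract; the overweight band (BMI 25-29.9 kg/m²) is the conventional definition and is an assumption here, as is the example participant:

```python
def bmi_wc_group(sex: str, bmi: float, wc_cm: float) -> str:
    """Illustrative grouping based on the cut-offs quoted in the abstract; sex is 'M' or 'F'."""
    low_wc = wc_cm < (94 if sex == "M" else 80)
    high_wc = wc_cm > (102 if sex == "M" else 88)
    if 18.5 <= bmi < 22.5 and low_wc:
        return "low normal weight, low WC (reference group)"
    if bmi >= 35 and high_wc:
        return "grade 2 obesity, high WC (HR 22.0 in men, 31.8 in women vs. reference)"
    if 25 <= bmi < 30 and high_wc:
        return "overweight with high WC (high-risk subgroup)"
    return "other BMI/WC combination"

print(bmi_wc_group("F", 36.2, 95.0))  # grade 2 obesity, high WC
```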
Abstract:
While the risk of ovarian cancer clearly decreases with each full-term pregnancy, the effect of incomplete pregnancies is unclear. We investigated whether incomplete pregnancies (miscarriages and induced abortions) are associated with the risk of epithelial ovarian cancer. This observational study was carried out in female participants of the European Prospective Investigation into Cancer and Nutrition (EPIC). A total of 274,442 women were followed from 1992 until 2010. The baseline questionnaire elicited information on miscarriages and induced abortions, reproductive history, and lifestyle-related factors. During a median follow-up of 11.5 years, 1,035 women were diagnosed with incident epithelial ovarian cancer. Despite the lack of an overall association (ever vs. never), the risk of ovarian cancer was higher among women with multiple incomplete pregnancies (HR ≥4 vs. 0: 1.74, 95% CI: 1.20-2.70; number of cases in this category: n = 23). This association was particularly evident for multiple miscarriages (HR ≥4 vs. 0: 1.99, 95% CI: 1.06-3.73; n = 10), with no significant association for multiple induced abortions (HR ≥4 vs. 0: 1.46, 95% CI: 0.68-3.14; n = 7). Our findings suggest that multiple miscarriages are associated with an increased risk of epithelial ovarian cancer, possibly through a shared cluster of etiological factors or a common underlying pathology. These findings should be interpreted with caution, as this is the first study to show this association and the number of cases in the highest exposure categories is small.
Abstract:
BACKGROUND Earlier analyses within the EPIC study showed that dietary fibre intake was inversely associated with colorectal cancer risk, but results from some large cohort studies do not support this finding. We explored whether the association remained after longer follow-up, with a nearly threefold increase in colorectal cancer cases, and whether the association varied by sex and tumour location. METHODOLOGY/PRINCIPAL FINDINGS After a mean follow-up of 11.0 years, 4,517 incident cases of colorectal cancer were documented. Total, cereal, fruit, and vegetable fibre intakes were estimated from dietary questionnaires at baseline. Hazard ratios (HRs) and 95% confidence intervals (CIs) were estimated using Cox proportional hazards models stratified by age, sex, and centre, and adjusted for total energy intake, body mass index, physical activity, smoking, education, menopausal status, hormone replacement therapy, oral contraceptive use, and intakes of alcohol, folate, red and processed meats, and calcium. After multivariable adjustment, total dietary fibre was inversely associated with colorectal cancer (HR per 10 g/day increase in fibre 0.87, 95% CI: 0.79-0.96). Similar linear associations were observed for colon and rectal cancers. The association between total dietary fibre and colorectal cancer risk did not differ by age, sex, or anthropometric, lifestyle, and dietary variables. Fibre from cereals and fibre from fruit and vegetables were similarly associated with colon cancer, but for rectal cancer the inverse association was evident only for fibre from cereals. CONCLUSIONS/SIGNIFICANCE Our results strengthen the evidence for a role of high dietary fibre intake in colorectal cancer prevention.
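Because the hazard ratio is modelled per 10 g/day of fibre on a log-linear scale, larger contrasts follow by exponentiation; a short sketch of that arithmetic, where the 20 g/day contrast is just an illustration and not a figure from the study:

```python
import math

hr_per_10g = 0.87                       # hazard ratio per 10 g/day increase in total fibre (abstract)

# Under a proportional hazards model with a linear term for fibre, a 20 g/day higher
# intake corresponds to the per-10 g/day hazard ratio squared.
hr_per_20g = hr_per_10g ** 2
print(round(hr_per_20g, 2))             # 0.76

# Equivalent view on the log-hazard scale.
print(round(math.exp(2 * math.log(hr_per_10g)), 2))  # 0.76
```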
Abstract:
BACKGROUND Socio-economic inequalities in mortality are observed at the country level in both North America and Europe. The purpose of this work is to investigate the contribution of specific risk factors to social inequalities in cause-specific mortality using a large multi-country cohort of Europeans. METHODS A total of 3,456,689 person-years of follow-up of the European Prospective Investigation into Cancer and Nutrition (EPIC) was analysed. The educational level of subjects from 9 European countries was recorded as a proxy for socio-economic status (SES). Cox proportional hazards models with step-wise inclusion of explanatory variables were used to explore the association between SES and mortality; a Relative Index of Inequality (RII) was calculated as a measure of relative inequality. RESULTS Total mortality among men with the highest education level was reduced by 43% compared to men with the lowest (HR 0.57, 95% C.I. 0.52-0.61); among women, by 29% (HR 0.71, 95% C.I. 0.64-0.78). The risk reduction was attenuated by 7% in men and 3% in women by the introduction of smoking, and to a lesser extent by introducing body mass index (2% in men and 3% in women) and additional explanatory variables (alcohol consumption, leisure physical activity, fruit and vegetable intake; 3% in men and 5% in women). Social inequalities were highly statistically significant for all causes of death examined in men. In women, social inequalities were less strong, but statistically significant for all causes of death except cancer-related mortality and injuries. DISCUSSION This European study reports substantial social inequalities in mortality among European men and women that cannot be fully explained by known common risk factors for chronic diseases.
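For readers unfamiliar with the Relative Index of Inequality, one common construction assigns each education category the midpoint of its cumulative population share and enters that score as a continuous covariate in the survival model; the RII is then the exponentiated coefficient, contrasting the hypothetical bottom and top of the education hierarchy. A minimal sketch under that construction, with hypothetical category shares and coefficient (not the study's values):

```python
import math

# Hypothetical population shares, ordered from most to least educated.
shares = {"high": 0.25, "medium": 0.45, "low": 0.30}

ridit, cum = {}, 0.0
for level, share in shares.items():
    ridit[level] = cum + share / 2    # midpoint of the cumulative distribution (ridit score)
    cum += share

beta = 0.55                           # hypothetical log-hazard coefficient for the ridit score
rii = math.exp(beta)                  # relative hazard: bottom (score 1) vs. top (score 0)
print(ridit, round(rii, 2))           # {'high': 0.125, 'medium': 0.475, 'low': 0.85} 1.73
```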
Abstract:
BACKGROUND: The Outpatient Bleeding Risk Index (OBRI) and the Kuijer, RIETE and Kearon scores are clinical prognostic scores for bleeding in patients receiving oral anticoagulants for venous thromboembolism (VTE). We prospectively compared the performance of these scores in elderly patients with VTE. METHODS: In a prospective multicenter Swiss cohort study, we studied 663 patients aged ≥ 65 years with acute VTE. The outcome was a first major bleeding at 90 days. We classified patients into three categories of bleeding risk (low, intermediate and high) according to each score and dichotomized patients as high vs. low or intermediate risk. We calculated the area under the receiver-operating characteristic (ROC) curve, positive predictive values and likelihood ratios for each score. RESULTS: Overall, 28 out of 663 patients (4.2%, 95% confidence interval [CI] 2.8-6.0%) had a first major bleeding within 90 days. According to different scores, the rate of major bleeding varied from 1.9% to 2.1% in low-risk, from 4.2% to 5.0% in intermediate-risk and from 3.1% to 6.6% in high-risk patients. The discriminative power of the scores was poor to moderate, with areas under the ROC curve ranging from 0.49 to 0.60 (P = 0.21). The positive predictive values and positive likelihood ratios were low and varied from 3.1% to 6.6% and from 0.72 to 1.59, respectively. CONCLUSION: In elderly patients with VTE, existing bleeding risk scores do not have sufficient accuracy and power to discriminate between patients with VTE who are at a high risk of short-term major bleeding and those who are not.
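Positive predictive value and positive likelihood ratio both follow from a 2x2 table of major bleeding by dichotomized score (high risk vs. low or intermediate risk). The sketch below uses the cohort's 28 bleeders and 635 non-bleeders as marginal totals, but the split across score categories is hypothetical:

```python
# Bleeders / non-bleeders classified as high risk vs. low/intermediate risk (hypothetical split).
tp, fn = 6, 22        # the 28 patients with major bleeding at 90 days
fp, tn = 120, 515     # the 635 patients without major bleeding

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
ppv = tp / (tp + fp)
lr_positive = sensitivity / (1 - specificity)

print(f"PPV = {ppv:.1%}, LR+ = {lr_positive:.2f}")  # PPV = 4.8%, LR+ = 1.13
```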
Abstract:
OBJECTIVE: Anti-tumor necrosis factor-α agents have significantly improved the management of Crohn's disease (CD), but not all patients benefit from this therapy. We used data from the Swiss Inflammatory Bowel Disease Cohort Study and predefined appropriateness criteria to examine the appropriateness of use of infliximab (IFX) in CD patients. METHODS: The EPACT II (European Panel on the Appropriateness of CD Therapy, 2007; www.epact.ch) appropriateness criteria were developed using a formal explicit panel process combining evidence from the published literature and expert opinion. Questionnaires relating to the EPACT II criteria were used at enrollment and follow-up of all Swiss Inflammatory Bowel Disease Cohort Study patients. A step-by-step analysis of all possible indications for IFX therapy in a given patient allowed identification of the most appropriate indication and final classification into a single appropriateness category (appropriate, uncertain, inappropriate). RESULTS: Eight hundred and twenty-one CD patients were prospectively enrolled between November 2006 and March 2009. IFX was administered to 146 patients (18%) at enrollment and was most frequently used for complex fistulizing disease and for the maintenance of remission induced by biological therapy. IFX therapy was considered appropriate in 44%, uncertain in 44%, and inappropriate in 10% of patients. CONCLUSION: In this cohort, 9 out of 10 indications for IFX therapy were generally clinically acceptable (appropriate or uncertain) according to the EPACT II criteria. Uncertain indications resulted mainly from the current, more liberal use of IFX in clinical practice as compared with the EPACT II criteria.
Abstract:
BACKGROUND AND OBJECTIVES The treatment of choice for patients with post-transplant lymphoproliferative disorders is controversial. The purpose of this trial was to evaluate the efficacy of treatment with extended, response-adapted doses of rituximab in patients with post-transplant lymphoproliferative disorders after solid organ transplantation. DESIGN AND METHODS This was a prospective, multicenter, phase II trial. Patients were treated with reduction of immunosuppression and four weekly infusions of rituximab. Patients who did not achieve complete remission (CR) received a second course of four rituximab infusions. The primary end-point of the study was the CR rate. RESULTS Thirty-eight patients were assessable. One episode of grade 4 neutropenia was the only severe adverse event observed. After the first course of rituximab, 13 (34.2%) patients achieved CR, 8 patients did not respond, and 17 achieved partial remission. Of those 17 patients, 12 could be treated with a second course of rituximab, and 10 (83.3%) achieved CR, yielding an intention-to-treat CR rate of 60.5%. Eight patients excluded from the trial because of absence of CR were treated with rituximab combined with chemotherapy, and six (75%) achieved CR. Event-free survival was 42% and overall survival was 47% at 27.5 months. Fourteen patients died, ten of them from progression of their post-transplant lymphoproliferative disorder. INTERPRETATION AND CONCLUSIONS These results confirm that extended treatment with rituximab can achieve a high rate of CR in patients with post-transplant lymphoproliferative disorders after solid organ transplantation without increasing toxicity, and should be recommended as initial therapy for these patients.
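The intention-to-treat complete remission rate of 60.5% follows directly from the counts quoted above; a short sketch of that arithmetic:

```python
assessable = 38
cr_after_first_course = 13     # CR after the initial four weekly rituximab infusions
cr_after_second_course = 10    # CR among the 12 partial responders given a second course

itt_cr_rate = (cr_after_first_course + cr_after_second_course) / assessable
print(f"{itt_cr_rate:.1%}")    # 60.5%
```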