12 results for R..., Marie
Abstract:
Background With the emergence of influenza H1N1v the world is facing its first 21st-century global pandemic. Severe Acute Respiratory Syndrome (SARS) and avian influenza H5N1 prompted the development of pandemic preparedness plans. National systems of public health law are essential for public health stewardship and for the implementation of public health policy [1]. International coherence will contribute to effective regional and global responses. However, little research has been undertaken on how law works as a tool for disease control in Europe. With co-funding from the European Union, we investigated the extent to which laws across Europe support or constrain pandemic preparedness planning, and whether national differences are likely to constrain control efforts. Methods We undertook a survey of national public health laws across 32 European states using a questionnaire designed around a disease scenario based on pandemic influenza. Questionnaire results were reviewed in workshops, analysing how differences between national laws might support or hinder regional responses to pandemic influenza. Respondents examined the impact of national laws on the movements of information, goods, services and people across borders in a time of pandemic; the capacity for surveillance, case detection, case management and community control; the deployment of strategies of prevention, containment, mitigation and recovery; and the identification of commonalities and disconnects across states. Results The results of this study show differences across Europe in the extent to which national pandemic policy and pandemic plans have been integrated with public health laws. We found significant differences in legislation and in the legitimacy of strategic plans. States differ in the range and the nature of intervention measures authorized by law, the extent to which borders could be closed to movement of persons and goods during a pandemic, and access to healthcare of non-resident persons. 
Some states propose use of emergency powers that might potentially override human rights protections while other states propose to limit interventions to those authorized by public health laws. Conclusion These differences could create problems for European strategies if an evolving influenza pandemic results in more serious public health challenges or, indeed, if a novel disease other than influenza emerges with pandemic potential. There is insufficient understanding across Europe of the role and importance of law in pandemic planning. States need to build capacity in public health law to support disease prevention and control policies. Our research suggests that states would welcome further guidance from the EU on management of a pandemic, and guidance to assist in greater commonality of legal approaches across states.
Abstract:
Tree nuts, peanuts and seeds are nutrient-dense foods whose intake has been shown to be associated with reduced risk of some chronic diseases. They are regularly consumed in European diets either whole, in spreads or from hidden sources (e.g. commercial products). However, little is known about their intake profiles or differences in consumption between European countries or geographic regions. The objective of this study was to analyse the population mean intake and average portion sizes in subjects reporting intake of nuts and seeds consumed whole, derived from hidden sources or from spreads. Data were obtained from standardised 24-hour dietary recalls collected from 36,994 subjects in 10 different countries that are part of the European Prospective Investigation into Cancer and Nutrition (EPIC). Overall, for nuts and seeds consumed whole, the percentage of subjects reporting intake on the day of the recall was: tree nuts = 4.4%, peanuts = 2.3% and seeds = 1.3%. The data show a clear northern (Sweden: mean intake = 0.15 g/d, average portion size = 15.1 g/d) to southern (Spain: mean intake = 2.99 g/d, average portion size = 34.7 g/d) European gradient of whole tree nut intake. The three most popular tree nuts were walnuts, almonds and hazelnuts, in that order. In general, tree nuts were more widely consumed than peanuts or seeds. In subjects reporting intake, men consumed a significantly higher average portion size of tree nuts (28.5 vs. 23.1 g/d, P<0.01) and peanuts (46.1 vs. 35.1 g/d, P<0.01) per day than women. These data may be useful in devising research initiatives and health policy strategies based on the intake of this food group.
Abstract:
Data as of 31 December 1999. Published on the website of the Consejería de Salud: www.juntadeandalucia.es/salud (Consejería de Salud / Profesionales / Estadísticas Sanitarias / Estadísticas Hospitalarias / Estadísticas Hospitalarias de Andalucía).
Abstract:
While the risk of ovarian cancer clearly reduces with each full-term pregnancy, the effect of incomplete pregnancies is unclear. We investigated whether incomplete pregnancies (miscarriages and induced abortions) are associated with risk of epithelial ovarian cancer. This observational study was carried out in female participants of the European Prospective Investigation into Cancer and Nutrition (EPIC). A total of 274,442 women were followed from 1992 until 2010. The baseline questionnaire elicited information on miscarriages and induced abortions, reproductive history, and lifestyle-related factors. During a median follow-up of 11.5 years, 1,035 women were diagnosed with incident epithelial ovarian cancer. Despite the lack of an overall association (ever vs. never), risk of ovarian cancer was higher among women with multiple incomplete pregnancies (HR (≥4 vs. 0): 1.74, 95% CI: 1.20-2.70; number of cases in this category: n = 23). This association was particularly evident for multiple miscarriages (HR (≥4 vs. 0): 1.99, 95% CI: 1.06-3.73; number of cases in this category: n = 10), with no significant association for multiple induced abortions (HR (≥4 vs. 0): 1.46, 95% CI: 0.68-3.14; number of cases in this category: n = 7). Our findings suggest that multiple miscarriages are associated with an increased risk of epithelial ovarian cancer, possibly through a shared cluster of etiological factors or a common underlying pathology. These findings should be interpreted with caution as this is the first study to show this association and given the small number of cases in the highest exposure categories.
Abstract:
BACKGROUND Earlier analyses within the EPIC study showed that dietary fibre intake was inversely associated with colorectal cancer risk, but results from some large cohort studies do not support this finding. We explored whether the association remained after longer follow-up with a near threefold increase in colorectal cancer cases, and whether the association varied by sex and tumour location. METHODOLOGY/PRINCIPAL FINDINGS After a mean follow-up of 11.0 years, 4,517 incident cases of colorectal cancer were documented. Total, cereal, fruit, and vegetable fibre intakes were estimated from dietary questionnaires at baseline. Hazard ratios (HRs) and 95% confidence intervals (CIs) were estimated using Cox proportional hazards models stratified by age, sex, and centre, and adjusted for total energy intake, body mass index, physical activity, smoking, education, menopausal status, hormone replacement therapy, oral contraceptive use, and intakes of alcohol, folate, red and processed meats, and calcium. After multivariable adjustments, total dietary fibre was inversely associated with colorectal cancer (HR per 10 g/day increase in fibre 0.87, 95% CI: 0.79-0.96). Similar linear associations were observed for colon and rectal cancers. The association between total dietary fibre and colorectal cancer risk did not differ by age, sex, or anthropometric, lifestyle, and dietary variables. Fibre from cereals and fibre from fruit and vegetables were similarly associated with colon cancer; but for rectal cancer, the inverse association was only evident for fibre from cereals. CONCLUSIONS/SIGNIFICANCE Our results strengthen the evidence for the role of high dietary fibre intake in colorectal cancer prevention.
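A hazard ratio reported per fixed increment of exposure, as above, scales multiplicatively because the Cox model is linear on the log-hazard scale. A minimal sketch of that arithmetic (the function name is ours; the 0.87 per 10 g/day figure is taken from the abstract):

```python
import math

def scale_hazard_ratio(hr_per_unit: float, units: float) -> float:
    """Rescale a hazard ratio reported per one exposure increment to a
    different number of increments, using log-linearity of the Cox model."""
    beta = math.log(hr_per_unit)      # log-hazard change per increment
    return math.exp(beta * units)     # exp(beta * k) = HR per k increments

# HR = 0.87 per 10 g/day of fibre; implied HR for a 20 g/day contrast
# is simply 0.87 squared:
hr_20g = scale_hazard_ratio(0.87, 2)  # two 10 g/day increments
print(round(hr_20g, 3))               # prints 0.757
```

This is only the point estimate; confidence limits would need the standard error of beta, which the abstract does not report.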
Abstract:
BACKGROUND The role of re-treatment with rituximab in aggressive B-cell lymphomas still needs to be defined. This study evaluated the influence of prior exposure to rituximab on response rates and survival in patients with diffuse large B-cell lymphoma treated with rituximab plus etoposide, cytarabine, cisplatinum and methylprednisolone (R-ESHAP). DESIGN AND METHODS We retrospectively analyzed 163 patients with relapsed or refractory diffuse large B-cell lymphoma who received R-ESHAP as salvage therapy with a curative purpose. Patients were divided into two groups according to whether rituximab had been administered (n=94, "R+" group) or not (n=69, "R-" group) prior to R-ESHAP. RESULTS Response rates were significantly higher in the R- group in the univariate but not in the multivariate analysis. In the analysis restricted to the R+ group, we observed very low complete remission and overall response rates in patients with primary refractory disease (8% and 33%, respectively), as compared to those in patients who were in first partial remission (41% and 86%) or who had relapsed disease (50% and 75%) (p<0.01 in both cases). Overall, 60% and 65% of patients in the R+ and R- groups, respectively, underwent stem-cell transplantation after the salvage therapy. With a median follow-up of 29 months (range, 6-84), patients in the R+ group had significantly worse progression-free survival (17% vs. 57% at 3 years, p<0.0001) and overall survival (38% vs. 67% at 3 years, p=0.0005) than patients in the R- group. Prior exposure to rituximab was also an independent adverse prognostic factor for both progression-free survival (RR: 2.0; 95% CI: 1.2-3.3, p=0.008) and overall survival (RR: 2.2; 95% CI: 1.3-3.9, p=0.004). CONCLUSIONS R-ESHAP was associated with a high response rate in patients who were not refractory to upfront rituximab-based chemotherapy. 
However, the survival outcome was poor for patients previously exposed to rituximab compared with those who had not previously been treated with rituximab.
Abstract:
BACKGROUND Recently, some US cohorts have shown a moderate association between red and processed meat consumption and mortality supporting the results of previous studies among vegetarians. The aim of this study was to examine the association of red meat, processed meat, and poultry consumption with the risk of early death in the European Prospective Investigation into Cancer and Nutrition (EPIC). METHODS Included in the analysis were 448,568 men and women without prevalent cancer, stroke, or myocardial infarction, and with complete information on diet, smoking, physical activity and body mass index, who were between 35 and 69 years old at baseline. Cox proportional hazards regression was used to examine the association of meat consumption with all-cause and cause-specific mortality. RESULTS As of June 2009, 26,344 deaths were observed. After multivariate adjustment, a high consumption of red meat was related to higher all-cause mortality (hazard ratio (HR) = 1.14, 95% confidence interval (CI) 1.01 to 1.28, 160+ versus 10 to 19.9 g/day), and the association was stronger for processed meat (HR = 1.44, 95% CI 1.24 to 1.66, 160+ versus 10 to 19.9 g/day). After correction for measurement error, higher all-cause mortality remained significant only for processed meat (HR = 1.18, 95% CI 1.11 to 1.25, per 50 g/d). We estimated that 3.3% (95% CI 1.5% to 5.0%) of deaths could be prevented if all participants had a processed meat consumption of less than 20 g/day. Significant associations with processed meat intake were observed for cardiovascular diseases, cancer, and 'other causes of death'. The consumption of poultry was not related to all-cause mortality. CONCLUSIONS The results of our analysis support a moderate positive association between processed meat consumption and mortality, in particular due to cardiovascular diseases, but also to cancer.
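The "3.3% of deaths could be prevented" figure above is an attributable-fraction estimate. A hedged sketch of the standard (Levin) formula with hypothetical inputs, since the abstract does not report the exposure prevalence it used:

```python
def attributable_fraction(prevalence_exposed: float, hr: float) -> float:
    """Levin's population attributable fraction: the share of deaths
    expected to be avoided if the exposed group had the referent's risk.
    PAF = p(HR - 1) / (1 + p(HR - 1))."""
    excess = prevalence_exposed * (hr - 1.0)
    return excess / (1.0 + excess)

# Hypothetical: 20% of the cohort eating >= 20 g/day of processed meat,
# with the abstract's measurement-error-corrected HR of 1.18 per 50 g/day
# applied as a single exposed-vs-unexposed contrast (illustration only).
print(round(attributable_fraction(0.20, 1.18), 3))  # prints 0.035
```

The published estimate additionally integrates over graded exposure categories, so this one-stratum version only illustrates the shape of the calculation.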
Abstract:
BACKGROUND Compared to food patterns, nutrient patterns have rarely been used, particularly at the international level. We studied, in the context of a multi-center study with heterogeneous data, the methodological challenges regarding pattern analyses. METHODOLOGY/PRINCIPAL FINDINGS We identified nutrient patterns from food frequency questionnaires (FFQ) in the European Prospective Investigation into Cancer and Nutrition (EPIC) Study and used 24-hour dietary recall (24-HDR) data to validate and describe the nutrient patterns and their related food sources. Associations between lifestyle factors and the nutrient patterns were also examined. Principal component analysis (PCA) was applied to 23 nutrients derived from country-specific FFQ combining data from all EPIC centers (N = 477,312). Harmonized 24-HDRs available for a representative sample of the EPIC populations (N = 34,436) provided accurate mean group estimates of nutrients and foods by quintiles of pattern scores, presented graphically. An overall PCA combining all data captured a good proportion of the variance explained in each EPIC center. Four nutrient patterns were identified, explaining 67% of the total variance: principal component (PC) 1 was characterized by a high contribution of nutrients from plant food sources and a low contribution of nutrients from animal food sources; PC2 by a high contribution of micronutrients and proteins; PC3 was characterized by polyunsaturated fatty acids and vitamin D; PC4 by calcium, proteins, riboflavin, and phosphorus. The nutrients with high loadings on a particular pattern as derived from country-specific FFQ also showed high deviations in their mean EPIC intakes by quintiles of pattern scores when estimated from 24-HDR. Center and energy intake explained most of the variability in pattern scores. 
CONCLUSION/SIGNIFICANCE The use of 24-HDR enabled internal validation and facilitated the interpretation of the nutrient patterns derived from FFQs in terms of food sources. These outcomes open research opportunities and perspectives for using nutrient patterns in future studies, particularly at the international level.
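PCA, as used above, rotates correlated intake variables into orthogonal components ranked by explained variance. For just two standardised variables the result has a closed form: the correlation-matrix eigenvalues are 1 ± r, so PC1 explains (1 + |r|)/2 of the total variance. A minimal sketch with toy data (not EPIC values; the function name is ours):

```python
import math
from statistics import mean

def pc1_variance_share(xs, ys):
    """Fraction of total variance explained by the first principal
    component of two standardised variables: (1 + |r|) / 2, where r
    is the Pearson correlation (eigenvalues of a 2x2 correlation
    matrix are 1 + r and 1 - r)."""
    mx, my = mean(xs), mean(ys)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    r = sxy / (sx * sy)            # Pearson correlation
    return (1 + abs(r)) / 2

# Toy intakes of two correlated nutrients (arbitrary units)
fibre  = [10, 12, 15, 18, 22, 25]
folate = [0.20, 0.25, 0.30, 0.33, 0.40, 0.45]
print(round(pc1_variance_share(fibre, folate), 2))
```

With 23 nutrients, as in the study, the same idea applies but the eigendecomposition must be computed numerically rather than in closed form.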
Abstract:
INTRODUCTION Due to their low CNS penetrance, there are concerns about the capacity of non-conventional PI-based ART (monotherapy and dual therapies) to preserve neurocognitive performance (NP). METHODS We evaluated the NP change of aviremic participants of the SALT clinical trial (1) switching therapy to dual therapy (DT: ATV/r+3TC) or triple therapy (TT: ATV/r+2NRTI) who agreed to perform an NP assessment (NPZ-5) at baseline and W48. Neurocognitive impairment and NP were assessed using AAN-2007 criteria (2) and global deficit scores (GDS) (3). Neurocognitive change (GDS change: W48 - baseline) and the effect of DT on NP evolution, crude and adjusted for significant confounders, were determined using ANCOVA. RESULTS A total of 158 patients were included (Table 1). They had shorter times since HIV diagnosis, ART initiation and HIV suppression, and their virologic outcome at W48 by snapshot was higher (79.1% vs 72.7%; p=0.04) compared to the 128 patients not included in the sub-study. By AAN-2007 criteria, 51 patients in each ART group (68% vs 63%) were neurocognitively impaired at baseline (p=0.61). Forty-seven patients were not reassessed at W48: 30 lost to follow-up (16 DT, 14 TT) and 17 had non-evaluable data (6 DT, 11 TT). Patients retested were more likely to be men (78.9% vs 61.4%) and to have neurological confounders (9.6% vs 0%) than patients not retested. At W48, 3 out of 16 (5.7%) patients on DT and 6 out of 21 (10.5%) on TT who were non-impaired at baseline became impaired (p=0.49), while 10 out of 37 (18.9%) on DT and 7 out of 36 (12.3%) on TT who were neurocognitively impaired at baseline became non-impaired (p=0.44). Mean GDS changes (95% CI) were: overall -0.2 (-0.3 to -0.04); DT -0.26 (-0.4 to -0.07) and TT -0.08 (-0.2 to 0.07). NP was similar between DT and TT (0.15). This absence of differences was also observed in all cognitive tests. 
The effect of DT on NP evolution (-0.16 [-0.38 to 0.06]; r²=0.16) was similar to that of TT (reference), even after adjusting (DT: -0.11 [-0.33 to 0.1]; TT: reference) for significant confounders (geographical origin, previous ATV use and CD4 cell count) (r²=0.25). CONCLUSIONS NP stability was observed after 48 weeks of follow-up in the majority of patients, whether DT or TT was used to maintain HIV suppression. Incidence rates of NP impairment or NP impairment recovery were also similar between DT and TT.
Abstract:
BACKGROUND Cancer survivors are advised to follow lifestyle recommendations on diet, physical activity, and body fatness proposed by the World Cancer Research Fund/American Institute of Cancer Research (WCRF/AICR) for cancer prevention. Previous studies have demonstrated that higher concordance with these recommendations measured using an index score (the WCRF/AICR score) was associated with lower cancer incidence and mortality. The aim of this study was to evaluate the association between pre-diagnostic concordance with WCRF/AICR recommendations and mortality in colorectal cancer (CRC) patients. METHODS The association between the WCRF/AICR score (score range 0-6 in men and 0-7 in women; higher scores indicate greater concordance) assessed on average 6.4 years before diagnosis and CRC-specific (n = 872) and overall mortality (n = 1,113) was prospectively examined among 3,292 participants diagnosed with CRC in the European Prospective Investigation into Cancer and Nutrition (EPIC) cohort (mean follow-up time after diagnosis 4.2 years). Multivariable Cox proportional hazard models were used to estimate hazard ratios (HRs) and 95% confidence intervals (CIs) for mortality. RESULTS The HRs (95% CIs) for CRC-specific mortality among participants in the second (score range in men/women: 2.25-2.75/3.25-3.75), third (3-3.75/4-4.75), and fourth (4-6/5-7) categories of the score were 0.87 (0.72-1.06), 0.74 (0.61-0.90), and 0.70 (0.56-0.89), respectively (P for trend <0.0001), compared to participants with the lowest concordance with the recommendations (category 1 of the score: 0-2/0-3). Similar HRs for overall mortality were observed (P for trend 0.004). Meeting the recommendations on body fatness and plant food consumption were associated with improved survival among CRC cases in mutually adjusted models. 
CONCLUSIONS Greater concordance with the WCRF/AICR recommendations on diet, physical activity, and body fatness prior to CRC diagnosis was associated with improved survival among CRC patients.
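Concordance indices of this kind assign points per recommendation met and sum them into a score. A hedged sketch with hypothetical criteria and cut-offs (the actual WCRF/AICR operationalisation has more components and study-specific scoring; everything below is illustrative):

```python
def concordance_score(bmi, activity_min_per_week, plant_g_per_day,
                      red_meat_g_per_day, alcohol_g_per_day):
    """Toy lifestyle-concordance score: 1 point per recommendation fully
    met, 0.5 for partially met, 0 otherwise. All cut-offs hypothetical."""
    score = 0.0
    # Body fatness: healthy BMI range fully met, overweight partial
    score += 1.0 if 18.5 <= bmi < 25 else (0.5 if 25 <= bmi < 30 else 0.0)
    # Physical activity (minutes of moderate activity per week)
    score += 1.0 if activity_min_per_week >= 150 else (
        0.5 if activity_min_per_week >= 75 else 0.0)
    # Plant foods (g/day of fruit and vegetables)
    score += 1.0 if plant_g_per_day >= 400 else (
        0.5 if plant_g_per_day >= 200 else 0.0)
    # Red meat limit (g/day)
    score += 1.0 if red_meat_g_per_day < 71 else (
        0.5 if red_meat_g_per_day < 100 else 0.0)
    # Alcohol limit (g/day of ethanol)
    score += 1.0 if alcohol_g_per_day <= 20 else (
        0.5 if alcohol_g_per_day <= 30 else 0.0)
    return score

# A participant meeting every toy criterion scores the maximum of 5.0
print(concordance_score(23.0, 180, 450, 60, 10))  # prints 5.0
```

In the study itself such scores are then categorised (e.g. the four categories above) and entered into Cox models as the exposure.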
Abstract:
Oleoylethanolamide (OEA) is an agonist of the peroxisome proliferator-activated receptor α (PPARα) and has been described to exhibit neuroprotective properties when administered locally in animal models of several neurological disorders, including stroke and Parkinson's disease. However, there is little information regarding the effectiveness of systemic administration of OEA in Parkinson's disease. In the present study, OEA-mediated neuroprotection was tested in in vivo and in vitro models of 6-hydroxydopamine (6-OH-DA)-induced degeneration. The in vivo model was based on the intrastriatal infusion of the neurotoxin 6-OH-DA, which generates Parkinsonian symptoms. Rats were treated 2 h before and after the 6-OH-DA treatment with systemic OEA (0.5, 1, and 5 mg/kg). The Parkinsonian symptoms were evaluated 1 and 4 wk after the development of lesions. The functional status of the nigrostriatal system was studied through tyrosine hydroxylase (TH) and heme oxygenase-1 (HO-1, oxidation marker) immunostaining as well as by monitoring the synaptophysin content. In vitro cell cultures were also treated with OEA and 6-OH-DA. As expected, our results revealed 6-OH-DA-induced neurotoxicity and behavioural deficits; however, these alterations were less severe in the animals treated with the highest dose of OEA (5 mg/kg). 6-OH-DA administration significantly reduced the striatal TH-immunoreactivity (ir) density, synaptophysin expression, and the number of nigral TH-ir neurons. Moreover, 6-OH-DA enhanced the striatal HO-1 content, which was blocked by OEA (5 mg/kg). In vitro, 0.5 and 1 μM of OEA exerted significant neuroprotection on cultured nigral neurons. These effects were abolished after blocking PPARα with the selective antagonist GW6471. In conclusion, systemic OEA protects the nigrostriatal circuit from 6-OH-DA-induced neurotoxicity through a PPARα-dependent mechanism.
Abstract:
A total of 1,021 extended-spectrum-β-lactamase-producing Escherichia coli (ESBLEC) isolates obtained in 2006 during a Spanish national survey conducted in 44 hospitals were analyzed for the presence of the O25b:H4-B2-ST131 (sequence type 131) clonal group. Overall, 195 (19%) O25b-ST131 isolates were detected, with prevalence rates ranging from 0% to 52% per hospital. Molecular characterization of 130 representative O25b-ST131 isolates showed that 96 (74%) were positive for CTX-M-15, 15 (12%) for CTX-M-14, 9 (7%) for SHV-12, 6 (5%) for CTX-M-9, 5 (4%) for CTX-M-32, and 1 (0.7%) each for CTX-M-3 and the new ESBL enzyme CTX-M-103. The 130 O25b-ST131 isolates exhibited relatively high virulence scores (mean, 14.4 virulence genes). Although the virulence profiles of the O25b-ST131 isolates were fairly homogeneous, they could be classified into four main virotypes based on the presence or absence of four distinctive virulence genes: virotypes A (22%) (afa FM955459 positive, iroN negative, ibeA negative, sat positive or negative), B (31%) (afa FM955459 negative, iroN positive, ibeA negative, sat positive or negative), C (32%) (afa FM955459 negative, iroN negative, ibeA negative, sat positive), and D (13%) (afa FM955459 negative, iroN positive or negative, ibeA positive, sat positive or negative). The four virotypes were also identified in other countries, with virotype C being overrepresented internationally. Correspondingly, an analysis of XbaI macrorestriction profiles revealed four major clusters, which were largely virotype specific. Certain epidemiological and clinical features corresponded with the virotype. Statistically significant virotype-specific associations included, for virotype B, older age and a lower frequency of infection (versus colonization), for virotype C, a higher frequency of infection, and for virotype D, younger age and community-acquired infections. 
In isolates of the O25b:H4-B2-ST131 clonal group, these findings uniquely define four main virotypes, which are internationally distributed, correspond with pulsed-field gel electrophoresis (PFGE) profiles, and exhibit distinctive clinical-epidemiological associations.