945 results for "exposure risk"


Relevance: 30.00%

Abstract:

BACKGROUND: There are limited data available on the safety profile of artemisinins in early pregnancy. They are therefore not recommended by WHO as a first-line treatment for malaria in the first trimester, owing to embryo-foetal toxicity observed in animal studies. This study assessed birth outcomes among pregnant women inadvertently exposed to artemether-lumefantrine (AL) during the first trimester, compared with those of women exposed to other anti-malarial drugs, or to no drug at all, during the same period of pregnancy. METHODS: Pregnant women with gestational age <20 weeks were recruited from Maternal Health clinics or through monthly house visits (demographic surveillance), and followed prospectively until delivery. RESULTS: 2167 pregnant women were recruited and 1783 (82.3%) completed the study until delivery. 319 (17.9%) used anti-malarials in the first trimester, of whom 172 (53.9%) used AL, 78 (24.4%) quinine, 66 (20.7%) sulphadoxine-pyrimethamine (SP) and 11 (3.4%) amodiaquine. Quinine exposure in the first trimester was associated with an increased risk of miscarriage/stillbirth (OR 2.5; 95% CI 1.3-5.1) and premature birth (OR 2.6; 1.3-5.3), whereas AL was not (OR 1.4; 0.8-2.5 for miscarriage/stillbirth and OR 0.9; 0.5-1.8 for preterm birth). Congenital anomalies were identified in four exposure groups, namely AL only (1/164 [0.6%]), quinine only (1/70 [1.4%]), SP (2/66 [3.0%]) and no anti-malarial exposure (19/1464 [1.3%]). CONCLUSION: Exposure to AL in the first trimester was more common than exposure to any other anti-malarial drug. Quinine exposure was associated with adverse pregnancy outcomes, which was not the case for the other anti-malarials. Since AL and quinine were used according to their availability rather than disease severity, the effect observed is likely related to the drug rather than to the disease itself. Even with this caveat, a change of policy from quinine to AL for the treatment of uncomplicated malaria throughout pregnancy could already be envisaged.
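
The quinine and AL comparisons above are reported as odds ratios with 95% confidence intervals. As a minimal sketch of how such an estimate is computed from a 2x2 exposure-outcome table (the event counts below are invented for illustration, not the study data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a/b = cases/non-cases among the exposed,
    c/d = cases/non-cases among the unexposed."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts: 12 miscarriages/stillbirths among 78 quinine-exposed
# pregnancies vs. 90 among 1,464 unexposed ones.
or_, lo, hi = odds_ratio_ci(12, 66, 90, 1374)
print(f"OR {or_:.1f}; 95% CI {lo:.1f}-{hi:.1f}")
```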

Relevance: 30.00%

Abstract:

OBJECTIVES: Darunavir was designed for activity against HIV resistant to other protease inhibitors (PIs). We assessed the efficacy, tolerability and risk factors for virological failure of darunavir in treatment-experienced patients seen in clinical practice. METHODS: We included all patients in the Swiss HIV Cohort Study starting darunavir after a recorded viral load above 1,000 HIV-1 RNA copies/mL and prior exposure to both PIs and non-nucleoside reverse transcriptase inhibitors. We followed these patients for up to 72 weeks, assessed virological failure using different loss-of-virological-response algorithms, and evaluated risk factors for virological failure using a Bayesian method to fit discrete Cox proportional hazards models. RESULTS: Among 130 treatment-experienced patients starting darunavir, the median age was 47 years, the median duration of HIV infection was 16 years, and 82% had received mono or dual antiretroviral therapy before starting highly active antiretroviral therapy. During a median follow-up of 45 weeks, 17% of patients stopped taking darunavir, after a median exposure of 20 weeks. In patients followed beyond 48 weeks, the rate of virological failure at 48 weeks was at most 20%. Virological failure was more likely where patients had previously failed on both amprenavir and saquinavir, and as the number of previously failed PI regimens increased. CONCLUSIONS: As a component of therapy for treatment-experienced patients, darunavir can achieve efficacy and tolerability in clinical practice similar to those seen in clinical trials. Clinicians should consider whether a patient has failed on both amprenavir and saquinavir, and the number of failed PI regimens, before prescribing darunavir.
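
As a sketch of the modelling step, the abstract's two risk factors can be entered as covariates in a proportional hazards fit. The snippet below uses a standard maximum-likelihood Cox model from the lifelines library rather than the authors' Bayesian discrete-time method, and all data are invented:

```python
import pandas as pd
from lifelines import CoxPHFitter

# Invented follow-up records: time to virological failure (weeks), failure
# indicator, and the two predictors highlighted in the abstract.
df = pd.DataFrame({
    "weeks":                [48, 24, 72, 36, 60, 12, 50, 30],
    "failed":               [0, 1, 0, 1, 0, 1, 1, 0],
    "failed_apv_and_sqv":   [0, 1, 0, 0, 1, 1, 0, 0],
    "n_failed_pi_regimens": [1, 4, 0, 3, 1, 5, 2, 2],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="weeks", event_col="failed")
cph.print_summary()  # hazard ratios and CIs for both covariates
```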

Relevance: 30.00%

Abstract:

Exposure to various pesticides has been characterized in workers and the general population, but the interpretation and assessment of biomonitoring data from a health risk perspective remains an issue. For workers, a Biological Exposure Index (BEI®) has been proposed for some substances, but most BEIs are based on urinary biomarker concentrations at the Threshold Limit Value - Time Weighted Average (TLV-TWA) airborne exposure, whereas occupational exposure can potentially occur through multiple routes, particularly skin contact (e.g. captan, chlorpyrifos, malathion). Similarly, several biomonitoring studies have assessed environmental exposure to pesticides in different populations, but dose estimates or health risks related to these environmental exposures (mainly through the diet) were rarely characterized. Recently, biological reference values (BRVs), in the form of urinary pesticide metabolite concentrations, have been proposed for both occupationally exposed workers and children. These BRVs were established using toxicokinetic models developed for each substance, and correspond to safe levels of absorption in humans regardless of the exposure scenario. The purpose of this chapter is to review a toxicokinetic modeling approach used to determine biological reference values, which are then used to facilitate health risk assessments and decision-making on occupational and environmental pesticide exposures. Such models can link the absorbed dose of the parent compound to exposure biomarkers and critical biological effects. To obtain the safest BRVs for the studied population, exposure scenarios were simulated using a conservative reference dose such as a no-observed-effect level (NOEL). The examples discussed in this chapter show the importance of knowledge about urine collection (e.g. spot samples versus complete 8-h, 12-h or 24-h collections), sampling strategies, metabolism, the relative proportions of the different metabolites in urine, the absorption fraction, the route of exposure and the background contribution of prior exposures. They also show that relying on urinary measurements of specific metabolites appears more accurate when applying this approach to occupational exposures. Conversely, relying on semi-specific metabolites (metabolites common to a category of pesticides) appears more appropriate for the health risk assessment of environmental exposures, given that the precise pesticides to which subjects are exposed are often unknown. In conclusion, the modeling approach to defining BRVs for the relevant pesticides may be useful to public health authorities for managing issues related to health risks resulting from environmental and occupational exposures to pesticides.
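
A minimal sketch of the modelling idea: run a simple toxicokinetic model at the reference dose (e.g. a NOEL) and take the predicted urinary metabolite output as the BRV. The one-compartment model and every parameter value below are placeholders, not those of any real pesticide or of the published models:

```python
import math

def urinary_metabolite_24h(dose_mg_per_kg, body_weight_kg=70,
                           f_absorbed=0.8, f_to_metabolite=0.3,
                           k_elim_per_h=0.1):
    """Metabolite (mg) excreted in a complete 24-h urine collection after
    one daily dose, assuming first-order elimination."""
    absorbed_mg = dose_mg_per_kg * body_weight_kg * f_absorbed
    fraction_excreted = 1 - math.exp(-k_elim_per_h * 24)
    return absorbed_mg * f_to_metabolite * fraction_excreted

noel_mg_per_kg = 0.1  # placeholder reference dose
brv = urinary_metabolite_24h(noel_mg_per_kg)
print(f"BRV ~ {brv:.1f} mg metabolite per 24-h urine collection")
```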

Relevance: 30.00%

Abstract:

BACKGROUND: Following wider acceptance of the 'thrifty phenotype' hypothesis and convincing evidence that early-life exposures can influence adult health even decades after the exposure, much interest has been placed on the mechanisms through which early-life exposures become biologically embedded. MATERIALS AND METHODS: In this review, we summarize the current literature on the biological embedding of early-life experiences. To this end, we conducted a literature search to identify studies investigating early-life exposures in relation to DNA methylation changes. In addition, we summarize the challenges faced in investigations of epigenetic effects, stemming from the peculiarities of this emergent and complex field. A formal systematic review and meta-analysis were not feasible given the nature of the evidence. RESULTS: We identified seven studies on early-life socio-economic circumstances, ten studies on childhood obesity and six studies on early-life nutrition, all relating early-life exposures to DNA methylation changes and meeting the stipulated inclusion criteria. The pool of evidence gathered, albeit small, favours a role of epigenetics and DNA methylation in biological embedding, but replication of findings, correction for multiple comparisons, publication bias and causality remain concerns to be addressed in future investigations. CONCLUSIONS: Based on these results, we hypothesize that epigenetics, in particular DNA methylation, is a plausible mechanism through which early-life exposures become biologically embedded. This review describes the current status of the field and acts as a stepping stone for future, better-designed investigations of how early-life exposures might become biologically embedded through epigenetic effects.

Relevance: 30.00%

Abstract:

Previous studies have demonstrated that poultry house workers are exposed to very high levels of organic dust and consequently have an increased prevalence of adverse respiratory symptoms. However, the influence of the age of broilers on bioaerosol concentrations has not been investigated. To evaluate the evolution of bioaerosol concentrations during the fattening period, bioaerosol parameters (inhalable dust, endotoxin and bacteria) were measured in 12 poultry confinement buildings in Switzerland at three different stages of the birds' growth; samples of air taken from within the breathing zones of individual poultry house employees as they caught chickens ready to be transported for slaughter were also analysed. Quantitative polymerase chain reaction (Q-PCR) was used to quantify total airborne bacteria and total airborne Staphylococcus species. Bioaerosol levels increased significantly during the fattening period of the chickens. During the task of catching mature birds, the mean inhalable dust concentration for a worker was 26 ± 1.9 mg/m³ and the endotoxin concentration was 6,198 ± 2.3 EU/m³, more than 6-fold the Swiss occupational recommended value (1,000 EU/m³). The mean exposure levels of bird catchers to total bacteria and Staphylococcus species measured by Q-PCR were also very high, reaching 53 (±2.6) × 10⁷ cells/m³ and 62 (±1.9) × 10⁶ cells/m³, respectively. It was concluded that, in the absence of protective breathing apparatus, chicken catchers in Switzerland risk exposure beyond recommended limits for all measured bioaerosol parameters. Moreover, the use of Q-PCR to estimate total and specific numbers of airborne bacteria is a promising tool for evaluating any modifications intended to improve the safety of current working practices.
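
The paired values quoted above (e.g. 6,198 ± 2.3 EU/m³) read like geometric means with geometric standard deviations, the usual summary for log-normally distributed bioaerosol data. A small sketch of that calculation, with invented measurements:

```python
import math

def geometric_stats(samples):
    """Geometric mean and geometric standard deviation."""
    logs = [math.log(x) for x in samples]
    mean_log = sum(logs) / len(logs)
    var_log = sum((l - mean_log) ** 2 for l in logs) / (len(logs) - 1)
    return math.exp(mean_log), math.exp(math.sqrt(var_log))

endotoxin = [3200.0, 5400.0, 7100.0, 9800.0, 6500.0]  # EU/m3, invented
gm, gsd = geometric_stats(endotoxin)
print(f"GM = {gm:.0f} EU/m3 (GSD {gsd:.2f}), "
      f"{gm / 1000:.1f}-fold the recommended 1,000 EU/m3")
```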

Relevance: 30.00%

Abstract:

CONTEXT: The Fracture Reduction Evaluation of Denosumab in Osteoporosis Every 6 Months (FREEDOM) extension is evaluating the long-term efficacy and safety of denosumab for up to 10 years. OBJECTIVE: The objective of the study was to report results from the first 3 years of the extension, representing up to 6 years of denosumab exposure. DESIGN, SETTING, AND PARTICIPANTS: This was a multicenter, international, open-label study of 4550 women. INTERVENTION: Women from the FREEDOM denosumab group received 3 more years of denosumab for a total of 6 years (long-term) and women from the FREEDOM placebo group received 3 years of denosumab (crossover). MAIN OUTCOME MEASURES: Bone turnover markers (BTMs), bone mineral density (BMD), fracture, and safety data are reported. RESULTS: Reductions in BTMs were maintained (long-term) or achieved rapidly (crossover) after denosumab administration. In the long-term group, BMD further increased for cumulative 6-year gains of 15.2% (lumbar spine) and 7.5% (total hip). During the first 3 years of denosumab treatment, the crossover group had significant gains in lumbar spine (9.4%) and total hip (4.8%) BMD, similar to the long-term group during the 3-year FREEDOM trial. In the long-term group, fracture incidences remained low and below the rates projected for a virtual placebo cohort. In the crossover group, 3-year incidences of new vertebral and nonvertebral fractures were similar to those of the FREEDOM denosumab group. Incidence rates of adverse events did not increase over time. Six participants had events of osteonecrosis of the jaw confirmed by adjudication. One participant had a fracture adjudicated as consistent with atypical femoral fracture. CONCLUSION: Denosumab treatment for 6 years remained well tolerated, maintained reduced bone turnover, and continued to increase BMD. Fracture incidence remained low.

Relevance: 30.00%

Abstract:

Question: Outdoor occupations can involve substantial cumulative and intense exposure to solar ultraviolet (UV) radiation, which increases the risk of skin cancer. However, little information exists on the jobs associated with intense UV exposure. The objective of this study was to characterise occupational UV exposure in a representative sample in France. Methods: A population-based survey was conducted in May-June 2012 through computer-assisted telephone interviews of people aged 25 to 69 years. Individual UV irradiation was computed by matching declared time and place of residence to satellite UV records (Eurosun project). We analysed factors influencing UV exposure (annual average and seasonal peak). Results: A total of 1442 individuals declared occupational exposure to UV, representing 18% of the population aged 25 to 69 years. Outdoor workers were more frequently men (58%), aged 40-54 (43%), with phototype III or IV (69%). The occupations associated with the highest UV exposure were construction workers (annual daily average 62.8 J/m²), gardeners (62.6), farmers (52.8), culture/art/social sciences workers (52.0) and transport workers/mail carriers (49.5). The maximum UVA exposure was found for occupations with strong seasonality of exposure: culture/art/social sciences work (98.1 J/m²), construction work (97.2), gardening (96.7) and farming (95.0). Significant factors associated with high occupational UV exposure were gender (men vs. women: 53.6 vs. 42.6), phototype (IV vs. I: 51.9 vs. 45.5) and taking lunch outdoors (always vs. never: 59.8 vs. 48.6). Conclusion: Our study showed that some occupations, such as farming, gardening and construction work, were associated with particularly intense UV exposure. Other, unexpected occupations were also associated with high UV exposure, such as transport workers, mail carriers and culture/art/social sciences workers.
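
A toy sketch of the linkage step described in the Methods: declared place and period are joined to gridded satellite UV values, then averaged per subject. The table layout, column names and numbers are all invented, not the Eurosun data:

```python
import pandas as pd

# Invented survey answers: where and when each subject worked outdoors.
survey = pd.DataFrame({
    "subject": [1, 1, 2],
    "place":   ["Lyon", "Lyon", "Nice"],
    "month":   ["2012-05", "2012-06", "2012-05"],
    "outdoor_hours_per_day": [4.0, 6.0, 2.5],
})

# Invented satellite record: mean UV dose rate by place and month.
uv = pd.DataFrame({
    "place": ["Lyon", "Lyon", "Nice"],
    "month": ["2012-05", "2012-06", "2012-05"],
    "uv_j_per_m2_per_h": [15.0, 18.0, 20.0],
})

merged = survey.merge(uv, on=["place", "month"])
merged["daily_dose_j_per_m2"] = (merged["outdoor_hours_per_day"]
                                 * merged["uv_j_per_m2_per_h"])
print(merged.groupby("subject")["daily_dose_j_per_m2"].mean())
```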

Relevance: 30.00%

Abstract:

OBJECTIVE: To assess the contribution of modifiable risk factors to social inequalities in the incidence of type 2 diabetes when these factors are measured at study baseline or repeatedly over follow-up, and when long-term exposure is accounted for. DESIGN: Prospective cohort study with risk factors (health behaviours (smoking, alcohol consumption, diet, and physical activity), body mass index, and biological risk markers (systolic blood pressure, triglycerides, and high density lipoprotein cholesterol)) measured four times and diabetes status assessed seven times between 1991-93 and 2007-09. SETTING: Civil service departments in London (Whitehall II study). PARTICIPANTS: 7237 adults without diabetes (mean age 49.4 years; 2196 women). MAIN OUTCOME MEASURES: Incidence of type 2 diabetes and contribution of risk factors to its association with socioeconomic status. RESULTS: Over a mean follow-up of 14.2 years, 818 incident cases of diabetes were identified. Participants in the lowest occupational category had a 1.86-fold (hazard ratio 1.86, 95% confidence interval 1.48 to 2.32) greater risk of developing diabetes relative to those in the highest occupational category. Health behaviours and body mass index explained 33% (-1% to 78%) of this socioeconomic differential when risk factors were assessed at study baseline (attenuation of the hazard ratio from 1.86 to 1.51), 36% (22% to 66%) when they were assessed repeatedly over the follow-up (attenuated hazard ratio 1.48), and 45% (28% to 75%) when long-term exposure over the follow-up was accounted for (attenuated hazard ratio 1.41). With additional adjustment for biological risk markers, a total of 53% (29% to 88%) of the socioeconomic differential was explained (attenuated hazard ratio 1.35, 1.05 to 1.72). CONCLUSIONS: Modifiable risk factors such as health behaviours and obesity, when measured repeatedly over time, explain almost half of the social inequalities in the incidence of type 2 diabetes. This is more than was seen in previous studies based on a single measurement of risk factors.
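
One common convention computes the "percentage explained" on the log hazard ratio scale; the 33-53% figures above are consistent with it, up to rounding of the reported hazard ratios. A minimal check:

```python
import math

def percent_explained(hr_base, hr_adjusted):
    """Share of the excess log hazard removed by adjustment."""
    return 100 * (math.log(hr_base) - math.log(hr_adjusted)) / math.log(hr_base)

for hr_adj in (1.51, 1.48, 1.41, 1.35):
    print(f"HR 1.86 -> {hr_adj}: {percent_explained(1.86, hr_adj):.0f}% explained")
```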

Relevance: 30.00%

Abstract:

BACKGROUND: Whether nucleoside reverse transcriptase inhibitors increase the risk of myocardial infarction in HIV-infected individuals is unclear. Our aim was to explore whether exposure to such drugs was associated with an excess risk of myocardial infarction in a large, prospective observational cohort of HIV-infected patients. METHODS: We used Poisson regression models to quantify the relation between cumulative, recent (currently or within the preceding 6 months), and past use of zidovudine, didanosine, stavudine, lamivudine, and abacavir and development of myocardial infarction in 33,347 patients enrolled in the D:A:D study. We adjusted for cardiovascular risk factors that are unlikely to be affected by antiretroviral therapy, cohort, calendar year, and use of other antiretrovirals. FINDINGS: Over 157,912 person-years, 517 patients had a myocardial infarction. We found no associations between the rate of myocardial infarction and cumulative or recent use of zidovudine, stavudine, or lamivudine. By contrast, recent, but not cumulative, use of abacavir or didanosine was associated with an increased rate of myocardial infarction (compared with those with no recent use of the drugs, relative rate 1.90, 95% CI 1.47-2.45 [p=0.0001] with abacavir and 1.49, 1.14-1.95 [p=0.003] with didanosine); rates were not significantly increased in those who stopped these drugs more than 6 months previously compared with those who had never received them. After adjustment for predicted 10-year risk of coronary heart disease, recent use of both didanosine and abacavir remained associated with increased rates of myocardial infarction (1.49, 1.14-1.95 [p=0.004] with didanosine; 1.89, 1.47-2.45 [p=0.0001] with abacavir). INTERPRETATION: There is an increased risk of myocardial infarction in patients exposed to abacavir or didanosine within the preceding 6 months. The excess risk does not seem to be explained by underlying established cardiovascular risk factors and was not present beyond 6 months after drug cessation.
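
A toy sketch of the Poisson regression described in the Methods, with a person-years offset so that coefficients are interpretable as log relative rates. The aggregate counts are invented, and the real models adjusted for many more covariates:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Invented aggregate data: MI events and follow-up by recent abacavir use.
df = pd.DataFrame({
    "recent_abacavir": [0, 1],
    "events":          [300, 217],
    "person_years":    [120000.0, 37912.0],
})

X = sm.add_constant(df[["recent_abacavir"]])
fit = sm.GLM(df["events"], X, family=sm.families.Poisson(),
             offset=np.log(df["person_years"])).fit()
print(np.exp(fit.params["recent_abacavir"]))  # relative rate, recent vs. no recent use
```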

Relevance: 30.00%

Abstract:

The 3-year FREEDOM trial assessed the efficacy and safety of 60 mg denosumab every 6 months for the treatment of postmenopausal women with osteoporosis. Participants who completed the FREEDOM trial were eligible to enter an extension to continue the evaluation of denosumab efficacy and safety for up to 10 years. For the extension results presented here, women from the FREEDOM denosumab group had 2 more years of denosumab treatment (long-term group) and those from the FREEDOM placebo group had 2 years of denosumab exposure (cross-over group). We report results for bone turnover markers (BTMs), bone mineral density (BMD), fracture rates, and safety. A total of 4550 women enrolled in the extension (2343 long-term; 2207 cross-over). Reductions in BTMs were maintained (long-term group) or occurred rapidly (cross-over group) following denosumab administration. In the long-term group, lumbar spine and total hip BMD increased further, resulting in 5-year gains of 13.7% and 7.0%, respectively. In the cross-over group, BMD increased at the lumbar spine (7.7%) and total hip (4.0%) during the 2-year denosumab treatment. Yearly fracture incidences for both groups were below rates observed in the FREEDOM placebo group and below rates projected for a "virtual untreated twin" cohort. Adverse events did not increase with long-term denosumab administration. Two adverse events in the cross-over group were adjudicated as consistent with osteonecrosis of the jaw. Five-year denosumab treatment of women with postmenopausal osteoporosis maintained BTM reduction and increased BMD, and was associated with low fracture rates and a favorable risk/benefit profile.

Relevance: 30.00%

Abstract:

BACKGROUND: Exposure to combination antiretroviral therapy (cART) can lead to important metabolic changes and an increased risk of coronary heart disease (CHD). Computerized clinical decision support systems have been advocated to improve the management of patients at risk for CHD, but it is unclear whether such systems actually reduce that risk. METHODS: We conducted a cluster-randomized trial within the Swiss HIV Cohort Study (SHCS) of HIV-infected patients aged 18 years or older, not pregnant, and receiving cART for >3 months. We randomized 165 physicians to either guidelines for CHD risk factor management alone or guidelines plus CHD risk profiles. Risk profiles included the Framingham risk score, CHD drug prescriptions and CHD events based on biannual assessments, and were continuously updated by the SHCS data centre and integrated into patient charts by study nurses. Outcome measures were total cholesterol, systolic and diastolic blood pressure and the Framingham risk score. RESULTS: A total of 3,266 patients (80% of those eligible) had a final assessment of the primary outcome at least 12 months after the start of the trial. Mean (95% confidence interval) differences between patients whose physicians received CHD risk profiles and guidelines, rather than guidelines alone, were total cholesterol -0.02 mmol/l (-0.09 to 0.06), systolic blood pressure -0.4 mmHg (-1.6 to 0.8), diastolic blood pressure -0.4 mmHg (-1.5 to 0.7) and Framingham 10-year risk score -0.2% (-0.5 to 0.1). CONCLUSIONS: Systematic computerized routine provision of CHD risk profiles in addition to guidelines does not significantly improve risk factors for CHD in patients on cART.

Relevance: 30.00%

Abstract:

The aim of this study was to test the short-term effects of using hypoxic rooms before a simulated running event. Thirteen subjects (29 ± 4 years) lived in a hypoxic dormitory (1,800 m) for either 2 nights (n = 6) or 2 days + nights (n = 7) before performing a 1,500-m treadmill test. Performance, expired gases, and muscle electrical activity were recorded and compared with a control session performed 1 week before or after the altitude session (random order). Arterial blood samples were collected before and after altitude exposure. Arterial pH and hemoglobin concentration increased (p < 0.05) and PCO2 decreased (p < 0.05) upon exiting the room. However, these parameters returned (p < 0.05) to basal levels within a few hours. During exercise, mean ventilation (VE) was higher (p < 0.05) after 2 nights or 2 days + nights of moderate altitude exposure (113.0 ± 27.2 L/min) than in the control run (108.6 ± 27.8 L/min), without any modification in performance (360 ± 45 vs. 360 ± 42 seconds, respectively) or muscle electrical activity. This elevated VE during the run after hypoxic exposure was probably due to the persistence of the hypoxic ventilatory response. However, from a practical point of view, although exposure in a normobaric altitude-simulating chamber induced some hematological adaptations, these disappeared within a few hours and failed to provide any benefit during the subsequent 1,500-m run.

Relevance: 30.00%

Abstract:

BACKGROUND: Data on a link between HCV or HBV infection and the development of cardiovascular disease among HIV-negative and HIV-positive individuals are conflicting. We sought to investigate the association between HBV or HCV infection and myocardial infarction in HIV-infected individuals. METHODS: The prospective observational database of the D:A:D collaboration of 11 cohorts of HIV-infected individuals, including 212 clinics in Europe, the United States and Australia, was used. Multivariate Poisson regression was used to assess the effect of HCV or HBV infection on the development of myocardial infarction after adjustment for potential confounders, including cardiovascular risk factors, diabetes mellitus and exposure to antiretroviral therapy. RESULTS: Of 33,347 individuals, 517 developed a myocardial infarction over 157,912 person-years, an event rate of 3.3 events/1,000 person-years (95% confidence interval [CI] 3.0-3.6). Event rates (95% CIs) per 1,000 person-years were 3.3 (3.0-3.7) in those who were HCV-seronegative and 2.7 (2.2-3.3) in those who were HCV-seropositive, and 3.2 (2.8-3.5), 4.2 (3.1-5.2) and 2.8 (1.8-3.9) in those who were HBV-seronegative, had inactive HBV infection or had active HBV infection, respectively. After adjustment, there was no association between HCV seropositivity (rate ratio 0.86 [95% CI 0.62-1.19]), inactive HBV infection (rate ratio 1.07 [95% CI 0.79-1.43]) or active HBV infection (rate ratio 0.78 [95% CI 0.52-1.15]) and the development of myocardial infarction. CONCLUSIONS: We found no association between HBV or HCV coinfection and the development of myocardial infarction among HIV-infected individuals.
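
The crude rate quoted above can be verified directly from the counts given; a minimal sketch using the standard log-scale normal approximation for a Poisson rate:

```python
import math

events, person_years = 517, 157912
rate = 1000 * events / person_years  # per 1,000 person-years

se_log = 1 / math.sqrt(events)  # SE of the log rate
lo = rate * math.exp(-1.96 * se_log)
hi = rate * math.exp(1.96 * se_log)
print(f"{rate:.1f} per 1,000 person-years (95% CI {lo:.1f}-{hi:.1f})")
# -> 3.3 (3.0-3.6), matching the abstract
```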

Relevance: 30.00%

Abstract:

What determines risk-bearing capacity and the amount of leverage in financial markets? This paper uses unique micro-data on collateralized lending contracts during a period of financial distress to address this question. An investor syndicate speculating in English stocks went bankrupt in 1772. Using hand-collected information from Dutch notarial archives, we examine changes in lenders' behavior following exposure to potential (but not actual) losses. Before the distress episode, financiers that lent to the ill-fated syndicate were indistinguishable from the rest. Afterwards, they behaved differently: they lent with much higher haircuts. Only lenders exposed to the failed syndicate altered their behavior. The differential change is remarkable since the distress was public knowledge, and because none of the lenders suffered actual losses: all financiers were repaid in full. Interest rates were also unaffected; the market balanced solely through changes in collateral requirements. Our findings are consistent with a heterogeneous-beliefs interpretation of leverage. They also suggest that individual experience can quickly modify the level of leverage in a market.
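
For readers unfamiliar with the mechanics: a haircut is the share of the collateral's value the lender will not lend against, and it caps the borrower's attainable leverage. A stylized illustration with invented numbers, not figures from the archival data:

```python
def haircut(collateral_value, loan_amount):
    """Fraction of the collateral's value the lender withholds."""
    return 1 - loan_amount / collateral_value

def max_leverage(h):
    """Largest position per unit of own capital when borrowing at haircut h."""
    return 1 / h

# Lending 90 against 100 of stock is a 10% haircut.
print(f"haircut: {haircut(100.0, 90.0):.0%}")
for h in (0.10, 0.25):
    print(f"haircut {h:.0%} -> leverage capped at {max_leverage(h):.0f}x")
```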