307 results for Cumulative exposure
at Université de Lausanne, Switzerland
Abstract:
Although polychlorinated biphenyls (PCBs) have been banned in many countries for more than three decades, exposures to PCBs continue to be of concern due to their long half-lives and carcinogenic effects. In National Institute for Occupational Safety and Health studies, we are using semiquantitative plant-specific job exposure matrices (JEMs) to estimate historical PCB exposures for workers (n = 24,865) exposed to PCBs from 1938 to 1978 at three capacitor manufacturing plants. A subcohort of these workers (n = 410) employed in two of these plants had serum PCB concentrations measured up to four times between 1976 and 1989. Our objectives were to evaluate the strength of association between an individual worker's measured serum PCB levels and the same worker's cumulative exposure estimated through 1977 with the (1) JEM and (2) duration of employment, and to calculate the explained variance the JEM provides for serum PCB levels using (3) simple linear regression. Consistent, strong, and statistically significant associations were observed between the cumulative exposures estimated with the JEM and serum PCB concentrations for all years. The strength of association between duration of employment and serum PCBs was good for highly chlorinated (Aroclor 1254/HPCB) but not less chlorinated (Aroclor 1242/LPCB) PCBs. In the simple regression models, cumulative occupational exposure estimated using the JEMs explained 14-24% of the variance in Aroclor 1242/LPCB and 22-39% in Aroclor 1254/HPCB serum concentrations. We regard the cumulative exposure estimated with the JEM as a better estimate of PCB body burdens than serum concentrations quantified as Aroclor 1242/LPCB and Aroclor 1254/HPCB.
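As a rough illustration of step (3), here is a minimal sketch, assuming hypothetical worker-level data and an ordinary least-squares fit (not the study's actual data or modelling choices), of regressing serum PCB concentration on JEM-based cumulative exposure and reporting the variance explained:

```python
import numpy as np

# Hypothetical data: JEM-based cumulative exposure (intensity x years, summed
# over each worker's job history) and the same workers' measured serum PCBs.
cumulative_exposure = np.array([12.0, 35.5, 8.2, 60.1, 22.4, 47.9])   # unit-years
serum_pcb = np.array([4.1, 9.8, 3.0, 15.2, 6.9, 12.5])                # e.g. ppb

# Simple linear regression: serum = a + b * cumulative exposure.
X = np.column_stack([np.ones_like(cumulative_exposure), cumulative_exposure])
coef, *_ = np.linalg.lstsq(X, serum_pcb, rcond=None)

# R^2 = share of serum-PCB variance explained by the JEM estimate
# (the 14-24% / 22-39% figures quoted in the abstract are this quantity).
pred = X @ coef
r2 = 1 - np.sum((serum_pcb - pred) ** 2) / np.sum((serum_pcb - serum_pcb.mean()) ** 2)
print(f"slope = {coef[1]:.3f}, R^2 = {r2:.2f}")
```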
Abstract:
OBJECTIVES: Agriculture is considered one of the occupations most at risk of acute or chronic respiratory problems. The aim of our study was to determine the level of exposure to organic dust at which respiratory function becomes chronically affected in workers handling wheat grain or straw, and to test whether some of these working populations can recover their respiratory function after a decrease in exposure. METHOD: 87 workers exposed to wheat dust (farmers, harvesters, silo workers and livestock farmers) and 62 non-exposed workers were included in a longitudinal study comprising two visits at a six-month interval, with lung function measurements and symptom questionnaires. Cumulative and mean exposure to wheat dust were generated from each worker's detailed work history and a task-exposure matrix based on task-specific exposure measurements. Immunoglobulins (IgG and IgE) specific to the most frequent microorganisms in wheat dust were determined. RESULTS: FEV1 decreased significantly with cumulative and mean exposure levels. The estimated decrease was close to 200 mL per year of high exposure, which corresponds roughly to wheat dust levels higher than 10 mg/m³. Peak expiratory flow and several acute symptoms correlated with recent exposure level. Recovery of respiratory function six months after exposure to wheat dust and the evolution of exposure indicators in workers' blood (IgG and IgE) will be discussed. CONCLUSIONS: These results show a chronic effect of exposure to wheat dust on bronchial obstruction. Short-term effects and reversibility will be assessed using the full study results.
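A minimal sketch of how cumulative and mean exposure indices can be built from a work history and a task-exposure matrix; the tasks, concentrations and the helper function are illustrative assumptions, not the study's actual matrix:

```python
# Hypothetical task-exposure matrix: mean wheat-dust concentration (mg/m3) per task.
task_exposure = {"harvesting": 12.0, "silo_handling": 8.5, "livestock": 2.0}

# Hypothetical work history: (task, years spent on that task) for one worker.
work_history = [("harvesting", 5.0), ("silo_handling", 3.0), ("livestock", 10.0)]

def exposure_indices(history, tem):
    """Cumulative exposure (mg/m3 x years) and time-weighted mean exposure (mg/m3)."""
    cumulative = sum(tem[task] * years for task, years in history)
    total_years = sum(years for _, years in history)
    return cumulative, cumulative / total_years

cum, mean = exposure_indices(work_history, task_exposure)
print(f"cumulative = {cum:.1f} mg/m3-years, mean = {mean:.2f} mg/m3")
```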
Abstract:
BACKGROUND: Patients with HIV exposed to the antiretroviral drug abacavir may have an increased risk of cardiovascular disease (CVD). There is concern that this association arises because of a channeling bias. Even if exposure is a risk, it is not clear how that risk changes as exposure cumulates. METHODS: We assess the effect of exposure to abacavir on the risk of CVD events in the Swiss HIV Cohort Study. We use a new marginal structural Cox model to estimate the effect of abacavir as a flexible function of past exposures while accounting for risk factors that potentially lie on a causal pathway between exposure to abacavir and CVD. RESULTS: A total of 11,856 patients were followed for a median of 6.6 years; 365 patients had a CVD event (4.6 events per 1000 patient-years). In a conventional Cox model, recent--but not cumulative--exposure to abacavir increased the risk of a CVD event. In the new marginal structural Cox model, continued exposure to abacavir during the past 4 years increased the risk of a CVD event (hazard ratio = 2.06; 95% confidence interval: 1.43 to 2.98). The estimated function for the effect of past exposures suggests that exposure during the past 6-36 months caused the greatest increase in risk. CONCLUSIONS: Abacavir increases the risk of a CVD event: the effect of exposure is not immediate, rather the risk increases as exposure cumulates over the past few years. This gradual increase in risk is not consistent with a rapidly acting mechanism, such as acute inflammation.
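A minimal sketch of the person-period covariates such an analysis needs (cumulative exposure and exposure in the preceding interval); the data frame and column names are assumptions, and the inverse-probability weighting and Cox fit that make the model a marginal structural model are only indicated in the comments:

```python
import pandas as pd

# Hypothetical person-period data: one row per patient per 6-month interval,
# with an indicator of abacavir use during that interval.
df = pd.DataFrame({
    "patient": [1, 1, 1, 2, 2],
    "interval": [0, 1, 2, 0, 1],
    "abacavir": [1, 1, 0, 0, 1],
})

df = df.sort_values(["patient", "interval"])
# Cumulative exposure up to and including the current interval (intervals of use).
df["cum_abacavir"] = df.groupby("patient")["abacavir"].cumsum()
# "Recent" exposure: use during the previous interval (lag 1), 0 at baseline.
df["recent_abacavir"] = df.groupby("patient")["abacavir"].shift(1).fillna(0)

print(df)
# These covariates would then enter a (weighted) time-varying Cox model, e.g.
# lifelines' CoxTimeVaryingFitter, with inverse-probability-of-treatment
# weights supplying the marginal structural part of the analysis.
```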
Abstract:
BACKGROUND: Socioeconomic adversity in early life has been hypothesized to "program" a vulnerable phenotype with exaggerated inflammatory responses, thereby increasing the risk of developing type 2 diabetes in adulthood. The aim of this study is to test this hypothesis by assessing the extent to which the association between lifecourse socioeconomic status and type 2 diabetes incidence is explained by chronic inflammation. METHODS AND FINDINGS: We use data from the British Whitehall II study, a prospective occupational cohort of adults established in 1985. The inflammatory markers C-reactive protein and interleukin-6 were measured repeatedly and type 2 diabetes incidence (new cases) was monitored over an 18-year follow-up (from 1991-1993 until 2007-2009). Our analytical sample consisted of 6,387 non-diabetic participants (1,818 women), of whom 731 (207 women) developed type 2 diabetes over the follow-up. Cumulative exposure to low socioeconomic status from childhood to middle age was associated with an increased risk of developing type 2 diabetes in adulthood (hazard ratio [HR] = 1.96, 95% confidence interval: 1.48-2.58 for low cumulative lifecourse socioeconomic score and HR = 1.55, 95% confidence interval: 1.26-1.91 for low-low socioeconomic trajectory). 25% of the excess risk associated with cumulative socioeconomic adversity across the lifecourse and 32% of the excess risk associated with the low-low socioeconomic trajectory were attributable to chronically elevated inflammation (95% confidence intervals 16%-58%). CONCLUSIONS: In the present study, chronic inflammation explained a substantial part of the association between lifecourse socioeconomic disadvantage and type 2 diabetes. Further studies should be performed to confirm these findings in population-based samples, as the Whitehall II cohort is not representative of the general population, and to examine the extent to which social inequalities attributable to chronic inflammation are reversible.
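One common way to obtain attenuation figures of this kind is to compare the hazard ratio before and after adjusting for the mediator; a minimal sketch with illustrative numbers (the paper's exact estimation and confidence-interval procedure may differ):

```python
def percent_excess_risk_explained(hr_unadjusted, hr_adjusted):
    """Share of the excess risk (HR - 1) removed by adjustment for the mediator."""
    return 100 * (hr_unadjusted - hr_adjusted) / (hr_unadjusted - 1)

# Illustrative example: an HR of 1.96 attenuating to 1.72 after adjustment
# for chronic inflammation corresponds to roughly 25% of the excess risk.
print(f"{percent_excess_risk_explained(1.96, 1.72):.0f}%")
```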
Abstract:
BACKGROUND: Here, we aimed to determine the prevalence of erectile dysfunction (ED) among HIV-infected men and its association with components of antiretroviral therapy. METHODS: Cross-sectional data on sexual dysfunction were collected in the Swiss HIV Cohort Study (SHCS) between December 2009 and November 2010. Multilevel logistic regression models were used to estimate the association between ED and exposure to 24 different antiretroviral drugs from four drug classes. RESULTS: During the study period, 5,194 of 5,539 eligible men in the SHCS had a follow-up visit; 4,064 men answered a question on ED for the first time. Among these men, ED was experienced often by 459 (11%), sometimes by 543 (13%), rarely by 389 (10%) and never by 2,526 (62%); 147 (4%) did not know. ED was associated with older age, an earlier HIV diagnosis and depression. No association was found with any drug class; however, ED was associated with cumulative exposure to either zalcitabine (OR 1.29 per year of use; 95% CI 1.07, 1.55) or enfuvirtide (OR 1.28; 95% CI 1.08, 1.52). CONCLUSIONS: Around 1 in 10 men in the SHCS reported often experiencing ED. We found no association between ED and any drug class, but those exposed to zalcitabine or enfuvirtide (drugs no longer or rarely used) were more likely to report ED; this second association was probably not causal.
Abstract:
BACKGROUND: Obesity is increasing worldwide because developing countries are adopting Western high-fat foods and sedentary lifestyles. In parallel, in many of them, hypertension is rising more rapidly, particularly with age, than in Western countries. OBJECTIVE: To assess the relationship between adiposity and blood pressure (BP) in a developing country with high average BP (The Seychelles, Indian Ocean, population mainly of African origin) in comparison to a developed country with low average BP (Switzerland, population mainly of Caucasian origin). DESIGN: Cross-sectional health examination surveys based on population random samples. SETTING: The main Seychelles island (Mahé) and two Swiss regions (Vaud-Fribourg and Ticino). SUBJECTS: Three thousand one hundred and sixteen adults (age range 35-64) untreated for hypertension. MEASUREMENTS: Body mass index (BMI), waist circumference (WC), waist-to-hip ratio (WHR), systolic and diastolic blood pressure (SBP and DBP, mean of two measures). METHODS: Scatterplot smoothing techniques and gender-specific linear regression models. RESULTS: On average, SBP and DBP were found to increase linearly over the whole variation range of BMI, WHR and WC. A modest but statistically significant linear association was found between each indicator of adiposity and BP levels in separate regression models controlling for age. The regression coefficients were not significantly different between the Seychelles and the two Swiss regions, but were generally higher in women than in men. For men, a gain of 1.7 kg/m² in BMI, of 4.5 cm in WC or of 3.4% in WHR corresponded to an elevation of 1 mmHg in SBP. For women, the corresponding figures were 1.25 kg/m², 2.5 cm and 1.8%, respectively. Regression coefficients for age reflected a greater effect of this variable on both SBP and DBP in the Seychelles than in Switzerland. CONCLUSION: These findings suggest a stable linear relation of adiposity with BP, independent of age and body fat distribution, across developed and developing countries. The more rapid increase of BP with age observed in developing countries is likely to reflect higher genetic susceptibility and/or higher cumulative exposure to a risk factor other than adiposity.
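The reported adiposity increments are the reciprocals of the underlying regression slopes; a quick check of that arithmetic, assuming the figures quoted above are per 1 mmHg of SBP:

```python
# BMI gain associated with +1 mmHg of SBP, as reported for men and women.
bmi_per_mmhg = {"men": 1.7, "women": 1.25}   # kg/m^2

# Equivalent regression slopes (mmHg of SBP per unit of BMI).
for sex, gain in bmi_per_mmhg.items():
    print(f"{sex}: {1 / gain:.2f} mmHg per kg/m^2")
# men: 0.59 mmHg per kg/m^2, women: 0.80 mmHg per kg/m^2
```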
Abstract:
BACKGROUND: Metabolic complications, including cardiovascular events and diabetes mellitus (DM), are a major long-term concern in human immunodeficiency virus (HIV)-infected individuals. Recent genome-wide association studies have reliably associated multiple single nucleotide polymorphisms (SNPs) to DM in the general population. METHODS: We evaluated the contribution of 22 SNPs identified in genome-wide association studies and of longitudinally measured clinical factors to DM. We genotyped all 94 white participants in the Swiss HIV Cohort Study who developed DM from 1 January 1999 through 31 August 2009 and 550 participants without DM. Analyses were based on 6054 person-years of follow-up and 13,922 measurements of plasma glucose. RESULTS: The contribution to DM risk explained by SNPs (14% of DM variability) was larger than the contribution to DM risk explained by current or cumulative exposure to different antiretroviral therapy combinations (3% of DM variability). Participants with the most unfavorable genetic score (representing 12% and 19% of the study population, respectively, when applying 2 different genetic scores) had incidence rate ratios for DM of 3.80 (95% confidence interval [CI], 2.05-7.06) and 2.74 (95% CI, 1.53-4.88), respectively, compared with participants with a favorable genetic score. However, addition of genetic data to clinical risk factors that included body mass index only slightly improved DM prediction. CONCLUSIONS: In white HIV-infected persons treated with antiretroviral therapy, the DM effect of genetic variants was larger than the potential toxic effects of antiretroviral therapy. SNPs contributed significantly to DM risk, but their addition to a clinical model improved DM prediction only slightly, similar to studies in the general population.
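A minimal sketch of the kind of weighted additive genetic score used to stratify participants; the SNP identifiers, weights and genotypes below are placeholders, not the study's two actual scores:

```python
# Hypothetical risk-allele counts (0/1/2) for a few DM-associated SNPs,
# and per-allele weights (e.g. published log odds ratios). All placeholders.
genotypes = {"rs0000001": 2, "rs0000002": 1, "rs0000003": 0}
weights   = {"rs0000001": 0.15, "rs0000002": 0.10, "rs0000003": 0.20}

# Weighted additive genetic score: sum of risk alleles x per-allele weight.
score = sum(genotypes[snp] * weights[snp] for snp in genotypes)
print(f"genetic score = {score:.2f}")
# Participants above a chosen percentile of this score would form the
# "unfavorable" group compared in the incidence rate ratios above.
```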
Abstract:
Infective endocarditis (IE) is a life-threatening disease that should be prevented whenever possible. Over the last 50 years, American and European guidelines for IE prophylaxis recommended that at-risk patients take a preventive antibiotic before undergoing a medical or surgical procedure likely to induce transient bacteremia. However, recent epidemiological studies have shown that most cases of IE occur outside of any medical or surgical procedure, and regardless of whether antibiotic prophylaxis was taken. IE could therefore result from the accumulation of spontaneous low-grade bacteremia associated with everyday activities, such as tooth brushing in the case of streptococci, or arising from colonized tissues or infected catheters in the case of staphylococci. Accordingly, the international guidelines for IE prophylaxis have been revised and now propose a drastic reduction in antibiotic use. However, the IE risk represented by cumulative low-grade bacteremia had not been demonstrated experimentally. We developed a new model of experimental IE induced by continuous inoculation of a small quantity of bacteria, simulating cumulative low-grade bacteremia in humans, and compared infection by Streptococcus gordonii and Staphylococcus aureus in this model with that in the model of IE induced by brief, high-grade bacteremia. We showed that, after injection of an equal quantity of bacteria, the number of infected vegetations was similar with both types of inoculation. These experimental results confirmed the hypothesis that cumulative exposure to low-grade bacteremia, outside of any medical or surgical procedure, represents a risk for the development of IE, as suggested by the epidemiological studies. In addition, these results validated the new guidelines for IE prophylaxis, which drastically limit antibiotic use. However, these new guidelines leave a large proportion (> 90%) of potential IE cases without preventive alternatives, and new prophylactic strategies must be investigated. The new experimental IE model provides a realistic setting in which to study potential new prophylactic measures applied to cumulative exposure to low-grade bacteremia. In a context of repeated spontaneous bacteremia, antibiotics cannot solve the problem of IE prevention. We therefore studied an alternative preventive approach using antiplatelet agents. The rationale for this approach was that platelets are key components in the formation of cardiac vegetations, and that bacteria able to interact with platelets are more prone to induce IE. The antiplatelet agents used were aspirin (a COX1 inhibitor), ticlopidine (an inhibitor of P2Y12, the ADP receptor), and eptifibatide and abciximab, two inhibitors of GPIIb/IIIa, the platelet fibrinogen receptor. The anticoagulants were dabigatran etexilate, a thrombin inhibitor, and acenocoumarol, a vitamin K antagonist.
Aspirin, ticlopidine or eptifibatide alone did not prevent valve infection (> 75% of animals infected). In contrast, the combination of aspirin and ticlopidine, as well as abciximab, protected 45% to 88% of animals against IE due to S. gordonii and S. aureus. The antithrombotic dabigatran etexilate protected 75% of rats against IE due to S. aureus, but not (< 30% protection) against IE due to S. gordonii. Acenocoumarol had no effect against either organism. Overall, these results suggest a possible role for antiplatelet agents and dabigatran etexilate in IE prophylaxis in a context of recurrent low-grade bacteremia. However, the beneficial effect of antiplatelet agents must be weighed against the bleeding risk inherent in these molecules and the fact that platelets play an important role in host defenses against endovascular infections. In addition, the potential dual benefit of dabigatran etexilate should be examined in patients with prosthetic valves, who require lifelong anticoagulation and in whom S. aureus IE is associated with a mortality of nearly 50%. Since the antiplatelet and antithrombotic approach may have limitations, another prophylactic strategy could be vaccination against surface adhesins of the pathogens. In S. aureus, the fibrinogen-binding protein clumping factor A (ClfA) and fibronectin-binding protein A (FnbpA) are virulence factors required for the initiation and progression of IE. They therefore represent potential targets for the development of vaccines against this infection. Recently, numerous publications have described the use of the bacterium Lactococcus lactis as a vector for delivering bacterial antigens in vivo, and this approach could be a vaccination strategy against bacterial infections. We explored the effect of immunization with recombinant L. lactis expressing ClfA, FnbpA, or ClfA together with a truncated form of FnbpA (Fnbp, comprising only the fibronectin-binding domain and lacking the fibrinogen-binding A domain [L. lactis ClfA/Fnbp]) in the prophylaxis of experimental S. aureus IE. L. lactis ClfA was used as an immunization agent against the strain S. aureus Newman (which expresses ClfA but not FnbpA). L. lactis ClfA, L. lactis FnbpA, and L. lactis ClfA/Fnbp were used as immunization agents against an IE isolate, S. aureus P8 (expressing both ClfA and FnbpA). Immunization with L. lactis ClfA generated functional anti-ClfA antibodies, able to block the binding of S. aureus Newman to fibrinogen in vitro and to protect 13/19 (69%) animals from IE due to S. aureus Newman (P < 0.05 compared with controls). Immunization with L. lactis ClfA, L. lactis FnbpA, or L. lactis ClfA/Fnbp generated antibodies against each of these antigens. However, these did not block the adhesion of S. aureus P8 to fibrinogen and fibronectin in vitro. Moreover, immunization with L. lactis ClfA or L. lactis FnbpA proved ineffective in vivo (< 10% of animals protected from IE), and immunization with L. lactis ClfA/Fnbp provided limited protection against IE (8/23 animals protected; P < 0.05 compared with controls) after challenge with S. aureus P8.
Altogether, these results indicate that L. lactis is an efficient system for presenting antigens in vivo and is potentially useful for preventing S. aureus IE. However, the repertoire of S. aureus surface proteins able to elicit an effective panel of antibodies remains to be determined. In summary, our study demonstrated experimentally, for the first time, that repeated low-grade bacteremia, simulating the bacteremia that occurs, for example, during activities of daily living, can induce experimental IE at a rate similar to that induced by the high-grade bacteremia that follows a medical intervention. In this context, where the use of antibiotics is not reasonable, we also showed that other prophylactic measures, such as antiplatelet or antithrombotic agents, or vaccination using L. lactis as a vector for bacterial antigens, are promising alternatives that deserve further study. Thesis Summary: Infective endocarditis (IE) is a life-threatening disease that should be prevented whenever possible. Over the last 50 years, guidelines for IE prophylaxis proposed the use of antibiotics in patients undergoing dental or medico-surgical procedures that might induce high but transient bacteremia. However, recent epidemiological studies indicate that IE occurs independently of medico-surgical procedures and regardless of whether patients had taken antibiotic prophylaxis, i.e., through cumulative exposure to random low-grade bacteremia associated with daily activities (e.g. tooth brushing) in the case of oral streptococci, or with a colonized site or infected device in the case of staphylococci. Accordingly, the most recent American and European guidelines for IE prophylaxis were revisited and updated to drastically restrain antibiotic use. Nevertheless, the relative risk of IE represented by such cumulative low-grade bacteremia had never been demonstrated experimentally. We developed a new model of experimental IE due to continuous inoculation of low-grade bacteremia, mimicking repeated low-grade bacteremia in humans, and compared the infectivity of Streptococcus gordonii and Staphylococcus aureus in this model to that in the model producing brief, high-level bacteremia. We demonstrated that, after injection of identical bacterial numbers, the rate of infected vegetations was similar in both types of challenge. These experimental results support the hypothesis that cumulative exposure to low-grade bacteremia, outside the context of procedure-related bacteremia, represents a genuine risk of IE, as suggested by human epidemiological studies. In addition, they validate the newer guidelines for IE prophylaxis, which drastically limit the procedures in which antibiotic prophylaxis is indicated. Nevertheless, these revised guidelines leave the vast majority (> 90%) of potential IE cases without alternative propositions of prevention, and novel strategies must be considered to propose effective alternative and "global" measures to prevent IE initiation. The more realistic experimental model of IE induced by low-grade bacteremia provides an accurate experimental setting in which to study new preventive measures applying to cumulative exposure to low bacterial numbers.
Since, in a context of spontaneous low-grade bacteremia, antibiotics are unlikely to solve the problem of IE prevention, we addressed the role of antiplatelet and anticoagulant agents in the prophylaxis of experimental IE induced by S. gordonii and S. aureus. The logic of this approach was based on the fact that platelets are key players in vegetation formation and enlargement, and on the fact that bacteria capable of interacting with platelets are more prone to induce IE. Antiplatelet agents included the COX1 inhibitor aspirin, the inhibitor of the ADP receptor P2Y12 ticlopidine, and two inhibitors of the platelet fibrinogen receptor GPIIb/IIIa, eptifibatide and abciximab. Anticoagulants included the thrombin inhibitor dabigatran etexilate and the vitamin K antagonist acenocoumarol. Aspirin, ticlopidine or eptifibatide alone failed to prevent aortic infection (> 75% of animals infected). In contrast, the combination of aspirin with ticlopidine, as well as abciximab, protected 45% to 88% of animals against IE due to S. gordonii and S. aureus. The antithrombin dabigatran etexilate protected 75% of rats against IE due to S. aureus, but failed (< 30% protection) against S. gordonii. Acenocoumarol had no effect against either organism. Overall, these results suggest a possible role for antiplatelet agents and dabigatran etexilate in the prophylaxis of IE in humans in a context of recurrent low-grade bacteremia. However, the potential beneficial effect of antiplatelet agents should be balanced against the risk of bleeding and the fact that platelets play an important role in host defenses against intravascular infections. In addition, the potential dual benefit of dabigatran etexilate might be revisited in patients with prosthetic valves, who require life-long anticoagulation and in whom S. aureus IE is associated with a high mortality rate. Because the antiplatelet and anticoagulant approach might be limited in the context of S. aureus bacteremia, other prophylactic strategies for the prevention of S. aureus IE, such as vaccination with anti-adhesin proteins, were tested. The S. aureus surface proteins clumping factor A (ClfA), a fibrinogen-binding protein, and fibronectin-binding protein A (FnbpA) are critical virulence factors for the initiation and development of IE. Thus, they represent key targets for vaccine development against this disease. Recently, numerous reports have described that the harmless bacterium Lactococcus lactis can be used as a vector for the efficient delivery of antigens in vivo, and that this approach is a promising vaccination strategy against bacterial infections. We therefore explored the capacity of immunization with non-living recombinant L. lactis ClfA, L. lactis FnbpA, or L. lactis expressing ClfA together with Fnbp (a truncated form of FnbpA retaining only the fibronectin-binding domain and lacking the fibrinogen-binding A domain [L. lactis ClfA/Fnbp]) to protect against experimental S. aureus IE. L. lactis ClfA was used as the immunization agent against the laboratory strain S. aureus Newman (expressing ClfA but lacking FnbpA). L. lactis ClfA, L. lactis FnbpA, and L. lactis ClfA/Fnbp were used as immunization agents against the endocarditis isolate S. aureus P8 (expressing both ClfA and FnbpA). Immunization with L. lactis ClfA produced functional anti-ClfA antibodies, which were able to block the binding of S. aureus Newman to fibrinogen in vitro and protected 13/19 (69%) animals from IE due to S. aureus Newman (P < 0.05 compared with controls).
Immunization with L. lactis ClfA, L. lactis FnbpA or L. lactis ClfA/Fnbp produced antibodies against each antigen. However, these were not sufficient to block S. aureus P8 binding to fibrinogen and fibronectin in vitro. Moreover, immunization with L. lactis ClfA or L. lactis FnbpA was ineffective (< 10% of animals protected) and immunization with L. lactis ClfA/Fnbp conferred only limited protection from IE (8/23 animals protected; P < 0.05 compared with controls) after challenge with S. aureus P8. Together, these results indicate that L. lactis is an efficient antigen-delivery system that is potentially useful for preventing S. aureus IE. They also suggest that expressing multiple antigens, yet to be identified, in L. lactis will be necessary to prevent IE due to clinical S. aureus strains fully equipped with virulence determinants. In summary, our study has demonstrated experimentally, for the first time, that low-grade bacteremia, mimicking bacteremia occurring outside of a clinical intervention, is as prone to induce experimental IE as the high-grade bacteremia that follows medico-surgical procedures. In this context, where the use of antibiotics for the prophylaxis of IE is limited, we showed that other prophylactic measures, such as antiplatelet agents, anticoagulants, or vaccination employing L. lactis as a delivery vector for bacterial antigens, are reasonable alternatives that warrant further investigation.
Abstract:
BACKGROUND: Allostatic load reflects cumulative exposure to stressors throughout life and has been associated with several adverse health outcomes. It is hypothesized that people with low socioeconomic status (SES) are exposed to higher chronic stress and therefore have greater levels of allostatic load. OBJECTIVE: To assess the association of receiving social transfers and low education with allostatic load. METHODS: We included 3589 participants (1812 women) aged over 35 years and under retirement age from the population-based CoLaus study (Lausanne, Switzerland, 2003-2006). We computed an allostatic load index aggregating cardiovascular, metabolic, dyslipidemic and inflammatory markers. A novel index additionally including markers of oxidative stress was also examined. RESULTS: Men with low vs. high SES were more likely to have higher levels of allostatic load (odds ratio (OR) = 1.93/2.34 for social transfers/education, 95% CI from 1.45 to 4.17). The same patterns were observed among women. Associations persisted after controlling for health behaviors and marital status. CONCLUSIONS: Low education and receiving social transfers independently and cumulatively predict high allostatic load and dysregulation of several homeostatic systems in a Swiss population-based study. Participants with low SES are at higher risk of oxidative stress, which may justify its inclusion as a separate component of allostatic load.
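A minimal sketch of one common way such an index is built: count how many markers fall in a high-risk range across physiological systems. The marker names and cut-offs are placeholders, not the CoLaus definitions:

```python
# Hypothetical marker values for one participant and illustrative high-risk
# thresholds (e.g. quartile cut-offs in the study sample).
markers = {"systolic_bp": 142, "waist": 99, "hdl": 0.9, "triglycerides": 2.1, "crp": 3.4}
high_risk_if_above = {"systolic_bp": 135, "waist": 95, "triglycerides": 1.9, "crp": 3.0}
high_risk_if_below = {"hdl": 1.0}

# Allostatic load = number of markers in their high-risk range.
load = sum(markers[m] > cut for m, cut in high_risk_if_above.items()) \
     + sum(markers[m] < cut for m, cut in high_risk_if_below.items())
print(f"allostatic load index = {load} of {len(markers)} markers")
```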
Abstract:
BACKGROUND: Prevalence of hypertension in HIV infection is high, and information on blood pressure control in HIV-infected individuals is insufficient. We modeled blood pressure over time and the risk of cardiovascular events in hypertensive HIV-infected individuals. METHODS: All patients from the Swiss HIV Cohort Study with confirmed hypertension (systolic or diastolic blood pressure above 139 or 89 mm Hg on 2 consecutive visits and presence of at least 1 additional cardiovascular risk factor) between April 1, 2000 and March 31, 2011 were included. Patients with previous cardiovascular events, patients already on antihypertensive drugs, and pregnant women were excluded. Change in blood pressure over time was modeled using linear mixed models with repeated measurements. RESULTS: Hypertension was diagnosed in 2595 of 10,361 eligible patients. Of those, 869 initiated antihypertensive treatment. For patients treated for hypertension, we found a mean (95% confidence interval) change in systolic and diastolic blood pressure of -0.82 (-1.06 to -0.58) mm Hg/year and -0.89 (-1.05 to -0.73) mm Hg/year, respectively. Factors associated with a decline in systolic blood pressure were baseline blood pressure, presence of chronic kidney disease, cardiovascular events, and the typical risk factors for cardiovascular disease. In patients with hypertension, increase in systolic blood pressure [hazard ratio 1.18 (1.06 to 1.32) per 10 mm Hg increase], total cholesterol, smoking, age, and cumulative exposure to protease inhibitor-based and triple nucleoside regimens were associated with cardiovascular events. CONCLUSIONS: Insufficient control of hypertension was associated with an increased risk of cardiovascular events, indicating the need for improved management of hypertension in HIV-infected individuals.
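A minimal sketch of a random-intercept linear mixed model for repeated blood-pressure measurements, using simulated data rather than SHCS data; the fixed-effect slope for years plays the role of the per-year change reported above:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated example (not SHCS data): repeated systolic BP measurements with a
# patient-specific intercept and a small yearly decline under treatment.
rng = np.random.default_rng(0)
n_patients, n_visits = 50, 6
patient = np.repeat(np.arange(n_patients), n_visits)
years = np.tile(np.arange(n_visits), n_patients).astype(float)
sbp = 150 + rng.normal(0, 8, n_patients)[patient] - 0.8 * years + rng.normal(0, 5, patient.size)
df = pd.DataFrame({"patient": patient, "years": years, "sbp": sbp})

# Linear mixed model with a random intercept per patient: the fixed-effect
# slope for `years` mirrors the per-year blood-pressure change reported above.
model = smf.mixedlm("sbp ~ years", df, groups=df["patient"]).fit()
print(model.params["years"])
```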
Abstract:
Transient high-grade bacteremia following invasive procedures carries a risk of infective endocarditis (IE). This is supported by experimental endocarditis studies. On the other hand, case-control studies showed that IE could be caused by cumulative exposure to low-grade bacteremia occurring during daily activities. However, no experimental demonstration of this latter possibility exists. This study investigated the infectivity in animals of continuous low-grade bacteremia compared to that of brief high-grade bacteremia. Rats with aortic vegetations were inoculated with Streptococcus intermedius, Streptococcus gordonii or Staphylococcus aureus (strains Newman and P8). Animals were challenged with 10³ to 10⁶ CFU. Identical bacterial numbers were given by bolus (1 ml in 1 min) or continuous infusion (0.0017 ml/min over 10 h). Bacteremia was 50 to 1,000 times greater after bolus than during continuous inoculation. Streptococcal bolus inoculation of 10⁵ CFU infected 63 to 100% of vegetations, compared to 30 to 71% infection after continuous infusion (P > 0.05). When the inoculum was increased to 10⁶ CFU, bolus inoculation infected 100% of vegetations and continuous infusion 70 to 100% (P > 0.05). S. aureus bolus injection of 10³ CFU infected 46 to 57% of valves. This was similar to the 53 to 57% infection rates produced by continuous infusion (P > 0.05). Inoculation of 10⁴ CFU of S. aureus infected 80 to 100% of vegetations after bolus and 60 to 75% after continuous infusion (P > 0.05). These results show that high-level bacteremia is not required to induce experimental endocarditis and support the hypothesis that cumulative exposure to low-grade bacteremia represents a genuine risk of IE in humans.
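As a quick arithmetic check that the two inoculation routes deliver the same total volume, and hence the same bacterial number at a given concentration:

```python
# Continuous infusion: 0.0017 ml/min for 10 h delivers roughly 1 ml in total,
# i.e. the same volume (and bacterial number) as the 1 ml bolus.
total_ml = 0.0017 * 10 * 60
print(f"{total_ml:.2f} ml")   # ~1.02 ml
```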
Abstract:
OBJECTIVE: To evaluate deaths from AIDS-defining malignancies (ADM) and non-AIDS-defining malignancies (nADM) in the D:A:D Study and to investigate the relationship between these deaths and immunodeficiency. DESIGN: Observational cohort study. METHODS: Patients (23 437) were followed prospectively for 104 921 person-years. We used Poisson regression models to identify factors independently associated with deaths from ADM and nADM. Analyses of factors associated with mortality due to nADM were repeated after excluding nADM known to be associated with a specific risk factor. RESULTS: Three hundred five patients died due to a malignancy, 298 prior to the cutoff for this analysis (ADM: n = 110; nADM: n = 188). The mortality rate due to ADM decreased from 20.1/1000 person-years of follow-up [95% confidence interval (CI) 14.4, 25.9] when the most recent CD4 cell count was <50 cells/µl to 0.1 (0.03, 0.3)/1000 person-years of follow-up when the CD4 cell count was more than 500 cells/µl; the mortality rate from nADM decreased from 6.0 (95% CI 3.3, 10.1) to 0.6 (0.4, 0.8) per 1000 person-years of follow-up between these two CD4 cell count strata. In multivariable regression analyses, a two-fold higher latest CD4 cell count was associated with a halving of the risk of ADM mortality. Other predictors of an increased risk of ADM mortality were homosexual risk group, older age, a previous (non-malignancy) AIDS diagnosis and earlier calendar years. Predictors of an increased risk of nADM mortality included lower CD4 cell count, older age, current/ex-smoking status, longer cumulative exposure to combination antiretroviral therapy, active hepatitis B infection and earlier calendar year. CONCLUSION: The severity of immunosuppression is predictive of death from both ADM and nADM in HIV-infected populations.
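A minimal sketch of the two quantities behind these results: a crude mortality rate per 1000 person-years, and the way a Poisson or Cox model with log2(CD4) as covariate expresses "a two-fold higher CD4 cell count halves the risk" (illustrative only, not the D:A:D model):

```python
import math

# Crude rate: ADM deaths per 1000 person-years over the whole follow-up.
deaths, person_years = 110, 104_921
rate = 1000 * deaths / person_years
print(f"{rate:.2f} ADM deaths per 1000 person-years")

# With log2(latest CD4 count) as covariate, a coefficient of log(0.5) means
# each doubling of the CD4 count halves the mortality rate.
beta = math.log(0.5)
print(f"rate ratio per CD4 doubling = {math.exp(beta):.2f}")
```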
Abstract:
We developed a semiquantitative job exposure matrix (JEM) for workers exposed to polychlorinated biphenyls (PCBs) at a capacitor manufacturing plant from 1946 to 1977. In a recently updated mortality study, mortality from prostate and stomach cancer increased with increasing levels of cumulative exposure estimated with this JEM (trend p values = 0.003 and 0.04, respectively). Capacitor manufacturing began with winding bales of foil and paper film, which were placed in a metal capacitor box (pre-assembly) and then in a vacuum chamber for flood-filling (impregnation) with dielectric fluid (PCBs). Capacitors dripping with PCB residues were then transported to sealing stations where ports were soldered shut before degreasing, leak testing, and painting. Using a systematic approach, all 509 unique jobs identified in the work histories were rated by predetermined process- and plant-specific exposure determinants, then categorized, based on the jobs' similarities (combination of exposure determinants), into 35 job exposure categories. The job exposure categories were ranked, followed by a qualitative PCB exposure rating (baseline, low, medium, and high) for inhalation and dermal intensity. Category differences in other chemical exposures (solvents, etc.) prevented further combining of categories. The mean of all available PCB concentrations (1975 and 1977) for jobs within each intensity rating was regarded as a representative value for that intensity level. Inhalation (in micrograms per cubic meter) and dermal (unitless) exposures were regarded as equally important. Intensity was frequency adjusted for jobs with continuous or intermittent PCB exposures. Era-modifying factors were applied to the earlier time periods (1946-1974) because exposures were considered to have been greater than in later eras (1975-1977). Such interpolations, extrapolations, and modifying factors may introduce non-differential misclassification; however, we believe our rigorous method minimized misclassification, as shown by the significant exposure-response trends in the epidemiologic analysis.
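A minimal sketch of how a JEM of this kind converts a work history into a cumulative exposure estimate; the category ratings, frequency adjustments and era factors below are placeholders, not the plant-specific values:

```python
# Hypothetical JEM components: a representative intensity per rating, a
# frequency adjustment for intermittent exposure, and an era-modifying factor
# applied to earlier time periods.
intensity = {"baseline": 1.0, "low": 10.0, "medium": 50.0, "high": 200.0}  # placeholder units
frequency = {"continuous": 1.0, "intermittent": 0.5}
era_factor = {"1946-1974": 2.0, "1975-1977": 1.0}

# One worker's history: (rating, frequency, era, years in the job).
history = [("high", "continuous", "1946-1974", 4.0),
           ("medium", "intermittent", "1975-1977", 2.0)]

# Cumulative exposure = sum over job spells of adjusted intensity x duration.
cumulative = sum(intensity[r] * frequency[f] * era_factor[e] * years
                 for r, f, e, years in history)
print(f"cumulative exposure = {cumulative:.0f} unit-years")
```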
Abstract:
OBJECTIVE: Hierarchical modeling has been proposed as a solution to the multiple exposure problem. We estimate associations between metabolic syndrome and different components of antiretroviral therapy using both conventional and hierarchical models. STUDY DESIGN AND SETTING: We use discrete time survival analysis to estimate the association between metabolic syndrome and cumulative exposure to 16 antiretrovirals from four drug classes. We fit a hierarchical model in which the drug class provides a prior model of the association between metabolic syndrome and exposure to each antiretroviral. RESULTS: One thousand two hundred and eighteen patients were followed for a median of 27 months, with 242 cases of metabolic syndrome (20%) at a rate of 7.5 cases per 100 patient-years. Metabolic syndrome was more likely to develop in patients exposed to stavudine, but was less likely to develop in those exposed to atazanavir. The estimate for exposure to atazanavir increased from a hazard ratio of 0.06 per 6 months' use in the conventional model to 0.37 in the hierarchical model (or from 0.57 to 0.81 when using spline-based covariate adjustment). CONCLUSION: These results are consistent with trials that show the disadvantage of stavudine and the advantage of atazanavir relative to other drugs in their respective classes. The hierarchical model gave more plausible results than the equivalent conventional model.
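A minimal sketch of the shrinkage a hierarchical model produces: each drug's log hazard ratio is pulled toward its drug-class mean in proportion to its imprecision (a normal-normal approximation with illustrative numbers, not the paper's estimation method):

```python
import math

def shrink(log_hr, se, class_mean, tau):
    """Posterior mean of a drug effect under a Normal(class_mean, tau^2) prior."""
    precision_data, precision_prior = 1 / se**2, 1 / tau**2
    return (precision_data * log_hr + precision_prior * class_mean) / (precision_data + precision_prior)

# Illustrative: an imprecise, extreme estimate for one drug (HR 0.06) is pulled
# toward the average effect of its drug class (here HR ~0.8), yielding a more
# plausible shrunken hazard ratio, in the spirit of the atazanavir example above.
log_hr_drug = math.log(0.06)
shrunk = shrink(log_hr_drug, se=1.2, class_mean=math.log(0.8), tau=0.4)
print(f"conventional HR = 0.06, shrunken HR = {math.exp(shrunk):.2f}")
```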
Abstract:
BACKGROUND: Chronic liver disease in human immunodeficiency virus (HIV)-infected patients is mostly caused by hepatitis virus co-infection. Other reasons for chronic alanine aminotransferase (ALT) elevation are more difficult to diagnose. METHODS: We studied the incidence of and risk factors for chronic elevation of ALT levels (greater than the upper limit of normal at 2 consecutive semi-annual visits) in participants of the Swiss HIV Cohort Study without hepatitis B virus (HBV) or hepatitis C virus (HCV) infection who were seen during the period 2002-2008. Poisson regression analysis was used. RESULTS: A total of 2365 participants were followed up for 9972 person-years (median age, 38 years; male sex, 66%; median CD4+ cell count, 426/µL; receipt of antiretroviral therapy [ART], 56%). A total of 385 participants (16%) developed chronic elevated ALT levels, with an incidence of 3.9 cases per 100 person-years (95% confidence interval [CI], 3.5-4.3 cases per 100 person-years). In multivariable analysis, chronic elevated ALT levels were associated with HIV RNA level >100,000 copies/mL (incidence rate ratio [IRR], 2.23; 95% CI, 1.45-3.43), increased body mass index (BMI, defined as weight in kilograms divided by the square of height in meters) (a BMI of 25-29.9 was associated with an IRR of 1.56 [95% CI, 1.24-1.96]; a BMI ≥30 was associated with an IRR of 1.70 [95% CI, 1.16-2.51]), severe alcohol use (IRR, 1.83 [95% CI, 1.19-2.80]), exposure to stavudine (IRR per year of exposure, 1.12 [95% CI, 1.07-1.17]) and zidovudine (IRR per year of exposure, 1.04 [95% CI, 1.00-1.08]). Associations with cumulative exposure to combination ART, nucleoside reverse-transcriptase inhibitors, and unboosted protease inhibitors did not remain statistically significant after adjustment for exposure to stavudine. Black ethnicity was inversely associated with chronic elevated ALT levels (IRR, 0.52 [95% CI, 0.33-0.82]). Treatment outcome and mortality did not differ between groups with and without elevated ALT levels. CONCLUSIONS: Among patients without hepatitis virus co-infection, the incidence of chronic elevated ALT levels was 3.9 cases per 100 person-years; chronic elevation was associated with high HIV RNA levels, increased BMI, severe alcohol use, and prolonged stavudine and zidovudine exposure. Long-term follow-up is needed to assess whether chronic elevation of ALT levels will result in increased morbidity or mortality.
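A minimal sketch of a Poisson regression with a person-time offset, the type of model behind per-year-of-exposure incidence rate ratios; the data are simulated and the variable names are assumptions, not the cohort's:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Simulated example: events of chronic ALT elevation, follow-up time, and
# years of stavudine exposure per participant.
rng = np.random.default_rng(1)
n = 500
stavudine_years = rng.uniform(0, 6, n)
person_years = rng.uniform(1, 8, n)
true_rate = 0.04 * np.exp(0.11 * stavudine_years)          # true IRR ~1.12 per year
events = rng.poisson(true_rate * person_years)

# Poisson GLM with log person-years as offset: exp(coefficient) is the IRR
# per additional year of exposure.
X = sm.add_constant(pd.DataFrame({"stavudine_years": stavudine_years}))
model = sm.GLM(events, X, family=sm.families.Poisson(),
               offset=np.log(person_years)).fit()
print(np.exp(model.params["stavudine_years"]))   # estimated IRR per year of exposure
```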