945 results for "exposure risk"
Abstract:
Numerous drug exposures occur unintentionally at the beginning of pregnancy. Conversely, continuing drug treatment may be necessary in women who wish to become pregnant. In these situations, risk evaluation has to be carried out precisely and case by case, weighing the risk for the fetus against maternal health. Teratovigilance services can provide thorough information that helps avoid unwarranted treatment discontinuations or pregnancy terminations. In return, physicians' follow-up reports on the outcomes of pregnancies exposed to one or several therapeutic agents will increase the body of knowledge available to health professionals and pregnant women.
Abstract:
OBJECTIVES: Agriculture is considered one of the occupations most at risk of acute or chronic respiratory problems. The aim of our study was to determine the level of exposure to organic dust above which respiratory function is chronically affected in workers handling wheat grain or straw, and to test whether some of these working populations can recover their respiratory function after a decrease in exposure. METHOD: 87 workers exposed to wheat dust (farmers, harvesters, silo workers and livestock farmers) and 62 non-exposed workers were included in a longitudinal study comprising two visits at a six-month interval, with lung function measurements and symptom questionnaires. Cumulative and mean exposure to wheat dust were derived from each worker's detailed work history and a task-exposure matrix based on task-specific exposure measurements. Immunoglobulins (IgG and IgE) specific to the most frequent microorganisms in wheat dust were determined. RESULTS: FEV1 decreased significantly with cumulative and mean exposure levels. The estimated decrease was close to 200 mL per year of high exposure, which corresponds roughly to wheat dust levels above 10 mg/m(3). Peak expiratory flow and several acute symptoms correlated with recent exposure level. Recovery of respiratory function six months after exposure to wheat dust and the evolution of exposure indicators in workers' blood (IgG and IgE) will be discussed. CONCLUSIONS: These results show a chronic effect of exposure to wheat dust on bronchial obstruction. Short-term effects and reversibility will be assessed using the full study results.
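The cumulative- and mean-exposure metrics described above can be sketched as a duration-weighted sum over a work history, using a task-exposure matrix. All task names, exposure values and durations below are hypothetical placeholders, not the study's actual matrix:

```python
# Cumulative exposure reconstruction from a work history and a
# task-exposure matrix (illustrative values only; the study's matrix
# was built from task-specific dust measurements).
task_exposure = {          # mean inhalable wheat dust, mg/m3
    "harvesting": 12.0,
    "silo work":   8.0,
    "livestock":   3.0,
}

work_history = [           # (task, years) for one hypothetical worker
    ("harvesting", 5),
    ("silo work", 10),
]

# Cumulative exposure in mg/m3-years; mean exposure over the career
cumulative = sum(task_exposure[t] * years for t, years in work_history)
mean_exposure = cumulative / sum(years for _, years in work_history)
print(cumulative, round(mean_exposure, 2))  # → 140.0 9.33
```

The same two quantities (cumulative and mean) are the exposure indices the abstract relates to FEV1 decline.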
Abstract:
Previous studies have demonstrated that poultry-house workers are exposed to very high levels of organic dust and consequently have an increased prevalence of adverse respiratory symptoms. However, the influence of broiler age on bioaerosol concentrations has not been investigated. To evaluate the evolution of bioaerosol concentration during the fattening period, bioaerosol parameters (inhalable dust, endotoxin and bacteria) were measured in 12 poultry confinement buildings in Switzerland at 3 different stages of the birds' growth. Samples of air taken from within the breathing zones of individual poultry-house employees as they caught the chickens ready to be transported for slaughter were also analysed. Quantitative PCR (Q-PCR) was used to assess the quantity of total airborne bacteria and total airborne Staphylococcus species. Bioaerosol levels increased significantly during the fattening period of the chickens. During the task of catching mature birds, the mean inhalable dust concentration for a worker was 31 ± 4.7 mg/m3, and the endotoxin concentration was 11,080 ± 3436 EU/m3 of air, more than ten-fold higher than the Swiss occupational recommended value (1000 EU/m3). The mean exposure of bird catchers to total bacteria and Staphylococcus species measured by Q-PCR was also very high, reaching values of 72 (± 11) × 10^7 cells/m3 and 70 (± 16) × 10^6 cells/m3 of air, respectively. It was concluded that, in the absence of protective breathing apparatus, chicken catchers in Switzerland risk exposure beyond recommended limits for all measured bioaerosol parameters. Moreover, the use of Q-PCR to estimate total and specific numbers of airborne bacteria is a promising tool for evaluating any modifications intended to improve the safety of current working practices.
Abstract:
Few previous publications have focused on well-controlled gestational diabetes mellitus (GDM) in mothers as a risk factor for neonatal hypoglycemia. In addition, approaches to blood glucose monitoring have been inconsistent and poorly defined. Our objective is to determine, using a defined and strict protocol, whether a newborn of a mother with well-controlled gestational diabetes (regardless of insulin treatment) has a higher risk of developing hypoglycemia than a healthy newborn. The project will take place in a regional hospital in Girona. From 2014 to 2015 we will recruit, by consecutive sampling, a cohort of 623 infants born in this center without any malformation or any perinatal pathology or complication. We will record sex, ethnicity and gestational age. We will measure blood glucose levels and anthropometric measurements in newborns, always taking into account the presence or absence of well-controlled maternal gestational diabetes. Patients will be followed up for 24 hours to determine the incidence of hypoglycemia. We will analyze the association between the studied exposure factors and the incidence of the outcome using a multivariate analysis.
Abstract:
BACKGROUND: Chronic liver disease in human immunodeficiency virus (HIV)-infected patients is mostly caused by hepatitis virus co-infection. Other reasons for chronic alanine aminotransferase (ALT) elevation are more difficult to diagnose. METHODS: We studied the incidence of and risk factors for chronic elevation of ALT levels (greater than the upper limit of normal at 2 consecutive semi-annual visits) in participants of the Swiss HIV Cohort Study without hepatitis B virus (HBV) or hepatitis C virus (HCV) infection who were seen during the period 2002-2008. Poisson regression analysis was used. RESULTS: A total of 2365 participants were followed up for 9972 person-years (median age, 38 years; male sex, 66%; median CD4+ cell count, 426/microL; receipt of antiretroviral therapy [ART], 56%). A total of 385 participants (16%) developed chronic elevated ALT levels, with an incidence of 3.9 cases per 100 person-years (95% confidence interval [CI], 3.5-4.3 cases per 100 person-years). In multivariable analysis, chronic elevated ALT levels were associated with HIV RNA level >100,000 copies/mL (incidence rate ratio [IRR], 2.23; 95% CI, 1.45-3.43), increased body mass index (BMI, defined as weight in kilograms divided by the square of height in meters; a BMI of 25-29.9 was associated with an IRR of 1.56 [95% CI, 1.24-1.96], and a BMI ≥ 30 with an IRR of 1.70 [95% CI, 1.16-2.51]), severe alcohol use (IRR, 1.83 [95% CI, 1.19-2.80]), and exposure to stavudine (IRR per year of exposure, 1.12 [95% CI, 1.07-1.17]) and zidovudine (IRR per year of exposure, 1.04 [95% CI, 1.00-1.08]). Associations with cumulative exposure to combination ART, nucleoside reverse-transcriptase inhibitors, and unboosted protease inhibitors did not remain statistically significant after adjustment for exposure to stavudine. Black ethnicity was inversely associated with chronic elevated ALT levels (IRR, 0.52 [95% CI, 0.33-0.82]). Treatment outcome and mortality did not differ between groups with and without elevated ALT levels.
CONCLUSIONS: Among patients without hepatitis virus co-infection, the incidence of chronic elevated ALT levels was 3.9 cases per 100 person-years. Chronic ALT elevation was associated with high HIV RNA levels, increased BMI, severe alcohol use, and prolonged stavudine and zidovudine exposure. Long-term follow-up is needed to assess whether chronic elevation of ALT levels will result in increased morbidity or mortality.
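The reported incidence and its confidence interval can be reproduced from the abstract's counts. A minimal sketch (event and person-year totals taken from the abstract; the log-scale normal approximation for the Poisson rate CI is an assumption about the method, not stated in the text):

```python
import math

events = 385          # participants with chronic elevated ALT
person_years = 9972   # total follow-up

rate = events / person_years * 100  # cases per 100 person-years

# Normal approximation on the log scale for a Poisson rate CI:
# SE of log(rate) for a count of `events` is 1/sqrt(events)
se_log = 1 / math.sqrt(events)
lo = rate * math.exp(-1.96 * se_log)
hi = rate * math.exp(1.96 * se_log)
print(f"{rate:.1f} ({lo:.1f}-{hi:.1f}) per 100 person-years")
# → 3.9 (3.5-4.3) per 100 person-years
```

The result matches the interval quoted in the abstract, which suggests a standard large-sample Poisson approximation was used.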
Abstract:
BACKGROUND: We conducted a retrospective analysis of administration of nonoccupational HIV post-exposure prophylaxis (nPEP) in a single centre where tracing and testing of the source of exposure were carried out systematically over a 10-year period. METHODS: Files of all nPEP requests between 1998 and 2007 were reviewed. Characteristics of the exposed and source patients, the type of exposure, and clinical and serological outcomes were analysed. RESULTS: nPEP requests increased by 850% over 10 years. Among 910 events, 58% were heterosexual exposures, 15% homosexual exposures, 6% sexual assaults and 20% nonsexual exposures. In 208 events (23%), the source was reported to be HIV positive. In the remaining cases, active source tracing enabled 298 HIV tests to be performed (42%) and identified 11 HIV infections (3.7%). nPEP could be avoided or interrupted in 31% of the 910 events when the source tested negative. Of 710 patients who started nPEP, 396 (56%) reported side effects, among whom 39 (5%) had to interrupt treatment. There were two HIV seroconversions, and neither was attributed to nPEP failure. CONCLUSIONS: nPEP requests increased over time. HIV testing of the source person avoided nPEP in 31% of events and was therefore paramount in the management of potential HIV exposures. Furthermore, it allowed active screening of populations potentially at risk for undiagnosed HIV infection, as shown by the increased HIV prevalence in these groups (3.7%) compared with a prevalence of 0.3% in Switzerland as a whole.
Abstract:
Disasters are often perceived as fast and random events. While the triggers may be sudden, disasters themselves are the result of an accumulation of consequences of inappropriate actions and decisions, as well as of global change. To modify this perception of risk, advocacy tools are needed. Quantitative methods have been developed to identify the distribution and the underlying factors of risk.

Disaster risk results from the intersection of hazards, exposure and vulnerability. The frequency and intensity of hazards can be influenced by climate change or by the decline of ecosystems; population growth increases exposure, while changes in the level of development affect vulnerability. Given that each of these components may change, risk is dynamic and should be reviewed periodically by governments, insurance companies or development agencies. At the global level, these analyses are often performed using databases of reported losses. Our results show that these are likely to be biased, in particular by improvements in access to information. International loss databases are not exhaustive and give no information on exposure, intensity or vulnerability. A new approach, independent of reported losses, is therefore necessary.

The research presented here was mandated by the United Nations and by agencies working in development and the environment (UNDP, UNISDR, GTZ, UNEP and IUCN). These organizations needed a quantitative assessment of the underlying factors of risk, both to raise awareness among policymakers and to prioritize disaster risk reduction projects.

The method is based on geographic information systems, remote sensing, databases and statistical analysis. It required a large amount of data (1.7 TB, covering both the physical environment and socio-economic parameters) and several thousand hours of processing. A comprehensive risk model was developed to reveal the distribution of hazards, exposure and risk, and to identify underlying risk factors for several hazards (floods, tropical cyclones, earthquakes and landslides). Two multiple-risk indexes were generated to compare countries. The results include an evaluation of the role of hazard intensity, exposure, poverty and governance in the pattern and trends of risk. It appears that vulnerability factors change depending on the type of hazard and that, contrary to exposure, their weight decreases as intensity increases.

Locally, the method was tested to highlight the influence of climate change and ecosystem decline on the hazard. In northern Pakistan, deforestation increases susceptibility to landslides. Research in Peru (based on satellite imagery and ground data collection) revealed a rapid glacier retreat and provides an assessment of the remaining ice volume as well as scenarios of its possible evolution.

These results were presented to different audiences, including to 160 governments. The results and the data generated are available online through an open-source SDI (http://preview.grid.unep.ch). The method is flexible and easily transferable to different scales and issues, with good prospects for adaptation to other research areas.

The characterization of risk at the global level and the identification of the role of ecosystems in disaster risk are rapidly developing fields. This research revealed many challenges; some were resolved, while others remain as limitations. However, it is clear that the level of development, and moreover unsustainable development, configures a large part of disaster risk, and that the dynamics of risk are governed primarily by global change.
Abstract:
We analyse the impact of working and contractual conditions, particularly exposure to job risks, on the probability of acquiring a permanent disability, controlling for other personal and firm characteristics. We postulate a model in which this impact is mediated by the choice of occupation, with a level of risk associated with it. We assume this choice is endogenous, and that it depends on preferences and opportunities in the labour market, both of which may differ between immigrants and natives. To test this hypothesis we apply a bivariate probit model to data for 2006 from the Continuous Sample of Working Lives provided by the Spanish Social Security system, containing records for over a million workers. We find that risk exposure increases the probability of permanent disability arising from any cause by almost 5%.
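The endogeneity argument can be illustrated with the data-generating process behind a bivariate probit: two latent equations whose error terms are correlated, so the occupation-risk variable is not independent of the disability equation's unobservables. All coefficients and the correlation below are made-up illustrations, not estimates from the study, and no estimation step is shown:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Bivariate probit structure: occupational risk choice and disability
# share correlated unobservables (rho != 0), which makes observed risk
# exposure endogenous in the disability equation.
immigrant = rng.binomial(1, 0.15, n)          # covariate shifting occupation choice
rho = 0.3                                     # hypothetical error correlation
errs = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], n)

# Equation 1: latent propensity to choose a risky occupation
risky = (0.8 * immigrant - 0.5 + errs[:, 0] > 0).astype(int)
# Equation 2: latent propensity for permanent disability, shifted by exposure
disabled = (0.2 * risky - 1.8 + errs[:, 1] > 0).astype(int)

print(risky.mean(), disabled.mean())
```

Estimating the two probit equations jointly (with rho as a free parameter) rather than separately is what lets the model separate the causal effect of risk exposure from selection into risky occupations.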
Abstract:
The use of electronic control devices has expanded worldwide during the last few years, the most widely used model being the Taser. However, scientific knowledge about electronic control devices remains limited. We reviewed the medical literature to examine the potential implications of electronic control devices in terms of morbidity and mortality, and to identify and evaluate all existing experimental human studies. A single exposure to an electronic control device in healthy individuals can be assumed to be generally safe, according to 23 prospective human experimental studies and numerous volunteer exposures. In case series, however, electronic control devices could have deleterious effects when used in the field, in particular if persons receive multiple exposures, are intoxicated, show signs of "excited delirium," or present with medical comorbidities. As the use of electronic control devices continues to increase, their safety, notably in potentially high-risk individuals, remains a matter of debate. The complications of electronic control device exposure are numerous but often recognizable, usually resulting from barbed dart injuries or from falls. Persons exposed to electronic control devices should therefore be fully examined, and traumatic lesions must be ruled out.
Abstract:
A nationwide investigation was conducted in Switzerland to establish the exposure of the population to medical x rays and to update the results of the 1998 survey. Both the frequency and the dose variations were studied in order to determine the change in the collective dose. The frequency study addressed 206 general practitioners (GPs), 30 hospitals, and 10 private radiology institutes. Except for the latter, the response rate was very satisfactory. The dose study relied on the assessment of the speed class of the screen-film combinations used by the GPs as well as the results of two separate studies dedicated to fluoroscopy and CT. The investigation showed that the total number of medical x-ray examinations performed by GPs decreased by 1% between 1998 and 2003, and that the sensitivities of the film-screen combinations shifted towards higher values, leading to a reduction of about 20% in the dose delivered by GPs. The study also indicated that the total number of x-ray examinations performed in hospitals increased by 4%, with a slight (1%) increase in radiographs, a significant (39%) decrease in examinations involving fluoroscopy, and a 70% increase in CT examinations. Concerning the doses, the investigation of a selection of examinations involving fluoroscopy showed a significant increase in the kerma-area product (KAP) per procedure. For CT, the study showed an increase in the dose-length product (DLP) per procedure for skull and abdomen examinations, and a decrease for chest examinations. Together, the changes in frequency and in effective dose per examination led to a 20% increase in the total collective dose.
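The collective-dose accounting behind such surveys is a frequency-times-dose sum over examination types. The examination mix and per-examination effective doses below are invented placeholders for illustration, not the survey's figures:

```python
# Collective dose = sum over examination types of
# (annual number of examinations) x (effective dose per examination).
# All numbers are made-up placeholders, not the Swiss survey's data.
exams = {
    "chest radiograph": {"count": 1_000_000, "dose_mSv": 0.1},
    "abdomen CT":       {"count":   150_000, "dose_mSv": 10.0},
    "skull CT":         {"count":    80_000, "dose_mSv": 2.0},
}

collective_person_sv = sum(
    e["count"] * e["dose_mSv"] for e in exams.values()
) / 1000  # mSv -> Sv
print(f"{collective_person_sv:.0f} person-Sv")  # → 1760 person-Sv
```

Note how a modest rise in CT frequency moves the total far more than changes in plain radiography, which is consistent with the survey's emphasis on the 70% growth in CT examinations.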
Abstract:
Bacterial bioreporters have substantial potential for contaminant assessment, but their real-world application is currently impaired by a lack of sensitivity. Here, we exploit the bioconcentration of chemicals in the urine of animals to facilitate pollutant detection. The shore crab Carcinus maenas was exposed to the organic contaminant 2-hydroxybiphenyl, and urine was screened using an Escherichia coli-based luciferase gene (luxAB) reporter assay specific to this compound. Bioassay measurements differentiated between the original contaminant and its metabolites, quantifying bioconcentration factors of up to one hundred-fold in crab urine. Our results reveal the substantial potential of using bacterial bioreporter assays in real-time monitoring of biological matrices to determine exposure histories, with wide-ranging potential for the in situ measurement of xenobiotics in risk assessments and epidemiology.
Abstract:
BACKGROUND: There is limited safety information on most drugs used during pregnancy. This is especially true for medication against tropical diseases, because pharmacovigilance systems are not well developed in these settings. The aim of the present study was to demonstrate the feasibility of using a Health and Demographic Surveillance System (HDSS) as a platform to monitor drug safety in pregnancy. METHODS: Pregnant women with gestational age below 20 weeks were recruited from Reproductive and Child Health (RCH) clinics or during monthly house visits carried out for the HDSS. A structured questionnaire was used to interview pregnant women. Participants were followed on a monthly basis to record any new drug used as well as pregnancy outcome. RESULTS: 1089 pregnant women were recruited; 994 (91.3%) completed the follow-up until delivery. 98% of women reported having taken at least one medication during pregnancy, mainly those used in antenatal programmes. The other most reported drugs were analgesics (24%), antibiotics (17%) and antimalarials (15%), excluding IPTp. Artemether-lumefantrine (AL) accounted for nearly three-quarters of the antimalarials used to treat illness. Overall, antimalarial and antibiotic exposures in pregnancy were not significantly associated with adverse pregnancy outcome. Iron and folic acid supplementation was associated with a decreased risk of miscarriage/stillbirth (OR 0.1; 95% CI 0.08-0.3). CONCLUSION: Almost all women were exposed to medication during pregnancy. Exposure to iron and folic acid had a beneficial effect on pregnancy outcome. The HDSS proved to be a useful platform to establish a reliable pharmacovigilance system in resource-limited countries. Widening drug safety information is essential to facilitate evidence-based risk-benefit decisions for treatment during pregnancy, a major challenge with newly marketed medicines.
Abstract:
BACKGROUND: Exposure to solar ultraviolet (UV) light is the main causative factor for skin cancer. Outdoor workers are at particular risk because they spend long working hours outside, may have little shade available and may be bound to take their lunch at the workplace. Despite epidemiological evidence of a doubling of the risk of squamous cell carcinoma in outdoor workers, recognition of skin cancer as an occupational disease remains scarce. OBJECTIVE: To assess occupational solar UV doses and their contribution to skin cancer risk. METHODS: A numerical model (SimUVEx) was used to assess occupational and lunch-break exposures and to characterize exposure patterns and anatomical distribution. The risk of squamous cell carcinoma (SCC) was estimated from an existing epidemiological model. RESULTS: Horizontal body locations received 2.0-2.5 times more UV than vertical locations. The dose associated with taking lunch outdoors every day was similar to that of working outdoors one day per week, but only half that of a seasonal worker. Outdoor work was associated with an increased risk of SCC and also with frequent acute exposure episodes. CONCLUSION: Occupational solar exposure contributes largely to the overall lifetime UV dose, resulting in an excess risk of SCC. The magnitude of the estimated excess risk supports the recognition of SCC as an occupational disease.
Abstract:
Molecular characterization of radical prostatectomy specimens after systemic therapy may identify a gene expression profile for resistance to therapy. This study assessed tumor cells from patients with prostate cancer participating in a phase II neoadjuvant docetaxel and androgen deprivation trial to identify mediators of resistance. The transcriptional level of 93 genes from a docetaxel-resistant prostate cancer cell line microarray study was analyzed by TaqMan low-density arrays in tumors from patients with high-risk localized prostate cancer (36 surgically treated, 28 with neoadjuvant docetaxel + androgen deprivation). Gene expression was compared between groups and correlated with clinical outcome. VIM, AR and RELA were validated by immunohistochemistry. CD44 and ZEB1 expression was tested by immunofluorescence in cells and tumor samples. Parental and docetaxel-resistant castration-resistant prostate cancer cell lines were tested for epithelial-to-mesenchymal transition (EMT) markers before and after docetaxel exposure. Reversion of the EMT phenotype was investigated as a strategy to reverse docetaxel resistance. Expression of 63 (67.7%) genes differed between groups (P < 0.05), including genes related to the androgen receptor, the NF-κB transcription factor, and EMT. Increased expression of EMT markers correlated with radiologic relapse. Docetaxel-resistant cells had increased expression of EMT and stem-like cell markers. ZEB1 siRNA transfection reverted docetaxel resistance and reduced CD44 expression in DU-145R and PC-3R. Before docetaxel exposure, a selected CD44+ subpopulation of PC-3 cells exhibited an EMT phenotype and intrinsic docetaxel resistance; ZEB1/CD44+ subpopulations were found in tumor cell lines and primary tumors, and this correlated with aggressive clinical behavior. This study identifies genes potentially related to chemotherapy resistance and supports evidence of the role of EMT in docetaxel resistance and adverse clinical behavior in early prostate cancer.
Abstract:
BACKGROUND: Persons infected with human immunodeficiency virus (HIV) have increased rates of coronary artery disease (CAD). The relative contribution of genetic background, HIV-related factors, antiretroviral medications, and traditional risk factors to CAD has not been fully evaluated in the setting of HIV infection. METHODS: In the general population, 23 common single-nucleotide polymorphisms (SNPs) were shown to be associated with CAD through genome-wide association analysis. Using the Metabochip, we genotyped 1875 HIV-positive, white individuals enrolled in 24 HIV observational studies, including 571 participants with a first CAD event during the 9-year study period and 1304 controls matched on sex and cohort. RESULTS: A genetic risk score built from 23 CAD-associated SNPs contributed significantly to CAD (P = 2.9 × 10(-4)). In the final multivariable model, participants with an unfavorable genetic background (top genetic score quartile) had a CAD odds ratio (OR) of 1.47 (95% confidence interval [CI], 1.05-2.04). This effect was similar to hypertension (OR = 1.36; 95% CI, 1.06-1.73), hypercholesterolemia (OR = 1.51; 95% CI, 1.16-1.96), diabetes (OR = 1.66; 95% CI, 1.10-2.49), ≥ 1 year lopinavir exposure (OR = 1.36; 95% CI, 1.06-1.73), and current abacavir treatment (OR = 1.56; 95% CI, 1.17-2.07). The effect of the genetic risk score was additive to the effect of nongenetic CAD risk factors, and did not change after adjustment for family history of CAD. CONCLUSIONS: In the setting of HIV infection, the effect of an unfavorable genetic background was similar to traditional CAD risk factors and certain adverse antiretroviral exposures. Genetic testing may provide prognostic information complementary to family history of CAD.
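A genetic risk score of the kind used above is, in general form, a weighted sum of risk-allele counts over the selected SNPs, with the top quartile flagged as an unfavorable genetic background. A minimal sketch with hypothetical SNP identifiers and weights (the study's 23 SNPs and their effect sizes are not reproduced here):

```python
# Sketch of a genetic risk score: weighted sum of risk-allele counts
# over CAD-associated SNPs. SNP IDs and weights are hypothetical;
# the study used 23 SNPs from genome-wide association results.
import random

random.seed(42)

snps = [f"rs{1000 + i}" for i in range(23)]              # placeholder IDs
weights = {s: random.uniform(0.05, 0.25) for s in snps}  # log-OR-like weights

def risk_score(genotype):
    """genotype: dict mapping SNP -> risk-allele count (0, 1 or 2)."""
    return sum(weights[s] * genotype.get(s, 0) for s in snps)

# Score a simulated cohort and flag the top quartile as 'unfavorable'
cohort = [{s: random.choice([0, 1, 2]) for s in snps} for _ in range(1000)]
scores = sorted(risk_score(g) for g in cohort)
q3 = scores[int(0.75 * len(scores))]
top_quartile = [s for s in scores if s >= q3]
print(len(top_quartile))
```

In the study, membership in this top quartile is the exposure entered into the multivariable model alongside hypertension, hypercholesterolemia, diabetes and antiretroviral exposures.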