884 results for Inhalation dose and risk
Abstract:
Coal contains trace elements and naturally occurring radionuclides such as 40K, 232Th and 238U. When coal is burned, the minerals, including most of the radionuclides, do not burn and become concentrated in the ash at several times their concentration in the original coal. Usually, a small fraction of the fly ash produced (2-5%) is released into the atmosphere. The activities released depend on many factors (concentration in the coal, ash and inorganic matter content of the coal, combustion temperature, ratio between bottom and fly ash, filtering system). Therefore, marked differences should be expected between the by-products produced and the amount of activity discharged (per unit of energy produced) by different coal-fired power plants. The effects of these releases on the environment through ground deposition have received some attention, but the results of these studies are not unanimous and cannot be taken as a generic conclusion for all coal-fired power plants. In this study, dispersion modelling of natural radionuclides was carried out to assess the impact of continuous atmospheric releases from a selected coal plant. The natural radioactivity of the coal and the fly ash was measured, and the dispersion was modelled with a Gaussian plume, estimating the activity concentration at different heights up to a distance of 20 km in several wind directions. External and internal doses (inhalation and ingestion) and the resulting risk were calculated for the population living within 20 km of the coal plant. On average, the effective dose is lower than the ICRP's limit and the risk is lower than the U.S. EPA's limit; therefore, in this situation, the considered exposure does not pose any risk. However, when considering dispersion in the prevailing wind direction, these values become significant owing to increases in 232Th and 226Ra concentrations of 75% and 44%, respectively.
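The abstract does not reproduce the plume equations, but the standard Gaussian plume model it refers to has a well-known closed form. Below is a minimal Python sketch of that form; the source term Q, wind speed u, effective stack height H and dispersion coefficients sigma_y, sigma_z are illustrative assumptions, not values from the study.

```python
import numpy as np

def gaussian_plume(Q, u, y, z, H, sigma_y, sigma_z):
    """Activity concentration (Bq/m^3) from a continuous point release.

    Q: release rate (Bq/s); u: mean wind speed (m/s); y, z: crosswind and
    vertical receptor coordinates (m); H: effective release height (m);
    sigma_y, sigma_z: dispersion coefficients (m), which depend on
    downwind distance and atmospheric stability class.
    """
    lateral = np.exp(-y**2 / (2 * sigma_y**2))
    # Reflection term: the ground returns the plume rather than absorbing it.
    vertical = (np.exp(-(z - H)**2 / (2 * sigma_z**2))
                + np.exp(-(z + H)**2 / (2 * sigma_z**2)))
    return Q / (2 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

# Illustrative only: ground-level, on-axis concentration 2 km downwind of a
# 100 m stack releasing 1 kBq/s, with sigma values typical of neutral stability.
print(gaussian_plume(Q=1e3, u=5.0, y=0.0, z=0.0, H=100.0,
                     sigma_y=160.0, sigma_z=60.0))
```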
Abstract:
In this study the inhalation doses and the corresponding risk are calculated for the population living within a 20 km radius of a coal-fired power plant. The dispersion and deposition of natural radionuclides were simulated with a Gaussian dispersion model estimating the ground-level activity concentration. The annual effective dose and total risk were 0.03205 mSv/y and 1.25 × 10⁻⁸, respectively. The effective dose is lower than the limit established by the ICRP and the risk is lower than the limit proposed by the U.S. EPA, which means that the considered exposure does not pose a risk to public health.
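The abstract does not show the dose calculation, but the conventional UNSCEAR-style inhalation dose estimate is a simple product of air concentration, breathing rate, exposure time and a nuclide-specific dose coefficient. A sketch under that assumption follows; the concentrations and dose coefficients are placeholders, not the study's values.

```python
# Annual effective inhalation dose: E = C_air * B * T * DCF, summed over nuclides.
BREATHING_RATE = 0.93   # m^3/h, adult reference value
HOURS_PER_YEAR = 8760

def inhalation_dose_mSv(activities_bq_m3, dcf_sv_per_bq):
    """Sum C (Bq/m^3) * B (m^3/h) * T (h/y) * DCF (Sv/Bq) over nuclides, in mSv/y."""
    dose_sv = sum(activities_bq_m3[n] * BREATHING_RATE * HOURS_PER_YEAR
                  * dcf_sv_per_bq[n] for n in activities_bq_m3)
    return dose_sv * 1e3

# Hypothetical ground-level air concentrations and dose coefficients.
c = {"Th-232": 2e-6, "Ra-226": 5e-6, "K-40": 1e-5}          # Bq/m^3, placeholders
dcf = {"Th-232": 2.5e-5, "Ra-226": 3.5e-6, "K-40": 3.0e-9}  # Sv/Bq, placeholders
print(f"{inhalation_dose_mSv(c, dcf):.5f} mSv/y")
```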
Abstract:
Medical imaging is a powerful diagnostic tool. Consequently, the number of medical images taken has increased vastly over the past few decades. The most common medical imaging techniques use X-radiation as the primary investigative tool. The main limitation of using X-radiation is the associated risk of developing cancers. Alongside this, technology has advanced and more centres now use CT scanners; these can incur significant radiation burdens compared with traditional X-ray imaging systems. The net effect is that the population radiation burden is rising steadily. Risk arising from X-radiation for diagnostic medical purposes needs minimising, and one way to achieve this is by reducing radiation dose whilst optimising image quality. All ages are affected by risk from X-radiation; however, the increasing population age highlights the elderly as a new group that may require consideration. Of greatest concern are paediatric patients: firstly, they are more sensitive to radiation; secondly, their younger age means that the potential detriment to this group is greater. Containment of radiation exposure falls to a number of professionals within medical fields, from those who request imaging to those who produce the image. These staff are supported in their radiation protection role by engineers, physicists and technicians. It is important to realise that radiation protection is currently a major European focus of interest, and minimum competence levels in radiation protection for radiographers have been defined through the integrated activities of the EU consortium called MEDRAPET. The outcomes of this project have been used by the European Federation of Radiographer Societies to describe the European Qualifications Framework levels for radiographers in radiation protection. Though variations exist between European countries, radiographers and nuclear medicine technologists are normally the professional groups responsible for exposing screening populations and patients to X-radiation. As part of their training they learn the fundamental principles of radiation protection and theoretical and practical approaches to dose minimisation. However, dose minimisation is complex: it is not simply about reducing X-radiation without taking major contextual factors into account. These factors relate to the real world of clinical imaging and include the need to measure clinical image quality and lesion visibility when applying X-radiation dose reduction strategies. This requires the use of validated psychological and physics techniques to measure clinical image quality and lesion perceptibility.
Abstract:
The discovery of X-rays was undoubtedly one of the greatest stimuli for improving the efficiency of healthcare services. The ability to view, non-invasively, inside the human body has greatly facilitated the work of professionals in the diagnosis of disease. An exclusive focus on image quality (IQ), without understanding how images are obtained, negatively affects efficiency in diagnostic radiology. The equilibrium between benefits and risks is often forgotten. It is necessary to adopt optimization strategies that maximize the benefit (image quality) and minimize the risk (dose to the patient) in radiological facilities. In radiology, implementing optimization strategies requires an understanding of the image acquisition process. When a radiographer adopts a certain value of a parameter (tube potential [kVp], tube current-exposure time product [mAs] or additional filtration), it is essential to know its meaning and the impact of its variation on dose and image quality. Without this, any optimization strategy will fail. Worldwide, data show that the use of X-rays has become increasingly frequent. In Cabo Verde, we note an effort by healthcare institutions (e.g. the Ministry of Health) to equip radiological facilities, and the recent installation of a telemedicine system requires the purchase of new radiological equipment. In addition, the transition from screen-film to digital systems is characterized by a rise in patient exposure. Given that this transition is slower in less developed countries, as is the case of Cabo Verde, the need to adopt optimization strategies becomes ever more pressing. This study was conducted as an attempt to answer that need. Although this work concerns the objective evaluation of image quality, while in medical practice the evaluation is usually subjective (visual evaluation of images by the radiographer/radiologist), studies have reported a correlation between these two types of evaluation (objective and subjective) [5-7], which justifies conducting such studies. The purpose of this study is to evaluate the effect of the exposure parameters (kVp and mAs) when using additional copper (Cu) filtration on dose and image quality in a Computed Radiography system.
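Objective image-quality evaluation of the kind described usually reduces to signal-difference-to-noise metrics measured in regions of interest. Below is a minimal sketch of a contrast-to-noise ratio (CNR) computation; the ROI arrays and the particular CNR definition are assumptions for illustration, since the abstract does not state which metric the study used.

```python
import numpy as np

def cnr(roi_signal, roi_background):
    """Contrast-to-noise ratio: absolute mean difference over background noise."""
    return abs(roi_signal.mean() - roi_background.mean()) / roi_background.std()

# Hypothetical ROIs from CR images acquired at different kVp/mAs with Cu filtration.
rng = np.random.default_rng(0)
roi_obj = rng.normal(120.0, 8.0, size=(50, 50))   # test-object ROI
roi_bg = rng.normal(100.0, 8.0, size=(50, 50))    # background ROI
print(f"CNR = {cnr(roi_obj, roi_bg):.2f}")
```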
Abstract:
OBJECTIVES: It is still debated whether pre-existing minority drug-resistant HIV-1 variants (MVs) affect the virological outcomes of first-line NNRTI-containing ART. METHODS: This Europe-wide case-control study included ART-naive subjects infected with drug-susceptible HIV-1 as revealed by population sequencing, who achieved virological suppression on first-line ART including one NNRTI. Cases experienced virological failure and controls were subjects from the same cohort whose viraemia remained suppressed at a matched time since initiation of ART. Blinded, centralized 454 pyrosequencing with parallel bioinformatic analysis in two laboratories was used to identify MVs in the 1%-25% frequency range. ORs of virological failure according to MV detection were estimated by logistic regression. RESULTS: Two hundred and sixty samples (76 cases and 184 controls), mostly subtype B (73.5%), were used for the analysis. Identical MVs were detected in the two laboratories. 31.6% of cases and 16.8% of controls harboured pre-existing MVs. Detection of at least one MV versus no MVs was associated with an increased risk of virological failure (OR = 2.75, 95% CI = 1.35-5.60, P = 0.005); similar associations were observed for at least one MV versus no NRTI MVs (OR = 2.27, 95% CI = 0.76-6.77, P = 0.140) and at least one MV versus no NNRTI MVs (OR = 2.41, 95% CI = 1.12-5.18, P = 0.024). A dose-effect relationship between virological failure and mutational load was found. CONCLUSIONS: Pre-existing MVs more than double the risk of virological failure to first-line NNRTI-based ART.
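The reported OR comes from (matched) logistic regression, but the crude odds ratio implied by the proportions in the abstract can be checked by hand. A quick sketch using only the reported numbers (76 cases, 184 controls, 31.6% vs 16.8% harbouring MVs); the crude value is expected to differ somewhat from the adjusted OR of 2.75.

```python
# Crude 2x2 odds ratio reconstructed from the reported proportions.
cases_mv = round(0.316 * 76)        # 24 cases with MVs
cases_no = 76 - cases_mv            # 52 without
ctrls_mv = round(0.168 * 184)       # 31 controls with MVs
ctrls_no = 184 - ctrls_mv           # 153 without
odds_ratio = (cases_mv / cases_no) / (ctrls_mv / ctrls_no)
print(f"crude OR ≈ {odds_ratio:.2f}")   # ≈ 2.28, vs adjusted OR 2.75 in the paper
```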
Abstract:
As part of the HIV behavioural surveillance system in Switzerland, repeated cross-sectional surveys were conducted in 1993, 1994, 1996, 2000 and 2006 among attenders of all low-threshold facilities (LTFs) with needle exchange programmes and/or supervised drug consumption rooms for injection or inhalation in Switzerland. Data were collected in each LTF over five consecutive days, using a questionnaire that was partly completed by an interviewer and partly self-administered. The questionnaire was structured around three topics: socio-demographic characteristics, drug consumption, and health and risk/preventive behaviour. Analysis was restricted to attenders who had injected drugs during their lifetime (IDUs). Between 1993 and 2006, the median age of IDUs rose by 10 years. IDUs are severely marginalised and their social situation has improved little. The borrowing of used injection equipment (a syringe or needle already used by another person) in the last six months decreased (16.5% in 1993, 8.9% in 2006) but stayed stable at around 10% over the past three surveys. Other risk behaviours, such as sharing spoons, cotton or water, were reported more frequently, although they also showed a decreasing trend. The reported prevalence of HIV remained fairly stable at around 10% between 1993 and 2006; reported levels of hepatitis C virus (HCV) prevalence were high (56.4% in 2006). In conclusion, the overall decrease in the practice of injection has reduced the potential for transmission of infections. However, the high HCV prevalence is of particular concern, as the current behaviour of IDUs indicates a potential for further spread of the infection. Another noteworthy trend is the significant decrease in condom use in the case of paid sex.
Abstract:
Objective: To examine the association between pre-diagnostic circulating vitamin D concentration, dietary intake of vitamin D and calcium, and the risk of colorectal cancer in European populations. Design: Nested case-control study. Setting: The study was conducted within the EPIC study, a cohort of more than 520,000 participants from 10 western European countries. Participants: 1248 cases of incident colorectal cancer, which developed after enrolment into the cohort, were matched to 1248 controls. Main outcome measures: Circulating vitamin D concentration (25-hydroxy-vitamin D, 25-(OH)D) was measured by enzyme immunoassay. Dietary and lifestyle data were obtained from questionnaires. Incidence rate ratios and 95% confidence intervals for the risk of colorectal cancer by 25-(OH)D concentration and by levels of dietary calcium and vitamin D intake were estimated from multivariate conditional logistic regression models, with adjustment for potential dietary and other confounders. Results: 25-(OH)D concentration showed a strong inverse linear dose-response association with risk of colorectal cancer (P for trend <0.001). Compared with a pre-defined mid-level concentration of 25-(OH)D (50.0-75.0 nmol/l), lower levels were associated with higher colorectal cancer risk (<25.0 nmol/l: incidence rate ratio 1.32 (95% confidence interval 0.87 to 2.01); 25.0-49.9 nmol/l: 1.28 (1.05 to 1.56)), and higher concentrations were associated with lower risk (75.0-99.9 nmol/l: 0.88 (0.68 to 1.13); ≥100.0 nmol/l: 0.77 (0.56 to 1.06)). In analyses by quintile of 25-(OH)D concentration, patients in the highest quintile had a 40% lower risk of colorectal cancer than those in the lowest quintile (P<0.001). Subgroup analyses showed a strong association for colon but not rectal cancer (P for heterogeneity=0.048). Greater dietary intake of calcium was associated with a lower colorectal cancer risk. Dietary vitamin D was not associated with disease risk. Findings did not vary by sex and were not altered by corrections for season or month of blood donation. Conclusions: The results of this large observational study indicate a strong inverse association between pre-diagnostic 25-(OH)D concentration and risk of colorectal cancer in western European populations. Further randomised trials are needed to assess whether increases in circulating 25-(OH)D concentration can effectively decrease the risk of colorectal cancer.
Abstract:
BACKGROUND: In contrast with established evidence linking high doses of ionizing radiation with childhood cancer, research on low-dose ionizing radiation and childhood cancer has produced inconsistent results. OBJECTIVE: We investigated the association between domestic radon exposure and childhood cancers, particularly leukemia and central nervous system (CNS) tumors. METHODS: We conducted a nationwide census-based cohort study including all children < 16 years of age living in Switzerland on 5 December 2000, the date of the 2000 census. Follow-up lasted until the date of diagnosis, death, emigration, a child's 16th birthday, or 31 December 2008. Domestic radon levels were estimated for each individual home address using a model developed and validated based on approximately 45,000 measurements taken throughout Switzerland. Data were analyzed with Cox proportional hazards models adjusted for child age, child sex, birth order, parents' socioeconomic status, environmental gamma radiation, and period effects. RESULTS: In total, 997 childhood cancer cases were included in the study. Compared with children exposed to a radon concentration below the median (< 77.7 Bq/m3), adjusted hazard ratios for children with exposure ≥ the 90th percentile (≥ 139.9 Bq/m3) were 0.93 (95% CI: 0.74, 1.16) for all cancers, 0.95 (95% CI: 0.63, 1.43) for all leukemias, 0.90 (95% CI: 0.56, 1.43) for acute lymphoblastic leukemia, and 1.05 (95% CI: 0.68, 1.61) for CNS tumors. CONCLUSIONS: We did not find evidence that domestic radon exposure is associated with childhood cancer, despite relatively high radon levels in Switzerland.
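As a sketch of the kind of analysis the abstract describes, a Cox proportional hazards fit with adjustment covariates can be set up as below. This uses the `lifelines` library; the column names and the tiny synthetic dataset are entirely made up for illustration and bear no relation to the study's data.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Synthetic stand-in for the cohort: follow-up time (years), event indicator,
# radon exposure category and two of the adjustment covariates.
df = pd.DataFrame({
    "time":        [8.0, 3.2, 8.0, 5.5, 8.0, 1.9, 6.4, 8.0],
    "event":       [0,   1,   0,   1,   0,   1,   1,   0],   # cancer diagnosis = 1
    "radon_q90":   [0,   1,   0,   1,   1,   0,   0,   1],   # exposure >= 90th pct
    "birth_order": [1,   2,   1,   3,   2,   1,   2,   1],
    "gamma_dose":  [0.9, 1.2, 1.0, 1.4, 1.1, 0.8, 1.0, 1.3],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
cph.print_summary()   # hazard ratios are exp(coef) for each covariate
```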
Abstract:
The prevalence of obesity among children and adults has increased worldwide in recent decades. Various epidemiological studies have shown that obesity has become a serious health concern in the United States and Canada. Obesity has been shown to have many effects on health, so it is important to identify the different causes of weight gain. Obesity is clearly a multifactorial condition involving both genetic and environmental elements. We focus on dietary factors, particularly fructose, whose consumption has risen in parallel with the rise in obesity rates. The principal form of fructose is high-fructose corn syrup (HFCS), which is used as the primary sweetener in most beverages and foods in North America. It has been suggested that fructose intake is probably a contributing factor to the increased prevalence of obesity. The objective of this study was to assess whether there is a relationship between fructose consumption and the risk of obesity. We worked with two databases, from the Cree and Inuit nations. We had a group of 522 Cree adults (263 women and 259 men) in two age groups, 20-40 years and 40-60 years, whom we classified into four body mass index (BMI) categories. The data collection tool was a 24-hour recall. For the Inuit database, we had 550 adults (301 women and 249 men) in two age groups similar to those of the Cree, with three BMI categories; the Inuit data were collected by means of two 24-hour recalls. We extracted the amount of fructose per 100 grams of food consumed by these two populations and created food composition data for both, which also allowed us to determine the main sources of fructose for these populations. No relationship between fructose consumption and increased BMI was detected among Cree and Inuit adults. We considered energy intake as a potential confounder; after adjustment, BMI was associated with total energy intake and not with fructose consumption. Since the studies that found an association between fructose consumption and obesity involved fructose intakes above 50 grams per day, whereas in this study intake was below that threshold (between 20.6 and 45.4 g/day), we propose that negative effects of fructose on body mass could be tested in populations with higher consumption. Randomized clinical trials and prospective cohort studies with different levels of fructose consumption and long-term follow-up could also be useful. Keywords: fructose, high-fructose corn syrup (HFCS), obesity and overweight
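The abstract does not list the BMI cut-offs used for its categories; a sketch assuming the standard WHO adult cut-offs, which the four-group classification presumably follows:

```python
def bmi(weight_kg, height_m):
    """Body mass index: weight over height squared (kg/m^2)."""
    return weight_kg / height_m**2

def bmi_category(b):
    """Standard WHO adult cut-offs; the study's exact grouping is not stated."""
    if b < 18.5:
        return "underweight"
    if b < 25.0:
        return "normal"
    if b < 30.0:
        return "overweight"
    return "obese"

print(bmi_category(bmi(82.0, 1.70)))   # BMI ≈ 28.4 -> "overweight"
```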
Abstract:
Objective: To determine the risk of lung cancer associated with exposure at home to the radioactive disintegration products of naturally occurring radon gas. Design: Collaborative analysis of individual data from 13 case-control studies of residential radon and lung cancer. Setting: Nine European countries. Subjects: 7148 cases of lung cancer and 14,208 controls. Main outcome measures: Relative risks of lung cancer and radon gas concentrations in homes inhabited during the previous 5-34 years, measured in becquerels (radon disintegrations per second) per cubic metre (Bq/m³) of household air. Results: The mean measured radon concentration in homes of people in the control group was 97 Bq/m³, with 11% measuring > 200 and 4% measuring > 400 Bq/m³. For cases of lung cancer the mean concentration was 104 Bq/m³. The risk of lung cancer increased by 8.4% (95% confidence interval 3.0% to 15.8%) per 100 Bq/m³ increase in measured radon (P = 0.0007). This corresponds to an increase of 16% (5% to 31%) per 100 Bq/m³ increase in usual radon, that is, after correction for the dilution caused by random uncertainties in measuring radon concentrations. The dose-response relation appeared to be linear with no threshold and remained significant (P = 0.04) in analyses limited to individuals from homes with measured radon < 200 Bq/m³. The proportionate excess risk did not differ significantly with study, age, sex, or smoking. In the absence of other causes of death, the absolute risks of lung cancer by age 75 years at usual radon concentrations of 0, 100, and 400 Bq/m³ would be about 0.4%, 0.5%, and 0.7%, respectively, for lifelong non-smokers, and about 25 times greater (10%, 12%, and 16%) for cigarette smokers. Conclusions: Collectively, though not separately, these studies show appreciable hazards from residential radon, particularly for smokers and recent ex-smokers, and indicate that it is responsible for about 2% of all deaths from cancer in Europe.
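The reported dose-response can be turned into a worked example: with a linear excess relative risk of 16% per 100 Bq/m³ of usual radon and the baseline risks given in the conclusions (0.4% for lifelong non-smokers, about 10% for smokers), the quoted absolute risks follow directly.

```python
# Linear no-threshold model from the pooled analysis:
# RR(x) = 1 + 0.16 * (x / 100) for usual radon concentration x in Bq/m^3.
def absolute_risk(baseline, radon_bq_m3, err_per_100=0.16):
    return baseline * (1 + err_per_100 * radon_bq_m3 / 100)

for radon in (0, 100, 400):
    print(radon,
          f"non-smoker {absolute_risk(0.004, radon):.4f}",
          f"smoker {absolute_risk(0.10, radon):.3f}")
# -> roughly 0.4%/0.5%/0.7% and 10%/12%/16%, matching the abstract
```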
Abstract:
Intermittent hemodialysis (IHD) and continuous renal replacement therapies (CRRT) are used as acute kidney injury (AKI) therapies, and each has certain advantages and disadvantages. Extended daily dialysis (EDD) has emerged as an alternative to CRRT in the management of hemodynamically unstable AKI patients, mainly in developed countries. Objectives: We hypothesized that EDD is a safe option for AKI treatment, and aimed to describe the metabolic and fluid control of AKI patients undergoing EDD and to identify complications and risk factors associated with death. Methods: This is an observational and retrospective study describing the introduction of EDD at our institution. A total of 231 hemodynamically unstable AKI patients (noradrenaline dose between 0.3 and 1.0 µg/kg/min) underwent 1367 EDD sessions. EDD consisted of 6-8 h of HD 6 days a week, with a blood flow of 200 ml/min and a dialysate flow of 300 ml/min. Results: Mean age was 60.6 ± 15.8 years, 97.4% of patients were in the intensive care unit, and sepsis was the main etiology of AKI (76.2%). BUN and creatinine levels stabilized after four sessions at around 38 and 2.4 mg/dl, respectively. Fluid balance decreased progressively and stabilized around zero after five sessions. Weekly delivered Kt/V was 5.94 ± 0.7. Hypotension and filter clotting occurred in 47.5% and 12.4% of treatment sessions, respectively. Regarding AKI outcome, 22.5% of patients recovered renal function, 5.6% remained on dialysis after 30 days, and 71.9% died. Age and an abdominal septic focus were identified as risk factors for death; urine output and negative fluid balance were identified as protective factors. Conclusions: EDD is effective for AKI patients, allowing adequate metabolic and fluid control. Age, an abdominal septic focus, lower urine output, and a positive fluid balance after two EDD sessions were significantly associated with death.
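Dialysis dose in the study is reported as weekly delivered Kt/V. A single-session value is conventionally estimated with the second-generation Daugirdas formula; the sketch below uses made-up inputs, since the abstract gives no per-session data.

```python
import math

def sp_kt_v(bun_pre, bun_post, session_h, uf_litres, weight_kg):
    """Second-generation Daugirdas single-pool Kt/V estimate.

    R = post/pre BUN ratio; t in hours; UF in litres; W = post-dialysis kg.
    Kt/V = -ln(R - 0.008*t) + (4 - 3.5*R) * UF/W
    """
    r = bun_post / bun_pre
    return -math.log(r - 0.008 * session_h) + (4 - 3.5 * r) * uf_litres / weight_kg

# Hypothetical 7 h EDD session, 6 sessions per week as in the study protocol.
per_session = sp_kt_v(bun_pre=80, bun_post=35, session_h=7,
                      uf_litres=2.0, weight_kg=70)
print(f"Kt/V per session ≈ {per_session:.2f}, weekly ≈ {6 * per_session:.1f}")
# -> roughly 1.0 per session, ~6 per week, the order of the reported 5.94
```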
Abstract:
Ninety-one Swiss veal farms producing under a label with improved welfare standards were visited between August and December 2014 to investigate risk factors related to antimicrobial drug use and mortality. All herds consisted of own and purchased calves, with a median of 77.4% purchased calves. The calves' mean age was 29 ± 15 days at purchase, and the fattening period lasted on average 120 ± 28 days. The mean carcass weight was 125 ± 12 kg. A mean of 58 ± 33 calves were fattened per farm and year, and purchased calves were bought from a mean of 20 ± 17 farms of origin. Antimicrobial drug treatment incidence was calculated with the defined daily dose methodology. The mean treatment incidence (TIADD) was 21 ± 15 daily doses per calf and year. The mean mortality risk was 4.1%; calves died at a mean age of 94 ± 50 days, and the main causes of death were bovine respiratory disease (BRD, 50%) and gastro-intestinal disease (33%). Two multivariable models were constructed, for antimicrobial drug treatment incidence (53 farms) and for mortality (91 farms). No quarantine, a shared air space for several groups of calves, and no clinical examination upon arrival at the farm were associated with increased antimicrobial treatment incidence. Maximum group size and weight differences > 100 kg within a group were associated with increased mortality risk, while vaccination and beef breed were associated with decreased mortality risk. The majority of antimicrobial treatments (84.6%) were given as group treatments with oral powder fed through an automatic milk feeding system. Combination products containing chlortetracycline with tylosin and sulfadimidine, or with spiramycin, were used in 54.9%, and amoxicillin in 43.7%, of the oral group treatments. The main indication for individual treatment was BRD (73%). The mean age at the time of treatment was 51 days, corresponding to an estimated weight of 80-100 kg. Individual treatments were mainly applied by injection (88.5%) and included fluoroquinolones in 38.3%, penicillins (amoxicillin or benzylpenicillin) in 25.6%, macrolides in 13.1%, tetracyclines in 12.0%, 3rd- and 4th-generation cephalosporins in 4.7%, and florfenicol in 3.9% of cases. The present study identified risk factors for increased antimicrobial drug treatment and mortality, providing an important basis for future studies aiming to reduce treatment incidence and mortality on veal farms. Our results indicate that improvement is needed in the selection of drugs for the treatment of veal calves according to the principles of prudent use of antibiotics.
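The defined-daily-dose treatment incidence used here is, in essence, the total amount of drug used divided by the dose needed to treat one standard-weight calf for one day, scaled to one calf-year at risk. A sketch under assumed values; the DDD, standard weight and drug amounts are illustrative, not the study's figures.

```python
# TI_ADD: number of defined daily doses administered per calf and year.
def treatment_incidence(total_mg_used, ddd_mg_per_kg_day, standard_weight_kg,
                        calves, days_at_risk):
    daily_doses = total_mg_used / (ddd_mg_per_kg_day * standard_weight_kg)
    return daily_doses / calves * (365 / days_at_risk)

# Illustrative farm: 500 g amoxicillin at an assumed DDD of 10 mg/kg/day, for
# 58 calves at an assumed standard weight of 100 kg over a 120-day fattening.
ti = treatment_incidence(total_mg_used=500_000, ddd_mg_per_kg_day=10,
                         standard_weight_kg=100, calves=58, days_at_risk=120)
print(f"TI ≈ {ti:.1f} daily doses per calf and year")   # ≈ 26, same order as 21 ± 15
```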