884 results for Inhalation dose and risk
Abstract:
A total of 92 samples of street dust collected in Luanda, Angola, were sieved below 100 μm and analysed by ICP-MS for 35 elements after an aqua-regia digestion. The concentration and spatial heterogeneity of trace elements in the street dust of Luanda are generally lower than in most industrialized cities in the Northern hemisphere. These observations reveal a predominantly "natural" origin for the street dust in Luanda, also manifested in the preservation in street dust of geochemical processes that occur in natural soils: the separation of uranium from thorium and the retention of the former by carbonate materials, or the high correlation between arsenic and vanadium due to their common mode of adsorption on solid particles in the form of oxyanions. The only distinct anthropogenic fingerprint in the composition of Luanda's street dust is the association Pb–Cd–Sb–Cu (and, to a lesser extent, Ba–Cr–Zn). The use of risk assessment strategies has proved helpful in identifying the routes of exposure to street dust and the trace elements therein of most concern in terms of potential adverse health effects. In Luanda the highest levels of risk seem to be associated (a) with the presence of As and Pb in the street dust and (b) with the route of ingestion of dust particles, for all the elements included in the study except Hg, for which inhalation of vapours presents a slightly higher risk than ingestion. However, given the large uncertainties associated with the estimates of toxicity values and exposure factors, and the absence of site-specific biometric factors, these results should be regarded as preliminary, and further research should be undertaken before any definite conclusions regarding potential health effects are drawn.
Abstract:
Objective: To determine the relative risk of hip fracture associated with postmenopausal hormone replacement therapy including the effect of duration and recency of treatment, the addition of progestins, route of administration, and dose.
Abstract:
To compare the incidence of foetal malformations (FMs) in pregnant women with epilepsy treated with different anti-epileptic drugs (AEDs) and doses, and the influence of seizures, family and personal history, and environmental factors. A prospective, observational, community-based cohort study. Methods. A voluntary, Australia-wide, telephone-interview-based register prospectively enrolling three groups of pregnant women: taking AEDs for epilepsy; with epilepsy not taking AEDs; taking AEDs for a non-epileptic indication. Four hundred and fifty eligible women were enrolled over 40 months. Three hundred and ninety-six pregnancies had been completed, with 7 sets of twins, for a total of 403 pregnancy outcomes. Results. 354 (87.8%) pregnancy outcomes resulted in a healthy live birth, 26 (6.5%) had a FM, 4 (1%) a death in utero, 1 (0.2%) a premature labour with stillbirth, 14 (3.5%) a spontaneous abortion, and 4 were lost to follow-up. The FM rate was greater in pregnancies exposed to sodium valproate (VPA) in the first trimester (16.0%) compared with those exposed to all other AEDs (16.0% vs. 2.4%, P < 0.01) or no AEDs (16.0% vs. 3.1%, P < 0.01). The mean daily dose of VPA taken in pregnancies with FMs was significantly greater than in those without (1975 vs. 1128 mg, P < 0.01). The incidence of FM with VPA doses greater than or equal to 1100 mg was 30.2% vs. 3.2% with doses < 1100 mg (P < 0.01). Conclusions. There is a dose-effect relationship for FM and exposure to VPA during the first trimester of pregnancy, with higher doses of VPA associated with a significantly greater risk than with lower doses or with other AEDs. These results highlight the need to limit, where possible, the dose of VPA in pregnancy. (C) 2004 Elsevier Ltd. All rights reserved.
Abstract:
The use of granulocyte colony-stimulating factor (G-CSF)-mobilized peripheral blood as a source of stem cells has resulted in a high incidence of severe chronic graft-versus-host disease (cGVHD), which compromises the outcome of clinical allogeneic stem cell transplantation. We have studied the effect of G-CSF on both immune complex and fibrotic cGVHD directed to major (DBA/2 --> B6D2F1) or minor (B10.D2 --> BALB/c) histocompatibility antigens. In both models, donor pretreatment with G-CSF reduced cGVHD mortality in association with type 2 differentiation. However, after escalation of the donor T-cell dose, scleroderma occurred in 90% of the recipients of grafts from G-CSF-treated donors. In contrast, only 11% of the recipients of control grafts developed scleroderma, and the severity of hepatic cGVHD was also reduced. Mixing studies confirmed that in the presence of high donor T-cell doses, the severity of scleroderma was determined by the non-T-cell fraction of grafts from G-CSF-treated donors. These data confirm that the induction of cGVHD after donor treatment with G-CSF is dependent on the transfer of large numbers of donor T cells in conjunction with a putatively expanded myeloid lineage, providing a further rationale for the limitation of cell dose in allogeneic stem cell transplantation. (C) 2004 American Society for Blood and Marrow Transplantation.
Abstract:
Objective: Based on clues from epidemiology and animal experiments, low vitamin D during early life has been proposed as a risk factor for schizophrenia. The aim of this study was to explore the association between the use of vitamin D supplements during the first year of life and the risk of developing schizophrenia. Method: Subjects were drawn from the Northern Finland 1966 Birth Cohort (n = 9,114). During the first year of life, data were collected about the frequency and dose of vitamin D supplementation. Our primary outcome measures were schizophrenia, psychotic disorders other than schizophrenia, and nonpsychotic disorders as diagnosed by age 31 years. Males and females were examined separately. Results: In males, the use of either irregular or regular vitamin D supplements was associated with a reduced risk of schizophrenia (risk ratio (RR) = 0.08, 95% CI 0.01-0.95; RR = 0.12, 95% CI 0.02-0.90, respectively) compared with no supplementation. In males, the use of at least 2000 IU of vitamin D was associated with a reduced risk of schizophrenia (RR = 0.23, 95% CI 0.06-0.95) compared to those on lower doses. There were no significant associations between either the frequency or dose of vitamin D supplements and (a) schizophrenia in females, or (b) nonpsychotic disorders or psychotic disorders other than schizophrenia in either males or females. Conclusion: Vitamin D supplementation during the first year of life is associated with a reduced risk of schizophrenia in males. Preventing hypovitaminosis D during early life may reduce the incidence of schizophrenia. (C) 2003 Elsevier B.V. All rights reserved.
Abstract:
X-ray computed tomography (CT) is a non-invasive medical imaging technique that generates cross-sectional images by acquiring attenuation-based projection measurements at multiple angles. Since its first introduction in the 1970s, substantial technical improvements have led to the expanding use of CT in clinical examinations. CT has become an indispensable imaging modality for the diagnosis of a wide array of diseases in both pediatric and adult populations [1, 2]. Currently, approximately 272 million CT examinations are performed annually worldwide, with nearly 85 million of these in the United States alone [3]. Although this trend has decelerated in recent years, CT usage is still expected to increase mainly due to advanced technologies such as multi-energy [4], photon counting [5], and cone-beam CT [6].
Despite the significant clinical benefits, concerns have been raised regarding the population-based radiation dose associated with CT examinations [7]. From 1980 to 2006, the effective dose from medical diagnostic procedures rose six-fold, with CT contributing to almost half of the total dose from medical exposure [8]. For each patient, the risk associated with a single CT examination is likely to be minimal. However, the relatively large population-based radiation level has led to enormous efforts among the community to manage and optimize the CT dose.
As promoted by the international campaigns Image Gently and Image Wisely, exposure to CT radiation should be appropriate and safe [9, 10]. It is thus the responsibility of the imaging community to optimize the radiation dose used in CT examinations. The key to dose optimization is to determine the minimum amount of radiation dose that achieves the targeted image quality [11]. Based on this principle, dose optimization would benefit significantly from effective metrics to characterize radiation dose and image quality for a CT exam. Moreover, if accurate predictions of the radiation dose and image quality were possible before the initiation of the exam, it would be feasible to personalize the exam by adjusting the scanning parameters to achieve a desired level of image quality. The purpose of this thesis is to design and validate models to quantify patient-specific radiation dose prospectively and task-based image quality. The dual aim of the study is to implement the theoretical models into clinical practice by developing an organ-based dose monitoring system and an image-based noise addition software for protocol optimization.
More specifically, Chapter 3 aims to develop an organ dose-prediction method for CT examinations of the body under constant tube current conditions. The study effectively modeled anatomical diversity and complexity using a large number of patient models with representative age, size, and gender distributions. The dependence of organ dose coefficients on patient size and scanner model was further evaluated. Distinct from prior work, this study used the largest number of patient models to date, with representative age, weight percentile, and body mass index (BMI) ranges.
With effective quantification of organ dose under constant tube current condition, Chapter 4 aims to extend the organ dose prediction system to tube current modulated (TCM) CT examinations. The prediction, applied to chest and abdominopelvic exams, was achieved by combining a convolution-based estimation technique that quantifies the radiation field, a TCM scheme that emulates modulation profiles from major CT vendors, and a library of computational phantoms with representative sizes, ages, and genders. The prospective quantification model is validated by comparing the predicted organ dose with the dose estimated based on Monte Carlo simulations with TCM function explicitly modeled.
Chapter 5 aims to implement the organ dose-estimation framework in clinical practice to develop an organ dose-monitoring program based on commercial software (Dose Watch, GE Healthcare, Waukesha, WI). The first phase of the study focused on body CT examinations, in which the patient's major body landmark information was extracted from the patient scout image in order to match clinical patients against a computational phantom in the library. The organ dose coefficients were estimated based on CT protocol and patient size as reported in Chapter 3. The exam CTDIvol, DLP, and TCM profiles were extracted and used to quantify the radiation field using the convolution technique proposed in Chapter 4.
With effective methods to predict and monitor organ dose, Chapter 6 aims to develop and validate improved measurement techniques for image quality assessment. Chapter 6 outlines the method that was developed to assess and predict quantum noise in clinical body CT images. Compared with previous phantom-based studies, this study accurately assessed the quantum noise in clinical images and further validated the correspondence between phantom-based measurements and the expected clinical image quality as a function of patient size and scanner attributes.
Chapter 7 aims to develop a practical strategy to generate hybrid CT images and assess the impact of dose reduction on diagnostic confidence for the diagnosis of acute pancreatitis. The general strategy is (1) to simulate synthetic CT images at multiple reduced-dose levels from clinical datasets using an image-based noise addition technique; (2) to develop quantitative and observer-based methods to validate the realism of simulated low-dose images; (3) to perform multi-reader observer studies on the low-dose image series to assess the impact of dose reduction on the diagnostic confidence for multiple diagnostic tasks; and (4) to determine the dose operating point for clinical CT examinations based on the minimum diagnostic performance to achieve protocol optimization.
Chapter 8 concludes the thesis with a summary of accomplished work and a discussion about future research.
Abstract:
Background: There are limited data concerning endoscopist-directed deep sedation for endoscopic retrograde cholangiopancreatography. The aim of this study was to establish the safety of this practice and the risk factors for difficult sedation in daily practice. Patients and methods: Hospital-based, frequency-matched case-control study. All patients were identified from a database of 1,008 patients between 2014 and 2015. The cases were those with difficult sedation, defined as any of the following: requirement for high doses of midazolam or propofol, poor tolerance, use of reversal agents, or sedation-related adverse events. The presence of different factors was evaluated to determine whether they predicted difficult sedation. Results: One hundred and eighty-nine patients (63 cases, 126 controls) were included. Cases were classified in terms of high-dose requirements (n = 35, 55.56%), sedation-related adverse events (n = 14, 22.22%), the use of reversal agents (n = 13, 20.63%) and agitation/discomfort (n = 8, 12.7%). Concerning adverse events, the total rate was 1.39%, including clinically relevant hypoxemia (n = 11), severe hypotension (n = 2) and paradoxical reactions to midazolam (n = 1). The rate of hypoxemia was higher in patients under propofol combined with midazolam than in patients with propofol alone (2.56% vs. 0.8%, p < 0.001). Alcohol consumption (OR: 2.674 [95% CI: 1.098-6.515], p = 0.030), opioid consumption (OR: 2.713 [95% CI: 1.096-6.716], p = 0.031) and the consumption of other psychoactive drugs (OR: 2.015 [95% CI: 1.017-3.991], p = 0.045) were confirmed to be independent risk factors for difficult sedation. Conclusions: Endoscopist-directed deep sedation during endoscopic retrograde cholangiopancreatography is safe. The presence of certain factors should be assessed before the procedure to identify patients at high risk of difficult sedation.
Abstract:
To evaluate associations between polymorphisms of the N-acetyltransferase 2 (NAT2), human 8-oxoguanine glycosylase 1 (hOGG1) and X-ray repair cross-complementing protein 1 (XRCC1) genes and the risk of upper aerodigestive tract (UADT) cancer. A case-control study involving 117 cases and 224 controls was undertaken. The NAT2 gene polymorphisms were genotyped by automated sequencing, and the XRCC1 Arg399Gln and hOGG1 Ser326Cys polymorphisms were determined by Polymerase Chain Reaction followed by Restriction Fragment Length Polymorphism (PCR-RFLP) methods. The slow-metabolization phenotype was significantly associated with the development of UADT cancer (p=0.038). Furthermore, the slow-metabolization haplotype was also associated with UADT cancer (p=0.014). The hOGG1 Ser326Cys polymorphism (CG or GG vs. CC genotypes) was shown to be a protective factor against UADT cancer in moderate smokers (p=0.031). The XRCC1 Arg399Gln polymorphism (GA or AA vs. GG genotypes), in turn, was a protective factor against UADT cancer only among never-drinkers (p=0.048). Interactions involving the NAT2, XRCC1 Arg399Gln and hOGG1 Ser326Cys polymorphisms may modulate the risk of UADT cancer in this population.
Abstract:
The aim of this study was to assess the prevalence and risk factors of apical periodontitis in endodontically treated teeth in a selected population of Brazilian adults. A total of 1,372 periapical radiographs of endodontically treated teeth were analyzed based on the quality of root filling, status of coronal restoration and presence of posts associated with apical periodontitis (AP). Data were analyzed statistically using odds ratio, confidence intervals and chi-square test. The prevalence of AP with adequate endodontic treatment was low (16.5%). This percentage dropped to 12.1% in cases with adequate root filling and adequate coronal restoration. Teeth with adequate endodontic treatment and poor coronal restoration had an AP prevalence of 27.9%. AP increased to 71.7% in teeth with poor endodontic treatment associated with poor coronal restoration. When poor endodontic treatment was combined with adequate coronal restoration, AP prevalence was 61.8%. The prevalence of AP was low when associated with high technical quality of root canal treatment. Poor coronal restoration increased the risk of AP even when endodontic treatment was adequate (OR=2.80; 95%CI=1.87-4.22). The presence of intracanal posts had no influence on AP prevalence.
Abstract:
In order to assess the prevalence of and risk factors for aminoglycoside-associated nephrotoxicity in intensive care units (ICUs), we evaluated 360 consecutive patients starting aminoglycoside therapy in an ICU. The patients had a baseline calculated glomerular filtration rate (cGFR) of ≥30 ml/min/1.73 m2. Among these patients, 209 (58 per cent) developed aminoglycoside-associated nephrotoxicity (the acute kidney injury [AKI] group, which consisted of individuals with a decrease in cGFR of >20 per cent from the baseline cGFR), while 151 did not (non-AKI group). Both groups had similar baseline cGFRs. The AKI group developed a lower cGFR nadir (45 ± 27 versus 79 ± 39 ml/min/1.73 m2 for the non-AKI group; P < 0.001); was older (56 ± 18 years versus 52 ± 19 years for the non-AKI group; P = 0.033); had a higher prevalence of diabetes (19.6 per cent versus 9.3 per cent for the non-AKI group; P = 0.007); was more frequently treated with other nephrotoxic drugs (51 per cent versus 38 per cent for the non-AKI group; P = 0.024); used iodinated contrast more frequently (18 per cent versus 8 per cent for the non-AKI group; P = 0.0054); and showed a higher prevalence of hypotension (63 per cent versus 44 per cent for the non-AKI group; P = 0.0003), shock (56 per cent versus 31 per cent for the non-AKI group; P < 0.0001), and jaundice (19 per cent versus 8 per cent for the non-AKI group; P = 0.0036). The mortality rate was 44.5 per cent for the AKI group and 29.1 per cent for the non-AKI group (P = 0.0031). A logistic regression model identified as significant (P < 0.05) the following independent factors that affected aminoglycoside-associated nephrotoxicity: a baseline cGFR of <60 ml/min/1.73 m2 (odds ratio [OR], 0.42), diabetes (OR, 2.13), treatment with other nephrotoxins (OR, 1.61) or iodinated contrast (OR, 2.13), and hypotension (OR, 1.83). In conclusion, AKI was frequent among ICU patients receiving an aminoglycoside, and it was associated with a high rate of mortality. The presence of diabetes or hypotension and the use of other nephrotoxic drugs and iodinated contrast were independent risk factors for the development of aminoglycoside-associated nephrotoxicity.
Abstract:
This multicentric population-based study in Brazil is the first national effort to estimate the prevalence of hepatitis B (HBV) and its risk factors in the capital cities of the Northeast, Central-West, and Federal Districts (2004-2005). Random multistage cluster sampling was used to select persons 13-69 years of age. Markers for HBV were tested by enzyme-linked immunosorbent assay. The HBV genotypes were determined by sequencing hepatitis B surface antigen (HBsAg). Multivariate analyses and a simple catalytic model were performed. Overall, 7,881 persons were included; fewer than 70 per cent were not vaccinated. Positivity for HBsAg was less than 1 per cent among non-vaccinated persons, and genotypes A, D, and F co-circulated. The incidence of infection increased with age, with a similar force of infection in all regions. Male sex and having initiated sexual activity were associated with HBV infection in the two settings; healthcare jobs and prior hospitalization were risk factors in the Federal District. Our survey classified these regions as areas of low HBV endemicity and highlighted the differences in risk factors among the settings.
Abstract:
Background Associations between aplastic anemia and numerous drugs, pesticides and chemicals have been reported. However, at least 50% of the etiology of aplastic anemia remains unexplained. Design and Methods This was a case-control, multicenter, multinational study designed to identify risk factors for agranulocytosis and aplastic anemia. The cases were patients with a diagnosis of aplastic anemia confirmed through biopsy or bone marrow aspiration, selected through an active search of clinical laboratories, hematology clinics and medical records. The controls had neither aplastic anemia nor chronic diseases. A total of 224 patients with aplastic anemia were included in the study; each case was paired with four controls according to sex, age group, and the hospital where the case was first seen. Information was collected on demographic data, medical history, laboratory tests, medications, and other potential risk factors prior to diagnosis. Results The incidence of aplastic anemia was 1.6 cases per million per year. Higher rates of benzene exposure (>= 30 exposures per year) were associated with a greater risk of aplastic anemia (odds ratio, OR: 4.2; 95% confidence interval, CI: 1.82-9.82). Individuals exposed to chloramphenicol in the previous year had an adjusted OR for aplastic anemia of 8.7 (CI: 0.87-87.93), and those exposed to azithromycin had an adjusted OR of 11.02 (CI: 1.14-108.02). Conclusions The incidence of aplastic anemia in Latin American countries is low. Although the research study centers had high coverage of health services, underreporting of cases of aplastic anemia in the selected regions cannot be ruled out. Frequent exposure to benzene-based products increases the risk of aplastic anemia. Few associations with specific drugs were found, and it is likely that some of these were due to chance alone.
Abstract:
Objective: To determine whether the presence of in vitro penicillin-resistant Streptococcus pneumoniae increases the risk of clinical failure in children hospitalised with severe pneumonia and treated with penicillin/ampicillin. Design: Multicentre, prospective, observational study. Setting: 12 tertiary-care centres in three countries in Latin America. Patients: 240 children aged 3-59 months, hospitalised with severe pneumonia and known in vitro susceptibility of S pneumoniae. Intervention: Patients were treated with intravenous penicillin/ampicillin after collection of blood and, when possible, pleural fluid for culture. The minimal inhibitory concentration (MIC) test was used to determine penicillin susceptibility of the pneumococcal strains isolated. Children were continuously monitored until discharge. Main outcome measures: The primary outcome was treatment failure (using clinical criteria). Results: Overall treatment failure was 21%. After allowing for different potential confounders, there was no evidence of an association between treatment failure and in vitro resistance of S pneumoniae to penicillin according to the Clinical Laboratory Standards Institute (CLSI)/National Committee for Clinical Laboratory Standards (NCCLS) interpretative standards (adjusted RR = 1.03; 95% CI: 0.49-1.90 for resistant S pneumoniae). Conclusions: Intravenous penicillin/ampicillin remains the drug of choice for treating penicillin-resistant pneumococcal pneumonia in areas where the MIC does not exceed 2 μg/ml.