120 results for "Inhalation dose and risk"
Abstract:
AIM To compare computed tomography (CT) dose and image quality for filtered back projection, for iterative reconstruction, and for CT with a minimal-electronic-noise detector. METHODS A lung phantom (Chest Phantom N1, Kyoto Kagaku) was scanned with 3 different CT scanners: the Somatom Sensation, the Somatom Definition Flash and the Somatom Definition Edge (all Siemens, Erlangen, Germany). The scan parameters were identical to the Siemens preset for THORAX ROUTINE (scan length 35 cm, FOV 33 cm). Nine different exposure levels were examined (reference mAs/peak voltage): 100/120, 100/100, 100/80, 50/120, 50/100, 50/80, 25/120, 25/100 and 25 mAs/80 kVp. Images from the Somatom Sensation were reconstructed using classic filtered back projection; iterative reconstruction (SAFIRE, level 3) was performed for the two other scanners. A Stellar detector was used with the Somatom Definition Edge. CT dose was represented by the dose-length product (DLP, in mGy·cm) provided by the scanners. Signal, contrast, noise and subjective image quality were recorded by two radiologists with 10 and 3 years of experience in chest CT. To determine the average dose reduction between two scanners, the integral of the dose difference was calculated from the lowest to the highest noise level. RESULTS When using iterative reconstruction (IR) instead of filtered back projection (FBP), the average dose reduction was 30%, 52% and 80% for bone, soft tissue and air, respectively, at the same image quality (P < 0.0001). The recently introduced Stellar detector (Sd) lowered the radiation dose by an additional 27%, 54% and 70% for bone, soft tissue and air, respectively (P < 0.0001). The benefit of dose reduction was larger at lower dose levels. At the same radiation dose, an average of 34% (22%-37%) and 25% (13%-46%) more contrast-to-noise was achieved by changing from FBP to IR and from IR to Sd, respectively. At the same contrast-to-noise level, an average dose reduction of 59% (46%-71%) and 51% (38%-68%) was achieved for IR and Sd, respectively. For the same subjective image quality, the dose could be reduced by 25% (2%-42%) and 44% (33%-54%) using IR and Sd, respectively. CONCLUSION This study showed an average dose reduction between 27% and 70% for the new Stellar detector, comparable to the reduction achieved by using IR instead of FBP.
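The averaging step described in METHODS (integrating the dose difference between two scanners from the lowest to the highest matched noise level) can be sketched with the trapezoidal rule. The dose-versus-noise values below are purely illustrative, not the study's measurements:

```python
# Illustrative dose (DLP, mGy*cm) at matched image-noise levels for two
# scanners; the numbers are made up for this sketch, not from the study.
noise = [10.0, 15.0, 20.0, 25.0, 30.0]          # image noise level
dlp_fbp = [400.0, 250.0, 160.0, 110.0, 80.0]    # filtered back projection
dlp_ir = [260.0, 150.0, 90.0, 60.0, 40.0]       # iterative reconstruction

def trapezoid(y, x):
    """Trapezoidal-rule integral of y over x."""
    return sum((y[i] + y[i + 1]) / 2 * (x[i + 1] - x[i])
               for i in range(len(x) - 1))

# Average relative dose reduction: integral of the dose difference,
# normalised by the integral of the reference (FBP) dose curve.
area_fbp = trapezoid(dlp_fbp, noise)
area_ir = trapezoid(dlp_ir, noise)
avg_reduction = (area_fbp - area_ir) / area_fbp
print(f"average dose reduction: {avg_reduction:.0%}")  # about 41% here
```

With real data the curves would be sampled at the scanners' measured noise levels rather than an even grid; the trapezoidal rule handles uneven spacing unchanged.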
Abstract:
We investigated the association between exposure to radio-frequency electromagnetic fields (RF-EMFs) from broadcast transmitters and childhood cancer. First, we conducted a time-to-event analysis including children under age 16 years living in Switzerland on December 5, 2000. Follow-up lasted until December 31, 2008. Second, all children living in Switzerland for some time between 1985 and 2008 were included in an incidence density cohort. RF-EMF exposure from broadcast transmitters was modeled. Based on 997 cancer cases, adjusted hazard ratios in the time-to-event analysis for the highest exposure category (>0.2 V/m) as compared with the reference category (<0.05 V/m) were 1.03 (95% confidence interval (CI): 0.74, 1.43) for all cancers, 0.55 (95% CI: 0.26, 1.19) for childhood leukemia, and 1.68 (95% CI: 0.98, 2.91) for childhood central nervous system (CNS) tumors. Results of the incidence density analysis, based on 4,246 cancer cases, were similar for all types of cancer and leukemia but did not indicate a CNS tumor risk (incidence rate ratio = 1.03, 95% CI: 0.73, 1.46). This large census-based cohort study did not suggest an association between predicted RF-EMF exposure from broadcasting and childhood leukemia. Results for CNS tumors were less consistent, but the most comprehensive analysis did not suggest an association.
Abstract:
OBJECTIVES This study sought to determine whether high intestinal cholesterol absorption represents a cardiovascular risk factor and to link ABCG8 and ABO variants to cardiovascular disease (CVD). BACKGROUND Plant sterol-enriched functional foods are widely used for cholesterol lowering. Their regular intake yields a 2-fold increase in circulating plant sterol levels, which equally represent markers of cholesterol absorption. Variants in ABCG8 and ABO have been associated with circulating plant sterol levels and CVD, thereby suggesting atherogenic effects of plant sterols or of cholesterol uptake. METHODS The cholestanol-to-cholesterol ratio (CR) was used as an estimate of cholesterol absorption because it is independent of plant sterols. First, we investigated the associations of 6 single nucleotide polymorphisms in ABCG8 and ABO with CR in the LURIC (LUdwigshafen RIsk and Cardiovascular Health study) and the YFS (Young Finns Study) cohorts. Second, we conducted a systematic review and meta-analysis to investigate whether CR might be related to CVD. RESULTS In LURIC, the minor alleles of rs4245791 and rs4299376 and the major alleles of rs41360247, rs6576629, and rs4953023 of the ABCG8 gene and the minor allele of rs657152 of the ABO gene were significantly associated with higher CR. Consistent results were obtained for rs4245791, rs4299376, rs6576629, and rs4953023 in YFS. The meta-analysis, including 6 studies and 4,362 individuals, found that CR was significantly increased in individuals with CVD. CONCLUSIONS High cholesterol absorption is associated with risk alleles in ABCG8 and ABO and with CVD. Harm caused by elevated cholesterol absorption rather than by plant sterols may therefore mediate the relationships of ABCG8 and ABO variants with CVD.
Abstract:
QUESTIONS UNDER STUDY We sought to identify reasons for late human immunodeficiency virus (HIV) testing or late presentation for care. METHODS A structured chart review was performed to obtain data on test- and health-seeking behaviour of patients presenting late with CD4 cell counts below 350 cells/µl or with acquired immunodeficiency syndrome (AIDS), at the Zurich centre of the Swiss HIV Cohort Study between January 2009 and December 2011. Logistic regression analyses were used to compare demographic characteristics of late presenters with those of non-late presenters. RESULTS Of 281 patients, 45% presented late, 48% were chronically HIV-infected non-late presenters, and an additional 7% fulfilled the <350 CD4 cells/µl criterion for late presentation, but chart review revealed that the lymphopenia was caused by acute HIV infection. Among the late presenters, 60% were first tested HIV positive in a private practice. More than half of the tests (60%) were suggested by a physician, and only 7% followed a specific risk situation. The majority (88%) of patients entered medical care within 1 month of testing HIV positive. Risk factors for late presentation were older age (odds ratio [OR] for ≥50 vs <30 years: 3.16, p = 0.017) and Asian versus Caucasian ethnicity (OR 3.5, p = 0.021). Compared with men who have sex with men (MSM) without a stable partnership, MSM in a stable partnership appeared less likely to present late (OR 0.50, p = 0.034), whereas heterosexual men in a stable partnership had 2.72-fold higher odds of presenting late (p = 0.049). CONCLUSIONS The frequency of late testing could be reduced by promoting awareness, particularly among older individuals and heterosexual men in stable partnerships.
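The odds ratios quoted above are standard logistic-regression outputs. As a reminder of the underlying arithmetic, the OR for a 2x2 exposure table is the cross-product ratio, with a Wald confidence interval computed on the log scale; the counts below are invented for illustration and are not the study's data:

```python
import math

# Hypothetical 2x2 table (illustrative, not from the study):
# late presentation vs not, by age group.
a, b = 30, 20   # age >= 50 years: late, not late
c, d = 40, 90   # age < 30 years:  late, not late

odds_ratio = (a * d) / (b * c)           # cross-product ratio
log_or = math.log(odds_ratio)
se = math.sqrt(1/a + 1/b + 1/c + 1/d)    # Wald SE of log(OR)
lo = math.exp(log_or - 1.96 * se)        # lower 95% bound
hi = math.exp(log_or + 1.96 * se)        # upper 95% bound
print(f"OR = {odds_ratio:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```

In a fitted logistic model the same quantity appears as the exponentiated coefficient, exp(beta), of the exposure term; the 2x2 version above is the unadjusted special case.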
Abstract:
Amyotrophic lateral sclerosis (ALS) has been associated with exposures in so-called 'electrical occupations'. It is unclear whether this possible link is explained by exposure to extremely low-frequency magnetic fields (ELF-MF) or by electrical shocks. We evaluated ALS mortality in 2000-2008 and exposure to ELF-MF and electrical shocks in the Swiss National Cohort, using job exposure matrices for occupations recorded at the 1990 and 2000 censuses. We compared 2.2 million workers with high or medium vs low exposure to ELF-MF and electrical shocks using Cox proportional hazards models. Mortality from ALS was higher in people who had medium or high ELF-MF exposure in both censuses (HR 1.55, 95% CI 1.11-2.15), but closer to unity for electrical shocks (HR 1.17, 95% CI 0.83-1.65). When both exposures were included in the same model, the HR for ELF-MF changed little (HR 1.56), whereas the HR for electrical shocks was attenuated to 0.97. In conclusion, there was an association between exposure to ELF-MF and mortality from ALS among workers with a higher likelihood of long-term exposure.
Abstract:
BACKGROUND International travel contributes to the worldwide spread of multidrug-resistant Gram-negative bacteria. Rates of travel-related faecal colonization with extended-spectrum β-lactamase (ESBL)-producing Enterobacteriaceae vary by destination; travellers returning from the Indian subcontinent in particular show high colonization rates. So far, nothing is known about region-specific risk factors for becoming colonized. METHODS An observational prospective multicentre cohort study investigated travellers to South Asia. Before and after travel, rectal swabs were screened for third-generation cephalosporin- and carbapenem-resistant Enterobacteriaceae. Participants completed questionnaires to identify risk factors for becoming colonized. Covariates were assessed univariately, followed by a multivariate regression. RESULTS One hundred and seventy persons were enrolled, the largest data set on travellers to the Indian subcontinent so far. The overall acquired colonization rate with ESBL-producing Escherichia coli was 69.4% (95% CI 62.1-75.9%), highest in travellers returning from India (86.8%; 95% CI 78.5-95.0%) and lowest in travellers returning from Sri Lanka (34.7%; 95% CI 22.9-48.7%). Associated risk factors were travel destination, length of stay, visiting friends and relatives, and eating ice cream and pastry. CONCLUSIONS High colonization rates with ESBL-producing Enterobacteriaceae were found in travellers returning from South Asia. Although risk factors were identified, a more common source, i.e. an environmental one, appears to better explain the high colonization rates.
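Proportions with 95% confidence intervals like those above can be approximated from raw counts. A minimal sketch using the normal (Wald) approximation follows; the counts are illustrative, and the study may have used a Wilson or exact (Clopper-Pearson) interval, which differs slightly:

```python
import math

# Hypothetical counts (illustrative): 118 of 170 travellers colonized.
colonized, n = 118, 170
p = colonized / n

# Wald 95% CI for a binomial proportion: p +/- 1.96 * sqrt(p(1-p)/n).
se = math.sqrt(p * (1 - p) / n)
lo, hi = p - 1.96 * se, p + 1.96 * se
print(f"colonization rate {p:.1%} (95% CI {lo:.1%}-{hi:.1%})")
```

The Wald interval is adequate for mid-range proportions with n in the hundreds; for small subgroups (like the per-country rates) a Wilson or exact interval is the safer choice.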
Abstract:
Poor udder health represents a serious problem in dairy production and has been investigated intensively, but heifers generally have not been the main focus of mastitis control. The aim of this study was to evaluate the prevalence, risk factors and consequences of heifer mastitis in Switzerland. The study included 166,518 heifers of different breeds (Swiss Red Pied, Swiss Brown Cattle and Holstein). Monthly somatic cell counts (SCCs) provided by the main dairy breeding organisations in Switzerland were monitored for 3 years; the prevalence of subclinical mastitis (SCM) was determined on the basis of SCCs ≥100,000 cells/mL at the first test date. The probability of having SCM at the first test date during lactation was modelled using logistic regression. Analysed factors included data on genetic background, morphological traits, geographical region, season of parturition and milk composition. The overall prevalence of SCM in heifers during the period from 2006 to 2010 was 20.6%. Higher frequencies of SCM were present in heifers of the Holstein breed (odds ratio, OR, 1.62), heifers with high fat:protein ratios (OR 1.97) and heifers with low milk urea concentrations combined with high milk protein concentrations (OR 3.97). Traits associated with a low risk of SCM were high-set udders, high overall breeding values and low milk breeding values. Heifers with SCM at the first test date had a higher risk of either developing chronic mastitis or leaving the herd prematurely.
Abstract:
BACKGROUND Although the possibility of bleeding during anticoagulant treatment may limit patients from taking part in physical activity, the association between physical activity and anticoagulation-related bleeding is uncertain. OBJECTIVES To determine whether physical activity is associated with bleeding in elderly patients taking anticoagulants. PATIENTS/METHODS In a prospective multicenter cohort study of 988 patients aged ≥65 years receiving anticoagulants for venous thromboembolism, we assessed patients' self-reported physical activity level. The primary outcome was the time to a first major bleeding, defined as fatal bleeding, symptomatic bleeding in a critical site, or bleeding causing a fall in hemoglobin or leading to transfusion. The secondary outcome was the time to a first clinically relevant non-major bleeding. We examined the association between physical activity level and time to a first bleeding using competing risk regression, accounting for death as a competing event. We adjusted for known bleeding risk factors and anticoagulation as a time-varying covariate. RESULTS During a mean follow-up of 22 months, patients with a low, moderate, and high physical activity level had an incidence of major bleeding of 11.6, 6.3, and 3.1 events per 100 patient-years, and an incidence of clinically relevant non-major bleeding of 14.0, 10.3, and 7.7 events per 100 patient-years, respectively. A high physical activity level was significantly associated with a lower risk of major bleeding (adjusted sub-hazard ratio 0.40, 95% CI 0.22-0.72). There was no association between physical activity and non-major bleeding. CONCLUSIONS A high level of physical activity is associated with a decreased risk of major bleeding in elderly patients receiving anticoagulant therapy.
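Incidence figures expressed as events per 100 patient-years, as in the RESULTS above, are simply event counts divided by accrued follow-up time. A minimal sketch with hypothetical counts (not the study's raw data):

```python
def incidence_per_100py(events: int, patient_years: float) -> float:
    """Incidence rate as events per 100 patient-years of follow-up."""
    return 100.0 * events / patient_years

# Hypothetical example: 21 major bleeds observed over 181 patient-years
# in one activity group (illustrative numbers only).
rate = incidence_per_100py(21, 181.0)
print(f"{rate:.1f} events per 100 patient-years")
```

Patient-years are summed over all subjects in the group (a subject followed for 18 months contributes 1.5 patient-years), which is why the denominator need not be a whole number.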
Abstract:
OBJECTIVES Valve-sparing root replacement (VSRR) is thought to reduce the rate of thromboembolic and bleeding events compared with mechanical aortic root replacement (MRR) using a composite graft, because it avoids oral anticoagulation. However, as VSRR carries a certain risk of subsequent reintervention, decision-making in the individual patient can be challenging. METHODS Of 100 Marfan syndrome (MFS) patients who underwent 169 aortic surgeries and have been followed at our institution since 1995, 59 consecutive patients without a history of dissection or prior aortic surgery underwent elective VSRR or MRR and were retrospectively analysed. RESULTS VSRR was performed in 29 patients (David n = 24, Yacoub n = 5) and MRR in 30 patients. The mean age was 33 ± 15 years. The mean follow-up after VSRR was 6.5 ± 4 years (180 patient-years) compared with 8.8 ± 9 years (274 patient-years) after MRR. Reoperation rates after root remodelling (Yacoub) were significantly higher than after the reimplantation (David) procedure (60 vs 4.2%, P = 0.01). The need for reintervention after the reimplantation procedure (0.8% per patient-year) was not significantly higher than after MRR (P = 0.44), but follow-up after VSRR was significantly shorter (P = 0.03). There was neither significant morbidity nor mortality associated with root reoperations. There were no neurological events after VSRR compared with four stroke/intracranial bleeding events in the MRR group (log-rank, P = 0.11), translating into an event rate of 1.46% per patient-year following MRR. CONCLUSION The calculated annual failure rate after VSRR using the reimplantation technique was lower than the annual risk of thromboembolic or bleeding events. Since the perioperative risk of reintervention following VSRR is low, patients might benefit from VSRR even if redo surgery becomes necessary during follow-up.