884 results for Inhalation dose and risk


Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND Infectious diseases and social contacts in early life have been proposed to modulate brain tumour risk during late childhood and adolescence. METHODS CEFALO is an interview-based case-control study in Denmark, Norway, Sweden and Switzerland, including children and adolescents aged 7-19 years with primary intracranial brain tumours diagnosed between 2004 and 2008 and matched population controls. RESULTS The study included 352 cases (participation rate: 83%) and 646 controls (71%). There was no association with various measures of social contacts: daycare attendance, number of child-hours at daycare, attending baby groups, birth order or living with other children. Cases of glioma and embryonal tumours had more frequent sick days with infections in the first 6 years of life compared with controls. In 7-19 year olds with 4+ monthly sick days, the respective odds ratios were 2.93 (95% confidence interval: 1.57-5.50) and 4.21 (95% confidence interval: 1.24-14.30). INTERPRETATION There was little support for the hypothesis that social contacts influence childhood and adolescent brain tumour risk. The association between reported sick days due to infections and risk of glioma and embryonal tumour may reflect involvement of immune functions, recall bias or inverse causality and deserves further attention.
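The odds ratios above come from adjusted case-control models; as a minimal sketch of the underlying quantity, an unadjusted odds ratio with a Woolf (log-based) 95% confidence interval can be computed from a 2×2 table. The cell counts below are hypothetical, not CEFALO data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio for a 2x2 table with a Woolf 95% CI.
    a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts, for illustration only:
or_, lo, hi = odds_ratio_ci(40, 312, 30, 616)
```

Adjusted estimates like those reported would come from conditional logistic regression on the matched sets, not from this simple cross-product.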


OBJECTIVE The aim of this study was to assess the association between frailty and risk for heart failure (HF) in older adults. BACKGROUND Frailty is common in the elderly and is associated with adverse health outcomes. The impact of frailty on HF risk is not known. METHODS We assessed the association between frailty, using the Health ABC Short Physical Performance Battery (HABC Battery) and the Gill index, and incident HF in 2825 participants aged 70 to 79 years. RESULTS The mean age of participants was 74 ± 3 years; 48% were men and 59% were white. During a median follow-up of 11.4 (7.1-11.7) years, 466 participants developed HF. Compared to non-frail participants, moderate (HR 1.36, 95% CI 1.08-1.71) and severe frailty (HR 1.88, 95% CI 1.02-3.47) by the Gill index were associated with a higher risk for HF. HABC Battery score was linearly associated with HF risk after adjusting for the Health ABC HF Model (HR 1.24, 95% CI 1.13-1.36 per SD decrease in score) and remained significant when controlled for death as a competing risk (HR 1.30; 95% CI 1.00-1.55). Results were comparable across age, sex, and race, and in sub-groups based on diabetes mellitus or cardiovascular disease at baseline. Addition of HABC Battery scores to the Health ABC HF Risk Model improved discrimination (change in C-index, 0.014; 95% CI 0.018-0.010) and appropriately reclassified 13.4% (net-reclassification-improvement 0.073, 95% CI 0.021-0.125; P = .006) of participants (8.3% who developed HF and 5.1% who did not). CONCLUSIONS Frailty is independently associated with risk of HF in older adults.
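The net reclassification improvement reported above has a simple arithmetic definition; a hedged sketch of the categorical form (the counts below are hypothetical, not Health ABC data):

```python
def net_reclassification_improvement(up_ev, down_ev, n_ev, up_ne, down_ne, n_ne):
    """Categorical NRI: net correct upward reclassification among events
    plus net correct downward reclassification among non-events.
    up_ev/down_ev = events moved to a higher/lower risk category,
    up_ne/down_ne = non-events moved to a higher/lower risk category."""
    return (up_ev - down_ev) / n_ev + (down_ne - up_ne) / n_ne

# Hypothetical reclassification table:
nri = net_reclassification_improvement(
    up_ev=50, down_ev=30, n_ev=400,
    up_ne=100, down_ne=150, n_ne=2000,
)
```

A positive NRI means the added predictor moves events up and non-events down more often than the reverse, which is the direction reported for the HABC Battery score.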


BACKGROUND Mortality risk for people with chronic kidney disease is substantially greater than that for the general population, increasing to a 7-fold greater risk for those on dialysis therapy. Higher body mass index, generally due to higher energy intake, appears protective for people on dialysis therapy, but the relationship between energy intake and survival in those with reduced kidney function is unknown. STUDY DESIGN Prospective cohort study with a median follow-up of 14.5 (IQR, 11.2-15.2) years. SETTING & PARTICIPANTS Blue Mountains Area, west of Sydney, Australia. Participants in the general community enrolled in the Blue Mountains Eye Study (n=2,664) who underwent a detailed interview, food frequency questionnaire, and physical examination including body weight, height, blood pressure, and laboratory tests. PREDICTORS Relative energy intake, food components (carbohydrates, total sugars, fat, protein, and water), and estimated glomerular filtration rate (eGFR). Relative energy intake was dichotomized at 100%, and eGFR at 60 mL/min/1.73 m². OUTCOMES All-cause and cardiovascular mortality. MEASUREMENTS All-cause and cardiovascular mortality using unadjusted and adjusted Cox proportional regression models. RESULTS 949 people died during follow-up, 318 of cardiovascular events. In people with eGFR <60 mL/min/1.73 m² (n=852), there was an increased risk of all-cause mortality (HR, 1.48; P=0.03), but no increased risk of cardiovascular mortality (HR, 1.59; P=0.1), among those with higher relative energy intake compared with those with lower relative energy intake. Increasing intake of carbohydrates (HR per 100 g/d, 1.50; P=0.04) and total sugars (HR per 100 g/d, 1.62; P=0.03) was significantly associated with increased risk of cardiovascular mortality. LIMITATIONS Under-reporting of energy intake, baseline laboratory and food intake values only, white population.
CONCLUSIONS Increasing relative energy intake was associated with increased all-cause mortality in patients with eGFR <60 mL/min/1.73 m². This effect may be mediated by the impact of increasing total sugars intake on subsequent cardiovascular events.


Bovine tuberculosis (bTB) caused by Mycobacterium bovis or M. caprae has recently (re-)emerged in livestock and wildlife in all countries bordering Switzerland (CH) and the Principality of Liechtenstein (FL). Comprehensive data for Swiss and Liechtenstein wildlife are not yet available, although two native species, wild boar (Sus scrofa) and red deer (Cervus elaphus elaphus), act as bTB reservoirs elsewhere in continental Europe. Our aims were (1) to assess the occurrence of bTB in these wild ungulates in CH/FL and to reinforce scanning surveillance in all wild mammals; and (2) to evaluate the risk of future bTB reservoir formation in wild boar and red deer in CH/FL. Tissue samples collected from 2009 to 2011 from 434 hunted red deer and wild boar and from eight diseased ungulates with tuberculosis-like lesions were tested by direct real-time PCR and culture to detect mycobacteria of the Mycobacterium tuberculosis complex (MTBC). Identification of suspicious colonies was attempted by real-time PCR, genotyping and spoligotyping. Information on risk factors for bTB maintenance within wildlife populations was retrieved from the literature, and the situation regarding the identified factors was assessed for our study areas. Mycobacteria of the MTBC were detected in six of 165 wild boar (3.6%; 95% CI: 1.4-7.8) but in none of the 269 red deer (0%; 0-1.4). M. microti was identified in two MTBC-positive wild boar, while species identification remained unsuccessful in four cases. The main risk factors for bTB maintenance worldwide, including various causes of aggregation often resulting from intensive wildlife management, are largely absent in CH and FL. In conclusion, M. bovis and M. caprae were not detected, but we report MTBC mycobacteria in Swiss wild boar for the first time. Present conditions seem unfavorable for the emergence of a reservoir; nevertheless, growing wild ungulate populations and the consumption of offal may represent a risk.
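The reported interval for 6 positives out of 165 (1.4-7.8%) is consistent with an exact (Clopper-Pearson) method. As a hedged sketch, a Wilson score interval, which needs only the standard library, gives a similar range for the same counts:

```python
import math

def wilson_ci(k, n, z=1.96):
    """Wilson score 95% confidence interval for a binomial proportion k/n."""
    p = k / n
    denom = 1 + z**2 / n
    center = p + z**2 / (2 * n)
    margin = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return (center - margin) / denom, (center + margin) / denom

# 6 MTBC-positive wild boar out of 165 sampled:
lo, hi = wilson_ci(6, 165)  # roughly (0.017, 0.077)
```

The small numerical difference from the published bounds reflects the choice of interval method, not the data.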


OBJECTIVES: To assess health care utilisation for patients co-infected with TB and HIV (TB-HIV), and to develop a weighted health care index (HCI) score based on commonly used interventions and compare it with patient outcome. METHODS: A total of 1061 HIV patients diagnosed with TB in four regions (Central/Northern Europe, Southern Europe, Eastern Europe and Argentina) between January 2004 and December 2006 were enrolled in the TB-HIV study. A weighted HCI score (range 0–5) was derived from independent prognostic factors identified in multivariable Cox models; the final score included performance of TB drug susceptibility testing (DST), an initial TB regimen containing a rifamycin, isoniazid and pyrazinamide, and start of combination antiretroviral treatment (cART). RESULTS: The mean HCI score was highest in Central/Northern Europe (3.2, 95%CI 3.1–3.3) and lowest in Eastern Europe (1.6, 95%CI 1.5–1.7). The cumulative probability of death 1 year after TB diagnosis decreased from 39% (95%CI 31–48) among patients with an HCI score of 0, to 9% (95%CI 6–13) among those with a score of ≥4. In an adjusted Cox model, a 1-unit increase in the HCI score was associated with 27% reduced mortality (relative hazard 0.73, 95%CI 0.64–0.84). CONCLUSIONS: Our results suggest that DST, standard anti-tuberculosis treatment and early cART may improve outcome for TB-HIV patients. The proposed HCI score provides a tool for future research and monitoring of the management of TB-HIV patients. The highest HCI score may serve as a benchmark to assess TB-HIV management, encouraging continuous health care improvement.
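Under the proportional-hazards assumption, the per-unit relative hazard of 0.73 compounds multiplicatively across the score range, so a score of 4 versus 0 implies roughly a 72% lower hazard. A short check of that arithmetic:

```python
rh_per_unit = 0.73            # adjusted relative hazard per 1-unit HCI increase
rh_4_vs_0 = rh_per_unit ** 4  # hazard for HCI score 4 relative to score 0
reduction = 1 - rh_4_vs_0     # fraction by which the hazard is lower, ~0.72
```

This back-of-the-envelope figure is broadly consistent with the observed drop in 1-year mortality from 39% (score 0) to 9% (score ≥4), though the published percentages come from the full adjusted model, not this simple extrapolation.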


BACKGROUND Empirical research has illustrated an association between study size and relative treatment effects, but conclusions have been inconsistent about the association of study size with the risk of bias items. Small studies generally yield imprecisely estimated treatment effects, and study variance can serve as a surrogate for study size. METHODS We conducted a network meta-epidemiological study analyzing 32 networks including 613 randomized controlled trials, and used Bayesian network meta-analysis and meta-regression models to evaluate the impact of trial characteristics and study variance on the results of network meta-analysis. We examined changes in relative effects and between-studies variation in network meta-regression models as a function of the variance of the observed effect size and indicators for the adequacy of each risk of bias item. Adjustment was performed both within and across networks, allowing for between-networks variability. RESULTS Imprecise studies with large variances tended to exaggerate the effects of the active or new intervention in the majority of networks, with a ratio of odds ratios of 1.83 (95% CI: 1.09, 3.32). Inappropriate or unclear conduct of random sequence generation and allocation concealment, as well as lack of blinding of patients and outcome assessors, did not materially affect the summary results. Imprecise studies also appeared to be more prone to inadequate conduct. CONCLUSIONS Compared to more precise studies, studies with large variance may give substantially different answers that alter the results of network meta-analyses for dichotomous outcomes.
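The models here are Bayesian network meta-regressions, but the basic role of study variance can be illustrated with a much simpler fixed-effect inverse-variance pool of log odds ratios: precise studies dominate the summary, imprecise ones contribute little. The inputs below are hypothetical, not drawn from the 32 networks.

```python
import math

def pooled_log_or(log_ors, variances):
    """Fixed-effect inverse-variance pooled log odds ratio and its variance.
    Studies are weighted by 1/variance, so imprecise (large-variance)
    studies get small weights."""
    weights = [1 / v for v in variances]
    pooled = sum(w * y for w, y in zip(weights, log_ors)) / sum(weights)
    return pooled, 1 / sum(weights)

# Hypothetical pair: one precise study, one imprecise outlier
pooled, var = pooled_log_or([math.log(1.2), math.log(3.0)], [0.04, 0.40])
```

In this toy pool the precise study (variance 0.04) pulls the summary toward its own estimate; the meta-epidemiological finding above is about what happens when the imprecise studies are also systematically exaggerated.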


Glutathione S-transferase (GST) genes detoxify and metabolize carcinogens, including oxygen free radicals, which may contribute to salivary gland carcinogenesis. This cancer center-based case-control association study included 166 patients with incident salivary gland carcinoma (SGC) and 511 cancer-free controls. We performed multiplex polymerase chain reaction-based polymorphism genotyping assays for GSTM1 and GSTT1 null genotypes. Odds ratios (ORs) and 95% confidence intervals (CIs) were calculated with multivariable logistic regression analyses adjusted for age, sex, ethnicity, tobacco use, family history of cancer, alcohol use and radiation exposure. In our results, 27.7% of the SGC cases and 20.6% of the controls were null for GSTT1 (P = 0.054), and 53.0% of the SGC cases and 50.9% of the controls were null for GSTM1 (P = 0.633). The results of the adjusted multivariable regression analysis suggested that the GSTT1 null genotype was associated with a significantly increased risk for SGC (odds ratio 1.5, 95% confidence interval 1.0-2.3). Additionally, 13.9% of the SGC cases but only 8.4% of the controls were null for both genes, and the results of the adjusted multivariable regression analysis suggested that having both null genotypes was significantly associated with an approximately 2-fold increased risk for SGC (odds ratio 1.9, 95% confidence interval 1.0-3.5). The presence of the GSTT1 null genotype and the simultaneous presence of the GSTM1 and GSTT1 null genotypes appear to be associated with significantly increased SGC risk. These findings warrant further study with larger sample sizes.


Methylating agents are involved in carcinogenesis, and the DNA repair protein O(6)-methylguanine-DNA methyltransferase (MGMT) removes methyl groups from O(6)-methylguanine. Genetic variation in DNA repair genes has been shown to contribute to susceptibility to squamous cell carcinoma of the head and neck (SCCHN). We hypothesize that MGMT polymorphisms are associated with risk of SCCHN. In a hospital-based case-control study of 721 patients with SCCHN and 1234 cancer-free controls frequency-matched by age, sex and ethnicity, we genotyped four MGMT polymorphisms: two in exon 3, 16195C>T and 16286C>T, and two in the promoter region, 45996G>T and 46346C>A. We found that none of these polymorphisms alone had a significant effect on risk of SCCHN. However, when these four polymorphisms were evaluated together by the number of putative risk genotypes (i.e. 16195CC, 16286CC, 45996GT+TT, and 46346CA+AA), a statistically significantly increased risk of SCCHN was associated with combined genotypes carrying three or four risk genotypes, compared with those carrying zero to two risk genotypes (adjusted odds ratio (OR)=1.27; 95% confidence interval (CI)=1.05-1.53). This increased risk was also more pronounced among young subjects (OR=1.81; 95% CI=1.11-2.96), men (OR=1.24; 95% CI=1.00-1.55), ever smokers (OR=1.25; 95% CI=1.01-1.56), ever drinkers (OR=1.29; 95% CI=1.04-1.60), patients with oropharyngeal cancer (OR=1.45; 95% CI=1.12-1.87), and oropharyngeal cancer with regional lymph node metastasis (OR=1.52; 95% CI=1.16-1.89). In conclusion, our results suggest that any one MGMT variant may not have a substantial effect on SCCHN risk, but a joint effect of several MGMT variants may contribute to risk and progression of SCCHN, particularly for oropharyngeal cancer, in non-Hispanic whites.
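The combined analysis groups subjects by how many of the four putative risk genotypes they carry. A minimal sketch of that counting step, using the genotype labels from the abstract (the subject record below is hypothetical):

```python
# Putative risk genotypes per locus, as defined in the abstract:
RISK_GENOTYPES = {
    "16195": {"CC"},
    "16286": {"CC"},
    "45996": {"GT", "TT"},
    "46346": {"CA", "AA"},
}

def risk_genotype_count(subject):
    """Number of putative MGMT risk genotypes (0-4) carried by a subject,
    given a mapping of locus -> observed genotype."""
    return sum(subject.get(locus) in risky
               for locus, risky in RISK_GENOTYPES.items())

# Hypothetical subject: risk genotypes at 16195 and 45996 only
subject = {"16195": "CC", "16286": "CT", "45996": "TT", "46346": "CC"}
count = risk_genotype_count(subject)  # 2 -> falls in the 0-2 reference group
```

Subjects with a count of 3 or 4 form the elevated-risk stratum compared in the adjusted model above.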


AIM To compare computed tomography (CT) dose and image quality between filtered back projection, iterative reconstruction, and CT with a minimal-electronic-noise detector. METHODS A lung phantom (Chest Phantom N1 by Kyoto Kagaku) was scanned with 3 different CT scanners: the Somatom Sensation, the Definition Flash and the Definition Edge (all from Siemens, Erlangen, Germany). The scan parameters were identical to the Siemens presetting for THORAX ROUTINE (scan length 35 cm and FOV 33 cm). Nine different exposure levels were examined (reference mAs/peak voltage): 100/120, 100/100, 100/80, 50/120, 50/100, 50/80, 25/120, 25/100 and 25 mAs/80 kVp. Images from the Somatom Sensation were reconstructed using classic filtered back projection. Iterative reconstruction (SAFIRE, level 3) was performed for the two other scanners. A Stellar detector was used with the Somatom Definition Edge. The CT doses were represented by the dose length products (DLPs, in mGy·cm) provided by the scanners. Signal, contrast, noise and subjective image quality were recorded by two radiologists with 10 and 3 years of experience in chest CT radiology. To determine the average dose reduction between two scanners, the integral of the dose difference was calculated from the lowest to the highest noise level. RESULTS When using iterative reconstruction (IR) instead of filtered back projection (FBP), the average dose reduction was 30%, 52% and 80% for bone, soft tissue and air, respectively, for the same image quality (P < 0.0001). The recently introduced Stellar detector (Sd) lowered the radiation dose by an additional 27%, 54% and 70% for bone, soft tissue and air, respectively (P < 0.0001). The benefit of dose reduction was larger at lower dose levels. With the same radiation dose, an average of 34% (22%-37%) and 25% (13%-46%) more contrast to noise was achieved by changing from FBP to IR and from IR to Sd, respectively.
For the same contrast-to-noise level, an average dose reduction of 59% (46%-71%) and 51% (38%-68%) was achieved for IR and Sd, respectively. For the same subjective image quality, the dose could be reduced by 25% (2%-42%) and 44% (33%-54%) using IR and Sd, respectively. CONCLUSION This study showed an average dose reduction between 27% and 70% for the new Stellar detector, which is equivalent to using IR instead of FBP.
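The contrast-to-noise comparisons above reduce to a simple ratio of region-of-interest measurements; a hedged sketch of that metric (the ROI values below are hypothetical, not the phantom measurements):

```python
def contrast_to_noise(signal_roi, background_roi, noise_sd):
    """Contrast-to-noise ratio (CNR) between two regions of interest:
    absolute signal difference divided by the noise standard deviation."""
    return abs(signal_roi - background_roi) / noise_sd

# Hypothetical mean HU values and noise SD from two ROIs:
cnr = contrast_to_noise(120.0, 40.0, 10.0)  # -> 8.0
```

Because dose scales roughly with the square of the required signal-to-noise, even modest CNR gains from IR or a quieter detector translate into sizable dose reductions at matched image quality.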


We investigated the association between exposure to radio-frequency electromagnetic fields (RF-EMFs) from broadcast transmitters and childhood cancer. First, we conducted a time-to-event analysis including children under age 16 years living in Switzerland on December 5, 2000. Follow-up lasted until December 31, 2008. Second, all children living in Switzerland for some time between 1985 and 2008 were included in an incidence density cohort. RF-EMF exposure from broadcast transmitters was modeled. Based on 997 cancer cases, adjusted hazard ratios in the time-to-event analysis for the highest exposure category (>0.2 V/m) as compared with the reference category (<0.05 V/m) were 1.03 (95% confidence interval (CI): 0.74, 1.43) for all cancers, 0.55 (95% CI: 0.26, 1.19) for childhood leukemia, and 1.68 (95% CI: 0.98, 2.91) for childhood central nervous system (CNS) tumors. Results of the incidence density analysis, based on 4,246 cancer cases, were similar for all types of cancer and leukemia but did not indicate a CNS tumor risk (incidence rate ratio = 1.03, 95% CI: 0.73, 1.46). This large census-based cohort study did not suggest an association between predicted RF-EMF exposure from broadcasting and childhood leukemia. Results for CNS tumors were less consistent, but the most comprehensive analysis did not suggest an association.
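The incidence density analysis above rests on incidence rate ratios; a minimal sketch of that estimator with a log-based 95% CI (the case counts and person-time below are hypothetical, not the Swiss cohort data):

```python
import math

def incidence_rate_ratio(cases_exp, py_exp, cases_ref, py_ref, z=1.96):
    """Incidence rate ratio (exposed vs reference) with a log-based 95% CI.
    py_* are person-years at risk in each group."""
    irr = (cases_exp / py_exp) / (cases_ref / py_ref)
    se = math.sqrt(1 / cases_exp + 1 / cases_ref)
    lo = math.exp(math.log(irr) - z * se)
    hi = math.exp(math.log(irr) + z * se)
    return irr, lo, hi

# Hypothetical exposure groups:
irr, lo, hi = incidence_rate_ratio(30, 50_000, 400, 800_000)
```

The published estimates were additionally adjusted for confounders, so this crude ratio illustrates the quantity, not the exact modeling.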


Many persons in the U.S. gain weight during young adulthood, and the prevalence of obesity has been increasing among young adults. Although obesity and physical inactivity are generally recognized as risk factors for coronary heart disease (CHD), the magnitude of their effect on risk may have been seriously underestimated due to failure to adequately handle the problem of cigarette smoking. Since cigarette smoking causes weight loss, physically inactive cigarette smokers may remain relatively lean because they smoke cigarettes. We hypothesize that cigarette smoking modifies the association between weight gain during young adulthood and risk of coronary heart disease during middle age, and that the true effect of weight gain during young adulthood on risk of CHD can be assessed only in persons who have not smoked cigarettes. Specifically, we hypothesize that weight gain during young adulthood is positively associated with risk of CHD during middle age in nonsmokers, but that the association is much smaller or absent entirely among cigarette smokers. The purpose of this study was to test this hypothesis. The population for analysis comprised 1,934 middle-aged, employed men whose average age at the baseline examination was 48.7 years. Information collected at the baseline examinations in 1958 and 1959 included recalled weight at age 20, present weight, height, smoking status, and other CHD risk factors. To decrease the effect of intraindividual variation, the mean values of the 1958 and 1959 baseline examinations were used in analyses.
Change in body mass index (ΔBMI) during young adulthood was the primary exposure variable and was measured as BMI at baseline (kg/m²) minus BMI at age 20 (kg/m²). Proportional hazards regression analysis was used to generate relative risks of CHD mortality by category of ΔBMI and cigarette smoking status after adjustment for age, family history of CVD, major organ system disease, BMI at age 20, and number of cigarettes smoked per day. Adjustment was not performed for systolic blood pressure or total serum cholesterol, as these were regarded as intervening variables. Vital status was known for all men on the 25th anniversary of their baseline examinations. 705 deaths (including 319 CHD deaths) occurred over 40,136 person-years of experience. ΔBMI was positively associated with risk of CHD mortality in never-smokers, but not in ever-smokers (p for interaction = 0.067). For never-smokers with ΔBMI of stable, low gain, moderate gain, and high gain, adjusted relative risks were 1.00, 1.62, 1.61, and 2.78, respectively (p for trend = 0.010). For ever-smokers with ΔBMI of stable, low gain, moderate gain, and high gain, adjusted relative risks were 1.00, 0.74, 1.07, and 1.06, respectively (p for trend = 0.422). These results support the research hypothesis that cigarette smoking modifies the association between weight gain and CHD mortality. Current estimates of the magnitude of effect of obesity and physical inactivity on risk of coronary mortality may have been seriously underestimated due to inadequate handling of cigarette smoking.
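The follow-up figures above (319 CHD deaths over 40,136 person-years) imply an overall CHD mortality rate that is easy to verify:

```python
chd_deaths = 319
person_years = 40_136

# Crude CHD mortality rate across the whole cohort:
rate_per_1000_py = chd_deaths / person_years * 1000  # ~7.9 per 1000 person-years
```

This is the crude rate across smoking and ΔBMI strata combined; the stratum-specific relative risks reported in the abstract come from the adjusted proportional hazards model.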


A cohort of 418 United States Air Force (USAF) personnel from over 15 different bases deployed to Morocco in 1994. This was the first study of its kind and was designed with two primary goals: to determine whether the USAF was medically prepared to deploy with its changing mission in the new world order, and to evaluate factors that might improve or degrade USAF medical readiness. The mean length of deployment was 21 days. The cohort was 95% male, 86% enlisted, 65% married, and 78% white. This study shows major deficiencies, indicating that the USAF medical readiness posture has not fully responded to its new mission requirements. Lack of required logistical items (e.g., mosquito nets, rainboots, DEET insecticide cream) revealed a low state of preparedness. The most notable deficiency was that 82.5% (95% CI = 78.4, 85.9) did not have permethrin-pretreated mosquito nets and 81.0% (95% CI = 76.8, 84.6) lacked mosquito net poles. Additionally, 18% were deficient in vaccinations and 36% had not received a tuberculin skin test. Excluding injections, overall compliance with preventive medicine requirements had a mean frequency of only 50.6% (95% CI = 45.36, 55.90). Several factors had a positive impact on compliance with logistical requirements. The most prominent was receiving a medical intelligence briefing from USAF Public Health. After adjustment for mobility and age, individuals who underwent a briefing were 17.2 (95% CI = 4.37, 67.99) times more likely to have received an immunoglobulin shot and 4.2 (95% CI = 1.84, 9.45) times more likely to start their antimalarial prophylaxis at the proper time. "Personnel on mobility" had the second strongest positive effect on medical readiness.
When mobility and briefing were included in models, personnel on mobility were 2.6 (95% CI = 1.19, 5.53) times as likely to have DEET insecticide and 2.2 (95% CI = 1.16, 4.16) times as likely to have had a TB skin test. Five recommendations to improve the medical readiness of the USAF were outlined: upgrade base-level logistical support, improve medical intelligence messages, include medical requirements on travel orders, place more personnel on mobility or deploy only personnel on mobility, and conduct research dedicated to capitalizing on the powerful effect of predeployment briefings. Since this is the first study of its kind, more studies should be performed in different geographic theaters to assess medical readiness and establish acceptable compliance levels for the USAF.
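Associations like the briefing effect can be expressed as a relative risk from a cohort 2×2 table; a hedged sketch with hypothetical counts (the study's own estimates were adjusted for mobility and age, so this simpler unadjusted form would not reproduce them exactly):

```python
import math

def relative_risk(a, b, c, d, z=1.96):
    """Unadjusted relative risk for a cohort 2x2 table with a log-based 95% CI.
    a/b = exposed with/without outcome, c/d = unexposed with/without outcome."""
    p1 = a / (a + b)
    p0 = c / (c + d)
    rr = p1 / p0
    se = math.sqrt((1 - p1) / a + (1 - p0) / c)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical briefed vs non-briefed compliance counts:
rr, lo, hi = relative_risk(180, 60, 70, 108)
```

Very wide intervals like the 4.37-67.99 reported above typically reflect small cell counts in one of the exposure-outcome combinations.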


This study was conducted to determine the incidence and etiology of neonatal seizures, and to evaluate risk factors for this condition, in Harris County, Texas, between 1992 and 1994. Potential cases were ascertained from four sources: discharge diagnoses at local hospitals, birth certificates, death certificates, and a clinical study of neonatal seizures conducted concurrently with this study at a large tertiary care center in Houston, Texas. The neonatal period was defined as the first 28 days of life for term infants, and up to 44 weeks gestation for preterm infants. There were 207 cases of neonatal seizures ascertained among 116,048 live births, yielding an incidence of 1.8 per 1000. Half of the seizures occurred by the third day of life, 70% within the first week, and 93% within the first 28 days of life. Among 48 preterm infants with seizures, 15 had their initial seizure after the 28th day of life. About 25% of all seizures occurred after discharge from the hospital of birth. Idiopathic seizures occurred most frequently (0.5/1000 births), followed by seizures attributed to perinatal hypoxia/ischemia (0.4/1000 births), intracranial hemorrhage (0.2/1000 births), infection of the central nervous system (0.2/1000 births), and metabolic abnormalities (0.1/1000 births). Risk factors were evaluated based on birth certificate information, using univariate and multivariate (logistic regression) analysis. Factors considered included birth weight, gender, ethnicity, place of birth, mother's age, method of delivery, parity, multiple birth and, among term infants, small birth weight for gestational age (SGA). Among preterm infants, very low birth weight (VLBW, <1500 grams) was the strongest risk factor, followed by birth in private/university hospitals with a Level III nursery compared with hospitals with a Level II nursery (RR = 2.9), and male sex (RR = 1.8). The effect of very low birth weight varied according to ethnicity.
Compared to preterm infants weighing 2000-2999 grams, non-white VLBW infants were 12.0 times as likely to have seizures, whereas white VLBW infants were 2.5 times as likely. Among term infants, significant risk factors included SGA (RR = 1.8), birth in private/university hospitals with a Level III nursery versus hospitals with a Level II nursery (RR = 2.0), and birth by cesarean section (RR = 2.2).
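The headline incidence figure follows directly from the counts given above and is easy to verify:

```python
cases = 207
live_births = 116_048

# Incidence of neonatal seizures per 1000 live births:
incidence_per_1000 = cases / live_births * 1000  # ~1.8 per 1000
```

The etiology-specific rates quoted in the abstract (0.5, 0.4, 0.2, 0.2, and 0.1 per 1000) are subsets of this overall rate computed the same way from their respective case counts.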