983 results for fixed-width confidence interval


Relevance: 100.00%

Abstract:

BACKGROUND To assess and compare the effectiveness and costs of the Phototest, Mini-Mental State Examination (MMSE), and Memory Impairment Screen (MIS) to screen for dementia (DEM) and cognitive impairment (CI). METHODS A phase III study was conducted over one year in consecutive patients with suspicion of CI or DEM at four Primary Care (PC) centers. After undergoing all screening tests at the PC center, participants were extensively evaluated by researchers blinded to screening test results in a Cognitive-Behavioral Neurology Unit (CBNU). The gold standard diagnosis was established by consensus of expert neurologists. Effectiveness was assessed by the proportion of correct diagnoses (diagnostic accuracy [DA]) and by the kappa index of concordance between test results and gold standard diagnoses. Costs were based on public prices and hospital accounts. RESULTS The study included 140 subjects (48 with DEM, 37 with CI without DEM, and 55 without CI). The MIS could not be applied to 23 illiterate subjects (16.4%). For DEM, the maximum effectiveness of the MMSE was obtained with different cutoff points as a function of educational level [k = 0.31 (95% confidence interval [95% CI], 0.19-0.43); DA = 0.60 (95% CI, 0.52-0.68)], and that of the MIS with a cutoff of 3/4 [k = 0.63 (95% CI, 0.48-0.78); DA = 0.83 (95% CI, 0.80-0.92)]. Effectiveness of the Phototest [k = 0.71 (95% CI, 0.59-0.83); DA = 0.87 (95% CI, 0.80-0.92)] was similar to that of the MIS and higher than that of the MMSE. Costs were higher with the MMSE (275.9 ± 193.3€ [mean ± SD, euros]) than with the Phototest (208.2 ± 196.8€) or MIS (201.3 ± 193.4€), whose costs did not significantly differ. For CI, effectiveness did not significantly differ between the MIS [k = 0.59 (95% CI, 0.45-0.74); DA = 0.79 (95% CI, 0.64-0.97)] and the Phototest [k = 0.58 (95% CI, 0.45-0.74); DA = 0.78 (95% CI, 0.64-0.95)] and was lowest for the MMSE [k = 0.27 (95% CI, 0.09-0.45); DA = 0.69 (95% CI, 0.56-0.84)]. Costs were higher for the MMSE (393.4 ± 121.8€) than for the Phototest (287.0 ± 197.4€) or MIS (300.1 ± 165.6€), whose costs did not significantly differ. CONCLUSION The MMSE is not an effective instrument in our setting. For both DEM and CI, the Phototest and MIS are more effective and less costly, with no difference between them. However, the MIS could not be applied to the appreciable percentage of our population who were illiterate.
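The diagnostic-accuracy intervals quoted above can be approximated with the standard Wald formula for a proportion. A minimal sketch (the abstract does not state which interval method was used, so this reproduces the arithmetic only approximately):

```python
import math

def wald_ci(p_hat: float, n: int, z: float = 1.96) -> tuple[float, float]:
    """Wald 95% CI for a proportion: p_hat +/- z * sqrt(p_hat * (1 - p_hat) / n)."""
    se = math.sqrt(p_hat * (1 - p_hat) / n)
    return p_hat - z * se, p_hat + z * se

# Diagnostic accuracy of the Phototest for dementia: DA = 0.87 in n = 140 subjects.
lo, hi = wald_ci(0.87, 140)
print(f"DA = 0.87, 95% CI {lo:.2f}-{hi:.2f}")  # 0.81-0.93
```

This gives roughly 0.81-0.93, close to the reported 0.80-0.92; an exact (e.g. Clopper-Pearson) interval would differ slightly at these sample sizes.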

Relevance: 100.00%

Abstract:

Toxoplasma gondii infection is an important cause of ocular disease in Brazil, where it is reported more frequently than elsewhere. Infection and pathology are characterized by a strong proinflammatory response, which in mice is triggered by interaction of the parasite with the toll-like receptor (TLR)/MyD88 pathway. A powerful way to identify the role of TLRs in humans is to determine whether polymorphisms at these loci influence susceptibility to T. gondii-mediated pathologies. Here we report on a small family-based study (60 families; 68 affected offspring) undertaken in Brazil, which was powered for large effect sizes using single nucleotide polymorphisms with minor allele frequencies > 0.3. Of markers in TLR2, TLR5 and TLR9 that met these criteria, we found an association [Family-Based Association Test (FBAT) Z score = 4.232; p = 1.5 x 10^-5; p corrected = 1.2 x 10^-4] between the C allele (frequency = 0.424; odds ratio = 7; 95% confidence interval 1.6-30.8) of rs352140 at TLR9 and toxoplasmic retinochoroiditis in Brazil. This supports the hypothesis that direct interaction between T. gondii and TLR9 may trigger proinflammatory responses that lead to severe pathologies such as the ocular disease associated with this infection in Brazil.
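The wide confidence interval around the reported odds ratio of 7 reflects the small sample. The usual log-scale (Woolf) construction for an OR from a 2x2 table can be sketched as follows; the counts are invented for illustration, and FBAT itself uses a family-based statistic rather than this generic calculation:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio ad/(bc) with a Woolf (log-scale) confidence interval."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical 2x2 counts (allele carrier vs. not, affected vs. not) -- NOT the study's data.
or_, lo, hi = odds_ratio_ci(20, 10, 5, 15)
print(f"OR = {or_:.1f}, 95% CI {lo:.2f}-{hi:.2f}")  # OR = 6.0, 95% CI 1.69-21.26
```

Even a sixfold odds ratio carries an interval spanning more than an order of magnitude at these counts, which mirrors the 1.6-30.8 interval in the abstract.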

Relevance: 100.00%

Abstract:

Addressing the risks of nanoparticles requires knowledge about release into the environment and occupational exposure. However, such information is not currently collected systematically, so quantitative data for risk assessment are lacking. The goal was to evaluate the current level of nanoparticle usage in Swiss industry as well as health, safety, and environmental measures, and the number of potentially exposed workers. A representative, stratified mail survey was conducted among 1626 clients of the Swiss National Accident Insurance Fund (SUVA), which insures 80,000 manufacturing firms, representing 84% of all Swiss manufacturing companies (947 companies answered the survey, a 58.3% response rate). The extrapolation to all Swiss manufacturing companies results in 1309 workers (95% confidence interval [CI]: 1073 to 1545) potentially exposed to nanoparticles in 586 companies (95% CI: 145 to 1027). This corresponds to 0.08% of workers (95% CI: 0.06% to 0.09%) and to 0.6% of companies (95% CI: 0.2% to 1.1%). The industrial chemistry sector showed the highest percentage of companies using nanoparticles (21.2%). Other important sectors also reported nanoparticle use. Personal protective equipment was the predominant protection strategy; only a few companies applied specific environmental protection measures. This is the first nationwide representative study on nanoparticle use in the manufacturing sector. The information gained can be used for quantitative risk assessment. It can also help policymakers design strategies to support companies in developing safer use of nanomaterials. Given the currently low use of nanoparticles, there is still time to proactively introduce protective methods. If the predicted "nano-revolution" comes true, now is the time to take action. [Supplementary materials are available for this article. Go to the publisher's online edition of the Journal of Occupational and Environmental Hygiene for the following free supplemental resource: a PDF file containing a detailed description of the approach to statistical analyses, an English translation of the questionnaire, additional information for Figure 1, and additional information for the SUVA-code.] [Authors]
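The extrapolation from survey respondents to all insured firms is, in outline, a scaled proportion with a sampling-error interval. A simplified sketch (the counts below are invented, and the real analysis accounted for the stratified survey design, which a simple random-sampling formula does not):

```python
import math

# Hypothetical illustration of survey extrapolation -- NOT the SUVA survey data.
n_sampled = 947        # responding companies (from the abstract)
n_using = 18           # invented count of respondents reporting nanoparticle use
population = 80_000    # insured manufacturing firms (from the abstract)

p_hat = n_using / n_sampled
se = math.sqrt(p_hat * (1 - p_hat) / n_sampled)   # simple random-sampling SE
est = population * p_hat
lo = population * (p_hat - 1.96 * se)
hi = population * (p_hat + 1.96 * se)
print(f"Estimated companies: {est:.0f} (95% CI {lo:.0f}-{hi:.0f})")
```

Stratified designs use stratum-specific weights and variances, so the real interval would be narrower or wider than this naive one depending on the strata.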

Relevance: 100.00%

Abstract:

Introduction. Critically ill patients suffer from oxidative stress caused by reactive oxygen species (ROS) and reactive nitrogen species (RNS). Although ROS/RNS are constantly produced under normal circumstances, critical illness can drastically increase their production. These patients have reduced plasma and intracellular levels of antioxidants and free electron scavengers or cofactors, and decreased activity of the enzymatic system involved in ROS detoxification. The pro-oxidant/antioxidant balance is of functional relevance during critical illness because it is involved in the pathogenesis of multiple organ failure. The objective of this study was to evaluate the relation between oxidative stress in critically ill patients, antioxidant vitamin intake, and severity of illness. Methods. Spectrophotometry was used to measure in plasma the total antioxidant capacity and levels of lipid peroxide, carbonyl group, total protein, bilirubin and uric acid at two time points: at intensive care unit (ICU) admission and on day seven. Daily diet records were kept and compliance with the recommended dietary allowance (RDA) of antioxidant vitamins (A, C and E) was assessed. Results. Between admission and day seven in the ICU, significant increases in lipid peroxide and carbonyl group were associated with decreased antioxidant capacity and greater deterioration in Sequential Organ Failure Assessment score. Worsening of oxidative stress parameters was significantly greater in patients who received antioxidant vitamins at below 66% of the RDA than in those who received them at above 66% of the RDA. An antioxidant vitamin intake from 66% to 100% of the RDA reduced the risk for worsening oxidative stress by 94% (odds ratio 0.06, 95% confidence interval 0.010 to 0.39), regardless of change in severity of illness (Sequential Organ Failure Assessment score). Conclusion. The critical condition of patients admitted to the ICU is associated with worsening oxidative stress. Intake of antioxidant vitamins below 66% of the RDA and alteration in endogenous levels of substances with antioxidant capacity are related to redox imbalance in critically ill patients. Therefore, intake of antioxidant vitamins should be carefully monitored so that it is as close as possible to the RDA.

Relevance: 100.00%

Abstract:

BACKGROUND: Chronic kidney disease (CKD) represents an increasing health burden. We present the population-based prevalence of CKD and compare the CKD Epidemiology Collaboration (CKD-EPI) and Modification of Diet in Renal Disease (MDRD) equations for estimating the glomerular filtration rate, using the revised CKD classification with three albuminuria classes. We also explore factors associated with CKD. METHODS: The Swiss population-based, cross-sectional CoLaus study conducted in Lausanne (2003-2006) included 2810 men and 3111 women aged 35-75 years. CKD prevalence was assessed using the CKD-EPI and MDRD equations, with albuminuria estimated by the albumin-to-creatinine ratio in spot morning urine. Multivariate logistic regression was used to analyse determinants of CKD. RESULTS: Prevalence [95% confidence interval (CI)] of all-stage CKD was 10.0% (9.2-10.8%) with CKD-EPI and 13.8% (12.9-14.6%) with MDRD. Using the revised CKD classification, the prevalence of the low-, medium-, high- and very high-risk groups was 90.0%, 8.46%, 1.18% and 0.35% with CKD-EPI, respectively. With MDRD, the corresponding values were 86.24%, 11.86%, 1.55% and 0.35%. Using the revised classification, CKD-EPI systematically reclassified people into a lower risk category than MDRD. Age and obesity were more strongly associated with CKD in men [odds ratio (95% CI): 2.23 (1.95-2.56) per 10 years and 3.05 (2.08-4.47), respectively] than in women [1.46 (1.29-1.65) and 1.78 (1.30-2.44), respectively]. Hypertension, type 2 diabetes, serum homocysteine and uric acid were independently and positively associated with CKD in both men and women. CONCLUSIONS: One in 10 adults suffers from CKD in the population of Lausanne. CKD-EPI systematically reclassifies people into a lower CKD risk category than MDRD. Serum homocysteine and uric acid levels are associated with CKD independently of classical risk factors such as age, hypertension and diabetes.

Relevance: 100.00%

Abstract:

BACKGROUND: Intracoronary administration of autologous bone marrow-derived mononuclear cells (BM-MNC) may improve remodeling of the left ventricle (LV) after acute myocardial infarction. The optimal time point of administration of BM-MNC is still uncertain and has rarely been addressed prospectively in randomized clinical trials. METHODS AND RESULTS: In a multicenter study, we randomized 200 patients with large, successfully reperfused ST-segment elevation myocardial infarction in a 1:1:1 ratio into an open-label control group and 2 BM-MNC treatment groups. In the BM-MNC groups, cells were administered either early (ie, 5 to 7 days) or late (ie, 3 to 4 weeks) after acute myocardial infarction. Cardiac magnetic resonance imaging was performed at baseline and after 4 months. The primary end point was the change in global LV ejection fraction from baseline to 4 months, compared between each treatment group and the control group. The absolute change in LV ejection fraction from baseline to 4 months was -0.4±8.8% (mean±SD; P=0.74 versus baseline) in the control group, 1.8±8.4% (P=0.12 versus baseline) in the early group, and 0.8±7.6% (P=0.45 versus baseline) in the late group. The treatment effect of BM-MNC as estimated by ANCOVA was 1.25 (95% confidence interval, -1.83 to 4.32; P=0.42) for the early therapy group and 0.55 (95% confidence interval, -2.61 to 3.71; P=0.73) for the late therapy group. CONCLUSIONS: Among patients with ST-segment elevation myocardial infarction and LV dysfunction after successful reperfusion, intracoronary infusion of BM-MNC at either 5 to 7 days or 3 to 4 weeks after acute myocardial infarction did not improve LV function at 4-month follow-up. CLINICAL TRIAL REGISTRATION: URL: http://www.clinicaltrials.gov. Unique identifier: NCT00355186.

Relevance: 100.00%

Abstract:

BACKGROUND: Recanalization in acute ischemic stroke with large-vessel occlusion is a potent indicator of good clinical outcome. OBJECTIVE: To identify easily available clinical and radiologic variables predicting recanalization at various occlusion sites. METHODS: All consecutive, acute stroke patients from the Acute STroke Registry and Analysis of Lausanne (2003-2011) who had a large-vessel occlusion on computed tomographic angiography (CTA) (< 12 h) were included. Recanalization status was assessed at 24 h (range: 12-48 h) with CTA, magnetic resonance angiography, or ultrasonography. Complete and partial recanalization (corresponding to the modified Treatment in Cerebral Ischemia scale 2-3) were grouped together. Patients were categorized according to occlusion site and treatment modality. RESULTS: Among 439 patients, 51% (224) showed complete or partial recanalization. In multivariate analysis, recanalization of any occlusion site was most strongly associated with endovascular treatment, including bridging therapy (odds ratio [OR] 7.1, 95% confidence interval [CI] 2.2-23.2), and less so with intravenous thrombolysis (OR 1.6, 95% CI 1.0-2.6) and recanalization treatments performed beyond guidelines (OR 2.6, 95% CI 1.2-5.7). Clot location (large vs. intermediate) and tandem pathology (the combination of intracranial occlusion and symptomatic extracranial stenosis) were other variables discriminating between recanalizers and non-recanalizers. For patients with intracranial occlusions, the variables significantly associated with recanalization after 24 h were: baseline National Institutes of Health Stroke Scale (NIHSS) (OR 1.04, 95% CI 1.02-1.1), Alberta Stroke Program Early CT Score (ASPECTS) on initial computed tomography (OR 1.2, 95% CI 1.1-1.3), and an altered level of consciousness (OR 0.2, 95% CI 0.1-0.5). CONCLUSIONS: Acute endovascular treatment is the single most important factor promoting recanalization in acute ischemic stroke. 
The presence of extracranial vessel stenosis or occlusion decreases recanalization rates. In patients with intracranial occlusions, higher NIHSS score and ASPECTS and normal vigilance facilitate recanalization. Clinical use of these predictors could influence recanalization strategies in individual patients.

Relevance: 100.00%

Abstract:

Several recent studies suggest that obesity may be a risk factor for fracture. The aim of this study was to investigate the association between body mass index (BMI) and future fracture risk at different skeletal sites. In prospective cohorts from more than 25 countries, baseline data on BMI were available in 398,610 women with an average age of 63 (range, 20-105) years and follow-up of 2.2 million person-years, during which 30,280 osteoporotic fractures (6457 hip fractures) occurred. Femoral neck BMD was measured in 108,267 of these women. Obesity (BMI ≥ 30 kg/m²) was present in 22%. A majority of osteoporotic fractures (81%) and hip fractures (87%) arose in non-obese women. Compared to a BMI of 25 kg/m², the hazard ratio (HR) for osteoporotic fracture at a BMI of 35 kg/m² was 0.87 (95% confidence interval [CI], 0.85-0.90). When adjusted for bone mineral density (BMD), however, the same comparison showed that the HR for osteoporotic fracture was increased (HR, 1.16; 95% CI, 1.09-1.23). Low BMI is a risk factor for hip and all osteoporotic fracture, but is a protective factor for lower leg fracture, whereas high BMI is a risk factor for upper arm (humerus and elbow) fracture. When adjusted for BMD, low BMI remained a risk factor for hip fracture but was protective for osteoporotic fracture, tibia and fibula fracture, distal forearm fracture, and upper arm fracture. When adjusted for BMD, high BMI remained a risk factor for upper arm fracture but was also a risk factor for all osteoporotic fractures. The association between BMI and fracture risk is complex, differs across skeletal sites, and is modified by the interaction between BMI and BMD. At a population level, high BMI remains a protective factor for most sites of fragility fracture. The contribution of increasing population rates of obesity to apparent decreases in fracture rates should be explored. © 2014 American Society for Bone and Mineral Research.

Relevance: 100.00%

Abstract:

For the last two decades, ultrasound (US) has been considered a surrogate for the gold standard in the evaluation of liver fibrosis in schistosomiasis. The use of magnetic resonance imaging (MRI) is not yet standardised for diagnosing and grading liver schistosomal fibrosis. The aim of this paper was to analyse MRI using an adaptation of the World Health Organization (WHO) patterns for US assessment of schistosomiasis-related morbidity. US and MRI were independently performed in 60 patients (42.1 ± 13.4 years old), including 37 men and 23 women with schistosomiasis. Liver involvement appraised by US and MRI was classified according to the WHO protocol, patterns A-F. Agreement between imaging methods was evaluated by the kappa index (k). The correlation between US and MRI was poor using WHO patterns [k = 0.14; confidence interval (CI), 0.02-0.26]. Even after grouping image patterns as "A-D", "Dc-E" and "Ec-F", the correlation between US and MRI remained weak (k = 0.39; CI, 0.21-0.58). The magnetic resonance adaptation used in our study did not confirm the US classification of WHO patterns for liver fibrosis.
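The kappa index used above to quantify US-MRI agreement can be computed directly from an agreement table. A sketch with hypothetical counts (not the study's data):

```python
def cohens_kappa(table):
    """Cohen's kappa for a square agreement table (rows: rater 1, cols: rater 2)."""
    n = sum(sum(row) for row in table)
    p_observed = sum(table[i][i] for i in range(len(table))) / n
    row_marginals = [sum(row) for row in table]
    col_marginals = [sum(col) for col in zip(*table)]
    p_expected = sum(r * c for r, c in zip(row_marginals, col_marginals)) / n**2
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical US-vs-MRI agreement on 60 patients, two grouped pattern classes.
k = cohens_kappa([[20, 10], [5, 25]])
print(f"kappa = {k:.2f}")  # kappa = 0.50
```

Kappa discounts the agreement expected by chance from the marginals, which is why 75% raw agreement here yields only k = 0.50.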

Relevance: 100.00%

Abstract:

INTRODUCTION Hemodynamic resuscitation should be aimed at achieving not only adequate cardiac output but also sufficient mean arterial pressure (MAP) to guarantee adequate tissue perfusion pressure. Since the arterial pressure response to volume expansion (VE) depends on arterial tone, knowing whether a patient is preload-dependent provides only a partial solution to the problem. The objective of this study was to assess the ability of a functional evaluation of arterial tone by dynamic arterial elastance (Ea(dyn)), defined as the pulse pressure variation (PPV) to stroke volume variation (SVV) ratio, to predict the hemodynamic response in MAP to fluid administration in hypotensive, preload-dependent patients with acute circulatory failure. METHODS We performed a prospective clinical study in an adult medical/surgical intensive care unit in a tertiary care teaching hospital, including 25 patients with controlled mechanical ventilation who were monitored with the Vigileo® monitor, for whom the decision to give fluids was made because of the presence of acute circulatory failure, including arterial hypotension (MAP ≤65 mmHg or systolic arterial pressure <90 mmHg) and a preserved preload-responsiveness condition, defined as an SVV value ≥10%. RESULTS Before fluid infusion, Ea(dyn) was significantly different between MAP responders (MAP increase ≥15% after VE) and MAP nonresponders. VE-induced increases in MAP were strongly correlated with baseline Ea(dyn) (r(2) = 0.83; P < 0.0001). The only predictor of MAP increase was Ea(dyn) (area under the curve, 0.986 ± 0.02; 95% confidence interval (CI), 0.84-1). A baseline Ea(dyn) value >0.89 predicted a MAP increase after fluid administration with a sensitivity of 93.75% (95% CI, 69.8%-99.8%) and a specificity of 100% (95% CI, 66.4%-100%).
CONCLUSIONS Functional assessment of arterial tone by Ea(dyn), measured as the PPV to SVV ratio, predicted the arterial pressure response after volume loading in hypotensive, preload-dependent patients under controlled mechanical ventilation.
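A sensitivity of 93.75% with a 95% CI of 69.8%-99.8% is consistent with 15 of 16 responders correctly predicted and an exact Clopper-Pearson interval. Assuming that was the method, the interval can be recovered by bisection on the binomial tail:

```python
import math

def binom_tail_ge(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p)."""
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(0, k + 1))

def clopper_pearson(k, n, alpha=0.05, tol=1e-9):
    """Exact (Clopper-Pearson) CI for k successes in n trials, by bisection."""
    def solve(too_small):
        lo, hi = 0.0, 1.0
        while hi - lo > tol:
            mid = (lo + hi) / 2
            if too_small(mid):
                lo = mid
            else:
                hi = mid
        return lo
    # Lower bound: p where P(X >= k | p) = alpha/2 (tail grows with p).
    lower = 0.0 if k == 0 else solve(lambda p: binom_tail_ge(k, n, p) < alpha / 2)
    # Upper bound: p where P(X <= k | p) = alpha/2 (CDF shrinks with p).
    upper = 1.0 if k == n else solve(lambda p: binom_cdf(k, n, p) > alpha / 2)
    return lower, upper

# 15 of 16 MAP responders correctly predicted: sensitivity 93.75%.
lo, hi = clopper_pearson(15, 16)
print(f"95% CI {lo:.1%} to {hi:.1%}")  # 95% CI 69.8% to 99.8%
```

The exact interval is very wide here because only 16 responders were observed, which is why the abstract's bounds span thirty percentage points.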

Relevance: 100.00%

Abstract:

OBJECTIVE: Whether or not a high risk of falls increases the risk of bleeding in patients receiving anticoagulants remains a matter of debate. METHODS: We conducted a prospective cohort study involving 991 patients ≥65 years of age who received anticoagulants for acute venous thromboembolism (VTE) at nine Swiss hospitals between September 2009 and September 2012. The study outcomes were the time to a first episode of major bleeding and the time to a first episode of clinically relevant nonmajor bleeding. We determined the associations between the risk of falls and the time to a first episode of bleeding using competing risk regression, accounting for death as a competing event. We adjusted for known bleeding risk factors and anticoagulation as a time-varying covariate. RESULTS: Four hundred fifty-eight of 991 patients (46%) were at high risk of falls. The mean duration of follow-up was 16.7 months. Patients at high risk of falls had a higher incidence of major bleeding (9.6 vs. 6.6 events/100 patient-years; P = 0.05) and a significantly higher incidence of clinically relevant nonmajor bleeding (16.7 vs. 8.3 events/100 patient-years; P < 0.001) than patients at low risk of falls. After adjustment, a high risk of falls was associated with clinically relevant nonmajor bleeding [subhazard ratio (SHR) = 1.74, 95% confidence interval (CI) = 1.23-2.46], but not with major bleeding (SHR = 1.24, 95% CI = 0.83-1.86). CONCLUSION: In elderly patients who receive anticoagulants because of VTE, a high risk of falls is significantly associated with clinically relevant nonmajor bleeding, but not with major bleeding. Whether or not a high risk of falls is a reason to withhold anticoagulation beyond 3 months should be decided on the basis of patient preferences and the risk of VTE recurrence.
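The incidence comparison above (16.7 vs. 8.3 events/100 patient-years) can be sketched as a Poisson rate ratio with a log-scale interval; the event and person-time counts below are invented to match those rates and are not the study's data:

```python
import math

def rate_ratio_ci(events1, pt1, events2, pt2, z=1.96):
    """Incidence rate ratio with a log-scale (Poisson) confidence interval."""
    rr = (events1 / pt1) / (events2 / pt2)
    se_log = math.sqrt(1 / events1 + 1 / events2)  # SE of log(rate ratio)
    lo = math.exp(math.log(rr) - z * se_log)
    hi = math.exp(math.log(rr) + z * se_log)
    return rr, lo, hi

# Hypothetical counts: 100 events in 600 patient-years (16.7/100 py)
# vs. 50 events in 600 patient-years (8.3/100 py).
rr, lo, hi = rate_ratio_ci(100, 600, 50, 600)
print(f"IRR = {rr:.1f}, 95% CI {lo:.2f}-{hi:.2f}")  # IRR = 2.0, 95% CI 1.42-2.81
```

This crude ratio ignores censoring and competing risks; the study's subhazard ratios come from competing risk regression, which handles both.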

Relevance: 100.00%

Abstract:

BACKGROUND Evidence associating exposure to water disinfection by-products with reduced birth weight and altered duration of gestation remains inconclusive. OBJECTIVE We assessed exposure to trihalomethanes (THMs) during pregnancy through different water uses and evaluated the association with birth weight, small for gestational age (SGA), low birth weight (LBW), and preterm delivery. METHODS Mother-child cohorts set up in five Spanish areas during the years 2000-2008 contributed data on water ingestion, showering, bathing, and swimming in pools. We ascertained residential THM levels during pregnancy periods through ad hoc sampling campaigns (828 measurements) and regulatory data (264 measurements), which were modeled and combined with personal water use and uptake factors to estimate personal uptake. We defined outcomes following standard definitions and included 2,158 newborns in the analysis. RESULTS Median residential THM ranged from 5.9 μg/L (Valencia) to 114.7 μg/L (Sabadell), and speciation differed across areas. We estimated that 89% of residential chloroform and 96% of brominated THM uptakes were from showering/bathing. The estimated change of birth weight for a 10% increase in residential uptake was -0.45 g (95% confidence interval: -1.36, 0.45 g) for chloroform and 0.16 g (-1.38, 1.70 g) for brominated THMs. Overall, THMs were not associated with SGA, LBW, or preterm delivery. CONCLUSIONS Despite the high THM levels in some areas and the extensive exposure assessment, results suggest that residential THM exposure during pregnancy driven by inhalation and dermal contact routes is not associated with birth weight, SGA, LBW, or preterm delivery in Spain.

Relevance: 100.00%

Abstract:

BACKGROUND: Knowledge of the number of recent HIV infections is important for epidemiologic surveillance. Over the past decade, approaches have been developed to estimate this number by testing HIV-seropositive specimens with assays that discriminate the lower concentration and avidity of HIV antibodies in early infection. We have investigated whether this "recency" information can also be gained from an HIV confirmatory assay. METHODS AND FINDINGS: The ability of a line immunoassay (INNO-LIA HIV I/II Score, Innogenetics) to distinguish recent from older HIV-1 infection was evaluated in comparison with the Calypte HIV-1 BED Incidence enzyme immunoassay (BED-EIA). Both tests were conducted prospectively in all HIV infections newly diagnosed in Switzerland from July 2005 to June 2006. Clinical and laboratory information indicative of recent or older infection was obtained from physicians at the time of HIV diagnosis and used as the reference standard. BED-EIA and various recency algorithms utilizing the antibody reaction to INNO-LIA's five HIV-1 antigen bands were evaluated by logistic regression analysis. A total of 765 HIV-1 infections, 748 (97.8%) with complete test results, were newly diagnosed during the study. A negative or indeterminate HIV antibody assay at diagnosis, symptoms of primary HIV infection, or a negative HIV test during the past 12 mo classified 195 infections (26.1%) as recent (≤12 mo). Symptoms of CDC stages B or C classified 161 infections as older (21.5%), and 392 patients with no symptoms remained unclassified. BED-EIA classified 65% of the 195 recent infections as recent and 80% of the 161 older infections as older. Two INNO-LIA algorithms showed 50% and 40% sensitivity combined with 95% and 99% specificity, respectively.
Estimation of recent infection in the entire study population, based on actual results of the three tests and adjusted for a test's sensitivity and specificity, yielded 37% for BED-EIA compared to 35% and 33% for the two INNO-LIA algorithms. Window-based estimation with BED-EIA yielded 41% (95% confidence interval 36%-46%). CONCLUSIONS: Recency information can be extracted from INNO-LIA-based confirmatory testing at no additional costs. This method should improve epidemiologic surveillance in countries that routinely use INNO-LIA for HIV confirmation.
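Adjusting an apparent proportion for a test's sensitivity and specificity, as the estimation step above describes, is commonly done with the Rogan-Gladen estimator; the inputs below are invented for illustration, and the study's exact adjustment may differ:

```python
def rogan_gladen(apparent, sensitivity, specificity):
    """True proportion from an apparent (test-positive) proportion,
    correcting for imperfect sensitivity and specificity."""
    return (apparent + specificity - 1) / (sensitivity + specificity - 1)

# Hypothetical: an assay with 65% sensitivity and 95% specificity
# classifies 30% of diagnoses as recent.
adjusted = rogan_gladen(0.30, 0.65, 0.95)
print(f"Adjusted proportion recent: {adjusted:.1%}")  # 41.7%
```

The correction can move the estimate substantially when sensitivity is modest, which matches the gap between raw classification rates and the adjusted 33%-41% estimates in the abstract.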

Relevance: 100.00%

Abstract:

BACKGROUND. A growing body of research suggests that prenatal exposure to air pollution may be harmful to fetal development. We assessed the association between exposure to air pollution during pregnancy and anthropometric measures at birth in four areas within the Spanish Children's Health and Environment (INMA) mother and child cohort study. METHODS. Exposure to ambient nitrogen dioxide (NO2) and benzene was estimated for the residence of each woman (n = 2,337) for each trimester and for the entire pregnancy. Outcomes included birth weight, length, and head circumference. The association between residential outdoor air pollution exposure and birth outcomes was assessed with linear regression models controlled for potential confounders. We also performed sensitivity analyses for the subset of women who spent more time at home during pregnancy. Finally, we performed a combined analysis with meta-analysis techniques. RESULTS. In the combined analysis, an increase of 10 µg/m3 in NO2 exposure during pregnancy was associated with a change in birth length of -0.9 mm [95% confidence interval (CI), -1.8 to -0.1 mm]. For the subset of women who spent ≥ 15 hr/day at home, the association was stronger (-0.16 mm; 95% CI, -0.27 to -0.04). For this same subset of women, a reduction of 22 g in birth weight was associated with each 10-µg/m3 increase in NO2 exposure in the second trimester (95% CI, -45.3 to 1.9). We observed no significant relationship between benzene levels and birth outcomes. CONCLUSIONS. NO2 exposure was associated with reductions in both length and weight at birth. This association was clearer for the subset of women who spent more time at home.
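The combined analysis "with meta-analysis techniques" can be illustrated by inverse-variance (fixed-effect) pooling of per-area regression coefficients; the betas and standard errors below are invented, not the INMA estimates:

```python
import math

def fixed_effect_pool(estimates):
    """Inverse-variance (fixed-effect) pooling of (beta, se) pairs."""
    weights = [1 / se**2 for _, se in estimates]
    beta = sum(w * b for (b, _), w in zip(estimates, weights)) / sum(weights)
    se = math.sqrt(1 / sum(weights))
    return beta, se

# Hypothetical per-area effects of a 10 ug/m3 NO2 increase on birth length (mm).
beta, se = fixed_effect_pool([(-1.2, 0.6), (-0.5, 0.8)])
print(f"pooled = {beta:.2f} mm (95% CI {beta - 1.96*se:.2f} to {beta + 1.96*se:.2f})")
```

More precise studies get larger weights, so the pooled estimate sits closer to the area with the smaller standard error; a random-effects model would additionally widen the interval to reflect between-area heterogeneity.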

Relevance: 100.00%

Abstract:

BACKGROUND Previous studies have demonstrated the efficacy of treatment for latent tuberculosis infection (TLTBI) in persons infected with the human immunodeficiency virus, but few studies have investigated the operational aspects of implementing TLTBI in the co-infected population. The study objectives were to describe eligibility for TLTBI as well as treatment prescription, initiation and completion in an HIV-infected Spanish cohort, and to investigate factors associated with treatment completion. METHODS Subjects were prospectively identified between 2000 and 2003 at ten HIV hospital-based clinics in Spain. Data were obtained from clinical records. Associations were measured using the odds ratio (OR) and its 95% confidence interval (95% CI). RESULTS A total of 1242 subjects were recruited and 846 (68.1%) were evaluated for TLTBI. Of these, 181 (21.4%) were eligible for TLTBI, either because they were tuberculin skin test (TST) positive (121) or because their TST was negative/unknown but they were known contacts of a TB case or had impaired immunity (60). Of the patients eligible for TLTBI, 122 (67.4%) initiated TLTBI: 99 (81.1%) were treated with isoniazid for 6, 9 or 12 months, and 23 (18.9%) with short-course regimens including rifampin plus isoniazid and/or pyrazinamide. In total, 70 patients (57.4%) completed treatment, 39 (32.0%) defaulted, 7 (5.7%) interrupted treatment due to adverse effects, 2 developed TB, 2 died, and 2 moved away. Treatment completion was associated with having acquired HIV infection through heterosexual sex as compared to intravenous drug use (OR: 4.6; 95% CI: 1.4-14.7) and with having taken rifampin and pyrazinamide for 2 months as compared to isoniazid for 9 months (OR: 8.3; 95% CI: 2.7-24.9). CONCLUSIONS A minority of HIV-infected patients eligible for TLTBI actually start and complete a course of treatment. Obstacles to the successful implementation of this intervention need to be addressed.