216 results for Pre-exposure prophylaxis during breastfeeding


Relevance: 30.00%

Publisher:

Abstract:

PURPOSE: To quantify the prevalence of accidental blood exposure (ABE) among interventional radiologists and contrast that with the prevalence of patients with hepatitis C virus (HCV) undergoing interventional radiology procedures. MATERIALS AND METHODS: A multicenter epidemiologic study was conducted in radiology wards in France. The risk of ABE to radiologists was assessed based on personal interviews that determined the frequency and type of ABE and the use of standard protective barriers. Patients undergoing invasive procedures were prospectively sampled for HCV serologic analysis. HCV viremia was measured in patients who tested positive for HCV. RESULTS: Of the 77 radiologists who participated across 11 interventional radiology wards, 44% reported at least one incident of mucous membrane blood exposure and 52% reported at least one percutaneous injury since the beginning of their occupational activity. Compliance with standard precautions was poor, especially for the use of protective clothing and safety material. Overall, 91 of 944 treated patients (9.7%) tested positive for HCV during the study period, of whom 90.1% had positive viremia results, demonstrating a high potential for contamination through blood contact. CONCLUSIONS: The probability of HCV transmission from contact with contaminated blood after percutaneous injury ranged from 0.013 to 0.030; the high frequency of accidental blood exposure and the high percentage of patients with HCV could generate a risk of exposure to HCV for radiologists who perform invasive procedures with frequent blood contact. The need to reinforce compliance with standard hygiene precautions is becoming crucial for medical and technical personnel working in these wards.
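The per-injury transmission probabilities above imply a substantial cumulative risk over a career of repeated exposures. A minimal sketch, assuming independent injuries (an illustration using the abstract's reported range, not the study's own model):

```python
# Cumulative probability of at least one HCV transmission over n
# independent percutaneous injuries. The per-injury range (0.013-0.030)
# comes from the abstract; independence is an illustrative assumption.
def cumulative_risk(p_per_injury: float, n_injuries: int) -> float:
    return 1.0 - (1.0 - p_per_injury) ** n_injuries

for p in (0.013, 0.030):
    print(f"p = {p}: risk after 10 injuries = {cumulative_risk(p, 10):.3f}")
```

Even at the low end of the range, ten contaminated-blood injuries would carry a double-digit percentage cumulative risk, which underlines the abstract's call for stricter precautions.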


Objective Biomonitoring of solvents using the unchanged substance in urine as the exposure indicator is still relatively uncommon, owing to discrepancies among the results reported in the literature. Based on the assessment of toluene exposure, the aim of this work was to evaluate the effects of some steps likely to bias the results and to measure urinary toluene both in experimentally exposed volunteers and in workers of rotogravure factories. Methods Static headspace was used for toluene analysis. o-Cresol was also measured for comparison. Urine collection, storage and conservation conditions were studied to evaluate possible loss or contamination of toluene in controlled situations applied to six volunteers in an exposure chamber, according to four scenarios with exposure at stable levels from 10 to 50 ppm. Kinetics of elimination of toluene were determined over 24 h. A field study was then carried out in a total of 29 workers from two rotogravure printing facilities. Results Potential contamination during urine collection in the field is confirmed to be a real problem, but technical precautions for sampling, storage and analysis can easily be followed to control the situation. In the volunteers at rest, urinary toluene showed a rapid increase within 2 h and a steady level after about 3 h. At 47.1 ppm the mean cumulative excretion was about 0.005% of the amount of toluene ventilated. Correlation between toluene levels in air and in end-of-exposure urine samples was excellent (r = 0.965). In the field study, the median personal exposure to toluene was 32 ppm (range 3.6-148). According to the correlations between environmental and biological monitoring data, the post-shift urinary toluene (r = 0.921) and o-cresol (r = 0.873) concentrations were, respectively, 75.6 μg/l and 0.76 mg/g creatinine for 50 ppm toluene personal exposure. The corresponding urinary toluene concentration before the next shift was 11 μg/l (r = 0.883).
Conclusion Urinary toluene was shown once more to be a very useful surrogate for o-cresol and could be recommended as a biomarker of choice for solvent exposure. [Authors]
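The reference point reported above (75.6 μg/l urinary toluene at 50 ppm personal exposure) supports a rough linear interpolation for other exposure levels. A sketch assuming proportionality through the origin, which is a simplification for illustration rather than the study's fitted regression:

```python
# Rough linear scaling of post-shift urinary toluene with personal air
# exposure, anchored on the abstract's value of 75.6 ug/l at 50 ppm.
# Proportionality through the origin is an assumption, not the study's
# actual regression model.
REFERENCE_UG_L = 75.6  # urinary toluene at the reference exposure
REFERENCE_PPM = 50.0   # reference personal air exposure

def urinary_toluene_ug_l(air_ppm: float) -> float:
    return REFERENCE_UG_L * air_ppm / REFERENCE_PPM

# e.g. at the study's median field exposure of 32 ppm:
print(f"{urinary_toluene_ug_l(32.0):.1f} ug/l")
```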


Peripheral arterial disease (PAD) is a common disease with increasing prevalence, presenting with impaired walking ability that affects patients' quality of life. The epidemiology of PAD is well described; however, the mechanisms underlying functional muscle impairment remain unclear. Using a mouse PAD model, the aim of this study was to assess muscle adaptive responses during early (1 week) and late (5 weeks) disease stages. Unilateral hindlimb ischemia was induced in ApoE(-/-) mice by iliac artery ligation. Ischemic limb perfusion and oxygenation (Laser Doppler imaging, transcutaneous oxygen pressure assessments) decreased significantly during both early and late stages compared to pre-ischemia; however, values were significantly higher during the late versus the early phase. The number of arterioles and arteriogenesis-linked gene expression increased at the later stage. Walking ability, evaluated by forced and voluntary walking tests, remained significantly decreased at both the early and late phases without any significant improvement. Muscle glucose uptake ([18F]fluorodeoxyglucose positron emission tomography) increased significantly during early ischemia and decreased at the later stage. Gene expression analysis showed a significant shift in the muscle M1/M2 macrophage and Th1/Th2 T-cell balance toward a pro-inflammatory phenotype during early ischemia; later, the inflammatory state returned to neutrality. Inhibition of the muscular M1/M2 shift by a statin prevented impaired walking ability in early ischemia. High-energy phosphate metabolism remained unchanged (31-phosphorus magnetic resonance spectroscopy). These results show that rapid, transient muscular inflammation contributes to impaired walking capacity, while increased glucose uptake may be a compensatory mechanism preserving immediate limb viability during early ischemia in a mouse PAD model. With time, increased ischemic limb perfusion and oxygenation assure muscle viability, although not sufficiently to improve walking impairment.
Subsequent decreased muscle glucose uptake may partly contribute to chronic walking impairment. Early inflammation inhibition and/or late muscle glucose impairment prevention are promising strategies for PAD management.


Occupational exposure to wood dust has been associated with an elevated risk of sinonasal cancer (SNC). Wood dust is recognized as a human carcinogen, but the specific cancer-causing agent remains unknown. One possible explanation is co-exposure to wood dust and polycyclic aromatic hydrocarbons (PAHs). PAHs could be generated during incomplete combustion of wood due to heat created by the use of power tools. To determine whether PAHs are generated from wood during common woodworking operations, PAH concentrations in wood dust samples collected in an experimental chamber operated under controlled conditions were analyzed. In addition, personal air samples from workers exposed to wood dust (n = 30) were collected. Wood dust was generated using three different power tools (vibrating sander, belt sander, and saw) and six wood materials: fir, medium-density fiberboard (MDF), beech, mahogany, oak and wood melamine. Monitoring of wood workers was carried out by means of personal sampler devices during woodworking operations. We measured the concentrations of 21 PAHs in wood dust samples by capillary gas chromatography-ion trap mass spectrometry (GC-MS). Total PAH concentrations in wood dust varied greatly (0.24-7.95 ppm), with the lowest in MDF dust and the highest in wood melamine dust. Personal PAH exposures were between 37.5 and 119.8 ng m(-3) during woodworking operations. Our results suggest that PAH exposures are present during woodworking operations and hence could play a role in the mechanism of cancer induction related to wood dust exposure.


OBJECTIVE: Minimizing unwarranted prescription of antibiotics remains an important objective. Because of the heterogeneity between units regarding patient mix and other characteristics, site-specific targets for reduction must be identified. Here we present a model to address the issue by means of an observational cohort study. SETTING: A tertiary, multidisciplinary, neonatal, and pediatric intensive care unit of a university teaching hospital. PATIENTS: All newborns and children present in the unit (n = 456) between September 1998 and March 1999. Reasons for admission included postoperative care after cardiac surgery, major neonatal or pediatric surgery, severe trauma, and medical conditions requiring critical care. METHODS: Daily recording of antibiotics given and of indications for initiation. After discontinuation, each treatment episode was assessed as to the presence or absence of infection. RESULTS: Of the 456 patients 258 (56.6%) received systemic antibiotics, amounting to 1815 exposure days (54.6%) during 3322 hospitalization days. Of these, 512 (28%) were prescribed as prophylaxis and 1303 for suspected infection. Treatment for suspected ventilator-associated pneumonia accounted for 616 (47%) of 1303 treatment days and suspected sepsis for 255 days (20%). Patients were classified as having no infection or viral infection during 552 (40%) treatment days. The average weekly exposure rate in the unit varied considerably during the 29-week study period (range: 40-77/100 hospitalization days). Patient characteristics did not explain this variation. CONCLUSION: In this unit the largest reduction in antibiotic treatment would result from measures assisting suspected ventilator-associated pneumonia to be ruled out and from curtailing extended prophylaxis.
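The exposure-day figures in this abstract are internally consistent and can be checked directly; the numbers below are taken from the text above:

```python
# Consistency check of the antibiotic exposure-day figures reported
# in the abstract (456 patients, neonatal/pediatric ICU).
exposure_days = 1815
hospitalization_days = 3322
prophylaxis_days = 512
suspected_infection_days = 1303

# Prophylaxis plus suspected-infection days account for all exposure days.
assert prophylaxis_days + suspected_infection_days == exposure_days

rate_per_100 = 100 * exposure_days / hospitalization_days
print(f"{rate_per_100:.1f} exposure days per 100 hospitalization days")
```

This reproduces the abstract's 54.6% figure for antibiotic exposure during hospitalization.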


The exposure to dust and polynuclear aromatic hydrocarbons (PAH) of 15 truck drivers from Geneva, Switzerland, was measured. The drivers were divided into "long-distance" and "local" drivers and into smokers and nonsmokers, and were compared with a control group of 6 office workers who were also divided into smokers and nonsmokers. Dust was measured on 1 workday both by a direct-reading instrument and by sampling. The local drivers showed higher exposure to dust (0.3 mg/m3) and PAH than the long-distance drivers (0.1 mg/m3), who showed no difference from the control group. This observation may be due to the fact that the local drivers spend more time in more polluted areas, such as streets with heavy traffic and construction sites, than do the long-distance drivers. Smoking does not influence the exposure to dust and PAH of professional truck drivers, as measured in this study, probably because the ventilation rate of the truck cabins is relatively high even during cold days (11-15 air renewals per hour). The distribution of dust concentrations was shown in some cases to be quite different from the expected log-normal distribution. The contribution of diesel exhaust to these exposures could not be estimated since no specific tracer was used. However, the relatively low level of dust exposure does not support the hypothesis that present-day levels of diesel exhaust particulates play a significant role in the excess occurrence of lung cancer observed in professional truck drivers.


Introduction: Mirtazapine is a noradrenergic and serotonergic antidepressant acting mainly through blockade of presynaptic alpha-2 receptors. Published data on pregnancy outcome after exposure to mirtazapine are scarce. This study addresses the risk associated with exposure to mirtazapine during pregnancy. Patients and Methods: Multicenter (n = 11), observational prospective cohort study comparing pregnancy outcomes after exposure to mirtazapine with 2 matched control groups: exposure to any selective serotonin reuptake inhibitor (SSRI) as a disease-matched control group, and general controls with no exposure to medication known to be teratogenic or to any antidepressant. Data were collected by members of the European Network of Teratology Information Services (ENTIS) during individual risk counseling between 1995 and 2011. Standardized procedures for data collection were used in each center. Results: A total of 357 pregnant women exposed to mirtazapine at any time during pregnancy were included in the study and compared with 357 pregnancies from each control group. The rate of major birth defects did not differ significantly between the mirtazapine and the SSRI group (4.5% vs 4.2%; unadjusted odds ratio, 1.1; 95% confidence interval, 0.5-2.3; P = 0.9). A trend toward a higher rate of birth defects in the mirtazapine group compared with general controls did not reach statistical significance (4.2% vs 1.9%; OR, 2.4; 95% CI, 0.9-6.3; P = 0.08). The crude rate of spontaneous abortions did not differ significantly between the mirtazapine, SSRI, and general control groups (9.5% vs 10.4% vs 8.4%; P = 0.67), nor did the rate of deliveries resulting in live births (79.6% vs 84.3% in both control groups; P = 0.15). However, a higher rate of elective pregnancy termination was observed in the mirtazapine group compared with SSRI and general controls (7.8% vs 3.4% vs 5.6%; P = 0.03).
Premature birth (< 37 weeks) (10.6% vs 10.1% vs 7.5%; P = 0.38), gestational age at birth (median, 39 weeks; interquartile range (IQR), 38-40 in all groups; P = 0.29), and birth weight (median, 3320 g; IQR, 2979-3636 vs 3230 g; IQR, 2910-3629 vs 3338 g; IQR, 2967-3650; P = 0.34) did not differ significantly between the groups. Conclusion: This study did not observe a statistically significant difference in the rate of major birth defects between mirtazapine, SSRI-exposed, and nonexposed pregnancies. A slightly higher rate of birth defects was, however, observed in the mirtazapine and SSRI groups compared with the low rate of birth defects in our general controls. Overall, the pregnancy outcome after mirtazapine exposure in this study is very similar to that of the SSRI-exposed control group.
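The unadjusted odds ratio of about 1.1 for major birth defects can be reproduced approximately from the percentages above. The event counts below are back-calculated from 4.5% and 4.2% of 357 pregnancies and are assumptions for illustration, not the study's raw data:

```python
# Unadjusted odds ratio for two groups; counts are approximations
# reconstructed from the abstract's percentages, not original data.
def odds_ratio(events_a: int, total_a: int, events_b: int, total_b: int) -> float:
    odds_a = events_a / (total_a - events_a)
    odds_b = events_b / (total_b - events_b)
    return odds_a / odds_b

# ~4.5% of 357 = 16 events (mirtazapine); ~4.2% of 357 = 15 (SSRI)
print(round(odds_ratio(16, 357, 15, 357), 2))
```

The result lands close to the reported 1.1, consistent with the abstract's conclusion of no significant difference between the two exposed groups.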


INTRODUCTION. Both hypocapnia and hypercapnia can be deleterious to brain-injured patients. Strict PaCO2 control is difficult to achieve because of patients' instability and the unpredictable effects of ventilator setting changes. OBJECTIVE. The aim of this study was to evaluate our ability to comply with a protocol of controlled mechanical ventilation (CMV) aiming at a PaCO2 between 35 and 40 mmHg in patients requiring neuro-resuscitation. METHODS. Retrospective analysis of consecutive patients (2005-2011) requiring intracranial pressure (ICP) monitoring for traumatic brain injury (TBI), subarachnoid haemorrhage (SAH), intracranial haemorrhage (ICH) or ischemic stroke (IS). Demographic data, GCS, SAPS II, hospital mortality, PaCO2 and ICP values were recorded. During CMV in the first 48 h after admission, we analyzed the time spent within the PaCO2 target in relation to the presence or absence of intracranial hypertension (ICP > 20 mmHg, by periods of 30 min) (Table 1). We also compared the fraction of time (determined by linear interpolation) spent with normal, low or high PaCO2 in hospital survivors and non-survivors (Wilcoxon, Bonferroni correction, p < 0.05) (Table 2). PaCO2 samples collected during and after apnoea tests were excluded. Results are given as median [IQR]. RESULTS. 436 patients were included (TBI: 51.2 %, SAH: 20.6 %, ICH: 23.2 %, IS: 5.0 %), age: 54 [39-64], SAPS II score: 52 [41-62], GCS: 5 [3-8]. 8744 PaCO2 samples were collected during 15,611 h of CMV. CONCLUSIONS. Despite the high number of PaCO2 samples collected (on average one sample every 107 min), our results show that patients undergoing CMV for neuro-resuscitation spent less than half of the time within the pre-defined PaCO2 range. During documented intracranial hypertension, hypercapnia was observed 17.4 % of the time.
Since non-survivors spent more time with hypocapnia, further analysis is required to determine whether hypocapnia was detrimental per se, or merely reflects increased severity of brain insult.
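The stated mean sampling interval can be cross-checked against the sample count with a quick back-of-envelope calculation:

```python
# Back-calculate total CMV hours from the number of PaCO2 samples and
# the reported mean interval of one sample every 107 min.
samples = 8744
interval_min = 107

total_hours = samples * interval_min / 60
print(f"~{total_hours:.0f} h of controlled mechanical ventilation")
```

This lands near 15,600 h, consistent with the total CMV duration reported in the abstract.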


Cytotoxic CD8 T cells exert their antiviral and antitumor activity primarily through the secretion of cytotoxic granules. Degranulation activity and cytotoxic granules (perforin plus granzymes) generally define CD8 T cells with cytotoxic function. In this study, we have investigated the expression of granzyme K (GrmK) in comparison to that of GrmA, GrmB, and perforin. The expression of the cytotoxic granules was assessed in virus-specific CD8 T cells specific to influenza virus, Epstein-Barr virus (EBV), cytomegalovirus (CMV), or human immunodeficiency virus type 1 (HIV-1). We observed a dichotomy between GrmK and perforin expression in virus-specific CD8 T cells. The profile in influenza virus-specific CD8 T cells was perforin(-) GrmB(-) GrmA(+/-) GrmK(+); in CMV-specific cells, it was perforin(+) GrmB(+) GrmA(+) GrmK(-/+); and in EBV- and HIV-1-specific cells, it was perforin(-/+) GrmB(+) GrmA(+) GrmK(+). On the basis of the delineation of memory and effector CD8 T cells with CD45RA and CD127, the GrmK(+) profile was associated with early-stage memory CD8 T-cell differentiation, the perforin(+) GrmB(+) GrmA(+) profile with advanced-stage differentiation, and the GrmB(+) GrmA(+) GrmK(+) profile with intermediate-stage differentiation. Furthermore, perforin and GrmB but not GrmA and GrmK correlated with cytotoxic activity. Finally, changes in antigen exposure in vitro and in vivo during primary HIV-1 infection and vaccination modulated cytotoxic granule profiles. These results advance our understanding of the relationship between distinct profiles of cytotoxic granules in memory CD8 T cells and function, differentiation stage, and antigen exposure.


Long-term implications of exposure to traumatizing experiences during childhood or adolescence, such as sexual abuse or cancer, have been documented, notably regarding subjects' responses to acute stress in adulthood. Several indicators of the stress response have been considered (e.g. cortisol, heart rate), but the oxytocin (OT) response to acute stress of individuals exposed to trauma has not been documented. Eighty subjects (n=26 women who had experienced episodes of child abuse, n=25 men and women who were healthy survivors of cancer in childhood or adolescence, and n=29 controls) underwent a laboratory session involving an experimental stress challenge, the Trier social stress test. Overall, there was a clear OT response to the psychosocial challenge. Subjects who had experienced a life-threatening illness in childhood or adolescence had higher mean levels of OT than both abused and control subjects. There was a moderate negative relationship between OT and salivary cortisol. It is suggested that acute stress stimulates OT secretion, and that exposure to enduring life-threatening experiences in childhood or adolescence has long-lasting consequences for the stress system and connected functions, notably the activation of OT secretion. Better knowledge of such long-term implications is important in order to prevent dysregulation of stress responses, which has been shown to be associated with individuals' mental health.


Although exposure to secondhand smoke (SHS) is reportedly high in prison, few studies have measured this in the prison environment, and none have done so in Europe. We measured two indicators of SHS exposure (particulate matter PM10 and nicotine) in fixed locations before (2009) and after (2010) the introduction of a partial smoking ban in a Swiss prison. Access to smoking cessation support was available to detainees throughout the study. Objectives To measure SHS before and after the introduction of a partial smoking ban. Methods Assessment of particulate matter PM10 (suspended microparticles of 10 μm) and nicotine in ambient air, collected by real-time aerosol monitor and nicotine monitoring devices. Results The authors observed a significant reduction in nicotine concentrations in the air after the introduction of the smoking ban (before: 7.0 μg/m(3), after: 2.1 μg/m(3), difference 4.9 μg/m(3), 95% CI for difference: 0.52 to 9.8, p=0.03) but not in particulate matter PM10 (before: 0.11 mg/m(3), after: 0.06 mg/m(3), difference 0.06 mg/m(3), 95% CI for difference of means: -0.07 to 0.19, p=0.30). Conclusions The partial smoking ban was followed by a decrease in nicotine concentrations in ambient air. This decrease can be attributed to the introduction of the smoking ban, since no other policy change occurred during this period. Although concentrations of SHS decreased significantly, protection was still incomplete and further action is necessary to improve indoor air quality.


Ant queens that attempt to disperse and found new colonies independently face high mortality risks. The exposure of queens to soil entomopathogens during claustral colony founding may be particularly harmful, as founding queens lack the protection conferred by mature colonies. Here, we tested the hypotheses that founding queens (I) detect and avoid nest sites that are contaminated by fungal pathogens, and (II) tend to associate with other queens to benefit from social immunity when nest sites are contaminated. Surprisingly, in nest choice assays, young Formica selysi BONDROIT, 1918 queens had an initial preference for nest sites contaminated by two common soil entomopathogenic fungi, Beauveria bassiana and Metarhizium brunneum. Founding queens showed a similar preference for the related but non-entomopathogenic fungus Fusarium graminearum. In contrast, founding queens had no significant preference for the more distantly related non-entomopathogenic fungus Petromyces alliaceus, nor for heat-killed spores of B. bassiana. Finally, founding queens did not increase the rate of queen association in the presence of B. bassiana. The surprising preference of founding queens for nest sites contaminated by live entomopathogenic fungi suggests that parasites manipulate their hosts or that the presence of specific fungi is a cue associated with suitable nesting sites.


Within the ORAMED project, a coordinated measurement program for occupationally exposed medical staff was performed in different hospitals in Europe. The main objectives of ORAMED were to obtain a set of standardized data on doses for staff in interventional cardiology and radiology and to optimize staff protection. Doses were measured with thermoluminescent dosemeters on the ring finger and wrist of both hands, on the legs and at the level of the eyes of the main operator performing interventional procedures. In this paper, an overview of the doses per procedure measured during 646 interventional cardiology procedures is given for cardiac angiographies and angioplasties (CA/PTCA), radiofrequency ablations (RFA) and pacemaker and defibrillator implantations (PM/ICD). 31% of the monitored procedures were performed with no collective protective equipment, whereas 44% involved a ceiling screen and a table curtain. Although associated with the smallest air kerma-area product (KAP), PM/ICD procedures led to the highest doses. As expected, KAP and dose values exhibited a very large variability. The left side of the operator, most frequently the closest to the X-ray scattering region, was more exposed than the right side. An analysis of the effect of parameters influencing the doses, namely collective protective equipment, X-ray tube configuration and catheter access route, was performed on the doses normalized to KAP. The ceiling screen and table curtain were observed to reduce normalized doses by at most a factor of 4, much smaller than the theoretical attenuation factors typical for such protections, i.e. from 10 to 100. This observation was attributed to their inappropriate use by the operators and to their non-optimized design. Configurations with the tube above the patient led to higher normalized doses to the operator than with the tube below, but the effect of using a biplane X-ray suite was more complex to analyze.
For CA/PTCA procedures, the upper part of the operator's body received higher normalized doses for radial than for femoral catheter access, by at most a factor of 5. This effect was seen in cases with no collective protection. The eyes were observed to receive the maximum fraction of the annual dose limit almost as frequently as the legs and hands if the former recommended limit of 150 mSv for the lens of the eye is considered, and clearly the most frequently if the new 20 mSv limit is considered.


Background Morbidly obese patients are at high risk of developing gallstones, and rapid weight loss after bariatric surgery further enhances this risk. The concept of prophylactic cholecystectomy during gastric bypass has been challenged recently, because the risk may be lower than reported earlier and because cholecystectomy during laparoscopic gastric bypass may be more difficult and risky. Methods A review of prospectively collected data on 772 patients who underwent laparoscopic primary gastric bypass between January 2000 and August 2007 was performed. The charts of patients operated on before 2004 were retrospectively reviewed for preoperative echography and histopathological findings. Results Fifty-eight patients (7.5%) had had a previous cholecystectomy. In the remaining patients, echography showed gallstones or sludge in 81 (11.3%). Cholecystectomy was performed at the time of gastric bypass in 665 patients (91.7%). Gallstones were found intraoperatively in 25 patients (3.9%), for a total gallstone prevalence of 21.2%. The age of patients with gallstones was higher than that of gallstone-free patients (43.5 vs 38.7 years, p < 0.0001). Of the removed specimens, 81.8% showed abnormal histologic findings, mainly chronic cholecystitis and cholesterolosis. Cholecystectomy caused no procedure-related complications, prolonged the duration of surgery by a mean of 19 min (range 4-45), and had no effect on the duration of hospital stay. Cholecystectomy was deemed too risky in 59 patients (8.3%), who were prescribed a 6-month course of ursodeoxycholic acid. Conclusion Concomitant cholecystectomy can be performed safely in most patients during laparoscopic gastric bypass and does not prolong hospital stay. As such, it is an acceptable form of prophylaxis against stones forming during rapid weight loss. Whether it is superior to chemical prophylaxis remains to be demonstrated in a large prospective randomized study.
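The 21.2% total gallstone prevalence reported above aggregates three groups of patients, and the arithmetic can be verified directly from the abstract's figures:

```python
# Total gallstone prevalence = prior cholecystectomies + stones seen on
# preoperative echography + stones found intraoperatively, over all
# 772 patients (figures from the abstract).
prior_cholecystectomy = 58
echography_stones = 81
intraoperative_stones = 25
total_patients = 772

affected = prior_cholecystectomy + echography_stones + intraoperative_stones
prevalence_pct = 100 * affected / total_patients
print(f"{prevalence_pct:.1f}%")
```

The sum (164 of 772 patients) reproduces the abstract's 21.2%.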


SETTING: A 950-bed teaching hospital in Switzerland. AIM: To describe the results of a contact investigation among health care workers (HCW) and patients after exposure to a physician with smear-positive pulmonary tuberculosis in a hospital setting, using standard tuberculin skin tests (TST) and an interferon-gamma release assay (IGRA). METHOD: HCW with a negative or unknown TST at hiring had a TST two weeks after the last contact with the index case (T0), repeated six weeks later if negative (T6). All exposed HCW had a T-SPOT.TB at T0 and T6. Exposed patients had a TST six weeks after the last contact, and a T-SPOT.TB if the TST was positive. RESULTS: Among 101 HCW, 17/73 (22%) had a positive TST at T0. TST was repeated in 50 at T6 and converted from negative to positive in eight (16%). Twelve HCW had a positive T-SPOT.TB at T0 and ten converted from negative to positive at T6. Seven HCW with a positive T-SPOT.TB reverted to negative at T6 or at later controls, most of them with test values close to the cut-off. Among 27 exposed patients tested at six weeks, ten had a positive TST, five of them confirmed by a positive T-SPOT.TB. CONCLUSIONS: HCW tested twice after exposure to a case of smear-positive pulmonary TB demonstrated a possible conversion in 10% with T-SPOT.TB and 16% with TST. Some T-SPOT.TB results reverted from positive to negative during follow-up, mostly tests with values close to the cut-off. Given the variability of the test results, it seems advisable to repeat tests with values close to the cut-off before diagnosing tuberculous infection.