39 results for inhaled particle count

at Université de Lausanne, Switzerland


Relevance: 40.00%

Abstract:

Part I of this series of articles focused on the construction of graphical probabilistic inference procedures, at various levels of detail, for assessing the evidential value of gunshot residue (GSR) particle evidence. The proposed models - in the form of Bayesian networks - address the issues of background presence of GSR particles, analytical performance (i.e., the efficiency of evidence searching and analysis procedures) and contamination. The use and practical implementation of Bayesian networks for case pre-assessment is also discussed. This paper, Part II, concentrates on Bayesian parameter estimation. It complements Part I by offering means for producing estimates usable for the numerical specification of the proposed probabilistic graphical models. Bayesian estimation procedures are the primary focus because they allow the scientist to combine his or her prior knowledge about the problem of interest with newly acquired experimental data. The paper also considers further topics such as the sensitivity of the likelihood ratio to uncertainty in parameters and the study of likelihood ratio values obtained for members of particular populations (e.g., individuals with or without exposure to GSR).
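As an illustration of the kind of Bayesian parameter estimation described here, the sketch below combines a Beta prior (encoding prior knowledge) with binomial survey data via a conjugate update. The prior parameters and survey counts are hypothetical, not taken from the paper; the posterior mean is the sort of quantity that could numerically specify a node in a Bayesian network.

```python
def beta_posterior(alpha_prior, beta_prior, successes, trials):
    """Conjugate Beta-Binomial update: combine a Beta(alpha, beta) prior
    on a proportion with binomial survey data (successes out of trials).
    Returns the posterior parameters and the posterior mean."""
    alpha_post = alpha_prior + successes
    beta_post = beta_prior + (trials - successes)
    mean = alpha_post / (alpha_post + beta_post)
    return alpha_post, beta_post, mean

# Hypothetical illustration: a Beta(1, 9) prior (prior mean 0.10) for the
# probability that a person unconnected to a shooting carries GSR-like
# background particles, updated with a survey finding 6 carriers among 100.
alpha, beta, mean = beta_posterior(1, 9, 6, 100)
print(alpha, beta, round(mean, 3))  # 7 103 0.064
```

The same update applies to any conditional-probability entry of the network for which count data are available.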

Relevance: 20.00%

Abstract:

BACKGROUND: Highway maintenance workers are constantly and simultaneously exposed to traffic-related particle and noise emissions, both of which have been linked to increased cardiovascular morbidity and mortality in population-based epidemiology studies. OBJECTIVES: We aimed to investigate short-term health effects related to particle and noise exposure. METHODS: We monitored 18 maintenance workers during as many as five 24-hour periods each, for a total of 50 observation days. We measured their exposure to fine particulate matter (PM2.5), ultrafine particles and noise, along with the cardiopulmonary health endpoints blood pressure, pro-inflammatory and pro-thrombotic markers in the blood, lung function, and fractional exhaled nitric oxide (FeNO), measured approximately 15 hours post-work. Heart rate variability was assessed during a sleep period approximately 10 hours post-work. RESULTS: PM2.5 exposure was significantly associated with C-reactive protein and serum amyloid A, and negatively associated with tumor necrosis factor α. None of the particle metrics were significantly associated with von Willebrand factor or tissue factor expression. PM2.5 and work noise were associated with markers of increased heart rate variability, including increased high-frequency (HF) and low-frequency (LF) power. Systolic and diastolic blood pressure on the following morning were significantly associated with noise exposure after work, and non-significantly associated with PM2.5. We observed no significant associations between any of the exposures and lung function or FeNO. CONCLUSIONS: Our findings suggest that exposure to particles and noise during highway maintenance work might pose a cardiovascular health risk. Actions to reduce these exposures could lead to better health for this population of workers.

Relevance: 20.00%

Abstract:

An assessment of sewage workers' exposure to airborne cultivable bacteria, fungi and inhaled endotoxins was performed at 11 sewage treatment plants. We sampled the enclosed and unenclosed treatment areas in each plant and evaluated the influence of season (summer and winter) on bioaerosol levels. We also measured workers' personal exposure to endotoxins during special operations where a higher risk of bioaerosol inhalation was assumed. Results show that only fungi were present in significantly higher concentrations in summer than in winter (2331 ± 858 versus 329 ± 95 CFU m⁻³). We also found significantly more bacteria in the enclosed area, near the particle grids for incoming water, than in the unenclosed area near the aeration basins (9455 ± 2661 versus 2435 ± 985 CFU m⁻³ in summer and 11,081 ± 2299 versus 2002 ± 839 CFU m⁻³ in winter). All bioaerosols were frequently above the recommended occupational exposure values. Workers carrying out special tasks such as cleaning tanks were exposed to very high levels of endotoxins (up to 500 EU m⁻³) compared to routine work. The species composition and concentration of airborne Gram-negative bacteria were also studied. A broad spectrum of species within the Pseudomonadaceae and Enterobacteriaceae families predominated in nearly all plants investigated.

Relevance: 20.00%

Abstract:

OBJECTIVE: To determine the influence of nebulizer types and nebulization modes on bronchodilator delivery in a mechanically ventilated pediatric lung model. DESIGN: In vitro laboratory study. SETTING: Research laboratory of a university hospital. INTERVENTIONS: Using albuterol as a marker, three nebulizer types (jet nebulizer, ultrasonic nebulizer, and vibrating-mesh nebulizer) were tested in three nebulization modes in a nonhumidified bench model mimicking the ventilatory pattern of a 10-kg infant. The amounts of albuterol deposited on the inspiratory filters (inhaled drug) at the end of the endotracheal tube, on the expiratory filters, and remaining in the nebulizers or in the ventilator circuit were determined. Particle size distribution of the nebulizers was also measured. MEASUREMENTS AND MAIN RESULTS: The inhaled drug was 2.8% ± 0.5% for the jet nebulizer, 10.5% ± 2.3% for the ultrasonic nebulizer, and 5.4% ± 2.7% for the vibrating-mesh nebulizer in intermittent nebulization during the inspiratory phase (p < 0.01). The most efficient nebulizer was the vibrating-mesh nebulizer in continuous nebulization (13.3% ± 4.6%, p < 0.01). Depending on the nebulizer, a variable but important part of the albuterol remained in the nebulizers (jet and ultrasonic nebulizers) or was expired or lost in the ventilator circuit (all nebulizers). Only small particles (range 2.39-2.70 µm) reached the end of the endotracheal tube. CONCLUSIONS: Important differences between nebulizer types and nebulization modes were seen for albuterol deposition at the end of the endotracheal tube in an in vitro pediatric ventilator-lung model. Newer aerosol devices, such as ultrasonic and vibrating-mesh nebulizers, were more efficient than the jet nebulizer.

Relevance: 20.00%

Abstract:

The relationship between platelet count and outcome in patients with acute venous thromboembolism (VTE) has not been consistently explored. RIETE is an ongoing registry of consecutive patients with acute VTE. We categorised patients as having very low- (<80,000/µl), low- (80,000/µl to 150,000/µl), normal- (150,000/µl to 300,000/µl), high- (300,000/µl to 450,000/µl), or very high (>450,000/µl) platelet count at baseline, and compared their three-month outcome. As of October 2012, 43,078 patients had been enrolled in RIETE: 21,319 presenting with pulmonary embolism and 21,759 with deep-vein thrombosis. In all, 502 patients (1.2%) had very low-; 5,472 (13%) low-; 28,386 (66%) normal-; 7,157 (17%) high-; and 1,561 (3.6%) very high platelet count. During the three-month study period, the recurrence rate was: 2.8%, 2.2%, 1.8%, 2.1% and 2.2%, respectively; the rate of major bleeding: 5.8%, 2.6%, 1.7%, 2.3% and 4.6%, respectively; the rate of fatal bleeding: 2.0%, 0.9%, 0.3%, 0.5% and 1.2%, respectively; and the mortality rate: 29%, 11%, 6.5%, 8.8% and 14%, respectively. On multivariate analysis, patients with very low-, low-, high- or very high platelet count had an increased risk for major bleeding (odds ratio [OR]: 2.70, 95% confidence interval [CI]: 1.85-3.95; 1.43 [1.18-1.72]; 1.23 [1.03-1.47]; and 2.13 [1.65-2.75]) and fatal bleeding (OR: 3.70 [1.92-7.16], 2.10 [1.48-2.97], 1.29 [0.88-1.90] and 2.49 [1.49-4.15]) compared with those with normal count. In conclusion, we found a U-shaped relationship between platelet count and the three-month rate of major bleeding and fatal bleeding in patients with VTE.
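The five baseline strata used in the study can be made concrete with a small classification function. This is an illustrative sketch based only on the cut-offs quoted in the abstract; since the quoted ranges share their endpoints, the assignment of counts falling exactly on a boundary is an assumption here, not something the abstract specifies.

```python
def platelet_category(count):
    """Classify a baseline platelet count (platelets/µl) into the five
    strata quoted in the abstract. Boundary handling (which stratum gets
    a count of exactly 80,000, 150,000, 300,000 or 450,000) is assumed."""
    if count < 80_000:
        return "very low"
    if count < 150_000:
        return "low"
    if count < 300_000:
        return "normal"
    if count <= 450_000:
        return "high"
    return "very high"

print(platelet_category(200_000))  # normal
```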

Relevance: 20.00%

Abstract:

We investigated the use of in situ implant formation incorporating superparamagnetic iron oxide nanoparticles (SPIONs) as a form of minimally invasive treatment of cancer lesions by magnetically induced local hyperthermia. We developed injectable formulations that form gels entrapping magnetic particles inside a tumor. We used SPIONs embedded in silica microparticles to favor syringeability and incorporated the highest proportion possible to allow large heating capacities. Hydrogel, single-solvent organogel and cosolvent (low-toxicity hydrophilic solvent) organogel formulations were injected into human cancer tumors xenografted in mice. The thermoreversible hydrogels (poloxamer, chitosan), which accommodated 20% w/v of the magnetic microparticles, proved to be inadequate. Alginate hydrogels incorporated 10% w/v of the magnetic microparticles; external gelation led to strong implants localizing to the tumor periphery, whereas internal gelation failed in situ. The organogel formulations, which consisted of precipitating polymers dissolved in single organic solvents, displayed various microstructures. An 8% poly(ethylene-vinyl alcohol) solution in DMSO containing 40% w/v of magnetic microparticles formed the most suitable implants in terms of tumor casting and heat delivery. Importantly, cosolvent formulations with up to 20% w/v of magnetic microparticles are of great clinical interest because they show reduced toxicity and centered tumor implantation.

Relevance: 20.00%

Abstract:

Whole-body (WB) planar imaging has long been one of the staple methods of dosimetry, and its quantification has been formalized by the MIRD Committee in Pamphlet No. 16. One issue not specifically addressed in the formalism occurs when the count rates reaching the detector are sufficiently high to cause camera count saturation. Camera dead-time effects have been extensively studied, but all of the developed correction methods assume static acquisitions. During WB planar (sweep) imaging, however, a variable amount of imaged activity exists in the detector's field of view as a function of time, so camera saturation is time dependent. A new time-dependent algorithm was developed to correct for dead-time effects during WB planar acquisitions that accounts for relative motion between the detector heads and the imaged object. Static camera dead-time parameters were acquired by imaging decaying activity in a phantom and obtaining a saturation curve. Using these parameters, an iterative algorithm akin to Newton's method was developed that takes into account the variable count rate seen by the detector as a function of time. The algorithm was tested on simulated data as well as on a whole-body scan of high-activity samarium-153 in an ellipsoid phantom. A complete set of parameters from unsaturated phantom data necessary to convert corrected count rate values to activity was also obtained, including build-up and attenuation coefficients. The algorithm proved successful in accounting for motion- and time-dependent saturation effects in both the simulated and measured data and converged to any desired degree of precision. The clearance half-life calculated from the ellipsoid phantom data was 45.1 h after dead-time correction and 51.4 h without correction; the physical decay half-life of samarium-153 is 46.3 h.
Accurate WB planar dosimetry of high activities relies on successfully compensating for camera saturation in a way that takes into account the variable activity in the field of view, i.e. time-dependent dead-time effects. The algorithm presented here accomplishes this task.
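The flavor of such an iterative correction can be sketched with a generic paralyzable dead-time model, where the observed rate m relates to the true rate n by m = n·exp(-n·τ) for dead time τ. The actual model and camera parameters used in the study are not given here, so the following is only an illustrative Newton inversion, not the authors' algorithm.

```python
import math

def true_rate(m, tau, tol=1e-9, max_iter=100):
    """Invert the paralyzable dead-time model m = n * exp(-n * tau) to
    recover the true count rate n from the observed rate m, given dead
    time tau (seconds). Newton's method started at n = m converges to
    the physical root below the saturation peak at n = 1/tau."""
    if m <= 0.0:
        return 0.0
    n = m  # initial guess: observed rate (never exceeds the true rate)
    for _ in range(max_iter):
        f = n * math.exp(-n * tau) - m                 # residual
        fprime = math.exp(-n * tau) * (1.0 - n * tau)  # df/dn
        step = f / fprime
        n -= step
        if abs(step) < tol * max(n, 1.0):
            break
    return n
```

In the time-dependent setting of the paper, an inversion of this kind would be applied separately to each time bin of the sweep, since the rate seen by the moving detector varies with time.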

Relevance: 20.00%

Abstract:

BACKGROUND: In recent years, treatment options for human immunodeficiency virus type 1 (HIV-1) infection have changed from nonboosted protease inhibitors (PIs) to nonnucleoside reverse-transcriptase inhibitors (NNRTIs) and boosted PI-based antiretroviral drug regimens, but the impact on immunological recovery remains uncertain. METHODS: All patients in the Swiss HIV Cohort Study who received their first combination antiretroviral therapy (cART) during January 1996 through December 2004 and had known baseline CD4+ T cell counts and HIV-1 RNA values were included (n = 3293). For follow-up, we used the Swiss HIV Cohort Study database update of May 2007. The mean (±SD) duration of follow-up was 26.8 ± 20.5 months. The follow-up time was limited to the duration of the first cART. CD4+ T cell recovery was analyzed in 3 different treatment groups: nonboosted PI, NNRTI, or boosted PI. The end point was the absolute increase of the CD4+ T cell count in the 3 treatment groups after the initiation of cART. RESULTS: Two thousand five hundred ninety individuals (78.7%) initiated a nonboosted-PI regimen, 452 (13.7%) initiated an NNRTI regimen, and 251 (7.6%) initiated a boosted-PI regimen. Absolute CD4+ T cell count increases at 48 months were as follows: in the nonboosted-PI group, from 210 to 520 cells/µL; in the NNRTI group, from 220 to 475 cells/µL; and in the boosted-PI group, from 168 to 511 cells/µL. In a multivariate analysis, the treatment group did not affect the response of CD4+ T cells; however, increased age, pretreatment with nucleoside reverse-transcriptase inhibitors, positive serological tests for hepatitis C virus, Centers for Disease Control and Prevention stage C infection, lower baseline CD4+ T cell count, and lower baseline HIV-1 RNA level were risk factors for smaller increases in CD4+ T cell count. CONCLUSION: CD4+ T cell recovery was similar in patients receiving nonboosted PI-, NNRTI-, and boosted PI-based cART.

Relevance: 20.00%

Abstract:

In the forensic examination of DNA mixtures, the question of how to set the total number of contributors (N) presents a topic of ongoing interest. Part of the discussion gravitates around issues of bias, in particular when assessments of the number of contributors are not made prior to considering the genotypic configuration of potential donors. A further complication stems from the observation that, in some cases, there may be numbers of contributors that are incompatible with the set of alleles seen in the profile of a mixed crime stain, given the genotype of a potential contributor. In such situations, procedures that take a single, fixed number of contributors as their output can lead to inferential impasses. Assessing the number of contributors within a probabilistic framework can help avoid such complications. Using elements of decision theory, this paper analyses two strategies for inference on the number of contributors. One procedure is deterministic and focuses on the minimum number of contributors required to 'explain' an observed set of alleles. The other procedure is probabilistic, using Bayes' theorem, and provides a probability distribution over a set of numbers of contributors, based on the set of observed alleles as well as their respective rates of occurrence. The discussion concentrates on mixed stains of varying quality (i.e., different numbers of loci for which genotyping information is available). A so-called qualitative interpretation is pursued, since quantitative information such as peak area and height data is not taken into account. The competing procedures are compared using a standard scoring rule that penalizes the degree of divergence between an agreed value for N, the number of contributors, and the actual value taken by N.
Using only modest assumptions and a discussion with reference to a casework example, this paper reports on analyses using simulation techniques and graphical models (i.e., Bayesian networks) to show that setting the number of contributors to a mixed crime stain in probabilistic terms is, for the conditions assumed in this study, preferable to a decision policy that relies on categorical assumptions about N.
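To make the two strategies concrete, the sketch below implements the deterministic minimum-N rule (each donor contributes at most two alleles per locus) and one common qualitative Bayesian formulation, in which the likelihood of N contributors is the inclusion-exclusion probability that 2N allele draws yield exactly the observed allele set. The allele frequencies and function names are hypothetical, and the model deliberately ignores drop-out, drop-in and peak heights; it is not necessarily the exact model of the paper.

```python
import math
from itertools import combinations

def min_contributors(profile):
    """Deterministic rule: minimum N explaining the profile, driven by
    the locus showing the most alleles."""
    return max(math.ceil(len(alleles) / 2) for alleles in profile.values())

def p_alleles_given_n(freqs, n):
    """Probability that 2n draws from the allele-frequency dict yield
    exactly the observed allele set (all seen, none extra), computed by
    inclusion-exclusion over subsets of the observed alleles."""
    alleles = list(freqs)
    total = 0.0
    for k in range(len(alleles) + 1):
        for subset in combinations(alleles, k):
            s = sum(freqs[a] for a in subset)
            total += (-1) ** (len(alleles) - k) * s ** (2 * n)
    return total

def posterior_n(profile_freqs, n_max=6, prior=None):
    """Posterior over N = 1..n_max given per-locus observed allele
    frequencies, assuming independent loci and a uniform prior."""
    ns = range(1, n_max + 1)
    prior = prior or {n: 1.0 / n_max for n in ns}
    unnorm = {}
    for n in ns:
        like = 1.0
        for freqs in profile_freqs.values():
            like *= p_alleles_given_n(freqs, n)
        unnorm[n] = prior[n] * like
    z = sum(unnorm.values()) or 1.0
    return {n: v / z for n, v in unnorm.items()}
```

With five alleles at a single locus, `min_contributors` returns the fixed answer 3, whereas `posterior_n` spreads probability over several values of N, which is what allows the probabilistic procedure to avoid the inferential impasses discussed above.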

Relevance: 20.00%

Abstract:

CD4 expression in HIV replication is paradoxical: HIV entry requires high cell-surface CD4 densities, but replication requires CD4 down-modulation. However, is CD4 density in HIV+ patients affected over time? Do changes in CD4 density correlate with disease progression? Here, we examined the role of CD4 density for HIV disease progression by longitudinally quantifying CD4 densities on CD4+ T cells and monocytes of ART-naive HIV+ patients with different disease progression rates. This was a retrospective study. We defined three groups of HIV+ patients by their rate of CD4+ T cell loss, calculated by the time between infection and reaching a CD4 level of 200 cells/µl: fast (<7.5 years), intermediate (7.5-12 years), and slow progressors (>12 years). Mathematical modeling permitted us to determine the maximum CD4+ T cell count after HIV seroconversion (defined as "postseroconversion CD4 count") and longitudinal profiles of CD4 count and density. CD4 densities were quantified on CD4+ T cells and monocytes from these patients and from healthy individuals by flow cytometry. Fast progressors had significantly lower postseroconversion CD4 counts than other progressors. CD4 density on T cells was lower in HIV+ patients than in healthy individuals and decreased more rapidly in fast than in slow progressors. Antiretroviral therapy (ART) did not normalize CD4 density. Thus, postseroconversion CD4 counts define individual HIV disease progression rates that may help to identify patients who might benefit most from early ART. Early discrimination of slow and fast progressors suggests that critical events during primary infection define long-term outcome. A more rapid CD4 density decrease in fast progressors might contribute to progressive functional impairments of the immune response in advanced HIV infection. The lack of an effect of ART on CD4 density implies a persistent dysfunctional immune response by uncontrolled HIV infection.

Relevance: 20.00%

Abstract:

Particle physics studies highly complex processes which cannot be directly observed. Scientific realism claims that we are nevertheless warranted in believing that these processes really occur and that the objects involved in them really exist. This dissertation defends a version of scientific realism, called causal realism, in the context of particle physics. I start by introducing the central theses and arguments in the recent philosophical debate on scientific realism (chapter 1), with a special focus on an important presupposition of the debate, namely common sense realism. Chapter 2 then discusses entity realism, which introduces a crucial element into the debate by emphasizing the importance of experiments in defending scientific realism. Most of the chapter is concerned with Ian Hacking's position, but I also argue that Nancy Cartwright's version of entity realism is ultimately preferable as a basis for further development. In chapter 3, I take a step back and consider the question whether the realism debate is worth pursuing at all. Arthur Fine has given a negative answer to that question, proposing his natural ontological attitude as an alternative to both realism and antirealism. I argue that the debate (in particular the realist side of it) is in fact less vicious than Fine presents it. The second part of my work (chapters 4-6) develops, illustrates and defends causal realism. The key idea is that inference to the best explanation is reliable in some cases, but not in others. Chapter 4 characterizes the difference between these two kinds of cases in terms of three criteria which distinguish causal from theoretical warrant. In order to flesh out this distinction, chapter 5 then applies it to a concrete case from the history of particle physics, the discovery of the neutrino. This case study shows that the distinction between causal and theoretical warrant is crucial for understanding what it means to "directly detect" a new particle.
But the distinction is also an effective tool against what I take to be the presently most powerful objection to scientific realism: Kyle Stanford's argument from unconceived alternatives. I respond to this argument in chapter 6, and I illustrate my response with a discussion of Jean Perrin's experimental work concerning the atomic hypothesis. In the final part of the dissertation, I turn to the specific challenges posed to realism by quantum theories. One of these challenges comes from the experimental violations of Bell's inequalities, which indicate a failure of locality in the quantum domain. I show in chapter 7 how causal realism can further our understanding of quantum non-locality by taking account of some recent experimental results. Another challenge to realism in quantum mechanics comes from delayed-choice experiments, which seem to imply that certain aspects of what happens in an experiment can be influenced by later choices of the experimenter. Chapter 8 analyzes these experiments and argues that they do not warrant the antirealist conclusions which some commentators draw from them. It pays particular attention to the case of delayed-choice entanglement swapping and the corresponding question whether entanglement is a real physical relation. In chapter 9, I finally address relativistic quantum theories. It is often claimed that these theories are incompatible with a particle ontology, and this calls into question causal realism's commitment to localizable and countable entities. I defend the commitments of causal realism against these objections, and I conclude with some remarks connecting the interpretation of quantum field theory to more general metaphysical issues confronting causal realism.