991 results for Sequential application


Relevance:

20.00%

Publisher:

Abstract:

Objective: Biomonitoring of solvents using the unchanged substance in urine as exposure indicator is still relatively scarce, owing to discrepancies between the results reported in the literature. Based on the assessment of toluene exposure, the aim of this work was to evaluate the effects of several steps likely to bias the results and to measure urinary toluene both in volunteers exposed experimentally and in workers of rotogravure factories. Methods: Static headspace was used for toluene analysis; o-cresol was also measured for comparison. Urine collection, storage and conservation conditions were studied to evaluate possible loss or contamination of toluene in controlled situations applied to six volunteers in an exposure chamber, according to four scenarios with exposure at stable levels from 10 to 50 ppm. The elimination kinetics of toluene were determined over 24 h. A field study was then carried out in a total of 29 workers from two rotogravure printing facilities. Results: Potential contamination during urine collection in the field is confirmed to be a real problem, but technical precautions for sampling, storage and analysis can easily be followed to control the situation. In the volunteers at rest, urinary toluene showed a rapid increase after 2 h and a steady level after about 3 h. At 47.1 ppm, the mean cumulative excretion was about 0.005% of the amount of toluene ventilated. The correlation between toluene levels in air and in the end-of-exposure urine sample was excellent (r = 0.965). In the field study, the median personal exposure to toluene was 32 ppm (range 3.6-148). According to the correlations between environmental and biological monitoring data, the post-shift urinary toluene (r = 0.921) and o-cresol (r = 0.873) concentrations corresponding to a personal toluene exposure of 50 ppm were 75.6 µg/l and 0.76 mg/g creatinine, respectively. The corresponding urinary toluene concentration before the next shift was 11 µg/l (r = 0.883). Conclusion: Urinary toluene was once more shown to be a very interesting surrogate for o-cresol and can be recommended as a biomarker of choice for solvent exposure. [Authors]

Relevance:

20.00%

Publisher:

Abstract:

Home Childcarer Approval Scheme Application Form HCC1

Relevance:

20.00%

Publisher:

Abstract:

Analysis of restriction fragment length polymorphism (RFLP) profiles derived from digestion of polymerase chain reaction (PCR) products of the ribosomal 18S gene from Trypanosoma cruzi yields a typical 'riboprint' profile that can vary intraspecifically. A selection of 21 stocks of T. cruzi and three outgroup taxa, T. rangeli, T. conorhini and Leishmania braziliensis, were analysed by riboprinting to assess divergence within and between taxa. T. rangeli, T. conorhini and L. braziliensis could easily be differentiated from each other and from T. cruzi. Phenetic analysis of PCR-RFLP profiles indicated that, with one or two exceptions, stocks of T. cruzi could be broadly partitioned into two groups that formally corresponded to T. cruzi I and T. cruzi II, respectively. To test whether ribosomal 18S sequences were homogeneous within each taxon, gradient gel electrophoresis methods employing either chemical or temperature gradients were used. Interpretation of the melting profiles of riboprints and of a section of the 18S amplified independently by PCR indicates that at least two divergent 18S types are present within T. cruzi. Heterogeneity among copies of the ribosomal 18S within a single genome has therefore been demonstrated and, interestingly, this dimorphic arrangement was also present in the outgroup taxa. Presumably the ancestral duplication that gave rise to the divergent 18S types preceded speciation within this group. These divergent 18S paralogues may have, or may have had, different functional pressures or rates of molecular evolution. Whether these divergent types are equally transcriptionally active throughout the life cycle remains to be assessed.

Relevance:

20.00%

Publisher:

Abstract:

Recognition by the T-cell receptor (TCR) of immunogenic peptides (p) presented by Class I major histocompatibility complexes (MHC) is the key event in the immune response against virus-infected cells or tumor cells. A study of the 2C TCR/SIYR/H-2K(b) system using computational alanine scanning and a much faster binding free energy decomposition based on the Molecular Mechanics-Generalized Born Surface Area (MM-GBSA) method is presented. The results show that the TCR-p-MHC binding free energy decomposition using this approach, including entropic terms, provides a detailed and reliable description of the interactions between the molecules at an atomistic level. Comparison of the decomposition results with experimentally determined activity differences for alanine mutants yields a correlation of 0.67 when the entropy is neglected and 0.72 when the entropy is taken into account. Similarly, comparison of experimental activities with variations in binding free energies determined by computational alanine scanning yields correlations of 0.72 and 0.74 when the entropy is neglected or taken into account, respectively. Some key interactions for TCR-p-MHC binding are analyzed, and possible side-chain replacements are proposed in the context of TCR protein engineering. In addition, a comparison of the two theoretical approaches for estimating the role of each side chain in the complexation is given, and a new ad hoc approach to decompose the vibrational entropy term into atomic contributions, the linear decomposition of the vibrational entropy (LDVE), is introduced. The latter allows the rapid calculation of the entropic contribution of side chains of interest to the binding. This new method is based on the idea that the most important contributions to the vibrational entropy of a molecule originate from the residues that contribute most to the vibrational amplitude of the normal modes. The LDVE approach is shown to provide results very similar to those of the exact but computationally highly demanding method.
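The LDVE idea described above lends itself to a compact illustration. The following Python sketch is only a minimal, hypothetical rendering of that idea, not the authors' implementation: it assumes mass-weighted harmonic normal modes (frequencies and eigenvectors) are already available, computes each mode's quantum harmonic oscillator entropy, and spreads that entropy over residues in proportion to the squared amplitude their atoms carry in the mode; all function and variable names are illustrative.

```python
import numpy as np

KB = 1.380649e-23        # Boltzmann constant, J/K
HBAR = 1.054571817e-34   # reduced Planck constant, J*s

def mode_entropy(omega, temperature=300.0):
    """Vibrational entropy of one harmonic mode with angular frequency omega (rad/s)."""
    x = HBAR * omega / (KB * temperature)
    return KB * (x / np.expm1(x) - np.log1p(-np.exp(-x)))

def ldve_per_residue(frequencies, eigenvectors, atom_to_residue, temperature=300.0):
    """LDVE-style split of the vibrational entropy into per-residue contributions.

    frequencies     : (n_modes,) angular frequencies in rad/s
    eigenvectors    : (n_modes, 3*n_atoms) mass-weighted normal-mode vectors
    atom_to_residue : (n_atoms,) residue index of each atom
    """
    s_res = np.zeros(atom_to_residue.max() + 1)
    for omega, vec in zip(frequencies, eigenvectors):
        s_mode = mode_entropy(omega, temperature)
        amp2 = (vec.reshape(-1, 3) ** 2).sum(axis=1)  # per-atom squared amplitude
        weights = amp2 / amp2.sum()                   # normalize within the mode
        np.add.at(s_res, atom_to_residue, s_mode * weights)
    return s_res  # entropy contribution of each residue, J/K
```

Summing the returned array over all residues recovers the total vibrational entropy, which is what makes this decomposition linear in the sense used above.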

Relevance:

20.00%

Publisher:

Abstract:

WHAT'S KNOWN ON THE SUBJECT? AND WHAT DOES THE STUDY ADD?: The AMS 800 urinary control system is the gold standard for the treatment of urinary incontinence due to sphincter insufficiency. Despite excellent functional outcomes and the latest technological improvements, the revision rate remains significant. To overcome the shortcomings of the current device, we developed a modern electromechanical artificial urinary sphincter. The results demonstrated that this new sphincter is effective and well tolerated up to 3 months. This preliminary study represents a first step in the clinical application of novel technologies and of an alternative compression mechanism for the urethra. OBJECTIVES: To evaluate the effectiveness of a new electromechanical artificial urinary sphincter (emAUS) in achieving continence in an animal model. To assess the urethral response and the animals' general response to short-term and mid-term activation of the emAUS. MATERIALS AND METHODS: The principle of the emAUS is electromechanical induction of alternating compression of successive segments of the urethra by a series of cuffs activated by artificial muscles. Between February 2009 and May 2010 the emAUS was implanted in 17 sheep divided into three groups. The first phase aimed to measure bladder leak point pressure during activation of the device. The second and third phases aimed to assess the tissue response to the presence of the device after 2-9 weeks and after 3 months, respectively. Histopathological and immunohistochemical evaluation of the urethra was performed. RESULTS: Bladder leak point pressure was measured at levels between 1091 ± 30.6 cmH2O and 1244.1 ± 99 cmH2O (mean ± standard deviation), depending on the number of cuffs used. At gross examination, the explanted urethra showed no sign of infection, atrophy or stricture. On microscopic examination, no significant structural difference was found between urethra surrounded by a cuff and control urethra. In the peripheral tissues, the implanted material elicited a chronic foreign body reaction. Apart from one case, specimens did not show significant presence of lymphocytes, polymorphonuclear leucocytes, necrosis or cell degeneration. Immunohistochemistry confirmed the absence of macrophages in the samples. CONCLUSIONS: This animal study shows that the emAUS can provide continence. This new electronically controlled sequential alternating compression mechanism can avoid damage to urethral vascularity, at least up to 3 months after implantation. After this positive proof of concept, long-term studies are needed before clinical application can be considered.

Relevance:

20.00%

Publisher:

Abstract:

2005 - 2006

Relevance:

20.00%

Publisher:

Abstract:

Biomphalaria glabrata snails, highly susceptible to Schistosoma mansoni, were seen to shed fewer and fewer cercariae over the course of infection. Histological findings correlated closely with this changing pattern of cercarial shedding: an initial picture of no reaction (tolerance) gradually gave way to one of hemocyte proliferation with formation of focal encapsulating lesions around disintegrating sporocysts and cercariae, a change that had become disseminated by the 142nd day after miracidial exposure. The findings suggest a gradual development of acquired immunity in snails infected with S. mansoni.

Relevance:

20.00%

Publisher:

Abstract:

Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending the corresponding approaches beyond the local scale still represents a major challenge, yet is critically important for the development of reliable groundwater flow and contaminant transport models. To address this issue, I have developed a hydrogeophysical data integration technique based on a two-step Bayesian sequential simulation procedure that is specifically targeted towards larger-scale problems. The objective is to simulate the distribution of a target hydraulic parameter based on spatially exhaustive, but poorly resolved, measurements of a pertinent geophysical parameter and locally highly resolved, but spatially sparse, measurements of the considered geophysical and hydraulic parameters. To this end, my algorithm links the low- and high-resolution geophysical data via a downscaling procedure before relating the downscaled regional-scale geophysical data to the high-resolution hydraulic parameter field. I first illustrate the application of this novel data integration approach to a realistic synthetic database consisting of collocated high-resolution borehole measurements of the hydraulic and electrical conductivities and spatially exhaustive, low-resolution electrical conductivity estimates obtained from electrical resistivity tomography (ERT). The overall viability of this method is tested and verified by performing and comparing flow and transport simulations through the original and simulated hydraulic conductivity fields. The corresponding results indicate that the proposed data integration procedure does indeed allow for obtaining faithful estimates of the larger-scale hydraulic conductivity structure and reliable predictions of the transport characteristics over medium- to regional-scale distances. The approach is then applied to a corresponding field scenario consisting of collocated high-resolution measurements of the electrical conductivity, as measured using a cone penetrometer testing (CPT) system, and the hydraulic conductivity, as estimated from electromagnetic flowmeter and slug test measurements, in combination with spatially exhaustive, low-resolution electrical conductivity estimates obtained from surface-based electrical resistivity tomography (ERT). The corresponding results indicate that the newly developed data integration approach is indeed capable of adequately capturing both the small-scale heterogeneity and the larger-scale trend of the prevailing hydraulic conductivity field. The results also indicate that this novel data integration approach is remarkably flexible and robust and hence can be expected to be applicable to a wide range of geophysical and hydrological data at all scale ranges.
In the second part of my thesis, I evaluate in detail the viability of sequential geostatistical resampling as a proposal mechanism for Markov Chain Monte Carlo (MCMC) methods applied to high-dimensional geophysical and hydrological inverse problems, in order to allow for a more accurate and realistic quantification of the uncertainty associated with the inferred models. Focusing on a series of pertinent crosshole georadar tomographic examples, I investigate two classes of geostatistical resampling strategies with regard to their ability to efficiently and accurately generate independent realizations from the Bayesian posterior distribution. The corresponding results indicate that, despite its popularity, sequential resampling is rather inefficient at drawing independent posterior samples for realistic synthetic case studies, notably for the practically common and important scenario of pronounced spatial correlation between model parameters. To address this issue, I have developed a new gradual-deformation-based perturbation approach, which is flexible with regard to the number of model parameters as well as the perturbation strength. Compared to sequential resampling, this newly proposed approach proves highly effective in decreasing the number of iterations required for drawing independent samples from the Bayesian posterior distribution.
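To make the gradual-deformation proposal concrete, the Python sketch below shows one way such a move can be embedded in a Metropolis sampler, assuming the model is parameterized as standard-normal deviates (for example, the white noise driving a geostatistical simulator). Written with cosine/sine weights, the proposal preserves the Gaussian prior exactly, so only the likelihood appears in the acceptance ratio; the function names and the theta parameterization are illustrative and not taken from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

def gradual_deformation_step(m_current, log_likelihood, theta=0.1, rng=rng):
    """One Metropolis step with a gradual-deformation (prior-preserving) proposal.

    m_current      : current model expressed as standard-normal deviates
    log_likelihood : callable returning the data log-likelihood of a model
    theta          : perturbation strength in (0, 1]; smaller values give
                     smaller, more strongly correlated moves
    """
    z = rng.standard_normal(m_current.shape)      # independent draw from the prior
    angle = 0.5 * np.pi * theta
    m_proposal = np.cos(angle) * m_current + np.sin(angle) * z

    # The proposal leaves the standard-normal prior invariant, so the
    # Metropolis ratio reduces to the likelihood ratio.
    log_alpha = log_likelihood(m_proposal) - log_likelihood(m_current)
    if np.log(rng.uniform()) < log_alpha:
        return m_proposal, True
    return m_current, False
```

In this sketch theta plays the role of the perturbation strength mentioned above: values close to 0 yield small, high-acceptance moves, whereas theta = 1 replaces the model by an independent prior draw, so tuning it trades acceptance rate against step size.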

Relevance:

20.00%

Publisher:

Abstract:

This project aims to create and apply a methodology to an application called MATE, which was created in 2003 by Anna Sikora for her doctoral thesis. The goal is to provide the MATE project with the tools needed to guarantee its evolution. The methodology developed consists of the specification of a working environment and a series of documents detailing the processes involved in the development of MATE. In addition, several new features have been created that make MATE a more complete and convenient tool.

Relevance:

20.00%

Publisher:

Abstract:

Casework experience has shown that, in some cases, long exposure of surfaces to cyanoacrylate (CA) fuming had detrimental effects on the subsequent application of Bluestar. This study aimed to develop a control mechanism to monitor the amount of CA deposited prior to the subsequent treatment. A control slide bearing spots of sodium hydroxide (NaOH) of known concentration and volume was designed and validated against both scanning electron microscopy (SEM) observations and latent print examiners' assessments of the quality of the developed marks. The control slide allows three levels of development to be defined, which were used to monitor the Bluestar reaction on depleting footwear marks left in diluted blood. The appropriate conditions for a successful application of both CA and Bluestar were determined.

Relevance:

20.00%

Publisher:

Abstract:

Recent progress in the experimental determination of protein structures allows the molecular recognition mechanisms that underlie living matter to be understood at a very detailed level. This level of understanding makes it possible to design rational therapeutic approaches, in which effector molecules are adapted or created de novo to perform a given function. An example of such an approach is drug design, where small inhibitory molecules are designed using in silico simulations and tested in vitro. In this article, we present a similar approach to rationally optimize the sequence of killer T lymphocyte receptors to make them more efficient against melanoma cells. The architecture of this translational research project is presented together with its implications both for basic research and for the clinic.