181 results for Resampling methods


Relevance: 20.00%

Abstract:

The recent advances in sequencing technologies have given all microbiology laboratories access to whole-genome sequencing. Provided that tools for the automated analysis of sequence data and databases for the associated meta-data are developed, whole-genome sequencing will become a routine tool for large clinical microbiology laboratories. Indeed, the continuing reduction in sequencing costs and the shortening of the 'time to result' make it an attractive strategy in both research and diagnostics. Here, we review how high-throughput sequencing is revolutionizing clinical microbiology and the promise that it still holds. We discuss major applications, which include: (i) identification of target DNA sequences and antigens to rapidly develop diagnostic tools; (ii) precise strain identification for epidemiological typing and pathogen monitoring during outbreaks; and (iii) investigation of strain properties, such as the presence of antibiotic resistance or virulence factors. In addition, recent developments in comparative metagenomics and single-cell sequencing offer the prospect of a better understanding of complex microbial communities at the global and individual levels, providing a new perspective for understanding host-pathogen interactions. Being a high-resolution tool, high-throughput sequencing will increasingly influence diagnostics, epidemiology, risk management, and patient care.

Relevance: 20.00%

Abstract:

ACuteTox is a project within the 6th European Framework Programme, one of whose goals was to develop, optimise and prevalidate a non-animal testing strategy for predicting human acute oral toxicity. In its last 6 months, a challenging exercise was conducted to assess the predictive capacity of the developed testing strategies and to identify the most promising ones. Thirty-two chemicals were tested blind in the battery of in vitro and in silico methods selected during the first phase of the project. This paper describes the classification approaches studied: single-step procedures and two-step tiered testing strategies. In summary, four in vitro testing strategies were proposed as best performing in terms of predictive capacity with respect to the European acute oral toxicity classification. In addition, a heuristic testing strategy is suggested that combines the prediction results from the neutral red uptake assay performed in 3T3 cells with information on neurotoxicity alerts identified by the primary rat brain aggregates test method. Octanol-water partition coefficients and in silico predictions of intestinal absorption and blood-brain barrier passage are also considered. This approach reduces the number of chemicals wrongly predicted as not classified (LD50 > 2000 mg/kg b.w.).
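A two-step tiered strategy of this kind can be sketched as follows. The thresholds, class labels, and the simple "promote on alert" rule below are illustrative placeholders, not the project's calibrated cut-offs:

```python
def tiered_classification(nru_ic50, neurotox_alert):
    """Hypothetical sketch of a two-step tiered testing strategy.

    Step 1: a cytotoxicity readout (3T3 NRU IC50, mg/ml) gives a
    provisional EU acute oral toxicity class.
    Step 2: a neurotoxicity alert (e.g. from a brain-aggregate assay)
    moves an unclassified chemical into a more toxic class, reducing
    false negatives.  All thresholds here are illustrative only.
    """
    if nru_ic50 < 0.01:
        cls = "very toxic"      # corresponds roughly to LD50 <= 25 mg/kg
    elif nru_ic50 < 0.1:
        cls = "toxic"           # 25 < LD50 <= 200 mg/kg
    elif nru_ic50 < 1.0:
        cls = "harmful"         # 200 < LD50 <= 2000 mg/kg
    else:
        cls = "not classified"  # LD50 > 2000 mg/kg
    if neurotox_alert and cls == "not classified":
        cls = "harmful"         # alert flags a likely false negative
    return cls

print(tiered_classification(2.0, True))   # -> harmful
```

The point of the second tier is exactly the one the abstract makes: cytotoxicity alone under-classifies neurotoxicants, so an orthogonal alert corrects chemicals that would otherwise be wrongly predicted as not classified.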

Relevance: 20.00%

Abstract:

For several decades, the mechanical properties of shallow formations (soils) obtained by sonic-to-ultrasonic wave testing were reported to be greater than those based on mechanical tests. The present article, relying on a statistical analysis of more than 300 tests, shows that the elastic moduli of soil can indeed be obtained from (ultra)sonic tests and that they are identical to those resulting from mechanical tests.
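A resampling-based check of whether two sets of moduli actually differ can be sketched with a percentile bootstrap on the difference of means. The moduli values below are illustrative numbers, not the article's 300-test dataset:

```python
import random

random.seed(1)

def bootstrap_mean_diff_ci(a, b, n_boot=5000, alpha=0.05):
    """Percentile bootstrap confidence interval for mean(a) - mean(b).

    If the interval covers 0, the two measurement methods are
    statistically indistinguishable at the given level.
    """
    diffs = []
    for _ in range(n_boot):
        ra = [random.choice(a) for _ in a]   # resample with replacement
        rb = [random.choice(b) for _ in b]
        diffs.append(sum(ra) / len(ra) - sum(rb) / len(rb))
    diffs.sort()
    lo = diffs[int(alpha / 2 * n_boot)]
    hi = diffs[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

# Illustrative elastic moduli (MPa) from the two test types.
sonic = [52.1, 49.8, 51.5, 50.2, 53.0, 48.9, 50.7, 51.9]
mech  = [50.5, 51.2, 49.1, 52.3, 50.0, 51.8, 49.6, 50.9]
lo, hi = bootstrap_mean_diff_ci(sonic, mech)
print(round(lo, 2), round(hi, 2))
```

With data like these, the interval straddles zero, which is the shape of evidence behind a claim that the two kinds of moduli are identical.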

Relevance: 20.00%

Abstract:

The ancient Greek medical theory based on the balance or imbalance of humors disappeared from the Western world, but survives elsewhere. Is this survival related to a certain degree of health-care efficiency? We explored this hypothesis through a study of classical Greco-Arab medicine in Mauritania. Modern general practitioners evaluated the safety and effectiveness of classical Arabic medicine in a Mauritanian traditional clinic, using a prognosis/follow-up method allowing the following comparisons: (i) actual patient progress (clinical outcome) compared with what the traditional 'tabib' had anticipated (prognostic ability); and (ii) patient progress compared with what could be hoped for if the patient were treated by a modern physician in the same neighborhood. The practice appeared fairly safe and, on average, clinical outcomes were similar to what could be expected with modern medicine. In some cases, patient progress was better than expected. The ability to correctly predict an individual's clinical outcome did not seem to differ between the modern and Greco-Arab frameworks. Weekly joint meetings of modern and traditional practitioners were spontaneously organized with a modern health centre in the neighborhood. Practitioners of a different medical system can thus predict patient progress. For the patient, avoiding false expectations of health care and ensuring appropriate referral may be what matters most. Prognosis and outcome studies such as the one presented here may help to develop institutions where patients find support in making their choices, not only among several treatment options, but also among several medical systems.

Relevance: 20.00%

Abstract:

We present a novel spatiotemporal-adaptive Multiscale Finite Volume (MsFV) method, based on the natural idea that the global coarse-scale problem has a longer characteristic time than the local fine-scale problems. As a consequence, the global problem can be solved with larger time steps than the local problems. In contrast to the pressure-transport splitting usually employed in the standard MsFV approach, we propose to start directly with a local-global splitting that allows the original degree of coupling to be retained locally. This is crucial for highly non-linear systems or in the presence of physical instabilities. To obtain an accurate and efficient algorithm, we devise new adaptive criteria for the global update that are based on changes of coarse-scale quantities rather than fine-scale quantities, as has routinely been done in the adaptive MsFV method. By means of a complexity analysis we show that the adaptive approach gives a noticeable speed-up with respect to the standard MsFV algorithm. In particular, it is efficient for large upscaling factors, which is important for multiphysics problems. Based on the observation that local time stepping acts as a smoother, we devise a self-correcting algorithm that incorporates information from previous times to improve the quality of the multiscale approximation. We present results of multiphase flow simulations both for Darcy-scale and multiphysics (hybrid) problems, in which a local pore-scale description is combined with a global Darcy-like description. The novel spatiotemporal-adaptive multiscale method based on the local-global splitting is not limited to porous media flow problems, but can be extended to any system described by a set of conservation equations.
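The adaptive global-update criterion (re-solve the expensive coarse-scale problem only when a monitored coarse-scale quantity has drifted beyond a tolerance, while cheap local steps run in between) can be sketched generically. The function names and the scalar decay toy problem below are hypothetical stand-ins for the actual MsFV operators:

```python
def run_adaptive(global_solve, local_step, coarse_norm, state,
                 n_steps, substeps, tol):
    """Advance `state` with many cheap local time steps and re-solve the
    expensive global coarse problem only when the monitored coarse-scale
    quantity has changed by more than `tol` (relative) since the last
    global solve.  This is the coarse-quantity-based adaptivity idea, in
    contrast to criteria based on fine-scale changes.
    """
    ref = coarse_norm(state)          # monitor value at last global solve
    global_solves = 0
    for _ in range(n_steps):
        for _ in range(substeps):     # local fine-scale time stepping
            state = local_step(state)
        new = coarse_norm(state)
        if abs(new - ref) > tol * max(abs(ref), 1e-12):
            state = global_solve(state)   # global update, large time step
            global_solves += 1
            ref = coarse_norm(state)
    return state, global_solves

# Toy problem: a decaying scalar; the "global solve" is a no-op
# placeholder so we can count how often the criterion fires.
state, n_global = run_adaptive(
    global_solve=lambda s: s,
    local_step=lambda s: 0.9 * s,
    coarse_norm=lambda s: s,
    state=1.0, n_steps=20, substeps=1, tol=0.5)
print(n_global)  # -> 2
```

With `tol=0.5`, 20 local steps trigger only 2 global solves, which is the source of the speed-up: the larger the upscaling factor (cost ratio between global and local solves), the more this skipping pays off.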

Relevance: 20.00%

Abstract:

Specific properties emerge from the structure of large networks, such as that of worldwide air traffic, including a highly hierarchical node structure and multi-level small world sub-groups that strongly influence future dynamics. We have developed clustering methods to understand the form of these structures, to identify structural properties, and to evaluate the effects of these properties. Graph clustering methods are often constructed from different components: a metric, a clustering index, and a modularity measure to assess the quality of a clustering method. To understand the impact of each of these components on the clustering method, we explore and compare different combinations. These different combinations are used to compare multilevel clustering methods to delineate the effects of geographical distance, hubs, network densities, and bridges on worldwide air passenger traffic. The ultimate goal of this methodological research is to demonstrate evidence of combined effects in the development of an air traffic network. In fact, the network can be divided into different levels of "cohesion", which can be qualified and measured by comparative studies (Newman, 2002; Guimera et al., 2005; Sales-Pardo et al., 2007).
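The modularity measure used to assess clustering quality (Newman, 2002) can be computed directly from its definition. This minimal sketch uses a toy six-node graph, not air-traffic data:

```python
def modularity(edges, communities):
    """Newman modularity Q for an undirected, unweighted graph.

    edges: list of (u, v) pairs; communities: dict node -> label.
    """
    m = len(edges)
    degree = {}
    for u, v in edges:
        degree[u] = degree.get(u, 0) + 1
        degree[v] = degree.get(v, 0) + 1
    # Observed fraction of edges falling inside communities ...
    intra = sum(1 for u, v in edges if communities[u] == communities[v])
    q = intra / m
    # ... minus the fraction expected in a random graph with the
    # same degree sequence.
    comm_degree = {}
    for node, d in degree.items():
        c = communities[node]
        comm_degree[c] = comm_degree.get(c, 0) + d
    q -= sum((d / (2 * m)) ** 2 for d in comm_degree.values())
    return q

# Two triangles joined by a single bridge edge: splitting at the
# bridge yields a clearly positive modularity.
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
parts = {0: 'a', 1: 'a', 2: 'a', 3: 'b', 4: 'b', 5: 'b'}
print(round(modularity(edges, parts), 3))  # -> 0.357
```

Swapping in different metrics or clustering indices while keeping this quality measure fixed is one way to isolate the contribution of each component, as the abstract's comparison of combinations does.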

Relevance: 20.00%

Abstract:

A high-resolution three-dimensional (3-D) seismic reflection survey was conducted in Lake Geneva, near the city of Lausanne, Switzerland, as part of a project for developing such seismic techniques. Using a single 48-channel streamer, the 3-D site, with an area of 1200 m x 600 m, was surveyed in 10 days. A variety of complex geologic structures (e.g. thrusts, folds, channel-fill) up to ~150 m below the water bottom were imaged with a 15 in³ water gun. The 3-D data allowed the construction of an accurate velocity model and the distinction of five major seismic facies within the Lower Freshwater Molasse (Aquitanian) and the Quaternary sedimentary units. Additionally, the Plateau Molasse (PM) and Subalpine Molasse (SM) erosional surface, the "La Paudeze" thrust fault (PM-SM boundary), and the thickness of Quaternary sediments were accurately delineated in 3-D.

Relevance: 20.00%

Abstract:

In International Olympic Committee (IOC)-accredited laboratories, specific methods have been developed to detect anabolic steroids in athletes' urine. The technique of choice is gas chromatography coupled with mass spectrometry (GC-MS). To improve the efficiency of anti-doping programmes, the laboratories have defined new analytical strategies. The final sensitivity of the analytical procedure can be improved by adopting new detection technologies, such as tandem mass spectrometry (MS-MS) or high-resolution mass spectrometry (HRMS). Better sample preparation using immuno-affinity chromatography (IAC) also improves sensitivity. These techniques are suitable for the detection of synthetic anabolic steroids whose structure is not found naturally in the human body. The increasingly evident large-scale use of substances chemically similar to endogenous steroids obliges both the laboratory and the sports authorities to compare the athlete's steroid profile with reference ranges from a population or with intraindividual reference values.
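A steroid-profile check of the kind described (population reference ranges versus intraindividual reference values) can be sketched as follows. The marker, the reference range, and the k-sigma longitudinal rule are illustrative assumptions, not the laboratories' validated criteria:

```python
def flag_steroid_profile(value, population_range, individual_values, k=3.0):
    """Flag a steroid-profile marker (e.g. a urinary steroid ratio) if it
    falls outside the population reference range OR deviates from the
    athlete's own longitudinal mean by more than k standard deviations.
    Thresholds and the rule itself are illustrative only.
    """
    lo, hi = population_range
    if not (lo <= value <= hi):
        return True                      # outside population reference range
    if len(individual_values) >= 3:      # need a longitudinal baseline
        n = len(individual_values)
        mean = sum(individual_values) / n
        var = sum((x - mean) ** 2 for x in individual_values) / (n - 1)
        sd = var ** 0.5
        if sd > 0 and abs(value - mean) > k * sd:
            return True                  # atypical for this individual
    return False

print(flag_steroid_profile(5.2, (0.1, 4.0), [1.1, 1.3, 1.0]))  # -> True
```

The intraindividual branch is what catches endogenous-like doping that stays inside the population range but is abnormal for that particular athlete.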

Relevance: 20.00%

Abstract:

Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending corresponding approaches beyond the local scale still represents a major challenge, yet is critically important for the development of reliable groundwater flow and contaminant transport models. To address this issue, I have developed a hydrogeophysical data integration technique based on a two-step Bayesian sequential simulation procedure that is specifically targeted towards larger-scale problems. The objective is to simulate the distribution of a target hydraulic parameter based on spatially exhaustive, but poorly resolved, measurements of a pertinent geophysical parameter and locally highly resolved, but spatially sparse, measurements of the considered geophysical and hydraulic parameters. To this end, my algorithm links the low- and high-resolution geophysical data via a downscaling procedure before relating the downscaled regional-scale geophysical data to the high-resolution hydraulic parameter field. I first illustrate the application of this novel data integration approach to a realistic synthetic database consisting of collocated high-resolution borehole measurements of the hydraulic and electrical conductivities and spatially exhaustive, low-resolution electrical conductivity estimates obtained from electrical resistivity tomography (ERT). The overall viability of this method is tested and verified by performing and comparing flow and transport simulations through the original and simulated hydraulic conductivity fields.
The corresponding results indicate that the proposed data integration procedure does indeed allow for obtaining faithful estimates of the larger-scale hydraulic conductivity structure and reliable predictions of the transport characteristics over medium- to regional-scale distances. The approach is then applied to a corresponding field scenario consisting of collocated high-resolution measurements of the electrical conductivity, as measured using a cone penetrometer testing (CPT) system, and the hydraulic conductivity, as estimated from electromagnetic flowmeter and slug test measurements, in combination with spatially exhaustive low-resolution electrical conductivity estimates obtained from surface-based electrical resistivity tomography (ERT). The corresponding results indicate that the newly developed data integration approach is indeed capable of adequately capturing both the small-scale heterogeneity and the larger-scale trend of the prevailing hydraulic conductivity field. The results also indicate that this novel data integration approach is remarkably flexible and robust and hence can be expected to be applicable to a wide range of geophysical and hydrological data at all scale ranges. In the second part of my thesis, I evaluate in detail the viability of sequential geostatistical resampling as a proposal mechanism for Markov Chain Monte Carlo (MCMC) methods applied to high-dimensional geophysical and hydrological inverse problems, in order to allow for a more accurate and realistic quantification of the uncertainty associated with the inferred models. Focusing on a series of pertinent crosshole georadar tomographic examples, I investigated two classes of geostatistical resampling strategies with regard to their ability to efficiently and accurately generate independent realizations from the Bayesian posterior distribution.
The corresponding results indicate that, despite its popularity, sequential resampling is rather inefficient at drawing independent posterior samples for realistic synthetic case studies, notably for the practically common and important scenario of pronounced spatial correlation between model parameters. To address this issue, I have developed a new gradual-deformation-based perturbation approach, which is flexible with regard to the number of model parameters as well as the perturbation strength. Compared to sequential resampling, this newly proposed approach was proven to be highly effective in decreasing the number of iterations required for drawing independent samples from the Bayesian posterior distribution.
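The gradual-deformation proposal described above can be sketched as a prior-preserving MCMC move (essentially a preconditioned Crank-Nicolson update). The toy identity forward model, noise level, and data values below are assumptions for illustration, not the thesis's georadar setup:

```python
import math
import random

random.seed(0)

def gradual_deformation(m, theta):
    """Combine the current model m with an independent prior draw z:
    m' = m*cos(theta) + z*sin(theta).  If m and z are standard normal,
    m' is too, so the proposal preserves the Gaussian prior exactly;
    theta controls the perturbation strength.
    """
    z = [random.gauss(0.0, 1.0) for _ in m]
    c, s = math.cos(theta), math.sin(theta)
    return [c * mi + s * zi for mi, zi in zip(m, z)]

def log_likelihood(m, data, sigma=0.5):
    # Toy identity forward model (an assumption for illustration).
    return -sum((mi - di) ** 2 for mi, di in zip(m, data)) / (2.0 * sigma ** 2)

data = [1.0, -0.5, 0.3, 0.8]
m = [random.gauss(0.0, 1.0) for _ in data]
ll = log_likelihood(m, data)
accepted = 0
for _ in range(2000):
    proposal = gradual_deformation(m, theta=0.3)
    ll_prop = log_likelihood(proposal, data)
    # Prior-preserving proposal: the Metropolis acceptance ratio
    # reduces to the likelihood ratio.
    if ll_prop >= ll or random.random() < math.exp(ll_prop - ll):
        m, ll = proposal, ll_prop
        accepted += 1
print(accepted > 0)
```

Because every proposal is already a valid prior sample regardless of how many parameters are perturbed at once, `theta` can be tuned freely, which is the flexibility in dimensionality and perturbation strength that the abstract credits the method with.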