188 results for perturbation methods


Relevance: 20.00%

Abstract:

In International Olympic Committee (IOC)-accredited laboratories, specific methods have been developed to detect anabolic steroids in athletes' urine. The technique of choice is gas chromatography coupled with mass spectrometry (GC-MS). To improve the efficiency of anti-doping programmes, the laboratories have defined new analytical strategies. The final sensitivity of the analytical procedure can be improved by adopting new detection technologies, such as tandem mass spectrometry (MS-MS) or high-resolution mass spectrometry (HRMS). Better sample preparation using immuno-affinity chromatography (IAC) is another good tool for improving sensitivity. These techniques are suitable for detecting synthetic anabolic steroids whose structures are not found naturally in the human body. The increasingly evident large-scale use of substances chemically similar to endogenous steroids obliges both the laboratories and the sports authorities to compare the athlete's steroid profile with reference ranges from a population or with intraindividual reference values.
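The closing comparison of an athlete's steroid profile against population and intraindividual reference values can be sketched as a simple screening rule. All thresholds, marker values, and the z-score cut-off below are hypothetical illustrations, not IOC criteria:

```python
import statistics

def flag_sample(value, population_range, own_history, z_cut=3.0):
    """Flag a steroid-profile marker (e.g. a urinary steroid ratio) as atypical
    if it falls outside a population reference range, or if it deviates strongly
    from the athlete's own longitudinal baseline (intraindividual reference)."""
    lo, hi = population_range
    if not lo <= value <= hi:
        return "outside population reference range"
    if len(own_history) >= 2:
        mu = statistics.mean(own_history)
        sd = statistics.stdev(own_history)
        if sd > 0 and abs(value - mu) / sd > z_cut:
            return "atypical versus intraindividual baseline"
    return "within reference limits"

# Hypothetical numbers for illustration only:
print(flag_sample(5.2, population_range=(0.1, 4.0), own_history=[1.0, 1.2, 0.9]))
```

The intraindividual branch mirrors the longitudinal-profile idea mentioned in the abstract: once enough of an athlete's own samples exist, they form a tighter reference than the population range.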

Relevance: 20.00%

Abstract:

Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending corresponding approaches beyond the local scale still represents a major challenge, yet is critically important for the development of reliable groundwater flow and contaminant transport models. To address this issue, I have developed a hydrogeophysical data integration technique based on a two-step Bayesian sequential simulation procedure that is specifically targeted towards larger-scale problems. The objective is to simulate the distribution of a target hydraulic parameter based on spatially exhaustive, but poorly resolved, measurements of a pertinent geophysical parameter and locally highly resolved, but spatially sparse, measurements of the considered geophysical and hydraulic parameters. To this end, my algorithm links the low- and high-resolution geophysical data via a downscaling procedure before relating the downscaled regional-scale geophysical data to the high-resolution hydraulic parameter field. I first illustrate the application of this novel data integration approach to a realistic synthetic database consisting of collocated high-resolution borehole measurements of the hydraulic and electrical conductivities and spatially exhaustive, low-resolution electrical conductivity estimates obtained from electrical resistivity tomography (ERT). The overall viability of this method is tested and verified by performing and comparing flow and transport simulations through the original and simulated hydraulic conductivity fields.
The corresponding results indicate that the proposed data integration procedure does indeed allow for obtaining faithful estimates of the larger-scale hydraulic conductivity structure and reliable predictions of the transport characteristics over medium- to regional-scale distances. The approach is then applied to a corresponding field scenario consisting of collocated high-resolution measurements of the electrical conductivity, as measured using a cone penetrometer testing (CPT) system, and the hydraulic conductivity, as estimated from electromagnetic flowmeter and slug test measurements, in combination with spatially exhaustive low-resolution electrical conductivity estimates obtained from surface-based electrical resistivity tomography (ERT). The corresponding results indicate that the newly developed data integration approach is indeed capable of adequately capturing both the small-scale heterogeneity as well as the larger-scale trend of the prevailing hydraulic conductivity field. The results also indicate that this novel data integration approach is remarkably flexible and robust and hence can be expected to be applicable to a wide range of geophysical and hydrological data at all scale ranges. In the second part of my thesis, I evaluate in detail the viability of sequential geostatistical resampling as a proposal mechanism for Markov Chain Monte Carlo (MCMC) methods applied to high-dimensional geophysical and hydrological inverse problems in order to allow for a more accurate and realistic quantification of the uncertainty associated with the thus inferred models. Focusing on a series of pertinent crosshole georadar tomographic examples, I investigate two classes of geostatistical resampling strategies with regard to their ability to efficiently and accurately generate independent realizations from the Bayesian posterior distribution.
The corresponding results indicate that, despite its popularity, sequential resampling is rather inefficient at drawing independent posterior samples for realistic synthetic case studies, notably for the practically common and important scenario of pronounced spatial correlation between model parameters. To address this issue, I have developed a new gradual-deformation-based perturbation approach, which is flexible with regard to the number of model parameters as well as the perturbation strength. Compared to sequential resampling, this newly proposed approach was proven to be highly effective in decreasing the number of iterations required for drawing independent samples from the Bayesian posterior distribution.
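The gradual-deformation idea behind the new perturbation approach can be illustrated in a few lines. The sketch below is the generic textbook form of gradual deformation, combining two independent Gaussian realizations with weights cos θ and sin θ so that every intermediate field keeps the same mean and covariance; it is not the author's exact MCMC proposal scheme:

```python
import numpy as np

def gradual_deformation(m_current, m_proposal, theta):
    """Combine two independent zero-mean Gaussian realizations with weights
    cos(theta) and sin(theta).  Because cos^2 + sin^2 = 1, the result has the
    same mean and covariance as the inputs, so every step stays a valid
    realization of the geostatistical model; small theta = small perturbation."""
    return np.cos(theta) * m_current + np.sin(theta) * m_proposal

rng = np.random.default_rng(0)
m1 = rng.standard_normal(1000)   # current model realization
m2 = rng.standard_normal(1000)   # independent proposal realization

# With a small theta the new field is highly correlated with the current one,
# yielding the gentle, tunable MCMC proposals described above.
m_new = gradual_deformation(m1, m2, theta=0.1)
print(np.corrcoef(m1, m_new)[0, 1])
```

Tuning θ directly controls the perturbation strength, which is the flexibility highlighted in the abstract: θ near 0 gives near-identical models and high acceptance rates, θ near π/2 gives an independent redraw.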

Relevance: 20.00%

Abstract:

Oscillations have been increasingly recognized as a core property of neural responses that contribute to spontaneous, induced, and evoked activities within and between individual neurons and neural ensembles. They are considered a prominent mechanism for information processing within, and communication between, brain areas. More recently, it has been proposed that interactions between periodic components at different frequencies, known as cross-frequency couplings, may support the integration of neuronal oscillations at different temporal and spatial scales. The present study details methods based on an adaptive frequency tracking approach that improve the quantification and statistical analysis of oscillatory components and cross-frequency couplings. This approach allows for time-varying instantaneous frequency, which is particularly important when measuring phase interactions between components. We compared this adaptive approach with traditional band-pass filters in their measurement of phase-amplitude and phase-phase cross-frequency couplings. Evaluations were performed with synthetic signals and with EEG data recorded from healthy humans performing an illusory contour discrimination task. First, the synthetic signals, in conjunction with Monte Carlo simulations, highlighted two desirable features of the proposed algorithm versus classical filter-bank approaches: resilience to broad-band noise and to oscillatory interference. Second, the analyses with real EEG signals revealed statistically more robust effects (i.e. improved sensitivity) when using the adaptive frequency tracking framework, particularly when identifying phase-amplitude couplings. This was further confirmed after generating surrogate signals from the real EEG data. Adaptive frequency tracking appears to improve the measurement of cross-frequency couplings through precise extraction of neuronal oscillations.
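As a point of reference, the classical filter-bank measurement of phase-amplitude coupling that the study benchmarks against can be sketched as follows. This uses ordinary Butterworth band-pass filters and the mean-vector-length modulation index; the bands and the synthetic test signal are illustrative choices, not those of the study, which replaces the fixed filters with adaptive frequency tracking:

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def phase_amplitude_coupling(x, fs, phase_band, amp_band):
    """Mean-vector-length modulation index between the phase of a slow band
    and the amplitude envelope of a fast band, both extracted with fixed
    Butterworth band-pass filters and the Hilbert transform."""
    def bandpass(sig, lo, hi):
        b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
        return filtfilt(b, a, sig)

    phase = np.angle(hilbert(bandpass(x, *phase_band)))
    amp = np.abs(hilbert(bandpass(x, *amp_band)))
    return np.abs(np.mean(amp * np.exp(1j * phase)) / np.mean(amp))

# Synthetic test: a 40 Hz rhythm whose amplitude is modulated by an 8 Hz rhythm.
fs = 500
t = np.arange(0, 10, 1 / fs)
slow = np.sin(2 * np.pi * 8 * t)
coupled = (1 + slow) * np.sin(2 * np.pi * 40 * t) + slow
uncoupled = np.sin(2 * np.pi * 40 * t) + slow

print(phase_amplitude_coupling(coupled, fs, (6, 10), (30, 50)))    # high MI
print(phase_amplitude_coupling(uncoupled, fs, (6, 10), (30, 50)))  # near zero
```

The fixed bands are exactly where this estimator becomes fragile when the oscillation's instantaneous frequency drifts, which motivates the adaptive tracking approach of the study.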

Relevance: 20.00%

Abstract:

This paper characterizes and evaluates the potential of three commercial CT iterative reconstruction methods (ASIR, VEO and iDose(4)) for dose reduction and image quality improvement. We measured CT number accuracy, standard deviation (SD), noise power spectrum (NPS) and modulation transfer function (MTF) metrics on Catphan phantom images, while five human observers performed four-alternative forced-choice (4AFC) experiments to assess the detectability of low- and high-contrast objects embedded in two pediatric phantoms. Results show that 40% and 100% ASIR, as well as iDose(4) levels 3 and 6, do not affect CT number and strongly decrease image noise, with relative SD constant over a large dose range. However, while ASIR produces a shift of the NPS curve apex, less change is observed with iDose(4) with respect to FBP methods. With the second-generation iterative reconstruction VEO, physical metrics are improved even further: SD decreased to 70.4% at 0.5 mGy and spatial resolution improved to 37% (MTF 50%). 4AFC experiments show that few improvements in detection task performance are obtained with ASIR and iDose(4), whereas VEO makes excellent detection possible even at an ultra-low dose (0.3 mGy), leading to a potential dose reduction of a factor of 3 to 7 (67%-86%). In spite of its longer reconstruction time, and although clinical studies are still required to complete these results, VEO clearly confirms the tremendous potential of iterative reconstruction for dose reduction in CT and appears to be an important tool for patient follow-up, especially for pediatric patients, for whom the cumulative lifetime dose remains high.
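For readers unfamiliar with the NPS metric used above, a generic textbook estimator (not the calibrated procedure of the paper) computes it from an ensemble of noise-only regions of interest via the 2-D FFT:

```python
import numpy as np

def noise_power_spectrum(rois, pixel_size_mm):
    """2-D noise power spectrum estimate from an ensemble of noise-only ROIs
    (e.g. uniform-phantom patches): detrend each ROI, average the squared
    FFT magnitudes, and scale to physical units (HU^2 mm^2)."""
    rois = np.asarray(rois, dtype=float)
    n, ny, nx = rois.shape
    rois = rois - rois.mean(axis=(1, 2), keepdims=True)  # remove per-ROI mean
    nps2d = np.mean(np.abs(np.fft.fft2(rois)) ** 2, axis=0)
    nps2d *= pixel_size_mm ** 2 / (nx * ny)
    return np.fft.fftshift(nps2d)

# Sanity check: for white noise the NPS is flat and its integral over
# frequency recovers the pixel variance (Parseval).
rng = np.random.default_rng(1)
rois = rng.normal(0.0, 10.0, size=(64, 32, 32))          # sigma = 10 HU
nps = noise_power_spectrum(rois, pixel_size_mm=0.5)
variance_from_nps = nps.sum() / (0.5 ** 2 * 32 * 32)
print(variance_from_nps)   # close to 100 (= sigma^2)
```

The shift of the NPS apex reported for ASIR corresponds, in this picture, to noise power migrating toward lower spatial frequencies, which changes noise texture even when the SD alone looks improved.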

Relevance: 20.00%

Abstract:

PURPOSE OF REVIEW: Invasive candidiasis is a severe infectious complication occurring mostly in onco-hematologic and surgical patients. Its conventional diagnosis is insensitive and often late, leading to delayed treatment and high mortality. The purpose of this article is to review recent contributions to the nonconventional diagnostic approaches of invasive candidiasis, both for the detection of the episode and for the characterization of the etiologic agent. RECENT FINDINGS: Antigen-based tests to detect invasive candidiasis comprise a specific test, mannan, as well as a nonspecific test, beta-D-glucan. Both have a moderate sensitivity and a high specificity, and cannot be recommended alone as a negative screening tool or a positive syndrome-driven diagnostic tool. Molecular-based tests still have not reached the stage of rapid, easy-to-use, standardized tests ideally complementing blood culture at the time of blood sampling. New tests (fluorescence in-situ hybridization or mass spectrometry) significantly reduce the delay of identification of Candida at the species level in positive blood cultures, and should have a positive impact on earlier appropriate antifungal therapy and possibly on outcome. SUMMARY: Both antigen-based and molecular tests appear to be promising new tools to complement and accelerate the conventional diagnosis of invasive candidiasis, with an expected significant impact on earlier and more focused treatment and on prognosis.

Relevance: 20.00%

Abstract:

Fraud is as old as mankind. An enormous number of historical documents show the interplay between truth and untruth; it is therefore not really surprising that the prevalence of publication discrepancies is increasing. More surprising is that new cases, especially in the medical field, generate such huge astonishment. In financial mathematics, a statistical tool for the detection of fraud is known which uses the observation of Newcomb and Benford regarding the distribution of natural numbers: the distribution of leading digits is not uniform, and lower digits occur more frequently than higher ones. In this investigation, all numbers contained in the blinded abstracts of the 2009 annual meeting of the Swiss Society of Anesthesia and Resuscitation (SGAR) were recorded and analyzed with regard to their distribution. A manipulated abstract was also included in the investigation. The χ²-test was used to determine statistical differences between expected and observed counts of numbers, with p<0.05 considered significant. The distribution of the 1,800 numbers in the 77 submitted abstracts followed Benford's law. The manipulated abstract was detected by statistical means (expected versus observed, p<0.05). Statistics cannot prove whether content is true or not, but can give serious hints to look into the details of such conspicuous material. These are the first results of a test for the distribution of numbers presented in medical research.
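The Benford-based screening described above can be reproduced in a few lines: count leading digits, compare with the Newcomb-Benford expectation, and apply the χ²-test. The synthetic data below are illustrative stand-ins for the 1,800 abstract numbers of the study:

```python
import math
import random
from collections import Counter

def first_digit(x):
    """First significant digit of a positive number."""
    while x >= 10:
        x /= 10
    while x < 1:
        x *= 10
    return int(x)

def benford_chi2(numbers):
    """Chi-squared statistic comparing observed leading-digit counts with the
    Newcomb-Benford expectation P(d) = log10(1 + 1/d); with 8 degrees of
    freedom the critical value at p = 0.05 is about 15.5."""
    counts = Counter(first_digit(abs(x)) for x in numbers if x != 0)
    n = sum(counts.values())
    chi2 = 0.0
    for d in range(1, 10):
        expected = n * math.log10(1 + 1 / d)
        chi2 += (counts.get(d, 0) - expected) ** 2 / expected
    return chi2

# Benford-like data (log-uniform magnitudes) versus uniformly drawn numbers:
random.seed(0)
benford_like = [10 ** random.uniform(0, 3) for _ in range(1800)]
uniform = [random.uniform(1, 1000) for _ in range(1800)]
print(benford_chi2(benford_like))  # small: consistent with Benford's law
print(benford_chi2(uniform))       # large: flagged as conspicuous
```

A large χ² does not prove fabrication, exactly as the abstract cautions; it only marks material whose numbers deviate from the expected digit distribution and deserve a closer look.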

Relevance: 20.00%

Abstract:

Bacteria can survive on hospital textiles and surfaces, from which they can be disseminated, representing a source of health care-associated infections (HCAIs). Surfaces containing copper (Cu), which is known for its bactericidal properties, could be an efficient way to lower the burden of potential pathogens. The antimicrobial activity of Cu-sputtered polyester surfaces, obtained by direct-current magnetron sputtering (DCMS), against methicillin-resistant Staphylococcus aureus (MRSA) was tested. The Cu-polyester samples were characterized by high-resolution transmission electron microscopy to determine the microstructure of the Cu nanoparticles and by profilometry to assess the thickness of the layers. Sputtering at 300 mA for 160 s led to a Cu film thickness of 20 nm (100 Cu layers) and a Cu content of 0.209% (wt/wt) on polyester. The viability of MRSA strain ATCC 43300 on Cu-sputtered polyester was evaluated by four methods: (i) mechanical detachment, (ii) microcalorimetry, (iii) direct transfer onto plates and (iv) stereomicroscopy. The low efficacy of mechanical detachment impeded bacterial viability estimations, and microcalorimetry provided only semiquantitative results. Direct transfer onto plates and stereomicroscopy appeared to be the most suitable methods for evaluating the bacterial inactivation potential of Cu-sputtered polyester surfaces, since they presented the least experimental bias. Cu-polyester samples sputtered for 160 s by DCMS were further tested against 10 clinical MRSA isolates and showed a high level of bactericidal activity, with a 4-log10 reduction of the initial MRSA load (10^6 CFU) within 1 h. Cu-sputtered polyester surfaces might be of use in preventing the transmission of HCAI pathogens.

Relevance: 20.00%

Abstract:

We extend PML theory to account for information on the conditional moments up to order four, but without assuming a parametric model, in order to avoid the risk of misspecification of the conditional distribution. The key statistical tool is the quartic exponential family, which allows us to generalize the PML2 and QGPML1 methods proposed in Gourieroux et al. (1984) to PML4 and QGPML2 methods, respectively. An asymptotic theory is developed. The key numerical tool is the Gauss-Freud integration scheme, which solves a computational problem previously raised in several fields. Simulation exercises demonstrate the feasibility and robustness of the methods.
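A sense of the computational problem solved by the quadrature scheme can be given with a small sketch: moments of a quartic exponential density, f(x) ∝ exp(a₁x + a₂x² + a₃x³ + a₄x⁴) with a₄ < 0, obtained by Gaussian quadrature. For lack of a readily available Freud-type rule, the sketch reuses Gauss-Hermite nodes and reweights the integrand; it is an illustrative stand-in, not the authors' Gauss-Freud scheme:

```python
import numpy as np

def quartic_moments(a, n_nodes=200):
    """Mean and variance of f(x) proportional to
    exp(a1*x + a2*x^2 + a3*x^3 + a4*x^4), a4 < 0, by quadrature.
    Gauss-HermiteE nodes/weights (weight exp(-x^2/2)) are reused by
    multiplying the integrand by exp(x^2/2) inside the exponent."""
    a1, a2, a3, a4 = a
    x, w = np.polynomial.hermite_e.hermegauss(n_nodes)
    g = np.exp(a1 * x + a2 * x**2 + a3 * x**3 + a4 * x**4 + x**2 / 2)
    z = np.sum(w * g)                      # normalizing constant
    mean = np.sum(w * x * g) / z
    var = np.sum(w * x**2 * g) / z - mean**2
    return mean, var

# Sanity check against a closed form: for f proportional to exp(-x^4) the
# variance is Gamma(3/4)/Gamma(1/4), approximately 0.338, with mean 0.
print(quartic_moments((0.0, 0.0, 0.0, -1.0)))
```

Because the quartic term dominates the tails, naive Monte Carlo or Hermite quadrature without reweighting handles such densities poorly, which is the computational difficulty a dedicated Freud-type rule addresses.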

Relevance: 20.00%

Abstract:

In occupational exposure assessment of airborne contaminants, exposure levels can either be estimated through repeated measurements of the pollutant concentration in air, expert judgment or through exposure models that use information on the conditions of exposure as input. In this report, we propose an empirical hierarchical Bayesian model to unify these approaches. Prior to any measurement, the hygienist conducts an assessment to generate prior distributions of exposure determinants. Monte-Carlo samples from these distributions feed two level-2 models: a physical, two-compartment model, and a non-parametric, neural network model trained with existing exposure data. The outputs of these two models are weighted according to the expert's assessment of their relevance to yield predictive distributions of the long-term geometric mean and geometric standard deviation of the worker's exposure profile (level-1 model). Bayesian inferences are then drawn iteratively from subsequent measurements of worker exposure. Any traditional decision strategy based on a comparison with occupational exposure limits (e.g. mean exposure, exceedance strategies) can then be applied. Data on 82 workers exposed to 18 contaminants in 14 companies were used to validate the model with cross-validation techniques. A user-friendly program running the model is available upon request.
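The physical level-2 model mentioned above can be illustrated with the standard steady-state two-compartment (near-field/far-field) solution; the parameter values below form a hypothetical scenario, and the report's actual model may differ in detail:

```python
def near_far_field(G, Q, beta):
    """Steady-state concentrations of the classic two-compartment
    (near-field/far-field) exposure model: a contaminant is emitted at rate G
    (mg/min) into a near-field zone around the worker, which exchanges air at
    rate beta (m^3/min) with the far field of a room ventilated at Q (m^3/min)."""
    c_far = G / Q                # mg/m^3 in the room at large
    c_near = c_far + G / beta    # the worker's breathing zone is higher
    return c_near, c_far

# Hypothetical scenario: 10 mg/min source, 20 m^3/min general ventilation,
# 5 m^3/min near-field air exchange.
c_near, c_far = near_far_field(G=10.0, Q=20.0, beta=5.0)
print(c_near, c_far)   # 2.5 mg/m^3 near field, 0.5 mg/m^3 far field
```

In the hierarchical scheme described above, Monte Carlo draws of G, Q and beta from the hygienist's prior distributions would feed this model, alongside the neural network, to produce the predictive exposure distribution.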

Relevance: 20.00%

Abstract:

Background: The imatinib trough plasma concentration (Cmin) correlates with clinical response in cancer patients. Therapeutic drug monitoring (TDM) of plasma Cmin is therefore suggested. In practice, however, blood sampling for TDM is often not performed at trough. The corresponding measurement is thus only remotely informative about Cmin exposure. Objectives: The objectives of this study were to improve the interpretation of randomly measured concentrations by using a Bayesian approach for the prediction of Cmin, incorporating correlation between pharmacokinetic parameters, and to compare the predictive performance of this method with alternative approaches, by comparing predictions with actual measured trough levels, and with predictions obtained by a reference method, respectively. Methods: A Bayesian maximum a posteriori (MAP) estimation method accounting for correlation (MAP-ρ) between pharmacokinetic parameters was developed on the basis of a population pharmacokinetic model, which was validated on external data. Thirty-one paired random and trough levels, observed in gastrointestinal stromal tumour patients, were then used for the evaluation of the Bayesian MAP-ρ method: individual Cmin predictions, derived from single random observations, were compared with actual measured trough levels for assessment of predictive performance (accuracy and precision). The method was also compared with alternative approaches: classical Bayesian MAP estimation assuming uncorrelated pharmacokinetic parameters, linear extrapolation along the typical elimination constant of imatinib, and non-linear mixed-effects modelling (NONMEM) first-order conditional estimation (FOCE) with interaction. Predictions of all methods were finally compared with 'best-possible' predictions obtained by a reference method (NONMEM FOCE, using both random and trough observations for individual Cmin prediction).
Results: The developed Bayesian MAP-ρ method accounting for correlation between pharmacokinetic parameters allowed non-biased prediction of imatinib Cmin with a precision of ±30.7%. This predictive performance was similar for the alternative methods that were applied. The range of relative prediction errors was, however, smallest for the Bayesian MAP-ρ method and largest for the linear extrapolation method. When compared with the reference method, predictive performance was comparable for all methods. The time interval between random and trough sampling did not influence the precision of Bayesian MAP-ρ predictions. Conclusion: Clinical interpretation of randomly measured imatinib plasma concentrations can be assisted by Bayesian TDM. Classical Bayesian MAP estimation can be applied even without consideration of the correlation between pharmacokinetic parameters. Individual Cmin predictions are expected to vary less through Bayesian TDM than through linear extrapolation. Bayesian TDM could be developed in the future for other targeted anticancer drugs and for the prediction of other pharmacokinetic parameters that have been correlated with clinical outcomes.
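The flavor of Bayesian MAP estimation from a single randomly timed level can be conveyed with a small sketch: a one-compartment oral model at steady state with a log-normal prior on clearance. All population values below are illustrative placeholders, not the validated model of the study, and the correlation term of the MAP-ρ method is omitted for brevity:

```python
import numpy as np
from scipy.optimize import minimize_scalar

def map_trough_prediction(c_obs, t_obs, dose=400.0, tau=24.0,
                          ka=0.61, v=347.0, cl_pop=14.3, omega=0.33,
                          sigma=0.3):
    """Predict the trough concentration Cmin from one randomly timed level:
    maximize the posterior of the individual clearance (log-normal prior with
    CV omega around cl_pop, proportional residual error sigma), then run the
    model forward to t = tau.  Units: mg, L, h, mg/L."""
    def conc(cl, t):
        ke = cl / v
        # steady-state one-compartment model with first-order absorption
        return (dose * ka / (v * (ka - ke))) * (
            np.exp(-ke * t) / (1 - np.exp(-ke * tau))
            - np.exp(-ka * t) / (1 - np.exp(-ka * tau)))

    def neg_log_post(eta):      # eta = individual log-deviation of clearance
        cl = cl_pop * np.exp(eta)
        resid = (np.log(c_obs) - np.log(conc(cl, t_obs))) / sigma
        return 0.5 * resid**2 + 0.5 * (eta / omega) ** 2

    eta_map = minimize_scalar(neg_log_post, bounds=(-2, 2), method="bounded").x
    cl_map = cl_pop * np.exp(eta_map)
    return conc(cl_map, tau)    # predicted trough, mg/L

# Hypothetical observation: 1.5 mg/L measured 6 h post-dose.
print(map_trough_prediction(c_obs=1.5, t_obs=6.0))
```

The prior term pulls the individual clearance back toward the population value when the single observation is uninformative, which is what makes the MAP prediction more stable than linear extrapolation along the typical elimination constant.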

Relevance: 20.00%

Abstract:

Since the 1990s, cheating athletes have abused substances to increase their oxygen transport capabilities; among these substances, recombinant EPO is the most well known. Currently, other investigational pharmaceutical products are able to produce an effect similar to EPO but without having chemical structures related to EPO; these are the synthetic erythropoiesis-stimulating agents (ESAs). Peginesatide (also known as Hematide) is being developed by Affymax and Takeda and, if approved by regulatory authorities, could soon be released on the international market. To detect potential athletic abuse of this product and deter athletes who consider cheating, we initiated a collaboration to implement a detection test for anti-doping purposes. Peginesatide is a synthetic, PEGylated, investigational, peptide-based erythropoiesis-stimulating agent that is designed and engineered to stimulate specifically the erythropoietin receptor dimer that governs erythropoiesis. It is undetectable using current anti-doping tests owing to its lack of sequence homology to EPO. To detect and deter potential abuse of peginesatide, we initiated an industry/anti-doping laboratory collaboration to develop and validate screening and confirmation assays so that they would be available before peginesatide reaches the market. We describe a screening ELISA and a confirmation assay consisting of immuno-purification followed by separation with SDS-PAGE and revelation with Western double blotting. Both assays can detect 0.5 ng/mL concentrations of peginesatide in blood samples, enabling detection for several days after administration of a physiologically relevant dose. This initial report describes the experimental characterization of these assays, including testing with a blinded set of samples from a clinical study conducted in healthy volunteers.

Relevance: 20.00%

Abstract:

A survey of medical ambulatory practice was carried out in February-March 1981 in the two Swiss cantons of Vaud and Fribourg (total population: 700,000), in which 205 physicians participated. The methodology was inspired by the U.S. National Ambulatory Medical Care Survey, whose data collection instrument was adapted to our conditions; in addition, data were gathered on all referrals prescribed by 154 physicians during two weeks. (The instruments used are presented.) The potential and limits of this type of survey are discussed, as well as the representativeness of the participating physicians and of the recorded visits, which constitute a systematic sample of over 43,000 visits.

Relevance: 20.00%

Abstract:

This article uses a mixed methods design to investigate the effects of social influence on family formation in a sample of eastern and western German young adults at an early stage of their family formation. Theoretical propositions on the importance of informal interaction for fertility and family behavior are still rarely supported by systematic empirical evidence. Major problems are the correct identification of salient relationships and the comparability of social networks across population subgroups. This article addresses the two issues through a combination of qualitative and quantitative data collection and analysis. In-depth interviewing, network charts, and network grids are used to map individual personal relationships and their influence on family formation decisions. In addition, an analysis of friendship dyads is provided.