954 results for "Parametric bootstrap"


Relevance: 10.00%

Abstract:

Roadside cross-drainage culverts have been found to affect injury severity in vehicle accidents. Designers have commonly used three safety treatments to protect errant drivers from culvert accidents: culvert extension, guardrail installation, and grating. To determine which safety treatment is most appropriate, benefit-cost analysis uses the reduction in accident costs to estimate the societal gains earned by each treatment. The purpose of this study was to estimate accident costs for a wide range of roadway and roadside characteristics so that designers can calculate benefit/cost ratios for culvert safety treatment options under any particular scenario. The study began with a parametric study to identify the variables with a significant impact on accident costs. It then proceeded to highway scenario modeling, covering scenarios with different combinations of roadway and roadside variable values; these variables were chosen based on the findings of the parametric study, and their values were assigned according to highway classification. The results show that the choice among culvert safety treatments should be flexible with respect to roadway and roadside characteristics. They also show that culvert extension and grating produced the lowest accident costs for all highway scenarios modeled. Expanded adoption of culvert extension and culvert grating can therefore be expected to improve overall highway safety.
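The benefit/cost comparison described above can be sketched as follows. All accident-cost and treatment-cost figures in this example are invented for illustration, not values from the study:

```python
# Hypothetical benefit/cost comparison of culvert safety treatments.
# All cost figures below are illustrative assumptions, not study values.

def benefit_cost_ratio(baseline_accident_cost, treated_accident_cost, treatment_cost):
    """B/C ratio: societal accident-cost reduction divided by treatment cost."""
    benefit = baseline_accident_cost - treated_accident_cost
    return benefit / treatment_cost

treatments = {
    "extension": {"accident_cost": 40_000, "treatment_cost": 15_000},
    "guardrail": {"accident_cost": 55_000, "treatment_cost": 12_000},
    "grating":   {"accident_cost": 42_000, "treatment_cost": 8_000},
}
baseline = 90_000  # untreated culvert, assumed annualized accident cost

for name, t in treatments.items():
    bc = benefit_cost_ratio(baseline, t["accident_cost"], t["treatment_cost"])
    print(f"{name}: B/C = {bc:.2f}")
```

Under the assumed costs, the treatment with the highest B/C ratio would be preferred for that scenario.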

Relevance: 10.00%

Abstract:

In this paper we deal with the identification of dependencies between time series of equity returns. Marginal distribution functions are assumed to be known, and a bivariate chi-square goodness-of-fit test is applied in a fully parametric copula approach. Several families of copulas are fitted to Spanish stock market data and compared. The results show that the t-copula generally outperforms the other dependence structures, and highlight the difficulty of fitting a significant number of bivariate data series.
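The copula-fitting step can be illustrated with a minimal sketch. As a simplified stand-in for the t-copula (which adds a degrees-of-freedom parameter), the example fits a Gaussian copula to simulated return pairs; the data and the use of empirical margins are assumptions:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Copula-fitting sketch on a simulated pair of return series (not Spanish
# market data). A Gaussian copula is fitted as a simplified stand-in for
# the t-copula, which adds a degrees-of-freedom parameter.
z = rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], size=2000)
x, y = z[:, 0], z[:, 1]

# Step 1: map each margin to (0, 1). The paper assumes known marginal CDFs;
# here empirical CDFs (pseudo-observations) are used instead.
u = stats.rankdata(x) / (len(x) + 1)
v = stats.rankdata(y) / (len(y) + 1)

# Step 2: estimate the copula correlation from the normal scores.
rho = np.corrcoef(stats.norm.ppf(u), stats.norm.ppf(v))[0, 1]
print(f"estimated copula correlation: {rho:.2f}")
```

A goodness-of-fit test, as in the paper, would then compare the fitted copula density against the binned pseudo-observations.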

Relevance: 10.00%

Abstract:

BACKGROUND: The objective is to develop a cost-effective, reliable and non-invasive screening test able to detect early CRCs and adenomas, based on a multigene nucleic acid assay performed on peripheral blood mononuclear cells (PBMCs). METHODS: A colonoscopy-controlled study was conducted on 179 subjects. 92 subjects (21 CRC, 30 adenoma > 1 cm and 41 controls) were used as a training set to generate a signature. A further 48 subjects (controls, CRC and polyps), kept blinded, were used as a test set. To determine organ and disease specificity, 38 additional subjects were included: 24 with inflammatory bowel disease (IBD) and 14 with other cancers (OC). Blood samples were taken and PBMCs purified. After RNA extraction, multiplex RT-qPCR was applied to 92 candidate biomarkers. Following univariate and multivariate analyses, 60 biomarkers with significant p-values (<0.01) were selected. Two distinct biomarker signatures, named COLOX CRC and COLOX POL, are used to separate patients without lesions from those with CRC or with adenoma. COLOX performance was validated using a random resampling method (bootstrap). RESULTS: The COLOX CRC and POL tests successfully separated patients without lesions from those with CRC (Se 67%, Sp 93%, AUC 0.87) and from those with adenoma > 1 cm (Se 63%, Sp 83%, AUC 0.77). 6/24 patients in the IBD group and 1/14 patients in the OC group had a positive COLOX CRC test. CONCLUSION: The two COLOX tests demonstrated high Se and Sp for detecting CRCs and adenomas > 1 cm. A prospective, multicenter, pivotal study is underway to confirm these promising results in a larger cohort.
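The bootstrap validation of test performance can be sketched as follows, on synthetic labels and predictions (not COLOX data):

```python
import numpy as np

rng = np.random.default_rng(42)

# Bootstrap validation sketch for a diagnostic signature: resample subjects
# with replacement and recompute sensitivity/specificity each time. Labels
# and predictions are synthetic, not COLOX data (1 = CRC / test positive).
labels = np.array([1] * 30 + [0] * 60)
preds = np.concatenate([rng.random(30) < 0.67, rng.random(60) < 0.07]).astype(int)

def se_sp(labels, preds):
    tp = np.sum((labels == 1) & (preds == 1))
    tn = np.sum((labels == 0) & (preds == 0))
    return tp / np.sum(labels == 1), tn / np.sum(labels == 0)

n = len(labels)
boot = np.array([se_sp(labels[i], preds[i])
                 for i in (rng.integers(0, n, n) for _ in range(2000))])
se_lo, se_hi = np.percentile(boot[:, 0], [2.5, 97.5])
sp_lo, sp_hi = np.percentile(boot[:, 1], [2.5, 97.5])
print(f"Se 95% CI: [{se_lo:.2f}, {se_hi:.2f}], Sp 95% CI: [{sp_lo:.2f}, {sp_hi:.2f}]")
```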

Relevance: 10.00%

Abstract:

Introduction: Therapeutic drug monitoring (TDM) aims to optimize treatment by individualizing dosage regimens based on measured blood concentrations. Maintaining concentrations within a target range requires pharmacokinetic and clinical expertise. Bayesian calculation represents the gold-standard TDM approach but requires computing assistance, and over the last decades computer programs have been developed to assist clinicians in this task. The aim of this benchmarking study was to assess and compare computer tools designed to support TDM clinical activities.

Method: A literature and Internet search was performed to identify software. All programs were tested on a common personal computer. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing, and storage. A weighting factor was applied to each criterion of the grid to reflect its relative importance. To assess the robustness of the software, six representative clinical vignettes were also processed through each program.

Results: 12 software tools were identified, tested, and ranked, yielding a comprehensive review of the available software's characteristics. The number of drugs handled varies widely, and 8 programs allow users to add their own drug models. 10 programs can compute Bayesian dosage adaptation based on a measured blood concentration (a posteriori adjustment), while 9 can also suggest an a priori dosage regimen (prior to any blood concentration measurement) based on individual patient covariates such as age, gender, and weight. Among those applying Bayesian analysis, one uses the non-parametric approach. The top 2 software packages emerging from this benchmark are MwPharm and TCIWorks; the other programs evaluated also have good potential but are less sophisticated (e.g. in terms of storage or report generation) or less user-friendly.

Conclusion: Whereas 2 integrated programs are at the top of the ranked list, such complex tools may not fit all institutions, and each software tool must be judged against the individual needs of hospitals and clinicians. Interest in computing tools to support therapeutic monitoring is still growing. Although developers have put effort into them in recent years, there is still room for improvement, especially in terms of institutional information system interfacing, user-friendliness, data storage capacity, and report generation.
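The Bayesian a posteriori adjustment these programs perform can be sketched as a maximum a posteriori (MAP) estimate under a one-compartment model. The population values, variabilities and observed concentration below are all hypothetical:

```python
import numpy as np
from scipy.optimize import minimize

# Minimal sketch of Bayesian (MAP) individual PK estimation, the kind of
# "a posteriori" adjustment the reviewed programs perform. One-compartment
# IV bolus model; population values, variabilities and the observed
# concentration are all hypothetical.

dose, t_obs, c_obs = 500.0, 6.0, 4.2        # mg, h, mg/L (assumed observation)
pop_cl, pop_v = 5.0, 50.0                   # population means (assumed)
omega_cl, omega_v, sigma = 0.3, 0.2, 0.5    # log-scale BSV and residual SD

def neg_log_posterior(theta):
    log_cl, log_v = theta
    cl, v = np.exp(log_cl), np.exp(log_v)
    c_pred = dose / v * np.exp(-cl / v * t_obs)
    # likelihood (residual) term plus log-normal prior terms
    return ((c_obs - c_pred) ** 2 / (2 * sigma ** 2)
            + (log_cl - np.log(pop_cl)) ** 2 / (2 * omega_cl ** 2)
            + (log_v - np.log(pop_v)) ** 2 / (2 * omega_v ** 2))

fit = minimize(neg_log_posterior, x0=[np.log(pop_cl), np.log(pop_v)],
               method="Nelder-Mead")
cl_map, v_map = np.exp(fit.x)
print(f"MAP clearance: {cl_map:.2f} L/h, volume: {v_map:.1f} L")
```

The individualized parameters can then be used to simulate concentrations under candidate dosage regimens.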

Relevance: 10.00%

Abstract:

Purpose: Recent studies have shown that pericardial fat is independently correlated with the development of coronary artery disease (CAD); the mechanism remains unclear. We aimed to assess a possible relationship between pericardial fat volume and endothelium-dependent coronary vasomotion, a surrogate of future cardiovascular events.

Methods: Fifty healthy volunteers without known CAD or cardiovascular risk factors (CRF) were enrolled. All underwent dynamic Rb-82 cardiac PET/CT to quantify myocardial blood flow (MBF) at rest, during the cold pressor test (CPT-MBF), and under adenosine stress. Pericardial fat volume (PFV) was measured using a 3D volumetric CT method, together with common biological CRF (glucose and insulin levels, HOMA-IR, cholesterol, triglycerides, hs-CRP). Relationships between the MBF response to CPT, PFV, and the other CRF were assessed using non-parametric Spearman correlation and multivariate regression analysis of the variables showing a significant correlation on univariate analysis (Stata 11.0).

Results: All 50 participants had a normal MBF response to adenosine (2.7±0.6 mL/min/g; 95% CI: 2.6-2.9) and normal myocardial flow reserve (2.8±0.8; 95% CI: 2.6-3.0), excluding underlying CAD. Simple regression analysis revealed a significant correlation between absolute CPT-MBF and triglyceride level (rho = −0.32, p = 0.024), fasting blood insulin (rho = −0.43, p = 0.0024), HOMA-IR (rho = −0.39, p = 0.007) and PFV (rho = −0.52, p = 0.0001). The MBF response to adenosine was correlated only with PFV (rho = −0.32, p = 0.026). On multivariate regression analysis, PFV emerged as the only significant predictor of the MBF response to CPT (p = 0.002).

Conclusion: PFV is significantly correlated with endothelium-dependent coronary vasomotion. A high pericardial fat burden might negatively influence the MBF response to CPT, as well as to adenosine stress, even in persons with normal hyperemic myocardial perfusion imaging, suggesting a link between pericardial fat and future cardiovascular events. While outside-to-inside adipokine secretion through the arterial wall has been described, our results might suggest an effect upon both NO-dependent and NO-independent vasodilatation. Further studies are needed to elucidate this mechanism.
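A correlation analysis of the kind reported above can be reproduced on synthetic data; the built-in negative association and all values are assumptions:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Synthetic stand-ins for pericardial fat volume (PFV) and the CPT-MBF
# response; a negative monotone association is built in to mirror the
# reported rho < 0. All values are invented.
pfv = rng.uniform(50, 300, 50)                          # cm^3, 50 subjects
cpt_mbf = 1.2 - 0.002 * pfv + rng.normal(0, 0.08, 50)   # mL/min/g

rho, p = stats.spearmanr(pfv, cpt_mbf)
print(f"Spearman rho = {rho:.2f}, p = {p:.2g}")
```

Spearman's rho is rank-based, so it captures any monotone (not only linear) association, which is why it suits skewed physiological variables.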

Relevance: 10.00%

Abstract:

A pacemaker, regularly emitting chemical waves, is created out of noise when an excitable photosensitive Belousov-Zhabotinsky medium, strictly unable to autonomously initiate autowaves, is forced with a spatiotemporally patterned random illumination. These experimental observations are also reproduced numerically by using a set of reaction-diffusion equations for an activator-inhibitor model, and further analytically interpreted in terms of genuine coupling effects arising from parametric fluctuations. Within the same framework we also address situations of noise-sustained propagation in subexcitable media.
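The activator-inhibitor picture can be illustrated with a minimal 1-D FitzHugh-Nagumo-type sketch under random forcing. This is a generic stand-in for the reaction-diffusion model actually used, and all parameters are assumptions:

```python
import numpy as np

# Minimal 1-D activator-inhibitor (FitzHugh-Nagumo-type) sketch of random
# forcing acting on an excitable medium. Generic stand-in, not the
# Oregonator-type BZ model; all parameter values are assumptions.

rng = np.random.default_rng(0)
nx, dx, dt, steps = 200, 0.5, 0.01, 5000
u = np.full(nx, -1.0)   # activator, near the resting state
v = np.full(nx, -0.5)   # inhibitor
eps, a, b, D, noise = 0.08, 0.7, 0.8, 1.0, 0.25

for _ in range(steps):
    lap = (np.roll(u, 1) - 2 * u + np.roll(u, -1)) / dx**2
    # spatiotemporally uncorrelated random forcing on the activator
    u = (u + dt * (u - u**3 / 3 - v + D * lap)
         + np.sqrt(dt) * noise * rng.normal(size=nx))
    v = v + dt * eps * (u + a - b * v)

print(f"u range after {steps} steps: [{u.min():.2f}, {u.max():.2f}]")
```

With the noise term removed the medium stays at rest; sufficiently strong random forcing can kick local regions over the excitation threshold, which is the mechanism the paper exploits.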

Relevance: 10.00%

Abstract:

Objectives: Therapeutic drug monitoring (TDM) aims to optimize treatment by individualizing dosage regimens based on measured blood concentrations. Maintaining concentrations within a target range requires pharmacokinetic (PK) and clinical expertise. Bayesian calculation represents the gold-standard TDM approach but requires computing assistance. The aim of this benchmarking study was to assess and compare computer tools designed to support TDM clinical activities.

Methods: The literature and the Internet were searched to identify software. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing, and storage. A weighting factor was applied to each criterion of the grid to reflect its relative importance. To assess the robustness of the software, six representative clinical vignettes were also processed through each program.

Results: 12 software tools were identified, tested, and ranked, yielding a comprehensive review of the available software's characteristics. The number of drugs handled varies from 2 to more than 180, and some programs integrate different population types. 8 programs allow new drug models to be added based on population PK data. 10 tools incorporate Bayesian computation to predict dosage regimens (individual parameters are calculated from population PK models). All of them can compute Bayesian a posteriori dosage adaptation based on a measured blood concentration, while 9 can also suggest an a priori dosage regimen based only on individual patient covariates. Among those applying Bayesian analysis, MM-USC*PACK uses a non-parametric approach. The top 2 programs emerging from this benchmark are MwPharm and TCIWorks; the other programs evaluated also have good potential but are less sophisticated or less user-friendly.

Conclusions: Whereas 2 software packages are ranked at the top of the list, such complex tools may not fit all institutions, and each program must be judged against the individual needs of hospitals and clinicians. Programs should be easy and fast for routine activities, including for non-experienced users. Although interest in TDM tools is growing and effort has been put into them in recent years, there is still room for improvement, especially in terms of institutional information system interfacing, user-friendliness, data storage capacity, and automated report generation.

Relevance: 10.00%

Abstract:

This study of gambling among young Swiss residents had three objectives. First, to establish baseline data, we examined the prevalence of gambling and, based on frequency criteria, identified a population at higher risk of suffering harmful consequences of gambling, namely those who gamble at least once a week. The second objective was to determine whether gambling frequency is associated with (1) substance use, (2) poor mental health, and/or (3) weak social support, as described in the literature for pathological gamblers. Finally, to establish whether frequent gamblers were "fixed" on a single type of game or, on the contrary, gambled non-selectively, we correlated gambling frequency with the number of different games in which young people were involved.

For these analyses we used the database of the 2007 Swiss Health Survey, a cross-sectional study of Swiss residents aged 15 or older. The survey was conducted in two stages: (1) a telephone questionnaire (response rate: 66.3%) followed by (2) a written questionnaire (response rate: 80.5% of those who completed the telephone interview). Applying the weighting for the sample of participants who completed both interviews, we retained only those aged 15 to 24, giving a total of 1116 participants (582 men).

For the second objective we compared three groups: non-gamblers (NG, n=577), occasional gamblers (OG, n=388) and frequent gamblers (FG, n=151), first through bivariate analyses and then through multinomial regression to control for confounding factors. Variable selection for the regression was based on a bootstrap method, producing results representative of the entire population rather than only the analyzed sample. We proceeded similarly for the third research question, but compared only occasional and frequent gamblers.

The results showed that 48.3% of young Swiss residents had been involved in at least one type of gambling in the previous year, and that 13.5% (n=151) of 15- to 24-year-olds gambled at least once a week.

At the bivariate level, gambling frequency was associated with sociodemographic factors such as male sex, age (OG being the oldest) and personal income. Gambling frequency was also significantly associated with daily tobacco smoking, current cannabis use and hazardous alcohol consumption (binge drinking). Poor mental health (major depressive episode or psychological distress) and weak social support (a confidant in one's entourage, leisure activities) were not significantly associated with gambling frequency, although a clear trend in favor of NG was observed. At the multivariate level, OG and FG were older, more often male and more often living in French-speaking Switzerland than NG. OG were more likely than NG to binge-drink occasionally, and FG were more likely than NG to be daily tobacco smokers.

Comparing OG and FG, we obtained a high correlation (r=0.85; p<0.0001) between gambling frequency and the number of games in which young people were involved, indicating that FG do not seem very selective about the type of game they play. Given that gambling is a highly prevalent behavior among young Swiss residents, it should probably be seen as part of the exploratory behaviors of adolescence. Nevertheless, in view of the associated risk behaviors, physicians caring for young adults should raise the question of gambling for preventive purposes.

Relevance: 10.00%

Abstract:

This paper presents multiple kernel learning (MKL) regression as an exploratory spatial data analysis and modelling tool. The MKL approach is introduced as an extension of support vector regression, where MKL uses dedicated kernels to divide a given task into sub-problems and to treat them separately in an effective way. It provides better interpretability for non-linear robust kernel regression at the cost of a more complex numerical optimization. In particular, we investigate the use of MKL as a tool that allows us to avoid using ad hoc topographic indices as covariates in statistical models in complex terrain; instead, MKL learns these relationships from the data in a non-parametric fashion. A study on data simulated from real terrain features confirms the ability of MKL to enhance the interpretability of data-driven models and to aid feature selection without degrading predictive performance. We also examine the stability of the MKL algorithm with respect to the number of training samples and to the presence of noise. Finally, the results of a real case study are presented, in which MKL exploits a large set of terrain features computed at multiple spatial scales to predict mean wind speed in an Alpine region.
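A toy version of combining dedicated kernels can be written as kernel ridge regression with a weighted kernel sum. A real MKL solver would learn the weights; here they are fixed assumptions, and the data are invented:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy multiple-kernel ridge regression: one RBF kernel per feature group
# (e.g. coordinates vs. terrain features), combined as a weighted sum.
# A real MKL solver would learn the weights; here they are fixed.

def rbf(A, B, gamma):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

n = 200
X1 = rng.uniform(-1, 1, (n, 2))   # sub-problem 1: spatial coordinates
X2 = rng.uniform(-1, 1, (n, 3))   # sub-problem 2: terrain features
y = np.sin(3 * X1[:, 0]) + X2[:, 1] ** 2 + rng.normal(0, 0.05, n)

w1, w2, lam = 0.5, 0.5, 1e-2              # kernel weights and ridge penalty
K = w1 * rbf(X1, X1, 2.0) + w2 * rbf(X2, X2, 2.0)
alpha = np.linalg.solve(K + lam * np.eye(n), y)   # kernel ridge dual solution
y_hat = K @ alpha

print(f"training RMSE: {np.sqrt(np.mean((y - y_hat) ** 2)):.3f}")
```

In MKL, learned weights near zero flag feature groups that contribute little, which is the interpretability and feature-selection benefit the paper describes.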

Relevance: 10.00%

Abstract:

There is a need for more efficient methods giving insight into the complex mechanisms of neurotoxicity. Testing strategies including in vitro methods have been proposed to meet this requirement. In the present study we aimed to develop a novel in vitro approach that mimics in vivo complexity, detects neurotoxicity comprehensively, and provides mechanistic insight. For this purpose we combined rat primary re-aggregating brain cell cultures with a mass spectrometry (MS)-based metabolomics approach. As a proof of principle, we treated developing re-aggregating brain cell cultures for 48 h with the neurotoxicant methyl mercury chloride (0.1-100 μM) and the brain stimulant caffeine (1-100 μM) and acquired cellular metabolic profiles. To detect toxicant-induced metabolic alterations, the profiles were analysed using commercial software, which revealed patterns in the multi-parametric dataset by principal component analysis (PCA) and identified the most significantly altered metabolites. PCA revealed concentration-dependent cluster formation for methyl mercury chloride (0.1-1 μM) and treatment-dependent cluster formation for caffeine (1-100 μM) at sub-cytotoxic concentrations. The relevant metabolites responsible for the concentration-dependent alterations following methyl mercury chloride treatment were identified using MS-MS fragmentation analysis: gamma-aminobutyric acid, choline, glutamine, creatine and spermine. Their respective mass ion intensities demonstrated metabolic alterations in line with the literature and suggest that these metabolites could serve as biomarkers for mechanisms of neurotoxicity or neuroprotection. In addition, we evaluated whether the approach could identify neurotoxic potential by testing, at sub-cytotoxic concentrations, eight compounds with target organ toxicity in the liver, kidney or brain. PCA revealed cluster formation largely dependent on target organ toxicity, indicating potential for the development of a neurotoxicity prediction model. A validation study would be useful to determine the reliability, relevance and applicability of this approach to neurotoxicity screening. Thus, for the first time, we show the benefits and utility of in vitro metabolomics for comprehensively detecting neurotoxicity and discovering new biomarkers.
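The PCA clustering step can be sketched on a synthetic profile matrix; group means, sizes, and the injected shift are invented:

```python
import numpy as np

rng = np.random.default_rng(0)

# PCA on a synthetic "metabolic profile" matrix (samples x metabolites) to
# mimic the treatment-dependent clustering described above. Group means and
# sizes are invented; real profiles came from MS measurements.
n_per, n_feat = 10, 30
control = rng.normal(0.0, 1.0, (n_per, n_feat))
treated = rng.normal(0.0, 1.0, (n_per, n_feat)) + np.linspace(2, 4, n_feat)

X = np.vstack([control, treated])
Xc = X - X.mean(axis=0)            # centre each metabolite
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T             # project samples onto the first two PCs

# with the shift applied, PC1 separates the two groups
print(scores[:n_per, 0].mean(), scores[n_per:, 0].mean())
```

The loadings in `Vt[0]` indicate which metabolites drive the separation, analogous to how the significantly altered metabolites were flagged.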

Relevance: 10.00%

Abstract:

This study details a method to statistically determine, on a millisecond scale and for individual subjects, those brain areas whose activity differs between experimental conditions, using single-trial scalp-recorded EEG data. To do this, we non-invasively estimated local field potentials (LFPs) using the ELECTRA distributed inverse solution and applied non-parametric statistical tests at each brain voxel and for each time point. This yields a spatio-temporal activation pattern of differential brain responses. The method is illustrated here in the analysis of auditory-somatosensory (AS) multisensory interactions in four subjects. Differential multisensory responses were temporally and spatially consistent across individuals, with onset at approximately 50 ms and superposition within areas of the posterior superior temporal cortex that have traditionally been considered auditory in their function. The close agreement of these results with previous investigations of AS multisensory interactions suggests that the present approach constitutes a reliable method for studying multisensory processing with the temporal and spatial resolution required to elucidate several existing questions in this field. In particular, the present analyses permit a more direct comparison between human and animal studies of multisensory interactions and can be extended to examine correlation between electrophysiological phenomena and behavior.
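A per-time-point non-parametric test, analogous to the per-voxel tests above, can be sketched as follows on synthetic single-trial data (the effect window and all values are invented):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Per-time-point non-parametric test on single-trial responses, analogous
# to the per-voxel tests described above. Synthetic data: 2 conditions x
# trials x time, with an effect injected between samples 50 and 70.
n_trials, n_time = 40, 100
cond_a = rng.normal(0, 1, (n_trials, n_time))
cond_b = rng.normal(0, 1, (n_trials, n_time))
cond_b[:, 50:70] += 1.0    # injected "multisensory interaction" effect

p = np.array([stats.mannwhitneyu(cond_a[:, t], cond_b[:, t],
                                 alternative="two-sided").pvalue
              for t in range(n_time)])
sig = p < 0.01             # uncorrected threshold, for illustration only
print(f"significant time points: {np.flatnonzero(sig)}")
```

A full spatio-temporal analysis would run such a test at every voxel and correct for the resulting multiple comparisons.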

Relevance: 10.00%

Abstract:

Introduction. - Osteoporotic fractures cause growing morbidity, mortality, and economic and human costs. Screening campaigns combining questionnaires and DXA are being set up to assess individual and population-level fracture risk. The incidental discovery of a vertebral fracture (VF), made possible by lateral vertebral fracture assessment (VFA by DXA) from T4 to L4, can change the diagnosis and prognosis. However, its reading reproducibility is low, particularly for the thoracic spine and grade 1 fractures [1]. The IOF/ISCD has proposed a guide to improve VFA reading. We measured the reading reproducibility of VFA before and after applying this guide in a Swiss osteoporosis screening cohort.

Patients and methods. - 360 VFA images (Hologic Delphi), drawn at random from the OstéoLaus cohort (women > 50 years), were read by 2 independent readers before and after application of the reading guide. The guide specifies reading conditions (screen brightness and contrast) and systematic reading steps. Reproducibility was assessed with the kappa test for: the readability of each vertebra, the presence or absence of a VF, and its grade (1, 2 or 3 according to Genant). We used Cohen's kappa with a bootstrap technique for the before/after comparisons on correlated data.

Results. - Agreement between readers was high and improved after application of the reading guide (table). Cohen's kappa was moderate to good according to Landis and Koch (0.4-0.7). Reproducibility on grades improved when grouping grades 0/1 and 2/3, but not with the reading guide.

Conclusion. - Use of the IOF/ISCD VFA reading guide improves reproducibility for vertebral readability and VF detection, but not for Genant grade classification. This is mainly explained by the fact that Cohen's kappa gives considerable weight to the distribution of the data, which becomes asymmetric when the event is rare. The uniform kappa [2] would be better suited to this situation; a re-analysis is underway.
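Cohen's kappa with a paired bootstrap, as used for the before/after comparison, can be sketched as follows; the ratings are synthetic stand-ins for fracture grades:

```python
import numpy as np

rng = np.random.default_rng(0)

# Cohen's kappa between two readers with a paired bootstrap CI. Ratings are
# synthetic stand-ins for grades 0-3, with ~80% raw agreement assumed.

def cohen_kappa(a, b, k=4):
    n = len(a)
    obs = np.sum(a == b) / n                 # observed agreement
    pa = np.bincount(a, minlength=k) / n
    pb = np.bincount(b, minlength=k) / n
    exp = np.sum(pa * pb)                    # chance agreement
    return (obs - exp) / (1 - exp)

reader1 = rng.integers(0, 4, 360)
agree = rng.random(360) < 0.8
reader2 = np.where(agree, reader1, rng.integers(0, 4, 360))

kappas = []
for _ in range(1000):
    idx = rng.integers(0, 360, 360)          # resample vertebrae, keeping pairs
    kappas.append(cohen_kappa(reader1[idx], reader2[idx]))
lo, hi = np.percentile(kappas, [2.5, 97.5])
print(f"kappa = {cohen_kappa(reader1, reader2):.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```

Resampling vertebrae (keeping reader pairs intact) respects the correlation between the two readings, which is why the paired bootstrap is appropriate here.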

Relevance: 10.00%

Abstract:

Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending the corresponding approaches to the regional scale represents a major and as yet largely unresolved challenge. To address this problem, we have developed a downscaling procedure based on a non-linear Bayesian sequential simulation approach. The basic objective of this algorithm is to estimate the value of the sparsely sampled hydraulic conductivity at non-sampled locations based on its relation to the electrical conductivity, which is available throughout the model space. The in situ relationship between the hydraulic and electrical conductivities is described through a non-parametric multivariate kernel density function. This method is then applied to the stochastic integration of low-resolution, regional-scale electrical resistivity tomography (ERT) data in combination with high-resolution, local-scale downhole measurements of the hydraulic and electrical conductivities. Finally, the overall viability of this downscaling approach is tested and verified by performing and comparing flow and transport simulations through the original and the downscaled hydraulic conductivity fields. Our results indicate that the proposed procedure does indeed allow for obtaining remarkably faithful estimates of the regional-scale hydraulic conductivity structure and correspondingly reliable predictions of the transport characteristics over relatively long distances.
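The core idea, estimating sparsely sampled hydraulic conductivity from the ubiquitous electrical conductivity through a kernel density of their joint distribution, can be sketched as follows; the log-linear petrophysical relation and all values are assumptions:

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

# Downscaling sketch: model the joint density of log electrical conductivity
# (available everywhere) and log hydraulic conductivity (sparsely sampled)
# with a kernel density estimate, then predict log K where only sigma is
# known. The log-linear petrophysical relation and all values are assumed.
n = 300
log_sigma = rng.normal(-2.0, 0.5, n)                   # colocated samples
log_k = 1.5 * log_sigma - 4.0 + rng.normal(0, 0.3, n)

kde = gaussian_kde(np.vstack([log_sigma, log_k]))      # joint density

def estimate_log_k(sigma_obs, grid=np.linspace(-12, 2, 400)):
    """Conditional mean E[log K | log sigma] from the joint KDE."""
    pts = np.vstack([np.full_like(grid, sigma_obs), grid])
    w = kde(pts)
    return np.sum(grid * w) / np.sum(w)

print(f"E[log K | log sigma = -2] ~ {estimate_log_k(-2.0):.2f}")
```

The full procedure draws from the conditional density within a sequential simulation, rather than taking only its mean as done here.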

Relevance: 10.00%

Abstract:

OBJECTIVES: To assess the accuracy of high-resolution (HR) magnetic resonance imaging (MRI) in diagnosing early-stage optic nerve (ON) invasion in a retinoblastoma cohort. METHODS: This IRB-approved, prospective multicenter study included 95 patients (55 boys, 40 girls; mean age, 29 months). 1.5-T MRI was performed using surface coils before enucleation, including spin-echo unenhanced and contrast-enhanced (CE) T1-weighted sequences (slice thickness, 2 mm; pixel size < 0.3 × 0.3 mm²). Images were read by five neuroradiologists blinded to the histopathologic findings. ROC curves were constructed, with AUC assessed using a bootstrap method. RESULTS: Histopathology identified 41 eyes without ON invasion, 25 with prelaminar, 18 with intralaminar and 12 with postlaminar invasion. All but one were postoperatively classified as stage I by the International Retinoblastoma Staging System (IRSS). The accuracy of CE-T1 sequences in identifying ON invasion was limited (AUC = 0.64; 95% CI, 0.55-0.72) and not confirmed for the diagnosis of postlaminar invasion (AUC = 0.64; 95% CI, 0.47-0.82); high specificities (range, 0.64-1) and negative predictive values (range, 0.81-0.97) were confirmed. CONCLUSION: HR-MRI with surface coils is recommended to appropriately select retinoblastoma patients eligible for primary enucleation without the risk of IRSS stage II, but cannot substitute for pathology in differentiating the earliest degrees of ON invasion. KEY POINTS: • HR-MRI excludes advanced optic nerve invasion with a high negative predictive value. • HR-MRI accurately selects patients eligible for primary enucleation. • Diagnosis of the early stages of optic nerve invasion still relies on pathology. • Several physiological MR patterns may mimic optic nerve invasion.
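Bootstrap assessment of an AUC, as used for the ROC analysis above, can be sketched on synthetic scores, using the rank-based (Mann-Whitney) formulation of the AUC:

```python
import numpy as np

rng = np.random.default_rng(0)

# Bootstrap CI for an ROC AUC. Labels and scores are synthetic; the AUC is
# computed via the rank-based (Mann-Whitney) formulation.

def auc(labels, scores):
    pos, neg = scores[labels == 1], scores[labels == 0]
    greater = (pos[:, None] > neg[None, :]).mean()
    ties = (pos[:, None] == neg[None, :]).mean()
    return greater + 0.5 * ties   # P(score_pos > score_neg) + 0.5 P(tie)

labels = np.array([1] * 12 + [0] * 41)   # e.g. postlaminar invasion vs none
scores = np.concatenate([rng.normal(1.0, 1.0, 12), rng.normal(0.0, 1.0, 41)])

n, boot = len(labels), []
while len(boot) < 2000:
    i = rng.integers(0, n, n)                 # resample patients with replacement
    if labels[i].min() != labels[i].max():    # keep resamples with both classes
        boot.append(auc(labels[i], scores[i]))

lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"AUC = {auc(labels, scores):.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```

With only 12 positive cases, the bootstrap interval is wide, mirroring the broad CIs reported for the postlaminar analysis.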

Relevance: 10.00%

Abstract:

This paper presents a new non-parametric atlas registration framework, derived from the optical flow model and active contour theory, applied to automatic subthalamic nucleus (STN) targeting in deep brain stimulation (DBS) surgery. In a previous work, we demonstrated that the STN position can be predicted from the position of surrounding visible structures, namely the lateral and third ventricles. An STN targeting process can thus be obtained by registering these structures of interest between a brain atlas and the patient image. Here we aim to improve on the state-of-the-art targeting methods while reducing the computational time. Our simultaneous segmentation and registration model shows mean STN localization errors statistically similar to those of the best-performing registration algorithms tested so far and to the targeting experts' variability. Moreover, the computational time of our registration method is much lower, which is a worthwhile improvement from a clinical point of view.
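An optical-flow-style registration update can be sketched in 1-D with a demons-type scheme. This is a generic illustration, not the paper's simultaneous segmentation and registration model; the profiles and parameters are invented:

```python
import numpy as np

# 1-D demons-style registration sketch: an optical-flow-like force drives a
# smoothed displacement field that warps the "atlas" profile onto the
# "patient" profile. Generic illustration only; all values are invented.

x = np.linspace(0.0, 1.0, 400)
fixed = np.exp(-((x - 0.55) ** 2) / 0.002)    # patient structure
moving = np.exp(-((x - 0.45) ** 2) / 0.002)   # atlas structure, shifted

def gaussian_smooth(u, width=15):
    """Regularize the displacement field with a small Gaussian kernel."""
    k = np.exp(-np.linspace(-3, 3, 2 * width + 1) ** 2)
    return np.convolve(u, k / k.sum(), mode="same")

u = np.zeros_like(x)                           # displacement field
for _ in range(300):
    warped = np.interp(x + u, x, moving)       # warp atlas with current field
    diff = fixed - warped
    grad = np.gradient(warped, x)
    # demons force: intensity difference along the local image gradient,
    # with a small step to keep the explicit iteration stable
    u = gaussian_smooth(u + 0.25 * diff * grad / (grad ** 2 + diff ** 2 + 1e-8))

ssd0 = np.sum((fixed - moving) ** 2)
ssd1 = np.sum((fixed - np.interp(x + u, x, moving)) ** 2)
print(f"SSD before: {ssd0:.2f}, after: {ssd1:.2f}")
```

In the paper's setting, the warp estimated on the visible ventricles is what carries the atlas STN position into the patient image.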