969 results for PARAMETRIC TRANSDUCERS


Relevância:

10.00%

Publicador:

Resumo:

Distance-based regression is a prediction method consisting of two steps: from the distances between observations we obtain latent variables, which become the regressors in an ordinary least squares linear model. The distances are computed from the original predictors using a suitable dissimilarity function. Since, in general, the regressors are related non-linearly to the response, their selection with the usual F test is not possible. In this work we propose a solution to this predictor selection problem by defining generalized statistical tests and adapting a non-parametric bootstrap method for the estimation of p-values. We include a numerical example with automobile insurance data.
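The two-step procedure can be sketched as follows, assuming a Euclidean dissimilarity and a fixed number of latent variables (an illustrative reconstruction, not the authors' code):

```python
# Sketch of distance-based regression: distances -> latent variables (classical
# MDS) -> ordinary least squares. Euclidean dissimilarity and k=5 are assumptions.
import numpy as np

def dist_matrix(X):
    diff = X[:, None, :] - X[None, :, :]
    return np.sqrt((diff ** 2).sum(axis=-1))

def latent_variables(X, k):
    """Classical MDS: turn an n x n distance matrix into k latent coordinates."""
    D2 = np.square(dist_matrix(X))
    n = D2.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    B = -0.5 * J @ D2 @ J                    # double-centred Gram matrix
    vals, vecs = np.linalg.eigh(B)
    order = np.argsort(vals)[::-1][:k]       # largest eigenvalues first
    return vecs[:, order] * np.sqrt(np.maximum(vals[order], 0.0))

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 3))                      # original predictors
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=60)   # non-linear response

Z = latent_variables(X, k=5)                      # latent regressors
Z1 = np.column_stack([np.ones(60), Z])
beta, *_ = np.linalg.lstsq(Z1, y, rcond=None)     # ordinary least squares
y_hat = Z1 @ beta
```

In practice the dissimilarity function and the number of retained latent variables are the modelling choices the abstract's selection tests address.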

Resumo:

Introduction: Therapeutic drug monitoring (TDM) aims at optimizing treatment by individualizing the dosage regimen based on measured blood concentrations. Maintaining concentrations within a target range requires pharmacokinetic and clinical expertise. Bayesian calculation represents a gold standard in the TDM approach but requires computing assistance. In recent decades, computer programs have been developed to assist clinicians in this task. The aim of this benchmarking study was to assess and compare computer tools designed to support TDM clinical activities.

Method: A literature and Internet search was performed to identify software. All programs were tested on a standard personal computer. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing, and storage. A weighting factor was applied to each criterion of the grid to reflect its relative importance. To assess the robustness of the software, six representative clinical vignettes were also processed through each of them.

Results: Twelve software tools were identified, tested, and ranked, representing a comprehensive review of the available software's characteristics. The number of drugs handled varies widely, and 8 programs allow users to add their own drug models. Ten programs can compute Bayesian dosage adaptation based on a blood concentration (a posteriori adjustment), while 9 can also suggest an a priori dosage regimen (prior to any blood concentration measurement) based on individual patient covariates such as age, gender, and weight. Among those applying Bayesian analysis, one uses a non-parametric approach. The top two software tools emerging from this benchmark are MwPharm and TCIWorks. The other programs evaluated also have good potential but are less sophisticated (e.g. in terms of storage or report generation) or less user-friendly.

Conclusion: Whereas two integrated programs are at the top of the ranked list, such complex tools may not fit all institutions, and each software tool must be judged against the individual needs of hospitals or clinicians. Interest in computing tools to support therapeutic monitoring is still growing. Although developers have put effort into them in recent years, there is still room for improvement, especially in terms of institutional information system interfacing, user-friendliness, capacity of data storage, and report generation.
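The Bayesian a posteriori adjustment such tools implement can be illustrated with a deliberately simplified one-compartment sketch (all pharmacokinetic values below are invented for illustration; real programs use validated population models and richer error structures):

```python
# Hypothetical a posteriori dose adaptation: MAP estimate of clearance from one
# observed concentration, one-compartment IV bolus model. Values are made up.
import numpy as np

dose, V, t_obs, c_obs = 500.0, 40.0, 12.0, 4.1   # mg, L, h, mg/L (assumed)
cl_pop, omega, sigma = 3.0, 0.3, 0.5             # population CL, IIV, residual SD

def neg_log_posterior(cl):
    c_pred = dose / V * np.exp(-cl / V * t_obs)
    log_lik = -0.5 * ((c_obs - c_pred) / sigma) ** 2         # Gaussian residual
    log_prior = -0.5 * (np.log(cl / cl_pop) / omega) ** 2    # lognormal prior
    return -(log_lik + log_prior)

grid = np.linspace(0.5, 10.0, 2000)
cl_map = grid[np.argmin([neg_log_posterior(c) for c in grid])]  # MAP estimate

# Dose needed so the predicted concentration at t_obs hits a 5 mg/L target
target = 5.0
dose_new = target * V / np.exp(-cl_map / V * t_obs)
```

The grid search stands in for the numerical optimisation a real TDM program performs over several individual parameters at once.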

Resumo:

Purpose: Recent studies showed that pericardial fat is independently correlated with the development of coronary artery disease (CAD). The mechanism remains unclear. We aimed to assess a possible relationship between pericardial fat volume and endothelium-dependent coronary vasomotion, a surrogate of future cardiovascular events.

Methods: Fifty healthy volunteers without known CAD or cardiovascular risk factors (CRF) were enrolled. All underwent dynamic Rb-82 cardiac PET/CT to quantify myocardial blood flow (MBF) at rest, during the MBF response to cold pressor test (CPT-MBF), and during adenosine stress. Pericardial fat volume (PFV) was measured using a 3D volumetric CT method, along with common biological CRF (glucose and insulin levels, HOMA-IR, cholesterol, triglycerides, hs-CRP). Relationships between the MBF response to CPT, PFV, and the other CRF were assessed using non-parametric Spearman correlation and multivariate regression analysis of variables showing significant correlation on univariate analysis (Stata 11.0).

Results: All 50 participants had a normal MBF response to adenosine (2.7±0.6 mL/min/g; 95% CI: 2.6−2.9) and normal myocardial flow reserve (2.8±0.8; 95% CI: 2.6−3.0), excluding underlying CAD. Simple regression analysis revealed a significant correlation between absolute CPT-MBF and triglyceride level (rho = −0.32, p = 0.024), fasting blood insulin (rho = −0.43, p = 0.0024), HOMA-IR (rho = −0.39, p = 0.007), and PFV (rho = −0.52, p = 0.0001). The MBF response to adenosine was correlated only with PFV (rho = −0.32, p = 0.026). On multivariate regression analysis, PFV emerged as the only significant predictor of the MBF response to CPT (p = 0.002).

Conclusion: PFV is significantly correlated with endothelium-dependent coronary vasomotion. A high pericardial fat burden might negatively influence the MBF response to CPT, as well as to adenosine stress, even in persons with normal hyperemic myocardial perfusion imaging, suggesting a link between pericardial fat and future cardiovascular events. While outside-to-inside adipokine secretion through the arterial wall has been described, our results might suggest an effect on both NO-dependent and NO-independent vasodilatation. Further studies are needed to elucidate this mechanism.

Resumo:

A pacemaker, regularly emitting chemical waves, is created out of noise when an excitable photosensitive Belousov-Zhabotinsky medium, strictly unable to autonomously initiate autowaves, is forced with spatiotemporally patterned random illumination. These experimental observations are also reproduced numerically using a set of reaction-diffusion equations for an activator-inhibitor model, and further interpreted analytically in terms of genuine coupling effects arising from parametric fluctuations. Within the same framework we also address situations of noise-sustained propagation in subexcitable media.
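A minimal numerical sketch of the activator-inhibitor picture (a generic FitzHugh-Nagumo-type model with a fluctuating forcing term standing in for the random illumination; the parameter values are illustrative, not those of the BZ experiments):

```python
# 1-D activator (u) / inhibitor (v) system, explicit Euler, periodic boundary,
# with additive spatiotemporal noise playing the role of the random forcing.
import numpy as np

nx, nt, dx, dt = 100, 2000, 1.0, 0.05
eps, a, b, D = 0.08, 0.7, 0.8, 1.0           # illustrative FHN parameters
u = np.zeros(nx)
v = np.zeros(nx)
rng = np.random.default_rng(1)

for _ in range(nt):
    lap = (np.roll(u, 1) - 2 * u + np.roll(u, -1)) / dx**2   # periodic Laplacian
    noise = 0.2 * rng.normal(size=nx)                        # "random illumination"
    u = u + dt * (u - u**3 / 3 - v + D * lap + noise)
    v = v + dt * eps * (u + a - b * v)
```

In a subexcitable regime, waves seeded by such fluctuations can either die out or be sustained, which is the effect the abstract describes.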

Resumo:

Objectives: Therapeutic drug monitoring (TDM) aims at optimizing treatment by individualizing the dosage regimen based on measured blood concentrations. Maintaining concentrations within a target range requires pharmacokinetic (PK) and clinical expertise. Bayesian calculation represents a gold standard in the TDM approach but requires computing assistance. The aim of this benchmarking study was to assess and compare computer tools designed to support TDM clinical activities.

Methods: The literature and Internet were searched to identify software. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing, and storage. A weighting factor was applied to each criterion of the grid to reflect its relative importance. To assess the robustness of the software, six representative clinical vignettes were also processed through each of them.

Results: Twelve software tools were identified, tested, and ranked, representing a comprehensive review of the available software characteristics. The number of drugs handled varies from 2 to more than 180, and integration of different population types is available in some programs. Eight programs offer the ability to add new drug models based on population PK data. Ten tools incorporate Bayesian computation to predict the dosage regimen (individual parameters are calculated based on population PK models). All of them can compute Bayesian a posteriori dosage adaptation based on a blood concentration, while 9 can also suggest an a priori dosage regimen based only on individual patient covariates. Among those applying Bayesian analysis, MM-USC*PACK uses a non-parametric approach. The top two programs emerging from this benchmark are MwPharm and TCIWorks. The other programs evaluated also have good potential but are less sophisticated or less user-friendly.

Conclusions: Whereas two software packages are ranked at the top of the list, such complex tools may not fit all institutions, and each program must be judged against the individual needs of hospitals or clinicians. Programs should be easy and fast for routine activities, including for non-experienced users. Although interest in TDM tools is growing and effort has been put into them in recent years, there is still room for improvement, especially in terms of institutional information system interfacing, user-friendliness, capacity of data storage, and automated report generation.

Resumo:

This paper presents multiple kernel learning (MKL) regression as an exploratory spatial data analysis and modelling tool. The MKL approach is introduced as an extension of support vector regression, where MKL uses dedicated kernels to divide a given task into sub-problems and to treat them separately in an effective way. It provides better interpretability for non-linear robust kernel regression at the cost of a more complex numerical optimization. In particular, we investigate the use of MKL as a tool that allows us to avoid using ad hoc topographic indices as covariates in statistical models in complex terrain. Instead, MKL learns these relationships from the data in a non-parametric fashion. A study on data simulated from real terrain features confirms the ability of MKL to enhance the interpretability of data-driven models and to aid feature selection without degrading predictive performance. We also examine the stability of the MKL algorithm with respect to the number of training samples and to the presence of noise. Finally, the results of a real case study are presented, in which MKL exploits a large set of terrain features computed at multiple spatial scales to predict mean wind speed in an Alpine region.
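The kernel-combination idea behind MKL can be illustrated in a few lines; note that this sketch fixes the sub-kernel weights uniformly and uses kernel ridge regression rather than SVR, so it shows only the dedicated-kernel-per-feature structure, not the MKL optimisation that learns the weights:

```python
# One RBF kernel per input feature, combined with fixed non-negative weights.
# Real MKL learns the weights jointly with the regression; this is a sketch.
import numpy as np

def rbf(x, z, gamma=1.0):
    return np.exp(-gamma * (x[:, None] - z[None, :]) ** 2)

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(80, 3))          # e.g. elevation, slope, curvature
y = X[:, 0] ** 2 + np.sin(3 * X[:, 1])        # toy terrain-like response

weights = np.full(3, 1.0 / 3.0)               # fixed sub-kernel weights (assumed)
K = sum(w * rbf(X[:, j], X[:, j]) for j, w in enumerate(weights))

alpha = np.linalg.solve(K + 1e-2 * np.eye(80), y)   # kernel ridge regression
y_hat = K @ alpha
```

When the weights are learned, features whose sub-kernels receive weight near zero are effectively deselected, which is the interpretability benefit the abstract refers to.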

Resumo:

There is a need for more efficient methods giving insight into the complex mechanisms of neurotoxicity. Testing strategies including in vitro methods have been proposed to meet this requirement. With the present study we aimed to develop a novel in vitro approach which mimics in vivo complexity, detects neurotoxicity comprehensively, and provides mechanistic insight. For this purpose we combined rat primary re-aggregating brain cell cultures with a mass spectrometry (MS)-based metabolomics approach. As a proof of principle, we treated developing re-aggregating brain cell cultures for 48 h with the neurotoxicant methyl mercury chloride (0.1-100 µM) or the brain stimulant caffeine (1-100 µM) and acquired cellular metabolic profiles. To detect toxicant-induced metabolic alterations, the profiles were analysed using commercial software which revealed patterns in the multi-parametric dataset by principal component analysis (PCA) and identified the most significantly altered metabolites. PCA revealed concentration-dependent cluster formation for methyl mercury chloride (0.1-1 µM) and treatment-dependent cluster formation for caffeine (1-100 µM) at sub-cytotoxic concentrations. The metabolites responsible for the concentration-dependent alterations following methyl mercury chloride treatment were identified using MS-MS fragmentation analysis: gamma-aminobutyric acid, choline, glutamine, creatine, and spermine. Their respective mass ion intensities demonstrated metabolic alterations in line with the literature and suggest that these metabolites could serve as biomarkers for mechanisms of neurotoxicity or neuroprotection. In addition, we evaluated whether the approach could identify neurotoxic potential by testing eight compounds with target organ toxicity in the liver, kidney, or brain at sub-cytotoxic concentrations. PCA revealed cluster formation largely dependent on target organ toxicity, indicating potential for the development of a neurotoxicity prediction model. On the basis of these results, a validation study could determine the reliability, relevance, and applicability of this approach to neurotoxicity screening. Thus, for the first time we show the benefits and utility of in vitro metabolomics to comprehensively detect neurotoxicity and to discover new biomarkers.
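The PCA step of such a metabolomics workflow can be sketched on simulated intensity profiles (data, group sizes, and effect sizes below are invented, not the study's measurements):

```python
# PCA on metabolite-intensity profiles: a treatment shifts a handful of
# metabolites, and the first principal component separates the groups.
import numpy as np

rng = np.random.default_rng(2)
control = rng.normal(0.0, 1.0, size=(10, 50))   # 10 cultures x 50 metabolites
treated = rng.normal(0.0, 1.0, size=(10, 50))
treated[:, :5] += 3.0                           # 5 metabolites altered by treatment

X = np.vstack([control, treated])
Xc = X - X.mean(axis=0)                         # mean-centre each metabolite
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U[:, :2] * S[:2]                       # sample scores on PC1 and PC2

# Loadings on PC1 point at the most altered metabolites
top = np.argsort(np.abs(Vt[0]))[::-1][:5]
```

Inspecting the largest loadings is the step that, in the study, nominated candidate metabolites for MS-MS identification.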

Resumo:

This study details a method to statistically determine, on a millisecond scale and for individual subjects, those brain areas whose activity differs between experimental conditions, using single-trial scalp-recorded EEG data. To do this, we non-invasively estimated local field potentials (LFPs) using the ELECTRA distributed inverse solution and applied non-parametric statistical tests at each brain voxel and for each time point. This yields a spatio-temporal activation pattern of differential brain responses. The method is illustrated here in the analysis of auditory-somatosensory (AS) multisensory interactions in four subjects. Differential multisensory responses were temporally and spatially consistent across individuals, with onset at approximately 50 ms and superposition within areas of the posterior superior temporal cortex that have traditionally been considered auditory in their function. The close agreement of these results with previous investigations of AS multisensory interactions suggests that the present approach constitutes a reliable method for studying multisensory processing with the temporal and spatial resolution required to elucidate several existing questions in this field. In particular, the present analyses permit a more direct comparison between human and animal studies of multisensory interactions and can be extended to examine correlation between electrophysiological phenomena and behavior.
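The voxel-wise non-parametric testing scheme can be sketched with a permutation test on simulated two-condition data (illustrative only; the study applied such tests to ELECTRA-estimated local field potentials at each voxel and time point):

```python
# Permutation test of a condition difference, run independently per "voxel".
# Simulated data: one voxel carries a true effect, the rest are null.
import numpy as np

rng = np.random.default_rng(3)
n_trials, n_voxels = 30, 20
cond_a = rng.normal(0, 1, size=(n_trials, n_voxels))
cond_b = rng.normal(0, 1, size=(n_trials, n_voxels))
cond_b[:, 0] += 1.5                       # one truly responsive voxel

obs = cond_a.mean(0) - cond_b.mean(0)     # observed mean difference per voxel
pooled = np.vstack([cond_a, cond_b])
n_perm, count = 2000, np.zeros(n_voxels)
for _ in range(n_perm):
    idx = rng.permutation(2 * n_trials)   # shuffle condition labels
    pa, pb = pooled[idx[:n_trials]], pooled[idx[n_trials:]]
    count += np.abs(pa.mean(0) - pb.mean(0)) >= np.abs(obs)
p_values = (count + 1) / (n_perm + 1)     # two-sided permutation p-values
```

Repeating this at every brain voxel and time point yields the spatio-temporal activation pattern of differential responses the abstract describes (multiple-comparison control is a further step not shown here).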

Resumo:

Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending the corresponding approaches to the regional scale represents a major, and as-of-yet largely unresolved, challenge. To address this problem, we have developed a downscaling procedure based on a non-linear Bayesian sequential simulation approach. The basic objective of this algorithm is to estimate the value of the sparsely sampled hydraulic conductivity at non-sampled locations based on its relation to the electrical conductivity, which is available throughout the model space. The in situ relationship between the hydraulic and electrical conductivities is described through a non-parametric multivariate kernel density function. This method is then applied to the stochastic integration of low-resolution, regional-scale electrical resistivity tomography (ERT) data in combination with high-resolution, local-scale downhole measurements of the hydraulic and electrical conductivities. Finally, the overall viability of this downscaling approach is tested and verified by performing and comparing flow and transport simulation through the original and the downscaled hydraulic conductivity fields. Our results indicate that the proposed procedure does indeed allow for obtaining remarkably faithful estimates of the regional-scale hydraulic conductivity structure and correspondingly reliable predictions of the transport characteristics over relatively long distances.
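The core of the described step — a non-parametric kernel density linking the two conductivities, used to draw hydraulic values conditional on the electrical ones — can be sketched as follows (the synthetic data, bandwidth, and linear relation are assumptions for illustration):

```python
# Gaussian KDE over co-located (electrical, hydraulic) log-conductivity pairs,
# then conditional sampling of hydraulic values given an electrical observation.
import numpy as np

rng = np.random.default_rng(4)
x = rng.normal(0, 1, 200)                     # electrical log-conductivity (data)
y = 0.8 * x + 0.3 * rng.normal(0, 1, 200)     # co-located hydraulic values
h = 0.2                                       # kernel bandwidth (assumed)

def sample_y_given_x(x_obs, n=500):
    # Weight each data kernel by its likelihood of producing x_obs ...
    w = np.exp(-0.5 * ((x_obs - x) / h) ** 2)
    w /= w.sum()
    # ... then draw y from the resulting weighted mixture of Gaussians.
    idx = rng.choice(len(y), size=n, p=w)
    return y[idx] + h * rng.normal(size=n)

draws = sample_y_given_x(1.0)
```

In the sequential simulation itself, each such conditional draw would additionally honour previously simulated neighbours; only the kernel-density conditioning is shown here.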

Resumo:

This paper presents a new non-parametric atlas registration framework, derived from the optical flow model and active contour theory, applied to automatic subthalamic nucleus (STN) targeting in deep brain stimulation (DBS) surgery. In a previous work, we demonstrated that the STN position can be predicted from the position of surrounding visible structures, namely the lateral and third ventricles. An STN targeting process can thus be obtained by registering these structures of interest between a brain atlas and the patient image. Here we aim to improve on the results of state-of-the-art targeting methods while reducing the computational time. Our simultaneous segmentation and registration model shows mean STN localization errors statistically similar to those of the best-performing registration algorithms tested so far and to the targeting experts' variability. Moreover, the computational time of our registration method is much lower, which is a worthwhile improvement from a clinical point of view.

Resumo:

The unifying objective of Phases I and II of this study was to determine the feasibility of the post-tensioning strengthening method and to implement the technique on two composite bridges in Iowa. Following completion of these two phases, Phase III was undertaken and is documented in this report. The basic objectives of Phase III were further monitoring bridge behavior (both during and after post-tensioning) and developing a practical design methodology for designing the strengthening system under investigation. Specific objectives were: to develop strain and force transducers to facilitate the collection of field data; to investigate further the existence and effects of the end restraint on the post-tensioning process; to determine the amount of post-tensioning force loss that occurred during the time between the initial testing and the retesting of the existing bridges; to determine the significance of any temporary temperature-induced post-tensioning force change; and to develop a simplified design methodology that would incorporate various variables such as span length, angle-of-skew, beam spacing, and concrete strength. Experimental field results obtained during Phases II and III were compared to the theoretical results and to each other. Conclusions from this research are as follows: (1) Strengthening single-span composite bridges by post-tensioning is a viable, economical strengthening technique. (2) Behavior of both bridges was similar to the behavior observed from the bridges during field tests conducted under Phase II. (3) The strain transducers were very accurate at measuring mid-span strain. (4) The force transducers gave excellent results under laboratory conditions, but were found to be less effective when used in actual bridge tests. (5) Loss of post-tensioning force due to temperature effects in any particular steel beam post-tensioning tendon system were found to be small. (6) Loss of post-tensioning force over a two-year period was minimal. 
(7) Significant end restraint was measured in both bridges, caused primarily by reinforcing steel being continuous from the deck into the abutments. This end restraint reduced the effectiveness of the post-tensioning but also reduced midspan strains due to truck loadings. (8) The SAP IV finite element model is capable of accurately modeling the behavior of a post-tensioned bridge, if guardrails and end restraints are included in the model. (9) Post-tensioning distribution should be separated into distributions for the axial force and moment components of an eccentric post-tensioning force. (10) Skews of 45 deg or less have a minor influence on post-tensioning distribution. (11) For typical Iowa three-beam and four-beam composite bridges, simple regression-derived formulas for force and moment fractions can be used to estimate post-tensioning distribution at midspan. At other locations, a simple linear interpolation gives approximately correct results. (12) A simple analytical model can accurately estimate the flexural strength of an isolated post-tensioned composite beam.

Resumo:

Some of the Iowa Department of Transportation (Iowa DOT) continuous, steel, welded plate girder bridges have developed web cracking in the negative moment regions at the diaphragm connection plates. The cracks are due to out-of-plane bending of the web near the top flange of the girder. The out-of-plane bending occurs in the "web-gap", which is the portion of the girder web between (1) the top of the fillet welds attaching the diaphragm connection plate to the web and (2) the fillet welds attaching the flange to the web. A literature search indicated that four retrofit techniques have been suggested by other researchers to prevent or control this type of cracking. To eliminate the problem in new bridges, AASHTO specifications require a positive attachment between the connection plate and the top (tension) flange. Applying this requirement to existing bridges is expensive and difficult. The Iowa DOT has relied primarily on the hole-drilling technique to prevent crack extension once cracking has occurred; however, the literature indicates that hole-drilling alone may not be entirely effective in preventing crack extension. The objective of this research was to investigate experimentally a method proposed by the Iowa DOT to prevent cracking at the diaphragm/plate girder connection in steel bridges with X-type or K-type diaphragms. The method consists of loosening the bolts at some connections between the diaphragm diagonals and the connection plates. The investigation included selecting and testing five bridges: three with X-type diaphragms and two with K-type diaphragms. During 1996 and 1997, these bridges were instrumented using strain gages and displacement transducers to obtain the response at various locations before and after implementing the method. Bridges were subjected to loaded test trucks traveling in different lanes with speeds varying from crawl speed to 65 mph (104 km/h) to determine the effectiveness of the proposed method. 
The results of the study show that the effect of out-of-plane loading was confined to widths of approximately 4 in. (100 mm) on either side of the connection plates. Further, they demonstrate that the stresses in gaps with drilled holes were higher than those in gaps without cracks, implying that the hole-drilling technique alone is not sufficient to prevent crack extension. The behavior of the web gaps in X-type diaphragm bridges was greatly enhanced by the proposed method, as the stress range and out-of-plane distortion were reduced by at least 42% at the exterior girders. For bridges with K-type diaphragms, a similar trend was obtained; however, the stress range increased in one of the web gaps after implementing the proposed method. Other design aspects (wind, stability of the compression flange, and lateral distribution of loads) must be considered when deciding whether to adopt the proposed method. Considering the results of this investigation, the proposed method can be implemented for X-type diaphragm bridges. Further research is recommended for K-type diaphragm bridges.

Resumo:

Multi-centre data repositories like the Alzheimer's Disease Neuroimaging Initiative (ADNI) offer a unique research platform, but pose questions concerning comparability of results when using a range of imaging protocols and data processing algorithms. The variability is mainly due to the non-quantitative character of the widely used structural T1-weighted magnetic resonance (MR) images. Although the stability of the main effect of Alzheimer's disease (AD) on brain structure across platforms and field strength has been addressed in previous studies using multi-site MR images, there are only sparse empirically-based recommendations for processing and analysis of pooled multi-centre structural MR data acquired at different magnetic field strengths (MFS). Aiming to minimise potential systematic bias when using ADNI data we investigate the specific contributions of spatial registration strategies and the impact of MFS on voxel-based morphometry in AD. We perform a whole-brain analysis within the framework of Statistical Parametric Mapping, testing for main effects of various diffeomorphic spatial registration strategies, of MFS and their interaction with disease status. Beyond the confirmation of medial temporal lobe volume loss in AD, we detect a significant impact of spatial registration strategy on estimation of AD related atrophy. Additionally, we report a significant effect of MFS on the assessment of brain anatomy (i) in the cerebellum, (ii) the precentral gyrus and (iii) the thalamus bilaterally, showing no interaction with the disease status. We provide empirical evidence in support of pooling data in multi-centre VBM studies irrespective of disease status or MFS.

Resumo:

This report is formatted to independently present four individual investigations related to similar web gap fatigue problems. Multiple steel girder bridges commonly exhibit fatigue cracking due to out-of-plane displacement of the web near the diaphragm connections. This fatigue-prone web gap area is typically located in negative moment regions of the girders where the diaphragm stiffener is not attached to the top flange. In the past, the Iowa Department of Transportation has attempted to stop fatigue crack propagation in these steel girder bridges by drilling holes at the crack tips. Other nondestructive retrofits have been tried; in a particular case on a two-girder bridge with floor beams, angles were bolted between the stiffener and top flange. The bolted angle retrofit has failed in the past and may not be a viable solution for diaphragm bridges. The drilled hole retrofit is often only a temporary solution, so a more permanent and effective retrofit is required. A new field retrofit has been developed that involves loosening the bolts in the connection between the diaphragm and the girders. Research on the retrofit has been initiated; however, no long-term studies of the effects of bolt loosening have been performed. The intent of this research is to study the short-term effects of the bolt loosening retrofit on I-beam and channel diaphragm bridges. The research also addressed the development of a continuous remote monitoring system to investigate the bolt loosening retrofit on an X-type diaphragm bridge over a number of months, ensuring that the measured strain and displacement reductions are not affected by time and continuous traffic loading on the bridge. The testing for the first three investigations is based on instrumentation of web gaps in a negative moment region on Iowa Department of Transportation bridges with I-beam, channel, and X-type diaphragms. One bridge of each type was instrumented with strain gages and deflection transducers. 
Field tests, using loaded trucks of known weight and configuration, were conducted on the bridges with the bolts in the tight condition and after implementing the bolt loosening retrofit to measure the effects of loosening the diaphragm bolts. Long-term data were also collected on the X-diaphragm bridge by a data acquisition system that recorded continuously under ambient truck loading; the collected data were retrievable through an off-site modem connection to the remote data acquisition system. The data collection features and ruggedness of this system for remote bridge monitoring make it viable as a pilot system for future monitoring projects in Iowa. Results indicate that loosening the diaphragm bolts reduces strain and out-of-plane displacement in the web gap, and that the reduction is not affected over time by traffic or environmental loading on the bridge. Reducing the strain in the web gap allows the bridge to support more cycles of loading before experiencing fatigue, thus increasing the service life of the bridge. Two-girder floor-beam bridges may also exhibit fatigue cracking in girder webs.

Resumo:

A family history of coronary artery disease (CAD), especially when the disease occurs at a young age, is a potent risk factor for CAD. DNA collection in families in which two or more siblings are affected at an early age allows identification of genetic factors for CAD by linkage analysis. We performed a genomewide scan in 1,168 individuals from 438 families, including 493 affected sibling pairs with documented onset of CAD before 51 years of age in men and before 56 years of age in women. We prospectively defined three phenotypic subsets of families: (1) acute coronary syndrome in two or more siblings; (2) absence of type 2 diabetes in all affected siblings; and (3) atherogenic dyslipidemia in any one sibling. Genotypes were analyzed for 395 microsatellite markers. Regions were defined as providing evidence for linkage if they provided parametric two-point LOD scores >1.5, together with nonparametric multipoint LOD scores >1.0. Regions on chromosomes 3q13 (multipoint LOD = 3.3; empirical P value <.001) and 5q31 (multipoint LOD = 1.4; empirical P value <.081) met these criteria in the entire data set, and regions on chromosomes 1q25, 3q13, 7p14, and 19p13 met these criteria in one or more of the subsets. Two regions, 3q13 and 1q25, met the criteria for genomewide significance. We have identified a region on chromosome 3q13 that is linked to early-onset CAD, as well as additional regions of interest that will require further analysis. These data provide initial areas of the human genome where further investigation may reveal susceptibility genes for early-onset CAD.
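The parametric two-point LOD criterion used above has a simple closed form in the phase-known case; the sketch below uses invented recombination counts, not the study's genotype data:

```python
# Two-point LOD score for phase-known meioses: with r recombinants out of n,
# LOD(theta) = log10[ L(theta) / L(0.5) ]. Counts here are purely illustrative.
import math

def lod(theta, r, n):
    like = theta ** r * (1 - theta) ** (n - r)   # likelihood at recomb. fraction theta
    null = 0.5 ** n                              # likelihood under no linkage
    return math.log10(like / null)

r, n = 2, 20                 # hypothetical: 2 recombinants in 20 informative meioses
theta_hat = r / n            # maximum-likelihood recombination fraction
max_lod = lod(theta_hat, r, n)
```

A maximised LOD above 3 is the conventional threshold for significant linkage, which is why the study pairs its parametric two-point scores with non-parametric multipoint scores before declaring a region of interest.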