71 results for Current Density Mapping Method


Relevance:

30.00%

Publisher:

Abstract:

False identity documents constitute a potentially powerful source of forensic intelligence because they are essential elements of transnational crime and provide cover for organized crime. In previous work, a systematic profiling method using false documents' visual features was built within a forensic intelligence model. In the current study, the comparison process and metrics lying at the heart of this profiling method are described and evaluated. This evaluation takes advantage of 347 false identity documents of four different types seized in two countries whose sources were known to be common or different (following police investigations and the dismantling of counterfeit factories). Intra-source and inter-source variations were evaluated through the computation of more than 7500 similarity scores. The profiling method could thus be validated and its performance assessed using two complementary approaches to measuring type I and type II error rates: a binary classification and the computation of likelihood ratios. Very low error rates were measured across the four document types, demonstrating the validity and robustness of the method for linking documents to a common source or differentiating them. These results pave the way for an operational implementation of a systematic profiling process integrated into a developed forensic intelligence model.
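The two evaluation approaches named in the abstract, binary classification error rates and likelihood ratios computed over similarity scores, can be sketched as follows. The score distributions, sample sizes and threshold below are invented for illustration and are not the study's data.

```python
import math, random

random.seed(0)

# Illustrative similarity scores: document pairs from a common source tend to
# score high, pairs from different sources score low (synthetic data).
intra = [random.gauss(0.80, 0.05) for _ in range(300)]   # same-source pairs
inter = [random.gauss(0.45, 0.08) for _ in range(7200)]  # different-source pairs

# Approach 1: binary classification at a fixed score threshold.
threshold = 0.65
type_I  = sum(s >= threshold for s in inter) / len(inter)  # false-link rate
type_II = sum(s <  threshold for s in intra) / len(intra)  # missed-link rate

# Approach 2: likelihood ratio = p(score | common source) / p(score | different
# sources), here with Gaussian densities fitted to each score population.
def fit(xs):
    m = sum(xs) / len(xs)
    v = sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    return m, math.sqrt(v)

def gauss_pdf(x, m, s):
    return math.exp(-0.5 * ((x - m) / s) ** 2) / (s * math.sqrt(2 * math.pi))

m1, s1 = fit(intra)   # common-source score model
m0, s0 = fit(inter)   # different-source score model

def likelihood_ratio(score):
    return gauss_pdf(score, m1, s1) / gauss_pdf(score, m0, s0)

# A high score strongly supports the common-source hypothesis (LR >> 1),
# a low score supports different sources (LR << 1).
print(type_I, type_II, likelihood_ratio(0.78) > 1, likelihood_ratio(0.40) < 1)
```

The two approaches are complementary in exactly the sense of the abstract: the threshold view yields fixed error rates, while the likelihood ratio grades the strength of each individual comparison.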


The regulation of gene expression is crucial for an organism's development and response to stress, and an understanding of the evolution of gene expression is of fundamental importance to basic and applied biology. To improve this understanding, we conducted expression quantitative trait locus (eQTL) mapping in the Tsu-1 (Tsushima, Japan) × Kas-1 (Kashmir, India) recombinant inbred line population of Arabidopsis thaliana across soil drying treatments. We then used genome resequencing data to evaluate whether genomic features (promoter polymorphism, recombination rate, gene length, and gene density) are associated with genes responding to the environment (E) or with genes with genetic variation (G) in gene expression in the form of eQTLs. We identified thousands of genes that responded to soil drying and hundreds of main-effect eQTLs. However, we identified very few statistically significant eQTLs that interacted with the soil drying treatment (GxE eQTL). Analysis of genome resequencing data revealed associations of several genomic features with G and E genes. In general, E genes had lower promoter diversity and local recombination rates. By contrast, genes with eQTLs (G) had significantly greater promoter diversity and were located in genomic regions with higher recombination. These results suggest that genomic architecture may play an important role in the evolution of gene expression.
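The core computation behind a main-effect eQTL scan, testing whether expression of a gene differs by genotype at a marker, can be sketched on simulated recombinant-inbred-line data. The genotype coding, effect size and sample size below are invented, not the study's.

```python
import math, random

random.seed(1)

# Toy RIL data: genotype at one marker coded 0 (Tsu-1 allele) or
# 1 (Kas-1 allele); expression of one gene per line (all values simulated).
n = 120
genotype = [random.randint(0, 1) for _ in range(n)]
# Simulate a main-effect eQTL: the Kas-1 allele raises expression by 1.0 unit.
expression = [5.0 + 1.0 * g + random.gauss(0, 0.5) for g in genotype]

def marker_test(geno, expr):
    """Two-group t statistic for expression split by marker genotype --
    the core of a single-marker eQTL scan."""
    a = [e for g, e in zip(geno, expr) if g == 0]
    b = [e for g, e in zip(geno, expr) if g == 1]
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    se = math.sqrt(va / len(a) + vb / len(b))
    return (mb - ma) / se

t = marker_test(genotype, expression)
# A GxE eQTL test would instead compare this allelic effect between wet and
# dry treatments, e.g. via a genotype-by-treatment interaction term.
print(round(t, 1))
```

A genome-wide scan repeats this test over all marker-gene pairs with multiple-testing control; the sketch shows a single pair only.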


This article extends the existing discussion in the literature on probabilistic inference and decision making with respect to continuous hypotheses that are prevalent in forensic toxicology. As a main aim, this research investigates the properties of a widely followed approach for quantifying the level of toxic substances in blood samples, and compares this procedure with a Bayesian probabilistic approach. As an example, attention is confined to the presence of toxic substances, such as THC, in the blood of car drivers. In this context, the interpretation of results from laboratory analyses needs to take into account legal requirements for establishing the 'presence' of target substances in blood. In a first part, the performance of the proposed Bayesian model for the estimation of an unknown parameter (here, the amount of a toxic substance) is illustrated and compared with the currently used method. The model is then used in a second part to approach, in a rational way, the decision component of the problem, that is, judicial questions of the kind 'Is the quantity of THC measured in the blood over the legal threshold of 1.5 μg/l?'. This is illustrated with a practical example.
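The estimation-plus-decision step described above can be sketched with a conjugate normal model. The measurements, analytical standard deviation and prior below are hypothetical placeholders; only the 1.5 μg/l threshold comes from the text.

```python
import math

# Hypothetical numbers for illustration: three replicate THC measurements
# (μg/l) with known analytical standard deviation, and a weak normal prior.
measurements = [1.65, 1.58, 1.72]
sigma = 0.10                      # known measurement SD (assumed)
prior_mean, prior_sd = 1.0, 10.0  # weakly informative prior (assumed)

# Conjugate normal-normal update for the unknown true concentration theta.
n = len(measurements)
xbar = sum(measurements) / n
post_var = 1.0 / (1.0 / prior_sd**2 + n / sigma**2)
post_mean = post_var * (prior_mean / prior_sd**2 + n * xbar / sigma**2)
post_sd = math.sqrt(post_var)

# Decision question from the abstract: P(theta > 1.5 μg/l | data) -- the
# posterior probability that the true quantity exceeds the legal threshold.
z = (1.5 - post_mean) / post_sd
p_over = 0.5 * (1.0 - math.erf(z / math.sqrt(2.0)))
print(round(p_over, 3))
```

The point of the Bayesian route is that the judicial question is answered with a probability about the unknown quantity itself, rather than with a point estimate minus a safety margin.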


BACKGROUND: As an important modifiable lifestyle factor in osteoporosis prevention, physical activity has been shown to positively influence bone mass accrual during growth. We have previously shown that a nine-month general school-based physical activity intervention increased bone mineral content (BMC) and density (aBMD) in primary school children. From a public health perspective, a major key issue is whether these effects persist during adolescence. We therefore measured BMC and aBMD three years after cessation of the intervention to investigate whether the beneficial short-term effects persisted. METHODS: All children from 28 randomly selected first and fifth grade classes (intervention group (INT): 16 classes, n=297; control group (CON): 12 classes, n=205) who had participated in KISS (Kinder- und Jugendsportstudie) were contacted three years after cessation of the intervention program. The intervention included daily physical education with daily impact-loading activities over nine months. Measurements included anthropometry, vigorous physical activity (VPA) by accelerometers, and BMC/aBMD for total body, femoral neck, total hip, and lumbar spine by dual-energy X-ray absorptiometry (DXA). Sex- and age-adjusted Z-scores of BMC or aBMD at follow-up were regressed on intervention (1 vs. 0), the respective Z-score at baseline, gender, follow-up height and weight, pubertal stage at follow-up, and previous and current VPA, adjusting for clustering within schools. RESULTS: 377 of 502 (75%) children participated in baseline DXA measurements and of those, 214 (57%) participated in the follow-up. At follow-up, INT showed significantly higher Z-scores of BMC at total body (adjusted group difference: 0.157 units (0.031-0.283); p=0.015), femoral neck (0.205 (0.007-0.402); p=0.042) and total hip (0.195 (0.036-0.353); p=0.016) and higher Z-scores of aBMD for total body (0.167 (0.016-0.317); p=0.030) compared to CON, representing 6-8% higher values for children in the INT.
No differences could be found for the remaining bone parameters. For the subpopulation with baseline VPA (n=163), effect sizes became stronger after baseline VPA adjustment. After adjustment for baseline and current VPA (n=101), intervention effects were no longer significant, while effect sizes remained the same as without adjustment for VPA. CONCLUSION: Beneficial effects on BMC of a nine month general physical activity intervention appeared to persist over three years. Part of the maintained effects may be explained by current physical activity.
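The outcome variable used above, sex- and age-adjusted Z-scores of bone measures, can be sketched minimally. The reference means/SDs and the four children below are invented, and the sketch shows only an unadjusted group contrast, not the study's full covariate-adjusted, cluster-corrected regression.

```python
# Illustrative sketch: BMC converted to sex- and age-adjusted Z-scores
# against stratum reference values, then the mean Z-score contrast between
# intervention (INT) and control (CON). All numbers are invented.
reference = {            # (sex, age_band) -> (reference mean BMC in g, SD)
    ("F", "11-13"): (1900.0, 250.0),
    ("M", "11-13"): (2050.0, 280.0),
}

def z_score(bmc, sex, age_band):
    mean, sd = reference[(sex, age_band)]
    return (bmc - mean) / sd

children = [  # (group, sex, age_band, total-body BMC in g)
    ("INT", "F", "11-13", 2010.0),
    ("INT", "M", "11-13", 2190.0),
    ("CON", "F", "11-13", 1880.0),
    ("CON", "M", "11-13", 2030.0),
]

def mean_z(group):
    zs = [z_score(b, s, a) for g, s, a, b in children if g == group]
    return sum(zs) / len(zs)

# The study's adjusted group difference (about 0.16 Z-score units for
# total-body BMC) came from a regression also controlling for baseline
# Z-score, height, weight, puberty, VPA and school clustering; this is
# only the raw contrast on toy data.
diff = mean_z("INT") - mean_z("CON")
print(round(diff, 2))
```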


The major problems associated with the use of corticosteroids for the treatment of ocular diseases are their poor intraocular penetration to the posterior segment when administered locally and their secondary side effects when given systemically. To circumvent these problems, more efficient methods and techniques of local delivery are being developed. The purposes of this study were: (1) to investigate the pharmacokinetics of intraocular penetration of hemisuccinate methyl prednisolone (HMP) after its delivery using the transscleral Coulomb controlled iontophoresis (CCI) system applied to the eye or after intravenous (i.v.) injection in the rabbit, (2) to test the safety of the CCI system for the treated eyes and (3) to compare the pharmacokinetic profiles of HMP intraocular distribution after CCI delivery with i.v. injection. For each parameter evaluated, six rabbit eyes were used. For the CCI system, two concentrations of HMP (62.5 and 150 mg ml⁻¹) and various intensities of current and durations of treatment were analyzed. In rabbits serving as controls, the HMP was infused in the CCI device but without applied electric current. For the i.v. delivery, HMP at 10 mg kg⁻¹ as a 62.5 mg ml⁻¹ solution was used. The rabbits were observed clinically for evidence of ocular toxicity. At various time points after the administration of drug, rabbits were killed and intraocular fluids and tissues were sampled for methylprednisolone (MP) concentrations by high pressure liquid chromatography (HPLC). Histology examinations were performed on six eyes of each group. Among groups that received CCI, the concentrations of MP increased in all ocular tissues and fluids in relation to the intensities of current used (0.4, 1.0 and 2.0 mA/0.5 cm²) and its duration (4 and 10 min). Sustained and highest levels of MP were achieved in the choroid and the retina of rabbit eyes treated with the highest current and 10 min duration of CCI.
No clinical toxicity or histological lesions were observed following CCI. Negligible amounts of MP were found in ocular tissues in the CCI control group without application of current. Compared to i.v. administration, CCI achieved higher and more sustained tissue concentrations with negligible systemic absorption. These data demonstrate that high levels of MP can be safely achieved in intraocular tissues and fluids of the rabbit eye using CCI. With this system, intraocular tissue levels of MP are higher than those achieved after i.v. injection. Furthermore, if needed, the drug levels achieved with CCI can be modulated as a function of current intensity and duration of treatment. CCI could therefore be used as an alternative method for the delivery of high levels of MP to the intraocular tissues of both the anterior and posterior segments.


Background: The expected benefit of transvaginal specimen extraction is reduced incision-related morbidity. Objective: A systematic review of transvaginal specimen extraction in colorectal surgery was carried out to assess this expectation. Method: The following keywords, in various combinations, were searched: NOSE (natural orifice specimen extraction), colorectal, colon surgery, transvaginal, right hemicolectomy, left hemicolectomy, low anterior resection, sigmoidectomy, ileocaecal resection, proctocolectomy, colon cancer, sigmoid diverticulitis and inflammatory bowel diseases. Selection criteria were large bowel resection with transvaginal specimen extraction, a laparoscopic approach, human studies and English language. Exclusion criteria were experimental studies and a laparotomic approach or local excision. All articles published up to February 2011 were included. Results: Twenty-three articles (including a total of 130 patients) fulfilled the search criteria. The primary diagnosis was colorectal cancer in 51% (67) of patients, endometriosis in 46% (60) of patients and other conditions in the remaining patients. A concurrent gynaecological procedure was performed in 17% (22) of patients. One case of conversion to laparotomy was reported. In two patients, transvaginal extraction failed. In left- and right-sided resections, the rate of severe complications was 3.7% and 2%, respectively. Two significant complications, one pelvic seroma and one rectovaginal fistula, were likely to have been related to transvaginal extraction. The extent of follow-up was specified in only one study. Harvested nodes and negative margins were adequate and were reported in 70% of oncological cases. Conclusion: Vaginal extraction of a colorectal surgery specimen shows potential benefit, particularly when associated with a gynaecological procedure. Data from prospective randomized trials are needed to support the routine use of this technique.


Background: Bone health is a concern when treating early-stage breast cancer patients with adjuvant aromatase inhibitors. Early detection of patients (pts) at risk of osteoporosis and fractures may be helpful for starting preventive therapies and selecting the most appropriate endocrine therapy schedule. We present statistical models describing the evolution of lumbar and hip bone mineral density (BMD) in pts treated with tamoxifen (T), letrozole (L) and sequences of T and L. Methods: Available dual-energy x-ray absorptiometry exams (DXA) of pts treated in trial BIG 1-98 were retrospectively collected from Swiss centers. Treatment arms: A) T for 5 years, B) L for 5 years, C) 2 years of T followed by 3 years of L and D) 2 years of L followed by 3 years of T. Pts without DXA were used as a control for detecting selection biases. Patients randomized to arm A were subsequently allowed an unplanned switch from T to L. Allowing for variations between DXA machines and centres, two repeated-measures models, using a covariance structure that allows for different times between DXA, were used to estimate changes in hip and lumbar BMD (g/cm2) from trial randomization. Prospectively defined covariates at the time of trial randomization, considered as fixed effects in the multivariable models in an intention-to-treat analysis, were: age, height, weight, hysterectomy, race, known osteoporosis, tobacco use, prior bone fracture, prior hormone replacement therapy (HRT), bisphosphonate use and previous neo-/adjuvant chemotherapy (ChT). Similarly, the T-scores for lumbar and hip BMD measurements were modeled using a per-protocol approach (allowing for the treatment switch in arm A), specifically studying the effect of each therapy upon T-score percentage. Results: A total of 247 out of 546 pts had between 1 and 5 DXA; a total of 576 DXA were collected. The numbers of DXA measurements per arm were: arm A, 133; B, 137; C, 141; and D, 135. The median follow-up time was 5.8 years.
Significant factors positively correlated with lumbar and hip BMD in the multivariate analysis were weight, previous HRT use, neo-/adjuvant ChT, hysterectomy and height. Significant negatively correlated factors in the models were osteoporosis, treatment arm (B/C/D vs. A), time since endocrine therapy start, age and smoking (current vs. never). Modeling the T-score percentage, differences from T to L were -4.199% (p = 0.036) and -4.907% (p = 0.025) for the hip and lumbar measurements respectively, before any treatment switch occurred. Conclusions: Our statistical models describe the lumbar and hip BMD evolution for pts treated with L and/or T. The results for both localisations confirm that, contrary to expectation, the sequential schedules do not seem less detrimental to BMD than L monotherapy. The estimated difference in BMD T-score percentage is at least 4% from T to L.


Due to advances in sensor networks and remote sensing technologies, the acquisition and storage rates of meteorological and climatological data increase every day and call for novel and efficient processing algorithms. A fundamental problem of data analysis and modeling is the spatial prediction of meteorological variables in complex orography, which serves, among other purposes, extended climatological analyses, the assimilation of data into numerical weather prediction models, the preparation of inputs to hydrological models, and real-time monitoring and short-term forecasting of weather. In this thesis, a new framework for spatial estimation is proposed by taking advantage of a class of algorithms emerging from statistical learning theory. Nonparametric kernel-based methods for nonlinear data classification, regression and target detection, known as support vector machines (SVM), are adapted for the mapping of meteorological variables in complex orography. With the advent of high-resolution digital elevation models, the field of spatial prediction met new horizons. In fact, by exploiting image processing tools along with physical heuristics, a large number of terrain features which account for the topographic conditions at multiple spatial scales can be extracted. Such features are highly relevant for the mapping of meteorological variables because they control a considerable part of the spatial variability of meteorological fields in the complex Alpine orography. For instance, patterns of orographic rainfall, wind speed and cold air pools are known to be correlated with particular terrain forms, e.g. convex/concave surfaces and upwind sides of mountain slopes. Kernel-based methods are employed to learn the nonlinear statistical dependence which links the multidimensional space of geographical and topographic explanatory variables to the variable of interest, that is, the wind speed as measured at the weather stations or the occurrence of orographic rainfall patterns as extracted from sequences of radar images. Compared to low-dimensional models integrating only the geographical coordinates, the proposed framework opens a way to regionalize meteorological variables which are multidimensional in nature and rarely show spatial auto-correlation in the original space, making the use of classical geostatistics cumbersome. The challenges explored during the thesis are manifold. First, the complexity of the models is optimized to impose appropriate smoothness properties and reduce the impact of noisy measurements. Secondly, a multiple kernel extension of SVM is considered to select the multiscale features which explain most of the spatial variability of wind speed. Then, SVM target detection methods are implemented to describe the orographic conditions which cause persistent and stationary rainfall patterns. Finally, the optimal splitting of the data is studied to estimate realistic performances and confidence intervals characterizing the uncertainty of predictions. The resulting maps of average wind speeds find applications in renewable resource assessment and open a route to decreasing the temporal scale of analysis to meet hydrological requirements. Furthermore, the maps depicting the susceptibility to orographic rainfall enhancement can be used to improve current radar-based quantitative precipitation estimation and forecasting systems and to generate stochastic ensembles of precipitation fields conditioned upon the orography.
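The kernel-based regression over a "geo-feature" space described above can be illustrated with a small stand-in: kernel ridge regression (a close relative of SVM regression) whose inputs combine coordinates with terrain attributes. All stations, feature values and wind speeds below are invented.

```python
import math

# Inputs are geo-features, not just coordinates: (x_km, y_km, elevation_km,
# slope, curvature) -> wind speed in m/s. All numbers invented.
stations = [
    ([0.0, 0.0, 0.4, 0.05, -0.1], 3.2),
    ([1.0, 0.5, 1.2, 0.30,  0.2], 6.8),
    ([2.0, 1.5, 2.1, 0.45,  0.4], 9.1),
    ([3.0, 0.2, 0.6, 0.10, -0.2], 3.9),
    ([1.5, 2.0, 1.8, 0.40,  0.3], 8.2),
]

def rbf(u, v, gamma=1.0):
    """Gaussian (RBF) kernel over the full geo-feature vector."""
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(u, v)))

def solve(A, b):
    """Gaussian elimination with partial pivoting for a small linear system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

X = [s for s, _ in stations]
y = [t for _, t in stations]
lam = 1e-3  # regularisation: controls the smoothness trade-off noted above
K = [[rbf(a, b) + (lam if i == j else 0.0) for j, b in enumerate(X)]
     for i, a in enumerate(X)]
alpha = solve(K, y)   # solve (K + lam*I) alpha = y

def predict(features):
    return sum(a * rbf(features, x) for a, x in zip(alpha, X))

# Prediction at an unmeasured, exposed high-elevation site.
print(round(predict([1.8, 1.7, 2.0, 0.42, 0.35]), 1))
```

The thesis's multiple-kernel extension would replace the single `rbf` with a weighted sum of kernels, one per feature scale, with the weights learned from data.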


CONTEXT: In the Health Outcomes and Reduced Incidence with Zoledronic Acid Once Yearly - Pivotal Fracture Trial (HORIZON-PFT), zoledronic acid (ZOL) 5 mg significantly reduced fracture risk. OBJECTIVE: The aim of the study was to identify factors associated with greater efficacy during ZOL 5 mg treatment. DESIGN, SETTING, AND PATIENTS: We conducted a subgroup analysis (preplanned and post hoc) of a multicenter, double-blind, placebo-controlled, 36-month trial in 7765 women with postmenopausal osteoporosis. INTERVENTION: A single infusion of ZOL 5 mg or placebo was administered at baseline, 12, and 24 months. MAIN OUTCOME MEASURES: Primary endpoints were new vertebral fracture and hip fracture. Secondary endpoints were nonvertebral fracture and change in femoral neck bone mineral density (BMD). Baseline risk factor subgroups were age, BMD T-score and vertebral fracture status, total hip BMD, race, weight, geographical region, smoking, height loss, history of falls, physical activity, prior bisphosphonates, creatinine clearance, body mass index, and concomitant osteoporosis medications. RESULTS: Greater ZOL-induced effects on vertebral fracture risk were seen with younger age (treatment-by-subgroup interaction, P = 0.05), normal creatinine clearance (P = 0.04), and body mass index ≥25 kg/m² (P = 0.02). There were no significant treatment-factor interactions for hip or nonvertebral fracture or for change in BMD. CONCLUSIONS: ZOL appeared more effective in preventing vertebral fracture in younger women, overweight/obese women, and women with normal renal function. ZOL had similar effects irrespective of fracture risk factors or femoral neck BMD.


Machine Learning for geospatial data: algorithms, software tools and case studies. The thesis is devoted to the analysis, modeling and visualisation of spatial environmental data using machine learning algorithms. In a broad sense, machine learning can be considered a subfield of artificial intelligence; it mainly concerns the development of techniques and algorithms that allow computers to learn from data. In this thesis, machine learning algorithms are adapted to learn from spatial environmental data and to make spatial predictions. Why machine learning?
In a few words, most machine learning algorithms are universal, adaptive, nonlinear, robust and efficient modeling tools. They can find solutions for classification, regression and probability density modeling problems in high-dimensional geo-feature spaces, composed of geographical space and additional relevant spatially referenced features. They are well suited to implementation as predictive engines in decision support systems, for the purposes of environmental data mining, including pattern recognition, modeling and prediction as well as automatic data mapping. They have competitive efficiency with geostatistical models in low-dimensional geographical spaces but are indispensable in high-dimensional geo-feature spaces. The most important and popular machine learning algorithms and models of interest for geo- and environmental sciences are presented in detail, from a theoretical description of the concepts to the software implementation. The main algorithms and models considered are the following: the multilayer perceptron (a workhorse of machine learning), general regression neural networks, probabilistic neural networks, self-organising (Kohonen) maps, Gaussian mixture models, radial basis function networks and mixture density networks. This set of models covers machine learning tasks such as classification, regression and density estimation. Exploratory data analysis (EDA) is an initial and very important part of data analysis. In this thesis the concepts of exploratory spatial data analysis (ESDA) are considered using both a traditional geostatistical approach, experimental variography, and machine learning. Experimental variography is a basic tool for the geostatistical analysis of anisotropic spatial correlations which helps to understand the presence of spatial patterns described at least by two-point statistics. A machine learning approach to ESDA is presented by applying the k-nearest neighbors (k-NN) method, which is simple and has very good interpretation and visualization properties. An important part of the thesis deals with a topical subject: the automatic mapping of geospatial data. The general regression neural network (GRNN) is proposed as an efficient model to solve this task. The performance of the GRNN model is demonstrated on Spatial Interpolation Comparison (SIC) 2004 data, where it significantly outperformed all other approaches, especially under emergency conditions. The thesis consists of four chapters and has the following structure: theory, applications, software tools and how-to-do-it examples. An important part of the work is a collection of software tools, Machine Learning Office. These tools were developed over the last 15 years and have been used in many teaching courses, including international workshops in China, France, Italy, Ireland and Switzerland, as well as in fundamental and applied research projects. The case studies considered cover a wide spectrum of real-life low- and high-dimensional geo- and environmental problems, such as air, soil and water pollution by radionuclides and heavy metals, classification of soil types and hydrogeological units, decision-oriented mapping with uncertainties, and natural hazard (landslide, avalanche) assessment and susceptibility mapping. Complementary tools useful for exploratory data analysis and visualisation were developed as well. The software is user friendly and easy to use.
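The GRNN highlighted above for automatic mapping is essentially a Nadaraya-Watson kernel-weighted average of the training values, short enough to sketch in full; the sample data below are invented.

```python
import math

# A minimal general regression neural network (GRNN): the prediction at a
# point is a Gaussian-kernel-weighted average of all observed values.
# ((x, y) coordinates, measured value) -- invented sample data.
samples = [((0.0, 0.0), 1.2), ((1.0, 0.0), 2.3), ((0.0, 1.0), 0.7),
           ((1.0, 1.0), 2.9), ((0.5, 0.5), 1.8)]

def grnn_predict(point, sigma=0.3):
    num = den = 0.0
    for (x, y), value in samples:
        d2 = (point[0] - x) ** 2 + (point[1] - y) ** 2
        w = math.exp(-d2 / (2.0 * sigma ** 2))  # Gaussian kernel weight
        num += w * value
        den += w
    return num / den

# sigma (the kernel bandwidth) is the only free parameter; in automatic
# mapping it is typically tuned by cross-validation.
print(round(grnn_predict((0.9, 0.9)), 2))
```

This single-parameter structure is what makes the GRNN attractive for unattended, emergency-mode mapping: there is almost nothing to configure, yet the model adapts to the local sampling density.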


The global structural connectivity of the brain, the human connectome, is now accessible at millimeter scale with the use of MRI. In this paper, we describe an approach to map the connectome by constructing normalized whole-brain structural connection matrices derived from diffusion MRI tractography at 5 different scales. Using a template-based approach to match cortical landmarks of different subjects, we propose a robust method that allows (a) the selection of identical cortical regions of interest of desired size and location in different subjects, with identification of the associated fiber tracts; (b) straightforward construction and interpretation of anatomically organized whole-brain connection matrices; and (c) statistical inter-subject comparison of brain connectivity at various scales. The fully automated post-processing steps necessary to build such matrices are detailed in this paper. Extensive validation tests are performed to assess the reproducibility of the method in a group of 5 healthy subjects, and its reliability is further assessed in a group of 20 healthy subjects.
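The construction of a normalized connection matrix from tractography output can be sketched as follows. The region names, streamline list and the normalisation choice (division by total streamline count) are illustrative assumptions, not the paper's exact pipeline.

```python
# Each streamline contributes to the matrix cell of its two endpoint regions;
# the matrix is then normalised so it is comparable across subjects.
# Region labels and streamlines are invented for illustration.
regions = ["precentral_L", "precentral_R", "cuneus_L", "cuneus_R"]
index = {name: i for i, name in enumerate(regions)}

# (region at one endpoint, region at the other endpoint), one per streamline
streamlines = [
    ("precentral_L", "precentral_R"),
    ("precentral_L", "precentral_R"),
    ("precentral_L", "cuneus_L"),
    ("cuneus_L", "cuneus_R"),
    ("cuneus_L", "cuneus_R"),
    ("cuneus_L", "cuneus_R"),
]

n = len(regions)
matrix = [[0.0] * n for _ in range(n)]
for a, b in streamlines:
    i, j = index[a], index[b]
    matrix[i][j] += 1.0
    matrix[j][i] += 1.0  # structural connectivity is undirected

# Normalise by total streamline count (one of several possible choices;
# others weight by fiber length or region surface area).
total = len(streamlines)
matrix = [[cell / total for cell in row] for row in matrix]

print(matrix[index["cuneus_L"]][index["cuneus_R"]])
```

Repeating this at several parcellation scales, as the paper does, simply swaps in finer or coarser `regions` lists while keeping the same construction.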


Spatial data analysis, mapping and visualization are of great importance in various fields: environment, pollution, natural hazards and risks, epidemiology, spatial econometrics, etc. A basic task of spatial mapping is to make predictions based on some empirical data (measurements). A number of state-of-the-art methods can be used for the task: deterministic interpolations, methods of geostatistics such as the family of kriging estimators (Deutsch and Journel, 1997), machine learning algorithms such as artificial neural networks (ANN) of different architectures, hybrid ANN-geostatistics models (Kanevski and Maignan, 2004; Kanevski et al., 1996), etc. All the methods mentioned above can be used for solving the problem of spatial data mapping. Environmental empirical data are always contaminated/corrupted by noise, often of unknown nature. That is one of the reasons why deterministic models can be inconsistent, since they treat the measurements as values of some unknown function that should be interpolated. Kriging estimators treat the measurements as the realization of some spatial random process. To obtain an estimate with kriging one has to model the spatial structure of the data: the spatial correlation function or (semi-)variogram. This task can be complicated if there is not a sufficient number of measurements, and the variogram is sensitive to outliers and extremes. ANNs are a powerful tool, but they also suffer from a number of drawbacks. ANNs of a special type, multilayer perceptrons, are often used as a detrending tool in hybrid (ANN + geostatistics) models (Kanevski and Maignan, 2004). Therefore, the development and adaptation of a method that is nonlinear and robust to noise in the measurements, deals with small empirical datasets and has a solid mathematical background is of great importance. The present paper deals with such a model, based on Statistical Learning Theory (SLT): Support Vector Regression. SLT is a general mathematical framework devoted to the problem of estimating dependencies from empirical data (Hastie et al., 2004; Vapnik, 1998). SLT models for classification, Support Vector Machines, have shown good results on different machine learning tasks. The results of SVM classification of spatial data are also promising (Kanevski et al., 2002). The properties of SVM for regression, Support Vector Regression (SVR), are less studied. First results of the application of SVR for spatial mapping of physical quantities were obtained by the authors for mapping of medium porosity (Kanevski et al., 1999) and for mapping of radioactively contaminated territories (Kanevski and Canu, 2000). The present paper is devoted to further understanding of the properties of the SVR model for spatial data analysis and mapping. A detailed description of SVR theory can be found in (Cristianini and Shawe-Taylor, 2000; Smola, 1996), and the basic equations for nonlinear modeling are given in section 2. Section 3 discusses the application of SVR to spatial data mapping in a real case study: soil pollution by the Cs137 radionuclide. Section 4 discusses the properties of the model applied to noisy data or data with outliers.
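The experimental (semi-)variogram, which the passage identifies as the basic structural tool of kriging, can be sketched directly: the average of half the squared value differences over point pairs whose separation falls in a lag bin. The sample points and lag tolerance below are invented.

```python
import math

# ((x, y) coordinates, measured value) -- invented scattered measurements.
points = [((0.0, 0.0), 1.0), ((1.0, 0.0), 1.4), ((2.0, 0.0), 2.1),
          ((0.0, 1.0), 1.1), ((1.0, 1.0), 1.6), ((2.0, 1.0), 2.4)]

def semivariogram(data, lag, tol=0.5):
    """gamma(h): mean of 0.5*(z_i - z_j)^2 over pairs with separation
    within tol of the lag distance."""
    acc, count = 0.0, 0
    for i in range(len(data)):
        for j in range(i + 1, len(data)):
            (xi, yi), zi = data[i]
            (xj, yj), zj = data[j]
            h = math.hypot(xi - xj, yi - yj)
            if abs(h - lag) <= tol:
                acc += 0.5 * (zi - zj) ** 2
                count += 1
    return acc / count if count else float("nan")

# For spatially correlated data the semivariance typically grows with lag.
print(semivariogram(points, 1.0), semivariogram(points, 2.0))
```

The sensitivity to outliers mentioned in the abstract is visible in the `0.5*(zi - zj)**2` term: a single extreme value inflates every pair it enters, which is part of the motivation for robust alternatives such as SVR.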

Relevância:

30.00% 30.00%

Publicador:

Resumo:

Background: There may be a considerable gap between LDL cholesterol (LDL-C) and blood pressure (BP) goal values recommended by the guidelines and results achieved in daily practice. Design: Prospective cross-sectional survey of cardiovascular disease risk profiles and management, with a focus on lipid lowering and BP lowering in clinical practice. Methods: In phase 1, the cardiovascular risk of patients with a known lipid profile visiting their general practitioner was anonymously assessed in accordance with the PROCAM score. In phase 2, high-risk patients who did not achieve the LDL-C goal of less than 2.6 mmol/l in phase 1 could be further documented. Results: Six hundred thirty-five general practitioners collected the data of 23 892 patients with a known lipid profile. Forty percent were high-risk patients (diabetes mellitus or coronary heart disease or PROCAM score >20%), compared with 27% estimated by the physicians. The goal attainment rate was almost twice as high for BP as for LDL-C in high-risk patients (62 vs. 37%). Both goals were attained by 25%. LDL-C values in phases 1 and 2 were available for 3097 high-risk patients not at the LDL-C goal in phase 1; 32% of these patients achieved the LDL-C goal of less than 2.6 mmol/l after a mean of 17 weeks. The most successful strategies for LDL-C reduction were implemented in only 22% of the high-risk patients. Conclusion: Although patients at high cardiovascular risk were treated more intensively than low- or medium-risk patients, the majority remained insufficiently controlled, which is an incentive for intensified medical education. Adequate implementation of Swiss and international guidelines would be expected to contribute to improved achievement of LDL-C and BP goal values in daily practice.

Relevância:

30.00% 30.00%

Publicador:

Resumo:

Three standard radiation qualities (RQA 3, RQA 5 and RQA 9) and two screens, Kodak Lanex Regular and Insight Skeletal, were used to compare the imaging performance and dose requirements of the new Kodak Hyper Speed G and the current Kodak T-MAT G/RA medical x-ray films. The noise equivalent quanta (NEQ) and detective quantum efficiencies (DQE) of the four screen-film combinations were measured at three gross optical densities and compared with the characteristics for the Kodak CR 9000 system with GP (general purpose) and HR (high resolution) phosphor plates. The new Hyper Speed G film has double the intrinsic sensitivity of the T-MAT G/RA film and a higher contrast in the high optical density range for comparable exposure latitude. By providing both high sensitivity and high spatial resolution, the new film significantly improves the compromise between dose and image quality. As expected, the new film has a higher noise level and a lower signal-to-noise ratio than the standard film, although in the high frequency range this is compensated for by a better resolution, giving better DQE results, especially at high optical density. Both screen-film systems outperform the phosphor plates in terms of MTF and DQE for standard imaging conditions (Regular screen at RQA 5 and RQA 9 beam qualities). At low energy (RQA 3), the CR system has a comparable low-frequency DQE to screen-film systems when used with a fine screen at low and middle optical densities, and a superior low-frequency DQE at high optical density.
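The NEQ and DQE figures of merit compared above are related through the standard textbook expressions NEQ(f) = MTF^2(f) / NNPS(f) and DQE(f) = NEQ(f) / q, where NNPS is the normalized noise power spectrum and q the incident quanta per unit area. A minimal sketch with invented curves (all numbers are illustrative, not the study's measurements):

```python
import numpy as np

def neq_dqe(mtf, nnps, q):
    """NEQ(f) = MTF^2(f) / NNPS(f); DQE(f) = NEQ(f) / q.
    nnps is the noise power spectrum normalized by the squared
    large-area signal (units mm^2), q the incident fluence (1/mm^2)."""
    neq = mtf ** 2 / nnps
    return neq, neq / q

# illustrative curves: Gaussian-like MTF and a slowly rising NNPS
f = np.linspace(0.05, 5.0, 100)      # spatial frequency, cycles/mm
mtf = np.exp(-(f / 2.5) ** 2)
nnps = 2e-5 * (1 + 0.1 * f)          # mm^2
q = 1e5                              # quanta per mm^2
neq, dqe = neq_dqe(mtf, nnps, q)
```

Because MTF enters squared while NNPS enters once, a film with higher noise can still win on DQE at high frequencies if its resolution is sufficiently better, which is the trade-off the abstract describes for the Hyper Speed G film.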

Relevância:

30.00% 30.00%

Publicador:

Resumo:

Multi-center studies using magnetic resonance imaging facilitate studying small effect sizes, global population variance and rare diseases. The reliability and sensitivity of these multi-center studies crucially depend on the comparability of the data generated at different sites and time points. The level of inter-site comparability is still controversial for conventional anatomical T1-weighted MRI data. Quantitative multi-parameter mapping (MPM) was designed to provide MR parameter measures that are comparable across sites and time points, i.e., 1 mm high-resolution maps of the longitudinal relaxation rate (R1 = 1/T1), effective proton density (PD(*)), magnetization transfer saturation (MT) and effective transverse relaxation rate (R2(*) = 1/T2(*)). MPM was validated at 3T for use in multi-center studies by scanning five volunteers at three different sites. We determined the inter-site bias and the inter-site and intra-site coefficients of variation (CoV) for typical morphometric measures [i.e., gray matter (GM) probability maps used in voxel-based morphometry] and the four quantitative parameters. The inter-site bias and CoV were smaller than 3.1% and 8%, respectively, except for the inter-site CoV of R2(*) (<20%). The GM probability maps based on the MT parameter maps had a 14% higher inter-site reproducibility than maps based on conventional T1-weighted images. The low inter-site bias and variance in the parameters and derived GM probability maps confirm the high comparability of the quantitative maps across sites and time points. The reliability, short acquisition time, high resolution and detailed insights into the brain microstructure provided by MPM make it an efficient tool for multi-center imaging studies.
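The inter-site CoV reported above is simply the across-site standard deviation divided by the mean, computed per subject and averaged, with bias measured as a site's deviation from the grand mean. A minimal sketch with made-up numbers (the values are invented for illustration, not the study's data):

```python
import numpy as np

# rows: 5 volunteers, columns: 3 sites; e.g. mean GM R1 values in 1/s
# (all numbers are invented for illustration)
r1 = np.array([[0.62, 0.63, 0.61],
               [0.58, 0.59, 0.58],
               [0.65, 0.64, 0.66],
               [0.60, 0.61, 0.60],
               [0.63, 0.62, 0.63]])

# inter-site CoV per subject: std across sites / mean across sites,
# then averaged over subjects and expressed in percent
cov_per_subject = r1.std(axis=1, ddof=1) / r1.mean(axis=1)
inter_site_cov = 100 * cov_per_subject.mean()

# inter-site bias of one site relative to the grand mean, in percent
bias_site0 = 100 * (r1[:, 0].mean() - r1.mean()) / r1.mean()
```

With plausible numbers like these the CoV lands well under the 8% bound the abstract quotes, which is what "high comparability across sites" means in quantitative terms.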