52 results for Tertiary, Assessment, Statistics, Learning, Mathematics


Relevance:

30.00%

Publisher:

Abstract:

This paper presents multiple kernel learning (MKL) regression as an exploratory spatial data analysis and modelling tool. The MKL approach is introduced as an extension of support vector regression in which dedicated kernels divide a given task into sub-problems that are treated separately and effectively. It provides better interpretability for non-linear robust kernel regression at the cost of a more complex numerical optimization. In particular, we investigate the use of MKL as a tool to avoid using ad hoc topographic indices as covariates in statistical models in complex terrain: instead, MKL learns these relationships from the data in a non-parametric fashion. A study on data simulated from real terrain features confirms the ability of MKL to enhance the interpretability of data-driven models and to aid feature selection without degrading predictive performance. We also examine the stability of the MKL algorithm with respect to the number of training samples and to the presence of noise. Finally, the results of a real case study are presented, in which MKL exploits a large set of terrain features computed at multiple spatial scales to predict mean wind speed in an Alpine region.
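As a rough illustration of the kernel-composition idea behind MKL, the sketch below combines one dedicated RBF kernel per feature group with fixed uniform weights and fits a regression on the resulting Gram matrix. The feature groups and synthetic data are hypothetical, and the use of kernel ridge with fixed weights (instead of the paper's support-vector machinery with learned kernel weights) is a simplifying assumption.

```python
# Kernel composition in the spirit of MKL: one kernel per feature group.
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.metrics.pairwise import rbf_kernel

rng = np.random.default_rng(0)
X = rng.random((150, 4))                 # e.g. coordinates + terrain features
y = np.sin(6 * X[:, 0]) + X[:, 2] ** 2 + rng.normal(scale=0.1, size=150)

# One dedicated kernel per feature group (here: one per feature), so each
# sub-problem gets its own kernel; weights are uniform in this sketch.
groups = [[0], [1], [2], [3]]
weights = np.full(len(groups), 1.0 / len(groups))
K = sum(w * rbf_kernel(X[:, g], X[:, g], gamma=10.0)
        for w, g in zip(weights, groups))

model = KernelRidge(alpha=1e-2, kernel="precomputed").fit(K, y)
print("training R^2:", model.score(K, y))
```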

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVES: A survey was undertaken among Swiss occupational hygienists and other professionals to identify the exposure assessment methods used, the contextual parameters observed, and the uses, difficulties and possible developments of exposure models for field application. METHODS: A questionnaire was mailed to 121 occupational hygienists, all members of the Swiss Occupational Hygiene Society. A shorter questionnaire was also sent to registered occupational physicians and selected safety specialists. Descriptive statistics and multivariate analyses were performed. RESULTS: The response rate for occupational hygienists was 60%. So-called expert judgement appeared to be the most widely used method, but its efficiency and reliability were both rated very low. Long-term sampling was perceived as the most efficient and reliable method. Various determinants of exposure, such as emission rate and work activity, were often considered important even though they were not included in the exposure assessment process. Near-field local phenomena were also judged important for estimating operator exposure. CONCLUSION: Exposure models should be improved to integrate factors that are more easily accessible to practitioners. Descriptors of emission and local phenomena should also be included.

Relevance:

30.00%

Publisher:

Abstract:

In the field of perception, learning is constrained by a distributed functional architecture of highly specialized cortical areas. For example, the learning capacities of patients with visual deficits from cerebral lesions, such as hemianopia or visual agnosia, are limited by their residual perceptual abilities. Moreover, a visual recognition deficit of apparently perceptual origin may be associated with an alteration of representations in long-term (semantic) memory. Furthermore, perception and memory traces rely on parallel processing, as has recently been demonstrated for human audition. Activation studies in normal subjects and psychophysical investigations in patients with focal hemispheric lesions have shown that auditory information relevant to sound recognition and information relevant to sound localization are processed in parallel, anatomically distinct cortical networks (temporal and parietal cortex, respectively), often referred to as the "what" and "where" processing streams. Parallel processing may appear counterintuitive from the point of view of a unified perception of the auditory world, but it offers advantages, such as rapid processing within a single stream, adaptability in perceptual learning, and ease of multisensory interaction. More generally, implicit learning mechanisms are responsible for the non-conscious acquisition of a large part of our knowledge about the world, exploiting our sensitivity to the rules and regularities that structure our environment. Implicit learning is involved in cognitive development, in the generation of emotional processing, and in the acquisition of natural language. Preserved implicit learning abilities have been shown in amnesic patients with paradigms such as serial reaction time and artificial grammar learning tasks, confirming that implicit learning mechanisms are not sustained by the cognitive processes and brain structures that are damaged in amnesia.
From a clinical perspective, the assessment of implicit learning abilities in amnesic patients could be critical for building adapted neuropsychological rehabilitation programs.

Relevance:

30.00%

Publisher:

Abstract:

Single amino acid substitutions are the type of protein alteration most frequently associated with human disease. Current studies seek primarily to distinguish neutral mutations from harmful ones; very few methods offer an explanation of the final prediction in terms of the probable structural or functional effect on the protein. In this study, we describe the use of three novel parameters to identify experimentally verified critical residues of the TP53 protein (p53). The first two parameters use a surface clustering method to calculate the protein surface area of highly conserved regions or of regions with high non-local atomic interaction energy (ANOLEA) scores; these parameters help identify important functional regions on the protein surface. The third parameter uses a new method for pseudobinding free-energy estimation to specifically probe the importance of residue side-chains to the stability of the protein fold. A decision tree was designed to optimally combine these three parameters, and the result was compared to the functional data stored in the International Agency for Research on Cancer (IARC) TP53 mutation database. The final prediction achieved an accuracy of 70%, a Matthews correlation coefficient of 0.45, and a high specificity of 91.8%. Mutations in the 85 correctly identified important residues represented 81.7% of the total mutations recorded in the database. In addition, the method was able to correctly assign a probable functional or structural role to the residues. Such information could be critical for the interpretation and prediction of the effects of missense mutations, as it not only provides a fundamental explanation of the observed effect but also helps design the most appropriate laboratory experiment to verify the prediction.
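A minimal sketch of the combination step described above: a shallow decision tree over three per-residue parameters, evaluated with accuracy and the Matthews correlation coefficient. The feature names and synthetic labels are hypothetical stand-ins for the study's surface-cluster areas, ANOLEA scores and pseudobinding free energies.

```python
# Combine three residue-level parameters with a decision tree and score it.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score, matthews_corrcoef

rng = np.random.default_rng(0)
n = 393  # roughly the length of p53; illustrative only

# Hypothetical per-residue parameters:
#   col 0 - surface area of highly conserved surface clusters
#   col 1 - surface area of high-ANOLEA-score regions
#   col 2 - pseudobinding free-energy contribution of the side-chain
X = rng.normal(size=(n, 3))
y = (X @ np.array([1.0, 0.8, 1.2]) + rng.normal(scale=1.0, size=n)) > 0.8

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
pred = tree.predict(X)
print("accuracy:", accuracy_score(y, pred))
print("Matthews correlation coefficient:", matthews_corrcoef(y, pred))
```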

Relevance:

30.00%

Publisher:

Abstract:

This study aimed to use a plantar pressure insole to estimate the three-dimensional ground reaction force (GRF) as well as the frictional torque (T(F)) during walking. Eleven subjects, six healthy and five patients with ankle disease, participated in the study, wearing pressure insoles during several walking trials on a force-plate. The plantar pressure distribution was analyzed, and 10 principal components of 24 regional pressure values, together with the stance time percentage (STP), were considered for GRF and T(F) estimation. Both linear and non-linear approximators were used to estimate the GRF and T(F), based on two learning strategies using intra-subject and inter-subject data. The RMS error and the correlation coefficient between the approximators' outputs and the actual patterns obtained from the force-plate were calculated. Our results showed better performance for the non-linear approximation, especially when the STP was included as an input. The smallest errors were observed for the vertical force (4%) and anterior-posterior force (7.3%), whereas the medial-lateral force (11.3%) and frictional torque (14.7%) had higher errors. The results obtained for the patients showed higher errors; nevertheless, when data from the same patient were used for learning, the results improved and, in general, only slight differences from healthy subjects were observed. In conclusion, this study showed that an ambulatory pressure insole with data normalization, an optimal choice of inputs and a well-trained nonlinear mapping function can efficiently estimate the three-dimensional ground reaction force and frictional torque over consecutive gait cycles without requiring a force-plate.
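A minimal sketch of the estimation pipeline: 24 regional pressure values reduced to 10 principal components, STP appended as an extra input, and a small neural network as the non-linear approximator. The synthetic signals and network size are assumptions; the study's actual approximators and normalization are not reproduced here.

```python
# Pressure-insole features -> PCA -> nonlinear mapping to GRF and torque.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
n_frames = 2000
pressures = rng.random((n_frames, 24))    # 24 regional pressure values
stp = rng.random((n_frames, 1))           # stance time percentage (STP)
targets = rng.normal(size=(n_frames, 4))  # Fx, Fy, Fz, T(F) - placeholder data

pcs = PCA(n_components=10).fit_transform(pressures)
X = np.hstack([pcs, stp])                 # 10 PCs + STP as inputs

model = MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000,
                     random_state=1).fit(X, targets)
est = model.predict(X)
rms = np.sqrt(np.mean((est - targets) ** 2, axis=0))
print("RMS error per output:", rms)
```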

Relevance:

30.00%

Publisher:

Abstract:

Automatic environmental monitoring networks, supported by wireless communication technologies, now provide large and ever-increasing volumes of data. The use of this information in natural hazard research is an important issue. Spatial maps of hazard-related parameters, produced from point observations and available auxiliary information, are particularly useful for risk assessment and decision making. The purpose of this article is to present and explore appropriate tools to process large amounts of available data and produce predictions at fine spatial scales: the algorithms of machine learning, which are aimed at non-parametric, robust modelling of non-linear dependencies from empirical data. The computational efficiency of these data-driven methods allows prediction maps to be produced in real time, which makes them superior to physical models for operational use in risk assessment and mitigation. This situation is encountered in particular in the spatial prediction of climatic variables (topo-climatic mapping). In the complex topography of mountainous regions, meteorological processes are highly influenced by the relief. The article shows how these relations, possibly regionalized and non-linear, can be modelled from data using information from digital elevation models. The methodology is illustrated by mapping temperatures (including situations of Föhn and temperature inversion) from measurements taken by the Swiss meteorological monitoring network. The methods used in the study include data-driven feature selection, support vector algorithms and artificial neural networks.
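A minimal sketch of topo-climatic regression in this spirit: a support vector machine predicting temperature from coordinates plus DEM-derived geo-features. The feature set and the toy lapse-rate data are hypothetical; a real application would use measured station data and cross-validated hyperparameters.

```python
# Temperature mapping from coordinates and DEM-derived geo-features.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(2)
n = 500
# Columns: x, y, elevation, slope, curvature (all rescaled to [0, 1])
X = rng.random((n, 5))
# Toy lapse-rate model: temperature falls with elevation, plus noise
temp = 20.0 - 6.5 * X[:, 2] + rng.normal(scale=0.5, size=n)

model = make_pipeline(StandardScaler(), SVR(C=10.0, epsilon=0.1))
model.fit(X, temp)
print("training R^2:", model.score(X, temp))
```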

Relevance:

30.00%

Publisher:

Abstract:

Extended pharmacological venous thromboembolism (VTE) prophylaxis beyond discharge is recommended for patients undergoing high-risk surgery. We prospectively investigated prophylaxis in 1,046 consecutive patients undergoing major orthopaedic (70%) or major cancer surgery (30%) in 14 Swiss hospitals. Appropriate in-hospital prophylaxis was used in 1,003 (96%) patients. At discharge, 638 (61%) patients received a prescription for extended pharmacological prophylaxis: 564 (77%) after orthopaedic surgery and 74 (23%) after cancer surgery (p < 0.001). Patients with knee replacement (94%), hip replacement (81%), major trauma (80%), and curative arthroscopy (73%) had the highest prescription rates for extended VTE prophylaxis; the lowest rates were found in patients undergoing major surgery for thoracic (7%), gastrointestinal (19%), and hepatobiliary (33%) cancer. The median duration of prescribed extended prophylaxis was longer in patients with orthopaedic surgery (32 days, interquartile range 14-40 days) than in patients with cancer surgery (23 days, interquartile range 11-30 days; p < 0.001). Among the 278 patients with an extended prophylaxis order after hip replacement, knee replacement, or hip fracture surgery, 120 (43%) received a prescription for at least 35 days; among the 74 patients with an extended prophylaxis order after major cancer surgery, 20 (27%) received a prescription for at least 28 days. In conclusion, approximately one quarter of the patients with major orthopaedic surgery and more than three quarters of the patients with major cancer surgery did not receive a prescription for extended VTE prophylaxis. Future efforts should focus on improving extended VTE prophylaxis, particularly in patients undergoing major cancer surgery.

Relevance:

30.00%

Publisher:

Abstract:

Machine Learning for geospatial data: algorithms, software tools and case studies

This thesis is devoted to the analysis, modeling and visualisation of spatial environmental data using machine learning algorithms. In a broad sense, machine learning can be considered a subfield of artificial intelligence concerned with the development of techniques and algorithms that allow computers to learn from data. In this thesis, machine learning algorithms are adapted to learn from spatial environmental data and to make spatial predictions. Why machine learning? Because most machine learning algorithms are universal, adaptive, non-linear, robust and efficient modeling tools. They can solve classification, regression and probability density modeling problems in high-dimensional spaces composed of geographical coordinates and additional relevant spatially referenced variables ("geo-features"). They are well suited to implementation as predictive engines in decision support systems for environmental questions ranging from pattern recognition to modeling, prediction and automatic mapping. Their efficiency is comparable to that of geostatistical models in the space of geographical coordinates, but they are indispensable for high-dimensional data that include geo-features. The most important and popular machine learning algorithms and models of interest for geo- and environmental sciences are presented, from the theoretical description of the concepts to their software implementation: the multilayer perceptron (MLP), the best-known algorithm in artificial intelligence; general regression neural networks (GRNN); probabilistic neural networks (PNN); self-organising (Kohonen) maps (SOM); Gaussian mixture models (GMM); radial basis function networks (RBF); and mixture density networks (MDN). This range of algorithms covers tasks such as classification, regression and probability density estimation. Exploratory data analysis (EDA) is the first step of any data analysis. In this thesis, the concepts of exploratory spatial data analysis (ESDA) are treated both through the traditional geostatistical approach, experimental variography, and through machine learning. Experimental variography, which studies the relations between pairs of points, is a basic tool for the geostatistical analysis of anisotropic spatial correlations; it detects the presence of spatial patterns describable by two-point statistics. The machine learning approach to ESDA is presented through the k-nearest neighbours (k-NN) method, which is very simple and has excellent interpretation and visualization properties. An important part of the thesis deals with a topical problem: the automatic mapping of geospatial data. The general regression neural network is proposed as an efficient model for this task; its performance is demonstrated on the Spatial Interpolation Comparison (SIC) 2004 data, where the GRNN significantly outperformed all other methods, particularly in the emergency scenarios. The thesis consists of four parts: theory, applications, software tools, and how-to-do-it examples. An important part of the work is a collection of software tools, Machine Learning Office, developed over the last 15 years and used both in many teaching courses, including international workshops in China, France, Italy, Ireland and Switzerland, and in fundamental and applied research projects. The case studies considered cover a wide spectrum of real-life low- and high-dimensional geo- and environmental problems, such as air, soil and water pollution by radionuclides and heavy metals, the classification of soil types and hydrogeological units, decision-oriented mapping with uncertainties, and natural hazard (landslide, avalanche) assessment and susceptibility mapping. Complementary tools for exploratory data analysis and visualisation were developed as well, with care taken to create a user-friendly, easy-to-use interface.
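A general regression neural network is essentially a Nadaraya-Watson kernel regression estimator, so a minimal sketch fits in a few lines. The isotropic Gaussian kernel, fixed bandwidth and toy spatial field below are simplifying assumptions; a real automatic-mapping application would tune sigma by cross-validation on the monitoring-network measurements.

```python
# A minimal GRNN (Nadaraya-Watson kernel regression) for spatial mapping.
import numpy as np

def grnn_predict(X_train, y_train, X_query, sigma=0.1):
    # Squared Euclidean distances between query and training points
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / (2.0 * sigma ** 2))      # Gaussian kernel weights
    return (w @ y_train) / w.sum(axis=1)       # kernel-weighted average

rng = np.random.default_rng(3)
X = rng.random((200, 2))                       # 2-D measurement locations
y = np.sin(4 * X[:, 0]) + np.cos(3 * X[:, 1])  # toy spatial field
grid = rng.random((5, 2))                      # prediction locations
print(grnn_predict(X, y, grid, sigma=0.05))
```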

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Children and adolescents are at high risk of sustaining fractures during growth. Epidemiological assessment is therefore crucial for fracture prevention. The AO Comprehensive Injury Automatic Classifier (AO COIAC) was used to evaluate epidemiological data on pediatric long bone fractures in a large cohort. METHODS: Data from children and adolescents with long bone fractures sustained between 2009 and 2011, treated at either of two tertiary pediatric surgery hospitals in Switzerland, were retrospectively collected. Fractures were classified according to the AO Pediatric Comprehensive Classification of Long Bone Fractures (PCCF). RESULTS: For a total of 2716 patients (60% boys), 2807 accidents with 2840 long bone fractures (59% radius/ulna; 21% humerus; 15% tibia/fibula; 5% femur) were documented. The children's mean age (SD) was 8.2 (4.0) years (6% infants; 26% preschool children; 40% school children; 28% adolescents). Adolescent boys sustained more fractures than girls (p < 0.001). The leading cause of fractures was falls (27%), followed by accidents during leisure activities (25%), at home (14%), on playgrounds (11%), in traffic (11%) and at school (8%). Boys predominated in all accident types except playground and at-home accidents. The distribution of accident types differed across age classes (p < 0.001). Twenty-six percent of patients were classed as overweight or obese - higher than the figures published by the WHO for the corresponding ages - with a higher proportion of overweight and obese boys than in the Swiss population (p < 0.0001). CONCLUSION: Overall, differences in fracture distribution were sex and age related. Overweight and obese patients seemed to be at increased risk of sustaining fractures. Our data provide valuable input for the future development of prevention strategies. The AO PCCF proved useful in epidemiological reporting and analysis of pediatric long bone fractures.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Patients with rare diseases such as congenital hypogonadotropic hypogonadism (CHH) are dispersed, often challenged to find specialized care, and face other health disparities. The internet has the potential to reach a wide audience of rare disease patients and can help connect patients and specialists. Therefore, this study aimed to: (i) determine whether web-based platforms could be effectively used to conduct an online needs assessment of dispersed CHH patients; (ii) identify the unmet health and informational needs of CHH patients; and (iii) assess patient acceptability of patient-centered, web-based interventions to bridge shortfalls in care. METHODS: A sequential mixed-methods design was used: first, an online survey was conducted to evaluate health-promoting behavior and identify unmet health and informational needs of CHH men. Subsequently, patient focus groups were held to explore specific patient-identified targets for care and to examine the acceptability of possible online interventions. Descriptive statistics and thematic qualitative analyses were used. RESULTS: A total of 105 male participants completed the online survey (mean age 37 ± 11, range 19-66 years), representing a spectrum of patients across a broad socioeconomic range; all but one subject had adequate healthcare literacy. The survey revealed periods of non-adherence to treatment (34/93, 37%) and gaps in healthcare (36/87, 41%) exceeding one year. Patient focus groups identified lasting psychological effects related to feelings of isolation, shame and body-image concerns. Survey respondents were active internet users, nearly all had sought CHH information online (101/105, 96%), and they rated the internet, healthcare providers, and the online community as equally important CHH information sources. Focus group participants were overwhelmingly positive regarding online interventions/support with links to reach expert healthcare providers and for peer-to-peer support. CONCLUSION: The web-based needs assessment was an effective way to reach dispersed CHH patients. These individuals often have long gaps in care and struggle with the psychosocial sequelae of CHH. They are highly motivated internet users seeking information and tapping into online communities, and they are receptive to novel web-based interventions addressing their unmet needs.

Relevance:

30.00%

Publisher:

Abstract:

The paper presents some contemporary approaches to spatial environmental data analysis. The main topics concern decision-oriented problems of environmental spatial data mining and modeling: valorization and representativity of data with the help of exploratory data analysis, spatial predictions, probabilistic and risk mapping, and the development and application of conditional stochastic simulation models. The innovative part of the paper presents an integrated/hybrid model: machine learning (ML) residuals sequential simulations (MLRSS). The model is based on multilayer perceptron and support vector regression ML algorithms used for modeling long-range spatial trends, followed by sequential simulations of the residuals. ML algorithms deliver non-linear solutions for spatially non-stationary problems, which are difficult for the geostatistical approach. Geostatistical tools (variography) are used to characterize the performance of the ML algorithms by analyzing the quality and quantity of the spatially structured information extracted from data. Sequential simulations provide an efficient assessment of uncertainty and spatial variability. A case study on Chernobyl fallout illustrates the performance of the proposed model. It is shown that probability mapping, provided by the combination of ML data-driven and geostatistical model-based approaches, can be used efficiently in decision-making processes.
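A minimal sketch of the hybrid MLRSS idea: a neural network captures the long-range spatial trend, and conditional realizations are drawn for its residuals. A Gaussian process stands in here for the geostatistical sequential simulation of the residuals; that substitution and the synthetic data are assumptions for illustration only.

```python
# Hybrid trend-plus-residual simulation: MLP trend + stochastic residuals.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(4)
X = rng.random((300, 2))  # sampling locations
z = X[:, 0] ** 2 + 0.3 * np.sin(10 * X[:, 1]) + rng.normal(scale=0.05, size=300)

# 1) Model the long-range trend with a neural network
trend = MLPRegressor(hidden_layer_sizes=(20,), max_iter=3000,
                     random_state=4).fit(X, z)
residuals = z - trend.predict(X)

# 2) Simulate the residual field conditionally (GP as a stand-in)
gp = GaussianProcessRegressor(kernel=RBF(0.1) + WhiteKernel(0.01),
                              random_state=4).fit(X, residuals)
grid = rng.random((10, 2))
sims = trend.predict(grid)[:, None] + gp.sample_y(grid, n_samples=5)
print("simulation spread per location:", sims.std(axis=1))
```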

Relevance:

30.00%

Publisher:

Abstract:

To assess religious coping in schizophrenia, we developed and tested a clinical grid, as no validated questionnaire exists for this population. One hundred fifteen outpatients were interviewed, and the results obtained by 2 clinicians were compared. Religion was central in the lives of 45% of patients, and 60% used religion extensively to cope with their illness. Religion is a multifaceted construct: principal component analysis elicited 4 factors, namely a subjective dimension, a collective dimension, synergy with psychiatric treatment, and ease of talking about religion with the psychiatrist. Different associations were found between these factors and psychopathology, substance abuse, and psychosocial adaptation. The high prevalence of spirituality and religious coping clearly indicates the necessity of addressing spirituality in patient care, and our clinical grid is suitable for this purpose. It proved applicable to a broad diversity of religious beliefs, even pathological ones; interjudge reliability and construct validity were high, and specific training is not required.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Pain assessment in mechanically ventilated patients is challenging, because nurses need to decode pain behaviour, interpret pain scores, and make appropriate decisions. This clinical reasoning process is inherent to advanced nursing practice but is poorly understood. A better understanding of this process could contribute to improved pain assessment and management. OBJECTIVE: This study aimed to describe the indicators that influence expert nurses' clinical reasoning when assessing pain in critically ill nonverbal patients. METHODS: This descriptive observational study was conducted in the adult intensive care unit (ICU) of a tertiary referral hospital in Western Switzerland. A purposive sample of expert nurses caring for nonverbal ventilated patients who received sedation and analgesia were invited to participate. Data were collected in "real life" using recorded think-aloud protocols combined with direct non-participant observation and brief interviews. Data were analysed using deductive and inductive content analyses within a theoretical framework related to clinical reasoning and pain. RESULTS: Seven expert nurses with an average of 7.85 (±3.1) years of critical care experience participated in the study. The patients had respiratory distress (n=2), cardiac arrest (n=2), subarachnoid bleeding (n=1), and multi-trauma (n=2). A total of 1344 quotes in five categories were identified. Patients' physiological stability was the principal indicator for decision making in relation to pain management. The results also showed that discriminating situations requiring sedation from situations requiring analgesia is a permanent challenge for nurses. Expert nurses mainly used working knowledge and patterns to anticipate and prevent pain. CONCLUSIONS: The patient's clinical condition is important for making decisions about pain in critically ill nonverbal patients. Pain cannot be assessed in isolation; its assessment should take the patient's clinical stability and sedation into account. Further research is warranted to confirm these results.

Relevance:

30.00%

Publisher:

Abstract:

Vertebral fracture assessments (VFAs) using dual-energy X-ray absorptiometry increase vertebral fracture detection in clinical practice and are highly reproducible. Measures of reproducibility depend on the frequency and distribution of the event. The aim of this study was to compare 2 reproducibility measures, reliability and agreement, in VFA readings in both a population-based and a clinical cohort. We measured agreement and reliability by uniform kappa and Cohen's kappa for vertebral reading and fracture identification: 360 VFAs from a population-based cohort and 85 from a clinical cohort. In the population-based cohort, 12% of vertebrae were unreadable, and vertebral fracture prevalence ranged from 3% to 4%. Inter-reader and intra-reader reliability by Cohen's kappa was fair to good (0.35-0.71 and 0.36-0.74, respectively), with good inter-reader and intra-reader agreement by uniform kappa (0.74-0.98 and 0.76-0.99, respectively). In the clinical cohort, 15% of vertebrae were unreadable, and vertebral fracture prevalence ranged from 7.6% to 8.1%. Inter-reader reliability was moderate to good (0.43-0.71), and agreement was good (0.68-0.91). In clinical situations, the levels of reproducibility measured by the 2 kappa statistics are concordant, so either could be used to measure agreement and reliability. However, if events are rare, as in a population-based cohort, we recommend evaluating reproducibility using the uniform kappa, as Cohen's kappa may be less accurate.
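A minimal sketch contrasting the two statistics on toy rare-event ratings, assuming "uniform kappa" denotes the free-marginal form with expected agreement fixed at 1/k for k categories (Bennett's S). With a rare event like vertebral fracture, Cohen's kappa is typically pulled down by the skewed marginals while the uniform kappa stays close to the observed agreement, which is the divergence the abstract discusses.

```python
# Cohen's kappa vs a uniform (free-marginal) kappa on rare-event ratings.
import numpy as np
from sklearn.metrics import cohen_kappa_score

def uniform_kappa(r1, r2, k):
    """Kappa with uniform expected agreement 1/k (Bennett's S)."""
    p_obs = np.mean(np.asarray(r1) == np.asarray(r2))
    p_exp = 1.0 / k
    return (p_obs - p_exp) / (1.0 - p_exp)

rng = np.random.default_rng(5)
reader1 = (rng.random(360) < 0.04).astype(int)  # ~4% fracture prevalence
reader2 = reader1.copy()
flip = rng.random(360) < 0.03                   # small disagreement rate
reader2[flip] = 1 - reader2[flip]

print("Cohen's kappa:", cohen_kappa_score(reader1, reader2))
print("uniform kappa:", uniform_kappa(reader1, reader2, k=2))
```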

Relevance:

30.00%

Publisher:

Abstract:

Geophysical tomography captures the spatial distribution of the underlying geophysical property at a relatively high resolution, but the tomographic images tend to be blurred representations of reality and generally fail to reproduce sharp interfaces. Such models may cause significant bias when taken as a basis for predictive flow and transport modeling and are unsuitable for uncertainty assessment. We present a methodology in which tomograms are used to condition multiple-point statistics (MPS) simulations. A large set of geologically reasonable facies realizations and their corresponding synthetically calculated cross-hole radar tomograms are used as a training image. The training image is scanned with a direct sampling algorithm for patterns in the conditioning tomogram, while accounting for the spatially varying resolution of the tomograms. In a post-processing step, only those conditional simulations that predicted the radar traveltimes within the expected data error levels are accepted. The methodology is demonstrated on a two-facies example featuring channels and an aquifer analog of alluvial sedimentary structures with five facies. For both cases, MPS simulations exhibit the sharp interfaces and the geological patterns found in the training image. Compared to unconditioned MPS simulations, the uncertainty in transport predictions is markedly decreased for simulations conditioned to tomograms. As an improvement to other approaches relying on classical smoothness-constrained geophysical tomography, the proposed method allows for: (1) reproduction of sharp interfaces, (2) incorporation of realistic geological constraints and (3) generation of multiple realizations that enables uncertainty assessment.
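A minimal sketch of the post-processing step described above: forward-simulate traveltimes for each realization and accept only those matching the observations within the expected data error. The linear ray-path operator and the randomly perturbed two-facies realizations are crude stand-ins for the cross-hole radar forward simulation and the conditioned MPS realizations.

```python
# Accept/reject realizations by their fit to observed radar traveltimes.
import numpy as np

rng = np.random.default_rng(6)
n_cells, n_rays, n_real = 400, 120, 50
G = rng.random((n_rays, n_cells)) / n_cells         # toy ray-path operator
truth = rng.integers(0, 2, n_cells).astype(float)   # two-facies "reality"
sigma = 0.005                                       # expected traveltime error
t_obs = G @ truth + rng.normal(scale=sigma, size=n_rays)

# Stand-in realizations: the true model with ~2% of cells flipped
realizations = np.array([np.where(rng.random(n_cells) < 0.02, 1 - truth, truth)
                         for _ in range(n_real)])
rms = np.sqrt(np.mean((realizations @ G.T - t_obs) ** 2, axis=1))
accepted = realizations[rms < 2 * sigma]            # keep fits within error
print(f"accepted {len(accepted)} of {n_real} realizations")
```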