94 results for Rule-Based Classification


Relevance: 30.00%

Abstract:

BACKGROUND: Several studies have established Glioblastoma Multiforme (GBM) prognostic and predictive models based on age and Karnofsky Performance Status (KPS), while very few studies have evaluated the prognostic and predictive significance of preoperative MR imaging. To date, however, there is no simple preoperative GBM classification that also correlates with a highly prognostic genomic signature. We therefore present, for the first time, a biologically relevant and clinically applicable tumor Volume, patient Age, and KPS (VAK) GBM classification that can easily and non-invasively be determined upon patient admission. METHODS: We quantitatively analyzed the volumes of 78 GBM patient MRIs present in The Cancer Imaging Archive (TCIA), corresponding to patients in The Cancer Genome Atlas (TCGA) with VAK annotation. The variables were then combined using a simple 3-point scoring system to form the VAK classification. A validation set (N = 64) from both the TCGA and Rembrandt databases was used to confirm the classification. Transcription factor and genomic correlations were performed using the GenePattern suite and Ingenuity Pathway Analysis. RESULTS: VAK-A and VAK-B classes showed significant median survival differences in the discovery (P = 0.007) and validation sets (P = 0.008). VAK-A is significantly associated with P53 activation, while VAK-B shows significant P53 inhibition. Furthermore, a molecular signature comprising 25 genes and microRNAs was significantly associated with the classes and predicted survival in an independent validation set (P = 0.001). A favorable MGMT promoter methylation status conferred an additional survival benefit of 10.5 months for VAK-A compared to VAK-B patients.
CONCLUSIONS: The non-invasively determined VAK classification, with its implication of VAK-specific molecular regulatory networks, can serve as a robust initial prognostic tool, a clinical trial selection criterion, and an important step toward the refinement of genomics-based personalized therapy for GBM patients.
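The 3-point scoring idea can be sketched as follows. The abstract does not give the actual cut-offs, so the thresholds below are purely illustrative assumptions, not the published VAK criteria:

```python
def vak_score(volume_cm3, age_years, kps):
    """Toy 3-point Volume/Age/KPS score. All three cut-offs below are
    hypothetical placeholders, not the published VAK thresholds."""
    score = 0
    if volume_cm3 > 30:   # hypothetical tumor-volume cut-off
        score += 1
    if age_years > 60:    # hypothetical age cut-off
        score += 1
    if kps < 80:          # hypothetical KPS cut-off
        score += 1
    return score, ("VAK-A" if score <= 1 else "VAK-B")
```

Each of the three admission variables contributes at most one point, which is what makes a classification of this kind easy to determine non-invasively at admission.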

Relevance: 30.00%

Abstract:

The paper deals with the development and application of a generic methodology for the automatic processing (mapping and classification) of environmental data. The General Regression Neural Network (GRNN) is considered in detail and is proposed as an efficient tool for spatial data mapping (regression). The Probabilistic Neural Network (PNN) is considered as an automatic tool for spatial classification. The automatic tuning of isotropic and anisotropic GRNN/PNN models using a cross-validation procedure is presented. Results are compared with the k-Nearest-Neighbours (k-NN) interpolation algorithm using an independent validation data set. Real case studies are based on decision-oriented mapping and classification of radioactively contaminated territories.
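As a rough sketch of the GRNN idea, assuming the standard formulation (a Gaussian-kernel-weighted average of training targets, with the kernel width sigma being the parameter tuned by cross-validation):

```python
import numpy as np

def grnn_predict(x_train, y_train, x_query, sigma=1.0):
    """GRNN point prediction: a Gaussian-weighted average of training
    targets. sigma is the bandwidth tuned by cross-validation (anisotropic
    variants use a separate bandwidth per input dimension)."""
    d2 = np.sum((x_train - x_query) ** 2, axis=1)  # squared distances to query
    w = np.exp(-d2 / (2.0 * sigma ** 2))           # Gaussian kernel weights
    return float(np.sum(w * y_train) / np.sum(w))
```

A PNN works analogously for classification: kernel densities are summed per class and the query point is assigned to the class with the largest sum.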

Relevance: 30.00%

Abstract:

Colorectal cancer (CRC) is a major cause of cancer mortality. Whereas some patients respond well to therapy, others do not, and thus more precise, individualized treatment strategies are needed. To that end, we analyzed gene expression profiles from 1,290 CRC tumors using consensus-based unsupervised clustering. The resultant clusters were then associated with therapeutic response data to the epidermal growth factor receptor-targeted drug cetuximab in 80 patients. The results of these studies define six clinically relevant CRC subtypes. Each subtype shares similarities with distinct cell types within the normal colon crypt and shows differing degrees of 'stemness' and Wnt signaling. Subtype-specific gene signatures are proposed to identify these subtypes. Three subtypes have markedly better disease-free survival (DFS) after surgical resection, suggesting these patients might be spared the adverse effects of chemotherapy when they have localized disease. One of these three subtypes, identified by filamin A expression, does not respond to cetuximab but may respond to cMET receptor tyrosine kinase inhibitors in the metastatic setting. Two other subtypes, with poor and intermediate DFS, are associated with improved response to the chemotherapy regimen FOLFIRI in adjuvant or metastatic settings. Development of clinically deployable assays for these subtypes, and of subtype-specific therapies, may contribute to more effective management of this challenging disease.

Relevance: 30.00%

Abstract:

Advanced kernel methods for remote sensing image classification. Devis Tuia, Institut de Géomatique et d'Analyse du Risque, September 2009.

The technical developments of recent years have brought the quantity and quality of digital information to an unprecedented level, as enormous archives of satellite images are available to users. However, even if these advances open more and more possibilities in the use of digital imagery, they also raise several problems of storage and processing. The latter is considered in this Thesis: the processing of very high spatial and spectral resolution images is treated with data-driven approaches relying on kernel methods. In particular, the problem of image classification, i.e. the categorization of an image's pixels into a reduced number of classes reflecting the spectral and contextual properties of the objects they represent, is studied through the different models presented. Emphasis is placed on algorithmic efficiency and on the simplicity of the proposed approaches, avoiding overly complex models that practitioners would not use. The major challenge of the Thesis is to remain close to concrete remote sensing problems without losing the methodological interest from the machine learning viewpoint: in this sense, the work aims at building a bridge between the machine learning and remote sensing communities, and all the models proposed have been developed keeping in mind the need for such a synergy. Four models are proposed. First, an adaptive model learning the relevant image features is proposed to solve the problem of the high dimensionality and collinearity of the image features; it automatically provides an accurate classifier together with a ranking of the relevance of the single features (the spectral bands). The scarcity and unreliability of labeled information are the common root of the second and third models: when confronted with such problems, the user can either construct the labeled set iteratively by direct interaction with the machine or use the unlabeled data to increase the robustness of the model and the quality of the data description. Both solutions have been explored, resulting in two methodological contributions based respectively on active learning and semi-supervised learning. Finally, the more theoretical issue of structured outputs, so far never considered in remote sensing, is addressed in the last model, which, by integrating output similarity into the model, opens new challenges and opportunities for remote sensing image processing.
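All of the models above share the same kernel building block. As a generic illustration (not code from the thesis), the Gaussian (RBF) kernel matrix computed between pixel feature vectors looks like this:

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    """RBF kernel matrix K[i, j] = exp(-gamma * ||x_i - z_j||^2)
    between the rows of X and the rows of Z."""
    d2 = (np.sum(X ** 2, axis=1)[:, None]
          + np.sum(Z ** 2, axis=1)[None, :]
          - 2.0 * X @ Z.T)
    return np.exp(-gamma * np.maximum(d2, 0.0))  # clamp tiny negative rounding
```

Band ranking of the kind described above then amounts to weighting the input dimensions inside this distance and optimising those weights jointly with the classifier.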

Relevance: 30.00%

Abstract:

More than 60% of neuroendocrine tumours, also called carcinoids, are localised within the gastrointestinal tract. Small bowel neuroendocrine tumours have been diagnosed with increasing frequency over the past 35 years and are the second most frequent tumours of the small intestine. The diagnosis of ileal neuroendocrine tumours is typically late because patients present with non-specific symptoms. We present the case of a patient as an illustrative example and, on that basis, briefly review the literature on small bowel neuroendocrine tumours, summarizing several recent changes in the field concerning the classification criteria of these tumours, new recommendations, and current advances in diagnosis and treatment. The patient came to our emergency department with a complete bowel obstruction and a 2-year history of peristaltic abdominal pain, vomiting, and episodes of diarrhoea. During emergency laparotomy, an ileal stricture was observed, which proved to be a neuroendocrine tumour of the small bowel.

Relevance: 30.00%

Abstract:

For several years, the lack of consensus on the definition, nomenclature, natural history, and biology of serrated polyps (SPs) of the colon has created considerable confusion among pathologists. According to the latest WHO classification, the family of SPs comprises hyperplastic polyps (HPs), sessile serrated adenomas/polyps (SSA/Ps), and traditional serrated adenomas (TSAs). The term SSA/P with dysplasia has replaced the category of mixed hyperplastic/adenomatous polyps (MPs). The present study aimed to evaluate the reproducibility of the diagnosis of SPs based on currently available diagnostic criteria and interactive consensus development. In an initial round, H&E slides of 70 cases of SPs were circulated among participating pathologists across Europe. This round was followed by a consensus discussion on diagnostic criteria. A second round was performed on the same 70 cases using the revised criteria and definitions according to the recent WHO classification. Data were evaluated for inter-observer agreement using kappa statistics. In the initial round, for the total of 70 cases, a fair overall kappa value of 0.318 was reached, while in the second round the overall kappa value improved to moderate (kappa = 0.557; p < 0.001). Overall kappa values for each diagnostic category also significantly improved in the final round, reaching 0.977 for HP, 0.912 for SSA/P, and 0.845 for TSA (p < 0.001). The diagnostic reproducibility of SPs improves when strictly defined, standardized diagnostic criteria adopted by consensus are applied.
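Inter-observer agreement of this kind is typically quantified with Cohen's kappa, which corrects the observed agreement for the agreement expected by chance (a generic two-rater sketch; the study itself pooled multiple pathologists):

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Cohen's kappa = (p_o - p_e) / (1 - p_e) for two raters over the
    same cases; p_o is observed agreement, p_e is chance agreement
    estimated from each rater's marginal category frequencies."""
    n = len(rater1)
    p_o = sum(a == b for a, b in zip(rater1, rater2)) / n
    c1, c2 = Counter(rater1), Counter(rater2)
    p_e = sum((c1[c] / n) * (c2[c] / n) for c in set(c1) | set(c2))
    return (p_o - p_e) / (1 - p_e)
```

On the commonly used Landis and Koch scale, values in the 0.21-0.40 range are read as "fair" and 0.41-0.60 as "moderate", matching the qualitative labels quoted above.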

Relevance: 30.00%

Abstract:

Ubiquitous computing is the emerging trend in computing systems. Based on this observation, this thesis proposes an analysis of the hardware and environmental constraints that govern pervasive platforms. These constraints have a strong impact on the programming of such platforms, so solutions are proposed to facilitate this programming both at the platform and node levels. The first contribution presented in this document combines agent-oriented programming with the principles of bio-inspiration (phylogenesis, ontogenesis, and epigenesis) to program pervasive platforms such as the PERvasive computing framework for modeling comPLEX virtually Unbounded Systems platform. The second contribution proposes a method for efficiently programming parallelizable applications on each computing node of this platform.

Relevance: 30.00%

Abstract:

The use of multiple legal and illegal substances by adolescents is a growing concern in all countries, but since no consensus on a taxonomy has yet emerged, it is difficult to understand the different patterns of consumption and to implement tailored prevention and treatment programs directed towards specific subgroups of the adolescent population. Using data from a Swiss survey on adolescent health, we analyzed the age at which ten legal and illegal substances were first consumed, applying a method that combines the strengths of automatic clustering and expert knowledge of substance use. Results were then compared against 30 socio-economic factors to establish the usefulness of the taxonomy and to validate it. We also analyzed the order of first substance use for each group. The final taxonomy consists of eight groups ranging from non-consumers to heavy drug addicts. All but four socio-economic factors were significantly associated with the taxonomy, the strongest associations being observed with health, behavior, and sexuality factors. Numerous factors influence adolescents in their decision to first try substances or to use them regularly, and no single factor can be considered an absolute marker of problematic substance use. Different processes of experimentation with substances are associated with different behaviors; focusing on only one substance or only one factor is therefore not efficient. Prevention and treatment programs can then be tailored to address specific issues related to different youth subgroups.

Relevance: 30.00%

Abstract:

Predictive groundwater modeling requires accurate information about aquifer characteristics. Geophysical imaging is a powerful tool for delineating aquifer properties at an appropriate scale and resolution, but it suffers from problems of ambiguity. One way to overcome such limitations is to adopt a simultaneous multitechnique inversion strategy. We have developed a methodology for aquifer characterization based on structural joint inversion of multiple geophysical data sets followed by clustering to form zones and subsequent inversion for zonal parameters. Joint inversions based on cross-gradient structural constraints require less restrictive assumptions than, say, applying predefined petrophysical relationships and generally yield superior results. This approach has, for the first time, been applied to three geophysical data types in three dimensions. A classification scheme using maximum likelihood estimation is used to determine the parameters of a Gaussian mixture model that defines zonal geometries from joint-inversion tomograms. The resulting zones are used to estimate representative geophysical parameters of each zone, which are then used for field-scale petrophysical analysis. A synthetic study demonstrated how joint inversion of seismic and radar traveltimes and electrical resistance tomography (ERT) data greatly reduces misclassification of zones (down from 21.3% to 3.7%) and improves the accuracy of retrieved zonal parameters (from 1.8% to 0.3%) compared to individual inversions. We applied our scheme to a data set collected in northeastern Switzerland to delineate lithologic subunits within a gravel aquifer. The inversion models resolve three principal subhorizontal units along with some important 3D heterogeneity. Petrophysical analysis of the zonal parameters indicated approximately 30% variation in porosity within the gravel aquifer and an increasing fraction of finer sediments with depth.
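The maximum-likelihood classification step can be sketched as follows, assuming for illustration a single geophysical parameter per cell and known Gaussian parameters per zone (e.g. taken from a fitted mixture model); the actual study works with multi-parameter tomograms in 3-D:

```python
import numpy as np

def classify_ml(values, zone_means, zone_stds):
    """Assign each sample to the zone whose 1-D Gaussian density is
    highest. The comparison is done in log-likelihood, where the
    normalising constant sqrt(2*pi) cancels across zones."""
    v = np.asarray(values, float)[:, None]
    mu = np.asarray(zone_means, float)[None, :]
    sd = np.asarray(zone_stds, float)[None, :]
    loglik = -np.log(sd) - 0.5 * ((v - mu) / sd) ** 2
    return np.argmax(loglik, axis=1)  # zone index per sample
```

With equal standard deviations this reduces to nearest-mean assignment; unequal standard deviations shift the decision boundaries toward the tighter zone.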

Relevance: 30.00%

Abstract:

The paper presents a novel method for monitoring network optimisation based on a recent machine learning technique known as the support vector machine. It is problem-oriented in the sense that it directly answers the question of whether a proposed spatial location is important for the classification model. The method can be used to increase the accuracy of classification models by taking a small number of additional measurements. Traditionally, network optimisation is performed by analysing kriging variances. The method is compared with the traditional approach in a real case study with climate data.

Relevance: 30.00%

Abstract:

Voxel-based morphometry from conventional T1-weighted images has proved effective in quantifying Alzheimer's disease (AD) related brain atrophy and enables fairly accurate automated classification of AD patients, patients with mild cognitive impairment (MCI), and elderly controls. Little is known, however, about the classification power of volume-based morphometry, where the features of interest consist of a few brain structure volumes (e.g. hippocampi, lobes, ventricles) as opposed to hundreds of thousands of voxel-wise gray matter concentrations. In this work, we experimentally evaluate two distinct volume-based morphometry algorithms (FreeSurfer and an in-house algorithm called MorphoBox) for automatic disease classification on a standardized data set from the Alzheimer's Disease Neuroimaging Initiative. Results indicate that both algorithms achieve classification accuracy comparable to the conventional whole-brain voxel-based morphometry pipeline using SPM for AD vs elderly controls and for MCI vs controls, and higher accuracy for AD vs MCI and for early vs late AD converters, demonstrating the potential of volume-based morphometry to assist the diagnosis of mild cognitive impairment and Alzheimer's disease.

Relevance: 30.00%

Abstract:

New evidence shows that older adults need more dietary protein than do younger adults to support good health, promote recovery from illness, and maintain functionality. Older people need to make up for age-related changes in protein metabolism, such as high splanchnic extraction and declining anabolic responses to ingested protein. They also need more protein to offset inflammatory and catabolic conditions associated with chronic and acute diseases that occur commonly with aging. With the goal of developing updated, evidence-based recommendations for optimal protein intake by older people, the European Union Geriatric Medicine Society (EUGMS), in cooperation with other scientific organizations, appointed an international study group to review dietary protein needs with aging (PROT-AGE Study Group). To help older people (>65 years) maintain and regain lean body mass and function, the PROT-AGE study group recommends average daily intake at least in the range of 1.0 to 1.2 g protein per kilogram of body weight per day. Both endurance- and resistance-type exercises are recommended at individualized levels that are safe and tolerated, and higher protein intake (ie, ≥1.2 g/kg body weight/d) is advised for those who are exercising and otherwise active. Most older adults who have acute or chronic diseases need even more dietary protein (ie, 1.2-1.5 g/kg body weight/d). Older people with severe kidney disease (ie, estimated GFR <30 mL/min/1.73 m²), but who are not on dialysis, are an exception to this rule; these individuals may need to limit protein intake. Protein quality, timing of ingestion, and intake of other nutritional supplements may be relevant, but evidence is not yet sufficient to support specific recommendations. Older people are vulnerable to losses in physical function capacity, and such losses predict loss of independence, falls, and even mortality.
Thus, future studies aimed at pinpointing optimal protein intake in specific populations of older people need to include measures of physical function.
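The per-kilogram recommendations translate into absolute daily targets straightforwardly. A small helper (illustrative only, not clinical advice; the severe-kidney-disease exception noted above still applies):

```python
def daily_protein_range_g(weight_kg, condition="healthy"):
    """Daily protein target in grams, using the PROT-AGE ranges quoted
    above: 1.0-1.2 g/kg/d for healthy older adults, 1.2-1.5 g/kg/d with
    acute or chronic illness."""
    per_kg = {"healthy": (1.0, 1.2), "ill": (1.2, 1.5)}[condition]
    return (weight_kg * per_kg[0], weight_kg * per_kg[1])
```

For example, a healthy 70 kg older adult lands in the 70-84 g/day range.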

Relevance: 30.00%

Abstract:

Background: The Pulmonary Embolism Rule-out Criteria (PERC) rule is a clinical diagnostic rule designed to exclude pulmonary embolism (PE) without further testing. We sought to externally validate the diagnostic performance of the PERC rule alone and combined with clinical probability assessment based on the revised Geneva score. Methods: The PERC rule was applied retrospectively to consecutive patients who presented with a clinical suspicion of PE to six emergency departments, and who were enrolled in a randomized trial of PE diagnosis. Patients who met all eight PERC criteria [PERC(-)] were considered to be at a very low risk for PE. We calculated the prevalence of PE among PERC(-) patients according to their clinical pretest probability of PE. We estimated the negative likelihood ratio of the PERC rule to predict PE. Results: Among 1675 patients, the prevalence of PE was 21.3%. Overall, 13.2% of patients were PERC(-). The prevalence of PE was 5.4% [95% confidence interval (CI): 3.1-9.3%] among PERC(-) patients overall and 6.4% (95% CI: 3.7-10.8%) among those PERC(-) patients with a low clinical pretest probability of PE. The PERC rule had a negative likelihood ratio of 0.70 (95% CI: 0.67-0.73) for predicting PE overall, and 0.63 (95% CI: 0.38-1.06) in low-risk patients. Conclusions: Our results suggest that the PERC rule alone or even when combined with the revised Geneva score cannot safely identify very low risk patients in whom PE can be ruled out without additional testing, at least in populations with a relatively high prevalence of PE.
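The reasoning behind the conclusion can be made explicit with Bayes' rule on the odds scale: a negative likelihood ratio of 0.70 barely moves a 21.3% pre-test probability, which is why the rule cannot safely exclude PE in this population. A minimal sketch:

```python
def post_test_probability(pre_test_prob, likelihood_ratio):
    """Post-test probability via Bayes' rule on the odds scale:
    post-test odds = pre-test odds * likelihood ratio."""
    pre_odds = pre_test_prob / (1.0 - pre_test_prob)
    post_odds = pre_odds * likelihood_ratio
    return post_odds / (1.0 + post_odds)
```

With the study's overall figures (pre-test probability 0.213, negative likelihood ratio 0.70), the residual probability of PE after a negative PERC rule is still roughly 16%, far above any acceptable rule-out threshold.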

Relevance: 30.00%

Abstract:

Although cross-sectional diffusion tensor imaging (DTI) studies have revealed significant white matter changes in mild cognitive impairment (MCI), the utility of this technique in predicting further cognitive decline is debated. Thirty-five healthy controls (HC) and 67 MCI subjects with DTI baseline data were neuropsychologically assessed at one year. Among them, 40 were stable (sMCI; 9 single-domain amnestic, 7 single-domain frontal, 24 multiple-domain) and 27 progressive (pMCI; 7 single-domain amnestic, 4 single-domain frontal, 16 multiple-domain). Fractional anisotropy (FA) and longitudinal, radial, and mean diffusivity were measured using Tract-Based Spatial Statistics. Statistics included group comparisons and individual classification of MCI cases using support vector machines (SVM). FA was significantly higher in HC compared to MCI in a distributed network including the ventral part of the corpus callosum and right temporal and frontal pathways. There were no significant group-level differences between sMCI and pMCI or between MCI subtypes after correction for multiple comparisons. However, SVM analysis allowed for individual classification with accuracies up to 91.4% (HC versus MCI) and 98.4% (sMCI versus pMCI). When considering the MCI subgroups separately, the minimum SVM classification accuracy for stable versus progressive cognitive decline was 97.5%, in the multiple-domain MCI group. SVM analysis of DTI data provided highly accurate individual classification of stable versus progressive MCI regardless of MCI subtype, indicating that this method may become an easily applicable tool for early individual detection of MCI subjects evolving to dementia.

Relevance: 30.00%

Abstract:

Tire traces can be observed at many crime scenes, as vehicles are often used by criminals. Tread abrasion on the road while braking or skidding produces small rubber particles which can be collected for comparison purposes. This research focused on the statistical comparison of Py-GC/MS profiles of tire traces and tire treads. The optimisation of the analytical method was carried out using experimental designs, the aim being to determine the pyrolysis parameters that give the best repeatability of the results; the effect of each pyrolysis factor could thus also be calculated. The pyrolysis temperature was found to be five times more influential than the pyrolysis time. Finally, pyrolysis at 650 °C for 15 s was selected. Ten tires of different manufacturers and models were used for this study. Several samples were collected from each tire, and several replicates were carried out to study the variability within each tire (intravariability). More than eighty compounds were integrated for each analysis, and the variability study showed that more than 75% of them presented a relative standard deviation (RSD) below 5% across the ten tires, supporting a low intravariability. The variability between the ten tires (intervariability) was higher: the ten most variable compounds had an RSD above 13%, supporting their high potential for discriminating between the tires tested. Principal Component Analysis (PCA) was able to fully discriminate the ten tires using the first three principal components. The ten tires were finally used to perform braking tests on a racetrack with a vehicle equipped with an anti-lock braking system. The resulting tire traces were collected using sheets of white gelatine. As with the tires, the intravariability of the traces was found to be lower than the intervariability.
Clustering methods were then applied, and Ward's method based on the squared Euclidean distance correctly grouped all tire-trace replicates in the same cluster as the replicates of their corresponding tire. Blind tests on traces were performed, and the traces were correctly assigned to their source tire. These results support the hypothesis that the tested tires, of different manufacturers and models, can be discriminated by statistical comparison of their chemical profiles. The traces were indistinguishable from their source tire but differentiable from all the other tires in the subset. These promising results will be extended to a larger sample set.
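The repeatability measure used throughout this entry, the relative standard deviation, is simply the standard deviation expressed as a percentage of the mean. A minimal helper:

```python
import statistics

def rsd_percent(values):
    """Relative standard deviation (coefficient of variation) in percent:
    100 * sample standard deviation / mean."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)
```

Under the 5% intravariability reported above, replicate peak areas of a compound vary by only a few percent around their mean, while the most discriminating compounds exceed 13% RSD between tires.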