915 results for Classification of Banach spaces
Abstract:
The Chicago Classification of esophageal motility was developed to facilitate the interpretation of clinical high resolution esophageal pressure topography (EPT) studies, concurrent with the widespread adoption of this technology into clinical practice. The Chicago Classification has been an evolutionary process, molded first by published evidence pertinent to the clinical interpretation of high resolution manometry (HRM) studies and secondarily by group experience when suitable evidence is lacking.
Abstract:
The delineation of shifting cultivation landscapes using remote sensing in mountainous regions is challenging. On the one hand, it is difficult to distinguish the forest and fallow forest classes that occur in a shifting cultivation landscape in mountainous regions. On the other hand, the dynamic nature of the shifting cultivation system poses problems for the delineation of landscapes where shifting cultivation occurs. We present a two-step approach based on an object-oriented classification of Advanced Land Observing Satellite, Advanced Visible and Near-Infrared Spectrometer (ALOS AVNIR) and Panchromatic Remote-sensing Instrument for Stereo Mapping (ALOS PRISM) data and landscape metrics. When texture measures were included in the object-oriented classification, the accuracy of the forest and fallow forest classes could be increased substantially. Based on such a classification, landscape metrics in the form of land cover class ratios enabled the identification of the crop-fallow rotation characteristics of the shifting cultivation land use practice. By classifying and combining these landscape metrics, shifting cultivation landscapes could be delineated using a single land cover dataset.
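As a rough illustration of the landscape-metric step only (not the authors' actual ALOS-based workflow), the sketch below computes land cover class ratios in blocks of a classified raster; the class codes, block size and the demo data are hypothetical.

    import numpy as np

    # Hypothetical class codes for a classified land cover raster;
    # the actual classification scheme of the study is not reproduced here.
    FOREST, FALLOW, CROP = 1, 2, 3

    def class_ratios(land_cover, window=25):
        """Compute per-block land cover class ratios (fallow:forest, crop:forest)
        as simple landscape metrics over a classified raster."""
        rows, cols = land_cover.shape
        metrics = []
        for r in range(0, rows - window + 1, window):
            for c in range(0, cols - window + 1, window):
                block = land_cover[r:r + window, c:c + window]
                n_forest = np.count_nonzero(block == FOREST)
                n_fallow = np.count_nonzero(block == FALLOW)
                n_crop = np.count_nonzero(block == CROP)
                # High fallow/crop ratios flag the crop-fallow rotation
                # typical of shifting cultivation.
                fallow_ratio = n_fallow / max(n_forest, 1)
                crop_ratio = n_crop / max(n_forest, 1)
                metrics.append((r, c, fallow_ratio, crop_ratio))
        return metrics

    # Demo on a random 100x100 classified map.
    rng = np.random.default_rng(0)
    demo = rng.integers(1, 4, size=(100, 100))
    print(class_ratios(demo, window=50))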
Abstract:
PURPOSE: To develop and implement a method for improved cerebellar tissue classification on brain MRI by automatically isolating the cerebellum prior to segmentation. MATERIALS AND METHODS: Dual fast spin echo (FSE) and fluid-attenuated inversion recovery (FLAIR) images were acquired from 18 normal volunteers on a 3 T Philips scanner. The cerebellum was isolated from the rest of the brain using a symmetric inverse consistent nonlinear registration of the individual brain with the parcellated template. The cerebellum was then separated by masking the anatomical image with the individual FLAIR images. Tissues in the cerebellum and in the rest of the brain were classified separately using a hidden Markov random field (HMRF), a parametric method, and then combined to obtain a tissue classification of the whole brain. The proposed method for tissue classification on real MR brain images was evaluated subjectively by two experts. The segmentation results on BrainWeb images with varying noise and intensity nonuniformity levels were quantitatively compared with the ground truth by computing Dice similarity indices. RESULTS: The proposed method significantly improved cerebellar tissue classification in all normal volunteers included in this study without compromising the classification in the remaining part of the brain. The average similarity indices for gray matter (GM) and white matter (WM) in the cerebellum are 89.81 (+/-2.34) and 93.04 (+/-2.41), demonstrating excellent performance of the proposed methodology. CONCLUSION: The proposed method significantly improved tissue classification in the cerebellum. The GM was overestimated when segmentation was performed on the whole brain as a single object.
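For reference, the Dice similarity index used in the quantitative evaluation is the standard overlap measure

\[
\mathrm{Dice}(A,B) = \frac{2\,|A \cap B|}{|A| + |B|},
\]

where A is the set of voxels assigned to a tissue class by the method and B the corresponding ground-truth voxels; a value of 1 (here reported on a 0-100 scale) indicates perfect overlap.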
Abstract:
This study subdivides Potter Cove, King George Island, Antarctica, into seafloor regions using multivariate statistical methods. These regions serve as categories for comparing, contrasting and quantifying biogeochemical processes and biodiversity between ocean regions, both geographically and for regions undergoing change in the context of global change. The division obtained is characterized by the dominating components and interpreted in terms of the ruling environmental conditions. The analysis includes in total 42 different environmental variables, interpolated from samples taken during the austral summer seasons 2010/2011 and 2011/2012. The statistical errors of several interpolation methods (e.g. IDW, Indicator, Ordinary and Co-Kriging) with varying settings were compared and the most reasonable method was applied. The multivariate procedures used are regionalized classification via k-means cluster analysis, canonical correlation analysis and multidimensional scaling. Canonical correlation analysis identifies the influencing factors in the different parts of the cove. Several methods for identifying the optimum number of clusters were tested, and 4, 7, 10 and 12 were identified as reasonable numbers of clusters for the Potter Cove. In particular, the results for 10 and 12 clusters identify marine-influenced regions that can be clearly separated from those determined by the geological catchment area and from those dominated by river discharge.
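As a minimal sketch of the cluster-number question only (not the study's full pipeline, which also uses canonical correlation analysis and multidimensional scaling), the snippet below runs k-means for the candidate cluster counts reported in the abstract and compares silhouette scores. The data matrix X is synthetic, and the use of the silhouette criterion here is an assumption, since the abstract does not name the specific selection methods tested.

    import numpy as np
    from sklearn.preprocessing import StandardScaler
    from sklearn.cluster import KMeans
    from sklearn.metrics import silhouette_score

    # X: hypothetical matrix of interpolated environmental variables,
    # one row per grid cell, one column per variable (42 in the study).
    rng = np.random.default_rng(1)
    X = rng.normal(size=(500, 42))

    X_std = StandardScaler().fit_transform(X)

    # Compare candidate cluster numbers; the study found 4, 7, 10 and 12 reasonable.
    for k in (4, 7, 10, 12):
        labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X_std)
        print(k, round(silhouette_score(X_std, labels), 3))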
Abstract:
The article shows a range of contemporary phenomena linked with urban space and the increasing citizens' interactivity in the network. The sources for theory and reflection are related to the ongoing research project "Interactive Atlas of urban habitability", which is based on citizen participation in the sensitive description of the urban environment. It addresses a classification of variables related to the desires of urban habitability.
Abstract:
Light Detection and Ranging (LIDAR) provides high horizontal and vertical resolution of spatial data organized in point clouds, and is increasingly being used in a number of applications and disciplines, which have concentrated on the exploitation and manipulation of the data using mainly its three-dimensional nature. Bathymetric LIDAR systems and data are mainly focused on mapping depths in shallow and clear waters with a high degree of accuracy. Additionally, the backscattering produced by the different materials distributed over the bottom surface means that the returned intensity signal contains important information about the reflection properties of these materials. Processing these values appropriately using a Simplified Radiative Transfer Model allows the identification of different sea bottom types. This paper presents an original method for the classification of the sea bottom by processing information extracted from the images generated from LIDAR data. The results are validated using a vector database containing benthic information derived from marine surveys.
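The paper's Simplified Radiative Transfer Model is not spelled out in the abstract. As a greatly simplified illustration of the general idea only, the sketch below removes two-way water-column attenuation from the bottom-return intensity with an exponential model and assigns crude bottom-type labels by thresholding; the attenuation coefficient, thresholds, class names and demo values are all hypothetical.

    import numpy as np

    def corrected_reflectance(intensity, depth, kd=0.15):
        """Remove two-way water-column attenuation from bottom-return intensity
        (simplified exponential model, not the paper's full formulation)."""
        return intensity * np.exp(2.0 * kd * depth)

    # Hypothetical LIDAR returns: intensity and depth per point.
    intensity = np.array([0.12, 0.30, 0.05, 0.22])
    depth = np.array([3.0, 1.5, 6.0, 2.0])
    refl = corrected_reflectance(intensity, depth)

    # Crude bottom-type assignment by reflectance thresholds (illustrative only):
    # 0 = low reflectance, 1 = intermediate, 2 = high reflectance.
    labels = np.digitize(refl, bins=[0.2, 0.5])
    print(refl.round(3), labels)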
Abstract:
Context: Replication plays an important role in experimental disciplines. There are still many uncertainties about how to proceed with replications of SE experiments. Should replicators reuse the baseline experiment materials? How much liaison should there be among the original and replicating experimenters, if any? What elements of the experimental configuration can be changed for the experiment to be considered a replication rather than a new experiment? Objective: To improve our understanding of SE experiment replication, in this work we propose a classification which is intended to provide experimenters with guidance about what types of replication they can perform. Method: The research approach followed is structured according to the following activities: (1) a literature review of experiment replication in SE and in other disciplines, (2) identification of typical elements that compose an experimental configuration, (3) identification of different replication purposes and (4) development of a classification of experiment replications for SE. Results: We propose a classification of replications which provides experimenters in SE with guidance about what changes they can make in a replication and, based on these, what verification purposes such a replication can serve. The proposed classification helped to accommodate opposing views within a broader framework, and it is capable of accounting for replications ranging from less similar to more similar with respect to the baseline experiment. Conclusion: The aim of replication is to verify results, but different types of replication serve special verification purposes and afford different degrees of change. Each replication type helps to discover particular experimental conditions that might influence the results. The proposed classification can be used to identify changes in a replication and, based on these, understand the level of verification.
Abstract:
The original motivation for this paper was to provide an efficient quantitative analysis of convex infinite (or semi-infinite) inequality systems whose decision variables run over general infinite-dimensional (resp. finite-dimensional) Banach spaces and that are indexed by an arbitrary fixed set J. Parameter perturbations on the right-hand side of the inequalities are required to be merely bounded, and thus the natural parameter space is l∞(J). Our basic strategy consists of linearizing the parameterized convex system by splitting the convex inequalities into linear ones using the Fenchel–Legendre conjugate. With this approach, arbitrary bounded right-hand side perturbations of the convex system become constant-by-blocks perturbations of the linearized system. Based on advanced variational analysis, we derive a precise formula for computing the exact Lipschitzian bound of the feasible solution map of block-perturbed linear systems, which involves only the system's data, and then show that this exact bound agrees with the coderivative norm of the aforementioned mapping. In this way we extend to the convex setting the results of Cánovas et al. (SIAM J. Optim. 20, 1504–1526, 2009), developed for arbitrary perturbations with no block structure in the linear framework under the boundedness assumption on the system's coefficients. The latter boundedness assumption is removed in this paper when the decision space is reflexive. The last section provides the intended application to the convex case.
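For orientation, the Fenchel–Legendre conjugate underlying the linearization is the standard construction: for a proper convex function f on a Banach space X,

\[
f^{*}(x^{*}) \;=\; \sup_{x \in X}\bigl\{\langle x^{*},x\rangle - f(x)\bigr\}, \qquad x^{*}\in X^{*},
\]

so that, by the Fenchel–Moreau theorem, each lower semicontinuous convex inequality f_j(x) \le b_j can be split into the family of linear inequalities \langle x^{*},x\rangle - f_j^{*}(x^{*}) \le b_j, x^{*}\in \operatorname{dom} f_j^{*}.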
Abstract:
We consider quasi-Newton methods for generalized equations in Banach spaces under metric regularity and give a sufficient condition for q-linear convergence. Then we show that the well-known Broyden update satisfies this sufficient condition in Hilbert spaces. We also establish various modes of q-superlinear convergence of the Broyden update under strong metric subregularity, metric regularity and strong metric regularity. In particular, we show that the Broyden update applied to a generalized equation in Hilbert spaces satisfies the Dennis–Moré condition for q-superlinear convergence. Simple numerical examples illustrate the results.
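For reference, in the smooth single-valued case the Broyden update and the classical Dennis–Moré condition take the following forms (the paper works with the corresponding versions for generalized equations):

\[
B_{k+1} \;=\; B_k + \frac{\bigl(y_k - B_k s_k\bigr)\,\langle s_k,\cdot\rangle}{\langle s_k, s_k\rangle},
\qquad s_k = x_{k+1}-x_k,\quad y_k = f(x_{k+1})-f(x_k),
\]

\[
\lim_{k\to\infty}\frac{\bigl\|\bigl(B_k - Df(\bar x)\bigr)s_k\bigr\|}{\|s_k\|} \;=\; 0,
\]

where \bar x is the solution; the second relation is the classical characterization of q-superlinear convergence of quasi-Newton iterates.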
Abstract:
Systematic protocols that use decision rules or scores are seen to improve consistency and transparency in classifying the conservation status of species. When applying these protocols, assessors are typically required to decide on estimates for attributes that are inherently uncertain. Input data and resulting classifications are usually treated as though they are exact and hence without operator error. We investigated the impact of data interpretation on the consistency of extinction risk classification protocols and diagnosed the causes of discrepancies when they occurred. We tested three widely used systematic classification protocols employed by the World Conservation Union, NatureServe, and the Florida Fish and Wildlife Conservation Commission. We provided 18 assessors with identical information for 13 different species to infer estimates for each of the required parameters for the three protocols. The threat classification of several of the species varied from low risk to high risk, depending on who did the assessment. This occurred across the three protocols investigated. Assessors tended to agree on their placement of species in the highest (50-70%) and lowest (20-40%) risk categories, but there was poor agreement on which species should be placed in the intermediate categories. Furthermore, the correspondence between the three classification methods was unpredictable, with large variation among assessors. These results highlight the importance of peer review and consensus among multiple assessors in species classifications and the need to be cautious with assessments carried out by a single assessor. Greater consistency among assessors requires wide use of training manuals and formal methods for estimating parameters that allow uncertainties to be represented, carried through chains of calculations, and reported transparently.
Abstract:
Neural networks are statistical models and learning rules are estimators. In this paper a theory for measuring generalisation is developed by combining Bayesian decision theory with information geometry. The performance of an estimator is measured by the information divergence between the true distribution and the estimate, averaged over the Bayesian posterior. This unifies the majority of error measures currently in use. The optimal estimators also reveal some intricate interrelationships among information geometry, Banach spaces and sufficient statistics.
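One way to write the performance criterion described here (with the Kullback–Leibler divergence as the most familiar instance of an information divergence; the notation R and the parametric family p_\theta are illustrative rather than taken from the paper) is

\[
R(\hat p \mid D) \;=\; \mathbb{E}_{\theta \sim p(\theta \mid D)}\bigl[\,D\bigl(p_{\theta}\,\Vert\,\hat p\bigr)\bigr],
\qquad
D_{\mathrm{KL}}(p\,\Vert\,q) \;=\; \int p(x)\,\log\frac{p(x)}{q(x)}\,dx,
\]

i.e. the divergence between the true distribution and the estimate \hat p, averaged over the Bayesian posterior.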
Abstract:
Background - Bipolar disorder (BD) is one of the leading causes of disability worldwide. Patients are further disadvantaged by delays in accurate diagnosis ranging between 5 and 10 years. We applied Gaussian process classifiers (GPCs) to structural magnetic resonance imaging (sMRI) data to evaluate the feasibility of using pattern recognition techniques for the diagnostic classification of patients with BD. Method - GPCs were applied to gray (GM) and white matter (WM) sMRI data derived from two independent samples of patients with BD (cohort 1: n = 26; cohort 2: n = 14). Within each cohort patients were matched on age, sex and IQ to an equal number of healthy controls. Results - The diagnostic accuracy of the GPC for GM was 73% in cohort 1 and 72% in cohort 2; the sensitivity and specificity of the GM classification were respectively 69% and 77% in cohort 1 and 64% and 99% in cohort 2. The diagnostic accuracy of the GPC for WM was 69% in cohort 1 and 78% in cohort 2; the sensitivity and specificity of the WM classification were both 69% in cohort 1 and 71% and 86% respectively in cohort 2. In both samples, GM and WM clusters discriminating between patients and controls were localized within cortical and subcortical structures implicated in BD. Conclusions - Our results demonstrate the predictive value of neuroanatomical data in discriminating patients with BD from healthy individuals. The overlap between discriminative networks and regions implicated in the pathophysiology of BD supports the biological plausibility of the classifiers.
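For reference, the reported performance measures are the standard ones computed from true and false positives and negatives (patients counted as positives, controls as negatives):

\[
\text{sensitivity} = \frac{TP}{TP+FN},\qquad
\text{specificity} = \frac{TN}{TN+FP},\qquad
\text{accuracy} = \frac{TP+TN}{TP+TN+FP+FN}.
\]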
Abstract:
∗ The present article was originally submitted for the second volume of the Murcia Seminar on Functional Analysis (1989). Unfortunately it has not been possible to continue with the Murcia Seminar publication. For historical reasons, the present version corresponds to the original one.
Abstract:
The paper contains calculus rules for coderivatives of compositions, sums and intersections of set-valued mappings. The types of coderivatives considered correspond to Dini-Hadamard and limiting Dini-Hadamard subdifferentials in Gâteaux differentiable spaces, Fréchet and limiting Fréchet subdifferentials in Asplund spaces, and approximate subdifferentials in arbitrary Banach spaces. The key element of the unified approach to obtaining the various calculus rules for the various types of derivatives presented in the paper is a set of simple formulas for subdifferentials of marginal, or performance, functions.
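For orientation, the Fréchet subdifferential and the coderivative construction that such calculus rules concern are the standard ones: for f: X \to \overline{\mathbb{R}} finite at \bar x,

\[
\widehat{\partial} f(\bar x) \;=\; \Bigl\{x^{*}\in X^{*} \;\Bigm|\; \liminf_{x\to\bar x}\frac{f(x)-f(\bar x)-\langle x^{*},x-\bar x\rangle}{\|x-\bar x\|}\;\ge\;0\Bigr\},
\]

and the coderivative of a set-valued mapping F at (\bar x,\bar y)\in \operatorname{gph} F acts on y^{*}\in Y^{*} by

\[
D^{*}F(\bar x,\bar y)(y^{*}) \;=\; \bigl\{x^{*}\in X^{*} \;\bigm|\; (x^{*},-y^{*}) \in N\bigl((\bar x,\bar y);\operatorname{gph} F\bigr)\bigr\},
\]

with the choice of normal cone N (Fréchet, limiting, approximate) determining the type of coderivative.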
Abstract:
* This work was supported by National Science Foundation grant DMS 9404431.