961 results for multiclass classification problems


Relevance: 30.00%

Abstract:

The purpose of this thesis is to develop a robust and powerful method to classify galaxies from large surveys, in order to establish and confirm the connections between the principal observational parameters of galaxies (spectral features, colours, morphological indices), and to help unveil the evolution of these parameters from $z \sim 1$ to the local Universe. Within the framework of the zCOSMOS-bright survey, making use of its large database of objects ($\sim 10\,000$ galaxies in the redshift range $0 < z \lesssim 1.2$) and its highly reliable determinations of redshifts and spectral properties, we first adopt and extend the \emph{classification cube method}, as developed by Mignoli et al. (2009), which exploits the bimodal properties of galaxies (spectral, photometric and morphological) separately and then combines the three subclassifications. We use this classification method as a test for a newly devised statistical classification based on Principal Component Analysis and the Unsupervised Fuzzy Partition clustering method (PCA+UFP), which defines the galaxy population by exploiting its natural global bimodality, considering up to 8 different properties simultaneously. The PCA+UFP analysis is a very powerful and robust tool for probing the nature and evolution of galaxies in a survey. It defines the classification of galaxies with smaller uncertainties and adds the flexibility to be adapted to different parameters: being a fuzzy classification, it avoids the problems of a hard classification such as the classification cube presented in the first part of the thesis. The PCA+UFP method can easily be applied to different datasets: it does not rely on the nature of the data, and for this reason it can be successfully employed with other observables (magnitudes, colours) or derived properties (masses, luminosities, SFRs, etc.). The agreement between the two cluster definitions is very high. ``Early''- and ``late''-type galaxies are well defined by the spectral, photometric and morphological properties, both when these are considered separately and then combined (classification cube) and when they are treated as a whole (PCA+UFP cluster analysis). Differences arise in the definition of outliers: the classification cube is much more sensitive to single measurement errors or misclassifications in one property than the PCA+UFP cluster analysis, in which errors are ``averaged out'' during the process. This method allowed us to observe the \emph{downsizing} effect taking place in the PC spaces: the migration from the blue cloud towards the red clump happens at higher redshifts for galaxies of larger mass. The determination of the transition mass $M_{\mathrm{cross}}$ is in good agreement with other values in the literature.
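A minimal sketch of the PCA-plus-fuzzy-clustering idea described above. Standard fuzzy c-means stands in for the UFP algorithm used in the thesis, and all names, parameters, and the synthetic data are illustrative assumptions, not details from the work itself.

```python
# Sketch: PCA projection followed by fuzzy c-means (a stand-in for UFP).
import numpy as np

def pca(X, n_components=2):
    """Project rows of X onto the leading principal components."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)  # principal axes in Vt
    return Xc @ Vt[:n_components].T

def fuzzy_cmeans(X, c=2, m=2.0, n_iter=100, tol=1e-6, seed=0):
    """Standard fuzzy c-means: returns (centers, soft membership matrix U)."""
    rng = np.random.default_rng(seed)
    U = rng.dirichlet(np.ones(c), size=X.shape[0])  # rows sum to 1
    for _ in range(n_iter):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U_new = 1.0 / (d ** (2 / (m - 1)))
        U_new /= U_new.sum(axis=1, keepdims=True)
        if np.abs(U_new - U).max() < tol:
            return centers, U_new
        U = U_new
    return centers, U

# Synthetic stand-in for 8 galaxy properties with two underlying populations:
X = np.vstack([np.random.randn(500, 8) - 1.5, np.random.randn(500, 8) + 1.5])
Z = pca(X, n_components=2)          # reduce 8 properties to the PC plane
centers, U = fuzzy_cmeans(Z, c=2)   # soft "early"/"late" memberships
```

The soft membership matrix is what makes the approach robust to single-property errors: a galaxy with one bad measurement is pulled only partially toward the wrong cluster rather than being hard-assigned to it.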

Relevance: 30.00%

Abstract:

In this thesis we take the first steps towards the systematic application of a methodology for automatically building formal models of complex biological systems. Such a methodology could also be useful for designing artificial systems possessing desirable properties such as robustness and evolvability. The approach we follow in this thesis is to manipulate formal models by means of adaptive search methods called metaheuristics. In the first part of the thesis we develop state-of-the-art hybrid metaheuristic algorithms to tackle two important problems in genomics, namely Haplotype Inference by parsimony and the Founder Sequence Reconstruction Problem. We compare our algorithms with other effective techniques in the literature, show the strengths and limitations of our approaches for various problem formulations and, finally, propose further enhancements that could improve the performance of our algorithms and widen their applicability. In the second part, we concentrate on Boolean network (BN) models of gene regulatory networks (GRNs). We detail our automatic design methodology and apply it to four use cases which correspond to different design criteria and address some limitations of GRN modelling by BNs. Finally, we tackle the Density Classification Problem with the aim of showing the learning capabilities of BNs. Experimental evaluation of this methodology shows its efficacy in producing networks that meet our design criteria. Our results, consistent with what has been found in other works, also suggest that networks manipulated by a search process exhibit a mixture of characteristics typical of different dynamical regimes.
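For readers unfamiliar with the model class, the sketch below shows a minimal random Boolean network of the kind used as a GRN model: a metaheuristic would manipulate the wiring and truth tables to satisfy a design criterion. Topology, connectivity `k`, and the synchronous update rule here are generic illustrative choices, not those of the thesis.

```python
# Sketch: a random Boolean network with synchronous updates.
import numpy as np

def random_bn(n_nodes=10, k=2, seed=0):
    """Each node reads k random inputs and applies a random Boolean function."""
    rng = np.random.default_rng(seed)
    inputs = rng.integers(0, n_nodes, size=(n_nodes, k))
    tables = rng.integers(0, 2, size=(n_nodes, 2 ** k))  # one truth table per node
    return inputs, tables

def step(state, inputs, tables):
    """Synchronous update: every node fires its Boolean function at once."""
    idx = np.zeros(len(state), dtype=int)
    for j in range(inputs.shape[1]):       # encode each node's inputs as an integer
        idx = (idx << 1) | state[inputs[:, j]]
    return tables[np.arange(len(state)), idx]

inputs, tables = random_bn()
state = np.random.default_rng(1).integers(0, 2, size=10)
for _ in range(20):                        # trajectory toward an attractor
    state = step(state, inputs, tables)
print(state)
```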

Relevance: 30.00%

Abstract:

This thesis is a collection of essays on the topic of innovation in the service sector. This structure serves the purpose of singling out some of the relevant issues and trying to tackle them, first reviewing the state of the literature and then proposing a way forward. Three relevant issues have therefore been selected: (i) the definition of innovation in the service sector and the connected question of the measurement of innovation; (ii) the issue of productivity in services; (iii) the classification of innovative firms in the service sector. Addressing the first issue, Chapter II shows how the initial breadth of the original Schumpeterian definition of innovation was narrowed and then passed to the service sector from the manufacturing one in a reduced, technology-centred form. Chapter III tackles the issue of productivity in services, discussing the difficulties of measuring productivity in a context where the output is often immaterial. We reconstruct the dispute over Baumol's cost disease argument and propose two ways forward for research on productivity in services: redefining the output along the lines of a characteristics approach, and redefining the inputs, in particular analysing which kinds of input are worth saving. Chapter IV derives an integrated taxonomy of innovative service and manufacturing firms, using data from the 2008 CIS survey for Italy. This taxonomy is based on the enlarged definition of "innovative firm" deriving from the Schumpeterian definition of innovation and classifies firms using cluster analysis techniques. The result is a four-cluster solution in which firms are differentiated by the breadth of the innovation activities in which they are involved. Chapter V reports the main conclusions of each of the previous chapters and the points worthy of further research in the future.

Relevance: 30.00%

Abstract:

Information is nowadays a key resource: machine learning and data mining techniques have been developed to extract high-level information from great amounts of data. As most data comes in the form of unstructured text in natural languages, research on text mining is currently very active and deals with practical problems. Among these, text categorization deals with the automatic organization of large quantities of documents into predefined taxonomies of topic categories, possibly arranged in large hierarchies. In commonly proposed machine learning approaches, classifiers are automatically trained from pre-labeled documents: they can perform very accurate classification, but often require a sizeable training set and notable computational effort. Methods for cross-domain text categorization have been proposed, which make it possible to leverage a set of labeled documents from one domain to classify those of another. Most methods use advanced statistical techniques, usually involving the tuning of parameters. A first contribution presented here is a method based on nearest centroid classification, where profiles of categories are generated from the known domain and then iteratively adapted to the unknown one. Despite being conceptually simple and having easily tuned parameters, this method achieves state-of-the-art accuracy on most benchmark datasets with fast running times. A second, deeper contribution involves the design of a domain-independent model to distinguish the degree and type of relatedness between arbitrary documents and topics, inferred from the different types of semantic relationships between their representative words, identified by specific search algorithms. The application of this model is tested on both flat and hierarchical text categorization, where it potentially allows the efficient addition of new categories during classification. Results show that classification accuracy still requires improvement, but models generated from one domain prove effectively reusable in a different one.
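A hedged sketch of the first contribution as summarized above: category profiles built as nearest centroids on the labeled source domain, then iteratively re-estimated on the unlabeled target domain. The function names and the tf-idf/cosine choices are illustrative assumptions, not details taken from the thesis.

```python
# Sketch: iterative nearest-centroid cross-domain text categorization.
# Assumes every category keeps at least one target document per iteration.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.preprocessing import normalize

def centroids(X, y, n_classes):
    """L2-normalized mean tf-idf vector (profile) per category."""
    C = np.vstack([np.asarray(X[y == c].mean(axis=0)) for c in range(n_classes)])
    return normalize(C)

def adapt(X_src, y_src, X_tgt, n_classes, n_iter=10):
    C = centroids(X_src, y_src, n_classes)      # profiles from the known domain
    for _ in range(n_iter):
        # Rows are L2-normalized, so the dot product is cosine similarity.
        y_tgt = np.asarray(X_tgt @ C.T).argmax(axis=1)
        C = centroids(X_tgt, y_tgt, n_classes)  # re-center on the unknown domain
    return y_tgt

# Illustrative usage (src_texts, y_src, tgt_texts are placeholders):
# vec = TfidfVectorizer(sublinear_tf=True)
# X_src, X_tgt = vec.fit_transform(src_texts), vec.transform(tgt_texts)
# y_pred = adapt(X_src, np.asarray(y_src), X_tgt, n_classes=len(set(y_src)))
```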

Relevance: 30.00%

Abstract:

Physicians and scientists use a broad spectrum of terms to classify contrast media (CM)-induced adverse reactions. In particular, the designation of hypersensitivity reactions varies widely. Consequently, comparisons between different papers dealing with this subject are difficult or even impossible. Moreover, general descriptions may lead to problems in understanding reactions in patients with a history of adverse CM reactions, and in efficiently managing these patients. Therefore, the goal of this paper is to suggest a simple system for clearly classifying these reactions. The proposed three-step system (3SS) is built up as follows: step 1 exactly describes the clinical features, including their severity; step 2 categorizes the time point of onset (immediate or nonimmediate); and step 3 generally classifies the reaction (hypersensitivity or nonhypersensitivity reaction). The 3SS may facilitate a better understanding of the clinical manifestations of adverse CM reactions and may support the prevention of these reactions on the basis of personalized medicine approaches.
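Since the 3SS is a fixed three-field record, it maps naturally onto a small data structure. The sketch below encodes it that way; the enumerated values come from the description above, while the type and field names are my own illustrative choices.

```python
# Sketch: the 3SS as a record type, one field per step.
from dataclasses import dataclass
from enum import Enum

class Onset(Enum):            # step 2: time point of onset
    IMMEDIATE = "immediate"
    NONIMMEDIATE = "nonimmediate"

class Mechanism(Enum):        # step 3: general class of the reaction
    HYPERSENSITIVITY = "hypersensitivity"
    NONHYPERSENSITIVITY = "nonhypersensitivity"

@dataclass(frozen=True)
class CMReaction3SS:
    clinical_features: str    # step 1: exact clinical picture, incl. severity
    onset: Onset
    mechanism: Mechanism

case = CMReaction3SS("generalized urticaria, moderate",
                     Onset.IMMEDIATE, Mechanism.HYPERSENSITIVITY)
```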

Relevance: 30.00%

Abstract:

OBJECTIVE: To compare the content covered by twelve obesity-specific health-status measures using the International Classification of Functioning, Disability and Health (ICF). DESIGN: Obesity-specific health-status measures were identified and then linked to the ICF separately by two trained health professionals according to standardized guidelines. The degree of agreement between the health professionals was calculated by means of the kappa (κ) statistic, and bootstrapped confidence intervals (CI) were calculated. The obesity-specific health-status measures were compared at the component and category levels of the ICF. MEASUREMENTS: Twelve condition-specific health-status measures were identified and included in this study, namely the obesity-related problem scale, the obesity eating problems scale, the obesity-related coping and obesity-related distress questionnaire, the impact of weight on quality of life questionnaire (short version), the health-related quality of life questionnaire, the obesity adjustment survey (short form), the short specific quality of life scale, the obesity-related well-being questionnaire, the bariatric analysis and reporting outcome system, the bariatric quality of life index, the obesity and weight loss quality of life questionnaire and the weight-related symptom measure. RESULTS: In the 280 items of the twelve measures, a total of 413 concepts were identified and linked to 87 different ICF categories. The measures varied strongly in the number of concepts contained and the number of ICF categories used to map these concepts. The share of items on body functions varied from 12% in the obesity-related problem scale to 95% in the weight-related symptom measure. The estimated kappa coefficients ranged between 0.79 (CI: 0.72, 0.86) at the ICF component level and 0.97 (CI: 0.93, 1.0) at the third ICF level. CONCLUSION: The ICF proved highly useful for the content comparison of obesity-specific health-status measures. The results may provide clinicians and researchers with new insights when selecting health-status measures for clinical studies in obesity.
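The agreement analysis above (Cohen's kappa with bootstrapped CIs) is a standard computation; a minimal sketch follows. The data here is synthetic and the function names are mine; this illustrates the statistic, not the study's actual code.

```python
# Sketch: Cohen's kappa between two raters' ICF linkings, with a bootstrap CI.
import numpy as np

def cohens_kappa(a, b):
    """Chance-corrected agreement between two label vectors."""
    cats = np.union1d(a, b)
    po = np.mean(a == b)                                       # observed agreement
    pe = sum(np.mean(a == c) * np.mean(b == c) for c in cats)  # chance agreement
    return (po - pe) / (1 - pe)

def bootstrap_ci(a, b, n_boot=2000, alpha=0.05, seed=0):
    """Percentile bootstrap CI from paired resampling of the two raters."""
    rng = np.random.default_rng(seed)
    n = len(a)
    stats = [cohens_kappa(a[i], b[i])
             for i in (rng.integers(0, n, size=n) for _ in range(n_boot))]
    return np.percentile(stats, [100 * alpha / 2, 100 * (1 - alpha / 2)])

# Synthetic linkings: rater 2 agrees with rater 1 about 80% of the time.
rater1 = np.random.default_rng(1).integers(0, 5, size=300)
rater2 = np.where(np.random.default_rng(2).random(300) < 0.8, rater1,
                  np.random.default_rng(3).integers(0, 5, size=300))
print(cohens_kappa(rater1, rater2), bootstrap_ci(rater1, rater2))
```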

Relevance: 30.00%

Abstract:

BACKGROUND: With the International Classification of Functioning, Disability and Health (ICF), we can now rely on a globally agreed-upon framework and system for classifying the typical spectrum of problems in the functioning of persons, given the environmental context in which they live. ICF Core Sets are subgroups of ICF items selected to capture those aspects of functioning that are most likely to be affected by sleep disorders. OBJECTIVE: The objective of this paper is to outline the developmental process for the ICF Core Sets for Sleep. METHODS: The ICF Core Sets for Sleep will be defined at an ICF Core Sets Consensus Conference, which will integrate evidence from preliminary studies, namely (a) a systematic literature review of the outcomes used in clinical trials and observational studies, (b) focus groups with people in different regions of the world who have sleep disorders, (c) an expert survey with the involvement of international clinical experts, and (d) a cross-sectional study of people with sleep disorders in different regions of the world. CONCLUSION: The ICF Core Sets for Sleep are being designed with the goal of providing useful standards for research, clinical practice and teaching. It is hypothesized that the ICF Core Sets for Sleep will stimulate research that leads to an improved understanding of functioning, disability, and health in sleep medicine. It is further hoped that such research will lead to interventions and accommodations that improve the restoration and maintenance of functioning and minimize disability among people with sleep disorders throughout the world.

Relevance: 30.00%

Abstract:

Since the publication of the European Respiratory Society Task Force report in 2008, significant new evidence has become available on the classification and management of preschool wheezing disorders. In this report, an international consensus group reviews this new evidence and proposes some modifications to the recommendations made in 2008. Specifically, the consensus group acknowledges that wheeze patterns in young children vary over time and with treatment, rendering the distinction between episodic viral wheeze and multiple-trigger wheeze unclear in many patients. Inhaled corticosteroids remain first-line treatment for multiple-trigger wheeze, but may also be considered in patients with episodic viral wheeze with frequent or severe episodes, or when the clinician suspects that interval symptoms are being underreported. Any controller therapy should be viewed as a treatment trial, with scheduled close follow-up to monitor treatment effect. The group recommends discontinuing treatment if there is no benefit and taking the favourable natural history into account when making decisions about long-term therapy. Oral corticosteroids are not indicated in mild-to-moderate acute wheeze episodes and should be reserved for severe exacerbations in hospitalised patients. Future research should focus on better clinical and genetic markers, as well as biomarkers, of disease severity.

Relevance: 30.00%

Abstract:

Aggressive behavior can be classified into hostile and instrumental aggression (Anderson & Bushman, 2002). This classification is mostly used synonymously with reactive and proactive aggression, whereby the differences between hostile and instrumental aggression lie on three dimensions: the primary goal, the amount of anger, and the degree of planning and calculation (Bushman & Anderson, 2001). Although there are rating instruments and experimental paradigms to measure hostile aggression, there is no instrument to measure instrumental aggression. The following study presents an approach to measuring instrumental aggression with an experimental laboratory paradigm. The instrument was first tested on two samples of normal young adolescents (n1 = 100, mean age = 19.14; n2 = 60, mean age = 21.46). The first study revealed a strong correlation with a laboratory aggression paradigm measuring hostile aggression, but no correlations with self-reported aggression on the Buss and Perry questionnaire. These results were replicated in a second study, which revealed an additional correlation with aggressive, but not adaptive, assertiveness. Secondly, the instrument was part of the evaluation of the Reasoning and Rehabilitation program R&R2 (Ross, Hilborn & Lidell, 1984) in an institution for male adolescents with adjustment problems in Switzerland. The R&R2 is a cognitive-behavioral group therapy to reduce antisocial and promote prosocial cognitions and behavior. The treatment group (n = 16; age range = 15-17) is compared to a no-treatment control group (n = 24; age range = 17-19) pre- and post-treatment. Furthermore, aggressive behavior was surveyed and experimentally measured. Hostile rumination, aggressive and adaptive assertiveness, and emotional and social competence were included in the measurement to estimate construct validity.

Relevance: 30.00%

Abstract:

The delineation of shifting cultivation landscapes using remote sensing in mountainous regions is challenging. On the one hand, there are difficulties related to the distinction of the forest and fallow forest classes occurring in a shifting cultivation landscape in mountainous regions. On the other hand, the dynamic nature of the shifting cultivation system poses problems for the delineation of landscapes where shifting cultivation occurs. We present a two-step approach based on an object-oriented classification of Advanced Land Observing Satellite, Advanced Visible and Near Infrared Radiometer (ALOS AVNIR) and Panchromatic Remote-sensing Instrument for Stereo Mapping (ALOS PRISM) data and landscape metrics. When texture measures were included in the object-oriented classification, the accuracy of the forest and fallow forest classes could be increased substantially. Based on such a classification, landscape metrics in the form of land cover class ratios enabled the identification of the crop-fallow rotation characteristics of the shifting cultivation land use practice. By classifying and combining these landscape metrics, shifting cultivation landscapes could be delineated using a single land cover dataset.
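To make the second step concrete, the sketch below computes a land cover class ratio as a landscape metric over a classified raster. The moving-window formulation, window size, and class codes are my own illustrative assumptions; the study derives its ratios from its own object-based classification.

```python
# Sketch: land cover class ratios over a classified raster, via an
# integral image so each window average costs O(1).
import numpy as np

FOREST, FALLOW, CROP = 1, 2, 3   # hypothetical class codes

def class_ratio(landcover, cls, window=25):
    """Share of pixels of class `cls` within a sliding square window."""
    mask = (landcover == cls).astype(float)
    ii = np.pad(mask, ((1, 0), (1, 0))).cumsum(0).cumsum(1)
    h = window
    s = ii[h:, h:] - ii[:-h, h:] - ii[h:, :-h] + ii[:-h, :-h]
    return s / (h * h)

lc = np.random.default_rng(0).integers(1, 4, size=(500, 500))  # toy raster
# A high fallow-to-forest ratio flags areas under crop-fallow rotation,
# which is the cue the delineation step above relies on.
shifting = class_ratio(lc, FALLOW) / (class_ratio(lc, FOREST) + 1e-9) > 1.0
```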

Relevance: 30.00%

Abstract:

This paper is concerned with the modelling of storage configurations for intermediate products in process industries. These models form the basis of algorithms for scheduling chemical production plants. Different storage capacity settings (unlimited, finite, and no intermediate storage), storage homogeneity settings (dedicated and shared storage), and storage time settings (unlimited, finite, and no wait) are considered. We discuss a classification of storage constraints in batch scheduling and show how those constraints can be integrated into a general production scheduling model based on the concept of cumulative resources.
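The three settings named above are orthogonal, so a storage configuration is simply a triple. A minimal sketch of that taxonomy as a data structure follows; the abbreviations (UIS, FIS, NIS, UW, FW, ZW) are standard in the batch scheduling literature, while the type and method names are my own.

```python
# Sketch: the storage-constraint taxonomy as a small data structure.
from dataclasses import dataclass
from enum import Enum

class Capacity(Enum):
    UNLIMITED = "UIS"   # unlimited intermediate storage
    FINITE = "FIS"      # finite intermediate storage
    NONE = "NIS"        # no intermediate storage

class Homogeneity(Enum):
    DEDICATED = "dedicated"   # one product per storage facility
    SHARED = "shared"         # several products may share a facility

class StorageTime(Enum):
    UNLIMITED = "UW"    # unlimited wait
    FINITE = "FW"       # finite wait
    ZERO = "ZW"         # no wait: the product must move on immediately

@dataclass(frozen=True)
class StorageConfig:
    capacity: Capacity
    homogeneity: Homogeneity
    storage_time: StorageTime

    def needs_storage_resource(self) -> bool:
        """Illustrative rule: only finite capacity has to be modelled
        as a cumulative resource in the scheduling model."""
        return self.capacity is Capacity.FINITE

cfg = StorageConfig(Capacity.FINITE, Homogeneity.SHARED, StorageTime.FINITE)
```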

Relevance: 30.00%

Abstract:

In this paper we propose a new fully automatic method for localizing and segmenting 3D intervertebral discs in MR images, where the two problems are solved in a unified data-driven regression and classification framework. We estimate the output at image points (image displacements for localization, or foreground/background labels for segmentation) by exploiting both training data and geometric constraints simultaneously. The problem is formulated as a unified objective function which is then solved globally and efficiently. We validate our method on MR images of 25 patients. Taking manually labeled data as the ground truth, our method achieves a mean localization error of 1.3 mm, a mean Dice metric of 87%, and a mean surface distance of 1.3 mm. Our method can also be applied to other localization and segmentation tasks.
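For reference, the Dice metric reported above is a standard overlap score between two binary masks. A minimal sketch, with toy arrays standing in for a disc segmentation and its ground truth:

```python
# Sketch: Dice coefficient between two binary masks, 2|A∩B| / (|A|+|B|).
import numpy as np

def dice(pred, gt):
    pred, gt = pred.astype(bool), gt.astype(bool)
    inter = np.logical_and(pred, gt).sum()
    return 2.0 * inter / (pred.sum() + gt.sum())

rng = np.random.default_rng(0)
gt = rng.random((64, 64, 32)) > 0.7            # toy 3D ground-truth mask
pred = gt ^ (rng.random(gt.shape) > 0.95)      # perturbed copy of the truth
print(f"Dice = {dice(pred, gt):.2%}")
```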

Relevance: 30.00%

Abstract:

BACKGROUND Pulmonary hypertension (PH) frequently coexists with severe aortic stenosis, and PH severity has been shown to predict outcomes after transcatheter aortic valve implantation (TAVI). The effect of PH hemodynamic presentation on clinical outcomes after TAVI is unknown. METHODS AND RESULTS Of 606 consecutive patients undergoing TAVI, 433 (71.4%) patients with severe aortic stenosis and a preprocedural right heart catheterization were assessed. Patients were dichotomized according to whether PH was present (mean pulmonary artery pressure, ≥25 mm Hg; n=325) or not (n=108). Patients with PH were further dichotomized by left ventricular end-diastolic pressure into postcapillary (left ventricular end-diastolic pressure, >15 mm Hg; n=269) and precapillary groups (left ventricular end-diastolic pressure, ≤15 mm Hg; n=56). Finally, patients with postcapillary PH were divided into isolated (n=220) and combined (n=49) subgroups according to whether the diastolic pressure difference (diastolic pulmonary artery pressure minus left ventricular end-diastolic pressure) was normal (<7 mm Hg) or elevated (≥7 mm Hg). The primary end point was mortality at 1 year. PH was present in 325 of 433 (75%) patients and was predominantly postcapillary (n=269/325; 82%). Compared with baseline, systolic pulmonary artery pressure immediately improved after TAVI in patients with combined postcapillary PH (57.8±14.1 versus 50.4±17.3 mm Hg; P=0.015) but not in those with precapillary PH (49.0±12.6 versus 51.6±14.3 mm Hg; P=0.36). When compared with no PH, a higher 1-year mortality rate was observed in both precapillary (hazard ratio, 2.30; 95% confidence interval, 1.02-5.22; P=0.046) and combined (hazard ratio, 3.15; 95% confidence interval, 1.43-6.93; P=0.004) but not isolated PH patients (P=0.11). After adjustment, combined PH remained a strong predictor of 1-year mortality after TAVI (hazard ratio, 3.28; P=0.005). CONCLUSIONS Invasive stratification of PH according to hemodynamic presentation predicts acute response to treatment and 1-year mortality after TAVI.
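The stratification rules above are explicit enough to express directly as code. A hedged sketch: the thresholds (mPAP ≥25, LVEDP >15, diastolic pressure difference ≥7 mm Hg) are quoted from the abstract, while the function name and labels are my own.

```python
# Sketch: hemodynamic PH stratification as described in the abstract.
def classify_ph(mpap: float, lvedp: float, dpap: float) -> str:
    """Classify from mean PAP, LV end-diastolic pressure, diastolic PAP (mm Hg)."""
    if mpap < 25:
        return "no PH"
    if lvedp <= 15:
        return "precapillary PH"
    dpd = dpap - lvedp                      # diastolic pressure difference
    return "combined postcapillary PH" if dpd >= 7 else "isolated postcapillary PH"

print(classify_ph(mpap=32, lvedp=18, dpap=28))   # -> combined postcapillary PH
```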

Relevance: 30.00%

Abstract:

Automated tissue characterization is one of the most crucial components of a computer aided diagnosis (CAD) system for interstitial lung diseases (ILDs). Although much research has been conducted in this field, the problem remains challenging. Deep learning techniques have recently achieved impressive results in a variety of computer vision problems, raising expectations that they might be applied in other domains, such as medical image analysis. In this paper, we propose and evaluate a convolutional neural network (CNN) designed for the classification of ILD patterns. The proposed network consists of 5 convolutional layers with 2×2 kernels and LeakyReLU activations, followed by average pooling with size equal to the size of the final feature maps, and three dense layers. The last dense layer has 7 outputs, corresponding to the classes considered: healthy, ground glass opacity (GGO), micronodules, consolidation, reticulation, honeycombing and a combination of GGO/reticulation. To train and evaluate the CNN, we used a dataset of 14696 image patches derived from 120 CT scans from different scanners and hospitals. To the best of our knowledge, this is the first deep CNN designed for this specific problem. A comparative analysis proved the effectiveness of the proposed CNN against previous methods on a challenging dataset. The classification performance (~85.5%) demonstrated the potential of CNNs in analyzing lung patterns. Future work includes extending the CNN to three-dimensional data provided by CT volume scans and integrating the proposed method into a CAD system that aims to provide differential diagnosis for ILDs as a supportive tool for radiologists.
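A hedged PyTorch sketch of the architecture as described: 5 convolutional layers with 2×2 kernels and LeakyReLU, average pooling sized to the final feature maps (i.e., global average pooling), and three dense layers ending in 7 class outputs. The channel widths, input patch size, and dense-layer sizes are assumptions, not figures from the paper.

```python
# Sketch of the described ILD patch classifier (widths are assumptions).
import torch
import torch.nn as nn

class ILDPatchCNN(nn.Module):
    def __init__(self, n_classes=7, widths=(32, 32, 64, 64, 128)):
        super().__init__()
        layers, in_ch = [], 1                  # single-channel CT patches
        for out_ch in widths:                  # 5 conv layers, 2x2 kernels
            layers += [nn.Conv2d(in_ch, out_ch, kernel_size=2),
                       nn.LeakyReLU(0.01)]
            in_ch = out_ch
        self.features = nn.Sequential(*layers)
        # Pooling "with size equal to the final feature maps" = one value
        # per channel, i.e. global average pooling.
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(widths[-1], 128), nn.LeakyReLU(0.01),
            nn.Linear(128, 64), nn.LeakyReLU(0.01),
            nn.Linear(64, n_classes),          # healthy, GGO, ..., GGO/reticulation
        )

    def forward(self, x):
        return self.classifier(self.pool(self.features(x)))

model = ILDPatchCNN()
logits = model(torch.randn(8, 1, 32, 32))      # batch of 32x32 patches
print(logits.shape)                            # torch.Size([8, 7])
```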

Relevance: 30.00%

Abstract:

Brownfield rehabilitation is an essential step for sustainable land-use planning and management in the European Union. In brownfield regeneration processes, legacy contamination plays a significant role: firstly, persistent contaminants in soil or groundwater extend the existing hazards and risks well into the future; and secondly, problems from historical contamination are often more difficult to manage than contamination caused by new activities. Due to the complexity associated with the management of brownfield site rehabilitation, Decision Support Systems (DSSs) have been developed to support problem holders and stakeholders in the decision-making process across all phases of the rehabilitation. This paper presents a comparative study of two DSSs, namely SADA (Spatial Analysis and Decision Assistance) and DESYRE (Decision Support System for the Requalification of Contaminated Sites), with the main objective of showing the benefits of using DSSs to introduce and process data and then to disseminate results to the different stakeholders involved in the decision-making process. For this purpose, a former car manufacturing plant located in the Brasov area, Central Romania, contaminated chiefly by heavy metals and total petroleum hydrocarbons, was selected as a case study to apply the two examined DSSs. The major results presented here concern the analysis of the functionalities of the two DSSs in order to identify similarities, differences and complementarities and, thus, to provide an indication of the most suitable integration options.