11 results for multiple classifiers integration


Relevance: 100.00%

Abstract:

In many domains, when several competing classifiers are available, we want to synthesize them, or a subset of them, into a more accurate classifier via a combination function. In this paper we propose a 'class-indifferent' method for combining classifier decisions represented by evidential structures called triplet and quartet, using Dempster's rule of combination. This method is unique in that it distinguishes important elements from trivial ones in representing classifier decisions, makes use of more information than alternative approaches in calculating the support for class labels, and provides a practical way to apply the theoretically appealing Dempster–Shafer theory of evidence to the problem of ensemble learning. We present a formalism for modelling classifier decisions as triplet mass functions, and we establish a range of formulae for combining these mass functions in order to arrive at a consensus decision. In addition, we carry out a comparative study with the alternative simplet and dichotomous structures, and compare two combination methods, Dempster's rule and majority voting, over the UCI benchmark data, to demonstrate the advantage our approach offers. (A continuation of work in this area published in IEEE Transactions on Knowledge and Data Engineering and at conferences.)
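As a sketch of the combination step only, not the paper's triplet/quartet formalism itself, Dempster's rule can be applied to two classifiers' mass functions as follows. The frame of discernment, class names, and mass values are illustrative assumptions:

```python
from itertools import product

def dempster_combine(m1, m2):
    # Masses are dicts mapping frozenset(hypotheses) -> mass.
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb  # mass assigned to the empty intersection
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence")
    k = 1.0 - conflict  # normalisation constant
    return {s: w / k for s, w in combined.items()}

# Triplet-style masses from two classifiers over classes {A, B, C}:
# the singleton top class, its complement, and the whole frame.
frame = frozenset({"A", "B", "C"})
m1 = {frozenset({"A"}): 0.6, frozenset({"B", "C"}): 0.3, frame: 0.1}
m2 = {frozenset({"A"}): 0.5, frozenset({"B", "C"}): 0.2, frame: 0.3}
m = dempster_combine(m1, m2)
```

Combining the two bodies of evidence concentrates mass on class A, illustrating how agreement between classifiers reinforces a class label under Dempster's rule.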

Relevance: 100.00%

Abstract:

In this paper a multiple classifier machine learning methodology for Predictive Maintenance (PdM) is presented. PdM is a prominent strategy for dealing with maintenance issues given the increasing need to minimize downtime and associated costs. One of the challenges with PdM is generating so-called 'health factors', quantitative indicators of the status of a system associated with a given maintenance issue, and determining their relationship to operating costs and failure risk. The proposed PdM methodology allows dynamic decision rules to be adopted for maintenance management and can be used with high-dimensional and censored data problems. This is achieved by training multiple classification modules with different prediction horizons to provide different performance trade-offs in terms of frequency of unexpected breaks and unexploited lifetime, and then employing this information in an operating-cost-based maintenance decision system to minimise expected costs. The effectiveness of the methodology is demonstrated using a simulated example and a benchmark semiconductor manufacturing maintenance problem.
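A minimal sketch of the cost-based decision idea, assuming illustrative break and maintenance costs and hypothetical classifier outputs; none of these figures or thresholds come from the paper:

```python
# Hypothetical cost-based decision over multiple prediction horizons.
# Each classifier k outputs P(failure within horizon_k hours); we choose
# the action (keep running vs. maintain) that minimises expected cost.

BREAK_COST = 100.0  # cost of an unexpected break (assumed figure)
MAINT_COST = 20.0   # cost of planned maintenance (assumed figure)

def expected_costs(p_fail):
    # p_fail: failure probability from one horizon-specific classifier
    run = p_fail * BREAK_COST  # expected cost of an unexpected break
    maintain = MAINT_COST      # fixed cost; some lifetime goes unexploited
    return run, maintain

def decide(horizon_probs):
    # horizon_probs: {horizon_hours: P(failure within that horizon)}
    # Maintain as soon as any horizon's expected break cost exceeds
    # the planned-maintenance cost, starting from the shortest horizon.
    for horizon, p in sorted(horizon_probs.items()):
        run, maintain = expected_costs(p)
        if maintain < run:
            return ("maintain", horizon)
    return ("run", None)

decision = decide({24: 0.05, 72: 0.15, 168: 0.4})
```

Training classifiers at several horizons lets the decision layer trade off unexpected breaks (long horizons catch more failures) against unexploited lifetime (short horizons intervene later), which is the trade-off the abstract describes.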

Relevance: 40.00%

Abstract:

Multiple myeloma is characterized by genomic alterations frequently involving gains and losses of chromosomes. Single nucleotide polymorphism (SNP)-based mapping arrays allow the identification of copy number changes at the sub-megabase level and the identification of loss of heterozygosity (LOH) due to monosomy and uniparental disomy (UPD). We have found that SNP-based mapping array data and fluorescence in situ hybridization (FISH) copy number data correlated well, making the technique robust as a tool to investigate myeloma genomics. The most frequently identified alterations are located at 1p, 1q, 6q, 8p, 13, and 16q. LOH is found in these large regions and also in smaller regions throughout the genome, with a median size of 1 Mb. We have identified that UPD is prevalent in myeloma and occurs through a number of mechanisms, including mitotic nondisjunction and mitotic recombination. For the first time in myeloma, integration of mapping and expression data has allowed us to reduce the complexity of standard gene expression data and identify candidate genes important both in the transition from normal through monoclonal gammopathy of undetermined significance (MGUS) to myeloma and in different subgroups within myeloma. We have documented these genes, providing a focus for further studies to identify and characterize those that are key in the pathogenesis of myeloma.

Relevance: 30.00%

Abstract:

Purpose: The aim of this paper is to explore the issues involved in developing and applying performance management approaches within a large UK public sector department using a multiple stakeholder perspective and an accompanying theoretical framework.

Design/methodology/approach: An initial short questionnaire was used to determine perceptions about the implementation and effectiveness of the new performance management system across the organisation. In total, 700 questionnaires were distributed. Running concurrently with an ethnographic approach, and informed by the questionnaire responses, was a series of semi-structured interviews and focus groups.

Findings: Staff at all levels had an understanding of the new system and perceived it as being beneficial. However, there were concerns that the approach was not continuously managed throughout the year and was in danger of becoming an annual event, rather than an ongoing process. Furthermore, the change process seemed to have advanced without corresponding changes to appraisal and reward and recognition systems. Thus, the business objectives were not aligned with motivating factors within the organisation.

Research limitations/implications: Additional research to test the validity and usefulness of the theoretical model, as discussed in this paper, would be beneficial.

Practical implications: The strategic integration of the stakeholder performance measures and scorecards was found to be essential to producing an overall stakeholder-driven strategy within the case study organisation.

Originality/value: This paper discusses in detail the approach adopted and the progress made by one large UK public sector organisation as it attempts to develop better relationships with all of its stakeholders and hence improve its performance. This paper provides a concerted attempt to link theory with practice.

Relevance: 30.00%

Abstract:

Logistic regression and Gaussian mixture model (GMM) classifiers have been trained to estimate the probability of acute myocardial infarction (AMI) in patients based upon the concentrations of a panel of cardiac markers. The panel consists of two new markers, fatty acid binding protein (FABP) and glycogen phosphorylase BB (GPBB), in addition to the traditional cardiac troponin I (cTnI), creatine kinase MB (CKMB) and myoglobin. The effect of using principal component analysis (PCA) and Fisher discriminant analysis (FDA) to preprocess the marker concentrations was also investigated. The need for classifiers to give an accurate estimate of the probability of AMI is argued, and three categories of performance measure are described, namely discriminatory ability, sharpness, and reliability. Numerical performance measures for each category are given and applied. The optimum classifier, based solely upon the samples taken on admission, was the logistic regression classifier using FDA preprocessing. This gave an accuracy of 0.85 (95% confidence interval: 0.78-0.91) and a normalised Brier score of 0.89. When samples at both admission and a further time, 1-6 h later, were included, the performance increased significantly, showing that logistic regression classifiers can indeed use the information from the five cardiac markers to accurately and reliably estimate the probability of AMI. © Springer-Verlag London Limited 2008.
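The Brier score is a standard measure of probability-forecast quality; one common way to normalise it is against a base-rate reference forecast. The normalisation convention below is an assumption for illustration, not necessarily the one used in the paper, and the example probabilities are invented:

```python
def brier_score(probs, outcomes):
    # Mean squared error between predicted probabilities and 0/1 outcomes;
    # lower is better, 0 is perfect.
    return sum((p - y) ** 2 for p, y in zip(probs, outcomes)) / len(probs)

def normalised_brier(probs, outcomes):
    # One convention (an assumption here): skill relative to always
    # predicting the base rate; 1 is perfect, 0 is no better than base rate.
    base = sum(outcomes) / len(outcomes)
    ref = brier_score([base] * len(outcomes), outcomes)
    return 1.0 - brier_score(probs, outcomes) / ref

# Hypothetical predicted AMI probabilities and true outcomes:
nb = normalised_brier([0.9, 0.8, 0.2, 0.1], [1, 1, 0, 0])
```

A score like this captures both sharpness (probabilities near 0 or 1) and reliability (probabilities matching observed frequencies), which is why it suits the probability-estimation framing the abstract argues for.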

Relevance: 30.00%

Abstract:

'At a time of crisis and therefore a crucial juncture in European politics, Dagmar Schiek offers us an inspiring vision of the potential of the European Union. In her brilliant study, she exposes the obstacles that economic integration has posed for achievement of social justice, and provides a bold solution. Rejecting more limited models of constitutionalism, she presents a convincing alternative which is socially embedded, allowing space for action by manifold actors at multiple levels of governance.' - Tonia Novitz, University of Bristol, UK. © Dagmar Schiek 2012. All rights reserved.

Relevance: 30.00%

Abstract:

Over the last two decades there has been ongoing debate about the impact of environmental practices on operational performance. In recent years, studies have started to move beyond assessing the direct impact of environmental management on different dimensions of performance to consider factors that might moderate or mediate this relationship. This study considers the extent to which environmental integration and environmental capabilities moderate the relationship between pollution prevention and environmental performance outcomes. The mediating influence of environmental performance on the relationship between pollution prevention and cost and flexibility performance is also considered. Data were collected from a sample of UK food manufacturers and analysed using multiple regression analysis. The findings indicate the existence of some moderated and mediated relationships, suggesting that there is more to improving performance than implementing environmental practices.
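Moderation is typically tested by adding an interaction term to a regression: if the coefficient on the product term is non-zero, the moderator changes the strength of the main relationship. A self-contained sketch on synthetic, noise-free data; the variable names and coefficients are illustrative, not from the study:

```python
# Moderated regression: does m (e.g. environmental integration) moderate
# the effect of x (e.g. pollution prevention) on y (performance)?
# Pure-stdlib OLS via the normal equations and Gaussian elimination.

def solve(A, b):
    # Gaussian elimination with partial pivoting for a small linear system.
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def ols(X, y):
    # Least squares: solve (X'X) beta = X'y.
    n = len(X[0])
    XtX = [[sum(r[i] * r[j] for r in X) for j in range(n)] for i in range(n)]
    Xty = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(n)]
    return solve(XtX, Xty)

# Synthetic data generated by y = 1 + 2x + 3m + 4(x*m): the interaction
# coefficient of 4 encodes the moderation effect we want to recover.
data = [(x, m) for x in range(5) for m in range(5)]
X = [[1.0, x, m, x * m] for x, m in data]
y = [1 + 2 * x + 3 * m + 4 * x * m for x, m in data]
coefs = ols(X, y)  # [intercept, x, m, interaction]
```

In practice one would test the interaction coefficient for statistical significance; here the noise-free data simply shows the mechanics of recovering it.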

Relevance: 30.00%

Abstract:

BACKGROUND: Healthcare integration is a priority in many countries, yet there remains little direction on how to systematically evaluate this construct to inform further development. The examination of community-based palliative care networks provides an ideal opportunity for the advancement of integration measures, given how fundamental provider cohesion is to effective care at the end of life.

AIM: This article presents a variable-oriented analysis from a theory-based case study of a palliative care network to help bridge the knowledge gap in integration measurement.

DESIGN: Data from a mixed-methods case study were mapped to a conceptual framework for evaluating integrated palliative care and a visual array depicting the extent of key factors in the represented palliative care network was formulated.

SETTING/PARTICIPANTS: The study included data from 21 palliative care network administrators, 86 healthcare professionals, and 111 family caregivers, all from an established palliative care network in Ontario, Canada.

RESULTS: The framework used to guide this research proved useful in assessing qualities of integration and functioning in the palliative care network. The resulting visual array of elements illustrates that while this network performed relatively well at the multiple levels considered, room for improvement exists, particularly in terms of interventions that could facilitate the sharing of information.

CONCLUSION: This study, along with the other evaluative examples mentioned, represents important initial attempts at empirically and comprehensively examining network-integrated palliative care and healthcare integration in general.

Relevance: 30.00%

Abstract:

There has been an increasing interest in the development of new methods using Pareto optimality to deal with multi-objective criteria (for example, accuracy and time complexity). Once one has developed an approach to a problem of interest, the question is then how to compare it with the state of the art. In machine learning, algorithms are typically evaluated by comparing their performance on different data sets by means of statistical tests. Standard tests used for this purpose can consider neither multiple performance measures jointly nor multiple competitors at once. The aim of this paper is to resolve these issues by developing statistical procedures that are able to account for multiple competing measures at the same time and to compare multiple algorithms altogether. In particular, we develop two tests: a frequentist procedure based on the generalized likelihood-ratio test and a Bayesian procedure based on a multinomial-Dirichlet conjugate model. We further extend them by discovering conditional independences among measures to reduce the number of parameters of such models, since the number of studied cases is usually small in such comparisons. Data from a comparison among general purpose classifiers is used to show a practical application of our tests.
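A minimal sketch of a multinomial-Dirichlet comparison for a single pair of algorithms on one measure; the paper's actual model handles multiple measures and competitors jointly, and the win/tie counts and uniform prior below are illustrative assumptions:

```python
import random

def prob_a_better(wins_a, wins_b, ties, prior=1.0, samples=20000, seed=0):
    # Multinomial-Dirichlet conjugate model over the outcome categories
    # (A wins, B wins, tie) across data sets. Posterior Dirichlet samples
    # are drawn via normalised Gamma variates; returns the Monte Carlo
    # posterior probability that A's win rate exceeds B's.
    rng = random.Random(seed)
    alphas = [wins_a + prior, wins_b + prior, ties + prior]
    hits = 0
    for _ in range(samples):
        draws = [rng.gammavariate(a, 1.0) for a in alphas]
        total = sum(draws)
        theta = [d / total for d in draws]  # one posterior sample
        if theta[0] > theta[1]:
            hits += 1
    return hits / samples

# Hypothetical counts over 24 benchmark data sets:
p = prob_a_better(wins_a=18, wins_b=4, ties=2)
```

Unlike a frequentist test, the posterior probability here has a direct interpretation ("how likely is it that A really beats B?"), which is one motivation for the Bayesian procedure the abstract describes.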

Relevance: 30.00%

Abstract:

Safety on public transport is a major concern for the relevant authorities. We address this issue by proposing an automated surveillance platform which combines data from video, infrared and pressure sensors. Data homogenisation and integration is achieved by a distributed architecture based on communication middleware that resolves interconnection issues, thereby enabling data modelling. A common-sense knowledge base models and encodes knowledge about public-transport platforms and the actions and activities of passengers. Trajectory data from passengers are modelled as a time-series of human activities. Common-sense knowledge and rules are then applied to detect inconsistencies or errors in the data interpretation. Lastly, the rationality that characterises human behaviour is also captured here through a bottom-up Hierarchical Task Network planner that, along with common-sense reasoning, corrects misinterpretations to explain passenger behaviour. The system is validated using a simulated bus saloon scenario as a case study. Eighteen video sequences were recorded with up to six passengers. Four metrics were used to evaluate performance. The system, with an accuracy greater than 90% for each of the four metrics, was found to outperform a rule-based system and a system using planning alone.