32 results for one-class classification

in the Aston University Research Archive


Relevance:

80.00%

Abstract:

Cardiovascular diseases (CVD) contribute to almost 30% of worldwide mortality, with heart failure being one class of CVD. One popular and widely available treatment for heart failure is the intra-aortic balloon pump (IABP). This heart-assist device is used in counterpulsation to improve myocardial function by increasing coronary perfusion and decreasing aortic end-diastolic pressure (i.e. the resistance to blood ejection from the heart). However, the device can only be used acutely, and patients are bedridden. The subject of this research is a novel heart-assist treatment called Chronic Intermittent Mechanical Support (CIMS), conceived to offer the advantages of the IABP device chronically whilst overcoming its disadvantages. The CIMS device comprises an implantable balloon pump, a percutaneous drive line and a wearable driver console. The research here aims to determine the haemodynamic effect of balloon pump activation under in vitro conditions. A human mock circulatory loop (MCL) with systemic and coronary perfusion was constructed, capable of simulating various degrees of heart failure. Two prototypes of the CIMS balloon pump were made with varying stiffness. Several experimental factors (balloon inflation/deflation timing, helium gas volume, arterial compliance, balloon pump stiffness and heart valve type) formed the factorial design of experiments. A simple modification to the MCL allowed flow visualisation experiments using video recording. Suitable statistical tests were used to analyse the data obtained from all experiments. Balloon inflation and deflation in the ascending aorta of the MCL yielded favourable results. Sudden balloon deflation caused the heart valve to open earlier, lengthening the valve opening duration in a cardiac cycle. It was also found that pressure augmentation in diastole was significantly correlated with increased cardiac output and coronary flow rate.
With the optimum combination (low arterial compliance and low balloon pump stiffness), systemic and coronary perfusion increased by 18% and 21% respectively, while the aortic end-diastolic pressure (the resistance to forward flow) decreased by 17%. Consequently, the ratio of oxygen supply to demand in the myocardium (the endocardial viability ratio, EVR) increased by between 33% and 75%. The increase was mostly attributable to diastolic augmentation rather than systolic unloading.
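As a quick numerical illustration, the endocardial viability ratio mentioned above is conventionally computed as the diastolic pressure-time index (oxygen supply) divided by the tension-time index (oxygen demand). The sketch below uses made-up index values, not measurements from the study, to show how diastolic augmentation alone can raise the EVR by about a third.

```python
# Hypothetical illustration of the endocardial viability ratio (EVR),
# conventionally the ratio of the diastolic pressure-time index (DPTI,
# a proxy for myocardial oxygen supply) to the tension-time index (TTI,
# a proxy for oxygen demand). All numbers below are invented.

def evr(dpti_mmhg_s, tti_mmhg_s):
    """Endocardial viability ratio: oxygen supply / demand proxy."""
    return dpti_mmhg_s / tti_mmhg_s

baseline = evr(30.0, 40.0)   # unassisted heart: EVR = 0.75
# Diastolic augmentation by the balloon pump raises DPTI; systolic
# unloading would instead lower TTI.
assisted = evr(40.0, 40.0)   # augmented diastole: EVR = 1.0
increase = (assisted - baseline) / baseline * 100
print(round(increase, 1))    # ~33% increase, the low end of the reported range
```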

Relevance:

80.00%

Abstract:

The relationship between sleep apnoea–hypopnoea syndrome (SAHS) severity and the regularity of nocturnal oxygen saturation (SaO2) recordings was analysed. Three different methods were proposed to quantify regularity: approximate entropy (AEn), sample entropy (SEn) and kernel entropy (KEn). A total of 240 subjects suspected of suffering from SAHS took part in the study. They were randomly divided into a training set (96 subjects) and a test set (144 subjects) for the adjustment and assessment of the proposed methods, respectively. According to the measurements provided by AEn, SEn and KEn, higher irregularity of oximetry signals is associated with SAHS-positive patients. Receiver operating characteristic (ROC) and Pearson correlation analyses showed that KEn was the most reliable predictor of SAHS. It provided an area under the ROC curve of 0.91 in two-class classification of subjects as SAHS-negative or SAHS-positive. Moreover, KEn measurements from oximetry data exhibited a linear dependence on the apnoea–hypopnoea index, as shown by a correlation coefficient of 0.87. Therefore, these measurements could be used for the development of simplified diagnostic techniques in order to reduce the demand for polysomnographies. Furthermore, KEn represents a convincing alternative to AEn and SEn for the diagnostic analysis of noisy biomedical signals.
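Of the three regularity measures, sample entropy is the most compact to state. The sketch below is a minimal pure-Python variant (not the study's implementation, and kernel entropy is omitted): it shows that a periodic signal scores lower than an irregular one, which is the property the abstract exploits for SaO2 recordings.

```python
import math

def sample_entropy(x, m=2, r=0.2):
    """Minimal sample entropy SampEn(m, r) of a 1-D series x.
    Counts template matches of length m and m+1 (Chebyshev distance <= r,
    self-matches excluded) and returns -ln(A/B)."""
    n = len(x)

    def count(mm):
        templates = [x[i:i + mm] for i in range(n - mm + 1)]
        hits = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    hits += 1
        return hits

    b = count(m)       # matches of length m
    a = count(m + 1)   # matches of length m + 1
    return -math.log(a / b)

# A regular (periodic) signal yields lower SampEn than an irregular one.
regular = [0.0, 1.0] * 50
irregular = [((i * 7919) % 100) / 100.0 for i in range(100)]
print(sample_entropy(regular) < sample_entropy(irregular))  # True
```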

Relevance:

40.00%

Abstract:

We consider the problem of assigning an input vector x to one of m classes by predicting P(c|x) for c = 1, ..., m. For a two-class problem, the probability of class 1 given x is estimated by s(y(x)), where s(y) = 1/(1 + exp(-y)). A Gaussian process prior is placed on y(x), and is combined with the training data to obtain predictions for new x points. We provide a Bayesian treatment, integrating over uncertainty in y and in the parameters that control the Gaussian process prior; the necessary integration over y is carried out using Laplace's approximation. The method is generalized to multi-class problems (m > 2) using the softmax function. We demonstrate the effectiveness of the method on a number of datasets.
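The two link functions named in the abstract are simple to state. The following minimal sketch (plain Python, not the Gaussian process or Laplace machinery) shows the logistic function s(y) for the two-class case and its softmax generalisation for m > 2.

```python
import math

def sigmoid(y):
    """s(y) = 1 / (1 + exp(-y)): maps a latent value to P(class 1 | x)."""
    return 1.0 / (1.0 + math.exp(-y))

def softmax(ys):
    """Multi-class generalisation (m > 2): normalised exponentials."""
    exps = [math.exp(y - max(ys)) for y in ys]  # shift for numerical stability
    total = sum(exps)
    return [e / total for e in exps]

print(round(sigmoid(0.0), 2))   # 0.5: a latent value of 0 is maximally uncertain
print([round(p, 3) for p in softmax([2.0, 1.0, 0.0])])
```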


Relevance:

40.00%

Abstract:

Urban regions present some of the most challenging areas for the remote sensing community. Many different types of land cover have similar spectral responses, making them difficult to distinguish from one another. Traditional per-pixel classification techniques suffer particularly badly because they use only these spectral properties to determine a class, and no other properties of the image, such as context. This project presents the results of the classification of a deeply urban area of Dudley, West Midlands, using four methods: Supervised Maximum Likelihood, SMAP, ECHO and Unsupervised Maximum Likelihood. An accuracy assessment method is then developed to allow a fair representation of each procedure and a direct comparison between them. Subsequently, a classification procedure is developed that makes use of the context in the image through per-polygon classification. The imagery is broken up into a series of polygons extracted with the Marr-Hildreth zero-crossing edge detector. These polygons are then refined using a region-growing algorithm and classified according to the mean class of the fine polygons. The imagery produced by this technique is shown to be of better quality and higher accuracy than that of the other, conventional methods. Further refinements are suggested and examined to improve the aesthetic appearance of the imagery. Finally, a comparison is made with the results produced in a previous study of the James Bridge catchment in Darlaston, West Midlands, showing that the polygon-classified ATM imagery performs significantly better than the Maximum Likelihood classified videography used in the initial study, despite the presence of geometric correction errors.
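Per-polygon classification of the kind described can be sketched as follows. This toy version assigns each polygon the modal class of the per-pixel labels inside it (a simple stand-in for the thesis's mean-class rule), and the labels and polygon ids are entirely hypothetical.

```python
from collections import Counter

def classify_polygons(pixel_class, polygon_id):
    """Per-polygon classification: give every polygon the most common
    (modal) class among the per-pixel labels that fall inside it.
    A stand-in for the thesis's rule; the data below are hypothetical."""
    votes = {}
    for cls, poly in zip(pixel_class, polygon_id):
        votes.setdefault(poly, Counter())[cls] += 1
    return {poly: counter.most_common(1)[0][0] for poly, counter in votes.items()}

# Per-pixel labels are noisy; the polygon context smooths them out.
pixel_class = ["road", "road", "roof", "road", "grass", "grass", "grass"]
polygon_id  = [1,      1,      1,      1,      2,       2,       2]
print(classify_polygons(pixel_class, polygon_id))  # {1: 'road', 2: 'grass'}
```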

Relevance:

40.00%

Abstract:

This thesis presents a thorough and principled investigation into the application of artificial neural networks to the biological monitoring of freshwater. It contains original ideas on the classification and interpretation of benthic macroinvertebrates, and aims to demonstrate their superiority over the biotic systems currently used in the UK to report river water quality. The conceptual basis of a new biological classification system is described, and a full review and analysis of a number of river data sets is presented. The biological classification is compared to the common biotic systems using data from the Upper Trent catchment. These data contained 292 expertly classified invertebrate samples identified to mixed taxonomic levels. The neural network experimental work concentrates on the classification of the invertebrate samples into biological classes, where only a subset of the sample is used to form the classification. Further experimentation is conducted into the identification of novel input samples, the classification of samples from different biotopes and the use of prior information in the neural network models. The biological classification is shown to provide an intuitive interpretation of a graphical representation, generated without reference to the class labels, of the Upper Trent data. The selection of key indicator taxa is considered using three different approaches: one novel, one from information theory and one from classical statistics. Indicators of quality class chosen by these analyses are found to be in good agreement with those chosen by a domain expert. The change in information associated with different levels of identification and enumeration of taxa is quantified. The feasibility of using neural network classifiers and predictors to develop numeric criteria for the biological assessment of sediment contamination in the Great Lakes is also investigated.

Relevance:

40.00%

Abstract:

Text classification is essential for narrowing down the number of documents relevant to a particular topic for further perusal, especially when searching through large biomedical databases. Protein-protein interactions are an example of such a topic, with databases devoted specifically to them. This paper proposes a semi-supervised learning algorithm via local learning with class priors (LL-CP) for biomedical text classification, where unlabeled data points are classified in a vector space based on their proximity to labeled nodes. The algorithm has been evaluated on a corpus of biomedical documents to identify abstracts containing information about protein-protein interactions, with promising results. Experimental results show that LL-CP outperforms traditional semi-supervised learning algorithms such as SVM, and that it also performs better than local learning without incorporating class priors.
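The idea of classifying unlabeled points by their proximity to labeled nodes can be illustrated with a bare nearest-neighbour sketch. This is a deliberate simplification, not the LL-CP algorithm itself, and the vectors and class names are hypothetical.

```python
import math

def nearest_label(unlabeled, labeled):
    """Assign each unlabeled vector the class of its nearest labeled
    neighbour: a bare-bones stand-in for proximity-based semi-supervised
    classification, not the LL-CP algorithm (data are hypothetical)."""
    out = []
    for x in unlabeled:
        _, cls = min((math.dist(x, v), c) for v, c in labeled)
        out.append(cls)
    return out

# Two labeled "documents" in a toy 2-D feature space.
labeled = [((0.0, 0.0), "interaction"), ((5.0, 5.0), "other")]
print(nearest_label([(0.5, 0.2), (4.8, 5.1)], labeled))  # ['interaction', 'other']
```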

Relevance:

30.00%

Abstract:

We derive a mean field algorithm for binary classification with Gaussian processes, based on the TAP approach originally proposed in the statistical physics of disordered systems. The theory also yields an approximate leave-one-out estimator for the generalization error, which is computed at no extra computational cost. We show that, from the TAP approach, it is possible to derive both a simpler 'naive' mean field theory and support vector machines (SVM) as limiting cases. For both mean field algorithms and support vector machines, simulation results for three small benchmark data sets are presented. They show (1) that one may get state-of-the-art performance by using the leave-one-out estimator for model selection, and (2) that the built-in leave-one-out estimators are extremely precise when compared to the exact leave-one-out estimate. The latter result is taken as strong support for the internal consistency of the mean field approach.
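For contrast with the built-in approximate estimator, the exact leave-one-out estimate referred to above can always be computed by brute force. The sketch below does so for a simple 1-nearest-neighbour rule on hypothetical data; the point of the TAP estimator is to approximate this kind of quantity without n retrainings.

```python
import math

def loo_error_1nn(points, labels):
    """Exact leave-one-out error of a 1-nearest-neighbour classifier:
    each point is held out in turn and classified by its nearest
    remaining neighbour. Data below are hypothetical."""
    errors = 0
    for i, x in enumerate(points):
        _, pred = min(
            (math.dist(x, p), labels[j])
            for j, p in enumerate(points) if j != i
        )
        errors += pred != labels[i]
    return errors / len(points)

points = [(0.0, 0.0), (0.2, 0.1), (5.0, 5.0), (5.1, 4.9), (2.5, 2.5)]
labels = [-1, -1, +1, +1, +1]   # the isolated point (2.5, 2.5) is misclassified
print(loo_error_1nn(points, labels))  # 0.2
```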

Relevance:

30.00%

Abstract:

Vaccines are the greatest single instrument of prophylaxis against infectious diseases, with immeasurable benefits to human wellbeing. The accurate and reliable prediction of peptide-MHC binding is fundamental to the robust identification of T-cell epitopes and thus the successful design of peptide- and protein-based vaccines. The prediction of MHC class II peptide binding has hitherto proved recalcitrant and refractory. Here we illustrate the utility of existing computational tools for the in silico prediction of peptides binding to class II MHCs. Most of the methods tested in the present study detect more than half of the true binders in the top 5% of all possible nonamers generated from one protein. This number increases in the top 10% and 15%, and then does not change significantly. For the top 15%, the identified binders approach 86%. In terms of laboratory work, this means 85% less expenditure on materials, labour and time. We show that, while existing caveats are well founded, the use of computational models of class II binding can nonetheless offer viable help to the work of the immunologist and vaccinologist.
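The "top 5%/10%/15% of all possible nonamers" pipeline can be sketched directly. The scoring function below is a placeholder (a crude hydrophobicity count), not one of the predictors evaluated in the study, and the protein sequence is invented.

```python
def nonamers(protein):
    """All overlapping 9-mers of a protein sequence."""
    return [protein[i:i + 9] for i in range(len(protein) - 8)]

def top_fraction(peptides, score, fraction=0.15):
    """Rank peptides by a predicted binding score and keep the top
    fraction: the abstract's 'top 15%' shortlist, which is what cuts
    laboratory expenditure by ~85%. The score function is a placeholder."""
    k = max(1, round(len(peptides) * fraction))
    return sorted(peptides, key=score, reverse=True)[:k]

seq = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"   # invented 33-residue sequence
peps = nonamers(seq)
print(len(peps))   # len(seq) - 8 = 25 candidate nonamers
# Placeholder score: fraction of hydrophobic residues in the peptide.
shortlist = top_fraction(peps, lambda p: sum(c in "AILMFWVY" for c in p) / 9)
print(len(shortlist))   # ~15% of 25 candidates
```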

Relevance:

30.00%

Abstract:

The G-protein coupled receptors, or GPCRs, comprise simultaneously one of the largest and one of the most multi-functional protein families known to modern-day molecular bioscience. From a drug discovery and pharmaceutical industry perspective, the GPCRs constitute one of the most commercially and economically important groups of proteins known. The GPCRs undertake numerous vital metabolic functions and interact with a hugely diverse range of small and large ligands. Many different methodologies have been developed to efficiently and accurately classify the GPCRs. These range from motif-based techniques to machine learning, as well as a variety of alignment-free techniques based on the physicochemical properties of sequences. We review here the available methodologies for the classification of GPCRs. Part of this work focuses on how we have tried to build the intrinsically hierarchical nature of sequence relations, implicit within the family, into an adaptive approach to classification. Importantly, we also allude to some of the key innate problems in developing an effective approach to classifying the GPCRs: the lack of sequence similarity between the six classes that comprise the GPCR family, and the low sequence similarity to other family members evinced by many newly revealed members of the family.

Relevance:

30.00%

Abstract:

In this work, the solution of a class of capital investment problems is considered within the framework of mathematical programming. On the basis of the net present value criterion, the problems in question are mainly characterized by the fact that the cost of capital is defined as a non-decreasing function of the investment requirements. Capital rationing and some cases of technological dependence are also included, an approach leading to zero-one non-linear programming problems, for which specifically designed solution procedures, supported by a general branch-and-bound development, are presented. In the context of both this development and the relevant mathematical properties of the aforementioned zero-one programs, a generalized zero-one model is also discussed. Finally, a variant of the scheme, connected with the sequencing of the search for optimal solutions, is presented as an alternative in which reduced storage limitations are encountered.
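A minimal instance of a zero-one capital-budgeting problem solved by branch and bound might look as follows. Note the assumptions: a fixed capital budget (pure capital rationing) stands in for the paper's non-decreasing cost-of-capital function, and all project figures are hypothetical.

```python
def best_portfolio(npv, cost, budget):
    """Zero-one capital-rationing model solved by branch and bound:
    choose projects maximising total NPV within a capital budget.
    The bound optimistically adds all remaining positive NPVs; the
    paper's variable cost of capital is omitted for brevity."""
    n = len(npv)
    best = [0.0, ()]   # [best value found, chosen project indices]

    def branch(i, spent, value, chosen):
        if value > best[0]:
            best[0], best[1] = value, tuple(chosen)
        if i == n:
            return
        # Optimistic bound: take every remaining project with positive NPV.
        if value + sum(v for v in npv[i:] if v > 0) <= best[0]:
            return   # prune this subtree
        if spent + cost[i] <= budget:   # branch: include project i
            branch(i + 1, spent + cost[i], value + npv[i], chosen + [i])
        branch(i + 1, spent, value, chosen)   # branch: exclude project i

    branch(0, 0.0, 0.0, [])
    return best[0], best[1]

npv  = [12.0, 10.0, 7.0, 4.0]   # hypothetical net present values
cost = [8.0,  6.0,  5.0, 3.0]   # hypothetical capital requirements
print(best_portfolio(npv, cost, budget=14.0))  # (22.0, (0, 1))
```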

Relevance:

30.00%

Abstract:

This thesis analyses the work situation and class position of Brazilian engineers from a Marxist perspective. The research is based on two case studies, one focused on a large German steel company based in Brazil and the other on a large Brazilian energy corporation. The fieldwork involved 114 interviews with engineers from different hierarchical positions in these two companies. Data were also gathered through interviews with representatives of the companies, the Council of Engineering, the engineering education system and the engineers' trade unions. The findings show that, with the country's industrial development, the engineering profession in Brazil has shifted from its initial condition as a liberal profession to an organizational profession. Both companies consider all salaried workers, including managers, to be employees; hence they are subject to the companies' general personnel policies. The multinational company controls labour more rigidly than the national company, and also reserves its top positions for executives from its home country. Although no deskilling process was found, engineers in both companies performed simple work, which required less engineering knowledge than they had learned at school. Engineers have little autonomy, authority and participation in decision making, and are subject to direct supervision, performance evaluation, time control, overtime work and productivity demands, and to poor working conditions in the multinational company. The majority of the engineers supervised other workers without being in a managerial position. They found that moving into management was a good way to improve their autonomy, authority, prestige, salary, status, power and professional pride. Despite ideological divisions between capital and labour, most of the engineers were unionised and saw unions as the right way to deal with the employer.

Relevance:

30.00%

Abstract:

The thesis provides an analysis of an occupation in the process of making itself a profession. The solicitors' profession in Birmingham underwent a great many changes during the 19th century against a background of industrialisation and urbanisation. The solicitors' conception of their status and role, in the face of these challenges, had implications for successful strategies of professionalisation. The increased prestige and power of the profession, and especially of its elite, are examined in their social context rather than in terms of a technical process or of educational and organisational change. The thesis argues that the profession's social relationships and broad concerns were significant in establishing solicitors as "professional men". In particular, these are related to the profession's efforts to gain control of markets for legal services and to increase its social status. In the course of achieving these aims, a concept of profession and a self-image were articulated by solicitors in order to persuade society and the state of the legitimacy of their claims. The concept of the gentlemanly professional was of critical importance in this instance. The successful creation of a provincial professional "community" by the end of the 19th century rested principally on a social and moral conception of professionalism rather than on one which stressed specialised training and knowledge, professional organisations and credentials.

Relevance:

30.00%

Abstract:

The number of new chemical entities (NCEs) is increasing every day following the introduction of combinatorial chemistry and high-throughput screening into the drug discovery cycle. One third of these new compounds have an aqueous solubility of less than 20 µg/mL [1]. A great deal of interest has therefore been directed towards salt formation as a technique for overcoming solubility limitations. This study aims to improve the solubility of a Biopharmaceutics Classification System class II (BCS II) model drug (indomethacin; IND) using the basic amino acids L-arginine, L-lysine and L-histidine as counterions. Three new salts were prepared using a freeze-drying method and characterised by FT-IR spectroscopy, proton nuclear magnetic resonance (1H NMR), differential scanning calorimetry (DSC) and thermogravimetric analysis (TGA). The effect of pH on IND solubility was also investigated using a pH-solubility profile. Both arginine and lysine formed novel salts with IND, while histidine failed to dissociate the free acid and in turn no salt was formed. Arginine and lysine increased IND solubility by 10,000- and 2,296-fold, respectively. An increase in dissolution rate was also observed for the novel salts. Since these new salts have improved IND solubility to a level similar to that of BCS class I drugs, the IND salts could be considered for possible waivers of bioequivalence studies.
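The pH-solubility profile of a weakly acidic drug such as indomethacin is commonly modelled with the Henderson-Hasselbalch relation S = S0 * (1 + 10^(pH - pKa)). The sketch below uses assumed values for the intrinsic solubility and pKa, purely for illustration; they are not measurements from this study.

```python
def acid_solubility(s0, pka, ph):
    """Total solubility of a weak acid from the Henderson-Hasselbalch
    relation: S = S0 * (1 + 10**(pH - pKa)), valid below the pHmax at
    which the salt form starts to control solubility. Values assumed."""
    return s0 * (1 + 10 ** (ph - pka))

s0, pka = 1.0, 4.5   # assumed intrinsic solubility (ug/mL) and pKa
for ph in (2.0, 4.5, 6.0, 7.4):
    # Solubility rises steeply with pH above the pKa, which is why
    # basic counterions can dramatically improve dissolution.
    print(ph, round(acid_solubility(s0, pka, ph), 1))
```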

Relevance:

30.00%

Abstract:

The analysis of bacterial genomes for epidemiological purposes often results in the production of a banding profile of DNA fragments characteristic of the genome under investigation. These may be produced using various methods, many of which involve the cutting or amplification of DNA into defined and reproducible characteristic fragments. It is frequently of interest to enquire whether the bacterial isolates are naturally classifiable into distinct groups based on their DNA profiles. A major problem with this approach is whether classification or clustering of the data is even appropriate. It is always possible to classify such data but it does not follow that the strains they represent are ‘actually’ classifiable into well-defined separate parts. Hence, the act of classification does not in itself answer the question: do the strains consist of a number of different distinct groups or species or do they merge imperceptibly into one another because DNA profiles vary continuously? Nevertheless, we may still wish to classify the data for ‘convenience’ even though strains may vary continuously, and such a classification has been called a ‘dissection’. This Statnote discusses the use of classificatory methods in analyzing the DNA profiles from a sample of bacterial isolates.
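One convenient 'dissection' of band profiles can be sketched with a Jaccard distance and a linkage threshold. The profiles, isolate names and threshold below are hypothetical, and, per the Statnote's caveat, the resulting groups are a classification of convenience rather than evidence of genuinely distinct clusters.

```python
def jaccard_distance(a, b):
    """Jaccard distance between two band profiles (sets of fragment
    sizes): 1 - |intersection| / |union|."""
    a, b = set(a), set(b)
    return 1.0 - len(a & b) / len(a | b)

def dissect(profiles, threshold=0.5):
    """Group isolates whose profiles lie within a distance threshold
    (single linkage via merging of connected groups). The threshold is
    an arbitrary choice: a 'dissection', not proof of real clusters."""
    groups = []
    for name, prof in profiles.items():
        linked = [g for g in groups
                  if any(jaccard_distance(prof, profiles[m]) <= threshold for m in g)]
        merged = [name] + [m for g in linked for m in g]
        groups = [g for g in groups if g not in linked] + [merged]
    return groups

# Hypothetical band profiles (fragment sizes in kb) for five isolates.
profiles = {
    "A": {1, 2, 3, 5},
    "B": {1, 2, 3, 6},
    "C": {1, 2, 3, 5, 6},
    "D": {10, 11, 12},
    "E": {10, 11, 13},
}
print(sorted(sorted(g) for g in dissect(profiles)))  # [['A', 'B', 'C'], ['D', 'E']]
```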