938 results for area under the curve


Relevance: 100.00%

Abstract:

In this paper, a hybrid model consisting of the fuzzy ARTMAP (FAM) neural network and the classification and regression tree (CART) is formulated. FAM is useful for tackling the stability–plasticity dilemma pertaining to data-based learning systems, while CART is useful for depicting its learned knowledge explicitly in a tree structure. By combining the benefits of both models, FAM–CART is capable of learning data samples stably and, at the same time, explaining its predictions with a set of decision rules. In other words, FAM–CART possesses two important properties of an intelligent system, i.e., learning in a stable manner (by overcoming the stability–plasticity dilemma) and extracting useful explanatory rules (by overcoming the opaqueness issue). To evaluate the usefulness of FAM–CART, six benchmark medical data sets from the UCI Machine Learning Repository and a real-world medical data classification problem are used. For performance comparison, a number of performance metrics, which include accuracy, specificity, sensitivity, and the area under the receiver operating characteristic curve, are computed. The results are quantified with statistical indicators and compared with those reported in the literature. The outcomes positively indicate that FAM–CART is effective for undertaking data classification tasks. In addition to producing good results, it provides justifications of its predictions in the form of a decision tree, so that domain users can easily understand the predictions, making it a useful decision support tool.
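The four metrics named above can be sketched in a few lines. The labels and scores below are toy data, not results from the paper, and the AUC is computed with the rank-based (Mann–Whitney) estimate:

```python
def confusion_counts(y_true, y_pred):
    # Tally the four cells of the binary confusion matrix.
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return tp, tn, fp, fn

def metrics(y_true, y_pred):
    tp, tn, fp, fn = confusion_counts(y_true, y_pred)
    accuracy = (tp + tn) / len(y_true)
    sensitivity = tp / (tp + fn)   # true positive rate
    specificity = tn / (tn + fp)   # true negative rate
    return accuracy, sensitivity, specificity

def auc(y_true, scores):
    # Rank-based (Mann-Whitney) estimate of the area under the ROC curve:
    # the probability that a random positive outscores a random negative.
    pos = [s for t, s in zip(y_true, scores) if t == 1]
    neg = [s for t, s in zip(y_true, scores) if t == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

y_true = [1, 1, 1, 0, 0, 0]        # toy ground truth
y_pred = [1, 1, 0, 0, 0, 1]        # toy hard predictions
scores = [0.9, 0.8, 0.4, 0.3, 0.2, 0.6]  # toy classifier scores
print(metrics(y_true, y_pred))     # accuracy, sensitivity, specificity
print(auc(y_true, scores))
```

The rank-based estimator avoids having to trace the ROC curve explicitly and agrees with the trapezoidal area when there are no tied scores.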

Relevance: 100.00%

Abstract:

In this paper, two real-world medical classification problems using electrocardiogram (ECG) and auscultatory blood pressure (Korotkoff) signals are examined. A total of nine machine learning models are applied to perform classification of the medical data sets. A number of useful performance metrics, which include accuracy, sensitivity, specificity, as well as the area under the receiver operating characteristic curve, are computed. In addition to the original data sets, noisy data sets are generated to evaluate the robustness of the classifiers against noise. The 10-fold cross-validation method is used to compute the performance statistics, ensuring that statistically reliable results are produced for classification of the ECG and Korotkoff signals. The outcomes indicate that while logistic regression models perform the best with the original data set, ensemble machine learning models achieve good accuracy rates with noisy data sets.
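As a sketch of the evaluation protocol described above, the snippet below generates 10-fold cross-validation index sets and a noisy copy of a signal. The additive Gaussian noise model is an assumption for illustration; the abstract does not state how the noisy data sets were generated:

```python
import random

def kfold_indices(n, k=10, seed=0):
    # Shuffle sample indices, then deal them into k near-equal folds.
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    return [idx[i::k] for i in range(k)]

def add_noise(signal, noise_std, seed=0):
    # Additive Gaussian perturbation (assumed noise model) to build
    # a noisy counterpart of an ECG or Korotkoff signal.
    rng = random.Random(seed)
    return [x + rng.gauss(0.0, noise_std) for x in signal]

folds = kfold_indices(100, k=10)          # each fold holds 10 test indices
noisy = add_noise([1.0, 2.0, 3.0], noise_std=0.1)
```

Each fold serves once as the test set while the remaining nine train the model; averaging the per-fold metrics yields the reported statistics.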

Relevance: 100.00%

Abstract:

This paper introduces an automated medical data classification method using the wavelet transform (WT) and an interval type-2 fuzzy logic system (IT2FLS). Wavelet coefficients, which serve as inputs to the IT2FLS, are a compact form of the original data, yet they exhibit highly discriminative features. The integration of WT and IT2FLS aims to cope with both the high-dimensional data challenge and uncertainty. The IT2FLS utilizes a hybrid learning process comprising unsupervised structure learning by fuzzy c-means (FCM) clustering and supervised parameter tuning by a genetic algorithm. This learning process is computationally expensive, especially when employed with high-dimensional data. The application of WT therefore reduces the computational burden and enhances the performance of the IT2FLS. Experiments are implemented with two frequently used medical data sets from the UCI Machine Learning Repository: Wisconsin breast cancer and Cleveland heart disease. A number of important metrics are computed to measure classification performance: accuracy, sensitivity, specificity, and the area under the receiver operating characteristic curve. Results demonstrate that the wavelet-IT2FLS approach significantly outperforms other machine learning methods, including the probabilistic neural network, support vector machine, fuzzy ARTMAP, and adaptive neuro-fuzzy inference system. The proposed approach is thus useful as a decision support system for clinicians and practitioners in medical practice.
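A minimal illustration of the dimensionality-reduction step: one level of the Haar wavelet transform (the simplest wavelet basis; the paper does not state which basis was used) halves the input length while preserving signal energy:

```python
import math

def haar_dwt(signal):
    # One level of the orthonormal Haar wavelet transform: pairwise
    # scaled sums (approximation) and differences (detail).
    s = math.sqrt(2.0)
    approx = [(signal[i] + signal[i + 1]) / s for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) / s for i in range(0, len(signal) - 1, 2)]
    return approx, detail

x = [4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0]   # toy signal, length 8
approx, detail = haar_dwt(x)
# Keeping only `approx` halves the dimension while retaining the coarse
# shape of the signal -- the idea behind feeding wavelet coefficients to
# the classifier instead of the raw data.
print(approx)
```

Applying the same step recursively to `approx` gives a multi-level decomposition, compressing the input further at each level.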

Relevance: 100.00%

Abstract:

Sewer odour and corrosion are caused by the reduction of sulphide ions and the release of hydrogen sulphide gas (H2S) into the sewer atmosphere. The reduction of sulphide is determined by its dissipation rate, which depends on many processes, such as emission, oxidation and precipitation, that prevail in wastewater environments. Two factors that mainly affect the dissipation of sulphide are sewer hydraulics and wastewater characteristics; modification of the latter by dosing certain chemicals is known as one of the mitigation strategies to control the dissipation of sulphide. This study investigates the dissipation of sulphide in the presence of NaOH, Mg(OH)2, Ca(NO3)2 and FeCl3, and the dissipation rate is developed as a function of hydraulic parameters such as the slope of the sewer and the velocity gradient. Experiments were conducted in an 18 m experimental sewer pipe with adjustable slope: first no chemical was added, and then each of the above-mentioned chemicals was supplemented in turn. A dissipation rate constant of 2×10⁻⁶ for sulphide was obtained from the experiments with no chemical addition. This value was then used to predict the sulphide concentration responsible for the emission of H2S gas in the presence of each of the four chemicals. It was found that the performance of the alkali substances (NaOH and Mg(OH)2) in suppressing H2S gas emission was excellent, while ferric chloride showed a moderate mitigating effect due to its slow reaction kinetics. Calcium nitrate was of little value since the wastewater used in this study experienced almost no biological growth. Thus the effectiveness of the selected chemicals in suppressing H2S gas emission had the following order: NaOH ≥ Mg(OH)2 ≥ FeCl3 ≥ Ca(NO3)2.
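Assuming simple first-order kinetics (the abstract reports the rate constant 2×10⁻⁶ but not the full rate law or its units, so the sketch below is purely illustrative and its initial concentration and time scale are hypothetical), the decay of dissolved sulphide could be written as C(t) = C0·exp(-k·t):

```python
import math

def sulphide_concentration(c0, k, t):
    # First-order decay: an assumed rate law, standing in for the
    # paper's dissipation model as a function of hydraulic parameters.
    return c0 * math.exp(-k * t)

c0 = 5.0    # hypothetical initial dissolved sulphide, mg/L
k = 2e-6    # dissipation rate constant from the no-chemical experiments
print(sulphide_concentration(c0, k, 3600.0))   # concentration after 1 h
```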

Relevance: 100.00%

Abstract:

Shannon entropy H and related measures are increasingly used in molecular ecology and population genetics because (1) unlike measures based on heterozygosity or allele number, these measures weigh alleles in proportion to their population fraction, thus capturing a previously ignored aspect of allele frequency distributions that may be important in many applications; (2) these measures connect directly to the rich predictive mathematics of information theory; (3) Shannon entropy is completely additive and has an explicitly hierarchical nature; and (4) Shannon entropy-based differentiation measures obey strong monotonicity properties that heterozygosity-based measures lack. We derive simple new expressions for the expected values of the Shannon entropy of the equilibrium allele distribution at a neutral locus in a single isolated population under two models of mutation: the infinite allele model and the stepwise mutation model. Surprisingly, for each model the entropy of this complex stochastic system is expressible as a simple combination of well-known mathematical functions. Moreover, entropy- and heterozygosity-based measures for each model are linked by simple relationships that simulations show to be approximately valid even far from equilibrium. We also identify a bridge between the two models of mutation. We apply our approach to subdivided populations that follow the finite island model, obtaining the Shannon entropy of the equilibrium allele distributions of the subpopulations and of the total population. We also derive the expected mutual information and normalized mutual information ("Shannon differentiation") between subpopulations at equilibrium, and identify the model parameters that determine them. We apply our measures to data from the common starling (Sturnus vulgaris) in Australia. Our measures provide a test for neutrality that is robust to violations of equilibrium assumptions, as verified on real-world data from starlings.
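The two families of measures compared above can be computed directly from an allele frequency vector; a minimal sketch (toy frequencies, natural-log entropy):

```python
import math

def shannon_entropy(freqs):
    # H = -sum p_i * ln(p_i) over allele frequencies; each allele is
    # weighted by its population fraction p_i.
    return -sum(p * math.log(p) for p in freqs if p > 0)

def heterozygosity(freqs):
    # Expected heterozygosity H_e = 1 - sum p_i^2: the probability that
    # two randomly drawn alleles differ.
    return 1.0 - sum(p * p for p in freqs)

freqs = [0.5, 0.3, 0.2]            # toy allele frequencies at one locus
print(shannon_entropy(freqs))
print(heterozygosity(freqs))
```

Note how the entropy responds smoothly to rare alleles through the p·ln p weighting, whereas heterozygosity, dominated by the squared frequencies, is nearly insensitive to them.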

Relevance: 100.00%

Abstract:

The so-called narrative test provides the means by which injured persons who satisfy the statutory and common law definition of serious injury may bring proceedings for common law damages under s 93 of the Transport Accident Act 1986 (Vic) and s 134AB of the Accident Compensation Act 1985 (Vic) (or, for injuries after 1 July 2014, under ss 324-347 of the Workplace Injury Rehabilitation and Compensation Act 2013 (Vic)). These are among the most litigated provisions in Australia. This article outlines the legislative and political background to these provisions, sets out the provisions themselves, and gives an account of the statutory and common law requirements needed to satisfy them.

Relevance: 100.00%

Abstract:

How one affectively sounds loneliness on screen is dependent on what instruments, melodies, voices and sound effects are used to create a sonic membrane that manifests as melancholy and malcontent. It is in the syncretic and synesthetic entanglement that sounding loneliness takes root. It is in the added value inherent in the “sound-image” – to draw upon Chion [1] – that loneliness fully emerges like a black dahlia. So many lonely people, where do they all come from? And yet, as I will suggest, this sounding loneliness is not only textually specific, simply or singularly driven by narrative and generic concerns, but is historically contingent and nationally and culturally locatable. For example, the sounds of urban isolation of the American 1940s film noir are different from the Chinese peasant laments of Chen Kaige’s Yellow Earth (1984), or what I will presently argue are the British austere strings of sounding loneliness today. When one employs a “diagnostic critique” [2], one undertakes to find the history in the text and the text in the history. It is in the interplay between sound and image that historical and political truth emerges. These contextualised and historicised soundings change across and within national landscapes and their related imaginings. We don’t just see the crumbling walls of the imagined nation state, but get to hear its desolate tunes: The Specials wailing “Ghost Town” – the anthem of/to Margaret Thatcher’s first wave of 1980s neo-liberalism – is a striking case in point. But what specifically is this contemporary “sounding loneliness”, and where does it come from? I would like to suggest that this age of loneliness is composed in, through and within the sonic vibrations found in the wretched politics of austerity. My case study will be the anomic soundings of Jonathan Glazer’s Under the Skin (2013).

Relevance: 100.00%

Abstract:

This document discusses Brazil and the Free Trade Area of the Americas (FTAA). Since the FTAA is only a proposed agreement and trade apparatus at the moment, NAFTA is used as a working model, examining its influence on and benefits for Mexico and that country’s economy.

Relevance: 100.00%

Abstract:

The determination of the term structure of interest rates is one of the main topics in the management of financial assets. Given the great importance of financial assets for the conduct of economic policy, it is fundamental to understand how the term structure is determined. The main objective of this study is to estimate the term structure of Brazilian interest rates together with the short-term interest rate. The term structure is modelled on the basis of a model with an affine structure. The estimation includes three latent factors and two macroeconomic variables, using the Bayesian technique of Markov Chain Monte Carlo (MCMC).
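The affine term-structure likelihood itself is not given in the abstract, so as a stand-in the sketch below shows the random-walk Metropolis step at the core of an MCMC estimation, applied to the toy problem of a Gaussian mean with known variance:

```python
import math
import random

def log_post(mu, data, sigma=1.0):
    # Log-posterior up to a constant: Gaussian likelihood, flat prior.
    # In the paper's setting this would be the affine model likelihood.
    return -0.5 * sum((x - mu) ** 2 for x in data) / sigma ** 2

def metropolis(data, n_iter=5000, step=0.5, seed=0):
    # Random-walk Metropolis: propose a Gaussian step, accept with
    # probability min(1, posterior ratio).
    rng = random.Random(seed)
    mu, draws = 0.0, []
    for _ in range(n_iter):
        prop = mu + rng.gauss(0.0, step)
        if math.log(rng.random()) < log_post(prop, data) - log_post(mu, data):
            mu = prop
        draws.append(mu)
    return draws

data = [1.2, 0.8, 1.1, 0.9, 1.0]          # toy observations
draws = metropolis(data)
posterior_mean = sum(draws[1000:]) / len(draws[1000:])  # discard burn-in
```

Estimating latent factors and macro-variable loadings proceeds on the same principle, with higher-dimensional proposals and the affine pricing equations inside the likelihood.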

Relevance: 100.00%

Abstract:

This paper presents a structural monetary framework featuring a demand function for non-monetary uses of gold, such as the one drawn by Barsky and Summers in their 1988 analysis of the Gibson Paradox as a natural concomitant of the gold standard period. That structural model predicts that the laws of behavior of nominal prices and interest rates are functions of the rules set by the government to command the money supply. Its fiduciary version obtains Fisherian relationships as particular cases. Its gold standard solution yields a model similar to the Barsky and Summers model, in which interest rates are exogenous and subject to shocks. This paper integrates government bonds into the analysis, treats interest rates endogenously, and shifts the responsibility for the shocks to the government budgetary financing policies. The Gibson paradox appears as "practically" the only class of behavioral pattern open for interest rates and price movements under a pure gold standard economy. Fisherian-like relationships are utterly ruled out.

Relevance: 100.00%

Abstract:

This paper demonstrates that when an industry faces potential entry and this threat of entry constrains pre-entry prices, cost and conduct are not identified from the comparative statics of equilibrium. In such a setting, the identifying assumption behind the well-established technique of relying on exogenous demand perturbations to empirically distinguish between alternative hypotheses of conduct is shown to fail. The Brazilian cement industry, where the threat of imports restrains market outcomes, provides an empirical illustration. In particular, price-cost margins estimated using this established technique are considerably biased downward, underestimating the degree of market power. A test of conduct is proposed, adapted to this constrained setting, which suggests that outcomes in the industry are collusive and characterised by market division.

Relevance: 100.00%

Abstract:

We investigate whether there was a stable money demand function for Japan in the 1990s using both aggregate and disaggregate time series data. The aggregate data appear to support the contention that there was no stable money demand function. The disaggregate data show that there was a stable money demand function. Nor was there any indication of the presence of a liquidity trap. Possible sources of the discrepancy are explored, and the diametrically opposite results between the aggregate and disaggregate analyses are attributed to the neglected heterogeneity among micro units. We also conduct a simulation analysis to show that when heterogeneity among micro units is present, the prediction of aggregate outcomes using aggregate data is less accurate than the prediction based on micro equations. Moreover, policy evaluation based on aggregate data can be grossly misleading.
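The aggregation-bias argument can be made concrete with a toy example (hypothetical coefficients, not the paper's model): two micro units with different income responses look homogeneous while their incomes move together, but an aggregate coefficient calibrated on that data predicts poorly once the incomes diverge:

```python
def micro_demand(y1, y2, b1=0.2, b2=1.8):
    # True aggregate money demand: each micro unit responds to its own
    # income with its own (heterogeneous) coefficient.
    return b1 * y1 + b2 * y2

# Calibrate a single aggregate coefficient on data where y1 == y2:
# total demand is (0.2 + 1.8) * y against total income 2 * y, so the
# aggregate relationship looks like a common coefficient of 1.0.
b_agg = (0.2 + 1.8) / 2

# Predict for diverging incomes:
y1, y2 = 10.0, 2.0
true_m = micro_demand(y1, y2)     # 0.2 * 10 + 1.8 * 2 = 5.6
agg_m = b_agg * (y1 + y2)         # 1.0 * 12 = 12.0
print(true_m, agg_m)              # the aggregate prediction is far off
```

The micro equations remain correct under the income divergence; only the aggregate relationship, which silently assumed homogeneity, breaks down.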