940 results for Binary mask
Abstract:
Environmental monitoring has become increasingly important due to the significant impact of human activities and climate change on biodiversity. Environmental sound sources such as rain and insect vocalizations are a rich and underexploited source of information in environmental audio recordings. This paper is concerned with the classification of rain within acoustic sensor recordings. We present the novel application of a set of features for classifying environmental acoustics: acoustic entropy, the acoustic complexity index, spectral cover, and background noise. In order to improve the performance of the rain classification system, we automatically classify segments of environmental recordings into the classes of heavy rain or non-rain. A decision tree classifier is experimentally compared with other classifiers. The experimental results show that our system is effective in classifying segments of environmental audio recordings, with an accuracy of 93% for the binary classification of heavy rain/non-rain.
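Two of the features named above can be sketched briefly. The definitions below are standard formulations, not necessarily the exact variants used by the authors:

```python
import numpy as np

def acoustic_entropy(spectrum):
    """Normalized Shannon entropy of a spectral energy distribution.

    Values near 1 indicate energy spread evenly across frequencies,
    as in broadband noise such as heavy rain."""
    p = np.asarray(spectrum, dtype=float)
    p = p / p.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum() / np.log2(len(spectrum)))

def acoustic_complexity_index(spectrogram):
    """ACI: per-frequency-bin sum of absolute intensity differences
    between adjacent time frames, normalized by total bin intensity."""
    S = np.asarray(spectrogram, dtype=float)  # shape: (freq_bins, frames)
    diffs = np.abs(np.diff(S, axis=1)).sum(axis=1)
    totals = S.sum(axis=1)
    return float((diffs / totals).sum())
```

A flat spectrum yields entropy 1, a single-bin spectrum yields 0, and a time-constant spectrogram yields an ACI of 0, which matches the intuition that rain noise is spectrally flat but temporally steady.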
Abstract:
Determination of sequence similarity is a central issue in computational biology, a problem addressed primarily through BLAST, an alignment-based heuristic which has underpinned much of the analysis and annotation of the genomic era. Despite their success, alignment-based approaches scale poorly with increasing data set size, and are not robust under structural sequence rearrangements. Successive waves of innovation in sequencing technologies – so-called Next Generation Sequencing (NGS) approaches – have led to an explosion in data availability, challenging existing methods and motivating novel approaches to sequence representation and similarity scoring, including the adaptation of existing methods from other domains such as information retrieval. In this work, we investigate locality-sensitive hashing of sequences through binary document signatures, applying the method to a bacterial protein classification task. Here, the goal is to predict the gene family to which a given query protein belongs. Experiments carried out on a pair of small but biologically realistic datasets (the full protein repertoires of families of Chlamydia and Staphylococcus aureus genomes respectively) show that a measure of similarity obtained by locality-sensitive hashing gives highly accurate results while offering a number of avenues for substantial performance improvements over BLAST.
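The binary-signature idea can be sketched as follows. The k-mer size, bit width and hashing scheme here are illustrative assumptions, not the paper's exact construction:

```python
import hashlib
import numpy as np

def _kmer_vector(kmer, n_bits):
    # deterministic pseudo-random +/-1 vector per k-mer, seeded by its hash
    seed = int.from_bytes(hashlib.md5(kmer.encode()).digest()[:8], "big")
    rng = np.random.default_rng(seed)
    return rng.choice([-1.0, 1.0], size=n_bits)

def signature(seq, k=3, n_bits=64):
    """Binary locality-sensitive signature of a protein sequence:
    sum the +/-1 vectors of all overlapping k-mers, keep only the signs."""
    total = np.zeros(n_bits)
    for i in range(len(seq) - k + 1):
        total += _kmer_vector(seq[i:i + k], n_bits)
    return total >= 0  # boolean array = the binary signature

def similarity(sig_a, sig_b):
    """Fraction of matching bits; 1.0 for identical signatures."""
    return float(np.mean(sig_a == sig_b))
```

Signatures are compared with cheap bitwise operations, which is what allows this approach to scale where alignment-based scoring does not.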
Abstract:
Diatomite, a porous non-metal mineral, was used as a support to prepare TiO2/diatomite composites by a modified sol–gel method. The as-prepared composites were calcined at temperatures ranging from 450 to 950 °C. The characterization tests included X-ray powder diffraction (XRD), scanning electron microscopy (SEM) with an energy-dispersive X-ray spectrometer (EDS), high-resolution transmission electron microscopy (HRTEM), X-ray photoelectron spectroscopy (XPS), and nitrogen adsorption/desorption measurements. The XRD analysis indicated that a binary mixture of anatase and rutile exists in the composites. The morphology analysis confirmed that the TiO2 particles were uniformly immobilized on the surface of the diatoms with strong interfacial anchoring, which leads to little loss of photocatalytic components during practical applications. Further XPS studies of the hybrid catalyst found evidence of Ti–O–Si bonds and an increased percentage of surface hydroxyl groups. In addition, the adsorption capacity and photocatalytic activity of the synthesized TiO2/diatomite composites were evaluated by studying the degradation kinetics of aqueous Rhodamine B under UV-light irradiation. The photocatalytic degradation was found to follow pseudo-first-order kinetics according to the Langmuir–Hinshelwood model. The best removal efficiency was observed for the composites calcined at 750 °C, which is attributed to a relatively appropriate anatase/rutile mixing ratio of 90/10.
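The pseudo-first-order rate law, ln(C0/C) = k_app * t, is typically verified by a linear fit of the linearized concentration data. A minimal sketch on synthetic data (the time points and rate constant are illustrative, not the paper's measurements):

```python
import numpy as np

# pseudo-first-order (Langmuir-Hinshelwood, dilute limit): ln(C0/C) = k_app * t
t = np.array([0.0, 10.0, 20.0, 30.0, 40.0])   # irradiation time, min (synthetic)
k_true = 0.05                                  # assumed rate constant, 1/min
C = np.exp(-k_true * t)                        # normalized concentration C/C0
k_app = np.polyfit(t, np.log(C[0] / C), 1)[0]  # slope of the linearized plot
```

On real data the linearity of the ln(C0/C) vs t plot is itself the check that the Langmuir–Hinshelwood dilute-limit approximation holds.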
Abstract:
Description of a patient's injuries is recorded in narrative text form by hospital emergency departments. For statistical reporting, this text data needs to be mapped to pre-defined codes. Existing research in this field uses the Naïve Bayes probabilistic method to build classifiers for mapping. In this paper, we focus on providing guidance on the selection of a classification method. We build a number of classifiers belonging to different classification families, such as decision-tree, probabilistic, neural-network, instance-based, ensemble-based and kernel-based linear classifiers. Extensive pre-processing is carried out to ensure the quality of the data and, hence, of the classification outcome. Records with a null entry in the injury description are removed. Misspelling correction is carried out by finding and replacing each misspelt word with a sound-alike word. Meaningful phrases are identified and kept intact, instead of having parts of a phrase removed as stop words. Abbreviations appearing in many variant forms are manually identified and normalised to a single form. Clustering is utilised to discriminate between non-frequent and frequent terms. This process reduced the number of text features dramatically, from about 28,000 to 5000. The medical narrative text injury dataset under consideration is composed of many short documents. The data can be characterized as high-dimensional and sparse, i.e., few features are irrelevant, but features are correlated with one another. Therefore, matrix factorization techniques such as Singular Value Decomposition (SVD) and Non Negative Matrix Factorization (NNMF) have been used to map the processed feature space to a lower-dimensional feature space. Classifiers have been built on these reduced feature spaces. In experiments, a set of tests is conducted to determine which classification method is best for medical text classification.
The Non Negative Matrix Factorization with Support Vector Machine method achieves 93% precision, which is higher than all of the tested traditional classifiers. We also found that TF/IDF weighting, which works well for long text classification, is inferior to binary weighting for short document classification. Another finding is that the top-n terms should be removed only in consultation with medical experts, as this affects the classification performance.
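A minimal multiplicative-update NNMF, the factorization used above to map the term space to a lower-dimensional one. The rank, iteration count and initialization are illustrative, and the downstream SVM stage is omitted:

```python
import numpy as np

def nnmf(V, rank, iters=500, seed=0):
    """Factor a nonnegative matrix V (docs x terms) as V ~ W @ H,
    where W (docs x rank) gives the reduced document features."""
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, rank)) + 0.1
    H = rng.random((rank, m)) + 0.1
    eps = 1e-9  # guards against division by zero
    for _ in range(iters):
        # Lee-Seung multiplicative updates: factors stay nonnegative
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H
```

Because the updates only ever multiply nonnegative quantities, the factors remain nonnegative, which is what makes the reduced features interpretable as additive topic-like components.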
Abstract:
Accounts of the governance of prostitution have typically argued that prostitutes are, in one way or another, stigmatised social outcasts. There is a persistent claim that power has operated to dislocate or banish the prostitute from the community in order to silence, isolate, hide, restrict, or punish. I argue that another position may be tenable; that is, power has operated to locate prostitution within the social. Power does not operate to 'desocialise' prostitution, but has in recent times operated increasingly to normalise it. Power does not demarcate prostitutes from the social according to some binary mechanics of difference, but works instead according to a principle of differentiation which seeks to connect, include, circulate and enable specific prostitute populations within the social. In this paper I examine how prostitution has been singled out for public attention as a sociopolitical problem and governed accordingly. The concept of governmentality is used to think through such issues, providing, as it does, a non-totalising and non-reductionist account of rule. It is argued that a combination of self-regulatory and punitive practices developed during modernity to manage socially problematic prostitute populations.
Abstract:
Modelling of food processing is complex because it involves sophisticated materials and transport phenomena. Most agricultural products, such as fruits and vegetables, are hygroscopic porous media containing free water, bound water, gas and a solid matrix. A model considering all of these phases has not yet been developed. In this article, a comprehensive porous media model for drying has been developed that considers bound water and free water separately, as well as water vapour and air. Free water transport was treated as diffusion, pressure-driven flow and evaporation. Bound water is assumed to be converted to free water due to the concentration difference, and can also diffuse. Binary diffusion between water vapour and air was considered. Since the model is based on fundamental physics, it can be applied to any drying application, and to other food processing operations where heat and mass transfer take place in porous media with significant evaporation and other phase changes.
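The bound-to-free water conversion described above can be illustrated with a lumped sketch. The rate constant and equilibrium value are hypothetical, and the full model additionally couples diffusion, pressure-driven flow and evaporation:

```python
# bound water converts to free water at a rate proportional to its
# departure from equilibrium (hypothetical k_b and b_eq; explicit Euler)
k_b, b_eq = 0.2, 0.1      # conversion rate (1/s) and equilibrium bound water
dt, steps = 0.1, 200
b, w = 1.0, 0.0           # bound and free water content (dimensionless)
for _ in range(steps):
    conv = k_b * max(b - b_eq, 0.0) * dt
    b -= conv
    w += conv
# total water is conserved; b relaxes toward b_eq
```

Writing the conversion as a single transfer term that is subtracted from one phase and added to the other guarantees mass conservation by construction, which is the standard discipline for multiphase source terms.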
Abstract:
To enhance the efficiency of regression parameter estimation by modeling the correlation structure of correlated binary error terms in quantile regression with repeated measurements, we propose a Gaussian pseudolikelihood approach for estimating correlation parameters and selecting the most appropriate working correlation matrix simultaneously. The induced smoothing method is applied to estimate the covariance of the regression parameter estimates, which can bypass density estimation of the errors. Extensive numerical studies indicate that the proposed method performs well in selecting an accurate correlation structure and improving regression parameter estimation efficiency. The proposed method is further illustrated by analyzing a dental dataset.
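The candidate working correlation structures usually entertained in such selection problems can be sketched as follows (exchangeable and AR(1) shown; the Gaussian pseudolikelihood selection step itself is not reproduced here):

```python
import numpy as np

def working_correlation(structure, m, rho):
    """m x m working correlation matrix for repeated measurements."""
    if structure == "exchangeable":   # constant within-subject correlation
        R = np.full((m, m), rho)
        np.fill_diagonal(R, 1.0)
    elif structure == "ar1":          # correlation decays geometrically with lag
        idx = np.arange(m)
        R = rho ** np.abs(idx[:, None] - idx[None, :])
    else:
        raise ValueError(structure)
    return R
```

A selection procedure of the kind proposed would score each candidate structure (here via the Gaussian pseudolikelihood) and keep the one that best fits the observed within-subject dependence.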
Queensland's budget austerity and its impact on social welfare: is the cure worse than the disease?
Abstract:
While considerable attention has been paid to the austerity experiments in Europe, much less attention has been paid to austerity case studies from other parts of the world. This paper examines the case of Queensland, Australia, where the government has pursued austerity measures while making dire warnings that, unless public debt was slashed and the public service sector downsized, Queensland risked becoming the Spain of Australia. The comparison is implausible, given the very different economic situations of Queensland and Spain. Nonetheless, it constructed a sense of crisis that helped to mask standard neoliberal economic reform. While pursuing neoliberal economic policies, the Queensland Government has also been introducing draconian laws that limit civil liberties and political freedoms for ordinary citizens. This mix of authoritarianism and austerity has met considerable resistance, and this dynamic is discussed in the paper, along with the predictable and unequal impact that austerity measures have had on the general population and social services.
Abstract:
Construction scholars suggest that procurement processes can be used as mechanisms to change construction industry practices. This paper discusses industry changes as a response to calls for the integration of sustainability ideals into construction practices. Because major infrastructure construction has been identified as a key producer of greenhouse gas emissions (GHGE), this study explores collaborative procurement models that have been used to facilitate the mitigation of GHGE. The study focuses on the application of non-price incentives and rewards that work together as a binary mechanism. Data were collected using mixed methods: government document content analysis was complemented with data collected through focus groups and individual interviews with both clients and contractors. This report includes examples of the greening of procurement agendas at three Australian road authorities in relation to collaborative procurement project delivery models. Three collaborative procurement models (Alliance Consortium, Early Contractor Involvement and Public Private Partnerships) provide evidence of construction projects that were completed early. It can also be argued that both clients and contractors are rewarded through collaborative project delivery. The incentive of early completion is rewarded with a reduction of GHGE. This positive environmental outcome, based on a dual benefit and non-price sustainability criteria, suggests a step towards changed industry practices through the use of green procurement models.
Abstract:
Texture enhancement is an important component of image processing that finds extensive application in science and engineering. The quality of medical images, quantified using the imaging texture, plays a significant role in the routine diagnosis performed by medical practitioners. Most image texture enhancement is performed using classical integral-order differential mask operators. Recently, first-order fractional differential operators were used to enhance images. Experimentation with these methods led to the conclusion that fractional differential operators not only maintain the low-frequency contour features in the smooth areas of the image, but also nonlinearly enhance edges and textures corresponding to high-frequency image components. However, whilst these methods perform well in particular cases, they are not routinely useful across all applications. To this end, we apply the second-order Riesz fractional differential operator to improve upon existing approaches to texture enhancement. Compared with the classical integral-order differential mask operators and other first-order fractional differential operators, we find that our new algorithms provide higher signal-to-noise ratios and superior image quality.
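Fractional differential masks are commonly assembled from Grünwald–Letnikov coefficients; a sketch follows. The symmetric mask shown is a generic illustration of the Riesz (two-sided) idea, not the paper's exact second-order operator:

```python
import numpy as np

def gl_coeffs(alpha, n):
    """First n Grunwald-Letnikov coefficients (-1)^k * C(alpha, k),
    via the recurrence w_k = w_{k-1} * (k - 1 - alpha) / k."""
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - alpha) / k
    return w

def riesz_mask(alpha, n):
    """Symmetric 1-D mask mirroring the one-sided GL coefficients,
    the basic ingredient of a two-sided (Riesz-type) fractional operator."""
    w = gl_coeffs(alpha, n)
    return np.concatenate([w[::-1], w[1:]])  # mirror around the center tap
```

For alpha = 1 the coefficients reduce to the classical finite difference [1, -1], so the fractional masks interpolate smoothly between the integral-order operators they generalize.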