50 results for Macrobrachium - Classification


Relevance: 20.00%

Abstract:

A deductive system that enables us to derive many legal rules from a few principles makes the law more, rather than less, certain, since this approach parallels the actual process by which judicial decisions are reached. Uncertainty as to the meaning of equity in the law is inevitably ... due to the absence of legal guidance for the standard of moral values to be observed in transactions ...

Relevance: 20.00%

Abstract:

Previously, the authors proposed a new, simple method of frequency-domain analysis based on the two-dimensional discrete wavelet transform to objectively measure pilling intensity in sample fabric images. The method was further characterized, and the results indicate that standard deviation and variance are the most appropriate measures of the dispersion of wavelet detail coefficients, that the relationship between wavelet analysis scale and fabric inter-yarn pitch was empirically confirmed, and that random fabric patterns do not appear to impact the effectiveness of the analysis method.
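A minimal sketch of such a measure, assuming a single-level Haar transform implemented directly in NumPy (the published method's exact wavelet, analysis scales, and preprocessing are not reproduced here):

```python
import numpy as np

def haar_dwt2(img):
    """One level of a 2-D Haar DWT (even dimensions assumed): returns
    approximation plus horizontal, vertical and diagonal detail arrays."""
    a = img[0::2, 0::2].astype(float)
    b = img[0::2, 1::2].astype(float)
    c = img[1::2, 0::2].astype(float)
    d = img[1::2, 1::2].astype(float)
    ll = (a + b + c + d) / 2.0   # approximation
    lh = (a - b + c - d) / 2.0   # horizontal detail
    hl = (a + b - c - d) / 2.0   # vertical detail
    hh = (a - b - c + d) / 2.0   # diagonal detail
    return ll, lh, hl, hh

def pilling_measure(img, levels=3):
    """Standard deviation of the detail coefficients at each analysis
    scale; higher dispersion suggests more pilling texture there."""
    measures = []
    ll = np.asarray(img)
    for _ in range(levels):
        ll, lh, hl, hh = haar_dwt2(ll)
        details = np.concatenate([lh.ravel(), hl.ravel(), hh.ravel()])
        measures.append(details.std())
    return measures
```

On a perfectly flat (pill-free) image every detail coefficient is zero, so the dispersion is zero at every scale; surface texture raises it.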

Relevance: 20.00%

Abstract:

The complete mitochondrial DNA sequence was determined for the Australian giant crab Pseudocarcinus gigas (Crustacea: Decapoda: Menippidae) and the giant freshwater shrimp Macrobrachium rosenbergii (Crustacea: Decapoda: Palaemonidae). The Pse. gigas and M. rosenbergii mitochondrial genomes are circular molecules, 15,515 and 15,772 bp in length, respectively, and have the same gene composition as found in other metazoans. The gene arrangement of M. rosenbergii corresponds with that of the presumed ancestral arthropod gene order, represented by Limulus polyphemus, except for the position of the tRNALeu(UUR) gene. The Pse. gigas gene arrangement corresponds exactly with that reported for another brachyuran, Portunus trituberculatus, and differs from the M. rosenbergii gene order by only the position of the tRNAHis gene. Given the relative positions of intergenic noncoding nucleotides, the "duplication/random loss" model appears to be the most plausible mechanism for the translocation of this gene. These data represent the first caridean and only the second brachyuran complete mtDNA sequences, and provide a source of information that will facilitate surveys of intraspecific variation within these commercially important decapod species.

Relevance: 20.00%

Abstract:

There has hitherto been little research into evolutionary and taxonomic relationships amongst species of the freshwater prawn genus Macrobrachium Bate across its global distribution. Previous work by the authors demonstrated that the endemic Australian species did not evolve from a single ancestral lineage. To examine whether other regional Macrobrachium faunas also reflect this pattern of multiple origins, the phylogeny of 30 Macrobrachium species from Asia, Central/South America and Australia was inferred from mitochondrial 16S rRNA sequences. Phylogenetic relationships demonstrate that, despite some evidence for regional diversification, Australia, Asia and South America clearly contain Macrobrachium species that do not share a common ancestry, suggesting that large-scale dispersal has been a major feature of the evolutionary history of the genus. The evolution of abbreviated larval development (ALD), associated with the transition from an estuarine into a purely freshwater lifecycle, was also mapped onto the phylogeny and was shown to be a relatively homoplasious trait and not taxonomically informative. Other taxonomic issues, as well as the evolutionary origins of Macrobrachium, are also discussed.

Relevance: 20.00%

Abstract:

The Fish-net algorithm is a novel field learning algorithm that derives classification rules by looking at the range of values of each attribute instead of individual point values. In this paper, we present a Feature Selection Fish-net learning algorithm to solve the dual imbalance problem in text classification. Dual imbalance comprises instance imbalance and feature imbalance: the instance imbalance is caused by unevenly distributed classes, and the feature imbalance by differing document lengths. The proposed approach consists of two phases: (1) select a feature subset consisting of the features that are most supportive of the difficult minority class; (2) construct classification rules based on the original Fish-net algorithm. Our experimental results on Reuters-21578 show that the proposed approach achieves a better balanced accuracy rate on both the majority and minority classes than multinomial Naive Bayes and SVM.
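The abstract does not give the paper's selection criterion; purely as a sketch of phase (1), features could be ranked by how much more often they occur in minority-class documents, with the scoring rule below a hypothetical stand-in:

```python
from collections import Counter

def minority_supportive_features(docs, labels, minority, k=3):
    """Rank features by the gap between their document frequency in the
    minority class and in the majority class, and keep the top k.
    (Illustrative scoring only; not the paper's exact criterion.)"""
    n_min = sum(1 for y in labels if y == minority) or 1
    n_maj = (len(labels) - n_min) or 1
    min_cnt, maj_cnt = Counter(), Counter()
    for doc, y in zip(docs, labels):
        for feat in set(doc):                 # presence, not raw counts
            (min_cnt if y == minority else maj_cnt)[feat] += 1
    scores = {f: min_cnt[f] / n_min - maj_cnt[f] / n_maj
              for f in set(min_cnt) | set(maj_cnt)}
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    return [f for f, _ in ranked[:k]]
```

A feature appearing in every minority document and no majority document scores 1.0, the maximum, and so survives the selection.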

Relevance: 20.00%

Abstract:

Compared with conventional two-class learning schemes, one-class classification uses only a single class for training. Applying one-class classification to the minority class of an imbalanced data set has been shown to achieve better performance than the two-class approach. In this paper, in order to make the best use of all the available information during the learning procedure, we propose a general framework that first uses the minority class for training in the one-class classification stage, and then uses both the minority and majority classes for estimating the generalization performance of the constructed classifier. Based upon this generalization performance measurement, a parameter search algorithm selects the best parameter settings for the classifier. Experiments on UCI and Reuters text data show that a one-class SVM embedded in this framework achieves much better performance than the standard one-class SVM alone and other learning schemes, such as one-class Naive Bayes, one-class nearest neighbour, and neural networks.
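The two-stage framework can be sketched as follows, with a toy centroid-distance model standing in for the one-class SVM and balanced accuracy as the generalization measure (both are illustrative assumptions, not the paper's exact choices):

```python
import numpy as np

def fit_one_class(minority_X, radius):
    """Toy one-class model: accept any point within `radius` of the
    minority-class centroid (a stand-in for a one-class SVM)."""
    centre = minority_X.mean(axis=0)
    return lambda X: np.linalg.norm(X - centre, axis=1) <= radius

def select_radius(min_X, maj_X, candidates):
    """Stage 2 of the framework: train on the minority class only, but
    score each parameter setting on BOTH classes (balanced accuracy)
    and keep the best-scoring setting."""
    best, best_score = None, -1.0
    for r in candidates:
        clf = fit_one_class(min_X, r)
        tpr = clf(min_X).mean()           # minority correctly accepted
        tnr = 1.0 - clf(maj_X).mean()     # majority correctly rejected
        score = (tpr + tnr) / 2.0
        if score > best_score:
            best, best_score = r, score
    return best, best_score
```

A too-small radius rejects minority points and a too-large one swallows the majority; scoring on both classes is what lets the search find the middle ground.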

Relevance: 20.00%

Abstract:

This paper reviews the appropriateness for application to large data sets of standard machine learning algorithms, which were mainly developed in the context of small data sets. Sampling and parallelisation have proved useful means for reducing computation time when learning from large data sets. However, such methods assume that algorithms that were designed for use with what are now considered small data sets are also fundamentally suitable for large data sets. It is plausible that optimal learning from large data sets requires a different type of algorithm to optimal learning from small data sets. This paper investigates one respect in which data set size may affect the requirements of a learning algorithm — the bias plus variance decomposition of classification error. Experiments show that learning from large data sets may be more effective when using an algorithm that places greater emphasis on bias management, rather than variance management.
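One way to estimate such a decomposition empirically is the usual Monte-Carlo recipe: train on repeated random samples, take the modal ("main") prediction per test point, and measure how often the main prediction is wrong (bias-like term) versus how often an individual model disagrees with it (variance-like term). This generic sketch does not reproduce the paper's exact decomposition or learners:

```python
import random
from collections import Counter

def bias_variance(train_pool, test_pts, learner, trials=50, n=20, seed=0):
    """Monte-Carlo estimate of 0-1 loss bias/variance terms.
    `learner(sample)` must return a predict function; `train_pool` and
    `test_pts` are (x, label) pairs."""
    rng = random.Random(seed)
    preds = [[] for _ in test_pts]
    for _ in range(trials):
        sample = [rng.choice(train_pool) for _ in range(n)]
        model = learner(sample)
        for i, (x, _) in enumerate(test_pts):
            preds[i].append(model(x))
    bias = variance = 0.0
    for (x, y), p in zip(test_pts, preds):
        main = Counter(p).most_common(1)[0][0]   # modal prediction
        bias += (main != y) / len(test_pts)
        variance += sum(q != main for q in p) / (trials * len(test_pts))
    return bias, variance
```

Comparing these two terms across small and large training pools is the kind of experiment the paper describes: an algorithm whose error is dominated by the variance term on small data may need bias management instead once data is plentiful.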

Relevance: 20.00%

Abstract:

This paper presents a novel ant-system-based optimisation method that integrates genetic algorithms and simplex algorithms. The method not only speeds up the search for solutions but also improves their quality. In this paper, the proposed method is applied to set up a learning model for the "tuned" mask used for texture classification. Experimental results on aerial images, and comparisons with genetic algorithms and genetic simplex algorithms, are presented to illustrate the merit and feasibility of the proposed method.
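The abstract gives no algorithmic detail, so purely as a flavour of such hybridisation, a toy continuous optimiser mixing population elitism, GA one-point crossover, and a simplex-style reflection move might look like the following. Every detail (the pheromone machinery of a true ant system is omitted, the moves and rates are arbitrary) is a hypothetical simplification, not the paper's method:

```python
import random

def hybrid_optimise(f, dim=2, ants=20, iters=200, seed=1):
    """Minimise f over [0, 1]^dim with a toy population hybrid:
    keep an elite half, then breed children either by GA one-point
    crossover or by reflecting the worst individual through the
    centroid of two elite parents (the core Nelder-Mead-style move)."""
    rng = random.Random(seed)
    pop = [[rng.random() for _ in range(dim)] for _ in range(ants)]
    for _ in range(iters):
        pop.sort(key=f)                        # best (lowest f) first
        elite, worst = pop[:ants // 2], pop[-1]
        children = []
        for _ in range(ants - len(elite)):
            a, b = rng.sample(elite, 2)
            if dim > 1 and rng.random() < 0.5:
                cut = rng.randrange(1, dim)
                child = a[:cut] + b[cut:]               # GA crossover
            else:
                cent = [(x + y) / 2 for x, y in zip(a, b)]
                child = [min(1.0, max(0.0, 2 * c - w))  # simplex-style
                         for c, w in zip(cent, worst)]  # reflection
            children.append(child)
        pop = elite + children
    return min(pop, key=f)
```

The crossover recombines coordinates of good solutions while the reflection step generates genuinely new values, which is roughly the division of labour such hybrids aim for.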

Relevance: 20.00%

Abstract:

We present a novel ant colony algorithm integrating genetic algorithms and simplex algorithms. The method not only speeds up the search for optimal solutions but also improves their quality. The proposed method is applied to set up a learning model for the "tuned" mask used for texture classification. Experimental results on real-world images, and comparisons with genetic algorithms and genetic simplex algorithms, are presented to illustrate the merit and feasibility of the proposed method.

Relevance: 20.00%

Abstract:

Rough set theory is a new mathematical approach to imprecision, vagueness, and uncertainty. The concept of reduction of the decision table based on rough sets is very useful for feature selection. This paper describes an application of the rough set method to feature selection and reduction in texture image recognition. The methods applied include continuous data discretization based on Fuzzy c-means, and the rough set method for feature selection and reduction. The approach was applied to tree extraction in aerial images. The experiments show that the methods presented in this paper are practical and effective.
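The reduction step rests on the rough-set dependency degree: the fraction of rows whose equivalence class under a chosen attribute subset is consistent with the decision label. A greedy reduct search over already-discretized attributes can be sketched as follows (the Fuzzy c-means discretization stage is omitted; the greedy strategy is one common choice, not necessarily the paper's):

```python
def dependency(rows, labels, attrs):
    """Fraction of rows whose equivalence class under `attrs`
    contains a single decision label (the positive region)."""
    classes = {}
    for row, y in zip(rows, labels):
        key = tuple(row[a] for a in attrs)
        classes.setdefault(key, set()).add(y)
    consistent = sum(1 for row, y in zip(rows, labels)
                     if len(classes[tuple(row[a] for a in attrs)]) == 1)
    return consistent / len(rows)

def greedy_reduct(rows, labels):
    """Greedily add the attribute that most raises the dependency
    degree until it matches that of the full attribute set."""
    all_attrs = list(range(len(rows[0])))
    target = dependency(rows, labels, all_attrs)
    chosen = []
    while dependency(rows, labels, chosen) < target:
        best = max((a for a in all_attrs if a not in chosen),
                   key=lambda a: dependency(rows, labels, chosen + [a]))
        chosen.append(best)
    return chosen
```

Attributes whose removal leaves the dependency degree unchanged are redundant for classification, which is exactly what makes the reduct a feature-selection device.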

Relevance: 20.00%

Abstract:

An effective scheme for soccer summarization is essential to improve the usability of this massively growing body of video data. This paper presents an extension to our recent work, which proposed a framework for integrating highlights into play-breaks to construct more complete soccer summaries. The current focus is to demonstrate the benefits of detecting specific audio-visual features during play-break sequences in order to classify the highlights contained within them. The main purpose is to generate summaries that are individually self-consumable. To support this framework, the algorithms for shot classification and for detection of near-goal and slow-motion replay scenes are described. The results of our experiment using five soccer videos (20 minutes each) show the performance and reliability of our framework.

Relevance: 20.00%

Abstract:

This paper presents a performance study of four statistical test algorithms used to identify smooth image blocks in order to filter the reconstructed image produced by a video coder. The four algorithms considered are the Coefficient of Variation (CV), the Exponential Entropy of Pal and Pal (E), Shannon's (Logarithmic) Entropy (H), and Quadratic Entropy (Q). These statistical measures are employed to distinguish between smooth and textured blocks in a reconstructed image. Linear filtering is then carried out on the smooth blocks to reduce the blocking artefact. The rationale behind applying the filter to the smooth blocks only is that the blocking artefact is visually more prominent in the smooth regions of an image than in the textured regions.
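Two of the four statistics, the coefficient of variation and Shannon's entropy, can be computed per block along these lines (the thresholds below are hypothetical illustrations; the paper compares the measures' discrimination ability rather than fixing cut-offs):

```python
import math

def coefficient_of_variation(block):
    """CV = std/mean of pixel intensities; low values suggest a
    smooth block that can safely be deblocking-filtered."""
    n = len(block)
    mean = sum(block) / n
    var = sum((p - mean) ** 2 for p in block) / n
    return math.sqrt(var) / mean if mean else float('inf')

def shannon_entropy(block, bins=8, lo=0, hi=256):
    """Shannon's (logarithmic) entropy of the intensity histogram;
    near zero for smooth blocks, larger for textured ones."""
    hist = [0] * bins
    for p in block:
        hist[min(bins - 1, (p - lo) * bins // (hi - lo))] += 1
    n = len(block)
    return -sum((c / n) * math.log2(c / n) for c in hist if c)

def is_smooth(block, cv_thresh=0.05, h_thresh=0.5):
    # hypothetical cut-offs, for illustration only
    return (coefficient_of_variation(block) < cv_thresh
            and shannon_entropy(block) < h_thresh)
```

A constant 8x8 block scores zero on both measures, while a full-range gradient block scores high on both, which is the smooth/textured separation the filtering decision needs.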