942 results for Rule-Based Classification


Relevance:

100.00%

Publisher:

Abstract:

This paper evaluates six commonly available parts-of-speech tagging tools over corpora other than those upon which they were originally trained. In particular, this investigation measures the performance of the selected tools over varying styles and genres of text without retraining, under the assumption that domain-specific training data is not always available. An investigation is performed to determine whether improved results can be achieved by combining the tagging tools into ensembles that use voting schemes to determine the best tag for each word. It is found that although accuracy drops due to non-domain-specific training and tag mapping between corpora, it remains very high, with the support vector machine-based tagger and the decision tree-based tagger performing best over different corpora. It is also found that an ensemble containing a support vector machine-based tagger, a probabilistic tagger, a decision tree-based tagger and a rule-based tagger produces the largest increase in accuracy and the largest reduction in error across different corpora, using the Precision-Recall voting scheme.
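The abstract does not spell out the Precision-Recall voting scheme itself; a minimal sketch, assuming each tagger's vote for a tag is weighted by that tagger's per-tag precision estimated on held-out data (all names and numbers below are hypothetical):

```python
from collections import defaultdict

def pr_vote(votes, precision):
    """Combine per-word tag votes from several taggers.

    votes:     {tagger_name: predicted_tag} for one word
    precision: {tagger_name: {tag: precision}} estimated on held-out data
    Returns the tag with the highest precision-weighted support.
    """
    support = defaultdict(float)
    for tagger, tag in votes.items():
        # Weight each vote by how precise that tagger is for that tag.
        support[tag] += precision[tagger].get(tag, 0.0)
    return max(support, key=support.get)

# Hypothetical example: three taggers disagree on one word.
votes = {"svm": "NN", "tree": "NN", "rule": "VB"}
precision = {"svm": {"NN": 0.97}, "tree": {"NN": 0.95}, "rule": {"VB": 0.90}}
print(pr_vote(votes, precision))  # -> "NN"
```

A recall-weighted variant, or a combination of precision and recall weights, would fit the same skeleton.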

Relevance:

100.00%

Publisher:

Abstract:

A material model for more effective analysis of plastic deformation of sheet materials is presented in this paper. The model is capable of considering the following aspects of the plastic deformation behavior of sheet materials: the anisotropy in yield stresses in different directions, by using a quadratic yield function (based on Hill's 1948 model and stress ratios); the anisotropy in work hardening, by introducing non-constant flow stress hardening in different directions; the anisotropy in plastic strains in different directions, by using a quadratic plastic potential function and a non-associated flow rule (based on Hill's 1948 model and plastic strain ratios, r-values); and finally some of the cyclic hardening phenomena, such as the Bauschinger effect and transient behavior under reverse loading, by using a coupled nonlinear kinematic hardening model (the so-called Armstrong-Frederick-Chaboche model). The basic fundamentals of the model's plasticity are presented in a general framework. Then, the model adjustment procedure is derived for the plasticity formulations. A generic numerical stress integration procedure is also developed based on the backward-Euler method (the so-called multistage return-mapping algorithm). Different aspects of the model are verified for DP600 steel sheet. Results show that the new model is able to predict the sheet material behavior in both anisotropic hardening and cyclic hardening regimes more accurately. By incorporating the above-mentioned features in the presented constitutive model, it is expected that more accurate results can be obtained from computational simulations of sheet material forming processes. For instance, more precise springback predictions for parts formed from highly anisotropically hardening materials, and more accurate forming limit diagrams, are expected from the developed material model.
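For orientation, Hill-1948-type quadratic functions under plane stress take the following general shape; the coefficient symbols here are generic placeholders rather than the paper's notation. In a non-associated model the yield-function coefficients are fitted to the directional yield stresses while the plastic-potential coefficients are fitted to the r-values:

```latex
% Quadratic yield function f and plastic potential g (plane-stress sketch;
% P_i and G_i are generic coefficients, not the paper's symbols):
f:\quad \bar{\sigma}^2 = \sigma_{xx}^2 + P_1\,\sigma_{yy}^2
         - P_2\,\sigma_{xx}\sigma_{yy} + P_3\,\sigma_{xy}^2
g:\quad \bar{\sigma}_p^2 = \sigma_{xx}^2 + G_1\,\sigma_{yy}^2
         - G_2\,\sigma_{xx}\sigma_{yy} + G_3\,\sigma_{xy}^2
% Non-associated flow: d\varepsilon^p = d\lambda\,\partial g/\partial\sigma,
% with g \neq f, so yield-stress and r-value anisotropy decouple.
```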

Relevance:

100.00%

Publisher:

Abstract:

OBJECTIVE: This study proposes a new approach that considers uncertainty in predicting and quantifying the presence and severity of diabetic peripheral neuropathy. METHODS: A rule-based fuzzy expert system was designed by four experts in diabetic neuropathy. The model variables were used to classify neuropathy in diabetic patients, defining it as mild, moderate, or severe. System performance was evaluated by means of the Kappa agreement measure, comparing the results of the model with those generated by the experts in an assessment of 50 patients. Accuracy was evaluated by an ROC curve analysis based on 50 other cases; the results of those clinical assessments were considered the gold standard. RESULTS: According to the Kappa analysis, the model was in moderate agreement with expert opinions. The ROC analysis (evaluation of accuracy) determined an area under the curve equal to 0.91, demonstrating very good consistency in classifying patients with diabetic neuropathy. CONCLUSION: The model efficiently classified diabetic patients with different degrees of neuropathy severity. In addition, the model provides a way to quantify diabetic neuropathy severity and allows a more accurate assessment of the patient's condition.
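The rule base itself was defined by the four experts and is not reproduced in the abstract; a toy sketch of the underlying fuzzy-membership idea, reduced to a single hypothetical 0-10 symptom score with made-up breakpoints:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def classify(symptom_score):
    """Grade neuropathy severity from one 0-10 score (hypothetical ranges)."""
    degrees = {
        "mild":     tri(symptom_score, 0, 2, 5),
        "moderate": tri(symptom_score, 3, 5, 8),
        "severe":   tri(symptom_score, 6, 9, 11),
    }
    return max(degrees, key=degrees.get), degrees

print(classify(6.5))  # memberships of each grade; the highest wins
```

Because an input can partially belong to two grades at once, such a model naturally quantifies severity rather than only labeling it.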

Relevance:

100.00%

Publisher:

Abstract:

In this paper we compare the performance of two image classification paradigms (object- and pixel-based) for creating a land cover map of Asmara, the capital of Eritrea, and its surrounding areas, using Landsat ETM+ imagery acquired in January 2000. The image classification methods used were maximum likelihood for the pixel-based approach and Bhattacharyya distance for the object-oriented approach, available in the ArcGIS and SPRING software packages, respectively. Advantages and limitations of both approaches are presented and discussed. Classification outputs were assessed using overall accuracy and Kappa indices. The pixel- and object-based classification methods resulted in overall accuracies of 78% and 85%, respectively. The Kappa coefficients for the pixel- and object-based approaches were 0.74 and 0.82, respectively. Although the pixel-based approach is the most commonly used method, assessment and visual interpretation of the results clearly reveal that the object-oriented approach has advantages for this specific case study.
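Both reported measures follow directly from a confusion matrix; a small sketch of the standard computations (the 2x2 matrix below is invented for illustration):

```python
import numpy as np

def accuracy_and_kappa(cm):
    """Overall accuracy and Cohen's kappa from a confusion matrix
    (rows = reference classes, columns = mapped classes)."""
    cm = np.asarray(cm, dtype=float)
    n = cm.sum()
    po = np.trace(cm) / n                      # observed agreement
    pe = (cm.sum(0) * cm.sum(1)).sum() / n**2  # chance agreement
    return po, (po - pe) / (1 - pe)

cm = [[50, 5], [8, 37]]  # hypothetical 2-class example
print(accuracy_and_kappa(cm))  # -> (0.87, ~0.74)
```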

Relevance:

100.00%

Publisher:

Abstract:

Melanoma is a type of skin cancer caused by the uncontrolled growth of atypical melanocytes. In recent decades, computer-aided diagnosis has been used to support medical professionals; however, there is still no globally accepted tool. In this context, and in line with the state of the art, we propose a system that receives a dermatoscopy image and provides a diagnosis of whether the lesion is benign or malignant. The tool comprises the following modules: preprocessing, segmentation, feature extraction, and classification. Preprocessing removes hairs, segmentation isolates the lesion, feature extraction follows the ABCD dermoscopy rule, and classification is performed by a support vector machine. Experimental evidence indicates that the proposal achieves 90.63% accuracy, 95% sensitivity, and 83.33% specificity on a dataset of 104 dermatoscopy images. These results compare favorably with traditional diagnostic performance in dermatology.
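A minimal sketch of the final classification stage only, assuming ABCD-style features have already been extracted into a feature matrix; the random data below merely stands in for real features, and scikit-learn is used for brevity:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(0)
X = rng.normal(size=(104, 4))     # stand-in ABCD feature vectors
y = rng.integers(0, 2, size=104)  # 0 = benign, 1 = malignant (synthetic)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = SVC(kernel="rbf").fit(X_tr, y_tr)

tn, fp, fn, tp = confusion_matrix(y_te, clf.predict(X_te)).ravel()
print("sensitivity:", tp / (tp + fn))  # true positive rate
print("specificity:", tn / (tn + fp))  # true negative rate
```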

Relevance:

90.00%

Publisher:

Abstract:

Precise, up-to-date and increasingly detailed road maps are crucial for various advanced road applications, such as lane-level vehicle navigation and advanced driver assistance systems. With very high resolution (VHR) imagery from digital airborne sources, data acquisition, collection and updating would be greatly facilitated if road details could be extracted automatically from aerial images. In this paper, we propose an effective approach to detecting road lane information from aerial images using object-oriented image analysis. The proposed algorithm starts by constructing the DSM and true orthophotos from the stereo images. The road lane details are then detected using an object-oriented, rule-based image classification approach. Because other objects have similar spectral and geometrical attributes, the extracted road lanes are filtered against the road surface obtained by a progressive two-class decision classifier. The generated road network is evaluated using datasets provided by the Queensland Department of Main Roads. The evaluation shows completeness values ranging between 76% and 98% and correctness values ranging between 82% and 97%.
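Completeness and correctness are standard quality measures for extracted road networks; a small sketch of their usual length-based definitions (the kilometre figures are invented):

```python
def completeness_correctness(matched_ref, ref_total, matched_ext, ext_total):
    """Standard road-extraction quality measures over matched lengths:
    completeness = matched reference length / total reference length
    correctness  = matched extracted length / total extracted length
    """
    return matched_ref / ref_total, matched_ext / ext_total

# Hypothetical example: 9.2 of 10 km of reference lanes were found,
# and 9.0 of 9.6 km of extracted lanes lie on true lanes.
print(completeness_correctness(9.2, 10.0, 9.0, 9.6))  # -> (0.92, 0.9375)
```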

Relevance:

90.00%

Publisher:

Abstract:

Modern computer graphics systems are able to construct renderings of such high quality that viewers are deceived into regarding the images as coming from a photographic source. Large amounts of computing resources are expended in this rendering process, using complex mathematical models of lighting and shading. However, psychophysical experiments have revealed that viewers only regard certain informative regions within a presented image. Furthermore, it has been shown that these visually important regions contain low-level visual feature differences that attract the attention of the viewer. This thesis will present a new approach to image synthesis that exploits these experimental findings by modulating the spatial quality of image regions by their visual importance. Efficiency gains are therefore reaped, without sacrificing much of the perceived quality of the image. Two tasks must be undertaken to achieve this goal. Firstly, the design of an appropriate region-based model of visual importance, and secondly, the modification of progressive rendering techniques to effect an importance-based rendering approach. A rule-based fuzzy logic model is presented that computes, using spatial feature differences, the relative visual importance of regions in an image. This model improves upon previous work by incorporating threshold effects induced by global feature difference distributions and by using texture concentration measures. A modified approach to progressive ray-tracing is also presented. This new approach uses the visual importance model to guide the progressive refinement of an image. In addition, this concept of visual importance has been incorporated into supersampling, texture mapping and computer animation techniques. Experimental results are presented, illustrating the efficiency gains reaped from using this method of progressive rendering. This visual importance-based rendering approach is expected to have applications in the entertainment industry, where image fidelity may be sacrificed for efficiency purposes, as long as the overall visual impression of the scene is maintained. Different aspects of the approach should find many other applications in image compression, image retrieval, progressive data transmission and active robotic vision.
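As one illustration of the guiding idea only (not the thesis's actual renderer), a progressive renderer could distribute its per-pass ray budget over image regions in proportion to their computed visual importance; the region names and weights below are hypothetical:

```python
def allocate_samples(importance, budget):
    """Distribute a ray budget over image regions in proportion to their
    visual importance, so perceptually salient regions refine first."""
    total = sum(importance.values())
    return {region: max(1, round(budget * w / total))
            for region, w in importance.items()}

# Hypothetical region importances from the fuzzy model:
importance = {"face": 0.6, "furniture": 0.3, "background": 0.1}
print(allocate_samples(importance, budget=10000))
```

Low-importance regions still receive a nonzero floor of samples, so the whole image converges, just unevenly.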

Relevance:

90.00%

Publisher:

Abstract:

This paper presents an extended study on the implementation of support vector machine (SVM) based speaker verification in systems that employ continuous progressive model adaptation using the weight-based factor analysis model. The weight-based factor analysis model compensates for session variations in unsupervised scenarios by incorporating trial confidence measures into the general statistics used in the inter-session variability modelling process. Employing weight-based factor analysis in Gaussian mixture models (GMM) was recently found to provide significant performance gains for unsupervised classification. Further improvements in performance were found through the integration of SVM-based classification into the system by means of GMM supervectors. This study focuses particularly on the way in which a client is represented in the SVM kernel space using single and multiple target supervectors. Experimental results indicate that training client SVMs using a single target supervector maximises performance while exhibiting a certain robustness to the inclusion of impostor training data in the model. Furthermore, the inclusion of low-scoring target trials in the adaptation process is investigated and found to significantly aid performance.
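A GMM supervector is typically formed by MAP-adapting the UBM component means toward an utterance and concatenating them; a compact sketch under that standard construction, assuming per-frame component posteriors are already available (all data below is synthetic):

```python
import numpy as np

def gmm_supervector(ubm_means, post, frames, r=16.0):
    """Concatenate MAP-adapted component means into one supervector.
    post[t, c] = posterior of GMM component c for frame t (assumed given)."""
    n_c = post.sum(0)                                         # soft counts
    ex = (post.T @ frames) / np.maximum(n_c, 1e-9)[:, None]   # per-component mean of frames
    alpha = (n_c / (n_c + r))[:, None]                        # MAP adaptation weights
    return (alpha * ex + (1 - alpha) * ubm_means).ravel()     # SVM kernel-space input

rng = np.random.default_rng(0)
frames = rng.normal(size=(200, 3))                 # synthetic feature frames
ubm_means = rng.normal(size=(4, 3))                # 4-component toy UBM
post = rng.random((200, 4))
post /= post.sum(1, keepdims=True)                 # rows sum to 1
print(gmm_supervector(ubm_means, post, frames).shape)  # -> (12,): one vector per utterance
```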

Relevance:

90.00%

Publisher:

Abstract:

The flying capacitor multicell inverter (FCMI) possesses a natural balancing property. With the phase-shifted (PS) carrier-based scheme, natural balancing can be achieved in a straightforward manner. However, to achieve natural balancing with the harmonically optimal phase-disposition (PD) carrier-based scheme, conventional approaches require (n-1) x (n-1) trapezoidal carrier signals for an n-level inverter, which is (n-1) x (n-2) more carriers than in the standard PD scheme. This paper proposes two improved natural balancing strategies for the FCMI under the PD scheme, which use the same (n-1) carrier signals as the standard PD scheme. In the first strategy, the band in which the modulation signal is located, the corresponding period number of the carrier, and the rising or falling half-cycle of the carrier waveform are detected on-line to generate the switching signals according to certain rules. In the second strategy, the output voltage level is first selected, and the switching signals are then generated according to a rule-based preferential cell-selection algorithm. These methods are easy to use and simpler to implement than other available methods. Simulation and experimental results are presented for a five-level inverter to verify the proposed schemes.
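For context, standard PD modulation stacks (n-1) in-phase triangular carriers across the modulation range and derives the output level from carrier comparisons; the sketch below shows only this generic comparison, not the paper's balancing rules:

```python
import numpy as np

def pd_levels(m, t, fc, n_levels=5):
    """Phase-disposition PWM: (n-1) in-phase triangular carriers stacked
    across [-1, 1]; the output level is the count of carriers that the
    modulation signal m(t) meets or exceeds at each instant."""
    n_c = n_levels - 1
    tri = 2 * np.abs((t * fc) % 1 - 0.5) * (2 / n_c)  # triangle, height 2/n_c
    levels = np.zeros_like(t, dtype=int)
    for k in range(n_c):                              # stack the carrier bands
        carrier = -1 + k * (2 / n_c) + tri
        levels += (m >= carrier).astype(int)
    return levels                                     # 0 .. n_levels-1

t = np.linspace(0, 0.02, 2000)            # one 50 Hz fundamental cycle
m = 0.9 * np.sin(2 * np.pi * 50 * t)      # modulation signal
print(np.unique(pd_levels(m, t, fc=2000)))  # -> all levels 0..4
```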

Relevance:

90.00%

Publisher:

Abstract:

In recent years, sparse representation based classification (SRC) has received much attention in face recognition with multiple training samples of each subject. However, it cannot be easily applied to a recognition task with insufficient training samples under uncontrolled environments. On the other hand, cohort normalization, as a way of measuring the degradation effect under challenging environments in relation to a pool of cohort samples, has been widely used in the area of biometric authentication. In this paper, for the first time, we introduce cohort normalization to SRC-based face recognition with insufficient training samples. Specifically, a user-specific cohort set is selected to normalize the raw residual, which is obtained from comparing the test sample with its sparse representations corresponding to the gallery subject, using polynomial regression. Experimental results on the AR and FERET databases show that cohort normalization brings SRC much robustness against various forms of degradation factors for undersampled face recognition.
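The abstract leaves the regression details open; a loose sketch of the idea, in which a polynomial fit to a user-specific cohort's residuals supplies the expected score used to rescale the raw SRC residual (the cohort values below are synthetic, and the rank-based fit is an assumption, not the paper's exact setup):

```python
import numpy as np

def cohort_normalize(raw_residual, cohort_residuals, deg=2):
    """Normalize a raw SRC residual against a user-specific cohort:
    fit a polynomial over the sorted cohort residuals, read off the
    expected residual at the test score's rank, and rescale by it."""
    c = np.sort(cohort_residuals)
    ranks = np.arange(len(c))
    coef = np.polyfit(ranks, c, deg)                 # polynomial regression
    expected = np.polyval(coef, np.searchsorted(c, raw_residual))
    return raw_residual / max(expected, 1e-9)        # normalized residual

cohort = np.random.default_rng(2).uniform(0.4, 1.0, size=20)
print(cohort_normalize(0.35, cohort))  # well below 1 suggests a genuine match
```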

Relevance:

90.00%

Publisher:

Abstract:

In attempting to build intelligent litigation support tools, we have moved beyond first generation, production rule legal expert systems. Our work supplements rule-based reasoning with case-based reasoning and intelligent information retrieval. This research specifies an approach to the case-based retrieval problem which relies heavily on an extended object-oriented/rule-based system architecture that is supplemented with causal background information. Machine learning techniques and a distributed agent architecture are used to help simulate the reasoning process of lawyers. In this paper, we outline our implementation of the hybrid IKBALS II rule-based reasoning / case-based reasoning system. It makes extensive use of an automated case representation editor and background information.

Relevance:

90.00%

Publisher:

Abstract:

In attempting to build intelligent litigation support tools, we have moved beyond first generation, production rule legal expert systems. Our work integrates rule-based and case-based reasoning with intelligent information retrieval. When using the case-based reasoning methodology, or in our case the specialisation of case-based retrieval, we need to be aware of how to retrieve relevant experience. Our research, in the legal domain, specifies an approach to the retrieval problem which relies heavily on an extended object-oriented/rule-based system architecture that is supplemented with causal background information. We use a distributed agent architecture to help support the reasoning process of lawyers. Our approach to integrating rule-based reasoning, case-based reasoning and case-based retrieval is contrasted with the CABARET and PROLEXS architectures, which rely on a centralised blackboard architecture. We discuss in detail how our various cooperating agents interact, and provide examples of the system at work. The IKBALS system uses a specialised induction algorithm to induce rules from cases. These rules are then used as indices during the case-based retrieval process. Because we aim to build legal support tools which can be modified to suit various domains, rather than single purpose legal expert systems, we focus on the principles behind developing legal knowledge based systems. The original domain chosen was the Accident Compensation Act 1989 (Victoria, Australia), which relates to the provision of benefits for employees injured at work. For various reasons, which are indicated in the paper, we changed our domain to that of the Credit Act 1984 (Victoria, Australia). This Act regulates the provision of loans by financial institutions. The rule-based part of our system, which provides advice on the Credit Act, has been commercially developed in conjunction with a legal firm. We indicate how this work has led to the development of a methodology for constructing rule-based legal knowledge based systems. We explain the process of integrating this existing commercial rule-based system with the case-based reasoning and retrieval architecture.

Relevance:

90.00%

Publisher:

Abstract:

Recent research in modelling uncertainty in water resource systems has highlighted the use of fuzzy logic-based approaches. A number of research contributions exist in the literature that deal with uncertainty in water resource systems including fuzziness, subjectivity, imprecision and lack of adequate data. This chapter presents a broad overview of the fuzzy logic-based approaches adopted in addressing uncertainty in water resource systems modelling. Applications of fuzzy rule-based systems and fuzzy optimisation are then discussed. Perspectives on the scope for further research are presented.

Relevance:

90.00%

Publisher:

Abstract:

A fuzzy logic based centralized control algorithm for irrigation canals is presented. The purpose of the algorithm is to control the downstream discharge and water level of pools in the canal by adjusting the discharge release from the upstream end and the gate settings. The algorithm is based on inverting the dynamic wave model (Saint-Venant equations) in space, wherein the momentum equation is replaced by a fuzzy rule based model while the continuity equation is retained in its complete form. The fuzzy rule based model is developed by fuzzifying a new mathematical model for wave velocity, the derivational details of which are given. The advantages of the fuzzy control algorithm over other conventional control algorithms are described: it is transparent and intuitive, and no linearizations of the governing equations are involved. The timing of the algorithm and the method of computation are explained. It is shown that the tuning is easy and the computations are straightforward. The algorithm provides stable, realistic and robust outputs. The disadvantage of the algorithm is reduced precision in its outputs due to the approximation inherent in fuzzy logic. Feedback control logic is adopted to eliminate error caused by system disturbances as well as error caused by the reduced precision in the outputs. The algorithm is tested by applying it to a water level control problem in a fictitious canal with a single pool and also in a real canal with a series of pools. It is found that results obtained from the algorithm are comparable to those obtained from conventional control algorithms.
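As a toy illustration of the fuzzy rule-based ingredient only (not the paper's Saint-Venant inversion or its wave-velocity model), three rules mapping water-level error to a gate adjustment, with invented membership breakpoints:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def gate_adjustment(level_error):
    """Map water-level error (m, target minus actual) to a gate opening
    change (m) via three fuzzy rules, defuzzified by weighted average."""
    rules = [  # (degree to which the rule fires, crisp gate action)
        (tri(level_error, -1.0, -0.5, 0.0), -0.10),  # level too high -> close
        (tri(level_error, -0.25, 0.0, 0.25), 0.00),  # on target      -> hold
        (tri(level_error,  0.0,  0.5, 1.0),  0.10),  # level too low  -> open
    ]
    w = sum(m for m, _ in rules)
    return sum(m * a for m, a in rules) / w if w else 0.0

print(gate_adjustment(0.15))  # partial rule overlap -> modest opening increase
```

Interpolating between overlapping rules in this way is what makes such controllers transparent yet smooth, at the cost of the reduced precision noted above.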

Relevance:

90.00%

Publisher:

Abstract:

A new method based on a unit continuity metric (UCM) is proposed for optimal unit selection in text-to-speech (TTS) synthesis. UCM employs two features, namely a pitch continuity metric and a spectral continuity metric. The methods have been implemented and tested on our test bed, MILE-TTS, and are available as a web demo. After verification by a self-selection test, the algorithms are evaluated on 8 paragraphs each for Kannada and Tamil by native users of the languages. Mean opinion scores (MOS) show that naturalness and comprehension are better with the UCM-based algorithm than with the non-UCM-based ones. The naturalness of the TTS output is further enhanced by a new rule-based algorithm for pause prediction in Tamil. The pauses between words are predicted based on parts-of-speech information obtained from the input text.
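The abstract does not define the two metrics precisely; a sketch of how a unit continuity cost might combine them, assuming each candidate unit carries boundary F0 values and boundary spectral vectors (e.g., MFCCs), with hypothetical weights:

```python
import numpy as np

def unit_continuity(prev_unit, next_unit, w_pitch=0.5, w_spec=0.5):
    """Score the join between two candidate units as a weighted sum of
    pitch continuity (F0 mismatch at the boundary) and spectral
    continuity (distance between boundary spectral vectors).
    Lower cost means a smoother, more natural-sounding join."""
    pitch_cost = abs(prev_unit["f0_end"] - next_unit["f0_start"])
    spec_cost = np.linalg.norm(prev_unit["mfcc_end"] - next_unit["mfcc_start"])
    return w_pitch * pitch_cost + w_spec * spec_cost

# Hypothetical boundary features for two adjacent candidate units:
u1 = {"f0_end": 118.0, "mfcc_end": np.zeros(13)}
u2 = {"f0_start": 121.0, "mfcc_start": np.full(13, 0.2)}
print(unit_continuity(u1, u2))
```

In a full unit-selection search, such join costs would feed a Viterbi-style pass over the candidate lattice alongside target costs.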