113 results for Cartographics feature
Abstract:
Objective: We carry out a systematic assessment of a suite of kernel-based learning machines applied to the task of epilepsy diagnosis through automatic electroencephalogram (EEG) signal classification. Methods and materials: The kernel machines investigated include the standard support vector machine (SVM), the least squares SVM, the Lagrangian SVM, the smooth SVM, the proximal SVM, and the relevance vector machine. An extensive series of experiments was conducted on publicly available data, whose clinical EEG recordings were obtained from five normal subjects and five epileptic patients. The performance levels delivered by the different kernel machines are contrasted in terms of the criteria of predictive accuracy, sensitivity to the kernel function/parameter value, and sensitivity to the type of features extracted from the signal. For this purpose, 26 values for the kernel parameter (radius) of two well-known kernel functions (namely, Gaussian and exponential radial basis functions) were considered, as well as 21 types of features extracted from the EEG signal, including statistical values derived from the discrete wavelet transform, Lyapunov exponents, and combinations thereof. Results: We first quantitatively assess the impact of the choice of the wavelet basis on the quality of the features extracted. Four wavelet basis functions were considered in this study. Then, we provide the average accuracy values (estimated via cross-validation) delivered by 252 kernel machine configurations; in particular, 40%/35% of the best-calibrated models of the standard and least squares SVMs reached a 100% accuracy rate for the two kernel functions considered. Moreover, we show the sensitivity profiles exhibited by a large sample of the configurations, so that one can visually inspect their levels of sensitivity to the type of feature and to the kernel function/parameter value. Conclusions: Overall, the results show that all kernel machines are competitive in terms of accuracy, with the standard and least squares SVMs prevailing more consistently. Moreover, the choice of the kernel function and parameter value as well as the choice of the feature extractor are critical decisions, although the choice of the wavelet family appears less important. Also, the statistical values calculated over the Lyapunov exponents were good sources of signal representation, but not as informative as their wavelet counterparts. Finally, a typical sensitivity profile has emerged among all types of machines, involving some regions of stability separated by zones of sharp variation, with some kernel parameter values clearly associated with better accuracy rates (zones of optimality). (C) 2011 Elsevier B.V. All rights reserved.
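As a rough illustration of the kind of experimental setup described above, the sketch below extracts statistical features from the discrete wavelet transform of an EEG segment and cross-validates a Gaussian (RBF) kernel SVM over a range of kernel radii. The library choices (PyWavelets, scikit-learn), the feature summary, and the radius-to-gamma conversion are illustrative assumptions, not the paper's exact configuration.

```python
# Hypothetical sketch: DWT statistical features + Gaussian-kernel SVM, evaluated
# by cross-validation over a grid of kernel radii. Libraries and parameter ranges
# are assumptions made for illustration only.
import numpy as np
import pywt
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def dwt_features(segment, wavelet="db4", level=4):
    """Statistical summary (mean magnitude, std, energy) of each DWT sub-band."""
    coeffs = pywt.wavedec(segment, wavelet, level=level)
    feats = []
    for c in coeffs:
        feats.extend([np.mean(np.abs(c)), np.std(c), np.sum(c ** 2)])
    return np.array(feats)

def evaluate_radii(segments, labels, radii):
    """Cross-validated accuracy of an RBF-kernel SVM for each kernel radius."""
    X = np.vstack([dwt_features(s) for s in segments])
    scores = {}
    for r in radii:
        clf = SVC(kernel="rbf", gamma=1.0 / (2.0 * r ** 2))  # radius -> gamma
        scores[r] = cross_val_score(clf, X, labels, cv=10).mean()
    return scores
```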
Abstract:
The large amount of information in electronic contracts makes their establishment highly complex. An approach inspired by Software Product Line (PL) engineering and based on feature modelling was proposed to make this process more systematic through information reuse and structuring. By assessing the feature-based approach against a proposed set of requirements, it was shown that the approach does not allow the price of services and of Quality of Service (QoS) attributes to be considered in the negotiation and included in the electronic contract. Thus, this paper also presents an extension of that approach in which prices and price types associated with Web services and QoS levels are applied. An extended toolkit prototype is also presented, as well as an example experiment with the proposed approach.
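A minimal sketch of how a contract feature could carry the price and QoS information the extension describes is given below; the class names, fields, and pricing rule are hypothetical and only illustrate the kind of data the extended feature model would have to hold.

```python
# Hypothetical data structure: a Web service modelled as a feature annotated with
# a base price, a price type, and priced QoS levels. Names and fields are
# illustrative assumptions, not the paper's actual metamodel.
from dataclasses import dataclass, field
from typing import List

@dataclass
class QoSLevel:
    attribute: str   # e.g. "availability"
    level: str       # e.g. ">= 99.9%"
    price: float     # price charged for selecting this QoS level

@dataclass
class ServiceFeature:
    name: str                      # Web service represented as a feature
    price: float                   # base price of the service
    price_type: str                # e.g. "per-call", "monthly"
    qos_levels: List[QoSLevel] = field(default_factory=list)

    def contract_price(self, selected_attributes: List[str]) -> float:
        """Base price plus the prices of the selected QoS levels."""
        return self.price + sum(q.price for q in self.qos_levels
                                if q.attribute in selected_attributes)
```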
Abstract:
Age-related changes in running kinematics have been reported in the literature using classical inferential statistics. However, this approach has been hampered by the increased number of biomechanical gait variables reported and, consequently, the lack of differences presented in these studies. Data mining techniques have been applied in recent biomedical studies to solve this problem using a more general approach. In the present work, we re-analyzed lower extremity running kinematic data of 17 young and 17 elderly male runners using the Support Vector Machine (SVM) classification approach. In total, 31 kinematic variables were extracted to train the classification algorithm and test its generalization performance. The results revealed different accuracy rates across the three kernel methods adopted in the classifier, with the linear kernel performing best. A subsequent forward feature selection algorithm demonstrated that, with only six features, the linear-kernel SVM achieved a 100% classification rate, showing that these features provide powerful combined information to distinguish the age groups. The results of the present work demonstrate the potential of applying this approach to improve knowledge about age-related differences in running gait biomechanics and encourage the use of the SVM in other clinical contexts. (C) 2010 Elsevier Ltd. All rights reserved.
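The combination of a linear-kernel SVM with greedy forward feature selection described above could be sketched as follows; scikit-learn's selector is an assumed stand-in for whatever implementation the study used, and the fold counts are illustrative.

```python
# Hypothetical sketch: forward selection of kinematic variables for a linear SVM,
# stopping at six features, then reporting cross-validated accuracy. Library and
# cross-validation settings are assumptions, not the study's exact protocol.
from sklearn.svm import SVC
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.model_selection import cross_val_score

def select_and_score(X, y, n_features=6):
    """Pick n_features greedily, then estimate accuracy on the selected subset."""
    svm = SVC(kernel="linear")
    selector = SequentialFeatureSelector(svm, n_features_to_select=n_features,
                                         direction="forward", cv=5)
    selector.fit(X, y)
    X_sel = selector.transform(X)
    accuracy = cross_val_score(svm, X_sel, y, cv=5).mean()
    return selector.get_support(indices=True), accuracy
```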
Abstract:
This paper proposes a novel computer vision approach that processes video sequences of people walking and then recognises those people by their gait. Human motion carries different information that can be analysed in various ways. The skeleton carries motion information about human joints, and the silhouette carries information about boundary motion of the human body. Moreover, binary and grey-level images contain different information about human movements. This work proposes to recover these different kinds of information to interpret the global motion of the human body based on four different segmented image models, using a fusion model to improve classification. Our proposed method considers the set of the segmented frames of each individual as a distinct class and each frame as an object of this class. The methodology applies background extraction using the Gaussian Mixture Model (GMM), a scale reduction based on the Wavelet Transform (WT) and feature extraction by Principal Component Analysis (PCA). We propose four new schemas for motion information capture: the Silhouette-Grey-Wavelet model (SGW) captures motion based on grey-level variations; the Silhouette-Binary-Wavelet model (SBW) captures motion based on binary information; the Silhouette-Edge-Binary model (SEW) captures motion based on edge information; and the Silhouette-Skeleton-Wavelet model (SSW) captures motion based on skeleton movement. The classification rates obtained separately from these four different models are then merged using a new proposed fusion technique. The results suggest excellent performance in terms of recognising people by their gait.
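A rough sketch of the segmentation/reduction/extraction chain mentioned above is shown below, using OpenCV's MOG2 background subtractor as a stand-in for the GMM step, PyWavelets for the scale reduction, and scikit-learn's PCA for feature extraction; all library choices and parameters are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical pipeline: GMM-style foreground segmentation -> 2D wavelet
# approximation (scale reduction) -> PCA feature vectors, one per frame.
import numpy as np
import cv2
import pywt
from sklearn.decomposition import PCA

def silhouette_features(frames, n_components=20):
    """Foreground masks -> wavelet approximation sub-band -> PCA features."""
    bg = cv2.createBackgroundSubtractorMOG2()
    reduced = []
    for frame in frames:
        mask = bg.apply(frame)                              # GMM-based foreground mask
        approx, _ = pywt.dwt2(mask.astype(float), "haar")   # keep only the LL sub-band
        reduced.append(approx.ravel())
    X = np.vstack(reduced)
    return PCA(n_components=n_components).fit_transform(X)
```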
Abstract:
A new circuit configuration, linearly conjugate to the standard Chua's circuit, is presented. Its distinctive feature is that the equations now admit an additional parameter, which controls the dissipation in the network connected to the Chua diode. In the limiting case we obtain the simplest chaotic circuit, consisting of a piecewise-linear resistor and three lossless elements.
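For reference, the dimensionless state equations of the standard Chua's circuit, to which the new configuration is linearly conjugate, are commonly written as below; the additional dissipation parameter introduced in the paper is not reproduced here.

```latex
\begin{aligned}
\dot{x} &= \alpha\bigl(y - x - f(x)\bigr),\\
\dot{y} &= x - y + z,\\
\dot{z} &= -\beta\, y,\\
f(x) &= m_{1}x + \tfrac{1}{2}(m_{0}-m_{1})\bigl(|x+1| - |x-1|\bigr).
\end{aligned}
```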
Abstract:
An accurate estimate of machining time is very important for predicting delivery time and manufacturing costs, and also for helping production process planning. Most commercial CAM software systems estimate the machining time in milling operations simply by dividing the entire tool path length by the programmed feed rate. This estimate differs drastically from the real process time because the feed rate is not always constant, due to machine and computer numerical control (CNC) limitations. This study presents a practical mechanistic method for milling time estimation when machining free-form geometries. The method considers a variable called machine response time (MRT), which characterizes the real CNC machine's capacity to move at high feed rates in free-form geometries. MRT is a global performance feature that can be obtained for any type of CNC machine configuration by carrying out a simple test. To validate the methodology, a workpiece was used to generate NC programs for five different types of CNC machines. A practical industrial case study was also carried out to validate the method. The results indicated that MRT, and consequently the real machining time, depends on the CNC machine's capabilities; furthermore, the greater the MRT, the larger the difference between the predicted and the real milling time. The proposed method achieved errors ranging from 0.3% to 12% of the real machining time, whereas the CAM estimates had errors ranging from 211% to 1244%. The MRT-based procedure is also suggested as an instrument for machine tool benchmarking.
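To make the contrast between the two estimates concrete, the sketch below compares a naive CAM-style estimate with a correction that charges a fixed response-time penalty per tool-path segment; this penalty model is a simplified assumption for illustration only and is not the paper's actual MRT formulation.

```python
# Hypothetical comparison of a CAM-style time estimate with an MRT-corrected one.
# The per-segment penalty is an illustrative assumption, not the paper's model.
def cam_time_estimate(segment_lengths_mm, feed_mm_min):
    """Naive estimate: total tool path length divided by the programmed feed rate."""
    return sum(segment_lengths_mm) / feed_mm_min

def mrt_time_estimate(segment_lengths_mm, feed_mm_min, mrt_min):
    """Adds a response-time penalty per segment, so the many short segments of a
    free-form path (where the CNC never reaches the programmed feed) dominate."""
    return sum(length / feed_mm_min + mrt_min for length in segment_lengths_mm)
```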
Abstract:
This paper proposes a physically non-linear formulation to deal with steel fiber reinforced concrete by the finite element method. The proposed formulation allows the consideration of short or long fibers placed arbitrarily inside a continuum domain (matrix). The most important feature of the formulation is that no additional degree of freedom is introduced into the pre-existing finite element numerical system to consider any distribution or quantity of fiber inclusions. In other words, the size of the system of equations used to solve a non-reinforced medium is the same as the one used to solve the reinforced counterpart. Another important characteristic of the formulation is the reduced work required by the user to introduce reinforcements, avoiding "rebar" elements, node-by-node geometrical definitions or even complex mesh generation. Bounded connection between long fibers and the continuum is considered; for short fibers, a simplified approach is proposed to account for splitting. Non-associative plasticity is adopted for the continuum, and one-dimensional plasticity is adopted to model the fibers. Examples are presented in order to show the capabilities of the formulation.
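The "no additional degree of freedom" idea can be illustrated by the general embedded-reinforcement technique, in which a fiber segment's stiffness is written directly in terms of the host element's nodal DOFs via the shape functions evaluated at the fiber end points. The sketch below assumes a 2D host element and a linear-elastic fiber of axial stiffness EA; it illustrates the general technique, not the authors' exact formulation.

```python
# Minimal embedded-fiber sketch: the fiber contribution is assembled into the
# existing element stiffness, so the global system keeps its original size.
import numpy as np

def embedded_fiber_stiffness(N_A, N_B, xA, xB, EA):
    """N_A, N_B: host-element shape-function values at the fiber ends (one per node);
    xA, xB: fiber end coordinates (2D); EA: fiber axial stiffness.
    Returns a (2*n_nodes, 2*n_nodes) matrix added to the element stiffness."""
    d = np.asarray(xB, float) - np.asarray(xA, float)
    L = np.linalg.norm(d)
    d /= L                                   # unit vector along the fiber
    n = len(N_A)
    B = np.zeros(2 * n)                      # maps element DOFs to fiber axial strain
    for k in range(n):
        B[2 * k:2 * k + 2] = (N_B[k] - N_A[k]) * d / L
    return EA * L * np.outer(B, B)           # energy-consistent stiffness contribution
```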
Abstract:
This communication proposes a simple way to introduce fibers into finite element modelling. This is a promising formulation to deal with fiber-reinforced composites by the finite element method (FEM), as it allows the consideration of short or long fibers placed arbitrarily inside a continuum domain (matrix). The most important feature of the formulation is that no additional degree of freedom is introduced into the pre-existing finite element numerical system to consider any distribution of fiber inclusions. In other words, the size of the system of equations used to solve a non-reinforced medium is the same as the one used to solve the reinforced counterpart. Another important characteristic is the reduced work required by the user to introduce fibers, avoiding 'rebar' elements, node-by-node geometrical definitions or even complex mesh generation. An additional characteristic of the technique is the possibility of representing unbounded stresses at the ends of fibers using a finite number of degrees of freedom. Further studies are required for non-linear applications in which localization may occur. Throughout the text, the linear formulation is presented and the bounded connection between fibers and the continuum is considered. Four examples are presented, including non-linear analysis, to validate and show the capabilities of the formulation. Copyright (c) 2007 John Wiley & Sons, Ltd.
Abstract:
A round robin program was conducted to assess the ability of three different X-radiographic systems to image internal fatigue cracks in riveted lap joints of composite glass reinforced fiber/metal laminate. From an engineering perspective, conventional film radiography and direct radiography produced the best results, identifying and characterizing in detail the internal damage on the metallic faying surfaces of fastened glass reinforced fiber/metal laminate joints. On the other hand, computed radiographic images presented large projected geometric distortions and feature shifts due to the angular incidence of the radiation beam, disclosing only partial internal cracking patterns.
Abstract:
One of the goals of e-learning environments is to meet the individual needs of students during the learning process. The adaptation of contents, activities and tools into different visualizations or into a variety of content types is an important feature of such environments, giving users the sense that the same system offers workspaces suited to their profiles. Nevertheless, to achieve an efficient personalization process it is important to investigate aspects of student behaviour, considering the context in which the interaction happens. The goal of this paper is to present an approach to identify the student's learning profile by analyzing the context of interaction. In addition, analyzing the learning profile along different dimensions allows the system to deal with different learning focuses.
Abstract:
This paper presents an Adaptive Maximum Entropy (AME) approach for modeling biological species. The Maximum Entropy algorithm (MaxEnt) is one of the most widely used methods for modeling the geographical distribution of biological species. The approach presented here is an alternative to the classical algorithm. Instead of using the same set of features throughout training, the AME approach tries to insert or to remove a single feature at each iteration. The aim is to reach convergence faster without affecting the performance of the generated models. Preliminary experiments showed gains in both accuracy and execution time. Comparisons with other algorithms are beyond the scope of this paper. Several important research directions are proposed as future work.
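The insert-or-remove-one-feature-per-iteration idea could be sketched as the greedy loop below, with a logistic model standing in for the MaxEnt learner and cross-validated log-likelihood as the gain criterion; the stand-in model, criterion, and stopping tolerance are illustrative assumptions, not the AME algorithm itself.

```python
# Hypothetical greedy feature-set adaptation: at each iteration, try adding one
# unused feature or removing one active feature, and keep the single best move.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def adaptive_feature_set(X, y, max_iter=50, tol=1e-4):
    n_features = X.shape[1]
    active = []

    def score(cols):
        if not cols:
            return -np.inf
        model = LogisticRegression(max_iter=1000)
        return cross_val_score(model, X[:, cols], y, cv=5,
                               scoring="neg_log_loss").mean()

    best = score(active)
    for _ in range(max_iter):
        candidates = []
        for f in range(n_features):            # try inserting one unused feature
            if f not in active:
                candidates.append((score(active + [f]), active + [f]))
        for f in active:                        # or removing one active feature
            reduced = [g for g in active if g != f]
            candidates.append((score(reduced), reduced))
        new_best, new_active = max(candidates, key=lambda c: c[0])
        if new_best <= best + tol:              # stop when no single move helps
            break
        best, active = new_best, new_active
    return active
```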
Abstract:
This work presents the development and implementation of an artificial neural network (ANN)-based algorithm for transmission line distance protection. The algorithm was developed to be used on any transmission line, regardless of its configuration or voltage level. The described ANN-based algorithm does not need any topology adaptation or ANN parameter adjustment when applied to different electrical systems. This feature makes the solution unique, since all ANN-based solutions presented so far were developed for particular transmission lines, which means that those solutions cannot be implemented in commercial relays. (c) 2011 Elsevier Ltd. All rights reserved.
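A minimal sketch of the kind of ANN-based estimator implied above is given below, mapping voltage/current features measured at the relay to an estimated fault distance; the network size, input features, and scikit-learn MLP are assumptions for illustration and not the paper's algorithm.

```python
# Hypothetical sketch: train a small neural network to estimate fault distance
# from relay-side measurements; all modelling choices are illustrative.
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

def train_distance_estimator(X_fault_features, y_fault_distance_km):
    """X: one row of voltage/current features per simulated fault case;
    y: corresponding fault distance along the line (km)."""
    model = make_pipeline(
        StandardScaler(),  # normalise inputs so the same net can serve different systems
        MLPRegressor(hidden_layer_sizes=(20, 10), max_iter=2000, random_state=0),
    )
    model.fit(X_fault_features, y_fault_distance_km)
    return model
```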
Abstract:
This paper addresses the development of several alternative novel hybrid/multi-field variational formulations of the geometrically exact three-dimensional elastostatic beam boundary-value problem. In the framework of the complementary energy-based formulations, a Legendre transformation is used to introduce the complementary energy density in the variational statements as a function of stresses only. The corresponding variational principles are shown to feature stationarity within the framework of the boundary-value problem. Both weak and linearized weak forms of the principles are presented. The main features of the principles are highlighted, giving special emphasis to their relationships from both theoretical and computational standpoints. (C) 2010 Elsevier Ltd. All rights reserved.
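The Legendre transformation referred to above has the standard form shown below: the complementary energy density is obtained from the strain-energy density, so that the constitutive relation can be written with stresses as the independent variables. The beam-specific stress resultants and strain measures used in the paper are not reproduced here.

```latex
W^{*}(\boldsymbol{\sigma}) \;=\; \sup_{\boldsymbol{\varepsilon}}
\bigl[\, \boldsymbol{\sigma} : \boldsymbol{\varepsilon} - W(\boldsymbol{\varepsilon}) \,\bigr],
\qquad
\boldsymbol{\sigma} = \frac{\partial W}{\partial \boldsymbol{\varepsilon}}
\;\Longleftrightarrow\;
\boldsymbol{\varepsilon} = \frac{\partial W^{*}}{\partial \boldsymbol{\sigma}}.
```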
Abstract:
Several mining companies operating in Brazil have been preparing mine closure plans describing end-of-life actions to ensure long-term physical stability, land rehabilitation and, to a certain extent, socioeconomic measures. Most Brazilian and international guidelines, however, skip early closure and only provide guidance on closure planning after reserve depletion. In this paper, two cases of early closure in Australia are reviewed, causes for early closure are explored, and criteria to evaluate preparedness for early mine closure are proposed and tested in two mines. It concludes that, despite recent advances, planning for mine closure seldom takes account of early closure risks.
Abstract:
In this work, poly(vinyl butyral) (PVB) film originating from the mechanical separation of windshields was tested as an impact modifier for Polyamide-6 (PA-6). The changes undergone by the PVB film during the recycling process and the blend manufacturing were evaluated by thermal analyses, infrared spectroscopy and loss on ignition. Blends of PA-6/original PVB film and PA-6/recovered PVB film were obtained in concentrations ranging from 90/10 to 60/40. The mechanical properties of the blends were investigated and explained in light of the blends' morphologies, which in turn were correlated with the changes undergone by the PVB film during the recycling process. The original film presented a plasticizer content of 33 wt.%, which decreased to as low as 20 wt.% after the recycling and blend preparation processes. The PA-6/PVB film blends presented lower values of tensile strength and Young's modulus than Polyamide-6, but all blends presented a dramatic increase in their toughness, with special emphasis on the 40 wt.% blend, which resulted in a super-toughened material (impact strength exceeding 500 J/m). Similar results were obtained with the recovered PVB film, and super-tough blends were also obtained. The use of recovered PVB resulted in a smaller improvement in impact strength due to the loss of plasticizer during the recycling process. The morphological observations showed that if the interparticle distance is smaller than around 0.2 μm (critical value), the notched Izod impact strength values increase considerably and the fracture surfaces of the blends exhibit characteristics of tough failure. (C) 2007 Elsevier Ltd. All rights reserved.