135 results for Feature vectors
Abstract:
Secondary forests are an increasingly common feature in tropical landscapes worldwide and understanding their regeneration is necessary to design effective restoration strategies. It has previously been shown that the woody species community in secondary forests can follow different successional pathways according to the nature of past human activities in the area, yet little is known about patterns of herbaceous species diversity in secondary forests with different histories of land use. We compared the diversity and abundance of herbaceous plant communities in two types of Central Amazonian secondary forests: those regenerating on pastures created by felling and burning trees, and those where trees were felled only. We also tested whether plant density and species richness in secondary forests are related to proximity to primary forest. In comparison with primary forest sites, forests regenerating on non-burned habitats had lower herbaceous plant density and species richness than those on burned ones. However, species composition and abundance in non-burned stands were more similar to those of primary forest, whereas several secondary forest specialist species were found in burned stands. In both non-burned and burned forests, distance from the forest edge was not related to herbaceous density and species richness. Overall, our results suggest that the natural regeneration of herbaceous species in secondary tropical forests is dependent on a site's post-clearing treatment. We recommend evaluating the land-use history of a site prior to developing and implementing a restoration strategy, as this will influence the biological template on which restoration efforts are overlaid.
Abstract:
Objective: We carry out a systematic assessment of a suite of kernel-based learning machines while coping with the task of epilepsy diagnosis through automatic electroencephalogram (EEG) signal classification. Methods and materials: The kernel machines investigated include the standard support vector machine (SVM), the least squares SVM, the Lagrangian SVM, the smooth SVM, the proximal SVM, and the relevance vector machine. An extensive series of experiments was conducted on publicly available data, whose clinical EEG recordings were obtained from five normal subjects and five epileptic patients. The performance levels delivered by the different kernel machines are contrasted in terms of the criteria of predictive accuracy, sensitivity to the kernel function/parameter value, and sensitivity to the type of features extracted from the signal. For this purpose, 26 values for the kernel parameter (radius) of two well-known kernel functions (namely, Gaussian and exponential radial basis functions) were considered, as well as 21 types of features extracted from the EEG signal, including statistical values derived from the discrete wavelet transform, Lyapunov exponents, and combinations thereof. Results: We first quantitatively assess the impact of the choice of the wavelet basis on the quality of the features extracted. Four wavelet basis functions were considered in this study. Then, we provide the average accuracy (i.e., cross-validation error) values delivered by 252 kernel machine configurations; in particular, 40%/35% of the best-calibrated models of the standard and least squares SVMs reached 100% accuracy rate for the two kernel functions considered. Moreover, we show the sensitivity profiles exhibited by a large sample of the configurations whereby one can visually inspect their levels of sensitiveness to the type of feature and to the kernel function/parameter value.
Conclusions: Overall, the results show that all kernel machines are competitive in terms of accuracy, with the standard and least squares SVMs prevailing more consistently. Moreover, the choice of the kernel function and parameter value, as well as the choice of the feature extractor, are critical decisions, although the choice of the wavelet family seems less relevant. Also, the statistical values calculated over the Lyapunov exponents were good sources of signal representation, but not as informative as their wavelet counterparts. Finally, a typical sensitivity profile emerged across all types of machines, involving regions of stability separated by zones of sharp variation, with some kernel parameter values clearly associated with better accuracy rates (zones of optimality). (C) 2011 Elsevier B.V. All rights reserved.
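The kind of kernel and radius sweep described above can be sketched as follows. This is a minimal stand-in, assuming synthetic feature vectors in place of the EEG features, a short radius grid instead of the paper's 26 values, and one common parameterization of the exponential RBF kernel.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score
from sklearn.datasets import make_classification

# Hypothetical stand-in data; the study used EEG-derived feature vectors.
X, y = make_classification(n_samples=200, n_features=21, random_state=0)

def gaussian_rbf(r):
    # K(x, z) = exp(-||x - z||^2 / (2 r^2))
    return lambda A, B: np.exp(
        -np.square(np.linalg.norm(A[:, None] - B[None, :], axis=2)) / (2 * r**2))

def exponential_rbf(r):
    # One common form: K(x, z) = exp(-||x - z|| / (2 r^2))
    return lambda A, B: np.exp(
        -np.linalg.norm(A[:, None] - B[None, :], axis=2) / (2 * r**2))

# Sweep a few kernel radii (the paper's sensitivity analysis used 26 values).
for r in [0.5, 1.0, 2.0, 4.0]:
    for name, kern in [("gaussian", gaussian_rbf(r)),
                       ("exponential", exponential_rbf(r))]:
        acc = cross_val_score(SVC(kernel=kern), X, y, cv=5).mean()
        print(f"{name} r={r}: {acc:.3f}")
```

Scikit-learn's `SVC` accepts a callable kernel returning the Gram matrix, which makes it easy to swap in non-built-in kernels such as the exponential RBF.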
Abstract:
The large amount of information in electronic contracts hampers their establishment due to high complexity. An approach inspired by Software Product Lines (PL) and based on feature modelling was proposed to make this process more systematic through information reuse and structuring. By assessing the feature-based approach against a proposed set of requirements, it was shown that the approach does not allow the price of services and of Quality of Service (QoS) attributes to be considered in the negotiation and included in the electronic contract. Thus, this paper also presents an extension of that approach in which prices and price types associated with Web services and QoS levels are applied. An extended toolkit prototype is also presented, as well as an experiment example of the proposed approach.
Abstract:
Age-related changes in running kinematics have been reported in the literature using classical inferential statistics. However, this approach has been hampered by the large number of biomechanical gait variables reported and, consequently, the lack of differences presented in these studies. Data mining techniques have been applied in recent biomedical studies to address this problem using a more general approach. In the present work, we re-analyzed lower extremity running kinematic data of 17 young and 17 elderly male runners using the Support Vector Machine (SVM) classification approach. In total, 31 kinematic variables were extracted to train the classification algorithm and test the generalized performance. The results revealed different accuracy rates across three different kernel methods adopted in the classifier, with the linear kernel performing best. A subsequent forward feature selection algorithm demonstrated that with only six features, the linear kernel SVM achieved a 100% classification rate, showing that these features provided powerful combined information to distinguish age groups. The results of the present work demonstrate the potential of applying this approach to improve knowledge about age-related differences in running gait biomechanics and encourage the use of the SVM in other clinical contexts. (C) 2010 Elsevier Ltd. All rights reserved.
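The forward feature selection step paired with a linear-kernel SVM can be sketched roughly as below. The data are synthetic stand-ins for the 31 kinematic variables of the 34 runners, and the greedy scoring scheme is a plausible reading of the abstract, not the authors' exact procedure.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score
from sklearn.datasets import make_classification

# Hypothetical data standing in for 31 kinematic variables of 34 runners.
X, y = make_classification(n_samples=34, n_features=31, n_informative=6,
                           random_state=1)

def forward_select(X, y, n_keep=6):
    """Greedy forward feature selection scored by a linear-kernel SVM."""
    selected, remaining = [], list(range(X.shape[1]))
    while len(selected) < n_keep:
        # Try adding each remaining feature; keep the one with the best CV score.
        scores = [(cross_val_score(SVC(kernel="linear"),
                                   X[:, selected + [j]], y, cv=5).mean(), j)
                  for j in remaining]
        _, best_j = max(scores)
        selected.append(best_j)
        remaining.remove(best_j)
    return selected

print(forward_select(X, y))
```

The greedy search stops at six features, mirroring the point at which the study reports 100% classification performance.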
Abstract:
In recent years, magnetic nanoparticles have been studied due to their potential applications as magnetic carriers in the biomedical area. These materials have been increasingly exploited as efficient delivery vectors, leading to opportunities for use as magnetic resonance imaging (MRI) agents, mediators of hyperthermia cancer treatment and in targeted therapies. Much attention has also been focused on "smart" polymers, which are able to respond to environmental changes, such as changes in temperature and pH. In this context, this article reviews the state of the art in stimuli-responsive magnetic systems for biomedical applications. The paper describes different types of stimuli-sensitive systems, mainly temperature- and pH-sensitive polymers, the combination of this characteristic with magnetic properties and, finally, it gives an account of their preparation methods. The article also discusses the main in vivo biomedical applications of such materials. A survey of the recent literature on various stimuli-responsive magnetic gels in biomedical applications is also included. (C) 2010 Elsevier B.V. All rights reserved.
Abstract:
There are about 7500 water treatment plants in Brazil. The wastes these plants generate in their decantation tanks and filters are discharged directly into the same brooks and rivers that supply water for treatment. Another serious environmental problem is the unregulated disposal of construction and demolition rubble, which increases the expenditure of public resources by degrading the urban environment and helping to aggravate flooding and the proliferation of vectors harmful to public health. In this study, an evaluation was made of the possibility of recycling water treatment sludge in construction and demolition waste recycling plants. The axial compressive strength and water absorption of concretes and mortars produced with the exclusive and joint addition of these two types of waste were also determined. The eco-efficiency of this recycling was evaluated by determining the concentration of aluminum in the leached extract resulting from the solubilization of the recycled products. The production of concretes and mortars with the joint addition of water treatment sludge and recycled concrete rubble aggregates proved to be a viable recycling alternative from the standpoint of axial compressive strength, modulus of elasticity, water absorption and tensile strength by the Brazilian test method. (C) 2008 Elsevier Ltd. All rights reserved.
Abstract:
This paper proposes a novel computer vision approach that processes video sequences of people walking and then recognises those people by their gait. Human motion carries different information that can be analysed in various ways. The skeleton carries motion information about human joints, and the silhouette carries information about boundary motion of the human body. Moreover, binary and gray-level images contain different information about human movements. This work proposes to recover these different kinds of information to interpret the global motion of the human body based on four different segmented image models, using a fusion model to improve classification. Our proposed method considers the set of the segmented frames of each individual as a distinct class and each frame as an object of this class. The methodology applies background extraction using the Gaussian Mixture Model (GMM), a scale reduction based on the Wavelet Transform (WT) and feature extraction by Principal Component Analysis (PCA). We propose four new schemas for motion information capture: the Silhouette-Gray-Wavelet model (SGW) captures motion based on grey level variations; the Silhouette-Binary-Wavelet model (SBW) captures motion based on binary information; the Silhouette-Edge-Binary model (SEW) captures motion based on edge information and the Silhouette Skeleton Wavelet model (SSW) captures motion based on skeleton movement. The classification rates obtained separately from these four different models are then merged using a new proposed fusion technique. The results suggest excellent performance in terms of recognising people by their gait.
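A heavily simplified, hypothetical sketch of the scale-reduction and feature-extraction stages is given below, with a one-level Haar average standing in for the Wavelet Transform step and an SVD-based projection for PCA; the four segmentation models and the fusion step are omitted.

```python
import numpy as np

def haar_reduce(img):
    """One level of 2-D Haar approximation: average 2x2 blocks (scale reduction)."""
    h, w = img.shape
    return img[:h // 2 * 2, :w // 2 * 2].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def pca_features(frames, n_components=8):
    """Project flattened frames onto their top principal components."""
    X = np.array([f.ravel() for f in frames], dtype=float)
    X -= X.mean(axis=0)                              # center before PCA
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    return X @ Vt[:n_components].T                   # PCA scores per frame

# Hypothetical silhouettes: 10 frames of 32x32 binary masks.
rng = np.random.default_rng(0)
frames = [haar_reduce((rng.random((32, 32)) > 0.5).astype(float))
          for _ in range(10)]
feats = pca_features(frames, n_components=4)
print(feats.shape)  # (10, 4)
```

In the paper's pipeline these per-frame feature vectors would be computed separately for each segmented image model (SGW, SBW, SEW, SSW) before classification and fusion.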
Abstract:
A new circuit configuration, linearly conjugate to the standard Chua's circuit, is presented. Its distinctive feature is that the equations now admit an additional parameter, which controls the dissipation in the network connected to the Chua diode. In the limiting case we obtain the simplest chaotic circuit, consisting of a piecewise-linear resistor and three lossless elements.
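For reference, the standard Chua's circuit (in its usual dimensionless form) can be simulated as below; the parameter values are the classic double-scroll set and are not taken from the modified configuration the paper introduces.

```python
import numpy as np

# Classic double-scroll parameters for the standard Chua's circuit.
alpha, beta = 9.0, 100.0 / 7.0
m0, m1 = -8.0 / 7.0, -5.0 / 7.0

def f(x):
    # Piecewise-linear characteristic of the Chua diode.
    return m1 * x + 0.5 * (m0 - m1) * (abs(x + 1.0) - abs(x - 1.0))

def chua(state):
    x, y, z = state
    return np.array([alpha * (y - x - f(x)), x - y + z, -beta * y])

def rk4_step(state, dt):
    # One fourth-order Runge-Kutta step.
    k1 = chua(state)
    k2 = chua(state + 0.5 * dt * k1)
    k3 = chua(state + 0.5 * dt * k2)
    k4 = chua(state + dt * k3)
    return state + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

state = np.array([0.1, 0.0, 0.0])
traj = np.empty((20000, 3))
for i in range(20000):
    state = rk4_step(state, 0.01)
    traj[i] = state
print(traj[:, 0].min(), traj[:, 0].max())  # the attractor stays bounded
```

The extra dissipation parameter the paper introduces would enter these equations as a modification of the linear network terms; in its limiting case the circuit reduces to a piecewise-linear resistor plus three lossless elements.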
Abstract:
This letter presents some notes on the use of the Gram matrix in observability analysis. This matrix is constructed considering the rows of the measurement Jacobian matrix as vectors, and it can be employed in observability analysis and restoration methods. The determination of nonredundant pseudo-measurements (normally injection pseudo-measurements) for merging observable islands into a single observable system is carried out by analyzing the pivots of the Gram matrix. The Gram matrix can also be used to verify local redundancy, which is important in measurement system planning. Some numerical examples are used to illustrate these features. Other features of the Gram matrix are under study.
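A minimal numerical sketch of the idea: build the Gram matrix from the rows of a hypothetical measurement Jacobian and inspect its pivots for redundancy, here approximated by the diagonal of a QR factor rather than explicit Gaussian elimination.

```python
import numpy as np

# Hypothetical measurement Jacobian H: rows are measurements, columns are
# state variables (a stand-in, not a real network model).
H = np.array([[1., -1.,  0., 0.],
              [0.,  1., -1., 0.],
              [1.,  0., -1., 0.]])   # row 3 = row 1 + row 2 (redundant)

G = H @ H.T                          # Gram matrix of the Jacobian rows

# A zero (numerically tiny) pivot flags a redundant measurement; rank
# deficiency of G signals that measurements do not add new information.
pivots = np.abs(np.diag(np.linalg.qr(G, mode="r")))
redundant = pivots < 1e-9
print(np.linalg.matrix_rank(G), redundant)
```

In the letter's application, the pivot pattern is what identifies which candidate pseudo-measurements are nonredundant and can merge observable islands.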
Abstract:
Modal filters may be obtained by a properly designed weighted sum of the output signals of an array of sensors distributed on the host structure. Although several research groups have been interested in techniques for designing and implementing modal filters based on a given array of sensors, the effect of the array topology on the effectiveness of the modal filter has received much less attention. In particular, it is known that some parameters, such as size, shape and location of a sensor, are very important in determining the observability of a vibration mode. Hence, this paper presents a methodology for the topological optimization of an array of sensors in order to maximize the effectiveness of a set of selected modal filters. This is done using a genetic algorithm optimization technique for the selection of 12 piezoceramic sensors from an array of 36 piezoceramic sensors regularly distributed on an aluminum plate, which maximize the filtering performance, over a given frequency range, of a set of modal filters, each one aiming to isolate one of the first vibration modes. The vectors of the weighting coefficients for each modal filter are evaluated using QR decomposition of the complex frequency response function matrix. Results show that the array topology is not very important for lower frequencies but it greatly affects the filter effectiveness for higher frequencies. Therefore, it is possible to improve the effectiveness and frequency range of a set of modal filters by optimizing the topology of an array of sensors. Indeed, using 12 properly located piezoceramic sensors bonded on an aluminum plate it is shown that the frequency range of a set of modal filters may be enlarged by 25-50%.
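The QR-based evaluation of the weighting coefficients can be sketched as a least-squares solve; the FRF matrix and target modal responses below are random stand-ins, with only the 12-sensor count taken from the abstract.

```python
import numpy as np

rng = np.random.default_rng(42)
n_freq, n_sensors, n_modes = 200, 12, 4

# Hypothetical complex FRF matrix: sensor responses across frequencies,
# standing in for data measured on the piezoceramic array.
H = rng.standard_normal((n_freq, n_sensors)) + 1j * rng.standard_normal((n_freq, n_sensors))

# Targets: the single-mode responses each modal filter should isolate
# (random stand-ins here).
T = rng.standard_normal((n_freq, n_modes)) + 1j * rng.standard_normal((n_freq, n_modes))

# QR-based least-squares solve of H @ W ~= T for the weighting coefficients W.
Q, R = np.linalg.qr(H)                  # economy-size QR of the FRF matrix
W = np.linalg.solve(R, Q.conj().T @ T)  # one column of weights per modal filter
print(W.shape)  # (12, 4)
```

Each column of W gives the weighted sum of sensor outputs that implements one modal filter; the topology optimization then chooses which sensors participate.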
Abstract:
An accurate estimate of machining time is very important for predicting delivery time, manufacturing costs, and also to help production process planning. Most commercial CAM software systems estimate the machining time in milling operations simply by dividing the entire tool path length by the programmed feed rate. This time estimate differs drastically from the real process time because the feed rate is not always constant due to machine and computer numerical control (CNC) limitations. This study presents a practical mechanistic method for milling time estimation when machining free-form geometries. The method considers a variable called machine response time (MRT) which characterizes the real CNC machine's capacity to move at high feed rates in free-form geometries. MRT is a global performance feature which can be obtained for any type of CNC machine configuration by carrying out a simple test. For validating the methodology, a workpiece was used to generate NC programs for five different types of CNC machines. A practical industrial case study was also carried out to validate the method. The results indicated that MRT, and consequently the real machining time, depends on the CNC machine's potential; furthermore, the greater the MRT, the larger the difference between predicted milling time and real milling time. The proposed method achieved an error range from 0.3% to 12% of the real machining time, whereas the CAM estimation error ranged from 211% to 1244%. The MRT-based process is also suggested as an instrument for helping in machine tool benchmarking.
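The gap between the naive CAM estimate and an MRT-aware estimate can be illustrated with a toy calculation; the segment lengths, feed rate and MRT value below are hypothetical, and the per-segment time floor is a simplified reading of how MRT limits short free-form moves.

```python
def cam_estimate(segments_mm, feed_mm_min):
    """Naive CAM estimate: total path length / programmed feed rate (minutes)."""
    return sum(segments_mm) / feed_mm_min

def mrt_estimate(segments_mm, feed_mm_min, mrt_s):
    """Per-segment estimate: each short segment costs at least the machine
    response time MRT, so the programmed feed is rarely reached (minutes)."""
    return sum(max(L / feed_mm_min * 60.0, mrt_s) for L in segments_mm) / 60.0

segments = [0.5] * 2000          # 2000 short free-form segments of 0.5 mm each
feed = 3000.0                    # programmed feed rate, mm/min
print(cam_estimate(segments, feed))        # optimistic CAM figure, minutes
print(mrt_estimate(segments, feed, 0.05))  # with a 50 ms MRT per block, minutes
```

With short segments the MRT floor dominates, which is why the study finds CAM estimates off by orders of magnitude on free-form toolpaths.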
Abstract:
This paper proposes a physically non-linear formulation to deal with steel fiber reinforced concrete by the finite element method. The proposed formulation allows the consideration of short or long fibers placed arbitrarily inside a continuum domain (matrix). The most important feature of the formulation is that no additional degree of freedom is introduced in the pre-existent finite element numerical system to consider any distribution or quantity of fiber inclusions. In other words, the size of the system of equations used to solve a non-reinforced medium is the same as the one used to solve the reinforced counterpart. Another important characteristic of the formulation is the reduced work required by the user to introduce reinforcements, avoiding "rebar" elements, node-by-node geometrical definitions or even complex mesh generation. Bounded connection between long fibers and the continuum is considered; for short fibers, a simplified approach is proposed to consider splitting. Non-associative plasticity is adopted for the continuum and one-dimensional plasticity is adopted to model fibers. Examples are presented in order to show the capabilities of the formulation.
Abstract:
This study presents a solid-like finite element formulation to solve geometrically non-linear three-dimensional inhomogeneous frames. To achieve the desired representation, unconstrained vectors are used instead of the classic rigid director triad; as a consequence, the resulting formulation does not use finite rotation schemes. High-order curved elements with any cross section are developed using a full three-dimensional constitutive elastic relation. Warping and variable thickness strain modes are introduced to avoid locking. The warping mode is solved numerically in an FEM pre-processing computational code, which is coupled to the main program. The extra calculations are relatively small when the number of finite elements with the same cross section increases. The warping mode is based on a 2D free torsion (Saint-Venant) problem that considers inhomogeneous material. A scheme that automatically generates shape functions and their derivatives allows the use of any degree of approximation for the developed frame element. General examples are solved to check the objectivity, path independence, locking-free behavior, generality and accuracy of the proposed formulation. (C) 2009 Elsevier B.V. All rights reserved.
Abstract:
This communication proposes a simple way to introduce fibers into finite element modelling. This is a promising formulation to deal with fiber-reinforced composites by the finite element method (FEM), as it allows the consideration of short or long fibers placed arbitrarily inside a continuum domain (matrix). The most important feature of the formulation is that no additional degree of freedom is introduced into the pre-existent finite element numerical system to consider any distribution of fiber inclusions. In other words, the size of the system of equations used to solve a non-reinforced medium is the same as the one used to solve the reinforced counterpart. Another important characteristic is the reduced work required by the user to introduce fibers, avoiding "rebar" elements, node-by-node geometrical definitions or even complex mesh generation. An additional characteristic of the technique is the possibility of representing unbounded stresses at the end of fibers using a finite number of degrees of freedom. Further studies are required for non-linear applications in which localization may occur. Along the text the linear formulation is presented and the bounded connection between fibers and continuum is considered. Four examples are presented, including non-linear analysis, to validate and show the capabilities of the formulation. Copyright (c) 2007 John Wiley & Sons, Ltd.
Abstract:
A round robin program was conducted to assess the ability of three different X-radiographic systems to image internal fatigue cracks in riveted lap joints of composite glass-reinforced fiber/metal laminate. From an engineering perspective, conventional film radiography and direct radiography produced the best results, identifying and characterizing in detail internal damage on metallic faying surfaces of fastened glass-reinforced fiber/metal laminate joints. On the other hand, computed radiographic images presented large projected geometric distortions and feature shifts due to the angular incident radiation beam, disclosing only partial internal cracking patterns.