29 results for computer based experiments
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
We analyze free elementary particles with a rest mass m and total energy E
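For a free relativistic particle, the standard relation between these quantities (a textbook identity, given here only for context and not quoted from the abstract) is

    E^{2} = p^{2}c^{2} + m^{2}c^{4},

where p is the particle's momentum and c the speed of light.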
Abstract:
Graduate Program in Psychology of Development and Learning (Pós-graduação em Psicologia do Desenvolvimento e Aprendizagem) - FC
Abstract:
We consider a procedure for obtaining a compact fourth-order method for the steady 2D Navier-Stokes equations in the streamfunction formulation using the computer algebra system Maple. The resulting code is short, and from it we obtain the Fortran program for the method. To test the procedure we have solved many cavity-type problems, including one with an analytical solution, and the results are compared with those obtained by second-order central differences at moderate Reynolds numbers.
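For reference, the streamfunction form of the steady 2D Navier-Stokes equations that such compact schemes discretize (standard notation; the paper's exact formulation may differ) is

    \frac{1}{Re}\nabla^{4}\psi = \frac{\partial\psi}{\partial y}\frac{\partial(\nabla^{2}\psi)}{\partial x} - \frac{\partial\psi}{\partial x}\frac{\partial(\nabla^{2}\psi)}{\partial y}, \qquad u = \frac{\partial\psi}{\partial y}, \quad v = -\frac{\partial\psi}{\partial x},

where Re is the Reynolds number and \psi the streamfunction.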
Abstract:
We generalize a procedure proposed by Mancera and Hunt [P.F.A. Mancera, R. Hunt, Some experiments with high order compact methods using a computer algebra software-Part 1, Appl. Math. Comput., in press, doi:10.1016/j.amc.2005.05.015] for obtaining a compact fourth-order method for the steady 2D Navier-Stokes equations in the streamfunction-vorticity formulation using the computer algebra system Maple, extending it to include conformal mappings and non-uniform grids. To analyse the procedure we have solved a constricted stepped channel problem, in which a fine grid is placed near the re-entrant corner by a transformation of the independent variables.
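For context, the streamfunction-vorticity form of the steady 2D Navier-Stokes equations (standard notation, not quoted from the paper) couples a Poisson equation for the streamfunction with a vorticity transport equation:

    \nabla^{2}\psi = -\omega, \qquad \frac{\partial\psi}{\partial y}\frac{\partial\omega}{\partial x} - \frac{\partial\psi}{\partial x}\frac{\partial\omega}{\partial y} = \frac{1}{Re}\nabla^{2}\omega.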
Abstract:
The problem of dynamic camera calibration considering moving objects in close-range environments, using straight lines as references, is addressed. A mathematical model for the correspondence between a straight line in object space and its image is discussed. This model is based on the equivalence between the vector normal to the interpretation plane in image space and the vector normal to the rotated interpretation plane in object space. Dynamic camera calibration is solved by Kalman filtering: an iterative process based on the recursive property of the Kalman filter is defined, in which the sequentially estimated camera orientation parameters feed back into the feature extraction process in the image. For the dynamic case, e.g. an image sequence of a moving object, a state prediction and its covariance matrix for the next instant are obtained from the available estimates and the system model. Filtered state estimates of good quality can then be computed for each instant of the image sequence from these predictions via the Kalman filter update, based on the system model parameters. The proposed approach was tested with simulated and real data. Experiments with real data were carried out in a controlled environment, using a sequence of images of a cube moving along a linear trajectory over a flat surface.
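A minimal sketch, in Python, of the generic Kalman predict/update cycle the abstract refers to; the matrices F, H, Q, and R below are placeholders, not the paper's actual camera-orientation model:

    import numpy as np

    def kalman_predict(x, P, F, Q):
        # Propagate the state estimate and its covariance to the next instant.
        x_pred = F @ x
        P_pred = F @ P @ F.T + Q
        return x_pred, P_pred

    def kalman_update(x_pred, P_pred, z, H, R):
        # Correct the prediction with a new measurement z.
        y = z - H @ x_pred                   # innovation
        S = H @ P_pred @ H.T + R             # innovation covariance
        K = P_pred @ H.T @ np.linalg.inv(S)  # Kalman gain
        x = x_pred + K @ y
        P = (np.eye(len(x)) - K @ H) @ P_pred
        return x, P

In the setting described above, the predicted state at each instant would drive the feature extraction in the next image, and the update step would fold the extracted line measurements back into the orientation estimate.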
Abstract:
The main objective of this paper is to present the results obtained from the application of artificial neural networks and statistical tools to the automatic identification and classification of faults in electric power distribution systems. The techniques developed to address the problem combine, in an integrated way, several approaches that contribute to reliable and safe fault detection. Results from practical experiments carried out on a pilot radial distribution feeder demonstrate that the developed techniques are accurate, efficiently identifying and classifying the fault occurrences observed in the feeder.
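As an illustration of the kind of neural-network fault classifier such a system might use (the features, fault classes, and network size below are hypothetical, not taken from the paper):

    import numpy as np
    from sklearn.neural_network import MLPClassifier

    # Hypothetical training data: one row of per-phase RMS voltage/current
    # features per recorded event, labelled with an assumed fault class
    # (0 = no fault, 1 = phase-to-ground, 2 = phase-to-phase, 3 = three-phase).
    rng = np.random.default_rng(0)
    X_train = rng.random((200, 6))
    y_train = rng.integers(0, 4, size=200)

    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=0)
    clf.fit(X_train, y_train)
    print(clf.predict(X_train[:5]))    # predicted fault classes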
Abstract:
This project applies image processing and computer vision techniques, featuring an omnidirectional vision system, to agricultural mobile robots (AMRs) for trajectory navigation and localization problems. To carry out this task, computational methods based on the JSEG algorithm were used to classify and characterize such problems, together with Artificial Neural Networks (ANNs) for pattern recognition. It was thus possible to run simulations and analyse the performance of the JSEG image segmentation technique on Matlab/Octave platforms, along with a customized back-propagation algorithm and statistical methods in a Simulink environment. Once these procedures were completed, it was possible to classify and characterize the HSV color-space segments and to recognize patterns, with reasonably accurate results.
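A minimal sketch of HSV color-space segmentation in Python/OpenCV, standing in for the far more elaborate JSEG segmentation described above; the file name and color thresholds are illustrative assumptions:

    import cv2
    import numpy as np

    img = cv2.imread("frame.png")                  # hypothetical camera frame
    hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)     # convert to HSV color space

    # Illustrative thresholds for a vegetation-like green band.
    lower = np.array([35, 40, 40])
    upper = np.array([85, 255, 255])
    mask = cv2.inRange(hsv, lower, upper)          # binary mask of the segment
    segment = cv2.bitwise_and(img, img, mask=mask) # pixels belonging to the class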
Abstract:
Thermal faceprints have attracted increasing attention in recent years. Since face recognition can be performed on images acquired in the infrared spectrum, a unique individual signature can be obtained from the blood-vessel network of the face. In this work, we propose a novel framework for thermal faceprint extraction using a collection of graph-based techniques that had not previously been applied to this task. A robust method for thermal face segmentation is also presented. The experiments, conducted on the UND Collection C dataset, showed promising results.
Abstract:
A body of knowledge in Software Engineering requires replications of experiments. The knowledge generated by a study is recorded in a so-called lab package, which must be reviewed by any research group intending to replicate the study. However, researchers face difficulties when reviewing lab packages, which hinders knowledge sharing among research groups. Moreover, the lack of standardization is an obstacle to integrating the knowledge from an isolated study into a common body of knowledge. Ontologies can help here, since they act as a standard that promotes a shared understanding of the structure of experiment information. In this paper, we present a workflow to generate lab packages based on EXPEiiQntology, an ontology of the controlled-experiments domain. In addition, by instantiating lab packages it is possible to evolve the ontology to accommodate new concepts that appear in different lab packages. This iterative ontology evolution aims to achieve a standard able to accommodate different lab packages and, hence, to make their content easier to review and understand.
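A minimal sketch, in Python with rdflib, of what instantiating a lab package against an experiment ontology could look like; the namespace, class, and property names here are hypothetical, not taken from the ontology named above:

    from rdflib import Graph, Literal, Namespace, RDF

    # Hypothetical vocabulary; the real ontology's terms may differ.
    EXP = Namespace("http://example.org/experiment-ontology#")

    g = Graph()
    pkg = EXP.LabPackage01
    g.add((pkg, RDF.type, EXP.LabPackage))
    g.add((pkg, EXP.hasHypothesis, Literal("Technique A reduces inspection effort")))
    g.add((pkg, EXP.hasDesign, Literal("paired comparison, within-subjects")))

    print(g.serialize(format="turtle"))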
Abstract:
Increased accessibility to high-performance computing resources has created a demand for user support through performance evaluation tools like iSPD (iconic Simulator for Parallel and Distributed systems), a simulator based on iconic modelling for distributed environments such as computer grids. It was developed to make it easier for general users to create their grid models, including allocation and scheduling algorithms. This paper describes how schedulers are managed by iSPD and how users can easily adopt the scheduling policy that best improves the system being simulated. A thorough description of iSPD is given, detailing its scheduler manager. Comparisons between iSPD and SimGrid simulations, including runs of the simulated environment on a real cluster, are also presented.
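As a generic illustration of what a pluggable scheduling policy does (this is not iSPD's real interface, just a sketch of a least-loaded policy):

    from dataclasses import dataclass

    @dataclass
    class Machine:
        name: str
        pending: float = 0.0   # summed cost of tasks already assigned

    def schedule(task_costs, machines):
        # Greedy least-loaded policy: send each task to the machine
        # with the smallest pending workload.
        plan = []
        for cost in task_costs:
            target = min(machines, key=lambda m: m.pending)
            target.pending += cost
            plan.append((cost, target.name))
        return plan

    machines = [Machine("node-1"), Machine("node-2")]
    print(schedule([4.0, 2.0, 6.0, 1.0], machines))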
Abstract:
This article presents a new method to detect damage in structures based on the electromechanical impedance principle. The system follows the variations in the output voltage of piezoelectric transducers and does not compute the impedance itself. The proposed system is portable, autonomous, and versatile, and could efficiently replace commercial instruments in various structural health monitoring applications. Damage is identified by simply comparing the variations in root-mean-square voltage of the response signals of piezoelectric transducers, such as lead zirconate titanate patches bonded to the structure, obtained at different excitation frequencies. The proposed system is not limited by the sampling rate of analog-to-digital converters, dispenses with Fourier transform algorithms, and does not require a computer for processing, operating autonomously. A low-cost prototype based on a microcontroller and a digital synthesizer was built, and experiments carried out on an aluminum structure yielded excellent results.
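A minimal Python sketch of an RMS-based comparison of the kind described; the exact damage metric below is an assumption, not the article's formula:

    import numpy as np

    def rms(signal):
        # Root mean square of a sampled voltage signal.
        return np.sqrt(np.mean(np.square(signal)))

    def damage_index(baseline, measured):
        # Relative RMS deviation at one excitation frequency; an index
        # rising above a calibrated threshold would indicate damage.
        b = rms(baseline)
        return abs(rms(measured) - b) / b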
Abstract:
With the widespread proliferation of computers, many human activities entail the use of automatic image analysis. The basic features used for image analysis include color, texture, and shape. In this paper, we propose a new shape description method, called Hough Transform Statistics (HTS), which uses statistics from the Hough space to characterize the shape of objects or regions in digital images. A modified version of this method, called Hough Transform Statistics neighborhood (HTSn), is also presented. Experiments carried out on three popular public image databases showed that the HTS and HTSn descriptors are robust, presenting precision-recall results much better than those of several other well-known shape description methods. Compared to the Beam Angle Statistics (BAS) method, the shape description method that inspired their development, both HTS and HTSn presented inferior results on the precision-recall criterion, but superior results on the processing-time and multiscale-separability criteria. The linear complexity of the HTS and HTSn algorithms, in contrast to BAS, makes them more appropriate for shape analysis in high-resolution image retrieval tasks over the very large databases that are common nowadays.
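A minimal Python sketch of the general idea behind Hough-space statistics, using scikit-image; the actual HTS descriptor is more specific than these simple per-angle moments:

    import numpy as np
    from skimage.transform import hough_line

    def hough_stats(binary_shape):
        # Accumulate votes in Hough space, then summarize the accumulator
        # with per-angle statistics to form a fixed-length feature vector.
        hspace, angles, dists = hough_line(binary_shape)
        h = hspace.astype(float)
        h /= max(h.sum(), 1.0)               # normalize total votes
        return np.concatenate([h.mean(axis=0), h.std(axis=0)])

    shape_img = np.zeros((64, 64), dtype=bool)
    shape_img[16:48, 32] = True              # a vertical line segment
    features = hough_stats(shape_img)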