Abstract:
Predictive control has gained many adherents in recent years because its parameters are easy to tune, its concepts extend naturally to multi-input/multi-output (MIMO) systems, nonlinear process models can be linearised around an operating point and used directly in the controller, and, above all, it is the only methodology that can take the limitations of the control signals and of the process output into account during controller design. The time-varying weighting generalized predictive control (TGPC) studied in this work is one more alternative among the many existing predictive controllers. It is a modification of generalized predictive control (GPC) that uses a reference model, computed from design parameters previously established by the designer, and a new criterion function whose minimization yields the best controller parameters. Genetic algorithms are used to minimize the proposed criterion function, and the robustness of the TGPC is demonstrated through the application of performance, stability and robustness criteria. To compare the results achieved by the TGPC controller, GPC and proportional-integral-derivative (PID) controllers are also used, with all techniques applied to stable, unstable and non-minimum-phase plants. The simulated examples are carried out in MATLAB. It is verified that the modifications implemented in the TGPC demonstrate the efficiency of this algorithm.
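As a rough sketch of the approach (not the author's actual TGPC implementation), a real-coded genetic algorithm can minimize a GPC-style criterion function J = Σ(r − ŷ)² + λΣΔu² over a candidate sequence of control increments. The first-order plant, horizon and weights below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical first-order plant y[k+1] = a*y[k] + b*u[k] (illustrative values).
a, b = 0.9, 0.1
N = 10          # prediction horizon
lam = 0.05      # control-effort weight
r = 1.0         # constant set-point

def cost(du, y0=0.0, u0=0.0):
    """GPC-style criterion: tracking error plus weighted control increments."""
    y, u, J = y0, u0, 0.0
    for k in range(N):
        u += du[k]
        y = a * y + b * u
        J += (r - y) ** 2 + lam * du[k] ** 2
    return J

def ga_minimize(cost_fn, dim, pop=40, gens=100, sigma=0.2):
    """Minimal real-coded genetic algorithm: elitism, tournament selection,
    blend crossover and Gaussian mutation."""
    P = rng.normal(0.0, 1.0, (pop, dim))
    for _ in range(gens):
        f = np.array([cost_fn(ind) for ind in P])
        new = [P[np.argmin(f)]]                      # elitism: keep the best
        while len(new) < pop:
            i, j = rng.integers(0, pop, 2)
            p1 = P[i] if f[i] < f[j] else P[j]       # tournament selection
            i, j = rng.integers(0, pop, 2)
            p2 = P[i] if f[i] < f[j] else P[j]
            wgt = rng.random(dim)
            child = wgt * p1 + (1 - wgt) * p2        # blend crossover
            child += rng.normal(0.0, sigma, dim)     # Gaussian mutation
            new.append(child)
        P = np.array(new)
    f = np.array([cost_fn(ind) for ind in P])
    return P[np.argmin(f)], f.min()

du_best, J_best = ga_minimize(cost, N)
```

Doing nothing (all increments zero) costs J = 10 here; the GA should find a control sequence well below that, which is the sense in which minimizing the criterion "offers the best parameters".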
Abstract:
Providing data and information on a watershed is important because knowledge of its physical characteristics, land use and related attributes allows better planning and economically, socially and environmentally sustainable use of the area. Investigation of the physical environment has commonly relied on geoprocessing, which has proved a very efficient tool. Within this context, this research analyzes the Punaú river basin (located in the municipalities of Touros, Rio do Fogo and Pureza, state of Rio Grande do Norte) in several aspects, using geoprocessing as a working tool to provide information about the entire watershed. Specifically, the study aimed to update pre-existing maps, such as the geological, geomorphological and land-use maps; to generate a map of environmental vulnerability with respect to the erosion susceptibility of the area; to generate a map of legal incompatibility, identifying areas already being used in breach of environmental legislation; and to propose solutions for the occupation of the Punaú river basin focused on environmental planning. The methodology was based on geoprocessing tools for data analysis and for producing the maps of legal incompatibility and environmental vulnerability. The first map took into account the environmental legislation on the protection of watersheds. For the vulnerability analysis, the resulting map was obtained by crossing the geology, geomorphology, soil and land-use maps, with weights assigned to the different attributes of the thematic maps, yielding a map of environmental vulnerability in relation to erosion susceptibility. The results indicate that agriculture is the most significant activity in the basin in terms of total occupied area, which confers a high degree of environmental vulnerability on most of the basin, and some agricultural areas have developed in a manner inconsistent with Brazilian environmental legislation.
It is proposed to revitalize the watershed in the most critical areas and to conserve it through mitigation measures addressing the causes of environmental degradation, such as protection of water sources, protection and restoration of riparian vegetation, protection of permanent preservation areas, containment of erosion processes in general, and other measures listed or not in specific laws, as well as the establishment of a basin committee for the area.
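The map-crossing step can be sketched as a weighted overlay of thematic rasters. The class values (1-3) and the weights below are purely illustrative, not those used in the study:

```python
import numpy as np

# Hypothetical 2x2 rasters, each cell holding a class score 1-3 for one
# theme; real maps and the weights below are illustrative assumptions.
geology  = np.array([[1, 3], [3, 1]])
geomorph = np.array([[2, 2], [1, 3]])
soils    = np.array([[1, 3], [2, 2]])
land_use = np.array([[3, 3], [1, 2]])

# Weighted overlay: each theme contributes to the vulnerability score.
weights = {"geology": 0.2, "geomorph": 0.3, "soils": 0.2, "land_use": 0.3}
vulnerability = (weights["geology"] * geology
                 + weights["geomorph"] * geomorph
                 + weights["soils"] * soils
                 + weights["land_use"] * land_use)

# Classify into vulnerability levels: 0 = low, 1 = medium, 2 = high.
classes = np.digitize(vulnerability, bins=[1.5, 2.5])
```

In a GIS workflow the same cell-by-cell combination is done over full rasters; the principle is identical.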
Abstract:
This work reports our experience with an investigative classroom practice carried out at the State School Jorge Fernandes, located in Natal, RN, with the objective of validating the application of a teaching module on Measures and Quantities in primary education. We adopted a constructivist approach, using Richard Skemp's (1980) theory to describe the students' learning according to their levels of comprehension. Initially, we carried out an exploratory survey to assess the participants' prior knowledge. We then developed an intervention to validate the teaching module, structured around needs identified in the results of the initial survey, analyzing the students' progress through a final evaluation that displayed the group's stages of growth with respect to the subject addressed. Finally, we present our reflections on the experience, putting forward suggestions for teachers' activities, aiming to contribute to the improvement of their classroom practice when approaching the subject we investigated.
Abstract:
This research aims to present non-Euclidean geometry to the reader as an anomaly, indicating its pedagogical implications, and then to propose a sequence of activities, divided into three blocks, that show the relationship of Euclidean geometry to the non-Euclidean geometries, taking the Euclidean case as the reference for analyzing the anomaly in the non-Euclidean ones. The work is tied to the PPGECNM research line of History, Philosophy and Sociology of Science in the Teaching of Natural Sciences and Mathematics. It deals with Euclid of Alexandria and his most famous work, the Elements, and emphasizes Euclid's Fifth Postulate, particularly the difficulties (which lasted several centuries) that mathematicians had in understanding it. In the nineteenth century, three mathematicians, Lobachevsky (1792-1856), Bolyai (1802-1860) and Gauss (1777-1855), became convinced that this axiom could not be derived from the others and that there was another geometry (an anomalous one) as consistent as Euclid's, but one that did not fit its parameters. The emergence of non-Euclidean geometry is attributed to these three. As for the methodology, we started with some bibliographical definitions of anomalies, then characterized them so that our definition would be better understood by readers, and only then dealt with the non-Euclidean geometries (hyperbolic geometry, spherical geometry and taxicab geometry), confronting them with the Euclidean one to analyze the anomalies existing in the non-Euclidean geometries and to observe their importance for teaching. After this characterization comes the empirical part of the proposal, which consisted of the application of three blocks of activities in search of the pedagogical implications of anomaly: the first on parallel lines, the second on the study of triangles and the third on the shortest distance between two points.
These blocks offer work with basic elements of geometry based on a historical and investigative study of the non-Euclidean geometries as anomaly, so that each concept is understood together with its properties without necessarily being tied to the image of the geometric elements, thus extending or adapting to other frames of reference. For example, the block applied on the second day of activities extends the result for the sum of the interior angles of a triangle, showing that it is not always 180° (that conclusion can only be drawn when Euclidean geometry is the reference).
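The triangle activity can be illustrated with Girard's theorem for spherical triangles: on a sphere, the interior angles sum to 180° plus the spherical excess, so the Euclidean 180° is only the limiting case of a vanishingly small triangle. A minimal numerical check:

```python
import math

def spherical_angle_sum(area, radius=1.0):
    """Girard's theorem: the interior angles of a spherical triangle
    sum to pi + area/radius**2 (the spherical excess), in radians."""
    return math.pi + area / radius**2

# The octant triangle (north pole plus two equator points 90 degrees
# apart) covers 1/8 of the unit sphere, so its area is 4*pi/8 = pi/2.
octant_area = 4 * math.pi / 8
total = math.degrees(spherical_angle_sum(octant_area))
print(total)  # ~270: three right angles, not the Euclidean 180
```

As the area shrinks toward zero the sum approaches 180°, which is exactly the sense in which the Euclidean result survives as a special case.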
Abstract:
In the 20th century, acupuncture spread through the West as a complementary health-care practice. This fact has motivated the international scientific community to invest in research seeking to understand why acupuncture works. In this work we statistically compare the voltage fluctuations of bioelectric signals picked up on the skin at an acupuncture point (IG 4) and at a nearby non-acupuncture point. The signals were acquired through an electronic interface with a computer, based on an instrumentation amplifier designed with specifications adequate to this end. On the signals collected from a sample of 30 volunteers we computed the main statistics and submitted them to a paired t-test with significance level α = 0.05. For the bioelectric signals we estimated the following parameters: standard deviation, skewness and kurtosis. Moreover, we computed the autocorrelation function and fitted it with an exponential curve, observing that the signal decays more rapidly at a non-acupoint than at an acupoint. This fact is an indication of the existence of information at the acupoint.
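The autocorrelation comparison can be sketched as follows. The AR(1) processes below are illustrative stand-ins for the recorded bioelectric signals, with the more persistent process mimicking the slower-decaying acupoint signal:

```python
import numpy as np

rng = np.random.default_rng(1)

def autocorrelation(x, max_lag):
    """Normalized sample autocorrelation for lags 0..max_lag-1."""
    x = x - x.mean()
    acf = np.array([np.dot(x[:len(x) - k], x[k:]) for k in range(max_lag)])
    return acf / acf[0]

def decay_time(acf):
    """Fit acf(k) ~ exp(-k/tau) by linear regression on log(acf) over
    the initial positive stretch; a larger tau means slower decay."""
    pos = acf > 0
    n = np.argmin(pos) if not pos.all() else len(acf)
    k = np.arange(n)
    slope = np.polyfit(k, np.log(acf[:n]), 1)[0]
    return -1.0 / slope

# Synthetic AR(1) signals (illustrative only, not the measured data).
acupoint, nonacupoint = np.zeros(5000), np.zeros(5000)
e = rng.normal(size=5000)
for t in range(1, 5000):
    acupoint[t]    = 0.95 * acupoint[t - 1] + e[t]   # persistent
    nonacupoint[t] = 0.60 * nonacupoint[t - 1] + e[t]  # fast-decaying

tau_acu = decay_time(autocorrelation(acupoint, 50))
tau_non = decay_time(autocorrelation(nonacupoint, 50))
```

A longer decay time at the acupoint, as reported in the abstract, corresponds here to tau_acu exceeding tau_non.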
Abstract:
Although individual supervised Machine Learning (ML) techniques, also known as classifiers or classification algorithms, supply solutions that are most of the time considered efficient, experimental results obtained with large pattern sets, and/or with sets containing an expressive amount of irrelevant or incomplete data, show a decrease in the precision of these techniques. In other words, such techniques cannot recognize patterns efficiently in complex problems. Aiming at better performance and efficiency of these ML techniques, the idea arose of making several ML algorithms work jointly, giving rise to the term Multi-Classifier System (MCS). An MCS has different ML algorithms, called base classifiers, as its components, and combines the results obtained by these algorithms to reach the final result. For an MCS to perform better than its base classifiers, the results obtained by each base classifier must present a certain diversity, in other words, a difference between the results obtained by each classifier composing the system. It can be said that it makes no sense to have MCSs whose base classifiers give identical answers to the same patterns. Although MCSs present better results than individual systems, there is an ongoing effort to improve the results obtained by this type of system further. Aiming at this improvement and at more consistent results, as well as at greater diversity among the classifiers of an MCS, methodologies characterized by the use of weights, or confidence values, have recently been investigated. These weights can describe the importance a given classifier has when associating each pattern with a given class, and they are used, together with the classifiers' outputs, during the recognition (use) phase of the MCS.
There are different ways of calculating these weights, which can be divided into two categories: static weights and dynamic weights. The first category is characterized by values that do not change during the classification process, unlike the second, whose values are modified during classification. In this work an analysis is carried out to verify whether the use of weights, both static and dynamic, can increase the performance of MCSs in comparison with individual systems. Moreover, the diversity obtained by the MCSs is analyzed, in order to verify whether there is some relation between the use of weights in MCSs and different levels of diversity.
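The static-weight case can be sketched with weighted majority voting. The classifier outputs below, and the choice of validation accuracy as the weight, are illustrative assumptions:

```python
import numpy as np

# Hypothetical hard outputs of three base classifiers on six patterns
# (classes 0/1) and the true labels; all values are illustrative only.
votes = np.array([[0, 1, 1],
                  [1, 1, 0],
                  [0, 0, 1],
                  [1, 1, 1],
                  [0, 1, 0],
                  [1, 0, 0]])
y_true = np.array([1, 1, 0, 1, 0, 0])

# Static weights: here, each base classifier's accuracy on a validation
# set; the values stay fixed for every pattern being classified.
w = np.array([(votes[:, j] == y_true).mean() for j in range(votes.shape[1])])

def weighted_vote(votes, w, n_classes=2):
    """Combine the base classifiers' outputs by weighted majority voting."""
    scores = np.zeros((len(votes), n_classes))
    for j, wj in enumerate(w):
        scores[np.arange(len(votes)), votes[:, j]] += wj
    return scores.argmax(axis=1)

pred = weighted_vote(votes, w)
```

A dynamic-weight scheme would instead recompute w for each test pattern, for example from each classifier's accuracy in the pattern's neighborhood, so the values change during classification.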
Abstract:
Committees of classifiers can be used to improve the accuracy of classification systems; in other words, different classifiers applied to the same problem can be combined into a system of greater accuracy, called a committee of classifiers. For this to succeed, the classifiers must make mistakes on different objects of the problem, so that the errors of one classifier are compensated by the correct answers of the others when the committee's combination method is applied. This tendency of classifiers to err on different objects is called diversity. However, most diversity measures fail to capture this importance. Recently, two diversity measures (good diversity and bad diversity) were proposed with the aim of helping to generate more accurate committees. This work performs an experimental analysis of these measures applied directly to the construction of committees of classifiers. The construction method is modeled as a search, over the feature sets of the problem's databases and over the candidate committee members, for the committee of classifiers that produces the most accurate classification. This problem is solved by metaheuristic optimization techniques, in their mono- and multi-objective versions. Analyses are performed to verify whether using or adding the measures of good and bad diversity as optimization objectives creates more accurate committees. Thus, the contribution of this study is to determine whether the measures of good and bad diversity can be used as objectives in mono-objective and multi-objective optimization techniques to build committees of classifiers more accurate than those built by the same process using only classification accuracy as the optimization objective.
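A minimal sketch of the good/bad diversity decomposition, following Brown and Kuncheva's definition, with a purely illustrative correctness matrix. Majority-vote error decomposes as e_mv = mean individual error − good diversity + bad diversity:

```python
import numpy as np

# Hypothetical 0/1 correctness matrix: rows are examples, columns are
# base classifiers (1 = that classifier got the example right).
correct = np.array([[1, 1, 0],
                    [1, 0, 0],
                    [0, 0, 1],
                    [1, 1, 1],
                    [0, 1, 1]])

def good_bad_diversity(correct):
    """Good diversity: disagreement on examples the majority vote gets
    right. Bad diversity: agreement wasted on examples it gets wrong.
    The identity e_mv = e_ind - good + bad holds by construction."""
    theta = 1.0 - correct.mean(axis=1)       # fraction of wrong votes per example
    mv_correct = correct.mean(axis=1) > 0.5  # majority right? (odd-sized committee)
    good = np.where(mv_correct, theta, 0.0).mean()
    bad = np.where(~mv_correct, 1.0 - theta, 0.0).mean()
    e_mv = (~mv_correct).mean()
    e_ind = theta.mean()
    return good, bad, e_mv, e_ind

good, bad, e_mv, e_ind = good_bad_diversity(correct)
```

Used as optimization objectives, one would search for committees that maximize good diversity while minimizing bad diversity, rather than optimizing accuracy alone.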
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior
Abstract:
An integrated geophysical investigation of the spatial distribution of faults and deformation bands (DBs) in a faulted siliciclastic reservoir analogue, located in the Tucano Basin, Bahia State, northeastern Brazil, is presented. Ground-penetrating radar (GPR) and permeability measurements allowed analysis of the influence of DBs on rock permeability and porosity. The GPR data were processed with a suitable flow parametrization in order to highlight discontinuities in the sedimentary layers. The resulting images allowed the subsurface detection of DBs presenting displacements greater than 10 cm. A good correlation was verified between the DBs detected by GPR and those observed at the surface, the latter identified by conventional structural methods. After some adaptations of the minipermeameter to increase measurement precision, two approaches to measuring permeability were tested: in situ and in collected cores. The former, which consisted of scraping the outcrop surface and then measuring directly on the outcrop rocks, provided better results than the latter. The measured permeability profiles made it possible to characterize the spatial transition from DBs to undeformed rock; variations of up to three orders of magnitude were detected. The permeability profiles also presented quasi-periodic patterns associated with textural and granulometric changes, possibly related to depositional cycles. Integrated interpretation of the geological, geophysical and core data provided the subsurface identification of an increase in the number of DBs associated with a sedimentary layer showing a granulometric decrease at depths greater than 8 m. An associated sharp decrease in permeability was also measured in cores from boreholes. The results reveal that the radargrams, besides providing high-resolution images that allow the detection of small structures (> 10 cm), also correlate with the permeability data.
In this way, GPR data may be used to build upscaling laws, bridging the gap between outcrop and seismic data sets, which may result in better models for faulted reservoirs.