997 results for "Reservas técnicas"

Relevance:

20.00%

Publisher:

Abstract:

A neuro-fuzzy system combines two or more control techniques in a single structure. The main characteristic of this structure is that it joins the best aspects of each technique to build a hybrid controller. Such a controller can be based on fuzzy systems, artificial neural networks, genetic algorithms, or reinforcement learning techniques. Neuro-fuzzy systems have proven to be a promising technique for industrial applications. Two neuro-fuzzy models were developed, an ANFIS model and a NEFCON model. Both models were applied to control a ball-and-beam system, and their results and the required changes are discussed. The choice of controller inputs and the learning algorithms used, among other details of the hybrid systems, are also commented on. The results show the changes in structure after learning and the conditions under which each controller should be used, based on its characteristics.
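
As a minimal illustration of the fuzzy side of such hybrid controllers, the sketch below implements a zero-order Sugeno (TSK) inference step, the kind of computation an ANFIS network tunes during learning; all membership parameters and rules are illustrative assumptions, not values from the work.

```python
import math

def gaussmf(x, c, sigma):
    """Gaussian membership function centered at c."""
    return math.exp(-((x - c) ** 2) / (2 * sigma ** 2))

def sugeno_infer(x, rules):
    """Zero-order Sugeno inference: weighted average of rule consequents."""
    weights = [gaussmf(x, c, s) for (c, s, _) in rules]
    outputs = [y for (_, _, y) in rules]
    total = sum(weights)
    return sum(w * y for w, y in zip(weights, outputs)) / total

# Two toy rules: "if error is negative, output -1"; "if positive, output +1".
rules = [(-1.0, 0.5, -1.0), (1.0, 0.5, 1.0)]
print(sugeno_infer(0.0, rules))   # symmetric rules -> 0.0
```

In ANFIS, the membership centers, widths and consequents above are exactly the parameters adjusted by the learning algorithm.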


According to the Forest Code of 1965, every rural property must set aside part of its land for the establishment of Legal Reserves. When a diagnosis is made across all of Brazil, however, the reality is quite different from what the law demands. This work therefore proposes, as its general objective, ways of establishing Legal Reserves based on the analysis of environmental deterioration in a river basin. For this purpose, environmental deterioration was assessed through three diagnoses: physical-conservational, socioeconomic, and environmental quality. From this quantitative and qualitative diagnosis it was possible to identify the main aggressive factors in the studied river basin and to indicate the main vulnerabilities to which the area is subject. Based on this diagnosis, proposals for the establishment of Legal Reserves are discussed here, supported by scientific arguments aimed at the conservation of water resources, soil, and biodiversity. It is hoped that, through this study, the environment gains a new tool for diagnosis, pollution control, and the recovery and conservation of natural resources.


Visual odometry is the process of estimating camera position and orientation based solely on images and on features (projections of visual landmarks present in the scene) extracted from them. With the advance of computer vision algorithms and computer processing power, the subarea known as Structure from Motion (SFM) began to supply mathematical tools for building localization systems for robotics and augmented reality applications, in contrast to its initial purpose of serving inherently offline solutions aimed at 3D reconstruction and image-based modelling. Accordingly, this work proposes a pipeline for obtaining relative position that uses a previously calibrated camera as a positional sensor and is based entirely on models and algorithms from SFM. Techniques usually applied in camera localization systems, such as Kalman filters and particle filters, are not used, making additional information such as probabilistic models for camera state transition unnecessary. Experiments assessing both the 3D reconstruction quality and the camera positions estimated by the system were performed, in which image sequences captured in realistic scenarios were processed and compared with localization data gathered from a mobile robotic platform.
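
The quantity at the heart of such SFM pipelines is the reprojection error of a 3D point through a calibrated pinhole camera. The sketch below computes it with numpy; the intrinsic matrix and pose are made-up example values, not the calibration used in the work.

```python
import numpy as np

# Example intrinsics: focal length 500 px, principal point (320, 240).
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])

def project(K, R, t, X):
    """Project world point X (3,) to pixel coordinates (2,)."""
    x_cam = R @ X + t          # world -> camera frame
    x_img = K @ x_cam          # camera frame -> homogeneous image coords
    return x_img[:2] / x_img[2]

def reprojection_error(K, R, t, X, observed):
    """Distance between the projected point and the observed pixel."""
    return float(np.linalg.norm(project(K, R, t, X) - observed))

R, t = np.eye(3), np.zeros(3)
X = np.array([0.0, 0.0, 2.0])      # point 2 m straight ahead of the camera
print(project(K, R, t, X))         # -> [320. 240.] (the principal point)
```

SFM estimates R, t and the 3D points by minimizing this error summed over all observations (bundle adjustment).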


In Simultaneous Localization and Mapping (SLAM), a robot placed at an unknown location in an arbitrary environment must be able to build a representation of that environment (a map) and locate itself within it simultaneously, using only information captured by its own sensors and known control signals. Recently, driven by advances in computing power, work in this area has proposed using a video camera as the sensor, giving rise to Visual SLAM. It admits several approaches, the vast majority of which work by extracting features from the environment, computing the necessary correspondences, and estimating the required parameters from them. This work presents a monocular visual SLAM system that uses direct image registration to compute the image reprojection error, together with optimization methods that minimize this error, thus obtaining the robot pose and the environment map directly from the pixels of the images. The feature extraction and matching steps are therefore unnecessary, enabling the system to work well in environments where traditional approaches have difficulty. Moreover, addressing SLAM as proposed in this work avoids a problem that is very common in traditional approaches, known as error propagation. To deal with the high computational cost of this approach, several types of optimization methods were tested in order to find a good balance between estimate quality and processing time. The results presented in this work show the success of the system in different environments.
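
The idea of direct registration can be illustrated by comparing pixel intensities under a candidate warp instead of matching features. The toy sketch below uses a pure horizontal translation as the warp and shows that the photometric error is minimal at the true shift; the images and the warp model are illustrative assumptions.

```python
import numpy as np

def photometric_error(ref, cur, dx):
    """Sum of squared intensity differences for an integer horizontal shift."""
    h, w = ref.shape
    overlap_ref = ref[:, dx:]          # region visible in both images
    overlap_cur = cur[:, :w - dx]
    return float(np.sum((overlap_ref - overlap_cur) ** 2))

ref = np.tile(np.arange(8.0), (8, 1))   # horizontal intensity ramp
cur = ref + 1.0                          # the same ramp, shifted one pixel left

# A direct method recovers the motion by minimizing the error over warps.
errors = {s: photometric_error(ref, cur, s) for s in range(4)}
best = min(errors, key=errors.get)
print(best)   # -> 1 (the true shift, where the error is exactly zero)
```

In the actual system the warp is a full camera pose and the minimization is done with continuous optimization rather than exhaustive search.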


The introduction of new digital services into cellular networks, at ever higher transmission rates, has stimulated recent research into ways of increasing data communication capacity and reducing delays in the forward and reverse links of third-generation WCDMA systems. These studies have resulted in new standards, known as 3.5G, published by the 3GPP group for the evolution of third-generation cellular systems. In this Master's thesis, the performance of a 3G WCDMA system with several base stations and thousands of users is analyzed with the aid of the planning tool NPSW. Moreover, the performance of the 3.5G techniques of hybrid automatic retransmission and multi-user detection with interference cancellation, candidates for enhancing WCDMA uplink capacity, is verified by means of Matlab simulations of the increase in data communication capacity and the reduction of delays in the retransmission of information packets.
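
At the logical level, hybrid automatic retransmission behaves like a stop-and-wait loop that retries a packet until its CRC check passes or a retry limit is reached (soft bits from failed attempts being combined at the receiver). The sketch below captures only that control flow; the deterministic channel outcomes are an illustrative stand-in for the simulated WCDMA uplink.

```python
def harq_send(channel_outcomes, max_retx=3):
    """Stop-and-wait HARQ: return (delivered, transmissions used).

    channel_outcomes: per-attempt CRC results, True meaning the packet
    (after combining with earlier attempts) decodes correctly.
    """
    for attempt, crc_ok in enumerate(channel_outcomes[: max_retx + 1], start=1):
        if crc_ok:                    # receiver acknowledges
            return True, attempt
    return False, max_retx + 1        # retry budget exhausted

delivered, tx = harq_send([False, False, True])
print(delivered, tx)   # -> True 3
```

Combining retransmissions raises the probability that crc_ok becomes True on later attempts, which is the capacity/delay gain the thesis quantifies.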


One of the most important goals of bioinformatics is the ability to identify genes in uncharacterized DNA sequences in worldwide databases. Gene expression in prokaryotes starts when the RNA-polymerase enzyme interacts with DNA regions called promoters, where the main regulatory elements of the transcription process are located. Despite the improvement of in vitro techniques for molecular biology analysis, characterizing and identifying a great number of promoters in a genome is a complex task. The main drawback is the absence of a large set of promoters from which to identify conserved patterns among species; hence, an in silico method to predict promoters in any species is a challenge. Improved promoter prediction methods can be one step towards developing more reliable ab initio gene prediction methods. In this work, we present an empirical comparison of Machine Learning (ML) techniques, such as Naïve Bayes, Decision Trees, Support Vector Machines, Neural Networks, Voted Perceptron, PART, k-NN, and ensemble approaches (Bagging and Boosting), on the task of predicting promoters of Bacillus subtilis. To do so, we first built two data sets of promoter and non-promoter sequences: one for B. subtilis and a hybrid one. To evaluate the ML methods, a cross-validation procedure was applied. Good results were obtained with ML methods such as SVM and Naïve Bayes on the B. subtilis data set; however, we did not reach good results on the hybrid database.
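
A pared-down version of one of the compared techniques, a Naïve Bayes style likelihood-ratio classifier over dinucleotide counts, can be sketched as follows; the six training sequences are fabricated toy motifs, not the B. subtilis data sets described above.

```python
from collections import Counter

def kmers(seq, k=2):
    """All overlapping k-mers (here dinucleotides) of a sequence."""
    return [seq[i:i + k] for i in range(len(seq) - k + 1)]

def train(seqs):
    """Return a smoothed k-mer probability function for one class."""
    counts = Counter(m for s in seqs for m in kmers(s))
    total = sum(counts.values())
    vocab = 16  # 4^2 possible dinucleotides, for Laplace smoothing
    return lambda m: (counts[m] + 1) / (total + vocab)

def classify(seq, model_pos, model_neg):
    """Naive Bayes decision via the product of per-k-mer likelihood ratios."""
    score = 1.0
    for m in kmers(seq):
        score *= model_pos(m) / model_neg(m)
    return "promoter" if score > 1.0 else "non-promoter"

promoters = ["TATAAT", "TTTAAT", "TATATT"]   # -10 box-like toy motifs
background = ["GCGCGC", "CGGCCG", "GCCGGC"]  # GC-rich toy background
model_p, model_n = train(promoters), train(background)
print(classify("TATAAT", model_p, model_n))  # -> promoter
```

Real experiments would replace the toy sequences with the curated promoter/non-promoter sets and wrap this in cross-validation.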


Nowadays, classifying proteins into structural classes, which concerns the inference of patterns in their 3D conformation, is one of the most important open problems in Molecular Biology. The main reason is that the function of a protein is intrinsically related to its spatial conformation; however, such conformations are very difficult to obtain experimentally in the laboratory, so the problem has drawn the attention of many researchers in Bioinformatics. Considering the great difference between the number of protein sequences already known and the number of three-dimensional structures determined experimentally, the demand for automated techniques for the structural classification of proteins is very high. In this context, computational tools, especially Machine Learning (ML) techniques, have become essential to deal with this problem. In this work, ML techniques are used in the recognition of protein structural classes: Decision Trees, k-Nearest Neighbors, Naive Bayes, Support Vector Machines and Neural Networks. These methods were chosen because they represent different paradigms of learning and have been widely used in the Bioinformatics literature. Aiming to improve the performance of these techniques (individual classifiers), homogeneous (Bagging and Boosting) and heterogeneous (Voting, Stacking and StackingC) multi-classification systems are used. Moreover, since the protein database used in this work presents the problem of imbalanced classes, artificial techniques for class balancing (Random Undersampling, Tomek Links, CNN, NCL and OSS) are used to minimize this problem. To evaluate the ML methods, a cross-validation procedure is applied in which the accuracy of the classifiers is measured as the mean classification error rate on independent test sets. These means are compared, two by two, by a hypothesis test, to assess whether there is a statistically significant difference between them.
With respect to the results obtained with the individual classifiers, the Support Vector Machine presented the best accuracy. The multi-classification systems (homogeneous and heterogeneous) showed, in general, performance superior or similar to that achieved by the individual classifiers, especially Boosting with Decision Trees and StackingC with Linear Regression as the meta-classifier. The Voting method, despite its simplicity, proved adequate for solving the problem addressed in this work. The techniques for class balancing, on the other hand, did not produce a significant improvement in the global classification error; nevertheless, they did improve the classification error for the minority class. In this context, the NCL technique proved the most appropriate.
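
One of the balancing techniques mentioned, random undersampling, simply discards majority-class examples until the classes have equal size. A minimal sketch, with toy labels standing in for the protein data set:

```python
import random

def undersample(data, seed=0):
    """data: list of (features, label) pairs; return a class-balanced copy.

    Every class is randomly subsampled down to the size of the smallest
    class, which is what Random Undersampling does.
    """
    rng = random.Random(seed)
    by_class = {}
    for item in data:
        by_class.setdefault(item[1], []).append(item)
    n_min = min(len(items) for items in by_class.values())
    balanced = []
    for items in by_class.values():
        balanced.extend(rng.sample(items, n_min))
    return balanced

# Toy imbalanced set: 90 majority vs 10 minority examples.
data = ([((i,), "majority") for i in range(90)]
        + [((i,), "minority") for i in range(10)])
balanced = undersample(data)
print(len(balanced))   # -> 20 (10 per class)
```

Tomek Links, CNN, NCL and OSS refine the same idea by choosing *which* majority examples to discard instead of sampling at random.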


The stability of synchronous generators connected to the power grid has been the object of study and research for years. Interest in this matter is justified by the fact that much of the electricity produced worldwide is obtained using synchronous generators. In this respect, studies have proposed conventional and unconventional control techniques, such as fuzzy logic, neural networks, and adaptive controllers, to increase the stability margin of the system during sudden failures and transient disturbances. This Master's thesis presents a robust unconventional control strategy for maintaining the stability of power systems and regulating the output voltage of synchronous generators connected to the grid. The proposed control strategy comprises the integration of a sliding surface with a linear controller. This control structure is designed to prevent the power system from losing synchronism after a sudden failure and to regulate the terminal voltage of the generator after the fault. The feasibility of the proposed control strategy was experimentally tested on a 5 kVA salient-pole synchronous generator in a laboratory setup.
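
The sliding-surface idea can be sketched on a double-integrator stand-in (not the 5 kVA generator model): define the surface s = ė + λe and apply a switching law that drives s, and then the error e, to zero. The gains and plant below are illustrative assumptions.

```python
import math

def simulate(x0, v0, lam=2.0, k=5.0, dt=1e-3, steps=5000):
    """Sliding-mode control of a double integrator x_ddot = u."""
    x, v = x0, v0                    # tracking error and its derivative
    for _ in range(steps):
        s = v + lam * x              # sliding surface s = e_dot + lam*e
        # Equivalent control (-lam*v) plus discontinuous switching term:
        u = -lam * v - k * math.copysign(1.0, s)
        v += u * dt                  # semi-implicit Euler integration
        x += v * dt
    return x, v

x, v = simulate(1.0, 0.0)
# On the surface s = 0, the error obeys x_dot = -lam*x and decays to zero
# despite the crude switching, at the cost of small chattering.
print(abs(x) < 0.05, abs(v) < 0.2)
```

The thesis combines such a sliding surface with a linear controller and, of course, a far richer generator model; this sketch only shows the reaching-then-sliding mechanism.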


Self-organizing maps (SOM) are artificial neural networks widely used in the data mining field, mainly because they constitute a dimensionality reduction technique, given the fixed grid of neurons associated with the network. In order to properly partition and visualize the SOM network, the various methods available in the literature must be applied in a post-processing stage, which consists of inferring, through the neurons, relevant characteristics of the data set. In general, applying such processing to the network neurons, instead of to the entire database, reduces the computational cost thanks to vector quantization. This work proposes a post-processing of the SOM neurons in the input and output spaces, combining visualization techniques with algorithms based on gravitational forces and on the search for the shortest path with the greatest reward. These methods take into account the connection strength between neighbouring neurons and characteristics of pattern density and inter-neuron distances, both associated with the positions the neurons occupy in the data space after training the network. The goal is thus to define more clearly the arrangement of the clusters present in the data. Experiments were carried out to evaluate the proposed methods on various artificially generated data sets, as well as on real-world data sets. The results obtained were compared with those of a number of well-known methods from the literature.
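
For reference, the training stage whose neurons the proposed post-processing operates on can be sketched as a minimal one-dimensional SOM; the grid size, learning-rate and radius schedules, and the data are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.random((200, 2))        # 2-D input patterns
weights = rng.random((10, 2))      # 10 neurons on a 1-D grid

def train_som(weights, data, epochs=20, lr=0.5, radius=3.0):
    """Classic SOM loop: find the BMU, pull its grid neighbourhood toward x."""
    w = weights.copy()
    grid = np.arange(len(w))
    for _ in range(epochs):
        for x in data:
            bmu = int(np.argmin(np.linalg.norm(w - x, axis=1)))
            h = np.exp(-((grid - bmu) ** 2) / (2 * radius ** 2))
            w += lr * h[:, None] * (x - w)
        lr *= 0.9                          # decay learning rate
        radius = max(radius * 0.85, 0.5)   # shrink the neighbourhood
    return w

trained = train_som(weights, data)

def quantization_error(w, data):
    """Mean distance from each pattern to its nearest neuron."""
    return float(np.mean([np.min(np.linalg.norm(w - x, axis=1)) for x in data]))

print(quantization_error(trained, data) < quantization_error(weights, data))
```

It is this trained weight matrix, far smaller than the database, that the gravitational and shortest-path post-processing methods then analyze.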


The objective of this work was to evaluate the development of submerged vegetation, in terms of canopy height, considering the dimensions of space and time and using hydroacoustic techniques. Ten field surveys were carried out from October 2009 to December 2010 to acquire georeferenced points of canopy height and frequency of vegetation occurrence, as well as depth. Limnological measurements were also taken in order to verify whether their variations could explain the spatial distribution of the macrophytes. The vegetation data were analyzed by survey and by depth and, in addition, composed a database implemented in a Geographic Information System. They were then interpolated and, from the resulting surfaces, maps were generated indicating the spatial distribution of vegetation growth or decline. Three-dimensional models of the canopies were produced to represent the volumetric occupation of the submerged macrophytes. The results showed a significant reduction in infestation from one year to the next. It was also observed that the tallest canopies are concentrated at depths of 2 to 4 m. The mapping identified both growth and decline areas, distributed heterogeneously. It was not possible to observe a direct relationship between the limnological measurements and the vegetation dynamics, since they did not show significant spatio-temporal variation. It was possible to estimate the volume occupied by the submerged macrophytes, and the observed tendency is that an increase in volume is preceded by an apparent homogenization of the canopies.


A project was conducted to evaluate treatment techniques for the oily effluent derived from cashew processing. The following techniques were evaluated: advanced wet oxidation processes, oxidative processes, biological treatment processes, and adsorption processes. The assays were carried out on kinetic models, with the quality of each process assessed by determination of the chemical oxygen demand (COD), defined as the control metric for a comparative study of the available techniques. The results demonstrated that natural biodegradation of the effluent is limited: using the natural flora present in the effluent proved impracticable for application in industrial systems, regardless of the evaluation environment (with or without the presence of oxygen). The use of specific microorganisms for degrading the oily compounds made this route technically viable, with highly satisfactory levels of inclusion in the effluent treatment system of the cashew processing plant and reasonable levels of COD removal. However, the combined use of pre-treatment techniques for these effluents proved even more efficient in the context of treating the effluent and discharging it into receiving bodies within the standards of CONAMA Resolution 357/2005. Despite the significant generation of solid residues, the adsorption process with agro-industrial residues (in particular chitosan) is a technically viable alternative; however, when applied only to treat the effluent for discharge into water bodies, its economic viability is compromised and the environmental gains are minimized. It was shown, though, that if used for reuse purposes, the viability is balanced and justifies the investment.
The photochemical processes applicable to the treatment of these effluents were also studied, with results more satisfactory than those obtained with the UV-peroxide techniques. A result different from the expected one was obtained with the catalysts used in the photochemical process: catalysts based on mixed cerium and manganese oxide, with incorporated potassium promoters, presented the best results in the decomposition of the pollutants involved. When the photochemical pre-treatment results were combined with subsequent chlorine disinfection, characteristics close to water potability were guaranteed. Wet oxidation presented significant results in the removal of pollutants; however, its high cost makes it feasible only for reuse projects, in areas of water scarcity and of high water abstraction/acquisition costs, in particular for industrial and potable use. The route with the best economic and technical conditions for treating the effluents of cashew processing comprises the following sequence: a conventional water-oil separation process, a photochemical process and, finally, complementary biological treatment.


Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)


Water still represents, in its critical properties and phase transitions, a problem of current scientific interest, as a consequence of the countless open questions and of the inadequacy of existing theoretical models, mainly concerning the different solid and liquid phases this substance possesses. For example, there are 13 known crystalline forms of water, as well as amorphous phases; one of them, very-high-density amorphous ice (VHDA), was only recently observed. Another example is the anomalous behavior of the macroscopic density, which presents a maximum at a temperature of 277 K. In order to investigate experimentally the behavior of one of the liquid-solid phase transitions, the density anomaly, and also metastability, we used three different cooling techniques and, as comparison systems, the solvents acetone and ethyl alcohol. The first cooling system studied employs a Peltier plate, a recently developed device that uses small semiconductor cubes to exchange heat between two surfaces; the second system is a commercial refrigerator, similar to residential ones; finally, the liquid nitrogen technique is used to refrigerate the samples in a container in two ways, one very fast and the other almost static. In these three systems, three aluminum beakers (80 ml each) were used, containing water, alcohol and acetone; they were closed and kept at atmospheric pressure. Inside each beaker, three thermocouples were installed along its vertical axis: one close to the bottom surface, one at mid-level, and one close to the top surface. A data acquisition system was built via virtual instrumentation, using a data-acquisition board as the central equipment. Temperature data were collected by the three thermocouples in the three beakers simultaneously, as a function of freezing time.
We present the behavior of temperature versus freezing time for the three substances. The results show the characterization of the transitions of the liquid


This dissertation briefly presents random graphs and the main quantities calculated from them. At the same time, basic thermodynamic quantities, such as energy and temperature, are associated with some of their characteristics. Approaches commonly used in Statistical Mechanics are employed, and rules that describe a time evolution for the graphs are proposed in order to study their ergodicity and a possible thermal equilibrium between them.
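
One possible reading of such a time-evolution rule, not necessarily the one proposed in the dissertation, is a Metropolis-like dynamics in which the edge count of the graph plays the role of energy at inverse temperature beta:

```python
import math
import random

def metropolis_step(edges, all_pairs, beta, rng):
    """Propose toggling one vertex pair; accept via the Boltzmann factor.

    Energy is taken to be the number of edges (an illustrative choice),
    so adding an edge costs +1 and removing one gains -1.
    """
    pair = rng.choice(all_pairs)
    delta = -1 if pair in edges else 1        # energy change if toggled
    if delta < 0 or rng.random() < math.exp(-beta * delta):
        edges.symmetric_difference_update({pair})

n = 10
all_pairs = [(i, j) for i in range(n) for j in range(i + 1, n)]
rng = random.Random(42)
edges = set()
for _ in range(2000):
    metropolis_step(edges, all_pairs, beta=1.0, rng=rng)

# In equilibrium each edge is present independently with probability
# 1 / (1 + e^beta), i.e. the stationary ensemble is Erdos-Renyi.
print(len(edges))
```

Because every state can reach every other by single-edge toggles and detailed balance holds, this chain is ergodic with a well-defined equilibrium, the two properties the dissertation studies.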