12 results for Multilinear polynomial
at Universidade Federal do Rio Grande do Norte (UFRN)
Abstract:
Mimosa caesalpiniaefolia Benth. is a forest species of the Mimosaceae family recommended for the recovery of degraded areas. The evaluation of vigor by biochemical tests has become an important tool in seed quality control programs, with electrical conductivity and potassium leaching among the most efficient tests for verifying physiological potential. The objective, therefore, was to adjust the methodology of the electrical conductivity test for seeds of M. caesalpiniaefolia, and then to compare the efficiency of this test with that of the potassium leaching test in evaluating the vigor of different lots of M. caesalpiniaefolia seeds. To adjust the electrical conductivity test, different combinations of temperatures (25 °C and 30 °C), numbers of seeds (25 and 50), imbibition periods (4, 8, 12, 16 and 24 hours), and volumes of deionized water (50 mL and 75 mL) were used. The potassium leaching test was conducted using the methodology established for the electrical conductivity test, in order to compare the efficiency of both tests in classifying seeds at different vigor levels; a 4-hour period was also evaluated, since the potassium leaching test can be more efficient at shorter times. The best combination obtained in the electrical conductivity experiment was 25 seeds soaked in 50 mL of deionized or distilled water for 8 hours at a temperature of 30 °C. Data were subjected to analysis of variance, means were compared by F and Tukey tests at 5% probability, and polynomial regression analysis was performed when necessary. The electrical conductivity test performed over an eight-hour period proved more efficient than the potassium leaching test in separating lots of M. caesalpiniaefolia seeds into different vigor levels
Abstract:
Telecommunications play a key role in contemporary society. As new technologies are put into the market, demand also grows for new products and services that depend on the offered infrastructure, making telecommunications network planning problems, despite technological advances, increasingly large and complex. Many of these problems, however, can be formulated as combinatorial optimization models, and heuristic algorithms can help solve them in the planning phase. In this project, two pure metaheuristic implementations, a Genetic Algorithm (GA) and a Memetic Algorithm (MA), plus a third, hybrid implementation, a Memetic Algorithm with Vocabulary Building (MA+VB), were developed for a telecommunications problem known in the literature as the SONET Ring Assignment Problem (SRAP). The SRAP arises during the planning stage of the physical network and consists in selecting connections between a number of locations (customers) so as to satisfy a series of constraints at the lowest possible cost. This problem is NP-hard, so efficient (polynomial-time) exact algorithms are not known and may, in fact, not even exist
Abstract:
The use of maps obtained from remote-sensing orbital images subjected to digital processing has become fundamental to optimizing conservation and monitoring of coral reefs. However, the accuracy achieved in mapping submerged areas is limited by variation in the water column, which degrades the signal received by the orbital sensor and introduces errors into the final classification. The limited capacity of traditional methods based on conventional statistical techniques to solve problems of inter-class confusion motivated the search for alternative strategies in the area of Computational Intelligence. In this work, an ensemble of classifiers was built based on the combination of Support Vector Machines and a Minimum Distance Classifier, with the objective of classifying remotely sensed images of a coral reef ecosystem. The system is composed of three stages, through which the classification is progressively refined: patterns that receive an ambiguous classification at one stage of the process are re-evaluated at the subsequent stage. An unambiguous prediction for all the data is reached through the reduction or elimination of false positives. The images were classified into five bottom types: deep water, underwater corals, intertidal corals, algal bottom and sandy bottom. The highest overall accuracy (89%) was obtained with an SVM with a polynomial kernel. Using an error matrix, the accuracy of the classified image was compared with the results obtained by other classification methods based on a single classifier (a neural network and the k-means algorithm). In the end, the comparison of results demonstrated the potential of ensembles of classifiers as a tool for classifying images of submerged areas subject to noise caused by atmospheric effects and the water column
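One of the two ensemble members, the Minimum Distance Classifier, is simple enough to sketch directly: each class is summarized by the mean of its training vectors, and a pixel is assigned to the nearest class mean. The two-band reflectance values below are hypothetical stand-ins for the multispectral pixel values the thesis classifies.

```python
# Minimum Distance Classifier sketch (one stage of the ensemble described
# above). Feature vectors and class names below are illustrative.
import math

def centroids(samples):
    """Mean feature vector per class, from labelled training samples."""
    sums, counts = {}, {}
    for label, vec in samples:
        counts[label] = counts.get(label, 0) + 1
        acc = sums.setdefault(label, [0.0] * len(vec))
        for i, v in enumerate(vec):
            acc[i] += v
    return {lbl: [s / counts[lbl] for s in acc] for lbl, acc in sums.items()}

def classify(vec, cents):
    """Assign vec to the class whose centroid is nearest (Euclidean)."""
    return min(cents, key=lambda lbl: math.dist(vec, cents[lbl]))

# Hypothetical 2-band reflectance samples for two of the five bottom types
train = [
    ("deep water",   [0.05, 0.10]), ("deep water",   [0.07, 0.12]),
    ("sandy bottom", [0.60, 0.55]), ("sandy bottom", [0.58, 0.52]),
]
cents = centroids(train)
print(classify([0.06, 0.11], cents))   # → deep water
print(classify([0.59, 0.54], cents))   # → sandy bottom
```

In the ensemble, a pattern whose two nearest centroids are almost equidistant would be the kind of "ambiguous" case passed on to the next stage.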
Abstract:
Combinatorial optimization problems have engaged a large number of researchers in the search for approximate solutions, since it is generally accepted that these problems cannot be solved in polynomial time. Initially, such solutions were based on heuristics; currently, metaheuristics are used more often, especially those based on evolutionary algorithms. The two main contributions of this work are: the creation of an "Operon" heuristic for constructing the information chains necessary to implement transgenetic (evolutionary) algorithms, mainly using statistical methodology (Cluster Analysis and Principal Component Analysis); and the use of statistical analyses adequate for evaluating the performance of the algorithms developed to solve these problems. The aim of the Operon is to construct good-quality, dynamic information chains that promote an "intelligent" search of the solution space. The Traveling Salesman Problem (TSP) is the intended application, using a transgenetic algorithm known as ProtoG. A strategy is also proposed for renewing part of the chromosome population, triggered when the coefficient of variation of the fitness function of the individuals, computed over the population, falls below a minimum threshold. Statistical methodology is used to evaluate the performance of four algorithms: the proposed ProtoG, two memetic algorithms, and a Simulated Annealing algorithm. Three performance analyses are proposed. The first uses Logistic Regression, based on the probability of the tested algorithm finding an optimal solution for a TSP instance. The second uses Survival Analysis, based on the probability distribution of the execution time observed until an optimal solution is reached.
The third uses a non-parametric Analysis of Variance, considering the Percent Error of the Solution (PES), the percentage by which the solution found exceeds the best solution available in the literature. Six experiments were conducted on sixty-one instances of the Euclidean TSP with sizes of up to 1,655 cities. The first two experiments deal with the adjustment of four parameters used in the ProtoG algorithm in an attempt to improve its performance. The last four evaluate the performance of ProtoG in comparison with the three algorithms adopted. For these sixty-one instances, statistical tests provide evidence that ProtoG performs better than the other three algorithms in fifty instances. In addition, for the thirty-six instances considered in the last three trials, in which performance was evaluated through the PES, the average PES obtained with ProtoG was less than 1% for almost half of the instances, reaching its greatest average, 3.52%, for one instance of 1,173 cities. ProtoG can therefore be considered a competitive algorithm for solving the TSP, since it is not rare to find average PES values greater than 10% reported in the literature for instances of this size.
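The PES metric used in the third analysis is a one-line relative-error computation; the tour lengths in the example are hypothetical, not values from the thesis.

```python
# Percent Error of the Solution (PES): percentage by which a found tour
# length exceeds the best solution known in the literature.

def pes(found, best_known):
    return 100.0 * (found - best_known) / best_known

# Hypothetical tour lengths for one TSP instance
print(pes(21035.2, 20750.0))   # ≈ 1.37 (% above the best known tour)
```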
Abstract:
This work presents a modelling and identification method for a wheeled mobile robot, including the actuator dynamics. Instead of the classic modelling approach, in which the robot position coordinates (x, y) are used as state variables (resulting in a nonlinear model), the proposed discrete model is based on the travelled-distance increment Delta_l. The resulting model is thus linear and time-invariant, and it can be identified through classical methods such as Recursive Least Squares. This approach has one difficulty: Delta_l cannot be measured directly. Here, this problem is solved using an estimate of Delta_l based on a second-order polynomial approximation. Experimental data were collected and the proposed method was used to identify the model of a real robot
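The payoff of the linear, time-invariant formulation is that standard Recursive Least Squares applies directly. The sketch below identifies a first-order model y[k] = a*y[k-1] + b*u[k-1] from simulated data; the model order, the "true" parameters (a=0.8, b=0.5) and the noiseless simulation are assumptions for illustration, not the robot model from the paper.

```python
# Recursive Least Squares sketch for a first-order linear model
# y[k] = a*y[k-1] + b*u[k-1]  (illustrative; the paper identifies a real
# robot whose linearity comes from using the increment Delta_l).
import random

def rls_identify(u, y, lam=1.0):
    """Estimate (a, b) recursively; lam is the forgetting factor."""
    theta = [0.0, 0.0]                       # parameter estimates [a, b]
    P = [[1000.0, 0.0], [0.0, 1000.0]]       # large initial covariance
    for k in range(1, len(y)):
        phi = [y[k - 1], u[k - 1]]           # regressor vector
        # gain K = P*phi / (lam + phi'*P*phi)
        Pphi = [P[0][0] * phi[0] + P[0][1] * phi[1],
                P[1][0] * phi[0] + P[1][1] * phi[1]]
        denom = lam + phi[0] * Pphi[0] + phi[1] * Pphi[1]
        K = [Pphi[0] / denom, Pphi[1] / denom]
        err = y[k] - (theta[0] * phi[0] + theta[1] * phi[1])
        theta = [theta[0] + K[0] * err, theta[1] + K[1] * err]
        # covariance update: P = (P - K*phi'*P) / lam
        P = [[(P[i][j] - K[i] * Pphi[j]) / lam for j in range(2)]
             for i in range(2)]
    return theta

# Simulate the "true" system a=0.8, b=0.5 and recover its parameters
random.seed(0)
u = [random.uniform(-1, 1) for _ in range(200)]
y = [0.0]
for k in range(1, 200):
    y.append(0.8 * y[k - 1] + 0.5 * u[k - 1])
a, b = rls_identify(u, y)
print(round(a, 3), round(b, 3))  # → 0.8 0.5
```

With a forgetting factor lam < 1 the same loop tracks slowly varying parameters, which is why RLS suits on-line identification.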
Abstract:
This paper presents an evaluative study of the effects of using a machine learning technique on the main features of a self-organizing, multiobjective genetic algorithm (GA). A typical GA can be seen as a search technique usually applied to problems of non-polynomial complexity. Originally, these algorithms were designed to provide acceptable solutions to problems in which the global optimum is inaccessible or difficult to obtain. At first, GAs considered only one evaluation function and single-objective optimization. Today, however, implementations that consider several optimization objectives simultaneously (multiobjective algorithms) are common, as are implementations that dynamically change many components of the algorithm (self-organizing algorithms). Combinations of GAs with machine learning techniques to improve their performance and usability are also common. In this work, a GA combined with a machine learning technique was analyzed and applied to antenna design. A variant of the bicubic interpolation technique, called 2D Spline, was used as the machine learning technique to estimate the behavior of a dynamic fitness function, based on knowledge obtained from a set of laboratory experiments. This fitness function, also called the evaluation function, is responsible for determining the fitness of a candidate solution (individual) relative to the others in the same population. The algorithm can be applied in many areas, including telecommunications, for example in the design of antennas and frequency-selective surfaces. In this particular work, the algorithm was developed to optimize the design of a microstrip antenna of the kind used in wireless communication systems for Ultra-Wideband (UWB) applications.
The algorithm optimized two variables of the antenna geometry, the length (Ls) and width (Ws) of a slit in the ground plane, with respect to three objectives: radiated signal bandwidth, return loss, and central frequency deviation. These two dimensions (Ws and Ls) are used as variables in three different interpolation functions, one Spline per optimization objective, which are combined into a multiobjective, aggregate fitness function. The final result proposed by the algorithm was compared with the simulation program result and with measurements of a physical prototype of the antenna built in the laboratory. The algorithm was analyzed with respect to its degree of success in four important characteristics of a self-organizing multiobjective GA: performance, flexibility, scalability and accuracy. An increase in execution time was observed in comparison with a common GA, due to the time required by the machine learning process. On the plus side, there was a noticeable gain in flexibility and accuracy of the results, and a promising path that indicates how to extend the algorithm to optimization problems with n variables
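The core idea, replacing expensive lab measurements with an interpolant over the (Ws, Ls) grid, can be sketched with plain bilinear interpolation. The thesis uses a bicubic 2D Spline, so this is a deliberately simplified stand-in, and every grid value below is made up.

```python
# Surrogate-fitness sketch: interpolate a measured objective over a (Ws, Ls)
# grid. The thesis uses a bicubic 2D Spline; this bilinear version only
# illustrates the idea. All measurements below are hypothetical.

def bracket(v, vs):
    """Index i with vs[i] <= v <= vs[i+1] (vs ascending; clamps at edges)."""
    i = 0
    while i < len(vs) - 2 and vs[i + 1] < v:
        i += 1
    return i

def bilinear(xs, ys, grid, x, y):
    """Interpolate grid[i][j] = f(xs[i], ys[j]) at an arbitrary (x, y)."""
    i, j = bracket(x, xs), bracket(y, ys)
    tx = (x - xs[i]) / (xs[i + 1] - xs[i])
    ty = (y - ys[j]) / (ys[j + 1] - ys[j])
    top = grid[i][j] * (1 - tx) + grid[i + 1][j] * tx
    bot = grid[i][j + 1] * (1 - tx) + grid[i + 1][j + 1] * tx
    return top * (1 - ty) + bot * ty

# Hypothetical return-loss measurements (dB) on a 3x3 grid of slit sizes (mm)
Ws = [2.0, 4.0, 6.0]
Ls = [5.0, 10.0, 15.0]
loss = [[-8, -12, -9],      # loss[i][j] = value measured at (Ws[i], Ls[j])
        [-11, -20, -13],
        [-9, -14, -10]]
print(bilinear(Ws, Ls, loss, 3.0, 7.5))   # → -12.75
```

One such interpolant per objective, aggregated into a single score, gives the multiobjective fitness the GA evaluates without returning to the laboratory for each candidate (Ws, Ls).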
Abstract:
The modelling of industrial processes has aided production and cost minimization, allowing the prediction of future system behavior, process supervision, and controller design. Given the benefits provided by modelling, the first objective of this dissertation is to present a methodology for identifying nonlinear models with NARX structure, through the implementation of combined algorithms for structure detection and parameter estimation. First, the importance of system identification in the optimization of industrial processes is highlighted, specifically the choice of a model that adequately represents the system dynamics. Next, a brief review of the steps that make up system identification is presented. Then the fundamental methods for model structure detection (Modified Gram-Schmidt) and parameter estimation (Least Squares and Extended Least Squares) are presented. Using the implemented algorithms, two distinct industrial processes are identified: a didactic level plant, which allows level and flow control, and a simulated primary petroleum-processing plant, intended to represent the primary treatment of petroleum that takes place on oil platforms. The dissertation concludes with an evaluation of the performance of the obtained models in comparison with the real systems. From this evaluation, it is possible to observe whether the identified models are capable of representing the static and dynamic characteristics of the systems presented in this dissertation
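The structure-detection step named above can be sketched as forward selection by Error Reduction Ratio (ERR) with Modified Gram-Schmidt deflation: each round picks the candidate regressor explaining the most output variance, then removes its direction from the remaining candidates. The toy system and candidate set below are assumptions for illustration, not the plants from the dissertation.

```python
# Forward structure selection by Error Reduction Ratio (ERR) with a
# Modified Gram-Schmidt deflation step (illustrative data only).
import random

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def select_terms(candidates, y, n_terms):
    """Return the names of the n_terms candidates with the highest ERR."""
    work = {name: col[:] for name, col in candidates.items()}
    chosen = []
    for _ in range(n_terms):
        # ERR of w: fraction of ||y||^2 explained by projecting y onto w
        err = {n: dot(w, y) ** 2 / (dot(w, w) * dot(y, y))
               for n, w in work.items()}
        best = max(err, key=err.get)
        w_best = work.pop(best)
        chosen.append(best)
        # Modified Gram-Schmidt: deflate w_best's direction from the rest
        for n, w in work.items():
            g = dot(w_best, w) / dot(w_best, w_best)
            work[n] = [wi - g * bi for wi, bi in zip(w, w_best)]
    return chosen

# Toy nonlinear system: y = 2*u1 + 0.5*u1*u2  (u3 is irrelevant)
random.seed(42)
u1 = [random.uniform(-1, 1) for _ in range(100)]
u2 = [random.uniform(-1, 1) for _ in range(100)]
u3 = [random.uniform(-1, 1) for _ in range(100)]
y = [2 * a + 0.5 * a * b for a, b in zip(u1, u2)]
cands = {"u1": u1, "u2": u2, "u3": u3,
         "u1*u2": [a * b for a, b in zip(u1, u2)]}
print(select_terms(cands, y, 2))   # the true terms u1 and u1*u2 dominate
```

Once the structure is fixed, the selected regressors go to (Extended) Least Squares for parameter estimation, the second half of the combined algorithm the dissertation implements.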
Abstract:
Information retrieval is of paramount importance in all areas of knowledge. Here, the temperatures of Natal were simulated and analyzed, making it possible to recover, with some accuracy, the temperatures of days on which they were not collected. For this purpose, software was built that displays the temperature value at each moment in the city. The program was developed in Delphi using third-degree interpolating polynomial functions. The equations were obtained in Excel, and the data were collected at the Instituto Nacional de Pesquisas Espaciais (INPE). These functions were adjusted by a correction factor in order to provide values for the temperatures between those that were collected. With this program it is possible to build tables and charts to analyze the temperatures over given periods of time. The same analysis was done by developing mathematical functions that describe the temperatures. With the data provided by this software it is possible to say which are the hours of highest and lowest temperature in the city, and which months have the highest and lowest temperature indexes.
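A third-degree interpolating polynomial through four collected readings is the core mechanism; as a minimal sketch, Lagrange's form gives the value at any intermediate hour. The temperatures below are illustrative, not INPE data for Natal.

```python
# Third-degree interpolating polynomial through 4 collected readings,
# in Lagrange form (temperatures below are hypothetical, not INPE data).

def lagrange_cubic(points, x):
    """Value at x of the degree-3 Lagrange polynomial through 4 points."""
    total = 0.0
    for i, (xi, yi) in enumerate(points):
        term = yi
        for j, (xj, _) in enumerate(points):
            if i != j:
                term *= (x - xj) / (xi - xj)   # Lagrange basis factor
        total += term
    return total

# Hypothetical temperatures (°C) collected at 4 hours of the day
obs = [(6, 24.0), (10, 28.5), (14, 31.0), (18, 27.5)]
print(lagrange_cubic(obs, 12))   # → 30.25, the estimate for noon
```

By construction the polynomial reproduces every collected reading exactly, which is what lets the software fill in the uncollected hours between them.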
Abstract:
The present study investigates how the content of polynomial equations inter-relates with structured activities and with the history of mathematics through a sequence of activities presented in an e-book, so that the result of this research is a didactic and pedagogical proposal for teaching polynomial equations through a historical approach via the reported e-book. We therefore relied on theoretical and methodological assumptions from the History of Mathematics, on structured activities, and on new technologies, with an emphasis on the e-book as a tool. We adopted qualitative research as our methodological approach, since our research object fits the objectives of this research mode. As methodological instruments, we used the e-book as a synthesis of the sequence of activities to be evaluated, while questionnaires, semi-structured interviews and participant observation were designed to record and analyze the evaluation made by the research participants during the structured activities. The data collected through the questionnaires were organized, classified and quantified in summary tables to facilitate visualization, interpretation, understanding and analysis. Participant observation contributed to the qualitative analysis of the quantified data, and the interviews were synthetically transcribed and qualitatively analyzed. The analysis confirmed our research objectives and contributed to improving, endorsing and recommending the use of the e-book for teaching polynomial equations. We therefore consider that this educational product will bring significant contributions to the teaching of this mathematical content in Basic Education
Abstract:
In this work a new method is presented for determining the orbital period (Porb) of eclipsing binary systems, based on the wavelet technique. The method is applied to 18 eclipsing binary systems detected by the CoRoT (Convection, Rotation and planetary Transits) satellite. The periods obtained by wavelet were compared with those obtained by conventional methods: box-fitting (EEBLS) for detached and semi-detached eclipsing binaries, and polynomial methods (ANOVA) for contact binary systems. Comparing the phase diagrams obtained by the different techniques, the wavelet method determines Porb better than EEBLS. In the case of contact binary systems, the wavelet method usually gives better results than the ANOVA method, but when the number of data points per orbital cycle is small, ANOVA gives more accurate results. The wavelet technique thus appears to be a great tool for the analysis of data with the quality and precision delivered by CoRoT and the incoming photometric missions.
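The phase diagrams compared in the abstract come from folding the light curve at a trial period: at the true Porb, all eclipses stack at the same phase and the folded curve is tight. The sketch below scores trial periods by within-bin flux dispersion on a synthetic light curve; it illustrates phase folding only, not the wavelet, EEBLS or ANOVA methods themselves.

```python
# Phase-folding sketch: a good trial period stacks all eclipses at the same
# phase, minimizing the scatter of the folded curve (synthetic data only).

def fold(times, period):
    """Orbital phase in [0, 1) for each observation time."""
    return [(t % period) / period for t in times]

def dispersion(times, fluxes, period, nbins=10):
    """Sum of within-bin flux scatter; smaller = better-folded curve."""
    bins = [[] for _ in range(nbins)]
    for ph, f in zip(fold(times, period), fluxes):
        bins[min(nbins - 1, int(ph * nbins))].append(f)
    total = 0.0
    for b in bins:
        if len(b) > 1:
            m = sum(b) / len(b)
            total += sum((f - m) ** 2 for f in b)
    return total

# Synthetic eclipsing binary: flux dips near phase 0 of a 2.5-day period
times = [0.05 * k for k in range(1000)]
fluxes = [1.0 - (0.4 if (t % 2.5) < 0.2 else 0.0) for t in times]
trials = [1.7, 2.0, 2.5, 3.1]
best = min(trials, key=lambda p: dispersion(times, fluxes, p))
print(best)   # → 2.5, the period used to generate the dips
```

Real period searches scan a dense grid of trial periods; the quality of the resulting phase diagram is the common ground on which the abstract compares the wavelet and conventional methods.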
Abstract:
Separation methods have limited application as a result of operational costs, low throughput, and the long time required to separate the fluids. Nevertheless, these treatment methods are important because of the need to extract unwanted contaminants during oil production. The water content, and the concentration of oil in the water, should be minimal (around 20 to 40 ppm) before discharge to the sea. Given the need for primary treatment, the objective of this project is to study and implement algorithms for closed-loop identification of polynomial NARX (Nonlinear Auto-Regressive with eXogenous input) models, to implement structure identification, and to compare control strategies using PI control and on-line-updated NARX predictive models on a three-phase separator in series with three hydrocyclone batteries. The main goals of this project are: to obtain an optimized phase-separation process that regulates the system even in the presence of oil gushes; to show that it is possible to obtain optimized controller tunings by analyzing the loop as a whole; and to evaluate and compare the PI and predictive control strategies applied to the process. To accomplish these goals, a simulator was used to represent the three-phase separator and the hydrocyclones. Algorithms were developed for system identification (NARX) using RLS (Recursive Least Squares), along with methods for model structure detection. Predictive control algorithms were also implemented with the NARX model updated on-line, together with optimization algorithms using PSO (Particle Swarm Optimization). The project ends with a comparison of the results obtained with the PI and predictive controllers (both tuned via the particle swarm algorithm) in the simulated system.
The conclusion is that the optimizations performed make the system less sensitive to external perturbations; once optimized, the two controllers show similar results, with the predictive controller being somewhat less sensitive to disturbances
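The PSO tuning step can be sketched with a minimal swarm minimizing a stand-in cost over two controller gains. The quadratic cost below is an assumption for illustration; in the project, each particle's cost would instead come from running the separator simulator in closed loop.

```python
# Minimal Particle Swarm Optimization sketch for tuning two gains (kp, ki).
# The cost function is a hypothetical stand-in for closed-loop performance.
import random

random.seed(3)

def cost(kp, ki):
    """Hypothetical tuning cost with a single minimum at (2.0, 0.5)."""
    return (kp - 2.0) ** 2 + (ki - 0.5) ** 2

def pso(n=15, iters=60, w=0.7, c1=1.5, c2=1.5):
    pos = [[random.uniform(0, 5), random.uniform(0, 2)] for _ in range(n)]
    vel = [[0.0, 0.0] for _ in range(n)]
    pbest = [p[:] for p in pos]                      # personal bests
    gbest = min(pbest, key=lambda p: cost(*p))[:]    # global best
    for _ in range(iters):
        for i in range(n):
            for d in range(2):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if cost(*pos[i]) < cost(*pbest[i]):
                pbest[i] = pos[i][:]
                if cost(*pos[i]) < cost(*gbest):
                    gbest = pos[i][:]
    return gbest

kp, ki = pso()
print(round(kp, 2), round(ki, 2))   # should approach (2.0, 0.5)
```

Because the swarm only needs cost evaluations, the same loop tunes a PI controller or the weights of a predictive controller without any gradient information, which is what makes it convenient for simulator-in-the-loop tuning.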
Abstract:
Among the theorems taught in basic education, some can be proved in the classroom and others cannot, because of the degree of difficulty of their formal proofs. A classic example is the Fundamental Theorem of Algebra, which is not proved at this level because it requires higher-level knowledge of mathematics. In this paper, we justify the validity of this theorem intuitively using the software GeoGebra and, based on [2], we present a clear formal proof of this theorem, addressed to school teachers and undergraduate students in mathematics
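The intuition behind such a GeoGebra exploration can also be checked numerically: on a large circle, p(z) behaves like its leading term, so its image winds around the origin deg(p) times, which forces a root inside the circle. The cubic below is an arbitrary example chosen for illustration, not one from the paper.

```python
# Numerical winding-number check behind the intuitive argument for the
# Fundamental Theorem of Algebra (illustrative polynomial).
import cmath

def winding(p, radius, steps=2000):
    """Number of turns p(z) makes around 0 as z traverses |z| = radius."""
    total = 0.0
    prev = p(radius)                      # value of p at angle 0
    for k in range(1, steps + 1):
        z = radius * cmath.exp(2j * cmath.pi * k / steps)
        cur = p(z)
        total += cmath.phase(cur / prev)  # small angle increment per step
        prev = cur
    return round(total / (2 * cmath.pi))

p = lambda z: z**3 - 2*z + 7              # an arbitrary degree-3 polynomial
print(winding(p, radius=100.0))           # → 3, the degree of p
```

Since the winding number is 3 on the big circle but would be 0 for a curve that never encloses the origin, p must vanish somewhere inside, which is the geometric content the classroom exploration makes visible.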