Abstract:
The idea of considering imprecision in probabilities is old, going back to the work of George Boole, who in 1854 sought to reconcile classical logic, which allows the modeling of complete ignorance, with probabilities. In 1921, John Maynard Keynes made explicit use of intervals in his book to represent imprecision in probabilities. But it was only with the work of Walley in 1991 that principles were established that should be respected by a probability theory dealing with imprecision. With the emergence of the theory of fuzzy sets by Lotfi Zadeh in 1965, another way of dealing with uncertainty and with imprecise concepts became available. Several ways of applying Zadeh's ideas to probabilities were quickly proposed, to deal with imprecision either in the events associated with the probabilities or in the probability values themselves. In particular, from 2003 James Buckley began to develop a probability theory in which the values of the probabilities are fuzzy numbers. This fuzzy probability follows principles analogous to Walley's imprecise probabilities. On the other hand, the use of real numbers between 0 and 1 as truth degrees, as originally proposed by Zadeh, has the drawback of employing very precise values to deal with uncertainty (how can one distinguish an element that satisfies a property to degree 0.423 from one that satisfies it to degree 0.424?). This motivated the development of several extensions of fuzzy set theory that incorporate some kind of imprecision. This work considers the extension proposed by Krassimir Atanassov in 1983, which adds an extra degree of uncertainty to model the hesitation in assigning a membership degree: one value indicates the degree to which the object belongs to the set, while the other indicates the degree to which it does not belong to the set. In Zadeh's fuzzy set theory, this non-membership degree is, by default, the complement of the membership degree. In Atanassov's approach, the non-membership degree is somewhat independent of the membership degree, and the difference between the non-membership degree and the complement of the membership degree reveals the hesitation at the moment of assigning a membership degree. This extension is today called Atanassov's intuitionistic fuzzy set theory; it is worth noting that the term intuitionistic here has no relation to the term as used in intuitionistic logic. In this work, two proposals for interval probability are developed: the restricted interval probability and the unrestricted interval probability. Two notions of fuzzy probability are also introduced: the constrained fuzzy probability and the unconstrained fuzzy probability. Finally, two notions of intuitionistic fuzzy probability are introduced: the restricted intuitionistic fuzzy probability and the unrestricted intuitionistic fuzzy probability.
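For reference, the standard Atanassov formulation (stated here in textbook form, not reproduced from the thesis) assigns every element both a membership and a non-membership degree, with the hesitation degree being whatever is left over:

```latex
% Atanassov intuitionistic fuzzy set A over a universe X
A = \{\, \langle x, \mu_A(x), \nu_A(x) \rangle : x \in X \,\}, \qquad
0 \le \mu_A(x) + \nu_A(x) \le 1 .

% Hesitation (indeterminacy) degree: the gap between \nu_A(x) and the
% complement 1 - \mu_A(x) used by ordinary fuzzy sets
\pi_A(x) = 1 - \mu_A(x) - \nu_A(x), \qquad 0 \le \pi_A(x) \le 1 .
```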
Abstract:
Currently, one of the biggest challenges in data mining is performing cluster analysis on complex data. Several techniques have been proposed but, in general, they achieve good results only within specific domains, and there is no consensus about the best way to group this kind of data. These techniques generally fail due to unrealistic assumptions about the true probability distribution of the data. Based on this, this thesis proposes a new measure based on the Cross Information Potential that uses representative points of the dataset and statistics extracted directly from the data to measure the interaction between groups. The proposed approach retains all the advantages of this information-theoretic descriptor while overcoming the limitations imposed by its own nature. From this, two cost functions and three algorithms are proposed to perform cluster analysis. Because the use of Information Theory captures the relationship between different patterns regardless of assumptions about the nature of that relationship, the proposed approach achieved better performance than the main algorithms in the literature. These results hold both for synthetic data designed to test the algorithms in specific situations and for real data drawn from problems in different fields.
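The thesis's exact measure is not reproduced here; the sketch below only illustrates the classical Parzen-window estimate of the cross information potential between two groups of points, the information-theoretic descriptor the proposal builds on (the kernel size sigma is an assumed parameter):

```python
import numpy as np

def gaussian_kernel(diff, sigma):
    """Isotropic Gaussian kernel evaluated at pairwise difference vectors."""
    d = diff.shape[-1]
    norm = (2.0 * np.pi * sigma ** 2) ** (d / 2.0)
    return np.exp(-np.sum(diff ** 2, axis=-1) / (2.0 * sigma ** 2)) / norm

def cross_information_potential(X, Y, sigma=1.0):
    """Parzen estimate of the cross information potential between two groups
    of points; interacting kernels have effective width sigma*sqrt(2)."""
    diff = X[:, None, :] - Y[None, :, :]          # all pairwise differences
    return gaussian_kernel(diff, sigma * np.sqrt(2.0)).mean()

# toy usage: two well-separated blobs interact weakly with each other
rng = np.random.default_rng(0)
A = rng.normal(loc=0.0, scale=0.5, size=(100, 2))
B = rng.normal(loc=4.0, scale=0.5, size=(100, 2))
print(cross_information_potential(A, B, sigma=0.5))   # small value
print(cross_information_potential(A, A, sigma=0.5))   # larger value
```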
Abstract:
In this thesis, the dynamic model of a multirotor unmanned aerial vehicle with vertical takeoff and landing characteristics is developed, considering input nonlinearities, together with a full-state robust backstepping controller. The dynamic model is expressed using the Newton-Euler laws, aiming at a better mathematical representation of the mechanical system for analysis and control design, not only when hovering but also when taking off, landing, or flying to perform a task. The input nonlinearities are dead zone and saturation, through which the gravitational effect and the inherent physical constraints of the rotors are addressed. The experimental multirotor aerial vehicle is equipped with an inertial measurement unit and a sonar sensor, which provide measurements of attitude and altitude. A real-time attitude estimation scheme based on the extended Kalman filter using quaternions was developed. For robustness analysis, the sensors were modeled as the ideal value plus an unknown bias and unknown white noise. The bounded robust attitude/altitude controllers were derived based on the notion of global uniform practical asymptotic stability for real systems, which remain globally uniformly asymptotically stable if and only if their solutions are globally uniformly bounded, dealing with convergence and stability into a ball of the state space with non-null radius, under some assumptions. Lyapunov analysis was used to prove the stability of the closed-loop system, compute bounds on the control gains, and guarantee desired bounds on the attitude tracking errors in the presence of measurement disturbances. The control laws were tested in numerical simulations and on an experimental hexarotor developed at the UFRN Robotics Laboratory.
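As a small illustration of the quaternion-based attitude estimation mentioned above (not the thesis's actual filter), the sketch below propagates an attitude quaternion from gyro readings, the kinematic step used in the prediction stage of a quaternion extended Kalman filter:

```python
import numpy as np

def quat_mult(q, r):
    """Hamilton product of two quaternions given as [w, x, y, z]."""
    w0, x0, y0, z0 = q
    w1, x1, y1, z1 = r
    return np.array([
        w0*w1 - x0*x1 - y0*y1 - z0*z1,
        w0*x1 + x0*w1 + y0*z1 - z0*y1,
        w0*y1 - x0*z1 + y0*w1 + z0*x1,
        w0*z1 + x0*y1 - y0*x1 + z0*w1,
    ])

def propagate_attitude(q, gyro, dt):
    """First-order propagation of the attitude quaternion from a gyro
    reading (rad/s), as used in the predict step of a quaternion EKF."""
    omega = np.concatenate(([0.0], gyro))       # pure quaternion [0, wx, wy, wz]
    q_dot = 0.5 * quat_mult(q, omega)           # kinematic equation
    q_new = q + q_dot * dt
    return q_new / np.linalg.norm(q_new)        # re-normalize to unit length

# usage: hover attitude rotating slowly about the body z-axis
q = np.array([1.0, 0.0, 0.0, 0.0])
for _ in range(100):
    q = propagate_attitude(q, gyro=np.array([0.0, 0.0, 0.1]), dt=0.01)
print(q)
```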
Abstract:
In recent decades, changes in the telecommunications industry, allied with the competition driven by privatization and concession policies, have stirred the world market and brought about a new reality. The effects in Brazil have become evident in significant growth rates, with the sector reaching a net operating income of 128 billion dollars in 2012 and placing the country among the five major world powers in mobile communications. In this context, an issue of increasing importance to the financial health of companies is their ability to retain their customers and turn them into loyal customers. Customer disloyalty has been generating monthly disconnection rates of about two to four percent, making churn one of the biggest challenges of business management, since acquiring a new customer costs more than five times as much as retaining one. For this purpose, models have been developed by means of structural equation modeling to identify the relationships between the various determinants of customer loyalty in the context of services. The original contribution of this thesis is to develop a loyalty model based on the identification of relationships between determinants of satisfaction (latent variables) and on the inclusion of attributes that determine the perception of service quality in the mobile communications industry, such as quality, satisfaction, value, trust, expectation, and loyalty. This is a qualitative study conducted with customers of the operators through simple random sampling, using structured questionnaires. As a result, the proposed model and the statistical evaluations should enable operators to conclude that customer loyalty is directly influenced by the technical and operational quality of the services offered, as well as provide a satisfaction index for the mobile communication segment.
Abstract:
The precise and fast identification of downhole abnormalities is essential to prevent damage and increase production in the oil industry. This work presents a study of a new automatic approach to detecting and classifying the operation mode of sucker-rod pumping wells through downhole dynamometer cards. The main idea is to recognize the production status of the well through image processing of the downhole dynamometer card (boundary descriptors) combined with statistical and similarity tools, namely Fourier descriptors, Principal Component Analysis (PCA), and Euclidean distance. To validate the proposal, real data from sucker-rod pumping systems are used.
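A minimal sketch of a boundary-descriptor pipeline of the kind described above, assuming a closed card contour is already available as (x, y) points; the normalization choices and nearest-template classification by Euclidean distance are illustrative, not the thesis's exact procedure:

```python
import numpy as np

def fourier_descriptors(boundary_xy, n_coeffs=16):
    """Translation/scale/rotation-invariant Fourier descriptors of a closed
    boundary given as an (N, 2) array of (x, y) points."""
    z = boundary_xy[:, 0] + 1j * boundary_xy[:, 1]   # complex contour
    F = np.fft.fft(z)
    F[0] = 0.0                   # drop the DC term -> translation invariance
    mags = np.abs(F)             # magnitudes -> rotation/start-point invariance
    mags /= mags[1]              # normalize by first harmonic -> scale invariance
    return mags[1:1 + n_coeffs]

def classify(card, templates):
    """Nearest-template label by Euclidean distance between descriptors."""
    d = fourier_descriptors(card)
    dists = {label: np.linalg.norm(d - fourier_descriptors(t))
             for label, t in templates.items()}
    return min(dists, key=dists.get)

# toy usage: distinguish a circle from an ellipse (stand-ins for card shapes)
t = np.linspace(0, 2 * np.pi, 128, endpoint=False)
circle  = np.c_[np.cos(t), np.sin(t)]
ellipse = np.c_[2 * np.cos(t), np.sin(t)]
print(classify(ellipse, {"circle": circle, "ellipse": ellipse}))   # -> "ellipse"
```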
Abstract:
In this work we present a new clustering method that groups the points of a data set into classes. The method is based on an algorithm that links auxiliary clusters obtained with traditional vector quantization techniques. Several approaches developed during the work are described, based on measures of distance or dissimilarity (divergence) between the auxiliary clusters. The new method uses only two pieces of a priori information: the number of auxiliary clusters Na and a threshold distance dt used to decide whether or not to link auxiliary clusters. The number of classes can be found automatically by the method, based on the chosen threshold distance dt, or it can be given as additional information to help choose the correct threshold. Some analyses are made and the results are compared with traditional clustering methods. Different dissimilarity metrics are analyzed and a new one is proposed, based on the concept of negentropy. Besides grouping the points of a set into classes, a method is proposed for statistically modeling the classes in order to obtain an expression for the probability that a point belongs to one of the classes. Experiments with several values of Na and dt are carried out on test sets, and the results are analyzed to study the robustness of the method and to derive heuristics for choosing the correct threshold. Throughout the work, aspects of information theory applied to the calculation of the divergences are explored, specifically the different measures of information and divergence based on the Rényi entropy. The results using the different metrics are compared and discussed. The work also has an appendix presenting real applications of the proposed method.
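A minimal sketch of the two-stage idea described above, with hypothetical parameter values; it uses k-means as the vector quantizer and plain Euclidean distance between auxiliary centers in place of the divergence measures studied in the thesis:

```python
import numpy as np
from sklearn.cluster import KMeans

def link_auxiliary_clusters(data, n_aux, dt):
    """Quantize the data into n_aux auxiliary clusters, link auxiliary centers
    whose mutual distance is below the threshold dt, and return each point's
    final class (one class per connected component of linked clusters)."""
    km = KMeans(n_clusters=n_aux, n_init=10, random_state=0).fit(data)
    centers, labels = km.cluster_centers_, km.labels_

    # adjacency between auxiliary clusters (Euclidean distance as a stand-in
    # for the divergence-based dissimilarities analyzed in the thesis)
    dist = np.linalg.norm(centers[:, None, :] - centers[None, :, :], axis=-1)
    adj = dist < dt

    # connected components via repeated min-label propagation
    comp = np.arange(n_aux)
    changed = True
    while changed:
        changed = False
        for i in range(n_aux):
            m = comp[adj[i]].min()
            if m < comp[i]:
                comp[i], changed = m, True
    return comp[labels]          # final class of every data point

# usage: two crescent-free blobs, Na=6 auxiliary clusters, dt chosen by hand
rng = np.random.default_rng(0)
data = np.vstack([rng.normal(0, 0.3, (200, 2)), rng.normal(3, 0.3, (200, 2))])
print(np.unique(link_auxiliary_clusters(data, n_aux=6, dt=1.0)))
```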
Abstract:
The purpose of this study was to develop a pilot plant whose main goal is to emulate the pressure peaks caused in a separation vessel by slug flow in production wells equipped with the plunger-lift artificial lift method. The motivation for its development was the need to test, in a smaller-scale plant, a new technique developed to estimate the gas flow rate in wells produced by plunger lift. Its development required studies of multiphase flow effects, the operating methods of plunger-lift wells, industrial instrumentation, control valves, separator vessel sizing, and measurement systems. The methodology consisted of defining the process flowcharts, their parameters, and how the effects needed for the success of the experiments would be generated. On this basis, the control valves, the design and construction of the vessels, and the acquisition of the other equipment were defined. One of the vessels works as a compressed-air tank connected to the separation vessel and generates gas pulses controlled by an on/off valve. With the emulator system ready, several control experiments were carried out, the main ones being the controlled generation of pressure peaks and flow metering, confirming the usefulness of the plant for the problem that motivated it. It was concluded that the system is capable of generating flow effects with pressure peaks in a primary separation vessel. Studies such as the estimation of gas flow at the outlet of the vessel, as well as other academic studies, can be carried out and tested at a smaller scale and then applied to real plants, avoiding waste of time and money.
Abstract:
Pipeline leak detection is a matter of great interest to companies that transport petroleum and its derivatives, in the face of rising environmental-policy requirements in industrialized and industrializing countries. However, existing technologies are not yet fully consolidated, and many studies have been carried out in order to achieve better levels of sensitivity and reliability for pipeline leak detection over a wide range of flow conditions. In this sense, this study presents the results obtained from the frequency-spectrum analysis of pressure signals from pipelines under several flow conditions, such as normal flow, leakages, pump switching, etc. The results show that it is possible to distinguish between the frequency spectra of these different flow conditions, allowing liquid pipeline leakages to be recognized and announced from pressure monitoring. Based on these results, a pipeline leak detection algorithm employing frequency analysis of pressure signals is proposed, along with a methodology for its tuning and calibration. The proposed algorithm and its tuning methodology are evaluated with data obtained from real leakages performed in pipelines transporting crude oil and water, in order to assess its sensitivity, reliability, and applicability to different flow conditions.
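A rough sketch of the kind of spectral monitoring described above (not the proposed algorithm itself); the band edges, sampling rate, and threshold are hypothetical placeholders:

```python
import numpy as np
from scipy.signal import welch

def band_energies(pressure, fs, bands):
    """Welch power spectrum of a pressure record, summarized as the energy
    contained in a handful of frequency bands."""
    f, pxx = welch(pressure, fs=fs, nperseg=1024)
    return np.array([pxx[(f >= lo) & (f < hi)].sum() for lo, hi in bands])

def looks_like_leak(pressure, fs, reference, bands, threshold=3.0):
    """Flag a possible leak when the band energies deviate from a reference
    signature (taken under normal flow) by more than `threshold` times."""
    e = band_energies(pressure, fs, bands)
    return bool(np.any(e > threshold * reference))

# usage sketch: reference signature learned from a normal-flow record
fs = 100.0                                   # Hz, assumed sampling rate
bands = [(0.1, 1.0), (1.0, 5.0), (5.0, 20.0)]
rng = np.random.default_rng(1)
normal = rng.normal(size=60 * int(fs))       # stand-in for a normal-flow record
reference = band_energies(normal, fs, bands)
print(looks_like_leak(normal, fs, reference, bands))   # expected: False
```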
Abstract:
The present work describes the use of a mathematical tool to solve problems arising from control theory, including identification, phase-portrait and stability analysis, and the temporal evolution of the current of an induction-motor plant. System identification is an area of mathematical modeling whose objective is the study of techniques for determining a dynamic model that represents a real system. The tool used in the identification and analysis of the nonlinear dynamical system is the Radial Basis Function (RBF) network. The process, or plant, has an unknown mathematical model but belongs to a particular class whose internal dynamics can be modeled. An analysis of the asymptotic stability of the RBF model is presented as a contribution. Identification using radial basis functions is demonstrated through computer simulations on a real data set obtained from the plant.
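As an illustration of identification with radial basis functions (a generic sketch, not the thesis's model of the induction-motor plant), the snippet below fits the output weights of a Gaussian RBF model by least squares on input-output data from a toy nonlinear plant:

```python
import numpy as np

def rbf_design_matrix(x, centers, width):
    """Gaussian radial basis functions evaluated at the inputs x."""
    d2 = np.sum((x[:, None, :] - centers[None, :, :]) ** 2, axis=-1)
    return np.exp(-d2 / (2.0 * width ** 2))

def fit_rbf(x, y, centers, width):
    """Least-squares fit of the output-layer weights of an RBF model."""
    Phi = rbf_design_matrix(x, centers, width)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return w

# one-step-ahead identification of a toy nonlinear plant y(k+1) = f(y(k), u(k))
rng = np.random.default_rng(0)
u = rng.uniform(-1, 1, 500)
y = np.zeros(501)
for k in range(500):
    y[k + 1] = 0.6 * y[k] + np.tanh(u[k])             # unknown plant (toy)

X = np.c_[y[:-1], u]                                   # regressors [y(k), u(k)]
centers = X[rng.choice(len(X), 25, replace=False)]     # centers picked from data
w = fit_rbf(X, y[1:], centers, width=0.5)
y_hat = rbf_design_matrix(X, centers, 0.5) @ w
print("RMS one-step error:", np.sqrt(np.mean((y_hat - y[1:]) ** 2)))
```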
Abstract:
Electric energy is essential to the development of modern society, and its increasing demand in recent years, an effect of population and economic growth, has made companies more concerned with the quality and continuity of supply, factors regulated by ANEEL (Agência Nacional de Energia Elétrica). These factors must be met when a permanent fault occurs in the system: the location of the defect that caused the power interruption must be identified quickly, which is not a simple task given the complexity of current systems. An example occurs in multi-terminal transmission lines, which interconnect existing circuits to feed the demand. Such transmission lines have been adopted as a feasible solution to supply loads whose magnitudes do not economically justify the construction of new substations. This work presents a fault location algorithm for multi-terminal transmission lines with two and three terminals. The location method is based on the use of voltage and current fundamental phasors, as well as on the representation of the line through its series impedance. The wavelet transform is an effective mathematical tool for analyzing signals with discontinuities and is therefore used to synchronize the voltage and current data. The Fourier transform is another tool used in this work, to extract the voltage and current fundamental phasors. Tests to validate the applicability of the location algorithm used data from faulty signals simulated in ATP (Alternative Transients Program) as well as real data obtained from oscillographic recorders installed on CHESF's lines.
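A small sketch of the full-cycle Fourier phasor extraction step mentioned above (a textbook formulation, not the thesis's implementation); the number of samples per cycle is an assumed parameter:

```python
import numpy as np

def fundamental_phasor(samples, samples_per_cycle):
    """Full-cycle Fourier estimate of the fundamental phasor (RMS magnitude
    and phase in radians) from the most recent cycle of a sampled waveform."""
    x = np.asarray(samples[-samples_per_cycle:], dtype=float)
    n = np.arange(samples_per_cycle)
    # correlate with one cycle of cosine and sine at the fundamental frequency
    re = (2.0 / samples_per_cycle) * np.sum(x * np.cos(2 * np.pi * n / samples_per_cycle))
    im = (2.0 / samples_per_cycle) * np.sum(x * np.sin(2 * np.pi * n / samples_per_cycle))
    phasor = (re - 1j * im) / np.sqrt(2.0)            # peak value -> RMS
    return np.abs(phasor), np.angle(phasor)

# usage: 16 samples per cycle of a waveform with amplitude 100 and phase 30°
spc = 16
t = np.arange(10 * spc)
v = 100.0 * np.cos(2 * np.pi * t / spc + np.deg2rad(30.0))
mag, ang = fundamental_phasor(v, spc)
print(mag, np.rad2deg(ang))    # ~70.7 (= 100/sqrt(2)) and ~30 degrees
```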
Abstract:
This work proposes a hardware architecture, described in VHDL, developed to embed an Artificial Neural Network (ANN) of the Multilayer Perceptron (MLP) type. The idea is that, with this architecture, applications can easily embed several different MLP topologies for the industrial field. The MLP topology into which the architecture is configured is defined by a simple, specific data input (instructions) that determines the number of layers and the number of perceptrons of the network. In order to support several MLP topologies, a datapath of components and a controller were developed to execute these instructions. Thus, the user defines a group of previously known instructions that determine the characteristics of the ANN, and the system guarantees the execution of the MLP through the neural processors (perceptrons), the datapath components, and the controller. The biases and weights are static: the ANN to be embedded must have been trained previously, off-line. Knowledge of the internal characteristics of the system or of the VHDL language is not required from the user. A reconfigurable FPGA device was used to implement, simulate, and test the whole system, allowing its application to several real everyday problems.
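A behavioral sketch, in software, of the computation such an architecture executes: a forward pass through a configurable MLP with weights and biases fixed off-line (the topology, activation, and values below are placeholders; the actual work implements this datapath in VHDL):

```python
import numpy as np

def mlp_forward(x, weights, biases):
    """Software reference model of the forward pass the hardware executes:
    one matrix-vector product plus bias and activation per configured layer,
    with weights and biases fixed (trained off-line)."""
    a = np.asarray(x, dtype=float)
    for W, b in zip(weights, biases):
        a = np.tanh(W @ a + b)          # one perceptron layer
    return a

# a 2-4-1 topology, analogous to one defined by the configuration instructions
rng = np.random.default_rng(0)
weights = [rng.normal(size=(4, 2)), rng.normal(size=(1, 4))]
biases  = [rng.normal(size=4), rng.normal(size=1)]
print(mlp_forward([0.5, -0.2], weights, biases))
```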
Abstract:
This work consists in the use of signal processing techniques and artificial neural networks to identify leaks in pipelines with multiphase flow. With traditional leak detection methods it is very difficult to build a profile adjusted to the conditions actually found in oil transport. These difficult conditions range from uneven terrain, which causes liquid columns or vacuum along the pipeline, to the presence of multiple phases such as water, gas, and oil, plus other components such as sand, which produce discontinuous flow and diverse variations. To mitigate these difficulties, the wavelet transform was used to map the pressure signal into different resolution planes, allowing the extraction of descriptors that identify leak patterns, and a neural network was then trained to classify these patterns and report whenever they characterize a leak. During the tests, transient and steady-state signals were used, with pipeline punctures varying from ½" to 1" in diameter to simulate leaks, between Upanema and Estreito B in Petrobras UN-RNCE, where it was possible to detect leaks. The results show that the proposed descriptors, based on statistical methods applied in the transform domain, are sufficient to identify leak patterns and make it possible to train the neural classifier to indicate the occurrence of pipeline leaks.
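A simplified sketch of wavelet-based descriptor extraction in the spirit of the approach above (a plain Haar decomposition with energy and standard deviation per resolution level; the thesis's actual wavelet, number of levels, and statistics may differ):

```python
import numpy as np

def haar_dwt_levels(signal, levels=4):
    """Plain Haar wavelet decomposition; returns the detail coefficients of
    each level (a stand-in for the multiresolution mapping described above)."""
    approx, details = np.asarray(signal, dtype=float), []
    for _ in range(levels):
        if len(approx) % 2:                       # keep an even length
            approx = approx[:-1]
        pairs = approx.reshape(-1, 2)
        details.append((pairs[:, 0] - pairs[:, 1]) / np.sqrt(2.0))
        approx = (pairs[:, 0] + pairs[:, 1]) / np.sqrt(2.0)
    return details

def leak_descriptors(pressure, levels=4):
    """Statistical descriptors per resolution level (energy and standard
    deviation), the kind of feature vector fed to the neural classifier."""
    feats = []
    for d in haar_dwt_levels(pressure, levels):
        feats += [np.sum(d ** 2), np.std(d)]
    return np.array(feats)

# usage: descriptors of a (synthetic) pressure record
rng = np.random.default_rng(0)
print(leak_descriptors(rng.normal(size=1024)))
```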
Abstract:
Fuzzy intelligent systems are present in a variety of equipment, ranging from household appliances to small devices such as digital cameras and cell phones, being used primarily to deal with the uncertainties in the modeling of real systems. However, commercial implementations of fuzzy systems are not general purpose and are not portable across different hardware platforms. With these issues in mind, this work presents an open-source development environment consisting of a desktop system capable of graphically generating a general-purpose fuzzy controller and exporting its parameters to an embedded fuzzy controller written for the Java Platform, Micro Edition (J2ME), whose modular design makes it portable to any mobile device that supports J2ME. The proposed development platform generates all the parameters of a fuzzy controller and exports them to an XML file, and the code responsible for the control logic, embedded in the mobile device, reads this file and starts the controller. All the parameters of a fuzzy controller are configurable through the desktop system, from the membership functions and rule base to the universe of discourse of the linguistic terms of the output variables. The system generates fuzzy controllers for the Takagi-Sugeno interpolation model. To validate and test the proposed solution, the fuzzy controller was embedded on a Sun SPOT® mobile device and used to control a Quanser® level plant; to compare the fuzzy controller generated by the system with other types of controllers, a PID controller was also implemented and embedded on the Sun SPOT to control the same level plant.
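For illustration, a zero-order Takagi-Sugeno inference step of the kind such a controller performs (the rule base and membership functions below are hypothetical, not the ones generated by the described environment):

```python
import numpy as np

def triangular(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def ts_inference(error, rules):
    """Zero-order Takagi-Sugeno inference: the output is the firing-strength
    weighted average of the crisp rule consequents."""
    strengths = np.array([triangular(error, *mf) for mf, _ in rules])
    consequents = np.array([c for _, c in rules])
    return float(np.sum(strengths * consequents) / (np.sum(strengths) + 1e-12))

# hypothetical level-control rule base: IF error IS <term> THEN u = <constant>
rules = [
    ((-2.0, -1.0, 0.0), -1.0),   # negative error -> close the valve
    ((-1.0,  0.0, 1.0),  0.0),   # zero error     -> hold
    (( 0.0,  1.0, 2.0),  1.0),   # positive error -> open the valve
]
print(ts_inference(0.3, rules))   # a value between 0 and 1
```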
Abstract:
Visual odometry is the process of estimating camera position and orientation based solely on images and on features (projections of visual landmarks present in the scene) extracted from them. With the increasing advance of computer vision algorithms and computing power, the subarea known as Structure from Motion (SFM) started to supply mathematical tools for localization systems in robotics and augmented reality applications, in contrast with its initial purpose of serving inherently offline solutions aimed at 3D reconstruction and image-based modeling. Accordingly, this work proposes a pipeline for obtaining relative position, featuring a previously calibrated camera as the positional sensor and based entirely on models and algorithms from SFM. Techniques usually applied in camera localization systems, such as Kalman filters and particle filters, are not used, making additional information such as probabilistic models of camera state transition unnecessary. Experiments assessing both the 3D reconstruction quality and the camera positions estimated by the system were performed, in which image sequences captured in realistic scenarios were processed and compared to localization data gathered from a mobile robotic platform.
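A compact sketch of one SFM-style relative-pose step of the kind such a pipeline relies on, using OpenCV's ORB features and essential-matrix estimation for a calibrated camera with intrinsics K (the thesis's actual feature choice and pipeline details may differ):

```python
import numpy as np
import cv2

def relative_pose(img1, img2, K):
    """Estimate the relative rotation R and unit-scale translation t between
    two views of a calibrated camera from matched ORB features and the
    essential matrix."""
    orb = cv2.ORB_create(2000)
    kp1, des1 = orb.detectAndCompute(img1, None)
    kp2, des2 = orb.detectAndCompute(img2, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

    E, _ = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC,
                                prob=0.999, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K)
    return R, t        # translation is recovered only up to scale
```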