201 results for Data compression (Telecommunications)


Relevance:

20.00%

Publisher:

Abstract:

This work describes the study and implementation of vector speed control for a three-phase, four-pole, 1.1 kW bearingless induction machine with divided winding, using neural rotor flux estimation. The vector speed control operates together with the radial positioning controllers and with the current controllers of the stator phase windings. Radial positioning relies on forces produced by the machine's internal magnetic fields; to optimize these radial forces, a special rotor winding with independent circuits, which keeps the influence on rotational torque low, was used. The neural flux estimator applied to the vector speed control aims to compensate for the parameter dependence of conventional estimators, whose accuracy degrades as machine parameters vary with temperature rise or rotor magnetic saturation. The implemented control system allows a direct comparison between the responses of the speed and radial positioning controllers when the machine is oriented by the neural rotor flux estimator and when it is oriented by the conventional estimator. All of the control software was written in ANSI C. The DSP resources used by the system are the analog-to-digital converter channels, the PWM outputs, and the parallel and RS-232 serial interfaces, which are responsible, respectively, for DSP programming and for data capture by the supervisory system.
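As an illustration of the kind of estimator described above, the sketch below shows a small feedforward network mapping measured stator quantities to the d-q rotor flux components. The architecture, inputs, and weights are assumptions for illustration, not the thesis's actual network, which would be trained on machine data.

```python
# A minimal sketch of a neural rotor flux estimator: a one-hidden-layer
# network mapping stator currents/voltages and speed to flux components.
# Shapes, inputs and (random) weights are illustrative assumptions only.
import numpy as np

def neural_flux_estimator(x, W1, b1, W2, b2):
    """x: measured sample (i_d, i_q, v_d, v_q, omega); returns (psi_d, psi_q)."""
    h = np.tanh(W1 @ x + b1)        # hidden layer
    return W2 @ h + b2              # linear output: rotor flux d-q components

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(8, 5)), np.zeros(8)   # 5 inputs -> 8 hidden units
W2, b2 = rng.normal(size=(2, 8)), np.zeros(2)   # -> 2 flux components

x = np.array([1.2, -0.4, 310.0, -15.0, 150.0])  # made-up operating point
psi = neural_flux_estimator(x, W1, b1, W2, b2)
print("estimated rotor flux (d, q):", psi)
```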

Relevance:

20.00%

Publisher:

Abstract:

Maps obtained from remotely sensed orbital images submitted to digital processing have become fundamental for optimizing conservation and monitoring actions on coral reefs. However, the accuracy reached in the mapping of submerged areas is limited by variation of the water column, which degrades the signal received by the orbital sensor and introduces errors into the final classification. The limited capacity of traditional methods, based on conventional statistical techniques, to solve inter-class confusion problems motivated the search for alternative strategies in the field of Computational Intelligence. In this work, an ensemble of classifiers was built by combining Support Vector Machines and a Minimum Distance Classifier to classify remotely sensed images of a coral reef ecosystem. The system is composed of three stages, through which the classification is progressively refined: patterns that receive an ambiguous classification at one stage are re-evaluated at the subsequent stage, and an unambiguous prediction for all the data is reached through the reduction or elimination of false positives. The images were classified into five bottom types: deep water, underwater corals, intertidal corals, algal bottom, and sandy bottom. The highest overall accuracy (89%) was obtained with an SVM using a polynomial kernel. The accuracy of the classified image was compared, by means of an error matrix, with the results obtained by classification methods based on a single classifier (a neural network and the k-means algorithm). The comparison of results demonstrated the potential of ensemble classifiers as a tool for classifying images of submerged areas subject to the noise caused by atmospheric effects and the water column.
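The staged-refinement idea can be sketched as follows: patterns whose first-stage SVM prediction is ambiguous (low confidence) are handed to a minimum-distance (nearest-centroid) classifier. The synthetic data, confidence threshold, and kernel degree are assumptions; the thesis's actual stages and features are not reproduced.

```python
# Sketch of a two-stage cascade: SVM first, nearest-centroid (minimum
# distance classifier) for the ambiguous patterns. Data and threshold
# are illustrative assumptions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC
from sklearn.neighbors import NearestCentroid

X, y = make_classification(n_samples=400, n_features=6, n_classes=3,
                           n_informative=4, random_state=0)
X_tr, y_tr, X_te, y_te = X[:300], y[:300], X[300:], y[300:]

svm = SVC(kernel="poly", degree=3, probability=True).fit(X_tr, y_tr)
mdc = NearestCentroid().fit(X_tr, y_tr)          # minimum distance classifier

proba = svm.predict_proba(X_te)
pred = proba.argmax(axis=1)
ambiguous = proba.max(axis=1) < 0.6              # stage-1 confidence threshold
if ambiguous.any():
    pred[ambiguous] = mdc.predict(X_te[ambiguous])   # stage 2 re-evaluates

print(f"{ambiguous.sum()} ambiguous patterns re-evaluated; "
      f"accuracy = {(pred == y_te).mean():.2%}")
```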

Relevance:

20.00%

Publisher:

Abstract:

Skin cancer is the most common of all cancers, and the increase in its incidence is partly due to people's behavior regarding sun exposure. In Brazil, non-melanoma skin cancer has the highest incidence in most regions. Dermatoscopy and videodermatoscopy are the main types of examination for the diagnosis of dermatological skin diseases, and the use of computational tools to support or follow up the medical diagnosis of dermatological lesions is a very recent field. Several methods have been proposed for the automatic classification of skin pathologies from images. The present work presents a new intelligent methodology for the analysis and classification of skin cancer images, based on digital image processing techniques for the extraction of color, shape, and texture features, using the Wavelet Packet Transform (WPT) and the learning technique known as the Support Vector Machine (SVM). The WPT, a set of basis functions that represents the image in different frequency bands, each with a distinct resolution corresponding to each scale, is applied to extract texture features from the images. In addition, the color features of the lesion, which depend on a visual context and are influenced by the surrounding colors, are computed, and shape attributes are obtained through Fourier descriptors. The SVM, grounded in the structural risk minimization principle of statistical learning theory, is used for the classification task: it constructs optimal hyperplanes that separate the classes, each hyperplane being determined by a subset of the training patterns called support vectors. For the database used in this work, the results revealed good performance, with an overall accuracy of 92.73% for melanoma and 86% for non-melanoma and benign lesions. The extracted descriptors together with the SVM classifier constitute a method capable of recognizing and classifying the analyzed skin lesions.
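A minimal sketch of the WPT texture-feature step is given below: the energy of each subband at a given decomposition level is taken as a feature. The image, wavelet, level, and energy measure are assumptions for illustration; the thesis's exact feature set is not reproduced.

```python
# Texture features from the 2-D Wavelet Packet Transform: one energy
# value per subband at the chosen level. Wavelet and level are assumed.
import numpy as np
import pywt

def wpt_energy_features(image, wavelet="db2", level=2):
    wp = pywt.WaveletPacket2D(data=image, wavelet=wavelet, maxlevel=level)
    nodes = wp.get_level(level)              # all subbands at this level
    return np.array([np.mean(node.data ** 2) for node in nodes])

lesion = np.random.default_rng(1).random((64, 64))   # stand-in for a lesion image
features = wpt_energy_features(lesion)
print(len(features), "subband energies:", features[:4])
```

These texture energies, concatenated with color statistics and Fourier shape descriptors, would then form the input vector of the SVM classifier.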

Relevance:

20.00%

Publisher:

Abstract:

The concepts of industrial automation are being incorporated into the medical field; in other words, they are also being applied to hospital automation. In this direction, research has addressed several of the problems pertinent to processes that can be automated in the hospital environment. In automation processes, communication is an imperative factor, because the systems are usually distributed, so the data network becomes a key element in these processes: it must be capable of providing the exchange of data and of guaranteeing the demands imposed by the automation process. In this context, this doctoral thesis proposes, specifies, analyzes, and validates the Multicycles Protocol for Hospital Automation (MP-HA), which is customized to meet the demands of these automation processes, seeking to guarantee determinism in the communications and to optimize the utilization of the transmission medium.
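As a purely hypothetical rendering of the "multicycle" idea (the abstract does not detail MP-HA's scheduling), the sketch below serves devices on cycles that are multiples of a base elementary cycle, so urgent traffic gets deterministic service while the medium is not wasted on slow devices every cycle. Device names and periods are invented.

```python
# Hypothetical multicycle schedule: each device is served every p-th
# elementary cycle; the hyperperiod is the lcm of the periods.
from math import lcm

devices = {"bed_monitor": 1, "infusion_pump": 2, "asset_tag": 8}  # period in cycles

hyper = lcm(*devices.values())
for cycle in range(hyper):
    served = [d for d, p in devices.items() if cycle % p == 0]
    print(f"cycle {cycle}: {', '.join(served)}")
```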

Relevance:

20.00%

Publisher:

Abstract:

The usual programs for load flow calculation were, in general, developed to simulate electric energy transmission, subtransmission, and distribution systems. However, the mathematical methods and algorithms used by those formulations were mostly based on the characteristics of transmission systems, which were the main focus of engineers and researchers, and the physical characteristics of transmission systems are quite different from those of distribution systems. In transmission systems, the voltage levels are high and the lines are generally very long, so the capacitive and inductive effects that appear in the system have a considerable influence on the quantities of interest and must be taken into consideration. Also in transmission systems, the loads have a macro nature, representing, for example, cities, neighborhoods, or large industries; these loads are generally close to balanced, which reduces the need for three-phase load flow methodologies. Distribution systems, on the other hand, present different characteristics: the voltage levels are low compared with transmission levels, which practically cancels the capacitive effects of the lines, and the loads are transformers whose secondaries supply small consumers, in many cases single-phase ones, so the probability of finding an unbalanced circuit is high. The use of three-phase methodologies therefore assumes an important dimension. Moreover, equipment such as voltage regulators, which simultaneously use the concepts of phase and line voltage in their operation, require a three-phase methodology in order to simulate their real behavior. For the reasons above, a method for three-phase load flow calculation was first developed within the scope of this work, to simulate the steady-state behavior of distribution systems. To achieve this goal, the Power Summation Algorithm, already widely tested and approved by researchers and engineers for the simulation of radial electric energy distribution systems, mainly in single-phase representation, was used as the basis for the three-phase method. In our formulation, lines are modeled as three-phase circuits, considering the magnetic coupling between the phases, and the earth effect is taken into account through the Carson reduction. It is important to point out that, although the loads are normally connected to the transformer secondaries, the hypothesis of wye- or delta-connected loads on the primary circuit was also considered. To simulate voltage regulators, a new model was used, allowing the simulation of various configurations according to their real operation. Finally, the possibility of representing switches with current measurement at various points of the feeder was considered: the loads are adjusted during the iterative process so that the current at each switch converges to the measured value specified in the input data. In a second stage of the work, sensitivity parameters were derived from the described load flow, with the objective of supporting subsequent optimization processes. These parameters are found by calculating the partial derivatives of one variable with respect to another, in general involving voltages, losses, and reactive powers.
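For readers unfamiliar with the Power Summation Algorithm, a minimal single-phase rendering of its backward/forward sweep is sketched below: powers (loads plus branch losses) are accumulated from the leaves toward the source, then voltages are updated from the source outward. The four-bus feeder and per-unit values are illustrative assumptions; the thesis's method is the three-phase extension of this idea.

```python
# Single-phase backward/forward sweep (Power Summation) on a radial
# feeder: bus 0 is the source; branch k feeds bus k+1. Illustrative data.
import numpy as np

z = np.array([0.01 + 0.02j, 0.015 + 0.03j, 0.02 + 0.04j])       # branch impedances (pu)
s_load = np.array([0.0, 0.3 + 0.1j, 0.2 + 0.05j, 0.1 + 0.02j])  # bus loads (pu)
v = np.ones(4, dtype=complex)            # flat start, source fixed at 1.0 pu

for _ in range(20):
    # Backward sweep: accumulate downstream power (loads + branch losses).
    s_branch = np.zeros(3, dtype=complex)
    for k in range(2, -1, -1):
        downstream = s_load[k + 1] + (s_branch[k + 1] if k < 2 else 0)
        i = np.conj(downstream / v[k + 1])
        s_branch[k] = downstream + z[k] * abs(i) ** 2   # add branch loss
    # Forward sweep: update voltages from the source outward.
    for k in range(3):
        i = np.conj(s_branch[k] / v[k])
        v[k + 1] = v[k] - z[k] * i

print("bus voltage magnitudes (pu):", np.round(np.abs(v), 4))
```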
After the calculation of the sensitivity parameters is described, the Gradient Method is presented, using these parameters to optimize an objective function defined for each type of study. The first study concerns the reduction of technical losses in a medium-voltage feeder through the installation of capacitor banks; the second concerns the correction of the voltage profile through the installation of capacitor banks or voltage regulators. For loss reduction, the objective function is the sum of the losses in all parts of the system; for the correction of the voltage profile, it is the sum of the squared voltage deviations at each node with respect to the rated voltage. At the end of the work, results of the application of the described methods to several feeders are presented, to give insight into their performance and accuracy.
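The gradient step itself can be shown schematically: given sensitivity parameters dF/dq (derivatives of the objective F with respect to a control q, e.g., injected reactive power), the control is moved against the gradient. The quadratic objective below is only a stand-in for the loss or voltage-deviation functions of the thesis.

```python
# Schematic gradient descent on a stand-in objective, driven by a
# sensitivity (derivative) such as those obtained from the load flow.
def objective(q):                 # e.g. sum of squared voltage deviations
    return (q - 1.5) ** 2 + 0.3

def sensitivity(q):               # dF/dq, assumed available from the load flow
    return 2.0 * (q - 1.5)

q, step = 0.0, 0.2                # initial capacitor injection, step size
for _ in range(30):
    q -= step * sensitivity(q)    # gradient descent update

print(f"optimal injection ~ {q:.3f} pu, objective = {objective(q):.3f}")
```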

Relevance:

20.00%

Publisher:

Abstract:

This work deals with a mathematical foundation for digital signal processing from the point of view of interval mathematics. It addresses the open problem of precision and representation of data in digital systems by means of an interval version of signal representation. Signal processing is a rich and complex area, so this work restricts its focus to linear time-invariant systems. A vast literature exists in the area, but some concepts in interval mathematics need to be redefined or elaborated for the construction of a solid theory of interval signal processing. We construct basic foundations for signal processing in the interval setting, such as the basic properties of linearity, stability, and causality, together with an interval version of linear systems and their properties. Interval versions of convolution and of the Z-transform are presented, and analyses of system convergence using the interval Z-transform, an essentially interval-valued distance, interval complex numbers, and an application to an interval filter are given.
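The flavor of interval signal processing can be conveyed with a small sketch: samples are intervals and convolution is carried out with interval arithmetic, so each output interval encloses every value attainable under the input uncertainty. The Interval class below is a minimal stand-in, not the thesis's definitions.

```python
# Minimal interval arithmetic and an interval convolution. The class
# implements only the operations the example needs.
class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def __add__(self, o):
        return Interval(self.lo + o.lo, self.hi + o.hi)
    def __mul__(self, o):
        p = [self.lo * o.lo, self.lo * o.hi, self.hi * o.lo, self.hi * o.hi]
        return Interval(min(p), max(p))
    def __repr__(self):
        return f"[{self.lo:.3f}, {self.hi:.3f}]"

def interval_convolve(x, h):
    y = [Interval(0.0, 0.0) for _ in range(len(x) + len(h) - 1)]
    for n, xn in enumerate(x):
        for k, hk in enumerate(h):
            y[n + k] = y[n + k] + xn * hk
    return y

x = [Interval(0.9, 1.1), Interval(-0.1, 0.1)]      # uncertain samples
h = [Interval(0.5, 0.5), Interval(0.25, 0.25)]     # exact filter taps
print(interval_convolve(x, h))
```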

Relevance:

20.00%

Publisher:

Abstract:

Internet applications such as media streaming, collaborative computing, and massively multiplayer games are on the rise. This creates the need for multicast communication, but group communication support based on IP multicast has unfortunately not been widely adopted, due to a combination of technical and non-technical problems. Therefore, a number of application-layer multicast schemes have been proposed in the recent literature to overcome these drawbacks. In addition, these applications often behave as both providers and clients of services, being called peer-to-peer applications, and their participants come and go very dynamically. Server-centric architectures for membership management thus suffer from well-known problems of scalability and fault tolerance, and even traditional peer-to-peer solutions need some mechanism that takes member volatility into account. The idea of location awareness is to distribute the participants in the overlay network according to their proximity in the underlying network, allowing better performance. In this context, this thesis proposes an application-layer multicast protocol, called LAALM, which takes the actual network topology into account when assembling the overlay network. The membership algorithm uses a new metric, IPXY, to provide location awareness through the processing of local information, and it was implemented using a distributed, shared, bi-directional tree. The algorithm also has a sub-optimal heuristic to minimize the cost of the membership process. The protocol has been evaluated in two ways: first, through a simulator developed in this work, in which the quality of the distribution tree was evaluated by metrics such as out-degree and path length; second, through real-life scenarios built in the ns-3 network simulator, in which the protocol's network performance was evaluated by metrics such as stress, stretch, time to first packet, and group reconfiguration time.
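A schematic of location-aware tree membership is sketched below: a joining peer attaches to the candidate parent that is closest under a network-distance metric and still has spare out-degree. The distance function is only a placeholder for LAALM's IPXY metric, whose details the abstract does not give; addresses and the out-degree limit are assumptions.

```python
# Location-aware join: pick the nearest non-saturated parent in the tree.
def distance(a, b):                 # stand-in for the IPXY proximity metric
    return sum(abs(x - y) for x, y in zip(a, b))

def choose_parent(candidates, newcomer, max_outdegree=4):
    eligible = [c for c in candidates if len(c["children"]) < max_outdegree]
    return min(eligible, key=lambda c: distance(c["addr"], newcomer))

tree = [{"addr": (10, 0, 0, 1), "children": []},
        {"addr": (10, 0, 3, 7), "children": [1, 2, 3, 4]}]  # second node is full
parent = choose_parent(tree, newcomer=(10, 0, 0, 9))
parent["children"].append("new-peer")
print("joined under", parent["addr"])
```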

Relevance:

20.00%

Publisher:

Abstract:

This thesis proposes the specification and performance analysis of a real-time communication mechanism for the IEEE 802.11/11e standards, called Group Sequential Communication (GSC). The GSC handles small data packets better than the HCCA mechanism by adopting decentralized medium access control with a publish/subscribe communication scheme. The main objective is to reduce the HCCA overhead of the Poll, ACK, and QoS Null frames exchanged between the Hybrid Coordinator and the polled stations. The GSC eliminates the polling scheme of the HCCA scheduling algorithm through a Virtual Token Passing procedure among the members of the real-time group, to whom high-priority, sequential access to the communication medium is granted. To improve the reliability of the proposed mechanism over a noisy channel, an error recovery scheme called the second chance algorithm is presented; it is based on a block acknowledgment strategy that allows the retransmission of missed real-time messages. The GSC mechanism thus sustains real-time traffic across many IEEE 802.11/11e devices with optimized bandwidth usage and minimal delay variation for data packets in the wireless network. To validate the communication scheme, the GSC and HCCA mechanisms were implemented in network simulation software developed in C/C++ and their performance results were compared. The experiments show the efficiency of the GSC mechanism, especially in industrial communication scenarios.
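A toy rendering of Virtual Token Passing is given below: group members hold an agreed transmission order and each transmits in its slot without being polled, which is what eliminates the Poll/ACK/QoS-Null exchanges. Member names, slot handling, and timing are assumptions; the real mechanism operates on 802.11 frames.

```python
# One sequential high-priority round: the token is implicit in the
# agreed member order, so no Poll frames are needed.
members = ["sensor-A", "sensor-B", "actuator-C"]   # agreed group order

def gsc_round(members, has_data):
    for station in members:                # virtual token visits each member
        if has_data.get(station):
            print(f"{station}: transmits real-time frame")
        else:
            print(f"{station}: silent slot, token passes on")

gsc_round(members, has_data={"sensor-A": True, "actuator-C": True})
```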

Relevance:

20.00%

Publisher:

Abstract:

This study presents the development of a simplified grid representation model applied in a hybrid load flow for calculating the steady-state voltage variations caused by wind farms on the power system. It also proposes an optimal load flow capable of controlling the power factor at the connection bus and minimizing losses. The analysis, led by the wind producer, is based on technical data supplied by the grid operator. The proposed grid simplification requires knowledge only of the data of the internal network, that is, of the part of the network of interest to the analysis; in this way, the work intends to provide support for systematizing the relations between the sector agents. The proposed simplified network model identifies the internal network, the external network, and the boundary buses from a vulnerability study of the network, assigning to the boundary buses net injected powers or slack-bus models. The model was applied with the Newton-Raphson method and with a hybrid load flow composed of the Gauss-Seidel Zbarra method and the Power Summation method. Finally, the results obtained in a computational environment developed in SCILAB and FORTRAN are presented, with their respective analysis and conclusions, and compared with ANAREDE.
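As a stand-in for one ingredient of such hybrid methods, the sketch below runs a minimal Gauss-Seidel power-flow iteration on a two-bus system. The line data and load are illustrative per-unit assumptions; the Zbarra and Power Summation details of the thesis are not reproduced.

```python
# Gauss-Seidel iteration V_i = (conj(S_i/V_i) - sum_{j!=i} Y_ij V_j) / Y_ii
# on a 2-bus system: bus 0 is the slack, bus 1 carries a load.
import numpy as np

y12 = 1.0 / (0.02 + 0.08j)                 # line admittance
Y = np.array([[y12, -y12], [-y12, y12]])   # bus admittance matrix
v = np.array([1.0 + 0j, 1.0 + 0j])         # flat start; bus 0 fixed at 1.0 pu
s1 = -(0.4 + 0.15j)                        # injected power at bus 1 (a load)

for _ in range(50):
    v[1] = (np.conj(s1 / v[1]) - Y[1, 0] * v[0]) / Y[1, 1]

print(f"|V1| = {abs(v[1]):.4f} pu, angle = {np.degrees(np.angle(v[1])):.2f} deg")
```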

Relevance:

20.00%

Publisher:

Abstract:

In this research study, in which I discuss the discursive constitution of the ethnic-racial identity of Black male and female teachers, I understand that the process of identity formation covers both the personal/family and the social/professional spheres. In general terms, I propose to analyze the discursive practices present in the narratives of Black male and female teachers as they seek social insertion in different social contexts, identifying the points of resistance present in the construction of their ethnic-racial identities. The fundamental question that permeates the research is: how do Black male and female teachers behave discursively in the construction of ethnic-racial identities in multiple, distinct contexts? The theoretical foundations that support this work come from fields that complement one another, among them French Discourse Analysis, Foucault's theory, and cultural studies; even with their singularities, they are interlaced by a conception of language as social practice. Methodologically, I adopt an interpretative, qualitative paradigm to examine both the linguistic repertoires that compose these teachers' written narratives and the data generated by semi-structured interviews. The results show that the subjects, perceiving contrary forces that interfere in their process of social inclusion, make use of ascetic techniques to (re)signify their life histories.

Relevance:

20.00%

Publisher:

Abstract:

The human voice is an important communication tool, and any voice disorder can have profound implications for the social and professional life of an individual. Digital signal processing techniques have been used in the acoustic analysis of vocal disorders caused by laryngeal pathologies, owing to their simplicity and noninvasive nature. This work deals with the acoustic analysis of voice signals affected by laryngeal pathologies, specifically edema and nodules on the vocal folds. Its purpose is to develop a voice classification system to aid the pre-diagnosis of laryngeal pathologies, as well as the monitoring of pharmacological treatments and post-surgical follow-up. Linear Prediction Coefficients (LPC), Mel-Frequency Cepstral Coefficients (MFCC), and coefficients obtained through the Wavelet Packet Transform (WPT) are applied to extract relevant characteristics of the voice signal. For the classification task, the Support Vector Machine (SVM) is used; it builds optimal hyperplanes that maximize the margin of separation between the classes involved, each hyperplane being determined by the support vectors, which are subsets of the points of these classes. On the database used in this work, the results showed good performance, with an accuracy of 98.46% in the classification of normal versus pathological voices in general, and 98.75% in the classification between the two pathologies, edema and nodules.
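A sketch of one feature/classifier pipeline of this kind (MFCC statistics fed to an SVM) is shown below. The synthetic signals only stand in for real voice recordings, which would be loaded with librosa.load; the thesis also uses LPC and wavelet-packet features, omitted here.

```python
# MFCC summary features per recording, then an SVM. Synthetic "voices":
# a clean tone vs. a jittery, noisier one, as stand-ins for real data.
import numpy as np
import librosa
from sklearn.svm import SVC

def mfcc_features(signal, sr=16000, n_mfcc=13):
    mfcc = librosa.feature.mfcc(y=signal, sr=sr, n_mfcc=n_mfcc)
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])

sr = 16000
t = np.linspace(0, 1, sr, endpoint=False)
rng = np.random.default_rng(0)
normal = [np.sin(2 * np.pi * 120 * t) + 0.01 * rng.normal(size=t.size)
          for _ in range(5)]
pathological = [np.sin(2 * np.pi * (120 + 15 * rng.normal()) * t)
                + 0.2 * rng.normal(size=t.size) for _ in range(5)]

X = np.array([mfcc_features(s.astype(np.float32), sr)
              for s in normal + pathological])
y = np.array([0] * 5 + [1] * 5)           # 0 = normal, 1 = pathological
clf = SVC(kernel="rbf").fit(X, y)
print("training accuracy:", clf.score(X, y))
```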

Relevance:

20.00%

Publisher:

Abstract:

Hospital automation has been the subject of much research addressing relevant issues that can be automated, such as management and control (electronic medical records, appointment scheduling, hospitalization, among others); communication (tracking of patients, staff, and materials); development of medical, hospital, and laboratory equipment; monitoring (of patients, staff, and materials); and support for medical diagnosis (according to each specialty). This thesis presents an architecture for patient monitoring and alert systems. The architecture is based on intelligent systems techniques and is applied to hospital automation, specifically to patient monitoring in the Intensive Care Unit (ICU). Its main goal is to transform the data from multiparameter monitors into useful information, using specialist knowledge and the normal ranges of vital signs, encoded in fuzzy logic, to extract information about the clinical condition of ICU patients and to provide a pre-diagnosis. Finally, alerts are dispatched to medical professionals whenever an abnormality is found during monitoring. After the validation of the architecture, the fuzzy inferences were used for the training and validation of an Artificial Neural Network to classify the cases that had been validated a priori with the fuzzy system.
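An illustrative fragment of the fuzzy stage is sketched below: triangular membership functions grade a vital sign (here, heart rate) into linguistic terms, and a crude rule turns the grades into an alert level. The ranges and the rule are assumptions, not the clinical knowledge base of the thesis.

```python
# Triangular membership functions and a toy alert rule for heart rate.
def tri(x, a, b, c):
    """Triangular membership function peaking at b, zero outside (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def heart_rate_alert(hr):
    low = tri(hr, 30, 45, 60)
    normal = tri(hr, 50, 75, 100)
    high = tri(hr, 90, 130, 180)
    # Rule sketch: alert strength is how strongly HR is "low" or "high".
    return max(low, high), {"low": low, "normal": normal, "high": high}

for hr in (72, 55, 120):
    alert, grades = heart_rate_alert(hr)
    print(f"HR={hr}: grades={grades}, alert level={alert:.2f}")
```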

Relevance:

20.00%

Publisher:

Abstract:

In this work we use interval mathematics to establish interval counterparts of the main tools used in digital signal processing. More specifically, the approach developed here addresses signals, systems, sampling, quantization, coding, and Fourier transforms. A detailed study of interval arithmetics that handle complex numbers is provided; they are: rectangular complex interval arithmetic, circular complex arithmetic, and interval arithmetic for polar sectors. This leads us to investigate properties that are relevant to the development of a theory of interval digital signal processing. It is shown that the sets IR and R(C), endowed with any correct arithmetic, are not algebraic fields, meaning that those sets do not behave like the real and complex numbers. An alternative to the notion of interval complex width is also provided, and the Kulisch-Miranker order is used to write complex numbers in interval form, enabling operations on the endpoints. The use of interval signals and systems is made possible by the representation of real and complex values in floating-point systems: if a number x ∈ ℝ is not representable in a floating-point system F, then it is mapped to an interval [x̲; x̄] such that x̲ is the largest number in F smaller than x and x̄ is the smallest number in F greater than x. This interval representation is the starting point for the definitions of interval signals and systems taking real or complex values, and it provides the extension of notions such as causality, stability, time invariance, homogeneity, additivity, and linearity to interval systems. The process of quantization is extended to its interval counterpart, and interval versions of quantization levels, quantization error, and encoded signal are then provided. It is shown that the interval quantization levels represent the complex quantization levels and that the classical quantization error ranges over the interval quantization error. An estimate for the interval quantization error and an interval version of the Z-transform (and hence of the Fourier transform) are provided. Finally, the results of a MATLAB implementation are given.
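The interval representation described above can be made concrete: a real number that is not exactly representable is replaced by the pair of adjacent representable numbers that bracket it. The sketch below does this for IEEE-754 doubles using math.nextafter; the choice of doubles as the system F is an assumption for illustration.

```python
# Enclose a real number x by the two nearest representable doubles.
import math
from fractions import Fraction

def enclosing_interval(x):
    """Return (lo, hi) with lo, hi representable and lo <= x <= hi."""
    fx = float(x)
    if fx == x:                       # exactly representable
        return (fx, fx)
    lo = fx if fx < x else math.nextafter(fx, -math.inf)
    hi = fx if fx > x else math.nextafter(fx, math.inf)
    return (lo, hi)

lo, hi = enclosing_interval(Fraction(1, 10))   # 0.1 is not a binary float
print(lo.hex(), hi.hex(), lo <= Fraction(1, 10) <= hi)
```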

Relevance:

20.00%

Publisher:

Abstract:

This work analyzes the behavior of the gas flow of plunger lift wells producing to well-testing separators on offshore production platforms, in order to establish a technical procedure to estimate the gas flow rate during the slug production period. The motivation for this work arose from the equipping of some wells with the plunger lift method by PETROBRAS in the Ubarana offshore field, located off the coast of Rio Grande do Norte State, where the produced fluids are measured in well-testing separators on the platform. The artificial lift method called plunger lift is used when the available reservoir energy is not high enough to overcome all the load losses needed to lift the oil from the bottom of the well to the surface continuously. The method consists, basically, of a free piston acting as a mechanical interface between the formation gas and the produced liquids, greatly increasing the well's lifting efficiency. A pneumatic control valve mounted on the flow line controls the cycles: when this valve opens, the plunger starts to move from the bottom of the well to the surface, lifting all the oil and gas above it until it reaches the well-testing separator, where the fluids are measured. The well-testing separator is used to measure all the volumes produced by the well during a certain period of time called a production test. In most cases, separators are designed to measure stabilized flow, in other words, reasonably constant flow, by using electronic level and pressure controllers (PLC) and by assuming a steady pressure inside the separator. With plunger lift wells, the liquid and gas flows at the surface are cyclical and unstable, which causes slugs inside the separator, mainly in the gas phase, and introduces significant errors in the measurement system (e.g., overrange errors). The gas flow analysis proposed in this work is based on two mathematical models used together: (i) a plunger lift well model proposed by Baruzzi [1], with later modifications by Bolonhini [2] to build a plunger lift simulator; and (ii) a two-phase (gas + liquid) separator model derived from the three-phase (gas + oil + water) separator model proposed by Nunes [3]. Based on these models and on field data collected from the well-testing separator of the PUB-02 platform (Ubarana field), it was possible to demonstrate that the output gas flow of the separator can be estimated with reasonable precision from the control signal of the pressure control valve (PCV). Several model structures from the MATLAB® System Identification Toolbox were analyzed to evaluate which one best fits the data collected in the field. For the validation of the models, the AIC criterion was used, as well as a variant of the cross-validation criterion. The ARX model fit the data best, so a recursive algorithm (RARX) was also evaluated with real-time data. The results were quite promising, indicating the viability of estimating the output gas flow rate of a plunger lift well producing to a well-testing separator from the built-in information of the control signal to the PCV.
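The identification step can be rendered compactly: an ARX model y[k] = a1·y[k-1] + a2·y[k-2] + b1·u[k-1] + e[k] fitted by least squares, where u would be the PCV control signal and y the separator gas flow. The synthetic data and model orders below are assumptions standing in for the platform measurements and the Toolbox fit.

```python
# Least-squares fit of an ARX(2,1) model on synthetic input/output data.
import numpy as np

rng = np.random.default_rng(0)
N = 500
u = (rng.random(N) > 0.7).astype(float)           # valve opening commands
y = np.zeros(N)
for k in range(2, N):                             # "true" plant, for demo only
    y[k] = 1.2 * y[k-1] - 0.4 * y[k-2] + 0.8 * u[k-1] + 0.02 * rng.normal()

# Regressor matrix Phi and solution of min ||y - Phi theta||.
Phi = np.column_stack([y[1:-1], y[:-2], u[1:-1]])
theta, *_ = np.linalg.lstsq(Phi, y[2:], rcond=None)
print("estimated [a1, a2, b1]:", np.round(theta, 3))
```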

Relevance:

20.00%

Publisher:

Abstract:

In oil recovery, rock heterogeneity has a huge impact on how fluids move in the field, defining how much oil can be recovered. Percolation theory, which describes phenomena in which geometry and connectivity are fundamental, is a very useful model for studying this variability. Percolation results are three-dimensional data that have no physical meaning until they are visualized as images or animations. Although many powerful and sophisticated visualization tools have been developed, they focus on the generation of planar 2D images. To interpret the data as they would appear in the real world, virtual reality techniques using stereo images can be used. In this work we propose an interactive and helpful tool, named ZSweepVR, based on virtual reality techniques, that allows a better comprehension of volumetric data generated by the simulation of dynamic percolation. The developed system is able to render images using two different techniques: surface rendering, accomplished with OpenGL directives, and volume rendering, accomplished with the ZSweep direct volume rendering engine. For volume rendering, we implemented an algorithm to generate stereo images. We also propose enhancements to the original percolation algorithm in order to obtain better performance. We applied the developed tools to a mature field database and obtained satisfactory results. The use of stereoscopic and volumetric images brought valuable contributions to the interpretation and cluster formation analysis in percolation, which could lead to better decisions about exploration and recovery processes in oil fields.
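A minimal site-percolation experiment of the kind whose output such a tool would visualize is sketched below: sites of a 3-D lattice are occupied with probability p, connected clusters are labeled, and a spanning check is made. The lattice size and p are arbitrary illustrative choices, not the thesis's dynamic percolation algorithm.

```python
# Site percolation on a 3-D lattice with cluster labeling.
import numpy as np
from scipy.ndimage import label

rng = np.random.default_rng(42)
p, n = 0.35, 32
lattice = rng.random((n, n, n)) < p          # occupied sites
clusters, n_clusters = label(lattice)        # 6-connected components

# A cluster spans along z if its label appears on both z-faces.
spanning = set(np.unique(clusters[0])) & set(np.unique(clusters[-1])) - {0}
print(f"{n_clusters} clusters; spanning along z: {bool(spanning)}")
```

The 3-D array `clusters` is the kind of volumetric data that ZSweepVR would render, by surface extraction or direct volume rendering.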