884 results for "continuous-time systems"
Resumo:
The goal of power monitoring in electrical power systems is to promote the reliability as well as the quality of electrical power. Therefore, this dissertation proposes a new power theory based on the wavelet transform for real-time estimation of RMS voltages and currents and of power quantities such as active power, reactive power, apparent power, and power factor. Accurate estimation of RMS and power values is important for many applications, such as the design and analysis of power systems, compensation devices for improving power quality, and energy-measuring instruments. Simulation and experimental results obtained with the proposed Maximal Overlap Discrete Wavelet Transform-based method were compared with the IEEE Standard 1459-2010 and with a commercial oscilloscope, respectively, presenting equivalent results. The proposed method performed well with compact-support mother wavelets, which is suitable for real-time applications.
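The quantities the method estimates (RMS voltage and current, active and apparent power, power factor) can be illustrated with a plain sample-based baseline; this is a minimal sketch, not the dissertation's MODWT-based estimator, and the waveform parameters are illustrative.

```python
import numpy as np

# Baseline (non-wavelet) estimation of the quantities the proposed method
# targets: RMS voltage/current, active power P, apparent power S and power
# factor, over an integer number of cycles of sampled waveforms.
def power_quantities(v, i):
    v = np.asarray(v, dtype=float)
    i = np.asarray(i, dtype=float)
    v_rms = np.sqrt(np.mean(v ** 2))   # RMS voltage
    i_rms = np.sqrt(np.mean(i ** 2))   # RMS current
    p = np.mean(v * i)                 # active power (mean of p(t))
    s = v_rms * i_rms                  # apparent power
    return v_rms, i_rms, p, s, p / s   # last value: power factor

# Example: 60 Hz voltage with the current lagging by 30 degrees
fs, f = 15360, 60.0
t = np.arange(int(fs // f) * 2) / fs   # exactly two full cycles
v = 311.0 * np.sin(2 * np.pi * f * t)
i = 10.0 * np.sin(2 * np.pi * f * t - np.pi / 6)
v_rms, i_rms, p, s, pf = power_quantities(v, i)
```

For a pure sinusoid the power factor reduces to the cosine of the phase lag, which provides a quick sanity check of the sample-based estimates.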
Resumo:
This work concerns a refinement of a suboptimal dual controller for discrete-time systems with stochastic parameters. The dual property means that the control signal is chosen so that estimation of the model parameters and regulation of the output signals are optimally balanced. The control signal is computed so as to minimize the variance of the output around a reference value one step ahead, with the addition of terms in the loss function. The idea is to add simple terms depending on the covariance matrix of the parameter estimates two steps ahead. An algorithm is used for the adaptive adjustment of the tuning parameter lambda at each step. The actual performance of the proposed controller is evaluated through Monte Carlo simulations.
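The flavor of the approach can be sketched for the scalar case y[t+1] = b·u[t] + e[t] with unknown gain b: recursive least squares estimates b, a cautious control law accounts for the estimate's variance, and a small probing excitation proportional to the parameter uncertainty stands in for the dual terms of the full method. All model values below are illustrative assumptions, not the dissertation's system.

```python
import numpy as np

# Cautious control + RLS for y[t+1] = b*u[t] + e[t], with a crude probing
# term (proportional to sqrt(P)) to avoid the "turn-off" phenomenon that
# the dual lambda-weighted covariance terms address in the actual method.
rng = np.random.default_rng(0)

def run_episode(b_true=2.0, y_ref=1.0, sigma=0.1, steps=200):
    b_hat, P = 0.5, 10.0                  # initial estimate and its variance
    sq_errors = []
    for _ in range(steps):
        # cautious law minimizing one-step output variance, plus probing
        u = b_hat * y_ref / (b_hat ** 2 + P) + 0.05 * np.sqrt(P) * rng.normal()
        y = b_true * u + sigma * rng.normal()      # plant response
        # recursive least-squares update of b_hat from the (u, y) pair
        k = P * u / (sigma ** 2 + P * u ** 2)
        b_hat += k * (y - b_hat * u)
        P *= 1 - k * u
        sq_errors.append((y - y_ref) ** 2)
    return np.mean(sq_errors[-50:])       # steady-state mean squared error

# Monte Carlo evaluation, as in the abstract
mse = np.mean([run_episode() for _ in range(20)])
```

As the covariance P shrinks, both the caution and the probing vanish and the law approaches certainty-equivalence control.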
Resumo:
High-intensity interval exercise has been described as an option for increasing physical activity, and its use has also been suggested in the therapeutic management of conditions such as diabetes mellitus and heart failure. However, knowledge of its physiological effects, and of the parameters that can ensure greater safety in interval exercise prescription, especially its effect on short- and medium-term (24 hours after exercise) recovery, still needs to be clarified. The objective of this study was to evaluate the effect of continuous and interval aerobic exercise on cardiac autonomic control, immediately and in the medium term (24 hours), by assessing heart rate variability (HRV). The present study is a randomized crossover clinical trial in which healthy young individuals with a low level of physical activity had their 24-hour HRV measured by a heart rate sensor and portable accelerometer (3D eMotion HRV, Kuopio, Finland) before and after continuous aerobic exercise (60-70% HRmax, 21 min) and interval exercise (cycles of 1 min at 80-90% HRmax and 2 min at 50-60% HRmax, total duration 21 min). HRV was measured in the time and frequency domains, and the sympathovagal balance was determined by the LF/HF ratio. The nonlinear evaluation was based on Shannon entropy. The data demonstrated delayed heart rate recovery immediately after exercise and lower HR after 24 hours compared to pre-intervention values, especially in the interval exercise group. There was a trend toward higher values of the indices representative of sympathetic predominance during the day in the interval exercise group, although without statistical significance. The study results help to clarify the effects of interval exercise over the following 24 hours, setting parameters for prescription and for further evaluation of groups with metabolic and cardiovascular diseases.
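The time-domain and nonlinear indices mentioned (variability measures and Shannon entropy of the RR-interval series) can be sketched as follows; the bin count and simulated RR series are illustrative choices, not the study's settings.

```python
import numpy as np

# Standard HRV indices from an RR-interval series (ms), including the
# Shannon entropy used in the study's nonlinear analysis.
def hrv_indices(rr_ms, bins=16):
    rr = np.asarray(rr_ms, dtype=float)
    sdnn = np.std(rr, ddof=1)                    # overall variability
    rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))   # short-term (vagal) index
    counts, _ = np.histogram(rr, bins=bins)
    prob = counts[counts > 0] / counts.sum()
    shannon = -np.sum(prob * np.log2(prob))      # Shannon entropy (bits)
    return sdnn, rmssd, shannon

# Example: simulated RR series around 800 ms (75 bpm) with mild variability
rng = np.random.default_rng(1)
rr = 800 + 50 * rng.standard_normal(300)
sdnn, rmssd, shannon = hrv_indices(rr)
```

Frequency-domain indices (LF, HF and their ratio) would additionally require resampling the RR series and estimating a power spectrum, which is omitted here.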
Resumo:
Binary systems are key environments for studying the fundamental properties of stars. In this work, we analyze 99 binary systems identified by the CoRoT space mission. From the study of the phase diagrams of these systems, our sample is divided into three groups: systems characterized by variability related to the binary eclipses; systems presenting strong modulations, probably due to the presence of spots on the stellar surface; and systems whose variability is associated with the expansion and contraction of the surface layers. For the eclipsing binaries, phase diagrams are used to classify their morphology, based on the study of equipotential surfaces. In this context, to determine the rotation period, identify the presence of active regions, investigate whether the star exhibits differential rotation, and study stellar pulsation, we apply the wavelet procedure. The wavelet transform has been used as a powerful tool in the treatment of a large number of problems in astrophysics. Through the wavelet transform, one can perform a time-frequency analysis of light curves that is rich in detail and contributes significantly to the study of phenomena associated with rotation, magnetic activity, and stellar pulsations. In this work, we apply the 6th-order Morlet wavelet, which offers high time and frequency resolution, and obtain local (energy distribution of the signal) and global (time integration of the local map) wavelet power spectra. Using the wavelet analysis, we identify thirteen systems with periodicities related to rotational modulation, besides the beating-pattern signature in the local wavelet map of five pulsating stars over the entire time span.
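The global wavelet power spectrum described (time integration of the local map, using a Morlet wavelet with w0 = 6) can be sketched with a numpy-only correlation against Morlet templates; the normalization is simplified relative to standard CWT packages, and the synthetic 5-day modulation below is an illustrative stand-in for a CoRoT light curve.

```python
import numpy as np

# Global Morlet (w0 = 6) wavelet power spectrum of a light curve, used to
# recover a rotational-modulation period. Simplified normalization.
def global_wavelet_power(flux, dt, periods, w0=6.0):
    n = len(flux)
    grid = (np.arange(n) - n / 2) * dt         # centered time grid
    flux = flux - flux.mean()
    power = []
    for per in periods:
        s = per * (w0 + np.sqrt(2 + w0 ** 2)) / (4 * np.pi)  # period -> scale
        psi = np.exp(1j * w0 * grid / s) * np.exp(-(grid / s) ** 2 / 2)
        # cross-correlate the light curve with the wavelet at this scale
        coef = np.convolve(flux, np.conj(psi[::-1]), mode='same') / np.sqrt(s)
        power.append(np.mean(np.abs(coef) ** 2))  # time-integrated (global)
    return np.asarray(power)

# Example: synthetic light curve with a 5-day rotational modulation
dt = 0.02                          # days per sample
t = np.arange(3000) * dt           # 60 days of observation
flux = 1.0 + 0.01 * np.sin(2 * np.pi * t / 5.0)
periods = np.linspace(2.0, 10.0, 41)
gwp = global_wavelet_power(flux, dt, periods)
best_period = periods[np.argmax(gwp)]
```

The scale-to-period conversion is the standard Morlet Fourier-factor relation, so the peak of the global spectrum lands at the modulation period.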
Resumo:
Sandstone-type reservoir rocks are commonly responsible for oil accumulation. Wettability is an important parameter for the physical properties of the reservoir, since it affects characteristics such as relative permeability to the aqueous phase, residual oil distribution in the reservoir, waterflood behavior, and crude oil recovery. This study applied different types of microemulsion systems (MES) to sandstone reservoirs and evaluated their influence on wettability and residual oil recovery. For this purpose, four microemulsions were prepared by varying the nature of the surfactant (ionic and nonionic). The microemulsions were then characterized by surface tension, density, particle diameter, and viscosity analyses in the temperature range of 30 °C to 70 °C. The studied oil was classified as light, and the sandstone rock came from the Botucatu formation. The influence of the microemulsion systems on sandstone wettability was studied by contact angle measurements, using as parameters the rock treatment time with the MES and the time after contact of the surface with brine, and monitoring the variation of the angle. The rock was initially oil-wet and had its wettability changed to mixed wettability after treatment with MES, showing a preference for water. Regarding rock-MES contact time, the rock wettability changed more when the contact time between the surface and the microemulsion systems was longer. A significant reduction in contact angle was observed only during the first 5 minutes of interaction between the treated surface and brine. The microemulsion systems based on the synthesized anionic, commercial cationic, commercial anionic, and commercial nonionic surfactants presented the best results, in that order. With regard to enhanced oil recovery performance, all systems recovered a significant percentage of oil, with the anionic systems presenting the best results. 
A recovery of 80% was reached, confirming the wettability study results, which pointed to the influence of this property on the interaction between fluids and reservoir rock, and to the ability of microemulsion systems to perform enhanced oil recovery in sandstone reservoirs.
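The contact-angle-to-wettability mapping and the recovery figure of merit can be expressed compactly; the threshold angles below are commonly cited ranges and are an assumption here, since different authors use slightly different cut-offs.

```python
# Illustrative wettability classification from the water contact angle
# (assumed thresholds: water-wet < 75 deg, intermediate/mixed 75-105 deg,
# oil-wet > 105 deg; conventions vary between authors).
def wettability(theta_deg):
    if theta_deg < 75:
        return "water-wet"
    if theta_deg <= 105:
        return "intermediate/mixed"
    return "oil-wet"

# Oil recovery factor, the figure of merit quoted in the abstract (~80%)
def recovery_factor(oil_recovered, oil_in_place):
    return 100.0 * oil_recovered / oil_in_place

cls = wettability(90.0)
rf = recovery_factor(8.0, 10.0)
```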
Resumo:
Resumo:
The increasing demand for electricity, the forecast of ever-decreasing fossil-fuel reserves, and growing environmental concern over their use have raised concerns about the quality of electricity generation, welcoming new investments in generation from alternative, clean, and renewable sources. Distributed generation is one of the main solutions for independent and self-sufficient generating systems, such as the sugarcane industry. This sector has grown considerably, contributing significantly to the production of electricity for the distribution networks. In this context, one of the main objectives of this study is to propose the implementation of an algorithm to detect islanding disturbances in the electrical system, characterized by under- or overvoltage situations. The algorithm should also quantify the time during which the system operated in these conditions, in order to assess the possible consequences for the electric power system. To achieve this, the wavelet multiresolution analysis (MRA) technique was used to detect the generated disturbances. The data obtained can be processed for possible predictive maintenance of the protection equipment of the electrical network, since such equipment is prone to damage under prolonged operation at abnormal frequency and voltage conditions.
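The quantification step (how long the system stayed in an under- or overvoltage condition) can be sketched as follows. The detector here is a plain sliding one-cycle RMS with ±10% thresholds, used only to illustrate the duration-counting logic; the abstract's actual detector is wavelet MRA, and the thresholds and waveform are assumptions.

```python
import numpy as np

# Flag under/overvoltage from a one-cycle sliding RMS and quantify the
# total time (in seconds) spent outside the 0.9-1.1 pu band.
def abnormal_duration(v, fs, f_nom=60.0, low=0.9, high=1.1, v_nom=1.0):
    n = int(fs / f_nom)                            # samples per cycle
    v = np.asarray(v, dtype=float)
    rms = np.sqrt(np.convolve(v ** 2, np.ones(n) / n, mode='valid'))
    abnormal = (rms < low * v_nom) | (rms > high * v_nom)
    return abnormal.sum() / fs                     # seconds outside limits

# Example: 1 pu, 60 Hz sine with a 50% voltage sag lasting 0.1 s
fs, f = 3840, 60.0
t = np.arange(int(0.5 * fs)) / fs
v = np.sqrt(2) * np.sin(2 * np.pi * f * t)
v[int(0.2 * fs):int(0.3 * fs)] *= 0.5              # the sag
dur = abnormal_duration(v, fs)
```

The measured duration slightly exceeds the true 0.1 s because the one-cycle RMS window smears the sag edges, which is the usual trade-off of RMS-based detection.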
Resumo:
This work presents a comparative study of two Soft Single-Switched Quadratic Boost Converters (SSS1 and SSS2) focused on Maximum Power Point Tracking (MPPT) of a PV array using the Perturb and Observe (P&O) algorithm. The proposed converters maintain the static gain and dynamic characteristics of the original converter, with the advantage of considerably reducing switching losses and electromagnetic interference (EMI). The work presents the input-voltage modeling of the Quadratic Boost converter; qualitative and quantitative analyses of the soft-switching converters, defining the operating principles, main waveforms, time intervals and state variables in each operating stage, the phase planes of the resonant elements, static voltage gain expressions, the analysis of voltage and current stresses in the semiconductors, and the operating curves from 200 W to 800 W. The design of PI, PID and PID+Notch compensators for the closed-loop MPPT system, as well as the design of the resonant elements, is also presented. In order to analyze the operation of a complete grid-connected photovoltaic system, a three-phase inverter was simulated using the p-q theory of three-phase instantaneous power. Finally, simulation and experimental results are presented, with the necessary comparative analysis of the proposed converters.
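The P&O algorithm driving the converters is simple enough to state in a few lines: keep perturbing the operating voltage in the direction that increased power, reverse otherwise. The parabolic P-V curve and step size below are toy assumptions standing in for a real array.

```python
# Minimal Perturb & Observe (P&O) MPPT step: if dP/dV > 0 we are left of
# the maximum power point, so increase the voltage; otherwise decrease it.
def po_step(v, p, v_prev, p_prev, dv=0.5):
    if (p - p_prev) * (v - v_prev) > 0:
        return v + dv
    return v - dv

def pv_power(v, v_mpp=30.0, p_max=600.0):
    # toy P-V curve with its maximum power point at v_mpp
    return max(0.0, p_max - 2.0 * (v - v_mpp) ** 2)

v_prev, p_prev = 20.0, pv_power(20.0)
v = 20.5
for _ in range(100):
    p = pv_power(v)
    v_next = po_step(v, p, v_prev, p_prev)
    v_prev, p_prev, v = v, p, v_next
```

Once the tracker reaches the maximum power point it oscillates around it with an amplitude set by the perturbation step dv, which is the well-known steady-state ripple of P&O.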
Resumo:
In this work, mathematical solutions were developed, taking maximum permissible intensity values as parameters for the analysis of electric- and magnetic-field interference, and two virtual computing systems supporting the CDMA and WCDMA technology families were produced. In the first family, computational resources were developed to solve electric-field, magnetic-field, and power-density calculations at radio base stations using CDMA technology in the 800 MHz band, taking into account the permissible values referenced by the International Commission on Non-Ionizing Radiation Protection (ICNIRP). The first family is divided into two calculation segments carried out in virtual operation. The first segment computes the field radiated by the base station from input data such as radio-channel power, antenna gain, number of radio channels, operating frequency, cable losses, directional attenuation, minimum distance, and reflections. This computing system allows one to obtain, quickly and without deploying measurement instruments, the following calculated values: effective radiated power; sector power density; electric field in the sector; magnetic field in the sector; magnetic flux density; and the maximum permissible exposure point for electric field and power density. The results are shown in charts for clear visualization of the sector power density, as well as the definition of the coverage area. The computational module also includes specification folders for the antennas, cables, and towers used in cellular telephony from the following manufacturers: RFS World, Andrew, Kathrein, and BRASILSAT. Several Internet links are provided to supplement the specifications of cables, antennas, etc. The second segment of the first family works with more variables, seeking to perform calculations quickly and safely, assisting in obtaining the radio-signal loss produced by the base station. 
This module displays screens representing propagation systems denominated "A" and "B". With propagation "A", radio-signal attenuation calculations are obtained for urban, dense urban, suburban, open, and rural area models. The reflection calculations include the reflection coefficients, the standing-wave ratio, the return loss, the reflected-power ratio, and the signal loss due to impedance mismatch. With propagation "B", radio-signal losses are obtained for line-of-sight and non-line-of-sight surveys, together with the effective area, the power density, the received power, the coverage radius, the conversion levels, and the gain of the radiating conversion systems. The second family of the virtual computing system consists of 7 modules, of which 5 are geared towards WCDMA design and 2 towards telephone-traffic calculation serving CDMA and WCDMA. It includes a portfolio of the radiating systems used at the site. In virtual operation, module 1 computes: frequency-reuse distance, channel capacity with and without noise, Doppler frequency, modulation rate, and channel efficiency. Module 2 computes the cell area, thermal noise, noise power (dB), noise figure, signal-to-noise ratio, and bit power (dBm). Module 3 computes: breakpoint, processing gain (dB), path loss from the BTS, noise power (W), chip period, and frequency-reuse factor. Module 4 computes effective radiated power, sectorization gain, voice activity, and load effect. Module 5 computes the processing gain (Hz/bps), bit time, and bit energy (Ws). Module 6 deals with telephone traffic 1 and computes: traffic volume, occupancy intensity, average occupancy time, traffic intensity, completed calls, and congestion. Module 7 deals with telephone traffic 2 and allows calculating completed and uncompleted calls in the busy hour (HMM). 
Field tests of mobile-network performance were performed for the calculation of data relating to: CINP, CPI, RSRP, RSRQ, EARFCN, Drop Call, Block Call, Pilot, Data BLER, RSCP, Short Call, Long Call, and Data Call; Ec/Io for Short Call and Long Call, and Data Call Throughput. Surveys of the electric and magnetic fields were also conducted at a base station, seeking to observe the degree of exposure to non-ionizing radiation to which the general public and occupational workers are subjected. The results were compared with the permissible health values endorsed by ICNIRP and CENELEC.
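The core field-and-power-density calculation of such exposure assessments can be sketched with the standard free-space far-field relations; these formulas are general textbook expressions assumed here, not extracted from the dissertation's modules, and the EIRP and distance are illustrative.

```python
import math

# Standard free-space estimates for exposure assessment:
#   power density  S = EIRP / (4*pi*d^2)       [W/m^2]
#   electric field E = sqrt(30 * EIRP) / d     [V/m], far field
def power_density(eirp_w, d_m):
    return eirp_w / (4 * math.pi * d_m ** 2)

def e_field(eirp_w, d_m):
    return math.sqrt(30.0 * eirp_w) / d_m

# Example: an 800 W EIRP sector antenna evaluated 50 m away
S = power_density(800.0, 50.0)    # W/m^2
E = e_field(800.0, 50.0)          # V/m
```

The two expressions are consistent through the free-space wave impedance: S = E^2 / (120*pi), which makes a convenient cross-check when comparing computed values against ICNIRP reference levels.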
Resumo:
Nowadays, the field of data coding cuts across several branches of engineering due to its great importance. With the exponential increase in the creation of digital data, the field of data compression has gained great visibility in this area. Compression algorithms are constantly developed and improved in order to obtain the highest possible data compression, with or without data loss, making it possible to sustain the rapid and constant growth of such data. One of the major problems of this type of algorithm is the large computational power that is sometimes necessary to obtain a good compression ratio while maintaining the quality of the data when decompressed. This document describes a strategy to reduce the impact of the computational power needed for image coding by using a heterogeneous implementation. The goal is to parallelize the sections that require high computational power, thereby reducing the time needed for data compression. This document is based on the implementation of this strategy for the MMP-Intra image coding algorithm. Using an initial theoretical analysis, we show that parallelizing the algorithm is viable and that high performance gains can be obtained. In order to prove that the MMP-Intra algorithm was parallelizable and to identify the real gains, an initial prototype was developed, which performed far worse than the original algorithm, requiring much more time to obtain the same results. Through an iterative optimization process, the prototype went through several refinement stages. The final refined prototype obtained results far superior to the sequential algorithm on which it was based, achieving performance up to four times that of the original.
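The parallelization strategy (dispatching independent per-block encodings to a worker pool) can be sketched as follows. The per-block "encode" here is a toy cost function standing in for MMP-Intra's expensive dictionary/rate-distortion search, and a real heterogeneous implementation would use processes or a GPU rather than threads.

```python
from concurrent.futures import ThreadPoolExecutor

# Image blocks whose encodings are independent are dispatched to a pool.
def encode_block(block):
    # stand-in for the expensive per-block search of MMP-Intra
    return sum(px * px for px in block)

def encode_image(blocks, workers=4):
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # map preserves block order, so the bitstream order is unchanged
        return list(pool.map(encode_block, blocks))

blocks = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
costs = encode_image(blocks)
```

Note that in real coders, dependencies between neighboring blocks (e.g. intra prediction) limit which blocks can be encoded concurrently, which is one reason the first prototype of such a parallelization can be slower than the sequential original.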
Resumo:
IT capability is an organizational ability to perform the activities of the IT function more effectively, and an important mechanism in value creation. Its building process (stages of creation and development) occurs through management initiatives to improve the performance of its activities, using human resources and complementary IT assets responsible for the evolution of its organizational routines. This research deals with the IT capabilities related to the SIG systems (integrated institutional management systems), built and deployed at UFRN (Universidade Federal do Rio Grande do Norte) for carrying out and controlling administrative, academic, and human resources activities. Since 2009, through cooperation agreements with federal institutions of direct administration and educational institutions, UFRN has supported the implementation of these systems, currently involving more than 30 institutions. The present study aims to understand how the IT capabilities relevant to the design, implementation, and dissemination of SIG were built over time. This is a single case study of a qualitative and longitudinal nature, performed by capturing, coding, and analyzing secondary data and semi-structured interviews conducted primarily with members of the Superintendência de Informática, the organizational unit responsible for the SIG systems at UFRN. As a result, the technical, internal relationship, and external cooperation capabilities were identified as relevant in the successful trajectory of the SIG systems, and they evolved in different ways. The technical capability, initiated in 2004, went through the stages of creation and development until it reached the stage of stability in 2013, due to technological limits. The internal relationship capability, begun in 2006, went through the stages of creation and development, extended its scope of activities in 2009, and has been in development since then. 
Unlike the standard life cycle observed in the literature, the external cooperation capability started with a burst of initiatives and routine developments in 2009, which then decreased and ceased in 2013 in order to stabilize the technological infrastructure already created for the cooperating institutions. The start of cooperation in 2009 was also identified as an important selection event, responsible for changing or creating evolution trajectories in all three capabilities. The most frequent improvement initiatives were of an organizational nature, and the internal planning activity was transformed across the routines of the three capabilities. Important resources and complementary assets were identified as essential for carrying out the initiatives, such as human resources (technical knowledge, for the technical and external cooperation capabilities, and business knowledge, for all of them) and IT assets (the iproject application for controlling development processes, and the wiki document repository). All these resources and complementary assets grew along with the capabilities, demonstrating their strategic value to SINFO/UFRN.
Resumo:
Forecasting is the basis for making strategic, tactical, and operational business decisions. In financial economics, several techniques have been used over the past decades to predict the behavior of assets. Thus, there are several methods to assist in the task of time series forecasting; however, conventional modeling techniques, such as statistical models and those based on theoretical mathematical models, have produced unsatisfactory predictions, increasing the number of studies on more advanced prediction methods. Among these, Artificial Neural Networks (ANN) are a relatively new and promising method for business forecasting that has attracted much interest in the financial environment and has been used successfully in a wide variety of financial modeling applications, in many cases proving superior to statistical ARIMA-GARCH models. In this context, this study aimed to examine whether ANNs are a more appropriate method for predicting the behavior of capital market indices than the traditional methods of time series analysis. For this purpose, a quantitative study was developed from financial and economic indices, and two supervised-learning feedforward ANN models were built, whose structures consisted of 20 inputs, 90 neurons in one hidden layer, and one unit in the output layer (Ibovespa). These models used backpropagation, a hyperbolic tangent sigmoid activation function, and a linear output function. 
Given the aim of analyzing the suitability of artificial neural networks for forecasting the Ibovespa, this analysis was performed by comparing results between the ANNs and a GARCH(1,1) time series predictive model. Once both methods (ANN and GARCH) were applied, the results were analyzed by comparing the forecasts with the historical data and by studying the forecast errors through the MSE, RMSE, MAE, standard deviation, Theil's U, and forecast encompassing tests. It was found that the models developed by means of ANNs had lower MSE, RMSE, and MAE than the GARCH(1,1) model, and the Theil's U test indicated that the three models have smaller errors than a naive forecast. Although the return-based ANN had lower precision-indicator values than the price-based ANN, the forecast encompassing test rejected the hypothesis that one model is better than the other, indicating that the ANN models have a similar level of accuracy. It was concluded that, for the data series studied, the ANN models provide a more appropriate Ibovespa forecast than the traditional time series models, represented by the GARCH model.
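The comparison metrics used above (MSE, RMSE, MAE, Theil's U) can be sketched directly; Theil's U is given here in one common form that compares model errors against a naive "no-change" forecast (U < 1 means the model beats the naive forecast), since definitions vary between authors. The series below is illustrative.

```python
import numpy as np

# Forecast-error metrics for comparing models such as ANN vs GARCH(1,1).
def forecast_metrics(actual, predicted):
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    err = actual - predicted
    mse = np.mean(err ** 2)
    rmse = np.sqrt(mse)
    mae = np.mean(np.abs(err))
    # naive "no-change" forecast: predict each value by the previous one
    naive_rmse = np.sqrt(np.mean(np.diff(actual) ** 2))
    theil_u = np.sqrt(np.mean(err[1:] ** 2)) / naive_rmse
    return mse, rmse, mae, theil_u

actual = [100, 102, 101, 105, 107]
predicted = [100, 101, 102, 104, 107]
mse, rmse, mae, theil_u = forecast_metrics(actual, predicted)
```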
Resumo:
This work aims to analyze risks related to information technology (IT) in data migration procedures. The case considered is that of ALEPH, an Integrated Library System (ILS) whose data were migrated to the Library Module of the software called Sistema Integrado de Gestão de Atividades Acadêmicas (SIGAA) at the Zila Mamede Central Library of the Federal University of Rio Grande do Norte (UFRN) in Natal, Brazil. The methodological procedure was a qualitative exploratory study, with a case study carried out at that library in order to better understand this phenomenon. Data were collected through semi-structured interviews applied to eleven (11) subjects employed at the library and at the Technology Superintendence of UFRN. Content analysis with a thematic review process was performed to examine the data. After the data migration, the interview results were linked to the analysis units and their system registers with category correspondence. The main risks detected were: data destruction; data loss; database communication failure; user response delay; and data inconsistency and duplicity. These elements have implications and generate disorders that affect external and internal system users, leading to stress, duplicated work, and hassles. Thus, some risk management measures were taken, such as adequate planning, central management support, and pilot test simulations. Among the advantages, this reduced risk, the occurrence of problems, and possible unforeseen costs, and made it possible to achieve organizational objectives, among others. It is therefore inferred that the risks present in database conversion in libraries exist and some are predictable; however, librarians are unaware of, or ignore, these risks and are not very concerned with identifying them in database conversion, although acknowledging them would minimize or even eliminate them. 
Another important aspect to consider is the scarcity of empirical research dealing specifically with this subject, which points to the need for new approaches in order to promote a better understanding of the matter in the corporate environment of information units.
Resumo:
Social interactions are frequently described as social exchanges. In the literature, social exchanges in multiagent systems are studied in several contexts, in which social relations are interpreted as social exchanges. Among the problems studied, a fundamental one discussed in the literature is the regulation of social exchanges, for example, the emergence of balanced exchanges over time, leading to social equilibrium and/or equilibrium/fairness behavior. In particular, the problem of regulating social exchanges is difficult when agents have incomplete information about the exchange strategies of the other agents, especially if the agents have different exchange strategies. This master's thesis proposes an approach to the self-regulation of social exchanges in multiagent systems, based on game theory. It proposes the Game of Self-Regulation of Social Exchange Processes (JAPTS), in an evolutionary and spatial version, where agents organized in a complex network can evolve their different social exchange strategies. The exchange strategies are defined through the parameters of a fitness function. We analyze the possibility of the emergence of equilibrium behavior when the agents, trying to maximize their adaptation through the fitness function, seek to increase the number of successful interactions. We consider a game of incomplete information, since the agents have no information about the strategies of the other agents. For the strategy learning process, an evolutionary algorithm is used, in which the agents, aiming to maximize their fitness function, act as self-regulators of the exchange processes enabled by the game, contributing to the increase in the number of successful interactions. Five different cases of society composition are analyzed. 
For some cases, a second type of scenario is also analyzed, in which the network topology is modified, representing some kind of mobility, in order to verify whether the results depend on the neighborhood. In addition, a third scenario is studied, in which an influence policy is established: the averages of the parameters defining the strategies adopted by the agents become public at certain moments of the simulation, and the agents that adopt the same exchange strategy, influenced by this, imitate those values. The model was implemented in NetLogo.
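The evolutionary imitation mechanism underlying such models can be sketched in a few lines: agents on a network each hold a strategy parameter, and at every step an agent copies the strategy of a fitter neighbor. Everything below (ring topology, single-parameter strategies, the fitness shape) is an illustrative assumption, not the JAPTS model itself, which was implemented in NetLogo.

```python
import random

# Toy imitation dynamics: agents on a ring copy fitter ring neighbors.
random.seed(42)

def fitness(strategy, target=0.7):
    # hypothetical: agents whose parameter is closer to the value that
    # yields successful exchanges are fitter
    return 1.0 - abs(strategy - target)

n = 20
strategies = [random.random() for _ in range(n)]
for _ in range(200):
    i = random.randrange(n)
    j = (i + random.choice([-1, 1])) % n       # a ring neighbor
    if fitness(strategies[j]) > fitness(strategies[i]):
        strategies[i] = strategies[j]          # imitate the fitter neighbor

avg_fitness = sum(fitness(s) for s in strategies) / n
best = max(strategies, key=fitness)
```

Because an agent only ever copies a strictly fitter neighbor, the population's average fitness is non-decreasing over the run, which is the minimal sense in which imitation "self-regulates" the society toward successful strategies.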