73 results for Complex Programmable Logic Device (CPLD)
Abstract:
In this article we describe several methods for the discretization of the differintegral operator s^α, where α = u + jv is a complex value. The concept of the conjugated-order differintegral is also introduced, which enables the use of complex-order differintegrals while still producing real-valued time responses and transfer functions. The performance of the resulting approximations is analysed in both the time and frequency domains. Several results are presented that demonstrate their utility in control system design.
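A quick numerical check of the conjugated-order idea: pairing s^α with its conjugate order s^ᾱ gives a frequency response with conjugate symmetry, H(-ω) = H(ω)*, which is exactly the condition for a real-valued impulse response. A minimal sketch (the order α = 0.6 + 0.4j is an arbitrary illustrative value, not taken from the article):

```python
def conjugated_order_response(omega, alpha):
    """Frequency response of the conjugated-order differintegral
    0.5 * (s**alpha + s**conj(alpha)), evaluated at s = j*omega."""
    s = 1j * omega
    return 0.5 * (s ** alpha + s ** alpha.conjugate())

alpha = 0.6 + 0.4j                       # illustrative complex order u + jv
H_pos = conjugated_order_response(2.0, alpha)
H_neg = conjugated_order_response(-2.0, alpha)
# Conjugate symmetry H(-w) == conj(H(w)) implies a real impulse response.
print(abs(H_neg - H_pos.conjugate()))    # numerically zero
```

The single complex-order term s^α alone does not have this symmetry, which is why the conjugated pair is needed to obtain real-valued time responses.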
Abstract:
This paper studies the dynamics of foot–ground interaction in hexapod locomotion systems. For that purpose the robot motion is characterized in terms of several locomotion variables, and the ground is modelled through a non-linear spring-dashpot system with parameters based on studies of soil mechanics. Moreover, an algorithm with foot-force feedback is adopted to control the robot locomotion. A set of model-based experiments reveals the influence of the locomotion velocity on the foot–ground transfer function, which presents complex-order dynamics.
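A non-linear spring-dashpot contact force of the kind described above can be sketched as follows; the stiffness K, damping B and exponent n below are purely illustrative placeholders, not the soil-mechanics parameters used in the paper:

```python
def ground_reaction(penetration, velocity, K=1.0e5, B=1.0e3, n=1.1):
    """Nonlinear spring-dashpot ground reaction force: the elastic term
    grows as penetration**n and the dissipative term is proportional to
    the penetration velocity. K, B and n are illustrative placeholders,
    not values from the paper."""
    if penetration <= 0.0:
        return 0.0                      # foot not in contact with the ground
    return K * penetration ** n + B * velocity
```

During stance, the penetration depth and its rate come from the integrated robot dynamics, and the returned force is what the foot-force feedback loop acts on.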
Abstract:
Dynamically reconfigurable systems have benefited from a new class of FPGAs recently introduced into the market, which allow partial and dynamic reconfiguration at run-time, enabling multiple independent functions from different applications to share the same device, swapping resources as needed. When the sequence of tasks to be performed is not predictable, resource allocation decisions have to be made on-line, fragmenting the FPGA logic space. A rearrangement may be necessary to get enough contiguous space to efficiently implement incoming functions, to avoid spreading their components and, as a result, degrading their performance. This paper presents a novel active replication mechanism for configurable logic blocks (CLBs), able to implement on-line rearrangements, defragmenting the available FPGA resources without disturbing those functions that are currently running.
Abstract:
Nowadays, power supplies include power factor correction, owing to the various regulatory standards that impose tight limits on total harmonic distortion (THD) and power factor (PF). This work covers the analysis, design and implementation of a power-factor pre-regulator with digital control. The digital control of converters using digital signal processing has been an object of research and development over recent years, with modifications to the existing topologies constantly emerging. The goal of this dissertation is to study and implement a boost rectifier pre-regulator and its digital control. The converter is controlled through the instantaneous average input-current technique, developed in the hardware description language VHDL (VHSIC HDL, Very High Speed Integrated Circuit Hardware Description Language) and implemented on a Spartan-3E FPGA (Field Programmable Gate Array). Mathematical analyses are presented to obtain the transfer functions needed for the design of the controllers. This control requires the acquisition of the input current, input voltage and output voltage signals. The control module outputs a PWM signal whose duty cycle varies over time. The design is simulated and validated on the MATLAB/Simulink and PSIM platforms, with results presented for steady state and for load and supply-voltage transients. Finally, the digitally controlled boost rectifier pre-regulator is implemented in the laboratory, and experimental results are presented to validate the developed methodology and design.
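In outline, the instantaneous average input-current technique is a two-loop controller: an outer voltage loop sets the amplitude of a current reference shaped like the rectified input voltage, and an inner current loop computes the PWM duty cycle from the current error. The sketch below is a software illustration of that structure only; the dissertation implements it in VHDL on the FPGA, and all gains, the sampling period and the class name are illustrative assumptions:

```python
class AvgCurrentPFC:
    """Sketch of average-current-mode control for a boost PFC
    pre-regulator (illustrative gains, not from the dissertation)."""

    def __init__(self, v_ref=400.0, kp_v=0.05, ki_v=10.0, kp_i=0.1, dt=1e-5):
        self.v_ref, self.kp_v, self.ki_v = v_ref, kp_v, ki_v
        self.kp_i, self.dt = kp_i, dt
        self.int_v = 0.0                 # integrator of the voltage loop

    def step(self, v_in_rect, v_out, i_in):
        # Outer voltage loop (PI) sets the current-reference amplitude.
        err_v = self.v_ref - v_out
        self.int_v += err_v * self.dt
        amplitude = self.kp_v * err_v + self.ki_v * self.int_v
        # The reference follows the rectified input-voltage shape, which
        # forces a sinusoidal input current (high power factor).
        i_ref = amplitude * v_in_rect
        # Inner current loop (P) computes the PWM duty cycle.
        duty = self.kp_i * (i_ref - i_in)
        return min(max(duty, 0.0), 1.0)  # duty cycle clamped to [0, 1]
```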
Abstract:
Fuzzy logic controllers (FLC) are intelligent systems, based on heuristic knowledge, that have been widely applied in numerous areas of everyday life. They can be used to describe a linear or nonlinear system, and are suitable when a real system is unknown or its model is too difficult to obtain. FLC provide a formal methodology for representing, manipulating and implementing human heuristic knowledge on how to control a system. These controllers can be seen as artificial decision makers that operate in a closed-loop system, in real time. The main aim of this work was to develop a single optimal fuzzy controller, easily adaptable to a wide range of systems – simple to complex, linear to nonlinear – and able to control all these systems. Due to their efficiency in searching for and finding optimal solutions to high-complexity problems, genetic algorithms (GAs) were used to perform the FLC tuning, finding the best parameters to obtain the best responses. The work was performed using the MATLAB/SIMULINK software, a very useful tool that provides an easy way to test and analyse the FLC, the PID and the GAs in the same environment. A fuzzy PID-type controller (FL-PID) was therefore proposed, namely the Fuzzy PD+I. This controller was compared with the classical PID controller tuned with the heuristic Ziegler-Nichols method, the optimal Zhuang-Atherton method and the GA method itself. The IAE, ISE, ITAE and ITSE criteria, used as the GA fitness functions, were applied to compare the performance of the controllers used in this work. Overall, and for most systems, the results for the FL-PID tuned with GAs were very satisfactory; moreover, in some cases they were substantially better than for the other PID controllers. The best system responses were obtained with the IAE and ITAE criteria used to tune the FL-PID and PID controllers.
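The four performance indices used as GA fitness functions have simple discrete approximations. A minimal sketch, assuming a uniform time grid and rectangular sums:

```python
def integral_criteria(t, e):
    """Discrete approximations of the IAE, ISE, ITAE and ITSE
    performance indices over an error signal e sampled at times t
    (uniform grid assumed; rectangular sums for simplicity)."""
    dt = t[1] - t[0]
    iae = sum(abs(ei) * dt for ei in e)                    # integral of |e|
    ise = sum(ei * ei * dt for ei in e)                    # integral of e^2
    itae = sum(ti * abs(ei) * dt for ti, ei in zip(t, e))  # time-weighted |e|
    itse = sum(ti * ei * ei * dt for ti, ei in zip(t, e))  # time-weighted e^2
    return iae, ise, itae, itse
```

In a GA tuning loop, each candidate parameter set is simulated, the closed-loop error is collected, and one of these indices serves as the (minimized) fitness; the time-weighted criteria penalize late, lingering errors more heavily, which is consistent with IAE and ITAE giving the best responses reported above.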
Abstract:
The main purpose of this work was the development of procedures for the simulation of atmospheric flows over complex terrain, using OpenFOAM. To this end, tools and procedures for preprocessing and data extraction were developed apart from this code, and thereafter applied in the simulation of a real case. For the generation of the computational domain, a systematic method able to translate the terrain elevation model into a native OpenFOAM format (blockMeshDict) was developed. The outcome was a structured mesh, in which the user has the ability to define the number of control volumes and their dimensions. With this procedure, the difficulties of case set-up and the high computational effort reported in the literature in association with snappyHexMesh, the OpenFOAM resource explored until then for this task, were considered to be overcome. The procedures developed for the generation of boundary conditions allowed the automatic creation of idealized inlet vertical profiles, the definition of wall-function boundary conditions and the calculation of internal-field first guesses for the iterative solution process, taking as input experimental data supplied by the user. The applicability of the generated boundary conditions was limited to the simulation of turbulent, steady-state, incompressible and neutrally stratified atmospheric flows, always resorting to RaNS (Reynolds-averaged Navier-Stokes) models. For the modelling of terrain roughness, the developed procedure allowed the user to define idealized conditions, such as a uniform aerodynamic roughness length or a value varying as a function of characteristic topography values, or to use real site data; it was complemented by the development of techniques for the visual inspection of the generated roughness maps. The non-inclusion of a forest canopy model limited the applicability of this procedure to low aerodynamic roughness lengths.
The developed tools and procedures were then applied in the simulation of a neutrally stratified atmospheric flow over the Askervein hill. The performed simulations evaluated the sensitivity of the solution to different convection schemes, mesh dimensions, ground roughness and formulations of the k-ε and k-ω models. When compared to experimental data, the calculated values showed good agreement of the speed-up at the hill top and on the lee side, with a relative error of less than 10% at a height of 10 m above ground level. Turbulent kinetic energy was considered to be well simulated on the windward side and at the hill top, and poorly predicted on the lee side, where a zone of flow separation was also identified. Despite the need for more work to evaluate the importance of the downstream recirculation zone in the quality of the gathered results, the agreement between the calculated and experimental values and the OpenFOAM sensitivity to the tested parameters were considered to be generally in line with the simulations presented in the reviewed bibliographic sources.
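The idealized inlet vertical profiles mentioned above are commonly the neutral log-law equilibrium profiles (the Richards–Hoxey form for k-ε inflow); whether the thesis uses exactly this formulation is an assumption, and the sample values below are illustrative:

```python
import math

KAPPA = 0.41  # von Karman constant


def log_law_u(z, u_star, z0):
    """Idealized neutral inlet velocity profile: u(z) = (u*/kappa) ln(z/z0),
    with friction velocity u* and aerodynamic roughness length z0."""
    return (u_star / KAPPA) * math.log(z / z0)


def log_law_k(u_star, c_mu=0.09):
    """Equilibrium turbulent kinetic energy, k = u*^2 / sqrt(C_mu)
    (height-independent in this idealization)."""
    return u_star ** 2 / math.sqrt(c_mu)


def log_law_eps(z, u_star):
    """Equilibrium dissipation rate, eps(z) = u*^3 / (kappa * z)."""
    return u_star ** 3 / (KAPPA * z)
```

Given measured wind speed at a reference height and a roughness length, u* can be back-solved from the first relation, after which the whole inlet boundary (u, k, ε) follows.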
Abstract:
Recent studies of mobile Web trends show a continuous explosion of mobile-friendly content. However, the increasing number and heterogeneity of mobile devices pose several challenges for Web programmers who want to automatically obtain the delivery context and adapt the content to mobile devices. In this process, the device detection phase assumes an important role, as an inaccurate detection could result in a poor mobile experience for the end-user. In this paper we compare the most promising approaches for mobile device detection. Based on this study, we present an architecture for a system to detect and deliver uniform m-Learning content to students in a Higher School. We focus mainly on the device capabilities repository, manageable and accessible through an API. We detail the structure of the capabilities XML Schema that formalizes the data within the device capabilities XML repository, and the REST Web Service API for selecting the corresponding device capabilities data according to a specific request. Finally, we validate our approach by presenting access and usage statistics of the mobile Web interface of the proposed system, such as hits and new visitors, mobile platforms, average time on site and bounce rate.
Abstract:
Recent studies of mobile Web trends show the continued explosion of mobile-friendly content. However, the large number and heterogeneity of mobile devices pose several challenges for Web programmers, who want to automatically obtain the delivery context and adapt the content to mobile devices. Hence, the device detection phase assumes an important role in this process. In this chapter, the authors compare the most used approaches for mobile device detection. Based on this study, they present an architecture for detecting and delivering uniform m-Learning content to students in a Higher School. The authors focus mainly on the XML device capabilities repository and on the REST Web Service API for dealing with device data. For the former, the authors detail the respective capabilities schema and present a new caching approach. For the latter, they present an extension of the current API. Finally, the authors validate their approach by presenting the overall data and statistics collected through the Google Analytics service, in order to better understand the adherence to the mobile Web interface, its evolution over time, and its main weaknesses.
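The device-detection lookup at the core of both works can be sketched in miniature: match the incoming HTTP User-Agent against a capabilities repository and fall back to a default entry. The in-memory dictionary below is purely illustrative; the described system keeps the capabilities in an XML repository behind a REST Web Service API:

```python
def detect_device(user_agent, repository):
    """Minimal sketch of device detection: return the capabilities of the
    first repository entry whose token appears in the User-Agent string,
    or the 'default' entry when nothing matches (illustrative only)."""
    for token, capabilities in repository.items():
        if token != 'default' and token in user_agent:
            return capabilities
    return repository.get('default')


# Hypothetical repository: tokens and capability fields are made up.
repo = {
    'Android': {'screen': 'small', 'markup': 'html5'},
    'iPhone': {'screen': 'small', 'markup': 'html5'},
    'default': {'screen': 'unknown', 'markup': 'xhtml'},
}
print(detect_device('Mozilla/5.0 (Linux; Android 9; ...)', repo))
```

A real repository holds far richer capability data per device, which is why the caching approach mentioned above matters for lookup cost.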
Abstract:
The aim of this work was to assess the influence of meteorological conditions on the dispersion of particulate matter from an industrial zone into urban and suburban areas. The particulate matter concentration was related to the most important meteorological variables, such as wind direction, velocity and frequency. A coal-fired power plant, with two stacks 225 m in height, was considered to be the main emission source. The middle point between the two stacks was taken as the centre of two concentric circles of 6 and 20 km radius delimiting the sampling area, and about 40 sampling collectors were placed within this area. Meteorological data were obtained from a portable meteorological station placed approximately 1.7 km SE of the stacks; additional data were obtained from the electrical company that runs the coal power plant. These data cover the years from 2006 to the present. A detailed statistical analysis was performed to identify the most frequent meteorological conditions, concerning mainly wind speed and direction. This analysis revealed that the most frequent winds blow from Northwest and North, and that the strongest winds blow from Northwest. Particulate matter deposition was measured in two sampling campaigns, carried out in summer and in spring. For the first campaign the monthly average deposition flux was 1.90 g/m², and for the second campaign this value was 0.79 g/m². Wind dispersion occurred predominantly from North to South, away from the nearest residential area, located about 6 km Northwest of the stacks. Nevertheless, the higher deposition fluxes occurred in the NW/N and NE/E quadrants. This study considered only the contribution of particulate matter from coal combustion; however, other sources, such as road traffic, may be present as well. Additional chemical analyses and microanalyses are needed to link the sources to the deposition flux levels.
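The wind-direction frequency analysis described above amounts to binning direction records into compass sectors; a minimal sketch (the eight-sector convention, with 0° = North and sectors centred on the cardinal/intercardinal directions, is an illustrative choice):

```python
def sector_frequencies(directions_deg, n_sectors=8):
    """Bin wind-direction records (degrees, 0 = North, clockwise) into
    compass sectors centred on N, NE, E, ... and return the relative
    frequency of each sector."""
    width = 360.0 / n_sectors
    counts = [0] * n_sectors
    for d in directions_deg:
        # Shift by half a sector so e.g. 350..10 deg all land in the N bin.
        counts[int(((d % 360.0) + width / 2) // width) % n_sectors] += 1
    total = len(directions_deg)
    return [c / total for c in counts]
```

Applying this to the station records per season, together with the corresponding speeds, yields the wind-rose statistics from which the dominant NW/N directions were identified.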
Abstract:
Reconfigurable computing experienced a considerable expansion in the last few years, due in part to the fast run-time partial reconfiguration features offered by recent SRAM-based Field Programmable Gate Arrays (FPGAs), which allowed the implementation in real-time of dynamic resource allocation strategies, with multiple independent functions from different applications sharing the same logic resources in the space and temporal domains. However, when the sequence of reconfigurations to be performed is not predictable, the efficient management of the logic space available becomes the greatest challenge posed to these systems. Resource allocation decisions have to be made concurrently with system operation, taking into account function priorities and optimizing the space currently available. As a consequence of the unpredictability of this allocation procedure, the logic space becomes fragmented, with many small areas of free resources failing to satisfy most requests and so remaining unused. A rearrangement of the currently running functions is therefore necessary, so as to obtain enough contiguous space to implement incoming functions, avoiding the spreading of their components and the resulting degradation of system performance. A novel active relocation procedure for Configurable Logic Blocks (CLBs) is herein presented, able to carry out online rearrangements, defragmenting the available FPGA resources without disturbing functions currently running.
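The effect of fragmentation and of an on-line rearrangement can be illustrated on a one-dimensional toy model of CLB columns: fragmentation leaves many free columns but only short contiguous runs, while compaction restores one large free region. This sketch deliberately ignores the routing and state-preservation issues the actual active relocation procedure must handle:

```python
def largest_free_run(columns):
    """Length of the largest contiguous run of free CLB columns
    (True = occupied). Fragmentation shows up as plenty of free columns
    but only a small largest run."""
    best = run = 0
    for occupied in columns:
        run = 0 if occupied else run + 1
        best = max(best, run)
    return best


def compact(columns):
    """Idealized rearrangement: slide all running functions to one side,
    leaving every free column contiguous on the other."""
    used = sum(columns)
    return [True] * used + [False] * (len(columns) - used)


fragmented = [True, False, True, False, True, False]
print(largest_free_run(fragmented))           # short runs: can't host a 3-wide function
print(largest_free_run(compact(fragmented)))  # one large region after compaction
```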
Abstract:
A chromatographic separation of the active ingredients of Combivir, Epivir, Kaletra, Norvir, Prezista, Retrovir, Trivizir, Valcyte, and Viramune is performed by thin-layer chromatography. The spectra of these nine drugs were recorded using Fourier transform infrared spectroscopy, and this information was then analysed by means of the cosine correlation. Present-day computer tools make it possible to visualize the comparison of the infrared spectra under the adopted similarity measure, and the emerging clusters provide additional information about the similarities within the investigated set of complex drugs.
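The cosine correlation used as the similarity measure is simply the normalized dot product of two spectra sampled on the same wavenumber grid; a minimal sketch:

```python
import math


def cosine_similarity(a, b):
    """Cosine correlation between two absorbance spectra sampled on the
    same wavenumber grid: 1.0 means identical spectral shape regardless
    of overall intensity, 0.0 means orthogonal spectra."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)
```

Computing this measure for every pair of the nine spectra yields a similarity matrix, from which the clusters mentioned above can be visualized.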
Abstract:
Technology is present in almost every aspect of people's daily life. Take, for instance, the smartphone. This device is usually equipped with a GPS module, which may be used as an orientation system if it carries the right functionalities. The problem is that these applications may be complex to operate and may not be within everybody's reach. Therefore, the main goal here is to develop an orientation system that may help people with cognitive disabilities in their day-to-day journeys when the caregivers are absent. In addition, to keep paid helpers aware of the current location of the disabled people, a localization system is also considered. Knowing their current locations, caregivers may engage in other activities without neglecting their prime work and, at the same time, make people with cognitive disabilities more independent.
Abstract:
Developing a client-server application for a mobile environment can bring many challenges because of the limitations of mobile devices. This paper therefore discusses what may be the most reliable way to exchange information between a server and an Android mobile application, since it is important for users to have an application that responds quickly and, preferably, works without any errors. In this discussion, two data transfer protocols (Socket and HTTP) and three data serialization formats (XML, JSON and Protocol Buffers) were tested, using several metrics to evaluate which is the most practical and fastest to use.
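The kind of metric used in such a comparison can be sketched for one of the three formats: serialized payload size and average serialization time for JSON (XML and Protocol Buffers would be measured the same way; the payload below is an arbitrary example, not one from the paper):

```python
import json
import time


def measure_json(payload, n=1000):
    """Serialize `payload` to JSON n times; return the serialized size
    in bytes and the average serialization time per call."""
    start = time.perf_counter()
    for _ in range(n):
        data = json.dumps(payload)
    elapsed = time.perf_counter() - start
    return len(data.encode('utf-8')), elapsed / n


size, avg_t = measure_json({'id': 1, 'name': 'a'})
print(size)   # 22 bytes for this payload
```

Comparable harnesses for the other formats, plus round-trip (serialize, transfer, deserialize) timings over Socket and HTTP, give the figures on which such a comparison rests.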
Abstract:
All over the world, the liberalization of electricity markets, which follows different paradigms, has created new challenges for those involved in this sector. In order to respond to these challenges, electric power systems underwent a significant restructuring of their mode of operation and planning, which resulted in a considerable increase in the competitiveness of the electric sector. In particular, the Ancillary Services (AS) market has been the target of constant renewal of its operation mode, as it is the market for trading the services whose main objective is to ensure the operation of electric power systems with appropriate levels of stability, safety, quality, equity and competitiveness. With the increasing penetration of distributed energy resources, including distributed generation, demand response, storage units and electric vehicles, it is thus essential to develop new, smarter and hierarchical methods of operating electric power systems. As these resources are mostly connected to the distribution network, it is important to consider their introduction into AS delivery, in order to achieve greater reliability and cost efficiency in the operation of electrical power systems. The main contribution of this work is the design and development of mechanisms and methodologies for the AS market and for the joint energy and AS market, considering different management entities for the transmission and distribution networks. The models developed in this work consider the most common AS in the liberalized market environment: Regulation Down, Regulation Up, Spinning Reserve and Non-Spinning Reserve.
The presented models consider different rules and modes of operation, such as the division of the market into network areas, which allows the congestion management of the interconnections between areas, or the ancillary-service cascading process, which allows AS of superior quality to replace AS of lower quality, ensuring a better economic performance of the market. A major contribution of this work is the development of an innovative market-clearing methodology for the joint energy and AS market, able to ensure viable and feasible solutions in markets where there are technical constraints in the transmission network involving its division into areas or regions. The proposed method is based on the determination of Bialek topological factors and considers the contribution to network congestion of the dispatch of all generation-increase services (energy, Regulation Up, Spinning and Non-Spinning Reserves). The use of Bialek factors in each iteration of the proposed methodology allows the bids in the market to be limited while ensuring that the solution is feasible in any context of system operation. Another important contribution of this work is the modelling of the contribution of distributed energy resources to ancillary services. To this end, a Virtual Power Player (VPP) is considered, in order to aggregate, manage and interact with distributed energy resources. The VPP manages all the aggregated agents and is able to supply AS to the system operator, with the main purpose of participating in the electricity market. In order to ensure their participation in the AS, the VPP should have a set of contracts with the agents that include diversified rules adapted to each kind of distributed resource. All the methodologies developed and implemented in this work have been integrated into the MASCEM simulator, a multi-agent-based simulator that allows the complex operation of electricity markets to be studied.
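The ancillary-service cascading process can be sketched as a merit-order clearing in which capacity offered for a higher-quality service, if left undispatched, remains available to cover lower-quality requirements. Service names, prices and quantities below are illustrative only, not data from the dissertation:

```python
def cascade_clearing(requirements, offers):
    """Sketch of AS cascading. `requirements` is a list of
    (service, MW) pairs ordered from highest to lowest quality;
    `offers` maps each service to a list of (price, MW) bids.
    Undispatched capacity cascades down to lower-quality services."""
    pool, dispatch = [], {}
    for service, need in requirements:
        pool.extend(offers.get(service, []))  # carry surplus capacity down
        pool.sort()                           # cheapest offers first
        taken, remaining, new_pool = [], need, []
        for price, mw in pool:
            if remaining > 0:
                use = min(mw, remaining)
                taken.append((price, use))
                remaining -= use
                if mw > use:
                    new_pool.append((price, mw - use))
            else:
                new_pool.append((price, mw))
        pool = new_pool
        dispatch[service] = taken
    return dispatch


# A cheap spinning-reserve offer covers the non-spinning requirement too.
result = cascade_clearing(
    [('spinning', 10), ('non_spinning', 10)],
    {'spinning': [(5.0, 20)], 'non_spinning': [(8.0, 10)]},
)
print(result)
```

Here the 20 MW spinning offer at 5.0 serves both requirements, displacing the more expensive non-spinning bid, which is the economic benefit the cascading process provides.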
In this way, the developed methodologies allow the simulator to cover more operation contexts of the present and future electricity market. This dissertation thus offers a substantial contribution to AS market simulation, based on models and mechanisms currently used in several real markets, as well as introducing innovative market-clearing methodologies for the joint energy and AS market. Five case studies are presented, each consisting of multiple scenarios. The first case study illustrates the AS market simulation considering several bids from market players. The joint energy and ancillary services market simulation is presented in the second case study. The third case study develops a comparison between the joint market methodology, in which player bids for ancillary services are considered by network areas, and a reference methodology. The fourth case study presents the simulation of the joint market methodology based on Bialek topological distribution factors, applied to a transmission network with 7 buses managed by a TSO. The last case study presents a joint market model simulation that considers the aggregation of small players into a VPP, as well as the complex contracts related to these entities; it comprises a distribution network with 33 buses managed by the VPP, which aggregates several kinds of distributed resources, such as photovoltaic, CHP, fuel cells, wind turbines, biomass, small hydro, municipal solid waste, demand response, and storage units.