51 results for Classical methods
Abstract:
This paper presents the measurement, frequency-response modeling and identification, and the corresponding impulse time response of the human respiratory impedance and admittance. The investigated adult patient groups were healthy, diagnosed with chronic obstructive pulmonary disease, and diagnosed with kyphoscoliosis, respectively. The investigated children patient groups were healthy, diagnosed with asthma, and diagnosed with cystic fibrosis, respectively. Fractional order (FO) models are identified from the measured impedance to quantify the respiratory mechanical properties. Two methods are presented for obtaining and simulating the time-domain impulse response from FO models of the respiratory admittance: (i) the classical pole-zero interpolation proposed by Oustaloup in the early 1990s, and (ii) the inverse discrete Fourier transform (DFT). The results of the identified FO models for the respiratory admittance are presented by means of their average values for each group of patients. Subsequently, the impulse time response calculated from the frequency response of the averaged FO models is given by means of the two methods mentioned above. Our results indicate that both methods provide similar impulse response data. However, we suggest that the inverse DFT is a more suitable alternative to the high-order transfer functions obtained with the classical Oustaloup filter. Additionally, a power-law model is fitted to the impulse response data, emphasizing the intrinsic fractal dynamics of the respiratory system.
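As a rough illustration of the second approach, the following Python sketch samples a hypothetical constant-phase fractional-order impedance model (parameter values are illustrative, not those identified in the paper), inverts it to an admittance, and recovers an impulse response with the inverse DFT:

    import numpy as np

    # Hypothetical constant-phase FO respiratory impedance:
    #   Z(jw) = R + L*(jw)**alpha + 1/(C*(jw)**beta)
    # The admittance Y = 1/Z is sampled on a frequency grid and converted
    # to an impulse response with the inverse DFT.
    R, L, alpha = 2.0, 0.01, 0.6      # illustrative values only
    C, beta = 0.05, 0.9               # illustrative values only

    fs = 100.0                        # sampling frequency [Hz]
    N = 4096                          # number of DFT points
    f = np.fft.rfftfreq(N, d=1.0 / fs)  # one-sided frequency grid
    jw = 1j * 2 * np.pi * f
    jw[0] = 1e-9                      # avoid division by zero at DC

    Z = R + L * jw**alpha + 1.0 / (C * jw**beta)
    Y = 1.0 / Z                       # respiratory admittance

    # inverse DFT of the one-sided spectrum gives a real impulse response;
    # the factor fs approximates the continuous-time scaling
    h = np.fft.irfft(Y, n=N) * fs
    t = np.arange(N) / fs
    print(t[:5], h[:5])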
Abstract:
The goal of this study is the analysis of the dynamical properties of financial data series from worldwide stock market indexes during the period 2000–2009. We analyze, under a regional criterion, ten main indexes at a daily time horizon. The methods and algorithms that have been explored for the description of dynamical phenomena become an effective background in the analysis of economic data. We start by applying the classical concepts of signal analysis, the fractional Fourier transform, and methods of fractional calculus. In a second phase we adopt the multidimensional scaling approach. Stock market indexes are examples of complex interacting systems for which a huge amount of data exists. Therefore, these indexes, viewed from different perspectives, lead to new classification patterns.
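A minimal Python sketch of the multidimensional scaling step, using synthetic daily returns in place of the ten real index series analysed in the study:

    import numpy as np
    from sklearn.manifold import MDS

    # Synthetic daily returns standing in for ten regional index series.
    rng = np.random.default_rng(0)
    names = [f"index_{k}" for k in range(10)]   # placeholder index names
    returns = rng.normal(size=(10, 2500))       # ~10 years of daily returns

    # correlation-based dissimilarity between every pair of indexes
    corr = np.corrcoef(returns)
    dist = np.sqrt(2.0 * (1.0 - corr))

    coords = MDS(n_components=2, dissimilarity="precomputed",
                 random_state=0).fit_transform(dist)
    for name, (x, y) in zip(names, coords):
        print(f"{name}: ({x:.3f}, {y:.3f})")

Indexes that behave similarly end up close together on the resulting two-dimensional map, which is the clustering effect the abstract refers to.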
Abstract:
Dissertation to obtain the degree of Master in Music - Artistic Interpretation
Abstract:
The synthesis and application of fractional-order controllers is now an active research field. This article investigates the use of fractional-order PID controllers in the velocity control of an experimental modular servo system. The system consists of a digital servomechanism and an open-architecture software environment for real-time control experiments using MATLAB/Simulink. Different tuning methods are employed, such as heuristics based on the well-known Ziegler-Nichols rules, techniques based on Bode's ideal transfer function, and optimization-based tuning methods. Experimental responses obtained from the application of the several fractional-order controllers are presented and analyzed. The effectiveness and superior performance of the proposed algorithms are also compared with classical integer-order PID controllers.
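For illustration, the sketch below evaluates the frequency response of a fractional-order PID, C(jω) = Kp + Ki/(jω)^λ + Kd·(jω)^μ, next to its integer-order counterpart; the gains and orders are hypothetical, not the tuned values reported in the article:

    import numpy as np

    # Frequency response of a fractional-order PID versus an integer-order PID.
    Kp, Ki, Kd = 1.0, 0.5, 0.1       # hypothetical gains
    lam, mu = 0.8, 0.6               # fractional integration/differentiation orders

    w = np.logspace(-2, 3, 500)      # frequency grid [rad/s]
    jw = 1j * w

    C_frac = Kp + Ki / jw**lam + Kd * jw**mu   # fractional-order PID
    C_int  = Kp + Ki / jw      + Kd * jw       # classical integer-order PID

    print(20 * np.log10(np.abs(C_frac))[::100])  # coarse Bode magnitude samples [dB]
    print(20 * np.log10(np.abs(C_int))[::100])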
Abstract:
On-chip debug (OCD) features are frequently available in modern microprocessors. Their contribution to shortening the time-to-market justifies the industry investment in this area, where a number of competing or complementary proposals are available or under development, e.g. NEXUS, CJTAG, IJTAG. The controllability and observability features offered by OCD infrastructures constitute a valuable toolbox that can be used well beyond the debugging arena, improving the return on investment by diluting its cost across a wider spectrum of application areas. This paper discusses the use of OCD features for validating fault-tolerant architectures and, in particular, the efficiency of various fault injection methods provided by enhanced OCD infrastructures. The reference data for our comparative study was captured on a workbench comprising the 32-bit Freescale MPC-565 microprocessor, an iSYSTEM IC3000 debugger (iTracePro version) and the Winidea 2005 debugging package. All enhanced OCD infrastructures were implemented in VHDL and the results were obtained by simulation within the same fault injection environment. The focus of this paper is on the comparative analysis of the experimental results obtained for various OCD configurations and debugging scenarios.
Abstract:
The goal of this study is the analysis of the dynamical properties of financial data series from worldwide stock market indices. We analyze the Dow Jones Industrial Average (^DJI) and the NASDAQ Composite (^IXIC) indexes at a daily time horizon. The methods and algorithms that have been explored for the description of physical phenomena become an effective background, and even inspiration, for very productive methods used in the analysis of economic data. We start by applying the classical concepts of signal analysis, Fourier transform, and methods of fractional calculus. In a second phase we adopt a pseudo phase plane approach.
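A minimal sketch of the pseudo phase plane construction (a time-delay embedding), applied here to a synthetic random-walk series standing in for the real daily index data:

    import numpy as np

    # Pseudo phase plane: plot the series against a delayed copy of itself.
    rng = np.random.default_rng(1)
    x = np.cumsum(rng.normal(size=2500))        # placeholder daily index series

    tau = 5                                     # embedding delay, in trading days
    ppp = np.column_stack((x[:-tau], x[tau:]))  # points (x(t), x(t + tau))
    print(ppp[:5])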
Abstract:
The Maxwell equations constitute a formalism for the development of models describing electromagnetic phenomena. The four Maxwell laws have been adopted successfully in many applications and involve only the integer-order differential calculus. Recently, a closer look at the cases of transmission lines, electrical motors and transformers, which reveal the so-called skin effect, motivated a new perspective towards the replacement of classical models by fractional-order mathematical descriptions. Bearing these facts in mind, this paper addresses the concept of static fractional electric potential. The fractional potential was suggested some years ago. However, the idea was not fully explored and practical methods of implementation were not proposed. In this line of thought, this paper develops a new approximation algorithm for establishing the fractional-order electrical potential and analyzes its characteristics.
Abstract:
Beyond the classical statistical approaches (determination of basic statistics, regression analysis, ANOVA, etc.), a new set of applications of different statistical techniques has increasingly gained relevance in the analysis, processing and interpretation of data concerning the characteristics of forest soils. This can be seen in some of the recent publications in the context of Multivariate Statistics. These new methods require additional care that is not always included or referred to in some approaches. In the particular case of geostatistical applications it is necessary, besides geo-referencing all data acquisition, to collect the samples on regular grids and in sufficient quantity so that the variograms can reflect the spatial distribution of soil properties in a representative manner. Although most Multivariate Statistics techniques (Principal Component Analysis, Correspondence Analysis, Cluster Analysis, etc.) do not require the assumption of a normal distribution, they nevertheless need a proper and rigorous strategy for their utilization. In this work, some reflections about these methodologies, in particular about the main constraints that often occur during the information collecting process and about the various possibilities of linking these different techniques, will be presented. At the end, illustrations of some particular cases of the application of these statistical methods will also be presented.
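As a small illustration of the geostatistical point, the sketch below computes an empirical semivariogram for a soil property sampled on a regular grid; the coordinates and values are synthetic placeholders:

    import numpy as np

    # Empirical (isotropic) semivariogram on a regular sampling grid.
    rng = np.random.default_rng(2)
    nx, ny = 10, 10
    xx, yy = np.meshgrid(np.arange(nx), np.arange(ny))
    coords = np.column_stack((xx.ravel(), yy.ravel()))  # grid positions
    z = rng.normal(size=coords.shape[0])                # synthetic soil property values

    # pairwise distances and half squared value differences
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    g = 0.5 * (z[:, None] - z[None, :]) ** 2

    bins = np.arange(0.5, 10.5, 1.0)
    for lo, hi in zip(bins[:-1], bins[1:]):
        mask = (d > lo) & (d <= hi)
        if mask.any():
            print(f"lag ~{(lo + hi) / 2:.1f}: gamma = {g[mask].mean():.3f}")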
Abstract:
The development of an intelligent wheelchair (IW) platform that may be easily adapted to any commercial electric powered wheelchair and aid any person with special mobility needs is the main objective of this project. To achieve this main objective, three distinct control methods were implemented in the IW: manual, shared and automatic. Several algorithms were developed for each of these control methods. This paper presents three of the most significant of those algorithms, with emphasis on the shared control method. Experiments were performed by users suffering from cerebral palsy, using a realistic simulator, in order to validate the approach. The experiments revealed the importance of using shared (aided) controls for users with severe disabilities. The patients still felt they had complete control over the wheelchair movement when using shared control at a 50% level, and thus this control type was very well accepted. It may therefore be used in intelligent wheelchairs, since it is able to correct the direction in case of involuntary movements of the user while still giving a sense of complete control over the IW movement.
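A minimal sketch of the idea behind a 50% shared control level, blending the user's command with the autonomous controller's command; the command format and blending rule are assumptions for illustration, not the algorithms of the paper:

    # Blend (linear_vel, angular_vel) commands; level=0.5 means 50% shared control.
    def shared_control(user_cmd, auto_cmd, level=0.5):
        v = (1.0 - level) * user_cmd[0] + level * auto_cmd[0]
        w = (1.0 - level) * user_cmd[1] + level * auto_cmd[1]
        return v, w

    # An involuntary swerve in the user command is partially corrected
    # by the autonomous command while the user keeps part of the authority.
    print(shared_control(user_cmd=(0.8, 0.6), auto_cmd=(0.8, 0.0), level=0.5))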
Abstract:
Forest fire dynamics is often characterized by the absence of a characteristic length-scale, long-range correlations in space and time, and long memory, which are features also associated with fractional-order systems. In this paper a public domain forest fire catalogue, containing information on events in Portugal covering the period from 1980 up to 2012, is tackled. The events are modelled as time series of Dirac impulses with amplitude proportional to the burnt area. The time series are viewed as the system output and are interpreted as a manifestation of the system dynamics. In the first phase we use the pseudo phase plane (PPP) technique to describe forest fire dynamics. In the second phase we use multidimensional scaling (MDS) visualization tools. The PPP allows the representation of forest fire dynamics in two-dimensional space, by taking time series representative of the phenomena. The MDS approach generates maps where objects that are perceived to be similar to each other are placed close together, forming clusters. The results are analysed in order to extract relationships among the data and to better understand forest fire behaviour.
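A minimal sketch of the first modelling step, turning a fire catalogue into a time series of Dirac impulses with amplitude proportional to the burnt area; the events below are synthetic placeholders, not entries of the Portuguese catalogue:

    import numpy as np

    # Each catalogue entry is a (day index, burnt area) pair; the series holds
    # an impulse at the event day with amplitude proportional to the burnt area.
    events = [          # synthetic placeholders: (day within 1980-2012, burnt area in ha)
        (120, 35.0),
        (121, 5.5),
        (410, 220.0),
    ]

    n_days = 33 * 365   # rough span of the 1980-2012 period
    x = np.zeros(n_days)
    for day, area in events:
        x[day] += area  # impulse amplitude proportional to burnt area

    print(x[118:123], x[408:412])

The resulting series x can then be fed to the PPP embedding or to an MDS comparison across regions or years, as described above.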
Abstract:
Demand response has gained increasing importance in the context of competitive electricity markets and smart grid environments. In addition to the importance that has been given to the development of business models for integrating demand response, several methods have been developed to evaluate consumers' performance after participation in a demand response event. The present paper uses those performance evaluation methods, namely customer baseline load calculation methods, to determine the expected consumption in each period of the consumer's historic data. In cases where there is a significant difference between the actual consumption and the estimated consumption, the consumer is identified as a potential source of non-technical losses. A case study demonstrates the application of the proposed method to real consumption data.
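A minimal sketch of the idea: the expected consumption in each period is estimated from a simple baseline (here, the average of the previous 10 days at the same period, an assumption rather than the paper's exact method), and large shortfalls flag potential non-technical losses:

    import numpy as np

    # Synthetic hourly consumption history: 30 days x 24 periods [kWh].
    rng = np.random.default_rng(3)
    history = rng.normal(10.0, 1.0, size=(30, 24))
    history[-1, 18:22] *= 0.3                  # suspicious drop on the last day

    baseline = history[-11:-1].mean(axis=0)    # average of the previous 10 days
    actual = history[-1]
    suspect = actual < 0.6 * baseline          # flag periods more than 40% below baseline
    print(np.where(suspect)[0])                # periods flagged for inspection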
Abstract:
Demand response has gained increasing importance in the context of competitive electricity market environments. The use of demand resources is also advantageous in the context of smart grid operation. In addition to the need for new business models for integrating demand response, adequate methods are necessary for an accurate evaluation of consumers' performance after participation in a demand response event. The present paper compares some of the existing baseline methods for consumers' performance evaluation, contrasting the results obtained with these methods and with a method proposed by the authors. A case study demonstrates the application of these methods to real consumption data belonging to a consumer connected to a distribution network.
Abstract:
Electric power networks, namely distribution networks, have been undergoing several changes in recent years due to changes in power systems operation towards the implementation of smart grids. Several approaches to the operation of resources have been introduced, such as demand response, making use of the new capabilities of smart grids. In the initial stages of smart grid implementation only reduced amounts of data are generated, namely consumption data. The methodology proposed in the present paper makes use of demand response consumers' performance evaluation methods to determine the expected consumption for a given consumer. Then, potential commercial losses are identified using monthly historic consumption data. Real consumption data is used in the case study to demonstrate the application of the proposed method.