965 results for Non-stationary
Abstract:
This paper presents an approach to structural health monitoring (SHM) using adaptive filters. The experimental signals from different structural conditions, provided by piezoelectric actuators/sensors bonded to the test structure, are modeled by a discrete-time recursive least squares (RLS) filter. The main advantage of using an RLS filter is the possibility of performing an online SHM procedure, since the identification is also valid for non-stationary linear systems. An online damage-sensitive index is computed from the autoregressive (AR) portion of the coefficients, normalized by the square root of the sum of their squares. The proposed method is then applied in a laboratory test involving an aeronautical panel equipped with piezoelectric sensors/actuators (PZTs) in different positions. A hypothesis test employing the t-test is used to reach the damage decision. The proposed algorithm was able to identify and localize the damages simulated in the structure. The results show the applicability and drawbacks of the method, and the paper concludes with suggestions for improving it. ©2010 Society for Experimental Mechanics Inc.
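As an illustration of the feature described above (one plausible reading, not the authors' exact implementation), here is a minimal numpy sketch of RLS identification of AR coefficients and of a damage index built from coefficients normalized by the square root of the sum of their squares; the signals and model order are hypothetical:

```python
import numpy as np

def rls_ar(signal, order, lam=0.99, delta=100.0):
    """Fit an AR model to a signal with recursive least squares (RLS).

    Returns the AR coefficient vector after processing all samples."""
    w = np.zeros(order)                 # AR coefficients
    P = delta * np.eye(order)           # inverse correlation matrix
    for n in range(order, len(signal)):
        x = signal[n - order:n][::-1]   # regressor: past samples, newest first
        e = signal[n] - w @ x           # a priori prediction error
        k = P @ x / (lam + x @ P @ x)   # gain vector
        w = w + k * e
        P = (P - np.outer(k, x @ P)) / lam
    return w

def damage_index(w_ref, w_test):
    """Damage-sensitive feature: AR coefficients normalized by the square
    root of the sum of their squares, compared against a baseline."""
    f_ref = w_ref / np.sqrt(np.sum(w_ref**2))
    f_test = w_test / np.sqrt(np.sum(w_test**2))
    return np.linalg.norm(f_test - f_ref)

# Hypothetical PZT responses: AR(2) processes standing in for the healthy
# and test-condition signals (the test one has slightly shifted dynamics).
rng = np.random.default_rng(0)
def ar2(a1, a2, n=2000):
    x = np.zeros(n)
    for i in range(2, n):
        x[i] = a1 * x[i - 1] + a2 * x[i - 2] + rng.standard_normal()
    return x

healthy, test = ar2(1.2, -0.5), ar2(1.0, -0.4)
print(damage_index(rls_ar(healthy, 4), rls_ar(test, 4)))
```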
Abstract:
Fertilization of guava relies on soil and tissue testing. Tissue tests are currently interpreted by comparing nutrient concentrations or dual ratios with critical values or ranges. The critical value approach is affected by nutrient interactions. Nutrient interactions can be described by dual ratios, where two nutrients are compressed into a single expression, or by ternary diagrams, where one redundant proportion can be computed as the difference between 100% and the sum of the other two. There are D(D-1) possible dual ratios in a D-part composition, so most of them are redundant. Nutrients are components of a mixture and thus convey relative, not absolute, information on the composition. There are D-1 balances between components or ingredients in any mixture. Compositional data are intrinsically redundant, scale-dependent, and non-normally distributed. Based on the principles of equilibrium and orthogonality, the nutrient balance concept projects D-1 isometric log-ratio (ilr) coordinates into Euclidean space. The D-1 balances between groups of nutrients are ordered to reflect knowledge of plant physiology, soil fertility, and crop management. Our objective was to evaluate the ilr approach using nutrient data from a guava orchard survey and fertilizer trials across the state of São Paulo, Brazil. Cationic balances varied widely between orchards. We found that the Redfield N/P ratio of 13 was critical for high guava yield. We present guava yield maps in ternary diagrams. Although the ratio between nutrients changing in the same direction over time is often assumed to be stationary, most guava nutrient balances and dual ratios were found to be non-stationary. The ilr model provided an unbiased nutrient diagnosis of guava. © ISHS.
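As a worked illustration of the ilr idea (a sketch under an assumed binary partition of the nutrients, with hypothetical concentrations, not the paper's actual diagnosis), one balance between two groups of parts can be computed as follows:

```python
import numpy as np

def ilr_balance(numerator, denominator):
    """One isometric log-ratio (ilr) balance between two groups of parts:
    sqrt(r*s/(r+s)) * ln( g(numerator) / g(denominator) ),
    where g() is the geometric mean and r, s are the group sizes."""
    num = np.asarray(numerator, float)
    den = np.asarray(denominator, float)
    r, s = len(num), len(den)
    gmean = lambda v: np.exp(np.mean(np.log(v)))
    return np.sqrt(r * s / (r + s)) * np.log(gmean(num) / gmean(den))

# Hypothetical leaf concentrations (g/kg) for an [N | P] balance,
# one of the D-1 ordered coordinates in a full partition.
N, P = 20.0, 1.5
print(ilr_balance([N], [P]))
```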
Abstract:
This thesis presents a methodology for performance evaluation of broadband access networks. Network performance evaluation is a way to identify and analyze how certain characteristics, such as different traffic types or usage patterns, can influence the behavior of the network under study, making it possible to predict how that network will behave in future situations. The methodology comprises two approaches: one based on measurements and another based on modeling via Markovian processes. The analyzed networks cover the two basic access architectures: ADSL2+ (Asymmetric Digital Subscriber Line 2+) networks, which are wired networks using twisted-pair metallic cables, and FBWN (Fixed Broadband Wireless Network) networks, which are wireless networks based on the IEEE 802.16 standard. The measurement approach focuses on how the analyzed network behaves in three situations: transmission of generic traffic; the impact of non-stationary noise on the system; and the use of the network as a medium for real-time multimedia traffic. The modeling approach, in turn, aims to predict the behavior of the analyzed networks using a mathematical formulation based on Markovian processes. The results indicate the feasibility of applying this methodology for performance evaluation. They also make it possible to extend the methodology to other types of broadband access networks, such as optical fiber networks, microwave link networks, VDSL/VDSL2 (Very-high-data-rate DSL) networks, etc.
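As a generic illustration of the Markovian modeling approach (not the thesis's actual model; the states and transition probabilities below are hypothetical), the long-run behavior of a discrete-time Markov chain can be obtained from its stationary distribution:

```python
import numpy as np

# Hypothetical 3-state transition matrix (e.g., idle / loaded / congested).
P = np.array([[0.90, 0.09, 0.01],
              [0.20, 0.70, 0.10],
              [0.05, 0.25, 0.70]])

# The stationary distribution pi solves pi P = pi with sum(pi) = 1,
# i.e. the left eigenvector of P associated with eigenvalue 1.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi /= pi.sum()
print(pi)   # long-run fraction of time spent in each state
```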
Abstract:
This work addresses the application of the Kalman-Bucy filter (KBF), organized as a deconvolution (KBFD), to extract the reflectivity function from seismic data. This means that the process is described as non-stationary stochastic, and it corresponds to a generalization of the Wiener-Kolmogorov theory. The mathematical description of the KBF preserves its relation to the Wiener-Hopf filter (WHF), which treats the stationary stochastic counterpart. The strategy for attacking the problem is structured in parts: (a) optimization criterion; (b) a priori knowledge; (c) algorithm; and (d) quality. The a priori knowledge includes the convolutional model and establishes statistics for its components (effective source pulse, reflectivity function, geological and local noise). To demonstrate the versatility, applicability, and limitations of the method, we designed systematic deconvolution experiments under various levels of additive noise and effective source pulses. We demonstrate, first, the need for equalizing filters and, second, that the spectral coherence factor is a good numerical measure of process quality. We also justify the present study for application to real data, as exemplified.
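A minimal discrete sketch of the idea, assuming a known effective source pulse and treating the reflectivity as white noise in a shift-register state (this is an illustrative Kalman deconvolution, not the thesis's KBFD formulation):

```python
import numpy as np

def kalman_deconv(trace, wavelet, q=1.0, r_noise=0.01):
    """Minimal discrete Kalman deconvolution sketch: the state holds the
    last m reflectivity samples, the transition shifts them, and the
    observation is their convolution with the (assumed known) wavelet."""
    m = len(wavelet)
    F = np.eye(m, k=-1)                 # shift-register transition
    H = np.asarray(wavelet)[None, :]    # observation row: convolution
    Q = np.zeros((m, m)); Q[0, 0] = q   # new reflectivity enters slot 0
    x, P = np.zeros(m), np.eye(m)
    refl = np.zeros(len(trace))
    for n, y in enumerate(trace):
        x, P = F @ x, F @ P @ F.T + Q               # predict
        S = H @ P @ H.T + r_noise                   # innovation variance
        K = (P @ H.T) / S                           # Kalman gain (m x 1)
        x = x + (K * (y - H @ x)).ravel()           # update state
        P = P - K @ H @ P                           # update covariance
        refl[n] = x[0]                              # newest reflectivity
    return refl

# Hypothetical synthetic trace: spiky reflectivity convolved with a pulse.
rng = np.random.default_rng(1)
refl_true = rng.standard_normal(300) * (rng.random(300) < 0.05)
wav = np.array([0.3, 1.0, 0.5, -0.2])
trace = np.convolve(refl_true, wav)[:300] + 0.05 * rng.standard_normal(300)
est = kalman_deconv(trace, wav)
print(np.corrcoef(est, refl_true)[0, 1])   # rough recovery quality
```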
Abstract:
The municipality of Marabá, PA, located in the Amazon region in the southeast of the state of Pará, suffers annually from flood events caused by the periodic rise of the Tocantins River and by the vulnerability of the population living in at-risk areas. The state and municipal civil defense agencies plan and prepare teams for response actions in the municipality every year. In this phase, monitoring and forecasting of flood events are important. Therefore, with the goal of reducing errors in hydrological forecasts for Marabá, a stochastic model for forecasting the level of the Tocantins River was developed, based on the Box-Jenkins methodology. Daily level data observed at the hydrological stations of Marabá, Carolina, and Conceição do Araguaia of the National Water Agency (ANA) were used, covering the period from 12/01/2008 to 03/31/2011. Three models (Mt, Nt, and Yt) were fitted using different statistical packages, SAS and Gretl, with different interpretations of the series behavior used to generate the model equations. The main difference between the packages is that SAS uses a transfer function model in the modeling. The variability of the river level was classified using the quantile technique for the period 1972-2011, examining only the level categories ABOVE and WELL ABOVE normal. For the analysis of socioeconomic impacts, data from the actions of the Pará State Civil Defense during the 2009 and 2011 floods were used. The results showed that flood events with levels WELL ABOVE normal can generally be associated with La Niña events. Another important result: the models simulated the river level very well for a seven-day period (04/01/2011 to 04/07/2011). The multivariate model Nt (with small errors) reproduced the behavior of the original series, underestimating the actual values on April 3, 4, and 5, 2011, with a maximum error of 0.28 m on April 4. The univariate model (Yt) performed well in the simulations, with absolute errors around 0.12 m. The model with the smallest absolute error (0.08 m) for the same period was the Mt model, developed in SAS, which interprets the original series as nonlinear and non-stationary. The quantitative analysis of the flood impacts that occurred in 2009 and 2011 in the city of Marabá revealed that, on average, more than 4,000 families suffer from these events, implying high financial costs. It is therefore concluded that level forecasting models are important tools used by the Civil Defense in planning and preparing preventive actions for the municipality of Marabá.
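As an illustration of the Box-Jenkins step (a sketch only: the series is synthetic and the model order is illustrative, not the thesis's fitted Mt, Nt, or Yt models), a univariate ARIMA forecast of a river-level series can be produced as follows:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Hypothetical daily river-level series (metres); the thesis uses ANA
# gauge data from Marabá, Carolina and Conceição do Araguaia instead.
rng = np.random.default_rng(2)
level = pd.Series(5 + np.cumsum(0.02 * rng.standard_normal(400)),
                  index=pd.date_range("2008-12-01", periods=400, freq="D"))

# A univariate Box-Jenkins model: first differencing (d=1) handles the
# non-stationary trend in the level series before ARMA fitting.
fit = ARIMA(level, order=(2, 1, 1)).fit()
print(fit.forecast(steps=7))   # seven-day-ahead level forecast
```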
Abstract:
Concept drift, which refers to learning problems that are non-stationary over time, has increasing importance in machine learning and data mining. Many concept drift applications require fast response, which means an algorithm must always be (re)trained with the latest available data. But the process of labeling data is usually expensive and/or time consuming compared to the acquisition of unlabeled data, so usually only a small fraction of the incoming data can be effectively labeled. Semi-supervised learning methods may help in this scenario, as they use both labeled and unlabeled data in the training process. However, most of them are based on the assumption that the data are static. Therefore, semi-supervised learning with concept drift is still an open and challenging task in machine learning. Recently, a particle competition and cooperation approach was developed to perform graph-based semi-supervised learning from static data. We have extended that approach to handle data streams and concept drift. The result is a passive algorithm using a single classifier, which naturally adapts to concept changes without any explicit drift detection mechanism. It has built-in mechanisms that provide a natural way of learning from new data, gradually "forgetting" older knowledge as older data items become no longer useful for the classification of newer data items. The proposed algorithm is applied to the KDD Cup 1999 network intrusion data, showing its effectiveness.
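The particle competition and cooperation algorithm itself is not reproduced here; the toy sketch below only illustrates the "gradual forgetting" idea with an exponentially weighted nearest-prototype stream classifier (all names and parameters are hypothetical):

```python
import numpy as np

class ForgettingPrototypes:
    """Toy stream classifier with built-in forgetting: each class keeps an
    exponentially weighted prototype, so older items fade as drift occurs.
    This only illustrates gradual forgetting; it is NOT the particle
    competition and cooperation algorithm described above."""
    def __init__(self, n_classes, n_features, alpha=0.05):
        self.alpha = alpha                       # forgetting rate
        self.proto = np.zeros((n_classes, n_features))
        self.seen = np.zeros(n_classes, bool)

    def predict(self, x):
        d = np.linalg.norm(self.proto - x, axis=1)
        d[~self.seen] = np.inf                   # ignore unseen classes
        return int(np.argmin(d))

    def partial_fit(self, x, y):
        if not self.seen[y]:
            self.proto[y], self.seen[y] = x, True
        else:                                    # move prototype toward x;
            self.proto[y] += self.alpha * (x - self.proto[y])  # old data fades
```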
Abstract:
This paper presents two diagnostic methods for the online detection of broken bars in induction motors with squirrel-cage rotors. The wavelet transform can be regarded as an extension of the Fourier transform. The Fourier transform is a powerful tool for analyzing the components of a stationary signal, but it fails for non-stationary signals, whereas the wavelet transform allows the components of a non-stationary signal to be analyzed. In this paper, our main goal is to establish the advantages of the wavelet transform over the Fourier transform in rotor failure diagnosis of induction motors.
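As a sketch of the contrast drawn above (assuming the PyWavelets package and a hypothetical fault-like signal; this is not the paper's diagnostic method itself), the wavelet transform localizes in time a spectral change that a single Fourier spectrum cannot:

```python
import numpy as np
import pywt

# Hypothetical stator-current-like signal whose frequency content changes
# over time (the kind of non-stationarity Fourier analysis cannot localize).
fs = 1000.0
t = np.arange(0, 2, 1 / fs)
sig = np.sin(2 * np.pi * 50 * t)
sig[t > 1] += 0.5 * np.sin(2 * np.pi * 120 * t[t > 1])   # fault-like component

# Fourier: one global spectrum, no information on WHEN 120 Hz appears.
spectrum = np.abs(np.fft.rfft(sig))

# Continuous wavelet transform: a time-frequency map localizing the change.
scales = np.arange(1, 64)
coeffs, freqs = pywt.cwt(sig, scales, "morl", sampling_period=1 / fs)
print(coeffs.shape)   # (len(scales), len(t)): energy per scale and time
```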
Abstract:
Every seismic event produces seismic waves which travel throughout the Earth. Seismology is the science of interpreting measurements to derive information about the structure of the Earth. Seismic tomography is the most powerful tool for determining the 3D structure of the Earth's deep interior. Tomographic models obtained at global and regional scales are an underlying tool for determining the geodynamical state of the Earth, showing evident correlation with other geophysical and geological characteristics. The global tomographic images of the Earth can be written as linear combinations of basis functions from a specifically chosen set, defining the model parameterization. A number of different parameterizations are commonly seen in the literature: seismic velocities in the Earth have been expressed, for example, as combinations of spherical harmonics or by means of the simpler characteristic functions of discrete cells. In this work we focus on this aspect, evaluating a new type of parameterization based on wavelet functions. It is known from classical Fourier theory that a signal can be expressed as the sum of a, possibly infinite, series of sines and cosines, often referred to as a Fourier expansion. The big disadvantage of a Fourier expansion is that it has only frequency resolution and no time resolution. Wavelet analysis (or the wavelet transform) is probably the most recent solution for overcoming the shortcomings of Fourier analysis. The fundamental idea behind this analysis is to study the signal according to scale. Wavelets, in fact, are mathematical functions that cut up data into different frequency components and then study each component with a resolution matched to its scale, so they are especially useful in the analysis of non-stationary processes that contain multi-scale features, discontinuities, and sharp spikes. Wavelets are used in essentially two ways when applied to the study of geophysical processes or signals: 1) as a basis for the representation or characterization of a process; 2) as an integration kernel for analysis, to extract information about the process. Both types of application are the object of study of this work. First we use wavelets as a basis to represent and solve the tomographic inverse problem. After a brief introduction to seismic tomography theory, we assess the power of wavelet analysis in the representation of two different types of synthetic models; we then apply it to real data, obtaining surface wave phase velocity maps and evaluating its abilities by comparison with another type of parameterization (i.e., block parameterization). For the second type of wavelet application, we analyze the ability of the continuous wavelet transform in spectral analysis, starting again with synthetic tests to evaluate its sensitivity and capability, and then applying the same analysis to real data to obtain local correlation maps between different models at the same depth or between different profiles of the same model.
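As a sketch of the first type of application, a 2D model can be expressed in a wavelet basis and compressed to its largest coefficients (assuming the PyWavelets package; the model, wavelet family, and threshold are hypothetical, not those used in this work):

```python
import numpy as np
import pywt

# Hypothetical 2D velocity-anomaly map standing in for a tomographic model.
rng = np.random.default_rng(3)
model = np.zeros((64, 64))
model[20:30, 20:30] = 1.0                      # a localized anomaly
model += 0.05 * rng.standard_normal(model.shape)

# Express the model in a wavelet basis: the parameterization idea above.
coeffs = pywt.wavedec2(model, "db4", level=3)

# Keep only the largest coefficients: a sparse multi-scale representation.
arr, slices = pywt.coeffs_to_array(coeffs)
thresh = np.quantile(np.abs(arr), 0.95)
arr[np.abs(arr) < thresh] = 0.0
sparse = pywt.waverec2(
    pywt.array_to_coeffs(arr, slices, output_format="wavedec2"), "db4")
print(np.abs(sparse[:64, :64] - model).max())  # reconstruction error
```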
Abstract:
Several diagnostic techniques are presented for the detection of electrical faults in induction motor variable-speed drives. These techniques are developed taking into account the impact of the control system on machine variables and non-stationary operating conditions.
Abstract:
Vibration qualification tests are used during the design phase of a component to verify its mechanical resistance to the dynamic (vibratory) loads applied during its service life. The duration of the vibrations applied to the component over its service life (thousands of hours) must be reduced in order to carry out feasible laboratory tests, generally conducted using an electrodynamic shaker. The idea is to increase the intensity of the vibrations while reducing their duration. Several Test Tailoring procedures exist that use a synthesis method to define a vibration profile to be applied in the laboratory starting from the real vibrations experienced by the component: one of the most common methodologies is based on the equivalence of the fatigue damage produced by the real and the synthesized vibrations. Although this approach is quite widespread, to the author's knowledge there is no reference in the literature that validates it through experimental evidence. The objective of the research activity was to verify the validity of the method through an experimental campaign conducted on suitable specimens. The method is first used to synthesize a (stationary random) vibration profile with the same duration as a non-stationary vibration profile acquired under real conditions. The fatigue damage produced by the synthesized vibration was compared with that of the real vibration in terms of specimen time to failure. The results show that the damage produced by the synthesized vibration is overestimated, so the equivalence is not satisfied. Some critical points were identified and some modifications to the method were proposed to make the theory more robust. The method was verified with further tests, and the results confirm its validity provided that the identified critical points are properly addressed.
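The damage-equivalence bookkeeping can be illustrated with the classical narrow-band spectral estimate under Miner's rule (a sketch only: the S-N constants, durations, and PSD below are hypothetical, and the synthesis method under test may differ):

```python
import numpy as np
from scipy.special import gamma
from scipy.integrate import trapezoid

def narrowband_damage(freq, psd, T, k=5.0, C=1e15):
    """Narrow-band spectral estimate of fatigue damage (Miner's rule with
    an S-N curve N = C / S**k). Used here only to illustrate the
    damage-equivalence bookkeeping: equal D means 'equivalent' profiles."""
    m0 = trapezoid(psd, freq)                    # variance of the stress
    m2 = trapezoid(psd * freq**2, freq)
    nu0 = np.sqrt(m2 / m0)                       # upward zero crossings / s
    sigma = np.sqrt(m0)
    return nu0 * T / C * (np.sqrt(2) * sigma)**k * gamma(1 + k / 2)

# Equivalence idea: shorten the test from T1 to T2 by scaling the stress
# amplitude so the accumulated damage is unchanged (hypothetical numbers).
f = np.linspace(1, 200, 500)
psd = np.where((f > 40) & (f < 60), 1.0, 0.01)   # (stress units)^2 / Hz
T1, T2 = 3600 * 1000, 3600 * 2                   # service life vs lab test
D_service = narrowband_damage(f, psd, T=T1)
amp = (T1 / T2) ** (1 / 5)                       # (T1/T2)**(1/k), k = 5
D_test = narrowband_damage(f, amp**2 * psd, T=T2)
print(D_service, D_test)                         # approximately equal
```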
Abstract:
This thesis addresses some of the issues that, at the state of the art, prevent P300-based brain-computer interface (BCI) systems from moving from research laboratories to end users' homes. An innovative asynchronous classifier has been defined and validated. It relies on the introduction of a set of thresholds into the classifier; these thresholds were assessed by considering the distributions of score values for target stimuli, non-target stimuli, and epochs of voluntary no-control. With the asynchronous classifier, a P300-based BCI system can adapt its speed to the current state of the user and can automatically suspend control when the user diverts attention from the stimulation interface. Since EEG signals are non-stationary and show inherent variability, making long-term use of BCI possible requires tracking changes in ongoing EEG activity and adapting the BCI model parameters accordingly. To this end, the asynchronous classifier was subsequently improved by introducing a self-calibration algorithm for the continuous and unsupervised recalibration of the subjective control parameters. Finally, an index for the online monitoring of EEG quality was defined and validated in order to detect potential problems and system failures. The thesis ends with the description of a translational work involving end users (people with amyotrophic lateral sclerosis, ALS). Following a user-centered design approach, the phases of design, development, and validation of an innovative assistive device are described. The proposed assistive technology (AT) has been specifically designed to meet the needs of people with ALS during the different phases of the disease (i.e., different degrees of motor impairment). Indeed, the AT can be accessed with several input devices, either conventional (mouse, touchscreen) or alternative (switches, head tracker), up to a P300-based BCI.
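A minimal sketch of the threshold-based asynchronous decision rule described above (the score vector and thresholds are hypothetical; in the thesis the thresholds are calibrated from the distributions of target, non-target, and no-control scores):

```python
import numpy as np

def async_decision(scores, t_target, t_nocontrol):
    """Asynchronous P300 selection sketch: pick an item only when its
    classifier score is confidently target-like; otherwise declare
    'no control' and suspend output, or keep stimulating."""
    best = int(np.argmax(scores))
    if scores[best] < t_nocontrol:       # user not attending the interface
        return None                      # suspend control
    if scores[best] >= t_target:         # confident target detection
        return best
    return "undecided"                   # accumulate more stimulation epochs

# Hypothetical score vector over 6 items and illustrative thresholds.
print(async_decision(np.array([0.1, 0.2, 1.4, 0.3, 0.2, 0.1]), 1.0, 0.5))
```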
Abstract:
In the first chapter, I develop a panel no-cointegration test which extends the bounds test of Pesaran, Shin and Smith (2001) to the panel framework by considering the individual regressions in a Seemingly Unrelated Regression (SUR) system. This makes it possible to take into account unobserved common factors that contemporaneously affect all units of the panel while providing, at the same time, unit-specific test statistics. Moreover, the approach is particularly suited to cases where the number of individuals in the panel is small relative to the number of time series observations. I develop the algorithm to implement the test and use Monte Carlo simulation to analyze its properties. The small-sample properties of the test are remarkable compared to its single-equation counterpart. I illustrate the use of the test with a test of Purchasing Power Parity in a panel of EU15 countries. In the second chapter of my PhD thesis, I verify the Expectation Hypothesis of the Term Structure (EHTS) in the repurchase agreement (repo) market with a new testing approach. I consider an "inexact" formulation of the EHTS, which models a time-varying component in the risk premia, and I treat the interest rates as a non-stationary cointegrated system. The effect of heteroskedasticity is controlled by means of testing procedures (bootstrap and heteroskedasticity correction) which are robust to variance and covariance shifts over time. I find that the long-run implications of the EHTS are verified. A rolling-window analysis clarifies that the EHTS is rejected only in periods of financial market turbulence. The third chapter introduces the Stata command "bootrank", which implements the bootstrap likelihood ratio rank test algorithm developed by Cavaliere et al. (2012). The command is illustrated through an empirical application on the term structure of interest rates in the US.
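As a simplified illustration of the cointegration-testing step (an Engle-Granger test on a synthetic pair of rates, standing in for the SUR-based bounds test and the bootstrap rank test used in the thesis):

```python
import numpy as np
from statsmodels.tsa.stattools import coint

# Hypothetical cointegrated pair standing in for two repo rates: both share
# the same stochastic trend, as the EHTS implies for the term structure.
rng = np.random.default_rng(4)
trend = np.cumsum(rng.standard_normal(500))      # common I(1) component
short = trend + 0.2 * rng.standard_normal(500)
long_ = 0.9 * trend + 0.3 * rng.standard_normal(500) + 0.5  # + risk premium

# Engle-Granger test of the null of no cointegration between the two rates.
t_stat, p_value, _ = coint(long_, short)
print(t_stat, p_value)   # a small p-value rejects "no cointegration"
```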