11 results for LOCALLY STATIONARY WAVELET PROCESSES
in CentAUR: Central Archive University of Reading - UK
Abstract:
The use of Bayesian inference in the estimation of time-frequency representations has, thus far, been limited to offline analysis of signals, using a smoothing-spline-based model of the time-frequency plane. In this paper we introduce a new framework that allows the routine use of Bayesian inference for online estimation of the time-varying spectral density of a locally stationary Gaussian process. The core of our approach is the use of a likelihood inspired by a local Whittle approximation. This choice, along with the use of a recursive algorithm for non-parametric estimation of the local spectral density, permits the use of a particle filter for estimating the time-varying spectral density online. We provide demonstrations of the algorithm through tracking chirps and the analysis of musical data.
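As a rough illustration of the kind of online scheme described above, the sketch below runs a bootstrap particle filter whose observation weight is a Whittle-type likelihood computed from a sliding-window periodogram. It is a minimal toy, not the authors' algorithm: the latent state is assumed to be a slowly drifting AR(1) coefficient, and the window length, particle count and random-walk step size are arbitrary choices.

```python
# Minimal sketch: bootstrap particle filter with a sliding-window Whittle-type
# likelihood, tracking a slowly varying AR(1) coefficient as a toy stand-in for
# online estimation of a time-varying spectral density.
import numpy as np

rng = np.random.default_rng(0)

def local_periodogram(window):
    """Fourier frequencies (excluding 0 and Nyquist) and periodogram of a window."""
    n = len(window)
    freqs = 2 * np.pi * np.arange(1, n // 2) / n
    pgram = np.abs(np.fft.fft(window)[1:n // 2]) ** 2 / (2 * np.pi * n)
    return freqs, pgram

def whittle_loglik(freqs, pgram, phi, sigma2=1.0):
    """Whittle approximation to the log-likelihood under an AR(1)(phi) spectrum."""
    f = sigma2 / (2 * np.pi * np.abs(1 - phi * np.exp(-1j * freqs)) ** 2)
    return -np.sum(np.log(f) + pgram / f)

# Simulate a process whose AR(1) coefficient drifts slowly over time.
T, win, n_particles = 2000, 64, 500
phi_true = 0.5 + 0.4 * np.sin(2 * np.pi * np.arange(T) / T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = phi_true[t] * y[t - 1] + rng.standard_normal()

# Bootstrap particle filter over the latent coefficient (random-walk prior).
particles = rng.uniform(-0.9, 0.9, n_particles)
estimates = []
for t in range(win, T, win // 2):                       # slide the analysis window
    freqs, pgram = local_periodogram(y[t - win:t])
    particles = np.clip(particles + 0.05 * rng.standard_normal(n_particles), -0.99, 0.99)
    logw = np.array([whittle_loglik(freqs, pgram, p) for p in particles])
    w = np.exp(logw - logw.max())
    w /= w.sum()
    estimates.append((t, np.sum(w * particles)))        # posterior-mean estimate
    idx = rng.choice(n_particles, n_particles, p=w)     # multinomial resampling
    particles = particles[idx]

for t, est in estimates[::5]:
    print(f"t={t:4d}  true phi={phi_true[t]:+.2f}  estimate={est:+.2f}")
```

The Whittle weight compares each particle's implied spectrum with the local periodogram, the same idea of a local spectral likelihood that the abstract describes, but in a deliberately simplified parametric form.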
Abstract:
Although difference-stationary (DS) and trend-stationary (TS) processes have been subject to considerable analysis, there are no direct comparisons of the consequences of each being the data-generation process (DGP). We examine the effects of incorrectly choosing between these models for forecasting, with both known and estimated parameters. Three sets of Monte Carlo simulations illustrate the analysis: they evaluate the biases in conventional standard errors when each model is mis-specified, compute the relative mean-square forecast errors of the two models under both DGPs, and investigate autocorrelated errors, which allow each model to better approximate the converse DGP. The outcomes are surprisingly different from established results.
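A toy version of the Monte Carlo comparison described above can be sketched as follows; the DGPs, sample size, forecast horizon and parameter values are illustrative assumptions, and the autocorrelated-error variants are omitted.

```python
# Toy Monte Carlo in the spirit of the comparison above: simulate a
# difference-stationary (DS) and a trend-stationary (TS) DGP, fit both
# specifications with estimated parameters, and compare h-step mean-square
# forecast errors. All numerical settings are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
T, h, n_rep = 100, 8, 2000
drift, trend = 0.2, 0.2

def simulate(dgp):
    e = rng.standard_normal(T + h)
    if dgp == "DS":                      # y_t = y_{t-1} + drift + e_t
        return drift * np.arange(T + h) + np.cumsum(e)
    return trend * np.arange(T + h) + e  # y_t = trend * t + e_t

def forecast(y, model):
    if model == "DS":                    # random walk with estimated drift
        d_hat = np.mean(np.diff(y[:T]))
        return y[T - 1] + d_hat * np.arange(1, h + 1)
    X = np.column_stack([np.ones(T), np.arange(T)])   # OLS linear trend
    a, b = np.linalg.lstsq(X, y[:T], rcond=None)[0]
    return a + b * np.arange(T, T + h)

for dgp in ("DS", "TS"):
    mse = {"DS": 0.0, "TS": 0.0}
    for _ in range(n_rep):
        y = simulate(dgp)
        for model in ("DS", "TS"):
            mse[model] += np.mean((forecast(y, model) - y[T:]) ** 2) / n_rep
    print(f"DGP={dgp}:  MSFE(DS model)={mse['DS']:.2f}  MSFE(TS model)={mse['TS']:.2f}")
```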
Abstract:
This work compares and contrasts results of classifying time-domain ECG signals with pathological conditions taken from the MIT-BIH arrhythmia database. Linear discriminant analysis and a multi-layer perceptron were used as classifiers. The neural network was trained by two different methods, namely back-propagation and a genetic algorithm. Converting the time-domain signal into the wavelet domain reduced the dimensionality of the problem at least 10-fold. This was achieved using wavelets from the db6 family as well as adaptive wavelets generated using two different strategies. The wavelet transforms used in this study were limited to two decomposition levels. A neural network with evolved weights proved to be the best classifier, with a maximum of 99.6% accuracy when optimised wavelet-transform ECG data was presented to its input and 95.9% accuracy when the signals presented to its input were decomposed using db6 wavelets. The linear discriminant analysis achieved a maximum classification accuracy of 95.7% when presented with optimised and 95.5% with db6 wavelet coefficients. It is shown that the much simpler signal representation of a few wavelet coefficients obtained through an optimised discrete wavelet transform considerably facilitates the task of classifying non-stationary time-variant signals. In addition, the results indicate that wavelet optimisation may improve the classification ability of a neural network. (c) 2005 Elsevier B.V. All rights reserved.
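A minimal sketch of the wavelet-then-classify pipeline is given below, assuming PyWavelets and scikit-learn and using synthetic two-class signals in place of MIT-BIH records; the genetic-algorithm training and the adaptive (optimised) wavelets are omitted.

```python
# Sketch of the wavelet-then-classify pipeline: two-level db6 decomposition for
# dimensionality reduction, followed by linear discriminant analysis, on
# synthetic two-class "beats" rather than MIT-BIH data.
import numpy as np
import pywt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)

def make_beat(label, n=256):
    """Synthetic stand-in for a heartbeat segment: the two classes differ in a bump."""
    t = np.linspace(0, 1, n)
    base = np.exp(-((t - 0.5) ** 2) / 0.002)             # R-wave-like spike
    bump = 0.4 * np.exp(-((t - 0.7) ** 2) / 0.01) if label else 0.0
    return base + bump + 0.05 * rng.standard_normal(n)

def wavelet_features(signal, wavelet="db6", level=2):
    """Two-level DWT; keep only the coarsest approximation coefficients
    (roughly a quarter of the original samples, plus boundary coefficients)."""
    return pywt.wavedec(signal, wavelet, level=level)[0]

labels = rng.integers(0, 2, 400)
X = np.array([wavelet_features(make_beat(y)) for y in labels])
X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.3, random_state=0)

clf = LinearDiscriminantAnalysis().fit(X_tr, y_tr)
print(f"feature dimension: {X.shape[1]} (from 256 samples per beat)")
print(f"LDA test accuracy: {clf.score(X_te, y_te):.3f}")
```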
Abstract:
Associative memory networks such as Radial Basis Functions, Neurofuzzy and Fuzzy Logic used for modelling nonlinear processes suffer from the curse of dimensionality (COD), in that as the input dimension increases the parameterization, computation cost, training data requirements, etc. increase exponentially. Here a new algorithm is introduced for the construction of Delaunay input-space-partitioned optimal piecewise locally linear models, which overcome the COD and generate locally linear models directly amenable to linear control and estimation algorithms. The training of the model is configured as a new mixture of experts network with a new fast decision rule derived using convex set theory. A very fast simulated reannealing (VFSR) algorithm is utilized to search for a globally optimal solution of the Delaunay input space partition. A benchmark non-linear time series is used to demonstrate the new approach.
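The partition-and-fit idea can be sketched in a much-simplified form as follows: a Delaunay triangulation of a fixed set of vertices partitions the input space, and an affine model is fitted independently inside each simplex. The mixture-of-experts training, convex-set decision rule and VFSR search for optimal vertex placement described above are not reproduced.

```python
# Much-simplified sketch of Delaunay input-space partitioning with one affine
# model per simplex; vertex placement is fixed here rather than optimised.
import numpy as np
from scipy.spatial import Delaunay

rng = np.random.default_rng(3)

def target(x):
    """Illustrative nonlinear surface to approximate."""
    return np.sin(3 * x[:, 0]) * np.cos(2 * x[:, 1])

# Training data on the unit square and a fixed grid of partition vertices.
X = rng.uniform(0, 1, size=(2000, 2))
y = target(X)
vertices = np.array([[a, b] for a in np.linspace(0, 1, 5) for b in np.linspace(0, 1, 5)])
vertices = vertices + 1e-6 * rng.standard_normal(vertices.shape)  # avoid degenerate grid
tri = Delaunay(vertices)

# Fit an affine model y ~ c0 + c.x independently inside each simplex.
models = {}
simplex_of = tri.find_simplex(X)
for s in range(tri.nsimplex):
    mask = simplex_of == s
    if mask.sum() >= 3:
        A = np.column_stack([np.ones(mask.sum()), X[mask]])
        models[s] = np.linalg.lstsq(A, y[mask], rcond=None)[0]

def predict(x_new):
    s = int(tri.find_simplex(x_new[None, :])[0])
    coef = models.get(s, np.zeros(3))          # zero model outside the hull
    return coef[0] + coef[1:] @ x_new

test = rng.uniform(0, 1, size=(500, 2))
pred = np.array([predict(x) for x in test])
print(f"RMSE of piecewise-linear approximation: {np.sqrt(np.mean((pred - target(test)) ** 2)):.3f}")
```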
Abstract:
This paper derives exact discrete time representations for data generated by a continuous time autoregressive moving average (ARMA) system with mixed stock and flow data. The representations for systems comprised entirely of stocks or of flows are also given. In each case the discrete time representations are shown to be of ARMA form, the orders depending on those of the continuous time system. Three examples and applications are also provided, two of which concern the stationary ARMA(2, 1) model with stock variables (with applications to sunspot data and a short-term interest rate) and one concerning the nonstationary ARMA(2, 1) model with a flow variable (with an application to U.S. nondurable consumers’ expenditure). In all three examples the presence of an MA(1) component in the continuous time system has a dramatic impact on eradicating unaccounted-for serial correlation that is present in the discrete time version of the ARMA(2, 0) specification, even though the form of the discrete time model is ARMA(2, 1) for both models.
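The paper's ARMA(2, 1) derivations are not reproduced here, but the underlying idea can be illustrated with the simplest case: a continuous-time first-order (Ornstein-Uhlenbeck) process observed as a stock variable at sampling interval h has an exact discrete-time AR(1) representation whose parameters are exact functions of the continuous-time ones.

```latex
% Exact discrete-time representation of a continuous-time AR(1)
% (Ornstein--Uhlenbeck) process observed as a stock at interval h.
\[
  dx(t) = a\,x(t)\,dt + \sigma\,dW(t), \qquad a < 0,
\]
\[
  x(th) = e^{ah}\,x\big((t-1)h\big) + \eta_t, \qquad
  \eta_t \;\text{iid}\; \mathcal{N}\!\Big(0,\ \tfrac{\sigma^2}{2a}\big(e^{2ah}-1\big)\Big).
\]
```

With flow (time-aggregated) data, or with a moving-average component in the continuous-time system, a moving-average term also appears in the exact discrete-time representation, which is the general situation the paper treats.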
Abstract:
The detection of long-range dependence in time series analysis is an important task to which this paper contributes by showing that whilst the theoretical definition of a long-memory (or long-range dependent) process is based on the autocorrelation function, it is not possible for long memory to be identified using the sum of the sample autocorrelations, as usually defined. The reason for this is that the sample sum is a predetermined constant for any stationary time series; a result that is independent of the sample size. Diagnostic or estimation procedures, such as those in the frequency domain, that embed this sum are equally open to this criticism. We develop this result in the context of long memory, extending it to the implications for the spectral density function and the variance of partial sums of a stationary stochastic process. The results are further extended to higher order sample autocorrelations and the bispectral density. The corresponding result is that the sum of the third order sample (auto) bicorrelations at lags h,k≥1, is also a predetermined constant, different from that in the second order case, for any stationary time series of arbitrary length.
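For orientation, the predetermined constant in the second-order case follows from a short, standard identity, reproduced below under the usual divisor-n definition of the sample autocovariance; the higher-order bicorrelation analogue mentioned in the abstract is not reproduced.

```latex
% Standard identity behind the second-order claim: with the usual (biased,
% divisor-n) sample autocovariance and sample mean \bar{x}, the sum of the
% sample autocorrelations is the same constant for every series of length n.
\[
  \hat{\gamma}(k) = \frac{1}{n}\sum_{t=1}^{n-k}(x_t-\bar{x})(x_{t+k}-\bar{x}),
  \qquad
  \hat{\rho}(k) = \frac{\hat{\gamma}(k)}{\hat{\gamma}(0)} ,
\]
\[
  0 = \frac{1}{n}\Big(\sum_{t=1}^{n}(x_t-\bar{x})\Big)^{2}
    = \hat{\gamma}(0) + 2\sum_{k=1}^{n-1}\hat{\gamma}(k)
  \;\;\Longrightarrow\;\;
  \sum_{k=1}^{n-1}\hat{\rho}(k) = -\tfrac{1}{2},
\]
```

regardless of the data or the sample size, which is why no estimator that embeds this sum can detect long memory.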
Abstract:
Potential vorticity (PV) succinctly describes the evolution of large-scale atmospheric flow because of its material conservation and invertibility properties. However, diabatic processes in extratropical cyclones can modify PV and influence both mesoscale weather and the evolution of the synoptic-scale wave pattern. In this investigation, modification of PV by diabatic processes is diagnosed in a Met Office Unified Model (MetUM) simulation of a North Atlantic cyclone using a set of PV tracers. The structure of diabatic PV within the extratropical cyclone is investigated and linked to the processes responsible for it. On the mesoscale, a tripole of diabatic PV is generated across the tropopause fold extending down to the cold front. The structure results from a dipole in heating across the frontal interface due to condensation in the warm conveyor belt flanking the upper side of the fold and evaporation of precipitation in the dry intrusion and below. On isentropic surfaces intersecting the tropopause, positive diabatic PV is generated on the stratospheric side, while negative diabatic PV is generated on the tropospheric side. The stratospheric diabatic PV is generated primarily by long-wave cooling which peaks at the tropopause itself due to the sharp gradient in humidity there. The tropospheric diabatic PV originates locally from the long-wave radiation and non-locally by advection out of the top of heating associated with the large-scale cloud, convection and boundary layer schemes. In most locations there is no diabatic modification of PV at the tropopause itself but diabatic PV anomalies would influence the tropopause indirectly through the winds they induce and subsequent advection. The consequences of this diabatic PV dipole for the evolution of synoptic-scale wave patterns are discussed.
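For reference, the diabatic and frictional sources of Ertel PV that such tracer diagnostics partition among the model's physics schemes take the standard textbook form below; this is background, not a result of the paper.

```latex
% Standard (textbook) sources of Ertel PV from diabatic heating and friction.
\[
  \mathrm{PV} = \frac{1}{\rho}\,\boldsymbol{\zeta}_a \cdot \nabla\theta,
  \qquad
  \frac{D\,\mathrm{PV}}{Dt}
  = \frac{1}{\rho}\,\boldsymbol{\zeta}_a \cdot \nabla\dot{\theta}
  + \frac{1}{\rho}\,\big(\nabla \times \boldsymbol{F}\big) \cdot \nabla\theta,
\]
where $\boldsymbol{\zeta}_a$ is the absolute vorticity vector, $\dot{\theta}$ the
diabatic heating rate and $\boldsymbol{F}$ the frictional force per unit mass.
```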
Abstract:
For a Lévy process ξ=(ξt)t≥0 drifting to −∞, we define the so-called exponential functional as follows: I_ξ = ∫_0^∞ exp(ξ_t) dt. Under mild conditions on ξ, we show that the following factorization of exponential functionals holds: I_ξ = I_{H^−} × I_Y, where × stands for the product of independent random variables, H^− is the descending ladder height process of ξ and Y is a spectrally positive Lévy process with a negative mean constructed from its ascending ladder height process. As a by-product, we generate an integral or power series representation for the law of I_ξ for a large class of Lévy processes with two-sided jumps and also derive some new distributional properties. The proof of our main result relies on a fine Markovian study of a class of generalized Ornstein–Uhlenbeck processes, which is itself of independent interest. We use and refine an alternative approach of studying the stationary measure of a Markov process which avoids some technicalities and difficulties that appear in the classical method of employing the generator of the dual Markov process.
Abstract:
Nonlinear spectral transfers of kinetic energy and enstrophy, and stationary-transient interaction, are studied using global FGGE data for January 1979. It is found that the spectral transfers arise primarily from a combination, in roughly equal measure, of pure transient and mixed stationary-transient interactions. The pure transient interactions are associated with a transient eddy field which is approximately locally homogeneous and isotropic, and they appear to be consistently understood within the context of two-dimensional homogeneous turbulence. Theory based on spatial scale separation concepts suggests that the mixed interactions may be understood physically, to a first approximation, as a process of shear-induced spectral transfer of transient enstrophy along lines of constant zonal wavenumber. This essentially conservative enstrophy transfer generally involves highly nonlocal stationary-transient energy conversions. The observational analysis demonstrates that the shear-induced transient enstrophy transfer is mainly associated with intermediate-scale (zonal wavenumber m > 3) transients and is primarily to smaller (meridional) scales, so that the transient flow acts as a source of stationary energy. In quantitative terms, this transient-eddy rectification corresponds to a forcing timescale in the stationary energy budget which is of the same order of magnitude as most estimates of the damping timescale in simple stationary-wave models (5 to 15 days). Moreover, the nonlinear interactions involved are highly nonlocal and cover a wide range of transient scales of motion.
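The terminology of pure and mixed interactions follows from splitting each field into a stationary (time-mean) part and a transient deviation; schematically, in generic barotropic form (a standard decomposition, not the paper's specific diagnostics):

```latex
% Generic stationary/transient decomposition of the nonlinear advection term.
\[
  \psi = \bar{\psi} + \psi', \qquad \zeta = \bar{\zeta} + \zeta',
\]
\[
  J(\psi,\zeta)
  = \underbrace{J(\bar{\psi},\bar{\zeta})}_{\text{pure stationary}}
  + \underbrace{J(\bar{\psi},\zeta') + J(\psi',\bar{\zeta})}_{\text{mixed stationary--transient}}
  + \underbrace{J(\psi',\zeta')}_{\text{pure transient}} .
\]
```

Triads built from these terms then redistribute kinetic energy and enstrophy among the stationary and transient spectral components.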
Abstract:
Quasi-stationary convective bands can cause large localised rainfall accumulations and are often anchored by topographic features. Here, the predictability of and mechanisms causing one such band are determined using ensembles of the Met Office Unified Model at convection-permitting resolution (1.5 km grid length). The band was stationary over the UK for 3 h and produced rainfall accumulations of up to 34 mm. The amount and location of the predicted rainfall was highly variable despite only small differences between the large-scale conditions of the ensemble members. Only three of 21 members of the control ensemble produced a stationary rain band; these three had the weakest upstream winds and hence lowest Froude number. Band formation was due to the superposition of two processes: lee-side convergence resulting from flow around an upstream obstacle and thermally forced convergence resulting from elevated heating over the upstream terrain. Both mechanisms were enhanced when the Froude number was lower. By increasing the terrain height (thus reducing the Froude number), the band became more predictable. An ensemble approach is required to successfully predict the possible occurrence of such quasi-stationary convective events because the rainfall variability is largely modulated by small variations of the large-scale flow. However, high-resolution models are required to accurately resolve the small-scale interactions of the flow with the topography upon which the band formation depends. Thus, although topography provides some predictability, the quasi-stationary convective bands anchored by it are likely to remain a forecasting challenge for many years to come.
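The link drawn above between weaker upstream winds and a lower Froude number uses the standard non-dimensional ratio for stratified flow past an obstacle (definition only; the paper's values are not reproduced):

```latex
% Standard Froude number for stratified flow past an obstacle of height h_m,
% with upstream wind speed U and buoyancy frequency N.
\[
  \mathrm{Fr} = \frac{U}{N\,h_m},
\]
```

so weaker upstream flow, or higher terrain, lowers Fr and favours flow around rather than over the obstacle, consistent with the lee-side convergence mechanism described.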
Abstract:
A case of long-range transport of a biomass burning plume from Alaska to Europe is analyzed using a Lagrangian approach. This plume was sampled several times in the free troposphere over North America, the North Atlantic and Europe by three different aircraft during the IGAC Lagrangian 2K4 experiment which was part of the ICARTT/ITOP measurement intensive in summer 2004. Measurements in the plume showed enhanced values of CO, VOCs and NOy, mainly in the form of PAN. Observed O3 levels increased by 17 ppbv over 5 days. A photochemical trajectory model, CiTTyCAT, was used to examine processes responsible for the chemical evolution of the plume. The model was initialized with upwind data and compared with downwind measurements. The influence of high aerosol loading on photolysis rates in the plume was investigated using in situ aerosol measurements in the plume and lidar retrievals of optical depth as input into a photolysis code (Fast-J), run in the model. Significant impacts on photochemistry are found with a decrease of 18% in O3 production and 24% in O3 destruction over 5 days when including aerosols. The plume is found to be chemically active with large O3 increases attributed primarily to PAN decomposition during descent of the plume toward Europe. The predicted O3 changes are very dependent on temperature changes during transport and also on water vapor levels in the lower troposphere which can lead to O3 destruction. Simulation of mixing/dilution was necessary to reproduce observed pollutant levels in the plume. Mixing was simulated using background concentrations from measurements in air masses in close proximity to the plume, and mixing timescales (averaging 6.25 days) were derived from CO changes. Observed and simulated O3/CO correlations in the plume were also compared in order to evaluate the photochemistry in the model. Observed slopes change from negative to positive over 5 days. This change, which can be attributed largely to photochemistry, is well reproduced by multiple model runs, even if slope values are slightly underestimated, suggesting a small underestimation in modeled photochemical O3 production. The possible impact of this biomass burning plume on O3 levels in the European boundary layer was also examined by running the model for a further 5 days and comparing with data collected at surface sites, such as Jungfraujoch, which showed small O3 increases and elevated CO levels. The model predicts significant changes in O3 over the entire 10-day period due to photochemistry, but the signal is largely lost because of the effects of dilution. However, measurements in several other biomass burning plumes over Europe show that the O3 impact of Alaskan fires can potentially be significant over Europe.
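The mixing/dilution step mentioned above is often represented as a first-order relaxation toward background, with the timescale inferred from the decay of a quasi-passive tracer such as CO; the formulation below is a generic sketch of that idea and not necessarily the exact scheme used in the CiTTyCAT runs.

```latex
% Generic first-order dilution toward background (an illustrative assumption,
% not necessarily the exact mixing scheme used in the CiTTyCAT runs).
\[
  \left.\frac{dC}{dt}\right|_{\mathrm{mix}} = -\,\frac{C - C_{\mathrm{bg}}}{\tau_{\mathrm{mix}}},
\]
where $C$ is the in-plume mixing ratio, $C_{\mathrm{bg}}$ the background value sampled
near the plume, and $\tau_{\mathrm{mix}}$ the mixing timescale inferred from CO.
```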