35 results for Time-frequency analysis
Abstract:
This correspondence studies the formulation of members of the Cohen-Posch class of positive time-frequency energy distributions. Minimization of cross-entropy measures with respect to different priors and the case of no prior or maximum entropy were considered. It is concluded that, in general, the information provided by the classical marginal constraints is very limited, and thus, the final distribution heavily depends on the prior distribution. To overcome this limitation, joint time and frequency marginals are derived based on a "direction invariance" criterion on the time-frequency plane that are directly related to the fractional Fourier transform.
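As a worked sketch of the formulation described above (in standard notation, which may differ from the paper's own): the minimum cross-entropy member of the class is the positive distribution P(t,f) closest to a prior Q(t,f),

\min_{P \geq 0} \; H(P \| Q) = \int\!\!\int P(t,f) \, \ln \frac{P(t,f)}{Q(t,f)} \, dt \, df

subject to the classical marginal constraints

\int P(t,f) \, df = |s(t)|^2, \qquad \int P(t,f) \, dt = |S(f)|^2,

where s(t) is the signal and S(f) its Fourier transform. With a uniform prior Q this reduces to the maximum entropy case, and, as the abstract notes, these two marginals alone constrain P only weakly.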
Abstract:
A compositional time series is obtained when a compositional data vector is observed at different points in time. Inherently, then, a compositional time series is a multivariate time series with important constraints on the variables observed at any instance in time. Although this type of data frequently occurs in situations of real practical interest, a trawl through the statistical literature reveals that research in the field is very much in its infancy and that many theoretical and empirical issues still remain to be addressed. Any appropriate statistical methodology for the analysis of compositional time series must take into account the constraints which are not allowed for by the usual statistical techniques available for analysing multivariate time series. One general approach to analyzing compositional time series consists in the application of an initial transform to break the positive and unit-sum constraints, followed by the analysis of the transformed time series using multivariate ARIMA models. In this paper we discuss the use of the additive log-ratio, centred log-ratio and isometric log-ratio transforms. We also present results from an empirical study designed to explore how the selection of the initial transform affects subsequent multivariate ARIMA modelling as well as the quality of the forecasts.
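As an illustration of the three initial transforms mentioned above, a minimal NumPy/SciPy sketch (the function names and the example composition are ours, not the paper's):

import numpy as np
from scipy.linalg import helmert

def alr(x):
    # Additive log-ratio: log of each part relative to the last part.
    return np.log(x[:-1] / x[-1])

def clr(x):
    # Centred log-ratio: log of each part relative to the geometric mean.
    return np.log(x) - np.log(x).mean()

def ilr(x):
    # Isometric log-ratio: clr coordinates expressed in an orthonormal
    # basis of the clr hyperplane (here a Helmert-type contrast matrix).
    return helmert(len(x)) @ clr(x)

x = np.array([0.2, 0.5, 0.3])   # a toy composition (parts sum to one)
print(alr(x), clr(x), ilr(x))

Applying one of these transforms componentwise over time yields an unconstrained multivariate series to which standard ARIMA machinery can be fitted.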
Abstract:
Background: Oscillatory activity, which can be separated into background and oscillatory burst pattern activities, is supposed to be representative of local synchronies of neural assemblies. Oscillatory burst events should consequently play a specific functional role, distinct from background EEG activity, especially for cognitive tasks (e.g. working memory tasks), binding mechanisms and perceptual dynamics (e.g. visual binding), or in clinical contexts (e.g. effects of brain disorders). However, extracting oscillatory events in single trials, with a reliable and consistent method, is not a simple task. Results: In this work we propose a user-friendly stand-alone toolbox which fits, in reasonable time, a bump time-frequency model to the wavelet representations of a set of signals. The software is provided with a Matlab toolbox which can compute wavelet representations before calling the stand-alone application automatically. Conclusion: The tool is publicly available as freeware at http://www.bsp.brain.riken.jp/bumptoolbox/toolbox_home.html
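The toolbox itself is a stand-alone application with a Matlab front end; purely as an illustration of the kind of wavelet time-frequency map a bump model is fitted to, here is a minimal Python sketch (all names and parameter values are ours, not the toolbox's API):

import numpy as np

def morlet_tfmap(x, fs, freqs, n_cycles=7.0):
    # Complex Morlet scalogram: squared magnitude of the convolution of x
    # with a unit-energy Morlet wavelet at each analysis frequency.
    tf = np.empty((len(freqs), len(x)))
    for i, f in enumerate(freqs):
        sigma_t = n_cycles / (2.0 * np.pi * f)            # temporal width
        t = np.arange(-4 * sigma_t, 4 * sigma_t, 1.0 / fs)
        w = np.exp(2j * np.pi * f * t) * np.exp(-t**2 / (2 * sigma_t**2))
        w /= np.sqrt(np.sum(np.abs(w) ** 2))              # unit energy
        tf[i] = np.abs(np.convolve(x, w, mode="same")) ** 2
    return tf

fs = 250.0                                     # hypothetical EEG sampling rate
t = np.arange(0, 4, 1 / fs)
x = np.sin(2 * np.pi * 10 * t) * np.exp(-((t - 2) ** 2) / 0.02)  # one "burst"
tf = morlet_tfmap(x, fs, freqs=np.arange(5.0, 30.0))

A bump model then approximates such a map as a sum of localized parametric bumps, separating oscillatory events from the background.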
Abstract:
The Wigner higher-order moment spectra (WHOS) are defined as extensions of the Wigner-Ville distribution (WD) to higher-order moment spectra domains. A general class of time-frequency higher-order moment spectra is also defined in terms of arbitrary higher-order moments of the signal, as generalizations of Cohen's general class of time-frequency representations. The properties of the general class of time-frequency higher-order moment spectra can be related to the properties of WHOS, which are, in fact, extensions of the properties of the WD. Discrete time and frequency Wigner higher-order moment spectra (DTF-WHOS) distributions are introduced for signal processing applications and are shown to be implemented with two FFT-based algorithms. One application is presented where the Wigner bispectrum (WB), which is a WHOS in the third-order moment domain, is utilized for the detection of transient signals embedded in noise. The WB is compared with the WD in terms of simulation examples and analysis of real sonar data. It is shown that better detection schemes can be derived, at low signal-to-noise ratio, when the WB is applied.
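For reference, the Wigner-Ville distribution underlying these extensions is

W_s(t,f) = \int s\!\left(t+\tfrac{\tau}{2}\right) s^{*}\!\left(t-\tfrac{\tau}{2}\right) e^{-j 2\pi f \tau} \, d\tau,

and one common symmetric form of its third-order extension, the Wigner bispectrum, is (stated here only as a sketch; the paper's exact lag symmetrization may differ)

W_s(t,f_1,f_2) = \int\!\!\int s^{*}\!\left(t-\tfrac{\tau_1+\tau_2}{3}\right) s\!\left(t+\tfrac{2\tau_1-\tau_2}{3}\right) s\!\left(t+\tfrac{2\tau_2-\tau_1}{3}\right) e^{-j 2\pi (f_1\tau_1 + f_2\tau_2)} \, d\tau_1 \, d\tau_2.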
Abstract:
The mismatch negativity (MMN) is an electrophysiological marker of auditory change detection in the event-related brain potential and has been proposed to reflect an automatic comparison process between an incoming stimulus and the representation of prior items in a sequence. There is evidence for two main functional subcomponents comprising the MMN, generated by temporal and frontal brain areas, respectively. Using data obtained in an MMN paradigm, we performed time-frequency analysis to reveal the changes in oscillatory neural activity in the theta band. The results suggest that the frontal component of the MMN is brought about by an increase in theta power for the deviant trials and, possibly, by an additional contribution of theta phase alignment. By contrast, the temporal component of the MMN, best seen in recordings from mastoid electrodes, is generated by phase resetting of the theta rhythm with no concomitant power modulation. Thus, frontal and temporal MMN components not only differ with regard to their functional significance but also appear to be generated by distinct neurophysiological mechanisms.
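The distinction drawn above, power increase versus phase alignment, can be made concrete. A minimal sketch, assuming a hypothetical trials-by-samples array of epoched EEG (all names and parameter values are ours):

import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 250.0                                     # assumed sampling rate (Hz)
b, a = butter(4, [4.0, 8.0], btype="bandpass", fs=fs)   # theta band

def theta_power_and_itc(epochs):
    # epochs: array of shape (n_trials, n_samples).
    analytic = hilbert(filtfilt(b, a, epochs, axis=1), axis=1)
    power = np.mean(np.abs(analytic) ** 2, axis=0)   # trial-averaged theta power
    # Inter-trial coherence: length of the mean unit phase vector; values
    # near 1 indicate phase alignment across trials even without any
    # accompanying power change.
    itc = np.abs(np.mean(analytic / np.abs(analytic), axis=0))
    return power, itc

epochs = np.random.randn(40, int(1.0 * fs))    # 40 placeholder one-second trials
power, itc = theta_power_and_itc(epochs)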
Abstract:
The study of the thermal behavior of complex packages such as multichip modules (MCMs) is usually carried out by measuring the so-called thermal impedance response, that is, the transient temperature after a power step. From the analysis of this signal, the thermal frequency response can be estimated and, consequently, compact thermal models may be extracted. We present a method to obtain an estimate of the time-constant distribution underlying the observed transient. The method is based on an iterative deconvolution that produces an approximation to the time-constant spectrum while preserving a convenient convolution form. The method is applied to the thermal response of a microstructure analyzed by the finite element method, as well as to the measured thermal response of a transistor array integrated circuit (IC) in an SMD package.
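The convolution form the method preserves is, in the classical time-constant-spectrum notation (a sketch; the paper's own notation may differ): writing the heating curve as a superposition of exponentials over the time-constant spectrum R(τ),

a(t) = \int_{0}^{\infty} R(\tau) \left(1 - e^{-t/\tau}\right) d\ln\tau,

the logarithmic substitutions z = \ln t, \zeta = \ln\tau turn its derivative into a convolution,

\frac{da}{dz}(z) = \int_{-\infty}^{\infty} R(\zeta)\, w(z-\zeta)\, d\zeta, \qquad w(z) = \exp\!\left(z - e^{z}\right),

so an estimate of R follows by iteratively deconvolving the fixed kernel w from the measured transient.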
Abstract:
This paper introduces the approach of using Total Unduplicated Reach and Frequency (TURF) analysis to design a product line through a binary linear programming model. This improves the efficiency of the search for the solution to the problem compared with the algorithms that have been used to date. The results obtained with our exact algorithm are presented, and the method proves to be extremely efficient, both in obtaining optimal solutions and in computing time, for very large instances of the problem at hand. Furthermore, the proposed technique enables the model to be improved in order to overcome the main drawbacks presented by TURF analysis in practice.
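A standard reach-maximising binary linear program of the kind referred to (a sketch; the paper's exact model may differ): with a_{ij} = 1 if respondent i is reached by product j,

\max \sum_{i} y_i \quad \text{s.t.} \quad y_i \le \sum_{j} a_{ij}\, x_j \;\; \forall i, \qquad \sum_{j} x_j = k, \qquad x_j, y_i \in \{0,1\},

where x_j selects product j for a line of size k and y_i indicates that respondent i is reached by at least one selected product.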
Abstract:
Capital taxation is currently under debate, basically due to problems of administrative control and proper assessment of the levied assets. We analyze both problems, focusing on a capital tax, the annual wealth tax (WT), which is applied in only five OECD countries, Spain being one of them. We concentrate our analysis on the top 1% of the adult population, which permits us to describe the evolution of wealth concentration in Spain over 1983-2001. On average, the top 1% holds about 18% of total wealth, which rises to 19% when tax non-compliance and under-assessment are corrected for housing, the main asset. The evolution suggests wealth concentration has risen. Regarding the WT, we analyze whether it helps to reduce wealth inequality or whether, on the contrary, it reinforces vertical inequity (due to special concessions) and horizontal inequity (due to the de iure and de facto different treatment of assets). We analyze housing and equity shares in detail. By means of a time series analysis, we relate the reported values to reasonable price indicators and proxies of the propensity to save. We infer that net tax compliance is extremely low; this includes both what we commonly understand by (gross) tax compliance and the degree of under-assessment due to fiscal legislation (for housing). That is especially true for housing, whose level of net tax compliance is well below 50%. Hence, we corroborate the difficulties in taxing capital, and thus cast doubts on the current role of the WT in Spain in reducing wealth inequality.
Abstract:
R, from http://www.r-project.org/, is 'GNU S': a language and environment for statistical computing and graphics. Many classical and modern statistical techniques have been implemented in the base environment, and many more are supplied as packages. There are 8 standard packages, and many more are available through the CRAN family of Internet sites, http://cran.r-project.org. We have started to develop a library of functions in R to support the analysis of mixtures, and our goal is a MixeR package for compositional data analysis that provides support for: operations on compositions (perturbation and power multiplication, subcomposition with or without residuals, centering of the data, computing Aitchison, Euclidean and Bhattacharyya distances, compositional Kullback-Leibler divergence, etc.); graphical presentation of compositions in ternary diagrams and tetrahedrons with additional features (barycenter, geometric mean of the data set, percentile lines, marking and coloring of subsets of the data set, their geometric means, annotation of individual data in the set, ...); dealing with zeros and missing values in compositional data sets, with R procedures for simple and multiplicative replacement strategies; and the time series analysis of compositional data. We will present the current status of MixeR development and illustrate its use on selected data sets.
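As a hedged sketch of the compositional operations listed above, written in NumPy rather than R (this is an analogy for illustration, not MixeR's API):

import numpy as np

def close(x):
    # Closure: rescale positive parts so they sum to one.
    return x / x.sum()

def perturb(x, y):
    # Perturbation: the group operation of Aitchison geometry.
    return close(x * y)

def power(x, alpha):
    # Power multiplication: scalar multiplication on the simplex.
    return close(x ** alpha)

def aitchison_dist(x, y):
    # Aitchison distance: Euclidean distance between clr coordinates.
    clr = lambda v: np.log(v) - np.log(v).mean()
    return float(np.linalg.norm(clr(x) - clr(y)))

x, y = np.array([0.2, 0.5, 0.3]), np.array([0.4, 0.4, 0.2])
print(perturb(x, y), power(x, 2.0), aitchison_dist(x, y))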
Abstract:
This study presents new evidence concerning the uneven processes of industrialization in nineteenth century Spain and Italy based on a disaggregate analysis of the productive sectors from which the behaviour of the aggregate indices is comprised. The use of multivariate time-series analysis techniques can aid our understanding and characterization of these two processes of industrialization. The identification of those sectors with key roles in leading industrial growth provides new evidence concerning the factors that governed the behaviour of the aggregates in the two economies. In addition, the analysis of the existence of interindustry linkages reveals the scale of the industrialization process and, where significant differences exist, accounts for many of the divergences recorded in the historiography for the period 1850-1913.
Abstract:
We apply the formalism of the continuous-time random walk (CTRW) to the study of financial data. The entire distribution of prices can be obtained once two auxiliary densities are known: the probability density for the pausing time between successive jumps and the corresponding probability density for the magnitude of a jump. We have applied the formalism to data on the U.S. dollar-Deutsche mark futures exchange, finding good agreement between theory and the observed data.
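The statement that the two auxiliary densities determine the entire price distribution is the content of the Montroll-Weiss equation (given here in standard CTRW notation as a sketch, not necessarily the paper's): in Fourier-Laplace space,

\hat{p}(k,s) = \frac{1-\hat{\psi}(s)}{s}\,\frac{1}{1-\hat{\psi}(s)\,\hat{\lambda}(k)},

where \hat{\psi}(s) is the Laplace transform of the pausing-time density and \hat{\lambda}(k) the characteristic function of the jump-magnitude density; inverting the transforms yields the propagator of price changes.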
Abstract:
This paper presents a new method to analyze time-invariant linear networks allowing the existence of inconsistent initial conditions. The method is based on the use of distributions and state equations. Any time-invariant linear network can be analyzed, and the network can involve any kind of pure or controlled sources. In addition, the transfers of energy that occur at t=0 are determined, and the concept of connection energy is introduced. The algorithms are easily implemented in a computer program.
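A sketch of the setup in standard state-space notation (the paper's own formalism may differ): over the space of distributions, the network satisfies

\dot{x}(t) = A\,x(t) + B\,u(t),

and an inconsistent initial condition x(0^-) \ne x(0^+) means the distributional derivative acquires an impulsive term \big(x(0^+)-x(0^-)\big)\,\delta(t). Resolving this jump fixes x(0^+) and, with it, the energy exchanged instantaneously at t=0, after which the classical solution

x(t) = e^{At}\,x(0^+) + \int_{0^+}^{t} e^{A(t-\sigma)}\,B\,u(\sigma)\,d\sigma, \quad t>0,

applies.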
Abstract:
Seismic methods used in the study of snow avalanches may be employed to detect and characterize landslides and other mass movements, using standard spectrogram/sonogram analysis. For snow avalanches, the spectrogram for a station that is approached by a sliding mass exhibits a triangular time/frequency signature due to an increase over time in the higher-frequency constituents. Recognition of this characteristic footprint in a spectrogram suggests a useful metric for identifying other mass-movement events such as landslides. The 1 June 2005 slide at Laguna Beach, California, is examined using data obtained from the Caltech/USGS Regional Seismic Network. This event exhibits the same general spectrogram features observed in studies of Alpine snow avalanches. We propose that these features are due to the systematic relative increase in high-frequency energy transmitted to a seismometer in the path of a mass slide, owing to the reduction in distance from the signal source. This phenomenon is related to the path of the waves, whose high frequencies are less attenuated as they traverse shorter source-receiver paths. Entrainment of material in the course of the slide may also contribute to the triangular time/frequency signature as a consequence of the increase in the energy involved in the process; in this case the contribution would be a source effect. By applying this commonly observed characteristic to routine monitoring algorithms, along with custom adjustments for local site effects, we seek to contribute to the improvement of automatic detection and monitoring methods for landslides and other mass movements.
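A minimal sketch of the standard spectrogram analysis referred to above, using SciPy (the trace is a placeholder and all parameter values are ours):

import numpy as np
from scipy.signal import spectrogram

fs = 100.0                                   # assumed sampling rate (Hz)
trace = np.random.randn(int(600 * fs))       # placeholder 10-minute seismogram

f, t, Sxx = spectrogram(trace, fs=fs, nperseg=256, noverlap=192)
# For an approaching slide, energy in Sxx should extend to progressively
# higher frequencies f as time t advances: the triangular signature.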
Abstract:
After a rockfall event, a usual post-event survey includes qualitative volume estimation, trajectory mapping and determination of departure zones. However, quantitative measurements are not usually made. Additional relevant quantitative information could be useful in determining the spatial occurrence of rockfall events and would help us quantify their size. Seismic measurements could be suitable for detection purposes, since they are non-invasive methods and are relatively inexpensive. Moreover, seismic techniques could provide important information on rockfall size and the location of impacts. On 14 February 2007 the Avalanche Group of the University of Barcelona obtained the seismic data generated by an artificially triggered rockfall event at the Montserrat massif (near Barcelona, Spain), carried out in order to purge a slope. Two 3-component seismic stations were deployed in the area, about 200 m from the explosion point that triggered the rockfall. Seismic signals and video images were obtained simultaneously. The initial volume of the rockfall was estimated at 75 m³ by laser scanner data analysis. After the explosion, dozens of boulders ranging from 10⁻⁴ to 5 m³ in volume impacted on the ground at different locations. The blocks fell onto a terrace 120 m below the release zone. The impact generated a small continuous mass movement composed of a mixture of rocks, sand and dust that ran down the slope and impacted on the road 60 m below. Time, time-frequency evolution and particle motion analysis of the seismic records, as well as seismic energy estimation, were performed. The results are as follows: (1) a rockfall event generates seismic signals with specific characteristics in the time domain; (2) the seismic signals generated by the mass movement show a time-frequency evolution different from that of other seismogenic sources (e.g. earthquakes, explosions or a single rock impact), a feature that could be used for detection purposes; (3) particle motion plot analysis shows that the procedure to locate the rock impact using two stations is feasible; (4) the feasibility and validity of seismic methods for the detection of rockfall events, their localization and size determination are confirmed.