36 results for joint time-frequency analysis
in Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
This correspondence studies the formulation of members of the Cohen-Posch class of positive time-frequency energy distributions. Minimization of cross-entropy measures with respect to different priors, and the case of no prior or maximum entropy, were considered. It is concluded that, in general, the information provided by the classical marginal constraints is very limited, and thus the final distribution depends heavily on the prior distribution. To overcome this limitation, joint time and frequency marginals are derived based on a "direction invariance" criterion on the time-frequency plane that are directly related to the fractional Fourier transform.
Abstract:
A compositional time series is obtained when a compositional data vector is observed at different points in time. Inherently, then, a compositional time series is a multivariate time series with important constraints on the variables observed at any instance in time. Although this type of data frequently occurs in situations of real practical interest, a trawl through the statistical literature reveals that research in the field is very much in its infancy and that many theoretical and empirical issues still remain to be addressed. Any appropriate statistical methodology for the analysis of compositional time series must take into account the constraints which are not allowed for by the usual statistical techniques available for analysing multivariate time series. One general approach to analysing compositional time series consists in the application of an initial transform to break the positive and unit-sum constraints, followed by the analysis of the transformed time series using multivariate ARIMA models. In this paper we discuss the use of the additive log-ratio, centred log-ratio and isometric log-ratio transforms. We also present results from an empirical study designed to explore how the selection of the initial transform affects subsequent multivariate ARIMA modelling as well as the quality of the forecasts.
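The three transforms named in this abstract have standard closed forms; the minimal sketch below implements them with NumPy for a single composition (the `ilr` uses one common choice of orthonormal basis; the paper does not prescribe a particular one).

```python
import numpy as np

def alr(x):
    # additive log-ratio: log of the first D-1 parts relative to the last part
    x = np.asarray(x, dtype=float)
    return np.log(x[:-1] / x[-1])

def clr(x):
    # centred log-ratio: log of each part relative to the geometric mean;
    # the D coordinates sum to zero
    x = np.asarray(x, dtype=float)
    g = np.exp(np.mean(np.log(x)))
    return np.log(x / g)

def ilr(x):
    # isometric log-ratio with a Helmert-type orthonormal basis (one common choice)
    x = np.asarray(x, dtype=float)
    D = len(x)
    y = np.log(x)
    out = np.empty(D - 1)
    for i in range(1, D):
        # balance between the geometric mean of the first i parts and part i+1
        out[i - 1] = np.sqrt(i / (i + 1.0)) * (np.mean(y[:i]) - y[i])
    return out

comp = np.array([0.2, 0.3, 0.5])
print(alr(comp))   # 2 unconstrained coordinates
print(clr(comp))   # 3 coordinates summing to 0
print(ilr(comp))   # 2 unconstrained coordinates
```

After transforming each observation of the series this way, the resulting unconstrained multivariate series can be fed to ordinary ARIMA machinery, which is the approach the abstract describes.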
Abstract:
Background: Oscillatory activity, which can be separated into background and oscillatory burst pattern activities, is supposed to be representative of local synchronies of neural assemblies. Oscillatory burst events should consequently play a specific functional role, distinct from background EEG activity, especially for cognitive tasks (e.g. working memory tasks), binding mechanisms and perceptual dynamics (e.g. visual binding), or in clinical contexts (e.g. effects of brain disorders). However, extracting oscillatory events in single trials, with a reliable and consistent method, is not a simple task. Results: In this work we propose a user-friendly stand-alone toolbox, which fits, in a reasonable time, a bump time-frequency model to the wavelet representations of a set of signals. The software is provided with a Matlab toolbox which can compute wavelet representations before automatically calling the stand-alone application. Conclusion: The tool is publicly available as freeware at http://www.bsp.brain.riken.jp/bumptoolbox/toolbox_home.html
Abstract:
The Wigner higher-order moment spectra (WHOS) are defined as extensions of the Wigner-Ville distribution (WD) to higher-order moment spectra domains. A general class of time-frequency higher-order moment spectra is also defined in terms of arbitrary higher-order moments of the signal, as generalizations of Cohen's general class of time-frequency representations. The properties of the general class of time-frequency higher-order moment spectra can be related to the properties of the WHOS, which are, in fact, extensions of the properties of the WD. Discrete time and frequency Wigner higher-order moment spectra (DTF-WHOS) distributions are introduced for signal processing applications and are shown to be implemented with two FFT-based algorithms. One application is presented where the Wigner bispectrum (WB), which is a WHOS in the third-order moment domain, is utilized for the detection of transient signals embedded in noise. The WB is compared with the WD in terms of simulation examples and analysis of real sonar data. It is shown that better detection schemes can be derived, at low signal-to-noise ratio, when the WB is applied.
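The second-order member of this family, the discrete Wigner-Ville distribution, can itself be computed with an FFT over the lag variable. The sketch below is a minimal pseudo-WVD in NumPy, not the paper's two DTF-WHOS algorithms; note that on this bin grid a tone at normalized frequency f peaks at bin 2·f·N because of the doubled lag in the WVD kernel.

```python
import numpy as np

def wigner_ville(x):
    # discrete pseudo Wigner-Ville distribution, one FFT per time column
    x = np.asarray(x, dtype=complex)
    N = len(x)
    W = np.zeros((N, N))
    for n in range(N):
        # largest admissible lag so that both n+m and n-m stay in range
        L = min(n, N - 1 - n)
        r = np.zeros(N, dtype=complex)
        for m in range(-L, L + 1):
            # instantaneous autocorrelation kernel x[n+m] x*[n-m]
            r[m % N] = x[n + m] * np.conj(x[n - m])
        W[:, n] = np.real(np.fft.fft(r))
    return W

# a complex tone at normalized frequency 0.25 concentrates on bin 2*0.25*64 = 32
t = np.arange(64)
sig = np.exp(2j * np.pi * 0.25 * t)
W = wigner_ville(sig)
print(W.shape)
```

The higher-order members (e.g. the Wigner bispectrum) replace the quadratic kernel with a third-order moment of the signal, at the cost of an extra frequency dimension.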
Abstract:
The mismatch negativity (MMN) is an electrophysiological marker of auditory change detection in the event-related brain potential and has been proposed to reflect an automatic comparison process between an incoming stimulus and the representation of prior items in a sequence. There is evidence for two main functional subcomponents comprising the MMN, generated by temporal and frontal brain areas, respectively. Using data obtained in an MMN paradigm, we performed time-frequency analysis to reveal the changes in oscillatory neural activity in the theta band. The results suggest that the frontal component of the MMN is brought about by an increase in theta power for the deviant trials and, possibly, by an additional contribution of theta phase alignment. By contrast, the temporal component of the MMN, best seen in recordings from mastoid electrodes, is generated by phase resetting of the theta rhythm with no concomitant power modulation. Thus, frontal and temporal MMN components do not only differ with regard to their functional significance but also appear to be generated by distinct neurophysiological mechanisms.
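The distinction the abstract draws, power increase versus phase alignment, maps onto two standard single-trial measures: mean wavelet power and inter-trial phase coherence (ITPC). A minimal sketch, with an assumed 250 Hz sampling rate and simulated phase-locked trials rather than the authors' EEG data:

```python
import numpy as np

fs = 250.0                         # sampling rate in Hz (assumed)
t = np.arange(0, 1, 1 / fs)
f_theta = 6.0                      # probe frequency inside the theta band

# simulated trials: a phase-locked 6 Hz rhythm plus broadband noise
rng = np.random.default_rng(0)
trials = np.array([np.cos(2 * np.pi * f_theta * t)
                   + 0.5 * rng.standard_normal(t.size) for _ in range(50)])

# complex Morlet wavelet centred on 6 Hz
w = 7.0                            # cycles parameter
sigma = w / (2 * np.pi * f_theta)
tw = np.arange(-0.5, 0.5, 1 / fs)
wavelet = np.exp(2j * np.pi * f_theta * tw) * np.exp(-tw**2 / (2 * sigma**2))
wavelet /= np.abs(wavelet).sum()

# complex time course of the 6 Hz component in every trial
analytic = np.array([np.convolve(tr, wavelet, mode="same") for tr in trials])

power = np.mean(np.abs(analytic) ** 2, axis=0)               # mean theta power
itpc = np.abs(np.mean(np.exp(1j * np.angle(analytic)), axis=0))  # in [0, 1]
print(itpc[len(t) // 2])           # high: trials are phase-locked
```

Phase resetting without a power change shows up as high ITPC with flat mean power, which is the signature the abstract attributes to the temporal MMN component.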
Abstract:
The study of the thermal behavior of complex packages such as multichip modules (MCMs) is usually carried out by measuring the so-called thermal impedance response, that is, the transient temperature after a power step. From the analysis of this signal, the thermal frequency response can be estimated and, consequently, compact thermal models may be extracted. We present a method to obtain an estimate of the time-constant distribution underlying the observed transient. The method is based on an iterative deconvolution that produces an approximation to the time-constant spectrum while preserving a convenient convolution form. The method is applied to the thermal response of a microstructure analyzed by the finite element method, as well as to the measured thermal response of a transistor-array integrated circuit (IC) in an SMD package.
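The forward model behind this kind of analysis is a sum of first-order exponentials, one per time constant; the deconvolution the paper proposes inverts this relation from a measured transient. The sketch below only shows the forward model, with a hypothetical three-element spectrum:

```python
import numpy as np

# hypothetical discrete time-constant spectrum: (amplitude in K/W, tau in s)
spectrum = [(2.0, 1e-3), (5.0, 1e-1), (3.0, 1.0)]

def thermal_step_response(t, spectrum):
    # transient temperature rise after a unit power step: each time constant
    # contributes r * (1 - exp(-t / tau))
    z = np.zeros_like(t)
    for r, tau in spectrum:
        z += r * (1.0 - np.exp(-t / tau))
    return z

t = np.logspace(-5, 1, 200)        # log-spaced, as thermal transients span decades
z = thermal_step_response(t, spectrum)
# for t much larger than every tau, z settles at the total thermal resistance
print(z[-1])
```

Estimating `spectrum` from a measured `z(t)` is ill-posed, which is why the paper resorts to an iterative deconvolution rather than a direct inversion.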
Abstract:
This paper introduces the approach of using Total Unduplicated Reach and Frequency (TURF) analysis to design a product line through a binary linear programming model. This improves the efficiency of the search for the solution to the problem compared with the algorithms that have been used to date. The results obtained with our exact algorithm are presented, and the method proves to be extremely efficient both in obtaining optimal solutions and in computing time for very large instances of the problem at hand. Furthermore, the proposed technique enables the model to be improved in order to overcome the main drawbacks of TURF analysis in practice.
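The TURF objective is easy to state: choose k products maximizing the number of respondents reached by at least one of them. The paper solves this as a binary linear program; the sketch below instead enumerates a tiny hypothetical instance exhaustively, just to make the objective concrete (exhaustive search is infeasible at the problem sizes the paper targets).

```python
import itertools

# hypothetical purchase data: one row per respondent, one column per
# candidate product; 1 means the respondent would buy that product
data = [
    [1, 0, 0, 1],
    [0, 1, 0, 0],
    [0, 1, 1, 0],
    [0, 0, 1, 0],
    [1, 0, 0, 0],
]
k = 2  # size of the product line

def reach(line):
    # respondents covered by at least one product in the line
    return sum(any(row[j] for j in line) for row in data)

best = max(itertools.combinations(range(len(data[0])), k), key=reach)
print(best, reach(best))
```

In the BLP formulation, a binary variable marks each selected product and each covered respondent, coverage is enforced with linear constraints, and the solver maximizes the sum of the respondent variables.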
Abstract:
Capital taxation is currently under debate, basically due to problems of administrative control and proper assessment of the levied assets. We analyze both problems, focusing on one capital tax, the annual wealth tax (WT), which is applied in only five OECD countries, Spain being one of them. We concentrate our analysis on the top 1% of the adult population, which permits us to describe the evolution of wealth concentration in Spain over 1983-2001. On average, the top 1% holds about 18% of total wealth, which rises to 19% when tax non-compliance and under-assessment are corrected for housing, the main asset. The evolution suggests wealth concentration has risen. Regarding the WT, we analyze whether it helps to reduce wealth inequality or, on the contrary, reinforces vertical inequity (due to special concessions) and horizontal inequity (due to the de iure and de facto different treatment of assets). We analyze housing and equity shares in detail. By means of a time series analysis, we relate the reported values to reasonable price indicators and proxies of the propensity to save. We infer that net tax compliance is extremely low; this includes both what we commonly understand by (gross) tax compliance and the degree of under-assessment due to fiscal legislation (for housing). That is especially true for housing, whose level of net tax compliance is well below 50%. Hence, we corroborate the difficulties in taxing capital, and so cast doubt on the current role of the WT in Spain in reducing wealth inequality.
Abstract:
R, available from http://www.r-project.org/, is 'GNU S': a language and environment for statistical computing and graphics. Many classical and modern statistical techniques have been implemented in the environment, but many more are supplied as packages. There are 8 standard packages, and many more are available through the CRAN family of Internet sites, http://cran.r-project.org. We started to develop a library of functions in R to support the analysis of mixtures, and our goal is a MixeR package for compositional data analysis that provides support for: operations on compositions (perturbation and power multiplication, subcomposition with or without residuals, centering of the data, computing Aitchison's, Euclidean and Bhattacharyya distances, compositional Kullback-Leibler divergence, etc.); graphical presentation of compositions in ternary diagrams and tetrahedrons with additional features (barycenter, geometric mean of the data set, percentile lines, marking and coloring of subsets of the data set and their geometric means, annotation of individual data in the set, etc.); dealing with zeros and missing values in compositional data sets, with R procedures for simple and multiplicative replacement strategies; and time series analysis of compositional data. We'll present the current status of MixeR development and illustrate its use on selected data sets.
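MixeR itself is an R package; as a language-neutral illustration of the core Aitchison operations it lists (perturbation, power multiplication and the Aitchison distance), here is a minimal NumPy sketch using their standard definitions:

```python
import numpy as np

def closure(x):
    # rescale a positive vector so its parts sum to 1
    x = np.asarray(x, dtype=float)
    return x / x.sum()

def perturb(x, y):
    # Aitchison perturbation: component-wise product, re-closed
    return closure(np.asarray(x) * np.asarray(y))

def power(x, a):
    # Aitchison power multiplication: component-wise power, re-closed
    return closure(np.asarray(x, dtype=float) ** a)

def aitchison_dist(x, y):
    # Euclidean distance between centred log-ratio coordinates
    lx, ly = np.log(x), np.log(y)
    cx, cy = lx - lx.mean(), ly - ly.mean()
    return np.linalg.norm(cx - cy)

x = closure([1.0, 2.0, 7.0])
y = closure([2.0, 2.0, 6.0])
print(perturb(x, y))
print(aitchison_dist(x, y))
```

Perturbation and power play the roles of addition and scalar multiplication on the simplex, which is what makes distances and averages of compositions well defined.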
Abstract:
In the past, sensor networks in cities have been limited to fixed sensors, embedded in particular locations, under centralised control. Today, new applications can leverage wireless devices and use them as sensors to create aggregated information. In this paper, we show that the emerging patterns unveiled through the analysis of large sets of aggregated digital footprints can provide novel insights into how people experience the city and into some of the drivers behind these emerging patterns. We particularly explore the capacity to quantify the evolution of the attractiveness of urban space with a case study in the area of the New York City Waterfalls, a public art project of four man-made waterfalls rising from New York Harbor. Methods to study the impact of an event of this nature are traditionally based on the collection of static information, such as surveys and ticket-based people counts, which allow estimates of visitors' presence in specific areas over time to be generated. In contrast, our contribution makes use of the dynamic data that visitors generate, such as the density and distribution of aggregate phone calls and photos taken in different areas of interest and over time. Our analysis provides novel ways to quantify the impact of a public event on the distribution of visitors and on the evolution of the attractiveness of the points of interest in proximity. This information has potential uses for local authorities, researchers, as well as service providers such as mobile network operators.
Abstract:
This study presents new evidence concerning the uneven processes of industrialization in nineteenth-century Spain and Italy, based on a disaggregate analysis of the productive sectors from which the behaviour of the aggregate indices is comprised. The use of multivariate time-series analysis techniques can aid our understanding and characterization of these two processes of industrialization. The identification of those sectors with key roles in leading industrial growth provides new evidence concerning the factors that governed the behaviour of the aggregates in the two economies. In addition, the analysis of the existence of interindustry linkages reveals the scale of the industrialization process and, where significant differences exist, accounts for many of the divergences recorded in the historiography for the period 1850-1913.
Abstract:
We apply the formalism of the continuous-time random walk to the study of financial data. The entire distribution of prices can be obtained once two auxiliary densities are known: the probability density for the pausing time between successive jumps and the corresponding probability density for the magnitude of a jump. We have applied the formalism to data on the U.S. dollar-deutsche mark futures exchange, finding good agreement between theory and the observed data.
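A continuous-time random walk is straightforward to simulate once the two auxiliary densities are chosen. The sketch below uses exponential pausing times and Laplace-distributed jump magnitudes as illustrative choices; these are assumptions for the example, not the densities fitted to the futures data in the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_ctrw(n_jumps, mean_wait=1.0, jump_scale=0.01):
    # continuous-time random walk: draw a pausing time before each jump,
    # then a jump magnitude; the path is piecewise constant between jumps
    waits = rng.exponential(mean_wait, n_jumps)      # pausing-time density
    jumps = rng.laplace(0.0, jump_scale, n_jumps)    # jump-magnitude density
    times = np.cumsum(waits)                         # epochs of the jumps
    log_price = np.cumsum(jumps)                     # log-price after each jump
    return times, log_price

times, log_price = simulate_ctrw(10_000)
print(times[-1], log_price[-1])
```

In the formalism, the unconditional distribution of the price at a given time follows from these two densities via the Montroll-Weiss renewal equations, typically evaluated in Fourier-Laplace space.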
Abstract:
This paper presents a new method to analyze time-invariant linear networks allowing the existence of inconsistent initial conditions. The method is based on the use of distributions and state equations. Any time-invariant linear network can be analyzed, and the network can involve any kind of pure or controlled sources. In addition, the transfers of energy that occur at t = 0 are determined, and the concept of connection energy is introduced. The algorithms are easily implemented in a computer program.
Abstract:
Seismic methods used in the study of snow avalanches may be employed to detect and characterize landslides and other mass movements, using standard spectrogram/sonogram analysis. For snow avalanches, the spectrogram for a station that is approached by a sliding mass exhibits a triangular time/frequency signature due to an increase over time in the higher-frequency constituents. Recognition of this characteristic footprint in a spectrogram suggests a useful metric for identifying other mass-movement events such as landslides. The 1 June 2005 slide at Laguna Beach, California, is examined using data obtained from the Caltech/USGS Regional Seismic Network. This event exhibits the same general spectrogram features observed in studies of Alpine snow avalanches. We propose that these features are due to the systematic relative increase in high-frequency energy transmitted to a seismometer in the path of a mass slide owing to a reduction of distance from the source signal. This phenomenon is related to the path of the waves, whose high frequencies are less attenuated as they traverse shorter source-receiver paths. Entrainment of material in the course of the slide may also contribute to the triangular time/frequency signature as a consequence of the increase in the energy involved in the process; in this case the contribution would be a source effect. By applying this commonly observed characteristic to routine monitoring algorithms, along with custom adjustments for local site effects, we seek to contribute to the improvement of automatic detection and monitoring methods for landslides and other mass movements.
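The triangular signature described here is simply a dominant frequency that rises over time, which a plain magnitude spectrogram makes visible. A minimal sketch on a synthetic rising-frequency signal standing in for an approaching slide (the sampling rate, sweep and window sizes are illustrative assumptions, not values from the study):

```python
import numpy as np

fs = 100.0                                  # sampling rate in Hz (assumed)
t = np.arange(0, 20, 1 / fs)
# toy signal: frequency content rises over time, as for an approaching source
f_inst = 1.0 + 1.5 * t                      # instantaneous frequency in Hz
sig = np.sin(2 * np.pi * np.cumsum(f_inst) / fs)

def spectrogram(x, nwin=256, hop=128):
    # magnitude short-time Fourier transform with a Hann window
    win = np.hanning(nwin)
    frames = [x[i:i + nwin] * win for i in range(0, len(x) - nwin + 1, hop)]
    return np.abs(np.fft.rfft(frames, axis=1)).T   # frequency x time

S = spectrogram(sig)
peak_bins = S.argmax(axis=0)                # dominant frequency bin per frame
print(peak_bins[0], peak_bins[-1])          # the peak climbs over time
```

A monitoring algorithm along the lines the abstract suggests could track `peak_bins` (or the spectral centroid) and flag sustained upward trends as candidate mass-movement events.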