977 results for FOURIER-ANALYSIS
Abstract:
Fourier analysis makes it possible to characterize a tooth's outline from a given number of points and to extract a series of parameters for subsequent multivariate analysis. However, the great complexity of some conformations makes it necessary to check how many points are needed to represent them correctly. The aim of this work is to apply and validate Fourier analyses (polar and elliptic) in the study of dental shape using different numbers of outline points, and to explore morphometric variability across genera. Digital photographs of the occlusal surface of lower second molars (M2s) were obtained for four primate species (Hylobates moloch, Gorilla beringei graueri, Pongo pygmaeus pygmaeus and Pan troglodytes schweinfurthii); their outlines were defined with 30, 40, 60, 80, 100 and 120 points and their shape represented with 10 harmonics. Morphometric variability was analysed by applying discriminant analyses and an NP-MANOVA on distance matrices to determine variability and correct-classification percentages at the methodological and taxonomic levels. The results indicate that Fourier-series shape analyses can capture the morphometric variability of M2s in Hominoidea genera regardless of the number of outline points (30 to 120). Classification percentages are more variable and lower with the polar series (≈60-90%) than with the elliptic series (75-100%). Between 60 and 100 outline points with the elliptic method guarantees a correct description of tooth shape.
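As a rough illustration of Fourier outline analysis (this sketch uses complex Fourier descriptors, a simpler relative of the elliptic and polar series used in the paper, and is not the authors' implementation):

```python
import cmath
import math

def fourier_descriptors(points, n_harmonics=10):
    """Complex Fourier descriptors of a closed 2-D contour.

    points: list of (x, y) tuples tracing the outline.
    Returns the magnitudes of the first n_harmonics coefficients,
    normalized by the first harmonic for scale invariance.
    """
    z = [complex(x, y) for x, y in points]
    n = len(z)
    coeffs = []
    for k in range(1, n_harmonics + 1):
        c = sum(z[m] * cmath.exp(-2j * math.pi * k * m / n)
                for m in range(n)) / n
        coeffs.append(abs(c))
    scale = coeffs[0] or 1.0
    return [c / scale for c in coeffs]

# A circle sampled at 60 contour points: all shape energy
# should sit in the first harmonic.
circle = [(math.cos(2 * math.pi * i / 60), math.sin(2 * math.pi * i / 60))
          for i in range(60)]
d = fourier_descriptors(circle)
```

The resulting harmonic magnitudes are the kind of shape parameters that can then feed a discriminant analysis, independently of how densely the contour was sampled.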
Abstract:
Vehicle operations in underwater environments are often compromised by poor visibility conditions. For instance, the perception range of optical devices is heavily constrained in turbid waters, thus complicating navigation and mapping tasks in environments such as harbors, bays, or rivers. A new generation of high-definition forward-looking sonars providing acoustic imagery at high frame rates has recently emerged as a promising alternative for working under these challenging conditions. However, the characteristics of the sonar data introduce difficulties in image registration, a key step in mosaicing and motion estimation applications. In this work, we propose the use of a Fourier-based registration technique capable of handling the low resolution, noise, and artifacts associated with sonar image formation. When compared to a state-of-the-art region-based technique, our approach shows superior performance in the alignment of both consecutive and nonconsecutive views as well as higher robustness in featureless environments. The method is used to compute pose constraints between sonar frames that, integrated inside a global alignment framework, enable the rendering of consistent acoustic mosaics with high detail and increased resolution. An extensive experimental section is reported showing results in relevant field applications, such as ship hull inspection and harbor mapping.
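The core idea behind Fourier-based registration can be sketched with phase correlation, shown here in 1-D for brevity (the paper's method operates on 2-D sonar frames and is considerably more elaborate):

```python
import cmath
import math

def dft(x):
    n = len(x)
    return [sum(x[m] * cmath.exp(-2j * math.pi * k * m / n) for m in range(n))
            for k in range(n)]

def idft(X):
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * math.pi * k * m / n) for k in range(n)) / n
            for m in range(n)]

def phase_correlation_shift(a, b):
    """Estimate the circular shift s such that b[m] == a[(m - s) % n]."""
    A, B = dft(a), dft(b)
    # Normalized cross-power spectrum: keep only the phase, discard
    # amplitude, which makes the correlation peak sharp and robust
    # to intensity changes and broadband noise.
    R = []
    for Ak, Bk in zip(A, B):
        c = Bk * Ak.conjugate()
        R.append(c / abs(c) if abs(c) > 1e-12 else 0j)
    r = idft(R)
    return max(range(len(r)), key=lambda i: r[i].real)

n = 32
a = [math.sin(0.3 * m) + (m % 7) for m in range(n)]
b = [a[(m - 5) % n] for m in range(n)]  # "a" shifted right by 5 samples
```

Because only phase information is used, the estimate degrades gracefully in low-texture scenes, which is one motivation for preferring it over region-based matching in featureless environments.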
Abstract:
The goal of this work is to assess the efficacy of texture measures for estimating levels of crowd density in images. This estimation is crucial for the problem of crowd monitoring and control. The assessment is carried out on a set of nearly 300 real images captured from Liverpool Street Train Station, London, UK, using texture measures extracted from the images through four different methods: gray-level dependence matrices, straight line segments, Fourier analysis, and fractal dimensions. The estimates of crowd density are given in terms of the classification of the input images into five density classes (very low, low, moderate, high, and very high). Three types of classifiers are used: neural (implemented according to the Kohonen model), Bayesian, and an approach based on fitting functions. The results obtained by these three classifiers, using the four texture measures, support the conclusion that texture analysis is very effective for the problem of crowd density estimation.
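Of the four texture measures, the gray-level dependence (co-occurrence) matrix is the simplest to sketch; a toy Python version, illustrative only and not the authors' code:

```python
def glcm_features(img, levels, dx=1, dy=0):
    """Gray-level co-occurrence matrix for offset (dx, dy), plus two
    Haralick-style texture features derived from it."""
    rows, cols = len(img), len(img[0])
    glcm = [[0.0] * levels for _ in range(levels)]
    total = 0
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dy, c + dx
            if 0 <= r2 < rows and 0 <= c2 < cols:
                glcm[img[r][c]][img[r2][c2]] += 1
                total += 1
    glcm = [[v / total for v in row] for row in glcm]
    # Contrast grows with local intensity variation; energy is high
    # for uniform textures.
    contrast = sum((i - j) ** 2 * glcm[i][j]
                   for i in range(levels) for j in range(levels))
    energy = sum(v * v for row in glcm for v in row)
    return contrast, energy

flat = [[0] * 4 for _ in range(4)]
checker = [[(r + c) % 2 for c in range(4)] for r in range(4)]
flat_contrast, flat_energy = glcm_features(flat, 2)
checker_contrast, checker_energy = glcm_features(checker, 2)
```

Dense crowds produce fine-grained, high-contrast texture while empty scenes are smoother, which is why such statistics can separate the five density classes.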
Abstract:
The Weyl-Wigner correspondence prescription, which makes great use of Fourier duality, is reexamined from the point of view of Kac algebras, the most general background for noncommutative Fourier analysis allowing for that property. It is shown how the standard Kac structure has to be extended in order to accommodate the physical requirements. Both an Abelian and a symmetric projective Kac algebra are shown to provide, in close parallel to the standard case, a new dual framework and a well-defined notion of projective Fourier duality for the group of translations on the plane. The Weyl formula arises naturally as an irreducible component of the duality mapping between these projective algebras.
Abstract:
Every seismic event produces seismic waves that travel through the Earth. Seismology is the science of interpreting measurements of these waves to derive information about the structure of the Earth. Seismic tomography is the most powerful tool for determining the 3D structure of the deep Earth's interior. Tomographic models obtained at global and regional scales are an underlying tool for determining the geodynamical state of the Earth, showing clear correlations with other geophysical and geological characteristics. Global tomographic images of the Earth can be written as linear combinations of basis functions from a specifically chosen set, which defines the model parameterization. A number of different parameterizations appear in the literature: seismic velocities in the Earth have been expressed, for example, as combinations of spherical harmonics or by means of the simpler characteristic functions of discrete cells. In this work we focus on this aspect, evaluating a new type of parameterization based on wavelet functions. It is known from classical Fourier theory that a signal can be expressed as the sum of a, possibly infinite, series of sines and cosines, often referred to as a Fourier expansion. The big disadvantage of a Fourier expansion is that it has only frequency resolution and no time resolution. Wavelet analysis (the wavelet transform) is probably the most recent solution for overcoming this shortcoming of Fourier analysis. The fundamental idea behind this analysis is to study a signal according to scale. Wavelets are mathematical functions that cut data up into different frequency components and then study each component with a resolution matched to its scale, so they are especially useful in the analysis of non-stationary processes containing multi-scale features, discontinuities, and sharp spikes.
Wavelets are used in two main ways in studies of geophysical processes or signals: 1) as a basis for the representation or characterization of a process; 2) as an integration kernel for analysis, to extract information about the process. Both types of application are the object of this work. We first use wavelets as a basis to represent and solve the tomographic inverse problem. After a brief introduction to seismic tomography theory, we assess the power of wavelet analysis in representing two different types of synthetic models; we then apply it to real data, obtaining surface-wave phase-velocity maps and evaluating its abilities by comparison with another type of parameterization (i.e., block parameterization). For the second type of wavelet application, we analyse the ability of the continuous wavelet transform in spectral analysis, starting again with synthetic tests to evaluate its sensitivity and capability, and then applying the same analysis to real data to obtain local correlation maps between different models at the same depth or between different profiles of the same model.
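As a toy illustration of the first use, wavelets as a representation basis, here is a Haar decomposition (far simpler than the wavelet parameterizations evaluated in the thesis, but it shows how coarse structure and localized detail are separated per scale, something a flat block parameterization cannot do):

```python
def haar_decompose(x):
    """Multi-scale Haar decomposition of a length-2**k signal.

    Returns the overall mean plus one list of detail coefficients
    per scale, from finest to coarsest.
    """
    details = []
    while len(x) > 1:
        avg = [(x[2 * i] + x[2 * i + 1]) / 2 for i in range(len(x) // 2)]
        det = [(x[2 * i] - x[2 * i + 1]) / 2 for i in range(len(x) // 2)]
        details.append(det)  # localized differences at this scale
        x = avg              # recurse on the smoothed signal
    return x[0], details

mean, details = haar_decompose([4.0, 2.0, 6.0, 8.0])
```

A sharp local anomaly shows up in only a few fine-scale detail coefficients, while a smooth regional trend is captured by the coarse levels, which is the multi-resolution property exploited in the tomographic inversion.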
Abstract:
Wavelet analysis offers an alternative to Fourier-based time-series analysis, and is particularly useful when the amplitudes and periods of dominant cycles are time dependent. We analyse climatic records derived from oxygen isotope ratios of marine sediment cores with modified Morlet wavelets. We use a normalization of the Morlet wavelets which allows direct correspondence with Fourier analysis. This provides a direct view of the oscillations at various frequencies, and illustrates the nature of the time dependence of the dominant cycles.
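A bare-bones Morlet continuous wavelet transform can be sketched as follows; the 1/sqrt(s) normalization is a common choice that preserves energy across scales (this is a generic sketch, not the authors' modified wavelets):

```python
import cmath
import math

def morlet_cwt(x, dt, scales, w0=6.0):
    """Continuous wavelet transform with a Morlet mother wavelet.

    With the 1/sqrt(s) normalization, wavelet power at scale s can be
    read like a time-localized Fourier spectrum: for w0 = 6 the
    equivalent Fourier period is roughly 1.03 * s.
    """
    n = len(x)
    out = []
    for s in scales:
        row = []
        for t in range(n):
            acc = 0j
            for m in range(n):
                u = (m - t) * dt / s
                # Complex exponential carrier under a Gaussian envelope.
                acc += x[m] * cmath.exp(-1j * w0 * u) * math.exp(-u * u / 2.0)
            row.append(acc * dt / math.sqrt(s))
        out.append(row)
    return out

# A sine of period 8 samples should light up the scale closest to 8.
x = [math.sin(2 * math.pi * t / 8.0) for t in range(64)]
W = morlet_cwt(x, 1.0, [2.0, 8.0, 32.0])
power = [abs(row[32]) ** 2 for row in W]
```

Plotting such power across time and scale is what exposes cycles whose amplitude or period drifts, e.g. the glacial cycles in oxygen isotope records.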
Abstract:
Date-32 is a fast and easily used computer program developed to date Quaternary deep-sea cores by associating variations in the Earth's orbit with recurring oscillations in core properties, such as carbonate content or isotope composition. Starting with known top and bottom dates, distortions in the periodicities of the core properties due to varying sedimentation rates are realigned by fast Fourier analysis so as to maximise the spectral energy density at the orbital frequencies. This allows age interpolation to all parts of the core to an accuracy of 10 kyr, or about 1.5% of the record duration for a typical Brunhes sequence. The influence of astronomical forcing is examined, and the method is applied to provide preliminary dates for a high-resolution Brunhes record from DSDP Site 594 off southeastern New Zealand.
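The heart of the tuning step, scoring a candidate depth-to-age mapping by its spectral energy at an orbital frequency, can be caricatured in a few lines (illustrative Python only, not Date-32 itself; the 41 kyr obliquity cycle and the candidate scaling factors are made up for the demo):

```python
import math

def power_at(times, values, freq):
    """Spectral energy of a (possibly unevenly sampled) record at a
    single frequency, via a direct discrete Fourier sum."""
    re = sum(v * math.cos(2 * math.pi * freq * t) for t, v in zip(times, values))
    im = sum(v * math.sin(2 * math.pi * freq * t) for t, v in zip(times, values))
    return (re * re + im * im) / len(values)

# Toy record whose true age is 2x its depth: an obliquity-like 41 kyr
# cycle is recovered only under the correct depth-to-age scaling.
depth = [0.5 * i for i in range(400)]
record = [math.sin(2 * math.pi * (2.0 * d) / 41.0) for d in depth]
best = max([1.0, 1.5, 2.0, 2.5],
           key=lambda f: power_at([f * d for d in depth], record, 1.0 / 41.0))
```

In the real program the mapping varies along the core to follow changing sedimentation rates, but the objective, concentrating spectral energy at the orbital frequencies, is the same.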
Abstract:
Sediment samples from both Site 165-999/165-1000 (Atlantic) and Site 202-1241 (Pacific) were chosen at 1 Ma intervals over the period 0.3-9.3 Ma. Samples were washed and sieved at <150 µm. Splits of the sediment fraction were picked completely to obtain, where possible, at least 30 specimens each of the planktic foraminifer species Globigerinoides sacculifer and Globorotalia tumida, on which outline analysis (Fourier) was performed. Sea surface and thermocline temperatures were reconstructed from palaeoenvironmental proxies (UK37' and TEX86H, respectively).
Abstract:
Bibliography: p. 23.
Abstract:
This thesis was concerned with investigating methods of improving the IOP pulse's potential as a measure of clinical utility. There were three principal sections to the work. 1. Optimisation of measurement and analysis of the IOP pulse. A literature review, covering the years 1960-2002 and other relevant scientific publications, provided a knowledge base on the IOP pulse. Initial studies investigated suitable instrumentation and measurement techniques. Fourier transformation was identified as a promising method of analysing the IOP pulse, and this technique was developed. 2. Investigation of ocular and systemic variables that affect IOP pulse measurements. In order to recognise clinically important changes in IOP pulse measurement, studies were performed to identify influencing factors. Fourier analysis was tested against traditional parameters to assess its ability to detect differences in the IOP pulse. In addition, it had been speculated that the waveform components of the IOP pulse contain vascular characteristics analogous to those found in arterial pulse waves; validation studies to test this hypothesis were attempted. 3. The nature of the intraocular pressure pulse in health and disease and its relation to systemic cardiovascular variables. Fourier analysis and traditional parameters were applied to IOP pulse measurements taken on diseased and healthy eyes. Only the derived parameter, pulsatile ocular blood flow (POBF), detected differences in the diseased groups. The use of an ocular pressure-volume relationship may have improved the POBF measure's variance in comparison to measurement of the pulse's amplitude or Fourier components. Finally, the importance of the driving force of pulsatile blood flow, the arterial pressure pulse, is highlighted.
A method of combining the measurements of pulsatile blood flow and pulsatile blood pressure to create a measure of ocular vascular impedance is described, along with its advantages for future studies.
Abstract:
* Supported by INTAS 2000-626, INTAS YSF 03-55-1969, INTAS INNO 182, and TIC 2003-09319-c03-03.
Abstract:
The support vector machine classifier is used in several problems across many areas of knowledge. Essentially, the method seeks the hyperplane that maximizes the margin between the groups, in order to increase the classifier's generalization. In this work, we treated some binary classification problems for data obtained by electroencephalography (EEG) and electromyography (EMG) using support vector machines together with complementary techniques, such as: principal component analysis to identify the active regions of the brain, the periodogram method, obtained by Fourier analysis, to help discriminate between groups, and a simple moving average to eliminate some of the noise present in the data. Two functions were developed in the R software for the training and classification tasks. In addition, two weighting schemes and a summary measure were proposed to support classification decisions. The application of these techniques, weights, and the summary measure in the classifier showed quite satisfactory results: the best results were an average rate of 95.31% for visual-stimulus data, 100% correct classification for epilepsy data, and rates of 91.22% and 96.89% for object-motion data for two subjects.
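The periodogram feature-extraction step can be sketched as follows (a generic Python version via the discrete Fourier transform; the thesis implemented its pipeline in R):

```python
import cmath
import math

def periodogram(x):
    """Periodogram power estimate of a signal epoch via the DFT.

    The mean is removed first so that power reflects oscillations
    only; the returned list covers frequency bins 0..n//2.
    """
    n = len(x)
    mean = sum(x) / n
    x = [v - mean for v in x]
    power = []
    for k in range(n // 2 + 1):
        X = sum(x[m] * cmath.exp(-2j * math.pi * k * m / n) for m in range(n))
        power.append(abs(X) ** 2 / n)
    return power

# A pure tone at 8 cycles per window concentrates all power in bin 8.
n = 64
tone = [math.sin(2 * math.pi * 8 * m / n) for m in range(n)]
p = periodogram(tone)
```

For EEG/EMG epochs, band-wise summaries of such power spectra are the kind of frequency-domain features that can then be handed to an SVM to discriminate between groups.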