955 results for Hilbert transform


Relevance: 20.00%

Abstract:

The autonomic nervous system plays an important role in physiological and pathological conditions, and has been extensively evaluated by parametric and non-parametric spectral analysis. To compare the results obtained with the fast Fourier transform (FFT) and the autoregressive (AR) method, we performed a comprehensive comparative study using data from humans and rats during pharmacological blockade (in rats), a postural test (in humans), and in the hypertensive state (in both humans and rats). Although postural hypotension in humans induced an increase in the normalized low-frequency component (LFnu) of systolic blood pressure, the increase in the LF/HF ratio was detected only by AR. In rats, AR and FFT analysis did not agree for LFnu and the normalized high-frequency component (HFnu) under basal conditions and after vagal blockade. The increase in the LF/HF ratio of the pulse interval, induced by methylatropine, was detected only by FFT. In hypertensive patients, changes in LF and HF for systolic blood pressure were observed only by AR; FFT was able to detect the reduction in both blood pressure variance and total power. In hypertensive rats, AR presented different values of variance and total power for systolic blood pressure. Moreover, AR and FFT presented discordant results for LF, LFnu, HF, the LF/HF ratio, and total power for the pulse interval. We provide evidence for disagreement in 23% of the indices of blood pressure and heart rate variability in humans and 67% discordance in rats when these variables are evaluated by AR and FFT under physiological and pathological conditions. The overall disagreement between AR and FFT in this study was 43%.
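The LF/HF index compared above can be illustrated with a minimal periodogram sketch. This is only a direct-DFT stand-in using the standard HRV frequency bands; the study's actual FFT and AR estimators involve windowing and model fitting not shown here:

```python
import cmath

def band_power(x, fs, lo, hi):
    # Periodogram power summed over [lo, hi) Hz, computed by a direct DFT
    # at each bin (a stand-in for FFT- or AR-based spectral estimation).
    n = len(x)
    mean = sum(x) / n
    total = 0.0
    for k in range(1, n // 2 + 1):
        f = k * fs / n
        if lo <= f < hi:
            X = sum((x[m] - mean) * cmath.exp(-2j * cmath.pi * k * m / n)
                    for m in range(n))
            total += abs(X) ** 2
    return total

def lf_hf_ratio(x, fs):
    # Standard HRV bands: LF = 0.04-0.15 Hz, HF = 0.15-0.40 Hz.
    return band_power(x, fs, 0.04, 0.15) / band_power(x, fs, 0.15, 0.40)
```

For a test signal with a 0.10 Hz component of amplitude 1 and a 0.30 Hz component of amplitude 0.5, the ratio comes out near 4, reflecting the squared-amplitude ratio of the two bands.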

Relevance: 20.00%

Abstract:

The aim of the present study was to develop a classifier able to discriminate between healthy controls and dyspeptic patients by analysis of their electrogastrograms. Fifty-six electrogastrograms were analyzed, corresponding to 42 dyspeptic patients and 14 healthy controls. The original signals were subsampled, filtered and divided into the pre-, post-, and prandial stages. A time-frequency transformation based on wavelets was used to extract the signal characteristics, and a special selection procedure based on correlation was used to reduce their number. The analysis was carried out by evaluating different neural network structures to classify the wavelet coefficients into two groups (healthy subjects and dyspeptic patients). The optimization process of the classifier led to a linear model. A dimension reduction that resulted in only 25% of uncorrelated electrogastrogram characteristics gave 24 inputs for the classifier. The prandial stage gave the most significant results. Under these conditions, the classifier achieved 78.6% sensitivity, 92.9% specificity, and an error of 17.9 ± 6% (with a 95% confidence level). These data show that it is possible to establish significant differences between patients and normal controls when time-frequency characteristics are extracted from an electrogastrogram, with an adequate component reduction, outperforming the results obtained with classical Fourier analysis. These findings can contribute to increasing our understanding of the pathophysiological mechanisms involved in functional dyspepsia and perhaps to improving the pharmacological treatment of functional dyspeptic patients.
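The correlation-based selection step described above (keeping only uncorrelated wavelet characteristics) can be sketched as a greedy filter. This is an illustrative stand-in, not the authors' exact procedure; the threshold value is an assumption:

```python
def pearson(a, b):
    # Sample Pearson correlation between two equal-length sequences.
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (sa * sb)

def select_uncorrelated(features, threshold=0.95):
    # Greedy reduction: keep a feature (column) only if its absolute
    # correlation with every already-kept feature stays below the threshold.
    kept = []
    for j, col in enumerate(features):
        if all(abs(pearson(col, features[k])) < threshold for k in kept):
            kept.append(j)
    return kept
```

A feature that is a scaled copy of an earlier one (correlation 1) is dropped, while weakly correlated features survive, which is the kind of dimension reduction that left 24 inputs for the classifier.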

Relevance: 20.00%

Abstract:

Thesis (Master's in Electrical Engineering, Power orientation), UANL, 2011.

Relevance: 20.00%

Abstract:

In France, the social, cultural, and political changes at the turn of the eighteenth and nineteenth centuries imposed on nascent Romanticism a source of inspiration other than the Antiquity that had served classicism: the Middle Ages. Victor Hugo and Honoré de Balzac were among the Romantic authors who adapted the imaginative resources of medieval works, including the figure of the knight. Why did the Romantics perceive in this figure a source of meaning? What adjustments were needed for a figure so closely tied to the Middle Ages to be brought up to date within the Romantic aesthetic? This study sets out to answer these questions by examining the figure of the knight in the medieval works Le chevalier de la charrette (Chrétien de Troyes) and Le Lancelot en prose (author unknown), compared with the Romantic knight presented in La légende du beau Pécopin et de la belle Bauldour (Victor Hugo) and Le frère d'armes (Honoré de Balzac). This comparison will show that the figure is represented in these works in a transformed and updated guise.

Relevance: 20.00%

Abstract:

The aim of this text is to discuss the epistemic scope of the axiomatic method. We first consider the context from which the axiomatic method emerged, followed by a discussion of the motivations and goals of Hilbert's program. We then present the axiomatic method in a more modern framework in order to highlight its usefulness and theoretical reach. Finally, we explore the influence of the axiomatic method in physics, particularly Hilbert's own application of the method. We discuss his goals and the epistemology that accompanied his view of the 6th problem, which leads us to discuss the epistemic limits of the axiomatic method and of the scientific enterprise in general.

Relevance: 20.00%

Abstract:

The foundational crisis did not affect the arithmetical foundations of Kronecker's constructivism. Rather, it was the Kroneckerian finitism of the theory of general, or polynomial, arithmetic that enabled Hilbert to overcome the crisis of set-theoretic foundations, and that led Gödel, inspired by Hilbert, to propose an extension of the finitist standpoint in order to obtain a constructive proof of the consistency of arithmetic in his "Dialectica" functional interpretation.

Relevance: 20.00%

Abstract:

A method for computer-aided diagnosis of microcalcification clusters in mammogram images is presented. Microcalcification clusters, which are an early sign of breast cancer, appear as isolated bright spots in mammograms; they therefore correspond to local maxima of the image. The local maxima of the image are first detected, and they are then ranked according to a higher-order statistical test performed over the subband domain data.
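The detection step can be sketched as a strict 8-neighbour local-maxima search. This is illustrative only: the abstract does not specify the higher-order statistical test over subband data, so the placeholder below simply ranks candidates by intensity:

```python
def local_maxima(img):
    # A pixel is a local maximum if it is strictly greater than all
    # 8 neighbours (border pixels are skipped for simplicity).
    found = []
    for i in range(1, len(img) - 1):
        for j in range(1, len(img[0]) - 1):
            v = img[i][j]
            neigh = [img[i + di][j + dj]
                     for di in (-1, 0, 1) for dj in (-1, 0, 1)
                     if (di, dj) != (0, 0)]
            if all(v > n for n in neigh):
                found.append((i, j, v))
    # Placeholder ranking by intensity; the paper instead ranks candidates
    # with a higher-order statistical test in the subband domain.
    return sorted(found, key=lambda m: -m[2])
```

On a toy image with two bright spots, the brighter spot is returned first.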

Relevance: 20.00%

Abstract:

Fourier transform methods are employed heavily in digital signal processing. The Discrete Fourier Transform (DFT) is among the most commonly used digital signal transforms. The exponential kernel of the DFT has the properties of symmetry and periodicity. Fast Fourier Transform (FFT) methods for fast DFT computation exploit these kernel properties in different ways. In this thesis, an approach of grouping data on the basis of the corresponding phase of the exponential kernel of the DFT is exploited to introduce a new digital signal transform, named the M-dimensional Real Transform (MRT), for 1-D and 2-D signals. The new transform is developed using number-theoretic principles as regards its specific features. A few properties of the transform are explored, and an inverse transform is presented. A fundamental assumption is that the size of the input signal be even. The transform computation involves only real additions. The MRT is an integer-to-integer transform. There are two kinds of redundancy, complete redundancy and derived redundancy, in the MRT. Redundancy is analyzed and removed to arrive at a more compact version called the Unique MRT (UMRT). The 1-D UMRT is a non-expansive transform for all signal sizes, while the 2-D UMRT is non-expansive for signal sizes that are powers of 2. The 2-D UMRT is applied in image processing applications like image compression and orientation analysis. The MRT and UMRT, being general transforms, will find potential applications in various fields of signal and image processing.
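The kernel properties the thesis builds on can be checked numerically. The sketch below only verifies the periodicity and half-period symmetry of the DFT kernel that FFT-style methods exploit; the MRT's phase-grouping construction itself is not reproduced here:

```python
import cmath

def W(N, m):
    # Exponential kernel of the DFT: W_N^m = exp(-j * 2*pi*m / N).
    # Periodicity: W(N, m + N) == W(N, m).
    # Half-period symmetry: W(N, m + N//2) == -W(N, m), the identity
    # behind the butterfly step of radix-2 FFTs.
    return cmath.exp(-2j * cmath.pi * m / N)
```

For N = 8, W(8, 3) equals W(8, 11) (periodicity) and W(8, 7) equals -W(8, 3) (symmetry), up to floating-point rounding.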

Relevance: 20.00%

Abstract:

This paper presents the application of wavelet processing to handwritten character recognition. To attain a high recognition rate, robust feature extractors and powerful classifiers that are invariant to the variability of human writing are needed. The proposed scheme consists of two stages: a feature extraction stage based on the Haar wavelet transform, and a classification stage that uses a support vector machine classifier. Experimental results show that the proposed method is effective.
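The Haar transform underlying the feature-extraction stage can be sketched in one dimension. This is a minimal single-level step under the averaging convention; the 2-D multilevel version used for images iterates it over rows and columns:

```python
def haar_step(x):
    # One level of the 1-D Haar transform: pairwise averages
    # (approximation coefficients) followed by pairwise differences
    # (detail coefficients). Assumes an even-length input.
    avg = [(x[i] + x[i + 1]) / 2 for i in range(0, len(x), 2)]
    det = [(x[i] - x[i + 1]) / 2 for i in range(0, len(x), 2)]
    return avg + det
```

For example, `haar_step([4, 2, 5, 5])` yields averages `[3, 5]` and details `[1, 0]`; the detail coefficients are what capture stroke edges in character images.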

Relevance: 20.00%

Abstract:

Partial moments are extensively used in actuarial science for the analysis of risks. Since the first-order partial moment provides the expected loss in a stop-loss treaty with infinite cover as a function of the priority, it is referred to as the stop-loss transform. In the present work, we discuss distributional and geometric properties of the first- and second-order partial moments defined in terms of the quantile function. Relationships of the scaled stop-loss transform curve with the Lorenz, Gini, Bonferroni and Leimkuhler curves are developed.
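In quantile terms, the first-order partial moment (stop-loss transform) can be written as follows; these are standard definitions, stated here for concreteness:

```latex
\Pi_X(t) \;=\; \mathbb{E}\bigl[(X-t)_+\bigr]
        \;=\; \int_t^{\infty} \bigl(1 - F(x)\bigr)\,dx
        \;=\; \int_{F(t)}^{1} \bigl(Q(u) - t\bigr)\,du ,
```

where $F$ is the distribution function and $Q = F^{-1}$ the quantile function; the second-order partial moment $\mathbb{E}[(X-t)_+^2]$ admits an analogous quantile-function form.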

Relevance: 20.00%

Abstract:

This paper compares the most common digital signal processing methods for exon prediction in eukaryotes, and also proposes a technique for noise suppression in exon prediction. The specimen used here, which has relevance in medical research, has been taken from the public genomic database GenBank. Exon prediction has been carried out using the digital signal processing methods viz. the binary method, the EIIP (electron-ion interaction pseudopotential) method, and filter methods. Under the filter method, two filter designs, and two approaches using these two designs, have been tried. The discrete wavelet transform has been used for de-noising of the exon plots. Results of exon prediction based on the methods mentioned above, which give values closest to the ones found in the NCBI database, are given here. The exon plot de-noised using the discrete wavelet transform is also given. The authors' alterations to these proven methods improve the performance of the exon prediction algorithms. It has also been shown that the discrete wavelet transform is an effective de-noising tool that can be used with exon prediction algorithms.
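The binary method mentioned above can be sketched as follows: each base yields a 0/1 indicator sequence, and the summed DFT power at the period-3 frequency (k = N/3) serves as the coding measure, since this component is elevated in protein-coding regions. This is a minimal sketch; the sliding window and the EIIP weighting variant are omitted:

```python
import cmath

def period3_power(seq):
    # Sum, over the four bases, of the DFT power of each 0/1 indicator
    # sequence at the period-3 frequency bin k = N/3.
    N = len(seq)
    k = N // 3
    total = 0.0
    for base in "ACGT":
        X = sum(cmath.exp(-2j * cmath.pi * k * n / N)
                for n, b in enumerate(seq) if b == base)
        total += abs(X) ** 2
    return total
```

A perfectly 3-periodic sequence such as "ACG" repeated scores high, while a homogeneous sequence scores essentially zero, which is the contrast exon-prediction plots exploit.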

Relevance: 20.00%

Abstract:

This paper presents a computation of the $V_\gamma$ dimension for regression in bounded subspaces of Reproducing Kernel Hilbert Spaces (RKHS) for the Support Vector Machine (SVM) regression $\epsilon$-insensitive loss function and general $L_p$ loss functions. Finiteness of the $V_\gamma$ dimension is shown, which also proves uniform convergence in probability for regression machines in RKHS subspaces that use the $L_\epsilon$ or general $L_p$ loss functions. The paper also presents a novel proof of this result for the case in which a bias is added to the functions in the RKHS.
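For reference, the two loss families named above can be written out directly; these are the standard definitions and are not tied to the paper's proofs:

```python
def eps_insensitive(y, f, eps=0.1):
    # Vapnik's epsilon-insensitive loss: zero inside the tube |y - f| <= eps,
    # linear outside it.
    return max(0.0, abs(y - f) - eps)

def lp_loss(y, f, p=2):
    # General L_p loss: |y - f| ** p (p = 2 gives the squared loss).
    return abs(y - f) ** p
```

Predictions within the epsilon-tube incur no penalty at all, which is what makes SVM regression solutions sparse in the training points.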

Relevance: 20.00%

Abstract:

Compositional data analysis motivated the introduction of a complete Euclidean structure in the simplex of D parts. This was based on the early work of J. Aitchison (1986) and completed recently when the Aitchison distance in the simplex was associated with an inner product and orthonormal bases were identified (Aitchison and others, 2002; Egozcue and others, 2003). A partition of the support of a random variable generates a composition by assigning the probability of each interval to a part of the composition. One can imagine that the partition can be refined and the probability density would represent a kind of continuous composition of probabilities in a simplex of infinitely many parts. This intuitive idea would lead to a Hilbert space of probability densities by generalizing the Aitchison geometry for compositions in the simplex into the set of probability densities.
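The Aitchison geometry referred to above can be made concrete with the centred log-ratio (clr) transform, under which compositions inherit their Euclidean structure. A minimal sketch, for strictly positive parts:

```python
import math

def clr(x):
    # Centred log-ratio transform: log of each part divided by the
    # geometric mean of the composition. clr coordinates sum to zero.
    g = math.exp(sum(math.log(v) for v in x) / len(x))
    return [math.log(v / g) for v in x]

def aitchison_distance(x, y):
    # Aitchison distance = Euclidean distance between clr coordinates.
    return math.dist(clr(x), clr(y))
```

Because the clr transform is invariant under rescaling (closure), `[1, 2, 3]` and `[2, 4, 6]` represent the same composition and have Aitchison distance zero.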

Relevance: 20.00%

Abstract:

The Aitchison vector space structure for the simplex is generalized to a Hilbert space structure A2(P) for distributions and likelihoods on arbitrary spaces. Central notions of statistics, such as information or likelihood, can be identified in the algebraic structure of A2(P), along with their corresponding notions in compositional data analysis, such as the Aitchison distance or the centered log-ratio transform. In this way very elaborate aspects of mathematical statistics can be understood easily in the light of a simple vector space structure and of compositional data analysis. For example, combinations of statistical information such as Bayesian updating, and combinations of likelihood and robust M-estimation functions, are simple additions/perturbations in A2(Pprior). Weighting observations corresponds to a weighted addition of the corresponding evidence. Likelihood-based statistics for general exponential families turn out to have a particularly easy interpretation in terms of A2(P). Regular exponential families form finite-dimensional linear subspaces of A2(P), and they correspond to finite-dimensional subspaces formed by their posteriors in the dual information space A2(Pprior). The Aitchison norm can be identified with mean Fisher information. The closing constant itself is identified with a generalization of the cumulant function and shown to be the Kullback-Leibler directed information. Fisher information is the local geometry of the manifold induced by the A2(P) derivative of the Kullback-Leibler information, and the space A2(P) can therefore be seen as the tangential geometry of statistical inference at the distribution P. The discussion of A2(P)-valued random variables, such as estimation functions or likelihoods, gives a further interpretation of Fisher information as the expected squared norm of evidence, and a scale-free understanding of unbiased reasoning.
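The claim that Bayesian updating is a simple addition can be illustrated on discrete densities, where the vector-space "addition" (perturbation) is the normalized element-wise product. A minimal sketch, not the paper's general-measure construction:

```python
def perturb(p, q):
    # Perturbation of two discrete densities: normalized element-wise
    # product. With p the prior and q the likelihood (up to a constant),
    # the result is exactly the Bayesian posterior.
    r = [pi * qi for pi, qi in zip(p, q)]
    s = sum(r)
    return [v / s for v in r]
```

Starting from a uniform prior `[0.5, 0.5]` and likelihood weights `[0.8, 0.2]`, the posterior is `[0.8, 0.2]`: updating a flat prior just reproduces the (normalized) likelihood.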

Relevance: 20.00%

Abstract:

Lean is common sense and good business sense. As organizations grow and become more successful, they begin to lose insight into the basic truths of what made them successful. Organizations have to deal with more and more issues that may not have anything to do with directly providing products or services to their customers. Lean is a holistic management approach that brings the focus of the organization back to providing value to the customer. In August 2002, Mrs. Darleen Druyun, the Principal Deputy to the Assistant Secretary of the Air Force for Acquisition and government co-chairperson of the Lean Aerospace Initiative (LAI), decided it was time for Air Force acquisitions to embrace the concepts of lean. At her request, the LAI Executive Board developed a concept and methodology to introduce lean into the Air Force's acquisition culture and processes. This was the birth of the "Lean Now" initiative. An enterprise-wide approach was used, involving Air Force System Program Offices (SPOs), the aerospace industry, and several Department of Defense agencies. The aim of Lean Now was to focus on the process interfaces between these "enterprise" stakeholders to eliminate barriers that impede progress. Any best practices developed would be institutionalized throughout the Air Force and the Department of Defense (DoD). The industry members of LAI agreed to help accelerate the government-industry transformation by donating lean Subject Matter Experts (SMEs) to mentor, train, and facilitate the lean events of each enterprise. Currently, the industry SMEs and the Massachusetts Institute of Technology are working together to help the Air Force develop its own lean infrastructure of training courses and Air Force lean SMEs. The first Lean Now programs were the F/A-22, Global Hawk, and F-16. Each program focused on specific acquisition processes.
The F/A-22 focused on the Test and Evaluation process; the Global Hawk focused on Evolutionary Acquisitions; and the F-16 focused on improving the Contract Closeout process. Through lean, each enterprise made many significant improvements. The F/A-22 was able to reduce its Operational Flight Plan (OFP) Preparation and Load process time of 2 to 3 months down to 7 hours. The Global Hawk developed a new production plan that increases the annual production of its Integrated Sensor Suite from 3 per year to 6 per year. The F-16 enterprise generated and is working 12 initiatives that could result in a contract closeout cycle time reduction of 3 to 7 years. Each enterprise continues to generate more lean initiatives that focus on other areas and processes within their respective enterprises.