956 results for "signals analysis"


Relevance: 40.00%

Abstract:

Both deepening sleep and evolving epileptic seizures are associated with increasing slow-wave activity. Larger-scale functional networks derived from electroencephalogram indicate that in both transitions dramatic changes of communication between brain areas occur. During seizures these changes seem to be 'condensed', because they evolve more rapidly than during deepening sleep. Here we set out to assess quantitatively functional network dynamics derived from electroencephalogram signals during seizures and normal sleep. Functional networks were derived from electroencephalogram signals from wakefulness, light and deep sleep of 12 volunteers, and from pre-seizure, seizure and post-seizure time periods of 10 patients suffering from focal onset pharmaco-resistant epilepsy. Nodes of the functional network represented electrical signals recorded by single electrodes and were linked if there was non-random cross-correlation between the two corresponding electroencephalogram signals. Network dynamics were then characterized by the evolution of global efficiency, which measures ease of information transmission. Global efficiency was compared with relative delta power. Global efficiency significantly decreased both between light and deep sleep, and between pre-seizure, seizure and post-seizure time periods. The decrease of global efficiency was due to a loss of functional links. While global efficiency decreased significantly, relative delta power increased except between the time periods wakefulness and light sleep, and pre-seizure and seizure. Our results demonstrate that both epileptic seizures and deepening sleep are characterized by dramatic fragmentation of larger-scale functional networks, and further support the similarities between sleep and seizures.
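The network construction and global-efficiency measure described above can be sketched as follows. This is an illustrative simplification: a fixed zero-lag correlation threshold stands in for the study's test of non-random cross-correlation, and the channel count and data are synthetic.

```python
import numpy as np
from collections import deque

def global_efficiency(adj):
    """Mean inverse shortest-path length over all ordered node pairs
    of an unweighted graph given by adjacency matrix `adj`."""
    n = len(adj)
    total = 0.0
    for s in range(n):
        dist = [-1] * n          # BFS distances from node s
        dist[s] = 0
        q = deque([s])
        while q:
            u = q.popleft()
            for v in range(n):
                if adj[u][v] and dist[v] < 0:
                    dist[v] = dist[u] + 1
                    q.append(v)
        total += sum(1.0 / d for d in dist if d > 0)
    return total / (n * (n - 1))

def functional_network(eeg, threshold=0.15):
    """Link two channels when |zero-lag correlation| exceeds a threshold
    (a simplification of the surrogate-based link test in the study)."""
    c = np.corrcoef(eeg)
    adj = (np.abs(c) > threshold).astype(int)
    np.fill_diagonal(adj, 0)
    return adj

# toy "EEG": 8 channels sharing a weak common source
rng = np.random.default_rng(0)
common = rng.standard_normal(1000)
eeg = 0.5 * common + rng.standard_normal((8, 1000))
print(global_efficiency(functional_network(eeg)))
```

A drop in this value as functional links are lost corresponds to the network fragmentation the study reports for deepening sleep and evolving seizures.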

Relevance: 40.00%

Abstract:

We consider the problem of developing efficient sampling schemes for multiband sparse signals. Previous multicoset sampling implementations that lead to universal sampling patterns (which guarantee perfect reconstruction) are based on a set of appropriately interleaved analog-to-digital converters, all of them operating at the same sampling frequency. In this paper we propose an alternative multirate synchronous implementation of multicoset codes; that is, all the analog-to-digital converters in the sampling scheme operate at different sampling frequencies, without the need to introduce any delay. The interleaving is achieved through the use of different rates, whose sum is significantly lower than the Nyquist rate of the multiband signal. To obtain universal patterns, the sampling matrix is formulated and analyzed. Appropriate choices of the parameters, that is, the block length and the sampling rates, are also proposed.
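A toy numeric sketch of the multirate idea (the parameter values are arbitrary illustrations, not the paper's constructions): each branch keeps every m-th point of the Nyquist grid at its own rate, with no added delay, and the union of branches forms the interleaved pattern.

```python
def multirate_pattern(block_len, decimations):
    """Nyquist-grid indices in [0, block_len) captured when branch k samples
    every decimations[k]-th point (synchronous branches, no added delays)."""
    captured = set()
    for m in decimations:
        captured.update(range(0, block_len, m))
    return sorted(captured)

L = 12          # block length (illustrative)
ms = [3, 4]     # per-branch decimation factors, i.e. rates fs/3 and fs/4
pattern = multirate_pattern(L, ms)
rate_fraction = sum(1.0 / m for m in ms)   # total rate as a fraction of Nyquist
print(pattern)          # coset indices captured within one block
print(rate_fraction)    # 1/3 + 1/4 = 7/12 of the Nyquist rate
```

Whether a given choice of block length and rates yields a universal pattern is exactly what the paper's sampling-matrix analysis determines; this sketch only shows how the interleaving arises from the rates themselves.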

Relevance: 40.00%

Abstract:

The main objective of this paper is to present some tools for analyzing a digital chaotic signal. Some of these we have proposed previously, such as a new type of phase diagram in which binary signals are converted to hexadecimal. The main emphasis of this paper, however, is an analysis of the chaotic signal based on the Lempel–Ziv method. We have previously applied this technique only to very short streams of data; here we extend it to long trains of data (longer than 2000 bits). The method captures the main characteristics of the chaotic signal and yields numerical values that quantify the properties of the chaos.
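The Lempel–Ziv (1976) complexity underlying the analysis counts the distinct phrases in a left-to-right parsing of the sequence. A minimal implementation for binary strings might look like this; the test string is the classic example from Lempel and Ziv's paper, not one of the authors' chaotic streams.

```python
def lempel_ziv_complexity(s):
    """LZ76 complexity: number of phrases in the exhaustive parsing of s,
    where each new phrase is the shortest prefix of the remainder that has
    not yet appeared in the already-scanned text."""
    i, c, n = 0, 0, len(s)
    while i < n:
        l = 1
        # grow the candidate phrase while it is still reproducible
        # from the text seen so far
        while i + l <= n and s[i:i + l] in s[0:i + l - 1]:
            l += 1
        c += 1
        i += l
    return c

print(lempel_ziv_complexity('0001101001000101'))  # classic example: 6 phrases
print(lempel_ziv_complexity('1' * 2000))          # a trivial long train: 2
```

For long trains of n bits, the normalized value c·log2(n)/n is commonly used, so that random sequences approach 1 while highly regular ones approach 0.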

Relevance: 40.00%

Abstract:

Human immunodeficiency virus type 1 (HIV-1) and human T cell leukemia virus type II (HTLV-2) use a similar mechanism for –1 translational frameshifting to overcome the termination codon in viral RNA at the end of the gag gene. Previous studies have identified two important RNA signals for frameshifting, the slippery sequence and a downstream stem–loop structure. However, there have been somewhat conflicting reports concerning the individual contributions of these sequences. In this study we have performed a comprehensive mutational analysis of the cis-acting RNA sequences involved in HIV-1 gag–pol and HTLV-2 gag–pro frameshifting. Using an in vitro translation system we determined frameshifting efficiencies for shuffled HIV-1/HTLV-2 RNA elements in a background of HIV-1 or HTLV-2 sequences. We show that the ability of the slippery sequence and stem–loop to promote ribosomal frameshifting is influenced by the flanking upstream sequence and the nucleotides in the spacer element. A wide range of frameshift efficiency rates was observed for both viruses when shuffling single sequence elements. The results for HIV-1/HTLV-2 chimeric constructs represent strong evidence supporting the notion that the viral wild-type sequences are not designed for maximal frameshifting activity but are optimized to a level suited to efficient viral replication.

Relevance: 40.00%

Abstract:

To improve the accuracy of predicting membrane protein sorting signals, we developed a general methodology for defining trafficking signal consensus sequences in the environment of the living cell. Our approach uses retroviral gene transfer to create combinatorial expression libraries of trafficking signal variants in mammalian cells, flow cytometry to sort cells based on trafficking phenotype, and quantitative trafficking assays to measure the efficacy of individual signals. Using this strategy to analyze arginine- and lysine-based endoplasmic reticulum localization signals, we demonstrate that small changes in the local sequence context dramatically alter signal strength, generating a broad spectrum of trafficking phenotypes. Finally, using sequences from our screen, we found that the potency of di-lysine, but not di-arginine, mediated endoplasmic reticulum localization was correlated with the strength of interaction with α-COP.

Relevance: 40.00%

Abstract:

A novel approach to watermarking of audio signals using Independent Component Analysis (ICA) is proposed. It exploits the statistical independence of components obtained by practical ICA algorithms to provide a robust watermarking scheme with high information rate and low distortion. Numerical simulations have been performed on audio signals, showing good robustness of the watermark against common attacks with unnoticeable distortion, even for high information rates. An important aspect of the method is its domain independence: it can be used to hide information in other types of data, with minor technical adaptations.
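The statistical independence that the scheme exploits can be illustrated with a minimal FastICA sketch. This is generic source separation under an assumed square mixing, not the authors' watermarking algorithm; the square wave stands in for a hidden watermark mixed into an audio-like host signal.

```python
import numpy as np

def fastica(X, n_iter=300, seed=0):
    """Minimal symmetric FastICA with a tanh nonlinearity (square mixing).
    X: array of shape (components, samples)."""
    n, m = X.shape
    X = X - X.mean(axis=1, keepdims=True)
    # whiten via the eigendecomposition of the covariance matrix
    d, E = np.linalg.eigh(X @ X.T / m)
    Xw = E @ np.diag(d ** -0.5) @ E.T @ X
    W = np.random.default_rng(seed).standard_normal((n, n))
    for _ in range(n_iter):
        G = np.tanh(W @ Xw)
        W_new = G @ Xw.T / m - np.diag((1 - G ** 2).mean(axis=1)) @ W
        u, _, vt = np.linalg.svd(W_new)   # symmetric decorrelation
        W = u @ vt
    return W @ Xw                         # estimated independent components

t = np.linspace(0, 8, 2000)
audio = np.sin(2 * np.pi * 1.0 * t)                 # "host" signal
watermark = np.sign(np.sin(2 * np.pi * 0.37 * t))   # hidden component
S = np.vstack([audio, watermark])
X = np.array([[1.0, 0.6], [0.4, 1.0]]) @ S          # observed mixtures
Y = fastica(X)   # recovers the components up to sign/permutation/scale
```

Because the recovered components are statistically independent, information embedded in one component can be re-estimated after mixing; the actual embedding rule and robustness to attacks are beyond this sketch.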

Relevance: 40.00%

Abstract:

The presence of the conceptus in the uterine cavity necessitates an elaborate network of interactions between the implanting embryo and a receptive endometrial tissue. We believe that embryo-derived signals play an important role in the remodeling and extension of the endometrial receptivity period. Our previous studies provided original evidence that human chorionic gonadotropin (hCG) modulates and potentiates endometrial epithelial as well as stromal cell responsiveness to interleukin 1 (IL1), one of the earliest embryonic signals; this may represent a novel pathway by which the embryo favors its own implantation and growth within the maternal endometrial host. The present study was designed to gain a broader understanding of the impact of hCG on the modulation of endometrial cell receptivity, in particular cell responsiveness to IL1 and the acquisition of a growth-promoting phenotype capable of receiving, sustaining, and promoting the early and crucial steps of embryonic development. Our results showed significant changes in the expression of genes involved in cell proliferation, immune modulation, tissue remodeling, and apoptotic and angiogenic processes. This points to a relevant impact of these embryonic signals on the receptivity of the maternal endometrium, its adaptation to the implanting embryo, and the creation of an environment favorable for the implantation and growth of the embryo within a new and likely hostile host tissue. Interestingly, our data further identified a complex interaction between IL1 and hCG which, despite a synergistic action on several significant endometrial target genes, may encompass a tight control of endogenous IL1 and extend to other IL1 family members.

Relevance: 40.00%

Abstract:

Common bottlenose dolphins (Tursiops truncatus) produce a wide variety of vocal emissions for communication and echolocation, of which the pulsed repertoire has been the most difficult to categorize. Packets of high-repetition, broadband pulses are still largely reported under the general designation of burst-pulses, and traditional attempts to classify these emissions rely mainly on their aural characteristics and on graphical aspects of spectrograms. Here, we present a quantitative analysis of pulsed signals emitted by wild bottlenose dolphins in the Sado estuary, Portugal (2011-2014), and test the reliability of a traditional classification approach. Acoustic parameters (minimum frequency, maximum frequency, peak frequency, duration, repetition rate and inter-click interval) were extracted from 930 pulsed signals previously categorized using a traditional approach. Discriminant function analysis revealed a high reliability of the traditional classification approach (93.5% of pulsed signals were consistently assigned to their aurally based categories). According to the discriminant function analysis (Wilks' Λ = 0.11, F(3, 2.41) = 282.75, P < 0.001), repetition rate is the feature that best enables the discrimination of different pulsed signals (structure coefficient = 0.98). Classification using hierarchical cluster analysis led to a similar categorization pattern: two main signal types with distinct magnitudes of repetition rate were clustered into five groups. The pulsed signals described here present significant differences in their time-frequency features, especially repetition rate (P < 0.001), inter-click interval (P < 0.001) and duration (P < 0.001). We document the occurrence of a distinct signal type, short burst-pulses, and highlight the existence of a diverse repertoire of pulsed vocalizations emitted in graded sequences.
The use of quantitative analysis of pulsed signals is essential to improve classifications and to better assess the contexts of emission, geographic variation and the functional significance of pulsed signals.
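As a sketch of how time-domain parameters such as inter-click interval and repetition rate might be measured, here is a simple envelope-threshold click detector applied to a synthetic click train; extraction from real recordings is considerably more involved, and the threshold rule is an illustrative simplification.

```python
import numpy as np

def pulse_train_features(signal, fs, threshold=0.5):
    """Duration, mean inter-click interval (ICI) and repetition rate of a
    pulsed signal; click onsets = upward crossings of a fixed fraction of
    the peak envelope (an illustrative simplification)."""
    env = np.abs(signal)
    above = env > threshold * env.max()
    onsets = np.flatnonzero(above[1:] & ~above[:-1]) + 1
    times = onsets / fs
    ici = np.diff(times)
    duration = times[-1] - times[0] if len(times) > 1 else 0.0
    return {
        "n_clicks": len(onsets),
        "duration_s": duration,
        "mean_ici_s": float(ici.mean()) if len(ici) else 0.0,
        "rep_rate_hz": len(onsets) / duration if duration > 0 else 0.0,
    }

# synthetic burst-pulse: 20 unit impulses at 300 clicks/s, fs = 96 kHz
fs = 96000
sig = np.zeros(int(0.1 * fs))
for k in range(20):
    sig[int((k + 1) * fs / 300)] = 1.0
print(pulse_train_features(sig, fs))
```

Features of this kind, together with the spectral parameters listed above, are what feed the discriminant function and hierarchical cluster analyses.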

Relevance: 40.00%

Abstract:

It is an open-ended challenge to accurately detect epileptic seizures from electroencephalogram (EEG) signals. Recently published studies have made elaborate attempts to distinguish between normal and epileptic EEG signals with advanced nonlinear entropy methods, such as approximate entropy, sample entropy, fuzzy entropy, and permutation entropy. Most recently, a novel distribution entropy (DistEn) has been reported to have superior performance compared with the conventional entropy methods, especially for short data lengths. In the present study we therefore aimed to show the potential of DistEn in the analysis of epileptic EEG signals. The publicly accessible Bonn database, which consists of normal, interictal, and ictal EEG signals, was used in this study. Three measurement protocols were set up to better understand the performance of DistEn: i) calculate the DistEn of a specific EEG signal using the full recording; ii) calculate the DistEn by averaging the results for all possible non-overlapping 5-second segments; and iii) calculate it by averaging the DistEn values for all possible non-overlapping segments of 1-second length. Results for all three protocols indicated a statistically significantly increased DistEn for the ictal class compared with both the normal and interictal classes. In addition, the results obtained under the third protocol, which used only very short segments (1 s) of EEG recordings, showed a significantly (p < 0.05) increased DistEn for the interictal class in comparison with the normal class, whereas both analyses using relatively long EEG signals failed to track this difference between them, possibly due to a nonstationarity effect on the entropy algorithm. The capability of discriminating between normal and interictal EEG signals is of great clinical relevance, since it may provide helpful tools for the detection of seizure onset.
Therefore, our study suggests that the DistEn analysis of EEG signals is very promising for clinical and even portable EEG monitoring.
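A minimal sketch of the DistEn estimator as commonly described: embed the signal, take pairwise Chebyshev distances, and compute the normalized Shannon entropy of their histogram. The embedding dimension and bin count below are typical defaults, not necessarily the study's settings.

```python
import numpy as np

def dist_en(x, m=2, bins=512):
    """Distribution entropy: embed x with dimension m, take the Chebyshev
    distances between all pairs of embedding vectors, estimate their
    empirical distribution with a fixed-bin histogram, and return the
    Shannon entropy of that distribution normalized by log2(bins)."""
    x = np.asarray(x, dtype=float)
    emb = np.lib.stride_tricks.sliding_window_view(x, m)       # (N-m+1, m)
    d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=-1)
    iu = np.triu_indices(len(emb), k=1)                        # pairs i < j
    p, _ = np.histogram(d[iu], bins=bins)
    p = p / p.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum() / np.log2(bins))

rng = np.random.default_rng(0)
print(dist_en(rng.standard_normal(500)))   # a short segment, as in protocol iii
```

The pairwise distance matrix makes memory grow quadratically with segment length, which is one reason the estimator suits the short (1 s) segments used in the third protocol.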

Relevance: 30.00%

Abstract:

Multicarrier code division multiple access (MC-CDMA) is a very promising candidate for the multiple access scheme in fourth generation wireless communication systems. During asynchronous transmission, multiple access interference (MAI) is a major challenge for MC-CDMA systems and significantly affects their performance. The main objectives of this thesis are to analyze the MAI in asynchronous MC-CDMA and to develop robust techniques to reduce its effect. Focus is first on the statistical analysis of MAI in asynchronous MC-CDMA. A new statistical model of MAI is developed in which the derivation of MAI can be applied to different distributions of timing offset, and the MAI power is modelled as a Gamma-distributed random variable. By applying the new statistical model of MAI, a new computer simulation model is proposed. This model represents a multiuser system as a single-user system followed by an additive noise component representing the MAI, which enables the new simulation model to significantly reduce the computational load of computer simulations. MAI reduction using the slow frequency hopping (SFH) technique is the topic of the second part of the thesis. Two subsystems are considered. The first subsystem hops the subcarrier frequencies as a group, and is referred to as GSFH/MC-CDMA. In the second subsystem, the condition of group hopping is dropped, resulting in a more general system, namely individual subcarrier frequency hopping MC-CDMA (ISFH/MC-CDMA). This research found that with the introduction of SFH, both GSFH/MC-CDMA and ISFH/MC-CDMA systems generate less MAI power than the basic MC-CDMA system during asynchronous transmission. Because of this, both SFH systems are shown to outperform MC-CDMA in terms of BER. This improvement, however, comes at the expense of spectral widening.
In the third part of this thesis, base station polarization diversity is introduced to asynchronous MC-CDMA as another MAI reduction technique. The combined system is referred to as Pol/MC-CDMA. In this part a new optimum combining technique, namely maximal signal-to-MAI ratio combining (MSMAIRC), is proposed to combine the signals of the two base station antennas. With the application of MSMAIRC, and in the absence of additive white Gaussian noise (AWGN), the resulting signal-to-MAI ratio (SMAIR) is not only maximized but also independent of the cross polarization discrimination (XPD) and antenna angle. When AWGN is present, the performance of MSMAIRC is still affected by the XPD and antenna angle, but to a much lesser degree than traditional maximal ratio combining (MRC). Furthermore, this research found that the BER performance of Pol/MC-CDMA can be further improved by changing the angle between the two receiving antennas. Hence the optimum antenna angles for both MSMAIRC and MRC are derived and their effects on the BER performance are compared. With the derived optimum antenna angle, the Pol/MC-CDMA system is able to achieve the lowest BER for a given XPD.
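The simplified simulation model from the first part, a single-user link plus an additive term whose power follows a Gamma distribution, can be sketched as follows. BPSK detection is assumed, and the Gamma shape/scale and noise level are arbitrary illustrative values, not the thesis' fitted parameters.

```python
import numpy as np

rng = np.random.default_rng(1)
n_sym = 200_000

bits = rng.integers(0, 2, n_sym)
tx = 2.0 * bits - 1.0                      # BPSK symbols

awgn = rng.normal(0.0, 0.5, n_sym)         # background channel noise

# MAI surrogate: zero-mean Gaussian whose per-symbol power is Gamma distributed
mai_power = rng.gamma(shape=2.0, scale=0.05, size=n_sym)
mai = rng.normal(0.0, np.sqrt(mai_power))

ber_single = np.mean((tx + awgn > 0) != bits)      # single-user reference
ber_mai = np.mean((tx + awgn + mai > 0) != bits)   # with modelled MAI
print(f"BER without MAI: {ber_single:.4f}, with modelled MAI: {ber_mai:.4f}")
```

Drawing the MAI term from its fitted distribution replaces an explicit sum over all interfering users' signals, which is the computational saving the simulation model claims.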

Relevance: 30.00%

Abstract:

The main goal of this research is to design an efficient compression algorithm for fingerprint images. The wavelet transform technique is the principal tool used to reduce interpixel redundancies and to obtain a parsimonious representation of these images. A specific fixed decomposition structure is designed to be used by the wavelet packet in order to save on computation, transmission, and storage costs. This decomposition structure is based on analysis of the information-packing performance of several decompositions, the two-dimensional power spectral density, the effect of each frequency band on the reconstructed image, and human visual sensitivities. This fixed structure is found to provide the "most" suitable representation for fingerprints, according to the chosen criteria. Different compression techniques are used for different subbands, based on their observed statistics. The decision is based on the effect of each subband on the reconstructed image according to the mean square criterion, as well as the sensitivities of human vision. To design an efficient quantization algorithm, a precise model for the distribution of the wavelet coefficients is developed. The model is based on the generalized Gaussian distribution. A least squares algorithm on a nonlinear function of the distribution model's shape parameter is formulated to estimate the model parameters. A noise-shaping bit allocation procedure is then used to assign the bit rate among subbands. To obtain high compression ratios, vector quantization is used. In this work, lattice vector quantization (LVQ) is chosen because of its superior performance over other types of vector quantizers. The structure of a lattice quantizer is determined by its parameters, known as the truncation level and scaling factor. In lattice-based compression algorithms reported in the literature, the lattice structure is commonly predetermined, leading to a nonoptimized quantization approach.
In this research, a new technique for determining the lattice parameters is proposed. In the lattice structure design, no assumption about the lattice parameters is made and no training or multi-quantizing is required. The design is based on minimizing the quantization distortion by adapting to the statistical characteristics of the source in each subimage. Since LVQ is a multidimensional generalization of uniform quantizers, it produces minimum distortion for inputs with uniform distributions. In order to take advantage of the properties of LVQ and its fast implementation, while considering the i.i.d. nonuniform distribution of wavelet coefficients, the piecewise-uniform pyramid LVQ algorithm is proposed. The proposed algorithm quantizes almost all of the source vectors without the need to project these onto the lattice's outermost shell, while properly maintaining a small codebook size. It also resolves the wedge region problem commonly encountered with sharply distributed random sources. These represent some of the drawbacks of the algorithm proposed by Barlaud [26]. The proposed algorithm handles all types of lattices, not only cubic lattices, as opposed to the algorithms developed by Fischer [29] and Jeong [42]. Furthermore, no training or multi-quantizing (to determine lattice parameters) is required, as opposed to Powell's algorithm [78]. For coefficients with high-frequency content, the positive-negative mean algorithm is proposed to improve the resolution of reconstructed images. For coefficients with low-frequency content, a lossless predictive compression scheme is used to preserve the quality of reconstructed images. A method to reduce the bit requirements of the necessary side information is also introduced. Lossless entropy coding techniques are subsequently used to remove coding redundancy. The algorithms result in high-quality reconstructed images with better compression ratios than other available algorithms.
To evaluate the proposed algorithms, objective and subjective performance comparisons with other available techniques are presented. The quality of the reconstructed images is important for reliable identification. Enhancement and feature extraction on the reconstructed images are also investigated in this research. A structure-based feature extraction algorithm is proposed in which the unique properties of fingerprint textures are used to enhance the images and improve the fidelity of their characteristic features. The ridges are extracted from enhanced grey-level foreground areas based on the local ridge dominant directions. The proposed ridge extraction algorithm properly preserves the natural shape of grey-level ridges as well as the precise locations of the features, as opposed to the ridge extraction algorithm in [81]. Furthermore, it is fast and operates only on foreground regions, as opposed to the adaptive floating average thresholding process in [68]. Spurious features are subsequently eliminated using the proposed post-processing scheme.
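The subband decomposition at the heart of such a scheme can be illustrated with a single level of a 2-D Haar wavelet transform. Haar is chosen here only for brevity; the thesis' fixed wavelet-packet structure and filters are more elaborate.

```python
import numpy as np

def haar2d_level(img):
    """One level of a 2-D Haar decomposition into LL, LH, HL, HH subbands
    (image dimensions must be even; the /2 scaling keeps the transform
    orthonormal, so subband energies sum to the image energy)."""
    a = img[0::2, 0::2]; b = img[0::2, 1::2]
    c = img[1::2, 0::2]; d = img[1::2, 1::2]
    return ((a + b + c + d) / 2,   # LL: local averages
            (a + b - c - d) / 2,   # detail: difference between row pairs
            (a - b + c - d) / 2,   # detail: difference between column pairs
            (a - b - c + d) / 2)   # detail: diagonal difference

def haar2d_inverse(ll, lh, hl, hh):
    """Exact inverse of haar2d_level."""
    a = (ll + lh + hl + hh) / 2
    b = (ll + lh - hl - hh) / 2
    c = (ll - lh + hl - hh) / 2
    d = (ll - lh - hl + hh) / 2
    out = np.empty((2 * ll.shape[0], 2 * ll.shape[1]))
    out[0::2, 0::2] = a; out[0::2, 1::2] = b
    out[1::2, 0::2] = c; out[1::2, 1::2] = d
    return out

rng = np.random.default_rng(0)
img = rng.random((8, 8))
subbands = haar2d_level(img)
rec = haar2d_inverse(*subbands)    # exact round trip
```

Per-subband treatment, e.g. coarser quantization of the detail bands versus careful coding of the low-frequency band, then follows exactly the subband-by-subband logic described above.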