912 results for Frequency domain measurement


Relevance: 80.00%

Abstract:

This thesis concerns the development and validation of new criteria for the multiaxial fatigue assessment of metallic structural components. In particular, the new criteria apply to metallic components subjected to a wide range of loading configurations: multiaxial time-varying loads, both cyclic and random, in the high- and low/medium-cycle fatigue regimes. These criteria are a useful tool for evaluating the fatigue strength and fatigue life of metallic structural elements, being simple to implement and requiring rather modest computation times.

The first Chapter presents the issues related to multiaxial fatigue, introducing some theoretical aspects useful for describing the fatigue damage mechanism (crack propagation and final fracture) of metallic structural components subjected to time-varying loads. The different approaches available in the literature for the multiaxial fatigue assessment of such components are then presented, with particular attention to the critical plane approach. Finally, the engineering quantities related to the critical plane are defined, as used in fatigue design under multiaxial cyclic loading for high and low/medium numbers of loading cycles.

The second Chapter is devoted to the development of a new criterion for evaluating the fatigue strength of metallic structural elements subjected to multiaxial cyclic loading in the high-cycle regime. The criterion is based on the critical plane approach and is formulated in terms of stresses. Its development significantly revises an earlier formulation proposed by Carpinteri and co-workers in 2011. In particular, the first modification concerns the determination of the critical plane orientation: new expressions for the angle linking the critical plane orientation to that of the fracture plane are implemented in the criterion's algorithm. The second modification concerns the definition of the shear stress amplitude: a new method, known as the Prismatic Hull (PH) method (by Araújo and co-workers), is implemented in the algorithm. The reliability of the criterion is then verified against numerous experimental data available in the literature.

The third Chapter proposes a newly formulated criterion for evaluating the fatigue life of metallic structural elements subjected to multiaxial cyclic loading in the low/medium-cycle regime. The criterion is based on the critical plane approach and is formulated in terms of strains. In particular, its general framework draws on the high-cycle multiaxial fatigue criterion discussed in the second Chapter. Since significant plastic strains (such as those characterising low/medium-cycle fatigue) require knowledge of the effective Poisson's ratio of the material, three different strategies are employed: the ratio is computed analytically, computed numerically, and taken as a constant value frequently adopted in the literature. Numerous experimental data from the literature are then used to validate the criterion, with numerical results obtained for each value of the effective Poisson's ratio. Moreover, to account for the significant stress gradients arising at geometrical discontinuities such as notches, the criterion is also extended to notched structural components. The criterion, reformulated by implementing the control volume concept proposed by Lazzarin and co-workers, is used to estimate the fatigue life of specimens with a severe V-notch, made of grade 5 titanium alloy.

The fourth Chapter addresses the development of a new criterion for evaluating the fatigue damage of metallic structural elements subjected to multiaxial random loading in the high-cycle regime. The criterion is based on the critical plane approach and is formulated in the frequency domain. Its development significantly revises an earlier formulation proposed by Carpinteri and co-workers in 2014. In particular, the modification concerns the determination of the critical plane orientation, with new expressions for the angle linking the critical plane orientation to that of the fracture plane implemented in the criterion's algorithm. Finally, the reliability of the criterion is verified against numerous experimental data available in the literature.
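As a pointer to the mechanics behind the critical plane approach described above, the sketch below computes the normal and shear stresses acting on a candidate plane from Cauchy's formula. The stress state and plane choice are purely illustrative; this is not the thesis's criterion or algorithm.

```python
import numpy as np

def plane_stresses(sigma, n):
    """Normal and shear stress acting on a plane with (unit) normal n."""
    n = np.asarray(n, dtype=float)
    n = n / np.linalg.norm(n)          # ensure unit normal
    t = sigma @ n                      # traction vector (Cauchy's formula)
    s_n = float(t @ n)                 # normal stress component
    tau = t - s_n * n                  # tangential (shear) component
    return s_n, float(np.linalg.norm(tau))

# Uniaxial tension along x; on a plane at 45 degrees both components equal sigma/2.
sigma = np.zeros((3, 3))
sigma[0, 0] = 200.0                    # MPa, illustrative
s_n, tau = plane_stresses(sigma, [1.0, 1.0, 0.0])
```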

Relevance: 80.00%

Abstract:

The practice of yoga has become increasingly popular, not only for its physical benefits but mainly for the psychological well-being it brings. One of the components of yoga is Prãnãyama, or breath control. Attention and respiration are two physiological, involuntary mechanisms required for the execution of Prãnãyama. The main objective of this study was to verify whether continuous EEG variables (the power of the different frequency bands composing it) are modulated by respiratory control, comparing the two phases of the respiratory cycle (inspiration and expiration) separately under spontaneous and controlled breathing. Nineteen subjects took part in the study (7 men/12 women, mean age 36.89, SD = ±14.46), recruited at the Faculty of Health of the Universidade Metodista de São Paulo. The electroencephalogram was recorded with five Ag/AgCl electrodes (FPz, Fz, Cz, Pz and Oz) fixed to a quick-placement cap (Quick-Cap, Neuromedical Supplies®) according to the 10-20 system. Maximum power amplitudes (power spectrum in the frequency domain) were obtained for the theta, alpha, beta and delta bands, and the theta/beta ratio was computed for each phase of the respiratory cycle (inspiration and expiration) separately, under spontaneous breathing and under respiratory control. The respiratory cycle was recorded with an M01 respiratory effort belt (plethysmograph). The results show significant differences between the spontaneous and controlled breathing conditions, with lower mean theta/beta ratios during controlled breathing than during spontaneous breathing, and consistently higher mean alpha power under respiratory control. Significant differences were also found between inspiration and expiration during controlled breathing, with lower mean theta/beta ratios during inspiration and higher mean alpha power, especially during expiration. The findings of this study provide evidence that respiratory control modulates electrophysiological variables related to attention, reflecting a state of alertness that is nevertheless more relaxed than during spontaneous breathing.
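The theta/beta ratio analysed above can be illustrated with a minimal periodogram-based band-power computation. The synthetic signal and band edges below are illustrative only, not the study's processing pipeline.

```python
import numpy as np

def band_power(x, fs, f_lo, f_hi):
    """Total periodogram power of x within the band [f_lo, f_hi)."""
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2 / len(x)
    mask = (freqs >= f_lo) & (freqs < f_hi)
    return float(psd[mask].sum())

fs = 250.0                                    # sampling rate, Hz
t = np.arange(0, 4, 1.0 / fs)                 # 4 s of synthetic "EEG"
eeg = np.sin(2 * np.pi * 6 * t) + 0.3 * np.sin(2 * np.pi * 20 * t)
theta = band_power(eeg, fs, 4.0, 8.0)         # captures the 6 Hz component
beta = band_power(eeg, fs, 13.0, 30.0)        # captures the 20 Hz component
ratio = theta / beta
```

A lower ratio would correspond, as in the study, to relatively stronger fast (beta) activity.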

Relevance: 80.00%

Abstract:

The underlying work to this thesis focused on the exploitation and investigation of photosensitivity mechanisms in optical fibres and planar waveguides for the fabrication of advanced integrated optical devices for telecoms and sensing applications. One major aim is the improvement of grating fabrication specifications by introducing new writing techniques and using advanced characterisation methods for grating testing. For the first time, the polarisation control method for advanced grating fabrication has been successfully converted to apodised planar waveguide fabrication, and the development of a holographic method for the inscription of chirped gratings at arbitrary wavelengths is presented. The latter resulted in the fabrication of gratings for pulse-width suppression and wavelength selection in diode lasers. In co-operation with research partners, a number of samples were tested using optical frequency domain and optical low coherence reflectometry for a better insight into the limitations of grating writing techniques. Using a variety of fabrication methods, custom apodised and chirped fibre Bragg gratings were written for use as filter elements in multiplexer-demultiplexer devices, as well as for short pulse generation and wavelength selection in telecommunication transmission systems. Long period grating based devices in standard, speciality and tapered fibres are presented, showing great potential for multi-parameter sensing. One particular focus is the development of vectorial curvature and refractive index sensors with potential for medical, chemical and biological sensing. In addition, the design of an optically tunable Mach-Zehnder based multiwavelength filter is introduced. The discovery of a Type IA grating through overexposure of hydrogen-loaded standard and Boron-Germanium co-doped fibres strengthened the assumption that UV photosensitivity is a highly non-linear process.
Gratings of this type show a significantly lower thermal sensitivity than standard gratings, which makes them useful for sensing applications. An Oxford Lasers copper-vapour laser operating at 255 nm in pulsed mode was used for their inscription, in contrast to previous work using CW Argon-ion lasers, a difference that contributes to the observed differences in the photorefractive index change process.
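For reference, all the gratings discussed obey the first-order Bragg condition relating reflected wavelength to effective index and grating period. A minimal sketch with illustrative values (not taken from the thesis):

```python
def bragg_wavelength(n_eff, period_nm):
    """First-order Bragg condition: lambda_B = 2 * n_eff * Lambda."""
    return 2.0 * n_eff * period_nm

# Illustrative values for a telecom-band fibre grating.
lam = bragg_wavelength(n_eff=1.447, period_nm=535.0)   # ~1548 nm
```

A chirped grating simply varies `period_nm` along the fibre, so each position reflects a different wavelength.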

Relevância:

80.00% 80.00%

Publicador:

Resumo:

Image segmentation is one of the most computationally intensive operations in image processing and computer vision. This is because a large volume of data is involved and many different features have to be extracted from the image data. This thesis is concerned with the investigation of practical issues related to the implementation of several classes of image segmentation algorithms on parallel architectures. The Transputer is used as the basic building block of hardware architectures and Occam is used as the programming language. The segmentation methods chosen for implementation are convolution, for edge-based segmentation; the Split and Merge algorithm, for segmenting non-textured regions; and the Granlund method, for segmentation of textured images. Three different convolution methods have been implemented. The direct method of convolution, carried out in the spatial domain, uses the array architecture. The other two methods, based on convolution in the frequency domain, require the use of the two-dimensional Fourier transform. Parallel implementations of two different Fast Fourier Transform algorithms have been developed, incorporating original solutions. For the Row-Column method the array architecture has been adopted, and for the Vector-Radix method, the pyramid architecture. The texture segmentation algorithm, for which a system-level design is given, demonstrates a further application of the Vector-Radix Fourier transform. A novel concurrent version of the quad-tree based Split and Merge algorithm has been implemented on the pyramid architecture. The performance of the developed parallel implementations is analysed. Many of the obtained speed-up and efficiency measures show values close to their respective theoretical maxima. Where appropriate, comparisons are drawn between the different implementations.
The thesis concludes with comments on general issues related to the use of the Transputer system as a development tool for image processing applications, and on issues related to the engineering of concurrent image processing applications.
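The frequency-domain convolution methods mentioned above rest on the convolution theorem: convolution in the spatial domain becomes pointwise multiplication after a 2-D Fourier transform. A minimal numpy sketch of circular 2-D convolution (the thesis's Transputer/Occam implementations are, of course, architecture-specific):

```python
import numpy as np

def conv2d_fft(image, kernel):
    """Circular 2-D convolution via the convolution theorem:
    multiply the transforms, then invert."""
    K = np.zeros_like(image, dtype=float)
    kh, kw = kernel.shape
    K[:kh, :kw] = kernel                       # zero-pad kernel to image size
    return np.real(np.fft.ifft2(np.fft.fft2(image) * np.fft.fft2(K)))

# Convolving a unit impulse reproduces the kernel in the top-left corner.
impulse = np.zeros((8, 8))
impulse[0, 0] = 1.0
kernel = np.array([[0., 1., 0.], [1., -4., 1.], [0., 1., 0.]])  # Laplacian
out = conv2d_fft(impulse, kernel)
```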

Relevance: 80.00%

Abstract:

This thesis considers sparse approximation of still images as the basis of a lossy compression system. The Matching Pursuit (MP) algorithm is presented as a method particularly suited for application in lossy scalable image coding. Its multichannel extension, capable of exploiting inter-channel correlations, is found to be an efficient way to represent colour data in the RGB colour space. Known problems with MP, namely the high computational complexity of encoding and dictionary design, are tackled by finding an appropriate partitioning of the image. The idea of performing MP in the spatio-frequency domain after a transform such as the Discrete Wavelet Transform (DWT) is explored. The main challenge, though, is to encode the image representation obtained after MP into a bit-stream. Novel approaches for encoding the atomic decomposition of a signal and for colour amplitude quantisation are proposed and evaluated. The image codec that has been built is capable of competing with scalable coders such as JPEG 2000 and SPIHT in terms of compression ratio.
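The greedy selection step at the heart of MP can be sketched in a few lines. This toy version with an orthonormal dictionary is for illustration only; it omits the codec's partitioning, multichannel extension, and quantisation stages.

```python
import numpy as np

def matching_pursuit(signal, dictionary, n_atoms):
    """Greedy MP: at each step pick the unit-norm atom (column) most
    correlated with the residual and subtract its projection."""
    residual = np.asarray(signal, dtype=float).copy()
    coeffs = np.zeros(dictionary.shape[1])
    for _ in range(n_atoms):
        corr = dictionary.T @ residual
        k = int(np.argmax(np.abs(corr)))
        coeffs[k] += corr[k]
        residual = residual - corr[k] * dictionary[:, k]
    return coeffs, residual

# With an orthonormal dictionary, two iterations recover a 2-sparse signal.
D = np.eye(4)
coeffs, residual = matching_pursuit([3.0, 0.0, -2.0, 0.0], D, n_atoms=2)
```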

Relevance: 80.00%

Abstract:

It is desirable that energy performance improvement is not realized at the expense of other network performance parameters. This paper investigates the trade-off between energy efficiency, spectral efficiency and user QoS performance for a multi-cell multi-user radio access network. Specifically, the energy consumption ratio (ECR) and the spectral efficiency of several common frequency domain packet schedulers in a cellular E-UTRAN downlink are compared for both the SISO transmission mode and the 2x2 Alamouti Space Frequency Block Code (SFBC) MIMO transmission mode. It is well known that the 2x2 SFBC MIMO transmission mode is more spectrally efficient than the SISO transmission mode; however, the relationship between energy efficiency and spectral efficiency had not been established. It is shown that, for the E-UTRAN downlink with fixed transmission power, spectral efficiency improvement results in energy efficiency improvement. The effect of SFBC MIMO versus SISO on the user QoS performance is also studied. © 2011 IEEE.
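The two figures of merit compared above can be written down directly. The definitions below follow the common usage (ECR as energy per delivered bit); the numbers are illustrative, not the paper's results.

```python
def spectral_efficiency(throughput_bps, bandwidth_hz):
    """Spectral efficiency in bit/s/Hz."""
    return throughput_bps / bandwidth_hz

def energy_consumption_ratio(power_w, throughput_bps):
    """ECR: energy spent per delivered bit, in J/bit."""
    return power_w / throughput_bps

# With fixed transmit power, higher throughput (e.g. via 2x2 SFBC MIMO)
# improves both spectral efficiency and ECR.  Illustrative numbers only.
bw, p = 10e6, 40.0
se_siso = spectral_efficiency(20e6, bw)
ecr_siso = energy_consumption_ratio(p, 20e6)
se_mimo = spectral_efficiency(30e6, bw)
ecr_mimo = energy_consumption_ratio(p, 30e6)
```

This makes the paper's fixed-power conclusion immediate: at constant `p`, ECR falls whenever throughput (hence spectral efficiency) rises.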

Relevance: 80.00%

Abstract:

In traditional communication systems the transmission medium is considered a given characteristic of the channel, which does not depend on the properties of the transmitter and the receiver. Recent experimental demonstrations of the feasibility of extending the laser cavity over the whole communication link connecting the two parties, forming an ultra-long fiber laser (UFL), have raised groundbreaking possibilities in communication and particularly in secure communications. Here, a 500 km long secure key distribution link based on a Raman gain UFL is demonstrated. An error-free distribution of a random key with an average rate of 100 bps between the users is demonstrated, and the key is shown to be unrecoverable to an eavesdropper employing either time or frequency domain passive attacks. © 2014 by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

Relevance: 80.00%

Abstract:

A tilted fiber Bragg grating (TFBG) was integrated as the dispersive element in a high-performance biomedical imaging system. The spectrum emitted by the 23 mm long active region of the fiber is projected through custom-designed optics, consisting of a cylindrical lens for vertical beam collimation followed by an achromatic doublet, onto a linear detector array. High-resolution tomograms of biomedical samples were successfully acquired with the frequency-domain OCT system: tomograms of ophthalmic and dermal samples were obtained with 2.84 μm axial and 10.2 μm lateral resolution. The miniaturization reduces costs and has the potential to further extend the field of application of OCT systems in biology, medicine and technology. © 2014 SPIE.
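The axial resolution quoted above is set by the source bandwidth. A sketch of the standard Gaussian-source formula, with illustrative centre wavelength and bandwidth (not the system's actual source parameters):

```python
import math

def oct_axial_resolution_um(center_nm, bandwidth_nm):
    """Axial resolution of OCT with a Gaussian source:
    dz = (2 ln 2 / pi) * lambda0**2 / delta_lambda."""
    dz_nm = (2.0 * math.log(2.0) / math.pi) * center_nm ** 2 / bandwidth_nm
    return dz_nm / 1000.0

# Illustrative: 800 nm centre, 100 nm bandwidth -> roughly 2.8 um axially.
dz = oct_axial_resolution_um(800.0, 100.0)
```

Broader source bandwidth therefore directly improves axial resolution, independent of the imaging optics.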

Relevance: 80.00%

Abstract:

This paper describes a method of signal preprocessing for active monitoring. Suppose we want to find the response of a medium to one powerful signal, which is equivalent to obtaining the transmission function of the medium, but do not have an opportunity to conduct such an experiment (it might be too expensive or harmful for the environment). In this case we can conduct a series of experiments of relatively low power and superpose the response signals. However, this approach entails considerable loss of information (especially in the high-frequency domain) due to fluctuations of the phase, the frequency and the starting time of each individual experiment. The preprocessing technique presented in this paper allows us to substantially restore the response of the medium and consequently to find a better estimate of the transmission function. The technique is based on expanding the initial signal into a system of orthogonal functions.
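The information loss caused by start-time jitter, and the benefit of compensating for it before averaging, can be seen in a toy numpy experiment. Cross-correlation alignment is used here as a simple stand-in; it is not the paper's orthogonal-expansion technique.

```python
import numpy as np

rng = np.random.default_rng(0)
fs, f0 = 1000.0, 50.0
t = np.arange(0, 1, 1.0 / fs)
pulse = np.sin(2 * np.pi * f0 * t)            # idealized medium response

# A series of low-power repeats, each with random start-time jitter.
shifts = rng.integers(-5, 6, size=50)
trials = np.stack([np.roll(pulse, s) for s in shifts])

naive = trials.mean(axis=0)                   # jitter smears the plain average

# Align every trial to the first by cross-correlation, then average.
ref = trials[0]
aligned = []
for tr in trials:
    lag = int(np.argmax(np.correlate(tr, ref, mode="full"))) - (len(ref) - 1)
    aligned.append(np.roll(tr, -lag))
aligned_mean = np.mean(aligned, axis=0)
```

The naive average is visibly attenuated (the jittered phases partially cancel), while the aligned average recovers the full response amplitude.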

Relevance: 80.00%

Abstract:

The aim of this study is to accurately distinguish Parkinson's disease (PD) participants from healthy controls using self-administered tests of gait and postural sway. Using consumer-grade smartphones with in-built accelerometers, we objectively measure and quantify key movement severity symptoms of Parkinson's disease. Specifically, we record tri-axial accelerations, and extract a range of different features based on the time- and frequency-domain properties of the acceleration time series. The features quantify key characteristics of the acceleration time series, and enhance the underlying differences in the gait and postural sway accelerations between PD participants and controls. Using a random forest classifier, we demonstrate an average sensitivity of 98.5% and average specificity of 97.5% in discriminating PD participants from controls. © 2014 IEEE.
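Feature extraction of the kind described can be sketched with numpy alone. The two features and the synthetic stride-like signal below are illustrative; they are not the study's actual feature set or classifier.

```python
import numpy as np

def gait_features(acc, fs):
    """Simple time- and frequency-domain features of one acceleration axis."""
    acc = np.asarray(acc, dtype=float)
    acc = acc - acc.mean()                     # remove gravity / DC offset
    rms = float(np.sqrt(np.mean(acc ** 2)))   # time-domain energy
    psd = np.abs(np.fft.rfft(acc)) ** 2
    freqs = np.fft.rfftfreq(len(acc), 1.0 / fs)
    dom = float(freqs[np.argmax(psd[1:]) + 1])  # dominant non-DC frequency
    return {"rms": rms, "dominant_hz": dom}

fs = 100.0
t = np.arange(0, 10, 1.0 / fs)
acc = 9.81 + 0.8 * np.sin(2 * np.pi * 2.0 * t)  # ~2 Hz stride-like oscillation
feats = gait_features(acc, fs)
```

Feature vectors of this kind would then be fed to a classifier such as a random forest.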

Relevance: 80.00%

Abstract:

Unwanted spike noise in a digital signal is a common problem in digital filtering. Sometimes, however, the spikes are wanted and other, superimposed, signals are unwanted. Linear, time-invariant (LTI) filtering is then ineffective because the spikes are wideband, overlapping with independent noise in the frequency domain, so no LTI filter can separate them and nonlinear filtering becomes necessary. However, there are applications in which the noise includes drift or smooth signals for which LTI filters are ideal. We describe a nonlinear filter, formulated as the solution to an elastic net regularization problem, which attenuates band-limited signals and independent noise while enhancing superimposed spikes. Making use of known analytic solutions, a novel approximate path-following algorithm is given that provides a good filtered output with reduced computational effort compared to standard convex optimization methods. Accurate performance is shown on real, noisy electrophysiological recordings of neural spikes.
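The elastic net building block of such a filter has a closed-form proximal operator: soft-thresholding (from the L1 term) followed by shrinkage (from the L2 term). The sketch below shows only this elementwise step, not the paper's full path-following algorithm.

```python
import numpy as np

def elastic_net_prox(v, lam1, lam2):
    """Proximal operator of lam1*|x| + 0.5*lam2*x**2:
    soft-threshold by lam1, then shrink by 1/(1 + lam2)."""
    return np.sign(v) * np.maximum(np.abs(v) - lam1, 0.0) / (1.0 + lam2)

# Small fluctuations are zeroed; the two large spikes survive (slightly shrunk).
y = np.array([0.2, -0.1, 3.0, 0.05, -2.5])
x = elastic_net_prox(y, lam1=0.5, lam2=0.1)
```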

Relevance: 80.00%

Abstract:

A compact, fiber-based spectrometer for biomedical applications utilizing a tilted fiber Bragg grating (TFBG) as the integrated dispersive element is demonstrated. Based on a 45° UV-written PS750 TFBG, a refractive spectrometer with 2.06 rad/μm dispersion and a numerical aperture of 0.1 was set up and tested as the integrated detector of an optical coherence tomography (OCT) system. Featuring a 23 mm long active region at the fiber, the spectrum is projected via a cylindrical lens for vertical beam collimation and focused by an achromatic doublet onto the detector array. Covering 740 nm to 860 nm, the spectrometer was connected optically to a broadband white-light interferometer and a wide-field scan head, and electronically to an acquisition and control computer. Tomograms of ophthalmic and dermal samples were obtained with the frequency-domain OCT system, achieving 2.84 μm axial and 7.6 μm lateral resolution. © 2014 SPIE.

Relevance: 80.00%

Abstract:

We investigated 50 young patients with a diagnosis of Rolandic Epilepsy (RE) for the presence of abnormalities in autonomic tone, compared with 50 young patients with idiopathic generalized epilepsy with absences and 50 typically developing children of comparable age. We analyzed time-domain (N-N interval, pNN50) and frequency-domain (High Frequency (HF), Low Frequency (LF) and LF/HF ratio) indices from ten-minute resting EKG recordings. Patients with RE showed significantly higher HF power, lower LF power and a lower LF/HF ratio than controls, independent of the epilepsy group, and did not show significant differences in any other autonomic index with respect to the two control groups. In RE, we found a negative relationship between both seizure load and the frequency of sleep interictal EEG abnormalities and parasympathetic drive levels. These changes might be the expression of adaptive mechanisms to prevent the excessive sympathetic drive seen in patients with refractory epilepsies. © 2012 Elsevier Inc.
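The time-domain index pNN50 mentioned above has a one-line definition: the percentage of successive normal-to-normal (NN) interval differences exceeding 50 ms. A minimal sketch with illustrative NN intervals:

```python
import numpy as np

def pnn50(nn_ms):
    """Percentage of successive NN-interval differences exceeding 50 ms."""
    diffs = np.abs(np.diff(np.asarray(nn_ms, dtype=float)))
    return float(100.0 * np.mean(diffs > 50.0))

# Illustrative NN intervals in ms; 3 of the 5 successive differences exceed 50 ms.
nn = [800, 810, 900, 845, 850, 920]
p = pnn50(nn)
```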

Relevance: 80.00%

Abstract:

We propose a Wiener-Hammerstein (W-H) channel estimation algorithm for Long-Term Evolution (LTE) systems. The LTE standard provides known data as pilot symbols and exploits them through coherent detection to improve system performance. These pilots are placed in a hybrid pattern covering both the time and frequency domains. Our aim is to adapt the W-H equalizer (W-H/E) to the LTE standard to compensate for both the linear and nonlinear effects induced by power amplifiers and multipath channels. We evaluate the performance of the W-H/E for a downlink LTE system in terms of BLER, EVM and throughput versus SNR. Afterwards, we compare the results with those of a traditional Least-Mean-Square (LMS) equalizer. It is shown that the W-H/E can significantly reduce both linear and nonlinear distortions compared to LMS and improve LTE downlink system performance.

Relevance: 80.00%

Abstract:

This dissertation proposed a new approach to seizure detection in intracranial EEG (iEEG) recordings using nonlinear decision functions. It implemented well-established features designed to deal with complex signals such as brain recordings, and proposed a 2-D domain of analysis. Since the features considered span both the time and frequency domains, the analysis was carried out both temporally and as a function of different frequency ranges, in order to ascertain the measures most suitable for seizure detection. In retrospect, this study established a generalized approach to seizure detection that works across several features and across patients. Clinical experiments involved 8 patients with intractable seizures who were being evaluated for potential surgical interventions. A total of 35 collected iEEG data files were used in a training phase to ascertain the reliability of the formulated features. The remaining 69 iEEG data files were then used in the testing phase. The testing phase revealed that the correlation sum is the feature that performed best across all patients, with a sensitivity of 92% and an accuracy of 99%. The second-best feature was the gamma power, with a sensitivity of 92% and an accuracy of 96%. In the frequency domain, the 5 other spectral bands considered revealed mixed results, with low sensitivity in some frequency bands and low accuracy in others, which is expected given that the dominant frequencies in iEEG are those of the gamma band. In the time domain, the other features, which included mobility, complexity, and activity, all performed very well, with an average sensitivity of 80.3% and an accuracy of 95%. The computational requirement for generating these nonlinear decision functions in the training phase was extremely high.
It was determined that when the duration dimension was rescaled, the results improved and the convergence times of the nonlinear decision functions were reduced dramatically, by more than 100-fold. Through this rescaling, the sensitivity of the correlation sum improved to 100% and that of the gamma power to 97%, meaning that even fewer false negatives and false positives were detected.
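The time-domain features mentioned (activity, mobility, complexity) are the classical Hjorth parameters. A minimal numpy sketch, assuming those standard definitions (this is not the dissertation's implementation):

```python
import numpy as np

def hjorth(x):
    """Hjorth activity, mobility, and complexity of a 1-D signal."""
    x = np.asarray(x, dtype=float)
    dx = np.diff(x)                    # first derivative (discrete)
    ddx = np.diff(dx)                  # second derivative (discrete)
    activity = float(np.var(x))
    mobility = float(np.sqrt(np.var(dx) / np.var(x)))
    complexity = float(np.sqrt(np.var(ddx) / np.var(dx)) / mobility)
    return activity, mobility, complexity

# A pure sinusoid has complexity ~1; its activity is the variance (0.5 here).
t = np.arange(0, 2, 1.0 / 256.0)
x = np.sin(2 * np.pi * 10 * t)
act, mob, comp = hjorth(x)
```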