855 results for Empirical Mode Decomposition, vibration-based analysis, damage detection, signal decomposition


Relevance: 100.00%

Publisher:

Abstract:

Structural Health Monitoring (SHM) schemes are useful for proper management of the performance of structures and for preventing their catastrophic failures. Vibration-based SHM schemes have gained popularity during the past two decades, resulting in significant research. It is hence inevitable that future SHM schemes will include robust and automated vibration-based damage assessment techniques (VBDAT) to detect, localize and quantify damage. In this context, the Damage Index (DI) method, which is classified as a non-model or output-based VBDAT, has the ability to automate the damage assessment process without using a computer or numerical model alongside the actual measurements. Although damage assessment using DI methods has achieved reasonable success for structures made of homogeneous materials such as steel, the same level of success has not been reported for Reinforced Concrete (RC) structures. The complexity of flexural cracks is claimed to be the main reason hindering the applicability of existing DI methods in RC structures. Past research also indicates that use of a constant baseline throughout the damage assessment process undermines the potential of the Modal Strain Energy based Damage Index (MSEDI). To address this situation, this paper presents a novel method developed as part of a comprehensive research project carried out at Queensland University of Technology, Brisbane, Australia. This novel process, referred to as the baseline-updating method, continuously updates the baseline and systematically tracks both crack formation and propagation, with the ability to automate the damage assessment process using output-only data. The proposed method is illustrated through examples, and the results demonstrate its capability to achieve the desired outcomes.
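The abstract does not give the index formula; purely as a hedged illustration, the sketch below implements a generic Stubbs-type modal strain energy damage index on an assumed sinusoidal beam mode shape (the mode shapes, local perturbation and element discretization are all hypothetical, and the baseline-updating step itself is not reproduced):

```python
import numpy as np

def damage_index(phi_healthy, phi_damaged, dx=1.0):
    """Stubbs-type index: ratio of fractional modal strain energies per element."""
    def fractional_energy(phi):
        curv = np.diff(phi, 2) / dx**2   # mode-shape curvature via finite differences
        e = curv**2                      # elemental strain energy density
        return e / e.sum()               # fraction of total modal strain energy
    return fractional_energy(phi_damaged) / fractional_energy(phi_healthy)

x = np.linspace(0.0, np.pi, 41)
phi_h = np.sin(x)                   # first bending mode of a simply supported beam
phi_d = phi_h.copy()
phi_d[18:22] *= 1.05                # local mode-shape change emulating a crack
beta = damage_index(phi_h, phi_d)   # index peaks near the perturbed elements
```

Elements where the index clearly exceeds 1 are flagged as damaged; the paper's contribution is to update the healthy baseline continuously rather than keep it fixed.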

Relevance: 100.00%

Publisher:

Abstract:

Timely reporting, effective analyses and rapid distribution of surveillance data can assist in detecting aberrations in disease occurrence and further facilitate a timely response. In China, a new nationwide web-based automated system for outbreak detection and rapid response was developed in 2008. The China Infectious Disease Automated-alert and Response System (CIDARS) was developed by the Chinese Center for Disease Control and Prevention based on surveillance data from the existing electronic National Notifiable Infectious Diseases Reporting Information System (NIDRIS), started in 2004. NIDRIS greatly improved the timeliness and completeness of data reporting, with real-time reporting via the Internet. CIDARS further facilitates the data analysis, aberration detection, signal dissemination, signal response and information communication needed by public health departments across the country. In CIDARS, three aberration detection methods are used to detect unusual occurrences of 28 notifiable infectious diseases at the county level and to transmit that information either in real time or on a daily basis. The Internet, computers and mobile phones are used to accomplish rapid signal generation and dissemination, and timely reporting and reviewing of signal response results. CIDARS has been used nationwide since 2008; all Centers for Disease Control and Prevention (CDC) in China at the county, prefecture, provincial and national levels are involved in the system. It assists with early outbreak detection at the local level and prompt reporting of unusual disease occurrences or potential outbreaks to CDCs throughout the country.
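The three aberration detection methods used in CIDARS are not named in the abstract; purely as an illustration of the general idea, here is a minimal moving-window threshold detector of the kind used in such systems (window length, threshold factor and daily counts are all hypothetical):

```python
import statistics

def aberration_flags(counts, window=7, k=2.0):
    """Flag day t when its count exceeds mean + k*std of the preceding window."""
    flags = []
    for t in range(window, len(counts)):
        base = counts[t - window:t]
        mu = statistics.mean(base)
        sd = statistics.pstdev(base)
        flags.append(counts[t] > mu + k * sd)
    return flags

daily_cases = [3, 4, 2, 5, 3, 4, 3, 20, 3, 4]   # hypothetical county-level counts
signals = aberration_flags(daily_cases)          # the spike of 20 is flagged
```

A flagged day would generate a signal for county-level CDC staff to verify and respond to.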

Relevance: 100.00%

Publisher:

Abstract:

Information from the full diffusion tensor (DT) was used to compute voxel-wise genetic contributions to brain fiber microstructure. First, we designed a new multivariate intraclass correlation formula in the log-Euclidean framework. We then used the full multivariate structure of the tensor in a multivariate version of a voxel-wise maximum-likelihood structural equation model (SEM) that computes the variance contributions in the DTs from genetic (A), common environmental (C) and unique environmental (E) factors. Our algorithm was tested on DT images from 25 identical and 25 fraternal twin pairs. After linear and fluid registration to a mean template, we computed the intraclass correlation and Falconer's heritability statistic for several scalar DT-derived measures and for the full multivariate tensors. Covariance matrices were computed from the DTs and input into the SEM. Analyzing the full DT enhanced the detection of A and C effects. This approach should empower imaging genetics studies that use DTI.
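Falconer's heritability statistic mentioned above has a simple closed form: with twin-pair correlations r_MZ and r_DZ, the variance shares of the A, C and E factors can be estimated as 2(r_MZ - r_DZ), 2 r_DZ - r_MZ, and 1 - r_MZ. A minimal sketch (the correlation values are hypothetical, not taken from the study):

```python
def falconer_ace(r_mz, r_dz):
    """Falconer estimates of A/C/E variance shares from twin correlations."""
    a2 = 2.0 * (r_mz - r_dz)   # additive genetic (A)
    c2 = 2.0 * r_dz - r_mz     # common environment (C)
    e2 = 1.0 - r_mz            # unique environment (E)
    return a2, c2, e2

a2, c2, e2 = falconer_ace(0.8, 0.5)   # hypothetical voxel-wise correlations
```

The SEM fit in the paper estimates the same decomposition by maximum likelihood rather than by this method-of-moments formula.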

Relevance: 100.00%

Publisher:

Abstract:

Human sports doping control analysis is a complex and challenging task for anti-doping laboratories. The List of Prohibited Substances and Methods, updated annually by the World Anti-Doping Agency (WADA), consists of hundreds of chemically and pharmacologically different low and high molecular weight compounds. This poses a considerable challenge for laboratories: analyzing them all in a limited amount of time from a limited sample aliquot. The continuous expansion of the Prohibited List obliges laboratories to keep their analytical methods updated and to investigate newly available methodologies. In this thesis, accurate-mass-based analysis employing liquid chromatography-time-of-flight mass spectrometry (LC-TOFMS) was developed and validated to improve the power of doping control analysis. New analytical methods were developed utilizing the high mass accuracy and high information content obtained by TOFMS to generate comprehensive and generic screening procedures. The suitability of LC-TOFMS for comprehensive screening was demonstrated for the first time in the field, with mass accuracies better than 1 mDa. Further attention was given to generic sample preparation, an essential part of screening analysis, to rationalize the whole workflow and minimize the need for several separate sample preparation methods. Utilizing both positive and negative ionization allowed the detection of almost 200 prohibited substances. Automatic data processing produced a Microsoft Excel-based report highlighting the entries fulfilling the criteria of the reverse database search (retention time (RT), mass accuracy, isotope match). The quantitative performance of LC-TOFMS was demonstrated with morphine, codeine and their intact glucuronide conjugates. After straightforward sample preparation, the compounds were analyzed directly without the need for hydrolysis, solvent transfer, evaporation or reconstitution.
Hydrophilic interaction liquid chromatography (HILIC) provided good chromatographic separation, which was critical for the morphine glucuronide isomers. A wide linear range (50-5000 ng/ml) with good precision (RSD < 10%) and accuracy (±10%) was obtained, showing performance comparable to or better than other methods in use. In-source collision-induced dissociation (ISCID) allowed confirmation analysis with three diagnostic ions, with a median mass accuracy of 1.08 mDa and repeatable ion ratios fulfilling WADA's identification criteria. The suitability of LC-TOFMS for screening of high molecular weight doping agents was demonstrated with plasma volume expanders (PVE), namely dextran and hydroxyethyl starch (HES). The specificity of the assay was improved, since interfering matrix compounds were removed by size-exclusion chromatography (SEC). ISCID produced three characteristic ions with an excellent mean mass accuracy of 0.82 mDa at physiological concentration levels. In summary, by combining TOFMS with proper sample preparation and chromatographic separation, the technique can be utilized extensively in doping control laboratories for comprehensive screening of chemically different low and high molecular weight compounds, for quantification of threshold substances and even for confirmation. LC-TOFMS rationalized the workflow in doping control laboratories by simplifying the screening scheme, expediting reporting and minimizing analysis costs. Therefore, LC-TOFMS can be exploited widely in doping control, reducing the need for several separate analysis techniques.
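Mass accuracy figures like the 1.08 mDa above are simple differences between measured and theoretical m/z; a small sketch (both m/z values below are hypothetical, not taken from the thesis):

```python
def mass_error(measured_mz, theoretical_mz):
    """Mass error in millidaltons (mDa) and parts per million (ppm)."""
    delta = measured_mz - theoretical_mz
    return delta * 1000.0, delta / theoretical_mz * 1e6

mda, ppm = mass_error(286.1446, 286.1438)   # hypothetical protonated ion
```

In a reverse database search, a candidate passes only if such errors stay inside tolerance windows together with retention time and isotope-pattern matches.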

Relevance: 100.00%

Publisher:

Abstract:

We present a complete solution to the problem of coherent-mode decomposition of the most general anisotropic Gaussian Schell-model (AGSM) beams, which constitute a ten-parameter family. Our approach is based on symmetry considerations. Concepts and techniques familiar from the context of quantum mechanics in the two-dimensional plane are used to exploit the Sp(4, R) dynamical symmetry underlying the AGSM problem. We take advantage of the fact that the symplectic group of first-order optical systems acts unitarily, through the metaplectic operators, on the Hilbert space of wave amplitudes over the transverse plane, and, using the Iwasawa decomposition for the metaplectic operator and the classic theorem of Williamson on the normal forms of positive definite symmetric matrices under linear canonical transformations, we demonstrate the unitary equivalence of the AGSM problem to a separable problem studied earlier by Li and Wolf [Opt. Lett. 7, 256 (1982)] and Gori and Guattari [Opt. Commun. 48, 7 (1983)]. This connection enables one to write down, almost by inspection, the coherent-mode decomposition of the general AGSM beam. A universal feature of the eigenvalue spectrum of the AGSM family is noted.
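For the separable Gaussian Schell-model problem to which the AGSM case is reduced, the coherent modes can also be obtained numerically by discretizing the cross-spectral density and diagonalizing it; the sketch below does this for a 1D kernel with assumed widths, and recovers the well-known geometric decay of the eigenvalue spectrum:

```python
import numpy as np

x = np.linspace(-5.0, 5.0, 201)
dx = x[1] - x[0]
sigma_s, sigma_g = 1.0, 0.5     # assumed intensity and coherence widths
X1, X2 = np.meshgrid(x, x, indexing="ij")
W = (np.exp(-(X1**2 + X2**2) / (4 * sigma_s**2))
     * np.exp(-((X1 - X2)**2) / (2 * sigma_g**2)))   # GSM cross-spectral density

# discretized Fredholm eigenproblem: eigenvalues are the coherent-mode weights
evals = np.linalg.eigvalsh(W * dx)[::-1]             # descending order
```

The ratio of successive eigenvalues comes out (numerically) constant, the universal geometric-spectrum feature noted for the AGSM family.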

Relevance: 100.00%

Publisher:

Abstract:

This paper presents the design and performance analysis of a detector based on suprathreshold stochastic resonance (SSR) for the detection of deterministic signals in heavy-tailed non-Gaussian noise. The detector consists of a matched filter preceded by an SSR system which acts as a preprocessor. The SSR system is composed of an array of 2-level quantizers with independent and identically distributed (i.i.d.) noise added to the input of each quantizer. The standard deviation sigma of the quantizer noise is chosen to maximize the detection probability for a given false alarm probability. In the case of a weak signal, the optimum sigma also minimizes the mean-square difference between the output of the quantizer array and the output of the nonlinear transformation of the locally optimum detector. The optimum sigma depends only on the probability density functions (pdfs) of the input noise and quantizer noise for weak signals, and also on the signal amplitude and the false alarm probability for non-weak signals. Improvement in detector performance stems primarily from quantization and to a lesser extent from the optimization of quantizer noise. For most input noise pdfs, the performance of the SSR detector is very close to that of the optimum detector. (C) 2012 Elsevier B.V. All rights reserved.
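A minimal Monte-Carlo sketch of the SSR front end described above (array size, noise distributions and signal level are assumed for illustration; the paper's detector additionally applies a matched filter and optimizes sigma):

```python
import numpy as np

rng = np.random.default_rng(0)
M = 63          # number of 2-level quantizers in the array
sigma_q = 1.0   # quantizer-noise std (tuned in the paper to maximize Pd)
s = 0.2         # weak deterministic signal under H1
n = 5000
noise = rng.laplace(0.0, 1.0, size=n)   # heavy-tailed stand-in input noise

def ssr_front_end(x):
    """Fraction of quantizers firing 'high' for each input sample."""
    eta = rng.normal(0.0, sigma_q, size=(x.size, M))   # i.i.d. quantizer noise
    return ((x[:, None] + eta) > 0).mean(axis=1)

y0 = ssr_front_end(noise)       # H0: noise only
y1 = ssr_front_end(s + noise)   # H1: signal plus noise
```

Thresholding the (filtered) output y then gives the detection/false-alarm trade-off studied in the paper.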

Relevance: 100.00%

Publisher:

Abstract:

Lamb-wave-type guided wave propagation in foam core sandwich structures and the detectability of damage using a spectral analysis method are reported in this paper. An experimental study, supported by theoretical evaluation of the guided wave characteristics, is presented that shows the applicability of Lamb-wave-type guided ultrasonic waves for detection of damage in foam core sandwich structures. Sandwich beam specimens were fabricated with a 10 mm thick foam core and 0.3 mm thick aluminum face sheets. Thin piezoelectric patch actuators and sensors are used to excite and sense the guided waves. Group velocity dispersion curves and the frequency response of the sensed signal are obtained experimentally. The nature of the damping present in the sandwich panel is monitored by measuring the sensor signal amplitude at different distances from the center of the linear phased array. Delaminations of increasing width are created and detected experimentally by pitch-catch interrogation with guided waves and wavelet transform of the sensed signal. Signal amplitudes are analyzed for different damage sizes to differentiate damage size/severity. A sandwich panel is also fabricated with a planar dimension of 600 mm x 400 mm. A release-film delamination is introduced during fabrication. A non-contact Laser Doppler Vibrometer (LDV) is used to scan the panel while exciting it with a surface-bonded piezoelectric actuator. The presence of damage is confirmed by the reflected-wave fringe pattern obtained from the LDV scan. With this approach it is possible to locate and monitor damage by tracking the wave packets scattered from it.
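In a pitch-catch measurement like the one described, the group velocity follows from the time of flight between actuator and sensor; a minimal synthetic sketch (sampling rate, toneburst, delay and sensor spacing are all assumed values):

```python
import numpy as np

fs = 1.0e6                   # sampling rate, Hz (assumed)
f0 = 50.0e3                  # toneburst centre frequency, Hz (assumed)
n_burst = int(5 * fs / f0)   # 5-cycle Hann-windowed toneburst
tb = np.arange(n_burst) / fs
burst = np.sin(2 * np.pi * f0 * tb) * np.hanning(n_burst)

delay = 120                                     # simulated propagation delay, samples
tx = np.zeros(2048); tx[:n_burst] = burst       # excitation
rx = np.zeros(2048); rx[delay:delay + n_burst] = 0.4 * burst   # attenuated arrival

xc = np.correlate(rx, tx, mode="full")
lag = int(np.argmax(xc)) - (tx.size - 1)        # time of flight in samples
spacing = 0.12                                  # actuator-sensor distance, m (assumed)
v_group = spacing / (lag / fs)                  # estimated group velocity, m/s
```

Real guided-wave signals are dispersive, which is why the paper uses wavelet transforms of the sensed signal; this sketch only shows the time-of-flight principle.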

Relevance: 100.00%

Publisher:

Abstract:

A series of fluoranthene derivatives (F1-F5), varying in the nature and type of their substituents, was synthesized via Diels-Alder reaction followed by in situ decarbonylation. The solid-state structures were established through single-crystal X-ray diffraction (XRD). Extended conjugation and two alkyloxy chains on the phenyl rings give the molecules the flexibility to orient opposite to each other and to interact with neighboring fluoranthene units through weak pi-pi interactions, showing unique supramolecular arrangements. Photophysical and DFT studies demonstrated that the HOMO-LUMO levels were effectively tuned by the different substituents, with optical band gaps from 3.44 to 3.88 eV, prompting their examination as sensitive fluorescent chemosensors for the detection of nitroaromatic compounds (NACs). Sensitivity toward the detection of NACs was evaluated through fluorescence quenching in solution (aqueous and non-aqueous) and in the solid state (vapor and contact mode). Fluorescence studies demonstrated that electron transfer occurs from the electron-rich fluoranthene fluorophores to the electron-deficient NACs through a dominant static quenching mechanism, and that the quenching process is reversible. Detection sensitivity was found to increase with the extent of conjugation on the fluoranthene unit. The contact-mode approach using thin-layer silica chromatographic plates exhibits a femtogram (1.15 fg/cm(2)) detection limit for trinitrotoluene (TNT) and picric acid (PA), while solution-state fluorescence quenching detects PA at the 2-20 ppb level. The sensing performance of fluoranthene thin films toward NACs in aqueous solution reveals that the fluorophores are highly selective for the detection of PA. The strong performance and high photostability of the thin-film fluorophores give them a clear advantage over conjugated polymers for the sensitive detection of PA in groundwater.
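Static quenching data of the kind reported here are commonly analyzed with the Stern-Volmer relation F0/F = 1 + Ksv[Q]; a small sketch with synthetic (hypothetical) intensities and concentrations:

```python
# Stern-Volmer fit through the origin: y = F0/F - 1 = Ksv * [Q]
quencher = [0.0, 1e-5, 2e-5, 3e-5, 4e-5]   # [Q] in mol/L (hypothetical)
F0 = 1000.0                                # unquenched intensity
ksv_true = 5.0e4                           # assumed Stern-Volmer constant, 1/M
F = [F0 / (1.0 + ksv_true * q) for q in quencher]

xs = quencher[1:]
ys = [F0 / f - 1.0 for f in F[1:]]
ksv_fit = sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)
```

The fitted Ksv is one way to rank quencher efficiency among the NACs tested.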

Relevance: 100.00%

Publisher:

Abstract:

The equivalence of triangle-comparison-based pulse width modulation (TCPWM) and space vector based PWM (SVPWM) during linear modulation is well-known. This paper analyses triangle-comparison based PWM techniques (TCPWM) such as sine-triangle PWM (SPWM) and common-mode voltage injection PWM during overmodulation from a space vector point of view. The average voltage vector produced by TCPWM during overmodulation is studied in the stationary (a-b) reference frame. This is compared and contrasted with the average voltage vector corresponding to the well-known standard two-zone algorithm for space vector modulated inverters. It is shown that the two-zone overmodulation algorithm itself can be derived from the variation of average voltage vector with TCPWM. The average voltage vector is further studied in a synchronously revolving (d-q) reference frame. The RMS value of low-order voltage ripple can be estimated, and can be used to compare harmonic distortion due to different PWM methods during overmodulation. The measured values of the total harmonic distortion (THD) in the line currents are presented at various fundamental frequencies. The relative values of measured current THD pertaining to different PWM methods tally with those of analytically evaluated RMS voltage ripple.
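The saturation of the average pole voltage in overmodulation can be reproduced in a few lines: clamp the sine-triangle duty ratio to [0, 1] and extract the fundamental (this is a generic SPWM sketch, not the paper's two-zone algorithm):

```python
import math

def fundamental_of_spwm(m, n=3600):
    """Fundamental amplitude of the cycle-averaged pole voltage (per Vdc/2)
    for sine-triangle PWM with the duty ratio saturated in overmodulation."""
    b1 = 0.0
    for k in range(n):
        th = 2 * math.pi * k / n
        d = min(1.0, max(0.0, 0.5 * (1.0 + m * math.sin(th))))  # saturated duty
        v = 2.0 * d - 1.0              # normalized average pole voltage
        b1 += v * math.sin(th)
    return 2.0 * b1 / n                # Fourier fundamental coefficient

lin = fundamental_of_spwm(0.9)   # linear region: fundamental equals m
ovm = fundamental_of_spwm(1.5)   # overmodulation: gain compresses toward 4/pi
```

As m grows the pole voltage tends to a square wave and the fundamental approaches 4/pi, the six-step limit; the two-zone algorithm reaches that limit in a controlled way.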

Relevance: 100.00%

Publisher:

Abstract:

Research has been undertaken to ascertain the predictability of non-stationary time series using wavelet and Empirical Mode Decomposition (EMD) based time series models. Methods have been developed in the past to decompose a time series into components, and forecasting these components, combined with the random component, can yield predictions. Following this approach, wavelet and EMD analyses have been incorporated separately; each decomposes a time series into independent orthogonal components with both time and frequency localization. The component series are fitted with specific auto-regressive models to obtain forecasts, which are later combined to obtain the actual predictions. Four non-stationary streamflow sites (USGS data resources) with monthly total volumes and two non-stationary gridded rainfall sites (IMD) with monthly total rainfall are considered for the study. Predictability is checked for six- and twelve-month-ahead forecasts with both methodologies. Based on performance measures, it is observed that the wavelet-based method has better prediction capability than the EMD-based method, despite some of the limitations of time series methods and the manner in which the decomposition takes place. Finally, the study concludes that the wavelet-based time series algorithm can be used to model events such as droughts with reasonable accuracy. Some modifications that could extend the scope of applicability to other areas of hydrology are also discussed. (C) 2013 Elsevier B.V. All rights reserved.
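A minimal sketch of the decompose-then-forecast idea (a one-level Haar split standing in for the full wavelet decomposition, and an AR(1) fit per component; the series is synthetic):

```python
import numpy as np

def haar_split(x):
    """One-level orthonormal Haar split into approximation and detail."""
    x = np.asarray(x, float)
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return a, d

def ar1_forecast(c, steps):
    """Least-squares fit of c_t = phi * c_(t-1), iterated forward."""
    phi = float(np.dot(c[1:], c[:-1]) / np.dot(c[:-1], c[:-1]))
    out, last = [], float(c[-1])
    for _ in range(steps):
        last = phi * last
        out.append(last)
    return np.array(out)

t = np.arange(64)
series = np.sin(2 * np.pi * t / 16) + 0.1 * np.cos(2 * np.pi * t / 4)
a, d = haar_split(series)   # orthogonal components (energy is preserved)
fc_a = ar1_forecast(a, 3)   # forecast each component separately,
fc_d = ar1_forecast(d, 3)   # then recombine for the final prediction
```

The study fits a specific auto-regressive model per component and combines the component forecasts; EMD would replace the Haar split with intrinsic mode functions.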

Relevance: 100.00%

Publisher:

Abstract:

Approximate Nearest Neighbour Field (ANNF) maps are commonly used by the computer vision and graphics community for problems like image completion, retargeting and denoising. In this paper, we extend the scope of ANNF maps to medical image analysis, more specifically to optic disk detection in retinal images. In the analysis of retinal images, optic disk detection plays an important role since it simplifies the segmentation of the optic disk and other retinal structures. The proposed approach uses FeatureMatch, an ANNF algorithm, to find the correspondence between a chosen optic disk reference image and any given query image. This correspondence provides a distribution of patches in the query image that are closest to patches in the reference image. The likelihood map obtained from this distribution of patches is used for optic disk detection. The proposed approach is evaluated on five publicly available databases (DIARETDB0, DIARETDB1, DRIVE, STARE and MESSIDOR), with a total of 1540 images. We show experimentally that our proposed approach achieves an average detection accuracy of 99% and an average computation time of 0.2 s per image. (C) 2013 Elsevier Ltd. All rights reserved.
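An ANNF map simply records, for every patch of the query image, its closest patch in the reference image; a brute-force miniature version (FeatureMatch approximates this at scale, and real inputs would be retinal images rather than random arrays):

```python
import numpy as np

def nn_field(query, ref, p=3):
    """Index of the closest p-by-p reference patch for each query patch."""
    def patches(img):
        h, w = img.shape
        return np.array([img[i:i + p, j:j + p].ravel()
                         for i in range(h - p + 1) for j in range(w - p + 1)])
    Q, R = patches(query), patches(ref)
    d2 = ((Q[:, None, :] - R[None, :, :]) ** 2).sum(-1)   # squared distances
    return d2.argmin(axis=1)

rng = np.random.default_rng(1)
ref = rng.random((8, 8))
field = nn_field(ref.copy(), ref)   # identical image: each patch maps to itself
```

The likelihood map in the paper is built from the spatial distribution of such matches against an optic disk reference image.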

Relevance: 100.00%

Publisher:

Abstract:

The decomposition of experimental data into dynamic modes using a data-based algorithm is applied to Schlieren snapshots of a helium jet and to time-resolved PIV measurements of an unforced and a harmonically forced jet. The algorithm relies on the reconstruction of a low-dimensional inter-snapshot map from the available flow field data. The spectral decomposition of this map results in an eigenvalue and eigenvector representation (referred to as dynamic modes) of the underlying fluid behavior contained in the processed flow fields. This dynamic mode decomposition allows the breakdown of a fluid process into dynamically relevant and coherent structures and thus aids in the characterization and quantification of physical mechanisms in fluid flow. © 2010 Springer-Verlag.
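The inter-snapshot map and its spectral decomposition can be written down compactly; the sketch below is the standard SVD-based (exact) DMD recipe applied to synthetic snapshots of a known linear map, so the recovered eigenvalues are known in advance:

```python
import numpy as np

def dmd(X, Y, r):
    """Exact DMD: eigenvalues and modes of the low-rank map A with Y ~ A X."""
    U, s, Vh = np.linalg.svd(X, full_matrices=False)
    U, s, Vh = U[:, :r], s[:r], Vh[:r]
    Atilde = (U.conj().T @ Y @ Vh.conj().T) / s   # r x r reduced operator
    evals, Wv = np.linalg.eig(Atilde)
    modes = (Y @ Vh.conj().T / s) @ Wv            # exact DMD modes
    return evals, modes

A = np.array([[0.9, -0.1],
              [0.0,  0.5]])        # "truth" map with eigenvalues 0.9 and 0.5
x = np.array([1.0, 1.0])
snaps = [x]
for _ in range(10):
    x = A @ x
    snaps.append(x)
S = np.array(snaps).T              # state-by-time snapshot matrix
evals, modes = dmd(S[:, :-1], S[:, 1:], r=2)
```

For experimental Schlieren or PIV data the snapshots are vectorized flow fields, and the eigenvalues encode the frequency and growth or decay of each dynamic mode.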

Relevance: 100.00%

Publisher:

Abstract:

Offshore seismic exploration involves high investment and high risk, and faces many problems, such as multiples. Technology for high-resolution, high signal-to-noise-ratio marine seismic data processing has therefore become an important project. In this paper, multi-scale decomposition of both prestack and poststack seismic data, based on the wavelet and Hilbert-Huang transforms, together with the theory of phase deconvolution, is proposed through analysis of marine seismic exploration, a survey of the literature, and integration of current mainstream and emerging technology. The related algorithms are studied. The pyramid algorithm of decomposition and reconstruction is given by the Mallat algorithm of the discrete wavelet transform; here it is introduced into seismic data processing, and its validity is shown by tests with field data. The main idea of the Hilbert-Huang transform is empirical mode decomposition, with which any complicated data set can be decomposed into a finite and often small number of intrinsic mode functions that admit a well-behaved Hilbert transform. After the decomposition, an analytical signal is constructed by the Hilbert transform, from which the instantaneous frequency and amplitude can be obtained, and then the Hilbert spectrum. This decomposition method is adaptive and highly efficient. Since the decomposition is based on the local characteristics of the time scale of the data, it is applicable to nonlinear and non-stationary processes. The phenomena of fitting overshoot, undershoot and end swings in the Hilbert-Huang transform are analyzed, and effective methods to eliminate them are studied in the paper. Multi-scale decomposition of both prestack and poststack seismic data achieves amplitude-preserved processing, greatly enhances seismic data resolution, and overcomes the problem that the conventional method cannot restore the amplitudes of different frequency components uniformly.
The method of phase deconvolution, which overcomes the minimum-phase limitation of traditional deconvolution, matches well the fact that the seismic wavelet is mixed-phase in practical application, and gives a more reliable result. In the applied research, high-resolution, relative-amplitude-preserved processing results were obtained by careful analysis and application of the methods above to seismic data from four different target areas of the China Sea. Finally, a set of processing flows and methods was formed, which has been carried into actual production and has made good progress and yielded large economic benefits.
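As a toy illustration of amplitude-preserved multi-scale decomposition, the sketch below splits a trace into frequency bands with FFT masks that partition the spectrum, so the bands sum back exactly to the input (the wavelet and EMD decompositions in the thesis are adaptive, unlike this fixed band split):

```python
import numpy as np

def band_split(trace, n_bands=4):
    """Split a trace into octave-like bands; the bands sum back to the trace."""
    X = np.fft.rfft(trace)
    n = X.size
    edges = [n >> k for k in range(n_bands - 1, 0, -1)] + [n]
    bands, lo = [], 0
    for hi in edges:
        mask = np.zeros(n)
        mask[lo:hi] = 1.0                  # disjoint masks partition the spectrum
        bands.append(np.fft.irfft(X * mask, n=trace.size))
        lo = hi
    return bands

trace = np.random.default_rng(2).standard_normal(256)   # stand-in seismic trace
bands = band_split(trace)
recon = np.sum(bands, axis=0)   # equals the input: amplitude preserved
```

Processing each scale separately and then recombining is what lets a multi-scale scheme raise resolution without distorting relative amplitudes.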

Relevance: 100.00%

Publisher:

Abstract:

In modern signal processing, the objects of analysis are usually non-linear, non-Gaussian and non-stationary signals, especially non-stationary signals. The conventional tools for analyzing and processing non-stationary signals are the short-time Fourier transform, the Wigner-Ville distribution, the wavelet transform and so on. But these three algorithms are all based on the Fourier transform, so they share the shortcomings of Fourier analysis and cannot escape its limitations on localization. The Hilbert-Huang Transform (HHT) is a newer non-stationary signal processing technique, proposed by N. E. Huang in 1998. It is composed of Empirical Mode Decomposition (EMD) and Hilbert Spectral Analysis (HSA). After EMD processing, any non-stationary signal is decomposed into a series of data sequences with different scales; each sequence is called an Intrinsic Mode Function (IMF). The energy distribution of the original non-stationary signal is then found by summing the Hilbert spectra of all the IMFs. In essence, this algorithm makes non-stationary signals stationary, decomposes the fluctuations and trends of different scales step by step, and finally describes the frequency content with instantaneous frequency and energy instead of the global frequency and energy of Fourier spectral analysis. In this way, the shortcoming of Fourier analysis, which needs many spurious harmonics to describe non-linear and non-stationary signals, is avoided. This thesis covers the following parts. First, it introduces the history and development of HHT, and then its characteristics and main issues; it briefly presents the basic principles and algorithms of the Hilbert-Huang transform and confirms its validity by simulations. Second, it discusses some shortcomings of HHT.
By using FFT interpolation, we solve the problems of IMF instability and instantaneous-frequency undulation caused by an insufficient sampling rate. For the boundary effect caused by the limitations of the envelope algorithm of HHT, we use a wave-characteristic matching method, with good results. Third, the thesis studies in depth the application of HHT to electromagnetic signal processing. Based on the analysis of actual data examples, we discuss its application to electromagnetic signal processing and noise suppression. Using the empirical mode decomposition method and its multi-scale filter characteristics, the noise distribution of an electromagnetic signal can be analyzed effectively, interference suppressed, and interpretability improved. It was found that selecting electromagnetic signal sections using the Hilbert time-frequency energy spectrum helps to improve signal quality and enhance data quality.
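The instantaneous-frequency step of HHT follows from the analytic signal; a compact FFT-based sketch on a pure tone, where the instantaneous frequency should come out flat (real data would first be split into IMFs by EMD, which is not reproduced here):

```python
import numpy as np

def analytic_signal(x):
    """FFT-based analytic signal (one-sided spectrum doubling)."""
    n = x.size
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(X * h)

fs = 1000.0
t = np.arange(0.0, 1.0, 1.0 / fs)
x = np.cos(2 * np.pi * 50.0 * t)                # single 50 Hz tone
z = analytic_signal(x)
phase = np.unwrap(np.angle(z))
inst_freq = np.diff(phase) * fs / (2 * np.pi)   # instantaneous frequency, Hz
```

Summing such Hilbert spectra over all IMFs yields the time-frequency energy distribution used to select clean electromagnetic signal sections.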

Relevance: 100.00%

Publisher:

Abstract:

Biosensors are used for a large number of applications within biotechnology, including the pharmaceutical industry and life sciences. Since the production of Biacore surface-plasmon resonance instruments in the early 1990s, there has been steadily growing use of this technology for the detection of food contaminants (e.g., veterinary drugs, mycotoxins, marine toxins, food dyes and processing contaminants). Other biosensing technologies (e.g., electrochemical and piezoelectric) have also been employed for the analysis of small-molecule contaminants. This review concentrates on recent advances made in detection and quantification of antimicrobial compounds with different types of biosensors and on the emergence of multiplexing, which is highly desirable as it increases sample analysis at lower cost and in less time. (C) 2010 Elsevier Ltd. All rights reserved.