992 results for signal reconstruction


Relevance:

20.00%

Publisher:

Abstract:

Considering a general linear model of signal degradation, by modeling the probability density function (PDF) of the clean signal using a Gaussian mixture model (GMM) and the additive noise by a Gaussian PDF, we derive the minimum mean square error (MMSE) estimator. The derived MMSE estimator is non-linear, and the linear MMSE estimator is shown to be a special case. For a speech signal corrupted by independent additive noise, by modeling the joint PDF of the time-domain speech samples of a speech frame using a GMM, we propose a speech enhancement method based on the derived MMSE estimator. We also show that the same estimator can be used for transform-domain speech enhancement.
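
The structure of such an estimator can be summarized compactly (a sketch under the stated assumptions, in our own notation rather than the paper's: degradation model y = Hx + n, GMM prior on the clean signal x, zero-mean Gaussian noise n with covariance C_n):

```latex
% MMSE estimate of x from y = Hx + n, with x ~ \sum_k w_k N(\mu_k, C_k)
% and n ~ N(0, C_n) independent of x.
\hat{x}_{\mathrm{MMSE}} = \mathbb{E}[x \mid y]
  = \sum_{k} p(k \mid y)\,
    \Bigl[\mu_k + C_k H^{\top}\bigl(H C_k H^{\top} + C_n\bigr)^{-1}\bigl(y - H\mu_k\bigr)\Bigr],
\qquad
p(k \mid y) \propto w_k\,\mathcal{N}\!\bigl(y;\, H\mu_k,\; H C_k H^{\top} + C_n\bigr).
```

Each bracketed term is a per-component linear (Wiener-type) MMSE estimate, so the overall estimator is a non-linear, posterior-weighted combination of linear estimators; with a single mixture component it collapses to the linear MMSE estimator, consistent with the special case noted above.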

Relevance:

20.00%

Publisher:

Abstract:

The problem of reconstruction of a refractive-index distribution (RID) in optical refraction tomography (ORT) with optical path-length difference (OPD) data is solved using two adaptive-estimation-based extended-Kalman-filter (EKF) approaches. First, a basic single-resolution EKF (SR-EKF) is applied to a state variable model describing the tomographic process, to estimate the RID of an optically transparent refracting object from noisy OPD data. The initialization of the biases and covariances corresponding to the state and measurement noise is discussed. The state and measurement noise biases and covariances are adaptively estimated. An EKF is then applied to the wavelet-transformed state variable model to yield a wavelet-based multiresolution EKF (MR-EKF) solution approach. To numerically validate the adaptive EKF approaches, we evaluate them with benchmark studies of standard stationary cases, where comparative results with commonly used efficient deterministic approaches can be obtained. Detailed reconstruction studies for the SR-EKF and two versions of the MR-EKF (with Haar and Daubechies-4 wavelets) compare well with those obtained from a typically used variant of the (deterministic) algebraic reconstruction technique, the average correction per projection method, thus establishing the capability of the EKF for ORT. To the best of our knowledge, the present work contains unique reconstruction studies encompassing the use of EKF for ORT in single-resolution and multiresolution formulations, and also in the use of adaptive estimation of the EKF's noise covariances. (C) 2010 Optical Society of America
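
For readers unfamiliar with the filter itself, a generic EKF predict/update step is sketched below (illustration only: the RID state vector, the OPD measurement model, its Jacobians, and the adaptive estimation of the noise biases and covariances used in the paper are not reproduced, and all names are placeholders).

```python
import numpy as np

def ekf_step(x, P, z, f, F_jac, h, H_jac, Q, R):
    """One generic extended-Kalman-filter predict/update step.

    x, P   : prior state estimate and its covariance
    z      : measurement vector (e.g. noisy OPD data)
    f, h   : process and measurement models (callables)
    F_jac, H_jac : Jacobians of f and h (callables)
    Q, R   : process and measurement noise covariances
    """
    # Predict
    x_pred = f(x)
    F = F_jac(x)
    P_pred = F @ P @ F.T + Q

    # Update
    H = H_jac(x_pred)
    S = H @ P_pred @ H.T + R             # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)  # Kalman gain
    x_new = x_pred + K @ (z - h(x_pred))
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```

In the multiresolution variant described above, the same recursion runs on the wavelet-transformed state variables, and the adaptive schemes additionally update the noise biases and covariances online rather than fixing them at initialization.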

Relevance:

20.00%

Publisher:

Abstract:

We describe a noniterative method for recovering optical absorption coefficient distribution from the absorbed energy map reconstructed using simulated and noisy boundary pressure measurements. The source reconstruction problem is first solved for the absorbed energy map corresponding to single- and multiple-source illuminations from the side of the imaging plane. It is shown that the absorbed energy map and the absorption coefficient distribution, recovered from the single-source illumination with a large variation in photon flux distribution, have signal-to-noise ratios comparable to those of the reconstructed parameters from a more uniform photon density distribution corresponding to multiple-source illuminations. The absorbed energy map is input as absorption coefficient times photon flux in the time-independent diffusion equation (DE) governing photon transport to recover the photon flux in a single step. The recovered photon flux is used to compute the optical absorption coefficient distribution from the absorbed energy map. In the absence of experimental data, we obtain the boundary measurements through Monte Carlo simulations, and we attempt to address the possible limitations of the DE model in the overall reconstruction procedure.
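
The single-step structure described above can be written compactly (our notation, summarizing the abstract: A is the reconstructed absorbed energy map, φ the photon flux, μ_a the absorption coefficient, D the diffusion coefficient, and q_0 the source term):

```latex
% Absorbed energy map: A = \mu_a \phi.
% Substituting \mu_a \phi = A into the time-independent diffusion equation
% -\nabla\cdot(D\nabla\phi) + \mu_a\phi = q_0 makes it linear in \phi:
-\nabla \cdot \bigl(D \nabla \phi\bigr) + A = q_0
\;\;\Longrightarrow\;\; \phi \ \text{(one linear solve)}, \qquad
\mu_a = \frac{A}{\phi}.
```

So no iteration between φ and μ_a is needed: the diffusion equation is solved once for the flux, and the absorption coefficient follows by pointwise division of the absorbed energy map by that flux.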

Relevance:

20.00%

Publisher:

Abstract:

Solar ultraviolet (UV) radiation has a broad range of effects on life on Earth. Soon after the mid-1980s, it was recognized that the stratospheric ozone content was declining over large areas of the globe. Because the stratospheric ozone layer protects life on Earth from harmful UV radiation, this led to concern about possible changes in UV radiation due to anthropogenic activity. Prompted by this concern, many stations for monitoring surface UV radiation were founded in the late 1980s and early 1990s. As a consequence, there is an apparent lack of information on UV radiation further in the past: measurements cannot tell us how UV radiation levels have changed on time scales of, for instance, several decades. The aim of this thesis was to improve our understanding of past variations in surface UV radiation by developing techniques for UV reconstruction. Such techniques use commonly available meteorological data together with measurements of the total ozone column to reconstruct, or estimate, the amount of UV radiation reaching the Earth's surface in the past. Two different techniques for UV reconstruction were developed. Both are based on first calculating the clear-sky UV radiation using a radiative transfer model. The clear-sky value is then corrected for the effect of clouds based on either (i) sunshine duration or (ii) pyranometer measurements. Both techniques also account for the variations in surface albedo caused by snow, whereas aerosols are included as a typical climatological aerosol load. Using these methods, long time series of reconstructed UV radiation were produced for five European locations, namely Sodankylä and Jokioinen in Finland, Bergen in Norway, Norrköping in Sweden, and Davos in Switzerland. Both UV reconstruction techniques developed in this thesis account for the greater part of the factors affecting the amount of UV radiation reaching the Earth's surface. Thus, they are considered reliable and trustworthy, as suggested also by the good performance of the methods. The pyranometer-based method performs better than the sunshine-based method, especially for daily values. For monthly values, the difference between the methods is smaller, indicating that the sunshine-based method is roughly as good as the pyranometer-based method for assessing long-term changes in surface UV radiation. The time series of reconstructed UV radiation produced in this thesis provide new insight into the past UV radiation climate and how the UV radiation has varied over the years. Especially the sunshine-based UV time series, extending back to 1926 at Davos and 1950 at Sodankylä, also put the recent changes driven by the ozone decline observed over the last few decades into perspective. At Davos, the reconstructed UV over the period 1926-2003 shows considerable variation throughout the entire period, with high values in the mid-1940s, the early 1960s, and the 1990s. Moreover, the variations prior to 1980 were found to be caused primarily by variations in cloudiness, while the increase of 4.5 %/decade over the period 1979-1999 was supported by both the decline in the total ozone column and changes in cloudiness. Of the other stations included in this work, both Sodankylä and Norrköping show a clear increase in UV radiation since the early 1980s (3-4 %/decade), driven primarily by changes in cloudiness and to a lesser extent by the diminution of total ozone. At Jokioinen, a weak increase was found, while at Bergen there was no considerable overall change in the UV radiation level.
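
Schematically, both reconstruction techniques share the same decomposition (our notation, intended only to summarize the two variants; the actual parameterizations are those developed in the thesis):

```latex
UV_{\mathrm{rec}}(t) \;=\; UV_{\mathrm{clear}}\bigl(t;\ \mathrm{total\ ozone},\ \mathrm{snow\ albedo},\ \mathrm{aerosol\ climatology}\bigr)\ \times\ CMF(t),
```

where UV_clear is computed with the radiative transfer model and the cloud modification factor CMF(t) is derived either from sunshine-duration records (method i) or from pyranometer measurements (method ii).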

Relevance:

20.00%

Publisher:

Abstract:

This work belongs to the field of computational high-energy physics (HEP). The key methods used in this thesis to meet the challenges raised by the Large Hadron Collider (LHC) era experiments are object-orientation with software engineering, Monte Carlo simulation, computer cluster technology, and artificial neural networks. The first aspect discussed is the development of hadronic cascade models, used for the accurate simulation of medium-energy hadron-nucleus reactions up to 10 GeV. These models are typically needed in hadronic calorimeter studies and in the estimation of radiation backgrounds. Various applications outside HEP include the medical field (such as hadron treatment simulations), space science (satellite shielding), and nuclear physics (spallation studies). Validation results are presented for several significant improvements released in the Geant4 simulation tool, and the significance of the new models for computing in the Large Hadron Collider era is estimated. In particular, we estimate the ability of the Bertini cascade to simulate the Compact Muon Solenoid (CMS) hadron calorimeter (HCAL). LHC test beam activity has a tightly coupled simulation-to-data-analysis cycle: typically, a Geant4 computer experiment is used to understand test beam measurements. Another aspect of this thesis is therefore a description of studies related to developing new CMS H2 test beam data analysis tools and performing data analysis on the basis of CMS Monte Carlo events. These events have been simulated in detail using Geant4 physics models, a full CMS detector description, and event reconstruction. Using the ROOT data analysis framework, we have developed an offline ANN-based approach to tag b-jets associated with heavy neutral Higgs particles, and we show that this kind of NN methodology can be successfully used to separate the Higgs signal from the background in the CMS experiment.
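
As a loose, self-contained illustration of ANN-based signal/background separation (not the ROOT-based tool developed in the thesis; the library, features, and numbers below are stand-ins):

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Synthetic stand-ins for per-jet discriminating variables
# (e.g. secondary-vertex and kinematic quantities).
n = 5000
signal = rng.normal(loc=1.0, scale=1.0, size=(n, 4))      # "signal" class
background = rng.normal(loc=0.0, scale=1.0, size=(n, 4))  # "background" class
X = np.vstack([signal, background])
y = np.hstack([np.ones(n), np.zeros(n)])

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# Small feed-forward network used as a signal/background discriminant.
clf = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=500, random_state=0)
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```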

Relevance:

20.00%

Publisher:

Abstract:

The problem of designing high-rate, full-diversity noncoherent space-time block codes (STBCs) with low encoding and decoding complexity is addressed. First, the notion of g-group encodable and g-group decodable linear STBCs is introduced. Then, for a known class of rate-1 linear designs, an explicit construction of fully diverse signal sets is provided that leads to four-group encodable and four-group decodable differential scaled unitary STBCs for any number of antennas that is a power of two. Previous works on differential STBCs either sacrifice decoding complexity for higher rate or sacrifice rate for lower decoding complexity.
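
For context, the notion referred to above can be paraphrased as follows (a standard-style definition, not quoted from the paper): an STBC in K real information symbols s_1, ..., s_K is g-group decodable if the symbols can be partitioned into g groups G_1, ..., G_g such that the ML decision metric splits into a sum of terms, each depending on the symbols of one group only,

```latex
M(s_1,\dots,s_K) \;=\; \sum_{i=1}^{g} M_i\bigl(\{\,s_k : k \in \mathcal{G}_i\,\}\bigr),
```

so that each group can be decoded independently of the others; g-group encodability is the analogous property of the encoding map. With four groups, the joint search over all symbols is replaced by four much smaller independent searches, which is the complexity advantage claimed above.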

Relevance:

20.00%

Publisher:

Abstract:

This thesis describes methods for the reliable identification of hadronically decaying tau leptons in the search for heavy Higgs bosons of the minimal supersymmetric standard model of particle physics (MSSM). The identification of the hadronic tau lepton decays, i.e. tau-jets, is applied to the gg->bbH, H->tautau and gg->tbH+, H+->taunu processes to be searched for in the CMS experiment at the CERN Large Hadron Collider. Of all the event selections applied in these final states, the tau-jet identification is the single most important event selection criterion for separating the tiny Higgs boson signal from a large number of background events. The tau-jet identification is studied with methods based on a signature of a low charged track multiplicity, the containment of the decay products within a narrow cone, an isolated electromagnetic energy deposition, a non-zero tau lepton flight path, the absence of electrons, muons, and neutral hadrons in the decay signature, and a relatively small tau lepton mass compared to the mass of most hadrons. Furthermore, in the H+->taunu channel, helicity correlations are exploited to separate the signal tau jets from those originating from the W->taunu decays. Since many of these identification methods rely on the reconstruction of charged particle tracks, the systematic uncertainties resulting from the mechanical tolerances of the tracking sensor positions are estimated with care. The tau-jet identification and other standard selection methods are applied to the search for the heavy neutral and charged Higgs bosons in the H->tautau and H+->taunu decay channels. For the H+->taunu channel, the tau-jet identification is redone and optimized with a more recent and more detailed event simulation than previously used in the CMS experiment. Both decay channels are found to be very promising for the discovery of the heavy MSSM Higgs bosons. The Higgs boson(s), whose existence has not yet been experimentally verified, are a part of the standard model and its most popular extensions. They are a manifestation of a mechanism which breaks the electroweak symmetry and generates masses for particles. Since the H->tautau and H+->taunu decay channels are important for the discovery of the Higgs bosons in a large region of the permitted parameter space, the analysis described in this thesis serves as a probe for finding out properties of the microcosm of particles and their interactions at energy scales beyond the standard model of particle physics.
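
Purely as an illustration of the cut-based style of selection implied by the signatures listed above (every variable name and threshold below is a hypothetical placeholder, not a value from the thesis):

```python
def passes_tau_jet_id(jet):
    """Toy hadronic-tau identification using the kinds of signatures listed above.

    `jet` is a dict of illustrative per-jet quantities; all thresholds are
    placeholders chosen for readability, not analysis values.
    """
    return (
        jet["n_charged_tracks"] in (1, 3)            # low charged-track multiplicity
        and jet["signal_cone_fraction"] > 0.95       # decay products contained in a narrow cone
        and jet["ecal_isolation_gev"] < 1.0          # isolated electromagnetic deposition
        and jet["flight_path_significance"] > 3.0    # non-zero tau flight path
        and not jet["is_electron_like"]              # electron veto
        and not jet["is_muon_like"]                  # muon veto
        and jet["visible_mass_gev"] < 1.8            # small visible mass compared to most hadrons
    )
```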

Relevance:

20.00%

Publisher:

Abstract:

Today, techniques and technologies interact with the human body and give people the possibility of reconstructing their bodies, but also of improving and augmenting them. Hybridization is a technological process aimed at compensating for human deficiencies. The augmentation of one's power of being is exalted (health, sexuality, performance, youth), yet access to it is not available to everyone. This book sets out to disentangle the different representations of the hybrid body and the projects that underlie them.

Relevance:

20.00%

Publisher:

Abstract:

A measurable electrical signal is generated when a gas flows over a variety of solids, including doped semiconductors, even at the modest speed of a few meters per second. The underlying mechanism is an interesting interplay of Bernoulli's principle and the Seebeck effect. The electrical signal depends on the square of the Mach number (M) and is proportional to the Seebeck coefficient (S) of the solid. Here we present an experimental estimate of the response time of the signal rise and fall process, i.e., how fast the semiconductor materials respond to a steady flow as soon as it is switched on or off. A theoretical model is also presented to explain the process and the dependence of the response time on the nature and physical dimensions of the semiconductor material used, and the predictions are compared with the experimental observations. (c) 2007 Elsevier B.V. All rights reserved.
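
The quoted scaling follows from combining the two effects (a back-of-the-envelope sketch assuming adiabatic ideal-gas flow; γ is the ratio of specific heats, T the gas temperature, M the Mach number, and S the Seebeck coefficient of the solid):

```latex
% Bernoulli: the flow sets up a temperature difference along the sample,
\Delta T \;\sim\; \frac{\gamma - 1}{2}\, M^{2}\, T,
\qquad
% Seebeck: that temperature difference drives a thermoelectric voltage,
V \;=\; S\,\Delta T \;\propto\; S\, M^{2}.
```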

Relevance:

20.00%

Publisher:

Abstract:

A simple analog instrumentation system for electrical impedance tomography (EIT) is developed and calibrated using practical phantoms. A constant current injector, consisting of a modified Howland voltage-controlled current source fed by a voltage-controlled oscillator, is developed to inject a constant current at the phantom boundary. An instrumentation amplifier, a 50 Hz notch filter, and a narrow band-pass filter are developed and used for signal conditioning. Practical biological phantoms are developed, and the forward problem is studied to calibrate the EIT instrumentation. An array of sixteen stainless steel electrodes is developed and placed inside the phantom tank filled with KCl solution. A 1 mA, 50 kHz sinusoidal current is injected at the phantom boundary using the adjacent current injection protocol. The differential potentials developed at the voltage electrodes are measured for sixteen current injections. The differential voltage signal is passed through the instrumentation amplifier and the filtering block and measured by a digital multimeter. A forward solver is developed using the finite element method in MATLAB 7.0 for solving the EIT governing equation. Differential potentials are numerically calculated using the forward solver with a simulated current and the bathing solution conductivity. The measured potential data are compared with the calculated differential potentials to calibrate the instrumentation so that it acquires voltage data suitable for better image reconstruction.
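
The EIT governing equation solved by the forward solver is the standard low-frequency model (written here in its usual form, which may differ in detail from the electrode model used in the paper; σ is the conductivity distribution, φ the potential in the domain Ω, and the injected current enters through the Neumann boundary condition):

```latex
\nabla \cdot \bigl(\sigma \nabla \phi\bigr) = 0 \quad \text{in } \Omega,
\qquad
\sigma\,\frac{\partial \phi}{\partial n} =
\begin{cases}
 +J & \text{on the current-source electrode},\\
 -J & \text{on the current-sink electrode},\\
 \;0 & \text{elsewhere on } \partial\Omega,
\end{cases}
```

The FEM solution gives φ on the boundary, from which the sixteen sets of differential electrode potentials are computed for comparison with the measured data.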

Relevance:

20.00%

Publisher:

Abstract:

The book presents a reconstruction, interpretation, and critical evaluation of the Schumpeterian theoretical approach to socio-economic change. The analysis focuses on the problem of social evolution, on the interpretation of the innovation process and business cycles, and, finally, on Schumpeter's optimistic neglect of ecological-environmental conditions as possible factors influencing social-economic change. The author investigates how the Schumpeterian approach describes the process of social and economic evolution, and how the logic of transformations is described, explained, and understood in the Schumpeterian theory. The material of the study includes Schumpeter's works written after 1925, a related part of the commentary literature on these works, and a selected part of the related literature on the innovation process, technological transformations, and the problem of long waves. Concerning the period after 1925, the Schumpeterian oeuvre is conceived and analysed as a more or less homogeneous corpus of texts. The book is divided into 9 chapters. Chapters 1-2 describe the research problems and methods. Chapter 3 is an effort to provide a systematic reconstruction of Schumpeter's ideas concerning social and economic evolution. Chapters 4 and 5 focus their analysis on the innovation process. In Chapters 6 and 7, Schumpeter's theory of business cycles is examined. Chapter 8 evaluates Schumpeter's views concerning his relative neglect of ecological-environmental conditions as possible factors influencing social-economic change. Finally, Chapter 9 draws the main conclusions.

Relevance:

20.00%

Publisher:

Abstract:

We present a measurement of the top quark mass and of the top-antitop pair production cross section using p-pbar data collected with the CDF II detector at the Tevatron Collider at the Fermi National Accelerator Laboratory, corresponding to an integrated luminosity of 2.9 fb^-1. We select events with six or more jets satisfying a number of kinematical requirements imposed by means of a neural network algorithm. At least one of these jets must originate from a b quark, as identified by the reconstruction of a secondary vertex inside the jet. The mass measurement is based on a likelihood fit incorporating reconstructed mass distributions representative of signal and background, where the absolute jet energy scale (JES) is measured simultaneously with the top quark mass. The measurement yields a value of 174.8 ± 2.4 (stat+JES) +1.2/-1.0 (syst) GeV/c^2, where the uncertainty from the absolute jet energy scale is evaluated together with the statistical uncertainty. The procedure also measures the amount of signal, from which we derive a cross section, sigma_ttbar = 7.2 ± 0.5 (stat) ± 1.0 (syst) ± 0.4 (lumi) pb, for the measured values of the top quark mass and JES.

Relevance:

20.00%

Publisher:

Abstract:

The matched filter method for detecting a periodic structure on a surface hidden behind randomness is known to detect structures with (r0/Λ) >= 0.11, where r0 is the coherence length of light scattered from the rough part and Λ is the wavelength of the periodic part of the surface; this limit is much lower than what is allowed by conventional detection methods. The primary goal of this technique is the detection and characterization of the periodic structure hidden behind randomness without the use of any complicated experimental or computational procedures. This paper examines this detection procedure for various values of the amplitude a of the periodic part, beginning from a = 0 and moving to small finite values of a. We thus address the importance of the following quantities in determining the detectability of the intensity peaks: a/λ, which scales the amplitude of the periodic part with the wavelength of light, and r0/Λ.
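
The matched-filter idea itself, stripped of the optical specifics above, can be demonstrated with synthetic one-dimensional data (a generic illustration, not the speckle-intensity procedure of the paper):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic record: a weak periodic component of amplitude a buried in noise.
N, true_period, a = 16384, 64.0, 0.1
n = np.arange(N)
data = a * np.sin(2 * np.pi * n / true_period) + rng.normal(0.0, 1.0, N)

def matched_filter_response(data, period):
    """Quadrature matched-filter response at a trial period."""
    n = np.arange(len(data))
    s = np.sin(2 * np.pi * n / period)
    c = np.cos(2 * np.pi * n / period)
    s /= np.linalg.norm(s)
    c /= np.linalg.norm(c)
    return np.hypot(np.dot(data, s), np.dot(data, c))

trial_periods = np.linspace(16, 256, 241)
responses = np.array([matched_filter_response(data, p) for p in trial_periods])
best = trial_periods[np.argmax(responses)]
print(f"peak response at trial period ~ {best:.0f} (true period {true_period})")
```

The detection criterion is simply that the response at the true period stands out above the background of responses at other trial periods, which is the role played by the intensity peaks discussed above.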