10 results for transverse stochastic cooling

in Helda - Digital Repository of the University of Helsinki


Relevance:

20.00%

Publisher:

Abstract:

Stochastic filtering is, in general, the estimation of indirectly observed states from observed data; in the context of a probability space, the conditional expectation given the observations provides the most accurate estimate. In my thesis, I present the theory of filtering with two different kinds of observation process: the first is a diffusion process, discussed in the first chapter, while the third chapter introduces the second, a counting process. Most of the fundamental results of stochastic filtering are stated in the form of equations, such as the unnormalized Zakai equation, which leads to the Kushner-Stratonovich equation. The latter, also known as the normalized Zakai equation or, equivalently, the Fujisaki-Kallianpur-Kunita (FKK) equation, shows the difference between the estimates obtained with a diffusion process and with a counting process. I also present an example for the linear Gaussian case, which is the setting of the so-called Kalman-Bucy filter. As the unnormalized and normalized Zakai equations are stated in terms of the conditional distribution, a density of these distributions is derived through these equations and stated in Kushner's theorem. However, Kushner's theorem takes the form of a stochastic partial differential equation, so the existence and uniqueness of its solution must be verified; this is covered in the second chapter.
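The linear Gaussian case mentioned in the abstract admits a closed-form recursive solution. As a minimal sketch, here is the discrete-time analogue of the continuous-time Kalman-Bucy filter, for a generic state-space model with assumed matrices A, C, Q, R (these names are illustrative, not taken from the thesis):

```python
import numpy as np

def kalman_filter(y, A, C, Q, R, m0, P0):
    """Discrete-time Kalman filter for the linear Gaussian model
        x_{t+1} = A x_t + w_t,  w_t ~ N(0, Q)   (hidden state)
        y_t     = C x_t + v_t,  v_t ~ N(0, R)   (observation)
    Returns the filtered means E[x_t | y_1..y_t]."""
    m, P = m0, P0
    means = []
    for yt in y:
        # Predict: propagate mean and covariance through the dynamics.
        m = A @ m
        P = A @ P @ A.T + Q
        # Update: correct with the new observation via the Kalman gain.
        S = C @ P @ C.T + R
        K = P @ C.T @ np.linalg.inv(S)
        m = m + K @ (yt - C @ m)
        P = P - K @ C @ P
        means.append(m.copy())
    return np.array(means)
```

For a scalar random walk observed in noise (A = C = 1), the filtered mean converges to the level of the observations, illustrating the conditional-expectation estimate discussed above.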

Relevance:

20.00%

Publisher:

Abstract:

Minimum Description Length (MDL) is an information-theoretic principle that can be used for model selection and other statistical inference tasks. There are various ways to apply the principle in practice; one theoretically valid way is to use the normalized maximum likelihood (NML) criterion. Due to computational difficulties, this approach has not been used very often. This thesis presents efficient floating-point algorithms that make it possible to compute the NML for multinomial, Naive Bayes, and Bayesian forest models. None of the presented algorithms relies on asymptotic analysis, and for the first two model classes we also discuss how to compute exact rational-number solutions.
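For the two-category (Bernoulli) multinomial, the NML normalizing sum can be written down and evaluated directly. The following is only an illustrative direct-summation sketch of that standard formula, not the efficient algorithms the thesis presents:

```python
import math

def multinomial_nml_complexity(n: int) -> float:
    """Parametric complexity C(n) of the two-category multinomial NML
    distribution, by direct summation over the n+1 possible counts.
    The NML code length of a sequence is then
        -log P_ML(sequence) + log C(n)."""
    if n == 0:
        return 1.0
    total = 0.0
    for h in range(n + 1):
        # Maximum likelihood of any sequence with h ones out of n trials;
        # note 0**0 evaluates to 1 in Python, as the formula requires.
        p_ml = (h / n) ** h * ((n - h) / n) ** (n - h)
        total += math.comb(n, h) * p_ml
    return total
```

For example, C(1) = 2 and C(2) = 2.5; the cost of this direct sum grows quickly for more than two categories, which is the computational difficulty the thesis addresses.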

Relevance:

20.00%

Publisher:

Abstract:

We report a search for single top quark production with the CDF II detector, using 2.1 fb-1 of integrated luminosity of pbar p collisions at sqrt{s} = 1.96 TeV. The selected data consist of events characterized by a large energy imbalance in the transverse plane and hadronic jets, with no identified electrons or muons, so the sample is enriched in W -> tau nu decays. To suppress backgrounds, additional kinematic and topological requirements are imposed through a neural network, and at least one of the jets must be identified as a b-quark jet. We measure an excess of signal-like events in agreement with the standard model prediction, but inconsistent with a model without single top quark production by 2.1 standard deviations (sigma), with a median expected sensitivity of 1.4 sigma. Assuming a top quark mass of 175 GeV/c^2 and ascribing the excess to single top quark production, the cross section is measured to be 4.9 +2.5/-2.2 (stat+syst) pb, consistent with measurements performed on independent datasets and with the standard model prediction.
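The quoted 2.1 sigma inconsistency can be translated into a one-sided Gaussian tail probability with the standard conversion; a minimal sketch (this is the generic conversion, not part of the analysis itself):

```python
import math

def sigma_to_pvalue(z: float) -> float:
    """One-sided Gaussian tail probability corresponding to a
    significance of z standard deviations."""
    return 0.5 * math.erfc(z / math.sqrt(2.0))
```

A significance of 2.1 sigma corresponds to a one-sided p-value of roughly 0.018.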

Relevance:

20.00%

Publisher:

Abstract:

We present the results of a search for supersymmetry with gauge-mediated breaking and $\tilde{\chi}_1^0 \to \gamma\tilde{G}$ in the $\gamma\gamma$ + missing transverse energy final state. In 2.6 $\pm$ 0.2 fb$^{-1}$ of $p\bar{p}$ collisions at $\sqrt{s} = 1.96$ TeV recorded by the CDF II detector, we observe no candidate events, consistent with a standard model background expectation of 1.4 $\pm$ 0.4 events. We set limits on the cross section at the 95% C.L. and place the world's best limit of 149 GeV/$c^2$ on the $\tilde{\chi}_1^0$ mass at $\tau_{\tilde{\chi}_1^0}$

Relevance:

20.00%

Publisher:

Abstract:

We present a signature-based search for anomalous production of events containing a photon, two jets (of which at least one is identified as originating from a b quark), and missing transverse energy. The search uses data corresponding to 2.0 fb^-1 of integrated luminosity from p-pbar collisions at a center-of-mass energy of sqrt(s) = 1.96 TeV, collected with the CDF II detector at the Fermilab Tevatron. From 6,697,466 events with a photon candidate with transverse energy ET > 25 GeV, we find 617 events with missing transverse energy > 25 GeV and two or more jets with ET > 15 GeV, at least one identified as originating from a b quark, versus an expectation of 607 ± 113 events. Increasing the requirement on missing transverse energy to 50 GeV, we find 28 events versus an expectation of 30 ± 11 events. We find no indications of non-standard-model phenomena.
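Comparing an observed count to an expectation, as above, amounts to a Poisson consistency check. A simplified sketch that ignores the uncertainty on the expectation (the ± 113 and ± 11 quoted in the abstract), so it is illustrative only:

```python
import math

def poisson_prob_at_most(n_obs: int, mean: float) -> float:
    """P(N <= n_obs) for a Poisson-distributed count with the given mean:
    a crude consistency check of an observed count against an expected
    background, with the background uncertainty ignored."""
    return sum(math.exp(-mean) * mean ** k / math.factorial(k)
               for k in range(n_obs + 1))
```

For 28 observed events against a mean expectation of 30, this probability is close to 0.4, i.e. entirely consistent, in line with the abstract's conclusion.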

Relevance:

20.00%

Publisher:

Abstract:

Measurements of inclusive charged-hadron transverse-momentum and pseudorapidity distributions are presented for proton-proton collisions at sqrt(s) = 0.9 and 2.36 TeV. The data were collected with the CMS detector during the LHC commissioning in December 2009. For non-single-diffractive interactions, the average charged-hadron transverse momentum is measured to be 0.46 ± 0.01 (stat.) ± 0.01 (syst.) GeV/c at 0.9 TeV and 0.50 ± 0.01 (stat.) ± 0.01 (syst.) GeV/c at 2.36 TeV, for pseudorapidities between -2.4 and +2.4. At these energies, the measured pseudorapidity densities in the central region, dN(charged)/d(eta) for |eta|

Relevance:

20.00%

Publisher:

Abstract:

The objective of this paper is to investigate pricing accuracy under stochastic volatility, where the volatility follows a square-root process. The theoretical prices are compared with market price data (the German DAX index options market) using two different parameter-estimation techniques: the method of moments and implicit estimation by inversion. Standard Black & Scholes pricing is used as a benchmark. The results indicate that the stochastic volatility model with parameters estimated by inversion, using the prices available on the preceding day, is the most accurate of the three pricing methods in this study and can be considered satisfactory. However, as the same model with parameters estimated over a rolling window (the method of moments) proved inferior to the benchmark, the importance of stable and correct parameter estimation is evident.
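A square-root variance process of the kind studied here can be priced by Monte Carlo simulation. The following is a minimal Euler-discretization sketch; the parameter names (kappa, theta, sigma_v, rho) are illustrative assumptions of this sketch, not estimates or the estimation procedure from the paper:

```python
import numpy as np

def sqrt_vol_call_mc(S0, K, T, r, v0, kappa, theta, sigma_v, rho,
                     n_steps=200, n_paths=20000, seed=0):
    """Monte Carlo price of a European call when the variance v_t follows
    the square-root process
        dv = kappa (theta - v) dt + sigma_v sqrt(v) dW_v,
    with correlation rho between the variance and spot Brownian motions.
    Simple Euler scheme; max(v, 0) keeps the variance non-negative."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    S = np.full(n_paths, float(S0))
    v = np.full(n_paths, float(v0))
    for _ in range(n_steps):
        z1 = rng.standard_normal(n_paths)
        z2 = rho * z1 + np.sqrt(1.0 - rho ** 2) * rng.standard_normal(n_paths)
        vp = np.maximum(v, 0.0)
        # Log-Euler step for the spot, Euler step for the variance.
        S *= np.exp((r - 0.5 * vp) * dt + np.sqrt(vp * dt) * z1)
        v = v + kappa * (theta - vp) * dt + sigma_v * np.sqrt(vp * dt) * z2
    payoff = np.maximum(S - K, 0.0)
    return np.exp(-r * T) * payoff.mean()
```

With a nearly constant variance (very small sigma_v), the simulated price recovers the Black & Scholes benchmark, which is the sanity check the paper's comparison rests on.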