22 results for "Two section"

in Helda - Digital Repository of University of Helsinki


Relevance: 30.00%

Abstract:

At the Tevatron, the total p̅p cross-section has been measured by CDF at 546 GeV and 1.8 TeV, and by E710/E811 at 1.8 TeV. The two results at 1.8 TeV disagree by 2.6 standard deviations, introducing large uncertainties into extrapolations to higher energies. At the LHC, the TOTEM collaboration is preparing to resolve the ambiguity by measuring the total pp cross-section with a precision of about 1%. As at the Tevatron experiments, the luminosity-independent method based on the Optical Theorem will be used. The Tevatron experiments have also performed a vast range of studies of soft and hard diffractive events, partly with antiproton tagging by Roman Pots, partly with rapidity-gap tagging. At the LHC, the combined CMS/TOTEM experiments will carry out their diffractive programme with unprecedented rapidity coverage and Roman Pot spectrometers on both sides of the interaction point. The physics menu comprises detailed studies of soft diffractive differential cross-sections, diffractive structure functions, rapidity-gap survival and exclusive central production by Double Pomeron Exchange.
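The luminosity-independent method mentioned above rests on two standard relations, sketched here in the usual notation (ρ is the ratio of the real to the imaginary part of the forward elastic amplitude, N_el and N_inel the elastic and inelastic event rates):

```latex
% Rate normalisation and the optical theorem:
\mathcal{L}\,\sigma_{\mathrm{tot}} = N_{\mathrm{el}} + N_{\mathrm{inel}},
\qquad
\mathcal{L}\,\sigma_{\mathrm{tot}}^{2}\,\frac{1+\rho^{2}}{16\pi}
  = \left.\frac{dN_{\mathrm{el}}}{dt}\right|_{t=0}
% Dividing the second relation by the first eliminates the luminosity:
\sigma_{\mathrm{tot}}
  = \frac{16\pi}{1+\rho^{2}}\,
    \frac{\left.dN_{\mathrm{el}}/dt\right|_{t=0}}{N_{\mathrm{el}}+N_{\mathrm{inel}}}
```

The measured quantities are thus only the elastic and inelastic rates and the slope of the elastic rate at t = 0; ρ must be taken from dispersion-relation fits.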

Relevance: 30.00%

Abstract:

We present a measurement of the $WW+WZ$ production cross section observed in a final state consisting of an identified electron or muon, two jets, and missing transverse energy. The measurement is carried out in a data sample corresponding to up to 4.6 fb$^{-1}$ of integrated luminosity at $\sqrt{s} = 1.96$ TeV collected by the CDF II detector. Matrix element calculations are used to separate the diboson signal from the large backgrounds. The $WW+WZ$ cross section is measured to be $17.4\pm3.3$ pb, in agreement with standard model predictions. A fit to the dijet invariant mass spectrum yields a compatible cross section measurement.

Relevance: 30.00%

Abstract:

We present a search for a Higgs boson decaying to two W bosons in ppbar collisions at sqrt(s)=1.96 TeV center-of-mass energy. The data sample corresponds to an integrated luminosity of 3.0 fb-1 collected with the CDF II detector. We find no evidence for production of a Higgs boson with mass between 110 and 200 GeV/c^2, and determine upper limits on the production cross section. For the mass of 160 GeV/c^2, where the analysis is most sensitive, the observed (expected) limit is 0.7 pb (0.9 pb) at 95% Bayesian credibility level, which is 1.7 (2.2) times the standard model cross section.
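A Bayesian upper limit of the kind quoted above can be illustrated with a single counting experiment: with a flat prior on the signal mean s and a fixed expected background, the 95% credibility limit is the value of s below which 95% of the posterior lies. This is only a minimal sketch (the function name and inputs are illustrative; the actual analysis combines many channels and nuisance parameters):

```python
import math

def poisson_pmf(n, mu):
    # Poisson probability of n counts given mean mu, via log-space for stability.
    if mu == 0.0:
        return 1.0 if n == 0 else 0.0
    return math.exp(n * math.log(mu) - mu - math.lgamma(n + 1))

def bayes_upper_limit(n_obs, bkg, cl=0.95, s_max=50.0, steps=100000):
    """Upper limit on a Poisson signal mean s at credibility cl,
    assuming a flat prior on s >= 0 and a known expected background bkg.
    Posterior p(s) is proportional to Poisson(n_obs; s + bkg);
    integrate it on a midpoint grid until the fraction cl is reached."""
    ds = s_max / steps
    grid = [(i + 0.5) * ds for i in range(steps)]
    post = [poisson_pmf(n_obs, s + bkg) for s in grid]
    norm = sum(post) * ds
    acc = 0.0
    for s, p in zip(grid, post):
        acc += p * ds / norm
        if acc >= cl:
            return s
    return s_max
```

With zero observed events and zero background the posterior is e^(-s), so the 95% limit reduces to the familiar -ln(0.05) ≈ 3 events, a useful sanity check for the numerical integration.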

Relevance: 30.00%

Abstract:

We report two complementary measurements of the WW+WZ cross section in the final state consisting of an electron or muon, missing transverse energy, and jets, performed using pp̅ collision data at √s = 1.96 TeV collected by the CDF II detector. The first method uses the dijet invariant mass distribution, while the second, more sensitive method uses matrix-element calculations. The result from the second method has a signal significance of 5.4σ and is the first observation of WW+WZ production using this signature. Combining the results gives σ(WW+WZ) = 16.0 ± 3.3 pb, in agreement with the standard model prediction.

Relevance: 30.00%

Abstract:

A measurement of the tt̅ production cross section in pp̅ collisions at √s = 1.96 TeV using events with two leptons, missing transverse energy, and jets is reported. The data were collected with the CDF II Detector. The result in a data sample corresponding to an integrated luminosity of 2.8 fb-1 is σtt̅ = 6.27 ± 0.73 (stat) ± 0.63 (syst) ± 0.39 (lum) pb for an assumed top mass of 175 GeV/c².

Relevance: 30.00%

Abstract:

We report a measurement of the ratio of the tt̅ to Z/γ* production cross sections in √s = 1.96 TeV pp̅ collisions using data corresponding to an integrated luminosity of up to 4.6 fb-1, collected by the CDF II detector. The tt̅ cross section ratio is measured using two complementary methods, a b-jet tagging measurement and a topological approach. By multiplying the ratios by the well-known theoretical Z/γ*→ll cross section predicted by the standard model, the extracted tt̅ cross sections are effectively insensitive to the uncertainty on luminosity. A best linear unbiased estimate is used to combine both measurements, with the result σtt̅ = 7.70 ± 0.52 pb for a top-quark mass of 172.5 GeV/c².
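The best linear unbiased estimate (BLUE) used in the combination above has a compact closed form: for a measurement vector v with covariance matrix Σ, the weights are Σ⁻¹1 / (1ᵀΣ⁻¹1). A minimal sketch (the numerical inputs below are illustrative, not the published measurements or their covariance):

```python
import numpy as np

def blue_combine(values, cov):
    """Best linear unbiased estimate of several correlated measurements
    of the same quantity: the linear combination with weights summing
    to one that minimises the variance of the result."""
    values = np.asarray(values, dtype=float)
    cov = np.asarray(cov, dtype=float)
    ones = np.ones(len(values))
    cov_inv = np.linalg.inv(cov)
    weights = cov_inv @ ones / (ones @ cov_inv @ ones)
    combined = weights @ values
    variance = 1.0 / (ones @ cov_inv @ ones)
    return combined, np.sqrt(variance)

# Illustrative only: two cross-section values in pb with a partially
# correlated systematic uncertainty.
vals = [7.8, 7.6]
cov = [[0.6**2, 0.5 * 0.6 * 0.7],
       [0.5 * 0.6 * 0.7, 0.7**2]]
est, err = blue_combine(vals, cov)
```

For uncorrelated inputs this reduces to the usual inverse-variance weighted average; correlations shift weight toward the more precise measurement and can even make a weight negative.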

Relevance: 30.00%

Abstract:

Using data from 2.9 fb-1 of integrated luminosity collected with the CDF II detector at the Tevatron, we search for resonances decaying into a pair of on-shell gauge bosons, WW or WZ, where one W decays into an electron and a neutrino, and the other boson decays into two jets. We observe no statistically significant excess above the expected standard model background, and we set cross section limits at 95% confidence level on G* (Randall-Sundrum graviton), Z′, and W′ bosons. By comparing these limits to theoretical cross sections, mass exclusion regions for the three particles are derived. The mass exclusion regions for Z′ and W′ are further evaluated as a function of their gauge coupling strength.




Relevance: 30.00%

Abstract:

One of the most fundamental and widely accepted ideas in finance is that investors are compensated through higher returns for taking on non-diversifiable risk. Hence the quantification, modeling and prediction of risk have been, and still are, among the most prolific research areas in financial economics. It was recognized early on that there are predictable patterns in the variance of speculative prices. Later research has shown that there may also be systematic variation in the skewness and kurtosis of financial returns. Lacking in the literature so far is an out-of-sample forecast evaluation of the potential benefits of these new, more complicated models with time-varying higher moments. Such an evaluation is the topic of this dissertation. Essay 1 investigates the forecast performance of the GARCH(1,1) model when estimated with 9 different error distributions on Standard and Poor's 500 Index futures returns. By utilizing the theory of realized variance to construct an appropriate ex post measure of variance from intra-day data, it is shown that allowing for a leptokurtic error distribution leads to significant improvements in variance forecasts compared to using the normal distribution. This result holds for daily, weekly as well as monthly forecast horizons. It is also found that allowing for skewness and time variation in the higher moments of the distribution does not further improve forecasts. In Essay 2, using 20 years of daily Standard and Poor's 500 index returns, it is found that density forecasts are much improved by allowing for constant excess kurtosis but not by allowing for skewness. Allowing the kurtosis and skewness to be time-varying does not further improve the density forecasts but on the contrary makes them slightly worse. In Essay 3 a new model incorporating conditional variance, skewness and kurtosis based on the Normal Inverse Gaussian (NIG) distribution is proposed.
The new model and two previously used NIG models are evaluated by their Value at Risk (VaR) forecasts on a long series of daily Standard and Poor's 500 returns. The results show that only the new model produces satisfactory VaR forecasts at both the 1% and 5% levels. Taken together, the results of the thesis show that kurtosis appears not to exhibit predictable time variation, whereas some predictability is found in the skewness. However, the dynamic properties of the skewness are not completely captured by any of the models.
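The GARCH(1,1) variance recursion evaluated in Essay 1 is σ²(t+1) = ω + α·r(t)² + β·σ²(t). A minimal sketch of the one-step-ahead variance path, assuming the parameters have already been estimated elsewhere (e.g. by maximum likelihood); names and defaults are illustrative:

```python
def garch11_forecast(returns, omega, alpha, beta, var0=None):
    """One-step-ahead conditional variance path of a GARCH(1,1):
    sigma2[t+1] = omega + alpha * r[t]**2 + beta * sigma2[t].
    Returns a list where sigma2[t] is the forecast for period t."""
    if var0 is None:
        # Common choice of seed: the sample variance of the returns.
        var0 = sum(r * r for r in returns) / len(returns)
    sigma2 = [var0]
    for r in returns:
        sigma2.append(omega + alpha * r * r + beta * sigma2[-1])
    return sigma2
```

When α + β < 1 the process is covariance stationary with unconditional variance ω / (1 − α − β), which is the level long-horizon forecasts revert to.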

Relevance: 30.00%

Abstract:

In this thesis we deal with the concept of risk. The objective is to bring together, and draw conclusions from, some normative information regarding quantitative portfolio management and risk assessment. The first essay concentrates on return dependency. We propose an algorithm for classifying markets into rising and falling. Given the algorithm, we derive a statistic, the Trend Switch Probability, for detecting long-term return dependency in the first moment. The empirical results suggest that the Trend Switch Probability is robust over various volatility specifications. The serial dependency in bear and bull markets behaves differently, however: it is strongly positive in rising markets, whereas in bear markets it is closer to a random walk. Realized volatility, a technique for estimating volatility from high-frequency data, is investigated in essays two and three. In the second essay we find, when measuring realized variance on a set of German stocks, that the second-moment dependency structure is highly unstable and changes randomly. Results also suggest that volatility is non-stationary from time to time. In the third essay we examine the impact of market microstructure on the error between estimated realized volatility and the volatility of the underlying process. With simulation-based techniques we show that autocorrelation in returns leads to biased variance estimates, and that lower sampling frequency and non-constant volatility increase the error variation between the estimated variance and the variance of the underlying process. From these essays we can conclude that volatility is not easily estimated, even from high-frequency data. It is neither very well behaved in terms of stability nor in terms of dependency over time. Based on these observations, we would recommend the use of simple, transparent methods that are likely to be more robust over differing volatility regimes than models with a complex parameter universe.
In analyzing long-term return dependency in the first moment we find that the Trend Switch Probability is a robust estimator. This is an interesting area for further research, with important implications for active asset allocation.
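The realized-variance estimator studied in essays two and three is simply the sum of squared intraday returns over a trading day, a model-free estimate of that day's integrated variance. A minimal sketch (the interval count and per-interval volatility below are illustrative):

```python
import random

def realized_variance(intraday_returns):
    """Realized variance: the sum of squared intraday returns over one
    trading day. Unbiased for the integrated variance when returns are
    uncorrelated; microstructure-induced autocorrelation biases it."""
    return sum(r * r for r in intraday_returns)

# Illustration: with i.i.d. returns of per-interval std 0.01 over 390
# one-minute intervals, the estimate centres on 390 * 0.01**2 = 0.039.
rng = random.Random(7)
rets = [rng.gauss(0.0, 0.01) for _ in range(390)]
rv = realized_variance(rets)
```

The third essay's point can be seen directly from this formula: serially correlated returns add cross-product terms to the expectation of the sum, so the estimator no longer centres on the true integrated variance.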

Relevance: 30.00%

Abstract:

During the last few decades there have been far-reaching financial market deregulation, technical development, advances in information technology, and standardization of legislation between countries. As a result, one can expect that financial markets have grown more interlinked. A proper understanding of cross-market linkages has implications for investment and risk management, diversification, asset pricing, and regulation. The purpose of this research is to assess the degree of price, return, and volatility linkages both between geographic markets and between asset categories within one country, Finland. Another purpose is to analyze risk asymmetries, i.e., the tendency of equity risk to be higher after negative events than after positive events of equal magnitude. The analysis is conducted with respect both to total risk (volatility) and to systematic risk (beta). The thesis consists of an introductory part and four essays. The first essay studies the extent to which international stock prices comove. The degree of comovement is low, indicating benefits from international diversification. The second essay examines the degree to which the Finnish market is linked to the “world market”. The total risk is divided into two parts, one relating to world factors and one relating to domestic factors. The impact of world factors has increased over time. After 1993, when foreign investors were allowed to invest freely in Finnish assets, the risk level has been higher than before; this was also the case during the economic recession in the beginning of the 1990s. The third essay focuses on the stock, bond, and money markets in Finland. According to a trading model, the degree of volatility linkages should be strong. However, the results contradict this: the linkages are surprisingly weak, even negative. The stock market is the most independent, while the money market is affected by events in the two other markets.
The fourth essay concentrates on volatility and beta asymmetries. Contrary to many international studies, there are only a few cases of risk asymmetries. When they occur, they tend to be driven by the market-wide component rather than the portfolio-specific element.

Relevance: 30.00%

Abstract:

First, in Essay 1, we test whether it is possible to forecast Finnish Options Index return volatility by examining the out-of-sample predictive ability of several common volatility models with alternative well-known methods, and find additional evidence for the predictability of volatility and for the superiority of the more complicated models over the simpler ones. Secondly, in Essay 2, the aggregated volatility of stocks listed on the Helsinki Stock Exchange is decomposed into market-, industry- and firm-level components, and it is found that firm-level (i.e., idiosyncratic) volatility has increased over time, is more substantial than the former two, predicts GDP growth, moves countercyclically and, like the other components, is persistent. Thirdly, in Essay 3, we are among the first in the literature to seek firm-specific determinants of idiosyncratic volatility in a multivariate setting, and find for the cross-section of stocks listed on the Helsinki Stock Exchange that industrial focus, trading volume, and block ownership are positively associated with idiosyncratic volatility estimates (obtained from both the CAPM and the Fama and French three-factor model with local and international benchmark portfolios), whereas both firm age and firm size are negatively related to idiosyncratic volatility.
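In its simplest CAPM form, the idiosyncratic volatility estimated in Essay 3 is the standard deviation of the residuals from a regression of stock returns on market returns. A minimal one-factor OLS sketch (not the exact multi-factor procedure of the essay; function and variable names are illustrative):

```python
import statistics

def idiosyncratic_vol(stock, market):
    """Idiosyncratic volatility under a one-factor model: fit
    stock = alpha + beta * market by OLS, then return the (population)
    standard deviation of the residuals."""
    n = len(stock)
    mbar = sum(market) / n
    sbar = sum(stock) / n
    beta = sum((m - mbar) * (s - sbar) for m, s in zip(market, stock)) / \
           sum((m - mbar) ** 2 for m in market)
    alpha = sbar - beta * mbar
    resid = [s - alpha - beta * m for s, m in zip(stock, market)]
    return statistics.pstdev(resid)
```

A stock whose returns are fully explained by the market factor has idiosyncratic volatility zero; any return variation orthogonal to the factor raises the estimate.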

Relevance: 30.00%

Abstract:

In the thesis we consider inference for cointegration in vector autoregressive (VAR) models. The thesis consists of an introduction and four papers. The first paper proposes a new test for cointegration in VAR models that is directly based on the eigenvalues of the least squares (LS) estimate of the autoregressive matrix. In the second paper we compare a small-sample correction for the likelihood ratio (LR) test of cointegrating rank with the bootstrap. The simulation experiments show that the bootstrap works very well in practice and dominates the correction factor. The tests are applied to international stock price data, and the finite-sample performance of the tests is investigated by simulating the data. The third paper studies the demand for money in Sweden 1970–2000 using the I(2) model. In the fourth paper we re-examine the evidence of cointegration between international stock prices. The paper shows that some of the previous empirical results can be explained by the small-sample bias and size distortion of Johansen's LR tests for cointegration. In all papers we work with two data sets. The first is a Swedish money demand data set with observations on the money stock, the consumer price index, gross domestic product (GDP), the short-term interest rate and the long-term interest rate. The data are quarterly and the sample period is 1970(1)–2000(1). The second data set consists of month-end stock market index observations for Finland, France, Germany, Sweden, the United Kingdom and the United States from 1980(1) to 1997(2). Both data sets are typical of the sample sizes encountered in economic data, and the applications illustrate the usefulness of the models and tests discussed in the thesis.
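The eigenvalue-based test in the first paper starts from the LS estimate of the autoregressive matrix. A minimal sketch of that estimation step for a VAR(1), x_t = A x_{t-1} + e_t (the test statistic built on these eigenvalues is not reproduced here):

```python
import numpy as np

def var1_eigenvalues(X):
    """Least-squares estimate of A in the VAR(1) x_t = A x_{t-1} + e_t,
    returning the moduli of its eigenvalues.  X has shape (T, k): one
    row per observation.  Eigenvalue moduli close to one signal unit
    roots, which is what cointegration-rank inference is about."""
    Y = X[1:].T          # regressand, shape (k, T-1)
    Z = X[:-1].T         # lagged regressor, shape (k, T-1)
    A_hat = Y @ Z.T @ np.linalg.inv(Z @ Z.T)
    return np.abs(np.linalg.eigvals(A_hat))
```

For a stationary VAR all moduli lie strictly inside the unit circle; eigenvalues on or near the unit circle correspond to the common stochastic trends whose number the cointegrating-rank tests pin down.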