964 results for sticky-price DSGE models
Abstract:
We present the results of a search for Higgs bosons predicted in two-Higgs-doublet models, in the case where the Higgs bosons decay to tau lepton pairs, using 1.8 fb^-1 of integrated luminosity from proton-antiproton collisions recorded by the CDF II experiment at the Fermilab Tevatron. In the observed mass distribution of events where one or both tau leptons decay leptonically, no evidence for a Higgs boson signal is found. The result is used to infer exclusion limits in the two-dimensional parameter space of tan beta versus m(A).
Abstract:
This paper develops a model for military conflicts in which the defending forces must determine an optimal partitioning of available resources to counter attacks from an adversary on two different fronts. The Lanchester attrition model is used to derive the dynamical equations governing the variation in force strength. Three allocation schemes are considered: Time-Zero-Allocation (TZA), Allocate-Assess-Reallocate (AAR), and Continuous Constant Allocation (CCA); the optimal solution is obtained in each case. Numerical examples are given to support the analytical results.
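The Lanchester attrition dynamics mentioned above can be sketched as a pair of coupled ordinary differential equations. The following toy Euler integration is illustrative only; the coefficients, force sizes, and step size are assumptions, not values from the paper:

```python
def lanchester_step(x, y, a, b, dt):
    """One Euler step of the aimed-fire (square-law) Lanchester model:
    dx/dt = -a*y, dy/dt = -b*x, with attrition coefficients a and b."""
    return x - a * y * dt, y - b * x * dt

def simulate(x0, y0, a, b, dt=0.01, steps=10000):
    """Integrate until one force is annihilated or time runs out."""
    x, y = x0, y0
    for _ in range(steps):
        x, y = lanchester_step(x, y, a, b, dt)
        if x <= 0 or y <= 0:
            break
    return max(x, 0.0), max(y, 0.0)
```

Under the square law the quantity b*x^2 - a*y^2 is conserved, so with equal coefficients a force of 100 against 50 survives with roughly sqrt(100^2 - 50^2) ≈ 86.6 units.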
Abstract:
Suitable pin-to-hole interference can significantly increase the fatigue life of a pin joint. In practical design, the initial stresses due to interference are high and they are proportional to the effective interference. In experimental studies on such joints, difficulties have been experienced in estimating the interference accurately from physical measurements of pin and hole diameters. A simple photoelastic method has been developed to determine the effective interference to a high degree of accuracy. This paper presents the method and reports illustrative data from a successful application thereof.
Abstract:
We combine results from searches by the CDF and D0 collaborations for a standard model Higgs boson (H) in the process gg->H->W+W- in p-pbar collisions at the Fermilab Tevatron Collider at sqrt(s) = 1.96 TeV. With 4.8 fb^-1 of integrated luminosity analyzed at CDF and 5.4 fb^-1 at D0, the 95% confidence level upper limit on sigma(gg->H) x B(H->W+W-) is 1.75 pb at m_H = 120 GeV, 0.38 pb at m_H = 165 GeV, and 0.83 pb at m_H = 200 GeV. Assuming the presence of a fourth sequential generation of fermions with large masses, we exclude at the 95% confidence level a standard-model-like Higgs boson with a mass between 131 and 204 GeV.
Abstract:
This paper studies the ultrasonic wave dispersion characteristics of a nanorod. Nonlocal strain gradient models (both second and fourth order) are introduced to analyze the ultrasonic wave behavior in the nanorod. Explicit expressions are derived for the wavenumbers and wave speeds of the nanorod. The analysis shows that the fourth-order strain gradient model yields better approximations than the second-order strain gradient model for dynamic analysis. The second-order strain gradient model gives a critical wavenumber at a certain wave frequency, where the wave speed drops to zero. A relation among the number of waves along the nanorod, the nonlocal scaling parameter (e0a), and the length of the nanorod is obtained from the nonlocal second-order strain gradient model. The ultrasonic wave characteristics obtained from the nonlocal strain gradient models are compared with the classical continuum model. The dynamic response behavior of nanorods is explained with both strain gradient models, and the effect of e0a on the ultrasonic wave behavior of the nanorods is also examined. (C) 2010 American Institute of Physics.
Abstract:
We study effective models of chiral fields and the Polyakov loop, expected to describe the dynamics responsible for the phase structure of two-flavor QCD at finite temperature and density. We describe the chiral sector using either the linear sigma model or the Nambu-Jona-Lasinio model, study the phase diagram, and determine the location of the critical point as a function of the explicit chiral symmetry breaking (i.e. the bare quark mass $m_q$). We also discuss the possible emergence of the quarkyonic phase in this model.
Abstract:
The question at issue in this dissertation is the epistemic role played by ecological generalizations and models. I investigate and analyze such properties of generalizations as lawlikeness, invariance, and stability, and I ask which of these properties are relevant in the context of scientific explanations. I claim that there are generalizable and reliable causal explanations in ecology based on generalizations that are invariant and stable. An invariant generalization continues to hold under a special kind of change, called an intervention, that alters the values of its variables. Whether a generalization remains invariant under such interventions is the criterion that determines whether it is explanatory; a generalization can be invariant and explanatory regardless of its lawlike status. Stability concerns the degree to which a generalization continues to hold across possible background conditions: the more stable a generalization, the less its truth depends on background conditions. Although it is invariance rather than stability that furnishes us with explanatory generalizations, stability has an important function in this context: it underwrites the extrapolability and reliability of scientific explanations. I also discuss non-empirical investigations of models, which I call robustness and sensitivity analyses. Sensitivity analyses are investigations in which a single model is studied with regard to its stability conditions by varying the values of the model's parameters. As a general definition of robustness analyses I propose investigations of variations in the modeling assumptions of different models of the same phenomenon, in which the focus is on whether they produce similar or convergent results.
Robustness and sensitivity analyses are powerful tools for studying the conditions and assumptions under which models break down, and they are especially powerful in pointing out why they do so: they show which conditions or assumptions the results of models depend on. Key words: ecology, generalizations, invariance, lawlikeness, philosophy of science, robustness, explanation, models, stability
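The sensitivity analysis described above, re-evaluating a model while perturbing one parameter at a time, can be sketched with a toy model; the logistic-growth example and all values here are illustrative assumptions, not material from the dissertation:

```python
def logistic_equilibrium(r, K):
    """Stable equilibrium of the logistic model dN/dt = r*N*(1 - N/K), r > 0."""
    return K

def sensitivity(model, base_params, param, rel_changes):
    """Re-evaluate `model` while perturbing one parameter by relative amounts."""
    results = {}
    for d in rel_changes:
        params = dict(base_params)
        params[param] = base_params[param] * (1.0 + d)
        results[d] = model(**params)
    return results
```

Here the equilibrium tracks perturbations of K but is completely insensitive to r, which is exactly the kind of dependence structure such analyses are meant to expose.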
Abstract:
Summary: Reduction of winter discharges with the aid of watershed models
Abstract:
We examine the stability of hadron resonance gas models when extended to include undiscovered resonances through the Hagedorn formula. We find that the influence of unknown resonances on thermodynamics is large but bounded. We model the decays of resonances and investigate the ratios of particle yields in heavy-ion collisions. We find that observables such as the hydrodynamic response and hadron yield ratios change little upon extending the model. As a result, heavy-ion collisions at RHIC and the LHC are insensitive to a possible exponential rise in the hadronic density of states, which increases the stability of the predictions of hadron resonance gas models in this context. Hadron resonance gases are internally consistent up to a temperature higher than the crossover temperature in QCD, but by examining quark number susceptibilities we find that their region of applicability ends below the QCD crossover.
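The Hagedorn formula implies a density of states growing exponentially in mass, rho(m) ~ m^(-a) * exp(m/T_H). A minimal numerical sketch (all parameter values below are assumptions, not the paper's fit) shows that the Boltzmann-weighted spectrum stays bounded for T < T_H and grows rapidly as T approaches T_H:

```python
import math

def hagedorn_weight(T, T_H=0.18, a=2.5, m_min=2.0, m_max=20.0, n=2000):
    """Trapezoid estimate of the integral over m of m^-a * exp(m/T_H - m/T),
    a toy stand-in for the Boltzmann-weighted Hagedorn spectrum (GeV units)."""
    dm = (m_max - m_min) / n
    total = 0.0
    for i in range(n + 1):
        m = m_min + i * dm
        w = 0.5 if i in (0, n) else 1.0  # trapezoid endpoint weights
        total += w * m ** (-a) * math.exp(m / T_H - m / T) * dm
    return total
```

For T below T_H the net exponent m*(1/T_H - 1/T) is negative, so the integrand decays and the contribution of heavy states is large but bounded, consistent with the finding above.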
Abstract:
The increased availability of high-frequency data sets has led to important new insights into the understanding of financial markets. The use of high-frequency data is interesting and persuasive, since it can reveal information that cannot be seen at lower levels of data aggregation. This dissertation explores some of the many important issues connected with the use, analysis, and application of high-frequency data. These include the effects of intraday seasonality, the behaviour of time-varying volatility, the information content of various market data, and the issue of inter-market linkages, utilizing high-frequency 5-minute observations from major European and U.S. stock indices, namely the DAX30 of Germany, CAC40 of France, SMI of Switzerland, FTSE100 of the UK, and S&P500 of the U.S. The first essay in the dissertation shows that there are remarkable similarities in the intraday behaviour of conditional volatility across European equity markets. Moreover, U.S. macroeconomic news announcements have a significant cross-border effect on both European equity returns and volatilities. The second essay reports substantial intraday return and volatility linkages across the stock indices of the UK and Germany. This relationship appears virtually unchanged by the presence or absence of the U.S. stock market. However, the return correlation between the UK and German markets rises significantly following the U.S. stock market opening, which can largely be described as a contemporaneous effect. The third essay sheds light on market microstructure issues in which traders and market makers learn from watching market data, and it is this learning process that leads to price adjustments. This study concludes that trading volume plays an important role in explaining international return and volatility transmission. The examination of asymmetry reveals that positive volume changes have a larger impact on foreign stock market volatility than negative changes.
The fourth and final essay documents a number of regularities in the patterns of intraday return volatility, trading volume, and bid-ask spreads. It also reports a contemporaneous and positive relationship between intraday return volatility, bid-ask spreads, and unexpected trading volume. These results confirm the role of trading volume and bid-ask quotes as proxies for information arrival in producing contemporaneous and subsequent intraday return volatility. Moreover, an asymmetric effect of trading volume on conditional volatility is also confirmed. Overall, this dissertation explores the role of information in explaining intraday return and volatility dynamics in international stock markets. The process through which information is incorporated into stock prices is central to all information-based models. Intraday data facilitate the investigation of how information is incorporated into security prices as a result of the trading behavior of informed and uninformed traders. High-frequency data thus appear critical to enhancing our understanding of the intraday behavior of stock market variables, with important implications for market participants, regulators, and academic researchers.
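Intraday seasonality of volatility of the kind studied in the first essay is typically extracted by averaging squared (or absolute) returns within each time-of-day bin. A minimal sketch, with the bin size and sample data as illustrative assumptions rather than the dissertation's:

```python
def intraday_seasonality(obs, bin_minutes=5):
    """Average squared returns by time-of-day bin.
    `obs` is a list of (minutes_since_open, log_return) pairs."""
    sums, counts = {}, {}
    for t, r in obs:
        b = int(t // bin_minutes)          # index of the 5-minute bin
        sums[b] = sums.get(b, 0.0) + r * r
        counts[b] = counts.get(b, 0) + 1
    return {b: sums[b] / counts[b] for b in sums}
```

Plotting the resulting bin averages over the trading day is what reveals the U-shaped intraday volatility pattern common across equity markets.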
Abstract:
Modeling and forecasting implied volatility (IV) is important to both practitioners and academics, especially in trading, pricing, hedging, and risk management activities, all of which require an accurate volatility estimate. This has become challenging since the 1987 stock market crash, as implied volatilities (IVs) recovered from stock index options present two patterns, the volatility smirk (skew) and the volatility term structure, which, when examined together, form a rich implied volatility surface (IVS). This implies that the assumptions behind the Black-Scholes (1973) model do not hold empirically, as asset prices are influenced by many underlying risk factors. This thesis, consisting of four essays, models and forecasts implied volatility in the presence of these empirical regularities of options markets. The first essay models the dynamics of the IVS, extending the Dumas, Fleming and Whaley (DFW) (1998) framework: using moneyness in the implied forward price and OTM put-call options on the FTSE100 index, a nonlinear optimization is used to estimate different models and thereby produce rich, smooth IVSs. Here, the constant-volatility model fails to explain the variation in the rich IVS. It is then found that three factors can explain about 69-88% of the variance in the IVS; of this, on average, 56% is explained by the level factor, 15% by the term-structure factor, and a further 7% by the jump-fear factor. The second essay proposes a quantile regression model for the contemporaneous asymmetric return-volatility relationship, a generalization of the Hibbert et al. (2008) model. The results show a strong negative asymmetric return-volatility relationship at various quantiles of the IV distribution, monotonically increasing when moving from the median quantile to the uppermost quantile (i.e., 95%); OLS therefore underestimates this relationship at upper quantiles.
Additionally, the asymmetric relationship is more pronounced for the smirk- (skew-) adjusted volatility index measure than for the old volatility index measure. The volatility indices are ranked in terms of asymmetric volatility as follows: VIX, VSTOXX, VDAX, and VXN. The third essay examines the information content of the new VDAX volatility index for forecasting daily Value-at-Risk (VaR) estimates and compares its VaR forecasts with those of Filtered Historical Simulation and RiskMetrics. All daily VaR models are backtested over 1992-2009 using unconditional coverage, independence, conditional coverage, and quadratic-score tests. It is found that the VDAX subsumes almost all the information required for daily VaR forecasts for a portfolio of the DAX30 index, and implied-VaR models outperform all other VaR models. The fourth essay models the risk factors driving swaption IVs. It is found that three factors can explain 94-97% of the variation in each of the EUR, USD, and GBP swaption IVs. There are significant linkages across factors, and bi-directional causality is at work between the factors implied by EUR and USD swaption IVs. Furthermore, the factors implied by EUR and USD IVs respond to each other's shocks; surprisingly, however, GBP does not affect them. Finally, the string market model calibration results show that it can efficiently reproduce (or forecast) the volatility surface for each of the swaption markets.
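Quantile regression of the kind used in the second essay rests on minimizing the pinball (check) loss. A minimal sketch of the constant-only case follows; the data and grid are illustrative, not the essay's specification, and a full quantile regression would replace the constant with a linear function of the regressors (e.g. returns):

```python
def pinball_loss(residual, tau):
    """Check loss: residuals above the fit are weighted tau, below by 1 - tau."""
    return tau * residual if residual >= 0.0 else (tau - 1.0) * residual

def best_constant(ys, tau, grid):
    """The grid point minimizing average pinball loss approximates
    the tau-quantile of the sample ys."""
    return min(grid, key=lambda q: sum(pinball_loss(y - q, tau) for y in ys))
```

Fitting at several tau values (median, 0.95, ...) is what lets the essay compare the return-volatility slope across the IV distribution rather than only at its mean.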
Abstract:
Market microstructure is “the study of the trading mechanisms used for financial securities” (Hasbrouck (2007)). It seeks to understand the sources of value and reasons for trade, in a setting with different types of traders, and different private and public information sets. The actual mechanisms of trade are a continually changing object of study. These include continuous markets, auctions, limit order books, dealer markets, or combinations of these operating as a hybrid market. Microstructure also has to allow for the possibility of multiple prices. At any given time an investor may be faced with a multitude of different prices, depending on whether he or she is buying or selling, the quantity he or she wishes to trade, and the required speed for the trade. The price may also depend on the relationship that the trader has with potential counterparties. In this research, I touch upon all of the above issues. I do this by studying three specific areas, all of which have both practical and policy implications. First, I study the role of information in trading and pricing securities in markets with a heterogeneous population of traders, some of whom are informed and some not, and who trade for different private or public reasons. Second, I study the price discovery of stocks in a setting where they are simultaneously traded in more than one market. Third, I make a contribution to the ongoing discussion about market design, i.e. the question of which trading systems and ways of organizing trading are most efficient. A common characteristic throughout my thesis is the use of high frequency datasets, i.e. tick data. These datasets include all trades and quotes in a given security, rather than just the daily closing prices, as in traditional asset pricing literature. This thesis consists of four separate essays. In the first essay I study price discovery for European companies cross-listed in the United States. 
I also study explanatory variables for differences in price discovery. In my second essay I contribute to earlier research on two issues of broad interest in market microstructure: market transparency and informed trading. I examine the effects of a change to an anonymous market at the OMX Helsinki Stock Exchange. I broaden my focus slightly in the third essay, to include releases of macroeconomic data in the United States. I analyze the effect of these releases on European cross-listed stocks. The fourth and last essay examines the uses of standard methodologies of price discovery analysis in a novel way. Specifically, I study price discovery within one market, between local and foreign traders.
Abstract:
In this article, the problem of two Unmanned Aerial Vehicles (UAVs) cooperatively searching an unknown region is addressed. The search region is discretized into hexagonal cells, each assumed to possess an uncertainty value. The UAVs must cooperatively search these cells while taking limited endurance, sensor, and communication range constraints into account. Due to limited endurance, the UAVs need to return to a base station for refuelling, and must also select a base station when multiple base stations are present. This article proposes a route planning algorithm that takes endurance time constraints into account and uses game-theoretic strategies to reduce the uncertainty. The route planning algorithm selects only those cells that ensure the agent can return to any one of the available bases. A set of paths is formed from these cells, from which the game-theoretic strategies select a path that yields maximum uncertainty reduction. We explore non-cooperative Nash, cooperative, and security strategies from game theory to enhance search effectiveness. Monte Carlo simulations show the superiority of the game-theoretic strategies over a greedy strategy for paths of different look-ahead step lengths. Among the game-theoretic strategies, the non-cooperative Nash and cooperative strategies perform similarly in the ideal case, but the Nash strategy performs better than the cooperative strategy when the perceived information differs. We also propose a heuristic based on partitioning the search space into sectors, which reduces computational overhead without degrading performance.
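The greedy baseline that the game-theoretic strategies are compared against can be sketched on an axial-coordinate hexagonal grid. The layout, uncertainty values, and full-reduction-on-visit rule below are illustrative assumptions, not the article's algorithm:

```python
# The six neighbor offsets of a hexagonal cell in axial (q, r) coordinates.
HEX_DIRS = [(1, 0), (1, -1), (0, -1), (-1, 0), (-1, 1), (0, 1)]

def greedy_search(uncertainty, start, steps):
    """One agent repeatedly moves to the neighboring cell with the highest
    uncertainty and searches it. `uncertainty` maps (q, r) -> value."""
    pos, total_reduced = start, 0.0
    for _ in range(steps):
        neighbors = [(pos[0] + dq, pos[1] + dr) for dq, dr in HEX_DIRS]
        candidates = [c for c in neighbors if c in uncertainty]
        if not candidates:
            break
        pos = max(candidates, key=lambda c: uncertainty[c])
        total_reduced += uncertainty[pos]
        uncertainty[pos] = 0.0  # cell treated as fully searched
    return pos, total_reduced
```

The article's strategies improve on this one-step myopia by evaluating multi-step look-ahead paths and anticipating the other agent's choices.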
Abstract:
The purpose of this thesis is to examine the role of trade durations in price discovery. One motivation for using trade durations in the study of price discovery is that durations are robust to many microstructure effects that bias the measurement of return volatility. Another is that it is difficult to think of economic variables that are genuinely useful in determining the source of volatility at arbitrarily high frequencies. The dissertation contains three essays. In the first essay, the role of trade durations in price discovery is examined with respect to the volatility pattern of stock returns. The theory of volatility is associated with the theory of the information content of trades, dear to market microstructure theory. The first essay documents that volatility per transaction is related to the intensity of trade, and that there is a strong relationship between the stochastic process of trade durations and trading variables. In the second essay, the role of trade durations in price discovery is examined with respect to the quantification of risk due to a trading volume of a certain size. The theory of volume is intrinsically associated with the stock volatility pattern. The essay documents that volatility generally increases when traders choose to trade with large transactions. In the third essay, the role of trade durations in price discovery is examined with respect to the information content of a trade. The theory of the information content of a trade is associated with the theory of the rate of price revisions in the market. The essay documents that short durations are associated with information; thus, traders are compensated for responding quickly to information.
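The volatility-per-transaction idea can be sketched as a simple grouping computation over tick data; the data format, cutoff, and sample values below are illustrative assumptions, not the thesis's specification:

```python
def volatility_per_transaction(trades, duration_cutoff):
    """Split trades into short- and long-duration groups and compare the
    mean squared return (a crude per-transaction volatility) in each.
    `trades` is a list of (duration_seconds, log_return) pairs."""
    def _mean(xs):
        return sum(xs) / len(xs) if xs else 0.0
    short = [r * r for d, r in trades if d <= duration_cutoff]
    long_ = [r * r for d, r in trades if d > duration_cutoff]
    return _mean(short), _mean(long_)
```

A short-duration group with the higher mean squared return is the signature, in this toy form, of the finding that short durations are associated with information.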