960 results for Variance
Abstract:
Background: Traffic offences have been considered an important predictor of crash involvement and have often been used as a proxy safety variable for crashes. However, the association between crashes and offences has never been meta-analysed and the population effect size never established. Research is yet to determine the extent to which this relationship may be spuriously inflated through systematic measurement error, with obvious implications for researchers endeavouring to accurately identify salient factors predictive of crashes.
Methodology and Principal Findings: Studies yielding a correlation between crashes and traffic offences were collated, and a meta-analysis of 144 effects drawn from 99 road safety studies was conducted. The potential impact of factors such as age, time period, crash and offence rates, crash severity and data type, sourced from either self-report surveys or archival records, was considered and discussed. After weighting for sample size, an average correlation of r = .18 was observed over a mean time period of 3.2 years. Evidence emerged suggesting that the strength of this correlation is decreasing over time. Stronger correlations between crashes and offences were generally found in studies involving younger drivers. Consistent with common method variance effects, a within-country analysis found stronger effect sizes in self-reported data even after controlling for crash mean.
Significance: The effectiveness of traffic offences as a proxy for crashes may be limited. Inclusion of elements such as independently validated crash and offence histories, or accurate measures of exposure to the road, would facilitate a better understanding of the factors that influence crash involvement.
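The sample-size weighting behind the reported average correlation can be sketched in a few lines; the (r, n) pairs below are hypothetical illustrations, not the 144 effects analysed in the study:

```python
# Minimal sketch of a sample-size-weighted mean correlation, as used in
# meta-analyses like the one described above.  The study data below are
# made-up (r, n) pairs, purely for illustration.
def weighted_mean_correlation(effects):
    """effects: list of (r, n) pairs; returns the n-weighted mean r."""
    total_n = sum(n for _, n in effects)
    return sum(r * n for r, n in effects) / total_n

studies = [(0.25, 120), (0.10, 800), (0.30, 60)]  # hypothetical (r, n) pairs
print(round(weighted_mean_correlation(studies), 3))
```

Large studies dominate the weighted mean, which is why a few big archival samples can pull the pooled estimate well below the unweighted average.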
Abstract:
This study examines the properties of Generalised Regression (GREG) estimators for domain class frequencies and proportions. The family of GREG estimators forms the class of design-based model-assisted estimators. All GREG estimators utilise auxiliary information via modelling. The classic GREG estimator with a linear fixed-effects assisting model (GREG-lin) is one example. But when estimating class frequencies, the study variable is binary or polytomous, so logistic-type assisting models (e.g. logistic or probit models) should be preferred over the linear one. However, GREG estimators other than GREG-lin are rarely used, and knowledge about their properties is limited. This study examines the properties of L-GREG estimators, which are GREG estimators with fixed-effects logistic-type models. Three research questions are addressed. First, I study whether and when L-GREG estimators are more accurate than GREG-lin. Theoretical results and Monte Carlo experiments, which cover both equal and unequal probability sampling designs and a wide variety of model formulations, show that in standard situations the difference between L-GREG and GREG-lin is small. But in the case of a strong assisting model, two interesting situations arise: if the domain sample size is reasonably large, L-GREG is more accurate than GREG-lin, and if the domain sample size is very small, estimation of the assisting model parameters may be inaccurate, resulting in bias for L-GREG. Second, I study variance estimation for the L-GREG estimators. The standard variance estimator (S) for all GREG estimators resembles the Sen-Yates-Grundy variance estimator, but it is a double sum of prediction errors, not of the observed values of the study variable. Monte Carlo experiments show that S underestimates the variance of L-GREG, especially if the domain sample size is small or the assisting model is strong.
Third, since the standard variance estimator S often fails for the L-GREG estimators, I propose a new augmented variance estimator (A). The difference between S and the new estimator A is that the latter takes into account the difference between the sample fit model and the census fit model. In Monte Carlo experiments, the new estimator A outperformed the standard estimator S in terms of bias, root mean square error and coverage rate. Thus the new estimator provides a good alternative to the standard estimator.
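As a rough illustration of the GREG-lin idea discussed above, the sketch below estimates a population total under simple random sampling with a linear assisting model: model predictions are summed over the whole population and corrected with a Horvitz-Thompson sum of sample residuals. The synthetic population, model and sample sizes are all assumptions for illustration, not taken from the thesis:

```python
import random

# Illustrative GREG-lin estimator of a population total under simple
# random sampling (pi_k = n/N for every unit).  Population is synthetic.
random.seed(1)
N, n = 10_000, 200
x = [random.uniform(0, 10) for _ in range(N)]           # auxiliary variable
y = [2 + 3 * xi + random.gauss(0, 2) for xi in x]       # study variable

sample = random.sample(range(N), n)
xs, ys = [x[k] for k in sample], [y[k] for k in sample]

# Ordinary least squares fit on the sample: the assisting model.
xbar, ybar = sum(xs) / n, sum(ys) / n
b = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(xs, ys))
     / sum((xi - xbar) ** 2 for xi in xs))
a = ybar - b * xbar

# GREG-lin: predicted total over the population, plus a Horvitz-Thompson
# correction built from the sample residuals (weights N/n under SRS).
t_greg = (sum(a + b * xi for xi in x)
          + (N / n) * sum(yi - (a + b * xi) for xi, yi in zip(xs, ys)))
print(t_greg, sum(y))   # estimate vs. true total
```

With a strong assisting model the residual correction is small and the estimator is very accurate, which mirrors the large-domain behaviour of the L-GREG estimators described above.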
Abstract:
A 50-year tree-ring delta O-18 chronology of Abies spectabilis growing close to the tree line (3850 m asl) in the Nepal Himalaya is established to explore its dendroclimatic potential. Response function analysis with ambient climatic records revealed that tree-ring delta O-18 is primarily governed by rainfall during the monsoon season (June-September), and the regression model accounts for 35% of the variance in rainfall. Extreme dry years identified in instrumental weather data are detected in the delta O-18 chronology. Further, tree-ring delta O-18 is much more sensitive to rainfall fluctuations than other tree-ring parameters, such as width and density, typically used in dendroclimatology. Correlation analyses with Nino 3.4 SST reveal time-dependent behavior of the ENSO-monsoon relationship.
Abstract:
In humans, well-replicated and robust sex differences in cognitive functions exist for handedness and mental rotation ability. A common characteristic in human cognitive functions is the lateralization of language functions. Handedness is a common measure of laterality and is related to language lateralization. The prevalence of left-handedness is higher in males than in females, the male to female ratio being about 1.2. Among cognitive abilities, the largest sex difference is evident in the Vandenberg and Kuse Mental Rotation Test (MRT), which requires the ability to rotate objects in mental space. On average, males achieve scores one standard deviation higher than females in the MRT. The present thesis investigated the origins of the sex differences in laterality and spatial ability as represented by handedness and mental rotation ability, respectively. Two population-based Finnish twin cohorts were utilized in this study. Handedness was studied in 25 810 twins and 4068 singletons born before 1958 from the Older Finnish Twin Cohort, and in 4736 twins born in 1983-87 from the FinnTwin12. MRT was studied in a sub-sample of 804 young adult participants from the FinnTwin12 sample. The main findings of this study were: 1) the prevalence of left-handedness was higher among males than among females in both singletons and in twins; 2) males had significantly higher scores than females in MRT; 3) about one quarter of the variance in handedness and about half of the variance in MRT was explained by genetic effects, whereas the remainder of the variance in these traits was explained by environmental effects unique to each individual. The magnitude of the genetic effects was similar in both sexes; 4) left-handedness was significantly less common in female co-twins of a male than in female co-twins of a female, and female co-twins of a male scored significantly higher than did female co-twins of a female in the Mental Rotation Test. 
This dissertation discusses whether these differences between females from opposite- and same-sex twin pairs are due to the prenatal transfer of testosterone from the male fetus in females with male co-twins or whether they arise from postnatal socialization effects.
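The variance decomposition reported above (roughly half of MRT variance genetic, the remainder unique-environmental) is estimated in practice from twin correlations. A minimal sketch using Falconer's classical formulas, with hypothetical monozygotic/dizygotic correlations chosen to reproduce a roughly 50% genetic share; the thesis itself fits full biometric twin models, not these simple formulas:

```python
# Falconer's classical decomposition from twin correlations (a rough
# approximation to the full ACE model).  The correlations below are
# hypothetical, not estimates from the Finnish twin cohorts.
def falconer(r_mz, r_dz):
    h2 = 2 * (r_mz - r_dz)   # additive genetic share of variance
    c2 = 2 * r_dz - r_mz     # shared-environment share
    e2 = 1 - r_mz            # unique-environment share
    return h2, c2, e2

h2, c2, e2 = falconer(r_mz=0.50, r_dz=0.25)
print(h2, c2, e2)  # → 0.5 0.0 0.5
```

With these illustrative correlations, genetic effects account for half the variance and unique environment for the other half, matching the pattern the study reports for mental rotation ability.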
Abstract:
A laminated composite plate model based on first-order shear deformation theory is implemented using the finite element method. Matrix cracks are introduced into the finite element model by considering changes in the A, B and D matrices of composites. The effects of different boundary conditions, laminate types and ply angles on the behavior of composite plates with matrix cracks are studied. Finally, the effect of material property uncertainty, which is important for composite materials, on the composite plate is investigated using Monte Carlo simulations. Probabilistic estimates of damage detection reliability in composite plates are made for static and dynamic measurements. It is found that the effect of uncertainty must be considered for accurate damage detection in composite structures. The estimates of variance obtained for observable system properties due to uncertainty can be used for developing more robust damage detection algorithms.
Abstract:
Two optimal non-linear reinforcement schemes—the Reward-Inaction and the Penalty-Inaction—for the two-state automaton functioning in a stationary random environment are considered. Very simple conditions of symmetry of the non-linear function figuring in the reinforcement scheme are shown to be necessary and sufficient for optimality. General expressions for the variance and rate of learning are derived. These schemes are compared with the already existing optimal linear schemes in the light of average variance and average rate of learning.
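The linear schemes used as a benchmark above can be illustrated with the classic two-action linear Reward-Inaction update: the probability of the chosen action moves toward 1 only when the environment rewards it, and is left unchanged otherwise. The reward probabilities, learning rate and step count below are illustrative assumptions:

```python
import random

# Sketch of the linear Reward-Inaction (L_RI) scheme for a two-action
# automaton in a stationary random environment.  Parameters are illustrative.
def l_ri(d, a=0.05, steps=2000, seed=0):
    """d: reward probabilities for actions 0 and 1; returns final p."""
    rng = random.Random(seed)
    p = [0.5, 0.5]                              # action probabilities
    for _ in range(steps):
        action = 0 if rng.random() < p[0] else 1
        if rng.random() < d[action]:            # update only on reward
            p[action] += a * (1 - p[action])
            p[1 - action] = 1 - p[action]
    return p

# Action 0 is rewarded more often, so p[0] tends toward 1.
print(l_ri(d=[0.8, 0.4], seed=7))
```

The scheme is epsilon-optimal rather than optimal: for small learning rates it converges to the better action with probability close to, but not equal to, one, which is the kind of trade-off the variance and rate-of-learning expressions in the paper quantify.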
Abstract:
Nonlinear vibration analysis is performed using a C-0 assumed-strain interpolated finite element plate model based on Reddy's third-order theory. An earlier model is modified to include the effect of transverse shear variation along the plate thickness and von Karman nonlinear strain terms. Monte Carlo simulation with the Latin Hypercube Sampling technique is used to obtain the variance of the linear and nonlinear natural frequencies of the plate due to randomness in its material properties. Numerical results are obtained for composite plates with different aspect ratios, stacking sequences and oscillation amplitude ratios. The numerical results are validated against the available literature. It is found that the nonlinear frequencies show an increasingly non-Gaussian probability density function with increasing amplitude of vibration and show dual peaks at high amplitude ratios. This chaotic nature of the dispersion of nonlinear eigenvalues is also r
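The Latin Hypercube Sampling step mentioned above can be sketched in a few lines: each input dimension is split into n equal strata, one uniform draw is taken per stratum, and the columns are shuffled independently so the strata pair up at random. The dimensions and sample count here are arbitrary choices, not values from the study:

```python
import random

# Minimal Latin Hypercube Sampling on the unit hypercube: exactly one
# sample per stratum in every dimension.
def latin_hypercube(n, dims, rng):
    """Return n points in [0, 1)^dims with one point per stratum per dim."""
    cols = []
    for _ in range(dims):
        # one uniform draw inside each of the n equal strata, then shuffle
        col = [(k + rng.random()) / n for k in range(n)]
        rng.shuffle(col)
        cols.append(col)
    return list(zip(*cols))     # n points, each a dims-tuple

rng = random.Random(42)
points = latin_hypercube(n=10, dims=3, rng=rng)
print(len(points))
```

Compared with plain Monte Carlo, this stratification covers each marginal distribution evenly, which is why it reduces the number of samples needed to estimate variances of quantities such as natural frequencies.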
Abstract:
One of the most fundamental and widely accepted ideas in finance is that investors are compensated through higher returns for taking on non-diversifiable risk. Hence the quantification, modeling and prediction of risk have been, and still are, one of the most prolific research areas in financial economics. It was recognized early on that there are predictable patterns in the variance of speculative prices. Later research has shown that there may also be systematic variation in the skewness and kurtosis of financial returns. Lacking in the literature so far is an out-of-sample forecast evaluation of the potential benefits of these new, more complicated models with time-varying higher moments. Such an evaluation is the topic of this dissertation. Essay 1 investigates the forecast performance of the GARCH(1,1) model when estimated with 9 different error distributions on Standard and Poor's 500 index futures returns. By utilizing the theory of realized variance to construct an appropriate ex post measure of variance from intra-day data, it is shown that allowing for a leptokurtic error distribution leads to significant improvements in variance forecasts compared to using the normal distribution. This result holds for daily, weekly as well as monthly forecast horizons. It is also found that allowing for skewness and time variation in the higher moments of the distribution does not further improve forecasts. In Essay 2, using 20 years of daily Standard and Poor's 500 index returns, it is found that density forecasts are much improved by allowing for constant excess kurtosis but not by allowing for skewness. By allowing the kurtosis and skewness to be time-varying, the density forecasts are not further improved but on the contrary made slightly worse. In Essay 3 a new model incorporating conditional variance, skewness and kurtosis based on the Normal Inverse Gaussian (NIG) distribution is proposed.
The new model and two previously used NIG models are evaluated by their Value at Risk (VaR) forecasts on a long series of daily Standard and Poor's 500 returns. The results show that only the new model produces satisfactory VaR forecasts at both the 1% and 5% levels. Taken together, the results of the thesis show that kurtosis appears not to exhibit predictable time variation, whereas some predictability is found in the skewness. However, the dynamic properties of the skewness are not completely captured by any of the models.
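The GARCH(1,1) recursion at the core of Essay 1 can be sketched as follows; the parameter values and the simulated returns are placeholders, not estimates from the S&P 500 data:

```python
import random

# GARCH(1,1) conditional-variance filter:
#   sigma2_t = omega + alpha * r_{t-1}^2 + beta * sigma2_{t-1},
# initialised at the unconditional variance (requires alpha + beta < 1).
def garch_filter(returns, omega, alpha, beta):
    """Return the conditional variance path for a return series."""
    sigma2 = [omega / (1 - alpha - beta)]
    for r in returns[:-1]:
        sigma2.append(omega + alpha * r * r + beta * sigma2[-1])
    return sigma2

rng = random.Random(3)
rets = [rng.gauss(0, 1) for _ in range(500)]     # placeholder returns
path = garch_filter(rets, omega=0.05, alpha=0.08, beta=0.90)
print(len(path), min(path) > 0)
```

Swapping the Gaussian error assumption for a leptokurtic one changes the likelihood used for estimation and the density forecasts, but the variance recursion itself stays exactly this.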
Abstract:
Modeling and forecasting of implied volatility (IV) is important to both practitioners and academics, especially in trading, pricing, hedging, and risk management activities, all of which require an accurate volatility estimate. However, this has become challenging since the 1987 stock market crash, as implied volatilities (IVs) recovered from stock index options present two patterns, the volatility smirk (skew) and the volatility term structure, which examined together form a rich implied volatility surface (IVS). This implies that the assumptions behind the Black-Scholes (1973) model do not hold empirically, as asset prices are influenced by many underlying risk factors. This thesis, consisting of four essays, models and forecasts implied volatility in the presence of these empirical regularities of options markets. The first essay models the dynamics of the IVS, extending the Dumas, Fleming and Whaley (DFW) (1998) framework: using moneyness in the implied forward price and OTM put-call options on the FTSE100 index, a nonlinear optimization is used to estimate different models and thereby produce rich, smooth IVSs. Here, the constant-volatility model fails to explain the variations in the rich IVS. Next, it is found that three factors can explain about 69-88% of the variance in the IVS; of this, on average, 56% is explained by the level factor, 15% by the term-structure factor, and an additional 7% by the jump-fear factor. The second essay proposes a quantile regression model for the contemporaneous asymmetric return-volatility relationship, generalizing the Hibbert et al. (2008) model. The results show a strong negative asymmetric return-volatility relationship at various quantiles of the IV distributions; it is monotonically increasing when moving from the median quantile to the uppermost quantile (i.e., 95%), so OLS underestimates this relationship at the upper quantiles.
Additionally, the asymmetric relationship is more pronounced with the smirk (skew) adjusted volatility index measure than with the old volatility index measure. The volatility indices are ranked in terms of asymmetric volatility as follows: VIX, VSTOXX, VDAX, and VXN. The third essay examines the information content of the new-VDAX volatility index for forecasting daily Value-at-Risk (VaR) estimates and compares its VaR forecasts with those of Filtered Historical Simulation and RiskMetrics. All daily VaR models are then backtested over 1992-2009 using unconditional coverage, independence, conditional coverage, and quadratic-score tests. It is found that the VDAX subsumes almost all the information required for daily VaR forecasts for a portfolio of the DAX30 index, and implied-VaR models outperform all other VaR models. The fourth essay models the risk factors driving swaption IVs. It is found that three factors can explain 94-97% of the variation in each of the EUR, USD, and GBP swaption IVs. There are significant linkages across factors, and bi-directional causality is at work between the factors implied by EUR and USD swaption IVs. Furthermore, the factors implied by EUR and USD IVs respond to each other's shocks; surprisingly, however, GBP does not affect them. Finally, the string market model calibration results show that it can efficiently reproduce (or forecast) the volatility surface for each of the swaption markets.
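The unconditional-coverage backtest mentioned above is commonly Kupiec's likelihood-ratio test; a minimal sketch follows, with hypothetical violation counts rather than results from the essay:

```python
import math

# Kupiec's unconditional-coverage LR test: with x VaR violations in n
# days at level p and phat = x/n,
#   LR_uc = -2 ln[ p^x (1-p)^(n-x) / (phat^x (1-phat)^(n-x)) ],
# asymptotically chi-squared with 1 degree of freedom.
def kupiec_lr(n, x, p):
    phat = x / n
    log_null = x * math.log(p) + (n - x) * math.log(1 - p)
    if phat in (0.0, 1.0):          # degenerate cases: alt log-likelihood is 0
        return -2 * log_null
    log_alt = x * math.log(phat) + (n - x) * math.log(1 - phat)
    return -2 * (log_null - log_alt)

# 1% VaR over 1000 days: 10 violations is exactly on target (LR = 0);
# 15 violations gives a positive LR, here still below the 5% cutoff 3.84.
print(kupiec_lr(1000, 10, 0.01), kupiec_lr(1000, 15, 0.01))
```

The independence and conditional-coverage tests cited in the abstract extend this statistic by also penalising clustering of violations.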
Abstract:
In this thesis we deal with the concept of risk. The objective is to bring together, and draw conclusions from, normative information regarding quantitative portfolio management and risk assessment. The first essay concentrates on return dependency. We propose an algorithm for classifying markets into rising and falling. Given the algorithm, we derive a statistic, the Trend Switch Probability, for detecting long-term return dependency in the first moment. The empirical results suggest that the Trend Switch Probability is robust over various volatility specifications. Serial dependency, however, behaves differently in bear and bull markets: it is strongly positive in rising markets, whereas in bear markets returns are closer to a random walk. Realized volatility, a technique for estimating volatility from high-frequency data, is investigated in essays two and three. In the second essay we find, when measuring realized variance on a set of German stocks, that the second-moment dependency structure is highly unstable and changes randomly. Results also suggest that volatility is non-stationary from time to time. In the third essay we examine the impact of market microstructure on the error between estimated realized volatility and the volatility of the underlying process. With simulation-based techniques we show that autocorrelation in returns leads to biased variance estimates, and that lower sampling frequency and non-constant volatility increase the error variation between the estimated variance and the variance of the underlying process. From these essays we can conclude that volatility is not easily estimated, even from high-frequency data. It is not well behaved in terms of either stability or dependency over time. Based on these observations, we would recommend the use of simple, transparent methods that are likely to be more robust over differing volatility regimes than models with a complex parameter universe.
In analyzing long-term return dependency in the first moment we find that the Trend Switch Probability is a robust estimator. This is an interesting area for further research, with important implications for active asset allocation.
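The realized-variance estimator studied in essays two and three is simply the sum of squared intraday returns; a sketch on simulated returns, where the daily variance and sampling frequency are illustrative assumptions rather than values from the essays:

```python
import random

# Realized variance: sum of squared intraday returns.  For an ideal
# diffusion sampled m times per day this converges to the true daily
# variance as m grows; microstructure noise breaks that in practice.
def realized_variance(returns):
    return sum(r * r for r in returns)

rng = random.Random(11)
true_daily_var = 0.0004                  # assumed daily variance
m = 288                                  # e.g. 5-minute returns over 24h
rets = [rng.gauss(0, (true_daily_var / m) ** 0.5) for _ in range(m)]
print(realized_variance(rets))           # close to true_daily_var
```

The third essay's point can be seen here by construction: with autocorrelated returns the cross-product terms no longer vanish in expectation, so the estimator becomes biased, and with fewer intraday observations the estimate scatters more widely around the true variance.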
Abstract:
A better understanding of stock price changes is important in guiding many economic activities. Since prices often do not change without good reason, searching for related explanatory variables has attracted many enthusiasts. This book seeks answers from prices per se by relating price changes to their conditional moments. This is based on the belief that prices are the product of a complex psychological and economic process, and that their conditional moments derive ultimately from these psychological and economic shocks. Utilizing information about conditional moments hence makes this an attractive alternative to using other selective financial variables in explaining price changes. The first paper examines the relation between the conditional mean and the conditional variance using information about moments in three types of conditional distributions; it finds that the significance of the estimated mean and variance ratio can be affected by the assumed distributions and the time variations in skewness. The second paper decomposes conditional industry volatility into a concurrent market component and an industry-specific component; it finds that market volatility is on average responsible for a rather small share of total industry volatility, 6 to 9 percent in the UK and 2 to 3 percent in Germany. The third paper looks at the heteroskedasticity in stock returns through an ARCH process supplemented with a set of conditioning information variables; it finds that the heteroskedasticity in stock returns takes several forms, including deterministic changes in variances due to seasonal factors, random adjustments in variances due to market and macro factors, and ARCH processes with past information.
The fourth paper examines the role of higher moments, especially skewness and kurtosis, in determining expected returns; it finds that total skewness and total kurtosis are more relevant non-beta risk measures and that they are costly to diversify, due either to the possible elimination of their desirable parts or to the unsustainability of diversification strategies based on them.
Abstract:
Financial time series tend to behave in a manner that is not well described by a normal distribution. Asymmetries and nonlinearities are usually seen, and these characteristics need to be taken into account. Making forecasts and predictions of future return and risk is rather complicated, and the existing models for predicting risk help only to a certain degree, given the complexity of financial time series data. The essays in this dissertation support the introduction of nonlinearities and asymmetries for the purpose of better models and forecasts regarding both the mean and the variance. Linear and nonlinear models are consequently introduced. The advantage of nonlinear models is that they can take asymmetries into account. Asymmetric patterns usually mean that large negative returns appear more often than positive returns of the same magnitude. This goes hand in hand with the fact that negative returns are associated with higher risk than positive returns of the same magnitude. These models are of high importance because of their ability to produce the best possible estimations and predictions of future returns and risk.
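One common way to encode the asymmetry described above, where a negative return raises next-period risk more than a positive return of equal size, is a threshold (GJR-type) variance update. This is a generic sketch with illustrative parameters, not one of the dissertation's estimated models:

```python
# GJR-type asymmetric variance update: the leverage term gamma * r^2 is
# switched on only for negative returns, so bad news raises next-period
# variance more than good news of the same size.  Parameters are illustrative.
def gjr_step(sigma2, ret, omega=0.05, alpha=0.05, gamma=0.10, beta=0.85):
    leverage = gamma * ret * ret if ret < 0 else 0.0
    return omega + alpha * ret * ret + leverage + beta * sigma2

up = gjr_step(sigma2=1.0, ret=+2.0)
down = gjr_step(sigma2=1.0, ret=-2.0)
print(up, down)   # the negative shock implies the larger variance
```

Setting gamma to zero recovers a symmetric GARCH-style update, which is exactly the kind of linear-vs-nonlinear comparison the essays are concerned with.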
Abstract:
This paper studies the effect of the expiration day of index options and futures on the trading volume, variance and price of the underlying shares. The data consist of all trades in the shares underlying the FOX index on expiration days during the period from October 1995 to mid-1999. The main results seem to support the findings of Kan (2001), i.e. no manipulation on a larger scale. However, some indication of manipulation could be found when certain characteristics are favorable. These characteristics include: a) a large quantity of outstanding futures or at/in-the-money options contracts; b) the existence of shares with high index weight but fairly low trading volume. Lastly, there is some indication that manipulation might have become more popular towards the end of the examined time period.
Abstract:
The use of different time units in option pricing may lead to inconsistent estimates of time decay and spurious jumps in implied volatilities. Different time units in the pricing model lead to different implied volatilities even though the option price itself is the same. The chosen time unit should make it necessary to adjust the volatility parameter only when there are fundamental reasons for doing so, not because of a wrongly specified model. This paper examined the effects of option pricing under different time hypotheses and empirically investigated which time frame the option markets in Germany employ over weekdays. The paper specifically tries to build a picture of how the market prices options. The results seem to verify that the German market behaves in a fashion that deviates from the most traditional time units in option pricing, calendar and trading days. The study also showed that the implied volatility of Thursdays was somewhat higher and thus differed from the pattern of the other days of the week. Using a GARCH model to investigate the effect further showed that although traditional tests, such as the analysis of variance, indicated a negative return for Thursdays during the same period as the implied volatilities used, this was not supported by the GARCH model.
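The time-unit effect described above can be reproduced with a small Black-Scholes experiment: price an at-the-money option with time to expiry measured in calendar days, then back out the implied volatility assuming trading days instead. All inputs below are illustrative assumptions:

```python
import math

# Same option price, different time units, different implied volatility.
def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(s, k, t, r, sigma):
    d1 = (math.log(s / k) + (r + 0.5 * sigma**2) * t) / (sigma * math.sqrt(t))
    d2 = d1 - sigma * math.sqrt(t)
    return s * norm_cdf(d1) - k * math.exp(-r * t) * norm_cdf(d2)

def implied_vol(price, s, k, t, r, lo=1e-4, hi=5.0):
    for _ in range(100):                 # bisection; bs_call rises in sigma
        mid = 0.5 * (lo + hi)
        if bs_call(s, k, t, r, mid) < price:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Price with sigma = 20% and 7 calendar days (t in years) ...
price = bs_call(s=100, k=100, t=7 / 365, r=0.0, sigma=0.20)
# ... then back out the volatility assuming 5 trading days instead.
iv_trading = implied_vol(price, s=100, k=100, t=5 / 252, r=0.0)
print(iv_trading)   # below 0.20: a different time unit, a different IV
```

Because the trading-day convention assigns this week slightly more year-fraction than the calendar-day one, the same price maps to a lower volatility, which is exactly the spurious jump in implied volatility that a mismatched time unit produces across weekends and holidays.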
Abstract:
This paper examines how volatility in financial markets can preferably be modeled. It investigates how good volatility models, both linear and nonlinear, are at absorbing skewness and kurtosis. The examination is done on the Nordic stock markets, including Finland, Sweden, Norway and Denmark. Different linear and nonlinear models are applied, and the results indicate that a linear model can almost always be used for modeling the series under investigation, even though nonlinear models perform slightly better in some cases. These results indicate that the markets under study are exposed to asymmetric patterns only to a certain degree. Negative shocks generally have a more prominent effect on the markets, but these effects are not particularly strong. However, in terms of absorbing skewness and kurtosis, nonlinear models outperform linear ones.
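The skewness and excess kurtosis that such models are asked to absorb are measured on the standardised residuals: if a model fits well, its residuals should show little of either. A minimal sketch of the sample moments, run here on simulated Gaussian data rather than actual Nordic index residuals:

```python
import random

# Sample skewness and excess kurtosis from central moments:
#   skew = m3 / m2^(3/2),  excess kurtosis = m4 / m2^2 - 3.
def skew_kurt(xs):
    n = len(xs)
    m = sum(xs) / n
    m2 = sum((x - m) ** 2 for x in xs) / n
    m3 = sum((x - m) ** 3 for x in xs) / n
    m4 = sum((x - m) ** 4 for x in xs) / n
    return m3 / m2**1.5, m4 / m2**2 - 3.0

rng = random.Random(5)
z = [rng.gauss(0, 1) for _ in range(20_000)]
print(skew_kurt(z))   # both close to zero for Gaussian data
```

Residuals from a poorly fitting volatility model would instead show markedly negative skewness and positive excess kurtosis, which is the diagnostic the paper's linear-versus-nonlinear comparison rests on.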