895 results for volatility index


Relevance: 20.00%

Abstract:

Aim: The aim was to investigate whether the sleep practices in early childhood education (ECE) settings align with current evidence on optimal practice to support sleep. Background: Internationally, scheduled sleep times are a common feature of daily schedules in ECE settings, yet little is known about the degree to which care practices in these settings align with the evidence regarding appropriate support of sleep. Methods: Observations were conducted in 130 Australian ECE rooms attended by preschool children (mean age = 4.9 years). Of these rooms, 118 had daily scheduled sleep times. Observed practices were scored against an optimality index, the Sleep Environment and Practices Optimality Score, developed with reference to current evidence regarding sleep scheduling, routines, environmental stimuli, and emotional climate. Cluster analysis was applied to identify patterns and prevalence of care practices at sleep time. Results: Three sleep practice types were identified. Supportive rooms (36%) engaged in practices that maintained regular schedules, promoted routine, reduced environmental stimulation, and maintained a positive emotional climate. The majority of ECE rooms (64%), although offering opportunity for sleep, did not engage in supportive practices: Ambivalent rooms (45%) were emotionally positive but did not support sleep; Unsupportive rooms (19%) were both emotionally negative and unsupportive in their practices. Conclusions: Although ECE rooms schedule sleep time, many do not adopt practices that are supportive of sleep. Our results underscore the need for education about sleep-supportive practice and for research to ascertain the impact of sleep practices in ECE settings on children's sleep health and broader well-being.

Relevance: 20.00%

Abstract:

Based on unique news data relating to gold and crude oil, we investigate how news volume and sentiment, shocks in trading activity, market depth, and trader positions unrelated to information flow covary with realized volatility. Positive shocks to the rate of news arrival and negative shocks to news sentiment exhibit the largest effects. After controlling for the level of news flow and cross-correlations, net trader positions play only a minor role. These findings are at odds with those of Wang (2002a, The Journal of Futures Markets, 22, 427–450) and Wang (2002b, The Financial Review, 37, 295–316), but are consistent with the previous literature, which does not find a strong link between volatility and trader positions.

Relevance: 20.00%

Abstract:

Rarely is it possible to obtain absolute numbers in free-ranging populations, and although various direct and indirect methods are used to estimate abundance, few are validated against populations of known size. In this paper, we apply grounding, calibration and verification methods, used to validate mathematical models, to methods of estimating relative abundance. To illustrate how this might be done, we consider and evaluate the widely applied passive tracking index (PTI) methodology. Using published data, we examine the rationality of PTI methodology, how animal activity and abundance are conceptually related, and how alternative methods are subject to similar biases or produce similar abundance estimates and trends. We then calibrate the method against populations representing a range of densities likely to be encountered in the field. Finally, we compare PTI trends against a prediction that adjacent populations of the same species will have similar abundance values and trends in activity. We show that while PTI abundance estimates are subject to environmental and behavioural stochasticity peculiar to each species, the PTI method and associated variance estimate showed high probability of detection, high precision of abundance values and, generally, low variability between surveys. We suggest that the PTI method, applied using this procedure and for these species, provides a sensitive and credible index of abundance. The same or a similar validation approach can and should be applied to alternative relative abundance methods in order to demonstrate their credibility and justify their use.
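The arithmetic behind such an activity index can be sketched in a few lines. This is a toy reduction assumed for illustration, not the paper's estimator: the index is taken as the mean intrusion count per tracking plot per observation day, and the mixed-model variance estimate used in the PTI literature is omitted.

```python
def passive_tracking_index(counts):
    """Toy PTI: the mean of the daily plot means.

    counts[d][p] is the number of track intrusions recorded at
    plot p on observation day d.  The published PTI methodology
    additionally supplies a variance estimate, omitted here.
    """
    daily_means = [sum(day) / len(day) for day in counts]
    return sum(daily_means) / len(daily_means)
```

With two plots observed over two days, `[[0, 2], [1, 3]]` yields an index of 1.5.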

Relevance: 20.00%

Abstract:

The present study examined whether a specific property of cell microstructures may be useful as a biomarker of aging. Specifically, we examined the association between age and changes of cellular structures reflected in electrophoretic mobility of cell nuclei (EMN) index values across the adult lifespan. This report considers findings from cross sections of females (n = 1273) aged 18–98 years and males (n = 506) aged 19–93 years. A Biotest apparatus was used to perform intracellular microelectrophoresis on buccal epithelial cells collected from each individual. The EMN index was calculated on the basis of the number of epithelial cells with mobile nuclei in reference to the cells with immobile nuclei per 100 cells. Regression analyses indicated a significant negative association between EMN index value and age for men (r = −0.71, p < 0.001) and women (r = −0.60, p < 0.001), demonstrating a key requirement that must be met by a biomarker of aging. The strength of association observed between EMN index and age for both men and women was encouraging and supports the potential use of the EMN index for determining the biological age of an individual (or a group). In this study, a new attempt at a comprehensive explanation of the cellular mechanisms contributing to age-related changes of the EMN index was made. The EMN index has demonstrated potential to meet criteria proposed for biomarkers of aging, and further investigations are necessary.
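On a plain-percentage reading of the calculation described above (the function name and interface are ours, not the study's), the index reduces to:

```python
def emn_index(n_mobile, n_immobile):
    """EMN index: cells with electrophoretically mobile nuclei
    per 100 cells scored, i.e. a percentage of the total."""
    total = n_mobile + n_immobile
    if total == 0:
        raise ValueError("no cells scored")
    return 100.0 * n_mobile / total
```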

Relevance: 20.00%

Abstract:

A wide range of models used in agriculture, ecology, carbon cycling, climate and other related studies require information on the amount of leaf material present in a given environment to correctly represent radiation, heat, momentum, water, and various gas exchanges with the overlying atmosphere or the underlying soil. Leaf area index (LAI) thus often features as a critical land surface variable in parameterisations of global and regional climate models, e.g., radiation uptake, precipitation interception, energy conversion, gas exchange and momentum, as all are substantially determined by the vegetation surface. Optical wavelengths are the electromagnetic regions of remote sensing most commonly used for LAI estimation and for vegetation studies in general. The main purpose of this dissertation was to enhance the determination of LAI using close-range remote sensing (hemispherical photography), airborne remote sensing (high resolution colour and colour infrared imagery), and satellite remote sensing (high resolution SPOT 5 HRG imagery) optical observations. The commonly used light extinction models are applied at all levels of optical observations. For the sake of comparative analysis, LAI was further determined using statistical relationships between spectral vegetation index (SVI) and ground based LAI. The study areas of this dissertation focus on two regions, one located in Taita Hills, South-East Kenya, characterised by tropical cloud forest and exotic plantations, and the other in Gatineau Park, Southern Quebec, Canada, dominated by temperate hardwood forest. The procedure for sampling the sky map of gap fraction and gap size from hemispherical photographs proved to be one of the most crucial steps in the accurate determination of LAI. LAI and clumping index estimates were significantly affected by the variation of the size of sky segments for given zenith angle ranges.
On sloping ground, gap fraction and size distributions present strong upslope/downslope asymmetry of foliage elements, and thus the correction and the sensitivity analysis for both LAI and clumping index computations were demonstrated. Several SVIs can be used for LAI mapping using empirical regression analysis provided that the sensitivities of SVIs at varying ranges of LAI are large enough. Large scale LAI inversion algorithms were demonstrated and proven to be a considerably efficient alternative approach for LAI mapping. LAI can be estimated nonparametrically from the information contained solely in the remotely sensed dataset given that the upper-end (saturated SVI) value is accurately determined. However, further study is still required to devise a methodology, as well as instrumentation, to retrieve on-ground green leaf area index, so that the large scale LAI inversion algorithms presented in this work can be precisely validated. Finally, based on the literature review and this dissertation, potential future research prospects and directions were recommended.
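The light extinction models applied at all observation levels follow the familiar Beer-Lambert form. A generic sketch of the inversion from gap fraction to effective LAI is shown below; it is illustrative only (names are ours) and omits the clumping-index correction that matters on sloping ground.

```python
import math

def effective_lai(gap_fraction, zenith_deg=57.5, g=0.5):
    """Invert the gap-fraction model P(theta) = exp(-G * LAI / cos(theta)).

    g is the foliage projection coefficient; near a 57.5 degree zenith
    angle it is close to 0.5 for most leaf-angle distributions, which
    is one reason this view angle is popular in hemispherical photography.
    """
    theta = math.radians(zenith_deg)
    return -math.cos(theta) * math.log(gap_fraction) / g
```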

Relevance: 20.00%

Abstract:

This thesis addresses modeling of financial time series, especially stock market returns and daily price ranges. Modeling data of this kind can be approached with so-called multiplicative error models (MEM). These models nest several well-known time series models such as GARCH, ACD and CARR models. They are able to capture many well-established features of financial time series, including volatility clustering and leptokurtosis. In contrast, different kinds of asymmetries have received relatively little attention in the existing literature. In this thesis asymmetries arise from various sources. They are observed in both conditional and unconditional distributions, for variables with non-negative values and for variables that have values on the real line. In the multivariate context asymmetries can be observed in the marginal distributions as well as in the relationships of the variables modeled. New methods for all these cases are proposed. Chapter 2 considers GARCH models and modeling of returns of two stock market indices. The chapter introduces the so-called generalized hyperbolic (GH) GARCH model to account for asymmetries in both conditional and unconditional distribution. In particular, two special cases of the GARCH-GH model which describe the data most accurately are proposed. They are found to improve the fit of the model when compared to symmetric GARCH models. The advantages of accounting for asymmetries are also observed through Value-at-Risk applications. Both theoretical and empirical contributions are provided in Chapter 3 of the thesis. In this chapter the so-called mixture conditional autoregressive range (MCARR) model is introduced, examined and applied to daily price ranges of the Hang Seng Index. The conditions for the strict and weak stationarity of the model, as well as an expression for the autocorrelation function, are obtained by writing the MCARR model as a first-order autoregressive process with random coefficients.
The chapter also introduces the inverse gamma (IG) distribution to CARR models. The advantages of CARR-IG and MCARR-IG specifications over conventional CARR models are found in the empirical application both in- and out-of-sample. Chapter 4 discusses the simultaneous modeling of absolute returns and daily price ranges. In this part of the thesis a vector multiplicative error model (VMEM) with asymmetric Gumbel copula is found to provide substantial benefits over the existing VMEM models based on elliptical copulas. The proposed specification is able to capture the highly asymmetric dependence of the modeled variables, thereby improving the performance of the model considerably. The economic significance of the results obtained is established when the information content of the volatility forecasts derived is examined.
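The MEM structure the thesis builds on can be illustrated with a minimal simulator. This is a sketch assuming unit-mean exponential errors (essentially the CARR special case); the parameter values and names are ours, not the thesis's:

```python
import random

def simulate_mem(n, omega=0.1, alpha=0.2, beta=0.7, seed=1):
    """Minimal multiplicative error model x_t = mu_t * eps_t, with the
    GARCH-style recursion mu_t = omega + alpha*x_{t-1} + beta*mu_{t-1}
    and i.i.d. unit-mean exponential errors eps_t."""
    rng = random.Random(seed)
    mu = omega / (1 - alpha - beta)    # start at the unconditional mean
    xs = []
    for _ in range(n):
        x = mu * rng.expovariate(1.0)  # E[eps] = 1 keeps E[x_t | F_{t-1}] = mu_t
        xs.append(x)
        mu = omega + alpha * x + beta * mu
    return xs
```

With these illustrative parameters the unconditional mean of x_t is omega / (1 - alpha - beta) = 1, so a long simulated series should average near one.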

Relevance: 20.00%

Abstract:

Conventional methods for determining the refractive index demand specimens of optical quality, the preparation of which is often very difficult. An indirect determination by matching the refractive indices of specimen and immersion liquid is a practical alternative for photoelastic specimens of nonoptical quality. An experimental arrangement used for this technique, and observations made while matching the refractive indices of three different specimens, are presented.

Relevance: 20.00%

Abstract:

Digital image

Relevance: 20.00%

Abstract:

Stochastic volatility models are of fundamental importance to the pricing of derivatives. One of the most commonly used models of stochastic volatility is the Heston model, in which the price and volatility of an asset evolve as a pair of coupled stochastic differential equations. The computation of asset prices and volatilities involves the simulation of many sample trajectories with conditioning. The problem is treated using the method of particle filtering. While the simulation of a shower of particles is computationally expensive, each particle behaves independently, making such simulations ideal for massively parallel heterogeneous computing platforms. In this paper, we present our portable OpenCL implementation of the Heston model and discuss its performance and efficiency characteristics on a range of architectures including Intel CPUs, NVIDIA GPUs, and Intel Many Integrated Core (MIC) accelerators.
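The paper's OpenCL kernels are not reproduced in this abstract, but the per-particle work, simulating one trajectory of the coupled Heston SDEs, can be sketched in NumPy with an Euler full-truncation scheme. The parameter values below are illustrative, not taken from the paper:

```python
import numpy as np

def heston_paths(s0, v0, kappa, theta, xi, rho, r, T, n_steps, n_paths, seed=0):
    """Euler full-truncation scheme for the Heston model:
       dS = r S dt + sqrt(v) S dW1,   dv = kappa (theta - v) dt + xi sqrt(v) dW2,
       with corr(dW1, dW2) = rho.  Returns terminal prices and variances."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    s = np.full(n_paths, s0, dtype=float)
    v = np.full(n_paths, v0, dtype=float)
    for _ in range(n_steps):
        z1 = rng.standard_normal(n_paths)
        z2 = rho * z1 + np.sqrt(1.0 - rho**2) * rng.standard_normal(n_paths)
        vp = np.maximum(v, 0.0)  # full truncation: never take sqrt of a negative variance
        s *= np.exp((r - 0.5 * vp) * dt + np.sqrt(vp * dt) * z1)
        v += kappa * (theta - vp) * dt + xi * np.sqrt(vp * dt) * z2
    return s, np.maximum(v, 0.0)
```

Each path is independent of the others, which is what makes a particle shower embarrassingly parallel across CPU, GPU and MIC devices.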

Relevance: 20.00%

Abstract:

Volatility is central in options pricing and risk management. It reflects the uncertainty of investors and the inherent instability of the economy. Time series methods are among the most widely applied scientific methods to analyze and predict volatility. Very frequently sampled data contain much valuable information about the different elements of volatility and may ultimately reveal the reasons for time-varying volatility. The use of such ultra-high-frequency data is common to all three essays of the dissertation. The dissertation belongs to the field of financial econometrics. The first essay uses wavelet methods to study the time-varying behavior of scaling laws and long memory in the five-minute volatility series of Nokia on the Helsinki Stock Exchange around the burst of the IT bubble. The essay is motivated by earlier findings which suggest that different scaling laws may apply to intraday time-scales and to larger time-scales, implying that the so-called annualized volatility depends on the data sampling frequency. The empirical results confirm the appearance of time-varying long memory and different scaling laws that, for a significant part, can be attributed to investor irrationality and to an intraday volatility periodicity called the New York effect. The findings have potentially important consequences for options pricing and risk management, which commonly assume constant memory and scaling. The second essay investigates modelling the duration between trades in stock markets. Durations convey information about investor intentions and provide an alternative view of volatility. Generalizations of standard autoregressive conditional duration (ACD) models are developed to meet needs observed in previous applications of the standard models.
According to the empirical results, based on data on actively traded stocks on the New York Stock Exchange and the Helsinki Stock Exchange, the proposed generalization clearly outperforms the standard models and also performs well in comparison to another recently proposed alternative to the standard models. The distribution used to derive the generalization may also prove valuable in other areas of risk management. The third essay studies empirically the effect of decimalization on volatility and market microstructure noise. Decimalization refers to the change from fractional pricing to decimal pricing; it was carried out on the New York Stock Exchange in January 2001. The methods used here are more accurate than in the earlier studies and put more weight on market microstructure. The main result is that decimalization decreased observed volatility by reducing noise variance, especially for the highly active stocks. The results help risk management and market mechanism design.
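The first essay's point that annualized volatility depends on the scaling law can be made concrete with a toy helper (the name and interface are ours). A Hurst-type exponent H = 0.5 recovers the usual square-root-of-time rule, while H != 0.5 makes the annualized figure depend on the sampling frequency:

```python
import math

def annualized_vol(sigma_dt, periods_per_year, hurst=0.5):
    """Scale per-period volatility to an annual figure under the
    scaling law sigma(T) = sigma_dt * (T / dt)**hurst."""
    return sigma_dt * periods_per_year ** hurst
```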

Relevance: 20.00%

Abstract:

This paper introduces an index of tax optimality that measures the distance of some current tax structure from the optimal tax structure in the presence of public goods. This index is defined on the [0, 1] interval and measures the proportion of the optimal tax rates that will achieve the same welfare outcome as some arbitrarily given initial tax structure. We call this number the Tax Optimality Index. We also show how the basic methodology can be altered to derive a revenue equivalent uniform tax, which measures the tax burden implied by the public sector. A numerical example is used to illustrate the method developed, and extensions of the analysis to handle models with multiple households and nonlinear taxation structures are undertaken.
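The definition above, the proportion theta of the optimal tax rates that reproduces the welfare of the initial structure, can be sketched with bisection along the ray of optimal rates. The welfare function, the two-good rate vectors and all numbers here are illustrative assumptions, not the paper's model:

```python
def tax_optimality_index(welfare, t_opt, t_init, tol=1e-10):
    """Bisect for theta in [0, 1] with welfare(theta * t_opt) equal to
    welfare(t_init); assumes welfare increases in theta along the ray."""
    target = welfare(t_init)
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if welfare([mid * t for t in t_opt]) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Toy concave welfare peaking at the (assumed) optimal rates t_opt.
t_opt = [0.3, 0.5]
welfare = lambda t: -sum((a - b) ** 2 for a, b in zip(t, t_opt))
```

With initial rates exactly half the optimal ones, the symmetry of the quadratic puts the index at 0.5.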

Relevance: 20.00%

Abstract:

An acyclic edge coloring of a graph is a proper edge coloring such that there are no bichromatic cycles. The acyclic chromatic index of a graph G is the minimum number k such that there is an acyclic edge coloring using k colors, and is denoted by a'(G). It was conjectured by Alon, Sudakov and Zaks (and earlier by Fiamcik) that a'(G) <= Delta + 2, where Delta = Delta(G) denotes the maximum degree of the graph. Alon et al. also raised the question whether the complete graphs of even order are the only regular graphs which require Delta + 2 colors to be acyclically edge colored. In this article, using a simple counting argument, we observe not only that this is not true, but in fact that all d-regular graphs with 2n vertices and d > n require at least d + 2 colors. We also show that a'(K_{n,n}) >= n + 2 when n is odd, using a more non-trivial argument. (Here K_{n,n} denotes the complete bipartite graph with n vertices on each side.) This lower bound for K_{n,n} can be shown to be tight for some families of complete bipartite graphs and for small values of n. We also infer that for every d, n such that d >= 5, n >= 2d + 3 and dn is even, there exist d-regular graphs which require at least d + 2 colors to be acyclically edge colored. (C) 2009 Wiley Periodicals, Inc. J Graph Theory 63: 226-230, 2010.
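For very small graphs the definitions above can be checked by brute force. The sketch below (ours, not from the article) enumerates proper edge colorings and rejects any containing a bichromatic cycle; running it on K4 illustrates the even-order behaviour cited above, since 4 colors (Delta + 1) fail while 5 (Delta + 2) suffice.

```python
from itertools import combinations, product

def _has_cycle(edges, n):
    # Union-find: an undirected edge set has a cycle iff some edge
    # joins two vertices already in the same component.
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for u, v in edges:
        ru, rv = find(u), find(v)
        if ru == rv:
            return True
        parent[ru] = rv
    return False

def acyclic_edge_colorable(n, edges, k):
    """Brute force: does the graph (V = range(n), edges) admit a proper
    edge coloring with k colors in which no two color classes contain a cycle?"""
    m = len(edges)
    for coloring in product(range(k), repeat=m):
        # proper: edges sharing a vertex must get distinct colors
        if any(coloring[i] == coloring[j] and set(edges[i]) & set(edges[j])
               for i, j in combinations(range(m), 2)):
            continue
        # acyclic: every union of two color classes must be a forest
        if all(not _has_cycle([edges[i] for i in range(m)
                               if coloring[i] in (c1, c2)], n)
               for c1, c2 in combinations(range(k), 2)):
            return True
    return False

K4 = [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3)]
```

The exhaustive search is only feasible for a handful of edges, but it makes the "no bichromatic cycle" condition completely explicit.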

Relevance: 20.00%

Abstract:

Atmospheric aerosol particles affect the global climate as well as human health. In this thesis, formation of nanometer-sized atmospheric aerosol particles and their subsequent growth was observed to occur all around the world. Typical formation rates of 3 nm particles varied from 0.01 to 10 cm^-3 s^-1. Formation rates one order of magnitude higher were detected in urban environments. The highest formation rates, up to 10^5 cm^-3 s^-1, were detected in coastal areas and in industrial pollution plumes. Subsequent growth rates varied from 0.01 to 20 nm h^-1. The smallest growth rates were observed in polar areas and the largest in polluted urban environments. This was probably due to competition between growth by condensation and loss by coagulation. Observed growth rates were used in the calculation of a proxy condensable vapour concentration and its source rate in vastly different environments, from pristine Antarctica to polluted India. Estimated concentrations varied by only two orders of magnitude, but the source rates for the vapours varied by up to four orders of magnitude. The highest source rates were in New Delhi and the lowest in Antarctica. Indirect methods were applied to study the growth of freshly formed particles in the atmosphere. A newly developed Water Condensation Particle Counter, TSI 3785, was also found to be a potential candidate to detect water solubility and thus, indirectly, the composition of atmospheric ultra-fine particles. Based on indirect methods, the relative roles of sulphuric acid, non-volatile material and coagulation were investigated in rural Melpitz, Germany. Condensation of non-volatile material explained 20-40% of the growth, and sulphuric acid most of the remainder, up to the point when the nucleation mode reached 10 to 20 nm in diameter. Coagulation contributed typically less than 5%. Furthermore, hygroscopicity measurements were applied to detect the contribution of water-soluble and insoluble components in Athens.
During more polluted days, the water-soluble components contributed more to the growth. Under less anthropogenic influence, non-soluble compounds explained a larger fraction of the growth. In addition, long-range transport in a relatively polluted air mass to a measurement station in Finland was found to affect the hygroscopicity of the particles. This aging could have implications for cloud formation far away from the pollution sources.