992 results for "Statistical index"


Relevance:

20.00%

Publisher:

Abstract:

This article describes a maximum likelihood method for estimating the parameters of the standard square-root stochastic volatility model and a variant of the model that includes jumps in equity prices. The model is fitted to data on the S&P 500 Index and the prices of vanilla options written on the index, for the period 1990 to 2011. The method is able to estimate both the parameters of the physical measure (associated with the index) and the parameters of the risk-neutral measure (associated with the options), including the volatility and jump risk premia. The estimation is implemented using a particle filter whose efficacy is demonstrated under simulation. The computational load of this estimation method, which previously has been prohibitive, is managed by the effective use of parallel computing using graphics processing units (GPUs). The empirical results indicate that the parameters of the models are reliably estimated and consistent with values reported in previous work. In particular, both the volatility risk premium and the jump risk premium are found to be significant.
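The particle-filter machinery this abstract describes can be sketched in a few lines. Below is a minimal bootstrap particle filter for an Euler-discretised square-root variance model, with invented parameter values and a simple reflection-at-zero scheme; the jump components, the option data and the GPU parallelism of the paper are all omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumed, not the paper's estimates) for the
# square-root variance process dv = kappa*(theta - v)dt + sigma*sqrt(v)dW
kappa, theta, sigma, mu = 3.0, 0.04, 0.3, 0.05
dt = 1.0/252
T, N = 300, 500  # observations, particles

# Simulate a synthetic daily return series under the model (Euler scheme,
# reflecting the variance at zero to keep it positive)
v = theta
returns = np.empty(T)
for t in range(T):
    returns[t] = (mu - 0.5*v)*dt + np.sqrt(v*dt)*rng.standard_normal()
    v = abs(v + kappa*(theta - v)*dt + sigma*np.sqrt(v*dt)*rng.standard_normal())

def particle_filter_loglik(returns):
    """Bootstrap particle filter estimate of the log-likelihood."""
    particles = np.full(N, theta)  # variance particles v_t
    loglik = 0.0
    for r in returns:
        # weight each particle by the Gaussian density of the observed return
        var = particles*dt
        w = np.exp(-0.5*(r - (mu - 0.5*particles)*dt)**2/var)/np.sqrt(2*np.pi*var)
        loglik += np.log(w.mean())
        # multinomial resampling
        idx = rng.choice(N, size=N, p=w/w.sum())
        particles = particles[idx]
        # propagate the resampled particles through the variance dynamics
        eps = rng.standard_normal(N)
        particles = np.abs(particles + kappa*(theta - particles)*dt
                           + sigma*np.sqrt(particles*dt)*eps)
    return loglik

print(particle_filter_loglik(returns))
```

A maximum likelihood fit would wrap `particle_filter_loglik` in a numerical optimiser over (kappa, theta, sigma, mu); the GPU speed-up the paper reports comes from evaluating the propagation and weighting steps, which are independent across particles, in parallel.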

Using surface charts at 0330 GMT, the movement of the monsoon trough during the months June to September 1990 at two fixed longitudes, namely 79°E and 85°E, is studied. The probability distribution of the trough position shows that the median, mean and mode occur at progressively more northern latitudes, especially at 85°E, with a pronounced mode that is close to the northernmost limit reached by the trough. A spectral analysis of the fluctuating latitudinal position of the trough is carried out using the FFT and the Maximum Entropy Method (MEM). Both methods show significant peaks around 7.5 and 2.6 days, and a less significant one around 40-50 days. The two shorter-period peaks are more prominent at the eastern longitude. MEM shows an additional peak around 15 days. A study of the weather systems that occurred during the season shows them to have a duration of around 3 days and an interval between systems of around 9 days, suggesting a possible correlation with the dominant short periods observed in the spectrum of trough position.
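The FFT half of such a spectral analysis can be illustrated on a synthetic latitude series built to contain the two reported short periods; the 195-day length, amplitudes and noise level are assumptions, chosen so that both periods fit whole numbers of cycles (MEM is not shown).

```python
import numpy as np

# Synthetic daily trough-latitude series containing the two dominant
# short periods reported above (7.5 and 2.6 days) plus noise.
rng = np.random.default_rng(1)
days = np.arange(195)
lat = (25.0 + 1.5*np.sin(2*np.pi*days/7.5)
       + 0.8*np.sin(2*np.pi*days/2.6)
       + 0.3*rng.standard_normal(days.size))

# Periodogram via the FFT (mean removed)
spec = np.abs(np.fft.rfft(lat - lat.mean()))**2
freqs = np.fft.rfftfreq(days.size, d=1.0)  # cycles per day

# Recover the two strongest spectral peaks as periods in days
order = np.argsort(spec[1:])[::-1] + 1     # skip the zero frequency
top_periods = sorted(1.0/freqs[order[:2]])
print(np.round(top_periods, 2))            # -> [2.6 7.5]
```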

The main objective of statistical analysis of experimental investigations is to make predictions on the basis of mathematical equations while keeping the number of experiments to a minimum. Abrasive jet machining (AJM) is an unconventional and novel machining process wherein micro-abrasive particles are propelled at high velocities onto a workpiece. The resulting erosion can be used for cutting, etching, cleaning, deburring, drilling and polishing. In the study completed by the authors, statistical design of experiments was successfully employed to predict the rate of material removal by AJM. This paper discusses the details of such an approach and the findings.
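A minimal sketch of the design-of-experiments idea: fit a first-order prediction equation to a two-level full factorial design. The factor names (pressure, abrasive flow rate, nozzle-to-work distance), the coefficients and the response values are all hypothetical, not taken from the study.

```python
import numpy as np
from itertools import product

# 2^3 full factorial design in coded units (-1/+1) for three
# hypothetical AJM factors: pressure, abrasive flow, nozzle distance.
X = np.array(list(product([-1, 1], repeat=3)), dtype=float)

# Synthetic material removal rate with known main effects plus noise
rng = np.random.default_rng(2)
true_beta = np.array([10.0, 2.0, 1.5, -0.5])  # intercept + main effects
A = np.column_stack([np.ones(len(X)), X])
y = A @ true_beta + 0.1*rng.standard_normal(len(X))

# Least-squares fit of the first-order prediction equation
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
print(np.round(beta, 2))  # close to true_beta
```

Because the two-level factorial design matrix is orthogonal, each coefficient is estimated independently, which is what makes eight runs sufficient for a first-order model in three factors.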

Bentonite, commonly used for liner construction in waste containment systems, possesses many limitations. Illite, or bentonite containing illite, has been proposed as an alternative material for liner construction. The properties of these clays in different types of pore fluid are important for assessing the long-term performance of the liner. Further, illite-bentonite interactions occur and modify these properties. The effect of these interactions is known only when the pore fluid is water; how the properties are modified in electrolyte solutions is brought out in this paper. The index properties have been studied since they give an indication of the engineering properties. Owing to the reduction in the thickness of the diffuse double layer and the consequent particle aggregation in bentonite, the effect of clay-clay interaction is reduced in electrolyte solutions. In electrolyte solutions, the liquid limit, plasticity index and free swell index of bentonite are lower than those of illite. The plasticity index of bentonite is further reduced in KCl solution. Clays with a higher plasticity index perform better in retaining pollutants and reducing permeability. Hence, the presence of both illite and bentonite ensures better performance of the liner in different fluids.

Background: Biomechanical stresses play an important role in determining plaque stability. Quantification of these simulated stresses can potentially be used to assess plaque vulnerability and differentiate patient groups. Methods and Results: 54 asymptomatic and 45 acutely symptomatic patients underwent in vivo multicontrast magnetic resonance imaging (MRI) of the carotid arteries. Plaque geometry used for finite element analysis was derived from in vivo MRI at the sites of maximum and minimum plaque burden. In total, 198 slices were used for the computational simulations. A pre-shrink technique was used to refine the simulation. The maximum principal stress at the vulnerable plaque sites (ie, the critical stress) was extracted for the selected slices and compared between the 2 groups. Critical stress in the slice with maximum plaque burden was significantly higher in acutely symptomatic patients than in asymptomatic patients (median, interquartile range: 198.0 kPa (119.8-359.0 kPa) vs 138.4 kPa (83.8-242.6 kPa), P=0.04). No significant difference was found in the slice with minimum plaque burden between the 2 groups (196.7 kPa (133.3-282.7 kPa) vs 182.4 kPa (117.2-310.6 kPa), P=0.82). Conclusions: Acutely symptomatic carotid plaques have significantly higher biomechanical stresses than asymptomatic plaques. This might be useful for establishing a biomechanical risk stratification criterion based on plaque burden in future studies.

The export of sediments from coastal catchments can have detrimental impacts on estuaries and nearshore reef ecosystems such as the Great Barrier Reef. Catchment management approaches aimed at reducing sediment loads require monitoring to evaluate their effectiveness in reducing loads over time. However, load estimation is not a trivial task, due to the complex behaviour of constituents in natural streams, the variability of water flows, and the often limited amount of data. Regression is commonly used for load estimation and provides a fundamental tool for trend estimation by standardising the other time-specific covariates, such as flow. This study investigates whether load estimates and the resultant power to detect trends can be enhanced by (i) modelling the error structure so that temporal correlation can be better quantified, (ii) making use of predictive variables, and (iii) identifying an efficient and feasible sampling strategy that may be used to reduce sampling error. To achieve this, we propose a new regression model that includes an innovative compounding-errors model structure and uses two additional predictive variables (average discounted flow and turbidity). By combining this modelling approach with a new, regularly optimised sampling strategy, which adds uniformity to the event sampling strategy, the predictive power was increased to 90%. Using the enhanced regression model proposed here, it was possible to detect a trend of 20% over 20 years. This result is in stark contrast to previous conclusions presented in the literature. (C) 2014 Elsevier B.V. All rights reserved.

We consider the development of statistical models for prediction of the constituent concentration of riverine pollutants, which is a key step in load estimation from frequent flow-rate data and less frequently collected concentration data. We consider how to capture the impacts of past flow patterns via the average discounted flow (ADF), which discounts the past flux based on the time elapsed - more recent fluxes are given more weight. However, the effectiveness of the ADF depends critically on the choice of the discount factor, which reflects the unknown environmental cumulating process of the concentration compounds. We propose to choose the discount factor by maximizing the adjusted R² value or the Nash-Sutcliffe model efficiency coefficient. The R² values are adjusted to take account of the number of parameters in the model fit. The resulting optimal discount factor can be interpreted as a measure of the constituent exhaustion rate during flood events. To evaluate the performance of the proposed regression estimators, we examine two different sampling scenarios by resampling fortnightly and opportunistically from two real daily datasets, which come from two United States Geological Survey (USGS) gaging stations located in the Des Plaines and Illinois River basins. The generalized rating-curve approach produces biased estimates of the total sediment loads, by -30% to 83%, whereas the new approaches produce much lower biases, ranging from -24% to 35%. This substantial improvement in the estimates of the total load is due to the fact that the predictability of concentration is greatly improved by the additional predictors.
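One plausible rendering of the ADF construction and the discount-factor search, assuming exponential discounting and a simple log-log regression; the paper's actual model form is richer and is not reproduced here.

```python
import numpy as np

def average_discounted_flow(flow, delta):
    """Exponentially discounted average of past flows:
    ADF_t = delta*ADF_{t-1} + (1-delta)*flow_t.
    One common formulation; the paper's exact form is not given here."""
    adf = np.empty(len(flow))
    s = flow[0]
    for t, q in enumerate(flow):
        s = delta*s + (1.0 - delta)*q
        adf[t] = s
    return adf

def choose_delta(flow, conc, deltas):
    """Pick the discount factor maximizing adjusted R^2 of a simple
    log-log regression of concentration on flow and ADF."""
    best = None
    for d in deltas:
        X = np.column_stack([np.ones_like(flow), np.log(flow),
                             np.log(average_discounted_flow(flow, d))])
        y = np.log(conc)
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        rss = np.sum((y - X @ beta)**2)
        tss = np.sum((y - y.mean())**2)
        n, p = len(y), X.shape[1]
        adj_r2 = 1 - (rss/(n - p))/(tss/(n - 1))
        if best is None or adj_r2 > best[1]:
            best = (d, adj_r2)
    return best

# Synthetic demonstration: concentration generated with delta = 0.8
rng = np.random.default_rng(3)
flow = np.exp(rng.standard_normal(300)*0.5 + 2)
conc = (flow**0.6 * average_discounted_flow(flow, 0.8)**(-0.3)
        * np.exp(0.05*rng.standard_normal(300)))
best = choose_delta(flow, conc, np.linspace(0.5, 0.95, 10))
print(best)  # the grid value nearest the generating delta should win
```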

Power calculation and sample size determination are critical in designing environmental monitoring programs. The traditional approach based on comparing mean values may become statistically inappropriate, and even invalid, when substantial proportions of the response values are below the detection limits or censored, because strong distributional assumptions have to be made about the censored observations when implementing the traditional procedures. In this paper, we propose a quantile methodology that is robust to outliers and can also handle data with a substantial proportion of below-detection-limit observations without the need to impute the censored values. As a demonstration, we applied the methods to a nutrient monitoring project, which is part of the Perth Long-Term Ocean Outlet Monitoring Program. In this example, the sample size required by our quantile methodology is, in fact, smaller than that required by the traditional t-test, illustrating the merit of our method.
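The power-calculation idea can be sketched by Monte Carlo, here with a rank-based test standing in for the authors' quantile methodology (an assumption, not their procedure): censored values are set to the detection limit and the rejection rate is estimated by simulation.

```python
import numpy as np

rng = np.random.default_rng(4)

def ranksum_reject(x, y, z_crit=1.96):
    """Two-sided Wilcoxon rank-sum test with midranks for ties
    (normal approximation, no tie correction -- a sketch only)."""
    z = np.concatenate([x, y])
    order = np.argsort(z, kind="stable")
    ranks = np.empty(len(z))
    zs = z[order]
    i = 0
    while i < len(z):           # assign midranks to tied blocks
        j = i
        while j < len(z) and zs[j] == zs[i]:
            j += 1
        ranks[order[i:j]] = 0.5*(i + j + 1)
        i = j
    n1, n2 = len(x), len(y)
    w = ranks[:n1].sum()        # rank sum of the first sample
    mean = n1*(n1 + n2 + 1)/2.0
    sd = np.sqrt(n1*n2*(n1 + n2 + 1)/12.0)
    return abs(w - mean)/sd > z_crit

def power(shift, n=40, sims=400, dl=np.exp(-0.5)):
    """Monte Carlo rejection rate for lognormal data censored at a
    detection limit dl (censored values set to dl, creating ties)."""
    hits = 0
    for _ in range(sims):
        x = np.maximum(np.exp(rng.standard_normal(n)), dl)
        y = np.maximum(np.exp(rng.standard_normal(n) + shift), dl)
        hits += ranksum_reject(x, y)
    return hits/sims

size, pwr = power(0.0), power(0.7)
print(size, pwr)  # size near the nominal 0.05, power well above it
```

Sample size determination then amounts to repeating the simulation over a grid of `n` until the estimated power reaches the target level.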

As part of the development of the ASEAN Regional Road Safety Strategy, a new index for measuring road safety maturity (RSM) was constructed from numerical weightings given to measurable factors presented for each of the pillars that guide national road safety plans and activities in the WHO Global Road Safety Report 2013: road safety management, safer roads and mobility, safer vehicles, safer road users, and post-crash response. The index is based on both a content analysis approach and a binary methodology (report/no report), including measures which have been considered pertinent and not redundant. For instance, the use of random breath testing and/or police checkpoints in the national drink-driving law are combined in the enforcement index. The value of the index per pillar ranges from 0 to 100%, taking into account whether there is total, partial or non-implementation of certain actions. In addition, when possible, the self-rated level of enforcement is included. The overall ratings for the 10 ASEAN countries and the scores for each of the pillars are presented in the paper. The extent to which the RSM index is a valid indicator of road safety performance is also discussed.
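A toy illustration of the binary report/no-report scoring per pillar, with partial implementation scored at half weight; the item names and weights are invented, not those of the actual index.

```python
# Hypothetical per-pillar scoring: each measurable factor is scored
# 1 (full implementation), 0.5 (partial) or 0 (none), and the pillar
# score is the percentage of the maximum attainable.
def pillar_score(items):
    """items: mapping of measure -> 1, 0.5 or 0. Returns a percentage."""
    return 100.0 * sum(items.values()) / len(items)

# Invented example items for the road safety management pillar
road_safety_management = {
    "lead_agency_funded": 1,
    "national_strategy": 1,
    "fatality_reduction_target": 0.5,  # partial implementation
}
print(round(pillar_score(road_safety_management), 1))  # -> 83.3
```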

Statistical methods are often used to analyse commercial catch and effort data to provide standardised fishing effort and/or a relative index of fish abundance for input into stock assessment models. Achieving reliable results has proved difficult in Australia's Northern Prawn Fishery (NPF), due to a combination of such factors as the biological characteristics of the animals, some aspects of the fleet dynamics, and the changes in fishing technology. For this set of data, we compared four modelling approaches (linear models, mixed models, generalised estimating equations, and generalised linear models) with respect to the outcomes of the standardised fishing effort or the relative index of abundance. We also varied the number and form of vessel covariates in the models. Within a subset of data from this fishery, modelling correlation structures did not alter the conclusions from simpler statistical models. The random-effects models also yielded similar results. This is because the estimators are all consistent even if the correlation structure is mis-specified, and the data set is very large. However, the standard errors from different models differed, suggesting that different methods have different statistical efficiency. We suggest that there is value in modelling the variance function and the correlation structure, to make valid and efficient statistical inferences and gain insight into the data. We found that fishing power was separable from the indices of prawn abundance only when we offset the impact of vessel characteristics at assumed values from external sources. This may be due to the large degree of confounding within the data, and the extreme temporal changes in certain aspects of individual vessels, the fleet and the fleet dynamics.

The charge at which adsorption of organic compounds attains a maximum (σ_M^max) at an electrochemical interface is analysed using several multi-state models in a hierarchical manner. The analysis is based on statistical mechanical results for the following models: (A) two-state site parity, (B) two-state multi-site, and (C) three-state site parity. The coulombic interactions due to permanent and induced dipole effects (using the mean field approximation), electrostatic field effects and specific substrate interactions have been taken into account. The simplest model in the hierarchy (two-state site parity) yields the explicit dependence of σ_M^max on the permanent dipole moment, the polarizability of the solvent and the adsorbate, the lattice spacing, the effective coordination number, etc. The other models in the hierarchy bring to light the influence of the solvent structure, the role of substrate interactions, etc. As a result of this approach, the "composition" of σ_M^max in terms of fundamental molecular constants becomes clear. With a view to using these molecular results to maximum advantage, the derived results for σ_M^max have been converted into those involving experimentally observable parameters like C₀, C₁, E_N, etc. Wherever possible, some of the earlier phenomenological relations reported for σ_M^max, notably by Parsons, Damaskin and Frumkin, and Trasatti, are shown to have a certain molecular basis, viz. a simple two-state site parity model. As a corollary to the hierarchical modelling, σ_M^max and the potential corresponding to it (E_max) are shown to be constants independent of θ_max or C_org for all models. The implication of our analysis for σ_M^max with respect to that predicted by the generalized surface layer equation (which postulates σ_M^max and E_max variation with θ) is discussed in detail. Finally, we discuss in passing σ_M^max and the electrosorption valency in this context.

Seven discrete stages and substages of moulting in the ornate rock lobster, Panulirus ornatus, have been distinguished by microscopic examination of the cuticle and setae of the pleopods. The diagnostic features and the duration of each of the stages are described. Freezing did not visually alter the tissue features used to identify each moult stage. Pleopod morphology can reliably indicate whether a lobster has moulted within the previous 24 h or is within 72 h of the next ecdysis.

Many statistical forecast systems are available to interested users. In order to be useful for decision-making, these systems must be based on evidence of underlying mechanisms. Once causal connections between the mechanism and its statistical manifestation have been firmly established, the forecasts must also provide some quantitative evidence of 'quality'. However, the quality of statistical climate forecast systems (forecast quality) is an ill-defined and frequently misunderstood property. Often, providers and users of such forecast systems are unclear about what 'quality' entails and how to measure it, leading to confusion and misinformation. Here we present a generic framework to quantify aspects of forecast quality using an inferential approach to calculate nominal significance levels (p-values) that can be obtained either by directly applying non-parametric statistical tests such as Kruskal-Wallis (KW) or Kolmogorov-Smirnov (KS) or by using Monte Carlo methods (in the case of forecast skill scores). Once converted to p-values, these forecast quality measures provide a means to objectively evaluate and compare temporal and spatial patterns of forecast quality across datasets and forecast systems. Our analysis demonstrates the importance of providing p-values rather than adopting some arbitrarily chosen significance level such as p < 0.05 or p < 0.01, which is still common practice. This is illustrated by applying non-parametric tests (such as KW and KS) and skill-scoring methods (LEPS and RPSS) to the 5-phase Southern Oscillation Index classification system using historical rainfall data from Australia, the Republic of South Africa and India. The selection of quality measures is based solely on their common use and does not constitute endorsement. We found that non-parametric statistical tests can be adequate proxies for skill measures such as LEPS or RPSS. The framework can be implemented anywhere, regardless of dataset, forecast system or quality measure. Eventually, such inferential evidence should be complemented by descriptive statistical methods in order to fully assist in operational risk management.
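The Monte Carlo route to a p-value can be sketched with a permutation test on a synthetic 5-phase classification; the skill statistic used here (the spread of phase medians) is a simple stand-in, not LEPS or RPSS, and all data are invented.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic rainfall observations grouped into 5 forecast phases.
# A phase effect is injected so that real skill exists.
rain = rng.gamma(2.0, 30.0, size=500)
phase = rng.integers(0, 5, size=500)
rain[phase == 0] *= 1.6

def skill(rain, phase):
    """Stand-in skill statistic: spread of the per-phase medians."""
    medians = [np.median(rain[phase == k]) for k in range(5)]
    return np.ptp(medians)

# Permutation (Monte Carlo) null distribution: shuffle the phase labels,
# recompute the statistic, and count how often it beats the observed one.
observed = skill(rain, phase)
perms = np.array([skill(rain, rng.permutation(phase)) for _ in range(999)])
p_value = (1 + np.sum(perms >= observed)) / (999 + 1)
print(p_value)
```

The same scheme applies to any skill score: replace `skill` with the score of interest and the permutation p-value quantifies how surprising the observed skill is under no association between phases and rainfall.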

This study investigated whether mixed-species designs can increase the growth of a tropical eucalypt when compared to monocultures. Monocultures of Eucalyptus pellita (E) and Acacia peregrina (A) and mixtures in various proportions (75E:25A, 50E:50A, 25E:75A) were planted in a replacement series design on the Atherton Tablelands of north Queensland, Australia. High mortality in the establishment phase, due to repeated damage by tropical cyclones, altered the trial design. Effects of the planting designs on tree growth were estimated using a linear mixed-effects model with restricted maximum likelihood (REML) analysis. Volume growth of individual eucalypt trees was positively affected by the presence of acacia trees at age 5 years, and this effect generally increased with time up to age 10 years. However, stand volume and basal area increased with increasing proportions of E. pellita, owing to its larger individual tree size. Conventional analysis did not offer convincing support for mixed-species designs. Preliminary individual-based modelling using a modified Hegyi competition index offered a solution, and an equation indicating that the acacias exert positive ecological interactions (facilitation or competitive reduction) and do not compete in the way a eucalypt does. These results suggest that significant increases in growth rates could be achieved with mixed-species designs. This statistical methodology could enable a better understanding of species interactions in similarly altered experiments, or in undesigned mixed-species plantations.
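The classical Hegyi index on which the modified competition index is based can be computed directly; the paper's modification (species-specific terms) is not specified here, and the example trees are invented.

```python
import numpy as np

def hegyi(diam, xy, radius=10.0):
    """Classical distance-weighted Hegyi competition index:
    CI_i = sum over neighbours j of (d_j / d_i) / dist_ij,
    for neighbours within the search radius."""
    n = len(diam)
    ci = np.zeros(n)
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            dist = np.hypot(*(xy[i] - xy[j]))
            if 0 < dist <= radius:
                ci[i] += (diam[j] / diam[i]) / dist
    return ci

# Tiny invented example: a small eucalypt between two larger acacias
diam = np.array([15.0, 30.0, 30.0])                 # dbh, cm
xy = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 3.0]])  # positions, m
print(np.round(hegyi(diam, xy), 3))  # -> [1.167 0.325 0.367]
```

The small tree carries the largest index because its neighbours are both larger and close, which is exactly the asymmetry an individual-based competition analysis exploits.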

Forty-four study sites were established in remnant woodland in the Burdekin River catchment in tropical north-east Queensland, Australia, to assess recent (decadal) vegetation change. The aim of this study was to evaluate further whether wide-scale vegetation 'thickening' (proliferation of woody plants in formerly more open woodlands) had occurred during the last century, coinciding with significant changes in land management. Soil samples from several depth intervals were size-separated into different soil organic carbon (SOC) fractions, which differed from one another in chemical composition and turnover time. Tropical (C4) grasses dominate in the Burdekin catchment, and thus δ13C analyses of SOC fractions with different turnover times can be used to assess whether the relative proportion of trees (C3) and grasses (C4) had changed over time. However, a method was required to permit standardized assessment of the δ13C data for the individual sites within the 13 Mha catchment, which varied in soil and vegetation characteristics. Thus, an index was developed using data from three detailed study sites and the global literature to standardize individual isotopic data from different soil depths and SOC fractions so as to reflect only the changed proportion of trees (C3) to grasses (C4) over decadal timescales. When applied to the 44 individual sites distributed throughout the Burdekin catchment, 64% of the sites were shown to have experienced decadal vegetation thickening, while 29% had remained stable and the remaining 7% had thinned. Thus, the development of this index enabled regional-scale assessment and comparison of decadal vegetation patterns without having to rely on prior knowledge of vegetation changes or aerial photography.
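The δ13C logic rests on two-end-member mixing, which can be written down directly; the end-member values below are typical literature figures for C3 and C4 vegetation, not the site-specific values used to build the paper's index.

```python
# Two-end-member mixing: the fraction of soil organic carbon derived
# from C3 (woody) vegetation, given a measured delta13C value and
# assumed end-members (per mil, typical literature figures).
def c3_fraction(delta_sample, delta_c3=-27.0, delta_c4=-13.0):
    return (delta_sample - delta_c4) / (delta_c3 - delta_c4)

# A sample midway between the end-members is half C3-derived
print(round(c3_fraction(-20.0), 2))  # -> 0.5
```

Comparing this fraction between fast- and slow-turnover SOC fractions at a site is what lets the isotopic record distinguish recent thickening from a stable tree-grass balance.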