30 results for Monte Carlo Simulation

in Deakin Research Online - Australia


Relevance:

100.00%

Publisher:

Abstract:

This paper compares the credit risk profiles of two types of model: the Monte Carlo model used in the existing literature, and the Cox, Ingersoll and Ross (CIR) model. Each profile has a concave or hump-backed shape, reflecting the amortisation and diffusion effects; however, the CIR model generates significantly different results. In addition, we consider the sensitivity of these credit risk models to the initial interest rate, volatility, maturity, kappa and delta. The results show that the sensitivities vary across the models, and we explore the meaning of that variation.
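
The CIR short-rate dynamics referred to above can be simulated with a standard Euler scheme. The sketch below is a generic illustration, not the paper's implementation; all parameter values (r0, kappa, theta, sigma) are hypothetical, and "full truncation" is one common way to keep the square-root diffusion well defined:

```python
import numpy as np

def simulate_cir(r0, kappa, theta, sigma, T, n_steps, n_paths, seed=0):
    """Euler scheme with full truncation for the CIR short-rate model:
    dr = kappa*(theta - r)*dt + sigma*sqrt(r)*dW."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    rates = np.full((n_paths, n_steps + 1), r0, dtype=float)
    for t in range(n_steps):
        r = np.maximum(rates[:, t], 0.0)  # truncate so sqrt stays real
        dW = rng.normal(0.0, np.sqrt(dt), n_paths)
        rates[:, t + 1] = rates[:, t] + kappa * (theta - r) * dt + sigma * np.sqrt(r) * dW
    return rates

# Hypothetical parameters: start at the long-run mean theta = 5%
paths = simulate_cir(r0=0.05, kappa=0.5, theta=0.05, sigma=0.1, T=5.0,
                     n_steps=500, n_paths=2000)
```

Because the drift pulls each path back toward theta, the cross-sectional mean of the simulated rates stays near 5% while individual paths diffuse around it, which is what drives the hump-backed credit risk profiles the abstract describes.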

Relevance:

80.00%

Publisher:

Abstract:

Objective: Antidepressant drugs and cognitive–behavioural therapy (CBT) are effective treatment options for depression and are recommended by clinical practice guidelines. As part of the Assessing Cost-effectiveness – Mental Health project we evaluate the available evidence on costs and benefits of CBT and drugs in the episodic and maintenance treatment of major depression.

Method: The cost-effectiveness is modelled from a health-care perspective as the cost per disability-adjusted life year. Interventions are targeted at people with major depression who currently seek care but receive non-evidence based treatment. Uncertainty in model inputs is tested using Monte Carlo simulation methods.

Results: All interventions for major depression examined have a favourable incremental cost-effectiveness ratio under Australian health service conditions. Bibliotherapy, group CBT, individual CBT by a psychologist on a public salary and tricyclic antidepressants (TCAs) are very cost-effective treatment options falling below $A10 000 per disability-adjusted life year (DALY) even when taking the upper limit of the uncertainty interval into account. Maintenance treatment with selective serotonin re-uptake inhibitors (SSRIs) is the most expensive option (ranging from $A17 000 to $A20 000 per DALY) but still well below $A50 000, which is considered the affordable threshold.

Conclusions: A range of cost-effective interventions for episodes of major depression exists and is currently underutilized. Maintenance treatment strategies are required to significantly reduce the burden of depression, but the cost of long-term drug treatment for the large number of depressed people is high if SSRIs are the drug of choice. Key policy issues with regard to expanded provision of CBT concern the availability of suitably trained providers and the funding mechanisms for therapy in primary care.
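
The Monte Carlo treatment of uncertainty described above can be sketched as follows. Every distribution and parameter value here is an illustrative placeholder, not an input from the ACE-Mental Health study; the point is only the mechanics of propagating input uncertainty into an uncertainty interval for a cost-effectiveness ratio:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

# Hypothetical inputs: incremental cost (gamma: non-negative, right-skewed)
# and incremental DALYs averted (normal, truncated away from zero so the
# ratio stays stable).
incr_cost = rng.gamma(shape=20.0, scale=400.0, size=n)          # mean ~ $8,000
dalys_averted = np.maximum(rng.normal(1.0, 0.2, size=n), 0.1)

icer = incr_cost / dalys_averted                                # $ per DALY averted
lo, hi = np.percentile(icer, [2.5, 97.5])
```

Reporting the 2.5th and 97.5th percentiles of the simulated ratio gives the kind of uncertainty interval quoted in the abstract (e.g. "$A17 000 to $A20 000 per DALY").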

Relevance:

80.00%

Publisher:

Abstract:

Objective: To evaluate whether the introduction of a national, co-ordinated screening program using the faecal occult blood test represents 'value-for-money' from the perspective of the Australian Government as third-party funder.

Methods: The annual equivalent costs and consequences of a biennial screening program in 'steady-state' operation were estimated for the Australian population using 1996 as the reference year. Disability-adjusted life years (DALYs) and the years of life lost (YLLs) averted, and the health service costs, were modelled based on the epidemiology and the costs of colorectal cancer in Australia together with the mortality reduction achieved in randomised controlled trials. Uncertainty in the model was examined using Monte Carlo simulation methods.

Results: We estimate a minimum or 'base program' of screening those aged 55 to 69 years could avert 250 deaths per annum (95% uncertainty interval 99–400), at a gross cost of $A55 million (95% UI $A46 million to $A96 million) and a gross incremental cost-effectiveness ratio of $A17,000/DALY (95% UI $A13,000/DALY to $A52,000/DALY). Extending the program to include 70 to 74-year-olds is a more effective option (cheaper and higher health gain) than including the 50 to 54-year-olds.

Conclusions: The findings of this study support the case for a national program directed at the 55 to 69-year-old age group, with extension to 70 to 74-year-olds if there are sufficient resources. The pilot tests recently announced in Australia provide an important opportunity to consider the age range for screening and the sources of uncertainty identified in the modelled evaluation, to assist decisions on implementing a full national program.
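
The "deaths averted with a 95% uncertainty interval" style of result can be illustrated with a toy simulation. All numbers below are invented placeholders (baseline deaths, mortality reduction, uptake), not the study's data; the sketch only shows how parameter uncertainty turns into an interval around a central estimate:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000

# Hypothetical model: deaths averted = baseline deaths x mortality reduction x uptake
baseline_deaths = 1500                                           # invented annual figure
mortality_reduction = rng.normal(0.16, 0.04, n).clip(0.0, 1.0)   # trial-style estimate
participation = rng.uniform(0.4, 0.7, n)                         # screening uptake

deaths_averted = baseline_deaths * mortality_reduction * participation
lo, med, hi = np.percentile(deaths_averted, [2.5, 50, 97.5])
```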

Relevance:

80.00%

Publisher:

Abstract:

The performance of different information criteria - namely Akaike (AIC), corrected Akaike (AICC), Schwarz-Bayesian (SBC), and Hannan-Quinn (HQ) - is investigated so as to choose the optimal lag length in stable and unstable vector autoregressive (VAR) models, both when autoregressive conditional heteroscedasticity (ARCH) is present and when it is not. The investigation covers both large and small sample sizes. The Monte Carlo simulation results show that SBC has relatively better lag-choice accuracy in many situations. It is also generally the least sensitive to ARCH regardless of the stability or instability of the VAR model, especially in large sample sizes. These appealing properties make SBC the optimal criterion for choosing lag length in many situations, especially for financial data, which are usually characterized by occasional periods of high volatility. SBC also has the best forecasting abilities in the majority of situations in which we vary sample size, stability, variance structure (ARCH or not), and forecast horizon (one period or five). Frequently, AICC also has good lag-choosing and forecasting properties. However, when ARCH is present, the five-period forecast performance of all criteria worsens in all situations.
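
The lag-selection trade-off these criteria embody (goodness of fit versus a penalty on parameters) can be shown in a univariate sketch. The formulas are the standard AIC/SBC/HQ expressions for an OLS-fitted AR(p); the study itself works with VAR models, and the simulated AR(2) series here is purely illustrative:

```python
import numpy as np

def ar_lag_criteria(y, max_lag):
    """Fit AR(p) by OLS for p = 1..max_lag and return AIC, SBC (BIC)
    and Hannan-Quinn values on a common estimation sample."""
    T_full = len(y)
    T = T_full - max_lag            # same sample for all p, so criteria are comparable
    Y = y[max_lag:]
    results = {}
    for p in range(1, max_lag + 1):
        X = np.column_stack([np.ones(T)] +
                            [y[max_lag - j:T_full - j] for j in range(1, p + 1)])
        beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
        resid = Y - X @ beta
        sigma2 = resid @ resid / T
        k = p + 1                   # parameters including the intercept
        results[p] = {
            "AIC": np.log(sigma2) + 2 * k / T,
            "SBC": np.log(sigma2) + np.log(T) * k / T,
            "HQ":  np.log(sigma2) + 2 * np.log(np.log(T)) * k / T,
        }
    return results

# Simulate an AR(2) process and let each criterion pick a lag
rng = np.random.default_rng(0)
y = np.zeros(500)
for t in range(2, 500):
    y[t] = 0.5 * y[t - 1] + 0.3 * y[t - 2] + rng.normal()
crit = ar_lag_criteria(y, max_lag=6)
best_sbc = min(crit, key=lambda p: crit[p]["SBC"])
```

Because ln(T) > 2 for any realistic sample, SBC's penalty is always the heaviest, which is why it tends to choose shorter (more parsimonious) lags than AIC.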

Relevance:

80.00%

Publisher:

Abstract:

We investigate the effectiveness of several well-known parametric and non-parametric event study test statistics with security price data from the major Asia-Pacific security markets. Extensive Monte Carlo simulation experiments with actual daily security returns data reveal that the parametric test statistics are prone to misspecification with Asia-Pacific returns data. Two non-parametric tests, the rank test of Corrado and Zivney (1992, "The specification and power of the sign test in event study hypothesis tests using daily stock returns", Journal of Financial and Quantitative Analysis 27(3), 465–478) and the sign test of Cowan (1992, "Non-parametric event study tests", Review of Quantitative Finance and Accounting 1(4), 343–358), were the best performers overall, with market model excess returns computed using an equal weight index.
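
A simplified sketch of a Corrado-style rank statistic for one event day is shown below. It assumes a complete panel with no missing observations and omits the adjustments of the published tests; the simulated abnormal returns are random placeholders with no event effect, so the statistic should behave like an approximately standard normal variate:

```python
import numpy as np

def corrado_rank_stat(excess_returns, event_index):
    """Rank-test statistic for a single event day.
    excess_returns: (N securities x L days) abnormal returns incl. the event day."""
    N, L = excess_returns.shape
    # Rank each security's returns through time (1 = smallest)
    ranks = excess_returns.argsort(axis=1).argsort(axis=1) + 1.0
    dev = ranks - (L + 1) / 2.0                 # deviation from the expected rank
    mean_dev = dev.mean(axis=0)                 # cross-sectional mean each day
    s_k = np.sqrt(np.mean(mean_dev ** 2))       # its time-series standard deviation
    return mean_dev[event_index] / s_k

rng = np.random.default_rng(7)
ar = rng.normal(0.0, 0.02, size=(50, 250))      # 50 securities, 250 days, no event
stat = corrado_rank_stat(ar, event_index=249)
```

Because ranks are distribution-free, this kind of test keeps its size even when raw returns are skewed or fat-tailed, which is why the non-parametric tests performed best on Asia-Pacific data.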

Relevance:

80.00%

Publisher:

Abstract:

Purpose – The purpose of this paper is to estimate the cost of granting executive stocks with strike prices adjusted by the cost of capital.

Design/methodology/approach – In the paper a Monte Carlo simulation approach developed in Longstaff and Schwartz is used in conjunction with the subjective valuation model developed in Ingersoll to value these executive stock options that are subject to performance hurdles.

Findings – The paper finds that standard European Black-Scholes-Merton option values overstate the true cost to the firm of granting these executive stock options. The option values also decrease with a higher dividend yield, a higher performance hurdle, a longer vesting period, and a shorter maturity.

Research limitations/implications – While the study in the paper is limited to the valuation of executive options, the methodology can be used to study incentive effects of executive stock options that have a performance hurdle.

Practical implications – The approach used in this paper to estimate the cost of granting executive stock options is a clear improvement over standard European option pricing approaches that often result in biased estimates.

Originality/value – This paper presents a first attempt to integrate the Ingersoll utility-theoretic model and the Longstaff and Schwartz least squares Monte Carlo algorithm to estimate the subjective value and the objective cost of executive stock options with a performance hurdle. This valuation approach will be useful in the study of other types of executive compensation.
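
The least squares Monte Carlo machinery underlying the valuation can be sketched for a plain American put. This is the generic textbook algorithm only: it omits the performance hurdle, vesting period, dividend yield and subjective-valuation features of the paper's executive-option model, and all parameter values are illustrative:

```python
import numpy as np

def lsm_american_put(S0, K, r, sigma, T, n_steps, n_paths, seed=0):
    """Longstaff-Schwartz least squares Monte Carlo for an American put."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    disc = np.exp(-r * dt)
    # Simulate geometric Brownian motion paths
    z = rng.normal(size=(n_paths, n_steps))
    S = S0 * np.exp(np.cumsum((r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z,
                              axis=1))
    cash = np.maximum(K - S[:, -1], 0.0)        # payoff if held to maturity
    for t in range(n_steps - 2, -1, -1):
        cash *= disc                            # discount continuation value to step t
        itm = K - S[:, t] > 0                   # regress on in-the-money paths only
        if itm.sum() < 3:
            continue
        x = S[itm, t]
        # Regress continuation values on a quadratic basis in the stock price
        coeff = np.polyfit(x, cash[itm], deg=2)
        continuation = np.polyval(coeff, x)
        exercise = K - x
        ex_now = exercise > continuation
        idx = np.where(itm)[0][ex_now]
        cash[idx] = exercise[ex_now]            # exercise where it beats continuing
    return disc * cash.mean()

price = lsm_american_put(S0=100, K=100, r=0.05, sigma=0.2, T=1.0,
                         n_steps=50, n_paths=20_000)
```

The regression step is the key idea: it estimates the conditional expectation of continuing, so early-exercise (or, in the paper's setting, hurdle and vesting) features can be handled path by path without a lattice.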

Relevance:

80.00%

Publisher:

Abstract:

The goal of this paper is to examine whether the volatility of the growth in US oil stocks has changed over time, and if it has, whether or not this change is real. We find that the growth volatility of oil stocks has declined over time. We conduct a Monte Carlo simulation exercise to investigate whether this decline is real or an artefact of the growth definition. Our findings indicate that the decline in the growth volatility of oil stocks is an artefact of the growth definition. This is because a data generating process having a unit root with drift has a tendency to grow, and thereby pulls the variance of growth down with time.
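
The mechanism in the final sentence is easy to reproduce. In the toy simulation below (parameters are arbitrary), the level follows a unit root with drift, so it trends upward while the shocks stay the same size; the percentage growth rate therefore has shrinking variance even though nothing "real" has changed:

```python
import numpy as np

rng = np.random.default_rng(3)
n_paths, T = 500, 400

# Unit root with drift: Y_t = Y_{t-1} + drift + shock (illustrative values)
drift = 1.0
shocks = rng.normal(0.0, 1.0, size=(n_paths, T))
Y = 50.0 + drift * np.arange(1, T + 1) + np.cumsum(shocks, axis=1)

growth = (Y[:, 1:] - Y[:, :-1]) / Y[:, :-1]     # period-on-period growth rate
early_var = growth[:, :100].var()               # variance in the first 100 periods
late_var = growth[:, -100:].var()               # variance in the last 100 periods
```

Since growth is (drift + shock) divided by an ever-larger level, its variance falls roughly with the square of the level, which is exactly the artefact the paper identifies.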

Relevance:

80.00%

Publisher:

Abstract:

This thesis reports on a quantitative exposure assessment and on an analysis of the attributes of the data used in the estimations, in particular distinguishing between its uncertainty and variability. A retrospective assessment of exposure to benzene was carried out for a case control study of leukaemia in the Australian petroleum industry. The study used the mean of personal task-based measurements (Base Estimates) in a deterministic algorithm and applied factors to model back to places, times etc. for which no exposure measurements were available. Mean daily exposures were estimated, on an individual subject basis, by summing the task-based exposures. These mean exposures were multiplied by the years spent on each job to provide exposure estimates in ppm-years, which were summed to provide a Cumulative Estimate for each subject. Validation was completed for the model and key inputs. Exposures were low; most jobs were below a TWA of 5 ppm benzene. Exposures in terminals were generally higher than at refineries. Cumulative Estimates ranged from 0.005 to 50.9 ppm-years, with 84 percent less than 10 ppm-years. Exposure probability distributions were developed for tanker drivers using Monte Carlo simulation of the exposure estimation algorithm. The outcome was a lognormal distribution of exposure for each driver. These provide the basis for alternative risk assessment metrics, e.g. the frequency of short but intense exposures, which contribute only minimally to the long-term average exposure but may increase the risk of leukaemia. The effects of different inputs to the model were examined and their significance assessed using Monte Carlo simulation. The Base Estimates were the most important determinant of exposure in the model. The sources of variability in the measured data were examined, including the effect of having censored data and the between- and within-worker variability.
The sources of uncertainty in the exposure estimates were analysed and consequential improvements in exposure assessment identified. Monte Carlo sampling was also used to examine the uncertainties and variability associated with the tanker drivers' exposure assessment, to derive an estimate of the range and to put confidence intervals on the daily mean exposures. The identified uncertainty was less than the variability associated with the estimates. The traditional approach to exposure estimation typically derives only point estimates of mean exposure. The approach developed here allows a range of exposure estimates to be made and provides a more flexible and improved basis for risk assessment.
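
A task-based Monte Carlo exposure algorithm of the kind described can be sketched as follows. The task names, geometric means, geometric standard deviations and durations are all invented placeholders, not the thesis's Base Estimates; the sketch shows only the mechanics of summing lognormal task exposures into a daily time-weighted average and reading percentiles off the result:

```python
import numpy as np

rng = np.random.default_rng(11)
n = 50_000

# Hypothetical task parameters: benzene concentration for each task is drawn
# from a lognormal (geometric mean in ppm, geometric standard deviation),
# weighted by hours spent on the task.
tasks = {
    "loading":   {"gm": 0.8,  "gsd": 2.5, "hours": 2.0},
    "delivery":  {"gm": 0.2,  "gsd": 2.0, "hours": 4.0},
    "paperwork": {"gm": 0.01, "gsd": 1.5, "hours": 2.0},
}

daily = np.zeros(n)
for t in tasks.values():
    conc = rng.lognormal(np.log(t["gm"]), np.log(t["gsd"]), size=n)
    daily += conc * t["hours"]
daily /= 8.0                                    # 8-h time-weighted average, ppm

lo, med, hi = np.percentile(daily, [2.5, 50, 97.5])
```

The output is itself approximately lognormal, matching the per-driver exposure distributions the thesis derives, and the upper percentile captures the short-but-intense exposure days that a simple mean would hide.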

Relevance:

80.00%

Publisher:

Abstract:

There is debate over the causal factors for the rise in body weight in the UK. The present study investigates whether increases between 1986 and 2000 for men and women were a result of increases in mean total energy intake, decreases in mean physical activity levels or both. Estimates of mean total energy intake in 1986 and 2000 were derived from food availability data adjusted for wastage. Estimates of mean body weight for adults aged 19–64 years were derived from nationally representative dietary surveys conducted in 1986–7 and 2000–1. Predicted body weight in 1986 and 2000 was calculated using an equation relating body weight to total energy intake and sex. Differences in predicted mean body weight and actual mean body weight between the two time points were compared. Monte Carlo simulation methods were used to assess the stability of the estimates. The predicted increase in mean body weight due to changes in total energy intake between 1986 and 2000 was 4·7 (95 % credible interval 4·2, 5·3) kg for men and 6·4 (95 % credible interval 5·9, 7·1) kg for women. Actual mean body weight increased by 7·7 kg for men and 5·4 kg for women between the two time points. We conclude that increases in mean total energy intake are sufficient to explain the increase in mean body weight for women between 1986 and 2000, but for men, the increase in mean body weight is likely to be due to a combination of increased total energy intake and reduced physical activity levels.

Relevance:

80.00%

Publisher:

Abstract:

Risk analysis is one of the critical functions of the risk management process. It relies on a detailed understanding of risks and their possible implications. Construction projects, because of their large and complex nature, are plagued by a variety of risks which must be considered and responded to in order to ensure project success. This study conducts an extensive comparative analysis of major quantitative risk analysis techniques in the construction industry. The techniques discussed and comparatively analyzed include: Programme Evaluation and Review Technique (PERT), Judgmental Risk Analysis Process (JRAP), Estimating Using Risk Analysis (ERA), the Monte Carlo Simulation technique, Computer Aided Simulation for Project Appraisal and Review (CASPAR), Failure Modes and Effects Analysis (FMEA) and the Advanced Programmatic Risk Analysis and Management model (APRAM). The findings highlight the fact that each risk analysis technique addresses risks in any or all of the following areas: schedule risks, budget risks or technical risks. The comparative analysis reveals that a majority of risk analysis techniques focus on schedule or budget risks; very little has been documented in terms of technical risk analysis techniques. In an era where clients are demanding and expecting higher quality projects and finishes, project managers must endeavor to invest time and resources to ensure that the few existing technical risk analysis techniques are developed and further refined, and that new technical risk analysis techniques are developed to suit the current construction industry's requirements.
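
Two of the schedule-risk techniques named above, PERT and Monte Carlo simulation, can be combined in a few lines. The activities and their three-point estimates below are hypothetical, and triangular sampling is used as a common simple stand-in for PERT's beta distribution:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 20_000

# Hypothetical activities on the critical path:
# (optimistic, most likely, pessimistic) durations in days
activities = [(10, 14, 25), (5, 8, 15), (20, 24, 40), (7, 9, 14)]

# Monte Carlo: sample each activity duration and sum along the path
durations = sum(rng.triangular(o, m, p, size=n) for o, m, p in activities)

# Classic deterministic PERT estimate for comparison
pert_mean = sum((o + 4 * m + p) / 6 for o, m, p in activities)

p80 = np.percentile(durations, 80)      # completion date with 80% confidence
```

The simulated distribution, unlike the single PERT figure, lets a project manager quote a completion date at a chosen confidence level; because the three-point estimates are right-skewed, that P80 date sits well above the mean.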

Relevance:

80.00%

Publisher:

Abstract:

BACKGROUND/OBJECTIVES:

Food is responsible for around one-fifth of all greenhouse gas (GHG) emissions from products consumed in the UK, the largest contributor of which is meat and dairy. The Committee on Climate Change have modelled the impact on GHG emissions of three dietary scenarios for food consumption in the UK. This paper models the impact of the three scenarios on mortality from cardiovascular disease and cancer.
SUBJECTS/METHODS:

A previously published model (DIETRON) was used. The three scenarios were parameterised by fruit and vegetables, fibre, total fat, saturated fat, monounsaturated fatty acids, polyunsaturated fatty acids, cholesterol and salt using the 2008 Family Food Survey. A Monte Carlo simulation generated 95% credible intervals.
RESULTS:

Scenario 1 (50% reduction in meat and dairy replaced by fruit, vegetables and cereals: 19% reduction in GHG emissions) resulted in 36 910 (30 192 to 43 592) deaths delayed or averted per year. Scenario 2 (75% reduction in cow and sheep meat replaced by pigs and poultry: 9% reduction in GHG emissions) resulted in 1999 (1739 to 2389) deaths delayed or averted. Scenario 3 (50% reduction in pigs and poultry replaced with fruit, vegetables and cereals: 3% reduction in GHG emissions) resulted in 9297 (7288 to 11 301) deaths delayed or averted.
CONCLUSION:

Modelled results suggest that public health and climate change dietary goals are in broad alignment, with the largest results in both domains occurring when consumption of all meat and dairy products is reduced. Further work in real-life settings is needed to confirm these results.

Relevance:

80.00%

Publisher:

Abstract:

Balancing tests are diagnostics designed for use with propensity score methods, a widely used non-experimental approach in the evaluation literature. Such tests provide useful information on whether plausible counterfactuals have been created. Currently, multiple balancing tests exist in the literature but it is unclear which is the most useful. This article highlights the poor size properties of commonly employed balancing tests and attempts to shed some light on the link between the results of balancing tests and the bias of the evaluation estimator. The simulation results suggest that in scenarios where the conditional independence assumption holds, a permutation version of the balancing test described in Dehejia and Wahba (Rev Econ Stat 84:151–161, 2002) can be useful in applied studies. The proposed test has good size properties. In addition, the test appears to have good power for detecting a misspecification in the link function and some power for detecting an omission of relevant non-linear terms involving variables that are included at a lower order.
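
The permutation idea can be sketched for a single covariate. This is a generic permutation two-sample test, not the article's exact procedure: the p-value is the fraction of random reassignments of the treatment labels whose mean imbalance is at least as large as the observed one, so the test's size is controlled by construction:

```python
import numpy as np

def permutation_balance_test(x, treated, n_perm=2000, seed=0):
    """Permutation p-value for the absolute mean difference of covariate x
    between treated and control units."""
    rng = np.random.default_rng(seed)
    observed = abs(x[treated].mean() - x[~treated].mean())
    count = 0
    for _ in range(n_perm):
        perm = rng.permutation(treated)         # shuffle the treatment labels
        if abs(x[perm].mean() - x[~perm].mean()) >= observed:
            count += 1
    return (count + 1) / (n_perm + 1)           # add-one to avoid a zero p-value

rng = np.random.default_rng(1)
x = rng.normal(size=200)
treated = np.arange(200) < 100                  # two groups, balanced by construction

p_balanced = permutation_balance_test(x, treated)

x_shifted = x.copy()
x_shifted[treated] += 1.0                       # clear imbalance the test should flag
p_shift = permutation_balance_test(x_shifted, treated)
```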

Relevance:

80.00%

Publisher:

Abstract:

Decision making usually occurs under uncertainty, and such problems are often formulated as optimization problems that decision makers need to solve. Solving optimization problems in uncertain environments is typically difficult. This paper proposes a new hybrid intelligent algorithm to solve a class of stochastic optimization, the dependent chance programming (DCP) model. To speed up the solution process, we use support vector machine regression (SVM regression) to approximate the chance functions, i.e. the probabilities that sequences of uncertain events occur, based on training data generated by stochastic simulation. The proposed algorithm consists of three steps: (1) generate data to estimate the objective function; (2) use SVM regression to reveal the trend hidden in the data; (3) apply a genetic algorithm (GA) based on the SVM regression to obtain an estimate of the chance function. A numerical example demonstrates the performance of the algorithm in terms of run time and precision.
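
The three-step pipeline can be sketched on a toy one-dimensional problem. Everything here is an invented stand-in (the chance function, the cost penalty, the GA settings), and scikit-learn's SVR is used for the regression step; the sketch shows only how simulation, a surrogate, and a GA fit together:

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# Step 1: stochastic simulation estimates the chance function Pr{demand <= x}
# at sampled decision points (a hypothetical one-dimensional example).
def chance_by_simulation(x, n_sim=2000):
    demand = rng.normal(10.0, 2.0, size=n_sim)      # the uncertain event
    return (demand <= x).mean()

X_train = np.linspace(4.0, 16.0, 60).reshape(-1, 1)
y_train = np.array([chance_by_simulation(x[0]) for x in X_train])

# Step 2: SVM regression approximates the (expensive) simulated chance function
surrogate = SVR(kernel="rbf", C=100.0, epsilon=0.01).fit(X_train, y_train)

# Step 3: a bare-bones GA maximises chance minus a cost penalty via the surrogate
def fitness(pop):
    return surrogate.predict(pop.reshape(-1, 1)) - 0.02 * pop   # invented objective

pop = rng.uniform(4.0, 16.0, size=30)
for _ in range(40):
    parents = pop[np.argsort(fitness(pop))[-10:]]               # keep the fittest
    children = rng.choice(parents, size=20) + rng.normal(0, 0.3, 20)  # mutate
    pop = np.clip(np.concatenate([parents, children]), 4.0, 16.0)
best = pop[np.argmax(fitness(pop))]
```

The GA only ever queries the cheap surrogate, which is the source of the speed-up the abstract claims over evaluating the chance function by simulation inside the optimizer.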

Relevance:

80.00%

Publisher:

Abstract:

Aim: Although the superannuation guarantee system is believed by many to provide a safe and adequate source of funds in retirement, some will be unpleasantly surprised. The aim of this paper is to demonstrate the significant effect of the economic cycle on the final accumulated balance in superannuation retirement accounts. Method: A Monte Carlo simulation is used to illustrate the variance in outcomes that can be expected for a hypothetical individual. Results: The expected accumulated superannuation balances for two hypothetical individuals are estimated. The spread of outcomes is used to illustrate the problem of using only the mean of the distribution as a predictor of wealth in the retirement years. Conclusions: Many retirees rely on superannuation to fund their retirement. However, regular contributions to superannuation do not ensure a predictable outcome, and active management of contributions is required if retirement goals are to be met.
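
The spread-of-outcomes point can be reproduced with a minimal accumulation simulation. The contribution amount, return mean and volatility below are illustrative assumptions, not the paper's inputs:

```python
import numpy as np

rng = np.random.default_rng(2024)
n_paths, years = 10_000, 30

# Hypothetical saver: $10,000 contributed at the end of each year for 30 years,
# with i.i.d. normal annual returns (assumed 6% mean, 12% volatility).
contribution = 10_000.0
returns = rng.normal(0.06, 0.12, size=(n_paths, years))

balance = np.zeros(n_paths)
for year in range(years):
    balance = balance * (1.0 + returns[:, year]) + contribution

mean_balance = balance.mean()
p5, p50, p95 = np.percentile(balance, [5, 50, 95])
```

Because compounding makes the distribution right-skewed, the median outcome sits below the mean: quoting only the expected balance overstates what the typical saver should plan on, which is exactly the paper's warning.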