984 results for Financial Modelling


Relevance:

30.00%

Publisher:

Abstract:

The objective of this study was to develop, test and benchmark a framework and a predictive risk model for hospital emergency readmission within 12 months. We performed the development using routinely collected Hospital Episode Statistics data covering inpatient hospital admissions in England. Three different timeframes were used for training, testing and benchmarking: the 1999 to 2004, 2000 to 2005 and 2004 to 2009 financial years. Each timeframe includes 20% of all inpatients admitted within the trigger year. The comparisons were made using positive predictive value, sensitivity and specificity for different risk cut-offs, risk bands and top risk segments, together with the receiver operating characteristic curve. The Bayes Point Machine constructed using this feature selection framework produces a risk probability for each admitted patient, and it was validated for different timeframes, sub-populations and cut-off points. At a risk cut-off of 50%, the positive predictive value was 69.3% to 73.7%, the specificity was 88.0% to 88.9% and the sensitivity was 44.5% to 46.3% across the different timeframes. The area under the receiver operating characteristic curve was 73.0% to 74.3%. The developed framework and model performed considerably better than existing modelling approaches, with high precision and moderate sensitivity.
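
A minimal sketch of how such figures follow from the confusion matrix at a given risk cut-off, using a vector of predicted readmission probabilities and observed outcomes (the variable names, toy data and the 50% cut-off are illustrative, not the study's code):

import numpy as np

def classification_metrics(y_true, risk_prob, cutoff=0.5):
    """Compute PPV, sensitivity and specificity at a risk cut-off."""
    y_true = np.asarray(y_true, dtype=bool)
    y_pred = np.asarray(risk_prob) >= cutoff
    tp = np.sum(y_pred & y_true)
    fp = np.sum(y_pred & ~y_true)
    fn = np.sum(~y_pred & y_true)
    tn = np.sum(~y_pred & ~y_true)
    return {
        "ppv": tp / (tp + fp),          # precision among patients flagged as high risk
        "sensitivity": tp / (tp + fn),  # share of actual readmissions caught
        "specificity": tn / (tn + fp),  # share of non-readmissions correctly cleared
    }

# Toy example: 1 = readmitted within 12 months, 0 = not readmitted
outcomes = [1, 0, 1, 1, 0, 0, 1, 0]
risks = [0.8, 0.2, 0.6, 0.4, 0.1, 0.55, 0.7, 0.3]
print(classification_metrics(outcomes, risks, cutoff=0.5))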

Relevance:

30.00%

Publisher:

Abstract:

Thesis presented as a partial requirement for the degree of Doctor in Statistics and Information Management from the Instituto Superior de Estatística e Gestão de Informação of the Universidade Nova de Lisboa.

Relevance:

30.00%

Publisher:

Abstract:

This paper aims to provide a model that allows BPI to measure, on its rating scale, the credit risk of subsidiaries belonging to the corporate groups that are its clients. The model should be simple enough to be applied in practice, accurate, and consistent with the ratings the bank has assigned in the past. The proposed model combines operational, strategic and financial factors and yields one of three outcomes: no support, partial support or full support from the holding company to the subsidiary, each of which translates into an adjustment of the subsidiary's credit rating. As would be expected, most subsidiaries end up with the same credit rating as their parent company.
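
One way to operationalise the model's output is as a notch adjustment applied to the subsidiary's standalone rating, pulling it toward the parent's rating according to the support level. The sketch below is purely illustrative: the three support categories come from the abstract, but the rating scale and notching rules are assumptions, not BPI's actual criteria.

# Illustrative rating scale, best to worst (assumed, not BPI's actual scale)
SCALE = ["AAA", "AA", "A", "BBB", "BB", "B", "CCC"]

def adjusted_rating(standalone, parent, support):
    """Adjust a subsidiary's rating toward its parent's, given the support level."""
    s, p = SCALE.index(standalone), SCALE.index(parent)
    if support == "full":        # subsidiary equalised with the parent
        return SCALE[p]
    if support == "partial":     # assumed rule: move halfway toward the parent
        return SCALE[s + (p - s) // 2]
    return SCALE[s]              # no support: the standalone rating stands

print(adjusted_rating("BB", "A", "full"))     # -> "A"
print(adjusted_rating("BB", "A", "partial"))  # -> "BBB"
print(adjusted_rating("BB", "A", "none"))     # -> "BB"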

Relevance:

30.00%

Publisher:

Abstract:

This paper analyzes the measure of systemic importance ΔCoVaR proposed by Adrian and Brunnermeier (2009, 2010) within the context of a similar class of risk measures used in the risk management literature. In addition, we develop a series of testing procedures, based on ΔCoVaR, to identify and rank the systemically important institutions. We stress the importance of statistical testing in interpreting the measure of systemic importance. An empirical application illustrates the testing procedures, using equity data for three European banks.
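
For reference, the measure can be written out as a sketch of the standard definitions in the notation typically used by Adrian and Brunnermeier, where q is the quantile level and X^i, X^j are the institutions' return (or loss) variables:

\Pr\left( X^{j} \le \mathrm{CoVaR}_{q}^{\,j\mid i} \;\middle|\; X^{i} = \mathrm{VaR}_{q}^{i} \right) = q,
\qquad
\Delta \mathrm{CoVaR}_{q}^{\,j\mid i}
  = \mathrm{CoVaR}_{q}^{\,j \mid X^{i}=\mathrm{VaR}_{q}^{i}}
  - \mathrm{CoVaR}_{q}^{\,j \mid X^{i}=\mathrm{Median}^{i}}.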

Relevance:

30.00%

Publisher:

Abstract:

Quadratic programming techniques were applied to household food consumption data in England and Wales to estimate likely changes in diet under healthy eating guidelines, and the consequences this would have for agriculture and land use in England and Wales. The first step entailed imposing nutrient restrictions on food consumption following the dietary recommendations of the UK Department of Health. The resulting diet was used, in a second step, as a proxy for demand for agricultural commodities, to test the impact of such a scenario on food production and land use in England and Wales and the impacts of this on agricultural landscapes. Results of the diet optimisation indicated a large drop in consumption of foods rich in saturated fats and sugar, essentially cheese and sugar-based products, along with lesser cuts of fat and meat products. Conversely, consumption of fruit and vegetables, cereals and flour would increase to meet dietary fibre recommendations. Such a shift in demand would dramatically affect production patterns: the financial net margin of agriculture in England and Wales would rise, owing to increased production of high-market-value, high-margin crops. Some regions would, however, be negatively affected, mostly those dependent on beef cattle and sheep production that could not benefit from an increased demand for cereals and horticultural crops. The effects of these changes would also be felt in upstream industries, such as animal feed suppliers. While arable-dominated landscapes would be little affected, pastoral landscapes would suffer through loss of grazing management and, possibly, land abandonment, especially in upland areas.
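
A minimal sketch of the first (diet optimisation) step with a generic quadratic programming solver: find the smallest deviation from the observed diet that satisfies the nutrient bounds. The foods, nutrient contents and limits below are placeholders, not the study's data or constraint set.

import numpy as np
import cvxpy as cp

# Placeholder data: 3 foods, 2 nutrients (say, saturated fat and fibre) per unit of food
observed = np.array([2.0, 1.0, 0.5])      # observed consumption per household
nutrients = np.array([[5.0, 0.5, 0.2],    # saturated fat per unit of each food
                      [0.1, 2.0, 3.0]])   # fibre per unit of each food
upper = np.array([8.0, 50.0])             # maximum allowed intake of each nutrient
lower = np.array([0.0, 4.0])              # minimum required intake (e.g. fibre)

x = cp.Variable(3, nonneg=True)           # adjusted consumption of each food
problem = cp.Problem(
    cp.Minimize(cp.sum_squares(x - observed)),   # stay as close as possible to the observed diet
    [nutrients @ x <= upper, nutrients @ x >= lower],
)
problem.solve()
print(np.round(x.value, 2))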

Relevance:

30.00%

Publisher:

Abstract:

We examine whether a three-regime model that allows for dormant, explosive and collapsing speculative behaviour can explain the dynamics of the S&P 500. We extend existing models of speculative behaviour by including a third regime that allows a bubble to grow at a steady rate, and propose abnormal volume as an indicator of the probable time of bubble collapse. We also examine the financial usefulness of the three-regime model by studying a trading rule formed from its inferences, which yields higher Sharpe ratios and greater end-of-period wealth than existing models or a buy-and-hold strategy.
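
The financial comparison in the last sentence comes down to the Sharpe ratio and terminal wealth of each strategy. A generic sketch of that comparison (the returns and the in/out signal below are random placeholders; the paper's actual rule is derived from the regime inferences):

import numpy as np

def sharpe(returns, periods_per_year=12):
    """Annualised Sharpe ratio of a series of periodic excess returns."""
    returns = np.asarray(returns)
    return returns.mean() / returns.std(ddof=1) * np.sqrt(periods_per_year)

def terminal_wealth(returns, start=1.0):
    return start * np.prod(1.0 + np.asarray(returns))

rng = np.random.default_rng(0)
market = rng.normal(0.005, 0.04, size=240)    # placeholder monthly excess returns
in_market = rng.random(240) > 0.2             # placeholder in/out signal from the regime model
strategy = np.where(in_market, market, 0.0)   # out of the market earns zero excess return

for name, r in [("buy-and-hold", market), ("regime rule", strategy)]:
    print(name, round(sharpe(r), 2), round(terminal_wealth(r), 2))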

Relevance:

30.00%

Publisher:

Abstract:

Purpose – The purpose of this paper is to investigate the effect of choices of model structure and scale in development viability appraisal. The paper addresses two questions concerning the application of development appraisal techniques to viability modelling within the UK planning system. The first relates to the extent to which, given intrinsic input uncertainty, the choice of model structure significantly affects model outputs. The second concerns the extent to which, given intrinsic input uncertainty, the level of model complexity significantly affects model outputs. Design/methodology/approach – Monte Carlo simulation procedures are applied to a hypothetical development scheme in order to measure the effects of model aggregation and structure on model output variance. Findings – It is concluded that, given the particular scheme modelled and unavoidably subjective assumptions about input variance, simple and simplistic models may produce outputs similar to those of more robust and disaggregated models. Evidence is found of equifinality in the outputs of a simple, aggregated model of development viability relative to more complex, disaggregated models. Originality/value – Development viability appraisal has become increasingly important in the planning system. Consequently, the theory, application and outputs of development appraisal are under intense scrutiny from a wide range of users. However, there has been very little published evaluation of viability models. This paper contributes to the limited literature in this area.
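
The Monte Carlo exercise can be illustrated with a simple residual valuation: sample the uncertain inputs, compute the residual land value for each draw, and examine the spread of outputs (the same procedure is then repeated for more disaggregated specifications and the output variances compared). The scheme and distributions below are assumptions for illustration, not the paper's hypothetical scheme.

import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Assumed input distributions for a hypothetical scheme
gdv = rng.normal(10_000_000, 800_000, n)        # gross development value (£)
build_cost = rng.normal(5_500_000, 450_000, n)  # construction cost (£)
fees = 0.10 * build_cost                        # professional fees
profit = 0.20 * gdv                             # developer's required profit

# Aggregated model: residual land value in one step
residual = gdv - build_cost - fees - profit

print("mean residual:", round(residual.mean()))
print("std of residual:", round(residual.std()))
print("P(residual < 0):", round((residual < 0).mean(), 3))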

Relevance:

30.00%

Publisher:

Abstract:

Johne's disease in cattle is a contagious wasting disease caused by Mycobacterium avium subspecies paratuberculosis (MAP). Johne's infection is characterised by a long subclinical phase and can therefore go undetected for long periods of time, during which substantial production losses can occur. The protracted nature of Johne's infection therefore presents a challenge for both veterinarians and farmers when discussing control options, owing to a paucity of information and limited test performance when screening for the disease. The objective was to model Johne's control decisions in suckler beef cattle using a decision support approach, implying equal focus on 'end user' (veterinarian) participation while still addressing the technical disease modelling aspects during development of the decision support model. The model shows how Johne's disease is likely to affect a herd over time in terms of both physical and financial impacts. In addition, the model simulates the effect on production of two different Johne's control strategies: herd management measures, and test and cull measures. The article also provides and discusses results from a sensitivity analysis assessing the effects on production of improving the performance of currently available tests. Output from running the model shows that a combination of management improvements, to reduce routes of infection, and testing and culling, to remove infected and infectious animals, is likely to be the least-cost control strategy.
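
As a rough illustration of the kind of comparison such a model supports, the sketch below tallies the annual cost of each strategy as production losses plus testing and culling costs, with imperfect test sensitivity slowing the 'test and cull' route. All prices, rates and prevalence dynamics are invented for illustration and are not the model's parameters.

def simulate(strategy, years=10, herd=100, prevalence=0.15,
             test_sensitivity=0.6, loss_per_case=300.0,
             test_cost=8.0, cull_cost=900.0):
    """Toy annual cost tally for one Johne's control strategy (illustrative only)."""
    # Assumed yearly rate of new infections; management measures reduce transmission routes
    new_infections = 0.01 if strategy in ("management", "combined") else 0.03
    total = 0.0
    for _ in range(years):
        cost = prevalence * herd * loss_per_case               # production losses
        if strategy in ("test_and_cull", "combined"):
            detected = prevalence * test_sensitivity           # imperfect test misses cases
            cost += herd * test_cost + detected * herd * cull_cost
            prevalence -= detected                             # detected animals are culled
        prevalence = min(prevalence + new_infections, 1.0)
        total += cost
    return round(total)

for s in ("management", "test_and_cull", "combined"):
    print(s, simulate(s))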

Relevance:

30.00%

Publisher:

Abstract:

The application of the Water Framework Directive (WFD) in the European Union (EU) targets certain threshold levels for the concentration of various nutrients, nitrogen and phosphorus being the most important. In the EU, agri-environmental measures constitute a significant component of Pillar 2 (Rural Development) policies in both financial and regulatory terms. Environmental measures are also linked to Pillar 1 payments through cross-compliance and the greening proposals. This paper, drawing on work carried out in the REFRESH FP7 project, aims to show how an INtegrated CAtchment (INCA) model of plant/soil system dynamics and in-stream biogeochemical and hydrological dynamics can be used to assess the cost-effectiveness of agri-environmental measures against the nutrient concentration targets set by the WFD, especially in the presence of important habitats. We present the procedures (methodological steps, challenges and problems) for assessing the cost-effectiveness of agri-environmental measures under the baseline situation and under climate and land use change scenarios. Furthermore, we present results of an application of this methodology to the Louros watershed in Greece and discuss likely uses and future extensions of the modelling approach. Finally, we attempt to show the importance of this methodology for designing and incorporating alternative environmental practices into Pillar 1 and Pillar 2 measures.
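
Once the catchment model has translated each measure into a predicted change in in-stream nutrient concentration, the cost-effectiveness step itself reduces to dividing each measure's annual cost by the reduction it delivers and ranking the results. The measures, costs and reductions below are placeholders, not REFRESH or Louros figures.

# Placeholder measures: (annual cost in EUR, predicted nitrate reduction in mg/l at the target site)
measures = {
    "buffer strips": (12_000, 0.30),
    "reduced fertilisation": (8_000, 0.15),
    "cover crops": (20_000, 0.55),
}

# Cost-effectiveness: euros per mg/l of nitrate removed (lower is better)
ce = {name: cost / reduction for name, (cost, reduction) in measures.items()}
for name, value in sorted(ce.items(), key=lambda item: item[1]):
    print(f"{name}: {value:,.0f} EUR per mg/l reduced")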

Relevance:

30.00%

Publisher:

Abstract:

Over the last decade the English planning system has placed greater emphasis on the financial viability of development. ‘Calculative’ practices have been used to quantify and capture land value uplifts. Development viability appraisal (DVA) has become a key part of the evidence base used in planning decision-making and informs both ‘site-specific’ negotiations about the level of land value capture for individual schemes and ‘area-wide’ planning policy formation. This paper investigates how implementation of DVA is governed in planning policy formation. It is argued that the increased use of DVA raises important questions about how planning decisions are made and operationalised, not least because DVA is often poorly understood by some key stakeholders. The paper uses the concept of governance to thematically analyse semi-structured interviews conducted with the producers of DVAs and considers key procedural issues including (in)consistencies in appraisal practices, levels of stakeholder consultation and the potential for client and producer bias. Whilst stakeholder consultation is shown to be integral to the appraisal process in order to improve the quality of the appraisals and to legitimise the outputs, participation is restricted to industry experts and excludes some interest groups, including local communities. It is concluded that, largely because of its recent adoption and knowledge asymmetries between local planning authorities and appraisers, DVA is a weakly governed process characterised by emerging and contested guidance and is therefore ‘up for grabs’.

Relevance:

30.00%

Publisher:

Abstract:

The year 1968 saw a major shift from univariate to multivariate methodological approaches to ratio-based modelling of corporate collapse. This was facilitated by the introduction of a new statistical tool called Multiple Discriminant Analysis (MDA). However, it did not take long before other statistical tools were developed. The primary objective in developing these tools was to derive models that would do at least as good a job as MDA while relying on fewer assumptions. With the introduction of new statistical tools, researchers became preoccupied with testing them in signalling collapse. Among the ratio-based approaches were Logit analysis, Neural Network analysis, Probit analysis, ID3, the Recursive Partitioning Algorithm, Rough Sets analysis, Decomposition analysis, the Going Concern Advisor, the Koundinya and Puri judgemental approach, Tabu Search and Mixed Logit analysis. Regardless of which methodological approach was chosen, most were compared to MDA. This paper reviews these various approaches. Emphasis is placed on how they fared against MDA in signalling corporate collapse.
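
Of the alternatives listed, Logit analysis is the one most routinely benchmarked against MDA; in modern terms it amounts to fitting a logistic regression on financial ratios and reading the fitted value as a probability of collapse. A generic sketch with made-up ratios (not the specification of any of the reviewed studies):

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

# Toy data: each row holds a firm's financial ratios
# [working capital / total assets, retained earnings / total assets, EBIT / total assets]
X = np.array([[ 0.20,  0.15,  0.10],
              [-0.05, -0.10, -0.02],
              [ 0.30,  0.25,  0.12],
              [-0.15, -0.20, -0.08],
              [ 0.10,  0.05,  0.04],
              [-0.02, -0.12, -0.05]])
y = np.array([0, 1, 0, 1, 0, 1])           # 1 = collapsed, 0 = survived

model = LogisticRegression().fit(X, y)
probs = model.predict_proba(X)[:, 1]       # estimated probability of collapse
print(np.round(probs, 2))
print("in-sample AUC:", roc_auc_score(y, probs))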

Relevance:

30.00%

Publisher:

Abstract:

This paper highlights the prevalence and extent of financial fraud amongst collapsed corporations. In doing so, it examines the recent spectacular corporate collapses of Parmalat in Europe, Enron and WorldCom in the USA, and HIH in Australia. A new methodology that provides empirical evidence for the financial fraud claims found in the literature is then put forward. The proposed methodology argues that if financial fraud were a possibility amongst collapsed corporations, then two premises ought to be observed in the literature on ratio-based multivariate modelling for predicting corporate collapse. First, in the absence of financial fraud, we expect the models to consistently predict corporate collapse with a high degree of accuracy, particularly as the incident of collapse approaches. Second, if financial fraud takes place and statement figures are distorted, then we expect the financial ratios, which are the predictor variables in these models, to lose relevance and therefore their use in models to be short-lived. Empirical support from Hossari and Rahman (2004) and Hossari and Rahman (2005) is presented as evidence for the two premises.

Relevance:

30.00%

Publisher:

Abstract:

This paper investigates problems associated with interpretations of corporate collapse, and argues for a unified legal, rather than financial, definition of the event. In the absence of a formal definition of the event of corporate collapse, the integrity of sample selection becomes questionable; moreover, comparisons between empirical studies become less useful, if not altogether futile, due to the lack of a common ground in this basic building block. Upon close examination of 84 studies on ratio-based modelling of corporate collapse published between 1968 and 2004, this paper finds evidence in favor of a legal interpretation of the event of corporate collapse. Specifically, studies that adopted a legal definition outnumber those that opted for a financial definition by a factor of five.