48 results for accounting-based valuation models

in CentAUR: Central Archive University of Reading - UK


Relevance: 100.00%

Abstract:

Following a workshop exercise, two models, an individual-based landscape model (IBLM) and a non-spatial life-history model, were used to assess the impact of a fictitious insecticide on populations of skylarks in the UK. The chosen population endpoints were abundance, population growth rate, and the chances of population persistence. Both models used the same life-history descriptors and toxicity profiles as the basis for their parameter inputs. The models differed in that exposure was a pre-determined parameter in the life-history model but an emergent property of the IBLM, and the IBLM required a landscape structure as an input. The model outputs were qualitatively similar between the two models. Under conditions dominated by winter wheat, both models predicted a population decline that was worsened by the use of the insecticide. Under broader habitat conditions, population declines were only predicted for the scenarios where the insecticide was added. Inputs to the models are very different, with the IBLM requiring a large volume of data in order to achieve the flexibility of being able to integrate a range of environmental and behavioural factors. The life-history model has very few explicit data inputs, but some of these relied on extensive prior modelling needing additional data, as described in Roelofs et al. (2005, this volume). Both models have strengths and weaknesses; hence the ideal approach is to combine the use of both simple and comprehensive modelling tools.

Relevance: 100.00%

Abstract:

This study analyzes the issue of American option valuation when the underlying exhibits a GARCH-type volatility process. We propose the use of Rubinstein's Edgeworth binomial tree (EBT) in contrast to the simulation-based methods considered in previous studies. The EBT-based valuation approach makes an implied calibration of the pricing model feasible. By empirically analyzing the pricing performance of American index and equity options, we illustrate the superiority of the proposed approach.
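Setting the Edgeworth adjustment aside, the backward-induction machinery the EBT shares with any binomial lattice is compact enough to sketch. The following is a plain Cox-Ross-Rubinstein American put pricer, not Rubinstein's EBT itself (the EBT would replace the constant node probabilities with Edgeworth-adjusted ones); all parameter values are illustrative.

```python
import math

def american_put_crr(S0, K, r, sigma, T, steps=200):
    """Price an American put by backward induction on a
    Cox-Ross-Rubinstein binomial tree (illustrative stand-in for the
    Edgeworth binomial tree, which adjusts the node probabilities)."""
    dt = T / steps
    u = math.exp(sigma * math.sqrt(dt))      # up move
    d = 1.0 / u                              # down move
    p = (math.exp(r * dt) - d) / (u - d)     # risk-neutral up probability
    disc = math.exp(-r * dt)

    # option values at the terminal nodes
    values = [max(K - S0 * u**j * d**(steps - j), 0.0)
              for j in range(steps + 1)]

    # step back through the tree, allowing early exercise at every node
    for i in range(steps - 1, -1, -1):
        values = [max(disc * (p * values[j + 1] + (1 - p) * values[j]),
                      K - S0 * u**j * d**(i - j))
                  for j in range(i + 1)]
    return values[0]
```

An implied calibration, as in the paper, would then fit the tree's moment parameters so that model prices match observed option quotes.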

Relevance: 100.00%

Abstract:

Valuation is the process of estimating price. The methods used to determine value attempt to model the thought processes of the market, and thus estimate price by reference to observed historic data. This can be done using either an explicit model, which models the worth calculation of the most likely bidder, or an implicit model, which uses historic data, suitably adjusted, as a shortcut to determine value by reference to previous similar sales. The former is generally referred to as the Discounted Cash Flow (DCF) model and the latter as the capitalisation (or All Risk Yield) model. However, regardless of the technique used, the valuation will be affected by uncertainties: uncertainty in the comparable data available, uncertainty in current and future market conditions, and uncertainty in the specific inputs for the subject property. These input uncertainties will translate into uncertainty in the output figure, the estimate of price. In a previous paper, we considered the way in which uncertainty is allowed for in the capitalisation model in the UK. In this paper, we extend the analysis to look at the way in which uncertainty can be incorporated into the explicit DCF model. This is done by recognising that the input variables are uncertain and will each have a probability distribution. Thus, by utilising a probability-based valuation model (using Crystal Ball), it is possible to incorporate uncertainty into the analysis and address the shortcomings of the current model. Although the capitalisation model is discussed, the paper concentrates upon the application of Crystal Ball to the Discounted Cash Flow approach.
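The probability-based approach can be sketched without Crystal Ball itself: replace each single-figure input with a draw from a probability distribution and read off the distribution of the resulting valuation. The sketch below runs a hypothetical five-year DCF with normally distributed rent, rental growth, exit yield and discount rate; every figure and distribution choice is illustrative, not taken from the paper.

```python
import random
import statistics

def mc_dcf_valuation(n_sims=10_000, seed=42):
    """Monte Carlo DCF: sample the uncertain inputs, return the mean
    and standard deviation of the simulated valuations. All parameter
    values below are illustrative assumptions."""
    random.seed(seed)
    values = []
    for _ in range(n_sims):
        rent = random.gauss(100_000, 5_000)     # current annual rent
        growth = random.gauss(0.02, 0.01)       # annual rental growth rate
        exit_yield = random.gauss(0.06, 0.005)  # capitalisation rate at sale
        discount = random.gauss(0.08, 0.005)    # target rate of return
        # five years of explicit cash flows...
        pv = sum(rent * (1 + growth) ** t / (1 + discount) ** t
                 for t in range(1, 6))
        # ...plus a sale at the exit yield, discounted back
        terminal = rent * (1 + growth) ** 5 / exit_yield
        pv += terminal / (1 + discount) ** 5
        values.append(pv)
    return statistics.mean(values), statistics.stdev(values)
```

The standard deviation of the output is precisely the uncertainty that a single-point valuation suppresses.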

Relevance: 100.00%

Abstract:

Valuation is often said to be “an art not a science”, but this relates to the techniques employed to calculate value, not to the underlying concept itself. Valuation is the process of estimating price in the market place. Yet such an estimation will be affected by uncertainties: uncertainty in the comparable information available, uncertainty in current and future market conditions, and uncertainty in the specific inputs for the subject property. These input uncertainties will translate into uncertainty in the output figure, the valuation. The degree of uncertainty will vary according to the level of market activity; the more active a market, the more credence will be given to the input information. In the UK at the moment, the Royal Institution of Chartered Surveyors (RICS) is considering ways in which the uncertainty of the output figure, the valuation, can be conveyed to the user of the valuation, but as yet no definitive view has been taken apart from a single Guidance Note (GN5, RICS 2003), which stresses the importance of recognising uncertainty in valuation but does not proffer any particular solution. One of the major problems is that valuation models (in the UK) are based upon comparable information and rely upon single inputs. They are not probability based, yet uncertainty is probability driven. In this paper, we discuss the issues underlying uncertainty in valuations and suggest a probability-based model (using Crystal Ball) to address the shortcomings of the current model.

Relevance: 100.00%

Abstract:

Valuation is often said to be “an art not a science”, but this relates to the techniques employed to calculate value, not to the underlying concept itself. Valuation is the process of estimating price in the market place. Yet such an estimation will be affected by uncertainties: uncertainty in the comparable information available, uncertainty in current and future market conditions, and uncertainty in the specific inputs for the subject property. These input uncertainties will translate into uncertainty in the output figure, the valuation. The degree of uncertainty will vary according to the level of market activity; the more active a market, the more credence will be given to the input information. In the UK at the moment, the Royal Institution of Chartered Surveyors (RICS) is considering ways in which the uncertainty of the output figure, the valuation, can be conveyed to the user of the valuation, but as yet no definitive view has been taken. One of the major problems is that valuation models (in the UK) are based upon comparable information and rely upon single inputs. They are not probability based, yet uncertainty is probability driven. In this paper, we discuss the issues underlying uncertainty in valuations and suggest a probability-based model (using Crystal Ball) to address the shortcomings of the current model.

Relevance: 100.00%

Abstract:

The geospace environment is controlled largely by events on the Sun, such as solar flares and coronal mass ejections, which generate significant geomagnetic and upper atmospheric disturbances. The study of this Sun-Earth system, which has become known as space weather, has both intrinsic scientific interest and practical applications. Adverse conditions in space can damage satellites and disrupt communications, navigation, and electric power grids, as well as endanger astronauts. The Center for Integrated Space Weather Modeling (CISM), a Science and Technology Center (STC) funded by the U.S. National Science Foundation (see http://www.bu.edu/cism/), is developing a suite of integrated physics-based computer models that describe the space environment from the Sun to the Earth for use in both research and operations [Hughes and Hudson, 2004, p. 1241]. To further this mission, advanced education and training programs sponsored by CISM encourage students to view space weather as a system that encompasses the Sun, the solar wind, the magnetosphere, and the ionosphere/thermosphere. This holds especially true for participants in the CISM space weather summer school [Simpson, 2004].

Relevance: 100.00%

Abstract:

The hierarchical and "bob" (or branch-on-branch) models are tube-based computational models recently developed for predicting the linear rheology of general mixtures of polydisperse branched polymers. These two models are based on a similar tube-theory framework but differ in their numerical implementation and in the details of their relaxation mechanisms. We present a detailed overview of the similarities and differences of these models and examine the effects of these differences on the predictions of the linear viscoelastic properties of a set of representative branched polymer samples, in order to give a general picture of the performance of these models. Our analysis confirms that the hierarchical and bob models quantitatively predict the linear rheology of a wide range of branched polymer melts, but it also indicates that there is still no unique solution to cover all types of branched polymers without case-by-case adjustment of parameters such as the dilution exponent alpha and the factor p² which defines the hopping distance of a branch point relative to the tube diameter. An updated version of the hierarchical model, which shows improved computational efficiency and refined relaxation mechanisms, is introduced and used in these analyses.

Relevance: 100.00%

Abstract:

In this paper, a new nonlinear digital baseband predistorter design based on direct learning is introduced, together with a new Wiener system modeling approach for high power amplifiers (HPA) based on the B-spline neural network. The contribution is twofold. Firstly, by assuming that the nonlinearity in the HPA is mainly dependent on the input signal amplitude, the complex-valued nonlinear static function is represented by two real-valued B-spline neural networks, one for the amplitude distortion and another for the phase shift. The Gauss-Newton algorithm is applied for the parameter estimation, in which the De Boor recursion is employed to calculate both the B-spline curve and the first-order derivatives. Secondly, we derive the predistorter algorithm by calculating the inverse of the complex-valued nonlinear static function according to the B-spline neural network based Wiener models. The inverses of the amplitude and phase-shift distortions are then computed and compensated using the identified models. Numerical examples are employed to demonstrate the efficacy of the proposed approaches.
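The paper's predistorter inverts the fitted B-spline static nonlinearity; the idea can be illustrated with a closed-form stand-in. The sketch below uses the Saleh AM/AM curve as a hypothetical amplitude nonlinearity and inverts it by bisection, which plays the role that the De Boor-based inversion plays for the identified B-spline network.

```python
def saleh_am_am(r, a=2.0, b=1.0):
    """Saleh AM/AM amplitude distortion, a standard HPA model used here
    as a stand-in for the fitted B-spline curve (a, b are illustrative
    coefficients; the curve is monotonic on [0, 1] for these values)."""
    return a * r / (1.0 + b * r * r)

def invert_static(f, y, lo=0.0, hi=1.0, iters=60):
    """Invert a monotonic static nonlinearity by bisection: find r in
    [lo, hi] with f(r) = y. Feeding the HPA this pre-inverted amplitude
    makes the predistorter-HPA cascade approximately linear."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if f(mid) < y:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# the cascade f(f^-1(y)) recovers the target amplitude
r = invert_static(saleh_am_am, 0.8)
linearised = saleh_am_am(r)  # ~0.8
```

The same bisection would work against the fitted B-spline amplitude curve, evaluated via the De Boor recursion instead of the closed form.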

Relevance: 100.00%

Abstract:

Geomagnetic activity has long been known to exhibit approximately 27 day periodicity, resulting from solar wind structures repeating each solar rotation. Thus a very simple near-Earth solar wind forecast is 27 day persistence, wherein the near-Earth solar wind conditions today are assumed to be identical to those 27 days previously. Effective use of such a persistence model as a forecast tool, however, requires its performance and uncertainty to be fully characterized. The first half of this study determines which solar wind parameters can be reliably forecast by persistence and how the forecast skill varies with the solar cycle. The second half of the study shows how persistence can provide a useful benchmark for more sophisticated forecast schemes, namely physics-based numerical models. Point-by-point assessment methods, such as correlation and mean-square error, find persistence skill comparable to numerical models during solar minimum, despite the 27 day lead time of persistence forecasts, versus 2–5 days for numerical schemes. At solar maximum, however, the dynamic nature of the corona means 27 day persistence is no longer a good approximation, and skill scores suggest persistence is out-performed by numerical models for almost all solar wind parameters. But point-by-point assessment techniques are not always a reliable indicator of usefulness as a forecast tool. An event-based assessment method, which focusses on key solar wind structures, finds persistence to be the most valuable forecast throughout the solar cycle. This reiterates the fact that the means of assessing the “best” forecast model must be specifically tailored to its intended use.
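The persistence forecast and its point-by-point skill metrics take only a few lines to compute. The sketch below scores a 27 day persistence forecast by correlation and mean-square error on a synthetic daily series with a built-in 27 day recurrence; the series and variable names are illustrative, not real solar wind data.

```python
import math

def persistence_skill(series, lag=27):
    """Score a lag-day persistence forecast x_hat[t] = x[t - lag]
    against observations, returning (correlation, mean-square error)."""
    obs, fc = series[lag:], series[:-lag]
    n = len(obs)
    mo, mf = sum(obs) / n, sum(fc) / n
    cov = sum((o - mo) * (f - mf) for o, f in zip(obs, fc)) / n
    so = math.sqrt(sum((o - mo) ** 2 for o in obs) / n)
    sf = math.sqrt(sum((f - mf) ** 2 for f in fc) / n)
    mse = sum((o - f) ** 2 for o, f in zip(obs, fc)) / n
    return cov / (so * sf), mse

# synthetic daily solar wind speed with a pure 27-day recurrence,
# for which persistence is (numerically) a perfect forecast
speed = [400.0 + 100.0 * math.sin(2.0 * math.pi * t / 27.0)
         for t in range(270)]
corr, mse = persistence_skill(speed)
```

On real data the solar-maximum breakdown of the 27 day approximation would show up directly as a falling correlation and a rising mean-square error.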

Relevance: 100.00%

Abstract:

Environmental change research often relies on simplistic, static models of human behaviour in social-ecological systems. This limits understanding of how social-ecological change occurs. Integrative, process-based behavioural models, which include feedbacks between action, and social and ecological system structures and dynamics, can inform dynamic policy assessment in which decision making is internalised in the model. These models focus on dynamics rather than states. They stimulate new questions and foster interdisciplinarity between and within the natural and social sciences.

Relevance: 100.00%

Abstract:

Management accounting in recent times, and perhaps rightly so, has begun to gain recognition as a profession separate from and complementary to financial accounting. Evidence exists to suggest that management accountants are exposed to a unique set of ethical challenges within industry and that a significantly high number of management accountants have engaged in unethical practices in performing their jobs. For the accounting profession as a whole, the growing number of corporate failures has created a credibility crisis that requires deliberate intervention to mitigate. If this is not addressed soon, the accounting profession risks losing relevance. Scholarship on ethical issues in accounting practice has either focused mostly on financial accounting or has sought to combine ethical issues for financial and management accounting. Various arguments have been made in recent times for the need to treat ethical issues in behavioural studies as context-specific, and therefore to separate ethical considerations in management accounting from those in financial accounting. This study adopts the position, following various strands of the literature, that effective ethics education can help practitioners deal appropriately with ethical issues in the workplace, and explores students' and faculty members' perceptions of current practices in ethics education. As expected, faculty and students differ significantly on a wide range of issues in ethics education in management accounting. Based on the insights provided by this study, appropriate recommendations are made to improve ethics education in management accounting.

Relevance: 100.00%

Abstract:

This paper examines to what extent crops and their environment should be viewed as a coupled system. Crop impact assessments currently use climate model output offline to drive process-based crop models. However, in regions where local climate is sensitive to land surface conditions more consistent assessments may be produced with the crop model embedded within the land surface scheme of the climate model. Using a recently developed coupled crop–climate model, the sensitivity of local climate, in particular climate variability, to climatically forced variations in crop growth throughout the tropics is examined by comparing climates simulated with dynamic and prescribed seasonal growth of croplands. Interannual variations in land surface properties associated with variations in crop growth and development were found to have significant impacts on near-surface fluxes and climate; for example, growing season temperature variability was increased by up to 40% by the inclusion of dynamic crops. The impact was greatest in dry years where the response of crop growth to soil moisture deficits enhanced the associated warming via a reduction in evaporation. Parts of the Sahel, India, Brazil, and southern Africa were identified where local climate variability is sensitive to variations in crop growth, and where crop yield is sensitive to variations in surface temperature. Therefore, offline seasonal forecasting methodologies in these regions may underestimate crop yield variability. The inclusion of dynamic crops also altered the mean climate of the humid tropics, highlighting the importance of including dynamical vegetation within climate models.

Relevance: 100.00%

Abstract:

Current feed evaluation systems for dairy cattle aim to match nutrient requirements with nutrient intake at pre-defined production levels. These systems were not developed to address, and are not suitable to predict, the responses to dietary changes in terms of production level and product composition, excretion of nutrients to the environment, and nutrition related disorders. The change from a requirement to a response system to meet the needs of various stakeholders requires prediction of the profile of absorbed nutrients and its subsequent utilisation for various purposes. This contribution examines the challenges to predicting the profile of nutrients available for absorption in dairy cattle and provides guidelines for further improved prediction with regard to animal production responses and environmental pollution. The profile of nutrients available for absorption comprises volatile fatty acids, long-chain fatty acids, amino acids and glucose. Thus the importance of processes in the reticulo-rumen is obvious. Much research into rumen fermentation is aimed at determination of substrate degradation rates. Quantitative knowledge on rates of passage of nutrients out of the rumen is rather limited compared with that on degradation rates, and thus should be an important theme in future research. Current systems largely ignore microbial metabolic variation, and extant mechanistic models of rumen fermentation give only limited attention to explicit representation of microbial metabolic activity. Recent molecular techniques indicate that knowledge on the presence and activity of various microbial species is far from complete. Such techniques may give a wealth of information, but to include such findings in systems predicting the nutrient profile requires close collaboration between molecular scientists and mathematical modellers on interpreting and evaluating quantitative data. Protozoal metabolism is of particular interest here given the paucity of quantitative data. 
Empirical models lack the biological basis necessary to evaluate mitigation strategies to reduce excretion of waste, including nitrogen, phosphorus and methane. Such models may have little predictive value when comparing various feeding strategies. Examples include the Intergovernmental Panel on Climate Change (IPCC) Tier II models to quantify methane emissions and current protein evaluation systems to evaluate low protein diets to reduce nitrogen losses to the environment. Nutrient based mechanistic models can address such issues. Since environmental issues generally attract more funding from governmental offices, further development of nutrient based models may well take place within an environmental framework.

Relevance: 100.00%

Abstract:

It is generally acknowledged that population-level assessments provide a better measure of response to toxicants than assessments of individual-level effects. Population-level assessments generally require the use of models to integrate potentially complex data about the effects of toxicants on life-history traits, and to provide a relevant measure of ecological impact. Building on excellent earlier reviews, we here briefly outline the modelling options in population-level risk assessment. Modelling is used to calculate population endpoints from available data, which is often about individual life histories, the ways that individuals interact with each other, the environment and other species, and the ways individuals are affected by pesticides. As population endpoints, we recommend the use of population abundance, population growth rate, and the chance of population persistence. We recommend two types of model: simple life-history models distinguishing two life-history stages, juveniles and adults; and spatially explicit individual-based landscape models. Life-history models are very quick to set up and run, and they provide a great deal of insight. At the other extreme, individual-based landscape models provide the greatest verisimilitude, albeit at the cost of greatly increased complexity. We conclude with a discussion of the implications of the severe problems of parameterising models.
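The simple two-stage life-history model recommended here reduces to a 2x2 projection matrix whose dominant eigenvalue is the population growth rate endpoint. The sketch below uses a generic juvenile/adult structure with illustrative parameter names; it is not a parameterisation from any specific study.

```python
def two_stage_growth_rate(juv_survival, maturation, adult_survival, fecundity):
    """Asymptotic population growth rate (dominant eigenvalue) of the
    two-stage projection matrix

        [J']   [s_j*(1 - m)   f  ] [J]
        [A'] = [s_j*m         s_a] [A]

    where s_j, s_a are annual survival probabilities, m the probability
    that a surviving juvenile matures, and f adult fecundity."""
    a = juv_survival * (1.0 - maturation)  # remain a juvenile
    b = fecundity                          # juveniles produced per adult
    c = juv_survival * maturation          # mature into the adult stage
    d = adult_survival                     # adult survival
    # dominant eigenvalue of [[a, b], [c, d]] via the quadratic formula
    tr, det = a + d, a * d - b * c
    return 0.5 * (tr + (tr * tr - 4.0 * det) ** 0.5)
```

A growth rate above 1 indicates an increasing population; toxicant effects enter by depressing the survival or fecundity entries.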

Relevance: 100.00%

Abstract:

Habitat-based statistical models relating patterns of presence and absence of species to habitat variables can be useful for resolving conservation-related problems and highlighting the causes of population declines. In this paper, we apply such a modelling approach to an endemic amphibian, the Sardinian mountain newt Euproctus platycephalus, considered by the IUCN a critically endangered species. Sardinian newts inhabit freshwater habitat in streams, small lakes and pools on the island of Sardinia (Italy). Reported declines of newt populations are not yet supported by quantitative data; however, they are perceived or suspected across the species' historical range. This study represents a first attempt to statistically relate habitat characteristics to Sardinian newt occurrence and persistence. Linear regression analysis revealed that newts are more likely to be found in sites with colder water temperature, less riparian vegetation and, marginally, an absence of fish. The implications of the results for the conservation of the species are discussed, and suggestions for the short-term management of newt-inhabited sites are made.
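A habitat model of this kind relates a binary presence/absence response to habitat variables; a common way to fit such a relationship is a logistic model (used here in place of the paper's linear regression). The sketch below fits a one-variable logistic model by gradient descent on synthetic data in which newts favour colder water; the data, variable names and learning settings are entirely illustrative.

```python
import math
import random

def fit_logistic(X, y, lr=0.01, epochs=3000):
    """Fit presence/absence ~ habitat variables by gradient descent on
    the logistic log-likelihood (a pure-Python stand-in for a statistics
    package). Returns (weights, intercept)."""
    n = len(X)
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        gw, gb = [0.0] * len(w), 0.0
        for xi, yi in zip(X, y):
            z = b + sum(wj * xj for wj, xj in zip(w, xi))
            z = max(-35.0, min(35.0, z))      # guard against exp overflow
            p = 1.0 / (1.0 + math.exp(-z))    # predicted presence probability
            err = p - yi
            for j, xj in enumerate(xi):
                gw[j] += err * xj
            gb += err
        w = [wj - lr * gwj / n for wj, gwj in zip(w, gw)]
        b -= lr * gb / n
    return w, b

# synthetic survey: presence more likely at colder water temperatures
random.seed(1)
X = [[random.uniform(5.0, 25.0)] for _ in range(200)]  # water temperature
y = [1 if xi[0] < 14.0 + random.gauss(0.0, 2.0) else 0 for xi in X]
w, b = fit_logistic(X, y)  # the temperature weight should come out negative
```

The fitted negative temperature coefficient mirrors the paper's finding that newts favour colder sites; additional habitat variables would simply add columns to X.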