27 results for 150507 Pricing (incl. Consumer Value Estimation)
Abstract:
Valuation is often said to be "an art, not a science", but this relates to the techniques employed to calculate value, not to the underlying concept itself. Valuation practice has documented different bases of value, or definitions of value, both internationally and nationally. This paper discusses these definitions and suggests that there is a common thread tying them together.
Abstract:
The general focus of this paper is the regional estimation of marginal benefits of targeted water pollution abatement to instream uses. Benefit estimates are derived from actual consumer choices of recreational fishing activities and the implied expenditures for various levels of water quality. The methodology is applied to measuring the benefits accruing to recreational anglers in Indiana from the abatement of pollutants that are by-products of agricultural crop production.
Abstract:
The recent roll-out of smart metering technologies in several developed countries has intensified research on the impacts of Time-of-Use (TOU) pricing on consumption. This paper analyses a TOU dataset from the Province of Trento in Northern Italy using a stochastic adjustment model. Findings highlight the non-steadiness of the relationship between consumption and TOU price. Weather and active occupancy can partly explain future consumption in relation to price.
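The stochastic adjustment model itself is not specified in the abstract; as a rough illustration of the general idea, here is a minimal partial-adjustment sketch in Python, where desired consumption depends on the TOU price and actual consumption closes only a fraction of the gap each period (all parameter values and names are hypothetical, not the paper's):

```python
import random

def simulate_partial_adjustment(price, lam=0.4, beta=-0.8, base=10.0,
                                periods=50, noise=0.1, seed=1):
    """Partial-adjustment sketch: desired consumption c* depends on the
    TOU price, and actual consumption adjusts a fraction `lam` of the
    gap (c* - c) each period, plus a small random shock."""
    rng = random.Random(seed)
    c = base
    path = []
    for _ in range(periods):
        c_star = base + beta * price          # desired consumption at this price
        c = c + lam * (c_star - c) + rng.gauss(0.0, noise)
        path.append(c)
    return path

# at price 2.0 consumption drifts down toward base + beta * price = 8.4
path = simulate_partial_adjustment(price=2.0)
```

The non-steadiness the paper reports would correspond to `lam` and `beta` themselves varying with weather and occupancy rather than staying fixed.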
Abstract:
We introduce a modified conditional logit model that takes account of uncertainty associated with mis-reporting in revealed preference experiments estimating willingness-to-pay (WTP). Like Hausman et al. [Journal of Econometrics (1998) Vol. 87, pp. 239-269], our model captures the extent and direction of uncertainty by respondents. Using a Bayesian methodology, we apply our model to a choice modelling (CM) data set examining UK consumer preferences for non-pesticide food. We compare the results of our model with the Hausman model. WTP estimates are produced for different groups of consumers and we find that modified estimates of WTP, which take account of mis-reporting, are substantially revised downwards. We find a significant proportion of respondents mis-reporting in favour of the non-pesticide option. Finally, with this data set, Bayes factors suggest that our model is preferred to the Hausman model.
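The mis-reporting extension is specific to the paper, but the underlying conditional logit machinery is standard. A minimal sketch of the choice probabilities and the usual WTP ratio of coefficients (the coefficient values below are illustrative, not the paper's estimates):

```python
import math

def conditional_logit_probs(utilities):
    """Conditional logit choice probabilities:
    P(j) = exp(V_j) / sum_k exp(V_k)."""
    m = max(utilities)                       # stabilise the exponentials
    expv = [math.exp(v - m) for v in utilities]
    s = sum(expv)
    return [e / s for e in expv]

def wtp(beta_attr, beta_price):
    """Marginal willingness-to-pay for an attribute: the ratio of the
    attribute coefficient to the negative of the price coefficient."""
    return -beta_attr / beta_price

probs = conditional_logit_probs([0.5, 1.0, 0.2])   # three alternatives
mwtp = wtp(0.6, -0.3)                              # = 2.0 money units
```

A mis-reporting correction of the kind the paper proposes would mix these probabilities with a reporting-error process before forming the likelihood.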
Abstract:
This paper considers the problem of estimation when one of a number of populations, assumed normal with known common variance, is selected on the basis of it having the largest observed mean. Conditional on selection of the population, the observed mean is a biased estimate of the true mean. This problem arises in the analysis of clinical trials in which selection is made between a number of experimental treatments that are compared with each other either with or without an additional control treatment. Attempts to obtain approximately unbiased estimates in this setting have been proposed by Shen [2001. An improved method of evaluating drug effect in a multiple dose clinical trial. Statist. Medicine 20, 1913–1929] and Stallard and Todd [2005. Point estimates and confidence regions for sequential trials involving selection. J. Statist. Plann. Inference 135, 402–419]. This paper explores the problem in the simple setting in which two experimental treatments are compared in a single analysis. It is shown that in this case the estimate of Stallard and Todd is the maximum-likelihood estimate (m.l.e.), and this is compared with the estimate proposed by Shen. In particular, it is shown that the m.l.e. has infinite expectation whatever the true value of the mean being estimated. We show that there is no conditionally unbiased estimator, and propose a new family of approximately conditionally unbiased estimators, comparing these with the estimators suggested by Shen.
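The conditional bias the paper addresses is easy to exhibit by simulation: when two equal-mean arms are compared and the one with the larger observed mean is selected, the selected estimate is biased upwards (for two independent standard normals the bias is 1/sqrt(pi), about 0.564). A small Monte-Carlo sketch:

```python
import random
import statistics

def selection_bias(mu1, mu2, sigma=1.0, reps=20000, seed=7):
    """Monte-Carlo estimate of the conditional bias of the observed mean
    when the arm with the larger observed mean is selected: average of
    (selected estimate - true mean of the selected arm)."""
    rng = random.Random(seed)
    errors = []
    for _ in range(reps):
        x1 = rng.gauss(mu1, sigma)
        x2 = rng.gauss(mu2, sigma)
        if x1 >= x2:
            errors.append(x1 - mu1)
        else:
            errors.append(x2 - mu2)
    return statistics.mean(errors)

bias = selection_bias(0.0, 0.0)   # close to 1/sqrt(pi) ~ 0.564
```

This is the uncorrected bias that the estimators of Shen and of Stallard and Todd are designed to address.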
Abstract:
Finding an estimate of the channel impulse response (CIR) by correlating a received known (training) sequence with the sent training sequence is commonplace. Where required, it is also common to truncate the longer correlation to a subset of correlation coefficients by finding the set of N sequential correlation coefficients with the maximum power. This paper presents a new approach to selecting the optimal set of N CIR coefficients from the correlation rather than relying on power. The algorithm reconstructs a set of predicted symbols using the training sequence and various subsets of the correlation, and selects the subset that yields the minimum mean squared error between the actual received symbols and the reconstructed symbols. The algorithm is applied in the context of the TDMA-based GSM/GPRS system, and the results presented demonstrate an improvement in system performance. The approach, however, lends itself to any training-sequence-based communication system, such as those found in wireless consumer electronic devices.
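A minimal sketch of the two selection rules the abstract contrasts, assuming real-valued symbols and a toy two-tap channel (the GSM-specific details are omitted): the conventional rule keeps the N-tap window of the correlation with maximum power, while the proposed rule keeps the window whose reconstructed symbols best match the received symbols in mean squared error.

```python
def max_power_window(corr, n):
    """Conventional rule: the length-n window of CIR taps with largest energy."""
    best, best_p = 0, -1.0
    for i in range(len(corr) - n + 1):
        p = sum(c * c for c in corr[i:i + n])
        if p > best_p:
            best, best_p = i, p
    return corr[best:best + n]

def min_mse_window(corr, n, training, received):
    """Proposed rule: the window whose reconstruction (training sequence
    convolved with the candidate taps) minimises the squared error
    against the actually received symbols."""
    def convolve(x, h):
        return [sum(h[k] * x[t - k] for k in range(len(h)) if 0 <= t - k < len(x))
                for t in range(len(x))]
    best, best_e = None, float("inf")
    for i in range(len(corr) - n + 1):
        taps = corr[i:i + n]
        rec = convolve(training, taps)
        e = sum((a - b) ** 2 for a, b in zip(rec, received))
        if e < best_e:
            best, best_e = taps, e
    return best

# toy example: true 2-tap channel [0.9, 0.4] buried in a 4-tap correlation
taps = min_mse_window([0.05, 0.9, 0.4, 0.05], 2,
                      [1, -1, 1, 1], [0.9, -0.5, 0.5, 1.3])
```

In this toy case both rules agree; the paper's point is that they can differ, and that the MSE rule is then the better choice.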
Abstract:
Numerous studies have documented the failure of the static and conditional capital asset pricing models to explain the difference in returns between value and growth stocks. This paper examines the post-1963 value premium by employing a model that captures the time-varying total risk of the value-minus-growth portfolios. Our results show that the time-series of value premia is strongly and positively correlated with its volatility. This conclusion is robust to the criterion used to sort stocks into value and growth portfolios and to the country under review (the US and the UK). Our paper is consistent with evidence on the possible role of idiosyncratic risk in explaining equity returns, and also with a separate strand of literature concerning the relative lack of reversibility of value firms' investment decisions.
Abstract:
This study proposes a conceptual model for customer experience quality and its impact on customer relationship outcomes. Customer experience is conceptualized as the customer’s subjective response to the holistic direct and indirect encounter with the firm, and customer experience quality as its perceived excellence or superiority. Using the repertory grid technique in 40 interviews in B2B and B2C contexts, the authors find that customer experience quality is judged with respect to its contribution to value-in-use, and hence propose that value-in-use mediates between experience quality and relationship outcomes. Experience quality includes evaluations not just of the firm’s products and services but also of peer-to-peer and complementary supplier encounters. In assessing experience quality in B2B contexts, customers place a greater emphasis on firm practices that focus on understanding and delivering value-in-use than is generally the case in B2C contexts. Implications for practitioners’ customer insight processes and future research directions are suggested.
Abstract:
This paper compares a number of different extreme value models for determining the value at risk (VaR) of three LIFFE futures contracts. A semi-nonparametric approach is also proposed, where the tail events are modeled using the generalised Pareto distribution, and normal market conditions are captured by the empirical distribution function. The value at risk estimates from this approach are compared with those of standard nonparametric extreme value tail estimation approaches, with a small sample bias-corrected extreme value approach, and with those calculated from bootstrapping the unconditional density and bootstrapping from a GARCH(1,1) model. The results indicate that, for a holdout sample, the proposed semi-nonparametric extreme value approach yields superior results to other methods, but the small sample tail index technique is also accurate.
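A rough sketch of the semi-nonparametric idea (not the paper's exact estimator): use the empirical distribution below a high threshold, fit a generalised Pareto distribution to the exceedances above it (simple moment estimators stand in for maximum likelihood here), and invert the standard peaks-over-threshold formula for the VaR quantile.

```python
import random
import statistics

def gpd_tail_var(losses, p=0.99, tail_frac=0.10):
    """Semi-nonparametric VaR sketch: empirical body, GPD tail.
    GPD shape xi and scale sigma come from moment estimators on the
    exceedances; the p-quantile uses the peaks-over-threshold formula
    VaR_p = u + (sigma/xi) * (((n/k)*(1-p))**(-xi) - 1)."""
    x = sorted(losses)
    n = len(x)
    k = max(int(n * tail_frac), 10)        # number of tail exceedances
    u = x[n - k - 1]                       # threshold (empirical quantile)
    exc = [v - u for v in x[n - k:]]
    m = statistics.mean(exc)
    v = statistics.variance(exc)
    xi = 0.5 * (1.0 - m * m / v)           # GPD moment estimators
    sigma = 0.5 * m * (1.0 + m * m / v)
    return u + (sigma / xi) * (((n / k) * (1.0 - p)) ** (-xi) - 1.0)

# illustrative data: exponential losses, whose true 99% quantile is ln(100) ~ 4.6
rng = random.Random(3)
losses = [rng.expovariate(1.0) for _ in range(5000)]
var99 = gpd_tail_var(losses, p=0.99)
```

For real futures returns one would first filter for volatility clustering (e.g. the GARCH(1,1) variant the paper benchmarks against) before applying the tail fit.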
Abstract:
The application of real options theory to commercial real estate has developed rapidly during the last 15 years. In particular, several pricing models have been applied to value real options embedded in development projects. In this study we use a case study of a mixed-use development scheme and identify the major implied and explicit real options available to the developer. We offer the perspective of a real market application by exploring different binomial models and the associated methods of estimating the crucial parameter of volatility. We consider simple binomial lattices and quadranomial lattices, and demonstrate the sensitivity of the results to the choice of inputs and method.
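As one concrete instance of the lattice machinery, here is a standard Cox-Ross-Rubinstein binomial tree for a European call, the basic call-like structure underlying a development option (the quadranomial extension and the volatility-estimation issues the study explores are beyond this sketch):

```python
import math

def crr_binomial_call(s0, k, r, sigma, t, steps=200):
    """Cox-Ross-Rubinstein binomial lattice for a European call:
    build terminal payoffs, then discount backwards under the
    risk-neutral up-probability q."""
    dt = t / steps
    u = math.exp(sigma * math.sqrt(dt))    # up factor
    d = 1.0 / u                            # down factor
    q = (math.exp(r * dt) - d) / (u - d)   # risk-neutral up probability
    disc = math.exp(-r * dt)
    # terminal payoffs after j up-moves and (steps - j) down-moves
    values = [max(s0 * u ** j * d ** (steps - j) - k, 0.0)
              for j in range(steps + 1)]
    # roll back through the lattice
    for _ in range(steps):
        values = [disc * (q * values[j + 1] + (1 - q) * values[j])
                  for j in range(len(values) - 1)]
    return values[0]

# at-the-money call: the lattice price converges to the Black-Scholes
# value of about 10.45 for these inputs
price = crr_binomial_call(s0=100, k=100, r=0.05, sigma=0.2, t=1.0)
```

The sensitivity the study documents shows up directly here: small changes to `sigma` or to the lattice construction move `price` materially.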
Abstract:
Statistical graphics are a fundamental, yet often overlooked, set of components in the repertoire of data analytic tools. Graphs are quick and efficient, yet simple instruments of preliminary exploration of a dataset to understand its structure and to provide insight into influential aspects of inference such as departures from assumptions and latent patterns. In this paper, we present and assess a graphical device for choosing a method for estimating population size in capture-recapture studies of closed populations. The basic concept is derived from a homogeneous Poisson distribution where the ratios of neighboring Poisson probabilities multiplied by the value of the larger neighbor count are constant. This property extends to the zero-truncated Poisson distribution which is of fundamental importance in capture-recapture studies. In practice however, this distributional property is often violated. The graphical device developed here, the ratio plot, can be used for assessing specific departures from a Poisson distribution. For example, simple contaminations of an otherwise homogeneous Poisson model can be easily detected and a robust estimator for the population size can be suggested. Several robust estimators are developed and a simulation study is provided to give some guidance on which should be used in practice. More systematic departures can also easily be detected using the ratio plot. In this paper, the focus is on Gamma mixtures of the Poisson distribution which leads to a linear pattern (called structured heterogeneity) in the ratio plot. More generally, the paper shows that the ratio plot is monotone for arbitrary mixtures of power series densities.
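The ratio described in the abstract is r_x = (x+1) f(x+1) / f(x): constant (equal to lambda) under a homogeneous Poisson model, and linearly increasing under gamma mixing. A minimal sketch computing these ratios from count data (the example frequencies are constructed to be exactly Poisson-shaped with lambda = 2):

```python
from collections import Counter

def ratio_plot_values(counts):
    """Ratio-plot values r_x = (x+1) * f(x+1) / f(x) from observed counts.
    A flat sequence of ratios is consistent with a homogeneous Poisson
    model; a rising linear pattern suggests gamma-mixed heterogeneity."""
    freq = Counter(counts)
    xs = sorted(x for x in freq if (x + 1) in freq)
    return [(x, (x + 1) * freq[x + 1] / freq[x]) for x in xs]

# frequencies proportional to Poisson(2) probabilities at x = 0..4
counts = [0] * 90 + [1] * 180 + [2] * 180 + [3] * 120 + [4] * 60
ratios = ratio_plot_values(counts)   # every ratio equals 2.0
```

Plotting these (x, r_x) pairs is exactly the ratio plot; in a capture-recapture setting the x = 0 cell is unobserved, which is why the zero-truncated extension matters.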
Abstract:
Consumer studies of meat have tended to use quantitative methodologies providing a wealth of statistically malleable information, but little in-depth insight into consumer perceptions of meat. The aim of the present study was, therefore, to understand factors perceived as important in the selection of chicken meat, using a qualitative methodology. Focus group discussions were tape recorded, transcribed verbatim and content analysed for major themes. Themes arising implied that "appearance" and "convenience" were the most important determinants of choice of chicken meat, and these factors appeared to be associated with perceptions of freshness, healthiness, product versatility and concepts of value. A descriptive model has been developed to illustrate the interrelationship between factors affecting chicken meat choice. This study indicates that those involved in the production and retailing of chicken products should concentrate upon product appearance and convenience as market drivers for their products.
Abstract:
In this paper we perform an analytical and numerical study of Extreme Value distributions in discrete dynamical systems. In this setting, recent works have shown how to get a statistics of extremes in agreement with the classical Extreme Value Theory. We pursue these investigations by giving analytical expressions of Extreme Value distribution parameters for maps that have an absolutely continuous invariant measure. We compare these analytical results with numerical experiments in which we study the convergence to limiting distributions using the so-called block-maxima approach, pointing out in which cases we obtain robust estimation of parameters. In regular maps for which mixing properties do not hold, we show that the fitting procedure to the classical Extreme Value Distribution fails, as expected. However, we obtain an empirical distribution that can be explained starting from a different observable function for which Nicolis et al. (Phys. Rev. Lett. 97(21): 210602, 2006) have found analytical results.
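The block-maxima approach can be sketched generically (the dynamical-systems observables are replaced here by i.i.d. exponential draws, whose block maxima converge to a Gumbel law; simple moment estimators stand in for a full GEV fit):

```python
import math
import random
import statistics

def block_maxima(series, block_size):
    """Split a series into consecutive blocks and keep each block's maximum."""
    return [max(series[i:i + block_size])
            for i in range(0, len(series) - block_size + 1, block_size)]

def gumbel_moment_fit(maxima):
    """Moment estimators for the Gumbel law (GEV with shape 0):
    scale = s * sqrt(6) / pi, location = mean - Euler_gamma * scale."""
    m = statistics.mean(maxima)
    s = statistics.stdev(maxima)
    scale = s * math.sqrt(6.0) / math.pi
    loc = m - 0.5772156649 * scale        # Euler-Mascheroni constant
    return loc, scale

# maxima of blocks of 100 Exp(1) draws: roughly Gumbel(ln(100), 1)
rng = random.Random(5)
series = [rng.expovariate(1.0) for _ in range(20000)]
mx = block_maxima(series, 100)
loc, scale = gumbel_moment_fit(mx)
```

For chaotic maps the same recipe is applied to the orbit of an observable; the paper's point is that the fit is reliable only when suitable mixing properties hold.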
Abstract:
This paper presents novel observer-based techniques for the estimation of flow demands in gas networks, from sparse pressure telemetry. A completely observable model is explored, constructed by incorporating difference equations that assume the flow demands are steady. Since the flow demands usually vary slowly with time, this is a reasonable approximation. Two techniques for constructing robust observers are employed: robust eigenstructure assignment and singular value assignment. These techniques help to reduce the effects of the system approximation. Modelling error may be further reduced by making use of known profiles for the flow demands. The theory is extended to deal successfully with the problem of measurement bias. The pressure measurements available are subject to constant biases which degrade the flow demand estimates, and such biases need to be estimated. This is achieved by constructing a further model variation that incorporates the biases into an augmented state vector, but now includes information about the flow demand profiles in a new form.
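A minimal Luenberger-style sketch of the core idea (a toy single-pipe model, not the paper's network model or its eigenstructure-assignment design): pressure is measured, the steady flow demand is an unmeasured augmented state, and the observer gain places both error eigenvalues at 0.5 so the demand estimate converges from telemetry alone.

```python
def run_observer(steps=60, d_true=3.0, p0=50.0):
    """Toy observer for an augmented state x = [pressure p, demand d]:
    plant   p(t+1) = p(t) - 0.1 * d,  d constant (steady demand)
    output  y = p
    The observer x_hat(t+1) = A x_hat + L (y - C x_hat) uses gains
    L = [1.0, -2.5] chosen so that A - L C has both eigenvalues at 0.5,
    guaranteeing the estimation error decays geometrically."""
    l1, l2 = 1.0, -2.5
    p, d = p0, d_true                 # true plant state
    ph, dh = p0, 0.0                  # observer state: demand unknown at start
    for _ in range(steps):
        y = p                          # pressure telemetry
        innov = y - ph                 # innovation (measurement residual)
        ph, dh = ph - 0.1 * dh + l1 * innov, dh + l2 * innov
        p = p - 0.1 * d                # plant update with steady demand
    return dh                          # estimated demand

demand_estimate = run_observer()       # converges to the true demand 3.0
```

The paper's measurement-bias extension corresponds to augmenting the state once more with the constant bias, at which point the demand profile information is needed to keep the augmented model observable.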