507 results for 150507 Pricing (incl. Consumer Value Estimation)


Relevance: 20.00%

Abstract:

A pervasive and puzzling feature of banks’ Value-at-Risk (VaR) is its abnormally high level, which leads to excessive regulatory capital. A possible explanation for the tendency of commercial banks to overstate their VaR is that they incompletely account for the diversification effect among broad risk categories (e.g., equity, interest rate, commodity, credit spread, and foreign exchange). By underestimating the diversification effect, banks’ proprietary VaR models produce overly prudent market risk assessments. In this paper, we empirically examine the validity of this hypothesis using actual VaR data from major US commercial banks. In contrast to the VaR diversification hypothesis, we find that US banks show no sign of systematic underestimation of the diversification effect. In particular, the diversification effects used by banks are very close to (and quite often larger than) our empirical diversification estimates. A direct implication of this finding is that individual VaRs for each broad risk category, just like aggregate VaRs, are biased risk assessments.
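A minimal sketch of the quantity at issue, assuming the usual definition of the diversification effect as the shortfall of the aggregate VaR relative to the sum of the per-category VaRs; the figures below are hypothetical, not data from the paper:

```python
import numpy as np

def diversification_effect(individual_vars, aggregate_var):
    """Fraction by which the aggregate VaR falls short of the sum of the
    per-category VaRs; 0 would mean no reported diversification benefit."""
    total = float(np.sum(individual_vars))
    return (total - aggregate_var) / total

# Hypothetical daily VaRs (in $ millions) for the five broad risk categories:
# equity, interest rate, commodity, credit spread, foreign exchange.
category_vars = np.array([12.0, 18.0, 5.0, 9.0, 6.0])
reported_aggregate_var = 32.0

de = diversification_effect(category_vars, reported_aggregate_var)
print(f"Diversification effect: {de:.1%}")  # 36.0% with these numbers
```

Under the diversification hypothesis, the reported effect would be systematically smaller than an empirical estimate built from the co-movement of the five P&L series; the paper finds the opposite.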

Relevance: 20.00%

Abstract:

In this paper we study both the level of Value-at-Risk (VaR) disclosure and the accuracy of the disclosed VaR figures for a sample of US and international commercial banks. To measure the level of VaR disclosures, we develop a VaR Disclosure Index that captures many different facets of market risk disclosure. Using panel data over the period 1996–2005, we find an overall upward trend in the quantity of information released to the public. We also find that Historical Simulation is by far the most popular VaR method. We assess the accuracy of VaR figures by studying the number of VaR exceedances and whether actual daily VaRs contain information about the volatility of subsequent trading revenues. Unlike the level of VaR disclosure, the quality of VaR disclosure shows no sign of improvement over time. We find that VaR computed using Historical Simulation contains very little information about future volatility.
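The two accuracy checks described above are easy to state concretely. Below is a hedged sketch, not the paper's code: a Historical Simulation VaR as an empirical quantile of past returns, and an exceedance count comparing disclosed VaRs with realised trading revenues (data simulated for illustration):

```python
import numpy as np

def historical_simulation_var(returns, confidence=0.99):
    """One-day Historical Simulation VaR: the empirical loss quantile of a
    window of past returns (reported as a positive number)."""
    return -np.quantile(returns, 1.0 - confidence)

def count_exceedances(revenues, disclosed_vars):
    """Number of days on which the trading loss exceeded the disclosed VaR."""
    return int(np.sum(revenues < -np.asarray(disclosed_vars)))

rng = np.random.default_rng(0)
history = rng.normal(0.0, 1.0, size=500)   # hypothetical P&L history
var_99 = historical_simulation_var(history)
future = rng.normal(0.0, 1.0, size=250)    # one year of trading revenues
print(var_99, count_exceedances(future, np.full(250, var_99)))
```

At the 99% level one would expect roughly 2–3 exceedances in 250 trading days if the disclosed VaR were accurate.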

Relevance: 20.00%

Abstract:

Lignocellulosic waste materials are the most promising feedstock for generating a renewable, carbon-neutral substitute for existing liquid fuels. The development of value-added products from lignin will greatly improve the economics of producing liquid fuels from biomass. This review gives an outline of lignin chemistry, describes the current processes of lignocellulosic biomass fractionation and the lignin products obtained through these processes, and then outlines current and potential value-added applications of these products, in particular as components of polymer composites.

Research highlights: The use of lignocellulosic biomass to produce platform chemicals and industrial products enhances the sustainability of natural resources and improves environmental quality by reducing greenhouse and toxic emissions. In addition, the development of lignin-based products improves the economics of producing liquid transportation fuel from lignocellulosic feedstock. Value adding can be achieved by converting lignin to functionally equivalent products that rely on its intrinsic properties. This review outlines lignin chemistry and some potential high-value products that can be made from lignin.

Keywords: Lignocellulose materials; Lignin chemistry; Application

Relevance: 20.00%

Abstract:

As part of a larger literature focused on identifying and relating the antecedents and consequences of diffusing organizational practices/ideas, recent research has debated the international adoption of a shareholder value orientation (SVO). In this debate, financial economists characterize the adoption of an SVO as performance-enhancing and thus inevitable, while behavioral scientists dispute both claims by invoking institutional differences. This study seeks to provide some resolution to the debate (and to advance current understanding of the diffusion of practices/ideas) by developing a socio-political perspective that links the antecedents and consequences of an SVO. In particular, we introduce the notions of misaligned elites and misfitted practices in our analysis of how and why differences in the technical and cultural preferences of major owners influence a firm’s adoption and (un)successful implementation of an SVO among the largest 100 corporations in the Netherlands from 1992 to 2006. We conclude with a discussion of the implications of our perspective and our findings for future research on corporate governance and the diffusion of organizational practices/ideas.

Relevance: 20.00%

Abstract:

The literature abounds with descriptions of failures in high-profile projects, and a range of initiatives has been generated to enhance project management practice (e.g., Morris, 2006). Estimating from our own research, scores of other project failures go unrecorded. Many of these failures can be explained using existing project management theory: poor risk management, inaccurate estimating, cultures of optimism dominating decision making, stakeholder mismanagement, inadequate timeframes, and so on. Nevertheless, in spite of extensive discussion and analysis of failures and attention to the presumed causes of failure, projects continue to fail in unexpected ways. In the 1990s, three U.S. state departments of motor vehicles (DMVs) cancelled major projects due to time and cost overruns and an inability to meet project goals (IT-Cortex, 2010). The California DMV failed to revitalize its drivers’ license and registration application process after spending $45 million. The Oregon DMV cancelled its five-year, $50 million project to automate its manual, paper-based operation after three years, when the estimate grew to $123 million, the duration stretched to eight years or more, and the prototype was a complete failure. In 1997, the Washington state DMV cancelled its license application mitigation project because it would have been too big and obsolete by the time it was estimated to be finished. There are countless similar examples of projects that have been abandoned or that have not delivered their requirements.

Relevance: 20.00%

Abstract:

Numerous tools and techniques have been developed to eliminate or reduce waste and carry out lean concepts in the manufacturing environment. However, appropriate lean tools need to be selected and implemented in order to fulfil manufacturers’ needs within their budgetary constraints. As a result, it is important to identify manufacturer needs and implement only those tools that contribute the maximum benefit to those needs. In this research, a mathematical model is proposed for maximising the perceived value of manufacturer needs, and a step-by-step methodology is developed to select the best performance metrics along with appropriate lean strategies within the budgetary constraints. The proposed model and method are demonstrated with the help of a case study.
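The paper's exact formulation is not reproduced here, but the selection problem it describes has the shape of a 0/1 knapsack: maximise the total perceived value of the chosen lean tools subject to a budget. A minimal sketch with hypothetical tools, costs, and value scores:

```python
from itertools import combinations

# Hypothetical lean tools as (implementation cost, perceived value);
# the actual metrics and weights in the paper's model may differ.
tools = {
    "5S": (10, 4.0),
    "Kanban": (25, 7.5),
    "SMED": (30, 6.0),
    "TPM": (40, 9.0),
}
budget = 60

def best_subset(tools, budget):
    """Exhaustive 0/1 selection: maximise total perceived value subject to
    the budgetary constraint (fine for a handful of candidate tools)."""
    best, best_value = (), 0.0
    names = list(tools)
    for r in range(1, len(names) + 1):
        for subset in combinations(names, r):
            cost = sum(tools[t][0] for t in subset)
            value = sum(tools[t][1] for t in subset)
            if cost <= budget and value > best_value:
                best, best_value = subset, value
    return best, best_value

print(best_subset(tools, budget))  # (('Kanban', 'SMED'), 13.5) here
```

For realistic numbers of candidate tools, the same model would be solved with integer programming rather than enumeration.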

Relevance: 20.00%

Abstract:

Many traffic situations require drivers to cross or merge into a stream having higher priority. Gap acceptance theory enables us to model such processes to analyse traffic operation. This discussion demonstrates that a numerical search, fine-tuned by statistical analysis, can be used to determine the most likely critical gap for a sample of drivers, based on each driver’s largest rejected gap and accepted gap. This method shares some common features with the Maximum Likelihood Estimation technique (Troutbeck 1992) but lends itself well to contemporary analysis tools such as spreadsheets and is particularly analytically transparent. The method is considered not to bias the estimate of the critical gap because of very small or very large rejected gaps. However, it requires a sample large enough that largest rejected gap/accepted gap pairs are reasonably represented within a fairly narrow highest-likelihood search band.
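A hedged sketch of the likelihood idea shared with Troutbeck's (1992) technique, assuming critical gaps are normally distributed across drivers and using hypothetical gap pairs; each driver's critical gap is taken to lie between their largest rejected gap and their accepted gap:

```python
import math

# Hypothetical (largest rejected gap, accepted gap) pairs in seconds.
pairs = [(3.1, 5.2), (2.4, 4.0), (4.0, 6.5), (3.5, 4.8), (2.8, 5.5)]

def norm_cdf(x, mu, sigma):
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def log_likelihood(mu, sigma):
    """Each driver contributes P(largest rejected gap < critical gap
    <= accepted gap) under the assumed critical gap distribution."""
    return sum(
        math.log(max(norm_cdf(a, mu, sigma) - norm_cdf(r, mu, sigma), 1e-12))
        for r, a in pairs
    )

# A plain grid search over the mean and spread, as a spreadsheet might do.
best = max(
    ((m / 10.0, s / 10.0) for m in range(20, 70) for s in range(2, 20)),
    key=lambda p: log_likelihood(*p),
)
print("most likely mean critical gap and spread (s):", best)
```

The analytical transparency claimed for the method comes from the fact that every cell of the search grid is an explicit, inspectable likelihood value.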

Relevance: 20.00%

Abstract:

Markov chain Monte Carlo (MCMC) estimation provides a solution to the complex integration problems faced in the Bayesian analysis of statistical problems. The implementation of MCMC algorithms is, however, code intensive and time consuming. We have developed a Python package, called PyMCMC, that aids in the construction of MCMC samplers, helps to substantially reduce the likelihood of coding errors, and minimises repetitive code. PyMCMC contains classes for Gibbs, Metropolis-Hastings, independent Metropolis-Hastings, random walk Metropolis-Hastings, orientational bias Monte Carlo, and slice samplers, as well as specific modules for common models, such as a module for Bayesian regression analysis. PyMCMC is straightforward to optimise, taking advantage of the Python libraries NumPy and SciPy, and is readily extensible with C or Fortran.
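PyMCMC's own API is not reproduced here. As a hedged illustration of the kind of sampler the package is built to automate, here is a random walk Metropolis-Hastings loop in plain NumPy:

```python
import numpy as np

def random_walk_mh(log_post, init, n_iter=5000, step=0.5, seed=0):
    """Generic random walk Metropolis-Hastings sampler in plain NumPy.
    This is NOT the PyMCMC interface, only the underlying algorithm."""
    rng = np.random.default_rng(seed)
    x = np.asarray(init, dtype=float)
    samples = np.empty((n_iter, x.size))
    lp = log_post(x)
    for i in range(n_iter):
        proposal = x + step * rng.standard_normal(x.size)
        lp_new = log_post(proposal)
        if np.log(rng.uniform()) < lp_new - lp:   # accept/reject step
            x, lp = proposal, lp_new
        samples[i] = x
    return samples

# Example target: a standard bivariate normal log-posterior.
draws = random_walk_mh(lambda x: -0.5 * float(x @ x), init=[0.0, 0.0])
print(draws.mean(axis=0), draws.std(axis=0))
```

Writing even this small loop correctly (proposal, log-acceptance ratio, state bookkeeping) is exactly the repetitive, error-prone code the package is intended to remove.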

Relevance: 20.00%

Abstract:

To date, consumer behaviour research is still over-focused on the functional rather than the dysfunctional. Both empirical and anecdotal evidence suggest that service organisations are burdened with the concept of consumer sovereignty, while consumers freely flout the ‘rules’ of social exchange and behave in deviant and dysfunctional ways. Further, the current scope of consumer misbehaviour research suggests that the phenomenon has principally been studied in the context of economically-focused exchange. This limits our current understanding of consumer misbehaviour to service encounters that are more transactional than relational in nature. Consequently, this thesis takes a Social Exchange approach to consumer misbehaviour and reports a three-stage, multi-method study that examined the nature and antecedents of consumer misbehaviour in professional services. It addresses the following broad research question: What is the nature of consumer misbehaviour during professional service encounters?

Study One initially explored the nature of consumer misbehaviour in professional service encounters using the critical incident technique (CIT) within 38 semi-structured, in-depth interviews. The study was designed to develop a better understanding of what constitutes consumer misbehaviour from a service provider’s perspective.

Once the nature of consumer misbehaviour had been qualified, Study Two focused on developing and refining calibrated items that formed Guttman-like scales for two consumer misbehaviour constructs: one for the most theoretically central type of consumer misbehaviour identified in Study One (i.e., refusal to participate) and, to afford a comparison, one for the most well-theorised and salient type (i.e., verbal abuse). This study used Rasch modelling to investigate whether it was possible to calibrate the escalating severity of a series of decontextualised behavioural descriptors in a valid and reliable manner. Creating scales of calibrated items that capture the variation in severity of different types of consumer misbehaviour allowed for a more valid and reliable investigation of the antecedents of such behaviour.

Lastly, Study Three utilised an experimental design to investigate three key antecedents of consumer misbehaviour: (1) the perceived quality of the service encounter [drawn from Fullerton and Punj’s (1993) model of aberrant consumer behaviour], (2) the violation of consumers’ perceptions of justice and equity [drawn from Rousseau’s (1989) Psychological Contract Theory], and (3) consumers’ affective responses to exchange [drawn from Weiss and Cropanzano’s (1996) Affective Events Theory]. Investigating these antecedents confirmed the newly developed understanding of the nature of consumer misbehaviour during professional service encounters.

Combined, the results of the three studies suggest that consumer misbehaviour is characteristically different within professional services. The most salient and theoretically central behaviours can be measured using increasingly severe decontextualised behavioural descriptors. Further, increasingly severe forms of consumer misbehaviour are likely to occur as a response to consumer anger at low levels of interpersonal service quality. These findings have a range of key implications for both marketing theory and practice.
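A small sketch of the Rasch logic behind Study Two, with hypothetical descriptors and severity values (not items from the thesis): each behavioural descriptor gets a severity location b, and the probability that a respondent at level theta endorses it follows the logistic Rasch form.

```python
import math

def rasch_endorsement_prob(theta, b):
    """Rasch model: P(endorse) = exp(theta - b) / (1 + exp(theta - b))."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# Hypothetical verbal-abuse descriptors ordered by calibrated severity b.
descriptors = {"raised voice": -1.0, "personal insult": 0.5, "threat": 2.0}
for name, b in descriptors.items():
    print(name, round(rasch_endorsement_prob(0.0, b), 2))
```

Calibration amounts to estimating the b values so that the descriptors form a valid, reliable severity ladder.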

Relevance: 20.00%

Abstract:

The number of software vendors offering ‘Software-as-a-Service’ has been increasing in recent years. In the Software-as-a-Service model, software is operated by the software vendor and delivered to the customer as a service. Existing business models and industry structures are challenged by the changes to the deployment and pricing model compared to traditional software. However, the full implications for the way companies create, deliver, and capture value are not yet sufficiently analyzed. Current research is scattered across specific aspects; only a few studies provide a more holistic view of the impact from a business model perspective. For vendors, however, it is crucial to be aware of the potentially far-reaching consequences of Software-as-a-Service. Therefore, a literature review and three exploratory case studies of leading software vendors are used to evaluate possible implications of Software-as-a-Service on business models. The results show an impact on all business model building blocks and highlight, in particular, the often less articulated impact on key activities, customer relationships, and key partnerships for leading software vendors, and they show related challenges, for example with regard to the integration of development and operations processes. The observed implications demonstrate the disruptive character of the concept and identify future research requirements.

Relevance: 20.00%

Abstract:

Ethernet is a key component of the standards used for digital process buses in transmission substations, namely IEC 61850 and IEEE Std 1588-2008 (PTPv2). These standards use multicast Ethernet frames that can be processed by more than one device. This presents some significant engineering challenges when implementing a sampled value process bus, due to the large amount of network traffic. A system of network traffic segregation using a combination of Virtual LAN (VLAN) and multicast address filtering with managed Ethernet switches is presented. This includes VLAN prioritisation of traffic classes such as the IEC 61850 protocols GOOSE, MMS and sampled values (SV), and other protocols like PTPv2. Multicast address filtering is used to limit SV/GOOSE traffic to defined subsets of subscribers. A method is proposed to map substation plant reference designations to multicast address ranges, enabling engineers to determine the type of traffic and the location of its source by inspecting the destination address. This method and the proposed filtering strategy simplify future changes to the prioritisation of network traffic and are applicable to both process bus and station bus applications.
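A hedged sketch of the mapping idea: encode a plant reference designation into the low octets of a sampled value multicast MAC address so that the source can be read off the destination address. The 01-0C-CD-04 prefix is the block commonly recommended for SV traffic in IEC 61850; the encoding of bay and device numbers below is hypothetical, not the paper's exact scheme, and a real deployment must stay within the standard's recommended address range:

```python
SV_PREFIX = (0x01, 0x0C, 0xCD, 0x04)  # commonly recommended SV multicast block

def sv_multicast_address(bay_number: int, device_number: int) -> str:
    """Map a (bay, device) plant designation to a multicast MAC address,
    e.g. bay 3, device 7 -> 01-0C-CD-04-03-07."""
    if not (0 <= bay_number <= 0xFF and 0 <= device_number <= 0xFF):
        raise ValueError("bay and device numbers must each fit in one octet")
    octets = SV_PREFIX + (bay_number, device_number)
    return "-".join(f"{o:02X}" for o in octets)

print(sv_multicast_address(3, 7))  # 01-0C-CD-04-03-07
```

With a scheme like this, a managed switch's multicast filter for one bay reduces to a small, predictable set of addresses, which is what makes the subscription changes described above simple to engineer.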

Relevance: 20.00%

Abstract:

We study model selection strategies based on penalized empirical loss minimization. We point out a tight relationship between error estimation and data-based complexity penalization: any good error estimate may be converted into a data-based penalty function and the performance of the estimate is governed by the quality of the error estimate. We consider several penalty functions, involving error estimates on independent test data, empirical VC dimension, empirical VC entropy, and margin-based quantities. We also consider the maximal difference between the error on the first half of the training data and the second half, and the expected maximal discrepancy, a closely related capacity estimate that can be calculated by Monte Carlo integration. Maximal discrepancy penalty functions are appealing for pattern classification problems, since their computation is equivalent to empirical risk minimization over the training data with some labels flipped.
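The label-flipping computation mentioned in the last sentence can be made concrete. A hedged sketch assuming binary 0/1 labels, with logistic regression standing in for the empirical risk minimizer (the paper's setting is generic ERM, not any particular learner):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def maximal_discrepancy(X, y):
    """Estimate the maximal discrepancy penalty: flip the labels on the
    second half of the sample, run empirical risk minimisation on the
    flipped data, and take the difference of the two half-sample errors."""
    n = len(y) // 2
    y_flipped = y.copy()
    y_flipped[n:] = 1 - y_flipped[n:]            # flip second-half labels
    clf = LogisticRegression(max_iter=1000).fit(X, y_flipped)
    pred = clf.predict(X)
    err_first = np.mean(pred[:n] != y[:n])       # error on first half
    err_second = np.mean(pred[n:] != y[n:])      # error on second half
    return err_second - err_first

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = (X[:, 0] + 0.3 * rng.normal(size=200) > 0).astype(int)
print(maximal_discrepancy(X, y))
```

Minimising the empirical risk on the flipped sample is the same as maximising the discrepancy between the two halves, which is why no optimisation routine beyond ordinary ERM is needed.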

Relevance: 20.00%

Abstract:

We consider complexity penalization methods for model selection. These methods aim to choose a model to optimally trade off estimation and approximation errors by minimizing the sum of an empirical risk term and a complexity penalty. It is well known that if we use a bound on the maximal deviation between empirical and true risks as a complexity penalty, then the risk of our choice is no more than the approximation error plus twice the complexity penalty. There are many cases, however, where complexity penalties like this give loose upper bounds on the estimation error. In particular, if we choose a function from a suitably simple convex function class with a strictly convex loss function, then the estimation error (the difference between the risk of the empirical risk minimizer and the minimal risk in the class) approaches zero at a faster rate than the maximal deviation between empirical and true risks. In this paper, we address the question of whether it is possible to design a complexity penalized model selection method for these situations. We show that, provided the sequence of models is ordered by inclusion, in these cases we can use tight upper bounds on estimation error as a complexity penalty. Surprisingly, this is the case even in situations when the difference between the empirical risk and true risk (and indeed the error of any estimate of the approximation error) decreases much more slowly than the complexity penalty. We give an oracle inequality showing that the resulting model selection method chooses a function with risk no more than the approximation error plus a constant times the complexity penalty.
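In generic notation (not the paper's exact constants), the classical bound referred to above, for nested models F_1 ⊂ F_2 ⊂ … with true risk R and empirical risk R̂_n:

```latex
\hat{f}_k = \arg\min_{f \in F_k} \hat{R}_n(f), \qquad
\hat{k} = \arg\min_{k} \Bigl( \hat{R}_n(\hat{f}_k) + \operatorname{pen}(k) \Bigr).

% If the penalty dominates the maximal deviation for every model class,
%   \operatorname{pen}(k) \ge \sup_{f \in F_k} \bigl| R(f) - \hat{R}_n(f) \bigr|,
% then the selected function satisfies the oracle-type bound
R\bigl(\hat{f}_{\hat{k}}\bigr)
  \le \min_{k} \Bigl( \inf_{f \in F_k} R(f) + 2\,\operatorname{pen}(k) \Bigr).
```

The paper's contribution is that, in the convex-class settings described, pen(k) can instead be a tight bound on the estimation error itself, which may shrink much faster than the maximal deviation, while an oracle inequality of the same shape still holds with a constant times the penalty.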