260 results for Value Chains


Relevance:

20.00%

Publisher:

Abstract:

A pervasive and puzzling feature of banks’ Value-at-Risk (VaR) is its abnormally high level, which leads to excessive regulatory capital. A possible explanation for the tendency of commercial banks to overstate their VaR is that they incompletely account for the diversification effect among broad risk categories (e.g., equity, interest rate, commodity, credit spread, and foreign exchange). By underestimating the diversification effect, banks’ proprietary VaR models produce overly prudent market risk assessments. In this paper, we examine empirically the validity of this hypothesis using actual VaR data from major US commercial banks. In contrast to the VaR diversification hypothesis, we find that US banks show no sign of systematic underestimation of the diversification effect. In particular, the diversification effects used by banks are very close to (and quite often larger than) our empirical diversification estimates. A direct implication of this finding is that individual VaRs for each broad risk category, just like aggregate VaRs, are biased risk assessments.
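The diversification effect the paper refers to can be summarised in a few lines. Below is a minimal sketch of how the effect is typically quantified as the gap between the sum of stand-alone category VaRs and the reported aggregate VaR; all category names and numbers are illustrative assumptions, not figures from the paper:

```python
# Minimal sketch: quantifying the diversification effect across broad
# risk categories from hypothetical per-category and aggregate VaR figures.
# All names and numbers are illustrative, not taken from the paper.

category_var = {
    "equity": 25.0,
    "interest_rate": 30.0,
    "commodity": 10.0,
    "credit_spread": 15.0,
    "foreign_exchange": 12.0,
}

aggregate_var = 60.0  # bank-reported firm-wide VaR (hypothetical)

sum_of_vars = sum(category_var.values())

# Diversification effect: how much the aggregate VaR falls short of the
# simple sum of per-category VaRs (0 = no diversification benefit).
diversification_effect = 1.0 - aggregate_var / sum_of_vars

print(f"Sum of category VaRs: {sum_of_vars:.1f}")
print(f"Diversification effect: {diversification_effect:.1%}")
```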

Relevance:

20.00%

Publisher:

Abstract:

In this paper we study both the level of Value-at-Risk (VaR) disclosure and the accuracy of the disclosed VaR figures for a sample of US and international commercial banks. To measure the level of VaR disclosures, we develop a VaR Disclosure Index that captures many different facets of market risk disclosure. Using panel data over the period 1996–2005, we find an overall upward trend in the quantity of information released to the public. We also find that Historical Simulation is by far the most popular VaR method. We assess the accuracy of VaR figures by studying the number of VaR exceedances and whether actual daily VaRs contain information about the volatility of subsequent trading revenues. Unlike the level of VaR disclosure, the quality of VaR disclosure shows no sign of improvement over time. We find that VaR computed using Historical Simulation contains very little information about future volatility.
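For readers unfamiliar with the method, the following is a minimal sketch of Historical Simulation VaR and the exceedance count used as an accuracy check. The simulated P&L series, window lengths, and the 99% confidence level are illustrative assumptions, not the paper's data or exact backtesting procedure:

```python
import numpy as np

# Minimal sketch of Historical Simulation VaR -- the method the paper finds
# most popular -- and of the exceedance count used to assess accuracy.
# The simulated P&L data and the 99% level are illustrative assumptions.

rng = np.random.default_rng(0)
pnl = rng.normal(0.0, 1.0, size=750)   # stand-in for daily trading revenues

confidence = 0.99
# HS VaR: the loss quantile of the historical P&L distribution.
var_99 = -np.quantile(pnl[:500], 1.0 - confidence)

# Exceedances: days in the evaluation window where the realised loss
# exceeded the reported VaR (expected ~1% of days at the 99% level).
evaluation = pnl[500:]
exceedances = int(np.sum(-evaluation > var_99))
print(f"99% HS VaR: {var_99:.2f}, exceedances: {exceedances}/{len(evaluation)}")
```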

Relevance:

20.00%

Publisher:

Abstract:

Lignocellulosic waste materials are the most promising feedstock for the generation of a renewable, carbon-neutral substitute for existing liquid fuels. The development of value-added products from lignin will greatly improve the economics of producing liquid fuels from biomass. This review gives an outline of lignin chemistry, describes the current processes of lignocellulosic biomass fractionation and the lignin products obtained through these processes, then outlines current and potential value-added applications of these products, in particular as components of polymer composites.

Research highlights: The use of lignocellulosic biomass to produce platform chemicals and industrial products enhances the sustainability of natural resources and improves environmental quality by reducing greenhouse and toxic emissions. In addition, the development of lignin-based products improves the economics of producing liquid transportation fuel from lignocellulosic feedstock. Value adding can be achieved by converting lignin to functionally equivalent products that rely on its intrinsic properties. This review outlines lignin chemistry and some potential high-value products that can be made from lignin.

Keywords: Lignocellulose materials; Lignin chemistry; Application

Relevance:

20.00%

Publisher:

Abstract:

As part of a larger literature focused on identifying and relating the antecedents and consequences of diffusing organizational practices/ideas, recent research has debated the international adoption of a shareholder-value orientation (SVO). In this debate, financial economists characterize the adoption of an SVO as performance-enhancing and thus inevitable, while behavioral scientists dispute both claims by invoking institutional differences. This study seeks to provide some resolution to the debate (and to advance current understanding of the diffusion of practices/ideas) by developing a socio-political perspective that links the antecedents and consequences of an SVO. In particular, we introduce the notions of misaligned elites and misfitted practices in our analysis of how and why differences in the technical and cultural preferences of major owners influence a firm’s adoption and (un)successful implementation of an SVO among the largest 100 corporations in the Netherlands from 1992 to 2006. We conclude with a discussion of the implications of our perspective and our findings for future research on corporate governance and the diffusion of organizational practices/ideas.

Relevance:

20.00%

Publisher:

Abstract:

The literature abounds with descriptions of failures in high-profile projects, and a range of initiatives has been generated to enhance project management practice (e.g., Morris, 2006). Estimating from our own research, there are scores of other project failures that go unrecorded. Many of these failures can be explained using existing project management theory: poor risk management, inaccurate estimating, cultures of optimism dominating decision making, stakeholder mismanagement, inadequate timeframes, and so on. Nevertheless, in spite of extensive discussion and analysis of failures and attention to the presumed causes of failure, projects continue to fail in unexpected ways. In the 1990s, three U.S. state departments of motor vehicles (DMVs) cancelled major projects due to time and cost overruns and an inability to meet project goals (IT-Cortex, 2010). The California DMV failed to revitalize its drivers’ license and registration application process after spending $45 million. The Oregon DMV cancelled its five-year, $50 million project to automate its manual, paper-based operation after three years, when the estimate grew to $123 million, its duration stretched to eight years or more, and the prototype was a complete failure. In 1997, the Washington state DMV cancelled its license application mitigation project because it would have been too big and obsolete by the time it was estimated to be finished. There are countless similar examples of projects that have been abandoned or that have not delivered the requirements.

Relevance:

20.00%

Publisher:

Abstract:

Numerous tools and techniques have been developed to eliminate or reduce waste and carry out lean concepts in the manufacturing environment. However, appropriate lean tools need to be selected and implemented in order to fulfil manufacturers’ needs within their budgetary constraints. As a result, it is important to identify manufacturer needs and implement only those tools that contribute the maximum benefit to those needs. In this research, a mathematical model is proposed for maximising the perceived value of manufacturer needs, and a step-by-step methodology is developed to select the best performance metrics along with appropriate lean strategies within budgetary constraints. The proposed model and method are demonstrated with the help of a case study.
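The selection problem the abstract describes can be read as a 0/1 knapsack: pick the subset of lean tools that maximises perceived value subject to the budget. The sketch below illustrates this framing with hypothetical tool names, value scores, and costs; the paper's actual model and metrics may differ:

```python
from itertools import combinations

# Minimal sketch of lean tool selection as a 0/1 knapsack: maximise total
# perceived value subject to a budget constraint. Tool names, value scores,
# and costs are illustrative assumptions, not taken from the case study.

tools = {  # tool: (perceived value score, implementation cost)
    "5S": (30, 20),
    "kanban": (45, 35),
    "value_stream_mapping": (50, 40),
    "SMED": (25, 30),
    "TPM": (40, 50),
}
budget = 100

# Exhaustive search is fine at this scale; a larger tool set would call
# for dynamic programming or an integer programming solver.
best_value, best_subset = 0, ()
for r in range(len(tools) + 1):
    for subset in combinations(tools, r):
        cost = sum(tools[t][1] for t in subset)
        value = sum(tools[t][0] for t in subset)
        if cost <= budget and value > best_value:
            best_value, best_subset = value, subset

print(f"Selected tools: {best_subset}, total value: {best_value}")
```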

Relevance:

20.00%

Publisher:

Abstract:

Ethernet is a key component of the standards used for digital process buses in transmission substations, namely IEC 61850 and IEEE Std 1588-2008 (PTPv2). These standards use multicast Ethernet frames that can be processed by more than one device. This presents some significant engineering challenges when implementing a sampled value process bus due to the large amount of network traffic. A system of network traffic segregation using a combination of Virtual LAN (VLAN) and multicast address filtering with managed Ethernet switches is presented. This includes VLAN prioritisation of traffic classes such as the IEC 61850 protocols GOOSE, MMS and sampled values (SV), and other protocols like PTPv2. Multicast address filtering is used to limit SV/GOOSE traffic to defined subsets of subscribers. A method to map substation plant reference designations to multicast address ranges is proposed that enables engineers to determine the type of traffic and the location of its source by inspecting the destination address. This method and the proposed filtering strategy simplify future changes to the prioritisation of network traffic, and are applicable to both process bus and station bus applications.
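The proposed address mapping can be illustrated with a short sketch. The GOOSE and SV multicast base ranges below follow the commonly used IEC 61850 address blocks, while the encoding of bay and device numbers into the last two octets is a hypothetical example of the idea, not the paper's exact scheme:

```python
# Minimal sketch of mapping a substation plant reference designation to a
# multicast destination address, so the traffic type and source location
# can be read from the address. The bay/device encoding is a hypothetical
# illustration of the paper's proposal, not its exact mapping.

GOOSE_BASE = "01-0C-CD-01"   # IEC 61850 GOOSE multicast range
SV_BASE = "01-0C-CD-04"      # IEC 61850 sampled value multicast range

def multicast_address(base: str, bay: int, device: int) -> str:
    """Encode bay and device numbers into the last two address octets."""
    return f"{base}-{bay:02X}-{device:02X}"

# e.g. merging unit 3 in bay 5 publishing sampled values:
print(multicast_address(SV_BASE, bay=5, device=3))   # 01-0C-CD-04-05-03
```

With such a scheme, a managed switch can be configured to forward only the SV/GOOSE address ranges that a given subscriber actually needs.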

Relevance:

20.00%

Publisher:

Abstract:

Gradient-based approaches to direct policy search in reinforcement learning have received much recent attention as a means to solve problems of partial observability and to avoid some of the problems associated with policy degradation in value-function methods. In this paper we introduce GPOMDP, a simulation-based algorithm for generating a biased estimate of the gradient of the average reward in Partially Observable Markov Decision Processes (POMDPs) controlled by parameterized stochastic policies. A similar algorithm was proposed by Kimura, Yamamura, and Kobayashi (1995). The algorithm's chief advantages are that it requires storage of only twice the number of policy parameters, uses one free parameter β ∈ [0,1) (which has a natural interpretation in terms of bias-variance trade-off), and requires no knowledge of the underlying state. We prove convergence of GPOMDP, and show how the correct choice of the parameter β is related to the mixing time of the controlled POMDP. We briefly describe extensions of GPOMDP to controlled Markov chains, continuous state, observation and control spaces, multiple agents, higher-order derivatives, and a version for training stochastic policies with internal states. In a companion paper (Baxter, Bartlett, & Weaver, 2001) we show how the gradient estimates generated by GPOMDP can be used in both a traditional stochastic gradient algorithm and a conjugate-gradient procedure to find local optima of the average reward.
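A minimal sketch of the core GPOMDP updates is given below, applied to a toy two-action softmax policy (an illustrative assumption, not the paper's experimental setup). It shows the two properties highlighted in the abstract: storage of only two parameter-sized vectors and no use of the underlying state:

```python
import numpy as np

# Minimal sketch of the GPOMDP estimator: a beta-discounted eligibility
# trace of policy score functions, averaged against rewards, yields a
# biased estimate of the average-reward gradient. The two-action softmax
# bandit below is an illustrative assumption, not the paper's benchmark.

rng = np.random.default_rng(0)
theta = np.zeros(2)           # policy parameters (one logit per action)
beta = 0.9                    # free parameter in [0, 1): bias-variance trade-off
T = 10_000

z = np.zeros_like(theta)      # eligibility trace z_t
delta = np.zeros_like(theta)  # running gradient estimate Delta_t

for t in range(T):
    probs = np.exp(theta - theta.max())
    probs /= probs.sum()
    action = rng.choice(2, p=probs)

    # Score function: gradient of log mu(action | theta) for softmax.
    grad_log = -probs
    grad_log[action] += 1.0

    reward = 1.0 if action == 1 else 0.0   # toy reward favouring action 1

    # Core GPOMDP updates -- storage of only two parameter-sized vectors,
    # and no knowledge of the underlying state is required.
    z = beta * z + grad_log
    delta += (reward * z - delta) / (t + 1)

print("Gradient estimate:", delta)   # component 1 should dominate
```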

Relevance:

20.00%

Publisher:

Abstract:

The uniformization method (also known as randomization) is a numerically stable algorithm for computing transient distributions of a continuous time Markov chain. When the solution is needed after a long run or when the convergence is slow, the uniformization method involves a large number of matrix-vector products. Despite this, the method remains very popular due to its ease of implementation and its reliability in many practical circumstances. Because calculating the matrix-vector product is the most time-consuming part of the method, overall efficiency in solving large-scale problems can be significantly enhanced if the matrix-vector product is made more economical. In this paper, we incorporate a new relaxation strategy into the uniformization method to compute the matrix-vector products only approximately. We analyze the error introduced by these inexact matrix-vector products and discuss strategies for refining the accuracy of the relaxation while reducing the execution cost. Numerical experiments drawn from computer systems and biological systems are given to show that significant computational savings are achieved in practical applications.
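The method itself is compact. A minimal sketch follows, computing pi(t) = sum_k e^(-Lambda*t) (Lambda*t)^k / k! * v_k with v_{k+1} = v_k P and P = I + Q/Lambda; the 3-state generator and the truncation tolerance are illustrative assumptions, and the paper's relaxation strategy (which makes the matrix-vector products inexact) is not shown:

```python
import numpy as np

# Minimal sketch of the uniformization (randomization) method for the
# transient distribution of a CTMC. The generator Q and the truncation
# tolerance are illustrative assumptions.

Q = np.array([[-2.0, 1.0, 1.0],
              [1.0, -3.0, 2.0],
              [0.0, 1.0, -1.0]])   # infinitesimal generator (rows sum to 0)
pi0 = np.array([1.0, 0.0, 0.0])    # initial distribution
t = 0.5

Lam = max(-Q.diagonal())           # uniformization rate >= max exit rate
P = np.eye(Q.shape[0]) + Q / Lam   # uniformized (stochastic) transition matrix

pi_t = np.zeros_like(pi0)
v = pi0.copy()
poisson_weight = np.exp(-Lam * t)  # Poisson(k=0; Lambda*t)
k = 0
while poisson_weight > 1e-12 or k < Lam * t:
    pi_t += poisson_weight * v
    v = v @ P                      # the matrix-vector product dominating cost
    k += 1
    poisson_weight *= Lam * t / k

print("pi(t) =", pi_t, "sum =", pi_t.sum())
```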

Relevance:

20.00%

Publisher:

Abstract:

Understanding consumer value is imperative in health care, as the receipt of value drives the demand for health care services. While there is increasing research into health care that adopts an economic approach to value, this paper investigates a non-financial exchange context and uses an experiential approach to value, guided by a social marketing approach to behaviour change. An experiential approach is deemed more appropriate for government health care services that are free and intended for preventative rather than treatment purposes. Thus, instead of using an illness paradigm to view health services outcomes, we adopt a wellness paradigm. Using qualitative data gathered during 25 in-depth interviews, the authors demonstrate how social marketing thinking has guided the identification of six themes that represent four dimensions of value (functional, emotional, social and altruistic) evident during the health care consumption process of a free government service.

Relevance:

20.00%

Publisher:

Abstract:

For the 2005 season, Mackay Sugar and its growers agreed to implement a new cane payment system. The aim of the new system was to better align the business drivers between the mill and its growers and, as a result, improve business decision making. The technical basis of the new cane payment system included a fixed sharing of the revenue from sugar cane between the mill and growers. Further, the new system replaced the CCS formula with a new estimate of recoverable sugar (PRS) and introduced near-infrared (NIR) analysis for payment purposes. Significant mill and grower consultation processes led to the agreement to implement the new system in 2005, and this consultative approach has been reflected in two seasons of successful operation.
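As a rough illustration of what a fixed revenue-sharing payment looks like, the sketch below splits cane revenue between mill and growers at an agreed ratio and allocates the grower pool by PRS-weighted sugar content. Every number, including the grower share, is a hypothetical assumption rather than a figure from the Mackay Sugar system:

```python
# Minimal sketch of a fixed revenue-sharing cane payment, in the spirit of
# the system described: mill and growers split cane revenue at an agreed
# fixed ratio, with the grower pool allocated on relative recoverable
# sugar (PRS). All numbers here are illustrative assumptions.

sugar_price = 400.0       # $ per tonne of sugar (hypothetical)
grower_share = 2.0 / 3.0  # agreed fixed share of revenue to growers (hypothetical)

deliveries = [            # (tonnes of cane, PRS as % recoverable sugar)
    (1000.0, 13.5),
    (800.0, 12.8),
]

sugar_made = sum(tc * prs / 100.0 for tc, prs in deliveries)
revenue = sugar_made * sugar_price
grower_pool = grower_share * revenue

for tc, prs in deliveries:
    weight = (tc * prs / 100.0) / sugar_made
    print(f"{tc:.0f} t of cane at PRS {prs}%: ${grower_pool * weight:,.2f}")
```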

Relevance:

20.00%

Publisher:

Abstract:

Transmission smart grids will use a digital platform for the automation of high voltage substations. The IEC 61850 series of standards, released in parts over the last ten years, provides a specification for substation communications networks and systems. These standards, along with IEEE Std 1588-2008 Precision Time Protocol version 2 (PTPv2) for precision timing, are recommended by both the IEC Smart Grid Strategy Group and the NIST Framework and Roadmap for Smart Grid Interoperability Standards for substation automation. IEC 61850, PTPv2 and Ethernet are three complementary protocol families that together define the future of sampled value digital process connections for smart substation automation. A time synchronisation system is required for a sampled value process bus; however, the details are not defined in IEC 61850-9-2. PTPv2 provides the greatest accuracy of network-based time transfer systems, with timing errors of less than 100 ns achievable. The suitability of PTPv2 to synchronise sampling in a digital process bus is evaluated, with preliminary results indicating that the steady state performance of low cost clocks is an acceptable ±300 ns, but that corrections issued by grandmaster clocks can introduce significant transients. Extremely stable grandmaster oscillators are required to ensure any corrections are sufficiently small that time synchronising performance is not degraded.
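The offset and path-delay calculation at the heart of PTPv2's delay request-response mechanism can be sketched in a few lines. The timestamps below are illustrative, and the calculation assumes a symmetric network path:

```python
# Minimal sketch of the PTP delay request-response offset calculation that
# underlies the synchronisation accuracy being evaluated. Timestamps are in
# nanoseconds and are illustrative; t1/t4 are master times, t2/t3 slave times.

t1 = 1_000_000_000        # master sends Sync
t2 = 1_000_000_150 + 40   # slave receives Sync (150 ns path delay, +40 ns offset)
t3 = 1_000_001_000 + 40   # slave sends Delay_Req
t4 = 1_000_001_150        # master receives Delay_Req

# Assuming a symmetric path, mean path delay and slave clock offset are:
mean_path_delay = ((t2 - t1) + (t4 - t3)) / 2
offset_from_master = ((t2 - t1) - (t4 - t3)) / 2

print(f"path delay: {mean_path_delay:.0f} ns, offset: {offset_from_master:.0f} ns")
```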