531 results for Value Stream Mapping
Abstract:
In this paper we study both the level of Value-at-Risk (VaR) disclosure and the accuracy of the disclosed VaR figures for a sample of US and international commercial banks. To measure the level of VaR disclosures, we develop a VaR Disclosure Index that captures many different facets of market risk disclosure. Using panel data over the period 1996–2005, we find an overall upward trend in the quantity of information released to the public. We also find that Historical Simulation is by far the most popular VaR method. We assess the accuracy of VaR figures by studying the number of VaR exceedances and whether actual daily VaRs contain information about the volatility of subsequent trading revenues. Unlike the level of VaR disclosure, the quality of VaR disclosure shows no sign of improvement over time. We find that VaR computed using Historical Simulation contains very little information about future volatility.
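The two accuracy checks described in the abstract — counting VaR exceedances and checking daily losses against the previous day's Historical Simulation VaR — can be sketched as follows. This is an illustrative toy on simulated P&L, not the paper's implementation; the window length, confidence level, and revenue series are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical daily trading revenues (in $ millions); a bank would use actual P&L.
revenues = rng.normal(0.0, 1.5, size=750)

def historical_var(pnl, confidence=0.99, window=250):
    """1-day VaR by Historical Simulation: the empirical lower quantile of
    the last `window` observed P&L values, reported as a positive loss."""
    recent = pnl[-window:]
    return -np.quantile(recent, 1.0 - confidence)

def count_exceedances(pnl, confidence=0.99, window=250):
    """Backtest: count days whose loss exceeded the previous day's VaR."""
    exceedances = 0
    for t in range(window, len(pnl)):
        var_t = historical_var(pnl[:t], confidence, window)
        if pnl[t] < -var_t:
            exceedances += 1
    return exceedances
```

At the 99% level one would expect roughly 1% of backtested days to be exceedances; persistent departures from that rate are the kind of inaccuracy the paper measures.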
Abstract:
Lignocellulosic waste materials are the most promising feedstock for generation of a renewable, carbon-neutral substitute for existing liquid fuels. The development of value-added products from lignin will greatly improve the economics of producing liquid fuels from biomass. This review gives an outline of lignin chemistry, describes the current processes of lignocellulosic biomass fractionation and the lignin products obtained through these processes, then outlines current and potential value-added applications of these products, in particular as components of polymer composites. Research highlights: The use of lignocellulosic biomass to produce platform chemicals and industrial products enhances the sustainability of natural resources and improves environmental quality by reducing greenhouse gas and toxic emissions. In addition, the development of lignin-based products improves the economics of producing liquid transportation fuel from lignocellulosic feedstock. Value adding can be achieved by converting lignin to functionally equivalent products that rely on its intrinsic properties. This review outlines lignin chemistry and some potential high-value products that can be made from lignin. Keywords: Lignocellulose materials; Lignin chemistry; Application
Abstract:
Early detection surveillance programs aim to find invasions of exotic plant pests and diseases before they are too widespread to eradicate. However, the value of these programs can be difficult to justify when no positive detections are made. To demonstrate the value of pest absence information provided by these programs, we use a hierarchical Bayesian framework to model estimates of incursion extent with and without surveillance. A model for the latent invasion process provides the baseline against which surveillance data are assessed. Ecological knowledge and pest management criteria are introduced into the model using informative priors for invasion parameters. Observation models assimilate information from spatio-temporal presence/absence data to accommodate imperfect detection and generate posterior estimates of pest extent. When applied to an early detection program operating in Queensland, Australia, the framework demonstrates that this typical surveillance regime provides a modest reduction in the estimate that a surveyed district is infested. More importantly, the model suggests that early detection surveillance programs can provide a dramatic reduction in the putative area of incursion and therefore offer a substantial benefit to incursion management. By mapping spatial estimates of the point probability of infestation, the model identifies where future surveillance resources can be most effectively deployed.
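The core of such a framework — updating the probability that a district is infested as negative surveys accumulate under imperfect detection — can be reduced to a minimal Bayesian sketch. The prior and per-survey detection sensitivity below are illustrative assumptions, not values from the Queensland program:

```python
def posterior_infested(prior, sensitivity, n_negative):
    """Posterior probability that a district is infested after n_negative
    surveys found nothing, assuming each survey detects a present incursion
    with probability `sensitivity` and never raises a false alarm."""
    miss = (1.0 - sensitivity) ** n_negative   # P(all surveys miss | infested)
    return prior * miss / (prior * miss + (1.0 - prior))

# Illustrative numbers: modest prior, low per-survey sensitivity.
p = posterior_infested(prior=0.05, sensitivity=0.3, n_negative=5)
```

When sensitivity is low, each negative survey shrinks the posterior only slightly — consistent with the paper's finding that a typical regime provides a modest reduction in the estimate that a surveyed district is infested.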
Abstract:
An initialisation process is a key component in modern stream cipher design. A well-designed initialisation process should ensure that each key-IV pair generates a different key stream. In this paper, we analyse two ciphers, A5/1 and Mixer, for which this does not happen due to state convergence. We show how the state convergence problem occurs and estimate the effective key-space in each case.
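State convergence — distinct loaded states mapped to the same internal state by a non-injective update, shrinking the effective key-space — can be illustrated with a toy 3-bit register. The update function below is invented purely for illustration and is unrelated to the actual A5/1 or Mixer state-update functions:

```python
from itertools import product

def toy_update(state):
    """Non-injective toy state update: bit 0 is ORed into bit 1 and then
    cleared, so distinct states can converge to the same successor."""
    b0, b1, b2 = state
    return (0, b0 | b1, b2)

# Two distinct initial states converge after a single step,
# so their key-IV pairs would generate identical key streams from here on.
converged = toy_update((1, 0, 1)) == toy_update((0, 1, 1))

# Effective state space after one initialisation step: 8 states collapse to 4.
reachable = {toy_update(s) for s in product((0, 1), repeat=3)}
```

Counting the distinct reachable states after initialisation, as done here by brute force, is the small-scale analogue of estimating the effective key-space.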
Abstract:
As part of a larger literature focused on identifying and relating the antecedents and consequences of diffusing organizational practices/ideas, recent research has debated the international adoption of a shareholder-value-orientation (SVO). The debate has financial economists characterizing the adoption of an SVO as performance-enhancing and thus inevitable, with behavioral scientists disputing both claims, invoking institutional differences. This study seeks to provide some resolution to the debate (and advance current understanding on the diffusion of practices/ideas) by developing a socio-political perspective that links the antecedents and consequences of an SVO. In particular, we introduce the notion of misaligned elites and misfitted practices in our analysis of how and why differences in the technical and cultural preferences of major owners will influence a firm’s adoption and (un)successful implementation of an SVO among the largest 100 corporations in the Netherlands from 1992 to 2006. We conclude with a discussion of the implications of our perspective and our findings for future research on corporate governance and the diffusion of organizational practices/ideas.
Abstract:
The literature abounds with descriptions of failures in high-profile projects and a range of initiatives has been generated to enhance project management practice (e.g., Morris, 2006). Estimating from our own research, there are scores of other project failures that are unrecorded. Many of these failures can be explained using existing project management theory; poor risk management, inaccurate estimating, cultures of optimism dominating decision making, stakeholder mismanagement, inadequate timeframes, and so on. Nevertheless, in spite of extensive discussion and analysis of failures and attention to the presumed causes of failure, projects continue to fail in unexpected ways. In the 1990s, three U.S. state departments of motor vehicles (DMV) cancelled major projects due to time and cost overruns and inability to meet project goals (IT-Cortex, 2010). The California DMV failed to revitalize their drivers’ license and registration application process after spending $45 million. The Oregon DMV cancelled their five year, $50 million project to automate their manual, paper-based operation after three years when the estimates grew to $123 million; its duration stretched to eight years or more and the prototype was a complete failure. In 1997, the Washington state DMV cancelled their license application mitigation project because it would have been too big and obsolete by the time it was estimated to be finished. There are countless similar examples of projects that have been abandoned or that have not delivered the requirements.
Abstract:
Numerous tools and techniques have been developed to eliminate or reduce waste and carry out lean concepts in the manufacturing environment. However, appropriate lean tools need to be selected and implemented in order to fulfil manufacturer needs within their budgetary constraints. As a result, it is important to identify manufacturer needs and implement only those tools which contribute maximum benefit to those needs. In this research, a mathematical model is proposed for maximising the perceived value of manufacturer needs, and a step-by-step methodology is developed to select the best performance metrics along with appropriate lean strategies within the budgetary constraints. The proposed model and methodology are demonstrated with the help of a case study.
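Selecting lean tools to maximise perceived value under a budget is, in its simplest form, a 0/1 knapsack problem. The sketch below is a generic formulation under that assumption, not the paper's actual model; the tool names, costs, and values are invented:

```python
def select_tools(tools, budget):
    """0/1 knapsack over (name, cost, value) triples: choose the subset of
    lean tools with maximum total perceived value within the budget.
    Costs are assumed to be whole budget units."""
    best = {0: (0, frozenset())}          # spend -> (value, tools chosen)
    for name, cost, value in tools:
        for spent, (val, chosen) in list(best.items()):
            s, v = spent + cost, val + value
            if s <= budget and v > best.get(s, (-1, frozenset()))[0]:
                best[s] = (v, chosen | {name})
    return max(best.values())

# Hypothetical costs/perceived values on a budget of 5 units.
value, chosen = select_tools(
    [("5S", 2, 3), ("VSM", 3, 5), ("Kanban", 4, 6)], budget=5)
```

Here the cheaper pair beats the single highest-value tool, which is the kind of trade-off a budget-constrained selection model is meant to surface.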
Abstract:
In this paper, I would like to outline the approach we have taken to mapping and assessing integrity systems and how this has led us to see integrity systems in a new light. Indeed, it has led us to a new visual metaphor for integrity systems – a bird’s nest rather than a Greek temple. This was the result of a pair of major research projects completed in partnership with Transparency International (TI). One worked on refining and extending the measurement of corruption. This, the second, looked at what was then the emerging institutional means for reducing corruption – ‘national integrity systems’.
Abstract:
Ethernet is a key component of the standards used for digital process buses in transmission substations, namely IEC 61850 and IEEE Std 1588-2008 (PTPv2). These standards use multicast Ethernet frames that can be processed by more than one device. This presents some significant engineering challenges when implementing a sampled value process bus due to the large amount of network traffic. A system of network traffic segregation using a combination of Virtual LAN (VLAN) and multicast address filtering using managed Ethernet switches is presented. This includes VLAN prioritisation of traffic classes such as the IEC 61850 protocols GOOSE, MMS and sampled values (SV), and other protocols like PTPv2. Multicast address filtering is used to limit SV/GOOSE traffic to defined subsets of subscribers. A method to map substation plant reference designations to multicast address ranges is proposed that enables engineers to determine the type of traffic and location of the source by inspecting the destination address. This method and the proposed filtering strategy simplify future changes to the prioritisation of network traffic, and are applicable to both process bus and station bus applications.
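A mapping from numeric plant/bay identifiers into the low-order octets of the destination MAC address can be sketched as follows. The base addresses are the multicast ranges recommended in IEC 61850 (01-0C-CD-01-xx-xx for GOOSE, 01-0C-CD-04-xx-xx for sampled values); the identifier scheme itself is a hypothetical illustration, not the paper's proposed method:

```python
GOOSE_BASE = 0x010CCD010000  # IEC 61850 recommended GOOSE multicast range
SV_BASE    = 0x010CCD040000  # IEC 61850-9-2 recommended SV multicast range

def plant_multicast(base, bay_id):
    """Embed a numeric bay identifier in the two low-order octets so that
    the traffic type (base range) and source location (bay_id) can both be
    read off the destination MAC address."""
    if not 0 <= bay_id <= 0x01FF:    # recommended ranges span 512 addresses
        raise ValueError("bay_id outside recommended address range")
    addr = base + bay_id
    return ":".join(f"{(addr >> (8 * i)) & 0xFF:02X}" for i in reversed(range(6)))
```

A managed switch can then be configured with static multicast filters over these ranges to limit SV/GOOSE traffic to the intended subscribers.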
Abstract:
The heat transfer through the attics of buildings under realistic thermal forcing has been considered in this study. A periodic temperature boundary condition is applied on the sloping walls of the attic to show the basic flow features in the attic space over diurnal cycles. The numerical results reveal that, during the daytime heating stage, the flow in the attic space is stratified; whereas at the night-time cooling stage, the flow becomes unstable. A symmetrical solution is seen for relatively low Rayleigh numbers. However, as the Ra gradually increases, a transition occurs at a critical value of Ra. Above this critical value, an asymmetrical solution exhibiting a pitchfork bifurcation arises at the night-time. It is also found that the calculated heat transfer rate at the night-time cooling stage is much higher than that during the daytime heating stage.
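The Rayleigh number governing the transition described above is the standard ratio of thermal buoyancy forcing to viscous and thermal diffusion; a small helper makes the dependence explicit. The fluid properties below are illustrative values for air, not the study's parameters:

```python
def rayleigh_number(g, beta, delta_T, length, nu, kappa):
    """Ra = g * beta * dT * L^3 / (nu * kappa): thermal buoyancy forcing
    relative to viscous and thermal diffusion."""
    return g * beta * delta_T * length**3 / (nu * kappa)

# Illustrative values for air near 300 K across a 1 m attic cavity.
ra = rayleigh_number(g=9.81, beta=1 / 300, delta_T=10.0, length=1.0,
                     nu=1.5e-5, kappa=2.1e-5)
```

At values of this order the night-time cooling flow is well past the critical Ra, which is where the asymmetric pitchfork-bifurcation solutions appear.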
Abstract:
Objective: To compare the location and accessibility of current Australian chronic heart failure (CHF) management programs and general practice services with the probable distribution of the population with CHF. Design and setting: Data on the prevalence and distribution of the CHF population throughout Australia, and the locations of CHF management programs and general practice services from 1 January 2004 to 31 December 2005 were analysed using geographic information systems (GIS) technology. Outcome measures: Distance of populations with CHF to CHF management programs and general practice services. Results: The highest prevalence of CHF (20.3–79.8 per 1000 population) occurred in areas with high concentrations of people over 65 years of age and in areas with higher proportions of Indigenous people. Five thousand CHF patients (8%) discharged from hospital in 2004–2005 were managed in one of the 62 identified CHF management programs. There were no CHF management programs in the Northern Territory or Tasmania. Only four CHF management programs were located outside major cities, with a total case load of 80 patients (0.7%). The mean distance from any Australian population centre to the nearest CHF management program was 332 km (median, 163 km; range, 0.15–3246 km). In rural areas, where the burden of CHF management falls upon general practitioners, the mean distance to general practice services was 37 km (median, 20 km; range, 0–656 km). Conclusion: There is an inequity in the provision of CHF management programs to rural Australians.
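The distance-to-nearest-service measure underlying these results can be sketched with a great-circle (haversine) calculation. The coordinates below are illustrative; a real GIS analysis would typically use road-network distance rather than straight-line distance:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres (mean Earth radius ~6371 km)."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 \
        + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def nearest_service_km(town, services):
    """Distance from a (lat, lon) population centre to its closest service."""
    return min(haversine_km(*town, *s) for s in services)
```

Computing this minimum for every population centre against the list of CHF program locations yields the kind of mean/median/range summary reported in the results.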
Abstract:
This paper presents a method of spatial sampling based on stratification by Local Moran’s I_i calculated using auxiliary information. The sampling technique is compared to other design-based approaches including simple random sampling, systematic sampling on a regular grid, conditional Latin Hypercube sampling and stratified sampling based on auxiliary information, and is illustrated using two different spatial data sets. Each of the samples for the two data sets is interpolated using regression kriging to form a geostatistical map for their respective areas. The proposed technique is shown to be competitive in reproducing specific areas of interest with high accuracy.
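Local Moran's I_i for a set of values under a row-standardised spatial weights matrix can be sketched in its generic textbook form; the stratification step the paper builds on top of the statistic is not reproduced here, and the weights matrix below is an invented four-site example:

```python
import numpy as np

def local_morans_i(x, w):
    """Local Moran's I_i = (z_i / m2) * sum_j w_ij * z_j, where z are the
    mean-centred values, m2 is their second moment, and w is an n x n
    row-standardised spatial weights matrix."""
    x = np.asarray(x, dtype=float)
    z = x - x.mean()
    m2 = (z @ z) / len(x)
    return (z / m2) * (w @ z)

# Four sites along a line with rook-contiguity weights, row-standardised.
w = np.array([[0.0, 1.0, 0.0, 0.0],
              [0.5, 0.0, 0.5, 0.0],
              [0.0, 0.5, 0.0, 0.5],
              [0.0, 0.0, 1.0, 0.0]])
ii = local_morans_i([1.0, 2.0, 3.0, 4.0], w)
```

Positive I_i values flag sites whose value resembles that of their neighbours (high-high or low-low clusters); stratifying on these values concentrates sampling effort in locally homogeneous versus heterogeneous areas.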