871 results for value stream analysis
Abstract:
Much of the atmospheric variability in the North Atlantic sector is associated with variations in the eddy-driven component of the zonal flow. Here we present a simple method to specifically diagnose this component of the flow using the low-level wind field (925–700 hPa). We focus on the North Atlantic winter season in the ERA-40 reanalysis. Diagnostics of the latitude and speed of the eddy-driven jet stream are compared with conventional diagnostics of the North Atlantic Oscillation (NAO) and the East Atlantic (EA) pattern. This shows that the NAO and the EA both describe combined changes in the latitude and speed of the jet stream. It is therefore necessary, but not always sufficient, to consider both the NAO and the EA in identifying changes in the jet stream. The jet stream analysis suggests that there are three preferred latitudinal positions of the North Atlantic eddy-driven jet stream in winter. This result is in very good agreement with the application of a statistical mixture model to the two-dimensional state space defined by the NAO and the EA. These results are consistent with several other studies which identify four European/Atlantic regimes, comprising three jet stream patterns plus European blocking events.
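The core of the jet diagnostic described above can be sketched numerically: vertically average the low-level zonal wind and locate the westerly maximum. This is a minimal illustration on synthetic data, not the paper's exact procedure (which also involves zonal averaging and low-pass filtering); the function name and the Gaussian wind profile below are assumptions for demonstration only.

```python
import numpy as np

def jet_latitude_and_speed(u, lats):
    """Diagnose the eddy-driven jet from low-level zonal wind.

    u    : 2-D array (level, latitude) of zonal-mean zonal wind (m/s)
    lats : 1-D array of latitudes (degrees N)
    Returns the latitude and speed of the low-level westerly maximum.
    """
    # Vertically average over the low-level layer (e.g. 925-700 hPa)
    u_low = u.mean(axis=0)
    i = np.argmax(u_low)  # index of the strongest westerlies
    return lats[i], u_low[i]

# Hypothetical example: a Gaussian westerly jet centred near 45N,
# identical on three pressure levels
lats = np.arange(20.0, 71.0, 1.0)
profile = 15.0 * np.exp(-((lats - 45.0) / 8.0) ** 2)
u = profile[None, :].repeat(3, axis=0)
lat, spd = jet_latitude_and_speed(u, lats)
```

In the paper's setting the same maximum search would be applied day by day to filtered reanalysis winds, yielding time series of jet latitude and speed whose histogram reveals the three preferred positions.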
Abstract:
Classical risk assessment approaches for animal diseases are influenced by the probability of release, exposure and consequences of a hazard affecting a livestock population. Once a pathogen enters domestic livestock, potential risks of exposure and infection, both to animals and to people, extend through a chain of economic activities related to producing, buying and selling animals and products. Therefore, in order to understand the economic drivers of animal diseases in different ecosystems and to develop effective and efficient measures to manage disease risks for a country or region, the entire value chain and the related markets for animals and products need to be analysed to arrive at practical and cost-effective risk management options agreed by the actors on those value chains. Value chain analysis enriches disease risk assessment by providing a framework for interdisciplinary collaboration, which seems to be in increasing demand for problems concerning infectious livestock diseases. The best way to achieve this is to ensure that veterinary epidemiologists and social scientists work together throughout the process at all levels.
Abstract:
Value chain studies, including production system and market chain studies, are essential to value chain analysis, which, when coupled with disease risk analysis, is a powerful tool for identifying key constraints and opportunities for risk-based disease control in a livestock production and marketing system. Several production system and market chain studies have been conducted to support disease control interventions in South East Asia. This practical aid summarizes experiences and lessons learned from the implementation of such value chain studies in South East Asia. Based on these experiences, it prioritizes the data required for the respective purpose of a value chain study and recommends data collection and data analysis tools. This practical aid is intended as an adjunct to the FAO value chain approach and animal disease risk management guidelines document. Further practical advice is provided for more effective use of value chain studies in South and South East Asia as part of animal health decision support.
Abstract:
While large-scale transverse drainages (TDs) such as those of the Susquehanna River above Harrisburg, PA, have been recognized since the 19th century, there has been no systematic survey of TDs since Ver Steeg's in 1930. Here we present the results of a topographic and statistical analysis of TDs in the Susquehanna River basin using Google Earth and associated overlays. A total of 653 TDs were identified in the study area, 95% of which contain streams with discharges of less than 10 m3/s. TD depths ranged from a 23 m deep water gap near Blain, PA, to the 539 m deep gorge of the Juniata River through Jacks Mountain. Although TD depth tended to increase with stream size, many small streams were located in deep gaps, and eight streams with discharges of 10 m3/s or less were found in gorges whose depths matched or exceeded the deepest TD of the Susquehanna, the largest stream in the basin. Streams of less than 10 m3/s made up the majority of TDs regardless of the rock type capping the breached structure. Overall, TDs through sandstone-capped ridges were deeper than those topped by shales, and TDs in both sandstones and shales displayed a lognormal distribution of depths, which may be indicative of a preferred value. Stream flow direction was primarily perpendicular to local structural strike, with 47% of streams flowing NW and 53% flowing SE. 19% of the TDs were found to be in alignment with at least one other TD, with aligned segment lengths ranging from 0.5 to 14.8 km. The majority of TDs were in rocks of Paleozoic age. The techniques described here allow the frequency and distribution of TDs to be quantified so that they can be integrated into models of basin evolution.
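The lognormal depth distribution mentioned above can be checked with a simple moment fit on log-depths: if depth is lognormal, log(depth) is normal, and the mode of the fitted distribution gives the "preferred" depth. The depth values below are hypothetical, chosen only to span the reported 23 to 539 m range, and the function is an illustrative sketch, not the study's actual analysis code.

```python
import numpy as np

def lognormal_fit(depths):
    """Method-of-moments lognormal fit on log-depths.

    If depth ~ lognormal(mu, sigma), then log(depth) ~ normal(mu, sigma).
    Returns (mu, sigma, mode), where mode = exp(mu - sigma**2) is the
    most frequent ("preferred") depth implied by the fit.
    """
    logs = np.log(np.asarray(depths, dtype=float))
    mu = logs.mean()
    sigma = logs.std(ddof=1)
    mode = np.exp(mu - sigma ** 2)
    return mu, sigma, mode

# Hypothetical gap depths in metres, spanning 23-539 m
depths = [23, 60, 90, 120, 150, 200, 300, 539]
mu, sigma, mode = lognormal_fit(depths)
```

A goodness-of-fit test (e.g. Kolmogorov-Smirnov on the log-depths) would then decide whether the lognormal shape is actually supported by the 653 measured TDs.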
Abstract:
Choosing an appropriate accounting system for manufacturing has always been a challenge for managers. In this article we compare three accounting systems designed since 1980 to address the shortcomings of the traditional accounting system. First, we present a short overview of the background and definition of the three systems: Activity-Based Costing (ABC), Time-Driven Activity-Based Costing (TD-ABC) and Lean Accounting. Comparisons are made on the three basic roles of information generated by accounting systems: financial reporting; decision making; and operational control and improvement. The analysis in this paper reveals how decisions are made over the value stream in companies using Lean Accounting, whereas decisions under ABC are taken at the individual product level, and how TD-ABC covers both product and process levels for decision making. In addition, this paper shows the importance of nonfinancial measures for operational control and improvement under the Lean Accounting and TD-ABC methods, whereas ABC relies mostly on financial measures in this context.
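TD-ABC's defining mechanics can be shown in a few lines: costs are assigned as a capacity cost rate (cost of capacity supplied divided by practical capacity) times the time consumed, which also exposes the cost of unused capacity. The department figures below are hypothetical, used only to illustrate the calculation.

```python
def tdabc_cost(capacity_cost, practical_capacity_min, time_per_unit_min, units):
    """Time-Driven ABC: assigned cost = capacity cost rate x time consumed.

    capacity_cost          : total cost of resources supplied
    practical_capacity_min : practical capacity in minutes
    time_per_unit_min      : estimated minutes per transaction/unit
    units                  : number of transactions/units processed
    Returns (assigned cost, cost of unused capacity).
    """
    rate = capacity_cost / practical_capacity_min   # cost per minute
    minutes_used = time_per_unit_min * units
    assigned = rate * minutes_used
    unused_capacity_cost = capacity_cost - assigned
    return assigned, unused_capacity_cost

# Hypothetical department: 60,000 of capacity cost, 100,000 usable minutes,
# 8 minutes per order, 10,000 orders processed
assigned, unused = tdabc_cost(60000.0, 100000.0, 8.0, 10000)
```

The unused-capacity figure is what distinguishes TD-ABC's operational-control role from conventional ABC, which spreads the full cost over whatever volume occurred.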
Abstract:
Bibliography: p. 32.
Abstract:
The aim of this report is to describe the use of WinBUGS for two datasets that arise from typical population pharmacokinetic studies. The first dataset relates to gentamicin concentration-time data that arose as part of routine clinical care of 55 neonates. The second dataset incorporated data from 96 patients receiving enoxaparin. Both datasets were originally analyzed by using NONMEM. In the first instance, although NONMEM provided reasonable estimates of the fixed-effects parameters, it was unable to provide satisfactory estimates of the between-subject variance. In the second instance, the use of NONMEM resulted in the development of a successful model, albeit with limited available information on the between-subject variability of the pharmacokinetic parameters. WinBUGS was used to develop a model for both of these datasets. Model comparison for the enoxaparin dataset was performed by using the posterior distribution of the log-likelihood and a posterior predictive check. The use of WinBUGS supported the same structural models tried in NONMEM. For the gentamicin dataset a one-compartment model with intravenous infusion was developed, and the population parameters, including the full between-subject variance-covariance matrix, were available. Analysis of the enoxaparin dataset supported a two-compartment model as superior to the one-compartment model, based on the posterior predictive check. Again, the full between-subject variance-covariance matrix parameters were available. Fully Bayesian approaches using MCMC methods, via WinBUGS, can offer added value for the analysis of population pharmacokinetic data.
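The structural model fitted to the gentamicin data, one compartment with zero-order intravenous infusion, has a standard closed form for the concentration-time curve. The sketch below implements that textbook equation only (not the WinBUGS or NONMEM code, and without the between-subject random effects); the parameter values are hypothetical.

```python
import numpy as np

def conc_one_cpt_infusion(t, dose, duration, CL, V):
    """One-compartment model with zero-order IV infusion.

    During the infusion:  C(t) = (R0/CL) * (1 - exp(-k*t))
    After the infusion:   C(t) = C(T) * exp(-k*(t - T))
    where k = CL/V, R0 = dose/duration, T = infusion duration.

    t : times (h), dose : amount (mg), duration : infusion time (h),
    CL : clearance (L/h), V : volume of distribution (L).
    """
    t = np.asarray(t, dtype=float)
    k = CL / V              # elimination rate constant (1/h)
    R0 = dose / duration    # infusion rate (mg/h)
    rise = (R0 / CL) * (1.0 - np.exp(-k * np.minimum(t, duration)))
    decay = np.exp(-k * np.clip(t - duration, 0.0, None))
    return rise * decay

# Hypothetical parameters; concentrations at 0.5 h (end of infusion), 1 h, 6 h
c = conc_one_cpt_infusion([0.5, 1.0, 6.0], dose=100.0, duration=0.5,
                          CL=2.0, V=20.0)
```

In a population analysis this curve would be embedded in a hierarchical model, with CL and V varying across subjects according to the estimated between-subject variance-covariance matrix.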
Abstract:
We present the first innovation value chain analysis for a representative sample of new technology based firms (NTBFs) in the UK. This involves determining which factors lead to the usage of different knowledge sources and the relationships that exist between those sources of knowledge; the effect that each knowledge source has on innovative activity; and how innovation outputs affect the performance of NTBFs. We find that internal and external knowledge sources are complementary for NTBFs, and that supply chain linkages have both a direct and indirect effect on innovation. NTBFs’ skill resources matter throughout the innovation value chain, being positively associated with external knowledge linkages and innovation success, and also having a direct effect on growth independent of the effect on innovation. Exporting matters for performance, but not through any effect on innovation.
Abstract:
There has been a revival of interest in economic techniques to measure the value of a firm, notably the use of economic value added as a technique for measuring such value to shareholders. This technique, based upon the concept of economic value equating to total value, is founded upon the assumptions of classical liberal economic theory. Such techniques have been criticized both for the level of adjustment to published accounts needed to make the technique work and for their validity in actually measuring value in a meaningful context. This paper critiques economic value added techniques as a means of calculating changes in shareholder value, contrasting them with more traditional techniques of measuring value added. It uses the company Severn Trent plc as an actual example in order to evaluate and contrast the techniques in action. The paper demonstrates discrepancies between the results calculated using economic value added analysis and those reported using conventional accounting measures. It considers the merits of the respective techniques in explaining shareholder and managerial behaviour and the problems with using such techniques in considering the wider stakeholder concept of value. It concludes that the economic value added technique has merits when compared with traditional accounting measures of performance but that it does not provide the universal panacea claimed by its proponents.
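The basic economic value added calculation the paper critiques is short: EVA is operating profit after tax less a charge for the capital employed. The figures below are illustrative only, not Severn Trent plc's accounts, and real applications require the many adjustments to published accounts that the paper discusses.

```python
def economic_value_added(nopat, invested_capital, wacc):
    """EVA = NOPAT - (WACC x invested capital).

    nopat            : net operating profit after tax
    invested_capital : capital employed in the business
    wacc             : weighted average cost of capital (as a fraction)
    A positive EVA means the firm earned more than its cost of capital.
    """
    capital_charge = wacc * invested_capital
    return nopat - capital_charge

# Hypothetical: 120m NOPAT, 1,000m invested capital, 9% WACC
eva = economic_value_added(120.0, 1000.0, 0.09)  # roughly 30m residual profit
```

The contrast with conventional value added arises because accounting profit omits the 90m capital charge entirely, which is exactly where the reported discrepancies come from.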
Abstract:
This chapter reports on a framework that has been successfully used to analyze the e-business capabilities of an organization with a view to developing their e-capability maturity levels. This should be the first stage of any systems development project. The framework has been used widely within start-up companies and well-established companies both large and small; it has been deployed in the service and manufacturing sectors. It has been applied by practitioners and consultants to help improve e-business capability levels, and by academics for teaching and research purposes at graduate and undergraduate levels. This chapter will provide an account of the unique e-business planning and analysis framework (E-PAF) and demonstrate how it works via an abridged version of a case study (selected from hundreds that have been produced). This will include a brief account of the three techniques that are integrated to form the analysis framework: quality function deployment (QFD) (Akao, 1972), the balanced scorecard (BSC) (Kaplan & Norton, 1992), and value chain analysis (VCA) (Porter, 1985). The case study extract is based on an online community and dating agency service identified as VirtualCom which has been produced through a consulting assignment with the founding directors of that company and has not been published previously. It has been chosen because it gives a concise, comprehensive example from an industry that is relatively easy to relate to.
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-07
Abstract:
In recent years, the analysis of trade in value added has been explored by many researchers. Although they have made important contributions by developing GVC-related indices and proposing techniques for decomposing trade data, they have not yet explored the method of value chain mapping—a core element of conventional value chain analysis. This paper introduces a method of value chain mapping that uses international input-output data and reveals both upstream and downstream transactions of goods and services induced by production activities of a specific commodity or industry. This method is subsequently applied to the agricultural value chain of three Greater Mekong Sub-region countries (i.e., Thailand, Vietnam, and Cambodia). The results show that the agricultural value chain has been increasingly internationalized, although there is still room for obtaining benefits from GVC participation, especially in a country such as Cambodia.
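The input-output machinery behind such value chain mapping is the Leontief inverse: total output induced along the chain is x = (I - A)^-1 f, where A holds input coefficients and f is final demand. The three-sector coefficient matrix below is a hypothetical toy table, not actual Greater Mekong Sub-region data.

```python
import numpy as np

def induced_output(A, final_demand):
    """Total (direct + indirect) output induced by final demand.

    A            : n x n matrix of input coefficients
                   (inputs from row sector per unit output of column sector)
    final_demand : length-n vector f
    Returns x = (I - A)^-1 @ f, the Leontief solution.
    """
    n = A.shape[0]
    leontief_inverse = np.linalg.inv(np.eye(n) - A)
    return leontief_inverse @ final_demand

# Hypothetical 3-sector table (agriculture, processing, services)
A = np.array([[0.1, 0.3, 0.0],
              [0.2, 0.1, 0.1],
              [0.1, 0.2, 0.2]])
f = np.array([100.0, 50.0, 30.0])
x = induced_output(A, f)
```

With an international input-output table, the off-diagonal country blocks of A are what reveal the upstream and downstream cross-border transactions that the mapping method traces.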
Abstract:
Twenty-two Triceps brachii muscles obtained from 11 cows aged 3 and 4 years, slaughtered in an experimental slaughter plant, were submitted to mechanical tenderization, injection with 0.1 M acetic acid or 0.2 M lactic acid, ageing for 9 or 14 days, and electrical stimulation (250 V, 60 Hz, 90 s); some were reserved as an untreated control group. Ageing for 14 days produced a 21% increase in subjective tenderness and a 12% reduction in shear force, values similar to those of the electrically stimulated meat. However, the acid injections and the 9-day ageing time had no significant effect on texture. Although the shear force values of the mechanically tenderized meat were the lowest among all treatments, overestimation of tenderness is suspected because of the fracture planes created by this process. Other analyses were also carried out: pH decline curve, R value, colour analysis, weight losses by cooking and by treatment, and microbiological analysis.
Abstract:
Master's degree in Accounting and Financial Analysis
Abstract:
A large part of an organization's time is spent on activities that do not create any kind of value. Such activities are considered waste, since they consume resources and time, as in the case of transport, inspections, adjustments, storage of materials and problem solving, among many others, leading to a high cost for the products supplied. In 1996 the term Lean Thinking was used for the first time by Womack and Jones, who described it as a management philosophy whose main objective is to reduce waste in a production process. Reducing waste increases quality and decreases processing times and, consequently, production costs. It is on this basis that the present document rests, with the objective of creating and developing a simulation game in which various Lean tools can be applied. The simulation game is a continuation of the research and theoretical study of an Erasmus student and is part of an international project of the Lean Learning Academy (LLA). A pen-assembly production process was created to be as similar as possible to those found in companies, with all the accessories needed for the simulation to run fully, such as assembly instructions, control procedures and production orders, so that the data and the difficulties encountered could later be analysed and the Lean tools applied. Although several Lean tools are addressed in this work, the following were covered in more detail: Value Stream Mapping (VSM); Single Minute Exchange of Dies (SMED); line balancing. So that the content and advantages of the three Lean tools mentioned could be understood, they were applied and simulated, providing a practical component to their study for easier comprehension and faster learning.
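The line-balancing tool used in the simulation game rests on two standard Lean calculations: takt time (available time divided by customer demand) and the theoretical minimum number of workstations (total task time divided by takt time, rounded up). The shift length, demand and task times below are hypothetical values for a pen-assembly line, chosen only to illustrate the arithmetic.

```python
import math

def takt_time(available_time_s, demand_units):
    """Takt time: seconds of production time available per unit demanded."""
    return available_time_s / demand_units

def min_stations(task_times_s, takt_s):
    """Theoretical minimum number of stations for a balanced line:
    ceil(total task time / takt time)."""
    return math.ceil(sum(task_times_s) / takt_s)

# Hypothetical pen-assembly line: 7.5 h shift, demand of 450 pens
takt = takt_time(7.5 * 3600, 450)                    # 60 s per pen
stations = min_stations([25, 40, 35, 30, 20], takt)  # 150 s of work
```

In the game, comparing the actual number of stations with this theoretical minimum makes the balancing losses visible, the same waste that VSM and SMED target from the flow and changeover sides.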