739 results for common long-run components
Abstract:
The economic environment of today can be characterized as highly dynamic and competitive, if not in constant flux. Globalization and the Information Technology (IT) revolution are perhaps the main contributing factors to this observation. While companies have to some extent adapted to the current business environment, new pressures such as the recent increase in environmental awareness and its likely effects on regulations are underway. Hence, in light of market and competitive pressures, companies must constantly evaluate and, if necessary, update their strategies to sustain and increase the value they create for shareholders (Hunt and Morgan, 1995; Christopher and Towill, 2002). One way to create greater value is to become more efficient in producing and delivering goods and services to customers, which can lead to a strategy known as cost leadership (Porter, 1980). Even though Porter (1996) notes that in the long run cost leadership may not be a sufficient strategy for competitive advantage, operational efficiency is certainly necessary and should therefore be on the agenda of every company.

Better workflow management, technology, and resource utilization can lead to greater internal operational efficiency, which explains why, for example, many companies have recently adopted Enterprise Resource Planning (ERP) systems: integrated software packages that streamline business processes. However, as more and more companies approach internal operational excellence, the focus for finding inefficiencies and cost-saving opportunities is moving beyond the boundaries of the firm. Today many firms in the supply chain are engaging in collaborative relationships with customers, suppliers, and third parties (services) in an attempt to cut costs related to, for example, inventory and production, as well as to facilitate synergies. Thus, recent years have witnessed fluidity and blurring of organizational boundaries (Coad and Cullen, 2006).

The IT revolution of the late 1990s has played an important role in bringing organizations closer together. In their efforts to become more efficient, companies first integrated their information systems to speed up transactions such as ordering and billing. Later, collaboration on a multidimensional scale, including logistics, production, and Research & Development, became evident as companies expected substantial benefits from collaboration. However, one could also argue that the recent popularity of the concepts falling under Supply Chain Management (SCM), such as Vendor Managed Inventory and Collaborative Planning, Forecasting, and Replenishment, owes much to the marketing efforts of the software vendors and consultants who provide these solutions. Nevertheless, reports from professional organizations as well as academia indicate that the trend towards interorganizational collaboration is gaining wider ground. For example, the ARC Advisory Group, a research organization on supply chain solutions, estimated that the market for SCM, which includes various kinds of collaboration tools and related services, would grow at an annual rate of 7.4% during 2004-2008, reaching $7.4 billion in 2008 (Engineeringtalk 2004).
Abstract:
Due to the limitations of current condition-monitoring technologies, estimates of asset health states may contain uncertainty. A maintenance strategy that ignores this uncertainty can incur additional costs or downtime. The partially observable Markov decision process (POMDP) is a commonly used approach for deriving optimal maintenance strategies when asset health inspections are imperfect. However, existing applications of the POMDP to maintenance decision-making largely adopt discrete-time and discrete-state assumptions. The discrete-time assumption requires that health-state transitions and maintenance activities occur only at discrete epochs, which models failure times inaccurately and is not cost-effective. The discrete health-state assumption, on the other hand, may not be fine-grained enough to improve the effectiveness of maintenance. To address these limitations, this paper proposes a continuous-state partially observable semi-Markov decision process (POSMDP). An algorithm that combines a Monte Carlo-based density projection method with policy iteration is developed to solve the POSMDP. Different types of maintenance activities (i.e., inspections, replacement, and imperfect maintenance) are considered. The next maintenance action and the corresponding waiting duration are optimized jointly with respect to the long-run expected cost per unit time and availability. Simulation studies show that the proposed maintenance optimization approach is more cost-effective than strategies derived by two other approximate methods when regular inspection intervals are adopted. They also show that maintenance cost can be further reduced by developing strategies with state-dependent maintenance intervals using the POSMDP. In addition, the proposed POSMDP adopts a cost-effective strategy structure when multiple types of maintenance activities are involved.
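As a concrete illustration of the partial-observability mechanism this abstract builds on, the sketch below shows the standard discrete-state POMDP belief update for a maintenance setting. It is a minimal, hypothetical example (the three-state asset, action label, and matrices are invented for illustration) and not the paper's continuous-state POSMDP algorithm, which combines Monte Carlo density projection with policy iteration.

```python
import numpy as np

def belief_update(belief, T, O, action, obs):
    """One Bayes step of a discrete-state POMDP belief over hidden
    asset health states, after taking `action` and observing `obs`.

    belief : (n,) current probability distribution over health states
    T[a]   : (n, n) transition matrix under action a (degradation dynamics)
    O[a]   : (n, m) observation matrix, O[a][s, z] = P(signal z | new state s)
    """
    predicted = belief @ T[action]             # propagate degradation
    posterior = predicted * O[action][:, obs]  # weight by imperfect inspection
    return posterior / posterior.sum()         # renormalize

# Hypothetical 3-state asset: healthy, degraded, failed.
T = {"wait": np.array([[0.90, 0.08, 0.02],
                       [0.00, 0.85, 0.15],
                       [0.00, 0.00, 1.00]])}
O = {"wait": np.array([[0.80, 0.15, 0.05],    # noisy inspection signal
                       [0.20, 0.60, 0.20],
                       [0.05, 0.15, 0.80]])}
b = np.array([1.0, 0.0, 0.0])                 # start known-healthy
b = belief_update(b, T, O, "wait", 1)         # saw the "degraded-looking" signal
```

A maintenance policy then maps the belief `b` (rather than the unknown true state) to the next action and waiting duration, which is the decision problem the POSMDP generalizes to continuous states and continuous sojourn times.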
Abstract:
The uniformization method (also known as randomization) is a numerically stable algorithm for computing transient distributions of a continuous-time Markov chain. When the solution is needed over a long time horizon or when convergence is slow, the uniformization method involves a large number of matrix-vector products. Despite this, the method remains very popular due to its ease of implementation and its reliability in many practical circumstances. Because the matrix-vector product is the most time-consuming part of the method, overall efficiency in solving large-scale problems can be significantly enhanced if the matrix-vector product is made more economical. In this paper, we incorporate a new relaxation strategy into the uniformization method to compute the matrix-vector products only approximately. We analyze the error introduced by these inexact matrix-vector products and discuss strategies for refining the accuracy of the relaxation while reducing the execution cost. Numerical experiments drawn from computer systems and biological systems show that significant computational savings are achieved in practical applications.
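To make the mechanics concrete, here is a minimal, unoptimized sketch of the basic uniformization method described above (an illustrative implementation, not the paper's relaxed variant): the transient distribution is a Poisson-weighted sum of powers of the uniformized transition matrix, and each term costs one matrix-vector product, which is exactly the cost the paper's inexact products attack.

```python
import numpy as np

def uniformization(Q, p0, t, tol=1e-10, kmax=100_000):
    """Transient distribution p(t) = p0 @ expm(Q*t) of a CTMC with
    generator Q, via uniformization:
        p(t) = sum_k e^{-L t} (L t)^k / k! * p0 @ P^k,  P = I + Q / L.

    Note: this naive weight recursion underflows for very large L*t;
    production codes add scaling and steady-state detection.
    """
    L = float(np.max(-np.diag(Q)))      # uniformization rate >= max exit rate
    P = np.eye(Q.shape[0]) + Q / L      # uniformized DTMC transition matrix
    w = np.exp(-L * t)                  # Poisson weight for k = 0
    v = p0.astype(float).copy()         # v holds p0 @ P^k
    sol, acc = w * v, w
    for k in range(1, kmax):
        if 1.0 - acc <= tol:            # truncate once weights ~ sum to 1
            break
        v = v @ P                       # the dominating matrix-vector product
        w *= L * t / k                  # next Poisson weight
        sol += w * v
        acc += w
    return sol
```

A relaxation strategy in the spirit of the paper would replace the exact `v @ P` with a cheaper approximate product whose accuracy is tightened or loosened across iterations so that the accumulated error stays within the overall tolerance.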
Abstract:
According to Tan et al. (2011), the establishment of a clear sustainability policy in the construction industry is paramount, if only as a statement of top management's commitment to protecting the environment and enhancing social responsibility. The resulting policies should then translate into proactive strategies and action plans that improve the sustainability performance of contractors and provide a competitive advantage by integrating "long-run profitability" with sustainable development efforts. The strategies should also take into account climate protection issues through greenhouse gas emissions (GHGe) monitoring and reduction initiatives (Stocker & Luptacik, 2009)...
Abstract:
This conceptual paper explores the extent to which reported accounting information captures the unique family firm decision-making and intangible asset factors that impact financial value. We review the family firm valuation-relevant literature and find that this body of research is predicated on the assumption that accounting information reflects the underlying reality of family firms. This research, however, fails to acknowledge that current accounting technology does not fully capture family firm factors in the book value of the firm or their implications for the long-run persistence of earnings. Thus, the valuation models underpinning the extant empirical research, which are predicated on reported accounting information, may not fully reflect the intrinsic value of family firms. We present propositions on the interaction between accounting information, family factors and valuation as a road map for future empirical research, with a discussion of appropriate methodologies.
Abstract:
This paper reports the outcomes of a pilot study to develop a conceptual framework that allows people to retrofit a building-layer to gain better control of their own built environments. The study was initiated by the realisation that discussions surrounding the improvement of building performance tend to be about top-down technological solutions rather than about helping and encouraging bottom-up involvement of building-users. While users are the ultimate beneficiaries and their feedback is always appreciated, their direct involvement in managing buildings is often regarded as an obstruction or distraction, largely because casual interventions by uninformed building-users tend to disrupt the system. However, earlier research showed that direct and active participation of users could improve building performance if appropriate training and/or systems were introduced. We also speculate that, in the long run, this would make the built environment more sustainable. With this in mind, we looked for opportunities to retrofit our own office with an interactive layer to study how we could introduce ad hoc systems for building-users. The aim of this paper is to describe our vision and initial attempts, followed by a discussion.
Abstract:
Purpose: Tacit knowledge is perceived as the most strategically important resource of the construction organisation, and the only renewable and sustainable base for its activities and competitiveness. Knowledge management (KM) activities that deal with tacit knowledge are essential in helping an organisation to achieve its long-term objectives. The purpose of this paper is to provide empirical evidence for the stronger strategic role of tacit KM in comparison to explicit KM.

Design/methodology/approach: A questionnaire survey was administered in 2005 to a sample of construction contractors operating in Hong Kong to elicit opinions on the internal business environment, the intensity of KM activities as executed by the targeted organisations, and the contribution of these activities to business performance (BP). A total of 149 usable responses were received from 99 organisations, representing about 38 per cent of the sampling frame. The statistical analyses helped to map the reported KM activities into two groups that deal, respectively, with tacit and explicit knowledge. The sensitivity to variations in organisational policies and the strength of association with BP of the two groups of KM activities were also compared empirically. A total of 15 interviews with managerial and professional staff of leading contractors were undertaken to provide insightful narratives of KM implementations.

Findings: The effective implementation of organisational policies, such as encouraging innovation and strengthening strategic guidance for KM, would facilitate the human interactions of tacit KM. A higher intensity of activities in managing tacit knowledge would ultimately help organisations to achieve economic gains in the long run.

Originality/value: The stronger strategic role of tacit KM is empirically investigated and established within the context of construction organisations.
Abstract:
While scientists continue to explore the extent of climate change's impact on weather patterns and our environment in general, there have been some devastating natural disasters worldwide in the last two decades. Indeed, natural disasters are becoming a major concern in our society. Yet in many past cases, reconstruction efforts focused only on providing short-term necessities. How to develop resilience in the long run is now a focus for research and industry practice. This paper introduces a research project aimed at exploring the relationship between resilience building and sustainability in order to identify key factors in reconstruction efforts. From an extensive literature study, the authors identified an inherent linkage between the two issues, as evidenced by past research. They found that sustainability considerations can improve the level of resilience but are not currently given due attention. Reconstruction efforts need to focus on resilience factors, but as part of urban development they must also respond to the sustainability challenge. Sustainability issues in reconstruction projects need to be amplified, identified, processed, and managed properly. Ongoing research through empirical study aims to establish critical factors (CFs) that help stakeholders in disaster-prone areas to plan for and develop new building infrastructure through holistic considerations and balanced approaches to sustainability. A questionnaire survey examined a range of potential factors, and the subsequent data analysis revealed six critical factors for sustainable Post Natural Disaster Reconstruction: suitable building materials and construction methods, good governance, multilateral coordination, appropriate land-use planning and policies, consideration of different social needs, and a balanced combination of long-term and short-term needs. Findings from this study should influence policy development towards Post Natural Disaster Reconstruction and help with the achievement of sustainability objectives.
Abstract:
The ability to innovate has become a critical capability for many contemporary organizations seeking to sustain their operations in the long run. However, existing innovation models that attempt to guide organizations emphasize different aspects of innovation (e.g., products, services or business models), different stages of innovation (e.g., ideation, implementation or operation) or different skills (e.g., development or crowdsourcing) that are necessary to innovate, in turn creating isolated pockets of understanding about different aspects of innovation. In order to yield more predictable innovation outcomes, organizations need to understand what exactly they need to focus on, what capabilities they need to have, and what is necessary to take an idea to market. This paper aims to construct a framework for innovation that contributes to this understanding. We focus on a number of different stages in the innovation process and highlight the different types and levels of organizational, technological, individual and process capabilities required to manage the organizational innovation process. Our work offers a comprehensive conceptualization of innovation as a multi-level process model, and provides a range of implications for further empirical and theoretical examination.
Abstract:
This thesis examines how the initial institutional and technological aspects of the economy, and the reforms that alter these aspects, influence long-run growth and development. These issues are addressed within the framework of stochastic endogenous growth models and an empirical framework. The thesis is able to explain why developing nations exhibit diverse growth and inequality patterns. Consequently, it raises a number of policy implications regarding how these nations can improve their economic outcomes.
Abstract:
This study explores the accuracy and valuation implications of applying a comprehensive list of equity multiples in the takeover context. Motivating the study are the prevalent use of equity multiples in practice, the observed long-run underperformance of acquirers following takeovers, and the scarcity of multiples-based research in the merger and acquisition setting. In exploring the application of equity multiples in this context, three research questions are addressed: (RQ1) how accurate are equity multiples; (RQ2) which equity multiples are more accurate in valuing the firm; and (RQ3) which equity multiples are associated with greater misvaluation of the firm. Following a comprehensive review of the extant multiples-based literature, it is hypothesised that the accuracy of multiples in estimating stock market prices in the takeover context will rank as follows (from best to worst): (1) forecasted earnings multiples, (2) multiples closer to bottom-line earnings, and (3) multiples based on Net Cash Flow from Operations (NCFO) and trading revenue. The relative inaccuracies in multiples are expected to flow through to equity misvaluation (as measured by the ratio of estimated market capitalisation to residual income value, or P/V). Accordingly, it is hypothesised that greater overvaluation will be exhibited for multiples based on Trading Revenue, NCFO, Book Value (BV) and earnings before interest, tax, depreciation and amortisation (EBITDA) than for multiples based on bottom-line earnings, and that multiples based on Intrinsic Value will display the least overvaluation. The hypotheses are tested using a sample of 147 acquirers and 129 targets involved in Australian takeover transactions announced between 1990 and 2005. The results show, first, that the majority of the computed multiples examined exhibit valuation errors within 30 percent of stock market values. Second, and consistent with expectations, the results support the superiority of multiples based on forecasted earnings in valuing targets and acquirers engaged in takeover transactions. Although a gradual improvement in estimating stock market values is not entirely evident when moving down the Income Statement, historical earnings multiples perform better than multiples based on Trading Revenue or NCFO. Third, while multiples based on forecasted earnings have the highest valuation accuracy, they, along with Trading Revenue multiples for targets, produce the most overvalued estimates for acquirers and targets. Consistent with predictions, greater overvaluation is exhibited for multiples based on Trading Revenue for targets, and on NCFO and EBITDA for both acquirers and targets. Finally, as expected, multiples based on Intrinsic Value (along with BV) are associated with the least overvaluation. Given the widespread use of valuation multiples in takeover contexts, these findings offer a unique insight into their relative effectiveness. Importantly, the findings add to the growing body of valuation accuracy literature, especially within Australia, and should help market participants to better understand the relative accuracy and misvaluation consequences of the various equity multiples used in takeover documentation, assisting them in subsequent investment decision-making.
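For readers unfamiliar with the mechanics, the sketch below shows how a multiples-based value estimate and its valuation error (the accuracy measure behind the "within 30 percent" result above) are typically computed. The peer multiples and firm figures are hypothetical, and this is the generic method rather than the study's exact research design.

```python
import numpy as np

def multiples_valuation(peer_multiples, value_driver, market_cap):
    """Estimate equity value as the peer-median multiple times the firm's
    value driver, and report the absolute error against market value."""
    multiple = np.median(peer_multiples)           # e.g. peers' forward P/E
    estimate = multiple * value_driver             # e.g. firm's forecast earnings
    error = abs(estimate - market_cap) / market_cap
    return estimate, error

# Hypothetical: peer forward P/E ratios, forecast earnings $50m, market cap $800m
estimate, error = multiples_valuation([14.2, 15.8, 16.5, 13.9], 50e6, 800e6)
accurate = error <= 0.30                           # the 30-percent accuracy band
```

Different multiples plug in different value drivers (forecast earnings, historical earnings, EBITDA, NCFO, trading revenue, book value), which is what makes their accuracy and misvaluation rankings comparable in the study.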
Abstract:
Diabetic peripheral neuropathy (DPN) is one of the most common long-term complications of diabetes. The accurate detection and quantification of DPN are important for defining at-risk patients, anticipating deterioration, and assessing new therapies. Current methods of detecting and quantifying DPN, such as neurophysiology, lack sensitivity, require expert assessment, and focus primarily on large nerve fibers. However, the earliest damage to nerve fibers in diabetic neuropathy is to the small nerve fibers. At present, small nerve fiber damage is assessed using skin/nerve biopsy; both are invasive techniques and are unsuitable for repeated investigations.
Abstract:
This study tests whether an international market exists in the platinum-group metal (PGM) futures markets. For this purpose, we tested the law of one price (LOP) and the causality between the U.S. and Japanese platinum and palladium futures markets, and repeated the tests allowing for structural breaks. Long-run price relationships were found in both the platinum and palladium markets, but the LOP held only in the palladium market. The causality test revealed that it is the U.S. market that leads in transmitting price information between the U.S. and Japanese markets. Structural breaks had large impacts on the test results, suggesting that incorporating breaks is important when investigating international price linkages in the PGM futures markets.
Abstract:
This study investigates how markets for different levels of copper purity are interrelated by testing the long-run price linkages and causalities among the copper futures, primary, copper scrap, and brass scrap markets. It is expected that copper markets dealing with high purity levels, such as the futures, primary, and copper scrap markets, have a long-run relationship, whereas brass scrap markets, where copper of lower purity is traded, may not have a price linkage with the other copper markets. The results reveal that a long-run relationship holds between the futures, primary, and copper scrap markets, but the brass scrap market does not have a long-run relationship with the other markets. From the short-run and long-run causality tests, we determine that the futures market plays an important role in transmitting price information to the other copper markets, while no such information flow is found for the brass scrap market.
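The long-run linkage and causality tests used in this study (and in the PGM futures study above) are standard time-series procedures. Below is a minimal sketch using statsmodels, with simulated price series standing in for real market data; the actual studies may use different specifications (e.g., Johansen cointegration or structural-break-robust tests).

```python
import numpy as np
from statsmodels.tsa.stattools import coint, grangercausalitytests

rng = np.random.default_rng(0)
n = 500
futures = np.cumsum(rng.normal(size=n))          # random-walk log futures price
scrap = futures + rng.normal(scale=0.5, size=n)  # scrap price sharing the trend

# Engle-Granger test: null hypothesis = no cointegration (no long-run linkage)
t_stat, p_value, crit = coint(futures, scrap)
long_run_linkage = p_value < 0.05                # reject null -> linked markets

# Granger causality: does the futures price help predict the scrap price?
# (the second column is tested as the cause of the first)
results = grangercausalitytests(np.column_stack([scrap, futures]), maxlag=4)
```

A rejected cointegration null supports a long-run price relationship (as found between the futures, primary, and copper scrap markets), while the Granger tests address the short-run direction of information flow.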