855 results for Decision makers
Abstract:
Several parties (stakeholders) are involved in a construction project. The conventional Risk Management Process (RMP) manages risks from a single party's perspective, which does not give adequate consideration to the needs of the other parties. The objective of multi-party risk management is to assist decision-makers in managing risk systematically and efficiently in a multi-party environment. The Multi-party Risk Management Process (MRMP) consists of risk identification, structuring, analysis and response development from the perspectives of all parties. The MRMP has been applied to a cement plant construction project in Thailand to demonstrate its effectiveness.
Abstract:
A discrete event simulation model was developed and used to estimate the storage area required for a proposed overseas textile manufacturing facility. It was found that the simulation was able to achieve this because of its ability both to store attribute values and to show queuing levels at an individual product level. It was also found that the process of undertaking the simulation project initiated useful discussions regarding the operation of the facility. Discrete event simulation is shown to be much more than an exercise in quantitative analysis of results, and an important task of the simulation project manager is to initiate a debate among decision makers regarding the assumptions of how the system operates.
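The storage estimate described above can be illustrated with a minimal sketch. It assumes the SimPy library, and the product types, arrival rates and dwell times are invented for the example; it is not the model used in the study.

```python
# A minimal sketch (not the study's model) of estimating storage requirements
# with discrete event simulation, tracking peak occupancy per product type.
# Assumes SimPy; all parameters below are hypothetical.
import random
import simpy

DWELL = {"shirt": 4.0, "trouser": 6.0}      # hypothetical mean dwell times (hours)
ARRIVAL_MEAN = 0.5                          # hypothetical mean inter-arrival time (hours)
occupancy = {"shirt": 0, "trouser": 0}      # current pallet positions in storage
peak = {"shirt": 0, "trouser": 0}           # peak observed occupancy per product

def batch(env, product):
    """One batch: enter storage, dwell, then leave for the next process."""
    occupancy[product] += 1
    peak[product] = max(peak[product], occupancy[product])
    yield env.timeout(random.expovariate(1.0 / DWELL[product]))
    occupancy[product] -= 1

def arrivals(env):
    """Generate arriving batches, each carrying a product-type attribute."""
    while True:
        yield env.timeout(random.expovariate(1.0 / ARRIVAL_MEAN))
        env.process(batch(env, random.choice(list(DWELL))))

random.seed(1)
env = simpy.Environment()
env.process(arrivals(env))
env.run(until=1000)
print("Peak storage (pallet positions) by product:", peak)
```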
Abstract:
Purpose - To provide an example of the use of system dynamics within the context of a discrete-event simulation study. Design/methodology/approach - A discrete-event simulation study of a production-planning facility in a gas cylinder-manufacturing plant is presented. The case study evidence incorporates questionnaire responses from sales managers involved in the order-scheduling process. Findings - As the project progressed it became clear that, although the discrete-event simulation would meet the objectives of the study in a technical sense, the organizational problem of "delivery performance" would not be solved by the discrete-event simulation study alone. The case shows how the qualitative outcomes of the discrete-event simulation study led to an analysis using the system dynamics technique. The system dynamics technique was able to model the decision-makers in the sales and production process and provide a deeper understanding of the performance of the system. Research limitations/implications - The case study describes a traditional discrete-event simulation study which incorporated an unplanned investigation using system dynamics. Further case studies that take a planned approach to considering organizational issues in discrete-event simulation studies are required. Both the role of qualitative data in a discrete-event simulation study and the use of supplementary tools which incorporate organizational aspects may then help generate a methodology for discrete-event simulation that incorporates human aspects and so improves its relevance for decision making. Practical implications - It is argued that system dynamics can provide a useful addition to the toolkit of the discrete-event simulation practitioner, helping them incorporate a human aspect in their analysis. Originality/value - Helps decision makers gain a broader perspective on the tools available to them by showing the use of system dynamics to supplement the use of discrete-event simulation. © Emerald Group Publishing Limited.
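As a rough illustration of the stock-and-flow style of model that the system dynamics technique relies on, the sketch below integrates an order backlog and work in progress over time under a simple production-start decision rule. The structure, decision rule and parameters are invented and are not the case study's model.

```python
# A minimal, hypothetical stock-and-flow sketch in the system dynamics style:
# a backlog stock, a work-in-progress stock, and a production-start decision
# rule, integrated with a simple Euler scheme. All values are invented.
dt = 0.25                 # time step (weeks)
backlog = 40.0            # open orders (stock)
wip = 0.0                 # work in progress (stock)
order_rate = 20.0         # incoming orders per week (held constant here)
capacity = 25.0           # maximum production starts per week
lead_time = 2.0           # weeks for started work to complete

history = []
for step in range(int(52 / dt)):
    # Decision rule: start enough production to clear the backlog over the
    # lead time, limited by capacity (a stand-in for the sales/production
    # decision makers described in the abstract).
    start_rate = min(capacity, backlog / lead_time)
    completion_rate = wip / lead_time
    backlog += (order_rate - completion_rate) * dt
    wip += (start_rate - completion_rate) * dt
    history.append((round(step * dt, 2), round(backlog, 1), round(wip, 1)))

print(history[-1])        # (week, backlog, WIP) at the end of the run
```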
Abstract:
Different forms of strategic flexibility allow for reactive adaptation to different changing environments and the proactive driving of change. It is therefore becoming increasingly important for decision makers to possess not only marketing capabilities, but also the capabilities for strategic flexibility in its various forms. However, our knowledge of the relationships between decision makers’ different ways of thinking and their capabilities for strategic flexibility is limited. This limitation is constraining research and understanding. In this article we develop a theoretical cognitive content framework that postulates relationships between different ways of thinking about strategy and different information-processing demands. We then outline how the contrasting beliefs of decision makers may influence their capabilities to generate different hybrid forms of strategic flexibility at the cognitive level. Theoretically, the framework is embedded in resource-based theory, personal construct theory and schema theory. The implications for research and theory are discussed.
Abstract:
Due to its wide applicability and ease of use, the analytic hierarchy process (AHP) has been studied extensively over the last 20 years. Recently, it has been observed that the focus has shifted from the stand-alone AHP to applications of integrated AHPs. The five tools most commonly combined with the AHP are mathematical programming, quality function deployment (QFD), meta-heuristics, SWOT analysis, and data envelopment analysis (DEA). This paper reviews the literature on applications of the integrated AHPs. Related articles appearing in international journals from 1997 to 2006 are gathered and analyzed so that the following three questions can be answered: (i) which type of integrated AHP has received the most attention? (ii) in which areas have the integrated AHPs been most prevalently applied? (iii) are there any inadequacies in the approaches? Based on any inadequacies identified, improvements and possible future work are recommended. This research not only provides evidence that the integrated AHPs are better than the stand-alone AHP, but also aids researchers and decision makers in applying the integrated AHPs effectively.
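For readers unfamiliar with the stand-alone AHP step that these integrated approaches build on, the following minimal sketch derives criterion weights from a pairwise comparison matrix via the principal eigenvector and checks consistency. The judgement matrix is invented for illustration, and numpy is assumed.

```python
# A minimal sketch of the core AHP calculation: priority weights from a
# pairwise comparison matrix (Saaty scale) plus a consistency check.
# The matrix entries below are invented.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])               # hypothetical pairwise judgements

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                   # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                  # normalised priority vector

n = A.shape[0]
lam_max = eigvals.real[k]
CI = (lam_max - n) / (n - 1)                  # consistency index
RI = {3: 0.58, 4: 0.90, 5: 1.12}[n]           # Saaty's random index
print("weights:", w.round(3), "consistency ratio:", round(CI / RI, 3))
```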
Abstract:
This paper formulates a logistics distribution problem as the multi-depot travelling salesman problem (MDTSP). The decision makers not only have to determine the travelling sequence of the salesman for delivering finished products from a warehouse or depot to a customer, but also need to determine which depot stores which type of products so that the total travelling distance is minimised. The MDTSP is similar to the combination of the travelling salesman and quadratic assignment problems. In this paper, the two individual hard problems or models are formulated first. Then, the problems are integrated together, that is, the MDTSP. The MDTSP is constructed as both integer nonlinear and linear programming models. After formulating the models, we verify the integrated models using commercial packages, and most importantly, investigate whether an iterative approach, that is, solving the individual models repeatedly, can generate an optimal solution to the MDTSP. Copyright © 2006 Inderscience Enterprises Ltd.
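The coupled decision that the MDTSP captures can be illustrated with a small brute-force sketch: assign product types to depots, then sequence the deliveries served from each depot, minimising the total travel distance. The depots, customers and coordinates below are invented, and this is not the paper's integer programming formulation.

```python
# An illustrative brute-force sketch of the coupled MDTSP decision:
# (1) which depot stores which product type, (2) the delivery sequence from
# each depot. All locations and demands are hypothetical.
from itertools import permutations
from math import dist, inf

depots = {"D1": (0, 0), "D2": (10, 10)}
customers = {"c1": ((2, 1), "A"), "c2": ((3, 8), "B"),
             "c3": ((9, 9), "B"), "c4": ((8, 2), "A")}   # (location, product type)
products = ["A", "B"]

def tour_length(start, stops):
    """Length of the shortest round trip from a depot through the given stops."""
    best = inf
    for order in permutations(stops):
        pts = [start, *order, start]
        best = min(best, sum(dist(a, b) for a, b in zip(pts, pts[1:])))
    return best

best_cost, best_plan = inf, None
for assign in permutations(depots, len(products)):        # product -> depot
    product_depot = dict(zip(products, assign))
    cost = 0.0
    for name, loc in depots.items():
        stops = [c_loc for c_loc, p in customers.values()
                 if product_depot[p] == name]
        cost += tour_length(loc, stops)
    if cost < best_cost:
        best_cost, best_plan = cost, product_depot

print("Best product-to-depot assignment:", best_plan,
      "total distance:", round(best_cost, 2))
```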
Abstract:
Problem structuring methods (PSMs) aim to build shared understanding in a group of decision makers. This shared understanding is used as a basis for them to negotiate an agreed action plan that they are prepared to help implement. Engaging in a social process of negotiation with a large number of people is difficult, and so PSMs have typically focused on small groups of less than 20. This paper explores the legitimacy of deploying PSMs in large groups of people (50–1000), where the aim is to negotiate action and build commitment to its implementation. We review the difficulties of facilitating large groups with PSMs, drawing heavily on our experience of working with over 25 large groups. We offer a range of lessons learned and suggest concrete approaches to facilitating large groups to achieve the objectives of PSMs. This paper contributes to the evaluation and development of PSMs.
Abstract:
Purpose - Despite the increasing sophistication of new product development (NPD) research, the reliance on traditional approaches to studying NPD has left several areas in need of further research. The authors propose addressing some of these gaps, especially the limited focus on consumer brands, the evaluation criteria used across different project-review points in the NPD process, and the distinction between "kills", "successes", and "failures". Moreover, they propose investigating how screening criteria change across project-review points, using real-time NPD projects. Design/methodology/approach - A postal survey generated 172 usable questionnaires from a sample of European, North American, Far Eastern and Australian consumer packaged-goods firms, providing data on 314 new product projects covering different development and post-commercialization review points. Findings - The results confirm that acceptance-rejection criteria vary through the NPD process. However, financial criteria dominate across all the project-review points. Initial screening is coarse, focusing predominantly on financial criteria. Fit with organizational, product, brand, promotional, and market requirements dominates at the detailed screen and pre-development evaluation points. At pre-launch, decision-makers focus on product, brand, and promotional criteria. Commercial fit, production synergies, and reliability of the firm's market intelligence are significant discriminators in the post-launch review. Moreover, the importance of marketing and channel issues makes the criteria for screening consumer brands different from those used in industrial markets. Originality/value - The study, although largely descriptive and based on a relatively small sample of consumer goods firms, offers new insights into NPD project evaluation behavior. Future, larger-scale investigations covering a broader spectrum of consumer product sectors are needed to validate our results and to explain the reasons behind managers' decisions. © Emerald Group Publishing Limited.
Abstract:
In present-day knowledge societies, political decisions are often justified on the basis of scientific expertise. Traditionally, a linear relation between knowledge production and application was postulated, which would lead, with more and better science, to better policies. Empirical studies in Science and Technology Studies (STS) have essentially demolished this idea. However, it is still powerful, not least among practitioners working in fields where decision making is based on large doses of expert knowledge. Based on conceptual work in the field of STS, I shall examine two cases of global environmental governance: ozone layer protection and global climate change. I will argue that hybridization and purification are important for two major forms of scientific expertise. One is delivered through scientific advocacy (by individual scientists or groups of scientists), the other through expert committees, i.e. institutionalized forms of collecting and communicating expertise to decision makers. Based on this analysis, lessons will be drawn, also with regard to the stalling efforts at establishing an international forestry regime.
Abstract:
Very large spatially-referenced datasets, for example those derived from satellite-based sensors which sample across the globe or from large monitoring networks of individual sensors, are becoming increasingly common and more widely available for use in environmental decision making. In large or dense sensor networks, huge quantities of data can be collected over short time periods. In many applications the generation of maps, or predictions at specific locations, from the data in (near) real time is crucial. Geostatistical operations such as interpolation are vital in this map-generation process, and in emergency situations the resulting predictions need to be available almost instantly so that decision makers can make informed decisions and define risk and evacuation zones. It is also helpful in less time-critical applications, for example when interacting directly with the data for exploratory analysis, if the algorithms are responsive within a reasonable time frame. Performing geostatistical analysis on such large spatial datasets presents a number of problems, particularly where maximum likelihood estimation is used. Although the storage requirements scale only linearly with the number of observations in the dataset, the computational costs in terms of memory and speed scale quadratically and cubically, respectively. Most modern commodity hardware has at least two processor cores, if not more, and other mechanisms for parallel computation, such as Grid-based systems, are also becoming increasingly available. However, there currently seems to be little interest in exploiting this extra processing power within the context of geostatistics. In this paper we review the existing parallel approaches for geostatistics. By recognising that different natural parallelisms exist and can be exploited depending on whether the dataset is sparsely or densely sampled with respect to the range of variation, we introduce two contrasting novel implementations of parallel algorithms based on approximating the data likelihood, extending the methods of Vecchia [1988] and Tresp [2000]. Using parallel maximum likelihood variogram estimation and parallel prediction algorithms, we show that computational time can be significantly reduced. We demonstrate this with both sparsely and densely sampled data on a variety of architectures, ranging from the common dual-core processor found in many modern desktop computers to large multi-node supercomputers. To highlight the strengths and weaknesses of the different methods, we employ synthetic data sets and go on to show how the methods allow maximum likelihood based inference on the exhaustive Walker Lake data set.
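A minimal sketch of the block-wise likelihood approximation idea (loosely in the spirit of Tresp [2000]) is given below: the data are partitioned into blocks treated as independent, and each block's exact Gaussian log-likelihood is evaluated on a separate core. The exponential covariance model, synthetic data and block size are invented for illustration; this is not the authors' implementation.

```python
# A hypothetical sketch of a parallel, block-wise approximation to the Gaussian
# log-likelihood for large spatial datasets: partition the data into blocks,
# treat blocks as independent, and evaluate each block on a worker process.
import numpy as np
from multiprocessing import Pool

def block_loglik(args):
    """Exact Gaussian log-likelihood of one block under an exponential covariance."""
    coords, values, sill, rng, nugget = args
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    K = sill * np.exp(-d / rng) + nugget * np.eye(len(values))
    _, logdet = np.linalg.slogdet(K)
    alpha = np.linalg.solve(K, values)
    return -0.5 * (logdet + values @ alpha + len(values) * np.log(2 * np.pi))

def approx_loglik(coords, values, sill, rng, nugget, block_size=250, workers=4):
    """Sum of per-block log-likelihoods, evaluated across worker processes."""
    order = np.argsort(coords[:, 0])          # crude spatial ordering for blocking
    blocks = [(coords[idx], values[idx], sill, rng, nugget)
              for idx in np.array_split(order, max(1, len(values) // block_size))]
    with Pool(workers) as pool:
        return sum(pool.map(block_loglik, blocks))

if __name__ == "__main__":
    gen = np.random.default_rng(0)
    coords = gen.uniform(0, 100, size=(2000, 2))   # synthetic locations
    values = gen.normal(size=2000)                 # synthetic observations
    print(approx_loglik(coords, values, sill=1.0, rng=10.0, nugget=0.1))
```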
Abstract:
In this book, Stehr and Grundmann outline the theoretical significance and practical importance of the growing stratum of experts, counsellors and advisors in contemporary society, and claim that the growing spectrum of knowledge-based occupations has led to the pluralisation of expertise. As decision makers in organizations and private citizens, for various reasons, increasingly seek advice from experts, the authors examine the nature of expert activity and suggest that the role of experts needs to be distinguished from other roles such as those of professionals, scientists, or intellectuals. Experts, they argue, perform knowledge-based activities that mediate between the contexts of knowledge creation and application. Existing approaches tend to restrict the role of the expert to scientists, or to conflate the role of professionals with that of experts. In avoiding such restrictions, this book sets out a framework for a better understanding of the growing role of expertise. Experts provides a thought-provoking discussion that will be of interest to postgraduate students and academics working within the fields of social theory, knowledge, and consumption.
Abstract:
The concept of ordered weighted averaging (OWA) operator weights arises in uncertain decision-making problems; however, some weights may have a specific relationship with others. This information about the weights can be obtained from decision makers (DMs). This paper introduces a theory of weight restrictions into the existing OWA operator weight models. Based on the DMs' value judgments, the resulting OWA operator weights can be more realistic.
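As a minimal illustration of OWA aggregation, the sketch below sorts the scores in descending order, applies position weights, and reports the orness of the weight vector; a simple non-increasing restriction on the weights stands in for the kind of DM-supplied weight restriction the abstract refers to. The weights and scores are invented.

```python
# A minimal sketch of OWA aggregation with one example weight restriction.
# Weights and scores are hypothetical.
import numpy as np

def owa(scores, weights):
    """Ordered weighted average: dot product of descending-sorted scores and weights."""
    w = np.asarray(weights, dtype=float)
    s = np.sort(np.asarray(scores, dtype=float))[::-1]
    assert np.isclose(w.sum(), 1.0) and np.all(w >= 0)
    return float(s @ w)

def orness(weights):
    """Degree of 'or-ness' (1 = max operator, 0 = min operator)."""
    n = len(weights)
    return float(sum((n - 1 - i) * w for i, w in enumerate(weights)) / (n - 1))

weights = [0.4, 0.3, 0.2, 0.1]     # hypothetical weights from the DMs
scores = [0.7, 0.9, 0.5, 0.6]      # hypothetical criterion scores

# Example restriction a DM might impose: weights must be non-increasing.
assert np.all(np.diff(weights) <= 0), "restriction violated: weights must be non-increasing"
print("OWA value:", owa(scores, weights), "orness:", round(orness(weights), 3))
```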
Abstract:
Since much knowledge is tacit, eliciting knowledge is a common bottleneck during the development of knowledge-based systems. Visual interactive simulation (VIS) has been proposed as a means for eliciting experts’ decision-making by getting them to interact with a visual simulation of the real system in which they work. In order to explore the effectiveness and efficiency of VIS based knowledge elicitation, an experiment has been carried out with decision-makers in a Ford Motor Company engine assembly plant. The model properties under investigation were the level of visual representation (2-dimensional, 2½-dimensional and 3-dimensional) and the model parameter settings (unadjusted and adjusted to represent more uncommon and extreme situations). The conclusion from the experiment is that using a 2-dimensional representation with adjusted parameter settings provides the better simulation-based means for eliciting knowledge, at least for the case modelled.
Abstract:
One of the most significant paradigm shifts in modern business management is that individual businesses no longer compete as solely autonomous entities, but rather as supply chains. Firms worldwide have embraced the concept of supply chain management as important, and sometimes critical, to their business. The idea of a collaborative supply chain is to gain a competitive advantage by improving overall performance through measurement from a holistic perspective of the supply chain. However, contemporary performance measurement theory is somewhat fragmented and fails to support this idea. This research therefore develops and applies an integrated supply chain performance measurement framework that provides a more holistic approach to the study of supply chain performance measurement by combining both supply chain macro processes and decision-making levels. The proposed framework can thus provide a balanced horizontal (cross-process) and vertical (hierarchical decision) view and measure the performance of the entire supply chain system. First, literature on performance measurement frameworks and performance measurement factors in supply chain management is used to develop a conceptual framework. The proposed framework is then presented and validated through in-depth interviews with three Thai manufacturing companies. The fieldwork combined varied sources in order to understand the views of manufacturers on supply chain performance in the three case study companies. The collected data were analyzed, interpreted, and reported using thematic analysis and the analytic hierarchy process (AHP), guided by the study’s conceptual framework. This research contributes a new theory of supply chain performance measurement and knowledge of the supply chain characteristics of a developing country, Thailand. The research also benefits organisations by preparing decision makers to make strategic, tactical and operational level decisions with respect to supply chain macro processes. The results from the case studies also indicate the similarities and differences in their supply chain performance. Furthermore, the implications of the study are offered for both academic and practical use.
Abstract:
In the quest to secure the much-vaunted benefits of North Sea oil, highly non-incremental technologies have been adopted. Nowhere is this more the case than in the early fields of the central and northern North Sea. By focusing on the inflexible nature of North Sea hardware in such fields, this thesis examines the problems that this sort of technology might pose for policy making. More particularly, the following issues are raised. First, the implications of non-incremental technical change for the successful conduct of oil policy are raised. Here, the focus is on the micro-economic performance of the first generation of North Sea oil fields and the manner in which this relates to government policy. Secondly, the question is posed as to whether there were more flexible, perhaps more incremental, policy alternatives open to the decision makers. Conclusions drawn relate to the degree to which non-incremental shifts in policy permit decision makers to achieve their objectives at relatively low cost. To discover cases where non-incremental policy making has led to success in this way would be to falsify the thesis that decision makers are best served by employing incremental politics as an approach to complex problem solving.