776 results for Decision-making Mathematical models
Abstract:
A participatory modelling process has been conducted in two areas of the Guadiana river basin (the upper and the middle sub-basins) in Spain, with the aim of supporting decision making in water management. The area has a semi-arid climate in which irrigated agriculture plays a key role in the region's economic development and accounts for around 90% of water use. Following the guidelines of the European Water Framework Directive, we promote stakeholder involvement in water management in order to achieve an improved understanding of the water system and to encourage the exchange of knowledge and views between stakeholders, thereby helping to build a shared vision of the system. At the same time, the resulting models, which integrate the different sectors and views, provide insight into the impacts that different management options and possible future scenarios could have. The methodology is based on a Bayesian network combined with an economic model and, in the middle Guadiana sub-basin, with a crop model. The resulting integrated modelling framework is used to simulate possible water policy, market and climate scenarios and to assess the impacts of those scenarios on farm income and on the environment. At the end of the modelling process, an evaluation questionnaire was completed by participants in both sub-basins. Results show that stakeholders find this type of process very helpful for improving their understanding of the system, for understanding each other's views and for reducing conflict where it exists. In addition, they found the model an extremely useful tool to support management. The graphical interface, the quantitative output and the explicit representation of uncertainty helped stakeholders to better understand the implications of the scenarios tested. Finally, the combination of different types of models was also found very useful, as it allowed specific aspects of the water management problems to be explored in detail.
Abstract:
One of the core objectives of urban planning practice is to provide spatial equity in terms of opportunities and the use of public space and facilities. Accessibility is the element that serves this purpose: as a concept linking the reciprocal relationship between transport and land use, it shapes individuals' potential mobility to reach desired destinations. Accessibility concepts are increasingly acknowledged as fundamental to understanding the functioning of cities and urban regions, and introducing them into planning practice can lead to better solutions in terms of spatial equity. The COST Action TU1002 "Accessibility instruments for planning practice" was specifically designed to address the gap between scientific research in measuring and modelling accessibility and the current use of accessibility indicators in urban planning practice. This paper shows the full process of introducing an easily understandable measure of accessibility to planning practitioners in Madrid, one of the case studies of the above-mentioned COST Action. Changes in accessibility after the opening of a new metro line were analyzed using contour measures and then presented to a selection of urban planners and practitioners in Madrid at a workshop held to evaluate the usefulness of this tool for planning practice. Isochrone maps were confirmed as an effective tool: their utility can be supplemented by other indicators, and, being GIS-based, they can be easily computed (compared with transport models) and integrated with other datasets.
Abstract:
Fundamental principles of precaution are legal maxims that call for preventive actions, perhaps as contingent interim measures while relevant information about causality and harm remains unavailable, to minimize the societal impact of potentially severe or irreversible outcomes. Such principles do not explain how to make choices or how to identify what is protective when incomplete and inconsistent scientific evidence of causation characterizes the potential hazards. Rather, they entrust lower jurisdictions, such as agencies or authorities, with making current decisions while recognizing that future information may contradict the scientific basis that supported the initial decision. After reviewing and synthesizing national and international legal aspects of precautionary principles, this paper addresses the key question: how can society manage potentially severe, irreversible or serious environmental outcomes when variability, uncertainty, and limited causal knowledge characterize the decision-making? A decision-analytic solution is outlined that focuses on risky decisions and accounts for prior states of information and scientific beliefs that can be updated as subsequent information becomes available. As a practical and established approach to causal reasoning and decision-making under risk, inherent to precautionary decision-making, these (Bayesian) methods help decision-makers and stakeholders because they formally account for probabilistic outcomes and new information, and are consistent and replicable. Rational choice of an action from among various alternatives (defined as a choice that makes preferred consequences more likely) requires accounting for the costs, benefits and the change in risks associated with each candidate action.
Decisions under any form of the precautionary principle reviewed must account for the contingent nature of scientific information, creating a link to the decision-analytic principle of the expected value of information (VOI), which shows the relevance of new information relative to the initial (and smaller) set of data on which the decision was based. We exemplify this seemingly simple situation using the risk management of BSE. As an integral aspect of causal analysis under risk, the methods developed in this paper permit the addition of non-linear, hormetic dose-response models to the current set of regulatory defaults, such as the linear, non-threshold models. This increase in the number of defaults is an important improvement because most variants of the precautionary principle require cost-benefit balancing. Specifically, increasing the set of causal defaults accounts for beneficial effects at very low doses. We also show that quantitative risk assessment dominates qualitative risk assessment, supporting the extension of the set of default causal models.
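The value-of-information principle invoked in this abstract can be made concrete with a small worked example. The sketch below computes the expected value of perfect information (EVPI) for a discrete decision problem; the actions, states, payoffs and prior probabilities are hypothetical illustrative numbers, not taken from the paper.

```python
# Minimal EVPI sketch for a discrete decision problem.
# All actions, states, payoffs, and the prior are hypothetical.

def expected_value_of_perfect_information(payoffs, prior):
    """payoffs[a][s]: payoff of action a in state s; prior[s]: P(state s)."""
    # Expected payoff of the best action chosen now, before new information.
    best_prior = max(
        sum(p * payoffs[a][s] for s, p in enumerate(prior))
        for a in range(len(payoffs))
    )
    # Expected payoff if the true state were revealed before acting.
    with_info = sum(
        prior[s] * max(payoffs[a][s] for a in range(len(payoffs)))
        for s in range(len(prior))
    )
    return with_info - best_prior

# Two actions (e.g. impose a precautionary ban vs. wait), two states
# (hazard real / hazard absent); payoffs are illustrative losses.
payoffs = [[-10, -2],   # act: costly, but limits harm in either state
           [-50,  0]]   # wait: severe loss if the hazard is real
prior = [0.3, 0.7]

evpi = expected_value_of_perfect_information(payoffs, prior)
```

A positive EVPI bounds what the decision-maker should be willing to pay for further research before committing, which is exactly the link between precaution and VOI the abstract describes.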
Abstract:
There have been many models developed by scientists to assist decision-makers in making socio-economic and environmental decisions. It is now recognised that the dominant paradigm has shifted to making decisions with stakeholders, rather than making decisions for stakeholders. Our paper investigates two case studies in which group model building has been undertaken for maintaining biodiversity in Australia. The first case study focuses on the preservation and management of green spaces and biodiversity in metropolitan Melbourne under the umbrella of the Melbourne 2030 planning strategy. A geographical information system is used to collate a number of spatial datasets encompassing a range of cultural and natural asset data layers, including existing open spaces, waterways, threatened fauna and flora, ecological vegetation covers, registered cultural heritage sites, and existing land parcel zoning. Group model building is incorporated into the study by eliciting from urban planners weightings and ratings of importance for each dataset, which are used to formulate different urban green system scenarios. The second case study focuses on modelling ecoregions from spatial datasets for the state of Queensland. The modelling combines collaborative expert knowledge and a vast amount of environmental data to build biogeographical classifications of regions. An information elicitation process is used to capture expert knowledge of ecoregions as geographical descriptions, and to transform this into prior probability distributions that characterise regions in terms of environmental variables. This prior information is combined with measured data on the environmental variables within a Bayesian modelling technique to produce the final classified regions. We describe how linked views between descriptive information, mapping and statistical plots are used to decide upon representative regions that satisfy a number of criteria for biodiversity and conservation.
This paper discusses the advantages and problems encountered when undertaking group model building. Future research will extend the group model building approach to include interested individuals and community groups.
Abstract:
The successful restructuring of Chinese industries is of immense importance not only for the continued development of China but also for the stability of the world economy. The transformation of the Chinese wool textile industry illustrates well the many problems and pressures currently facing most Chinese industries. The Chinese wool textile industry has undergone major upheaval and restructuring in its drive to modernize and take advantage of developments in world textile markets. Macro-level ownership and administrative reforms are well advanced, as is the uptake of new technology and equipment. However, the changing market and institutional environment also demands an increasing level of sophistication in mill management decisions, including product selection, input procurement, product pricing, investment appraisal, cost analysis and the proactive identification of new market and growth opportunities. This paper outlines a series of analyses that have been integrated into a decision-making model designed to assist mill managers with these decisions. Features of the model include a whole-of-mill approach, a design based on existing mill structures and information systems, and the capacity for the model to be tailored to individual mills. All of these features facilitate the adoption of the model by time- and resource-constrained managers seeking to maintain the viability of their enterprises in the face of extremely dynamic market conditions.
Abstract:
Expert systems, and artificial intelligence more generally, can provide a useful means for representing decision-making processes. By linking expert systems software to simulation software an effective means of including these decision-making processes in a simulation model can be achieved. This paper demonstrates how a commercial-off-the-shelf simulation package (Witness) can be linked to an expert systems package (XpertRule) through a Visual Basic interface. The methodology adopted could be used for models, and possibly software, other than those presented here.
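As an illustration of the general idea only (not of the paper's actual Witness/XpertRule/Visual Basic integration), the following sketch embeds a small ordered rule base, of the kind an expert system shell encodes, inside a toy queue simulation; all rules, thresholds and state variables are hypothetical.

```python
# Toy sketch: rule-based decision logic consulted by a simulation loop.
# Rules, thresholds, and state variables are hypothetical illustrations.

def decide(state):
    """Return an action by evaluating ordered if-then rules on the state."""
    rules = [
        (lambda s: s["queue"] > 10 and s["machine_idle"], "start_extra_shift"),
        (lambda s: s["queue"] > 10,                       "reroute_jobs"),
        (lambda s: s["queue"] == 0,                       "idle_machine"),
    ]
    for condition, action in rules:
        if condition(state):
            return action
    return "continue"

def simulate(arrivals, service_rate=3):
    """Very simple queue model that consults the rule base at each step."""
    queue, log = 0, []
    for arriving in arrivals:
        queue = max(0, queue + arriving - service_rate)
        log.append(decide({"queue": queue, "machine_idle": queue == 0}))
    return log

actions = simulate([2, 8, 9, 0, 0])
```

Separating `decide` from `simulate` mirrors the paper's architecture: the rule base can be maintained independently of the simulation engine and swapped out without touching the process model.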
Abstract:
Group decision making is the study of identifying and selecting alternatives based on the values and preferences of the decision makers. Making a decision implies that there are several alternative choices to be considered. This paper uses the concept of Data Envelopment Analysis (DEA) to introduce a new mathematical method for selecting the best alternative in a group decision-making environment. The introduced model is a multi-objective function which is converted into a multi-objective linear programming model, from which the optimal solution is obtained. A numerical example shows how the new model can be applied to rank the alternatives or to choose a subset of the most promising alternatives.
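For readers unfamiliar with DEA, the special case of one input and one output makes the idea concrete: under constant returns to scale, the CCR efficiency score reduces to each unit's output/input ratio normalised by the best observed ratio. The units and figures below are hypothetical, and the paper's multi-objective group-decision model is more general, requiring a full linear-programming formulation.

```python
# Minimal DEA sketch for the single-input, single-output CCR case.
# Units and data are hypothetical illustrations.

def ccr_efficiency(units):
    """units: {name: (input, output)} -> {name: efficiency score in (0, 1]}."""
    # Output/input productivity ratio for each decision making unit.
    ratios = {name: out / inp for name, (inp, out) in units.items()}
    # Normalise by the best ratio: frontier units score exactly 1.0.
    best = max(ratios.values())
    return {name: r / best for name, r in ratios.items()}

units = {"A": (2.0, 4.0), "B": (4.0, 6.0), "C": (5.0, 10.0)}
scores = ccr_efficiency(units)
```

With multiple inputs and outputs, each unit instead chooses the weights most favourable to itself by solving a linear programme, which is where the multi-objective formulation in the paper enters.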
Abstract:
Data Envelopment Analysis (DEA) is a nonparametric method for measuring the efficiency of a set of decision making units such as firms or public sector agencies, first introduced into the operational research and management science literature by Charnes, Cooper, and Rhodes (CCR) [Charnes, A., Cooper, W.W., Rhodes, E., 1978. Measuring the efficiency of decision making units. European Journal of Operational Research 2, 429–444]. The original DEA models were applicable only to technologies characterized by positive inputs/outputs. In the subsequent literature there have been various approaches to enable DEA to deal with negative data. In this paper, we propose a semi-oriented radial measure, which permits the presence of variables that can take both negative and positive values. The model is applied to data on a notional effluent processing system to compare the results with those yielded by two alternative methods for dealing with negative data in DEA: the modified slacks-based model suggested by Sharp et al. [Sharp, J.A., Liu, W.B., Meng, W., 2006. A modified slacks-based measure model for data envelopment analysis with 'natural' negative outputs and inputs. Journal of the Operational Research Society 57 (11), 1–6] and the range directional model developed by Portela et al. [Portela, M.C.A.S., Thanassoulis, E., Simpson, G., 2004. A directional distance approach to deal with negative data in DEA: An application to bank branches. Journal of the Operational Research Society 55 (10), 1111–1121]. A further example explores the advantages of using the new model.
Abstract:
Information systems have developed to the stage that there is plenty of data available in most organisations but there are still major problems in turning that data into information for management decision making. This thesis argues that the link between decision support information and transaction processing data should be through a common object model which reflects the real world of the organisation and encompasses the artefacts of the information system. The CORD (Collections, Objects, Roles and Domains) model is developed which is richer in appropriate modelling abstractions than current Object Models. A flexible Object Prototyping tool based on a Semantic Data Storage Manager has been developed which enables a variety of models to be stored and experimented with. A statistical summary table model COST (Collections of Objects Statistical Table) has been developed within CORD and is shown to be adequate to meet the modelling needs of Decision Support and Executive Information Systems. The COST model is supported by a statistical table creator and editor COSTed which is also built on top of the Object Prototyper and uses the CORD model to manage its metadata.
Abstract:
This thesis reviews the main methodological developments in public sector investment appraisal and finds growing evidence that appraisal techniques are not fulfilling their earlier promise. It is suggested that an important reason for this failure lies in the inability of these techniques to handle uncertainty except in a highly circumscribed fashion. It is argued that a more fruitful approach is to strive for flexibility. Investment projects should be formulated with a view to making them responsive to a wide range of possible future events, rather than embodying a solution which is optimal for one configuration of circumstances only. The distinction drawn in economics between the short and the long run is used to examine the nature of flexibility. The concept of long-run flexibility is applied to the pre-investment range of choice open to the decision-maker. It is demonstrated that flexibility is reduced at a very early stage of decision-making by the conventional system of appraisal, which evaluates only a small number of options. The pre-appraisal filtering process is considered further in relation to decision-making models. It is argued that for public sector projects the narrowing down of options is best understood in relation to an amended mixed-scanning model which places importance on the process by which the 'national interest' is determined. Short-run flexibility deals with operational characteristics, the degree to which particular projects may respond to changing demands when the basic investment is already in place. The tension between flexibility and cost is noted. A short case study on the choice of electricity generating plant is presented. The thesis concludes with a brief examination of the approaches used by successive British governments to public sector investment, particularly in relation to the nationalised industries.
Abstract:
Guest editorial. Ali Emrouznejad is a Senior Lecturer at the Aston Business School in Birmingham, UK. His research interests include performance measurement and management, efficiency and productivity analysis, and data mining. He has published widely in various international journals. He is an Associate Editor of the IMA Journal of Management Mathematics and Guest Editor of several special issues of journals including the Journal of the Operational Research Society, Annals of Operations Research, Journal of Medical Systems, and International Journal of Energy Sector Management. He is on the editorial board of several international journals and a co-founder of Performance Improvement Management Software. William Ho is a Senior Lecturer at the Aston Business School. Before joining Aston in 2005, he worked as a Research Associate in the Department of Industrial and Systems Engineering at the Hong Kong Polytechnic University. His research interests include supply chain management, production and operations management, and operations research. He has published extensively in various international journals, including Computers & Operations Research, Engineering Applications of Artificial Intelligence, European Journal of Operational Research, Expert Systems with Applications, International Journal of Production Economics, International Journal of Production Research, and Supply Chain Management: An International Journal. His first authored book was published in 2006. He is an Editorial Board member of the International Journal of Advanced Manufacturing Technology and an Associate Editor of the OR Insight Journal. Currently, he is a Scholar of the Advanced Institute of Management Research.
Uses of frontier efficiency methodologies and multi-criteria decision making for performance measurement in the energy sector. This special issue focuses on holistic, applied research on performance measurement in energy sector management, publishing relevant applied research that bridges the gap between industry and academia. After a rigorous refereeing process, seven papers were included in this special issue. The volume opens with five data envelopment analysis (DEA)-based papers. Wu et al. apply the DEA-based Malmquist index to evaluate the changes in relative efficiency and total factor productivity of coal-fired electricity generation in 30 Chinese administrative regions from 1999 to 2007. Factors considered in the model include fuel consumption, labor, capital, sulphur dioxide emissions, and electricity generated. The authors reveal that the east provinces were relatively and technically more efficient, whereas the west provinces had the highest growth rate in the period studied. Ioannis E. Tsolas applies the DEA approach to assess the performance of Greek fossil fuel-fired power stations, taking undesirable outputs such as carbon dioxide and sulphur dioxide emissions into consideration. In addition, the bootstrapping approach is deployed to address the uncertainty surrounding DEA point estimates and to provide bias-corrected estimations and confidence intervals for the point estimates. The author reveals from the sample that the non-lignite-fired stations are on average more efficient than the lignite-fired stations. Maethee Mekaroonreung and Andrew L. Johnson compare the relative performance of three DEA-based measures, which estimate production frontiers and evaluate the relative efficiency of 113 US petroleum refineries while considering undesirable outputs.
Three inputs (capital, energy consumption, and crude oil consumption), two desirable outputs (gasoline and distillate generation), and an undesirable output (toxic release) are considered in the DEA models. The authors discover that refineries in the Rocky Mountain region performed the best, and that about 60 percent of the oil refineries in the sample could improve their efficiencies further. H. Omrani, A. Azadeh, S. F. Ghaderi, and S. Abdollahzadeh present an integrated approach, combining DEA, corrected ordinary least squares (COLS), and principal component analysis (PCA), to calculate the relative efficiency scores of 26 Iranian electricity distribution units from 2003 to 2006. Specifically, both DEA and COLS are used to check three internal consistency conditions, whereas PCA is used to verify and validate the final ranking results of either DEA (consistency) or DEA-COLS (non-consistency). Three inputs (network length, transformer capacity, and number of employees) and two outputs (number of customers and total electricity sales) are considered in the model. Virendra Ajodhia applies three DEA-based models to evaluate the relative performance of 20 electricity distribution firms from the UK and the Netherlands. The first model is a traditional DEA model for analyzing cost-only efficiency. The second model includes (inverse) quality by modelling total customer minutes lost as an input. The third model is based on the idea of using total social costs, including the firm's private costs and the interruption costs incurred by consumers, as an input. Both energy delivered and the number of consumers are treated as outputs in the models. After the five DEA papers, Stelios Grafakos, Alexandros Flamos, Vlasis Oikonomou, and D. Zevgolis present a multiple-criteria weighting approach to evaluate energy and climate policy.
The proposed approach is akin to the analytic hierarchy process, which consists of pairwise comparisons, consistency verification, and criteria prioritization. In the approach, stakeholders and experts in the energy policy field are incorporated in the evaluation process through an interactive means of expressing their preferences verbally, numerically, and visually. A total of 14 evaluation criteria were considered and classified into four objectives: climate change mitigation, energy effectiveness, socioeconomic, and competitiveness and technology. Finally, Borge Hess applies the stochastic frontier analysis approach to analyze the impact of various business strategies, including acquisitions, holding structures, and joint ventures, on a firm's efficiency within a sample of 47 natural gas transmission pipelines in the USA from 1996 to 2005. The author finds no significant changes in a firm's efficiency following an acquisition, and only weak evidence of efficiency improvements caused by the new shareholder. Moreover, the author discovers that parent companies appear not to influence a subsidiary's efficiency positively. In addition, the analysis shows a negative impact of joint ventures on the technical efficiency of the pipeline companies. To conclude, we are grateful to all the authors for their contributions, and to all the reviewers for their constructive comments, which made this special issue possible. We hope that this issue will contribute significantly to performance improvement in the energy sector.
Abstract:
This paper discusses the use of a Model developed by Aston Business School to record the work load of its academic staff. By developing a database to register annual activity in all areas of teaching, administration and research the School has created a flexible tool which can be used for facilitating both day-to-day managerial and longer term strategic decisions. This paper gives a brief outline of the Model and discusses the factors which were taken into account when setting it up. Particular attention is paid to the uses made of the Model and the problems encountered in developing it. The paper concludes with an appraisal of the Model’s impact and of additional developments which are currently being considered. Aston Business School has had a Load Model in some form for many years. The Model has, however, been refined over the past five years, so that it has developed into a form which can be used for a far greater number of purposes within the School. The Model is coordinated by a small group of academic and administrative staff, chaired by the Head of the School. This group is responsible for the annual cycle of collecting and inputting data, validating returns, carrying out analyses of the raw data, and presenting the material to different sections of the School. The authors of this paper are members of this steering group.
Abstract:
An expert system (ES) is a class of computer program developed by researchers in artificial intelligence. In essence, expert systems are programs made up of a set of rules that analyze information about a specific class of problems, provide an analysis of those problems and, depending upon their design, recommend a course of user action to implement corrections. ES are computerized tools designed to enhance the quality and availability of the knowledge required by decision makers in a wide range of industries. Decision-making is important for financial institutions because of the high level of risk associated with wrong decisions. The decision-making process is complex and unstructured, and existing models for decision-making do not capture the learned knowledge well enough. In this study, we analyze the beneficial aspects of using ES for the decision-making process.
Abstract:
Decision-making on product quality is an indispensable stage in product development, undertaken in order to reduce product development risk. Based on the identification of the deficiencies of quality function deployment (QFD) and failure modes and effects analysis (FMEA), a novel decision-making method is presented that draws upon a knowledge network of failure scenarios. An ontological expression of failure scenarios is presented together with a framework of a failure knowledge network (FKN). According to the roles of quality characteristics (QCs) in failure processing, QCs are divided into three categories, namely perceptible QCs, restrictive QCs, and controllable QCs, which represent the monitoring targets, control targets and improvement targets, respectively, for quality management. A mathematical model and algorithms based on the analytic network process (ANP) are introduced for calculating the priority of QCs with respect to different development scenarios. A case study is provided according to the proposed decision-making procedure based on the FKN, and the methodology is applied in the propeller design process to solve the problem of prioritising QCs. This paper provides a practical approach for decision-making in product quality. Copyright © 2011 Inderscience Enterprises Ltd.
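The ANP, like the analytic hierarchy process it extends, derives local priorities from pairwise comparison matrices via their principal eigenvector. A minimal sketch of that step, using power iteration in pure Python on a hypothetical 3x3 matrix of quality-characteristic comparisons (the matrix values are illustrative, not from the paper's case study):

```python
# Sketch of the priority-vector step underlying ANP/AHP models.
# The 3x3 comparison matrix below is a hypothetical illustration.

def priority_vector(matrix, iterations=100):
    """Approximate the principal eigenvector of a pairwise comparison
    matrix by power iteration, normalised so the weights sum to 1."""
    n = len(matrix)
    v = [1.0 / n] * n
    for _ in range(iterations):
        # One matrix-vector multiplication, then renormalise.
        w = [sum(matrix[i][j] * v[j] for j in range(n)) for i in range(n)]
        total = sum(w)
        v = [x / total for x in w]
    return v

# Pairwise judgements: QC1 is 3x as important as QC2, 5x as important
# as QC3; QC2 is 2x as important as QC3 (reciprocals below the diagonal).
comparisons = [
    [1.0,   3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
]
weights = priority_vector(comparisons)
```

A full ANP additionally arranges such local priority vectors into a supermatrix to capture the interdependencies between clusters, which a simple eigenvector computation alone does not model.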
Abstract:
Numerous earlier studies, including the authors' own, show a positive relationship between management capabilities and corporate competitiveness: better-performing and more proactive companies consistently have better-prepared, more capable and more risk-tolerant managers. It can also be observed that in the decisions of companies that are more successful from this perspective, the rational approach prevails even more strongly than average, with managers striving to select the optimal course of action. In this article the authors summarize the experience of the past 15 years of competitiveness research, placing particular emphasis on the results of the most recent survey. ________________ The article summarizes the main findings of the Competitiveness Research Program with respect to the skills and capabilities of Hungarian managers and the decision-making approaches they use in their work. The results of the four surveys conducted in 1996, 1999, 2004 and 2009 are fairly stable over time: practice-minded behavior, professional expertise, and problem-solving skills top the list of the most developed skills of Hungarian executives. The rational approach is the most popular of the widespread decision-making models in the authors' sample, which is rather alarming, since the present turbulent economic environment may demand more adaptive and intuitive approaches.