933 results for Goal Programming
Abstract:
This paper explores the use of the optimization procedures in SAS/OR software, applied to contemporary logistics distribution network design through an integrated multiple criteria decision-making approach. Unlike traditional optimization techniques, the proposed approach, combining the analytic hierarchy process (AHP) and goal programming (GP), considers both quantitative and qualitative factors. In the integrated approach, AHP is used to determine the relative importance weightings, or priorities, of alternative warehouses with respect to both deliverer-oriented and customer-oriented criteria. A GP model incorporating system, resource, and AHP-priority constraints is then formulated to select the best set of warehouses without exceeding the limited available resources. To facilitate the use of the integrated multiple criteria decision-making approach by SAS users, an ORMCDM code was implemented in the SAS programming language. The SAS macro developed in this paper selects the chosen variables from a SAS data file and constructs sets of linear programming models based on the selected GP model. An example illustrates how one could use the code to design a logistics distribution network.
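A minimal sketch of the AHP-plus-GP selection step described above, written in Python with PuLP rather than the paper's SAS ORMCDM macro; the warehouse names, AHP priorities, costs, budget, and goal target are hypothetical illustration values.

```python
# Sketch: pick warehouses to meet an AHP-priority goal within a budget.
from pulp import LpProblem, LpMinimize, LpVariable, LpBinary, lpSum

warehouses = ["W1", "W2", "W3", "W4"]
ahp_priority = {"W1": 0.40, "W2": 0.25, "W3": 0.20, "W4": 0.15}  # from the AHP step
cost = {"W1": 90, "W2": 60, "W3": 55, "W4": 40}                  # setup cost (hypothetical)
budget = 150                                                      # resource limit
priority_target = 0.70     # goal: capture at least 70% of total AHP priority

model = LpProblem("warehouse_GP", LpMinimize)
x = {w: LpVariable(f"x_{w}", cat=LpBinary) for w in warehouses}  # select or not
d_minus = LpVariable("underachievement", lowBound=0)  # deviation below the goal

# Goal constraint: achieved priority plus underachievement covers the target
model += lpSum(ahp_priority[w] * x[w] for w in warehouses) + d_minus >= priority_target
# Hard resource constraint: stay within the available budget
model += lpSum(cost[w] * x[w] for w in warehouses) <= budget
# Objective: minimize the unwanted deviation from the priority goal
model += 1.0 * d_minus

model.solve()
print([w for w in warehouses if x[w].value() == 1], d_minus.value())
```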
Abstract:
Using a wide range of operational research (OR) optimization examples, Applied Operational Research with SAS demonstrates how the OR procedures in SAS work. The book is one of the first to extensively cover the application of SAS procedures to OR problems, such as single criterion optimization, project management decisions, printed circuit board assembly, and multiple criteria decision making. The text begins with the algorithms and methods for linear programming, integer linear programming, and goal programming models. It then describes the principles of several OR procedures in SAS. Subsequent chapters explain how to use these procedures to solve various types of OR problems. Each of these chapters describes the concept of an OR problem, presents an example of the problem, and discusses the specific procedure and its macros for the optimal solution of the problem. The macros include data handling, model building, and report writing. While primarily designed for SAS users in OR and marketing analytics, the book can also be used by readers interested in mathematical modeling techniques. By formulating the OR problems as mathematical models, the authors show how SAS can solve a variety of optimization problems.
Abstract:
Purpose: The purpose of this paper is to review the literature on four major higher education decision problems: resource allocation, performance measurement, budgeting, and scheduling. Design/methodology/approach: Related articles appearing in international journals from 1996 to 2005 are gathered and analyzed to answer three questions: "What kinds of decision problems received the most attention?"; "Were multiple criteria decision-making techniques widely adopted?"; and "What are the inadequacies of these approaches?" Findings: Based on the identified inadequacies, some improvements and possible future work are recommended, and a comprehensive resource allocation model is developed that takes these factors into account. Finally, a new knowledge-based goal programming technique that integrates some operations of the analytic hierarchy process is proposed to tackle the model intelligently. Originality/value: Higher education has faced budget cuts or constrained budgets for the past 30 years. Managing the higher education system is therefore a crucial and urgent task for university decision makers seeking to improve performance and competitiveness. © Emerald Group Publishing Limited.
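As an illustration of the AHP operations such a knowledge-based technique would integrate, the sketch below derives criterion weights and a consistency ratio from a pairwise comparison matrix; the matrix (and the criteria it might compare) is hypothetical, not the paper's system.

```python
# Sketch: AHP priority weights via the principal eigenvector of a
# reciprocal pairwise comparison matrix, plus the consistency ratio.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])   # hypothetical 3x3 reciprocal matrix

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                 # principal eigenvalue index
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                # normalized priority weights

# Consistency ratio CR = CI / RI, with RI = 0.58 the standard value for n = 3
n = A.shape[0]
CI = (eigvals[k].real - n) / (n - 1)
print("weights:", np.round(w, 3), "CR:", round(CI / 0.58, 3))
```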
Abstract:
This article presents a potential method to assist developers of future bioenergy schemes when selecting from available suppliers of biomass materials. The method aims to allow tacit requirements placed on biomass suppliers to be considered at the design stage of new developments. It combines the Analytic Hierarchy Process and Quality Function Deployment methods (AHP-QFD). The output is a ranking and relative weighting of the available suppliers, which could be used to improve optimization algorithms such as linear and goal programming. The paper is at a conceptual stage and no results have yet been obtained. The aim is to use the AHP-QFD method to bridge the gap between the treatment of explicit and tacit requirements of bioenergy schemes, allowing decision makers to identify the most successful supply strategy available.
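A minimal sketch of how an AHP-QFD supplier ranking might be computed: AHP weights on the developer's requirements are propagated through a QFD relationship matrix to score suppliers. The requirement names, weights, and scores are hypothetical illustration values; the paper itself reports no results.

```python
# Sketch: AHP requirement weights x QFD relationship matrix -> supplier scores.
import numpy as np

requirements = ["reliability of supply", "moisture content", "local sourcing"]
ahp_weights = np.array([0.5, 0.3, 0.2])     # from an AHP pairwise comparison

suppliers = ["S1", "S2", "S3"]
# Relationship matrix: how well each supplier satisfies each requirement
# (rows = requirements, columns = suppliers), on the common 1-3-9 QFD scale.
R = np.array([[9, 3, 1],
              [3, 9, 3],
              [1, 3, 9]])

scores = ahp_weights @ R                    # weighted supplier scores
ranking = sorted(zip(suppliers, scores), key=lambda t: -t[1])
print(ranking)  # relative weights that could feed a linear/goal program
```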
Abstract:
Lack of discrimination power and poor weight dispersion remain major issues in Data Envelopment Analysis (DEA). Since the initial multiple criteria DEA (MCDEA) model developed in the late 1990s, only goal programming approaches, namely GPDEA-CCR and GPDEA-BCC, have been introduced for solving these problems in a multi-objective framework. We found the GPDEA models to be invalid and demonstrate that our proposed bi-objective multiple criteria DEA (BiO-MCDEA) outperforms them in discrimination power and weight dispersion, while requiring less computational code. An application to energy dependency among 25 European Union member countries is used to demonstrate the efficacy of our approach. © 2013 Elsevier B.V. All rights reserved.
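For orientation, the sketch below solves the classical CCR DEA model (multiplier form) that MCDEA and BiO-MCDEA build on, one linear program per DMU; the data are hypothetical and this is not the BiO-MCDEA formulation itself.

```python
# Sketch: input-oriented CCR efficiency per DMU, multiplier form.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 3.0, 4.0]])   # inputs:  1 input  x 3 DMUs (hypothetical)
Y = np.array([[1.0, 2.0, 2.5]])   # outputs: 1 output x 3 DMUs (hypothetical)
m, n = X.shape
s = Y.shape[0]

for j0 in range(n):
    # Variables: u (s output weights) followed by v (m input weights).
    c = np.concatenate([-Y[:, j0], np.zeros(m)])           # maximize u'y0
    A_ub = np.hstack([Y.T, -X.T])                          # u'yj - v'xj <= 0
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.zeros(s), X[:, j0]])[None]   # v'x0 = 1
    b_eq = [1.0]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(1e-6, None)] * (s + m))         # epsilon lower bound
    print(f"DMU {j0}: efficiency = {-res.fun:.3f}")
```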
Abstract:
Projects exposed to an uncertain environment must be adapted to deal with the effective integration of various planning elements and the optimization of project parameters. Time, cost, and quality are the prime objectives of a project that need to be optimized to fulfill the owner's goal. In an uncertain environment, many other conflicting objectives may also need to be optimized, and these objectives are characterized by varying degrees of conflict. Moreover, an uncertain environment causes several changes in the project plan throughout its life, demanding that the plan be fully flexible. Goal programming (GP), a multiple criteria decision-making technique, offers a good solution for this project planning problem. The planning problem is considered from the owner's perspective, which leads to decomposing the project down to the activity level. GP is applied separately at each level, and the formulated models are integrated through information flow. The flexibility and adaptability of the models lie in the ease of updating model parameters at the required level, by changing priorities and/or constraints and transmitting the information to other levels. The hierarchical model automatically provides integration among the various elements of planning. The proposed methodology is applied to plan a petroleum pipeline construction project, and its effectiveness is demonstrated.
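A minimal sketch of the preemptive (lexicographic) GP idea behind such hierarchical planning: the time goal is minimized first, then the cost goal subject to preserving the time result. The single decision variable and all coefficients below are hypothetical, not the paper's pipeline model.

```python
# Sketch: two-priority preemptive goal programming with PuLP.
from pulp import LpProblem, LpMinimize, LpVariable

crash = LpVariable("crash_effort", lowBound=0, upBound=10)  # e.g. extra crews
time_dev = LpVariable("time_over", lowBound=0)   # days beyond the deadline
cost_dev = LpVariable("cost_over", lowBound=0)   # cost beyond the budget

def base_model():
    m = LpProblem("pipeline_GP", LpMinimize)
    # Duration shrinks with crashing: 120 - 2*crash days, deadline 110 days.
    m += 120 - 2 * crash - time_dev <= 110
    # Cost grows with crashing: 500 + 8*crash, budget 520 (cost units).
    m += 500 + 8 * crash - cost_dev <= 520
    return m

# Priority 1: minimize the time overrun.
m1 = base_model()
m1 += 1.0 * time_dev
m1.solve()
best_time_dev = time_dev.value()

# Priority 2: minimize the cost overrun without worsening the time goal.
m2 = base_model()
m2 += time_dev <= best_time_dev   # lock in the higher-priority achievement
m2 += 1.0 * cost_dev
m2.solve()
print("crash:", crash.value(), "time_over:", time_dev.value(),
      "cost_over:", cost_dev.value())
```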
Abstract:
A cross-country pipeline construction project is exposed to an uncertain environment due to its enormous size (physical, manpower requirement, and financial value), the complexity of its design technology, and the involvement of external factors. These uncertainties can lead to several changes in project scope during execution. Unless the changes are properly controlled, the time, cost, and quality goals of the project may never be achieved. A methodology is proposed for project control through risk analysis, contingency allocation, and hierarchical planning models. Risk analysis is carried out through the analytic hierarchy process (AHP) because of the subjective nature of risks in construction projects. The results of the risk analysis are used to determine a logical contingency for project control with the application of probability theory. Ultimate project control is carried out by a hierarchical planning model that enables decision makers to make vital decisions in the changing environment of the construction period. Goal programming (GP), a multiple criteria decision-making technique, is proposed for model formulation because of its flexibility and priority-based structure. The project is planned hierarchically at three levels (project, work package, and activity), and GP is applied separately at each level. The decision variables of each model are different planning parameters of the project. In this study, the models are formulated from the owner's perspective, and their effectiveness in project control is demonstrated.
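One simple way AHP risk weights, occurrence probabilities, and cost impacts might be combined into a contingency figure is sketched below; the combination rule and all numbers are hypothetical illustrations, not the paper's exact formulation.

```python
# Sketch: probability-weighted contingency from AHP-derived risk weights.
risks = {
    # risk: (AHP weight, probability of occurrence, cost impact if it occurs)
    "terrain/geotechnical": (0.45, 0.30, 2.0e6),
    "regulatory delay":     (0.35, 0.20, 1.5e6),
    "material price rise":  (0.20, 0.50, 0.8e6),
}

base_cost = 20.0e6
contingency = sum(w * p * impact for w, p, impact in risks.values())
print(f"contingency = {contingency:,.0f} "
      f"({100 * contingency / base_cost:.1f}% of base cost)")
```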
Abstract:
Since the seminal works of Markowitz (1952), Sharpe (1964), and Lintner (1965), numerous studies on portfolio selection and performance measurement have been based on the mean-variance framework. However, several researchers (e.g., Arditti (1967, 1971), Samuelson (1970), and Rubinstein (1973)) argue that higher moments cannot be neglected unless there is reason to believe that: (i) the asset returns are normally distributed and the investor's utility function is quadratic, or (ii) the empirical evidence demonstrates that higher moments are irrelevant to the investor's decision. Based on the same argument, this dissertation investigates the impact of higher moments of return distributions on three issues concerning 14 international stock markets. First, portfolio selection with skewness is determined using Polynomial Goal Programming, in which investor preferences for skewness can be incorporated. The empirical findings suggest that the return distributions of international stock markets are not normally distributed, and that incorporating skewness into an investor's portfolio decision causes a major change in the construction of the optimal portfolio. The evidence also indicates that an investor will trade expected portfolio return for skewness. Moreover, when short sales are allowed, investors are better off, as they attain higher expected return and skewness simultaneously. Second, the performance of international stock markets is evaluated using two types of performance measures: (i) the two-moment performance measures of Sharpe (1966) and Treynor (1965), and (ii) the higher-moment performance measures of Prakash and Bear (1986) and Stephens and Proffitt (1991). The empirical evidence indicates that higher moments of return distributions are significant and relevant to the investor's decision; thus, higher-moment performance measures are more appropriate for evaluating the performance of international stock markets. The evidence also indicates that the various measures provide vastly different performance rankings of the markets, albeit in the same direction. Finally, the inter-temporal stability of the international stock markets is investigated using the Parhizgari and Prakash (1989) algorithm for the Sen and Puri (1968) test, which accounts for non-normality of return distributions. The empirical findings strongly support stability in international stock market movements. However, when the Anderson test, which assumes normality of return distributions, is employed, stability in the correlation structure is rejected. This suggests that non-normality of the return distribution is an important factor that cannot be ignored when investigating the inter-temporal stability of international stock markets.
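A rough numerical sketch of the Polynomial Goal Programming idea, in the spirit of the mean-variance-skewness PGP literature: the best attainable mean and skewness are found separately, then deviations from both targets are traded off through preference exponents. The return data, variance normalization, and exponents below are hypothetical, not the dissertation's dataset or exact model.

```python
# Sketch: mean-variance-skewness portfolio via two-stage PGP.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
R = rng.standard_t(df=5, size=(500, 4)) * 0.02 + 0.005  # fake fat-tailed returns

def moments(w):
    r = R @ w
    return r.mean(), r.var(), ((r - r.mean()) ** 3).mean()

cons = [{"type": "eq", "fun": lambda w: w.sum() - 1},          # fully invested
        {"type": "eq", "fun": lambda w: moments(w)[1] - 3e-4}] # fixed-variance scale
bnds = [(0, 1)] * 4                                            # no short sales
w0 = np.full(4, 0.25)

# Stage 1: best attainable mean and skewness, each optimized separately.
m_star = -minimize(lambda w: -moments(w)[0], w0, bounds=bnds, constraints=cons).fun
s_star = -minimize(lambda w: -moments(w)[2], w0, bounds=bnds, constraints=cons).fun

# Stage 2: PGP objective trading off deviations from both targets; the
# exponents p, q express the investor's preference for skewness.
p, q = 2, 2
def pgp(w):
    m, _, s = moments(w)
    d1, d3 = max(m_star - m, 0), max(s_star - s, 0)
    return (1 + d1) ** p + (1 + d3) ** q

w_opt = minimize(pgp, w0, bounds=bnds, constraints=cons).x
print(np.round(w_opt, 3), moments(w_opt))
```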
Abstract:
Dissertation presented at the Faculdade de Ciências e Tecnologia of the Universidade Nova de Lisboa to obtain the degree of Master in Informatics Engineering.
Abstract:
A stochastic programming approach is proposed in this paper for the development of offering strategies for a wind power producer. The optimization model analyzes several scenarios and simultaneously treats two kinds of uncertainty: wind power and electricity market prices. The approach allows evaluating alternative production and offering strategies to submit to the electricity market, with the ultimate goal of maximizing profit. An innovative comparative study is provided in which imbalances are treated differently, and an application to two new realistic case studies is presented. Finally, conclusions are drawn.
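A minimal scenario-based sketch of the offering problem: choose one hour's offer to maximize expected profit given wind and price scenarios with asymmetric imbalance prices. Written in Python with PuLP; the scenario data and imbalance treatment are hypothetical simplifications of the paper's model.

```python
# Sketch: single-hour wind offer under wind and price uncertainty.
from pulp import LpProblem, LpMaximize, LpVariable, lpSum

# scenario: (probability, wind production MWh, day-ahead price,
#            positive-imbalance price, negative-imbalance price)
scen = [(0.3, 40.0, 50.0, 40.0, 60.0),
        (0.5, 25.0, 55.0, 45.0, 65.0),
        (0.2, 10.0, 60.0, 50.0, 70.0)]

m = LpProblem("wind_offer", LpMaximize)
offer = LpVariable("offer", lowBound=0, upBound=50)          # MWh offered
surplus = [LpVariable(f"sur_{i}", lowBound=0) for i in range(len(scen))]
deficit = [LpVariable(f"def_{i}", lowBound=0) for i in range(len(scen))]

for i, (prob, wind, price, r_pos, r_neg) in enumerate(scen):
    # Imbalance definition per scenario: production - offer = surplus - deficit.
    m += wind - offer == surplus[i] - deficit[i]

# Expected profit: day-ahead revenue, surplus settled at r+, deficit bought at r-.
m += lpSum(prob * (price * offer + r_pos * surplus[i] - r_neg * deficit[i])
           for i, (prob, wind, price, r_pos, r_neg) in enumerate(scen))

m.solve()
print("optimal offer (MWh):", offer.value())
```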
Abstract:
Currently, the teaching-learning process in domains such as computer programming is characterized by extensive curricula and high student enrolment. This poses a great workload for the faculty and teaching assistants responsible for the creation, delivery, and assessment of student exercises. The main goal of this chapter is to foster practice-based learning in complex domains. This objective is attained with an e-learning framework, called Ensemble, as a conceptual tool to organize and facilitate technical interoperability among services. The Ensemble framework is applied to a specific domain: computer programming. Content issues are tackled with a standard format for describing programming exercises as learning objects. Communication is achieved by extending existing specifications for interoperation with several systems typically found in an e-learning environment. To evaluate the acceptability of the proposed solution, an Ensemble instance was validated in a classroom experiment, with encouraging results.
Abstract:
Dissertation for obtaining the degree of Master in Informatics Engineering.
Abstract:
The Intel® Xeon Phi™ is the first processor based on Intel's MIC (Many Integrated Cores) architecture. It is a co-processor specially tailored for data-parallel computations, whose basic architectural design is similar to that of GPUs (Graphics Processing Units), leveraging many integrated, computationally simple cores to perform parallel computations. The main novelty of the MIC architecture relative to GPUs is its compatibility with the Intel x86 architecture. This enables the use of many of the tools commonly available for the parallel programming of x86-based architectures, which may lead to a smaller learning curve. However, programming the Xeon Phi still entails aspects intrinsic to accelerator-based computing in general, and to the MIC architecture in particular. In this thesis we advocate the use of algorithmic skeletons for programming the Xeon Phi. Algorithmic skeletons abstract the complexity inherent in parallel programming, hiding details such as resource management, parallel decomposition, and inter-execution-flow communication, thus removing these concerns from the programmer's mind. In this context, the goal of the thesis is to lay the foundations for the development of a simple but powerful and efficient skeleton framework for programming the Xeon Phi processor. For this purpose we build upon Marrow, an existing framework for the orchestration of OpenCL™ computations in multi-GPU and CPU environments. We extend Marrow to execute both OpenCL and C++ parallel computations on the Xeon Phi. To evaluate the newly developed framework, several well-known benchmarks, such as Saxpy and N-Body, are used to compare its performance to that of the existing framework when executing on the co-processor, and to assess performance on the Xeon Phi versus a multi-GPU environment.
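To make the skeleton idea concrete, the sketch below shows a "map" skeleton that hides resource management and parallel decomposition behind a single call. It is plain Python for illustration only, not Marrow's C++/OpenCL API.

```python
# Sketch: a map skeleton that hides chunking, scheduling, and worker pools.
from concurrent.futures import ProcessPoolExecutor
from typing import Callable, Sequence

def map_skeleton(f: Callable, data: Sequence, workers: int = 4) -> list:
    """Apply f to every element in parallel; decomposition and resource
    management are handled here, so the caller never touches threads."""
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(f, data, chunksize=max(1, len(data) // workers)))

def saxpy_point(args):            # one element of Saxpy: y = a*x + y
    a, x, y = args
    return a * x + y

if __name__ == "__main__":
    a = 2.0
    xs, ys = range(1_000), range(1_000)
    out = map_skeleton(saxpy_point, [(a, x, y) for x, y in zip(xs, ys)])
    print(out[:5])
```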
Abstract:
The main objective of this master's thesis is to study robot programming using simulation software, and how to embed the simulation software into a company's own robot-controlling software. A further goal is to study a new communication interface to the assembly line's components, more precisely how to connect the robot cell to this new communication system. Conveyor lines that use the new communication standard are already available, but the robot cell is not yet capable of communicating with other devices using the new communication protocols. The main problem among robot manufacturers is that they all have their own communication systems and programming languages. There was no common language for programming the robots of different manufacturers until the RRS (Realistic Robot Simulation) standards were developed. RRS-II makes it possible to create robot programs in the simulation software and provides a common user interface for robots from different manufacturers. This thesis presents the RRS-II standard and the robot manufacturers' current support for it. It also shows how the simulation software can be embedded into a company's own robot-controlling software and how the robot cell can be connected to the CAMX (Computer Aided Manufacturing using XML) communication system.
Abstract:
The development of correct programs is a core problem in computer science. Although formal verification methods for establishing correctness with mathematical rigor are available, programmers often find them difficult to put into practice. One hurdle is deriving the loop invariants and proving that the code maintains them. So-called correct-by-construction methods aim to alleviate this issue by integrating verification into the programming workflow. Invariant-based programming is a practical correct-by-construction method in which the programmer first establishes the invariant structure and then incrementally extends the program, adding code and proving after each addition that the code is consistent with the invariants. In this way, the program is kept internally consistent throughout its development, and the construction of the correctness arguments (proofs) becomes an integral part of the programming workflow. A characteristic of the approach is that programs are described as invariant diagrams, a graphical notation similar to the state charts familiar to programmers.

Invariant-based programming is a new method that has not yet been evaluated in large-scale studies. The most important prerequisite for feasibility on a larger scale is a high degree of automation. The goal of the Socos project has been to build tools to assist the construction and verification of programs using the method. This thesis describes the implementation and evaluation of a prototype tool in the context of the Socos project. The tool supports the drawing of the diagrams, automatic derivation and discharging of verification conditions, and interactive proofs; it is used to develop programs that are correct by construction. The tool consists of a diagrammatic environment connected to a verification condition generator and an existing state-of-the-art theorem prover. Its core is a semantics for translating diagrams into verification conditions, which are sent to the underlying theorem prover. We describe a concrete method for 1) deriving sufficient conditions for total correctness of an invariant diagram; 2) sending the conditions to the theorem prover for simplification; and 3) reporting the results of the simplification to the programmer in a way that is consistent with the invariant-based programming workflow and that allows errors in the program specification to be efficiently detected. The tool uses an efficient automatic proof strategy to prove as many conditions as possible automatically and lets the remaining conditions be proved interactively. It is based on the verification system PVS and uses the SMT (Satisfiability Modulo Theories) solver Yices as a catch-all decision procedure; conditions that are not discharged automatically may be proved interactively using the PVS proof assistant.

The programming workflow is very similar to the process by which a mathematical theory is developed inside a computer-supported theorem prover environment such as PVS. The programmer reduces a large verification problem, with the aid of the tool, into a set of smaller problems (lemmas), and can substantially improve the degree of proof automation by developing specialized background theories and proof strategies to support the specification and verification of a specific class of programs. We demonstrate this workflow by describing in detail the construction of a verified sorting algorithm.

Tool-supported verification often has little to no presence in computer science (CS) curricula. Furthermore, program verification is frequently introduced as an advanced and purely theoretical topic that is not connected to the workflow taught in early, practically oriented programming courses. Our hypothesis is that verification could be introduced early in the CS education, and that verification tools could be used in the classroom to support the teaching of formal methods. A prototype of Socos has been used in a course at Åbo Akademi University targeted at first- and second-year undergraduate students. We evaluate the use of Socos in the course as part of a case study carried out in 2007.
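As a concrete taste of the invariant-first workflow, the sketch below writes down the invariant of a selection sort and checks it at every step with runtime assertions; Socos would instead discharge such obligations by proof. The code is an illustrative Python analogue, not Socos output.

```python
# Sketch: selection sort developed around an explicit loop invariant.
def invariant(a, i):
    # Prefix a[:i] is sorted, and every element of it is <= every later element.
    return (all(a[k] <= a[k + 1] for k in range(i - 1)) and
            all(a[p] <= a[q] for p in range(i) for q in range(i, len(a))))

def selection_sort(a):
    a = list(a)
    i = 0
    assert invariant(a, i)                 # invariant holds initially
    while i < len(a):
        j = min(range(i, len(a)), key=a.__getitem__)
        a[i], a[j] = a[j], a[i]            # move the minimum into place
        i += 1
        assert invariant(a, i)             # invariant maintained by the step
    assert a == sorted(a)                  # postcondition follows at exit
    return a

print(selection_sort([3, 1, 4, 1, 5, 9, 2, 6]))
```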