937 results for Quadratic Programming
Abstract:
Logistics distribution network design is one of the major decision problems arising in contemporary supply chain management. The decision involves many quantitative and qualitative factors that may be conflicting in nature. This paper applies an integrated multiple criteria decision making approach to design an optimal distribution network. In the approach, the analytic hierarchy process (AHP) is used first to determine the relative importance weightings or priorities of alternative warehouses with respect to both deliverer-oriented and customer-oriented criteria. Then, a goal programming (GP) model incorporating the system, resource, and AHP priority constraints is formulated to select the best set of warehouses without exceeding the limited available resources. In this paper, two commercial packages are used: Expert Choice for determining the AHP priorities of the warehouses, and LINDO for solving the GP model. © 2007 IEEE.
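As a rough illustration of the AHP step described above, the sketch below computes priority weights from a pairwise comparison matrix using the principal-eigenvector method; the matrix values are invented for the example and do not come from the paper.

```python
import numpy as np

# Hypothetical pairwise comparison matrix for three candidate warehouses
# (entry [i, j] says how strongly warehouse i is preferred over warehouse j
# on a 1-9 Saaty scale; the values here are illustrative only).
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# AHP priorities are the normalised principal eigenvector of A.
eigvals, eigvecs = np.linalg.eig(A)
principal = eigvecs[:, np.argmax(eigvals.real)].real
weights = principal / principal.sum()

# Consistency check (the random index for n = 3 is 0.58).
n = A.shape[0]
ci = (eigvals.real.max() - n) / (n - 1)
cr = ci / 0.58
print("priorities:", weights.round(3), "consistency ratio:", round(cr, 3))
```

The resulting weights would then enter a GP model as priority coefficients on the warehouse selection variables.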
Abstract:
This paper formulates a logistics distribution problem as the multi-depot travelling salesman problem (MDTSP). The decision makers not only have to determine the travelling sequence of the salesman for delivering finished products from a warehouse or depot to a customer, but also need to determine which depot stores which type of products so that the total travelling distance is minimised. The MDTSP is similar to the combination of the travelling salesman and quadratic assignment problems. In this paper, the two individual hard problems or models are formulated first. Then, the problems are integrated together, that is, the MDTSP. The MDTSP is constructed as both integer nonlinear and linear programming models. After formulating the models, we verify the integrated models using commercial packages, and most importantly, investigate whether an iterative approach, that is, solving the individual models repeatedly, can generate an optimal solution to the MDTSP. Copyright © 2006 Inderscience Enterprises Ltd.
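To make the integrated problem concrete, here is a toy brute-force sketch (not the paper's models): it enumerates assignments of product types to two depots and, for each assignment, enumerates visiting orders, keeping the combination with the smallest total travel distance. All coordinates and names are invented.

```python
from itertools import permutations, product

# Invented instance: two depots and four customers, each customer
# demanding one product type ("a" or "b").
depots = {"D1": (0, 0), "D2": (10, 0)}
customers = {"C1": ((2, 3), "a"), "C2": ((8, 1), "a"),
             "C3": ((3, 7), "b"), "C4": ((9, 6), "b")}

def dist(p, q):
    return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5

def tour_length(depot, stops):
    """Length of the round trip depot -> stops... -> depot."""
    pts = [depots[depot]] + [customers[c][0] for c in stops] + [depots[depot]]
    return sum(dist(pts[i], pts[i + 1]) for i in range(len(pts) - 1))

best = None
# Outer loop: which depot stores which product type (the assignment part).
for assign in product(depots, repeat=2):
    storage = {"a": assign[0], "b": assign[1]}
    total = 0.0
    # Inner loop: best visiting order per depot (the routing part).
    for depot in depots:
        served = [c for c, (_, t) in customers.items() if storage[t] == depot]
        if served:
            total += min(tour_length(depot, p) for p in permutations(served))
    if best is None or total < best[0]:
        best = (total, storage)

print("best assignment:", best[1], "total distance:", round(best[0], 2))
```

The nesting mirrors the coupling in the MDTSP: the routing cost can only be evaluated once the assignment is fixed, which is why solving the two subproblems in isolation need not give the integrated optimum.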
Abstract:
For a query submitted to multiple search engines, finding the relevant results is an important task. This paper formulates the problem of aggregating and ranking the results of multiple search engines as a minimax linear programming model. Beyond the novel application, this study detects the most relevant information among the returned set of ranked lists of documents retrieved by distinct search engines. Furthermore, two numerical examples are used to illustrate the usefulness of the proposed approach.
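The abstract gives no model details, so the sketch below is only one plausible reading of a minimax formulation: it finds consensus scores whose worst-case total disagreement with any single engine's ranking is minimised, linearising the absolute deviations in the usual way. The rank data are invented.

```python
import numpy as np
from scipy.optimize import linprog

# Invented ranks: 3 engines x 4 documents (1 = best).
R = np.array([[1, 2, 3, 4],
              [2, 1, 4, 3],
              [1, 3, 2, 4]], dtype=float)
m, n = R.shape

# Variables: consensus scores s (n), deviations d (m*n), worst case t (1).
# Minimise t subject to d[i,j] >= |s[j] - R[i,j]| and sum_j d[i,j] <= t.
num = n + m * n + 1
c = np.zeros(num); c[-1] = 1.0
A_ub, b_ub = [], []
for i in range(m):
    for j in range(n):
        row = np.zeros(num); row[j] = 1.0; row[n + i * n + j] = -1.0
        A_ub.append(row); b_ub.append(R[i, j])      #  s_j - d_ij <= R_ij
        row = np.zeros(num); row[j] = -1.0; row[n + i * n + j] = -1.0
        A_ub.append(row); b_ub.append(-R[i, j])     # -s_j - d_ij <= -R_ij
    row = np.zeros(num); row[n + i * n: n + (i + 1) * n] = 1.0; row[-1] = -1.0
    A_ub.append(row); b_ub.append(0.0)              # sum_j d_ij <= t
res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
              bounds=[(1, n)] * n + [(0, None)] * (m * n + 1))
print("consensus order (doc ids, best first):", np.argsort(res.x[:n]) + 1)
```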
Abstract:
A graphical process control language has been developed as a means of defining process control software. The user configures a block diagram describing the required control system, from a menu of functional blocks, using a graphics software system with a graphics terminal. Additions may be made to the menu of functional blocks, to extend the system capability, and a group of blocks may be defined as a composite block. This latter feature provides for segmentation of the overall system diagram and the repeated use of the same group of blocks within the system. The completed diagram is analysed by a graphics compiler which generates the programs and data structure to realise the run-time software. The run-time software has been designed as a data-driven system which allows for modifications at the run-time level in both parameters and system configuration. Data structures have been specified to ensure efficient execution and minimal storage requirements in the final control software. Machine independence has been accommodated as far as possible by using CORAL 66 as the high-level language throughout the entire system, the final run-time code being generated by a CORAL 66 compiler appropriate to the target processor.
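As a loose modern analogue of the data-driven run-time described above (not the CORAL 66 system itself), the sketch below drives a control scan from a block table held as plain data, so both parameters and the wiring between blocks can be changed at run time without recompiling.

```python
# Each block is a row of data: name, function, input block names, parameters.
# Editing this table at run time reconfigures the "control system".
def gain(inputs, params):
    return params["k"] * inputs[0]

def summer(inputs, params):
    return sum(inputs)

diagram = [
    {"name": "sp",  "fn": lambda i, p: p["value"], "inputs": [], "params": {"value": 10.0}},
    {"name": "err", "fn": summer, "inputs": ["sp", "pv"], "params": {}},
    {"name": "out", "fn": gain, "inputs": ["err"], "params": {"k": 0.5}},
]

def run_cycle(diagram, signals):
    """One scan: evaluate each block in table order, driven by the data."""
    for block in diagram:
        ins = [signals[name] for name in block["inputs"]]
        signals[block["name"]] = block["fn"](ins, block["params"])
    return signals

signals = {"pv": -4.0}  # measured process value (sign-flipped for the summer)
print(run_cycle(diagram, signals))
```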
The effective use of implicit parallelism through the use of an object-oriented programming language
Abstract:
This thesis explores translating well-written sequential programs in a subset of the Eiffel programming language - without syntactic or semantic extensions - into parallelised programs for execution on a distributed architecture. The main focus is on constructing two object-oriented models: a theoretical self-contained model of concurrency which enables a simplified second model for implementing the compiling process. There is a further presentation of principles that, if followed, maximise the potential levels of parallelism.

Model of Concurrency. The concurrency model is designed to be a straightforward target onto which sequential programs can be mapped, thus making them parallel. It aids the compilation process by providing a high level of abstraction, including a useful model of parallel behaviour which enables easy incorporation of message interchange, locking, and synchronisation of objects. Further, the model is sufficiently complete that a compiler can be, and has been, practically built.

Model of Compilation. The compilation model's structure is based upon an object-oriented view of grammar descriptions and capitalises on both a recursive-descent style of processing and abstract syntax trees to perform the parsing. A composite-object view with an attribute-grammar style of processing is used to extract sufficient semantic information for the parallelisation (i.e. code-generation) phase.

Programming Principles. The set of principles presented is based upon information hiding, sharing and containment of objects, and the dividing up of methods on the basis of a command/query division. When followed, the level of potential parallelism within the presented concurrency model is maximised. Further, these principles naturally arise from good programming practice.

Summary. In summary, this thesis shows that it is possible to compile well-written programs, written in a subset of Eiffel, into parallel programs without any syntactic additions or semantic alterations to Eiffel: i.e. no parallel primitives are added, and the parallel program is modelled to execute with equivalent semantics to the sequential version. If the programming principles are followed, a parallelised program achieves the maximum level of potential parallelisation within the concurrency model.
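The command/query division mentioned above is a general object-oriented principle; the toy class below (not taken from the thesis) illustrates why it matters for parallelisation: queries return values without side effects, so they can safely run concurrently, while commands mutate state and must be synchronised.

```python
import threading

class Account:
    """Commands mutate state under a lock; queries are side-effect free."""

    def __init__(self, balance=0):
        self._balance = balance
        self._lock = threading.Lock()

    # Command: changes state, returns nothing, needs synchronisation.
    def deposit(self, amount):
        with self._lock:
            self._balance += amount

    # Query: pure read, no state change, safe to run concurrently.
    def balance(self):
        return self._balance

acct = Account()
threads = [threading.Thread(target=acct.deposit, args=(10,)) for _ in range(5)]
for t in threads: t.start()
for t in threads: t.join()
print(acct.balance())  # 50
```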
Using interior point algorithms for the solution of linear programs with special structural features
Abstract:
Linear Programming (LP) is a powerful decision-making tool extensively used in various economic and engineering activities. In the early stages the success of LP was mainly due to the efficiency of the simplex method. After the appearance of Karmarkar's paper, the focus of most research shifted to the field of interior point methods. The present work is concerned with investigating and efficiently implementing the latest techniques in this field, taking sparsity into account. The performance of these implementations on different classes of LP problems is reported here. The preconditioned conjugate gradient method is one of the most powerful tools for the solution of the least-squares problem present in every iteration of all interior point methods. The effect of using different preconditioners on a range of problems with various condition numbers is presented. Decomposition algorithms have been one of the main fields of research in linear programming over the last few years. After reviewing the latest decomposition techniques, three promising methods were chosen and implemented. Sparsity is again a consideration, and suggestions have been included to allow improvements when solving problems with these methods. Finally, experimental results on randomly generated data are reported and compared with an interior point method. The efficient implementation of the decomposition methods considered in this study requires the solution of quadratic subproblems. A review of recent work on algorithms for convex quadratic programming was performed. The most promising algorithms are discussed and implemented, taking sparsity into account. The relative performance of these algorithms on randomly generated separable and non-separable problems is also reported.
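For reference, here is a minimal dense implementation of the preconditioned conjugate gradient iteration mentioned above, using a simple Jacobi (diagonal) preconditioner on an invented test system; a production interior point code would instead exploit sparsity and a stronger preconditioner.

```python
import numpy as np

def pcg(A, b, M_inv, tol=1e-10, max_iter=200):
    """Solve A x = b for symmetric positive definite A,
    with the preconditioner applied as r -> M_inv(r)."""
    x = np.zeros_like(b)
    r = b - A @ x
    z = M_inv(r)
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

# Small SPD test system with a Jacobi preconditioner.
rng = np.random.default_rng(0)
B = rng.standard_normal((50, 50))
A = B @ B.T + 50 * np.eye(50)
b = rng.standard_normal(50)
d = np.diag(A)
x = pcg(A, b, lambda r: r / d)
print("residual norm:", np.linalg.norm(A @ x - b))
```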
Abstract:
Software development methodologies are becoming increasingly abstract, progressing from low level assembly and implementation languages such as C and Ada, to component based approaches that can be used to assemble applications using technologies such as JavaBeans and the .NET framework. Meanwhile, model driven approaches emphasise the role of higher level models and notations, and embody a process of automatically deriving lower level representations and concrete software implementations.

The relationship between data and software is also evolving. Modern data formats are becoming increasingly standardised, open and empowered in order to support a growing need to share data in both academia and industry. Many contemporary data formats, most notably those based on XML, are self-describing, able to specify valid data structure and content, and can also describe data manipulations and transformations. Furthermore, while applications of the past have made extensive use of data, the runtime behaviour of future applications may be driven by data, as demonstrated by the field of dynamic data driven application systems.

The combination of empowered data formats and high level software development methodologies forms the basis of modern game development technologies, which drive software capabilities and runtime behaviour using empowered data formats describing game content. While low level libraries provide optimised runtime execution, content data is used to drive a wide variety of interactive and immersive experiences.

This thesis describes the Fluid project, which combines component based software development and game development technologies in order to define novel component technologies for the description of data driven component based applications. The thesis makes explicit contributions to the fields of component based software development and visualisation of spatiotemporal scenes, and also describes potential implications for game development technologies. The thesis also proposes a number of developments in dynamic data driven application systems in order to further empower the role of data in this field.
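As a generic illustration of the data-driven, component-based idea (not the Fluid project's actual format), the sketch below reads a small invented XML description and instantiates and configures components from it, so application behaviour is dictated by content data rather than by code.

```python
import xml.etree.ElementTree as ET

# Invented content format: components and their parameters, declared as data.
SCENE = """
<app>
  <component type="Spinner" rate="2.0"/>
  <component type="Label" text="hello"/>
</app>
"""

class Spinner:
    def __init__(self, rate): self.rate = float(rate)
    def update(self): print(f"spinning at {self.rate} rad/s")

class Label:
    def __init__(self, text): self.text = text
    def update(self): print(f"label: {self.text}")

REGISTRY = {"Spinner": Spinner, "Label": Label}

def load(xml_text):
    """Instantiate registered component classes from the XML description."""
    root = ET.fromstring(xml_text)
    comps = []
    for node in root.findall("component"):
        attrs = dict(node.attrib)
        cls = REGISTRY[attrs.pop("type")]
        comps.append(cls(**attrs))
    return comps

for comp in load(SCENE):
    comp.update()
```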
Abstract:
This thesis addresses the problem of offline identification of salient patterns in genetic programming individuals. It discusses the main issues related to automatic pattern identification systems, namely that these (a) should help in understanding the final solutions of the evolutionary run, (b) should give insight into the course of evolution, and (c) should be helpful in optimizing future runs. Moreover, it proposes an algorithm, the Extended Pattern Growing Algorithm ([E]PGA), to extract, filter and sort the identified patterns so that these fulfil as many as possible of the following criteria: (a) they are representative of the evolutionary run and/or search space, (b) they are human-friendly, and (c) their numbers are within reasonable limits. The results are demonstrated on six problems from different domains.
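The abstract does not describe [E]PGA's internals, so the sketch below shows only the general idea behind mining salient patterns from genetic programming individuals: it counts repeated subtrees across a population of expression trees and reports the most frequent ones. The trees are invented.

```python
from collections import Counter

# Expression trees as nested tuples: (operator, child, child) or a leaf.
population = [
    ("+", ("*", "x", "x"), ("*", 2, "x")),
    ("-", ("*", "x", "x"), 1),
    ("+", ("*", "x", "x"), "x"),
]

def subtrees(tree):
    """Yield every subtree of a tree, including the tree itself."""
    yield tree
    if isinstance(tree, tuple):
        for child in tree[1:]:
            yield from subtrees(child)

counts = Counter(s for ind in population for s in subtrees(ind)
                 if isinstance(s, tuple))      # ignore bare leaves
for pattern, freq in counts.most_common(3):
    print(freq, pattern)
```

Here the subtree ("*", "x", "x") appears in all three individuals, making it the kind of recurring pattern such a system would surface.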
Abstract:
The aim of this thesis is to examine the specific contextual factors affecting the applicability and development of the planning, programming, budgeting system (P.P.B.S.) as a systems approach to public sector budgeting. The concept of P.P.B.S. as a systems approach to public sector budgeting will first be developed, and the preliminary hypothesis that general contextual factors may be classified under political, structural and cognitive headings will be put forward. This preliminary hypothesis will be developed and refined using American and early British experience. The refined hypothesis will then be tested in detail in the case of the English health and personal social services (H.P.S.S.). The reasons for this focus are that it is the most recent, the sole remaining, and the most significant example in British central government outside of defence, and is fairly representative of non-defence government programme areas. The method of data collection relies on the examination of unpublished and difficult-to-obtain central government, health and local authority documents, and on interviews with senior civil servants and public officials. The conclusion will be that the political constraints on, or factors affecting, P.P.B.S. vary with product characteristics and cultural imperatives on pluralistic decision-making; that structural constraints vary with the degree of coincidence of programme and organisation structure and with the degree of controllability of the organisation; and finally, that cognitive constraints vary according to product characteristics, organisational responsibilities, and analytical effort.
Abstract:
This thesis describes the development and use of a goal programming methodology for the evaluation of public housing strategies in Mexico City. The methodology responds to the need to incorporate the location, size and densities of housing projects on the one hand, and "external" constraints such as the ability of low income families to pay for housing, and the amounts of capital and land available, on the other. The provision of low cost housing by public housing agencies in Mexico City is becoming increasingly difficult because there are so many constraints to be met and overcome, the most important of which is the ability of families to pay for housing. Other important limiting factors are the availability of capital and of land plots of the right size in desired locations. The location of public housing projects is significant because it determines the cost and pattern of work trips, which in a metropolitan area such as Mexico City are of considerable importance to both planners and potential house owners. In addition, since the price of land is closely related to its location, the latter factor is also significant in determining the price of the total housing package. Consequently there is a major trade-off between a housing strategy based on the provision of housing at locations close to employment, and the opposite one based on the provision of housing at locations where employment accessibility is poorer but housing can be provided at a lower price. The goal programming evaluation methodology presented in this thesis was developed to aid housing planners in evaluating housing strategies which incorporate the issues raised above.
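As a generic illustration of the goal programming technique (not the thesis's actual model), the sketch below trades off two invented targets, a cost budget and an accessibility goal, by minimising weighted deviational variables in a linear program.

```python
import numpy as np
from scipy.optimize import linprog

# Decision variables: dwellings built at two invented sites, x1 and x2.
# Goal 1 (invented): total cost 5*x1 + 8*x2 should stay near a budget of 400.
# Goal 2 (invented): accessibility score 3*x1 + x2 should reach 120.
# Each goal gets under/over deviational variables (n_i, p_i); we penalise
# only the unwanted directions: cost overrun p1 and accessibility shortfall n2.
# Variable order: [x1, x2, n1, p1, n2, p2]
c = np.array([0, 0, 0, 1.0, 2.0, 0])  # shortfall weighted twice as heavily

A_eq = np.array([
    [5, 8, 1, -1, 0,  0],   # cost + n1 - p1 = 400
    [3, 1, 0,  0, 1, -1],   # access + n2 - p2 = 120
])
b_eq = np.array([400.0, 120.0])

res = linprog(c, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, 60)] * 2 + [(0, None)] * 4)
x1, x2, n1, p1, n2, p2 = res.x
print(f"x1={x1:.1f} x2={x2:.1f} cost overrun={p1:.1f} access shortfall={n2:.1f}")
```

The deviational variables are what distinguish goal programming from ordinary LP: constraints become soft targets, and the objective ranks how painful it is to miss each one.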
Abstract:
The 2011 National Student Survey (NSS) revealed that 40% of full-time students in England did not think that the feedback on their work had been helpful, even though 66% of these students agreed that the feedback was detailed and 62% of them agreed that the feedback had been prompt. Detailed feedback that students do not consider helpful wastes tutors' time while students continue to struggle with their learning. What do students consider as helpful feedback? What are the qualities of helpful feedback? What are the preferred forms of feedback? How should tutors write feedback so that students will find it helpful? Can ICT help to improve the quality of feedback? In our ongoing search for answers to these questions, we have trialled the use of a novel Internet application, called eCAF, to assess programming coursework from Engineering, Mathematics and Computing students, and have collected their views on the feedback received through a survey. The survey reveals that most students prefer electronic feedback as given through eCAF, with verbal feedback ranked second and hand-written feedback ranked even lower. The survey also indicates that the feedback from some tutors is considered more helpful than that from others. We report on the detailed findings of the survey. By comparing the kinds of feedback given by each tutor who took part in the trial, we explore ways to improve the helpfulness of feedback on programming coursework in a bid to promote learning amongst engineering students.