885 results for Object-oriented programming
Abstract:
Length and time scales vary from planetary scale and millions of years for convection problems to 100 km and 10 years for fault system simulations. Various techniques are in use to deal with the time dependency (e.g. Crank-Nicolson), with the non-linearity (e.g. Newton-Raphson) and with weakly coupled equations (e.g. non-linear Gauss-Seidel). Besides these high-level solution algorithms, discretization methods (e.g. the finite element method (FEM) and the boundary element method (BEM)) are used to deal with spatial derivatives. Typically, large-scale, three-dimensional meshes are required to resolve geometrical complexity (e.g. in the case of fault systems) or features in the solution (e.g. in mantle convection simulations). The modelling environment escript allows the rapid implementation of new physics as required for the development of simulation codes in the earth sciences. Its main objective is to provide a programming language in which the user can define new models and rapidly develop high-level solution algorithms. The current implementation is linked with the finite element package finley as a PDE solver. However, the design is open, and other discretization technologies such as finite differences and boundary element methods could be included. escript is implemented as an extension of the interactive programming environment python (see www.python.org). Key concepts introduced are Data objects, which hold values on nodes or elements of the finite element mesh, and linearPDE objects, which define linear partial differential equations to be solved by the underlying discretization technology. In this paper we present the basic concepts of escript and show how it is used to implement a simulation code for interacting fault systems. We also show some results of large-scale, parallel simulations on an SGI Altix system. Acknowledgements: Project work is supported by the Australian Commonwealth Government through the Australian Computational Earth Systems Simulator Major National Research Facility, the Queensland State Government Smart State Research Facility Fund, The University of Queensland and SGI.
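By way of illustration, here is a minimal sketch of the Data/linearPDE pairing described above, written against the esys-escript Python API; the module paths and call signatures are assumptions drawn from general knowledge of escript, not from this abstract.

```python
# Minimal sketch, assuming the esys-escript API: solve -div(A grad u) = Y
# on a unit square, with finley (the PDE solver named above) as the backend.
from esys.escript import Scalar, Function, kronecker
from esys.escript.linearPDEs import LinearPDE
from esys.finley import Rectangle

domain = Rectangle(n0=40, n1=40, l0=1., l1=1.)   # finley finite element mesh
source = Scalar(1., Function(domain))            # Data object: values on elements

pde = LinearPDE(domain)                          # linearPDE object
pde.setValue(A=kronecker(domain), Y=source)      # coefficients of the linear PDE
u = pde.getSolution()                            # handed to finley for solution
```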
Abstract:
Logistics distribution network design is one of the major decision problems arising in contemporary supply chain management. The decision involves many quantitative and qualitative factors that may be conflicting in nature. This paper applies an integrated multiple criteria decision making approach to design an optimal distribution network. In the approach, the analytic hierarchy process (AHP) is used first to determine the relative importance weightings, or priorities, of alternative warehouses with respect to both deliverer-oriented and customer-oriented criteria. Then, the goal programming (GP) model, incorporating system, resource, and AHP priority constraints, is formulated to select the best set of warehouses without exceeding the limited available resources. In this paper, two commercial packages are used: Expert Choice for determining the AHP priorities of the warehouses, and LINDO for solving the GP model. © 2007 IEEE.
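The AHP step of the approach can be sketched concretely (the three candidate warehouses and the pairwise comparison values below are hypothetical, not taken from the paper): priorities are the normalised principal eigenvector of the comparison matrix, which is what Expert Choice computes.

```python
# Hypothetical AHP priority computation for three candidate warehouses.
import numpy as np

# Pairwise comparisons on Saaty's 1-9 scale (invented for illustration).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

vals, vecs = np.linalg.eig(A)
k = np.argmax(vals.real)              # principal eigenvalue lambda_max
w = np.abs(vecs[:, k].real)
w /= w.sum()                          # normalised priority weights
print(w)                              # roughly [0.65, 0.23, 0.12]

# Consistency check: CI = (lambda_max - n) / (n - 1), compared with the
# random index RI (0.58 for n = 3); CR < 0.1 is conventionally acceptable.
n = A.shape[0]
CR = ((vals.real[k] - n) / (n - 1)) / 0.58
print("CR =", CR)
```

These weights would then enter the GP model as priority coefficients; solving the GP itself is left to a solver such as LINDO, as in the paper.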
Abstract:
Finding relevant results for a query submitted to multiple search engines is an important task. This paper formulates the problem of aggregating and ranking multiple search engines' results as a minimax linear programming model. Besides the novel application, this study detects the most relevant information among a returned set of ranked lists of documents retrieved by distinct search engines. Furthermore, two numerical examples are used to illustrate the usefulness of the proposed approach.
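Since the abstract does not reproduce the model itself, the following is only a hedged toy version of a minimax LP for rank aggregation (the rank data and the deviation-based objective are assumptions): minimise the largest disagreement t between an aggregate position x_d and any engine's position for document d.

```python
# Toy minimax rank aggregation: t >= |x_d - r_kd| for every engine k and
# document d, and we minimise t. Solved as an LP with scipy.
import numpy as np
from scipy.optimize import linprog

# ranks[k, d] = position of document d in engine k's list (1 = best); invented data.
ranks = np.array([[1, 2, 3, 4],
                  [2, 1, 4, 3],
                  [1, 3, 2, 4]])
K, n = ranks.shape

c = np.zeros(n + 1)
c[-1] = 1.0                           # decision vector [x_1..x_n, t]; minimise t

A_ub, b_ub = [], []
for k in range(K):
    for d in range(n):
        row = np.zeros(n + 1); row[d], row[-1] = 1.0, -1.0
        A_ub.append(row); b_ub.append(ranks[k, d])     #  x_d - t <= r_kd
        row = np.zeros(n + 1); row[d], row[-1] = -1.0, -1.0
        A_ub.append(row); b_ub.append(-ranks[k, d])    # -x_d - t <= -r_kd, i.e. t >= r_kd - x_d

res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
              bounds=[(None, None)] * (n + 1))
print("aggregate positions:", res.x[:n], "worst deviation:", res.x[-1])
```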
Abstract:
A graphical process control language has been developed as a means of defining process control software. The user configures a block diagram describing the required control system, from a menu of functional blocks, using a graphics software system with a graphics terminal. Additions may be made to the menu of functional blocks, to extend the system capability, and a group of blocks may be defined as a composite block. This latter feature provides for segmentation of the overall system diagram and the repeated use of the same group of blocks within the system. The completed diagram is analyzed by a graphics compiler which generates the programs and data structure to realise the run-time software. The run-time software has been designed as a data-driven system which allows for modifications at the run-time level in both parameters and system configuration. Data structures have been specified to ensure efficient execution and minimal storage requirements in the final control software. Machine independence has been accommodated as far as possible using CORAL 66 as the high level language throughout the entire system; the final run-time code being generated by a CORAL 66 compiler appropriate to the target processor.
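A speculative sketch of the data-driven run-time described above (the block types and names are invented, and Python stands in for the CORAL 66 of the original system): functional blocks are entries in a data structure, and an executive scan walks that structure, so parameters and wiring can be changed at run time without regenerating code.

```python
# Data-driven block executor: the "program" is the diagram data structure.
from dataclasses import dataclass, field

@dataclass
class Block:
    func: object                  # the block's transfer function (a callable)
    inputs: list                  # names of the signals this block reads
    output: str                   # name of the signal it writes
    params: dict = field(default_factory=dict)

def scan(diagram, signals):
    """One executive scan: evaluate every block against the signal table."""
    for b in diagram:
        args = [signals[name] for name in b.inputs]
        signals[b.output] = b.func(*args, **b.params)

# Hypothetical two-block diagram: a gain feeding a first-order lag.
diagram = [
    Block(lambda u, k: k * u, ["setpoint"], "scaled", {"k": 2.0}),
    Block(lambda u, y, a: y + a * (u - y), ["scaled", "lagged"], "lagged", {"a": 0.1}),
]
signals = {"setpoint": 1.0, "lagged": 0.0}
for _ in range(3):
    scan(diagram, signals)        # run-time changes need only edit the data
print(signals["lagged"])
```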
The effective use of implicit parallelism through the use of an object-oriented programming language
Abstract:
This thesis explores translating well-written sequential programs in a subset of the Eiffel programming language, without syntactic or semantic extensions, into parallelised programs for execution on a distributed architecture. The main focus is on constructing two object-oriented models: a theoretical, self-contained model of concurrency, which enables a simplified second model for implementing the compiling process. There is a further presentation of principles that, if followed, maximise the potential levels of parallelism.

Model of Concurrency. The concurrency model is designed to be a straightforward target onto which sequential programs can be mapped, thus making them parallel. It aids the compilation process by providing a high level of abstraction, including a useful model of parallel behaviour which enables easy incorporation of message interchange, locking, and synchronisation of objects. Further, the model is complete enough that a compiler can be, and has been, practically built.

Model of Compilation. The compilation model's structure is based upon an object-oriented view of grammar descriptions and capitalises on both a recursive-descent style of processing and abstract syntax trees to perform the parsing. A composite-object view with an attribute-grammar style of processing is used to extract sufficient semantic information for the parallelisation (i.e. code-generation) phase.

Programming Principles. The set of principles presented is based upon information hiding, sharing and containment of objects, and the dividing up of methods on the basis of a command/query division (illustrated in the sketch below). When followed, these principles maximise the level of potential parallelism within the presented concurrency model. Further, they arise naturally from good programming practice.

Summary. In summary, this thesis shows that it is possible to compile well-written programs, written in a subset of Eiffel, into parallel programs without any syntactic additions or semantic alterations to Eiffel: i.e. no parallel primitives are added, and the parallel program is modelled to execute with semantics equivalent to those of the sequential version. If the programming principles are followed, a parallelised program achieves the maximum level of potential parallelisation within the concurrency model.
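A hedged illustration of that command/query division (Python standing in for Eiffel, with an invented class): commands change state and return nothing, so a compiler targeting the concurrency model could dispatch them as asynchronous messages, while queries return values and force the caller to synchronise on the reply.

```python
# Command/query division, the shape the programming principles call for.
class Account:
    def __init__(self):
        self._balance = 0

    # Command: mutates state, returns nothing -> may run asynchronously.
    def deposit(self, amount):
        self._balance += amount

    # Query: returns a value, no side effects -> caller waits for the reply.
    def balance(self):
        return self._balance
```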
Abstract:
Software development methodologies are becoming increasingly abstract, progressing from low-level assembly and implementation languages such as C and Ada, to component-based approaches that can be used to assemble applications using technologies such as JavaBeans and the .NET framework. Meanwhile, model-driven approaches emphasise the role of higher-level models and notations, and embody a process of automatically deriving lower-level representations and concrete software implementations. The relationship between data and software is also evolving. Modern data formats are becoming increasingly standardised, open and empowered in order to support a growing need to share data in both academia and industry. Many contemporary data formats, most notably those based on XML, are self-describing, able to specify valid data structure and content, and can also describe data manipulations and transformations. Furthermore, while applications of the past have made extensive use of data, the runtime behaviour of future applications may be driven by data, as demonstrated by the field of dynamic data driven application systems. The combination of empowered data formats and high-level software development methodologies forms the basis of modern game development technologies, which drive software capabilities and runtime behaviour using empowered data formats describing game content. While low-level libraries provide optimised runtime execution, content data is used to drive a wide variety of interactive and immersive experiences. This thesis describes the Fluid project, which combines component-based software development and game development technologies in order to define novel component technologies for the description of data-driven component-based applications. The thesis makes explicit contributions to the fields of component-based software development and visualisation of spatiotemporal scenes, and also describes potential implications for game development technologies. The thesis also proposes a number of developments in dynamic data driven application systems in order to further empower the role of data in this field.
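As a speculative sketch of data-driven assembly (the XML vocabulary below is invented and is not the Fluid project's actual format): runtime behaviour is selected by a self-describing document rather than by compiled control flow.

```python
# Behaviour driven by content data: the XML names the components to run.
import importlib
import xml.etree.ElementTree as ET

doc = ET.fromstring(
    "<application>"
    "  <component module='math' factory='sqrt' input='16'/>"
    "</application>"
)

for c in doc.iter("component"):
    mod = importlib.import_module(c.get("module"))     # component located by name
    factory = getattr(mod, c.get("factory"))
    print(factory(float(c.get("input"))))              # prints 4.0; behaviour lives in the data
```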
Abstract:
This thesis addresses the problem of offline identification of salient patterns in genetic programming individuals. It discusses the main issues related to automatic pattern identification systems, namely that these (a) should help in understanding the final solutions of the evolutionary run, (b) should give insight into the course of evolution and (c) should be helpful in optimizing future runs. Moreover, it proposes an algorithm, the Extended Pattern Growing Algorithm ([E]PGA), to extract, filter and sort the identified patterns so that they fulfill as many as possible of the following criteria: (a) they are representative of the evolutionary run and/or search space, (b) they are human-friendly and (c) their numbers are within reasonable limits. The results are demonstrated on six problems from different domains.
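By way of illustration only (this is a toy, not [E]PGA): counting repeated subtrees across genetic-programming individuals gives the kind of raw pattern census that an identification system would then filter and sort against the criteria above.

```python
# Toy subtree census over GP individuals represented as nested tuples.
from collections import Counter

population = [
    ("+", ("*", "x", "x"), "1"),                 # x*x + 1
    ("-", ("*", "x", "x"), ("+", "x", "1")),     # x*x - (x + 1)
]

def subtrees(tree):
    yield tree
    if isinstance(tree, tuple):
        for child in tree[1:]:                   # tree[0] is the operator
            yield from subtrees(child)

census = Counter(s for ind in population for s in subtrees(ind))
for pattern, count in census.most_common(3):
    print(count, pattern)                        # ('*', 'x', 'x') occurs twice
```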
Abstract:
The aim of this thesis is to examine the specific contextual factors affecting the applicability and development of the planning, programming, budgeting system (P.P.B.S.) as a systems approach to public sector budgeting. The concept of P.P.B.S. as a systems approach to public sector budgeting will first be developed, and the preliminary hypothesis that general contextual factors may be classified under political, structural and cognitive headings will be put forward. This preliminary hypothesis will be developed and refined using American and early British experience. The refined hypothesis will then be tested in detail in the case of the English health and personal social services (H.P.S.S.). The reasons for this focus are that it is the most recent, the sole remaining, and the most significant example in British central government outside of defence, and is fairly representative of non-defence government programme areas. The method of data collection relies on the examination of unpublished and difficult-to-obtain central government, health and local authority documents, and on interviews with senior civil servants and public officials. The conclusion will be that the political constraints on, or factors affecting, P.P.B.S. vary with product characteristics and cultural imperatives on pluralistic decision-making; that structural constraints vary with the degree of coincidence of programme and organisation structure and with the degree of controllability of the organisation; and finally, that cognitive constraints vary according to product characteristics, organisational responsibilities, and analytical effort.
Abstract:
This thesis describes the development and use of a goal programming methodology for the evaluation of public housing strategies in Mexico City. The methodology responds to the need to incorporate the location, size and densities of housing projects on the one hand, and "external" constraints such as the ability of low income families to pay for housing, and the amounts of capital and land available, on the other. The provision of low cost housing by public housing agencies in Mexico City is becoming increasingly difficult because there are so many constraints to be met and overcome, the most important of which is the ability of families to pay for housing. Other important limiting factors are the availability of capital and of land plots of the right size in desired locations. The location of public housing projects is significant because it determines the cost and pattern of work trips, which in a metropolitan area such as Mexico City are of considerable importance to both planners and potential house owners. In addition, since the price of land is closely related to its location, this last factor is also significant in determining the price of the total housing package. Consequently there is a major trade-off between a housing strategy based on the provision of housing at locations close to employment, and the opposite one based on the provision of housing at locations where employment accessibility is poorer but housing can be provided at a lower price. The goal programming evaluation methodology presented in this thesis was developed to aid housing planners in evaluating housing strategies which incorporate the issues raised above.
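Since the abstract does not give the model itself, the following is a hedged toy of the goal-programming idea (the two sites, costs, targets and weights are invented): deviation variables measure over- and under-achievement of a capital goal and an accessibility goal, and a weighted sum of the unwanted deviations is minimised under a hard unit-count requirement.

```python
# Toy weighted goal programme: units x1 at a cheap, remote site and x2 at a
# dear, accessible site. Decision vector z = [x1, x2, d1-, d1+, d2-, d2+].
from scipy.optimize import linprog

c = [0, 0, 0, 2.0, 1.0, 0]          # penalise capital overrun d1+ and accessibility shortfall d2-
A_eq = [[3, 5, 1, -1, 0,  0],       # 3*x1 + 5*x2 + d1- - d1+ = 300   (capital goal)
        [0, 1, 0,  0, 1, -1],       # x2 + d2- - d2+ = 40             (accessibility goal)
        [1, 1, 0,  0, 0,  0]]       # x1 + x2 = 80                    (hard requirement)
b_eq = [300, 40, 80]

res = linprog(c, A_eq=A_eq, b_eq=b_eq)          # default bounds keep all variables >= 0
x1, x2 = res.x[:2]
print(x1, x2, "overrun:", res.x[3], "shortfall:", res.x[4])   # 50, 30, 0, 10
```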