35 results for Graphical programming

in Aston University Research Archive


Relevance:

30.00%

Publisher:

Abstract:

The inclusion of high-level scripting functionality in state-of-the-art rendering APIs indicates a movement toward data-driven methodologies for structuring next generation rendering pipelines. A similar theme can be seen in the use of composition languages to deploy component software using selection and configuration of collaborating component implementations. In this paper we introduce the Fluid framework, which places particular emphasis on the use of high-level data manipulations in order to develop component based software that is flexible, extensible, and expressive. We introduce a data-driven, object oriented programming methodology to component based software development, and demonstrate how a rendering system with a similar focus on abstract manipulations can be incorporated, in order to develop a visualization application for geospatial data. In particular we describe a novel SAS script integration layer that provides access to vertex and fragment programs, producing a very controllable, responsive rendering system. The proposed system is very similar to developments speculatively planned for DirectX 10, but uses open standards and has cross platform applicability. © The Eurographics Association 2007.

Relevance:

30.00%

Publisher:

Abstract:

A graphical process control language has been developed as a means of defining process control software. The user configures a block diagram describing the required control system, from a menu of functional blocks, using a graphics software system with a graphics terminal. Additions may be made to the menu of functional blocks, to extend the system capability, and a group of blocks may be defined as a composite block. This latter feature provides for segmentation of the overall system diagram and the repeated use of the same group of blocks within the system. The completed diagram is analyzed by a graphics compiler which generates the programs and data structure to realise the run-time software. The run-time software has been designed as a data-driven system which allows for modifications at the run-time level in both parameters and system configuration. Data structures have been specified to ensure efficient execution and minimal storage requirements in the final control software. Machine independence has been accommodated as far as possible using CORAL 66 as the high level language throughout the entire system; the final run-time code being generated by a CORAL 66 compiler appropriate to the target processor.
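The data-driven run-time described above can be sketched in miniature: a table of named signals drives a list of functional blocks, so both parameters and system configuration can be altered at run time without recompiling. This is only an illustrative Python sketch with invented block names, not the CORAL 66 implementation.

```python
# Minimal sketch of a data-driven functional-block run-time (illustrative
# only; the original system was implemented in CORAL 66, not Python).

class Block:
    """A functional block: reads named input signals, produces one output."""
    def __init__(self, name, func, inputs):
        self.name = name        # name of the output signal
        self.func = func        # the block's transfer function
        self.inputs = inputs    # names of upstream signals

def run(blocks, signals):
    """Evaluate blocks in list order, writing outputs into the signal table.
    The signal table is the data structure that drives execution, so both
    parameters and configuration can be changed at run time."""
    for b in blocks:
        signals[b.name] = b.func(*(signals[i] for i in b.inputs))
    return signals

# A two-block diagram: a gain block feeding an error (summation) block.
diagram = [
    Block("scaled", lambda x: 2.0 * x, ["sensor"]),
    Block("error", lambda sp, y: sp - y, ["setpoint", "scaled"]),
]
state = run(diagram, {"sensor": 1.5, "setpoint": 5.0})
# state["scaled"] is 3.0 and state["error"] is 2.0
```

Reconfiguring the system amounts to editing the `diagram` list and the signal table, which mirrors the run-time flexibility the abstract describes.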

Relevance:

20.00%

Publisher:

Abstract:

Spectral and coherence methodologies are ubiquitous for the analysis of multiple time series. Partial coherence analysis may be used to try to determine graphical models for brain functional connectivity. The outcome of such an analysis may be considerably influenced by factors such as the degree of spectral smoothing, line and interference removal, matrix inversion stabilization and the suppression of effects caused by side-lobe leakage, the combination of results from different epochs and people, and multiple hypothesis testing. This paper examines each of these steps in turn and provides a possible path which produces relatively ‘clean’ connectivity plots. In particular we show how spectral matrix diagonal up-weighting can simultaneously stabilize spectral matrix inversion and reduce effects caused by side-lobe leakage, and use the stepdown multiple hypothesis test procedure to help formulate an interaction strength.
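The diagonal up-weighting step can be illustrated numerically: adding a small multiple of the average power to the diagonal of the spectral matrix stabilizes its inversion before partial coherence is formed from the inverse matrix. The weighting factor `delta` and the simulated spectral matrix below are illustrative choices, not values from the paper.

```python
import numpy as np

def partial_coherence(S, delta=0.05):
    """Partial coherence from a spectral matrix S at one frequency.
    Diagonal up-weighting (adding delta times the average power to the
    diagonal) stabilizes the matrix inversion, as the abstract describes;
    delta is an illustrative choice here."""
    p = S.shape[0]
    Sw = S + delta * (np.trace(S).real / p) * np.eye(p)
    G = np.linalg.inv(Sw)                 # inverse spectral matrix
    dg = np.diag(G).real                  # positive diagonal of the inverse
    pc = np.abs(G) ** 2 / np.outer(dg, dg)
    np.fill_diagonal(pc, 1.0)
    return pc

# Simulated multichannel data standing in for one frequency's spectral matrix.
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 50)) + 1j * rng.standard_normal((4, 50))
S = X @ X.conj().T / 50
pc = partial_coherence(S)
```

Because the up-weighted matrix is positive definite, its inverse is too, so every off-diagonal partial coherence lies in [0, 1] and the plot stays interpretable even when the raw matrix is near-singular.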

Relevance:

20.00%

Publisher:

Abstract:

Logistics distribution network design is one of the major decision problems arising in contemporary supply chain management. The decision involves many quantitative and qualitative factors that may be conflicting in nature. This paper applies an integrated multiple criteria decision making approach to design an optimal distribution network. In the approach, the analytic hierarchy process (AHP) is used first to determine the relative importance weightings or priorities of alternative warehouses with respect to both deliverer oriented and customer oriented criteria. Then, the goal programming (GP) model incorporating the constraints of system, resource, and AHP priority is formulated to select the best set of warehouses without exceeding the limited available resources. In this paper, two commercial packages are used: Expert Choice for determining the AHP priorities of the warehouses, and LINDO for solving the GP model. © 2007 IEEE.
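The first stage of the approach, deriving AHP priority weightings from pairwise comparisons of the candidate warehouses, can be sketched with the standard geometric-mean approximation to the principal eigenvector. The comparison matrix below is invented for illustration; the paper itself obtained the priorities with the Expert Choice package.

```python
import numpy as np

def ahp_priorities(A):
    """Priority weights from an AHP pairwise comparison matrix A, using the
    geometric-mean approximation to the principal eigenvector."""
    g = np.prod(A, axis=1) ** (1.0 / A.shape[0])  # row geometric means
    return g / g.sum()                            # normalize to sum to 1

# Three candidate warehouses compared pairwise on one criterion
# (entry A[i][j] is how strongly warehouse i is preferred to j).
A = np.array([
    [1.0,   3.0, 5.0],
    [1/3.0, 1.0, 2.0],
    [1/5.0, 1/2.0, 1.0],
])
w = ahp_priorities(A)   # roughly [0.65, 0.23, 0.12]
```

These weights would then enter the goal programming model as coefficients in the AHP-priority constraint, so that warehouse selection respects both the pairwise judgements and the resource limits.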

Relevance:

20.00%

Publisher:

Abstract:

Finding relevant results for a query submitted to multiple search engines is an important task. This paper formulates the aggregation and ranking of multiple search engines' results as a minimax linear programming model. Beyond the novel application, this study detects the most relevant information among a returned set of ranked lists of documents retrieved by distinct search engines. Two numerical examples are used to illustrate the usefulness of the proposed approach.

Relevance:

20.00%

Publisher:

Abstract:

This thesis explores translating well-written sequential programs in a subset of the Eiffel programming language - without syntactic or semantic extensions - into parallelised programs for execution on a distributed architecture. The main focus is on constructing two object-oriented models: a theoretical self-contained model of concurrency, which in turn enables a simplified second model for implementing the compilation process. A set of principles is also presented that, if followed, maximises the potential level of parallelism.

Model of Concurrency. The concurrency model is designed as a straightforward target onto which sequential programs can be mapped, thus making them parallel. It aids the compilation process by providing a high level of abstraction, including a useful model of parallel behaviour which enables easy incorporation of message interchange, locking, and synchronization of objects. Further, the model is sufficiently complete that a compiler has in practice been built for it.

Model of Compilation. The compilation model's structure is based upon an object-oriented view of grammar descriptions and capitalises on both a recursive-descent style of processing and abstract syntax trees to perform the parsing. A composite-object view with an attribute-grammar style of processing is used to extract sufficient semantic information for the parallelisation (i.e. code-generation) phase.

Programming Principles. The principles presented are based upon information hiding, sharing and containment of objects, and the division of methods on the basis of a command/query split. When followed, they maximise the level of potential parallelism within the presented concurrency model. Further, these principles arise naturally from good programming practice.

Summary. This thesis shows that it is possible to compile well-written programs, written in a subset of Eiffel, into parallel programs without any syntactic additions or semantic alterations to Eiffel: no parallel primitives are added, and the parallel program is modelled to execute with semantics equivalent to the sequential version. If the programming principles are followed, a parallelised program achieves the maximum level of potential parallelism within the concurrency model.
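The command/query division underpinning the programming principles can be shown in a few lines. Queries do not change object state, so concurrent callers may run them in parallel without locking; commands mutate state and must be serialized. The example is in Python for brevity (the thesis itself targets a subset of Eiffel), and the class is invented for illustration.

```python
# Sketch of the command/query division: queries are side-effect free and
# safe to run in parallel; commands mutate state and need serialization.

class Account:
    def __init__(self, balance):
        self._balance = balance

    def balance(self):
        """Query: no side effects, so parallel readers need no lock."""
        return self._balance

    def deposit(self, amount):
        """Command: mutates state, so concurrent calls must be serialized."""
        self._balance += amount

acct = Account(100)
acct.deposit(50)
# acct.balance() now returns 150
```

A compiler that can classify each method as command or query, as the thesis proposes, can schedule queries concurrently while locking objects only around commands.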

Relevance:

20.00%

Publisher:

Relevance:

20.00%

Publisher:

Abstract:

The development of increasingly powerful computers, which has enabled the use of windowing software, has also opened the way for the computer study, via simulation, of very complex physical systems. In this study, the main issues related to the implementation of interactive simulations of complex systems are identified and discussed. Most existing simulators are closed in the sense that there is no access to the source code and, even if it were available, adaptation to interaction with other systems would require extensive code re-writing. This work aims to increase the flexibility of such software by developing a set of object-oriented simulation classes, which can be extended, by subclassing, at any level, i.e., at the problem domain, presentation or interaction levels. A strategy, which involves the use of an object-oriented framework, concurrent execution of several simulation modules, use of a networked windowing system and the re-use of existing software written in procedural languages, is proposed. A prototype tool which combines these techniques has been implemented and is presented. It allows the on-line definition of the configuration of the physical system and generates the appropriate graphical user interface. Simulation routines have been developed for the chemical recovery cycle of a paper pulp mill. The application, by creation of new classes, of the prototype to the interactive simulation of this physical system is described. Besides providing visual feedback, the resulting graphical user interface greatly simplifies the interaction with this set of simulation modules. This study shows that considerable benefits can be obtained by application of computer science concepts to the engineering domain, by helping domain experts to tailor interactive tools to suit their needs.
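The subclassing-based extensibility described above can be sketched briefly: a new physical model extends the framework at the problem-domain level only, inheriting the stepping loop unchanged. The class and method names below are invented for illustration and are not from the prototype tool.

```python
# Minimal sketch of extension-by-subclassing for simulation modules
# (illustrative; the class names are invented, not the prototype's).

class SimulationModule:
    """Base class: owns the time-stepping loop; subclasses define dynamics."""
    def __init__(self, state):
        self.state = state

    def step(self, dt):
        raise NotImplementedError   # supplied by a domain-level subclass

    def run(self, dt, n_steps):
        for _ in range(n_steps):
            self.step(dt)
        return self.state

class FirstOrderLag(SimulationModule):
    """Domain-level extension: dx/dt = (u - x) / tau, by explicit Euler."""
    def __init__(self, state, u, tau):
        super().__init__(state)
        self.u, self.tau = u, tau

    def step(self, dt):
        self.state += dt * (self.u - self.state) / self.tau

final = FirstOrderLag(state=0.0, u=1.0, tau=2.0).run(dt=0.1, n_steps=100)
```

Presentation- or interaction-level behaviour would be added the same way, by subclassing at those levels, which is the flexibility the abstract argues for.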

Relevance:

20.00%

Publisher:

Abstract:

Software development methodologies are becoming increasingly abstract, progressing from low level assembly and implementation languages such as C and Ada, to component based approaches that can be used to assemble applications using technologies such as JavaBeans and the .NET framework. Meanwhile, model driven approaches emphasise the role of higher level models and notations, and embody a process of automatically deriving lower level representations and concrete software implementations. The relationship between data and software is also evolving. Modern data formats are becoming increasingly standardised, open and empowered in order to support a growing need to share data in both academia and industry. Many contemporary data formats, most notably those based on XML, are self-describing, able to specify valid data structure and content, and can also describe data manipulations and transformations. Furthermore, while applications of the past have made extensive use of data, the runtime behaviour of future applications may be driven by data, as demonstrated by the field of dynamic data driven application systems. The combination of empowered data formats and high level software development methodologies forms the basis of modern game development technologies, which drive software capabilities and runtime behaviour using empowered data formats describing game content. While low level libraries provide optimised runtime execution, content data is used to drive a wide variety of interactive and immersive experiences. This thesis describes the Fluid project, which combines component based software development and game development technologies in order to define novel component technologies for the description of data driven component based applications. The thesis makes explicit contributions to the fields of component based software development and visualisation of spatiotemporal scenes, and also describes potential implications for game development technologies. The thesis also proposes a number of developments in dynamic data driven application systems in order to further empower the role of data in this field.
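The idea of driving component assembly from data can be sketched in a few lines: component implementations register themselves under names that content data can refer to, and a declarative description is turned into live objects. The registry, component name, and spec format below are invented for illustration; Fluid itself is not shown.

```python
# Toy sketch of data-driven component assembly (illustrative only; the
# registry, component names, and spec format are invented, not Fluid's).

REGISTRY = {}

def component(name):
    """Register a component class under a name usable from content data."""
    def wrap(cls):
        REGISTRY[name] = cls
        return cls
    return wrap

@component("scaler")
class Scaler:
    def __init__(self, factor):
        self.factor = factor
    def apply(self, x):
        return self.factor * x

def build(spec):
    """Instantiate a pipeline of components from a declarative description,
    e.g. one parsed from an XML or similar self-describing data file."""
    return [REGISTRY[s["type"]](**s["params"]) for s in spec]

pipeline = build([{"type": "scaler", "params": {"factor": 3}}])
result = pipeline[0].apply(7)   # 21
```

Because behaviour is selected and configured entirely through the spec, changing the application means editing data rather than code, which is the data-driven property the thesis emphasises.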

Relevance:

20.00%

Publisher:

Abstract:

The concept of an Expert System (ES) has been acknowledged as a very useful tool, but few studies have been carried out in its application to the design of cold rolled sections. This study involves primarily the use of an ES as a tool to improve the design process and to capture the draughtsman's knowledge. Its main purpose is to reduce substantially the time taken to produce a section drawing, thereby facilitating speedy feedback to the customer. In order to communicate with a draughtsman, it is necessary to use sketches, symbolic representations and numerical data. This increases the complexity of programming an ES, as it is necessary to use a combination of languages so that decisions, calculations, graphical drawings and control of the system can be effected. A production system approach is used and a further step has been taken by introducing an Activator, an autoexecute operation set up by the ES to operate an external program automatically. To speed up the absorption of new knowledge into the knowledge base, a new Learning System has been constructed. In addition to developing the ES, other software has been written to assist the design process. The section properties software has been introduced to improve the speed and consistency of calculating the section properties. A method of selecting or comparing the most appropriate section for a given specification is also implemented. Simple loading facilities have been introduced to guide the designer as to the loading capacity of the section. This research has concluded that the application of an ES is beneficial and that, with the activator approach, automated designing can be achieved. On average a complex drawing can be displayed on the screen in about 100 seconds, and over 95% of the initial section design time for repetitive or similar profiles can be saved.

Relevance:

20.00%

Publisher:

Abstract:

This thesis addresses the problem of offline identification of salient patterns in genetic programming individuals. It discusses the main issues related to automatic pattern identification systems, namely that these (a) should help in understanding the final solutions of the evolutionary run, (b) should give insight into the course of evolution and (c) should be helpful in optimizing future runs. Moreover, it proposes an algorithm, the Extended Pattern Growing Algorithm ([E]PGA), to extract, filter and sort the identified patterns so that they fulfil as many of the following criteria as possible: (a) they are representative of the evolutionary run and/or search space, (b) they are human-friendly and (c) their number is within reasonable limits. The results are demonstrated on six problems from different domains.
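A simplified flavour of the task can be sketched as counting repeated subtrees across a population of genetic programming individuals and keeping those above a frequency threshold. This is only an illustrative sketch of subtree-pattern extraction, not the [E]PGA algorithm itself, and the expression trees are invented.

```python
# Illustrative subtree-frequency mining over GP individuals represented as
# nested tuples (operator, child, child, ...); not the [E]PGA algorithm.

from collections import Counter

def subtrees(tree):
    """Yield every subtree of a nested-tuple expression tree."""
    yield tree
    if isinstance(tree, tuple):
        for child in tree[1:]:
            yield from subtrees(child)

def frequent_patterns(population, min_count=2):
    """Keep subtrees occurring at least min_count times across individuals."""
    counts = Counter(s for ind in population for s in subtrees(ind))
    return {s: c for s, c in counts.items() if c >= min_count}

pop = [
    ("add", ("mul", "x", "x"), "1"),    # x*x + 1
    ("sub", ("mul", "x", "x"), "y"),    # x*x - y
]
patterns = frequent_patterns(pop)
# the shared building block ("mul", "x", "x") appears twice
```

Filtering and sorting such raw counts, so that the surviving patterns are representative, readable and few, is precisely where the thesis's contribution lies beyond this naive count.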

Relevance:

20.00%

Publisher:

Abstract:

The aim of this thesis is to examine the specific contextual factors affecting the applicability and development of the planning, programming, budgeting system (P.P.B.S.) as a systems approach to public sector budgeting. The concept of P.P.B.S. as a systems approach to public sector budgeting will first be developed, and the preliminary hypothesis that general contextual factors may be classified under political, structural and cognitive headings will be put forward. This preliminary hypothesis will be developed and refined using American and early British experience. The refined hypothesis will then be tested in detail in the case of the English health and personal social services (H.P.S.S.). The reasons for this focus are that it is the most recent, the sole remaining, and the most significant example in British central government outside of defence, and is fairly representative of non-defence government programme areas. The method of data collection relies on the examination of unpublished and difficult to obtain central government, health and local authority documents, and interviews with senior civil servants and public officials. The conclusion will be that the political constraints on, or factors affecting, P.P.B.S. vary with product characteristics and cultural imperatives on pluralistic decision-making; that structural constraints vary with the degree of coincidence of programme and organisation structure and with the degree of controllability of the organisation; and finally, that cognitive constraints vary according to product characteristics, organisational responsibilities, and analytical effort.

Relevance:

20.00%

Publisher:

Abstract:

This thesis describes the development and use of a goal programming methodology for the evaluation of public housing strategies in Mexico City. The methodology responds to the need to incorporate the location, size and densities of housing projects on the one hand, and "external" constraints such as the ability of low income families to pay for housing, and the amounts of capital and land available, on the other. The provision of low cost housing by public housing agencies in Mexico City is becoming increasingly difficult because there are so many constraints to be met and overcome, the most important of which is the ability of families to pay for housing. Other important limiting factors are the availability of capital and of land plots of the right size in desired locations. The location of public housing projects is significant because it determines the cost and pattern of work trips, which in a metropolitan area such as Mexico City are of considerable importance to both planners and potential house owners. In addition, since the price of land is closely related to its location, the last factor is also significant in determining the price of the total housing package. Consequently there is a major trade-off between a housing strategy based on the provision of housing at locations close to employment, and the opposite one based on the provision of housing at locations where employment accessibility is poorer but housing can be provided at a lower price. The goal programming evaluation methodology presented in this thesis was developed to aid housing planners to evaluate housing strategies which incorporate the issues raised above.
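The goal programming flavour of such an evaluation can be sketched with a toy model: choose housing projects so that weighted deviation variables, a shortfall against a dwelling-unit goal and an overrun of the capital budget, are minimized. All numbers, project names and weights below are invented for illustration; the thesis formulates a much richer model for Mexico City, normally solved with a linear programming code rather than enumeration.

```python
# Toy goal-programming evaluation of housing strategies (illustrative data;
# a tiny discrete model solved by enumeration, not the thesis's model).

from itertools import combinations

# project name -> (dwelling units provided, capital cost in $M)
projects = {"north": (300, 9.0), "central": (200, 11.0), "south": (400, 8.0)}
UNIT_GOAL, CAPITAL_LIMIT = 650, 20.0   # goals for dwellings and capital
W_UNITS, W_CAPITAL = 1.0, 50.0         # penalty weights on the deviations

def penalty(chosen):
    """Weighted sum of goal deviations for a set of chosen projects."""
    units = sum(projects[p][0] for p in chosen)
    cost = sum(projects[p][1] for p in chosen)
    under = max(0, UNIT_GOAL - units)        # shortfall against the unit goal
    over = max(0.0, cost - CAPITAL_LIMIT)    # overrun of the capital budget
    return W_UNITS * under + W_CAPITAL * over

candidates = [set(c) for r in range(1, 4) for c in combinations(projects, r)]
best = min(candidates, key=penalty)
```

The weights encode the planners' priorities among competing goals (affordability, capital, land, accessibility), which is exactly the trade-off machinery the thesis exploits when comparing location-led and cost-led strategies.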