368 results for compiler backend


Relevance:

10.00%

Publisher:

Abstract:

"Compiler's note" signed: Samuel E. Mays.


"Owner and part compiler, Brian MacDermot"--Pref.


Dr. W.H. Gilbert, compiler.


Vols. for 1994 distributed to depository libraries in microfiche.


Includes bibliographical references.


Aim: To develop a population pharmacokinetic model for mycophenolic acid in adult kidney transplant recipients, quantifying average population pharmacokinetic parameter values and between- and within-subject variability, and to evaluate the influence of covariates on the pharmacokinetic variability. Methods: Pharmacokinetic data for mycophenolic acid and covariate information were previously available from 22 patients who underwent kidney transplantation at the Princess Alexandra Hospital. All patients received mycophenolate mofetil 1 g orally twice daily. A total of 557 concentration-time points were available. Data were analysed using the first-order method in NONMEM (version 5 level 1.1) using the G77 FORTRAN compiler. Results: The best base model was a two-compartment model with a lag time (apparent oral clearance was 27 l h⁻¹, and apparent volume of the central compartment 98 l). There was visual evidence of complex absorption and time-dependent clearance processes, but they could not be successfully modelled in this study. Weight was investigated as a covariate, but no significant relationship was determined. Conclusions: The complexity in determining the pharmacokinetics of mycophenolic acid is currently underestimated. More complex pharmacokinetic models, though not supported by the limited data collected for this study, may prove useful in the future. The large between-subject and between-occasion variability and the possibility of nonlinear processes associated with the pharmacokinetics of mycophenolic acid raise questions about the value of therapeutic monitoring and limited sampling strategies.
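The base structure reported above can be sketched numerically. The following is a minimal illustrative simulation of a two-compartment oral model with a lag time; the values CL = 27 l/h and Vc = 98 l come from the abstract, while ka, Q, Vp and tlag are assumed placeholders, not fitted NONMEM estimates.

```python
# Two-compartment oral model with lag time, integrated by simple Euler steps.
# CL and Vc are the abstract's reported values; ka, Q, Vp, tlag are assumptions.
def simulate(dose_mg, t_end_h, dt=0.001,
             CL=27.0, Vc=98.0, ka=1.0, Q=20.0, Vp=200.0, tlag=0.25):
    """Return (times, central-compartment concentration in mg/l)."""
    a_gut, a_c, a_p = dose_mg, 0.0, 0.0
    times, conc = [], []
    t = 0.0
    while t <= t_end_h:
        times.append(t)
        conc.append(a_c / Vc)
        absorbed = ka * a_gut * dt if t >= tlag else 0.0  # absorption starts after lag
        elim = (CL / Vc) * a_c * dt                       # first-order elimination
        dist = (Q / Vc) * a_c * dt - (Q / Vp) * a_p * dt  # net central -> peripheral
        a_gut -= absorbed
        a_c += absorbed - elim - dist
        a_p += dist
        t += dt
    return times, conc
```

With a 1 g dose, the simulated profile rises after the lag, peaks, and then declines, the qualitative shape such a base model produces.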


The real-time refinement calculus is a formal method for the systematic derivation of real-time programs from real-time specifications in a style similar to the non-real-time refinement calculi of Back and Morgan. In this paper we extend the real-time refinement calculus with procedures and provide refinement rules for refining real-time specifications to procedure calls. A real-time specification can include constraints on, not only what outputs are produced, but also when they are produced. The derived programs can also include time constraints on when certain points in the program must be reached; these are expressed in the form of deadline commands. Such programs are machine independent. An important consequence of the approach taken is that, not only are the specifications machine independent, but the whole refinement process is machine independent. To implement the machine independent code on a target machine one has a separate task of showing that the compiled machine code will reach all its deadlines before they expire. For real-time programs, externally observable input and output variables are essential. These differ from local variables in that their values are observable over the duration of the execution of the program. Hence procedures require input and output parameter mechanisms that are references to the actual parameters so that changes to external inputs are observable within the procedure and changes to output parameters are externally observable. In addition, we allow value and result parameters. These may be auxiliary parameters, which are used for reasoning about the correctness of real-time programs as well as in the expression of timing deadlines, but do not lead to any code being generated for them by a compiler. (c) 2006 Elsevier B.V. All rights reserved.


Previous work on formally modelling and analysing program compilation has shown the need for a simple and expressive semantics for assembler level programs. Assembler programs contain unstructured jumps and previous formalisms have modelled these by using continuations, or by embedding the program in an explicit emulator. We propose a simpler approach, which uses techniques from compiler theory in a formal setting. This approach is based on an interpretation of programs as collections of program paths, each of which has a weakest liberal precondition semantics. We then demonstrate, by example, how we can use this formalism to justify the compilation of block-structured high-level language programs into assembler.
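The path-based idea above can be made concrete in miniature. In this hedged sketch (an illustrative encoding, not the paper's formalism), a straight-line path through an assembler program is a list of assignments and branch guards ("assume"s), predicates are Python functions over a state dictionary, and the weakest liberal precondition is computed by folding backwards over the path.

```python
# wlp of a single assignment: substitute the expression for the variable.
def wlp_assign(var, expr, post):
    return lambda s: post({**s, var: expr(s)})

# wlp of a branch guard: states failing the guard never take this path,
# so they satisfy the postcondition vacuously (g => Q).
def wlp_assume(guard, post):
    return lambda s: (not guard(s)) or post(s)

# Fold wlp backwards over a straight-line path of ("assign", var, expr)
# and ("assume", guard, None) steps.
def wlp_path(path, post):
    pred = post
    for kind, a, b in reversed(path):
        pred = wlp_assign(a, b, pred) if kind == "assign" else wlp_assume(a, pred)
    return pred
```

For example, the taken branch of `if x > 0 then y := x + 1` has the path `[assume x > 0; y := x + 1]`; its wlp for the postcondition `y > 1` holds both when the guard succeeds and, vacuously, when it fails.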


Sarah Poulton Kalley is known in almost every segment of Brazilian Protestantism for organizing and compiling Salmos e Hinos, the oldest Protestant hymnal published in the vernacular in Brazil. Her hymns, still in use in many churches, shaped the theology of Brazilian Protestantism for more than a century. Despite this notoriety, her influence on the genesis of Brazilian Protestantism has never been the object of study. The aim of this research is therefore to recover and make visible the areas and strategies of activity that give this woman a relatively autonomous profile. Although centred on the intellectual and biographical trajectory of a single historical subject, the investigation confronts a universe of anonymous figures, enmeshed in a complex web of relationships through which Protestantism took root in Brazil in a specific context: Huguenots, Puritans, Luddites, English non-conformist families, political and ecclesiastical leaders, Madeiran exiles, Brazilians, Portuguese, German immigrants and, above all, the Brazilian Protestant woman. The search for information about this universe, relegated to anonymity by the historiography of Protestantism in Brazil, revealed several previously unpublished documents, including a book written by Sarah Poulton Kalley in 1866: A Alegria da Casa. Far beyond the role of a missionary doctor's wife, Sarah Poulton Kalley emerges from a network of relationships and practices as a teacher, missionary and poet. In these three fields of activity, and through the development of multiple contacts and relationships, she sought to transform and influence the attitudes and beliefs of her interlocutors. Alongside the new faith she disseminated a worldview rooted in Anglo-Saxon, Protestant and Puritan culture, selectively adapting it to the cultural and social universe of her interlocutors.


A graphical process control language has been developed as a means of defining process control software. The user configures a block diagram describing the required control system, from a menu of functional blocks, using a graphics software system with graphics terminal. Additions may be made to the menu of functional blocks, to extend the system capability, and a group of blocks may be defined as a composite block. This latter feature provides for segmentation of the overall system diagram and the repeated use of the same group of blocks within the system. The completed diagram is analyzed by a graphics compiler which generates the programs and data structure to realise the run-time software. The run-time software has been designed as a data-driven system which allows for modifications at the run-time level in both parameters and system configuration. Data structures have been specified to ensure efficient execution and minimal storage requirements in the final control software. Machine independence has been accommodated as far as possible using CORAL 66 as the high level language throughout the entire system; the final run-time code being generated by a CORAL 66 compiler appropriate to the target processor.
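The data-driven design described above can be illustrated with a small sketch (in Python rather than CORAL 66, and with an invented three-block menu): the "compiled" diagram is just a data structure naming each block's function, inputs and output signal, and a generic scan loop interprets it, which is what allows both parameters and configuration to change at run time.

```python
# Menu of functional blocks; extending the system means adding entries here.
BLOCK_MENU = {
    "add":   lambda a, b: a + b,
    "gain":  lambda x, k: x * k,
    "limit": lambda x, lo, hi: max(lo, min(hi, x)),
}

def scan(diagram, signals):
    """One control scan: evaluate each block in order, writing its output signal.
    String inputs name signals; non-strings are literal parameters."""
    for block in diagram:
        args = [signals[name] if isinstance(name, str) else name
                for name in block["inputs"]]
        signals[block["output"]] = BLOCK_MENU[block["type"]](*args)
    return signals

# An example "compiled" diagram: bias a setpoint, scale it, clamp the output.
diagram = [
    {"type": "add",   "inputs": ["setpoint", "bias"],  "output": "target"},
    {"type": "gain",  "inputs": ["target", 0.5],       "output": "demand"},
    {"type": "limit", "inputs": ["demand", 0.0, 10.0], "output": "valve"},
]
```

Because the diagram is data, a supervisory layer could edit `diagram` or a block's literal parameters between scans without regenerating code, the modification-at-run-time property the abstract emphasises.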


This thesis explores translating well-written sequential programs in a subset of the Eiffel programming language - without syntactic or semantic extensions - into parallelised programs for execution on a distributed architecture. The main focus is on constructing two object-oriented models: a theoretical self-contained model of concurrency which enables a simplified second model for implementing the compiling process. There is a further presentation of principles that, if followed, maximise the potential levels of parallelism.

Model of Concurrency. The concurrency model is designed to be a straightforward target for mapping sequential programs onto, thus making them parallel. It aids the compilation process by providing a high level of abstraction, including a useful model of parallel behaviour which enables easy incorporation of message interchange, locking, and synchronization of objects. Further, the model is sufficiently complete that a compiler can be, and has been, practically built.

Model of Compilation. The compilation model's structure is based upon an object-oriented view of grammar descriptions and capitalises on both a recursive-descent style of processing and abstract syntax trees to perform the parsing. A composite-object view with an attribute grammar style of processing is used to extract sufficient semantic information for the parallelisation (i.e. code-generation) phase.

Programming Principles. The set of principles presented are based upon information hiding, sharing and containment of objects and the dividing up of methods on the basis of a command/query division. When followed, the level of potential parallelism within the presented concurrency model is maximised. Further, these principles naturally arise from good programming practice.

Summary. This thesis shows that it is possible to compile well-written programs, written in a subset of Eiffel, into parallel programs without any syntactic additions or semantic alterations to Eiffel: i.e. no parallel primitives are added, and the parallel program is modelled to execute with equivalent semantics to the sequential version. If the programming principles are followed, a parallelised program achieves the maximum level of potential parallelisation within the concurrency model.
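The recursive-descent, AST-building style named in the compilation model can be sketched in a few lines. This is an illustrative stand-in (a tiny arithmetic grammar in Python, not the Eiffel subset): each grammar rule becomes a function, and each function returns an abstract-syntax-tree node.

```python
import re

def tokenize(src):
    """Split source into integer and operator tokens."""
    return re.findall(r"\d+|[+*()]", src)

def parse(tokens):
    """expr := term ('+' term)* ; term := factor ('*' factor)* ;
    factor := INT | '(' expr ')'  -- one function per rule, returning AST tuples."""
    pos = 0
    def peek():
        return tokens[pos] if pos < len(tokens) else None
    def expr():
        nonlocal pos
        node = term()
        while peek() == "+":
            pos += 1
            node = ("+", node, term())
        return node
    def term():
        nonlocal pos
        node = factor()
        while peek() == "*":
            pos += 1
            node = ("*", node, factor())
        return node
    def factor():
        nonlocal pos
        tok = peek()
        pos += 1
        if tok == "(":
            node = expr()
            pos += 1  # consume ')'
            return node
        return ("int", int(tok))
    return expr()
```

The resulting tree, e.g. `parse(tokenize("1+2*3"))`, is then the input that an attribute-grammar-style pass would walk to gather semantic information for code generation.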


Conventional structured methods of software engineering are often based on the use of functional decomposition coupled with the Waterfall development process model. This approach is argued to be inadequate for coping with the evolutionary nature of large software systems. Alternative development paradigms, including the operational paradigm and the transformational paradigm, have been proposed to address the inadequacies of this conventional view of software development, and these are reviewed. JSD is presented as an example of an operational approach to software engineering, and is contrasted with other well documented examples. The thesis shows how aspects of JSD can be characterised with reference to formal language theory and automata theory. In particular, it is noted that Jackson structure diagrams are equivalent to regular expressions and can be thought of as specifying corresponding finite automata. The thesis discusses the automatic transformation of structure diagrams into finite automata using an algorithm adapted from compiler theory, and then extends the technique to deal with areas of JSD which are not strictly formalisable in terms of regular languages. In particular, an elegant and novel method for dealing with so-called recognition (or parsing) difficulties is described. Various applications of the extended technique are described. They include a new method of automatically implementing the dismemberment transformation; an efficient way of implementing inversion in languages lacking a goto-statement; and a new in-the-large implementation strategy.
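The diagram-to-automaton equivalence can be illustrated concretely. Jackson structure diagrams compose components by sequence, selection and iteration, exactly the regular operators, so each diagram denotes a finite automaton. The sketch below is a hypothetical encoding (not the thesis's algorithm): diagrams as nested tuples, compiled to an epsilon-NFA by Thompson's construction from compiler theory, and simulated directly.

```python
def compile_nfa(node, trans, counter):
    """Return (start, accept); trans maps (state, symbol or None) -> set of states."""
    def new_state():
        counter[0] += 1
        return counter[0]
    kind = node[0]
    start, accept = new_state(), new_state()
    if kind == "sym":                      # elementary component: one labelled edge
        trans.setdefault((start, node[1]), set()).add(accept)
    elif kind == "seq":                    # sequence: parts one after another
        s1, a1 = compile_nfa(node[1], trans, counter)
        s2, a2 = compile_nfa(node[2], trans, counter)
        for src, dst in ((start, s1), (a1, s2), (a2, accept)):
            trans.setdefault((src, None), set()).add(dst)
    elif kind == "sel":                    # selection: choose any one part
        for child in node[1:]:
            s, a = compile_nfa(child, trans, counter)
            trans.setdefault((start, None), set()).add(s)
            trans.setdefault((a, None), set()).add(accept)
    elif kind == "itr":                    # iteration: zero or more repetitions
        s, a = compile_nfa(node[1], trans, counter)
        for src, dst in ((start, s), (a, accept), (a, s), (start, accept)):
            trans.setdefault((src, None), set()).add(dst)
    return start, accept

def closure(states, trans):
    """Epsilon-closure of a state set."""
    stack, seen = list(states), set(states)
    while stack:
        st = stack.pop()
        for nxt in trans.get((st, None), ()):
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return seen

def accepts(diagram, word):
    """Compile the structure diagram and run the NFA over the input word."""
    trans = {}
    start, accept = compile_nfa(diagram, trans, [0])
    current = closure({start}, trans)
    for ch in word:
        moved = set()
        for st in current:
            moved |= trans.get((st, ch), set())
        current = closure(moved, trans)
    return accept in current
```

For example, a diagram reading "an `a`, then an iteration of a selection between `b` and `c`, then a `d`" corresponds to the regular expression `a(b|c)*d`.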


This thesis provides a set of tools for managing uncertainty in Web-based models and workflows. To support the use of these tools, this thesis firstly provides a framework for exposing models through Web services. An introduction to uncertainty management, Web service interfaces, and workflow standards and technologies is given, with a particular focus on the geospatial domain. An existing specification for exposing geospatial models and processes, the Web Processing Service (WPS), is critically reviewed. A processing service framework is presented as a solution to usability issues with the WPS standard. The framework implements support for Simple Object Access Protocol (SOAP), Web Service Description Language (WSDL) and JavaScript Object Notation (JSON), allowing models to be consumed by a variety of tools and software. Strategies for communicating with models from Web service interfaces are discussed, demonstrating the difficulty of exposing existing models on the Web. This thesis then reviews existing mechanisms for uncertainty management, with an emphasis on emulator methods for building efficient statistical surrogate models. A tool is developed to solve accessibility issues with such methods, by providing a Web-based user interface and backend to ease the process of building and integrating emulators. These tools, plus the processing service framework, are applied to a real case study as part of the UncertWeb project. The usability of the framework is proved with the implementation of a Web-based workflow for predicting future crop yields in the UK, also demonstrating the abilities of the tools for emulator building and integration. Future directions for the development of the tools are discussed.
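The emulator idea can be shown in miniature: fit a cheap statistical surrogate to a few runs of an expensive model, then query the surrogate instead. The sketch below is an illustrative assumption (a Gaussian radial-basis interpolator in pure Python), not the UncertWeb implementation or its statistical machinery.

```python
import math

def fit_emulator(xs, ys, length_scale=1.0):
    """Fit an RBF interpolant: solve K w = y by Gaussian elimination with pivoting."""
    n = len(xs)
    k = lambda a, b: math.exp(-((a - b) ** 2) / (2 * length_scale ** 2))
    # Augmented kernel matrix [K | y].
    A = [[k(xs[i], xs[j]) for j in range(n)] + [ys[i]] for i in range(n)]
    for col in range(n):                      # forward elimination
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for row in range(col + 1, n):
            f = A[row][col] / A[col][col]
            for j in range(col, n + 1):
                A[row][j] -= f * A[col][j]
    w = [0.0] * n
    for row in range(n - 1, -1, -1):          # back substitution
        w[row] = (A[row][n] - sum(A[row][j] * w[j]
                                  for j in range(row + 1, n))) / A[row][row]
    return lambda x: sum(wi * k(x, xi) for wi, xi in zip(w, xs))

# Train the surrogate on a handful of "expensive" model runs, then query cheaply.
expensive_model = math.sin               # stand-in for a slow simulation
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
emulator = fit_emulator(xs, [expensive_model(x) for x in xs])
```

The surrogate reproduces the training runs exactly and approximates the model in between, which is what makes emulators attractive when each real model run is costly.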


The software architecture and development considerations for an open metadata extraction and processing framework are outlined. Special attention is paid to the aspects of reliability and fault tolerance. A Grid infrastructure is shown to be a useful backend for general-purpose tasks.
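The fault-tolerance concern can be made concrete with the usual pattern: retry a flaky extraction task a bounded number of times before failing over to a backup handler. This is a generic sketch, not the framework's actual mechanism, and the names (`task`, `fallback`) are illustrative.

```python
def run_with_retries(task, fallback, attempts=3):
    """Try `task` up to `attempts` times; on repeated failure run `fallback`."""
    last_error = None
    for _ in range(attempts):
        try:
            return task()
        except Exception as err:   # a real framework would catch narrower errors
            last_error = err
    return fallback(last_error)
```

In a Grid setting the fallback would typically resubmit the task to a different worker node rather than compute a local default.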