24 results for Emulators (Computer programs)

in University of Queensland eSpace - Australia


Relevance: 90.00%

Abstract:

Processor emulators are software tools that allow legacy computer programs to be executed on a modern processor. In the past, emulators have been used in trivial applications such as the maintenance of video games. Now, however, processor emulation is being applied to safety-critical control systems, including military avionics. These applications demand the utmost guarantees of correctness, but no verification techniques exist for proving that an emulated system preserves the original system's functional and timing properties. Here we show how this can be done by combining concepts previously used for reasoning about real-time program compilation with an understanding of the new and old software architectures. In particular, we show how both the old and new systems can be given a common semantics, allowing their behaviours to be compared directly.
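To make the object of verification concrete, here is a minimal sketch of what a processor emulator is: a fetch-decode-execute loop over legacy opcodes. The two-instruction ISA, the opcode names, and the cycle costs below are hypothetical illustrations, not the paper's system; the point is that both the final state and the accumulated timing are part of the semantics that an emulation must preserve.

```python
# Minimal sketch (hypothetical two-instruction ISA, not the paper's system):
# an emulator interprets legacy opcodes while tracking both state and timing.
def emulate(program, cycles_per_op):
    """Interpret a list of (opcode, operand) pairs; return (accumulator, cycles)."""
    acc, cycles = 0, 0
    for opcode, operand in program:
        if opcode == "ADD":      # add immediate to accumulator
            acc += operand
        elif opcode == "MUL":    # multiply accumulator by immediate
            acc *= operand
        else:
            raise ValueError(f"unknown opcode {opcode}")
        cycles += cycles_per_op[opcode]  # emulated timing, not just state
    return acc, cycles

result, cost = emulate([("ADD", 3), ("MUL", 4)], {"ADD": 1, "MUL": 3})
```

Proving emulation correct then means showing that both components of the result — functional state and timing — match the original processor's semantics, which is what a common semantics for old and new systems makes directly comparable.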

Relevance: 80.00%

Abstract:

Background: Flexible video bronchoscopes, in particular the Olympus BF Type 3C160, are commonly used in paediatric respiratory medicine. There are no data on the magnification and distortion effects of these bronchoscopes, yet important clinical decisions are made from their images. The aim of this study was to systematically describe the magnification and distortion of flexible bronchoscope images taken at various distances from the object. Methods: Using images of known objects, processed by digital video and computer programs, both magnification and distortion scales were derived. Results: Magnification changes as a linear function between 100 mm (×1) and 10 mm (×9.55) and then as an exponential function between 10 mm and 3 mm (×40) from the object. Magnification depends on the orientation of the object relative to the optic (geometrical) axis of the bronchoscope. Magnification also varies across the field of view, with central magnification being 39% greater than at the periphery of the field of view at 15 mm from the object. However, in the paediatric situation the diameter of the orifices is usually less than 10 mm, which limits exposure to these peripheral reductions in magnification. Intraclass correlations for measurements and repeatability studies between instruments are very high (r = 0.96). Distortion occurs as both barrel and geometric types, but both are heterogeneous across the field of view. Geometric distortion ranges up to 30% at 3 mm from the object but may be as low as 5%, depending on the position of the object relative to the optic axis. Conclusion: We conclude that the optimal working distance is between 40 and 10 mm from the object. However, the clinician should remain cognisant of variations in both magnification and distortion when making clinical judgements.
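The reported figures can be turned into a rough magnification model. The fit below is our own illustration, not the authors' calibration: linear from ×1 at 100 mm to ×9.55 at 10 mm, then an exponential fitted through the two reported endpoints (×9.55 at 10 mm, ×40 at 3 mm), for central magnification only.

```python
import math

# Illustrative fit (ours, not the authors' calibration) of reported central
# magnification: linear from x1 at 100 mm to x9.55 at 10 mm, exponential
# from x9.55 at 10 mm to x40 at 3 mm.
K = math.log(40.0 / 9.55) / 7.0   # exponential rate fitted to the two endpoints
A = 9.55 * math.exp(10.0 * K)     # amplitude chosen so that M(10) = 9.55

def magnification(d_mm):
    """Approximate central magnification at distance d_mm from the object."""
    if 10.0 <= d_mm <= 100.0:
        return 1.0 + (9.55 - 1.0) * (100.0 - d_mm) / 90.0  # linear regime
    if 3.0 <= d_mm < 10.0:
        return A * math.exp(-K * d_mm)                      # exponential regime
    raise ValueError("model only covers 3-100 mm")
```

Note how steep the exponential regime is: between 10 mm and 3 mm the magnification more than quadruples, which is one reason the abstract recommends working at 40-10 mm.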

Relevance: 80.00%

Abstract:

Despite the number of computer-assisted methods described for deriving the steady-state equations of enzyme systems, most are restricted to strict steady-state conditions or cannot solve complex reaction mechanisms. Moreover, many are based on computer programs that are either not readily available or have limitations. We present here a computer program called WinStes, which derives equations both for strict steady-state systems and under the rapid-equilibrium assumption, for branched or unbranched mechanisms containing both reversible and irreversible conversion steps. It solves reaction mechanisms involving up to 255 enzyme species connected by up to 255 conversion steps. The program provides all the advantages of Windows programs, such as a user-friendly graphical interface, and has a short computation time. WinStes is available free of charge on request from the authors. (c) 2006 Elsevier Inc. All rights reserved.
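WinStes itself is not shown in the abstract, but the task it automates can be illustrated on the simplest one-substrate mechanism, E + S ⇌ ES → E + P. The steady-state assumption d[ES]/dt = 0 yields the familiar rate law v = k2·E0·S/(Km + S) with Km = (k₋₁ + k2)/k1; the sketch below (our illustration with arbitrary rate constants, not WinStes output) checks that analytic result against direct numerical integration.

```python
# Illustration (not WinStes): for E + S <=> ES -> E + P, the steady-state
# assumption gives v = k2*E0*S / (Km + S), Km = (k_minus1 + k2)/k1.
# We verify this against Euler integration of d[ES]/dt to its fixed point.
k1, k_minus1, k2 = 1.0, 1.0, 1.0   # arbitrary illustrative rate constants
E0, S = 1.0, 2.0                   # total enzyme; substrate held constant (excess)

Km = (k_minus1 + k2) / k1
v_analytic = k2 * E0 * S / (Km + S)

ES, dt = 0.0, 1e-3
for _ in range(20000):             # integrate until [ES] settles
    dES = k1 * (E0 - ES) * S - (k_minus1 + k2) * ES
    ES += dt * dES
v_numeric = k2 * ES                # rate at steady state
```

A symbolic tool like WinStes performs the equivalent algebra automatically for mechanisms far too large to solve by hand (up to 255 species and 255 steps).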

Relevance: 80.00%

Abstract:

The increasing use of information and communications technologies among government departments and non-government agencies has fundamentally changed the implementation of employment services policy in Australia. The administrative arrangements for governing unemployment and unemployed people are now constituted by a complex contractual interplay between government departments as ‘purchasers’ and a range of small and large private organizations as ‘providers’. Assessing, tracking and monitoring the activities of unemployed people through the various parts of the employment services system has been made possible by developments in information technology and tailored computer programs. Consequently, the discretionary capacity that is traditionally associated with ‘street-level bureaucracy’ has been partly transformed into more prescriptive forms of ‘screen-level bureaucracy’. The knowledge embedded in these new computer-based technologies is considered superior because it is based on ‘objective calculations’, rather than subjective assessments of individual employees. The relationship between the sociopolitical context of unemployment policy and emerging forms of e-government is explored using illustrative findings from a qualitative pilot study undertaken in two Australian sites. The findings suggest that some of the new technologies in the employment services system are welcomed, while other applications are experienced as contradictory to the aims of delivering a personalized and respectful service.

Relevance: 80.00%

Abstract:

Computer-based, socio-technical systems projects are frequently failures. In particular, computer-based information systems often fail to live up to their promise. Part of the problem lies in the uncertainty of the effect of combining the subsystems that comprise the complete system; that is, the system's emergent behaviour cannot be predicted from knowledge of the subsystems alone. This paper suggests that uncertainty management is a fundamental unifying concept in the analysis and design of complex systems, and argues that this is due to the co-evolutionary nature of the requirements and implementation of socio-technical systems. The paper presents a model of the propagation of a system change which indicates that introducing two or more changes over time can cause chaotic emergent behaviour.
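The paper's propagation model is not reproduced in the abstract, but the phenomenon it invokes, chaotic sensitivity of iterated change, can be illustrated generically. The sketch below is a stand-in using the logistic map, not the authors' model: two system states that differ imperceptibly diverge completely after a few dozen propagation steps.

```python
# Generic stand-in (not the paper's model): iterated change propagation
# modelled by the logistic map x' = r*x*(1-x) at r = 4, a chaotic regime.
def trajectory(x, steps, r=4.0):
    """Return the list of states visited while iterating the map."""
    xs = [x]
    for _ in range(steps):
        x = r * x * (1.0 - x)
        xs.append(x)
    return xs

ta = trajectory(0.2, 60)           # baseline system state
tb = trajectory(0.2 + 1e-9, 60)    # the "same" system after a tiny change

# After enough iterations the two histories are uncorrelated.
max_sep = max(abs(p - q) for p, q in zip(ta[40:], tb[40:]))
```

The design implication the paper draws is that such sensitivity cannot be engineered away by better prediction; it has to be handled as uncertainty management.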

Relevance: 80.00%

Abstract:

Software simulation models are computer programs that need to be verified and debugged like any other software. In previous work, a method for error isolation in simulation models was proposed. The method relies on a set of feature matrices that can be used to determine which part of the model implementation is responsible for deviations in the model's output. Currently these feature matrices have to be generated by hand from the model implementation, which is a tedious and error-prone task. In this paper, a method based on mutation analysis, together with prototype tool support, is presented for verifying the manually generated feature matrices. Applying the method and tool to a model of wastewater treatment shows that the feature matrices can be verified effectively using a minimal number of mutants.
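The core idea of mutation analysis can be shown in miniature. The model fragment, its name, and the check below are hypothetical illustrations, not the paper's tool or wastewater model: a mutant replaces one operator in the implementation, and a verification check that passes on the original but fails on the mutant is said to "kill" it.

```python
# Generic mutation-analysis sketch (hypothetical model fragment, not the
# paper's tool): a check "kills" a mutant if it distinguishes it from the
# original implementation.
def settle_rate(flow, volume):          # original model fragment
    return flow / volume

def settle_rate_mutant(flow, volume):   # mutant: division replaced by multiplication
    return flow * volume

def kills(check, original, mutant):
    """True if the check passes on the original but fails on the mutant."""
    return check(original) and not check(mutant)

check = lambda f: abs(f(10.0, 4.0) - 2.5) < 1e-9
killed = kills(check, settle_rate, settle_rate_mutant)
```

In the paper's setting the role of `check` is played by the hand-built feature matrix: a matrix entry that fails to kill the corresponding mutant is evidence that the entry was generated incorrectly.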

Relevance: 30.00%

Abstract:

A computer model of the mechanical alloying process has been developed to simulate phase formation during the mechanical alloying of Mo and Si elemental powders with a ternary addition of Al, Mg, Ti or Zr. Using the Arrhenius equation, the model balances the formation rates of the competing reactions observed during milling. These reactions include the formation of tetragonal C11b MoSi2 (t-MoSi2) by combustion, the formation of the hexagonal C40 MoSi2 polymorph (h-MoSi2), the transformation of the tetragonal to the hexagonal form, and the recovery of t-MoSi2 from h-MoSi2 and deformed t-MoSi2. The ternary additions change the free energy of formation of the associated MoSi2 alloys, i.e. Mo(Si,Al)2, Mo(Mg,Al)2, (Mo,Ti)Si2, (Mo,Zr)Si2 and (Mo,Fe)Si2. Variation of the energy of formation alone is sufficient for the simulation to accurately model the observed phase formation. (C) 2003 Elsevier B.V. All rights reserved.
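The mechanism by which a changed formation energy shifts the phase balance follows directly from the Arrhenius equation, k = A·exp(-Ea/RT). The parameters below are purely illustrative (not the model's fitted values): two competing formation reactions with different activation energies, where the lower-energy pathway dominates at a given temperature.

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def arrhenius(A, Ea, T):
    """Rate constant k = A * exp(-Ea / (R*T)); Ea in J/mol, T in K."""
    return A * math.exp(-Ea / (R * T))

# Hypothetical parameters for illustration only (not the paper's values):
# two competing formation reactions, as in the milling model's rate balance.
k_t = arrhenius(A=1e13, Ea=120e3, T=900.0)  # e.g. t-MoSi2 formation pathway
k_h = arrhenius(A=1e13, Ea=140e3, T=900.0)  # e.g. h-MoSi2 formation pathway
fraction_t = k_t / (k_t + k_h)              # share taken by the faster pathway
```

Because the rate depends exponentially on the activation energy, even a modest shift in formation energy from a ternary addition is enough to change which polymorph dominates, which is why varying the energy of formation alone reproduces the observed phases.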

Relevance: 30.00%

Abstract:

We provide an abstract command language for real-time programs and outline how a partial correctness semantics can be used to compute execution times. The notions of a timed command, refinement of a timed command, the command traversal condition, and the worst-case and best-case execution time of a command are formally introduced and investigated with the help of an underlying weakest liberal precondition semantics. The central result is a theory for the computation of worst-case and best-case execution times from the underlying semantics based on supremum and infimum calculations. The framework is applied to the analysis of a message transmitter program and its implementation. (c) 2005 Elsevier B.V. All rights reserved.
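The supremum/infimum structure of the central result can be sketched on a toy command language (our own illustration, with made-up constructors, not the paper's formalism): sequencing adds execution times, while choice takes the supremum of its branches for the worst case and the infimum for the best case.

```python
# Sketch (ours, not the paper's semantics): WCET/BCET over a toy command
# language with basic commands, sequencing, and nondeterministic choice.
def wcet(cmd):
    kind = cmd[0]
    if kind == "basic":                            # ("basic", time)
        return cmd[1]
    if kind == "seq":                              # ("seq", c1, c2)
        return wcet(cmd[1]) + wcet(cmd[2])         # times add in sequence
    if kind == "choice":                           # ("choice", c1, c2)
        return max(wcet(cmd[1]), wcet(cmd[2]))     # supremum over branches
    raise ValueError(kind)

def bcet(cmd):
    kind = cmd[0]
    if kind == "basic":
        return cmd[1]
    if kind == "seq":
        return bcet(cmd[1]) + bcet(cmd[2])
    if kind == "choice":
        return min(bcet(cmd[1]), bcet(cmd[2]))     # infimum over branches
    raise ValueError(kind)

# a 2-unit command followed by a choice between a 1-unit and a 5-unit branch
prog = ("seq", ("basic", 2), ("choice", ("basic", 1), ("basic", 5)))
```

In the paper these bounds are not computed syntactically as here but derived from the underlying weakest liberal precondition semantics, with traversal conditions determining which branches are feasible.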

Relevance: 30.00%

Abstract:

We present some techniques for obtaining smooth derivations of concurrent programs that address both safety and progress in a formal manner. Our techniques extend the calculational method of Feijen and van Gasteren with a UNITY-style progress logic. We stress the role of stable guards, and we illustrate the derivation techniques on some examples in which progress plays an essential role.
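A stable guard is one that, once established, is never falsified by any component; this is what makes it safe to argue that a waiting component eventually proceeds. The sketch below is our own illustration (not an example from the paper): the event `done` is set and never cleared, so the consumer's wait is guaranteed to terminate once the producer has run.

```python
import threading

# Illustration (ours, not from the paper): "done" is a stable guard; it is
# set exactly once and never falsified, so the consumer's wait must
# eventually succeed: a progress property resting on guard stability.
done = threading.Event()
result = []

def producer():
    result.append(42)   # establish the data first...
    done.set()          # ...then establish the stable guard

def consumer(out):
    done.wait()         # progress argument relies on stability of the guard
    out.append(result[0])

out = []
t_cons = threading.Thread(target=consumer, args=(out,))
t_prod = threading.Thread(target=producer)
t_cons.start()
t_prod.start()
t_prod.join()
t_cons.join()
```

Had the guard been unstable (cleared again by some component), the wait could be passed over and the progress argument would collapse, which is why the derivation method singles stability out.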

Relevance: 30.00%

Abstract:

This paper describes an experiment in the design of distributed programs. It is based on the theory of Owicki and Gries extended with rules for reasoning about message passing. The experiment is designed to test the effectiveness of the extended theory for designing distributed programs.

Relevance: 30.00%

Abstract:

Real-time control programs are often used in contexts where (conceptually) they run forever. Repetitions within such programs (or their specifications) may either (i) be guaranteed to terminate, (ii) be guaranteed to never terminate (loop forever), or (iii) may possibly terminate. In dealing with real-time programs and their specifications, we need to be able to represent these possibilities, and define suitable refinement orderings. A refinement ordering based on Dijkstra's weakest precondition only copes with the first alternative. Weakest liberal preconditions allow one to constrain behaviour provided the program terminates, which copes with the third alternative to some extent. However, neither of these handles the case when a program does not terminate. To handle this case a refinement ordering based on relational semantics can be used. In this paper we explore these issues and the definition of loops for real-time programs as well as corresponding refinement laws.
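The gap between the two transformers can be stated precisely with the standard weakest-precondition facts for a never-terminating loop (these are classical results, shown here as context rather than as the paper's notation): for every postcondition Q,

```latex
wp(\mathbf{do}\ \mathit{true} \rightarrow \mathit{skip}\ \mathbf{od},\; Q) \;=\; \mathit{false},
\qquad
wlp(\mathbf{do}\ \mathit{true} \rightarrow \mathit{skip}\ \mathbf{od},\; Q) \;=\; \mathit{true}.
```

Under wp, nothing satisfies a specification implemented by an intentionally nonterminating loop, while under wlp such a loop trivially satisfies every specification; neither transformer can express "this loop must run forever", which is the case the relational semantics is introduced to capture.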

Relevance: 30.00%

Abstract:

Research in verification and validation (V&V) for concurrent programs can be guided by practitioner information, so a survey was run to gather state-of-practice data in this context. The survey presented in this paper collected responses on V&V technology for concurrency from 35 practitioners. The results can help refine existing V&V technology by providing a better understanding of the context in which it is used. Responses to questions on the motivation for selecting V&V technologies can help refine a systematic approach to technology selection.