964 results for Compilers (Computer programs)
Abstract:
Issued June 1978.
Abstract:
"Work performed for the Air Force Flight Dynamics Laboratory...by the Aerodynamics Research Department of the Northrup Corporation, Aircraft Division."
Abstract:
"February 22, 1977."
Abstract:
Includes bibliographical references.
Abstract:
Background: Flexible video bronchoscopes, in particular the Olympus BF Type 3C160, are commonly used in paediatric respiratory medicine. There are no data on the magnification and distortion effects of these bronchoscopes, yet important clinical decisions are made from the images. The aim of this study was to systematically describe the magnification and distortion of flexible bronchoscope images taken at various distances from the object. Methods: Using images of known objects, processed by digital video and computer programs, both magnification and distortion scales were derived. Results: Magnification changes as a linear function between 100 mm (×1) and 10 mm (×9.55) and then as an exponential function between 10 mm and 3 mm (×40) from the object. Magnification depends on the orientation of the object relative to the optic (geometrical) axis of the bronchoscope. Magnification also varies across the field of view, the central magnification being 39% greater than at the periphery of the field of view at 15 mm from the object. However, in the paediatric setting the diameter of the orifices is usually less than 10 mm, which limits exposure to this peripheral reduction in magnification. Intraclass correlations for measurements and for repeatability studies between instruments are very high (r = 0.96). Distortion occurs as both barrel and geometric types, but both are heterogeneous across the field of view. Geometric distortion ranges up to 30% at 3 mm from the object but may be as low as 5%, depending on the position of the object in relation to the optic axis. Conclusion: We conclude that the optimal working distance is between 40 and 10 mm from the object. However, the clinician should be cognisant of variations in both magnification and distortion when making clinical judgements.
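The magnification figures quoted above can be illustrated with a short sketch. The anchor points (×1 at 100 mm, ×9.55 at 10 mm, ×40 at 3 mm) and the linear/exponential split at 10 mm come from the abstract; the exact interpolation below is an assumption for illustration only, not the authors' fitted model.

```python
import math

# Hypothetical piecewise model of on-axis magnification vs working distance,
# fitted only to the three anchor points quoted in the abstract:
# x1 at 100 mm, x9.55 at 10 mm, x40 at 3 mm.

def magnification(distance_mm: float) -> float:
    """Approximate on-axis magnification at a given working distance (mm)."""
    if distance_mm >= 10.0:
        # Linear segment between 100 mm (x1) and 10 mm (x9.55).
        return 1.0 + (9.55 - 1.0) * (100.0 - distance_mm) / (100.0 - 10.0)
    # Exponential segment between 10 mm (x9.55) and 3 mm (x40).
    k = math.log(40.0 / 9.55) / (10.0 - 3.0)   # growth rate per mm
    return 9.55 * math.exp(k * (10.0 - distance_mm))

if __name__ == "__main__":
    for d in (100, 40, 15, 10, 5, 3):
        print(f"{d:>4} mm -> x{magnification(d):.2f}")
```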
Abstract:
Despite the number of computer-assisted methods described for deriving the steady-state equations of enzyme systems, most are restricted to strict steady-state conditions or cannot handle complex reaction mechanisms. Moreover, many are based on computer programs that either are not readily available or have limitations. We present here a computer program called WinStes, which derives equations both for strict steady-state systems and for those treated under the rapid-equilibrium assumption, for branched or unbranched mechanisms containing both reversible and irreversible conversion steps. It solves reaction mechanisms involving up to 255 enzyme species connected by up to 255 conversion steps. The program provides all the advantages of Windows programs, such as a user-friendly graphical interface, and has a short computation time. WinStes is available free of charge on request from the authors. (c) 2006 Elsevier Inc. All rights reserved.
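As an illustration of the kind of symbolic derivation such tools perform, the sketch below derives the steady-state rate law for the simplest single-substrate mechanism using SymPy. WinStes itself is not shown here; the mechanism, symbol names and use of SymPy are assumptions for illustration only.

```python
# Illustrative steady-state derivation for E + S <-> ES -> E + P.
import sympy as sp

e, es, s, e0 = sp.symbols("E ES S E0", positive=True)
k1, km1, k2 = sp.symbols("k1 k_m1 k2", positive=True)

# Steady-state condition for the intermediate ES, plus enzyme conservation.
steady_state = sp.Eq(k1 * e * s - (km1 + k2) * es, 0)
conservation = sp.Eq(e0, e + es)

sol = sp.solve([steady_state, conservation], [e, es], dict=True)[0]
rate = sp.simplify(k2 * sol[es])   # v = k2 [ES]
print(rate)  # equivalent to the Michaelis-Menten form k2*E0*S / (S + (k_m1 + k2)/k1)
```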
Abstract:
Processor emulators are a software tool for allowing legacy computer programs to be executed on a modern processor. In the past, emulators have been used in trivial applications such as the maintenance of video games. Now, however, processor emulation is being applied to safety-critical control systems, including military avionics. These applications demand the utmost guarantees of correctness, but no verification techniques exist for proving that an emulated system preserves the original system's functional and timing properties. Here we show how this can be done by combining concepts previously used for reasoning about real-time program compilation with an understanding of the new and old software architectures. In particular, we show how both the old and new systems can be given a common semantics, allowing their behaviours to be compared directly.
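A minimal sketch of the "common semantics" idea: interpret instructions from both the legacy machine and the emulator as functions over a shared machine state, then compare the state transformations they induce. The tiny instruction set and state representation below are invented for illustration and are not the verification framework described in the abstract.

```python
from typing import Callable, Dict, Iterable

State = Dict[str, int]                  # register name -> 32-bit value
Semantics = Callable[[State], State]    # denotation of one instruction

MASK = 0xFFFFFFFF                       # both machines assumed 32-bit

def legacy_add(rd: str, rs: str) -> Semantics:
    # Denotation of the legacy processor's ADD: 32-bit wrap-around addition.
    return lambda st: {**st, rd: (st[rd] + st[rs]) & MASK}

def emulated_add(rd: str, rs: str) -> Semantics:
    # The emulator computes in wider host arithmetic, then truncates; on the
    # shared state its denotation must coincide with the legacy one.
    return lambda st: {**st, rd: ((st[rd] + st[rs]) % 2**64) & MASK}

def same_behaviour(p: Semantics, q: Semantics, states: Iterable[State]) -> bool:
    """Check that two denotations agree on a set of test states."""
    return all(p(dict(st)) == q(dict(st)) for st in states)

tests = [{"r0": 1, "r1": 2}, {"r0": MASK, "r1": 1}]
print(same_behaviour(legacy_add("r0", "r1"), emulated_add("r0", "r1"), tests))  # True
```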
Abstract:
The increasing use of information and communications technologies among government departments and non-government agencies has fundamentally changed the implementation of employment services policy in Australia. The administrative arrangements for governing unemployment and unemployed people are now constituted by a complex contractual interplay between government departments as ‘purchasers’ and a range of small and large private organizations as ‘providers’. Assessing, tracking and monitoring the activities of unemployed people through the various parts of the employment services system has been made possible by developments in information technology and tailored computer programs. Consequently, the discretionary capacity that is traditionally associated with ‘street-level bureaucracy’ has been partly transformed into more prescriptive forms of ‘screen-level bureaucracy’. The knowledge embedded in these new computer-based technologies is considered superior because it is based on ‘objective calculations’, rather than subjective assessments of individual employees. The relationship between the sociopolitical context of unemployment policy and emerging forms of e-government is explored using illustrative findings from a qualitative pilot study undertaken in two Australian sites. The findings suggest that some of the new technologies in the employment services system are welcomed, while other applications are experienced as contradictory to the aims of delivering a personalized and respectful service.
Abstract:
Computer-based, socio-technical systems projects are frequently failures. In particular, computer-based information systems often fail to live up to their promise. Part of the problem lies in the uncertainty of the effect of combining the subsystems that comprise the complete system; i.e. the system's emergent behaviour cannot be predicted from knowledge of the subsystems. This paper suggests that uncertainty management is a fundamental unifying concept in the analysis and design of complex systems, and goes on to indicate that this is due to the co-evolutionary nature of the requirements and implementation of socio-technical systems. The paper presents a model of the propagation of a system change which indicates that the introduction of two or more changes over time can cause chaotic emergent behaviour.
Abstract:
Software simulation models are computer programs that need to be verified and debugged like any other software. In previous work, a method for error isolation in simulation models has been proposed. The method relies on a set of feature matrices that can be used to determine which part of the model implementation is responsible for deviations in the output of the model. Currently these feature matrices have to be generated by hand from the model implementation, which is a tedious and error-prone task. In this paper, a method based on mutation analysis, together with prototype tool support, is presented for verifying the manually generated feature matrices. The application of the method and tool to a model for wastewater treatment shows that the feature matrices can be verified effectively using a minimal number of mutants.
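A minimal sketch of how mutation analysis can check a manually written feature matrix: mutate one part of the model at a time, rerun it, and verify that only the outputs the matrix attributes to that part have changed. The toy Monod-style model, parameter names and perturbation are assumptions for illustration, not the actual method or tool from the paper.

```python
def model(p):
    # Toy rates loosely inspired by a wastewater-treatment model.
    growth = p["mu_max"] * p["S"] / (p["K_S"] + p["S"])
    decay = p["b"] * p["X"]
    return {"growth_rate": growth, "decay_rate": decay}

# Hypothetical feature matrix: which outputs each model part (here, each
# parameter) is supposed to influence.
feature_matrix = {
    "mu_max": {"growth_rate"},
    "K_S": {"growth_rate"},
    "b": {"decay_rate"},
}

def verify_matrix(baseline):
    base = model(baseline)
    for part, expected in feature_matrix.items():
        mutant = dict(baseline)
        mutant[part] *= 1.5          # one simple value mutation per model part
        changed = {k for k, v in model(mutant).items() if v != base[k]}
        status = "OK" if changed == expected else f"mismatch: matrix says {expected}, run changed {changed}"
        print(f"{part}: {status}")

verify_matrix({"mu_max": 4.0, "S": 2.0, "K_S": 0.5, "X": 1.0, "b": 0.1})
```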
Abstract:
In ordinary computer programs, the relationship between data in a machine and the concepts those data represent is defined arbitrarily by the programmer. It is argued here that the Strong AI hypothesis suggests that no such arbitrariness is possible in the relationship between brain states and mental experiences, and that this may place surprising limitations on the possible variety of mental experiences. Possible psychology experiments are sketched which aim to falsify the Strong AI hypothesis by indicating that these limits can be exceeded. It is concluded that although such experiments might be valuable, they are unlikely to succeed in this aim.
Abstract:
The thesis is concerned with the development and testing of a mathematical model of a distillation process in which the components react chemically. The formaldehyde-methanol-water system was selected, and only the reversible reactions between formaldehyde and water giving methylene glycol and between formaldehyde and methanol producing hemiformal were assumed to occur under the distillation conditions. Accordingly, the system has been treated as a five-component system. The vapour-liquid equilibrium calculations were performed by iteratively solving the thermodynamic relationships expressing the phase equilibria together with the stoichiometric equations expressing the chemical equilibria. Using optimisation techniques, Wilson single parameters and Henry's constants were calculated for the binary systems containing formaldehyde, which was assumed to be a supercritical component, whilst Wilson binary parameters were calculated for the remaining binary systems. The phase equilibria for the formaldehyde system could thus be calculated using these parameters, and good accuracy was obtained when calculated values were compared with experimental values. The distillation process was modelled using the mass and energy balance equations together with the phase equilibria calculations. The plate efficiencies were obtained from a modified A.I.Ch.E. Bubble Tray method. The resulting equations were solved by an iterative plate-to-plate calculation based on the Newton-Raphson method. Experiments were carried out in a 76 mm I.D., eight sieve-plate distillation column and the results were compared with the mathematical model calculations. Overall, good agreement was obtained, but some discrepancies were observed in the concentration profiles; these may have been caused by limited physical property data and a limited understanding of the reaction mechanisms. The model equations were solved in the form of modular computer programs. Although they were written to describe steady-state distillation with simultaneous chemical reaction for the formaldehyde system, the approach used may be of wider application.
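As an illustration of the iterative style of calculation described above, the sketch below uses a Newton-Raphson iteration to solve a single chemical-equilibrium relation of the formaldehyde + water <-> methylene glycol type for the extent of reaction. The equilibrium constant, feed composition and numerical derivative are illustrative assumptions, not values or code from the thesis.

```python
def newton(f, dfdx, x0, tol=1e-10, max_iter=50):
    """Basic Newton-Raphson root finder."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / dfdx(x)
        x -= step
        if abs(step) < tol:
            return x
    raise RuntimeError("Newton-Raphson did not converge")

K = 30.0                        # illustrative equilibrium constant (mole-fraction basis)
n_f, n_w, n_g = 1.0, 5.0, 0.0   # initial moles: formaldehyde, water, methylene glycol

def residual(xi):
    total = n_f + n_w + n_g - xi        # one mole lost per reaction event
    x_f = (n_f - xi) / total
    x_w = (n_w - xi) / total
    x_g = (n_g + xi) / total
    return x_g - K * x_f * x_w          # equilibrium condition: x_g = K * x_f * x_w

def d_residual(xi, h=1e-8):             # numerical derivative keeps the sketch short
    return (residual(xi + h) - residual(xi - h)) / (2 * h)

xi_eq = newton(residual, d_residual, x0=0.5)
print(f"extent of reaction at equilibrium: {xi_eq:.4f} mol")
```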
Abstract:
Computer programs have been developed to enable the coordination of fuses and overcurrent relays for radial power systems under estimated fault current conditions. The grading curves for these protection devices can be produced on a graphics terminal and a hard copy can be obtained. Additional programs have also been developed which could be used to assess the validity of relay settings (obtained under the above conditions) when the transient effect is included. Modelling of a current transformer is included because transformer saturation may occur if the fault current is high, and hence the secondary current is distorted. Experiments were carried out to confirm that distorted currents will affect the relay operating time, and it is shown that if the relay current contains only a small percentage of harmonic distortion, the relay operating time is increased. System equations were arranged to enable the model to predict fault currents with a generator transformer incorporated in the system, and also to include the effect of circuit breaker opening, arcing resistance, and earthing resistance. A fictitious field winding was included to enable more accurate prediction of fault currents when the system is operating at both lagging and leading power factors prior to the occurrence of the fault.
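As a small illustration of the grading calculations such programs perform, the sketch below evaluates the IEC standard-inverse relay characteristic, t = TMS × 0.14 / ((I/Is)^0.02 − 1), for two relays and checks a discrimination margin. The settings, fault current and 0.4 s margin are illustrative assumptions rather than values from the thesis, and this characteristic is only one common inverse-time curve.

```python
def operating_time(fault_current, pickup_current, tms):
    """IEC standard-inverse operating time in seconds."""
    ratio = fault_current / pickup_current
    if ratio <= 1.0:
        return float("inf")              # relay does not pick up
    return tms * 0.14 / (ratio ** 0.02 - 1.0)

fault_current = 2000.0                   # A, estimated fault current at the fault point
downstream = operating_time(fault_current, pickup_current=200.0, tms=0.1)
upstream = operating_time(fault_current, pickup_current=400.0, tms=0.2)

margin = upstream - downstream           # discrimination (grading) margin
print(f"downstream {downstream:.2f} s, upstream {upstream:.2f} s, margin {margin:.2f} s")
print("grading OK" if margin >= 0.4 else "grading margin too small")
```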