117 results for Foundations Computer programs


Relevance: 80.00%

Abstract:

A Geographic Information System (GIS) was used to model datasets of Leyte Island, the Philippines, to identify land which was suitable for a forest extension program on the island. The datasets were modelled to provide maps of the distance of land from cities and towns, land at elevations and slopes suitable for smallholder forestry, and land of various soil types. An expert group was used to assign numeric site suitabilities to the soil types, and maps of site suitability were used to assist the selection of municipalities for the provision of extension assistance to smallholders. Modelling of the datasets was facilitated by recent developments of the ArcGIS® suite of computer programs, and derivation of elevation and slope was assisted by the availability of digital elevation models (DEM) produced by the Shuttle Radar Topography Mission (SRTM). The usefulness of GIS software as a decision support tool for small-scale forestry extension programs is discussed.
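The core of the approach described above is a suitability overlay: each land unit is scored from expert-assigned soil ratings, screened against elevation and slope constraints, and discounted by distance from towns. A minimal sketch of that idea is below; the soil classes, thresholds, and weights are illustrative assumptions, not values from the study.

```python
# Hypothetical sketch of a GIS-style suitability overlay for one raster
# cell. Soil scores, elevation band, slope limit, and distance decay are
# all invented for illustration.

SOIL_SUITABILITY = {"clay_loam": 0.9, "sandy": 0.4, "rocky": 0.1}  # expert scores

def cell_suitability(soil, elevation_m, slope_deg, dist_to_town_km):
    """Return 0 if hard constraints fail, otherwise the expert soil
    score discounted linearly by distance to the nearest town."""
    if not (0 <= elevation_m <= 500):   # illustrative elevation band
        return 0.0
    if slope_deg > 18:                  # illustrative slope limit
        return 0.0
    distance_factor = max(0.0, 1.0 - dist_to_town_km / 50.0)
    return SOIL_SUITABILITY.get(soil, 0.0) * distance_factor

score = cell_suitability("clay_loam", elevation_m=120,
                         slope_deg=8, dist_to_town_km=10)
```

Applied cell by cell over the DEM-derived slope and elevation rasters, a function like this yields the site-suitability maps the abstract refers to.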

Relevance: 80.00%

Abstract:

The second edition of An Introduction to Efficiency and Productivity Analysis is designed to be a general introduction for those who wish to study efficiency and productivity analysis. The book provides an accessible, well-written introduction to the four principal methods involved: econometric estimation of average response models, index numbers, data envelopment analysis (DEA), and stochastic frontier analysis (SFA). For each method, a detailed introduction to the basic concepts is presented, numerical examples are provided, and some of the more important extensions to the basic methods are discussed. Of special interest is the systematic use of detailed empirical applications using real-world data throughout the book. In recent years, a number of excellent advanced-level books have been published on performance measurement. This book, however, is the first systematic survey of performance measurement with the express purpose of introducing the field to a wide audience of students, researchers, and practitioners. Indeed, the second edition maintains its uniqueness: (1) it is a well-written introduction to the field; (2) it outlines, discusses, and compares the four principal methods for efficiency and productivity analysis in a well-motivated presentation; and (3) it provides detailed advice on computer programs that can be used to implement these performance measurement methods. The book contains computer instructions and output listings for the SHAZAM, LIMDEP, TFPIP, DEAP and FRONTIER computer programs. More extensive listings of data and computer instruction files are available on the book's website: (www.uq.edu.au/economics/cepa/crob2005).
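Of the four methods listed, index numbers are the simplest to illustrate in a few lines. The sketch below computes a Törnqvist quantity index, a standard index-number formula (the geometric mean of quantity ratios weighted by average value shares); the two-commodity data are invented for the example and are not from the book.

```python
import math

def tornqvist_index(q0, q1, v0, v1):
    """Törnqvist quantity index between period 0 and period 1.
    q0, q1: quantities of each commodity; v0, v1: value (revenue or
    cost) of each commodity, used to form value shares."""
    s0 = [v / sum(v0) for v in v0]   # period-0 value shares
    s1 = [v / sum(v1) for v in v1]   # period-1 value shares
    log_index = sum(0.5 * (a + b) * math.log(y / x)
                    for a, b, x, y in zip(s0, s1, q0, q1))
    return math.exp(log_index)

# Both outputs grow by exactly 10%, so the index should be 1.10.
idx = tornqvist_index(q0=[100, 200], q1=[110, 220],
                      v0=[50, 50], v1=[55, 55])
```

DEA and SFA require linear programming and maximum-likelihood estimation respectively, which is why the book points readers to dedicated programs such as DEAP and FRONTIER for those methods.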

Relevance: 80.00%

Abstract:

The World Wide Web (WWW) is useful for distributing scientific data. Most existing web data resources organize their information either in structured flat files or relational databases with basic retrieval capabilities. For databases with one or a few simple relations, these approaches are successful, but they can be cumbersome when there is a data model involving multiple relations between complex data. We believe that knowledge-based resources offer a solution in these cases. Knowledge bases have explicit declarations of the concepts in the domain, along with the relations between them. They are usually organized hierarchically, and provide a global data model with a controlled vocabulary. We have created the OWEB architecture for building online scientific data resources using knowledge bases. OWEB provides a shell for structuring data, providing secure and shared access, and creating computational modules for processing and displaying data. In this paper, we describe the translation of the online immunological database MHCPEP into an OWEB system called MHCWeb. This effort involved building a conceptual model for the data, creating a controlled terminology for the legal values for different types of data, and then translating the original data into the new structure. The OWEB environment allows for flexible access to the data by both users and computer programs.
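The controlled-terminology step described above can be sketched very simply: the legal values for each field are declared once, and incoming records are validated against them before entering the knowledge base. The field names and vocabularies below are hypothetical illustrations, not taken from MHCWeb itself.

```python
# Hedged sketch of controlled-vocabulary validation. The fields and
# legal values are invented for the example.

CONTROLLED_TERMS = {
    "species": {"human", "mouse", "rat"},
    "binding_level": {"high", "moderate", "low", "none"},
}

def validate_record(record):
    """Return a list of (field, value) pairs violating the vocabulary."""
    errors = []
    for field, legal in CONTROLLED_TERMS.items():
        value = record.get(field)
        if value is not None and value not in legal:
            errors.append((field, value))
    return errors

ok = validate_record({"species": "human", "binding_level": "high"})
bad = validate_record({"species": "human", "binding_level": "strong"})
```

Centralizing the vocabulary this way is what lets both human users and other computer programs query the resource without guessing at free-text spellings.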

Relevance: 80.00%

Abstract:

Qualitative data analysis (QDA) is often a time-consuming and laborious process usually involving the management of large quantities of textual data. Recently developed computer programs offer great advances in the efficiency of the processes of QDA. In this paper we report on an innovative use of a combination of extant computer software technologies to further enhance and simplify QDA. Used in appropriate circumstances, we believe that this innovation greatly enhances the speed with which theoretical and descriptive ideas can be abstracted from rich, complex, and chaotic qualitative data. © 2001 Human Sciences Press, Inc.
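A core mechanical task that QDA software automates is coding: tagging passages of text with categories from a codebook. The sketch below shows that idea at its most minimal, using keyword patterns; the codes and patterns are invented for the example and are far cruder than the techniques the paper describes.

```python
import re
from collections import defaultdict

# Hypothetical illustration of computer-assisted coding: each passage is
# tagged with every code whose keyword pattern matches it.

CODEBOOK = {
    "support": re.compile(r"\b(help|support|assist)\w*", re.IGNORECASE),
    "barrier": re.compile(r"\b(barrier|obstacle|difficult)\w*", re.IGNORECASE),
}

def code_passages(passages):
    """Map each code to the indices of the passages it applies to."""
    coded = defaultdict(list)
    for i, text in enumerate(passages):
        for code, pattern in CODEBOOK.items():
            if pattern.search(text):
                coded[code].append(i)
    return dict(coded)

result = code_passages([
    "The staff were very helpful.",
    "Funding was the main obstacle we faced.",
])
```

Even this crude automation shows why software support speeds up the management of large textual datasets: retrieval by code becomes a lookup rather than a re-reading exercise.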

Relevance: 80.00%

Abstract:

This communication describes an electromagnetic model of a radial line planar antenna consisting of a radial guide with one central probe and many peripheral probes arranged in concentric circles, feeding an array of antenna elements such as patches or wire curls. The model takes into account interactions between the coupling probes while assuming isolation of radiating elements. Based on this model, computer programs are developed to determine equivalent circuit parameters of the feed network and the radiation pattern of the radial line planar antenna. Comparisons are made between the present model and the two-probe model developed earlier by other researchers.
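Under the stated assumption of isolated radiating elements, the radiation pattern of such an array reduces to the standard array factor over the element positions. The sketch below evaluates that array factor for elements on two concentric circles with uniform in-phase excitation; the geometry and excitation are illustrative assumptions, not the paper's model (which additionally accounts for probe coupling in the feed).

```python
import cmath
import math

# Hedged sketch: array factor of in-phase elements placed on concentric
# circles. Radii, element counts, and uniform excitation are invented.

def ring_positions(radius, n):
    """(x, y) positions of n elements evenly spaced on a circle."""
    return [(radius * math.cos(2 * math.pi * k / n),
             radius * math.sin(2 * math.pi * k / n)) for k in range(n)]

def array_factor(positions, theta, phi, wavelength=1.0):
    """|AF| in direction (theta, phi) for unit in-phase excitations."""
    k = 2 * math.pi / wavelength
    u = math.sin(theta) * math.cos(phi)
    v = math.sin(theta) * math.sin(phi)
    return abs(sum(cmath.exp(1j * k * (x * u + y * v))
                   for x, y in positions))

elements = ring_positions(0.5, 8) + ring_positions(1.0, 16)
boresight = array_factor(elements, theta=0.0, phi=0.0)
```

At boresight (theta = 0) all 24 elements add in phase, so |AF| equals the element count; the paper's contribution lies in computing the actual, non-uniform probe excitations from the equivalent circuit of the feed.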

Relevance: 80.00%

Abstract:

Background: Flexible video bronchoscopes, in particular the Olympus BF Type 3C160, are commonly used in paediatric respiratory medicine. There are no data on the magnification and distortion effects of these bronchoscopes, yet important clinical decisions are made from the images. The aim of this study was to systematically describe the magnification and distortion of flexible bronchoscope images taken at various distances from the object. Methods: Using images of known objects, processed by digital video and computer programs, both magnification and distortion scales were derived. Results: Magnification changes as a linear function between 100 mm (×1) and 10 mm (×9.55), and then as an exponential function between 10 mm and 3 mm (×40) from the object. Magnification depends on the axis of orientation of the object to the optic axis or geometrical axis of the bronchoscope. Magnification also varies across the field of view, with the central magnification being 39% greater than at the periphery of the field of view at 15 mm from the object. However, in the paediatric situation the diameter of the orifices is usually less than 10 mm, which limits exposure to these peripheral reductions in magnification. Intraclass correlations for measurements and repeatability studies between instruments are very high, r = 0.96. Distortion occurs as both barrel and geometric types, but both types are heterogeneous across the field of view. Distortion of the geometric type ranges up to 30% at 3 mm from the object but may be as low as 5%, depending on the position of the object in relation to the optic axis. Conclusion: We conclude that the optimal working distance range is between 40 and 10 mm from the object. However, the clinician should be cognisant of both variations in magnification and distortion when making clinical judgements.
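The reported magnification-versus-distance relationship can be written as a piecewise function: linear from 100 mm (×1) to 10 mm (×9.55), then exponential down to 3 mm (×40). The sketch below fits each branch through those endpoint values only; it is an interpolation of the abstract's summary figures, not the study's fitted curve.

```python
import math

# Hedged sketch: piecewise magnification model built solely from the
# endpoint values quoted in the abstract (100 mm -> x1, 10 mm -> x9.55,
# 3 mm -> x40).

def magnification(d_mm):
    """Approximate on-axis magnification at distance d_mm from the object."""
    if d_mm >= 10.0:
        # Linear branch through (100 mm, 1.0) and (10 mm, 9.55).
        slope = (9.55 - 1.0) / (10.0 - 100.0)
        return 1.0 + slope * (d_mm - 100.0)
    # Exponential branch m = a * exp(b * d) through (10, 9.55) and (3, 40).
    b = math.log(40.0 / 9.55) / (3.0 - 10.0)
    a = 9.55 / math.exp(b * 10.0)
    return a * math.exp(b * d_mm)
```

The steep exponential branch below 10 mm is what makes measurements near the object so sensitive to working distance, consistent with the recommended 40-10 mm range.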

Relevance: 80.00%

Abstract:

Despite the number of computer-assisted methods described for the derivation of steady-state equations of enzyme systems, most of them are focused on strict steady-state conditions or are not able to solve complex reaction mechanisms. Moreover, many of them are based on computer programs that are either not readily available or have limitations. We present here a computer program called WinStes, which derives equations for both strict steady-state systems and those with the assumption of rapid equilibrium, for branched or unbranched mechanisms, containing both reversible and irreversible conversion steps. It solves reaction mechanisms involving up to 255 enzyme species, connected by up to 255 conversion steps. The program provides all the advantages of Windows programs, such as a user-friendly graphical interface, and has a short computation time. WinStes is available free of charge on request from the authors. (c) 2006 Elsevier Inc. All rights reserved.
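For context on what such a program derives: for the simplest unbranched mechanism, E + S ⇌ ES → E + P, the strict steady-state assumption d[ES]/dt = 0 yields the familiar Michaelis-Menten rate law. The sketch below evaluates that hand-derived result numerically; the rate constants are arbitrary example values, and WinStes itself handles far larger mechanisms symbolically.

```python
# Minimal worked example of a steady-state rate equation. For
# E + S <-> ES -> E + P, setting d[ES]/dt = 0 gives
# v = k_cat * E0 * S / (Km + S) with Km = (k_off + k_cat) / k_on.
# All rate constants below are arbitrary illustrative values.

def michaelis_menten_rate(s, e_total, k_on, k_off, k_cat):
    """Steady-state rate for the one-intermediate mechanism."""
    km = (k_off + k_cat) / k_on
    return k_cat * e_total * s / (km + s)

v = michaelis_menten_rate(s=5.0, e_total=1.0,
                          k_on=10.0, k_off=8.0, k_cat=2.0)
```

For branched mechanisms with hundreds of enzyme species, this kind of by-hand derivation becomes intractable, which is the gap programs like WinStes address.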

Relevance: 80.00%

Abstract:

Processor emulators are a software tool for allowing legacy computer programs to be executed on a modern processor. In the past emulators have been used in trivial applications such as maintenance of video games. Now, however, processor emulation is being applied to safety-critical control systems, including military avionics. These applications demand utmost guarantees of correctness, but no verification techniques exist for proving that an emulated system preserves the original system’s functional and timing properties. Here we show how this can be done by combining concepts previously used for reasoning about real-time program compilation, coupled with an understanding of the new and old software architectures. In particular, we show how both the old and new systems can be given a common semantics, thus allowing their behaviours to be compared directly.
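The idea of giving old and new systems a common semantics so their behaviours can be compared directly can be sketched at a toy scale: interpret the same legacy program under two implementations and compare observable states. The two-opcode instruction set below is invented purely for illustration; it is not the avionics architecture the paper treats.

```python
# Hedged sketch: two implementations of a tiny, hypothetical 8-bit
# legacy instruction set. If they agree on observable state for the
# same program, they are behaviourally equivalent on that program.

def run_legacy(program, acc=0):
    """Reference semantics: 8-bit accumulator with bitwise wrap."""
    for op, arg in program:
        if op == "ADD":
            acc = (acc + arg) & 0xFF
        elif op == "SHL":
            acc = (acc << arg) & 0xFF
    return acc

def run_emulated(program, acc=0):
    """Emulated semantics: same behaviour via arithmetic modulo 256."""
    for op, arg in program:
        if op == "ADD":
            acc = (acc + arg) % 256
        elif op == "SHL":
            acc = (acc * (2 ** arg)) % 256
    return acc

prog = [("ADD", 200), ("SHL", 1), ("ADD", 100)]
equivalent = run_legacy(prog) == run_emulated(prog)
```

Testing agreement on sample programs is of course much weaker than the proof-based comparison the paper advocates; a verification argument must show the two semantics coincide for all programs, including their timing properties.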

Relevance: 80.00%

Abstract:

The increasing use of information and communications technologies among government departments and non-government agencies has fundamentally changed the implementation of employment services policy in Australia. The administrative arrangements for governing unemployment and unemployed people are now constituted by a complex contractual interplay between government departments as ‘purchasers’ and a range of small and large private organizations as ‘providers’. Assessing, tracking and monitoring the activities of unemployed people through the various parts of the employment services system has been made possible by developments in information technology and tailored computer programs. Consequently, the discretionary capacity that is traditionally associated with ‘street-level bureaucracy’ has been partly transformed into more prescriptive forms of ‘screen-level bureaucracy’. The knowledge embedded in these new computer-based technologies is considered superior because it is based on ‘objective calculations’, rather than subjective assessments of individual employees. The relationship between the sociopolitical context of unemployment policy and emerging forms of e-government is explored using illustrative findings from a qualitative pilot study undertaken in two Australian sites. The findings suggest that some of the new technologies in the employment services system are welcomed, while other applications are experienced as contradictory to the aims of delivering a personalized and respectful service.

Relevance: 80.00%

Abstract:

Computer-based, socio-technical systems projects are frequently failures. In particular, computer-based information systems often fail to live up to their promise. Part of the problem lies in the uncertainty of the effect of combining the subsystems that comprise the complete system; i.e. the system's emergent behaviour cannot be predicted from a knowledge of the subsystems. This paper suggests uncertainty management is a fundamental unifying concept in analysis and design of complex systems and goes on to indicate that this is due to the co-evolutionary nature of the requirements and implementation of socio-technical systems. The paper shows a model of the propagation of a system change that indicates that the introduction of two or more changes over time can cause chaotic emergent behaviour.
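The claim that repeated changes can produce chaotic emergent behaviour has a standard abstract illustration: a simple nonlinear propagation rule, iterated over time, shows sensitive dependence on initial conditions. The logistic-map rule below is a stand-in for that phenomenon, not the paper's actual change-propagation model.

```python
# Hedged, abstract sketch of sensitive dependence: two almost identical
# initial "change impacts" are propagated through the same nonlinear
# rule. The logistic map is used as a generic chaotic iteration; it is
# not the model from the paper.

def propagate(impact, steps, r=3.9):
    """Iterate a nonlinear propagation rule; impact stays in (0, 1)."""
    for _ in range(steps):
        impact = r * impact * (1 - impact)
    return impact

a = propagate(0.200, 50)
b = propagate(0.201, 50)   # a second, almost identical change
gap = abs(a - b)           # typically far larger than the initial 0.001
```

The point of the analogy: when system responses are nonlinear and changes compound over time, long-range prediction of emergent behaviour from subsystem knowledge breaks down, which is why the paper argues for managing uncertainty rather than assuming it away.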

Relevance: 80.00%

Abstract:

Software simulation models are computer programs that need to be verified and debugged like any other software. In previous work, a method for error isolation in simulation models was proposed. The method relies on a set of feature matrices that can be used to determine which part of the model implementation is responsible for deviations in the output of the model. Currently these feature matrices have to be generated by hand from the model implementation, which is a tedious and error-prone task. In this paper, a method based on mutation analysis, as well as prototype tool support for the verification of the manually generated feature matrices, is presented. The application of the method and tool to a model for wastewater treatment shows that the feature matrices can be verified effectively using a minimal number of mutants.
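The mutation-analysis idea can be sketched in miniature: generate a mutant of a model fragment (here, one operator swapped by hand) and check whether its output deviates on sample inputs; if a feature matrix claims that fragment affects a given output, the mutant's deviation should be visible there. The model fragment and inputs below are invented, not from the wastewater-treatment model.

```python
# Hedged sketch of mutation analysis. The "model fragment" and the
# single hand-made mutant are illustrative only.

def settling_rate(conc, k):          # original model fragment
    return k * conc

def settling_rate_mutant(conc, k):   # mutant: '*' replaced by '+'
    return k + conc

def deviation_detected(original, mutant, inputs, tol=1e-9):
    """True if any sample input exposes the mutant's behaviour change."""
    return any(abs(original(c, k) - mutant(c, k)) > tol
               for c, k in inputs)

killed = deviation_detected(settling_rate, settling_rate_mutant,
                            [(2.0, 0.5), (10.0, 0.5)])
```

In the paper's setting the interesting direction is the converse: a mutant whose deviation does not appear where the hand-written feature matrix predicts flags a likely error in the matrix.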