188 results for Source code (Computer science)
Abstract:
With the advent of object-oriented languages and the portability of Java, the development and use of class libraries has become widespread. Effective class reuse depends on class reliability which in turn depends on thorough testing. This paper describes a class testing approach based on modeling each test case with a tuple and then generating large numbers of tuples to thoroughly cover an input space with many interesting combinations of values. The testing approach is supported by the Roast framework for the testing of Java classes. Roast provides automated tuple generation based on boundary values, unit operations that support driver standardization, and test case templates used for code generation. Roast produces thorough, compact test drivers with low development and maintenance cost. The framework and tool support are illustrated on a number of non-trivial classes, including a graphical user interface policy manager. Quantitative results are presented to substantiate the practicality and effectiveness of the approach. Copyright (C) 2002 John Wiley & Sons, Ltd.
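For illustration, a minimal sketch of the tuple-based idea (hypothetical names, not Roast's actual API): boundary values for each parameter are combined into a cross product of test tuples, one tuple per test case.

    import java.util.ArrayList;
    import java.util.Arrays;
    import java.util.List;

    // Sketch of boundary-value tuple generation (hypothetical, not the
    // Roast API): form the cross product of each parameter's boundary
    // values; each resulting tuple models one test case.
    public class TupleGen {
        static List<int[]> crossProduct(int[]... boundaries) {
            List<int[]> tuples = new ArrayList<>();
            tuples.add(new int[0]);
            for (int[] values : boundaries) {
                List<int[]> next = new ArrayList<>();
                for (int[] prefix : tuples) {
                    for (int v : values) {
                        int[] t = Arrays.copyOf(prefix, prefix.length + 1);
                        t[prefix.length] = v;
                        next.add(t);
                    }
                }
                tuples = next;
            }
            return tuples;
        }

        public static void main(String[] args) {
            // Boundary values for two int parameters of a method under test.
            List<int[]> cases = crossProduct(
                new int[]{Integer.MIN_VALUE, -1, 0, 1, Integer.MAX_VALUE},
                new int[]{0, 1, 100});
            System.out.println(cases.size() + " test tuples"); // prints: 15 test tuples
        }
    }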
Abstract:
The rise of component-based software development has created an urgent need for effective application program interface (API) documentation. Experience has shown that it is hard to create precise and readable documentation. Prose documentation can provide a good overview but lacks precision. Formal methods offer precision but the resulting documentation is expensive to develop. Worse, few developers have the skill or inclination to read formal documentation. We present a pragmatic solution to the problem of API documentation. We augment the prose documentation with executable test cases, including expected outputs, and use the prose plus the test cases as the documentation. With appropriate tool support, the test cases are easy to develop and read. Such test cases constitute a completely formal, albeit partial, specification of input/output behavior. Equally important, consistency between code and documentation is demonstrated by running the test cases. This approach provides an attractive bridge between formal and informal documentation. We also present a tool that supports compact and readable test cases and the generation of test drivers and documentation, and we illustrate the approach with detailed case studies. (C) 2002 Elsevier Science Inc. All rights reserved.
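For example (an illustrative sketch, not the authors' tool), a usage example can state its expected outputs and run as an executable check; run with java -ea so the assertions fire.

    import java.util.Stack;

    // Executable documentation sketch (hypothetical example): each usage
    // example states its expected output and is run as a test, so
    // consistency between code and documentation can be demonstrated.
    public class StackDocExample {
        public static void main(String[] args) {
            Stack<String> s = new Stack<>();
            s.push("a");
            s.push("b");
            assert s.pop().equals("b"); // expected output: "b" (LIFO order)
            assert s.pop().equals("a"); // expected output: "a"
            assert s.isEmpty();         // expected output: empty stack
            System.out.println("documentation examples verified");
        }
    }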
Abstract:
The integration of geo-information from multiple sources and of diverse nature in developing mineral favourability indexes (MFIs) is a well-known problem in mineral exploration and mineral resource assessment. Fuzzy set theory provides a convenient framework to combine and analyse qualitative and quantitative data independently of their source or characteristics. A novel, data-driven formulation for calculating MFIs based on fuzzy analysis is developed in this paper. Different geo-variables are considered as fuzzy sets and their appropriate membership functions are defined and modelled. A new weighted average-type aggregation operator is then introduced to generate a new fuzzy set representing mineral favourability. The membership grades of the new fuzzy set are taken as the MFI. The weights for the aggregation operation combine the individual membership functions of the geo-variables, and are derived using information from training areas and L1 regression. The technique is demonstrated in a case study of skarn tin deposits and is used to integrate geological, geochemical and magnetic data. The study area covers a total of 22.5 km² and is divided into 349 cells, which include nine control cells. Nine geo-variables are considered in this study. Depending on the nature of the various geo-variables, four different types of membership functions are used to model the fuzzy membership of the geo-variables involved. (C) 2002 Elsevier Science Ltd. All rights reserved.
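A minimal sketch of a weighted-average aggregation of membership grades (illustrative only; the grades and weights below are invented, whereas the paper derives its weights from training cells by regression):

    // Weighted-average aggregation of fuzzy membership grades into a
    // favourability index (illustrative values, not the paper's data).
    public class FuzzyMFI {
        // m[i]: membership grade of geo-variable i in [0, 1]; w[i]: its weight.
        static double favourability(double[] m, double[] w) {
            double num = 0, den = 0;
            for (int i = 0; i < m.length; i++) {
                num += w[i] * m[i];
                den += w[i];
            }
            return num / den; // membership grade of the aggregated fuzzy set = MFI
        }

        public static void main(String[] args) {
            double[] grades  = {0.8, 0.3, 0.6}; // e.g. geological, geochemical, magnetic
            double[] weights = {0.5, 0.2, 0.3}; // e.g. fitted from control cells
            System.out.println("MFI = " + favourability(grades, weights)); // 0.64
        }
    }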
Abstract:
Purpose: The aim of this project was to design and evaluate a system that would produce tailored information for stroke patients and their carers, customised according to their informational needs, and facilitate communication between the patient and health professional. Method: A human factors development approach was used to develop a computer system, which dynamically compiles stroke education booklets for patients and carers. Patients and carers are able to select the topics about which they wish to receive information, the amount of information they want, and the font size of the printed booklet. The system is designed so that the health professional interacts with it, thereby providing opportunities for communication between the health professional and patient/carer at a number of points in time. Results: Preliminary evaluation of the system by health professionals, patients and carers was positive. A randomised controlled trial that examines the effect of the system on patient and carer outcomes is underway. (C) 2004 Elsevier Ireland Ltd. All rights reserved.
Abstract:
The design of liquid-retaining structures involves many decisions to be made by the designer based on rules of thumb, heuristics, judgement, codes of practice and previous experience. Structural design problems are often ill-structured, and there is a need to develop programming environments that can incorporate engineering judgement along with algorithmic tools. Recent developments in artificial intelligence have made it possible to develop an expert system that can provide expert advice to the user in the selection of design criteria and design parameters. This paper introduces the development of an expert system for the design of liquid-retaining structures using a blackboard architecture. An expert system shell, Visual Rule Studio, is employed to facilitate the development of this prototype system. It is a coupled system combining symbolic processing with traditional numerical processing. The expert system developed is based on the British Standards Code of Practice BS8007. Explanations are provided to assist inexperienced designers or civil engineering students in learning how to design liquid-retaining structures effectively and sustainably in their design practice. The use of this expert system in disseminating heuristic knowledge and experience to practitioners and engineering students is discussed.
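A toy sketch of the blackboard idea in general (generic pattern only; this is not Visual Rule Studio, and the rule and numbers below are invented, not BS8007 provisions): knowledge sources read shared facts from the blackboard and post derived facts back.

    import java.util.HashMap;
    import java.util.Map;

    // Toy blackboard pattern (generic illustration): a shared fact store
    // that rule-based knowledge sources read from and write to.
    public class BlackboardDemo {
        public static void main(String[] args) {
            Map<String, Double> blackboard = new HashMap<>();
            blackboard.put("wallHeight_m", 6.0);

            // A knowledge source fires when its condition holds and posts a
            // derived design parameter (values invented for illustration).
            if (blackboard.get("wallHeight_m") > 5.0) {
                blackboard.put("designCrackWidth_mm", 0.2);
            }
            System.out.println(blackboard);
        }
    }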
Abstract:
The bispectrum and third-order moment can be viewed as equivalent tools for testing for the presence of nonlinearity in stationary time series. This is because the bispectrum is the Fourier transform of the third-order moment. An advantage of the bispectrum is that its estimator comprises terms that are asymptotically independent at distinct bifrequencies under the null hypothesis of linearity. An advantage of the third-order moment is that its values in any subset of joint lags can be used in the test, whereas when using the bispectrum the entire (or truncated) third-order moment is required to construct the Fourier transform. In this paper, we propose a test for nonlinearity based upon the estimated third-order moment. We use the phase scrambling bootstrap method to give a nonparametric estimate of the variance of our test statistic under the null hypothesis. Using a simulation study, we demonstrate that the test attains its target significance level and has high power when compared to an existing standard parametric test that uses the bispectrum. Further, we show how the proposed test can be used to identify the source of nonlinearity due to interactions at specific frequencies. We also investigate implications for heuristic diagnosis of nonstationarity.
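For reference, a minimal sketch of the quantity the test is built on (not the authors' code): the sample third-order moment of a zero-mean series at a pair of joint lags (s, t).

    // Sample third-order moment C3(s, t) of a zero-mean series x:
    // the average of x[n] * x[n+s] * x[n+t] over the available range
    // (minimal sketch; assumes non-negative lags).
    public class ThirdMoment {
        static double c3(double[] x, int s, int t) {
            int count = x.length - Math.max(s, t);
            double sum = 0;
            for (int n = 0; n < count; n++) {
                sum += x[n] * x[n + s] * x[n + t];
            }
            return sum / count;
        }

        public static void main(String[] args) {
            double[] x = {0.3, -1.2, 0.7, 0.1, -0.4, 1.5, -0.9, 0.2};
            System.out.println("C3(1,2) = " + c3(x, 1, 2));
        }
    }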
Abstract:
While object-oriented programming offers great solutions for today's software developers, this success has created difficult problems in class documentation and testing. In Java, two tools provide assistance: Javadoc allows class interface documentation to be embedded as code comments and JUnit supports unit testing by providing assert constructs and a test framework. This paper describes JUnitDoc, an integration of Javadoc and JUnit, which provides better support for class documentation and testing. With JUnitDoc, test cases are embedded in Javadoc comments and used as both examples for documentation and test cases for quality assurance. JUnitDoc extracts the test cases for use in HTML files serving as class documentation and in JUnit drivers for class testing. To address the difficult problem of testing inheritance hierarchies, JUnitDoc provides a novel solution in the form of a parallel test hierarchy. A small controlled experiment compares the readability of JUnitDoc documentation to formal documentation written in Object-Z. Copyright (c) 2005 John Wiley & Sons, Ltd.
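The flavour of the approach (an illustrative sketch; the actual JUnitDoc markup may differ) is a usage example embedded in a Javadoc comment that also runs as a test; run with java -ea to execute the checks.

    public class AbsDoc {
        /**
         * Returns the absolute value of x.
         *
         * Example, doubling as a test case (illustrative markup only):
         * <pre>
         *   abs(-5) == 5
         *   abs(3)  == 3
         * </pre>
         */
        static int abs(int x) { return x < 0 ? -x : x; }

        public static void main(String[] args) {
            // The documented examples are executed as checks.
            assert abs(-5) == 5;
            assert abs(3) == 3;
            System.out.println("documented examples pass");
        }
    }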
Abstract:
The real-time refinement calculus is a formal method for the systematic derivation of real-time programs from real-time specifications in a style similar to the non-real-time refinement calculi of Back and Morgan. In this paper we extend the real-time refinement calculus with procedures and provide refinement rules for refining real-time specifications to procedure calls. A real-time specification can include constraints on not only what outputs are produced, but also when they are produced. The derived programs can also include time constraints on when certain points in the program must be reached; these are expressed in the form of deadline commands. Such programs are machine independent. An important consequence of the approach taken is that not only are the specifications machine independent, but the whole refinement process is machine independent. To implement the machine-independent code on a target machine one has the separate task of showing that the compiled machine code will reach all its deadlines before they expire. For real-time programs, externally observable input and output variables are essential. These differ from local variables in that their values are observable over the duration of the execution of the program. Hence procedures require input and output parameter mechanisms that are references to the actual parameters, so that changes to external inputs are observable within the procedure and changes to output parameters are externally observable. In addition, we allow value and result parameters. These may be auxiliary parameters, which are used for reasoning about the correctness of real-time programs as well as in the expression of timing deadlines, but do not lead to any code being generated for them by a compiler. (c) 2006 Elsevier B.V. All rights reserved.
Abstract:
We revisit the one-unit gradient ICA algorithm derived from the kurtosis function. By carefully studying properties of the stationary points of the discrete-time one-unit gradient ICA algorithm, convergence can be proved under a suitable condition on the learning rate. This condition helps alleviate the guesswork involved in choosing a suitable learning rate in practical computation. These results may be useful for extracting independent source signals on-line.
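A compact sketch of one update of a kurtosis-based one-unit gradient algorithm (illustrative; it assumes whitened, zero-mean data and uses a small fixed learning rate rather than the paper's condition):

    // One gradient-ascent step on |kurtosis(w'x)| for whitened data X
    // (rows = samples). Illustrative sketch only; the paper's analysis
    // concerns the condition the learning rate eta must satisfy.
    public class KurtosisIca {
        static double[] step(double[][] X, double[] w, double eta) {
            int n = X.length, d = w.length;
            double[] grad = new double[d];
            double kurt = 0;
            for (double[] x : X) {
                double y = 0;
                for (int j = 0; j < d; j++) y += w[j] * x[j];
                double y3 = y * y * y;
                kurt += y3 * y;                                   // accumulate E[y^4]
                for (int j = 0; j < d; j++) grad[j] += y3 * x[j]; // accumulate E[x y^3]
            }
            double sign = Math.signum(kurt / n - 3.0); // sample kurtosis, unit variance assumed
            double norm = 0;
            for (int j = 0; j < d; j++) {
                w[j] += eta * sign * (grad[j] / n - 3.0 * w[j]);
                norm += w[j] * w[j];
            }
            norm = Math.sqrt(norm);
            for (int j = 0; j < d; j++) w[j] /= norm; // keep w on the unit sphere
            return w;
        }

        public static void main(String[] args) {
            // Tiny synthetic data, just to exercise the update.
            double[][] X = {{1.2, -0.3}, {-0.9, 0.8}, {0.4, -1.1}, {-0.7, 0.6}};
            double[] w = {1.0, 0.0};
            for (int i = 0; i < 100; i++) w = step(X, w, 0.05);
            System.out.println("w = [" + w[0] + ", " + w[1] + "]");
        }
    }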
Abstract:
Objectives: In this paper, we present a unified electrodynamic heart model that permits simulations of the body surface potentials generated by the heart in motion. The inclusion of motion in the heart model significantly improves the accuracy of the simulated body surface potentials and therefore also the 12-lead ECG. Methods: The key step is to construct an electromechanical heart model. The cardiac excitation propagation is simulated by an electrical heart model, and the resulting cardiac active forces are used to calculate the ventricular wall motion based on a mechanical model. The relative positions of the source and field points change during systole and diastole; these changes can be obtained and then used to calculate the body surface ECG based on the electrical heart-torso model. Results: An electromechanical biventricular heart model is constructed and a standard 12-lead ECG is simulated. Compared with a simulated ECG based on the static electrical heart model, the simulated ECG based on the dynamic heart model agrees more closely with a clinically recorded ECG, especially for the ST segment and T wave of the V1-V6 leads. For the simulation of slight-degree myocardial ischemia, ST segment and T wave changes can be observed in the simulated ECG based on the dynamic heart model, while the ST segment and T wave of the simulated ECG based on the static heart model are almost unchanged compared with a normal ECG. Conclusions: This study confirms the importance of the mechanical factor in ECG simulation. The dynamic heart model can provide more accurate ECG simulation, especially for myocardial ischemia or infarction, since the main ECG changes occur at the ST segment and T wave, which correspond to the cardiac systole and diastole phases.
Abstract:
Irrigation practices that are profligate in their use of water have come under closer scrutiny by water managers and the public. Trickle irrigation has the potential to increase water use efficiency, but only if the system is designed to meet the soil and plant conditions. Recently we have provided a software tool, WetUp (http://www.clw.csiro.au/products/wetup/), to calculate the wetting patterns from trickle irrigation emitters. WetUp uses an analytical solution to calculate the wetted perimeter for both buried and surface emitters. This analytical solution has a number of assumptions, two of which are that the wetting front is defined by the water content at which the hydraulic conductivity (K) is 1 mm day⁻¹ and that the flow occurs from a point source. Here we compare the wetting patterns calculated with the analytical solution against those from HYDRUS2D, a two-dimensional numerical model of water flow, for typical soils. The results show that the wetting patterns are similar, except when the soil properties mean that the point-source assumption is no longer a good description of the flow regime. Difficulties were also experienced in obtaining stable HYDRUS2D solutions for soils with low hydraulic conductivities. (c) 2005 Elsevier Ltd. All rights reserved.
Abstract:
Formal methods have significant benefits for developing safety critical systems, in that they allow for correctness proofs, model checking of safety and liveness properties, deadlock checking, etc. However, formal methods do not scale very well and demand specialist skills when developing real-world systems. For these reasons, development and analysis of large-scale safety critical systems will require effective integration of formal and informal methods. In this paper, we use such an integrative approach to automate Failure Modes and Effects Analysis (FMEA), a widely used system safety analysis technique, using a high-level graphical modelling notation (Behavior Trees) and model checking. We inject component failure modes into the Behavior Trees and translate the resulting Behavior Trees to SAL code. This enables us to model check whether the system, in the presence of these faults, satisfies its safety properties, specified as temporal logic formulas. The benefit of this process is tool support that automates the tedious and error-prone aspects of FMEA.
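Schematically, such a safety property under an injected failure mode might be an LTL formula of the form (an illustrative property, not one from the paper):

    G(valve_failed -> F alarm_raised)

read as: it is always the case (G) that if the valve has failed, an alarm is eventually (F) raised. Model checking the fault-injected model either confirms the property or returns a counterexample trace showing how the failure mode defeats it.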
Abstract:
Bellerophon is a program for detecting chimeric sequences in multiple-sequence datasets by an adaptation of partial treeing analysis. Bellerophon was specifically developed to detect 16S rRNA gene chimeras in PCR-clone libraries of environmental samples, but can be applied to other nucleotide sequence alignments.
Abstract:
Evolution strategies are a class of general optimisation algorithms which are applicable to functions that are multimodal, nondifferentiable, or even discontinuous. Although recombination operators have been introduced into evolution strategies, the primary search operator is still mutation. Classical evolution strategies rely on Gaussian mutations. A new mutation operator based on the Cauchy distribution is proposed in this paper. It is shown empirically that the new evolution strategy based on Cauchy mutation outperforms the classical evolution strategy on most of the 23 benchmark problems tested in this paper. The paper also shows empirically that changing the order of mutating the objective variables and mutating the strategy parameters does not alter the previous conclusion significantly, and that Cauchy mutations with different scaling parameters still outperform the Gaussian mutation with self-adaptation. However, the advantage of Cauchy mutations disappears when recombination is used in evolution strategies. It is argued that the search step size plays an important role in determining the performance of evolution strategies. The large step size of recombination plays a role similar to that of Cauchy mutation.
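A minimal sketch of the two mutation operators side by side (illustrative; the paper's self-adaptive strategy parameters are omitted). A standard Cauchy deviate can be drawn via the inverse CDF, tan(pi * (u - 0.5)) for uniform u.

    import java.util.Random;

    // Gaussian vs. Cauchy mutation of a single objective variable
    // (sketch only; self-adaptive strategy parameters are omitted).
    public class Mutations {
        static final Random RNG = new Random(42);

        static double gaussianMutate(double x, double sigma) {
            return x + sigma * RNG.nextGaussian();
        }

        static double cauchyMutate(double x, double scale) {
            // Inverse-CDF sampling of a standard Cauchy deviate.
            double u = RNG.nextDouble();
            return x + scale * Math.tan(Math.PI * (u - 0.5));
        }

        public static void main(String[] args) {
            System.out.println("gaussian: " + gaussianMutate(0.0, 1.0));
            System.out.println("cauchy:   " + cauchyMutate(0.0, 1.0));
            // The Cauchy distribution's heavy tails yield occasional large
            // jumps, i.e. a larger expected search step than the Gaussian.
        }
    }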