34 results for Automated Software Debugging
at University of Queensland eSpace - Australia
Abstract:
The fundamental role of dendritic cells (DC) in initiating and directing the primary immune response is well established. Furthermore, it is now accepted that DC may be useful in new vaccination strategies for preventing certain malignant and infectious diseases. As blood DC (BDC) physiology differs from that of the DC homologues generated in vitro from monocyte precursors, it is becoming more relevant to consider BDC for therapeutic interventions. Until recently, protocols for the isolation of BDC were laborious and inefficient; therefore, their use for investigative cancer immunotherapy is not widespread. In this study, we carefully documented BDC counts, yields and subsets during apheresis (Cobe Spectra), the initial and essential procedure in creating a BDC isolation platform for cancer immunotherapy. We established that an automated software package (Version 6.0 AutoPBPC) provides an operator-independent, reliable source of mononuclear cells (MNC) for BDC preparation. Further, we observed that BDC might be recovered in high yields, often greater than 100% relative to the number of circulating BDC predicted by blood volume. An average of 66 million (range, 17-179) BDC per 10 L procedure were obtained, largely satisfying the needs for immunization. Higher yields were possible on total processed blood volumes of 15 L. BDC were not activated by the isolation procedure and, more importantly, both BDC subsets (CD11c(+)CD123(low) and CD11c(-)CD123(high)) were equally represented. Finally, we established that the apheresis product could be used for antibody-based BDC immunoselection and demonstrated that fully functional BDC can be obtained by this procedure. (C) 2002 Published by Elsevier Science B.V.
Abstract:
Background: Many clinical trials of DC-based immunotherapy involve administration of monocyte-derived DCs (Mo-DC) on multiple occasions. We aimed to determine the optimal cell processing procedures and timing (leukapheresis, RBC depletion and cryopreservation) for generation of Mo-DC for clinical purposes. Methods: Leukapheresis was undertaken using a COBE Spectra. Two instrument settings were compared: the standard semi-automated software (Version 4.7) (n = 10) and the fully automated software (Version 6.0) (n = 40). Density gradient centrifugation methods for RBC depletion (Ficoll, Percoll, a combination of the two, or neither) were compared. Outcomes (including cell yield and purity) were compared for cryopreserved unmanipulated monocytes and cryopreserved Mo-DC. Results: Software Version 6.0 provided significantly better enrichment for monocytes (P
Abstract:
Concerns have been raised about the reproducibility of brachial artery reactivity (BAR), because subjective decisions regarding the location of interfaces may influence the measurement of very small changes in lumen diameter. We studied 120 consecutive patients with BAR to assess whether an automated technique could be applied, and whether experience influenced reproducibility between two observers, one experienced and one inexperienced. Digital cineloops were measured automatically, using software that detects the leading edge of the endothelium and tracks it across sequential frames, and manually, by averaging a set of three point-to-point measurements. There was a high correlation between automated and manual techniques for both observers, although less variability was present with expert readers. The limits of agreement overall for interobserver concordance were 0.13 +/- 0.65 mm for the manual and 0.03 +/- 0.74 mm for the automated measurement. For intraobserver concordance, the limits of agreement were -0.07 +/- 0.38 mm for observer 1 and -0.16 +/- 0.55 mm for observer 2. We concluded that BAR measurements were highly concordant between observers, although more concordant using the automated method, and that experience does affect concordance. Care must be taken to ensure that the same segments are measured between observers and serially.
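The interobserver figures above are Bland-Altman limits of agreement: the mean of the paired differences (bias) plus or minus 1.96 standard deviations. A minimal sketch of that calculation, using made-up diameter readings rather than the study's data:

```python
import statistics

def limits_of_agreement(a, b):
    """Bland-Altman limits of agreement: bias +/- 1.96 * SD of differences."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Illustrative paired lumen-diameter readings in mm (not the study's data):
obs1 = [4.10, 4.25, 3.90, 4.05, 4.30, 3.95]
obs2 = [4.00, 4.20, 3.95, 4.00, 4.18, 3.90]
bias, lo, hi = limits_of_agreement(obs1, obs2)
```

If most paired differences fall inside (lo, hi), the two readers (or methods) are considered concordant for clinical purposes.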
Abstract:
The refinement calculus is a well-established theory for deriving program code from specifications. Recent research has extended the theory to handle timing requirements, as well as functional ones, and we have developed an interactive programming tool based on these extensions. Through a number of case studies completed using the tool, this paper explains how the tool helps the programmer by supporting the many forms of variables needed in the theory. These include simple state variables as in the untimed calculus, trace variables that model the evolution of properties over time, auxiliary variables that exist only to support formal reasoning, subroutine parameters, and variables shared between parallel processes.
Abstract:
In Part 1 of this paper a methodology for back-to-back testing of simulation software was described. Residuals with error-dependent geometric properties were generated. A set of potential coding errors was enumerated, along with a corresponding set of feature matrices, which describe the geometric properties imposed on the residuals by each of the errors. In this part of the paper, an algorithm is developed to isolate the coding errors present by analysing the residuals. A set of errors is isolated when the subspace spanned by their combined feature matrices corresponds to that of the residuals. Individual feature matrices are compared to the residuals and classified as 'definite', 'possible' or 'impossible'. The status of 'possible' errors is resolved using a dynamic subset testing algorithm. To demonstrate and validate the testing methodology presented in Part 1 and the isolation algorithm presented in Part 2, a case study is presented using a model for biological wastewater treatment. Both single and simultaneous errors that are deliberately introduced into the simulation code are correctly detected and isolated. Copyright (C) 2003 John Wiley & Sons, Ltd.
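The subspace comparison at the heart of the isolation step can be sketched as a projection of the residual onto each feature matrix's column space. This is an illustrative reconstruction of the idea, not the paper's algorithm, and the feature matrix and residuals below are invented:

```python
import numpy as np

def classify(feature, residual, tol=1e-8):
    """Compare one candidate error's feature matrix to the residuals."""
    q, _ = np.linalg.qr(feature)       # orthonormal basis of the error's subspace
    proj = q @ (q.T @ residual)        # projection of the residual onto it
    if np.linalg.norm(residual - proj) < tol:
        return "definite"    # residual lies entirely in this subspace
    if np.linalg.norm(proj) < tol:
        return "impossible"  # residual is orthogonal to this subspace
    return "possible"        # partial overlap: resolve by subset testing

# Hypothetical feature matrix spanning the first two residual directions:
F = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]])
classify(F, np.array([1.0, 2.0, 0.0]))   # "definite"
classify(F, np.array([0.0, 0.0, 1.0]))   # "impossible"
classify(F, np.array([1.0, 0.0, 1.0]))   # "possible"
```

Simultaneous errors correspond to residuals lying in the span of several feature matrices combined, which is why 'possible' verdicts need the dynamic subset-testing pass.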
Abstract:
Formal methods have significant benefits for developing safety critical systems, in that they allow for correctness proofs, model checking safety and liveness properties, deadlock checking, etc. However, formal methods do not scale very well and demand specialist skills when developing real-world systems. For these reasons, development and analysis of large-scale safety critical systems will require effective integration of formal and informal methods. In this paper, we use such an integrative approach to automate Failure Modes and Effects Analysis (FMEA), a widely used system safety analysis technique, using a high-level graphical modelling notation (Behavior Trees) and model checking. We inject component failure modes into the Behavior Trees and translate the resulting Behavior Trees to SAL code. This enables us to model check whether the system, in the presence of these faults, satisfies its safety properties, specified by temporal logic formulas. The benefit of this process is tool support that automates the tedious and error-prone aspects of FMEA.
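The essence of the pipeline (inject a failure mode, then exhaustively check a safety property) can be illustrated with a toy explicit-state reachability check. The two-field component state and the property here are invented for illustration; the actual work uses Behavior Trees translated to SAL:

```python
from collections import deque

def check_invariant(init, transitions, invariant):
    """Exhaustive reachability check of a safety invariant -- a toy
    stand-in for what a model checker like SAL does."""
    seen, queue = {init}, deque([init])
    while queue:
        s = queue.popleft()
        if not invariant(s):
            return False, s  # counterexample state
        for t in transitions(s):
            if t not in seen:
                seen.add(t)
                queue.append(t)
    return True, None

# Hypothetical component (mode, failed) with an injected "stuck-off" fault:
def transitions(state):
    mode, failed = state
    if failed:
        return [("off", True)]  # stuck off once failed
    return [("on", False), ("off", False), (mode, True)]  # may fail anytime

# Safety property: the component is never simultaneously off and failed.
ok, cex = check_invariant(("on", False), transitions,
                          lambda s: not (s[0] == "off" and s[1]))
```

With the failure mode injected the property is violated, and the counterexample state is exactly the kind of effect an FMEA row records for that failure mode.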
Abstract:
This report describes recent updates to the custom-built data-acquisition hardware operated by the Center for Hypersonics. In 2006, an ISA-to-USB bridging card was developed as part of Luke Hillyard's final-year thesis. This card allows the hardware to be connected to any recent personal computer via a serial port (USB or RS232), and it provides a number of simple text-based commands for control of the hardware. A graphical user interface program was also updated to help the experimenter manage the data-acquisition functions. Sampled data are stored in text files that have been compressed with the gzip format. To simplify the later archiving or transport of the data, all files specific to a shot are stored in a single directory. This includes a text file for the run description, the signal configuration file and the individual sampled-data files, one for each signal that was recorded.
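The per-shot layout described above is straightforward to reproduce. A sketch with invented file names (the report's actual naming conventions may differ):

```python
import gzip
import os

def save_shot(base_dir, shot_id, description, config_text, signals):
    """Write one shot's files into its own directory, with each sampled
    signal gzip-compressed, mirroring the layout described above."""
    shot_dir = os.path.join(base_dir, shot_id)
    os.makedirs(shot_dir, exist_ok=True)
    # run description and signal configuration as plain text
    with open(os.path.join(shot_dir, "run_description.txt"), "w") as f:
        f.write(description)
    with open(os.path.join(shot_dir, "signal_config.txt"), "w") as f:
        f.write(config_text)
    # one compressed text file per recorded signal
    for name, samples in signals.items():
        with gzip.open(os.path.join(shot_dir, name + ".txt.gz"), "wt") as f:
            f.writelines(f"{v}\n" for v in samples)
    return shot_dir
```

Reading a signal back is just `gzip.open(path, "rt")`, and the whole shot directory can be archived or copied as a unit.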
Abstract:
This paper reports on a system for automated agent negotiation, based on a formal and executable approach to capture the behavior of parties involved in a negotiation. It uses the JADE agent framework, and its major distinctive feature is the use of declarative negotiation strategies. The negotiation strategies are expressed in a declarative rules language, defeasible logic, and are applied using the implemented system DR-DEVICE. The key ideas and the overall system architecture are described, and a particular negotiation case is presented in detail.
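In defeasible logic, a rule's conclusion stands unless a conflicting rule of higher priority also applies. A toy Python rendering of that idea for a negotiation strategy (the rules and the priority scheme are invented; DR-DEVICE's actual defeasible-logic rule language is far richer):

```python
def holds(facts, rules, goal):
    """Defeasibly conclude `goal`: some applicable supporting rule must
    outrank every applicable attacking rule (head "~goal")."""
    applicable = [r for r in rules if r["body"] <= facts]
    pro = [r["prio"] for r in applicable if r["head"] == goal]
    con = [r["prio"] for r in applicable if r["head"] == "~" + goal]
    return bool(pro) and max(pro) > max(con, default=-1)

# A seller's negotiation strategy as two defeasible rules:
rules = [
    {"head": "accept",  "body": {"price_ok"},        "prio": 1},
    {"head": "~accept", "body": {"deadline_passed"}, "prio": 2},
]
holds({"price_ok"}, rules, "accept")                     # True
holds({"price_ok", "deadline_passed"}, rules, "accept")  # False: overridden
```

The appeal for negotiation is exactly this non-monotonicity: adding a fact (the deadline passing) can retract an earlier conclusion without rewriting the strategy.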
Abstract:
The XSophe-Sophe-XeprView(R) computer simulation software suite enables scientists to easily determine spin Hamiltonian parameters from isotropic, randomly oriented and single-crystal continuous wave electron paramagnetic resonance (CW EPR) spectra of radicals and isolated paramagnetic metal ion centers or clusters found in metalloproteins, chemical systems and materials science. XSophe provides an X-windows graphical user interface to the Sophe programme and allows creation of multiple input files, local and remote execution of Sophe, and the display of sophelog (output from Sophe) and input parameters/files. Sophe is a sophisticated computer simulation software programme employing a number of innovative technologies, including: the Sydney OPera HousE (SOPHE) partition and interpolation schemes, a field segmentation algorithm, the mosaic misorientation linewidth model, parallelization and spectral optimisation. In conjunction with the SOPHE partition scheme and the field segmentation algorithm, the SOPHE interpolation scheme and the mosaic misorientation linewidth model greatly increase the speed of simulations for most spin systems. Employing brute-force matrix diagonalization in the simulation of an EPR spectrum from a high-spin Cr(III) complex with the spin Hamiltonian parameters g(e) = 2.00, D = 0.10 cm(-1), E/D = 0.25, A(x) = 120.0, A(y) = 120.0, A(z) = 240.0 x 10(-4) cm(-1) requires a SOPHE grid size of N = 400 (to produce a good signal-to-noise ratio) and takes 229.47 s. In contrast, the use of either the SOPHE interpolation scheme or the mosaic misorientation linewidth model requires a SOPHE grid size of only N = 18 and takes 44.08 s and 0.79 s, respectively. Results from Sophe are transferred via the Common Object Request Broker Architecture (CORBA) to XSophe and subsequently to XeprView(R), where the simulated CW EPR spectra (1D and 2D) can be compared to the experimental spectra.
Energy level diagrams, transition roadmaps and transition surfaces aid the interpretation of complicated randomly oriented CW EPR spectra and can be viewed with a web browser and an OpenInventor scene graph viewer.
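The brute-force route quoted above amounts to building the spin Hamiltonian matrix and diagonalizing it at every orientation and field point on the grid. A minimal numpy sketch of the zero-field part for the S = 3/2 Cr(III) example (electron Zeeman and hyperfine terms omitted; this is not Sophe's code):

```python
import numpy as np

def spin_matrices(S):
    """Spin operators Sx, Sy, Sz in the |S, m> basis (m descending)."""
    m = np.arange(S, -S - 1, -1)
    sz = np.diag(m)
    # <m+1|S+|m> = sqrt(S(S+1) - m(m+1)) on the superdiagonal
    sp = np.diag(np.sqrt(S * (S + 1) - m[1:] * (m[1:] + 1)), k=1)
    return (sp + sp.T) / 2, (sp - sp.T) / 2j, sz

# Zero-field splitting for S = 3/2 with D = 0.10 cm^-1, E/D = 0.25:
D, E = 0.10, 0.025
sx, sy, sz = spin_matrices(1.5)
H = D * (sz @ sz - (1.5 * 2.5 / 3) * np.eye(4)) + E * (sx @ sx - sy @ sy)
levels = np.linalg.eigvalsh(H)  # two Kramers doublets at +/- D*sqrt(1 + 3(E/D)^2)
```

Repeating this diagonalization over a fine orientation grid is what makes the N = 400 brute-force run so much slower than the interpolated N = 18 run quoted above.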
Abstract:
Using Landsat imagery, forest canopy density (FCD) estimated with the FCD Mapper® was correlated with predominant height (PDH, measured as the average height of the tallest 50 trees per hectare) for 20 field plots measured in native forest at Noosa Heads, south-east Queensland, Australia. A corresponding image was used to calculate FCD on Leyte Island, the Philippines, and was validated on the ground for accuracy. The FCD Mapper was produced for the International Tropical Timber Organisation and estimates FCD as an index of canopy density using reflectance characteristics of Landsat Enhanced Thematic Mapper (ETM) images. The FCD Mapper is a 'semi-expert' computer program which uses interactive screens to allow the operator to make decisions concerning the classification of land into bare soil, grass and forest. At Noosa, a strong positive nonlinear relationship (r2 = 0.86) was found between FCD and PDH for 15 field plots with variable PDH but complete canopy closure. An additional five field plots were measured in forest with a broken canopy, and the software assessed these plots as having a much lower FCD than forest with canopy closure. FCD estimates for forest and agricultural land on the island of Leyte and subsequent field validation showed that, at appropriate settings, the FCD Mapper differentiated between tropical rainforest and banana or coconut plantation. These findings suggest that in forests with a closed canopy this remote sensing technique has promise for forest inventory and productivity assessment. The findings also suggest that the software has promise for discriminating between native forest with a complete canopy and cover with a broken canopy, such as coconut or banana plantation.
Abstract:
Expokit provides a set of routines aimed at computing matrix exponentials. More precisely, it computes either a small matrix exponential in full, the action of a large sparse matrix exponential on an operand vector, or the solution of a system of linear ODEs with constant inhomogeneity. The backbone of the sparse routines consists of matrix-free Krylov subspace projection methods (Arnoldi and Lanczos processes), which is why the toolkit is capable of coping with sparse matrices of large dimension. The software handles real and complex matrices and provides specific routines for symmetric and Hermitian matrices. The computation of matrix exponentials is a numerical issue of critical importance in the area of Markov chains, where, furthermore, the computed solution is subject to probabilistic constraints. In addition to addressing general matrix exponentials, particular attention is given to the computation of transient states of Markov chains.
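The Krylov idea behind the sparse routines: project A onto a small Krylov subspace, exponentiate the small projected matrix exactly, and lift the result back. A simplified numpy sketch for the symmetric (Lanczos) case; Expokit's Fortran routines add the error control, time-stepping and breakdown handling omitted here:

```python
import numpy as np

def expm_lanczos(A, v, t=1.0, m=30):
    """Approximate exp(t*A) @ v for symmetric A via m Lanczos steps."""
    n = len(v)
    m = min(m, n)
    beta = np.linalg.norm(v)
    V = np.zeros((n, m))
    alpha, off = np.zeros(m), np.zeros(max(m - 1, 1))
    V[:, 0] = v / beta
    for j in range(m):
        w = A @ V[:, j]
        alpha[j] = V[:, j] @ w
        # full reorthogonalization against all previous Lanczos vectors
        w = w - V[:, :j + 1] @ (V[:, :j + 1].T @ w)
        if j + 1 < m:
            off[j] = np.linalg.norm(w)
            V[:, j + 1] = w / off[j]
    # Small tridiagonal H = V^T A V; exponentiate via its eigensystem.
    H = np.diag(alpha) + np.diag(off[:m - 1], 1) + np.diag(off[:m - 1], -1)
    evals, U = np.linalg.eigh(H)
    # exp(t*A) @ v  ~=  beta * V @ expm(t*H) @ e1
    return beta * V @ (U @ (np.exp(t * evals) * U[0]))

# Example on a dense symmetric matrix (a sparse A works the same way,
# since only matrix-vector products A @ v are ever needed):
rng = np.random.default_rng(1)
B = rng.standard_normal((100, 100))
A = (B + B.T) / 200.0
v = rng.standard_normal(100)
approx = expm_lanczos(A, v, m=30)
```

Because only products `A @ v` are required, the method is matrix-free: A never needs to be formed, factored, or exponentiated at full dimension.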
Abstract:
An automated method for extracting brain volumes from three commonly acquired three-dimensional (3D) MR images (proton density, T1-weighted, and T2-weighted) of the human head is described. The procedure is divided into four levels: preprocessing, segmentation, scalp removal, and postprocessing. A user-provided reference point is the sole operator-dependent input required. The method's parameters were first optimized, then fixed and applied to 30 repeat data sets from 15 normal older adult subjects to investigate its reproducibility. Percent differences between total brain volumes (TBVs) for the subjects' repeated data sets ranged from 0.5% to 2.2%. We conclude that the method is both robust and reproducible and has the potential for wide application.