44 results for B formal method
Abstract:
The real-time refinement calculus is a formal method for the systematic derivation of real-time programs from real-time specifications, in a style similar to the non-real-time refinement calculi of Back and Morgan. In this paper we extend the real-time refinement calculus with procedures and provide refinement rules for refining real-time specifications to procedure calls. A real-time specification can constrain not only what outputs are produced, but also when they are produced. The derived programs can also include time constraints on when certain points in the program must be reached; these are expressed in the form of deadline commands. Such programs are machine independent. An important consequence of the approach taken is that not only the specifications but the whole refinement process is machine independent. To implement the machine-independent code on a target machine, one has the separate task of showing that the compiled machine code will reach all its deadlines before they expire. For real-time programs, externally observable input and output variables are essential. These differ from local variables in that their values are observable over the duration of the execution of the program. Hence procedures require input and output parameter mechanisms that are references to the actual parameters, so that changes to external inputs are observable within the procedure and changes to output parameters are externally observable. In addition, we allow value and result parameters. These may be auxiliary parameters, which are used for reasoning about the correctness of real-time programs as well as in the expression of timing deadlines, but which do not lead to any code being generated for them by a compiler. (c) 2006 Elsevier B.V. All rights reserved.
Abstract:
Well understood methods exist for developing programs from given specifications. A formal method identifies proof obligations at each development step: if all such proof obligations are discharged, a precisely defined class of errors can be excluded from the final program. For a class of closed systems such methods offer a gold standard against which less formal approaches can be measured. For open systems (those which interact with the physical world), the task of obtaining the program specification can be as challenging as the task of deriving the program. And, when a system of this class must tolerate certain kinds of unreliability in the physical world, it is still more challenging to reach confidence that the specification obtained is adequate. We argue that widening the notion of software development to include specifying the behaviour of the relevant parts of the physical world gives a way to derive the specification of a control system and also to record precisely the assumptions being made about the world outside the computer.
Abstract:
Univariate linkage analysis is used routinely to localise genes for human complex traits. Often, many traits are analysed, but the significance of linkage for each trait is not corrected for multiple trait testing, which inflates the experiment-wise type-I error rate. In addition, univariate analyses do not realise the full power provided by multivariate data sets. Multivariate linkage is the ideal solution, but it is computationally intensive, so genome-wide analysis and evaluation of empirical significance are often prohibitive. We describe two simple methods that efficiently address these limitations by combining P-values from multiple univariate linkage analyses. The first method estimates empirical pointwise and genome-wide significance between one trait and one marker when multiple traits have been tested. It is as robust as an appropriate Bonferroni adjustment, with the advantage that no assumptions are required about the number of independent tests performed. The second method estimates the significance of linkage between multiple traits and one marker, and therefore it can be used to localise regions that harbour pleiotropic quantitative trait loci (QTL). We show that this method has greater power than individual univariate analyses to detect a pleiotropic QTL across different situations. In addition, when traits are moderately correlated and the QTL influences all traits, it can outperform formal multivariate variance-components (VC) analysis. This approach is computationally feasible for any number of traits and is not affected by the residual correlation between traits. We illustrate the utility of our approach with a genome scan of three asthma traits measured in families with a twin proband.
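Where several univariate tests must be summarised, a classical way to combine independent P-values is Fisher's method; the sketch below is a generic illustration of P-value combination (the paper's own procedures are empirical and permutation-based, and are not reproduced here), and it assumes the combined tests are independent:

```python
import math

def fisher_combine(pvalues):
    """Combine independent P-values with Fisher's method.

    Under the global null, -2 * sum(log p_i) follows a chi-square
    distribution with 2k degrees of freedom.  For even degrees of
    freedom the chi-square survival function has a closed form, so
    no external statistics library is needed.
    """
    k = len(pvalues)
    half = -sum(math.log(p) for p in pvalues)  # x/2, where x = -2*sum(log p)
    # P(Chi2_{2k} >= x) = exp(-x/2) * sum_{i=0}^{k-1} (x/2)^i / i!
    term, total = 1.0, 1.0
    for i in range(1, k):
        term *= half / i
        total += term
    return math.exp(-half) * total

print(fisher_combine([0.01, 0.20, 0.30]))  # ~0.022
```

For correlated traits, the situation the paper targets, an empirical permutation null is more appropriate than the independence assumption baked into this formula.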
Abstract:
Summary form only given. The Java programming language supports concurrency. Concurrent programs are harder to verify than their sequential counterparts due to their inherent nondeterminism and a number of specific concurrency problems such as interference and deadlock. In previous work, we proposed a method for verifying concurrent Java components based on a mix of code inspection, static analysis tools, and the ConAn testing tool. The method was derived from an analysis of concurrency failures in Java components, but was not applied in practice. In this paper, we explore the method by applying it to an implementation of the well-known readers-writers problem and a number of mutants of that implementation. We only apply it to a single, well-known example, and so we do not attempt to draw any general conclusions about the applicability or effectiveness of the method. However, the exploration does point out several strengths and weaknesses in the method, which enable us to fine-tune the method before we carry out a more formal evaluation on other, more realistic components.
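The readers-writers problem used as the case study can be sketched as a small monitor-style class. This is a textbook reader-preference variant in Python, not the Java component the paper inspects, and it deliberately exhibits one of the hazards such a method must look for (writer starvation):

```python
import threading

class ReadersWriterLock:
    """Textbook reader-preference readers-writer lock.

    Many readers may hold the lock at once; a writer needs exclusive
    access.  Readers arriving while a writer waits get in first, so
    writers can starve, one of the liveness hazards a concurrency
    verification method needs to check for.
    """

    def __init__(self):
        self._cond = threading.Condition()
        self._readers = 0
        self._writing = False

    def acquire_read(self):
        with self._cond:
            while self._writing:
                self._cond.wait()
            self._readers += 1

    def release_read(self):
        with self._cond:
            self._readers -= 1
            if self._readers == 0:
                self._cond.notify_all()

    def acquire_write(self):
        with self._cond:
            while self._writing or self._readers > 0:
                self._cond.wait()
            self._writing = True

    def release_write(self):
        with self._cond:
            self._writing = False
            self._cond.notify_all()

# One writer increments a shared counter; two readers observe it.
lock = ReadersWriterLock()
shared = {"count": 0}

def writer():
    for _ in range(1000):
        lock.acquire_write()
        shared["count"] += 1
        lock.release_write()

def reader(seen):
    for _ in range(1000):
        lock.acquire_read()
        seen.append(shared["count"])
        lock.release_read()

seen1, seen2 = [], []
threads = [threading.Thread(target=writer),
           threading.Thread(target=reader, args=(seen1,)),
           threading.Thread(target=reader, args=(seen2,))]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(shared["count"])  # 1000: no lost updates
```

Mutants of the kind the paper tests could be made by, for example, replacing the `while` guards with `if`, which introduces a race that inspection or testing should catch.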
Abstract:
The structure constants of quantum Lie algebras depend on a quantum deformation parameter q and they reduce to the classical structure constants of a Lie algebra at q = 1. We explain the relationship between the structure constants of quantum Lie algebras and quantum Clebsch-Gordan coefficients for adjoint x adjoint --> adjoint. We present a practical method for the determination of these quantum Clebsch-Gordan coefficients and are thus able to give explicit expressions for the structure constants of the quantum Lie algebras associated to the classical Lie algebras B_l, C_l and D_l. In the quantum case the structure constants of the Cartan subalgebra are non-zero, and we observe that they are determined in terms of the simple quantum roots. We introduce an invariant Killing form on the quantum Lie algebras and find that it takes values which are simple q-deformations of the classical ones.
Abstract:
A novel screening strategy has been developed for the identification of alpha-chymotrypsin inhibitors from a phage peptide library. In this strategy, the standard affinity selection protocol was modified by adding a proteolytic cleavage period to avoid the recovery of alpha-chymotrypsin substrates. After four cycles of selection and a further activity assay, a group of related peptides was identified by DNA sequencing. These peptides share a consensus sequence motif, (S/T)RVPR(R/H). A corresponding short peptide (Ac-ASRVPRRG-NH2) was then synthesized chemically and shown to be an inhibitor of alpha-chymotrypsin. The present work provides a useful way to search for proteinase inhibitors without detailed knowledge of the molecular structure.
Abstract:
Previous work on generating state machines for the purpose of class testing has not been formally based. There has also been work on deriving state machines from formal specifications for testing non-object-oriented software. We build on this work by presenting a method for deriving a state machine for testing purposes from a formal specification of the class under test. We also show how the resulting state machine can be used as the basis for a test suite developed and executed using an existing framework for class testing. To derive the state machine, we identify the states and possible interactions of the operations of the class under test. The Test Template Framework is used to formally derive the states from the Object-Z specification of the class under test. The transitions of the finite state machine are calculated from the derived states and the class's operations. The formally derived finite state machine is transformed to a ClassBench testgraph, which is used as input to the ClassBench framework to test a C++ implementation of the class. The method is illustrated using a simple bounded queue example.
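The use of a derived finite state machine as a test oracle can be sketched for the bounded queue example. The states (EMPTY, PARTIAL, FULL) and the transition table below are an informal, hand-written analogue of what the Test Template Framework derives formally from an Object-Z specification; the Python class stands in for the C++ implementation under test:

```python
class BoundedQueue:
    """The class under test: a queue with fixed capacity."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.items = []

    def enqueue(self, x):
        if len(self.items) >= self.capacity:
            raise OverflowError("queue full")
        self.items.append(x)

    def dequeue(self):
        if not self.items:
            raise IndexError("queue empty")
        return self.items.pop(0)

    def state(self):
        """Abstraction function mapping the concrete data to an FSM state."""
        if not self.items:
            return "EMPTY"
        if len(self.items) == self.capacity:
            return "FULL"
        return "PARTIAL"

# Hand-derived transition table for capacity 2 (the role the Test
# Template Framework plays formally in the paper).
TRANSITIONS = {
    ("EMPTY", "enqueue"): "PARTIAL",
    ("PARTIAL", "enqueue"): "FULL",
    ("FULL", "dequeue"): "PARTIAL",
    ("PARTIAL", "dequeue"): "EMPTY",
}

def run_test_sequence(ops, capacity=2):
    """Drive the queue along a path, using the FSM as the oracle."""
    q = BoundedQueue(capacity)
    for op in ops:
        expected = TRANSITIONS[(q.state(), op)]
        if op == "enqueue":
            q.enqueue(0)
        else:
            q.dequeue()
        assert q.state() == expected, (op, q.state(), expected)
    return q.state()

print(run_test_sequence(["enqueue", "enqueue", "dequeue", "dequeue"]))  # EMPTY
```

A testgraph in the ClassBench sense would enumerate paths through `TRANSITIONS`; here a path is simply a list of operation names.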
Abstract:
Gauging data are available from numerous streams throughout Australia, and these data provide a basis for historical analysis of geomorphic change in stream channels in response to both natural phenomena and human activities. We present a simple method for the analysis of these data, and a brief case study of its application to channel change in the Tully River, in the humid tropics of north Queensland. The analysis suggests that this channel has narrowed and deepened, rather than aggraded; channel aggradation was expected, given the intensification of land use in the catchment upstream of the gauging station. Limitations of the method relate to the time periods over which stream gauging occurred; the spatial patterns of stream gauging sites; the quality and consistency of data collection; and the availability of concurrent land-use histories on which to base the interpretation of the channel changes.
Abstract:
Hedley et al. (1982) developed what has become the most widely used (and modified) phosphorus (P) fractionation technique. It consists of sequential extraction of increasingly less phytoavailable P pools. Extracts are centrifuged at up to 25000 g (RCF) and filtered to 0.45 µm to ensure that soil is not lost between extractions. In attempting to transfer this method to laboratories with limited facilities, it was considered that the need for high-speed centrifuges, and the cost of frequent filtration, may prevent adoption of this P fractionation technique. The modified method presented here was developed to simplify the methodology, reduce cost, and therefore increase the accessibility of P fractionation technology. It provides quantitative recovery of soil between extractions, using low-speed centrifugation without filtration. This is achieved by increasing the ionic strength of dilute extracts, through the addition of NaCl, to flocculate clay particles. Addition of NaCl does not change the amount of P extracted. Flocculation with low-speed centrifugation produced extracts comparable with those having undergone filtration (0.025 µm). A malachite green colorimetric method was adopted for inorganic P determination, as this simple manual method provides high sensitivity with negligible interference from other anions. This approach can also be used for total P following digestion; alternatively, non-discriminatory methods, such as inductively coupled plasma atomic emission spectroscopy, may be employed.
Abstract:
This paper presents a numerical technique for the design of an RF coil for asymmetric magnetic resonance imaging (MRI) systems. The formulation is based on an inverse approach where the cylindrical surface currents are expressed in terms of a combination of sub-domain basis functions: triangular and pulse functions. With the homogeneous transverse magnetic field specified in a spherical region, a functional method is applied to obtain the unknown current coefficients. The current distribution is then transformed to a conductor pattern by use of a stream function technique. Preliminary MR images acquired using a prototype RF coil are presented and validate the design method. (C) 2002 Elsevier Science B.V. All rights reserved.
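The inverse approach described (expand the unknown surface current in sub-domain basis functions, then solve for the coefficients that reproduce a specified field) can be caricatured as a small least-squares problem. Everything in the sketch below, the one-dimensional geometry, the stand-in basis responses and the uniform target, is invented for illustration; the paper's actual formulation involves electromagnetic field calculations and a stream-function step that are not reproduced here:

```python
# Toy version of the inverse design step: expand an unknown current in
# sub-domain basis functions and solve a least-squares system for the
# coefficients that reproduce a target field at sample points.

def solve(A, y):
    """Gaussian elimination with partial pivoting (small dense systems)."""
    n = len(A)
    M = [row[:] + [y[i]] for i, row in enumerate(A)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

# Axial sample points and a uniform target field.
zs = [0.0, 0.25, 0.5, 0.75, 1.0]
target = [1.0] * len(zs)

# Stand-in responses of one triangular and two pulse basis functions.
def tri(z):    return max(0.0, 1.0 - abs(2.0 * z - 1.0))
def pulse1(z): return 1.0 if z < 0.5 else 0.0
def pulse2(z): return 1.0 if z >= 0.5 else 0.0
basis = [tri, pulse1, pulse2]

A = [[b(z) for b in basis] for z in zs]
n = len(basis)
# Normal equations (A^T A) c = A^T y give the least-squares coefficients.
AtA = [[sum(A[k][i] * A[k][j] for k in range(len(zs))) for j in range(n)]
       for i in range(n)]
Aty = [sum(A[k][i] * target[k] for k in range(len(zs))) for i in range(n)]
coeffs = solve(AtA, Aty)
fit = [sum(c * b(z) for c, b in zip(coeffs, basis)) for z in zs]
print(coeffs)
```

In this toy case the two pulse functions alone reproduce the uniform target exactly, so the solver assigns the triangle a zero coefficient.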
Abstract:
Taking functional programming to its extremities in search of simplicity still requires integration with other development methods (e.g. formal methods). Induction is the key to deriving and verifying functional programs, but it can be simplified by packaging proofs with functions, particularly folds, on data (structures). Totally Functional Programming (TFP) avoids the complexities of interpretation by directly representing data (structures) as platonic combinators: the functions characteristic to the data. The link between the two simplifications is that platonic combinators are a kind of partially-applied fold, which means that platonic combinators inherit fold-theoretic properties, but with some apparent simplifications due to the platonic combinator representation. However, despite evidence from within functional programming suggesting that TFP is widely applicable, significant work remains before TFP as such could be widely adopted.
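The idea of representing a data structure directly by its fold can be illustrated with a Church-style list encoding. This is a minimal Python sketch of the general idea (the names and encoding are ours), not the notation used in the TFP work:

```python
# A "platonic" list is represented directly by its fold: the list
# [1, 2, 3] *is* the function that, given a step f and a base z,
# computes f(1, f(2, f(3, z))).

def nil(f, z):
    return z                      # folding the empty list yields the base

def cons(head, tail):
    return lambda f, z: f(head, tail(f, z))

xs = cons(1, cons(2, cons(3, nil)))        # the list [1, 2, 3] as a fold

total   = xs(lambda h, acc: h + acc, 0)    # sum of elements
length  = xs(lambda h, acc: 1 + acc, 0)    # number of elements
as_list = xs(lambda h, acc: [h] + acc, []) # convert back to a native list

print(total, length, as_list)  # 6 3 [1, 2, 3]
```

Because the data *is* its fold, fold-theoretic properties (e.g. fusion laws) apply to these values directly, which is the simplification the abstract refers to.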
Abstract:
The artificial chaperone method for protein refolding developed by Rozema et al. (Rozema, D.; Gellman, S. H. J. Am. Chem. Soc. 1995, 117 (8), 2373-2374) involves the sequential dilution of denatured protein into a buffer containing detergent (cetyltrimethylammonium bromide, CTAB) and then into a refolding buffer containing cyclodextrin (CD). In this paper a simplified one-step artificial chaperone method is reported, whereby CTAB is added directly to the denatured solution, which is then diluted directly into a refolding buffer containing β-cyclodextrin (β-CD). This new method can be applied at high protein concentrations, resulting in smaller processing volumes and a more concentrated protein solution following refolding. The increase in achievable protein concentration results from the enhanced solubility of CTAB at elevated temperatures in concentrated denaturant. The refolding yields obtained for the new method were significantly higher than for control experiments lacking additives and were comparable to the yields obtained with the classical two-step approach. A study of the effect of β-CD and CTAB concentrations on refolding yield suggested two operational regimes: slow stripping (β-CD/CTAB ≈ 1), most suited to higher protein concentrations, and fast stripping (β-CD/CTAB ≈ 2.7), best suited to lower protein concentrations. An increased chaotrope concentration resulted in higher refolding yields and an enlarged operational regime.
Abstract:
The role of catecholamines in the control of the GnRH pulse generator is unclear, as studies have relied on the use of peripheral or intracerebroventricular injections, which lack specificity in relation to the anatomical site of action. Direct brain-site infusions have been used; however, these are limited by the difficulty of accurately targeting small brain regions. One such area of interest in the control of GnRH is the median eminence and arcuate nucleus within the medial basal hypothalamus. Here we describe a method of stereotaxically targeting this area in a large animal (the sheep) and an infusion system for delivering drugs to unrestrained, conscious animals. To test our technique we infused the dopamine agonist quinpirole, or vehicle, into the medial basal hypothalamus of ovariectomised ewes. Quinpirole significantly suppressed LH pulsatility only in animals with injectors located close to the lateral median eminence. This in vivo result supports the hypothesis that dopamine inhibits GnRH secretion by presynaptic inhibition in the lateral median eminence. In addition, infusion of quinpirole into the medial basal hypothalamus suppressed prolactin secretion, providing in vivo evidence consistent with the hypothesis that there are stimulatory autoreceptors on tubero-infundibular dopamine neurons. (C) 1997 Elsevier Science B.V.