275 results for Equivalence Proof
Abstract:
This article examines some questions of statutory interpretation as they apply to section 130 of the Land Title Act 1994 (Qld).
Abstract:
In Chapters 1 through 9 of the book (with the exception of a brief discussion of observers and integral action in Section 5.5 of Chapter 5) we considered constrained optimal control problems for systems without uncertainty, that is, with no unmodelled dynamics or disturbances, and where the full state was available for measurement. More realistically, however, it is necessary to consider control problems for systems with uncertainty. This chapter addresses some of the issues that arise in this situation. As in Chapter 9, we adopt a stochastic description of uncertainty, which associates probability distributions with the uncertain elements, that is, disturbances and initial conditions. (See Section 12.6 for references to alternative approaches to modelling uncertainty.) When only incomplete state information is available, a popular observer-based control strategy in the presence of stochastic disturbances is to use the certainty equivalence (CE) principle, introduced in Section 5.5 of Chapter 5 for deterministic systems. In the stochastic framework, CE consists of estimating the state and then using these estimates as if they were the true state in the control law that would result if the problem were formulated as a deterministic problem (that is, without uncertainty). This strategy is motivated by the unconstrained problem with a quadratic objective function, for which CE is indeed the optimal solution (Åström 1970, Bertsekas 1976). One of the aims of this chapter is to explore the issues that arise from the use of CE in RHC in the presence of constraints. We then turn to the obvious question of the optimality of the CE principle. We show that CE is not, in general, optimal. We also analyse the possibility of obtaining truly optimal solutions for single-input linear systems with input constraints and uncertainty arising from output feedback and stochastic disturbances. We first find the optimal solution for the case of horizon N = 1, and then we indicate the complications that arise in the case of horizon N = 2. Our conclusion is that, for the case of linear constrained systems, the extra effort involved in the optimal feedback policy is probably not justified in practice. Indeed, we show by example that CE can give near-optimal performance. We thus advocate this approach in real applications.
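To make the CE idea concrete, the following minimal sketch pairs a scalar Kalman filter with a saturated state-feedback law: the estimate is used in place of the true state, and the input is then clipped to satisfy the constraint. The plant parameters, noise levels and gain are illustrative assumptions, not values from the chapter, and the sketch deliberately omits the receding-horizon optimisation itself.

    # Minimal certainty-equivalence (CE) sketch for a scalar system
    # x+ = a*x + b*sat(u) + w with noisy measurements y = x + v.
    # All numerical values are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(0)
    a, b = 0.9, 1.0          # assumed plant parameters
    q, r = 0.1, 0.2          # process / measurement noise std. deviations
    K = 0.5                  # deterministic (unconstrained) feedback gain
    u_max = 1.0              # input constraint |u| <= u_max

    x, x_hat, P = 2.0, 0.0, 1.0   # true state, estimate, estimate variance
    for t in range(50):
        # CE step 1: act on the estimate as if it were the true state,
        # then clip the input to satisfy the constraint.
        u = np.clip(-K * x_hat, -u_max, u_max)
        # Plant update with process noise, then a noisy measurement.
        x = a * x + b * u + q * rng.standard_normal()
        y = x + r * rng.standard_normal()
        # CE step 2: scalar Kalman filter update of the state estimate.
        x_pred, P_pred = a * x_hat + b * u, a * P * a + q**2
        L = P_pred / (P_pred + r**2)
        x_hat = x_pred + L * (y - x_pred)
        P = (1 - L) * P_pred

    print(f"final state {x:.3f}, final estimate {x_hat:.3f}")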
Abstract:
This article studies the problem of transforming a process model with an arbitrary topology into an equivalent well-structured process model. While this problem has received significant attention, there is still no full characterization of the class of unstructured process models that can be transformed into well-structured ones, nor an automated method for structuring any process model that belongs to this class. This article fills this gap in the context of acyclic process models. The article defines a necessary and sufficient condition for an unstructured acyclic process model to have an equivalent well-structured process model under fully concurrent bisimulation, as well as a complete structuring method. The method has been implemented as a tool that takes process models captured in the BPMN and EPC notations as input. The article also reports on an empirical evaluation of the structuring method using a repository of process models from commercial practice.
Abstract:
Process compliance measurement is receiving increasing attention in companies due to stricter legal requirements and market pressure for operational excellence. Yet the metrics to quantify process compliance have only been defined recently. A major criticism is that existing measures appear unintuitive. In this paper, we trace this problem back to a more foundational question: which notion of behavioural equivalence is appropriate for discussing compliance? We present a quantification approach based on behavioural profiles, a process abstraction mechanism. Behavioural profiles can be regarded as weaker than existing equivalence notions such as trace equivalence, and they can be calculated efficiently. As a validation, we present a corresponding implementation that measures the compliance of logs against a normative process model. This implementation is being evaluated in a case study with an international service provider.
Abstract:
Process compliance measurement is receiving increasing attention in companies due to stricter legal requirements and market pressure for operational excellence. To judge the compliance of business processing, the degree of behavioural deviation of a case, i.e., an observed execution sequence, is quantified with respect to a process model (referred to as fitness, or recall). Recently, different compliance measures have been proposed. Still, nearly all of them are grounded in state-based techniques and, in particular, the trace equivalence criterion. As a consequence, these approaches have to deal with the state explosion problem. In this paper, we argue that a behavioural abstraction may be leveraged to measure the compliance of a process log – a collection of cases. To this end, we utilise causal behavioural profiles that capture the behavioural characteristics of process models and cases, and that can be computed efficiently. We propose different compliance measures based on these profiles, discuss the impact of noise in process logs on our measures, and show how diagnostic information on non-compliance is derived. As a validation, we report on the findings of applying our approach in a case study with an international service provider.
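As a rough illustration of profile-based comparison (a simplification of, not a substitute for, the measures proposed in the paper), the sketch below derives order relations between pairs of activities from a set of traces and scores a single case against the relations observed for a model. The relation labels and the compliance ratio are assumptions made for the example.

    # Illustrative behavioural-profile sketch; not the paper's exact measures.
    from itertools import combinations

    def weak_order(traces):
        """Pairs (a, b) such that a occurs before b in at least one trace."""
        rel = set()
        for trace in traces:
            for i, a in enumerate(trace):
                for b in trace[i + 1:]:
                    rel.add((a, b))
        return rel

    def profile(traces):
        """Classify each activity pair as strict order, exclusive or interleaving."""
        wo = weak_order(traces)
        acts = {a for t in traces for a in t}
        prof = {}
        for a, b in combinations(sorted(acts), 2):
            if (a, b) in wo and (b, a) not in wo:
                prof[(a, b)] = '->'        # strict order
            elif (b, a) in wo and (a, b) not in wo:
                prof[(a, b)] = '<-'        # reverse strict order
            elif (a, b) in wo and (b, a) in wo:
                prof[(a, b)] = '||'        # interleaving
            else:
                prof[(a, b)] = '+'         # exclusiveness
        return prof

    def compliance(model_traces, case):
        """Share of the case's ordered pairs that the model's profile allows."""
        wo_model = weak_order(model_traces)
        wo_case = weak_order([case])
        return len(wo_case & wo_model) / len(wo_case) if wo_case else 1.0

    model = [['a', 'b', 'c'], ['a', 'c', 'b']]
    print(profile(model))                      # b and c interleave, a precedes both
    print(compliance(model, ['a', 'c', 'b']))  # 1.0 -> fully compliant case
    print(compliance(model, ['b', 'a', 'c']))  # ~0.67 -> partially compliant case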
Abstract:
Analysis of behavioural consistency is an important aspect of software engineering. In process and service management, consistency verification of behavioural models has manifold applications. For instance, a business process model used as a system specification and a corresponding workflow model used as an implementation have to be consistent. Another example is the analysis of the degree to which a process log of executed business operations is consistent with the corresponding normative process model. Typically, existing notions of behaviour equivalence, such as bisimulation and trace equivalence, are applied as consistency notions. Still, checking these notions requires exponential time and yields only a Boolean result. In many cases, however, a quantification of behavioural deviation is needed, along with concepts to isolate the source of deviation. In this article, we propose causal behavioural profiles as the basis for a consistency notion. These profiles capture essential behavioural information, such as order, exclusiveness, and causality between pairs of activities of a process model. Consistency based on these profiles is weaker than trace equivalence, but can be computed efficiently for a broad class of models. We introduce techniques for the computation of causal behavioural profiles based on structural decomposition of sound free-choice workflow systems, provided that unstructured net fragments are acyclic or can be traced back to S- or T-nets. We also elaborate on the findings of applying our technique to three industry model collections.
Abstract:
Identification of behavioural contradictions is an important aspect of software engineering, in particular for checking the consistency between a business process model used as a system specification and a corresponding workflow model used as an implementation. In this paper, we propose causal behavioural profiles as the basis for a consistency notion; these profiles capture essential behavioural information, such as order, exclusiveness, and causality between pairs of activities. Existing notions of behavioural equivalence, such as bisimulation and trace equivalence, might also be applied as consistency notions, but checking them requires exponential time. Our concept of causal behavioural profiles provides a weaker behavioural consistency notion that can be computed efficiently using structural decomposition techniques for sound free-choice workflow systems, provided that unstructured net fragments are acyclic or can be traced back to S- or T-nets.
Abstract:
Traction force microscopy (TFM) is commonly used to estimate cells’ traction forces from the deformation that they cause on their substrate. The accuracy of TFM depends strongly on the computational methods used to measure the deformation of the substrate and estimate the forces, and also on the specifics of the experimental set-up. Computer simulations can be used to evaluate the effect of both the computational methods and the experimental set-up without the need to perform numerous experiments. Here, we present one such TFM simulator that addresses several limitations of the existing ones. As a proof of principle, we recreate a TFM experimental set-up and apply a classic 2D TFM algorithm to recover the forces. In summary, our simulator provides a valuable tool to study the performance of TFM methods, refine experimental set-ups, and guide the extraction of biological conclusions from TFM experiments.
Abstract:
In vivo osteochondral defect models predominantly consist of small animals, such as rabbits. Although they have the advantage of low cost and manageability, their joints are smaller and heal more easily compared with those of larger animals or humans. We hypothesized that osteochondral cores from large animals can be implanted subcutaneously in rats to create an ectopic osteochondral defect model for routine and high-throughput screening of multiphasic scaffold designs and/or tissue-engineered constructs (TECs). Bovine osteochondral plugs with a 4 mm diameter osteochondral defect were fitted with novel multiphasic osteochondral grafts composed of chondrocyte-seeded alginate gels and osteoblast-seeded polycaprolactone scaffolds, prior to being implanted subcutaneously in rats with bone morphogenetic protein-7. After 12 weeks of in vivo implantation, histological and micro-computed tomography analyses demonstrated that the TECs are susceptible to mineralization. Additionally, there was limited bone formation in the scaffold. These results suggest that the current model requires optimization to facilitate robust bone regeneration and vascular infiltration into the defect site. Taken together, this study provides a proof of concept for a high-throughput osteochondral defect model. With further optimization, the presented hybrid in vivo model may address the growing need for a cost-effective way to screen osteochondral repair strategies before moving to large-animal preclinical trials.
Abstract:
Objective: To determine whether a 5-day course of oral prednisolone is superior to a 3-day course in reducing the 2-week morbidity of children with asthma exacerbations who are not hospitalised.
Design, setting and participants: Double-blind randomised controlled trial of asthma outcomes following a 5-day course of oral prednisolone (1 mg/kg) compared with a 3-day course of prednisolone plus placebo for 2 days. Participants were children aged 2–15 years who presented to the emergency departments of three Queensland hospitals between March 2004 and February 2007 with an acute exacerbation of asthma but were not hospitalised. Sample size was defined a priori for a study power of 90%.
Main outcome measures: Difference in the proportion of children who were symptom-free at Day 7, as measured by intention-to-treat (ITT) and per-protocol analyses; quality of life (QOL) on Days 7 and 14.
Results: 201 children were enrolled, and there was an 82% completion rate. There was no difference between groups in the proportion of children who were symptom-free (observed difference, 0.04 [95% CI, −0.09 to 0.18] by ITT analysis; 0.04 [95% CI, −0.17 to 0.09] by per-protocol analysis). There was also no difference between groups in QOL (P = 0.42). The difference between groups for the primary outcome was within the equivalence range calculated post hoc.
Conclusion: A 5-day course of oral prednisolone confers no advantage over a 3-day course for children with asthma exacerbations who are not hospitalised.
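For readers unfamiliar with how such an interval is formed, the following minimal sketch computes a Wald 95% confidence interval for a difference in two proportions, the kind of interval reported for the symptom-free outcome above; the counts used are made-up placeholders, not the trial's data.

    # Wald 95% CI for a difference in two proportions (illustrative only).
    import math

    def diff_in_proportions_ci(successes1, n1, successes2, n2, z=1.96):
        p1, p2 = successes1 / n1, successes2 / n2
        diff = p1 - p2
        se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
        return diff, (diff - z * se, diff + z * se)

    # Example with hypothetical counts (50/80 vs 46/82 symptom-free).
    print(diff_in_proportions_ci(50, 80, 46, 82))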
Abstract:
Gel dosimetry and plastic chemical dosimeters such as PRESAGE™ are capable of very accurately mapping dose distributions in three dimensions. Combined with their near tissue equivalence, one would expect that after several decades of development they would be the dosimeters of choice for three-dimensional dosimetry; however, they have not achieved widespread clinical use. This presentation will include a brief description and history of developments in gels and 3D plastics for dosimetry, their limitations and advantages, and their role in the future.
Abstract:
Tissue engineering of vascularized constructs has great utility in reconstructive surgery. While we have been successful in generating vascularized granulation-like tissue and adipose tissue in an in vivo tissue engineering chamber, production of other differentiated tissues in a stable construct remains a challenge. One approach is to utilize potent differentiation factors, which can influence the base tissue. Endothelial precursor cells (EPCs) have the ability both to carry differentiation factors and to home to developing vasculature. In this study, proof-of-principle experiments demonstrate that such cells can be recruited from the circulation into an in vivo tissue engineering chamber. CXC chemokine ligand 12 (CXCL12)/stromal cell-derived factor 1 was infused into the chamber through Alzet osmotic pumps and chamber cannulation between days 0 and 7, and facilitated recruitment of systemically inoculated exogenous human EPCs injected on day 6. CXCL12 infusion resulted in an eightfold increase in EPC recruitment at 2 (p = 0.03) and 7 days post-infusion (p = 0.008). Delivery of chemotactic, proliferation and/or differentiation factors and appropriately timed introduction of effective cells may allow us to better exploit the regenerative potential of the established chamber construct. © 2009 Mary Ann Liebert, Inc.
Abstract:
Solving indeterminate algebraic equations in integers is a classic topic in mathematics curricula across the grades. At the undergraduate level, the study of solutions of non-linear equations of this kind can be motivated by the use of technology. This article shows how the unity of geometric contextualization and spreadsheet-based amplification of this topic can provide a discovery experience for prospective secondary teachers and information technology students. Such an experience can be extended to include a transition from computationally driven conjecturing to a formal proof based on a number of simple yet useful techniques.
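As a rough analogue of the spreadsheet-based exploration the article describes, the short sketch below brute-forces integer solutions of a sample non-linear equation; the equation x² + y² = z² is a hypothetical stand-in, not necessarily one of the equations treated in the article.

    # Brute-force search for integer solutions, analogous to filling a
    # spreadsheet grid and scanning it for hits.  The equation is an
    # illustrative stand-in, not taken from the article.
    def integer_solutions(bound):
        return [(x, y, z)
                for x in range(1, bound)
                for y in range(x, bound)          # y >= x avoids mirror duplicates
                for z in range(y, bound)
                if x * x + y * y == z * z]

    # Inspecting the output can suggest conjectures (e.g. one leg of every
    # primitive triple is divisible by 3) that are then proved formally.
    print(integer_solutions(30))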
Abstract:
We present a proof of concept for a novel nanosensor for the detection of ultra-trace amounts of bio-active molecules in complex matrices. The nanosensor consists of gold nanoparticles with an ultra-thin silica shell and surface-attached antibodies, which allows for the immobilization and direct detection of bio-active molecules by surface-enhanced Raman spectroscopy (SERS) without requiring a Raman label. The ultra-thin passive layer (~1.3 nm thick) prevents competing molecules from binding non-selectively to the gold surface without compromising the signal enhancement. The antibodies attached to the surface of the nanoparticles selectively bind the target molecule with high affinity. The interaction between the nanosensor and the target analyte results in conformational rearrangements of the antibody binding sites, leading to significant changes in the surface-enhanced Raman spectra of the nanoparticles compared with the spectra of the unreacted nanoparticles. Nanosensors of this design targeting the bio-active compounds erythropoietin and caffeine were able to detect ultra-trace amounts of the analytes, with lower quantification limits of 3.5×10⁻¹³ M and 1×10⁻⁹ M, respectively.
Abstract:
Recurrence relations in mathematics form a very powerful and compact way of looking at a wide range of relationships. Traditionally, the concept of recurrence has often been a difficult one for the secondary teacher to convey to students. Closely related to the powerful proof technique of mathematical induction, recurrences are able to capture many relationships in formulas much simpler than so-called direct or closed formulas. In computer science, recursive coding often has a similar compactness property and, perhaps not surprisingly, suffers from similar problems in the classroom: students often find both the basic concepts and the practicalities elusive. Using models designed to illuminate the relevant principles for students, we offer a range of examples that use the modern spreadsheet environment to illustrate the great expressive and computational power of recurrences.
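To echo the spreadsheet models the article advocates, the following minimal sketch expresses the same quantity once as a recurrence and once as a closed formula; the compound-interest example is an illustrative assumption, not one drawn from the article.

    # A recurrence versus its closed form (illustrative compound-interest example).
    def balance_recurrence(principal, rate, years):
        """b(0) = principal, b(n) = b(n-1) * (1 + rate): one step per spreadsheet row."""
        b = principal
        for _ in range(years):
            b = b * (1 + rate)
        return b

    def balance_closed_form(principal, rate, years):
        """Equivalent direct formula b(n) = principal * (1 + rate)**n."""
        return principal * (1 + rate) ** years

    # Both formulations agree, up to floating-point rounding.
    assert abs(balance_recurrence(1000, 0.05, 10) -
               balance_closed_form(1000, 0.05, 10)) < 1e-9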