207 results for Constraints-Led Approach
at University of Queensland eSpace - Australia
Abstract:
We discuss how integrity consistency constraints between different UML models can be precisely defined at the language level. In doing so, we introduce a formal object-oriented metamodeling approach in which integrity consistency constraints between UML models are defined in terms of invariants of the UML model elements used to define the models at the language level. The constraints are formally defined using Object-Z. We demonstrate how integrity consistency constraints for UML models can be precisely defined at the language level; once completed, the formal description of the consistency constraints will serve as a precise reference for checking the consistency of UML models as well as for tool development.
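The flavour of such a language-level invariant can be sketched in executable form. This is a hypothetical illustration only, not the paper's Object-Z formalization: the model-element structure and the specific invariant (every class referenced by a sequence diagram's lifelines must be declared in the class diagram) are invented for the example.

```python
# Hypothetical sketch of a language-level consistency invariant between
# two UML models. The dictionary-based model representation and the
# invariant itself are invented for illustration; the paper expresses
# such invariants as Object-Z invariants over UML metamodel elements.

def classes_consistent(class_diagram, sequence_diagram):
    """Invariant: every class referenced by a sequence diagram's
    lifelines must be declared in the class diagram."""
    declared = {c["name"] for c in class_diagram["classes"]}
    referenced = {l["class"] for l in sequence_diagram["lifelines"]}
    return referenced <= declared  # set inclusion, i.e. no dangling refs

class_model = {"classes": [{"name": "Account"}, {"name": "Customer"}]}
seq_model = {"lifelines": [{"class": "Account"}, {"class": "Customer"}]}
bad_seq = {"lifelines": [{"class": "Invoice"}]}   # Invoice is undeclared
```

A checking tool of the kind the abstract envisages would evaluate many such invariants over the full UML metamodel.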
Abstract:
Applied econometricians often fail to impose economic regularity constraints in the exact form economic theory prescribes. We show how the Singular Value Decomposition (SVD) Theorem and Markov Chain Monte Carlo (MCMC) methods can be used to rigorously impose time- and firm-varying equality and inequality constraints. To illustrate the technique we estimate a system of translog input demand functions subject to all the constraints implied by economic theory, including observation-varying symmetry and concavity constraints. Results are presented in the form of characteristics of the estimated posterior distributions of functions of the parameters. Copyright (C) 2001 John Wiley & Sons, Ltd.
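The equality-constraint half of such a scheme can be sketched with the standard SVD null-space construction. The restriction matrix, right-hand side, and dimensions below are invented; the point is only that any parameter vector of the form theta0 + N @ v satisfies the restrictions exactly, so an MCMC sampler can move freely in the unconstrained coordinate v:

```python
import numpy as np

# Sketch: imposing linear equality restrictions R @ theta = r exactly via
# the SVD. R, r and the toy interpretation of the rows are invented.
R = np.array([[1.0, 1.0, 1.0],      # e.g. coefficients sum to one
              [0.0, 1.0, -1.0]])    # e.g. a symmetry restriction
r = np.array([1.0, 0.0])

theta0, *_ = np.linalg.lstsq(R, r, rcond=None)   # one particular solution
_, s, Vt = np.linalg.svd(R)
N = Vt[len(s):].T      # rows of Vt beyond rank(R) span the null space of R

v = np.array([0.7])    # a free coordinate an MCMC step would draw
theta = theta0 + N @ v # satisfies both restrictions exactly, for any v
```

Inequality constraints (such as concavity) do not reduce to a null space; in MCMC they are typically handled by restricting the sampler to the feasible region instead.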
Abstract:
In this work, a new method of optimization is successfully applied to the theoretical design of compact, actively shielded, clinical MRI magnets. The problem is formulated as a two-step process in which the desired current densities on multiple, co-axial surface layers are first calculated by solving Fredholm equations of the first kind. Non-linear optimization methods with inequality constraints are then invoked to fit practical magnet coils to the desired current densities. The current density approach allows rapid prototyping of unusual magnet designs. The emphasis of this work is on the optimal design of short, actively shielded MRI magnets for whole-body imaging. Details of the hybrid numerical model are presented, and the model is used to investigate compact, symmetric, and asymmetric MRI magnets. Magnet designs are presented for actively shielded, symmetric magnets of coil length 1.0 m, which is considerably shorter than currently available designs of comparable dsv size. Novel, actively shielded, asymmetric magnet designs are also presented in which the beginning of a 50-cm dsv is positioned just 11 cm from the end of the coil structure, allowing much improved access to the patient and reduced patient claustrophobia. Magn Reson Med 45:331540, 2001. (C) 2001 Wiley-Liss, Inc.
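The first step of such a two-step design can be illustrated with a toy discretization. The kernel and target profile below are invented stand-ins (the real problem involves Biot-Savart-type kernels and magnet geometry); the sketch only shows the structure of a discretized first-kind Fredholm equation b = K j and why, being ill-posed in general, it is solved with regularization:

```python
import numpy as np

# Toy first-kind Fredholm problem b(x) = integral of k(x, y) j(y) dy,
# discretized on a grid and solved with Tikhonov regularization.
# Kernel and 'current density' are invented for illustration.
n = 40
x = np.linspace(0.0, 1.0, n)
K = np.exp(-np.abs(x[:, None] - x[None, :])) / n   # discretized kernel
j_true = np.sin(np.pi * x)                         # 'current density'
b = K @ j_true                                     # resulting 'field'

lam = 1e-6                                         # regularization weight
# Regularized normal equations: (K'K + lam I) j = K'b
j_est = np.linalg.solve(K.T @ K + lam * np.eye(n), K.T @ b)
```

The abstract's second step, fitting real coils to the recovered current density under inequality constraints, would then be a separate non-linear optimization.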
Abstract:
This article describes a workshop and consultation process utilized by four community rehabilitation services and other stakeholders. This process led to the development of an evaluation Template upon which to plan a service evaluation. The Template comprises a number of guiding questions within three broad domains: the people domain (pertaining to the client, their disability, their family and service context), the program domain (pertaining to the service and its activities), and the perspective domain (pertaining to the broader social and community context). It is suggested that the Template, the process by which it was developed, and the guidelines for its use will have relevance to rehabilitation managers, administrators, and others involved in the evaluation of community rehabilitation services.
Abstract:
Interconnecting business processes across systems and organisations is considered to provide significant benefits, such as greater process transparency, higher degrees of integration, facilitation of communication, and consequently higher throughput in a given time interval. However, achieving these benefits requires tackling constraints; in the context of this paper these are the privacy requirements of the involved workflows and their mutual dependencies. Workflow views are a promising conceptual approach to addressing the issue of privacy; however, this approach requires addressing the interdependencies between a workflow view and the adjacent private workflow. In this paper we focus on three aspects concerning support for the execution of cross-organisational workflows that have been modelled with a workflow view approach: (i) communication between the entities of a view-based workflow model, (ii) their impact on an extended workflow engine, and (iii) the design of a cross-organisational workflow architecture (CWA). We consider communication aspects in terms of state dependencies and control flow dependencies. We propose to tightly couple private workflow and workflow view with state dependencies, and to loosely couple workflow views with control flow dependencies. We introduce a Petri-net-based state transition approach that binds states of private workflow tasks to their adjacent workflow view tasks. On the basis of these communication aspects we develop a CWA for view-based cross-organisational workflow execution. Its concepts are valid for mediated and unmediated interactions and do not prescribe a particular technology. The concepts are demonstrated by a scenario run by two extended workflow management systems. (C) 2004 Elsevier B.V. All rights reserved.
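The tight state coupling between a private task and its adjacent view task can be sketched as a simple state binding. The state names and the mapping below are invented for illustration; the paper formalizes this idea with Petri-net state transitions rather than plain code:

```python
# Hypothetical sketch of a state dependency: every state change of a
# private workflow task is immediately reflected in its view task.
# State names and the binding rule are invented for illustration.

PRIVATE_TO_VIEW = {            # binding of private states to view states
    "initialized": "ready",
    "running": "active",
    "completed": "done",
}

class BoundTask:
    """A private task whose state changes are pushed to its view task."""
    def __init__(self):
        self.private_state = "initialized"
        self.view_state = PRIVATE_TO_VIEW[self.private_state]

    def transition(self, new_state):
        if new_state not in PRIVATE_TO_VIEW:
            raise ValueError(f"unknown state: {new_state}")
        self.private_state = new_state
        self.view_state = PRIVATE_TO_VIEW[new_state]   # tight coupling

task = BoundTask()
task.transition("running")
```

Control flow dependencies between different organisations' workflow views would, by contrast, be coupled loosely, e.g. via messages rather than a shared binding.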
Abstract:
Over the past 30 years, research in the area of applied behaviour analysis has led to a rich knowledge and understanding of the variables that influence human behaviour. This understanding and knowledge has given rise to a range of assessment and intervention techniques that have been applied to individuals with challenging behaviour. Interventions have produced changes in the severity and frequency of behaviours such as self-injury, aggression, and property destruction, and have also led to the acquisition of desired behaviours. While behaviour change has been achieved, families have expressed a desire for positive behaviour support approaches that adopt a family focus. Research and development of support frameworks that emphasise the interrelatedness of family members, and the child with a disability as part of his or her family, have gained prominence in the family systems literature. The present paper reviews some of the behaviourally based research in this area. Through the use of a case illustration, the authors discuss the links between behavioural support and family-centred support systems for children with developmental disabilities. Theoretical and practical implications are considered and areas for future research are highlighted.
Abstract:
Peptidyl privileged structures have been widely used by many groups to discover biologically active molecules. In this context, privileged substructures are used as hydrophobic anchors to which peptide functionality is appended to gain specificity. Utilization of this concept has led to the discovery of many different active compounds at a wide range of biological receptors. A synthetic approach to these compounds has been developed on a safety-catch linker that allows rapid preparation of large libraries of these molecules. Importantly, amide bond formation/cleavage through treatment with amines is the final step; this linker strategy allows significant diversification to be easily incorporated and requires only the inclusion of an amide bond. In addition, chemistry has been developed that permits the urea moiety to be inserted at the N-terminus of the peptide, allowing the same set of amines (either privileged substructures or amino acid analogues) to be used at both the N- and C-termini of the molecule. To show the robustness of this approach, a small library of peptidyl privileged structures was synthesized, illustrating that large combinatorial libraries can be synthesized using these technologies.
Abstract:
The estimated parameters of output distance functions frequently violate the monotonicity, quasi-convexity and convexity constraints implied by economic theory, leading to estimated elasticities and shadow prices that are incorrectly signed, and ultimately to perverse conclusions concerning the effects of input and output changes on productivity growth and relative efficiency levels. We show how a Bayesian approach can be used to impose these constraints on the parameters of a translog output distance function. Implementing the approach involves the use of a Gibbs sampler with data augmentation. A Metropolis-Hastings algorithm is also used within the Gibbs sampler to simulate observations from truncated pdfs. Our methods are developed for the case where panel data are available and technical inefficiency effects are assumed to be time-invariant. Two models, a fixed effects model and a random effects model, are developed and applied to panel data on 17 European railways. We observe significant changes in estimated elasticities and shadow price ratios when regularity restrictions are imposed. (c) 2004 Elsevier B.V. All rights reserved.
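One ingredient of such a sampler, drawing from a truncated density with a Metropolis-Hastings step, can be sketched in isolation. The target, truncation region, and tuning constants below are invented; sign constraints of this kind are how monotonicity restrictions translate into truncated conditionals inside a Gibbs sampler:

```python
import math
import random

# Sketch: random-walk Metropolis-Hastings draws from a normal density
# truncated to [lower, infinity). All numbers below are invented.

def mh_truncated_normal(mu, sigma, lower, n, scale=0.5, seed=1):
    """MH draws from N(mu, sigma^2) restricted to x >= lower."""
    random.seed(seed)
    x = max(mu, lower + sigma)          # a feasible starting point
    out = []
    for _ in range(n):
        cand = x + random.gauss(0.0, scale)
        if cand >= lower:               # reject anything outside the region
            log_ratio = ((x - mu) ** 2 - (cand - mu) ** 2) / (2 * sigma ** 2)
            if math.log(random.random()) < log_ratio:
                x = cand
        out.append(x)
    return out

# e.g. an elasticity whose posterior mass must lie above zero
draws = mh_truncated_normal(mu=-0.2, sigma=1.0, lower=0.0, n=3000)
```

In the full approach this step would run inside a Gibbs cycle over blocks of parameters and inefficiency effects, not on its own.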
Abstract:
Refinement in software engineering allows a specification to be developed in stages, with design decisions taken at earlier stages constraining the design at later stages. Refinement in complex data models is difficult due to the lack of a way of defining constraints that can be progressively maintained over increasingly detailed refinements. Category theory provides a way of stating wide-scale constraints. These constraints lead to a set of design guidelines that maintain the wide-scale constraints under increasing detail. Previous methods of refinement are essentially local, and the proposed method interferes very little with these local methods. The result is particularly applicable to semantic web applications, where ontologies provide systems of more or less abstract constraints on systems, which must be implemented and therefore refined by participating systems. With the approach of this paper, the concept of committing to an ontology carries much more force. (c) 2005 Elsevier B.V. All rights reserved.
Abstract:
Workflow systems have traditionally focused on so-called production processes, which are characterized by pre-definition, high volume, and repetitiveness. Recently, the deployment of workflow systems in non-traditional domains such as collaborative applications, e-learning and cross-organizational process integration has put forth new requirements for flexible and dynamic specification. However, this flexibility cannot be offered at the expense of control, a critical requirement of business processes. In this paper, we present a foundation set of constraints for flexible workflow specification. These constraints are intended to provide an appropriate balance between flexibility and control. The constraint specification framework is based on the concept of pockets of flexibility, which allows ad hoc changes and/or building of workflows for highly flexible processes. Basically, our approach is to provide the ability to execute on the basis of a partially specified model, where the full specification of the model is made at runtime and may be unique to each instance. The verification of dynamically built models is essential. Whereas ensuring that the model conforms to specified constraints does not pose great difficulty, ensuring that the constraint set itself does not carry conflicts and redundancy is an interesting and challenging problem. In this paper, we provide a discussion of both the static and dynamic verification aspects. We also briefly present Chameleon, a prototype workflow engine that implements these concepts. (c) 2004 Elsevier Ltd. All rights reserved.
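Static verification of a constraint set can be illustrated with a deliberately simple constraint language. The "a before b" ordering constraints below are invented and far simpler than the paper's pockets-of-flexibility framework, but they show one concrete notion of conflict: ordering constraints that form a cycle can never be satisfied by any workflow instance.

```python
# Sketch: detecting conflicts in a set of ordering constraints on
# workflow tasks. Constraints are invented 'a before b' pairs; a cycle
# among them means the constraint set is unsatisfiable.

def has_conflict(before_pairs):
    """True if the 'a before b' constraints are cyclic (unsatisfiable)."""
    graph = {}
    for a, b in before_pairs:
        graph.setdefault(a, set()).add(b)
    visited, on_path = set(), set()

    def dfs(node):
        if node in on_path:
            return True               # back edge: cycle found
        if node in visited:
            return False
        on_path.add(node)
        cyclic = any(dfs(n) for n in graph.get(node, ()))
        on_path.discard(node)
        visited.add(node)
        return cyclic

    return any(dfs(n) for n in list(graph))

ok = has_conflict([("approve", "ship"), ("pack", "ship")])
bad = has_conflict([("a", "b"), ("b", "c"), ("c", "a")])
```

Dynamic verification would additionally check each partially specified instance, built at runtime, against the (already consistent) constraint set.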
Abstract:
The effects of dredging on the benthic communities in the Noosa River, a subtropical estuary in SE Queensland, Australia, were examined using a 'Beyond BACI' experimental design. Changes in the numbers and types of animals and characteristics of the sediments in response to dredging in the coarse sandy sediments near the mouth of the estuary were compared with those occurring naturally in two control regions. Samples were collected twice before and twice after the dredging operations, at multiple spatial scales, ranging from metres to kilometres. Significant effects from the dredging were detected on the abundance of some polychaetes and bivalves and on two measures of diversity (numbers of polychaete families and total taxonomic richness). In addition, the dredging caused a significant increase in the diversity of sediment particle sizes found in the dredged region compared with elsewhere. Community composition in the dredged region was more similar to that in the control regions after dredging than before. Changes in the characteristics of the sedimentary environment as a result of the dredging appeared to lead the benthic communities of the dredged region to become more similar to those elsewhere in the estuary; dredging in this system may therefore have led to the loss or reduction in area of a specific type of habitat, with implications for overall patterns of biodiversity and ecosystem function. (c) 2006 Elsevier B.V. All rights reserved.
Abstract:
The worldwide trend towards deregulation of the electricity generation and transmission industries has led to dramatic changes in system operation and planning procedures. The optimum approach to transmission-expansion planning in a deregulated environment is an open problem, especially when the responsibilities of the organisations carrying out the planning work need to be addressed. To date there is a consensus that the system operator and network manager perform the expansion planning work in a centralised way. However, with increasing input from the electricity market, the objectives, constraints and approaches toward transmission planning should be carefully designed to ensure system reliability as well as to meet market requirements. A market-oriented approach for transmission planning in a deregulated environment is proposed. Case studies using the IEEE 14-bus system and the Australian national electricity market grid are performed. In addition, the proposed method is compared with a traditional planning method to further verify its effectiveness.
Abstract:
Frequent itemset mining is well explored for various data types, and its computational complexity is well understood. There are methods that deal effectively with the computational problems. This paper presents another approach to further performance enhancement of frequent itemset computation. We have made a series of observations that led us to invent data pre-processing methods such that the final step of the Partition algorithm, where the combination of all local candidate sets must be processed, is executed on substantially smaller input data. The paper reports results from several experiments that confirmed our general, formally presented observations.
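The Partition idea the paper builds on can be sketched on toy data. The transactions, thresholds, and helper below are invented; the property being illustrated is that an itemset frequent in the whole database must be locally frequent in at least one partition, so the union of local candidate sets is a superset of the true answer, and that union is exactly the input to the final step the paper shrinks via pre-processing.

```python
from itertools import combinations

# Sketch of Partition-style local candidate generation on invented data.

def local_frequent(transactions, minsup, k=2):
    """Frequent k-itemsets within one partition of the database."""
    counts = {}
    for t in transactions:
        for combo in combinations(sorted(t), k):
            counts[combo] = counts.get(combo, 0) + 1
    return {c for c, n in counts.items() if n >= minsup}

part1 = [{"bread", "milk"}, {"bread", "milk", "eggs"}, {"milk", "eggs"}]
part2 = [{"bread", "milk"}, {"bread", "eggs"}]

# Union of local candidate sets: the input to the Partition algorithm's
# final, global counting step.
candidates = local_frequent(part1, 2) | local_frequent(part2, 1)
```

A second scan of the full database would then count each candidate globally and discard the false positives.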
Abstract:
This paper presents an approach for translating the legalese expression of business contracts into candidate business activities and processes while ensuring their compliance with the contract. This is done by progressive refinement, using a logic-based formalism to capture contract semantics and to serve as an intermediate step in the transformation. The approach is of particular value to organisations that are considering moving towards new approaches to enterprise contract management and applying them to their future contracts.