948 results for Constraints-Led Approach
Abstract:
Interconnecting business processes across systems and organisations is considered to provide significant benefits, such as greater process transparency, higher degrees of integration, facilitation of communication, and consequently higher throughput in a given time interval. However, achieving these benefits requires tackling constraints; in the context of this paper these are the privacy requirements of the involved workflows and their mutual dependencies. Workflow views are a promising conceptual approach to addressing the issue of privacy; however, this approach requires addressing the interdependencies between a workflow view and the adjacent private workflow. In this paper we focus on three aspects concerning support for the execution of cross-organisational workflows that have been modelled with a workflow view approach: (i) communication between the entities of a view-based workflow model, (ii) their impact on an extended workflow engine, and (iii) the design of a cross-organisational workflow architecture (CWA). We consider communication aspects in terms of state dependencies and control flow dependencies. We propose to tightly couple private workflow and workflow view with state dependencies, while loosely coupling workflow views with control flow dependencies. We introduce a Petri-net-based state transition approach that binds states of private workflow tasks to their adjacent workflow view task. On the basis of these communication aspects we develop a CWA for view-based cross-organisational workflow execution. Its concepts are valid for mediated and unmediated interactions and do not prescribe a particular technology. The concepts are demonstrated by a scenario run by two extended workflow management systems. (C) 2004 Elsevier B.V. All rights reserved.
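Below is a minimal Python sketch of the kind of state coupling described above: a private workflow task propagates each state transition to its adjacent view task. The class, task and state names are hypothetical and are not taken from the paper.

```python
# Minimal sketch (hypothetical names, not the paper's implementation):
# a private workflow task propagates its state transitions to the
# adjacent workflow view task, mimicking a tight state dependency.

from enum import Enum

class State(Enum):
    INITIAL = "initial"
    RUNNING = "running"
    COMPLETED = "completed"

class ViewTask:
    """Task exposed in the workflow view; mirrors the private task's state."""
    def __init__(self, name):
        self.name = name
        self.state = State.INITIAL

    def on_private_transition(self, new_state):
        # State dependency: the view task follows the private task.
        self.state = new_state

class PrivateTask:
    """Internal task bound to a view task via a state dependency."""
    def __init__(self, name, view_task):
        self.name = name
        self.state = State.INITIAL
        self.view_task = view_task

    def transition(self, new_state):
        self.state = new_state
        self.view_task.on_private_transition(new_state)

view = ViewTask("ship_order")
private = PrivateTask("pick_pack_ship", view)
private.transition(State.RUNNING)
private.transition(State.COMPLETED)
print(view.state)  # State.COMPLETED
```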
Abstract:
Over the past 30 years, research in the area of applied behaviour analysis has led to a rich knowledge and understanding of the variables that influence human behaviour. This understanding and knowledge have given rise to a range of assessment and intervention techniques that have been applied to individuals with challenging behaviour. Interventions have produced changes in the severity and frequency of behaviours such as self-injury, aggression, and property destruction, and have also led to the acquisition of desired behaviours. While behaviour change has been achieved, families have expressed a desire for positive behaviour support approaches that adopt a family focus. Research and development of support frameworks that emphasise the interrelatedness of family members, and the child with a disability as part of his or her family, have gained prominence in the family systems literature. The present paper reviews some of the behaviourally based research in this area. Through the use of a case illustration, the authors discuss the links between behavioural support and family-centred support systems for children with developmental disabilities. Theoretical and practical implications are considered and areas for future research are highlighted.
Abstract:
Peptidyl privileged structures have been widely used by many groups to discover biologically active molecules. In this context, privileged substructures are used as hydrophobic anchors to which peptide functionality is appended to gain specificity. Utilization of this concept has led to the discovery of many different active compounds at a wide range of biological receptors. A synthetic approach to these compounds has been developed on a safety-catch linker that allows rapid preparation of large libraries of these molecules. Importantly, amide bond formation/cleavage through treatment with amines is the final step; this linker strategy allows significant diversification to be easily incorporated and requires only the inclusion of an amide bond. In addition, chemistry has been developed that permits the urea moiety to be inserted at the N-terminus of the peptide, allowing the same set of amines (either privileged substructures or amino acid analogues) to be used at both the N- and C-termini of the molecule. To show the robustness of this approach, a small library of peptidyl privileged structures was synthesized, illustrating that large combinatorial libraries can be synthesized using these technologies.
Abstract:
The estimated parameters of output distance functions frequently violate the monotonicity, quasi-convexity and convexity constraints implied by economic theory, leading to estimated elasticities and shadow prices that are incorrectly signed, and ultimately to perverse conclusions concerning the effects of input and output changes on productivity growth and relative efficiency levels. We show how a Bayesian approach can be used to impose these constraints on the parameters of a translog output distance function. Implementing the approach involves the use of a Gibbs sampler with data augmentation. A Metropolis-Hastings algorithm is also used within the Gibbs sampler to simulate observations from truncated pdfs. Our methods are developed for the case where panel data are available and technical inefficiency effects are assumed to be time-invariant. Two models, a fixed effects model and a random effects model, are developed and applied to panel data on 17 European railways. We observe significant changes in estimated elasticities and shadow price ratios when regularity restrictions are imposed. (c) 2004 Elsevier B.V. All rights reserved.
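As an illustration of the sampling machinery mentioned above (not the authors' estimator), the following Python sketch shows a random-walk Metropolis-Hastings step targeting a normal full conditional truncated to a positive sign, the kind of restriction implied by monotonicity; all numerical values are invented.

```python
# Illustrative sketch only: one random-walk MH update for a parameter
# whose full conditional is N(mean, sd^2) truncated to beta > 0.

import numpy as np

rng = np.random.default_rng(0)

def mh_step_truncated_normal(beta, mean, sd, step=0.1):
    """One MH update targeting N(mean, sd^2) truncated to beta > 0."""
    proposal = beta + step * rng.standard_normal()
    if proposal <= 0.0:            # outside the constrained region: reject
        return beta
    log_ratio = (-(proposal - mean) ** 2 + (beta - mean) ** 2) / (2 * sd ** 2)
    if np.log(rng.uniform()) < log_ratio:
        return proposal
    return beta

# Embedded in a Gibbs sweep, this step would update one element of the
# translog parameter vector while the other blocks are held fixed.
beta = 0.5
draws = []
for _ in range(5000):
    beta = mh_step_truncated_normal(beta, mean=-0.2, sd=0.3)
    draws.append(beta)
print(np.mean(draws))  # every draw respects the beta > 0 restriction
```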
Abstract:
Refinement in software engineering allows a specification to be developed in stages, with design decisions taken at earlier stages constraining the design at later stages. Refinement in complex data models is difficult due to the lack of a way of defining constraints that can be progressively maintained over increasingly detailed refinements. Category theory provides a way of stating wide-scale constraints. These constraints lead to a set of design guidelines, which maintain the wide-scale constraints under increasing detail. Previous methods of refinement are essentially local, and the proposed method interferes little with these local methods. The result is particularly applicable to semantic web applications, where ontologies provide systems of more or less abstract constraints on systems, which must be implemented and therefore refined by participating systems. With the approach of this paper, the concept of committing to an ontology carries much more force. (c) 2005 Elsevier B.V. All rights reserved.
Abstract:
Workflow systems have traditionally focused on so-called production processes, which are characterized by pre-definition, high volume, and repetitiveness. Recently, the deployment of workflow systems in non-traditional domains such as collaborative applications, e-learning and cross-organizational process integration has put forth new requirements for flexible and dynamic specification. However, this flexibility cannot be offered at the expense of control, a critical requirement of business processes. In this paper, we present a foundation set of constraints for flexible workflow specification. These constraints are intended to provide an appropriate balance between flexibility and control. The constraint specification framework is based on the concept of pockets of flexibility, which allows ad hoc changes and/or building of workflows for highly flexible processes. Basically, our approach is to provide the ability to execute on the basis of a partially specified model, where the full specification of the model is made at runtime and may be unique to each instance. The verification of dynamically built models is essential. Whereas ensuring that the model conforms to specified constraints does not pose great difficulty, ensuring that the constraint set itself does not carry conflicts and redundancy is an interesting and challenging problem. In this paper, we provide a discussion of both the static and dynamic verification aspects. We also briefly present Chameleon, a prototype workflow engine that implements these concepts. (c) 2004 Elsevier Ltd. All rights reserved.
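The following Python sketch is a toy illustration, not the Chameleon engine: it checks a dynamically built task sequence against two simple constraint types and looks for direct conflicts within the constraint set itself. The constraint types and task names are hypothetical.

```python
# Illustrative sketch: verifying a workflow instance built inside a
# pocket of flexibility against simple constraints, plus a basic
# conflict check on the constraint set.

def violates(sequence, constraints):
    """Check an ordered list of task names against 'order' and 'exclude' constraints."""
    position = {task: i for i, task in enumerate(sequence)}
    for kind, a, b in constraints:
        if kind == "order" and a in position and b in position:
            if position[a] > position[b]:
                return f"'{a}' must precede '{b}'"
        if kind == "exclude" and a in position and b in position:
            return f"'{a}' and '{b}' cannot both appear"
    return None

def conflicting(constraints):
    """Detect a direct conflict: mutual ordering between the same pair of tasks."""
    orders = {(a, b) for kind, a, b in constraints if kind == "order"}
    return [(a, b) for (a, b) in orders if (b, a) in orders]

constraints = [("order", "approve", "ship"), ("exclude", "discount", "surcharge")]
print(violates(["ship", "approve"], constraints))  # ordering violation reported
print(conflicting(constraints))                    # [] -> no direct conflicts
```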
Abstract:
The effects of dredging on the benthic communities in the Noosa River, a subtropical estuary in SE Queensland, Australia, were examined using a 'Beyond BACI' experimental design. Changes in the numbers and types of animals and characteristics of the sediments in response to dredging in the coarse sandy sediments near the mouth of the estuary were compared with those occurring naturally in two control regions. Samples were collected twice before and twice after the dredging operations, at multiple spatial scales ranging from metres to kilometres. Significant effects from the dredging were detected on the abundance of some polychaetes and bivalves and on two measures of diversity (numbers of polychaete families and total taxonomic richness). In addition, the dredging caused a significant increase in the diversity of sediment particle sizes found in the dredged region compared with elsewhere. Community composition in the dredged region was more similar to that in the control regions after dredging than before. Changes in the characteristics of the sedimentary environment as a result of the dredging appeared to cause the benthic communities of the dredged region to become more similar to those elsewhere in the estuary, so dredging in this system may have led to the loss or reduction in area of a specific type of habitat, with implications for overall patterns of biodiversity and ecosystem function. (c) 2006 Elsevier B.V. All rights reserved.
Abstract:
The worldwide trend towards deregulation of the electricity generation and transmission industries has led to dramatic changes in system operation and planning procedures. The optimum approach to transmission-expansion planning in a deregulated environment is an open problem, especially when the responsibilities of the organisations carrying out the planning work need to be addressed. To date, there is a consensus that the system operator and network manager perform the expansion planning work in a centralised way. However, with increasing input from the electricity market, the objectives, constraints and approaches to transmission planning should be carefully designed to ensure system reliability as well as meet market requirements. A market-oriented approach for transmission planning in a deregulated environment is proposed. Case studies using the IEEE 14-bus system and the Australian national electricity market grid are performed. In addition, the proposed method is compared with a traditional planning method to further verify its effectiveness.
Abstract:
Frequent itemset mining is well explored for various data types, and its computational complexity is well understood. There are methods to deal effectively with the computational problems involved. This paper presents a further approach to enhancing the performance of frequent itemset computation. We have made a series of observations that led us to invent data pre-processing methods such that the final step of the Partition algorithm, in which a combination of all local candidate sets must be processed, is executed on substantially smaller input data. The paper reports results from several experiments that confirm our general and formally presented observations.
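The sketch below illustrates the two-phase Partition idea referred to above (local candidate generation in each data partition followed by a global counting pass); it is a simplified toy rather than the authors' pre-processing method, and the itemset size is capped for brevity.

```python
# Sketch of the Partition-style two-phase idea: mine locally frequent
# itemsets per partition, union them into global candidates, then
# verify the candidates with one counting pass over the full database.

from itertools import combinations
from collections import Counter

def local_frequent(transactions, min_support, max_size=2):
    """Itemsets (up to max_size) meeting min_support within one partition."""
    counts = Counter()
    for t in transactions:
        for k in range(1, max_size + 1):
            for itemset in combinations(sorted(t), k):
                counts[itemset] += 1
    return {s for s, c in counts.items() if c >= min_support * len(transactions)}

def partition_mine(partitions, min_support):
    # Phase 1: union of local candidates (a superset of all global frequents).
    candidates = set().union(*(local_frequent(p, min_support) for p in partitions))
    # Phase 2: exact counts of the candidates over the whole database.
    database = [t for p in partitions for t in p]
    counts = Counter()
    for t in database:
        items = set(t)
        for c in candidates:
            if items.issuperset(c):
                counts[c] += 1
    return {c: n for c, n in counts.items() if n >= min_support * len(database)}

parts = [[{"a", "b"}, {"a", "c"}], [{"a", "b"}, {"b", "c"}]]
print(partition_mine(parts, min_support=0.5))
```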
Abstract:
This paper presents an approach for translating the legalese expression of business contracts into candidate business activities and processes while ensuring their compliance with the contract. The approach is a progressive refinement that uses a logic-based formalism to capture contract semantics and serves as an intermediate step in the transformation. It is of particular value for organisations that are considering moving towards new approaches to enterprise contract management and applying them to their future contracts.
Abstract:
Pac-Man is a well-known, real-time computer game that provides an interesting platform for research. We describe an initial approach to developing an artificial agent that replaces the human to play a simplified version of Pac-Man. The agent is specified as a simple finite state machine and ruleset, with parameters that control the probability of movement by the agent given the constraints of the maze at some instant of time. In contrast to previous approaches, the agent represents a dynamic strategy for playing Pac-Man, rather than a pre-programmed maze-solving method. The agent adaptively "learns" through the application of population-based incremental learning (PBIL) to adjust the agent's parameters. Experimental results are presented that give insight into some of the complexities of the game, as well as highlighting the limitations and difficulties of the representation of the agent.
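The following is a compact PBIL sketch under stated assumptions: the agent's ruleset parameters are binary-encoded, and play_game() is a hypothetical placeholder for evaluating a parameter vector by running the Pac-Man simulation, replaced here by a toy score so the sketch runs.

```python
# PBIL sketch: maintain a probability vector over bit positions, sample
# a population, and shift the vector towards the best individual.

import random

GENOME_LEN = 8
POP_SIZE = 20
LEARNING_RATE = 0.1

def play_game(bits):
    # Placeholder fitness: in the paper's setting this would be the game
    # score achieved by an agent configured with these parameters.
    return sum(bits)

def pbil(generations=50):
    probs = [0.5] * GENOME_LEN  # probability of a 1 at each position
    for _ in range(generations):
        population = [[int(random.random() < p) for p in probs]
                      for _ in range(POP_SIZE)]
        best = max(population, key=play_game)
        # Shift the probability vector towards the best individual.
        probs = [(1 - LEARNING_RATE) * p + LEARNING_RATE * b
                 for p, b in zip(probs, best)]
    return probs

print([round(p, 2) for p in pbil()])
```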
Abstract:
A program can be decomposed into a set of possible execution paths. These can be described in terms of primitives such as assignments, assumptions and coercions, and composition operators such as sequential composition and nondeterministic choice as well as finitely or infinitely iterated sequential composition. Some of these paths cannot possibly be followed (they are dead or infeasible), and they may or may not terminate. Decomposing programs into paths provides a foundation for analyzing properties of programs. Our motivation is timing constraint analysis of real-time programs, but the same techniques can be applied in other areas such as program testing. In general the set of execution paths for a program is infinite. For timing analysis we would like to decompose a program into a finite set of subpaths that covers all possible execution paths, in the sense that we only have to analyze the subpaths in order to determine suitable timing constraints that cover all execution paths.
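As a small illustration (not the paper's formalism), the sketch below represents program fragments with some of the primitives named above and enumerates the finite execution paths induced by nondeterministic choice; checking the assumptions along each path would identify the infeasible ones.

```python
# Toy representation of path primitives and composition operators,
# with enumeration of execution paths for the finite case.

from dataclasses import dataclass

@dataclass
class Assign:
    var: str
    expr: str

@dataclass
class Assume:
    cond: str

@dataclass
class Seq:
    left: object
    right: object

@dataclass
class Choice:
    left: object
    right: object

def paths(node):
    """Enumerate execution paths as lists of primitive steps."""
    if isinstance(node, (Assign, Assume)):
        return [[node]]
    if isinstance(node, Seq):
        return [p + q for p in paths(node.left) for q in paths(node.right)]
    if isinstance(node, Choice):
        return paths(node.left) + paths(node.right)
    raise TypeError(node)

prog = Seq(Assign("x", "0"),
           Choice(Seq(Assume("x > 0"), Assign("y", "1")),  # infeasible branch
                  Assign("y", "2")))
for p in paths(prog):
    print(p)
```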
Abstract:
Time, cost and quality achievements on large-scale construction projects are uncertain because of technological constraints, involvement of many stakeholders, long durations, large capital requirements and improper scope definitions. Projects that are exposed to such an uncertain environment can be managed effectively with the application of risk management throughout the project life cycle. Risk is by nature subjective. However, managing risk subjectively poses the danger of non-achievement of project goals. Moreover, risk analysis of the overall project also poses the danger of developing inappropriate responses. This article demonstrates a quantitative approach to construction risk management through the analytic hierarchy process (AHP) and decision tree analysis (DTA). The entire project is classified to form a few work packages. With the involvement of project stakeholders, risky work packages are identified. Once all the risk factors are identified, their effects are quantified by determining probability (using AHP) and severity (guess estimate). Various alternative responses are generated, listing the cost implications of mitigating the quantified risks. The expected monetary values are derived for each alternative in a decision tree framework, and subsequent probability analysis helps to make the right decision in managing risks. In this article, the entire methodology is explained through a case application of a cross-country petroleum pipeline project in India. The case study demonstrates the effectiveness of using AHP and DTA for project risk management.
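The following sketch shows the expected-monetary-value comparison at the heart of the decision tree step; the probabilities, impact costs and alternative names are invented for illustration and are not taken from the case study.

```python
# Expected monetary value (EMV) per response alternative: mitigation cost
# plus probability-weighted impact costs; the lowest EMV is preferred.

def expected_monetary_value(mitigation_cost, outcomes):
    """outcomes: list of (probability, impact_cost) pairs from the risk analysis."""
    return mitigation_cost + sum(p * cost for p, cost in outcomes)

alternatives = {
    # probabilities would come from the AHP analysis, impacts from estimates
    "reroute_pipeline": expected_monetary_value(2.0, [(0.1, 10.0), (0.9, 0.0)]),
    "extra_inspection": expected_monetary_value(0.5, [(0.3, 10.0), (0.7, 0.0)]),
    "accept_risk":      expected_monetary_value(0.0, [(0.5, 10.0), (0.5, 0.0)]),
}
best = min(alternatives, key=alternatives.get)
print(best, alternatives[best])  # lowest expected-cost alternative
```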
An integrated multiple criteria decision making approach for resource allocation in higher education
Abstract:
Resource allocation is one of the major decision problems arising in higher education. Resources must be allocated optimally so that the performance of universities can be improved. This paper applies an integrated multiple criteria decision making approach to the resource allocation problem. In the approach, the Analytic Hierarchy Process (AHP) is first used to determine the priority, or relative importance, of proposed projects with respect to the goals of the universities. Then, a Goal Programming (GP) model incorporating AHP priority, system, and resource constraints is formulated for selecting the best set of projects without exceeding the limited available resources. The projects include 'hardware' (tangible university infrastructure) and 'software' (intangible effects that can be beneficial to the university, its members, and its students). In this paper, two commercial packages are used: Expert Choice for determining the AHP priority ranking of the projects, and LINDO for solving the GP model. Copyright © 2007 Inderscience Enterprises Ltd.
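As a simplified stand-in for the selection step (not the paper's full GP formulation), the sketch below chooses the subset of projects with the highest total AHP priority within a single resource budget; the project names, priorities and budget are hypothetical.

```python
# Brute-force project selection: maximise total AHP priority subject to
# one resource constraint (a simplified surrogate for the GP model).

from itertools import combinations

projects = {          # name: (AHP priority, required resources)
    "new_lab":        (0.35, 60),
    "e_learning":     (0.25, 30),
    "library_update": (0.20, 25),
    "staff_training": (0.20, 20),
}
BUDGET = 75

best_set, best_score = (), 0.0
for r in range(1, len(projects) + 1):
    for subset in combinations(projects, r):
        cost = sum(projects[p][1] for p in subset)
        score = sum(projects[p][0] for p in subset)
        if cost <= BUDGET and score > best_score:
            best_set, best_score = subset, score
print(best_set, round(best_score, 2))
```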
Abstract:
Purpose – This paper studies a production-planning problem for printed circuit board (PCB) assembly. A PCB assembly company may have a number of assembly lines for production of several product types in large volume. Design/methodology/approach – Pure integer linear programming models are formulated for assigning the product types to assembly lines, which is the line assignment problem, with the objective of minimizing the total production cost. In this approach, unrealistic assignments, a problem encountered by previous researchers, are avoided by incorporating several constraints into the model. A genetic algorithm is developed to solve the line assignment problem. Findings – The procedure for applying the genetic algorithm to the problem and a numerical example illustrating the models are provided. The algorithm is also shown to be effective and efficient in dealing with the problem. Originality/value – This paper studies the line assignment problem arising in a PCB manufacturing company in which the production volume is high.
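The following is a generic genetic-algorithm sketch for a line-assignment-style problem; the cost matrix, operators and parameter values are illustrative and do not reproduce the paper's model.

```python
# GA sketch: a chromosome assigns each product type to an assembly line,
# and fitness is the total (hypothetical) production cost to minimise.

import random

COST = [            # COST[product][line]: cost of producing product on line
    [4, 6, 5],
    [7, 3, 8],
    [5, 9, 4],
    [6, 5, 7],
]
N_PRODUCTS, N_LINES = len(COST), len(COST[0])

def total_cost(chrom):
    return sum(COST[p][line] for p, line in enumerate(chrom))

def evolve(pop_size=30, generations=100, mutation_rate=0.1):
    pop = [[random.randrange(N_LINES) for _ in range(N_PRODUCTS)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=total_cost)
        parents = pop[: pop_size // 2]             # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N_PRODUCTS)  # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < mutation_rate:
                child[random.randrange(N_PRODUCTS)] = random.randrange(N_LINES)
            children.append(child)
        pop = parents + children
    return min(pop, key=total_cost)

best = evolve()
print(best, total_cost(best))
```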