748 results for Normative process
at Queensland University of Technology - ePrints Archive
Abstract:
Process compliance measurement is receiving increasing attention in companies due to stricter legal requirements and market pressure for operational excellence. Yet, metrics to quantify process compliance have only recently been defined. A major criticism is that the existing measures appear unintuitive. In this paper, we trace this problem back to a more foundational question: which notion of behavioural equivalence is appropriate for discussing compliance? We present a quantification approach based on behavioural profiles, a process abstraction mechanism. Behavioural profiles can be regarded as weaker than existing equivalence notions such as trace equivalence, and they can be calculated efficiently. As a validation, we present an implementation that measures the compliance of logs against a normative process model. This implementation is being evaluated in a case study with an international service provider.
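As a rough, hedged illustration of the behavioural-profile idea (not the paper's implementation), the following Python sketch derives pairwise ordering relations from traces assumed to represent the normative model's behaviour and reports the share of activity pairs in a log trace whose relation agrees with the model; the activity names and the simple compliance degree are illustrative assumptions.

```python
from itertools import combinations

def weak_order(traces):
    """Collect pairs (a, b) such that a occurs before b in some trace."""
    rel = set()
    for trace in traces:
        for i, a in enumerate(trace):
            for b in trace[i + 1:]:
                rel.add((a, b))
    return rel

def profile(traces):
    """Derive a simple behavioural profile: strict order, reverse order, interleaving, exclusiveness."""
    wo = weak_order(traces)
    acts = {a for t in traces for a in t}
    prof = {}
    for a, b in combinations(sorted(acts), 2):
        if (a, b) in wo and (b, a) not in wo:
            prof[(a, b)] = "strict"
        elif (b, a) in wo and (a, b) not in wo:
            prof[(a, b)] = "reverse"
        elif (a, b) in wo and (b, a) in wo:
            prof[(a, b)] = "interleaving"
        else:
            prof[(a, b)] = "exclusive"
    return prof

def compliance_degree(model_traces, log_trace):
    """Share of activity pairs in the log trace whose relation matches the model profile."""
    model = profile(model_traces)
    log = profile([log_trace])
    shared = [p for p in log if p in model]
    if not shared:
        return 1.0
    # an interleaving relation in the model permits either observed order
    ok = sum(1 for p in shared
             if model[p] == "interleaving" or model[p] == log[p])
    return ok / len(shared)

# Hypothetical normative behaviour and an observed log trace
model_traces = [["register", "check", "approve"], ["register", "check", "reject"]]
print(compliance_degree(model_traces, ["register", "approve", "check"]))  # < 1.0: order violated
```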
Abstract:
Analysis of behavioural consistency is an important aspect of software engineering. In process and service management, consistency verification of behavioural models has manifold applications. For instance, a business process model used as a system specification and a corresponding workflow model used as an implementation have to be consistent. Another example is analysing to what degree a process log of executed business operations is consistent with the corresponding normative process model. Typically, existing notions of behaviour equivalence, such as bisimulation and trace equivalence, are applied as consistency notions. However, these notions are exponential to compute and yield only a Boolean result. In many cases, a quantification of behavioural deviation is needed, along with concepts to isolate the source of deviation. In this article, we propose causal behavioural profiles as the basis for a consistency notion. These profiles capture essential behavioural information, such as order, exclusiveness, and causality between pairs of activities of a process model. Consistency based on these profiles is weaker than trace equivalence, but can be computed efficiently for a broad class of models. We introduce techniques for the computation of causal behavioural profiles using structural decomposition for sound free-choice workflow systems, provided that unstructured net fragments are acyclic or can be traced back to S- or T-nets. We also elaborate on the findings of applying our technique to three industry model collections.
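To make the notion of a causal behavioural profile more tangible, here is a minimal sketch under simplifying assumptions: it derives order and co-occurrence relations from enumerated trace sets (the article instead computes profiles structurally on sound free-choice workflow nets) and returns a simple consistency degree between a specification and an implementation. The relation symbols and traces are illustrative.

```python
from itertools import combinations

def causal_profile(traces):
    """Order, exclusiveness/interleaving, and co-occurrence over pairs of activities."""
    acts = sorted({a for t in traces for a in t})
    before = {(a, b) for t in traces for i, a in enumerate(t) for b in t[i + 1:]}
    prof = {}
    for a, b in combinations(acts, 2):
        if (a, b) in before and (b, a) not in before:
            order = "->"
        elif (b, a) in before and (a, b) not in before:
            order = "<-"
        elif (a, b) in before:
            order = "||"          # interleaving
        else:
            order = "+"           # exclusive
        causal = all((a in t) == (b in t) for t in traces)  # co-occurrence
        prof[(a, b)] = (order, causal)
    return prof

def consistency_degree(spec_traces, impl_traces):
    """Fraction of shared activity pairs with identical relations in both profiles."""
    spec, impl = causal_profile(spec_traces), causal_profile(impl_traces)
    shared = spec.keys() & impl.keys()
    if not shared:
        return 1.0
    return sum(spec[p] == impl[p] for p in shared) / len(shared)

# Hypothetical specification vs. implementation behaviour
spec = [["A", "B", "C"], ["A", "C"]]
impl = [["A", "B", "C"]]
print(consistency_degree(spec, impl))  # 1.0 only if all pairwise relations agree
```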
Abstract:
Norms regulate the behaviour of their subjects and define what is legal and what is illegal. Norms typically describe the conditions under which they are applicable and the normative effects that result from their application. Process models, on the other hand, specify how a business operation or service is to be carried out to achieve a desired outcome. Norms can have a significant impact on how business operations are conducted, and they can apply to the whole or part of a business process. For example, they may impose conditions on different aspects of a process, e.g., that tasks be performed in a specific sequence (control-flow), at a specific time or within a certain time frame (temporal aspect), or by specific people (resources). We propose a framework that provides the formal semantics of the normative requirements for determining whether a business process complies with a normative document, where a normative document can be understood in a very broad sense, ranging from internal policies, to best practice policies, to statutory acts. We also present a classification of normative requirements based on the notion of different types of obligations and the effects of violating these obligations.
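Purely as an illustration of how such a classification could be encoded, the sketch below models a few obligation types and the effects of violating them as Python data types; the type and field names follow the general compliance literature rather than the paper's exact taxonomy.

```python
from dataclasses import dataclass
from enum import Enum, auto

class ObligationType(Enum):
    ACHIEVEMENT = auto()   # must hold at least once before the deadline
    MAINTENANCE = auto()   # must hold at every point between trigger and deadline
    PROHIBITION = auto()   # must never hold while in force

class ViolationEffect(Enum):
    NON_COMPENSABLE = auto()   # violation makes the process non-compliant
    COMPENSABLE = auto()       # violation can be repaired by a compensation obligation

@dataclass
class Norm:
    name: str
    obligation: ObligationType
    condition: str        # task/state that triggers the norm, e.g. "order_received"
    target: str           # task/state the norm constrains, e.g. "send_invoice"
    deadline: str | None  # task/state by which an achievement obligation must be met
    effect: ViolationEffect
    compensation: str | None = None   # obligation that repairs a compensable violation

# Hypothetical norm: once an order is received, an invoice must be sent before shipping;
# a late invoice can be compensated by sending a corrected invoice.
invoice_norm = Norm(
    name="invoice-before-shipping",
    obligation=ObligationType.ACHIEVEMENT,
    condition="order_received",
    target="send_invoice",
    deadline="ship_goods",
    effect=ViolationEffect.COMPENSABLE,
    compensation="send_corrected_invoice",
)
```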
Abstract:
As an international norm, the Responsibility to Protect (R2P) has gained substantial influence and institutional presence—and created no small controversy—in the ten years since its first conceptualisation. Conversely, the Protection of Civilians in Armed Conflict (PoC) has a longer pedigree and enjoys a less contested reputation. Yet UN Security Council action in Libya in 2011 has thrown into sharp relief the relationship between the two. UN Security Council Resolutions 1970 and 1973 follow exactly the process envisaged by R2P in response to imminent atrocity crimes, yet the operative paragraphs of the resolutions themselves invoke only PoC. This article argues that, while the agendas of PoC and R2P converge with respect to Security Council action in cases like Libya, outside this narrow context it is important to keep the two norms distinct. Peacekeepers, humanitarian actors, international lawyers, individual states and regional organisations are required to act differently with respect to the separate agendas and contexts covered by R2P and PoC. While overlap between the two does occur in highly visible cases like Libya, neither R2P nor PoC collapses normatively, institutionally or operationally into the other.
Abstract:
Purpose: Young novice drivers experience significantly greater risk of being injured or killed in car crashes than older, more experienced drivers. This research utilised a qualitative approach guided by the framework of Akers' social learning theory. It explored young novice drivers' perspectives on risky driving, including rewards and punishments expected from and administered by parents, friends, and police; imitation of parents' and friends' driving; and the advantages and disadvantages of risky driving. Methods: Twenty-one young drivers (12 females, 9 males) aged 16–25 years (M = 17.71 years, SD = 2.15) with a Learner (n = 11) or Provisional (n = 10) driver licence participated in individual or small group interviews. Findings and conclusions: Content analysis supported four themes: (1) rewards and (2) punishments for risky driving, and the influence of (3) parents and (4) friends. The young novice drivers differed in their vulnerability to the negative influences of friends and parents, with some novices reporting that they were able to resist risky normative influences whilst others felt they could not. The authority of the police as enforcers of road rules was either accepted and respected or seen as being used to persecute young novices. These findings suggest that road safety interventions should consider the normative influence of parents and friends on the risky and safe behaviour of young novices. Police were also seen as influential on behaviour. Future research should explore the complicated relationship between parents, friends, the police, young novices, and their risky driving behaviour.
Abstract:
Drink walking, that is, walking in a public place while intoxicated, is associated with an increased risk of injury and fatality. Young people and males are especially prone to engaging in this behaviour, yet little is known about the factors associated with individuals' decisions to drink walk. The present research explores the role of different normative influences (friendship group norm, parent group norm, university peer group norm) and perceived risk, within an extended theory of planned behaviour (TPB) framework, in predicting young people's self-reported drink walking intentions. One hundred and eighteen young people (aged 17–25 years) completed a survey including sociodemographic measures and extended TPB measures related to drink walking. Overall, the extended TPB explained 72.8% of the variance in young people's intentions to drink walk in the next six months, with attitude, perceived behavioural control, friendship group norm, and gender (male) emerging as significant predictors. Males, compared with females, had higher intentions to drink walk and lower perceptions of risk regarding drink walking. Together, these findings provide a clearer indication of the salient normative influences and gender differences in young pedestrians' decisions to walk while intoxicated. Such findings can be used to inform future interventions designed to reduce injuries and fatalities associated with drink walking.
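For readers unfamiliar with this kind of analysis, a minimal sketch of an extended-TPB regression follows, using simulated data and statsmodels; the reported "variance explained" corresponds to the regression's R². All variable names, coefficients, and data are assumptions made for illustration, not the study's data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical survey data: TPB and extended-TPB predictors of drink-walking intention
rng = np.random.default_rng(0)
n = 118
df = pd.DataFrame({
    "attitude": rng.normal(4, 1.5, n),
    "subjective_norm": rng.normal(3, 1.5, n),
    "pbc": rng.normal(4, 1.5, n),                 # perceived behavioural control
    "friend_group_norm": rng.normal(3, 1.5, n),
    "perceived_risk": rng.normal(4, 1.5, n),
    "male": rng.integers(0, 2, n),
})
# Simulated intention scores so the example runs end to end
df["intention"] = (0.5 * df["attitude"] + 0.4 * df["pbc"]
                   + 0.3 * df["friend_group_norm"] + 0.5 * df["male"]
                   + rng.normal(0, 1, n))

X = sm.add_constant(df.drop(columns="intention"))
model = sm.OLS(df["intention"], X).fit()
print(model.rsquared)        # analogue of the reported "variance explained"
print(model.summary())       # which predictors emerge as significant
```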
Abstract:
Criminal law scholarship is enjoying a renaissance in normative theory, evident in a growing list of publications from leading scholars that attempt to elucidate a set of principles on which criminalisation and criminal law might — indeed should — be based. This development has been less marked in Australia, where a stream of criminologically influenced criminal law scholarship, teaching and practice has emerged over nearly three decades. There are certain tensions between this predominantly contextual, process-oriented and criminological tradition that has emerged in Australia, characterised by a critical approach to the search for ‘general principles’ of the criminal law, and the more recent revival of interest in developing a set of principles on which a ‘normative theory of criminal law’ might be founded. Aspects of this tension will be detailed through examination of recent examples of criminalisation in New South Wales that are broadly representative of trends across all Australian jurisdictions. The article will then reflect on the links between these particular features of criminalisation and attempts to develop a ‘normative theory’ of criminalisation.
Abstract:
The operation of the law rests on the selection of an account of the facts. Whether this involves prediction or postdiction, it is not possible to achieve certainty. Any attempt to model the operation of the law completely will therefore raise questions of how to model the process of proof. In the selection of a model a crucial question will be whether the model is to be used normatively or descriptively. Focussing on postdiction, this paper presents and contrasts the mathematical model with the story model. The former carries the normative stamp of scientific approval, whereas the latter has been developed by experimental psychologists to describe how humans reason. Neil Cohen's attempt to use a mathematical model descriptively provides an illustration of the dangers in not clearly setting this parameter of the modelling process. It should be kept in mind that the labels 'normative' and 'descriptive' are not eternal. The mathematical model has its normative limits, beyond which we may need to critically assess models with descriptive origins.
Abstract:
Purpose – The purpose of this paper is to foster a common understanding of business process management (BPM) by proposing a set of ten principles that characterize BPM as a research domain and guide its successful use in organizational practice. Design/methodology/approach – The identification and discussion of the principles reflects our viewpoint, which was informed by extant literature and focus groups, including 20 BPM experts from academia and practice. Findings – We identify ten principles which represent a set of capabilities essential for mastering contemporary and future challenges in BPM. Their antonyms signify potential roadblocks and bad practices in BPM. We also identify a set of open research questions that can guide future BPM research. Research limitations/implications – Our findings suggest several areas of research regarding each of the identified principles of good BPM. Also, the principles themselves should be systematically and empirically examined in future studies. Practical implications – Our findings allow practitioners to comprehensively scope their BPM initiatives and provide general guidance for BPM implementation. Moreover, the principles may also serve to tackle contemporary issues in other management areas. Originality/value – This is the first paper that distills principles of BPM in the sense of both good and bad practice recommendations. The value of the principles lies in providing normative advice to practitioners as well as in identifying open research areas for academia, thereby extending the reach and richness of BPM beyond its traditional frontiers.
Abstract:
This paper evaluates the suitability of sequence classification techniques for analyzing deviant business process executions based on event logs. Deviant process executions are those that deviate in a negative or positive way with respect to normative or desirable outcomes, such as non-compliant executions or executions that undershoot or exceed performance targets. We evaluate a range of feature types and classification methods in terms of their ability to accurately discriminate between normal and deviant executions both when deviances are infrequent (unbalanced) and when deviances are as frequent as normal executions (balanced). We also analyze the ability of the discovered rules to explain potential causes and contributing factors of observed deviances. The evaluation results show that feature types extracted using pattern mining techniques only slightly outperform those based on individual activity frequency. The results also suggest that more complex feature types ought to be explored to achieve higher levels of accuracy.
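A minimal, hedged illustration of the simplest feature type discussed above (individual activity frequency) is given below, assuming scikit-learn and hypothetical labelled traces; the paper evaluates richer pattern-based features and a range of classifiers.

```python
from collections import Counter
from sklearn.feature_extraction import DictVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labelled traces: 1 = deviant execution, 0 = normal execution
traces = [
    (["register", "check", "approve", "archive"], 0),
    (["register", "check", "check", "reject", "archive"], 1),
    (["register", "approve", "archive"], 1),
    (["register", "check", "approve", "notify", "archive"], 0),
]

# Individual-activity-frequency features: one count per activity label
X = [dict(Counter(trace)) for trace, _ in traces]
y = [label for _, label in traces]

clf = make_pipeline(DictVectorizer(sparse=False), LogisticRegression())
clf.fit(X, y)

# Classify a new execution and inspect which activity counts drive the decision
new_trace = dict(Counter(["register", "check", "reject", "archive"]))
print(clf.predict([new_trace]))
print(dict(zip(clf[0].get_feature_names_out(), clf[1].coef_[0])))
```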
Abstract:
By definition, regulatory rules (in a legal context called norms) are intended to achieve specific behaviour from business processes, and may be relevant to the whole or part of a business process. They can impose conditions on different aspects of process models, e.g., control-flow, data and resources. Based on their rule sets, norms can be classified into various classes and sub-classes according to their effects. This paper presents an abstract framework consisting of a list of norms and a generic compliance checking approach based on the idea of (possible) executions of processes. The proposed framework is independent of any existing formalism, and provides a conceptually rich and exhaustive ontology and semantics of norms needed for business process compliance checking. The possible uses of the proposed framework include comparing different compliance management frameworks (CMFs).
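The compliance-checking idea of testing each possible execution of a process against the norms in force can be sketched roughly as follows; the trace representation and the single achievement-obligation check are illustrative assumptions, not the paper's formal semantics.

```python
def achievement_obligation_holds(trace, trigger, target, deadline):
    """After the trigger occurs, the target must occur before the deadline task."""
    if trigger not in trace:
        return True                         # obligation never comes into force
    start = trace.index(trigger)
    for task in trace[start + 1:]:
        if task == target:
            return True                     # fulfilled before the deadline
        if task == deadline:
            return False                    # deadline reached first: violation
    return False                            # process ended without fulfilling it

def process_complies(possible_executions, norms):
    """Full compliance: every possible execution satisfies every norm."""
    return all(achievement_obligation_holds(trace, **norm)
               for trace in possible_executions
               for norm in norms)

# Hypothetical possible executions of a process model and one norm
executions = [
    ["order_received", "send_invoice", "ship_goods"],
    ["order_received", "ship_goods", "send_invoice"],
]
norms = [{"trigger": "order_received", "target": "send_invoice", "deadline": "ship_goods"}]
print(process_complies(executions, norms))   # False: the second execution violates the norm
```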
Abstract:
This research contributes a formal framework to evaluate whether existing CMFs can model and reason about various types of normative requirements. The framework can be used to determine the level of coverage of concepts provided by CMFs, establish mappings between CMF languages and the semantics of the normative concepts, and evaluate the suitability of a CMF for issuing a certification of compliance. The developed framework is independent of any specific formalism and has been formally defined and validated through examples of such mappings of CMFs.