Abstract:
This chapter is concerned with the complexity and difficulty of truth telling as it is played out in two graphic novels: Stitches: A Memoir (Small, 2009) and Why We Broke Up (Handler & Kalman, 2011). These texts establish a link between creative imagination and pain as the central protagonists come to see the therapeutic value of literature and film in helping them understand the complex emotional worlds they inhabit and the bitter truths about love and relationships. The discussion examines how these texts privilege a particular kind of independent subjectivity through aesthetic creation and appropriation. It also considers how speaking and silence are co-present elements in gender relations and each has its part to play in the double process of suffering and healing.
Abstract:
This paper seeks to re-conceptualize the research supervision relationship. The literature has tended to view doctoral study in four ways: (i) as an exercise in self-management; (ii) as a research experience; (iii) as training for research; or (iv) as an instance of student-centred learning. Although each of these approaches has its merits, each also suffers from conceptual weaknesses. This paper seeks to harness the merits — and minimize the disadvantages — by re-conceptualizing doctoral research as a ‘writing journey’. The paper utilizes the insights of new rhetoric in linguistic theory to defend a writing-centred conception of supervised research and offers some practical strategies on how it might be put into effect.
Abstract:
Over the last 30 years, numerous research groups have attempted to provide mathematical descriptions of the skin wound healing process. The development of theoretical models of the interlinked processes that underlie the healing mechanism has yielded considerable insight into aspects of this critical phenomenon that remain difficult to investigate empirically. In particular, the mathematical modeling of angiogenesis, i.e., capillary sprout growth, has offered new paradigms for the understanding of this highly complex and crucial step in the healing pathway. With the recent advances in imaging and cell tracking, the time is now ripe for an appraisal of the utility and importance of mathematical modeling in wound healing angiogenesis research. The purpose of this review is to pedagogically elucidate the conceptual principles that have underpinned the development of mathematical descriptions of wound healing angiogenesis, specifically those that have utilized a continuum reaction-transport framework, and highlight the contribution that such models have made toward the advancement of research in this field. We aim to draw attention to the common assumptions made when developing models of this nature, thereby bringing into focus the advantages and limitations of this approach. A deeper integration of mathematical modeling techniques into the practice of wound healing angiogenesis research promises new perspectives for advancing our knowledge in this area. To this end we detail several open problems related to the understanding of wound healing angiogenesis, and outline how these issues could be addressed through closer cross-disciplinary collaboration.
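The continuum reaction-transport framework discussed above can be illustrated with a minimal one-dimensional Fisher-type model of capillary tip density, in which the density diffuses through the wound space and proliferates logistically. This is a generic sketch with arbitrary demonstration parameters, not a model taken from any specific wound-healing study:

```python
import numpy as np

# Illustrative 1-D reaction-transport (Fisher-type) model of capillary
# tip density n(x, t):  dn/dt = D * d2n/dx2 + r * n * (1 - n).
# D, r, dx, dt are arbitrary demonstration values, not parameters
# fitted to wound-healing data.
def step(n, D=0.1, r=1.0, dx=0.1, dt=0.01):
    # discrete Laplacian (periodic boundary for simplicity)
    lap = (np.roll(n, 1) - 2 * n + np.roll(n, -1)) / dx**2
    # explicit Euler update: transport (diffusion) + reaction (logistic growth)
    return n + dt * (D * lap + r * n * (1 - n))

n = np.zeros(100)
n[:10] = 1.0          # vascularized tissue at the wound edge
for _ in range(200):
    n = step(n)       # a travelling front of capillary density advances
```

Each call to `step` advances the density by one explicit-Euler time step; the choice dt·D/dx² = 0.1 keeps the diffusion scheme stable, so the density stays within [0, 1] as the front invades the wound space.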
Abstract:
Background: Recently there have been efforts to derive safe, efficient processes to rule out acute coronary syndrome (ACS) in emergency department (ED) chest pain patients. We aimed to prospectively validate an ACS assessment pathway (the 2-Hour Accelerated Diagnostic Protocol to Assess Patients with Chest Pain Symptoms Using Contemporary Troponins as the Only Biomarker (ADAPT) pathway) under pragmatic ED working conditions. Methods: This prospective cohort study included patients with atraumatic chest pain in whom ACS was suspected but who did not have clear evidence of ischaemia on ECG. Thrombolysis in myocardial infarction (TIMI) score and troponin (TnI Ultra) were measured at ED presentation, 2 h later and according to current national recommendations. The primary outcome of interest was the occurrence of major adverse cardiac events (MACE) including prevalent myocardial infarction (MI) at 30 days in the group who had a TIMI score of 0 and had presentation and 2-h TnI assays <99th percentile. Results: Eight hundred and forty patients were studied of whom 177 (21%) had a TIMI score of 0. There were no MI, MACE or revascularization in the per protocol and intention-to-treat 2-h troponin groups (0%, 95% confidence interval (CI) 0% to 4.5% and 0%, 95% CI 0% to 3.8%, respectively). The negative predictive value (NPV) was 100% (95% CI 95.5% to 100%) and 100% (95% CI 96.2% to 100%), respectively. Conclusions: A 2-h accelerated rule-out process for ED chest pain patients using electrocardiography, a TIMI score of 0 and a contemporary sensitive troponin assay accurately identifies a group at very low risk of 30-day MI or MACE.
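The accelerated rule-out logic described above (no ischaemic ECG, TIMI score of 0, and both troponin assays below the 99th percentile) can be sketched as a simple predicate. The function name, argument names, and the default cut-off value are illustrative assumptions, not values taken from the study:

```python
def adapt_low_risk(timi_score, tn0_ng_l, tn2_ng_l,
                   troponin_99th=26.0, ischaemic_ecg=False):
    """Sketch of the 2-h accelerated rule-out criteria.

    tn0_ng_l / tn2_ng_l: troponin at presentation and at 2 h.
    troponin_99th: assay-specific 99th-percentile cut-off; the
    default of 26.0 ng/L is illustrative, not from the paper.
    """
    return (not ischaemic_ecg
            and timi_score == 0
            and tn0_ng_l < troponin_99th
            and tn2_ng_l < troponin_99th)
```

A patient with TIMI 0 and both assays below the cut-off would be classified low risk; any elevated assay, nonzero TIMI score, or ischaemic ECG rules the patient into further work-up.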
Abstract:
This paper proposes the Clinical Pathway Analysis Method (CPAM) approach that enables the extraction of valuable organisational and medical information on past clinical pathway executions from the event logs of healthcare information systems. The method deals with the complexity of real-world clinical pathways by introducing a perspective-based segmentation of the date-stamped event log. CPAM enables the clinical pathway analyst to effectively and efficiently acquire a profound insight into the clinical pathways. By comparing the specific medical conditions of patients with the factors used for characterising the different clinical pathway variants, the medical expert can identify the best therapeutic option. Process mining-based analytics enables the acquisition of valuable insights into clinical pathways, based on the complete audit traces of previous clinical pathway instances. Additionally, the methodology is suited to assess guideline compliance and analyse adverse events. Finally, the methodology provides support for eliciting tacit knowledge and providing treatment selection assistance.
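The notion of clinical pathway variants that CPAM characterises can be illustrated with a minimal sketch that groups a date-stamped event log by case and counts identical activity sequences. The tuple format and function name are hypothetical illustrations, not the CPAM interface:

```python
from collections import Counter

def pathway_variants(event_log):
    """Group a date-stamped event log into pathway variants.

    event_log: iterable of (case_id, activity, timestamp) tuples.
    Returns a Counter mapping each chronologically ordered activity
    sequence (a variant) to the number of cases following it.
    """
    traces = {}
    for case, activity, ts in event_log:
        traces.setdefault(case, []).append((ts, activity))
    # sort each case's events by timestamp, keep only the activities
    return Counter(tuple(a for _, a in sorted(t)) for t in traces.values())
```

Comparing patients' medical conditions against the most frequent variants produced by such a grouping is the kind of analysis the method supports at scale.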
Abstract:
Molecular phylogenetic studies of homologous sequences of nucleotides often assume that the underlying evolutionary process was globally stationary, reversible, and homogeneous (SRH), and that a model of evolution with one or more site-specific and time-reversible rate matrices (e.g., the GTR rate matrix) is enough to accurately model the evolution of data over the whole tree. However, an increasing body of data suggests that evolution under these conditions is an exception, rather than the norm. To address this issue, several non-SRH models of molecular evolution have been proposed, but they either ignore heterogeneity in the substitution process across sites (HAS) or assume it can be modeled accurately using a predefined statistical distribution. As an alternative to these models of evolution, we introduce a family of mixture models that approximate HAS without the assumption of an underlying predefined statistical distribution. This family of mixture models is combined with non-SRH models of evolution that account for heterogeneity in the substitution process across lineages (HAL). We also present two algorithms for searching model space and identifying an optimal model of evolution that is less likely to over- or underparameterize the data. The performance of the two new algorithms was evaluated using alignments of nucleotides with 10 000 sites simulated under complex non-SRH conditions on a 25-tipped tree. The algorithms were found to be very successful, identifying the correct HAL model with a 75% success rate (the average success rate for assigning rate matrices to the tree's 48 edges was 99.25%) and, for the correct HAL model, identifying the correct HAS model with a 98% success rate. Finally, parameter estimates obtained under the correct HAL-HAS model were found to be accurate and precise.
The merits of our new algorithms were illustrated with an analysis of 42 337 second codon sites extracted from a concatenation of 106 alignments of orthologous genes encoded by the nuclear genomes of Saccharomyces cerevisiae, S. paradoxus, S. mikatae, S. kudriavzevii, S. castellii, S. kluyveri, S. bayanus, and Candida albicans. Our results show that second codon sites in the ancestral genome of these species contained 49.1% invariable sites, 39.6% variable sites belonging to one rate category (V1), and 11.3% variable sites belonging to a second rate category (V2). The ancestral nucleotide content was found to differ markedly across these three sets of sites, and the evolutionary processes operating at the variable sites were found to be non-SRH and best modeled by a combination of eight edge-specific rate matrices (four for V1 and four for V2). The number of substitutions per site at the variable sites also differed markedly, with sites belonging to V1 evolving slower than those belonging to V2 along the lineages separating the seven species of Saccharomyces. Finally, sites belonging to V1 appeared to have ceased evolving along the lineages separating S. cerevisiae, S. paradoxus, S. mikatae, S. kudriavzevii, and S. bayanus, implying that they might have become so selectively constrained that they could be considered invariable sites in these species.
Abstract:
A facile and up-scalable wet-mechanochemical process is designed for fabricating ultra-fine SnO2 nanoparticles anchored on graphene networks for use as anode materials for sodium-ion batteries (SIBs). A hierarchical structure of the SnO2@graphene composite is obtained from the process. The resultant rechargeable SIBs achieved high rate capability and good cycling stability.
Abstract:
Since their inception in 1962, Petri nets have been used in a wide variety of application domains. Although Petri nets are graphical and easy to understand, they have formal semantics and allow for analysis techniques ranging from model checking and structural analysis to process mining and performance analysis. Over time Petri nets emerged as a solid foundation for Business Process Management (BPM) research. The BPM discipline develops methods, techniques, and tools to support the design, enactment, management, and analysis of operational business processes. Mainstream business process modeling notations and workflow management systems are using token-based semantics borrowed from Petri nets. Moreover, state-of-the-art BPM analysis techniques are using Petri nets as an internal representation. Users of BPM methods and tools are often not aware of this. This paper aims to unveil the seminal role of Petri nets in BPM.
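The token-based semantics that BPM notations borrow from Petri nets can be shown in a few lines: a transition is enabled when each of its input places holds at least one token, and firing consumes one token per input place and produces one per output place. This is a minimal sketch, not an implementation from any BPM tool:

```python
class PetriNet:
    """Minimal Petri net with token-based firing semantics."""

    def __init__(self, transitions, marking):
        # transitions: name -> (list of input places, list of output places)
        self.transitions = transitions
        self.marking = dict(marking)  # place -> token count

    def enabled(self, t):
        # a transition is enabled iff every input place holds a token
        inputs, _ = self.transitions[t]
        return all(self.marking.get(p, 0) >= 1 for p in inputs)

    def fire(self, t):
        # firing consumes one token per input place, produces one per output
        if not self.enabled(t):
            raise ValueError(f"transition {t!r} is not enabled")
        inputs, outputs = self.transitions[t]
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1

# A two-step sequential workflow: start -> register -> p1 -> approve -> end
net = PetriNet(
    {"register": (["start"], ["p1"]), "approve": (["p1"], ["end"])},
    {"start": 1},
)
net.fire("register")
net.fire("approve")   # the single token ends up in place "end"
```

The same firing rule, scaled up, is what workflow engines evaluate when they decide which activities of a business process are currently executable.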
Authorisation management in business process environments: An authorisation model and a policy model
Abstract:
This thesis provides two main contributions. The first one is BP-TRBAC, a unified authorisation model that can support legacy systems as well as business process systems. BP-TRBAC supports specific features that are required by business process environments. BP-TRBAC is designed to be used as an independent enterprise-wide authorisation model, rather than having it as part of the workflow system. It is designed to be the main authorisation model for an organisation. The second contribution is BP-XACML, an authorisation policy language that is designed to represent BPM authorisation policies for business processes. The contribution also includes a policy model for BP-XACML. Using BP-TRBAC as an authorisation model together with BP-XACML as an authorisation policy language will allow an organisation to manage and control authorisation requests from workflow systems and other legacy systems.
Abstract:
Graphene films were produced by chemical vapor deposition (CVD) of pyridine on copper substrates. Pyridine-CVD is expected to lead to doped graphene by the insertion of nitrogen atoms in the growing sp2 carbon lattice, possibly improving the properties of graphene as a transparent conductive film. We here report on the influence that the CVD parameters (i.e., temperature and gas flow) have on the morphology, transmittance, and electrical conductivity of the graphene films grown with pyridine. A temperature range between 930 and 1070 °C was explored and the results were compared to those of pristine graphene grown by ethanol-CVD under the same process conditions. The films were characterized by atomic force microscopy, Raman and X-ray photoemission spectroscopy. The optical transmittance and electrical conductivity of the films were measured to evaluate their performance as transparent conductive electrodes. Graphene films grown by pyridine reached an electrical conductivity of 14.3 × 10⁵ S/m. Such a high conductivity seems to be associated with the electronic doping induced by substitutional nitrogen atoms. In particular, at 930 °C the nitrogen/carbon ratio of pyridine-grown graphene reaches 3%, and its electrical conductivity is 40% higher than that of pristine graphene grown from ethanol-CVD.
Abstract:
Process view technology is attracting increasing attention in modern business process management, as it enables the customisation of business process representation. This capability helps improve privacy protection, authority control, flexible display, etc., in business process modelling. One approach to generating process views is to allow users to construct an aggregate of their underlying processes. However, most aggregation approaches rely on the strong assumption that business processes are always well-structured, which is overly strict for BPMN. Aiming to build process views for non-well-structured BPMN processes, this paper investigates the characteristics of BPMN structures, tasks, events, gateways, etc., and proposes a formal process view aggregation approach to facilitate BPMN process view creation. A set of consistency rules and construction rules is defined to regulate the aggregation and guarantee order preservation and structural and behavioural correctness, and a novel aggregation technique, called EP-Fragment, is developed to tackle non-well-structured BPMN processes.
Abstract:
The process view concept deploys a partial and temporal representation to adjust the visible view of a business process according to various perception constraints of users. Process view technology is of practical use for privacy protection and authorization control in process-oriented business management. Owing to complex organizational structure, it is challenging for large companies to accurately specify the diverse perception of different users over business processes. Aiming to tackle this issue, this article presents a role-based process view model to incorporate role dependencies into process view derivation. Compared to existing process view approaches, ours particularly supports runtime updates to the process view perceivable to a user with specific view merging operations, thereby enabling the dynamic tracing of process perception. A series of rules and theorems are established to guarantee the structural consistency and validity of process view transformation. A hypothetical case is conducted to illustrate the feasibility of our approach, and a prototype is developed for the proof-of-concept purpose.
Abstract:
This workshop introduces a range of process drama activities to develop students' critical literacy responses. Whilst children's picture books and process drama strategies have not traditionally been seen as sophisticated resources and strategies for developing students' critical literacy responses, this workshop shows teaching strategies that can be used in language instruction in primary classrooms with diverse student groups. The teaching activities include ‘attribute lists’, ‘sculptures’ and ‘freeze frames’.
Abstract:
Existing process mining techniques provide summary views of the overall process performance over a period of time, allowing analysts to identify bottlenecks and associated performance issues. However, these tools are not designed to help analysts understand how bottlenecks form and dissolve over time, nor how the formation and dissolution of bottlenecks – and associated fluctuations in demand and capacity – affect the overall process performance. This paper presents an approach to analyze the evolution of process performance via a notion of Staged Process Flow (SPF). An SPF abstracts a business process as a series of queues corresponding to stages. The paper defines a number of stage characteristics and visualizations that collectively allow process performance evolution to be analyzed from multiple perspectives. The approach has been implemented in the ProM process mining framework. The paper demonstrates the advantages of the SPF approach over state-of-the-art process performance mining tools using two publicly available real-life event logs.
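The core SPF abstraction, viewing a process as a series of stage queues whose lengths evolve over time, can be sketched as follows. The event format and function name are hypothetical illustrations, not the interface of the ProM plug-in:

```python
def stage_queue_lengths(events, stages, t):
    """Number of cases queued in each stage at time t.

    events: iterable of (case_id, stage, timestamp) stage-entry events.
    stages: ordered list of stage names.
    A case counts toward the latest stage it has entered by time t
    and has not yet left for a later stage.
    """
    order = {s: i for i, s in enumerate(stages)}
    latest = {}  # case_id -> index of latest stage entered by time t
    for case, stage, ts in events:
        if ts <= t and latest.get(case, -1) < order[stage]:
            latest[case] = order[stage]
    counts = {s: 0 for s in stages}
    for i in latest.values():
        counts[stages[i]] += 1
    return counts
```

Evaluating such counts at successive time points yields the per-stage demand and capacity curves from which bottleneck formation and dissolution become visible.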
Abstract:
This paper addresses the following predictive business process monitoring problem: Given the execution trace of an ongoing case, and given a set of traces of historical (completed) cases, predict the most likely outcome of the ongoing case. In this context, a trace refers to a sequence of events with corresponding payloads, where a payload consists of a set of attribute-value pairs. Meanwhile, an outcome refers to a label associated with completed cases, for example, a label indicating that a given case completed “on time” (with respect to a given desired duration) or “late”, or a label indicating that a given case led to a customer complaint or not. The paper tackles this problem via a two-phase approach. In the first phase, prefixes of historical cases are encoded using complex symbolic sequences and clustered. In the second phase, a classifier is built for each of the clusters. To predict the outcome of an ongoing case at runtime given its (incomplete) trace, we select the closest cluster(s) to the trace in question and apply the respective classifier(s), taking into account the Euclidean distance of the trace from the centers of the clusters. We consider two families of clustering algorithms – hierarchical clustering and k-medoids – and use random forests for classification. The approach was evaluated on four real-life datasets.
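The runtime (second-phase) prediction step can be sketched as a nearest-cluster dispatch. The helper below is a hypothetical illustration: plain callables stand in for the per-cluster random forests, and the vector encoding of a trace prefix is assumed to be given:

```python
from math import dist

def predict_outcome(prefix_vec, centers, classifiers):
    """Pick the cluster whose center is closest to the encoded prefix
    in Euclidean distance, then apply that cluster's classifier.

    prefix_vec: numeric encoding of the ongoing case's trace prefix.
    centers: cluster centers from the offline (first) phase.
    classifiers: one classifier per cluster (any callable here; the
    paper trains a random forest per cluster).
    """
    i = min(range(len(centers)), key=lambda k: dist(prefix_vec, centers[k]))
    return classifiers[i](prefix_vec)

# Toy usage: two clusters whose classifiers always answer "late" / "on time".
centers = [(0.0, 0.0), (10.0, 10.0)]
classifiers = [lambda v: "late", lambda v: "on time"]
outcome = predict_outcome((9.0, 8.0), centers, classifiers)
```

In the toy usage the encoded prefix (9.0, 8.0) lies closer to the second center, so the second cluster's classifier decides the outcome.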