186 results for process control
Abstract:
There is a wide range of potential study designs for intervention studies to decrease nosocomial infections in hospitals. The analysis is complex due to competing events, clustering, multiple timescales and time-dependent period and intervention variables. This review considers the popular pre-post quasi-experimental design and compares it with randomized designs. Randomization can be done in several ways: randomization of the cluster [intensive care unit (ICU) or hospital] in a parallel design; randomization of the sequence in a cross-over design; and randomization of the time of intervention in a stepped-wedge design. We introduce each design in the context of nosocomial infections and discuss the designs with respect to the following key points: bias, control for nonintervention factors, and generalizability. Statistical issues are discussed. A pre-post-intervention design is often the only choice that will be informative for a retrospective analysis of an outbreak setting. It can be seen as a pilot study with further, more rigorous designs needed to establish causality. To yield internally valid results, randomization is needed. Generally, the first choice in terms of the internal validity should be a parallel cluster randomized trial. However, generalizability might be stronger in a stepped-wedge design because a wider range of ICU clinicians may be convinced to participate, especially if there are pilot studies with promising results. For analysis, the use of extended competing risk models is recommended.
Abstract:
We learn from the past that invasive species have caused tremendous damage to native species and serious disruption to agricultural industries. It is crucial to prevent this in the future. The first step in this process is to correctly distinguish an invasive species from native ones. Current identification methods, relying mainly on 2D images, can result in low accuracy and be time consuming. Such methods provide little help to a quarantine officer who must respond under time constraints while on duty. To deal with this problem, we propose new solutions using 3D virtual models of insects. We explain why working with insects in the 3D domain can be much better than in the 2D domain. We also describe how to create true-color 3D models of insects using an image-based 3D reconstruction method. This method is ideal for quarantine control and inspection tasks that involve the verification of a physical specimen against known invasive species. Finally, we show that these insect models provide valuable material for other applications such as research, education, arts and entertainment. © 2013 IEEE.
Abstract:
Kaolinite naturally occurs in plate form owing to its interlayer hydrogen bonds and the distortion and mutual adaptation of its tetrahedral and octahedral sheets. However, kaolinite sheets can be exfoliated into nanoscrolls in the laboratory through multiple-step displacement intercalation. The driving force for a kaolinite sheet to curl into a nanoscroll originates from the size discrepancy between the Si–O tetrahedron and the Al–O octahedron. Displacement intercalation promotes the spontaneous scrolling of the platy kaolinite sheets by eliminating the interlayer hydrogen bonds and atomic interactions. Kaolinite nanoscrolls are hollow tubes whose outer face is the tetrahedral sheet and whose inner face is the octahedral sheet. Based on theoretical calculation, it is reported for the first time that the minimum interior diameter for a single kaolinite sheet to scroll is about 9.08 nm, the optimal diameter is 24.30 nm, and the maximum is 100 nm, which is verified by scanning and transmission electron microscopy observations. The different adaptation types and degrees of discrepancy between tetrahedron and octahedron generate different curling forces in different directions. The nanoscroll axes prefer the directions [100], [11̄0], [110] and [31̄0], and the relative curling forces rank as follows: [31̄0] > [100] = [11̄0] > [110].
Abstract:
This paper proposes a recommendation system that supports process participants in taking risk-informed decisions, with the goal of reducing risks that may arise during process execution. Risk reduction involves decreasing both the likelihood of a process fault occurring and its severity. Given a business process exposed to risks, e.g. a financial process exposed to a risk of reputation loss, we enact this process and, whenever a process participant needs to provide input to the process, e.g. by selecting the next task to execute or by filling out a form, we suggest to the participant the action that minimizes the predicted process risk. Risks are predicted by traversing decision trees generated from the logs of past process executions, which consider process data, involved resources, task durations and other information elements such as task frequencies. When multiple process instances run concurrently, a second technique is employed that uses integer linear programming to compute the optimal assignment of resources to tasks, in order to deal with the interplay between the risks of different instances. The recommendation system has been implemented as a set of components on top of the YAWL BPM system, and its effectiveness has been evaluated on a real-life scenario in collaboration with risk analysts of a large insurance company. The results, based on a simulation of the real-life scenario and its comparison with the event data provided by the company, show that concurrently executed process instances complete with significantly fewer faults and lower fault severities when the recommendations provided by our recommendation system are taken into account.
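The resource-to-task assignment step described above is an instance of the classic assignment problem. The following is a minimal sketch of that idea, not the authors' YAWL-based implementation: it brute-forces the assignment that minimizes total predicted risk for small instances, where the paper would use integer linear programming; the risk matrix values are hypothetical.

```python
from itertools import permutations

def min_risk_assignment(risk):
    """Find the resource-to-task assignment minimizing total predicted risk.

    risk[r][t] is the predicted risk (e.g. read off a decision tree built
    from past execution logs) of resource r performing task t.
    Brute force over permutations; fine for small instances only.
    """
    n = len(risk)
    best_cost, best_assign = float("inf"), None
    for perm in permutations(range(n)):  # perm[r] = task given to resource r
        cost = sum(risk[r][perm[r]] for r in range(n))
        if cost < best_cost:
            best_cost, best_assign = cost, perm
    return best_assign, best_cost

# Hypothetical predicted risks for 3 resources x 3 tasks
risk = [[0.9, 0.2, 0.4],
        [0.3, 0.8, 0.5],
        [0.6, 0.7, 0.1]]
assign, total = min_risk_assignment(risk)
# resource 0 -> task 1, resource 1 -> task 0, resource 2 -> task 2
```

For realistic instance sizes an ILP or the Hungarian algorithm replaces the factorial search, but the objective (minimizing summed predicted risk across concurrent instances) is the same.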
Abstract:
Business Process Management describes a holistic management approach for the systematic design, modeling, execution, validation, monitoring and improvement of organizational business processes. Traditionally, most attention within this community has been given to control-flow aspects, i.e., the ordering and sequencing of business activities, often in isolation from the context in which these activities occur. In this paper, we propose an approach that allows executable process models to be integrated with Geographic Information Systems. This approach enables process models to take geospatial and other geographic aspects into account in an explicit manner, both during the modeling phase and the execution phase. We contribute a structured modeling methodology, based on the well-known Business Process Model and Notation standard, which is formalized by means of a mapping to executable Colored Petri nets. We illustrate the feasibility of our approach by means of a sustainability-focused case example of a process with important ecological concerns.
Abstract:
Knowledge of the pollutant build-up process is a key requirement for developing stormwater pollution mitigation strategies. In this context, process variability is a concept which needs to be understood in depth. Analysis of particulate build-up on three road surfaces in an urban catchment confirmed that particles <150 µm and >150 µm have characteristically different build-up patterns, and these patterns are consistent over different field conditions. Three theoretical build-up patterns were developed based on the size-fractionated particulate build-up patterns; these patterns explain the variability in particle behavior and the variation in particle-bound pollutant load and composition over the antecedent dry period. The behavioral variability of particles <150 µm was found to exert the most significant influence on build-up process variability. As the characterization of process variability is particularly important in stormwater quality modeling, it is recommended that the influence of the behavioral variability of particles <150 µm on pollutant build-up be specifically addressed. This would eliminate model deficiencies in replicating the build-up process, facilitate accounting for the inherent process uncertainty, and thereby enhance water quality predictions.
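Stormwater quality models commonly represent build-up over the antecedent dry period with a saturating (exponential) form. The sketch below uses that generic form purely for illustration; the function shape, parameter names and values are assumptions, not the size-fractionated patterns derived in this study.

```python
import math

def buildup(t, b_max, k):
    """Saturating pollutant build-up B(t) = B_max * (1 - exp(-k * t)).

    A commonly assumed form in stormwater quality modeling; t is the
    antecedent dry period in days, b_max the equilibrium load, k the rate.
    """
    return b_max * (1.0 - math.exp(-k * t))

# Hypothetical parameters: fine particles (<150 um) build up faster but
# saturate at a lower load than coarse particles (>150 um).
fine = [buildup(t, b_max=50.0, k=0.8) for t in range(8)]
coarse = [buildup(t, b_max=120.0, k=0.2) for t in range(8)]
```

Modeling the two size fractions with separate parameter sets is one way to express the study's point that the fractions follow characteristically different build-up patterns.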
Abstract:
Despite significant improvements in capacity–distortion performance, recent watermarking schemes still lack computationally efficient capacity control. In this paper, we propose an efficient capacity control framework to substantiate the notion that watermarking capacity control is the process of maintaining "acceptable" distortion and running time while attaining the required capacity. The necessary analysis and experimental results on capacity control are reported to address practical aspects of the watermarking capacity problem in dynamic (size) payload embedding.
Abstract:
The integration of small-scale electricity generators, known as distributed generation (DG), into distribution networks has become increasingly popular. This trend, together with the falling price of synchronous-type generators, gives DG a better chance of participating in the voltage regulation process alongside other devices already available in the system. The voltage control issue is a very challenging problem for distribution engineers, since existing control coordination schemes need to be reconsidered to take DG operation into account. In this paper, we propose a control coordination technique that utilizes the ability of DG to act as a voltage regulator and, at the same time, minimizes interaction with other active devices, such as an on-load tap-changing transformer and a voltage regulator. The technique has been developed based on the concepts of control zones, line drop compensation and dead bands, as well as the choice of controller parameters. Simulations carried out on an Australian system show that the technique is suitable and flexible for any system with multiple regulating devices, including DG.
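Line drop compensation and dead bands, two of the building blocks named above, can be sketched as follows. This is a generic textbook-style illustration, not the authors' coordination technique; all parameter names and per-unit values are assumptions.

```python
import math

def tap_action(v_meas, i_load, r_comp, x_comp, pf, v_set, dead_band):
    """Decide a tap-changer action using line drop compensation (LDC).

    Estimate the voltage at a remote load centre from the local measurement
    and the compensated line impedance (r_comp, x_comp), then act only if
    the estimate leaves the dead band around the set point. All voltages
    and currents are in per unit; pf is the load power factor (cos phi).
    """
    sin_phi = math.sqrt(1.0 - pf * pf)
    # Approximate magnitude of the compensated line voltage drop
    v_drop = i_load * (r_comp * pf + x_comp * sin_phi)
    v_est = v_meas - v_drop
    if v_est < v_set - dead_band / 2:
        return "raise"  # load-centre voltage too low: tap up
    if v_est > v_set + dead_band / 2:
        return "lower"  # load-centre voltage too high: tap down
    return "hold"       # inside the dead band: take no action
```

A sufficiently wide dead band is one mechanism for limiting interaction between regulating devices: each controller stays idle while another device's action keeps the estimated voltage within its band.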
Abstract:
Big Tobacco has been engaged in a dark, shadowy plot and conspiracy to hijack the Trans-Pacific Partnership Agreement (TPP) and undermine tobacco control measures – such as graphic health warnings and the plain packaging of tobacco products... In the context of this heavy lobbying by Big Tobacco and its proxies, this chapter provides an analysis of the debate over trade, tobacco, and the TPP. This discussion is necessarily focused on the negotiations of the free trade agreement – the shadowy conflicts before the finalisation of the text. This chapter contends that the trade negotiations threaten hard-won gains in public health – including international developments such as the WHO Framework Convention on Tobacco Control, and domestic measures, such as graphic health warnings and the plain packaging of tobacco products. It maintains that there is a need for regional trade agreements to respect the primacy of the WHO Framework Convention on Tobacco Control. There is a need both to provide for an open and transparent process regarding such trade negotiations, as well as a due and proper respect for public health in terms of substantive obligations. Part I focuses on the debate over the intellectual property chapter of the TPP, within the broader context of domestic litigation against Australia’s plain tobacco packaging regime and associated WTO disputes. Part II examines the investment chapter of the TPP, taking account of ongoing investment disputes concerning tobacco control and the declared approaches of Australia and New Zealand to investor-state dispute settlement. Part III looks at the discussion as to whether there should be specific text on tobacco control in the TPP, and, if so, what should be its nature and content. This chapter concludes that the plain packaging of tobacco products – and other best practices in tobacco control – should be adopted by members of the Pacific Rim.
Abstract:
This study developed and tested a model of job uncertainty for survivors and victims of downsizing. Data were collected from three samples of employees in a public hospital, each representing three phases of the downsizing process: immediately before the announcement of the redeployment of staff, during the implementation of the downsizing, and towards the end of the official change programme. As predicted, levels of job uncertainty and personal control had a direct relationship with emotional exhaustion and job satisfaction. In addition, there was evidence to suggest that personal control mediated the relationship between job uncertainty and employee adjustment, a pattern of results that varied across each of the three phases of the change event. From the perspective of the organization’s overall climate, it was found that levels of job uncertainty, personal control and job satisfaction improved and/or stabilized over the downsizing process. During the implementation phase, survivors experienced higher levels of personal control than victims, but both groups of employees reported similar levels of job uncertainty. We discuss the implications of our results for strategically managing uncertainty during and after organizational change.
Abstract:
This paper applies concepts Deleuze developed in his ‘Postscript on the Societies of Control’, especially those relating to modulatory power, dividuation and control, to aspects of Australian schooling to explore how this transition is manifesting itself. Two modulatory machines of assessment, NAPLAN and My Schools, are examined as a means to better understand how the disciplinary institution is changing as a result of modulation. This transition from discipline to modulation is visible in the declining importance of the disciplinary teacher–student relationship as a measure of the success of the educative process. The transition occurs through seduction because that which purports to measure classroom quality is in fact a serpent of modulation that produces simulacra of the disciplinary classroom. The effect is to sever what happens in the disciplinary space from its representations in a luminiferous ether that overlays the classroom.
Abstract:
While anecdotal evidence indicates financial advice affects consumers’ financial well-being, this research project is motivated by the absence of empirically-grounded research relating to the extent to which, and, importantly, how, financial planning advice contributes to broader client well-being. Accordingly, the aim of this project is to establish how the quality of financial planning advice can be optimised to add value, not only to clients’ financial situation, but also to broader aspects of their well-being. This broader construct of well-being captures a range of process and outcome factors that map to concepts of security, control, choice, mastery, and life satisfaction (Irving, 2012; Gallery, Gallery, Irving & Newton, 2011; Irving, Gallery, and Gallery, 2009). Financial planning is commonly purported to confer not only tangible benefits, but also intangible benefits, such as increased security and peace of mind that are considered as important, if not more important, than material outcomes. Such claims are intuitively appealing; however, little empirical evidence exists for the notion that engaging with a financial planner or adviser promotes peace of mind, feelings of security, and expands choices and possibilities. Nor is there evidence signalling what mechanisms might underpin such client benefits. In addressing this issue, we examine the financial planning advice (including financial product advice) provided to retail clients, and consider the short- and longer-term impacts on clients’ financial satisfaction and broader well-being. To this end, we examine both process (e.g., how financial planning advice is given) and outcome (e.g., financial situation) effects.
Authorisation management in business process environments: An authorisation model and a policy model
Abstract:
This thesis provides two main contributions. The first one is BP-TRBAC, a unified authorisation model that can support legacy systems as well as business process systems. BP-TRBAC supports specific features that are required by business process environments. BP-TRBAC is designed to be used as an independent enterprise-wide authorisation model, rather than having it as part of the workflow system. It is designed to be the main authorisation model for an organisation. The second contribution is BP-XACML, an authorisation policy language that is designed to represent BPM authorisation policies for business processes. The contribution also includes a policy model for BP-XACML. Using BP-TRBAC as an authorisation model together with BP-XACML as an authorisation policy language will allow an organisation to manage and control authorisation requests from workflow systems and other legacy systems.
Abstract:
Process view technology is attracting more attention in modern business process management, as it enables the customisation of business process representation. This capability helps improve privacy protection, authority control, flexible display, etc., in business process modelling. One approach to generating process views is to allow users to construct an aggregate over their underlying processes. However, most aggregation approaches rely on the strong assumption that business processes are always well-structured, which is overly strict for BPMN. Aiming to build process views for non-well-structured BPMN processes, this paper investigates the characteristics of BPMN structures, tasks, events, gateways, etc., and proposes a formal process view aggregation approach to facilitate BPMN process view creation. A set of consistency rules and construction rules is defined to regulate the aggregation and guarantee order preservation and structural and behavioural correctness, and a novel aggregation technique, called EP-Fragment, is developed to tackle non-well-structured BPMN processes.
Abstract:
The process view concept deploys a partial and temporal representation to adjust the visible view of a business process according to the various perception constraints of users. Process view technology is of practical use for privacy protection and authorization control in process-oriented business management. Owing to complex organizational structures, it is challenging for large companies to accurately specify the diverse perceptions of different users over business processes. Aiming to tackle this issue, this article presents a role-based process view model that incorporates role dependencies into process view derivation. Compared to existing process view approaches, ours particularly supports runtime updates to the process view perceivable to a user through specific view merging operations, thereby enabling the dynamic tracing of process perception. A series of rules and theorems is established to guarantee the structural consistency and validity of process view transformation. A hypothetical case study illustrates the feasibility of our approach, and a prototype has been developed as a proof of concept.