723 results for Closure process
Abstract:
The construction of large-volume methacrylate monolithic columns for preparative-scale plasmid purification is hindered by the large exotherm released during polymerisation, which introduces structural heterogeneity into the monolith pore system. A pronounced radial temperature gradient develops across the monolith thickness, reaching a peak temperature that exceeds the maximum temperature allowable for the preparation of a structurally homogeneous monolith. A novel heat expulsion technique was employed to overcome the heat build-up during synthesis. The overall heat build-up comprises the heat associated with initiator decomposition and the heat released by free radical-monomer and monomer-monomer reactions. The heat resulting from initiator decomposition was expelled, along with gaseous fumes, before polymerisation was commenced in a gradual-addition fashion. A 50 mL monolith synthesised using this technique showed improved radial uniformity of the pore structure along the length of the monolith. Chromatographic characterisation of this adsorbent showed a consistent binding capacity of 14.5 mg pDNA/mL of adsorbent. The adsorbent was able to fractionate a clarified bacterial lysate into RNA, protein and pDNA fractions in only 3 min (after loading). The pDNA fraction obtained was shown to be homogeneous supercoiled pDNA.
Abstract:
The maturing of the biotechnology industry and a focus on productivity have driven a shift from discovery science and small-scale bench-top research towards higher-productivity, large-scale production. Health companies are aggressively expanding their biopharmaceutical interests, an expansion facilitated by biochemical and bioprocess engineering. An area of continuous growth is vaccines. Vaccination will be a key intervention in the case of an influenza pandemic, yet the global manufacturing capacity for fast-turnaround vaccines is currently woefully inadequate at around 300 million doses. As the prevention of epidemics requires greater than 80% vaccination coverage of a world population of roughly 6.6 billion, in theory the world should currently be aiming for the capacity to produce around 5.3 billion doses. Presented here is a production method for a fast-turnaround DNA vaccine, which could have a production time scale of as little as two weeks. This process has been harnessed into a pilot-scale production system for the creation of a pre-clinical-grade malaria vaccine in a collaborative project with the Coppel Lab, Department of Microbiology, Monash University. In particular, improvements to the fermentation, chromatography and delivery stages will be discussed, followed by consideration of how the fermentation stage affects the mid- and downstream processing stages.
Abstract:
1. Introduction
The success of self-regulation, in terms of enhancing older drivers' safety and maintaining their mobility, depends largely upon older drivers' awareness of declines in their driving abilities. Interventions targeted at increasing older drivers' safety should therefore aim to enhance their awareness of their physical, sensory and cognitive limitations. Moreover, previous research suggests that driving behaviour change may occur through stages and that interventions and feedback may be perceived differently at each stage.

2. Study aims
To further understand the process of driving self-regulation among older adults by exploring their perceptions and experiences of self-regulation, using the Precaution Adoption Process Model (PAPM) as a framework. To investigate the possible impact of feedback about their driving on their decision-making process.

3. Methodology
Research tool: qualitative focus groups (n = 5 sessions). Recruitment: posters, media, newspaper advertisements and emails. Inclusion criteria: aged 70 or more, English-speaking, current drivers. Participants: a convenience sample of 27 men and women aged 74 to 90 in the Sunshine Coast and Brisbane city, Queensland, Australia.

4. Analysis
Thematic analysis was conducted following the process outlined by Braun and Clarke (2006) to identify, analyse and report themes within the data. Four main themes were identified.
Abstract:
Business process models have traditionally been an effective way of examining business practices to identify areas for improvement. While common information-gathering approaches are generally effective, they can be quite time consuming and carry the risk of inaccuracies when information is forgotten or misinterpreted by analysts. In this study, the potential of a role-playing approach to process elicitation and specification has been examined. This method allows stakeholders to enter a virtual world and role-play actions as they would in reality. As actions are completed, a model is automatically developed, removing the need for stakeholders to learn and understand a modelling grammar. Empirical data obtained in this study suggest that this approach may not only improve both the number of individual process task steps remembered and the correctness of task ordering, but also reduce the time required for stakeholders to model a process view.
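A hypothetical sketch of how a model can be "automatically developed" from role-played actions: each completed action is appended to a trace and a directly-follows relation is recorded, so a simple process model emerges without the stakeholder ever touching a modelling grammar. All class and method names here are illustrative, not from the study.

```python
from collections import defaultdict

class ProcessRecorder:
    """Builds a directly-follows model from a stream of role-played actions."""

    def __init__(self):
        self.trace = []
        self.follows = defaultdict(set)  # task -> set of tasks observed next

    def complete_action(self, task: str) -> None:
        """Called by the virtual world whenever a role-played action finishes."""
        if self.trace:
            self.follows[self.trace[-1]].add(task)
        self.trace.append(task)

    def model(self) -> str:
        """Render the accumulated model as 'a -> b' arcs."""
        return "\n".join(f"{a} -> {b}"
                         for a in sorted(self.follows)
                         for b in sorted(self.follows[a]))

rec = ProcessRecorder()
for step in ["receive order", "check stock", "pack goods", "ship goods"]:
    rec.complete_action(step)
print(rec.model())
```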
Abstract:
Business Process Management describes a holistic management approach for the systematic design, modeling, execution, validation, monitoring and improvement of organizational business processes. Traditionally, most attention within this community has been given to control-flow aspects, i.e., the ordering and sequencing of business activities, often in isolation from the context in which these activities occur. In this paper, we propose an approach that allows executable process models to be integrated with Geographic Information Systems, enabling process models to take geospatial aspects into account explicitly, both during modeling and during execution. We contribute a structured modeling methodology, based on the well-known Business Process Model and Notation standard, which is formalized by means of a mapping to executable Colored Petri nets. We illustrate the feasibility of our approach by means of a sustainability-focused case example of a process with important ecological concerns.
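The paper's formalization maps BPMN models to executable Colored Petri nets; that machinery is beyond a short excerpt. As a loose, hypothetical illustration of the core idea of making a process activity geography-aware at execution time, the sketch below attaches a bounding-box region to a task and checks the executing actor's coordinates against it. All names (Region, Activity, execute) and coordinates are invented.

```python
from dataclasses import dataclass

@dataclass
class Region:
    """Axis-aligned bounding box in (lon, lat) degrees."""
    min_lon: float
    min_lat: float
    max_lon: float
    max_lat: float

    def contains(self, lon: float, lat: float) -> bool:
        return (self.min_lon <= lon <= self.max_lon
                and self.min_lat <= lat <= self.max_lat)

@dataclass
class Activity:
    name: str
    allowed_region: Region  # geospatial constraint checked at execution time

def execute(activity: Activity, actor_lon: float, actor_lat: float) -> None:
    """Refuse to run the activity if the actor is outside the allowed region."""
    if not activity.allowed_region.contains(actor_lon, actor_lat):
        raise RuntimeError(
            f"Activity '{activity.name}' may only be executed inside its region")
    print(f"Executing '{activity.name}' at ({actor_lon}, {actor_lat})")

# Example: an inspection task restricted to a (made-up) wetland area.
wetland = Region(min_lon=152.9, min_lat=-27.6, max_lon=153.2, max_lat=-27.3)
inspect = Activity("Inspect drainage outlet", wetland)
execute(inspect, 153.0, -27.5)  # inside the region: runs
```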
Abstract:
One quarter of Australian children are overweight or obese (ABS, 2010), putting them at increased risk of physical and psychological health problems (Reilly et al., 2003). Overweight and obesity in childhood tend to persist into adulthood and are associated with premature death and morbidity (Reilly & Kelly, 2011). Increases in Australian children's weight have coincided with declines in active transportation, such as walking, to school (Salmon et al., 2005). To address this problem, the Victorian Health Promotion Foundation (VicHealth), an independent statutory authority that advises government and contributes to promoting good health in Victoria (VicHealth, 2014), developed the Walk to School program. Walk to School aims to encourage primary school children in Victoria to walk to and from school more often. Walking to school is a low-cost and effective means of reducing excess weight (Rosenberg et al., 2006) that can be easily integrated into the daily routine (Brophy et al., 2011). The purpose of this paper is to present the results of the stakeholder process evaluation of Walk to School 2013, which forms part of a broader outcome evaluation currently in the field. Although there is an emphasis on outcome evaluation of programs, process evaluation can be equally important in determining program success (Saunders et al., 2005). Further, process evaluation to assess program delivery and utilization is explicitly recommended by two social marketing frameworks (see Lefebvre et al., 1988; Walsh et al., 1993).
Abstract:
One of the main challenges facing online and offline path planners is uncertainty in the magnitude and direction of environmental energy, which is dynamic, changes with time, and is hard to forecast. This thesis develops an artificial intelligence technique that enables a mobile robot to learn from historical or forecast data on the environmental energy available in the area of interest, supporting persistent monitoring under uncertainty using the developed algorithm.
Abstract:
Knowledge of the pollutant build-up process is a key requirement for developing stormwater pollution mitigation strategies. In this context, process variability is a concept which needs to be understood in depth. Analysis of particulate build-up on three road surfaces in an urban catchment confirmed that particles <150 µm and >150 µm have characteristically different build-up patterns, and that these patterns are consistent over different field conditions. Three theoretical build-up patterns were developed based on the size-fractionated particulate build-up patterns; these patterns explain the variability in particle behavior and the variation in particle-bound pollutant load and composition over the antecedent dry period. Behavioral variability of particles <150 µm was found to exert the most significant influence on build-up process variability. As characterization of process variability is particularly important in stormwater quality modeling, it is recommended that the influence of the behavioral variability of particles <150 µm on pollutant build-up be specifically addressed. This would eliminate model deficiencies in replicating the build-up process, facilitate accounting for the inherent process uncertainty, and thereby enhance water quality predictions.
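The paper's three size-fractionated build-up patterns are not given in the abstract. As a generic illustration of what a build-up pattern over the antecedent dry period looks like, the sketch below uses the saturating exponential form commonly found in stormwater quality models, B(t) = B_max(1 - e^(-kt)); the parameter values are invented and do not reproduce the paper's patterns.

```python
import math

def build_up(t_days: float, b_max: float, k: float) -> float:
    """Pollutant load (g/m^2) after t_days of antecedent dry period,
    using a saturating exponential build-up curve."""
    return b_max * (1.0 - math.exp(-k * t_days))

# Hypothetical size fractions: fine particles (<150 um) assumed to build up
# faster, dominating variability early in the antecedent dry period.
for t in (1, 3, 7, 14):
    fine = build_up(t, b_max=4.0, k=0.6)    # <150 um fraction (invented)
    coarse = build_up(t, b_max=6.0, k=0.2)  # >150 um fraction (invented)
    print(f"day {t:2d}: fine={fine:.2f} g/m^2, coarse={coarse:.2f} g/m^2")
```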
Abstract:
One of the riskiest activities in the course of a person's work is driving. By developing and testing a new work driving risk assessment tool for use by organisations, this research contributes to the safety of those who drive for work purposes. The results highlighted limitations of current self-report measures and provided evidence that the work driving environment is extremely complex, involving constant interactions between humans, vehicles, the road environment, and the organisational context.
Abstract:
There is consensus among practitioners and academics that culture is a critical factor that can determine the success or failure of BPM initiatives. Yet culture is a topic that seems difficult to grasp and manage, which may explain the overall lack of guidance on how to address it in practice. We have conducted in-depth research for more than three years to examine why and how culture is relevant to BPM. In this chapter, we introduce a framework that explains the role of culture in BPM, present the cultural values that compose a BPM culture, and introduce a tool to examine how supportive an organizational culture is of BPM. Our research results provide the basis for further empirical analyses on the topic and support practitioners in managing culture as an important factor in BPM initiatives.
Abstract:
This paper presents a technique for the automated removal of noise from process execution logs. Noise is the result of data quality issues such as logging errors and manifests itself in the form of infrequent process behavior. The proposed technique generates an abstract representation of an event log as an automaton capturing the directly-follows relations between event labels. This automaton is then pruned of arcs with low relative frequency and used to remove from the log those events that do not fit the automaton, which are identified as outliers. The technique has been extensively evaluated in combination with various automated process discovery algorithms, using both artificial logs with different levels of noise and a variety of real-life logs. The results show that the technique significantly improves the quality of the discovered process model in terms of fitness, appropriateness and simplicity, without negative effects on generalization. Further, the technique scales well to large and complex logs.
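A minimal sketch of the described idea, assuming a log is a list of traces (each a list of event labels): count directly-follows relations, prune arcs whose frequency is low relative to other arcs leaving the same label, then drop events that no longer fit. The relative-frequency rule and threshold below are simplifications, not the authors' exact pruning criterion.

```python
from collections import Counter

def filter_log(log, threshold=0.1):
    # 1. Count directly-follows relations over all traces,
    #    with artificial start/end labels.
    df = Counter()
    for trace in log:
        path = ["<start>"] + trace + ["<end>"]
        for a, b in zip(path, path[1:]):
            df[(a, b)] += 1
    # 2. Prune arcs whose frequency is low relative to the strongest
    #    arc leaving the same source label.
    max_out = Counter()
    for (a, _), n in df.items():
        max_out[a] = max(max_out[a], n)
    kept = {arc for arc, n in df.items() if n >= threshold * max_out[arc[0]]}
    # 3. Drop events that do not fit the pruned automaton.
    filtered = []
    for trace in log:
        out, prev = [], "<start>"
        for e in trace:
            if (prev, e) in kept:
                out.append(e)
                prev = e
        filtered.append(out)
    return filtered

log = [["a", "b", "c"]] * 20 + [["a", "x", "b", "c"]]  # "x" is rare noise
print(filter_log(log)[-1])  # -> ['a', 'b', 'c']: the noisy event is removed
```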
Abstract:
Organizational and technological systems analysis and design practices such as process modeling have received much attention in recent years. However, while knowledge about related artifacts such as models, tools, or grammars has substantially matured, little is known about the actual tasks and interaction activities that are conducted as part of analysis and design acts. In particular, the key role of the facilitator has not been researched extensively to date. In this paper, we propose a new conceptual framework that can be used to examine facilitation behaviors in process modeling projects. The framework distinguishes four behavioral styles of facilitation (the driving engineer, the driving artist, the catalyzing engineer, and the catalyzing artist) that a facilitator can adopt. To distinguish between the four styles, we provide a set of ten behavioral anchors that underpin facilitation behaviors. We also report on a preliminary empirical exploration of our framework through interviews with experienced analysts in six modeling cases. Our research provides a conceptual foundation for an emerging theory describing and explaining the different behaviors associated with process modeling facilitation, offers preliminary empirical results about facilitation in modeling projects, and establishes a fertile basis for examining facilitation in other conceptual modeling activities.
Abstract:
Business processes are prone to continuous and unexpected changes. Process workers may start executing a process differently in order to adjust, for example, to changes in workload, season, guidelines or regulations. Early detection of business process changes based on their event logs – also known as business process drift detection – enables analysts to identify and act upon changes that may otherwise affect process performance. Previous methods for business process drift detection are based on the exploration of a potentially large feature space, and in some cases they require users to manually identify the specific features that characterize the drift. Depending on the explored feature set, these methods may miss certain types of changes. This paper proposes a fully automated and statistically grounded method for detecting process drift. The core idea is to perform statistical tests over the distributions of runs observed in two consecutive time windows. By adaptively sizing the windows, the method strikes a trade-off between classification accuracy and drift detection delay. A validation on synthetic and real-life logs shows that the method accurately detects typical change patterns and scales well enough to be applicable for online drift detection.
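A minimal sketch of the statistical core, using trace variants as a stand-in for the paper's runs and a fixed window size in place of the adaptive sizing: the distributions of variants in two consecutive windows are compared with a chi-square test (via scipy), and a drift is flagged when the p-value drops below a significance level.

```python
from collections import Counter
from scipy.stats import chi2_contingency

def drift_detected(window1, window2, alpha=0.05):
    """Compare variant distributions in two consecutive windows of traces
    (each trace a tuple of event labels); flag drift if p < alpha."""
    c1, c2 = Counter(window1), Counter(window2)
    variants = sorted(set(c1) | set(c2))
    table = [[c1[v] + 1 for v in variants],   # +1 smoothing avoids zero cells
             [c2[v] + 1 for v in variants]]
    _, p, _, _ = chi2_contingency(table)
    return p < alpha, p

before = [("a", "b", "c")] * 50 + [("a", "c", "b")] * 50
after = [("a", "b", "c")] * 95 + [("a", "c", "b")] * 5  # behaviour shifted
print(drift_detected(before, after))  # -> (True, <small p-value>)
```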
Abstract:
This paper addresses the problem of identifying and explaining behavioral differences between two business process event logs. The paper presents a method that, given two event logs, returns a set of statements in natural language capturing behavior that is present or frequent in one log, while absent or infrequent in the other. This log delta analysis method allows users to diagnose differences between normal and deviant executions of a process or between two versions or variants of a process. The method relies on a novel approach to losslessly encode an event log as an event structure, combined with a frequency-enhanced technique for differencing pairs of event structures. A validation of the proposed method shows that it accurately diagnoses typical change patterns and can explain differences between normal and deviant cases in a real-life log, more compactly and precisely than previously proposed methods.
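The method itself losslessly encodes logs as event structures; as a much simpler stand-in that still yields natural-language difference statements, the sketch below compares the relative frequencies of directly-follows relations between two logs and verbalises the largest gaps. The threshold and phrasing are invented for illustration.

```python
from collections import Counter

def directly_follows_freq(log):
    """Relative frequency of each directly-follows relation in a log
    (a list of traces, each a tuple of event labels)."""
    df, pairs = Counter(), 0
    for trace in log:
        for a, b in zip(trace, trace[1:]):
            df[(a, b)] += 1
            pairs += 1
    return {rel: n / pairs for rel, n in df.items()}

def delta_statements(log1, log2, min_gap=0.2):
    """Yield plain-language statements for relations whose relative
    frequency differs between the two logs by at least min_gap."""
    f1, f2 = directly_follows_freq(log1), directly_follows_freq(log2)
    for rel in sorted(set(f1) | set(f2)):
        gap = f1.get(rel, 0.0) - f2.get(rel, 0.0)
        if abs(gap) >= min_gap:
            where = "log 1" if gap > 0 else "log 2"
            a, b = rel
            yield f"'{b}' directly follows '{a}' more often in {where}"

normal = [("check", "approve", "notify")] * 10
deviant = [("check", "reject", "notify")] * 10
for statement in delta_statements(normal, deviant):
    print(statement)
```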