887 results for Process-Control Agents
Abstract:
Use of focus groups as a technique of inquiry is gaining attention in the area of health-care research. This paper will report on the technique of focus group interviewing to investigate the role of the infection control practitioner. Infection control is examined as a specialty area of health-care practice that has received little research attention to date. Additionally, it is an area of practice that is expanding in response to social, economic and microbiological forces. The focus group technique in this study helped a group of infection control practitioners from urban, regional and rural areas throughout Queensland identify and categorise their daily work activities. The outcomes of this process were then analysed to identify the growth in breadth and complexity of the role of the infection control practitioner in the contemporary health-care environment. Findings indicate that the role of the infection control practitioner in Australia has undergone changes consistent with and reflecting changing models of health-care delivery.
Abstract:
The following proposal is submitted by the AICA's credentialling and certification subcommittee for your consideration. It outlines a process and procedure for the interim credentialling of infection control practitioners. In submitting this proposal, the subcommittee acknowledges that, while competency-based education is the preferred process for credentialling, there are clinicians who, in the absence of educational opportunities, have developed a specialist level of competency in infection control practice through self-education and experience. Committee members also recognise the need for self-regulation of accrediting processes, to maintain standards in practice and support members in their clinical roles. We ask you to review the following proposal and invite your comments and critique, to be received by the last week in January 1998.
Abstract:
Widespread adoption by electricity utilities of Non-Conventional Instrument Transformers, such as optical or capacitive transducers, has been limited due to the lack of a standardised interface and multi-vendor interoperability. Low power analogue interfaces are being replaced by IEC 61850-9-2 and IEC 61869-9 digital interfaces that use Ethernet networks for communication. These ‘process bus’ connections achieve significant cost savings by simplifying connections between the switchyard and control rooms; however, the in-service performance when these standards are employed is largely unknown. The performance of real-time Ethernet networks and time synchronisation was assessed using a scale model of a substation automation system. The test bed was constructed from commercially available timing and protection equipment supplied by a range of vendors. Test protocols were developed to thoroughly evaluate the performance of Ethernet networks and network-based time synchronisation. The suitability of IEEE Std 1588 Precision Time Protocol (PTP) as a synchronising system for sampled values was tested in the steady state and under transient conditions. Similarly, the performance of hardened Ethernet switches designed for substation use was assessed under a range of network operating conditions. This paper presents test methods that use a precision Ethernet capture card to accurately measure PTP and network performance. These methods can be used for product selection and to assess ongoing system performance as substations age. Key findings on the behaviour of multi-function process bus networks are presented. System-level tests were performed using a Real Time Digital Simulator and a transformer protection relay with sampled value and Generic Object Oriented Substation Events (GOOSE) capability. These tests examined the interactions between sampled values, PTP and GOOSE messages. Our research has demonstrated that several protocols can be used on a shared process bus, even with very high network loads. This should provide confidence that this technology is suitable for transmission substations.
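To make the synchronisation requirement concrete, the sketch below shows the standard IEEE 1588 offset and path-delay calculation that a PTP slave performs from the Sync/Delay_Req timestamp exchange; it is this computation whose accuracy the test methods above measure. The timestamps are illustrative values, not measurements from the test bed.

```python
# Standard IEEE 1588 (PTP) delay request-response arithmetic.
# t1: Sync sent (master clock)      t2: Sync received (slave clock)
# t3: Delay_Req sent (slave clock)  t4: Delay_Req received (master clock)
def ptp_offset_and_delay(t1, t2, t3, t4):
    mean_path_delay = ((t2 - t1) + (t4 - t3)) / 2
    offset_from_master = ((t2 - t1) - (t4 - t3)) / 2
    return offset_from_master, mean_path_delay

# Illustrative timestamps in nanoseconds: a slave running 500 ns ahead
# of the master over a symmetric 1000 ns network path.
offset, delay = ptp_offset_and_delay(t1=0, t2=1500, t3=2000, t4=2500)
print(offset, delay)  # 500.0 1000.0
```

The calculation assumes a symmetric network path, so queuing delay introduced by switches under high load appears directly as offset error, which is why switch behaviour matters for sampled value synchronisation.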
Abstract:
The Beauty Leaf tree (Calophyllum inophyllum) is a potential source of non-edible vegetable oil for producing future-generation biodiesel because of its ability to grow in a wide range of climatic conditions, easy cultivation, high fruit production rate, and the high oil content of the seed. The plant occurs naturally in the coastal areas of Queensland and the Northern Territory in Australia, and is also widespread in south-east Asia, India and Sri Lanka. Although Beauty Leaf is traditionally used as a source of timber and as an ornamental plant, its potential as a source of second-generation biodiesel is yet to be exploited. In this study, the extraction process from the Beauty Leaf oil seed has been optimised in terms of seed preparation, moisture content and oil extraction method. The two methods considered for extracting oil from the seed kernel are mechanical extraction using an electrically powered screw press, and chemical extraction using n-hexane as an oil solvent. The study found that seed preparation has a significant impact on oil yields, especially in the screw press extraction method. Kernels prepared to 15% moisture content provided the highest oil yields for both extraction methods. Mechanical extraction using the screw press can produce oil from correctly prepared kernels at low cost; however, overall this method is ineffective, with relatively low oil yields. Chemical extraction was found to be a very effective method of oil extraction, owing to its consistent performance and high oil yield, but the cost of production was relatively high due to the high cost of the solvent. However, a solvent recycling system can be implemented to reduce the production cost of Beauty Leaf biodiesel. The findings of this study are expected to serve as a basis for industrial-scale biodiesel production from Beauty Leaf.
Abstract:
Norms regulate the behaviour of their subjects and define what is legal and what is illegal. Norms typically describe the conditions under which they are applicable and the normative effects that result from their application. Process models, on the other hand, specify how a business operation or service is to be carried out to achieve a desired outcome. Norms can have a significant impact on how business operations are conducted, and they can apply to the whole or part of a business process. For example, they may impose conditions on different aspects of a process (e.g., that tasks be performed in a specific sequence (control flow), at a specific time or within a certain time frame (temporal aspect), or by specific people (resources)). We propose a framework that provides the formal semantics of the normative requirements for determining whether a business process complies with a normative document (where a normative document can be understood in a very broad sense, ranging from internal policies to best-practice policies to statutory acts). We also present a classification of normative requirements based on the notion of different types of obligations and the effects of violating these obligations.
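As a rough illustration of the kind of obligation semantics involved (a simplified sketch, not the paper's formal framework), the following fragment checks an achievement obligation and a maintenance obligation against a process execution trace; the task names and the trace are hypothetical.

```python
# Illustrative only: two common obligation types checked over a trace.

def achieved(trace, task, deadline_idx):
    """Achievement obligation: `task` must occur at or before the deadline."""
    return task in trace[:deadline_idx + 1]

def maintained(trace, condition, start_idx, end_idx):
    """Maintenance obligation: `condition` must hold at every step of the interval."""
    return all(condition(step) for step in trace[start_idx:end_idx + 1])

trace = ["receive_order", "credit_check", "ship_goods", "invoice"]
print(achieved(trace, "credit_check", deadline_idx=2))          # True
print(maintained(trace, lambda s: s != "cancel_order", 0, 3))   # True
```

In the paper's terms, whether a violated obligation can still be compensated afterwards is what distinguishes the different obligation classes.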
Abstract:
Process-Aware Information Systems (PAIS) support organizations in managing and automating their processes. In particular industries, such as service-oriented markets, full automation of processes is not practicable. The integration of humans in PAIS is necessary to manage and perform processes that require human capabilities, judgments and decisions. A challenge of interdisciplinary PAIS research is to provide concepts and solutions that support human integration in PAIS and human orientation of PAIS in a way that provably increases users' satisfaction and motivation when working with the Human-Centric Process-Aware Information System (HC-PAIS) and consequently influences users' performance of tasks. This work is an initial step of research that aims at providing a definition of Human-Centric Process-Aware Information Systems (HC-PAIS) and identifying future research challenges for HC-PAIS. Results of focus group research are presented.
Abstract:
We examine which capabilities technologies provide to support collaborative process modeling. We develop a model that explains how technology capabilities impact cognitive group processes, and how they lead to improved modeling outcomes and positive technology beliefs. We test this model through a free simulation experiment of collaborative process modelers structured around a set of modeling tasks. With our study, we provide an understanding of the process of collaborative process modeling, and detail implications for research and guidelines for the practical design of collaborative process modeling.
Abstract:
We study a political economy model that aims to explain the diversity in growth and technology-adoption experiences across economies. In this model the cost of technology adoption is endogenous and varies across heterogeneous agents. Agents in the model vote on the proportion of revenues allocated towards adoption-cost-reducing expenditures. In the early stages of development, the political-economy outcome of the model ensures that a sub-optimal proportion of government revenue is used to finance these expenditures. This sub-optimality is due to the presence of inequality: agents at the lower end of the distribution favor a larger amount of revenue allocated towards redistribution in the form of lump-sum transfers. Eventually all individuals make the switch to the better technology and their incomes converge. The outcomes of the model therefore explain why public choice is more likely to be conservative in nature: it represents the majority choice given conflicting preferences among agents. Consequently, the transition path towards growth and technology adoption varies across countries depending on initial levels of inequality.
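A toy numerical illustration of the voting mechanism (not the paper's actual model): if each agent's preferred share of revenue for adoption-cost-reducing expenditure rises with income, majority voting selects the median agent's preference, so greater inequality with a poorer median voter yields a more conservative, redistribution-heavy outcome. The preference function below is hypothetical.

```python
# Toy median-voter illustration of the majority-choice outcome.
from statistics import median

def preferred_share(income, mean_income):
    """Hypothetical preference: richer agents favour spending more revenue
    on reducing adoption costs, poorer agents favour lump-sum transfers."""
    return min(1.0, income / (2 * mean_income))

def voted_share(incomes):
    mean_income = sum(incomes) / len(incomes)
    return median(preferred_share(y, mean_income) for y in incomes)

equal = [90, 100, 110]      # low inequality, same mean income
unequal = [30, 60, 210]     # high inequality, poorer median voter
print(voted_share(equal))   # 0.5
print(voted_share(unequal)) # 0.3 -- less spent on reducing adoption costs
```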
Abstract:
Background: Previous attempts at costing infection control programmes have tended to focus on accounting costs rather than economic costs. In studies using economic costs, estimates tend to be quite crude and probably underestimate the true cost. One of the largest costs of any intervention is staff time, but this cost is difficult to quantify and has been largely ignored in previous attempts. Aim: To provide guidance on designing and evaluating the costs of hospital-based infection control interventions or programmes. This article also discusses several issues to consider when costing interventions, and suggests strategies for overcoming these issues. Methods: Previous literature and techniques in both health economics and psychology are reviewed and synthesized. Findings: This article provides a set of generic, transferable costing guidelines. Key principles, such as defining the scope of the study and focusing on large costs, as well as pitfalls (e.g. overconfidence and uncertainty), are discussed. Conclusion: These guidelines can be used by hospital staff and other researchers to cost their infection control programmes and interventions more accurately.
Abstract:
This paper presents practical vision-based collision avoidance for objects approximating a single point feature. Using a spherical camera model, a visual predictive control scheme guides the aircraft around the object along a conical spiral trajectory. Visibility, state and control constraints are considered explicitly in the controller design by combining image and vehicle dynamics in the process model, and solving the nonlinear optimization problem over the resulting state space. Importantly, range is not required. Instead, the principles of conical spiral motion are used to design an objective function that simultaneously guides the aircraft along the avoidance trajectory, whilst providing an indication of the appropriate point at which to stop the spiral behaviour. Our approach is aimed at providing a potential solution to the See and Avoid problem for unmanned aircraft and is demonstrated through a series of simulations.
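For geometric intuition, the sketch below samples points on a conical spiral about a point obstacle at the origin. The growth and climb parameters are hypothetical and unrelated to the paper's controller, which shapes the trajectory through the predictive control objective rather than an explicit parametric curve.

```python
# Illustrative conical spiral geometry: radius grows with the swept
# angle while altitude climbs, tracing a path on a cone around the
# obstacle at the origin.
import math

def conical_spiral(radius_rate=2.0, climb_rate=0.5, turns=2, samples=100):
    points = []
    for i in range(samples):
        theta = 2 * math.pi * turns * i / (samples - 1)
        r = radius_rate * theta  # standoff radius grows as the aircraft spirals
        points.append((r * math.cos(theta), r * math.sin(theta), climb_rate * theta))
    return points

path = conical_spiral()
print(path[0], path[-1])  # start at the apex, end after two full turns
```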
Abstract:
The increased adoption of business process management approaches, tools and practices has led organizations to accumulate large collections of business process models. These collections can easily include hundreds or even thousands of models, especially in the context of multinational corporations or as a result of organizational mergers and acquisitions. A concrete problem is thus how to maintain these large repositories in such a way that their complexity does not hamper their practical usefulness as a means to describe and communicate business operations. This paper proposes a technique to automatically infer suitable names for business process models and fragments thereof. This technique is useful for model abstraction scenarios, for instance when user-specific views of a repository are required, or as part of a refactoring initiative aimed at reducing the repository’s complexity. The technique is grounded in an adaptation of the theory of meaning to the realm of business process models. We implemented the technique in a prototype tool and conducted an extensive evaluation using three process model collections from practice and a case study involving process modelers with different levels of experience.
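As a crude stand-in for the naming idea (the paper's technique rests on a theory of meaning, not simple word counting), the sketch below derives a candidate name for a process fragment from the words that dominate its activity labels; the labels are hypothetical.

```python
# Frequency-based placeholder for automatic fragment naming.
from collections import Counter

def candidate_name(activity_labels, top_n=2):
    words = Counter()
    for label in activity_labels:
        words.update(label.lower().split())
    # Use the most frequent label words as the fragment's name.
    return " ".join(w for w, _ in words.most_common(top_n)).title()

fragment = ["Check invoice", "Approve invoice", "Archive invoice", "Check payment"]
print(candidate_name(fragment))  # "Invoice Check"
```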
Abstract:
In the electricity market environment, coordinating the reliability and economics of a power system is of great significance in determining the available transfer capability (ATC). In addition, the risks associated with uncertainties should be properly addressed in the ATC determination process for risk-benefit maximization. Against this background, it is necessary that the ATC be optimally allocated and utilized within the relevant security constraints. First, non-sequential Monte Carlo simulation is employed to derive the probability density distribution of the ATC of designated areas, incorporating uncertainty factors. Second, on that basis, a multi-objective optimization model is formulated to determine the multi-area ATC so as to maximize the risk-benefits. The developed model is then solved with the fast non-dominated sorting genetic algorithm (NSGA-II), which can decrease the risk caused by uncertainties while coordinating the ATCs of different areas. Finally, the IEEE 118-bus test system is used to demonstrate the essential features of the developed model and the employed algorithm.
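The core of NSGA-II referenced above is fast non-dominated sorting, sketched below for minimisation objectives. The sample points are illustrative; they could stand for (risk, negative benefit) pairs of candidate multi-area ATC allocations rather than values from the paper.

```python
# Fast non-dominated sorting (the ranking step of NSGA-II), minimisation.

def dominates(a, b):
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def fast_non_dominated_sort(points):
    n = len(points)
    dominated_by = [[] for _ in range(n)]  # indices that solution i dominates
    counts = [0] * n                       # how many solutions dominate i
    fronts = [[]]
    for i in range(n):
        for j in range(n):
            if dominates(points[i], points[j]):
                dominated_by[i].append(j)
            elif dominates(points[j], points[i]):
                counts[i] += 1
        if counts[i] == 0:
            fronts[0].append(i)            # Pareto-optimal front
    while fronts[-1]:
        nxt = []
        for i in fronts[-1]:
            for j in dominated_by[i]:
                counts[j] -= 1
                if counts[j] == 0:
                    nxt.append(j)
        fronts.append(nxt)
    return fronts[:-1]

pts = [(1, 5), (2, 3), (4, 1), (3, 4), (5, 5)]
print(fast_non_dominated_sort(pts))  # [[0, 1, 2], [3], [4]]
```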
Abstract:
The overall aim of our research was to characterize airborne particles from selected nanotechnology processes and to use the data to develop and test quantitative, particle-concentration-based criteria that can be used to trigger an assessment of particle emission controls. We investigated particle number concentration (PNC), particle mass (PM) concentration, count median diameter (CMD), alveolar deposited surface area, elemental composition, and morphology from sampling of aerosols arising from six nanotechnology processes. These included fibrous and non-fibrous particles, including carbon nanotubes (CNTs). We adopted standard occupational hygiene principles in relation to controlling peak emissions and exposures, as outlined by both Safe Work Australia (1) and the American Conference of Governmental Industrial Hygienists (ACGIH®) (2). Analysis of peak (highest value recorded) and 30-minute averaged particle number and mass concentration values measured during the operation of the nanotechnology processes revealed that peak PNC20–1000 nm emitted from the nanotechnology processes was up to three orders of magnitude greater than the local background particle concentration (LBPC), peak PNC300–3000 nm was up to an order of magnitude greater, and PM2.5 concentrations were up to four orders of magnitude greater. For three of these nanotechnology processes, the 30-minute averaged particle number and mass concentrations were also significantly different from the LBPC (p < 0.001). We propose that emission or exposure controls may need to be implemented or modified, or further assessment of the controls undertaken, if concentrations exceed three times the LBPC, which is also used as the local particle reference value, for more than a total of 30 minutes during a workday, and/or if a single short-term measurement exceeds five times the local particle reference value. The use of these quantitative criteria, which we term the universal excursion guidance criteria, accounts for the typical variation in the LBPC and the inaccuracy of instruments, while remaining precautionary enough to highlight peaks in particle concentration likely to be associated with particle emission from the nanotechnology process. Recommendations on when to use local excursion guidance criteria are also provided.
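The proposed universal excursion guidance criteria translate directly into a simple check, sketched below; the sampling interval, background level and readings are illustrative, not data from the study.

```python
# Universal excursion guidance criteria as stated in the abstract:
# flag if concentration exceeds 3x the local particle reference value
# (the LBPC) for more than 30 minutes in total during a workday, or if
# any single short-term measurement exceeds 5x the reference value.

def excursion_flagged(measurements, reference, interval_min=1):
    minutes_above_3x = sum(interval_min for c in measurements if c > 3 * reference)
    any_above_5x = any(c > 5 * reference for c in measurements)
    return minutes_above_3x > 30 or any_above_5x

background = 10_000  # particles/cm^3, illustrative LBPC
readings = [9_000] * 400 + [35_000] * 25 + [60_000]  # 1-minute samples
print(excursion_flagged(readings, background))  # True: one reading exceeds 5x
```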
Abstract:
The purpose of Business Process Management (BPM) is to increase the efficiency and effectiveness of organizational processes through improvement and innovation. Despite a common understanding that culture is an important element in these efforts, there is a dearth of theoretical and empirical research on culture as a facilitator of successful BPM. We develop the BPM culture construct and propose a validated instrument with which to measure organizational cultures’ support of BPM. The operationalization of the BPM culture concept provides a theoretical foundation for future research and a tool to assist organizations in developing a cultural environment that supports successful BPM.
Abstract:
Organisations are constantly seeking efficiency gains for their business processes in terms of time and cost. Management accounting enables detailed cost reporting of business operations for decision making purposes, although significant effort is required to gather accurate operational data. Process mining, on the other hand, may provide valuable insight into processes through analysis of events recorded in logs by IT systems, but its primary focus is not on cost implications. In this paper, a framework is proposed which aims to exploit the strengths of both fields in order to better support management decisions on cost control. This is achieved by automatically merging cost data with historical data from event logs for the purposes of monitoring, predicting, and reporting process-related costs. The on-demand generation of accurate, relevant and timely cost reports, in a style akin to reports in the area of management accounting, will also be illustrated. This is achieved through extending the open-source process mining framework ProM.
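A minimal sketch of the merging step (illustrative only; the actual framework is implemented as an extension of ProM, whose API is not shown here): joining a hypothetical cost-rate schedule with event-log entries to report cost per case.

```python
# Merge a cost schedule with event-log data to report process-related
# costs per case. Activities, rates and the log are hypothetical.
from collections import defaultdict

cost_rates = {"register": 5.0, "assess": 40.0, "approve": 25.0}  # cost per hour

event_log = [  # (case id, activity, duration in hours)
    ("case-1", "register", 0.25), ("case-1", "assess", 2.0), ("case-1", "approve", 0.5),
    ("case-2", "register", 0.25), ("case-2", "assess", 3.5),
]

cost_per_case = defaultdict(float)
for case, activity, hours in event_log:
    cost_per_case[case] += cost_rates[activity] * hours

for case, cost in sorted(cost_per_case.items()):
    print(f"{case}: ${cost:.2f}")  # case-1: $93.75, case-2: $141.25
```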