25 results for System implementation
Abstract:
Aim. The paper presents a study assessing the rate of adoption of a sedation scoring system and sedation guideline. Background. Clinical practice guidelines, including sedation guidelines, have been shown to improve patient outcomes by standardizing care. In particular, sedation guidelines have been shown to benefit intensive care patients by reducing the duration of ventilation. Despite the acceptance that clinical practice guidelines are beneficial, adoption rates are rarely measured. Adoption data may reveal other factors that contribute to improved outcomes. Therefore, the usefulness of a guideline may be more appropriately assessed by collecting adoption data. Method. A quasi-experimental pre-intervention and post-intervention quality improvement design was used. Adoption was operationalized as documentation of a sedation score every 4 hours and use of the sedation and analgesic medications suggested in the guideline. Adoption data were collected from patients' charts on a random day of the month; all patients in the intensive care unit on that day were assigned an adoption category. Sedation scoring system adoption data were collected before implementation of a sedation guideline, which was implemented using an intensive information-giving strategy, and guideline adoption data were fed back to bedside nurses. After implementation of the guideline, adoption data were collected for both the sedation scoring system and the guideline. The data were collected in the years 2002-2004. Findings. The sedation scoring system was not used extensively in the pre-intervention phase of the study; however, this improved in the post-intervention phase. The findings suggest that the sedation guideline was gradually adopted following implementation in the post-intervention phase of the study. Field notes taken during the implementation of the sedation scoring system and the guideline reveal widespread acceptance of both. Conclusion. Measurement of adoption is a complex process. Appropriate operationalization contributes to greater accuracy. Further investigation is warranted to establish the intensity and extent of implementation required to positively affect patient outcomes.
Abstract:
There is a widening gulf in the change literature between theoretical notions of evolving organisational form and the emerging reality that old and new organisational structures coexist. This paper explores this dichotomy in Enterprise Resource Planning change. It develops a cellular hierarchy framework to explain how different types of hierarchy coexist within the same organisation during the implementation of Enterprise Resource Planning. © 2006 The Author; Journal compilation © 2006 Blackwell Publishing Ltd.
Abstract:
Since 2001, Mexico has been designing, legislating, and implementing a major health-system reform. A key component was the creation of Seguro Popular, which is intended to expand insurance coverage over 7 years to uninsured people (nearly half the total population at the start of 2001). The reform included five actions: legislation of entitlement per family affiliated which, with full implementation, will increase public spending on health by 0.8-1.0% of gross domestic product; creation of explicit benefits packages; allocation of monies to decentralised state ministries of health in proportion to the number of families affiliated; division of federal resources flowing to states into separate funds for personal and non-personal health services; and creation of a fund to protect families against catastrophic health expenditures. Using the WHO health-systems framework, we used a wide range of datasets to assess the effect of this reform on different dimensions of the health system. Key findings include: affiliation is preferentially reaching the poor and the marginalised communities; federal non-social security expenditure in real per-head terms increased by 38% from 2000 to 2005; equity of public-health expenditure across states improved; Seguro Popular affiliates used more inpatient and outpatient services than uninsured people; effective coverage of 11 interventions improved between 2000 and 2005-06; inequalities in effective coverage across states and wealth deciles decreased over this period; catastrophic expenditures for Seguro Popular affiliates are lower than for uninsured people even though use of services has increased. We present some lessons for Mexico based on this interim evaluation and explore implications for other countries considering health reforms.
Abstract:
Achieving consistency between a specification and its implementation is an important part of software development. In previous work, we have presented a method and tool support for testing a formal specification using animation and then verifying an implementation of that specification. The method is based on a testgraph, which provides a partial model of the application under test. The testgraph is used in combination with an animator to generate test sequences for testing the formal specification. The same testgraph is used during testing to execute those same sequences on the implementation and to ensure that the implementation conforms to the specification. So far, the method and its tool support have been applied to software components that can be accessed through an application programmer interface (API). In this paper, we use an industrially-based case study to discuss the problems associated with applying the method to a software system with a graphical user interface (GUI). In particular, the lack of a standardised interface, together with controllability and observability problems, makes it difficult to automate the testing of the implementation. The method can still be applied, but the amount of testing that can be carried out on the implementation is limited by the manual effort involved.
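The core idea of the testgraph method can be pictured as follows. This is a minimal, hypothetical sketch, not the paper's tool: the names TestGraph, sequences, and run_on are invented for illustration, and both the animator and the implementation are assumed to expose the operations as callables.

```python
# Hypothetical sketch of testgraph-based testing: the same operation
# sequences drive both the specification animator and the implementation.
class TestGraph:
    """A partial model of the application under test: nodes are abstract
    states, arcs are labelled with operations of the API under test."""
    def __init__(self, arcs):
        self.arcs = arcs  # list of (source, operation, target) triples

    def sequences(self, start, depth):
        """Enumerate operation sequences up to the given length."""
        frontier = [(start, [])]
        for _ in range(depth):
            next_frontier = []
            for state, ops in frontier:
                for src, op, dst in self.arcs:
                    if src == state:
                        seq = ops + [op]
                        yield seq
                        next_frontier.append((dst, seq))
            frontier = next_frontier

def run_on(system, sequence):
    """Apply one sequence to an animator or an implementation."""
    return [getattr(system, op)() for op in sequence]

# Conformance check: identical sequences, compared observations.
# for seq in graph.sequences("init", depth=3):
#     assert run_on(animator, seq) == run_on(implementation, seq)
```

The GUI difficulty the paper describes shows up exactly in run_on: for an API, each operation is a directly callable method, whereas for a GUI the equivalent step must be performed and observed manually.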
Abstract:
User requirements for multimedia authentication vary. In some cases, the user requires an authentication system to monitor a set of specific areas, each with its own sensitivity, while ignoring other modifications. Most existing fragile watermarking schemes are mixed systems, which cannot satisfy such precise user requirements. Therefore, in this paper we designed a sensor-based multimedia authentication architecture. This system consists of sensor combinations and a fuzzy response logic system. A sensor is designed to respond strictly to tampering of a certain type in a given area. With this scheme, any complicated authentication requirement can be satisfied, and many problems, such as error-tolerant detection of the tampering method, are easily resolved. We also provide experiments demonstrating the implementation of the sensor-based system.
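The sensor-combination idea can be sketched in a few lines. This is a hypothetical illustration of the architecture described above, not the paper's design: the Sensor fields, the per-region detector, and the fuzzy_verdict thresholds are all invented for the example, and the image is assumed to be a NumPy-style 2-D array.

```python
# Hypothetical sketch: each sensor watches one region for one tamper
# type at its own sensitivity; a toy fuzzy stage aggregates responses.
from dataclasses import dataclass

@dataclass
class Sensor:
    region: tuple        # (x0, y0, x1, y1) area this sensor monitors
    sensitivity: float   # response threshold in [0, 1]
    detector: callable   # returns a tamper score in [0, 1] for the region

    def respond(self, image):
        x0, y0, x1, y1 = self.region
        score = self.detector(image[y0:y1, x0:x1])
        # Strict response: fire only on the assigned region and type.
        return score if score >= self.sensitivity else 0.0

def fuzzy_verdict(responses):
    """Toy fuzzy response logic: the peak response grades the output."""
    peak = max(responses, default=0.0)
    if peak == 0.0:
        return "authentic"
    return "tampered" if peak > 0.5 else "suspicious"

# verdict = fuzzy_verdict([s.respond(img) for s in sensors])
```

Composing many narrow sensors, rather than one mixed detector, is what lets the architecture match an arbitrary per-area, per-type requirement.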
Abstract:
Computer-based, socio-technical systems projects frequently fail. In particular, computer-based information systems often fail to live up to their promise. Part of the problem lies in the uncertainty of the effect of combining the subsystems that comprise the complete system; i.e. the system's emergent behaviour cannot be predicted from knowledge of the subsystems. This paper suggests that uncertainty management is a fundamental unifying concept in the analysis and design of complex systems, and goes on to indicate that this is due to the co-evolutionary nature of the requirements and implementation of socio-technical systems. The paper presents a model of the propagation of a system change which indicates that the introduction of two or more changes over time can cause chaotic emergent behaviour.
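The claim that two interacting changes can produce chaotic emergent behaviour has a standard minimal illustration. The coupled-logistic toy below is our own illustration under that assumption, not the paper's model: x and y stand for the propagation levels of two changes that feed into each other.

```python
# Toy illustration (not the paper's model): two interacting change
# processes, each feeding the other's propagation. Nearby starting
# points diverge, the signature of chaotic emergent behaviour.
def f(v, rate=3.9):
    return rate * v * (1 - v)   # logistic growth of one change's impact

def propagate(x, y, steps=50, eps=0.2):
    for _ in range(steps):      # diffusive coupling keeps values in [0, 1]
        x, y = ((1 - eps) * f(x) + eps * f(y),
                (1 - eps) * f(y) + eps * f(x))
    return x, y

print(propagate(0.30, 0.60))
print(propagate(0.30001, 0.60))  # tiny perturbation, very different outcome
```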
Abstract:
Experimental and theoretical studies have shown the importance of stochastic processes in genetic regulatory networks and cellular processes. Cellular networks and genetic circuits often involve small numbers of key proteins such as transcription factors and signaling proteins. In recent years stochastic models have been used successfully for studying noise in biological pathways, and stochastic modelling of biological systems has become a very important research field in computational biology. One of the challenges in this field is reducing the huge computing time of stochastic simulations. Based on the system of the mitogen-activated protein kinase cascade that is activated by epidermal growth factor, this work gives a parallel implementation using OpenMP and parallelism across the simulation. Special attention is paid to the independence of the random numbers generated in parallel computing, which is a key criterion for the success of stochastic simulations. Numerical results indicate that parallel computers can be used as an efficient tool for simulating the dynamics of large-scale genetic regulatory networks and cellular processes.
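The paper's implementation uses OpenMP; the sketch below conveys the same two ideas, parallelism across the simulation (independent trajectories run concurrently) and statistically independent random streams, in Python with multiprocessing and NumPy's SeedSequence. The one-species production/degradation model is a stand-in for the MAPK cascade, and all rate constants here are illustrative.

```python
# Parallelism across the simulation: many independent Gillespie (SSA)
# runs, one per worker, each with its own independent random stream.
import numpy as np
from multiprocessing import Pool

def ssa_trajectory(seed, x0=100, k_prod=10.0, k_deg=0.1, t_end=50.0):
    """One SSA run of a toy production/degradation model."""
    rng = np.random.default_rng(seed)   # independent stream per task
    t, x = 0.0, x0
    while t < t_end:
        a1, a2 = k_prod, k_deg * x      # propensities: production, decay
        a0 = a1 + a2
        t += rng.exponential(1.0 / a0)  # waiting time to next reaction
        x += 1 if rng.random() * a0 < a1 else -1
    return x

if __name__ == "__main__":
    # SeedSequence.spawn yields statistically independent streams:
    # the key criterion the paper highlights for parallel SSA.
    seeds = np.random.SeedSequence(42).spawn(8)
    with Pool() as pool:                # parallelism across the simulation
        finals = pool.map(ssa_trajectory, seeds)
    print(np.mean(finals), np.std(finals))
```

Because trajectories never communicate, this pattern scales with the number of cores; the only correctness hazard is correlated streams, which spawned SeedSequences avoid by construction.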
Abstract:
Time- and length-scales vary from a planetary scale and millions of years for convection problems to 100 km and 10 years for fault system simulations. Various techniques are in use to deal with the time dependency (e.g. Crank-Nicolson), with the non-linearity (e.g. Newton-Raphson) and with weakly coupled equations (e.g. non-linear Gauss-Seidel). Besides these high-level solution algorithms, discretization methods (e.g. the finite element method (FEM) and the boundary element method (BEM)) are used to deal with spatial derivatives. Typically, large-scale, three-dimensional meshes are required to resolve geometrical complexity (e.g. in the case of fault systems) or features in the solution (e.g. in mantle convection simulations). The modelling environment escript allows the rapid implementation of new physics as required for the development of simulation codes in the earth sciences. Its main objective is to provide a programming language in which the user can define new models and rapidly develop high-level solution algorithms. The current implementation is linked with the finite element package finley as a PDE solver. However, the design is open and other discretization technologies such as finite differences and boundary element methods could be included. escript is implemented as an extension of the interactive programming environment python (see www.python.org). Key concepts introduced are Data objects, which hold values on the nodes or elements of the finite element mesh, and LinearPDE objects, which define linear partial differential equations to be solved by the underlying discretization technology. In this paper we present the basic concepts of escript and show how it is used to implement a simulation code for interacting fault systems. We show some results of large-scale, parallel simulations on an SGI Altix system. Acknowledgements: Project work is supported by the Australian Commonwealth Government through the Australian Computational Earth Systems Simulator Major National Research Facility, the Queensland State Government Smart State Research Facility Fund, The University of Queensland and SGI.
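To make the Data and LinearPDE concepts concrete, here is a minimal sketch of escript usage. The module paths and coefficient names follow the released esys-escript package and are an assumption relative to the exact version described in this abstract; the Poisson problem itself is just an illustrative stand-in for the fault-system models.

```python
# Minimal escript sketch (assumed esys-escript API): a LinearPDE object
# defines -div(A grad u) = Y on a finley mesh, with Dirichlet mask q.
from esys.escript import kronecker, whereZero
from esys.escript.linearPDEs import LinearPDE
from esys.finley import Rectangle

domain = Rectangle(n0=40, n1=40, l0=1.0, l1=1.0)  # finley FEM mesh
x = domain.getX()                  # Data object: coordinates on nodes

pde = LinearPDE(domain)
pde.setValue(A=kronecker(domain),  # -div(grad u) = 1, i.e. Poisson
             Y=1.0,
             q=whereZero(x[0]),    # u = 0 where the x0 coordinate is 0
             r=0.0)
u = pde.getSolution()              # solved by the finley backend
print(u)
```

The point of the design is visible here: the model is stated purely in terms of PDE coefficients and Data objects, so swapping finley for another discretization backend would leave this script unchanged.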