962 results for full implementation
Abstract:
Introduction Patients with dysphagia (PWDs) have been shown to be four times more likely to suffer medication administration errors (MAEs).1 2 Individualised medication administration guides (I-MAGs), which outline how each formulation should be administered, have been developed to standardise medication administration by nurses on the ward and reduce the likelihood of errors. This pilot study aimed to determine recruitment rates, estimate the effect on errors and develop the intervention in order to design a future full-scale randomised controlled trial to determine the costs and effects of I-MAG implementation. Ethical approval was granted by the local ethics committee. Method Software was developed to enable I-MAG production (based on current best practice)3 4 for all PWDs admitted to two care of the older person wards during a six-month period from January to July 2011. I-MAGs were attached to the medication administration record charts to be used by nurses when administering medicines. Staff training was provided for all staff on the intervention wards. Two care of the older person wards in the same hospital served as controls. All patients with dysphagia were recruited for follow-up purposes at discharge. Four ward rounds on each intervention and control ward were observed pre- and post-I-MAG implementation to determine the level of medication administration errors. NHS ethical approval for the study was obtained. Results 164 I-MAGs were provided for 75 PWDs in the two intervention wards. At discharge, 23 patients in the intervention wards and 7 patients in the control wards were approached for recruitment, of whom 17 (74%) and 5 (71.4%) respectively consented. Discussion Recruitment rates at discharge were low because dysphagia often remitted during hospitalisation. The introduction of the I-MAG demonstrated no effect on the quality of administration on the intervention ward and, interestingly, practice improved on the control ward. Observing medication rounds at least one month after I-MAG removal may have captured a reversion to normal practice; ideally, observations should have been undertaken with the I-MAGs in place. Identification of the reason for the improvement in the control ward is warranted.
Abstract:
The intent of this study was to design, document and implement a Quality Management System (QMS) in a laboratory that incorporated both research and development (R&D) and routine analytical activities. In addition, the QMS needed to be easily and efficiently maintained in order to: (a) provide documented evidence to validate the system's compliance with a certifiable standard, (b) fit the purpose of the laboratory, (c) accommodate prevailing government policies and standards, and (d) promote positive outcomes for the laboratory through documentation and verification of the procedures and methodologies implemented. Initially, a matrix was developed that documented the standards' requirements and the steps necessary to meet those requirements. The matrix provided a check mechanism on the progress of the system's development, and was later also used in the Quality Manual as a reference tool for locating the full procedures documented elsewhere in the system. The documentation needed to build and monitor the system consisted of a series of manuals, along with forms that provided auditable evidence of the workings of the QMS. Quality Management (QM), in one form or another, has existed since the early 1900s. However, the question remains: is it a good thing or just a bugbear? Many of the older-style systems failed because they were designed by non-users and were fiercely regulatory, restrictive and generally deemed an imposition. It is now considered important to foster a sense of ownership of the system among the people who use it. The system's design must be tailored to best fit the purpose of the facility's operations if maximum benefits to the organisation are to be gained.
Abstract:
We present a case study of the formal verification of a full-wave rectifier for analog and mixed-signal designs. We used the Checkmate tool from CMU [1], a public-domain formal verification tool for hybrid systems. Restrictions imposed by Checkmate made it necessary to modify the Checkmate implementation in order to handle this complex, non-linear system. The full-wave rectifier was implemented using Checkmate custom blocks and Simulink blocks from MATLAB (MathWorks). After establishing the required changes in the Checkmate implementation, we were able to efficiently verify the safety properties of the full-wave rectifier.
Abstract:
Purpose - There are many library automation packages available as open-source software, comprising two modules: a staff-client module and an online public access catalogue (OPAC). Although the OPACs of these library automation packages provide advanced search and retrieval of bibliographic records, none of them facilitates full-text searching. Most of the available open-source digital library software facilitates indexing and searching of full-text documents in different formats. This paper makes an effort to enable full-text search in the widely used open-source library automation package Koha, by integrating it with two open-source digital library software packages, Greenstone Digital Library Software (GSDL) and Fedora Generic Search Service (FGSS), independently. Design/methodology/approach - The implementation makes use of the Search and Retrieval by URL (SRU) feature available in Koha, GSDL and FGSS. The full-text documents are indexed both in Koha and in GSDL or FGSS. Findings - Full-text searching capability in Koha is achieved by integrating either GSDL or FGSS into Koha and passing an SRU request from Koha to GSDL or FGSS. The full-text documents are indexed both in the library automation package (Koha) and in the digital library software (GSDL, FGSS). Originality/value - This is the first implementation enabling full-text search in library automation software by integrating it with digital library software.
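For readers unfamiliar with SRU, the sketch below shows roughly what a searchRetrieve request of the kind described above could look like. It is an illustrative sketch only, not the paper's Koha-side code; the endpoint URL is a hypothetical placeholder.

```python
# Illustrative sketch: a minimal SRU searchRetrieve request of the kind Koha
# could send to a GSDL or FGSS endpoint. The host, port and path below are
# hypothetical placeholders, not the configuration used in the paper.
import urllib.parse
import urllib.request
import xml.etree.ElementTree as ET

SRU_ENDPOINT = "http://digital-library.example.org:8080/sru"  # hypothetical

def sru_search(query, max_records=10):
    """Send an SRU searchRetrieve request (CQL query) and return raw records."""
    params = urllib.parse.urlencode({
        "operation": "searchRetrieve",
        "version": "1.1",
        "query": query,             # CQL, e.g. 'cql.anywhere all "irrigation"'
        "maximumRecords": str(max_records),
    })
    with urllib.request.urlopen(f"{SRU_ENDPOINT}?{params}") as resp:
        tree = ET.parse(resp)
    ns = {"sru": "http://www.loc.gov/zing/srw/"}  # standard SRU response namespace
    count = tree.findtext(".//sru:numberOfRecords", namespaces=ns)
    records = tree.findall(".//sru:record", namespaces=ns)
    return count, records

# count, records = sru_search('cql.anywhere all "water management"')
```

Because the request format is server-independent, the same pattern works against any SRU-capable target, which is what makes GSDL and FGSS interchangeable back-ends in this design.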
Abstract:
A low-complexity, essentially-ML decoding technique for the Golden code and the three-antenna Perfect code was introduced by Sirianunpiboon, Howard and Calderbank. Although no theoretical analysis of the decoder was given, simulations showed that this decoding technique has almost maximum-likelihood (ML) performance. Inspired by this technique, in this paper we introduce two new low-complexity decoders for Space-Time Block Codes (STBCs): the Adaptive Conditional Zero-Forcing (ACZF) decoder and the ACZF decoder with successive interference cancellation (ACZF-SIC), which include the decoding technique of Sirianunpiboon et al. as a special case. We show that both the ACZF and ACZF-SIC decoders are capable of achieving full diversity, and we give a set of sufficient conditions for an STBC to achieve full diversity with these decoders. We then show that the Golden code, the three- and four-antenna Perfect codes, the three-antenna Threaded Algebraic Space-Time code and the four-antenna rate-2 code of Srinath and Rajan are all full-diversity ACZF/ACZF-SIC decodable with complexity strictly less than that of their ML decoders. Simulations show that the proposed decoding method performs identically to ML decoding for all five of these codes. These STBCs, together with the proposed decoding algorithm, have the least decoding complexity and best error performance among all known codes for two, three and four transmit antennas. We further provide a lower bound on the complexity of full-diversity ACZF/ACZF-SIC decoding. All five codes listed above achieve this lower bound and hence are optimal in terms of minimizing the ACZF/ACZF-SIC decoding complexity. Both the ACZF and ACZF-SIC decoders are amenable to sphere decoding implementation.
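As a rough illustration of the conditional zero-forcing idea (a generic toy, not the authors' exact ACZF algorithm), the sketch below enumerates candidate values for a small set of conditioned symbols, zero-forces the remaining symbols, and keeps the candidate with the smallest ML metric. The QPSK alphabet and the condition-number-based "adaptive" partition rule are my own assumptions.

```python
# Generic conditional zero-forcing (CZF) sketch for a linear MIMO model
# y = H x + n. Illustrative only: alphabet and partition rule are assumed.
import itertools
import numpy as np

QPSK = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)

def quantize(z):
    """Map each entry of z to the nearest QPSK point."""
    return QPSK[np.argmin(np.abs(z[:, None] - QPSK[None, :]), axis=1)]

def czf_decode(y, H, n_cond):
    """Condition on n_cond symbols, zero-force the rest, keep the best candidate."""
    n = H.shape[1]
    # "Adaptive" step (assumption): condition on the subset whose remaining
    # columns are best conditioned for zero-forcing.
    subset = min(itertools.combinations(range(n), n_cond),
                 key=lambda s: np.linalg.cond(np.delete(H, s, axis=1)))
    rest = [i for i in range(n) if i not in subset]
    Ha, Hb = H[:, list(subset)], H[:, rest]
    Hb_pinv = np.linalg.pinv(Hb)
    best, x_best = np.inf, None
    for xa in itertools.product(QPSK, repeat=n_cond):  # enumerate conditioned symbols
        xa = np.array(xa)
        xb = quantize(Hb_pinv @ (y - Ha @ xa))         # ZF + slicing for the rest
        metric = np.linalg.norm(y - Ha @ xa - Hb @ xb) ** 2
        if metric < best:
            best, x_best = metric, (xa, xb)
    out = np.empty(n, dtype=complex)
    out[list(subset)], out[rest] = x_best
    return out
```

The complexity saving comes from enumerating only the conditioned symbols (here 4^n_cond candidates) instead of the full joint alphabet, which is the essence of the conditional approach the abstract describes.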
Abstract:
This paper presents experimental results for an attractive control scheme implemented using an 8-bit microcontroller. The power converter involved is a 3-phase fully controlled bridge rectifier. A single-quadrant DC drive has been realized, and results are presented for both open- and closed-loop implementations.
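The abstract gives no algorithmic detail, but a minimal sketch of the kind of closed-loop firing-angle control such a drive could use is shown below. It relies only on the standard bridge relation Vdc = (3*sqrt(3)/pi)*Vm*cos(alpha); the gains and voltage are assumed for illustration and this is not the paper's actual scheme.

```python
# Minimal sketch (assumptions, not the paper's scheme): a PI speed loop that
# commands the firing angle of a 3-phase fully controlled bridge rectifier.
import math

VM = 325.0  # peak phase voltage in volts (assumed)

def vdc_from_alpha(alpha):
    """Average DC output of a 3-phase fully controlled bridge (alpha in radians)."""
    return (3 * math.sqrt(3) / math.pi) * VM * math.cos(alpha)

class PISpeedLoop:
    """Single-quadrant speed loop: PI on speed error -> Vdc demand -> alpha."""
    def __init__(self, kp=0.5, ki=2.0, dt=0.001):  # gains assumed, untuned
        self.kp, self.ki, self.dt = kp, ki, dt
        self.integral = 0.0

    def step(self, speed_ref, speed_meas):
        err = speed_ref - speed_meas
        self.integral += err * self.dt
        v_demand = self.kp * err + self.ki * self.integral
        v_max = vdc_from_alpha(0.0)
        v_demand = min(max(v_demand, 0.0), v_max)  # single quadrant: Vdc >= 0
        return math.acos(v_demand / v_max)         # firing-angle command (rad)
```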
Abstract:
Jisc Freedom of Information retention schedules Disclaimer We aim to provide accurate and current information on this website. However, we accept no liability for errors or omissions, or for loss or damage arising from using this information. The statements made and views expressed in publications are those of the authors and do not represent in any way the views of the Service. The JISC infoNet Service offers general guidance only on issues relevant to the planning and implementation of information systems. Such guidance does not constitute definitive or legal advice and should not be regarded as a substitute therefor. The JISC infoNet Service does not accept any liability for any loss suffered by persons who consult the Service whether or not such loss is suffered directly or indirectly as a result of reliance placed on guidance given by the Service. The reader is reminded that changes may have taken place since issue, particularly in rapidly changing areas such as internet addressing, and consequently URLs and e-mail addresses should be used with caution. We are not responsible for the content of other websites linked to this site. No part of this Web site or its contents may be reproduced or distributed in any form except by bona fide UK public sector education establishments or in accordance with the provisions of the Copyright, Designs and Patents Act 1988 and any amending legislation. All reproductions require an acknowledgement of the source and the author of the work. Parties outside the education sector should contact JISC infoNet regarding use of these materials.
Abstract:
The Southern Florida Shallow-water Coral Ecosystem Mapping Implementation Plan (MIP) discusses the need to produce shallow-water (~0-40 m; 0-22 fm) benthic habitat and bathymetric maps of critical areas in southern Florida and moderate-depth (~40-200 m; 22-109 fm) bathymetric maps for all of Florida. The ~0-40 m depth regime generally represents where most hermatypic coral species are found and where most direct impacts from pollution and coastal development occur. The plan was developed with extensive input from over 90 representatives of state regulatory and management agencies, federal agencies, universities, and non-governmental organizations involved in the conservation and management of Florida’s coral ecosystems. Southern Florida’s coral ecosystems are extensive. They extend from the Dry Tortugas in the Florida Keys as far north as St Lucie Inlet on the Atlantic Ocean coast and Tarpon Springs on the Gulf of Mexico coast. Using the 10 fm (18 m) depth curves on nautical charts as a guide, southern Florida contains as much as 84 percent (30,801 sq km) of the 36,812 sq km of potential shallow-water (<10 fm; <18 m) coral ecosystems in the tropical and subtropical U.S. Moreover, southern Florida’s coral ecosystems contribute greatly to the regional economy: coral ecosystem-related expenditures generated $4.4 billion in sales, income, and employment and supported over 70,000 full-time and part-time jobs in the region during the recent 12-month periods when surveys were conducted.
Abstract:
A semi-active truck damper was developed in conjunction with a commercial shock absorber manufacturer. A linearized damper model was developed for control system design purposes. Open- and closed-loop damper force tracking control was implemented, with tests showing that an open-loop approach gave the best compromise between response speed and accuracy. A hardware-in-the-loop test facility was used to investigate performance of the damper when combined with a simulated quarter-car model. The input to the vehicle model was a set of randomly generated road profiles, each profile traversed at an appropriate speed. Modified skyhook damping tests showed a simultaneous improvement over the optimum passive case of 13 per cent in vertical body acceleration and 8 per cent in dynamic tyre forces. Full-scale vehicle tests of the damper on a heavy tri-axle trailer were carried out. Implementation of modified skyhook damping yielded a simultaneous improvement over the optimum passive case of 8 per cent in vertical body acceleration and 8 per cent in dynamic tyre forces. © IMechE 2008.
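A textbook-style modified skyhook law of the general kind evaluated above can be sketched as follows. The coefficients and limits are assumed for illustration; this is not the authors' controller or their gains.

```python
# Illustrative modified skyhook law for a semi-active damper in a quarter-car
# setting. All numerical values are assumptions, not the paper's parameters.
C_SKY = 4000.0   # skyhook damping coefficient, N s/m (assumed)
C_MIN = 500.0    # minimum achievable damper coefficient, N s/m (assumed)
C_MAX = 8000.0   # maximum achievable damper coefficient, N s/m (assumed)

def skyhook_force(v_body, v_wheel):
    """Demanded damper force from sprung (v_body) and unsprung (v_wheel) velocities."""
    v_rel = v_body - v_wheel             # damper extension rate
    if v_body * v_rel > 0:
        # The damper can dissipate energy in the direction skyhook demands:
        # emulate the skyhook force, clipped to the damper's real range.
        c_demand = C_SKY * abs(v_body) / max(abs(v_rel), 1e-6)
        c = min(max(c_demand, C_MIN), C_MAX)
    else:
        # Skyhook would require injecting energy; a semi-active damper cannot,
        # so fall back to the minimum damping setting.
        c = C_MIN
    return -c * v_rel                    # force opposes relative motion
```

The clipping to [C_MIN, C_MAX] is what distinguishes a realizable semi-active law from the ideal skyhook, and is one reason force-tracking accuracy (open- versus closed-loop, as tested above) matters.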
Abstract:
The BGCore reactor analysis system was recently developed at Ben-Gurion University for calculating in-core fuel composition and spent fuel emissions following discharge. It couples the Monte Carlo transport code MCNP with an independently developed burnup and decay module, SARAF. Most existing MCNP-based depletion codes (e.g. MOCUP, Monteburns, MCODE) directly tally the one-group fluxes and reaction rates in order to prepare the one-group cross sections necessary for fuel depletion analysis. BGCore, on the other hand, uses a multi-group (MG) approach for the generation of one-group cross sections. This coupling approach significantly reduces the code execution time without compromising the accuracy of the results. The substantial reduction in BGCore execution time allows consideration of problems with a much higher degree of complexity, such as the introduction of thermal-hydraulic (TH) feedback into the calculation scheme. Recently, a simplified TH feedback module, THERMO, was developed and integrated into the BGCore system. To demonstrate the capabilities of the upgraded BGCore system, a coupled neutronic-TH analysis of a full PWR core was performed. The BGCore results were compared with those of the state-of-the-art 3D deterministic nodal diffusion code DYN3D (Grundmann et al., 2000). Very good agreement between the BGCore and DYN3D results was observed in major core operational parameters, including the k-eff eigenvalue, axial and radial power profiles, and temperature distributions. This agreement confirms the consistency of the implementation of the TH feedback module. Although the upgraded BGCore system is capable of performing both depletion and TH analyses, the calculations in this study were performed for the beginning-of-cycle state with pre-generated fuel compositions. © 2011 Published by Elsevier B.V.
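The core of any multi-group to one-group collapse of the kind described above is a flux-weighted average, sigma_1g = sum_g(sigma_g * phi_g) / sum_g(phi_g). A minimal sketch of that step follows; it illustrates the general technique, not BGCore's actual implementation.

```python
# Flux-weighted collapse of multi-group cross sections to one group.
# Generic illustration of the MG approach, not BGCore's code.
import numpy as np

def collapse_to_one_group(sigma_g, phi_g):
    """One-group cross section from group-wise cross sections and fluxes.

    sigma_g : array of group-wise cross sections
    phi_g   : array of group-wise fluxes on the same group structure
    """
    sigma_g = np.asarray(sigma_g, dtype=float)
    phi_g = np.asarray(phi_g, dtype=float)
    return np.sum(sigma_g * phi_g) / np.sum(phi_g)

# Hypothetical 3-group example:
# sigma_1g = collapse_to_one_group([1.2, 5.0, 40.0], [0.6, 0.3, 0.1])
```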
Abstract:
This report describes the implementation of a theory of edge detection proposed by Marr and Hildreth (1979). According to this theory, the image is first processed independently through a set of different-sized filters whose shape is the Laplacian of a Gaussian, ∇²G. Zero-crossings in the output of these filters mark the positions of intensity changes at different resolutions. Information about these zero-crossings is then used to derive a full symbolic description of changes in intensity in the image, called the raw primal sketch. The theory is closely tied to early processing in the human visual system. In this report, we first examine the critical properties of the initial filters used in the edge detection process, from both a theoretical and a practical standpoint. The implementation is then used as a test bed for exploring aspects of the human visual system, in particular acuity and hyperacuity. Finally, we present some preliminary results concerning the relationship between zero-crossings detected at different resolutions, and some observations relevant to the process by which the human visual system integrates descriptions of intensity changes obtained at different resolutions.
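A minimal sketch of this filter-and-zero-crossing pipeline is given below. It is my own illustration of the scheme the report describes; the scales and the neighbour-based zero-crossing test are assumptions, not the report's code.

```python
# Marr-Hildreth-style edge detection sketch: filter the image with a
# Laplacian-of-Gaussian at several scales and mark zero-crossings.
import numpy as np
from scipy import ndimage

def zero_crossings(response):
    """Boolean map of sign changes between horizontal/vertical neighbours."""
    s = np.sign(response)
    zc = np.zeros(s.shape, dtype=bool)
    zc[:, 1:] |= s[:, 1:] * s[:, :-1] < 0   # horizontal sign change
    zc[1:, :] |= s[1:, :] * s[:-1, :] < 0   # vertical sign change
    return zc

def marr_hildreth_edges(image, sigmas=(1.0, 2.0, 4.0)):
    """Zero-crossing maps of (LoG * image) at each scale sigma (scales assumed)."""
    image = np.asarray(image, dtype=float)
    return {s: zero_crossings(ndimage.gaussian_laplace(image, sigma=s))
            for s in sigmas}
```

Comparing the per-scale maps this returns is the computational analogue of the report's question about how zero-crossings at different resolutions relate to one another.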
Abstract:
Thomas, R., Spink, S., Durbin, J. & Urquhart, C. (2005). NHS Wales user needs study including knowledgebase tools report. Report for Informing Healthcare Strategy implementation programme. Aberystwyth: Department of Information Studies, University of Wales Aberystwyth. Sponsorship: Informing Healthcare, NHS Wales
Abstract:
In this short paper, we review some key results of monetary policy research on the Vietnamese economy over the 20 years since Doi Moi, together with a few caveats on putting these results to use. We survey different research themes and suggest that future research make better and more diverse choices of analytic framework, and also connect macro- and micro-settings, which is likely to yield better and more insightful results for the monetary economics literature in Vietnam.
Abstract:
Objective To describe the methodology and selection of quality indicators (QI) to be implemented in the EFFECT (EFFectiveness of Endometrial Cancer Treatment) project. EFFECT aims to monitor the variability in quality of care (QoC) of uterine cancer in Belgium, to compare the effectiveness of different treatment strategies in order to improve the QoC, and to check the internal validity of the QI so as to validate the impact of process indicators on outcome. Methods A QI list was retrieved from the literature, recent guidelines and QI databases. The Belgian Healthcare Knowledge Center methodology was used for the selection process and involved an expert panel rating the QI on 4 criteria. The resulting scores and further discussion produced a final QI list. An online EFFECT module, including the list of variables required for measuring the QI, was developed by the Belgian Cancer Registry. Three test phases were performed to evaluate the relevance, feasibility and comprehensibility of the variables and to test the compatibility of the dataset. Results 138 QI were considered for further discussion and 82 QI were eligible for rating. Based on the rating scores and consensus among the expert panel, 41 QI were considered measurable and relevant. Testing of the data collection enabled optimization of the content and user-friendliness of the dataset and the online module. Conclusions This first Belgian initiative for monitoring the QoC of uterine cancer indicates that the previously used QI selection methodology is reproducible for uterine cancer. The QI list could be applied by other research groups for comparison. © 2013 Elsevier Inc.
Abstract:
The parallelization of an industrially important in-house computational fluid dynamics (CFD) code for calculating the airflow over complex aircraft configurations using the Euler or Navier–Stokes equations is presented. The code discussed is the flow solver module of the SAUNA CFD suite. This suite uses a novel grid system that may include block-structured hexahedral or pyramidal grids, unstructured tetrahedral grids, or a hybrid combination of both. To assist rapid convergence to a solution, a number of convergence acceleration techniques are employed, including implicit residual smoothing and a multigrid full approximation storage (FAS) scheme. Key features of the parallelization approach are the use of domain decomposition and encapsulated message passing to enable parallel execution under a single program multiple data (SPMD) paradigm. Where a hybrid grid is used, a unified grid partitioning scheme defines the decomposition of the mesh. The parallel code has been tested using both structured and hybrid grids on a number of different distributed-memory parallel systems and is now routinely used to perform industrial-scale aeronautical simulations. Copyright © 2000 John Wiley & Sons, Ltd.
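The SPMD domain-decomposition pattern the abstract describes can be sketched with a one-dimensional toy partition. The code below is an illustrative stand-in using mpi4py, not the SAUNA solver itself: each rank iterates on its own sub-domain and exchanges one halo cell with its neighbours every step.

```python
# SPMD domain-decomposition sketch with halo exchange (toy illustration).
# Run with, e.g.: mpiexec -n 4 python this_script.py
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

n_local = 100                        # cells per rank (assumed)
u = np.zeros(n_local + 2)            # local field plus one halo cell each side
u[1:-1] = rank                       # arbitrary initial data

left = rank - 1 if rank > 0 else MPI.PROC_NULL
right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

for _ in range(50):                  # fixed iteration count for the sketch
    # exchange halo cells with neighbouring sub-domains
    comm.Sendrecv(sendbuf=u[1:2], dest=left, recvbuf=u[-1:], source=right)
    comm.Sendrecv(sendbuf=u[-2:-1], dest=right, recvbuf=u[0:1], source=left)
    # Jacobi-style smoothing on the interior (stand-in for the real flow solver)
    u[1:-1] = 0.5 * (u[:-2] + u[2:])
```

Because each rank runs the same program on its own partition, adding a unified partitioner for hybrid grids (as SAUNA does) changes only how the sub-domains are generated, not this exchange-then-update structure.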