881 results for Enterprise system implementation
Abstract:
New reimbursement policies developed by the Centers for Medicare and Medicaid Services (CMS) are reshaping the health care landscape in America. The policies focus on clinical quality and patient outcomes. As part of the new policies, Medicare has identified certain hospital-acquired conditions as "reasonably preventable." Beginning October 1, 2008, Medicare will no longer reimburse hospitals for these conditions when they develop after admission; pressure ulcers are among the most common of them. The objective of this practice-based culminating experience was to provide a practical account of the process of program development, implementation, and evaluation in a public health setting. To decrease the incidence of pressure ulcers, the hospital system's program development team created a comprehensive pressure ulcer prevention program using a "bundled" approach. The pressure ulcer prevention bundle was based on research supported by the Institute for Healthcare Improvement and addressed key areas of clinical vulnerability for pressure ulcer development. The bundle consisted of clinical processes, policies, forms, and resources designed to proactively identify patients at risk for pressure ulcer development. Each element of the bundle was evaluated to ensure ease of integration into the workflow of nurses and clinical ancillary staff. Continued monitoring of pressure ulcer incidence rates will provide statistical validation of the impact of the prevention bundle.
Abstract:
Introduction: The Texas Occupational Safety & Health Surveillance System (TOSHSS) was created to collect, analyze, and interpret occupational injury and illness data in order to decrease the impact of occupational injuries within the state of Texas. This process evaluation was performed midway through the 4-year grant to assess the efficiency and effectiveness of the surveillance system's planning and implementation activities.¹ Methods: Two evaluation guidelines published by the Centers for Disease Control and Prevention (CDC) were used as the theoretical models for this process evaluation. The Framework for Program Evaluation in Public Health was used to examine the planning and design of TOSHSS using logic models. The Framework for Evaluating Public Health Surveillance Systems was used to examine the implementation of approximately 60 surveillance activities, including uses of the data obtained from the surveillance system. Results/Discussion: TOSHSS planning activities omitted the creation of a scientific advisory committee and specific activities designed to maintain contacts with stakeholders; proposed activities should be reassessed and aligned with ongoing performance measurement criteria, including the role of collaborators in helping the surveillance system achieve each proposed activity. TOSHSS implementation activities are substantially meeting expectations and received an overall score of 61% for all activities being performed. TOSHSS is considered a surveillance system that is simple, flexible, acceptable, fairly stable, timely, and moderately useful, with good data quality and a predictive value positive (PVP) of 86%. Conclusions: Through the third year of TOSHSS implementation, the surveillance system has made a considerable contribution to the collection of occupational injury and illness information within the state of Texas. Implementation of the nine recommendations provided under this process evaluation is expected to increase the overall usefulness of the surveillance system and assist TDSHS in reducing occupational fatalities, injuries, and diseases within the state of Texas. ¹ Disclaimer: The Texas Occupational Safety and Health Surveillance System is supported by Grant/Cooperative Agreement Number (U60 OH008473-01A1). The contents of the current evaluation are solely the responsibility of the authors and do not necessarily represent the official views of the Centers for Disease Control and Prevention, National Institute for Occupational Safety and Health.
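For reference, the CDC Framework for Evaluating Public Health Surveillance Systems cited above defines predictive value positive (PVP) as the proportion of reported cases that turn out to be true cases. The sketch below states that definition only; the case counts behind the 86% figure are not given in the abstract.

```latex
% PVP as defined in the CDC surveillance-evaluation framework (illustrative).
% A = reported cases confirmed as true cases
% B = reported cases found not to be cases (false reports)
\[
  \mathrm{PVP} = \frac{A}{A + B}
\]
```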
Abstract:
Objective. The World Health Organization (WHO) estimates that nearly 450 million people worldwide suffer from a mental disorder. Developing countries do not have the health system structure in place to support the demand for mental health services. This study reviews research on the integration of mental health into primary care carried out in low-income countries, identified as such by the World Bank's economic classification. The review follows the standard of care that WHO has labeled appropriate for the treatment of mental health populations. Methods. The study uses the WHO's 10 principles of mental health integration into primary care as the global standard of care for mental health. Low-income countries that applied these principles in their national programs were analyzed for the effectiveness of mental health integration in primary care. Results. Mental health service integration in primary care did have an effect on health outcomes in low-income countries; however, the available information did not yield significant quantitative results establishing how large the effect was. Conclusion. More ethnographic research is needed in low-income countries to truly assess how effectively such programs integrate with the health systems currently in place.
Abstract:
This work aimed to create a mailable, OSLD-based phantom with accuracy suitable for RPC audits of HDR brachytherapy sources at institutions participating in NCI-funded cooperative clinical trials. An 8 × 8 × 10 cm³ prototype with two slots capable of holding nanoDot Al₂O₃:C OSL dosimeters (Landauer, Glenwood, IL) was designed and built. The phantom has a single channel capable of accepting all ¹⁹²Ir HDR brachytherapy sources in current clinical use in the United States. Irradiations were performed with a ¹⁹²Ir HDR source to determine correction factors for linearity with dose, dose rate, and the combined effect of irradiation energy and phantom construction. The uncertainties introduced by source positioning in the phantom and timer resolution limitations were also investigated. The linearity correction factor was found to be a function of dose (in cGy) that differed from the factor determined by the RPC for the same batch of dosimeters under ⁶⁰Co irradiation. There was no significant dose rate effect. Separate energy+block correction factors were determined for both models of ¹⁹²Ir sources currently in clinical use, and these vendor-specific correction factors differed by almost 2.6%. For Nucletron sources this correction factor was 1.026±0.004 (99% confidence interval), and for Varian sources it was 1.000±0.007 (99% CI). Reasonable deviations in source positioning within the phantom and the limited resolution of the source timer had insignificant effects on the ability to measure dose. Overall measurement uncertainty of the system was estimated to be ±2.5% for both Nucletron and Varian source audits (95% CI). This uncertainty was small enough to support a ±5% acceptance criterion for source strength audits under a formal RPC audit program. Trial audits of eight participating institutions resulted in an average RPC-to-institution dose ratio of 1.000 with a standard deviation of 0.011.
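For orientation, a generic multiplicative dose equation of the kind implied by the correction factors above is sketched below. The symbols and the exact combination are assumptions for illustration only and are not taken from this work.

```latex
% Illustrative only: a generic corrected-dose relation for an OSLD reading.
%   M        raw nanoDot OSL reading (counts)
%   N_{Co}   calibration coefficient determined under ^{60}Co (cGy/count)
%   k_L      dose-linearity correction factor
%   k_{E,B}  combined energy + block (phantom) correction factor,
%            e.g. 1.026 for the Nucletron source model quoted above
\[
  D_{^{192}\mathrm{Ir}} = M \cdot N_{\mathrm{Co}} \cdot k_L \cdot k_{E,B}
\]
```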
Abstract:
Background: The Sacred Vocation Program (SVP) (Amick B, Karff S., 2003) helps workers find meaning and spirituality in their work and see their job as a sacred vocation. The SVP is based on Participatory Action Research (PAR) (Minkler & Wallerstein, 1997; Parker & Wall, 1998). This study aims to evaluate the SVP as implemented at the Baylor Healthcare System, Dallas-Fort Worth. Methods: The study uses a qualitative design. We used data from focus groups with study participants, during which specific questions and probes regarding the effectiveness of the SVP were asked. We analyzed the focus groups and derived themes. Results: The results demonstrate that the SVP helps graduates feel valued and important. The SVP has made work more meaningful for employees and strengthened participants' sense of belonging, and the program has also increased participant spirituality. The coping techniques developed during SVP classes help participants deal with stressful situations. The SVP faces challenges of implementation fidelity, poor communication, program viability in tough economic times, and the implementation of phase II. Another sustainability challenge for the SVP is the perception of the program as a religious rather than a spiritual one. Conclusion: Several aspects of the SVP work. Phase I of the SVP is successful in improving meaningful work and a sense of belonging for participants. The coping techniques help participants deal with difficult work situations. The SVP can increase its effectiveness through improvements in implementation fidelity, communication, and leadership commitment.
Abstract:
The oceans play a critical role in the Earth's climate, but unfortunately, the extent of this role is only partially understood. One major obstacle is the difficulty associated with making high-quality, globally distributed observations, a feat that is nearly impossible using only ships and other ocean-based platforms. The data collected by satellite-borne ocean color instruments, however, provide environmental scientists with a synoptic look at the productivity and variability of the Earth's oceans and atmosphere, respectively, on high-resolution temporal and spatial scales. Three such instruments, the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) onboard ORBIMAGE's OrbView-2 satellite, and two Moderate Resolution Imaging Spectroradiometers (MODIS) onboard the National Aeronautic and Space Administration's (NASA) Terra and Aqua satellites, have been in continuous operation since September 1997, February 2000, and June 2002, respectively. To facilitate the assembly of a suitably accurate data set for climate research, members of the NASA Sensor Intercomparison and Merger for Biological and Interdisciplinary Oceanic Studies (SIMBIOS) Project and SeaWiFS Project Offices devote significant attention to the calibration and validation of these and other ocean color instruments. This article briefly presents results from the SIMBIOS and SeaWiFS Project Offices' (SSPO) satellite ocean color validation activities and describes the SeaWiFS Bio-optical Archive and Storage System (SeaBASS), a state-of-the-art system for archiving, cataloging, and distributing the in situ data used in these activities.
Abstract:
This paper describes the experience of implementing and developing the journal portal of the Facultad de Humanidades y Ciencias de la Educación of the Universidad Nacional de La Plata, so that it can be drawn upon by anyone undertaking similar initiatives. It first reviews the Facultad's track record in publishing scholarly journals and the library work carried out to improve their visibility. It then describes the tasks undertaken by the Facultad's Prosecretaría de Gestión Editorial y Difusión (PGEyD) to bring the portal into operation, with particular reference to the customization of the software, the methodology used for the bulk loading of information into the system (users and back issues), and the procedures that allow all portal content to be included semi-automatically in the institutional repository and in the web catalog. It then refers to the ongoing work on editor support and training. The results achieved after one year of work are presented: the creation of 10 journals, the migration of 4 complete titles, and the inclusion of 25% of the contributions published in the journals edited by the FaHCE. The paper closes by outlining a series of challenges the Prosecretaría has set itself to improve the portal and optimize intra- and inter-institutional workflows.
Abstract:
Zernike polynomials are a well-known set of functions that find many applications in image and pattern characterization because they allow the construction of shape descriptors that are invariant to translation, rotation, and scale changes. The concepts behind them can be extended to higher-dimensional spaces, making them also suitable for describing volumetric data. They have been used less than their properties might suggest because of their high computational cost. We present a parallel implementation of 3D Zernike moment analysis, written in C with CUDA extensions, which makes it practical to employ Zernike descriptors in interactive applications, yielding a performance of several frames per second on voxel datasets about 200³ in size. In our contribution, we describe the challenges of implementing 3D Zernike analysis on a general-purpose GPU, including how to deal with numerical inaccuracies arising from the algorithm's high precision demands and how to handle the high volume of input data so that it does not become a bottleneck for the system.
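As a minimal illustration of the structure involved, the sketch below shows the voxel-summation and rotation-invariant norm that underlie 3D Zernike descriptors. It is not the paper's CUDA code: the radial profile is a placeholder (a faithful implementation would use the 3D Zernike radial polynomials with the Novotni–Klein coefficients), and the function and parameter names are assumptions.

```python
# Illustrative sketch only: accumulation of voxel moments against a
# spherical-harmonic basis and the norm over m that gives rotation
# invariance. The radial term r**l is a stand-in for the true 3D Zernike
# radial polynomials R_nl(r), which are omitted here.
import numpy as np
from scipy.special import sph_harm

def zernike_like_descriptors(volume, max_l=4):
    """Rotation-invariant descriptors F_l = ||Omega_lm||_m for a voxel grid."""
    n = volume.shape[0]
    # Map voxel centers into the unit ball.
    axis = (np.arange(n) + 0.5) / n * 2.0 - 1.0
    x, y, z = np.meshgrid(axis, axis, axis, indexing="ij")
    r = np.sqrt(x**2 + y**2 + z**2)
    inside = r <= 1.0                      # restrict support to the unit ball
    theta = np.arctan2(y, x)               # azimuth
    phi = np.arccos(np.clip(z / np.maximum(r, 1e-12), -1.0, 1.0))  # polar
    f = volume[inside].astype(np.complex128)
    r_in, th_in, ph_in = r[inside], theta[inside], phi[inside]

    descriptors = []
    for l in range(max_l + 1):
        radial = r_in**l                   # placeholder for R_nl(r)
        acc = 0.0
        for m in range(-l, l + 1):
            # scipy's sph_harm takes (m, l, azimuth, polar)
            basis = radial * np.conj(sph_harm(m, l, th_in, ph_in))
            omega_lm = np.sum(f * basis)   # moment = sum over voxels
            acc += np.abs(omega_lm) ** 2
        descriptors.append(np.sqrt(acc))   # norm over m -> rotation invariant
    return np.array(descriptors)

# Usage: descriptors of a random 64^3 volume (the paper targets ~200^3 on GPU).
vol = np.random.rand(64, 64, 64)
print(zernike_like_descriptors(vol, max_l=3))
```

On a GPU, the inner voxel sum is the part that parallelizes naturally (one thread per voxel plus a reduction), which is where the precision and bandwidth issues mentioned in the abstract arise.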
Application of the Extended Kalman filter to fuzzy modeling: Algorithms and practical implementation
Abstract:
The modeling phase is fundamental both in the analysis of a dynamic system and in the design of a control system. It is even more critical when this phase must be carried out in-line and the only information about the system comes from input/output data. This paper presents adaptation algorithms for fuzzy systems based on the extended Kalman filter, which yield accurate models without giving up the computational efficiency that characterizes the Kalman filter and which allow implementation in-line with the process.
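A minimal sketch of the general idea follows, assuming a zero-order Takagi-Sugeno model whose consequent parameters are treated as the Kalman state and adapted from streaming input/output data; the paper's specific algorithms and membership structure are not reproduced here, and all names are illustrative.

```python
# Hedged sketch: adapting the consequent parameters of a zero-order
# Takagi-Sugeno fuzzy model with a Kalman-filter update, the parameters
# being the filter state. Gaussian memberships are chosen for illustration.
import numpy as np

class EKFFuzzyModel:
    def __init__(self, centers, sigma, q=1e-4, r=1e-2):
        self.centers = np.asarray(centers, dtype=float)   # rule centers
        self.sigma = sigma                                 # membership width
        n = len(centers)
        self.theta = np.zeros(n)        # consequent parameters (filter state)
        self.P = np.eye(n) * 10.0       # state covariance
        self.Q = np.eye(n) * q          # process noise (allows slow drift)
        self.R = r                      # measurement noise variance

    def _firing(self, x):
        w = np.exp(-0.5 * ((x - self.centers) / self.sigma) ** 2)
        return w / np.sum(w)            # normalized firing strengths

    def predict(self, x):
        return float(self._firing(x) @ self.theta)

    def update(self, x, y):
        """One filter step with measurement y at input x."""
        H = self._firing(x)                      # Jacobian dy/dtheta
        self.P = self.P + self.Q                 # time update (random walk)
        S = H @ self.P @ H + self.R              # innovation variance
        K = self.P @ H / S                       # Kalman gain
        self.theta = self.theta + K * (y - H @ self.theta)
        self.P = self.P - np.outer(K, H) @ self.P

# Usage: learn y = sin(x) in-line from noisy input/output data.
model = EKFFuzzyModel(centers=np.linspace(-3, 3, 9), sigma=0.6)
rng = np.random.default_rng(0)
for _ in range(2000):
    x = rng.uniform(-3, 3)
    model.update(x, np.sin(x) + 0.05 * rng.standard_normal())
print(model.predict(1.0), np.sin(1.0))
```

With fixed memberships the model is linear in its consequents, so the extended filter's linearization reduces to the firing-strength vector; adapting the membership parameters as well would require the full EKF Jacobian.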
Abstract:
This article presents the model and implementation of a multiagent fuzzy system (MAFS) to automate the search for solutions to telecommunications incidents that are expressed by users in an imprecise way and later registered in a knowledge base, preserving their original vagueness and the relationships between incidents considered as ancestors and descendants. The processing of fuzzy incidents, regardless of their causes, is based on a formula that transforms the intervals of the fuzzy incidents into a computational representation, and on the interaction between the different kinds of software agents and human users. To search for and suggest solutions to the incident originally stated, a search algorithm is used and illustrated with an example. The preliminary results show user satisfaction in a large percentage of the cases presented. The system is adaptive and allows new solutions to be recorded for future users.
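The abstract does not give the transformation formula or the agent design; purely as an illustration of how fuzzy incident descriptions expressed as intervals could be matched against a knowledge base, the sketch below uses an overlap-based similarity measure. All names, data, and the similarity measure are assumptions, not the paper's method.

```python
# Hypothetical sketch: matching a fuzzy incident, described by an interval
# of severity values, against stored incidents by interval overlap.
from dataclasses import dataclass

@dataclass
class Incident:
    description: str
    low: float      # lower bound of the fuzzy interval
    high: float     # upper bound of the fuzzy interval
    solution: str

def interval_similarity(a_low, a_high, b_low, b_high):
    """Overlap length divided by the span of the union (0..1)."""
    overlap = max(0.0, min(a_high, b_high) - max(a_low, b_low))
    union = max(a_high, b_high) - min(a_low, b_low)
    return overlap / union if union > 0 else 1.0

def suggest_solutions(knowledge_base, low, high, threshold=0.3):
    scored = [(interval_similarity(low, high, i.low, i.high), i)
              for i in knowledge_base]
    ranked = sorted(scored, key=lambda t: t[0], reverse=True)
    return [(score, inc.solution) for score, inc in ranked if score >= threshold]

kb = [Incident("intermittent line noise", 2, 5, "check copper pair / filters"),
      Incident("total loss of service", 8, 10, "dispatch field technician")]
print(suggest_solutions(kb, 3, 6))
```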
Abstract:
Nowadays, PBL is considered a suitable methodology for engineering education. But making the most of this methodology requires certain features, such as multidisciplinary, ill-structured teamwork and autonomous research, that are sometimes not easy to achieve. In fact, traditional university systems, including curricula, teaching methodologies, assessment, and regulation, do not help the implementation of these features. First, we review the main differences between a traditional system and the Aalborg model, considered a reference point in PBL. This work then aims to identify the main obstacles that an established traditional system presents to PBL implementation. A multifaceted PBL experience covering three different disciplines allows us to analyse these difficulties, rank them by importance, and decide which changes should come first. Finally, we propose a straightforward introduction of generic competences into the curricula aimed at supporting the use of Problem-Based Project-Organized Learning.
Abstract:
In parallel with the effort to create Linked Open Data for the World Wide Web, a number of projects aim to apply the same technologies in closed environments such as private enterprises. In this paper, we present results of research on interlinking structured data for use in Idea Management Systems, a still-rare breed of knowledge management systems dedicated to innovation management. In our study, we show how an ontology that initially covers only the Idea Management System structure is extended towards linking with distributed enterprise data and public data using Semantic Web technologies. Furthermore, we point out how the established links can help to solve key problems of contemporary Idea Management Systems.
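As a small illustration of what such interlinking can look like in practice, the sketch below uses rdflib to connect an idea to an internal enterprise resource and to a public Linked Data concept. The "gi" ontology namespace, its properties, and the enterprise URIs are invented for illustration; the paper's actual ontology is not reproduced here.

```python
# Hypothetical sketch: linking an Idea Management System idea to enterprise
# data and to a public Linked Data resource (DBpedia) using rdflib.
from rdflib import Graph, Namespace, URIRef, Literal
from rdflib.namespace import RDF, RDFS, FOAF

GI = Namespace("http://example.org/idea-ontology#")    # assumed ontology namespace
ENT = Namespace("http://example.org/enterprise/")       # assumed enterprise data

g = Graph()
g.bind("gi", GI)

idea = URIRef("http://example.org/ideas/42")
g.add((idea, RDF.type, GI.Idea))
g.add((idea, RDFS.label, Literal("Reduce packaging waste")))
# Link to an internal enterprise resource (e.g., a product record).
g.add((idea, GI.relatesToProduct, ENT["product/P-1001"]))
# Link to a public Linked Data concept describing the idea's topic.
g.add((idea, GI.hasTopic, URIRef("http://dbpedia.org/resource/Packaging_and_labeling")))
# Link the submitter to a FOAF person in the enterprise data.
g.add((idea, GI.submittedBy, ENT["staff/jdoe"]))
g.add((ENT["staff/jdoe"], RDF.type, FOAF.Person))

print(g.serialize(format="turtle"))
```

Links of this kind are what let an idea be grouped with related products, people, and external topics, which is the duplicate-detection and idea-clustering use case the paper targets.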
Abstract:
This paper discusses a novel hybrid approach to text categorization that combines a machine learning algorithm, which provides a base model trained with a labeled corpus, with a rule-based expert system, which is used to improve the results provided by the previous classifier by filtering false positives and dealing with false negatives. The main advantage is that the system can be easily fine-tuned by adding specific rules for those noisy or conflicting categories that have not been successfully trained. We also describe an implementation based on k-Nearest Neighbor and a simple rule language to express lists of positive, negative, and relevant (multiword) terms appearing in the input text. The system is evaluated in several scenarios, including the popular Reuters-21578 news corpus for comparison with other approaches, and categorization using IPTC metadata, the EUROVOC thesaurus, and others. Results show that this approach achieves precision comparable to top-ranked methods, with the added value that it does not require a demanding workload from human experts to train.
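A minimal sketch of the hybrid idea described above is given below: a kNN text classifier whose predictions are post-processed by simple term rules that veto likely false positives and recover false negatives. The rule contents, categories, and training snippets are invented for illustration, and the paper's rule language is richer than the plain term lists used here.

```python
# Hedged sketch: kNN base classifier + rule-based post-filtering.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neighbors import KNeighborsClassifier

train_docs = ["stocks fell sharply on wall street",
              "the central bank raised interest rates",
              "the team won the championship final",
              "injury forces striker out of the match"]
train_labels = ["economy", "economy", "sports", "sports"]

vec = TfidfVectorizer()
X = vec.fit_transform(train_docs)
knn = KNeighborsClassifier(n_neighbors=3).fit(X, train_labels)

# Per-category rules: "negative" terms veto an assignment (filter false
# positives); "positive" terms can recover the right category instead.
RULES = {
    "economy": {"positive": {"interest rates", "stock"},
                "negative": {"championship"}},
    "sports":  {"positive": {"match", "championship"},
                "negative": {"interest rates"}},
}

def classify(doc):
    label = knn.predict(vec.transform([doc]))[0]   # base ML prediction
    text = doc.lower()
    if any(term in text for term in RULES[label]["negative"]):
        # Veto: reassign to any category whose positive terms appear.
        for cat, rule in RULES.items():
            if any(term in text for term in rule["positive"]):
                return cat
    return label                                   # keep base prediction

print(classify("the championship match drew a record crowd"))
```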