Abstract:
Background: Ethical conflicts are arising as a result of the growing complexity of clinical care, coupled with technological advances. Most studies that have developed instruments for measuring ethical conflict base their measures on the variables "frequency" and "degree of conflict". In our view, however, these variables are insufficient for explaining the root of ethical conflicts. Consequently, the present study formulates a conceptual model that also includes the variable "exposure to conflict", as well as considering six "types of ethical conflict". An instrument was then designed to measure the ethical conflicts experienced by nurses who work with critical care patients. The paper describes the development process and validation of this instrument, the Ethical Conflict in Nursing Questionnaire Critical Care Version (ECNQ-CCV). Methods: The sample comprised 205 nursing professionals from the critical care units of two hospitals in Barcelona (Spain). The ECNQ-CCV presents 19 nursing scenarios with the potential to produce ethical conflict in the critical care setting. Exposure to ethical conflict was assessed by means of the Index of Exposure to Ethical Conflict (IEEC), a specific index developed to provide a reference value for each respondent by combining the intensity and frequency of occurrence of each scenario featured in the ECNQ-CCV. Following content validity, construct validity was assessed by means of Exploratory Factor Analysis (EFA), while Cronbach's alpha was used to evaluate the instrument's reliability. All analyses were performed using the statistical software PASW v19. Results: Cronbach's alpha for the ECNQ-CCV as a whole was 0.882, which is higher than the values reported for certain other related instruments. The EFA suggested a unidimensional structure, with one component accounting for 33.41% of the explained variance. Conclusions: The ECNQ-CCV is shown to be a valid and reliable instrument for use in critical care units.
Its structure is such that the four variables on which our model of ethical conflict is based may be studied separately or in combination. The critical care nurses in this sample present moderate levels of exposure to ethical conflict. This study represents the first evaluation of the ECNQ-CCV.
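The abstract states that the IEEC combines the intensity and frequency of occurrence of each of the 19 scenarios into a single reference value per respondent, but does not give the exact formula. A minimal sketch, assuming a product-sum combination (the scale bounds and the multiplicative rule are assumptions, not the published scoring method):

```python
# Hypothetical sketch of an exposure index: sum of frequency x intensity
# over all rated scenarios. The actual IEEC scoring rule is not given in
# the abstract; this combination is an assumption for illustration only.

def exposure_index(scenarios):
    """scenarios: list of (frequency, intensity) ratings, one per scenario."""
    return sum(freq * intensity for freq, intensity in scenarios)

# Three example scenario ratings on assumed 0-4 scales
ratings = [(2, 3), (0, 0), (4, 1)]
print(exposure_index(ratings))  # 2*3 + 0*0 + 4*1 = 10
```

Separating frequency and intensity in this way is what lets the model distinguish a rarely occurring but intense conflict from a frequent but mild one, which a single "degree of conflict" rating cannot.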
Abstract:
[Les métamorphoses (Middle French). 1556]
Abstract:
Therapeutic drug monitoring (TDM), i.e., the quantification of serum or plasma concentrations of medications for dose optimization, has proven a valuable tool for patient-matched psychopharmacotherapy. Uncertain drug adherence, suboptimal tolerability, non-response at therapeutic doses, or pharmacokinetic drug-drug interactions are typical situations in which measurement of medication concentrations is helpful. Patient populations that may particularly benefit from TDM in psychiatry are children, pregnant women, elderly patients, individuals with intellectual disabilities, forensic patients, patients with known or suspected genetically determined pharmacokinetic abnormalities, and individuals with pharmacokinetically relevant comorbidities. However, the potential benefits of TDM for the optimization of pharmacotherapy can only be obtained if the method is adequately integrated into the clinical treatment process. To promote appropriate use of TDM, the TDM expert group of the Arbeitsgemeinschaft für Neuropsychopharmakologie und Pharmakopsychiatrie (AGNP) issued guidelines for TDM in psychiatry in 2004. Since then, knowledge has advanced significantly, and new psychopharmacologic agents that are also candidates for TDM have been introduced. Therefore, the TDM consensus guidelines were updated and extended to 128 neuropsychiatric drugs. Four levels of recommendation for using TDM were defined, ranging from "strongly recommended" to "potentially useful". Evidence-based "therapeutic reference ranges" and "dose-related reference ranges" were elaborated after an extensive literature search and a structured internal review process. A "laboratory alert level" was introduced, i.e., a plasma level at or above which the laboratory should immediately inform the treating physician.
Supportive information is given, such as the cytochrome P450 substrate and inhibitor properties of medications, normal ranges for the ratios of metabolite to parent drug concentrations, and recommendations for interpretative services. Recommendations on when to combine TDM with pharmacogenetic tests are also provided. Following the guidelines will help improve the outcomes of psychopharmacotherapy for many patients, especially in cases of pharmacokinetic problems. Nevertheless, one should never forget that TDM is an interdisciplinary task that sometimes requires respectful discussion of apparently discrepant data so that, ultimately, the patient can profit from such a joint effort.
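The core interpretive step described above compares a measured plasma concentration against a drug-specific therapeutic reference range and a laboratory alert level. A minimal sketch of that decision logic (the numeric values in the example are illustrative placeholders, not figures from the guidelines; real ranges are drug-specific):

```python
# Sketch of level interpretation against a therapeutic reference range
# and a laboratory alert level. All numbers below are illustrative only;
# actual ranges must be taken from the drug-specific guideline tables.

def interpret_level(level, ref_low, ref_high, alert_level):
    """Classify a measured plasma concentration (same unit as the range)."""
    if level >= alert_level:
        # At or above the alert level: the laboratory should immediately
        # inform the treating physician.
        return "alert: inform treating physician immediately"
    if level < ref_low:
        return "below therapeutic reference range"
    if level > ref_high:
        return "above therapeutic reference range"
    return "within therapeutic reference range"

# Illustrative call with made-up values (e.g. ng/mL)
print(interpret_level(55, 30, 120, 300))
```

The alert-level check deliberately comes first: a concentration can be both above the reference range and at the alert threshold, and the urgent notification takes precedence over routine range classification.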
Abstract:
Over the last 20 years, Business Intelligence has gone from being a whim of a few CIOs who could afford to allocate budget to it, to a reality already present in many large companies and an urgent need for those that have not yet implemented such a system. The first part of this document, entitled "Estudio del Business Intelligence" ("Study of Business Intelligence"), introduces the concept from the ground up. After explaining the key theoretical concepts needed to understand this type of solution, it discusses the technological components, from the processes of extracting and integrating information to how information should be structured to facilitate analysis. Finally, it reviews the different types of applications available on the market, as well as the latest trends in the field. The second part of the document focuses on the implementation of a dashboard for analyzing a company's sales: the different phases of the project are identified and the requirements gathered are described in detail. Lastly, it presents the dashboard developed with Xcelsius technology, which allows the result to be exported to Flash and viewed in any web browser.
Abstract:
Information concerning standard design practices and details for the Iowa Department of Transportation (IDOT) was provided to the research team. This was reviewed in detail so that the researchers would be familiar with the terminology and standard construction details. A comprehensive literature review was completed to gather information concerning constructability concepts applicable to bridges. It was determined that most of the literature deals with constructability as a general topic with only a limited amount of literature with specific concepts for bridge design and construction. Literature was also examined concerning the development of appropriate microcomputer databases. These activities represent completion of Task 1 as identified in the study.
Abstract:
This paper presents a validation study on statistical nonsupervised brain tissue classification techniques in magnetic resonance (MR) images. Several image models assuming different hypotheses regarding the intensity distribution model, the spatial model and the number of classes are assessed. The methods are tested on simulated data for which the classification ground truth is known. Different noise and intensity nonuniformities are added to simulate real imaging conditions. No enhancement of the image quality is considered either before or during the classification process. This way, the accuracy of the methods and their robustness against image artifacts are tested. Classification is also performed on real data where a quantitative validation compares the methods' results with an estimated ground truth from manual segmentations by experts. Validity of the various classification methods in the labeling of the image as well as in the tissue volume is estimated with different local and global measures. Results demonstrate that methods relying on both intensity and spatial information are more robust to noise and field inhomogeneities. We also demonstrate that partial volume is not perfectly modeled, even though methods that account for mixture classes outperform methods that only consider pure Gaussian classes. Finally, we show that simulated data results can also be extended to real data.
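The baseline model family the paper evaluates is intensity-based Gaussian mixture classification fitted without supervision, typically via expectation-maximization. A minimal 1-D sketch of that idea on synthetic "voxel" intensities (pure-Gaussian classes only; the spatial models and partial-volume mixture classes discussed above are omitted, and the quantile initialization is an implementation choice of this sketch, not the paper's):

```python
import numpy as np

# Minimal sketch: unsupervised tissue labeling of 1-D intensities with a
# Gaussian mixture fitted by EM. Spatial information and partial-volume
# classes, which the paper shows improve robustness, are not modeled here.

def fit_gmm_1d(x, k, n_iter=100):
    mu = np.quantile(x, (np.arange(k) + 0.5) / k)  # spread initial means
    sigma = np.full(k, x.std())
    pi = np.full(k, 1.0 / k)
    for _ in range(n_iter):
        # E-step: responsibility of each class for each sample
        dens = np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / sigma
        r = pi * dens
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, standard deviations
        nk = r.sum(axis=0)
        pi = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
    return pi, mu, sigma

def classify(x, pi, mu, sigma):
    dens = np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / sigma
    return np.argmax(pi * dens, axis=1)  # hard label = most probable class

# Synthetic intensities: two well-separated "tissue" classes
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(40, 5, 500), rng.normal(100, 5, 500)])
pi, mu, sigma = fit_gmm_1d(x, k=2)
labels = classify(x, pi, mu, sigma)
```

With noise or intensity nonuniformity added to such data, an intensity-only classifier of this kind degrades quickly, which is exactly why the paper finds models that also use spatial information to be more robust.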
Abstract:
The software development industry is constantly evolving. The rise of agile methodologies in the late 1990s, along with new development tools and technologies, demands growing attention from everybody working in this industry. Organizations have, however, used a mixture of processes and process languages, since no standard software development process language has been available. A promising process meta-model, the Software & Systems Process Engineering Meta-Model (SPEM) 2.0, has recently been released. It is applied by tools such as the Eclipse Process Framework Composer, which is designed for implementing and maintaining processes and method content and aims to support a broad variety of project types and development styles. This thesis presents the concepts of software processes, models, traditional and agile approaches, method engineering, and software process improvement. Some of the best-known methodologies (RUP, OpenUP, OpenMethod, XP and Scrum) are also introduced, with a comparison provided between them. The main focus is on the Eclipse Process Framework and SPEM 2.0: their capabilities, usage and modeling. As a proof of concept, I present a case study of modeling OpenMethod with EPF Composer and SPEM 2.0. The results show that the new meta-model and tool make it possible to easily manage method content, publish versions with customized content, and connect project tools (such as MS Project) with the process content. Software process modeling also acts as a process improvement activity.