977 results for Evaluation Studies as Topic
Abstract:
PURPOSE: To evaluate the technical quality and the diagnostic performance of a protocol using a low volume of contrast medium (25 mL) at 64-detector spiral computed tomography (CT) in the diagnosis and management of adult nontraumatic subarachnoid hemorrhage (SAH). MATERIALS AND METHODS: This study was performed outside the United States and was approved by the institutional review board. Intracranial CT angiography was performed in 73 consecutive patients with nontraumatic SAH diagnosed at nonenhanced CT. Image quality was evaluated by two observers using two criteria: degree of arterial enhancement and venous contamination. Two independent readers evaluated diagnostic performance (lesion detection and correct therapeutic decision making) by using rotational angiographic findings as the standard of reference. Sensitivity, specificity, and positive and negative predictive values were calculated for patients who underwent CT angiography and three-dimensional rotational angiography. The intraclass correlation coefficient was calculated to assess interobserver concordance concerning aneurysm measurements and therapeutic management. RESULTS: All aneurysms, whether ruptured or unruptured, were detected. Arterial opacification was excellent in 62 cases (85%), and venous contamination was absent or minor in 61 cases (84%). In 95% of cases, CT angiographic findings allowed optimal therapeutic management. The intraclass correlation coefficient ranged between 0.93 and 0.95, indicating excellent interobserver agreement. CONCLUSION: With only 25 mL of iodinated contrast medium focused on the arterial phase, 64-detector CT angiography allowed satisfactory diagnostic and therapeutic management of nontraumatic SAH.
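For reference, the diagnostic performance measures reported above have standard definitions in terms of the 2×2 table of CT angiographic calls against the rotational-angiography reference (TP, FP, TN, FN are the generic cell counts, not figures from this study):

```latex
\mathrm{Sensitivity} = \frac{TP}{TP+FN}, \quad
\mathrm{Specificity} = \frac{TN}{TN+FP}, \quad
\mathrm{PPV} = \frac{TP}{TP+FP}, \quad
\mathrm{NPV} = \frac{TN}{TN+FN}
```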
Abstract:
BACKGROUND AND STUDY AIMS: To summarize the published literature on assessment of appropriateness of colonoscopy for the investigation of iron-deficiency anemia (IDA) and hematochezia, and report appropriateness criteria developed by an expert panel, the 2008 European Panel on the Appropriateness of Gastrointestinal Endoscopy (EPAGE II). METHODS: A systematic search of guidelines, systematic reviews and primary studies regarding the evaluation and management of IDA and hematochezia was performed. The RAND/UCLA Appropriateness Method was applied to develop appropriateness criteria for colonoscopy for these conditions. RESULTS: IDA occurs in 2%-5% of adult men and postmenopausal women. Examination of both the upper and lower gastrointestinal tract is recommended in patients with iron deficiency. Colonoscopy for IDA yields one colorectal cancer (CRC) in every 9-13 colonoscopies. Hematochezia is a well-recognized alarm symptom, and such patients are likely to be referred for colonoscopy. Colonoscopy is unanimously recommended in patients aged ≥ 50. Diverticulosis, vascular ectasias, and ischemic colitis are common causes of acute lower gastrointestinal bleeding (LGIB); CRC is found in 0.2%-11% of the colonoscopies performed for LGIB. Most patients with scant hematochezia have an anorectal or a distal source of bleeding. The expert panel considered most clinical indications for colonoscopy appropriate in the presence of IDA (58%) or hematochezia (83%). CONCLUSION: Despite the limitations of the published studies, guidelines unanimously recommend colonoscopy for the investigation of IDA and hematochezia in patients aged ≥ 50 years. These indications were also considered appropriate by EPAGE II, as were indications in patients at low risk for CRC with no obvious cause of bleeding found during adequate previous investigations.
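As background to how such criteria are derived: the RAND/UCLA Appropriateness Method has panelists rate each indication on a 1-9 scale and classifies it from the panel median and the presence of disagreement. A minimal sketch under one common set of definitions (an illustration, not the EPAGE II panel's exact rules):

```python
from statistics import median

def classify_appropriateness(ratings):
    """Classify one clinical indication from a panel's 1-9 ratings.

    One common RAND/UCLA-style rule (panels vary in the details):
    median 7-9 without disagreement -> appropriate;
    median 1-3 without disagreement -> inappropriate;
    anything else -> uncertain. "Disagreement" here means at least
    one third of panelists rated in each extreme tertile (1-3 and 7-9).
    """
    n = len(ratings)
    low = sum(r <= 3 for r in ratings)
    high = sum(r >= 7 for r in ratings)
    if low >= n / 3 and high >= n / 3:
        return "uncertain"
    m = median(ratings)
    if m >= 7:
        return "appropriate"
    if m <= 3:
        return "inappropriate"
    return "uncertain"

# Hypothetical 14-member panel rating colonoscopy for hematochezia, age >= 50
print(classify_appropriateness([7, 8, 9, 8, 7, 9, 8, 7, 8, 9, 7, 8, 6, 9]))  # appropriate
```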
Abstract:
BACKGROUND: Decision curve analysis has been introduced as a method to evaluate prediction models in terms of their clinical consequences if used for a binary classification of subjects into a group who should and a group who should not be treated. The key concept for this type of evaluation is the "net benefit", a concept borrowed from utility theory. METHODS: We recall the foundations of decision curve analysis and discuss some new aspects. First, we stress the formal distinction between the net benefit for the treated and for the untreated and define the concept of the "overall net benefit". Next, we revisit the important distinction between the concept of accuracy, as typically assessed using the Youden index and a receiver operating characteristic (ROC) analysis, and the concept of utility of a prediction model, as assessed using decision curve analysis. Finally, we provide an explicit implementation of decision curve analysis to be applied in the context of case-control studies. RESULTS: We show that the overall net benefit, which combines the net benefit for the treated and the untreated, is a natural alternative to the benefit achieved by a model, being invariant with respect to the coding of the outcome and conveying a more comprehensive picture of the situation. Further, within the framework of decision curve analysis, we illustrate the important difference between the accuracy and the utility of a model, demonstrating how poor an accurate model may be in terms of its net benefit. Finally, we show that the application of decision curve analysis to case-control studies, where an accurate estimate of the true prevalence of a disease cannot be obtained from the data, is achieved with a few modifications to the original calculation procedure. CONCLUSIONS: We present several interrelated extensions to decision curve analysis that will both facilitate its interpretation and broaden its potential area of application.
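For concreteness, the net benefit quantities the abstract distinguishes have standard forms in the decision curve literature; a sketch in the usual notation, with N subjects, threshold probability p_t, and TP/FP/TN/FN the classification counts at that threshold (the paper's exact definition of the overall net benefit should be taken from the paper itself):

```latex
\mathrm{NB}_{\mathrm{treated}}(p_t) = \frac{TP}{N} - \frac{FP}{N}\,\frac{p_t}{1-p_t},
\qquad
\mathrm{NB}_{\mathrm{untreated}}(p_t) = \frac{TN}{N} - \frac{FN}{N}\,\frac{1-p_t}{p_t}
```

A decision curve plots such a net benefit against p_t across the clinically relevant range of threshold probabilities.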
Abstract:
OBJECTIVES: To evaluate the combination of ultrasound (US) + fine-needle aspiration (FNA) in the assessment of salivary gland tumours in the hands of the otolaryngologist. DESIGN: A retrospective review of case notes was performed. SETTING: Two university teaching hospitals in Switzerland. PARTICIPANTS: One hundred and three patients with a total of 106 focal masses of the salivary glands were included. Clinician-operated US + FNA was the first line of investigation for these lesions. All patients underwent surgical excision of the lesion, which allowed for confirmation of the diagnosis by histopathology in 104 lesions and by laboratory testing in two lesions. MAIN OUTCOME MEASURES: Primary--diagnostic accuracy in identifying true salivary gland neoplasms and detecting malignancy. Secondary--predicting an approximate and specific diagnosis in these tumours. RESULTS: The combination of US + FNA achieved a diagnostic accuracy of 99% in identifying and differentiating true salivary gland neoplasms from tumour-like lesions. In detecting malignancy, this combination permitted an accuracy of 98%. An approximate diagnosis was possible in 89%, and a specific diagnosis in 69%, of our patients. CONCLUSIONS: Owing to economic factors and a high diagnostic accuracy, the combination of US + FNA represents the investigation method of choice for most salivary gland tumours. We suggest that otolaryngologists perform these procedures themselves, as is already the rule in other medical specialties, while computed tomography and magnetic resonance imaging should be reserved for the few lesions that cannot be delineated completely by sonography.
Abstract:
BACKGROUND: Methodological research has found that non-published studies often have different results than those that are published, a phenomenon known as publication bias. When results are not published, or are published selectively based on the direction or the strength of the findings, healthcare professionals and consumers of healthcare cannot base their decision-making on the full body of current evidence. METHODS: As part of the OPEN project (http://www.open-project.eu) we will conduct a systematic review with the following objectives:
1. To determine the proportion and/or rate of non-publication of studies by systematically reviewing methodological research projects that followed up a cohort of studies that
   a. received research ethics committee (REC) approval,
   b. were registered in trial registries, or
   c. were presented as abstracts at conferences.
2. To assess the association of study characteristics (for example, direction and/or strength of findings) with likelihood of full publication.
To identify reports of relevant methodological research projects we will conduct electronic database searches, check reference lists, and contact experts. Published and unpublished projects will be included. The inclusion criteria are as follows:
   a. RECs: methodological research projects that examined the subsequent proportion and/or rate of publication of studies that received approval from RECs;
   b. Trial registries: methodological research projects that examine the subsequent proportion and/or rate of publication of studies registered in trial registries;
   c. Conference abstracts: methodological research projects that examine the subsequent proportion and/or rate of full publication of studies initially presented at conferences as abstracts.
Primary outcomes: proportion/rate of published studies; time to full publication (mean/median; cumulative publication rate by time). Secondary outcomes: association of study characteristics with full publication. The different questions (a, b, and c) will be investigated separately. Data synthesis will involve a combination of descriptive and statistical summaries of the included methodological research projects. DISCUSSION: Results are expected to be publicly available in mid-2013.
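The protocol lists "cumulative publication rate by time" among its primary outcomes without specifying an estimator; a minimal sketch of one standard choice, a Kaplan-Meier-style estimate that censors studies still unpublished at last follow-up (the cohort below is hypothetical, not the review's data):

```python
def kaplan_meier_publication(times, published):
    """Cumulative publication probability by time (1 - KM survival).

    times: follow-up time in months for each study in the cohort
    published: True if the study reached full publication at times[i],
               False if it was still unpublished then (censored).
    Returns (time, cumulative publication rate) at each publication event.
    """
    # At equal times, count publication events before censorings
    events = sorted(zip(times, published), key=lambda e: (e[0], not e[1]))
    at_risk = len(events)
    surv = 1.0
    curve = []
    for t, pub in events:
        if pub:
            surv *= 1 - 1 / at_risk
            curve.append((t, round(1 - surv, 3)))
        at_risk -= 1
    return curve

# Hypothetical REC-approved cohort: months to publication, censored if unpublished
print(kaplan_meier_publication([12, 18, 24, 36, 48, 60],
                               [True, True, False, True, False, False]))
```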
Abstract:
The present prospective study, with a five-year follow-up, presents an extensive psychiatric and educational assessment of an adolescent population (N = 30) aged 14-20, suffering from various psychiatric disorders though able to follow a normal academic program. The residential settings where the study took place provide both psychiatric and schooling facilities. In this environment, what is the effectiveness of long-term hospitalization? Are there any criteria for predicting results? After discharge, could social adjustment difficulties be prevented? The assessment instruments are described and the results of a preliminary study are presented. The data obtained seem to confirm the impact of special treatment facilities combining schooling and psychiatric settings on the long-term outcome of adolescents.
Abstract:
The effects resulting from the introduction of an oxime group in place of the distal aromatic ring of the diphenyl moiety of LT175, previously reported as a PPARα/γ dual agonist, have been investigated. This modification allowed the identification of new bioisosteric ligands with fairly good activity on PPARα and fine-tuned, moderate activity on PPARγ. For the most interesting compound, (S)-3, docking studies in PPARα and PPARγ provided a molecular explanation for its different behavior as full and partial agonist of the two receptor isotypes, respectively. A further investigation of this compound was carried out by performing gene expression studies on HepaRG cells. The results obtained allowed us to hypothesize a possible mechanism through which this ligand could be useful in the treatment of metabolic disorders. The higher induction of the expression of some genes, compared to selective agonists, seems to confirm the importance of a dual PPARα/γ activity, which probably involves a synergistic effect on both receptor subtypes.
Abstract:
The report compares and contrasts the automated PASCO method of pavement evaluation with the manual procedures used by the Iowa Department of Transportation (DOT) to evaluate pavement condition. Iowa DOT's use of IJK and BPR roadmeters and manual crack and patch surveys is compared with PASCO's use of 35-mm photography, artificial lighting and hairline projection, tracking wheels, and lasers to measure ride, cracking and patching, rut depths, and roughness. The Iowa DOT method provides a Present Serviceability Index (PSI) value and PASCO provides a Maintenance Control Index (MCI). Seven sections of Interstate highway, county roads and city streets, and one shoulder section were tested with different speeds of data collection, surface types and textures, and stop-and-start conditions. High correlations between the two methods were recorded in the measurement of roughness (0.93 for the tracking wheel and 0.84 for the laser method). The lower correlations for rut depth (0.61) and cracking (0.32) are attributed to PASCO's more comprehensive measurement techniques. A cost analysis of the data provided by both systems indicates that PASCO is capable of providing a comparable result with improved accuracy at a cost of $125-$150 or less per two-lane mile, depending on survey mileage. Improved data collection speed, accuracy, and reliability, and a visible record of pavement condition, are available for comparable costs. The PASCO system's ability to provide the data required by the Highway Pavement Distress Identification Manual, the Pavement Condition Rating Guide, and the Strategic Highway Research Program Long Term Pavement Performance (LTPP) Studies is also outlined in the report.
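The method comparison rests on Pearson correlations between paired section measurements; a minimal sketch of that computation on hypothetical section scores (the values below are illustrative, not the report's data):

```python
def pearson_r(x, y):
    """Pearson correlation between paired measurements from two methods."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

# Hypothetical paired roughness scores: Iowa DOT roadmeter vs PASCO tracking wheel
iowa  = [3.1, 2.4, 3.8, 2.9, 3.5, 2.2, 3.0]
pasco = [3.0, 2.6, 3.9, 2.8, 3.4, 2.3, 3.2]
print(round(pearson_r(iowa, pasco), 2))
```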
Abstract:
Transverse joints are placed in portland cement concrete pavements to control the development of random cracking due to stresses induced by moisture and thermal gradients and restrained slab movement. These joints are strengthened through the use of load transfer devices, typically dowel bars, designed to transfer load across the joint from one pavement slab to the next. Epoxy-coated steel bars are the material of choice at the present time but have experienced some difficulties with resistance to corrosion from deicing salts. The research project investigated alternative materials, dowel sizes, and spacings to determine the benefits and limitations of each material. In this project, two types of fiber composite dowels, stainless steel solid dowels, and epoxy-coated dowels were tested for five years in side-by-side installations in a portion of U.S. 65 near Des Moines, Iowa, between 1997 and 2002. The work was directed at analyzing the load transfer characteristics of 8-in. versus 12-in. spacing of the dowels and of the alternative dowel materials, fiber composite (1.5- and 1.88-in. diameter) and stainless steel (1.5-in. diameter), compared with typical 1.5-in. diameter epoxy-coated steel dowels placed at 12-in. spacing. Data were collected biannually within each series of joints on load transfer in each lane (outer wheel path), visual distress, joint openings, and faulting in each wheel path. After five years of performance, the following observations were made from the data collected. Each of the dowel materials is performing equally well in terms of load transfer, joint movement, and faulting. Stainless steel dowels are providing load transfer performance equal to or greater than that of epoxy-coated steel dowels at the end of five years. Fiber reinforced polymer (FRP) dowels of the sizes and materials tested should be spaced no more than 8 in. apart to achieve performance comparable to epoxy-coated dowels. No evidence of deterioration due to road salts was identified on any of the products tested. The relatively high cost of stainless steel solid and FRP dowels remained a limitation at the conclusion of this study. Work is continuing with the subject materials in laboratory studies to determine the proper shape, spacing, chemical composition, and testing specification to make FRP and stainless (clad or solid) dowels a viable alternative joint load transfer material for long-lasting portland cement concrete pavements.
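The load transfer characteristics reported here are conventionally summarized as a deflection-based load transfer efficiency, measured for example with a falling-weight deflectometer; as a hedged reference (the project's exact metric may differ):

```latex
\mathrm{LTE} = \frac{\delta_{\mathrm{unloaded}}}{\delta_{\mathrm{loaded}}} \times 100\,\%
```

where δ_loaded and δ_unloaded are the slab deflections on the loaded and unloaded sides of the joint; values approaching 100% indicate that the dowels transfer load effectively across the joint.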
Abstract:
BACKGROUND: Meta-analyses are particularly vulnerable to the effects of publication bias. Despite methodologists' best efforts to locate all evidence for a given topic, the most comprehensive searches are likely to miss unpublished studies and studies that are published in the gray literature only. If the results of the missing studies differ systematically from the published ones, a meta-analysis will be biased, with an inaccurate assessment of the intervention's effects. As part of the OPEN project (http://www.open-project.eu) we will conduct a systematic review with the following objectives:
- To assess the impact of studies that are not published or published in the gray literature on pooled effect estimates in meta-analyses (quantitative measure).
- To assess whether the inclusion of unpublished studies or studies published in the gray literature leads to different conclusions in meta-analyses (qualitative measure).
METHODS/DESIGN: Inclusion criteria: methodological research projects on a cohort of meta-analyses which compare the effect of the inclusion or exclusion of unpublished studies or studies published in the gray literature. Literature search: to identify relevant research projects we will conduct electronic searches in Medline, Embase and The Cochrane Library; check reference lists; and contact experts. Outcomes: 1) the extent to which the effect estimate in a meta-analysis changes with the inclusion or exclusion of studies that were not published or published in the gray literature; and 2) the extent to which the inclusion of unpublished studies impacts the meta-analyses' conclusions. Data collection: information will be collected on the area of health care; the number of meta-analyses included in the methodological research project; the number of studies included in the meta-analyses; the number of study participants; the number and type of unpublished studies, studies published in the gray literature, and published studies; the sources used to retrieve studies that are unpublished, published in the gray literature, or commercially published; and the validity of the methodological research project. Data synthesis: data synthesis will involve descriptive and statistical summaries of the findings of the included methodological research projects. DISCUSSION: Results are expected to be publicly available in mid-2013.
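The quantitative objective amounts to comparing pooled estimates with and without the hard-to-find studies; a minimal sketch using a fixed-effect inverse-variance pooled estimate (the effect sizes and variances below are hypothetical, chosen only to show the mechanics):

```python
def pooled_estimate(effects, variances):
    """Fixed-effect inverse-variance pooled effect (e.g., log odds ratios)."""
    weights = [1 / v for v in variances]
    return sum(w * e for w, e in zip(weights, effects)) / sum(weights)

# Hypothetical log odds ratios: five published trials plus one unpublished trial
published = ([-0.40, -0.25, -0.35, -0.10, -0.30], [0.04, 0.06, 0.05, 0.08, 0.05])
unpublished = ([0.05], [0.07])  # near-null result, the typical direction of concern

without = pooled_estimate(*published)
with_all = pooled_estimate(published[0] + unpublished[0],
                           published[1] + unpublished[1])
print(round(without, 3), round(with_all, 3))  # pooled effect shrinks toward null
```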
Abstract:
Several strategies are available to the Iowa Department of Transportation (IaDOT) for limiting deterioration due to chloride-induced corrosion of embedded reinforcing bars in concrete bridge decks. While the method most commonly used throughout the Midwestern United States is to construct concrete bridge decks with fusion-bonded epoxy-coated reinforcing bars, galvanized reinforcing bars are an available alternative. Previous studies of the in situ performance of galvanized reinforcing bars in service in bridge decks have been limited. IaDOT requested that Wiss, Janney, Elstner Associates, Inc. (WJE) perform this study to gain further understanding of the long-term performance of an Iowa bridge deck reinforced with galvanized reinforcing bars. This study characterized the condition of a bridge deck with galvanized reinforcing bars after about 36 years of service and compared that performance to the expected performance of epoxy-coated or uncoated reinforcing bars in similar bridge construction. For this study, IaDOT selected the Iowa State Highway 92 bridge across Drainage Ditch #25 in Louisa County, Iowa (Structure No. 5854.5S092), which was constructed using galvanized reinforcing bars as the main deck reinforcing. The scope of work for this study included: field assessment, testing, and sampling; laboratory testing and analysis; analysis of findings; service life modeling; and preparation of this report. In addition, supplemental observations of the condition of the galvanized reinforcing bars were made during a subsequent project to repair the bridge deck.
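The service life modeling mentioned in the scope of work is not detailed in the abstract; such models commonly rest on chloride ingress described by Fick's second law. A minimal sketch with hypothetical parameters (the cover depth, surface chloride level, and diffusion coefficient below are illustrative, not values from the study):

```python
from math import erf, sqrt

def chloride_at_depth(x_mm, t_years, c_surface, d_mm2_per_year):
    """Chloride concentration at rebar depth via Fick's second law:
    C(x, t) = Cs * (1 - erf(x / (2 * sqrt(D * t)))).
    Corrosion initiation is typically assumed once C exceeds a threshold.
    """
    return c_surface * (1 - erf(x_mm / (2 * sqrt(d_mm2_per_year * t_years))))

# Hypothetical inputs: 60 mm cover, surface chloride 0.6% by mass of concrete,
# apparent diffusion coefficient 20 mm^2/year, 36 years of service
print(round(chloride_at_depth(60, 36, 0.6, 20), 3))
```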
Abstract:
The approach to intervention programs varies depending on the methodological perspective adopted. This means that health professionals lack clear guidelines regarding how best to proceed, and it hinders the accumulation of knowledge. The aim of this paper is to set out the essential and common aspects that should be included in any program evaluation report, thereby providing a useful guide for the professional regardless of the procedural approach used. Furthermore, the paper seeks to integrate the different methodologies and illustrate their complementarity, a key aspect in real intervention contexts, which are constantly changing. The aspects to be included are presented in relation to the main stages of the evaluation process: needs, objectives and design (prior to the intervention), implementation (during the intervention), and outcomes (after the intervention). For each of these stages the paper describes the elements on which decisions should be based, highlighting the role of empirical evidence gathered through the application of instruments to defined samples and according to a given procedure.
Abstract:
Health assessment and medical surveillance of workers exposed to combustion nanoparticles are challenging. The aim was to evaluate the feasibility of using exhaled breath condensate (EBC) from healthy volunteers for (1) assessing the lung-deposited dose of combustion nanoparticles and (2) determining the resulting oxidative stress by measuring hydrogen peroxide (H2O2) and malondialdehyde (MDA). Methods: Fifteen healthy nonsmoker volunteers were exposed to three different levels of sidestream cigarette smoke under controlled conditions. EBC was repeatedly collected before, during, and 1 and 2 hr after exposure. Exposure variables were measured by direct-reading instruments and by active sampling. The EBC samples were analyzed for particle number concentration (light-scattering-based method) and for selected compounds considered oxidative stress markers. Results: Subjects were exposed to an average airborne concentration of up to 4.3×10⁵ particles/cm³ (average geometric size ∼60-80 nm). Up to 10×10⁸ particles/mL could be measured in the collected EBC, with a broad size distribution (50th percentile ∼160 nm), but these biological concentrations were not related to the exposure level of cigarette smoke particles. Although H2O2 and MDA concentrations in EBC increased during exposure, only H2O2 showed a transient normalization 1 hr after exposure before increasing again. In contrast, MDA levels stayed elevated during the 2 hr post exposure. Conclusions: The use of diffusion light scattering for particle counting proved to be sufficiently sensitive to detect objects in EBC, but lacked specificity for carbonaceous tobacco smoke particles. Our results suggest two phases of oxidation markers in EBC: first, the initial deposition of particles and gases in the lung lining liquid, and later the onset of oxidative stress with associated cell membrane damage. Future studies should extend the follow-up time and should remove gases or particles from the air to allow differentiation between the different sources of H2O2 and MDA.
Abstract:
Machine Learning for geospatial data: algorithms, software tools and case studies. The thesis is devoted to the analysis, modeling and visualisation of spatial environmental data using machine learning algorithms. In a broad sense, machine learning can be considered a subfield of artificial intelligence mainly concerned with the development of techniques and algorithms that allow computers to learn from data. In this thesis, machine learning algorithms are adapted to learn from spatial environmental data and to make spatial predictions. Why machine learning? In a few words, most machine learning algorithms are universal, adaptive, nonlinear, robust and efficient modeling tools. They can find solutions to classification, regression and probability density modeling problems in high-dimensional geo-feature spaces composed of geographical coordinates and additional relevant spatially referenced features ("geo-features"). They are well suited to implementation as predictive engines in decision support systems, for purposes of environmental data mining ranging from pattern recognition to modeling and prediction to automatic data mapping. Their efficiency is competitive with that of geostatistical models in low-dimensional geographical spaces, but they are indispensable in high-dimensional geo-feature spaces. The most important and popular machine learning algorithms and models of interest for geo- and environmental sciences are presented in detail, from theoretical description of the concepts to software implementation: the multilayer perceptron (MLP, a workhorse of machine learning), general regression neural networks (GRNN), probabilistic neural networks (PNN), self-organising (Kohonen) maps (SOM), Gaussian mixture models (GMM), radial basis function networks (RBF) and mixture density networks (MDN). This set of models covers machine learning tasks such as classification, regression and density estimation. Exploratory data analysis (EDA) is the initial and a very important part of any data analysis. In this thesis the concepts of exploratory spatial data analysis (ESDA) are considered using both the traditional geostatistical approach, experimental variography, and machine learning. Experimental variography, which studies the relations between pairs of points, is a basic tool for the geostatistical analysis of anisotropic spatial correlations and helps to detect the presence of spatial patterns describable by two-point statistics. A machine learning approach to ESDA is presented through the k-nearest neighbors (k-NN) method, which is simple and has very good interpretation and visualization properties. An important part of the thesis deals with a topical problem, the automatic mapping of geospatial data, for which the general regression neural network is proposed as an efficient model. The performance of the GRNN model is demonstrated on the Spatial Interpolation Comparison (SIC) 2004 data, where it significantly outperformed all other approaches, especially under emergency conditions. The thesis consists of four chapters: theory, applications, software tools and how-to-do-it examples. An important part of the work is a collection of software tools, Machine Learning Office, developed over the last 15 years and used both in many teaching courses, including international workshops in China, France, Italy, Ireland and Switzerland, and in fundamental and applied research projects. The case studies considered cover a wide spectrum of real-life low- and high-dimensional geo- and environmental problems, such as air, soil and water pollution by radionuclides and heavy metals; classification of soil types and hydrogeological units; decision-oriented mapping with uncertainties; and natural hazard (landslides, avalanches) assessment and susceptibility mapping. Complementary tools for exploratory data analysis and visualisation were developed as well, with attention to a user-friendly, easy-to-use interface.
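The GRNN at the center of the automatic-mapping work is, in essence, Nadaraya-Watson kernel regression: the prediction at a query point is a Gaussian-kernel-weighted average of the training targets. A minimal sketch (illustrative only, not the thesis's Machine Learning Office implementation; sigma is the kernel bandwidth, in practice tuned by cross-validation):

```python
import numpy as np

def grnn_predict(X_train, y_train, X_query, sigma):
    """GRNN / Nadaraya-Watson kernel regression:
    prediction = Gaussian-kernel-weighted average of training targets.
    """
    preds = []
    for q in X_query:
        d2 = np.sum((X_train - q) ** 2, axis=1)   # squared distances to all points
        w = np.exp(-d2 / (2 * sigma ** 2))        # Gaussian kernel weights
        preds.append(np.dot(w, y_train) / np.sum(w))
    return np.array(preds)

# Toy spatial interpolation: 2-D coordinates -> measured value
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
y = np.array([1.0, 2.0, 2.0, 3.0])
print(grnn_predict(X, y, np.array([[0.5, 0.5]]), sigma=0.5))  # ~2.0
```

Because the query point is equidistant from all four training points, the prediction reduces to their plain average, which makes the kernel-weighting behavior easy to verify by hand.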
Abstract:
OBJECTIVE: The aim of this study was to assess the implementation process and economic impact of a new pharmaceutical care service provided since 2002 by pharmacists in Swiss nursing homes. SETTING: Forty-two nursing homes located in the canton of Fribourg, Switzerland, under the responsibility of 22 pharmacists. METHOD: We developed different facilitators, such as a monitoring system, a coaching program, and a research project, to help pharmacists change their practice and to improve implementation of this new service. We evaluated the implementation rate of the service delivered in nursing homes. We assessed the economic impact of the service since its start in 2002 using statistical evaluation (Chow test) with retrospective analysis of the annual drug costs per resident over an 8-year period (1998-2005). MAIN OUTCOME MEASURES: The description of the facilitators and their role in the implementation of the service; the economic impact of the service since its start in 2002. RESULTS: In 2005, after a 4-year implementation period supported by the introduction of facilitators of practice change, all 42 nursing homes (2,214 residents) had implemented the pharmaceutical care service. The annual drug costs per resident decreased by about 16.4% between 2002 and 2005, a change that proved highly significant. The performance of the pharmacists continuously improved through a specific coaching program including an annual expert comparative report, working groups, interdisciplinary continuing education symposia, and individual feedback. This research project also identified priorities for developing practice guidelines to prevent drug-related problems in nursing homes, especially in relation to the use of psychotropic drugs. CONCLUSION: The pharmaceutical care service was fully and successfully implemented in Fribourg's nursing homes within a period of 4 years. These findings highlight the importance of facilitators designed to assist pharmacists in implementing practice changes. The economic impact was confirmed on a large scale, and priorities for clinical and pharmacoeconomic research were identified in order to continue improving the quality of integrated care for the elderly.
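The Chow test used to confirm the cost break at the 2002 service start has a standard F statistic comparing one regression fit over the whole series against separate fits on each sub-period; a minimal sketch with hypothetical numbers (not the study's data):

```python
def chow_test(rss_pooled, rss_1, rss_2, n1, n2, k):
    """Chow test F statistic for a structural break between two regimes.

    rss_pooled: residual sum of squares from one regression over all data
    rss_1, rss_2: RSS from separate regressions on each sub-period
    n1, n2: observations per sub-period; k: parameters per regression
    """
    num = (rss_pooled - (rss_1 + rss_2)) / k
    den = (rss_1 + rss_2) / (n1 + n2 - 2 * k)
    return num / den

# Hypothetical annual drug-cost series: pre-2002 vs post-2002 regimes,
# linear trend (k = 2: intercept + slope), 5 and 3 yearly observations
print(round(chow_test(rss_pooled=40.0, rss_1=6.0, rss_2=4.0, n1=5, n2=3, k=2), 2))
```

The resulting F value is compared against an F(k, n1 + n2 - 2k) distribution; a large value indicates that fitting the two periods separately explains the series much better than a single regression, i.e., a structural break.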