326 results for Abstracting and Indexing as Topic
Abstract:
BACKGROUND: Methodological research has found that non-published studies often have different results than published ones, a phenomenon known as publication bias. When results are not published, or are published selectively based on the direction or strength of the findings, healthcare professionals and consumers of healthcare cannot base their decision-making on the full body of current evidence. METHODS: As part of the OPEN project (http://www.open-project.eu) we will conduct a systematic review with the following objectives:
1. To determine the proportion and/or rate of non-publication of studies by systematically reviewing methodological research projects that followed up a cohort of studies that (a) received research ethics committee (REC) approval, (b) were registered in trial registries, or (c) were presented as abstracts at conferences.
2. To assess the association of study characteristics (for example, direction and/or strength of findings) with the likelihood of full publication.
To identify reports of relevant methodological research projects we will conduct electronic database searches, check reference lists, and contact experts. Published and unpublished projects will be included. The inclusion criteria are as follows:
a. RECs: methodological research projects that examined the subsequent proportion and/or rate of publication of studies that received approval from RECs;
b. Trial registries: methodological research projects that examine the subsequent proportion and/or rate of publication of studies registered in trial registries;
c. Conference abstracts: methodological research projects that examine the subsequent proportion and/or rate of full publication of studies initially presented at conferences as abstracts.
Primary outcomes: proportion/rate of published studies; time to full publication (mean/median; cumulative publication rate over time).
Secondary outcomes: association of study characteristics with full publication.
The three questions (a, b, and c) will be investigated separately. Data synthesis will involve a combination of descriptive and statistical summaries of the included methodological research projects. DISCUSSION: Results are expected to be publicly available in mid-2013.
Abstract:
"The vulnerable are those whose autonomy, dignity and integrity are capable of being threatened". Based on this ethical definition of vulnerability, four risk factors of vulnerability might be identified among elderly persons, and are described in this article: the functional limitation, the loss of autonomy, the social precariousness and the restriction of access to medical care. A clinical case of elderly abuse is presented to illustrate vulnerability. Finally, some recommendations to lower the risk of vulnerability in elderly persons are proposed.
Abstract:
Soluble MHC-peptide complexes, commonly known as tetramers, allow the detection and isolation of antigen-specific T cells. Although other types of soluble MHC-peptide complexes have been introduced, the most commonly used MHC class I staining reagents are those originally described by Altman and Davis. As these reagents have become an essential tool for T cell analysis, it is important to have a large repertoire of such reagents to cover a broad range of applications in cancer research and clinical trials. Our tetramer collection currently comprises 228 human and 60 mouse tetramers, and new reagents are continuously being added. For the MHC II tetramers, the list currently contains 21 human (HLA-DR, DQ and DP) and 5 mouse (I-A(b)) tetramers. Quantitative enumeration of antigen-specific T cells by tetramer staining, especially at low frequencies, critically depends on the quality of the tetramers and on the staining procedures. For conclusive longitudinal monitoring, standardized reagents and analysis protocols need to be used. This is especially true for the monitoring of antigen-specific CD4+ T cells, as there are large variations in the quality of MHC II tetramers and staining conditions. This commentary provides an overview of our tetramer collection and guidance on how tetramers should be used to obtain optimal results.
Abstract:
In the healthcare debate, it is often stated that better quality leads to savings. Quality systems, however, entail additional costs for set-up, operation and external evaluation, and the suppression of implicit rationing adds further costs. On the other hand, they generate savings through simplification of procedures, improvement of patients' health state and quicker integration of new collaborators. It is therefore logical to imagine that financial incentives could improve quality. First evidence from pay-for-performance initiatives shows a positive impact but also some limitations. Quality and savings are linked together and require all our attention.
Abstract:
Searching for matches between large collections of short (14-30 nucleotides) words and sequence databases comprising full genomes or transcriptomes is a common task in biological sequence analysis. We investigated the performance of simple indexing strategies for handling such tasks and developed two programs, fetchGWI and tagger, that index either the database or the query set. Either strategy outperforms megablast for searches with more than 10,000 probes. FetchGWI is shown to be a versatile tool for rapidly searching multiple genomes, whose performance is limited in most cases by the speed of access to the filesystem. We have made publicly available a Web interface for searching the human, mouse, and several other genomes and transcriptomes with oligonucleotide queries.
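The database-indexing strategy the abstract describes (precomputing the positions of every fixed-length word in the target sequence, then looking queries up in constant time) can be sketched as follows. This is a minimal, hypothetical illustration, not the actual fetchGWI or tagger implementation; the function names and parameters are invented:

```python
from collections import defaultdict

def build_index(sequence, word_length):
    """Map every fixed-length word in the sequence to its start positions."""
    index = defaultdict(list)
    for i in range(len(sequence) - word_length + 1):
        index[sequence[i:i + word_length]].append(i)
    return index

def search(index, queries):
    """Look up each query word in the precomputed index."""
    return {q: index.get(q, []) for q in queries}

genome = "ACGTACGTGACGT"
idx = build_index(genome, 4)
hits = search(idx, ["ACGT", "GACG", "TTTT"])
# "ACGT" occurs at positions 0, 4 and 9; "TTTT" is absent
```

Once the index is built, each probe lookup is a single dictionary access, which is why such strategies pay off when the query set is large (the abstract's threshold of roughly 10,000 probes against megablast).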
Abstract:
The efficacy and safety of anti-infective treatments are associated with the blood drug concentration profile, which depends directly on adjusting the dose to the individual patient's condition. Dose adjustments for renal function recommended in reference books are often imprecise and infrequently applied in clinical practice. The recent generalisation of the KDOQI (Kidney Disease Outcome Quality Initiative) staging of chronic renal impairment represents an opportunity to review and refine dosing recommendations for patients with renal insufficiency. The literature was reviewed and compared to a predictive model of the fraction of drug cleared by the kidney based on Dettli's principle. Revised drug dosing recommendations integrating these predictive parameters are proposed.
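Dettli's principle assumes that a drug's renal clearance falls in proportion to creatinine clearance. One common formulation of the resulting dose reduction can be sketched as below; this is a simplified illustration for a single assumed reference value (100 mL/min), not a clinical dosing tool, and the function name is hypothetical:

```python
def dettli_dose_fraction(fe, clcr_patient, clcr_normal=100.0):
    """Fraction of the usual dose under Dettli's proportionality principle.

    fe: fraction of the drug eliminated unchanged by the kidney (0-1).
    clcr_patient, clcr_normal: creatinine clearance in mL/min.
    """
    renal_function = clcr_patient / clcr_normal
    # Only the renally cleared fraction fe is scaled down with renal function
    return 1.0 - fe * (1.0 - renal_function)

# A drug 80% renally cleared, in a patient with CLcr of 30 mL/min:
fraction = dettli_dose_fraction(fe=0.8, clcr_patient=30.0)
# 1 - 0.8 * (1 - 0.3) = 0.44, i.e. 44% of the usual dose
```

Note that a drug with fe = 0 (entirely non-renal elimination) needs no adjustment under this model, which matches the formula returning 1.0.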
Abstract:
BACKGROUND: Meta-analyses are particularly vulnerable to the effects of publication bias. Despite methodologists' best efforts to locate all evidence for a given topic, even the most comprehensive searches are likely to miss unpublished studies and studies published only in the gray literature. If the results of the missing studies differ systematically from the published ones, a meta-analysis will be biased, with an inaccurate assessment of the intervention's effects. As part of the OPEN project (http://www.open-project.eu) we will conduct a systematic review with the following objectives:
- To assess the impact of studies that are not published, or are published only in the gray literature, on pooled effect estimates in meta-analyses (quantitative measure).
- To assess whether the inclusion of unpublished studies or studies published in the gray literature leads to different conclusions in meta-analyses (qualitative measure).
METHODS/DESIGN: Inclusion criteria: methodological research projects on a cohort of meta-analyses that compare the effect of including or excluding unpublished studies or studies published in the gray literature.
Literature search: to identify relevant research projects we will conduct electronic searches in Medline, Embase and The Cochrane Library; check reference lists; and contact experts.
Outcomes: 1) the extent to which the effect estimate in a meta-analysis changes with the inclusion or exclusion of studies that were not published or were published in the gray literature; and 2) the extent to which the inclusion of unpublished studies affects the meta-analyses' conclusions.
Data collection: information will be collected on the area of health care; the number of meta-analyses included in the methodological research project; the number of studies included in the meta-analyses; the number of study participants; the number and type of unpublished studies, studies published in the gray literature, and published studies; the sources used to retrieve studies that are unpublished, published in the gray literature, or commercially published; and the validity of the methodological research project.
Data synthesis: data synthesis will involve descriptive and statistical summaries of the findings of the included methodological research projects. DISCUSSION: Results are expected to be publicly available in mid-2013.
Abstract:
Patients with type 2 diabetes mellitus exhibit a marked increase in cardiovascular and renal risk. A number of interventional trials have shown that these patients benefit greatly from aggressive BP lowering, especially when the drug regimen comprises an inhibitor of the renin-angiotensin system. The results of the placebo-controlled ADVANCE (Action in Diabetes and Vascular disease: PreterAx and DiamicroN MR Controlled Evaluation) trial, conducted in patients with type 2 diabetes, are exemplary in this respect. The systematic use of a fixed-dose combination containing the ACE inhibitor perindopril and the diuretic indapamide afforded substantial protection against cardiovascular mortality and myocardial infarction, while providing important renoprotection, reducing the development of micro- and macroalbuminuria, and allowing regression of nephropathy. The beneficial effects were obtained regardless of baseline BP and whether or not the patients were receiving antihypertensive therapy.
Abstract:
BACKGROUND: The risk of osteoporosis and fracture influences the selection of adjuvant endocrine therapy. We analyzed bone mineral density (BMD) in Swiss patients of the Breast International Group (BIG) 1-98 trial [treatment arms: A, tamoxifen (T) for 5 years; B, letrozole (L) for 5 years; C, 2 years of T followed by 3 years of L; D, 2 years of L followed by 3 years of T]. PATIENTS AND METHODS: Dual-energy X-ray absorptiometry (DXA) results were retrospectively collected. Patients without DXA served as control group. Repeated measures models using covariance structures allowing for different times between DXA were used to estimate changes in BMD. Prospectively defined covariates were considered as fixed effects in the multivariable models. RESULTS: Two hundred and sixty-one of 546 patients had one or more DXA with 577 lumbar and 550 hip measurements. Weight, height, prior hormone replacement therapy, and hysterectomy were positively correlated with BMD; the correlation was negative for letrozole arms (B/C/D versus A), known osteoporosis, time on trial, age, chemotherapy, and smoking. Treatment did not influence the occurrence of osteoporosis (T score < -2.5 standard deviation). CONCLUSIONS: All aromatase inhibitor regimens reduced BMD. The sequential schedules were as detrimental for bone density as L monotherapy.
Abstract:
Urinary magnesium and pH are known to modulate urinary calcium excretion, but the mechanisms underlying these relationships are unknown. In this study, the data from 17 clinical trials in which urinary magnesium and pH were pharmacologically manipulated were analyzed, and it was found that the change in urinary calcium excretion is directly proportional to the change in magnesium excretion and inversely proportional to the change in urine pH; a regression equation was generated to relate these variables (R(2) = 0.58). For further exploration of these relationships, intravenous calcium chloride, magnesium chloride, or vehicle was administered to rats. Magnesium infusion significantly increased urinary calcium excretion (normalized to urinary creatinine), but calcium infusion did not affect magnesium excretion. Parathyroidectomy did not prevent this magnesium-induced hypercalciuria. The effect of magnesium loading on calciuria was still observed after treatment with furosemide, which disrupts calcium and magnesium absorption in the thick ascending limb, suggesting that the effect may be mediated by the distal nephron. The calcium channel TRPV5, normally present in the distal tubule, was expressed in Xenopus oocytes. Calcium uptake by TRPV5 was directly inhibited by magnesium and low pH. In summary, these data are compatible with the hypothesis that urinary magnesium directly inhibits renal calcium absorption, which can be negated by high luminal pH, and that this regulation likely takes place in the distal tubule.
Abstract:
BACKGROUND: People with neurological disease have a much higher risk of both faecal incontinence and constipation than the general population. There is often a fine line between the two conditions, with any management intended to ameliorate one risking precipitating the other. Bowel problems are observed to be the cause of much anxiety and may reduce quality of life in these people. Current bowel management is largely empirical, with a limited research base. OBJECTIVES: To determine the effects of management strategies for faecal incontinence and constipation in people with neurological diseases affecting the central nervous system. SEARCH STRATEGY: We searched the Cochrane Incontinence Group Specialised Trials Register (searched 26 January 2005), the Cochrane Central Register of Controlled Trials (Issue 2, 2005), MEDLINE (January 1966 to May 2005), EMBASE (January 1998 to May 2005) and all reference lists of relevant articles. SELECTION CRITERIA: All randomised or quasi-randomised trials evaluating any type of conservative or surgical measure for the management of faecal incontinence and constipation in people with neurological diseases were selected. Specific therapies for the treatment of neurological diseases that indirectly affect bowel dysfunction were also considered. DATA COLLECTION AND ANALYSIS: Two reviewers assessed the methodological quality of eligible trials, and two reviewers independently extracted data from included trials using a range of pre-specified outcome measures. MAIN RESULTS: Ten trials were identified by the search strategy; most were small and of poor quality. Oral medications for constipation were the subject of four trials. Cisapride does not seem to have clinically useful effects in people with spinal cord injuries (three trials). Psyllium was associated with increased stool frequency in people with Parkinson's disease but did not alter colonic transit time (one trial).
Prucalopride, an enterokinetic, did not demonstrate obvious benefits in this patient group (one study). Some rectal preparations to initiate defaecation produced faster results than others (one trial). Different time schedules for administration of rectal medication may produce different bowel responses (one trial). Mechanical evacuation may be more effective than oral or rectal medication (one trial). There appears to be a benefit to patients from one-off educational interventions delivered by nurses. The clinical significance of any of these results is difficult to interpret. AUTHORS' CONCLUSIONS: There is still remarkably little research on this common and, to patients, very significant condition. It is not possible to draw any recommendation for bowel care in people with neurological diseases from the trials included in this review. Bowel management for these people must remain empirical until well-designed controlled trials with adequate numbers and clinically relevant outcome measures become available.
Genetic variation in GIPR influences the glucose and insulin responses to an oral glucose challenge.
Abstract:
Glucose levels 2 h after an oral glucose challenge are a clinical measure of glucose tolerance used in the diagnosis of type 2 diabetes. We report a meta-analysis of nine genome-wide association studies (n = 15,234 nondiabetic individuals) and a follow-up of 29 independent loci (n = 6,958-30,620). We identify variants at the GIPR locus associated with 2-h glucose level (rs10423928, beta (s.e.m.) = 0.09 (0.01) mmol/l per A allele, P = 2.0 x 10(-15)). The GIPR A-allele carriers also showed decreased insulin secretion (n = 22,492; insulinogenic index, P = 1.0 x 10(-17); ratio of insulin to glucose area under the curve, P = 1.3 x 10(-16)) and diminished incretin effect (n = 804; P = 4.3 x 10(-4)). We also identified variants at ADCY5 (rs2877716, P = 4.2 x 10(-16)), VPS13C (rs17271305, P = 4.1 x 10(-8)), GCKR (rs1260326, P = 7.1 x 10(-11)) and TCF7L2 (rs7903146, P = 4.2 x 10(-10)) associated with 2-h glucose. Of the three newly implicated loci (GIPR, ADCY5 and VPS13C), only ADCY5 was found to be associated with type 2 diabetes in collaborating studies (n = 35,869 cases, 89,798 controls, OR = 1.12, 95% CI 1.09-1.15, P = 4.8 x 10(-18)).
Abstract:
Résumé (translated from French): This thesis is devoted to the analysis, modeling and visualisation of spatially referenced environmental data using machine learning algorithms. Machine learning can broadly be considered a subfield of artificial intelligence concerned in particular with the development of techniques and algorithms that allow a machine to learn from data. In this thesis, machine learning algorithms are adapted for application to environmental data and spatial prediction. Why machine learning? Because most machine learning algorithms are universal, adaptive, non-linear, robust and efficient modeling tools. They can solve classification, regression and probability density modeling problems in high-dimensional spaces composed of spatially referenced informative variables ("geo-features") in addition to geographical coordinates. Moreover, they are well suited to implementation as decision-support tools for environmental questions ranging from pattern recognition to modeling and prediction, including automatic mapping. Their efficiency is comparable to geostatistical models in the space of geographical coordinates, but they are indispensable for high-dimensional data that include geo-features. The most important and popular machine learning algorithms are presented theoretically and implemented as software tools for the environmental sciences. The main algorithms described are the multilayer perceptron (MLP), the best-known algorithm in artificial intelligence; general regression neural networks (GRNN); probabilistic neural networks (PNN); self-organising maps (SOM); Gaussian mixture models (GMM); radial basis function networks (RBF); and mixture density networks (MDN). This range of algorithms covers varied tasks such as classification, regression and probability density estimation. Exploratory data analysis (EDA) is the first step of any data analysis. In this thesis the concepts of exploratory spatial data analysis (ESDA) are treated both through the traditional geostatistical approach, with experimental variography, and according to the principles of machine learning. Experimental variography, which studies the relations between pairs of points, is a basic tool for the geostatistical analysis of anisotropic spatial correlations and makes it possible to detect spatial patterns describable by a two-point statistic. The machine learning approach to ESDA is presented through the k-nearest-neighbours method, which is very simple and has excellent interpretation and visualisation properties. An important part of the thesis deals with topical subjects such as the automatic mapping of spatial data. The general regression neural network is proposed to solve this task efficiently. The performance of the GRNN is demonstrated on the Spatial Interpolation Comparison (SIC) 2004 data, on which the GRNN significantly outperformed all other methods, particularly in emergency situations. The thesis consists of four chapters: theory, applications, software tools and guided examples. An important part of the work is a collection of software tools, Machine Learning Office, developed over the last 15 years and used both for teaching many courses, including international workshops in China, France, Italy, Ireland and Switzerland, and in fundamental and applied research projects. The case studies considered cover a wide spectrum of real low- and high-dimensional geo-environmental problems, such as air, soil and water pollution by radionuclides and heavy metals, classification of soil types and hydrogeological units, uncertainty mapping for decision support, and natural hazard (landslides, avalanches) assessment. Complementary tools for exploratory data analysis and visualisation were also developed, with care taken to provide a user-friendly, easy-to-use interface.
Machine Learning for geospatial data: algorithms, software tools and case studies. Abstract: The thesis is devoted to the analysis, modeling and visualisation of spatial environmental data using machine learning algorithms. In a broad sense, machine learning can be considered a subfield of artificial intelligence, mainly concerned with the development of techniques and algorithms that allow computers to learn from data. In this thesis, machine learning algorithms are adapted to learn from spatial environmental data and to make spatial predictions. Why machine learning? In a few words, most machine learning algorithms are universal, adaptive, nonlinear, robust and efficient modeling tools. They can solve classification, regression and probability density modeling problems in high-dimensional geo-feature spaces composed of geographical space and additional relevant spatially referenced features. They are well suited to implementation as predictive engines in decision support systems, for purposes of environmental data mining including pattern recognition, modeling and prediction, as well as automatic data mapping. Their efficiency is competitive with geostatistical models in low-dimensional geographical spaces, but they are indispensable in high-dimensional geo-feature spaces. The most important and popular machine learning algorithms and models of interest for geo- and environmental sciences are presented in detail, from a theoretical description of the concepts to the software implementation. The main algorithms and models considered are the following: the multi-layer perceptron (a workhorse of machine learning), general regression neural networks, probabilistic neural networks, self-organising (Kohonen) maps, Gaussian mixture models, radial basis function networks, and mixture density networks. This set of models covers machine learning tasks such as classification, regression and density estimation. Exploratory data analysis (EDA) is an initial and very important part of data analysis. In this thesis the concepts of exploratory spatial data analysis (ESDA) are considered using both the traditional geostatistical approach, namely experimental variography, and machine learning. Experimental variography is a basic tool for the geostatistical analysis of anisotropic spatial correlations, which helps to detect the presence of spatial patterns, at least as described by two-point statistics.
A machine learning approach to ESDA is presented by applying the k-nearest-neighbours (k-NN) method, which is simple and has very good interpretation and visualisation properties. An important part of the thesis deals with a currently hot topic: the automatic mapping of geospatial data. The general regression neural network (GRNN) is proposed as an efficient model for this task. The performance of the GRNN model is demonstrated on the Spatial Interpolation Comparison (SIC) 2004 data, where it significantly outperformed all other approaches, especially under emergency conditions. The thesis consists of four chapters, structured as theory, applications, software tools, and how-to-do-it examples. An important part of the work is a collection of software tools, Machine Learning Office, developed over the last 15 years and used both for many teaching courses, including international workshops in China, France, Italy, Ireland and Switzerland, and for fundamental and applied research projects. The case studies considered cover a wide spectrum of real-life low- and high-dimensional geo- and environmental problems, such as air, soil and water pollution by radionuclides and heavy metals, classification of soil types and hydrogeological units, decision-oriented mapping with uncertainties, and natural hazard (landslides, avalanches) assessment and susceptibility mapping. Complementary tools for exploratory data analysis and visualisation were developed as well. The software is user friendly and easy to use.
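The GRNN used for automatic mapping is, in essence, a Gaussian-kernel-weighted average of the training targets (the Nadaraya-Watson estimator). A minimal one-dimensional sketch, with hypothetical data and bandwidth, is:

```python
import math

def grnn_predict(x_train, y_train, x_query, sigma):
    """General regression neural network: a Gaussian-kernel-weighted
    average of the training targets (Nadaraya-Watson estimator)."""
    # One Gaussian weight per training point, centred on the query
    weights = [math.exp(-((x_query - x) ** 2) / (2.0 * sigma ** 2))
               for x in x_train]
    total = sum(weights)
    return sum(w * y for w, y in zip(weights, y_train)) / total

# Hypothetical measurements: the value doubles with the coordinate
xs = [0.0, 1.0, 2.0, 3.0]
ys = [0.0, 2.0, 4.0, 6.0]
estimate = grnn_predict(xs, ys, x_query=1.5, sigma=0.5)
# The query sits symmetrically between the training points,
# so the estimate is exactly midway between y=2 and y=4, i.e. 3
```

The only free parameter is the kernel bandwidth sigma, which is one reason the model lends itself to automatic mapping: training reduces to tuning a single smoothing parameter, e.g. by cross-validation.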
Abstract:
OBJECTIVE: The aim of this study was to assess the implementation process and economic impact of a new pharmaceutical care service provided since 2002 by pharmacists in Swiss nursing homes. SETTING: The setting was 42 nursing homes located in the canton of Fribourg, Switzerland under the responsibility of 22 pharmacists. METHOD: We developed different facilitators, such as a monitoring system, a coaching program, and a research project, to help pharmacists change their practice and to improve implementation of this new service. We evaluated the implementation rate of the service delivered in nursing homes. We assessed the economic impact of the service since its start in 2002 using statistical evaluation (Chow test) with retrospective analysis of the annual drug costs per resident over an 8-year period (1998-2005). MAIN OUTCOME MEASURES: The description of the facilitators and their implications in implementation of the service; the economic impact of the service since its start in 2002. RESULTS: In 2005, after a 4-year implementation period supported by the introduction of facilitators of practice change, all 42 nursing homes (2,214 residents) had implemented the pharmaceutical care service. The annual drug costs per resident decreased by about 16.4% between 2002 and 2005; this change proved to be highly significant. The performance of the pharmacists continuously improved using a specific coaching program including an annual expert comparative report, working groups, interdisciplinary continuing education symposia, and individual feedback. This research project also determined priorities to develop practice guidelines to prevent drug-related problems in nursing homes, especially in relation to the use of psychotropic drugs. CONCLUSION: The pharmaceutical care service was fully and successfully implemented in Fribourg's nursing homes within a period of 4 years. 
These findings highlight the importance of facilitators designed to assist pharmacists in the implementation of practice changes. The economic impact was confirmed on a large scale, and priorities for clinical and pharmacoeconomic research were identified in order to continue to improve the quality of integrated care for the elderly.
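The structural-break evaluation described above (a Chow test on annual drug costs before and after the 2002 service introduction) can be sketched as follows; the cost figures and break point below are hypothetical illustrations, not the study's data:

```python
import numpy as np

def chow_test(x, y, break_idx):
    """Chow F-statistic for a structural break in a simple linear
    regression y = a + b*x at position break_idx."""
    def rss(xs, ys):
        # Residual sum of squares of an OLS straight-line fit
        coeffs = np.polyfit(xs, ys, 1)
        resid = ys - np.polyval(coeffs, xs)
        return float(resid @ resid)

    k = 2  # parameters per regression (intercept and slope)
    rss_pooled = rss(x, y)                       # one line for all data
    rss_split = rss(x[:break_idx], y[:break_idx]) + \
                rss(x[break_idx:], y[break_idx:])  # one line per segment
    n = len(x)
    numerator = (rss_pooled - rss_split) / k
    denominator = rss_split / (n - 2 * k)
    return numerator / denominator

# Hypothetical annual costs per resident: rising trend, then a drop
years = np.arange(1998, 2006, dtype=float)
costs = np.array([100, 105, 107, 113, 96, 92, 92, 88], dtype=float)
f_stat = chow_test(years, costs, break_idx=4)
# A large F indicates the pre- and post-2002 regressions differ
```

The F-statistic is then compared against an F(k, n - 2k) distribution to decide whether the break (here, the service introduction) produced a significant change in the cost trend.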
Abstract:
An implantable cardiac defibrillator (ICD) is a cardiac implantable electronic device that is capable of identifying and treating ventricular arrhythmias. Considerations about the type of ICD to select for a given patient include whether the patient has bradycardia requiring pacing support, has associated atrial tachyarrhythmias, or would benefit from cardiac resynchronization therapy. The ICD functions by continuously monitoring the patient's cardiac rate and delivering therapies (anti-tachycardia pacing, shocks) when the rate exceeds the programmed rate "cutoff". Secondary prevention trials have demonstrated that ICDs reduce the incidence of arrhythmic death and total mortality in patients presenting with a cardiac arrest. ICDs are also indicated for primary prevention of sudden cardiac death in specific high-risk subgroups of patients.