149 results for least common subgraph algorithm

at Université de Lausanne, Switzerland


Relevance: 100.00%

Abstract:

Contexte: la planification infirmière de sortie des personnes âgées est une composante importante des soins pour assurer une transition optimale entre l'hôpital et la maison. Beaucoup d'événements indésirables peuvent survenir après la sortie de l'hôpital. Dans une perspective de système de santé, les facteurs qui augmentent ce risque incluent un nombre croissant de patients âgés, l'augmentation de la complexité des soins nécessitant une meilleure coordination des soins après la sortie, ainsi qu'une augmentation de la pression financière. Objectif: évaluer si les interventions infirmières liées à la planification de sortie chez les personnes âgées et leurs proches aidants sont prédictives de leur perception d'être prêts pour le départ, du niveau d'anxiété du patient le jour de la sortie de l'hôpital et du nombre de recours non programmé aux services de santé durant les trente jours après la sortie. Méthode: le devis est prédictif corrélationnel avec un échantillon de convenance de 235 patients. Les patients âgés de 65 ans et plus de quatre unités d'hôpitaux dans le canton de Vaud en Suisse ont été recrutés entre novembre 2011 et octobre 2012. Les types et les niveaux d'interventions infirmières ont été extraits des dossiers de soins et analysés selon les composantes du modèle de Naylor. La perception d'être prêt pour la sortie et l'anxiété ont été mesurées un jour avant la sortie en utilisant l'échelle de perception d'être prêt pour la sortie et l'échelle Hospital Anxiety and Depression. Un mois après la sortie, un entretien téléphonique a été mené pour évaluer le recours non programmé aux services de santé durant cette période. Des analyses descriptives et un modèle randomisé à deux niveaux ont été utilisés pour analyser les données. Résultats: peu de patients ont reçu une planification globale de sortie. L'intervention la plus fréquente était la coordination (M = 55,0/100) et la moins fréquente était la participation du patient à la planification de sortie (M = 16,1/100). Contrairement aux hypothèses formulées, les patients ayant bénéficié d'un plus grand nombre d'interventions infirmières de préparation à la sortie ont un niveau moins élevé de perception d'être prêt pour le départ (B = -0,3, p < 0,05, IC 95% [-0,57, -0,11]); le niveau d'anxiété n'est pas associé à la planification de sortie (r = -0,21, p < 0,01) et la présence de troubles cognitifs est le seul facteur prédictif d'une réhospitalisation dans les 30 jours après la sortie de l'hôpital (OR = 1,50, p = 0,04, IC 95% [1,02, 2,22]). Discussion: en se focalisant sur chaque intervention de la planification de sortie, cette étude permet une meilleure compréhension du processus de soins infirmiers actuellement en cours dans les hôpitaux vaudois. Elle met en lumière les lacunes entre les pratiques actuelles et celles de pratiques exemplaires, donnant ainsi une orientation pour des changements dans la pratique clinique et des recherches ultérieures. - Background: Nursing discharge planning in elderly patients is an important component of care to ensure optimal transition from hospital to home. Many adverse events may occur after hospital discharge. From a health care system perspective, contributing factors that increase the risk of these adverse events include a growing number of elderly patients, increased complexity of care requiring better care coordination after discharge, as well as increased financial pressure.
Aim: To investigate whether older medical inpatients who receive comprehensive discharge planning interventions a) feel more ready for hospital discharge, b) have reduced anxiety at the time of discharge, and c) have lower health care utilization after discharge, compared to those who receive less comprehensive interventions. Methods: Using a predictive correlational design, a convenience sample of 235 patients was recruited. Patients aged 65 and older from four hospital units in the canton of Vaud in Switzerland were enrolled between November 2011 and October 2012. Types and levels of interventions were extracted from the medical charts and analyzed according to the components of Naylor's model. Discharge readiness and anxiety were measured one day before discharge using the Readiness for Hospital Discharge Scale and the Hospital Anxiety and Depression Scale. A telephone interview was conducted one month after hospital discharge to assess unplanned health services utilization during this follow-up period. Descriptive analyses and a two-level random model were used for statistical analyses. Results: Few patients received comprehensive discharge planning interventions. The most frequent intervention was coordination (M = 55.0/100) and the least common was patient participation in discharge planning (M = 16.1/100). Contrary to our hypotheses, patients who received more nursing discharge interventions were significantly less ready to go home (B = -0.3, p < 0.05, 95% CI [-0.57, -0.11]); their anxiety level was not associated with their readiness for hospital discharge (r = -0.21, p < 0.01) and cognitive impairment was the only factor that predicted rehospitalization within 30 days after discharge (OR = 1.50, p = 0.04, 95% CI [1.02, 2.22]). Discussion: By focusing on each component of discharge planning, this study provides greater and more detailed insight into the usual nursing process currently performed in medical inpatient units. Results identified several gaps between current and best practices, providing guidance for changes in clinical practice and further research.
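
The "two-level random model" mentioned above corresponds to a linear mixed-effects model with patients nested in hospital units. A minimal sketch of such a model using statsmodels is shown below; the synthetic data and the variable names (readiness, n_interventions, unit) are illustrative placeholders, not the study's dataset.

```python
# Minimal sketch of a two-level (random-intercept) model: patients nested in
# hospital units. Synthetic stand-in data; variable names are illustrative.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 235
df = pd.DataFrame({
    "unit": rng.integers(0, 4, size=n),              # 4 hospital units
    "n_interventions": rng.integers(0, 10, size=n),  # discharge-planning interventions
    "age": rng.integers(65, 95, size=n),
})
df["readiness"] = 7 - 0.3 * df["n_interventions"] + rng.normal(scale=1.5, size=n)

# Fixed effects: interventions and age; random intercept per hospital unit.
result = smf.mixedlm("readiness ~ n_interventions + age", data=df, groups=df["unit"]).fit()
print(result.summary())  # the n_interventions coefficient plays the role of B above
```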

Relevance: 100.00%

Abstract:

Résumé : La radiothérapie par modulation d'intensité (IMRT) est une technique de traitement qui utilise des faisceaux dont la fluence de rayonnement est modulée. L'IMRT, largement utilisée dans les pays industrialisés, permet d'atteindre une meilleure homogénéité de la dose à l'intérieur du volume cible et de réduire la dose aux organes à risque. Une méthode usuelle pour réaliser pratiquement la modulation des faisceaux est de sommer de petits faisceaux (segments) qui ont la même incidence. Cette technique est appelée IMRT step-and-shoot. Dans le contexte clinique, il est nécessaire de vérifier les plans de traitement des patients avant la première irradiation. Cette question n'est toujours pas résolue de manière satisfaisante. En effet, un calcul indépendant des unités moniteur (représentatif de la pondération de chaque segment) ne peut pas être réalisé pour les traitements IMRT step-and-shoot, car les poids des segments ne sont pas connus a priori, mais calculés au moment de la planification inverse. Par ailleurs, la vérification des plans de traitement par comparaison avec des mesures prend du temps et ne restitue pas la géométrie exacte du traitement. Dans ce travail, une méthode indépendante de calcul des plans de traitement IMRT step-and-shoot est décrite. Cette méthode est basée sur le code Monte Carlo EGSnrc/BEAMnrc, dont la modélisation de la tête de l'accélérateur linéaire a été validée dans une large gamme de situations. Les segments d'un plan de traitement IMRT sont simulés individuellement dans la géométrie exacte du traitement. Ensuite, les distributions de dose sont converties en dose absorbée dans l'eau par unité moniteur. La dose totale du traitement dans chaque élément de volume du patient (voxel) peut être exprimée comme une équation matricielle linéaire des unités moniteur et de la dose par unité moniteur de chacun des faisceaux. La résolution de cette équation est effectuée par l'inversion d'une matrice à l'aide de l'algorithme dit Non-Negative Least Squares fit (NNLS). L'ensemble des voxels contenus dans le volume patient ne pouvant être utilisés dans le calcul pour des raisons de limitations informatiques, plusieurs possibilités de sélection ont été testées. Le meilleur choix consiste à utiliser les voxels contenus dans le Volume Cible de Planification (PTV). La méthode proposée dans ce travail a été testée avec huit cas cliniques représentatifs des traitements habituels de radiothérapie. Les unités moniteur obtenues conduisent à des distributions de dose globale cliniquement équivalentes à celles issues du logiciel de planification des traitements. Ainsi, cette méthode indépendante de calcul des unités moniteur pour l'IMRT step-and-shoot est validée pour une utilisation clinique. Par analogie, il serait possible d'envisager d'appliquer une méthode similaire pour d'autres modalités de traitement comme par exemple la tomothérapie. Abstract : Intensity Modulated RadioTherapy (IMRT) is a treatment technique that uses modulated beam fluence. IMRT is now widespread in industrialised countries, due to its improvement of dose conformation around the target volume, and its ability to lower doses to organs at risk in complex clinical cases. One way to carry out beam modulation is to sum smaller beams (beamlets) with the same incidence. This technique is called step-and-shoot IMRT. In a clinical context, it is necessary to verify treatment plans before the first irradiation. IMRT plan verification is still an issue for this technique.
Independent monitor unit calculation (representative of the weight of each beamlet) cannot be performed for step-and-shoot IMRT, because beamlet weights are not known a priori, but calculated by inverse planning. Besides, treatment plan verification by comparison with measured data is time consuming and performed in a simple geometry, usually in a cubic water phantom with all machine angles set to zero. In this work, an independent method for monitor unit calculation for step-and-shoot IMRT is described. This method is based on the Monte Carlo code EGSnrc/BEAMnrc. The Monte Carlo model of the head of the linear accelerator is validated by comparison of simulated and measured dose distributions in a large range of situations. The beamlets of an IMRT treatment plan are calculated individually by Monte Carlo, in the exact geometry of the treatment. Then, the dose distributions of the beamlets are converted into absorbed dose to water per monitor unit. The dose of the whole treatment in each volume element (voxel) can be expressed through a linear matrix equation of the monitor units and dose per monitor unit of every beamlet. This equation is solved by a Non-Negative Least Squares fit (NNLS) algorithm. However, not all voxels inside the patient volume can be used to solve this equation, because of computational limitations. Several ways of voxel selection have been tested and the best choice consists of using voxels inside the Planning Target Volume (PTV). The method presented in this work was tested with eight clinical cases, which were representative of usual radiotherapy treatments. The monitor units obtained lead to global dose distributions that are clinically equivalent to those of the treatment planning system. Thus, this independent monitor unit calculation method for step-and-shoot IMRT is validated and can therefore be used in clinical routine. It would be possible to consider applying a similar method for other treatment modalities, such as for instance tomotherapy or volumetric modulated arc therapy.
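
As described above, the total dose in each PTV voxel is a linear combination of the per-segment doses per monitor unit, and the monitor units are recovered by a non-negative least-squares fit. A minimal sketch of that step with SciPy's nnls is given below; the matrix is filled with random numbers standing in for the Monte Carlo results, so names and values are purely illustrative.

```python
# Minimal sketch of the monitor-unit recovery step: solve
#   dose_per_mu @ mu ~= target_dose   subject to   mu >= 0
# Random numbers stand in for the Monte Carlo dose-per-MU matrix.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)
n_voxels, n_segments = 500, 12

# dose_per_mu[i, j] = dose deposited in PTV voxel i by segment j for 1 MU
dose_per_mu = rng.uniform(0.0, 2.0, size=(n_voxels, n_segments))

# target_dose[i] = planned dose in PTV voxel i (built from known MUs here)
true_mu = rng.uniform(5.0, 50.0, size=n_segments)
target_dose = dose_per_mu @ true_mu

mu, residual = nnls(dose_per_mu, target_dose)
print("recovered monitor units:", np.round(mu, 2))
print("residual norm:", residual)
```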

Relevance: 100.00%

Abstract:

Nursing discharge planning for elderly medical inpatients is an essential element of care to ensure optimal transition to home and to reduce post-discharge adverse events. The objectives of this cross-sectional study were to investigate the association between nursing discharge planning components in older medical inpatients, patients' readiness for hospital discharge and unplanned health care utilization during the following 30 days. Results indicated that no patients benefited from comprehensive discharge planning but most benefited from less than half of the discharge planning components. The most frequent intervention recorded was coordination, and the least common was patients' participation in decisions regarding discharge. Patients who received more nursing discharge components felt significantly less ready to go home and had significantly more readmissions during the 30-day follow-up period. This study highlights large gaps in the nursing discharge planning process in older medical inpatients and identifies specific areas where improvements are most needed.

Relevance: 40.00%

Abstract:

Context: Ovarian tumor (OT) typing is a competency expected from pathologists, with significant clinical implications. OT however come in numerous different types, some rather rare, with the consequence of few opportunities for practice in some departments. Aim: Our aim was to design a tool for pathologists to train in less common OT typing. Method and Results: Representative slides of 20 less common OT were scanned (NanoZoomer Digital, Hamamatsu®) and the diagnostic algorithm proposed by Young and Scully applied to each case (Young RH and Scully RE, Seminars in Diagnostic Pathology 2001, 18: 161-235) to include: recognition of morphological pattern(s); shortlisting of differential diagnoses; proposition of relevant immunohistochemical markers. The next steps of this project will be: evaluation of the tool in several post-graduate training centers in Europe and Québec; improvement of its design based on evaluation results; dissemination to a wider audience. Discussion: In clinical medicine, solving many cases is recognized as being of utmost importance for a novice to become an expert. This project relies on virtual slide technology to provide pathologists with a learning tool aimed at increasing their skills in OT typing. After due evaluation, this model might be extended to other uncommon tumors.

Relevance: 30.00%

Abstract:

BACKGROUND: Jeune asphyxiating thoracic dystrophy (JATD) is a rare, often lethal, recessively inherited chondrodysplasia characterised by shortened ribs and long bones, sometimes accompanied by polydactyly, and renal, liver and retinal disease. Mutations in intraflagellar transport (IFT) genes cause JATD, including the IFT dynein-2 motor subunit gene DYNC2H1. Genetic heterogeneity and the large DYNC2H1 gene size have hindered JATD genetic diagnosis. AIMS AND METHODS: To determine its contribution to JATD, we screened DYNC2H1 in 71 JATD patients, combining SNP mapping, Sanger sequencing and exome sequencing. RESULTS AND CONCLUSIONS: We detected 34 DYNC2H1 mutations in 29/71 (41%) patients from 19/57 families (33%), showing it to be a major cause of JATD, especially in Northern European patients. This included 13 early protein termination mutations (nonsense/frameshift, deletion, splice site), but no patients carried these in combination, suggesting the human phenotype is at least partly hypomorphic. In addition, 21 missense mutations were distributed across DYNC2H1 and these showed some clustering to functional domains, especially the ATP motor domain. DYNC2H1 patients largely lacked significant extra-skeletal involvement, demonstrating an important genotype-phenotype correlation in JATD. Significant variability exists in the course and severity of the thoracic phenotype, both between affected siblings with identical DYNC2H1 alleles and among individuals with different alleles, which suggests the DYNC2H1 phenotype might be subject to modifier alleles, or to non-genetic or epigenetic factors. Assessment of fibroblasts from patients showed accumulation of anterograde IFT proteins in the ciliary tips, confirming defects similar to those seen in patients with other retrograde IFT machinery mutations, which may have underappreciated diagnostic value.

Relevance: 30.00%

Abstract:

The reactivity spectrum of five different monoclonal anti-melanoma antibodies cross-reacting with gliomas and neuroblastomas and one monoclonal anti-glioma antibody cross-reacting with melanomas and neuroblastomas was investigated. Comparison of the binding activity of these monoclonal antibodies for 11 melanoma, seven glioma, and three neuroblastoma cell lines showed that each of these clones had a different pattern of cross-reactivity. The results indicated that the antigenic determinants detected by these antibodies were not associated with the same antigen and thus suggested the existence of at least six different antigens common to melanomas, gliomas, and neuroblastomas. Since all these tumors are known to derive from cells originating embryologically from the neural crest, it can be assumed that the antigens recognized by our monoclonal antibodies are neuroectodermal differentiation antigens. However, absorption with fetal brain homogenates abolished only the binding of monoclonal anti-glioma antibody, but did not modify the binding of monoclonal anti-melanoma antibodies.

Relevance: 30.00%

Abstract:

For a wide range of environmental, hydrological, and engineering applications there is a fast growing need for high-resolution imaging. In this context, waveform tomographic imaging of crosshole georadar data is a powerful method able to provide images of pertinent electrical properties in near-surface environments with unprecedented spatial resolution. In contrast, conventional ray-based tomographic methods, which consider only a very limited part of the recorded signal (first-arrival traveltimes and maximum first-cycle amplitudes), suffer from inherent limitations in resolution and may prove to be inadequate in complex environments. For a typical crosshole georadar survey the potential improvement in resolution when using waveform-based approaches instead of ray-based approaches is in the range of one order of magnitude. Moreover, the spatial resolution of waveform-based inversions is comparable to that of common logging methods. While in exploration seismology waveform tomographic imaging has become well established over the past two decades, it is still comparatively underdeveloped in the georadar domain despite corresponding needs. Recently, different groups have presented finite-difference time-domain waveform inversion schemes for crosshole georadar data, which are adaptations and extensions of Tarantola's seminal nonlinear generalized least-squares approach developed for the seismic case. First applications of these new crosshole georadar waveform inversion schemes on synthetic and field data have shown promising results. However, little is known about the limits and performance of such schemes in complex environments. To this end, the general motivation of my thesis is the evaluation of the robustness and limitations of waveform inversion algorithms for crosshole georadar data in order to apply such schemes to a wide range of real-world problems. One crucial issue in making any waveform scheme applicable and effective for real-world crosshole georadar problems is the accurate estimation of the source wavelet, which is unknown in practice. Waveform inversion schemes for crosshole georadar data require forward simulations of the wavefield in order to iteratively solve the inverse problem. Therefore, accurate knowledge of the source wavelet is critically important for successful application of such schemes. Relatively small differences in the estimated source wavelet shape can lead to large differences in the resulting tomograms. In the first part of my thesis, I explore the viability and robustness of a relatively simple iterative deconvolution technique that incorporates the estimation of the source wavelet into the waveform inversion procedure rather than adding additional model parameters to the inversion problem. Extensive tests indicate that this source wavelet estimation technique is simple yet effective, and is able to provide remarkably accurate and robust estimates of the source wavelet in the presence of strong heterogeneity in both the dielectric permittivity and electrical conductivity as well as significant ambient noise in the recorded data.
Furthermore, our tests also indicate that the approach is insensitive to the phase characteristics of the starting wavelet, which is not the case when the wavelet estimation is directly incorporated into the inverse problem. Another critical issue with crosshole georadar waveform inversion schemes that clearly needs to be investigated is the consequence of the common assumption of frequency-independent electromagnetic constitutive parameters. This is crucial since, in reality, these parameters are known to be frequency-dependent and complex, and thus recorded georadar data may show significant dispersive behaviour. In particular, in the presence of water, there is a wide body of evidence showing that the dielectric permittivity can be significantly frequency-dependent over the GPR frequency range, due to a variety of relaxation processes. The second part of my thesis is therefore dedicated to the evaluation of the reconstruction limits of a non-dispersive crosshole georadar waveform inversion scheme in the presence of varying degrees of dielectric dispersion. I show that the inversion algorithm, combined with the iterative deconvolution-based source wavelet estimation procedure that is partially able to account for the frequency-dependent effects through an "effective" wavelet, performs remarkably well in weakly to moderately dispersive environments and has the ability to provide adequate tomographic reconstructions.
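
The source-wavelet estimation described above is, at its core, a deconvolution of the observed traces by wavefields simulated with the current subsurface model. The sketch below shows one regularized frequency-domain deconvolution step of that kind on toy data; it is a simplified single-iteration illustration under those assumptions, not the thesis's exact scheme.

```python
# One regularized frequency-domain deconvolution step for source-wavelet
# estimation: observed traces divided by simulated impulse responses,
# stacked over all receiver positions. Toy data stand in for real traces.
import numpy as np

rng = np.random.default_rng(1)
n_traces, n_samples = 32, 512
observed = rng.normal(size=(n_traces, n_samples))   # recorded georadar traces
simulated = rng.normal(size=(n_traces, n_samples))  # impulse responses from current model

D = np.fft.rfft(observed, axis=1)
G = np.fft.rfft(simulated, axis=1)

# Stabilized spectral division, averaged over traces:
#   W(f) = sum_i D_i(f) conj(G_i(f)) / (sum_i |G_i(f)|^2 + eps)
eps = 1e-3 * np.max(np.abs(G) ** 2)
W = (D * np.conj(G)).sum(axis=0) / ((np.abs(G) ** 2).sum(axis=0) + eps)

wavelet_estimate = np.fft.irfft(W, n=n_samples)
print(wavelet_estimate.shape)  # updated source wavelet, fed back into the inversion
```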

Relevance: 30.00%

Abstract:

We herein present a preliminary practical algorithm for evaluating complementary and alternative medicine (CAM) for children which relies on basic bioethical principles and considers the influence of CAM on global child healthcare. CAM is currently involved in almost all sectors of pediatric care and frequently represents a challenge to the pediatrician. The aim of this article is to provide a decision-making tool to assist the physician, especially as it remains difficult to keep up-to-date with the latest developments in the field. The reasonable application of our algorithm together with common sense should enable the pediatrician to decide whether pediatric (P)-CAM represents potential harm to the patient, and allow ethically sound counseling. In conclusion, we propose a pragmatic algorithm designed to evaluate P-CAM, briefly explain the underlying rationale and give a concrete clinical example.

Relevance: 30.00%

Abstract:

Objective: To investigate the association between common carotid artery intima-media thickness (cIMT) and exposure to secondhand smoke (SHS) in children. Methods: Data were available at baseline in the Quebec Adiposity and Lifestyle investigation in Youth (QUALITY) study, an ongoing longitudinal investigation of Caucasian children aged 8-10 years at cohort inception who had at least one obese parent. Data on exposure to parents', siblings' and friends' smoking were collected using interviewer-administered child questionnaires and self-report parent questionnaires. Blood cotinine was measured with a high-sensitivity ELISA. cIMT was measured by ultrasound. The association between blood cotinine and cIMT was investigated in multivariable linear regression analyses controlling for age, body mass index, and child smoking status. Results: Mean (SD) cIMT (0.5803 (0.04602)) did not differ by age or sex. Overall, 26%, 6% and 3% of children were exposed to parents', siblings' and friends' smoking, respectively. Cotinine ranged from 0.13 ng/ml to 7.38 ng/ml (median (IQR) = 0.18 ng/ml). In multivariable analysis, a 1 ng/ml increase in cotinine was associated with a 0.090 mm increase in cIMT (p = 0.034). Conclusion: In children as young as 8-10 years of age, exposure to SHS is related to cIMT, a marker of pre-clinical atherosclerosis. Given the wide range of health effects of SHS, increased public health efforts are needed to reduce exposure among children in homes and private vehicles.
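
The multivariable model described above regresses cIMT on blood cotinine while adjusting for age, BMI and child smoking status. A minimal sketch with statsmodels follows; the synthetic data and column names are illustrative placeholders, not the QUALITY dataset.

```python
# Minimal sketch of the adjusted linear model: cIMT regressed on blood cotinine,
# controlling for age, BMI and child smoking status. Synthetic stand-in data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 300
df = pd.DataFrame({
    "cotinine": rng.uniform(0.1, 7.5, size=n),   # ng/ml
    "age": rng.uniform(8, 10, size=n),           # years
    "bmi": rng.uniform(14, 30, size=n),
    "child_smoker": rng.integers(0, 2, size=n),
})
df["cimt"] = 0.55 + 0.009 * df["cotinine"] + rng.normal(scale=0.04, size=n)  # mm

result = smf.ols("cimt ~ cotinine + age + bmi + C(child_smoker)", data=df).fit()
# Coefficient on cotinine: change in cIMT (mm) per 1 ng/ml increase, adjusted.
print(result.params["cotinine"], result.pvalues["cotinine"])
```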

Relevance: 30.00%

Abstract:

English summary: The pharmaceutical industry has been facing several challenges in recent years, and the optimization of its drug discovery pipeline is believed to be the only viable solution. High-throughput techniques participate actively in this optimization, especially when complemented by computational approaches aiming at rationalizing the enormous amount of information that they can produce. In silico techniques, such as virtual screening or rational drug design, are now routinely used to guide drug discovery. Both heavily rely on the prediction of the molecular interaction (docking) occurring between drug-like molecules and a therapeutically relevant target. Several software packages are available to this end, but despite the very promising picture drawn in most benchmarks, they still hold several hidden weaknesses. As pointed out in several recent reviews, the docking problem is far from being solved, and there is now a need for methods able to identify binding modes with high accuracy, which is essential to reliably compute the binding free energy of the ligand. This quantity is directly linked to the ligand's affinity and can be related to its biological activity. Accurate docking algorithms are thus critical for both the discovery and the rational optimization of new drugs. In this thesis, a new docking program aiming at this goal, EADock, is presented. It uses a hybrid evolutionary algorithm with two fitness functions, in combination with sophisticated diversity management. EADock is interfaced with the CHARMM package for energy calculations and coordinate handling. A validation was carried out on 37 crystallized protein-ligand complexes featuring 11 different proteins. The search space was defined as a sphere of 15 Å around the center of mass of the ligand position in the crystal structure, and, in contrast to other benchmarks, our algorithm was fed with optimized ligand positions up to 10 Å root mean square deviation (RMSD) from the crystal structure. This validation illustrates the efficiency of our sampling heuristic, as correct binding modes, defined by an RMSD to the crystal structure lower than 2 Å, were identified and ranked first for 68% of the complexes. The success rate increases to 78% when considering the five best-ranked clusters, and to 92% when all clusters present in the last generation are taken into account. Most failures in this benchmark could be explained by the presence of crystal contacts in the experimental structure. EADock has been used to understand molecular interactions involved in the regulation of the Na,K-ATPase and in the activation of the nuclear hormone receptor peroxisome proliferator-activated receptor α (PPARα). It also helped to understand the action of common pollutants (phthalates) on PPARγ, and the impact of biotransformations of the anticancer drug Imatinib (Gleevec®) on its binding mode to the Bcr-Abl tyrosine kinase. Finally, a fragment-based rational drug design approach using EADock was developed, and led to the successful design of new peptidic ligands for the α5β1 integrin and for human PPARα. In both cases, the designed peptides presented activities comparable to those of well-established ligands such as the anticancer drug Cilengitide and Wy14,643, respectively. French summary: Les récentes difficultés de l'industrie pharmaceutique ne semblent pouvoir se résoudre que par l'optimisation de leur processus de développement de médicaments. Cette dernière implique de plus en plus
de techniques dites "haut-débit", particulièrement efficaces lorsqu'elles sont couplées aux outils informatiques permettant de gérer la masse de données produite. Désormais, les approches in silico telles que le criblage virtuel ou la conception rationnelle de nouvelles molécules sont utilisées couramment. Toutes deux reposent sur la capacité à prédire les détails de l'interaction moléculaire entre une molécule ressemblant à un principe actif (PA) et une protéine cible ayant un intérêt thérapeutique. Les comparatifs de logiciels s'attaquant à cette prédiction sont flatteurs, mais plusieurs problèmes subsistent. La littérature récente tend à remettre en cause leur fiabilité, affirmant l'émergence d'un besoin pour des approches plus précises du mode d'interaction. Cette précision est essentielle au calcul de l'énergie libre de liaison, qui est directement liée à l'affinité du PA potentiel pour la protéine cible, et indirectement liée à son activité biologique. Une prédiction précise est d'une importance toute particulière pour la découverte et l'optimisation de nouvelles molécules actives. Cette thèse présente un nouveau logiciel, EADock, mettant en avant une telle précision. Cet algorithme évolutionnaire hybride utilise deux pressions de sélection, combinées à une gestion de la diversité sophistiquée. EADock repose sur CHARMM pour les calculs d'énergie et la gestion des coordonnées atomiques. Sa validation a été effectuée sur 37 complexes protéine-ligand cristallisés, incluant 11 protéines différentes. L'espace de recherche a été étendu à une sphère de 15 Å de rayon autour du centre de masse du ligand cristallisé, et contrairement aux comparatifs habituels, l'algorithme est parti de solutions optimisées présentant un RMSD jusqu'à 10 Å par rapport à la structure cristalline. Cette validation a permis de mettre en évidence l'efficacité de notre heuristique de recherche car des modes d'interaction présentant un RMSD inférieur à 2 Å par rapport à la structure cristalline ont été classés premiers pour 68% des complexes. Lorsque les cinq meilleures solutions sont prises en compte, le taux de succès grimpe à 78%, et à 92% lorsque la totalité de la dernière génération est prise en compte. La plupart des erreurs de prédiction sont imputables à la présence de contacts cristallins. Depuis, EADock a été utilisé pour comprendre les mécanismes moléculaires impliqués dans la régulation de la Na,K-ATPase et dans l'activation du peroxisome proliferator-activated receptor α (PPARα). Il a également permis de décrire l'interaction de polluants couramment rencontrés sur PPARγ, ainsi que l'influence de la métabolisation de l'Imatinib (PA anticancéreux) sur la fixation à la kinase Bcr-Abl. Une approche basée sur la prédiction des interactions de fragments moléculaires avec une protéine cible est également proposée. Elle a permis la découverte de nouveaux ligands peptidiques de PPARα et de l'intégrine α5β1. Dans les deux cas, l'activité de ces nouveaux peptides est comparable à celle de ligands bien établis, comme le Wy14,643 pour le premier, et le Cilengitide (PA anticancéreux) pour la seconde.
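
In the validation above, a predicted pose counts as a correct binding mode when its RMSD to the crystallographic ligand position is below 2 Å. A minimal sketch of that criterion on plain coordinate arrays is shown below; it assumes the two poses list the same atoms in the same order and ignores symmetry corrections, and the toy coordinates are purely illustrative.

```python
# Heavy-atom RMSD between a docked pose and the crystal pose, and the
# < 2 Å "correct binding mode" criterion used in the validation above.
# Assumes both arrays list the same atoms in the same order (no symmetry handling).
import numpy as np

def rmsd(pose_a: np.ndarray, pose_b: np.ndarray) -> float:
    """Root mean square deviation between two (n_atoms, 3) coordinate arrays."""
    diff = pose_a - pose_b
    return float(np.sqrt((diff ** 2).sum(axis=1).mean()))

# Toy example: a crystal pose and a slightly perturbed docked pose.
rng = np.random.default_rng(0)
crystal = rng.uniform(-5.0, 5.0, size=(20, 3))
docked = crystal + rng.normal(scale=0.5, size=crystal.shape)

value = rmsd(docked, crystal)
print(f"RMSD = {value:.2f} A, correct binding mode: {value < 2.0}")
```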

Relevance: 30.00%

Abstract:

OBJECTIVE: The natural course of chronic hepatitis C varies widely. To improve the profiling of patients at risk of developing advanced liver disease, we assessed the relative contribution of factors for liver fibrosis progression in hepatitis C. DESIGN: We analysed 1461 patients with chronic hepatitis C with an estimated date of infection and at least one liver biopsy. Risk factors for accelerated fibrosis progression rate (FPR), defined as ≥0.13 Metavir fibrosis units per year, were identified by logistic regression. Examined factors included age at infection, sex, route of infection, HCV genotype, body mass index (BMI), significant alcohol drinking (≥20 g/day for ≥5 years), HIV coinfection and diabetes. In a subgroup of 575 patients, we assessed the impact of single nucleotide polymorphisms previously associated with fibrosis progression in genome-wide association studies. Results were expressed as attributable fraction (AF) of risk for accelerated FPR. RESULTS: Age at infection (AF 28.7%), sex (AF 8.2%), route of infection (AF 16.5%) and HCV genotype (AF 7.9%) contributed to accelerated FPR in the Swiss Hepatitis C Cohort Study, whereas significant alcohol drinking, anti-HIV, diabetes and BMI did not. In genotyped patients, variants at rs9380516 (TULP1), rs738409 (PNPLA3), rs4374383 (MERTK) (AF 19.2%) and rs910049 (major histocompatibility complex region) significantly added to the risk of accelerated FPR. Results were replicated in three additional independent cohorts, and a meta-analysis confirmed the role of age at infection, sex, route of infection, HCV genotype, rs738409, rs4374383 and rs910049 in accelerating FPR. CONCLUSIONS: Most factors accelerating liver fibrosis progression in chronic hepatitis C are unmodifiable.
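
The risk-factor analysis above fits a logistic regression for accelerated FPR (≥0.13 Metavir units/year) and expresses results as attributable fractions. The sketch below shows the fitting step on synthetic stand-in data, plus one common attributable-fraction formulation (Levin's formula); the variable names are placeholders and the formula shown is a standard textbook version, not necessarily the exact definition used in the study.

```python
# Sketch: logistic regression for accelerated fibrosis progression (FPR >= 0.13
# Metavir units/year) and a standard (Levin) attributable-fraction formula.
# Synthetic stand-in data; the study's exact AF definition may differ.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 1461
df = pd.DataFrame({
    "age_at_infection": rng.uniform(15, 60, size=n),
    "male": rng.integers(0, 2, size=n),
    "genotype3": rng.integers(0, 2, size=n),
    "bmi": rng.uniform(18, 35, size=n),
})
logit_p = -3.0 + 0.05 * df["age_at_infection"] + 0.4 * df["male"] + 0.5 * df["genotype3"]
df["accelerated_fpr"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

result = smf.logit("accelerated_fpr ~ age_at_infection + male + genotype3 + bmi", data=df).fit()
print(np.exp(result.params))  # odds ratios for each risk factor

def levin_af(prevalence, relative_risk):
    """Population attributable fraction, Levin's formula (one common definition)."""
    return prevalence * (relative_risk - 1) / (1 + prevalence * (relative_risk - 1))
```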

Relevance: 30.00%

Abstract:

AIMS: We aimed to assess the prevalence and management of clinical familial hypercholesterolaemia (FH) among patients with acute coronary syndrome (ACS). METHODS AND RESULTS: We studied 4778 patients with ACS from a multi-centre cohort study in Switzerland. Based on personal and familial history of premature cardiovascular disease and LDL-cholesterol levels, two validated algorithms for the diagnosis of clinical FH were used: the Dutch Lipid Clinic Network algorithm to assess possible (score 3-5 points) or probable/definite FH (>5 points), and the Simon Broome Register algorithm to assess possible FH. At the time of hospitalization for ACS, 1.6% had probable/definite FH [95% confidence interval (CI) 1.3-2.0%, n = 78] and 17.8% had possible FH (95% CI 16.8-18.9%, n = 852), according to the Dutch Lipid Clinic algorithm. The Simon Broome algorithm identified 5.4% (95% CI 4.8-6.1%, n = 259) patients with possible FH. Among 1451 young patients with premature ACS, the Dutch Lipid Clinic algorithm identified 70 (4.8%, 95% CI 3.8-6.1%) patients with probable/definite FH, and 684 (47.1%, 95% CI 44.6-49.7%) patients with possible FH. Excluding patients with secondary causes of dyslipidaemia such as alcohol consumption, acute renal failure, or hyperglycaemia did not change the prevalence. One year after ACS, among 69 survivors with probable/definite FH and available follow-up information, 64.7% were using high-dose statins, 69.0% had decreased their LDL-cholesterol by at least 50%, and 4.6% had LDL-cholesterol ≤1.8 mmol/L. CONCLUSION: A phenotypic diagnosis of possible FH is common in patients hospitalized with ACS, particularly among those with premature ACS. Optimizing long-term lipid treatment of patients with FH after ACS is required.
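
The Dutch Lipid Clinic Network algorithm yields a point score that is then categorized; the abstract uses the cut-offs 3-5 points for possible FH and >5 points for probable/definite FH. A minimal sketch of that categorization step is below; how the individual points are assigned is not reproduced here, and the "FH unlikely" label for lower scores is an assumption about the remaining category.

```python
# Categorize a Dutch Lipid Clinic Network (DLCN) score using the cut-offs quoted
# in the abstract: 3-5 points = possible FH, >5 points = probable/definite FH.
# The scoring of individual clinical criteria is not reproduced here.
def classify_dlcn(score: int) -> str:
    if score > 5:
        return "probable/definite FH"
    if 3 <= score <= 5:
        return "possible FH"
    return "FH unlikely"

for s in (2, 4, 7):
    print(s, "->", classify_dlcn(s))
```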

Relevance: 20.00%

Abstract:

BACKGROUND: The number of nonagenarians and centenarians is rising dramatically, and many of them live in nursing homes. Very little is known about psychiatric symptoms and cognitive abilities other than memory in this population. This exploratory study focuses on anosognosia and its relationship with common psychiatric and cognitive symptoms. METHODS: Fifty-eight subjects aged 90 years or older were recruited from geriatric nursing homes and divided into five groups according to Mini-Mental State Examination scores. Assessment included the five-word test, executive clock-drawing task, lexical and categorical fluencies, Anosognosia Questionnaire-Dementia, Neuropsychiatric Inventory, and Charlson Comorbidity Index. RESULTS: Subjects had moderate cognitive impairment, with mean ± SD Mini-Mental State Examination being 15.41 ± 7.04. Anosognosia increased with cognitive impairment and was associated with all cognitive domains, as well as with apathy and agitation. Subjects with mild global cognitive decline seemed less anosognosic than subjects with the least or no impairment. Neither anosognosia nor psychopathological features were related to physical conditions. CONCLUSIONS: Anosognosia in oldest-old nursing home residents was mostly mild. It was associated with both cognitive and psychopathological changes, but whether anosognosia is causal to the observed psychopathological features requires further investigation.

Relevance: 20.00%

Abstract:

The "Five-Day Plan to Stop Smoking" (FDP) is an educational group technique for smoking cessation. We studied a cohort of 123 smokers (55 men, 68 women, mean age 42 years) who participated in 11 successive FDP sessions held in Switzerland between 1995 and 1998 and who were followed up for at least 12 months by telephone or direct interview. Overall, 102 of the 123 subjects (83%) had stopped smoking by the end of the FDP, and self-declared smoking cessation rate was 25% after one year. The following factors potentially associated with outcome were studied: age, sex, smoking habit duration, cigarettes per day, Fagerström Test for Nicotine Dependence (FTND), group size, and medical presence among the group leaders. Smoking habit duration was the only variable which showed a statistically significant association with success: the rate of smoking cessation was higher among patients who had smoked for less than 20 years (34.7% vs. 18.9%, p = 0.049). Stress was the most common cause of relapse. The FDP appears to be an effective smoking cessation therapy. Propositions are made in order to improve the success rate of future sessions.

Relevance: 20.00%

Abstract:

BACKGROUND: Hypotension, a common intra-operative incident, carries an important potential for morbidity. It is most often manageable and sometimes preventable, which makes its study important. We therefore aimed to examine hospital variations in the occurrence of intra-operative hypotension and its predictors. As secondary endpoints, we determined to what extent hypotension relates to the risk of post-operative incidents and death. METHODS: We used the Anaesthesia Databank Switzerland, built on routinely and prospectively collected data on all anaesthesias in 21 hospitals. The three outcomes were assessed using multi-level logistic regression models. RESULTS: Among 147,573 anaesthesias, the incidence of hypotension ranged from 0.6% to 5.2% across participating hospitals, and from 0.3% up to 12% across surgical specialties. Most (73.4%) were minor single events. Age, ASA status, combined general and regional anaesthesia techniques, and duration of surgery and hospitalization were significantly associated with hypotension. Although significantly associated, the emergency status of the surgery had a weaker effect. Hospitals' odds ratios for hypotension varied between 0.12 and 2.50 (P ≤ 0.001), even after adjusting for patient and anaesthesia factors and for type of surgery. At least one post-operative incident occurred in 9.7% of the procedures, including 0.03% deaths. Intra-operative hypotension was associated with a higher risk of post-operative incidents and death. CONCLUSION: Wide variations remain in the occurrence of hypotension among hospitals after adjustment for risk factors. Although differential reporting between hospitals may exist, variations in anaesthesia techniques and blood pressure maintenance may also have contributed. Intra-operative hypotension is associated with morbidity and sometimes death, and constant vigilance must therefore be advocated.
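
The hospital comparison above comes from multi-level logistic regression models with patient- and anaesthesia-level covariates. The sketch below illustrates the simpler fixed-effects variant (hospital entered as a categorical covariate in an ordinary logistic regression), which conveys the same adjustment idea on synthetic stand-in data; variable names are placeholders, and the study's actual models were multi-level.

```python
# Sketch of hospital variation in intra-operative hypotension after adjustment
# for patient and anaesthesia factors. For simplicity hospital enters as a fixed
# effect; the study itself used multi-level (random-effects) logistic models.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 20000
df = pd.DataFrame({
    "hospital": rng.integers(0, 21, size=n),
    "age": rng.uniform(18, 90, size=n),
    "asa": rng.integers(1, 5, size=n),
    "duration_min": rng.uniform(20, 300, size=n),
})
logit_p = -6.0 + 0.02 * df["age"] + 0.3 * df["asa"] + 0.004 * df["duration_min"]
df["hypotension"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

result = smf.logit(
    "hypotension ~ age + C(asa) + duration_min + C(hospital)", data=df
).fit()
# Exponentiated hospital coefficients approximate adjusted odds ratios
# relative to the reference hospital.
print(np.exp(result.params.filter(like="C(hospital)")))
```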