143 results for Applications for positions.
at Université de Lausanne, Switzerland
Abstract:
This article examines the determinants of positional incongruence between pre-election statements and post-election behaviour in the Swiss parliament between 2003 and 2009. The question is examined at the individual MP level, which is appropriate for dispersion-of-powers systems like Switzerland. While the overall rate of political congruence reaches about 85%, a multilevel logit analysis detects the underlying factors that push or curb a candidate's propensity to change his or her mind once elected. The results show that positional changes are more likely when (1) MPs are freshmen, (2) individual voting behaviour is invisible to the public, (3) the electoral district magnitude is not small, (4) the vote is not about a party's core issue, (5) the MP belongs to a party located in the political centre, and (6) the pre-election statement dissents from the majority position of the legislative party group. Of these factors, the last is paramount.
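The logit model behind these findings can be illustrated with a toy single-level version. The coefficient values below are invented for illustration only; all that follows the abstract is the sign pattern and the fact that dissent from the party group carries the largest weight:

```python
import math

def position_change_prob(freshman, visible_vote, small_district,
                         core_issue, centre_party, dissent, coefs=None):
    """Toy logit: probability that an MP changes position after election.

    All predictors are 0/1 indicators. The coefficients are hypothetical
    illustrations, not the article's estimates; only their signs mirror
    the abstract (dissent from the party group is the strongest push).
    """
    if coefs is None:
        coefs = {"intercept": -2.5, "freshman": 0.4, "visible_vote": -0.5,
                 "small_district": -0.3, "core_issue": -0.6,
                 "centre_party": 0.5, "dissent": 1.8}
    z = (coefs["intercept"]
         + coefs["freshman"] * freshman
         + coefs["visible_vote"] * visible_vote
         + coefs["small_district"] * small_district
         + coefs["core_issue"] * core_issue
         + coefs["centre_party"] * centre_party
         + coefs["dissent"] * dissent)
    return 1.0 / (1.0 + math.exp(-z))  # logistic link
```

With these made-up coefficients, flipping the dissent indicator raises the predicted change probability more than any other single factor, matching the abstract's claim that this factor is paramount.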
Abstract:
To what extent do Voting Advice Applications (VAAs) influence voting behaviour, and to what extent should providers be held accountable for such tools? This paper puts forward some empirical evidence from the Swiss VAA smartvote. The enormous popularity of smartvote in the last national elections in 2007 and the feedback of users and candidates lead us to conclude that smartvote is more than a toy and is likely to influence voting decisions. Since Swiss citizens vote not only for parties but also for candidates, and the voting recommendation of smartvote is based on the political positions of the candidates, smartvote turns out to be particularly helpful. Political scientists must not keep their hands off such tools: scientific research is needed to understand their functioning and the possibilities for manipulating elections. On the basis of a legal study, we conclude that a science-driven way of setting up such tools is essential for their legitimacy. However, we do not believe that there is a single best way of setting up such a tool, and we rather support a market-like solution with different competing tools, provided they meet minimal standards such as transparency and equal access for all parties and candidates. Once the process of selecting candidates and parties is directly linked to the act of voting, all these questions will become even more salient.
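How a candidate-based VAA can turn issue positions into a recommendation can be sketched generically. The distance-based matcher below is a hypothetical illustration, not smartvote's actual published algorithm; the 0-100 answer coding and the candidate dictionary layout are assumptions:

```python
import math

def match_score(voter, candidate):
    """Toy VAA score: agreement between a voter's and a candidate's
    answers to the same issue questions (each coded 0 = no .. 100 = yes).
    Returns 100 for perfect agreement, 0 for maximal disagreement.
    A generic Euclidean-distance sketch, not smartvote's method."""
    d = math.sqrt(sum((v - c) ** 2 for v, c in zip(voter, candidate)))
    d_max = math.sqrt(len(voter)) * 100  # largest possible distance
    return 100 * (1 - d / d_max)

def recommend(voter, candidates):
    """Rank candidates by descending match score."""
    return sorted(candidates,
                  key=lambda c: match_score(voter, c["answers"]),
                  reverse=True)
```

Because the recommendation is computed per candidate rather than per party, such a matcher fits the Swiss open-list system described in the abstract.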
Abstract:
Dual-energy X-ray absorptiometry (DXA) is commonly used in the care of patients for diagnostic classification of osteoporosis, low bone mass (osteopenia), or normal bone density; assessment of fracture risk; and monitoring changes in bone density over time. The development of other technologies for the evaluation of skeletal health has been associated with uncertainties regarding their applications in clinical practice. Quantitative ultrasound (QUS), a technology for measuring properties of bone at peripheral skeletal sites, is more portable and less expensive than DXA, without the use of ionizing radiation. The proliferation of QUS devices that are technologically diverse, measuring and reporting variable bone parameters in different ways, examining different skeletal sites, and having differing levels of validating data for association with DXA-measured bone density and fracture risk, has created many challenges in applying QUS for use in clinical practice. The International Society for Clinical Densitometry (ISCD) 2007 Position Development Conference (PDC) addressed clinical applications of QUS for fracture risk assessment, diagnosis of osteoporosis, treatment initiation, monitoring of treatment, and quality assurance/quality control. The ISCD Official Positions on QUS resulting from this PDC, the rationale for their establishment, and recommendations for further study are presented here.
Abstract:
The pharmaceutical industry has been facing several challenges in recent years, and the optimization of its drug discovery pipeline is believed to be the only viable solution. High-throughput techniques participate actively in this optimization, especially when complemented by computational approaches aimed at rationalizing the enormous amount of information they can produce. In silico techniques, such as virtual screening or rational drug design, are now routinely used to guide drug discovery. Both heavily rely on the prediction of the molecular interaction (docking) occurring between drug-like molecules and a therapeutically relevant target. Several software packages are available to this end, but despite the very promising picture drawn in most benchmarks, they still hold several hidden weaknesses. As pointed out in several recent reviews, the docking problem is far from being solved, and there is now a need for methods able to identify binding modes with high accuracy, which is essential to reliably compute the binding free energy of the ligand. This quantity is directly linked to its affinity and can be related to its biological activity. Accurate docking algorithms are thus critical for both the discovery and the rational optimization of new drugs. In this thesis, a new docking software aiming at this goal is presented: EADock. It uses a hybrid evolutionary algorithm with two fitness functions, in combination with a sophisticated management of diversity. EADock is interfaced with the CHARMM package for energy calculations and coordinate handling. A validation was carried out on 37 crystallized protein-ligand complexes featuring 11 different proteins.
The search space was defined as a sphere of 15 Å around the center of mass of the ligand position in the crystal structure, and contrary to other benchmarks, our algorithm was fed with optimized ligand positions up to 10 Å root mean square deviation (RMSD) from the crystal structure. This validation illustrates the efficiency of our sampling heuristic, as correct binding modes, defined by an RMSD to the crystal structure lower than 2 Å, were identified and ranked first for 68% of the complexes. The success rate increases to 78% when considering the five best-ranked clusters, and 92% when all clusters present in the last generation are taken into account. Most failures in this benchmark could be explained by the presence of crystal contacts in the experimental structure. EADock has been used to understand molecular interactions involved in the regulation of the Na,K-ATPase and in the activation of the nuclear hormone peroxisome proliferator-activated receptor α (PPARα). It also helped to understand the action of common pollutants (phthalates) on PPARγ, and the impact of biotransformations of the anticancer drug Imatinib (Gleevec®) on its binding mode to the Bcr-Abl tyrosine kinase. Finally, a fragment-based rational drug design approach using EADock was developed, which led to the successful design of new peptidic ligands for the α5β1 integrin and for human PPARα. In both cases, the designed peptides presented activities comparable to those of well-established ligands such as the anticancer drug Cilengitide and Wy14,643, respectively.
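The 2 Å RMSD success criterion used in the benchmark can be made concrete. The sketch below assumes atoms are already paired and the structures superposed; it is a minimal illustration of the scoring criterion, not EADock's implementation:

```python
import math

def rmsd(coords_a, coords_b):
    """Root mean square deviation (in the same units as the inputs,
    here Å) between two equally sized, pre-paired sets of 3D atom
    coordinates of superposed structures."""
    n = len(coords_a)
    s = sum((ax - bx) ** 2 + (ay - by) ** 2 + (az - bz) ** 2
            for (ax, ay, az), (bx, by, bz) in zip(coords_a, coords_b))
    return math.sqrt(s / n)

def success_rate(predicted, references, threshold=2.0):
    """Fraction of complexes whose top-ranked pose lies within
    `threshold` Å RMSD of the crystal structure, mirroring the
    2 Å criterion of the benchmark above."""
    hits = sum(1 for p, r in zip(predicted, references)
               if rmsd(p, r) <= threshold)
    return hits / len(predicted)
```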
Abstract:
Biological materials are increasingly used in abdominal surgery for ventral, pelvic and perineal reconstructions, especially in contaminated fields. Future applications are multi-fold and include prevention and one-step closure of infected areas. This includes prevention of abdominal, parastomal and pelvic hernia, but could also include prevention of separation of multiple anastomoses, suture- or staple-lines. Further indications could be containment of infected and/or inflammatory areas and protection of vital implants such as vascular grafts. Reinforcement patches for high-risk anastomoses or unresectable perforation sites are further possibilities. Current applications are based mostly on case series, and better data are urgently needed. Clinical benefits need to be assessed in prospective studies with sufficient follow-up to provide reliable proof of efficacy. Only superior results compared with standard treatment will justify the higher costs of these materials. To date, the use of biological materials is not standard, and applications should be limited to case-by-case decisions.
Abstract:
This letter describes a data telemetry biomedical experiment. An implant, consisting of a biometric data sensor, electronics, an antenna, and a biocompatible capsule, is described. All the elements were co-designed in order to maximize the transmission distance. The device was implanted in a pig for an in vivo experiment of temperature monitoring.
Abstract:
Pharmacogenomics is a field with origins in the study of monogenic variations in drug metabolism in the 1950s. Perhaps because of these historical underpinnings, there has been intensive investigation of 'hepatic pharmacogenes' such as the CYP450s and liver drug metabolism using pharmacogenomics approaches over the past five decades. Surprisingly, kidney pathophysiology, attendant diseases and treatment outcomes have been vastly under-studied and under-theorized despite their central importance in the maintenance of health, susceptibility to disease and rational personalized therapeutics. Indeed, chronic kidney disease (CKD) represents an increasing public health burden worldwide, in both developed and developing countries. Patients with CKD suffer from high cardiovascular morbidity and mortality, which is mainly attributable to cardiovascular events before reaching end-stage renal disease. In this paper, we focus our analyses on renal function before end-stage renal disease, as seen through the lens of pharmacogenomics and human genomic variation. We synthesize the recent evidence linking selected Very Important Pharmacogenes (VIP) to renal function, blood pressure and salt-sensitivity in humans, and ways in which these insights might inform rational personalized therapeutics. Notably, we highlight and present the rationale for three applications that we consider important and actionable therapeutic and preventive focus areas in renal pharmacogenomics: 1) ACE inhibitors, as a confirmed application; 2) VDR agonists, as a promising application; and 3) moderate dietary salt intake, as a suggested novel application. Additionally, we emphasize the putative contributions of gene-environment interactions and discuss the implications of these findings for treating and preventing hypertension and CKD.
Finally, we conclude with a strategic agenda and vision required to accelerate advances in this under-studied field of renal pharmacogenomics with vast significance for global public health.
Abstract:
Age is the main clinical determinant of large artery stiffness. Central arteries stiffen progressively with age, whereas peripheral muscular arteries change little with age. A number of clinical studies have analyzed the effects of age on aortic stiffness. The increase of central artery stiffness with age is responsible for earlier wave reflections and changes in pressure wave contours. The stiffening of the aorta and other central arteries is a potential risk factor for increased cardiovascular morbidity and mortality. Arterial stiffening with aging is accompanied by an elevation in systolic blood pressure (BP) and pulse pressure (PP). Although arterial stiffening with age is a common situation, it has now been confirmed that older subjects with increased arterial stiffness and elevated PP have higher cardiovascular morbidity and mortality. The increase in aortic stiffness with age occurs gradually and continuously, similarly for men and women. Cross-sectional studies have shown that aortic and carotid stiffness (evaluated by pulse wave velocity) increase with age by approximately 10% to 15% over a period of 10 years. Women consistently have 5% to 10% lower stiffness than men of the same age. Although large artery stiffness increases with age independently of the presence of cardiovascular risk factors or other associated conditions, the extent of this increase may depend on several environmental or genetic factors. Hypertension may increase arterial stiffness, especially in older subjects. Among other cardiovascular risk factors, type 1 and type 2 diabetes accelerate arterial stiffening, whereas the role of dyslipidemia and tobacco smoking is unclear. Arterial stiffness is also present in several cardiovascular and renal diseases. Patients with heart failure, end-stage renal disease, and those with atherosclerotic lesions often develop central artery stiffness.
Decreased carotid distensibility, increased arterial thickness, and presence of calcifications and plaques often coexist in the same subject. However, relationships between these three alterations of the arterial wall remain to be explored.
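Pulse wave velocity, the stiffness index cited above, is simply the path length between two recording sites divided by the pulse transit time. The sketch below also includes a projection helper using the abstract's figure of roughly a 10-15% rise per decade, with 12.5% assumed as a midpoint for illustration:

```python
def pulse_wave_velocity(path_length_m, transit_time_s):
    """Pulse wave velocity in m/s: distance between the two arterial
    recording sites (e.g. carotid and femoral) divided by the pulse
    transit time. Higher PWV indicates stiffer arteries."""
    return path_length_m / transit_time_s

def projected_pwv(pwv_now, years, increase_per_decade=0.125):
    """Project PWV forward in age using the abstract's approximate
    10-15% increase per decade (12.5% midpoint assumed here)."""
    return pwv_now * (1 + increase_per_decade) ** (years / 10)
```

For example, a 0.5 m carotid-femoral path traversed in 62.5 ms gives a PWV of 8 m/s.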
Abstract:
Microarray transcript profiling and RNA interference are two new technologies crucial for large-scale gene function studies in multicellular eukaryotes. Both rely on sequence-specific hybridization between complementary nucleic acid strands, prompting us to create a collection of gene-specific sequence tags (GSTs) representing at least 21,500 Arabidopsis genes that is compatible with both approaches. The GSTs were carefully selected to ensure that each shared no significant similarity with any other region of the Arabidopsis genome. They were synthesized by PCR amplification from genomic DNA. Spotted microarrays fabricated from the GSTs show good dynamic range, specificity, and sensitivity in transcript profiling experiments. The GSTs have also been transferred to bacterial plasmid vectors via recombinational cloning protocols. These cloned GSTs constitute an ideal starting point for a variety of functional approaches, including reverse genetics. We have subcloned GSTs on a large scale into vectors designed for gene silencing in plant cells. We show that in planta expression of GST hairpin RNA results in the expected phenotypes in silenced Arabidopsis lines. These versatile GST resources provide novel and powerful tools for functional genomics.
Abstract:
Eukaryotic DNA interacts with nuclear proteins through non-covalent ionic interactions. Proteins can recognize specific nucleotide sequences through steric interactions with the DNA, and these specific protein-DNA interactions are the basis for many nuclear processes, e.g. gene transcription, chromosomal replication, and recombination. A new technology termed ChIP-Seq has recently been developed for the analysis of protein-DNA interactions on a whole-genome scale; it is based on immunoprecipitation of chromatin followed by high-throughput DNA sequencing. ChIP-Seq is a novel technique with great potential to replace older techniques for mapping protein-DNA interactions. In this thesis, we bring some new insights into ChIP-Seq data analysis. First, we point out some common and so far unknown artifacts of the method. Sequence tag distribution in the genome does not follow a uniform distribution, and we have found extreme hot-spots of tag accumulation over specific loci in the human and mouse genomes. These artifactual sequence tag accumulations will create false peaks in every ChIP-Seq dataset, and we propose different filtering methods to reduce the number of false positives. Next, we propose random sampling as a powerful analytical tool in ChIP-Seq data analysis that can be used to infer biological knowledge from massive ChIP-Seq datasets. We created an unbiased random sampling algorithm and used this methodology to reveal some of the important biological properties of Nuclear Factor I DNA-binding proteins. Finally, by analyzing the ChIP-Seq data in detail, we revealed that Nuclear Factor I transcription factors mainly act as activators of transcription, and that they are associated with specific chromatin modifications that are markers of open chromatin. We speculate that NFI factors only interact with the DNA wrapped around the nucleosome.
We also found multiple loci that indicate possible chromatin barrier activity of NFI proteins, which could suggest the use of NFI binding sequences as chromatin insulators in biotechnology applications.
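The hot-spot filtering and random-sampling ideas above can be sketched as follows. The bin size and the 10-fold-over-mean threshold are illustrative assumptions, not the thesis's actual parameters:

```python
import random
from collections import Counter

def bin_tag_counts(tag_positions, bin_size=1000):
    """Count sequence tags per fixed-size genomic bin (positions are
    integer coordinates on one chromosome, a simplifying assumption)."""
    return Counter(pos // bin_size for pos in tag_positions)

def flag_hotspots(counts, fold=10):
    """Flag bins whose tag count exceeds `fold` times the mean count
    over occupied bins: a simple stand-in for filtering the artifactual
    tag accumulations that create false peaks."""
    mean = sum(counts.values()) / len(counts)
    return {b for b, c in counts.items() if c > fold * mean}

def sample_tags(tag_positions, n, seed=0):
    """Unbiased random subsample of n tags, useful for comparing
    datasets of different sequencing depths on an equal footing."""
    rng = random.Random(seed)
    return rng.sample(tag_positions, n)
```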
Abstract:
Thanks to recent technological developments, such as the satellite-based Global Positioning System (GPS) in differential mode, it is now possible to measure with great precision not only the speed profile of a subject moving over the ground but also his or her trajectory, dispensing entirely with classical chronometry. In addition, miniaturized accelerometric sensors provide useful complementary biomechanical information (step frequency and length, individual accelerometric signature). In this article, an example of the application of these two techniques to sports such as alpine skiing (downhill or Super G) and sprinting is presented. Combining several physiological, kinetic and biomechanical measurements will allow a better understanding of the endogenous and exogenous factors that play a role in improving human performance.
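Deriving a speed profile from trajectory fixes, as described above, can be sketched as follows. A local planar coordinate frame is assumed for simplicity; real differential-GPS processing also involves geodetic coordinate conversion:

```python
import math

def speed_profile(fixes):
    """Instantaneous speed (m/s) between successive GPS fixes given as
    (t_seconds, x_m, y_m) tuples in a local planar frame: distance
    covered between fixes divided by the elapsed time, replacing
    classical timing gates with trajectory data."""
    speeds = []
    for (t0, x0, y0), (t1, x1, y1) in zip(fixes, fixes[1:]):
        dist = math.hypot(x1 - x0, y1 - y0)
        speeds.append(dist / (t1 - t0))
    return speeds

def step_frequency(foot_contact_times, duration_s):
    """Step frequency (Hz) from accelerometer-detected foot contacts,
    one of the complementary biomechanical measures mentioned above."""
    return len(foot_contact_times) / duration_s
```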
Abstract:
So-called online Voting Advice Applications (VAAs) have become very popular all over Europe. Millions of voters are using them as an aid in deciding which party to vote for. Despite this popularity, there are only very few studies about the impact of these tools on individual electoral choice. On the basis of the Swiss VAA smartvote, we present some first findings on the question of whether VAAs have a direct impact on the actual vote of their users. Indeed, we find strong evidence that Swiss voters were affected by smartvote. However, our findings are somewhat contrary to the results of previous studies from other countries. Furthermore, the quality of the data available for such studies needs to be improved. Future studies should pay attention to both the improvement of the available data and the explanation of the large variance of findings across European countries.
Abstract:
Ethics, particularly in its theological version, faces formidable challenges today. Constantly called upon by the public and the media, it engages an intelligence of faith, an analytical capacity, a mobilization of reason and an involvement of the emotions. This book aims to combine three requirements: theoretical, figurative and practical. The theoretical requirement takes up afresh the question of the foundations of ethics, at the interface of rationality, faith and theology. The figurative requirement, not unrelated to the genealogical approach, clarifies what is at stake in the dialogue the theologian conducts with various forms of philosophical reflection. The practical requirement, finally, ties the threads, never lost from view, back to the experience and existence of individuals and societies, on the basis of several exemplary cases of applied ethics: the status of the embryo, the understanding of illness and health, the definition of death, organ transplantation, social engagement, drug addiction, etc.
Abstract:
The recent advances in sequencing technologies have given all microbiology laboratories access to whole genome sequencing. Provided that tools for the automated analysis of sequence data and databases for the associated meta-data are developed, whole genome sequencing will become a routine tool for large clinical microbiology laboratories. Indeed, the continuing reduction in sequencing costs and the shortening of the 'time to result' make it an attractive strategy in both research and diagnostics. Here, we review how high-throughput sequencing is revolutionizing clinical microbiology and the promise that it still holds. We discuss major applications, which include: (i) identification of target DNA sequences and antigens to rapidly develop diagnostic tools; (ii) precise strain identification for epidemiological typing and pathogen monitoring during outbreaks; and (iii) investigation of strain properties, such as the presence of antibiotic resistance or virulence factors. In addition, recent developments in comparative metagenomics and single-cell sequencing offer the prospect of a better understanding of complex microbial communities at the global and individual levels, providing a new perspective for understanding host-pathogen interactions. As a high-resolution tool, high-throughput sequencing will increasingly influence diagnostics, epidemiology, risk management, and patient care.