34 results for AD HOC

at Université de Lausanne, Switzerland


Relevance: 100.00%

Abstract:

OBJECTIVES: To prospectively and preliminarily evaluate the accuracy and reliability of a specific ad hoc reduction-compression forceps in the intraoral open reduction of transverse and displaced mandibular angle fractures. STUDY DESIGN: We analyzed the clinical and radiologic data of 7 patients with 7 single transverse and displaced angle fractures. An intraoral approach was used in all of the patients, without perioperative intermaxillary fixation. A single Arbeitsgemeinschaft Osteosynthese (AO) UniLock reconstruction plate was fixed to each stable fragment with 3 locking screws (2.0 mm in 5 patients and 2.4 mm in 2 patients) at the basilar border of the mandible, according to AO/Association for the Study of Internal Fixation (ASIF) principles. Follow-up took place at 1, 3, 6, and 12 months, and we noted the status of healing and any complications. RESULTS: All of the patients had satisfactory fracture reduction as well as a successful treatment outcome without complications. CONCLUSION: This preliminary study demonstrated that intraoral reduction of transverse and displaced angle fractures using a specific ad hoc reduction-compression forceps results in a high rate of success.

Relevance: 100.00%

Abstract:

This PhD thesis addresses the issue of alleviating the burden of developing ad hoc applications. Such applications have the particularity of running on mobile devices, communicating in a peer-to-peer manner, and implementing proximity-based semantics. A typical example is a radar application in which users see their own avatar, as well as the avatars of their friends, on a map on their mobile phone. Such applications have become increasingly popular with the advent of the latest generation of mobile smartphones, with their impressive computational power, peer-to-peer communication capabilities, and location-detection technology. Unfortunately, existing programming support for such applications is limited, hence the need to address this issue in order to alleviate their development burden. This thesis tackles the problem by providing several tools to support application development. First, it provides the location-based publish/subscribe service (LPSS), a communication abstraction that elegantly captures recurrent communication issues and thus dramatically reduces code complexity. LPSS is implemented in a modular manner in order to target two different network architectures. One pragmatic implementation is aimed at mainstream infrastructure-based mobile networks, where mobile devices communicate through fixed antennas. The other, fully decentralized, implementation targets emerging mobile ad hoc networks (MANETs), where no fixed infrastructure is available and communication can only occur in a peer-to-peer fashion. For each of these architectures, various implementation strategies, tailored for different application scenarios, can be parametrized at deployment time. Second, this thesis provides two location-based message diffusion protocols, namely 6Shot broadcast and 6Shot multicast, specifically aimed at MANETs and fine-tuned to be used as building blocks for LPSS. Finally, this thesis proposes Phomo, a phone motion testing tool that makes it possible to test the proximity semantics of ad hoc applications without having to move around with mobile devices. These development support tools have been packaged in a coherent middleware framework called Pervaho.
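
The abstract does not spell out the LPSS programming interface. As a rough, hypothetical illustration of what a location-based publish/subscribe abstraction with proximity semantics could look like, here is a minimal Python sketch; every name in it (LPSS, subscribe, publish, the radius parameter) is invented for the example and is not taken from the thesis.

```python
import math
from dataclasses import dataclass, field

# Minimal, hypothetical sketch of a location-based publish/subscribe (LPSS)
# abstraction: a message is delivered only to subscribers of the same topic
# whose current position lies within the publisher's publication radius.

@dataclass
class Subscription:
    topic: str
    position: tuple       # (x, y) position of the subscribing device
    callback: object      # called with the payload when a matching message arrives

@dataclass
class LPSS:
    subscriptions: list = field(default_factory=list)

    def subscribe(self, topic, position, callback):
        self.subscriptions.append(Subscription(topic, position, callback))

    def publish(self, topic, position, payload, radius):
        """Deliver payload to subscribers of `topic` within `radius` of `position`."""
        for sub in self.subscriptions:
            if sub.topic == topic and math.dist(sub.position, position) <= radius:
                sub.callback(payload)

# Example: a "radar" application where only nearby friends see each other.
if __name__ == "__main__":
    lpss = LPSS()
    lpss.subscribe("radar", (0.0, 10.0), lambda msg: print("friend nearby:", msg))
    lpss.subscribe("radar", (500.0, 500.0), lambda msg: print("too far:", msg))
    lpss.publish("radar", (0.0, 0.0), {"user": "alice"}, radius=50.0)
```

In a real deployment the delivery step would go through an infrastructure-based network or one of the MANET diffusion protocols mentioned in the abstract rather than an in-memory list.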

Relevance: 100.00%

Abstract:

A mobile ad hoc network (MANET) is a decentralized and infrastructure-less network. This thesis aims to provide system-level support for developers of applications or protocols in such networks. To do this, we propose contributions in both the algorithmic realm and the practical realm. In the algorithmic realm, we contribute to the field by proposing different context-aware broadcast and multicast algorithms for MANETs, namely six-shot broadcast, six-shot multicast, PLAN-B, and a generic algorithmic approach to optimize the power consumption of existing algorithms. We compare each algorithm we propose to existing algorithms that are either probabilistic or context-aware, and then evaluate their performance based on simulations. We demonstrate that in some cases context-aware information, such as location or signal strength, can improve efficiency. In the practical realm, we propose a testbed framework, namely ManetLab, to implement and deploy MANET-specific protocols and to evaluate their performance. This testbed framework aims to increase the accuracy of performance evaluation compared to simulations, while keeping the ease of use that simulators offer for reproducing a performance evaluation. By evaluating the performance of different probabilistic algorithms with ManetLab, we observe that simulations and testbeds should be used in a complementary way. In addition to these original contributions, we also provide two surveys about system-level support for ad hoc communications in order to establish the state of the art: the first covers existing broadcast algorithms, and the second covers existing middleware solutions and the way they deal with privacy, especially location privacy.
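
The six-shot algorithms themselves are not described in the abstract. As a hedged illustration of the kind of context-aware decision such protocols build on, the sketch below implements a classic location-based rebroadcast suppression rule (not the six-shot protocols): a node retransmits a flooded message only if its own transmission would cover enough area that the sender's transmission did not already reach. The radio range and threshold are arbitrary.

```python
import math

# Classic location-based broadcast suppression in a MANET (illustrative only):
# a receiving node estimates the additional area its retransmission would
# cover, from its distance to the sender, and rebroadcasts only if that gain
# exceeds a threshold. Distance can also be estimated from signal strength.

RADIO_RANGE = 100.0  # assumed transmission radius, in meters

def additional_coverage_ratio(d, r=RADIO_RANGE):
    """Fraction of this node's radio disk that the sender's disk (at distance d)
    does not already cover."""
    if d >= r:
        return 1.0
    overlap = 2 * r**2 * math.acos(d / (2 * r)) - (d / 2) * math.sqrt(4 * r**2 - d**2)
    return 1.0 - overlap / (math.pi * r**2)

def should_rebroadcast(distance_to_sender, threshold=0.4):
    """Rebroadcast only if retransmitting adds enough coverage."""
    return additional_coverage_ratio(distance_to_sender) >= threshold

if __name__ == "__main__":
    for d in (10.0, 40.0, 70.0, 95.0):
        ratio = additional_coverage_ratio(d)
        print(f"distance {d:5.1f} m -> extra coverage {ratio:.2f} "
              f"-> rebroadcast: {should_rebroadcast(d)}")
```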

Relevance: 60.00%

Abstract:

LAY SUMMARY: The brain is composed of different cell types, including neurons and astrocytes. For lack of means to observe them, astrocytes long remained in the shadows, while neurons, which benefited from ad hoc tools for stimulation and study, received all the attention. The development of cellular imaging and fluorescent tools has made it possible to observe these non-electrically excitable cells and to obtain information suggesting that they are far from passive and participate actively in brain function. This participation occurs in part through the release of neuroactive substances (called gliotransmitters) that astrocytes release close to synapses, thereby modulating neuronal function. This release of gliotransmitters is mainly triggered by neuronal activity, which astrocytes are able to sense. Nevertheless, little is known about the precise properties of gliotransmitter release. Understanding the spatiotemporal properties of this release is essential to understand how these cells communicate and how they participate in the transmission of information in the brain. Using recently developed fluorescent tools and combining different cellular imaging techniques, we obtained very precise information on the release of gliotransmitters by astrocytes. We confirmed that this release is a very fast process and that it is controlled by fast, local calcium increases. We also described a complex organization of the machinery supporting gliotransmitter release, which appears to underlie its extremely rapid time course. Together, this speed of release and this structural complexity indicate that astrocytes are particularly well suited to fast communication and that, as legitimate partners of neurons, they can participate in the transmission and integration of information in the brain.

ABSTRACT: Recently, astrocytic synaptic-like microvesicles (SLMVs), which express vesicular glutamate transporters (VGluTs) and are able to release glutamate by Ca2+-dependent regulated exocytosis, have been described both in tissue and in cultured astrocytes. Nevertheless, little is known about the specific properties of regulated secretion in astrocytes. Important differences may exist between astrocytic and neuronal exocytosis, starting from the fact that stimulus-secretion coupling in astrocytes is voltage independent, mediated by G-protein-coupled receptors and the release of Ca2+ from internal stores. Elucidating the spatiotemporal properties of astrocytic exo-endocytosis is therefore of primary importance for understanding the mode of communication of these cells and their role in brain signaling. We took advantage of fluorescent tools recently developed for studying the recycling of glutamatergic vesicles at synapses, such as styryl dyes and pHluorin, in order to follow exocytosis and endocytosis of SLMVs at the level of the entire cell as well as at the level of single events. We combined epifluorescence and total internal reflection fluorescence imaging to investigate, with unprecedented temporal and spatial resolution, the events underlying stimulus-secretion coupling in astrocytes. We confirmed that the exo-endocytosis process in astrocytes proceeds on the millisecond time scale. We discovered that SLMV exocytosis is controlled by local and fast Ca2+ elevations arising in submicrometer cytosolic compartments delimited by endoplasmic reticulum (ER) tubuli that reach beneath the plasma membrane and contain SLMVs. This complex organization seems to support the fast stimulus-secretion coupling reported here. Independent subcellular compartments formed by the ER, SLMVs and the plasma membrane, which contain intracellular messengers and limit their diffusion, seem to compensate efficiently for the electrical non-excitability of astrocytes. Moreover, the existence of two pools of SLMVs that are sequentially recruited suggests a compensatory mechanism allowing the refilling of SLMVs and sustaining the exocytosis process over a wide range of stimuli. These data suggest that regulated secretion is not only a feature of cultured astrocytes but results from a strong specialization of these cells. The rapidity of this secretion demonstrates that astrocytes are able to actively participate in brain information transmission and processing.

Relevance: 60.00%

Abstract:

OBJECTIVES: To develop data-driven criteria for clinically inactive disease on and off therapy for juvenile dermatomyositis (JDM). METHODS: The Paediatric Rheumatology International Trials Organisation (PRINTO) database contains 275 patients with active JDM evaluated prospectively up to 24 months. Thirty-eight patients off therapy at 24 months were defined as clinically inactive and included in the reference group. These were compared with a random sample of 76 patients who had active disease at study baseline. Individual measures of muscle strength/endurance, muscle enzymes, physician's and parent's global disease activity/damage evaluations, inactive disease criteria derived from the literature and other ad hoc criteria were evaluated for sensitivity, specificity and Cohen's κ agreement. RESULTS: The individual measures that best characterised inactive disease (sensitivity and specificity >0.8 and Cohen's κ >0.8) were manual muscle testing (MMT) ≥78, physician global assessment of muscle activity=0, physician global assessment of overall disease activity (PhyGloVAS) ≤0.2, Childhood Myositis Assessment Scale (CMAS) ≥48, Disease Activity Score ≤3 and Myositis Disease Activity Assessment Visual Analogue Scale ≤0.2. The best combination of variables to classify a patient as being in a state of inactive disease on or off therapy is at least three of four of the following criteria: creatine kinase ≤150, CMAS ≥48, MMT ≥78 and PhyGloVAS ≤0.2. After 24 months, 30/31 patients (96.8%) were inactive off therapy and 69/145 (47.6%) were inactive on therapy. CONCLUSION: PRINTO established data-driven criteria with clearly evidence-based cut-off values to identify JDM patients with clinically inactive disease. These criteria can be used in clinical trials, in research and in clinical practice.
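
The "at least three of four" rule above translates directly into a small classifier. The sketch below is a plain transcription of the published cut-offs; variable names are ours, and the usual scales are assumed (CK in U/L, CMAS 0-52, MMT-8 0-80, physician global VAS 0-10).

```python
# "At least three of four" criteria for clinically inactive juvenile
# dermatomyositis, as given in the abstract.

def clinically_inactive(ck, cmas, mmt8, phy_glo_vas):
    criteria_met = sum([
        ck <= 150,            # creatine kinase
        cmas >= 48,           # Childhood Myositis Assessment Scale
        mmt8 >= 78,           # manual muscle testing
        phy_glo_vas <= 0.2,   # physician global assessment of overall disease activity
    ])
    return criteria_met >= 3

print(clinically_inactive(ck=120, cmas=50, mmt8=79, phy_glo_vas=0.5))  # True  (3 of 4)
print(clinically_inactive(ck=300, cmas=45, mmt8=79, phy_glo_vas=0.1))  # False (2 of 4)
```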

Relevance: 60.00%

Abstract:

Hybridization has played a central role in the evolutionary history of domesticated plants. Notably, several breeding programs relying on gene introgression from the wild compartment have been carried out in fruit tree species within the genus Prunus, but few studies have investigated spontaneous gene flow between wild and domesticated Prunus species. Consequently, a comprehensive understanding of the genetic relationships and levels of gene flow between domesticated and wild Prunus species is needed. Combining nuclear and chloroplast microsatellites, we investigated gene flow and hybridization between two key almond tree species in the Fertile Crescent: the cultivated Prunus dulcis and one of its most widespread wild relatives, Prunus orientalis. We detected high levels of genetic diversity in both species, along with substantial and symmetric gene flow between the domesticated P. dulcis and the wild P. orientalis. These results are discussed in light of the diversity of the cultivated species, outlining the frequent spontaneous genetic contributions of wild species to the domesticated compartment. In addition, crop-to-wild gene flow suggests that ad hoc transgene containment strategies would be required if genetically modified cultivars were introduced in the northwestern Mediterranean.

Relevance: 60.00%

Abstract:

In parallel with the advent of intelligence-led policing models, crime analysis and forensic intelligence methods have undergone important developments in recent years. Applications have been proposed in various fields of forensic science in order to exploit and manage different types of material traces systematically and more effectively. In this regard, the field of false identity documents has received little attention, even though it is a serious form of crime in which organized crime is involved. The present study seeks to fill this gap by proposing a simple and generalizable method for profiling false identity documents, which aims to discover existing links on the basis of material characteristics that can be analyzed visually. These characteristics are considered to constitute the forger's particular trademark and can thus be exploited to infer links between false identity documents originating from the same source. A collection of more than 200 false identity documents, comprising three types of false documents, was gathered from the police forces of nine Swiss cantons and integrated into an ad hoc database. The links detected systematically and automatically by this database were exploited and analyzed in order to produce strategic and operational intelligence useful in the fight against document fraud. The profiling and intelligence processes set up for the three types of false identity documents studied proved effective, with a high percentage of the documents turning out to be linked (from 30% to 50%). Document fraud appears to be a structured and interregional form of crime, for which the links established between false identity documents can serve as an aid to investigation and as support for strategic decisions. The results suggest the development of preventive and repressive approaches to fight document fraud.
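
The abstract does not detail how links between false documents are computed. Purely as an illustration of profiling based on visually analyzable production features, the sketch below codes each false document as a set of such features and links pairs whose similarity exceeds a threshold; the feature names, the Jaccard measure and the threshold are assumptions, not the method used in the study.

```python
# Illustrative linking of false identity documents by shared production
# features (hypothetical feature vocabulary and similarity measure).

def jaccard(a, b):
    return len(a & b) / len(a | b) if (a | b) else 0.0

documents = {
    "doc_01": {"offset_print", "font_A", "hologram_copy_1", "cut_pattern_X"},
    "doc_02": {"offset_print", "font_A", "hologram_copy_1", "cut_pattern_Y"},
    "doc_03": {"inkjet_print", "font_B", "no_hologram", "cut_pattern_Z"},
}

LINK_THRESHOLD = 0.6
ids = sorted(documents)
for i, d1 in enumerate(ids):
    for d2 in ids[i + 1:]:
        s = jaccard(documents[d1], documents[d2])
        if s >= LINK_THRESHOLD:
            print(f"link: {d1} <-> {d2} (similarity {s:.2f})")
```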

Relevance: 60.00%

Abstract:

This PhD thesis (written in French) is built in four parts: three annexes that present six case studies (approx. 800 pages), preceded by a transverse, more conceptual analysis (approx. 150 pages), which this summary is about. Each annexe contains a detailed summary of the case studies. 'Natural resource management' is an inappropriate designation because it is not the resources that are managed but the uses made of them; this thesis therefore addresses the identification and analysis of the influences on human behaviour in relation to the resource. This statement roots the social sciences perspective on the management of natural resources, in which this thesis fits. A neo-institutionalist approach considers that uses are influenced by institutions, which are themselves influenced by users. These institutions are human constructions that form the institutional context in which actors decide on the use of resources (felling a tree, collecting water, etc.). Thus, the uses of resources are never independent of institutional influences, and it becomes necessary to understand how these rules of the game affect practices. They are numerous, interrelated and form the basis for the uses of resources. To grasp this complexity, the author applies the institutional resource regime (IRR) framework, which limits the analysis to two types of use rights: those resulting from property rights (deeds, easements, etc.) and those resulting from public policies (laws, ordinances, etc.). The IRR identifies an 'institutional regime', specific to the resource studied, whose developments can be compared over time or between several places. In this research, this analytical framework has been applied to the same topic - forest management in the recharge areas of groundwater piped for public supply - in three countries: France, Switzerland and Indonesia. Three years of field research allowed the author to look not only at predetermined rules (rules), but also at regulations that are actually activated on the ground (rules-in-use). The case studies show that the predetermined rules are unevenly applied and that actors sometimes favour direct negotiation to resolve their rivalries of use, instead of invoking their vested rights. From this observation the author proposes an enlargement of the IRR's scope, which forms the core of his thesis. The interest covers not only what 'is' regulated, but also what 'is not' and so lies beyond the classical application of the IRR. This shift in perspective is crucial to understand the concrete uses of resources in poorly integrated regimes, where practices are explained by the margin of manoeuvre left to the actors rather than by predetermined rules.
This reinterpretation, tested successfully in this research, allows the margin of manoeuvre to be integrated into the analysis using the IRR; it is made concrete by the identification of gaps and inconsistencies in the investigated institutional regimes. The new interpretation of the IRR in this thesis complements and enhances its classical application; in particular, its use and understanding by non-specialists, especially environmentalists, is facilitated. The results show two things: first, the actors always have leeway to negotiate ad hoc regulations, which are alternatives to the application of the predefined rules; second, the conclusion of bi-/multilateral negotiated agreements depends directly on the leeway left by the institutional context. This explains why negotiation between forest owners and operators of water catchments is needed in Indonesia, is possible in France, but does not succeed in Switzerland. This offers an explanation for the many unsuccessful attempts to implement negotiated solutions, notably payments for environmental services (PES).

Relevance: 60.00%

Abstract:

Recognition by the T-cell receptor (TCR) of immunogenic peptides (p) presented by Class I major histocompatibility complexes (MHC) is the key event in the immune response against virus-infected cells or tumor cells. A study of the 2C TCR/SIYR/H-2K(b) system using a computational alanine scanning and a much faster binding free energy decomposition based on the Molecular Mechanics-Generalized Born Surface Area (MM-GBSA) method is presented. The results show that the TCR-p-MHC binding free energy decomposition using this approach and including entropic terms provides a detailed and reliable description of the interactions between the molecules at an atomistic level. Comparison of the decomposition results with experimentally determined activity differences for alanine mutants yields a correlation of 0.67 when the entropy is neglected and 0.72 when the entropy is taken into account. Similarly, comparison of experimental activities with variations in binding free energies determined by computational alanine scanning yields correlations of 0.72 and 0.74 when the entropy is neglected or taken into account, respectively. Some key interactions for the TCR-p-MHC binding are analyzed and some possible side chains replacements are proposed in the context of TCR protein engineering. In addition, a comparison of the two theoretical approaches for estimating the role of each side chain in the complexation is given, and a new ad hoc approach to decompose the vibrational entropy term into atomic contributions, the linear decomposition of the vibrational entropy (LDVE), is introduced. The latter allows the rapid calculation of the entropic contribution of interesting side chains to the binding. This new method is based on the idea that the most important contributions to the vibrational entropy of a molecule originate from residues that contribute most to the vibrational amplitude of the normal modes. The LDVE approach is shown to provide results very similar to those of the exact but highly computationally demanding method.
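
As a hedged sketch of the LDVE idea (not the authors' exact formulation), the snippet below computes the harmonic-oscillator vibrational entropy of each normal mode and distributes it over atoms in proportion to their squared components in the mode eigenvector; summing the per-atom terms recovers the total vibrational entropy. The weighting scheme and the toy frequencies are assumptions made for the illustration.

```python
import numpy as np

# Illustrative "linear decomposition of the vibrational entropy": per-mode
# entropies (harmonic-oscillator formula) are spread over atoms according to
# their squared eigenvector components, so atoms that dominate the vibrational
# amplitude of a mode receive most of that mode's entropy.

KB = 1.380649e-23        # Boltzmann constant, J/K
HBAR = 1.054571817e-34   # reduced Planck constant, J*s
T = 300.0                # temperature, K

def mode_entropy(omega):
    """Vibrational entropy of one harmonic mode (angular frequency, rad/s), in units of kB."""
    x = HBAR * omega / (KB * T)
    return x / np.expm1(x) - np.log1p(-np.exp(-x))

def ldve(frequencies, eigenvectors):
    """frequencies: (n_modes,); eigenvectors: (n_atoms, n_modes), columns of unit norm.
    Returns per-atom entropy contributions, in units of kB."""
    s_modes = mode_entropy(frequencies)   # (n_modes,)
    weights = eigenvectors ** 2           # share of each mode carried by each atom
    return weights @ s_modes              # (n_atoms,)

# Toy example: 3 atoms, 2 modes.
freqs = np.array([2.0e13, 5.0e13])
vecs = np.array([[0.900, 0.100],
                 [0.300, 0.200],
                 [0.316, 0.975]])
per_atom = ldve(freqs, vecs)
print(per_atom, "sum =", per_atom.sum(), "total =", mode_entropy(freqs).sum())
```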

Relevance: 60.00%

Abstract:

The identification of genetically homogeneous groups of individuals is a long-standing issue in population genetics. A recent Bayesian algorithm implemented in the software STRUCTURE allows the identification of such groups. However, the ability of this algorithm to detect the true number of clusters (K) in a sample of individuals when patterns of dispersal among populations are not homogeneous has not been tested. The goal of this study is to carry out such tests, using various dispersal scenarios from data generated with an individual-based model. We found that in most cases the estimated 'log probability of data' does not provide a correct estimation of the number of clusters, K. However, using an ad hoc statistic, DeltaK, based on the rate of change in the log probability of data between successive K values, we found that STRUCTURE accurately detects the uppermost hierarchical level of structure for the scenarios we tested. As might be expected, the results are sensitive to the type of genetic marker used (AFLP vs. microsatellite), the number of loci scored, the number of populations sampled, and the number of individuals typed in each sample.
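
For reference, the DeltaK statistic can be computed from the estimated log probabilities of replicate STRUCTURE runs. The sketch below follows the commonly used implementation, in which the absolute second-order difference of the mean L(K) is divided by the standard deviation of L(K) across runs; the log-probability values are invented.

```python
import numpy as np

# DeltaK(K) = |mean L(K+1) - 2*mean L(K) + mean L(K-1)| / sd(L(K)),
# computed over replicate runs; the K with the largest DeltaK indicates
# the uppermost hierarchical level of structure.

def delta_k(logprobs):
    """logprobs: dict mapping K -> list of L(K) values from replicate runs."""
    ks = sorted(logprobs)
    result = {}
    for i, k in enumerate(ks[1:-1], start=1):
        l_prev = np.mean(logprobs[ks[i - 1]])
        l_k = np.asarray(logprobs[k], dtype=float)
        l_next = np.mean(logprobs[ks[i + 1]])
        second_diff = abs(l_next - 2 * l_k.mean() + l_prev)
        result[k] = second_diff / np.std(l_k, ddof=1)
    return result

# Invented example where L(K) plateaus after the true K = 3.
runs = {1: [-5200, -5210, -5195],
        2: [-4300, -4310, -4295],
        3: [-3900, -3905, -3898],
        4: [-3890, -3870, -3910],
        5: [-3885, -3860, -3905]}
dk = delta_k(runs)
print(dk, "-> inferred K =", max(dk, key=dk.get))
```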

Relevance: 60.00%

Abstract:

Adolescent health surveys, like those for other segments of the population, tend to remain in the hands of researchers, where they can have no real impact on the way critical health issues are dealt with by policy makers or other professionals directly connected to young people in their everyday work. This paper reviews important issues concerning the dissemination of survey results among professionals from various fields. The content, length and wording of the messages should be tailored to the audience one wants to reach, as well as to the type of channels used for their diffusion. Survey data sets can be used to select priorities for interventions: ad hoc presentations, attractive summaries and brochures, or even films expressing young people's opinions have been used by European public health professionals to make data sets usable in various local, regional and national contexts. CONCLUSION: The impact of these dissemination strategies is, however, difficult to assess and needs to be refined. The adequate delivery of survey findings, as well as advocacy and lobbying activities, requires specific skills that can be provided by specialized professionals. Ultimately, it is the researchers' responsibility to ensure that such tasks are effectively performed.

Relevance: 60.00%

Abstract:

Many studies show strong variation in health care consumption between regions, suggesting that these variations are related to the uncertainty of medical practice or to other factors related to health services or patients' attitudes. However, the statistical interpretation of these variations is far from easy: apart from the usual and specific information biases, there are statistical problems when observing the incidence of events such as health care consumption. It is in fact a rare event, observed within small populations and among regions with unequal numbers of persons. Therefore, much of the reported variation may well be explained by a purely statistical phenomenon. This paper presents some aspects of this variability for three common indicators of variation and suggests the use of ad hoc simulations to derive statistical criteria.
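
The kind of ad hoc simulation suggested here can be illustrated as follows: assume the same true event rate in every region, so that all observed differences are due to chance alone, simulate Poisson counts for regions of unequal size, and read off the null distribution of a variation indicator. The populations, rate and indicator (the extremal quotient, i.e. the highest regional rate divided by the lowest) below are invented for the illustration.

```python
import numpy as np

# Null distribution of regional variation under a common rate: even with no
# true differences, rare events in small and unequal populations produce
# sizeable apparent variation between regions.

rng = np.random.default_rng(0)

populations = np.array([8_000, 15_000, 40_000, 120_000, 300_000])  # persons per region
true_rate = 2.0 / 1000   # identical expected rate everywhere: 2 events per 1000 persons
n_sim = 10_000

def extremal_quotient(counts, pops):
    rates = counts / pops
    return rates.max() / max(rates.min(), 1e-12)

null_eq = np.empty(n_sim)
for s in range(n_sim):
    counts = rng.poisson(true_rate * populations)
    null_eq[s] = extremal_quotient(counts, populations)

print("95th percentile of the extremal quotient under pure chance:",
      round(np.quantile(null_eq, 0.95), 2))
# An observed extremal quotient below this value is compatible with purely
# random variation between regions.
```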

Relevance: 60.00%

Abstract:

Interpretability and power of genome-wide association studies can be increased by imputing unobserved genotypes, using a reference panel of individuals genotyped at higher marker density. For many markers, genotypes cannot be imputed with complete certainty, and the uncertainty needs to be taken into account when testing for association with a given phenotype. In this paper, we compare currently available methods for testing association between uncertain genotypes and quantitative traits. We show that some previously described methods offer poor control of the false-positive rate (FPR), and that satisfactory performance of these methods is obtained only by using ad hoc filtering rules or by using a harsh transformation of the trait under study. We propose new methods that are based on exact maximum likelihood estimation and use a mixture model to accommodate nonnormal trait distributions when necessary. The new methods adequately control the FPR and also have equal or better power compared to all previously described methods. We provide a fast software implementation of all the methods studied here; our new method requires computation time of less than one computer-day for a typical genome-wide scan, with 2.5 M single nucleotide polymorphisms and 5000 individuals.
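
As a hedged sketch of the general idea (not the authors' exact estimator or software), the snippet below marginalises a normal trait likelihood over each individual's genotype probabilities and tests association with a likelihood-ratio test; all names and the simulated data are illustrative.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import chi2, norm

# Association test with uncertain genotypes: each individual carries posterior
# probabilities for genotypes 0/1/2, and the trait likelihood is a mixture over
# these genotypes. Association is assessed by a likelihood-ratio test against
# the model with no genotype effect.

def neg_loglik(params, y, probs):
    b0, b1, log_sigma = params
    sigma = np.exp(log_sigma)
    dens = np.stack([norm.pdf(y, b0 + b1 * g, sigma) for g in (0, 1, 2)], axis=1)
    lik = np.sum(probs * dens, axis=1)        # mixture over possible genotypes
    return -np.sum(np.log(lik + 1e-300))

def lrt_pvalue(y, probs):
    full = minimize(neg_loglik, x0=[y.mean(), 0.0, np.log(y.std())],
                    args=(y, probs), method="Nelder-Mead")
    null = minimize(lambda p: neg_loglik([p[0], 0.0, p[1]], y, probs),
                    x0=[y.mean(), np.log(y.std())], method="Nelder-Mead")
    stat = 2 * (null.fun - full.fun)
    return chi2.sf(stat, df=1)

# Simulated example: a true effect, observed through imperfect imputation.
rng = np.random.default_rng(1)
n = 500
g_true = rng.binomial(2, 0.3, n)
y = 0.5 * g_true + rng.normal(0, 1, n)
probs = np.full((n, 3), 0.1)
probs[np.arange(n), g_true] = 0.8
print("LRT p-value:", lrt_pvalue(y, probs))
```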

Relevance: 60.00%

Abstract:

Background: Microarray data are frequently used to characterize the expression profile of a whole genome and to compare the characteristics of that genome under several conditions. Geneset analysis methods have been described previously to analyze the expression values of several genes related by known biological criteria (metabolic pathway, pathology signature, co-regulation by a common factor, etc.) at the same time, and the cost of these methods allows more values to be used to help discover the underlying biological mechanisms. Results: As the available methods assume different null hypotheses, we propose to reformulate the main question that biologists seek to answer. To determine which genesets are associated with expression values that differ between two experiments, we focused on three ad hoc criteria: expression levels, the direction of individual gene expression changes (up- or down-regulation), and correlations between genes. We introduce the FAERI methodology, adapted from a two-way ANOVA, to examine these criteria. The significance of the results was evaluated according to the self-contained null hypothesis, using label sampling or by inferring the null distribution from normally distributed random data. Evaluations performed on simulated data revealed that FAERI outperforms currently available methods for each type of set tested. We then applied the FAERI method to analyze three real-world datasets on the hypoxia response. FAERI was able to detect more genesets than other methodologies, and the genesets selected were coherent with current knowledge of the cellular response to hypoxia. Moreover, the genesets selected by FAERI were confirmed when the analysis was repeated on two additional related datasets. Conclusions: The expression values of genesets are associated with several biological effects. The underlying mathematical structure of genesets allows data from several genes to be analyzed at the same time. By focusing on expression levels, the direction of expression changes, and correlations, we showed that a two-step data reduction significantly improves the performance of geneset analysis using a modified two-way ANOVA procedure and detects genesets that current methods fail to detect.
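
The label-sampling evaluation mentioned above can be sketched generically; the FAERI statistic itself is not reproduced, and the geneset score below (mean absolute per-gene difference between conditions) is only a placeholder. The score is computed on the true condition labels and then on many permutations of those labels, yielding a self-contained p-value.

```python
import numpy as np

# Label sampling for geneset significance: permute the condition labels,
# recompute the geneset score, and compare the observed score to the
# permutation distribution.

rng = np.random.default_rng(2)

def geneset_score(expr, labels):
    """expr: (n_genes, n_samples); labels: boolean array, True = condition B."""
    diff = expr[:, labels].mean(axis=1) - expr[:, ~labels].mean(axis=1)
    return np.abs(diff).mean()

def label_sampling_pvalue(expr, labels, n_perm=5000):
    observed = geneset_score(expr, labels)
    perm_scores = np.empty(n_perm)
    for i in range(n_perm):
        perm_scores[i] = geneset_score(expr, rng.permutation(labels))
    return (1 + np.sum(perm_scores >= observed)) / (1 + n_perm)

# Toy geneset: 20 genes, 6 + 6 samples, half of the genes induced in condition B.
labels = np.array([False] * 6 + [True] * 6)
expr = rng.normal(0.0, 1.0, (20, 12))
expr[:10, labels] += 1.0
print("geneset p-value:", label_sampling_pvalue(expr, labels))
```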