201 results for Hand-drawn concept map


Relevance:

20.00%

Publisher:

Abstract:

Debris flows are one of the most important vectors of sediment transfer in mountainous areas. Their hydro-geomorphological behaviour is conditioned by geological, geomorphological, topographical, hydrological, climatic and anthropic factors. European research in torrential systems has focused more on hydrological processes than on geomorphological processes acting as debris flow triggers.
Nevertheless, the identification of sediment volumes that have the potential to be mobilised in small torrential systems, as well as the recognition of the processes responsible for their mobilisation and transfer within the torrential system, are important in terms of land-use planning and natural hazard management. Moreover, a correlation between rainfall and debris flow occurrence is not always established, and a number of debris flows seem to occur when a poorly understood geomorphological threshold (the degree of channel infill) is reached. A pragmatic methodology has been developed for mapping the sediment storages that may constitute source zones for bed-load transport and debris flows, as a preliminary tool before quantifying their volumes. It is based on data derived directly from GIS analysis of high-resolution DEMs, field measurements and aerial photograph interpretation. It was conceived to estimate sediment transfer dynamics, taking into account the role of the different sediment stores in the torrential system by applying the concept of the "sediment cascade" from a cartographic point of view. Sediment transfer processes were investigated in two small catchments in the Swiss Alps (Bruchi torrent, Blatten bei Naters, and Meretschibach torrent, Agarn). Thorough field geomorphological mapping was coupled with complementary measurements to estimate sediment fluxes and denudation rates, using various methods (reference coloured lines, wooden markers and terrestrial LiDAR). The proposed geomorphological mapping methodology is innovative in comparison with most existing legend systems, which are not adequate for mapping active and complex geomorphological systems such as debris flow catchments. The interest of this mapping method is that it allows the concept of the sediment cascade to be spatially implemented, but only for supply-limited systems.
The map cannot be used directly for the creation of hazard maps, which focus on deposition areas, but it is useful for the design of correction measures and the implementation of monitoring and warning systems. The second part of this work focuses on geomorphological mapping. An analysis of a sample of 146 (extracts of) maps or legend systems, dating from the middle of the 20th century to 2009 and produced in more than 40 different countries, was carried out. Even if this study is not exhaustive, it shows a clear renewed interest in the discipline worldwide, and it highlights the diversity of the applications and techniques (scale, colours and symbology) used in their conception.

Relevance:

20.00%

Publisher:

Abstract:

Empirical testing of candidate vaccines has led to the successful development of a number of lifesaving vaccines. The advent of new tools to manipulate antigens and new methods and vectors for vaccine delivery has led to a veritable explosion of potential vaccine designs. As a result, selection of candidate vaccines suitable for large-scale efficacy testing has become more challenging. This is especially true for diseases such as dengue, HIV, and tuberculosis where there is no validated animal model or correlate of immune protection. Establishing guidelines for the selection of vaccine candidates for advanced testing has become a necessity. A number of factors could be considered in making these decisions, including, for example, safety in animal and human studies, immune profile, protection in animal studies, production processes with product quality and stability, availability of resources, and estimated cost of goods. The "immune space template" proposed here provides a standardized approach by which the quality, level, and durability of immune responses elicited in early human trials by a candidate vaccine can be described. The immune response profile will demonstrate if and how the candidate is unique relative to other candidates, especially those that have preceded it into efficacy testing and, thus, what new information concerning potential immune correlates could be learned from an efficacy trial. A thorough characterization of immune responses should also provide insight into a developer's rationale for the vaccine's proposed mechanism of action. HIV vaccine researchers plan to include this general approach in up-selecting candidates for the next large efficacy trial. This "immune space" approach may also be applicable to other vaccine development endeavors where correlates of vaccine-induced immune protection remain unknown.

Relevance:

20.00%

Publisher:

Abstract:

Introduction: Several methods have already been proposed to improve the mobility of reversed prostheses (lateral or inferior displacement of the glenosphere, increase of the glenosphere size). However, the effect of these design changes has only been evaluated on the maximal range of motion and was not related to activities of daily living (ADL). Our aim was thus to measure the effect of these design changes and to relate it to 4 typical ADL. Methods: CT data were used to reconstruct an accurate geometric model of the scapula and humerus. The Aequalis reversed prosthesis (Tornier) was used. The mobility of a healthy shoulder was compared to the mobility of 4 different reversed designs: 36 and 42 mm glenosphere diameters, and inferior (4 mm) and lateral (3.2 mm) glenosphere displacements. The complete mobility map of the prosthesis was compared to kinematic measurements on healthy subjects for 4 ADL: 1) hand to contralateral shoulder, 2) hand to mouth, 3) combing hair, 4) hand to back pocket. The results are presented as the percentage of movement allowed by the prosthetic shoulder relative to the healthy shoulder, considered as the control group. Results: None of the tested designs allowed full mobility to be recovered. The differences in allowed range of motion among the prosthetic designs appeared mainly in two of the 4 movements: hand to back pocket and hand to contralateral shoulder. For the hand to back pocket, the 36 mm glenosphere had the lowest mobility range, particularly for the last third of the movement. The 42 mm glenosphere appeared to be a good compromise for all ADL. Conclusion: Reverse shoulder prostheses do not allow a full range of motion to be recovered compared to healthy shoulders, even for ADL. The present study allowed us to obtain a complete 3D mobility map for several glenosphere positions and sizes, and to relate it to typical ADL. We mainly observed improved mobility with inferior displacement and increased glenosphere size. We would suggest using a larger glenosphere whenever possible.
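The outcome measure described above (prosthetic range of motion as a percentage of the healthy shoulder's) reduces to a simple ratio. A minimal sketch, in which the function name and the example angles are illustrative assumptions rather than values from the study:

```python
def mobility_percentage(prosthetic_rom, healthy_rom):
    """Movement allowed by the prosthetic shoulder, expressed as a percentage
    of the healthy shoulder's range of motion for one ADL (illustrative)."""
    if healthy_rom <= 0:
        raise ValueError("healthy range of motion must be positive")
    return 100.0 * prosthetic_rom / healthy_rom

# Hypothetical ranges of motion (degrees) for a 'hand to back pocket' task
pct = mobility_percentage(prosthetic_rom=45.0, healthy_rom=60.0)  # 75.0
```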

Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND: In May 2010, Switzerland introduced a heterogeneous smoking ban in the hospitality sector. While the law leaves room for exceptions in some cantons, it is comprehensive in others. This longitudinal study uses different measurement methods to examine airborne nicotine levels in hospitality venues and the level of personal exposure of non-smoking hospitality workers before and after implementation of the law. METHODS: Personal exposure to second-hand smoke (SHS) was measured by three different methods: we compared a passive sampler, the MoNIC (Monitor of NICotine) badge, to salivary cotinine and nicotine concentrations as well as questionnaire data. Badges allowed the number of passively smoked cigarettes to be estimated. They were placed at the venues as well as distributed to the participants for personal measurements. To assess personal exposure at work, a time-weighted average of the workplace badge measurements was calculated. RESULTS: Prior to the ban, smoke-exposed hospitality venues yielded a mean badge value of 4.48 (95% CI: 3.7 to 5.25; n = 214) cigarette equivalents/day. At follow-up, measurements in venues that had implemented a smoking ban significantly declined to an average of 0.31 (0.17 to 0.45; n = 37) (p = 0.001). Personal badge measurements also significantly decreased, from an average of 2.18 (1.31 to 3.05; n = 53) to 0.25 (0.13 to 0.36; n = 41) (p = 0.001). Spearman rank correlations between badge exposure measures and salivary measures were small to moderate (at most 0.3). CONCLUSIONS: Nicotine levels significantly decreased in all types of hospitality venues after implementation of the smoking ban. In-depth analyses demonstrated that a time-weighted average of the workplace badge measurements represented typical personal SHS exposure at work more reliably than personal exposure measures such as salivary cotinine and nicotine.
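The time-weighted average of workplace badge measurements mentioned above can be sketched as follows. The function name and the shift values are illustrative assumptions, not the study's actual procedure or data:

```python
def time_weighted_average(shifts):
    """Time-weighted average exposure, where each shift is a
    (hours_worked, cigarette_equivalents_per_day) pair from a badge."""
    total_hours = sum(h for h, _ in shifts)
    if total_hours == 0:
        raise ValueError("no working time recorded")
    return sum(h * v for h, v in shifts) / total_hours

# Hypothetical workweek split across three venues with different badge values
weekly_exposure = time_weighted_average([(20, 4.0), (10, 1.2), (10, 0.4)])
print(weekly_exposure)  # 2.4 cigarette equivalents/day
```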

Relevance:

20.00%

Publisher:

Abstract:

Mitogen-activated protein kinase (MAPK) cascades regulate a wide variety of cellular processes that ultimately depend on changes in gene expression. We have found a novel mechanism whereby one of the key MAP3 kinases, Mekk1, regulates transcriptional activity through an interaction with p53. The tumor suppressor protein p53 down-regulates a number of genes, including the gene most frequently mutated in autosomal dominant polycystic kidney disease (PKD1). We have discovered that Mekk1 translocates to the nucleus and acts as a co-repressor with p53 to down-regulate PKD1 transcriptional activity. This repression does not require Mekk1 kinase activity, excluding the need for an Mekk1 phosphorylation cascade. However, this PKD1 repression can also be induced by the stress-pathway stimuli, including TNFα, suggesting that Mekk1 activation induces both JNK-dependent and JNK-independent pathways that target the PKD1 gene. An Mekk1-p53 interaction at the PKD1 promoter suggests a new mechanism by which abnormally elevated stress-pathway stimuli might directly down-regulate the PKD1 gene, possibly causing haploinsufficiency and cyst formation.

Relevance:

20.00%

Publisher:

Abstract:

The rationale of this study was to investigate molecular flexibility and its influence on physicochemical properties with a view to uncovering additional information on the fuzzy concept of dynamic molecular structure. Indeed, it is now known that computed molecular interaction fields (MIFs) such as molecular electrostatic potentials (MEPs) and lipophilicity potentials (MLPs) are conformation-dependent, as are dipole moments. A database of 125 compounds was used whose conformational space was explored, while conformation-dependent parameters were computed for each non-redundant conformer found in the conformational space of the compounds. These parameters were the virtual log P (log P(MLP), calculated by a MLP approach), the apolar surface area (ASA), polar surface area (PSA), and solvent-accessible surface (SAS). For each compound, the range taken by each parameter (its property space) was divided by the number of rotors taken as an index of flexibility, yielding a parameter termed 'molecular sensitivity'. This parameter was poorly correlated with others (i.e., it contains novel information) and showed the compounds to fall into two broad classes. 'Sensitive' molecules are those whose computed property ranges are markedly sensitive to conformational effects, whereas 'insensitive' (in fact, less sensitive) molecules have property ranges which are comparatively less affected by conformational fluctuations. A pharmacokinetic application is presented.
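The 'molecular sensitivity' parameter described above (the range of a conformation-dependent property divided by the number of rotors) can be sketched as below. The function name and the example log P values are assumptions for illustration only:

```python
def molecular_sensitivity(property_values, n_rotors):
    """'Molecular sensitivity': the range spanned by a conformation-dependent
    property across conformers, divided by the number of rotatable bonds."""
    if n_rotors == 0:
        return 0.0  # a rigid molecule explores no conformational property space
    return (max(property_values) - min(property_values)) / n_rotors

# Hypothetical virtual log P values over four non-redundant conformers
sensitivity = molecular_sensitivity([2.1, 2.4, 1.8, 2.9], n_rotors=4)
```

'Sensitive' molecules would show a high score (wide property range per rotor), 'insensitive' ones a low score.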

Relevance:

20.00%

Publisher:

Abstract:

Popularised by the extensive (and thereby somewhat debased) use Gérard Genette made of it in literary narratology, 'diegesis' is certainly the flagship notion of the filmologists, yet for several decades its original field was partly obscured by the unreferenced appropriations of the term by poeticians and semiologists. Following the historiographical thread of the different uses and meanings of this notion, associated with Etienne Souriau in the French-speaking world, this article offers an overview which, entering through the side door of terminology, examines the theoretical consequences of the variations the term 'diegesis' has undergone and, more generally, the status granted to the filmological current. By examining the implications of the original definition, which must sometimes be stripped of the meanings it was later invested with, the article attempts to show that the framework within which diegesis was conceptualised holds theoretical potential that has not yet been exhausted, and that converges with certain recent fields of study (such as the logic of possible worlds) whose application to cinema has probably never been as relevant as in the era of computer-generated images, virtual worlds and video game environments.

Relevance:

20.00%

Publisher:

Abstract:

Recognition and identification processes for deceased persons. Determining the identity of deceased persons is a routine task performed essentially by police departments and forensic experts. This thesis highlights the processes necessary for the proper and transparent determination of the civil identities of deceased persons. The identity of a person is defined as the establishment of a link between that person ("the source") and information pertaining to the same individual ("identifiers"). Various identity forms could emerge, depending on the nature of the identifiers. There are two distinct types of identity, namely civil identity and biological identity. The paper examines four processes: identification by witnesses (the recognition process) and comparisons of fingerprints, dental data and DNA profiles (the identification processes). During the recognition process, the memory function is examined and helps to clarify circumstances that may give rise to errors. To make the process more rigorous, a body presentation procedure is proposed to investigators. Before examining the other processes, three general concepts specific to forensic science are considered with regard to the identification of a deceased person, namely, matter divisibility (Inman and Rudin), transfer (Locard) and uniqueness (Kirk). These concepts can be applied to the task at hand, although some require a slightly broader scope of application. A cross comparison of common forensic fields and the identification of deceased persons reveals certain differences, including 1 - reverse positioning of the source (i.e. the source is not sought from traces, but rather the identifiers are obtained from the source); 2 - the need for civil identity determination in addition to the individualisation stage; and 3 - a more restricted population (closed set), rather than an open one. 
For fingerprints, dental and DNA data, intravariability and intervariability are examined, as well as changes in these post-mortem (PM) identifiers. Ante-mortem (AM) identifiers are located and AM-PM comparisons made. For DNA, it has been shown that direct identifiers (taken from a person whose civil identity has been alleged) tend to lead to a determination of civil identity, whereas indirect identifiers (obtained from a close relative) lead towards a determination of biological identity. For each process, a Bayesian model is presented which includes the sources of uncertainty deemed to be relevant. The results of the different processes are then combined to structure and summarise an overall outcome and methodology. The modelling of dental data presents a specific difficulty with respect to intravariability, which in itself is not quantifiable. The concept of "validity" is therefore suggested as a possible solution to this problem: validity uses various parameters that have an acknowledged impact on dental intravariability. In cases where identifying deceased persons proves extremely difficult because of the limited discriminating power of certain processes, the Bayesian approach is of great value in providing a transparent and synthetic assessment.
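The Bayesian combination of identification processes described above can be illustrated with a simple odds-based sketch. The prior and the likelihood-ratio values are assumptions for illustration and do not come from the thesis:

```python
def posterior_probability(prior, likelihood_ratios):
    """Combine a prior probability of identity with likelihood ratios from
    independent processes (e.g. fingerprints, dental data, DNA) via odds."""
    odds = prior / (1.0 - prior)
    for lr in likelihood_ratios:
        odds *= lr          # each independent process multiplies the odds
    return odds / (1.0 + odds)

# Hypothetical closed set of 100 missing persons (prior = 1/100) and two
# processes with assumed likelihood ratios of 200 and 30
p = posterior_probability(0.01, [200.0, 30.0])
```

Even moderately discriminating processes can combine into a high posterior probability, which is why a transparent synthesis matters in difficult cases.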

Relevance:

20.00%

Publisher:

Abstract:

Neuroticism is a moderately heritable personality trait considered to be a risk factor for developing major depression, anxiety disorders and dementia. We performed a genome-wide association study in 2,235 participants drawn from a population-based study of neuroticism, making this the largest association study for neuroticism to date. Neuroticism was measured by the Eysenck Personality Questionnaire. After quality control, we analysed 430,000 autosomal SNPs together with an additional 1.2 million SNPs imputed with high quality from the HapMap CEU samples. We found a very small effect of population stratification, corrected using one principal component, and some cryptic kinship that required no correction. NKAIN2 showed suggestive evidence of association with neuroticism as a main effect (p < 10^-6) and GPC6 showed suggestive evidence for interaction with age (p ≈ 10^-7). We found support for one previously reported association (PDE4D), but failed to replicate other recent reports. These results suggest that common SNP variation does not strongly influence neuroticism. Our study was powered to detect almost all SNPs explaining at least 2% of heritability, and so our results effectively exclude the existence of loci having a major effect on neuroticism.

Relevance:

20.00%

Publisher:

Abstract:

The function of DNA-binding proteins is controlled not just by their abundance, but mainly at the level of their activity in terms of their interactions with DNA and protein targets. Moreover, the affinity of such transcription factors to their target sequences is often controlled by co-factors and/or modifications that are not easily assessed from biological samples. Here, we describe a scalable method for monitoring protein-DNA interactions on a microarray surface. This approach was designed to determine the DNA-binding activity of proteins in crude cell extracts, complementing conventional expression profiling arrays. Enzymatic labeling of DNA enables direct normalization of the protein binding to the microarray, allowing the estimation of relative binding affinities. Using DNA sequences covering a range of affinities, we show that the new microarray-based method yields binding strength estimates similar to low-throughput gel mobility-shift assays. The microarray is also of high sensitivity, as it allows the detection of a rare DNA-binding protein from breast cancer cells, the human tumor suppressor AP-2. This approach thus mediates precise and robust assessment of the activity of DNA-binding proteins and takes present DNA-binding assays to a high throughput level.
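The normalisation step described above (each spot's protein-binding signal scaled by its enzymatically labelled DNA signal, to yield relative binding affinities) can be sketched as follows; the function name and the spot intensities are hypothetical:

```python
def relative_binding(protein_signals, dna_signals):
    """Normalise each spot's protein-binding signal by its labelled-DNA
    signal, then scale by the strongest spot to get relative affinities."""
    ratios = [p / d for p, d in zip(protein_signals, dna_signals)]
    top = max(ratios)
    return [r / top for r in ratios]

# Hypothetical fluorescence intensities for three probe sequences
rel = relative_binding([900.0, 450.0, 90.0], [1000.0, 1000.0, 500.0])
```

Dividing by the DNA signal corrects for unequal spotting, so differences in the resulting ratios reflect binding strength rather than probe amount.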

Relevance:

20.00%

Publisher:

Abstract:

The identity [r]evolution is happening. Who are you, who am I in the information society? In recent years, the convergence of several factors - technological, political, economic - has accelerated a fundamental change in our networked world. On a technological level, information becomes easier to gather, to store, to exchange and to process. The belief that more information brings more security has been a strong political driver to promote information gathering since September 11. Profiling intends to transform information into knowledge in order to anticipate one's behaviour, or needs, or preferences. It can lead to categorizations according to some specific risk criteria, for example, or to direct and personalized marketing. As a consequence, new forms of identities appear. They are not necessarily related to our names anymore. They are based on information, on traces that we leave when we act or interact, when we go somewhere or just stay in one place, or even sometimes when we make a choice. They are related to the SIM cards of our mobile phones, to our credit card numbers, to the pseudonyms that we use on the Internet, to our email addresses, to the IP addresses of our computers, to our profiles... Like traditional identities, these new forms of identities can allow us to distinguish an individual within a group of people, or describe this person as belonging to a community or a category. How far have we moved through this process? The identity [r]evolution is already becoming part of our daily lives. People are eager to share information with their "friends" in social networks like Facebook, in chat rooms, or in Second Life. Customers take advantage of the numerous bonus cards that are made available. Video surveillance is becoming the rule. In several countries, traditional ID documents are being replaced by biometric passports with RFID technologies. 
This raises several privacy issues and may even change the perception of the concept of privacy itself, in particular among the younger generation. In the information society, our (partial) identities become the illusory masks that we choose - or that we are assigned - to interact and communicate with one another. Rights, obligations, responsibilities, even reputation are increasingly associated with these masks. On the one hand, these masks become the key to accessing restricted information and using services. On the other hand, in case of fraud or negative reputation, the owner of such a mask can be penalised: doors remain closed, access to services is denied. Hence the current preoccupying growth of impersonation, identity theft and other identity-related crimes. Where is the path of the identity [r]evolution leading us? This booklet gives a glance at possible scenarios in the field of identity.