28 results for map-matching gps gps-traces openstreetmap past-choice-modeling


Relevance: 30.00%

Publisher:

Abstract:

Biological invasions and land-use changes are two major causes of the global modification of biodiversity. Habitat suitability models are the tools of choice to predict potential distributions of invasive species. Although land-use is a key driver of alien species invasions, it is often assumed that land-use is constant in time. Here we combine historical and present-day information to evaluate whether land-use changes could explain the dynamics of invasion of the American bullfrog Rana catesbeiana (= Lithobates catesbeianus) in Northern Italy, from the 1950s to the present day. We used Maxent to build habitat suitability models on the basis of past (1960s, 1980s) and present-day data on land-use and species distribution. For example, we used models built using the 1960s data to predict distribution in the 1980s, and so on. Furthermore, we used land-use scenarios to project suitability into the future. Habitat suitability models predicted the spread of bullfrogs in the subsequent temporal step well. Models considering land-use changes predicted invasion dynamics better than models assuming constant land-use over the last 50 years. Scenarios of future land-use suggest that suitability will remain similar in the coming years. Habitat suitability models can help us understand and predict the dynamics of invasions; however, land-use is not constant in time: land-use modifications can strongly affect invasions; furthermore, both land management and the suitability of a given land-use class may vary in time. Integrating land-use changes into studies of biological invasions can help improve management strategies.
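The study's "train on one decade, predict the next" validation idea can be sketched in a few lines. Everything below is invented: the real work used Maxent and actual land-use layers, whereas this stand-in fits a plain logistic-regression suitability model on synthetic presence/background points for one period and scores it on the next, mirroring only the temporal-validation logic.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def simulate(n):
    """Synthetic presence/background points with two land-use covariates
    (say, wetland and urban fractions); all coefficients are invented."""
    X = rng.random((n, 2))
    logit = 3.0 * X[:, 0] - 2.0 * X[:, 1] - 1.0
    y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))
    return X, y.astype(int)

# One snapshot per period, mimicking the 1960s and 1980s data sets.
X_60s, y_60s = simulate(2000)
X_80s, y_80s = simulate(2000)

# Train on the earlier period, evaluate on the later one -- the paper's
# temporal validation, with logistic regression standing in for Maxent.
model = LogisticRegression().fit(X_60s, y_60s)
accuracy = model.score(X_80s, y_80s)
```

In the paper's dynamic variant, the covariates for the later period would come from the updated land-use maps rather than from the same distribution.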

Relevance: 30.00%

Publisher:

Abstract:

QUESTION UNDER STUDY: To assess how important the ability to choose specialist physicians is to Swiss residents and to determine which variables are associated with this opinion. METHODS: This cross-sectional study used data from the 2007 Swiss population-based health survey and included 13,642 non-institutionalised adults who responded to the telephone and paper questionnaires. The dependent variable comprised answers to the question "How important is it for you to be able to choose the specialist you would like to visit?" Independent variables included socio-demographics, health status and past-year healthcare use measures. Crude and adjusted logistic regressions for the importance of being able to choose specialist physicians were performed, accounting for the survey design. RESULTS: 45% of participants found it very important to be able to choose the specialist physician they wanted to visit. The answers "rather important", "rather not important" and "not important" were reported by 28%, 20% and 7% of respondents, respectively. Women, individuals in middle/high executive positions, those with an ordinary insurance scheme, those reporting ≥2 chronic conditions or poorer subjective health, and those who had had ≥2 outpatient visits in the preceding year were more likely to find this choice very important. CONCLUSIONS: In 2007, almost half of all Swiss residents found it very important to be able to choose their specialist physician. The further development of physician networks and other chronic disease management initiatives in Switzerland, towards integrated care, needs to pay attention to the freedom to choose specialist physicians that Swiss residents value. Future surveys should provide information on access to and consultations with specialist physicians.
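The crude (unadjusted) associations estimated here are classically summarised as odds ratios. A minimal sketch, with invented counts, of a crude odds ratio and its 95% Wald confidence interval for rating the choice "very important":

```python
import math

# Invented 2x2 table: exposure = women vs men, outcome = rated the free
# choice of specialist "very important" vs any other answer.
a, b = 3200, 3800   # women: very important / other
c, d = 2900, 3742   # men:   very important / other

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)

# 95% Wald interval built on the log-odds scale, then back-transformed.
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
```

The paper's adjusted estimates additionally condition on the other covariates and incorporate the survey weights, which this toy table ignores.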

Relevance: 30.00%

Publisher:

Abstract:

To what extent do social policy preferences explain party choice? This question has received little attention in recent years, because the bulk of the literature has argued that electoral choice is increasingly shaped by identity-based attitudes rather than by preferences for economic-distributive social policies. We argue that in the wake of this debate, the significance of social policy preferences for electoral choice has been underestimated, because most contributions neglect social policy debates that are specific to post-industrial societies. In particular, they focus merely on income redistribution while neglecting distributive conflicts around social investment. The Selects 2011 data allow us to investigate this crucial distinction for Switzerland. Our empirical analyses confirm that it is pivotal to take the pluridimensionality of distributive conflicts seriously: when looking at preferences for social investment rather than income redistribution, we find that social policy preferences are significant explanatory factors in the choice of the five major Swiss political parties.

Relevance: 30.00%

Publisher:

Abstract:

We investigate the coevolution between philopatry and altruism in island-model populations when kin recognition occurs through phenotype matching. In saturated environments, a good discrimination ability is a necessary prerequisite for the emergence of sociality. Discrimination decreases not only with the average phenotypic similarity between immigrants and residents (i.e., with environmental homogeneity and past gene flow) but also with the sampling variance of similarity distributions (a negative function of the number of traits sampled). Whether discrimination should rely on genetically or environmentally determined traits depends on the apportionment of phenotypic variance and, in particular, on the relative values of e (the among-group component of environmental variance) and r (the among-group component of genetic variance, which also measures relatedness among group members). If r exceeds e, highly heritable cues do better. Discrimination and altruism, however, remain low unless philopatry is enforced by ecological constraints. If e exceeds r, by contrast, nonheritable traits do better. High e values improve discrimination drastically and thus have the potential to drive sociality, even in the absence of ecological constraints. The emergence of sociality thus can be facilitated by enhancing e, which we argue is the main purpose of cue standardization within groups, as observed in many social insects, birds, and mammals, including humans.
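The claim that discrimination decreases with the sampling variance of similarity distributions (itself a negative function of the number of traits sampled) can be illustrated with a toy Monte Carlo sketch. All parameters are invented: residents and immigrants are classified by their mean phenotypic distance to a local template, and discrimination sharpens as more traits are compared.

```python
import numpy as np

rng = np.random.default_rng(7)

def discrimination_accuracy(n_traits, n_ind=4000):
    """Fraction of individuals correctly sorted into resident vs immigrant
    by thresholding their mean phenotypic distance to the local template."""
    template = np.zeros(n_traits)                         # residents' group cue
    residents = rng.normal(0.0, 1.0, (n_ind, n_traits))
    immigrants = rng.normal(1.0, 1.0, (n_ind, n_traits))  # shifted phenotype
    d_res = np.abs(residents - template).mean(axis=1)
    d_imm = np.abs(immigrants - template).mean(axis=1)
    threshold = (d_res.mean() + d_imm.mean()) / 2.0
    return ((d_res < threshold).mean() + (d_imm >= threshold).mean()) / 2.0

# Averaging over more traits shrinks the variance of the similarity
# distributions, so resident/immigrant overlap decreases.
few = discrimination_accuracy(2)
many = discrimination_accuracy(20)
```

Raising the between-group shift here plays the role of a larger among-group variance component (e or r) in the model.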

Relevance: 30.00%

Publisher:

Abstract:

SUMMARY: Almost all land plants form symbiotic associations with mycorrhizal fungi. These below-ground fungi play a key role in terrestrial ecosystems as they regulate nutrient and carbon cycles, and influence soil structure and ecosystem multifunctionality. Up to 80% of plant N and P is provided by mycorrhizal fungi and many plant species depend on these symbionts for growth and survival. Estimates suggest that there are c. 50 000 fungal species that form mycorrhizal associations with c. 250 000 plant species. The development of high-throughput molecular tools has helped us to better understand the biology, evolution, and biodiversity of mycorrhizal associations. Nuclear genome assemblies and gene annotations of 33 mycorrhizal fungal species are now available providing fascinating opportunities to deepen our understanding of the mycorrhizal lifestyle, the metabolic capabilities of these plant symbionts, the molecular dialogue between symbionts, and evolutionary adaptations across a range of mycorrhizal associations. Large-scale molecular surveys have provided novel insights into the diversity, spatial and temporal dynamics of mycorrhizal fungal communities. At the ecological level, network theory makes it possible to analyze interactions between plant-fungal partners as complex underground multi-species networks. Our analysis suggests that nestedness, modularity and specificity of mycorrhizal networks vary and depend on mycorrhizal type. Mechanistic models explaining partner choice, resource exchange, and coevolution in mycorrhizal associations have been developed and are being tested. This review ends with major frontiers for further research.
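Nestedness, one of the network properties discussed, can be computed on a toy bipartite incidence matrix. The sketch below implements a simplified version of the NODF metric commonly used for bipartite ecological networks; the plant-fungus web is invented, and a perfectly nested matrix scores 100.

```python
import numpy as np

def nodf(m):
    """Simplified NODF nestedness (0-100) of a binary incidence matrix."""
    def axis_scores(mat):
        mat = mat[np.argsort(-mat.sum(axis=1))]   # sort rows by degree, desc
        deg = mat.sum(axis=1)
        scores = []
        for i in range(len(mat)):
            for j in range(i + 1, len(mat)):
                if deg[j] == 0 or deg[j] >= deg[i]:
                    scores.append(0.0)            # NODF requires decreasing fill
                else:
                    shared = np.logical_and(mat[i], mat[j]).sum()
                    scores.append(100.0 * shared / deg[j])
        return scores
    return float(np.mean(axis_scores(m) + axis_scores(m.T)))

# Toy plant (rows) x mycorrhizal fungus (columns) web, invented for
# illustration: specialists interact with subsets of generalists' partners.
web = np.array([[1, 1, 1],
                [1, 1, 0],
                [1, 0, 0]])
print(nodf(web))  # → 100.0 (perfectly nested)
```

A checkerboard matrix such as [[1, 0], [0, 1]] scores 0, the fully non-nested extreme.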

Relevance: 30.00%

Publisher:

Abstract:

Whether for investigative or intelligence aims, crime analysts often face the need to analyse the spatiotemporal distribution of crimes or traces left by suspects. This article presents a visualisation methodology supporting recurrent practical analytical tasks such as the detection of crime series or the analysis of traces left by digital devices such as mobile phones or GPS units. The proposed approach has led to the development of a dedicated tool that has proven its effectiveness in real inquiries and intelligence practice. It supports a more fluent visual analysis of the collected data and may provide critical clues to support police operations, as exemplified by the presented case studies.
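One recurrent task of this kind, spotting temporal regularities across events, can be sketched as simple (weekday, hour) binning of timestamped data. The events below are invented, and the tool described in the article is of course far richer than this.

```python
from collections import Counter
from datetime import datetime

# Hypothetical timestamped events (e.g. offence times or trace points).
events = ["2023-03-04 22:15", "2023-03-04 23:40", "2023-03-11 22:05",
          "2023-03-07 09:30", "2023-03-14 09:55", "2023-03-18 23:10"]

# Bin into (weekday, hour) cells: recurring cells hint at a series pattern.
cells = Counter()
for e in events:
    t = datetime.strptime(e, "%Y-%m-%d %H:%M")
    cells[(t.strftime("%A"), t.hour)] += 1

for (day, hour), n in cells.most_common(3):
    print(f"{day} {hour:02d}:00  x{n}")
```

A heatmap of these cells is the usual visual rendering, with recurring Saturday-night cells standing out in this toy data.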

Relevance: 30.00%

Publisher:

Abstract:

Over the past few years, technological breakthroughs have helped competitive sports attain new levels. Training techniques, athlete management and methods to analyse specific technique and performance have sharpened, leading to performance improvement. Alpine skiing is no different. The objective of the present work was to study the technique of highly skilled alpine skiers performing giant slalom, in order to determine the quantity of energy that skiers can produce to increase their speed. To reach this goal, several tools were developed to allow field testing on ski slopes; a multi-camera system, a wireless synchronization system, an aerodynamic drag model and force platforms were specifically designed and built. The analyses performed using these tools highlighted the possibility for several athletes to increase their energy by approximately 1.5% using muscular work. Nevertheless, the athletes were on average not able to use their muscular work efficiently. By offering functional tools such as drift analysis using combined data from GPS and inertial sensors, or trajectory analysis based on tracking morphological points, this research makes possible the analysis of alpine skiers' technique and performance in real training conditions. The author wishes this work to be used as a basis for continued knowledge and understanding of alpine skiing technique.
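The energy analysis described above rests on the mechanical energy balance E = ½mv² + mgh: absent drag, friction and muscular work, E is conserved, so its sample-to-sample change isolates net gains and losses. A minimal sketch with invented altitude and speed samples:

```python
import numpy as np

G = 9.81      # m/s^2
MASS = 80.0   # kg, assumed skier + equipment mass (invented)

# Hypothetical samples along a giant-slalom section: altitude (m), speed (m/s).
altitude = np.array([1200.0, 1195.0, 1189.0, 1182.0])
speed = np.array([15.0, 16.2, 17.1, 17.8])

# Total mechanical energy at each sample.
energy = 0.5 * MASS * speed**2 + MASS * G * altitude

# Sample-to-sample change: negative values mean losses (drag, snow friction)
# exceeded whatever energy the skier added through muscular work.
delta = np.diff(energy)
rel_change = delta / energy[:-1]
```

The thesis's ~1.5% figure refers to energy added by muscular work relative to the skier's energy, estimated with far richer instrumentation (multi-camera kinematics, force platforms, a drag model) than this toy balance.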

Relevance: 30.00%

Publisher:

Abstract:

OBJECTIVE: Few epidemiological studies have addressed the health of workers exposed to novel manufactured nanomaterials. The small current workforce will necessitate pooling international cohorts. METHOD: A road map was defined for a globally harmonized framework for the careful choice of materials, exposure characterization, identification of study populations, definition of health endpoints, evaluation of appropriateness of study designs, data collection and analysis, and interpretation of the results. RESULTS: We propose a road map to reach global consensus on these issues. The proposed strategy should ensure that the costs of action are not disproportionate to the potential benefits and that the approach is pragmatic and practical. CONCLUSIONS: We should aim to go beyond the collection of health complaints, illness statistics, or even counts of deaths; the manifestation of such clear endpoints would indicate a failure of preventive measures.

Relevance: 30.00%

Publisher:

Abstract:

A criminal investigation requires searching for and interpreting the vestiges of a criminal act committed in the past. The forensic investigator acts in this context as a critical reader of the investigation scene, in search of physical traces that should enable her to tell a story of the offence/crime that allegedly occurred. The challenge of any investigator is to detect and recognise relevant physical traces in order to provide forensic clues for investigation and intelligence purposes. Inspired by this observation, the current research focuses on the following questions: What is a relevant physical trace? And how does the forensic investigator know she is facing one? The interest of such questions lies in providing a definition of a dimension often used in forensic science but never studied in its implications and operations. This doctoral research investigates scientific paths that are not often explored in forensic science, using semiotic and sociological tools combined with statistical data analysis. The results follow a semiotic track, strongly influenced by Peirce's studies, and a second, empirical track, in which investigation data were analysed and forensic investigators were interviewed about their work practices in the field. The semiotic track gives a macroscopic view of a signification process running from the physical trace discovered at the scene to what the investigator evaluates as relevant. The physical trace is perceived in the form of several signs whose meaning is culturally codified. The reasoning consists of three main steps: 1- What kind of source does the discovered physical trace refer to? 2- What cause/activity is at the origin of this source in the specific context of the case? 3- What story can be told from these observations?
Step 3 requires reasoning by creating hypotheses that explain the presence of the discovered trace as the product of an activity related to the investigated case. Validating these hypotheses depends on their ability to satisfy a rule of relevancy; this last step is the symbolisation of relevancy. The rule consists of two points: the recognition of factual/circumstantial relevancy (Is the link between the trace and the case recognised in the formulated hypothesis?) and of appropriate relevancy (What investment is required to collect and analyse the discovered trace, considering the expected outcome at the investigation/intelligence level?). This process of meaning is based on observations and on conjectural reasoning subject to many influences. In this study, relevancy in forensic science is presented as a conventional dimension that is symbolised and conditioned by the context, the forensic investigator's practice and her workplace environment (the culture of the place). In short, the current research states that relevancy results from interactions between parameters of situational, structural (or organisational) and individual orders. The detection, collection and analysis of relevant physical traces at scenes depend on the knowledge and culture mastered by the forensic investigator. In studying the relation between the relevant trace and the forensic investigator, this research introduces the KEE model as a conceptual map illustrating three major areas of forensic knowledge and culture acquisition involved in the search for and evaluation of the relevant physical trace. Through the analysis of the investigation data and the interviews, the relationship between these three parameters and relevancy was highlighted. K, for knowing, embodies a relationship to immediate knowledge, giving an overview of reality at a specific moment; an important point, since relevancy is signified in a context.
E, for education, is considered through its relationship with relevancy via a culture that tends to become institutionalised; it represents theoretical knowledge. The second E, for experience, exists in its relation to relevancy in the adjustments of each practitioner's intervention strategies (i.e., practical knowledge), modulated in the light of successes and setbacks case after case. The two E parameters constitute the library resources for the semiotic recognition process, and the K parameter ensures the contextualisation required to set up the reasoning and to formulate explanatory hypotheses for the discovered physical traces, questioned in their relevancy. This research demonstrates that relevancy is not absolute. It is temporal and contextual; it is a conventional and relative dimension that must be discussed. This is where the whole issue of the meaning of what is relevant to each stakeholder of the investigation process rests. By proposing a step-by-step approach to the meaning process from the physical trace to the forensic clue, this study aims to provide a more advanced understanding of the reasoning and its operation, in order to strengthen forensic investigators' training. This doctoral research presents a set of tools critical to both pedagogical and practical aspects of crime scene management, while identifying key influences with individual, structural and situational dimensions.

Relevance: 30.00%

Publisher:

Abstract:

BACKGROUND: Six pioneer physicians-pharmacists quality circles (PPQCs), located in the Swiss canton of Fribourg (administratively corresponding to a state in the US), were under the responsibility of 6 trained community pharmacists moderating the prescribing process of 24 general practitioners (GPs). PPQCs are based on a multifaceted collaborative process mediated by community pharmacists for improving compliance with clinical guidelines within GPs' prescribing practices. OBJECTIVE: To assess, over a 9-year period (1999-2007), the cost-containment impact of the PPQCs. METHODS: The key elements of PPQCs are a structured continuous quality improvement and education process; local networking; feedback of comparative and detailed data regarding costs, drug choice, and frequency of prescribed drugs; and structured independent literature review for interdisciplinary continuing education. The data come from community pharmacy invoices to the health insurance companies. The study analyzed the cost-containment impact of the PPQCs in comparison with GPs working in similar conditions of care but without particular collaboration with pharmacists, the percentage of generic prescriptions for specific cardiovascular drug classes, and the percentage of drug costs or units prescribed for specific cardiovascular drugs. RESULTS: Over the 9-year period, there was a 42% decrease in drug costs in the PPQC group as compared to the control group, representing savings of $225,000 (USD) per GP in 2007 alone. These results are explained by better compliance with clinical and pharmacovigilance guidelines, wider distribution of generic drugs, a more balanced attitude toward marketing strategies, and interdisciplinary continuing education on the rational use of drugs. CONCLUSIONS: The PPQC work process has yielded sustainable results, such as significant cost savings, higher penetration of generics, and reflection on patient safety and the place of "new" drugs in therapy.
The PPQCs may also constitute a solid basis for implementing more comprehensive collaborative programs, such as medication reviews, adherence-enhancing interventions, or disease management approaches.

Relevance: 30.00%

Publisher:

Abstract:

BACKGROUND: The considerable malaria decline in several countries challenges the strategy of chemoprophylaxis for travellers visiting moderate- to low-risk areas. An international consensus on the best strategy is lacking. It is essential to include travellers' opinions in the decision process. The preference of travellers regarding malaria prevention for moderate- to low-risk areas, related to their risk perception, as well as the reasons for their choices were investigated. METHODS: Prior to pre-travel consultation in the Travel Clinic, a self-administered questionnaire was given to travellers visiting moderate- to low-risk malaria areas. Four preventive options were proposed to the traveller, i.e., bite prevention only, chemoprophylaxis, stand-by emergency treatment alone, and stand-by emergency treatment with rapid diagnostic test. The information was accompanied by a risk scale for incidence of malaria, anti-malarial adverse drug reactions and other travel-related risks, inspired by Paling palettes from the Risk Communication Institute. RESULTS: A total of 391 travellers were included from December 2012 to December 2013. Fifty-nine (15%) opted for chemoprophylaxis, 116 (30%) for stand-by emergency treatment, 112 (29%) for stand-by emergency treatment with rapid diagnostic test, 100 (26%) for bite prevention only, and four (1%) for other choices. Travellers choosing chemoprophylaxis justified their choice for security reasons (42%), better preventive action (29%), higher efficacy (15%) and easiness (15%). The reasons for choosing stand-by treatment or bite prevention only were less medication consumed (29%), less adverse drug reactions (23%) and lower price (9%). Those who chose chemoprophylaxis were more likely to have used it in the past (OR = 3.0 (CI 1.7-5.44)), but were not different in terms of demographic, travel characteristics or risk behaviour. 
CONCLUSIONS: When travelling to moderate- to low-risk malaria areas, 85% of interviewees chose not to take chemoprophylaxis as malaria prevention, although most guidelines recommend it. They had coherent reasons for their choice. New recommendations should include shared decision-making to take into account travellers' preferences.
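The reported percentages follow directly from the stated counts; the distribution of preventive options chosen by the 391 travellers can be checked in a few lines:

```python
# Counts reported in the abstract for the 391 included travellers.
counts = {
    "chemoprophylaxis": 59,
    "stand-by emergency treatment": 116,
    "stand-by treatment + rapid diagnostic test": 112,
    "bite prevention only": 100,
    "other": 4,
}
total = sum(counts.values())
shares = {k: round(100 * v / total) for k, v in counts.items()}

# Everyone not opting for chemoprophylaxis, cf. the 85% in the conclusions.
no_chemo_pct = 100 - shares["chemoprophylaxis"]
```

The rounded shares reproduce the abstract's 15/30/29/26/1 split, and the complement of the chemoprophylaxis share gives the headline 85%.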

Relevance: 30.00%

Publisher:

Abstract:

«Quel est l'âge de cette trace digitale?» Cette question est relativement souvent soulevée au tribunal ou lors d'investigations, lorsque la personne suspectée admet avoir laissé ses empreintes digitales sur une scène de crime mais prétend l'avoir fait à un autre moment que celui du crime et pour une raison innocente. Toutefois, aucune réponse ne peut actuellement être donnée à cette question, puisqu'aucune méthodologie n'est pour l'heure validée et acceptée par l'ensemble de la communauté forensique. Néanmoins, l'inventaire de cas américains conduit dans cette recherche a montré que les experts fournissent tout de même des témoignages au tribunal concernant l'âge de traces digitales, même si ceux-­‐ci sont majoritairement basés sur des paramètres subjectifs et mal documentés. Il a été relativement aisé d'accéder à des cas américains détaillés, ce qui explique le choix de l'exemple. Toutefois, la problématique de la datation des traces digitales est rencontrée dans le monde entier, et le manque de consensus actuel dans les réponses données souligne la nécessité d'effectuer des études sur le sujet. Le but de la présente recherche est donc d'évaluer la possibilité de développer une méthode de datation objective des traces digitales. Comme les questions entourant la mise au point d'une telle procédure ne sont pas nouvelles, différentes tentatives ont déjà été décrites dans la littérature. Cette recherche les a étudiées de manière critique, et souligne que la plupart des méthodologies reportées souffrent de limitations prévenant leur utilisation pratique. Néanmoins, certaines approches basées sur l'évolution dans le temps de composés intrinsèques aux résidus papillaires se sont montrées prometteuses. Ainsi, un recensement détaillé de la littérature a été conduit afin d'identifier les composés présents dans les traces digitales et les techniques analytiques capables de les détecter. 
Le choix a été fait de se concentrer sur les composés sébacés détectés par chromatographie gazeuse couplée à la spectrométrie de masse (GC/MS) ou par spectroscopie infrarouge à transformée de Fourier. Des analyses GC/MS ont été menées afin de caractériser la variabilité initiale de lipides cibles au sein des traces digitales d'un même donneur (intra-­‐variabilité) et entre les traces digitales de donneurs différents (inter-­‐variabilité). Ainsi, plusieurs molécules ont été identifiées et quantifiées pour la première fois dans les résidus papillaires. De plus, il a été déterminé que l'intra-­‐variabilité des résidus était significativement plus basse que l'inter-­‐variabilité, mais que ces deux types de variabilité pouvaient être réduits en utilisant différents pré-­‐ traitements statistiques s'inspirant du domaine du profilage de produits stupéfiants. Il a également été possible de proposer un modèle objectif de classification des donneurs permettant de les regrouper dans deux classes principales en se basant sur la composition initiale de leurs traces digitales. Ces classes correspondent à ce qui est actuellement appelé de manière relativement subjective des « bons » ou « mauvais » donneurs. Le potentiel d'un tel modèle est élevé dans le domaine de la recherche en traces digitales, puisqu'il permet de sélectionner des donneurs représentatifs selon les composés d'intérêt. En utilisant la GC/MS et la FTIR, une étude détaillée a été conduite sur les effets de différents facteurs d'influence sur la composition initiale et le vieillissement de molécules lipidiques au sein des traces digitales. Il a ainsi été déterminé que des modèles univariés et multivariés pouvaient être construits pour décrire le vieillissement des composés cibles (transformés en paramètres de vieillissement par pré-­‐traitement), mais que certains facteurs d'influence affectaient ces modèles plus sérieusement que d'autres. 
En effet, le donneur, le substrat et l'application de techniques de révélation semblent empêcher la construction de modèles reproductibles. Les autres facteurs testés (moment de déposition, pression, température et illumination) influencent également les résidus et leur vieillissement, mais des modèles combinant différentes valeurs de ces facteurs ont tout de même prouvé leur robustesse dans des situations bien définies. De plus, des traces digitales-­‐tests ont été analysées par GC/MS afin d'être datées en utilisant certains des modèles construits. Il s'est avéré que des estimations correctes étaient obtenues pour plus de 60 % des traces-­‐tests datées, et jusqu'à 100% lorsque les conditions de stockage étaient connues. Ces résultats sont intéressants mais il est impératif de conduire des recherches supplémentaires afin d'évaluer les possibilités d'application de ces modèles dans des cas réels. Dans une perspective plus fondamentale, une étude pilote a également été effectuée sur l'utilisation de la spectroscopie infrarouge combinée à l'imagerie chimique (FTIR-­‐CI) afin d'obtenir des informations quant à la composition et au vieillissement des traces digitales. Plus précisément, la capacité de cette technique à mettre en évidence le vieillissement et l'effet de certains facteurs d'influence sur de larges zones de traces digitales a été investiguée. Cette information a ensuite été comparée avec celle obtenue par les spectres FTIR simples. Il en a ainsi résulté que la FTIR-­‐CI était un outil puissant, mais que son utilisation dans l'étude des résidus papillaires à des buts forensiques avait des limites. En effet, dans cette recherche, cette technique n'a pas permis d'obtenir des informations supplémentaires par rapport aux spectres FTIR traditionnels et a également montré des désavantages majeurs, à savoir de longs temps d'analyse et de traitement, particulièrement lorsque de larges zones de traces digitales doivent être couvertes. 
Finally, the results obtained in this work allowed a pragmatic approach to fingermark dating questions to be proposed and discussed. This approach identifies the type of information that the scientist is currently able to provide to investigators and/or the court. Moreover, the proposed framework describes the iterative development steps that research should follow in order to validate an objective fingermark dating methodology whose capacities and limits are known and documented. -- "How old is this fingermark?" This question is raised relatively often in trials when suspects admit that they left their fingermarks at a crime scene but allege that the contact occurred at a time different from that of the crime and for legitimate reasons. However, no answer can currently be given to this question, because no fingermark dating methodology has been validated and accepted by the forensic community as a whole. Nevertheless, a review of past American cases highlighted that experts have actually given, and still give, court testimony about the age of fingermarks, even though it is mostly based on subjective and poorly documented parameters. Fully described American cases were relatively easy to access, which explains the origin of the examples given. However, fingermark dating issues are encountered worldwide, and the lack of consensus among the answers given highlights the need for research on the subject. The present work thus aims to study the possibility of developing an objective fingermark dating method. As the questions surrounding the development of dating procedures are not new, several attempts have already been described in the literature. This research proposes a critical review of these attempts and highlights that most of the reported methodologies still suffer from limitations preventing their use in actual practice.
Nevertheless, some approaches based on the evolution over time of intrinsic compounds detected in fingermark residue appear promising. Thus, an exhaustive review of the literature was conducted in order to identify the compounds available in fingermark residue and the analytical techniques capable of analysing them. It was chosen to concentrate on sebaceous compounds analysed using gas chromatography coupled with mass spectrometry (GC/MS) or Fourier transform infrared spectroscopy (FTIR). GC/MS analyses were conducted in order to characterize the initial variability of target lipids among fresh fingermarks of the same donor (intra-variability) and between fingermarks of different donors (inter-variability). As a result, many molecules were identified and quantified for the first time in fingermark residue. Furthermore, it was determined that the intra-variability of fingermark residue was significantly lower than the inter-variability, and that both kinds of variability could be reduced using different statistical pre-treatments inspired by the drug profiling field. It was also possible to propose an objective donor classification model that groups donors into two main classes based on the initial lipid composition of their fingermarks. These classes correspond to what is rather subjectively called "good" or "bad" donors. The potential of such a model for fingermark research is high, as it allows the selection of representative donors based on compounds of interest. Using GC/MS and FTIR, an in-depth study was conducted of the effects of different influence factors on the initial composition and aging of target lipid molecules found in fingermark residue. It was determined that univariate and multivariate models could be built to describe the aging of target compounds (transformed into aging parameters through pre-processing techniques), but that some influence factors affected these models more than others.
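The variability analysis and two-class donor grouping described above can be illustrated with a minimal sketch. All peak areas below are invented, and the normalisation to relative peak areas merely stands in for the drug-profiling-style statistical pre-treatments mentioned in the text; it is not the thesis' actual procedure.

```python
import numpy as np

def normalise(profiles):
    """Relative peak areas: each row (one fingermark) sums to 1.
    A pre-treatment of this kind damps deposition-amount effects."""
    profiles = np.asarray(profiles, dtype=float)
    return profiles / profiles.sum(axis=1, keepdims=True)

def mean_pairwise_dist(rows):
    """Mean Euclidean distance between all pairs of profiles."""
    dists, n = [], len(rows)
    for i in range(n):
        for j in range(i + 1, n):
            dists.append(np.linalg.norm(rows[i] - rows[j]))
    return float(np.mean(dists))

# Invented GC/MS peak areas (rows: replicate fingermarks, columns: target
# lipids) for two hypothetical donors with different initial compositions.
donor_a = normalise([[9.0, 2.0, 1.0], [8.5, 2.2, 1.1], [9.2, 1.9, 0.9]])
donor_b = normalise([[3.0, 4.0, 5.0], [2.8, 4.2, 5.1], [3.1, 3.9, 4.8]])

# Intra-variability (within each donor) vs inter-variability (between donors).
intra = (mean_pairwise_dist(donor_a) + mean_pairwise_dist(donor_b)) / 2
inter = float(np.mean([np.linalg.norm(a - b) for a in donor_a for b in donor_b]))
print(intra < inter)  # prints True: intra-variability is lower here
```

With the toy numbers above, the two donors separate cleanly on their relative lipid proportions, which is the intuition behind grouping donors into two composition-based classes.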
In fact, the donor, the substrate and the application of enhancement techniques seemed to hinder the construction of reproducible models. The other tested factors (deposition moment, pressure, temperature and illumination) also affected the residues and their aging, but models combining different values of these factors still proved robust in well-defined situations. Furthermore, test fingermarks were analysed with GC/MS in order to be dated using some of the generated models. Correct estimations were obtained for more than 60% of the dated test fingermarks, and up to 100% when the storage conditions were known. These results are promising, but further research should be conducted to evaluate whether these models could be used under uncontrolled casework conditions. From a more fundamental perspective, a pilot study was also conducted on the use of infrared spectroscopy combined with chemical imaging (FTIR-CI) in order to gain information about fingermark composition and aging. More precisely, its ability to highlight influence-factor and aging effects over large fingermark areas was investigated. This information was then compared with that given by individual FTIR spectra. It was concluded that, while FTIR-CI is a powerful tool, its use to study natural fingermark residue for forensic purposes has to be carefully considered. In fact, in this study, the technique did not yield more information on residue distribution than traditional FTIR spectra and also suffered from major drawbacks, such as long analysis and processing times, particularly when large fingermark areas need to be covered. Finally, the results obtained in this research allowed a formal and pragmatic framework for approaching fingermark dating questions to be proposed and discussed. It identifies the type of information that the scientist is currently able to provide to investigators and/or the courts.
Furthermore, the proposed framework also describes the different iterative development steps that research should follow in order to achieve the validation of an objective fingermark dating methodology whose capacities and limits are well known and properly documented.
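The general idea of dating a fingermark from an aging parameter can be sketched as a simple calibration exercise. The first-order decay model, the aging-parameter values and the reference ages below are all assumed for illustration; the thesis' actual univariate and multivariate models are not reproduced here.

```python
import numpy as np

# Invented aging parameter (e.g. a target-lipid peak area after
# pre-processing) measured on reference fingermarks of known age, in days,
# stored under one fixed set of conditions.
ages = np.array([0, 3, 7, 14, 21, 28], dtype=float)
param = np.array([1.00, 0.74, 0.55, 0.30, 0.17, 0.09])

# Assume first-order decay, so log(param) is linear in age;
# fit the calibration line by least squares.
slope, intercept = np.polyfit(ages, np.log(param), 1)

def estimate_age(measured_param):
    """Invert the calibration curve to date a questioned fingermark."""
    return (np.log(measured_param) - intercept) / slope

est = estimate_age(0.55)
print(round(est))  # prints 7: the questioned mark matches the 7-day reference
```

Such a model is only valid for the storage conditions it was calibrated under, which mirrors the finding above that estimation accuracy rose when the storage conditions were known.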

Relevância:

30.00% 30.00%

Publicador:

Resumo:

A major problem in developmental neurotoxicity (DNT) risk assessment is the lack of toxicological hazard information for most compounds. Therefore, new approaches are being considered to provide adequate experimental data for regulatory decisions. This process requires matching regulatory needs on the one hand with the opportunities provided by new test systems and methods on the other. Aligning academically and industrially driven assay development with regulatory needs in the field of DNT is a core mission of the International STakeholder NETwork (ISTNET) in DNT testing. The first ISTNET meeting was held in Zurich on 23-24 January 2014 in order to explore the application of the adverse outcome pathway (AOP) concept to practical DNT testing. AOPs were considered promising tools to promote the development of test systems according to regulatory needs. Moreover, the AOP concept was identified as an important guiding principle for assembling predictive integrated testing strategies (ITSs) for DNT. The recommended road map towards AOP-based DNT testing follows a stepwise approach, operating initially with incomplete AOPs for compound grouping and focussing on key events of neurodevelopment. The next steps to be considered in follow-up activities are the use of case studies to further apply the AOP concept in regulatory DNT testing, the use of AOP intersections (common key events) for the economical development of screening assays, and the transition from qualitative descriptions to quantitative network modelling.
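The use of AOP intersections (common key events) to prioritise shared screening assays can be sketched with toy data. The AOP names and key events below are purely illustrative placeholders, not content from the meeting report.

```python
from itertools import combinations

# Toy representation: each (possibly incomplete) AOP is a set of key events.
aops = {
    "AOP-neurite-outgrowth": {"oxidative stress", "reduced neurite outgrowth"},
    "AOP-synaptogenesis": {"oxidative stress", "impaired synaptogenesis"},
    "AOP-migration": {"altered migration", "impaired synaptogenesis"},
}

# Pairwise intersections: key events shared between two AOPs.
shared = {}
for (name_a, events_a), (name_b, events_b) in combinations(aops.items(), 2):
    common = events_a & events_b
    if common:
        shared[(name_a, name_b)] = common

# Key events appearing in several AOPs are economical assay targets:
# one assay for such an event screens along multiple pathways at once.
counts = {}
for events in aops.values():
    for event in events:
        counts[event] = counts.get(event, 0) + 1
priority = sorted(event for event, c in counts.items() if c > 1)
print(priority)  # prints ['impaired synaptogenesis', 'oxidative stress']
```

The set-intersection view is the simplest reading of "AOP intersections" above; a quantitative network model would replace these sets with weighted key-event relationships.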