76 results for Evolving Object-Oriented Compiler
Abstract:
The paper deals with the development and application of a methodology for automatic mapping of pollution/contamination data. The General Regression Neural Network (GRNN) is considered in detail and is proposed as an efficient tool to solve this problem. The automatic tuning of isotropic and anisotropic GRNN models using a cross-validation procedure is presented. Results are compared with a k-nearest-neighbours interpolation algorithm using an independent validation data set. The quality of mapping is controlled by analysis of the raw data and of the residuals using variography. Maps of probabilities of exceeding a given decision level and 'thick' isoline visualization of the uncertainties are presented as examples of decision-oriented mapping. The real case study is based on mapping of radioactively contaminated territories.
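A GRNN is mathematically equivalent to Nadaraya-Watson kernel regression, so the tuning procedure described above can be illustrated compactly. The following is a minimal sketch, not the paper's implementation: it fits an isotropic GRNN to synthetic coordinates and contamination-like values and selects the kernel width by leave-one-out cross-validation. The sigma grid, grid resolution and all data values are illustrative assumptions.

```python
import numpy as np

def grnn_predict(train_xy, train_z, query_xy, sigma):
    """Isotropic GRNN (Nadaraya-Watson): Gaussian-kernel weighted mean."""
    d2 = ((query_xy[:, None, :] - train_xy[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / (2 * sigma ** 2))
    return (w @ train_z) / (w.sum(axis=1) + 1e-300)  # epsilon avoids 0/0

def loo_cv_sigma(xy, z, sigmas):
    """Pick the kernel width minimising leave-one-out squared error."""
    best_sigma, best_err = None, np.inf
    for s in sigmas:
        errs = []
        for i in range(len(z)):
            mask = np.arange(len(z)) != i
            pred = grnn_predict(xy[mask], z[mask], xy[i:i + 1], s)[0]
            errs.append((pred - z[i]) ** 2)
        err = float(np.mean(errs))
        if err < best_err:
            best_sigma, best_err = s, err
    return best_sigma

# Synthetic contamination-like field: smooth trend plus noise.
rng = np.random.default_rng(1)
xy = rng.uniform(0, 10, (150, 2))
z = np.sin(xy[:, 0]) + 0.5 * xy[:, 1] + rng.normal(0, 0.1, 150)

sigma = loo_cv_sigma(xy, z, sigmas=np.logspace(-1, 1, 20))
grid = np.array([[x, y] for x in range(11) for y in range(11)], float)
zmap = grnn_predict(xy, z, grid, sigma)
print(f"tuned sigma = {sigma:.2f}, mapped {len(zmap)} grid nodes")
```

An anisotropic variant would replace the single width sigma with per-axis widths (and optionally a rotation angle), tuned by the same cross-validation loop.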
Abstract:
Internet governance is a recent issue in global politics, but it has gradually become a major political and economic one, and it now appears regularly in the news. Against this background, this research outlines the history of Internet governance from its emergence as a political issue in the 1980s to the end of the World Summit on the Information Society (WSIS) in 2005. Rather than focusing on one or another of the institutions involved in Internet governance, it analyses the emergence and historical evolution of a space of struggle drawing in a growing number of different actors. This evolution is described through the dialectical relation between elites and non-elites and through the struggle over the definition of Internet governance. The thesis thus explores how the relations among the elites of Internet governance, and between these elites and non-elites, explain the emergence, evolution and structuration of a relatively autonomous field of world politics centred on Internet governance. Against dominant realist and liberal perspectives, the research draws on a cross-fertilisation of heterodox international political economy and international political sociology, articulated around the concepts of field, elites and hegemony. The concept of field, as developed by Bourdieu, is increasingly used in International Relations to build a differentiated analysis of globalisation and to describe the emergence of transnational spaces of struggle and domination. Elite sociology allows for a pragmatic, actor-centred analysis of power in the globalisation process; this research draws in particular on C. Wright Mills's concept of the power elite to explore the unification of a priori different elites around shared projects. Finally, the thesis uses the neo-Gramscian concept of hegemony to study both the consensual dimension of domination, which grants an elite's power its relative stability, and the prospects of change contained in any international order.
Through the analysis of documents produced during the period studied, and through the creation of databases of networks of actors, the research focuses on the debates that followed the commercialisation of the network in the early 1990s and on the negotiations during the WSIS. The first period led to the creation of the Internet Corporation for Assigned Names and Numbers (ICANN) in 1998. This creation resulted from consensus-building between the dominant discourses of the 1990s and from a coalition of interests within an emerging power elite of Internet governance. However, the institutionalisation of Internet governance around ICANN excluded a number of actors and discourses, which have since attempted to overturn this order. The WSIS became the framework within which this mode of governance was challenged by excluded states, scholars, NGOs and international organisations; it therefore constitutes the second historical period studied in the thesis. The confrontation during the WSIS triggered a reconfiguration of the power elite of Internet governance as well as a redefinition of the boundaries of the field. A new hegemonic project emerged around discursive elements such as multistakeholderism and institutional elements such as the Internet Governance Forum. The relative success of this project has allowed for an unprecedented institutional stability since the end of the WSIS and an acceptance of the elites' discourse by a large number of actors in the field. It is only recently that this order has begun to be questioned by the emerging powers of Internet governance. The research contributes to the scientific debate on three levels. On the theoretical level, it contributes to the emergence of a dialogue between international political economy and international political sociology in order to analyse both the structural trends of the globalisation process and the situated practices of actors in a given issue-area; it notably stresses the contribution of the concepts of field and power elite and their compatibility with a neo-Gramscian analysis of hegemony. On the methodological level, this dialogue translates into the use of mixed methods, combining qualitative content analysis with social network analysis of actors and statements. Finally, on the empirical level, the research offers an original perspective on Internet governance by stressing its historical dimension, demonstrating the fragility of the concept of multistakeholder governance, and focusing on power dynamics and on the links between Internet governance and globalisation.
Abstract:
PURPOSE OF REVIEW: Recent findings in the physiology and neurobiology of ejaculation have expanded our understanding of male sexual function and have allowed the development of new instruments to investigate ejaculatory and orgasmic disorders. RECENT FINDINGS: The evidence-based definition of lifelong premature ejaculation has set a model for the evaluation and treatment outcomes of sexual dysfunction. New instruments to objectively assess arousal, orgasm and the expulsion phase of ejaculation, such as functional MRI, dynamic pelvic ultrasound, PET scans and validated questionnaires, have led to a better understanding of sexual dysfunction in men. Animal models, developments in neurobiology and clinical experience have transformed a purely psychoanalytical approach to ejaculatory and orgasmic function into a novel multidisciplinary, scientifically sound and evidence-based discipline of medicine. SUMMARY: Ejaculation is an integral part of normal sexual function. Ejaculatory dysfunction is common and may cause substantial disruption to the quality of a patient's life. A better understanding of the epidemiology, pathophysiology, neuroscience and genetics of ejaculatory and orgasmic function will eventually lead to the development of new, effective methods of treatment of disorders of ejaculation and orgasm in men.
Abstract:
Multisensory processes facilitate perception of currently presented stimuli and can likewise enhance later object recognition. Memories for objects originally encountered in a multisensory context can be more robust than those for objects encountered in an exclusively visual or auditory context [1], overturning the assumption that memory performance is best when encoding and recognition contexts remain constant [2]. Here, we used event-related potentials (ERPs) to provide the first evidence for direct links between multisensory brain activity at one point in time and subsequent object discrimination abilities. Across two experiments we found that which individuals would benefit and which would be impaired during later object discrimination could be predicted from their brain responses to multisensory stimuli upon their initial encounter. These effects were observed despite the multisensory information being meaningless, task-irrelevant, and presented only once. We provide critical insights into the advantages associated with multisensory interactions: they are not limited to the processing of current stimuli, but likewise encompass the benefit one's memories can confer on object recognition in later, unisensory contexts.
Abstract:
Single-trial encounters with multisensory stimuli affect both memory performance and early-latency brain responses to visual stimuli. Whether and how auditory cortices support memory processes based on single-trial multisensory learning is unknown and may differ qualitatively and quantitatively from comparable processes within visual cortices due to purported differences in memory capacities across the senses. We recorded event-related potentials (ERPs) as healthy adults (n = 18) performed a continuous recognition task in the auditory modality, discriminating initial (new) from repeated (old) sounds of environmental objects. Initial presentations were either unisensory or multisensory; the latter entailed synchronous presentation of a semantically congruent or a meaningless image. Repeated presentations were exclusively auditory, thus differing only according to the context in which the sound was initially encountered. Discrimination abilities (indexed by d') were increased for repeated sounds that were initially encountered with a semantically congruent image versus sounds initially encountered with either a meaningless or no image. Analyses of ERPs within an electrical neuroimaging framework revealed that early stages of auditory processing of repeated sounds were affected by prior single-trial multisensory contexts. These effects followed from significantly reduced activity within a distributed network, including the right superior temporal cortex, suggesting an inverse relationship between brain activity and behavioural outcome on this task. The present findings demonstrate how auditory cortices contribute to long-term effects of multisensory experiences on auditory object discrimination. We propose a new framework for the efficacy of multisensory processes to impact both current multisensory stimulus processing and unisensory discrimination abilities later in time.
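For reference, the discrimination index d' reported above is the standard signal-detection measure, z(hit rate) − z(false-alarm rate). Below is a minimal sketch of its computation from old/new judgements; the counts are invented, and the log-linear correction is an assumption (the abstract does not state which correction, if any, was applied).

```python
from statistics import NormalDist

def d_prime(hits: int, misses: int, false_alarms: int, correct_rejections: int) -> float:
    """d' = z(hit rate) - z(false-alarm rate), with a log-linear
    correction so rates of exactly 0 or 1 stay finite."""
    hr = (hits + 0.5) / (hits + misses + 1)
    far = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    z = NormalDist().inv_cdf
    return z(hr) - z(far)

# Invented counts for one participant: 45 hits and 15 misses on repeated
# ("old") sounds; 10 false alarms and 50 correct rejections on new sounds.
print(round(d_prime(45, 15, 10, 50), 2))   # ≈ 1.61
```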
Abstract:
The depositional stratigraphy of within-channel deposits in sandy braided rivers is dominated by a variety of barforms (both singular 'unit' bars and complex 'compound' bars), as well as the infill of individual channels (herein termed 'channel fills'). The deposits of bars and channel fills define the key components of facies models for braided rivers and their within-channel heterogeneity, knowledge of which is important for reservoir characterization. However, few studies have sought to address the question of whether the deposits of bars and channel fills can be readily differentiated from each other. This paper presents the first quantitative study to achieve this aim, using aerial images of an evolving modern sandy braided river and geophysical imaging of its subsurface deposits. Aerial photographs taken between 2000 and 2004 document the abandonment and fill of a 1.3 km long, 80 m wide anabranch channel in the sandy braided South Saskatchewan River, Canada. Upstream river regulation traps the majority of very fine sediment and there is little clay (<1%) in the bed sediments. Channel abandonment was initiated by a series of unit bars that stalled and progressively blocked the anabranch entrance, together with dune deposition and stacking at the anabranch entrance and exit. Complete channel abandonment and subsequent fill of up to 3 m of sediment took approximately two years. Thirteen kilometres of ground-penetrating radar surveys, coupled with 18 cores, were obtained over the channel fill and an adjacent 750 m long, 400 m wide compound bar, enabling a quantitative analysis of the channel and bar deposits. Results show that, in terms of grain-size trends, facies proportions and scale of deposits, there are only subtle differences between the channel fill and bar deposits, which therefore renders them indistinguishable. Thus, it may be inappropriate to assign different geometric and sedimentological attributes to channel-fill and bar facies in object-based models of sandy braided river alluvial architecture.
Abstract:
The classical study of responsibility attributions, initiated by Heider in social psychology, has mainly approached this psychosocial process from an individualistic perspective confined to the intra-individual and interpersonal levels of analysis (according to Doise's distinction). The reflections and empirical studies presented in this thesis pursue two objectives. The first is to broaden this perspective to the socio-structural and societal levels, drawing in particular on the social attributions approach and on Fauconnet's propositions concerning rules of responsibility. The second is to test the relevance of such an approach in a particular context: group work, in which the nature of the social relations represented was manipulated by means of scenarios presented to students of the University of Lausanne. The main objective of the thesis is thus to test a model of the anchoring of responsibility attributions that highlights the underlying representational dynamics in terms of legitimation or questioning of group organisation. Overall, the results indicate that while the nature of the social relations (re)presented in a group is a powerful determinant of how group organisation is legitimated or called into question, individual adherence to dominant ideological beliefs, such as economic system justification, moderates respondents' positions. Moreover, these processes appear to evolve over time, revealing socialisation phenomena rather more complex than current research in this domain suggests. Indeed, while ideological knowledge about the world is acquired in university curricula and does not always intervene in the formation of representations of group work, discipline-specific knowledge and knowledge tied to university selection policy do seem to intervene, at the level of attributions, in the process of legitimating social relations in groups. By attempting an articulation between the concepts of anchoring of social representations, attribution and socialisation, this thesis underlines the relevance of introducing questions of ideological beliefs into the study of social groups.
Abstract:
Massive synaptic pruning following over-growth is a general feature of mammalian brain maturation. Pruning starts near the time of birth and is completed by the time of sexual maturation. Trigger signals able to induce synaptic pruning could be related to dynamic functions that depend on the timing of action potentials. Spike-timing-dependent plasticity (STDP) is a change in synaptic strength based on the ordering of pre- and postsynaptic spikes. The relation between synaptic efficacy and synaptic pruning suggests that weak synapses may be modified and removed through competitive "learning" rules. Such a plasticity rule might strengthen the connections among neurons that belong to cell assemblies characterized by recurrent patterns of firing, whereas connections that are not recurrently activated might decrease in efficacy and eventually be eliminated. The main goal of our study is to determine whether, and under which conditions, such cell assemblies may emerge out of a locally connected random network of integrate-and-fire units distributed on a 2D lattice receiving background noise and content-related input organized in both temporal and spatial dimensions. The originality of our study lies in the relatively large size of the network (10,000 units), the duration of the experiment (10^6 time units, one time unit corresponding to the duration of a spike), and the application of an original bio-inspired STDP modification rule compatible with hardware implementation. A first batch of experiments was performed to verify that the randomly generated connectivity and the STDP-driven pruning did not introduce any spurious bias in the absence of stimulation. Among other things, a scale factor was approximated to compensate for the effect of network size on activity. Networks were then stimulated with spatiotemporal patterns. The analysis of the connections remaining at the end of the simulations, as well as of the time series of activity of the interconnected units, suggests that feed-forward circuits emerge from the initially randomly connected networks by pruning.
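The mechanism described above can be illustrated with a minimal sketch, which is not the authors' simulator: pair-based exponential STDP applied to a small, randomly connected network of leaky integrate-and-fire units, with synapses permanently pruned once their weight falls below a threshold. Network size, time constants, learning rates and the pruning threshold are all illustrative assumptions (the study itself used 10,000 units and a hardware-compatible STDP variant).

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters; the study itself used a 10,000-unit network.
N = 200                         # integrate-and-fire units
P_CONN = 0.1                    # initial random connection probability
TAU_M = 20.0                    # membrane time constant (ms)
V_TH = 1.0                      # firing threshold
TAU_STDP = 20.0                 # STDP trace time constant (ms)
A_PLUS, A_MINUS = 0.010, 0.012  # LTP/LTD amplitudes; LTD bias drives competition
W_MAX, W_PRUNE = 1.0, 0.01      # weight cap and pruning threshold
STEPS = 5000                    # simulation length, 1 ms per step

# w[i, j] is the weight of the synapse from unit i to unit j.
w = rng.uniform(0.2, 0.5, (N, N)) * (rng.random((N, N)) < P_CONN)
np.fill_diagonal(w, 0.0)
alive = w > 0                   # synapses that have not been pruned
n_initial = int(alive.sum())

v = np.zeros(N)                 # membrane potentials
trace = np.zeros(N)             # exponentially decaying spike trace per unit
spikes = np.zeros(N, dtype=bool)

for t in range(STEPS):
    noise = 0.8 * (rng.random(N) < 0.05)      # background Poisson-like drive
    v += -v / TAU_M + noise + spikes @ w      # leaky integration + recurrent input
    spikes = v >= V_TH
    v[spikes] = 0.0                           # reset units that fired

    # Pair-based STDP: pre-before-post potentiates, post-before-pre depresses.
    trace *= np.exp(-1.0 / TAU_STDP)
    w[:, spikes] += A_PLUS * trace[:, None]   # LTP onto units that just fired
    w[spikes, :] -= A_MINUS * trace[None, :]  # LTD from units that just fired
    trace[spikes] += 1.0

    np.clip(w, 0.0, W_MAX, out=w)
    alive &= w >= W_PRUNE                     # prune weak synapses permanently
    w *= alive

print(f"surviving synapses: {int(alive.sum())} of {n_initial}")
```

With depression slightly outweighing potentiation, as here, synapses compete, and only pathways that are recurrently co-activated keep their weights above the pruning threshold.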
Abstract:
Multisensory memory traces established via single-trial exposures can impact subsequent visual object recognition. This impact appears to depend on the meaningfulness of the initial multisensory pairing, implying that multisensory exposures establish distinct object representations that are accessible during later unisensory processing. Multisensory contexts may be particularly effective in influencing auditory discrimination, given the purportedly inferior recognition memory in this sensory modality. The possibility of such generalization, and the equivalence of effects when memory discrimination is performed in the visual vs. auditory modality, were the focus of this study. First, we demonstrate that visual object discrimination is affected by the context of prior multisensory encounters, replicating and extending previous findings by controlling for the probability of multisensory contexts during initial as well as repeated object presentations. Second, we provide the first evidence that single-trial multisensory memories impact subsequent auditory object discrimination. Auditory object discrimination was enhanced when initial presentations entailed semantically congruent multisensory pairs and was impaired after semantically incongruent multisensory encounters, compared with sounds that had been encountered only in a unisensory manner. Third, the impact of single-trial multisensory memories upon unisensory object discrimination was greater when the task was performed in the auditory vs. the visual modality. Fourth, there was no evidence of a correlation between the effects of past multisensory experiences on visual and auditory processing, suggestive of largely independent object-processing mechanisms between modalities. We discuss these findings in terms of the conceptual short-term memory (CSTM) model and predictive coding. Our results suggest differential recruitment and modulation of conceptual memory networks according to the sensory task at hand.
Abstract:
The production, distribution and use of false identity documents constitute a threat to both public and private security. Fraudulent documents are a catalyser for a multitude of crimes, from the most trivial to the most serious and organised forms. The dimension, complexity and low visibility of identity document fraud, as well as its repetitive and evolving character, call for new responses that go beyond the traditional case-by-case approach or the technology-focused strategy, whose failure is revealed by the historical perspective. These new responses require strengthening the capacity to understand the crime problems posed by false identity documents and the phenomena that drive them. Such an understanding is necessary in order to imagine, evaluate and decide on the most appropriate measures and responses. It requires developing analysis capacities and the crime intelligence function that underpin the most recent policing models, such as intelligence-led policing or problem-oriented policing.
In this context, the doctoral work adopts an original position by postulating that false identity documents can usefully be perceived as the material trace or remnant resulting from the criminal activity undertaken by forgers, namely the manufacture or alteration of identity documents. On the basis of this fundamental postulate, it is proposed that the scientific, methodical and systematic processing of these traces through a forensic intelligence process can generate phenomenological knowledge on the forms of crime that produce, distribute and use false identity documents, knowledge that integrates into and advantageously serves crime intelligence. In support of this thesis and of a more general study of forensic intelligence, the doctoral work proposes definitions and models, describes new profiling methods, and initiates the construction of a catalogue of forms of analysis; it also draws on experiments and case studies. The results demonstrate that the systematic processing of forensic data makes a useful and relevant contribution to strategic, operational and tactical crime intelligence, as well as to criminology. Combined with other available information, the forensic intelligence produced can support policing in its repressive, proactive, preventive and control dimensions. In particular, the proposed methods for profiling false identity documents make it possible to reveal trends across extended datasets, to analyse modus operandi, and to infer that documents have a common or different source. These methods support the detection and follow-up of crime series, problems and phenomena within operational monitoring. They make it possible to link and group by problem cases previously viewed as isolated, to highlight the organised forms of crime that deserve the greatest attention, and to produce robust and novel knowledge offering a deeper perception of crime.
The work also discusses the difficulties associated with the management of data and information belonging to different levels of generality, as well as those relating to the implementation of the forensic intelligence process in practice. The doctoral work focuses primarily on false identity documents and their treatment by policing stakeholders. Through an inductive approach, however, it proceeds to a generalisation which underlines that the above observations apply not only to the systematic processing of false identity documents but to any type of trace from which a profile is extracted. A more transversal definition and understanding of the notion and function of forensic intelligence emerges from this work.
Abstract:
Given the cost constraints of the European health-care systems, criteria are needed to decide which genetic services to fund from the public budgets, if not all can be covered. To ensure that high-priority services are available equitably within and across the European countries, a shared set of prioritization criteria would be desirable. A decision process following the accountability for reasonableness framework was undertaken, including a multidisciplinary EuroGentest/PPPC-ESHG workshop to develop shared prioritization criteria. Resources are currently too limited to fund all the beneficial genetic testing services available in the next decade. Ethically and economically reflected prioritization criteria are needed. Prioritization should be based on considerations of medical benefit, health need and costs. Medical benefit includes evidence of benefit in terms of clinical benefit, benefit of information for important life decisions, benefit for other people apart from the person tested, and the patient-specific likelihood of being affected by the condition tested for. It may be subject to a finite time window. Health need includes the severity of the condition tested for and its progression at the time of testing. Further discussion and better evidence are needed before clearly defined recommendations can be made or a prioritization algorithm proposed. To our knowledge, this is the first time a clinical society has initiated a decision process about health-care prioritization on a European level, following the principles of accountability for reasonableness. We provide points to consider to stimulate this debate across the EU and to serve as a reference for improving patient management.
Abstract:
BACKGROUND: Reports of patients with secondary acute promyelocytic leukemia (APL) have increased in recent years, particularly for those who received treatment with mitoxantrone, and retrospective studies have suggested that their characteristics and outcomes are similar to those of patients with de novo APL. METHODS: The authors investigated patients with de novo and secondary APL who were included in the ongoing APL-2006 trial. Patients with secondary APL who were included in that trial were also compared with a previous retrospective cohort of patients with secondary APL. RESULTS: In the APL-2006 trial, 42 of 280 patients (15%) had secondary APL. Compared with the retrospective cohort, patients with secondary APL in the APL-2006 trial had a lower incidence of prior breast carcinoma (35.7% vs 57%; P = .03) and a higher incidence of prior prostate carcinoma (26.2% vs 4.7%; P < .001). Treatment of the primary tumor in the APL-2006 trial less frequently included combined radiochemotherapy (28.6% vs 47.2%; P = .044), never included mitoxantrone (0% vs 46.7%; P = .016), and more frequently included anthracyclines (53.3% vs 38.3%; P = .015). In the APL-2006 trial, patients who had secondary APL, compared with those who had de novo APL, were older (mean, 60.2 years vs 48.7 years; P < .0001) but had a similar complete response rate (97.6% vs 90.3%), cumulative incidence of relapse (0% vs 1.8%), and overall survival (92.3% vs 90.9%) at 18 months. CONCLUSIONS: Although the incidence of secondary APL appears stable over time, evolving strategies for the treatment of primary cancers have reduced its occurrence among breast cancer patients and increased its incidence among patients with prostate cancer. The current results confirm prospectively that patients with secondary APL have characteristics and outcomes similar to those of patients with de novo APL.
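The group comparisons above reduce to two-proportion tests. The sketch below is purely illustrative: the retrospective cohort size is not given here, so the counts are hypothetical, and the trial may well have used Fisher's exact test rather than the pooled z-test shown.

```python
from math import erf, sqrt

def two_proportion_p(x1: int, n1: int, x2: int, n2: int) -> float:
    """Two-sided p-value for H0: p1 == p2 (pooled two-proportion z-test)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    phi = 0.5 * (1 + erf(abs(z) / sqrt(2)))   # standard normal CDF
    return 2 * (1 - phi)

# Hypothetical: 15/42 (35.7%) secondary-APL patients with prior breast
# carcinoma in the trial vs an assumed 61/107 (57.0%) in the cohort.
print(round(two_proportion_p(15, 42, 61, 107), 3))   # ≈ 0.019
```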
Abstract:
This thesis addresses the political commitment of left-wing intellectuals in Switzerland between 1945 and 1968. It aims, on the one hand, to examine how the status of intellectuals developed within and outside of political parties. On the other hand, it endeavours to understand the political debates that involved, and sometimes split, these intellectuals. To this end, we examine the various political orientations and formations that brought left-wing intellectuals together, often around dedicated periodicals: Social Democrats, pro-Soviet Communists, Christian pacifists and the anti-Stalinist Marxist left. Regarding the evolving status of left-wing intellectuals, we observe the decline of the organic, party-affiliated intellectual, often from a working-class background; by contrast, critical academics, left-wing in orientation but not directly linked to a political formation, became the prevailing figures.
Concerning the involvement of left-wing intellectuals in political debate, we differentiate three historical periods. Firstly, the immediate postwar years (1945-1949) were characterised by the strengthening of the left, including pro-Soviet forces, which challenged the conservative political consensus built up during the war years. Secondly, during the tensest years of the Cold War (1950-1962), Swiss political and intellectual life was dominated by a strong anticommunism, supported by the Social Democratic leadership. Still, the commitment of certain progressive intellectuals, particularly in the pacifist movement, called the Cold War political consensus into question. This questioning strengthened after 1962, in the context of the East-West détente, with the rise of the "non-conformist" movement. This movement was driven by intellectuals who denounced Swiss conservatism and the excesses of official anticommunism, while declaring their solidarity with immigrant workers in Switzerland and with social movements in the Third World. We show in particular how these intellectuals helped pave the way for the youth mobilisations of the "1968 years".
Abstract:
Contact stains recovered at break-in crime scenes are frequently characterized by mixtures of DNA from several persons. Broad knowledge of the relative contribution of DNA left behind by different users over time is of paramount importance. Such information might help crime investigators to robustly evaluate the possibility of detecting a specific (or known) individual's DNA profile based on the type and history of an object. To address this issue, a contact-stain simulation protocol was designed. Fourteen volunteers acting as either the first or second user of an object were recruited. The first user was required to regularly handle/wear 9 different items during an 8-10-day period, whilst the second user handled/wore them for 5, 30 and 120 min in three independent simulation sessions, producing a total of 231 stains. Subsequently, the relative DNA-profile contribution of each individual pair was investigated. Preliminary results showed a progressive increase in the percentage contribution of the second user compared with the first. Interestingly, the second user generally became the major DNA contributor when most objects were handled/worn for 120 min. Furthermore, the observation of unexpected additional alleles prompted the investigation of indirect DNA transfer events.
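One common way to quantify the "percentage contribution" discussed above is to sum electropherogram peak heights over the alleles attributable to each donor at loci where the two donors share no alleles. The sketch below does exactly that on invented data; the locus names are real STR markers, but every genotype and peak height is hypothetical, and this is not necessarily the computation used in the study.

```python
# Hypothetical two-person mixture: estimate each donor's relative DNA
# contribution from peak heights (RFU) at loci where the two known
# donors share no alleles. All values are invented.
profile = {
    # locus: {allele: peak height in RFU}
    "D3S1358": {"14": 1200, "15": 1100, "17": 430, "18": 390},
    "vWA":     {"16": 980,  "17": 900,  "19": 350, "20": 310},
}
donor_alleles = {
    "first_user":  {"D3S1358": {"14", "15"}, "vWA": {"16", "17"}},
    "second_user": {"D3S1358": {"17", "18"}, "vWA": {"19", "20"}},
}

def contribution(profile, donor_alleles):
    """Percentage of total peak height attributable to each donor."""
    totals = {donor: 0.0 for donor in donor_alleles}
    for locus, peaks in profile.items():
        for donor, alleles in donor_alleles.items():
            totals[donor] += sum(peaks[a] for a in alleles[locus] if a in peaks)
    grand = sum(totals.values())
    return {donor: 100 * h / grand for donor, h in totals.items()}

print(contribution(profile, donor_alleles))
# first_user ≈ 74%, second_user ≈ 26% for these invented values
```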
Abstract:
Humans like some colours and dislike others, but which particular colours, and why, remains to be understood. Empirical studies of colour preferences have generally targeted the most preferred colours, but rarely the least preferred (disliked) ones. In addition, findings are often based on general colour preferences, leaving open the question of whether results generalise to specific objects. Here, 88 participants selected the colours they preferred most and least for three context conditions (general, interior walls, t-shirt) using a high-precision colour picker. Participants also indicated whether they associated their colour choice with a valenced object or concept. The chosen colours varied widely between individuals and contexts, and so did the reasons for the choices. Consistent patterns also emerged: the most preferred colours in general were more chromatic, while for walls they were lighter and for t-shirts they were darker and less chromatic than the least preferred colours. General colour preferences therefore could not explain object-specific colour preferences. Measures of the selection process further revealed that, compared with most preferred colours, least preferred colours were chosen more quickly and were less often linked to valenced objects or concepts. The high intra- and inter-individual variability in this and previous reports supports the view that colour preferences are shaped by subjective experiences and that most and least preferred colours are not processed equally.
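The lightness and chroma comparisons above presuppose a perceptual colour space. Assuming the picker returned sRGB values (the abstract does not specify its colour pipeline), a minimal sketch of the standard sRGB → CIEXYZ → CIELAB → LCh conversion follows; the matrices and constants are the usual CIE/sRGB ones with a D65 white.

```python
import math

def srgb_to_lch(r: float, g: float, b: float):
    """Convert sRGB in [0, 1] to CIE L*C*h (D65): lightness, chroma, hue."""
    # 1. Undo the sRGB transfer function (linearise).
    def lin(u):
        return u / 12.92 if u <= 0.04045 else ((u + 0.055) / 1.055) ** 2.4
    r, g, b = lin(r), lin(g), lin(b)
    # 2. Linear RGB -> CIE XYZ (sRGB primaries, D65 white).
    x = 0.4124 * r + 0.3576 * g + 0.1805 * b
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    z = 0.0193 * r + 0.1192 * g + 0.9505 * b
    # 3. XYZ -> CIELAB relative to the D65 reference white.
    xn, yn, zn = 0.95047, 1.0, 1.08883
    def f(t):
        return t ** (1 / 3) if t > (6 / 29) ** 3 else t / (3 * (6 / 29) ** 2) + 4 / 29
    fx, fy, fz = f(x / xn), f(y / yn), f(z / zn)
    L = 116 * fy - 16
    a = 500 * (fx - fy)
    bb = 200 * (fy - fz)
    # 4. Rectangular Lab -> cylindrical LCh: chroma and hue angle.
    return L, math.hypot(a, bb), math.degrees(math.atan2(bb, a)) % 360

# A saturated red is more chromatic than a pastel one:
print(srgb_to_lch(0.9, 0.1, 0.1))   # high chroma
print(srgb_to_lch(0.9, 0.7, 0.7))   # lighter, much lower chroma
```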