38 results for NON-IDEAL POWER SOURCES
Abstract:
Palinspastic reconstructions provide the ideal framework for many geological, geographical, oceanographic and climatic studies. As historians of the Earth, "reconstructors" try to decipher its past. Since learning that continents move, geologists have tried to trace their evolution through the ages. While Wegener's original idea was revolutionary at the beginning of the last century, we have known since the early 1960s that continents do not "drift" aimlessly in the middle of the oceans but belong to a larger whole combining continental and oceanic crust: the tectonic plates. Unfortunately, for historical as well as technical reasons, this idea still does not receive sufficient attention within the community of reconstructors. Nevertheless, we are firmly convinced that, by applying certain methods and principles, it is possible to move beyond the traditional "Wegenerian" approach and finally reach true plate tectonics. The main goal of the present work is to set out, in all necessary detail, our tools and methods. Starting from the paleomagnetic and paleogeographic data classically used for reconstructions, we have developed a new methodology that places the tectonic plates and their kinematics at the heart of the problem. Using continental assemblies (also called "key assemblies") as anchor points distributed over the whole span of our study (from the Eocene back to the Cambrian), we develop geodynamic scenarios leading from one to the next, moving from the past toward the present. Between two stages, the lithospheric plates are progressively reconstructed by adding/removing oceanic material (symbolized by synthetic isochrons) to the continents. Except during collisions, the plates are moved as discrete rigid entities. Through the ages, the only evolving elements are the plate boundaries. They are preserved through time, follow a consistent geodynamic evolution, and always form an interconnected network in space. This approach, called "dynamic plate boundaries", integrates multiple factors including plate buoyancy, spreading rates at ridges, subsidence curves, and stratigraphic and paleobiogeographic data, as well as major tectonic and magmatic events. The method thus offers good control over plate kinematics and provides strong constraints on the model. This "multi-source" approach requires efficient data organization and management. Before this study began, the mass of data required had become an obstacle that was difficult to overcome. GIS (Geographic Information Systems) and geodatabases are computing tools specifically dedicated to the management, storage and analysis of spatially referenced data and their attributes. Thanks to the development of the PaleoDyn database in ArcGIS, we were able to convert this mass of scattered data into valuable geodynamic information that is easily accessible for creating reconstructions.
At the same time, thanks to specially developed tools, we both facilitated the reconstruction work (automated tasks) and improved the model by greatly strengthening the kinematic control through the creation of plate velocity models. Based on the 340 newly defined terranes, we developed a set of 35 reconstructions, each associated with a velocity model. Thanks to this unique dataset, we can now address major issues of modern geology such as the study of sea-level variations and climate change. We began by tackling another major (and not yet definitively resolved!) problem of modern tectonics: the mechanisms controlling plate motions. We observed that, throughout the Earth's history, the plate rotation poles (which describe plate motions at the Earth's surface) tend to be distributed along a band running from the North Pacific through northern South America, the Central Atlantic, North Africa and Central Asia up to Japan. Fundamentally, this distribution means that the plates tend to flee this median plane. In the absence of a methodological bias we might have failed to identify, we interpret this phenomenon as reflecting the secular influence of the Moon on plate motion. The oceanic domain is the keystone of our model, and we took particular care to reconstruct it in great detail. In this model, the oceanic crust is preserved from one reconstruction to the next. The crustal material is symbolized by synthetic isochrons whose ages are known. We also reconstructed the margins (active and passive), the mid-ocean ridges and the intra-oceanic subduction zones. Using this very detailed dataset, we were able to develop unique 3-D bathymetric models offering far better precision than previous ones.

Palinspastic reconstructions offer an ideal framework for geological, geographical, oceanographic and climatological studies. As historians of the Earth, "reconstructers" try to decipher its past. Since they have known that continents move, geologists have been trying to retrieve the distribution of the continents through the ages. While Wegener's view of continental motion was revolutionary at the beginning of the 20th century, we have known since the early 1960s that continents do not drift aimlessly in the oceanic realm but are included in a larger whole comprising both oceanic and continental crust: the tectonic plates. Unfortunately, mainly due to technical and historical issues, this idea does not seem to receive sufficient echo within the community concerned. However, we are firmly convinced that, by applying specific methods and principles, we can escape the traditional "Wegenerian" point of view and, at last, reach real plate tectonics. The main aim of this study is to defend this point of view by exposing, in all necessary detail, our methods and tools. Starting with the paleomagnetic and paleogeographic data classically used in reconstruction studies, we developed a modern methodology placing the plates and their kinematics at the centre of the issue.
Using assemblies of continents (referred to as "key assemblies") as anchors distributed across the whole scope of our study (ranging from Eocene to Cambrian time), we develop geodynamic scenarios leading from one to the next, from the past to the present. In between, lithospheric plates are progressively reconstructed by adding/removing oceanic material (symbolized by synthetic isochrons) to the major continents. Except during collisions, plates are moved as single rigid entities. The only evolving elements are the plate boundaries, which are preserved, follow a consistent geodynamic evolution through time, and form an interconnected network through space. This "dynamic plate boundaries" approach integrates plate buoyancy factors, ocean spreading rates, subsidence patterns, and stratigraphic and paleobiogeographic data, as well as major tectonic and magmatic events. It offers good control on plate kinematics and provides strong constraints for the model. This multi-source approach requires efficient data management. Prior to this study, the critical mass of necessary data had become an obstacle that was hard to overcome. GIS and geodatabases are modern computing tools specifically devoted to storing, analyzing and managing spatially referenced data and their associated attributes. By developing the PaleoDyn database in ArcGIS we converted the mass of scattered data offered by the geological record into valuable geodynamic information that is easily accessible for creating reconstructions. At the same time, by programming specific tools we both facilitated the reconstruction work (task automation) and enhanced the model (by greatly increasing the kinematic control of plate motions through plate velocity models). Based on the 340 newly defined terranes, we developed a revised set of 35 reconstructions, each associated with its own velocity model. Using this unique dataset we are now able to tackle major issues in geology, such as global sea-level variations and climate change. We started by studying one of the major unsolved issues of modern plate tectonics: the driving mechanism of plate motions. We observed that, throughout the Earth's history, plate rotation poles (describing plate motions across the Earth's surface) tend to follow a roughly linear distribution along a band running from the North Pacific through northern South America, the Central Atlantic, North Africa and Central Asia up to Japan. Basically, this signifies that plates tend to escape this median plane. In the absence of an unidentified methodological bias, we interpret this as the potential secular influence of the Moon on plate motions. The oceanic realm is the cornerstone of our model, and we took particular care to reconstruct it in detail. In this model, the oceanic crust is preserved from one reconstruction to the next. The crustal material is symbolized by synthetic isochrons whose ages are known. We also reconstruct the margins (active or passive), the mid-ocean ridges and the intra-oceanic subduction zones. Using this detailed oceanic dataset, we developed unique 3-D bathymetric models offering better precision than all previously existing ones.
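The abstract repeatedly refers to plate rotation (Euler) poles and plate velocity models without spelling out the underlying kinematics. As a rough illustration only, a point on a rigid plate can be restored by a finite rotation about an Euler pole, and its surface speed derived from the angular velocity; this is a minimal sketch of standard spherical plate kinematics, not the thesis's own tools, and the pole position, rotation angle and rate below are invented.

```python
import numpy as np

EARTH_RADIUS_KM = 6371.0

def to_cartesian(lat_deg, lon_deg):
    """Unit vector on the sphere for a point given in degrees."""
    lat, lon = np.radians(lat_deg), np.radians(lon_deg)
    return np.array([np.cos(lat) * np.cos(lon),
                     np.cos(lat) * np.sin(lon),
                     np.sin(lat)])

def to_geographic(v):
    """(latitude, longitude) in degrees for a unit vector."""
    return np.degrees(np.arcsin(v[2])), np.degrees(np.arctan2(v[1], v[0]))

def rotate_about_pole(point, pole, angle_deg):
    """Finite rotation of `point` about the unit Euler `pole` (Rodrigues' formula)."""
    a = np.radians(angle_deg)
    return (point * np.cos(a)
            + np.cross(pole, point) * np.sin(a)
            + pole * np.dot(pole, point) * (1.0 - np.cos(a)))

def surface_speed_mm_per_yr(point, pole, rate_deg_per_myr):
    """Speed |omega x r| at `point` for an angular rate about `pole`.

    The result comes out in km/Myr, which is numerically identical to mm/yr."""
    omega = pole * np.radians(rate_deg_per_myr)  # angular velocity, rad/Myr
    return np.linalg.norm(np.cross(omega, point)) * EARTH_RADIUS_KM

# Hypothetical numbers only (not taken from the thesis): a made-up pole and site.
pole = to_cartesian(59.0, -73.0)
site = to_cartesian(-20.0, 115.0)
print("restored position (lat, lon):", to_geographic(rotate_about_pole(site, pole, 10.0)))
print("present-day speed (mm/yr): %.1f" % surface_speed_mm_per_yr(site, pole, 0.6))
```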
Abstract:
Recovering the meaning of violence is certainly an essential step, but not a sufficient one. It is also necessary to go beyond meaning and to constantly revisit the responses to new and unexpected expressions of violent phenomena, whether individual or collective. Seeking to move beyond a dramatization of the issue of youth violence, this book offers perspectives for reflection and action useful to child and adolescent mental health professionals, and more broadly to all partners in the social, educational and school networks. 1. Definition and sources of violence: Infamies of childhood. - Aggressiveness, hatred and destructiveness: the roots of violence. - The childhood of adult patients diagnosed with "paraphilia". - Violence, its drive-related sources and its psychic genesis. 2. Forms of violence: Realities and enigmas of deviant infantile sexuality. - Battered parents. - Bullying among young people of the same age; the family background of the young people involved. - Psychopathic behaviour and traumatic reactivity. - Transgression in the Nazi extermination camps: losing one's soul, saving one's life. 3. Psychopathology. 4. Treatments: From substance use to dependence: an integrative model. - Aggressiveness and transference in everyday psychotherapy. - Experiences of interventions in medical-psychological emergency units. - Sense and nonsense of violence. - Violence and care in adolescence. - Evacuation and representation of violence in the psychotic child. - Drive-related violence and its transferential repercussions. - "Dystraitance": institutional support.
Abstract:
This thesis is composed of three main parts. The first consists of a state of the art of the different notions needed to understand the elements surrounding art authentication in general, and signatures in particular, and that the author deemed necessary to fully grasp the microcosm that makes up this particular market. Individuals with a solid knowledge of the art and expertise area who are particularly interested in the present study are advised to advance directly to the fourth chapter. The expertise of the signature, its reliability, and the factors impacting the expert's conclusions are brought forward. The final aim of the state of the art is to offer a general list of recommendations based on an exhaustive review of the current literature and given in light of all of the exposed issues. These guidelines are specifically formulated for the expertise of signatures on paintings, but can also be applied to wider themes in the area of signature examination. The second part of this thesis covers the experimental stages of the research. It consists of the method developed to authenticate painted signatures on works of art. This method is articulated around several main objectives: defining measurable features on painted signatures and assessing their relevance in order to establish the capacity to separate groups of authentic and simulated signatures. For the first time, numerical analyses of painted signatures have been obtained and are used to attribute their authorship to given artists. An in-depth discussion of the developed method constitutes the third and final part of this study. It evaluates the opportunities and constraints of the method when applied by signature and handwriting experts in forensic science. The outlines presented below offer a rapid overview of the study and summarize the aims and main themes addressed in each chapter.

Part I - Theory

Chapter 1 exposes the legal aspects surrounding the authentication of works of art by art experts. The definition of what is legally authentic, the quality and types of experts who can express an opinion concerning the authorship of a specific painting, and standard deontological rules are addressed. The practices applied in Switzerland are specifically dealt with.

Chapter 2 presents an overview of the different scientific analyses that can be carried out on paintings (from the canvas to the top coat). Scientific examinations of works of art have become more common as more and more museums equip themselves with laboratories, so an understanding of their role in the art authentication process is vital. The added value that a signature expertise can have in comparison with other scientific techniques is also addressed.

Chapter 3 provides a historical overview of the signature on paintings throughout the ages, in order to offer the reader an understanding of the origin of the signature on works of art and its evolution through time. An explanation is given of the transitions that the signature went through from the 15th century onward and how it progressively took on its widely known modern form. Both this chapter and Chapter 2 are presented to show the reader the rich sources of information that can be used to describe a painting, and how the signature is one of these sources.
Chapter 4 focuses on the different hypotheses the forensic handwriting examiner (FHE) must keep in mind when examining a painted signature, since a number of scenarios can be encountered when dealing with signatures on works of art. The different forms of signatures, as well as the variables that may have an influence on painted signatures, are also presented. Finally, the current state of knowledge of the examination procedure for signatures in forensic science in general, and for painted signatures in particular, is exposed. The state of the art of the assessment of the authorship of signatures on paintings is established and discussed in light of the theoretical facets mentioned previously.

Chapter 5 considers key elements that can have an impact on the FHE during his or her examinations. This includes a discussion of elements such as the skill, confidence and competence of an expert, as well as the potential bias effects he or she might encounter. A better understanding of the elements surrounding handwriting examinations, in order, in turn, to better communicate results and conclusions to an audience, is also undertaken.

Chapter 6 reviews the judicial acceptance of signature analysis in courts and closes the state-of-the-art section of this thesis. This chapter brings forward the current issues pertaining to the appreciation of this expertise by the non-forensic community, and discusses the increasing number of claims about the unscientific nature of signature authentication. The necessity of aiming for more scientific, comprehensive and transparent authentication methods is discussed. The theoretical part of this thesis is concluded by a series of general recommendations for forensic handwriting examiners, specifically for the expertise of signatures on paintings. These recommendations stem from the exhaustive review of the literature and the issues exposed by this review, and can also be applied to the traditional examination of signatures (on paper).

Part II - Experimental part

Chapter 7 describes and defines the sampling, extraction and analysis phases of the research. The sampling of artists' signatures and their respective simulations is presented, followed by the steps undertaken to extract and determine sets of characteristics, specific to each artist, that describe their signatures. The method is based on a study of five artists and a group of individuals acting as forgers for the sake of this study. Finally, the procedure for analysing these characteristics to assess the strength of evidence, based on a Bayesian reasoning process, is presented.

Chapter 8 outlines the results concerning both the artist and simulation corpora after their optical observation, followed by the results of the analysis phase of the research. The feature selection process and the likelihood-ratio evaluation are the main themes addressed. The discrimination power between the two corpora is illustrated through multivariate analysis.

Part III - Discussion

Chapter 9 discusses the materials, the methods, and the obtained results of the research. The opportunities, but also the constraints and limits, of the developed method are exposed. Future work that can be carried out following the results of the study is also presented.

Chapter 10, the last chapter of this thesis, proposes a strategy to incorporate the model developed in the preceding chapters into traditional signature expertise procedures.
Thus, the strength of this expertise is discussed in conjunction with the traditional conclusions reached by forensic handwriting examiners. Finally, this chapter summarizes and advocates a list of formal recommendations for good practice for handwriting examiners. In conclusion, the research highlights the interdisciplinary nature of the examination of signatures on paintings. The current state of knowledge of the judicial standing of art experts, along with the scientific and historical analysis of paintings and signatures, is overviewed to give the reader a feel for the different factors that have an impact on this particular subject. The uneven acceptance of forensic signature analysis in court, also presented in the state of the art, explicitly demonstrates the necessity of a better recognition of signature expertise by courts of law. This general acceptance, however, can only be achieved by producing high-quality results through a well-defined examination process. This research offers an original approach to attributing a painted signature to a certain artist: for the first time, a probabilistic model used to measure the discriminative potential between authentic and simulated painted signatures is studied. The opportunities and limits of this method of scientifically establishing the authorship of signatures on works of art are thus presented. In addition, as the second key contribution of this work, a procedure is proposed to combine the developed method with that traditionally used by signature experts in forensic science. Such an implementation into holistic traditional signature examination casework is a large step toward providing the forensic, judicial and art communities with a soundly based reasoning framework for the examination of signatures on paintings. The framework and preliminary results associated with this research have been published (Montani, 2009a) and presented at international forensic science conferences (Montani, 2009b; Montani, 2012).
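Chapters 7 and 8 describe a Bayesian likelihood-ratio evaluation of measurable signature features, but the abstract gives no implementation details. Purely as an illustrative sketch, a log likelihood ratio comparing the "authentic" and "simulated" propositions could be computed as below; the feature values are synthetic and the independent-Gaussian score model is an assumption for illustration, not the probabilistic model developed in the thesis.

```python
import numpy as np

def gaussian_logpdf(x, mean, var):
    """Log density of an independent (diagonal) Gaussian over the feature vector."""
    return np.sum(-0.5 * (np.log(2 * np.pi * var) + (x - mean) ** 2 / var))

def log_likelihood_ratio(features, authentic_feats, simulated_feats):
    """log LR = log p(features | authentic) - log p(features | simulated),
    each proposition modelled by a Gaussian fitted to its reference corpus."""
    eps = 1e-9  # guard against zero variance on small reference sets
    mu_a, var_a = authentic_feats.mean(axis=0), authentic_feats.var(axis=0) + eps
    mu_s, var_s = simulated_feats.mean(axis=0), simulated_feats.var(axis=0) + eps
    return gaussian_logpdf(features, mu_a, var_a) - gaussian_logpdf(features, mu_s, var_s)

# Hypothetical numbers only: rows are reference signatures, columns are measured
# features (e.g. height/width ratio, stroke length, slant); none come from the thesis.
rng = np.random.default_rng(0)
authentic = rng.normal([0.42, 118.0, 12.0], [0.03, 6.0, 1.5], size=(30, 3))
simulated = rng.normal([0.48, 131.0, 15.0], [0.05, 9.0, 2.5], size=(30, 3))
questioned = np.array([0.43, 120.0, 12.5])

llr = log_likelihood_ratio(questioned, authentic, simulated)
print(f"log likelihood ratio: {llr:.2f}  (> 0 favours the authentic proposition)")
```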
Abstract:
The identification and characterization of long noncoding RNA in a variety of tissues represent major achievements that contribute to our understanding of the molecular mechanisms controlling gene expression. In particular, long noncoding RNA play crucial roles in the epigenetic regulation of the adaptive response to environmental cues via their capacity to target chromatin modifiers to specific loci. In addition, these transcripts have been implicated in controlling the splicing, translation and degradation of messenger RNA. Long noncoding RNA have also been shown to act as decoy molecules for microRNA. In the heart, a few long noncoding RNA have been demonstrated to regulate cardiac commitment and differentiation during development. Furthermore, recent findings suggest their involvement as regulators of the pathophysiological response to injury in the adult heart. Their high cellular specificity makes them attractive target molecules for innovative therapies and ideal biomarkers.
Abstract:
OBJECTIVE: The goal was to demonstrate that tailored therapy, according to tumor histology and epidermal growth factor receptor (EGFR) mutation status, and the introduction of novel drug combinations in the treatment of advanced non-small-cell lung cancer are promising for further investigation. METHODS: We conducted a multicenter phase II trial with mandatory EGFR testing and 2 strata. Patients with EGFR wild type received 4 cycles of bevacizumab, pemetrexed, and cisplatin, followed by maintenance with bevacizumab and pemetrexed until progression. Patients with EGFR mutations received bevacizumab and erlotinib until progression. Patients had computed tomography scans every 6 weeks and repeat biopsy at progression. The primary end point was progression-free survival (PFS) ≥ 35% at 6 months in stratum EGFR wild type; 77 patients were required to reach a power of 90% with an alpha of 5%. Secondary end points were median PFS, overall survival, best overall response rate (ORR), and tolerability. Further biomarkers and biopsy at progression were also evaluated. RESULTS: A total of 77 evaluable patients with EGFR wild type received an average of 9 cycles (range, 1-25). PFS at 6 months was 45.5%, median PFS was 6.9 months, overall survival was 12.1 months, and ORR was 62%. Kirsten rat sarcoma oncogene mutations and circulating vascular endothelial growth factor negatively correlated with survival, but thymidylate synthase expression did not. A total of 20 patients with EGFR mutations received an average of 16 cycles. PFS at 6 months was 70%, median PFS was 14 months, and ORR was 70%. Biopsy at progression was safe and successful in 71% of the cases. CONCLUSIONS: Both combination therapies were promising for further studies. Biopsy at progression was feasible and will be part of future SAKK studies to investigate molecular mechanisms of resistance.
Abstract:
Perioperative management of patients treated with the non-vitamin K antagonist oral anticoagulants is an ongoing challenge. Due to the lack of good clinical studies involving adequate monitoring and reversal therapies, management requires knowledge and understanding of pharmacokinetics, renal function, drug interactions, and evaluation of the surgical bleeding risk. Consideration of the benefit of reversal of anticoagulation is important and, for some low risk bleeding procedures, it may be in the patient's interest to continue anticoagulation. In case of major intra-operative bleeding in patients likely to have therapeutic or supra-therapeutic levels of anticoagulation, specific reversal agents/antidotes would be of value but are currently lacking. As a consequence, a multimodal approach should be taken which includes the administration of 25 to 50 U/kg 4-factor prothrombin complex concentrates or 30 to 50 U/kg activated prothrombin complex concentrate (FEIBA®) in some life-threatening situations. Finally, further studies are needed to clarify the ideal therapeutic intervention.
Abstract:
BACKGROUND: The unique situation of the liver, with its arterial and venous blood supply, and the dependency of the tumor on the arterial blood flow make this organ an ideal target for intrahepatic catheter-based therapies. The main forms of treatment are classical bland embolization (TAE), cutting the blood flow to the tumors; chemoembolization (TACE), inducing high chemotherapy concentrations in tumors; and radioembolization (TARE), with no embolizing effect but very high local radiation. These different forms of therapy are used in different centers with different protocols. This overview summarizes the different forms of treatment, their indications and protocols, possible side effects, and available data in patients with non-colorectal liver tumors. METHODS: A search of PubMed was performed. Mainly controlled clinical trials were reviewed. The search terms were 'embolization liver', 'TAE', 'chemoembolization liver', 'TACE', 'radioembolization liver', and 'TARE' as well as 'chemosaturation' and 'TACP' in the indications 'breast cancer', 'neuroendocrine', and 'melanoma'. All reported studies were analyzed for impact and reported according to their clinical relevance. RESULTS: The main search criteria revealed the following results: 'embolization liver + breast cancer', 122 results, of which 16 were clinical trials; 'chemoembolization liver + breast cancer', 62 results, of which 11 were clinical trials; 'radioembolization liver + breast cancer', 37 results, of which 3 were clinical trials; 'embolization liver + neuroendocrine', 283 results, of which 20 were clinical trials; 'chemoembolization liver + neuroendocrine', 202 results, of which 9 were clinical trials; 'radioembolization liver + neuroendocrine', 64 results, of which 9 were clinical trials; 'embolization liver + melanoma', 79 results, of which 15 were clinical trials; 'chemoembolization liver + melanoma', 60 results, of which 14 were clinical trials; 'radioembolization liver + melanoma', 18 results, of which 3 were clinical trials. The term 'chemosaturation liver' was tested without an indication, since only a few publications exist, and provided five results and only one clinical trial. CONCLUSION: Despite many years of clinical use and the documented efficacy of intra-arterial treatments of the liver, there are still only a few prospective multicenter trials, with many different protocols. To guarantee the future use of these efficacious therapies, especially in light of the many systemic and surgical therapies available for non-colorectal liver metastases, further large randomized trials and transparent guidelines need to be established.
Abstract:
Internationally, policies for attracting highly skilled migrants have become standard practice, mainly among the countries of the Organisation for Economic Co-operation and Development (OECD). Governments are implementing specific procedures to attract such migrants and facilitate their mobility. However, not all professions are equally welcoming to highly skilled migrants; the medical profession, as a protected market, is one of these. Taking the case of non-EU/EEA doctors in France, this paper shows that the medical profession, defined as a closed labour market, remains the most controversial in terms of the professional integration of migrants, protectionist barriers against migrant competition, and the challenge of medical shortages. Based on the path-dependency approach, this paper argues that the issues facing non-EU/EEA doctors in France derive from a complex historical process of interaction between standards set in the past, in particular the historical power of medical corporatism, the unexpected long-term effects of the French hospital reforms of 1958, and budgetary pressures. Theoretically, this paper presents two significant findings. Firstly, the French medical system has undergone a series of transformations unthinkable in the strict sense of a path-dependence approach: an opening of the medical profession to foreign physicians in the context of the Europeanisation of public policy, and the acceptance of non-EU/EEA doctors in a context of medical shortage and budgetary pressure. Secondly, there has been no change in the overall paradigm: significantly, the recruitment policies for non-EU/EEA doctors continue to bear the imprint of the past and reveal a significant persistence of prejudices. Non-EU/EEA doctors are not considered legitimate doctors even though they hold physician qualifications that are legitimate in their home countries and can be recognised in other receiving countries.