32 results for Historic Cartography
Abstract:
Reproductive isolation between lineages is expected to accumulate with divergence time, but the time taken to speciate may vary strongly between groups of organisms. In anuran amphibians, laboratory crosses can still produce viable hybrid offspring >20 My after separation, but the speed of speciation in closely related anuran lineages under natural conditions is poorly studied. Palearctic green toads (Bufo viridis subgroup) offer an excellent system to address this question, comprising several lineages that arose at different times and form secondary contact zones. Using mitochondrial and nuclear markers, we previously demonstrated that in Sicily, B. siculus and B. balearicus developed advanced reproductive isolation after Plio-Pleistocene divergence (2.6 My, 3.3-1.9), with limited historic mtDNA introgression, scarce nuclear admixture, and low, if any, current gene flow. Here, we study genetic interactions between younger lineages of early Pleistocene divergence (1.9 My, 2.5-1.3) in northeastern Italy (B. balearicus, B. viridis). We find significantly more, asymmetric nuclear introgression and wider, differential mtDNA introgression. The population structure seems to be molded by geographic distance and barriers (rivers) much more than by intrinsic genomic incompatibilities. These differences in hybridization between zones may be partly explained by differences in the duration of previous isolation. Scattered research on other anurans suggests that wide hybrid zones with strong introgression may develop when secondary contacts occur <2 My after divergence, whereas narrower zones with restricted gene flow form when divergence exceeds 3 My. Our study strengthens support for this rule of thumb by comparing lineages with different divergence times within the same radiation.
Abstract:
BACKGROUND: Mutations in SCN4A may lead to myotonia. METHODS: Presentation of a large family with myotonia, including molecular studies and patch-clamp experiments using human embryonic kidney 293 cells expressing wild-type and mutated channels. RESULTS: In a large family with historic data spanning seven generations and a clear phenotype (myotonia at movement onset, worsened by cold temperature, pregnancy, mental stress, and especially by rest after intense physical activity, but without weakness), the disease was linked to the muscle sodium channel gene (SCN4A) locus, in which a novel p.I141V mutation was found. This modification is located within the first transmembrane segment of domain I of the Na(v)1.4 alpha subunit, a region where no mutation had been reported so far. Patch-clamp experiments revealed a mutation-induced hyperpolarizing shift (-12.9 mV) of the voltage dependence of activation, leading to a significant increase (approximately twofold) of the window current amplitude. In addition, the mutation shifted the voltage dependence of slow inactivation by -8.7 mV and accelerated entry into this state. CONCLUSIONS: We propose that the gain-of-function alteration in activation leads to the observed myotonic phenotype, whereas the enhanced slow inactivation may prevent depolarization-induced paralysis.
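A quick way to see why a hyperpolarizing shift of activation enlarges the window current is to overlap steady-state Boltzmann activation and availability curves. The Python sketch below uses made-up half-activation voltages and slope factors (only the reported -12.9 mV shift is taken from the abstract), so it illustrates the direction of the effect rather than the exact twofold figure.

    import numpy as np

    def boltzmann(v, v_half, k):
        # Steady-state Boltzmann curve; positive k gives an increasing (activation) curve,
        # negative k a decreasing (availability/inactivation) curve.
        return 1.0 / (1.0 + np.exp((v_half - v) / k))

    v = np.linspace(-120.0, 20.0, 1401)            # membrane potential grid (mV)

    # Hypothetical wild-type parameters, for illustration only
    act_wt = boltzmann(v, v_half=-30.0, k=6.0)     # activation
    avail  = boltzmann(v, v_half=-70.0, k=-6.0)    # fast-inactivation availability

    # Mutant: activation shifted by the reported -12.9 mV; availability left unchanged here
    act_mut = boltzmann(v, v_half=-30.0 - 12.9, k=6.0)

    def window_area(act):
        # Area under the overlap of activation and availability (arbitrary units)
        return np.sum(act * avail) * (v[1] - v[0])

    print("window current ratio (mutant / wild type):",
          round(window_area(act_mut) / window_area(act_wt), 2))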
Abstract:
Few subjects have caught the attention of the entire world as much as those dealing with natural hazards. The first decade of this new millennium provides a litany of tragic examples of various hazards that turned into disasters affecting millions of individuals around the globe. The human losses (some 225,000 people) associated with the 2004 Indian Ocean earthquake and tsunami, the economic costs (approximately 200 billion USD) of the 2011 Tohoku Japan earthquake, tsunami and reactor event, and the collective social impacts of human tragedies experienced during Hurricane Katrina in 2005 all provide repeated reminders that we humans are temporary guests occupying a very active and angry planet. Many other examples could be cited here to stress the point that natural events on Earth may, and often do, lead to disasters and catastrophes when humans place themselves into situations of high risk. Few subjects share the true interdisciplinary dependency that characterizes the field of natural hazards. From geology and geophysics to engineering and emergency response to social psychology and economics, the study of natural hazards draws input from an impressive suite of unique and previously independent specializations. Natural hazards provide a common platform to reduce disciplinary boundaries and facilitate a beneficial synergy in the provision of timely and useful information and action on this critical subject matter. As social norms change regarding the concept of acceptable risk, and as human migration leads to an explosion in the number of megacities, coastal over-crowding and unmanaged habitation in precarious environments such as mountainous slopes, the vulnerability of people and their susceptibility to natural hazards increase dramatically. Coupled with concerns about changing climates, escalating recovery costs, and a growing divergence between more developed and less developed countries, the subject of natural hazards remains at the forefront of issues that affect all people, nations, and environments all the time. This treatise provides a compendium of critical, timely and very detailed information and essential facts regarding the basic attributes of natural hazards and concomitant disasters. The Encyclopedia of Natural Hazards effectively captures and integrates contributions from an international portfolio of almost 300 specialists whose range of expertise addresses over 330 topics pertinent to the field of natural hazards. Disciplinary barriers are overcome in this comprehensive treatment of the subject matter. Clear illustrations and numerous color images enhance the primary aim to communicate and educate. The inclusion of a series of unique "classic case study" events interspersed throughout the volume provides tangible examples linking concepts, issues, outcomes and solutions. These case studies illustrate different but notable recent, historic and prehistoric events that have shaped the world as we now know it. They provide excellent focal points linking the remaining terms in the volume to the primary field of study. This Encyclopedia of Natural Hazards will remain a standard reference of choice for many years.
Abstract:
This contribution builds upon a former paper by the authors (Lipps and Betz 2004), in which a stochastic population projection for East and West Germany was performed. The aim was to forecast relevant population parameters and their distributions in a consistent way. We now present several modifications made since then. First, population parameters are modelled for the entire German population. To overcome the modelling problem posed by the structural break in the East during reunification, we show that the adaptation of the relevant figures in the East can be considered complete by now. As a consequence, German parameters can be modelled using only West German historic patterns, with the start-off population of the whole of Germany. Second, a new model to simulate age-specific fertility rates is presented, based on a quadratic spline approach. This offers higher flexibility to model various age-specific fertility curves. The simulation results are compared with the scenario-based official forecasts for Germany in 2050. For some population parameters (e.g. the dependency ratio), it can be shown that the range spanned by the medium and extreme variants corresponds to the σ-intervals in the stochastic framework. It therefore seems more appropriate to treat this range as a σ-interval covering about two thirds of the true distribution.
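As a small illustration of the quadratic-spline idea for age-specific fertility curves, the Python sketch below fits a degree-2 smoothing spline to invented age-specific fertility rates (placeholder values, not the German data used in the paper) and reads off the implied total fertility rate.

    import numpy as np
    from scipy.interpolate import UnivariateSpline

    # Invented age-specific fertility rates (births per woman per year) at 5-year age points
    ages = np.array([15, 20, 25, 30, 35, 40, 45, 50], dtype=float)
    asfr = np.array([0.005, 0.045, 0.090, 0.095, 0.050, 0.012, 0.001, 0.000])

    # Quadratic (k=2) smoothing spline through the observed points
    spline = UnivariateSpline(ages, asfr, k=2, s=1e-4)

    single_years = np.arange(15, 50)                   # evaluate on single years of age
    rates = np.clip(spline(single_years), 0.0, None)   # keep simulated rates non-negative

    print("implied total fertility rate:", round(float(rates.sum()), 2))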
Abstract:
The paper deals with the development and application of a generic methodology for the automatic processing (mapping and classification) of environmental data. The General Regression Neural Network (GRNN) is considered in detail and is proposed as an efficient tool to solve the problem of spatial data mapping (regression). The Probabilistic Neural Network (PNN) is considered as an automatic tool for spatial classification. The automatic tuning of isotropic and anisotropic GRNN/PNN models using a cross-validation procedure is presented. Results are compared with the k-Nearest-Neighbours (k-NN) interpolation algorithm using an independent validation data set. Real case studies are based on decision-oriented mapping and classification of radioactively contaminated territories.
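For readers unfamiliar with the GRNN, it is essentially Nadaraya-Watson kernel regression. The Python sketch below works on synthetic coordinates and values (not the case-study data), tunes an isotropic Gaussian kernel width by leave-one-out cross-validation, and compares the result against a plain k-NN interpolator on an independent validation set, mirroring the workflow described rather than the authors' implementation.

    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic 2-D monitoring data: coordinates and a smooth field plus noise
    X = rng.uniform(0, 10, size=(200, 2))
    y = np.sin(X[:, 0]) + 0.5 * np.cos(X[:, 1]) + rng.normal(0, 0.1, 200)

    def grnn_predict(X_train, y_train, X_query, sigma):
        # Isotropic GRNN = Nadaraya-Watson regression with a Gaussian kernel of width sigma
        d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
        w = np.exp(-d2 / (2.0 * sigma ** 2))
        return (w @ y_train) / w.sum(axis=1)

    def loo_rmse(X_train, y_train, sigma):
        # Leave-one-out cross-validation error used to tune the kernel width automatically
        d2 = ((X_train[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
        w = np.exp(-d2 / (2.0 * sigma ** 2))
        np.fill_diagonal(w, 0.0)               # exclude each point from its own estimate
        pred = (w @ y_train) / w.sum(axis=1)
        return float(np.sqrt(np.mean((pred - y_train) ** 2)))

    sigmas = np.logspace(-1.5, 0.5, 20)
    best_sigma = min(sigmas, key=lambda s: loo_rmse(X, y, s))

    def knn_predict(X_train, y_train, X_query, k=5):
        # Plain k-NN interpolation used as the benchmark
        d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
        idx = np.argsort(d2, axis=1)[:, :k]
        return y_train[idx].mean(axis=1)

    # Independent validation set drawn from the same synthetic field
    X_val = rng.uniform(0, 10, size=(50, 2))
    y_val = np.sin(X_val[:, 0]) + 0.5 * np.cos(X_val[:, 1])
    for name, pred in [("GRNN", grnn_predict(X, y, X_val, best_sigma)),
                       ("k-NN", knn_predict(X, y, X_val))]:
        print(name, "validation RMSE:", round(float(np.sqrt(np.mean((pred - y_val) ** 2))), 3))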
Geochemistry of the thermal springs and fumaroles of Basse-Terre Island, Guadeloupe, Lesser Antilles
Abstract:
The purpose of this work was to jointly study the volcanic-hydrothermal system of the high-risk volcano La Soufriere, in the southern part of Basse-Terre, and the geothermal area of Bouillante, on its western coast, to derive an all-embracing and coherent conceptual geochemical model that provides the necessary basis for adequate volcanic surveillance and further geothermal exploration. The active andesitic dome of La Soufriere has erupted eight times since 1660, most recently in 1976-1977. All these historic eruptions have been phreatic. High-salinity, Na-Cl geothermal liquids circulate in the Bouillante geothermal reservoir at temperatures close to 250 degrees C. These Na-Cl solutions rise toward the surface, undergo boiling and mixing with groundwater and/or seawater, and feed most Na-Cl thermal springs in the central Bouillante area. The Na-Cl thermal springs are surrounded by Na-HCO3 thermal springs and by the Na-Cl thermal spring of Anse a la Barque (a groundwater slightly mixed with seawater), which are all heated through conductive transfer. The two main fumarolic fields of the La Soufriere area discharge vapors formed through boiling of hydrothermal aqueous solutions at temperatures of 190-215 degrees C below the "Ty" fault area and close to 260 degrees C below the dome summit. The boiling liquid producing the vapors of the Ty fault area has δD and δ18O values relatively similar to those of the Na-Cl liquids of the Bouillante geothermal reservoir, whereas the liquid giving rise to the vapors of the summit fumaroles is strongly enriched in 18O, due to input of magmatic fluids from below. This process is also responsible for the paucity of CH4 in the fumaroles. The thermal features around the La Soufriere dome include: (a) Ca-SO4 springs, produced through absorption of hydrothermal vapors in shallow groundwaters; (b) conductively heated Ca-Na-HCO3 springs; and (c) two Ca-Na-Cl springs produced through mixing of shallow Ca-SO4 waters and deep Na-Cl hydrothermal liquids. The geographical distribution of the different thermal features of the La Soufriere area indicates the presence of: (a) a central zone dominated by the ascent of steam, which either discharges at the surface in the fumarolic fields or is absorbed in shallow groundwaters; and (b) an outer zone, where the shallow groundwaters are heated through conduction or by addition of Na-Cl liquids coming from hydrothermal aquifer(s).
Abstract:
OBJECTIVE: Our aim was to evaluate a fluorescence-based enhanced-reality system to assess intestinal viability in a laparoscopic mesenteric ischemia model. MATERIALS AND METHODS: A small bowel loop was exposed, and 3 to 4 mesenteric vessels were clipped in 6 pigs. Indocyanine green (ICG) was administered intravenously 15 minutes later. The bowel was illuminated with an incoherent light source laparoscope (D-light-P, Karl Storz). The ICG fluorescence signal was analyzed with ad hoc imaging software (VR-RENDER), which provides a digital perfusion cartography that was superimposed on the intraoperative laparoscopic image [augmented reality (AR) synthesis]. Five regions of interest (ROIs) were marked under AR guidance (1, 2a-2b, and 3a-3b, corresponding to the ischemic, marginal, and vascularized zones, respectively). One hour later, capillary blood samples were obtained by puncturing the bowel serosa at the identified ROIs, and lactates were measured using the EDGE analyzer. A surgical biopsy of each intestinal ROI was sent for mitochondrial respiratory rate assessment and for metabolite quantification. RESULTS: Mean capillary lactate levels were 3.98 (SD = 1.91) versus 1.05 (SD = 0.46) versus 0.74 (SD = 0.34) mmol/L at ROI 1 versus 2a-2b (P = 0.0001) versus 3a-3b (P = 0.0001), respectively. Mean maximal mitochondrial respiratory rate was 104.4 (±21.58) pmol O2/second/mg at ROI 1 versus 191.1 ± 14.48 (2b, P = 0.03) versus 180.4 ± 16.71 (3a, P = 0.02) versus 199.2 ± 25.21 (3b, P = 0.02). Alanine, choline, ethanolamine, glucose, lactate, myoinositol, phosphocholine, scyllo-inositol, and valine showed statistically significantly different concentrations between ischemic and nonischemic segments. CONCLUSIONS: Fluorescence-based AR may effectively detect the boundary between the ischemic and the vascularized zones in this experimental model.
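To give a concrete, entirely synthetic picture of what a digital perfusion cartography superimposed on the laparoscopic image can look like computationally, here is a toy alpha-blending sketch in Python; the frame, the fluorescence field and the colour mapping are invented and have nothing to do with VR-RENDER itself.

    import numpy as np

    rng = np.random.default_rng(1)

    # Synthetic stand-ins: a grayscale laparoscopic frame and an ICG fluorescence signal map
    frame = rng.uniform(0.2, 0.8, size=(240, 320))
    yy, xx = np.mgrid[0:240, 0:320]
    fluo = np.exp(-((xx - 220.0) ** 2 + (yy - 120.0) ** 2) / (2 * 60.0 ** 2))  # one well-perfused patch

    # Normalise the signal to [0, 1] so it can be read as a relative perfusion index
    perf = (fluo - fluo.min()) / (fluo.max() - fluo.min() + 1e-9)

    # Colour code the cartography: red = poorly perfused, green = well perfused
    overlay = np.stack([1.0 - perf, perf, np.zeros_like(perf)], axis=-1)

    # Alpha-blend the perfusion map onto the frame (the "augmented reality" composite)
    alpha = 0.4
    composite = (1 - alpha) * np.repeat(frame[..., None], 3, axis=-1) + alpha * overlay
    print(composite.shape, float(composite.min()), float(composite.max()))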
Abstract:
"Morphing Romania and the Moldova Province" gives a short insight of cartograms. Digital cartograms provide potential to move away from classical visualization of geographical data and benefit of new understanding of our world. They introduce a human vision instead of a planimetric one. By applying the Gastner-Newman algorithm for generating density-equalising cartograms to Romania and its Moldova province we can discuss the making of cartograms in general.
Abstract:
Over the past decade much has been learned about the mechanisms of crystal-induced inflammation and renal excretion of uric acid, which has led to more specific targeting of gout therapies and a more potent approach to future management of gout. This article outlines agents being developed for more aggressive lowering of urate and more specific anti-inflammatory activity. The emerging urate-lowering therapies include lesinurad, arhalofenate, ulodesine, and levotofisopam. Novel gout-specific anti-inflammatories include the interleukin-1β inhibitors anakinra, canakinumab, and rilonacept, the melanocortins, and caspase inhibitors. The historic shortcomings of current gout treatment may, in part, be overcome by these novel approaches.
Abstract:
The paper deals with the development and application of a methodology for automatic mapping of pollution/contamination data. The General Regression Neural Network (GRNN) is considered in detail and is proposed as an efficient tool to solve this problem. The automatic tuning of isotropic and anisotropic GRNN models using a cross-validation procedure is presented. Results are compared with the k-nearest-neighbours interpolation algorithm using an independent validation data set. The quality of the mapping is controlled by analysing the raw data and the residuals using variography. Maps of probabilities of exceeding a given decision level and "thick" isoline visualization of the uncertainties are presented as examples of decision-oriented mapping. The real case study is based on mapping of radioactively contaminated territories.
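Since this entry again centres on GRNN mapping, the complementary Python sketch below (synthetic data, illustrative kernel width and decision level) shows one way the same Gaussian kernel weights can be reused for decision-oriented output: averaging an exceedance indicator instead of the raw values yields, at each grid node, an estimate of the probability that contamination exceeds the chosen level.

    import numpy as np

    rng = np.random.default_rng(2)

    # Synthetic monitoring data: sampling locations and measured contamination values
    X = rng.uniform(0, 10, size=(300, 2))
    z = np.exp(-((X[:, 0] - 3) ** 2 + (X[:, 1] - 7) ** 2) / 8.0) + rng.normal(0, 0.05, 300)

    decision_level = 0.5                       # level whose exceedance probability we map
    indicator = (z > decision_level).astype(float)

    def kernel_exceedance(X_train, ind, X_query, sigma=0.7):
        # Kernel-weighted average of the indicator -> estimated P(z > level) at the query points
        d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
        w = np.exp(-d2 / (2.0 * sigma ** 2))
        return (w @ ind) / w.sum(axis=1)

    # Evaluate on a regular grid to obtain the decision-oriented probability map
    gx, gy = np.meshgrid(np.linspace(0, 10, 50), np.linspace(0, 10, 50))
    grid = np.column_stack([gx.ravel(), gy.ravel()])
    p_map = kernel_exceedance(X, indicator, grid).reshape(gx.shape)

    print("share of mapped area with P(exceedance) > 0.5:", round(float((p_map > 0.5).mean()), 3))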
Abstract:
The production, distribution and use of false identity documents constitute a threat to both public and private security. Fraudulent documents are a catalyser for a multitude of crimes, from the most trivial to the most serious and organised forms. The dimension, complexity, low visibility, as well as the repetitive and evolving character of identity document fraud call for new responses that go beyond the traditional case-by-case approach or the technology-focused strategy whose failure is revealed by the historic perspective. These new responses require strengthening the ability to understand the crime problems posed by false identity documents and the phenomena that drive them. Such an understanding is pivotal in order to imagine, evaluate and decide on the most appropriate measures and responses. It requires developing the analysis capacities and the crime intelligence function that underpin the most recent policing models, such as intelligence-led policing or problem-oriented policing. In this context, the doctoral work adopts an original position by postulating that false identity documents can usefully be conceived as the material trace or remnant resulting from the activity of manufacturing or altering an identity document carried out by forgers. Based on this fundamental postulate, it is argued that the scientific, methodical and systematic processing of these traces through a forensic intelligence process can generate phenomenological knowledge on the forms of crime that produce, distribute or use false identity documents, knowledge that integrates into and advantageously serves crime intelligence. In support of this thesis and of a more general study of forensic intelligence, the doctoral work proposes definitions and models. It describes new profiling methods and initiates the construction of a catalogue of forms of analysis. It also draws on experiments and case studies. The results demonstrate that the systematic processing of forensic data makes a useful and relevant contribution to strategic, operational and tactical crime intelligence, as well as to criminology. Combined with other available information, the forensic intelligence produced can support policing in its repressive, proactive, preventive and control dimensions. In particular, the proposed methods for profiling false identity documents make it possible to reveal trends across extended datasets, to analyse modus operandi, or to infer that documents have a common or different source. These methods support the detection and follow-up of crime series, problems and phenomena within the framework of operational crime monitoring. They make it possible to group isolated cases by problem, to highlight the organised forms of crime that deserve the most attention, and to produce robust and novel knowledge offering a deeper perception of crime.
The work also discusses the difficulties associated with managing data and information at different levels of generality, as well as the difficulties of implementing the forensic intelligence process in practice. The doctoral work focuses primarily on false identity documents and their treatment by policing stakeholders. Through an inductive approach, it also proceeds to a generalisation which underlines that the above observations apply not only to the systematic processing of false identity documents but to any type of trace as soon as a profile is extracted from it. A more transversal definition and understanding of the notion and function of forensic intelligence emerges from this work.
Abstract:
This work presents a post-disaster case study from San Cristobal, Guatemala, where a large landslide named "Los Chorros" (8-10 million cubic meters of rock) has affected several communities since 2009, as well as one of the country's main west-east access highways. Risk managers, on the basis of their own assessment, decided to respond in a way that does not correspond to the interests of the affected population. Local communities assessed the disaster risk and established another solution based on a different conception of risk. The social conflicts and the competition between territorial actors over priorities and solutions reveal underlying aspects of society that are useful for identifying and understanding what constitutes disaster risk in a given context. This conflict shows that disaster risk is not unequivocal but a complex and holistic concept, constituted by a large set of components. In terms of governance, it also highlights the confrontation of knowledge and the tension that can exist between different approaches to risk. Starting from a social-constructivist approach, in which disaster risk is considered the result of social, political, economic and historic processes that generate vulnerabilities, this thesis evaluates other modes of interpreting, shaping and managing risk that can help improve methods of risk assessment and management. Studying the logic of action of the actors who mobilize to establish a solution makes it possible to identify what constitutes a disaster. For this reason, the study focuses in particular on the analysis of the practices (practical science) implemented by all actors in San Cristobal Alta Verapaz. Finally, the management proposal derived from the Guatemalan example invites another way of conceiving risk management, integrating the different conceptions of risk and aiming at strategic coordination between public-policy actors, scales of intervention, the experts in charge of the different hazards, and civil society, in order to obtain a solution acceptable to all the actors involved in a territory.
Abstract:
This research examines the impacts of the Swiss reform of the allocation of tasks, which was accepted in 2004 and implemented in 2008 to "re-assign" responsibilities between the federal government and the cantons. The public tasks were redistributed according to the leading and fundamental principle of subsidiarity. Seven tasks came under exclusive federal responsibility; ten came under the control of the cantons; and twenty-two "common tasks" were allocated to both the Confederation and the cantons. For these common tasks it was not possible to separate management and implementation. In order to deal with nineteen of them, the reform introduced the conventions-programs (CPs), which are public-law contracts signed by the Confederation with each canton. These CPs are generally valid for periods of four years (2008-11, 2012-15 and 2016-19); the third period is currently being prepared. Using principal-agent theory, I examine how contracts can improve political relations between a principal (the Confederation) and an agent (a canton). I also provide a first qualitative analysis by examining the impacts of these contracts on vertical cooperation and on the involvement of different actors, focusing on five CPs - protection of cultural heritage and conservation of historic monuments, encouragement of the integration of foreigners, economic development, protection against noise, and protection of nature and landscape - applied in five cantons, which yields twenty-five case studies.