918 results for Globe céleste
Abstract:
One of two active volcanoes in the western branch of the East African Rift, Nyamuragira (1.408°S, 29.20°E; 3058 m) is located in the D.R. Congo. Nyamuragira emits large amounts of SO2 (up to ~1 Mt/day) and erupts low-silica, alkalic lavas, which achieve flow rates of up to ~20 km/hr. The source of the large SO2 emissions and the pre-eruptive magma conditions were unknown prior to this study, and 1994-2010 lava flow volumes were only recently derived from satellite imagery, mainly due to the region’s political instability. In this study, new olivine-hosted melt inclusion volatile (H2O, CO2, S, Cl, F) and major element data from five historic Nyamuragira eruptions (1912, 1938, 1948, 1986, 2006) are presented. Melt compositions derived from the 1986 and 2006 tephra samples best represent pre-eruptive volatile compositions because these samples contain naturally glassy inclusions that underwent less post-entrapment modification than crystallized inclusions. The total amounts of SO2 released by the 1986 (0.04 Mt) and 2006 (0.06 Mt) eruptions are derived using the petrologic method, whereby S contents in melt inclusions are scaled to erupted lava volumes. These amounts are significantly less than satellite-based SO2 emissions for the same eruptions (1986 = ~1 Mt; 2006 = ~2 Mt). Potential explanations for this observation are: 1) accumulation of a vapor phase within the magmatic system that is only released during eruptions, and/or 2) syn-eruptive gas release from unerupted magma. Post-1994 Nyamuragira lava volumes were not available at the beginning of this study. These flows (along with others since 1967) are mapped with Landsat MSS, TM, and ETM+, Hyperion, and ALI satellite data and combined with published flow thicknesses to derive volumes. Satellite remote sensing data were also used to evaluate Nyamuragira SO2 emissions. These results show that the most recent Nyamuragira eruptions injected SO2 into the atmosphere to altitudes between 5 km (2010 eruption) and 15 km (2006 eruption). This suggests that past effusive basaltic eruptions (e.g., Laki 1783) were capable of producing similar plume heights, reaching the upper troposphere or tropopause and allowing SO2 and the resultant aerosols to remain longer in the atmosphere, travel farther around the globe, and affect global climate.
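The petrologic method mentioned above amounts to a simple mass balance: the sulfur lost between melt inclusions and degassed matrix glass is scaled to the mass of erupted lava and converted to an SO2 mass. A minimal sketch of that arithmetic is given below; the function name and every input value are illustrative assumptions, not the 1986 or 2006 data from the study.

```python
# Minimal sketch of the petrologic method for estimating eruptive SO2 release.
# All numbers below are illustrative placeholders, not values from the study.

M_SO2_PER_S = 64.066 / 32.065  # mass conversion factor, S -> SO2

def petrologic_so2_mt(lava_volume_km3, melt_density_kg_m3,
                      s_inclusion_ppm, s_matrix_ppm, crystal_fraction=0.0):
    """Return the SO2 release in megatonnes (Mt).

    The S lost on degassing is taken as the difference between the pre-eruptive
    S content recorded in melt inclusions and the residual S in degassed matrix
    glass, scaled to the mass of erupted melt.
    """
    melt_mass_kg = lava_volume_km3 * 1e9 * melt_density_kg_m3 * (1.0 - crystal_fraction)
    delta_s = (s_inclusion_ppm - s_matrix_ppm) * 1e-6   # mass fraction of S degassed
    return melt_mass_kg * delta_s * M_SO2_PER_S / 1e9   # kg -> Mt (1 Mt = 1e9 kg)

# Example with hypothetical inputs (not the 1986/2006 values):
print(f"{petrologic_so2_mt(0.05, 2700, 1500, 300, 0.2):.3f} Mt SO2")
```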
Abstract:
Mt Etna's activity has increased during the last decade, with a tendency towards more explosive eruptions that produce paroxysmal lava fountains. From January 2011 to April 2012, 25 lava fountaining episodes took place at Etna's New South-East Crater (NSEC). Improved understanding of the mechanism driving these explosive basaltic eruptions is needed to reduce volcanic hazards. This type of activity produces high sulfur dioxide (SO2) emissions, associated with lava flows and ash fall-out, but to date the SO2 emissions associated with Etna's lava fountains have been poorly constrained. The Ultraviolet (UV) Ozone Monitoring Instrument (OMI) on NASA's Aura satellite and the Atmospheric Infrared Sounder (AIRS) on Aqua were used to measure the SO2 loadings. The ground-based Observatoire de Physique du Globe de Clermont-Ferrand (OPGC) L-band Doppler radar, VOLDORAD 2B, operated in collaboration with the Italian National Institute of Geophysics and Volcanology in Catania (INGV-CT), also detected the associated ash plumes, giving precise timings and durations for the lava fountains. This study resulted in the first detailed analysis of the OMI and AIRS SO2 data for Etna's lava fountains during the 2011-2012 eruptive cycle. The HYSPLIT trajectory model was used to constrain the altitude of the observed SO2 clouds, and the results show that the SO2 emission usually coincided with the lava fountain peak intensity as detected by VOLDORAD. The UV OMI and IR AIRS SO2 retrievals permit quantification of the SO2 loss rate in the volcanic SO2 clouds, many of which were tracked for several days after emission. A first attempt to quantitatively validate AIRS SO2 retrievals with OMI data revealed a good correlation for high-altitude SO2 clouds. Using estimates of the SO2 emitted at the time of each paroxysm, we observe a correlation with the inter-paroxysm repose time. We therefore suggest that our data set supports the collapsing foam (CF) model [1] as the driving mechanism for the paroxysmal events at the NSEC. Using VOLDORAD-based estimates of the erupted magma mass, we observe a large excess of SO2 in the eruption clouds. Satellite measurements indicate that SO2 emissions from Etnean lava fountains can reach the lower stratosphere and hence could pose a hazard to aviation. [1] Parfitt, E.A. (2004). A discussion of the mechanisms of explosive basaltic eruptions. J. Volcanol. Geotherm. Res. 134, 77-107.
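The SO2 loss rate referred to above is commonly expressed as an e-folding time obtained by fitting a first-order decay to the retrieved cloud mass over successive overpasses. The sketch below shows one way such a fit could be carried out; the time series is invented for illustration and is not an OMI or AIRS retrieval from the study.

```python
# Minimal sketch of estimating an SO2 loss rate (e-folding time) from a time
# series of satellite SO2 mass retrievals; the numbers are hypothetical.
import numpy as np

hours_since_emission = np.array([0.0, 12.0, 24.0, 48.0, 72.0])
so2_mass_kt = np.array([10.0, 7.8, 6.1, 3.7, 2.3])   # total cloud SO2 mass, kilotonnes

# Fit ln(M) = ln(M0) - t/tau, i.e. first-order decay M(t) = M0 * exp(-t/tau)
slope, intercept = np.polyfit(hours_since_emission, np.log(so2_mass_kt), 1)
tau_hours = -1.0 / slope
print(f"e-folding time ~ {tau_hours:.1f} h, initial mass ~ {np.exp(intercept):.1f} kt")
```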
Abstract:
For 20 years, AIDS has continued its relentless spread across the globe. By the end of the year 2000, the United Nations’ Joint Programme on HIV/AIDS reported that 36.1 million men, women, and children around the world were living with HIV and 21.8 million had died of the disease. Though AIDS is now found in every country, it has most seriously affected sub-Saharan Africa - home to 70% of all adults and 80% of all children living with HIV, and the continent with the least medical resources in the world. Today, AIDS is the primary cause of death in Africa, and it has had a devastating impact on villages, communities and families. In many African countries, the number of newly infected persons is increasing at a rate that threatens to destroy the social fabric. Life expectancy is decreasing rapidly in many of these countries as a result of AIDS-related illnesses and socioeconomic problems. Of the approximately 13.2 million children orphaned by HIV/AIDS worldwide, 12.1 million live in Africa.
Abstract:
When Creative Commons (CC) was founded in 2001, the core Creative Commons licenses were drafted according to United States Copyright Law. Since their first introduction in December 2002, Creative Commons licenses have been enthusiastically adopted by many creators, authors, and other content producers – not only in the United States, but in many other jurisdictions as well. Global interest in the CC licenses prompted a discussion about the need for national versions of the CC licenses. To best address this need, the international license porting project (“Creative Commons International” – formerly known as “International Commons”) was launched in 2003. Creative Commons International works to port the core Creative Commons licenses to the copyright legislation of different jurisdictions around the world. The porting process includes both linguistically translating the licenses and legally adapting them to a particular jurisdiction, such that they are comprehensible and legally enforceable in the local jurisdiction while retaining the same key elements. Since its inception, Creative Commons International has found many supporters all over the world. With Finland, Brazil, and Japan as the first completed jurisdiction projects, experts around the globe have followed their lead and joined the international collaboration with Creative Commons to adapt the licenses to their local copyright law. This article aims to present an overview of the international porting process and to explain the international license architecture, its legal and promotional aspects, and its most recent challenges.
Abstract:
The development of a completely annotated sheep genome sequence is a key need for understanding the phylogenetic relationships and genetic diversity among the many different sheep breeds worldwide and for identifying genes controlling economically and physiologically important traits. The ovine genome sequence assembly will be crucial for developing optimized breeding programs based on highly productive, healthy sheep phenotypes that are adapted to modern breeding and production conditions. Scientists and breeders around the globe have been contributing to this goal by generating genomic and cDNA libraries, performing genome-wide and trait-associated polymorphism analyses, expression analyses and genome sequencing, and by developing virtual and physical comparative maps. The International Sheep Genomics Consortium (ISGC), an informal network of sheep genomics researchers, is playing a major role in coordinating many of these activities. In addition to serving as an essential tool for monitoring chromosome abnormalities in specific sheep populations, ovine molecular cytogenetics provides physical anchors that link and order genome regions, such as sequence contigs, genes and polymorphic DNA markers, on ovine chromosomes. Likewise, molecular cytogenetics can contribute to the process of defining evolutionary breakpoints between related species. The selective expansion of the sheep cytogenetic map, using loci to connect maps and identify chromosome bands, can substantially improve the quality of the annotated sheep genome sequence and will also accelerate its assembly. Furthermore, identifying major morphological chromosome anomalies and micro-rearrangements, such as gene duplications or deletions, that might occur between different sheep breeds and other Ovis species will also be important for understanding the diversity of sheep chromosome structure and its implications for cross-breeding. To date, 566 loci have been assigned to specific chromosome regions in sheep, and the new cytogenetic map is presented as part of this review. This review also summarizes the current cytogenomic status of the sheep genome, describes current activities in sheep cytogenomics research, and discusses the cytogenomics data in the context of other major sheep genomics projects.
Abstract:
The development of the Internet has made it possible to transfer data ‘around the globe at the click of a mouse’. New business models such as cloud computing, the newest driver illustrating the speed and breadth of the online environment, allow this data to be processed across national borders on a routine basis. A number of factors cause the Internet to blur the lines between public and private space: firstly, globalization and the outsourcing of economic actors entail an ever-growing exchange of personal data. Secondly, the security pressure in the name of the legitimate fight against terrorism opens access to a significant amount of data for an increasing number of public authorities. And finally, the tools of the digital society accompany everyone at each stage of life by leaving permanent individual and borderless traces in both space and time. Therefore, calls from both the public and private sectors for an international legal framework for privacy and data protection have become louder. Companies such as Google and Facebook have also come under continuous pressure from governments and citizens to reform the use of data. Thus, Google was not alone in calling for the creation of ‘global privacy standards’. Efforts are underway to review established privacy foundation documents, and there are similar efforts to look at standards in global approaches to privacy and data protection. One of the most recent remarkable steps was the Montreux Declaration, in which the privacy commissioners appealed to the United Nations ‘to prepare a binding legal instrument which clearly sets out in detail the rights to data protection and privacy as enforceable human rights’. This appeal was repeated in 2008 at the 30th international conference held in Strasbourg, at the 31st conference in 2009 in Madrid, and in 2010 at the 32nd conference in Jerusalem. In a globalized world, free data flow has become an everyday need. Thus, the aim of global harmonization should be that it makes no difference for data users or data subjects whether data processing takes place in one or in several countries. Concern has been expressed that data users might seek to avoid privacy controls by moving their operations to countries which have lower standards in their privacy laws or no such laws at all. To control that risk, some countries have implemented special controls in their domestic law. Again, such controls may interfere with the need for free international data flow. A formula has to be found to make sure that privacy protection at the international level does not prejudice this principle.
Abstract:
The Twentieth Century Reanalysis (20CR) is an atmospheric dataset consisting of 56 ensemble members, which covers the entire globe and reaches back to 1871. To assess the suitability of this dataset for studying past extremes, we analysed a prominent extreme event, namely the Galveston Hurricane, which made landfall in September 1900 in Texas, USA. The ensemble mean of 20CR shows a track of the pressure minimum with a small standard deviation among the 56 ensemble members in the area of the Gulf of Mexico. However, there are systematic differences between the assimilated “Best Track” from the International Best Track Archive for Climate Stewardship (IBTrACS) and the ensemble mean track in 20CR. East of the Strait of Florida, the tracks derived from 20CR are located systematically northeast of the assimilated track, while in the Gulf of Mexico, the 20CR tracks are systematically shifted to the southwest compared to the IBTrACS position. The hurricane can also be observed in the wind field, which shows a cyclonic rotation and a relatively calm zone in the centre of the hurricane. The 20CR data reproduce the pressure gradient and cyclonic wind field. Regarding the amplitude of the wind speeds, however, the ensemble mean values from 20CR are significantly lower than the wind speeds known from measurements.
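One way to quantify the kind of track offset described above is to average the pressure-minimum positions of the ensemble members at a given time and measure the great-circle distance to the assimilated best-track position. The sketch below illustrates this; the coordinates are invented placeholders, not 20CR or IBTrACS values.

```python
# Minimal sketch of comparing an ensemble-mean cyclone position with a best-track
# position at one time step; all coordinates are hypothetical.
import numpy as np

# Pressure-minimum positions (lat, lon) from four hypothetical ensemble members
members = np.array([[27.1, -85.2], [27.4, -85.0], [26.9, -85.5], [27.2, -85.1]])
best_track = np.array([26.8, -85.6])            # assimilated reference position

mean_pos = members.mean(axis=0)                 # ensemble-mean position
spread = members.std(axis=0)                    # per-component standard deviation

# Great-circle distance of the offset (haversine formula, Earth radius 6371 km)
lat1, lon1, lat2, lon2 = np.radians(np.concatenate([mean_pos, best_track]))
a = np.sin((lat2 - lat1) / 2) ** 2 + np.cos(lat1) * np.cos(lat2) * np.sin((lon2 - lon1) / 2) ** 2
offset_km = 2 * 6371.0 * np.arcsin(np.sqrt(a))

print(f"ensemble mean {mean_pos}, spread {spread}, offset from best track {offset_km:.0f} km")
```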
Abstract:
High-resolution reconstructions of climate variability that cover the past millennia are necessary to improve the understanding of natural and anthropogenic climate change across the globe. Although numerous records are available for the mid- and high-latitudes of the Northern Hemisphere, global assessments are still compromised by the scarcity of data from the Southern Hemisphere. This is particularly the case for the tropical and subtropical areas. In addition, high-elevation sites in the South American Andes may provide insight into the vertical structure of climate change in the mid-troposphere. This study presents a 3000 yr-long austral summer (November to February) temperature reconstruction derived from the ²¹⁰Pb- and ¹⁴C-dated organic sediments of Laguna Chepical (32°16' S, 70°30' W, 3050 m a.s.l.), a high-elevation glacial lake in the subtropical Andes of central Chile. Scanning reflectance spectroscopy in the visible light range provided the spectral index R570/R630, which reflects the clay mineral content in lake sediments. For the calibration period (AD 1901–2006), the R570/R630 data were regressed against monthly meteorological reanalysis data, showing that this proxy was strongly and significantly correlated with mean summer (NDJF) temperatures (R_3yr = −0.63, p_adj = 0.01). This calibration model was used to make a quantitative temperature reconstruction back to 1000 BC. The reconstruction (with a model error RMSEP_boot of 0.33 °C) shows that the warmest decades of the past 3000 yr occurred during the calibration period. The 19th century (end of the Little Ice Age (LIA)) was cool. The prominent warmth reconstructed for the 18th century, which was also observed in other records from this area, seems systematic for subtropical and southern South America but remains difficult to explain. Except for this warm period, the LIA was generally characterized by cool summers. Back to AD 1400, the results from this study compare remarkably well to low-altitude records from the Chilean Central Valley and southern South America. However, the reconstruction from Laguna Chepical does not show a warm Medieval Climate Anomaly during the 12th–13th centuries, which is consistent with records from tropical South America. The Chepical record also indicates substantial cooling prior to 800 BC. This coincides with well-known regional as well as global glacier advances which have been attributed to a grand solar minimum. This study thus provides insight into the climatic drivers and temperature patterns in a region for which currently very few data are available. It also shows that since ca. AD 1400, long-term temperature patterns were generally similar at low and high altitudes in central Chile.
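The calibration step described above is, in essence, a linear regression of the spectral index on instrumental summer temperatures, which is then applied down-core. The sketch below illustrates that workflow with synthetic placeholder series; none of the numbers are the Laguna Chepical data, and the series lengths are purely illustrative.

```python
# Minimal sketch of a proxy-to-temperature calibration of the kind described
# above: regress a spectral index against summer temperatures, then apply the
# model down-core. The series below are synthetic placeholders.
import numpy as np

r570_r630 = np.array([0.91, 0.95, 0.88, 1.02, 0.97, 0.93])   # spectral index, calibration period
ndjf_temp = np.array([10.2, 9.6, 10.8, 8.9, 9.4, 9.9])       # mean summer temperature, deg C

slope, intercept = np.polyfit(r570_r630, ndjf_temp, 1)        # linear calibration model
r = np.corrcoef(r570_r630, ndjf_temp)[0, 1]                   # correlation over calibration period

# Apply the calibration model down-core to reconstruct past temperatures
downcore_index = np.array([0.90, 0.99, 1.05])
reconstructed = slope * downcore_index + intercept
print(f"r = {r:.2f}; reconstructed temperatures: {np.round(reconstructed, 1)}")
```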
Abstract:
External forcing and internal dynamics result in climate system variability ranging from sub-daily weather to multi-centennial trends and beyond (refs 1, 2). State-of-the-art palaeoclimatic methods routinely use hydroclimatic proxies to reconstruct temperature (for example, refs 3, 4), possibly blurring differences in the variability continuum of temperature and precipitation before the instrumental period. Here, we assess the spectral characteristics of temperature and precipitation fluctuations in observations, model simulations and proxy records across the globe. We find that whereas an ensemble of different general circulation models represents patterns captured in instrumental measurements, such as land–ocean contrasts and enhanced low-frequency tropical variability, the tree-ring-dominated proxy collection does not. The observed dominance of inter-annual precipitation fluctuations is not reflected in the annually resolved hydroclimatic proxy records. Likewise, temperature-sensitive proxies overestimate, on average, the ratio of low- to high-frequency variability. These spectral biases in the proxy records seem to propagate into multi-proxy climate reconstructions for which we observe an overestimation of low-frequency signals. Thus, a proper representation of the high- to low-frequency spectrum in proxy records is needed to reduce uncertainties in climate reconstruction efforts.
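The ratio of low- to high-frequency variability discussed above can be computed directly from the power spectrum of an annually resolved series. A minimal sketch is given below; the frequency cutoff and the synthetic white-noise input are illustrative choices, not those used in the study.

```python
# Minimal sketch of a spectral diagnostic: the ratio of low- to high-frequency
# variance in an annually resolved series, computed from its periodogram.
# The input series is synthetic white noise, used only for illustration.
import numpy as np
from scipy.signal import periodogram

rng = np.random.default_rng(0)
series = rng.standard_normal(500)            # annually resolved record, 500 "years"

freqs, power = periodogram(series, fs=1.0)   # fs = 1 sample per year
low = power[(freqs > 0) & (freqs < 0.05)].sum()    # periods longer than ~20 years
high = power[freqs >= 0.05].sum()                  # periods of ~20 years and shorter

print(f"low/high variance ratio: {low / high:.2f}")
```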
Abstract:
Analysing historical weather extremes such as the tropical cyclone in Samoa in March 1889 could add to our understanding of extreme events. However, up to now the availability of suitable data has limited the analysis of historical extremes, particularly in remote regions. The new “Twentieth Century Reanalysis” (20CR), which provides six-hourly, three-dimensional data for the entire globe back to 1871, might provide the means to study this and other early events. While its suitability for studying historical extremes has been analysed for events in the northern extratropics (see other papers in this volume), its representation of tropical cyclones, especially in early times, remains unknown. The aim of this paper is to study the hurricane that struck Samoa on 15-16 March 1889. We analyse the event in 20CR as well as in contemporary observations. We find that the event is not reproduced in the ensemble mean of 20CR, nor is it within the ensemble spread. We argue that this is due to the paucity of data assimilated into 20CR. A preliminary compilation of historical observations from ships for that period, in contrast, provides a relatively consistent picture of the event. This shows that more observations are available and implies that future versions of surface-based reanalyses might profit from digitizing further observations in the tropical region.
Abstract:
Historical, i.e. pre-1957, upper-air data are a valuable source of information on the state of the atmosphere, in some parts of the world dating back to the early 20th century. However, to date, reanalyses have only partially made use of these data, and only of observations made after 1948. Even for the period between 1948 (the starting year of the NCEP/NCAR (National Centers for Environmental Prediction/National Center for Atmospheric Research) reanalysis) and the International Geophysical Year in 1957 (the starting year of the ERA-40 reanalysis), when the global upper-air coverage more or less reached its current status, many observations have not yet been digitised. The Comprehensive Historical Upper-Air Network (CHUAN) has already compiled a large collection of pre-1957 upper-air data. In the framework of the European project ERA-CLIM (European Reanalysis of Global Climate Observations), significant amounts of additional upper-air data have been catalogued (> 1.3 million station days), imaged (> 200 000 images) and digitised (> 700 000 station days) in order to prepare a new input data set for upcoming reanalyses. The records cover large parts of the globe, focussing on regions that have so far been less well covered, such as the tropics, the polar regions and the oceans, and on very early upper-air data from Europe and the US. The total number of digitised/inventoried records is 61/101 for moving upper-air data, i.e. data from ships, etc., and 735/1783 for fixed upper-air stations. Here, we give a detailed description of the resulting data set, including the metadata and the quality checking procedures applied. The data will be included in the next version of CHUAN and are available at doi:10.1594/PANGAEA.821222.
Abstract:
Traditionally, desertification research has focused on degradation assessments, whereas prevention and mitigation strategies have not been sufficiently emphasised, although the concept of sustainable land management (SLM) is increasingly being acknowledged. SLM strategies are interventions at the local to regional scale that aim to increase productivity, protect the natural resource base, and improve livelihoods. The global WOCAT initiative and its partners have developed harmonized frameworks to compile, evaluate and analyse the impact of SLM practices around the globe. Recent studies within the EU research project DESIRE developed a methodological framework that combines a collective learning and decision-making approach with use of best practices from the WOCAT database. In-depth assessment of 30 technologies and 8 approaches from 17 desertification sites enabled an evaluation of how SLM addresses prevalent dryland threats such as water scarcity, soil and vegetation degradation, low production, climate change, resource use conflicts and migration. Among the impacts attributed to the documented technologies, those mentioned most were diversified and enhanced production and better management of water and soil degradation, whether through water harvesting, improving soil moisture, or reducing runoff. Water harvesting offers under-exploited opportunities for the drylands and the predominantly rainfed farming systems of the developing world. Recently compiled guidelines introduce the concepts behind water harvesting and propose a harmonised classification system, followed by an assessment of suitability, adoption and up-scaling of practices. Case studies range from large-scale floodwater spreading that makes alluvial plains cultivable, to systems that boost cereal production on small farms, as well as practices that collect and store water from household compounds. Once contextualized and set in appropriate institutional frameworks, they can form part of an overall adaptation strategy for land users. More field research is needed to reinforce expert assessments of SLM impacts and provide the necessary evidence-based rationale for investing in SLM. This includes developing methods to quantify and value ecosystem services, both on-site and off-site, and to assess the resilience of SLM practices, as is currently the aim of the new EU CASCADE project.
Abstract:
The tropical region is an area of maximum humidity and serves as the major humidity source of the globe. Among other phenomena, it is governed by the so-called Inter-Tropical Convergence Zone (ITCZ), which is commonly defined by converging low-level winds or enhanced precipitation. Given its importance as a humidity source, we investigate the humidity fields in the tropics in different reanalysis data sets, deduce the climatology and variability, and assess the relationship to the ITCZ. To this end, a new analysis method of the specific humidity distribution is introduced which allows detecting the location of the humidity maximum, its strength and its meridional extent. The results show that the humidity maximum in boreal summer is strongly shifted northward over the warm pool/Asian monsoon area and the Gulf of Mexico. These shifts go along with a peak in strength in both areas; however, the extent shrinks over the warm pool/Asian monsoon area, whereas it is wider over the Gulf of Mexico. In winter, such connections between location, strength and extent are not found. Still, a peak in strength is again identified over the Gulf of Mexico in boreal winter. The variability of the three characteristics is dominated by inter-annual signals in both seasons. The results using ERA-Interim data suggest a positive trend in the Gulf of Mexico/Atlantic region from 1979 to 2010, showing an increased northward shift in recent years. Although the trend is only weakly confirmed by the results using MERRA reanalysis data, it is in phase with a trend in hurricane activity, a possible hint of the relevance of the new method for hurricane studies. Furthermore, the position of the humidity maximum coincides with that of the ITCZ in most areas. One exception is the western and central Pacific, where the area is dominated by the double ITCZ in boreal winter. Nevertheless, the new method enables us to gain more insight into the humidity distribution, its variability and its relationship to ITCZ characteristics.
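As a rough illustration of how the three characteristics named above could be extracted from a specific humidity field, the sketch below locates the latitude of a humidity maximum, records its strength, and measures a meridional extent as the width of the band above a fraction of the peak. The synthetic profile and the half-peak criterion are assumptions made for illustration, not the paper's actual method or data.

```python
# Minimal sketch of locating a specific-humidity maximum and deriving its
# strength and meridional extent from a meridional profile. The profile is a
# synthetic placeholder, not reanalysis data.
import numpy as np

lats = np.linspace(-30.0, 30.0, 121)                 # 0.5-degree latitude grid
q = 0.018 * np.exp(-((lats - 8.0) / 10.0) ** 2)      # synthetic specific humidity (kg/kg)

i_max = np.argmax(q)
location = lats[i_max]                               # latitude of the humidity maximum
strength = q[i_max]                                  # peak specific humidity

inside = lats[q >= 0.5 * strength]                   # band above half the peak value
extent = inside.max() - inside.min()                 # meridional extent in degrees latitude

print(f"max at {location:.1f} deg, strength {strength:.4f} kg/kg, extent {extent:.1f} deg")
```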
Abstract:
There is a growing number of proxy-based reconstructions detailing the climatic changes that occurred during the last interglacial period (LIG). This period is of special interest because large parts of the globe were characterized by a warmer-than-present-day climate, making this period an interesting test bed for climate models in light of projected global warming. However, mainly because synchronizing the different palaeoclimatic records is difficult, there is no consensus on a global picture of LIG temperature changes. Here we present the first model inter-comparison of transient simulations covering the LIG period. By comparing the different simulations, we aim to identify the common signal in the LIG temperature evolution, to investigate the main driving forces behind it, and to list the climate feedbacks which cause the most apparent inter-model differences. The model inter-comparison shows a robust Northern Hemisphere July temperature evolution characterized by a maximum between 130–125 ka BP with temperatures 0.3 to 5.3 K above present day. A Southern Hemisphere July temperature maximum, −1.3 to 2.5 K at around 128 ka BP, is only found when changes in the greenhouse gas concentrations are included. The robustness of simulated January temperatures is large in the Southern Hemisphere and the mid-latitudes of the Northern Hemisphere. For these regions, maximum January temperature anomalies of respectively −1 to 1.2 K and −0.8 to 2.1 K are simulated for the period after 121 ka BP. In both hemispheres these temperature maxima are in line with the maximum in local summer insolation. In a number of specific regions, a common temperature evolution is not found amongst the models. We show that this is related to feedbacks within the climate system which largely determine the simulated LIG temperature evolution in these regions. Firstly, in the Arctic region, changes in the summer sea-ice cover control the evolution of LIG winter temperatures. Secondly, for the Atlantic region, the Southern Ocean and the North Pacific, possible changes in the characteristics of the Atlantic meridional overturning circulation are crucial. Thirdly, the presence of remnant continental ice from the preceding glacial has been shown to be important in determining the timing of maximum LIG warmth in the Northern Hemisphere. Finally, the results reveal that changes in the monsoon regime exert a strong control on the evolution of LIG temperatures over parts of Africa and India. By listing these inter-model differences, we provide a starting point for future proxy-data studies and for the sensitivity experiments needed to constrain the climate simulations and to further enhance our understanding of the temperature evolution of the LIG period.
Abstract:
The Personal Health Assistant Project (PHA) is a pilot system implementation sponsored by the Kozani Region Governors’ Association (KRGA) and installed in one of the two major public hospitals of the city of Kozani. PHA is intended to demonstrate how a secure, networked, multipurpose electronic health and food benefits digital signage system can transform common TV sets inside patient homes or hospital rooms into health care media players, facilitate information sharing, and improve administrative efficiency among private doctors, public health care providers, informal caregivers, and nutrition program private companies, while placing individual patients firmly in control of the information at hand. This case evaluation of the PHA demonstration is intended to provide critical information to other decision makers considering implementing PHA or related digital signage technology at other institutions and public hospitals around the globe.