984 results for Fortune Global 500


Relevance:

30.00%

Publisher:

Abstract:

Shelf and coastal seas are regions of exceptionally high biological productivity, high rates of biogeochemical cycling and immense socio-economic importance. They are, however, poorly represented by the present generation of Earth system models, both in terms of resolution and process representation. Hence, these models cannot be used to elucidate the role of the coastal ocean in global biogeochemical cycles and the effects that global change (both direct anthropogenic and climatic) is having on them. Here, we present a system for simulating all the coastal regions around the world (the Global Coastal Ocean Modelling System) in a systematic and practical fashion. It is based on automatically generating multiple nested model domains, using the Proudman Oceanographic Laboratory Coastal Ocean Modelling System coupled to the European Regional Seas Ecosystem Model. Preliminary results from the system are presented. These demonstrate the viability of the concept, and we discuss the prospects for using the system to explore key areas of global change in shelf seas, such as their role in the carbon cycle and climate change effects on fisheries.

Relevance:

30.00%

Publisher:

Abstract:

Under global warming, the predicted intensification of the global freshwater cycle will modify the net freshwater flux at the ocean surface. Since the freshwater flux maintains ocean salinity structures, changes to the density-driven ocean circulation are likely. A modified ocean circulation could further alter the climate, potentially allowing rapid changes, as seen in the past. The relevant feedback mechanisms and timescales are poorly understood in detail, however, especially at low latitudes where the effects of salinity are relatively subtle. In an attempt to resolve some of these outstanding issues, we present an investigation of the climate response of the low-latitude Pacific region to changes in freshwater forcing. Initiated from the present-day thermohaline structure, a control run of a coupled ocean-atmosphere general circulation model is compared with a perturbation run in which the net freshwater flux is prescribed to be zero over the ocean. Such an extreme experiment helps to elucidate the general adjustment mechanisms and their timescales. The atmospheric greenhouse gas concentrations are held constant, and we restrict our attention to the adjustment of the upper 1,000 m of the Pacific Ocean between 40°N and 40°S, over 100 years. In the perturbation run, changes to the surface buoyancy, near-surface vertical mixing and mixed-layer depth are established within 1 year. Subsequently, relative to the control run, the surface of the low-latitude Pacific Ocean in the perturbation run warms by an average of 0.6°C, and the interior cools by up to 1.1°C, after a few decades. This vertical re-arrangement of the ocean heat content is shown to be achieved by a gradual shutdown of the heat flux due to isopycnal (i.e. along surfaces of constant density) mixing, the vertical component of which is downwards at low latitudes. This heat transfer depends crucially upon the existence of density-compensating temperature and salinity gradients on isopycnal surfaces. 
The timescale of the thermal changes in the perturbation run is therefore set by the timescale for the decay of isopycnal salinity gradients in response to the eliminated freshwater forcing, which we demonstrate to be around 10-20 years. Such isopycnal heat flux changes may play a role in the response of the low-latitude climate to a future accelerated freshwater cycle. Specifically, the mechanism appears to represent a weak negative sea surface temperature feedback, which we speculate might partially shield from view the anthropogenically-forced global warming signal at low latitudes. Furthermore, since the surface freshwater flux is shown to play a role in determining the ocean's thermal structure, it follows that evaporation and/or precipitation biases in general circulation models are likely to cause sea surface temperature biases.
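The density-compensation mechanism described above can be sketched with a linear equation of state: warm/salty and cold/fresh water can share an isopycnal when the thermal and haline density contributions cancel, so isopycnal mixing moves heat without moving density. The expansion/contraction coefficients below are typical upper-ocean values, not numbers from the study.

```python
# Sketch of density compensation on an isopycnal, using a linear
# equation of state. ALPHA and BETA are typical upper-ocean values
# (illustrative assumptions, not taken from the paper).
RHO0 = 1025.0      # reference density, kg/m^3
ALPHA = 2.0e-4     # thermal expansion coefficient, 1/K
BETA = 7.5e-4      # haline contraction coefficient, 1/(g/kg)

def density_anomaly(d_temp, d_salt):
    """Density change (kg/m^3) for perturbations d_temp (K), d_salt (g/kg)."""
    return RHO0 * (-ALPHA * d_temp + BETA * d_salt)

# Gradients compensate when ALPHA*dT = BETA*dS: the warm anomaly's density
# deficit is exactly offset by the salt anomaly's density excess.
d_temp = 1.0                     # 1 K warmer ...
d_salt = ALPHA / BETA * d_temp   # ... compensated by this much saltier
print(density_anomaly(d_temp, d_salt))   # ~0: density-compensated
print(density_anomaly(d_temp, 0.0))      # uncompensated warming is buoyant (<0)
```

Eliminating the freshwater flux erodes the salinity gradient (`d_salt` here), which is why the isopycnal heat flux in the perturbation run decays on the salinity-adjustment timescale.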

Relevance:

30.00%

Publisher:

Abstract:

Simulations of the last 500 yr carried out using the Third Hadley Centre Coupled Ocean-Atmosphere GCM (HadCM3) with anthropogenic and natural (solar and volcanic) forcings have been analyzed. Global-mean surface temperature change during the twentieth century is well reproduced. Simulated contributions to global-mean sea level rise during recent decades due to thermal expansion (the largest term) and to mass loss from glaciers and ice caps agree within uncertainties with observational estimates of these terms, but their sum falls short of the observed rate of sea level rise. This discrepancy has been discussed by previous authors; a completely satisfactory explanation of twentieth-century sea level rise is lacking. The model suggests that the apparent onset of sea level rise and glacier retreat during the first part of the nineteenth century was due to natural forcing. The rate of sea level rise was larger during the twentieth century than during the previous centuries because of anthropogenic forcing, but decreasing natural forcing during the second half of the twentieth century tended to offset the anthropogenic acceleration in the rate. Volcanic eruptions cause rapid falls in sea level, followed by recovery over several decades. The model shows substantially less decadal variability in sea level and its thermal expansion component than twentieth-century observations indicate, either because it does not generate sufficient ocean internal variability, or because the observational analyses overestimate the variability.

Relevance:

30.00%

Publisher:

Abstract:

Assessments of changes in precipitation (P) as a function of percentiles of surface temperature (T) and 500 hPa vertical velocity (ω) are presented, considering present-day simulations and observational estimates from the Global Precipitation Climatology Project (GPCP) combined with the European Centre for Medium-Range Weather Forecasts Interim reanalysis (ERA-Interim). There is a tendency for models to overestimate P in the warm, subsiding regimes compared to GPCP, in some cases by more than 100%, while many models underestimate P in the moderate temperature regimes. Considering climate change projections between 1980–1999 and 2080–2099, responses in P are characterised by dP/dT ≥ 4%/K over the coldest 10–20% of land points and over warm, ascending ocean points, while P declines over the warmest, descending regimes (dP/dT ∼ −4%/K for model ensemble means). The reduced Walker circulation limits this contrasting dP/dT response in the tropical wet and dry regimes only marginally. Around 70% of the global surface area exhibits a consistent sign for dP/dT in at least 6 out of a 7-member model ensemble when considering P composites in terms of dynamic regime.
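The compositing step described above can be sketched as follows, with synthetic data standing in for GPCP/ERA-Interim: bin precipitation by percentiles of temperature, then express the change between two periods as dP/dT in %/K. The bin width, warming, and precipitation change are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch (synthetic data, not GPCP/ERA-Interim): composite
# precipitation P into decile bins of surface temperature T, then express
# the change between two periods as dP/dT in %/K for a prescribed warming.
rng = np.random.default_rng(0)
t_now = rng.normal(288.0, 10.0, 5000)                  # present-day T (K)
p_now = np.maximum(rng.normal(3.0, 1.0, 5000), 0.1)    # present-day P (mm/day)
p_future = p_now * 1.06                                # toy future P (+6%)
dT = 3.0                                               # toy mean warming (K)

edges = np.percentile(t_now, np.arange(0, 101, 10))    # decile edges of T
bins = np.digitize(t_now, edges[1:-1])                 # bin index 0..9

for b in range(10):
    sel = bins == b
    change = (p_future[sel].mean() / p_now[sel].mean() - 1.0) * 100.0
    print(f"T decile {b}: dP/dT = {change / dT:+.1f} %/K")
```

With a uniform +6% change and 3 K warming, every decile reports +2.0 %/K; with real fields the composites separate the wet-get-wetter and dry-get-drier regimes.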

Relevance:

30.00%

Publisher:

Abstract:

Global flood hazard maps can be used in the assessment of flood risk in a number of different applications, including (re)insurance and large-scale flood preparedness. Such global hazard maps can be generated using large-scale physically based models of rainfall-runoff and river routing, when used in conjunction with a number of post-processing methods. In this study, the European Centre for Medium-Range Weather Forecasts (ECMWF) land surface model is driven by ERA-Interim reanalysis meteorological forcing data, and the resultant runoff is passed to a river routing algorithm which simulates floodplains and flood flow across the global land area. The global hazard map is based on a 30 yr (1979–2010) simulation period. A Gumbel distribution is fitted to the annual maximum flows to derive a number of flood return periods. The return periods are calculated initially for a 25×25 km grid, which is then reprojected onto a 1×1 km grid to derive maps of higher resolution and to estimate the flooded fractional area for the individual 25×25 km cells. Several global and regional maps of flood return periods ranging from 2 to 500 yr are presented. The results compare reasonably well to a benchmark data set of global flood hazard. The developed methodology can be applied to other datasets on a global or regional scale.
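The return-period step can be sketched as below, using SciPy's `gumbel_r` and synthetic annual maxima in place of the simulated flows; the flow magnitudes are illustrative assumptions, not model output.

```python
import numpy as np
from scipy.stats import gumbel_r

# Sketch of the post-processing described above: fit a Gumbel distribution
# to annual maximum flows, then read off flows for a set of return periods.
# The annual maxima here are synthetic (illustrative), not model output.
rng = np.random.default_rng(42)
annual_maxima = gumbel_r.rvs(loc=1200.0, scale=300.0, size=32,
                             random_state=rng)   # ~32 years of max flow (m^3/s)

loc, scale = gumbel_r.fit(annual_maxima)

# A T-year return level is the flow exceeded with probability 1/T per year,
# i.e. the (1 - 1/T) quantile of the fitted distribution.
for T in (2, 5, 10, 50, 100, 500):
    level = gumbel_r.ppf(1.0 - 1.0 / T, loc=loc, scale=scale)
    print(f"{T:>3}-yr return level: {level:8.1f} m^3/s")
```

Note the caveat implicit in the study design: the 500-yr level is an extrapolation far beyond a 30-yr record, so its uncertainty is much larger than that of the 2- or 5-yr levels.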

Relevance:

30.00%

Publisher:

Abstract:

Long-range global climate forecasts have been made using a model for predicting tropical Pacific sea surface temperature (SST) in tandem with an atmospheric general circulation model. The SST is predicted first at long lead times into the future. These ocean forecasts are then used to force the atmospheric model and so produce climate forecasts at the lead times of the SST forecasts. Predictions of the wintertime 500 mb height, surface air temperature and precipitation for seven large climatic events of the 1970–1990s by this two-tiered technique agree well in general with observations over many regions of the globe. The levels of agreement are high enough in some regions to have practical utility.

Relevance:

30.00%

Publisher:

Abstract:

In this study, we investigated the impact of global warming on the variabilities of large-scale interannual and interdecadal climate modes and teleconnection patterns with two long-term integrations of the coupled general circulation model ECHAM4/OPYC3 at the Max-Planck-Institute for Meteorology, Hamburg. One is the control (CTRL) run with fixed present-day concentrations of greenhouse gases. The other experiment is a simulation of transient greenhouse warming, named the GHG run. In the GHG run the averaged geopotential height at 500 hPa is increased significantly, and a negative phase of the Pacific/North American (PNA) teleconnection-like distribution pattern is intensified. The standard deviation over the tropics (high latitudes) is enhanced (reduced) on the interdecadal time scales and reduced (enhanced) on the interannual time scales in the GHG run. Except for an interdecadal mode related to the Southern Oscillation (SO) in the GHG run, the spatial variation patterns are similar for different (interannual + interdecadal, interannual, and interdecadal) time scales in the GHG and CTRL runs. Spatial distributions of the teleconnection patterns on the interannual and interdecadal time scales in the GHG run are also similar to those in the CTRL run. But some teleconnection patterns show linear trends and changes of variances and frequencies in the GHG run. Apart from the positive linear trend of the SO, the interdecadal modulation of the El Niño/SO cycle is enhanced during 2040–2099 of the GHG run. This is the result of an enhancement of the Walker circulation during that period. La Niña events intensify and El Niño events weaken relatively during 2070–2090 of the GHG run. It is interesting to note that with increasing greenhouse gas concentrations the relation between the SO and the PNA pattern is reversed significantly from a negative to a positive correlation on the interdecadal time scales and weakened on the interannual time scales.
This suggests that the increase of the greenhouse gas concentrations will trigger the nonstationary correlation between the SO and the PNA pattern both on the interdecadal and interannual time scales.

Relevance:

30.00%

Publisher:

Abstract:

Global NDVI data are routinely derived from the AVHRR, SPOT-VGT, and MODIS/Terra earth observation records for a range of applications from terrestrial vegetation monitoring to climate change modeling. This has led to a substantial interest in the harmonization of multisensor records. Most evaluations of the internal consistency and continuity of global multisensor NDVI products have focused on time-series harmonization in the spectral domain, often neglecting the spatial domain. We fill this void by applying variogram modeling (a) to evaluate the differences in spatial variability between 8-km AVHRR, 1-km SPOT-VGT, and 1-km, 500-m, and 250-m MODIS NDVI products over eight EOS (Earth Observing System) validation sites, and (b) to characterize the decay of spatial variability as a function of pixel size (i.e. data regularization) for spatially aggregated Landsat ETM+ NDVI products and a real multisensor dataset. First, we demonstrate that the conjunctive analysis of two variogram properties – the sill and the mean length scale metric – provides a robust assessment of the differences in spatial variability between multiscale NDVI products that are due to spatial (nominal pixel size, point spread function, and view angle) and non-spatial (sensor calibration, cloud clearing, atmospheric corrections, and length of multi-day compositing period) factors. Next, we show that as the nominal pixel size increases, the decay of spatial information content follows a logarithmic relationship, with a stronger fit for the spatially aggregated NDVI products (R2 = 0.9321) than for the native-resolution AVHRR, SPOT-VGT, and MODIS NDVI products (R2 = 0.5064). This relationship serves as a reference for evaluation of the differences in spatial variability and length scales in multiscale datasets at native or aggregated spatial resolutions.
The outcomes of this study suggest that multisensor NDVI records cannot be integrated into a long-term data record without proper consideration of all factors affecting their spatial consistency. Hence, we propose an approach for selecting the spatial resolution, at which differences in spatial variability between NDVI products from multiple sensors are minimized. This approach provides practical guidance for the harmonization of long-term multisensor datasets.
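The logarithmic decay of spatial variability with pixel size can be sketched as a least-squares fit in log space; the sill values below are illustrative stand-ins, not the ETM+ numbers from the study.

```python
import numpy as np

# Sketch of the regularization analysis: fit a logarithmic relationship
# between nominal pixel size and spatial variability (the variogram sill).
# The sill values are illustrative, not the study's measurements.
pixel_size = np.array([30.0, 250.0, 500.0, 1000.0, 8000.0])   # metres
sill = np.array([0.020, 0.014, 0.012, 0.010, 0.004])          # NDVI^2

# Least-squares fit of sill = a*ln(pixel_size) + b
a, b = np.polyfit(np.log(pixel_size), sill, 1)
pred = a * np.log(pixel_size) + b
ss_res = np.sum((sill - pred) ** 2)
ss_tot = np.sum((sill - sill.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
print(f"sill = {a:.4g}*ln(size) + {b:.4g},  R^2 = {r2:.3f}")
```

A negative slope `a` quantifies how aggregation smooths out spatial information content; comparing fitted curves across sensors is the basis for choosing the resolution at which multisensor differences are minimized.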

Relevance:

30.00%

Publisher:

Abstract:

We present an assessment of how tropical cyclone activity might change due to the influence of increased atmospheric carbon dioxide concentrations, using the UK’s High Resolution Global Environment Model (HiGEM) with N144 resolution (~90 km in the atmosphere and ~40 km in the ocean). Tropical cyclones are identified using a feature tracking algorithm applied to model output. Tropical cyclones from idealized 30-year 2×CO2 (2CO2) and 4×CO2 (4CO2) simulations are compared to those identified in a 150-year present-day simulation, which is separated into a 5-member ensemble of 30-year integrations. Tropical cyclones are shown to decrease in frequency globally by 9% in the 2CO2 and 26% in the 4CO2. Tropical cyclones only become more intense in the 4CO2; however, uncoupled time-slice experiments reveal an increase in intensity in the 2CO2. An investigation into the large-scale environmental conditions, known to influence tropical cyclone activity in the main development regions, is used to determine the response of tropical cyclone activity to increased atmospheric CO2. A weaker Walker circulation and a reduction in zonally averaged regions of updrafts lead to a shift in the location of tropical cyclones in the northern hemisphere. A decrease in mean ascent at 500 hPa contributes to the reduction of tropical cyclones in the 2CO2 in most basins. The larger reduction of tropical cyclones in the 4CO2 arises from further reduction of mean ascent at 500 hPa and a large enhancement of vertical wind shear, especially in the southern hemisphere, North Atlantic and North East Pacific.

Relevance:

30.00%

Publisher:

Abstract:

The global mean temperature in 2008 was slightly cooler than that in 2007; however, it still ranks within the 10 warmest years on record. Annual mean temperatures were generally well above average in South America, northern and southern Africa, Iceland, Europe, Russia, South Asia, and Australia. In contrast, an exceptional cold outbreak occurred during January across Eurasia and over southern European Russia and southern western Siberia. There has been a general increase in land-surface temperatures and in permafrost temperatures during the last several decades throughout the Arctic region, including increases of 1° to 2°C in the last 30 to 35 years in Russia. Record-setting warm summer (JJA) air temperatures were observed throughout Greenland. The year 2008 was also characterized by heavy precipitation in a number of regions of northern South America, Africa, and South Asia. In contrast, a prolonged and intense drought occurred during most of 2008 in northern Argentina, Paraguay, Uruguay, and southern Brazil, causing severe impacts to agriculture and affecting many communities. The year began with a strong La Niña episode that ended in June. Eastward surface current anomalies in the tropical Pacific Ocean in early 2008 played a major role in adjusting the basin from strong La Niña conditions to ENSO-neutral conditions by July–August, followed by a return to La Niña conditions late in December. The La Niña conditions resulted in far-reaching anomalies such as a cooling in the central tropical Pacific, Arctic Ocean, and the regions extending from the Gulf of Alaska to the west coast of North America; changes in the sea surface salinity and heat content anomalies in the tropics; and total column water vapor, cloud cover, tropospheric temperature, and precipitation patterns typical of a La Niña.
Anomalously salty ocean surface salinity values in climatologically drier locations and anomalously fresh values in rainier locations observed in recent years generally persisted in 2008, suggesting an intensification of the hydrological cycle. The 2008 Atlantic hurricane season was the 14th busiest on record and the only season ever recorded with major hurricanes each month from July through November. Conversely, activity in the northwest Pacific was considerably below normal during 2008. While activity in the north Indian Ocean was only slightly above average, the season was punctuated by Cyclone Nargis, which killed over 145,000 people; in addition, it was the seventh-strongest cyclone ever in the basin and the most devastating to hit Asia since 1991. Greenhouse gas concentrations continued to rise, with CO2 increasing by more than expected based on the 1979 to 2007 trend. In the oceans, the global mean CO2 uptake for 2007 is estimated to be 1.67 Pg-C, about 0.07 Pg-C lower than the long-term average, making it the third-largest anomaly determined with this method since 1983, with the largest uptake of carbon over the past decade coming from the eastern Indian Ocean. Global phytoplankton chlorophyll concentrations were slightly elevated in 2008 relative to 2007, but regional changes were substantial (ranging to about 50%) and followed long-term patterns of net decreases in chlorophyll with increasing sea surface temperature. Ozone-depleting gas concentrations continued to fall globally to about 4% below the peak levels of the 2000–02 period. Total column ozone concentrations remained well below pre-1980 levels, and the 2008 ozone hole was unusually large (sixth worst on record) and persistent, with low ozone values extending into the late December period. In fact, the polar vortex in 2008 persisted longer than for any previous year since 1979.
Northern Hemisphere snow cover extent for the year was well below average due in large part to the record-low ice extent in March and despite the record-maximum coverage in January and the shortest snow cover duration on record (which started in 1966) in the North American Arctic. Limited preliminary data imply that in 2008 glaciers continued to lose mass, and full data for 2007 show it was the 17th consecutive year of loss. The northern region of Greenland and adjacent areas of Arctic Canada experienced a particularly intense melt season, even though there was an abnormally cold winter across Greenland's southern half. One of the most dramatic signals of the general warming trend was the continued significant reduction in the extent of the summer sea-ice cover and, importantly, the decrease in the amount of relatively older, thicker ice. The extent of the 2008 summer sea-ice cover was the second-lowest value of the satellite record (which started in 1979) and 36% below the 1979–2000 average. Significant losses in the mass of ice sheets and the area of ice shelves continued, with several fjords on the northern coast of Ellesmere Island being ice free for the first time in 3,000–5,500 years. In Antarctica, the positive phase of the SAM led to record-high total sea ice extent for much of early 2008 through enhanced equatorward Ekman transport. With colder continental temperatures at this time, the 2007–08 austral summer snowmelt season was dramatically weakened, making it the second shortest melt season since 1978 (when the record began). There was strong warming and increased precipitation along the Antarctic Peninsula and west Antarctica in 2008, and also pockets of warming along coastal east Antarctica, in concert with continued declines in sea-ice concentration in the Amundsen/Bellingshausen Seas. One significant event indicative of this warming was the disintegration and retreat of the Wilkins Ice Shelf in the southwest peninsula area of Antarctica.

Relevance:

30.00%

Publisher:

Abstract:

The global cycle of multicomponent aerosols including sulfate, black carbon (BC), organic matter (OM), mineral dust, and sea salt is simulated in the Laboratoire de Météorologie Dynamique general circulation model (LMDZT GCM). The seasonal open biomass burning emissions for simulation years 2000–2001 are scaled from climatological emissions in proportion to satellite-detected fire counts. The emissions of dust and sea salt are parameterized online in the model. The comparison of model-predicted monthly mean aerosol optical depth (AOD) at 500 nm with the Aerosol Robotic Network (AERONET) shows good agreement, with a correlation coefficient of 0.57 (N = 1324) and 76% of data points falling within a factor of 2 deviation. The correlation coefficient for daily mean values drops to 0.49 (N = 23,680). The absorption AOD (τa at 670 nm) estimated in the model is poorly correlated with measurements (r = 0.27, N = 349) and is biased low by 24% as compared to AERONET. The model reproduces the prominent features in the monthly mean AOD retrievals from the Moderate Resolution Imaging Spectroradiometer (MODIS). The agreement between the model and MODIS is better over source and outflow regions (i.e., within a factor of 2). There is an underestimation by the model of up to a factor of 3 to 5 over some remote oceans. The largest contribution to the global annual average AOD (0.12 at 550 nm) is from sulfate (0.043 or 35%), followed by sea salt (0.027 or 23%), dust (0.026 or 22%), OM (0.021 or 17%), and BC (0.004 or 3%). The atmospheric aerosol absorption is predominantly contributed by BC and is about 3% of the total AOD. The globally and annually averaged shortwave (SW) direct aerosol radiative perturbation (DARP) in clear-sky conditions is −2.17 W m−2, about a factor of 2 larger in magnitude than in all-sky conditions (−1.04 W m−2). The net DARP (SW + LW) by all aerosols is −1.46 and −0.59 W m−2 in clear- and all-sky conditions, respectively. Use of realistic dust optical properties that are less absorbing in the SW results in negative forcing over the dust-dominated regions.
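As a quick arithmetic check, the quoted component optical depths sum to the stated global annual mean of roughly 0.12 at 550 nm; because each component is rounded to three decimals, recomputed shares can differ from the quoted percentages by about a point.

```python
# Sanity check of the AOD budget quoted above: the component optical
# depths should sum to roughly the stated global annual mean of 0.12.
components = {
    "sulfate": 0.043,
    "sea salt": 0.027,
    "dust": 0.026,
    "organic matter": 0.021,
    "black carbon": 0.004,
}
total = sum(components.values())
for name, aod in components.items():
    print(f"{name:>14}: {aod:.3f} ({100.0 * aod / total:.1f}%)")
print(f"{'total':>14}: {total:.3f}")
```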

Relevance:

30.00%

Publisher:

Abstract:

Climate sensitivity is defined as the change in global mean equilibrium temperature after a doubling of atmospheric CO2 concentration and provides a simple measure of global warming. An early estimate of climate sensitivity, 1.5–4.5°C, has changed little subsequently, including in the latest assessment by the Intergovernmental Panel on Climate Change. The persistence of such large uncertainties in this simple measure casts doubt on our understanding of the mechanisms of climate change and our ability to predict the response of the climate system to future perturbations. This has motivated continued attempts to constrain the range with climate data, alone or in conjunction with models. The majority of studies use data from the instrumental period (post-1850), but recent work has made use of information about the large climate changes experienced in the geological past. In this review, we first outline approaches that estimate climate sensitivity using instrumental climate observations and then summarize attempts to use the record of climate change on geological timescales. We examine the limitations of these studies and suggest ways in which the power of the palaeoclimate record could be better used to reduce uncertainties in our predictions of climate sensitivity.
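The definition above can be made concrete with the standard logarithmic approximation ΔT = S · log2(C/C0), under which sensitivity S is exactly the warming for a doubling. The sensitivity values and concentrations below are illustrative, not values endorsed by the review.

```python
import math

# Sketch of how climate sensitivity S maps a CO2 concentration to
# equilibrium warming via the standard logarithmic approximation
# dT = S * log2(C / C0). S, C, and C0 below are illustrative values.
def equilibrium_warming(c_ppm, c0_ppm=280.0, sensitivity=3.0):
    """Equilibrium warming (K) for CO2 at c_ppm relative to c0_ppm."""
    return sensitivity * math.log2(c_ppm / c0_ppm)

# A doubling (280 -> 560 ppm) returns the sensitivity itself, by definition.
print(equilibrium_warming(560.0))
# The assessed 1.5-4.5 K range translates directly into the warming spread
# for any given concentration:
for s in (1.5, 4.5):
    print(f"S = {s} K: 560 ppm gives {equilibrium_warming(560.0, sensitivity=s):.1f} K")
```

The wide spread of outcomes for the same concentration is exactly why the review argues that narrowing the range of S matters for prediction.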

Relevance:

30.00%

Publisher:

Abstract:

This study assesses the influence of the El Niño–Southern Oscillation (ENSO) on global tropical cyclone activity using a 150-yr-long integration with a high-resolution coupled atmosphere–ocean general circulation model [High-Resolution Global Environmental Model (HiGEM); with N144 resolution: ~90 km in the atmosphere and ~40 km in the ocean]. Tropical cyclone activity is compared to an atmosphere-only simulation using the atmospheric component of HiGEM (HiGAM). Observations of tropical cyclones in the International Best Track Archive for Climate Stewardship (IBTrACS) and tropical cyclones identified in the Interim ECMWF Re-Analysis (ERA-Interim) are used to validate the models. Composite anomalies of tropical cyclone activity in El Niño and La Niña years are used. HiGEM is able to capture the shift in tropical cyclone locations in response to ENSO in the Pacific and Indian Oceans. However, HiGEM does not capture the expected ENSO–tropical cyclone teleconnection in the North Atlantic. HiGAM shows more skill in simulating the global ENSO–tropical cyclone teleconnection; however, variability in the Pacific is overly pronounced. HiGAM is able to capture the ENSO–tropical cyclone teleconnection in the North Atlantic more accurately than HiGEM. An investigation into the large-scale environmental conditions, known to influence tropical cyclone activity, is used to further understand the response of tropical cyclone activity to ENSO in the North Atlantic and western North Pacific. The vertical wind shear response over the Caribbean is not captured in HiGEM compared to HiGAM and ERA-Interim. Biases in the mean ascent at 500 hPa in HiGEM remain in HiGAM over the western North Pacific; however, a more realistic low-level vorticity in HiGAM results in a more accurate ENSO–tropical cyclone teleconnection.

Relevance:

30.00%

Publisher:

Abstract:

This paper presents results of the AQL2004 project, which has been developed within the GOFC-GOLD Latin American network of remote sensing and forest fires (RedLatif). The project aimed to obtain monthly burned-land maps of the entire region, from Mexico to Patagonia, using MODIS (Moderate Resolution Imaging Spectroradiometer) reflectance data. The project was organized in three phases: acquisition and preprocessing of satellite data; discrimination of burned pixels; and validation of results. In the first phase, input data consisting of 32-day composites of MODIS 500-m reflectance data generated by the Global Land Cover Facility (GLCF) of the University of Maryland (College Park, Maryland, U.S.A.) were collected and processed. The discrimination of burned areas was addressed in two steps: searching for "burned core" pixels using postfire spectral indices and multitemporal change detection, and mapping of burned scars using contextual techniques. The validation phase was based on visual analysis of Landsat and CBERS (China-Brazil Earth Resources Satellite) images. Validation of the burned-land category showed an agreement ranging from 30% to 60%, depending on the ecosystem and vegetation species present. The total burned area for the entire year was estimated to be 153,215 km². The most affected countries in relation to their territory were Cuba, Colombia, Bolivia, and Venezuela. Burned areas were found in most land covers; herbaceous vegetation (savannas and grasslands) presented the highest proportions of burned area, while perennial forest had the lowest proportions. The importance of croplands in the total burned area should be treated with caution, since this cover presented the highest commission errors. The importance of generating systematic products of burned-land areas for different ecological processes is emphasized.

Relevance:

30.00%

Publisher:

Abstract:

The network paradigm has been highly influential in spatial analysis in the globalisation era. As economies across the world have become increasingly integrated, so-called global cities have come to play a growing role as central nodes in the networked global economy. The idea that a city’s position in global networks benefits its economic performance has resulted in a competitive policy focus on promoting the economic growth of cities by improving their network connectivity. However, in spite of the attention being given to boosting city connectivity, little is known about whether this directly translates into improved city economic performance and, if so, how well connected a city needs to be in order to benefit from this. In this paper we test the relationship between network connectivity and economic performance between 2000 and 2008 for cities with over 500,000 inhabitants in Europe and the USA to inform European policy.