959 results for Global Campus Model


Relevance: 30.00%

Publisher:

Abstract:

The climates of the mid-Holocene (MH), 6,000 years ago, and of the Last Glacial Maximum (LGM), 21,000 years ago, have been extensively simulated, in particular in the framework of the Palaeoclimate Modelling Intercomparison Project. These periods are well documented by palaeo-records, which can be used for evaluating model results for climates different from the present one. Here, we present new simulations of the MH and the LGM climates obtained with the IPSL_CM5A model and compare them to our previous results obtained with the IPSL_CM4 model. Compared to IPSL_CM4, IPSL_CM5A includes two new features: the interactive representation of plant phenology and of marine biogeochemistry. But one of the most important differences between these models is the latitudinal resolution and vertical domain of their atmospheric component, which have been improved in IPSL_CM5A and result in a better representation of the mid-latitude jet streams. The representation of the Asian monsoon is also substantially improved. The global average mean annual temperature simulated for the pre-industrial (PI) period is colder in IPSL_CM5A than in IPSL_CM4, but their climate sensitivity to a CO2 doubling is similar. Here we show that these differences in the simulated PI climate have an impact on the simulated MH and LGM climatic anomalies. The larger cooling response to LGM boundary conditions in IPSL_CM5A appears to be mainly due to differences between the PMIP3 and PMIP2 boundary conditions, as shown by a shortwave radiative forcing/feedback analysis based on a simplified perturbation method. The sensitivity computed from the LGM climate is found to be lower than that computed from 2 × CO2 simulations, confirming previous studies based on different models. For the MH, the Asian monsoon, stronger in the IPSL_CM5A PI simulation, is also more sensitive to the insolation changes. The African monsoon is further amplified in IPSL_CM5A due to the impact of the interactive phenology. Finally, the changes in variability for both models and for the MH and LGM are presented, taking the example of the El Niño–Southern Oscillation (ENSO), which is very different in the two PI simulations. ENSO variability is damped in both model versions at the MH, whereas inconsistent responses are found between the two versions for the LGM. Part 2 of this paper examines whether these differences between IPSL_CM4 and IPSL_CM5A can be distinguished when comparing the results to palaeoclimatic reconstructions and investigates new approaches for model-data comparisons made possible by the inclusion of new components in IPSL_CM5A.
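
For context, the sketch below shows how an effective sensitivity parameter is commonly diagnosed from an LGM experiment and rescaled for comparison with 2 × CO2 runs; this is a simplified illustration, not the paper's actual shortwave forcing/feedback decomposition.

```latex
% Illustrative only: effective sensitivity diagnosed from the simulated LGM
% anomaly and rescaled by the 2xCO2 forcing for comparison with 2xCO2 runs.
\lambda_{\mathrm{LGM}} = \frac{\Delta T_{\mathrm{LGM}}}{\Delta F_{\mathrm{LGM}}},
\qquad
\mathrm{ECS}_{\mathrm{LGM}} \approx \lambda_{\mathrm{LGM}}\, F_{2\times\mathrm{CO_2}}
```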

Relevance: 30.00%

Publisher:

Abstract:

Substantial low-frequency rainfall fluctuations occurred in the Sahel throughout the twentieth century, causing devastating drought. Modeling these low-frequency rainfall fluctuations has remained problematic for climate models for many years. Here we show, using a combination of state-of-the-art rainfall observations and high-resolution global climate models, that changes in organized heavy rainfall events carry most of the rainfall variability in the Sahel at multiannual to decadal time scales. The ability to produce intense, organized convection allows climate models to correctly simulate the magnitude of late-twentieth century rainfall change, underlining the importance of model resolution. Increasing model resolution allows a better coupling between large-scale circulation changes and regional rainfall processes over the Sahel. These results provide a strong basis for developing more reliable and skilful long-term predictions of rainfall (seasons to years), which could benefit many sectors in the region by allowing early adaptation to impending extremes.

Relevance: 30.00%

Publisher:

Abstract:

Intercomparison and evaluation of the global ocean surface mixed layer depth (MLD) fields estimated from a suite of major ocean syntheses are conducted. Compared with the reference MLDs calculated from individual profiles, MLDs calculated from monthly mean and gridded profiles show negative biases of 10–20 m in early spring, related to the re-stratification process of relatively deep mixed layers. The vertical resolution of the profiles also influences the MLD estimation. MLDs are underestimated by approximately 5–7 (14–16) m at a vertical resolution of 25 (50) m when the criterion of potential density exceeding the 10-m value by 0.03 kg m⁻³ is used for the MLD estimation. Using the larger criterion (0.125 kg m⁻³) generally reduces the underestimation. In addition, positive biases greater than 100 m are found in wintertime subpolar regions when temperature-based MLD criteria are used. Biases of the reanalyses are due to both model errors and errors related to differences between the assimilation methods. The result shows that these errors are partially cancelled out through ensemble averaging. Moreover, the bias in the ensemble mean field of the reanalyses is smaller than in the observation-only analyses, which is largely attributed to the comparatively higher resolutions of the reanalyses. The robust reproduction of both the seasonal cycle and interannual variability by the ensemble mean of the reanalyses indicates the great potential of the ensemble mean MLD field for investigating and monitoring upper ocean processes.
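
As an illustration of the density-threshold criterion quoted above, here is a minimal Python sketch (not the study's code; the synthetic profile and grids are assumed for the example) showing how a coarser vertical grid tends to underestimate the MLD:

```python
# Illustrative sketch: mixed layer depth from a single profile using the
# density-threshold criterion, i.e. the depth where potential density first
# exceeds its 10 m value by a chosen threshold (0.03 or 0.125 kg m^-3).
import numpy as np

def mld_density_threshold(depth, sigma, threshold=0.03, ref_depth=10.0):
    """depth (m, increasing downwards), sigma (potential density, kg m^-3)."""
    sigma_ref = np.interp(ref_depth, depth, sigma)        # 10 m reference value
    exceeds = np.where(sigma > sigma_ref + threshold)[0]  # first exceedance
    if exceeds.size == 0:
        return depth[-1]                                  # mixed to profile bottom
    i = exceeds[0]
    if i == 0:
        return depth[0]
    # linear interpolation between the bracketing levels for a smoother estimate
    return np.interp(sigma_ref + threshold, sigma[i - 1:i + 1], depth[i - 1:i + 1])

# Example: a coarse 50 m grid underestimates the MLD relative to a fine grid,
# consistent with the vertical-resolution sensitivity noted above.
z_fine = np.arange(0, 500, 5.0)
sigma_fine = 1025.0 + 0.002 * np.maximum(z_fine - 80.0, 0.0)  # mixed to ~80 m
z_coarse = np.arange(0, 500, 50.0)
sigma_coarse = np.interp(z_coarse, z_fine, sigma_fine)
print(mld_density_threshold(z_fine, sigma_fine),
      mld_density_threshold(z_coarse, sigma_coarse))
```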

Relevance: 30.00%

Publisher:

Abstract:

Biomass burning impacts vegetation dynamics, biogeochemical cycling, atmospheric chemistry, and climate, with sometimes deleterious socio-economic impacts. Under future climate projections it is often expected that the risk of wildfires will increase. Our ability to predict the magnitude and geographic pattern of future fire impacts rests on our ability to model fire regimes, using either well-founded empirical relationships or process-based models with good predictive skill. A large variety of models exist today and it is still unclear which type of model or degree of complexity is required to model fire adequately at regional to global scales. This is the central question underpinning the creation of the Fire Model Intercomparison Project (FireMIP), an international project to compare and evaluate existing global fire models against benchmark data sets for present-day and historical conditions. In this paper we summarise the current state of the art in fire regime modelling and model evaluation, and outline what lessons may be learned from FireMIP.

Relevance: 30.00%

Publisher:

Abstract:

Floods are the most frequent of natural disasters, affecting millions of people across the globe every year. The anticipation and forecasting of floods at the global scale is crucial to preparing for severe events and providing early awareness where local flood models and warning services may not exist. As numerical weather prediction models continue to improve, operational centres are increasingly using their meteorological output to drive hydrological models, creating hydrometeorological systems capable of forecasting river flow and flood events at much longer lead times than was previously possible. Furthermore, developments in modelling capabilities, data and resources in recent years have made it possible to produce global-scale flood forecasting systems. In this paper, the current state of operational large-scale flood forecasting is discussed, including probabilistic forecasting of floods using ensemble prediction systems. Six state-of-the-art operational large-scale flood forecasting systems are reviewed, describing similarities and differences in their approaches to forecasting floods at the global and continental scale. Currently, operational systems have the capability to produce coarse-scale discharge forecasts in the medium range and to disseminate forecasts and, in some cases, early warning products in real time across the globe, in support of national forecasting capabilities. With improvements in seasonal weather forecasting, future advances may include more seamless hydrological forecasting at the global scale, alongside a move towards multi-model forecasts and grand ensemble techniques, responding to the requirement of developing multi-hazard early warning systems for disaster risk reduction.

Relevance: 30.00%

Publisher:

Abstract:

Ocean–sea ice reanalyses are crucial for assessing the variability and recent trends in the Arctic sea ice cover. This is especially true for sea ice volume, as long-term, large-scale sea ice thickness observations do not exist. Results from the Ocean ReAnalyses Intercomparison Project (ORA-IP) are presented, with a focus on Arctic sea ice fields reconstructed by state-of-the-art global ocean reanalyses. Differences between the various reanalyses are explored in terms of the effects of data assimilation, model physics and atmospheric forcing on properties of the sea ice cover, including concentration, thickness, velocity and snow. Amongst the 14 reanalyses studied here, 9 assimilate sea ice concentration and none assimilate sea ice thickness data. The comparison reveals an overall agreement in the reconstructed concentration fields, mainly because of the constraints on surface temperature imposed by direct assimilation of ocean observations, prescribed or assimilated atmospheric forcing, and assimilation of sea ice concentration. However, some spread still exists amongst the reanalyses, due to a variety of factors. In particular, a large spread in sea ice thickness is found within the ensemble of reanalyses, partially caused by the biases inherited from their sea ice model components. Biases are also affected by the assimilation of sea ice concentration and the treatment of sea ice thickness in the data assimilation process. An important outcome of this study is that the spatial distribution of ice volume varies widely between products, with no reanalysis standing out as clearly superior when compared to altimetry estimates. The ice thickness from systems without assimilation of sea ice concentration is not worse than that from systems constrained with sea ice observations. An evaluation of the sea ice velocity fields reveals that ice drifts too fast in most systems. As an ensemble, the ORA-IP reanalyses capture trends in Arctic sea ice area and extent relatively well. However, the ensemble cannot be used to obtain a robust estimate of recent trends in Arctic sea ice volume. Biases in the reanalyses certainly impact the simulated air–sea fluxes in the polar regions and call into question the suitability of current sea ice reanalyses for initializing seasonal forecasts.

Relevance: 30.00%

Publisher:

Abstract:

Arctic flaw polynyas are considered to be highly productive areas for the formation of sea-ice throughout the winter season. Most estimates of sea-ice production are based on the surface energy balance equation and use global reanalyses as atmospheric forcing, which are too coarse to take into account the impact of polynyas on the atmosphere. Additional errors in the estimates of polynya ice production may result from the methods used to calculate atmospheric energy fluxes and from the assumption of a thin-ice distribution within polynyas. The present study uses simulations with the mesoscale weather prediction model of the Consortium for Small-scale Modelling (COSMO), in which the polynya area is prescribed from satellite data. The polynya area is assumed either to be ice-free or to be covered with thin ice of 10 cm. Simulations have been performed for two winter periods (2007/08 and 2008/09). When using a realistic thin-ice thickness of 10 cm, sea-ice production in the Laptev polynyas amounts to 30 km³ and 73 km³ for the winters 2007/08 and 2008/09, respectively. The higher turbulent energy fluxes of open-water polynyas result in a 50–70% increase in sea-ice production (49 km³ in 2007/08 and 123 km³ in 2008/09). Our results suggest that previous studies have overestimated ice production in the Laptev Sea.
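
To make the energy-balance approach mentioned above concrete, here is a minimal sketch (with assumed, illustrative numbers, not values from the study) of how a net surface heat loss over a polynya translates into an ice-production volume:

```python
# Minimal sketch: converting a net surface heat loss over a polynya into an
# ice-production rate via the latent heat of fusion, the basis of
# energy-balance estimates of polynya ice production.
RHO_ICE = 910.0      # kg m^-3, sea-ice density (assumed)
L_FUSION = 3.34e5    # J kg^-1, latent heat of fusion

def ice_production(q_net_loss_wm2, area_m2, seconds):
    """Ice volume (m^3) produced when q_net_loss_wm2 (W m^-2) is extracted
    from an open-water or thin-ice polynya of the given area and duration."""
    return q_net_loss_wm2 * area_m2 * seconds / (RHO_ICE * L_FUSION)

# Example (assumed values): 400 W m^-2 heat loss over a 2000 km^2 polynya
# sustained for 10 days yields roughly 2 km^3 of new ice.
volume_m3 = ice_production(400.0, 2000e6, 10 * 86400)
print(volume_m3 / 1e9, "km^3")
```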

Relevance: 30.00%

Publisher:

Abstract:

The polynyas of the Laptev Sea are regions of particular interest due to the strong formation of Arctic sea-ice. In order to simulate the polynya dynamics and to quantify ice production, we apply the Finite Element Sea-Ice Ocean Model FESOM. In previous simulations, FESOM was forced with daily atmospheric data from the NCEP (National Centers for Environmental Prediction) reanalysis 1. For the periods 1 April to 9 May 2008 and 1 January to 8 February 2009 we examine the impact of different forcing data: daily and 6-hourly NCEP reanalysis 1 (1.875° × 1.875°), 6-hourly NCEP reanalysis 2 (1.875° × 1.875°), 6-hourly analyses from the GME (Global Model of the German Weather Service) (0.5° × 0.5°) and high-resolution hourly COSMO (Consortium for Small-Scale Modeling) data (5 km × 5 km). In all FESOM simulations, except for those with 6-hourly and daily NCEP 1 data, the openings and closings of polynyas are simulated in broad agreement with satellite products. Over the fast-ice area the wind fields of all atmospheric data sets are similar and close to in situ measurements. Over the polynya areas, however, there are strong differences between the forcing data with respect to air temperature and turbulent heat flux. These differences have a strong impact on sea-ice production rates. Depending on the forcing fields, polynya ice production ranges from 1.4 km³ to 7.8 km³ during 1 April to 9 May 2008 and from 25.7 km³ to 66.2 km³ during 1 January to 8 February 2009. Therefore, atmospheric forcing data with high spatial and temporal resolution that account for the presence of polynyas are needed to reduce the uncertainty in quantifying ice production in polynyas.

Relevance: 30.00%

Publisher:

Abstract:

The role of the local atmospheric forcing on the ocean mixed layer depth (MLD) over the global oceans is studied using ocean reanalysis data products and a single-column ocean model coupled to an atmospheric general circulation model. The focus of this study is on how the annual mean and the seasonal cycle of the MLD relate to various forcing characteristics in different parts of the world's ocean, and how anomalous variations in the monthly mean MLD relate to anomalous atmospheric forcings. By analysing both the ocean reanalysis data and the single-column ocean model, regions with different dominant forcings and different mean and variability characteristics of the MLD can be identified. Many of the global oceans' MLD characteristics appear to be directly linked to different atmospheric forcing characteristics at different locations. Here, heating and wind stress are identified as the main drivers; in some, mostly coastal, regions the atmospheric salinity forcing also contributes. The annual mean MLD is more closely related to the annual mean wind stress, whereas the MLD seasonality is more closely related to the seasonality in heating. The single-column ocean model, however, also shows that the MLD characteristics over most global ocean regions, and in particular the tropics and subtropics, cannot be maintained by local atmospheric forcings alone, but are also a result of ocean dynamics that are not simulated in a single-column ocean model. Thus, lateral ocean dynamics are essential for correctly simulating the observed MLD.
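
The contrast between wind-driven deepening and heating-driven shallowing can be illustrated with the classic Monin-Obukhov depth scale; this is a textbook scaling added here for illustration, not the formulation of the single-column model used in the study.

```latex
% Textbook scaling (illustrative): the mixed layer deepens with the cube of
% the friction velocity (wind stress) and shallows as the stabilising surface
% buoyancy flux (heating, plus a freshwater term) increases.
L_{MO} = \frac{u_*^3}{\kappa\, B_0},
\qquad
u_* = \sqrt{\tau/\rho_0},
\qquad
B_0 = \frac{g\,\alpha}{\rho_0\, c_p}\, Q_{net} + g\,\beta\, S\,(E - P)
```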

Relevance: 30.00%

Publisher:

Abstract:

The Arctic sea ice cover is thinning and retreating, causing changes in surface roughness that in turn modify the momentum flux from the atmosphere through the ice into the ocean. New model simulations comprising variable sea ice drag coefficients for both the air and water interfaces demonstrate that the heterogeneity in sea ice surface roughness significantly impacts the spatial distribution and trends of ocean surface stress over recent decades. Simulations with constant sea ice drag coefficients, as used in most climate models, show an increase in annual mean ocean surface stress (0.003 N/m² per decade, 4.6%) due to the reduction of ice thickness, which leads to a weakening of the ice and accelerated ice drift. In contrast, with variable drag coefficients our simulations show that annual mean ocean surface stress declines at a rate of 0.002 N/m² per decade (3.1%) over the period 1980-2013, because of a significant reduction in surface roughness associated with an increasingly thinner and younger sea ice cover. The effectiveness of sea ice in transferring momentum depends not only on its resistive strength against the wind forcing but also on its top and bottom surface roughness, which varies with ice type and ice conditions. This reveals the need to account for sea ice surface roughness variations in climate simulations in order to correctly represent the implications of sea ice loss under global warming.
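
A minimal sketch of the quadratic drag laws underlying this kind of momentum budget is given below; the coefficients and wind speed are assumed, illustrative values, not the parameterisation actually used in the simulations.

```python
# Schematic quadratic drag laws: the ocean surface stress under ice depends on
# both the atmospheric drag on the ice top and the ice-ocean drag at the ice
# bottom, so roughness changes at either interface alter the transferred
# momentum. Coefficients below are assumed for illustration.
import numpy as np

RHO_AIR, RHO_OCEAN = 1.3, 1027.0   # kg m^-3

def air_ice_stress(u_wind, cd_air=1.5e-3):
    """Wind stress on the ice top surface (N m^-2); cd_air varies with roughness."""
    return RHO_AIR * cd_air * np.abs(u_wind) * u_wind

def ice_ocean_stress(u_ice, u_ocean, cd_water=5.0e-3):
    """Stress transmitted to the ocean at the ice bottom (N m^-2)."""
    du = u_ice - u_ocean
    return RHO_OCEAN * cd_water * np.abs(du) * du

# Example: smoother (younger, thinner) ice -> smaller drag coefficient -> less
# momentum extracted from the same 8 m/s wind and passed on to the ocean.
print(air_ice_stress(8.0, cd_air=2.2e-3), air_ice_stress(8.0, cd_air=1.4e-3))
```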

Relevance: 30.00%

Publisher:

Abstract:

In both the observational record and atmosphere-ocean general circulation model (AOGCM) simulations of the last ~150 years, short-lived negative radiative forcing due to volcanic aerosol, following explosive eruptions, causes sudden global-mean cooling of up to ~0.3 K. This is about five times smaller than expected from the transient climate response parameter (TCRP, K of global-mean surface air temperature change per W m⁻² of radiative forcing increase) evaluated under atmospheric CO2 concentration increasing at 1% yr⁻¹. Using the step model (Good et al. in Geophys Res Lett 38:L01703, 2011. doi:10.1029/2010GL045208), we confirm the previous finding (Held et al. in J Clim 23:2418–2427, 2010. doi:10.1175/2009JCLI3466.1) that the main reason for the discrepancy is the damping of the response to short-lived forcing by the thermal inertia of the upper ocean. Although the step model includes this effect, it still overestimates the volcanic cooling simulated by AOGCMs by about 60%. We show that this remaining discrepancy can be explained by the magnitude of the volcanic forcing, which may be smaller in AOGCMs (by 30% for the HadCM3 AOGCM) than in off-line calculations that do not account for rapid cloud adjustment, and by the climate sensitivity parameter, which may be smaller than for increasing CO2 (40% smaller than for 4 × CO2 in HadCM3).
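
The step-model idea referenced above (Good et al. 2011) can be sketched as a superposition of scaled step responses; the code below is a schematic illustration with an assumed, exponential step response rather than actual HadCM3 output.

```python
# Minimal sketch of the step-model idea: the response to an arbitrary forcing
# time series is reconstructed by scaling and superposing the model's response
# to an abrupt forcing step. The step response here is an illustrative
# two-timescale exponential, not AOGCM data.
import numpy as np

def step_model(forcing, step_response, f_step):
    """Temperature response (K) to yearly `forcing` (W m^-2), given the response
    `step_response` (K) to an abrupt forcing of magnitude `f_step` (W m^-2)."""
    n = len(forcing)
    d_forcing = np.diff(np.concatenate(([0.0], forcing)))  # yearly forcing increments
    temp = np.zeros(n)
    for s in range(n):                                      # superpose scaled step responses
        temp[s:] += (d_forcing[s] / f_step) * step_response[: n - s]
    return temp

years = np.arange(200)
# Illustrative step response to an abrupt 3.7 W m^-2 forcing (roughly 2xCO2-like)
step_resp = 2.0 * (0.6 * (1 - np.exp(-years / 4.0)) + 0.4 * (1 - np.exp(-years / 250.0)))

# A short-lived volcanic-style forcing spike produces much less cooling than
# the equilibrium sensitivity would suggest, because of upper-ocean inertia.
volcanic = np.zeros(200)
volcanic[50:52] = -3.0
print(step_model(volcanic, step_resp, 3.7).min())
```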

Relevance: 30.00%

Publisher:

Abstract:

Reconstructions of salinity are used to diagnose changes in the hydrological cycle and ocean circulation. A widely used method of determining past salinity uses oxygen isotope (δ¹⁸Ow) residuals after the extraction of the global ice volume and temperature components. This method relies on a constant relationship between δ¹⁸Ow and salinity through time. Here we use the isotope-enabled, fully coupled General Circulation Model (GCM) HadCM3 to test the application of spatially and temporally independent relationships in the reconstruction of past ocean salinity. Simulations of the Late Holocene (LH), Last Glacial Maximum (LGM), and Last Interglacial (LIG) climates are performed and benchmarked against existing compilations of stable oxygen isotopes in carbonates (δ¹⁸Oc), which primarily reflect δ¹⁸Ow and temperature. We find that HadCM3 produces an accurate representation of the surface ocean δ¹⁸Oc distribution for the LH and LGM. Our simulations show considerable variability in spatial and temporal δ¹⁸Ow–salinity relationships. Spatial gradients are generally shallower than, but within ~50% of, the actual simulated LH-to-LGM and LH-to-LIG temporal gradients, and temporal gradients calculated from multi-decadal variability are generally shallower than both the spatial and the actual simulated gradients. The largest sources of uncertainty in salinity reconstructions are found to be changes in regional freshwater budgets, ocean circulation, and sea ice regimes. These can cause errors in salinity estimates exceeding 4 psu. Our results suggest that paleosalinity reconstructions in the South Atlantic, Indian and Tropical Pacific Oceans should be most robust, since these regions exhibit relatively constant δ¹⁸Ow–salinity relationships across spatial and temporal scales. The largest uncertainties will affect North Atlantic and high-latitude paleosalinity reconstructions. Finally, the results show that it is difficult to generate reliable salinity estimates for regions of dynamic oceanography, such as the North Atlantic, without additional constraints.
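
The reconstruction chain being tested can be summarised schematically as below; this is the generic approach implied by the abstract (the symbols and the constant slope a are illustrative assumptions), not the specific calibration used in the paper.

```latex
% Schematic reconstruction chain (illustrative): remove temperature and global
% ice-volume components from the carbonate signal, then convert the residual
% to salinity with an assumed-constant local d18Ow-salinity slope a.
\delta^{18}O_{w} = \delta^{18}O_{c} - f(T) - \Delta\delta^{18}O_{ice},
\qquad
S \approx S_{ref} + \frac{\delta^{18}O_{w} - \delta^{18}O_{w,ref}}{a}
```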

Relevance: 30.00%

Publisher:

Abstract:

SHIMMER (Soil biogeocHemIcal Model for Microbial Ecosystem Response) is a new numerical modelling framework designed to simulate microbial dynamics and biogeochemical cycling during initial ecosystem development in glacier forefield soils. However, it is also transferable to other extreme ecosystem types (such as desert soils or the surface of glaciers). The rationale for model development arises from decades of empirical observations in glacier forefields, and the model enables a quantitative and process-focussed approach. Here, we provide a detailed description of SHIMMER, test its performance in two case study forefields, the Damma Glacier (Switzerland) and the Athabasca Glacier (Canada), and perform a sensitivity analysis to identify the most sensitive and unconstrained model parameters. Results show that the accumulation of microbial biomass is highly dependent on variation in microbial growth and death rate constants, Q10 values, the active fraction of microbial biomass and the reactivity of organic matter. The model correctly predicts the rapid accumulation of microbial biomass observed during the initial stages of succession in both case study forefields. Primary production is responsible for the initial build-up of labile substrate that subsequently supports heterotrophic growth. However, allochthonous contributions of organic matter, and nitrogen fixation, are important in sustaining this productivity. The development and application of SHIMMER also highlights aspects of these systems that require further empirical research: quantifying nutrient budgets and biogeochemical rates, and exploring seasonality, microbial growth and cell death. This will lead to increased understanding of how glacier forefields contribute to global biogeochemical cycling and climate under future ice retreat.
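
Since the sensitivity to Q10 values is highlighted above, here is a minimal sketch of the generic Q10 temperature scaling of a rate constant; the parameter values are illustrative assumptions, not SHIMMER's calibrated values.

```python
# Generic Q10 temperature scaling of a microbial rate constant (illustrative
# values only): the sensitivity to Q10 noted above follows from the
# exponential form of this dependence.
def q10_rate(k_ref, temp_c, q10=2.0, temp_ref_c=10.0):
    """Rate constant at temp_c given its value k_ref at temp_ref_c."""
    return k_ref * q10 ** ((temp_c - temp_ref_c) / 10.0)

# Example: with Q10 = 2 a growth-rate constant doubles for every 10 C warming,
# so forefield soils near 0 C support much slower biomass accumulation.
print(q10_rate(0.1, 0.0), q10_rate(0.1, 10.0), q10_rate(0.1, 20.0))
```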

Relevance: 30.00%

Publisher:

Abstract:

Animal models of acquired epilepsies aim to provide researchers with tools for use in understanding the processes underlying the acquisition, development and establishment of the disorder. Typically, following a systemic or local insult, vulnerable brain regions undergo a process leading to the development, over time, of spontaneous recurrent seizures. Many such models make use of a period of intense seizure activity or status epilepticus, and this may be associated with high mortality and/or global damage to large areas of the brain. These undesirable elements have driven improvements in the design of chronic epilepsy models, for example the lithium-pilocarpine epileptogenesis model. Here, we present an optimised model of chronic epilepsy that reduces mortality to 1% whilst retaining features of high epileptogenicity and development of spontaneous seizures. Using local field potential recordings from hippocampus in vitro as a probe, we show that the model does not result in significant loss of neuronal network function in area CA3 and, instead, subtle alterations in network dynamics appear during a process of epileptogenesis, which eventually leads to a chronic seizure state. The model’s features of very low mortality and high morbidity in the absence of global neuronal damage offer the chance to explore the processes underlying epileptogenesis in detail, in a population of animals not defined by their resistance to seizures, whilst acknowledging and being driven by the 3Rs (Replacement, Refinement and Reduction of animal use in scientific procedures) principles.

Relevance: 30.00%

Publisher:

Abstract:

The Madden-Julian Oscillation (MJO) is the dominant mode of intraseasonal variability in the Tropics. It can be characterised as a planetary-scale coupling between the atmospheric circulation and organised deep convection that propagates east through the equatorial Indo-Pacific region. The MJO interacts with weather and climate systems on a near-global scale and is a crucial source of predictability for weather forecasts on medium to seasonal timescales. Despite its global significance, accurately representing the MJO in numerical weather prediction (NWP) and climate models remains a challenge. This thesis focuses on the representation of the MJO in the Integrated Forecasting System (IFS) at the European Centre for Medium-Range Weather Forecasts (ECMWF), a state-of-the-art NWP model. Recent modifications to the model physics in Cycle 32r3 (Cy32r3) of the IFS led to advances in the simulation of the MJO; for the first time, the observed amplitude of the MJO was maintained throughout the integration period. A set of hindcast experiments, which differ only in their formulation of convection, has been performed between May 2008 and April 2009 to assess the sensitivity of MJO simulation in the IFS to the Cy32r3 convective parameterization. Unique to this thesis is the attribution of the advances in MJO simulation in Cy32r3 to the modified convective parameterization, specifically, the relative-humidity-dependent formulation for organised deep entrainment. Increasing the sensitivity of the deep convection scheme to environmental moisture is shown to modify the relationship between precipitation and moisture in the model. Through dry-air entrainment, convective plumes ascending in low-humidity environments terminate lower in the atmosphere. As a result, there is an increase in the occurrence of cumulus congestus, which acts to moisten the mid-troposphere. Due to the modified precipitation-moisture relationship, more moisture is able to build up, which effectively preconditions the tropical atmosphere for the transition to deep convection. Results from this thesis suggest that a tropospheric moisture control on convection is key to simulating the interaction between the physics and the large-scale circulation associated with the MJO.
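
To illustrate the kind of relative-humidity dependence described above, here is a schematic sketch; the functional form and constants are assumptions chosen for illustration only, not the actual Cy32r3 entrainment formulation.

```python
# Schematic relative-humidity-dependent entrainment (illustrative functional
# form and constants; not the IFS Cy32r3 scheme): drier environments give
# larger entrainment, so plumes dilute faster and terminate lower.
def organized_entrainment(rh_env, eps_base=2.0e-4):
    """Entrainment rate (m^-1) that increases as environmental RH (0-1) decreases."""
    rh = min(max(rh_env, 0.0), 1.0)
    return eps_base * (1.3 - rh)

for rh in (0.9, 0.7, 0.5):
    print(rh, organized_entrainment(rh))
```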