Abstract:
A statistical-dynamical downscaling method is used to estimate future changes of wind energy output (Eout) of a benchmark wind turbine across Europe at the regional scale. With this aim, 22 global climate models (GCMs) of the Coupled Model Intercomparison Project Phase 5 (CMIP5) ensemble are considered. The downscaling method uses circulation weather types and regional climate modelling with the COSMO-CLM model. Future projections are computed for two time periods (2021–2060 and 2061–2100) following two scenarios (RCP4.5 and RCP8.5). The CMIP5 ensemble mean response reveals a more likely than not increase of mean annual Eout over Northern and Central Europe and a likely decrease over Southern Europe. There is some uncertainty with respect to the magnitude and the sign of the changes. Higher robustness in future changes is observed for specific seasons. Except for the Mediterranean area, an ensemble mean increase of Eout is simulated for winter and a decrease for summer, resulting in a strong increase of the intra-annual variability for most of Europe. The latter is particularly probable during the second half of the 21st century under the RCP8.5 scenario. In general, signals are stronger for 2061–2100 compared to 2021–2060 and for RCP8.5 compared to RCP4.5. Regarding changes of the inter-annual variability of Eout for Central Europe, the future projections vary strongly between individual models and also between future periods and scenarios within single models. This study showed for an ensemble of 22 CMIP5 models that changes in the wind energy potentials over Europe may take place in future decades. However, due to the uncertainties detected in this research, further investigations with multi-model ensembles are needed to provide a better quantification and understanding of the future changes.
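The mapping from wind speed to turbine energy output underlying a quantity like Eout can be illustrated with an idealised power curve: cubic growth between cut-in and rated wind speed, constant output at rated power, and zero output below cut-in or above cut-out. This is a generic sketch, not the benchmark turbine of the study; all parameter values here are hypothetical.

```python
# Idealised turbine power curve (illustrative only; all parameters
# are hypothetical, not those of the study's benchmark turbine).
def turbine_power_kw(v, cut_in=3.5, rated_v=13.0, cut_out=25.0, rated_kw=3000.0):
    """Power (kW) of an idealised turbine at mean wind speed v (m/s)."""
    if v < cut_in or v >= cut_out:
        return 0.0            # no generation outside operating range
    if v >= rated_v:
        return rated_kw       # capped at rated power
    # cubic interpolation between cut-in and rated speed
    frac = (v**3 - cut_in**3) / (rated_v**3 - cut_in**3)
    return rated_kw * frac

# Example: power at a few wind speeds
for v in (2.0, 8.0, 13.0, 30.0):
    print(v, turbine_power_kw(v))
```

Summing such a curve over a time series of downscaled wind speeds, and integrating over time, gives an energy output in the spirit of Eout.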
Abstract:
Preparing for episodes with risks of anomalous weather a month to a year ahead is an important challenge for governments, non-governmental organisations, and private companies and is dependent on the availability of reliable forecasts. The majority of operational seasonal forecasts are made using process-based dynamical models, which are complex, computationally challenging and prone to biases. Empirical forecast approaches built on statistical models to represent physical processes offer an alternative to dynamical systems and can provide either a benchmark for comparison or independent supplementary forecasts. Here, we present a simple empirical system based on multiple linear regression for producing probabilistic forecasts of seasonal surface air temperature and precipitation across the globe. The global CO2-equivalent concentration is taken as the primary predictor; subsequent predictors, including large-scale modes of variability in the climate system and local-scale information, are selected on the basis of their physical relationship with the predictand. The focus given to the climate change signal as a source of skill and the probabilistic nature of the forecasts produced constitute a novel approach to global empirical prediction. Hindcasts for the period 1961–2013 are validated against observations using deterministic (correlation of seasonal means) and probabilistic (continuous ranked probability skill scores) metrics. Good skill is found in many regions, particularly for surface air temperature and most notably in much of Europe during the spring and summer seasons. For precipitation, skill is generally limited to regions with known El Niño–Southern Oscillation (ENSO) teleconnections. The system is used in a quasi-operational framework to generate empirical seasonal forecasts on a monthly basis.
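The core of such an empirical system can be sketched as a multiple linear regression of a seasonal predictand on the CO2-equivalent concentration plus a mode-of-variability index, with the residual spread supplying the forecast probability distribution. This is a minimal illustration on synthetic data, not the authors' code; the coefficient values and the Gaussian forecast assumption are ours.

```python
# Illustrative sketch (synthetic data, not the study's system):
# regress seasonal temperature on CO2-equivalent concentration and
# an ENSO index, then issue a Gaussian probabilistic forecast.
import numpy as np

rng = np.random.default_rng(0)
n_years = 53                                 # e.g. a 1961-2013 hindcast period
co2eq = np.linspace(320.0, 480.0, n_years)   # ppm, idealised rise
enso = rng.normal(0.0, 1.0, n_years)         # standardised ENSO index
temp = 0.01 * (co2eq - co2eq.mean()) + 0.3 * enso + rng.normal(0, 0.2, n_years)

# Least-squares fit: temp ~ b0 + b1*co2eq + b2*enso
X = np.column_stack([np.ones(n_years), co2eq, enso])
beta, *_ = np.linalg.lstsq(X, temp, rcond=None)
resid_sd = np.std(temp - X @ beta, ddof=3)   # spread for the forecast pdf

# Probabilistic forecast for a new season: Gaussian with the
# regression mean and the residual standard deviation as spread.
x_new = np.array([1.0, 485.0, 1.5])
mu = float(x_new @ beta)
print(f"forecast mean {mu:.2f} K, spread {resid_sd:.2f} K")
```

A real system would add predictor selection based on physical relationships with the predictand, as the abstract describes.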
Abstract:
Purpose – The purpose of this paper is to present the model of the translation of particularly important ideas for the organization and its context, called mythical ideas. Design/methodology/approach – The study is based on ethnographic research. Findings – It is found that change processes based on mythical ideas are especially dynamic but also very vulnerable. The consequences of failure can be vital for the organization and its environment. Originality/value – The paper explores the outcomes to which the translation of a mythical idea can lead. The findings are of value for people involved in organizational change processes.
Abstract:
Initializing the ocean for decadal predictability studies is a challenge, as it requires reconstructing the little observed subsurface trajectory of ocean variability. In this study we explore to what extent surface nudging using well-observed sea surface temperature (SST) can reconstruct the deeper ocean variations for the 1949–2005 period. An ensemble is made with a nudged version of the IPSLCM5A model and compared to ocean reanalyses and reconstructed datasets. The SST is restored to observations using a physically based relaxation coefficient, in contrast to earlier studies, which use a much larger value. The assessment is restricted to the regions where the ocean reanalyses agree, i.e. in the upper 500 m of the ocean, although this can be latitude and basin dependent. Significant reconstruction of the subsurface is achieved in specific regions, namely regions of subduction in the subtropical Atlantic, below the thermocline in the equatorial Pacific and, in some cases, in the North Atlantic deep convection regions. Beyond the mean correlations, ocean integrals are used to explore the time evolution of the correlation over 20-year windows. Classical fixed depth heat content diagnostics do not exhibit any significant reconstruction between the different existing observation-based references and can therefore not be used to assess global average time-varying correlations in the nudged simulations. Using the physically based average temperature above an isotherm (14 °C) alleviates this issue in the tropics and subtropics and shows significant reconstruction of these quantities in the nudged simulations for several decades. This skill is attributed to the wind stress reconstruction in the tropics, as already demonstrated in a perfect model study using the same model. Thus, we also show here the robustness of this result in an historical and observational context.
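The surface nudging idea is Newtonian relaxation: the model SST is pulled toward observations at a rate set by a relaxation timescale. The sketch below shows the discretised form with an explicit Euler step; the timescale used here is a hypothetical round number, not the physically based coefficient of the study.

```python
# Minimal sketch of Newtonian surface relaxation (nudging) toward
# observed SST: dT/dt = -(T - T_obs) / tau. The relaxation timescale
# here is a hypothetical value, not the study's coefficient.
def nudge_sst(sst_model, sst_obs, dt, tau):
    """One explicit Euler step of SST relaxation toward observations."""
    return sst_model - dt / tau * (sst_model - sst_obs)

# Example: model starts 2 K too warm; relax with a 60-day timescale.
dt = 86400.0            # one day in seconds
tau = 60.0 * 86400.0    # relaxation timescale (hypothetical)
sst = 290.0
for _ in range(120):    # integrate 120 days
    sst = nudge_sst(sst, 288.0, dt, tau)
print(f"SST after 120 days: {sst:.2f} K")
```

After two relaxation timescales the initial 2 K error decays by roughly a factor of e squared, so the final SST sits a few tenths of a kelvin above the observed value.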
Abstract:
We present the global general circulation model IPSL-CM5 developed to study the long-term response of the climate system to natural and anthropogenic forcings as part of the 5th Phase of the Coupled Model Intercomparison Project (CMIP5). This model includes an interactive carbon cycle, a representation of tropospheric and stratospheric chemistry, and a comprehensive representation of aerosols. As it represents the principal dynamical, physical, and bio-geochemical processes relevant to the climate system, it may be referred to as an Earth System Model. However, the IPSL-CM5 model may be used in a multitude of configurations associated with different boundary conditions and with a range of complexities in terms of processes and interactions. This paper presents an overview of the different model components and explains how they were coupled and used to simulate historical climate changes over the past 150 years and different scenarios of future climate change. A single version of the IPSL-CM5 model (IPSL-CM5A-LR) was used to provide climate projections associated with different socio-economic scenarios, including the different Representative Concentration Pathways considered by CMIP5 and several scenarios from the Special Report on Emission Scenarios considered by CMIP3. Results suggest that the magnitude of global warming projections primarily depends on the socio-economic scenario considered, that there is potential for an aggressive mitigation policy to limit global warming to about two degrees, and that the behavior of some components of the climate system such as the Arctic sea ice and the Atlantic Meridional Overturning Circulation may change drastically by the end of the twenty-first century in the case of a no climate policy scenario. 
Although the magnitude of regional temperature and precipitation changes depends fairly linearly on the magnitude of the projected global warming (and thus on the scenario considered), the geographical pattern of these changes is strikingly similar for the different scenarios. The representation of atmospheric physical processes in the model is shown to strongly influence the simulated climate variability and both the magnitude and pattern of the projected climate changes.
Abstract:
The climates of the mid-Holocene (MH), 6,000 years ago, and of the Last Glacial Maximum (LGM), 21,000 years ago, have been extensively simulated, in particular in the framework of the Palaeoclimate Modelling Intercomparison Project. These periods are well documented by paleo-records, which can be used for evaluating model results for climates different from the present one. Here, we present new simulations of the MH and the LGM climates obtained with the IPSL_CM5A model and compare them to our previous results obtained with the IPSL_CM4 model. Compared to IPSL_CM4, IPSL_CM5A includes two new features: the interactive representation of the plant phenology and marine biogeochemistry. But one of the most important differences between these models is the latitudinal resolution and vertical domain of their atmospheric component, which have been improved in IPSL_CM5A, resulting in a better representation of the mid-latitude jet-streams. The Asian monsoon's representation is also substantially improved. The global average mean annual temperature simulated for the pre-industrial (PI) period is colder in IPSL_CM5A than in IPSL_CM4 but their climate sensitivity to a CO2 doubling is similar. Here we show that these differences in the simulated PI climate have an impact on the simulated MH and LGM climatic anomalies. The larger cooling response to LGM boundary conditions in IPSL_CM5A appears to be mainly due to differences between the PMIP3 and PMIP2 boundary conditions, as shown by a short wave radiative forcing/feedback analysis based on a simplified perturbation method. It is found that the sensitivity computed from the LGM climate is lower than that computed from 2 × CO2 simulations, confirming previous studies based on different models. For the MH, the Asian monsoon, stronger in the IPSL_CM5A PI simulation, is also more sensitive to the insolation changes. The African monsoon is also further amplified in IPSL_CM5A due to the impact of the interactive phenology. 
Finally the changes in variability for both models and for MH and LGM are presented taking the example of the El-Niño Southern Oscillation (ENSO), which is very different in the PI simulations. ENSO variability is damped in both model versions at the MH, whereas inconsistent responses are found between the two versions for the LGM. Part 2 of this paper examines whether these differences between IPSL_CM4 and IPSL_CM5A can be distinguished when comparing those results to palaeo-climatic reconstructions and investigates new approaches for model-data comparisons made possible by the inclusion of new components in IPSL_CM5A.
Abstract:
Substantial low-frequency rainfall fluctuations occurred in the Sahel throughout the twentieth century, causing devastating drought. Modeling these low-frequency rainfall fluctuations has remained problematic for climate models for many years. Here we show using a combination of state-of-the-art rainfall observations and high-resolution global climate models that changes in organized heavy rainfall events carry most of the rainfall variability in the Sahel at multiannual to decadal time scales. Ability to produce intense, organized convection allows climate models to correctly simulate the magnitude of late-twentieth century rainfall change, underlining the importance of model resolution. Increasing model resolution allows a better coupling between large-scale circulation changes and regional rainfall processes over the Sahel. These results provide a strong basis for developing more reliable and skilful long-term predictions of rainfall (seasons to years) which could benefit many sectors in the region by allowing early adaptation to impending extremes.
Abstract:
Intercomparison and evaluation of the global ocean surface mixed layer depth (MLD) fields estimated from a suite of major ocean syntheses are conducted. Compared with the reference MLDs calculated from individual profiles, MLDs calculated from monthly mean and gridded profiles show negative biases of 10–20 m in early spring related to the re-stratification process of relatively deep mixed layers. Vertical resolution of profiles also influences the MLD estimation. MLDs are underestimated by approximately 5–7 (14–16) m with the vertical resolution of 25 (50) m when the criterion of potential density exceeding the 10-m value by 0.03 kg m−3 is used for the MLD estimation. Using the larger criterion (0.125 kg m−3) generally reduces the underestimations. In addition, positive biases greater than 100 m are found in wintertime subpolar regions when MLD criteria based on temperature are used. Biases of the reanalyses are due to both model errors and errors related to differences between the assimilation methods. The result shows that these errors are partially cancelled out through the ensemble averaging. Moreover, the bias in the ensemble mean field of the reanalyses is smaller than in the observation-only analyses. This is largely attributed to the comparatively higher resolutions of the reanalyses. The robust reproduction of both the seasonal cycle and interannual variability by the ensemble mean of the reanalyses indicates a great potential of the ensemble mean MLD field for investigating and monitoring upper ocean processes.
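The density-threshold criterion mentioned above can be made concrete: the MLD is the shallowest depth at which potential density exceeds its 10-m value by 0.03 kg m−3. The sketch below applies it to a synthetic profile; the profile shape and the linear interpolation between bracketing levels are our illustrative choices, not the syntheses' exact procedures.

```python
# Hedged illustration of the density-threshold MLD criterion: MLD is
# where potential density first exceeds the 10-m value by 0.03 kg/m^3.
# Profile values are synthetic; linear interpolation is illustrative.
import numpy as np

def mld_density_threshold(depth, sigma, threshold=0.03, ref_depth=10.0):
    """Return MLD (m) from a potential-density profile sigma(depth)."""
    sigma_ref = np.interp(ref_depth, depth, sigma)   # density at 10 m
    excess = sigma - (sigma_ref + threshold)
    below = np.nonzero(excess > 0)[0]
    if below.size == 0:
        return float(depth[-1])          # mixed to bottom of profile
    i = below[0]
    # linear interpolation between the two bracketing levels
    return float(np.interp(0.0, [excess[i - 1], excess[i]],
                           [depth[i - 1], depth[i]]))

# Synthetic profile: well mixed to ~50 m, stratified below.
z = np.arange(0.0, 200.0, 5.0)
sig = 25.0 + 0.02 * np.clip(z - 50.0, 0.0, None)
print(mld_density_threshold(z, sig))     # ~51.5 m for this profile
```

Coarsening the vertical grid (e.g. 25 m or 50 m spacing) systematically shifts where the threshold is first crossed, which is one way the resolution-dependent underestimation discussed in the abstract can arise.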
Abstract:
Biomass burning impacts vegetation dynamics, biogeochemical cycling, atmospheric chemistry, and climate, with sometimes deleterious socio-economic impacts. Under future climate projections it is often expected that the risk of wildfires will increase. Our ability to predict the magnitude and geographic pattern of future fire impacts rests on our ability to model fire regimes, either using well-founded empirical relationships or process-based models with good predictive skill. A large variety of models exist today and it is still unclear which type of model or degree of complexity is required to model fire adequately at regional to global scales. This is the central question underpinning the creation of the Fire Model Intercomparison Project - FireMIP, an international project to compare and evaluate existing global fire models against benchmark data sets for present-day and historical conditions. In this paper we summarise the current state-of-the-art in fire regime modelling and model evaluation, and outline what lessons may be learned from FireMIP.
Abstract:
Floods are the most frequent of natural disasters, affecting millions of people across the globe every year. The anticipation and forecasting of floods at the global scale is crucial to preparing for severe events and providing early awareness where local flood models and warning services may not exist. As numerical weather prediction models continue to improve, operational centres are increasingly using the meteorological output from these to drive hydrological models, creating hydrometeorological systems capable of forecasting river flow and flood events at much longer lead times than has previously been possible. Furthermore, developments in, for example, modelling capabilities, data and resources in recent years have made it possible to produce global scale flood forecasting systems. In this paper, the current state of operational large scale flood forecasting is discussed, including probabilistic forecasting of floods using ensemble prediction systems. Six state-of-the-art operational large scale flood forecasting systems are reviewed, describing similarities and differences in their approaches to forecasting floods at the global and continental scale. Currently, operational systems have the capability to produce coarse-scale discharge forecasts in the medium-range and disseminate forecasts and, in some cases, early warning products, in real time across the globe, in support of national forecasting capabilities. With improvements in seasonal weather forecasting, future advances may include more seamless hydrological forecasting at the global scale, alongside a move towards multi-model forecasts and grand ensemble techniques, responding to the requirement of developing multi-hazard early warning systems for disaster risk reduction.
Abstract:
Ocean–sea ice reanalyses are crucial for assessing the variability and recent trends in the Arctic sea ice cover. This is especially true for sea ice volume, as long-term and large scale sea ice thickness observations do not exist. Results from the Ocean ReAnalyses Intercomparison Project (ORA-IP) are presented, with a focus on Arctic sea ice fields reconstructed by state-of-the-art global ocean reanalyses. Differences between the various reanalyses are explored in terms of the effects of data assimilation, model physics and atmospheric forcing on properties of the sea ice cover, including concentration, thickness, velocity and snow. Amongst the 14 reanalyses studied here, 9 assimilate sea ice concentration, and none assimilate sea ice thickness data. The comparison reveals an overall agreement in the reconstructed concentration fields, mainly because of the constraints in surface temperature imposed by direct assimilation of ocean observations, prescribed or assimilated atmospheric forcing and assimilation of sea ice concentration. However, some spread still exists amongst the reanalyses, due to a variety of factors. In particular, a large spread in sea ice thickness is found within the ensemble of reanalyses, partially caused by the biases inherited from their sea ice model components. Biases are also affected by the assimilation of sea ice concentration and the treatment of sea ice thickness in the data assimilation process. An important outcome of this study is that the spatial distribution of ice volume varies widely between products, with no reanalysis standing out as clearly superior when compared to altimetry estimates. The ice thickness from systems without assimilation of sea ice concentration is not worse than that from systems constrained with sea ice observations. An evaluation of the sea ice velocity fields reveals that ice drifts too fast in most systems. 
As an ensemble, the ORA-IP reanalyses capture trends in Arctic sea ice area and extent relatively well. However, the ensemble cannot be used to get a robust estimate of recent trends in the Arctic sea ice volume. Biases in the reanalyses certainly impact the simulated air–sea fluxes in the polar regions, and call into question the suitability of current sea ice reanalyses to initialize seasonal forecasts.
Abstract:
Arctic flaw polynyas are considered to be highly productive areas for the formation of sea-ice throughout the winter season. Most estimates of sea-ice production are based on the surface energy balance equation and use global reanalyses as atmospheric forcing, which are too coarse to take into account the impact of polynyas on the atmosphere. Additional errors in the estimates of polynya ice production may result from the methods of calculating atmospheric energy fluxes and the assumption of a thin-ice distribution within polynyas. The present study uses simulations with the mesoscale weather prediction model of the Consortium for Small-scale Modelling (COSMO), where polynya area is prescribed from satellite data. The polynya area is either assumed to be ice-free or to be covered with thin ice of 10 cm. Simulations have been performed for two winter periods (2007/08 and 2008/09). When using a realistic thin-ice thickness of 10 cm, sea-ice production in Laptev polynyas amounts to 30 km3 and 73 km3 for the winters 2007/08 and 2008/09, respectively. The higher turbulent energy fluxes of open-water polynyas result in a 50-70% increase in sea-ice production (49 km3 in 2007/08 and 123 km3 in 2008/09). Our results suggest that previous studies have overestimated ice production in the Laptev Sea.
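The surface-energy-balance approach to ice production can be sketched in one line of physics: if all net heat lost from the polynya surface to the atmosphere goes into freezing, the ice volume produced is the time-integrated heat loss divided by the density of sea ice times the latent heat of fusion. The numbers below are round illustrative values, not results from the COSMO study.

```python
# Back-of-the-envelope sketch of polynya ice production from the
# surface energy balance: net heat loss to the atmosphere is assumed
# to go entirely into ice formation. Numbers are hypothetical.
RHO_ICE = 910.0      # sea-ice density, kg m^-3
L_FUSION = 3.34e5    # latent heat of fusion, J kg^-1

def ice_production_km3(net_heat_loss_w_m2, area_km2, days):
    """Ice volume (km^3) produced by a given heat loss over a polynya."""
    seconds = days * 86400.0
    area_m2 = area_km2 * 1e6
    volume_m3 = net_heat_loss_w_m2 * seconds * area_m2 / (RHO_ICE * L_FUSION)
    return volume_m3 / 1e9

# Example: 300 W m^-2 loss over a 2000 km^2 polynya for 30 days.
print(f"{ice_production_km3(300.0, 2000.0, 30):.1f} km^3")
```

Because thin ice insulates the surface, a 10 cm ice cover lowers the turbulent heat loss relative to open water, which is why the open-water assumption in the abstract inflates production by 50-70%.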
Abstract:
The polynyas of the Laptev Sea are regions of particular interest due to the strong formation of Arctic sea-ice. In order to simulate the polynya dynamics and to quantify ice production, we apply the Finite Element Sea-Ice Ocean Model FESOM. In previous simulations FESOM has been forced with daily atmospheric NCEP (National Centers for Environmental Prediction) reanalysis 1 data. For the periods 1 April to 9 May 2008 and 1 January to 8 February 2009 we examine the impact of different forcing data: daily and 6-hourly NCEP reanalyses 1 (1.875° x 1.875°), 6-hourly NCEP reanalyses 2 (1.875° x 1.875°), 6-hourly analyses from the GME (Global Model of the German Weather Service) (0.5° x 0.5°) and high-resolution hourly COSMO (Consortium for Small-Scale Modeling) data (5 km x 5 km). In all FESOM simulations, except for those with 6-hourly and daily NCEP 1 data, the openings and closings of polynyas are simulated in general agreement with satellite products. Over the fast-ice area the wind fields of all atmospheric data are similar and close to in situ measurements. Over the polynya areas, however, there are strong differences between the forcing data with respect to air temperature and turbulent heat flux. These differences have a strong impact on sea-ice production rates. Depending on the forcing fields polynya ice production ranges from 1.4 km3 to 7.8 km3 during 1 April to 9 May 2008 and from 25.7 km3 to 66.2 km3 during 1 January to 8 February 2009. Therefore, atmospheric forcing data with high spatial and temporal resolution which account for the presence of the polynyas are needed to reduce the uncertainty in quantifying ice production in polynyas.
Abstract:
The role of the local atmospheric forcing on the ocean mixed layer depth (MLD) over the global oceans is studied using ocean reanalysis data products and a single-column ocean model coupled to an atmospheric general circulation model. The focus of this study is on how the annual mean and the seasonal cycle of the MLD relate to various forcing characteristics in different parts of the world's ocean, and how anomalous variations in the monthly mean MLD relate to anomalous atmospheric forcings. By analysing both ocean reanalysis data and the single-column ocean model, regions with different dominant forcings and different mean and variability characteristics of the MLD can be identified. Many of the global oceans' MLD characteristics appear to be directly linked to different atmospheric forcing characteristics at different locations. Here, heating and wind-stress are identified as the main drivers; in some, mostly coastal, regions the atmospheric salinity forcing also contributes. The annual mean MLD is more closely related to the annual mean wind-stress, and the MLD seasonality is more closely related to the seasonality in heating. The single-column ocean model, however, also indicates that the MLD characteristics over most global ocean regions, and in particular the tropics and subtropics, cannot be maintained by local atmospheric forcings only, but are also a result of ocean dynamics that are not simulated in a single-column ocean model. Thus, lateral ocean dynamics are essential for correctly simulating the observed MLD.
Abstract:
The Arctic sea ice cover is thinning and retreating, causing changes in surface roughness that in turn modify the momentum flux from the atmosphere through the ice into the ocean. New model simulations with variable sea ice drag coefficients for both the air and water interface demonstrate that the heterogeneity in sea ice surface roughness significantly impacts the spatial distribution and trends of ocean surface stress over recent decades. Simulations with constant sea ice drag coefficients as used in most climate models show an increase in annual mean ocean surface stress (0.003 N/m2 per decade, 4.6%) due to the reduction of ice thickness leading to a weakening of the ice and accelerated ice drift. In contrast, with variable drag coefficients our simulations show that annual mean ocean surface stress declined at a rate of -0.002 N/m2 per decade (3.1%) over the period 1980-2013 because of a significant reduction in surface roughness associated with an increasingly thinner and younger sea ice cover. The effectiveness of sea ice in transferring momentum depends not only on its resistive strength against the wind forcing but also on its top and bottom surface roughness, which varies with ice type and ice conditions. This reveals the need to account for sea ice surface roughness variations in climate simulations in order to correctly represent the implications of sea ice loss under global warming.
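The dependence of ocean surface stress on the drag coefficient follows the standard quadratic drag law, stress proportional to the drag coefficient times the square of the ice-ocean relative speed. The sketch below contrasts a fixed coefficient with a hypothetical reduced one standing in for a smoother, younger ice cover; both coefficient values are illustrative, not those of the simulations.

```python
# Quadratic drag law for ice-ocean momentum transfer. The drag
# coefficient values are illustrative: a constant-coefficient case
# versus a hypothetical reduced value mimicking smoother, younger ice.
RHO_WATER = 1027.0   # sea-water density, kg m^-3

def ocean_surface_stress(c_d, u_rel):
    """Stress (N m^-2) from ice-ocean relative speed u_rel (m s^-1)."""
    return RHO_WATER * c_d * u_rel * u_rel

const_cd = 5.5e-3    # fixed ice-ocean drag coefficient (typical order)
smooth_cd = 4.0e-3   # hypothetical reduced drag for smoother ice
u = 0.1              # ice-ocean relative speed, m s^-1
print(ocean_surface_stress(const_cd, u), ocean_surface_stress(smooth_cd, u))
```

Even with faster ice drift (larger u_rel), a sufficiently reduced drag coefficient can lower the stress, which is the competition the abstract describes between accelerated drift and declining roughness.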