980 results for Modified k-epsilon model


Relevance:

30.00%

Publisher:

Abstract:

Radiative forcing and climate sensitivity have been widely used as concepts to understand climate change. This work performs climate change experiments with an intermediate general circulation model (IGCM) to examine the robustness of the radiative forcing concept for carbon dioxide and solar constant changes. The IGCM has been developed specifically as a computationally fast model that nevertheless allows an interaction between physical processes and large-scale dynamics, so that many long integrations can be performed relatively quickly. It employs a fast and accurate radiative transfer scheme, simple convection and surface schemes, and a slab ocean to model the effects of climate change mechanisms on atmospheric temperatures and dynamics with a reasonable degree of complexity. The climatology of the IGCM, run at T21 resolution with 22 levels, is compared to European Centre for Medium-Range Weather Forecasts (ECMWF) reanalysis data. The response of the model to changes in carbon dioxide and solar output is examined when these changes are applied globally and when constrained geographically (e.g. over land only). The CO2 experiments have a roughly 17% higher climate sensitivity than the solar experiments, and a forcing at high latitudes causes a 40% higher climate sensitivity than a forcing applied only at low latitudes. Nevertheless, despite differences in the model feedbacks, climate sensitivity is roughly constant over a range of distributions of CO2 and solar forcings. Hence, in the IGCM at least, the radiative forcing concept is capable of predicting global surface temperature changes to within 30% for the perturbations described here. It is concluded that radiative forcing remains a useful tool for assessing the natural and anthropogenic impact of climate change mechanisms on surface temperature.
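
As a rough illustration of the radiative forcing concept tested above, the sketch below computes the climate sensitivity parameter lambda = delta_T / F from one experiment and uses it to predict the response to another forcing. All numbers are hypothetical placeholders, not results from the IGCM experiments.

# Illustrative sketch of the radiative forcing concept: delta_T = lambda * F.
# The calibration and forcing values below are hypothetical.

def sensitivity(delta_T, forcing):
    """Climate sensitivity parameter lambda, in K per (W m-2)."""
    return delta_T / forcing

# Calibrate lambda from a reference experiment (e.g. doubled CO2).
lam = sensitivity(delta_T=2.5, forcing=3.7)    # ~0.68 K / (W m-2)

# Predict the surface temperature response to a different forcing,
# e.g. a solar constant change of a given global-mean magnitude.
F_solar = 1.0                                  # W m-2, hypothetical
print(f"predicted warming: {lam * F_solar:.2f} K")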

Relevance:

30.00%

Publisher:

Abstract:

Initial results are presented from a middle atmosphere extension to a version of the European Centre for Medium-Range Weather Forecasts (ECMWF) tropospheric model. The extended version of the model has been developed as part of the UK Universities Global Atmospheric Modelling Programme and extends from the ground to approximately 90 km. A comprehensive solar radiation scheme is included, which uses monthly averaged climatological ozone values, and a linearised infrared cooling scheme is employed. The basic climatology of the model is described; the parametrization of drag due to orographically forced gravity waves is shown to have a dramatic effect on simulations of the winter hemisphere.
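
A linearised infrared cooling scheme of the kind mentioned is commonly implemented as Newtonian relaxation towards a reference temperature profile. The following is a minimal sketch under that assumption; the timescale and temperatures are placeholders, not the values used in the model described.

# Minimal sketch of linearised (Newtonian) infrared cooling:
#   dT/dt = -(T - T_ref) / tau, applied level by level.
# tau and T_ref are hypothetical placeholders.

def ir_cooling_increment(T, T_ref, tau_days, dt_seconds):
    """Temperature increment (K) from linearised IR cooling over one step."""
    tau = tau_days * 86400.0
    return -(T - T_ref) / tau * dt_seconds

# A level 10 K warmer than its radiative reference, relaxed on a
# 5-day timescale over a 30-minute timestep.
print(f"{ir_cooling_increment(260.0, 250.0, 5.0, 1800.0):.4f} K")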

Relevance:

30.00%

Publisher:

Abstract:

Medium-range flood forecasting activities, driven by meteorological forecasts ranging from high-resolution deterministic products to lower spatial resolution ensemble prediction systems, share a major challenge: the appropriate choice and design of performance measures. In this paper, possible limitations of some traditional hydrological and meteorological prediction quality and verification measures are identified. Simple modifications are applied to circumvent the problem of autocorrelation dominating river discharge time series and to create a benchmark model that enables decision makers to evaluate both forecast quality and model quality. Although the performance period is quite short, the advantage of a simple cost-loss function as a measure of forecast quality can be demonstrated.
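
One way to sidestep the autocorrelation problem noted above is to score forecasts against a persistence benchmark rather than climatology. The sketch below assumes such a benchmark for illustration; it is not necessarily the modification adopted in the paper, and the data are invented.

import numpy as np

# Skill measured against a persistence benchmark, which is hard to beat
# for strongly autocorrelated discharge series.

def mse(pred, obs):
    return float(np.mean((np.asarray(pred) - np.asarray(obs)) ** 2))

def skill_vs_persistence(forecast, obs, lead):
    """1 = perfect; 0 = no better than persistence; < 0 = worse."""
    persistence = obs[:-lead]            # last known value carried forward
    target = obs[lead:]
    return 1.0 - mse(forecast[lead:], target) / mse(persistence, target)

obs = np.array([10.0, 11.0, 13.0, 18.0, 30.0, 26.0, 20.0])
fcst = np.array([10.0, 10.5, 12.0, 16.0, 27.0, 27.0, 21.0])
print(f"skill at lead 1: {skill_vs_persistence(fcst, obs, 1):.2f}")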

Relevance:

30.00%

Publisher:

Abstract:

The direct radiative forcing of 65 chlorofluorocarbons, hydrochlorofluorocarbons, hydrofluorocarbons, hydrofluoroethers, halons, iodoalkanes, chloroalkanes, bromoalkanes, perfluorocarbons and nonmethane hydrocarbons has been evaluated using a consistent set of infrared absorption cross sections. Radiative transfer calculations were performed for each gas using both line-by-line and random band model approaches. The line-by-line model was first validated against measurements taken by the Airborne Research Interferometer Evaluation System (ARIES) of the U.K. Meteorological Office; the computed spectrally integrated radiance agreed to within 2% with the experimental measurements. Three model atmospheres, derived from a three-dimensional climatology, were used in the radiative forcing calculations to represent more accurately the hemispheric differences in water vapor, ozone concentrations, and cloud cover. Instantaneous, clear-sky radiative forcing values calculated by the line-by-line and band models were in close agreement, and the band model values were subsequently modified to ensure exact agreement with the line-by-line values. Calibrated band model radiative forcing values, for atmospheric profiles with clouds and using stratospheric adjustment, are reported and compared with previous literature values. Fourteen of the 65 molecules have forcings that differ by more than 15% from those in the World Meteorological Organization [1999] compilation, and eleven of the molecules have not been reported previously. The 65-molecule data set reported here is the most comprehensive and consistent database yet available for evaluating the relative impact of halocarbons and hydrocarbons on climate change.
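
The calibration step described (scaling band-model forcings so that they match the line-by-line reference on a common case, then applying the factor to the full calculation) can be sketched as a per-gas scale factor. The numbers below are hypothetical placeholders, not values from the paper.

# Sketch of calibrating a band model against a line-by-line (LBL)
# reference: derive a per-gas factor on a clear-sky instantaneous case,
# then apply it to the cloudy, stratosphere-adjusted calculation.

def calibrated_forcing(band_clear, lbl_clear, band_cloudy_adjusted):
    factor = lbl_clear / band_clear       # per-gas calibration factor
    return band_cloudy_adjusted * factor

f = calibrated_forcing(band_clear=0.30, lbl_clear=0.28,
                       band_cloudy_adjusted=0.22)
print(f"calibrated forcing: {f:.3f} W m-2")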

Relevance:

30.00%

Publisher:

Abstract:

The semi-distributed, dynamic INCA-N model was used to simulate the behaviour of dissolved inorganic nitrogen (DIN) in two Finnish research catchments, and parameter sensitivity and model structural uncertainty were analysed using generalized sensitivity analysis. The Mustajoki catchment is a forested upstream catchment, while the Savijoki catchment represents intensively cultivated lowlands. In general, there were more influential parameters in Savijoki than in Mustajoki. Model results were sensitive to N-transformation rates, vegetation dynamics, and soil and river hydrology. Values of the sensitive parameters were based on long-term measurements covering both warm and cold years. The highest measured DIN concentrations fell between the minimum and maximum values estimated during the uncertainty analysis, but the lowest measured concentrations fell outside these bounds, suggesting that some retention processes may be missing from the current model structure. The lowest concentrations occurred mainly during low-flow periods, so the effects on total loads were small.
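
Generalized sensitivity analysis of the kind used here is often implemented in the Hornberger-Spear (regional sensitivity analysis) style: sample parameters, split runs into behavioural and non-behavioural against observations, and compare the two parameter distributions. A minimal sketch with a toy stand-in model follows; the model, criterion and parameter are assumptions for illustration only.

import numpy as np
from scipy.stats import ks_2samp

# Regional (generalized) sensitivity analysis sketch: an influential
# parameter shows clearly different distributions in behavioural vs
# non-behavioural runs. The toy model and criterion are invented.

rng = np.random.default_rng(42)
n_runs = 2000
rate = rng.uniform(0.0, 1.0, n_runs)       # e.g. an N-transformation rate

output = 5.0 * rate + rng.normal(0.0, 0.5, n_runs)   # stand-in model output
behavioural = np.abs(output - 2.5) < 1.0             # fit-to-data criterion

stat, p = ks_2samp(rate[behavioural], rate[~behavioural])
print(f"KS statistic: {stat:.2f}, p = {p:.3g} (large stat => influential)")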

Relevance:

30.00%

Publisher:

Abstract:

In this paper, ensembles of forecasts (of up to six hours) from a convection-permitting model with a representation of model error due to unresolved processes are studied. The ensemble prediction system (EPS) used is an experimental convection-permitting version of the UK Met Office’s 24-member Global and Regional Ensemble Prediction System (MOGREPS). The method of representing model error variability, which perturbs parameters within the model’s parameterisation schemes, has been modified, and we investigate the impact of applying this scheme in different ways: a control ensemble in which all members share the same parameter values; an ensemble in which the parameters differ between members but are fixed in time; and ensembles in which the parameters are updated randomly every 30 or 60 min. The choice of parameters and their ranges of variability were determined from expert opinion and parameter sensitivity tests. A case of frontal rain over the southern UK with a multi-banded rainfall structure has been chosen. The consequences of including model error variability in the case studied are mixed and are summarised as follows. The multiple banding evident in the radar is not captured by any single member, although in some members the single band is positioned where a secondary band is present in the radar; this is found for all ensembles studied. Adding model error variability with parameters fixed in time does increase the ensemble spread for near-surface variables such as wind and temperature, but can actually decrease the spread of the rainfall. Perturbing the parameters periodically throughout the forecast does not further increase the spread and exhibits “jumpiness” in the spread at the times when the parameters are perturbed. Adding model error variability improves forecast skill after the first 2–3 h of the forecast for near-surface temperature and relative humidity. For precipitation skill scores, adding model error variability improves the skill in the first 1–2 h of the forecast but reduces it thereafter. Complementary experiments were performed in which the only difference between members was the set of parameter values (i.e. no initial condition variability); the resulting spread was found to be significantly less than the spread from initial condition variability alone.
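
The parameter-perturbation approach described can be sketched as each member drawing its parameterisation parameters from prescribed ranges, with an optional periodic redraw. The parameter names, ranges and interval below are hypothetical, not the MOGREPS settings.

import random

# Sketch of random parameter variation across ensemble members:
# fixed-in-time draws by default, periodic redraws if requested.

PARAM_RANGES = {                      # hypothetical names and ranges
    "entrainment_factor": (0.5, 2.0),
    "roughness_factor": (0.8, 1.2),
}

def draw(rng):
    return {k: rng.uniform(lo, hi) for k, (lo, hi) in PARAM_RANGES.items()}

def run_member(member_id, redraw_every_min=None, length_min=360):
    rng = random.Random(member_id)    # distinct draws per member
    params = draw(rng)
    for t in range(0, length_min, 30):             # 30-min model steps
        if redraw_every_min and t and t % redraw_every_min == 0:
            params = draw(rng)                     # e.g. every 30 or 60 min
        # ... step the model forward using `params` ...
    return params

print(run_member(member_id=3, redraw_every_min=60))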

Relevance:

30.00%

Publisher:

Abstract:

Past climates provide a test of models’ ability to predict climate change. We present a comprehensive evaluation of state-of-the-art models against Last Glacial Maximum and mid-Holocene climates, using reconstructions of land and ocean climates and simulations from the Palaeoclimate Modelling and Coupled Model Intercomparison Projects. Newer models do not perform better than earlier versions despite higher resolution and complexity, and differences in climate sensitivity only weakly account for differences in model performance. In the glacial, models consistently underestimate land cooling (especially in winter) and overestimate ocean surface cooling (especially in the tropics). In the mid-Holocene, models generally underestimate the precipitation increase in the northern monsoon regions and overestimate summer warming in central Eurasia. Models generally capture large-scale gradients of climate change but have more limited ability to reproduce spatial patterns. Despite these common biases, some models perform better than others.

Relevance:

30.00%

Publisher:

Abstract:

A process-based fire regime model (SPITFIRE) has been developed, coupled with ecosystem dynamics in the LPJ Dynamic Global Vegetation Model, and used to explore fire regimes and the current impact of fire on the terrestrial carbon cycle and associated emissions of trace atmospheric constituents. The model estimates an average release of 2.24 Pg C yr⁻¹ as CO2 from biomass burning during the 1980s and 1990s. Comparison with observed active fire counts shows that the model reproduces where fire occurs and can mimic broad geographic patterns in the peak fire season, although the predicted peak is 1–2 months late in some regions. Modelled fire season length is generally overestimated by about one month, but shows a realistic pattern of differences among biomes. Comparisons with remotely sensed burnt-area products indicate that the model reproduces broad geographic patterns of annual fractional burnt area over most regions, including the boreal forest, although interannual variability in the boreal zone is underestimated.

Relevance:

30.00%

Publisher:

Abstract:

Cholesterol is one of the key constituents for maintaining the cellular membrane and thus the integrity of the cell itself. In contrast, high levels of cholesterol in the blood are known to be a major risk factor in the development of cardiovascular disease. We formulate a deterministic nonlinear ordinary differential equation model of the sterol regulatory element binding protein 2 (SREBP-2) cholesterol genetic regulatory pathway in a hepatocyte. The mathematical model describes genetic transcription by SREBP-2, with the resulting mRNA translated to form 3-hydroxy-3-methylglutaryl coenzyme A reductase (HMGCR), a key enzyme in cholesterol synthesis. Cholesterol synthesis in turn regulates SREBP-2 via a negative feedback formulation. Parameterised with data from the literature, the model is used to understand how SREBP-2 transcription and regulation affect cellular cholesterol concentration. Model stability analysis shows that the only positive steady state of the system exhibits purely oscillatory, damped oscillatory or monotonic behaviour under certain parameter conditions. In light of our findings, we postulate how cholesterol homeostasis is maintained within the cell, and the advantages of our model formulation are discussed with respect to other models of genetic regulation in the literature.
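
The negative-feedback structure described can be captured by a small ODE system in which cholesterol represses SREBP-2 activity. The sketch below is illustrative only; the equations and rate constants are not those of the paper's model.

import numpy as np
from scipy.integrate import solve_ivp

# Toy negative-feedback loop: SREBP-2 activity s drives mRNA m, mRNA
# drives cholesterol c (via HMGCR), and cholesterol represses s.
# All rate constants are invented.

def rhs(t, y):
    s, m, c = y
    ds = 1.0 / (1.0 + c**2) - 0.3 * s    # repression by cholesterol
    dm = 0.8 * s - 0.4 * m               # transcription / mRNA decay
    dc = 0.5 * m - 0.2 * c               # synthesis / turnover
    return [ds, dm, dc]

sol = solve_ivp(rhs, (0.0, 100.0), [0.1, 0.0, 0.0])
print("approach to steady state:", np.round(sol.y[:, -1], 3))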

Relevance:

30.00%

Publisher:

Abstract:

The Plaut, McClelland, Seidenberg and Patterson (1996) connectionist model of reading was evaluated at two points early in its training against reading data collected from British children on two occasions during their first year of literacy instruction. First, the network’s non-word reading was poor relative to its word reading when compared with the children. Second, the network made more non-lexical than lexical errors, the opposite pattern to the children. Three adaptations were made to the training of the network to bring it closer to the learning environment of a child: an incremental training regime was adopted; the network was trained on grapheme–phoneme correspondences; and a training corpus based on words found in children’s early reading materials was used. The modifications caused a sharp improvement in non-word reading relative to word reading, resulting in a near-perfect match to the children’s data on this measure. The modified network, however, continued to make predominantly non-lexical errors, although evidence from a small-scale implementation of the full triangle framework suggests that this limitation stems from the lack of a semantic pathway. Taken together, these results suggest that, when properly trained, connectionist models of word reading can offer insights into key aspects of reading development in children.
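
An incremental training regime of the sort adopted can be sketched as staged, cumulative exposure to a growing vocabulary. The staging and the training stub below are hypothetical, not the actual procedure used in the paper.

# Sketch of incremental training: the corpus grows in stages, mimicking
# a child's expanding reading vocabulary. `train_step` is a stub.

def incremental_training(corpus_by_stage, epochs_per_stage, train_step):
    seen = []
    for words in corpus_by_stage:
        seen.extend(words)                # vocabulary grows cumulatively
        for _ in range(epochs_per_stage):
            for word in seen:
                train_step(word)          # e.g. one weight update

stages = [["cat", "dog", "sun"], ["ship", "rain"], ["yacht", "choir"]]
incremental_training(stages, epochs_per_stage=2, train_step=lambda w: None)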

Relevance:

30.00%

Publisher:

Abstract:

We present a model of the dust cycle that successfully predicts dust emissions as determined by land surface properties, monthly vegetation and snow cover, and 6-hourly surface wind speeds for the years 1982–1993. The model takes account of the role of dry lake beds as preferential source areas for dust emission, with the occurrence of these preferential sources determined by a water routing and storage model. The dust source scheme also explicitly takes into account the role of vegetation type as well as monthly vegetation cover. Dust transport is computed using assimilated winds for the years 1987–1990. Deposition of dust occurs through dry and wet deposition, where subcloud scavenging is calculated using assimilated precipitation fields. Comparison of simulated patterns of atmospheric dust loading with the Total Ozone Mapping Spectrometer satellite absorbing aerosol index shows that the model produces realistic results on daily to interannual timescales. The magnitude of dust deposition agrees well with sediment flux data from marine sites. Emission of submicron dust from preferential source areas is required for the computation of a realistic dust optical thickness. Sensitivity studies show that Asian dust source strengths are particularly sensitive to the seasonality of vegetation cover.
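
Dust emission schemes of this general kind typically switch on above a threshold friction velocity, with a flux growing roughly as the cube of the wind (after White, 1979). The following is a simplified sketch, not the paper's actual source scheme; the constant, threshold and bare-soil fraction are placeholders.

# Simplified threshold dust-emission flux: zero below the threshold
# friction velocity, roughly cubic growth above it. Values are invented.

def dust_flux(u_star, u_thresh=0.35, C=1.0e-4, bare_fraction=1.0):
    """Dust flux (arbitrary units) for friction velocity u_star (m/s)."""
    if u_star <= u_thresh:
        return 0.0
    return C * bare_fraction * u_star**3 * (1.0 - (u_thresh / u_star) ** 2)

for u in (0.2, 0.4, 0.6):
    print(f"u* = {u:.1f} m/s -> flux = {dust_flux(u):.2e}")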

Relevance:

30.00%

Publisher:

Abstract:

We describe Global Atmosphere 4.0 (GA4.0) and Global Land 4.0 (GL4.0): configurations of the Met Office Unified Model and JULES (Joint UK Land Environment Simulator) community land surface model developed for use in global and regional climate research and weather prediction activities. GA4.0 and GL4.0 are based on the previous GA3.0 and GL3.0 configurations, with the inclusion of developments made by the Met Office and its collaborators during the annual development cycle. This paper provides a comprehensive technical and scientific description of GA4.0 and GL4.0, as well as details of how they differ from their predecessors, and presents the results of some initial evaluations of their performance. Overall, performance is comparable with that of GA3.0/GL3.0; however, the updated configurations include scientific improvements to several parametrisation schemes and will form a baseline for further ongoing development.

Relevance:

30.00%

Publisher:

Abstract:

Mineral dust aerosols in the atmosphere have the potential to affect the global climate by influencing the radiative balance of the atmosphere and the supply of micronutrients to the ocean. Ice and marine sediment cores indicate that dust deposition from the atmosphere was at some locations 2–20 times greater during glacial periods, raising the possibility that mineral aerosols might have contributed to climate change on glacial–interglacial time scales. To address this question, we have used linked terrestrial biosphere, dust source, and atmospheric transport models to simulate the dust cycle in the atmosphere for the current and last glacial maximum (LGM) climates. We obtain a 2.5-fold higher dust loading in the entire atmosphere and a twenty-fold higher loading at high latitudes at the LGM relative to the present. Comparisons with a compilation of atmospheric dust deposition flux estimates for the LGM and present in marine sediment and ice cores show that the simulated flux ratios are broadly in agreement with observations; the differences suggest where further improvements to this relatively simple dust model could be made. The simulated increase in high-latitude dustiness depends on the expansion of unvegetated areas, especially at high latitudes and in central Asia, caused by a combination of increased aridity and low atmospheric [CO2]. The existence of these dust source areas at the LGM is supported by pollen data and loess distribution in the northern continents. These results point to a role for vegetation feedbacks, including climate effects and physiological effects of low [CO2], in modulating the atmospheric distribution of dust.

Relevance:

30.00%

Publisher:

Abstract:

To examine metacognitive accuracy (i.e., the relationship between metacognitive judgment and memory performance), researchers often rely on by-participant analysis, in which metacognitive accuracy (e.g., resolution, as measured by the gamma coefficient or signal detection measures) is computed for each participant and the computed values are entered into group-level statistical tests such as the t-test. In the current work, we argue that by-participant analysis, regardless of the accuracy measure used, produces a substantial inflation of Type-1 error rates when a random item effect is present. A mixed-effects model is proposed as a way to address this issue effectively, and our simulation studies of Type-1 error rates indeed showed the superior performance of mixed-effects model analysis compared with the conventional by-participant analysis. We also present applications to real data that illustrate further strengths of mixed-effects model analysis. Our findings imply that caution is needed when using by-participant analysis, and we recommend mixed-effects model analysis instead.
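
The inflation argument can be demonstrated with a short simulation: generate data with a shared random item effect but no participant-level resolution, compute a per-participant association (a simple correlation is used here as a stand-in for gamma), and t-test it against zero. The data-generating settings are illustrative assumptions, not the paper's simulation design.

import numpy as np
from scipy.stats import ttest_1samp

# With a random item effect loading on both judgments and accuracy, the
# by-participant test rejects far more often than the nominal 5%.

rng = np.random.default_rng(1)
n_sims, n_subj, n_items = 500, 30, 40
false_pos = 0

for _ in range(n_sims):
    item = rng.normal(0, 1, n_items)              # shared item effect
    jol = item + rng.normal(0, 1, (n_subj, n_items))   # judgments
    acc = item + rng.normal(0, 1, (n_subj, n_items))   # memory performance
    r = [np.corrcoef(jol[s], acc[s])[0, 1] for s in range(n_subj)]
    if ttest_1samp(r, 0.0).pvalue < 0.05:
        false_pos += 1

print(f"rejection rate: {false_pos / n_sims:.2f} (nominal 0.05)")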