86 results for 2447: modelling and forecasting


Abstract:

The Martian solsticial pause, presented in a companion paper (Lewis et al., this issue), was investigated further through a series of model runs using the UK version of the LMD/UK Mars Global Climate Model. It was found that the pause could not be adequately reproduced if radiatively active water ice clouds were omitted from the model. When clouds were used, along with a realistic time-dependent dust opacity distribution, a substantial minimum in near-surface transient eddy activity formed around solstice in both hemispheres. The net effect of the clouds in the model is, by altering the thermal structure of the atmosphere, to decrease the vertical shear of the westerly jet near the surface around solstice, and thus to reduce baroclinic growth rates. A similar effect was seen under conditions of large dust loading, implying that northern midlatitude eddy activity will tend to become suppressed after a period of intense flushing storm formation around the northern cap edge. Suppression of baroclinic eddy generation by the barotropic component of the flow, and suppression via diabatic eddy dissipation, were also investigated as possible mechanisms leading to the formation of the solsticial pause, but were found not to make major contributions. Zonal variations in topography were found to be important, as their presence results in weakened transient eddies around winter solstice in both hemispheres, through modification of the near-surface flow. The zonal topographic asymmetry appears to be the primary reason for the weakness of eddy activity in the southern hemisphere relative to the northern hemisphere, and the ultimate cause of the solsticial pause in both hemispheres. The meridional topographic gradient was found to exert a much weaker influence on near-surface transient eddies.
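
The link drawn here between reduced low-level vertical shear and reduced baroclinic growth rates can be pictured with the textbook Eady growth-rate estimate; this is a standard scaling used for illustration only, not an equation taken from the paper:

```latex
% Eady maximum growth rate (textbook estimate, shown for illustration):
\sigma_{\max} \;\approx\; 0.31\,\frac{f}{N}\,\left|\frac{\partial U}{\partial z}\right|
```

Here f is the Coriolis parameter, N the buoyancy frequency and ∂U/∂z the vertical shear of the zonal wind, so any process (radiatively active water ice clouds, or heavy dust loading) that weakens the near-surface shear lowers the growth rate directly.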

Abstract:

In situ high-resolution aircraft measurements of cloud microphysical properties were made in coordination with ground-based remote sensing observations, using radar and lidar, of a line of small cumulus clouds, as part of the Aerosol Properties, PRocesses And InfluenceS on the Earth's climate (APPRAISE) project. A narrow but extensive line (~100 km long) of shallow convective clouds over the southern UK was studied. Cloud top temperatures were observed to be higher than −8 °C, but the clouds were seen to consist of supercooled droplets and varying concentrations of ice particles. No ice particles were observed to be falling into the cloud tops from above. Current parameterisations of ice nuclei (IN) numbers predict that too few particles will be active as ice nuclei to account for the ice particle concentrations at the observed near-cloud-top temperatures (−7.5 °C). The role of mineral dust particles, consistent with concentrations observed near the surface, acting as high-temperature IN is considered important in this case. It was found that very high concentrations of ice particles (up to 100 L−1) could be produced by secondary ice particle production, provided the observed small amount of primary ice (about 0.01 L−1) was present to initiate it. This emphasises the need to understand primary ice formation in slightly supercooled clouds. It is shown using simple calculations that the Hallett-Mossop (HM) process is the likely source of the secondary ice. Model simulations of the case study were performed with the Aerosol Cloud and Precipitation Interactions Model (ACPIM). These parcel model investigations confirmed the HM process to be a very important mechanism for producing the observed high ice concentrations. A key step in generating the high concentrations was the collision and coalescence of rain drops, which, once formed, fell rapidly through the cloud, collecting ice particles that caused them to freeze and rapidly form large rimed particles. The broadening of the droplet size distribution by collision-coalescence was therefore a vital step in this process, as it was required to generate the large number of ice crystals observed in the time available. Simulations were also performed with the WRF (Weather Research and Forecasting) model. The results showed that while HM does act to increase the mass and number concentration of ice particles in these model simulations, it was not found to be critical for the formation of precipitation. However, the WRF simulations produced a cloud top that was too cold, and this, combined with the assumption of continual replenishing of ice nuclei removed by ice crystal formation, resulted in too many ice crystals forming by primary nucleation compared to the observations and parcel modelling.
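
As a rough illustration of why primary nucleation alone cannot explain the observations, the sketch below compares a classical temperature-dependent ice-nuclei fit (Fletcher 1962, used here purely as an example; the study's own calculations may rely on different schemes) with the primary (~0.01 L−1) and total (up to 100 L−1) ice concentrations quoted above.

```python
import math

def fletcher_1962_in_per_litre(temp_c):
    """Classic Fletcher (1962) ice-nuclei fit, N = N0 * exp(a * supercooling),
    with N0 ~ 1e-5 per litre and a ~ 0.6 per kelvin. Illustrative only."""
    n0 = 1.0e-5   # L^-1
    a = 0.6       # K^-1
    return n0 * math.exp(a * (0.0 - temp_c))

cloud_top_temp_c = -7.5        # observed near-cloud-top temperature
predicted_in = fletcher_1962_in_per_litre(cloud_top_temp_c)
observed_primary_ice = 0.01    # L^-1, reported in the abstract
observed_total_ice = 100.0     # L^-1, reported in the abstract

print(f"Fletcher IN prediction at {cloud_top_temp_c} C: {predicted_in:.2e} per litre")
print(f"Observed primary ice: {observed_primary_ice} per litre "
      f"({observed_primary_ice / predicted_in:.0f}x the IN prediction)")
print(f"Observed total ice:   {observed_total_ice} per litre "
      f"({observed_total_ice / observed_primary_ice:.0f}x the primary ice)")
```

The four-orders-of-magnitude gap between the last two numbers is the multiplication attributed in the study to Hallett-Mossop rime splintering.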

Abstract:

To bridge the gaps between traditional mesoscale modelling and microscale modelling, the National Center for Atmospheric Research, in collaboration with other agencies and research groups, has developed an integrated urban modelling system coupled to the Weather Research and Forecasting (WRF) model as a community tool to address urban environmental issues. The core of this WRF/urban modelling system consists of the following: (1) three methods with different degrees of freedom to parameterize urban surface processes, ranging from a simple bulk parameterization to a sophisticated multi-layer urban canopy model with an indoor–outdoor exchange sub-model that directly interacts with the atmospheric boundary layer; (2) coupling to fine-scale computational fluid dynamics models, both Reynolds-averaged Navier–Stokes and large-eddy simulation, for transport and dispersion (T&D) applications; (3) procedures to incorporate high-resolution urban land use, building morphology, and anthropogenic heating data using the National Urban Database and Access Portal Tool (NUDAPT); and (4) an urbanized high-resolution land data assimilation system. This paper provides an overview of this modelling system; addresses the daunting challenges of initializing the coupled WRF/urban model and of specifying the potentially vast number of parameters required to execute the WRF/urban model; explores the model sensitivity to these urban parameters; and evaluates the ability of WRF/urban to capture urban heat islands, complex boundary-layer structures aloft, and urban plume T&D for several major metropolitan regions. Recent applications of this modelling system illustrate its promising utility, as a regional climate-modelling tool, to investigate impacts of future urbanization on regional meteorological conditions and on air quality under future climate change scenarios.
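
To indicate what the simplest of the three urban options amounts to, the sketch below evaluates a bulk urban surface energy balance of the form Q* + Q_F = Q_H + Q_E + ΔQ_S (standard urban-climate notation). This is not code from the WRF/urban system itself, and all parameter and forcing values are invented for illustration.

```python
def bulk_urban_energy_balance(sw_down, lw_down, lw_up, albedo=0.15,
                              anthropogenic_heat=50.0, storage_fraction=0.4,
                              bowen_ratio=4.0):
    """Very simple bulk urban surface energy balance (illustrative only).

    Q* + Q_F = Q_H + Q_E + dQ_S, with a fixed storage fraction and Bowen
    ratio standing in for the far more detailed physics of an urban
    canopy model. All fluxes in W m^-2.
    """
    q_star = (1.0 - albedo) * sw_down + lw_down - lw_up   # net all-wave radiation
    available = q_star + anthropogenic_heat               # add anthropogenic heat Q_F
    dq_s = storage_fraction * available                   # heat stored in the urban fabric
    turbulent = available - dq_s                          # Q_H + Q_E
    q_h = turbulent * bowen_ratio / (1.0 + bowen_ratio)   # sensible heat flux
    q_e = turbulent - q_h                                 # latent heat flux
    return {"Q*": q_star, "Q_F": anthropogenic_heat,
            "dQ_S": dq_s, "Q_H": q_h, "Q_E": q_e}

# Midday example with assumed (hypothetical) forcing values:
print(bulk_urban_energy_balance(sw_down=700.0, lw_down=350.0, lw_up=450.0))
```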

Abstract:

Floods are the most frequent of natural disasters, affecting millions of people across the globe every year. The anticipation and forecasting of floods at the global scale is crucial to preparing for severe events and to providing early awareness where local flood models and warning services may not exist. As numerical weather prediction models continue to improve, operational centres are increasingly using their meteorological output to drive hydrological models, creating hydrometeorological systems capable of forecasting river flow and flood events at much longer lead times than was previously possible. Furthermore, recent developments in, for example, modelling capabilities, data and resources have made it possible to produce global-scale flood forecasting systems. In this paper, the current state of operational large-scale flood forecasting is discussed, including probabilistic forecasting of floods using ensemble prediction systems. Six state-of-the-art operational large-scale flood forecasting systems are reviewed, describing similarities and differences in their approaches to forecasting floods at the global and continental scale. Currently, operational systems have the capability to produce coarse-scale discharge forecasts in the medium range and to disseminate forecasts and, in some cases, early warning products in real time across the globe, in support of national forecasting capabilities. With improvements in seasonal weather forecasting, future advances may include more seamless hydrological forecasting at the global scale, alongside a move towards multi-model forecasts and grand ensemble techniques, responding to the requirement of developing multi-hazard early warning systems for disaster risk reduction.
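
A minimal sketch of the probabilistic element mentioned above: given an ensemble of forecast discharges for one river point and lead time, the exceedance probability of a warning threshold is simply the fraction of members above it. The member values and threshold below are invented for illustration, not taken from any of the reviewed systems.

```python
def exceedance_probability(ensemble_discharge, threshold):
    """Fraction of ensemble members exceeding a discharge threshold (m^3/s)."""
    exceeding = sum(1 for q in ensemble_discharge if q > threshold)
    return exceeding / len(ensemble_discharge)

# Hypothetical 11-member forecast for one grid cell at a single lead time:
members = [820, 950, 1010, 1100, 1180, 1230, 1300, 1370, 1450, 1600, 1750]
warning_threshold = 1200.0   # e.g. an estimated bankfull discharge

p = exceedance_probability(members, warning_threshold)
print(f"Probability of exceeding {warning_threshold:.0f} m3/s: {p:.0%}")
```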

Abstract:

Although climate models have been improving in accuracy and efficiency over the past few decades, it now seems that these incremental improvements may be slowing. As tera/petascale computing becomes massively parallel, our legacy codes are less suitable, and even with the increased resolution that we are now beginning to use, these models cannot represent the multiscale nature of the climate system. This paper argues that it may be time to reconsider the use of adaptive mesh refinement for weather and climate forecasting in order to achieve good scaling and representation of the wide range of spatial scales in the atmosphere and ocean. Furthermore, the challenge of introducing living organisms and human responses into climate system models is only just beginning to be tackled. We do not yet have a clear framework in which to approach the problem, but it is likely to cover such a huge number of different scales and processes that radically different methods may have to be considered. The challenges of multiscale modelling and petascale computing provide an opportunity to consider a fresh approach to numerical modelling of the climate (or Earth) system, one which takes advantage of computational fluid dynamics developments in other fields and brings new perspectives on how to incorporate Earth system processes. This paper reviews some of the current issues in climate (and, by implication, Earth) system modelling, and asks whether a new generation of models is needed to tackle these problems.
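
To make the adaptive mesh refinement idea concrete, the sketch below applies a gradient-based refinement flag to a 1-D field: cells where the local gradient exceeds a tolerance are marked and split. This is a generic illustration, not any particular atmosphere or ocean dynamical core, and the test field is invented.

```python
import numpy as np

def flag_cells_for_refinement(x, field, tol):
    """Mark grid points whose absolute local gradient exceeds `tol` (1-D, illustrative)."""
    grad = np.abs(np.gradient(field, x))
    return grad > tol

def refine(x, flags):
    """Insert a midpoint after every flagged point (one refinement pass)."""
    new_x = list(x)
    for i in np.flatnonzero(flags[:-1]):
        new_x.append(0.5 * (x[i] + x[i + 1]))
    return np.sort(np.array(new_x))

# A front-like feature on a coarse uniform grid (illustrative data):
x = np.linspace(0.0, 1.0, 21)
field = np.tanh((x - 0.5) / 0.02)
flags = flag_cells_for_refinement(x, field, tol=5.0)
x_refined = refine(x, flags)
print(f"{len(x)} points before refinement, {len(x_refined)} after")
```

Resolution is added only where the solution varies sharply, which is the scaling argument the paper makes for multiscale atmosphere and ocean flows.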

Abstract:

This paper describes the development and first results of the “Community Integrated Assessment System” (CIAS), a unique multi-institutional, modular and flexible integrated assessment (IA) system for modelling climate change. Key to this development is the supporting software infrastructure, SoftIAM. Through it, CIAS is distributed across the community of institutions, each of which has contributed modules to the CIAS system. At the heart of SoftIAM is the Bespoke Framework Generator (BFG), which enables flexibility in the assembly and composition of individual modules from a pool to form coupled models within CIAS, and flexibility in their deployment onto the available software and hardware resources. Such flexibility greatly enhances modellers’ ability to re-configure the CIAS coupled models to answer different questions, thus tracking evolving policy needs. It also allows rigorous testing of the robustness of IA modelling results to the use of different component modules representing the same processes (for example, the economy). Such processes are often modelled in very different ways, using different paradigms, at the participating institutions. An illustrative application to the study of the relationship between the economy and the Earth’s climate system is provided.
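
The kind of flexibility described, swapping alternative modules that represent the same process, can be pictured as below. This is a hypothetical sketch of the idea only; it does not reproduce the actual BFG or SoftIAM interfaces, and both "economy" modules and the toy "climate" component are invented.

```python
from typing import Callable, Dict

# Two interchangeable "economy" modules exposing the same interface:
# emissions(year) -> GtC per year. Both are purely illustrative.
def simple_growth_economy(year: int) -> float:
    return 8.0 * (1.02 ** (year - 2000))

def stabilising_economy(year: int) -> float:
    return 8.0 if year < 2030 else 8.0 * (0.99 ** (year - 2030))

def run_coupled_model(economy: Callable[[int], float], years) -> Dict[int, float]:
    """Toy 'climate' component: accumulate emissions into a CO2-like stock."""
    stock = 850.0   # arbitrary initial atmospheric carbon stock, GtC
    trajectory = {}
    for year in years:
        stock += economy(year)          # couple the economy output into the climate state
        trajectory[year] = stock
    return trajectory

# The same 'climate' module runs unchanged with either economy module:
for module in (simple_growth_economy, stabilising_economy):
    out = run_coupled_model(module, range(2000, 2101, 25))
    print(module.__name__, {y: round(v) for y, v in out.items()})
```

Because both modules satisfy the same interface, the coupled model can be re-configured by swapping one name, which is the point of testing the robustness of results to the choice of component module.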

Abstract:

The unsaturated zone exerts a major control on the delivery of nutrients to Chalk streams, yet flow and transport processes in this complex, dual-porosity medium have remained controversial. A major challenge arises in characterising these processes, both at the detailed mechanistic level and at an appropriate level for inclusion within catchment-scale models for nutrient management. The Lowland Catchment Research (LOCAR) programme in the UK has provided a unique set of comprehensively instrumented groundwater-dominated catchments. Of these, the Pang and Lambourn, tributaries of the Thames near Reading, have been a particular focus for research into subsurface processes and surface water-groundwater interactions. Data from LOCAR and other sources, along with a new dual-permeability numerical model of the Chalk, have been used to explore the relative roles of matrix and fracture flow within the unsaturated zone and to resolve conflicting hypotheses of response. From the improved understanding gained through these explorations, a parsimonious conceptualisation of the general response of flow and transport within the Chalk unsaturated zone was formulated. This paper summarises the modelling and data findings of these explorations, and describes the integration of the new simplified unsaturated zone representation with a catchment-scale model of nutrients (INCA), resulting in a new model for catchment-scale flow and transport within Chalk systems: INCA-Chalk. This model is applied to the Lambourn, and results, including hindcast and forecast simulations, are presented. These clearly illustrate the decadal time-scales that need to be considered in the context of nutrient management and the EU Water Framework Directive.
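
A parsimonious dual-permeability idea of the sort described, a slow matrix store and a fast fracture store exchanging water, can be sketched as two linear reservoirs. The formulation and parameter values below are illustrative assumptions, not the actual INCA-Chalk equations or calibrated Chalk parameters.

```python
def dual_permeability_step(matrix, fracture, recharge, dt=1.0,
                           k_matrix=0.002, k_fracture=0.2, exchange=0.01,
                           fracture_fraction=0.1):
    """One explicit daily time step of a two-store matrix/fracture model.

    Recharge is split between the stores, drainage from each store is linear
    in storage, and a linear exchange term transfers water between the stores
    in proportion to their storage difference. All values illustrative.
    """
    to_fracture = fracture_fraction * recharge
    to_matrix = recharge - to_fracture
    xfer = exchange * (fracture - matrix) * dt        # exchange, proportional to storage difference
    drain_m = k_matrix * matrix * dt                  # slow matrix drainage
    drain_f = k_fracture * fracture * dt              # fast fracture drainage
    matrix_new = matrix + to_matrix * dt + xfer - drain_m
    fracture_new = fracture + to_fracture * dt - xfer - drain_f
    return matrix_new, fracture_new, drain_m + drain_f

# Drain the stores for a year after a single 50 mm recharge pulse:
m, f = 500.0, 5.0     # initial storages, mm
outflow = []
for day in range(365):
    rain = 50.0 if day == 0 else 0.0
    m, f, q = dual_permeability_step(m, f, rain)
    outflow.append(q)
print(f"day 1 outflow {outflow[0]:.2f} mm, day 365 outflow {outflow[-1]:.2f} mm")
```

The contrast between the rapid fracture response and the slowly draining matrix store is what gives rise to the multi-year to decadal memory emphasised in the abstract.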

Abstract:

Despite the success of studies attempting to integrate remotely sensed data and flood modelling, and despite the need to provide near-real-time data routinely on a global scale and to set up online data archives, there is to date a lack of spatially and temporally distributed hydraulic parameters to support ongoing modelling efforts. Therefore, the objective of this project is to provide a global evaluation and benchmark data set of floodplain water stages, with uncertainties, and assimilation in a large-scale flood model using space-borne radar imagery. An algorithm is developed for the automated retrieval of water stages with uncertainties from a sequence of radar imagery, and data are assimilated in a flood model using the Tewkesbury 2007 flood event as a feasibility study. The retrieval method that we employ is based on possibility theory, an extension of fuzzy set theory that encompasses probability theory. In our case we first attempt to identify the main sources of uncertainty in the retrieval of water stages from radar imagery, for which we define physically meaningful ranges of parameter values. Possibilities of values are then computed for each parameter using a triangular ‘membership’ function. This procedure allows the computation of possible values of water stages at maximum flood extents along a river at many different locations. At a later stage in the project these data are then used in the assimilation, calibration or validation of a flood model. The application is subsequently extended to a global scale using wide-swath radar imagery and a simple global flood forecasting model, thereby providing improved river discharge estimates to update the latter.
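
The triangular 'membership' step described above can be illustrated directly: for each parameter with a physically meaningful range, the possibility is 1 at the most plausible value and falls linearly to 0 at the bounds. The parameter names, ranges and combination rule below are invented for illustration and are not the study's actual retrieval parameters.

```python
def triangular_possibility(value, lower, mode, upper):
    """Possibility of `value` under a triangular membership function
    defined by a physically meaningful range [lower, upper] and a mode."""
    if value <= lower or value >= upper:
        return 0.0
    if value <= mode:
        return (value - lower) / (mode - lower)
    return (upper - value) / (upper - mode)

# Hypothetical uncertain parameters in a water-stage retrieval:
dem_error = triangular_possibility(0.3, lower=0.0, mode=0.5, upper=1.5)           # m
backscatter_thresh = triangular_possibility(-11.0, lower=-14.0, mode=-10.0, upper=-6.0)  # dB

# A conservative combined possibility takes the minimum over parameters:
combined = min(dem_error, backscatter_thresh)
print(f"possibilities: DEM {dem_error:.2f}, threshold {backscatter_thresh:.2f}, "
      f"combined {combined:.2f}")
```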

Abstract:

We survey the literature on spatial bio-economic and land-use modelling and assess its thematic development. Unobserved site-specific heterogeneity is a feature of almost all the surveyed works, and this feature, it seems, has stimulated significant methodological innovation. In an attempt to improve the suitability with which the prototype incorporates heterogeneity, we consider modelling alternatives and extensions. We discuss solutions and conjecture others.

Abstract:

Analysis of X-ray powder data for the melt-crystallisable aromatic poly(thioether thioether ketone) [-S-Ar-S-Ar-CO-Ar]n ('PTTK', Ar = 1,4-phenylene) reveals that it adopts a crystal structure very different from that established for its ether analogue PEEK. Molecular modelling and diffraction-simulation studies of PTTK show that the structure of this polymer is analogous to that of melt-crystallised poly(thioether ketone) [-S-Ar-CO-Ar]n, in which the carbonyl linkages in symmetry-related chains are aligned anti-parallel to one another, and that these bridging units are crystallographically interchangeable. The final model for the crystal structure of PTTK is thus disordered, in the monoclinic space group I21a (two chains per unit cell), with cell dimensions a = 7.83 Å, b = 6.06 Å, c = 10.35 Å, β = 93.47°.
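
For orientation, the monoclinic unit-cell volume follows directly from the quoted dimensions; this is a simple arithmetic check, not a value quoted in the abstract:

```latex
% Monoclinic unit-cell volume from the quoted dimensions (illustrative check):
V = a\,b\,c\,\sin\beta
  = 7.83 \times 6.06 \times 10.35 \times \sin 93.47^{\circ}\ \text{\AA}^3
  \approx 490\ \text{\AA}^3
```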

Abstract:

Crumpets are made by heating fermented batter on a hot plate at around 230 °C. The characteristic structure dominated by vertical pores develops rapidly: structure has developed throughout around 75% of the product height within 30 s, which is far faster than might be expected from transient heat conduction through the batter. Cooking is complete within around 3 min. Image analysis based on results from X-ray tomography shows that the voidage fraction is approximately constant and that there is continual coalescence between the larger pores throughout the product, although there is also a steady level of small bubbles trapped within the solidified batter. We report here experimental studies which shed light on some of the mechanisms responsible for this structure, together with some models of key phenomena. Three aspects are discussed here: the role of gas (carbon dioxide and nitrogen) nuclei in initiating structure development; convective heat transfer inside the developing pores; and the kinetics of setting the batter into an elastic solid structure. It is shown conclusively that the small bubbles of carbon dioxide resulting from the fermentation stage play a crucial role as nuclei for pore development: without these nuclei, the result is not a porous structure, but rather a solid, elastic, inedible, gelatinized product. These nuclei are also responsible for the tiny bubbles which are set in the final product. The nuclei form the source of the dominant pore structure, which is largely driven by the, initially explosive, release of water vapour from the batter together with the desorption of dissolved carbon dioxide. It is argued that the rapid evaporation, transport and condensation of steam within the growing pores provides an important mechanism, as in a heat pipe, for rapid heat transfer, and models for this process are developed and tested. The setting of the continuous batter phase is essential for final product quality: studies using differential scanning calorimetry and of the kinetics of change in the visco-elastic properties of the batter suggest that this process is driven by the kinetics of gelatinization. Unlike many thermally driven food processes, the rates of heating are such that gelatinization kinetics cannot be neglected. The implications of these results for modelling and for the development of novel structures are discussed.
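
The claim that structure develops "far faster than might be expected from transient heat conduction" can be checked with a back-of-envelope conduction timescale, t ≈ L²/α. The thermal diffusivity and product height below are assumed order-of-magnitude values, not measurements from the study.

```python
# Order-of-magnitude check of the conduction timescale t ~ L^2 / alpha.
alpha = 1.4e-7                  # m^2/s, assumed thermal diffusivity of a wet batter (water-like)
height = 0.012                  # m, assumed crumpet height
depth_reached = 0.75 * height   # abstract: structure through ~75% of height in ~30 s

t_conduction = depth_reached ** 2 / alpha
print(f"conduction timescale over {depth_reached * 1000:.0f} mm: ~{t_conduction:.0f} s "
      f"(observed structure development: ~30 s)")
```

Under these assumptions conduction alone would take several hundred seconds, an order of magnitude slower than observed, which is why the heat-pipe-like evaporation-condensation mechanism is invoked.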

Abstract:

In a world of almost permanent and rapidly increasing electronic data availability, techniques for filtering, compressing, and interpreting this data to transform it into valuable and easily comprehensible information are of utmost importance. One key topic in this area is the capability to deduce future system behaviour from a given data input. This book brings together for the first time the complete theory of data-based neurofuzzy modelling and the linguistic attributes of fuzzy logic in a single cohesive mathematical framework. After introducing the basic theory of data-based modelling, new concepts including extended additive and multiplicative submodels are developed, and their extensions to state estimation and data fusion are derived. All these algorithms are illustrated with benchmark and real-life examples to demonstrate their efficiency. Chris Harris and his group have carried out pioneering work which has tied together the fields of neural networks and linguistic rule-based algorithms. This book is aimed at researchers and scientists in time series modelling, empirical data modelling, knowledge discovery, data mining, and data fusion.
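
As a flavour of what "data-based neurofuzzy modelling" means in practice, the sketch below fits a zero-order rule-based model, triangular fuzzy membership functions with one consequent weight per rule, to noisy data by least squares. This is a generic B-spline/Takagi-Sugeno style illustration, not an algorithm reproduced from the book.

```python
import numpy as np

def triangular_memberships(x, centres):
    """Triangular (hat) membership functions centred on `centres`,
    forming a simple fuzzy partition of the input range."""
    phi = np.zeros((len(x), len(centres)))
    for j, c in enumerate(centres):
        if j == 0:                                   # left edge: saturate at 1
            xp, fp = [c, centres[j + 1]], [1.0, 0.0]
        elif j == len(centres) - 1:                  # right edge: saturate at 1
            xp, fp = [centres[j - 1], c], [0.0, 1.0]
        else:                                        # interior hat function
            xp, fp = [centres[j - 1], c, centres[j + 1]], [0.0, 1.0, 0.0]
        phi[:, j] = np.interp(x, xp, fp)
    return phi

# Fit the rule consequents (one weight per fuzzy rule) to data by least squares:
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 200)
y = np.sin(x) + 0.1 * rng.standard_normal(len(x))    # toy observed data
centres = np.linspace(0.0, 10.0, 9)                  # rule centres across the input range
phi = triangular_memberships(x, centres)
weights, *_ = np.linalg.lstsq(phi, y, rcond=None)
y_hat = phi @ weights
print(f"RMS error of the 9-rule fuzzy model: {np.sqrt(np.mean((y - y_hat) ** 2)):.3f}")
```

Each fitted weight can be read as the consequent of a linguistic rule ("if x is near this centre, the output is about this value"), which is the link between the data-based and linguistic views the book describes.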

Abstract:

In this paper we investigate the role of judgement in the formation of forecasts in commercial real estate markets. Based on interview surveys with the majority of forecast producers, we find that real estate forecasters use a range of inputs and data sets to build models that predict an array of variables for a range of locations. The findings suggest that forecasts need to be acceptable to their users (and purchasers), and consequently forecasters generally have incentives to avoid presenting contentious or conspicuous forecasts. Where extreme forecasts are generated by a model, forecasters often engage in ‘self-censorship’ or are ‘censored’ following in-house consultation. It is concluded that the forecasting process is more complex than merely carrying out econometric modelling, and that the impact of the influences within this process varies considerably across different organizational contexts.

Abstract:

Models play a vital role in supporting a range of activities in numerous domains. We rely on models to support the design, visualisation, analysis and representation of parts of the world around us, and as such significant research effort has been invested into numerous areas of modelling, including support for model semantics, dynamic states and behaviour, and temporal data storage and visualisation. Whilst these efforts have increased our capabilities and allowed us to create increasingly powerful software-based models, the process of developing models, supporting tools and/or data structures remains difficult, expensive and error-prone. In this paper we define from the literature the key factors in assessing a model’s quality and usefulness: semantic richness, support for dynamic states and object behaviour, and temporal data storage and visualisation. We also identify a number of shortcomings in both existing modelling standards and model development processes, and propose a unified generic process to guide users through the development of semantically rich, dynamic and temporal models.