915 results for Microscopic simulation models


Relevance: 30.00%

Publisher:

Abstract:

Determining how El Niño and its impacts may change over the next 10 to 100 years remains a difficult scientific challenge. Ocean–atmosphere coupled general circulation models (CGCMs) are routinely used both to analyze El Niño mechanisms and teleconnections and to predict its evolution on a broad range of time scales, from seasonal to centennial. The ability to simulate El Niño as an emergent property of these models has improved considerably over the last few years. Nevertheless, the diversity of model simulations of present-day El Niño indicates current limitations in our ability to model this climate phenomenon and to anticipate changes in its characteristics. A review of the several factors that contribute to this diversity, as well as of potential means to improve the simulation of El Niño, is presented.

Relevance: 30.00%

Publisher:

Abstract:

A new technique is described for the analysis of cloud-resolving model simulations, which allows one to investigate the statistics of the lifecycles of cumulus clouds. Clouds are tracked from timestep to timestep within the model run. This allows for a very simple method of tracking, but one which is both comprehensive and robust. An approach for handling cloud splits and mergers is described which allows clouds with simple and complicated time histories to be compared within a single framework. This is found to be important for the analysis of an idealized simulation of radiative-convective equilibrium, in which the moist, buoyant updrafts (i.e., the convective cores) were tracked. Around half of all such cores were subject to splits and mergers during their lifecycles. For cores without any such events, the average lifetime is 30 minutes, but split and merger events can lengthen the typical lifetime considerably.
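To make the tracking idea concrete, here is a minimal Python sketch of overlap-based matching between labelled cloud masks at consecutive timesteps; the toy masks and labelling choices are illustrative assumptions, not the paper's code.

```python
import numpy as np
from scipy.ndimage import label

def track_step(mask_prev, mask_next):
    """Link labelled clouds at one timestep to those at the next by
    pixel overlap. Returns {previous label: set of next-step labels}:
    an empty set means the cloud has died, one element is a simple
    continuation, and several elements mark a split (a merger shows
    up as one next-step label shared by several previous labels)."""
    lab_prev, n_prev = label(mask_prev)
    lab_next, _ = label(mask_next)
    links = {i: set() for i in range(1, n_prev + 1)}
    both = (lab_prev > 0) & (lab_next > 0)
    for p, q in zip(lab_prev[both], lab_next[both]):
        links[p].add(q)
    return links

# Toy example: one buoyant core at t0 splits into two cores at t1.
t0 = np.zeros((8, 8), dtype=bool)
t0[1:4, 1:4] = True
t1 = np.zeros((8, 8), dtype=bool)
t1[1:3, 1:3] = True
t1[3:5, 3:5] = True
print(track_step(t0, t1))   # {1: {1, 2}} -> a split event
```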

Relevance: 30.00%

Publisher:

Abstract:

Compute grids are used widely in many areas of environmental science, but there has been limited uptake of grid computing by the climate modelling community, partly because the characteristics of many climate models make them difficult to use with popular grid middleware systems. In particular, climate models usually produce large volumes of output data, and running them usually involves complicated workflows implemented as shell scripts. For example, NEMO (Smith et al., 2008) is a state-of-the-art ocean model that is currently used for operational ocean forecasting in France, and will soon be used in the UK for both ocean forecasting and climate modelling. On a typical modern cluster, a one-year global ocean simulation at 1-degree resolution takes about three hours when running on 40 processors, and produces roughly 20 GB of output as 50,000 separate files. 50-year simulations are common, during which the model is resubmitted as a new job after each year. Running NEMO relies on a set of complicated shell scripts and command-line utilities for data pre-processing and post-processing prior to job resubmission.

Grid Remote Execution (G-Rex) is a pure Java grid middleware system that allows scientific applications to be deployed as Web services on remote computer systems, and then launched and controlled as if they were running on the user's own computer. Although G-Rex is general-purpose middleware, it has two key features that make it particularly suitable for remote execution of climate models: (1) output from the model is transferred back to the user while the run is in progress, to prevent it from accumulating on the remote system and to allow the user to monitor the model; (2) the client component is a command-line program that can easily be incorporated into existing model workflow scripts. G-Rex has a REST (Fielding, 2000) architectural style, which allows client programs to be very simple and lightweight, and allows users to interact with model runs using only a basic HTTP client (such as a Web browser or the curl utility) if they wish. This design also allows new client interfaces to be developed in other programming languages with relatively little effort. The G-Rex server is a standard Web application that runs inside a servlet container such as Apache Tomcat, and is therefore easy for system administrators to install and maintain.

G-Rex is employed as the middleware for the NERC Cluster Grid, a small grid of HPC clusters belonging to collaborating NERC research institutes. Currently the NEMO (Smith et al., 2008) and POLCOMS (Holt et al., 2008) ocean models are installed, and there are plans to install the Hadley Centre's HadCM3 model for use in the decadal climate prediction project GCEP (Haines et al., 2008). The science projects involving NEMO on the Grid have a particular focus on data assimilation (Smith et al., 2008), a technique that involves constraining model simulations with observations. The POLCOMS model will play an important part in the GCOMS project (Holt et al., 2008), which aims to simulate the world's coastal oceans.

A typical use of G-Rex by a scientist to run a climate model on the NERC Cluster Grid proceeds as follows: (1) The scientist prepares input files on his or her local machine. (2) Using information provided by the Grid's Ganglia monitoring system, the scientist selects an appropriate compute resource. (3) The scientist runs the relevant workflow script on his or her local machine. This script is unmodified except that calls to run the model (e.g. with "mpirun") are simply replaced with calls to "GRexRun". (4) The G-Rex middleware automatically handles the uploading of input files to the remote resource, and the downloading of output files back to the user, including their deletion from the remote system, during the run. (5) The scientist monitors the output files, using familiar analysis and visualization tools on his or her own local machine.

G-Rex is well suited to climate modelling because it addresses many of the middleware usability issues that have led to limited uptake of grid computing by climate scientists. It is a lightweight, low-impact and easy-to-install solution that is currently designed for use in relatively small grids such as the NERC Cluster Grid. A current topic of research is the use of G-Rex as an easy-to-use front-end to larger-scale grid resources such as the UK National Grid Service.
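The abstract emphasises that G-Rex's REST style lets a run be driven with a basic HTTP client. A hypothetical Python sketch of that interaction pattern follows; the service URL, endpoint paths and JSON fields are invented for illustration and are not G-Rex's documented API.

```python
import time
import requests

# Hypothetical endpoints and fields throughout: this only illustrates the
# REST pattern described above (launch a run, poll it, pull output back as
# it appears), not G-Rex's actual API.
SERVICE = "http://cluster.example.org/grex/nemo"

def run_model(input_path):
    # Start a run by uploading the input file to a (hypothetical) runs resource.
    with open(input_path, "rb") as f:
        resp = requests.post(f"{SERVICE}/runs", files={"input": f})
    resp.raise_for_status()
    run_url = resp.headers["Location"]    # URL of the newly created run

    # Poll until the run ends, downloading new output each time so files
    # never accumulate on the remote system -- the behaviour the abstract
    # highlights as feature (1).
    while True:
        state = requests.get(run_url).json()
        for name in state.get("new_output", []):
            data = requests.get(f"{run_url}/output/{name}")
            with open(name, "wb") as out:
                out.write(data.content)
        if state["status"] in ("finished", "failed"):
            return state["status"]
        time.sleep(30)

# A workflow script would call run_model("namelist") at the point where it
# previously invoked mpirun, mirroring the mpirun -> GRexRun substitution.
```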

Relevance: 30.00%

Publisher:

Abstract:

Despite its relevance to a wide range of technological and fundamental areas, a quantitative understanding of protein surface clustering dynamics is often lacking. In inorganic crystal growth, surface clustering of adatoms is well described by diffusion-aggregation models. In such models, the statistical properties of the aggregate arrays often reveal the molecular-scale aggregation processes. We investigate the potential of these theories to reveal hitherto hidden facets of protein clustering by carrying out concomitant observations of lysozyme adsorption onto mica surfaces, using atomic force microscopy, and Monte Carlo simulations of cluster nucleation and growth. We find that lysozyme clusters diffuse across the substrate at a rate that varies inversely with size. This result suggests which molecular-scale mechanisms are responsible for the mobility of the proteins on the substrate. In addition, the surface diffusion coefficient of the monomer can be extracted from the comparison between experiments and simulations. While concentrating on a model system of lysozyme-on-mica, this 'proof of concept' study successfully demonstrates the potential of our approach to understand and influence more biomedically applicable protein-substrate couples.
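A toy Monte Carlo sketch of the diffusion-aggregation picture follows, assuming (as reported above) cluster mobility inversely proportional to size; the lattice size, rates and step count are arbitrary illustrative choices, not the paper's simulation.

```python
import random

# Toy lattice Monte Carlo of diffusion-limited cluster aggregation:
# clusters hop with probability ~ 1/size and coalesce on contact.
L, N, STEPS = 50, 60, 20000        # lattice width, initial monomers, trial moves
random.seed(0)

pos = {}                            # site (x, y) -> cluster size
while len(pos) < N:                 # scatter monomers on distinct sites
    pos[(random.randrange(L), random.randrange(L))] = 1

for _ in range(STEPS):
    site, size = random.choice(list(pos.items()))
    if random.random() > 1.0 / size:        # mobility falls off as 1/size
        continue
    dx, dy = random.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
    new = ((site[0] + dx) % L, (site[1] + dy) % L)
    del pos[site]
    pos[new] = pos.get(new, 0) + size       # landing on a neighbour merges

sizes = sorted(pos.values(), reverse=True)
print(f"{len(sizes)} clusters remain; largest: {sizes[:5]}")
```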

Relevance: 30.00%

Publisher:

Abstract:

The Madden–Julian oscillation (MJO) interacts with and influences a wide range of weather and climate phenomena (e.g., monsoons, ENSO, tropical storms, midlatitude weather), and represents an important, and as yet unexploited, source of predictability at the subseasonal time scale. Despite the important role of the MJO in climate and weather systems, current global circulation models (GCMs) exhibit considerable shortcomings in representing this phenomenon. These shortcomings have been documented in a number of multimodel comparison studies over the last decade. However, diagnosis of model performance has been challenging, and model progress has been difficult to track, because of the lack of a coherent and standardized set of MJO diagnostics. One of the chief objectives of the U.S. Climate Variability and Predictability (CLIVAR) MJO Working Group is the development of observation-based diagnostics for objectively evaluating global model simulations of the MJO in a consistent framework. Motivation for this activity is reviewed, and the intent and justification for a set of diagnostics is provided, along with specification for their calculation, and illustrations of their application. The diagnostics range from relatively simple analyses of variance and correlation to more sophisticated space–time spectral and empirical orthogonal function analyses. These diagnostic techniques are used to detect MJO signals, to construct composite life cycles, to identify associations of MJO activity with the mean state, and to describe interannual variability of the MJO.
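As an illustration of one diagnostic named above, the following generic Python sketch performs an empirical orthogonal function (EOF) analysis of a synthetic time-by-space anomaly field via the SVD; it is not the working group's own diagnostic code, and the planted mode merely stands in for, e.g., filtered tropical anomalies.

```python
import numpy as np

rng = np.random.default_rng(0)
ntime, nspace = 500, 144                       # e.g. days x longitude points
noise = rng.standard_normal((ntime, nspace))
planted = 3.0 * np.outer(np.sin(np.arange(ntime) / 8.0),
                         np.cos(np.linspace(0.0, 4.0 * np.pi, nspace)))
field = noise + planted                        # one coherent mode plus noise

anom = field - field.mean(axis=0)              # remove the time mean
u, s, vt = np.linalg.svd(anom, full_matrices=False)
var_frac = s**2 / np.sum(s**2)                 # variance fraction per EOF
eof1, pc1 = vt[0], u[:, 0] * s[0]              # leading pattern and its PC

print(f"EOF1 explains {var_frac[0]:.1%} of the variance")
```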

Relevance: 30.00%

Publisher:

Abstract:

Results are presented from a matrix of coupled model integrations, using atmosphere resolutions of 135 and 90 km, and ocean resolutions of 1° and 1/3°, to study the impact of resolution on simulated climate. The mean state of the tropical Pacific is found to be improved in the models with a higher ocean resolution. Such an improved mean state arises from the development of tropical instability waves, which are poorly resolved at low resolution; these waves reduce the equatorial cold tongue bias. The improved ocean state also allows for a better simulation of the atmospheric Walker circulation. Several sensitivity studies have been performed to further understand the processes involved in the different component models. Significantly decreasing the horizontal momentum dissipation in the coupled model with the lower-resolution ocean has benefits for the mean tropical Pacific climate, but decreases model stability. Increasing the momentum dissipation in the coupled model with the higher-resolution ocean degrades the simulation toward that of the lower-resolution ocean. These results suggest that enhanced ocean model resolution can have important benefits for the climatology of both the atmosphere and ocean components of the coupled model, and that some of these benefits may be achievable at lower ocean resolution, if the model formulation allows.

Relevance: 30.00%

Publisher:

Abstract:

Applications such as neuroscience, telecommunication, online social networking, transport and retail trading give rise to connectivity patterns that change over time. In this work, we address the resulting need for network models and computational algorithms that deal with dynamic links. We introduce a new class of evolving range-dependent random graphs that gives a tractable framework for modelling and simulation. We develop a spectral algorithm for calibrating a set of edge ranges from a sequence of network snapshots and give a proof of principle illustration on some neuroscience data. We also show how the model can be used computationally and analytically to investigate the scenario where an evolutionary process, such as an epidemic, takes place on an evolving network. This allows us to study the cumulative effect of two distinct types of dynamics.
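A small Python sketch of the range-dependent idea follows, assuming a geometric decay of edge probability with range and illustrative birth/death rates; these are not the paper's calibrated model.

```python
import random

# Range-dependent evolving random graph: the probability that an edge
# between nodes i and j exists, appears or disappears decays with the
# "range" |i - j|. Decay law and rates are illustrative choices.
N, STEPS, DECAY = 100, 50, 0.6
random.seed(0)

def edge_prob(i, j):
    return DECAY ** abs(i - j)      # short-range pairs are likelier to link

# Initial snapshot drawn from the static range-dependent distribution.
edges = {(i, j) for i in range(N) for j in range(i + 1, N)
         if random.random() < edge_prob(i, j)}

# Evolve the network: absent edges are born and existing edges die with
# range-dependent probabilities, giving a sequence of snapshots on which
# a second process (e.g. an epidemic) could be run.
for _ in range(STEPS):
    for i in range(N):
        for j in range(i + 1, N):
            p = edge_prob(i, j)
            if (i, j) in edges:
                if random.random() < 0.1 * (1.0 - p):
                    edges.discard((i, j))
            elif random.random() < 0.1 * p:
                edges.add((i, j))

print(f"final snapshot has {len(edges)} edges")
```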

Relevance: 30.00%

Publisher:

Abstract:

Using the Met Office large-eddy model (LEM) we simulate a mixed-phase altocumulus cloud that was observed from Chilbolton in southern England by a 94 GHz Doppler radar, a 905 nm lidar, a dual-wavelength microwave radiometer and four radiosondes. It is important to test and evaluate such simulations with observations, since there are significant differences between results from different cloud-resolving models for ice clouds. Simulating the Doppler radar and lidar data within the LEM allows us to compare observed and modelled quantities directly, and to explore the relationships between observed and unobserved variables. For general-circulation models, which currently tend to give poor representations of mixed-phase clouds, the case shows the importance of using: (i) separate prognostic ice and liquid water, (ii) a vertical resolution that captures the thin layers of liquid water, and (iii) an accurate representation of the subgrid vertical velocities that allow liquid water to form. It is shown that large-scale ascents and descents are significant for this case, and so the horizontally averaged LEM profiles are relaxed towards observed profiles to account for these. The LEM simulation then gives a reasonable cloud, with an ice-water path (IWP) approximately two-thirds of that observed, and with liquid water at the cloud top, as observed. However, the liquid-water cells that form in the updraughts at cloud top in the LEM have liquid-water paths (LWPs) up to half those observed, and there are too few cells, giving a mean LWP five to ten times smaller than observed. In reality, ice nucleation and fallout may deplete ice-nuclei concentrations at the cloud top, allowing more liquid water to form there, but this process is not represented in the model. Decreasing the heterogeneous nucleation rate in the LEM increased the LWP, which supports this hypothesis. The LEM captures the increase with height in the standard deviation of Doppler velocities (and so of vertical winds), but values are 1.5 to 4 times smaller than observed (although values are larger in an unforced model run, this only increases the modelled LWP by a factor of approximately two). The LEM data show that, for values larger than approximately 12 cm s⁻¹, the standard deviation of Doppler velocities provides an almost unbiased estimate of the standard deviation of vertical winds, but provides an overestimate for smaller values. Time-smoothing the observed Doppler velocities and the modelled mass-squared-weighted fallspeeds shows that observed fallspeeds are approximately two-thirds of the modelled values. Decreasing the modelled fallspeeds to those observed increases the modelled ice water content, giving an IWP 1.6 times that observed.

Relevance: 30.00%

Publisher:

Abstract:

A rapid increase in the variety, quality, and quantity of observations in polar regions is leading to a significant improvement in the understanding of sea ice dynamic and thermodynamic processes and their representation in global climate models. We assess the simulation of sea ice in the new Hadley Centre Global Environmental Model (HadGEM1) against the latest available observations. The HadGEM1 sea ice component uses elastic-viscous-plastic dynamics, multiple ice thickness categories, and zero-layer thermodynamics. The model evaluation is focused on the mean state of the key variables of ice concentration, thickness, velocity, and albedo. The model shows good agreement with observational data sets. The variability of the ice forced by the North Atlantic Oscillation is also found to agree with observations.

Relevance: 30.00%

Publisher:

Abstract:

Improvements in the resolution of satellite imagery have enabled extraction of water surface elevations at the margins of a flood. Comparison between modelled and observed water surface elevations provides a new means of calibrating and validating flood inundation models; however, the uncertainty in these observed data has yet to be addressed. Here a flood inundation model is calibrated using a probabilistic treatment of the observed data. A LiDAR-guided snake algorithm is used to determine an outline of a flood event in 2006 on the River Dee, North Wales, UK, using a 12.5 m ERS-1 image. Points at approximately 100 m intervals along this outline are selected, and the water surface elevation is recorded as the LiDAR DEM elevation at each point. Approximating the water surface as a plane running from the gauged upstream to the downstream water elevation, the water surface elevations at points along the flooded extent are compared to their 'expected' values. The errors between the two are roughly normally distributed, but when plotted against coordinates they show obvious spatial autocorrelation. The source of this spatial dependency is investigated by comparing the errors to the slope gradient and aspect of the LiDAR DEM. A LISFLOOD-FP model of the flood event is set up to investigate the effect of observed-data uncertainty on the calibration of flood inundation models. Multiple simulations are run using different combinations of friction parameters, from which the optimum parameter set is selected. For each simulation a T-test is used to quantify the fit between modelled and observed water surface elevations. The points used in this T-test are selected on the basis of their error, and the selection criteria enable evaluation of the sensitivity of the choice of optimum parameter set to uncertainty in the observed data. This work explores the observed data in detail and highlights possible causes of error. The identification of significant error (RMSE = 0.8 m) between approximate expected and actual observed elevations from the remotely sensed data emphasises the limitations of using these data in a deterministic manner within the calibration process. These limitations are addressed by developing a new probabilistic approach to using the observed data.
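A brief Python sketch of the planar-water-surface comparison follows, with synthetic numbers standing in for the River Dee gauges and LiDAR data.

```python
import numpy as np

# Interpolate the water surface linearly between gauged upstream and
# downstream elevations, then measure the errors of shoreline elevations
# against their "expected" planar values. All numbers are synthetic.
rng = np.random.default_rng(1)

reach_len = 5000.0                   # m between upstream and downstream gauges
z_up, z_down = 12.4, 9.1             # m, gauged water surface elevations

chainage = np.sort(rng.uniform(0.0, reach_len, 60))        # ~100 m spaced points
expected = z_up + (z_down - z_up) * chainage / reach_len   # planar surface
observed = expected + rng.normal(0.0, 0.8, chainage.size)  # synthetic LiDAR picks

errors = observed - expected
rmse = float(np.sqrt(np.mean(errors**2)))
print(f"RMSE against the planar surface: {rmse:.2f} m")
```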

Relevance: 30.00%

Publisher:

Abstract:

Projections of stratospheric ozone from a suite of chemistry-climate models (CCMs) have been analyzed. In addition to a reference simulation in which anthropogenic halogenated ozone-depleting substances (ODSs) and greenhouse gases (GHGs) vary with time, sensitivity simulations with either ODS or GHG concentrations fixed at 1960 levels were performed to disaggregate the drivers of projected ozone changes. These simulations were also used to assess two distinct milestones: ozone returning to historical values (ozone return dates) and ozone no longer being influenced by ODSs (full ozone recovery). The date of ozone returning to historical values does not indicate complete recovery from ODSs in most cases, because GHG-induced changes accelerate or decelerate ozone changes in many regions. In the upper stratosphere, where CO2-induced stratospheric cooling increases ozone, full ozone recovery is not likely to have occurred by 2100, even though ozone returns to its 1980 or even 1960 levels well before (~2025 and ~2040, respectively). In contrast, in the tropical lower stratosphere ozone decreases continuously from 1960 to 2100 due to projected increases in tropical upwelling, while by around 2040 it is already very likely that full recovery from the effects of ODSs has occurred, although ODS concentrations are still elevated by this date. In the midlatitude lower stratosphere the evolution differs from that in the tropics: rather than a steady decrease, ozone first decreases from 1960 to 2000 and then increases steadily through the 21st century. Ozone in the midlatitude lower stratosphere returns to 1980 levels by ~2045 in the Northern Hemisphere (NH) and by ~2055 in the Southern Hemisphere (SH), and full ozone recovery is likely reached by 2100 in both hemispheres. Overall, in all regions except the tropical lower stratosphere, full ozone recovery from ODSs occurs significantly later than the return of total column ozone to its 1980 level. The latest return of total column ozone is projected to occur over Antarctica (~2045–2060), whereas it is not likely that full ozone recovery is reached there by the end of the 21st century. Arctic total column ozone is projected to return to 1980 levels well before polar stratospheric halogen loading does so (~2025–2030 for total column ozone, cf. 2050–2070 for Cly + 60×Bry), and it is likely that full recovery of total column ozone from the effects of ODSs has occurred by ~2035. In contrast to the Antarctic, by 2100 Arctic total column ozone is projected to be above 1960 levels, but not in the fixed-GHG simulation, indicating that climate change plays a significant role.

Relevance: 30.00%

Publisher:

Abstract:

The impacts of climate change on crop productivity are often assessed using simulations from a numerical climate model as an input to a crop simulation model. The precision of these predictions reflects the uncertainty in both models. We examined how uncertainty in a climate model (HadAM3) and a crop model (the General Large-Area Model, GLAM, for annual crops) affects the mean and standard deviation of crop yield simulations in present and doubled carbon dioxide (CO2) climates, by perturbing parameters in each model. The climate sensitivity parameter (λ, the equilibrium response of global mean surface temperature to doubled CO2) was used to define the control climate. Observed 1966–1989 mean yields of groundnut (Arachis hypogaea L.) in India were simulated well by the crop model using the control climate and climates with values of λ near the control value. The simulations were used to measure the contribution of key crop and climate model parameters to uncertainty. The standard deviation of yield was more affected by perturbation of climate parameters than of crop model parameters in both the present-day and doubled-CO2 climates. Climate uncertainty was higher in the doubled-CO2 climate than in the present-day climate. Crop transpiration efficiency was key to crop model uncertainty in both present-day and doubled-CO2 climates. The response of crop development to mean temperature contributed little uncertainty in the present-day simulations but was among the largest contributors under doubled CO2. The ensemble methods used here to quantify physical and biological uncertainty offer a means to improve model estimates of the impacts of climate change.
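A toy Python sketch of the perturbed-parameter ensemble idea follows, with one-line stand-ins for the climate and crop models; these are invented illustrations, not HadAM3 or GLAM.

```python
import numpy as np

# Perturb one parameter at a time in a toy "climate model" and a toy
# "crop model", and compare the yield spread each perturbation induces.
rng = np.random.default_rng(2)

def climate(lam):                     # toy climate: warming scales with sensitivity
    return 1.5 * lam + rng.normal(0.0, 0.3, 100)      # 100 "seasons"

def crop_yield(warming, te):          # toy crop: gains from transpiration
    return 10.0 * te - 0.8 * warming  # efficiency, losses from warming

lams = np.linspace(0.8, 1.2, 9)       # perturbed climate sensitivity
tes = np.linspace(0.9, 1.1, 9)        # perturbed transpiration efficiency

spread_climate = np.std([crop_yield(climate(l), 1.0).mean() for l in lams])
spread_crop = np.std([crop_yield(climate(1.0), t).mean() for t in tes])
print(f"yield spread from climate parameter: {spread_climate:.2f}")
print(f"yield spread from crop parameter:    {spread_crop:.2f}")
```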

Relevance: 30.00%

Publisher:

Abstract:

Reanalysis data provide an excellent test bed for impacts prediction systems, because they represent an upper limit on the skill of climate models. Indian groundnut (Arachis hypogaea L.) yields have been simulated using the General Large-Area Model (GLAM) for annual crops and the European Centre for Medium-Range Weather Forecasts (ECMWF) 40-yr reanalysis (ERA-40). The ability of ERA-40 to represent the Indian summer monsoon has been examined. The ability of GLAM, when driven with daily ERA-40 data, to model both observed yields and observed relationships between subseasonal weather and yield has been assessed. Mean yields were simulated well across much of India. Correlations between observed and modeled yields, where these are significant, are comparable to correlations between observed yields and ERA-40 rainfall. Uncertainties due to the input planting window, crop duration, and weather data have been examined. A reduction in the root-mean-square error of simulated yields was achieved by applying bias-correction techniques to the precipitation. The stability of the relationship between weather and yield over time has been examined. Weather-yield correlations vary on decadal time scales, and this has direct implications for the accuracy of yield simulations. Analysis of the skewness of both detrended yields and precipitation suggests that nonclimatic factors are partly responsible for this nonstationarity. Evidence from other studies, including data on cereal and pulse yields, indicates that this result is not particular to groundnut yield. The detection and modeling of nonstationary weather-yield relationships emerges from this study as an important part of the process of understanding and predicting the impacts of climate variability and change on crop yields.
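A minimal Python sketch of one generic precipitation bias-correction of the kind mentioned above (multiplicative scaling toward the observed climatology), on synthetic data; the study's own technique may differ.

```python
import numpy as np

# Scale the model rainfall so its climatology matches observations.
rng = np.random.default_rng(3)

obs = rng.gamma(2.0, 3.0, 1000)           # "observed" daily rainfall (mm)
model = 0.7 * rng.gamma(2.0, 3.0, 1000)   # dry-biased reanalysis-driven rainfall

scale = obs.mean() / model.mean()         # multiplicative correction factor
corrected = model * scale

print(f"mean bias before: {model.mean() - obs.mean():+.2f} mm/day")
print(f"mean bias after:  {corrected.mean() - obs.mean():+.2f} mm/day")
```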

Relevance: 30.00%

Publisher:

Abstract:

We focus on the comparison of three statistical models used to estimate the treatment effect in meta-analysis when individually pooled data are available. Two are conventional models, namely a multi-level model and a model based upon an approximate likelihood; the third is a newly developed profile likelihood model, which might be viewed as an extension of the Mantel-Haenszel approach. To exemplify these methods, we use results from a meta-analysis of 22 trials to prevent respiratory tract infections. We show that with the multi-level approach, in the case of baseline heterogeneity, the number of clusters or components is considerably over-estimated. The approximate and profile likelihood methods showed nearly the same pattern for the treatment effect distribution. To provide further evidence, two simulation studies were carried out. The profile likelihood can be considered a clear alternative to the approximate likelihood model. In the case of strong baseline heterogeneity, the profile likelihood method shows superior behaviour when compared with the multi-level model.
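Since the profile likelihood model is framed as an extension of the Mantel-Haenszel approach, here for reference is a Python sketch of the classical Mantel-Haenszel pooled odds ratio over 2x2 trial tables; the three toy tables are invented, not the 22 trials analysed in the paper.

```python
import numpy as np

# Columns: events/treated, no event/treated, events/control, no event/control.
tables = np.array([
    [12.0,  88.0, 20.0,  80.0],
    [ 8.0,  42.0, 15.0,  35.0],
    [30.0, 170.0, 45.0, 155.0],
])

a, b, c, d = tables.T                  # unpack the four cell counts per trial
n = tables.sum(axis=1)                 # per-trial totals
# Mantel-Haenszel pooled odds ratio: sum(a*d/n) / sum(b*c/n).
or_mh = np.sum(a * d / n) / np.sum(b * c / n)
print(f"Mantel-Haenszel pooled odds ratio: {or_mh:.2f}")
```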