31 results for computer modelling


Relevance: 30.00%

Abstract:

Evaluating agents in decision-making applications requires assessing their skill and predicting their behaviour. Both are well developed in poker-like situations, but less so in more complex game and model domains. This paper addresses both tasks by using Bayesian inference in a benchmark space of reference agents. The concepts are explained and demonstrated using the game of chess, but the model applies generically to any domain with quantifiable options and fallible choice. Demonstration applications address questions frequently asked by the chess community regarding the stability of the rating scale, the comparison of players of different eras and/or leagues, and controversial incidents possibly involving fraud. The last of these includes alleged under-performance, fabrication of tournament results, and the clandestine use of computer advice during competition. Beyond the model world of games, the aim is to improve fallible human performance in complex, high-value tasks.
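
As a rough illustration of the kind of inference described (not the authors' actual model), the sketch below computes a posterior over a hypothetical benchmark space of reference agents, each of which chooses fallibly via a softmax over move values; the rationality parameterisation and all numbers are invented for the example.

```python
import numpy as np

# Hypothetical benchmark space: reference agents indexed by a
# "rationality" c; each picks move i with probability softmax(c * value_i).
def choice_probs(values, c):
    z = np.exp(c * (values - values.max()))   # subtract max for stability
    return z / z.sum()

cs = np.linspace(0.1, 5.0, 50)     # candidate agents (uniform prior)
log_post = np.zeros_like(cs)

# Invented observations: engine evaluations of the legal moves in each
# position, plus the index of the move the player actually chose.
observations = [
    (np.array([0.8, 0.5, 0.1]), 0),
    (np.array([0.3, 0.2, 0.2, -0.4]), 1),
]

for values, chosen in observations:
    for k, c in enumerate(cs):
        log_post[k] += np.log(choice_probs(values, c)[chosen])

post = np.exp(log_post - log_post.max())
post /= post.sum()
print("Posterior mean rationality:", (cs * post).sum())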

Relevance: 30.00%

Abstract:

Recent severe flooding in the UK has highlighted the need for better information on flood risk, increasing the pressure on engineers to enhance the capabilities of computer models for flood prediction. This paper evaluates the benefits to be gained from the use of remotely sensed data to support flood modelling. The remotely sensed data available can be used either to produce high-resolution digital terrain models (DTMs) (light detection and ranging (lidar) data) or to generate accurate inundation mapping of past flood events (airborne synthetic aperture radar (SAR) data and aerial photography). The paper reports on the modelling of real flood events that occurred at two UK sites on the rivers Severn and Ouse. At these sites a combination of remotely sensed data and recorded hydrographs was available. It is concluded, first, that lidar-generated DTMs support the generation of considerably better models and enhance the visualisation of model results and, second, that flood outlines obtained from airborne SAR or aerial images help develop an appreciation of the hydraulic behaviour of important model components and facilitate model validation. The need for further research is highlighted by a number of limitations, namely: the difficulties in obtaining an adequate representation of hydraulically important features such as embankment crests and walls; uncertainties in the validation data; and difficulties in extracting flood outlines from airborne SAR images in urban areas.
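
Observed flood outlines are commonly compared with modelled inundation extents using an areal fit statistic; the following sketch shows one generic form of that comparison (the specific measure used in the paper is not stated here, and the rasters are invented).

```python
import numpy as np

# Hypothetical rasters on a common grid: True marks a wet cell.
# 'observed' stands in for a SAR-derived outline, 'modelled' for the
# hydraulic model's predicted extent.
observed = np.array([[0, 1, 1], [0, 1, 1], [0, 0, 1]], dtype=bool)
modelled = np.array([[0, 1, 0], [1, 1, 1], [0, 0, 1]], dtype=bool)

# Areal fit: intersection over union of the wet areas (1.0 = perfect).
fit = (observed & modelled).sum() / (observed | modelled).sum()
print(f"Flood-extent fit: {fit:.2f}")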

Relevance: 30.00%

Abstract:

Uncertainties associated with the representation of various physical processes in global climate models (GCMs) mean that, when projections from GCMs are used in climate change impact studies, the uncertainty propagates through to the impact estimates. A complete treatment of this ‘climate model structural uncertainty’ is necessary so that decision-makers are presented with an uncertainty range around the impact estimates. This uncertainty is often underexplored owing to the human and computer processing time required to perform the numerous simulations. Here, we present a 189-member ensemble of global river runoff and water resource stress simulations that adequately address this uncertainty. Following several adaptations and modifications, the ensemble creation time has been reduced from 750 h on a typical single-processor personal computer to 9 h of high-throughput computing on the University of Reading Campus Grid. We outline the changes that had to be made to the hydrological impacts model and to the Campus Grid, and present the main results. We show that, although there is considerable uncertainty in both the magnitude and the sign of regional runoff changes across different GCMs with climate change, there is much less uncertainty in runoff changes for regions that experience large runoff increases (e.g. the high northern latitudes and Central Asia) and large runoff decreases (e.g. the Mediterranean). Furthermore, there is consensus that the percentage of the global population at risk of water resource stress will increase with climate change.
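
The reported reduction from 750 h to about 9 h (roughly an 80-fold speed-up) reflects the fact that ensemble members are independent jobs. A minimal sketch of that embarrassingly parallel pattern, with the hydrological model itself stubbed out as a placeholder function:

```python
from concurrent.futures import ProcessPoolExecutor

def run_member(member_id):
    """Stand-in for one hydrological-impact run driven by one GCM's
    climate pattern; a real run would read forcing data and write
    runoff fields rather than return a number."""
    return member_id ** 0.5   # placeholder computation

if __name__ == "__main__":
    # The 189 members are independent, so they can be farmed out to
    # whatever workers (grid nodes, cores) are available.
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(run_member, range(189)))
    print(len(results), "ensemble members completed")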

Relevance: 30.00%

Abstract:

Constructing biodiversity richness maps from environmental niche models (ENMs) of thousands of species is time-consuming. A separate species-occurrence data pre-processing phase enables the experimenter to control test AUC score variance due to species dataset size. Besides removing duplicate occurrences and points with missing environmental data, we discuss the need for coordinate-precision, wide-dispersion, temporal and synonymy filters. After species data filtering, the final task of a pre-processing phase should be the automatic generation of species occurrence datasets which can then be directly 'plugged in' to the ENM. A software application capable of carrying out all these tasks would be a valuable time-saver, particularly for large-scale biodiversity studies.
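
A minimal sketch of such a pre-processing pipeline, assuming a simple tabular occurrence format; the column names, thresholds, and the crude coordinate-precision test are all illustrative rather than taken from the paper:

```python
import pandas as pd

# Hypothetical occurrence table; all values are invented.
occ = pd.DataFrame({
    "species":   ["A. alba", "A. alba", "A. alba", "B. nigra"],
    "lat":       [51.45, 51.45, 51.50, 40.10],
    "lon":       [-0.97, -0.97, -0.90, 3.21],
    "year":      [1950, 1950, 2004, 2010],
    "elevation": [45.0, 45.0, None, 120.0],   # sampled environmental layer
})

occ = occ.drop_duplicates(subset=["species", "lat", "lon"])  # duplicate filter
occ = occ.dropna(subset=["elevation"])        # drop points lacking environmental data
occ = occ[occ["year"] >= 1970]                # temporal filter
occ = occ[occ["lon"].round(1) != occ["lon"]]  # crude coordinate-precision filter
print(occ)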

Relevance: 30.00%

Abstract:

Graphical tracking is a technique for crop scheduling in which the actual plant state is plotted against an ideal target curve that encapsulates all crop and environmental characteristics. Management decisions are made on the basis of the position of the actual crop against the ideal position. Owing to the simplicity of the approach, graphical tracks can be developed on site without the requirement for controlled experimentation. Growth models and graphical tracks are discussed, and an implementation of the Richards curve for graphical tracking is described. In many cases the more intuitively desirable growth models perform sub-optimally due to problems with the specification of starting conditions, environmental factors outside the scope of the original model, and the introduction of new cultivars. Accurate specification of a biological model requires detailed and usually costly study, and as such is not adaptable to a changing cultivar range and changing cultivation techniques. Fitting of a new graphical track for a new cultivar can be conducted on site and improved over subsequent seasons. Graphical tracking emphasises the current position relative to the objective, and as such does not require the time-consuming or system-specific input of an environmental history, although it does require detailed crop measurement. The approach is flexible and could be applied to a variety of specification metrics, with digital imaging providing a route for added value. For decision-making regarding crop manipulation from the observed current state, there is a role for simple short-term predictive modelling to indicate the consequences of crop manipulation.
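
A minimal sketch of fitting a Richards curve as a target track and checking the current crop state against it; the parameterisation shown is one common form of the curve, and all measurements and starting values are invented:

```python
import numpy as np
from scipy.optimize import curve_fit

def richards(t, A, k, tm, nu):
    """One common form of the Richards curve: asymptote A, rate k,
    inflection location tm, shape nu (nu = 1 gives the logistic)."""
    return A * (1.0 + nu * np.exp(-k * (t - tm))) ** (-1.0 / nu)

# Invented weekly crop measurements (e.g. plant height in cm).
t = np.arange(12)
y = np.array([2, 3, 5, 8, 13, 20, 27, 33, 37, 39, 40, 40.5])

params, _ = curve_fit(richards, t, y, p0=[42, 0.6, 5, 1.0])
print("Fitted target-curve parameters:", params)

# Tracking decision: compare this week's observation with the target.
week, observed = 6, 25.0
target = richards(week, *params)
print("behind schedule" if observed < target else "on or ahead of schedule")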

Relevance: 30.00%

Abstract:

This paper discusses experimental and theoretical investigations and Computational Fluid Dynamics (CFD) modelling considerations to evaluate the performance of a square-section wind catcher system connected to the top of a test room for the purpose of natural ventilation. The magnitude and distribution of pressure coefficients (Cp) around the wind catcher and the air flow into the test room were analysed. The modelling results indicated that air was supplied into the test room through the wind catcher's quadrants with positive external pressure coefficients and extracted out of the test room through quadrants with negative pressure coefficients. The air flow achieved through the wind catcher depends on the speed and direction of the wind. The results obtained using the explicit and AIDA implicit calculation procedures and the CFX code correlate relatively well with the experimental results at lower wind speeds and with wind incident at an angle of 0 degrees. Variations in the Cp and air flow results were observed particularly with a wind direction of 45 degrees. The explicit and implicit calculation procedures were found to be quick and easy to use in obtaining results, whereas the wind tunnel tests were more expensive in terms of effort, cost and time. CFD codes are developing rapidly and are widely available, especially with the decreasing prices of computer hardware. However, results obtained using CFD codes must be considered with care, particularly in the absence of empirical data.
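
As an illustration of the kind of explicit calculation procedure mentioned (not necessarily the AIDA formulation), the sketch below estimates wind-driven flow through a single opening from the pressure-coefficient difference; every value is invented, not taken from the paper's test case:

```python
import math

rho = 1.2                           # air density, kg/m^3
v = 3.0                             # reference wind speed, m/s
cp_supply, cp_extract = 0.6, -0.4   # external pressure coefficients
cd = 0.61                           # discharge coefficient of the opening
area = 0.25                         # opening area, m^2

dp = 0.5 * rho * v**2 * (cp_supply - cp_extract)   # driving pressure, Pa
q = cd * area * math.sqrt(2.0 * dp / rho)          # volume flow, m^3/s
print(f"Driving pressure {dp:.2f} Pa -> flow {q:.3f} m^3/s")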

Relevance: 30.00%

Abstract:

This paper discusses how the use of computer-based modelling tools has aided the design of a telemetry unit for use in oil well logging. With the aid of modern computer-based simulation techniques, the new design is capable of operating at data rates 2.5 times faster than those of previous designs.

Relevance: 30.00%

Abstract:

Virtual reality has the potential to improve the visualisation of building design and construction, but its implementation in the industry has yet to reach maturity. Present-day translation of building data to virtual reality is often unidirectional and unsatisfactory. Three different approaches to the creation of models are identified and described in this paper. Consideration is given to the potential of both advances in computer-aided design and the emerging standards for data exchange to facilitate an integrated use of virtual reality. Commonalities and differences between computer-aided design and virtual reality packages are reviewed, and trials of current systems are described. The trials have been conducted to explore the technical issues related to the integrated use of CAD and virtual environments within the house building sector of the construction industry and to investigate the practical use of the new technology.

Relevance: 30.00%

Abstract:

This paper deals with the integration of radial basis function (RBF) networks into the industrial software control package Connoisseur. The paper shows the improved modelling capabilities offered by RBF networks within the Connoisseur environment compared to linear modelling techniques such as recursive least squares. It also outlines how this improved modelling capability, obtained through the RBF networks, will be utilised within Connoisseur.
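
A minimal sketch of the modelling idea, assuming Gaussian basis functions with fixed centres and output weights fitted by linear least squares; the data and network sizes are invented, and this is not the Connoisseur implementation:

```python
import numpy as np

# Gaussian RBF model fitted by linear least squares on the output
# weights: nonlinear in the inputs, yet trained as cheaply as a linear
# model. All data here are synthetic.
rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 200)[:, None]
y = np.sin(2 * x[:, 0]) + 0.1 * rng.standard_normal(200)   # nonlinear plant

centres = np.linspace(-3, 3, 15)[None, :]       # fixed basis-function centres
width = 0.5
phi = np.exp(-((x - centres) ** 2) / (2 * width**2))   # design matrix
phi = np.hstack([phi, np.ones((len(x), 1))])           # bias column

w, *_ = np.linalg.lstsq(phi, y, rcond=None)     # output weights
rmse = np.sqrt(np.mean((phi @ w - y) ** 2))
print(f"RBF fit RMSE: {rmse:.3f}")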

Relevance: 30.00%

Abstract:

Acquiring a mechanistic understanding of the role of biotic feedbacks in the links between atmospheric CO2 concentrations and temperature is essential for trustworthy climate predictions. Currently, computer-based simulations are the only available tool for estimating the global impact of biotic feedbacks on future atmospheric CO2 and temperatures. Here we propose an alternative and complementary approach using materially closed, energetically open analogue (physical) models of the carbon cycle. We argue that there is potential in using a materially closed approach to improve our understanding of the magnitude and sign of many biotic feedbacks, and that recent technological advances make this feasible. We also suggest how such systems could be designed and discuss the advantages and limitations of establishing physical models of the global carbon cycle.

Relevance: 30.00%

Abstract:

Determination of the local structure of a polymer glass by scattering methods is complex owing to the number of spatial and orientational correlations, both from within the polymer chain (intrachain) and between neighbouring chains (interchain), from which the scattering arises. Recently, considerable advances have been made in the structural analysis of relatively simple polymers such as poly(ethylene) through the use of broad-Q neutron scattering data tightly coupled to atomistic modelling procedures. This paper presents the results of an investigation into the use of these procedures for the analysis of the local structure of atactic poly(methyl methacrylate) (a-PMMA), which is chemically more complex, with a much greater number of intrachain structural parameters. We have utilised high-quality neutron scattering data obtained using SANDALS at ISIS, coupled with computer models representing both the single chain and the bulk polymer system. Several different modelling approaches have been explored, encompassing techniques such as Reverse Monte Carlo refinement and energy minimisation, and their relative merits and successes are discussed. These different approaches highlight structural parameters which any realistic model of glassy atactic PMMA must replicate.
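
A schematic sketch of a single Reverse Monte Carlo move, in which an atomic configuration is perturbed and accepted or rejected according to its fit to measured scattering data; the structure-factor calculator here is a crude stand-in, and nothing below reflects the paper's actual refinement code:

```python
import numpy as np

rng = np.random.default_rng(1)

def chi2(model_sq, data_sq, sigma=0.05):
    """Misfit between model and measured structure factors S(Q)."""
    return np.sum((model_sq - data_sq) ** 2) / sigma**2

def rmc_step(coords, data_sq, compute_sq, max_move=0.1):
    """One RMC move: perturb a random atom, accept if the fit to the
    data improves, or with Metropolis probability if it worsens."""
    old = chi2(compute_sq(coords), data_sq)
    trial = coords.copy()
    trial[rng.integers(len(trial))] += rng.uniform(-max_move, max_move, 3)
    new = chi2(compute_sq(trial), data_sq)
    if new < old or rng.random() < np.exp(-(new - old) / 2.0):
        return trial, new
    return coords, old

# Toy usage with a stand-in S(Q) calculator; a real one would involve
# Fourier transforms of the pair correlations.
def dummy_sq(c):
    r = np.linalg.norm(c - c.mean(axis=0), axis=1)
    return np.histogram(r, bins=20, range=(0, 10))[0] / len(c)

coords = rng.uniform(0, 10, size=(50, 3))
data = dummy_sq(coords) + 0.01
coords, fit = rmc_step(coords, data, dummy_sq)
print("chi^2 after one step:", fit)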

Relevance: 30.00%

Abstract:

The impending threat of global climate change and its regional manifestations is among the most important and urgent problems facing humanity. Society needs accurate and reliable estimates of changes in the probability of regional weather variations to develop science-based adaptation and mitigation strategies. Recent advances in weather prediction and in our understanding and ability to model the climate system suggest that it is both necessary and possible to revolutionize climate prediction to meet these societal needs. However, the scientific workforce and the computational capability required to bring about such a revolution are not available in any single nation. Motivated by the success of internationally funded infrastructure in other areas of science, this paper argues that, because of the complexity of the climate system, and because the regional manifestations of climate change arise mainly through changes in the statistics of regional weather variations, the scientific and computational requirements to predict its behavior reliably are so enormous that the nations of the world should create a small number of multinational high-performance computing facilities dedicated to the grand challenges of developing the capabilities to predict climate variability and change on both global and regional scales over the coming decades. Such facilities will play a key role in the development of next-generation climate models, build global capacity in climate research, nurture a highly trained workforce, and engage the global user community, policy-makers, and stakeholders. We recommend the creation of a small number of multinational facilities with computer capability at each facility of about 20 petaflops in the near term, about 200 petaflops within five years, and 1 exaflop by the end of the next decade. Each facility should have sufficient scientific workforce to develop and maintain the software and data analysis infrastructure. Such facilities will make it possible to determine what horizontal and vertical resolution in atmospheric and ocean models is necessary for more confident predictions at the regional and local level, an investigation that current limitations in computing power have severely constrained and that is now badly needed. These facilities will also provide the world's scientists with the computational laboratories for fundamental research on weather-climate interactions using 1-km resolution models and on atmospheric, terrestrial, cryospheric, and oceanic processes at even finer scales. Each facility should have enabling infrastructure including hardware, software, and data analysis support, and scientific capacity to interact with the national centers and other visitors. This will accelerate our understanding of how the climate system works and how to model it. It will ultimately enable the climate community to provide society with climate predictions based on our best knowledge of science and the most advanced technology.

Relevance: 30.00%

Abstract:

Performance modelling is a useful tool in the lifecycle of high-performance scientific software, such as weather and climate models, especially as a means of ensuring efficient use of available computing resources. In particular, sufficiently accurate performance prediction could reduce the effort and experimental computer time required when porting and optimising a climate model to a new machine. In this paper, traditional techniques are used to predict the computation time of a simple shallow water model which is illustrative of the computation (and communication) involved in climate models. These models are compared with real execution data gathered on AMD Opteron-based systems, including several phases of the U.K. academic community HPC resource, HECToR. Some success is achieved in relating source code to achieved performance for the K10 series of Opterons, but the method is found to be inadequate for the next-generation Interlagos processor. This experience motivates the investigation of a data-driven application benchmarking approach to performance modelling. Results for an early version of the approach are presented using the shallow water model as an example.
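
A minimal sketch of the data-driven idea: fit a simple runtime model to a handful of benchmark measurements and extrapolate to a larger run. The model form and all timings are invented, not taken from the HECToR study:

```python
import numpy as np

# Invented benchmark measurements: runtime of the shallow water code
# at several grid sizes on the target machine.
grid_points = np.array([1e5, 2e5, 4e5, 8e5])
runtime_s   = np.array([1.1, 2.3, 4.4, 9.0])

# Fit t(n) = a + b*n by least squares, then extrapolate to a bigger run.
A = np.vstack([np.ones_like(grid_points), grid_points]).T
(a, b), *_ = np.linalg.lstsq(A, runtime_s, rcond=None)

target = 3.2e6
print(f"Predicted runtime for {target:.0e} points: {a + b*target:.1f} s")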