32 results for Computer models
in CentAUR: Central Archive at the University of Reading - UK
Abstract:
In this paper we are mainly concerned with the development of efficient computer models capable of accurately predicting the propagation of low-to-middle frequency sound in the sea, in axially symmetric (2D) and in fully 3D environments. The major physical features of the problem, i.e. a variable bottom topography, elastic properties of the subbottom structure, volume attenuation and other range inhomogeneities, are efficiently treated. The computer models presented are based on normal mode solutions of the Helmholtz equation on the one hand, and on various types of numerical schemes for parabolic approximations of the Helmholtz equation on the other. A new coupled mode code is introduced to model sound propagation in range-dependent ocean environments with variable bottom topography, where the effects of an elastic bottom, volume attenuation, and surface and bottom roughness are taken into account. New computer models based on finite difference and finite element techniques for the numerical solution of parabolic approximations are also presented. They include efficient modelling of the bottom influence via impedance boundary conditions, and they cover wide-angle propagation, elastic bottom effects, variable bottom topography and reverberation effects. All the models are validated on several benchmark problems and against experimental data, and the results are compared with analogous results from standard codes in the literature.
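As a rough illustration of the second family of models mentioned above, the sketch below range-steps the standard (narrow-angle) parabolic approximation of the Helmholtz equation with a Crank-Nicolson finite-difference scheme. The frequency, grid and two-layer sound-speed profile are invented for the example, and none of the paper's actual features (coupled modes, impedance boundary conditions, elasticity, 3D effects) are reproduced.

```python
import numpy as np

# Minimal Crank-Nicolson range-stepper for the standard (narrow-angle)
# parabolic approximation of the Helmholtz equation:
#   2*i*k0 dpsi/dr + d2psi/dz2 + k0^2*(n^2 - 1)*psi = 0
# Frequency, grid sizes and the sound-speed profile below are illustrative only.
f, c0 = 100.0, 1500.0          # frequency (Hz), reference sound speed (m/s)
k0 = 2 * np.pi * f / c0        # reference wavenumber
dz, dr = 1.0, 10.0             # depth and range steps (m)
nz, nr = 200, 500              # grid points in depth and range
z = np.arange(1, nz + 1) * dz

c = np.full(nz, 1500.0)        # water-column sound speed (toy profile)
c[z > 150.0] = 1700.0          # crude fluid "bottom" below 150 m
n2 = (c0 / c) ** 2             # squared index of refraction

# Second-derivative operator in depth (pressure-release top, rigid truncation)
D2 = (np.diag(np.full(nz - 1, 1.0), -1)
      - 2.0 * np.eye(nz)
      + np.diag(np.full(nz - 1, 1.0), 1)) / dz**2

L = (1j / (2 * k0)) * (D2 + k0**2 * np.diag(n2 - 1.0))
A = np.eye(nz) - 0.5 * dr * L
B = np.eye(nz) + 0.5 * dr * L

# Gaussian starting field centred on a 50 m deep source
zs = 50.0
psi = np.exp(-((z - zs) ** 2) / (2 * (3.0 / k0) ** 2)).astype(complex)

for _ in range(nr):            # march the field envelope out in range
    psi = np.linalg.solve(A, B @ psi)

tl = -20 * np.log10(np.abs(psi) + 1e-12)   # relative transmission loss (dB)
print(tl[int(zs / dz)])
```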
Abstract:
The possibility of future rapid climatic changes is a pressing concern amongst climate scientists. For example, an abrupt collapse of the ocean's Thermohaline Circulation (THC) would rapidly cool the northern hemisphere and reduce the net global primary productivity of vegetation, according to computer models. It is unclear how to incorporate such low-probability, high-impact events into the development of economic policies. This paper reviews the salient aspects of rapid climate change relevant to economists and policy makers. The main scientific certainties and uncertainties are clearly delineated, with the aim of guiding economic goals and ensuring that they retain fidelity to their scientific underpinnings.
Abstract:
Our understanding of the climate system has recently been revolutionized by the development of sophisticated computer models. The predictions of such models are used to formulate international protocols intended to mitigate the severity of global warming and its impacts. Yet these models are not perfect representations of reality, because they remove from explicit consideration many physical processes which are known to be key aspects of the climate system, but which are too small or fast to be modelled. The purpose of this paper is to give a personal perspective on the current state of knowledge regarding the problem of unresolved scales in climate models. A recent novel solution to the problem is discussed, in which it is proposed, somewhat counter-intuitively, that the performance of models may be improved by adding random noise to represent the unresolved processes.
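The idea of representing unresolved processes with random noise can be conveyed with a toy model. The sketch below integrates a Lorenz '96 system in which the unresolved-scale forcing is taken either as a constant or as that constant plus AR(1) noise; all parameter values are illustrative and the scheme is not the one proposed in the paper.

```python
import numpy as np

# Toy illustration of stochastic parameterisation (not from the paper):
# a Lorenz '96 model in which the unresolved-scale forcing is represented
# either by a constant deterministic term or by that term plus red noise.
K, F = 40, 8.0                 # number of resolved variables, mean forcing
dt, nsteps = 0.005, 4000
rng = np.random.default_rng(0)

def tendency(x, forcing):
    # Standard Lorenz '96 advection + damping + forcing
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + forcing

def integrate(stochastic=False, sigma=0.5, phi=0.9):
    x = F + 0.1 * rng.standard_normal(K)
    e = np.zeros(K)            # AR(1) noise standing in for unresolved scales
    for _ in range(nsteps):
        forcing = F + (e if stochastic else 0.0)
        # 4th-order Runge-Kutta step for the resolved variables
        k1 = tendency(x, forcing)
        k2 = tendency(x + 0.5 * dt * k1, forcing)
        k3 = tendency(x + 0.5 * dt * k2, forcing)
        k4 = tendency(x + dt * k3, forcing)
        x = x + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        if stochastic:
            e = phi * e + sigma * np.sqrt(1 - phi**2) * rng.standard_normal(K)
    return x

print(integrate(stochastic=False).std(), integrate(stochastic=True).std())
```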
Abstract:
Recent severe flooding in the UK has highlighted the need for better information on flood risk, increasing the pressure on engineers to enhance the capabilities of computer models for flood prediction. This paper evaluates the benefits to be gained from the use of remotely sensed data to support flood modelling. The remotely sensed data available can be used either to produce high-resolution digital terrain models (DTMs) (light detection and ranging (Lidar) data), or to generate accurate inundation mapping of past flood events (airborne synthetic aperture radar (SAR) data and aerial photography). The paper reports on the modelling of real flood events that occurred at two UK sites on the rivers Severn and Ouse. At these sites a combination of remotely sensed data and recorded hydrographs was available. It is concluded, first, that Lidar-generated DTMs support the generation of considerably better models and enhance the visualisation of model results and, second, that flood outlines obtained from airborne SAR or aerial images help develop an appreciation of the hydraulic behaviour of important model components and facilitate model validation. The need for further research is highlighted by a number of limitations, namely: the difficulties in obtaining an adequate representation of hydraulically important features such as embankment crests and walls; uncertainties in the validation data; and difficulties in extracting flood outlines from airborne SAR images in urban areas.
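One common way of using a SAR-derived flood outline for model validation, in the spirit of the work described above, is an area-overlap measure of fit between the modelled and observed inundation extents. The sketch below computes such a measure on hypothetical wet/dry rasters; it is not taken from the paper.

```python
import numpy as np

# Sketch of a simple measure of fit between a modelled inundation extent and
# a flood outline extracted from airborne SAR or aerial imagery: the ratio of
# the intersection to the union of the two wet areas. The arrays below are
# hypothetical rasters; 1 = wet, 0 = dry.
def extent_fit(modelled, observed):
    modelled = modelled.astype(bool)
    observed = observed.astype(bool)
    intersection = np.logical_and(modelled, observed).sum()
    union = np.logical_or(modelled, observed).sum()
    return intersection / union if union else np.nan

model_wet = np.zeros((100, 100), dtype=int)
sar_wet = np.zeros((100, 100), dtype=int)
model_wet[20:70, 30:80] = 1    # toy modelled flood extent
sar_wet[25:75, 30:75] = 1      # toy SAR-derived flood outline
print(f"extent fit = {extent_fit(model_wet, sar_wet):.2f}")
```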
Abstract:
The geospace environment is controlled largely by events on the Sun, such as solar flares and coronal mass ejections, which generate significant geomagnetic and upper atmospheric disturbances. The study of this Sun-Earth system, which has become known as space weather, has both intrinsic scientific interest and practical applications. Adverse conditions in space can damage satellites and disrupt communications, navigation, and electric power grids, as well as endanger astronauts. The Center for Integrated Space Weather Modeling (CISM), a Science and Technology Center (STC) funded by the U.S. National Science Foundation (see http://www.bu.edu/cism/), is developing a suite of integrated physics-based computer models that describe the space environment from the Sun to the Earth for use in both research and operations [Hughes and Hudson, 2004, p. 1241]. To further this mission, advanced education and training programs sponsored by CISM encourage students to view space weather as a system that encompasses the Sun, the solar wind, the magnetosphere, and the ionosphere/thermosphere. This holds especially true for participants in the CISM space weather summer school [Simpson, 2004].
Abstract:
Determination of the local structure of a polymer glass by scattering methods is complex due to the number of spatial and orientational correlations, both from within the polymer chain (intrachain) and between neighbouring chains (interchain), from which the scattering arises. Recently, considerable advances have been made in the structural analysis of relatively simple polymers such as poly(ethylene) through the use of broad Q neutron scattering data tightly coupled to atomistic modelling procedures. This paper presents the results of an investigation into the use of these procedures for the analysis of the local structure of a-PMMA, which is chemically more complex and has a much greater number of intrachain structural parameters. We have utilised high quality neutron scattering data obtained using SANDALS at ISIS coupled with computer models representing both the single chain and bulk polymer system. Several different modelling approaches have been explored, encompassing techniques such as Reverse Monte Carlo refinement and energy minimisation, and their relative merits and successes are discussed. These different approaches highlight structural parameters which any realistic model of glassy atactic PMMA must replicate.
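The flavour of a Reverse Monte Carlo refinement can be conveyed with a heavily simplified sketch: random atomic moves are accepted when they improve the agreement between a pair-distance histogram of the model and a target histogram standing in for the measured data. The box size, move size and acceptance temperature below are arbitrary, and the real procedure refines against the measured structure factor rather than this toy histogram.

```python
import numpy as np

# Highly simplified Reverse Monte Carlo loop (not the authors' procedure):
# single-atom moves are accepted if they reduce the chi-squared misfit between
# a histogram of pair distances in the model and a target histogram that
# stands in for the scattering-derived data.
rng = np.random.default_rng(1)
L, n_atoms = 20.0, 64                      # toy cubic box edge (arb. units), atom count
pos = rng.uniform(0, L, size=(n_atoms, 3))
bins = np.linspace(0.5, 8.0, 30)

def pair_histogram(p):
    d = p[:, None, :] - p[None, :, :]
    d -= L * np.round(d / L)               # minimum-image convention
    r = np.sqrt((d**2).sum(-1))[np.triu_indices(n_atoms, 1)]
    h, _ = np.histogram(r, bins=bins)
    return h / h.sum()

target = pair_histogram(rng.uniform(0, L, size=(n_atoms, 3)))  # stand-in "data"
chi2 = ((pair_histogram(pos) - target) ** 2).sum()

for _ in range(5000):
    i = rng.integers(n_atoms)
    trial = pos.copy()
    trial[i] = (trial[i] + rng.normal(0, 0.3, 3)) % L   # small random move
    new_chi2 = ((pair_histogram(trial) - target) ** 2).sum()
    # Accept improving moves always, worsening moves with Metropolis probability
    if new_chi2 < chi2 or rng.random() < np.exp((chi2 - new_chi2) / 1e-4):
        pos, chi2 = trial, new_chi2

print(chi2)
```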
Abstract:
Land cover data derived from satellites are commonly used to prescribe inputs to models of the land surface. Since such data inevitably contain errors, quantifying how uncertainties in the data affect a model's output is important. To do so, a spatial distribution of possible land cover values is required to propagate through the model's simulation. However, at the large scales required for climate models, such spatial modelling can be difficult. Also, computer models often require land cover proportions at sites larger than the original map scale as inputs, and it is the uncertainty in these proportions that this article discusses. This paper describes a Monte Carlo sampling scheme that generates realisations of land cover proportions from the posterior distribution as implied by a Bayesian analysis that combines spatial information in the land cover map and its associated confusion matrix. The technique is computationally simple and has been applied previously to the Land Cover Map 2000 for the region of England and Wales. This article demonstrates the ability of the technique to scale up to large (global) satellite-derived land cover maps and reports its application to the GlobCover 2009 data product. The results show that, in general, the GlobCover data possess only small biases, with the largest belonging to non-vegetated surfaces. In vegetated surfaces, the most prominent area of uncertainty is Southern Africa, which represents a complex heterogeneous landscape. It is also clear from this study that greater resources need to be devoted to the construction of comprehensive confusion matrices.
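A minimal sketch of confusion-matrix-based Monte Carlo sampling is given below: each mapped pixel's true class is drawn from the distribution P(true | mapped) implied by a hypothetical confusion matrix, and repeated sampling yields an ensemble of land cover proportions for one grid cell. This ignores the spatial information used in the article's Bayesian analysis and is only an illustration of the general idea.

```python
import numpy as np

# Illustrative sketch (not the article's exact scheme): a confusion matrix is
# used to derive, for each mapped class, a distribution over the true class;
# repeated sampling of true classes for the pixels in a grid cell then yields
# a Monte Carlo ensemble of land cover proportions for that cell.
rng = np.random.default_rng(42)

# Hypothetical confusion matrix: rows = true class, columns = mapped class
confusion = np.array([[80,  5,  2],
                      [10, 60,  8],
                      [ 5, 10, 70]], dtype=float)
# Posterior P(true | mapped), assuming the column totals reflect the priors
p_true_given_mapped = confusion / confusion.sum(axis=0, keepdims=True)

mapped_pixels = rng.integers(0, 3, size=500)   # mapped classes in one cell

def sample_proportions():
    true = np.array([rng.choice(3, p=p_true_given_mapped[:, m])
                     for m in mapped_pixels])
    return np.bincount(true, minlength=3) / true.size

ensemble = np.array([sample_proportions() for _ in range(200)])
print("mean proportions:", ensemble.mean(axis=0))
print("std of proportions:", ensemble.std(axis=0))
```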
Abstract:
A new algorithm is described for refining the pose of a model of a rigid object so that it conforms more accurately to the image structure. Elemental 3D forces are considered to act on the model. These are derived from directional derivatives of the image local to the projected model features. The convergence properties of the algorithm are investigated and compared to a previous technique. Its use in a video sequence of a cluttered outdoor traffic scene is also illustrated and assessed.
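A two-dimensional toy version of the force-driven idea is sketched below: points on a projected model contour each receive a force proportional to the directional derivative of the image along the contour normal, and the summed force updates the pose (translation only here). The blob image, ring-shaped model and gain are all invented for the illustration; the paper's algorithm acts on 3D models and the full pose.

```python
import numpy as np

# Toy sketch of the "elemental forces" idea (not the paper's 3D algorithm).
x, y = np.meshgrid(np.arange(100), np.arange(100))
image = np.exp(-((x - 50.0) ** 2 + (y - 50.0) ** 2) / (2 * 12.0 ** 2))  # toy blob

angles = np.linspace(0, 2 * np.pi, 36, endpoint=False)
model = 10.0 * np.column_stack([np.cos(angles), np.sin(angles)])  # ring contour
normals = model / 10.0                                            # outward normals

def sample(p):
    # Nearest-pixel image lookup, clamped to the image bounds
    i = np.clip(np.round(p).astype(int), 0, 99)
    return image[i[1], i[0]]

pose = np.array([58.0, 44.0])          # initial translation, deliberately off
for _ in range(100):
    pts = model + pose
    # Elemental force at each point: directional image derivative along normal
    deriv = np.array([(sample(p + n) - sample(p - n)) / 2.0
                      for p, n in zip(pts, normals)])
    force = (deriv[:, None] * normals).sum(axis=0)
    pose = pose + 5.0 * force          # gain chosen by hand for this toy
print(pose)                            # should approach the blob centre (50, 50)
```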
Abstract:
This workshop paper reports recent developments to a vision system for traffic interpretation which relies extensively on the use of geometrical and scene context. Firstly, a new approach to pose refinement is reported, based on forces derived from prominent image derivatives found close to an initial hypothesis. Secondly, a parameterised vehicle model is reported, able to represent different vehicle classes. This general vehicle model has been fitted to sample data, and subjected to a Principal Component Analysis to create a deformable model of common car types having 6 parameters. We show that the new pose recovery technique is also able to operate on the PCA model, to allow the structure of an initial vehicle hypothesis to be adapted to fit the prevailing context. We report initial experiments with the model, which demonstrate significant improvements to pose recovery.
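The construction of a deformable model by Principal Component Analysis can be sketched as follows. The training vectors here are random stand-ins for the fitted vehicle parameters, and keeping six modes simply mirrors the paper's choice of six parameters; this is a schematic of the technique, not the paper's model.

```python
import numpy as np

# Minimal sketch of building a deformable model by PCA. Each row of `samples`
# stands for one training vehicle described by a fixed-length vector of
# structural parameters; the data here are made up.
rng = np.random.default_rng(7)
samples = rng.normal(size=(40, 18))          # 40 hypothetical vehicles

mean_shape = samples.mean(axis=0)
centred = samples - mean_shape
# Singular value decomposition of the centred data gives the principal modes
_, svals, vt = np.linalg.svd(centred, full_matrices=False)
modes = vt[:6]                               # keep 6 deformation parameters
variances = svals[:6] ** 2 / (len(samples) - 1)

def instantiate(b):
    """Generate a shape vector from a 6-vector of deformation parameters."""
    return mean_shape + b @ modes

print(instantiate(np.zeros(6)).shape)        # the mean shape
print(variances)                             # variance captured by each mode
```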
Abstract:
A new formulation of a pose refinement technique using "active" models is described. An error term derived from the detection of image derivatives close to an initial object hypothesis is linearised and solved by least squares. The method is particularly well suited to problems involving external geometrical constraints (such as the ground-plane constraint). We show that the method is able to recover both the pose of a rigid model and the structure of a deformable model. We report an initial assessment of the performance and cost of pose and structure recovery using the active model in comparison with our previously reported "passive" model-based techniques in the context of traffic surveillance. The new method is more stable and requires fewer iterations, especially when the number of free parameters increases, but shows somewhat poorer convergence.
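The linearise-and-solve-by-least-squares step can be sketched generically: residuals are linearised about the current pose and the update is the least-squares solution of the linearised system (a Gauss-Newton iteration). The toy residual below aligns a 2D point set rather than using the paper's image-derivative error term, so it only illustrates the numerical machinery.

```python
import numpy as np

# Generic Gauss-Newton sketch of linearising an error term and solving the
# update by least squares (the paper's image-based error term is not used).
def gauss_newton(residual, p, iters=20, eps=1e-6):
    for _ in range(iters):
        r = residual(p)
        # Numerical Jacobian of the residual vector with respect to the pose
        J = np.column_stack([
            (residual(p + eps * np.eye(len(p))[j]) - r) / eps
            for j in range(len(p))
        ])
        delta, *_ = np.linalg.lstsq(J, -r, rcond=None)
        p = p + delta
    return p

# Toy residual: align a 2D model point set to observed points with pose
# (tx, ty, theta); a stand-in for an image-derived error term.
model = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 2.0], [0.0, 2.0]])
true_pose = np.array([3.0, -1.0, 0.4])

def transform(pts, pose):
    tx, ty, th = pose
    R = np.array([[np.cos(th), -np.sin(th)], [np.sin(th), np.cos(th)]])
    return pts @ R.T + [tx, ty]

observed = transform(model, true_pose)
residual = lambda pose: (transform(model, pose) - observed).ravel()
print(gauss_newton(residual, np.zeros(3)))   # should recover ~[3, -1, 0.4]
```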
Abstract:
Different optimization methods can be employed to optimize a numerical estimate for the match between an instantiated object model and an image. In order to take advantage of gradient-based optimization methods, perspective inversion must be used in this context. We show that convergence can be very fast by extrapolating to maximum goodness-of-fit with Newton's method. This approach is related to methods which either maximize a similar goodness-of-fit measure without use of gradient information, or else minimize distances between projected model lines and image features. Newton's method combines the accuracy of the former approach with the speed of convergence of the latter.
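A one-dimensional sketch of extrapolating to maximum goodness-of-fit with Newton's method is given below; the fit measure is a made-up function with a single maximum, and the perspective-inversion machinery of the paper is not reproduced.

```python
import numpy as np

# 1D sketch of Newton's method applied to maximising a goodness-of-fit
# measure: each step jumps to the maximum of the local quadratic model built
# from numerically estimated first and second derivatives.
def newton_maximise(fit, p0, h=1e-4, iters=10):
    p = p0
    for _ in range(iters):
        g = (fit(p + h) - fit(p - h)) / (2 * h)               # first derivative
        hess = (fit(p + h) - 2 * fit(p) + fit(p - h)) / h**2  # second derivative
        if hess >= 0:      # not locally concave: fall back to a small ascent step
            p += 0.1 * np.sign(g)
        else:
            p -= g / hess  # Newton step to the stationary point of the quadratic
    return p

# Toy goodness-of-fit with a single maximum at p = 2.5; the starting point is
# chosen inside the concave region around the maximum.
fit = lambda p: np.exp(-0.5 * (p - 2.5) ** 2)
print(newton_maximise(fit, p0=1.8))
```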
Abstract:
Flocking is the capacity of multiple animals, such as birds, to move coherently as a group. Prominent research into flocking is presented, of which Particle Swarm Optimisation (PSO) has been the most notable outcome. It is argued that opportunities for further research into flocking remain and, given the potential of automated traffic systems, that flocking should be reinvestigated for this purpose.
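For reference, the core PSO update that grew out of flocking research can be sketched in a few lines; the objective function, swarm size and coefficients below are arbitrary choices for illustration.

```python
import numpy as np

# Minimal Particle Swarm Optimisation sketch: velocities are drawn towards
# each particle's own best position and the swarm's best position, echoing
# the flocking behaviour the method was derived from.
rng = np.random.default_rng(3)
n_particles, dims = 30, 2
w, c1, c2 = 0.7, 1.5, 1.5                    # inertia and attraction weights

def objective(x):                            # toy problem: minimise the sphere function
    return (x ** 2).sum(axis=-1)

pos = rng.uniform(-5, 5, size=(n_particles, dims))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = objective(pbest)
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(200):
    r1, r2 = rng.random((2, n_particles, dims))
    # Velocity: inertia + pull towards personal best + pull towards global best
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    val = objective(pos)
    improved = val < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], val[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print(gbest)                                 # should be close to the origin
```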
Abstract:
Compute grids are used widely in many areas of environmental science, but there has been limited uptake of grid computing by the climate modelling community, partly because the characteristics of many climate models make them difficult to use with popular grid middleware systems. In particular, climate models usually produce large volumes of output data, and running them also involves complicated workflows implemented as shell scripts. A new grid middleware system that is well suited to climate modelling applications is presented in this paper. Grid Remote Execution (G-Rex) allows climate models to be deployed as Web services on remote computer systems and then launched and controlled as if they were running on the user's own computer. Output from the model is transferred back to the user while the run is in progress to prevent it from accumulating on the remote system and to allow the user to monitor the model. G-Rex has a REST architectural style, featuring a Java client program that can easily be incorporated into existing scientific workflow scripts. Some technical details of G-Rex are presented, with examples of its use by climate modellers.
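The pattern of launching a remote run and streaming its output back while the run is in progress can be sketched as follows. The endpoints, payloads and file names are invented for the illustration and are not the G-Rex API; this is only a schematic of the remote-execution style described above.

```python
import time
import requests   # third-party HTTP client, used here for illustration only

# Hypothetical sketch of the remote-execution pattern described above (the
# endpoint names are invented and are NOT the G-Rex API): submit a job to a
# RESTful service, then poll it, streaming output back to the local machine
# while the run is still in progress so it never accumulates remotely.
BASE = "http://example.org/grid-service"     # placeholder service URL

def run_remote(model_name, params):
    job = requests.post(f"{BASE}/jobs", json={"model": model_name,
                                              "params": params}).json()
    job_id = job["id"]
    with open("model_output.nc", "wb") as out:
        while True:
            status = requests.get(f"{BASE}/jobs/{job_id}").json()
            # Fetch any newly produced output beyond what we already hold
            chunk = requests.get(f"{BASE}/jobs/{job_id}/output",
                                 params={"offset": out.tell()})
            out.write(chunk.content)
            if status["state"] == "finished":
                break
            time.sleep(10)

run_remote("toy_climate_model", {"years": 10})
```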