987 results for reference modelling
On the modelling of the thermal interactions between a spray curtain and an impinging cold gas cloud
Abstract:
A mixed Lagrangian-Eulerian model of a Water Curtain barrier is presented. The heat, mass and momentum exchange processes are modelled in a Lagrangian framework for the dispersed phase and in an Eulerian framework for the carrier phase. The derivation of the coupling source terms is illustrated with reference to a given carrier-phase cell. The turbulent character of the flow is treated with a single-equation model, modified to account directly for the influence of the particles on the flow. The model is implemented in the form of a 2D incompressible Navier-Stokes solver, coupled to an adaptive Runge-Kutta method for the Lagrangian sub-system. Simulations of a free-standing full-cone water spray show satisfactory agreement with experiment. Predictions of a Water Curtain barrier impacted by a cold gas cloud point to markedly different flow fields for the upward and downward configurations, which could influence the effectiveness of chemical absorption in the liquid phase.
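To make the Lagrangian side of such a coupling concrete, the following is a minimal sketch, not the paper's implementation: a single droplet is advanced with an adaptive Runge-Kutta integrator under a simple Stokes-drag law, and its drag impulse is what a given carrier-phase cell would receive, with opposite sign, as a momentum source term. All parameter values and names are illustrative assumptions.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative droplet and gas properties (assumed, not taken from the paper).
rho_p, d_p = 1000.0, 100e-6               # droplet density [kg/m^3] and diameter [m]
mu_g = 1.8e-5                             # gas dynamic viscosity [Pa s]
m_p = rho_p * np.pi * d_p**3 / 6.0        # droplet mass [kg]
tau_p = rho_p * d_p**2 / (18.0 * mu_g)    # Stokes relaxation time [s]
u_gas = np.array([0.0, -1.0])             # carrier-phase velocity in the cell [m/s]
g = np.array([0.0, -9.81])                # gravity [m/s^2]

def droplet_rhs(t, y):
    """Lagrangian equations: dx/dt = u_p, du_p/dt = (u_gas - u_p)/tau_p + g."""
    u_p = y[2:]
    du_p = (u_gas - u_p) / tau_p + g
    return np.concatenate([u_p, du_p])

y0 = np.array([0.0, 2.0, 0.0, -5.0])      # initial position [m] and velocity [m/s]
sol = solve_ivp(droplet_rhs, (0.0, 0.05), y0, method="RK45", rtol=1e-6)

# Drag impulse on the droplet = momentum change minus the gravity impulse.
dt_total = sol.t[-1] - sol.t[0]
drag_impulse = m_p * (sol.y[2:, -1] - y0[2:]) - m_p * g * dt_total
# The cell receives the opposite of this, divided by its volume, as a momentum source.
print("momentum source on the gas per droplet [kg m/s]:", -drag_impulse)
```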
Abstract:
This paper presents the computational modelling of welding phenomena within a versatile numerical framework. The framework embraces models from both the fields of computational fluid dynamics (CFD) and computational solid mechanics (CSM). With regard to the CFD modelling of the weld pool fluid dynamics, heat transfer and phase change, cell-centred finite volume (FV) methods are employed. Additionally, novel vertex-based FV methods are employed with regard to the elasto-plastic deformation associated with the CSM. The FV methods are included within an integrated modelling framework, PHYSICA, which can be readily applied to unstructured meshes. The modelling techniques are validated against a variety of reference solutions.
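As a pointer to what a cell-centred finite volume discretisation of the heat transfer part of such a problem looks like, here is a minimal one-dimensional sketch under assumed, steel-like properties; it is generic and in no way reproduces the PHYSICA framework or its welding models.

```python
import numpy as np

# 1-D cell-centred finite volume model of transient heat conduction,
# rho*c*dT/dt = d/dx(k dT/dx), with a hot fixed-temperature face on the left
# (loosely standing in for the weld side) and an insulated face on the right.
L, n = 0.1, 50                         # domain length [m], number of cells
dx = L / n
rho, c, k = 7800.0, 500.0, 30.0        # assumed density, heat capacity, conductivity
alpha = k / (rho * c)
dt = 0.25 * dx**2 / alpha              # explicit stability limit with margin

T = np.full(n, 300.0)                  # initial temperature [K]
T_left = 1800.0                        # fixed boundary temperature [K]

for _ in range(2000):
    flux = np.zeros(n + 1)                                # heat flux at cell faces [W/m^2]
    flux[1:-1] = -k * (T[1:] - T[:-1]) / dx               # interior faces
    flux[0] = -k * (T[0] - T_left) / (dx / 2.0)           # left boundary face
    # flux[-1] stays zero: insulated right face
    T = T + dt * (flux[:-1] - flux[1:]) / (rho * c * dx)  # per-cell energy balance

print("temperature of the five cells nearest the hot face [K]:", T[:5].round(1))
```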
Abstract:
In this paper we evaluate whether the assimilation of remotely-sensed optical data into a marine ecosystem model improves the simulation of biogeochemistry in a shelf sea. A localized Ensemble Kalman filter was used to assimilate weekly diffuse light attenuation coefficient data, Kd(443) from SeaWiFS, into an ecosystem model of the western English Channel. The spatial distributions of (unassimilated) surface chlorophyll from satellite, and a multivariate time series of eighteen biogeochemical and optical variables measured in situ at one long-term monitoring site, were used to evaluate the system performance for the year 2006. Assimilation reduced the root mean square error and improved the correlation with the assimilated Kd(443) observations, for both the analysis and, to a lesser extent, the forecast estimates, when compared to the reference model simulation. Improvements in the simulation of (unassimilated) ocean colour chlorophyll were less evident, and in some parts of the Channel the simulation of these data deteriorated. The estimation errors for the (unassimilated) in situ data were reduced for most variables, with some exceptions, e.g. dissolved nitrogen. Importantly, the assimilation adjusted the balance of ecosystem processes by shifting the simulated food web towards the microbial loop, thus improving the estimation of some properties, e.g. total particulate carbon. Assimilation of Kd(443) outperformed a comparative chlorophyll assimilation experiment, both in the estimation of ocean colour data and in the simulation of independent in situ data. These results are attributed to the relatively low error in the Kd(443) data and to the fact that Kd(443) is a bulk optical property of marine ecosystems. Assimilation of remotely-sensed optical properties is a promising approach to improve the simulation of biogeochemical and optical variables that are relevant for ecosystem functioning and climate change studies.
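For readers unfamiliar with the method, the following is a minimal, generic sketch of the stochastic ensemble Kalman filter analysis step that this kind of system applies at each assimilation time; the state size, observation operator and error statistics are placeholders, and the localization used by the authors is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)
n_state, n_obs, n_ens = 50, 5, 20          # illustrative dimensions only

X = rng.normal(size=(n_state, n_ens))      # ensemble of model states (e.g. biogeochemical fields)
H = np.zeros((n_obs, n_state))
H[np.arange(n_obs), np.arange(n_obs)] = 1.0  # toy observation operator (observes first 5 elements)
R = 0.1 * np.eye(n_obs)                    # observation-error covariance
y = rng.normal(size=n_obs)                 # observations (standing in for weekly Kd(443))

def enkf_analysis(X, y, H, R, rng):
    """Stochastic EnKF: perturb the observations and apply the Kalman gain to each member."""
    n_ens = X.shape[1]
    A = X - X.mean(axis=1, keepdims=True)               # ensemble anomalies
    P = A @ A.T / (n_ens - 1)                           # sample forecast covariance
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)        # Kalman gain
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, n_ens).T
    return X + K @ (Y - H @ X)

Xa = enkf_analysis(X, y, H, R, rng)
print("analysis ensemble mean (first five state elements):", Xa.mean(axis=1)[:5].round(2))
```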
Abstract:
A coupled hydrodynamic-biogeochemical model was implemented in order to estimate the effects of Major Baltic Inflows on the near-bottom hydrophysical and biogeochemical conditions in the northern Baltic Proper and the western Gulf of Finland during the period 1991–2009. We compared results of a realistic reference run to the results of an experimental run where Major Baltic Inflows were suppressed. Further to the expected overall decrease in bottom salinity, this modelling experiment confirms that in the absence of strong saltwater inflows the deep areas of the Baltic Proper would become more anoxic, while in the shallower areas (western Gulf of Finland) near-bottom average conditions improve. Our experiment revealed that typical estuarine circulation results in the sporadic emergence of short-lasting events of near-bottom anoxia in the western Gulf of Finland due to transport of water masses from the Baltic Proper. Extrapolating our results beyond the modelled period, we speculate that the further deepening of the halocline in the Baltic Proper is likely to prevent inflows of anoxic water to the Gulf of Finland and in the longer term would lead to improvement in near-bottom conditions in the Baltic Proper. Our results reaffirm the importance of accurate representation of salinity dynamics in coupled Baltic Sea models serving as a basis for credible hindcast and future projection simulations of biogeochemical conditions.
Abstract:
We describe a detailed depth- and time-dependent model of the molecular cloud associated with the ultracompact H II region G 34.3+0.15. Previous work on observations of NH3 and CS indicates that the molecular cloud has three distinct physical components: an ultracompact hot core, a compact hot core and an extended halo. We have used the physical parameters derived from these observations as input to our detailed chemical kinetic modelling. The results of the model calculations are discussed with reference to the different chemistries occurring in each component and are compared with abundances derived from our recent spectral line survey of G 34.3+0.15 (Paper I).
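For orientation only, and with generic symbols rather than the specific reaction network or rate coefficients of the paper, such time-dependent chemical kinetic models integrate coupled rate equations of the form

\frac{dn_i}{dt} = \sum_{j,k} k_{jk \to i}\, n_j n_k \;-\; n_i \sum_{l} k_{il}\, n_l ,

where n_i is the number density of species i, the first sum runs over the two-body reactions that form i, the second over the reactions that destroy it, and the rate coefficients depend on the temperature and density adopted for each physical component (ultracompact core, compact core and halo).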
Abstract:
This Integration Insight provides a brief overview of the most popular modelling techniques used to analyse complex real-world problems, as well as some less popular but highly relevant techniques. The modelling methods are divided into three categories, with each encompassing a number of methods, as follows: 1) Qualitative Aggregate Models (Soft Systems Methodology, Concept Maps and Mind Mapping, Scenario Planning, Causal (Loop) Diagrams), 2) Quantitative Aggregate Models (Function Fitting and Regression, Bayesian Nets, Systems of Differential Equations / Dynamical Systems, System Dynamics, Evolutionary Algorithms) and 3) Individual-Oriented Models (Cellular Automata, Microsimulation, Agent-Based Models, Discrete Event Simulation, Social Network Analysis). Each technique is broadly described with example uses, key attributes and reference material.
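As a concrete illustration of one of the individual-oriented techniques listed above, the short sketch below runs an elementary one-dimensional cellular automaton (Wolfram rule 110); it is a generic example, not drawn from the Insight itself.

```python
import numpy as np

def step(cells, rule=110):
    """Advance a 1-D binary cellular automaton one generation under the given Wolfram rule."""
    rule_bits = [(rule >> i) & 1 for i in range(8)]          # output for each 3-cell pattern 0..7
    left, right = np.roll(cells, 1), np.roll(cells, -1)
    pattern = 4 * left + 2 * cells + right                   # encode each neighbourhood as 0..7
    return np.array([rule_bits[p] for p in pattern])

cells = np.zeros(64, dtype=int)
cells[32] = 1                                                # single seed cell
for _ in range(20):
    print("".join("#" if c else "." for c in cells))
    cells = step(cells)
```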
Abstract:
Tidally induced currents in estuarine flows are usually modulated by the tidal regime and respond differently to changes imposed on its natural propagation by geomorphologic alterations. Some of these changes are due to the implementation of heavy engineering works, most of the time required by the navigation needs associated with harbour growth. The main purpose of this study is to evaluate the hydrodynamic response of Ria de Aveiro to an alteration of the present geometry of its inlet, which was artificially delimited in 1808 through the construction of two jetties. In order to provide access for deeper-draft vessels to the Aveiro harbour, its Administration intends to create better conditions for navigation through a 200 m extension of the north jetty. The two-dimensional hydrodynamic model SIMSYS2D was used in this study to simulate two distinct situations: the present Ria de Aveiro configuration (2009), which is used as reference, and the future inlet configuration with the jetty extension. Several simulations were performed, using both bathymetries and considering extreme tidal conditions as forcing on the model's oceanic open boundary. The tidal prism at the lagoon mouth and at the main lagoon channels was determined. Values of sea surface elevation and horizontal current velocity were comparatively analyzed, as well as harmonic analysis results. The results for the projected inlet configuration show an increase relative to those for the present configuration, although the differences found are not significant for most of the cases analyzed. More studies should be performed in order to clarify the long-term impact of these works on the lagoon hydrodynamics.
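Since the tidal prism is the key diagnostic compared between the two configurations, the fragment below shows one common way of computing it from a discharge time series through the inlet cross-section: integrating the flood-directed discharge over one tidal cycle. The series here is synthetic and purely illustrative, not SIMSYS2D output.

```python
import numpy as np

# Hypothetical discharge through the lagoon mouth over one M2 tidal period.
t = np.linspace(0.0, 12.42 * 3600.0, 500)             # time [s]
Q = 8000.0 * np.sin(2.0 * np.pi * t / t[-1])           # cross-section discharge [m^3/s], flood positive

# Tidal prism: volume entering during flood = integral of the positive discharge.
flood = np.clip(Q, 0.0, None)
prism = np.sum(0.5 * (flood[:-1] + flood[1:]) * np.diff(t))   # trapezoidal rule
print(f"tidal prism ~ {prism:.3e} m^3")
```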
Abstract:
The selection of the energy source to power the transport sector is one of the main current concerns, not only in relation to the energy paradigm but also due to the strong influence of road traffic in urban areas, which strongly affects human exposure to air pollutants, human health and quality of life. Due to current technical limitations of advanced energy sources for transportation, biofuels are seen as an alternative way to power the world's motor vehicles in the near future, helping to reduce GHG emissions while at the same time stimulating rural development. Motivated by European strategies, Portugal has been betting on biofuels, especially biodiesel, to meet the Directive 2009/28/EC goals for road transport, even though little is known about their impacts on air quality. In this sense, this work intends to clarify this issue by trying to answer the following question: can biodiesel use contribute to better air quality over Portugal, particularly over urban areas? The first step of this work consisted of the characterization of the national biodiesel supply chain, which showed that the biodiesel chain has sustainability problems, as it depends on the importation of raw materials and therefore does not contribute to reducing external energy dependence. Next, atmospheric pollutant emissions and air quality impacts associated with the use of biodiesel in road transport were assessed over Portugal, and in particular over the Porto urban area, using the WRF-EURAD mesoscale numerical modelling system. For that purpose, two emission scenarios were defined: a reference situation without biodiesel use and a scenario reflecting the use of a B20 fuel. The comparison of both scenarios showed that the use of B20 fuel helps to control air pollution, promoting reductions in PM10, PM2.5, CO and total NMVOC concentrations. It was also verified that NO2 concentrations decrease over mainland Portugal but increase in the Porto urban area, as do formaldehyde, acetaldehyde and acrolein emissions in both case studies. However, the use of pure diesel is more injurious to human health, because its dominant VOCs have higher chronic hazard quotients and hazard indices than those of B20.
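As a pointer to how the cited hazard metrics are typically obtained, the sketch below computes chronic hazard quotients (pollutant exposure concentration divided by a reference concentration) and their sum, the hazard index, for a handful of hypothetical VOC values; the numbers are placeholders, not results from this study.

```python
# Hypothetical exposure and reference concentrations [ug/m^3]; placeholder values only.
exposure = {"formaldehyde": 4.0, "acetaldehyde": 2.5, "acrolein": 0.05}
reference = {"formaldehyde": 9.8, "acetaldehyde": 9.0, "acrolein": 0.02}

# Hazard quotient per compound, and hazard index as their sum.
hazard_quotients = {voc: exposure[voc] / reference[voc] for voc in exposure}
hazard_index = sum(hazard_quotients.values())

for voc, hq in hazard_quotients.items():
    print(f"HQ({voc}) = {hq:.2f}")
print(f"Hazard index = {hazard_index:.2f}")
```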
Abstract:
The decadal predictability of three-dimensional Atlantic Ocean anomalies is examined in a coupled global climate model (HadCM3) using a Linear Inverse Modelling (LIM) approach. It is found that the evolution of temperature and salinity in the Atlantic, and the strength of the meridional overturning circulation (MOC), can be effectively described by a linear dynamical system forced by white noise. The forecasts produced using this linear model are more skillful than other reference forecasts for several decades. Furthermore, significant non-normal amplification is found under several different norms. The regions from which this growth occurs are found to be fairly shallow and located in the far North Atlantic. Initially, anomalies in the Nordic Seas impact the MOC, and the anomalies then grow to fill the entire Atlantic basin, especially at depth, over one to three decades. It is found that the structure of the optimal initial condition for amplification is sensitive to the norm employed, but the initial growth seems to be dominated by MOC-related basin scale changes, irrespective of the choice of norm. The consistent identification of the far North Atlantic as the most sensitive region for small perturbations suggests that additional observations in this region would be optimal for constraining decadal climate predictions.
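A generic sketch of how the linear operator in a LIM is usually estimated, from the lagged and contemporaneous covariances of the state anomalies; the dimensions and data here are synthetic placeholders, not HadCM3 output, and the norm-dependent optimal-perturbation analysis is not shown.

```python
import numpy as np
from scipy.linalg import logm, expm

rng = np.random.default_rng(1)
n_state, n_time, tau = 6, 2000, 1            # illustrative sizes (e.g. leading PCs, yearly means)

# Synthetic anomaly time series from a damped linear system driven by white noise,
# standing in for Atlantic temperature/salinity/MOC anomalies.
A_true = 0.8 * np.eye(n_state) + 0.05 * rng.normal(size=(n_state, n_state))
x = np.zeros((n_state, n_time))
for k in range(1, n_time):
    x[:, k] = A_true @ x[:, k - 1] + rng.normal(size=n_state)

# LIM: assume x(t+tau) ~ exp(L*tau) x(t) + noise and estimate L from lagged covariances.
C0 = x[:, :-tau] @ x[:, :-tau].T / (n_time - tau)
Ct = x[:, tau:] @ x[:, :-tau].T / (n_time - tau)
L = logm(Ct @ np.linalg.inv(C0)) / tau

# Deterministic forecast operator for a 10-step lead, applied to the last observed state.
forecast = (expm(L * 10) @ x[:, -1]).real
print("10-step LIM forecast of the anomalies:", forecast.round(2))
```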
Abstract:
While the standard models of concentration addition and independent action predict the overall toxicity of multicomponent mixtures reasonably well, interactions may limit their predictive capability when a few compounds dominate a mixture. This study was conducted to test whether statistically significant systematic deviations from concentration addition (i.e. synergism/antagonism, dose ratio- or dose level-dependency) occur when two taxonomically unrelated species, the earthworm Eisenia fetida and the nematode Caenorhabditis elegans, were exposed to a full range of mixtures of the similarly acting neonicotinoid pesticides imidacloprid and thiacloprid. The effect of the mixtures on C. elegans was described significantly better (p<0.01) by a dose level-dependent deviation from the concentration addition model than by the reference model alone, while the reference model description of the effects on E. fetida could not be significantly improved. These results highlight that deviations from concentration addition are possible even with similarly acting compounds, but that the nature of such deviations is species dependent. For improving the ecological risk assessment of simple mixtures, this implies that the concentration addition model may need to be used in a probabilistic context, rather than in its traditional deterministic manner.
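For reference, and in generic notation rather than the exact parameterisation fitted in the study, the two standard mixture models read

\sum_{i=1}^{n} \frac{c_i}{\mathrm{EC}_{x,i}} = 1 \qquad \text{(concentration addition)}

E(c_{\mathrm{mix}}) = 1 - \prod_{i=1}^{n} \bigl(1 - E_i(c_i)\bigr) \qquad \text{(independent action)}

where c_i is the concentration of component i in a mixture producing effect level x, EC_{x,i} is the concentration of component i alone producing that effect, and E_i is the dose-response function of component i. Systematic deviations such as synergism/antagonism or dose ratio- and dose level-dependence are typically tested by adding extra parameters to the concentration addition form and checking whether the fit improves significantly.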
Abstract:
Evaluating agents in decision-making applications requires assessing their skill and predicting their behaviour. Both are well developed in Poker-like situations, but less so in more complex game and model domains. This paper addresses both tasks by using Bayesian inference in a benchmark space of reference agents. The concepts are explained and demonstrated using the game of chess, but the model applies generically to any domain with quantifiable options and fallible choice. Demonstration applications address questions frequently asked by the chess community regarding the stability of the rating scale, the comparison of players of different eras and/or leagues, and controversial incidents possibly involving fraud. The last of these includes alleged under-performance, fabrication of tournament results, and clandestine use of computer advice during competition. Beyond the model world of games, the aim is to improve fallible human performance in complex, high-value tasks.
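A minimal sketch of the kind of Bayesian update over a space of reference agents that the abstract describes: each candidate agent assigns probabilities to the available options at each decision point, and the observed choices update a posterior over the agents. The agents, probabilities and observations below are invented for illustration and are not the paper's benchmark.

```python
import numpy as np

# Three hypothetical reference agents of increasing skill; each array gives the
# probability that agent assigns to the options (best, second-best, blunder).
agents = {
    "weak":   np.array([0.40, 0.35, 0.25]),
    "medium": np.array([0.60, 0.30, 0.10]),
    "strong": np.array([0.85, 0.13, 0.02]),
}
prior = {name: 1.0 / len(agents) for name in agents}

observed_choices = [0, 0, 1, 0, 0, 0, 2, 0]   # indices of the options actually chosen

posterior = dict(prior)
for choice in observed_choices:
    for name, probs in agents.items():
        posterior[name] *= probs[choice]       # multiply in the likelihood of this choice
    total = sum(posterior.values())
    posterior = {name: p / total for name, p in posterior.items()}  # renormalise

for name, p in posterior.items():
    print(f"P({name} | choices) = {p:.3f}")
```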
Abstract:
The European research project TIDE (Tidal Inlets Dynamics and Environment) is developing and validating coupled models describing the morphological, biological and ecological evolution of tidal environments. The interactions between the physical and biological processes occurring in these regions require that the system be studied as a whole rather than as separate parts. Extensive use of remote sensing, including LiDAR, is being made to provide validation data for the modelling. This paper describes the different uses of LiDAR within the project and their relevance to the TIDE science objectives. LiDAR data have been acquired from three different environments: the Venice Lagoon in Italy, Morecambe Bay in England, and the Eden estuary in Scotland. LiDAR accuracy at each site has been evaluated using ground reference data acquired with differential GPS. A semi-automatic technique has been developed to extract tidal channel networks from LiDAR data, either used alone or fused with aerial photography. While the resulting networks may require some correction, the procedure does allow network extraction over large areas using objective criteria and reduces fieldwork requirements. The networks extracted may subsequently be used in geomorphological analyses, for example to describe the drainage patterns induced by networks and to examine the rate of change of networks. Estimation of the heights of the low and sparse vegetation on marshes is being investigated by analysis of the statistical distribution of the measured LiDAR heights. Species having different mean heights may be separated using the first-order moments of the height distribution.
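To give a flavour of channel detection from a LiDAR surface, and only as a crude stand-in for the semi-automatic technique developed in the project, the sketch below flags as "channel" those cells of a synthetic DEM that sit well below the locally filtered background surface; the data, filter size and threshold are all illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import median_filter

# Synthetic 1 m DEM of a flat marsh surface with an incised, meandering channel;
# purely illustrative, not the TIDE LiDAR data or the project's extraction method.
ny, nx = 200, 200
y, x = np.mgrid[0:ny, 0:nx]
channel_centre = 100 + 20 * np.sin(2 * np.pi * x / 80.0)
dem = 2.0 - 1.5 * np.exp(-((y - channel_centre) ** 2) / (2 * 3.0**2))
dem += np.random.default_rng(0).normal(0.0, 0.05, dem.shape)   # LiDAR-like noise

# Flag cells that sit well below the local background surface.
background = median_filter(dem, size=31)
residual = dem - background
channel_mask = residual < -0.3                                  # threshold in metres

print("fraction of cells classified as channel:", channel_mask.mean().round(3))
```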
Abstract:
The assumption that negligible work is involved in the formation of new surfaces in the machining of ductile metals, is re-examined in the light of both current Finite Element Method (FEM) simulations of cutting and modern ductile fracture mechanics. The work associated with separation criteria in FEM models is shown to be in the kJ/m2 range rather than the few J/m2 of the surface energy (surface tension) employed by Shaw in his pioneering study of 1954 following which consideration of surface work has been omitted from analyses of metal cutting. The much greater values of surface specific work are not surprising in terms of ductile fracture mechanics where kJ/m2 values of fracture toughness are typical of the ductile metals involved in machining studies. This paper shows that when even the simple Ernst–Merchant analysis is generalised to include significant surface work, many of the experimental observations for which traditional ‘plasticity and friction only’ analyses seem to have no quantitative explanation, are now given meaning. In particular, the primary shear plane angle φ becomes material-dependent. The experimental increase of φ up to a saturated level, as the uncut chip thickness is increased, is predicted. The positive intercepts found in plots of cutting force vs. depth of cut, and in plots of force resolved along the primary shear plane vs. area of shear plane, are shown to be measures of the specific surface work. It is demonstrated that neglect of these intercepts in cutting analyses is the reason why anomalously high values of shear yield stress are derived at those very small uncut chip thicknesses at which the so-called size effect becomes evident. The material toughness/strength ratio, combined with the depth of cut to form a non-dimensional parameter, is shown to control ductile cutting mechanics. The toughness/strength ratio of a given material will change with rate, temperature, and thermomechanical treatment and the influence of such changes, together with changes in depth of cut, on the character of machining is discussed. Strength or hardness alone is insufficient to describe machining. The failure of the Ernst–Merchant theory seems less to do with problems of uniqueness and the validity of minimum work, and more to do with the problem not being properly posed. The new analysis compares favourably and consistently with the wide body of experimental results available in the literature. Why considerable progress in the understanding of metal cutting has been achieved without reference to significant surface work is also discussed.
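Only as a sketch, with generic symbols rather than the paper's exact expressions: including a specific surface work (fracture toughness) R in the force balance adds a term to the classical Merchant cutting-force expression that does not scale with the uncut chip thickness,

F_c = \frac{\tau_y\, w\, t_0 \cos(\beta - \alpha)}{\sin\phi\, \cos(\phi + \beta - \alpha)} + R\, w ,

where \tau_y is the shear yield stress, w the width of cut, t_0 the uncut chip thickness, \alpha the rake angle, \beta the friction angle and \phi the shear-plane angle. The R w term is independent of t_0, which is consistent with the positive intercepts in the force versus depth-of-cut plots discussed above, and it becomes the dominant contribution at very small uncut chip thicknesses, where the apparent size effect appears.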
Abstract:
A Bond Graph is a graphical modelling technique that allows the representation of energy flow between the components of a system. When used to model power electronic systems, it is necessary to incorporate bond graph elements that represent a switch. In this paper, three different methods of modelling switching devices are compared and contrasted: the Modulated Transformer with a binary modulation ratio (MTF), the ideal switch element, and the Switched Power Junction (SPJ) method. These three methods are used to model a dc-dc boost converter, and simulations are run in MATLAB/SIMULINK. To provide a reference against which to compare results, the converter is also simulated using PSPICE. Both quantitative and qualitative comparisons are made to determine the suitability of each of the three Bond Graph switch models for specific power electronics applications.
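To make the shared benchmark concrete, here is a minimal sketch of the ideal-switch state equations of a boost converter, integrated directly in Python rather than in any of the bond graph or SPICE tools used in the paper; the component values and switching pattern are illustrative assumptions.

```python
import numpy as np

# Illustrative boost converter parameters (not those of the paper).
Vin, Lind, Cap, Rload = 12.0, 100e-6, 100e-6, 10.0
f_sw, duty = 50e3, 0.5                          # switching frequency [Hz], duty cycle
dt = 1.0 / (f_sw * 200)                         # 200 time steps per switching period

iL, vC, t = 0.0, 0.0, 0.0
for _ in range(int(0.01 / dt)):                 # simulate 10 ms
    s = 1.0 if (t * f_sw) % 1.0 < duty else 0.0  # 1 = switch on, 0 = switch off
    # Ideal-switch state equations of the boost converter:
    #   L di/dt = Vin - (1 - s) * vC
    #   C dv/dt = (1 - s) * iL - vC / Rload
    diL = (Vin - (1.0 - s) * vC) / Lind
    dvC = ((1.0 - s) * iL - vC / Rload) / Cap
    iL += diL * dt
    vC += dvC * dt
    t += dt

print(f"steady-state estimate: iL ~ {iL:.2f} A, vC ~ {vC:.2f} V (ideal: {Vin/(1-duty):.1f} V)")
```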
Abstract:
There is a strong drive towards hyperresolution earth system models in order to resolve finer scales of motion in the atmosphere. The problem of obtaining a more realistic representation of terrestrial fluxes of heat and water, however, is not just a problem of moving to hyperresolution grid scales. It is much more a question of a lack of knowledge about the parameterisation of processes at whatever grid scale is being used for a wider modelling problem. Hyperresolution grid scales cannot alone solve the problem of this hyperresolution ignorance. This paper discusses these issues in more detail with specific reference to land surface parameterisations and flood inundation models. The importance of making local hyperresolution model predictions available for evaluation by local stakeholders is stressed. It is expected that this will be a major driving force for improving model performance in the future.
Authors: Keith Beven, Hannah Cloke, Florian Pappenberger, Rob Lamb, Neil Hunter