88 results for "Bondgraph modelling approach"
Abstract:
Aiming to establish a rigorous link between macroscopic random motion (described e.g. by Langevin-type theories) and microscopic dynamics, we have undertaken a kinetic-theoretical study of the dynamics of a classical test particle weakly coupled to a large heat bath in thermal equilibrium. Both subsystems are subject to an external force field. From the (time-non-local) generalized master equation, a Fokker-Planck-type equation follows as a "quasi-Markovian" approximation. The kinetic operator thus defined is shown to be ill-defined; specifically, it does not preserve the positivity of the test-particle distribution function f(x, v; t). An alternative approach, previously introduced for open quantum systems, is proposed and leads to a correct kinetic operator with all the expected properties. Explicit expressions for the diffusion and drift coefficients are obtained, allowing macroscopic diffusion and dynamical-friction phenomena to be modelled in terms of the external field and intrinsic physical parameters.
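The kind of kinetic equation at issue can be written in a generic Fokker-Planck form (the symbols here are generic placeholders, not the paper's notation):

```latex
\frac{\partial f}{\partial t}
  + v\,\frac{\partial f}{\partial x}
  + \frac{F(x)}{m}\,\frac{\partial f}{\partial v}
  = -\frac{\partial}{\partial v}\bigl[A(x,v)\,f\bigr]
    + \frac{\partial^{2}}{\partial v^{2}}\bigl[D(x,v)\,f\bigr]
```

where A is the drift (dynamical friction) coefficient and D the diffusion coefficient; the abstract's point is that a naive quasi-Markovian derivation of the right-hand side need not keep f non-negative.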
Abstract:
Approach:
In-situ passive-gradient comparative artificial tracer testing, undertaken using solutes (uranine and iodide), bacteria (E. coli and P. putida) and a bacteriophage (H40/1), permitted comparison of the mobility of different-sized microorganisms relative to solutes in the sand and gravel aquifer underlying Dornach, Germany.
Tracer breakthrough curves reveal that, even though uranine initially arrived at observation wells at the same time as the microbiological tracers, its maximum relative concentrations were sometimes lower than those of the microbiological tracers, while the solute breakthrough curves proved more dispersed.
Monitoring uranine breakthrough with depth suggested that tracers arrived in observation wells in discrete 0.5-1 m thick intervals over the aquifer's 12 m saturated thickness. Nearby exposures of aquifer material suggested that the aquifer consisted of sandy gravels enveloping sequences of open framework (OW) gravel up to 1 m thick. Detailed examination of the OW units revealed that they contained lenses of silty sand up to 1 m long by 30 cm thick, while granulometric data suggested that the OW gravel was two to three orders of magnitude more permeable than the enveloping sandy gravel.
Solute and microorganism tracer responses could not be simulated using conventional advection-dispersion equation solutions employing the same velocity and dispersion terms. By contrast, solute tracer responses modelled using a dual-porosity approach for fractured media (DP-1D) corresponded well to the observed field data. Simulating microorganism responses using the same transport terms, but without the dual-porosity term, generated good model fits and explained the higher relative concentrations of the bacteria compared with the non-reactive solute, even with first-order removal included to account for lower RR. Geologically, the model results indicate that the silty units within the open framework gravels are accessible to solute tracers, but not to microorganisms.
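For orientation, a minimal single-porosity baseline of the kind found inadequate on its own here is the 1-D advection-dispersion solution for continuous injection; the velocity and dispersion values below are illustrative, not the Dornach field parameters:

```python
from math import erfc, sqrt

def ade_breakthrough(x, t, v, D):
    """Leading term of the Ogata-Banks solution for continuous injection
    in a 1-D advective-dispersive system: relative concentration C/C0
    at distance x (m) and time t (s), for velocity v (m/s) and
    dispersion coefficient D (m2/s)."""
    if t <= 0:
        return 0.0
    return 0.5 * erfc((x - v * t) / (2.0 * sqrt(D * t)))

# Illustrative values only: lower dispersion gives a sharper,
# later-rising breakthrough front at the same velocity.
solute = ade_breakthrough(x=10.0, t=4000.0, v=2e-3, D=1e-2)
microbe = ade_breakthrough(x=10.0, t=4000.0, v=2e-3, D=1e-3)
```

A dual-porosity model such as DP-1D adds an exchange term with immobile (here, silty-lens) porosity on top of this advective-dispersive core.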
Importance:
The results highlight the benefits of geological observations in developing appropriate conceptual models of solute and microorganism transport, and in developing suitable numerical approaches to quantifying microorganism mobility at scales appropriate for the development of groundwater supply (wellhead) protection zones.
Abstract:
This study undertakes a modeling-based performance assessment of all Irish credit unions between 2002 and 2010, a particularly turbulent period in their history. The analysis explicitly addresses the current challenges faced by credit unions in that the modeling approach used rewards credit unions for reducing undesirable outputs (impaired loans and investments) as well as for increasing desirable outputs (loans, earning assets and members' funds) and decreasing inputs (labour expenditure, capital expenditure and fund expenses). The main findings are: credit unions are subject to increasing returns to scale; technical regression occurred in the years after 2007; there is significant scope for an improvement in efficiency through expansion of desirable outputs and contraction of undesirable outputs and inputs; and larger credit unions that are better capitalised and pay a higher dividend to members are more efficient than their smaller, less capitalised, lower-dividend-paying counterparts.
Abstract:
Simulations of the injection stretch-blow moulding process have been developed for the manufacture of poly(ethylene terephthalate) bottles using the commercial finite element package ABAQUS/standard. Initially a simulation of the manufacture of a 330 mL bottle was developed with three different material models (hyperelastic, creep, and a non-linear viscoelastic model (Buckley model)) to ascertain their suitability for modelling poly(ethylene terephthalate). The Buckley model was found to give results for the sidewall thickness that matched best with those measured from bottles off the production line. Following the investigation of the material models, the Buckley model was chosen to conduct a three-dimensional simulation of the manufacture of a 2 L bottle. It was found that the model was also capable of predicting the wall thickness distribution accurately for this bottle. In the development of the three-dimensional simulation a novel approach, which uses an axisymmetric model until the material reaches the petaloid base, was developed. This resulted in substantial savings in computing time. © 2000 IoM Communication Ltd.
Abstract:
The management of water resources in Ireland prior to the Water Framework Directive (WFD) focussed on surface water and groundwater as separate entities. A critical element of the successful implementation of the WFD is to improve our understanding of the interaction between the two, and of the flow mechanisms by which groundwater discharges to surface waters. An improved understanding of the contribution of groundwater to surface water is required for the classification of groundwater body status and the determination of groundwater quality thresholds. The results of the study will also have wider application to many areas of the WFD.
A subcommittee of the WFD Groundwater Working Group (GWWG) has been formed to develop a methodology for estimating the groundwater contribution to Irish rivers. The group has selected a number of analytical techniques to quantify the components of stream flow in an Irish context (Master Recession Curve, Unit Hydrograph and Flood Studies Report methodologies, and hydrogeological analytical modelling). The components of stream flow that can be identified include deep groundwater, intermediate and overland flow. These analyses have been tested on seven pilot catchments covering a variety of hydrogeological settings, and have been used to inform and constrain a mathematical model. The mathematical model used was the NAM (Nedbør-Afstrømnings-Model) rainfall-runoff model, a module of DHI's MIKE 11 modelling suite. The results from these pilot catchments have been used to develop a decision model, based on catchment descriptors from GIS datasets, for the selection of NAM parameters. The datasets used include the aquifer, vulnerability and subsoil maps, soils, the Digital Terrain Model, CORINE and lakes. The national coverage of these GIS datasets has allowed the extrapolation of the mathematical model to regional catchments across Ireland.
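The recession-analysis idea behind flow separation can be sketched with a simple exponential-store model; the recession constants and initial discharges below are hypothetical (the study itself uses Master Recession Curve techniques and the NAM model):

```python
import math

def recession(q0, k, days):
    """Master-recession-curve-style segment: discharge (m3/s) declining
    exponentially from q0 with recession constant k (days)."""
    return [q0 * math.exp(-t / k) for t in range(days)]

# Hypothetical stores: a slow 'deep groundwater' component and a
# quicker 'intermediate' component; total stream flow is their sum.
deep = recession(q0=2.0, k=60.0, days=30)
intermediate = recession(q0=3.0, k=5.0, days=30)
total = [d + i for d, i in zip(deep, intermediate)]
```

Fitting straight-line segments to log-discharge during dry-weather recessions is what lets the slow and quick components be separated in practice.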
Abstract:
Two models that can predict the voltage-dependent scattering from liquid crystal (LC)-based reflectarray cells are presented. The validity of both numerical techniques is demonstrated using measured results in the frequency range 94-110 GHz. The more rigorous approach models, for each voltage, the inhomogeneous and anisotropic permittivity of the LC as a stratified medium in the direction of the biasing field. This accounts for the different tilt angles of the LC molecules inside the cell, calculated from the solution of the elastic problem. The other model is based on an effective homogeneous permittivity tensor that corresponds to the average tilt angle along the longitudinal direction for each biasing voltage. In this model, convergence problems associated with the longitudinal inhomogeneity are avoided and the computational efficiency is improved. Both models provide a correspondence between the reflection coefficient (losses and phase shift) of the LC-based reflectarray cell and the biasing voltage, which can be used to design beam-scanning reflectarrays. The accuracy and efficiency of both models are also analyzed and discussed.
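The second (effective-permittivity) model can be caricatured with a common uniaxial mixing rule; the permittivity values and tilt profile below are hypothetical, not the measured LC parameters:

```python
import math

def effective_eps(eps_par, eps_perp, tilt_profile):
    """Averaged effective permittivity for an RF field polarised along
    the LC director at zero tilt, using the common uniaxial rule
    eps(theta) = eps_perp + (eps_par - eps_perp) * cos(theta)**2
    and averaging over the tilt profile through the cell thickness."""
    vals = [eps_perp + (eps_par - eps_perp) * math.cos(t) ** 2
            for t in tilt_profile]
    return sum(vals) / len(vals)

# Hypothetical LC anisotropy; tilt rises toward mid-cell under bias.
profile = [math.radians(a) for a in (0, 15, 35, 55, 35, 15, 0)]
eps = effective_eps(eps_par=3.3, eps_perp=2.5, tilt_profile=profile)
```

Collapsing the tilt profile to a single averaged permittivity is exactly what removes the longitudinal stratification, and with it the convergence problems, at some cost in rigour.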
Abstract:
The focus of this paper is to outline a method for consolidating and implementing the work on performance-based specification and testing. The first part of the paper reviews the mathematical significance of the variables used in common service life models. The aim is to identify a set of significant variables that influence the ingress of chloride ions into concrete. These variables are termed Key Performance Indicators (KPIs). This will also help to reduce the complexity of some of the service life models and make them more appealing to practising engineers. The second part of the paper presents a plan for developing a database based on these KPIs, so that relationships can then be drawn between common concrete mix parameters and KPIs. This will assist designers in specifying a concrete with adequate performance for a particular environment. Collectively, this is referred to as the KPI-based approach, and the concluding remarks outline how the authors envisage the KPI theory relating to performance assessment and monitoring.
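A common baseline service-life model of the type such a KPI review targets is the error-function solution of Fick's second law; the depth, exposure time, diffusion coefficient and surface concentration below are illustrative KPI-style inputs, not values from the paper:

```python
from math import erfc, sqrt

def chloride_profile(x, t_years, D, c_s):
    """Error-function solution of Fick's second law: chloride content
    (fraction by binder mass) at depth x (mm) after t_years of
    exposure, for an apparent diffusion coefficient D (mm2/year)
    and surface chloride concentration c_s."""
    return c_s * erfc(x / (2.0 * sqrt(D * t_years)))

# Illustrative check at 50 mm cover after 25 years of exposure.
c_cover = chloride_profile(x=50.0, t_years=25.0, D=20.0, c_s=0.6)
```

Here D and c_s are the kind of variables a KPI screening would rank: a database relating them to mix parameters is precisely what would let designers specify performance directly.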
Abstract:
This paper reports an approach by which laboratory-based testing and numerical modelling can be combined to predict the long-term performance of a range of concretes exposed to marine environments. Firstly, a critical review of the test methods for assessing the chloride penetration resistance of concrete is given, including the repeatability of the different test results. In addition to the test methods, a numerical simulation model is used to explore the test data further and obtain long-term chloride ingress trends. The combined use of testing and modelling is validated with the help of long-term chloride ingress data from a North Sea exposure site. In summary, the paper outlines a methodology for determining the long-term performance of concrete in marine environments.
Abstract:
In the production of polyethylene terephthalate (PET) bottles, the initial temperature of the preforms plays a central role in the final thickness, strength and other structural properties of the bottles. The difference between the inside and outside temperature profiles can also have a significant impact on final product quality. The preforms are preheated by an infrared heating oven system, which is often operated open loop and relies heavily on trial and error to adjust the lamp power settings. In this paper, a radial basis function (RBF) neural network model, optimized by a two-stage selection (TSS) algorithm combined with particle swarm optimization (PSO), is developed to model the nonlinear relations between the lamp power settings and the output temperature profile of the preforms. An improved PSO method for lamp setting adjustment using this model is then presented. Simulation results based on experimental data confirm the effectiveness of the modelling and optimization method.
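A minimal sketch of an RBF network of the kind described, trained here by plain stochastic gradient descent on synthetic data (the paper instead selects the structure with a TSS algorithm and optimises with PSO; all names and values below are illustrative):

```python
import math

def rbf_features(x, centres, width):
    """Gaussian radial basis functions evaluated at scalar input x."""
    return [math.exp(-((x - c) ** 2) / (2.0 * width ** 2)) for c in centres]

def predict(x, centres, width, weights):
    """RBF network output: weighted sum of the basis responses."""
    return sum(w * p for w, p in zip(weights, rbf_features(x, centres, width)))

# Toy stand-in for the lamp-power -> temperature map.  The data are
# generated from known weights so the fit can be checked exactly.
centres, width = [0.0, 0.5, 1.0], 0.35
true_weights = [60.0, 110.0, 80.0]
data = [(x / 10.0, predict(x / 10.0, centres, width, true_weights))
        for x in range(11)]

weights = [0.0, 0.0, 0.0]
for _ in range(2000):                     # plain SGD, not TSS+PSO
    for x, y in data:
        phi = rbf_features(x, centres, width)
        err = predict(x, centres, width, weights) - y
        weights = [w - 0.05 * err * p for w, p in zip(weights, phi)]
```

The two-stage selection in the paper addresses what this sketch hard-codes: choosing which centres to keep before the weights are optimised.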
Abstract:
The momentum term has long been used in machine learning algorithms, especially back-propagation, to improve their speed of convergence. In this paper, we derive an expression to prove the O(1/k²) convergence rate of the online gradient method with momentum-type updates, when the individual gradients are constrained by a growth condition. We then apply these updates to video background modelling by using them in the update equations of the region-based Mixture of Gaussians algorithm. Extensive evaluations are performed on both simulated data and challenging real-world scenarios with dynamic backgrounds, showing that these regularised updates help the mixtures converge faster than the conventional approach and consequently improve the algorithm's performance.
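For reference, the classical (heavy-ball) momentum update, a simplified relative of the momentum-type updates analysed in the paper; the quadratic objective and constants are illustrative:

```python
def momentum_step(w, v, grad, lr=0.1, beta=0.9):
    """Heavy-ball momentum update: the velocity accumulates a decaying
    sum of past gradients, and the parameters move along the velocity."""
    v_new = [beta * vi - lr * g for vi, g in zip(v, grad)]
    w_new = [wi + vi for wi, vi in zip(w, v_new)]
    return w_new, v_new

# Minimise f(w) = 0.5 * (w0**2 + 10 * w1**2); gradient is (w0, 10*w1).
w, v = [5.0, 5.0], [0.0, 0.0]
for _ in range(200):
    w, v = momentum_step(w, v, [w[0], 10.0 * w[1]])
```

On ill-conditioned quadratics like this one, the velocity term damps the oscillation along the steep axis that plain gradient descent would exhibit.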
Abstract:
Classification methods with embedded feature selection capability are very appealing for the analysis of complex processes, since they allow root causes to be analysed even when the number of input variables is high. In this work, we investigate the performance of three classification techniques within a Monte Carlo strategy with the aim of root cause analysis. We consider the naive Bayes classifier and the logistic regression model with two different implementations for controlling model complexity, namely a LASSO-like implementation with L1-norm regularization and a fully Bayesian implementation of the logistic model, the so-called relevance vector machine. Several challenges can arise when estimating such models, mainly linked to the characteristics of the data: a large number of input variables, high correlation among subsets of variables, more variables than available data points, and unbalanced datasets. Using an ecological and a semiconductor manufacturing dataset, we show the advantages and drawbacks of each method, highlighting the superior classification accuracy of the relevance vector machine with respect to the other classifiers. Moreover, we show how the combination of the proposed techniques and the Monte Carlo approach can be used to obtain more robust insights into the problem under analysis when faced with challenging modelling conditions.
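The embedded-feature-selection behaviour of the L1-regularised (LASSO-like) logistic model can be sketched with proximal gradient descent, whose soft-thresholding step drives irrelevant coefficients exactly to zero; the toy data and hyperparameters are hypothetical:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def l1_logistic(X, y, lam=0.1, lr=0.1, iters=3000):
    """L1-regularised logistic regression fitted by proximal gradient
    descent: a gradient step on the mean log-loss followed by
    soft-thresholding with threshold lr * lam."""
    n, d = len(X), len(X[0])
    w = [0.0] * d
    for _ in range(iters):
        grad = [0.0] * d
        for xi, yi in zip(X, y):
            err = sigmoid(sum(wj * xj for wj, xj in zip(w, xi))) - yi
            for j in range(d):
                grad[j] += err * xi[j] / n
        w = [wj - lr * g for wj, g in zip(w, grad)]
        w = [math.copysign(max(abs(wj) - lr * lam, 0.0), wj) for wj in w]
    return w

# Toy data: only the first of three variables drives the class label;
# the other two are constant and should be pruned to zero.
X = [[x, 0.3, -0.2] for x in (-2.0, -1.5, -1.0, -0.5, 0.5, 1.0, 1.5, 2.0)]
y = [0, 0, 0, 0, 1, 1, 1, 1]
w = l1_logistic(X, y)
```

The surviving non-zero coefficients are what a root-cause analysis would read off as candidate drivers.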
Abstract:
Reduced Order Models (ROMs) have proven to be a valid and efficient approach to modelling the thermal behaviour of building zones. The main issues associated with the use of zonal/lumped models are how to (1) divide the domain into lumps and (2) evaluate the parameters which characterise the lump-to-lump exchange of energy and momentum. The objective of this research is to develop a methodology for the generation of ROMs from CFD models. The lumps of the ROM and their average property values are automatically extracted from the CFD models through user-defined constraints. This methodology has been applied to validated CFD models of a zone of the Environmental Research Institute (ERI) Building at University College Cork (UCC). The ROM predicts the temperature distribution in the domain with an average error lower than 2%. It is computationally efficient, with an execution time of 3.45 seconds. Future steps in this research will be the development of a procedure to automatically extract the parameters which define lump-to-lump energy and momentum exchange; at the moment these parameters are evaluated through the minimisation of a cost function. The ROMs will also be utilised to predict the transient thermal behaviour of the building zone.
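A two-lump RC-network sketch illustrates the kind of reduced-order thermal model meant; the resistances, capacitances and temperatures below are hypothetical, not parameters extracted from the ERI building CFD model:

```python
def simulate_lumps(T0, R, C, T_out, dt, steps):
    """Two-lump thermal network solved by explicit Euler: lump 0
    exchanges heat with lump 1 through resistance R[0] (K/W), and
    lump 1 with the outside through R[1]; C holds the lump
    capacitances (J/K)."""
    T = list(T0)
    for _ in range(steps):
        q01 = (T[0] - T[1]) / R[0]      # heat flow lump 0 -> lump 1 (W)
        q1o = (T[1] - T_out) / R[1]     # heat flow lump 1 -> outside (W)
        T[0] -= q01 * dt / C[0]
        T[1] += (q01 - q1o) * dt / C[1]
    return T

# Hypothetical parameters: 24 h of free cooling at 60 s steps.
T = simulate_lumps(T0=[24.0, 20.0], R=[0.05, 0.02], C=[5e5, 5e5],
                   T_out=10.0, dt=60.0, steps=1440)
```

The R and C values are exactly the lump-to-lump exchange parameters that the proposed methodology would extract automatically from the CFD results.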
Abstract:
One of the most widely used techniques in computer vision for foreground detection is to model each background pixel as a Mixture of Gaussians (MoG). While this is effective for a static camera with a fixed or slowly varying background, it fails to handle fast, dynamic movement in the background. In this paper, we propose a generalised framework, called region-based MoG (RMoG), that takes neighbouring pixels into consideration while generating the model of the observed scene. The model equations are derived from Expectation Maximisation theory for batch mode, and stochastic approximation is used for online mode updates. We evaluate our region-based approach against ten sequences containing dynamic backgrounds and show that it provides a performance improvement over the traditional single-pixel MoG. For equal feature and region sizes, the effect of increasing the learning rate is to reduce both true and false positives. Comparison with four state-of-the-art approaches shows that RMoG outperforms the others in reducing false positives whilst still maintaining reasonable foreground definition. Lastly, using the ChangeDetection (CDNet 2014) benchmark, we evaluated RMoG against numerous surveillance scenes and found it to be amongst the leading performers for dynamic background scenes, whilst providing comparable performance for other commonly occurring surveillance scenes.
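The pixel-wise MoG baseline that RMoG generalises can be sketched as an online update for a single pixel; the mode parameters, learning rate and matching threshold here are illustrative:

```python
import math

def update_mog(pixel, modes, alpha=0.05, match_sigmas=2.5):
    """Single-pixel online Mixture-of-Gaussians update: match the
    sample to the first mode within match_sigmas standard deviations,
    update the mode weights, and adapt the matched mode's mean and
    variance; if nothing matches, replace the weakest mode."""
    matched = None
    for m in modes:
        if abs(pixel - m['mu']) < match_sigmas * math.sqrt(m['var']):
            matched = m
            break
    for m in modes:
        m['w'] = (1 - alpha) * m['w'] + alpha * (1.0 if m is matched else 0.0)
    if matched is not None:
        d = pixel - matched['mu']
        matched['mu'] += alpha * d
        matched['var'] = (1 - alpha) * matched['var'] + alpha * d * d
    else:
        weakest = min(modes, key=lambda m: m['w'])
        weakest.update(mu=pixel, var=100.0, w=alpha)
    total = sum(m['w'] for m in modes)
    for m in modes:
        m['w'] /= total

# A stable background value repeatedly reinforces the nearby mode.
modes = [{'mu': 100.0, 'var': 25.0, 'w': 0.7},
         {'mu': 200.0, 'var': 25.0, 'w': 0.3}]
for _ in range(50):
    update_mog(102.0, modes)
```

RMoG replaces the single-pixel observation here with features pooled over a neighbourhood, which is what lets it absorb dynamic background motion.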
An integrated approach for real-time model-based state-of-charge estimation of lithium-ion batteries
Abstract:
Lithium-ion batteries have been widely adopted in electric vehicles (EVs), and accurate state of charge (SOC) estimation is of paramount importance for the EV battery management system. Although a number of methods have been proposed, SOC estimation for lithium-ion batteries such as the LiFePO4 battery faces two key challenges: the flat open circuit voltage (OCV) vs SOC relationship over some SOC ranges, and the hysteresis effect. To address these problems, an integrated approach for real-time model-based SOC estimation of lithium-ion batteries is proposed in this paper. Firstly, an auto-regression model is adopted to reproduce the battery terminal behaviour, combined with a non-linear complementary model to capture the hysteresis effect. The model parameters, both linear and non-linear, are optimized off-line using a hybrid optimization method that combines a meta-heuristic (the teaching-learning-based optimization method) with the least squares method. Secondly, using the trained model, two real-time model-based SOC estimation methods are presented: one based on a real-time battery OCV regression model obtained through the weighted recursive least squares method, and the other based on state estimation using the extended Kalman filter (EKF). To tackle the problem caused by the flat OCV-vs-SOC segments when the OCV-based SOC estimation method is adopted, a method combining coulomb counting and the OCV-based method is proposed. Finally, modelling and SOC estimation results are presented and analysed using data collected from a LiFePO4 battery cell. The results confirm the effectiveness of the proposed approach, in particular the joint EKF method.
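The coulomb-counting component that backs up the OCV-based estimate over the flat OCV-vs-SOC segments can be sketched as plain charge integration; the 2.3 Ah cell and constant-current profile below are hypothetical:

```python
def coulomb_count(soc0, currents, dt, capacity_ah):
    """Coulomb counting: integrate the measured current (A, discharge
    positive) over time steps of dt seconds and subtract the
    transferred charge, as a fraction of capacity, from the SOC."""
    soc = soc0
    trace = []
    for i in currents:
        soc -= i * dt / (capacity_ah * 3600.0)
        trace.append(soc)
    return trace

# Hypothetical cell: a constant 2.3 A discharge for one hour, sampled
# every second, drains a 2.3 Ah cell from 100% to 0% SOC.
trace = coulomb_count(soc0=1.0, currents=[2.3] * 3600, dt=1.0,
                      capacity_ah=2.3)
```

Pure integration drifts with current-sensor bias and an uncertain initial SOC, which is why the paper re-anchors it with the OCV-based estimate wherever the OCV-SOC curve is steep enough to be informative.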