980 results for correlation modelling


Relevance: 30.00%

Abstract:

This study focuses on the modelling of turbulent lifted jet flames using flamelets and a presumed Probability Density Function (PDF) approach, with interest in both flame lift-off height and flame brush structure. First, flamelet models used to capture contributions from the premixed and non-premixed modes of the partially premixed combustion in the lifted jet flame are assessed using Direct Numerical Simulation (DNS) data for a turbulent lifted hydrogen jet flame. The joint PDFs of mixture fraction Z and progress variable c, including their statistical correlation, are obtained using a copula method, which is also validated using the DNS data. Statistically independent PDFs are found to be generally inadequate to represent the joint PDFs from the DNS data. The effects of the Z-c correlation and the contribution from the non-premixed combustion mode on the flame lift-off height are studied systematically by including one effect at a time in the simulations used for a posteriori validation. A simple model including the effects of chemical kinetics and scalar dissipation rate is suggested and used for the non-premixed combustion contributions. The results clearly show that both the Z-c correlation and non-premixed combustion effects are required in the premixed flamelet approach to obtain good agreement with the measured flame lift-off heights as a function of jet velocity. The flame brush structure reported in earlier experimental studies is also captured reasonably well at various axial positions. Flame stabilisation appears to be influenced by both premixed and non-premixed combustion modes, and by their mutual interaction. © 2014 Taylor & Francis.
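The copula construction of a correlated joint (Z, c) PDF can be sketched in a few lines. The snippet below uses a Frank copula (the specific copula family of the study is not restated here, and uniform marginals stand in for the presumed beta marginals) to build joint cell probabilities on a grid; the construction recovers the prescribed marginals exactly:

```python
import math
from typing import Callable

def frank_copula(u: float, v: float, theta: float) -> float:
    """Frank copula CDF C(u, v); theta != 0 sets the strength and sign
    of the Z-c dependence (theta > 0 gives positive correlation)."""
    num = (math.exp(-theta * u) - 1.0) * (math.exp(-theta * v) - 1.0)
    return -math.log(1.0 + num / (math.exp(-theta) - 1.0)) / theta

def joint_cell_probs(Fz: Callable, Fc: Callable, theta: float, n: int = 50):
    """Joint (Z, c) cell probabilities on an n x n grid via the
    inclusion-exclusion (rectangle) rule applied to the copula CDF."""
    g = [i / n for i in range(n + 1)]
    C = lambda i, j: frank_copula(Fz(g[i]), Fc(g[j]), theta)
    return [[C(i + 1, j + 1) - C(i, j + 1) - C(i + 1, j) + C(i, j)
             for j in range(n)] for i in range(n)]

# Uniform marginals for illustration; a presumed-PDF approach would use
# beta marginals fitted to the means and variances of Z and c.
P = joint_cell_probs(lambda x: x, lambda x: x, theta=5.0, n=20)
print(sum(sum(row) for row in P))   # total probability (analytically 1)
```

Because the copula's corner conditions hold, the row and column sums reproduce the marginal increments regardless of theta, which is what makes it possible to add the Z-c correlation while keeping the presumed marginals intact.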

Relevance: 30.00%

Abstract:

This paper outlines a novel information sharing method using Binary Decision Diagrams (BDDs). It is inspired by the work of Al-Shaer and Hamed, who applied BDDs to the modelling of network firewalls. Here, the approach is applied to an information sharing policy system, which optimises the search for redundancy, shadowing, generalisation and correlation within information sharing rules.
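As a sketch of what the rule-relation checks mean, the toy below represents information-sharing rules as allowed-value sets over two hypothetical fields and tests the shadowing relation by brute-force enumeration; a BDD engine, as in Al-Shaer and Hamed's work, would encode the same boolean functions symbolically and scale far beyond this:

```python
from itertools import product

# Toy information-sharing rules over two small hypothetical fields.
# Brute-force enumeration suffices at this scale; a BDD handles the
# symbolic case with many fields and large value domains.
FIELDS = {"role": ["admin", "user", "guest"], "level": [1, 2, 3]}

def matches(rule, record):
    """A rule maps each field to its set of allowed values."""
    return all(record[f] in allowed for f, allowed in rule.items())

def shadowed(earlier, later):
    """later is shadowed if every record it matches, earlier also matches."""
    records = [dict(zip(FIELDS, vals)) for vals in product(*FIELDS.values())]
    return all(matches(earlier, r) for r in records if matches(later, r))

r1 = {"role": ["admin", "user"], "level": [1, 2, 3]}
r2 = {"role": ["user"], "level": [2]}
print(shadowed(r1, r2), shadowed(r2, r1))   # True False
```

Redundancy, generalisation and correlation are analogous set relations between the match sets of two rules (equality, superset, and partial overlap), so the same enumeration or BDD machinery covers all four checks.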

Relevance: 30.00%

Abstract:

M. H. Lee and S. M. Garrett, Qualitative modelling of unknown interface behaviour, International Journal of Human-Computer Studies, Vol. 53, No. 4, pp. 493-515, 2000.

Relevance: 30.00%

Abstract:

Soldering technologies continue to evolve to meet the demands of the continuous miniaturisation of electronic products, particularly in the area of solder paste formulations used in the reflow soldering of surface mount devices. Stencil printing remains a leading process for depositing solder paste onto printed circuit boards (PCBs) in the volume production of electronic assemblies, despite problems in achieving a consistent print quality at ultra-fine pitch. In order to eliminate print defects, a good understanding of the processes involved in printing is important. Computational simulations may complement experimental print trials and paste characterisation studies, and provide an extra dimension to the understanding of the process. The characteristics and flow properties of solder pastes depend primarily on their chemical and physical composition, and good material property data are essential if computational simulation is to give meaningful results. This paper describes paste characterisation and computational simulation studies undertaken through the collaboration of the School of Aeronautical, Mechanical and Manufacturing Engineering at Salford University and the Centre for Numerical Modelling and Process Analysis at the University of Greenwich. The rheological profiles of two different paste formulations (lead and lead-free) for sub-100-micron flip-chip devices are tested and applied to computational simulations of their flow behaviour during the printing process.
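As a flavour of the material data such simulations consume, the snippet below evaluates a Cross shear-thinning viscosity model of the kind commonly fitted to solder paste rheometry; all parameter values are illustrative assumptions, not the measured profiles from the collaboration:

```python
def cross_viscosity(gamma_dot, eta0, eta_inf, k, n):
    """Cross model: apparent viscosity (Pa.s) at shear rate gamma_dot (1/s).
    eta0 / eta_inf are the zero- and infinite-shear viscosities, k a time
    constant (s), n the rate index. Parameter values below are assumed."""
    return eta_inf + (eta0 - eta_inf) / (1.0 + (k * gamma_dot) ** n)

# Illustrative solder-paste-like parameters (not measured data)
for rate in (0.1, 1.0, 10.0, 100.0):
    eta = cross_viscosity(rate, eta0=2.0e3, eta_inf=5.0, k=1.5, n=0.8)
    print(f"shear rate {rate:7.1f} 1/s -> viscosity {eta:10.2f} Pa.s")
```

The shear-thinning behaviour this captures, that is high viscosity at rest but easier flow under the squeegee, is exactly what makes paste printable yet able to hold its deposit shape.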

Relevance: 30.00%

Abstract:

In this paper we evaluate whether the assimilation of remotely-sensed optical data into a marine ecosystem model improves the simulation of biogeochemistry in a shelf sea. A localized Ensemble Kalman filter was used to assimilate weekly diffuse light attenuation coefficient data, Kd(443), from SeaWiFS into an ecosystem model of the western English Channel. The spatial distributions of (unassimilated) surface chlorophyll from satellite, and a multivariate time series of eighteen biogeochemical and optical variables measured in situ at one long-term monitoring site, were used to evaluate the system performance for the year 2006. Assimilation reduced the root mean square error and improved the correlation with the assimilated Kd(443) observations, for both the analysis and, to a lesser extent, the forecast estimates, when compared to the reference model simulation. Improvements in the simulation of (unassimilated) ocean colour chlorophyll were less evident, and in some parts of the Channel the simulation of these data deteriorated. The estimation errors for the (unassimilated) in situ data were reduced for most variables, with some exceptions, e.g. dissolved nitrogen. Importantly, the assimilation adjusted the balance of ecosystem processes by shifting the simulated food web towards the microbial loop, thus improving the estimation of some properties, e.g. total particulate carbon. Assimilation of Kd(443) outperformed a comparative chlorophyll assimilation experiment, both in the estimation of ocean colour data and in the simulation of independent in situ data. These results are attributed to the relatively low error in the Kd(443) data, and to the fact that Kd(443) is a bulk optical property of marine ecosystems. Assimilation of remotely-sensed optical properties is a promising approach to improving the simulation of biogeochemical and optical variables that are relevant for ecosystem functioning and climate change studies.
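The heart of the analysis step can be illustrated with a scalar stochastic Ensemble Kalman filter update, a stand-in for the localized, multivariate filter used in the paper; the ensemble size, prior and observation values below are invented for illustration:

```python
import random
import statistics

def enkf_update(ensemble, y_obs, obs_var, rng):
    """Stochastic EnKF analysis step for a scalar state observed directly
    (H = 1): each member assimilates a perturbed copy of the observation."""
    p_f = statistics.variance(ensemble)        # forecast error variance
    gain = p_f / (p_f + obs_var)               # scalar Kalman gain
    return [x + gain * (y_obs + rng.gauss(0.0, obs_var ** 0.5) - x)
            for x in ensemble]

rng = random.Random(1)
# e.g. a forecast ensemble of Kd(443) at one grid cell (made-up numbers)
prior = [rng.gauss(0.30, 0.05) for _ in range(100)]
posterior = enkf_update(prior, y_obs=0.40, obs_var=0.05 ** 2, rng=rng)
print(statistics.fmean(prior), statistics.fmean(posterior))
```

The analysis mean moves toward the observation and the ensemble spread shrinks; the real system does this jointly for all ecosystem state variables, with localization limiting spurious long-range covariances.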

Relevance: 30.00%

Abstract:

Models and software products have been developed for the modelling, simulation and prediction of different correlations in materials science, including: 1. the correlation between processing parameters and properties in titanium alloys and γ-titanium aluminides; 2. time–temperature–transformation (TTT) diagrams for titanium alloys; 3. corrosion resistance of titanium alloys; 4. surface hardness and microhardness profiles of nitrocarburised layers; 5. fatigue stress–life (S–N) diagrams for Ti–6Al–4V alloys. The programs are based on trained artificial neural networks. For each particular case an appropriate combination of inputs and outputs is chosen, and very good model performance is achieved. Graphical user interfaces (GUIs) are created for easy use of the models; in addition, interactive text versions are developed. The models are combined and integrated into a software package built in a modular fashion. The software products are available in versions for different platforms, including Windows 95/98/2000/NT, UNIX and Apple Macintosh. A description of the software products is given to demonstrate that they are convenient and powerful tools for practical applications in solving various problems in materials science. Examples of the optimisation of alloy compositions, processing parameters and working conditions are illustrated, and an option for using the software in a materials selection procedure is described.
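The kind of trained network behind such programs can be sketched minimally. The one-hidden-layer model below learns a toy processing-parameter-to-property curve with plain SGD backpropagation; the architecture, data and hyperparameters are all illustrative assumptions, not those of the published models:

```python
import math
import random

rng = random.Random(0)
H = 5                                          # hidden tanh units
w1 = [rng.uniform(-1, 1) for _ in range(H)]    # input -> hidden weights
b1 = [0.0] * H
w2 = [rng.uniform(-1, 1) for _ in range(H)]    # hidden -> output weights
b2 = 0.0

def forward(x):
    h = [math.tanh(w1[j] * x + b1[j]) for j in range(H)]
    return sum(w2[j] * h[j] for j in range(H)) + b2, h

# Toy "processing parameter -> property" curve, both normalised to [0, 1]
data = [(i / 10, (i / 10) ** 2) for i in range(11)]

lr = 0.05
for _ in range(2000):                          # plain SGD backpropagation
    for x, t in data:
        y, h = forward(x)
        e = y - t
        for j in range(H):
            gh = e * w2[j] * (1.0 - h[j] ** 2)  # backprop through tanh
            w2[j] -= lr * e * h[j]
            w1[j] -= lr * gh * x
            b1[j] -= lr * gh
        b2 -= lr * e

sse = sum((forward(x)[0] - t) ** 2 for x, t in data)
print("final sum-of-squared-errors:", round(sse, 5))
```

The real models differ mainly in scale: multiple inputs (composition, processing temperature, time) and outputs (hardness, TTT times, fatigue life), with training data drawn from experiments rather than a toy function.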

Relevance: 30.00%

Abstract:

The work in this paper is of particular significance since it considers the problem of modelling cross- and auto-correlation in statistical process monitoring. The presence of both types of correlation can lead to fault insensitivity or false alarms, although in published literature to date, only autocorrelation has been broadly considered. The proposed method, which uses a Kalman innovation model, effectively removes both correlations. The paper (and Part 2 [2]) has emerged from work supported by EPSRC grant GR/S84354/01 and is of direct relevance to problems in several application areas including chemical, electrical, and mechanical process monitoring.
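A minimal illustration of why an innovation form helps: for a known AR(1) process observed without noise, the one-step-ahead Kalman prediction is just a·y[t-1], and the resulting innovation sequence is white even though the raw measurements are strongly autocorrelated. The paper's method handles the general cross-correlated, noisy state-space case; the degenerate scalar case below is only a sketch with made-up parameters:

```python
import random
import statistics

def lag1_corr(s):
    """Lag-1 sample autocorrelation of a sequence."""
    m = statistics.fmean(s)
    num = sum((s[i] - m) * (s[i - 1] - m) for i in range(1, len(s)))
    return num / sum((v - m) ** 2 for v in s)

rng = random.Random(7)
a = 0.9                            # assumed known AR(1) dynamics
y = [0.0]
for _ in range(4000):              # strongly autocorrelated measurements
    y.append(a * y[-1] + rng.gauss(0.0, 1.0))

# One-step-ahead prediction is a * y[t-1]; innovations should be white.
innov = [y[t] - a * y[t - 1] for t in range(1, len(y))]
print(lag1_corr(y), lag1_corr(innov))
```

Monitoring charts applied to the whitened innovations, rather than to y directly, avoid the inflated false-alarm rates (or masked faults) that serial correlation causes.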

Relevance: 30.00%

Abstract:

Tephra horizons are potentially perfect time markers for dating and cross-correlation among diverse Holocene palaeoenvironmental records such as ice cores and marine and terrestrial sequences, but we need to trust their age. Here we present a new age estimate of the Holocene Mjauvotn tephra A using accelerator mass spectrometry C-14 dates from two lakes on the Faroe Islands. With Bayesian age modelling it is dated to 6668-6533 cal. a BP (68.2% confidence interval) - significantly older and better constrained than the previous age. Copyright (C) 2010 John Wiley & Sons, Ltd.

Relevance: 30.00%

Abstract:

The injection stretch blow moulding process is used to manufacture the PET containers used in the soft drinks and carbonated soft drinks industry. The process starts with a test-tube-like specimen known as a preform, which is heated, stretched and blown into a mould to form the container. This research is focused on developing a validated simulation of the process, enabling manufacturers to design their products in a virtual environment without wasting time, material and energy. The simulation has been developed using the commercial FEA package Abaqus and has been validated using a state-of-the-art data acquisition system consisting of measurements of preform temperature (inner and outer wall) using a device known as THERMOscan (Figure 1), stretch rod force and velocity, internal pressure and air temperature inside the preform using an instrumented stretch rod, and the exact timing of when the preform touches the mould wall using contact sensors. In addition, validation studies have been performed by blowing a preform without a mould and using high speed imaging technology in combination with an advanced digital image correlation system (VIC 3D) to provide new quantitative information on the behaviour of PET during blowing. The approach has resulted in a realistic simulation in terms of accurate input parameters, preform shape evolution and prediction of final properties.

Relevance: 30.00%

Abstract:

When studying heterogeneous aquifer systems, especially at regional scale, a degree of generalization is anticipated. This can be due to sparse sampling regimes, complex depositional environments or lack of accessibility for measuring the subsurface, and it can lead to an inaccurate conceptualization which is detrimental when applied to groundwater flow models. It is important that numerical models are based on observed and accurate geological information and do not rely on the distribution of artificial aquifer properties. This can still be problematic, as data will be modelled at a different scale from that at which they were collected. It is proposed here that integrating geophysics and upscaling techniques can assist in a more realistic and deterministic groundwater flow model. In this study, the sedimentary aquifer of the Lagan Valley in Northern Ireland is chosen because of its intruding sub-vertical dolerite dykes, which are of a lower permeability than the sandstone aquifer. The use of airborne magnetics allows the delineation of heterogeneities, confirmed by field analysis. Permeability measured at the field scale is then upscaled to different levels using a correlation with the geophysical data, creating equivalent parameters that can be directly imported into numerical groundwater flow models. These parameters include directional equivalent permeabilities and anisotropy. Several stages of upscaling are modelled with finite elements. Initial modelling is providing promising results, especially at the intermediate scale, suggesting an accurate distribution of aquifer properties. This deterministic methodology is being expanded to include stochastic methods of locating heterogeneity based on airborne geophysical data, through the Direct Sampling method of Multiple-Point Statistics (MPS), which uses the magnetics as a training image to computationally determine a probabilistic occurrence of heterogeneity. There is also a need to apply the method to alternative geological contexts where the heterogeneity is of a higher permeability than the host rock.

Relevance: 30.00%

Abstract:

The injection stretch blow moulding process involves the inflation and stretching of a hot preform into a mould to form bottles. A critical process variable, and an essential input for process simulations, is the rate of pressure increase within the preform during forming, which is regulated by an air flow restrictor valve. The paper describes a set of experiments for measuring the air flow rate within an industrial ISBM machine and the subsequent modelling of it with the FEA package ABAQUS. Two rigid containers were inserted into a Sidel SBO1 blow moulding machine and subjected to different supply pressures and air flow restrictor settings. The pressure and air temperature were recorded for each experiment, enabling the mass flow rate of air to be determined along with an important machine characteristic known as the ‘dead volume’. The experimental setup was simulated within the commercial FEA package ABAQUS/Explicit using a combination of structural, fluid and fluid link elements that idealise the air flowing through an orifice as an ideal gas under isothermal conditions. Results from experiment and simulation are compared and show good correlation.
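The idealisation lends itself to a simple sketch: isothermal filling of a rigid dead volume from a constant supply through a choked orifice, stepped explicitly in time. All parameter values below are assumptions for illustration, not the Sidel SBO1 measurements:

```python
import math

# Assumed values: air properties, orifice, supply and dead volume
R, T, gam = 287.0, 293.0, 1.4        # gas constant (J/kg.K), temp (K), gamma
Cd, A = 0.8, 2.0e-6                  # discharge coefficient, orifice area (m^2)
p_supply, V = 10.0e5, 1.0e-3         # supply pressure (Pa), dead volume (m^3)

def choked_mdot(p0):
    """Choked (sonic) mass flow through the orifice at upstream pressure p0."""
    crit = (2.0 / (gam + 1.0)) ** ((gam + 1.0) / (2.0 * (gam - 1.0)))
    return Cd * A * p0 * math.sqrt(gam / (R * T)) * crit

p, t, dt = 1.0e5, 0.0, 1.0e-4        # start at ambient; explicit time step
while p < 0.528 * p_supply:          # valid while the orifice stays choked
    p += choked_mdot(p_supply) * R * T / V * dt   # isothermal ideal gas
    t += dt
print(f"reached {p / 1e5:.2f} bar after {t:.3f} s")
```

Fitting such a model's effective orifice area and dead volume to the recorded pressure traces is, in spirit, what the experiments enable; the paper does it with ABAQUS fluid link elements rather than this hand-rolled integration.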

Relevance: 30.00%

Abstract:

This paper details the theory and implementation of a composite damage model, addressing damage within a ply (intralaminar) and delamination (interlaminar), for the simulation of crushing of laminated composite structures. It includes a more accurate determination of the characteristic length to achieve mesh objectivity in capturing intralaminar damage consisting of matrix cracking and fibre failure, a load-history dependent material response, an isotropic hardening nonlinear matrix response, as well as a more physically-based interactive matrix-dominated damage mechanism. The developed damage model requires a set of material parameters obtained from a combination of standard and non-standard material characterisation tests. The fidelity of the model mitigates the need to manipulate, or "calibrate", the input data to achieve good agreement with experimental results. The intralaminar damage model was implemented as a VUMAT subroutine, and used in conjunction with an existing interlaminar damage model, in Abaqus/Explicit. This approach was validated through the simulation of the crushing of a cross-ply composite tube with a tulip-shaped trigger, loaded in uniaxial compression. Despite the complexity of the chosen geometry, excellent correlation was achieved with experimental results.
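The characteristic-length idea can be illustrated with the standard crack-band scaling for a linear softening law: the failure strain is set from the element length so that the energy dissipated per unit crack area equals the fracture energy Gf on any mesh. The property values below are assumed, not those of the characterised material:

```python
def dissipated_energy_per_area(Gf, sigma0, lc):
    """Energy dissipated per unit crack area (J/m^2) by one element with
    linear softening. The failure strain epsf is derived from the
    characteristic element length lc so that the stress-strain triangle
    area (0.5 * sigma0 * epsf) times lc is Gf for any mesh size."""
    epsf = 2.0 * Gf / (sigma0 * lc)   # crack-band failure strain
    return 0.5 * sigma0 * epsf * lc

Gf, sigma0 = 90.0e3, 60.0e6           # assumed fracture energy and strength
for lc in (0.5e-3, 1.0e-3, 2.0e-3):   # three element sizes (m)
    print(lc, dissipated_energy_per_area(Gf, sigma0, lc))
```

Without this scaling, refining the mesh would concentrate the softening into ever-smaller volumes and the predicted crush energy would change with element size, which is why an accurate characteristic length matters for mesh objectivity.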

Relevance: 30.00%

Abstract:

A novel digital image correlation (DIC) technique has been developed to track changes in textile yarn orientations during shear characterisation experiments, requiring only low-cost digital imaging equipment. Fabric shear angles and effective yarn strains are calculated and visualised using this new DIC technique for bias extension testing of an aerospace grade, carbon-fibre reinforcement material with a plain weave architecture. The DIC results are validated by direct measurement, and the use of a wide bias extension sample is evaluated against a more commonly used narrow sample. Wide samples exhibit a shear angle range 25% greater than narrow samples and peak loads which are 10 times higher. This is primarily due to excessive yarn slippage in the narrow samples; hence, the wide sample configuration is recommended for characterisation of shear properties which are required for accurate modelling of textile draping.
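The core quantity the technique extracts is simple to state: the fabric shear angle is 90° minus the angle between the tracked warp and weft yarn directions. A minimal sketch with made-up direction vectors (the DIC system supplies these from point tracking):

```python
import math

def shear_angle_deg(warp, weft):
    """Fabric shear angle (degrees): 90 minus the angle between the
    tracked warp and weft yarn direction vectors (2D, any length)."""
    dot = warp[0] * weft[0] + warp[1] * weft[1]
    cos_a = dot / (math.hypot(*warp) * math.hypot(*weft))
    return 90.0 - math.degrees(math.acos(cos_a))

print(shear_angle_deg((1.0, 0.0), (0.0, 1.0)))      # undeformed: 0 degrees
print(shear_angle_deg((1.0, 0.0), (0.5, 0.866)))    # sheared: about 30 degrees
```

Tracking how this angle evolves across the sample during bias extension is what distinguishes genuine trellis shear from the yarn slippage that corrupts the narrow-sample results.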

Relevance: 30.00%

Abstract:

Traditional experimental economics methods often require large numbers of qualified human participants, and the inconsistency of a participant’s decisions across repeated trials prevents sensitivity analyses. The problem can be solved if computer agents are capable of generating behaviour similar to that of the given participants in experiments. An experimental-economics-based analysis method is presented to extract deep information from questionnaire data and emulate any number of participants. Taking customers’ willingness to purchase electric vehicles (EVs) as an example, multi-layer correlation information is extracted from a limited number of questionnaires. Agents mimicking the surveyed potential customers are modelled by matching the probabilistic distributions of the willingness embedded in the questionnaires. The authenticity of both the model and the algorithm is validated by comparing agent-based Monte Carlo simulation results with questionnaire-based deduction results. With the aid of the agent models, the effects of minority agents with specific preferences on the results are also discussed.
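The agent idea can be sketched in miniature: clone agents by resampling respondents' stated willingness, let each decide stochastically, and compare the Monte Carlo estimate with the direct questionnaire-based deduction. The response data below are invented for illustration and the single-layer resampling stands in for the paper's multi-layer correlation matching:

```python
import random
import statistics

# Toy questionnaire: each respondent's stated willingness to buy an EV,
# expressed as a probability in [0, 1]; the counts are invented.
responses = [0.1] * 30 + [0.4] * 40 + [0.8] * 20 + [0.95] * 10

def simulate_agents(n_agents, rng):
    """Each agent clones a random respondent's willingness, then decides."""
    buys = 0
    for _ in range(n_agents):
        willingness = rng.choice(responses)
        if rng.random() < willingness:
            buys += 1
    return buys / n_agents

expected = statistics.fmean(responses)   # questionnaire-based deduction
rng = random.Random(42)
mc = simulate_agents(50_000, rng)
print(expected, mc)                      # the two estimates should agree
```

Because the agents are cheap to replicate, sensitivity analyses (e.g. boosting the share of a minority preference group and rerunning the simulation) become possible where repeated human trials would not be.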