28 results for Process-based model


Relevance: 90.00%

Abstract:

In this paper a new structural model is presented to describe the evolution of char porosity during gasification. The model assumes the char structure to be composed of bundles of parallel graphite layers, with each layer assigned a different reactivity towards the gasification agent to represent the varying degree of heterogeneity between layers (i.e. each layer reacts with the gasification agent at a different rate). It is this difference in reactivity that allows micropores to be created during the course of gasification. This simple structural model enables the evolution of pore volume, geometrical pore surface area and pore size distribution to be described as functions of the extent of char burn-off. The model is tested against experimental data for the gasification of longan seed-derived char with carbon dioxide, and the agreement between model and data is found to be reasonably satisfactory, especially for the evolution of surface area and pore volume with burn-off.
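The layer-wise reactivity idea lends itself to a small numerical sketch. The code below is an illustrative toy, not the paper's model: each layer gasifies at its own first-order rate, and the difference in conversion between neighbouring layers is used as an assumed proxy for the micropore surface opened between them.

```python
import math

def layer_model(rates, times):
    """Toy sketch of the parallel-layer char model: each graphite layer
    gasifies at its own first-order rate, and the conversion difference
    between neighbouring layers serves as an assumed proxy for the
    micropore surface opened between them."""
    results = []
    for t in times:
        # per-layer conversion for a first-order reaction with the agent
        x = [1.0 - math.exp(-k * t) for k in rates]
        burnoff = sum(x) / len(x)          # overall char conversion
        pore_volume = burnoff              # normalised volume freed by reaction
        # proxy surface area: interfaces where adjacent layers differ
        area = sum(abs(a - b) for a, b in zip(x, x[1:]))
        results.append((t, burnoff, pore_volume, area))
    return results

# layers with heterogeneous reactivity, e.g. geometrically spaced rates
profile = layer_model([0.05 * 1.5 ** i for i in range(8)],
                      times=[0.0, 5.0, 10.0, 20.0, 40.0, 80.0])
```

With geometrically spaced rates, the proxy surface area rises and then falls with burn-off, qualitatively echoing the surface-area evolution the model is designed to capture.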

Relevance: 90.00%

Abstract:

Let (Φ_t)_{t∈ℝ₊} be a Harris ergodic continuous-time Markov process on a general state space, with invariant probability measure π. We investigate the rates of convergence of the transition function P^t(x, ·) to π; specifically, we find conditions under which r(t)‖P^t(x, ·) − π‖ → 0 as t → ∞, for suitable subgeometric rate functions r(t), where ‖·‖ denotes the usual total variation norm for a signed measure. We derive sufficient conditions for the convergence to hold, in terms of the existence of suitable points on which the first hitting time moments are bounded. In particular, for stochastically ordered Markov processes, explicit bounds on subgeometric rates of convergence are obtained. These results are illustrated in several examples.
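In standard notation, the convergence statement and the flavour of the sufficient condition read as follows (a schematic rendering only; the paper's precise conditions relate the rate r to moments of first hitting times of suitable points or sets):

```latex
% Subgeometric ergodicity in total variation, for a rate function r(t)
% (e.g. polynomial r(t) = t^{\alpha}):
\[
  r(t)\,\bigl\lVert P^{t}(x,\cdot) - \pi \bigr\rVert_{TV}
  \;\longrightarrow\; 0
  \qquad \text{as } t \to \infty .
\]
% A sufficient condition takes the form of a bounded modulated moment of
% the first hitting time \tau_C of a suitable set C:
\[
  \sup_{x \in C} \mathbb{E}_x\!\bigl[ r_0(\tau_C) \bigr] < \infty ,
\]
% where r_0 is a rate function related to r.
```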

Relevance: 90.00%

Abstract:

The aim of the study presented was to implement a process model to simulate the dynamic behaviour of a pilot-scale process for anaerobic two-stage digestion of sewage sludge. The model was implemented to support experimental investigations of the anaerobic two-stage digestion process. The model concept, implemented in the simulation software package MATLAB(TM)/Simulink(R), is a derivative of the IWA Anaerobic Digestion Model No. 1 (ADM1) developed by the IWA task group for mathematical modelling of anaerobic processes. In the present study the original model concept has been adapted and applied to replicate a two-stage digestion process. Testing procedures, including balance checks and 'benchmarking' tests, were carried out to verify the accuracy of the implementation. These combined measures ensured a faultless model implementation without numerical inconsistencies. Parameters for both the thermophilic and the mesophilic process stages were estimated successfully using data from lab-scale experiments described in the literature. Due to the high number of parameters in the structured model, it was necessary to develop a customised procedure that limited the range of parameters to be estimated. The accuracy of the optimised parameter sets was assessed against experimental data from pilot-scale experiments. Under these conditions, the model predicted the dynamic behaviour of a two-stage digestion process at pilot scale reasonably well. (C) 2004 Elsevier Ltd. All rights reserved.
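The staged structure of such a plant can be illustrated with a far simpler surrogate than ADM1: two chemostats in series with single-substrate Monod kinetics, integrated by explicit Euler. All parameter values below are made-up placeholders, chosen only to show the thermophilic-stage effluent feeding the mesophilic stage.

```python
def monod(mu_max, Ks, S):
    """Monod specific growth rate."""
    return mu_max * S / (Ks + S)

def two_stage_digester(S_in, D1, D2, params1, params2, t_end=200.0, dt=0.01):
    """Minimal two-CSTRs-in-series sketch (single substrate, Monod growth),
    far simpler than ADM1 but with the same staged topology: the effluent
    of stage 1 (thermophilic) is the feed of stage 2 (mesophilic)."""
    S1, X1 = S_in, 0.1   # stage 1 substrate and biomass
    S2, X2 = S_in, 0.1   # stage 2 substrate and biomass
    t = 0.0
    while t < t_end:
        mu1 = monod(params1["mu_max"], params1["Ks"], S1)
        dS1 = D1 * (S_in - S1) - mu1 * X1 / params1["Y"]
        dX1 = (mu1 - D1) * X1
        mu2 = monod(params2["mu_max"], params2["Ks"], S2)
        dS2 = D2 * (S1 - S2) - mu2 * X2 / params2["Y"]   # fed by stage 1
        dX2 = (mu2 - D2) * X2
        S1 += dS1 * dt; X1 += dX1 * dt
        S2 += dS2 * dt; X2 += dX2 * dt
        t += dt
    return S1, S2

thermo = {"mu_max": 0.8, "Ks": 0.5, "Y": 0.1}   # assumed illustrative values
meso = {"mu_max": 0.4, "Ks": 0.3, "Y": 0.1}
S1, S2 = two_stage_digester(S_in=10.0, D1=0.2, D2=0.1,
                            params1=thermo, params2=meso)
```

For a serious study, a stiff ODE solver and the full ADM1 state vector would replace the Euler loop; the sketch only shows the cascade structure that the two-stage implementation replicates.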

Relevance: 90.00%

Abstract:

Simulations of a complete reflected shock tunnel facility have been performed with the aim of providing a better understanding of the flow through these facilities. In particular, the analysis is focused on the premature contamination of the test flow with the driver gas. The axisymmetric simulations model the full geometry of the shock tunnel, incorporate an iris-based model of the primary diaphragm rupture mechanics and an ideal secondary diaphragm, and account for turbulence in the shock tube boundary layer with the Baldwin-Lomax eddy viscosity model. Two operating conditions were examined: one resulting in an over-tailored mode of operation and the other resulting in approximately tailored operation. The accuracy of the simulations is assessed through comparison with experimental measurements of static pressure, pitot pressure and stagnation temperature. It is shown that the widely accepted driver gas contamination mechanism, in which driver gas 'jets' along the walls through the action of the bifurcated foot of the reflected shock, does not directly transport the driver gas to the nozzle at these conditions. Instead, driver-gas-laden vortices are generated by the bifurcated reflected shock. These vortices prevent jetting of the driver gas along the walls and convect driver gas away from the shock tube wall and downstream into the nozzle. Additional vorticity generated by the interaction of the reflected shock and the contact surface enhances the process in the over-tailored case. However, the basic mechanism appears to operate in a similar way for both the over-tailored and the approximately tailored conditions.

Relevance: 90.00%

Abstract:

The particle-based lattice solid model developed to study the physics of rocks and the nonlinear dynamics of earthquakes is refined by incorporating intrinsic friction between particles. The model provides a means for studying the causes of seismic wave attenuation, as well as frictional heat generation, fault zone evolution, and localisation phenomena. A modified velocity-Verlet scheme that allows friction to be precisely modelled is developed. This is a difficult computational problem, given that a discontinuity (the transition from static to dynamic frictional behaviour) must be accurately simulated by the numerical approach. This is achieved using a half-time-step integration scheme. At each half time step, a nonlinear system is solved to compute the static frictional forces and states of touching particle pairs. Improved efficiency is achieved by adaptively adjusting the time step increment, depending on the particle velocities in the system. The total energy is calculated and verified to remain constant to a high precision during simulations. Numerical experiments show that the model can be applied to the study of earthquake dynamics, the stick-slip instability, heat generation, and fault zone evolution. Such experiments may lead to a conclusive resolution of the heat flow paradox and improved understanding of earthquake precursory phenomena and dynamics. (C) 1999 Academic Press.
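A single spring-driven block with stick-slip friction is enough to show the role of the half-step velocity update in switching between static and dynamic friction. This is a toy in the spirit of the scheme, not the paper's algorithm; all parameter values are illustrative.

```python
def step(x, v, t, dt, k=10.0, v_drive=0.1, mass=1.0, mu_s=0.6, mu_d=0.4, N=1.0):
    """One velocity-Verlet step for a spring-driven block with stick-slip
    friction (a toy version of a half-step friction scheme; the friction
    state is decided before the half-step velocity update)."""
    F_spring = k * (v_drive * t - x)
    if v == 0.0 and abs(F_spring) <= mu_s * N:
        return x, 0.0                      # static friction holds the block
    sign = 1.0 if (v > 0 or (v == 0 and F_spring > 0)) else -1.0
    a = (F_spring - mu_d * N * sign) / mass
    v_half = v + 0.5 * dt * a              # first half-step velocity update
    x_new = x + dt * v_half
    a_new = (k * (v_drive * (t + dt) - x_new) - mu_d * N * sign) / mass
    v_new = v_half + 0.5 * dt * a_new      # second half-step velocity update
    if v_new * sign < 0.0:
        v_new = 0.0                        # sliding direction reversed: re-stick
    return x_new, v_new

# drive the block slowly and watch it creep forward in stick-slip cycles
x, v, t, dt = 0.0, 0.0, 0.0, 1e-3
for _ in range(20000):
    x, v = step(x, v, t, dt)
    t += dt
```

The full model solves a nonlinear system over all touching particle pairs at each half step; here the "system" degenerates to the single static/dynamic decision for one contact.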

Relevance: 90.00%

Abstract:

Domain-specific information retrieval has come into demand. Not only domain experts, but also average non-expert users are interested in searching for domain-specific (e.g., medical and health) information in online resources. However, a typical problem for average users is that the search results are always a mixture of documents with different levels of readability. Non-expert users may want to see documents with higher readability at the top of the list, so the search results need to be re-ranked in descending order of readability. It is often not practical for domain experts to manually label the readability of documents in large databases, so computational models of readability need to be investigated. However, traditional readability formulas are designed for general-purpose text and are insufficient for the technical materials involved in domain-specific information retrieval, while more advanced algorithms such as the textual coherence model are computationally too expensive for re-ranking a large number of retrieved documents. In this paper, we propose an effective and computationally tractable concept-based model of text readability. In addition to the textual genres of a document, our model also takes into account domain-specific knowledge, i.e., how the domain-specific concepts contained in the document affect the document's readability. Three major readability formulas are proposed and applied to health and medical information retrieval. Experimental results show that our proposed readability formulas lead to remarkable improvements, in terms of correlation with users' readability ratings, over four traditional readability measures.
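A minimal sketch of a concept-based readability score might combine the surface features used by traditional formulas with the density of domain concepts. The weights, the combination rule, and the concept list below are invented for illustration and are not the paper's three formulas.

```python
import re

def concept_readability(text, domain_concepts,
                        w_sentence=0.4, w_word=0.3, w_concept=0.3):
    """Sketch of a concept-based readability score: a weighted mix of
    surface features (sentence and word length, as in traditional
    formulas) and the density of domain-specific concepts.
    Higher score = harder to read. All weights are illustrative."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z]+", text.lower())
    avg_sentence_len = len(words) / max(len(sentences), 1)
    avg_word_len = sum(len(w) for w in words) / max(len(words), 1)
    concept_density = sum(w in domain_concepts for w in words) / max(len(words), 1)
    return (w_sentence * avg_sentence_len
            + w_word * avg_word_len
            + w_concept * 100 * concept_density)

concepts = {"myocardial", "infarction", "ischemia"}   # assumed concept list
easy = concept_readability("The heart did not get enough blood.", concepts)
hard = concept_readability("Myocardial infarction follows ischemia.", concepts)
```

Re-ranking then amounts to sorting the retrieved documents by this score, which stays cheap because each document is scanned once.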

Relevance: 80.00%

Abstract:

In this paper, we describe a model of the human visual system (HVS) based on the wavelet transform. This model is largely based on a previously proposed model, but has a number of modifications that make it more amenable to potential integration into a wavelet-based image compression scheme. These modifications include the use of a separable wavelet transform instead of the cortex transform, the application of a wavelet contrast sensitivity function (CSF), and a simplified definition of subband contrast that allows us to predict noise visibility directly from wavelet coefficients. Initially, we outline the luminance, frequency, and masking sensitivities of the HVS and discuss how these can be incorporated into the wavelet transform. We then outline a number of limitations of the wavelet transform as a model of the HVS, namely the lack of translational invariance and poor orientation sensitivity. In order to investigate the efficacy of this wavelet-based model, a wavelet visible difference predictor (WVDP) is described. The WVDP is then used to predict visible differences between an original and a compressed (or noisy) image. Results are presented to emphasize the limitations of commonly used measures of image quality and to demonstrate the performance of the WVDP. The paper concludes with suggestions on how the WVDP can be used to determine a visually optimal quantization strategy for wavelet coefficients and produce a quantitative measure of image quality.
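The subband-weighting idea can be sketched with a hand-rolled single-level Haar transform standing in for the separable wavelet transform. The CSF weights below are assumed values, not the model's calibrated sensitivities; they only encode the qualitative point that the HVS is least sensitive to the diagonal band, and the error measure is a crude stand-in for the WVDP, not its actual definition.

```python
def haar2d(img):
    """Single-level 2-D Haar transform of a square, even-sized image,
    returning one array whose quadrants are the LL, LH, HL, HH subbands."""
    n = len(img)
    rows = []
    for r in img:                              # transform each row
        lo = [(r[2*i] + r[2*i+1]) / 2 for i in range(n // 2)]
        hi = [(r[2*i] - r[2*i+1]) / 2 for i in range(n // 2)]
        rows.append(lo + hi)
    out_cols = []
    for c in zip(*rows):                       # then each column
        lo = [(c[2*i] + c[2*i+1]) / 2 for i in range(n // 2)]
        hi = [(c[2*i] - c[2*i+1]) / 2 for i in range(n // 2)]
        out_cols.append(lo + hi)
    return [list(r) for r in zip(*out_cols)]

# assumed CSF weights per subband: least sensitivity on the diagonal band
CSF = {"LL": 1.0, "LH": 0.6, "HL": 0.6, "HH": 0.3}

def weighted_error(orig, dist):
    """CSF-weighted coefficient difference between two images: a crude
    visible-difference score in the spirit of (not identical to) the WVDP."""
    a, b = haar2d(orig), haar2d(dist)
    n = len(orig); h = n // 2
    total = 0.0
    for i in range(n):
        for j in range(n):
            band = ("L" if i < h else "H") + ("L" if j < h else "H")
            total += CSF[band] * abs(a[i][j] - b[i][j])
    return total
```

Because the score is computed per coefficient, the same weights could steer a quantizer: coarser steps where the weight (sensitivity) is low.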

Relevance: 80.00%

Abstract:

Recent El Niño events have stimulated interest in the development of modeling techniques to forecast extremes of climate and related health events. Previous studies have documented associations between specific climate variables (particularly temperature and rainfall) and outbreaks of arboviral disease. In some countries, such diseases are sensitive to El Niño. Here we describe a climate-based model for the prediction of Ross River virus epidemics in Australia. From a literature search and data on case notifications, we determined in which years there were epidemics of Ross River virus in southern Australia between 1928 and 1998. Predictor variables were monthly Southern Oscillation index values for the year of an epidemic or lagged by 1 year. We found that in southeastern states, epidemic years were well predicted by monthly Southern Oscillation index values in January and September of the previous year. The model forecasts a high probability of epidemic Ross River virus in the southern states of Australia in 1999. We conclude that epidemics of arboviral disease can, at least in principle, be predicted on the basis of climate relationships.
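The shape of such a climate-based predictor can be sketched as a logistic function of the two lagged SOI predictors. The coefficients below are made-up placeholders; the paper fits the relationship to the 1928-1998 epidemic record rather than using fixed values.

```python
import math

def epidemic_probability(soi_jan_prev, soi_sep_prev,
                         b0=-2.0, b1=0.15, b2=0.15):
    """Logistic sketch of a climate-based epidemic predictor driven by the
    previous year's January and September SOI values. The coefficients
    are illustrative placeholders, not fitted values."""
    z = b0 + b1 * soi_jan_prev + b2 * soi_sep_prev
    return 1.0 / (1.0 + math.exp(-z))

# sustained positive SOI (La Nina-like, wetter conditions) raises the
# predicted probability of an epidemic year under these placeholder signs
p_wet = epidemic_probability(10.0, 10.0)
p_dry = epidemic_probability(-10.0, -10.0)
```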

Relevance: 80.00%

Abstract:

Problems associated with the stickiness of food in processing and storage practices, along with its causative factors, are outlined. Fundamental mechanisms that explain why and how food products become sticky are discussed. Methods currently in use for characterizing and overcoming stickiness problems in food processing and storage operations are described. The use of a glass transition temperature-based model, which provides a rational basis for understanding and characterizing the stickiness of many food products, is highlighted.
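A glass-transition-based stickiness analysis typically estimates the mixture glass transition temperature with the Gordon-Taylor equation and flags stickiness when the product is held some margin above it. The numeric values below (amorphous lactose plasticised by water, a 20 K margin) are approximate, illustration-only figures.

```python
def gordon_taylor(w2, Tg1, Tg2, k):
    """Gordon-Taylor estimate of the glass transition temperature (deg C)
    of a binary amorphous mixture, component 2 being the plasticiser:
    Tg = (w1*Tg1 + k*w2*Tg2) / (w1 + k*w2)."""
    w1 = 1.0 - w2
    return (w1 * Tg1 + k * w2 * Tg2) / (w1 + k * w2)

def is_sticky(T, Tg, delta=20.0):
    """Rule-of-thumb stickiness flag: many amorphous food powders become
    sticky when held roughly 10-20 K above Tg (delta is an assumed margin)."""
    return T > Tg + delta

# approximate literature-style values for amorphous lactose plasticised by
# water, used purely for illustration
Tg_mix = gordon_taylor(w2=0.05, Tg1=101.0, Tg2=-135.0, k=6.7)
```

The practical point the formula captures is that a small increase in moisture content sharply depresses Tg, which is why stickiness appears at modest drying-air temperatures.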

Relevance: 80.00%

Abstract:

Public sector organizations traditionally have been associated with the internal process (bureaucratic) model of organizational culture. Public choice and management theory have suggested that public sector managers can learn from the experience of private sector management, and need to move away from the internal process model of organizational culture. Given these influences on managers, the current research proposes that managers' perceptions of the ideal organizational culture would no longer reflect the internal process model. Public sector managers' perceptions of the current culture, as well as their perceptions of the ideal culture, were measured. A mail-out survey was conducted in the Queensland (a state of Australia) public sector. Responses to a competing values culture inventory were received from 222 managers. Results indicated that a reliance on the internal process model persists, while managers desired cultural models other than the internal process model, as hypothesized.

Relevance: 80.00%

Abstract:

Observations of accelerating seismic activity prior to large earthquakes in natural fault systems have raised hopes for intermediate-term earthquake forecasting. If this phenomenon does exist, then what causes it to occur? Recent theoretical work suggests that the accelerating seismic release sequence is a symptom of increasing long-wavelength stress correlation in the fault region. A more traditional explanation, based on Reid's elastic rebound theory, argues that an accelerating sequence of seismic energy release could be a consequence of increasing stress in a fault system whose stress moment release is dominated by large events. Both of these theories are examined using two discrete models of seismicity: a Burridge-Knopoff block-slider model and an elastic continuum-based model. Both models display an accelerating release of seismic energy prior to large simulated earthquakes. In both models there is a correlation between the rate of seismic energy release and both the total root-mean-squared stress and the level of long-wavelength stress correlation. Furthermore, both models exhibit a systematic increase in the number of large events at high stress and high long-wavelength stress correlation levels. These results suggest that either explanation is plausible for the accelerating moment release in the models examined. A statistical model based on the Burridge-Knopoff block-slider is constructed which indicates that stress alone is sufficient to produce an accelerating release of seismic energy with time prior to a large earthquake.
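The accelerating-release signal itself is easy to state quantitatively: cumulative Benioff strain computed from a catalogue, commonly fitted by a power-law time-to-failure curve. The sketch below uses the standard Gutenberg-Richter energy relation; the fitting procedure is omitted.

```python
def benioff_strain(magnitudes):
    """Cumulative Benioff strain: running sum of the square root of
    seismic energy, with energy from the Gutenberg-Richter relation
    log10 E = 1.5 M + 4.8 (E in joules)."""
    total, cumulative = 0.0, []
    for M in magnitudes:
        total += (10.0 ** (1.5 * M + 4.8)) ** 0.5
        cumulative.append(total)
    return cumulative

def power_law_amr(t, tf, A, B, m):
    """Time-to-failure form commonly fitted to accelerating moment release:
    strain(t) = A - B * (tf - t)**m with 0 < m < 1, so the release rate
    grows without bound as t approaches the failure time tf."""
    return A - B * (tf - t) ** m
```

In an accelerating sequence, successive increments of the fitted curve grow as t nears tf, which is the signature both discrete models reproduce before their large simulated events.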

Relevance: 80.00%

Abstract:

This paper presents a new approach to the LU decomposition method for the simulation of stationary and ergodic random fields. The approach overcomes the size limitations of LU decomposition and is suitable for simulations of any size. The proposed approach can facilitate fast updating of generated realizations with new data, when appropriate, without repeating the full simulation process. Based on a novel column partitioning of the L matrix, expressed in terms of successive conditional covariance matrices, the approach presented here demonstrates that LU simulation is equivalent to the successive solution of kriging residual estimates plus random terms. Consequently, it can be used for the LU decomposition of matrices of any size. The simulation approach is termed conditional simulation by successive residuals because, at each step, a small set (group) of random variables is simulated with an LU decomposition of a matrix of updated conditional covariances of residuals. The simulated group is then used to estimate residuals without the need to solve large systems of equations.
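The equivalence at the heart of the method, LU (Cholesky) simulation read as a sequence of conditional-mean-plus-residual draws, can be seen in a few lines. This is a generic illustration of z = Lw, not the paper's partitioned algorithm: row i of the product combines the noise already used for rows 0..i-1 (a kriging-style conditional mean) plus a fresh residual term L[i][i]*w[i].

```python
import math

def cholesky(C):
    """Lower-triangular Cholesky factor of a symmetric positive-definite
    covariance matrix (pure Python, for small illustrative sizes)."""
    n = len(C)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            L[i][j] = math.sqrt(C[i][i] - s) if i == j else (C[i][j] - s) / L[j][j]
    return L

def lu_simulate(C, w):
    """LU (Cholesky) simulation z = L w: the result has covariance C.
    Each z[i] = (terms from w[0..i-1]) + L[i][i] * w[i], i.e. a
    conditional contribution from earlier values plus a new residual,
    which is the successive-residuals reading of the method."""
    L = cholesky(C)
    return [sum(L[i][k] * w[k] for k in range(i + 1)) for i in range(len(C))]
```

The paper's contribution is to exploit this reading group by group, so only a small covariance of residuals is decomposed at each step instead of the full matrix.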

Relevance: 80.00%

Abstract:

The particle-based Lattice Solid Model (LSM) was developed to provide a basis to study the physics of rocks and the nonlinear dynamics of earthquakes (MORA and PLACE, 1994; PLACE and MORA, 1999). A new modular and flexible LSM approach has been developed that allows different microphysics to be easily included in or removed from the model. The approach provides a virtual laboratory where numerical experiments can easily be set up and all measurable quantities visualised. The proposed approach provides a means to simulate complex phenomena such as fracturing or localisation processes, and enables the effect of different micro-physics on macroscopic behaviour to be studied. The initial 2-D model is extended to allow three-dimensional simulations to be performed and particles of different sizes to be specified. Numerical bi-axial compression experiments under different confining pressures are used to calibrate the model. By tuning the different microscopic parameters (such as the coefficient of friction, microscopic strength and distribution of grain sizes), the macroscopic strength of the material can be adjusted to agree with laboratory experiments, and the orientation of fractures is consistent with the theoretical value predicted from the Mohr-Coulomb diagram. Simulations indicate that 3-D numerical models have different macroscopic properties than 2-D models and, hence, the model must be recalibrated for 3-D simulations. These numerical experiments illustrate that the new approach is capable of simulating typical rock fracture behaviour. The new model provides a basis to investigate nucleation, rupture and slip-pulse propagation in complex fault zones without the previous model's limitations of a regular low-level surface geometry and a restriction to two dimensions.
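The fracture-orientation check used in the calibration follows directly from the Mohr-Coulomb relation between the friction coefficient and the failure-plane angle:

```python
import math

def fracture_angle_deg(mu):
    """Mohr-Coulomb prediction of fracture orientation: failure planes form
    at theta = 45 - phi/2 degrees from the maximum compressive stress
    direction, where phi = arctan(mu) is the internal friction angle."""
    phi = math.degrees(math.atan(mu))
    return 45.0 - phi / 2.0

# e.g. a typical rock friction coefficient of 0.6 gives roughly 29.5 degrees
theta = fracture_angle_deg(0.6)
```

Comparing the fracture orientation measured in a simulated bi-axial test against this value is one way to verify that the tuned microscopic parameters yield Mohr-Coulomb-consistent macroscopic behaviour.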