21 results for model validation

in Aston University Research Archive


Relevance: 60.00%

Abstract:

This thesis presents an effective methodology for the generation of a simulation which can be used to increase the understanding of viscous fluid processing equipment and aid in its development, design and optimisation. The Hampden RAPRA Torque Rheometer internal batch twin-rotor mixer has been simulated with a view to establishing model accuracies, limitations, practicalities and uses. As this research progressed, via several 'snap-shot' analyses of rotor configurations using the commercial code Polyflow, it became evident that the model was of some worth and its predictions were in good agreement with the validation experiments; however, several major restrictions were identified. These included poor element form, high man-hour requirements for the construction of each geometry and the absence of the transient term in these models. All, or at least some, of these limitations apply to the numerous attempts by other researchers to model internal mixers, and it was clear that there was no generally accepted methodology to provide a practical three-dimensional model which has been adequately validated. This research, unlike others, presents a full, complex three-dimensional, transient, non-isothermal, generalised non-Newtonian simulation with wall slip which overcomes these limitations using unmatched gridding and sliding-mesh technology adapted from CFX codes. This method yields good element form and, since only one geometry has to be constructed to represent the entire rotor cycle, is extremely beneficial for detailed flow-field analysis when used in conjunction with user-defined programmes and automatic geometry parameterisation (AGP), and improves accuracy for investigating equipment design and operating conditions. Model validation has been identified as an area neglected by other researchers in this field, especially for time-dependent geometries, and has been rigorously pursued here in terms of qualitative and quantitative velocity-vector analysis of the isothermal, full-fill mixing of generalised non-Newtonian fluids, as well as torque comparison, with a relatively high degree of success. This indicates that CFD models of this type can be accurate and perhaps have not been validated to this extent previously because of the inherent difficulties arising from most real processes.
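The constitutive treatment behind such a generalised non-Newtonian simulation can be illustrated compactly. The sketch below shows a Carreau viscosity model and a Navier-type wall-slip law of the kind such solvers employ; all parameter values are hypothetical placeholders, not values from the thesis.

```python
def carreau_viscosity(shear_rate, eta0=1e4, eta_inf=1.0, lam=10.0, n=0.3):
    """Carreau generalised-Newtonian viscosity (Pa.s).

    eta0/eta_inf: zero- and infinite-shear viscosities, lam: time constant (s),
    n: power-law index. Values are illustrative, not from the thesis."""
    return eta_inf + (eta0 - eta_inf) * (1.0 + (lam * shear_rate) ** 2) ** ((n - 1) / 2)

def navier_slip_velocity(wall_shear_stress, beta=1e-5):
    """Navier slip law: slip velocity proportional to wall shear stress.
    beta is a hypothetical slip coefficient (m/(Pa.s))."""
    return beta * wall_shear_stress

# Apparent viscosity across a sweep of shear rates
for gdot in (0.1, 1.0, 10.0, 100.0):
    print(f"shear rate {gdot:7.1f} 1/s -> viscosity {carreau_viscosity(gdot):10.2f} Pa.s")
```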

Relevance: 60.00%

Abstract:

The optical illumination of a microstrip gap on a thick semiconductor substrate creates an inhomogeneous electron-hole plasma in the gap region, allowing the propagation mechanism through the plasma region to be studied. This paper uses a multilayer plasma model to explain the origin of the high losses in such structures. Measured results up to 50 GHz show good agreement with the simulated multilayer model. The model also allows the estimation of certain key parameters of the plasma, such as carrier density and diffusion length, which are difficult to measure by direct means. The detailed model validation performed here will enable the design of more complex microwave structures based on this architecture. While this paper focuses on monocrystalline silicon as the substrate, the model is easily adaptable to other semiconductor materials such as GaAs.
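As a rough illustration of why photo-injected carriers drive high loss, the following sketch computes the DC Drude conductivity of an electron-hole plasma as a function of injected pair density. The mobilities are typical silicon values and the densities are illustrative; the paper's full multilayer model is not reproduced here.

```python
E_CHARGE = 1.602e-19  # electron charge (C)

def drude_sigma(n_carrier, mobility):
    """DC Drude conductivity sigma = n * e * mu (S/m); n in m^-3, mu in m^2/(V.s)."""
    return n_carrier * E_CHARGE * mobility

def plasma_sigma(n_pairs, mu_e=0.14, mu_h=0.045):
    """Electron-hole plasma: both carrier species contribute.
    Mobilities are typical silicon values, not fitted parameters."""
    return drude_sigma(n_pairs, mu_e) + drude_sigma(n_pairs, mu_h)

for n in (1e20, 1e22, 1e24):  # photo-injected pair density (m^-3), illustrative
    print(f"n = {n:.0e} m^-3 -> sigma = {plasma_sigma(n):.3g} S/m")
```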

Relevance: 60.00%

Abstract:

Purpose – The purpose of this paper is to outline a seven-phase simulation conceptual modelling procedure that incorporates existing practice and embeds a process reference model (i.e. SCOR).

Design/methodology/approach – An extensive review of the simulation and SCM literature identifies a set of requirements for a domain-specific conceptual modelling procedure. The associated design issues for each requirement are discussed, and the utility of SCOR in the process of conceptual modelling is demonstrated using two development cases. Ten key concepts are synthesised and aligned to a general process for conceptual modelling. Further work is outlined to detail, refine and test the procedure with different process reference models in different industrial contexts.

Findings – Simulation conceptual modelling is often regarded as the most important yet least understood aspect of a simulation project (Robinson, 2008a). Even today, there has been little research into guidelines to aid in the creation of a conceptual model. Design issues are discussed for building an 'effective' conceptual model, and the domain-specific requirements for modelling supply chains are addressed. The ten key concepts are incorporated to aid in describing the supply chain problem (i.e. the components and relationships that need to be included in the model), the model content (i.e. rules for determining the simplest model boundary and level of detail to implement the model) and model validation.

Originality/value – The paper addresses Robinson's (2008a) call for research in defining and developing new approaches for conceptual modelling and Manuj et al.'s (2009) discussion on improving the rigour of simulation studies in SCM. It is expected that more detailed guidelines will yield benefits to both expert modellers (i.e. averting typical modelling failures) and novice modellers (i.e. guided practice and less reliance on hopeful intuition).

Relevance: 40.00%

Abstract:

A new mesoscale simulation model for solids dissolution, based on a computationally efficient and versatile digital modelling approach (DigiDiss), is considered and validated against analytical solutions and published experimental data for simple geometries. As the digital model is specifically designed to handle irregular shapes and complex multi-component structures, use of the model is explored for single crystals (sugars) and clusters. The single crystals and a cluster were first scanned using X-ray microtomography to obtain a digital version of their structures. The digitised particles and clusters were used as a structural input to the digital simulation. The same particles were then dissolved in water and the dissolution process was recorded by a video camera and analysed, yielding the overall dissolution times and images of particle size and shape during dissolution. The results demonstrate the ability of the simulation method to reproduce experimental behaviour, based on the known chemical and diffusion properties of the constituent phases. The paper discusses how further refinements of the modelling approach will need to include other important effects, such as complex disintegration behaviour (particle ejection) and uncertainties in chemical properties. The nature of the digital modelling approach is well suited to future implementation with high-speed computation using hybrid conventional (CPU) and graphics processor (GPU) systems.
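The digital modelling idea can be sketched as a voxel-based scheme: explicit finite-difference diffusion of the dissolved species coupled to a Noyes-Whitney-type surface flux that erodes solid voxels. This is a minimal 2D illustration under assumed parameters, not the DigiDiss implementation itself.

```python
import numpy as np

def dissolve(solid, steps=1000, D=1e-9, c_sat=1.0, k=1e-6, dx=1e-5, dt=0.01):
    """Explicit 2D diffusion plus surface-flux dissolution on a voxel grid.

    solid: 2D array of remaining solid mass per voxel (0 = fluid).
    D: diffusivity (m^2/s); c_sat: saturation concentration; k: dissolution
    rate constant (m/s). All values are illustrative placeholders.
    Explicit stability requires dt <= dx**2 / (4 * D)."""
    c = np.zeros_like(solid, dtype=float)  # dissolved concentration field
    for _ in range(steps):
        # 5-point Laplacian (periodic edges, kept simple for the sketch)
        lap = (np.roll(c, 1, 0) + np.roll(c, -1, 0) +
               np.roll(c, 1, 1) + np.roll(c, -1, 1) - 4.0 * c) / dx**2
        c = c + dt * D * lap
        # Noyes-Whitney-type flux: release driven by local undersaturation
        release = np.where(solid > 0.0, k * (c_sat - c) * dt / dx, 0.0)
        release = np.clip(release, 0.0, solid)  # cannot release more than remains
        solid = solid - release
        c = c + release
    return solid, c

# A square 'crystal' of solid in the middle of a fluid domain
grid = np.zeros((64, 64))
grid[24:40, 24:40] = 1.0
remaining, conc = dissolve(grid)
print("solid mass left:", remaining.sum())
```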

Relevance: 40.00%

Abstract:

A novel simulation model for pyrolysis processes of lignocellulosic biomass in Aspen Plus® was presented at the BC&E 2013. Based on kinetic reaction mechanisms, the simulation calculates product compositions and yields depending on reactor conditions (temperature, residence time, flue gas flow rate) and feedstock composition (biochemical composition, atomic composition, ash and alkali metal content). The simulation model was found to show good correlation with existing publications. In order to further verify the model, pyrolysis experiments of our own are performed in a 1 kg/h continuously fed fluidized bed fast pyrolysis reactor. Two feedstocks with different characteristics, one wood and one straw-like, are processed in order to evaluate the influence of feedstock composition on the yields of the pyrolysis products and their composition. Furthermore, the temperature response of yields and product compositions is evaluated by varying the reactor temperature between 450 and 550 °C for one of the feedstocks. The yields of the pyrolysis products (gas, oil, char) are determined and their detailed composition is analysed. The experimental runs are reproduced with the corresponding reactor conditions in the Aspen Plus model and the results compared with the experimental findings.
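A lumped kinetic scheme of the kind such simulations build on can be illustrated with competing first-order Arrhenius channels from biomass to gas, oil and char. The rate parameters below are illustrative literature-style values, not those used in the Aspen Plus model.

```python
import numpy as np

R = 8.314  # gas constant, J/(mol K)

# Illustrative Arrhenius parameters (A in 1/s, Ea in J/mol) for lumped
# channels biomass -> gas / oil / char; placeholders, not the model's values.
CHANNELS = {"gas": (4.4e9, 1.53e5), "oil": (1.1e10, 1.48e5), "char": (3.3e6, 1.12e5)}

def yields(T_kelvin, residence_s):
    """Product split for parallel first-order reactions over a residence time."""
    k = {p: A * np.exp(-Ea / (R * T_kelvin)) for p, (A, Ea) in CHANNELS.items()}
    k_tot = sum(k.values())
    converted = 1.0 - np.exp(-k_tot * residence_s)  # fraction of biomass reacted
    return {p: converted * ki / k_tot for p, ki in k.items()}

for T_c in (450, 500, 550):  # reactor temperatures, degrees C
    print(T_c, {p: round(y, 3) for p, y in yields(T_c + 273.15, 2.0).items()})
```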

Relevance: 40.00%

Abstract:

Efficient numerical models facilitate the study and design of solid oxide fuel cells (SOFCs), stacks and systems. Whilst accuracy and reliability of the computed results are usually sought by researchers, the corresponding modelling complexities can result in practical difficulties regarding implementation flexibility and computational cost. The main objective of this article is to adapt a simple but viable numerical tool for evaluation of our experimental rig. Accordingly, a model for a multi-layer SOFC surrounded by a constant-temperature furnace is presented, trained and validated against experimental data. The model consists of a four-layer structure comprising a stand, two interconnects and the PEN (Positive electrode-Electrolyte-Negative electrode), each being approximated by a lumped parameter model. The heating process through the surrounding chamber is also considered. We used a set of V-I characteristic data for parameter adjustment, followed by model verification against two independent sets of data. The model results show good agreement with practical data, offering a significant improvement compared with reduced models in which the impact of external heat loss is neglected. Furthermore, thermal analysis for adiabatic and non-adiabatic processes is carried out to capture the thermal behaviour of a single cell, followed by a polarisation loss assessment. Finally, model-based design of experiments is demonstrated for a case study.
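A minimal zero-dimensional sketch of the V-I behaviour such a lumped model reproduces: cell voltage as open-circuit voltage minus ohmic, activation (symmetric Butler-Volmer) and concentration losses. All parameter values are hypothetical placeholders, not the article's fitted parameters.

```python
import numpy as np

R, F = 8.314, 96485.0  # gas constant J/(mol K), Faraday constant C/mol

def cell_voltage(i, T=1073.0, E_ocv=1.0, r_ohm=0.15, i0=0.2, i_lim=2.0, alpha=0.5):
    """Zero-dimensional SOFC polarisation curve: V vs current density (A/cm^2).

    E_ocv: open-circuit voltage (V); r_ohm: area-specific resistance (ohm cm^2);
    i0: exchange current density; i_lim: limiting current density.
    All values are illustrative placeholders."""
    eta_act = (R * T / (alpha * F)) * np.arcsinh(i / (2.0 * i0))  # activation loss
    eta_con = -(R * T / (2.0 * F)) * np.log(1.0 - i / i_lim)      # concentration loss
    return E_ocv - i * r_ohm - eta_act - eta_con

for i in (0.1, 0.5, 1.0, 1.5):
    print(f"i = {i:.1f} A/cm^2 -> V = {cell_voltage(i):.3f} V")
```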

Relevance: 30.00%

Abstract:

This Letter addresses image segmentation via a generative model approach. A Bayesian network in the space of dyadic wavelet transform coefficients is introduced to model texture images. The model is similar to a hidden Markov model (HMM), but with non-stationary transition conditional probability distributions. It is composed of discrete hidden variables and observable Gaussian outputs for the wavelet coefficients. In particular, the Gabor wavelet transform is considered. The introduced model is compared with the simplest joint Gaussian probabilistic model for Gabor wavelet coefficients for several textures from the Brodatz album [1]. The comparison is based on cross-validation and includes probabilistic model ensembles instead of single models. In addition, the robustness of the models to additive Gaussian noise is investigated. We further study the feasibility of the introduced generative model for image segmentation in the novelty detection framework [2]. Two examples are considered: (i) sea surface pollution detection from intensity images and (ii) segmentation of still images with varying illumination across the scene.
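The baseline the Letter compares against, a joint Gaussian over wavelet coefficients, is easy to sketch. Below, each sample is assumed to be a vector of Gabor coefficients for one image patch (computed elsewhere); patches with low log-likelihood under the trained texture model are flagged as novel, as in the novelty detection setting.

```python
import numpy as np

def fit_gaussian(features):
    """Fit a joint Gaussian to rows of `features` (n_samples x n_dims)."""
    mu = features.mean(axis=0)
    cov = np.cov(features, rowvar=False) + 1e-6 * np.eye(features.shape[1])
    return mu, cov

def log_likelihood(x, mu, cov):
    """Log-density of one feature vector under the fitted Gaussian."""
    d = x - mu
    _, logdet = np.linalg.slogdet(cov)
    k = mu.size
    return -0.5 * (k * np.log(2 * np.pi) + logdet + d @ np.linalg.solve(cov, d))

def is_novel(x, mu, cov, threshold):
    """Flag a patch as novel when its likelihood falls below a threshold."""
    return log_likelihood(x, mu, cov) < threshold

# Stand-in demo: random vectors in place of real Gabor coefficient vectors
rng = np.random.default_rng(0)
train = rng.normal(size=(500, 8))
mu, cov = fit_gaussian(train)
print(is_novel(train[0], mu, cov, threshold=-30.0))
```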

Relevance: 30.00%

Abstract:

Although techniques such as biopanning rely heavily upon the screening of randomized gene libraries, there is surprisingly little information available on the construction of those libraries. In general, construction is based on the cloning of 'randomized' synthetic oligonucleotides, in which given position(s) contain an equal mixture of all four bases. Yet many supposedly 'randomized' libraries contain significant elements of bias and/or omission. Here, we report the development and validation of a new, PCR-based assay that enables rapid examination of library composition both prior to and after cloning. By using our assay to analyse model libraries, we demonstrate that the cloning of a given distribution of sequences does not necessarily result in a similarly composed library of clones. Thus, while bias in randomized synthetic oligonucleotide mixtures can be virtually eliminated by using unequal ratios of the four phosphoramidites, the use of such mixtures does not ensure retrieval of a truly randomized library. We propose that, in the absence of a technique to control cloning frequencies, the ability to analyse the composition of libraries after cloning will enhance significantly the quality of information derived from those libraries.
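The compensation the abstract alludes to, using unequal phosphoramidite ratios to cancel coupling bias, is simple arithmetic: feed each base in inverse proportion to its relative coupling efficiency. The efficiencies below are illustrative, not measured values.

```python
# If base b couples with relative efficiency e_b, the incorporated fraction
# from an input mix m_b is proportional to m_b * e_b.  For equal incorporation
# of all four bases, feed each base in proportion to 1 / e_b.
# Coupling efficiencies here are illustrative placeholders.
efficiencies = {"A": 1.00, "C": 0.85, "G": 1.10, "T": 0.95}

mix = {b: 1.0 / e for b, e in efficiencies.items()}
total = sum(mix.values())
mix = {b: m / total for b, m in mix.items()}  # normalised input ratios

# Check: the resulting incorporated fractions come out equal
incorporated = {b: mix[b] * efficiencies[b] for b in mix}
s = sum(incorporated.values())
print({b: round(v / s, 3) for b, v in incorporated.items()})  # all 0.25
```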

Relevance: 30.00%

Abstract:

The investigation of insulation debris transport, sedimentation, penetration into the reactor core and head-loss build-up has become important to reactor safety research for PWRs and BWRs when considering the long-term behaviour of emergency core cooling systems during loss-of-coolant accidents. Research projects are being performed in cooperation between the University of Applied Sciences Zittau/Görlitz and the Helmholtz-Zentrum Dresden-Rossendorf. The projects include experimental investigations of different processes and phenomena of insulation debris in coolant flow and the development of CFD models. Generic complex experiments serve to build up a database for the validation of models for single effects and their coupling in CFD codes. This paper describes the experimental facility for complex generic experiments (ZSW) and gives an overview of the experimental boundary conditions and results for upstream and downstream phenomena, as well as for the long-term behaviour due to corrosive processes.

Relevance: 30.00%

Abstract:

When constructing and using environmental models, it is typical that many of the inputs to the models will not be known perfectly. In some cases, it will be possible to make observations, or occasionally to use physics-based uncertainty propagation, to ascertain the uncertainty on these inputs. However, such observations are often not available or even possible, and another approach to characterising the uncertainty on the inputs must be sought. Even when observations are available, if the analysis is being carried out within a Bayesian framework then prior distributions will have to be specified. One option for gathering, or at least estimating, this information is to employ expert elicitation. Expert elicitation is well studied within statistics and psychology and involves the assessment of the beliefs of a group of experts about an uncertain quantity (for example, an input or parameter within a model), typically in terms of obtaining a probability distribution. One of the challenges in expert elicitation is to minimise the biases that might enter into the judgements made by the individual experts, and then to come to a consensus decision within the group of experts. Effort is made in the elicitation exercise to prevent biases clouding the judgements through well-devised questioning schemes. It is also important that, when reaching a consensus, the experts are exposed to the knowledge of the others in the group.

Within the FP7 UncertWeb project (http://www.uncertweb.org/), there is a requirement to build a Web-based tool for expert elicitation. In this paper, we discuss some of the issues of building a Web-based elicitation system, covering both the technological aspects and the statistical and scientific issues. In particular, we demonstrate two tools: a Web-based system for the elicitation of continuous random variables and a system designed to elicit uncertainty about categorical random variables in the setting of landcover classification uncertainty. The first of these examples is a generic tool developed to elicit uncertainty about univariate continuous random variables. It is designed to be used within an application context and extends the existing SHELF method, adding a web interface and access to metadata. The tool is developed so that it can be readily integrated with environmental models exposed as web services. The second example was developed for the TREES-3 initiative, which monitors tropical landcover change through ground-truthing at confluence points. It allows experts to validate the accuracy of automated landcover classifications using site-specific imagery and local knowledge. Experts may provide uncertainty information at various levels: from a general rating of their confidence in a site validation to a numerical ranking of the possible landcover types within a segment.

A key challenge in the Web-based setting is the design of the user interface and the method of interaction between the problem owner and the problem experts. We show the workflow of the elicitation tool, and show how we can represent the final elicited distributions and confusion matrices using UncertML, ready for integration into uncertainty-enabled workflows. We also show how the metadata associated with the elicitation exercise is captured and can be referenced from the elicited result, providing crucial lineage information and thus traceability in the decision-making process.
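The numerical core of such an elicitation tool is fitting a parametric distribution to an expert's elicited quantiles. A minimal sketch, assuming a normal distribution and hypothetical quartile judgements; SHELF itself offers more distribution families and interactive feedback.

```python
import numpy as np
from scipy import optimize, stats

# Elicited judgements (hypothetical): lower quartile, median, upper quartile
probs = np.array([0.25, 0.50, 0.75])
values = np.array([12.0, 20.0, 31.0])

def loss(params):
    """Squared error between the distribution's quantiles and the judgements."""
    mu, sigma = params[0], np.exp(params[1])  # log-parametrise to keep sigma > 0
    return np.sum((stats.norm.ppf(probs, mu, sigma) - values) ** 2)

res = optimize.minimize(loss, x0=[values[1], 0.0])
mu, sigma = res.x[0], np.exp(res.x[1])
print(f"fitted normal: mu = {mu:.2f}, sigma = {sigma:.2f}")
```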

Relevance: 30.00%

Abstract:

Soil erosion is one of the most pressing issues facing developing countries. The need for soil erosion assessment is paramount, as a successful and productive agricultural base is necessary for economic growth and stability. In Ghana, a country with an expanding population and high potential for economic growth, agriculture is an important resource; however, most of the crop production is restricted to low-technology shifting-cultivation agriculture. The high-intensity seasonal rainfall coincides with the early growing period of many of the crops, meaning that plots are very susceptible to erosion, especially on the steep-sided valleys in the region south of Lake Volta. This research investigated the processes of soil erosion by rainfall with the aim of producing a sediment yield model for a small semi-agricultural catchment in rural Ghana. Various types of modelling techniques were considered to discover those most applicable to the sub-tropical environment of Southern Ghana. Once an appropriate model had been developed and calibrated, the aim was to examine how to scale the model up, using sub-catchments, to calculate sedimentation rates for Lake Volta. An experimental catchment was located in Ghana, south-west of Lake Volta, where data on rainstorms and the associated streamflow and sediment loads, together with soil data (moisture content, classification and particle-size distribution), were collected to calibrate the model. Additional data were obtained from the Soil Research Institute in Ghana to explore calibration of the Universal Soil Loss Equation (USLE; Wischmeier and Smith, 1978) for Ghanaian soils and environment. It was shown that the USLE could be successfully converted to provide meaningful soil loss estimates in the Ghanaian environment. However, due to experimental difficulties, the proposed theory and methodology of the sediment yield model could only be tested in principle. Future work may include validation of the model and subsequent scaling up to estimate sedimentation rates in Lake Volta.
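The USLE referred to is a product of empirical factors, A = R K L S C P, with the slope length and steepness factors usually combined as a single LS term. A worked sketch follows; the factor values are hypothetical, chosen only to illustrate the arithmetic, and are not from the thesis.

```python
def usle(R, K, LS, C, P):
    """Universal Soil Loss Equation: A = R * K * LS * C * P.

    A: mean annual soil loss; R: rainfall erosivity; K: soil erodibility;
    LS: combined slope length/steepness factor; C: cover-management factor;
    P: support-practice factor. Units depend on the chosen factor system."""
    return R * K * LS * C * P

# Hypothetical factor values for a steep cultivated plot (not from the thesis)
print(usle(R=6000.0, K=0.02, LS=4.0, C=0.3, P=1.0), "t/ha/yr")
```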

Relevance: 30.00%

Abstract:

This thesis, set within an Action Research framework, details the development and validation of a writer-centred model of the writing process. The model was synthesised within the boundaries of a writers' group for MA students. The initial data collected, and analysed using the principles of grounded theory, were retrospective descriptions of group members' writing processes. After initial analysis, additional data, from group members' writing and from audio recordings, were used for further analysis and to form a model of the writing process. To ascertain whether the model had value outside the specific context in which it was made, it was validated from three different perspectives. Firstly, the retrospective descriptions of other writers were collected and analysed, using the model as a framework. Secondly, the model was presented at academic conferences; comments about the model, made by members of the audience, were collected and analysed. Finally, the model was used in writing courses for PhD students. Comments from these students, along with questionnaire responses, were collected and the content analysed. Upon examination of all data sources, the model was updated to reflect additional insights arising from the analysis. Analysis of the data also indicated that the model is usable outside its original context. Potential uses for the model are 1) raising awareness of the process of writing, 2) putting writers at ease, 3) serving as a starting point for individuals or groups to design their own models of the writing process, and 4) serving as a tool to help writers take control of their writing processes.

Relevance: 30.00%

Abstract:

The starting point of this research was the belief that manufacturing and similar industries need help with the concept of e-business, especially in assessing the relevance of possible e-business initiatives. The research hypothesis was that it should be possible to produce a systematic model that defines, at a useful level of detail, the probable e-business requirements of an organisation based on objective criteria with an accuracy of 85%-90%. This thesis describes the development and validation of such a model. A preliminary model was developed from a variety of sources, including a survey of current and planned e-business activity and representative examples of e-business material produced by e-business solution providers. The model was subjected to a process of testing and refinement based on recursive case studies, with controls over the improving accuracy and stability of the model. Useful conclusions were also possible as to the relevance of e-business functions to the case study participants themselves. Techniques were evolved to synthesise the e-business requirements of an organisation and present them at a management-summary level of detail. The results of applying these techniques to all the case studies used in this research were discussed. The conclusion of the research was that the case study methodology employed was successful, and that a model was achieved suitable for practical application in a manufacturing organisation requiring help with a requirements-definition process.

Relevance: 30.00%

Abstract:

This paper describes the development of a tree-based decision model to predict the severity of pediatric asthma exacerbations in the emergency department (ED) at 2 h following triage. The model was constructed from retrospective patient data abstracted from ED charts. The original data were preprocessed to eliminate questionable patient records and to normalize the values of age-dependent clinical attributes. The model uses attributes routinely collected in the ED and provides predictions even for incomplete observations. Its performance was verified on independent validation data (split-sample validation), where it demonstrated an AUC (area under the ROC curve) of 0.83, a sensitivity of 84%, a specificity of 71% and a Brier score of 0.18. The model is intended to supplement an asthma clinical practice guideline; however, it can also be used as a stand-alone decision tool.
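The reported validation metrics can be computed directly from predicted probabilities on a held-out split. A minimal sketch with hypothetical outcomes and predictions, not the study's data.

```python
import numpy as np

y_true = np.array([1, 0, 1, 1, 0, 0, 1, 0])  # hypothetical observed outcomes
p_hat = np.array([0.9, 0.2, 0.7, 0.4, 0.1, 0.35, 0.8, 0.6])  # predicted risks

brier = np.mean((p_hat - y_true) ** 2)  # mean squared error of probabilities

y_pred = (p_hat >= 0.5).astype(int)  # dichotomise at an illustrative 0.5 cut-off
tp = np.sum((y_pred == 1) & (y_true == 1))
tn = np.sum((y_pred == 0) & (y_true == 0))
sensitivity = tp / np.sum(y_true == 1)
specificity = tn / np.sum(y_true == 0)
print(f"Brier = {brier:.3f}, sensitivity = {sensitivity:.2f}, "
      f"specificity = {specificity:.2f}")
```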

Relevance: 30.00%

Abstract:

Developmental neurotoxicity is a major issue in human health and may have lasting neurological implications. In this preliminary study we exposed differentiating Ntera2/clone D1 (NT2/D1) cell neurospheres to known human teratogens classed as non-embryotoxic (acrylamide), weakly embryotoxic (lithium, valproic acid) and strongly embryotoxic (hydroxyurea), as listed by the European Centre for the Validation of Alternative Methods (ECVAM), and examined endpoints of cell viability and expression of neuronal protein markers specific to the central nervous system, in order to identify developmental neurotoxins. Following induction of neuronal differentiation, valproic acid had the most significant effect on neurogenesis, in terms of reduced viability and decreased neuronal markers. Lithium had the least effect on viability and did not significantly alter the expression of neuronal markers. Hydroxyurea significantly reduced cell viability but did not affect neuronal protein marker expression. Acrylamide reduced neurosphere viability but did not affect neuronal protein marker expression. Overall, this NT2/D1-based neurosphere model of neurogenesis may provide the basis for a model of developmental neurotoxicity in vitro.