18 results for Validation model
in Aston University Research Archive
Abstract:
A new mesoscale simulation model for solids dissolution based on a computationally efficient and versatile digital modelling approach (DigiDiss) is considered and validated against analytical solutions and published experimental data for simple geometries. As the digital model is specifically designed to handle irregular shapes and complex multi-component structures, use of the model is explored for single crystals (sugars) and clusters. Single crystals and the cluster were first scanned using X-ray microtomography to obtain a digital version of their structures. The digitised particles and clusters were used as a structural input to the digital simulation. The same particles were then dissolved in water and the dissolution process was recorded by a video camera and analysed, yielding the overall dissolution times and images of particle size and shape during dissolution. The results demonstrate the ability of the simulation method to reproduce experimental behaviour, based on the known chemical and diffusion properties of the constituent phases. The paper discusses how further refinements of the modelling approach will need to include other important effects, such as complex disintegration behaviour (particle ejection) and uncertainties in chemical properties. The nature of the digital modelling approach is well suited to future implementation with high-speed computation using hybrid conventional (CPU) and graphics processor (GPU) systems.
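The abstract leaves the numerical scheme unspecified; purely as a sketch of what a voxel-based dissolution model of this kind can look like, the following update couples a simple surface dissolution law to explicit finite-difference diffusion of the solute. All parameter values (rate constant, diffusivity, saturation concentration, density) are illustrative assumptions, not DigiDiss inputs.

```python
import numpy as np

DX, DT = 1e-5, 1e-3          # voxel size (m) and time step (s), assumed
D, C_SAT = 5e-10, 300.0      # solute diffusivity (m^2/s), saturation conc. (kg/m^3)
K_DISS = 1e-6                # surface dissolution rate constant (m/s), assumed

def step(solid, mass, conc):
    """One explicit update: surface voxels shed mass into the liquid,
    which then spreads by finite-difference diffusion (6-connectivity)."""
    liquid = ~solid
    nbr = sum(np.roll(liquid, s, a) for a in range(3) for s in (-1, 1))
    surface = solid & (nbr > 0)
    # driving force: undersaturation of the surrounding liquid
    flux = K_DISS * np.maximum(C_SAT - conc, 0.0) * nbr * DX**2 * DT  # kg
    dm = np.where(surface, np.minimum(mass, flux), 0.0)
    mass -= dm
    conc += dm / DX**3                 # released mass raises local concentration
    solid &= mass > 0                  # fully dissolved voxels become liquid
    lap = sum(np.roll(conc, s, a) for a in range(3) for s in (-1, 1)) - 6 * conc
    conc += np.where(~solid, D * DT / DX**2 * lap, 0.0)  # explicit diffusion
    return solid, mass, conc

# tiny driver: dissolve a small cube in initially pure water
solid = np.zeros((16, 16, 16), bool)
solid[6:10, 6:10, 6:10] = True
mass = np.where(solid, 1590.0 * DX**3, 0.0)   # sucrose-like density, assumed
conc = np.zeros(solid.shape)
for _ in range(2000):
    solid, mass, conc = step(solid, mass, conc)
print("solid voxels remaining:", int(solid.sum()))
```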
Abstract:
A novel simulation model for pyrolysis processes of lignocellulosic biomass in Aspen Plus® was presented at the BC&E 2013. Based on kinetic reaction mechanisms, the simulation calculates product compositions and yields depending on reactor conditions (temperature, residence time, flue gas flow rate) and feedstock composition (biochemical composition, atomic composition, ash and alkali metal content). The simulation model was found to show good agreement with existing publications. In order to further verify the model, dedicated pyrolysis experiments in a 1 kg/h continuously fed fluidized bed fast pyrolysis reactor are performed. Two types of biomass with different characteristics, one woody and one straw-like feedstock, are processed in order to evaluate the influence of the feedstock composition on the yields of the pyrolysis products and their composition. Furthermore, the temperature response of yields and product compositions is evaluated by varying the reactor temperature between 450 and 550 °C for one of the feedstocks. The yields of the pyrolysis products (gas, oil, char) are determined and their detailed composition is analysed. The experimental runs are reproduced with the corresponding reactor conditions in the Aspen Plus model and the results compared with the experimental findings.
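The kinetic mechanism itself is not reproduced in the abstract; as a minimal sketch under assumed parameters, a lumped scheme of three competing first-order Arrhenius reactions (biomass → gas, oil, char) integrated over the vapour residence time shows how product yields acquire the temperature dependence the experiments probe. The rate parameters below are illustrative placeholders, not the model's values.

```python
import numpy as np
from scipy.integrate import solve_ivp

R_GAS = 8.314  # J/(mol K)

# Competing first-order reactions: biomass -> gas / oil / char.
A = {"gas": 1.4e9, "oil": 4.1e6, "char": 7.4e5}      # pre-exponentials, 1/s (assumed)
E = {"gas": 1.77e5, "oil": 1.13e5, "char": 1.07e5}   # activation energies, J/mol (assumed)

def yields(temp_c, residence_time_s):
    T = temp_c + 273.15
    k = {p: A[p] * np.exp(-E[p] / (R_GAS * T)) for p in A}
    k_tot = sum(k.values())

    def rhs(t, y):
        b = y[0]  # unconverted biomass mass fraction
        return [-k_tot * b, k["gas"] * b, k["oil"] * b, k["char"] * b]

    sol = solve_ivp(rhs, (0.0, residence_time_s), [1.0, 0.0, 0.0, 0.0])
    b, g, o, c = sol.y[:, -1]
    return {"unconverted": b, "gas": g, "oil": o, "char": c}

for temp in (450, 500, 550):   # the temperature window studied above
    print(temp, yields(temp, 2.0))
```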
Abstract:
Efficient numerical models facilitate the study and design of solid oxide fuel cells (SOFCs), stacks, and systems. Whilst the accuracy and reliability of the computed results are usually sought by researchers, the corresponding modelling complexities could result in practical difficulties regarding implementation flexibility and computational costs. The main objective of this article is to adapt a simple but viable numerical tool for evaluation of our experimental rig. Accordingly, a model for a multi-layer SOFC surrounded by a constant temperature furnace is presented, trained and validated against experimental data. The model consists of a four-layer structure including stand, two interconnects, and PEN (Positive electrode-Electrolyte-Negative electrode), each being approximated by a lumped parameter model. The heating process through the surrounding chamber is also considered. We used a set of V-I characteristic data for parameter adjustment, followed by model verification against two independent data sets. The model results show good agreement with practical data, offering a significant improvement compared to reduced models in which the impact of external heat loss is neglected. Furthermore, thermal analysis for adiabatic and non-adiabatic processes is carried out to capture the thermal behaviour of a single cell, followed by a polarisation loss assessment. Finally, model-based design of experiment is demonstrated for a case study.
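A lumped four-layer thermal model of this general shape can be sketched as a small ODE system; the capacitances, conductances and furnace coupling below are illustrative placeholders rather than the fitted parameters reported in the article.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Layer order: stand, interconnect, PEN, interconnect. All values assumed.
C = np.array([50.0, 30.0, 20.0, 30.0])   # thermal masses, J/K
K = np.array([2.0, 2.5, 2.5])            # conductances between adjacent layers, W/K
H = np.array([0.8, 0.5, 0.4, 0.5])       # layer-to-furnace couplings, W/K
T_FURNACE = 1073.0                       # constant furnace temperature, K

def rhs(t, T, q_pen=0.0):
    """dT/dt per layer; q_pen is heat released in the PEN (index 2)."""
    q = H * (T_FURNACE - T)          # exchange with the furnace chamber
    dT = np.diff(T)
    q[:-1] += K * dT                 # conduction between adjacent layers
    q[1:] -= K * dT
    q[2] += q_pen                    # ohmic/polarisation heat source
    return q / C

sol = solve_ivp(rhs, (0.0, 3600.0), np.full(4, 293.15), args=(5.0,))
print(sol.y[:, -1])   # layer temperatures after one hour of heating
```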
Abstract:
This Letter addresses image segmentation via a generative model approach. A Bayesian network (BN) in the space of dyadic wavelet transform coefficients is introduced to model texture images. The model is similar to a hidden Markov model (HMM), but with non-stationary transition conditional probability distributions. It is composed of discrete hidden variables and observable Gaussian outputs for wavelet coefficients. In particular, the Gabor wavelet transform is considered. The introduced model is compared with the simplest joint Gaussian probabilistic model for Gabor wavelet coefficients for several textures from the Brodatz album [1]. The comparison is based on cross-validation and includes probabilistic model ensembles instead of single models. In addition, the robustness of the models in coping with additive Gaussian noise is investigated. We further study the feasibility of the introduced generative model for image segmentation in the novelty detection framework [2]. Two examples are considered: (i) sea surface pollution detection from intensity images and (ii) segmentation of still images with varying illumination across the scene.
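As an illustration of the baseline the Letter compares against, the "simplest joint Gaussian" model for Gabor coefficients can be sketched as follows, here assuming scikit-image's gabor filter and arbitrary filter settings; novelty detection then thresholds the per-pixel log-likelihood.

```python
import numpy as np
from skimage.filters import gabor
from scipy.stats import multivariate_normal

FREQS = (0.1, 0.2, 0.4)                       # assumed filter settings
THETAS = (0.0, np.pi/4, np.pi/2, 3*np.pi/4)

def gabor_features(img):
    """Stack per-pixel Gabor magnitude responses into feature vectors."""
    chans = [np.hypot(*gabor(img, frequency=f, theta=t))
             for f in FREQS for t in THETAS]
    return np.stack(chans, axis=-1).reshape(-1, len(chans))

def fit_joint_gaussian(texture_img):
    """One multivariate Gaussian over the coefficients of a training texture."""
    X = gabor_features(texture_img)
    cov = np.cov(X.T) + 1e-6 * np.eye(X.shape[1])   # regularise
    return multivariate_normal(X.mean(axis=0), cov)

def novelty_map(model, img, threshold):
    """True where the texture deviates from the learned model."""
    ll = model.logpdf(gabor_features(img)).reshape(img.shape)
    return ll < threshold
```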
Abstract:
Although techniques such as biopanning rely heavily upon the screening of randomized gene libraries, there is surprisingly little information available on the construction of those libraries. In general, it is based on the cloning of 'randomized' synthetic oligonucleotides, in which given position(s) contain an equal mixture of all four bases. Yet, many supposedly 'randomized' libraries contain significant elements of bias and/or omission. Here, we report the development and validation of a new, PCR-based assay that enables rapid examination of library composition both before and after cloning. By using our assay to analyse model libraries, we demonstrate that the cloning of a given distribution of sequences does not necessarily result in a similarly composed library of clones. Thus, while bias in randomized synthetic oligonucleotide mixtures can be virtually eliminated by using unequal ratios of the four phosphoramidites, the use of such mixtures does not ensure retrieval of a truly randomized library. We propose that, in the absence of a technique to control cloning frequencies, the ability to analyse the composition of libraries after cloning will enhance significantly the quality of information derived from those libraries.
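A brief illustrative simulation (not the paper's assay) makes the sampling argument concrete: even a perfectly balanced oligonucleotide pool yields skewed base frequencies once only a finite number of clones is drawn.

```python
import numpy as np

rng = np.random.default_rng(0)

def cloned_base_fractions(n_clones, pool=(0.25, 0.25, 0.25, 0.25)):
    """Base fractions (A, C, G, T) at one randomized position after
    drawing n_clones at random from a pool with the given composition."""
    return rng.multinomial(n_clones, pool) / n_clones

for n in (20, 100, 10_000):
    print(n, np.round(cloned_base_fractions(n), 3))  # small libraries drift most
```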
Abstract:
The investigation of insulation debris transport, sedimentation, penetration into the reactor core and head loss build-up has become important to reactor safety research for PWRs and BWRs when considering the long-term behaviour of emergency core cooling systems during loss of coolant accidents. Research projects are being performed in cooperation between the University of Applied Sciences Zittau/Görlitz and the Helmholtz-Zentrum Dresden-Rossendorf. The projects include experimental investigations of different processes and phenomena of insulation debris in coolant flow and the development of CFD models. Generic complex experiments serve to build up a database for the validation of models for single effects and their coupling in CFD codes. This paper includes a description of the experimental facility for complex generic experiments (ZSW), an overview of experimental boundary conditions and results for upstream and downstream phenomena, as well as the long-term behaviour due to corrosive processes.
Abstract:
When constructing and using environmental models, it is typical that many of the inputs to the models will not be known perfectly. In some cases, it will be possible to make observations, or occasionally to apply physics-based uncertainty propagation, to ascertain the uncertainty on these inputs. However, such observations are often either unavailable or impossible, and another approach to characterising the uncertainty on the inputs must be sought. Even when observations are available, if the analysis is being carried out within a Bayesian framework then prior distributions will have to be specified. One option for gathering, or at least estimating, this information is to employ expert elicitation. Expert elicitation is well studied within statistics and psychology and involves the assessment of the beliefs of a group of experts about an uncertain quantity (for example, an input or parameter within a model), typically in terms of obtaining a probability distribution. One of the challenges in expert elicitation is to minimise the biases that might enter into the judgements made by the individual experts, and then to come to a consensus decision within the group of experts. Effort is made in the elicitation exercise to prevent biases clouding the judgements through well-devised questioning schemes. It is also important that, when reaching a consensus, the experts are exposed to the knowledge of the others in the group. Within the FP7 UncertWeb project (http://www.uncertweb.org/), there is a requirement to build a Web-based tool for expert elicitation. In this paper, we discuss some of the issues of building a Web-based elicitation system, both the technological aspects and the statistical and scientific issues. In particular, we demonstrate two tools: a Web-based system for the elicitation of continuous random variables and a system designed to elicit uncertainty about categorical random variables in the setting of landcover classification uncertainty. The first of these examples is a generic tool developed to elicit uncertainty about univariate continuous random variables. It is designed to be used within an application context and extends the existing SHELF method, adding a web interface and access to metadata. The tool is developed so that it can be readily integrated with environmental models exposed as web services. The second example was developed for the TREES-3 initiative, which monitors tropical landcover change through ground-truthing at confluence points. It allows experts to validate the accuracy of automated landcover classifications using site-specific imagery and local knowledge. Experts may provide uncertainty information at various levels: from a general rating of their confidence in a site validation to a numerical ranking of the possible landcover types within a segment. A key challenge in the web-based setting is the design of the user interface and the method of interaction between the problem owner and the problem experts. We show the workflow of the elicitation tool, and show how we can represent the final elicited distributions and confusion matrices using UncertML, ready for integration into uncertainty-enabled workflows. We also show how the metadata associated with the elicitation exercise is captured and can be referenced from the elicited result, providing crucial lineage information and thus traceability in the decision making process.
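For the continuous case, the core statistical step behind a SHELF-style tool can be sketched as fitting a parametric distribution to an expert's quantile judgements; the normal family and the example quantiles below are assumptions for illustration, not the UncertWeb implementation.

```python
import numpy as np
from scipy import optimize, stats

def fit_normal(probs, values):
    """Find the normal distribution whose quantiles best match the
    expert's judgements (probs -> values) in a least-squares sense."""
    def loss(params):
        mu, sigma = params
        return np.sum((stats.norm.ppf(probs, mu, sigma) - np.asarray(values)) ** 2)
    res = optimize.minimize(loss,
                            x0=(np.median(values), np.std(values) + 1e-6),
                            bounds=[(None, None), (1e-9, None)])
    return stats.norm(*res.x)

# Expert judgements: 25th, 50th, 75th percentiles of an uncertain model input
dist = fit_normal([0.25, 0.5, 0.75], [12.0, 15.0, 19.0])
print(dist.mean(), dist.std())  # elicited distribution, ready for propagation
```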
Abstract:
Soil erosion is one of the most pressing issues facing developing countries. The need for soil erosion assessment is paramount, as a successful and productive agricultural base is necessary for economic growth and stability. In Ghana, a country with an expanding population and high potential for economic growth, agriculture is an important resource; however, most of the crop production is restricted to low-technology shifting cultivation agriculture. The high-intensity seasonal rainfall coincides with the early growing period of many of the crops, meaning that plots are very susceptible to erosion, especially on steep-sided valleys in the region south of Lake Volta. This research investigated the processes of soil erosion by rainfall with the aim of producing a sediment yield model for a small semi-agricultural catchment in rural Ghana. Various types of modelling techniques were considered to discover those most applicable to the sub-tropical environment of Southern Ghana. Once an appropriate model had been developed and calibrated, the aim was to examine how the model could be scaled up, using sub-catchments, to calculate sedimentation rates for Lake Volta. An experimental catchment was located in Ghana, south west of Lake Volta, where data on rainstorms and the associated streamflow and sediment loads, together with soil data (moisture content, classification and particle size distribution), were collected to calibrate the model. Additional data were obtained from the Soil Research Institute in Ghana to explore calibration of the Universal Soil Loss Equation (USLE, Wischmeier and Smith, 1978) for Ghanaian soils and environment. It was shown that the USLE could be successfully converted to provide meaningful soil loss estimates in the Ghanaian environment. However, due to experimental difficulties, the proposed theory and methodology of the sediment yield model could only be tested in principle. Future work may include validation of the model and subsequent scaling up to estimate sedimentation rates in Lake Volta.
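For reference, the USLE estimates mean annual soil loss as a product of six factors; a minimal worked example follows, with illustrative factor values rather than the calibrated Ghanaian ones.

```python
# Universal Soil Loss Equation (Wischmeier and Smith, 1978):
#   A = R * K * L * S * C * P
# A: soil loss (t/ha/yr), R: rainfall erosivity, K: soil erodibility,
# L, S: slope length and steepness factors, C: cover management, P: support practice.
def usle(R, K, L, S, C, P):
    return R * K * L * S * C * P

# Illustrative values only, not the thesis's calibrated factors.
print(usle(R=6000, K=0.15, L=1.2, S=1.8, C=0.25, P=1.0), "t/ha/yr")
```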
Abstract:
This thesis, set within an Action Research framework, details the development and validation of a writer-centred model of the writing process. The model was synthesised within the boundaries of a writers’ group for MA students. The initial data collected, and analysed using the principles of grounded theory, were retrospective descriptions of group members’ writing processes. After initial analysis, additional data, from group members’ writing, and from audio recordings, were used for further analysis, and to form a model of the writing process. To ascertain whether the model had value outside the specific context in which it was made, it was validated from three different perspectives. Firstly, the retrospective descriptions of other writers were collected and analysed, using the model as a framework. Secondly, the model was presented at academic conferences; comments about the model, made by members of the audience, were collected and analysed. Finally, the model was used in writing courses for PhD students. Comments from these students, along with questionnaire responses, were collected and the content analysed. Upon examination of all data sources, the model was updated to reflect additional insights arising from the analysis. Analysis of the data also indicated that the model is useable outside its original context. Potential uses for the model are 1) raising awareness of the process of writing, 2) putting writers at ease, 3) serving as a starting point for individuals or groups to design their own models of the writing process, and 4) as a tool to help writers take control of their writing processes.
Abstract:
The starting point of this research was the belief that manufacturing and similar industries need help with the concept of e-business, especially in assessing the relevance of possible e-business initiatives. The research hypothesis was that it should be possible to produce a systematic model that defines, at a useful level of detail, the probable e-business requirements of an organisation based on objective criteria with an accuracy of 85%-90%. This thesis describes the development and validation of such a model. A preliminary model was developed from a variety of sources, including a survey of current and planned e-business activity and representative examples of e-business material produced by e-business solution providers. The model was subject to a process of testing and refinement based on recursive case studies, with controls over the improving accuracy and stability of the model. Useful conclusions were also possible as to the relevance of e-business functions to the case study participants themselves. Techniques were evolved to synthesise the e-business requirements of an organisation and present them at a management summary level of detail. The results of applying these techniques to all the case studies used in this research are discussed. The conclusion of the research was that the case study methodology employed was successful. A model was achieved suitable for practical application in a manufacturing organisation requiring help with a requirements definition process.
Abstract:
This paper describes the development of a tree-based decision model to predict the severity of pediatric asthma exacerbations in the emergency department (ED) at 2 h following triage. The model was constructed from retrospective patient data abstracted from ED charts. The original data were preprocessed to eliminate questionable patient records and to normalize values of age-dependent clinical attributes. The model uses attributes routinely collected in the ED and provides predictions even for incomplete observations. Its performance was verified on independent validating data (split-sample validation), where it demonstrated an AUC (area under the ROC curve) of 0.83, sensitivity of 84%, specificity of 71% and a Brier score of 0.18. The model is intended to supplement an asthma clinical practice guideline; however, it can also be used as a stand-alone decision tool.
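The quoted validation metrics are standard and straightforward to reproduce; a short sketch using scikit-learn on placeholder predictions shows how each figure is computed from a held-out sample.

```python
import numpy as np
from sklearn.metrics import roc_auc_score, brier_score_loss, confusion_matrix

# y_true/y_prob are synthetic placeholders standing in for the held-out set.
rng = np.random.default_rng(1)
y_true = rng.integers(0, 2, 200)                              # 1 = severe exacerbation
y_prob = np.clip(0.7 * y_true + 0.3 * rng.random(200), 0, 1)  # model's predicted risk
y_pred = (y_prob >= 0.5).astype(int)                          # assumed decision threshold

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
print("AUC        ", roc_auc_score(y_true, y_prob))
print("sensitivity", tp / (tp + fn))
print("specificity", tn / (tn + fp))
print("Brier score", brier_score_loss(y_true, y_prob))
```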
Abstract:
Developmental neurotoxicity is a major issue in human health and may have lasting neurological implications. In this preliminary study we exposed differentiating Ntera2/clone D1 (NT2/D1) cell neurospheres to known human teratogens classed as non-embryotoxic (acrylamide), weakly embryotoxic (lithium, valproic acid) and strongly embryotoxic (hydroxyurea), as listed by the European Centre for the Validation of Alternative Methods (ECVAM), and examined endpoints of cell viability and expression of neuronal protein markers specific to the central nervous system in order to identify developmental neurotoxins. Following induction of neuronal differentiation, valproic acid had the most significant effect on neurogenesis, in terms of reduced viability and decreased neuronal markers. Lithium had the least effect on viability and did not significantly alter the expression of neuronal markers. Hydroxyurea significantly reduced cell viability but did not affect neuronal protein marker expression. Acrylamide reduced neurosphere viability but did not affect neuronal protein marker expression. Overall, this NT2/D1-based neurosphere model of neurogenesis may provide the basis for a model of developmental neurotoxicity in vitro.
Abstract:
This thesis begins with a review of the literature on team-based working in organisations, highlighting the variations in research findings, and the need for greater precision in our measurement of teams. It continues with an illustration of the nature and prevalence of real and pseudo team-based working, by presenting results from a large sample of secondary data from the UK National Health Service. Results demonstrate that ‘real teams’ have an important and significant impact on the reduction of many work-related safety outcomes. Based on both theoretical and methodological limitations of existing approaches, the thesis moves on to provide a clarification and extension of the ‘real team’ construct, demarcating this from other (pseudo-like) team typologies on a sliding scale, rather than a simple dichotomy. A conceptual model for defining real teams is presented, providing a theoretical basis for the development of a scale on which teams can be measured for varying extents of ‘realness’. A new twelve-item scale is developed and tested with three samples of data comprising 53 undergraduate teams, 52 postgraduate teams, and 63 public sector teams from a large UK organisation. Evidence for the content, construct and criterion-related validity of the real team scale is examined over seven separate validation studies. Theoretical, methodological and practical implications of the real team scale are then discussed.
Abstract:
A Micro-Cap SPICE circuit-level model of a 12-pulse autotransformer-based rectifier for an aircraft fuel-pump motor drive is described. The importance of including the nonlinear magnetising inductance of the interphase transformers is illustrated. Small supply voltage distortions are seen to result in current imbalance in the interphase transformers, degrading the rectifier input current and potentially leading to infringement of the power quality specification. The model has been validated against measurements from a 3.75 kW unit for various operating supply voltages, frequencies and output powers.
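The netlist itself is not given in the abstract; to illustrate why a nonlinear (saturating) magnetising inductance matters, a minimal sketch integrates the flux of a saturating inductor driven from a sinusoidal supply. The saturation curve and all component values are assumptions, not the paper's model.

```python
import numpy as np
from scipy.integrate import solve_ivp

L0, PHI_SAT = 0.5, 0.8      # unsaturated inductance (H), saturation flux (Wb-turns)
V_PK, FREQ = 10.0, 400.0    # supply amplitude (V) and frequency (Hz, aircraft bus)
R_W = 0.2                   # winding resistance (ohm)

def L_of_phi(phi):
    """Soft saturation: inductance collapses as flux approaches PHI_SAT."""
    return L0 / (1.0 + (phi / PHI_SAT) ** 8)

def rhs(t, y):
    phi = y[0]
    i_mag = phi / L_of_phi(phi)               # magnetising current
    v = V_PK * np.sin(2 * np.pi * FREQ * t)
    return [v - R_W * i_mag]                  # dphi/dt across the R-L branch

sol = solve_ivp(rhs, (0.0, 0.05), [0.0], max_step=1e-5)
i = sol.y[0] / L_of_phi(sol.y[0])
print("peak magnetising current:", np.abs(i).max(), "A")
```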
Abstract:
A combination of the two-fluid and drift flux models has been used to model the transport of fibrous debris. This debris is generated during loss of coolant accidents in the primary circuit of pressurized or boiling water nuclear reactors, as high pressure steam or water jets can damage adjacent insulation materials, including mineral wool blankets. Fibre agglomerates released from the mineral wools may reach the containment sump strainers, where they can accumulate and compromise the long-term operation of the emergency core cooling system. Single-effect experiments of sedimentation in a quiescent rectangular column and in a horizontal flow are used to verify and validate this particular application of the multiphase numerical models. The utilization of both modeling approaches allows a number of pseudocontinuous dispersed phases of spherical wetted agglomerates to be modeled simultaneously. Key effects on the transport of the fibre agglomerates are particle size, density and turbulent dispersion, as well as the relative viscosity of the fluid-fibre mixture.
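A single-particle sketch of the dominant transport effect may help: treating a wetted agglomerate as an equivalent sphere, its terminal settling velocity follows Stokes drag with a Richardson-Zaki hindered-settling correction. The property values are illustrative, not those used in the validation cases.

```python
# Terminal settling velocity of a wetted fibre agglomerate, modelled as an
# equivalent sphere in the Stokes regime with hindered settling.
G = 9.81                     # gravity, m/s^2
RHO_F, MU = 998.0, 1.0e-3    # water density (kg/m^3) and viscosity (Pa s)

def settling_velocity(d, rho_p, solids_fraction=0.0, n=4.65):
    """Terminal velocity (m/s) of a sphere of diameter d and density rho_p;
    n = 4.65 is the Richardson-Zaki exponent for the Stokes regime."""
    u_stokes = (rho_p - RHO_F) * G * d**2 / (18.0 * MU)
    return u_stokes * (1.0 - solids_fraction) ** n

# e.g. a 1 mm agglomerate barely denser than water settles very slowly
print(settling_velocity(d=1e-3, rho_p=1050.0, solids_fraction=0.05), "m/s")
```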