972 results for computational study
Abstract:
In this paper we report the results of ab initio calculations on the energetics and kinetics of oxygen-driven carbon gasification reactions using a small model cluster, with full characterisation of the stationary points on the reaction paths. We show that previously unconsidered pathways present significantly reduced barriers to reaction and must be considered as alternative viable paths. At least two electronic spin states of the model cluster must be considered for a complete description. (C) 2004 Elsevier Ltd. All rights reserved.
Abstract:
In large epidemiological studies missing data can be a problem, especially if information is sought on a sensitive topic or when a composite measure is calculated from several variables each affected by missing values. Multiple imputation is the method of choice for 'filling in' missing data based on associations among variables. Using an example about body mass index from the Australian Longitudinal Study on Women's Health, we identify a subset of variables that are particularly useful for imputing values for the target variables. Then we illustrate two uses of multiple imputation. The first is to examine and correct for bias when data are not missing completely at random. The second is to impute missing values for an important covariate; in this case omission from the imputation process of variables to be used in the analysis may introduce bias. We conclude with several recommendations for handling issues of missing data. Copyright (C) 2004 John Wiley & Sons, Ltd.
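The imputation-then-pooling workflow the abstract describes can be sketched in a few lines. This is a minimal illustration on synthetic data: the variable names, the missingness mechanism, and the regression-based imputer are all invented for the example and are not taken from the ALSWH analysis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: a BMI-like target correlated with two auxiliary variables
# (hypothetical stand-ins for the "particularly useful" imputation variables).
n = 500
aux1 = rng.normal(0.0, 1.0, n)
aux2 = 0.5 * aux1 + rng.normal(0.0, 1.0, n)
bmi = 25.0 + 2.0 * aux1 - 1.0 * aux2 + rng.normal(0.0, 1.0, n)

# Values are more likely to be missing when aux1 is low (MAR, not MCAR),
# so a complete-case mean would be biased.
miss = rng.random(n) < 1.0 / (1.0 + np.exp(2.0 + aux1))
bmi_obs = bmi.copy()
bmi_obs[miss] = np.nan

def impute_once(y, X, rng):
    """One stochastic regression imputation: fit on observed rows, then
    draw imputed values from the estimated predictive distribution."""
    obs = ~np.isnan(y)
    Xd = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(Xd[obs], y[obs], rcond=None)
    resid = y[obs] - Xd[obs] @ beta
    sigma = resid.std(ddof=Xd.shape[1])
    y_imp = y.copy()
    y_imp[~obs] = Xd[~obs] @ beta + rng.normal(0.0, sigma, (~obs).sum())
    return y_imp

# Multiple imputation: m completed datasets, then average the point
# estimates across imputations (the point-estimate part of Rubin's rules).
X = np.column_stack([aux1, aux2])
m = 20
means = [impute_once(bmi_obs, X, rng).mean() for _ in range(m)]
pooled_mean = float(np.mean(means))
```

Because the imputation model conditions on the auxiliary variables that drive the missingness, the pooled estimate recovers the full-sample mean more faithfully than simply dropping incomplete records would.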
Abstract:
Numerical simulations are conducted to investigate how a droplet of Newtonian liquid, entrained in a higher viscosity Newtonian liquid, behaves when passing through an axisymmetric microfluidic contraction. Simulations are performed using a transient Volume of Fluid finite volume algorithm, and cover ranges of Reynolds and Weber numbers relevant to microfluidic flows. Results are presented for a droplet to surrounding fluid viscosity ratio of 0.001. In contrast to behaviour at higher viscosity ratios obtained previously by the authors, shear and interfacial tension driven instabilities often develop along the droplet surface, leading to complex shape development, and in some instances, droplet breakup. (c) 2006 Elsevier Ltd. All rights reserved.
Abstract:
Ab initio density functional theory (DFT) calculations are performed to explore possible catalytic effects on the dissociative chemisorption of hydrogen on a Mg(0001) surface when carbon is incorporated into Mg materials. The computational results imply that a C atom located initially on a Mg(0001) surface can migrate into the subsurface and occupy an fcc interstitial site, with charge transfer to the C atom from neighboring Mg atoms. The effect of subsurface C on the dissociation of H2 on the Mg(0001) surface is found to be relatively marginal: a perfect sublayer of interstitial C is calculated to lower the barrier by 0.16 eV compared with that on a pure Mg(0001) surface. Further calculations reveal, however, that sublayer C may have a significant effect in enhancing the diffusion of atomic hydrogen into the sublayers through fcc channels. This contributes new physical understanding toward rationalizing the experimentally observed improvement in absorption kinetics of H2 when graphite or single walled carbon nanotubes (SWCNT) are introduced into the Mg powder during ball milling.
Abstract:
Computational fluid dynamics was used to search for the links between the observed pattern of attack seen in a bauxite refinery's heat exchanger headers and the hydrodynamics inside the header. Validation of the computational fluid dynamics results was done by comparing them with flow parameters measured in a 1:5 scale model of the first pass header in the laboratory. Computational fluid dynamics simulations were used to establish hydrodynamic similarity between the 1:5 scale and full scale models of the first pass header. It was found that the erosion-corrosion damage seen at the tubesheet of the first pass header was a consequence of increased levels of turbulence at the tubesheet caused by a rapidly turning flow. A prismatic flow correction device introduced in the past helped in rectifying the problem at the tubesheet but exacerbated the erosion-corrosion problem at the first pass header shell. A number of alternative flow correction devices were tested using computational fluid dynamics. Axial ribbing in the first pass header and an inlet flow diffuser showed the best performance and were recommended for implementation. Computational fluid dynamics simulations revealed a smooth, orderly, low turbulence flow pattern in the second, third and fourth pass as well as the exit headers, where no erosion-corrosion was seen in practice. This study has confirmed that near-wall turbulence intensity, which can be successfully predicted by using computational fluid dynamics, is a good hydrodynamic predictor of erosion-corrosion damage in complex geometries. (c) 2006 Published by Elsevier Ltd.
Abstract:
Racing algorithms have recently been proposed as a general-purpose method for performing model selection in machine learning algorithms. In this paper, we present an empirical study of the Hoeffding racing algorithm for selecting the k parameter in a simple k-nearest neighbor classifier. Fifteen widely-used classification datasets from UCI are used and experiments conducted across different confidence levels for racing. The results reveal a significant amount of sensitivity of the k-nn classifier to its model parameter value. The Hoeffding racing algorithm also varies widely in its performance, in terms of the computational savings gained over an exhaustive evaluation. While in some cases the savings gained are quite small, the racing algorithm proved to be highly robust to the possibility of erroneously eliminating the optimal models. All results were strongly dependent on the datasets used.
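The racing idea can be illustrated with a small sketch. Here synthetic Bernoulli error streams stand in for the per-example leave-one-out errors of each candidate k; the error rates and the confidence level are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical true error rates for candidate k values of a k-NN classifier.
true_err = {1: 0.30, 3: 0.22, 5: 0.20, 7: 0.24, 9: 0.35}
candidates = list(true_err)

def hoeffding_eps(n, delta=0.05, value_range=1.0):
    # Hoeffding bound: with probability 1 - delta, the true mean of n
    # bounded observations lies within eps of the empirical mean.
    return value_range * np.sqrt(np.log(2.0 / delta) / (2.0 * n))

alive = set(candidates)
sums = {k: 0.0 for k in candidates}
n = 0
while len(alive) > 1 and n < 5000:
    n += 1
    for k in alive:
        sums[k] += rng.random() < true_err[k]  # one more evaluated example
    eps = hoeffding_eps(n)
    best_upper = min(sums[k] / n + eps for k in alive)
    # Eliminate any model whose optimistic (lower) bound is already worse
    # than the pessimistic (upper) bound of the current best model.
    alive = {k for k in alive if sums[k] / n - eps <= best_upper}

best_k = min(alive, key=lambda k: sums[k] / n)
```

Clearly bad values of k are discarded after relatively few evaluations, while closely matched candidates survive much longer, which mirrors the wide variation in computational savings the abstract reports.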
Abstract:
The thesis presents an experimentally validated modelling study of the flow of combustion air in an industrial radiant tube burner (RTB). The RTB is used typically in industrial heat treating furnaces. The work was initiated by the need for improvements in burner lifetime and performance, which are related to the fluid mechanics of the combusting flow; a fundamental understanding of this is therefore necessary. To achieve this, a detailed three-dimensional Computational Fluid Dynamics (CFD) model has been used, validated with experimental air flow, temperature and flue gas measurements. Initially, the work programme is presented, together with the theory behind RTB design and operation and the theory behind swirling flows and methane combustion. NOx reduction techniques are discussed and numerical modelling of combusting flows is detailed in this section. The importance of turbulence, radiation and combustion modelling is highlighted, as well as the numerical schemes that incorporate discretization, finite volume theory and convergence. The study first focuses on the combustion air flow and its delivery to the combustion zone. An isothermal computational model was developed to allow examination of the flow characteristics as the air enters the burner and progresses through the various sections prior to the discharge face in the combustion area. Important features identified include the air recuperator swirler coil, the step ring, the primary/secondary air splitting flame tube and the fuel nozzle. It was revealed that the effectiveness of the air recuperator swirler is significantly compromised by the need for a generous assembly tolerance. There is also a substantial circumferential flow maldistribution introduced by the swirler, but this is effectively removed by the positioning of a ring constriction in the downstream passage.
Computations using the k-ε turbulence model show good agreement with experimentally measured velocity profiles in the combustion zone and validated the modelling strategy prior to the combustion study. Reasonable mesh independence was obtained with 200,000 nodes. Agreement was poorer with the RNG k-ε and Reynolds Stress models. The study continues to address the combustion process itself and the heat transfer process internal to the RTB. A series of combustion and radiation model configurations were developed, and the optimum combination of the Eddy Dissipation (ED) combustion model and the Discrete Transfer (DT) radiation model was used successfully to validate a burner experimental test. The previously cold-flow-validated k-ε turbulence model was used and reasonable mesh independence was obtained with 300,000 nodes. The combination showed good agreement with temperature measurements in the inner and outer walls of the burner, as well as with flue gas composition measured at the exhaust. The inner tube wall temperature predictions matched the experimental measurements at the majority of the thermocouple locations, highlighting a small flame bias to one side, although the model slightly over-predicts the temperatures towards the downstream end of the inner tube. NOx emissions were initially over-predicted; however, the use of a combustion flame temperature limiting subroutine allowed convergence to the experimental value of 451 ppmv. With the validated model, the effectiveness of certain RTB features identified previously is analysed, and an analysis of the energy transfers throughout the burner is presented, to identify the dominant mechanisms in each region. The optimum turbulence-combustion-radiation model selection was then the baseline for further model development. One of these models, an eccentrically positioned flame tube model, highlights the failure mode of the RTB during long-term operation.
Other models were developed to address NOx reduction and improvement of the flame profile in the burner combustion zone. These included a modified fuel nozzle design, with 12 circular-section fuel ports, which demonstrates a longer and more symmetric flame, although with limited success in NOx reduction. In addition, a zero-bypass swirler coil model was developed that highlights the effect of the stronger swirling combustion flow. A reduced-diameter flame tube model and a 20 mm forward-displaced flame tube model show limited success in NOx reduction, although the latter demonstrated improvements in the discharge face heat distribution and in the flame symmetry. Finally, Flue Gas Recirculation (FGR) modelling attempts indicate the difficulty of applying this NOx reduction technique in the Wellman RTB. Recommendations for further work include design mitigations for the fuel nozzle, and further burner modelling is suggested to improve computational validation. The introduction of fuel staging is proposed, as well as a modification of the inner tube to enhance the effect of FGR.
Abstract:
Finite element analysis is a useful tool in understanding how the accommodation system of the eye works. Further to simpler FEA models that have been used hitherto, this paper describes a sensitivity study which aims to understand which parameters of the crystalline lens are key to developing an accurate model of the accommodation system. A number of lens models were created, allowing the mechanical properties, internal structure and outer geometry to be varied. These models were then spun about their axes, and the deformations determined. The results showed the mechanical properties are the critical parameters, with the internal structure secondary. Further research is needed to fully understand how the internal structure and properties interact to affect lens deformation.
Abstract:
With the advent of distributed computer systems with a largely transparent user interface, new questions have arisen regarding the management of such an environment by an operating system. One fertile area of research is that of load balancing, which attempts to improve system performance by redistributing the workload submitted to the system by the users. Early work in this field concentrated on static placement of computational objects to improve performance, given prior knowledge of process behaviour. More recently this has evolved into studying dynamic load balancing with process migration, thus allowing the system to adapt to varying loads. In this thesis, we describe a simulated system which facilitates experimentation with various load balancing algorithms. The system runs under UNIX and provides functions for user processes to communicate through software ports; processes reside on simulated homogeneous processors, connected by a user-specified topology, and a mechanism is included to allow migration of a process from one processor to another. We present the results of a study of adaptive load balancing algorithms, conducted using the aforementioned simulated system, under varying conditions; these results show the relative merits of different approaches to the load balancing problem, and we analyse the trade-offs between them. Following from this study, we present further novel modifications to suggested algorithms, and show their effects on system performance.
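A toy version of such an experiment shows the effect of a simple adaptive migration policy. All parameters here are invented for the illustration; the thesis's simulator is a far richer UNIX-based system with software ports and user-specified topologies.

```python
import random

NUM_PROCS, TICKS, ARRIVALS_PER_TICK = 8, 2000, 6

def run(migrate):
    random.seed(2)                 # same arrival stream for both policies
    queues = [0] * NUM_PROCS
    total_imbalance = 0
    for _ in range(TICKS):
        for _ in range(ARRIVALS_PER_TICK):        # random initial placement
            queues[random.randrange(NUM_PROCS)] += 1
        queues = [max(0, q - 1) for q in queues]  # each processor serves one task
        if migrate:
            # Adaptive policy: the most-loaded processor migrates one task
            # to the least-loaded one whenever the gap exceeds one task.
            hi = max(range(NUM_PROCS), key=queues.__getitem__)
            lo = min(range(NUM_PROCS), key=queues.__getitem__)
            if queues[hi] - queues[lo] > 1:
                queues[hi] -= 1
                queues[lo] += 1
        total_imbalance += max(queues) - min(queues)
    return total_imbalance / TICKS

avg_imbalance_static = run(migrate=False)
avg_imbalance_adaptive = run(migrate=True)
```

Even this single-migration-per-tick policy noticeably reduces the average queue-length imbalance relative to purely static random placement, which is the kind of trade-off such a simulation study quantifies.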
Abstract:
This work examines prosody modelling for the Standard Yorùbá (SY) language in the context of computer text-to-speech synthesis applications. The thesis of this research is that it is possible to develop a practical prosody model by using appropriate computational tools and techniques that combine acoustic data with an encoding of the phonological and phonetic knowledge provided by experts. Our prosody model is conceptualised around a modular holistic framework. The framework is implemented using the Relational Tree (R-Tree) techniques (Ehrich and Foith, 1976). R-Tree is a sophisticated data structure that provides a multi-dimensional description of a waveform. A Skeletal Tree (S-Tree) is first generated using algorithms based on the tone phonological rules of SY. Subsequent steps update the S-Tree by computing the numerical values of the prosody dimensions. To implement the intonation dimension, fuzzy control rules were developed based on data from native speakers of Yorùbá. The Classification And Regression Tree (CART) and the Fuzzy Decision Tree (FDT) techniques were tested in modelling the duration dimension. The FDT was selected based on its better performance. An important feature of our R-Tree framework is its flexibility, in that it facilitates the independent implementation of the different dimensions of prosody, i.e. duration and intonation, using different techniques and their subsequent integration. Our approach provides us with a flexible and extendible model that can also be used to implement, study and explain the theory behind aspects of the phenomena observed in speech prosody.
Abstract:
This thesis presents a study of how edges are detected and encoded by the human visual system. The study begins with theoretical work on the development of a model of edge processing, and includes psychophysical experiments on humans, and computer simulations of these experiments, using the model. The first chapter reviews the literature on edge processing in biological and machine vision, and introduces the mathematical foundations of this area of research. The second chapter gives a formal presentation of a model of edge perception that detects edges and characterizes their blur, contrast and orientation, using Gaussian derivative templates. This model has previously been shown to accurately predict human performance in blur matching tasks with several different types of edge profile. The model provides veridical estimates of the blur and contrast of edges that have a Gaussian integral profile. Since blur and contrast are independent parameters of Gaussian edges, the model predicts that varying one parameter should not affect perception of the other. Psychophysical experiments showed that this prediction is incorrect: reducing the contrast makes an edge look sharper; increasing the blur reduces the perceived contrast. Both of these effects can be explained by introducing a smoothed threshold to one of the processing stages of the model. It is shown that, with this modification, the model can predict the perceived contrast and blur of a number of edge profiles that differ markedly from the ideal Gaussian edge profiles on which the templates are based. With only a few exceptions, the results from all the experiments on blur and contrast perception can be explained reasonably well using one set of parameters for each subject. In the few cases where the model fails, possible extensions to the model are discussed.
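The template principle can be illustrated numerically. For an edge with a Gaussian integral profile of contrast c and blur b, the peak response of a first-derivative-of-Gaussian template of scale s is c/√(2π(s² + b²)), so responses at two template scales suffice to recover both parameters. This is a sketch of that principle only, not the published model's full machinery; the scales and edge parameters are invented for the demonstration.

```python
import numpy as np
from math import erf

x = np.arange(-200, 201, dtype=float)
true_blur, true_contrast = 6.0, 0.8

# Edge with a Gaussian integral (erf-shaped) luminance profile.
edge = true_contrast * np.array(
    [(1.0 + erf(v / (true_blur * np.sqrt(2.0)))) / 2.0 for v in x])

def gauss_deriv(u, s):
    # First derivative of a unit-area Gaussian of scale s.
    g = np.exp(-u**2 / (2.0 * s**2)) / (s * np.sqrt(2.0 * np.pi))
    return -u / s**2 * g

def peak_response(signal, s):
    # Convolving with g' differentiates the Gaussian-smoothed signal.
    return np.convolve(signal, gauss_deriv(x, s), mode="same").max()

# Peak response at scale s is c / sqrt(2*pi*(s^2 + b^2)); two scales give
# two equations in the two unknowns, blur b and contrast c.
s1, s2 = 4.0, 12.0
r1, r2 = peak_response(edge, s1), peak_response(edge, s2)
blur_est = float(np.sqrt((r2**2 * s2**2 - r1**2 * s1**2) / (r1**2 - r2**2)))
contrast_est = float(r1 * np.sqrt(2.0 * np.pi * (s1**2 + blur_est**2)))
```

The recovered blur and contrast match the parameters of the generated edge closely, which is the "veridical estimates for Gaussian integral profiles" property the abstract describes.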
Abstract:
This paper presents a comparative study of three closely related Bayesian models for unsupervised document-level sentiment classification, namely, the latent sentiment model (LSM), the joint sentiment-topic (JST) model, and the Reverse-JST model. Extensive experiments have been conducted on two corpora, the movie review dataset and the multi-domain sentiment dataset. It has been found that while all three models achieve either better or comparable performance on these two corpora when compared to the existing unsupervised sentiment classification approaches, both JST and Reverse-JST are able to extract sentiment-oriented topics. In addition, Reverse-JST always performs worse than JST, suggesting that the JST model is more appropriate for joint sentiment topic detection.
Abstract:
We investigate an application of the method of fundamental solutions (MFS) to the backward heat conduction problem (BHCP). We extend the MFS proposed in Johansson and Lesnic (2008) [5] and Johansson et al. (in press) [6] for one- and two-dimensional direct heat conduction problems, respectively, with the sources placed outside the space domain of interest. Theoretical properties of the method, as well as numerical investigations, are included, showing that accurate and stable results can be obtained efficiently with small computational cost.
Abstract:
This paper describes work conducted as a joint collaboration between the Virtual Design Team (VDT) research group at Stanford University (USA), the Systems Engineering Group (SEG) at De Montfort University (UK) and Elipsis Ltd. We describe a new docking methodology in which we combine the use of two radically different types of organizational simulation tool. The VDT simulation tool operates on a standalone computer, and employs computational agents during simulated execution of a pre-defined process model (Kunz, 1998). The other software tool, DREAMS, operates over a standard TCP/IP network, and employs human agents (real people) during a simulated execution of a pre-defined process model (Clegg, 2000).
Abstract:
Background: The Aston Medication Adherence Study was designed to examine non-adherence to prescribed medicines within an inner-city population using general practice (GP) prescribing data. Objective: To examine non-adherence patterns to prescribed oral medications within three chronic disease states and to compare differences in adherence levels between various patient groups to assist the routine identification of low adherence amongst patients within the Heart of Birmingham teaching Primary Care Trust (HoBtPCT). Setting: Patients within the area covered by HoBtPCT (England) prescribed medication for dyslipidaemia, type-2 diabetes and hypothyroidism, between 2000 and 2010 inclusively. HoBtPCT's population was disproportionately young, with seventy per cent of residents from Black and Minority Ethnic groups. Method: Systematic computational analysis of all medication issue data from 76 GP surgeries dichotomised patients into two groups (adherent and non-adherent) for each pharmacotherapeutic agent within the treatment groups. Dichotomised groupings were further analysed by recorded patient demographics to identify predictors of lower adherence levels. Results were compared to an analysis of a self-report measure of adherence [using the Modified Morisky Scale© (MMAS-8)] and clinical value data (cholesterol values) from GP surgery records. Main outcome: Adherence levels for different patient demographics, for patients within specific long-term treatment groups. Results: Analysis within all three groups showed that for patients with the following characteristics, adherence levels were statistically lower than for others; patients: younger than 60 years of age; whose religion is coded as "Islam"; whose ethnicity is coded as one of the Asian groupings or as "Caribbean", "Other Black" and "African"; whose primary language is coded as "Urdu" or "Bengali"; and whose postcodes indicate that they live within the most socioeconomically deprived areas of HoBtPCT.
Statistically significant correlations between adherence status and results from the self-report measure of adherence and from the clinical value data analysis were found. Conclusion: Using data from GP prescribing systems, a computerised tool to calculate individual adherence levels for oral pharmacotherapy for the treatment of diabetes, dyslipidaemia and hypothyroidism has been developed. The tool has been used to establish non-adherence levels within the three treatment groups and the demographic characteristics indicative of lower adherence levels, which in turn will enable the targeting of interventional support within HoBtPCT. © Koninklijke Nederlandse Maatschappij ter bevordering der Pharmacie 2013.
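The abstract does not specify the dichotomisation rule the computerised tool applies. A common choice for prescribing-data adherence, shown here purely as an illustrative sketch, is the Medication Possession Ratio (MPR) with the conventional 80% cut-off; all dates, supplies, and the threshold below are illustrative assumptions.

```python
from datetime import date

def mpr(issues, period_start, period_end):
    """Medication Possession Ratio: days of medication supplied within the
    period divided by the length of the period, capped at 1.0.
    issues: list of (issue_date, days_supplied) from GP prescribing data."""
    days_covered = sum(days for d, days in issues
                       if period_start <= d <= period_end)
    period_days = (period_end - period_start).days + 1
    return min(1.0, days_covered / period_days)

def adherent(issues, start, end, threshold=0.8):
    # Dichotomise: adherent if possession covers >= 80% of the period.
    return mpr(issues, start, end) >= threshold

# Hypothetical patient: four 28-day issues over a six-month window.
issues = [(date(2009, 1, 1), 28), (date(2009, 2, 5), 28),
          (date(2009, 3, 20), 28), (date(2009, 5, 10), 28)]
ratio = mpr(issues, date(2009, 1, 1), date(2009, 6, 30))
flag = adherent(issues, date(2009, 1, 1), date(2009, 6, 30))
```

Computed per pharmacotherapeutic agent, a flag like this can then be cross-tabulated against recorded demographics, which is the analysis pattern the Method section describes.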