27 results for Computational analysis

in Aston University Research Archive


Relevance:

80.00%

Publisher:

Abstract:

A series of N1-benzylidene pyridine-2-carboxamidrazone anti-tuberculosis compounds has been evaluated for cytotoxicity using human mononuclear leucocytes (MNL) as target cells. All eight compounds were significantly more toxic than the dimethyl sulphoxide control and isoniazid (INH), with the exception of a 4-methoxy-3-(2-phenylethyloxy) derivative, which did not differ significantly in toxicity from INH. The most toxic agent was an ethoxy derivative, followed in rank order by 3-nitro, 4-methoxy, dimethylpropyl, 4-methylbenzyloxy, 3-methoxy-4-(2-phenylethyloxy) and 4-benzyloxy. In comparison with the effect of selected carboxamidrazone agents on cells alone, the presence of either N-acetyl cysteine (NAC) or glutathione (GSH) caused a significant reduction in the toxicity of INH and of the 4-benzyloxy derivative, although both thiols increased the toxicity of a 4-N,N-dimethylamino-1-naphthylidene and a 2-t-butylthio derivative. The derivatives from this and three previous studies were subjected to computational analysis in order to derive equations establishing quantitative structure-activity relationships for these agents. Twenty-five compounds were thus resolved into two groups (1 and 2), which on analysis yielded equations with r² values in the range 0.65-0.92. Group 1 shares a common mode of toxicity related to hydrophobicity, with cytotoxicity peaking at a logP of 3.2, while Group 2 toxicity was strongly related to ionisation potential. The presence of thiols such as NAC and GSH both promoted and attenuated toxicity in selected compounds from Group 1, suggesting that secondary mechanisms of toxicity were operating. These studies will facilitate the design of future low-toxicity, high-activity anti-tubercular carboxamidrazone agents. © 2003 Elsevier Science B.V. All rights reserved.
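
To make the QSAR step concrete, here is a minimal sketch of fitting the kind of quadratic toxicity-logP equation the abstract describes for Group 1, where cytotoxicity peaks at an intermediate logP. The descriptor and toxicity values below are invented placeholders, not data from the study.

```python
# Hedged sketch: quadratic QSAR regression of cytotoxicity on logP.
# The arrays are illustrative placeholders, not values from the paper.
import numpy as np

logP = np.array([1.0, 1.8, 2.5, 3.0, 3.2, 3.6, 4.1, 4.8])  # hypothetical descriptors
tox  = np.array([0.9, 1.6, 2.3, 2.7, 2.8, 2.6, 2.1, 1.3])  # hypothetical toxicity index

# Fit tox = a*logP^2 + b*logP + c; a < 0 gives a parabolic peak.
a, b, c = np.polyfit(logP, tox, deg=2)
peak_logP = -b / (2 * a)                 # vertex of the parabola (~3.2 in Group 1)

pred = np.polyval([a, b, c], logP)
ss_res = np.sum((tox - pred) ** 2)
ss_tot = np.sum((tox - tox.mean()) ** 2)
r2 = 1 - ss_res / ss_tot                 # comparable to the reported 0.65-0.92 range

print(f"peak logP = {peak_logP:.2f}, r^2 = {r2:.2f}")
```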

Relevance:

60.00%

Publisher:

Abstract:

Classical studies of area summation measure contrast detection thresholds as a function of grating diameter. Unfortunately, (i) this approach is compromised by retinal inhomogeneity and (ii) it potentially confounds summation of signal with summation of internal noise. The Swiss cheese stimulus of T. S. Meese and R. J. Summers (2007) and the closely related Battenberg stimulus of T. S. Meese (2010) were designed to avoid these problems by keeping target diameter constant and modulating interdigitated checks of first-order carrier contrast within the stimulus region. This approach has revealed a contrast integration process with greater potency than the classical model of spatial probability summation. Here, we used Swiss cheese stimuli to investigate the spatial limits of contrast integration over a range of carrier frequencies (1–16 c/deg) and raised plaid modulator frequencies (0.25–32 cycles/check). Subthreshold summation for interdigitated carrier pairs remained strong (~4 to 6 dB) up to 4 to 8 cycles/check. Our computational analysis of these results implied linear signal combination (following square-law transduction) over either (i) 12 carrier cycles or more or (ii) 1.27 deg or more. Our model has three stages of summation: short-range summation within linear receptive fields, medium-range integration to compute contrast energy for multiple patches of the image, and long-range pooling of the contrast integrators by probability summation. Our analysis legitimizes the inclusion of widespread integration of signal (and noise) within hierarchical image processing models. It also confirms the individual differences in the spatial extent of integration that emerge from our approach.
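
As a rough illustration of why ~4 to 6 dB of summation is diagnostic, the sketch below compares the threshold improvement predicted for a doubling of stimulus area under three standard pooling rules: linear amplitude summation, energy (square-law) summation, and probability summation approximated by a Minkowski norm with exponent m = 4. The power-law reduction and the exponent are textbook simplifications, not the paper's full three-stage model.

```python
# Hedged sketch: predicted subthreshold summation (in dB) for a doubling of
# stimulated area under three pooling rules, each reduced to c ∝ n^(-1/k).
import numpy as np

def summation_db(n: float, k: float) -> float:
    """Threshold improvement in dB for n stimulus regions, assuming c ∝ n^(-1/k)."""
    return 20 * np.log10(n) / k

for name, k in [("linear (amplitude) summation", 1.0),   # ~6 dB per doubling
                ("energy summation (square-law)", 2.0),  # ~3 dB per doubling
                ("probability summation (m = 4)", 4.0)]: # ~1.5 dB per doubling
    print(f"{name}: {summation_db(2, k):.1f} dB")
```

The observed ~4 to 6 dB for interdigitated carrier pairs sits well above the probability-summation prediction, which is what motivates the contrast-integration interpretation in the abstract.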

Relevance:

60.00%

Publisher:

Abstract:

Background: The Aston Medication Adherence Study was designed to examine non-adherence to prescribed medicines within an inner-city population using general practice (GP) prescribing data. Objective: To examine non-adherence patterns to prescribed oral medications within three chronic disease states and to compare differences in adherence levels between various patient groups, to assist the routine identification of low adherence amongst patients within the Heart of Birmingham teaching Primary Care Trust (HoBtPCT). Setting: Patients within the area covered by HoBtPCT (England) prescribed medication for dyslipidaemia, type-2 diabetes and hypothyroidism, between 2000 and 2010 inclusively. HoBtPCT's population was disproportionately young, with seventy per cent of residents from Black and Minority Ethnic groups. Method: Systematic computational analysis of all medication issue data from 76 GP surgeries dichotomised patients into two groups (adherent and non-adherent) for each pharmacotherapeutic agent within the treatment groups. Dichotomised groupings were further analysed by recorded patient demographics to identify predictors of lower adherence levels. Results were compared to an analysis of a self-report measure of adherence [using the Modified Morisky Scale© (MMAS-8)] and clinical value data (cholesterol values) from GP surgery records. Main outcome: Adherence levels for different patient demographics, for patients within specific long-term treatment groups. Results: Analysis within all three groups showed that adherence levels were statistically lower for patients: younger than 60 years of age; whose religion is coded as "Islam"; whose ethnicity is coded as one of the Asian groupings or as "Caribbean", "Other Black" and "African"; whose primary language is coded as "Urdu" or "Bengali"; and whose postcodes indicate that they live within the most socioeconomically deprived areas of HoBtPCT. Statistically significant correlations between adherence status and results from the self-report measure of adherence and from the clinical value data analysis were found. Conclusion: Using data from GP prescribing systems, a computerised tool to calculate individual adherence levels for oral pharmacotherapy for the treatment of diabetes, dyslipidaemia and hypothyroidism has been developed. The tool has been used to establish non-adherence levels within the three treatment groups and the demographic characteristics indicative of lower adherence levels, which in turn will enable the targeting of interventional support within HoBtPCT. © Koninklijke Nederlandse Maatschappij ter bevordering der Pharmacie 2013.
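
The abstract does not specify the dichotomisation algorithm, so the sketch below shows one common way to derive an adherent/non-adherent split from prescription-issue data: the medication possession ratio (MPR) with the conventional 0.8 cut-off. Treat the metric choice, the cut-off and the data as illustrative assumptions.

```python
# Hedged sketch: dichotomising adherence from GP issue data via the medication
# possession ratio (MPR). The study's actual algorithm may differ.
from datetime import date

def mpr(issues: list[tuple[date, int]], end: date) -> float:
    """issues: (issue_date, days_supplied) pairs for one patient and one drug."""
    issues = sorted(issues)
    start = issues[0][0]
    observed = (end - start).days            # days the patient should be covered
    supplied = sum(days for _, days in issues)
    return min(supplied / observed, 1.0) if observed > 0 else 0.0

# Hypothetical issue history: three 28-day prescriptions with a gap.
issues = [(date(2009, 1, 5), 28), (date(2009, 2, 9), 28), (date(2009, 4, 20), 28)]
ratio = mpr(issues, end=date(2009, 6, 1))
print(f"MPR = {ratio:.2f} -> {'adherent' if ratio >= 0.8 else 'non-adherent'}")
```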

Relevance:

60.00%

Publisher:

Abstract:

Accurate protein structure prediction remains an active objective of research in bioinformatics. Membrane proteins comprise approximately 20% of the proteins encoded in most genomes, yet they are poorly tractable targets for experimental structure determination. Their analysis using bioinformatics thus makes an important contribution to their on-going study. Using a method based on Bayesian networks, which provide a flexible and powerful framework for statistical inference, we have addressed the alignment-free discrimination of membrane from non-membrane proteins. The method successfully identifies prokaryotic and eukaryotic α-helical membrane proteins at 94.4% accuracy, β-barrel proteins at 72.4% accuracy, and distinguishes assorted non-membranous proteins with 85.9% accuracy. The method is a potentially important advance in the computational analysis of membrane protein structure and represents a useful tool for the characterisation of membrane proteins, with a wide variety of potential applications.
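
The abstract does not give the network structure or features, so the sketch below uses the simplest Bayesian network, a naive Bayes classifier over amino-acid composition, to illustrate alignment-free discrimination. The class profiles, training strings and query are all invented.

```python
# Hedged sketch: alignment-free membrane/non-membrane discrimination with
# naive Bayes over amino-acid composition. Illustrative only; the paper's
# Bayesian network is richer than this.
import numpy as np

AA = "ACDEFGHIKLMNPQRSTVWY"

def composition(seq: str) -> np.ndarray:
    """Laplace-smoothed residue frequencies, standing in for a trained profile."""
    counts = np.array([seq.count(a) for a in AA], dtype=float)
    return (counts + 1) / (counts.sum() + len(AA))

def log_likelihood(seq: str, class_freqs: np.ndarray) -> float:
    counts = np.array([seq.count(a) for a in AA], dtype=float)
    return float(counts @ np.log(class_freqs))

# Hypothetical class profiles; alpha-helical membrane proteins are enriched
# in hydrophobic residues, soluble proteins in charged/polar ones.
membrane = composition("LLVVAAIIFFLLGGLLVVAAII" * 5)
soluble  = composition("KEDRQNSTKEDGPHKEDRQNST" * 5)

query = "MLLFVAILGGVVLLAIFLVVGA"
score = log_likelihood(query, membrane) - log_likelihood(query, soluble)
print("membrane" if score > 0 else "non-membrane", f"(log-odds = {score:.1f})")
```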

Relevance:

40.00%

Publisher:

Abstract:

Due to copyright restrictions, this item is only available for consultation at Aston University Library and Information Services with prior arrangement.

Relevance:

40.00%

Publisher:

Abstract:

In this paper we present the design and analysis of an intonation model for text-to-speech (TTS) synthesis applications using a combination of Relational Tree (RT) and Fuzzy Logic (FL) technologies. The model is demonstrated using the Standard Yorùbá (SY) language. In the proposed intonation model, phonological information extracted from text is converted into an RT. An RT is a sophisticated data structure that symbolically represents the peaks and valleys, as well as the spatial structure, of a waveform in the form of a tree. An initial approximation to the RT, called the Skeletal Tree (ST), is first generated algorithmically. The exact numerical values of the peaks and valleys on the ST are then computed using FL. Quantitative analysis of the results gives RMSE values of 0.56 and 0.71 for peaks and valleys, respectively. Mean Opinion Scores (MOS) of 9.5 and 6.8, on a scale of 1-10, were obtained for intelligibility and naturalness, respectively.
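
To give a flavour of the FL step, here is a minimal sketch that maps a symbolic peak on the skeletal tree to a numeric F0 value via triangular membership functions and centroid defuzzification. The membership ranges and rule firing strengths are hypothetical, not the paper's actual SY rule base.

```python
# Hedged sketch: fuzzy estimation of one F0 peak value by centroid
# defuzzification. All memberships and firing strengths are invented.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet at a and c, apex at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

f0 = np.linspace(80, 200, 500)   # candidate F0 values (Hz)
low  = tri(f0, 80, 100, 130)
mid  = tri(f0, 110, 140, 170)
high = tri(f0, 150, 180, 200)

# Hypothetical rule firing strengths for one peak (e.g. from tone context):
fired = np.maximum.reduce([0.2 * low, 0.5 * mid, 0.9 * high])

peak_hz = (f0 * fired).sum() / fired.sum()   # centroid defuzzification
print(f"estimated peak: {peak_hz:.1f} Hz")
```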

Relevance:

40.00%

Publisher:

Abstract:

This study presents a computational parametric analysis of DME steam reforming in a large-scale Circulating Fluidized Bed (CFB) reactor. The Computational Fluid Dynamics (CFD) model used, which is based on an Eulerian-Eulerian dispersed flow formulation, was developed and validated in Part I of this study [1]. The effects of the reactor inlet configuration, gas residence time, inlet temperature and steam to DME ratio on the overall reactor performance and products have all been investigated. The results show that a double-sided solid feeding system gives a remarkable improvement in flow uniformity, but has a limited effect on the reactions and products. Temperature was found to play the dominant role in increasing DME conversion and hydrogen yield. According to the parametric analysis, it is recommended to run the CFB reactor at around 300 °C inlet temperature, a steam to DME molar ratio of 5.5, a gas residence time of 4 s and a space velocity of 37,104 ml g_cat^-1 h^-1. At these conditions, the DME conversion and the hydrogen molar concentration in the product gas were both found to be around 80%.
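
The two operating parameters quoted above are linked by simple definitions: gas residence time is reactor volume over volumetric flow, and space velocity is volumetric flow over catalyst mass. The sketch below just exercises those definitions; the riser volume is a hypothetical value chosen only to reproduce the quoted numbers, not a dimension from the paper.

```python
# Hedged sketch: relating residence time and gas hourly space velocity (GHSV).
# The riser volume is assumed; the 4 s and 37,104 ml/(g_cat h) targets are
# the operating conditions quoted in the abstract.
riser_volume_ml = 2.0e6                          # hypothetical riser volume (2 m^3)
flow_ml_per_s = riser_volume_ml / 4.0            # flow that fixes residence time at 4 s

residence_time_s = riser_volume_ml / flow_ml_per_s
catalyst_mass_g = flow_ml_per_s * 3600 / 37_104  # mass consistent with the quoted GHSV
ghsv = flow_ml_per_s * 3600 / catalyst_mass_g

print(f"residence time = {residence_time_s:.1f} s, GHSV = {ghsv:,.0f} ml/(g_cat h)")
```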

Relevance:

30.00%

Publisher:

Abstract:

Derivational morphology proposes meaningful connections between words and is largely unrepresented in lexical databases. This thesis presents a project to enrich a lexical database with morphological links and to evaluate their contribution to disambiguation. A lexical database with sense distinctions was required. WordNet was chosen because of its free availability and widespread use. Its suitability was assessed through critical evaluation with respect to specifications and criticisms, using a transparent, extensible model. The identification of serious shortcomings suggested a portable enrichment methodology, applicable to alternative resources. Although 40% of the most frequent words are prepositions, they have been largely ignored by computational linguists, so addition of prepositions was also required. The preferred approach to morphological enrichment was to infer relations from phenomena discovered algorithmically. Both existing databases and existing algorithms can capture regular morphological relations, but cannot capture exceptions correctly; neither provides any semantic information. Some morphological analysis algorithms are subject to the fallacy that morphological analysis can be performed simply by segmentation. Morphological rules, grounded in observation and etymology, govern associations between and attachment of suffixes and contribute to defining the meaning of morphological relationships. Specifying character substitutions circumvents the segmentation fallacy. Morphological rules are prone to undergeneration, minimised through a variable lexical validity requirement, and overgeneration, minimised by rule reformulation and by restricting monosyllabic output. Rules take into account the morphology of ancestor languages through co-occurrences of morphological patterns. Where multiple rules apply to an input suffix, their precedence must be established. The resistance of prefixations to segmentation has been addressed by identifying linking-vowel exceptions and irregular prefixes. The automatic affix discovery algorithm applies heuristics to identify meaningful affixes and is combined with morphological rules into a hybrid model, fed only with empirical data, collected without supervision. Further algorithms apply the rules optimally to automatically pre-identified suffixes and break words into their component morphemes. To handle exceptions, stoplists were created in response to initial errors and fed back into the model through iterative development, leading to 100% precision, contestable only on lexicographic criteria. Stoplist length is minimised by special treatment of monosyllables and reformulation of rules. 96% of words and phrases are analysed. 218,802 directed derivational links have been encoded in the lexicon rather than the wordnet component of the model because the lexicon provides the optimal clustering of word senses. Both links and analyser are portable to an alternative lexicon. The evaluation uses the extended gloss overlaps disambiguation algorithm. The enriched model outperformed WordNet in terms of recall without loss of precision. The failure of all experiments to outperform disambiguation by frequency reflects on WordNet's sense distinctions.
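
To make the "character substitution rather than bare segmentation" point concrete, here is a minimal sketch of a suffix rule with a lexical-validity check and longest-suffix-first precedence. The rules and the toy lexicon are invented for illustration; the thesis's rule set is far larger.

```python
# Hedged sketch: morphological rules as character substitutions, validated
# against a lexicon to limit over- and undergeneration. Illustrative only.
RULES = [
    # (suffix of derived word, replacement, relation label)
    ("ification", "ify", "nominalisation"),  # simplification -> simplify
    ("ation",     "ate", "nominalisation"),  # activation -> activate
    ("iness",     "y",   "nominalisation"),  # happiness -> happy
]

LEXICON = {"activate", "simplify", "happy"}  # toy validity filter

def derive_base(word: str):
    # Longest-suffix-first ordering settles precedence between competing
    # rules (-ification must beat -ation for "simplification").
    for suffix, repl, label in sorted(RULES, key=lambda r: -len(r[0])):
        if word.endswith(suffix):
            base = word[: -len(suffix)] + repl
            if base in LEXICON:              # reject overgenerated bases
                return base, label
    return None                              # bare segmentation would guess anyway

for w in ["activation", "simplification", "happiness", "station"]:
    print(w, "->", derive_base(w))           # "station" is correctly rejected
```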

Relevance:

30.00%

Publisher:

Abstract:

To make vision possible, the visual nervous system must represent the most informative features in the light pattern captured by the eye. Here we use Gaussian scale-space theory to derive a multiscale model for edge analysis and test it in perceptual experiments. At all scales there are two stages of spatial filtering. An odd-symmetric Gaussian first-derivative filter provides the input to a Gaussian second-derivative filter. Crucially, the output at each stage is half-wave rectified before feeding forward to the next. This creates nonlinear channels selectively responsive to one edge polarity while suppressing spurious or "phantom" edges. The two stages have properties analogous to simple and complex cells in the visual cortex. Edges are found as peaks in a scale-space response map that is the output of the second stage. The position and scale of the peak response identify the location and blur of the edge. The model predicts our results on human perception of edge location and blur remarkably accurately for a wide range of luminance profiles, including the surprising finding that blurred edges look sharper when their length is made shorter. The model enhances our understanding of early vision by integrating computational, physiological, and psychophysical approaches. © ARVO.
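
The two-stage scheme is easy to sketch in one dimension: a Gaussian first-derivative filter, half-wave rectification, then a Gaussian second-derivative filter and another rectification, with the edge read off as the peak of the final response. The filter scales and the sign conventions below are simplifying assumptions; the paper searches a full scale space rather than fixing sigmas.

```python
# Hedged sketch: the two-stage rectified filtering cascade on a blurred edge.
import numpy as np
from scipy.ndimage import gaussian_filter1d

x = np.linspace(-5, 5, 1001)
edge = 1.0 / (1.0 + np.exp(-x / 0.5))        # a blurred (sigmoidal) luminance edge

stage1 = gaussian_filter1d(edge, sigma=20, order=1)    # odd-symmetric 1st derivative
stage1 = np.maximum(stage1, 0.0)                       # half-wave rectify: one polarity
stage2 = gaussian_filter1d(stage1, sigma=40, order=2)  # 2nd derivative of stage-1 output
stage2 = np.maximum(-stage2, 0.0)                      # rectify; peak marks the edge

print(f"edge located at x = {x[np.argmax(stage2)]:.2f}")  # ~0 for this profile
```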

Relevance:

30.00%

Publisher:

Abstract:

Grafting of antioxidants and other modifiers onto polymers by reactive extrusion has been performed successfully by the Polymer Processing and Performance Group at Aston University. Traditionally, the optimum conditions for the grafting process have been established within a Brabender internal mixer. Transfer of this batch process to a continuous processor, such as an extruder, has typically been empirical. To have more confidence in the success of direct transfer of the process requires knowledge of, and comparison between, residence times, mixing intensities, shear rates and flow regimes in the internal mixer and in the continuous processor. The continuous processor chosen for the current work is the closely intermeshing, co-rotating twin-screw extruder (CICo-TSE). CICo-TSEs contain screw elements that convey material with a self-wiping action and are widely used for polymer compounding and blending. Of the different mixing modules contained within the CICo-TSE, the trilobal elements, which impose intensive mixing, and the mixing discs, which impose extensive mixing, are of importance when establishing the intensity of mixing. In this thesis, the flow patterns within the various regions of the single-flighted conveying screw elements and within both the trilobal element and mixing disc zones of a Betol BTS40 CICo-TSE have been modelled using the computational fluid dynamics package Polyflow. A major obstacle encountered when solving the flow problem within all of these sets of elements arises from both the complex geometry and the time-dependent flow boundaries as the elements rotate about their fixed axes. Simulation of the time-dependent boundaries was achieved by selecting a number of sequential 2D and 3D geometries, used to represent partial mixing cycles. The flow fields were simulated using the ideal rheological properties of polypropylene and characterised in terms of velocity vectors, shear stresses generated and a parameter known as the mixing efficiency. The majority of the large 3D simulations were performed on the Cray J90 supercomputer at the Rutherford Appleton Laboratory, with pre- and post-processing operations achieved via a Silicon Graphics Indy workstation. A mechanical model was constructed consisting of various CICo-TSE elements rotating within a transparent outer barrel, and a technique was developed using coloured viscous clays whereby the flow patterns and mixing characteristics within the CICo-TSE may be visualised. In order to test and verify the simulated predictions, the patterns observed within the mechanical model were compared with the flow patterns predicted by the computational model. The flow patterns within the single-flighted conveying screw elements in particular showed good agreement between the experimental and simulated results.
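
For readers unfamiliar with the "mixing efficiency" parameter mentioned above, one common definition in extrusion CFD is the ratio of the rate-of-strain magnitude to the combined rate-of-strain and vorticity magnitudes, computed pointwise from the velocity gradient tensor. Whether this matches the thesis's exact definition is an assumption; the sketch below shows the common form.

```python
# Hedged sketch: a common pointwise mixing-efficiency measure,
# lambda = |D| / (|D| + |W|), where D and W are the symmetric and
# antisymmetric parts of the velocity gradient. lambda = 1 for pure
# elongation, 0.5 for simple shear, 0 for rigid rotation.
import numpy as np

def mixing_efficiency(grad_u: np.ndarray) -> float:
    D = 0.5 * (grad_u + grad_u.T)            # rate-of-strain tensor
    W = 0.5 * (grad_u - grad_u.T)            # vorticity (spin) tensor
    nD, nW = np.linalg.norm(D), np.linalg.norm(W)
    return nD / (nD + nW)

simple_shear = np.array([[0.0, 1.0, 0.0], [0.0, 0.0, 0.0], [0.0, 0.0, 0.0]])
elongation   = np.diag([1.0, -0.5, -0.5])

print(f"simple shear: {mixing_efficiency(simple_shear):.2f}")   # 0.50
print(f"elongation:   {mixing_efficiency(elongation):.2f}")     # 1.00
```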

Relevance:

30.00%

Publisher:

Abstract:

Finite element analysis (FEA) is a useful tool for understanding how the accommodation system of the eye works. Going beyond the simpler FEA models used hitherto, this paper describes a sensitivity study which aims to identify which parameters of the crystalline lens are key to developing an accurate model of the accommodation system. A number of lens models were created, allowing the mechanical properties, internal structure and outer geometry to be varied. These models were then spun about their axes and the resulting deformations determined. The results showed that the mechanical properties are the critical parameters, with the internal structure secondary. Further research is needed to fully understand how the internal structure and properties interact to affect lens deformation.

Relevance:

30.00%

Publisher:

Abstract:

The retrieval of wind vectors from satellite scatterometer observations is a non-linear inverse problem. A common approach to solving inverse problems is to adopt a Bayesian framework and to infer the posterior distribution of the parameters of interest given the observations by using a likelihood model relating the observations to the parameters, and a prior distribution over the parameters. We show how Gaussian process priors can be used efficiently with a variety of likelihood models, using local forward (observation) models and direct inverse models for the scatterometer. We present an enhanced Markov chain Monte Carlo method to sample from the resulting multimodal posterior distribution. We go on to show how the computational complexity of the inference can be controlled by using a sparse, sequential Bayes algorithm for estimation with Gaussian processes. This helps to overcome the most serious barrier to the use of probabilistic, Gaussian process methods in remote sensing inverse problems, which is the prohibitively large size of the data sets. We contrast the sampling results with the approximations that are found by using the sparse, sequential Bayes algorithm.
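
The multimodality mentioned above arises because scatterometer likelihoods admit wind-direction ambiguities. The sketch below samples a toy bimodal posterior with plain random-walk Metropolis; the paper's enhanced MCMC scheme and Gaussian process machinery are substantially more sophisticated, so treat this purely as a picture of the sampling problem.

```python
# Hedged sketch: Metropolis sampling from a toy bimodal posterior standing in
# for a wind-direction posterior with a ~180-degree ambiguity.
import numpy as np

rng = np.random.default_rng(0)

def log_post(theta: float) -> float:
    # Two Gaussian modes roughly pi apart, mimicking direction ambiguity.
    return np.logaddexp(-0.5 * ((theta - 1.0) / 0.3) ** 2,
                        -0.5 * ((theta - 4.1) / 0.3) ** 2)

theta, samples = 1.0, []
for _ in range(20_000):
    prop = theta + rng.normal(scale=1.5)      # wide steps help mode-hopping
    if np.log(rng.random()) < log_post(prop) - log_post(theta):
        theta = prop                          # accept
    samples.append(theta)

samples = np.array(samples[2_000:])           # discard burn-in
print(f"fraction of samples in the mode near 4.1: {np.mean(samples > 2.5):.2f}")
```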

Relevance:

30.00%

Publisher:

Abstract:

The spreading time of a liquid binder droplet on the surface of a primary particle is analyzed for Fluidized Bed Melt Granulation (FBMG). As discussed in the first paper of this series (Chua et al., in press), the droplet spreading rate has been identified as one of the important parameters affecting the probability of particle aggregation in FBMG. In this paper, the binder droplet spreading time has been estimated using Computational Fluid Dynamics (CFD) modeling based on the Volume of Fluid (VOF) approach. A simplified analytical solution has been developed and tested to explore its validity for predicting the spreading time. For the purpose of model validation, the droplet spreading evolution was recorded using a high-speed video camera. Based on the validated model, a generalized correlative equation for binder spreading time is proposed. For the operating conditions considered here, the spreading time for Polyethylene Glycol (PEG 1500) binder was found to fall within the range of 10^-2 to 10^-5 s. The study also included a number of other common binders used in FBMG. The results obtained here will be used further in Paper III, where the binder solidification rate is discussed.
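
As an order-of-magnitude check on the quoted range, viscosity-dominated spreading is often scaled by the viscous-capillary time t ~ μR/σ. This is a generic scaling, not necessarily the simplified analytical solution the paper develops, and the PEG 1500 property values below are rough assumptions.

```python
# Hedged sketch: viscous-capillary time scale t ~ mu * R / sigma for droplets
# of different sizes. Property values are rough assumptions for PEG 1500 melt.
mu = 0.2        # assumed melt viscosity, Pa.s
sigma = 0.04    # assumed surface tension, N/m

for radius_um in (10, 100, 1000):
    R = radius_um * 1e-6                     # droplet radius in metres
    t = mu * R / sigma
    print(f"droplet R = {radius_um:4d} um -> t ~ {t:.1e} s")
# Spans roughly 1e-5 to 1e-2 s, consistent with the range quoted above.
```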

Relevance:

30.00%

Publisher:

Abstract:

Antigenic peptide is presented to a T-cell receptor (TCR) through the formation of a stable complex with a major histocompatibility complex (MHC) molecule. Various predictive algorithms have been developed to estimate a peptide's capacity to form a stable complex with a given MHC class II allele, a technique integral to the strategy of vaccine design. These have previously incorporated such computational techniques as quantitative matrices and neural networks. A novel predictive technique is described, which uses molecular modeling of predetermined crystal structures to estimate the stability of an MHC class II-peptide complex. The structures are remodeled, energy minimized, and annealed before the energetic interaction is calculated.
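
The scoring idea reduces to a simple thermodynamic cycle: after remodeling, minimisation and annealing, the interaction energy is the complex energy minus the energies of the separated partners. The sketch below shows that bookkeeping; the energy values are hypothetical placeholders, and a real implementation would obtain them from a molecular mechanics force field rather than hard-coded numbers.

```python
# Hedged sketch: ranking peptides by post-minimisation interaction energy,
# delta-E = E(complex) - [E(MHC) + E(peptide)]. Energies below are invented.
def interaction_energy(e_complex: float, e_mhc: float, e_peptide: float) -> float:
    """More negative delta-E means a more stable MHC-peptide complex."""
    return e_complex - (e_mhc + e_peptide)

def rank_peptides(energies: dict[str, tuple[float, float, float]]) -> list[str]:
    """Order peptide sequences by predicted stability (lowest delta-E first)."""
    return sorted(energies, key=lambda p: interaction_energy(*energies[p]))

# Hypothetical post-minimisation energies (kcal/mol): (complex, MHC, peptide)
energies = {
    "PKYVKQNTLKLAT": (-5210.0, -4800.0, -350.0),   # delta-E = -60
    "AAAAAAAAAAAAA": (-5165.0, -4800.0, -340.0),   # delta-E = -25
}
print(rank_peptides(energies))   # the tighter binder ranks first
```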

Relevance:

30.00%

Publisher:

Abstract:

This thesis presents an effective methodology for the generation of a simulation which can be used to increase understanding of viscous fluid processing equipment and aid in its development, design and optimisation. The Hampden RAPRA Torque Rheometer internal batch twin-rotor mixer has been simulated with a view to establishing model accuracies, limitations, practicalities and uses. As this research progressed, via several 'snap-shot' analyses of different rotor configurations using the commercial code Polyflow, it became evident that the model was of some worth and that its predictions are in good agreement with the validation experiments; however, several major restrictions were identified. These included poor element form, high man-hour requirements for the construction of each geometry and the absence of the transient term in these models. All, or at least some, of these limitations apply to the numerous attempts by other researchers to model internal mixers, and it was clear that there was no generally accepted methodology providing a practical three-dimensional model which has been adequately validated. This research, unlike others, presents a full, complex, three-dimensional, transient, non-isothermal, generalised non-Newtonian simulation with wall slip which overcomes these limitations, using unmatched gridding and sliding mesh technology adapted from CFX codes. This method yields good element form and, since only one geometry has to be constructed to represent the entire rotor cycle, is extremely beneficial for detailed flow field analysis when used in conjunction with user-defined programmes and automatic geometry parameterisation (AGP), and it improves accuracy for investigating equipment design and operating conditions. Model validation has been identified as an area neglected by other researchers in this field, especially for time-dependent geometries, and has been rigorously pursued here in terms of qualitative and quantitative velocity vector analysis of the isothermal, full-fill mixing of generalised non-Newtonian fluids, as well as torque comparison, with a relatively high degree of success. This indicates that CFD models of this type can be accurate, and perhaps have not been validated to this extent previously because of the inherent difficulties arising from most real processes.