33 results for Integration of Programming Techniques

in Aston University Research Archive


Relevance:

100.00%

Publisher:

Abstract:

The availability of ‘omics’ technologies is transforming scientific approaches to physiological problems from a reductionist viewpoint to a holistic one. This is of profound importance in nutrition, since multiple systems, from gene expression on the synthetic side through to metabolic enzyme activity on the degradative side, combine to govern nutrient availability to tissues. Protein activity is central to the process of nutrition, from the initial absorption of nutrients via uptake carriers in the gut, through to distribution and transport in the blood, metabolism by degradative enzymes in tissues and excretion through renal tubule exchange proteins. Therefore, the global profiling of the proteome, defined as the entire protein complement of the genome expressed in a particular cell or organ, or in plasma or serum at a particular time, offers the potential for identification of important biomarkers of nutritional state that respond to alterations in diet. The present review considers the published evidence of nutritional modulation of the proteome in vivo, which has expanded exponentially over the last 3 years. It highlights some of the challenges faced by researchers using proteomic approaches to understand the interactions of diet with genomic and metabolic–phenotypic variables in normal populations.

Relevance:

100.00%

Publisher:

Abstract:

Most existing work on information integration in the Semantic Web concentrates on resolving schema-level problems. Specific issues of data-level integration (instance coreferencing, conflict resolution, handling uncertainty) are usually tackled by applying the same techniques as for ontology schema matching or by reusing solutions produced in the database domain. However, data structured according to OWL ontologies has specific features: classes are organized into a hierarchy, properties are inherited, and data constraints differ from those defined by a database schema. This paper describes how these features are exploited in our architecture KnoFuss, designed to support data-level integration of semantic annotations.
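The abstract does not detail KnoFuss's algorithms, but the way an OWL class hierarchy can inform data-level integration can be sketched as follows. This is a hedged illustration, not the actual KnoFuss implementation: the toy hierarchy, the individuals and the label-matching rule are all assumptions introduced here, showing only the idea that class subsumption gates whether two instances are coreference candidates.

```python
# Sketch: use an OWL-style class hierarchy to gate instance coreferencing.
# Two individuals are merge candidates only when one class subsumes the other.

SUBCLASS_OF = {            # toy hierarchy: child -> parent
    "Professor": "Person",
    "Student": "Person",
    "Person": "Agent",
}

def ancestors(cls):
    """All superclasses of cls, including cls itself."""
    seen = {cls}
    while cls in SUBCLASS_OF:
        cls = SUBCLASS_OF[cls]
        seen.add(cls)
    return seen

def classes_compatible(c1, c2):
    """True if one class subsumes the other in the hierarchy."""
    return c1 in ancestors(c2) or c2 in ancestors(c1)

def merge_candidate(ind1, ind2):
    """Coreference gate: compatible classes and matching label."""
    return (classes_compatible(ind1["type"], ind2["type"])
            and ind1["label"].lower() == ind2["label"].lower())

a = {"type": "Professor", "label": "J. Smith"}
b = {"type": "Person", "label": "j. smith"}
c = {"type": "Student", "label": "J. Smith"}

print(merge_candidate(a, b))  # True: Person subsumes Professor
print(merge_candidate(a, c))  # False: Professor and Student are siblings
```

The point of the sketch is that a purely string-based matcher would merge all three individuals, whereas hierarchy awareness rules out the sibling-class pair.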

Relevance:

100.00%

Publisher:

Abstract:

Saturation mutagenesis is a powerful tool in modern protein engineering, allowing key residues within a protein to be targeted and randomised and potential new properties to be analysed. However, the creation of large libraries using conventional saturation mutagenesis with degenerate codons (NNN or NNK) suffers from inherent redundancy and disparities in residue representation. Here we describe the combination of ProxiMAX randomisation and CIS display for generating novel peptides. Unlike other methods, ProxiMAX randomisation does not require any intricate chemistry but simply utilises synthetic DNA and molecular biology techniques. Designed ‘MAX’ oligonucleotides were ligated, amplified and digested in an iterative cycle. Results show that randomised ‘MAX’ codons can be added sequentially to the base sequence, creating a series of randomised non-degenerate codons that can subsequently be inserted into a gene. CIS display (Isogenica, UK) is an in vitro DNA-based screening method that creates a genotype–phenotype link between a peptide and the nucleic acid that encodes it. The use of straightforward in vitro transcription/translation and other molecular biology techniques permits ease of use and flexibility, making it a potent screening technique. Using ProxiMAX randomisation in combination with CIS display, the aim is to produce randomised anti-nerve growth factor (NGF) and anti-calcitonin gene-related peptide (CGRP) peptides to demonstrate the high-throughput nature of this combination.

Relevance:

100.00%

Publisher:

Abstract:

In dimensional metrology, the largest source of measurement uncertainty is often thermal variation. Dimensional measurements are currently scaled linearly, using ambient temperature measurements and coefficients of thermal expansion, to the ideal metrology condition of 20 °C. This scaling is particularly difficult to implement with confidence in large volumes, as the temperature is unlikely to be uniform, resulting in thermal gradients. A number of well-established computational methods are used in the design phase of product development for the prediction of thermal and gravitational effects, and these could be used to a greater extent in metrology. This paper outlines the theory of how physical measurements of dimension and temperature can be combined more comprehensively throughout the product lifecycle, from design through to the manufacturing phase. The Hybrid Metrology concept is also introduced: an approach that promises to improve product and equipment integrity in future manufacturing environments. The Hybrid Metrology System combines various state-of-the-art physical dimensional and temperature measurement techniques with established computational methods to better predict thermal and gravitational effects.
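The linear scaling to 20 °C mentioned above can be written as L₂₀ = L_measured / (1 + α(T − 20)). A minimal sketch follows; the part length and CTE value are illustrative assumptions, not data from the paper, and the formula assumes a uniform part temperature, which is exactly the assumption the paper questions for large volumes.

```python
# Sketch of the standard linear rescaling of a dimensional measurement
# to the 20 degC reference temperature, assuming uniform temperature and
# linear expansion: L_measured = L_20 * (1 + alpha * (T - 20)).

REF_TEMP_C = 20.0

def length_at_20c(measured_length_mm, part_temp_c, cte_per_c):
    """Rescale a length measured at part_temp_c back to 20 degC."""
    return measured_length_mm / (1.0 + cte_per_c * (part_temp_c - REF_TEMP_C))

# Illustrative steel part (CTE ~ 11.5e-6 per degC) measured at 30 degC:
l20 = length_at_20c(1000.115, 30.0, 11.5e-6)
print(round(l20, 3))  # ~1000.0 mm
```

For a 1 m steel part, a 10 °C departure from reference already contributes over 0.1 mm, which is why unmodelled thermal gradients dominate the uncertainty budget in large-volume work.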

Relevance:

100.00%

Publisher:

Abstract:

This article analyses the complex process that deracialised and democratised South African football between the early 1970s and 1990s. Based mainly on archival documents, it argues that growing isolation from world sport, exemplified by South Africa's expulsion from the Olympic movement in 1970 and FIFA in 1976, and the reinvigoration of the liberation struggle with the Soweto youth uprising triggered a process of gradual desegregation in the South African professional game. While Pretoria viewed such changes as a potential bulwark against rising black militancy, white football and big business had their own reasons for eventually supporting racial integration, as seen in the founding of the National Soccer League. As negotiations for a new democratic South Africa began in earnest between the African National Congress (ANC) and the National Party (NP) in the latter half of the 1980s, transformations in football and politics paralleled and informed each other. Previously antagonistic football associations began a series of 'unity talks' between 1985 and 1986 that eventually culminated in the formation of a single, non-racial South African Football Association in December 1991, just a few days before the Convention for a Democratic South Africa (CODESA) opened the process of writing a new post-apartheid constitution. Finally, three decades of isolation came to an end as FIFA welcomed South Africa back into world football in 1992 - a powerful example of the seemingly boundless potential of a liberated and united South Africa ahead of the first democratic elections in 1994.

Relevance:

100.00%

Publisher:

Abstract:

This article considers the role of accounting in organisational decision making. It challenges the rational nature of decisions made in organisations through the use of accounting models and the problems of predicting the future through the use of such models. The use of accounting in this manner is evaluated from an epochal postmodern stance. Issues raised by chaos theory and the uncertainty principle are used to demonstrate problems with the predictive ability of accounting models. The authors argue that any consideration of the predictive value of accounting needs to change to incorporate a recognition of the turbulent external environment, if it is to be of use for organisational decision making. Thus it is argued that the role of accounting as a mechanism for knowledge creation regarding the future is fundamentally flawed. We take this as a starting-point to argue for the real purpose of the use of the predictive techniques of accounting, using its ritualistic role in the context of myth creation to argue for the cultural benefits of the use of such flawed techniques.

Relevance:

100.00%

Publisher:

Abstract:

A novel direct integration technique of the Manakov-PMD equation for the simulation of polarisation mode dispersion (PMD) in optical communication systems is demonstrated and shown to be numerically as efficient as the commonly used coarse-step method. The main advantage of using a direct integration of the Manakov-PMD equation over the coarse-step method is a higher accuracy of the PMD model. The new algorithm uses precomputed M(ω) matrices to increase the computational speed compared to a full integration without loss of accuracy. The simulation results for the probability distribution function (PDF) of the differential group delay (DGD) and the autocorrelation function (ACF) of the polarisation dispersion vector for varying numbers of precomputed M(ω) matrices are compared to analytical models and results from the coarse-step method. It is shown that the coarse-step method achieves a significantly inferior reproduction of the statistical properties of PMD in optical fibres compared to a direct integration of the Manakov-PMD equation.
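To make the statistical quantities named above concrete, here is a hedged sketch of the coarse-step picture that the paper uses as its baseline: the fibre is divided into sections whose birefringence axes are randomly scattered, so the total PMD vector is a three-dimensional random walk and the DGD (its magnitude) approaches a Maxwellian distribution. Section count, per-section DGD and the seed are arbitrary assumptions for illustration, not parameters from the paper.

```python
# Sketch of the coarse-step model: sum randomly oriented per-section PMD
# vectors; the magnitude of the sum (the DGD) becomes Maxwellian for many
# sections.  Parameters are illustrative only.
import math
import random

def random_unit_vector(rng):
    """Uniformly distributed direction on the unit sphere."""
    z = rng.uniform(-1.0, 1.0)
    phi = rng.uniform(0.0, 2.0 * math.pi)
    r = math.sqrt(1.0 - z * z)
    return (r * math.cos(phi), r * math.sin(phi), z)

def coarse_step_dgd(n_sections, dgd_per_section_ps, rng):
    """DGD of one fibre realisation: magnitude of the summed PMD vectors."""
    tx = ty = tz = 0.0
    for _ in range(n_sections):
        ux, uy, uz = random_unit_vector(rng)
        tx += dgd_per_section_ps * ux
        ty += dgd_per_section_ps * uy
        tz += dgd_per_section_ps * uz
    return math.sqrt(tx * tx + ty * ty + tz * tz)

rng = random.Random(42)
samples = [coarse_step_dgd(100, 0.1, rng) for _ in range(5000)]
mean_dgd = sum(samples) / len(samples)

# For a Maxwellian, <DGD> = sqrt(8/(3*pi)) * rms; here rms = 0.1*sqrt(100) = 1 ps,
# so the ensemble mean should land near 0.92 ps.
print(round(mean_dgd, 2))
```

The paper's point is that this section-wise approximation reproduces the DGD PDF and the polarisation dispersion vector ACF less faithfully than direct integration of the Manakov-PMD equation, at comparable numerical cost.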

Relevance:

100.00%

Publisher:

Abstract:

Purpose: The use of PHMB as a disinfectant in contact lens multipurpose solutions has been at the centre of much debate in recent times, particularly in relation to the issue of solution-induced corneal staining. Clinical studies have been carried out which suggest different effects with individual contact lens materials used in combination with specific PHMB-containing care regimes. There does not appear to be, however, a reliable analytical technique that would detect and quantify with any degree of accuracy the specific levels of PHMB that are taken up and released from individual solutions by the various contact lens materials.

Methods: PHMB is a mixture of positively charged polymer units of varying molecular weight with an absorbance maximum at 236 nm. On the basis of these properties, a range of assays including capillary electrophoresis, HPLC, a nickel–nioxime colorimetric technique, mass spectrometry, UV spectroscopy and ion chromatography were assessed, paying particular attention to their constraints and detection levels. Particular interest was focused on the relative advantage of contactless conductivity compared to UV and mass spectrometry detection in capillary electrophoresis (CE). This study provides an overview of the comparative performance of these techniques.

Results: The UV absorbance of PHMB solutions ranging from 0.0625 to 50 ppm was measured at 236 nm. Within this range the calibration curve appears to be linear; however, absorbance values below 1 ppm (0.0001%) were extremely difficult to reproduce. The concentration of PHMB in solutions is in the range of 0.0002–0.00005%, and our investigations suggest that levels of PHMB below 0.0001% (levels encountered in uptake and release studies) cannot be accurately estimated, in particular when analysing complex lens care solutions, which can contain competitively absorbing, and thus interfering, species. The use of separative methodologies, such as CE using UV detection alone, is similarly limited. Alternative techniques including contactless conductivity detection offer greater discrimination in complex solutions, together with the opportunity for dual-channel detection. Preliminary results achieved by TraceDec contactless conductivity detection (gain 150%, offset 150) in conjunction with the Agilent capillary electrophoresis system, using a bare fused-silica capillary (extended light path, 50 µm i.d., total length 64.5 cm, effective length 56 cm) and a cationic buffer at pH 3.2, exhibit great potential, with reproducible PHMB split peaks.

Conclusions: PHMB-based solutions are commonly associated with the potential to invoke corneal staining in combination with certain contact lens materials. However, this terminology ‘PHMB-based solution’ is used primarily because PHMB itself has yet to be adequately implicated as the causative agent of the staining and compromised corneal cell integrity. The lack of well-characterised, adequately sensitive assays, coupled with the range of additional components that characterise individual care solutions, poses a major barrier to the investigation of PHMB interactions in the lens-wearing eye.
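The calibration step described in the Results amounts to a linear Beer-Lambert fit with a lower reproducibility limit. The sketch below illustrates that workflow; the absorbance values are synthetic stand-ins, not measured data from the study, and the 1 ppm cut-off is taken from the abstract's statement that values below that level could not be reproduced.

```python
# Sketch of a Beer-Lambert calibration for PHMB at 236 nm: fit absorbance
# against concentration, then invert the fit but flag results below the
# ~1 ppm reproducibility limit reported above.  Data are synthetic.

def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

LIMIT_PPM = 1.0  # below this, replicate absorbances were not reproducible

conc_ppm = [1.0, 2.5, 5.0, 10.0, 25.0, 50.0]             # within the linear range
absorbance = [0.021, 0.052, 0.103, 0.205, 0.512, 1.021]  # synthetic A(236 nm)

slope, intercept = linear_fit(conc_ppm, absorbance)

def ppm_from_absorbance(a):
    """Invert the calibration; only trust concentrations above the limit."""
    c = (a - intercept) / slope
    return c, c >= LIMIT_PPM

print(ppm_from_absorbance(0.41))   # ~20 ppm, quantifiable
print(ppm_from_absorbance(0.005))  # well below 1 ppm, flagged unreliable
```

The flag mirrors the abstract's conclusion: uptake-and-release concentrations (below 0.0001%) sit under the usable limit of the UV method, which is why the separative and contactless-conductivity approaches are pursued.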

Relevance:

100.00%

Publisher:

Abstract:

DUE TO COPYRIGHT RESTRICTIONS ONLY AVAILABLE FOR CONSULTATION AT ASTON UNIVERSITY LIBRARY WITH PRIOR ARRANGEMENT

Relevance:

100.00%

Publisher:

Abstract:

In recent years, UK industry has seen an explosive growth in the number of 'Computer Aided Production Management' (CAPM) system installations. Of the many CAPM systems, materials requirements planning/manufacturing resource planning (MRP/MRPII) is the most widely implemented. Despite the huge investments in MRP systems, over 80 percent are said to have failed within 3 to 5 years of installation. Many people now assume that Just-In-Time (JIT) is the best manufacturing technique. However, those who have implemented JIT have found that it also has many problems. The author argues that the success of a manufacturing company will not be due to a system which complies with a single technique, but due to the integration of many techniques and the ability to make them complement each other in a specific manufacturing environment. This dissertation examines the potential for integrating MRP with JIT and Two-Bin systems to reduce the operational costs involved in managing bought-out inventory. Within this framework it shows that controlling MRP is essential to facilitate the integration process. The behaviour of MRP systems depends on the complex interactions between the numerous control parameters used. Methodologies/models are developed to set these parameters. The models are based on the Pareto principle: business targets are used to set a coherent set of parameters, which not only enables those business targets to be realised but also facilitates JIT implementation. This approach is illustrated in the context of an actual manufacturing plant, IBM Havant, a high-volume electronics assembly plant with the majority of its materials bought out. The parameter-setting models are applicable to controlling bought-out items in a wide range of industries and are not dependent on specific MRP software. The models have produced successful results in several companies and are now being developed as commercial products.
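The Pareto principle behind the parameter-setting models can be illustrated with a standard ABC classification of bought-out items. This is a hedged sketch only: the part numbers, usage values, cumulative cut-offs and the policy mapping are assumptions introduced here, not the dissertation's actual IBM Havant models.

```python
# Sketch of Pareto (ABC) analysis for bought-out inventory: rank items by
# annual usage value and map each band to a different control policy,
# e.g. tight JIT call-off for A items and a simple Two-Bin rule for C items.

def abc_classify(items, a_cut=0.8, b_cut=0.95):
    """items: {part: annual usage value}.  Returns {part: 'A'|'B'|'C'}."""
    ranked = sorted(items.items(), key=lambda kv: kv[1], reverse=True)
    total = sum(items.values())
    classes, running = {}, 0.0
    for part, value in ranked:
        running += value
        share = running / total
        classes[part] = "A" if share <= a_cut else ("B" if share <= b_cut else "C")
    return classes

# Illustrative policy mapping (an assumption, not from the dissertation):
POLICY = {"A": "daily JIT call-off", "B": "weekly MRP lot-for-lot", "C": "two-bin"}

usage = {"P1": 90000, "P2": 30000, "P3": 5000, "P4": 3000, "P5": 2000}
for part, cls in sorted(abc_classify(usage).items()):
    print(part, cls, POLICY[cls])
```

The design choice this illustrates is the one argued in the abstract: rather than forcing one technique on all items, the high-value few get JIT-style control while the low-value many stay on cheap Two-Bin replenishment, with MRP parameters set coherently from business targets.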

Relevance:

100.00%

Publisher:

Abstract:

The number of remote sensing platforms and sensors rises almost every year, yet much work on the interpretation of land cover is still carried out using either single images or images from the same source taken at different dates. Two questions could be asked of this proliferation of images: can the information contained in different scenes be used to improve the classification accuracy, and what is the best way to combine the different imagery? Two of these multiple image sources are MODIS on the Terra platform and ETM+ on board Landsat 7, which are suitably complementary. Daily MODIS images are available with 36 spectral bands at 250–1000 m spatial resolution, while ETM+ provides seven spectral bands at 30 m spatial resolution with a 16-day revisit period. In the UK, cloud cover may mean that only a few ETM+ scenes are available for any particular year, and these may not be at the time of year of most interest. The MODIS data may provide information on land cover over the growing season, such as harvest dates, that is not present in the ETM+ data. Therefore, the primary objective of this work is to develop a methodology for the integration of medium-spatial-resolution Landsat ETM+ images with multi-temporal, multi-spectral, low-resolution MODIS/Terra images, with the aim of improving the classification of agricultural land. Additionally, other data may also be incorporated, such as field boundaries from existing maps. When classifying agricultural land cover of the type seen in the UK, where crops are largely sown in homogeneous fields with clear and often mapped boundaries, the classification is greatly improved by using the mapped polygons and taking the classification of the polygon as a whole as an a priori probability in classifying each individual pixel using a Bayesian approach.
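The Bayesian step described above can be sketched in a few lines: the class assigned to the whole field polygon supplies the prior for each pixel, which is combined with the per-pixel spectral likelihood. The crop classes and probability values below are illustrative assumptions, not results from the study.

```python
# Sketch of per-pixel Bayesian classification with a polygon-level prior:
# P(class | pixel) is proportional to P(pixel | class) * P(class | polygon).

def posterior(likelihoods, priors):
    """Normalised posterior from per-class likelihoods and priors."""
    joint = {c: likelihoods[c] * priors.get(c, 0.0) for c in likelihoods}
    z = sum(joint.values())
    return {c: v / z for c, v in joint.items()}

# The polygon as a whole classifies as wheat, so wheat gets a strong prior:
polygon_prior = {"wheat": 0.7, "barley": 0.2, "grass": 0.1}

# An ambiguous pixel whose spectrum slightly favours barley:
pixel_likelihood = {"wheat": 0.30, "barley": 0.45, "grass": 0.25}

post = posterior(pixel_likelihood, polygon_prior)
best = max(post, key=post.get)
print(best)  # the polygon prior tips this ambiguous pixel to "wheat"
```

This is why homogeneous, well-mapped fields help so much: a confident polygon-level decision regularises the noisy pixel-level evidence, while pixels whose spectra strongly contradict the prior can still be assigned to another class.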
When dealing with multiple images from different platforms and dates it is highly unlikely that the pixels will be exactly co-registered, and these pixels will contain a mixture of different real-world land covers. Similarly, the different atmospheric conditions prevailing on different days will mean that the same emission from the ground gives rise to different sensor responses. Therefore, a method is presented, with a model of the instantaneous field of view and atmospheric effects, to enable different remotely sensed data sources to be integrated.