876 results for 2447: modelling and forecasting
Abstract:
Simulation modelling has been used for many years in the manufacturing sector but has now become a mainstream tool in business situations. This is partly because of the popularity of business process re-engineering (BPR) and other process-based improvement methods that use simulation to help analyse changes in process design. This textbook includes case studies in both manufacturing and service situations to demonstrate the usefulness of the approach. A further reason for the increasing popularity of the technique is the development of business-oriented and user-friendly Windows-based software. The text provides a guide to the use of the ARENA, SIMUL8 and WITNESS simulation software systems, which are widely used in industry and available to students. Overall, it offers a practical guide to building simulation models and implementing their results, covering all the steps in a typical simulation study, including data collection, input data modelling and experimentation.
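The event-scheduling logic that packages such as ARENA, SIMUL8 and WITNESS automate can be illustrated with a minimal single-server queue simulation. This is a generic sketch, not code for any of those tools; the rates, customer count and seed are arbitrary illustrative values.

```python
import random

def simulate_mm1(arrival_rate, service_rate, n_customers, seed=0):
    """Minimal discrete-event simulation of an M/M/1 queue.

    Returns the mean time a customer spends in the system (waiting plus
    service). Commercial simulation packages wrap this kind of event
    logic in a graphical, business-oriented interface."""
    rng = random.Random(seed)
    t = 0.0                  # current arrival time
    server_free_at = 0.0     # when the server next becomes idle
    total_sojourn = 0.0
    for _ in range(n_customers):
        t += rng.expovariate(arrival_rate)   # next customer arrives
        start = max(t, server_free_at)       # queue if server is busy
        server_free_at = start + rng.expovariate(service_rate)
        total_sojourn += server_free_at - t
    return total_sojourn / n_customers
```

For arrival rate 0.5 and service rate 1.0, queueing theory gives a mean sojourn time of 1/(1.0 − 0.5) = 2, which makes a convenient sanity check on the simulation output.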
Abstract:
Product design and sourcing decisions are among the most difficult and important of all decisions facing multinational manufacturing companies, yet the associated decision support and evaluation systems tend to be myopic in nature. Design for manufacture and assembly techniques, for example, generally focus on manufacturing capability and ignore capacity, although both should be considered. Similarly, most modelling and evaluation tools available to examine the performance of various solution and improvement techniques have a narrower scope than desired. A unique collaboration, funded by the US National Science Foundation, between researchers in the USA and the UK currently addresses these problems. This paper describes a technique known as Design For the Existing Environment (DFEE) and a holistic evaluation system based on enterprise simulation that was used to demonstrate the business benefits of DFEE in a simple product development and manufacturing case study. A project that will extend these techniques to evaluate global product sourcing strategies is described, along with the practical difficulties of building an enterprise simulation on the scale and detail required.
Abstract:
Background Adjuvants enhance or modify the immune response made to an antigen. An antagonist of the chemokine receptor CCR4 can display adjuvant-like properties by diminishing the ability of CD4+CD25+ regulatory T cells (Tregs) to down-regulate immune responses. Methodology Here, we have used protein modelling to create a plausible chemokine receptor model with the aim of using virtual screening to identify potential small molecule chemokine antagonists. A combination of homology modelling and molecular docking was used to create a model of the CCR4 receptor in order to investigate potential lead compounds that display antagonistic properties. Three-dimensional structure-based virtual screening of the CCR4 receptor identified 116 small molecules that were calculated to have a high affinity for the receptor; these were tested experimentally for CCR4 antagonism. Fifteen of these small molecules were shown to specifically inhibit CCR4-mediated cell migration, including that of CCR4+ Tregs. Significance Our CCR4 antagonists act as adjuvants, augmenting human T cell proliferation in an in vitro immune response model, and compound SP50 increases T cell and antibody responses in vivo when combined with vaccine antigens of Mycobacterium tuberculosis and Plasmodium yoelii in mice.
Abstract:
The topic of bioenergy, biofuels and bioproducts remains at the top of the current political and research agenda. Identification of the optimum processing routes for biomass, in terms of efficiency, cost, environment and socio-economics, is vital as concern grows over the remaining fossil fuel resources, climate change and energy security. It is known that the only renewable way of producing conventional hydrocarbon fuels and organic chemicals is from biomass, but the problem remains of identifying the best product mix and the most efficient way of processing biomass to products. The aim is to move Europe towards a biobased economy, and it is widely accepted that biorefineries are key to this development. A methodology was required for the generation and evaluation of biorefinery process chains for converting biomass into one or more valuable products that properly considers performance, cost, environment, socio-economics and other factors that influence the commercial viability of a process. In this thesis a methodology to achieve this objective is described. The completed methodology includes process chain generation, process modelling and subsequent analysis and comparison of results in order to evaluate alternative process routes. A modular structure was chosen to allow greater flexibility, allowing the user to generate a large number of different biorefinery configurations. The significance of the approach is that the methodology is defined and thus rigorous and consistent, and may be readily re-examined if circumstances change. There was a requirement for consistency in structure and use, particularly for multiple analyses. It was important that analyses could be carried out quickly and easily, to consider, for example, different scales, configurations and product portfolios, and so that previous outcomes could be readily reconsidered.
The result of the completed methodology is the identification of the most promising biorefinery chains from those considered as part of the European Biosynergy Project.
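The chain-generation and evaluation idea described above can be pictured as enumerating combinations of modular process steps and scoring each resulting chain. The sketch below is purely illustrative: the step names, mass yields, costs and product price are invented placeholders, not data from the Biosynergy Project.

```python
from itertools import product

# Hypothetical step libraries: (name, mass yield fraction, cost per tonne of input)
PRETREATMENT = [("steam explosion", 0.95, 12.0), ("acid hydrolysis", 0.90, 9.0)]
CONVERSION   = [("fermentation", 0.30, 40.0), ("fast pyrolysis", 0.65, 25.0)]

def evaluate_chain(steps, feed_tonnes=100.0, product_price=300.0):
    """Propagate mass and cost through a process chain; return
    (chain description, product tonnes, net value)."""
    mass, cost = feed_tonnes, 0.0
    for name, yield_frac, unit_cost in steps:
        cost += mass * unit_cost    # processing cost scales with input mass
        mass *= yield_frac          # mass loss through the step
    return " -> ".join(s[0] for s in steps), mass, mass * product_price - cost

# Enumerate every pretreatment/conversion combination and rank by net value.
chains = [evaluate_chain(c) for c in product(PRETREATMENT, CONVERSION)]
best = max(chains, key=lambda c: c[2])
```

Because each module is a plain data record, adding a new scale, configuration or product route only means extending a list and re-running the enumeration, which mirrors the re-examination requirement stated in the abstract.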
Abstract:
Tissue transglutaminase (TG2) is a multifunctional Ca2+-activated protein crosslinking enzyme secreted into the extracellular matrix (ECM), where it is involved in wound healing and scarring, tissue fibrosis, celiac disease and metastatic cancer. Extracellular TG2 can also facilitate cell adhesion, important in wound healing, through a non-transamidating mechanism via its association with fibronectin (FN), heparan sulphates (HS) and integrins. Regulating the mechanism by which TG2 is translocated into the ECM therefore provides a strategy for modulating these physiological and pathological functions of the enzyme. Here, through molecular modelling and mutagenesis, we have identified the HS binding site of TG2, 202KFLKNAGRDCSRRSSPVYVGR222. We demonstrate the requirement of this binding site for translocation of TG2 into the ECM through a mechanism involving cell surface shedding of HS. By synthesizing a peptide NPKFLKNAGRDCSRRSS corresponding to the HS binding site within TG2, we also demonstrate how this mimicking peptide can, in isolation, compensate for the RGD-induced loss of cell adhesion on FN via binding to syndecan-4, leading to activation of PKCα, pFAK-397 and ERK1/2 and the subsequent formation of focal adhesions and organization of the actin cytoskeleton. A novel regulatory mechanism for TG2 translocation into the extracellular compartment that depends upon TG2 conformation and the binding of HS is proposed.
Abstract:
A new surface analysis technique has been developed which has a number of benefits compared to conventional Low Energy Ion Scattering Spectrometry (LEISS). A major potential advantage, arising from the absence of charge exchange complications, is the possibility of quantification. The instrumentation that has been developed also offers the possibility of unique studies concerning the interaction between low energy ions and atoms and solid surfaces. From these studies it may also be possible, in principle, to generate sensitivity factors to quantify LEISS data. The instrumentation, which is referred to as a Time-of-Flight Fast Atom Scattering Spectrometer (ToFFASS), was developed to investigate these conjectures in practice. The development involved a number of modifications to an existing instrument, allowed samples to be bombarded with a monoenergetic pulsed beam of either atoms or ions, and provided the capability to analyse the spectra of scattered atoms and ions separately. Further to this, a system was designed and constructed to allow the incident, exit and azimuthal angles of the particle beam to be varied independently. The key development was that of a pulsed, mass-filtered atom source, which was developed by a cyclic process of design, modelling and experimentation. Although it was possible to demonstrate the unique capabilities of the instrument, problems relating to surface contamination prevented the measurement of the neutralisation probabilities. However, these problems appear to be technical rather than scientific in nature, and could be readily resolved given the appropriate resources. Experimental spectra obtained from a number of samples demonstrate some fundamental differences between the scattered ion and neutral spectra. For practical non-ordered surfaces the ToF spectra are more complex than their LEISS counterparts.
This is particularly true for helium scattering, where it appears, in the absence of detailed computer simulation, that quantitative analysis is limited to ordered surfaces. Despite this limitation, the ToFFASS instrument opens the way to quantitative analysis of the 'true' surface region for a wider range of surface materials.
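The kinematics behind such ToF spectra follow from elastic binary collisions. The sketch below uses the standard ion-scattering relations (not the instrument's actual analysis code) to convert a flight time into a scattered-particle energy and to compute the kinematic factor K = E1/E0 that determines peak positions; the path length and masses are illustrative.

```python
import math

AMU = 1.660539e-27       # kg per atomic mass unit
EV = 1.602177e-19        # J per electronvolt

def kinematic_factor(m1, m2, theta_deg):
    """Elastic binary-collision kinematic factor K = E1/E0 for a
    projectile of mass m1 scattered through theta_deg by a surface
    atom of mass m2 (valid for m2 >= m1 at all angles)."""
    mu = m2 / m1
    th = math.radians(theta_deg)
    root = math.sqrt(mu**2 - math.sin(th)**2)
    return ((math.cos(th) + root) / (1.0 + mu))**2

def energy_from_tof(mass_amu, path_m, time_s):
    """Recover a particle's kinetic energy (eV) from its time of flight
    over a known drift path -- the basic ToF measurement principle."""
    v = path_m / time_s
    return 0.5 * mass_amu * AMU * v**2 / EV
```

For example, 4He backscattered through 180° from Si (mass 28) retains K = (6/8)² ≈ 0.56 of its incident energy, which is how scattered-particle spectra encode surface composition.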
Abstract:
The objective of this work has been to investigate the principle of combined bioreaction and separation in a simulated counter-current chromatographic bioreactor-separator (SCCR-S) system. The SCCR-S system consisted of twelve 5.4 cm i.d. × 75 cm long columns packed with calcium-charged cross-linked polystyrene resin. Three bioreactions were studied, namely the saccharification of modified starch to maltose and dextrin using the enzyme maltogenase, the hydrolysis of lactose to galactose and glucose in the presence of the enzyme lactase, and the biosynthesis of dextran from sucrose using the enzyme dextransucrase. Combined bioreaction and separation was successfully carried out in the SCCR-S system for the saccharification of modified starch to maltose and dextrin. The effects of the operating parameters (switch time, eluent flowrate, feed concentration and enzyme activity) on the performance of the SCCR-S system were investigated. By using an eluent of dilute enzyme solution, starch conversions of up to 60% were achieved using lower amounts of enzyme than the theoretical amount required by a conventional bioreactor to produce the same amount of maltose over the same time period. Comparing the SCCR-S system to a continuous rotating annular chromatograph (CRAC) for the saccharification of modified starch showed that the SCCR-S system required only 34.6-47.3% of the amount of enzyme required by the CRAC. The SCCR-S system was operated in the batch and continuous modes as a bioreactor-separator for the hydrolysis of lactose to galactose and glucose. By operating the system in the continuous mode, the operating parameters were further investigated. During these experiments the eluent was deionised water and the enzyme was introduced into the system through the same port as the feed.
The galactose produced was retarded and moved with the stationary phase to be purged as the galactose rich product (GalRP), while the glucose moved with the mobile phase and was collected as the glucose rich product (GRP). By operating at lactose feed concentrations of up to 30% w/v, complete conversions were achieved using only 48% of the theoretical amount of enzyme required by a conventional bioreactor to hydrolyse the same amount of lactose over the same time period. The main operating parameters affecting the performance of the SCCR-S system operating in the batch mode were investigated and the results compared to those of the continuous operation of the SCCR-S system. During the biosynthesis of dextran in the SCCR-S system, a method of on-line regeneration of the resin was required to operate the system continuously. Complete conversion was achieved at sucrose feed concentrations of 5% w/v, with fructose rich products (FRP) of up to 100% purity obtained. The dextran rich products were contaminated by small amounts of glucose and levan formed during the bioreaction. Mathematical modelling and computer simulation of the SCCR-S system operating in the continuous mode for the hydrolysis of lactose has been carried out.
Abstract:
The aim of this work has been to investigate the behaviour of a continuous rotating annular chromatograph (CRAC) under a combined biochemical reaction and separation duty. Two biochemical reactions have been employed, namely the inversion of sucrose to glucose and fructose in the presence of the enzyme invertase and the saccharification of liquefied starch to maltose and dextrin using the enzyme maltogenase. Simultaneous biochemical reaction and separation has been successfully carried out for the first time in a CRAC by inverting sucrose to fructose and glucose using the enzyme invertase and continuously collecting pure fractions of glucose and fructose from the base of the column. The CRAC was made of two concentric cylinders forming an annulus 140 cm long by 1.2 cm wide, giving an annular space of 14.5 dm3. The ion exchange resin used was an industrial grade calcium-form Dowex 50W-X4 with a mean diameter of 150 microns. The mobile phase used was deionised, deaerated water containing the appropriate enzyme. The annular column was slowly rotated at speeds of up to 240°/h while the sucrose substrate was fed continuously through a stationary feed pipe to the top of the resin bed. A systematic investigation of the factors affecting the performance of the CRAC under simultaneous biochemical reaction and separation conditions was carried out by employing a factorial experimental procedure. The main factors affecting the performance of the system were found to be the feed rate, feed concentration and eluent rate. Results from the experiments indicated that complete conversion could be achieved for feed concentrations of up to 50% w/v sucrose and at feed throughputs of up to 17.2 kg sucrose per m3 resin/h. The second enzymic reaction, namely the saccharification of liquefied starch to maltose employing the enzyme maltogenase, has also been successfully carried out on a CRAC.
Results from the experiments using soluble potato starch showed that conversions of up to 79% were obtained for a feed concentration of 15.5% w/v at a feed flowrate of 400 cm3/h. The product maltose obtained was over 95% pure. Mathematical modelling and computer simulation of the sucrose inversion system has been carried out. A finite difference method was used to solve the partial differential equations and the simulation results showed good agreement with the experimental results obtained.
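A minimal version of the finite-difference modelling mentioned here: an explicit upwind scheme for the convection-reaction equation ∂C/∂t = −u ∂C/∂z − kC on a 1-D column. This deliberately omits the dispersion and adsorption terms a full chromatographic-reactor model needs, and all parameter values are illustrative, not from the thesis.

```python
import numpy as np

def plug_flow_outlet(u=1.0, k=0.5, length=1.0, nz=100, t_end=2.0, c_in=1.0):
    """Explicit upwind finite-difference solution of
    dC/dt = -u dC/dz - k C; returns the outlet concentration at t_end."""
    dz = length / nz
    dt = 0.5 * dz / u                  # respects the CFL stability limit
    c = np.zeros(nz + 1)
    c[0] = c_in                        # fixed inlet boundary condition
    for _ in range(int(round(t_end / dt))):
        # NumPy evaluates the whole right-hand side before assigning,
        # so this in-place update uses only old values of c.
        c[1:] = c[1:] - u * dt / dz * (c[1:] - c[:-1]) - k * dt * c[1:]
    return c[-1]
```

At steady state the scheme converges towards the analytical plug-flow result C_out = C_in·exp(−kL/u), which gives a quick check that the discretisation is behaving.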
Abstract:
The objective of this work has been to study the behaviour and performance of a batch chromatographic column under simultaneous bioreaction and separation conditions for several carbohydrate feedstocks. Four bioreactions were chosen, namely the hydrolysis of sucrose to glucose and fructose using the enzyme invertase, the hydrolysis of inulin to fructose and glucose using inulinase, the hydrolysis of lactose to glucose and galactose using lactase, and the isomerization of glucose to fructose using glucose isomerase. The chromatographic columns employed were jacketed glass columns from 1 m to 2 m in length, with internal diameters from 0.97 cm to 1.97 cm. The stationary phase used was a cation exchange resin (PUROLITE PCR-833), in the Ca2+ form for the hydrolysis reactions and the Mg2+ form for the isomerization reaction. The mobile phase used was a dilute enzyme solution which was continuously pumped through the chromatographic bed. The substrate was injected at the top of the bed as a pulse. The effects of pulse size (the amount of substrate solution introduced into the system, expressed as a percentage of the total empty column volume, % TECV), pulse concentration, eluent flowrate and eluent enzyme activity were investigated. For the sucrose-invertase system, complete conversions of substrate were achieved for pulse sizes and pulse concentrations of up to 20% TECV and 60% w/v, respectively. Products with purity above 90% were obtained. The enzyme consumption was 45% of the amount theoretically required to produce the same amount of product as in a conventional batch reactor. A throughput of 27 kg sucrose/m3 resin/h was achieved. The systematic investigation of the factors affecting the performance of the batch chromatographic bioreactor-separator was carried out by employing a factorial experimental procedure. The main factors affecting the performance of the system were the flowrate and enzyme activity.
For the inulin-inulinase system, total conversions were also obtained for pulse sizes of up to 20% TECV and a pulse concentration of 10% w/v. Fructose rich fractions with 100% purity, representing up to 99.4% of the total fructose generated, were obtained with an enzyme consumption of 32% of the amount theoretically required to produce the same amount of product in a conventional batch reactor. The hydrolysis of lactose by lactase was studied in the glass columns and also in an SCCR-S unit adapted for batch operation, in co-operation with Dr. Shieh, a fellow researcher in the Chemical Engineering and Applied Chemistry Department at Aston University. By operating at lactose feed concentrations of up to 30% w/v, complete conversions were obtained and the purities of the products generated were above 90%. An enzyme consumption of 48% of the amount theoretically required to produce the same amount of product in a conventional batch reactor was achieved. For the glucose-glucose isomerase system, which is a reversible reaction, the separation obtained with the stationary phase conditioned in the magnesium form was very poor, although the conversion obtained was comparable with those of conventional batch reactors. By working with a mixed pulse of enzyme and substrate, up to 82.5% of the fructose generated was obtained at 100% purity. The mathematical modelling and computer simulation of the batch chromatographic bioreaction-separation has been performed on a personal computer. A finite difference method was used to solve the partial differential equations and the simulation results showed good agreement with the experimental results.
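The factorial experimental procedure used here and in the CRAC study can be generated mechanically: every combination of factor levels becomes one run. The factor names and levels below are invented for illustration only.

```python
from itertools import product

def full_factorial(factors):
    """Enumerate a full factorial experimental design.

    `factors` maps factor name -> list of levels; returns one dict per
    run, covering every combination of levels exactly once."""
    names = list(factors)
    return [dict(zip(names, levels))
            for levels in product(*(factors[n] for n in names))]

# Hypothetical two-level design over three operating parameters.
runs = full_factorial({
    "eluent flowrate (cm3/h)": [300, 400],
    "enzyme activity (U/cm3)": [5, 10],
    "pulse size (% TECV)":     [10, 20],
})
```

A two-level, three-factor design like this yields 2³ = 8 runs, from which main effects (e.g. of flowrate and enzyme activity) and interactions can be estimated.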
Abstract:
Measurements of the sea surface obtained by satellite-borne radar altimetry are irregularly spaced and contaminated with various modelling and correction errors. The largest source of uncertainty for low Earth orbiting satellites such as ERS-1 and Geosat may be attributed to orbital modelling errors. The empirical correction of such errors is investigated by examination of single and dual satellite crossovers, with a view to identifying the extent of any signal aliasing: either by removal of long wavelength ocean signals or introduction of additional error signals. From these studies, it was concluded that sinusoidal approximation of the dominant one cycle per revolution orbit error over arc lengths of 11,500 km did not remove a significant mesoscale ocean signal. The use of TOPEX/Poseidon dual crossovers with ERS-1 was shown to substantially improve the radial accuracy of ERS-1, except for some absorption of small TOPEX/Poseidon errors. The extraction of marine geoid information is of great interest to the oceanographic community and was the subject of the second half of this thesis. First, through determination of regional mean sea surfaces using Geosat data, it was demonstrated that a dataset with 70 cm orbit error contamination could produce a marine geoid map which compares to better than 12 cm with an accurate regional high resolution gravimetric geoid. This study was then developed into Optimal Fourier Transform Interpolation, a technique capable of analysing complete altimeter datasets for the determination of consistent global high resolution geoid maps. This method exploits the regular nature of ascending and descending data subsets, making possible the application of fast Fourier transform algorithms. Quantitative assessment of this method was limited by the lack of global ground truth gravity data, but qualitative results indicate good signal recovery from a single 35-day cycle.
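The "sinusoidal approximation of the dominant one cycle per revolution orbit error" amounts to a least-squares fit of a bias plus one sinusoid in along-track phase, h(θ) = a + b·sin θ + c·cos θ. The sketch below shows that fit on synthetic residuals; it is a generic illustration, not the thesis's crossover-adjustment code.

```python
import numpy as np

def fit_one_cpr_error(along_track_angle_rad, height_residual_m):
    """Least-squares fit of h(theta) = a + b sin(theta) + c cos(theta),
    the standard bias plus one-cycle-per-revolution radial orbit-error
    model. Returns the coefficients (a, b, c)."""
    th = np.asarray(along_track_angle_rad, dtype=float)
    A = np.column_stack([np.ones_like(th), np.sin(th), np.cos(th)])
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(height_residual_m, dtype=float),
                                 rcond=None)
    return coeffs
```

Subtracting the fitted sinusoid from each arc removes the dominant orbit error; the aliasing question studied in the thesis is whether this subtraction also absorbs genuine long-wavelength ocean signal.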
Abstract:
The binding theme of this thesis is the examination of both phakic and pseudophakic accommodation by means of theoretical modelling and the application of a new biometric measuring technique. Anterior Segment Optical Coherence Tomography (AS-OCT) was used to assess phakic accommodative changes in 30 young subjects (19.4 ± 2.0 years; range 18 to 25 years). A new method of assessing curvature change with this technique was employed with limited success. Changes in axial accommodative spacing, however, proved to be very similar to those of the Scheimpflug-based data. A unique biphasic trend in the position of the posterior crystalline lens surface during accommodation was discovered, which has not been alluded to in the literature. All axial changes with accommodation were statistically significant (p < 0.01) with the exception of corneal thickness (p = 0.81). A two-year follow-up study was undertaken for a cohort of subjects previously implanted with a new accommodating intraocular lens (AIOL) (Lenstec Tetraflex KH3500). All measures of best corrected distance visual acuity (BCDVA; +0.04 ± 0.24 logMAR), distance corrected near visual acuity (DCNVA; +0.61 ± 0.17 logMAR) and contrast sensitivity (+1.35 ± 0.21 log units) were good. The subjective accommodation response quantified with the push-up technique (1.53 ± 0.64 D) and defocus curves (0.77 ± 0.29 D) was greater than the objective stimulus response (0.21 ± 0.19 D). AS-OCT measures with accommodation stimulus revealed a small mean posterior movement of the AIOLs (0.02 ± 0.03 mm for a 4.0 D stimulus); this is contrary to the proposed mechanism of the anterior focus-shift principle.
Abstract:
Multiple-antenna systems offer significant performance enhancement and will be applied to the next generation of broadband wireless communications. This thesis presents investigations of multiple-antenna systems – multiple-input multiple-output (MIMO) and cooperative communication (CC) – and their performance in more realistic propagation environments than those reported previously. For MIMO systems, the investigations are conducted via theoretical modelling and simulations in a double-scattering environment. The results show that in flat fading channels the variation of system performance depends on how the scatterer density varies, while in frequency-selective fading channels system performance is affected by the length of the coding block as well as the scatterer density. In realistic propagation environments, fading correlation also has an impact on CC systems, where the antennas can be further apart than those in MIMO systems. A general stochastic model is applied to study the effects of fading correlation on the performance of CC systems. This model reflects the asymmetric nature of the wireless channels in a CC system. The results demonstrate the varied effects of fading correlation under different protocols and channel conditions. The performance of CC systems is further studied at the packet level, using both simulations and an experimental testbed. The results obtained have verified various performance trade-offs of the cooperative relaying network (CRN) investigated in different propagation environments. The results suggest that a proper selection of the relaying algorithm and other techniques can meet the quality-of-service requirements of different applications.
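One way to see the effect of fading correlation on multiple-antenna performance is a Monte Carlo estimate of ergodic capacity under the simpler Kronecker correlation model (not the double-scattering model of the thesis); the antenna count, SNR and correlation values below are illustrative assumptions.

```python
import numpy as np

def sqrt_corr(rho):
    """Matrix square root of the 2x2 correlation matrix [[1, rho], [rho, 1]]."""
    w, v = np.linalg.eigh(np.array([[1.0, rho], [rho, 1.0]]))
    return v @ np.diag(np.sqrt(w)) @ v.T

def ergodic_capacity(snr_db, rho_tx, rho_rx, trials=3000, seed=0):
    """Monte Carlo ergodic capacity (bits/s/Hz) of a 2x2 MIMO link with
    Kronecker-correlated Rayleigh fading, H = R_rx^(1/2) G R_tx^(1/2),
    equal power per transmit antenna and no channel knowledge at the Tx."""
    rng = np.random.default_rng(seed)
    snr = 10.0 ** (snr_db / 10.0)
    rt, rr = sqrt_corr(rho_tx), sqrt_corr(rho_rx)
    total = 0.0
    for _ in range(trials):
        g = (rng.standard_normal((2, 2)) +
             1j * rng.standard_normal((2, 2))) / np.sqrt(2)
        h = rr @ g @ rt
        total += np.log2(np.linalg.det(
            np.eye(2) + (snr / 2) * h @ h.conj().T).real)
    return total / trials
```

Running this at, say, 10 dB SNR with and without antenna correlation reproduces the qualitative finding that correlation erodes the MIMO capacity gain.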
Abstract:
INTAMAP is a Web Processing Service for the automatic spatial interpolation of measured point data. The requirements were (i) the use of open standards for spatial data, such as those developed in the context of the Open Geospatial Consortium (OGC); (ii) a suitable environment for statistical modelling and computation; and (iii) an integrated, open-source solution. The system couples an open-source Web Processing Service (developed by 52°North), accepting data in the form of standardised XML documents (conforming to the OGC Observations and Measurements standard), with a computing back-end realised in the R statistical environment. The probability distribution of interpolation errors is encoded with UncertML, a markup language designed to encode uncertain data. Automatic interpolation needs to be useful for a wide range of applications, and the algorithms have been designed to cope with anisotropy, extreme values, and data with known error distributions. Besides a fully automatic mode, the system can be used with different levels of user control over the interpolation process.
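As a toy stand-in for the statistical back-end, the sketch below shows the basic shape of an automatic interpolator: scattered observations in, predictions at new locations out. It uses simple inverse-distance weighting rather than the model-based (kriging) methods INTAMAP's R back-end applies, and it does not reproduce the WPS/XML interface.

```python
import numpy as np

def idw_interpolate(xy_obs, z_obs, xy_new, power=2.0):
    """Inverse-distance-weighted interpolation of scattered point data.

    xy_obs: (n, 2) observation coordinates; z_obs: (n,) values;
    xy_new: (m, 2) prediction locations. Returns (m,) predictions."""
    xy_obs, z_obs, xy_new = (np.asarray(a, dtype=float)
                             for a in (xy_obs, z_obs, xy_new))
    # Pairwise distances between prediction points and observations.
    d = np.linalg.norm(xy_new[:, None, :] - xy_obs[None, :, :], axis=2)
    d = np.maximum(d, 1e-12)     # avoid division by zero at data points
    w = d ** -power
    return (w * z_obs).sum(axis=1) / w.sum(axis=1)
```

Unlike kriging, IDW provides no error distribution, which is exactly the gap UncertML fills in the real system by encoding the interpolation uncertainty alongside the predictions.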
Abstract:
Web-based distributed modelling architectures are gaining increasing recognition as potentially useful tools to build holistic environmental models, combining individual components in complex workflows. However, existing web-based modelling frameworks currently offer no support for managing uncertainty. On the other hand, the rich array of modelling frameworks and simulation tools which support uncertainty propagation in complex and chained models typically lack the benefits of web-based solutions such as ready publication, discoverability and easy access. In this article we describe the developments within the UncertWeb project which are designed to provide uncertainty support in the context of the proposed ‘Model Web’. We give an overview of uncertainty in modelling, review uncertainty management in existing modelling frameworks and consider the semantic and interoperability issues raised by integrated modelling. We describe the scope and architecture required to support uncertainty management as developed in UncertWeb. This includes tools which support elicitation, aggregation/disaggregation, visualisation and uncertainty/sensitivity analysis. We conclude by highlighting areas that require further research and development in UncertWeb, such as model calibration and inference within complex environmental models.
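The core pattern behind uncertainty propagation in chained models is easy to state: sample the uncertain input, push each sample through the whole model chain, and summarise the output distribution. The sketch below shows that Monte Carlo pattern with a hypothetical two-model chain (the rainfall/runoff functions and all numbers are invented, not UncertWeb components).

```python
import numpy as np

def propagate(models, input_mean, input_sd, n=10000, seed=0):
    """Monte Carlo uncertainty propagation through a chain of models.

    Samples a Gaussian input, pushes every sample through each model in
    turn (each model: vectorised array -> array), and returns the mean
    and standard deviation of the chain's output distribution."""
    rng = np.random.default_rng(seed)
    x = rng.normal(input_mean, input_sd, n)
    for model in models:
        x = model(x)
    return x.mean(), x.std()

# Hypothetical chain: runoff as a linear function of rainfall, then
# flood stage as a linear function of runoff.
mean, sd = propagate([lambda r: 0.6 * r, lambda q: 2.0 + 0.1 * q],
                     input_mean=50.0, input_sd=5.0)
```

For this linear chain the result can be checked analytically (output mean 2 + 0.06·50 = 5.0, standard deviation 0.06·5 = 0.3); for the nonlinear, chained environmental models UncertWeb targets, the sampling approach is what makes propagation tractable.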
Abstract:
The number of interoperable research infrastructures has increased significantly with the growing awareness of the efforts made by the Global Earth Observation System of Systems (GEOSS). One of the Societal Benefit Areas (SBA) that is benefiting most from GEOSS is biodiversity, given the costs of monitoring the environment and managing complex information, from space observations to species records including their genetic characteristics. But GEOSS goes beyond simple data sharing to encourage the publishing and combination of models, an approach which can ease the handling of complex multi-disciplinary questions. It is the purpose of this paper to illustrate these concepts by presenting eHabitat, a basic Web Processing Service (WPS) for computing the likelihood of finding ecosystems with properties equal to those specified by a user. When chained with other services providing data on climate change, eHabitat can be used for ecological forecasting and becomes a useful tool for decision-makers assessing different strategies when selecting new areas to protect. eHabitat can use virtually any kind of thematic data considered useful when defining ecosystems and their future persistence under different climatic or development scenarios. The paper presents the architecture and illustrates the concepts through case studies which forecast the impact of climate change on protected areas or on the ecological niche of an African bird.
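The similarity computation behind such a service can be pictured as a multivariate distance in environmental space. The sketch below uses Mahalanobis distance, a common choice for this kind of niche-similarity surface, assumed here as an illustration rather than taken from the paper; the reference and candidate values are invented.

```python
import numpy as np

def habitat_similarity(reference, candidates):
    """Score candidate sites by Mahalanobis distance to the centroid of
    a reference set of environmental variables (rows = sites, columns =
    e.g. temperature, rainfall, elevation). Smaller distance means the
    candidate's environment is more similar to the reference ecosystem."""
    ref = np.asarray(reference, dtype=float)
    mu = ref.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(ref, rowvar=False))
    diff = np.asarray(candidates, dtype=float) - mu
    # Quadratic form diff @ cov_inv @ diff, evaluated per candidate row.
    return np.sqrt(np.einsum("ij,jk,ik->i", diff, cov_inv, diff))
```

Evaluated over every cell of a climate raster (present-day or projected), such distances yield the kind of likelihood map a decision-maker could compare across protection scenarios.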