928 results for Process Modelling
Abstract:
The aged population has an increased susceptibility to infection, suggesting that the function of the innate immune system may be impaired as we age. Macrophages, and their precursors, monocytes, play an important role in host defence in the form of phagocytosis, and also link the innate and adaptive immune systems via antigen presentation. Classically activated 'M1' macrophages are pro-inflammatory and can be induced by encountering pathogenic material or pro-inflammatory mediators. Alternatively activated 'M2' macrophages have a largely reparative role, including clearance of apoptotic bodies and debris from tissues. Although some innate immune receptors have been implicated in the clearance of apoptotic cells, the process has been observed to have a dominantly anti-inflammatory phenotype, with cytokines such as IL-10 and TGF-β implicated. The atherosclerotic plaque contains recruited monocytes and macrophages, and is a highly inflammatory environment despite high levels of apoptosis. At these sites, monocytes differentiate into macrophages and gorge on lipoproteins, resulting in the formation of 'foam cells', which then undergo apoptosis, recruiting further monocytes. This project seeks to understand why, given high levels of apoptosis, the plaque is a pro-inflammatory environment. This phenomenon may be the result of the aged environment or of an inability of foam cells to elicit an anti-inflammatory effect in response to dying cells. Here we demonstrate that lipoprotein treatment of macrophages in culture results in a reduced capacity to clear apoptotic cells. The effect of lipoprotein treatment on apoptotic cell-mediated immune modulation of macrophage function is currently under study.
Abstract:
Gaussian Processes provide good prior models for spatial data, but can be too smooth. In many physical situations there are discontinuities along bounding surfaces, for example fronts in near-surface wind fields. We describe a modelling method for such a constrained discontinuity and demonstrate how to infer the model parameters in wind fields with MCMC sampling.
Abstract:
A Bayesian procedure for the retrieval of wind vectors over the ocean using satellite-borne scatterometers requires realistic prior near-surface wind field models over the oceans. We have implemented carefully chosen vector Gaussian Process models; however, in some cases these models are too smooth to reproduce real atmospheric features, such as fronts. At the scale of the scatterometer observations, fronts appear as discontinuities in wind direction. Due to the nature of the retrieval problem a simple discontinuity model is not feasible, and hence we have developed a constrained discontinuity vector Gaussian Process model which ensures realistic fronts. We describe the generative model and show how to compute the data likelihood given the model. We show the results of inference using the model with Markov Chain Monte Carlo methods on both synthetic and real data.
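As an illustration of the idea in this abstract, the sketch below (not the authors' code; the kernel, noise level and all parameter values are assumptions) builds a toy one-dimensional Gaussian Process whose covariance is cut at an unknown front position, so the field is independent on either side of the discontinuity, and infers that position from noisy synthetic data with a Metropolis sampler:

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_cov(x, ell=0.5, sf=1.0):
    # Squared-exponential covariance: smooth away from the front
    d = x[:, None] - x[None, :]
    return sf**2 * np.exp(-0.5 * (d / ell) ** 2)

def front_cov(x, front, ell=0.5):
    # Zero correlation across the front: independent GPs on each side
    K = rbf_cov(x, ell)
    side = x >= front
    K[np.ix_(side, ~side)] = 0.0
    K[np.ix_(~side, side)] = 0.0
    return K

def log_lik(y, x, front, noise=0.1):
    # Gaussian marginal likelihood of the data given the front position
    K = front_cov(x, front) + noise**2 * np.eye(len(x))
    _, logdet = np.linalg.slogdet(K)
    return -0.5 * (y @ np.linalg.solve(K, y) + logdet + len(x) * np.log(2 * np.pi))

# Synthetic "wind direction" data with a jump (front) at x = 0.6
x = np.linspace(0, 1, 40)
y = np.where(x < 0.6, 0.0, 2.0) + 0.1 * rng.standard_normal(40)

# Random-walk Metropolis sampling of the front position
front, samples = 0.5, []
ll = log_lik(y, x, front)
for _ in range(2000):
    prop = front + 0.05 * rng.standard_normal()
    if 0 < prop < 1:
        ll_prop = log_lik(y, x, prop)
        if np.log(rng.random()) < ll_prop - ll:
            front, ll = prop, ll_prop
    samples.append(front)

print(np.mean(samples[500:]))  # posterior mean near the true front at 0.6
```

The constrained model in the paper is vector-valued and more structured; this scalar version only shows the mechanics of combining a discontinuity-aware covariance with MCMC inference.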
Abstract:
This thesis consists of three empirical studies and one theoretical study. While China has received an increasing amount of foreign direct investment (FDI) and has become the second largest host country for FDI in recent years, the absence of comprehensive studies on FDI inflows into this country motivates this research. In the first study, an econometric model is developed to analyse the economic, political, cultural and geographic determinants of both pledged and realised FDI in China. The results of this study suggest that China's relatively cheaper labour force, high degree of international integration with the outside world (represented by its exports and imports) and bilateral exchange rates are the important economic determinants of both pledged and realised FDI in China. The second study analyses the regional distribution of both pledged and realised FDI within China. The econometric properties of the panel data set are examined using a standardised 't-bar' test. The empirical results indicate that provinces with higher levels of international trade, lower wage rates, more R&D manpower, more preferential policies and closer ethnic links with overseas Chinese attract relatively more FDI. The third study constructs a dynamic equilibrium model to study the interactions among FDI, knowledge spillovers and long-run economic growth in a developing country. The ideas of endogenous product cycles and trade-related international knowledge spillovers are modified and extended to FDI. The major conclusion is that, in the presence of FDI, economic growth is determined by the stock of human capital, the subjective discount rate and the knowledge gap, while unskilled labour cannot sustain growth. In the fourth study, the role of FDI in the growth process of the Chinese economy is investigated using a panel of data for 27 provinces across China between 1986 and 1995.
In addition to FDI, domestic R&D expenditure, international trade and human capital are added to the standard convergence regressions to control for different structural characteristics in each province. The empirical results support endogenous innovation growth theory, in which regional per capita income can converge given technological diffusion, transfer and imitation.
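The convergence regressions mentioned here typically take the β-convergence form, growth = α + β·log(y₀) + controls, where a negative coefficient on initial income indicates conditional convergence. A minimal sketch on simulated data (the variables mirror the controls named in the abstract, but every coefficient and distribution is an illustrative assumption, not an estimate from the thesis):

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated provincial panel: initial income and the abstract's controls
n = 27 * 10  # 27 provinces x 10 years (illustrative)
log_y0 = rng.normal(8.0, 0.5, n)   # log initial per capita income
fdi = rng.uniform(0, 0.1, n)       # FDI share
rnd = rng.uniform(0, 0.05, n)      # domestic R&D expenditure share
trade = rng.uniform(0, 0.5, n)     # international trade share
hc = rng.uniform(2, 10, n)         # human capital (years of schooling)

# Growth generated with conditional convergence (negative beta on log y0)
growth = (0.5 - 0.05 * log_y0 + 0.3 * fdi + 0.2 * rnd
          + 0.05 * trade + 0.002 * hc + 0.01 * rng.standard_normal(n))

# OLS convergence regression
X = np.column_stack([np.ones(n), log_y0, fdi, rnd, trade, hc])
beta, *_ = np.linalg.lstsq(X, growth, rcond=None)
print(beta[1])  # negative coefficient on initial income -> conditional convergence
```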
Abstract:
The application of any e-Solution promises significant returns. In particular, using internet technologies both within enterprises and across the supply (value) chain provides real opportunity, not only for operational improvement but also for innovative strategic positioning. However, significant questions obscure potential investment: it is not clear how any value will actually be created and, importantly, how this value will be shared across the value chain. This paper describes a programme of research that is developing an enterprise simulator to provide a more fundamental understanding of the impact of e-Solutions across operational supply chains, in terms of both standard operational and financial measures of performance. An efficient supply chain reduces the total costs of operations by sharing accurate real-time information and coordinating inter-organizational business processes. This form of electronic link between organizations is known as business-to-business (B2B) e-Business. The financial measures go beyond simple cost calculations to real bottom-line performance by modelling the financial transactions that business processes generate. The paper shows how this enterprise simulator allows a complete supply chain to be modelled in this way across four key applications: control system design, virtual enterprises, pan-supply-chain performance metrics and a supporting e-Supply-chain design methodology.
Abstract:
Simulation modelling has been used for many years in the manufacturing sector but has now become a mainstream tool in business situations. This is partly because of the popularity of business process re-engineering (BPR) and other process-based improvement methods that use simulation to help analyse changes in process design. This textbook includes case studies in both manufacturing and service situations to demonstrate the usefulness of the approach. A further reason for the increasing popularity of the technique is the development of business-orientated and user-friendly Windows-based software. This text provides a guide to the use of the ARENA, SIMUL8 and WITNESS simulation software systems, which are widely used in industry and available to students. Overall, this text provides a practical guide to building a simulation model and implementing its results. All the steps in a typical simulation study are covered, including data collection, input data modelling and experimentation.
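The kind of model such tools build can be reduced to a few lines of code. The sketch below is a hand-rolled discrete-event simulation of a single-server (M/M/1) queue rather than an ARENA, SIMUL8 or WITNESS model, with illustrative arrival and service rates:

```python
import random

def mm1(arrival_rate, service_rate, n_customers, seed=0):
    """Minimal discrete-event simulation of an M/M/1 queue.

    Returns the mean time customers spend in the system (wait + service)."""
    rng = random.Random(seed)
    t, server_free_at, total_time = 0.0, 0.0, 0.0
    for _ in range(n_customers):
        t += rng.expovariate(arrival_rate)     # next customer arrives
        start = max(t, server_free_at)         # waits if the server is busy
        service = rng.expovariate(service_rate)
        server_free_at = start + service
        total_time += server_free_at - t       # sojourn time of this customer
    return total_time / n_customers

# With utilisation 0.5, queueing theory gives mean sojourn time 1/(mu - lambda) = 2.0
print(mm1(arrival_rate=0.5, service_rate=1.0, n_customers=100_000))
```

Comparing the simulated mean against the analytic value is the kind of verification step a simulation study's experimentation phase would include.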
Abstract:
The work describes the programme of activities relating to a mechanical study of the Conform extrusion process. The main objective was to provide a basic understanding of the mechanics of the Conform process, with particular emphasis placed on modelling using experimental and theoretical considerations. The experimental equipment used includes a state-of-the-art computer-aided data-logging system and high-temperature load cells (up to 260°C) manufactured from tungsten carbide. Full details of the experimental equipment are presented in Sections 3 and 4. A theoretical model is given in Section 5. The model presented is based on the upper bound theorem, using a variation of the existing extrusion theories combined with temperature changes in the feed metal across the deformation zone. In addition, the constitutive equations used in the model have been generated from existing experimental data. Theoretical and experimental data are presented in tabular form in Section 6. The discussion of results includes a comprehensive graphical presentation of the experimental and theoretical data. The main findings are: (i) the establishment of stress/strain relationships and an energy balance in order to study the factors affecting redundant work, and hence a model suitable for design purposes; (ii) optimisation of the process, by determination of the extrusion pressure for the range of reductions and changes in the extrusion chamber geometry at lower wheel speeds; and (iii) an understanding of the control of the peak temperature reached during extrusion.
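The relationship between extrusion pressure, flow stress and reduction that upper-bound models of this kind refine can be illustrated with the classical ideal-work estimate, p = (σ̄/η)·ln(A₀/A_f), where the efficiency factor η lumps redundant work and friction. The values below are assumptions for illustration, not results from this work:

```python
import math

def ideal_extrusion_pressure(flow_stress, area_in, area_out, efficiency=0.6):
    """Ideal-work estimate of extrusion pressure (Pa).

    p = (flow_stress / efficiency) * ln(A0 / Af); the efficiency factor
    lumps redundant work and friction (hypothetical value here).
    """
    return flow_stress / efficiency * math.log(area_in / area_out)

# E.g. an aluminium-like flow stress of 50 MPa and a 10:1 area reduction
print(ideal_extrusion_pressure(50e6, 10.0, 1.0) / 1e6)  # pressure in MPa, ~192
```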
Abstract:
Innovation events – the introduction of new products or processes – represent the end of a process of knowledge sourcing and transformation. They also represent the beginning of a process of exploitation which may result in an improvement in the performance of the innovating business. This recursive process of knowledge sourcing, transformation and exploitation comprises the innovation value chain. Modelling the innovation value chain for a large group of manufacturing firms in Ireland and Northern Ireland highlights the drivers of innovation, productivity and firm growth. In terms of knowledge sourcing, we find strong complementarity between horizontal, forwards, backwards, public and internal knowledge sourcing activities. Each of these forms of knowledge sourcing also makes a positive contribution to innovation in both products and processes, although public knowledge sources have only an indirect effect on innovation outputs. In the exploitation phase, innovation in both products and processes contributes positively to company growth, with product innovation having a short-term 'disruption' effect on labour productivity. Modelling the complete innovation value chain highlights the structure and complexity of the process of translating knowledge into business value and emphasises the role of skills, capital investment and firms' other resources in the value creation process.
Abstract:
This thesis describes research into business user involvement in the information systems application building process. The main interest of this research is in establishing and testing techniques to quantify the relationships between identified success factors and the outcome effectiveness of 'business user development' (BUD). The availability of a mechanism to measure the levels of the success factors, and to relate them quantifiably to outcome effectiveness, is important in that it provides an organisation with the capability to predict and monitor effects on BUD outcome effectiveness. This is particularly important in an era where BUD levels have risen dramatically, the benefits of user-centred information systems development are recognised as significant, and awareness of the risks of uncontrolled BUD activity is becoming more widespread. This research targets the measurement and prediction of BUD success factors and implementation effectiveness for particular business users. A questionnaire instrument and analysis technique have been developed and tested which together constitute a tool for predicting and monitoring BUD outcome effectiveness, based on the BUDES (Business User Development Effectiveness and Scope) research model, which is introduced and described in this thesis. The questionnaire instrument is designed for completion by 'business users', the target community being more explicitly defined as 'people who primarily have a business role within an organisation'. The instrument, named BUD ESP (Business User Development Effectiveness and Scope Predictor), can readily be used with survey participants, and has been shown to give meaningful and representative results.
Abstract:
A major application of computers has been to control physical processes, in which the computer is embedded within some large physical process and is required to control concurrent physical processes. The main difficulty with these systems is their event-driven characteristics, which complicate their modelling and analysis. Although a number of researchers in the process system community have approached the problems of modelling and analysis of such systems, there is still a lack of standardised software development formalisms for system (controller) development, particularly at the early stages of the system design cycle. This research forms part of a larger research programme which is concerned with the development of real-time process-control systems in which software is used to control concurrent physical processes. The general objective of the research in this thesis is to investigate the use of formal techniques in the analysis of such systems at the early stages of their development, with a particular bias towards application to high-speed machinery. Specifically, the research aims to generate a standardised software development formalism for real-time process-control systems, particularly for software controller synthesis. In this research, a graphical modelling formalism called Sequential Function Chart (SFC), a variant of Grafcet, is examined. SFC, which is defined in the international standard IEC 1131 as a graphical description language, has been used widely in industry and has achieved an acceptable level of maturity and acceptance. A comparative study between SFC and Petri nets is presented in this thesis. To overcome identified inaccuracies in SFC, a formal definition of the firing rules for SFC is given. To provide a framework in which SFC models can be analysed formally, an extended time-related Petri net model for SFC is proposed and the transformation method is defined.
The SFC notation lacks a systematic way of synthesising system models from real-world systems. Thus a standardised approach to the development of real-time process-control systems is required, such that the system (software) functional requirements can be identified, captured and analysed. A rule-based approach and a method called the system behaviour driven method (SBDM) are proposed as a development formalism for real-time process-control systems.
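The Petri net firing rule that underlies this kind of formalisation can be sketched in a few lines: a transition is enabled when every one of its input places holds enough tokens, and firing it consumes tokens from the input places and produces tokens in the output places. The two-step SFC fragment below is a hypothetical example, not a model from the thesis:

```python
def enabled(marking, pre):
    # A transition is enabled if every input place has enough tokens
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, pre, post):
    # Firing consumes tokens from input places, produces them in output places
    assert enabled(marking, pre)
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

# Two SFC steps in sequence: step1 --t--> step2 (steps map to places,
# the SFC transition maps to a Petri net transition)
marking = {"step1": 1, "step2": 0}
t_pre, t_post = {"step1": 1}, {"step2": 1}
marking = fire(marking, t_pre, t_post)
print(marking)  # the token has moved from step1 to step2
```

The time-related extension proposed in the thesis adds timing information to transitions; this sketch shows only the untimed token game.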
Abstract:
Notwithstanding the high demand for metal powder for automotive and high-tech applications, there are still many unclear aspects of the production process. Only recently has supercomputer performance made possible the numerical investigation of such phenomena. This thesis focuses on the modelling aspects of primary and secondary atomization. Initially a two-dimensional analysis is carried out to investigate the influence of flow parameters (principally reservoir pressure and gas temperature) and nozzle geometry on the final powder yield. Among the different types, close-coupled atomizers have the best performance in terms of cost and narrow size distribution. An isentropic contoured nozzle is introduced to minimize the gas flow losses through shock cells: the results demonstrate that it outperformed the standard converging-diverging slit nozzle. Furthermore, the utilization of hot gas gave a promising outcome: the powder size distribution is narrowed and the gas consumption reduced. In the second part of the thesis, the interaction of liquid metal and high-speed gas near the feeding tube exit was studied. Both axisymmetric and non-axisymmetric geometries were simulated using a 3D approach. The filming mechanism was detected only for very small metal flow rates (typically obtained in laboratory-scale atomizers). When the melt flow increased, the liquid core overtook the adverse gas flow and entered the high-speed wake directly: in this case the disruption is driven by sinusoidal surface waves. The process is characterized by fluctuating values of liquid volumes entering the domain that are monitored only as a time-averaged rate: it is far from the robustness and capability expected of an industrial process. The non-axisymmetric geometry promoted the splitting of the initial stream into four cores, smaller in diameter and easier to atomize. Finally, a new atomizer design based on the lessons learned from the previous simulations is presented.
Abstract:
Liquid-liquid extraction has long been known as a unit operation that plays an important role in industry. This process is well known for its complexity and sensitivity to operating conditions. This thesis presents an attempt to explore the dynamics and control of this process using a systematic approach and state-of-the-art control system design techniques. The process was first studied experimentally under carefully selected operating conditions, which resemble the ranges employed practically under stable and efficient conditions. Data were collected at steady-state conditions using adequate sampling techniques for the dispersed and continuous phases, as well as during the transients of the column, with the aid of a computer-based online data-logging system and online concentration analysis. A stagewise single-stage backflow model was improved to mimic the dynamic operation of the column. The developed model accounts for the variation in hydrodynamics, mass transfer and physical properties throughout the length of the column. End effects were treated by the addition of stages at the column entrances. Two parameters were incorporated in the model, namely a mass transfer weight factor, to correct for the assumption of no mass transfer in the settling zones at each stage, and the backmixing coefficients, to handle the axial dispersion phenomena encountered in the course of column operation. The parameters were estimated by minimising the differences between the experimental and model-predicted concentration profiles at steady-state conditions using a non-linear optimisation technique. The estimated values were then correlated as functions of the operating parameters and incorporated in the model equations. The model equations comprise a stiff differential-algebraic system. This system was solved using the GEAR ODE solver. The calculated concentration profiles were compared to those experimentally measured.
A very good agreement between the two profiles was achieved, within a relative error of ±2.5%. The developed rigorous dynamic model of the extraction column was used to derive linear time-invariant reduced-order models that relate the input variables (agitator speed, solvent feed flowrate and concentration, feed concentration and flowrate) to the output variables (raffinate concentration and extract concentration) using the asymptotic method of system identification. The reduced-order models were shown to be accurate in capturing the dynamic behaviour of the process, with a maximum modelling prediction error of 1%. The simplicity and accuracy of the derived reduced-order models allow for control system design and analysis of such complicated processes. The extraction column is a typical multivariable process, with agitator speed and solvent feed flowrate considered as manipulated variables, raffinate concentration and extract concentration as controlled variables, and the feed concentration and feed flowrate as disturbance variables. The control system design of the extraction process was tackled as a multi-loop decentralised SISO (Single Input Single Output) system as well as a centralised MIMO (Multi-Input Multi-Output) system, using both conventional and model-based control techniques such as IMC (Internal Model Control) and MPC (Model Predictive Control). The control performance of each control scheme was studied in terms of stability, speed of response, sensitivity to modelling errors (robustness), setpoint tracking capabilities and load rejection. For decentralised control, multiple loops were assigned to pair each manipulated variable with each controlled variable according to interaction analysis and other pairing criteria such as the relative gain array (RGA) and singular value decomposition (SVD). The loops rotor speed-raffinate concentration and solvent flowrate-extract concentration showed weak interaction.
Multivariable MPC showed more effective performance compared to the conventional techniques since it accounts for loop interactions, time delays, and input-output variable constraints.
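The RGA pairing analysis referred to above is computed as the element-wise product of the steady-state gain matrix with its transposed inverse, Λ = G ⊙ (G⁻¹)ᵀ. The 2x2 gain matrix below is hypothetical, chosen only to show diagonal pairing with weak interaction, as reported for this column:

```python
import numpy as np

# Hypothetical 2x2 steady-state gain matrix: inputs (rotor speed, solvent
# flowrate) vs outputs (raffinate concentration, extract concentration)
G = np.array([[2.0, 0.4],
              [0.3, 1.5]])

# Relative gain array: element-wise product of G with the transpose of its inverse
RGA = G * np.linalg.inv(G).T
print(RGA)  # diagonal elements close to 1 -> pair diagonally, weak interaction
```

Each row and column of an RGA sums to one, so diagonal values near one (here about 1.04) confirm the diagonal pairing.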
Abstract:
The generation of very short range forecasts of precipitation in the 0-6 h time window is traditionally referred to as nowcasting. Most existing nowcasting systems essentially extrapolate radar observations in some manner; however, very few systems account for the uncertainties involved. Thus deterministic forecasts are produced, which have limited use when decisions must be made, since they carry no measure of confidence or spread of the forecast. This paper develops a Bayesian state space modelling framework for quantitative precipitation nowcasting which is probabilistic from conception. The model treats the observations (radar) as noisy realisations of the underlying true precipitation process, recognising that this process can never be completely known and thus must be represented probabilistically. In the model presented here the dynamics of the precipitation are dominated by advection, so this is a probabilistic extrapolation forecast. The model is designed in such a way as to minimise the computational burden, while maintaining a full, joint representation of the probability density function of the precipitation process. The update and evolution equations avoid the need to sample, so only one model needs to be run, as opposed to the more traditional ensemble route. It is shown that the model works well on both simulated and real data, but that further work is required before the model can be used operationally. © 2004 Elsevier B.V. All rights reserved.
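Because the update and evolution equations avoid sampling, the scheme is Kalman-filter-like in structure. The sketch below is a generic linear Kalman filter on a 1D grid with a circular-shift advection operator, not the paper's model; the grid size, noise levels and rain-band shape are all assumptions. It shows the two phases the abstract describes: predict by advection, then update against a noisy "radar" image:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 20  # grid cells

# Evolution: pure advection by one cell per step (circular shift matrix)
M = np.roll(np.eye(n), 1, axis=0)
Q = 0.01 * np.eye(n)   # process noise: unmodelled growth/decay
R = 0.25 * np.eye(n)   # radar observation noise covariance

# True field: a single Gaussian rain band, advected each step
truth = np.exp(-0.5 * ((np.arange(n) - 5) / 2.0) ** 2)

mean, cov = np.zeros(n), np.eye(n)
for _ in range(30):
    truth = M @ truth
    obs = truth + 0.5 * rng.standard_normal(n)
    # Predict: advect the mean and propagate the covariance
    mean, cov = M @ mean, M @ cov @ M.T + Q
    # Update: assimilate the noisy radar image via the Kalman gain
    K = cov @ np.linalg.inv(cov + R)
    mean = mean + K @ (obs - mean)
    cov = (np.eye(n) - K) @ cov

print(np.max(np.abs(mean - truth)))  # analysis error after 30 radar images
```

The full nowcasting model works on 2D precipitation fields with a richer evolution operator, but the closed-form predict/update cycle, with no ensemble of runs, is the point of contact with the abstract.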