96 results for Modelling Systems
Abstract:
The work is a logical continuation of research started at Aston some years ago, when studies were conducted on fermentations in bubble columns. The present work highlights typical design and operating problems that can arise in systems such as waste-water, chemical, biochemical and petroleum operations involving three-phase, gas-liquid-solid fluidisation; such systems are in increasing use. It is believed that this is one of the few studies concerned with 'true' three-phase, gas-liquid-solid fluidised systems, and that this work will contribute significantly to closing some of the gaps in knowledge in this area. The research was mainly experimental and involved studies of the hydrodynamic parameters: phase holdups (gas and solid), particle mixing and segregation, and phase flow dynamics (flow regime and circulation patterns). The studies focused particularly on the behaviour of the solids and on the influence of the properties of the solids present on the above parameters, in three-phase, gas-liquid-solid fluidised systems containing single particle components and those containing binary and ternary mixtures of particles. All particles were near-spherical in shape, and two particle sizes and total concentration levels were used. Experiments were carried out in two- and three-dimensional bubble columns. Quantitative results are presented in graphical form and are supported by qualitative results from visual studies, which are shown as schematic diagrams and in photographic form. Gas and solid holdup results are compared for air-water systems containing single, binary and ternary component particle mixtures. It should be noted that the criteria for selecting the materials used are very important if true three-phase fluidisation is to be achieved; this is very evident when comparing the results with those in the literature. The fluid flow and circulation patterns observed were assessed against the generally accepted patterns, and the author believes that the present work provides more accurate insight into the modelling of liquid circulation in bubble columns. The characteristic bubbly flow seen at low gas velocity in a two-phase system is suppressed in the three-phase system. The degree of mixing within the system is found to depend on the flow regime, the liquid circulation and the ratio of the physical properties of the solid phases. Evidence of a strong 'trade-off' of properties is shown; the overall solid holdup is believed to be a major parameter influencing the gas holdup structure.
Abstract:
Hazard and operability (HAZOP) studies on chemical process plants are very time-consuming, and often tedious, tasks. A HAZOP study requires a team of experts to systematically analyse every conceivable process deviation, identifying possible causes and any hazards that may result. The systematic nature of the task, and the fact that some team members may be unoccupied for much of the time, can lead to tedium, which in turn may lead to serious errors or omissions. Fault trees are a useful aid to HAZOP: they present the system failure logic graphically, so that the study team can readily assimilate their findings. Fault trees also help to identify design weaknesses, and may additionally be used to estimate the likelihood of hazardous events occurring. Their one drawback is that they are difficult to generate by hand, because of the sheer size and complexity of modern process plants. The work in this thesis proposes a computer-based method to aid the development of fault trees for chemical process plants. The aim is to produce concise, structured fault trees that are easy for analysts to understand. Standard plant input-output equation models for major process units are modified to include ancillary units and pipework, which reduces the number of nodes required to represent a plant. Control loops and protective systems are modelled as operators which act on process variables. This modelling maintains the functionality of loops, making fault tree generation easier and improving the structure of the fault trees produced. A method called event ordering is proposed, which allows the magnitude of deviations of controlled or measured variables to be defined in terms of the control loops and protective systems with which they are associated.
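As a rough illustration of the failure logic a fault tree encodes (this is not the thesis's generation method; the gates, events and probabilities below are invented for the example), a tree of AND/OR gates over basic events can be represented and evaluated as follows:

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    """A fault tree node: a basic event or an AND/OR gate over children."""
    name: str
    gate: str = "basic"            # "basic", "and" or "or"
    prob: float = 0.0              # probability of a basic event
    children: list = field(default_factory=list)

    def probability(self) -> float:
        # Assumes statistically independent basic events.
        if self.gate == "basic":
            return self.prob
        child_p = [c.probability() for c in self.children]
        if self.gate == "and":
            p = 1.0
            for q in child_p:
                p *= q
            return p
        # OR gate: 1 minus the product of the complements
        p = 1.0
        for q in child_p:
            p *= (1.0 - q)
        return 1.0 - p

# Top event: vessel overpressure, reached only if the control loop
# fails AND the relief valve fails (all probabilities illustrative only).
top = Node("vessel overpressure", "and", children=[
    Node("control loop fails", "or", children=[
        Node("sensor fault", prob=0.01),
        Node("controller fault", prob=0.005),
    ]),
    Node("relief valve stuck", prob=0.001),
])
print(f"P(top event) = {top.probability():.2e}")
```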
Abstract:
The generation of very short range forecasts of precipitation in the 0-6 h time window is traditionally referred to as nowcasting. Most existing nowcasting systems essentially extrapolate radar observations in some manner; however, very few systems account for the uncertainties involved. Deterministic forecasts are thus produced, which are of limited use when decisions must be made, since they carry no measure of confidence or spread. This paper develops a Bayesian state space modelling framework for quantitative precipitation nowcasting which is probabilistic from conception. The model treats the observations (radar) as noisy realisations of the underlying true precipitation process, recognising that this process can never be completely known and thus must be represented probabilistically. In the model presented here the dynamics of the precipitation are dominated by advection, so this is a probabilistic extrapolation forecast. The model is designed to minimise the computational burden while maintaining a full, joint representation of the probability density function of the precipitation process. The update and evolution equations avoid the need to sample, so only one model need be run, as opposed to the more traditional ensemble route. It is shown that the model works well on both simulated and real data, but that further work is required before the model can be used operationally.
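A minimal sketch of the kind of sample-free, single-run update the abstract describes, assuming a linear-Gaussian state space in which advection is a simple shift of a 1-D precipitation field (the grid size, shift operator and noise variances below are invented for illustration; the paper's actual model is richer):

```python
import numpy as np

# Illustrative linear-Gaussian state space on a 1-D grid of n cells:
#   evolution:    x_k = M x_{k-1} + w,  w ~ N(0, Q)   (M = advection shift)
#   observation:  y_k = H x_k + v,      v ~ N(0, R)   (radar sees the field)
n = 50
M = np.roll(np.eye(n), 1, axis=1)     # advect the field one cell per step
H = np.eye(n)
Q = 0.05 * np.eye(n)                  # model (evolution) noise
R = 0.20 * np.eye(n)                  # radar observation noise

def kalman_step(x, P, y):
    """One evolution + update step; no sampling or ensemble is needed,
    because mean x and covariance P fully describe the Gaussian pdf."""
    # Evolution (forecast)
    x = M @ x
    P = M @ P @ M.T + Q
    # Update with the new radar image y
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (y - H @ x)
    P = (np.eye(n) - K @ H) @ P
    return x, P

# Toy run: a Gaussian 'rain cell' drifting across the grid
rng = np.random.default_rng(0)
truth = np.exp(-0.5 * ((np.arange(n) - 10) / 3.0) ** 2)
x, P = np.zeros(n), np.eye(n)
for _ in range(20):
    truth = M @ truth
    y = truth + rng.normal(0, np.sqrt(0.2), n)
    x, P = kalman_step(x, P, y)
print("posterior std of first cell:", np.sqrt(P[0, 0]))
```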
Abstract:
It is generally assumed when using Bayesian inference methods for neural networks that the input data contain no noise. For real-world (errors-in-variables) problems this is clearly an unsafe assumption. This paper presents a Bayesian neural network framework which accounts for input noise, provided that a model of the noise process exists. In the limit where the noise process is small and symmetric it is shown, using the Laplace approximation, that this method adds an extra term to the usual Bayesian error bar which depends on the variance of the input noise process. Further, by treating the true (noiseless) input as a hidden variable and sampling it jointly with the network's weights using a Markov chain Monte Carlo method, it is demonstrated that it is possible to infer the regression over the noiseless input. This opens the possibility of training an accurate model of a system using less accurate, or more uncertain, data. This is demonstrated both on a synthetic noisy sine wave problem and on a real problem: inferring the forward model for a satellite radar backscatter system used to predict sea surface wind vectors.
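A hedged sketch of the joint-sampling idea, assuming a small MLP, known Gaussian input/output noise and a plain random-walk Metropolis sampler (the network size, priors and step sizes are invented; the paper's sampler may differ):

```python
import numpy as np

rng = np.random.default_rng(1)

# Noisy sine data: both inputs and targets are observed with noise
n = 40
x_true = np.sort(rng.uniform(-3, 3, n))
sx, sy = 0.3, 0.1                        # known input/output noise std
x_obs = x_true + rng.normal(0, sx, n)
y_obs = np.sin(x_true) + rng.normal(0, sy, n)

H = 8                                    # hidden units of a tiny MLP
def mlp(w, x):
    W1 = w[:H].reshape(1, H); b1 = w[H:2*H]
    W2 = w[2*H:3*H];          b2 = w[3*H]
    return np.tanh(x[:, None] @ W1 + b1) @ W2 + b2

def log_post(w, x_lat):
    # Gaussian prior on weights; x_obs ~ N(x_lat, sx^2); y_obs ~ N(f(x_lat), sy^2)
    lp = -0.5 * np.sum(w**2)
    lp += -0.5 * np.sum((x_obs - x_lat)**2) / sx**2
    lp += -0.5 * np.sum((y_obs - mlp(w, x_lat))**2) / sy**2
    return lp

# Random-walk Metropolis jointly over weights and latent (noiseless) inputs
w = rng.normal(0, 0.1, 3*H + 1)
x_lat = x_obs.copy()
lp = log_post(w, x_lat)
for it in range(20000):
    w_p = w + rng.normal(0, 0.02, w.size)
    x_p = x_lat + rng.normal(0, 0.02, n)
    lp_p = log_post(w_p, x_p)
    if np.log(rng.uniform()) < lp_p - lp:
        w, x_lat, lp = w_p, x_p, lp_p
print("mean |shift| of latent inputs:", np.mean(np.abs(x_lat - x_obs)))
```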
Abstract:
Information systems are corporate resources; information systems development must therefore be aligned with corporate strategy. This thesis proposes that effective strategic alignment of information systems requires information systems development, information systems planning and strategic management to be united. Literature in these areas is examined, crossing the academic boundaries which separate them, to contribute a synthesised approach to the strategic alignment of information systems development. Previous work in information systems planning has extended information systems development techniques, such as data modelling, into strategic planning activities, neglecting the techniques of strategic management. The examination of strategic management in this thesis identifies parallel trends in strategic management and information systems development; the premises of the learning school of strategic management are similar to those of soft systems approaches to information systems development. It is therefore proposed that strategic management can be supported by a soft systems approach. Strategic management tools and techniques frame individual views of a strategic situation; soft systems approaches can integrate these diverse views to explore the internal and external environments of an organisation. The information derived from strategic analysis justifies the need for an information system and provides a starting point for information systems development. This is demonstrated by a composite framework which enables each information system to be justified according to its direct contribution to corporate strategy. The proposed framework was developed through action research conducted in a number of organisations of varying types. This suggests that the framework can be widely used to support the strategic alignment of information systems development, thereby contributing to organisational success.
Abstract:
The topic of this thesis is the development of knowledge-based statistical software. The shortcomings of conventional statistical packages are discussed to illustrate the need for software which exhibits a greater degree of statistical expertise, thereby reducing the misuse of statistical methods by those not well versed in the art of statistical analysis. Some of the issues involved in the development of knowledge-based software are presented, and a review is given of some of the systems developed so far. The majority of these have moved away from conventional architectures by adopting what can be termed an expert systems approach. The thesis then proposes an approach based upon the concept of semantic modelling. By representing some of the semantic meaning of data, it is conceived that a system could examine a request to apply a statistical technique and check whether the use of the chosen technique is semantically sound, i.e. whether the results obtained will be meaningful. Current systems, in contrast, can only perform what can be considered syntactic checks. The prototype system implemented to explore the feasibility of such an approach is presented; it has been designed as an enhanced variant of a conventional-style statistical package. This involved developing a semantic data model to represent some of the statistically relevant knowledge about data, and identifying sets of requirements that should be met for the application of statistical techniques to be valid. The areas of statistics covered in the prototype are measures of association and tests of location.
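To make the distinction between semantic and syntactic checks concrete, here is a sketch (not the thesis's prototype; the metadata scheme and rules are invented for illustration) in which each variable carries a measurement level and a technique is rejected when the data cannot meaningfully support it:

```python
from dataclasses import dataclass

LEVELS = ("nominal", "ordinal", "interval", "ratio")

@dataclass
class Variable:
    name: str
    level: str          # measurement level: one of LEVELS
    values: list

def check_requirements(technique: str, *variables: Variable) -> None:
    """Semantic validity check: is the technique meaningful for these data?"""
    def at_least(v, level):
        return LEVELS.index(v.level) >= LEVELS.index(level)

    if technique == "pearson_correlation":
        # Pearson's r assumes (at least) interval-scale measurements
        for v in variables:
            if not at_least(v, "interval"):
                raise ValueError(
                    f"Pearson correlation is not meaningful for {v.name!r} "
                    f"({v.level} scale); consider a rank-based measure.")
    elif technique == "chi_square_association":
        # Chi-square tests of association work on categorical data
        for v in variables:
            if at_least(v, "interval"):
                raise ValueError(
                    f"{v.name!r} is {v.level}-scale; bin it before chi-square.")

colour = Variable("colour", "nominal", ["red", "blue", "red"])
height = Variable("height_cm", "ratio", [170, 182, 165])
check_requirements("chi_square_association", colour)       # passes silently
try:
    check_requirements("pearson_correlation", colour, height)
except ValueError as e:
    print("rejected:", e)
```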
Abstract:
Cellular manufacturing is widely acknowledged as one of the key approaches to achieving world-class performance in batch manufacturing operations. The design of cellular manufacturing systems (CMS) is therefore crucial in determining a company's competitiveness. This thesis postulated that, in order to be effective, the design of CMS should be not only systematic but also systemic. A systemic design uses the concepts of the body of work known as the 'systems approach' to ensure that a truly effective CMS is defined. The thesis examined the systems approach and created a systemic framework against which existing approaches to the design of CMS were evaluated. The most promising of these, Manufacturing Systems Engineering (MSE), was further investigated through a series of cross-sectional case studies. Although, in practice, MSE proved to be less than systemic, it appeared to produce significant benefits. This seemed to suggest that CMS design did not need to be systemic to be effective. However, further longitudinal case studies showed that the benefits claimed were at an operational level, not at a business level, and that the performance of the whole system had not been evaluated. The deficiencies identified in the existing approaches to designing CMS were then addressed by the development of a novel CMS design methodology that fully utilised systems concepts. A key aspect of the methodology was the use of the Whole Business Simulator (WBS), a modelling and simulation tool that enabled the evaluation of CMS at both operational and business levels. The most contentious aspects of the methodology were tested on a significant and complex case study. The results of the exercise indicated that the systemic methodology was feasible.
Abstract:
Topical and transdermal formulations are promising platforms for the delivery of drugs. A unit dose topical or transdermal drug delivery system that optimises the solubility of drugs within the vehicle, provides a novel dosage form for efficacious delivery and also offers a simple manufacturing technique is desirable. This study used Witepsol® H15 wax as a base for the delivery system. One aspect of the project involved determining the solubility of ibuprofen, flurbiprofen and naproxen in the wax using microscopy, Higuchi release kinetics, HyperDSC and mathematical modelling techniques. Correlations between the results obtained via these techniques were noted, with additional merits such as the provision of valuable information on drug release kinetics and on possible interactions between the drug and excipients. A second aspect of the project involved the incorporation of additional excipients into the wax formulation: Tween 20 (T), Carbopol® 971 (C) and menthol (M). On in vitro permeation through porcine skin, the preferred formulations were: ibuprofen (5% w/w) within Witepsol® H15 + 1% w/w T; flurbiprofen (10% w/w) within Witepsol® H15 + 1% w/w T; naproxen (5% w/w) within Witepsol® H15 + 1% w/w T + 1% w/w C; and sodium diclofenac (10% w/w) within Witepsol® H15 + 1% w/w T + 1% w/w C + 5% w/w M. Unit dose transdermal tablets containing ibuprofen and diclofenac were produced with improved flux compared to marketed products: Voltarol Emugel® demonstrated a flux of 1.68×10⁻³ cm/h, compared to 123×10⁻³ cm/h for the optimised product detailed above, and Ibugel Forte® demonstrated a permeation coefficient of 7.65×10⁻³ cm/h, compared to 8.69×10⁻³ cm/h for the optimised product described above.
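For reference, the Higuchi model relates cumulative release Q to the square root of time, Q(t) = k_H√t, so the release constant can be estimated by a linear fit against √t. A minimal sketch follows; the release data in it are invented purely to illustrate the fit, not taken from the study:

```python
import numpy as np

# Higuchi model: cumulative drug release Q(t) = k_H * sqrt(t), so a plot
# of Q against sqrt(t) should be linear for diffusion-controlled release.
t = np.array([0.5, 1, 2, 4, 6, 8])                  # hours (illustrative)
Q = np.array([0.21, 0.30, 0.44, 0.61, 0.74, 0.86])  # mg/cm^2 released

k_H, intercept = np.polyfit(np.sqrt(t), Q, 1)  # least-squares line on sqrt(t)
pred = k_H * np.sqrt(t) + intercept
r2 = 1 - np.sum((Q - pred)**2) / np.sum((Q - Q.mean())**2)
print(f"Higuchi constant k_H = {k_H:.3f} mg cm^-2 h^-1/2, R^2 = {r2:.3f}")
```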
Abstract:
An investigation was carried out into the different approaches used by Expert Systems researchers to solve problems in the domain of Mechanical Design. The techniques used for conventional formal logic programming were compared with those used when applying Expert Systems concepts. A literature survey of design processes was also conducted with a view to adopting a suitable model of the design process. A model, comprising a variation on two established ones, was developed and applied to a problem within what are described as class 3 design tasks. The research explored the application of these concepts to Mechanical Engineering design problems and their implementation on a microcomputer using an Expert System building tool. It was necessary to explore the use of Expert Systems in this manner so as to bridge the gap between their use as a control structure and their use for detailed analytical design. The former application is well researched; this thesis discusses the latter. Some Expert System building tools available to the author at the beginning of the work were evaluated specifically for their suitability for Mechanical Engineering design problems. Microsynics was found to be the most suitable on which to implement a design problem, because of its simple but powerful semantic net knowledge representation structure and its ability to use other types of representation scheme. Two major implementations were carried out: the first involved a design program for a helical compression spring, and the second a gear-pair system design. Two concepts were proposed in the thesis for the modelling and implementation of design systems involving many equations. The method proposed enables equation manipulation and analysis using a combination of frames, semantic nets and production rules. The use of semantic nets for purposes other than psychology and natural language interpretation is quite new and represents one of the major contributions to knowledge by the author. The development of a purpose-built shell program for this type of design problem was recommended as an extension of the research; Microsynics may usefully be used as a platform for this development.
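As a hedged illustration of equation manipulation over a frame-like structure (this is generic Python, not Microsynics or the thesis's implementation), consider the standard helical-spring rate relation k = Gd⁴/(8D³nₐ), where a production rule fills in whichever slot of the frame is unknown; the numerical values are illustrative:

```python
# A frame-like representation of one design relation: the spring rate of a
# helical compression spring, k = G*d^4 / (8*D^3*n_a). Given any four slot
# values, a production rule fills in the fifth.
def solve_spring_rate(slots: dict) -> dict:
    """Fill exactly one missing slot of {G, d, D, n_a, k} from the others."""
    names = ("G", "d", "D", "n_a", "k")
    missing = [s for s in names if slots.get(s) is None]
    if len(missing) != 1:
        raise ValueError("exactly one slot must be unknown")
    G, d, D, n_a, k = (slots.get(s) for s in names)
    target = missing[0]
    if target == "k":
        slots["k"] = G * d**4 / (8 * D**3 * n_a)
    elif target == "d":
        slots["d"] = (8 * k * D**3 * n_a / G) ** 0.25
    elif target == "D":
        slots["D"] = (G * d**4 / (8 * k * n_a)) ** (1 / 3)
    elif target == "n_a":
        slots["n_a"] = G * d**4 / (8 * D**3 * k)
    elif target == "G":
        slots["G"] = 8 * k * D**3 * n_a / d**4
    return slots

# Find the wire diameter d for a target rate k (SI units; steel G ~ 79 GPa)
frame = {"G": 79e9, "d": None, "D": 0.03, "n_a": 8, "k": 20e3}
print(solve_spring_rate(frame))
```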
Abstract:
The absence of a definitive approach to the design of manufacturing systems signifies the importance of a control mechanism to ensure the timely application of relevant design techniques. To provide effective control, design development needs to be continually assessed in relation to the required system performance, which can only be achieved analytically through computer simulation. This technique provides the only method of accurately replicating the highly complex and dynamic interrelationships inherent within manufacturing facilities and realistically predicting system behaviour. Owing to these unique capabilities, the application of computer simulation should support and encourage a thorough investigation of all alternative designs, allowing attention to focus specifically on critical design areas and enabling continuous assessment of system evolution. To achieve this, system analysis needs to be efficient in terms of data requirements and both the speed and accuracy of evaluation. To provide an effective control mechanism, a hierarchical or multi-level modelling procedure has therefore been developed, specifying the appropriate degree of evaluation support necessary at each phase of design. An underlying assumption of the proposal is that evaluation is quick, easy and allows models to expand in line with design developments. However, current approaches to computer simulation are wholly inappropriate for supporting such hierarchical evaluation. Implementation of computer simulation through traditional approaches is typically characterised by a requirement for very specialist expertise, a lengthy model development phase and a correspondingly high expenditure, resulting in very little, and rather inappropriate, use of the technique. Simulation, when used, is generally applied only to check or verify a final design proposal; rarely is its full potential utilised to aid, support or complement the manufacturing system design procedure. To implement the proposed modelling procedure, the concept of a generic simulator was therefore adopted, as such systems require no specialist expertise, instead facilitating quick and easy model creation, execution and modification through simple data inputs. Previous generic simulators have tended to be too restricted, lacking the flexibility to be generally applicable to manufacturing systems. The development of the ATOMS manufacturing simulator, however, has proven that such systems can be relevant to a wide range of applications, besides verifying the benefits of multi-level modelling.
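ATOMS itself is not described in the abstract, so the sketch below only illustrates, with invented data, what 'model creation through simple data inputs' can mean: a minimal discrete-event simulation of a serial flow line whose model is nothing more than a table of station names and cycle times:

```python
import heapq

def simulate_flow_line(stations, n_parts):
    """Minimal discrete-event simulation of a serial flow line.
    `stations` is plain data: a list of (name, cycle_time) pairs, so a new
    model is 'built' just by editing the table, not by writing code."""
    free_at = {name: 0.0 for name, _ in stations}   # when each station is next free
    events = [(0.0, i, 0) for i in range(n_parts)]  # (time, part, station index)
    heapq.heapify(events)
    finish = {}
    while events:
        t, part, s = heapq.heappop(events)
        name, cycle = stations[s]
        start = max(t, free_at[name])               # wait if the station is busy
        done = start + cycle
        free_at[name] = done
        if s + 1 < len(stations):
            heapq.heappush(events, (done, part, s + 1))
        else:
            finish[part] = done
    return finish

# Model defined purely by data: three stations with illustrative cycle times
model = [("saw", 4.0), ("mill", 6.0), ("drill", 3.0)]
times = simulate_flow_line(model, n_parts=5)
print("makespan:", max(times.values()))   # the bottleneck (mill) paces the line
```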
Abstract:
Manufacturing firms are driven by competitive pressures to continually improve the effectiveness and efficiency of their organisations. For this reason, manufacturing engineers often implement changes to existing processes, or design new production facilities, with the expectation of making further gains in manufacturing system performance. This thesis relates to how the likely outcome of this type of decision should be predicted prior to its implementation. The thesis argues that, since manufacturing systems must also interact with many other parts of an organisation, the expected performance improvements can often be significantly hampered by constraints that arise elsewhere in the business. As a result, decision-makers should attempt to predict just how well a proposed design will perform when these other factors, or 'support departments', are taken into consideration. However, the thesis also demonstrates that, in practice, where quantitative analysis is used to evaluate design decisions, the analysis model invariably ignores the potential impact of support functions on a system's overall performance. A more comprehensive modelling approach is therefore required. A study of how various business functions interact establishes that, to properly represent the kind of delays that give rise to support department constraints, a model should portray the dynamic and stochastic behaviour of entities in both the manufacturing and non-manufacturing aspects of a business. This implies that computer simulation should be used to model design decisions, but current simulation software does not provide a sufficient range of functionality to enable the behaviour of all of these entities to be represented in this way. The main objective of the research has therefore been the development of a new simulator that overcomes the limitations of existing software and so enables decision-makers to conduct a more holistic evaluation of design decisions. It is argued that the application of object-oriented techniques offers a potentially better way of fulfilling both the functional and the ease-of-use requirements of the new simulator. An object-oriented analysis and design of the system, called WBS/Office, is therefore presented, which extends to modelling a firm's administrative and other support activities in the context of the manufacturing system design process. A particularly novel feature of the design is the ability for decision-makers to model how a firm's specific information and document processing requirements might hamper shop-floor performance. The simulator is primarily intended for modelling make-to-order batch manufacturing systems, and the thesis presents example models, created using a working version of WBS/Office, that demonstrate the feasibility of using the system to analyse manufacturing system designs in this way.
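WBS/Office is not reproduced here; the following object-oriented sketch (class names, delays and arrival pattern all invented) illustrates the core idea that a document-processing step in a support department can pace shop-floor work:

```python
from dataclasses import dataclass

@dataclass
class SupportDepartment:
    """A non-manufacturing function that processes documents before
    production may start (e.g. order entry releasing a works order)."""
    name: str
    processing_time: float   # hours per document
    next_free: float = 0.0

    def process(self, arrival: float) -> float:
        start = max(arrival, self.next_free)       # queue if busy
        self.next_free = start + self.processing_time
        return self.next_free                      # document release time

@dataclass
class Machine:
    name: str
    cycle_time: float
    next_free: float = 0.0

    def run(self, release: float) -> float:
        start = max(release, self.next_free)
        self.next_free = start + self.cycle_time
        return self.next_free

order_entry = SupportDepartment("order entry", processing_time=8.0)
mill = Machine("mill", cycle_time=2.0)

# Five orders arriving an hour apart: the shop floor is fast, but the
# support department's paperwork delay paces the whole system.
for i in range(5):
    released = order_entry.process(arrival=float(i))
    finished = mill.run(released)
    print(f"order {i}: released {released:5.1f} h, finished {finished:5.1f} h")
```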
Abstract:
This thesis is concerned with organisational problem solving. The work reflects the complexities of organisational problem situations and the eclectic approach that has been necessary to gain an understanding of the processes involved. The thesis is structured into three main parts. Part I describes the author's understanding of problems and suitable approaches. Chapter 2 identifies the Transcendental Realist (TR) view of science (Harré 1970, Bhaskar 1975) as the best general framework for identifying suitable approaches to complex organisational problems. Chapter 3 discusses the relationship between Checkland's methodology (1972) and TR. The need to generate iconic (explanatory) models of the problem situation is identified, and the ability of viable system modelling to supplement the modelling stage of the methodology is explored in Chapter 4. Chapter 5 builds further on the methodology to produce an original iconic model of the methodological process. The model characterises the mechanisms of organisational problem situations as well as desirable procedural steps. The Weltanschauungen (W's), or "world views", of key actors are recognised as central to the mechanisms involved. Part II describes the experience which prompted the theoretical investigation. Chapter 6 describes the first year of the project; the success of this stage is attributed to the predominance of a single W. Chapter 7 describes the changes in the organisation which made the remaining phase of the project difficult. These difficulties are attributed to a failure to recognise the importance of differing W's. Part III revisits the theoretical and organisational issues. Chapter 8 identifies a range of techniques embodying W's which are compatible with the framework of Part I and which might usefully supplement it. Chapter 9 characterises possible W's in the sponsoring organisation. Throughout the work, an attempt is made to reflect the process as well as the product of the author's learning.
Abstract:
Firstly, we numerically model a practical 20 Gb/s undersea configuration employing the Return-to-Zero Differential Phase Shift Keying (RZ-DPSK) data format. The modelling is performed using the Split-Step Fourier Method to solve the Generalised Nonlinear Schrödinger Equation. We optimise the dispersion map and per-channel launch power of these channels and investigate how the choice of pre-/post-compensation can influence performance. After obtaining these optimal configurations, we investigate Bit Error Rate estimation for these systems, and we find that estimation based on Gaussian statistics of the electrical current is appropriate for systems of this type, indicating quasi-linear behaviour. The introduction of narrower pulses due to the deployment of quasi-linear transmission decreases the tolerance to chromatic dispersion and intra-channel nonlinearity. We use tools from mathematical statistics to study the behaviour of these channels in order to develop new methods of estimating the Bit Error Rate. In the final section, we consider the estimation of the Eye Closure Penalty, a popular measure of signal distortion. Using a numerical example and assuming the symmetry of eye closure, we show that the Eye Closure Penalty can be estimated simply using Gaussian statistics. We also find that the statistics of the logical ones dominate the statistics of signal distortion in the case of Return-to-Zero On-Off Keying configurations.
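As a hedged illustration of the propagation method named here (a generic scalar split-step Fourier integrator, not the thesis's full WDM and DPSK model), the sketch below alternates dispersion/loss and Kerr nonlinearity steps; the fibre parameters are typical textbook values rather than the thesis's:

```python
import numpy as np

def ssfm(A0, length, n_steps, beta2, gamma, alpha, dt):
    """Symmetric split-step Fourier integration of the scalar NLSE
        dA/dz = -(alpha/2) A - i (beta2/2) d^2A/dt^2 + i gamma |A|^2 A,
    alternating dispersion/loss (Fourier domain) and Kerr nonlinearity
    (time domain) over steps of size dz = length / n_steps."""
    dz = length / n_steps
    w = 2 * np.pi * np.fft.fftfreq(A0.size, d=dt)        # angular frequency grid
    half_linear = np.exp((0.5j * beta2 * w**2 - 0.5 * alpha) * dz / 2)
    A = A0.astype(complex)
    for _ in range(n_steps):
        A = np.fft.ifft(half_linear * np.fft.fft(A))     # half linear step
        A *= np.exp(1j * gamma * np.abs(A)**2 * dz)      # full nonlinear step
        A = np.fft.ifft(half_linear * np.fft.fft(A))     # half linear step
    return A

# Illustrative single pulse over one 50 km span of standard fibre
dt = 1e-12                                   # 1 ps time grid
t = (np.arange(2**12) - 2**11) * dt
A0 = np.sqrt(1e-3) * np.exp(-t**2 / (2 * (20e-12)**2))   # ~1 mW Gaussian pulse
A = ssfm(A0, length=50e3, n_steps=500,
         beta2=-21e-27, gamma=1.3e-3, alpha=4.6e-5, dt=dt)  # typical SMF values
print("peak power out:", np.max(np.abs(A)**2), "W")
```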
Abstract:
This thesis investigates the modelling of drying processes for the promotion of market-led Demand Side Management (DSM) as applied to the UK Public Electricity Suppliers. A review of DSM in the electricity supply industry is provided, together with a discussion of the relevant drivers supporting market-led DSM and energy services (ES). The potential opportunities for ES in a fully deregulated energy market are outlined. It is suggested that targeted industrial-sector energy efficiency schemes offer significant opportunity for long-term customer and supplier benefit. At the process level, industrial drying is highlighted as offering significant scope for the application of energy services. Drying is an energy-intensive process used widely throughout industry. The results of an energy survey suggest that 17.7 per cent of total UK industrial energy use derives from drying processes. Comparison with published work indicates that energy use for drying shows an increasing trend against a background of reducing overall industrial energy use. Airless drying is highlighted as offering potential energy-saving and production benefits to industry. To this end, a comprehensive review of the novel airless drying technology and its background theory is made. The advantages and disadvantages of airless operation are defined, the limited market penetration of airless drying is identified, and the key opportunities for energy saving are set out. Limited literature has been found detailing the modelling of energy use for airless drying. A review of drying theory and previous modelling work is made in an attempt to model energy consumption for drying processes. The history of drying models is presented, together with a discussion of the different approaches taken and their relative merits. The viability of deriving energy use from empirical drying data is examined. Adaptive neuro-fuzzy inference systems (ANFIS) are successfully applied to the modelling of drying rates for three drying technologies, namely convective air, heat pump and airless drying. The ANFIS systems are then integrated into a novel energy services model for the prediction of relative drying times, energy costs and atmospheric carbon dioxide emission levels. The author believes that this work constitutes the first use of fuzzy systems for the modelling of drying performance as an energy services approach to DSM. To gain an insight into the 'real-world' use of energy for drying, this thesis presents a unique first-order energy audit of every ceramic sanitaryware manufacturing site in the UK. Previously unknown patterns of energy use are highlighted, and supplementary comments on the timing and use of drying systems are made. The limitations of such large-scope energy surveys are discussed.
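ANFIS details are not given in the abstract; the sketch below shows only the forward pass of a two-rule, first-order Takagi-Sugeno fuzzy system of the kind ANFIS tunes, with invented membership functions and rule coefficients mapping air temperature to a drying rate:

```python
import numpy as np

def gauss(x, c, s):
    """Gaussian membership function with centre c and width s."""
    return np.exp(-0.5 * ((x - c) / s) ** 2)

def sugeno_drying_rate(temp_c):
    """Forward pass of a two-rule, first-order Takagi-Sugeno system:
       Rule 1: IF temp is LOW  THEN rate = 0.010*temp + 0.05
       Rule 2: IF temp is HIGH THEN rate = 0.025*temp - 0.30
    ANFIS would learn the membership and consequent parameters from data;
    all numbers here are invented for illustration (rate in kg water/h)."""
    w1 = gauss(temp_c, c=40.0, s=15.0)      # firing strength of 'LOW'
    w2 = gauss(temp_c, c=90.0, s=15.0)      # firing strength of 'HIGH'
    f1 = 0.010 * temp_c + 0.05              # rule 1 consequent (linear)
    f2 = 0.025 * temp_c - 0.30              # rule 2 consequent (linear)
    return (w1 * f1 + w2 * f2) / (w1 + w2)  # weighted-average defuzzification

for t in (40, 65, 90):
    print(f"{t} degC -> predicted drying rate {sugeno_drying_rate(t):.2f} kg/h")
```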
Abstract:
Multiple-antenna systems offer significant performance enhancement and will be applied in the next generation of broadband wireless communications. This thesis presents investigations of multiple-antenna systems – multiple-input multiple-output (MIMO) and cooperative communication (CC) – and their performance in more realistic propagation environments than those reported previously. For MIMO systems, the investigations are conducted via theoretical modelling and simulations in a double-scattering environment. The results show that in flat fading channels the variation of system performance depends on how the scatterer density varies, and that in frequency-selective fading channels system performance is affected by the length of the coding block as well as by the scatterer density. In realistic propagation environments, fading correlation also has an impact on CC systems, where the antennas can be further apart than those in MIMO systems. A general stochastic model is applied to study the effects of fading correlation on the performance of CC systems. This model reflects the asymmetry of the wireless channels in a CC system. The results demonstrate the varied effects of fading correlation under different protocols and channel conditions. The performance of CC systems is further studied at the packet level, using both simulations and an experimental testbed. The results obtained verify various performance trade-offs of the cooperative relaying network (CRN) investigated in different propagation environments, and suggest that a proper selection of relaying algorithms and other techniques can meet the quality-of-service requirements of different applications.
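As a hedged illustration of a double-scattering MIMO channel (one common form writes H as a product of two random matrices coupled through ns scatterers; this is not necessarily the thesis's model, and the antenna counts, scatterer counts and SNR below are invented), ergodic capacity can be estimated by Monte Carlo:

```python
import numpy as np

rng = np.random.default_rng(7)

def capacity_double_scattering(nt, nr, ns, snr_db, trials=2000):
    """Monte Carlo ergodic capacity (bit/s/Hz) of a double-scattering
    MIMO channel H = Hr @ Ht / sqrt(ns): the signal passes through ns
    scatterers, so a low scatterer density reduces the effective rank of H
    even when nt and nr are large. Equal power allocation is assumed."""
    snr = 10 ** (snr_db / 10)
    total = 0.0
    for _ in range(trials):
        Ht = (rng.normal(size=(ns, nt)) + 1j * rng.normal(size=(ns, nt))) / np.sqrt(2)
        Hr = (rng.normal(size=(nr, ns)) + 1j * rng.normal(size=(nr, ns))) / np.sqrt(2)
        H = Hr @ Ht / np.sqrt(ns)
        M = np.eye(nr) + (snr / nt) * H @ H.conj().T
        total += np.log2(np.linalg.det(M).real)
    return total / trials

# Scatterer-poor vs. richly scattered environment at 10 dB SNR, 4x4 MIMO
for ns in (2, 32):
    c = capacity_double_scattering(nt=4, nr=4, ns=ns, snr_db=10)
    print(f"{ns:2d} scatterers -> ergodic capacity {c:.2f} bit/s/Hz")
```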