70 results for Process Models
Abstract:
A major application of computers has been the control of physical processes, in which the computer is embedded within some larger physical process and is required to control concurrent physical processes. The main difficulty with these systems is their event-driven character, which complicates their modelling and analysis. Although a number of researchers in the process systems community have approached the problems of modelling and analysing such systems, there is still a lack of standardised software development formalisms for system (controller) development, particularly at the early stages of the system design cycle. This research forms part of a larger research programme concerned with the development of real-time process-control systems in which software is used to control concurrent physical processes. The general objective of the research in this thesis is to investigate the use of formal techniques in the analysis of such systems at the early stages of their development, with a particular bias towards application to high-speed machinery. Specifically, the research aims to generate a standardised software development formalism for real-time process-control systems, particularly for software controller synthesis. In this research, a graphical modelling formalism called Sequential Function Chart (SFC), a variant of Grafcet, is examined. SFC, which is defined in the international standard IEC 1131 as a graphical description language, has been used widely in industry and has achieved an acceptable level of maturity and acceptance. A comparative study between SFC and Petri nets is presented in this thesis. To overcome identified inaccuracies in SFC, a formal definition of the firing rules for SFC is given. To provide a framework in which SFC models can be analysed formally, an extended time-related Petri net model for SFC is proposed and the transformation method is defined. The SFC notation lacks a systematic way of synthesising system models from real-world systems; a standardised approach to the development of real-time process-control systems is therefore required, such that the system (software) functional requirements can be identified, captured, and analysed. A rule-based approach and a method called the system behaviour driven method (SBDM) are proposed as a development formalism for real-time process-control systems.
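For orientation, a minimal sketch of the ordinary Petri net firing rule that such an SFC-to-Petri-net transformation targets; the marking representation and function names are illustrative, not the thesis's own definitions.

```python
# Minimal place/transition Petri net firing semantics.
# Representation and names are illustrative, not taken from the thesis.

def enabled(marking, pre):
    """A transition is enabled when every input place holds enough tokens."""
    return all(marking[p] >= n for p, n in pre.items())

def fire(marking, pre, post):
    """Firing removes tokens from input places and adds tokens to output places."""
    if not enabled(marking, pre):
        raise ValueError("transition not enabled")
    new = dict(marking)
    for p, n in pre.items():
        new[p] -= n
    for p, n in post.items():
        new[p] = new.get(p, 0) + n
    return new

# Example: a step/transition pair as it might arise from an SFC step sequence.
marking = {"step1": 1, "step2": 0}
pre, post = {"step1": 1}, {"step2": 1}
print(fire(marking, pre, post))  # {'step1': 0, 'step2': 1}
```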
Abstract:
The paper presents a comparison between the different drag models for granular flows developed in the literature and the effect of each of them on the fast pyrolysis of wood. The process takes place in a 100 g/h lab-scale bubbling fluidized bed reactor located at Aston University. FLUENT 6.3 is used as the modeling framework for the fluidized bed hydrodynamics, while the fast pyrolysis of the discrete wood particles is incorporated as an external user-defined function (UDF) hooked to FLUENT's main code structure. Three different drag models for granular flows are compared, namely the Gidaspow, Syamlal-O'Brien, and Wen-Yu models, already incorporated in FLUENT's main code, and their impact on particle trajectory, heat transfer, degradation rate, product yields, and char residence time is quantified. The Eulerian approach is used to model the bubbling behavior of the sand, which is treated as a continuum. Biomass reaction kinetics is modeled according to the literature using a two-stage, semiglobal model that takes into account secondary reactions.
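As a point of reference, a hedged sketch of one of the compared correlations, the Wen-Yu drag model, in the form commonly quoted for Eulerian gas-solid flow models; sign and coefficient conventions vary between implementations, and the parameter values below are illustrative, not the reactor's, so treat this as a sketch rather than FLUENT's internal code.

```python
# Wen-Yu gas-solid drag correlation (common textbook form; conventions vary).

def wen_yu_beta(eps_g, rho_g, mu_g, d_s, u_rel):
    """Interphase momentum exchange coefficient [kg/(m^3 s)].
    eps_g: gas volume fraction; rho_g: gas density [kg/m^3];
    mu_g: gas viscosity [Pa s]; d_s: particle diameter [m];
    u_rel: gas-solid slip speed [m/s]."""
    eps_s = 1.0 - eps_g
    re = max(rho_g * d_s * u_rel / mu_g, 1e-12)   # particle Reynolds number
    if eps_g * re < 1000.0:
        cd = 24.0 / (eps_g * re) * (1.0 + 0.15 * (eps_g * re) ** 0.687)
    else:
        cd = 0.44                                  # Newton (inertial) regime
    # The eps_g^-2.65 factor is the Wen-Yu voidage correction.
    return 0.75 * cd * eps_s * eps_g * rho_g * u_rel / d_s * eps_g ** -2.65

# Illustrative bubbling-bed conditions (not data from the study).
print(wen_yu_beta(eps_g=0.55, rho_g=0.5, mu_g=4e-5, d_s=5e-4, u_rel=0.3))
```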
Abstract:
In this paper we present a novel method for emulating a stochastic (random-output) computer model and show its application to a complex rabies model. The method is evaluated, in terms of both accuracy and computational efficiency, on synthetic data and on the rabies model. We address the issue of experimental design and provide empirical evidence on the effectiveness of utilizing replicate model evaluations compared with a space-filling design. We employ the Mahalanobis error measure to validate the heteroscedastic Gaussian process-based emulator predictions for both the mean and (co)variance. The emulator allows efficient screening to identify important model inputs and a better understanding of the complex behaviour of the rabies model.
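The Mahalanobis error measure used for validation is standard; a minimal sketch under the usual definition, where y holds held-out simulator outputs and mu, cov are the emulator's predictive mean and covariance (the names are illustrative).

```python
import numpy as np

def mahalanobis_error(y, mu, cov):
    """Mahalanobis distance D^2 = (y - mu)^T cov^{-1} (y - mu).
    For a well-calibrated Gaussian predictive distribution, D^2 should be
    close to its reference distribution (mean approximately len(y))."""
    resid = y - mu
    return float(resid @ np.linalg.solve(cov, resid))

# Illustrative check against a synthetic validation set.
rng = np.random.default_rng(0)
y = rng.normal(size=5)
mu, cov = np.zeros(5), np.eye(5)
print(mahalanobis_error(y, mu, cov))  # ~5 on average when calibrated
```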
Abstract:
Large monitoring networks are becoming increasingly common and can generate large datasets, from thousands to millions of observations in size, often with high temporal resolution. Processing large datasets using traditional geostatistical methods is prohibitively slow, and in real-world applications different types of sensor can be found across a monitoring network. Heterogeneities in the error characteristics of different sensors, in terms of both distribution and magnitude, present problems for generating coherent maps. An assumption in traditional geostatistics is that observations are made directly of the underlying process being studied and that the observations are contaminated with Gaussian errors. Under this assumption, sub-optimal predictions will be obtained if the error characteristics of the sensor are effectively non-Gaussian. One method, model-based geostatistics, imposes a Gaussian process prior over the (latent) process being studied, with the sensor model forming part of the likelihood term. One problem with this type of approach is that the corresponding posterior distribution will be non-Gaussian and computationally demanding, as Monte Carlo methods have to be used. An extension of a sequential, approximate Bayesian inference method enables observations with arbitrary likelihoods to be treated in a projected process kriging framework, which is less computationally intensive. The approach is illustrated using a simulated dataset with a range of sensor models and error characteristics.
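For the Gaussian special case that the extension relaxes, the kriging/GP predictive equations have a closed form; a minimal sketch with an illustrative kernel and synthetic data (non-Gaussian sensor likelihoods break this closed form, which is what motivates the approximate inference above).

```python
import numpy as np

def rbf(a, b, ell=1.0, sf2=1.0):
    """Squared-exponential covariance (an illustrative kernel choice)."""
    d = a[:, None] - b[None, :]
    return sf2 * np.exp(-0.5 * (d / ell) ** 2)

rng = np.random.default_rng(1)
x = np.linspace(0.0, 5.0, 20)
y = np.sin(x) + 0.1 * rng.normal(size=x.size)  # direct, Gaussian-error observations
xs = np.linspace(0.0, 5.0, 100)

K = rbf(x, x) + 0.1 ** 2 * np.eye(x.size)      # Gaussian likelihood enters here
Ks = rbf(xs, x)
mean = Ks @ np.linalg.solve(K, y)               # predictive mean
var = rbf(xs, xs).diagonal() - np.einsum("ij,ji->i", Ks, np.linalg.solve(K, Ks.T))
```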
Abstract:
Most parametric software cost estimation models used today evolved in the late 1970s and early 1980s. At that time, the dominant software development techniques in use were the early 'structured methods'. Since then, several new systems development paradigms and methods have emerged, one being Jackson System Development (JSD). Because current cost estimating methods do not take account of these developments, they lack universality and cannot provide adequate estimates of effort, and hence cost. In order to address these shortcomings, two new estimation methods have been developed for JSD projects. One of these methods, JSD-FPA, is a top-down estimating method based on the existing Mk II function point method. The other method, JSD-COCOMO, is a sizing technique which sizes a project, in terms of lines of code, from the process structure diagrams and thus provides an input to the traditional COCOMO method. The JSD-FPA method allows JSD projects in both the real-time and scientific application areas to be costed, as well as the commercial information systems applications to which FPA is usually applied. The method is based upon a three-dimensional view of a system specification, as opposed to the largely data-oriented view traditionally used by FPA. The method uses counts of various attributes of a JSD specification to develop a metric which provides an indication of the size of the system to be developed. This size metric is then transformed into an estimate of effort by calculating past project productivity and utilising this figure to predict the effort, and hence cost, of a future project. The effort estimates produced were validated by comparing them against the effort figures for six actual projects. The JSD-COCOMO method uses counts of the levels in a process structure chart as the input to an empirically derived model which transforms them into an estimate of delivered source code instructions.
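For context, the basic COCOMO effort equation that a JSD-COCOMO size estimate would feed; the coefficients below are Boehm's published basic-model values, and the sizing input in the example is a placeholder, not a figure from the thesis.

```python
# Basic COCOMO: effort (person-months) = a * KDSI^b, where KDSI is thousands
# of delivered source instructions. Coefficients from Boehm's basic model.
COEFFS = {
    "organic":      (2.4, 1.05),
    "semidetached": (3.0, 1.12),
    "embedded":     (3.6, 1.20),
}

def cocomo_effort(kdsi, mode="organic"):
    a, b = COEFFS[mode]
    return a * kdsi ** b

# e.g. a 32-KDSI size derived from process structure chart counts
# (placeholder figure for illustration only).
print(round(cocomo_effort(32, "semidetached"), 1), "person-months")
```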
Abstract:
This research examines the role of the information management process within a process-oriented enterprise, Xerox Ltd. The research approach is based on a post-positivist paradigm and has resulted in thirty-five idiographic statements. The three major outcomes are: 1. The process-oriented holistic enterprise is an organisation that requires a long-term management commitment to its development. It depends on the careful management of people, tasks, information, and technology. A complex integration of business processes is required; this can be managed through the use of consistent documentation techniques and clarity in the definition of process responsibilities, while management attention to global metrics and the centralisation of the management of the process model are critical to its success. 2. The role of the information management process within a process-oriented enterprise is to provide flexible and cost-effective application, technology, and process support to the business. This is best achieved through centralised management of both information management and the process model. A business-led approach, combined with the consolidation of the application, information, process, and data architectures, is central to providing effective business- and process-focused support. 3. In a process-oriented holistic enterprise, process and information management are inextricably linked. The model of process management depends heavily on information management, whilst the model of information management is totally focused on supporting and creating the process model. The two models are mutually creating: one cannot exist without the other. Process and information management thus form a duality.
Abstract:
Xerox Customer Engagement activity is informed by the "Go To Market" strategy and the "Intelligent Coverage" sales philosophy. The realisation of this philosophy necessitates a sophisticated level of Market Understanding and the effective integration of the direct channels of Customer Engagement. Sophisticated Market Understanding requires the mapping and coding of the entire UK market at the DMU (Decision Making Unit) level, which in turn enables the creation of tailored coverage prescriptions. Effective Channel Integration is made possible by the organisation of Customer Engagement work according to a single, process-defined structure: the Selling Process. Organising by process facilitates the discipline of Task Substitution, which leads logically to the creation of Hybrid Selling models. Productive Customer Engagement requires Selling Process specialisation by industry sector, customer segment, and product group. The research shows that Xerox's Market Database (MDB) plays a central role in delivering the Go To Market strategic aims. It is a tool for knowledge-based selling, enables productive SFA (Sales Force Automation), and, in sum, is critical to the efficient and effective deployment of Customer Engagement resources. Intelligent Coverage is not possible without the MDB. Analysis of the case evidence has resulted in the definition of 60 idiographic statements about how Xerox organises and manages three direct channels of Customer Engagement: Face to Face, Telebusiness, and Ebusiness. Xerox is shown to employ a process-oriented, IT-enabled, holistic approach to Customer Engagement productivity. The significance of the research is that it represents a detailed (perhaps unequalled) level of rich description of the interplay between IT and a holistic, process-oriented management philosophy.
Abstract:
This research focuses on two groups of local companies, namely high-growth local companies and other local companies, to examine and compare the influence of utilising governmental initiatives, servicing foreign MNCs, and internationalisation on their strategic planning processes. The thesis argues that an organisation's approach to strategic planning is determined not only by internal influences, namely firm size and the planning behaviour and attitude of the entrepreneur, as revealed in the literature, but also by external influences. The theoretical contribution of this research lies in identifying this unique situation in Singapore and testing the robustness of the conventional models of planning in smaller companies. As a result of the external influences, this study reveals that local companies are more likely to undertake much more formal strategic planning than the conventional Western literature and models would indicate. High-growth local companies, moreover, undertook a more formal and rigorous strategic planning process than other local companies.
Abstract:
The starting point of the project was the observation that strategic management is absent in small businesses. The first objective of the project was to examine the reasons for this situation in Greece; the second, to examine the appropriateness of the contemporary models of strategic planning for Greek SMEs; and the third, to examine the appropriateness of the alternative approaches to strategic management for Greek SMEs. The term appropriateness covers (a) the ability of managers to use the models and (b) the ability of the models to assist the managers. The results of the research indicate that neither of these two conditions holds; hence, it is suggested that the contemporary models of strategic management are inappropriate for Greek SMEs. Many previous research projects on the topic suggest that since the strategic decision-making process in SMEs is informal, the whole process is absent or ineffective. Current trends in SMEs' strategic management do not treat the informality of the strategic decision-making process as a kind of managerial illness, but as a managerial characteristic. The use of sophisticated data collection and analytical methods does not indicate successful strategic decisions; it indicates the method large firms use to manage their strategy. According to the literature review, SME managers avoid the use of the contemporary models of strategic management because they do not have the knowledge, the resources, or the time. Another thesis, expressed by some specialists, suggests that small firms are different from large ones and hence that their practice of strategic management should not follow large-firm prototypes.
Abstract:
Much of the geometrical data relating to engineering components and assemblies is stored in the form of orthographic views, either on paper or in computer files. For various engineering applications, however, it is necessary to describe objects in formal geometric modelling terms. The work reported in this thesis is concerned with the development and implementation of concepts and algorithms for the automatic interpretation of orthographic views as solid models. The various rules and conventions associated with engineering drawings are reviewed and several geometric modelling representations are briefly examined. A review of existing techniques for the automatic, and semi-automatic, interpretation of engineering drawings as solid models is given. A new theoretical approach is then presented and discussed. The author shows how the implementation of such an approach for uniform-thickness objects may be extended to more general objects by introducing the concept of 'approximation models'. Means by which the quality of the transformations is monitored are also described. Detailed descriptions are given of the interpretation algorithms and the software package that were developed for this project. The process is then illustrated by a number of practical examples. Finally, the thesis concludes that, using the techniques developed, a substantial percentage of drawings of engineering components could be converted into geometric models with a specific degree of accuracy, this degree being indicative of the suitability of the model for a particular application. Further work on important details is required before a commercially acceptable package can be produced.
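A hedged illustration of the uniform-thickness case the interpretation starts from: a 2D profile taken from one orthographic view, extruded through a depth read from another view. The representation and names are illustrative only, not the thesis's algorithm or data structures.

```python
# Build a uniform-thickness (extruded) solid from a 2D profile polygon in one
# orthographic view and a depth measured in another view.

def extrude(profile_2d, depth):
    """profile_2d: list of (x, y) vertices of the view outline.
    Returns the solid's vertices as two parallel faces at z=0 and z=depth."""
    bottom = [(x, y, 0.0) for x, y in profile_2d]
    top = [(x, y, depth) for x, y in profile_2d]
    return bottom + top

# An L-shaped front view, 10 units deep according to the side view.
front_view = [(0, 0), (4, 0), (4, 1), (1, 1), (1, 3), (0, 3)]
solid = extrude(front_view, 10.0)
print(len(solid), "vertices")  # 12
```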
Abstract:
The thesis describes the work carried out to develop a prototype knowledge-based system, 'KBS-SETUPP', to generate process plans for the manufacture of seamless tubes. The work is specifically related to a plant in which hollows are made from solid billets using a rotary piercing process and then reduced to the required size and finished properties using the fixed-plug cold drawing process. The thesis first discusses various methods of tube production in order to give a general background of tube manufacture. A review of the automation of the process planning function is then presented in terms of its basic sub-tasks, and the suitability of a knowledge-based approach is established. In the light of this review and a case study, the process planning problem is formulated in the domain of seamless tube manufacture, its basic sub-tasks are identified, and the capabilities and constraints of the available equipment in the specific plant are established. The task of collecting and collating the process planning knowledge in seamless tube manufacture is discussed; the knowledge was drawn mostly from domain experts, analysis of the plant's existing manufacturing records, textbooks, and applicable standards. For the cold drawing mill, tube-drawing schedules have been rationalised to correspond with practice. The validation of such schedules has been achieved by computing the process parameters and then comparing these with the drawbench capacity to avoid over-loading. The existing models cannot be simulated in the computer program as such; a mathematical model has therefore been proposed which estimates process parameters in close agreement with experimental values established by other researchers. To implement the concepts, the knowledge-based system 'KBS-SETUPP' has been developed on a personal computer using Turbo-Prolog. The system is capable of generating process plans and production schedules, and offers some additional capabilities to supplement process planning. The system-generated process plans have been compared with the actual plans of the company, and it has been shown that the results are satisfactory and encouraging and that the system's capabilities are useful.
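A hedged sketch of the overload check described above, using the textbook ideal-work estimate of cold-drawing force; the thesis's own mathematical model and parameter values are not reproduced here, and the figures in the example are illustrative.

```python
import math

def drawing_force(yield_avg, area_out, area_in):
    """Ideal-work estimate: F = Y_avg * A_out * ln(A_in / A_out).
    yield_avg: mean flow stress over the pass [MPa]; areas in mm^2; F in N."""
    return yield_avg * area_out * math.log(area_in / area_out)

def pass_is_safe(force_n, drawbench_capacity_n, margin=0.85):
    """Reject a schedule pass whose load exceeds a fraction of bench capacity."""
    return force_n <= margin * drawbench_capacity_n

# Illustrative pass: 600 mm^2 -> 500 mm^2 at a mean flow stress of 400 MPa,
# checked against a 150 kN drawbench.
f = drawing_force(400.0, 500.0, 600.0)
print(round(f / 1e3, 1), "kN, safe:", pass_is_safe(f, 150e3))
```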
Abstract:
This thesis describes the procedure and results from four years of research undertaken through the IHD (Interdisciplinary Higher Degrees) Scheme at Aston University in Birmingham, sponsored by the SERC (Science and Engineering Research Council) and Monk Dunstone Associates, Chartered Quantity Surveyors. A stochastic networking technique, VERT (Venture Evaluation and Review Technique), was used to model the pre-tender costs of public health, heating, ventilating, air-conditioning, fire protection, lift, and electrical installations within office developments. The model enabled the quantity surveyor to analyse, manipulate, and explore complex scenarios which had previously defied ready mathematical analysis. The process involved the examination of historical material costs, labour factors, and design performance data. Components and installation types were defined and formatted. Data were updated and adjusted using mechanical and electrical pre-tender cost indices and factors for location, selection of contractor, contract sum, height, and site conditions. Ranges of cost, time, and performance data were represented by probability density functions and defined by constant, uniform, normal, and beta distributions. These variables, and a network of the interrelationships between services components, provided the framework for analysis. The VERT program, in this particular study, relied upon Monte Carlo simulation to model the uncertainties associated with pre-tender estimates of all possible installations. The computer generated output in the form of relative and cumulative frequency distributions of current element and total services costs, critical path analyses, and details of statistical parameters. From these data, alternative design solutions were compared, the degree of risk associated with estimates was determined, heuristics were tested and redeveloped, and cost-significant items were isolated for closer examination. The resultant models successfully combined cost, time, and performance factors and provided the quantity surveyor with an appreciation of the cost ranges associated with the various engineering services design options.
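A minimal sketch of the Monte Carlo step such a VERT study relies on: sampling each services element's cost from its fitted distribution and accumulating totals into a cumulative frequency curve. The distributions and figures below are illustrative placeholders, not the thesis's data.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 10_000  # simulation runs

# Element cost models: constant, uniform, normal and (scaled) beta
# distributions, the families used in the VERT study. Values are illustrative.
lifts = np.full(N, 120_000.0)                    # constant
fire = rng.uniform(30_000, 45_000, N)            # uniform
hvac = rng.normal(250_000, 20_000, N)            # normal
elec = 80_000 + 40_000 * rng.beta(2.0, 5.0, N)   # scaled beta

total = lifts + fire + hvac + elec
total.sort()
cum_freq = np.arange(1, N + 1) / N               # cumulative frequency curve
# e.g. the total services cost exceeded in only 10% of runs:
print(round(float(total[int(0.9 * N)]), -3))
```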
Abstract:
This thesis is a theoretical study of the accuracy and usability of models that attempt to represent the environmental control system of buildings in order to improve environmental design. These models have evolved from crude representations of a building and its environment to an accurate representation of the dynamic characteristics of the environmental stimuli on buildings. Each generation of models has had its own particular influence on built form. This thesis analyses the theory, structure, and data of such models in terms of their accuracy of simulation, and therefore their validity in influencing built form. The models are also analysed in terms of their compatibility with the design process and hence their ability to aid designers. The conclusions are that such models are unlikely to improve environmental performance since: (a) the models can only be applied to a limited number of building types; (b) they can only be applied to a restricted number of the characteristics of a design; (c) they can only be employed after many major environmental decisions have been made; (d) the data used in the models are inadequate and unrepresentative; and (e) the models do not account for occupant interaction in environmental control. It is argued that further improvements in the accuracy of simulation of environmental control will not significantly improve environmental design. This is based on the premise that strategic environmental decisions are made at the conceptual stages of design, whereas models influence the detailed stages of design. It is hypothesised that if models are to improve environmental design, it must be through the analysis of building typologies, which provides a method of feedback between models and the conceptual stages of design. Field studies are presented to describe a method by which typologies can be analysed, and a theoretical framework is described which provides a basis for further research into the implications of the morphology of buildings for environmental design.
Abstract:
The research described here concerns the development of metrics and models to support the development of hybrid (conventional/knowledge-based) integrated systems. The thesis argues from the point that, although it is well known that estimating the cost, duration, and quality of information systems is a difficult task, it is far from clear what sorts of tools and techniques would adequately support a project manager in the estimation of these properties. A literature review shows that metrics (measurements) and estimating tools have been developed for conventional systems since the 1960s, while there has been very little research on metrics for knowledge-based systems (KBSs). Furthermore, although there are a number of theoretical problems with many of the 'classic' metrics developed for conventional systems, it also appears that the tools which such metrics can be used to develop are not widely used by project managers. A survey of large UK companies confirmed this continuing state of affairs. Before any useful tools could be developed, therefore, it was important to find out why project managers were not already using these tools. By characterising those companies that use software cost estimating (SCE) tools against those which could but do not, it was possible to recognise the involvement of the client/customer in the process of estimation. Pursuing this point, a model of the early estimating and planning stages (the EEPS model) was developed to test exactly where estimating takes place. The EEPS model suggests that estimating could take place either before a fully developed plan has been produced, or while this plan is being produced. If it were the former, then SCE tools would be particularly useful, since there is very little other data available from which to produce an estimate. A second survey, however, indicated that project managers see estimating as essentially the latter, at which point project management tools are available to support the process. It would seem, therefore, that SCE tools are not being used because project management tools are being used instead. The issue here is not with the method of developing an estimating model or tool, but with the way in which 'an estimate' is intimately tied to an understanding of what tasks are being planned; current SCE tools are perceived by project managers as targeting the wrong point of estimation. A model (called TABATHA) is then presented which describes how an estimating tool based on an analysis of tasks would fit into the planning stage. The question of whether metrics can be usefully developed for hybrid systems (which also contain KBS components) is tested by extending a number of 'classic' program size and structure metrics to a KBS language, Prolog. Measurements of lines of code, Halstead's operators/operands, McCabe's cyclomatic complexity, Henry and Kafura's data-flow fan-in/out, and post-release reported errors were taken for a set of 80 commercially developed LPA Prolog programs. By redefining the metric counts for Prolog, it was found that estimates of program size and error-proneness comparable to the best conventional studies are possible. This suggests that metrics can be usefully applied to KBS languages such as Prolog, and thus that the development of metrics and models to support the development of hybrid information systems is both feasible and useful.
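One of the 'classic' metrics extended in the thesis is McCabe's cyclomatic complexity; a minimal sketch of the standard decision-count formulation for imperative code (how the thesis redefines the counts for Prolog is not reproduced here, and the keyword list below is an illustrative simplification).

```python
# McCabe's cyclomatic complexity via the decision-count formulation:
# V(G) = (number of binary decision points) + 1 for a single-entry,
# single-exit routine.
import re

DECISIONS = re.compile(r"\b(if|elif|while|for|case|and|or)\b")

def cyclomatic_complexity(source: str) -> int:
    return len(DECISIONS.findall(source)) + 1

src = """
if x > 0 and y > 0:
    while x > y:
        x -= 1
"""
print(cyclomatic_complexity(src))  # 1 if + 1 and + 1 while + 1 = 4
```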
Abstract:
This thesis reviews the main methodological developments in public sector investment appraisal and finds growing evidence that appraisal techniques are not fulfilling their earlier promise. It is suggested that an important reason for this failure lies in the inability of these techniques to handle uncertainty except in a highly circumscribed fashion. It is argued that a more fruitful approach is to strive for flexibility. Investment projects should be formulated with a view to making them responsive to a wide range of possible future events, rather than embodying a solution which is optimal for one configuration of circumstances only. The distinction drawn in economics between the short and the long run is used to examine the nature of flexibility. The concept of long-run flexibility is applied to the pre-investment range of choice open to the decision-maker. It is demonstrated that flexibility is reduced at a very early stage of decision-making by the conventional system of appraisal, which evaluates only a small number of options. The pre-appraisal filtering process is considered further in relation to decision-making models. It is argued that for public sector projects the narrowing down of options is best understood in relation to an amended mixed-scanning model which places importance on the process by which the 'national interest' is determined. Short-run flexibility deals with operational characteristics: the degree to which particular projects may respond to changing demands when the basic investment is already in place. The tension between flexibility and cost is noted. A short case study on the choice of electricity generating plant is presented. The thesis concludes with a brief examination of the approaches used by successive British governments to public sector investment, particularly in relation to the nationalised industries.