28 results for Entropy of a sampling design
in Aston University Research Archive
Abstract:
This paper describes the establishment and early use of an eBusiness Design Studio in Aston Business School at Aston University, which is located in Birmingham in the UK. It was originally conceived as an R&D aid as much as a teaching resource. However, the bulk of its use to date has been in the support of Masters-level modules (courses), and this description will focus on that application in the first instance. We have less experience of its use in an R&D context, but we will make some preliminary comments on this in a later section.
Abstract:
This paper disputes the claim that product design determines 70% of costs, and the implications that follow for design evaluation tools. Using the idea of decision chains, it is argued that such tools need to consider more of the downstream business activities and should take into account the current and future state of the business rather than some idealized view of it. To illustrate the argument, a series of experiments using an enterprise simulator is described that shows the benefit of applying a more holistic 'design for' technique: Design For the Existing Environment.
Abstract:
This work is concerned with the assessment of a newer version of the spout-fluid bed in which the gas is supplied from a common plenum and the distributor controls the operational phenomenon. Thus the main body of the work deals with the effect of distributor design on the mixing and segregation of solids in a spout-filled bed. The effect of distributor design in the conventional fluidised bed, and of variation of the gas inlet diameter in a spouted bed, were also briefly investigated for purposes of comparison. Large particles were selected for study because they are becoming increasingly important in industrial fluidised beds but have not been thoroughly investigated. The mean particle diameters of the fractions ranged from 550 to 2400 µm, and their specific gravities from 0.97 to 2.45. Only work carried out with binary systems is reported here. The effects of air velocity, particle properties, bed height, the relative amounts of jetsam and flotsam, and initial conditions on the steady-state concentration profiles were assessed with selected distributors. The work is divided into three sections. Sections I and II deal with the fluidised bed and spouted bed systems. Section III covers the development of the spout-filled bed and its behaviour with reference to distributor design, and it is shown how the benefits of both spouting and fluidising phenomena can be exploited. In the fluidisation zone, better mixing is achieved by distributors which produce a large initial bubble diameter. Some common features exist between the behaviour of unidensity jetsam-rich systems and different-density flotsam-rich systems. The shape factor does not seem to have an effect as long as it is restricted to the minor component. However, in the case of the major component, particle shape significantly affects the final results. Studies of aspect ratio showed that there is a maximum (1.5) above which slugging occurs and the effect of the distributor design is nullified. A mixing number was developed for unidensity spherical-rich systems, which proved to be extremely useful in quantifying the variation in mixing and segregation with changes in distributor design.
Abstract:
Changes in modern structural design have created a demand for products which are light but possess high strength. The objective is a reduction in fuel consumption and in the weight of materials, to satisfy both economic and environmental criteria. Cold roll forming has the potential to fulfil this requirement. The bending process is controlled by the shape of the profile machined on the periphery of the rolls. A CNC lathe can machine complicated profiles to a high standard of precision, but the expertise of a numerical control programmer is required. A computer program was developed during this project, using the expert system concept, to calculate tool paths and consequently to expedite the procurement of the machine control tapes whilst removing the need for a skilled programmer. Codifying the expertise of a human and encapsulating that knowledge within a computer memory removes the dependency on highly trained people, whose services can be costly, inconsistent and unreliable. A successful cold roll forming operation, in which the product is geometrically correct and free from visual defects, is not easy to attain. The geometry of the sheet after travelling through the rolling mill depends on the residual strains generated by the elastic-plastic deformation. Accurate evaluation of the residual strains can provide the basis for predicting the geometry of the section. A study of geometric and material non-linearity, yield criteria, material hardening and stress-strain relationships was undertaken in this research project. The finite element method was chosen to provide a mathematical model of the bending process and, to ensure efficient manipulation of the large stiffness matrices, the frontal solution was applied. A series of experimental investigations provided data to compare with corresponding values obtained from the theoretical modelling. A computer simulation capable of predicting that a design will be satisfactory prior to the manufacture of the rolls would allow effort to be concentrated on devising an optimum design in which costs are minimised.
Abstract:
The work described in this thesis focuses on the use of a design-of-experiments approach in a multi-well mini-bioreactor to enable the rapid establishment of high-yielding production-phase conditions in yeast, which is an increasingly popular host system in both academic and industrial laboratories. Using green fluorescent protein secreted from the yeast Pichia pastoris, a scalable predictive model of protein yield per cell was derived from 13 sets of conditions, each varying three factors (temperature, pH and dissolved oxygen) over three levels, and was directly transferable to a 7 L bioreactor. This was in clear contrast to the situation in shake flasks, where the process parameters cannot be tightly controlled. By further optimising both the accumulation of cell density in batch culture and the fed-batch induction regime, additional yield improvement was found to be additive to the per-cell yield of the model. A separate study also demonstrated that improving biomass improved product yield in a second yeast species, Saccharomyces cerevisiae. Investigations of cell wall hydrophobicity in high-cell-density P. pastoris cultures indicated that cell wall hydrophobin (protein) composition changes with growth phase, the cells becoming more hydrophobic in log growth than in lag or stationary phases, possibly due to an increased occurrence of proteins associated with cell division. Finally, the modelling approach was validated in mammalian cells, showing its flexibility and robustness. In summary, the strategy presented in this thesis has the benefit of reducing process development time in recombinant protein production, directly from bench to bioreactor.
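By way of illustration of this kind of three-factor, three-level experimental design, the sketch below builds a Box-Behnken-style set of 13 coded conditions and fits a quadratic response-surface model to hypothetical yield values. The design layout, factor coding and measurements are assumptions made purely for illustration and are not taken from the thesis.

import numpy as np

# Box-Behnken-style design for three factors at three coded levels (-1, 0, +1):
# 12 edge-midpoint runs plus one centre point = 13 conditions.
# The factors are illustrative stand-ins for temperature, pH and dissolved oxygen.
design = np.array([
    [-1, -1, 0], [1, -1, 0], [-1, 1, 0], [1, 1, 0],
    [-1, 0, -1], [1, 0, -1], [-1, 0, 1], [1, 0, 1],
    [0, -1, -1], [0, 1, -1], [0, -1, 1], [0, 1, 1],
    [0, 0, 0],
])

# Hypothetical per-cell yield measurements, one per condition (arbitrary units).
yield_per_cell = np.array([2.1, 2.4, 2.0, 2.6, 1.8, 2.2, 2.3, 2.7,
                           1.9, 2.1, 2.4, 2.5, 3.0])

def quadratic_terms(x):
    """Expand coded factor settings into a full quadratic model matrix."""
    t, p, o = x[:, 0], x[:, 1], x[:, 2]
    return np.column_stack([
        np.ones(len(x)),          # intercept
        t, p, o,                  # main effects
        t * p, t * o, p * o,      # two-factor interactions
        t**2, p**2, o**2,         # curvature terms
    ])

X = quadratic_terms(design)
coeffs, *_ = np.linalg.lstsq(X, yield_per_cell, rcond=None)

# Predict yield at a new (coded) operating point, e.g. slightly raised temperature.
new_point = np.array([[0.5, 0.0, 0.0]])
print("Predicted yield per cell:", quadratic_terms(new_point) @ coeffs)

With 13 runs and 10 model coefficients, the quadratic surface can be fitted and then used to locate promising operating conditions before committing to larger-scale bioreactor runs.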
Abstract:
Manufacturing system design is an ongoing activity within industry. Modelling tools based on Discrete Event Simulation are often used by practitioners during this design cycle. However, such tools do not adequately model the behaviour of 'direct' workers in manufacturing environments. There is an important need to expand the capability of modelling to include the relationships between human-centred factors (demography, attitudes, beliefs, etc.), their working environment (physical and organizational), and the workers' subsequent performance in terms of productive routines. Therefore, this paper describes research that has formed a pilot modelling methodology, an important first step in providing such a capability.
Abstract:
As a new medium for questionnaire delivery, the Internet has the potential to revolutionize the survey process. Online-questionnaires can provide many capabilities not found in traditional paper-based questionnaires. Despite this, and the introduction of a plethora of tools to support online-questionnaire creation, current electronic survey design typically replicates the look-and-feel of paper-based questionnaires, thus failing to harness the full power of the electronic delivery medium. A recent environmental scan of online-questionnaire design tools found that little, if any, support is incorporated within these tools to guide questionnaire designers according to best practice [Lumsden & Morgan 2005]. This paper briefly introduces a comprehensive set of guidelines for the design of online-questionnaires. Drawn from relevant but disparate sources, all the guidelines incorporated within the set are proven in their own right. As an initial assessment of the value of the set as a practical reference guide, we undertook an informal study to observe the effect of introducing the guidelines into the design process for a complex online-questionnaire. The paper discusses the qualitative findings of this case study, which are encouraging for the role of the guidelines in the bigger picture of online survey delivery across many domains such as e-government, e-business, and e-health.
Abstract:
This paper consolidates evidence and material from a range of specialist and disciplinary fields to provide an evidence-based review and synthesis on the design and use of serious games in higher education. Search terms identified 165 papers reporting conceptual and empirical evidence on how learning attributes and game mechanics may be planned, designed and implemented by university teachers interested in using games that are integrated into lesson plans and orchestrated as part of a learning sequence at any scale. The findings outline the potential of classifying the links between learning attributes and game mechanics as a means to scaffold teachers' understanding of how to perpetuate learning in optimal ways while enhancing the in-game learning experience. The findings of this paper provide a foundation for describing methods, frames and discourse around experiences of the design and use of serious games, together with methodological limitations and recommendations for further research in this area.
Abstract:
Concept evaluation at the early phase of product development plays a crucial role in new product development, as it determines the direction of the subsequent design activities. However, the evaluation information at this stage mainly comes from experts' judgments, which are subjective and imprecise. How to manage this subjectivity so as to reduce evaluation bias is a major challenge in design concept evaluation. This paper proposes a comprehensive evaluation method which combines information entropy theory with rough numbers. Rough numbers are first used to aggregate individual judgments and priorities and to handle the vagueness of a group decision-making environment. A rough-number-based information entropy method is then proposed to determine the relative weights of the evaluation criteria. The composite performance values based on rough numbers are then calculated to rank the candidate design concepts. The results from a practical case study on the concept evaluation of an industrial robot design show that the integrated evaluation model can effectively strengthen objectivity across the decision-making process.
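To make the entropy-weighting step concrete, here is a minimal sketch of the standard Shannon-entropy method for deriving criteria weights from a decision matrix. It uses plain crisp scores rather than the paper's rough-number intervals, and all ratings are invented for illustration.

import numpy as np

# Rows = candidate design concepts, columns = evaluation criteria.
# Scores are hypothetical expert ratings (higher is better).
scores = np.array([
    [7.0, 5.0, 8.0, 6.0],
    [6.0, 8.0, 5.0, 7.0],
    [8.0, 6.0, 7.0, 5.0],
])

# Normalise each criterion column so its entries sum to one.
p = scores / scores.sum(axis=0)

# Shannon entropy of each criterion, scaled by 1/ln(m) so it lies in [0, 1].
m = scores.shape[0]
entropy = -(p * np.log(p)).sum(axis=0) / np.log(m)

# Criteria that discriminate more between concepts (lower entropy) get higher weight.
divergence = 1.0 - entropy
weights = divergence / divergence.sum()

# Composite performance: weighted sum of normalised scores, then rank the concepts.
composite = (scores / scores.max(axis=0)) @ weights
ranking = np.argsort(-composite)
print("criteria weights:", weights)
print("concept ranking (best first):", ranking)

The paper's contribution is to carry out this weighting and aggregation on rough-number intervals built from the group's judgments, which the crisp sketch above does not attempt to reproduce.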
Abstract:
Distributed digital control systems provide alternatives to conventional, centralised digital control systems. Typically, a modern distributed control system will comprise a multi-processor or network of processors, a communications network, an associated set of sensors and actuators, and the systems and applications software. This thesis addresses the problem of how to design robust decentralised control systems, such as those used to control event-driven, real-time processes in time-critical environments. Emphasis is placed on studying the dynamical behaviour of a system and identifying ways of partitioning the system so that it may be controlled in a distributed manner. A structural partitioning technique is adopted which makes use of natural physical sub-processes in the system, which are then mapped into the software processes to control the system. However, communications are required between the processes because of the disjoint nature of the distributed (i.e. partitioned) state of the physical system. The structural partitioning technique, and recent developments in the theory of potential controllability and observability of a system, are the basis for the design of controllers. In particular, the method is used to derive a decentralised estimate of the state vector for a continuous-time system. The work is also extended to derive a distributed estimate for a discrete-time system. Emphasis is also given to the role of communications in the distributed control of processes and to the partitioning technique necessary to design distributed and decentralised systems with resilient structures. A method is presented for the systematic identification of necessary communications for distributed control. It is also shown that the structural partitions can be used directly in the design of software fault-tolerant concurrent controllers. In particular, the structural partition can be used to identify the boundary of the conversation which can be used to protect a specific part of the system. In addition, for certain classes of system, the partitions can be used to identify processes which may be dynamically reconfigured in the event of a fault. These methods should be of use in the design of robust distributed systems.
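As background for the state-estimation step, the sketch below shows a conventional discrete-time Luenberger observer, the centralised building block that a decentralised scheme would partition across subsystems. The system matrices, observer gain and signals are invented for illustration and are not taken from the thesis.

import numpy as np

# Illustrative discrete-time plant x[k+1] = A x[k] + B u[k], y[k] = C x[k].
A = np.array([[0.9, 0.1],
              [0.0, 0.8]])
B = np.array([[0.0],
              [1.0]])
C = np.array([[1.0, 0.0]])

# Observer gain L chosen so that (A - L C) has eigenvalues inside the unit circle.
L = np.array([[0.5],
              [0.2]])

x = np.array([[1.0], [-1.0]])   # true (unknown) plant state
x_hat = np.zeros((2, 1))        # observer's estimate

for k in range(20):
    u = np.array([[np.sin(0.1 * k)]])       # arbitrary known input
    y = C @ x                               # measurement from the plant
    # Observer: copy of the model corrected by the output error.
    x_hat = A @ x_hat + B @ u + L @ (y - C @ x_hat)
    x = A @ x + B @ u                       # plant update

print("estimation error:", (x - x_hat).ravel())

In the decentralised setting described in the abstract, each software process would run an estimator of this kind for its own partition of the state and exchange the coupling terms over the communications network.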
Abstract:
There is an alternative model of the 1-way ANOVA, called the 'random effects' model or 'nested' design, in which the objective is not to test specific effects but to estimate the degree of variation of a particular measurement and to compare different sources of variation that influence the measurement in space and/or time. The most important statistics from a random effects model are the components of variance, which estimate the variance associated with each of the sources of variation influencing a measurement. The nested design is particularly useful in preliminary experiments designed to estimate different sources of variation and in the planning of appropriate sampling strategies.
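As a minimal sketch of how the components of variance are obtained, the example below applies the classical ANOVA mean-squares method to a balanced one-way random-effects layout; the groups and measurements are hypothetical.

import numpy as np

# Hypothetical balanced design: 4 randomly chosen groups (e.g. sampling sites),
# 5 replicate measurements within each group.
data = np.array([
    [10.2,  9.8, 10.5, 10.1,  9.9],
    [12.1, 11.7, 12.4, 11.9, 12.2],
    [ 9.5,  9.9,  9.2,  9.7,  9.4],
    [11.0, 10.6, 11.3, 10.8, 11.1],
])
a, n = data.shape                      # a groups, n replicates per group

group_means = data.mean(axis=1)
grand_mean = data.mean()

# ANOVA mean squares for the one-way random-effects model.
ms_between = n * ((group_means - grand_mean) ** 2).sum() / (a - 1)
ms_within = ((data - group_means[:, None]) ** 2).sum() / (a * (n - 1))

# Components of variance: within-group and between-group contributions.
sigma2_within = ms_within
sigma2_between = max((ms_between - ms_within) / n, 0.0)

print("within-group variance component: ", sigma2_within)
print("between-group variance component:", sigma2_between)

Comparing the two components shows how much of the total variation is attributable to differences between groups versus replicate measurements within a group, which is exactly the information needed to plan a sampling strategy.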
Abstract:
This dissertation studies the process of operations systems design within the context of the manufacturing organization. Using the DRAMA (Design Routine for Adopting Modular Assembly) model, developed by a team from the IDOM Research Unit at Aston University, as a starting point, the research employed empirically based fieldwork and a survey to investigate the process of production systems design and implementation within four UK manufacturing industries: electronics assembly, electrical engineering, mechanical engineering and carpet manufacturing. The intention was to validate the basic DRAMA model as a framework for research enquiry within the electronics industry, where the initial IDOM work was conducted, and then to test its generic applicability, further developing the model where appropriate, within the other industries selected. The thesis contains a review of production systems design theory and practice prior to presenting thirteen industrial case studies of production systems design from the four industry sectors. The results and analysis of the postal survey into production systems design are then presented. The strategic decisions of manufacturing and their relationship to production systems design, and the detailed process of production systems design and operation, are then discussed. These analyses are used to develop the generic model of production systems design entitled DRAMA II (Decision Rules for Analysing Manufacturing Activities). The model contains three main constituent parts: the basic DRAMA model, the extended DRAMA II model showing the imperatives and relationships within the design process, and a benchmark generic approach for the design and analysis of each component in the design process. DRAMA II is primarily intended for use by researchers as an analytical framework of enquiry, but is also seen as having application for manufacturing practitioners.
Abstract:
This thesis reports the results of analyses of certain aspects of sampling inspection plans. The investigation has been confined to attributes (as distinct from variables) plans, and in this respect the analyses have been concerned with two main aspects of single and double sampling plans. These are: (i) the Average Outgoing Quality Limit (AOQL) of the plan, and (ii) the Average Sample Number (ASN) of the plan. In the former connection, the investigation has been concerned with the analytical evaluation of the AOQL and the determination of the fraction defective of the incoming material that gives the AOQL; these analyses have been applied to both single and double sampling plans. In the latter connection, the investigation has been concerned with the analytical evaluation of the maximum ASN and the determination of the fraction defective of the incoming material that gives the maximum value of the ASN; these analyses have been confined to double sampling plans only, because in the case of single sampling the ASN is constant and equal to n, the sample size.
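As a minimal numerical illustration of the AOQL for a single sampling plan, the sketch below uses the standard approximation AOQ(p) ≈ p·Pa(p), with the acceptance probability Pa(p) taken from the binomial distribution and the correction factor (N - n)/N ignored for large lots. The plan parameters n and c are arbitrary choices for the example.

import numpy as np
from math import comb

def acceptance_probability(p, n, c):
    """Probability of accepting a lot with fraction defective p under a single
    sampling plan (sample size n, acceptance number c), binomial model."""
    return sum(comb(n, d) * p**d * (1 - p)**(n - d) for d in range(c + 1))

def aoq(p, n, c):
    """Average outgoing quality, ignoring the (N - n)/N correction for large lots."""
    return p * acceptance_probability(p, n, c)

# Example plan: sample 50 items, accept the lot if at most 2 are defective.
n, c = 50, 2
p_grid = np.linspace(0.0, 0.3, 3001)
aoq_values = np.array([aoq(p, n, c) for p in p_grid])

i_max = aoq_values.argmax()
print(f"AOQL ~ {aoq_values[i_max]:.4f} at incoming fraction defective p ~ {p_grid[i_max]:.4f}")

The grid search locates the incoming fraction defective at which the average outgoing quality peaks; that peak value is the AOQL of the plan, which the thesis evaluates analytically rather than numerically.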