885 results for systematic product design
Abstract:
Using current software engineering technology, the robustness required of safety-critical software cannot be assured. However, several approaches can help to assure software robustness to some extent. To achieve high-reliability software, methods should be adopted which avoid introducing faults (fault avoidance); testing should then be carried out to identify any faults which persist (fault removal). Finally, techniques should be used which allow any undetected faults to be tolerated (fault tolerance). Verifying the correctness of the system design specification and analysing the performance of the model are the basic issues in concurrent systems. In this context, modelling distributed concurrent software is one of the most important activities in the software life cycle, and communication analysis is a primary consideration in achieving reliability and safety. Fault avoidance largely requires human analysis, which is error prone; by reducing human involvement in the tedious aspects of modelling and analysing the software, it is hoped that fewer faults will persist into its implementation in the real-time environment. The Occam language supports concurrent programming, with interprocess interaction taking place through communications; this may lead to deadlock due to communication failure. Systematic methods must therefore be adopted in the design of concurrent software for distributed computing systems if the communication structure is to be free of pathologies such as deadlock. The objective of this thesis is to provide a design environment which ensures that processes are free from deadlock. A software tool was designed and used to facilitate the production of fault-tolerant software for distributed concurrent systems. Where Occam is used as a design language, state-space methods such as Petri nets can be used in analysis and simulation to determine the dynamic behaviour of the software and to identify structures which may be prone to deadlock, so that they can be eliminated from the design before the program is ever run. The design tool consists of two parts. The first takes an input program and translates it into a mathematical model (a Petri net), which is used for modelling and analysing the concurrent software. The second part is a Petri-net simulator that takes the translated program as its input and simulates it to generate the reachability tree. The tree identifies "deadlock potential", which the user can explore further. Finally, the software tool has been applied to a number of Occam programs; two examples are used to show how the tool works in the early design phase to prevent faults before the program is ever run.
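The tool's two-stage workflow — translate the program into a Petri net, then simulate it to build the reachability tree and flag deadlock potential — can be illustrated with a minimal sketch. The sketch below (Python, not the thesis tool) assumes a place/transition net given as explicit pre- and post-sets and uses a breadth-first search of the markings; the place and transition names are invented for the example.

```python
from collections import deque

def enabled(marking, pre):
    # A transition is enabled when every input place holds enough tokens.
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, pre, post):
    # Fire the transition: consume tokens from input places, add to output places.
    m = dict(marking)
    for p, n in pre.items():
        m[p] = m.get(p, 0) - n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

def reachability(initial, transitions):
    # Breadth-first exploration of the reachable markings; markings with no
    # enabled transition are reported as "deadlock potential".
    seen, terminal = set(), set()
    queue = deque([tuple(sorted(initial.items()))])
    while queue:
        key = queue.popleft()
        if key in seen:
            continue
        seen.add(key)
        marking = dict(key)
        successors = [fire(marking, pre, post)
                      for _, pre, post in transitions
                      if enabled(marking, pre)]
        if not successors:
            terminal.add(key)
        queue.extend(tuple(sorted(m.items())) for m in successors)
    return seen, terminal

# Two processes that each claim one channel and then wait for the other's:
# the reachability tree contains a marking in which neither can proceed.
transitions = [
    ("P1 claims ch.a", {"p1.start": 1, "ch.a": 1}, {"p1.has_a": 1}),
    ("P1 claims ch.b", {"p1.has_a": 1, "ch.b": 1}, {"p1.done": 1, "ch.a": 1, "ch.b": 1}),
    ("P2 claims ch.b", {"p2.start": 1, "ch.b": 1}, {"p2.has_b": 1}),
    ("P2 claims ch.a", {"p2.has_b": 1, "ch.a": 1}, {"p2.done": 1, "ch.a": 1, "ch.b": 1}),
]
initial = {"p1.start": 1, "p2.start": 1, "ch.a": 1, "ch.b": 1}

markings, terminal = reachability(initial, transitions)
print(f"{len(markings)} reachable markings, {len(terminal)} terminal (deadlock potential)")
```

A marking with no enabled transitions corresponds to the "deadlock potential" the tool reports; intended program termination also appears as a terminal marking, which is why the user is left to explore each case further.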
Abstract:
BACKGROUND: The use of quality of life (QoL) instruments in menorrhagia research is increasing, but there is concern that not enough emphasis is placed on patient focus in these measurements, i.e. on issues which are of importance to patients and reflect their experiences and concerns (clinical face validity). The objective was to assess the quality of QoL instruments in studies of menorrhagia. STUDY DESIGN: A systematic review of published research. Papers were identified through MEDLINE (1966-April 2000), EMBASE (1980-April 2000), Science Citation Index (1981-April 2000), Social Science Citation Index (1981-April 2000), CINAHL (1982-1999) and PsychLIT (1966-1999), and by manual searching of the bibliographies of known primary and review articles. Studies were selected if they assessed women with menorrhagia for quality of life, either developing QoL instruments or applying them as an outcome measure. Selected studies were assessed for the quality of their QoL instruments using a 17-item checklist comprising 10 items for clinical face validity (issues relevant to patients' expectations and concerns) and 7 items for measurement properties (such as reliability, responsiveness, etc.). RESULTS: A total of 19 articles, 8 on instrument development and 11 on application, were included in the review. The generic Short Form 36 Health Survey Questionnaire (SF36) was used in 12/19 (63%) studies. Only two studies developed new menorrhagia-specific QoL instruments, and these complied with 7/17 (41%) and 10/17 (59%) of the quality criteria. Quality assessment showed that only 7/19 (37%) studies complied with more than half of the criteria for face validity, whereas 17/19 (90%) studies complied with more than half of the criteria for measurement properties (P = 0.0001). CONCLUSION: Among existing QoL instruments there is good compliance with the quality criteria for measurement properties but not with those for clinical face validity. There is a need to develop methodologically sound, disease-specific QoL instruments for menorrhagia, focusing on both face validity and measurement properties.
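As a rough illustration of how the 17-item checklist splits into its two sub-scales, the sketch below scores a single hypothetical study; the item names are placeholders, not the checklist actually used in the review.

```python
# Illustrative only: scoring one study against a two-part quality checklist
# (10 clinical face-validity items plus 7 measurement-property items).
FACE_VALIDITY_ITEMS = [f"face_validity_{i}" for i in range(1, 11)]   # 10 items
MEASUREMENT_ITEMS = [f"measurement_{i}" for i in range(1, 8)]        # 7 items

def appraise(study_compliance):
    """study_compliance maps each checklist item name to True/False."""
    fv = sum(study_compliance.get(i, False) for i in FACE_VALIDITY_ITEMS)
    mp = sum(study_compliance.get(i, False) for i in MEASUREMENT_ITEMS)
    return {
        "face_validity": f"{fv}/{len(FACE_VALIDITY_ITEMS)}",
        "measurement": f"{mp}/{len(MEASUREMENT_ITEMS)}",
        "total": f"{fv + mp}/17",
        "over_half_face_validity": fv > len(FACE_VALIDITY_ITEMS) / 2,
        "over_half_measurement": mp > len(MEASUREMENT_ITEMS) / 2,
    }

# Hypothetical study complying with 3 face-validity and 5 measurement items.
example = {f"face_validity_{i}": True for i in (1, 4, 7)}
example.update({f"measurement_{i}": True for i in (1, 2, 3, 5, 6)})
print(appraise(example))
```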
Abstract:
A systematic survey of the possible methods of chemical extraction of iron by chloride formation has been presented, supported by a comparable study of feedstocks, products and markets. The generation and evaluation of alternative processes was carried out by the technique of morphological analysis, which was exploited by way of a computer program. The final choice was related to technical feasibility and economic viability, particularly capital cost requirements, and developments were made in an estimating procedure for hydrometallurgical processes which has general applications. The systematic exploration included the compilation of relevant data, and this indicated a need to investigate the precipitative hydrolysis of aqueous ferric chloride. Arising from this study, two novel hydrometallurgical processes for manufacturing iron powder are proposed, and experimental work was undertaken in the following areas to demonstrate feasibility and obtain basic data for design purposes: (1) Precipitative hydrolysis of aqueous ferric chloride. (2) Gaseous chloridation of metallic iron, and oxidation of the resultant ferrous chloride. (3) Reduction of gaseous ferric chloride with hydrogen. (4) Aqueous acid leaching of low-grade iron ore. (5) Aqueous acid leaching of metallic iron. The experimentation was supported by theoretical analyses dealing with: (1) Thermodynamics of hydrolysis. (2) Kinetics of ore leaching. (3) Kinetics of metallic iron leaching. (4) Crystallisation of ferrous chloride. (5) Oxidation of anhydrous ferrous chloride. (6) Reduction of ferric chloride. Conceptual designs are suggested for both of the proposed processes. These draw attention to areas where further work is necessary, which are listed. Economic analyses have been performed which isolate significant cost areas and indicate total production costs. Comparisons are made with previous and analogous proposals for the production of iron powder.
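The morphological-analysis step can be sketched as enumeration of a morphological box, one option chosen per stage, with infeasible combinations screened out. The stages, options and screening rule below are illustrative reconstructions from the abstract, not the program actually written for the thesis.

```python
from itertools import product

# One option is chosen per stage; infeasible combinations are screened out.
morphological_box = {
    "iron source": ["low-grade iron ore", "metallic iron (scrap)"],
    "chloride formation": ["aqueous acid leaching", "gaseous chloridation"],
    "intermediate": ["ferric chloride", "ferrous chloride"],
    "iron recovery": ["hydrogen reduction", "precipitative hydrolysis"],
}

def feasible(route):
    # Example screening rule: precipitative hydrolysis is only considered
    # for an aqueous ferric chloride intermediate (cf. the abstract above).
    if route["iron recovery"] == "precipitative hydrolysis":
        return route["intermediate"] == "ferric chloride"
    return True

stages = list(morphological_box)
routes = [dict(zip(stages, combo)) for combo in product(*morphological_box.values())]
candidates = [r for r in routes if feasible(r)]
print(f"{len(routes)} combinations generated, {len(candidates)} pass screening")
```

Each surviving route would then be ranked on technical feasibility and estimated capital cost, as the abstract describes.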
Abstract:
Purpose – This paper aims to present a framework that will help manufacturing firms to configure their internal production and support operations to enable effective and efficient delivery of products and their closely associated services. Design/methodology/approach – First, the key definitions and literature sources directly associated with the servitization of manufacturing are established. Then a theoretical framework that categorises the key characteristics of a manufacturer's operations strategy is developed and populated using both evidence from the extant literature and empirical data. Findings – The framework captures a set of operations principles, structures and processes that can guide a manufacturer in the delivery of a product-centric servitized offering. These are illustrated and contrasted against operations that deliver purely products (production operations) and those that deliver purely services (service operations). Research limitations/implications – The work is based on a review of the literature supported by data collected from an exploratory case study. Whilst it provides an essential platform, further research will be needed to validate the framework. Originality/value – The principal contribution of this paper is a framework that captures the key characteristics of operations for product-centric servitized manufacture.
Abstract:
The objective of this study has been to enable a greater understanding of the biomass gasification process through the development and use of process and economic models. A new theoretical equilibrium model of gasification is described, based on the operating condition known as the adiabatic carbon boundary. This represents an ideal gasifier working at the point where the carbon in the feedstock is completely gasified. The model can be used as a "target" against which the results of real gasifiers can be compared, but it does not simulate their behaviour. A second model has been developed which uses a stagewise approach to model fluid-bed gasification, and its results indicate that pyrolysis and the reactions of pyrolysis products play an important part in fluid-bed gasifiers. Both models have been used in sensitivity analyses: the biomass moisture content and gasifying-agent composition were found to have the largest effects on performance, whilst pressure and heat loss had lesser effects. Correlations have been produced to estimate the total installed capital cost of gasification systems and have been used in an economic model of gasification. This model has been used in a sensitivity analysis to determine the factors which most affect the profitability of gasification; the most important influences were found to be feedstock cost, product selling price and throughput. Given the economic conditions of late 1985, refuse gasification for the production of producer gas was found to be viable at throughputs of about 2.5 tonnes/h (dry basis) and above in the metropolitan counties of the United Kingdom. At this throughput and above, the largest element of the product gas cost is the feedstock cost, which is also the most variable cost element.
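The kind of one-at-a-time sensitivity analysis described can be sketched as below; the cost structure and all figures are invented for illustration and do not reproduce the thesis's economic model or its 1985 data.

```python
# Illustrative sketch: vary one input of a simple gasification profitability
# model while holding the others at base values (one-at-a-time sensitivity).
base = {
    "feedstock_cost": 20.0,     # cost per dry tonne of feedstock (illustrative)
    "selling_price": 4.0,       # price per GJ of product gas (illustrative)
    "throughput": 2.5,          # dry tonnes per hour
    "gas_yield": 12.0,          # GJ of product gas per dry tonne
    "fixed_cost_per_h": 150.0,  # capital charges, labour, overheads per hour
}

def profit_per_hour(p):
    revenue = p["selling_price"] * p["gas_yield"] * p["throughput"]
    cost = p["feedstock_cost"] * p["throughput"] + p["fixed_cost_per_h"]
    return revenue - cost

for factor in ("feedstock_cost", "selling_price", "throughput"):
    for change in (-0.2, 0.0, 0.2):  # +/- 20 % about the base case
        case = dict(base, **{factor: base[factor] * (1 + change)})
        print(f"{factor:15s} {change:+.0%}: profit = {profit_per_hour(case):7.1f} per hour")
```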
Abstract:
Requirements for systems to continue to operate satisfactorily in the presence of faults have led to the development of techniques for the construction of fault-tolerant software. This thesis addresses the problem of error detection and recovery in distributed systems which consist of a set of communicating sequential processes. A method is presented for the a priori design of conversations for this class of distributed system. Petri nets are used to represent the state and to solve state reachability problems for concurrent systems. The dynamic behaviour of the system can be characterised by a state-change table derived from the state reachability tree. Systematic conversation generation is possible by defining a closed boundary on any branch of the state-change table. Relating the state-change table to process attributes ensures that all necessary processes are included in the conversation. The method also ensures properly nested conversations. An implementation of the conversation scheme using the concurrent language occam is proposed. The structure of the conversation is defined using the special features of occam, and the proposed implementation gives a structure which is independent of the application and of the number of processes involved. Finally, the integrity of inter-process communications is investigated. The basic communication primitives used in message-passing systems are seen to have deficiencies when applied to systems with safety implications. Using a Petri-net model, a boundary for a time-out mechanism is proposed which will increase the integrity of a system involving inter-process communications.
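A state-change table of the kind described can be derived directly from the edges of a reachability tree, and the processes that act along a branch are exactly those a conversation drawn on that branch must include. The sketch below assumes a simple (state, acting process, next state) edge list; the representation and labels are illustrative, not those of the thesis.

```python
# Hedged sketch: build a state-change table from reachability-tree edges and
# collect the processes that participate along one branch.
reachability_edges = [
    # (state, process that acts, next state) - labels are illustrative
    ("S0", "P1", "S1"),
    ("S1", "P2", "S2"),
    ("S2", "P1", "S3"),
    ("S2", "P3", "S4"),
]

def state_change_table(edges):
    table = {}
    for state, process, nxt in edges:
        table.setdefault(state, []).append((process, nxt))
    return table

def processes_on_branch(table, start, end, involved=frozenset()):
    """Depth-first walk from start to end, accumulating the acting processes."""
    if start == end:
        return involved
    for process, nxt in table.get(start, []):
        found = processes_on_branch(table, nxt, end, involved | {process})
        if found is not None:
            return found
    return None

table = state_change_table(reachability_edges)
print(processes_on_branch(table, "S0", "S3"))   # expected: {'P1', 'P2'}
```

Closing a conversation boundary around the S0-to-S3 branch would therefore have to include processes P1 and P2, matching the abstract's point that relating the table to process attributes identifies all necessary participants.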
Abstract:
Cellular manufacturing is widely acknowledged as one of the key approaches to achieving world-class performance in batch manufacturing operations. The design of cellular manufacturing systems (CMS) is therefore crucial in determining a company's competitiveness. This thesis postulated that, in order to be effective, the design of CMS should be not only systematic but also systemic. A systemic design uses the concepts of the body of work known as the 'systems approach' to ensure that a truly effective CMS is defined. The thesis examined the systems approach and created a systemic framework against which existing approaches to the design of CMS were evaluated. The most promising of these, Manufacturing Systems Engineering (MSE), was further investigated using a series of cross-sectional case studies. Although, in practice, MSE proved to be less than systemic, it appeared to produce significant benefits. This seemed to suggest that CMS design did not need to be systemic to be effective. However, further longitudinal case studies showed that the benefits claimed were at an operational level rather than a business level, and that the performance of the whole system had not been evaluated. The deficiencies identified in the existing approaches to designing CMS were then addressed by the development of a novel CMS design methodology that fully utilised systems concepts. A key aspect of the methodology was the use of the Whole Business Simulator (WBS), a modelling and simulation tool that enabled the evaluation of CMS at both operational and business levels. The most contentious aspects of the methodology were tested on a significant and complex case study. The results of the exercise indicated that the systemic methodology was feasible.
Abstract:
Changes in modern structural design have created a demand for products which are light but possess high strength. The objective is a reduction in fuel consumption and in the weight of materials, to satisfy both economic and environmental criteria. Cold roll forming has the potential to fulfil this requirement. The bending process is controlled by the shape of the profile machined on the periphery of the rolls. A CNC lathe can machine complicated profiles to a high standard of precision, but the expertise of a numerical control programmer is required. A computer program was developed during this project, using the expert-system concept, to calculate tool paths and consequently to expedite the procurement of the machine control tapes whilst removing the need for a skilled programmer. Codifying human expertise and encapsulating that knowledge in computer memory removes the dependency on highly trained people, whose services can be costly, inconsistent and unreliable. A successful cold roll forming operation, in which the product is geometrically correct and free from visual defects, is not easy to attain. The geometry of the sheet after travelling through the rolling mill depends on the residual strains generated by the elastic-plastic deformation. Accurate evaluation of the residual strains can provide the basis for predicting the geometry of the section. A study of geometric and material non-linearity, yield criteria, material hardening and stress-strain relationships was undertaken in this research project. The finite element method was chosen to provide a mathematical model of the bending process and, to ensure efficient manipulation of the large stiffness matrices, the frontal solution was applied. A series of experimental investigations provided data to compare with corresponding values obtained from the theoretical modelling. A computer simulation capable of predicting that a design will be satisfactory, prior to the manufacture of the rolls, would allow effort to be concentrated on devising an optimum design in which costs are minimised.
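The tool-path side of the program can be sketched, in much simplified form, as the emission of linear G-code moves from a list of roll-profile coordinates. A real roll profile would also need roughing passes, circular interpolation and tool-nose compensation; the coordinates and feed rate below are purely illustrative and this is not the expert system built in the thesis.

```python
# Simplified sketch: turn a roll profile, given as (X diameter, Z position)
# pairs, into linear G-code moves for a CNC lathe.
def profile_to_gcode(points, feed=0.15, safe_x=100.0):
    """points: list of (X diameter, Z position) pairs along the roll profile."""
    lines = ["G21 (mm units)", "G95 (feed per revolution)"]
    first_x, first_z = points[0]
    lines.append(f"G00 X{safe_x:.3f} Z{first_z:.3f} (rapid to start, clear of work)")
    lines.append(f"G01 X{first_x:.3f} F{feed:.3f} (feed down to the profile)")
    for x, z in points[1:]:
        lines.append(f"G01 X{x:.3f} Z{z:.3f}")
    lines.append(f"G00 X{safe_x:.3f} (retract)")
    return "\n".join(lines)

# Illustrative profile: a stepped roll with a tapered transition.
roll_profile = [(80.0, 0.0), (80.0, -15.0), (72.0, -25.0), (72.0, -40.0)]
print(profile_to_gcode(roll_profile))
```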
Abstract:
A product's reliability and environmental performance have become critical elements of its specification and design. To obtain a high level of confidence in the reliability of a design, it is customary to test it under realistic conditions in a laboratory. The objective of this work is to examine the feasibility of designing mechanical test rigs which exhibit prescribed dynamic characteristics. The design is attached to the rig, and excitation is applied to the rig, which then transmits representative vibration levels into the product. The philosophical considerations made at the outset of the project are discussed, as they form the basis for the resulting design methodologies. An attempt is first made to identify the parameters of a test rig directly from the spatial model derived during the system identification process; this is shown to be incapable of yielding a feasible test rig design. A finite-dimensional optimal design methodology is then developed which identifies the parameters of a discrete spring/mass system that is dynamically similar to a point coordinate on a continuous structure. This design methodology is incorporated within a further procedure which derives a structure comprising a continuous element and a discrete system. The methodology is used to obtain point-coordinate similarity for two planes of motion, and is validated by experimental tests. A limitation of this approach is that multi-coordinate similarity cannot be achieved, owing to interaction between the discrete system and the continuous element at points away from the coordinate of interest. During the work the importance of the continuous element is highlighted, and a design methodology is developed for continuous structures. This methodology is based upon distributed-parameter optimal design techniques and allows an initially poor design estimate to be moved in a feasible direction towards an acceptable design solution. Cumulative damage theory is used to provide a quantitative method of assessing the quality of dynamic similarity. It is shown that the combination of modal analysis techniques and cumulative damage theory provides a feasible design synthesis methodology for representative test rigs.
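Cumulative damage as a similarity measure can be sketched with the standard Palmgren-Miner rule and a Basquin S-N curve; whether the thesis uses exactly this formulation is not stated in the abstract, and the S-N constants and load spectra below are invented for illustration.

```python
# Hedged sketch: compare the fatigue damage a component accumulates on the
# real structure with that accumulated on the test rig, using the standard
# Palmgren-Miner rule with a Basquin S-N curve N = C * S**-m.
def miner_damage(cycle_counts, C=1e12, m=3.0):
    """cycle_counts: {stress range (MPa): number of cycles}."""
    return sum(n / (C * s ** -m) for s, n in cycle_counts.items())

service_spectrum = {120.0: 2.0e5, 80.0: 1.5e6, 40.0: 8.0e6}   # on the structure
rig_spectrum = {115.0: 2.2e5, 85.0: 1.4e6, 38.0: 7.5e6}       # measured on the rig

d_service = miner_damage(service_spectrum)
d_rig = miner_damage(rig_spectrum)
print(f"damage ratio rig/service = {d_rig / d_service:.2f}  (1.0 = ideal similarity)")
```

A damage ratio close to one indicates that the rig would consume fatigue life at the same rate as the real structure, which is one quantitative way of expressing "dynamic similarity" for test purposes.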
Abstract:
Safety enforcement practitioners within Europe, and the marketers, designers and manufacturers of consumer products, need to determine compliance with the legal test of "reasonable safety" for consumer goods so as to reduce the risks of injury to a minimum. To enable free movement of products, a method for safety appraisal is required that can be applied consistently throughout Europe and used by non-experts as an "expert" system of hazard analysis in the safety testing of consumer goods. Safety testing approaches and the concepts of risk assessment and hazard analysis are reviewed in developing a model for appraising consumer product safety which seeks to integrate the human factors contributions of risk assessment, hazard perception and information processing. The model develops a system of hazard identification, hazard analysis and risk assessment which can be applied to a wide range of consumer products through a series of systematic checklists and matrices, and applies alternative numerical and graphical methods for calculating a final product safety risk assessment score. The model is then applied, in its pilot form, by selected volunteer Trading Standards Departments to a sample of consumer products. A series of questionnaires is used to select the participating Trading Standards Departments, to explore the contribution of potential subjective influences, and to establish views regarding the usability and reliability of the model and any preferences for the risk assessment scoring system used. The outcome of the two-stage hazard analysis and risk assessment process is examined to determine the consistency of the hazard analysis results and of the final decisions regarding the safety of the sample products, and to determine any correlation between the decisions made using the model and the alternative risk assessment scoring methods. The research also identifies a number of opportunities for future work and indicates a number of areas where further work has already begun.
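One simple numerical scheme of the general kind described, in which each identified hazard receives a score combining severity and likelihood factors and the worst hazard sets the product's risk band, is sketched below; the factors, scales and bands of the model developed in the thesis are not reproduced here.

```python
# Illustrative only: a per-hazard score (severity x likelihood of injury x
# likelihood of exposure), with the worst hazard setting the product band.
hazards = [
    # (hazard, severity 1-4, likelihood of injury 1-4, likelihood of exposure 1-4)
    ("sharp edge on housing", 2, 3, 4),
    ("small detachable part", 4, 2, 2),
    ("unstable base", 3, 2, 3),
]

def hazard_score(severity, injury, exposure):
    return severity * injury * exposure   # range 1..64 on these example scales

scores = {name: hazard_score(s, i, e) for name, s, i, e in hazards}
worst = max(scores, key=scores.get)
overall = scores[worst]
band = "high" if overall >= 32 else "medium" if overall >= 12 else "low"
print(scores, "->", worst, band)
```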
Abstract:
Cyclothialidine, a natural product isolated from Streptomyces filipinensis NR0484, has been shown to be a potent and selective inhibitor of the bacterial enzyme DNA gyrase. Gyrase inhibition results in cell death, and the enzyme is the target of several currently used antibiotics. Cyclothialidine showed poor activity against whole bacterial cells, highlighting scope for improvement in cell membrane permeability if the full potential of this new class of antibiotics is to be realised. Structurally, cyclothialidine contains a 12-membered lactone ring which is partly integrated into a pentapeptide chain, with a substituted aromatic moiety bordering the lactone. Retrosynthetically it can be traced back to cis-3-hydroxyproline, 3,5-dihydroxy-2,6-dimethylbenzoic acid and four commercially available amino acids: two serines, one cysteine and one alanine. In this work, a model of cyclothialidine was synthesised in order to establish the methodology for more complex compounds. Analogues with hydroxy-, dihydroxy- and dihydroxymethyl-substituted aromatic moieties were then prepared to ensure that successful protection methods could be performed and the pharmacophore synthesised. The key aromatic moiety, 2,6-dimethyl-3,5-dihydroxybenzoic acid, was produced via two successive Mannich reaction/reduction steps. Acid protection using 4-nitrobenzyl bromide and TBDMS hydroxyl protection, followed by bromination of one methyl group, afforded the desired intermediate. Reaction with a serine/cysteine dipeptide, followed by deprotection and cyclisation under Mitsunobu conditions, led to the 12-membered lactone. An amine-substituted aromatic analogue, and replacement of the cysteine sulphur by oxygen, were attempted but without success. In an effort to improve cell permeability, a conjugate was synthesised between the pharmacophore and a cholesterol moiety. It was hoped that the steroid fragment would increase potency by escorting the molecule through the lipid environment of the cell membrane. The pharmacophore and the conjugate were tested against a variety of bacterial strains, but the conjugate failed to improve activity.
Abstract:
There is a great deal of literature about the initial stages of innovative design: the process whereby a completely new product is conceived, invented and developed. In industry, however, the continuing success of a company is more often achieved by improving or developing existing designs to maintain their marketability. Unfortunately, this process of design by evolution is less well documented. This thesis reports the way in which the process was improved for the sponsoring company. The improvements were achieved by implementing a new form of computer-aided design (C.A.D.) system. The advent of this system enabled the company both to shorten the design and development time and to review the principles underlying its existing design procedures. C.A.D. was a new venture for the company, and care had to be taken to ensure that the new procedures were compatible with the existing design office environment; in particular, they had to be acceptable to the design office staff. The C.A.D. system that was produced guides the designer from the draft specification to the first prototype layout. The computer presents the consequences of the designer's decisions clearly and fully, often by producing charts and sketches. The C.A.D. system and the necessary peripheral facilities were implemented, monitored and maintained. The system structure was left sufficiently flexible for maintenance to be undertaken quickly and effectively. The problems encountered during implementation are well documented in this thesis.
Abstract:
This is an exploratory study in a field which previously was virtually unexplored. The aim is to identify, for the benefit of innovators, the influence of industrial design on the commercial success of new science-based products used for professional and industrial purposes. The study is a contribution to the theory of success and failure in industrial innovation. The study begins by defining the terminology. To place the investigation in context, there is then a review of past attempts by official policy-making bodies to improve the competitiveness of British products of manufacture through good design. To elucidate the meaning of good design, attempts to establish a coherent philosophy of style in British products of manufacture during the same period are also reviewed. Following these reviews, empirical evidence is presented to identify what actually takes place in successful firms when industrial design is allocated a role in the process of technological innovation. The evidence comprises seven case studies of new science-based products used for professional or industrial purposes which have received Design Council Awards. To facilitate an objective appraisal, evidence was obtained by conducting separate semi-structured interviews, the detail of which is described, with senior personnel in innovating firms, with industrial design consultants, and with professional users. The study suggests that the likelihood of commercial success in technological innovation is greater when the form, configuration, and the overall appearance of a new product, together with the detail which delineates them, are consciously and expertly controlled. Moreover, uncertainty in innovation is likely to be reduced if the appearance of a new product is consciously designed to facilitate recognition and comprehension. Industrial design is an especially significant factor when a firm innovates against a background of international competition and comparable levels of technological competence in rival firms. The likelihood of success in innovation is enhanced if design is allocated a role closely identified with the total needs of the user and discrete from the engineering function in company organisation. Recent government measures, initiated since this study began, are corroborative of the findings.
Abstract:
Concurrent engineering and design for manufacture and assembly strategies have become pervasive in a wide array of industrial settings. These strategies have generally focused on product and process design issues based on capability concerns, and they have historically been justified using cost-savings calculations that focus on easily quantifiable costs such as raw material savings or manufacturing or assembly operations no longer required. It is argued herein that neither the focus of the strategies nor the means of justification is adequate. Product and process design strategies should include both capability and capacity concerns, and justification procedures should include the financial effects that the product and process changes would have on the entire company. The authors of this paper take this more holistic view of the problem and examine an innovative new design strategy using a comprehensive enterprise simulation tool. The results indicate that both the design strategy and the simulator show promise for further industrial use. © 2001 Elsevier Science B.V. All rights reserved.
Abstract:
The work described in this thesis focuses on the use of a design-of-experiments approach in a multi-well mini-bioreactor to enable the rapid establishment of high-yielding production-phase conditions in yeast, an increasingly popular host system in both academic and industrial laboratories. Using green fluorescent protein secreted from the yeast Pichia pastoris, a scalable predictive model of protein yield per cell was derived from 13 sets of conditions, each combining three factors (temperature, pH and dissolved oxygen) at three levels, and was directly transferable to a 7 L bioreactor. This was in clear contrast to the situation in shake flasks, where the process parameters cannot be tightly controlled. By further optimising both the accumulation of cell density in batch and the fed-batch induction regime, additional yield improvement was found to be additive to the per-cell yield of the model. A separate study also demonstrated that improving biomass improved product yield in a second yeast species, Saccharomyces cerevisiae. Investigations of cell wall hydrophobicity in high-cell-density P. pastoris cultures indicated that cell wall hydrophobin (protein) composition changes with growth phase, the cells becoming more hydrophobic in log growth than in the lag or stationary phases, possibly owing to an increased occurrence of proteins associated with cell division. Finally, the modelling approach was validated in mammalian cells, demonstrating its flexibility and robustness. In summary, the strategy presented in this thesis has the benefit of reducing process development time in recombinant protein production, directly from bench to bioreactor.
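Thirteen runs of three factors at three levels is consistent with a Box-Behnken design with a single centre point, and the predictive model can be sketched as a quadratic response surface fitted by least squares. The design assignment and all yield values below are assumptions for illustration, not the thesis data.

```python
# Hedged sketch: fit a quadratic response-surface model of per-cell yield to
# a 13-run, three-factor, three-level design (coded levels -1/0/+1).
import numpy as np

# Columns: temperature, pH, dissolved oxygen (coded levels); 12 edge points
# plus one centre point, as in a Box-Behnken design.
X = np.array([
    [-1, -1, 0], [1, -1, 0], [-1, 1, 0], [1, 1, 0],
    [-1, 0, -1], [1, 0, -1], [-1, 0, 1], [1, 0, 1],
    [0, -1, -1], [0, 1, -1], [0, -1, 1], [0, 1, 1],
    [0, 0, 0],
], dtype=float)
# Illustrative per-cell yield responses (arbitrary units), not measured data.
y = np.array([3.1, 2.4, 3.8, 3.0, 2.2, 1.9, 3.5, 3.2, 2.6, 3.3, 3.9, 4.1, 4.4])

def quadratic_terms(x):
    t, p, d = x
    return [1, t, p, d, t * p, t * d, p * d, t * t, p * p, d * d]

A = np.array([quadratic_terms(row) for row in X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Predict yield at a new (coded) condition, e.g. pH slightly above the centre.
print(float(np.dot(quadratic_terms([0.0, 0.5, 0.0]), coef)))
```

The fitted surface can then be searched for the factor settings predicting the highest per-cell yield, which is the sense in which such a model supports rapid establishment of production-phase conditions before transfer to the larger bioreactor.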