44 results for "systematic machine design"
Abstract:
Requirements for systems to continue to operate satisfactorily in the presence of faults have led to the development of techniques for the construction of fault-tolerant software. This thesis addresses the problem of error detection and recovery in distributed systems which consist of a set of communicating sequential processes. A method is presented for the 'a priori' design of conversations for this class of distributed system. Petri nets are used to represent the state of, and to solve state reachability problems for, concurrent systems. The dynamic behaviour of the system can be characterised by a state-change table derived from the state reachability tree. Systematic conversation generation is possible by defining a closed boundary on any branch of the state-change table. Relating the state-change table to process attributes ensures that all necessary processes are included in the conversation. The method also ensures properly nested conversations. An implementation of the conversation scheme using the concurrent language occam is proposed. The structure of the conversation is defined using the special features of occam. The proposed implementation gives a structure which is independent of the application and of the number of processes involved. Finally, the integrity of inter-process communications is investigated. The basic communication primitives used in message-passing systems are seen to have deficiencies when applied to systems with safety implications. Using a Petri net model, a boundary for a time-out mechanism is proposed which will increase the integrity of a system which involves inter-process communications.
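The reachability analysis that underpins this kind of conversation design can be sketched as a breadth-first enumeration of a Petri net's markings. The net below (two processes entering and leaving a shared conversation boundary), its place names and its transitions are invented for illustration and are not taken from the thesis; it is a minimal sketch that assumes the reachability set is finite:

```python
from collections import deque

def reachable_markings(places, transitions, initial):
    """Breadth-first enumeration of a Petri net's reachability set.

    `transitions` maps a name to a (consume, produce) pair of dicts
    from place name to token count. Returns every marking reachable
    from `initial`, assuming that set is finite.
    """
    start = tuple(initial.get(p, 0) for p in places)
    seen, queue = {start}, deque([start])
    while queue:
        marking = queue.popleft()
        state = dict(zip(places, marking))
        for consume, produce in transitions.values():
            # A transition is enabled when every input place has enough tokens.
            if all(state.get(p, 0) >= n for p, n in consume.items()):
                nxt = dict(state)
                for p, n in consume.items():
                    nxt[p] -= n
                for p, n in produce.items():
                    nxt[p] = nxt.get(p, 0) + n
                t = tuple(nxt[p] for p in places)
                if t not in seen:
                    seen.add(t)
                    queue.append(t)
    return seen

# Two processes that must both be inside the conversation before it can exit:
places = ["p1_out", "p1_in", "p2_out", "p2_in"]
transitions = {
    "enter1": ({"p1_out": 1}, {"p1_in": 1}),
    "enter2": ({"p2_out": 1}, {"p2_in": 1}),
    "exit":   ({"p1_in": 1, "p2_in": 1}, {"p1_out": 1, "p2_out": 1}),
}
markings = reachable_markings(places, transitions, {"p1_out": 1, "p2_out": 1})
print(len(markings))  # 4 reachable markings
```

A state-change table of the kind described in the abstract would be derived by recording, for each marking, which transition leads to which successor.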
Abstract:
This research investigates the general user interface problems in using networked services. Some of the problems are: users have to recall machine names and procedures to invoke networked services; interactions with some of the services are by means of menu-based interfaces which are quite cumbersome to use; inconsistencies exist between the interfaces for different services because they were developed independently. These problems have to be removed so that users can use the services effectively. A prototype system has been developed to help users interact with networked services. This consists of software which gives the user an easy and consistent interface to the various services. The prototype is based on a graphical user interface and includes the following applications: Bath Information & Data Services; electronic mail; file editor. The prototype incorporates an online help facility to assist users in using the system. The prototype can be divided into two parts: the user interface part, which manages interaction with the user, and the communication part, which enables communication with networked services to take place. The implementation is carried out using an object-oriented approach in which both the user interface part and the communication part are objects. The essential characteristics of object-orientation (abstraction, encapsulation, inheritance and polymorphism) can all contribute to the better design and implementation of the prototype. The Smalltalk Model-View-Controller (MVC) methodology has been the framework for the construction of the prototype user interface. The purpose of the development was to study the effectiveness of users' interaction with networked services. Once the prototype was complete, test users were asked to use the system so that its effectiveness could be evaluated. The evaluation of the prototype is based on observation, i.e. observing the way users use the system, and on the opinion ratings given by the users.
Recommendations to improve the prototype further are given based on the results of the evaluation.
Abstract:
For more than forty years, research has been ongoing into the use of the computer in the processing of natural language. During this period methods have evolved, with various parsing techniques and grammars coming to prominence. Problems still exist, not least in the field of Machine Translation. However, one of the successes in this field is the translation of sublanguage. The present work reports on Deterministic Parsing, a relatively new parsing technique, and its application to the sublanguage of an aircraft maintenance manual for Machine Translation. The aim has been to investigate the practicability of using Deterministic Parsers in the analysis stage of a Machine Translation system. Machine Translation, sublanguage and parsing are described in general terms, with a review of Deterministic Parsing systems pertinent to this research presented in detail. The interaction between Machine Translation, sublanguage and parsing, including Deterministic Parsing, is also highlighted. Two types of Deterministic Parser have been investigated: a Marcus-type parser, based on the basic design of the original Deterministic Parser (Marcus, 1980), and an LR-type Deterministic Parser for natural language, based on the LR parsing algorithm. In total, four Deterministic Parsers have been built and are described in the thesis. Two of the Deterministic Parsers are prototypes from which the remaining two parsers, to be used on sublanguage, have been developed. This thesis reports the results of parsing by the prototypes: a Marcus-type parser and an LR-type parser which have a similar grammatical and linguistic range to the original Marcus parser. The Marcus-type parser uses a grammar of production rules, whereas the LR-type parser employs a Definite Clause Grammar (DCG).
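The shift-reduce cycle at the heart of LR-style parsing can be illustrated with a toy deterministic parser. The grammar, lexicon and greedy reduce-after-every-shift policy below are simplifications invented for this sketch; a real LR parser (and the Marcus design, with its lookahead buffer) decides when to reduce from precomputed tables rather than greedily:

```python
def shift_reduce_parse(tokens, lexicon, rules):
    """Greedy deterministic shift-reduce parse (an LR-style sketch).

    `lexicon` maps words to lexical categories; `rules` maps right-hand-side
    tuples to a left-hand-side category. After each shift, reductions are
    applied to the top of the stack for as long as any rule matches.
    Returns the final stack of categories; ['S'] means a complete sentence.
    """
    stack = []
    for word in tokens:
        stack.append(lexicon[word])            # shift
        reduced = True
        while reduced:                         # reduce while any rule matches
            reduced = False
            for rhs, lhs in rules.items():
                n = len(rhs)
                if tuple(stack[-n:]) == rhs:
                    stack[-n:] = [lhs]         # replace RHS with LHS
                    reduced = True
                    break
    return stack

lexicon = {"the": "Det", "mechanic": "N", "checks": "V", "engine": "N"}
rules = {("Det", "N"): "NP", ("V", "NP"): "VP", ("NP", "VP"): "S"}
print(shift_reduce_parse("the mechanic checks the engine".split(), lexicon, rules))
# -> ['S']
```

The parse is deterministic in the sense the abstract uses: no backtracking occurs, so each word is examined once.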
Abstract:
Cellular manufacturing is widely acknowledged as one of the key approaches to achieving world-class performance in batch manufacturing operations. The design of cellular manufacturing systems (CMS) is therefore crucial in determining a company's competitiveness. This thesis postulated that, in order to be effective, the design of CMS should be not only systematic but also systemic. A systemic design uses the concepts of the body of work known as the 'systems approach' to ensure that a truly effective CMS is defined. The thesis examined the systems approach and created a systemic framework against which existing approaches to the design of CMS were evaluated. The most promising of these, Manufacturing Systems Engineering (MSE), was further investigated using a series of cross-sectional case studies. Although, in practice, MSE proved to be less than systemic, it appeared to produce significant benefits. This seemed to suggest that CMS design did not need to be systemic to be effective. However, further longitudinal case studies showed that the benefits claimed were at an operational level rather than at a business level, and that the performance of the whole system had not been evaluated. The deficiencies identified in the existing approaches to designing CMS were then addressed by the development of a novel CMS design methodology that fully utilised systems concepts. A key aspect of the methodology was the use of the Whole Business Simulator (WBS), a modelling and simulation tool that enabled the evaluation of CMS at both operational and business levels. The most contentious aspects of the methodology were tested on a significant and complex case study. The results of the exercise indicated that the systemic methodology was feasible.
Abstract:
Advances in both computer technology and the mathematical models needed to capture the geometry of arbitrarily shaped objects have led to the development in this thesis of a surface generation package called 'IBSCURF', aimed at providing a more economically viable solution to free-form surface manufacture. A suite of computer programs written in FORTRAN 77 has been developed to provide computer aids for every aspect of work in designing and machining free-form surfaces. A vector-valued parametric method was used for shape description, and a lofting technique was employed for the construction of the surface. The development of the package 'IBSCURF' consists of two phases. The first deals with CAD. The design process commences with the definition of cross-sections, which are represented by uniform B-spline curves as approximations to given polygons. The order of the curve and the position and number of the polygon vertices can be used as parameters for modification to achieve the required curves. When the definitions of the sectional curves are complete, the surface is interpolated over them by cubic cardinal splines. To exercise the CAD function of the package in designing a mould for a plastic handle, a mathematical model was developed. To facilitate the integration of design and machining using the mathematical representation of the surface, the second phase of the package is concerned with CAM: it enables the generation of tool offset positions for ball-nosed cutters, and a general post-processor has been developed which automatically generates NC tape programs for any CNC milling machine. The two phases of these programs have been successfully implemented as a CAD/CAM package for free-form surfaces on the VAX 11/750 super-minicomputer, with graphics facilities for displaying drawings interactively on the terminal screen. The development of this package has been beneficial in all aspects of the design and machining of free-form surfaces.
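A single segment of the uniform cubic B-spline curves used in the design phase can be evaluated directly from the standard basis polynomials. This is a textbook formulation rather than IBSCURF's FORTRAN code, and the control points are invented for illustration:

```python
def cubic_bspline_point(p0, p1, p2, p3, t):
    """Point on one uniform cubic B-spline segment, t in [0, 1].

    The curve approximates (does not interpolate) the control polygon
    p0..p3, so moving a polygon vertex reshapes the curve locally --
    exactly the modification mechanism the design process relies on.
    """
    b0 = (1 - t) ** 3 / 6.0
    b1 = (3 * t**3 - 6 * t**2 + 4) / 6.0
    b2 = (-3 * t**3 + 3 * t**2 + 3 * t + 1) / 6.0
    b3 = t**3 / 6.0                       # basis functions sum to 1
    return tuple(b0 * a + b1 * b + b2 * c + b3 * d
                 for a, b, c, d in zip(p0, p1, p2, p3))

# Midpoint of a segment over four symmetric control vertices:
print(cubic_bspline_point((0, 0), (1, 2), (2, 2), (3, 0), 0.5))
```

A cross-section curve is traced by sliding this four-point window along the control polygon; lofting then interpolates a surface across the sectional curves.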
Abstract:
Changes in modern structural design have created a demand for products which are light but possess high strength. The objective is a reduction in fuel consumption and in the weight of materials, to satisfy both economic and environmental criteria. Cold roll forming has the potential to fulfil this requirement. The bending process is controlled by the shape of the profile machined on the periphery of the rolls. A CNC lathe can machine complicated profiles to a high standard of precision, but the expertise of a numerical control programmer is required. A computer program was developed during this project, using the expert system concept, to calculate tool paths and consequently to expedite the procurement of the machine control tapes whilst removing the need for a skilled programmer. Codifying human expertise and encapsulating that knowledge within computer memory removes the dependency on highly trained people, whose services can be costly, inconsistent and unreliable. A successful cold roll forming operation, where the product is geometrically correct and free from visual defects, is not easy to attain. The geometry of the sheet after travelling through the rolling mill depends on the residual strains generated by the elastic-plastic deformation. Accurate evaluation of the residual strains can provide the basis for predicting the geometry of the section. A study of geometric and material non-linearity, yield criteria, material hardening and stress-strain relationships was undertaken in this research project. The finite element method was chosen to provide a mathematical model of the bending process and, to ensure efficient manipulation of the large stiffness matrices, the frontal solution was applied. A series of experimental investigations provided data for comparison with corresponding values obtained from the theoretical modelling.
A computer simulation capable of predicting, prior to the manufacture of the rolls, that a design will be satisfactory would allow effort to be concentrated on devising an optimum design in which costs are minimised.
Abstract:
The work presented in this thesis is concerned with the dynamic behaviour of structural joints which are both loaded and excited normal to the joint interface. Since the forces on joints are transmitted through their interface, the surface texture of joints was carefully examined. A computerised surface measuring system was developed and computer programs were written. Surface flatness was functionally defined, measured and quantised into a form suitable for the theoretical calculation of the joint stiffness. Dynamic stiffness and damping were measured at various preloads for a range of joints with different surface textures. Dry, clean joints and lubricated joints were tested, and the results indicated an increase in damping for the lubricated joints by a factor of between 30 and 100. A theoretical model for the computation of the stiffness of dry, clean joints was built. The model is based on the theory that the elastic recovery of joints is due to the recovery of the material behind the loaded asperities. It takes into account, in a quantitative manner, the flatness deviations present on the surfaces of the joint. The theoretical results were found to be in good agreement with those measured experimentally. It was also found that theoretical assessment of the joint stiffness could be carried out using a different model based on the recovery of loaded asperities into a spherical form. Stepwise procedures are given for designing a joint having a particular stiffness. A theoretical model for the loss factor of dry, clean joints was also built. The theoretical results are in reasonable agreement with those experimentally measured. The theoretical models for the stiffness and loss factor were employed to evaluate the second natural frequency of the test rig. The results are in good agreement with the experimentally measured natural frequencies.
Abstract:
With the competitive challenge facing business today, the need to keep costs down and quality up is a matter of survival. One way in which wire manufacturers can meet this challenge is to possess a thorough understanding of deformation, friction and lubrication during the wire drawing process, and thereby to make good decisions regarding the selection and application of lubricants as well as the die design. Friction, lubrication and die design during wire drawing are thus the subject of this study. Although theoretical and experimental investigations have been carried out ever since the establishment of wire drawing technology, many problems remain unsolved. It is therefore necessary to conduct further research on traditional and fundamental subjects such as the mechanics of deformation, friction, lubrication and die design in wire drawing. Drawing experiments were carried out on an existing bull-block under different cross-sectional area reductions, different speeds and different lubricants. Instrumentation to measure drawing load and drawing speed was set up and connected to the wire drawing machine, together with a data acquisition system. A die box connected to the existing die holder for use with dry soap lubricant was designed and tested. The experimental results, in terms of drawing stress versus percentage area reduction curves under different drawing conditions, were analysed and compared. The effects on drawing stress of friction, lubrication, drawing speed and the pressure die nozzle are discussed. In order to determine the flow stress of the material during deformation, tensile tests were performed on an Instron universal testing machine, using the wires drawn under different area reductions. A polynomial function is used to correlate the flow stress of the material with the plastic strain, and a general computer program has been written to determine the coefficients of the stress-strain function.
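The coefficient-finding step described above amounts to an ordinary least-squares polynomial fit, which can be sketched via the normal equations. The flow-stress data below are hypothetical, generated from an assumed quadratic stress-strain law so the recovered coefficients can be checked; the thesis's own program and measurements are not reproduced here:

```python
def polyfit(xs, ys, degree):
    """Least-squares polynomial fit via the normal equations.

    Returns coefficients c such that y ~ c[0] + c[1]*x + ... + c[degree]*x**degree.
    Plain Gaussian elimination with partial pivoting keeps the sketch
    dependency-free; a production code would use an orthogonal method.
    """
    n = degree + 1
    # Normal equations A c = b: A[i][j] = sum(x^(i+j)), b[i] = sum(y * x^i).
    A = [[sum(x ** (i + j) for x in xs) for j in range(n)] for i in range(n)]
    b = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(n)]
    for col in range(n):                       # forward elimination
        pivot = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[pivot] = A[pivot], A[col]
        b[col], b[pivot] = b[pivot], b[col]
        for row in range(col + 1, n):
            f = A[row][col] / A[col][col]
            A[row] = [a - f * p for a, p in zip(A[row], A[col])]
            b[row] -= f * b[col]
    c = [0.0] * n
    for i in reversed(range(n)):               # back substitution
        c[i] = (b[i] - sum(A[i][j] * c[j] for j in range(i + 1, n))) / A[i][i]
    return c

# Hypothetical flow stress (MPa) at increasing plastic strain,
# generated from stress = 400 + 900*e - 1000*e**2:
strain = [0.05, 0.10, 0.15, 0.20, 0.25]
stress = [442.5, 480.0, 512.5, 540.0, 562.5]
coeffs = polyfit(strain, stress, 2)
print(coeffs)  # recovers [400.0, 900.0, -1000.0] to within rounding
```

With the coefficients in hand, the fitted polynomial supplies the flow stress at any plastic strain needed by the drawing-stress analysis.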
The residual lubricant film on the steel wire after drawing was examined both radially and longitudinally using an SEM and an optical microscope. The lubricant film on the drawn wire was clearly observed. Micro-analysis by SEM therefore provides a means of assessing friction and lubrication in wire drawing.
Abstract:
Introduction: Adjuvants potentiate immune responses, reducing the amount and dosing frequency of antigen required for inducing protective immunity. Adjuvants are of special importance when considering subunit, epitope-based or more unusual vaccine formulations lacking significant innate immunogenicity. While numerous adjuvants are known, only a few are licensed for human use; principally alum, and squalene-based oil-in-water adjuvants. Alum, the most commonly used, is suboptimal. There are many varieties of adjuvant: proteins, oligonucleotides, drug-like small molecules and liposome-based delivery systems with intrinsic adjuvant activity being perhaps the most prominent. Areas covered: This article focuses on small molecules acting as adjuvants, with the author reviewing their current status while highlighting their potential for systematic discovery and rational optimisation. Known small molecule adjuvants (SMAs) can be synthetically complex natural products, small oligonucleotides or drug-like synthetic molecules. The author provides examples of each class, discussing adjuvant mechanisms relevant to SMAs, and exploring the high-throughput discovery of SMAs. Expert opinion: SMAs, particularly synthetic drug-like adjuvants, are amenable to the plethora of drug-discovery techniques able to optimise the properties of biologically active small molecules. These range from laborious synthetic modifications to modern, rational, effort-efficient computational approaches, such as QSAR and structure-based drug design. In principle, any property or characteristic can thus be designed in or out of compounds, allowing us to tailor SMAs to specific biological functions, such as targeting specific cells or pathways, in turn affording the power to tailor SMAs to better address different diseases.
Abstract:
Lean is usually associated with the ‘operations’ of a manufacturing enterprise; however, there is a growing awareness that these principles may be transferred readily to other functions and sectors. The application to knowledge-based activities such as engineering design is of particular relevance to UK plc. Hence, the purpose of this study has been to establish the state-of-the-art, in terms of the adoption of Lean in new product development, by carrying out a systematic review of the literature. The authors' findings confirm the view that Lean can be applied beneficially away from the factory; that an understanding and definition of value is key to success; that a set-based (or Toyota methodology) approach to design is favoured together with the strong leadership of a chief engineer; and that the successful implementation requires organization-wide changes to systems, practices, and behaviour. On this basis it is felt that this review paper provides a useful platform for further research in this topic.
Abstract:
Objectives: Are behavioural interventions effective in reducing the rate of sexually transmitted infections (STIs) among genitourinary medicine (GUM) clinic patients? Design: Systematic review and meta-analysis of published articles. Data sources: Medline, CINAHL, Embase, PsychINFO, Applied Social Sciences Index and Abstracts, Cochrane Library Controlled Clinical Trials Register, National Research Register (1966 to January 2004). Review methods: Randomised controlled trials of behavioural interventions in sexual health clinic patients were included if they reported change to STI rates or self reported sexual behaviour. Trial quality was assessed using the Jadad score and results pooled using random effects meta-analyses where outcomes were consistent across studies. Results: 14 trials were included; 12 based in the United States. Experimental interventions were heterogeneous and most control interventions were more structured than typical UK care. Eight trials reported data on laboratory confirmed infections, of which four observed a greater reduction in their intervention groups (in two cases this result was statistically significant, p<0.05). Seven trials reported consistent condom use, of which six observed a greater increase among their intervention subjects. Results for other measures of sexual behaviour were inconsistent. Success in reducing STIs was related to trial quality, use of social cognition models, and formative research in the target population. However, effectiveness was not related to intervention format or length. Conclusions: While results were heterogeneous, several trials observed reductions in STI rates. The most effective interventions were developed through extensive formative research. These findings should encourage further research in the United Kingdom where new approaches to preventing STIs are urgently required.
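Where outcomes were consistent across studies, the review pooled them with random-effects meta-analyses; the widely used DerSimonian-Laird estimator gives a feel for this step. The log odds ratios and variances below are invented for illustration and do not correspond to the review's trials:

```python
from math import sqrt

def dersimonian_laird(effects, variances):
    """Pool study effects under a random-effects model (DerSimonian-Laird).

    Returns (pooled effect, its standard error, between-study variance tau^2).
    For STI outcomes the effects would typically be log odds ratios.
    """
    w = [1.0 / v for v in variances]                     # fixed-effect weights
    fixed = sum(wi * y for wi, y in zip(w, effects)) / sum(w)
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, effects))  # heterogeneity
    k = len(effects)
    denom = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / denom)               # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]       # random-effects weights
    pooled = sum(wi * y for wi, y in zip(w_star, effects)) / sum(w_star)
    se = sqrt(1.0 / sum(w_star))
    return pooled, se, tau2

# Hypothetical log odds ratios and variances from four trials:
log_or = [-0.35, -0.10, -0.52, 0.08]
var = [0.04, 0.09, 0.06, 0.12]
pooled, se, tau2 = dersimonian_laird(log_or, var)
print(round(pooled, 3), round(se, 3))
```

When the trials disagree (as the heterogeneous interventions here did), tau-squared grows and the pooled estimate's standard error widens accordingly, which is why the review pooled only outcomes that were consistent across studies.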
Abstract:
The purpose of this concise paper is to propose, with evidence gathered through a systematic evaluation of an academic development programme in the UK, that training in the use of new and emerging learning technologies should be holistically embedded in every learning and training opportunity in learning, teaching and assessment in higher education, and not offered only as stand-alone modules or one-off opportunities. If the future of learning in higher education is to be secured, universities cannot afford to disregard digital literacy as an expected professional skill for their entire staff.
Abstract:
Objective - To evaluate behavioural components and strategies associated with increased uptake and effectiveness of screening for coronary heart disease and diabetes, with an implementation science focus. Design - Realist review. Data sources - PubMed, Web of Knowledge, Cochrane Database of Systematic Reviews, Cochrane Controlled Trials Register and reference chaining. Searches were limited to English-language studies published since 1990. Eligibility criteria - Eligible studies evaluated interventions designed to increase the uptake of cardiovascular disease (CVD) and diabetes screening and examined behavioural and/or strategic designs. Studies were excluded if they evaluated only changes in risk factors or cost-effectiveness. Results - In 12 eligible studies, several different intervention designs and evidence-based strategies were evaluated. Salient themes were the effects of feedback on behaviour change and the benefits of health dialogues over simple feedback. Studies provide mixed evidence about the benefits of these intervention constituents, which are suggested to be situation- and design-specific, broadly supporting their use but highlighting concerns about the fidelity of intervention delivery, raising implementation science issues. Three studies examined the effects of informed choice or of loss- versus gain-framed invitations, finding no effect on screening uptake but highlighting opportunistic screening as more successful than an invitation letter for recruiting patients at higher CVD and diabetes risk, with no differences in outcomes once recruited. Two studies examined differences between attenders and non-attenders, finding higher risk factors among non-attenders and higher rates of diagnosed CVD and diabetes among those who later dropped out of longitudinal studies.
Conclusions - If the risk and prevalence of these diseases are to be reduced, interventions must take into account what we know about effective health behaviour change mechanisms, monitor delivery by trained professionals and examine the possibility of tailoring programmes according to contexts such as risk level to reach those most in need. Further research is needed to determine the best strategies for lifelong approaches to screening.
Abstract:
Purpose: The servitization of manufacturing is a diverse and complex field of research interest. The purpose of this paper is to provide an integrative and organising lens for viewing the various contributions to knowledge production from those research communities addressing servitization. To achieve this, the paper sets out to address two principal questions, namely: where are the knowledge stocks and flows amongst the research communities, and what are the generic research concerns being addressed by these communities? Design/methodology/approach: Using an evidence-based approach, the authors have performed a systematic review of the research literature associated with the servitization of manufacturing. This investigation incorporates a descriptive and thematic analysis of 148 academic and scholarly papers from 103 different lead authors in 68 international peer-reviewed journals. Findings: The work proposes support for the existence of distinct research communities, namely services marketing, service management, operations management, product-service systems, and service science management and engineering, which are contributing to knowledge production in the servitization of manufacturing. Knowledge stocks within all communities associated with research in the servitization of manufacturing have increased dramatically since the mid-1990s. The trends clearly reveal that the operations community is in receipt of the majority of citations relating to the servitization of manufacturing. In terms of knowledge flows, it is apparent that the more mature communities are drawing on more locally produced knowledge stocks, whereas the emergent communities are drawing on a knowledge base more evenly distributed across all the communities. The results are indicative of varying degrees of interdependency amongst the communities.
The generic research concerns being addressed within the communities are associated with the concepts of product-service differentiation, competitive strategy, customer value, customer relationships and product-service configuration. Originality/value: This research has further developed and articulated the identities of distinct researcher communities actively contributing to knowledge production in the servitization of manufacturing, and to what extent they are pursuing common research agendas. This study provides an improved descriptive and thematic awareness of the resulting body of knowledge, allowing the field of servitization to progress in a more informed and multidisciplinary fashion. © Emerald Group Publishing Limited.
Abstract:
BACKGROUND: The use of quality of life (QoL) instruments in menorrhagia research is increasing, but there is concern that not enough emphasis is placed on patient focus in these measurements, i.e. on issues which are of importance to patients and reflect their experiences and concerns (clinical face validity). The objective was to assess the quality of QoL instruments in studies of menorrhagia. STUDY DESIGN: A systematic review of published research. Papers were identified through MEDLINE (1966-April 2000), EMBASE (1980-April 2000), Science Citation Index (1981-April 2000), Social Science Citation Index (1981-April 2000), CINAHL (1982-1999) and PsychLIT (1966-1999), and by manual searching of the bibliographies of known primary and review articles. Studies were selected if they assessed women with menorrhagia for life quality, either developing QoL instruments or applying them as an outcome measure. Selected studies were assessed for the quality of their QoL instruments using a 17-item checklist comprising 10 items on clinical face validity (issues of relevance to patients' expectations and concerns) and 7 items on measurement properties (such as reliability, responsiveness, etc.). RESULTS: A total of 19 articles, 8 on instrument development and 11 on application, were included in the review. The generic Short Form 36 Health Survey Questionnaire (SF36) was used in 12/19 (63%) studies. Only two studies developed new menorrhagia-specific QoL instruments, and these complied with 7/17 (41%) and 10/17 (59%) of the quality criteria. Quality assessment showed that only 7/19 (37%) studies complied with more than half the criteria for face validity, whereas 17/19 (90%) studies complied with more than half of the criteria for measurement properties (P = 0.0001). CONCLUSION: Among existing QoL instruments, there is good compliance with the quality criteria for measurement properties but not with those for clinical face validity.
There is a need to develop methodologically sound, disease-specific QoL instruments in menorrhagia, focusing on both face validity and measurement properties.
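The comparison of compliance proportions reported above (7/19 for face validity versus 17/19 for measurement properties) can be illustrated with a large-sample two-proportion z-test. The abstract does not state which test produced P = 0.0001, so this sketch is only indicative and will not exactly reproduce that figure:

```python
from math import sqrt, erfc

def two_proportion_z(success1, n1, success2, n2):
    """Two-sided z-test comparing two independent proportions.

    A standard large-sample approximation; the original study may have
    used a different (e.g. exact) procedure.
    """
    p1, p2 = success1 / n1, success2 / n2
    pooled = (success1 + success2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    return z, erfc(abs(z) / sqrt(2))   # two-sided p-value via the normal tail

# Face-validity vs measurement-property compliance from the review:
z, p = two_proportion_z(7, 19, 17, 19)
print(round(z, 2), round(p, 4))
```

Either way, the gap between the two compliance rates is large enough that the qualitative conclusion (good compliance on measurement properties, poor on face validity) is insensitive to the exact test chosen.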