983 results for DESIGN BASIS ACCIDENTS
Abstract:
Product reliability and environmental performance have become critical elements of a product's specification and design. To obtain a high level of confidence in the reliability of a design, it is customary to test it under realistic conditions in a laboratory. The objective of this work is to examine the feasibility of designing mechanical test rigs that exhibit prescribed dynamical characteristics. The product under test is then attached to the rig and excitation is applied to the rig, which transmits representative vibration levels into the product. The philosophical considerations made at the outset of the project are discussed, as they form the basis for the resulting design methodologies. An attempt is made to identify the parameters of a test rig directly from the spatial model derived during the system identification process; it is shown to be impossible to identify a feasible test rig design using this technique. A finite dimensional optimal design methodology is then developed which identifies the parameters of a discrete spring/mass system that is dynamically similar to a point coordinate on a continuous structure. This design methodology is incorporated within a further procedure which derives a structure comprising a continuous element and a discrete system. The methodology is used to obtain point coordinate similarity for two planes of motion, which is validated by experimental tests. A limitation of this approach is that multi-coordinate similarity cannot be achieved, owing to an interaction between the discrete system and the continuous element at points away from the coordinate of interest. The work highlights the importance of the continuous element, and a design methodology is developed for continuous structures. This methodology is based upon distributed parameter optimal design techniques and allows an initially poor design estimate to be moved in a feasible direction towards an acceptable design solution.
Cumulative damage theory is used to provide a quantitative method of assessing the quality of dynamic similarity. It is shown that the combination of modal analysis techniques and cumulative damage theory provides a feasible design synthesis methodology for representative test rigs.
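As a concrete illustration of the cumulative damage comparison described above, the following sketch applies Miner's linear damage rule, D = Σ nᵢ/Nᵢ, to compare the fatigue damage accumulated under a rig-transmitted vibration spectrum with that under the real structure's spectrum. The S-N curve constants and load spectra are hypothetical, not values from the thesis.

```python
# Illustrative sketch, not code from the thesis: Miner's linear cumulative
# damage rule, D = sum(n_i / N_i), used to compare the fatigue damage a
# product accumulates on a test rig against that on the real structure.
# The Basquin S-N constants and load spectra below are hypothetical.

def cycles_to_failure(stress_amplitude, C=1e12, m=3.0):
    """Basquin-type S-N curve: N = C * S**(-m)."""
    return C * stress_amplitude ** (-m)

def miner_damage(stress_cycles):
    """stress_cycles: iterable of (stress_amplitude, cycle_count) pairs."""
    return sum(n / cycles_to_failure(s) for s, n in stress_cycles)

# Damage under the structure's service spectrum vs. the rig's spectrum.
structure_spectrum = [(100.0, 5e5), (150.0, 1e5)]
rig_spectrum = [(105.0, 5e5), (145.0, 1e5)]

d_structure = miner_damage(structure_spectrum)
d_rig = miner_damage(rig_spectrum)
similarity = min(d_rig, d_structure) / max(d_rig, d_structure)
print(f"damage ratio (rig/structure): {d_rig / d_structure:.3f}")  # → 1.055
```

A damage ratio near unity indicates good dynamic similarity in the cumulative-damage sense used as the quality metric above.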
Abstract:
The pneumonia caused by Pneumocystis carinii is ultimately responsible for the death of many acquired immunodeficiency syndrome (AIDS) patients. Large doses of trimethoprim and pyrimethamine in combination with a sulphonamide and/or pentamidine suppress the infection but produce serious side-effects and seldom prevent recurrence after treatment withdrawal. However, the partial success of the aforementioned antifolates, and of trimetrexate used alone, does suggest dihydrofolate reductase (DHFR) as a target for the development of antipneumocystis agents. From the DHFR inhibitory activities of 3'-substituted pyrimethamine analogues it was suggested that the 3'-(3'',3''-dimethyltriazen-1''-yl) substituent may be responsible for the greater activity against the P. carinii enzyme than against the mammalian enzyme. Crystallographic and molecular modelling studies revealed considerable geometrical and electronic differences between the triazene and the chemically related formamidine functions that may account for the differences in DHFR inhibitory profiles. Structural and electronic parameters calculated for a series of 3'-(3'',3''-disubstitutedtriazen-1''-yl)pyrimethamine analogues did not correlate with the DHFR inhibitory activities. However, in vitro screening against P. carinii DHFR revealed that the 3''-hydroxyethyl-3''-benzyl analogue was the most active and selective. Models of the active sites of human and P. carinii DHFRs were constructed using DHFR sequence and structural homology data which had identified key residues involved in substrate and cofactor binding. Low energy conformations of the 3'',3''-dimethyl and 3''-hydroxyethyl-3''-benzyl analogues, determined from nuclear magnetic resonance studies and theoretical calculations, were docked by superimposing the diaminopyrimidine fragment onto a previously docked pyrimethamine analogue. Enzyme kinetic data supported the 3''-hydroxyethyl-3''-benzyl moiety being located in the NADPH binding groove.
The 3''-benzyl substituent was able to locate to within 3 Å of a valine residue in the active site of P. carinii DHFR, thereby producing a hydrophobic contact. The equivalent residue in human DHFR is threonine, which is more hydrophilic and less likely to be involved in such a contact. This difference may account for the greater inhibitory activity this analogue has for P. carinii DHFR and provides a basis for future drug design. From an in vivo model of PCP in immunosuppressed rats it was established that the 3''-hydroxyethyl-3''-benzyl analogue was able to reduce the P. carinii burden more effectively with increasing doses, without causing any visible signs of toxicity. However, equivalent doses were not as effective as pentamidine, a current treatment of choice for Pneumocystis carinii pneumonia.
Abstract:
This thesis describes work exploring the application of expert system techniques to the domain of designing durable concrete. The nature of concrete durability design is described and some problems from the domain are discussed. Some related work on expert systems in concrete durability is described. Various implementation languages (PROLOG and OPS5) are considered and rejected in favour of a shell, CRYSTAL3 (later CRYSTAL4). Criteria for useful expert system shells in the domain are discussed, and CRYSTAL4 is evaluated in the light of these criteria. Modules in various sub-domains (mix design, sulphate attack, steel corrosion and alkali-aggregate reaction) are developed and organised under a blackboard system (called DEX). Extensions to the CRYSTAL4 modules are considered for different knowledge representations. These include LOTUS123 spreadsheets implementing models that incorporate some of the mathematical knowledge in the domain. Design databases are used to represent tabular design knowledge. Hypertext representations of the original building standards texts are proposed as a tool for providing a well-structured and extensive justification/help facility. A standardised approach to module development is proposed, using hypertext development as a structured basis for expert systems development. Some areas of deficient domain knowledge are highlighted, particularly in the use of data from mathematical models and in gaps and inconsistencies in the original knowledge source Digests.
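The blackboard organisation described above can be sketched in miniature. This is a hypothetical illustration of the pattern only (DEX itself was built with the CRYSTAL4 shell, and the rule thresholds below are invented): independent sub-domain modules read shared facts from the blackboard and post conclusions for other modules to use.

```python
# Minimal blackboard sketch (hypothetical; DEX was built on the CRYSTAL4
# shell, not Python, and the design rules below are invented). Sub-domain
# knowledge modules share facts through a common blackboard.

class Blackboard:
    def __init__(self):
        self.facts = {}

    def post(self, key, value):
        self.facts[key] = value

def mix_design(bb):
    # invented rule: marine exposure caps the water/cement ratio
    if bb.facts.get("exposure") == "marine":
        bb.post("max_water_cement_ratio", 0.45)

def sulphate_attack(bb):
    # invented rule: high sulphate class requires a resistant cement
    if bb.facts.get("sulphate_class", 0) >= 3:
        bb.post("cement_type", "sulphate-resisting")

bb = Blackboard()
bb.post("exposure", "marine")
bb.post("sulphate_class", 3)
for module in (mix_design, sulphate_attack):  # simple controller loop
    module(bb)
print(bb.facts)
```

The controller loop stands in for the scheduling logic a real blackboard system would use to decide which module fires next.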
Abstract:
This thesis is a theoretical study of the accuracy and usability of models that attempt to represent the environmental control system of buildings in order to improve environmental design. These models have evolved from crude representations of a building and its environment to an accurate representation of the dynamic characteristics of the environmental stimuli on buildings. Each generation of models has had its own particular influence on built form. This thesis analyses the theory, structure and data of such models in terms of their accuracy of simulation and therefore their validity in influencing built form. The models are also analysed in terms of their compatibility with the design process and hence their ability to aid designers. The conclusions are that such models are unlikely to improve environmental performance since: (a) the models can only be applied to a limited number of building types; (b) they can only be applied to a restricted number of the characteristics of a design; (c) they can only be employed after many major environmental decisions have been made; (d) the data used in models are inadequate and unrepresentative; and (e) models do not account for occupant interaction in environmental control. It is argued that further improvements in the accuracy of simulation of environmental control will not significantly improve environmental design. This is based on the premise that strategic environmental decisions are made at the conceptual stages of design, whereas models influence the detailed stages of design. It is hypothesised that if models are to improve environmental design, it must be through the analysis of building typologies, which provides a method of feedback between models and the conceptual stages of design.
Field studies are presented to describe a method by which typologies can be analysed, and a theoretical framework is described which provides a basis for further research into the implications of the morphology of buildings for environmental design.
Abstract:
The thesis deals with the background, development and description of a mathematical stock control methodology for use within an oil and chemical blending company, where demand and replenishment lead-times are generally non-stationary. The stock control model proper relies, as input, on adaptive forecasts of demand determined for an economical forecast/replenishment period precalculated on an individual stock-item basis. The control procedure is principally that of the continuous review, reorder level type, where the reorder level and reorder quantity 'float', that is, each changes in accordance with changes in demand. Two versions of the Methodology are presented: a cost minimisation version and a service level version. Realising the importance of demand forecasts, four recognised variations of the Trigg and Leach adaptive forecasting routine are examined. A fifth variation, developed during the work, is proposed as part of the stock control methodology. The results of testing the cost minimisation version of the Methodology with historical data, by means of a computerised simulation, are presented together with a description of the simulation used. In addition, the performance of the Methodology compares favourably with a rule-of-thumb approach considered by the Company as an interim solution for reducing stock levels. The contribution of the work to the field of scientific stock control is felt to be significant for the following reasons: (1) the Methodology is designed specifically for use with non-stationary demand and for this reason alone appears to be unique; (2) the Methodology is unique in its approach, and the cost minimisation version is shown to work successfully with the demand data presented; (3) the Methodology and the thesis as a whole fill an important gap between complex mathematical stock control theory and practical application.
A brief description of a computerised order processing/stock monitoring system, designed and implemented as a pre-requisite for the Methodology's practical operation, is presented as an appendix.
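For reference, the Trigg and Leach adaptive forecasting routine that the Methodology builds upon can be sketched as follows. This is one textbook formulation, in which the tracking signal is used directly as the smoothing constant; the four variations examined in the thesis, and the fifth developed there, are not reproduced here.

```python
# Sketch of Trigg and Leach adaptive-response-rate exponential smoothing
# (a textbook formulation; not the thesis's own fifth variation). The
# smoothing constant alpha adapts to the tracking signal |E/M|, so the
# forecast responds quickly when demand is non-stationary.

def trigg_leach(demand, gamma=0.2):
    forecast = demand[0]
    E = M = 0.0            # smoothed error and smoothed absolute error
    forecasts = [forecast]
    for d in demand[1:]:
        e = d - forecast
        E = gamma * e + (1 - gamma) * E
        M = gamma * abs(e) + (1 - gamma) * M
        alpha = abs(E / M) if M else gamma  # tracking signal as alpha
        forecast = forecast + alpha * e
        forecasts.append(forecast)
    return forecasts

series = [20, 22, 21, 35, 40, 42, 41]  # step change mid-series
print(trigg_leach(series))
```

After the step change the tracking signal rises towards 1, so the forecast chases the new demand level far faster than fixed-constant smoothing would.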
Abstract:
The present scarcity of operational knowledge-based systems (KBS) has been attributed, in part, to the inadequate consideration given to user interface design during development. From a human factors perspective the problem has stemmed from an overall lack of user-centred design principles. Consequently the integration of human factors principles and techniques is seen as a necessary and important precursor to ensuring the implementation of KBS which are useful to, and usable by, the end-users for whom they are intended. Focussing upon KBS work taking place within commercial and industrial environments, this research set out to assess both the extent to which human factors support was presently being utilised within development, and the future path for human factors integration. The assessment consisted of interviews conducted with a number of commercial and industrial organisations involved in KBS development, and a set of three detailed case studies of individual KBS projects. Two of the studies were carried out within a collaborative Alvey project, involving the Interdisciplinary Higher Degrees Scheme (IHD) at the University of Aston in Birmingham, BIS Applied Systems Ltd (BIS), and the British Steel Corporation. This project, which had provided the initial basis and funding for the research, was concerned with the application of KBS to the design of commercial data processing (DP) systems. The third study stemmed from involvement in a KBS project being carried out by the Technology Division of the Trustee Savings Bank Group plc. The preliminary research highlighted poor human factors integration. In particular, there was a lack of early consideration of end-user requirements definition and user-centred evaluation. Instead, concentration was given to the construction of the knowledge base and to prototype evaluation with the expert(s).
In response to this identified problem, a set of methods was developed aimed at encouraging developers to consider user interface requirements early in a project. These methods were then applied in the two further projects, and their uptake within the overall development process was monitored. Experience from the two studies demonstrated that early consideration of user interface requirements was both feasible and instructive for guiding future development work. In particular, it was shown that a user interface prototype could be used as a basis for capturing requirements at the functional (task) level and at the interface dialogue level. Extrapolating from this experience, a KBS life-cycle model is proposed which incorporates user interface design (and, within that, user evaluation) as a largely parallel, rather than subsequent, activity to knowledge base construction. Further to this, there is a discussion of several key elements which can be seen as inhibiting the integration of human factors within KBS development. These elements stem from characteristics of present KBS development practice; from constraints within the commercial and industrial development environments; and from the state of existing human factors support.
Abstract:
Lean is usually associated with the ‘operations’ of a manufacturing enterprise; however, there is a growing awareness that these principles may be transferred readily to other functions and sectors. The application to knowledge-based activities such as engineering design is of particular relevance to UK plc. Hence, the purpose of this study has been to establish the state-of-the-art, in terms of the adoption of Lean in new product development, by carrying out a systematic review of the literature. The authors' findings confirm the view that Lean can be applied beneficially away from the factory; that an understanding and definition of value is key to success; that a set-based (or Toyota methodology) approach to design is favoured together with the strong leadership of a chief engineer; and that the successful implementation requires organization-wide changes to systems, practices, and behaviour. On this basis it is felt that this review paper provides a useful platform for further research in this topic.
Abstract:
Manufacturing systems that are heavily dependent upon direct workers have an inherent complexity that the system designer is often ill-equipped to understand. This complexity is due to the interactions that cause variations in worker performance. Variation in human performance can be explained by many factors; however, one important factor that is not currently considered in any detail during the design stage is the physical working environment. This paper presents the findings of ongoing research investigating human performance within manufacturing systems. It sets out to identify the form of the relationships that exist between changes in physical working environment variables and operator performance. These relationships can provide managers with a basis for decisions when designing and managing manufacturing systems and their environments.
Abstract:
In this letter, we analyze and develop the basis required for precise grating design in a scheme based on two oppositely chirped fiber Bragg gratings, and apply it in several numerically simulated examples. We obtain the interesting result that the broader the bandwidth of the reshaped pulse, the shorter the gratings required.
Abstract:
Distributed network utility maximization (NUM) is receiving increasing interest for cross-layer optimization problems in multihop wireless networks. Traditional distributed NUM algorithms rely heavily on feedback information between different network elements, such as traffic sources and routers. Because of the distinct features of multihop wireless networks, such as time-varying channels and dynamic network topology, the feedback information is usually inaccurate, which represents a major obstacle to applying distributed NUM to wireless networks. The questions to be answered include whether a distributed NUM algorithm can converge with inaccurate feedback and how to design an effective distributed NUM algorithm for wireless networks. In this paper, we first use the infinitesimal perturbation analysis technique to provide an unbiased gradient estimate of the aggregate rate of traffic sources at the routers, based on locally available information. On this basis, we propose a stochastic approximation algorithm to solve the distributed NUM problem with inaccurate feedback. We then prove that, under certain conditions, the proposed algorithm converges to the optimal solution of distributed NUM with perfect feedback. The proposed algorithm is applied to the joint rate and medium access control problem for wireless networks. Numerical results demonstrate the convergence of the proposed algorithm. © 2013 John Wiley & Sons, Ltd.
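The flavour of the approach can be illustrated with a toy example (not the paper's algorithm): two sources sharing one link maximise a sum of log utilities via a dual price update driven by noisy rate feedback, with a diminishing step size supplying the stochastic-approximation ingredient that tolerates the inaccurate feedback.

```python
import random

# Toy sketch (not the paper's algorithm): two sources share one link of
# capacity C and maximise sum(log(x_i)). The router's feedback -- the
# aggregate rate -- is observed with noise, and a diminishing step size
# lets the dual (price) iteration converge despite inaccurate feedback.

random.seed(0)
C = 10.0   # link capacity
lam = 0.3  # link "price" (dual variable)
for t in range(1, 20001):
    # each source maximises log(x) - lam * x  =>  x = 1 / lam
    x = [1.0 / lam, 1.0 / lam]
    observed = sum(x) + random.gauss(0.0, 0.2)    # noisy feedback
    step = 0.05 / t                               # diminishing step size
    lam = max(0.05, lam + step * (observed - C))  # projected price update

# At the optimum the sources share the link equally: x_i = C / 2.
print([round(v, 2) for v in x])
```

With a fixed step size the noise would keep the price jittering around the optimum; the 1/t schedule averages the noise out, which is exactly the stochastic-approximation convergence argument in miniature.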
Abstract:
The poor retention and efficacy of instilled drops as a means of delivering drugs to the ophthalmic environment is well recognised. The potential value of contact lenses as a means of ophthalmic drug delivery, with the consequent improvement in pre-corneal retention, is one obvious route to the development of a more effective ocular delivery system. Furthermore, the increasing availability and clinical use of daily disposable contact lenses provide the platform for the development of viable single-day-use drug delivery devices based on existing materials and lenses. In order to provide a basis for the effective design of such devices, a systematic understanding of the factors affecting the interaction of individual drugs with the lens matrix is required. Because a large number of potential structural variables are involved, it is necessary to achieve some rationalisation of the parameters and physicochemical properties (such as molecular weight, charge and partition coefficient) that influence drug interactions. Ophthalmic dyes and structurally related compounds based on the same core structure were used to investigate these various factors and the way in which they can be used in concert to design effective release systems for structurally different drugs. Initial studies of passive diffusional release form a necessary precursor to the investigation of the features of the ocular environment that override this simple behaviour. Commercially available contact lenses of differing structural classifications were used to study factors affecting the uptake of the surrogate actives and their release under 'passive' conditions. The interaction between active and lens material shows considerable and complex structure dependence, which is not simply related to equilibrium water content. The structure of the polymer matrix itself was found to have the dominant controlling influence on active uptake, with hydrophobic interaction with the ophthalmic dye playing a major role.
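A minimal sketch of the kind of model such 'passive' release data are commonly fitted to is given below, assuming simple first-order elution; the rate constant is invented for illustration and the thesis does not state this specific model.

```python
import math

# Hypothetical sketch of a passive-release profile assuming first-order
# elution, M(t)/M_inf = 1 - exp(-k t). The rate constant k is invented;
# the thesis does not state this specific model.

def released_fraction(t_minutes, k=0.05):
    return 1.0 - math.exp(-k * t_minutes)

timepoints = [0, 10, 30, 60, 120]
profile = [round(released_fraction(t), 3) for t in timepoints]
print(profile)  # → [0.0, 0.393, 0.777, 0.95, 0.998]
```

Deviations of measured elution from such a monotonic passive curve are one way to expose the over-riding ocular-environment effects the abstract refers to.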
© The Author(s) 2014.
Abstract:
This article considers the preconditions for, and the main principles of, creating virtual laboratories for computer-aided design as tools for interdisciplinary research. The proposed virtual laboratory is best used at the requirements-specification or EFT stage, because it allows fast estimation of a project's feasibility, of certain characteristics and, as a result, of the expected benefit of its application. The use of these technologies already raises the level of automation of the design stages for new devices of various purposes. The proposed computer technology enables specialists from scientific fields such as chemistry, biology, biochemistry and physics to check whether a device can be created on the basis of the sensors developed. This reduces the time and cost of designing computer devices and systems at the early stages of design, for example at the requirements-specification or EFT stage. An important feature of this project is the use of an advanced multi-dimensional access method for organising the information base of the virtual laboratory.
Abstract:
This study identifies and investigates the potential use of in-eye trigger mechanisms to supplement the widely available information on release of ophthalmic drugs from contact lenses under passive release conditions. Ophthalmic dyes and surrogates have been successfully employed to investigate how these factors can be drawn together to make a successful system. The storage of a drug-containing lens in a pH lower than that of the ocular environment can be used to establish an equilibrium that favours retention of the drug in the lens prior to ocular insertion. Although release under passive conditions does not result in complete dye elution, the use of mechanical agitation techniques which mimic the eyelid blink action in conjunction with ocular tear chemistry promotes further release. In this way differentiation between passive and triggered in vitro release characteristics can be established. Investigation of the role of individual tear proteins revealed significant differences in their ability to alter the equilibrium between matrix-held and eluate-held dye or drug. These individual experiments were then investigated in vivo using ophthalmic dyes. Complete elution was found to be achievable in-eye; this demonstrated the importance of that fraction of the drug retained under passive conditions and the triggering effect of in-eye conditions on the release process. Understanding both the structure-property relationship between drug and material and in-eye trigger mechanisms, using ophthalmic dyes as a surrogate, provides the basis of knowledge necessary to design ocular drug delivery vehicles for in-eye release in a controllable manner.
Abstract:
Purpose: To develop and validate a classification system for focal vitreomacular traction (VMT), with and without macular hole, based on spectral domain optical coherence tomography (SD-OCT), intended to aid decision-making and prognostication. Methods: A panel of retinal specialists convened to develop this system. A literature review, followed by discussion of a wide range of cases, formed the basis for the proposed classification. Key features on OCT were identified and analysed for their utility in clinical practice. A final classification was devised based on two sequential, independent validation exercises to improve interobserver variability. Results: This classification tool pertains to idiopathic focal VMT assessed by a horizontal line scan using SD-OCT. The system uses width (W), interface features (I), foveal shape (S), retinal pigment epithelial changes (P), elevation of vitreous attachment (E), and inner and outer retinal changes (R) to give the acronym WISPERR. Each category is scored hierarchically. Results from the second independent validation exercise indicated a high level of agreement between graders: intraclass correlation ranged from 0.84 to 0.99 for continuous variables, and Fleiss' kappa values ranged from 0.76 to 0.95 for categorical variables. Conclusions: We present an OCT-based classification system for focal VMT that allows anatomical detail to be scrutinised and scored qualitatively and quantitatively using a simple, pragmatic algorithm, which may be of value in clinical practice as well as in future research studies.
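The categorical agreement statistic quoted above, Fleiss' kappa, can be computed as follows; the gradings below are invented, not data from the validation exercise.

```python
# Hedged illustration of how agreement figures like those quoted above are
# obtained: Fleiss' kappa for multiple graders assigning subjects to
# categories. The gradings are invented, not validation-exercise data.

def fleiss_kappa(ratings):
    """ratings[i][k] = number of graders assigning subject i to category k."""
    n_subjects = len(ratings)
    n_raters = sum(ratings[0])
    n_categories = len(ratings[0])
    # proportion of all assignments falling in each category
    p = [sum(r[k] for r in ratings) / (n_subjects * n_raters)
         for k in range(n_categories)]
    # per-subject observed agreement
    P = [(sum(c * c for c in r) - n_raters) / (n_raters * (n_raters - 1))
         for r in ratings]
    P_bar = sum(P) / n_subjects                 # mean observed agreement
    P_e = sum(pk * pk for pk in p)              # chance agreement
    return (P_bar - P_e) / (1 - P_e)

# 4 eyes graded by 3 graders into 2 categories (e.g. hole present/absent)
ratings = [[3, 0], [0, 3], [2, 1], [3, 0]]
print(round(fleiss_kappa(ratings), 3))  # → 0.625
```

Kappa corrects the raw agreement for the agreement expected by chance, which is why it is preferred over simple percentage agreement for categorical OCT gradings.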