983 results for DESIGN BASIS ACCIDENTS
Abstract:
Federal Highway Administration, Washington, D.C.
Abstract:
Federal Highway Administration, Office of Research and Development, Washington, D.C.
Abstract:
National Highway Traffic Safety Administration, Washington, D.C.
Abstract:
v.1. Origin of the art. Anatomy the basis of drawing. The skeleton. The muscles of man and quadruped. Standard figure. Composition. Colour. Ancients and moderns. Invention.--v.2. Fuzeli. Wilkie. Effect of the societies on taste. A competent tribunal. On fresco. Elgin marbles. Beauty.
Abstract:
With marine biodiversity conservation the primary goal of reserve planning initiatives, a site's conservation potential is typically evaluated on the basis of the biological and physical features it contains. By comparison, socio-economic information is seldom a formal consideration in the reserve system design problem and is generally limited to an assessment of threats, vulnerability or compatibility with surrounding uses. This is perhaps surprising given broad recognition that the success of reserve establishment is highly dependent on widespread stakeholder and community support. Using information on the spatial distribution and intensity of commercial rock lobster catch in South Australia, we demonstrate the capacity of mathematical reserve selection procedures to integrate socio-economic and biophysical information for marine reserve system design. Analyses of trade-offs highlight the opportunities to design representative, efficient and practical marine reserve systems that minimise potential loss to commercial users. We found that the objective of minimising the areal extent of the reserve system was barely compromised by incorporating economic design constraints. With a small increase in area (< 3%) and boundary length (< 10%), the economic impact of marine reserves on the commercial rock lobster fishery was reduced by more than a third. We also considered how a reserve planner might prioritise conservation areas using information on a planning unit's selection frequency. We found that selection frequencies alone were not a reliable guide for the selection of marine reserve systems, but could be used with approaches such as summed irreplaceability to direct conservation effort for efficient marine reserve design.
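To make the trade-off concrete, the sketch below implements a toy greedy reserve-selection heuristic in which each planning unit carries an area and a commercial catch value, and units are chosen by cost per unit of conservation gain. All data, the `catch_weight` parameter and the greedy rule are illustrative assumptions; actual reserve selection procedures of the kind the abstract describes typically use simulated annealing with boundary-length terms rather than this simple heuristic.

```python
# Toy greedy reserve selection illustrating the biophysical/economic trade-off.
# All planning-unit data and the weighting are hypothetical.

def select_reserve(units, targets, catch_weight=1.0):
    """Pick planning units until every species meets its representation target.

    units   -- list of dicts: {'id', 'area', 'catch', 'species': set(...)}
    targets -- dict: species -> number of units required
    catch_weight -- how strongly commercial catch value is penalised
    """
    remaining = dict(targets)
    chosen, pool = [], list(units)
    while any(v > 0 for v in remaining.values()):
        def score(u):
            gain = sum(1 for s in u['species'] if remaining.get(s, 0) > 0)
            if gain == 0:
                return float('inf')
            # cost per unit of conservation gain: area plus weighted catch loss
            return (u['area'] + catch_weight * u['catch']) / gain
        best = min(pool, key=score)
        if score(best) == float('inf'):
            break                      # remaining targets cannot be met
        pool.remove(best)
        chosen.append(best)
        for s in best['species']:
            if remaining.get(s, 0) > 0:
                remaining[s] -= 1
    return chosen

units = [
    {'id': 'A', 'area': 10, 'catch': 0.0, 'species': {'sp1', 'sp2'}},
    {'id': 'B', 'area': 12, 'catch': 5.0, 'species': {'sp1', 'sp3'}},
    {'id': 'C', 'area': 11, 'catch': 0.5, 'species': {'sp3'}},
]
print([u['id'] for u in select_reserve(units, {'sp1': 1, 'sp2': 1, 'sp3': 1})])
```

Raising `catch_weight` steers selection away from heavily fished units at the cost of a slightly larger reserve, which is exactly the trade-off the abstract quantifies.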
Abstract:
Three important goals in describing software design patterns are: generality, precision, and understandability. To address these goals, this paper presents an integrated approach to specifying patterns using Object-Z and UML. To achieve the generality goal, we adopt a role-based metamodeling approach to define patterns. With this approach, each pattern is defined as a pattern role model. To achieve precision, we formalize role concepts using Object-Z (a role metamodel) and use these concepts to define patterns (pattern role models). To achieve understandability, we represent the role metamodel and pattern role models visually using UML. Our pattern role models provide a precise basis for pattern-based model transformations or refactoring approaches.
Abstract:
This paper presents a way to describe design patterns rigorously based on role concepts. Rigorous pattern descriptions are essential if patterns are to be used as rules for model evolution, for example in the MDA context. We formalize the role concepts commonly used in defining design patterns as a role metamodel using Object-Z. Given this role metamodel, individual design patterns are specified generically as formal pattern role models, also in Object-Z. We also formalize the properties that must be captured in a class model when a design pattern is deployed. These properties are defined generically in terms of role bindings from a pattern role model to a class model. Our work provides a precise but abstract approach to pattern definition, and also provides a precise basis for checking the validity of pattern usage in designs.
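The core idea of validating a pattern deployment via role bindings can be sketched with plain data structures: a pattern role model lists the features each role demands, and a binding is valid only if every role maps to a class offering those features. The papers express this formally in Object-Z; the dictionaries and the single well-formedness rule below are simplified, hypothetical stand-ins.

```python
# Minimal sketch of checking role bindings from a pattern role model to a
# class model. Structures and the validity rule are simplified illustrations.

OBSERVER_ROLES = {            # pattern role model: role -> required features
    'Subject':  {'attach', 'detach', 'notify'},
    'Observer': {'update'},
}

class_model = {               # class model: class -> its methods
    'WeatherStation': {'attach', 'detach', 'notify', 'measure'},
    'Display':        {'update', 'render'},
}

bindings = {'Subject': 'WeatherStation', 'Observer': 'Display'}

def binding_is_valid(roles, classes, bindings):
    """Every role must be bound to a class that offers the role's features."""
    for role, required in roles.items():
        cls = bindings.get(role)
        if cls is None or not required <= classes.get(cls, set()):
            return False
    return True

print(binding_is_valid(OBSERVER_ROLES, class_model, bindings))  # True
```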
Abstract:
Distributed digital control systems provide alternatives to conventional, centralised digital control systems. Typically, a modern distributed control system will comprise a multi-processor or network of processors, a communications network, an associated set of sensors and actuators, and the systems and applications software. This thesis addresses the problem of how to design robust decentralised control systems, such as those used to control event-driven, real-time processes in time-critical environments. Emphasis is placed on studying the dynamical behaviour of a system and identifying ways of partitioning the system so that it may be controlled in a distributed manner. A structural partitioning technique is adopted which makes use of natural physical sub-processes in the system, which are then mapped into the software processes to control the system. However, communications are required between the processes because of the disjoint nature of the distributed (i.e. partitioned) state of the physical system. The structural partitioning technique, and recent developments in the theory of potential controllability and observability of a system, are the basis for the design of controllers. In particular, the method is used to derive a decentralised estimate of the state vector for a continuous-time system. The work is also extended to derive a distributed estimate for a discrete-time system. Emphasis is also given to the role of communications in the distributed control of processes and to the partitioning technique necessary to design distributed and decentralised systems with resilient structures. A method is presented for the systematic identification of necessary communications for distributed control. It is also shown that the structural partitions can be used directly in the design of software fault tolerant concurrent controllers. In particular, the structural partition can be used to identify the boundary of the conversation which can be used to protect a specific part of the system. In addition, for certain classes of system, the partitions can be used to identify processes which may be dynamically reconfigured in the event of a fault. These methods should be of use in the design of robust distributed systems.
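A minimal numerical sketch of decentralised state estimation over a structural partition: two local observers each estimate their own sub-state and use the neighbour's communicated estimate for the coupling term. The matrices, gains and partition below are invented for illustration; the thesis derives them systematically from the theory of potential controllability and observability.

```python
# Toy decentralised estimation: two local Luenberger-style observers, one per
# structural partition, exchanging only the coupling terms they need.
import numpy as np

# Discrete-time system x[k+1] = A x[k], y = C x, partitioned into x1 and x2.
A11, A12 = np.array([[0.9]]), np.array([[0.1]])
A21, A22 = np.array([[0.05]]), np.array([[0.8]])
C1, C2 = np.array([[1.0]]), np.array([[1.0]])
L1, L2 = np.array([[0.5]]), np.array([[0.4]])    # local observer gains (assumed)

x1, x2 = np.array([1.0]), np.array([-1.0])       # true sub-states
e1, e2 = np.array([0.0]), np.array([0.0])        # local estimates

for _ in range(30):
    y1, y2 = C1 @ x1, C2 @ x2                    # local measurements
    # Each observer predicts its own partition, using the neighbour's
    # communicated estimate for the coupling term, then corrects locally.
    e1_next = A11 @ e1 + A12 @ e2 + L1 @ (y1 - C1 @ e1)
    e2_next = A22 @ e2 + A21 @ e1 + L2 @ (y2 - C2 @ e2)
    x1, x2 = A11 @ x1 + A12 @ x2, A22 @ x2 + A21 @ x1
    e1, e2 = e1_next, e2_next

print(abs(x1 - e1), abs(x2 - e2))                # both errors shrink to ~0
```

With these numbers the joint error dynamics are stable, so each local estimate converges using only its own measurement plus the communicated neighbouring estimate, mirroring the role the thesis assigns to inter-process communications.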
Abstract:
The objective of this work was to design, construct and commission a new ablative pyrolysis reactor and a high efficiency product collection system. The reactor was to have a nominal throughput of 10 kg/hr of dry biomass and be inherently scalable up to an industrial scale application of 10 tonnes/hr. The whole process consists of a bladed ablative pyrolysis reactor, two high efficiency cyclones for char removal, and a disk and doughnut quench column combined with a wet walled electrostatic precipitator, mounted directly on top, for liquids collection. In order to aid design and scale-up calculations, detailed mathematical modelling was undertaken of the reaction system, enabling sizes, efficiencies and operating conditions to be determined. Specifically, a modular approach was taken due to the iterative nature of some of the design methodologies, with the output from one module being the input to the next. Separate modules were developed for the determination of the biomass ablation rate, specification of the reactor capacity, cyclone design, quench column design and electrostatic precipitator design. These models enabled a rigorous design protocol to be developed, capable of specifying the required reactor and product collection system size for specified biomass throughputs, operating conditions and collection efficiencies. The reactor proved capable of generating an ablation rate of 0.63 mm/s for pine wood at a temperature of 525 °C with a relative velocity between the heated surface and reacting biomass particle of 12.1 m/s. The reactor achieved a maximum throughput of 2.3 kg/hr, which was the maximum the biomass feeder could supply. The reactor is capable of being operated at a far higher throughput, but this would require a new feeder and drive motor to be purchased. Modelling showed that the reactor is capable of achieving a throughput of approximately 30 kg/hr. This is an area that should be considered in the future, as the reactor is currently operating well below its theoretical maximum. Calculations show that the current product collection system could operate efficiently up to a maximum feed rate of 10 kg/hr, provided the inert gas supply was adjusted accordingly to keep the vapour residence time in the electrostatic precipitator above one second. Operation above 10 kg/hr would require some modifications to the product collection system. Eight experimental runs were documented and considered successful; more were attempted but had to be abandoned due to equipment failure. This does not detract from the fact that the reactor and product collection system design was extremely efficient. The maximum total liquid yield was 64.9% on a dry wood fed basis. It is considered that the liquid yield would have been higher had there been sufficient development time to overcome certain operational difficulties, and had longer operating runs been attempted to offset the product losses caused by the difficulty of collecting all available product from a large scale collection unit. The liquids collection system was highly efficient: modelling determined a liquid collection efficiency above 99% on a mass basis, and this was validated experimentally, since a dry ice/acetone condenser and a cotton wool filter downstream of the collection unit enabled mass measurement of the condensable product exiting the unit, confirming a collection efficiency in excess of 99% on a mass basis.
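A back-of-envelope check connects the reported ablation rate to throughput: mass rate ≈ ablation rate × contact area × wood density. Only the 0.63 mm/s figure comes from the abstract; the contact area and pine density below are assumed placeholders, not values from the design model.

```python
# Rough throughput estimate from the ablation rate. Contact area and density
# are hypothetical; only the 0.63 mm/s ablation rate is from the abstract.

ablation_rate = 0.63e-3      # m/s, pine at 525 degC, 12.1 m/s relative velocity
contact_area  = 25e-4        # m^2, assumed blade/particle contact area
pine_density  = 500.0        # kg/m^3, typical dry pine (assumed)

throughput = ablation_rate * contact_area * pine_density * 3600  # kg/hr
print(f"theoretical throughput ~ {throughput:.1f} kg/hr")        # ~2.8 kg/hr
```

With these assumed values the estimate lands near the 2.3 kg/hr actually achieved, and since throughput scales directly with contact area, larger ablating surfaces plausibly support the much higher modelled capacities.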
Abstract:
The objective of this study has been to enable a greater understanding of the biomass gasification process through the development and use of process and economic models. A new theoretical equilibrium model of gasification is described using the operating condition called the adiabatic carbon boundary. This represents an ideal gasifier working at the point where the carbon in the feedstock is completely gasified. The model can be used as a 'target' against which the results of real gasifiers can be compared, but it does not simulate the results of real gasifiers. A second model has been developed which uses a stagewise approach in order to model fluid bed gasification, and its results have indicated that pyrolysis and the reactions of pyrolysis products play an important part in fluid bed gasifiers. Both models have been used in sensitivity analyses: the biomass moisture content and gasifying agent composition were found to have the largest effects on performance, whilst pressure and heat loss had lesser effects. Correlations have been produced to estimate the total installed capital cost of gasification systems and have been used in an economic model of gasification. This has been used in a sensitivity analysis to determine the factors which most affect the profitability of gasification. The most important influences on gasifier profitability have been found to be feedstock cost, product selling price and throughput. Given the economic conditions of late 1985, refuse gasification for the production of producer gas was found to be viable at throughputs of about 2.5 tonnes/h dry basis and above, in the metropolitan counties of the United Kingdom. At this throughput and above, the largest element of product gas cost is the feedstock cost, the cost element which is most variable.
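The conclusion that feedstock cost dominates product gas cost at larger throughputs can be illustrated with a toy per-GJ cost breakdown in which a fixed annual capital charge is spread over annual output. All numbers below are invented placeholders, not values from the 1985 economic model.

```python
# Illustrative product-gas cost: fixed capital charge diluted by throughput,
# feedstock cost constant per tonne. Every figure here is a made-up example.

def gas_cost(throughput_tph, feed_cost_per_t=20.0,
             annual_capital_charge=150_000.0, hours=8000,
             gas_yield_gj_per_t=12.0):
    """Product gas cost in currency units per GJ."""
    capital_per_t = annual_capital_charge / (throughput_tph * hours)
    return (feed_cost_per_t + capital_per_t) / gas_yield_gj_per_t

for tph in (0.5, 1.0, 2.5, 5.0):
    print(f"{tph:>4} t/h -> {gas_cost(tph):.2f} per GJ")
```

As throughput rises, the capital term shrinks while the feedstock term stays fixed per tonne, so beyond some scale feedstock cost becomes the largest, and most variable, element of gas cost.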
Abstract:
The concept of shallow fluidized bed boilers is defined and a preliminary working design for a gas-fired package boiler has been produced. Those areas of the design requiring further study have been specified. Experimental investigations concerning these areas have been carried out. A two-dimensional, conducting paper analog has been developed for the specific purpose of evaluating sheet fins. The analog has been generalised and is presented as a simple means of simulating the general, two-dimensional Helmholtz equation. By recording the transient response of spherical, calorimetric probes when plunged into heated air-fluidized beds, heat transfer coefficients have been measured at bed temperatures up to 1100 °C. A correlation fitting all the data to within ±10% has been obtained. A model of heat transfer to surfaces immersed in high temperature beds has been proposed. The model solutions are, however, only in qualitative agreement with the experimental data. A simple experimental investigation has revealed that the effective, radial, thermal conductivities of shallow fluidized beds are an order of magnitude lower than the axial conductivities. These must, consequently, be taken into account when considering heat transfer to surfaces immersed within fluidized beds. Preliminary work on pre-mixed gas combustion and some further qualitative experiments have been used as the basis for discussing the feasibility of combusting heavy fuel oils within shallow beds. The use of binary beds, within which the fuel could be both gasified and subsequently burnt, is proposed. Finally, the consequences of the experimental studies on the initial design are considered, and suggestions for further work are made.
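The conducting-paper analog simulates a two-dimensional Helmholtz-type equation, which is straightforward to mimic numerically: the sketch below runs a Jacobi iteration for the sheet-fin form ∇²T − m²T = 0 on a unit square with the fin base held hot. The grid size, m² coefficient and boundary values are illustrative assumptions, not parameters from the study.

```python
# Numerical counterpart of the conducting-paper analog: Jacobi iteration for
# the 2-D fin equation  d2T/dx2 + d2T/dy2 - m^2*T = 0  (ambient taken as 0)
# on a unit square. Grid, m^2 and boundary temperatures are illustrative.
import numpy as np

n, m2 = 41, 4.0                       # grid points per side, m^2 coefficient
h = 1.0 / (n - 1)
T = np.zeros((n, n))
T[0, :] = 100.0                       # fin base held at 100; other edges at 0

for _ in range(5000):                 # Jacobi sweeps to convergence
    nbrs = T[:-2, 1:-1] + T[2:, 1:-1] + T[1:-1, :-2] + T[1:-1, 2:]
    T[1:-1, 1:-1] = nbrs / (4.0 + m2 * h * h)

print(f"centre temperature ~ {T[n // 2, n // 2]:.2f}")
```

Each interior node relaxes toward the average of its neighbours with an extra m²h² sink term, which is the same balance the resistive paper enforces physically.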
Abstract:
Changes in modern structural design have created a demand for products which are light but possess high strength. The objective is a reduction in fuel consumption and weight of materials to satisfy both economic and environmental criteria. Cold roll forming has the potential to fulfil this requirement. The bending process is controlled by the shape of the profile machined on the periphery of the rolls. A CNC lathe can machine complicated profiles to a high standard of precision, but the expertise of a numerical control programmer is required. A computer program was developed during this project, using the expert system concept, to calculate tool paths and consequently to expedite the procurement of the machine control tapes whilst removing the need for a skilled programmer. Codifying human expertise and encapsulating that knowledge in computer memory removes the dependency on highly trained people, whose services can be costly, inconsistent and unreliable. A successful cold roll forming operation, where the product is geometrically correct and free from visual defects, is not easy to attain. The geometry of the sheet after travelling through the rolling mill depends on the residual strains generated by the elastic-plastic deformation. Accurate evaluation of the residual strains can provide the basis for predicting the geometry of the section. A study of geometric and material non-linearity, yield criteria, material hardening and stress-strain relationships was undertaken in this research project. The finite element method was chosen to provide a mathematical model of the bending process and, to ensure an efficient manipulation of the large stiffness matrices, the frontal solution was applied. A series of experimental investigations provided data to compare with corresponding values obtained from the theoretical modelling. A computer simulation, capable of predicting that a design will be satisfactory prior to the manufacture of the rolls, would allow effort to be concentrated on devising an optimum design where costs are minimised.
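Residual-strain reasoning of the kind described can be illustrated with the classical elastic/perfectly-plastic springback estimate: bend a sheet to a known curvature, unload elastically, and the curvature change equals the fully plastic moment divided by the elastic bending stiffness. The material values below are assumed, and this textbook relation merely stands in for, rather than reproduces, the thesis's finite element model with hardening and the frontal solution.

```python
# Springback for an elastic/perfectly-plastic sheet bent to curvature 1/R0.
# Unloading is elastic, so the curvature recovery is M_plastic / (E*I).
# Material and geometry values are assumed, textbook-style illustrations.

E       = 210e9        # Pa, steel Young's modulus (assumed)
sigma_y = 300e6        # Pa, yield stress (assumed)
t       = 1.5e-3       # m, sheet thickness (assumed)
R0      = 0.05         # m, radius imposed by the rolls (assumed)

M_per_b   = sigma_y * t**2 / 4        # fully plastic moment per unit width
I_per_b   = t**3 / 12                 # second moment of area per unit width
d_kappa   = M_per_b / (E * I_per_b)   # elastic recovery = 3*sigma_y / (E*t)
kappa_res = 1.0 / R0 - d_kappa        # residual curvature after springback

print(f"formed radius {R0*1e3:.1f} mm -> "
      f"residual radius {1e3/kappa_res:.1f} mm after springback")
```

The formed part relaxes to a larger radius on unloading; predicting such deviations before the rolls are machined is precisely what the finite element simulation described above is intended to achieve.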