979 results for Detailed design
Abstract:
Thesis (Master's)--University of Washington, 2016-06
Abstract:
Cultivation technologies promoting organization of mammalian cells in three dimensions are essential for gene-function analyses as well as drug testing and represent the first step toward the design of tissue replacements and bioartificial organs. Embedded in a three-dimensional environment, cells are expected to develop tissue-like higher order intercellular structures (cell-cell contacts, extracellular matrix) that orchestrate cellular functions including proliferation, differentiation, apoptosis, and angiogenesis with unmatched quality. We have refined the hanging drop cultivation technology to pioneer beating heart microtissues derived from pure primary rat and mouse cardiomyocyte cultures as well as mixed populations reflecting the cell type composition of rodent hearts. Phenotypic characterization combined with detailed analysis of muscle-specific cell traits, extracellular matrix components, as well as endogenous vascular endothelial growth factor (VEGF) expression profiles of heart microtissues revealed (1) a linear cell number-microtissue size correlation, (2) intermicrotissue superstructures, (3) retention of key cardiomyocyte-specific cell qualities, (4) a sophisticated extracellular matrix, and (5) a high degree of self-organization exemplified by the tendency of muscle structures to assemble at the periphery of these myocardial spheroids. Furthermore (6), myocardial spheroids support endogenous VEGF expression in a size-dependent manner that will likely promote vascularization of heart microtissues produced from defined cell mixtures as well as support connection to the host vascular system after implantation. As cardiomyocytes are known to be refractory to current transfection technologies we have designed lentivirus-based transduction strategies to lead the way for genetic engineering of myocardial microtissues in a clinical setting.
Abstract:
This article describes the construction and use of a systematic structured method of mental health country situation appraisal, in order to help meet the need for conceptual tools to assist planners and policy makers develop and audit policy and implementation strategies. The tool encompasses the key domains of context, needs, resources, provisions and outcomes, and provides a framework for synthesizing key qualitative and quantitative information, flagging up gaps in knowledge, and for reviewing existing policies. It serves as an enabling tool to alert and inform policy makers, professionals and other key stakeholders about important issues which need to be considered in mental health policy development. It provides detailed country specific information in a systematic format, to facilitate global sharing of experiences of mental health reform and strategies between policy makers and other stakeholders. Lastly, it is designed to be a capacity building tool for local stakeholders to enhance situation appraisal, and multisectorial policy development and implementation.
Abstract:
1. Growing concern associated with threats to the marine environment has resulted in an increased demand for marine reserves that conserve representative and adequate examples of biodiversity. Often, the decisions about where to locate reserves must be made in the absence of detailed information on the patterns of distribution of the biota. Alternative approaches are required that include defining habitats using surrogates for biodiversity. Surrogate measures of biodiversity enable decisions about where to locate marine reserves to be made more reliably in the absence of detailed data on the distribution of species. 2. Intertidal habitat types derived using physical properties of the shoreline were used as a surrogate for intertidal biodiversity to assist with the identification of sites for inclusion in a candidate system of intertidal marine reserves for 17 463 km of the mainland coast of Queensland, Australia. This represents the first systematic approach, on essentially one-dimensional data, using fine-scale (tens to hundreds of metres) intertidal habitats to identify a system of marine reserves for such a large length of coast. A range of solutions would provide for the protection of a representative example of intertidal habitats in Queensland. 3. The design and planning of marine and terrestrial protected area systems should not be undertaken independently of each other, because doing so is likely to lead to inadequate representation of intertidal habitats in either system. The development of reserve systems specially designed to protect intertidal habitats should be integrated into the design of terrestrial and marine protected area systems. Copyright (c) 2005 John Wiley & Sons, Ltd.
Abstract:
This paper presents a finite-difference time-domain (FDTD) simulator for electromagnetic analysis and design applications in MRI. It is intended to be a complete FDTD model of an MRI system including all RF and low-frequency field generating units and electrical models of the patient. The program has been constructed in an object-oriented framework. The design procedure is detailed and the numerical solver has been verified against analytical solutions for simple cases and also applied to various field calculation problems. In particular, the simulator is demonstrated for inverse RF coil design, optimized source profile generation, and parallel imaging in high-frequency situations. The examples show new developments enabled by the simulator and demonstrate that the proposed FDTD framework can be used to analyze large-scale computational electromagnetic problems in modern MRI engineering. (C) 2004 Elsevier Inc. All rights reserved.
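The leapfrog update scheme at the heart of any FDTD solver can be illustrated in one dimension. The following is a minimal sketch, not the simulator described above: a 1D Yee grid in free space with a soft Gaussian source, in normalized units; the grid size, step count, and Courant number are arbitrary illustrative choices.

```python
import numpy as np

# Minimal 1D FDTD sketch: staggered E/H fields updated leapfrog-fashion
# from each other's spatial derivative (free space, normalized units).
nz, nt = 200, 400
courant = 0.5          # Courant number; <= 1 keeps the 1D scheme stable
ez = np.zeros(nz)      # electric field at integer grid points
hy = np.zeros(nz - 1)  # magnetic field at half-integer points

for n in range(nt):
    hy += courant * np.diff(ez)          # update H from the curl of E
    ez[1:-1] += courant * np.diff(hy)    # update E from the curl of H
    ez[nz // 2] += np.exp(-((n - 30) / 10.0) ** 2)  # soft Gaussian source

print(float(np.abs(ez).max()))
```

A full MRI simulator adds 3D grids, material models of the patient, and absorbing boundaries, but the update structure is the same.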
Abstract:
Socioeconomic considerations should have an important place in reserve design. Systematic reserve-selection tools allow simultaneous optimization of ecological objectives while minimizing costs, but are seldom used to incorporate socioeconomic costs in the reserve-design process. The sensitivity of this process to biodiversity data resolution has been studied widely, but the issue of socioeconomic data resolution has not previously been considered. We therefore designed marine reserves for biodiversity conservation with the constraint of minimizing commercial fishing revenue losses and investigated how economic data resolution affected the results. Incorporating coarse-resolution economic data from official statistics generated reserves that were only marginally less costly to the fishery than those designed with no attempt to minimize economic impacts. An intensive survey yielded fine-resolution data that, when incorporated in the design process, substantially reduced predicted fishery losses. Such an approach could help minimize fisher displacement because the least profitable grounds are selected for the reserve. Other work has shown that low-resolution biodiversity data can lead to underestimation of the conservation value of some sites, and a risk of overlooking the most valuable areas; we have similarly shown that low-resolution economic data can cause underestimation of the profitability of some sites and a risk of inadvertently including these in the reserve. Detailed socioeconomic data are therefore an essential input for the design of cost-effective reserve networks.
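The trade-off described, meeting habitat-representation targets at least cost to the fishery, can be sketched with a toy greedy heuristic. Real reserve-selection tools such as Marxan use simulated annealing over far richer data; the site names, habitats, and revenue figures below are entirely hypothetical.

```python
# Greedy sketch of cost-sensitive reserve selection: cover every target
# habitat type while preferring sites with low fishing revenue at risk.
sites = {
    "A": {"habitats": {"reef", "seagrass"}, "revenue": 9.0},
    "B": {"habitats": {"reef"}, "revenue": 1.0},
    "C": {"habitats": {"seagrass"}, "revenue": 2.0},
    "D": {"habitats": {"mangrove"}, "revenue": 4.0},
}
targets = {"reef", "seagrass", "mangrove"}

reserve, covered = [], set()
while covered < targets:
    # Pick the site adding the most uncovered habitat per unit revenue lost
    best = max(
        (s for s in sites if s not in reserve),
        key=lambda s: len(sites[s]["habitats"] - covered) / sites[s]["revenue"],
    )
    reserve.append(best)
    covered |= sites[best]["habitats"]

cost = sum(sites[s]["revenue"] for s in reserve)
print(sorted(reserve), cost)  # picks B, C, D (cost 7.0) rather than costly A
```

With coarse revenue data all sites would look equally costly and the cheaper combination could not be distinguished, which is the resolution effect the abstract reports.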
Abstract:
As levels of investment in advanced manufacturing systems increase, effective project management becomes ever more critical. This paper demonstrates how the model proposed by Mintzberg, Raisinghani and Theoret in 1976, which structures complicated strategic decision processes, can be applied to the design of new production systems for both descriptive and analytical research purposes. This paper sets a detailed case study concerning the design and development of an advanced manufacturing system within the Mintzberg decision model and so breaks down the decision sequence into constituent parts. It thus shows how a structured model can provide a framework for the researcher who wishes to study decision episodes in the design of manufacturing facilities in greater depth.
Abstract:
A new surface analysis technique has been developed which has a number of benefits compared to conventional Low Energy Ion Scattering Spectrometry (LEISS). A major potential advantage arising from the absence of charge exchange complications is the possibility of quantification. The instrumentation that has been developed also offers the possibility of unique studies concerning the interaction between low energy ions and atoms and solid surfaces. From these studies it may also be possible, in principle, to generate sensitivity factors to quantify LEISS data. The instrumentation, which is referred to as a Time-of-Flight Fast Atom Scattering Spectrometer, has been developed to investigate these conjectures in practice. The development involved a number of modifications to an existing instrument, allowing samples to be bombarded with a monoenergetic pulsed beam of either atoms or ions, and providing the capability to analyse the spectra of scattered atoms and ions separately. Further to this, a system was designed and constructed to allow incident, exit and azimuthal angles of the particle beam to be varied independently. The key development was that of a pulsed, mass-filtered atom source, which was developed by a cyclic process of design, modelling and experimentation. Although it was possible to demonstrate the unique capabilities of the instrument, problems relating to surface contamination prevented the measurement of the neutralisation probabilities. However, these problems appear to be technical rather than scientific in nature, and could be readily resolved given the appropriate resources. Experimental spectra obtained from a number of samples demonstrate some fundamental differences between the scattered ion and neutral spectra. For practical non-ordered surfaces the ToF spectra are more complex than their LEISS counterparts.
This is particularly true for helium scattering where it appears, in the absence of detailed computer simulation, that quantitative analysis is limited to ordered surfaces. Despite this limitation the ToFFASS instrument opens the way for quantitative analysis of the 'true' surface region to a wider range of surface materials.
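The kinematics common to LEISS and ToF fast-atom scattering follow from a binary elastic collision: the projectile retains an energy fraction fixed only by the mass ratio and scattering angle. A short illustrative calculation; the masses and angle (He on Si at 90 degrees) are chosen as an example and are not taken from the thesis.

```python
import math

def scattered_energy_fraction(m1, m2, theta_deg):
    """Energy fraction E1/E0 retained by a projectile of mass m1 after
    elastic scattering from a surface atom of mass m2 (m2 > m1) through
    laboratory angle theta, from binary-collision kinematics."""
    t = math.radians(theta_deg)
    root = math.sqrt((m2 / m1) ** 2 - math.sin(t) ** 2)
    return ((math.cos(t) + root) / (1 + m2 / m1)) ** 2

# He (mass 4) on Si (mass 28) at 90 degrees; the general expression
# reduces there to (m2 - m1)/(m2 + m1) = 0.75.
frac = scattered_energy_fraction(4.0, 28.0, 90.0)
print(round(frac, 3))  # -> 0.75
```

In a ToF instrument the same energy is inferred from flight time rather than an electrostatic analyser, which is why neutrals can be measured at all.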
Abstract:
The literature on the potential use of liquid ammonia as a solvent for the extraction of aromatic hydrocarbons from mixtures with paraffins, and the application of reflux, has been reviewed. Reference is made to extractors suited to this application. A pilot scale extraction plant was designed comprising a 5 cm diameter by 125 cm high, 50-stage Rotating Disc Contactor with 2 external settlers. Provision was made for operation with, or without, reflux at a pressure of 10 bar and ambient temperature. The solvent recovery unit consisted of an evaporator, compressor and condenser in a refrigeration cycle. Two systems were selected for study, Cumene-n-Heptane-Ammonia and Toluene-Methylcyclohexane-Ammonia. Equilibrium data for the first system was determined experimentally in a specially-designed equilibrium bomb. A technique was developed to withdraw samples under pressure for analysis by chromatography and titration. The extraction plant was commissioned with a kerosine-water system; detailed operating procedures were developed based on a Hazard and Operability Study. Experimental runs were carried out with both ternary ammonia systems. With the system Toluene-Methylcyclohexane-Ammonia the extraction plant and the solvent recovery facility operated satisfactorily and safely, in accordance with the operating procedures. Experimental data gave reasonable agreement with theory. Recommendations are made for further work with the plant.
Abstract:
The objective of this work was to design, construct and commission a new ablative pyrolysis reactor and a high efficiency product collection system. The reactor was to have a nominal throughput of 10 kg/hr of dry biomass and be inherently scalable up to an industrial scale application of 10 tonnes/hr. The whole process consists of a bladed ablative pyrolysis reactor, two high efficiency cyclones for char removal and a disk and doughnut quench column combined with a wet walled electrostatic precipitator, which is directly mounted on top, for liquids collection. In order to aid design and scale-up calculations, detailed mathematical modelling was undertaken of the reaction system enabling sizes, efficiencies and operating conditions to be determined. Specifically, a modular approach was taken due to the iterative nature of some of the design methodologies, with the output from one module being the input to the next. Separate modules were developed for the determination of the biomass ablation rate, specification of the reactor capacity, cyclone design, quench column design and electrostatic precipitator design. These models enabled a rigorous design protocol to be developed capable of specifying the required reactor and product collection system size for specified biomass throughputs, operating conditions and collection efficiencies. The reactor proved capable of generating an ablation rate of 0.63 mm/s for pine wood at a temperature of 525 °C with a relative velocity between the heated surface and reacting biomass particle of 12.1 m/s. The reactor achieved a maximum throughput of 2.3 kg/hr, which was the maximum the biomass feeder could supply. The reactor is capable of being operated at a far higher throughput but this would require a new feeder and drive motor to be purchased. Modelling showed that the reactor is capable of achieving a reactor throughput of approximately 30 kg/hr.
This is an area that should be considered for the future as the reactor is currently operating well below its theoretical maximum. Calculations show that the current product collection system could operate efficiently up to a maximum feed rate of 10 kg/hr, provided the inert gas supply was adjusted accordingly to keep the vapour residence time in the electrostatic precipitator above one second. Operation above 10 kg/hr would require some modifications to the product collection system. Eight experimental runs were documented and considered successful; more were attempted but had to be abandoned due to equipment failure. This does not detract from the fact that the reactor and product collection system design was extremely efficient. The maximum total liquid yield was 64.9% on a dry wood fed basis. It is considered that the liquid yield would have been higher had there been sufficient development time to overcome certain operational difficulties and if longer operating runs had been attempted to offset product losses occurring due to the difficulties in collecting all available product from a large scale collection unit. The liquids collection system was highly efficient and modelling determined a liquid collection efficiency of above 99% on a mass basis. This was validated by a dry ice/acetone condenser and a cotton wool filter downstream of the collection unit, which enabled mass measurements of the amount of condensable product exiting the product collection unit. These measurements confirmed that the collection efficiency was in excess of 99% on a mass basis.
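The quoted ablation rate lends itself to the kind of back-of-envelope throughput estimate the modelling modules formalize: mass rate = ablation rate × contact area × wood density. Only the ablation rate below comes from the abstract; the dry-pine density and the particle/hot-surface contact area are hypothetical illustrations, not the thesis's values.

```python
# Rough throughput estimate for an ablative pyrolysis reactor.
ablation_rate = 0.63e-3  # m/s, pine at 525 C (value from the abstract)
wood_density = 450.0     # kg/m^3, assumed for dry pine (hypothetical)
contact_area = 30e-4     # m^2 of particle/surface contact (hypothetical)

throughput = ablation_rate * wood_density * contact_area * 3600  # kg/hr
print(round(throughput, 1))  # ~3.1 kg/hr with these assumed figures
```

With these illustrative numbers the estimate lands in the same range as the 2.3 kg/hr actually achieved; scaling the contact area is what pushes the modelled capacity toward 30 kg/hr.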
Abstract:
This thesis describes the design and engineering of a pressurised biomass gasification test facility. A detailed examination of the major elements within the plant has been undertaken in relation to specification of equipment, evaluation of options and final construction. The retrospective project assessment was developed from consideration of relevant literature and theoretical principles. The literature review includes a discussion on legislation and applicable design codes. From this analysis, each of the necessary equipment units was reviewed and important design decisions and procedures highlighted and explored. Particular emphasis was placed on examination of the stringent demands of the ASME VIII design codes. The inter-relationship of functional units was investigated and areas of deficiency, such as biomass feeders and gas cleaning, have been commented upon. Finally, plant costing was summarized in relation to the plant design and proposed experimental programme. The main conclusion drawn from the study is that pressurised gasification of biomass is far more difficult and expensive to support than atmospheric gasification. A number of recommendations have been made regarding future work in this area.
Abstract:
This dissertation studies the process of operations systems design within the context of the manufacturing organization. Using the DRAMA (Design Routine for Adopting Modular Assembly) model as developed by a team from the IDOM Research Unit at Aston University as a starting point, the research employed empirically based fieldwork and a survey to investigate the process of production systems design and implementation within four UK manufacturing industries: electronics assembly, electrical engineering, mechanical engineering and carpet manufacturing. The intention was to validate the basic DRAMA model as a framework for research enquiry within the electronics industry, where the initial IDOM work was conducted, and then to test its generic applicability, further developing the model where appropriate, within the other industries selected. The thesis contains a review of production systems design theory and practice prior to presenting thirteen industrial case studies of production systems design from the four industry sectors. The results and analysis of the postal survey into production systems design are then presented. The strategic decisions of manufacturing and their relationship to production systems design, and the detailed process of production systems design and operation are then discussed. These analyses are used to develop the generic model of production systems design entitled DRAMA II (Decision Rules for Analysing Manufacturing Activities). The model contains three main constituent parts: the basic DRAMA model, the extended DRAMA II model showing the imperatives and relationships within the design process, and a benchmark generic approach for the design and analysis of each component in the design process. DRAMA II is primarily intended for use by researchers as an analytical framework of enquiry, but is also seen as having application for manufacturing practitioners.
Abstract:
The widespread implementation of Manufacturing Resource Planning (MRPII) systems in this country and abroad, and the reported dissatisfaction with their use, formed the initial basis of this piece of research, which concentrates on the fundamental theory and design of the Closed Loop MRPII system itself. The dissertation concentrates on two key aspects, namely: how Master Production Scheduling is carried out in differing business environments, and how well the `closing of the loop' operates by checking the capacity requirements of the different levels of plans within an organisation. The main hypothesis which is tested is that in U.K. manufacturing industry, resource checks are either not being carried out satisfactorily or they are not being fed back to the appropriate plan in a timely fashion. The research methodology employed involved initial detailed investigations into Master Scheduling and capacity planning in eight diverse manufacturing companies. This was followed by a nationwide survey of users in 349 companies, a survey of all the major suppliers of Production Management software in the U.K. and an analysis of the facilities offered by current software packages. The main conclusion which is drawn is that the hypothesis is proved in the majority of companies, in that only just over 50% of companies are attempting Resource and Capacity Planning and only 20% are successfully feeding back CRP information to `close the loop'. Various causative factors are put forward and remedies are suggested.
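The resource check whose absence the hypothesis targets can be sketched as a load-versus-capacity comparison per work centre, with any overload fed back so the master schedule is revised before release. All names and figures below are hypothetical.

```python
# Closed-loop capacity check: compare the load implied by the master
# production schedule against available capacity per work centre.
capacity = {"machining": 160.0, "assembly": 120.0}  # hours per period
load = {"machining": 150.0, "assembly": 140.0}      # scheduled hours

overloads = {
    wc: load[wc] - capacity[wc]
    for wc in capacity
    if load[wc] > capacity[wc]
}

if overloads:
    # In a truly closed loop this result is fed back so the master
    # schedule (or the capacity plan) is adjusted, not ignored.
    print("replan needed:", overloads)
else:
    print("schedule feasible")
```

The survey finding, that only 20% of companies feed this information back, corresponds to skipping the `if overloads:` branch entirely.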
Abstract:
The recent explosive growth in advanced manufacturing technology (AMT) and continued development of sophisticated information technologies (IT) is expected to have a profound effect on the way we design and operate manufacturing businesses. Furthermore, the escalating capital requirements associated with these developments have significantly increased the level of risk associated with initial design, ongoing development and operation. This dissertation has examined the integration of two key sub-elements of the Computer Integrated Manufacturing (CIM) system, namely the manufacturing facility and the production control system. This research has concentrated on the interactions between production control (MRP) and an AMT based production facility. The disappointing performance of such systems has been discussed in the context of a number of potential technological and performance incompatibilities between these two elements. It was argued that the design and selection of operating policies for both is the key to successful integration. Furthermore, policy decisions are shown to play an important role in matching the performance of the total system to the demands of the marketplace. It is demonstrated that a holistic approach to policy design must be adopted if successful integration is to be achieved. It is shown that the complexity of the issues resulting from such an approach required the formulation of a structured design methodology. Such a methodology was subsequently developed and discussed. This combined a first principles approach to the behaviour of system elements with the specification of a detailed holistic model for use in the policy design environment. The methodology aimed to make full use of the `low inertia' characteristics of AMT, whilst adopting a JIT configuration of MRP and re-coupling the total system to the market demands. 
This dissertation discussed the application of the methodology to an industrial case study and the subsequent design of operational policies. Consequently a novel approach to production control resulted, a central feature of which was a move toward reduced manual intervention in the MRP processing and scheduling logic, with increased human involvement and motivation in the management of work-flow on the shopfloor. Experimental results indicated that significant performance advantages would result from the adoption of the recommended policy set.
Abstract:
Investigation of the different approaches used by Expert Systems researchers to solve problems in the domain of Mechanical Design and Expert Systems was carried out. The techniques used for conventional formal logic programming were compared with those used when applying Expert Systems concepts. A literature survey of design processes was also conducted with a view to adopting a suitable model of the design process. A model, comprising a variation on two established ones, was developed and applied to a problem within what are described as class 3 design tasks. The research explored the application of these concepts to Mechanical Engineering Design problems and their implementation on a microcomputer using an Expert System building tool. It was necessary to explore the use of Expert Systems in this manner so as to bridge the gap between their use as a control structure and for detailed analytical design. The former application is well researched, and this thesis discusses the latter. Some Expert System building tools available to the author at the beginning of his work were evaluated specifically for their suitability for Mechanical Engineering design problems. Microsynics was found to be the most suitable on which to implement a design problem because of its simple but powerful Semantic Net Knowledge Representation structure and the ability to use other types of representation schemes. Two major implementations were carried out. The first involved a design program for a helical compression spring and the second a gearpair system design. Two concepts were proposed in the thesis for the modelling and implementation of design systems involving many equations. The method proposed enables equation manipulation and analysis using a combination of frames, semantic nets and production rules.
The use of semantic nets for purposes other than psychology and natural language interpretation is quite new and represents one of the major contributions to knowledge by the author. The development of a purpose-built shell program for this type of design problem was recommended as an extension of the research. Microsynics may usefully be used as a platform for this development.
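The frame-plus-production-rule idea for equation handling can be illustrated with the helical spring's rate equation, k = G*d^4 / (8*D^3*n): a frame holds parameter slots, and a rule fires once enough slots are filled to solve for the unknown. The slot names and values below are hypothetical illustrations, not taken from the thesis's Microsynics implementation.

```python
# A "frame" as a dict of slots for a helical compression spring:
# shear modulus G (Pa), wire diameter d (m), coil diameter D (m),
# number of active coils n, and the spring rate k (N/m) to be derived.
frame = {"G": 79.3e9, "d": 2e-3, "D": 20e-3, "n": 10, "k": None}

def rule_spring_rate(f):
    """Production rule: fire only when k is unknown and every
    right-hand-side slot of k = G*d^4/(8*D^3*n) is filled."""
    if f["k"] is None and all(f[s] is not None for s in ("G", "d", "D", "n")):
        f["k"] = f["G"] * f["d"] ** 4 / (8 * f["D"] ** 3 * f["n"])
        return True  # rule fired
    return False

fired = rule_spring_rate(frame)
print(fired, round(frame["k"], 1))  # spring rate in N/m
```

A shell of the kind recommended would hold many such rules and let the inference engine, rather than the programmer, decide the order in which equations are solved.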