899 results for Design methods
Abstract:
The development of a Laser Doppler Anemometer technique to measure the velocity distribution in a commercial plate heat exchanger is described. Detailed velocity profiles are presented and a preliminary investigation is reported on flow behaviour through a single cell in the channel matrix. The objective of the study was to extend previous investigations of plate heat exchanger flow patterns in the laminar range, with the eventual aim of establishing the effect of flow patterns on heat transfer performance, thus leading to improved plate heat exchanger design and design methods. Accurate point velocities were obtained by laser anemometry in a perspex replica of the metal channel. Oil was used as the circulating liquid, with a refractive index matched to that of the perspex so that the laser beams were not distorted. Cell-by-cell velocity measurements over a range of Reynolds numbers up to ten showed significant liquid mal-distribution. Local cell velocities were found to be as high as twenty-seven times the average velocity, contrary to the previously held belief of four times. The degree of mal-distribution varied across the channel as well as in the vertical direction, and depended on whether the flow was upward or downward. At Reynolds numbers less than one, flow zig-zagged from one side of the channel to the other in wave form, but increases in Reynolds number improved liquid distribution. A detailed examination of selected cells showed velocity variations in different directions, together with variation within individual cells. Experimental results are also reported on the flow split when passing through a single cell in a section of a channel. These observations were used to explain mal-distribution in the perspex channel itself.
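As a rough illustration of the flow regime this abstract refers to, the sketch below computes a channel Reynolds number from a mean velocity and hydraulic diameter. All numerical values (oil properties, velocity, channel dimension) are illustrative assumptions, not data from the study.

```python
# Minimal sketch: channel Reynolds number for a plate heat exchanger passage.
# All numerical values are illustrative assumptions, not data from the study.

def reynolds_number(density, velocity, hydraulic_diameter, viscosity):
    """Re = rho * u * d_h / mu for flow through a plate channel."""
    return density * velocity * hydraulic_diameter / viscosity

# Example: a viscous oil (as used for refractive-index matching) at a low mean velocity.
rho = 900.0      # kg/m^3, assumed oil density
mu = 0.5         # Pa.s, assumed oil viscosity
u_mean = 0.01    # m/s, assumed mean channel velocity
d_h = 0.005      # m, assumed hydraulic diameter of the corrugated channel

re = reynolds_number(rho, u_mean, d_h, mu)
print(f"Re = {re:.3f}")   # well below ten, i.e. within the laminar range studied
```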
Abstract:
Symbiotic design methods aim to take technical, social and organizational criteria into account simultaneously. Over the years, many symbiotic methods have been developed and applied in various countries. Nevertheless, the diagnosis that only technical criteria receive attention in the design of production systems is still made repeatedly. Examples of symbiotic approaches are presented at three different levels: technical systems, organizations, and the process. From these, discussion points are generated concerning the character of the approaches, the importance of economic motives, the impact of national environments, the necessity of a guided design process, the use of symbiotic methods, and the roles of participants in the design process.
Abstract:
This work reports the development of a mathematical model and distributed, multivariable computer control for a pilot-plant double-effect climbing-film evaporator. A distributed-parameter model of the plant has been developed and the time-domain model transformed into the Laplace domain. The model has been further transformed into an integral domain conforming to an algebraic ring of polynomials, to eliminate the transcendental terms which arise in the Laplace domain due to the distributed nature of the plant model. This has made possible the application of linear control theories to a set of linear partial differential equations. The models obtained tracked the experimental results of the plant well. A distributed computer network has been interfaced with the plant to implement digital controllers in a hierarchical structure. A modern multivariable Wiener-Hopf controller has been applied to the plant model. The application has revealed a limiting condition: the plant matrix should be positive-definite along the infinite-frequency axis. A new multivariable control theory has emerged from this study which avoids the above limitation. The controller has the structure of the modern Wiener-Hopf controller, but with a unique feature enabling a designer to specify the closed-loop poles in advance and to shape the sensitivity matrix as required. In this way, the method treats directly the interaction problems found in chemical processes, with good tracking and regulation performance. The ability of analytical design methods to determine once and for all whether a given set of specifications can be met is one of their chief advantages over conventional trial-and-error design procedures; one disadvantage that offsets these advantages to some degree, however, is the relatively complicated algebra that must be employed in working out all but the simplest problems. Mathematical algorithms and computer software have been developed to treat some of the mathematical operations defined over the integral domain, such as matrix fraction description, spectral factorization, the Bezout identity, and the general manipulation of polynomial matrices. Hence, the design problems of Wiener-Hopf-type controllers and other similar algebraic design methods can be easily solved.
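The abstract names the Bezout identity among the polynomial operations implemented. The sketch below solves the scalar Bezout identity a(s)x(s) + b(s)y(s) = gcd(a, b) with the extended Euclidean algorithm for polynomials; it is a textbook routine for the scalar case, not the thesis's polynomial-matrix software, and the example polynomials are arbitrary.

```python
# Minimal sketch: scalar Bezout identity a(s)*x(s) + b(s)*y(s) = gcd(a, b),
# solved by the extended Euclidean algorithm for polynomials.

import numpy as np
from numpy.polynomial import Polynomial as P

def bezout(a, b, tol=1e-9):
    """Return (x, y, g) such that a*x + b*y = g, with g the monic gcd of a and b."""
    r0, r1 = a, b
    x0, x1 = P([1.0]), P([0.0])
    y0, y1 = P([0.0]), P([1.0])
    while np.max(np.abs(r1.coef)) > tol:
        q, rem = divmod(r0, r1)
        r0, r1 = r1, rem.trim(tol)
        x0, x1 = x1, x0 - q * x1
        y0, y1 = y1, y0 - q * y1
    lead = r0.coef[-1]                      # normalise so the gcd is monic
    return x0 / lead, y0 / lead, r0 / lead

# Example: coprime a(s) = s^2 + 3s + 2 and b(s) = s + 3, so the gcd is 1
a = P([2.0, 3.0, 1.0])
b = P([3.0, 1.0])
x, y, g = bezout(a, b)
print("gcd:", g)
print("check a*x + b*y:", (a * x + b * y).trim(1e-8))   # should print 1.0
```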
Abstract:
Packed beds have many industrial applications and are increasingly used in the process industries due to their low pressure drop. With the introduction of more efficient packings, novel packing materials (e.g. adsorbents) and new applications (e.g. flue gas desulphurisation), the aspect ratio (height to diameter) of such beds is decreasing. Obtaining uniform gas distribution in such beds is of crucial importance in minimising operating costs and optimising plant performance. Since, to some extent, a packed bed acts as its own distributor, the importance of obtaining uniform gas distribution has increased as aspect ratios decrease. There is no rigorous design method for distributors, owing to a limited understanding of the fluid flow phenomena and in particular of the effect of the bed base/free fluid interface. This study is based on a combined theoretical and modelling approach. The starting point is the Ergun equation, which is used to determine the pressure drop over a bed where the flow is uni-directional. This equation has been applied in a vectorial form so that it can be applied to maldistributed and multi-directional flows, and has been realised in the Computational Fluid Dynamics code PHOENICS. The use of this equation and its application have been verified by modelling experimental measurements of maldistributed gas flows where there is no free fluid/bed base interface. A novel, two-dimensional experiment has been designed to investigate the fluid mechanics of maldistributed gas flows in shallow packed beds. The flow through the outlet of the duct below the bed can be controlled, permitting a rigorous investigation. The results from this apparatus provide useful insights into the fluid mechanics of flow in and around a shallow packed bed and show the critical effect of the bed base. The PHOENICS/vectorial Ergun equation model has been adapted to model this situation. The model has been improved by the inclusion of spatial voidage variations in the bed and the prescription of a novel bed base boundary condition. This boundary condition is based on the logarithmic law for velocities near walls, without restricting the velocity at the bed base to zero, and is applied within a turbulence model. The flow in a curved bed section, which is three-dimensional in nature, is examined experimentally. The effects of the walls and of changes in gas direction on the gas flow are shown to be particularly significant. As before, the relative amounts of gas flowing through the bed and the duct outlet can be controlled. The model and improved understanding of the underlying physical phenomena form the basis for the development of new distributors and rigorous design methods for them.
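The Ergun equation referred to above has a standard one-dimensional form, sketched below for a uniform superficial velocity. The particle size, voidage and fluid properties are illustrative assumptions; the study itself applied a vectorial form of this equation within PHOENICS.

```python
# Minimal sketch: pressure-gradient magnitude from the Ergun equation for
# one-dimensional flow through a packed bed. Values are illustrative
# assumptions, not the study's conditions.

def ergun_pressure_gradient(u, d_p, eps, rho, mu):
    """dP/L (Pa/m) for superficial velocity u through particles of diameter d_p at voidage eps."""
    viscous = 150.0 * mu * (1.0 - eps) ** 2 * u / (eps ** 3 * d_p ** 2)
    inertial = 1.75 * rho * (1.0 - eps) * u ** 2 / (eps ** 3 * d_p)
    return viscous + inertial

# Example: air flowing through a shallow bed of 10 mm packing at 40% voidage.
dp_per_m = ergun_pressure_gradient(u=1.0, d_p=0.01, eps=0.4, rho=1.2, mu=1.8e-5)
print(f"Pressure gradient = {dp_per_m:.0f} Pa/m")
```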
Abstract:
This work studies the development of polymer membranes for the separation of hydrogen and carbon monoxide from a syngas produced by the partial oxidation of natural gas. The CO product is then used for the large-scale manufacture of acetic acid by reaction with methanol. A method of economic evaluation has been developed for the process as a whole, and a comparison is made between separation of the H2/CO mixture by a membrane system and the conventional method of cryogenic distillation. Costs are based on bids obtained from suppliers for several different specifications of the purity of the CO fed to the acetic acid reactor. When the purity of the CO is set at that obtained by cryogenic distillation, it is shown that the membrane separator offers only a marginal cost advantage. Cost parameters for the membrane separation systems have been defined in terms of effective selectivity and cost permeability. These new parameters, obtained from an analysis of the bids, are then used in a procedure which defines the optimum degree of separation and recovery of carbon monoxide for a minimum cost of manufacture of acetic acid. It is shown that a significant cost reduction is achieved with a membrane separator at the optimum process conditions. A method of "targeting" the properties of new membranes has been developed. This involves defining the properties of new (hypothetical, yet-to-be-developed) membranes such that their use for the hydrogen/carbon monoxide separation will produce a reduced cost of acetic acid manufacture. The use of the targeting method is illustrated in the development of new membranes for the separation of hydrogen and carbon monoxide. The selection of polymeric materials for new membranes is based on molecular design methods which predict polymer properties from the molecular groups making up the polymer molecule. Two approaches have been used. The first develops the analogy between gas solubility in liquids and that in polymers, with the UNIFAC group contribution method used to predict gas solubility in liquids. In the second, the polymer Permachor number developed by Salame has been correlated with hydrogen and carbon monoxide permeabilities, and these correlations are used to predict the permeabilities of gases through polymers. Materials have been tested for hydrogen and carbon monoxide permeabilities, and improvements in expected economic performance have been achieved.
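The second approach above correlates log-permeability against the Permachor number. The sketch below fits a correlation of that general form and uses it to predict the permeability of a candidate polymer; the assumed linear form log10(P) = a + b*Permachor and all numbers are purely illustrative placeholders, not measured data or the thesis's correlation.

```python
# Minimal sketch: correlating log-permeability with the polymer Permachor number,
# in the spirit of the Salame approach described above. Placeholder values only.

import numpy as np

# placeholder (Permachor number, log10 permeability) pairs for known polymers
permachor = np.array([20.0, 40.0, 60.0, 80.0])
log10_perm = np.array([1.5, 0.4, -0.7, -1.8])

# least-squares fit of the assumed linear form log10(P) = a + b * Permachor
b, a = np.polyfit(permachor, log10_perm, 1)

def predict_permeability(pi):
    """Predicted permeability (arbitrary units) for a candidate polymer."""
    return 10.0 ** (a + b * pi)

print(f"log10(P) = {a:.2f} + ({b:.3f}) * Permachor")
print(f"Predicted P at Permachor 50: {predict_permeability(50.0):.3g}")
```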
Abstract:
The main theme of this project is the study of neural networks for the control of uncertain and non-linear systems. This involves the control of continuous-time, discrete-time, hybrid and stochastic systems with input, state or output constraints, while ensuring good performance. A large part of the project is devoted to bridging several mathematical and engineering approaches in order to tackle complex but very common non-linear control problems. The objectives are: 1. to design and develop procedures for neural-network-enhanced self-tuning adaptive non-linear control systems; 2. to design, as a general procedure, a neural network generalised minimum variance self-tuning controller for non-linear dynamic plants (integration of neural network mapping with generalised minimum variance self-tuning controller strategies); 3. to develop a software package to evaluate control system performance using Matlab, Simulink and the Neural Network toolbox. An adaptive control algorithm utilising a recurrent network as a model of a partially unknown non-linear plant with unmeasurable state is proposed. It appears that structured recurrent neural networks can provide conveniently parameterised dynamic models of many non-linear systems for use in adaptive control. Properties of static neural networks which enabled the successful design of stable adaptive control in the state feedback case are also identified. A survey of existing results is presented which puts them in a systematic framework, showing their relation to classical self-tuning adaptive control and the application of neural control to SISO/MIMO systems. Simulation results demonstrate that the self-tuning design methods may be practically applicable to a reasonably large class of unknown linear and non-linear dynamic control systems.
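The classical self-tuning machinery this abstract builds on estimates plant parameters on-line, typically with recursive least squares (RLS). The sketch below runs RLS on a first-order ARX plant; it is the standard textbook ingredient, not the neural-network controller of the thesis, and the plant parameters and signals are illustrative assumptions.

```python
# Minimal sketch: recursive least-squares (RLS) parameter estimation for a
# first-order ARX plant, the classical ingredient of self-tuning control.

import numpy as np

rng = np.random.default_rng(0)

# "unknown" plant: y[k] = a*y[k-1] + b*u[k-1] + noise   (assumed values)
a_true, b_true = 0.8, 0.5

theta = np.zeros(2)          # estimated [a, b]
P = np.eye(2) * 100.0        # covariance of the estimate
lam = 0.99                   # forgetting factor

y_prev, u_prev = 0.0, 0.0
for k in range(200):
    u = rng.normal()                                   # excitation input
    y = a_true * y_prev + b_true * u_prev + 0.01 * rng.normal()

    phi = np.array([y_prev, u_prev])                   # regressor
    gain = P @ phi / (lam + phi @ P @ phi)
    theta = theta + gain * (y - phi @ theta)           # update estimate
    P = (P - np.outer(gain, phi @ P)) / lam

    y_prev, u_prev = y, u

print(f"estimated a, b = {theta[0]:.3f}, {theta[1]:.3f}  (true: {a_true}, {b_true})")
```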
Abstract:
This thesis encompasses an investigation of the behaviour of a concrete frame structure under localised fire scenarios, carried out by implementing a constitutive model in a finite-element computer program. The investigation covered the properties of materials at elevated temperature, a description of the computer program, and thermal and structural analyses. Transient thermal properties of the materials have been employed in this study to achieve reasonable results. The finite-element package ANSYS is used in the present analyses to examine the effect of fire on the concrete frame under five different fire scenarios. In addition, a report on the full-scale BRE Cardington concrete building, designed to Eurocode 2 and BS 8110 and subjected to a realistic compartment fire, is also presented. The transient analyses of the present model included additional specific heat, above the base value for dry concrete, at temperatures of 100°C to 200°C. The combined convective-radiative heat transfer coefficient and transient thermal expansion have also been considered in the analyses. For the analyses with transient strains included, the constitutive model based on the empirical formulation of the full thermal strain-stress model proposed by Li and Purkiss (2005) is employed. Comparisons between the models with and without transient strains are also discussed. Results of the present study indicate that the behaviour of the complete structure is significantly different from the behaviour of the individual isolated members on which current design methods are based. Although the current tabulated design procedures are conservative when the entire building performance is considered, the beneficial and detrimental effects of thermal expansion in complete structures should be taken into account. Therefore, developing new fire engineering methods from the study of complete structures rather than from individual isolated member behaviour is essential.
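A common way to represent the additional specific heat mentioned above is a moisture peak added to the dry-concrete curve in the 100 to 200 °C range. The sketch below follows the piecewise EN 1992-1-2 form as a plausible stand-in; the peak value for the assumed moisture content is an input assumption, and none of this is taken from the thesis itself.

```python
# Minimal sketch: temperature-dependent specific heat of concrete with an added
# moisture peak between 100 and 200 deg C, in the spirit of the EN 1992-1-2 curve.

def concrete_specific_heat(theta, cp_peak=1470.0):
    """Specific heat (J/kg.K) of concrete at temperature theta (deg C)."""
    # dry-concrete base curve
    if theta <= 100.0:
        cp = 900.0
    elif theta <= 200.0:
        cp = 900.0 + (theta - 100.0)
    elif theta <= 400.0:
        cp = 1000.0 + (theta - 200.0) / 2.0
    else:
        cp = 1100.0

    # moisture peak: constant up to 115 deg C, decaying linearly back to the
    # dry-curve value (1000 J/kg.K) at 200 deg C
    if 100.0 < theta <= 115.0:
        cp = cp_peak
    elif 115.0 < theta < 200.0:
        cp = max(cp, cp_peak - (cp_peak - 1000.0) * (theta - 115.0) / 85.0)
    return cp

for t in (20, 110, 150, 300, 600):
    print(t, concrete_specific_heat(t))
```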
Abstract:
PURPOSE: To examine whether objective performance of near tasks is improved with various electronic vision enhancement systems (EVES) compared with the subject's own optical magnifier. DESIGN: Experimental study, randomized, within-patient design. METHODS: This was a prospective study, conducted in a hospital ophthalmology low-vision clinic. The patient population comprised 70 sequential visually impaired subjects. The magnifying devices examined were: the patient's optimum optical magnifier; magnification- and field-of-view-matched mouse EVES with monitor or head-mounted display (HMD) viewing; and stand EVES with monitor viewing. The tasks performed were: reading speed and acuity; time taken to track from one column of print to the next; following a route map and locating a specific feature; and identification of specific information from a medicine label. RESULTS: Mouse EVES with HMD viewing resulted in lower reading speeds than stand EVES with monitor viewing (F = 38.7, P < .001). Reading with the optical magnifier was slower than with the mouse or stand EVES with monitor viewing at smaller print sizes (P < .05). The column location task was faster with the optical magnifier than with any of the EVES (F = 10.3, P < .001). The map tracking and medicine label identification tasks were slower with the mouse EVES with HMD viewing than with the other magnifiers (P < .01). Previous EVES experience had no effect on task performance (P > .05), but subjects with previous optical magnifier experience were significantly slower at performing the medicine label identification task with all of the EVES (P < .05). CONCLUSIONS: Although EVES provide objective benefits to the visually impaired in reading speed and acuity, together with some specific near tasks, some tasks can be performed just as fast using optical magnification.
Abstract:
Carte du Ciel (French for "map of the sky") was part of an extensive 19th-century international astronomical project whose goal was to map the entire visible sky. The results of this vast effort were collected in the form of astrographic plates and their paper reproductions, called astrographic maps, which are widely distributed among many observatories and astronomical institutes around the world. Our goal is to design methods and algorithms to automatically extract data from digitized Carte du Ciel astrographic maps. This paper examines the image processing and pattern recognition techniques that can be adopted for the automatic extraction of astronomical data from stars' triple exposures, which can aid the detection of variable stars in Carte du Ciel maps.
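A generic first step for this kind of extraction is background thresholding followed by connected-component labelling and centroid measurement, sketched below on a synthetic image. This is standard scipy.ndimage usage and a stand-in input, not the specific algorithms developed in the paper or a real scanned plate.

```python
# Minimal sketch: detect bright "star" blobs by thresholding and labelling.

import numpy as np
from scipy import ndimage

# synthetic "plate": noisy background with a few bright Gaussian blobs
rng = np.random.default_rng(1)
image = rng.normal(10.0, 2.0, size=(200, 200))
yy, xx = np.mgrid[0:200, 0:200]
for y, x in [(50, 60), (120, 80), (150, 170)]:
    image += 100.0 * np.exp(-((yy - y) ** 2 + (xx - x) ** 2) / 8.0)

# threshold a few sigma above the background, then label connected regions
threshold = image.mean() + 5.0 * image.std()
mask = image > threshold
labels, n_objects = ndimage.label(mask)
centroids = ndimage.center_of_mass(image, labels, range(1, n_objects + 1))

print(f"{n_objects} candidate star images found")
for cy, cx in centroids:
    print(f"  centroid at (row={cy:.1f}, col={cx:.1f})")
```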
Abstract:
Ongoing advances in technology are increasing the scope for enhancing and supporting older adults' daily living. The digital divide between older and younger adults raises concerns, however, about the suitability of technological solutions for older adults, especially for those with impairments. Taking older adults with Age-Related Macular Degeneration (AMD) as a case study, we used user-centred and participatory design approaches to develop an assistive mobile app for self-monitoring their food intake [12,13]. In this paper we report findings from a longitudinal field evaluation of our app, conducted to investigate how it was received and adopted by older adults with AMD and what impact it had on their lives. Demonstrating the benefit of applying inclusive design methods to technology for older adults, our findings reveal how use of the app raises participants' awareness of their diet, facilitates self-monitoring, encourages positive dietary behaviour change, and supports learning.
Abstract:
In establishing the reliability of performance-related design methods for concrete, which are relevant for resistance against chloride-induced corrosion, long-term experience of local materials and practices and detailed knowledge of the ambient and local micro-climate are critical. Furthermore, in the development of analytical models for performance-based design, calibration against test data representative of actual conditions in practice is required. To this end, the current study presents results from full-scale concrete pier-stems under long-term exposure to a marine environment, with the work focusing on XS2 (below mid-tide level), in which the concrete is regarded as fully saturated, and XS3 (tidal, splash and spray), in which the concrete is in an unsaturated condition. These exposures represent the zones where concrete structures are most susceptible to ionic ingress and deterioration. Chloride profiles and chloride transport behaviour are studied using both an empirical model (erfc function) and a physical model (ClinConc). The time dependency of the surface chloride concentration (Cs) and the apparent diffusivity (Da) was established for the empirical model, whereas in the ClinConc model (originally based on saturated concrete) two new environmental factors were introduced for the XS3 environmental exposure zone. Although XS3 is considered a single environmental exposure zone according to BS EN 206-1:2013, the work has highlighted that even within this zone significant changes in chloride ingress are evident. This study aims to update the parameters of both models for predicting the long-term transport behaviour of concrete subjected to environmental exposure classes XS2 and XS3.
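The empirical model mentioned above is the standard error-function solution of Fick's second law with a constant surface concentration, C(x, t) = Cs * erfc(x / (2 * sqrt(Da * t))), assuming zero initial chloride. The sketch below evaluates it with illustrative values of Cs and Da; these are assumptions, not the pier-stem data or fitted parameters.

```python
# Minimal sketch: the empirical erfc chloride-ingress model,
# C(x, t) = Cs * erfc( x / (2 * sqrt(Da * t)) ), with placeholder parameters.

import math

def chloride_profile(x_mm, t_years, Cs=0.4, Da_mm2_per_year=30.0):
    """Chloride content at depth x_mm after t_years of exposure (illustrative units)."""
    return Cs * math.erfc(x_mm / (2.0 * math.sqrt(Da_mm2_per_year * t_years)))

# Example: predicted profile after 20 years' exposure
for depth in (10, 25, 50, 75):
    print(f"x = {depth:3d} mm : C = {chloride_profile(depth, 20.0):.3f}")
```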
Abstract:
This research aims to make a contribution to design thinking at a global cultural scale, and specifically to how design methods are a feature of the homogenising and heterogenising forces of globalisation via creative destruction. Since Schumpeter's description of economic innovation destroying the old and creating the new, a number of other interpretations of creative destruction have developed, including those driving cultural evolution. However, a design model showing the impact of different types of design method on cultural evolution can develop a more systemic understanding of the medium- to longer-term impact of new designs that homogenise or increase the differences between various cultures. This research explores the theoretical terrain between creative destruction, design thinking and cybernetics in the context of exchanging cultural influences for collaborative creativity, and concludes with an experiment that proposes a feedback loop between ubiquitising and differentiating design methods mediating cultural variety in creative ecosystems.
Identifying Risk Factors for Postpartum Mood Episodes in Bipolar Disorder – A UK Prospective Study
Abstract:
Background and Aims: Women with bipolar disorder are vulnerable to episodes postpartum, but risk factors are poorly understood. We are exploring risk factors for postpartum mood episodes in women with bipolar disorder using a prospective longitudinal design. Methods: Pregnant women with lifetime DSM-IV bipolar disorder are being recruited into the Bipolar Disorder Research Network (www.BDRN.org). Baseline assessments during late pregnancy include lifetime psychopathology and potential risk factors for perinatal episodes such as medication use, sleep, obstetric factors, and psychosocial factors. Blood samples are taken for genetic analysis. Perinatal psychopathology is assessed via follow-up interview at 12 weeks postpartum. Interview data are supplemented by clinician questionnaires and case-note review. Potential risk factors will be compared between women who experience perinatal episodes and those who remain well. Results: 80 participants have been recruited to date. 32/61 (52%) of women had a perinatal recurrence by follow-up. 16 (26%) had onset in pregnancy. 21 (34%) had postpartum onset, 19 (90%) of them within 6 weeks of delivery: 11 (18%) postpartum psychosis, 5 (8%) postpartum hypomania, and 5 (8%) postpartum depression. Postpartum relapse was more frequent in women with bipolar I than bipolar II disorder (45% vs 17%). 62% of women with postpartum relapse took prophylactic medication peripartum, and almost all (95%) received care from secondary psychiatric services. Conclusions: The rate of postpartum relapse is high, despite most women receiving specialist care and medication perinatally. A larger sample size will allow us to examine potential risk factors for postpartum episodes, which will assist in providing accurate and personalised advice to women with bipolar disorder who are considering pregnancy.
Abstract:
Timber structures must inevitably include connections capable of transferring loads between members adequately to ensure the integrity of the structure. Connections are a critical part of timber structures since, in most cases, they are what allow energy to be dissipated and a ductile failure mode to be obtained under seismic loads. This failure mode is preferable because it produces large deformations before collapse, allowing occupants to evacuate safely during an earthquake. Small-diameter fasteners such as nails, rivets and screws are frequently used in timber construction and are assumed to lead to ductile failure, although it is impossible for designers to predict the failure mode exactly with the current design method. Moreover, rivets have a very limited range of application because the design method currently in use applies only to very specific configurations, species and types of wood products. The objective of this project is to evaluate a new design method proposed by researchers from New Zealand, Zarnani and Quenneville, for riveted connections but adaptable to timber connections with small-diameter fasteners. It allows the designer to determine precisely the failure mode of connections of different configurations with different wood products. More than 70 tests on riveted and nailed connections resisting loads ranging from 40 kN to 800 kN were carried out as part of this research project in order to validate the use of this method with the Canadian glued-laminated timber product Nordic Lam and to compare it with the method currently used in Canada. Ductile, brittle and mixed failure modes were predicted, with the emphasis on the brittle mode since it is the most variable and the least studied. The Nordic Lam glued-laminated connections were nailed or riveted in different configurations ranging from 18 to 128 nails or rivets. The results show good prediction of the resistance and failure modes of the nailed and riveted connections. For some configurations of riveted connections, the predictions of the new method are higher than those of the current method. The nailed connections exhibited failure of the nail shank at the shear plane in all tests performed, which does not correspond to a ductile or brittle mode predicted by the design method.
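The core idea of a mode-predicting method of this kind is that the connection capacity is the minimum of the capacities computed for each failure mode, and the governing minimum identifies the predicted mode. The sketch below shows only that selection step with placeholder capacities; it is not the Zarnani and Quenneville equations, omits the mixed mode, and uses no test data.

```python
# Minimal sketch of governing-failure-mode selection for a fastener group:
# the design capacity is the minimum of the per-mode capacities, and the
# minimum identifies the predicted mode. Placeholder values only.

def governing_mode(ductile_capacity_kN, brittle_capacity_kN):
    """Return (design capacity, predicted failure mode) for a connection."""
    if ductile_capacity_kN <= brittle_capacity_kN:
        return ductile_capacity_kN, "ductile (fastener yielding)"
    return brittle_capacity_kN, "brittle (wood failure)"

# Example: a riveted connection whose per-mode capacities have already been evaluated
capacity, mode = governing_mode(ductile_capacity_kN=210.0, brittle_capacity_kN=185.0)
print(f"Predicted capacity = {capacity} kN, governing mode: {mode}")
```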