885 results for Design problems
Abstract:
The suspen-dome system is a new structural form that has become popular in the construction of long-span roof structures. These domes are very slender and lightweight, and their configuration is complicated; careful, sequential consideration is therefore needed in the structural design. This paper focuses on these considerations, which include the method for designing the cable prestress forces, a simplified analysis method, and the estimation of buckling capacity. Buckling is one of the most important problems for dome structures. This paper presents the findings of an intensive buckling study of the Lamella suspen-dome system that takes geometric imperfection, asymmetric loading, rise-to-span ratio, and connection rigidity into consideration. Finally, suggested design and construction guidelines are given in the conclusion of this paper. © 2005 Elsevier Ltd. All rights reserved.
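The abstract does not spell out the buckling formulation, but a common way to estimate the buckling capacity of a discretized structure is the linearized eigenproblem K φ = λ K_g φ, where K is the elastic stiffness, K_g the geometric stiffness under a reference load, and the smallest eigenvalue λ scales that load to the critical load. A minimal sketch with invented two-degree-of-freedom matrices, not the paper's suspen-dome model:

```python
# Linearized buckling estimate, K phi = lambda * Kg phi (toy matrices only).
import numpy as np
from scipy.linalg import eigh

K = np.array([[ 4.0, -2.0],
              [-2.0,  4.0]])           # elastic stiffness (invented 2-DOF case)
Kg = np.array([[1.0, 0.0],
               [0.0, 2.0]])            # geometric stiffness under reference load

lams = eigh(K, Kg, eigvals_only=True)  # generalized symmetric eigenproblem
print("critical load factor:", lams.min())
```

In a real analysis the matrices come from the finite element assembly, and the factors studied in the paper (imperfections, asymmetric load, connection rigidity) enter through K and K_g.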
Abstract:
Owing to the high vulnerability of liquid retaining structures to corrosion problems, there are stringent requirements in their design against cracking. In this paper, a prototype knowledge-based system for the design of liquid retaining structures is developed and implemented on the blackboard architecture. A commercially available expert system shell, VISUAL RULE STUDIO, working as an ActiveX Designer under the VISUAL BASIC programming environment, is employed. A hybrid knowledge representation approach, combining production rules and procedural methods under object-oriented programming, is used to represent the engineering heuristics and design knowledge of this domain. It is demonstrated that the blackboard architecture is capable of integrating different kinds of knowledge in an effective manner. The system is tailored to give advice to users regarding preliminary design, loading specification, and optimized configuration selection for this type of structure. An example application illustrates the capabilities of the prototype system in transferring knowledge on liquid retaining structures to novice engineers. © 2004 Elsevier Ltd. All rights reserved.
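The rule base itself is not reproduced in the abstract; as a loose illustration of the production-rule half of the hybrid representation, a forward-chaining loop fires rules against a fact base until no rule adds anything new. The rule contents below are invented placeholders, not the system's actual design knowledge:

```python
# Minimal forward-chaining sketch; rule contents are illustrative only.
facts = {"structure": "liquid_retaining", "exposure": "severe"}

rules = [
    # (condition on the fact base, fact asserted when the condition holds)
    (lambda f: f.get("structure") == "liquid_retaining",
     ("crack_width_limit_mm", 0.2)),          # hypothetical heuristic value
    (lambda f: f.get("exposure") == "severe",
     ("cover_class", "increased")),           # hypothetical heuristic value
]

changed = True
while changed:                                # fire rules to a fixed point
    changed = False
    for condition, (key, value) in rules:
        if condition(facts) and key not in facts:
            facts[key] = value
            changed = True

print(facts)
```

In the blackboard architecture described, several such knowledge sources would read from and post to a shared blackboard rather than a single dict.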
Abstract:
This paper presents a finite-difference time-domain (FDTD) simulator for electromagnetic analysis and design applications in MRI. It is intended to be a complete FDTD model of an MRI system, including all RF and low-frequency field-generating units and electrical models of the patient. The program has been constructed in an object-oriented framework. The design procedure is detailed, and the numerical solver has been verified against analytical solutions for simple cases and applied to various field calculation problems. In particular, the simulator is demonstrated for inverse RF coil design, optimized source profile generation, and parallel imaging in high-frequency situations. The examples show new developments enabled by the simulator and demonstrate that the proposed FDTD framework can be used to analyze large-scale computational electromagnetic problems in modern MRI engineering. © 2004 Elsevier Inc. All rights reserved.
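The paper's solver is a full 3-D MRI model, which an abstract-level sketch cannot reproduce; the one-dimensional free-space update below only illustrates the leapfrog structure of any FDTD scheme (normalized units, Courant number 0.5):

```python
# 1-D FDTD (Yee) sketch: E and H staggered in space and time.
import numpy as np

nz, nt = 200, 400
Ex = np.zeros(nz)          # electric field on integer grid points
Hy = np.zeros(nz - 1)      # magnetic field on half grid points

for n in range(nt):
    Hy += 0.5 * (Ex[1:] - Ex[:-1])            # update H from the curl of E
    Ex[1:-1] += 0.5 * (Hy[1:] - Hy[:-1])      # update E from the curl of H
    Ex[20] += np.exp(-((n - 40) / 12.0) ** 2) # soft Gaussian-pulse source

print("peak |Ex| after", nt, "steps:", np.abs(Ex).max())
```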
Abstract:
The argument from fine tuning is supposed to establish the existence of God from the fact that the evolution of carbon-based life requires the laws of physics and the boundary conditions of the universe to be more or less as they are. We demonstrate that this argument fails. In particular, we focus on problems associated with the role probabilities play in the argument. We show that, even granting the fine tuning of the universe, it does not follow that the universe is improbable, thus no explanation of the fine tuning, theistic or otherwise, is required.
Abstract:
Purpose - In many scientific and engineering fields, large-scale heat transfer problems with temperature-dependent pore-fluid densities are commonly encountered; heat transfer from the mantle into the upper crust of the Earth is a typical example. The main purpose of this paper is to develop and present a new combined methodology to solve large-scale heat transfer problems with temperature-dependent pore-fluid densities at the lithospheric and crustal scales.

Design/methodology/approach - The theoretical approach is used to determine the thickness, and the related thermal boundary conditions, of the continental crust at the lithospheric scale, so that important information can be provided accurately for establishing a numerical model at the crustal scale. The numerical approach is then used to simulate the detailed structures and complicated geometries of the continental crust at the crustal scale. The main advantage of the proposed combination of theoretical and numerical approaches is that, if the thermal distribution in the crust is of primary interest, the use of a reasonable numerical model at the crustal scale can significantly reduce computational effort.

Findings - From the ore body formation and mineralization points of view, the present analytical and numerical solutions demonstrate that the conductive-and-advective lithosphere with variable pore-fluid density is the most favorable lithosphere, because it may result in the thinnest lithosphere, so that the temperature near the surface of the crust can be hot enough to generate shallow ore deposits. The upward throughflow (i.e. mantle mass flux) can have a significant effect on the thermal structure within the lithosphere. In addition, the emplacement of hot materials from the mantle may further reduce the thickness of the lithosphere.

Originality/value - The present analytical solutions can be used to: validate numerical methods for solving large-scale heat transfer problems; provide correct thermal boundary conditions for numerically solving ore body formation and mineralization problems at the crustal scale; and investigate fundamental issues related to thermal distributions within the lithosphere. The proposed finite element analysis can be effectively used to account for the geometrical and material complexities of large-scale heat transfer problems with temperature-dependent fluid densities.
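The analytical solutions themselves are not quoted in the abstract; a representative one-dimensional steady conduction-advection balance for a layer of thickness H with upward throughflow q (a textbook form assumed here, not the paper's full variable-density model) is:

```latex
\lambda \frac{d^{2}T}{dz^{2}} - \rho_{f} c_{p} q \frac{dT}{dz} = 0,
\qquad
T(z) = T_{0} + \left(T_{H} - T_{0}\right)
       \frac{e^{\mathrm{Pe}\,z/H} - 1}{e^{\mathrm{Pe}} - 1},
\qquad
\mathrm{Pe} = \frac{\rho_{f} c_{p} q H}{\lambda},
```

where λ is the thermal conductivity, ρ_f and c_p the pore-fluid density and specific heat, and the Peclet number Pe measures how strongly the throughflow distorts the purely conductive (linear) profile.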
Abstract:
Background. Children of alcoholics are significantly more likely to experience high-risk environmental exposures, including prenatal substance exposure, and are more likely to exhibit externalizing problems [e.g. attention deficit hyperactivity disorder (ADHD)]. While there is evidence that genetic influences and prenatal nicotine and/or alcohol exposure play separate roles in determining risk of ADHD, little has been done to determine the joint roles that genetic risk associated with maternal alcohol use disorder (AUD) and prenatal risk factors play in determining risk of ADHD.

Method. Using a children-of-twins design, diagnostic telephone interview data from high-risk families (female monozygotic and dizygotic twins concordant or discordant for AUD as parents) and control families targeted from a large Australian twin cohort were analyzed using logistic regression models.

Results. Offspring of twins with a history of AUD, as well as offspring of non-AUD monozygotic twins whose co-twin had AUD, were significantly more likely to exhibit ADHD than offspring of controls. This pattern is consistent with a genetic explanation for the association between maternal AUD and increased offspring risk of ADHD. Adjustment for prenatal smoking, which remained significantly predictive, did not remove the significant genetic association between maternal AUD and offspring ADHD.

Conclusions. While maternal smoking during pregnancy probably contributes to the association between maternal AUD and offspring ADHD risk, the evidence for a significant genetic correlation suggests: (i) pleiotropic genetic effects, with some genes that influence risk of AUD also influencing vulnerability to ADHD; or (ii) ADHD is a direct risk factor for AUD.
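As a schematic of the kind of logistic model described (the data layout, variable names, and coefficients below are invented for the sketch, not the study's data):

```python
# Illustrative logistic regression: offspring ADHD vs. maternal AUD,
# MZ co-twin AUD, and prenatal smoking. All data are simulated.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
maternal_aud = rng.integers(0, 2, n)   # mother has AUD history (invented)
cotwin_aud   = rng.integers(0, 2, n)   # MZ co-twin has AUD history (invented)
prenatal_smk = rng.integers(0, 2, n)   # smoked during pregnancy (invented)

# Simulate outcomes from assumed effect sizes (placeholders, not estimates).
logit = -2.0 + 0.8 * maternal_aud + 0.6 * cotwin_aud + 0.5 * prenatal_smk
adhd = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)

X = sm.add_constant(np.column_stack([maternal_aud, cotwin_aud, prenatal_smk]))
print(sm.Logit(adhd, X).fit(disp=0).params)
```

An elevated co-twin coefficient among offspring of AUD-free mothers is, in this design, the signature of genetic rather than purely environmental transmission.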
Abstract:
Conclusions about the effects of harsh parenting on children have been limited by research designs that cannot control for genetic or shared environmental confounds. The present study used a sample of children of twins and a hierarchical linear modeling statistical approach to analyze the consequences of varying levels of punishment while controlling for many confounding influences. The sample of 887 twin pairs and 2,554 children came from the Australian Twin Registry. Although corporal punishment per se did not have significant associations with negative childhood outcomes, harsher forms of physical punishment did appear to have specific and significant effects. The observed association between harsh physical punishment and negative outcomes in children survived a relatively rigorous test of its causal status, thereby increasing the authors' conviction that harsh physical punishment is a serious risk factor for children.
Abstract:
This paper describes a formal component language, used to support automated component-based program development. The components, referred to as templates, are machine processable, meaning that appropriate tool support, such as retrieval support, can be developed. The templates are highly adaptable, meaning that they can be applied to a wide range of problems. Some of the main features of the language are described, including: higher-order parameters; state variable declarations; specification statements and conditionals; applicability conditions and theories; meta-level place holders; and abstract data structures.
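The template language itself is not shown in the abstract; as a loose illustration of the idea, a template can be thought of as a named, parameterized component carrying an applicability condition alongside its body. The representation below is invented for this sketch, not the paper's actual language:

```python
# Invented stand-in for a "template": parameters, an applicability
# condition, and a body that instantiates to concrete code.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Template:
    name: str
    params: tuple                          # higher-order parameters
    applicable: Callable[[dict], bool]     # applicability condition
    body: Callable[..., str]               # instantiation to concrete code

linear_search = Template(
    name="linear-search",
    params=("seq", "pred"),
    applicable=lambda ctx: "seq_type" in ctx,   # hypothetical condition
    body=lambda seq, pred: f"next((x for x in {seq} if {pred}(x)), None)",
)

ctx = {"seq_type": "list"}
if linear_search.applicable(ctx):
    print(linear_search.body("items", "is_match"))
```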
Abstract:
A method and a corresponding tool are described which assist design recovery and program understanding by recognising instances of design patterns semi-automatically. The approach taken is specifically designed to overcome the scalability problems caused by the many design and implementation variants of design pattern instances. Our approach is based on a new recognition algorithm that works incrementally, rather than trying to analyse a possibly large software system in one pass without any human intervention. The new algorithm exploits domain and context knowledge given by a reverse engineer, together with a special underlying data structure, namely a particular form of annotated abstract syntax graph. A comparative and quantitative evaluation of applying the approach to the Java AWT and JGL libraries is also given.
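The recognition algorithm operates on an annotated abstract syntax graph; as a toy illustration of structural matching over such a graph (the edge labels and classes are invented, not the paper's schema), a Composite-like shape can be found by intersecting inheritance and aggregation relations:

```python
# Toy structural match: a class that both inherits from and aggregates
# the same base is a Composite-pattern candidate. Invented example graph.
edges = [
    ("Leaf", "inherits", "Component"),
    ("Composite", "inherits", "Component"),
    ("Composite", "aggregates", "Component"),
]

def composite_candidates(edges):
    inherits = {(a, b) for a, k, b in edges if k == "inherits"}
    aggregates = {(a, b) for a, k, b in edges if k == "aggregates"}
    return sorted(inherits & aggregates)

print(composite_candidates(edges))   # [('Composite', 'Component')]
```

In the incremental setting described, the reverse engineer confirms or rejects such candidates, and newly asserted facts simply extend the edge set before the query is re-run.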
Abstract:
Email has been used for some years as a low-cost telemedicine medium to provide support for developing countries. However, all operations have been relatively small scale and fairly labour intensive to administer. A scalable, automatic message-routing system was constructed which automates many of the tasks. During a four-month study period in 2002, 485 messages were processed automatically. There were 31 referrals from eight hospitals in three countries. These referrals were handled by 25 volunteer specialists from a panel of 42. Two system operators, located 10 time zones apart, managed the system. The median time from receipt of a new referral to its allocation to a specialist was 1.0 days (interquartile range 0.7-2.4). The median interval between allocation and first reply was 0.7 days (interquartile range 0.3-2.3). Automatic message handling solves many of the problems of manual email telemedicine systems and represents a potentially scalable way of doing low-cost telemedicine in the developing world.
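The paper does not state its allocation rule; as a toy sketch of what automatic routing can look like, a referral can be assigned round-robin over the volunteer panel (an invented policy, for illustration only):

```python
# Invented round-robin allocation of referrals to volunteer specialists.
from collections import deque

panel = deque(["specialist_a", "specialist_b", "specialist_c"])

def allocate(referral):
    specialist = panel[0]
    panel.rotate(-1)              # next referral goes to the next volunteer
    return referral, specialist

for case in ["case-1", "case-2", "case-3", "case-4"]:
    print(allocate(case))
```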
Abstract:
Logistics distribution network design is one of the major decision problems arising in contemporary supply chain management. The decision involves many quantitative and qualitative factors that may be conflicting in nature. This paper applies an integrated multiple criteria decision making approach to design an optimal distribution network. In the approach, the analytic hierarchy process (AHP) is first used to determine the relative importance weightings, or priorities, of alternative warehouses with respect to both deliverer-oriented and customer-oriented criteria. A goal programming (GP) model incorporating system constraints, resource constraints, and the AHP priorities is then formulated to select the best set of warehouses without exceeding the limited available resources. Two commercial packages are used: Expert Choice for determining the AHP priorities of the warehouses, and LINDO for solving the GP model. © 2007 IEEE.
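The AHP step can be made concrete: priorities are the normalized principal eigenvector of a reciprocal pairwise comparison matrix. The judgement matrix below is invented for three candidate warehouses, not taken from the paper (which uses Expert Choice for this step):

```python
# AHP priority weights from a pairwise comparison matrix (invented values).
import numpy as np

A = np.array([[1.0,  3.0, 5.0],
              [1/3., 1.0, 2.0],
              [1/5., 1/2., 1.0]])          # reciprocal judgement matrix

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                               # normalized priority weights
print("warehouse priorities:", w.round(3))

# Saaty's consistency index CI = (lambda_max - n) / (n - 1).
n = A.shape[0]
print("consistency index:", round((eigvals.real[k] - n) / (n - 1), 4))
```

These priorities then enter the GP model as objective coefficients, with system and resource constraints limiting which warehouses can be opened.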
Abstract:
Product design and sourcing decisions are among the most difficult and important of all decisions facing multinational manufacturing companies, yet the associated decision support and evaluation systems tend to be myopic in nature. Design for manufacture and assembly techniques, for example, generally focus on manufacturing capability and ignore capacity, although both should be considered. Similarly, most modelling and evaluation tools available to examine the performance of various solution and improvement techniques have a narrower scope than desired. A unique collaboration between researchers in the USA and the UK, funded by the US National Science Foundation, currently addresses these problems. This paper describes a technique known as Design For the Existing Environment (DFEE) and a holistic evaluation system based on enterprise simulation, which was used to demonstrate the business benefits of DFEE in a simple product development and manufacturing case study. A project that will extend these techniques to evaluate global product sourcing strategies is described, along with the practical difficulties of building an enterprise simulation on the scale and at the level of detail required.
Abstract:
A new surface analysis technique has been developed which has a number of benefits compared to conventional Low Energy Ion Scattering Spectrometry (LEISS). A major potential advantage, arising from the absence of charge exchange complications, is the possibility of quantification. The instrumentation that has been developed also offers the possibility of unique studies concerning the interaction between low energy ions and atoms and solid surfaces. From these studies it may also be possible, in principle, to generate sensitivity factors to quantify LEISS data. The instrumentation, referred to as a Time-of-Flight Fast Atom Scattering Spectrometer (ToF-FASS), was developed to investigate these conjectures in practice. The development involved a number of modifications to an existing instrument; it allowed samples to be bombarded with a monoenergetic pulsed beam of either atoms or ions, and provided the capability to analyse the spectra of scattered atoms and ions separately. Further, a system was designed and constructed to allow the incident, exit and azimuthal angles of the particle beam to be varied independently. The key development was that of a pulsed, mass-filtered atom source, which was developed by a cyclic process of design, modelling and experimentation. Although it was possible to demonstrate the unique capabilities of the instrument, problems relating to surface contamination prevented the measurement of the neutralisation probabilities. However, these problems appear to be technical rather than scientific in nature, and could readily be resolved given the appropriate resources. Experimental spectra obtained from a number of samples demonstrate some fundamental differences between the scattered ion and neutral spectra. For practical, non-ordered surfaces the ToF spectra are more complex than their LEISS counterparts. This is particularly true for helium scattering, where it appears, in the absence of detailed computer simulation, that quantitative analysis is limited to ordered surfaces. Despite this limitation, the ToF-FASS instrument opens the way for quantitative analysis of the 'true' surface region for a wider range of surface materials.
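For context, the basic relation that any time-of-flight spectrometer exploits (standard kinematics, not specific to this instrument) recovers a scattered particle's kinetic energy from its flight time t over a path of length L:

```latex
E = \tfrac{1}{2} m v^{2} = \tfrac{1}{2} m \left(\frac{L}{t}\right)^{2},
```

which is what allows the scattered-atom spectra to be compared energy-for-energy with conventional LEISS data.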