896 results for Simplified design method


Relevance: 80.00%

Abstract:

The thesis describes an investigation into methods for the design of flexible high-speed product processing machinery, consisting of independent electromechanically actuated machine functions which operate under software coordination and control. An analysis is made of the elements of traditionally designed cam-actuated, mechanically coupled machinery, so that the operational functions and principal performance limitations of the separate machine elements may be identified. These are then used to define the requirements for independently actuated machinery, with a discussion of how this type of design approach is better suited to modern manufacturing trends. A distributed machine controller topology is developed which is a hybrid of hierarchical and pipeline control. An analysis, aided by dynamic simulation modelling, confirms the suitability of the controller for flexible machinery control. The simulations include complex models of multiple independent-actuator systems, which enable product flow and failure analyses to be performed. An analysis is made of high-performance brushless d.c. servomotors, and their suitability for actuating machine motions is assessed. Procedures are developed for the selection of brushless servomotors for intermittent machine motions. An experimental rig is described which has enabled the actuation and control methods developed to be implemented. With reference to this, the suitability of the machine design method is evaluated, and the developments necessary for operational independently actuated machinery to be attained are discussed.
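The abstract does not reproduce the selection procedure itself, but a typical core of servomotor selection for intermittent motions is a comparison of the duty cycle's RMS and peak torque demands against a candidate motor's continuous and peak ratings. The following is a minimal sketch of that check; the duty-cycle segments and motor ratings are hypothetical illustration values, not figures from the thesis.

```python
# Illustrative peak/RMS torque check for sizing a brushless servomotor that
# drives an intermittent (index-and-dwell) machine motion. Duty-cycle segments
# and motor ratings are hypothetical placeholders, not values from the thesis.
import math

# One repeating duty cycle: (torque in N*m, duration in s)
# accelerate, constant speed, decelerate, dwell
duty_cycle = [(6.0, 0.05), (1.5, 0.10), (-5.0, 0.05), (0.2, 0.30)]

def rms_torque(segments):
    """Root-mean-square torque over one repeating duty cycle."""
    weighted = sum(t * t * dt for t, dt in segments)
    period = sum(dt for _, dt in segments)
    return math.sqrt(weighted / period)

def peak_torque(segments):
    return max(abs(t) for t, _ in segments)

# Hypothetical catalogue ratings of a candidate motor
rated_continuous_torque = 3.0  # N*m, thermally limited
rated_peak_torque = 8.0        # N*m, limited by drive current / demagnetisation

t_rms = rms_torque(duty_cycle)
t_peak = peak_torque(duty_cycle)
print(f"RMS torque : {t_rms:.2f} N*m (continuous rating {rated_continuous_torque} N*m)")
print(f"Peak torque: {t_peak:.2f} N*m (peak rating {rated_peak_torque} N*m)")
ok = t_rms <= rated_continuous_torque and t_peak <= rated_peak_torque
print("Candidate motor acceptable" if ok else "Candidate motor rejected")
```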

Relevance: 80.00%

Abstract:

The Teallach project has adapted model-based user-interface development techniques to the systematic creation of user-interfaces for object-oriented database applications. Model-based approaches aim to provide designers with a more principled approach to user-interface development using a variety of underlying models, and tools which manipulate these models. Here we present the results of the Teallach project, describing the tools developed and the flexible design method supported. Distinctive features of the Teallach system include provision of database-specific constructs, comprehensive facilities for relating the different models, and support for a flexible design method in which models can be constructed and related by designers in different orders and in different ways, to suit their particular design rationales. The system then creates the desired user-interface as an independent, fully functional Java application, with automatically generated help facilities.

Relevance: 80.00%

Abstract:

This thesis develops and tests an evaluative approach to assessing the acceptance of xeriscape as an alternative planting design method among landscape architects in South Florida. South Florida was chosen for its large number of practicing landscape architects and its increasing water shortages due to the large influx of people into the area. A survey was developed and mailed to 95 subjects in South Florida who are landscape architects or related professionals. The questions, most of which called for yes-or-no responses, were designed to assess the acceptance, use, knowledge and practice of xeriscape in this region. The results of the survey showed a lack of in-depth knowledge of the seven principles of xeriscape, that few true xeriscapes instituted by the water departments exist in the region, and that the government sector is the main advocate of the xeriscape concept.

Relevance: 80.00%

Abstract:

With the intention of studying and developing a design process based on a specific methodology, the object of this work is to present the design of a gated condominium community in Natal based on the application of shape grammar principles. The shape grammar is a design method developed in the 1970s by George Stiny and James Gips. It is used for the analysis of a project as well as for its synthesis, with the goal of creating a "formal vocabulary" through mathematical and/or geometrical operations. Here, the methodology was used in the synthesis stage of the design process, through the relationship between formal subtractions and the houses' architectural programs. As a result, five dwelling configurations were proposed, each differing from the others in shape and architectural program, distributed in three twin groups, which are repeated to reach a final total of nine architectural volumes. In addition to studies of the condominium's ventilation and simulations of the buildings' shading, studies of spatial flexibility and acoustic performance were also performed. The mapping of the design process, one of the specific objectives of the dissertation, comprised not only the record of formal constraints (the preparation and application of rules), but also physical, environmental, legal and sustainability aspects relating, on the one hand, to the optimization of shading and passive ventilation for hot and humid climates and, on the other, to the modulation and rationalization of the construction.
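The dissertation's actual grammar operates on geometry; as a rough illustration of the rule-application mechanism only, the sketch below applies two invented "subtraction" rules to a toy grid representation of a dwelling volume. The grid, labels and rules are assumptions made for demonstration and are not taken from the work.

```python
# Toy illustration of subtractive rule application in the spirit of a shape
# grammar. Real shape grammars match and replace geometric sub-shapes; here a
# set of occupied grid cells stands in for the dwelling volume so the
# generate-by-rules loop is visible. Rules and labels are invented.

# A dwelling volume as a set of occupied unit cells (x, y) on a 3 x 3 plan grid.
initial_shape = {(x, y) for x in range(3) for y in range(3)}

def rule_subtract_corner(shape):
    """Rule 1: remove the north-east corner cell if it is present."""
    corner = (2, 2)
    return shape - {corner} if corner in shape else None  # None = not applicable

def rule_subtract_slot(shape):
    """Rule 2: carve a one-cell ventilation slot on the east edge."""
    slot = (2, 1)
    return shape - {slot} if slot in shape else None

def derive(shape, rules):
    """Apply each rule once, in order, whenever its left-hand side matches."""
    steps = [shape]
    for rule in rules:
        result = rule(shape)
        if result is not None:
            shape = result
            steps.append(shape)
    return steps

for i, s in enumerate(derive(initial_shape, [rule_subtract_corner, rule_subtract_slot])):
    print(f"step {i}: {len(s)} cells occupied")
```

Each applicable rule produces a new variant of the volume, which is how a small rule set can generate a family of related dwelling configurations.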

Relevance: 80.00%

Abstract:

This paper deals with the conceptual design of decoupled, compact, and monolithic XYZ compliant parallel manipulators (CPMs): CUBEs. Position spaces of compliant P (P: prismatic) joints are first discussed, which are represented by circles about the translational directions. A design method for monolithic XYZ CPMs is then proposed in terms of both the kinematic substitution method and the position spaces. Three types of monolithic XYZ CPMs are finally designed using the proposed method with the help of three classes of kinematically decoupled 3-DOF (degree-of-freedom) translational parallel mechanisms (TPMs). These monolithic XYZ CPMs include a 3-PPP XYZ CPM composed of identical parallelogram modules (a previously reported design), a novel 3-PPPR (R: revolute) XYZ CPM composed of identical compliant four-beam modules, and a novel 3-PPPRR XYZ CPM. The latter two monolithic designs also have extended lives. It is shown that the proposed design method can be used to design other decoupled and compact XYZ CPMs by using the concept of position spaces, and that the resulting XYZ CPM is most compact when the fixed ends of its three actuated compliant P joints overlap.

Relevance: 80.00%

Abstract:

This is the first time a multidisciplinary team has employed an iterative co-design method to determine the ergonomic layout of an emergency ambulance treatment space. This process allowed the research team to understand how treatment protocols were performed and to develop analytical tools for reaching an optimum configuration, working towards ambulance design standardisation. Fusari conducted participatory observations during 12-hour shifts with front-line ambulance clinicians, hospital staff and patients to understand the details of their working environments while responding to urgent and emergency calls. A simple yet accurate 1:1 mock-up of the existing ambulance was built for detailed analysis of these procedures through simulations. Paramedics were called in to participate in interviews and role-playing inside the model to recreate tasks, show how they are performed and what equipment is used, and to identify the limitations of the current ambulance. Link Analysis distilled five modes of use. In parallel, an exhaustive audit of all equipment and consumables used in ambulances was performed (logging and photography) to define space use. These activities yielded 12 layout options, which were refined, modelled in CAD and presented back to paramedics. The preferred options and features were then developed into a full-size test rig and appearance model. Two key studies informed the process. First, the 2005 National Patient Safety Agency-funded study "Future Ambulances" outlined 9 design challenges for future standardisation of emergency vehicles and equipment. Second, the 2007 EPSRC-funded "Smart Pods" project investigated a new system of mobile urgent and emergency medicine to treat patients in the community. A full-size mobile demonstrator unit featuring the evidence-based ergonomic layout was built for clinical tests through simulated emergency scenarios. Results from clinical trials clearly show that the new layout improves infection control, speeds up treatment, and makes it easier for ambulance crews to follow correct clinical protocols.

Relevance: 80.00%

Abstract:

This research aims to make a contribution to design thinking at a global cultural scale, and specifically to how design methods feature in the homogenising and heterogenising forces of globalisation via creative destruction. Since Schumpeter's description of economic innovation destroying the old and creating the new, a number of other interpretations of creative destruction have developed, including those driving cultural evolution. However, a design model showing the impact of different types of design method on cultural evolution can build a more systemic understanding of the medium- to longer-term impact of new designs that homogenise, or increase the differences between, various cultures. This research explores the theoretical terrain between creative destruction, design thinking and cybernetics in the context of exchanging cultural influences for collaborative creativity, and concludes with an experiment that proposes a feedback loop between ubiquitising and differentiating design methods mediating cultural variety in creative ecosystems.

Relevance: 80.00%

Abstract:

Purpose – The purpose of this study is to investigate the effect of non-audit services on auditor independence, and the importance of non-audit services as a source of income for audit firms in the United Kingdom. Design/method/approach – This study examines 11 companies in the food retail and wholesale industry during 2007-2014. Five indicators have been used: (1) appointed auditor and provision of non-audit services to audit clients; (2) auditor tenure; (3) non-audit services in relation to total services; (4) tax services in relation to non-audit services; (5) the Big Four's revenue. Information has been collected using a quantitative approach, from annual and transparency reports. The threshold used to measure possible independence threats (self-review, self-interest and familiarity threats) has been set at 18.5%. Findings – This study concludes that the joint provision of audit and non-audit services possibly impairs auditor independence, and that non-audit services are an important source of income for audit firms. The findings showed that in 99% of cases, companies purchased non-audit services from their statutory auditor. Non-audit services in relation to total services surpassed the threshold in 78% of all financial years. Likewise, tax services in relation to non-audit services exceeded the threshold in 65% of all financial years. The Big Four's revenue from non-audit services to audit clients in relation to total revenue is almost always below the threshold. However, in all financial years except one, total revenue from non-audit services surpassed revenue from audit services by far. Contribution – The study contributes to the ongoing discussion about the effect of non-audit services on auditor independence. Originality/value – This study is one of the few that provide detailed information about non-audit services in the food retail and wholesale industry. It highlights social and ethical issues with regard to agency relationships.
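As a simple illustration of how indicators (3) and (4) can be screened against the 18.5% threshold, the sketch below computes the two ratios for one hypothetical financial year; the fee figures are invented and the indicator definitions are paraphrased from the abstract.

```python
# Illustrative screening of two of the study's ratio indicators against the
# 18.5% threshold. Fee figures are invented for demonstration; the study's
# data come from annual and transparency reports.

THRESHOLD = 0.185

def ratio_flags(audit_fees, non_audit_fees, tax_fees):
    total_services = audit_fees + non_audit_fees
    nas_to_total = non_audit_fees / total_services                      # indicator (3)
    tax_to_nas = tax_fees / non_audit_fees if non_audit_fees else 0.0   # indicator (4)
    return {
        "non-audit / total services": (nas_to_total, nas_to_total > THRESHOLD),
        "tax / non-audit services": (tax_to_nas, tax_to_nas > THRESHOLD),
    }

# Hypothetical fees for one company-year (amounts in GBP thousands)
for name, (value, exceeds) in ratio_flags(audit_fees=900,
                                          non_audit_fees=400,
                                          tax_fees=120).items():
    print(f"{name}: {value:.1%} -> {'exceeds' if exceeds else 'below'} the 18.5% threshold")
```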

Relevance: 80.00%

Abstract:

The mixing performance of three passive milli-scale reactors with different geometries was investigated at different Reynolds numbers. The effects of design and operating characteristics such as mixing channel shape and volume flow rate were investigated. The main objective of this work was to demonstrate a process design method that uses Computational Fluid Dynamics (CFD) for modelling and Additive Manufacturing (AM) technology for fabrication. The reactors were designed and simulated using SolidWorks and Fluent 15.0 software, respectively. Manufacturing of the devices was performed with an EOS M-series AM system. Step response experiments with distilled Millipore water and sodium hydroxide solution provided time-dependent concentration profiles. Villermaux-Dushman reaction experiments were also conducted for additional verification of the CFD results and for evaluating the mixing efficiency of the different geometries. The time-dependent concentration data and the reaction evaluation showed that the performance of the AM-manufactured reactors matched the CFD results reasonably well. The proposed design method allows the implementation of new and innovative solutions, especially in the process design phase, for industrial-scale reactor technologies. Rapid implementation is an additional advantage, owing to the virtual flow design and to the fast manufacturing, which uses the same geometric file formats.
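As an illustration of how a time-dependent concentration profile from a step-response experiment can be reduced to a single mixing figure, the sketch below estimates a settling time on a synthetic profile. The 95% band is a common convention assumed here, not necessarily the criterion used in the study.

```python
# Illustrative estimate of a mixing (settling) time from a step-response
# concentration profile. The profile is synthetic; the 5% band is an assumed
# convention, not taken from the paper.
import numpy as np

t = np.linspace(0.0, 10.0, 201)      # time, s
c = 1.0 - np.exp(-t / 1.8)           # synthetic normalised tracer concentration

def settling_time(t, c, final_value=1.0, band=0.05):
    """First time after which the response stays within +/- band of its final value."""
    outside = np.abs(c - final_value) > band * abs(final_value)
    idx = np.where(outside)[0]
    if idx.size == 0:
        return t[0]
    return t[min(idx[-1] + 1, len(t) - 1)]

print(f"Estimated 95% mixing/settling time: {settling_time(t, c):.2f} s")
```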

Relevance: 80.00%

Abstract:

Timber structures must inevitably include connections capable of adequately transferring loads between members to ensure the integrity of the structure. Connections are a critical part of timber structures since, in most cases, they are what dissipate energy and produce a ductile failure mode under seismic loads. This failure mode is preferable because it allows large deformations before collapse, enabling occupants to evacuate safely during an earthquake. Small-diameter fasteners such as nails, rivets and screws are frequently used in timber construction and are assumed to lead to ductile failure, even though the current design method does not allow designers to predict the failure mode precisely. Moreover, rivets have a very limited range of application because the design method currently in use applies only to very specific configurations, species and types of wood products. The objective of this project is to evaluate a new design method proposed by researchers in New Zealand, Zarnani and Quenneville, for riveted connections but adaptable to timber connections with small-diameter fasteners. It allows the designer to determine precisely the failure mode of connections of different configurations with different wood products. More than 70 tests on riveted and nailed connections resisting loads ranging from 40 kN to 800 kN were carried out as part of this research project in order to validate the use of this method with the Canadian glued-laminated timber product Nordic Lam and to compare it with the method currently used in Canada. Ductile, brittle and mixed failure modes were predicted, with emphasis on the brittle mode since it is the most variable and the least studied. The Nordic Lam glued-laminated timber connections were nailed or riveted in various configurations ranging from 18 to 128 nails or rivets. The results show good predictions of the strength and failure modes of the nailed and riveted connections. For some riveted connection configurations, the predictions of the new method are higher than those of the current method. The nailed connections exhibited fastener shank failure at the shear plane in all tests performed, which does not correspond to a ductile or brittle mode predicted by the design method.
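Without reproducing the Zarnani-Quenneville equations, the general idea of mode prediction can be sketched as taking the governing capacity as the minimum of a ductile (fastener-yielding) term and a brittle (wood-failure) term, with near-equal terms indicating a mixed mode. The capacities, the mixed-mode band and the example values below are invented for illustration only.

```python
# Schematic illustration of failure-mode prediction for a small-diameter-
# fastener timber connection: the governing capacity is the minimum of a
# ductile and a brittle capacity, and the governing term indicates the
# expected mode. These are not the Zarnani-Quenneville equations; the numbers
# and the "mixed-mode" band are invented placeholders.

def predict_mode(ductile_capacity_kN, brittle_capacity_kN, mixed_band=0.10):
    governing = min(ductile_capacity_kN, brittle_capacity_kN)
    relative_gap = abs(ductile_capacity_kN - brittle_capacity_kN) / governing
    if relative_gap <= mixed_band:
        mode = "mixed"
    elif ductile_capacity_kN < brittle_capacity_kN:
        mode = "ductile (fastener yielding)"
    else:
        mode = "brittle (wood failure)"
    return governing, mode

# Hypothetical riveted connection in glued-laminated timber
capacity, mode = predict_mode(ductile_capacity_kN=310.0, brittle_capacity_kN=275.0)
print(f"Predicted connection capacity: {capacity:.0f} kN, governing mode: {mode}")
```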

Relevance: 80.00%

Abstract:

Given that unpaved roads are likely to carry heavy loads, a rigorous design method for these pavements, based on mechanistic-empirical principles and on the mechanical behaviour of the subgrade soils, is desirable. Mechanistic design combined with damage laws allows unpaved pavement structures to be optimized and construction and maintenance costs to be reduced. The goal of this project is therefore to develop a mechanistic-empirical design method adapted to unpaved roads. The first step was to develop a calculation code to determine the stresses and strains in the pavement. Empirical damage laws for unpaved roads were then developed. Finally, the calculation methods were used to produce design charts. The calculation code was developed by modelling the pavement as a multi-layer elastic system, using Odemark's transformation and Boussinesq's equations to compute the strains under the load. Empirical transfer functions adapted to unpaved roads were also developed, in two steps: first, establishing rutting threshold values corresponding to levels deemed reasonable for the functional and structural condition of the pavement; second, developing allowable strain criteria by relating the theoretical strains computed with the code to the damage observed on several in-service roads. The tests were carried out on typical pavements reconstructed in the laboratory and subjected to repeated loading with a load simulator. The pavements were instrumented to measure the strain at the top of the subgrade, and damage rates were measured during the tests.
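A minimal sketch of the Odemark-Boussinesq calculation mentioned above is given below: the granular layers are converted to an equivalent thickness of subgrade material (Odemark), and Boussinesq's solution for a uniform circular load then gives the vertical stress and strain at the top of the subgrade. The layer moduli, load and the 0.9 correction factor are typical textbook assumptions, not the calibrated values of the thesis.

```python
# Sketch of an Odemark equivalent-thickness transformation followed by a
# Boussinesq stress/strain evaluation under the centre of a uniform circular
# load. Input values are illustrative assumptions only.
import math

def odemark_equivalent_thickness(layers, E_subgrade, f=0.9):
    """layers: list of (thickness_m, modulus_MPa), transformed to subgrade modulus."""
    return sum(f * h * (E / E_subgrade) ** (1.0 / 3.0) for h, E in layers)

def boussinesq_vertical_strain(p_kPa, a_m, z_m, E_MPa, nu=0.35):
    """Vertical strain under the centre of a uniform circular load on a half-space."""
    R = math.hypot(a_m, z_m)
    sigma_z = p_kPa * (1.0 - z_m ** 3 / R ** 3)
    sigma_r = p_kPa / 2.0 * (1.0 + 2.0 * nu - 2.0 * (1.0 + nu) * z_m / R + z_m ** 3 / R ** 3)
    return (sigma_z - 2.0 * nu * sigma_r) / (E_MPa * 1000.0)  # kPa / kPa -> strain

# Hypothetical structure: 0.30 m granular base (350 MPa) over a 60 MPa subgrade,
# loaded at 700 kPa on a 0.15 m radius circular area.
h_eq = odemark_equivalent_thickness([(0.30, 350.0)], E_subgrade=60.0)
eps_z = boussinesq_vertical_strain(p_kPa=700.0, a_m=0.15, z_m=h_eq, E_MPa=60.0)
print(f"Equivalent depth: {h_eq:.3f} m, "
      f"vertical strain at subgrade top: {eps_z * 1e6:.0f} microstrain")
```

In a mechanistic-empirical procedure, a strain computed this way is compared against an allowable-strain transfer function to obtain the permissible number of load repetitions.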

Relevance: 80.00%

Abstract:

Conventional rockmass characterization and analysis methods for geotechnical assessment in mining, civil tunnelling, and other excavations consider only the intact rock properties and the discrete fractures that are present and form blocks within rockmasses. Field logging and classification protocols are based on historically useful but highly simplified design techniques, including direct empirical design and empirical strength assessment for simplified ground reaction and support analysis. As modern underground excavations go deeper and enter more high-stress environments with complex excavation geometries and associated stress paths, healed structures within initially intact rock blocks, such as sedimentary nodule boundaries and hydrothermal veins, veinlets and stockwork (termed intrablock structure), are having an increasing influence on rockmass behaviour and should be included in modern geotechnical design. Because of the reliance on geotechnical classification methods which predate computer-aided analysis, these complexities are ignored in conventional design. Given the comparatively complex, sophisticated and powerful numerical simulation and analysis techniques now practically available to the geotechnical engineer, this research is driven by the need for enhanced characterization of intrablock structure for application to numerical methods. Intrablock structure governs stress-driven behaviour at depth and gravity-driven disintegration for large shallow spans, and controls ultimate fragmentation. This research addresses the characterization of intrablock structure and the understanding of its behaviour at laboratory testing and excavation scales, and presents new methodologies and tools to incorporate intrablock structure into geotechnical design practice. A new field characterization tool, the Composite Geological Strength Index, is used for outcrop or excavation face evaluation and provides direct input to continuum numerical models with implicit rockmass structure. A brittle overbreak estimation tool for complex rockmasses is developed using field observations. New methods to evaluate the geometrical and mechanical properties of intrablock structure are developed. Finally, laboratory direct shear testing protocols for interblock structure are critically evaluated and extended to intrablock structure for the purpose of determining input parameters for numerical models with explicit structure.

Relevance: 80.00%

Abstract:

Power efficiency is one of the most important constraints in the design of embedded systems, since such systems are generally driven by batteries with a limited energy budget or a restricted power supply. In every embedded system, there are one or more processor cores that run the software and interact with the other hardware components of the system. The power consumption of the processor core(s) has an important impact on the total power dissipated in the system. Hence, processor power optimization is crucial for satisfying the power consumption constraints and developing low-power embedded systems. A key aspect of research in processor power optimization and management is power estimation. Having a fast and accurate method for processor power estimation at design time helps the designer to explore a large space of design possibilities and to make optimal choices for developing a power-efficient processor. Likewise, understanding the processor power dissipation behaviour of a specific software application is the key to choosing appropriate algorithms in order to write power-efficient software. Simulation-based methods for measuring processor power achieve very high accuracy, but are available only late in the design process and are often quite slow. Therefore, the need has arisen for faster, higher-level power prediction methods that allow the system designer to explore many alternatives for developing power-efficient hardware and software. The aim of this thesis is to present fast, high-level power models for the prediction of processor power consumption. Power predictability in this work is achieved in two ways: first, using a design method to develop power-predictable circuits; second, analysing the power of the functions in the code which repeat during execution, then building the power model based on the average number of repetitions. In the first case, a design method called Asynchronous Charge Sharing Logic (ACSL) is used to implement the Arithmetic Logic Unit (ALU) of the 8051 microcontroller. The ACSL circuits are power predictable because their power consumption is independent of the input data. Based on this property, a fast prediction method is presented to estimate the power of the ALU by analysing the software program and extracting the number of ALU-related instructions. This method achieves less than 1% error in power estimation and more than 100 times speedup in comparison to conventional simulation-based methods. In the second case, an average-case processor energy model is developed for the insertion sort algorithm based on the number of comparisons that take place in the execution of the algorithm. The average number of comparisons is calculated using a high-level methodology called MOdular Quantitative Analysis (MOQA). The parameters of the energy model are measured for the LEON3 processor core, but the model is general and can be used for any processor. The model has been validated through power measurement experiments, and offers high accuracy and orders-of-magnitude speedup over the simulation-based method.
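As an illustration of the second, average-case approach, the sketch below estimates insertion-sort energy as the expected number of comparisons multiplied by an energy cost per comparison. The comparison-count expression is the standard average-case approximation; the per-comparison energy figure is a placeholder, not a measured LEON3 value.

```python
# Illustrative average-case energy estimate for insertion sort:
# energy ~ (expected number of comparisons) x (energy per comparison).
# The energy cost per comparison is a made-up placeholder, not a measured
# LEON3 figure, and the comparison count is a standard approximation.

def avg_comparisons_insertion_sort(n: int) -> float:
    """Approximate expected comparisons on a random permutation of n keys:
    about n(n-1)/4 from inversions plus roughly one extra comparison per insertion."""
    return n * (n - 1) / 4.0 + (n - 1)

def estimate_energy_joules(n: int, energy_per_comparison_nj: float = 2.5) -> float:
    return avg_comparisons_insertion_sort(n) * energy_per_comparison_nj * 1e-9

for n in (16, 64, 256):
    comparisons = avg_comparisons_insertion_sort(n)
    print(f"n = {n:4d}: ~{comparisons:9.0f} comparisons, "
          f"~{estimate_energy_joules(n) * 1e6:.3f} microjoules (average case)")
```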

Relevance: 80.00%

Abstract:

Although concrete is a relatively green material, the astronomical volume of concrete produced worldwide annually places the concrete construction sector among the notable contributors to global warming. The most polluting constituent of concrete is cement, because its production process releases, on average, 0.83 kg of CO₂ per kg of cement. Self-consolidating concrete (SCC), a type of concrete that can fill the formwork without external vibration, is a technology that can offer a solution to the sustainability issues of the concrete industry. However, all of the workability requirements of SCC originate from a higher powder content (compared to conventional concrete), which can increase both the cost of construction and the environmental impact of SCC for some applications. Ecological SCC, Eco-SCC, is a recent development combining the advantages of SCC with a significantly lower powder content. The maximum powder content of this concrete, intended for building and commercial construction, is limited to 315 kg/m³. Nevertheless, designing Eco-SCC can be challenging, since a delicate balance between the different ingredients of this concrete is required to secure a satisfactory mixture. In this Ph.D. program, the principal objective is to develop a systematic design method to produce Eco-SCC. Since the particle lattice effect (PLE) is a key parameter for designing stable Eco-SCC mixtures and is not well understood, this phenomenon is studied in the first phase of this research. The focus in this phase is on the effect of the particle-size distribution (PSD) on the PLE and the stability of model mixtures as well as of SCC. In the second phase, the design protocol is developed, and the properties of the obtained Eco-SCC mixtures in both the fresh and hardened states are evaluated. Since the assessment of robustness is crucial for the successful production of concrete on a large scale, the robustness of one of the best-performing mixtures of Phase II is examined in the final phase of this work. It was found that increasing the volume fraction of a stable size class results in an increase in the stability of that class, which in turn contributes to a higher PLE of the granular skeleton and better stability of the system. It was shown that a continuous PSD in which the volume fraction of each size class is larger than that of the consecutive coarser class can increase the PLE. Using such a PSD was shown to allow a substantial increase in the fluidity of the SCC mixture without compromising segregation resistance. An index to predict the segregation potential of a suspension of particles in a yield-stress fluid was proposed. In the second phase of the dissertation, a five-step design method for Eco-SCC was established. The design protocol started with the determination of the powder and water contents, followed by the optimization of the sand and coarse aggregate volume fractions according to an ideal PSD model (Funk and Dinger). The powder composition was optimized in the third step to minimize the water demand while securing adequate performance in the hardened state. The superplasticizer (SP) content of the mixtures was determined in the next step. The last step dealt with the assessment of the global warming potential of the formulated Eco-SCC mixtures. The optimized Eco-SCC mixtures met all the requirements of self-consolidation in the fresh state. The 28-day compressive strength of such mixtures complied with the target range of 25 to 35 MPa. In addition, the mixtures showed sufficient performance in terms of drying shrinkage, electrical resistivity, and frost durability for the intended applications. The eco-performance of the developed mixtures was satisfactory as well. It was demonstrated in the last phase that the robustness of Eco-SCC is generally good with regard to variations in water content and alterations of coarse aggregate characteristics. Special attention must be paid to the dosage of SP during batching.
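As an illustration of the ideal PSD target used in the aggregate-optimization step, the sketch below evaluates the Funk-Dinger (modified Andreassen) model, CPFT(d) = 100 · (d^q − d_min^q)/(d_max^q − d_min^q). The distribution modulus q and the size limits are illustrative assumptions, not the values adopted in the thesis.

```python
# Minimal sketch of the Funk-Dinger (modified Andreassen) ideal particle-size-
# distribution target. The distribution modulus q and the size limits below
# are illustrative assumptions, not the values used in the thesis.

def funk_dinger_cpft(d, d_min, d_max, q=0.27):
    """Cumulative percent finer than size d (d, d_min, d_max in the same unit)."""
    return 100.0 * (d ** q - d_min ** q) / (d_max ** q - d_min ** q)

d_min, d_max = 0.0001, 14.0  # mm: roughly cement fines up to a 14 mm coarse aggregate
for d in (0.125, 0.5, 2.0, 5.0, 10.0, 14.0):
    print(f"d = {d:6.3f} mm -> target CPFT = {funk_dinger_cpft(d, d_min, d_max):5.1f} %")
```

The measured combined grading of the candidate sand and coarse aggregate would then be adjusted so that it tracks this target curve as closely as possible.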