885 results for Design (process simulation)
Abstract:
This thesis is presented in two parts. The first part is an attempt to set out a framework of factors influencing the problem solving stage of the architectural design process. The discussion covers the nature of architectural problems and some of the main ways in which they differ from other types of design problems. The structure of constraints that both the problem and the architect impose upon solutions is seen as being of great importance in defining the type of design problem solving situation. The problem solver, or architect, is then studied. The literature of the psychology of thinking is surveyed for relevant work. All of the traditional schools of psychology are found wanting in terms of providing a comprehensive theory of thinking. Various types of thinking are examined, particularly structural and productive thought, for their relevance to design problem solving. Finally, some reported common traits of architects are briefly reviewed. The second section is a report of two main experiments which model some aspects of architectural design problem solving. The first experiment examines the way in which architects come to understand the structure of their problems. The performances of first and final year architectural students are compared with those of postgraduate science students and sixth form pupils. On the whole these groups show significantly different results and also different cognitive strategies. The second experiment poses design problems which involve both subjective and objective criteria, and examines the way in which final year architectural students are able to relate the different types of constraint produced. In the final section the significance of all the results is suggested. Some educational and methodological implications are discussed and some further experiments and investigations are proposed.
Abstract:
Design and Designing provides a broad and critical understanding of what is essentially a practical subject. Designing today is less a craft and more a part of the knowledge economy. It's all about knowing how to acquire knowledge and how to apply it creatively. Design and Designing covers the design process, modelling and drawing, working with clients, production and consumption, sustainability, professional practice and design futures. Chapters are written by expert teachers and practitioners from around the globe, each presenting an accessible and engaging overview of their field of design. Every chapter is highly illustrated with a combination of images and information boxes, which extend or highlight key material. Each section concludes with a design project, a hands-on activity for the reader. Design and Designing covers the full spectrum of design types, from graphic communication to product design, from fashion to games design, setting every type in its aesthetic, ethical and social contexts. With this essential book, readers will learn from today's best practice and best thinking in design, they will develop a critical sense, and become the designers of tomorrow.
Abstract:
Biodiesel is fast becoming one of the key transport fuels as the world endeavours to reduce its carbon footprint and find viable alternatives to oil-derived fuels. Research in the field is currently focusing on more efficient ways to produce biodiesel, with the most promising avenue of research looking into the use of heterogeneous catalysis. This article presents a framework for kinetic reaction and diffusive transport modelling of the heterogeneously catalysed transesterification of triglycerides into fatty acid methyl esters (FAMEs), demonstrated by a model system of tributyrin transesterification in the presence of MgO catalysts. In particular, the paper makes recommendations on multicomponent diffusion calculations, such as estimating diffusion coefficients and molar fluxes from infinite-dilution diffusion coefficients using the Wilke and Chang correlation; on intrinsic reaction kinetic studies using the Eley-Rideal kinetic mechanism with methanol adsorption as the rate-determining step; and on multiscale reaction-diffusion process simulation between the catalytic porous and bulk reactor scales. © 2013 The Royal Society of Chemistry.
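As a rough illustration of the infinite-dilution estimate mentioned in this abstract, the sketch below implements the standard Wilke and Chang correlation. The numerical inputs (temperature, solvent viscosity, solute molar volume) are illustrative assumptions, not values taken from the paper; the solute molar volume would typically be estimated by a group-contribution method such as Le Bas.

```python
import math

def wilke_chang(T, mu_B, M_B, V_A, phi_B=1.0):
    """Infinite-dilution diffusion coefficient D_AB (cm^2/s) of solute A
    in solvent B via the Wilke-Chang correlation.

    T     : temperature (K)
    mu_B  : solvent viscosity (cP)
    M_B   : solvent molar mass (g/mol)
    V_A   : solute molar volume at its normal boiling point (cm^3/mol)
    phi_B : solvent association factor (1.9 for methanol)
    """
    return 7.4e-8 * math.sqrt(phi_B * M_B) * T / (mu_B * V_A ** 0.6)

# Illustrative values only (not from the paper): tributyrin in methanol
# at 333 K, with V_A assumed from a Le Bas-type estimate.
D = wilke_chang(T=333.0, mu_B=0.33, M_B=32.04, V_A=345.0, phi_B=1.9)
print(f"D_AB ~ {D:.2e} cm^2/s")   # on the order of 1e-5 cm^2/s
```

Such an estimate would then feed the molar-flux and reaction-diffusion calculations the abstract describes; the correlation itself is only reliable for dilute solutes.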
Abstract:
A negative input-resistance compensator is designed to stabilize a power electronic brushless dc motor drive with constant-power-load characteristics. The strategy is to feed a portion of the changes in the dc-link voltage into the current control loop to modify the system input impedance in the mid-frequency range and thereby damp the input filter. The design process of the compensator and the selection of parameters are described. The impact of the compensator on motor-controller performance is examined, and finally, the effectiveness of the controller is verified by simulation and experimental testing.
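A minimal discrete-time sketch of the strategy just described, assuming a washout (high-pass) filter implementation that passes only changes in the dc-link voltage into the current loop; the gain, corner frequency, and sample time below are illustrative assumptions, not values from the paper.

```python
import math

class DCLinkCompensator:
    """Washout-filter feedforward: passes mid-frequency dc-link voltage
    variations into the current loop while blocking the steady-state value."""

    def __init__(self, k=0.05, f_c=200.0, Ts=1e-4):
        self.k = k                                  # injection gain (A per V)
        self.alpha = 1.0 / (1.0 + 2.0 * math.pi * f_c * Ts)
        self.v_prev = 0.0                           # previous v_dc sample
        self.y_prev = 0.0                           # previous filter output

    def update(self, v_dc):
        # first-order high-pass: y[n] = a * (y[n-1] + x[n] - x[n-1])
        y = self.alpha * (self.y_prev + v_dc - self.v_prev)
        self.v_prev, self.y_prev = v_dc, y
        return self.k * y                           # current-reference correction (A)

comp = DCLinkCompensator()
i_ref_nominal = 10.0                                # nominal current command (A)
for v_dc in (300.0, 298.5, 301.0, 300.0):           # sampled dc-link voltage (V)
    i_ref = i_ref_nominal + comp.update(v_dc)       # damped current reference
```

The correction acts only while the dc-link voltage is moving, which is how the feedforward reshapes the input impedance in a mid-frequency band without disturbing steady-state operation.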
Abstract:
As a new medium for questionnaire delivery, the internet has the potential to revolutionise the survey process. Online (web-based) questionnaires provide several advantages over traditional survey methods in terms of cost, speed, appearance, flexibility, functionality, and usability [1, 2]. For instance, delivery is faster, responses are received more quickly, and data collection can be automated or accelerated [1-3]. Online-questionnaires can also provide many capabilities not found in traditional paper-based questionnaires: they can include pop-up instructions and error messages; they can incorporate links; and it is possible to encode difficult skip patterns making such patterns virtually invisible to respondents. Like many new technologies, however, online-questionnaires face criticism despite their advantages. Typically, such criticisms focus on the vulnerability of online-questionnaires to the four standard survey error types: namely, coverage, non-response, sampling, and measurement errors. Although, like all survey errors, coverage error (“the result of not allowing all members of the survey population to have an equal or nonzero chance of being sampled for participation in a survey” [2, pg. 9]) also affects traditional survey methods, it is currently exacerbated in online-questionnaires as a result of the digital divide. That said, many developed countries have reported substantial increases in computer and internet access and/or are targeting this as part of their immediate infrastructural development [4, 5]. These trends indicate that familiarity with information technologies is increasing, and suggest that coverage error will rapidly diminish to an acceptable level (for the developed world at least) in the near future, and in so doing, positively reinforce the advantages of online-questionnaire delivery. The second error type – the non-response error – occurs when individuals fail to respond to the invitation to participate in a survey or abandon a questionnaire before it is completed. Given today’s societal trend towards self-administration [2], the former is inevitable, irrespective of delivery mechanism. Conversely, non-response as a consequence of questionnaire abandonment can be relatively easily addressed. Unlike traditional questionnaires, the delivery mechanism for online-questionnaires makes estimation of questionnaire length and time required for completion difficult, thus increasing the likelihood of abandonment. By incorporating a range of features into the design of an online-questionnaire, it is possible to facilitate such estimation – and indeed, to provide respondents with context sensitive assistance during the response process – and thereby reduce abandonment while eliciting feelings of accomplishment [6]. For online-questionnaires, sampling error (“the result of attempting to survey only some, and not all, of the units in the survey population” [2, pg. 9]) can arise when all but a small portion of the anticipated respondent set is alienated (and so fails to respond) as a result of, for example, disregard for varying connection speeds, bandwidth limitations, browser configurations, monitors, hardware, and user requirements during the questionnaire design process. Similarly, measurement errors (“the result of poor question wording or questions being presented in such a way that inaccurate or uninterpretable answers are obtained” [2, pg. 11]) will lead to respondents becoming confused and frustrated.
Sampling, measurement, and non-response errors are likely to occur when an online-questionnaire is poorly designed. Individuals will answer questions incorrectly, abandon questionnaires, and may ultimately refuse to participate in future surveys; thus, the benefit of online-questionnaire delivery will not be fully realized. To prevent errors of this kind, and their consequences, it is extremely important that practical, comprehensive guidelines exist for the design of online-questionnaires. Many design guidelines exist for paper-based questionnaire design (e.g. [7-14]); the same is not true for the design of online-questionnaires [2, 15, 16]. The research presented in this paper is a first attempt to address this discrepancy. Section 2 describes the derivation of a comprehensive set of guidelines for the design of online-questionnaires and briefly (given space restrictions) outlines the essence of the guidelines themselves. Although online-questionnaires reduce traditional delivery costs (e.g. paper, mail out, and data entry), set-up costs can be high given the need either to adopt, and acquire training in, questionnaire development software or to secure the services of a web developer. Neither approach, however, guarantees a good questionnaire (often because the person designing the questionnaire lacks relevant knowledge in questionnaire design). Drawing on existing software evaluation techniques [17, 18], we assessed the extent to which current questionnaire development applications support our guidelines; Section 3 describes the framework used for the evaluation, and Section 4 discusses our findings. Finally, Section 5 concludes with a discussion of further work.
Abstract:
As a new medium for questionnaire delivery, the Internet has the potential to revolutionize the survey process. Online-questionnaires can provide many capabilities not found in traditional paper-based questionnaires. Despite this, and the introduction of a plethora of tools to support online-questionnaire creation, current electronic survey design typically replicates the look-and-feel of paper-based questionnaires, thus failing to harness the full power of the electronic delivery medium. A recent environmental scan of online-questionnaire design tools found that little, if any, support is incorporated within these tools to guide questionnaire designers according to best practice [Lumsden & Morgan 2005]. This paper briefly introduces a comprehensive set of guidelines for the design of online-questionnaires. Drawn from relevant disparate sources, all the guidelines incorporated within the set are proven in their own right; as an initial assessment of the value of the set of guidelines as a practical reference guide, we undertook an informal study to observe the effect of introducing the guidelines into the design process for a complex online-questionnaire. The paper discusses the qualitative findings of this case study, which are encouraging for the role of the guidelines in the ‘bigger picture’ of online survey delivery across many domains such as e-government, e-business, and e-health.
Abstract:
This thesis explores the interaction between Micros (<10 employees) from non-creative sectors and website designers ("Creatives") that occurred when creating a website of a higher order than a basic template site. The research used the Straussian Grounded Theory Method with a longitudinal design, in order to identify what knowledge transferred to the Micros during the collaboration, how it transferred, what factors affected the transfer, and the outcomes of the transfer, including behavioural additionality. To identify whether the research could be extended beyond this, five other design areas were also examined, as well as five Small to Medium Enterprises (SMEs) engaged in website and branding projects. The findings were that, at the start of the design process, many Micros could not articulate their customer knowledge and had poor marketing and visual language skills; such knowledge is core to web design, enabling targeted communication to customers through images. Despite these gaps, most Micros still tried to lead the process. To overcome this disjoint, the majority of the designers used a knowledge transfer strategy termed in this thesis ‘Bi-Modal Knowledge Transfer’, where the Creative was aware of the transfer but the Micro was unaware, both for drawing out customer knowledge from the Micro and for transferring visual language skills to the Micro. Two models were developed to represent this process. Two models were also created to map changes in the knowledge landscapes of customer knowledge and visual language: the Knowledge Placement Model and the Visual Language Scale. The Knowledge Placement Model was used to map the placement of customer knowledge within the consciousness, extending the known Automatic-Unconscious-Conscious model by adding two more locations: Peripheral Consciousness and Occasional Consciousness. Peripheral Consciousness is where potential knowledge is held but not used. Occasional Consciousness is where potential knowledge is held but used only for specific tasks. The Visual Language Scale was created to measure visual language ability, from visually responsive, where the participant only responds personally to visual symbols, to visually multi-lingual, where the participant can use visual symbols to communicate with multiple thought-worlds. With successful Bi-Modal Knowledge Transfer, the outcome included not only an effective website but also changes in the knowledge landscape for the Micros and ongoing behavioural changes, especially in marketing. These effects were not seen in the other design projects, and only in two of the SME projects. The key factors for this difference between SMEs and Micros appeared to be an expectation of knowledge by the Creatives and a failure by the SMEs to transfer knowledge within the company.
Abstract:
The international economic and business environment continues to develop at a rapid rate. Increasing interactions between economies, particularly between Europe and Asia, have raised many important issues regarding transport infrastructure, logistics and broader supply chain management. The potential exists to further stimulate trade provided that these issues are addressed in a logical and systematic manner. However, if this potential is to be realised in practice, there is a need to re-evaluate current supply chain configurations. A mismatch currently exists between the technological capability and the supply chain or logistical reality. This mismatch has sharpened the focus on the need for robust approaches to supply chain re-engineering. Traditional approaches to business re-engineering have been based on manufacturing systems engineering and business process management. A recognition that all companies exist as part of bigger supply chains has fundamentally changed the focus of re-engineering. Inefficiencies anywhere in a supply chain result in the chain as a whole being unable to reach its true competitive potential. This reality, combined with the potentially radical impact on business and supply chain architectures of the technologies associated with electronic business, requires organisations to adopt innovative approaches to supply chain analysis and re-design. This paper introduces a systems approach to supply chain re-engineering which is aimed at addressing the challenges that the evolving business environment brings with it. The approach, which is based on work with a variety of both conventional and electronic supply chains, comprises underpinning principles, a methodology, guidelines on good working practice, and a suite of tools and techniques. The adoption of approaches such as that outlined in this paper helps to ensure that robust supply chains are designed and implemented in practice. This facilitates an integrated approach, with involvement of all key stakeholders throughout the design process.
Abstract:
This paper describes how dimensional variation management could be integrated throughout design, manufacture and verification, to improve quality while reducing cycle times and manufacturing cost in the Digital Factory environment. Initially, variation analysis is used to optimize tolerances during product and tooling design; this also results in the creation of a simplified representation of product key characteristics. This simplified representation can then be used to carry out measurability analysis and process simulation. The link established between the variation analysis model and measurement processes can subsequently be used throughout the production process to automatically update the variation analysis model in real time with measurement data. This ‘live’ simulation of variation during manufacture will allow early detection of quality issues and facilitate autonomous measurement-assisted processes such as predictive shimming. A study is described showing how these principles can be demonstrated using commercially available software combined with a number of prototype applications operating as discrete modules. The commercially available modules include Catia/Delmia for product and process design, 3DCS for variation analysis and Spatial Analyzer for measurement simulation. Prototype modules are used to carry out measurability analysis and instrument selection. Realizing the full potential of metrology in the Digital Factory will require that these modules are integrated, and a software architecture to facilitate this is described. Crucially, this integration must facilitate the use of real-time metrology data describing the emerging assembly to update the digital model.
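The ‘live’ updating idea can be illustrated with a toy Monte Carlo stack-up in which measured features replace their tolerance distributions as metrology data arrives; the feature set, tolerances, and measured value below are hypothetical, and the paper’s actual implementation uses 3DCS and the prototype modules described above.

```python
import random

# Toy 'live' variation analysis: a gap is the stack-up of three feature
# deviations. Before measurement, each feature is sampled from its tolerance
# distribution; once a feature has been measured, its measured deviation is
# fixed, tightening the prediction for the emerging assembly in real time.

TOLERANCES = {"skin": 0.20, "rib": 0.15, "bracket": 0.10}   # +/- mm, ~3 sigma
measured = {"skin": 0.12}      # deviations reported by metrology so far (mm)

def simulate_gap(n_samples=10_000):
    gaps = []
    for _ in range(n_samples):
        gap = 0.0
        for feature, tol in TOLERANCES.items():
            if feature in measured:
                gap += measured[feature]               # known, fixed value
            else:
                gap += random.gauss(0.0, tol / 3.0)    # still uncertain
        gaps.append(gap)
    return sorted(gaps)

gaps = simulate_gap()
print(f"predicted gap: median {gaps[len(gaps) // 2]:+.3f} mm, "
      f"95% within [{gaps[int(0.025 * len(gaps))]:+.3f}, "
      f"{gaps[int(0.975 * len(gaps))]:+.3f}] mm")
```

As more features are measured, the predicted interval collapses toward a single value, which is what enables early detection of quality issues and, for example, the selection of a predictive shim thickness before final assembly.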
Abstract:
The automotive industry combines a multitude of professionals to develop a modern car successfully. Within the design and development teams, the collaboration and interface between Engineers and Designers is critical to ensure design intent is communicated and maintained throughout the development process. This study highlights recent industry practice with the emergence of Concept Engineers in design teams at the Jaguar Land Rover Automotive group. The role of the Concept Engineer emphasises the importance of the Engineering and Design/Styling interface: the Concept Engineer is able to interact with, and understand the challenges and specific languages of, each specialist area, hence improving efficiency and communication within the design team. Automotive education tends to approach design from two distinct directions: that of engineering design, through BSc courses, or a more styling-oriented design approach, through BA and BDes routes. The educational challenge for both types of course is to develop engineers and stylists who have greater understanding and experience of each other's specialist perspective of design and development. The study gives examples of two such courses in the UK that are developing programmes to help students widen their understanding of the engineering and design spectrum. Initial results suggest the practical approach has been well received by students and encouraged by industry, which seeks graduates with specialist knowledge but also a wider appreciation of their role within the design process.
Abstract:
Database design is a difficult problem for non-expert designers. It is desirable to assist such designers during the problem solving process by means of a knowledge based (KB) system. A number of prototype KB systems have been proposed; however, they have many shortcomings. Few have incorporated sufficient expertise in modeling relationships, particularly higher order relationships. There has been no empirical study that experimentally tested the effectiveness of any of these KB tools. The problem solving behavior of non-experts, whom the systems were intended to assist, has not been one of the bases for system design. In this project a consulting system for conceptual database design that addresses the above shortcomings was developed and empirically validated. The system incorporates (a) findings on why non-experts commit errors and (b) heuristics for modeling relationships. Two approaches to knowledge base implementation, system restrictiveness and decisional guidance, were used and compared in this project. The Restrictive approach is proscriptive and limits the designer's choices at various design phases by forcing him/her to follow a specific design path. The Guidance approach, which is less restrictive, provides context specific, informative and suggestive guidance throughout the design process. The main objectives of the study are to evaluate (1) whether the knowledge-based system is more effective than a system without the knowledge base and (2) which knowledge implementation strategy, Restrictive or Guidance, is more effective. To evaluate the effectiveness of the knowledge base itself, the two systems were compared with a system that does not incorporate the expertise (Control). The experimental procedure involved student subjects solving a task without using the system (pre-treatment task) and another task using one of the three systems (experimental task). The experimental task scores of those subjects who performed satisfactorily in the pre-treatment task were analyzed. The results are: (1) the knowledge-based approach to database design support led to more accurate solutions than the control system; (2) there was no significant difference between the two KB approaches; (3) the Guidance approach led to the best performance; and (4) the subjects perceived the Restrictive system as easier to use than the Guidance system.
Abstract:
Antenna design is an iterative process in which structures are analyzed and changed to comply with certain required performance parameters. The classic approach starts with analyzing a "known" structure, obtaining the value of its performance parameter, and changing this structure until the "target" value is achieved. This process relies on having an initial structure that follows some known or "intuitive" patterns already familiar to the designer. The purpose of this research was to develop a method of designing UWB antennas. What is new in this proposal is that the design process is reversed: the designer starts with the target performance parameter and obtains a structure as the result of the design process. This method provided a new way to replicate and optimize existing performance parameters. The base of the method was the use of a Genetic Algorithm (GA) adapted to the format of the chromosome that is evaluated by the Electromagnetic (EM) solver. For the electromagnetic study we used the XFDTD™ program, based on the Finite-Difference Time-Domain technique. The programming portion of the method was created in the MatLab environment, which serves as the interface for converting chromosomes and file formats and for transferring data between XFDTD™ and the GA. A high level of customization had to be written into the code to work with the specific files generated by the XFDTD™ program. Two types of cost functions were evaluated: the first seeking broadband performance within the UWB band, and the second searching for curve replication of a reference geometry. The performance of the method was evaluated considering the speed provided by the computer resources used. A balance between accuracy, data file size and speed of execution was achieved by defining parameters in the GA code as well as changing the internal parameters of the XFDTD™ projects. The results showed that the GA produced geometries that were analyzed by the XFDTD™ program and changed following the search criteria until the target value of the cost function was reached. The results also showed how the parameters can change the search criteria and influence the running of the code to produce a variety of geometries.
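A skeleton of the GA-plus-solver loop described in this abstract is sketched below. Because XFDTD™ is proprietary and its file formats are not described here, the EM evaluation is stubbed out with a hypothetical placeholder (evaluate_with_em_solver, which in the real workflow would write the geometry, run the FDTD simulation, and read back S11); only the GA plumbing (tournament selection, one-point crossover, bit mutation) is meant literally.

```python
import random

GRID = 16 * 16            # chromosome: 16x16 on/off metallization cells

def evaluate_with_em_solver(bits):
    # Hypothetical placeholder for: write geometry file -> run FDTD solver
    # -> read reflection data. A dummy smooth cost stands in here so that
    # the skeleton runs end to end; lower is better.
    return abs(sum(bits) / GRID - 0.55)

def tournament(pop, costs, k=3):
    # pick the lowest-cost individual among k random entrants
    picks = random.sample(range(len(pop)), k)
    return pop[min(picks, key=lambda i: costs[i])]

def crossover_mutate(a, b, p_mut=0.01):
    # one-point crossover followed by per-bit mutation
    cut = random.randrange(GRID)
    child = a[:cut] + b[cut:]
    return [bit ^ (random.random() < p_mut) for bit in child]

pop = [[random.randint(0, 1) for _ in range(GRID)] for _ in range(40)]
for gen in range(30):
    costs = [evaluate_with_em_solver(ind) for ind in pop]
    best = min(costs)
    pop = [crossover_mutate(tournament(pop, costs), tournament(pop, costs))
           for _ in range(len(pop))]
print(f"best cost in final generation: {best:.4f}")
```

In the thesis workflow the cost function is where the two search criteria differ: in-band broadband performance for the first cost function, curve replication of a reference geometry for the second.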
Design optimization of modern machine drive systems for maximum fault tolerance and optimal operation
Abstract:
Modern electric machine drives, particularly three-phase permanent magnet machine drive systems, represent an indispensable part of high-power-density products. Such products include hybrid electric vehicles, large propulsion systems, and automation products. The reliability and cost of these products are directly related to the reliability and cost of these systems. The compatibility of the electric machine and its drive system for optimal cost and operation has been a large challenge in industrial applications. The main objective of this dissertation is to find a design and control scheme for the best compromise between the reliability and optimality of the electric machine-drive system. The effort presented here is motivated by the need to find new techniques to connect the design and control of electric machines and drive systems. A highly accurate and computationally efficient modeling process was developed to monitor the magnetic, thermal, and electrical aspects of the electric machine in its operational environments. The modeling process was also utilized in the design process, in the form of a finite-element-based optimization process, and in a hardware-in-the-loop finite-element-based optimization process. The modeling process was later employed in the design of very accurate and highly efficient physics-based customized observers that are required for fault diagnosis as well as sensorless rotor position estimation. Two test setups with different ratings and topologies were numerically and experimentally tested to verify the effectiveness of the proposed techniques. The modeling process was also employed in the real-time demagnetization control of the machine. Various real-time scenarios were successfully verified. It was shown that this process offers the potential to optimally redefine the assumptions made in sizing the permanent magnets of the machine and the DC bus voltage of the drive for the worst operating conditions. The mathematical development and stability criteria of the physics-based machine model, the design optimization, the physics-based fault diagnosis, and the physics-based sensorless technique are described in detail. To investigate the performance of the developed design test-bed, software and hardware setups were constructed first. Several topologies of the permanent magnet machine were optimized inside the optimization test-bed. To investigate the performance of the developed sensorless control, a test-bed including a 0.25 kW surface-mounted permanent magnet synchronous machine was created. Verification of the proposed technique in a range from medium to very low speed effectively shows the intelligent design capability of the proposed system. Additionally, to investigate the performance of the developed fault diagnosis system, a test-bed including a 0.8 kW surface-mounted permanent magnet synchronous machine with trapezoidal back electromotive force was created. The results verify the proposed technique under dynamic eccentricity, DC bus voltage variations, and harmonic loading conditions, making the system an ideal candidate for propulsion systems.
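For orientation, the sketch below shows a generic textbook back-EMF scheme for sensorless rotor-position estimation in a surface-mounted PM machine. It is not the dissertation's customized physics-based observer, and unlike that observer this simple scheme degrades at very low speed, where the back-EMF vanishes; the machine parameters and sample values are assumptions.

```python
import math

R, L, Ts = 0.5, 1.2e-3, 1e-4   # stator resistance (ohm), inductance (H), sample time (s)

def estimate_theta(v_ab, i_ab, i_ab_prev):
    """Estimate electrical rotor angle from stationary-frame (alpha, beta)
    voltages and currents via the back-EMF of a surface-mounted PM machine."""
    di = [(i_ab[k] - i_ab_prev[k]) / Ts for k in (0, 1)]
    e_alpha = v_ab[0] - R * i_ab[0] - L * di[0]   # back-EMF, alpha axis
    e_beta  = v_ab[1] - R * i_ab[1] - L * di[1]   # back-EMF, beta axis
    # with e_alpha = -w*lam*sin(theta) and e_beta = w*lam*cos(theta):
    return math.atan2(-e_alpha, e_beta)           # electrical angle (rad)

theta = estimate_theta(v_ab=(12.0, 20.0), i_ab=(3.0, 1.0), i_ab_prev=(2.9, 1.1))
print(f"estimated electrical angle: {theta:.3f} rad")
```

The appeal of physics-based observers such as those in the dissertation is precisely that they remain usable in operating regions where this naive scheme fails.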
Abstract:
The primary purpose of this thesis was to present a theoretical large-signal analysis to study the power gain and efficiency of a microwave power amplifier for LS-band communications using software simulation. Power gain, efficiency, reliability, and stability are important characteristics in the power amplifier design process. These characteristics affect advanced wireless systems, which require low-cost device amplification without sacrificing system performance. Large-signal modeling and input and output matching components are used in this thesis. Motorola's Electro Thermal LDMOS model is a new transistor model that includes self-heating effects and is capable of small- and large-signal simulations. It allows most of the design considerations to focus on stability, power gain, bandwidth, and DC requirements. The matching technique allows the gain to be maximized at a specific target frequency. Calculations and simulations for the microwave power amplifier design were performed using Matlab and Microwave Office, the simulation software used in this thesis, respectively. The study demonstrated that Motorola's Electro Thermal LDMOS transistor is a viable solution for common-source amplifier applications in high-power base stations. The MET-LDMOS met the stability requirements for the specified frequency range without a stability-improvement model. The power gain of the amplifier circuit was improved through proper microwave matching design using input/output-matching techniques. The gain and efficiency of the amplifier improved by approximately 4 dB and 7.27%, respectively. The gain value is roughly 0.89 dB higher than the maximum gain specified by the MRF21010 data sheet. This work can lead to efficient modeling and development of high-power LDMOS transistor implementations in commercial and industrial applications.
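The two figures of merit quoted above follow directly from their standard definitions; the sketch below works the arithmetic with illustrative power levels (the drive, output, and supply values are assumptions, not data from the thesis).

```python
import math

def gain_db(p_out_w, p_in_w):
    # power gain in dB: 10*log10(Pout/Pin)
    return 10.0 * math.log10(p_out_w / p_in_w)

def drain_efficiency_pct(p_out_w, v_dc, i_dc):
    # drain efficiency: RF output power over DC input power
    return 100.0 * p_out_w / (v_dc * i_dc)

p_in, p_out = 0.5, 10.0                      # W, illustrative drive and output
print(f"gain: {gain_db(p_out, p_in):.2f} dB")                    # 13.01 dB
print(f"efficiency: {drain_efficiency_pct(p_out, 26.0, 1.1):.1f} %")  # ~35 %
```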
Abstract:
Database design is a difficult problem for non-expert designers. It is desirable to assist such designers during the problem solving process by means of a knowledge based (KB) system. Although a number of prototype KB systems have been proposed, they have many shortcomings. Firstly, few have incorporated sufficient expertise in modeling relationships, particularly higher order relationships. Secondly, there does not seem to be any published empirical study that experimentally tested the effectiveness of any of these KB tools. Thirdly, the problem solving behavior of non-experts, whom the systems were intended to assist, has not been one of the bases for system design. In this project, a consulting system for conceptual database design, called CODA, that addresses the above shortcomings was developed and empirically validated. More specifically, the CODA system incorporates (a) findings on why non-experts commit errors and (b) heuristics for modeling relationships. Two approaches to knowledge base implementation were used and compared in this project, namely system restrictiveness and decisional guidance (Silver 1990). The Restrictive system uses a proscriptive approach and limits the designer's choices at various design phases by forcing him/her to follow a specific design path. The Guidance system approach, which is less restrictive, involves providing context specific, informative and suggestive guidance throughout the design process. Both approaches are intended to prevent erroneous design decisions. The main objectives of the study are to evaluate (1) whether the knowledge-based system is more effective than the system without a knowledge base and (2) which approach to knowledge implementation, Restrictive or Guidance, is more effective. To evaluate the effectiveness of the knowledge base itself, the systems were compared with a system that does not incorporate the expertise (Control). An experimental procedure using student subjects was used to test the effectiveness of the systems. The subjects solved a task without using the system (pre-treatment task) and another task using one of the three systems, viz. Control, Guidance or Restrictive (experimental task). Analysis of the experimental task scores of those subjects who performed satisfactorily in the pre-treatment task revealed that the knowledge based approach to database design support led to more accurate solutions than the control system. Of the two KB approaches, the Guidance approach was found to lead to better performance when compared with the Control system. It was found that the subjects perceived the Restrictive system as easier to use than the Guidance system.