901 results for Computer Graphics and Computer-Aided Design
Abstract:
In the UK, architectural design is regulated through a system of design control in the public interest, which aims to secure and promote ‘quality’ in the built environment. Design control is primarily implemented by locally employed planning professionals with political oversight, and by independent design review panels staffed predominantly by design professionals. Design control has a lengthy and complex history, with the concept of ‘design’ offering a range of challenges for a regulatory system of governance. A simultaneously creative and emotive discipline, architectural design is a difficult issue to regulate objectively or consistently, often leading to policy that is regarded as highly discretionary and flexible. This makes regulatory outcomes difficult to predict, as the approaches undertaken by the ‘agents of control’ can vary according to the individual. The role of the design controller is therefore central, tasked with the responsibility of interpreting design policy and guidance, appraising design quality, and passing professional judgment. However, little is known about what influences the way design controllers approach their task, providing a ‘veil’ over design control and shrouding the basis of their decisions. This research engaged directly with the attitudes and perceptions of design controllers in the UK, lifting this ‘veil’. Using in-depth interviews and Q-Methodology, the thesis explores this hidden element of control, revealing a number of key differences in how controllers approach and implement policy and guidance, conceptualise design quality, and rationalise their evaluations and judgments.
The research develops a conceptual framework for agency in design control – this consists of six variables (Regulation; Discretion; Skills; Design Quality; Aesthetics; and Evaluation) and it is suggested that this could act as a ‘heuristic’ instrument for UK controllers, prompting more reflexivity in relation to evaluating their own position, approaches, and attitudes, leading to better practice and increased transparency of control decisions.
Abstract:
The top managers of a biotechnology startup firm agreed to participate in a system dynamics modeling project to help them think about the firm's growth strategy. The article describes how the model was created and used to stimulate debate and discussion about growth management. The paper highlights several novel features about the process used for capturing management team knowledge. A heavy emphasis was placed on mapping the operating structure of the factory and distribution channels. Qualitative modeling methods (structural diagrams, descriptive variable names, and friendly algebra) were used to capture the management team's descriptions of the business. Simulation scenarios were crafted to stimulate debate about strategic issues such as capacity allocation, capacity expansion, customer recruitment, customer retention, and market growth, and to engage the management team in using the computer to design strategic scenarios. The article concludes with comments on the impact of the project.
Abstract:
The idea of buildings in harmony with nature can be traced back to ancient times. Increasing concern with sustainability-oriented buildings has added new challenges to architectural design and called for new design responses. Sustainable design integrates and balances the human geometries and the natural ones. Since fractal geometry is often described as the language of nature, it is natural to assume that it could play a role in developing new forms of aesthetics and sustainable architectural design. This paper gives a brief description of fractal geometry theory and presents its current status and recent developments through an illustrative review of fractal case studies in architectural design, providing a bridge between fractal geometry and architectural design.
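The quantitative core of fractal geometry that such case studies draw on is the notion of a non-integer dimension. As a minimal illustration (not taken from the paper), the similarity dimension D = log N / log(1/r) of the Koch curve, a shape often used in fractal analyses of facades, can be computed directly:

```python
import math

# Similarity dimension of a self-similar fractal: N copies, each scaled by r.
# For the Koch curve, each segment is replaced by 4 copies at 1/3 scale.
def similarity_dimension(copies: int, scale: float) -> float:
    return math.log(copies) / math.log(1.0 / scale)

print(round(similarity_dimension(4, 1 / 3), 4))  # → 1.2619
```

A dimension strictly between 1 and 2 is what distinguishes such a curve from ordinary architectural geometry, and it is this kind of measure that fractal analyses of buildings typically compute.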
Abstract:
Objective: To clarify how infection control requirements are represented, communicated, and understood in work interactions through the medical facility construction project life cycle. To assist project participants with effective infection control management by highlighting the nature of such requirements and presenting recommendations to aid practice. Background: A 4-year study regarding client requirement representation and use on National Health Service construction projects in the United Kingdom provided empirical evidence of infection control requirement communication and understanding through design and construction work interactions. Methods: An analysis of construction project resources (e.g., infection control regulations and room data sheets) was combined with semi-structured interviews with hospital client employees and design and construction professionals to provide valuable insights into the management of infection control issues. Results: Infection control requirements are representationally indistinct but also omnipresent through all phases of the construction project life cycle: Failure to recognize their nature, relevance, and significance can result in delays, stoppages, and redesign work. Construction project resources (e.g., regulatory guidance and room data sheets) can mask or obscure the meaning of infection control issues. Conclusions: A preemptive identification of issues combined with knowledge sharing activities among project stakeholders can enable infection control requirements to be properly understood and addressed. Such initiatives should also reference existing infection control regulatory guidance and advice.
Abstract:
This paper reviews the literature on the practice of using Online Analytical Processing (OLAP) systems to recall information stored by Online Transactional Processing (OLTP) systems. The review provides a basis for discussing the need for information recalled through OLAP systems to maintain the context of the transactions captured by the respective OLTP system. The paper observes an industry trend in which OLTP systems process information into data that are then stored in databases without the business rules used to process them. This necessitates a practice whereby sets of business rules are used to extract, cleanse, transform, and load data from disparate OLTP systems into OLAP databases to support complex reporting and analytics. These sets of business rules are usually not the same as the business rules used to capture the data in the originating OLTP systems. The paper argues that differences between the business rules used to interpret the same data sets risk gaps in semantics between information captured by OLTP systems and information recalled through OLAP systems. Literature on modeling business transaction information as facts with context, as part of information systems modeling, was reviewed to identify design trends contributing to the design quality of OLTP and OLAP systems. The paper then argues that the design quality of OLTP and OLAP systems critically depends on the capture of facts with associated context, the encoding of facts with context into data with business rules, the storage and sourcing of data with business rules, the decoding of data with business rules back into facts with context, and the recall of facts with associated context.
The paper proposes UBIRQ, a design model to aid the co-design of data and business-rule storage for OLTP and OLAP purposes. The proposed model creates the opportunity to implement multi-purpose databases and business-rule stores shared by OLTP and OLAP systems. Such implementations would enable OLTP systems to record and store data together with the executions of business rules, allowing both OLTP and OLAP systems to query data alongside the business rules used to capture them, thereby ensuring that information recalled via OLAP systems preserves the context of the transactions as captured by the respective OLTP system.
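A minimal sketch of the co-design idea pairs each stored fact with the identifier and version of the business rule that produced it, so that the capture context can be recalled later. Every name here (`Fact`, `BusinessRule`, `RuleStore`) is invented for illustration and is not part of UBIRQ:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class BusinessRule:
    rule_id: str
    version: int
    description: str  # the capture context an OLAP query should recover

@dataclass(frozen=True)
class Fact:
    value: float
    rule_id: str       # link back to the rule that encoded this fact
    rule_version: int

class RuleStore:
    """Shared rule store consulted by both OLTP capture and OLAP recall."""
    def __init__(self):
        self._rules = {}

    def register(self, rule: BusinessRule):
        self._rules[(rule.rule_id, rule.version)] = rule

    def context_for(self, fact: Fact) -> str:
        # Because the fact carries its rule identity, the OLAP side can
        # recover the exact semantics under which the value was captured.
        return self._rules[(fact.rule_id, fact.rule_version)].description

store = RuleStore()
store.register(BusinessRule("VAT-UK", 2, "Sales amount net of 20% UK VAT"))
fact = Fact(value=100.0, rule_id="VAT-UK", rule_version=2)
print(store.context_for(fact))  # prints the capture context
```

Versioning the rules matters: if the VAT rule changes, older facts still point at the version in force when they were recorded, which is precisely the semantic gap the paper says conventional ETL pipelines lose.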
Abstract:
This text extends some ideas presented in a keynote lecture at the 5th Encontro de Tipografia conference, in Barcelos, Portugal, in November 2014. The paper discusses problems of identifying the location and encoding of design decisions, the implications of digital workflows for capturing knowledge generated through design practice, and the consequences of the transformation of production tools into commodities. It concludes with a discussion of the perception of added value in typeface design.
Abstract:
The most significant radiation field nonuniformity is the well-known Heel effect. This nonuniform beam effect has a negative influence on the results of computer-aided diagnosis of mammograms, which is frequently used for early cancer detection. This paper presents a method to correct all pixels in the mammographic image according to the excess or deficit of radiation to which they have been exposed as a result of this effect. The simulation method calculates the intensities at all points of the image plane. In the simulated image, the percentage of radiation received by each point takes the center of the field as reference. In the digitized mammogram, the percentages of optical density of all pixels of the analyzed image are also calculated. The Heel effect produces a Gaussian distribution around the anode-cathode axis and a logarithmic distribution parallel to this axis. These characteristic distributions are used to determine the center of the radiation field as well as the cathode-anode axis, allowing the correlation between these two sets of data to be determined automatically. The measurements obtained with the proposed method differ on average by 2.49 mm in the direction perpendicular to the anode-cathode axis and 2.02 mm parallel to it from those of commercial equipment. The method eliminates around 94% of the Heel effect in the radiological image, so that objects reflect their x-ray absorption. The method was evaluated with experimental data from known objects, but it could also be applied to clinical and digital images.
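The correction described above, dividing every pixel by its simulated relative exposure with the field centre as the 100% reference, can be sketched as follows. The functional forms and all parameter values in `heel_effect_map` are assumptions for illustration, not the authors' calibrated model:

```python
import numpy as np

def heel_effect_map(shape, sigma_perp, k_par):
    """Simulated relative exposure: Gaussian across the anode-cathode axis,
    logarithmic fall-off along it; normalised to the field centre."""
    h, w = shape
    y = np.arange(h)[:, None] - h / 2      # offset perpendicular to the axis
    x = np.arange(w)[None, :]              # position along the axis
    gauss = np.exp(-(y ** 2) / (2 * sigma_perp ** 2))
    log_term = 1.0 - k_par * np.log1p(x / w)
    field = gauss * log_term
    return field / field[h // 2, w // 2]   # centre of the field = 100%

def correct(image, field):
    # Undo the excess/deficit of radiation each pixel received.
    return image / field

# A uniform phantom multiplied by the field then corrected should come
# back uniform, which is the behaviour the abstract's 94% figure targets.
img = np.full((64, 64), 50.0)
field = heel_effect_map(img.shape, sigma_perp=40.0, k_par=0.3)
flat = correct(img * field, field)
```

In the real method the field map is not assumed but recovered from the image itself, by fitting the Gaussian and logarithmic profiles to locate the field centre and the cathode-anode axis.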
Abstract:
In medical processes where ionizing radiation is used, dose planning and dose delivery are the key elements for patient safety and treatment success, particularly when the delivered dose in a single session of treatment can be an order of magnitude higher than the regular doses of radiotherapy. Therefore, the radiation dose should be well defined and precisely delivered to the target while minimizing radiation exposure to surrounding normal tissues [1]. Several methods have been proposed to obtain three-dimensional (3-D) dose distributions [2, 3]. In this paper, we propose an alternative method which can be easily implemented in any stereotactic radiosurgery center with a magnetic resonance imaging (MRI) facility. A phantom, with or without scattering centers, filled with Fricke gel solution is irradiated with a Gamma Knife® system at a chosen spot. The phantom can be a replica of a human organ such as the head, breast, or any other organ. It can even be constructed from a real 3-D MR image of a patient's organ using computer-aided construction and irradiated at a specific region corresponding to the tumor position determined by MRI. The spin-lattice relaxation time T1 of different parts of the irradiated phantom is determined by localized spectroscopy. The T1-weighted phantom images are used to correlate image pixel intensity to the absorbed dose, and consequently a 3-D dose distribution with high resolution is obtained.
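The final step, correlating image intensity with absorbed dose, rests on the roughly linear dependence of the Fricke gel relaxation rate R1 = 1/T1 on dose. A minimal calibration sketch, with made-up coefficients rather than measured data, might look like:

```python
import numpy as np

# Calibration phantoms irradiated at known doses (Gy); the relaxation
# rate R1 = 1/T1 (s^-1) is assumed to respond linearly to dose.
# Both the baseline 0.4 and the slope 0.05 are illustrative values.
doses = np.array([0.0, 5.0, 10.0, 20.0])
r1 = 0.4 + 0.05 * doses

# Fit the linear calibration, then invert it to map a measured T1 to dose.
slope, intercept = np.polyfit(doses, r1, 1)

def dose_from_t1(t1_seconds: float) -> float:
    return (1.0 / t1_seconds - intercept) / slope

# A voxel whose T1 corresponds to 8 Gy should map back to 8 Gy.
print(round(dose_from_t1(1.0 / (0.4 + 0.05 * 8.0)), 6))  # → 8.0
```

Applying `dose_from_t1` voxel by voxel to a T1 map of the irradiated phantom is what turns the MR images into the 3-D dose distribution the abstract describes.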
Abstract:
The immersed boundary method is a versatile tool for the investigation of flow-structure interaction. In a large number of applications, the immersed boundaries or structures are very stiff, and strong tangential forces on these interfaces induce a well-known, severe time-step restriction for explicit discretizations. This excessive stability constraint can be removed with fully implicit or suitable semi-implicit schemes, but at a seemingly prohibitive computational cost. While economical alternatives have been proposed recently for some special cases, there is a practical need for a computationally efficient approach that can be applied more broadly. In this context, we revisit a robust semi-implicit discretization introduced by Peskin in the late 1970s which has received renewed attention recently. This discretization, in which the spreading and interpolation operators are lagged, leads to a linear system of equations for the interface configuration at the future time, when the interfacial force is linear. However, this linear system is large and dense, and thus it is challenging to streamline its solution. Moreover, while the same linear system or one of similar structure could potentially be used in Newton-type iterations, nonlinear and highly stiff immersed structures pose additional challenges to iterative methods. In this work, we address these problems and propose cost-effective computational strategies for solving Peskin's lagged-operators type of discretization. We do this by first constructing a sufficiently accurate approximation to the system's matrix, and we obtain a rigorous estimate for this approximation. This matrix is expeditiously computed using a combination of pre-calculated values and interpolation. The availability of a matrix allows for more efficient matrix-vector products and facilitates the design of effective iterative schemes.
We propose efficient iterative approaches to deal with both linear and nonlinear interfacial forces and simple or complex immersed structures with tethered or untethered points. One of these iterative approaches employs a splitting in which we first solve a linear problem for the interfacial force and then we use a nonlinear iteration to find the interface configuration corresponding to this force. We demonstrate that the proposed approach is several orders of magnitude more efficient than the standard explicit method. In addition to considering the standard elliptical drop test case, we show both the robustness and efficacy of the proposed methodology with a 2D model of a heart valve. (C) 2009 Elsevier Inc. All rights reserved.
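The stability issue motivating this work can be seen in miniature with a single tethered point whose interfacial force F(X) = -kX is stiff. This is only a toy illustration of explicit versus implicit time stepping, not the paper's lagged-operator scheme; k, dt, and the step count are arbitrary choices:

```python
# Stiff tether: dX/dt = -k X. For forward Euler the update factor is
# (1 - k*dt), unstable whenever |1 - k*dt| > 1, i.e. dt > 2/k.
# Backward Euler divides by (1 + k*dt) and decays for any dt > 0.
k, dt, steps = 1.0e4, 1.0e-3, 50   # dt is far above the explicit limit 2/k
x_exp = x_imp = 1.0
for _ in range(steps):
    x_exp = x_exp + dt * (-k * x_exp)   # forward Euler: factor -9 per step
    x_imp = x_imp / (1.0 + k * dt)      # backward Euler: factor 1/11 per step
print(abs(x_exp) > 1.0, 0.0 <= x_imp < 1.0)  # → True True
```

Forward Euler amplifies the solution ninefold per step while backward Euler damps it, which is why implicit or semi-implicit treatment of stiff immersed boundaries is worth its much higher cost per step, and why making that cost tractable is the paper's contribution.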
Abstract:
Energy efficiency and renewable energy use are the two main priorities leading to industrial sustainability nowadays, according to the European Steel Technology Platform (ESTP). Industries can undertake modernization efforts to improve the energy consumption of their production lines. Steel-making industrial applications are currently energy and emission intensive. It was estimated that over the past years energy consumption and the corresponding CO2 generation increased steadily, reaching approximately 338.15 parts per million in August 2010 [1]. Such facts and statistics leave considerable room for improving the energy efficiency of industrial applications through modernization and the use of renewable energy sources such as solar photovoltaic (PV) systems. The purpose of this thesis work is to make a preliminary design and simulation of a solar photovoltaic system intended to cover the energy demand of the initial part of the pickling line hydraulic system at the SSAB steel plant. For this purpose, the energy consumption of this hydraulic system is studied and evaluated, and a general analysis of the performance of the hydraulic and control components is carried out, yielding a set of guidelines contributing towards future energy savings. The energy efficiency analysis showed that the initial part of the pickling line hydraulic system worked at a low efficiency of 3.3%. The general analysis showed that hydraulic accumulators of 650 liter size should be used in the initial part of the pickling line system in combination with a single pump delivery of 100 l/min. Based on this, one PV system can deliver energy to an AC motor-pump set covering 17.6% of total energy, and another PV system can supply a DC hydraulic pump substituting 26.7% of the demand. The first system used 290 m2 of roof area and was sized at 40 kWp; the second used 109 m2 and was sized at 15.2 kWp.
It was concluded that the reason for the low efficiency was the oversized design of the system. Incremental modernization efforts could help improve the hydraulic system's energy efficiency and make the design of the solar photovoltaic system realistically possible. Two types of PV systems were analyzed in the thesis work. A method was found for calculating the load simulation sequence based on the energy efficiency studies, to help in the PV system simulations. Hydraulic accumulators integrated into the pickling line also worked as energy storage when charged by the PV system.
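The sizing figures quoted above imply a consistent module power density, which can be checked with simple arithmetic (only the derived density values are new here):

```python
# Sizing figures as stated in the abstract.
roof_area_1, peak_kw_1 = 290.0, 40.0    # first PV system (AC motor-pump set)
roof_area_2, peak_kw_2 = 109.0, 15.2    # second PV system (DC hydraulic pump)

# Implied peak power density (kWp per m^2) of the assumed modules.
density_1 = peak_kw_1 / roof_area_1
density_2 = peak_kw_2 / roof_area_2
print(round(density_1, 3), round(density_2, 3))  # → 0.138 0.139
```

Both systems come out at roughly 0.14 kWp per square metre, consistent with a single module type being assumed across the two designs.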
Abstract:
One of the main aims of this thesis is to design an optimized commercial photovoltaic (PV) system in Barbados from several variables, such as racking type, module type, and inverter type, based on practicality, technical performance, and financial returns to the client. Detailed simulations are done in PVSYST, and financial models are used to compare different systems and their viability. Once the preeminent system is determined from a financial and performance perspective, a detailed design is done using PVSYST and AutoCAD to produce the optimal PV system for the customer. In doing so, suitable engineering drawings are generated which are detailed enough for construction of the system. Detailed costs, with quotes from relevant manufacturers, suppliers, and estimators, are instrumental in determining balance-of-system costs in addition to total project cost. The final simulated system is suggested with a PV capacity of 425 kW and an inverter output of 300 kW, resulting in an array oversizing of 1.42. The PV system has a weighted performance ratio of 77%, a specific yield of 1467 kWh/kWp, and a projected annual production of 624 MWh/yr. This system is estimated to offset approximately 28% of Carlton's electrical load annually. Over the course of 20 years, the PV system is projected to produce electricity at a cost of $0.201 USD/kWh, significantly lower than the $0.35 USD/kWh paid to the utility at the time of writing this thesis. Due to the high cost of electricity on the island, an attractive feed-in tariff is not necessary to warrant the installation of a commercial system which, over its lifetime, produces electricity at less than 60% of the cost to a user purchasing electricity from the utility.
A simple payback period of 5.4 years and a return on investment of 17% without incentives, in addition to an estimated diversion of 6840 barrels of oil or 2168 tonnes of CO2, provide compelling justification for the installation of a commercial photovoltaic system, not only on Carlton A-1 Supermarket but also island wide, as well as regionally, where most electricity supplies come from imported fossil fuels.
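The headline financial figures can be sanity-checked with back-of-envelope arithmetic; the implied project cost below is inferred from the stated payback and is not a figure given in the thesis:

```python
# Figures stated in the abstract.
annual_yield_kwh = 624_000.0          # 624 MWh/yr projected production
utility_rate = 0.35                   # USD/kWh paid to the utility
payback_years = 5.4                   # stated simple payback

# Simple payback = project cost / annual savings, so the cost is implied.
annual_savings = annual_yield_kwh * utility_rate   # 218,400 USD/yr
implied_cost = payback_years * annual_savings

# The "less than 60% of the utility cost" claim follows from the LCOE.
lcoe_ratio = 0.201 / 0.35
print(round(implied_cost), round(lcoe_ratio, 2))   # → 1179360 0.57
```

The LCOE ratio of about 0.57 confirms the claim that the system produces electricity at under 60% of the utility's rate over its lifetime.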
Abstract:
Virtual Reality is a relatively new technology in the young field of computer science. The design of Virtual Reality systems has only recently come under discussion, as have the implications of this sort of design. I hope to determine how a user can work most efficiently and accurately in a virtual world, and by studying this I hope to help in the standardization of Virtual Reality design.
Abstract:
The movement of graphics and audio programming towards three dimensions aims to better simulate the way we experience our world. In this project I used methods for coming closer to such simulation via realistic graphics and sound combined with a natural interface. I did most of my work on a Dell OptiPlex with an 800 MHz Pentium III processor and an NVIDIA GeForce 256 AGP Plus graphics accelerator, high-end products in the consumer market as of April 2000. For graphics, I used OpenGL [1], an open-source, multi-platform set of graphics libraries that is relatively easy to use, coded in C. The basic engine I first put together was a system to place objects in a scene and to navigate around the scene in real time. Once I accomplished this, I was able to investigate specific techniques for making parts of a scene more appealing.
Abstract:
This thesis aims to describe and demonstrate the concept developed to facilitate the use of thermal simulation tools during the building design process. Despite the impact of architectural elements on the performance of buildings, some influential decisions are frequently based solely on qualitative information. Even though such design support is adequate for most decisions, the designer will eventually have doubts concerning the performance of some design decisions. These situations require some kind of additional knowledge to be properly approached. The concept of designerly ways of simulating focuses on the formulation and solution of design dilemmas: doubts about the design that cannot be fully understood or solved without using quantitative information. The concept intends to combine the power of analysis of computer simulation tools with the capacity for synthesis of architects. Three types of simulation tools are considered: solar analysis, thermal/energy simulation, and CFD. Design dilemmas are formulated and framed according to the architect's reflection process about performance aspects. Throughout the thesis, the problem is investigated in three fields: professional, technical, and theoretical. This approach to distinct parts of the problem aimed to i) characterize different professional categories with regard to their design practice and use of tools, ii) investigate preceding research on the use of simulation tools, and iii) draw analogies between the proposed concept and concepts developed or described in previous works on design theory. The proposed concept was tested on eight design dilemmas extracted from three case studies in the Netherlands. The three investigated processes concern houses designed by Dutch architectural firms. Relevant information and criteria for each case study were obtained through interviews and conversations with the involved architects.
The practical application, despite its success in the research context, allowed the identification of some limitations to the applicability of the concept, concerning the architects' need for technical knowledge and the current stage of evolution of simulation tools.