Abstract:
Sustainability concerns every citizen. Housing affordability and sustainable solutions are being highlighted in research and practice in many parts of the world. This paper discusses the development of a Commuter Energy and Building Utilities System (CEBUS) in sustainable housing projects as a means of bridging the gap between current median house pricing and target affordable house pricing for low-income earners. Comparable scales of sustainable housing development cannot be achieved through the independent application of current best-practice methods in ecologically sustainable development strategies or transit-oriented development master plans. This paper presents the initial stage of research on the first-capital and ongoing utilities and transport cost savings available from these sustainable design methods. It also outlines further research and development of a CEBUS Dynamic Simulation Model and Conceptual Framework for the Australian property development and construction industry.
Abstract:
High-fidelity simulation as a teaching and learning approach is being embraced by many schools of nursing. Our school embarked on integrating high-fidelity (HF) simulation into the undergraduate clinical education program in 2011. Low- and medium-fidelity simulation had been used for many years, but this did not simplify the integration of HF simulation. Alongside considerations of how and where HF simulation would be integrated, issues arose with: student consent and participation in observed activities; data management of video files; staff development; and conceptualising how methods for student learning could be researched. Simulation for undergraduate student nurses commenced as a formative learning activity, undertaken in groups of eight, where four students undertake the 'doing' role and four are structured observers, who then take a formal role in the simulation debrief. Challenges for integrating simulation into student learning included conceptualising and developing scenarios to trigger students' decision making and application of the skills, knowledge and attitudes needed to solve clinical 'problems'. Developing and planning scenarios for students to 'try out' skills and make decisions for problem solving went beyond choosing the pre-existing scenarios built into the software. The supplied scenarios were not concept based but rather focussed on knowledge, skills and the technology (of the manikin). The challenge lay in using the technology to build conceptual mastery rather than using it simply because it was available. As we integrated HF simulation into the final year of the program, the focus was on building skills, knowledge and attitudes that went beyond technical skill and provided an opportunity to bridge the gap with theory-based knowledge that students often found difficult to link to clinical reality. We wished to provide opportunities to develop experiential knowledge based on application and clinical reasoning processes in team environments where problems are encountered and, to solve them, the nurse must show leadership and direction. Other challenges included students consenting to simulations being videotaped, and the ethical considerations this raises. For example, if one student in a group of eight did not consent, did this mean they missed the opportunity to undertake simulation, or that others in the group might be disadvantaged by being unable to review their performance? This has implications for freely given consent, but also for equity of access to learning opportunities for students who wished to be taped and those who did not. Alongside this issue were the details of data management, storage and access. Developing staff with varying levels of computer skills to use the software, and to take a different approach to being the 'teacher', required innovation, for which we took an experiential approach. Considering explicit learning approaches to be trialled was not a difficult proposition, but working out how to enact this as research, with issues of blinding, timetabling of blinded groups, reducing bias when testing the results of different learning approaches, and gaining ethical approval, was problematic. This presentation gives examples of these challenges and how we overcame them.
Abstract:
Emerging from the challenge to reduce energy consumption in buildings is a need for research and development into the more effective use of simulation as a decision-support tool. Despite significant research, persistent limitations in process and software inhibit the integration of energy simulation into early architectural design. This paper presents a Green Star case study to highlight the obstacles commonly encountered with current integration strategies. It then examines simulation-based design in the aerospace industry, which has overcome similar limitations. Finally, it proposes a design system based on this contrasting approach, coupling parametric modelling and energy simulation software for rapid, iterative performance assessment of early design options.
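As a rough illustration of the kind of coupling this paper proposes, the sketch below pairs a toy parametric model with a stand-in energy evaluation and ranks the variants. Everything here is an assumption for illustration: `run_energy_sim` is a hypothetical placeholder for a real engine (e.g. EnergyPlus behind a wrapper), and the parameters and surrogate formula are invented, not the paper's system.

```python
# A minimal sketch, assuming a hypothetical simulator wrapper: generate
# parametric design variants, score each with an energy simulation, rank them.
import math
from dataclasses import dataclass
from itertools import product

@dataclass(frozen=True)
class DesignOption:
    window_to_wall_ratio: float
    orientation_deg: float
    shading_depth_m: float

def run_energy_sim(option: DesignOption) -> float:
    """Placeholder for an external energy engine. Returns annual energy
    use intensity (kWh/m2). Toy surrogate so the sketch runs end to end."""
    orientation_factor = 1.0 + 0.2 * math.cos(math.radians(option.orientation_deg))
    solar_gain = (option.window_to_wall_ratio
                  * (1.0 - 0.6 * option.shading_depth_m) * orientation_factor)
    return 80.0 + 40.0 * solar_gain

# Enumerate early design variants parametrically and rank by performance.
options = [DesignOption(wwr, az, d)
           for wwr, az, d in product((0.3, 0.5, 0.7),
                                     (0, 90, 180, 270),
                                     (0.0, 0.5, 1.0))]
ranked = sorted(options, key=run_energy_sim)
best = ranked[0]
print(f"Best option: {best} -> {run_energy_sim(best):.1f} kWh/m2")
```

The point of the loop is speed of iteration: because every variant is generated and evaluated automatically, the designer can sweep the early option space rather than simulate a handful of hand-built models.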
Abstract:
This paper investigates the cooling energy performance of a commercial building before and after green roof and living wall application, based on an integrated building heat gain model developed from the Overall Thermal Transfer Value (OTTV) of the building walls and the steady-state heat transfer process of the roof in a sub-tropical climate. Using the modelled equation and the eQUEST energy simulation tool, commercial building envelope parameters and relevant heat gain parameters were compiled to analyse the heat gain and cooling energy consumption of the building. Real-life commercial building envelope and air-conditioning load data for the sub-tropical climate zone were collected and compared with the modelled analysis. Temperature data required for the living wall and green roof analysis were collected from an experimental setup comprising both a green roof and a west-facing living wall. The building's heat flux and cooling energy performance before and after green roof and living wall application were then scrutinised.
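For readers unfamiliar with OTTV, the sketch below shows a generic OTTV-style wall heat gain calculation of the kind the paper builds on. The coefficients, input values, and the way the living wall is represented (as a reduced equivalent temperature difference on the opaque fraction) are illustrative assumptions only, not the paper's calibrated model.

```python
# A minimal sketch of a generic OTTV-style wall heat gain calculation.
# Exact forms and coefficients vary between building codes; values below
# are illustrative assumptions.

def wall_ottv(u_wall, u_glass, wwr, td_eq, dt, sc, sf, alpha=0.7):
    """Overall Thermal Transfer Value of a wall (W/m2).

    u_wall, u_glass : U-values of opaque wall and glazing (W/m2.K)
    wwr             : window-to-wall ratio (0..1)
    td_eq           : equivalent temperature difference, opaque wall (K)
    dt              : indoor-outdoor temperature difference, glazing (K)
    sc, sf          : shading coefficient and solar factor (W/m2)
    alpha           : solar absorptance of the opaque surface
    """
    conduction_opaque = alpha * u_wall * (1.0 - wwr) * td_eq
    conduction_glass = u_glass * wwr * dt
    solar_glass = sc * wwr * sf
    return conduction_opaque + conduction_glass + solar_glass

# Heat gain before and after a living wall, modelled here only as a
# lower equivalent temperature difference on the shaded opaque surface.
before = wall_ottv(u_wall=2.5, u_glass=5.8, wwr=0.4, td_eq=15, dt=8, sc=0.5, sf=150)
after = wall_ottv(u_wall=2.5, u_glass=5.8, wwr=0.4, td_eq=8, dt=8, sc=0.5, sf=150)
print(f"OTTV before: {before:.1f} W/m2, after: {after:.1f} W/m2")
```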
Abstract:
As global warming entails new conditions for the built environment, the thermal behaviour of existing buildings, which were designed against current weather data, may become uncertain and remains a great concern. Through building computer simulation, this paper investigates the sensitivity of different office building zones to potential global warming. For the sample office building examined, it is found that, compared with the middle and top floors, the ground floor appears most sensitive to the effect of global warming in most cities and has the highest tendency to overheat. From the analysis of the responses of different zone orientations to the increase in outdoor air temperature, it is also found that responses differ widely between orientations, with the south-facing and core zones being most sensitive. As the external air temperature increases, the differences between floors and between zone orientations become more significant.
Abstract:
This thesis develops a detailed conceptual design method and a system software architecture, defined as a parametric and generative evolutionary design system, to support an integrated, interdisciplinary building design approach. The research recognises the need to shift design effort toward the earliest phases of the design process to support crucial design decisions that have a substantial cost implication for the overall project budget. The overall motivation of the research is to improve the quality of designs produced at the author's employer, the General Directorate of Major Works (GDMW) of the Saudi Arabian Armed Forces. GDMW produces many buildings that have standard requirements across a wide range of environmental and social circumstances, so a rapid means of customising designs for local circumstances would have significant benefits. The research considers the use of evolutionary genetic algorithms in the design process and their ability to generate and assess a wider range of potential design solutions than a human could manage. This wider-ranging assessment, during the early stages of the design process, means that the generated solutions will be more appropriate for the defined design problem. The research proposes a design method and system that promote a collaborative relationship between human creativity and computer capability. A tectonic design approach is adopted as a process-oriented design that values the process of design as much as the product. The aim is to connect the evolutionary systems to performance assessment applications, used as prioritised fitness functions, so as to produce design solutions that respond to their environmental and functional requirements. This integrated, interdisciplinary approach to design will produce solutions through a design process that considers and balances the requirements of all aspects of the design. Since this thesis covers a wide area of research material, a 'methodological pluralism' approach was used, incorporating both prescriptive and descriptive research methods. Multiple models of research were combined, and the overall research was undertaken in three main stages: conceptualisation, development and evaluation. The first two stages lay the foundations for the specification of the proposed system, in which key aspects of the system not previously proven in the literature were implemented to test its feasibility. By combining existing knowledge in the area with the newly verified key aspects of the proposed system, this research can form the basis for a future software development project. The evaluation stage, which includes building a prototype to test and evaluate system performance against the criteria defined in the earlier stage, is not within the scope of this thesis. The research results in a conceptual design method and a proposed system software architecture. The proposed system is called the 'Hierarchical Evolutionary Algorithmic Design (HEAD) System'. The HEAD system has been shown to be feasible through an initial illustrative paper-based simulation. It consists of two main components: the 'Design Schema' and the 'Synthesis Algorithms'. The HEAD system reflects the major research contribution in the way it is conceptualised, while secondary contributions are achieved within the system components.
The design schema provides constraints on the generation of designs, enabling the designer to create a wide range of potential designs that can then be analysed for desirable characteristics. The design schema supports the digital representation of the designer's creativity within a dynamic design framework that can be encoded and then executed through evolutionary genetic algorithms. It incorporates 2D and 3D geometry and graph theory for space layout planning and building formation, using the Lowest Common Design Denominator (LCDD) of a parameterised 2D module and a 3D structural module. This provides a bridge between standard adjacency requirements and the evolutionary system. The use of graphs as an input to the evolutionary algorithm supports the introduction of constraints in a way that standard evolutionary techniques do not, and the process of design synthesis is guided by a higher-level description of the building that supports geometrical constraints. The Synthesis Algorithms component analyses designs at four levels: 'Room', 'Layout', 'Building' and 'Optimisation'. At each level, multiple fitness functions are embedded in the genetic algorithm to target the specific requirements of the relevant decomposed part of the design problem. Decomposing the design problem so that the requirements of each level are dealt with separately, and then reassembling the parts in a bottom-up approach, reduces the generation of non-viable solutions by constraining the options available at the next higher level. The iterative approach of exploring the range of design solutions through modification of the design schema, as understanding of the design problem improves, assists in identifying conflicts in the design requirements. Additionally, the hierarchical set-up allows multiple fitness functions, each relevant to a specific level, to be embedded in the genetic algorithm, supporting an integrated multi-level, multi-disciplinary approach. The HEAD system promotes a collaborative relationship between human creativity and computer capability. The design schema component, as the input to the procedural algorithms, enables the encoding of certain aspects of the designer's subjective creativity, and by focusing on solutions to the relevant sub-problems at the appropriate levels of detail, the hierarchical nature of the system assists in the design decision-making process.
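A heavily simplified sketch of the hierarchical idea follows: a toy genetic algorithm is run level by level ('Room', then 'Layout'), with a level-specific fitness function at each stage and only viable survivors of one level carried up to the next, pruning non-viable options early. The genome encodings and fitness targets are invented for illustration and are not the thesis's actual design schema.

```python
# A minimal sketch of a level-by-level evolutionary search with
# level-specific fitness functions (all values are toy assumptions).
import random

def evolve(population, fitness, generations=50, keep=10):
    """Generic mutate-evaluate-truncate loop; lower fitness is better."""
    for _ in range(generations):
        children = [mutate(p) for p in population]
        population = sorted(population + children, key=fitness)[:keep]
    return population

def mutate(genome):
    g = list(genome)
    g[random.randrange(len(g))] += random.uniform(-0.5, 0.5)
    return g

# Level 1 ('Room'): evolve room dimensions toward a 20 m2 target area.
room_fitness = lambda g: abs(g[0] * g[1] - 20.0)
rooms = evolve([[random.uniform(2, 8), random.uniform(2, 8)]
                for _ in range(10)], room_fitness)

# Level 2 ('Layout'): only genomes viable at level 1 are carried upward,
# constraining the options the layout-level fitness has to consider.
layout_fitness = lambda g: abs(g[0] / max(g[1], 0.1) - 1.5)  # aspect target
viable = [r for r in rooms if room_fitness(r) < 1.0] or rooms
layouts = evolve(viable, layout_fitness)
print("best layout candidate:", [round(v, 2) for v in layouts[0]])
```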
Abstract:
In microscopic traffic simulators, the interaction between vehicles is considered; the dynamics of the system then become an emergent property of the interaction between its components. Such interactions include lane changing, car-following behaviour and intersection management. Although such simulators produce realistic predictions in some cases, they do not account for an important aspect of the dynamics: the driver-vehicle interaction. This paper introduces a physically sound driver-vehicle model for realistic microscopic simulation. By building a nanoscopic traffic simulation model that uses steering angle and throttle position as parameters, the model aims to overcome the unrealistic acceleration and deceleration values found in various microscopic simulation tools. A physics engine calculates the driving force of the vehicle, and the preliminary results presented here show that, through a realistic driver-vehicle-environment simulator, it becomes possible to model realistic driver and vehicle behaviour in a traffic simulation.
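To make the throttle-and-steering formulation concrete, here is a minimal sketch assuming a kinematic bicycle model with an invented force balance: throttle maps to a bounded driving force, so acceleration emerges from physics rather than being set directly, which is the core idea the abstract describes. All parameter values are illustrative assumptions.

```python
# A minimal sketch of a nanoscopic vehicle model driven by throttle
# position and steering angle (kinematic bicycle; toy parameters).
import math

class Vehicle:
    def __init__(self, mass=1200.0, wheelbase=2.7, max_force=6000.0):
        self.m, self.L, self.f_max = mass, wheelbase, max_force
        self.x = self.y = self.heading = self.speed = 0.0

    def step(self, throttle, steering, dt=0.05):
        """throttle in [0, 1]; steering angle in radians."""
        drive = throttle * self.f_max                 # engine driving force
        drag = 0.4 * self.speed ** 2 + 150.0          # aero + rolling resistance
        accel = (drive - drag) / self.m               # physically bounded
        self.speed = max(0.0, self.speed + accel * dt)
        self.heading += self.speed / self.L * math.tan(steering) * dt
        self.x += self.speed * math.cos(self.heading) * dt
        self.y += self.speed * math.sin(self.heading) * dt

v = Vehicle()
for _ in range(200):      # 10 s at half throttle with a slight left turn
    v.step(throttle=0.5, steering=0.02)
print(f"speed: {v.speed:.1f} m/s, position: ({v.x:.1f}, {v.y:.1f})")
```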
Abstract:
A new decision-making tool to assist designers in the selection of appropriate daylighting solutions for buildings in tropical locations has previously been proposed by the authors. Through an evaluation matrix that prioritises the parameters that best respond to the needs of tropical climates (e.g. reducing solar gain and protecting from glare), the tool determines the most appropriate devices for specific climate and building inputs. The tool is effective in demonstrating the broad benefits and limitations of different daylighting strategies for buildings in the tropics; however, for thorough analysis and calibration, validation is necessary. This paper presents a first step in that validation process. RADIANCE simulations were conducted to compare simulated performance with the performance predicted by the tool. To this end, an office building case study in subtropical Brisbane, Australia, and five daylighting devices, including openings, light-guiding systems and light transport systems, were simulated. Illuminance, light uniformity, daylight penetration and glare were assessed for each device. The results indicate that the tool can appropriately rank and recommend daylighting strategies based on specific building inputs for tropical and subtropical regions, making it a useful resource for designers.
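Two of the metrics assessed per device are simple enough to show directly. The sketch below computes light uniformity (minimum over mean illuminance across a workplane grid) and a daylight penetration depth (distance from the façade at which the row-average illuminance falls below a threshold). The grid values, grid spacing, and the 300 lx threshold are assumptions for illustration, not the paper's simulation outputs.

```python
# A minimal sketch of two daylight metrics over a workplane grid
# (toy illuminance field; rows ordered from the façade inward).
import numpy as np

def uniformity(illuminance: np.ndarray) -> float:
    """Ratio of minimum to mean illuminance over the grid."""
    return float(illuminance.min() / illuminance.mean())

def penetration_depth(illuminance: np.ndarray, spacing_m: float,
                      threshold_lx: float = 300.0) -> float:
    """Depth (m) from the window wall while row-average >= threshold."""
    row_means = illuminance.mean(axis=1)
    below = np.nonzero(row_means < threshold_lx)[0]
    rows_ok = int(below[0]) if below.size else len(row_means)
    return rows_ok * spacing_m

# Toy grid: illuminance decaying with distance from the window wall.
grid = np.outer(np.linspace(2000, 100, 10), np.ones(6))
print(f"uniformity: {uniformity(grid):.2f}")
print(f"penetration: {penetration_depth(grid, spacing_m=0.5):.1f} m")
```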
Abstract:
The effect of resource management on the building design process directly influences the development cycle time and the success of construction projects. This paper presents the information constraint net (ICN) to represent the complex information constraint relations among design activities in the building design process. An algorithm is developed to transform the information constraints of the ICN into a Petri net model, and a resource management model is built on the ICN to simulate and optimize resource allocation in the design process. An example of detailed structural design, analysed by simulation on the CPN Tools platform, is provided to justify the proposed model. The results demonstrate that the proposed approach delivers the resource management and optimization needed to shorten the development cycle and allocate resources optimally.
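To illustrate the Petri-net view of a design process, the toy sketch below treats design activities as transitions and information constraints and shared resources (here, a single engineer) as places; a transition fires only when all its input places hold tokens. This is a generic illustration of the formalism, not the paper's ICN transformation algorithm or its CPN Tools model.

```python
# A minimal sketch of a Petri net for sequenced design activities
# sharing one resource (all names and tokens are toy assumptions).

places = {"brief_ready": 1, "engineer": 1, "loads_done": 0, "members_done": 0}

transitions = {
    # name: (input places consumed, output places produced)
    "load_analysis": (["brief_ready", "engineer"], ["loads_done", "engineer"]),
    "member_design": (["loads_done", "engineer"], ["members_done", "engineer"]),
}

def enabled(name):
    return all(places[p] > 0 for p in transitions[name][0])

def fire(name):
    inputs, outputs = transitions[name]
    for p in inputs:
        places[p] -= 1
    for p in outputs:
        places[p] += 1

# Fire enabled transitions until the net deadlocks (nothing enabled).
while any(enabled(t) for t in transitions):
    name = next(t for t in transitions if enabled(t))
    fire(name)
    print(f"fired {name}: {places}")
```

The resource place ("engineer") is both consumed and returned by each activity, which is how a Petri net expresses that two activities contend for the same person even when their information constraints would otherwise allow them to run in parallel.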
Abstract:
Traditional shading design principles guide the vertical and horizontal orientation of fins, louvres and awnings applied to orthogonal planar façades. Because doubly curved envelopes characterise many contemporary designs, these rules of thumb are no longer always applicable. Operable blinds attempt to regulate the fluctuating luminance of daylight and aid in shading direct sunlight, yet mostly they remain closed: workers are commonly too preoccupied to adjust them continually, so reliance on electrically powered lighting remains the preference. To remedy these problems, the idea of what it is to sustainably enclose space is reconsidered through the geometric and kinetic optimisation of a parametric skin, with sunlight-responsive modules that regulate interior light levels. This research concludes with an optimised design and also defines some unique metrics to gauge the design's performance in terms of the amount of unobstructed exterior view, the ability to shade direct sunlight, and the daylight glare probability.
A hybrid simulation framework to assess the impact of renewable generators on a distribution network
Abstract:
With an increasing number of small-scale renewable generator installations, distribution network planners face new technical challenges (intermittent load flows, network imbalances…). At the same time, these decentralized generators (DGs) present opportunities for savings on network infrastructure if installed at strategic locations. How can both of these aspects be considered when building decision tools for planning future distribution networks? This paper presents a simulation framework that combines two modeling techniques: agent-based modeling (ABM) and particle swarm optimization (PSO). ABM is used to represent the different system units of the network accurately and dynamically, simulating over short time periods; PSO is then used to find the most economical configuration of DGs over longer periods of time. The infrastructure of the framework is introduced, presenting the two modeling techniques and their integration. A case study of Townsville, Australia, is then used to illustrate the platform implementation and the outputs of a simulation.
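The two-layer structure can be sketched compactly: a particle swarm searches over DG sizing decisions, and each candidate configuration is scored by a placeholder standing in for the agent-based network simulation. The cost function, demand figures, and bounds below are toy assumptions, not the paper's Townsville model.

```python
# A minimal sketch of PSO over DG configurations, with a stand-in for
# the ABM evaluation (all numbers are illustrative assumptions).
import random

N_SITES = 5                               # candidate DG locations

def abm_network_cost(dg_sizes):
    """Stand-in for the agent-based simulation: cost combining
    infrastructure spend and unserved demand over the period."""
    demand = [10, 8, 12, 6, 9]            # toy feeder demands (kW)
    unserved = sum(max(0, d - s) for d, s in zip(demand, dg_sizes))
    capex = 0.5 * sum(dg_sizes)
    return unserved + capex

def pso(fitness, dim=N_SITES, n=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    xs = [[random.uniform(0, 15) for _ in range(dim)] for _ in range(n)]
    vs = [[0.0] * dim for _ in range(n)]
    pbest = [x[:] for x in xs]
    gbest = min(pbest, key=fitness)
    for _ in range(iters):
        for i, x in enumerate(xs):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vs[i][d] = (w * vs[i][d] + c1 * r1 * (pbest[i][d] - x[d])
                            + c2 * r2 * (gbest[d] - x[d]))
                x[d] = min(15.0, max(0.0, x[d] + vs[i][d]))
            if fitness(x) < fitness(pbest[i]):
                pbest[i] = x[:]
        gbest = min(pbest + [gbest], key=fitness)
    return gbest

best = pso(abm_network_cost)
print("best DG sizing (kW):", [round(s, 1) for s in best],
      "| cost:", round(abm_network_cost(best), 2))
```

In the full framework the fitness call would launch an ABM run over a short time window, which is why the optimizer layer sits above the simulator rather than being embedded in it.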
Abstract:
For the evaluation, design, and planning of traffic facilities and measures, traffic simulation packages are the de facto tools for consultants, policy makers, and researchers. However, the available commercial simulation packages do not always offer the desired workflow and flexibility for academic research. In many cases, researchers resort to designing and building their own dedicated models, without an intrinsic incentive (or the practical means) to make the results available in the public domain. To make matters worse, a substantial part of these efforts goes into rebuilding basic functionality and, in many respects, reinventing the wheel. This problem not only affects the research community but adversely affects the entire traffic simulation community and frustrates the development of traffic simulation in general. To address this problem, this paper describes an open-source approach, OpenTraffic, which is being developed as a collaborative effort between the Queensland University of Technology, Australia; the National Institute of Informatics, Tokyo; and the Technical University of Delft, the Netherlands. The OpenTraffic simulation framework enables academics from different geographic areas and disciplines within the traffic domain to work together and contribute to a specific topic of interest, ranging from travel choice behaviour to car following, and from response to intelligent transportation systems to activity planning. The modular approach enables users of the software to focus on their area of interest, while other functional modules can be regarded as black boxes. Specific attention is paid to the standardization of data inputs and outputs for traffic simulations; such standardization will allow the sharing of data with many existing commercial simulation packages.
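The modular idea can be illustrated with a small interface sketch: each behavioural model sits behind a narrow interface, so a researcher can swap in their own car-following law while treating the rest of the framework as a black box. The interface and parameter values below are illustrative assumptions, not OpenTraffic's actual API; the example module is the standard Intelligent Driver Model (IDM).

```python
# A minimal sketch of a pluggable car-following module behind a small
# interface (hypothetical interface, standard IDM as the example module).
from typing import Protocol
import math

class CarFollowingModel(Protocol):
    def acceleration(self, v: float, gap: float, dv: float) -> float: ...

class IDM:
    """Intelligent Driver Model as one interchangeable module."""
    def __init__(self, v0=30.0, T=1.5, a=1.0, b=2.0, s0=2.0):
        self.v0, self.T, self.a, self.b, self.s0 = v0, T, a, b, s0

    def acceleration(self, v, gap, dv):
        # Desired dynamic gap, then the IDM acceleration law.
        s_star = self.s0 + v * self.T + v * dv / (2 * math.sqrt(self.a * self.b))
        return self.a * (1 - (v / self.v0) ** 4 - (s_star / gap) ** 2)

def step_follower(model: CarFollowingModel, v, gap, dv, dt=0.5):
    """Framework-side update: depends only on the module interface."""
    return max(0.0, v + model.acceleration(v, gap, dv) * dt)

v = 20.0
v = step_follower(IDM(), v, gap=30.0, dv=5.0)  # closing on a slower leader
print(f"updated speed: {v:.2f} m/s")
```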
Abstract:
Good daylighting design in buildings not only provides a comfortable luminous environment, but also delivers energy savings and comfortable, healthy environments for building occupants. Yet there is still no consensus on how to assess what constitutes good daylighting design. Among current building performance guidelines, daylight factors (DF) or minimum illuminance values are the standard; however, previous research has shown the shortcomings of these metrics. New computer software for daylighting analysis offers more advanced, climate-based daylight metrics (CBDM). Yet these new metrics and simulation tools are not well understood by architects and are not used within architectural firms in Australia. A survey of architectural firms in Brisbane identified the tools most used by industry. The purpose of this paper is to assess and compare these computer simulation tools and the new tools available to architects and designers for daylighting. The tools are assessed in terms of their ease of use (e.g. previous knowledge required, complexity of geometry input), efficiency (e.g. speed, render capabilities) and outcomes (e.g. presentation of results). The study shows that the tools most accessible to architects are those that import a wide variety of file types or can be integrated into current 3D modelling software, and that such tools need to support both point-in-time simulations and annual analyses. There is a current need for an open-source program able to read raw data (in the form of spreadsheets) and display it graphically within a 3D medium. Development of plug-in-based software attempts to meet this need through third-party analysis, but some of these packages are heavily reliant on their host program. Programs that allow dynamic daylighting simulation would make it easier to calculate daylighting accurately regardless of the modelling platform the designer uses, while producing more tangible analysis without the need to process raw data.
Abstract:
In this paper, the initial stage of film assembly by energetic C36 fullerenes on the diamond (001)–(2 × 1) surface at low temperature was investigated by molecular dynamics simulation using the Brenner potential. The incident energy was first uniformly distributed within the interval 20–50 eV, known to be the optimum energy range for chemisorption of a single C36 on the diamond (001) surface. More than one hundred C36 cages were impacted one after another onto the diamond surface, with their orientation and impact position relative to the surface selected at random. The growth of the films was found to follow a three-dimensional island mode, with the deposited C36 acting as building blocks. The study of film morphology shows that the film retains the structure of a free C36 cage, consistent with Low Energy Cluster Beam Deposition (LECBD) experiments. The adlayer is composed of many C36 monomers as well as covalently bonded C36 dimers and trimers, quite different from a C20 fullerene-assembled film, where a large polymer-like chain is observed owing to the stronger interaction between C20 cages. In addition, the chemisorption probability of C36 fullerenes decreases with increasing coverage because the interaction between clusters is weaker than that between a cluster and the surface. When the incident energy is increased to 40–65 eV, the chemisorption probability is found to increase, and more dimers and trimers as well as polymer-like C36 are observed on the deposited films. Furthermore, the C36 film shows high thermal stability even when the temperature is raised to 1500 K.
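The deposition protocol itself (sequential impacts with uniformly sampled energy, random orientation, random impact position) is easy to sketch. In the sketch below the sticking test is a crude placeholder whose coverage dependence merely mimics the trend reported above; the actual study integrates the full dynamics with the Brenner potential, which is not attempted here.

```python
# A minimal sketch of the deposition event sampling described above
# (placeholder sticking rule; not a molecular dynamics integrator).
import math
import random

def random_orientation():
    """Uniformly random orientation as Euler-like angles (radians)."""
    return (random.uniform(0, 2 * math.pi),
            math.acos(random.uniform(-1, 1)),
            random.uniform(0, 2 * math.pi))

def chemisorbs(energy_ev, coverage):
    """Placeholder: sticking probability decays with coverage, mimicking
    the weaker cluster-cluster vs cluster-surface interaction."""
    return random.random() < max(0.1, 0.9 - 0.5 * coverage)

N_CAGES, CELL = 120, 40.0                 # cages fired; surface cell (angstrom)
adsorbed = 0
for _ in range(N_CAGES):
    energy = random.uniform(20.0, 50.0)   # optimum chemisorption window (eV)
    site = (random.uniform(0, CELL), random.uniform(0, CELL))
    orient = random_orientation()
    if chemisorbs(energy, adsorbed / N_CAGES):
        adsorbed += 1
print(f"chemisorbed {adsorbed}/{N_CAGES} C36 cages")
```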
Abstract:
Recent road safety statistics show that the decades-long decreasing trend in fatalities is stopping and stagnating. Statistics further show that crashes are mostly driven by human error, compared with other factors such as environmental conditions and mechanical defects. Within human error, the dominant source is perceptive error, representing about 50% of the total; the next two sources, interpretation and evaluation, together with perception account for more than 75% of human-error-related crashes. These statistics show that helping drivers perceive and understand their environment better, or supplementing them when they are clearly at fault, is a route to good assessment of road risk and, as a consequence, to further decreasing fatalities. To address this problem, currently deployed driving assistance systems combine more and more information from diverse sources (sensors) to enhance the driver's perception of the environment. However, because of inherent limitations in range and field of view, these systems' perception remains largely limited to a small zone of interest around a single vehicle. Such limitations can be overcome by enlarging the zone of interest through a cooperative process. Cooperative Systems (CS), a specific subset of Intelligent Transportation Systems (ITS), aim to compensate for the limitations of local systems by combining embedded information technology and inter-vehicular communication technology (IVC). With CS, information sources are no longer limited to a single vehicle. From this distribution arises the concept of extended, or augmented, perception. Augmented perception extends an actor's perceptive horizon beyond its "natural" limits by fusing information not only from multiple in-vehicle sensors but also from remote sensors. The end result of an augmented perception and data fusion chain is known as an augmented map: a repository where any relevant information about objects in the environment, and the environment itself, can be stored in a layered architecture. This thesis aims to demonstrate that augmented perception performs better than non-cooperative approaches and that it can be used to successfully identify road risk. It was first necessary to evaluate the performance of augmented perception in order to understand its limitations: while many promising results have been obtained, the feasibility of building an augmented map from exchanged local perception information, and then using it beneficially for road users, has not been thoroughly assessed, and neither have the underlying technologies. Most notably, many questions remain unanswered as to IVC performance and its ability to deliver the quality of service required to support life-critical safety systems. This is especially true as the road environment is a complex, highly variable setting in which many sources of imperfection and error exist, not limited to IVC. We first provide a discussion of these limitations and a performance model built to incorporate them, created from empirical data collected on test tracks. Our results are more pessimistic than the existing literature, suggesting that IVC limitations have been underestimated. We then develop a new simulation architecture for CS applications.
This architecture is used to obtain new results on the safety benefits of a cooperative safety application (EEBL) and then to support further study of augmented perception. We first confirm earlier results in terms of the decrease in crash numbers, but raise doubts about the benefits in terms of crash severity. We then implement an augmented perception architecture tasked with creating an augmented map. Our approach aims to provide a generalist architecture that can use many different types of sensors to create the map and is not limited to any specific application. The data association problem is tackled with a multiple hypothesis tracking (MHT) approach based on Belief Theory. Augmented and single-vehicle perception are then compared in a reference driving scenario for risk assessment, taking into account the IVC limitations obtained earlier, and we show their impact on the augmented map's performance. Our results show that augmented perception performs better than non-cooperative approaches, almost tripling the advance warning time before a crash. IVC limitations appear to have no significant effect on this performance, although this finding might be valid only for our specific scenario. Finally, we propose a new approach that uses augmented perception to identify road risk through a surrogate: near-miss events. A CS-based approach is designed and validated to detect near-miss events and then compared with a non-cooperative approach based on vehicles equipped with local sensors only. The cooperative approach shows a significant improvement in the number of events that can be detected, especially at higher rates of system deployment.
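The layered augmented-map repository described above can be sketched as a small data structure: observations from the ego vehicle's own sensors and from remote vehicles (received over IVC) are stored in named layers and queried together, which is what extends the perceptive horizon. The schema below is an illustrative assumption, not the thesis's actual fusion architecture (which handles data association with MHT and Belief Theory).

```python
# A minimal sketch of a layered augmented map (hypothetical schema).
from dataclasses import dataclass, field

@dataclass
class Observation:
    obj_id: str
    x: float           # position along the road (m)
    speed: float       # m/s
    source: str        # "local" or a remote vehicle id
    confidence: float  # fused belief in the track, 0..1

@dataclass
class AugmentedMap:
    layers: dict = field(default_factory=lambda: {"local": [], "remote": []})

    def insert(self, obs: Observation):
        layer = "local" if obs.source == "local" else "remote"
        self.layers[layer].append(obs)

    def hazards_ahead(self, ego_x: float, horizon_m: float, min_conf=0.5):
        """Query both layers: remote data extends range beyond local sensors."""
        return [o for layer in self.layers.values() for o in layer
                if ego_x < o.x <= ego_x + horizon_m and o.confidence >= min_conf]

amap = AugmentedMap()
amap.insert(Observation("car-12", x=60.0, speed=0.0, source="local", confidence=0.9))
amap.insert(Observation("car-47", x=420.0, speed=2.0, source="veh-03", confidence=0.7))
print([o.obj_id for o in amap.hazards_ahead(ego_x=0.0, horizon_m=500.0)])
```

Here the stopped vehicle 420 m ahead is visible only through the remote layer, illustrating how the map supports earlier warnings (and, by extension, near-miss detection) than local sensing alone.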