974 results for Mismatched uncertainties
Abstract:
This article reports on the use of building performance simulation to quantify the risks that climate change poses to the thermal performance of buildings, and to their critical functions. Through a number of case studies the article demonstrates that any prediction of probable thermal building performance over the long timeframes inherent in climate change comes with very large uncertainties. The same cases are used to illustrate that assessing the consequences of predicted change is problematic, since the functions that the building provides are themselves often a moving target. The article concludes that quantification of the risks posed by climate change is possible, but only with many restrictions. It also identifies further research needed to enable more effective discussion of risk acceptance and risk abatement for specific buildings. © 2012 Elsevier Ltd.
Abstract:
The diversity of non-domestic buildings at urban scale poses a number of difficulties for developing models for large-scale analysis of the stock. This research proposes a probabilistic, engineering-based, bottom-up model to address these issues. In a recent study we classified London's non-domestic buildings based on the service they provide, such as offices, retail premises, and schools, and proposed the creation of one probabilistic representational model per building type. This paper investigates techniques for the development of such models. The representational model is a statistical surrogate of a dynamic energy simulation (ES) model. We first identify the main parameters affecting energy consumption in a particular building sector/type by using sampling-based global sensitivity analysis methods, and then generate statistical surrogate models of the dynamic ES model within the dominant model parameters. Given a sample of actual energy consumption for that sector, we use the surrogate model to infer the distribution of model parameters by inverse analysis. The inferred distributions of input parameters are able to quantify the relative benefits of alternative energy saving measures on an entire building sector with requisite quantification of uncertainties. Secondary school buildings are used for illustrating the application of this probabilistic method. © 2012 Elsevier B.V. All rights reserved.
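The sampling-based global sensitivity analysis the abstract describes can be illustrated with the pick-and-freeze estimator for first-order Sobol indices. The three-parameter "surrogate" below is an invented linear-quadratic response standing in for the fitted ES surrogate, not the model from the study:

```python
import numpy as np

# First-order Sobol indices by the Saltelli pick-and-freeze estimator.
# The inputs (infiltration, set-point offset, equipment gains) and the
# response surface are illustrative assumptions only.
rng = np.random.default_rng(0)

def surrogate(u):
    infil, setpt, gains = u[:, 0], u[:, 1], u[:, 2]
    return 120.0 + 40.0 * infil + 15.0 * setpt + 5.0 * gains + 10.0 * infil * setpt

n = 100_000
A = rng.random((n, 3))          # two independent sample matrices
B = rng.random((n, 3))
fA, fB = surrogate(A), surrogate(B)
var = np.var(np.concatenate([fA, fB]))

S = []
for i in range(3):
    ABi = A.copy()
    ABi[:, i] = B[:, i]         # replace only the i-th column
    fABi = surrogate(ABi)
    S.append(np.mean(fB * (fABi - fA)) / var)  # first-order index S_i
```

Ranking the indices identifies the dominant parameters, which is what allows the surrogate (and the subsequent inverse analysis) to be restricted to a low-dimensional input space.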
Abstract:
This paper introduces Periodically Controlled Hybrid Automata (PCHA) for describing a class of hybrid control systems. In a PCHA, control actions occur roughly periodically, while internal and input actions may occur in the interim, changing the discrete state or the setpoint. Based on periodicity and subtangential conditions, a new sufficient condition for verifying invariance of PCHAs is presented. This technique is used in verifying safety of the planner-controller subsystem of an autonomous ground vehicle, and in deriving geometric properties of planner-generated paths that can be followed safely by the controller under environmental uncertainties.
Abstract:
This paper reports on research that uses building performance simulation and uncertainty analysis to assess the risks that projected climate change poses to the thermal performance of buildings, and to their critical functions. The work takes meteorological climate change predictions as a starting point, but also takes into account developments and uncertainties in technology, occupancy, intervention and renovation, and others. Four cases are studied in depth to explore the prospects of the quantification of said climate change risks. The research concludes that quantification of the risks posed by climate change is possible, but only with many restrictive assumptions on the input side.
Abstract:
Firms and other organizations use Technology Roadmapping (TRM) extensively as a framework for supporting research and development of future technologies and products that could sustain a competitive advantage. While the importance of technology strategy has received more attention in recent years, few research studies have examined how roadmapping processes are used to explore the potential convergence of products and services that may be developed in the future. The aim of this paper is to introduce an integrated roadmapping process for services, devices and technologies capable of implementing a smart city development R&D project in Korea. The paper applies a QFD (Quality Function Deployment) method to establish interconnections between services and devices, and between devices and technologies. The method is illustrated by a detailed case study, which shows how different types of roadmap can be coordinated with each other to produce a clear representation of the technological changes and uncertainties associated with the strategic planning of complex innovations. © 2012 Elsevier Inc.
Abstract:
This paper investigates 'future-proofing' as an unexplored yet all-important aspect in the design of low-energy dwellings. It refers particularly to adopting lifecycle thinking and accommodating risks and uncertainties in the selection of fabric energy efficiency measures and low or zero-carbon technologies. Based on a conceptual framework for future-proofed design, the paper first presents results from the analysis of two 'best practice' housing developments in England; i.e., North West Cambridge in Cambridge and West Carclaze and Baal in St. Austell, Cornwall. Second, it examines the 'Energy and CO2 Emissions' part of the Code for Sustainable Homes to reveal which design criteria and assessment methods can be practically integrated into this established building certification scheme so that it can become more dynamic and future-oriented.
Practical application: Future-proofed construction is promoted implicitly within the increasingly stringent building regulations; however, there is no comprehensive method to readily incorporate futures thinking into the energy design of buildings. This study has a three-fold objective of relevance to the building industry:
- Illuminating the two key categories of long-term impacts in buildings, which are often erroneously treated interchangeably: the environmental impact of buildings due to their long lifecycles, and the environment's impacts on buildings due to risks and uncertainties affecting energy consumption by at least 2050. The latter refers to social, technological, economic, environmental and regulatory (predictable or unknown) trends and drivers of change, such as climate uncertainty, home-working, technology readiness, etc.
- Encouraging future-proofing from an early planning stage to reduce the likelihood of a prematurely obsolete building design.
- Enhancing established building energy assessment methods (certification, modelling or audit tools) by integrating a set of future-oriented criteria into their methodologies.
© 2012 The Chartered Institution of Building Services Engineers.
Abstract:
This paper presents a review undertaken to understand the concept of 'future-proofing' the energy performance of buildings. The long lifecycles of the building stock, the impacts of climate change and the requirements for low carbon development underline the need for long-term thinking from the early design stages. 'Future-proofing' is an emerging research agenda with currently no widely accepted definition amongst scholars and building professionals. In this paper, it refers to design processes that accommodate explicitly full lifecycle perspectives and energy trends and drivers by at least 2050, when selecting energy efficient measures and low carbon technologies. A knowledge map is introduced, which explores the key axes (or attributes) for achieving a 'future-proofed' energy design; namely, coverage of sustainability issues, lifecycle thinking, and accommodating risks and uncertainties that affect the energy consumption. It is concluded that further research is needed so that established building energy assessment methods are refined to better incorporate future-proofing. The study follows an interdisciplinary approach and is targeted at design teams with aspirations to achieve resilient and flexible low-energy buildings over the long-term. © 2012 Elsevier Ltd.
Abstract:
Urbanisation is the great driving force of the twenty-first century. Cities are associated with both productivity and creativity, and the benefits offered by closely connected and high density living and working contribute to sustainability. At the same time, cities need extensive infrastructure – like water, power, sanitation and transportation systems – to operate effectively. Cities therefore comprise multiple components, forming both static and dynamic systems that are interconnected directly and indirectly on a number of levels, all forming the backdrop for the interaction of people and processes. Bringing together large numbers of people and complex products in rich interactions can lead to vulnerability from hazards, threats and even trends, whether natural hazards, epidemics, political upheaval, demographic changes, economic instability and/or mechanical failures. The key to countering vulnerability is the identification of critical systems and a clear understanding of their interactions and dependencies. Critical systems can be assessed methodically to determine the implications of their failure and their interconnectivities with other systems to identify options. The overriding need is to support resilience – defined here as the degree to which a system or systems can continue to function effectively in a changing environment. Cities need to recognise the significance of devising adaptation strategies and processes to address a multitude of uncertainties relating to climate, economy, growth and demography. In this paper we put forward a framework to support cities in understanding the hazards, threats and trends that can make them vulnerable to unexpected changes and unpredictable shocks. The framework draws on an asset model of the city, in which components that contribute to resilience include social capital, economic assets, manufactured assets, and governance.
The paper reviews the field, and draws together an overarching framework intended to help cities plan a robust trajectory towards increased resilience through flexibility, resourcefulness and responsiveness. It presents some brief case studies demonstrating the applicability of the proposed framework to a wide variety of circumstances.
Abstract:
Soil liquefaction following large earthquakes is a major contributor to damage to infrastructure and economic loss, as borne out by the earthquakes in Japan and New Zealand in 2011. While extensive research has been conducted on soil liquefaction and our understanding of liquefaction has been advancing, several uncertainties remain. In this paper the basic premise that liquefaction is an 'undrained' event will be challenged. Evidence will be offered based on dynamic centrifuge tests to show that rapid settlements occur both in level ground and for shallow foundations. It will also be shown that the definition of liquefaction based on excess pore pressure generation and the subsequent classification of sites as liquefiable and non-liquefiable is not satisfactory, as centrifuge test data shows that both loose and dense sand sites produce significant excess pore pressure. Experimental evidence will be presented that shows that the permeability of sands increases rapidly at very low effective stresses to allow for rapid drainage to take place from liquefied soil. Based on these observations a micro-mechanical view of soil liquefaction that brings together the Critical State view of soil liquefaction and the importance of dynamic loading will be presented. © 2012 Indian Geotechnical Society.
Abstract:
Bioethanol is the world's largest-produced alternative to petroleum-derived transportation fuels due to its compatibility within existing spark-ignition engines and its relatively mature production technology. Despite its success, questions remain over the greenhouse gas (GHG) implications of fuel ethanol use, with many studies showing significant impacts of differences in land use, feedstock, and refinery operation. While most efforts to quantify life-cycle GHG impacts have focused on the production stage, a few recent studies have acknowledged the effect of ethanol on engine performance and incorporated these effects into the fuel life cycle. These studies have broadly asserted that vehicle efficiency increases with ethanol use to justify reducing the GHG impact of ethanol. These results seem to conflict with the general notion that ethanol decreases the fuel efficiency (or increases the fuel consumption) of vehicles due to the lower volumetric energy content of ethanol when compared to gasoline. Here we argue that due to the increased emphasis on alternative fuels with drastically differing energy densities, vehicle efficiency should be evaluated based on energy rather than volume. When done so, we show that efficiency of existing vehicles can be affected by ethanol content, but these impacts can serve to have both positive and negative effects and are highly uncertain (ranging from -15% to +24%). As a result, uncertainties in the net GHG effect of ethanol, particularly when used in a low-level blend with gasoline, are considerably larger than previously estimated (standard deviations increase by >10% and >200% when used in high and low blends, respectively). Technical options exist to improve vehicle efficiency through smarter use of ethanol, though changes to the vehicle fleets and fuel infrastructure would be required. Future biofuel policies should promote synergies between the vehicle and fuel industries in order to maximize the society-wide benefits or minimize the risks of adverse impacts of ethanol.
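The energy-versus-volume argument reduces to simple arithmetic. The sketch below uses typical literature lower heating values (assumed here, not taken from the paper: gasoline ≈32.0 MJ/L, ethanol ≈21.1 MJ/L) to show why a vehicle covering slightly less distance per litre on E10 can have essentially unchanged energy-based efficiency:

```python
# Volumetric vs. energy-based comparison of an ethanol-gasoline blend.
# LHV figures are typical literature values, assumed for illustration.
LHV_GASOLINE = 32.0   # MJ/L
LHV_ETHANOL = 21.1    # MJ/L

def blend_energy_density(ethanol_vol_frac):
    """Volumetric energy content of the blend, MJ/L."""
    return (1 - ethanol_vol_frac) * LHV_GASOLINE + ethanol_vol_frac * LHV_ETHANOL

def energy_based_efficiency(km_per_litre, ethanol_vol_frac):
    """Distance per unit of fuel energy (km/MJ): insensitive to the
    blend's energy density, unlike km per litre."""
    return km_per_litre / blend_energy_density(ethanol_vol_frac)

# E10 carries ~3.4% less energy per litre than E0, so a ~3.4% drop in
# km per litre leaves the energy-based efficiency nearly unchanged.
e0 = energy_based_efficiency(15.0, 0.0)
e10 = energy_based_efficiency(15.0 * (1 - 0.034), 0.10)
```

On a volumetric basis the E10 vehicle looks less efficient; on an energy basis the two figures nearly coincide, which is the comparison the abstract argues should be standard.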
Abstract:
A one-dimensional model for crevice HC post-flame oxidation is used to calculate and understand the effect of operating parameters and fuel type (propane and isooctane) on the extent of crevice hydrocarbon oxidation and the product distribution in the post-flame environment. The calculations show that the main parameters controlling oxidation are: bulk burned gas temperatures, wall temperatures, turbulent diffusivity, and fuel oxidation rates. Calculated extents of oxidation agree well with experimental values, and the sensitivities to operating conditions (wall temperatures, equivalence ratio, fuel type) are reasonably well captured. Whereas the bulk gas temperatures largely determine the extent of oxidation, the hydrocarbon product distribution is governed less by the burned gas temperatures than by diffusion rates. Uncertainties in both turbulent diffusion rates and in the oxidation mechanisms are an important factor limiting the predictive capabilities of the model. However, it seems well suited to sensitivity calculations about a baseline. Copyright © 1999 Society of Automotive Engineers, Inc.
Abstract:
Taper-free and vertically oriented Ge nanowires were grown on Si (111) substrates by chemical vapor deposition with Au nanoparticle catalysts. To achieve vertical nanowire growth on the highly lattice mismatched Si substrate, a thin Ge buffer layer was first deposited, and to achieve taper-free nanowire growth, a two-temperature process was employed. The two-temperature process consisted of a brief initial base growth step at high temperature followed by prolonged growth at lower temperature. Taper-free and defect-free Ge nanowires grew successfully even at 270 °C, which is 90 °C lower than the bulk eutectic temperature. The yield of vertical and taper-free nanowires is over 90%, comparable to that of vertical but tapered nanowires grown by the conventional one-temperature process. This method is of practical importance and can be reliably used to develop novel nanowire-based devices on relatively cheap Si substrates. Additionally, we observed that the activation energy of Ge nanowire growth by the two-temperature process is dependent on Au nanoparticle size. The low activation energy (∼5 kcal/mol) for 30 and 50 nm diameter Au nanoparticles suggests that the decomposition of gaseous species on the catalytic Au surface is a rate-limiting step. A higher activation energy (∼14 kcal/mol) was determined for 100 nm diameter Au nanoparticles which suggests that larger Au nanoparticles are partially solidified and that growth kinetics become the rate-limiting step. © 2011 American Chemical Society.
Abstract:
A multivariate, robust, rational interpolation method for propagating uncertainties in several dimensions is presented. The algorithm for selecting numerator and denominator polynomial orders is based on recent work that uses a singular value decomposition approach. In this paper we extend this algorithm to higher dimensions and demonstrate its efficacy in terms of convergence and accuracy, both as a method for response surface generation and for interpolation. To obtain stable approximants for continuous functions, we use an L2 error norm indicator to rank optimal numerator and denominator solutions. For discontinuous functions, a second criterion setting an upper limit on the approximant value is employed. Analytical examples demonstrate that, for the same stencil, rational methods can yield more rapid convergence compared to pseudospectral or collocation approaches for certain problems. © 2012 AIAA.
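The SVD-based construction the abstract refers to can be sketched in one dimension: linearise the rational condition p(x_i) ≈ f_i q(x_i) and take the coefficients from the right-singular vector of the smallest singular value. This is an assumed minimal setup, not the authors' multivariate order-selection algorithm:

```python
import numpy as np

def rational_fit(x, f, m, n):
    """Fit f(x) ~ p(x)/q(x) with deg(p)=m, deg(q)=n via linearised
    least squares: the null-space direction of [P | -diag(f) Q]."""
    P = np.vander(x, m + 1, increasing=True)    # columns 1, x, ..., x^m
    Q = np.vander(x, n + 1, increasing=True)    # columns 1, x, ..., x^n
    A = np.hstack([P, -f[:, None] * Q])
    _, _, Vt = np.linalg.svd(A)
    c = Vt[-1]                                  # smallest-singular-value vector
    a, b = c[:m + 1], c[m + 1:]
    return lambda t: np.polyval(a[::-1], t) / np.polyval(b[::-1], t)

# Runge's function is itself rational, so a (2,2) approximant recovers
# it almost exactly where comparable-order polynomials oscillate badly.
x = np.linspace(-1.0, 1.0, 21)
f = 1.0 / (1.0 + 25.0 * x**2)
r = rational_fit(x, f, 2, 2)
err = np.max(np.abs(r(x) - f))
```

Ranking candidate (m, n) pairs by an L2 error indicator, as the abstract describes, then selects among such fits while rejecting approximants with spurious poles.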
Abstract:
Operational uncertainties such as throttle excursions, varying inlet conditions and geometry changes lead to variability in compressor performance. In this work, the main operational uncertainties inherent in a transonic axial compressor are quantified to determine their effect on performance. These uncertainties include the effects of inlet distortion, metal expansion, flow leakages and blade roughness. A 3D, validated RANS model of the compressor is utilized to simulate these uncertainties and quantify their effect on polytropic efficiency and pressure ratio. To propagate them, stochastic collocation and sparse pseudospectral approximations are used. We demonstrate that lower-order approximations are sufficient as these uncertainties are inherently linear. Results for epistemic uncertainties in the form of meshing methodologies are also presented. Finally, the uncertainties considered are ranked in order of their effect on efficiency loss. © 2012 AIAA.
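Stochastic collocation in one uncertain dimension amounts to evaluating the model at quadrature nodes and recovering statistics by weighted sums. The quadratic "efficiency response" below is a stand-in for the RANS model, and the Gaussian roughness perturbation is an assumed input distribution:

```python
import numpy as np

# 1D stochastic collocation: Gauss-Hermite quadrature under xi ~ N(0,1).
# The model is an invented polytropic-efficiency response, not the
# compressor RANS solver.
def model(xi):
    return 0.90 - 0.01 * xi - 0.002 * xi**2

# Probabilists' Hermite nodes/weights; normalise so weights sum to 1.
nodes, weights = np.polynomial.hermite_e.hermegauss(5)
weights = weights / np.sqrt(2.0 * np.pi)

mean = np.sum(weights * model(nodes))
var = np.sum(weights * (model(nodes) - mean) ** 2)
```

Because five nodes integrate polynomials up to degree nine exactly, a low-order rule already captures this nearly linear response, which illustrates why the abstract finds lower-order approximations sufficient; sparse pseudospectral grids extend the same idea to several uncertain inputs at once.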