163 results for code source


Relevance: 20.00%

Abstract:

Space heating accounts for a large portion of the world's carbon dioxide emissions. Ground Source Heat Pumps (GSHPs) are a technology which can reduce carbon emissions from heating and cooling. GSHP system performance is, however, highly sensitive to deviation of the actual annual energy extraction/rejection rates from/to the ground from their design values. To prevent failure and/or performance deterioration of GSHP systems, it is possible to incorporate a safety factor in the design by over-sizing the ground heat exchanger (GHE). A methodology to evaluate the financial risk involved in over-sizing the GHE is proposed in this paper. A probability-based approach is used to evaluate the economic feasibility of a hypothetical full-size GSHP system as compared to four alternative Heating, Ventilation and Air Conditioning (HVAC) system configurations. The model of the GSHP system is developed in the TRNSYS energy simulation platform and calibrated with data from an actual hybrid GSHP system installed in the Department of Earth Science, University of Oxford, UK. Results of the analysis show that potential savings from a full-size GSHP system largely depend on projected HVAC system efficiencies and gas and electricity prices. Results of the risk analysis also suggest that a full-size GSHP with auxiliary back-up is potentially the most economical system configuration. © 2012 Elsevier Ltd.
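The probability-based feasibility comparison described in this abstract can be illustrated with a minimal Monte Carlo sketch. Everything below is assumed for illustration: the price distributions, capital costs, COP values, heating demand and discount rate are placeholders, not figures from the paper.

```python
import random

def npv_heating_cost(annual_energy_kwh, price_per_kwh, cop, years=20, discount=0.035):
    """Discounted lifetime cost of delivering a fixed annual heating demand."""
    annual_cost = annual_energy_kwh / cop * price_per_kwh
    return sum(annual_cost / (1 + discount) ** t for t in range(1, years + 1))

def simulate(n=10_000, demand_kwh=50_000):
    """Monte Carlo comparison of a GSHP vs. a gas boiler under price uncertainty.
    All parameter values are illustrative assumptions."""
    random.seed(1)
    gshp_wins = 0
    for _ in range(n):
        elec_price = random.gauss(0.15, 0.03)  # GBP/kWh, assumed distribution
        gas_price = random.gauss(0.05, 0.01)   # GBP/kWh, assumed distribution
        gshp_cost = 12_000 + npv_heating_cost(demand_kwh, elec_price, cop=3.8)
        boiler_cost = 3_000 + npv_heating_cost(demand_kwh, gas_price, cop=0.90)
        gshp_wins += gshp_cost < boiler_cost
    return gshp_wins / n  # probability the GSHP is the cheaper option

print(f"P(GSHP cheaper) = {simulate():.2f}")
```

The output is a probability of the GSHP being the cheaper option over its lifetime, which is the kind of risk metric the abstract's analysis produces for each candidate system configuration.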

Relevance: 20.00%

Abstract:

We examine the role of heat source geometry in determining rates of airflow and thermal stratification in natural displacement ventilation flows. We modify existing models to account for heat sources of finite (non-zero) area, such as that formed by a sun patch warming the floor of a room. Our model allows for predictions of the steady stratification and ventilation flow rates that develop in a room due to a circular heat source at floor level. We compare our theoretical predictions with predictions for the limiting cases of a point source of heat (yielding a stratified interior) and a uniformly heated floor (yielding a mixed interior). As the ratio of the heat source radius to the room height increases, our theory shows a smooth transition between these two limits, which themselves represent extremes of ventilation. Our model for the transition from displacement to mixing ventilation is compared to previous work and demonstrates that the transition can occur for smaller sources than previously thought, particularly for rooms with large floor area compared to ceiling height. © 2009 Elsevier Ltd.
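For the point-source limit mentioned above, the steady interface height follows the classic relation of Linden, Lane-Serff & Smeed (1990), which finite-area models such as this one generalise. A minimal sketch, assuming the standard formulation with entrainment coefficient alpha ≈ 0.083 and A* the effective vent area:

```python
from math import pi

def interface_height(A_star_over_H2, alpha=0.083, tol=1e-10):
    """Dimensionless interface height xi = h/H for point-source displacement
    ventilation, from the classic relation
        A*/H^2 = C^(3/2) * xi^(5/2) / (1 - xi)^(1/2),
    with C = (6*alpha/5) * (9*alpha/10)**(1/3) * pi**(2/3).
    The left side is monotonic in xi, so bisection on (0, 1) suffices."""
    C = (6 * alpha / 5) * (9 * alpha / 10) ** (1 / 3) * pi ** (2 / 3)

    def resid(xi):
        return C ** 1.5 * xi ** 2.5 / (1.0 - xi) ** 0.5 - A_star_over_H2

    lo, hi = 1e-12, 1.0 - 1e-12
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if resid(mid) < 0.0 else (lo, mid)
    return 0.5 * (lo + hi)

# a larger effective vent area raises the interface
print(interface_height(0.005), interface_height(0.05))
```

Note that the interface height depends only on geometry (vent area and room height), not on the source strength, which is one of the distinctive features of the point-source limit.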

Relevance: 20.00%

Abstract:

We compare natural ventilation flows established by a range of heat source distributions at floor level. Both evenly distributed and highly localised line and point source distributions are considered. We demonstrate that modelling the ventilation flow driven by a uniformly distributed heat source is equivalent to the flow driven by a large number of localised sources. A model is developed for the transient flow development in a room with a uniform heat distribution and is compared with existing models for localised buoyancy inputs. For large vent areas, the flow driven by localised heat sources reaches a steady state more rapidly than in the uniformly distributed case. For small vent areas, there is little difference in the transient development times. Our transient model is then extended to consider the time taken to flush a neutrally buoyant pollutant from a naturally ventilated room. Again, comparisons are drawn between uniform and localised (point and line) heat source geometries. It is demonstrated that for large vent areas a uniform heat distribution provides the fastest flushing. However, for smaller vent areas, localised heat sources produce the fastest flushing. These results are used to suggest a definition for the term 'natural ventilation efficiency', and a model is developed to estimate this efficiency as a function of the room and heat source geometries. © 2006 Elsevier Ltd. All rights reserved.
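As a baseline for the flushing times compared above, the simplest well-mixed limit gives exponential decay of pollutant concentration. This sketch is not the paper's stratified model, just the standard single-zone reference against which such models are usually compared; the room volume and flow rate are arbitrary example values.

```python
from math import log

def flush_time(volume_m3, flow_m3_s, residual=0.05):
    """Time for a well-mixed, neutrally buoyant pollutant to decay to
    `residual` of its initial concentration, from c(t) = c0 * exp(-Q*t/V)."""
    return (volume_m3 / flow_m3_s) * log(1.0 / residual)

# e.g. a 150 m^3 room ventilated at 0.05 m^3/s, flushed to 5% residual
print(f"{flush_time(150, 0.05) / 3600:.1f} h")
```

A 'ventilation efficiency' in the spirit of the abstract can then be read as the ratio of this idealised flushing time to the time actually achieved by a given room and heat source geometry.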

Relevance: 20.00%

Abstract:

An object in the peripheral visual field is more difficult to recognize when surrounded by other objects. This phenomenon is called "crowding". Crowding places a fundamental constraint on human vision that limits performance on numerous tasks. It has been suggested that crowding results from spatial feature integration necessary for object recognition. However, in the absence of convincing models, this theory has remained controversial. Here, we present a quantitative and physiologically plausible model for spatial integration of orientation signals, based on the principles of population coding. Using simulations, we demonstrate that this model coherently accounts for fundamental properties of crowding, including critical spacing, "compulsory averaging", and a foveal-peripheral anisotropy. Moreover, we show that the model predicts increased responses to correlated visual stimuli. Altogether, these results suggest that crowding has little immediate bearing on object recognition but is a by-product of a general, elementary integration mechanism in early vision aimed at improving signal quality.
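The pooling-then-decoding idea behind this model can be sketched with a toy population code over orientation. The tuning width, channel spacing and stimulus values below are arbitrary choices for illustration; the point is only that reading out a population vector after spatial pooling yields "compulsory averaging" of target and flanker orientations.

```python
import math

def channel_response(stimulus_deg, pref_deg, kappa=2.0):
    """Circular-Gaussian tuning over orientation (period 180 degrees);
    the factor of 2 maps orientation onto a full circle."""
    d = math.radians(2 * (stimulus_deg - pref_deg))
    return math.exp(kappa * (math.cos(d) - 1))

def decode(stimuli_deg, prefs=range(0, 180, 5)):
    """Pool each channel's responses to all stimuli in the integration field
    (spatial integration), then read out orientation via the population vector."""
    x = y = 0.0
    for p in prefs:
        r = sum(channel_response(s, p) for s in stimuli_deg)  # pooled response
        x += r * math.cos(math.radians(2 * p))
        y += r * math.sin(math.radians(2 * p))
    return math.degrees(math.atan2(y, x)) / 2 % 180

# a lone 20-degree target decodes veridically; flanked by 40-degree
# distractors, the readout is pulled toward the ensemble mean
print(decode([20]), decode([20, 40, 40]))
```

Because the population vector of a pooled response is the vector sum of the per-stimulus responses, the decoded orientation for target-plus-flankers lands between the individual orientations, which is the averaging behaviour reported in crowding experiments.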

Relevance: 20.00%

Abstract:

This paper investigates 'future-proofing' as an unexplored yet all-important aspect of the design of low-energy dwellings. It refers particularly to adopting lifecycle thinking and accommodating risks and uncertainties in the selection of fabric energy efficiency measures and low or zero-carbon technologies. Based on a conceptual framework for future-proofed design, the paper first presents results from the analysis of two 'best practice' housing developments in England: North West Cambridge in Cambridge, and West Carclaze and Baal in St. Austell, Cornwall. Second, it examines the 'Energy and CO2 Emissions' part of the Code for Sustainable Homes to reveal which design criteria and assessment methods can be practically integrated into this established building certification scheme so that it can become more dynamic and future-oriented.

Practical application: Future-proofed construction is promoted implicitly within the increasingly stringent building regulations; however, there is no comprehensive method to readily incorporate futures thinking into the energy design of buildings. This study has a three-fold objective of relevance to the building industry:

- Illuminating the two key categories of long-term impacts in buildings, which are often erroneously treated interchangeably: the environmental impact of buildings due to their long lifecycles, and the environment's impacts on buildings due to risks and uncertainties affecting energy consumption by at least 2050. The latter refers to social, technological, economic, environmental and regulatory (predictable or unknown) trends and drivers of change, such as climate uncertainty, home-working and technology readiness.
- Encouraging future-proofing from an early planning stage to reduce the likelihood of a prematurely obsolete building design.
- Enhancing established building energy assessment methods (certification, modelling or audit tools) by integrating a set of future-oriented criteria into their methodologies.

© 2012 The Chartered Institution of Building Services Engineers.

Relevance: 20.00%

Abstract:

Although it is widely believed that reinforcement learning is a suitable tool for describing behavioral learning, the mechanisms by which it can be implemented in networks of spiking neurons are not fully understood. Here, we show that different learning rules emerge from a policy gradient approach depending on which features of the spike trains are assumed to influence the reward signals, i.e., depending on which neural code is in effect. We use the framework of Williams (1992) to derive learning rules for arbitrary neural codes. For illustration, we present policy-gradient rules for three different example codes - a spike count code, a spike timing code and the most general "full spike train" code - and test them on simple model problems. In addition to classical synaptic learning, we derive learning rules for intrinsic parameters that control the excitability of the neuron. The spike count learning rule has structural similarities with established Bienenstock-Cooper-Munro rules. If the distribution of the relevant spike train features belongs to the natural exponential family, the learning rules have a characteristic shape that raises interesting prediction problems.
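For the spike-count code, the resulting rule has the characteristic "(observed minus expected) times eligibility" shape that falls out of the policy-gradient derivation. A minimal sketch under assumed settings: a single Poisson neuron with an exponential rate function, a toy two-pattern task and binary reward, none of which comes from the paper beyond the form of the rule itself.

```python
import math
import random

def sample_poisson(lam, rng):
    """Knuth's Poisson sampler (adequate for small rates)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while p > L:
        k += 1
        p *= rng.random()
    return k - 1

def train(eta=0.05, trials=4000, seed=0):
    """REINFORCE-style rule for a Poisson spike-count code:
    rate lam = exp(w . x), score function d/dw log P(k) = (k - lam) * x,
    so the update is dw = eta * R * (k - lam) * x (after Williams, 1992)."""
    rng = random.Random(seed)
    w = [0.0, 0.0]
    # toy task: spike for pattern A = (1, 0), stay silent for pattern B = (0, 1)
    patterns = [((1.0, 0.0), True), ((0.0, 1.0), False)]
    for _ in range(trials):
        x, want_spike = patterns[rng.randrange(2)]
        lam = min(5.0, math.exp(w[0] * x[0] + w[1] * x[1]))  # capped for safety
        k = sample_poisson(lam, rng)
        R = 1.0 if (k > 0) == want_spike else -1.0
        for i in (0, 1):
            w[i] += eta * R * (k - lam) * x[i]
    return w

w = train()
print(w)  # the weight for pattern A grows positive, for pattern B negative
```

The `(k - lam)` factor is the "observed minus expected" term: because the Poisson distribution belongs to the natural exponential family, the score function reduces to exactly this difference, which is the characteristic shape the abstract refers to.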

Relevance: 20.00%

Abstract:

BGCore is a software package for comprehensive computer simulation of nuclear reactor systems and their fuel cycles. BGCore interfaces the Monte Carlo particle transport code MCNP4C with SARAF, an independently developed module for calculating in-core fuel composition and spent fuel emissions following discharge. In the BGCore system, the depletion coupling methodology is based on a multi-group approach that significantly reduces computation time and allows a large number of nuclides to be tracked during calculations. In this study, the burnup calculation capabilities of the BGCore system were validated against well-established and verified computer codes for thermal and fast spectrum lattices. Very good agreement in k-eigenvalue and nuclide density predictions was observed for all cases under consideration. In addition, the decay heat prediction capabilities of the BGCore system were benchmarked against the most recent edition of the ANS Standard methodology for UO2 fuel decay power prediction in LWRs. It was found that the difference between the ANS Standard data and that predicted by BGCore does not exceed 5%.
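The core of any burnup module is solving the depletion equations dN/dt = A·N for the nuclide density vector N, where the matrix A collects decay constants and one-group reaction rates. As a self-contained illustration (a bare three-nuclide decay chain with made-up decay constants, not BGCore's actual multi-group machinery), a small matrix exponential reproduces the analytic Bateman solution:

```python
import math

def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def expm(A, t, s=8, terms=25):
    """exp(A*t) by scaling-and-squaring with a Taylor series --
    fine for the small, well-scaled matrix of a short decay chain."""
    n = len(A)
    M = [[A[i][j] * t / 2 ** s for j in range(n)] for i in range(n)]
    R = [[float(i == j) for j in range(n)] for i in range(n)]  # identity
    P = [row[:] for row in R]
    for k in range(1, terms):
        P = [[v / k for v in row] for row in matmul(P, M)]  # accumulates M^k / k!
        R = [[R[i][j] + P[i][j] for j in range(n)] for i in range(n)]
    for _ in range(s):
        R = matmul(R, R)
    return R

def bateman_daughter(N0, lamA, lamB, t):
    """Analytic Bateman solution for the daughter B in the chain A -> B -> C."""
    return N0 * lamA / (lamB - lamA) * (math.exp(-lamA * t) - math.exp(-lamB * t))

# depletion matrix for A -> B -> C (decay constants are arbitrary examples)
lamA, lamB = 0.3, 0.1
A = [[-lamA, 0.0, 0.0],
     [lamA, -lamB, 0.0],
     [0.0, lamB, 0.0]]
N0 = [1.0, 0.0, 0.0]
E = expm(A, 5.0)
N = [sum(E[i][j] * N0[j] for j in range(3)) for i in range(3)]
print(N[1], bateman_daughter(1.0, lamA, lamB, 5.0))  # should agree closely
```

In a real multi-group scheme, the off-diagonal entries of A would also contain flux-weighted transmutation rates collapsed from the Monte Carlo transport solution, which is where the coupling to MCNP4C enters; the solver structure, however, is the same.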