40 results for Honor Code


Relevance:

20.00%

Abstract:

An object in the peripheral visual field is more difficult to recognize when surrounded by other objects. This phenomenon is called "crowding". Crowding places a fundamental constraint on human vision that limits performance on numerous tasks. It has been suggested that crowding results from spatial feature integration necessary for object recognition. However, in the absence of convincing models, this theory has remained controversial. Here, we present a quantitative and physiologically plausible model for spatial integration of orientation signals, based on the principles of population coding. Using simulations, we demonstrate that this model coherently accounts for fundamental properties of crowding, including critical spacing, "compulsory averaging", and a foveal-peripheral anisotropy. Moreover, we show that the model predicts increased responses to correlated visual stimuli. Altogether, these results suggest that crowding has little immediate bearing on object recognition but is a by-product of a general, elementary integration mechanism in early vision aimed at improving signal quality.
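
As a toy illustration of the integration mechanism described above, the sketch below pools the responses of a population of orientation-tuned units to a target and its flankers and then decodes the pooled activity with a population vector; the decoded orientation lands between the target and flanker values, mimicking compulsory averaging. The von Mises tuning curves, pooling weight, and stimulus orientations are illustrative assumptions, not the authors' exact model.

```python
# Minimal population-coding sketch of spatial integration of orientation
# signals (not the authors' exact model). Tuning width, pooling weight and
# stimulus orientations are illustrative assumptions.
import numpy as np

def population_response(orientation_deg, preferred_deg, kappa=4.0):
    """Von Mises tuning curves over orientation (180-degree periodic)."""
    delta = np.deg2rad(2.0 * (orientation_deg - preferred_deg))  # map 180 deg onto the full circle
    return np.exp(kappa * (np.cos(delta) - 1.0))

def decode_orientation(responses, preferred_deg):
    """Population-vector decoding on the doubled-angle circle."""
    angles = np.deg2rad(2.0 * preferred_deg)
    vec = np.sum(responses * np.exp(1j * angles))
    return (np.rad2deg(np.angle(vec)) / 2.0) % 180.0

preferred = np.arange(0.0, 180.0, 5.0)      # preferred orientations of the population
target, flankers = 10.0, [40.0, 40.0]       # hypothetical target and flanker orientations

# Spatial integration: the pooled response is a weighted sum of the responses
# evoked by the target and by the surrounding flankers.
pooled = population_response(target, preferred)
for f in flankers:
    pooled += 0.8 * population_response(f, preferred)   # 0.8 = assumed pooling weight

print("decoded orientation (deg):", decode_orientation(pooled, preferred))
# The decoded value lies between the target and flanker orientations, i.e. the
# readout reflects an average of the crowded elements rather than the target alone.
```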

Relevance:

20.00%

Abstract:

This paper investigates 'future-proofing' as an unexplored yet all-important aspect in the design of low-energy dwellings. It refers particularly to adopting lifecycle thinking and accommodating risks and uncertainties in the selection of fabric energy efficiency measures and low or zero-carbon technologies. Based on a conceptual framework for future-proofed design, the paper first presents results from the analysis of two 'best practice' housing developments in England, i.e. North West Cambridge in Cambridge and West Carclaze and Baal in St. Austell, Cornwall. Second, it examines the 'Energy and CO2 Emissions' part of the Code for Sustainable Homes to reveal which design criteria and assessment methods can be practically integrated into this established building certification scheme so that it can become more dynamic and future-oriented.

Practical application: Future-proofed construction is promoted implicitly within the increasingly stringent building regulations; however, there is no comprehensive method to readily incorporate futures thinking into the energy design of buildings. This study has a three-fold objective of relevance to the building industry:

- Illuminating the two key categories of long-term impacts in buildings, which are often erroneously treated interchangeably:
  - The environmental impact of buildings due to their long lifecycles.
  - The environment's impacts on buildings due to risks and uncertainties affecting energy consumption by at least 2050. This refers to social, technological, economic, environmental and regulatory (predictable or unknown) trends and drivers of change, such as climate uncertainty, home-working, technology readiness, etc.
- Encouraging future-proofing from an early planning stage to reduce the likelihood of a prematurely obsolete building design.
- Enhancing established building energy assessment methods (certification, modelling or audit tools) by integrating a set of future-oriented criteria into their methodologies.

© 2012 The Chartered Institution of Building Services Engineers.

Relevance:

20.00%

Abstract:

Although it is widely believed that reinforcement learning is a suitable tool for describing behavioral learning, the mechanisms by which it can be implemented in networks of spiking neurons are not fully understood. Here, we show that different learning rules emerge from a policy gradient approach depending on which features of the spike trains are assumed to influence the reward signals, i.e., depending on which neural code is in effect. We use the framework of Williams (1992) to derive learning rules for arbitrary neural codes. For illustration, we present policy-gradient rules for three different example codes - a spike count code, a spike timing code and the most general "full spike train" code - and test them on simple model problems. In addition to classical synaptic learning, we derive learning rules for intrinsic parameters that control the excitability of the neuron. The spike count learning rule has structural similarities with established Bienenstock-Cooper-Munro rules. If the distribution of the relevant spike train features belongs to the natural exponential family, the learning rules have a characteristic shape that raises interesting prediction problems.
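
As a concrete instance of the policy-gradient framework for the spike count code, the sketch below applies a Williams-style REINFORCE update to a single Poisson spiking neuron: the weight change is the reward, times the deviation of the observed count from its expected value, times the input. The exponential gain function, input patterns, reward definition, and learning rate are illustrative assumptions rather than the paper's exact setup.

```python
# Minimal sketch of a policy-gradient (REINFORCE) learning rule for a spike
# count code: a Poisson neuron with rate lambda = exp(w . x). For this code,
# grad_w log P(n | x) = (n - lambda) * x, so the update is reward-modulated
# by the deviation of the observed count from its expectation. The gain
# function, inputs, reward and learning rate below are assumptions.
import numpy as np

rng = np.random.default_rng(0)
eta, n_trials = 0.005, 4000
x_plus = np.array([1.0, 0.0, 1.0])    # pattern for which many spikes are rewarded
x_minus = np.array([0.0, 1.0, 1.0])   # pattern for which spikes are penalised
w = np.zeros(3)
rewards = []

for t in range(n_trials):
    x, sign = (x_plus, +1.0) if t % 2 == 0 else (x_minus, -1.0)
    lam = np.exp(np.clip(w @ x, -3.0, 3.0))   # Poisson rate (clipped for numerical stability)
    n = rng.poisson(lam)                      # sampled spike count (the "action")
    reward = sign * n                         # assumed reward: spikes good for x_plus, bad for x_minus
    w += eta * reward * (n - lam) * x         # REINFORCE update for the spike count code
    rewards.append(reward)

print("mean reward, first 500 trials:", np.mean(rewards[:500]))
print("mean reward, last 500 trials :", np.mean(rewards[-500:]))
```

With learning, the average reward rises as the neuron fires more for the rewarded pattern and less for the penalised one, which is the behaviour the count-code rule is meant to produce.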

Relevance:

20.00%

Abstract:

BGCore is a software package for comprehensive computer simulation of nuclear reactor systems and their fuel cycles. BGCore interfaces the Monte Carlo particle transport code MCNP4C with SARAF, an independently developed module for calculating in-core fuel composition and spent-fuel emissions following discharge. In the BGCore system, the depletion coupling methodology is based on a multi-group approach that significantly reduces computation time and allows a large number of nuclides to be tracked during the calculations. In this study, the burnup calculation capabilities of the BGCore system were validated against well-established and verified computer codes for thermal and fast spectrum lattices. Very good agreement in k-eigenvalue and nuclide density predictions was observed for all cases under consideration. In addition, the decay heat prediction capabilities of the BGCore system were benchmarked against the most recent edition of the ANS Standard methodology for UO2 fuel decay power prediction in LWRs. The difference between the ANS Standard data and the BGCore predictions was found not to exceed 5%.
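
The multi-group depletion coupling mentioned above can be sketched generically: group fluxes collapse the cross sections into one-group reaction rates, which fill a Bateman matrix that is advanced over a burnup step with a matrix exponential. The three-nuclide chain, group fluxes, cross sections, and step length below are hypothetical placeholders; this is not BGCore's implementation.

```python
# Generic sketch of multi-group depletion coupling (not BGCore's actual code):
# collapse hypothetical multi-group cross sections with a flux spectrum into
# one-group reaction rates, build a small Bateman matrix for an illustrative
# chain A -> B -> C, and advance it over one burnup step with a matrix
# exponential. All numbers are placeholders.
import numpy as np
from scipy.linalg import expm

flux_g = np.array([1.0e14, 5.0e13, 2.0e13])        # hypothetical 3-group flux (n/cm^2/s)
sigma_capture_g = {
    "A": np.array([0.5, 2.0, 30.0]) * 1e-24,       # capture cross sections in cm^2
    "B": np.array([0.2, 1.0, 10.0]) * 1e-24,
}
decay_const = {"B": 1.0e-6}                        # decay constant of B (1/s), placeholder

def one_group_rate(sigma_g):
    """Flux-weighted collapse of multi-group cross sections to a one-group rate."""
    return float(np.sum(sigma_g * flux_g))         # reactions per atom per second

r_A = one_group_rate(sigma_capture_g["A"])
r_B = one_group_rate(sigma_capture_g["B"])

# Bateman matrix dN/dt = M N for the chain A --capture--> B --capture/decay--> C
M = np.array([
    [-r_A,  0.0,                         0.0],
    [ r_A, -(r_B + decay_const["B"]),    0.0],
    [ 0.0,  (r_B + decay_const["B"]),    0.0],
])

N0 = np.array([1.0e22, 0.0, 0.0])                  # initial atom densities (atoms/cm^3)
dt = 30 * 24 * 3600.0                              # one 30-day burnup step (s)
N1 = expm(M * dt) @ N0
print("densities after one step:", N1)
```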

Relevance:

20.00%

Abstract:

In this study, the Serpent Monte Carlo code was used as a tool for the preparation of homogenized few-group cross sections for the nodal diffusion analysis of Sodium-cooled Fast Reactor (SFR) cores. Few-group constants for two reference SFR cores were generated by Serpent and then employed by the nodal diffusion code DYN3D in 2D full-core calculations. The DYN3D results were verified against the reference full-core Serpent Monte Carlo solutions. Good agreement between the reference Monte Carlo and nodal diffusion results was observed, demonstrating the feasibility of using Serpent to generate few-group constants for deterministic SFR analysis.
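
Few-group constant generation of the kind described above boils down to flux-volume weighting: condensing in energy and homogenizing in space so that node-integrated reaction rates are preserved. The sketch below shows this generic procedure with hypothetical region volumes, fluxes, and cross sections; it is not Serpent's or DYN3D's internal implementation.

```python
# Generic sketch of spatial homogenization and energy condensation of cross
# sections into few-group constants (not Serpent/DYN3D internals). Region
# volumes, fluxes and cross sections are hypothetical placeholders.
import numpy as np

n_fine = 8                                   # fine energy groups
few_group_bounds = [(0, 4), (4, 8)]          # condense fine groups 0-3 -> fast, 4-7 -> thermal

volumes = np.array([1.0, 2.0, 1.5])          # cm^3, heterogeneous regions of one node
flux = np.abs(np.random.default_rng(1).normal(1.0, 0.3, size=(3, n_fine)))   # region x group flux
sigma = np.abs(np.random.default_rng(2).normal(0.3, 0.1, size=(3, n_fine)))  # region x group XS (1/cm)

def homogenize_condense(sigma, flux, volumes, bounds):
    """Flux-volume weighted, node-homogenized few-group cross sections."""
    few = []
    for lo, hi in bounds:
        reaction_rate = np.sum(sigma[:, lo:hi] * flux[:, lo:hi] * volumes[:, None])
        total_flux = np.sum(flux[:, lo:hi] * volumes[:, None])
        few.append(reaction_rate / total_flux)   # preserves the node-integrated reaction rate
    return np.array(few)

print("homogenized 2-group cross sections:",
      homogenize_condense(sigma, flux, volumes, few_group_bounds))
```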

Relevance:

20.00%

Abstract:

We demonstrate for the first time an electronically processed Walsh code with 16 chips at 18 Gchip/s. An auto-cross correlation ratio of 18.1 dB is achieved between two orthogonal codes after transmission over 10 km of SMF. © 2009 OSA.
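
The orthogonality that underlies the reported auto-to-cross correlation contrast can be illustrated numerically: the sketch below builds the 16-chip Walsh (Hadamard) set and compares a code's zero-delay correlation with itself against its correlation with another code from the set. It does not model the electronic processing or the 10 km SMF link of the experiment.

```python
# Minimal numeric illustration of 16-chip Walsh codes (Sylvester/Hadamard
# construction, natural rather than sequency ordering) and their auto vs.
# cross correlation at zero delay.
import numpy as np
from scipy.linalg import hadamard

walsh = hadamard(16)                  # rows are 16 mutually orthogonal +/-1 codes
code_a, code_b = walsh[3], walsh[5]   # two arbitrary codes from the set

auto = int(np.dot(code_a, code_a))    # = 16 (perfect match with itself)
cross = int(np.dot(code_a, code_b))   # = 0  (orthogonal codes)
print("auto-correlation:", auto, " cross-correlation:", cross)
# In the ideal, noise-free case the cross term is exactly zero; the finite
# 18.1 dB ratio reported above reflects real-world impairments in the link.
```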