999 results for Code Reuse


Relevance:

20.00%

Publisher:

Abstract:

A list is given of the provisions for aquaculture in the Philippine Fishery Code, passed by Congress on its third and final reading on 5 August 1997, under the following headings: 1) Code of practice for aquaculture; 2) Fishpond lease agreements; 3) Fish pens, fish cages, fish traps, etc.; 4) Non-obstruction to navigation and to defined migration paths of fish; 5) Insurance; and, 6) Registration.

Relevance:

20.00%

Publisher:

Abstract:

This paper advocates 'reduce, reuse, recycle' as a complete energy savings strategy. While reduction has been common to date, there is a growing need to emphasize reuse and recycling as well. We design a DC-DC buck converter to demonstrate the three techniques: reduce with low-swing and zero-voltage switching (ZVS), reuse with supply stacking, and recycle with regulated delivery of excess energy to the output load. The efficiency gained from these three techniques helps offset the losses of operating drivers at the very high switching frequencies needed to move the output filter completely on-chip. A prototype was fabricated in 0.18 μm CMOS, operates at 660 MHz, and converts 2.2 V to 0.75-1.0 V at ∼50 mA. © 2008 IEEE.
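
The conversion figures quoted above follow from the ideal buck relationship D = Vout/Vin. The short sketch below, not taken from the paper, works through that arithmetic for the stated 2.2 V input and 0.75-1.0 V, ~50 mA output; the efficiency value used for the input-power estimate is an illustrative assumption.

```python
# Ideal buck-converter arithmetic for the operating point quoted in the
# abstract (2.2 V input, 0.75-1.0 V output at ~50 mA).  The efficiency
# figure below is an illustrative assumption, not a result from the paper.

V_IN = 2.2          # input supply voltage [V]
I_OUT = 0.050       # nominal load current [A]

def ideal_duty_cycle(v_out: float, v_in: float = V_IN) -> float:
    """Ideal (lossless) buck conversion ratio: D = Vout / Vin."""
    return v_out / v_in

def input_power(v_out: float, i_out: float = I_OUT, efficiency: float = 0.8) -> float:
    """Input power for a given output, assuming a hypothetical overall efficiency."""
    p_out = v_out * i_out
    return p_out / efficiency

for v_out in (0.75, 1.0):
    d = ideal_duty_cycle(v_out)
    p_in = input_power(v_out)
    print(f"Vout = {v_out:.2f} V: D = {d:.3f}, "
          f"Pout = {v_out * I_OUT * 1e3:.1f} mW, Pin ~= {p_in * 1e3:.1f} mW")
```

For the quoted operating points the ideal duty cycle lies between roughly 0.34 and 0.45; any drop in efficiency shows up directly as extra input power at a fixed load.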

Relevance:

20.00%

Publisher:

Abstract:

Change propagates, potentially affecting many aspects of a design and requiring much rework to implement. This article introduces a cross-domain approach to decompose a design and identify possible change propagation linkages, complemented by an interactive tool that generates dynamic checklists to assess change impact. The approach considers the information domains of requirements, functions, components, and the detail design process. Laboratory experiments using a vacuum cleaner suggest that cross-domain modelling helps analyse a design to create and capture the information required for change prediction. Further experiments using an electronic product show that this information, coupled with the interactive tool, helps to quickly and consistently assess the impact of a proposed change. © 2012 Springer-Verlag London Limited.
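
As an illustration of what such cross-domain linkage information can look like in practice, the sketch below builds a small directed graph of "change may propagate to" links across the requirement, function, component, and process domains and traverses it to produce a checklist of impacts to assess. The element names and links are invented for illustration and are not the article's vacuum-cleaner or electronic-product models.

```python
# Hypothetical cross-domain linkage model: elements in the requirements,
# function, component and process domains, with directed "change may
# propagate to" links.  Element names and links are invented for illustration.
from collections import deque

links = {
    "REQ: suction power":     ["FUN: generate airflow"],
    "FUN: generate airflow":  ["COM: motor", "COM: impeller"],
    "COM: motor":             ["COM: power electronics", "PRO: motor sourcing"],
    "COM: impeller":          ["PRO: impeller tooling"],
    "COM: power electronics": [],
    "PRO: motor sourcing":    [],
    "PRO: impeller tooling":  [],
}

def change_checklist(changed: str) -> list[str]:
    """Breadth-first traversal of possible propagation paths, producing an
    ordered checklist of elements whose change impact should be assessed."""
    seen, queue, checklist = {changed}, deque([changed]), []
    while queue:
        element = queue.popleft()
        for target in links.get(element, []):
            if target not in seen:
                seen.add(target)
                checklist.append(f"Assess impact of '{changed}' change on '{target}'")
                queue.append(target)
    return checklist

for item in change_checklist("REQ: suction power"):
    print(item)
```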

Relevance:

20.00%

Publisher:

Abstract:

An object in the peripheral visual field is more difficult to recognize when surrounded by other objects. This phenomenon is called "crowding". Crowding places a fundamental constraint on human vision that limits performance on numerous tasks. It has been suggested that crowding results from spatial feature integration necessary for object recognition. However, in the absence of convincing models, this theory has remained controversial. Here, we present a quantitative and physiologically plausible model for spatial integration of orientation signals, based on the principles of population coding. Using simulations, we demonstrate that this model coherently accounts for fundamental properties of crowding, including critical spacing, "compulsory averaging", and a foveal-peripheral anisotropy. Moreover, we show that the model predicts increased responses to correlated visual stimuli. Altogether, these results suggest that crowding has little immediate bearing on object recognition but is a by-product of a general, elementary integration mechanism in early vision aimed at improving signal quality.
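
A flavour of the population-coding account can be given with a deliberately simplified sketch: orientation signals are encoded by Gaussian-tuned populations, the responses to a target and its flankers are pooled, and a population-vector read-out reports the integrated orientation. The tuning width, preferred orientations, and stimulus values below are illustrative assumptions rather than the parameters of the published model, but the pooled read-out shows the "compulsory averaging" behaviour described above.

```python
# Minimal population-coding sketch: Gaussian orientation tuning, pooled
# (averaged) responses over a target and its flankers, and a population-vector
# read-out.  Parameters (tuning width, preferred orientations, stimuli) are
# illustrative assumptions, not those of the published model.
import numpy as np

preferred = np.arange(0.0, 180.0, 5.0)   # preferred orientations [deg]
TUNING_WIDTH = 20.0                      # Gaussian tuning width [deg]

def circ_diff(a, b):
    """Signed orientation difference in (-90, 90] (180-deg periodic)."""
    return (a - b + 90.0) % 180.0 - 90.0

def population_response(theta):
    """Noise-free population response to a single orientation theta."""
    return np.exp(-0.5 * (circ_diff(theta, preferred) / TUNING_WIDTH) ** 2)

def decode(response):
    """Population-vector decoding on the doubled-angle circle (0-180 deg)."""
    angles = np.deg2rad(2.0 * preferred)
    vec = np.sum(response * np.exp(1j * angles))
    return (np.rad2deg(np.angle(vec)) / 2.0) % 180.0

target, flankers = 80.0, [50.0, 60.0]
pooled = np.mean([population_response(t) for t in [target] + flankers], axis=0)

print("decoded alone  :", round(decode(population_response(target)), 1))
print("decoded crowded:", round(decode(pooled), 1))  # pulled toward the target/flanker average
```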

Relevance:

20.00%

Publisher:

Abstract:

This paper investigates 'future-proofing' as an unexplored yet all-important aspect in the design of low-energy dwellings. It refers particularly to adopting lifecycle thinking and accommodating risks and uncertainties in the selection of fabric energy efficiency measures and low or zero-carbon technologies. Based on a conceptual framework for future-proofed design, the paper first presents results from the analysis of two 'best practice' housing developments in England, i.e., North West Cambridge in Cambridge and West Carclaze and Baal in St. Austell, Cornwall. Second, it examines the 'Energy and CO2 Emissions' part of the Code for Sustainable Homes to reveal which design criteria and assessment methods can be practically integrated into this established building certification scheme so that it can become more dynamic and future-oriented.

Practical application: Future-proofed construction is promoted implicitly within the increasingly stringent building regulations; however, there is no comprehensive method to readily incorporate futures thinking into the energy design of buildings. This study has a three-fold objective of relevance to the building industry:

- Illuminating the two key categories of long-term impacts in buildings, which are often erroneously treated interchangeably:
  - The environmental impact of buildings due to their long lifecycles.
  - The environment's impacts on buildings due to risks and uncertainties affecting energy consumption until at least 2050. This refers to social, technological, economic, environmental and regulatory (predictable or unknown) trends and drivers of change, such as climate uncertainty, home-working, technology readiness, etc.
- Encouraging future-proofing from an early planning stage to reduce the likelihood of a prematurely obsolete building design.
- Enhancing established building energy assessment methods (certification, modelling or audit tools) by integrating a set of future-oriented criteria into their methodologies.

© 2012 The Chartered Institution of Building Services Engineers.

Relevance:

20.00%

Publisher:

Abstract:

Although it is widely believed that reinforcement learning is a suitable tool for describing behavioral learning, the mechanisms by which it can be implemented in networks of spiking neurons are not fully understood. Here, we show that different learning rules emerge from a policy gradient approach depending on which features of the spike trains are assumed to influence the reward signals, i.e., depending on which neural code is in effect. We use the framework of Williams (1992) to derive learning rules for arbitrary neural codes. For illustration, we present policy-gradient rules for three different example codes - a spike count code, a spike timing code and the most general "full spike train" code - and test them on simple model problems. In addition to classical synaptic learning, we derive learning rules for intrinsic parameters that control the excitability of the neuron. The spike count learning rule has structural similarities with established Bienenstock-Cooper-Munro rules. If the distribution of the relevant spike train features belongs to the natural exponential family, the learning rules have a characteristic shape that raises interesting prediction problems.
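
For the simplest of the three codes, a Poisson spike-count code with expected count lambda = T * exp(w . x), the Williams-style score function is d/dw log p(n) = (n - lambda) x, which yields the policy-gradient update sketched below. The task, rates, learning rate, and reward baseline are illustrative assumptions, not the paper's simulations.

```python
# REINFORCE-style policy-gradient rule for a Poisson spike-count code:
# the count n ~ Poisson(lambda), lambda = T * exp(w . x), so the score
# function is d/dw log p(n) = (n - lambda) * x.  The task, rates and
# learning rate below are illustrative assumptions, not the paper's setup.
import numpy as np

rng = np.random.default_rng(0)
T = 1.0                         # trial duration (arbitrary units)
eta = 0.05                      # learning rate
w = np.zeros(2)                 # synaptic weights
baseline = 0.0                  # running reward baseline (variance reduction)

patterns = {"A": np.array([1.0, 0.0]), "B": np.array([0.0, 1.0])}

def reward(label: str, n: int) -> float:
    """Hypothetical task: fire for pattern A, stay silent for pattern B."""
    return 1.0 if (label == "A" and n >= 3) or (label == "B" and n == 0) else 0.0

for trial in range(5000):
    label = "A" if rng.random() < 0.5 else "B"
    x = patterns[label]
    lam = T * np.exp(w @ x)                        # expected spike count
    n = rng.poisson(lam)                           # sampled spike count
    R = reward(label, n)
    w += eta * (R - baseline) * (n - lam) * x      # policy-gradient update
    baseline += 0.01 * (R - baseline)              # slow baseline tracking

print("learned weights:", np.round(w, 2))          # w[0] grows (fire for A), w[1] shrinks
```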

Relevance:

20.00%

Publisher:

Abstract:

BGCore is a software package for comprehensive computer simulation of nuclear reactor systems and their fuel cycles. BGCore interfaces the Monte Carlo particle transport code MCNP4C with the SARAF module, an independently developed code for calculating in-core fuel composition and spent-fuel emissions following discharge. In the BGCore system, the depletion coupling methodology is based on a multi-group approach that significantly reduces computation time and allows a large number of nuclides to be tracked during calculations. In this study, the burnup calculation capabilities of the BGCore system were validated against well-established and verified computer codes for thermal and fast spectrum lattices. Very good agreement in the k eigenvalue and nuclide density predictions was observed for all cases under consideration. In addition, the decay heat prediction capabilities of the BGCore system were benchmarked against the most recent edition of the ANS Standard methodology for UO2 fuel decay power prediction in LWRs. It was found that the difference between the ANS Standard data and the BGCore predictions does not exceed 5%.
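
The depletion step underlying such a coupling can be written as dN/dt = A N, where A is a burnup matrix of decay constants and one-group transmutation rates, and solved over a time step with a matrix exponential. The sketch below does this for a hypothetical three-nuclide chain; the rates are invented for illustration and are not BGCore, SARAF, or MCNP4C data.

```python
# One depletion step dN/dt = A N solved with a matrix exponential, for a
# hypothetical three-nuclide chain.  Decay constants and the one-group
# capture rate are invented for illustration.
import numpy as np
from scipy.linalg import expm

SECONDS_PER_DAY = 86400.0

# Illustrative rates [1/s]: lambda_1, lambda_2 are decay constants,
# sigma_phi_1 is a one-group capture rate (cross-section * flux) on nuclide 1.
lambda_1, lambda_2, sigma_phi_1 = 1.0e-6, 5.0e-7, 2.0e-7

# Burnup matrix: row i collects production and loss terms of nuclide i.
# Columns sum to zero, so the total atom count is conserved in this chain.
A = np.array([
    [-(lambda_1 + sigma_phi_1), 0.0,       0.0],  # nuclide 1: lost by decay and capture
    [  lambda_1,               -lambda_2,  0.0],  # nuclide 2: fed by decay of 1, lost by decay
    [  sigma_phi_1,             lambda_2,  0.0],  # nuclide 3 (stable): fed by capture on 1 and decay of 2
])

N0 = np.array([1.0e24, 0.0, 0.0])    # initial nuclide inventories [atoms]
dt = 30.0 * SECONDS_PER_DAY          # one 30-day depletion step

N = expm(A * dt) @ N0                # N(t) = exp(A t) N0
print("N after 30 days:", N)
print("total conserved:", np.isclose(N.sum(), N0.sum()))
```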