135 results for Efficient lighting

in CentAUR: Central Archive University of Reading - UK


Relevance:

60.00%

Publisher:

Abstract:

Global temperatures are expected to rise by between 1.1 and 6.4°C this century, depending, to a large extent, on the amount of carbon we emit to the atmosphere from now onwards. This warming is expected to have very negative effects on many people and ecosystems and, therefore, minimising our carbon emissions is a priority. Buildings are estimated to be responsible for around 50% of carbon emissions in the UK. Potential reductions involve both operational emissions, produced during use, and embodied emissions, produced during the manufacture of materials and components, and during construction, refurbishment and demolition. To date the major effort has focused on reducing the apparently larger operational element, which is more readily quantifiable and for which reduction measures are relatively straightforward to identify and implement. Various studies have compared the magnitude of embodied and operational emissions, but have shown considerable variation in the relative values. This illustrates the difficulty of quantifying embodied emissions, which requires detailed knowledge of the processes involved in the different life cycle phases and the use of consistent system boundaries. However, other studies have established the interaction between operational and embodied emissions, which demonstrates the importance of considering both elements together in order to maximise potential reductions. This is borne out in statements from both the Intergovernmental Panel on Climate Change and the Low Carbon Construction Innovation and Growth Team of the UK Government. In terms of meeting the 2020 and 2050 timeframes for carbon reductions it appears to be equally, if not more, important to consider early embodied carbon reductions, rather than just future operational reductions. Future decarbonisation of the energy supply, together with more efficient lighting and M&E equipment installed in future refits, is likely to significantly reduce operational emissions, lending further weight to this argument. A method of discounting to evaluate the present value of future carbon emissions would allow more realistic comparisons of the relative importance of the embodied and operational elements. This paper describes the results of case studies on carbon emissions over the whole lifecycle of three buildings in the UK, compares four available software packages for determining embodied carbon, and suggests a method of carbon discounting to obtain present values for future emissions. These form the initial stages of a research project aimed at producing information on embodied carbon for different types of building, components and forms of construction, in a simplified form which can be readily used by building designers to optimise building design in terms of minimising overall carbon emissions.

Keywords: embodied carbon; carbon emission; building; operational carbon.
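The abstract does not give the specific discounting formula, so the following is only a minimal sketch of the general idea: a future emission is discounted back to a present-value equivalent with an assumed exponential discount rate. The rate and the emission figures below are hypothetical.

```python
# Minimal sketch of carbon discounting: the present value of an emission of
# C tCO2e occurring t years from now, under an assumed annual discount rate r.
# The rate and the timing profile below are illustrative, not from the paper.

def present_value(emission_tCO2e: float, years_from_now: float, rate: float = 0.03) -> float:
    """Discount a future emission back to a present-value equivalent."""
    return emission_tCO2e / (1.0 + rate) ** years_from_now

# Compare embodied carbon emitted today with operational carbon spread evenly
# over a 60-year service life (all figures hypothetical).
embodied_now = 500.0                  # tCO2e, emitted at year 0
operational_per_year = 20.0           # tCO2e per year of use
pv_embodied = present_value(embodied_now, 0)
pv_operational = sum(present_value(operational_per_year, t) for t in range(1, 61))
print(f"PV embodied:    {pv_embodied:8.1f} tCO2e")
print(f"PV operational: {pv_operational:8.1f} tCO2e")
```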

Relevance:

20.00%

Publisher:

Abstract:

We consider the approximation of some highly oscillatory weakly singular surface integrals, arising from boundary integral methods with smooth global basis functions for solving problems of high frequency acoustic scattering by three-dimensional convex obstacles, described globally in spherical coordinates. As the frequency of the incident wave increases, the performance of standard quadrature schemes deteriorates. Naive application of asymptotic schemes also fails due to the weak singularity. We propose here a new scheme based on a combination of an asymptotic approach and exact treatment of singularities in an appropriate coordinate system. For the case of a spherical scatterer we demonstrate via error analysis and numerical results that, provided the observation point is sufficiently far from the shadow boundary, a high level of accuracy can be achieved with a minimal computational cost.
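As a rough illustration of the difficulty, the sketch below uses a one-dimensional model integral of my own choosing, not the authors' three-dimensional boundary-integral scheme: the integrand is highly oscillatory and weakly singular at the origin, the substitution x = t² removes the singularity and yields a closed form via Fresnel integrals, and the error of naive adaptive quadrature against that reference grows with the frequency k.

```python
# 1D model problem (illustrative only): I(k) = int_0^1 exp(i k x)/sqrt(x) dx
# is highly oscillatory and weakly singular at x = 0. The substitution x = t^2
# gives I(k) = 2 * sqrt(pi/(2k)) * (C(z) + i S(z)), z = sqrt(2k/pi), via
# Fresnel integrals, used here as the reference for naive adaptive quadrature.
import warnings
import numpy as np
from scipy.integrate import quad
from scipy.special import fresnel

warnings.filterwarnings("ignore")  # quad warns when it exhausts its subdivision
                                   # budget at high k; that failure is the point

def exact(k):
    """Closed form of I(k) via the Fresnel integrals S and C."""
    z = np.sqrt(2.0 * k / np.pi)
    S, C = fresnel(z)                      # scipy returns (S, C)
    return 2.0 * np.sqrt(np.pi / (2.0 * k)) * (C + 1j * S)

def quad_complex(f, a, b, **kw):
    """Integrate a complex-valued integrand with scipy.integrate.quad."""
    re, _ = quad(lambda x: f(x).real, a, b, **kw)
    im, _ = quad(lambda x: f(x).imag, a, b, **kw)
    return re + 1j * im

for k in (10.0, 100.0, 1000.0, 10000.0):
    naive = quad_complex(lambda x: np.exp(1j * k * x) / np.sqrt(x),
                         0.0, 1.0, limit=50)
    print(f"k = {k:7.0f}   |naive - exact| = {abs(naive - exact(k)):.2e}")
```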

Relevance:

20.00%

Publisher:

Abstract:

Radiation schemes in general circulation models currently make a number of simplifications when accounting for clouds, one of the most important being the removal of horizontal inhomogeneity. A new scheme is presented that attempts to account for the neglected inhomogeneity by using two regions of cloud in each vertical level of the model as opposed to one. One of these regions is used to represent the optically thinner cloud in the level, and the other represents the optically thicker cloud. So, along with the clear-sky region, the scheme has three regions in each model level and is referred to as “Tripleclouds.” In addition, the scheme has the capability to represent arbitrary vertical overlap between the three regions in pairs of adjacent levels. This scheme is implemented in the Edwards–Slingo radiation code and tested on 250 h of data from 12 different days. The data are derived from cloud retrievals using radar, lidar, and a microwave radiometer at Chilbolton, southern United Kingdom. When the data are grouped into periods equivalent in size to general circulation model grid boxes, the shortwave plane-parallel albedo bias is found to be 8%, while the corresponding bias is found to be less than 1% using Tripleclouds. Similar results are found for the longwave biases. Tripleclouds is then compared to a more conventional method of accounting for inhomogeneity that multiplies optical depths by a constant scaling factor, and Tripleclouds is seen to improve on this method both in terms of top-of-atmosphere radiative flux biases and internal heating rates.
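A toy calculation can illustrate the plane-parallel albedo bias and the effect of a two-region split of the cloudy part of a grid box. The lognormal optical-depth field and the simple saturating albedo(τ) relation below are illustrative assumptions only; this is not the Edwards–Slingo code or the Chilbolton retrieval data.

```python
# Toy illustration of the plane-parallel albedo bias and a two-region
# ("Tripleclouds"-like) correction within a single cloudy grid box.
import numpy as np

rng = np.random.default_rng(0)

def albedo(tau):
    """A simple saturating (concave) albedo-optical-depth relation (illustrative)."""
    return tau / (tau + 7.0)

# Many cloudy columns with horizontally varying optical depth.
tau = rng.lognormal(mean=np.log(10.0), sigma=0.8, size=100_000)

truth = albedo(tau).mean()                 # independent-column reference
plane_parallel = albedo(tau.mean())        # homogeneous-cloud assumption
thin, thick = np.split(np.sort(tau), 2)    # split at the median optical depth
two_region = 0.5 * albedo(thin.mean()) + 0.5 * albedo(thick.mean())

print(f"reference albedo      : {truth:.3f}")
print(f"plane-parallel albedo : {plane_parallel:.3f}  (bias {plane_parallel - truth:+.3f})")
print(f"two-region albedo     : {two_region:.3f}  (bias {two_region - truth:+.3f})")
```

Because the albedo relation is concave in optical depth, the homogeneous (plane-parallel) assumption overestimates the albedo, and splitting the cloud into an optically thinner and an optically thicker region recovers most of that bias.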

Relevance:

20.00%

Publisher:

Abstract:

Clustering is defined as the grouping of similar items in a set, and is an important process within the field of data mining. As the amount of data for various applications continues to increase, in terms of both size and dimensionality, efficient clustering methods are needed. A popular clustering algorithm is K-Means, which adopts a greedy approach to produce a set of K clusters with associated centres of mass, and uses a squared error distortion measure to determine convergence. Methods for improving the efficiency of K-Means have largely been explored in two main directions. The amount of computation can be significantly reduced by adopting a more efficient data structure, notably a multi-dimensional binary search tree (KD-Tree), to store either centroids or data points. A second direction is parallel processing, where data and computation loads are distributed over many processing nodes. However, little work has been done to provide a parallel formulation of the efficient sequential techniques based on KD-Trees. Such approaches are expected to have an irregular distribution of computation load and can suffer from load imbalance. This issue has so far limited the adoption of these efficient K-Means techniques in parallel computational environments. In this work, we provide a parallel formulation for the KD-Tree based K-Means algorithm and address its load balancing issues.
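For reference, a minimal sequential K-Means (Lloyd's algorithm) with the squared-error distortion convergence test is sketched below; the KD-Tree acceleration and the parallel, load-balanced formulation discussed in the paper are not shown.

```python
# Minimal sequential K-Means (Lloyd's algorithm); convergence is declared when
# the squared-error distortion stops improving noticeably.
import numpy as np

def kmeans(points: np.ndarray, k: int, tol: float = 1e-6, max_iter: int = 100, seed: int = 0):
    rng = np.random.default_rng(seed)
    centres = points[rng.choice(len(points), size=k, replace=False)]
    prev_distortion = np.inf
    for _ in range(max_iter):
        # Assign every point to its nearest centre (squared Euclidean distance).
        d2 = ((points[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
        labels = d2.argmin(axis=1)
        distortion = d2[np.arange(len(points)), labels].sum()
        # Recompute each centre as the mean of its assigned points.
        for j in range(k):
            members = points[labels == j]
            if len(members):
                centres[j] = members.mean(axis=0)
        if prev_distortion - distortion <= tol * max(prev_distortion, 1.0):
            break
        prev_distortion = distortion
    return centres, labels, distortion

# Example usage on random 2D data.
pts = np.random.default_rng(1).normal(size=(1000, 2))
centres, labels, distortion = kmeans(pts, k=4)
print(centres, distortion)
```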

Relevance:

20.00%

Publisher:

Abstract:

Chess endgame tables should provide efficiently the value and depth of any required position during play. The indexing of an endgame’s positions is crucial to meeting this objective. This paper updates Heinz’ previous review of approaches to indexing and describes the latest approach by the first and third authors. Heinz’ and Nalimov’s endgame tables (EGTs) encompass the en passant rule and have the most compact index schemes to date. Nalimov’s EGTs, to the Distance-to-Mate (DTM) metric, require only 30.6 × 10^9 elements in total for all the 3-to-5-man endgames and are individually more compact than previous tables. His new index scheme has proved itself while generating the tables and in the 1999 World Computer Chess Championship where many of the top programs used the new suite of EGTs.
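To make concrete what an index scheme does, here is a deliberately naive index for a pawnless 3-man endgame (e.g. KQK): every combination of side to move and piece squares maps to a unique slot in a flat table. Production schemes such as Nalimov's are far more compact because they exploit board symmetry and discard illegal placements; the function and table below are illustrative only.

```python
# Naive index scheme for a pawnless 3-man endgame (illustrative, not Nalimov's).

def naive_index(side_to_move: int, wk: int, wq: int, bk: int) -> int:
    """side_to_move: 0 = white, 1 = black; squares are 0..63 (a1..h8)."""
    assert side_to_move in (0, 1) and all(0 <= s < 64 for s in (wk, wq, bk))
    return ((side_to_move * 64 + wk) * 64 + wq) * 64 + bk

TABLE_SIZE = 2 * 64 * 64 * 64           # 524288 slots, many of them illegal
depth_to_mate = bytearray(TABLE_SIZE)   # one byte per position (illustrative)

# Example: white to move, Ke1, Qd1 versus Ke8.
idx = naive_index(0, 4, 3, 60)
print(idx, TABLE_SIZE)
```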

Relevance:

20.00%

Publisher:

Abstract:

The authors propose a bit-serial pipeline to perform the genetic operators in a hardware genetic algorithm. The bit-serial nature of the dataflow allows the operators to be pipelined, resulting in an architecture which is area-efficient, easily scaled, and independent of the length of the chromosomes. An FPGA implementation of the device achieves a throughput of over 25 million genes per second.
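A software model can convey the dataflow style: bits stream through the operators one per "clock cycle", so the logic never depends on the chromosome length. The generator below models single-point crossover followed by bitwise mutation; it is only an illustration of the idea, not the authors' FPGA architecture.

```python
# Software model of bit-serial genetic operators: one child bit per cycle.
import random

def bit_serial_crossover_mutate(parent_a, parent_b, cut_point, p_mutation, rng=random):
    """Copy bits from parent_a before the cut point and from parent_b after it
    (single-point crossover), then flip each bit with probability p_mutation."""
    for cycle, (a_bit, b_bit) in enumerate(zip(parent_a, parent_b)):
        bit = a_bit if cycle < cut_point else b_bit
        if rng.random() < p_mutation:
            bit ^= 1
        yield bit

rng = random.Random(0)
length = 32
pa = [rng.randint(0, 1) for _ in range(length)]
pb = [rng.randint(0, 1) for _ in range(length)]
child = list(bit_serial_crossover_mutate(pa, pb, cut_point=rng.randrange(length),
                                         p_mutation=0.01, rng=rng))
print("".join(map(str, child)))
```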

Relevance:

20.00%

Publisher:

Abstract:

A multiple factor parametrization is described to permit the efficient calculation of the collision efficiency (E) between electrically charged aerosol particles and neutral cloud droplets in numerical models of cloud and climate. The four-parameter representation summarizes the results obtained from a detailed microphysical model of E, which accounts for the different forces acting on the aerosol in the path of falling cloud droplets. The parametrization's range of validity is for aerosol particle radii of 0.4 to 10 μm, aerosol particle densities of 1 to 2.0 g cm⁻³, aerosol particle charges from neutral to 100 elementary charges, and drop radii from 18.55 to 142 μm. The parametrization yields values of E well within an order of magnitude of the detailed model's values, from a dataset of 3978 E values. Of these values, 95% have modelled-to-parametrized ratios between 0.5 and 1.5 for aerosol particle sizes between 0.4 and 2.0 μm, and about 96% in the second size range. This parametrization speeds up the calculation of E by a factor of ~10³ compared with the original microphysical model, permitting the inclusion of electric charge effects in numerical cloud and climate models.
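The abstract does not give the functional form of the four-parameter fit, so the sketch below only reproduces the kind of validation described: the fraction of modelled-to-parametrized ratios falling within 0.5 to 1.5, split at the 2.0 μm aerosol size boundary. The arrays are hypothetical stand-ins for the real 3978-value dataset.

```python
# Validation-style check of a parametrization against a detailed model
# (hypothetical data; not the paper's dataset or fit).
import numpy as np

def ratio_coverage(E_model, E_param, radii_um, lo=0.5, hi=1.5, size_split_um=2.0):
    """Return the in-range fraction of E_model/E_param in each aerosol size range."""
    ratio = np.asarray(E_model) / np.asarray(E_param)
    radii = np.asarray(radii_um)
    in_range = (ratio >= lo) & (ratio <= hi)
    small = radii <= size_split_um
    return in_range[small].mean(), in_range[~small].mean()

rng = np.random.default_rng(0)
radii = rng.uniform(0.4, 10.0, size=3978)
E_model = rng.lognormal(-3.0, 1.0, size=radii.size)
E_param = E_model * rng.lognormal(0.0, 0.2, size=radii.size)
print(ratio_coverage(E_model, E_param, radii))
```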

Relevance:

20.00%

Publisher:

Abstract:

A new heuristic for the Steiner Minimal Tree problem is presented here. The method described is based on the detection of particular sets of nodes in networks, the "Hot Spot" sets, which are used to obtain better approximations of the optimal solutions. An algorithm is also proposed which is capable of improving the solutions obtained by classical heuristics, by means of a stirring process of the nodes in solution trees. Classical heuristics and an enumerative method are used as comparison terms in the experimental analysis, which demonstrates the effectiveness of the heuristic discussed in this paper.
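The "Hot Spot" detection and stirring refinement are only described at a high level in the abstract, so the sketch below instead shows the kind of classical heuristic used as a comparison baseline: the shortest-path / minimum-spanning-tree construction in the style of Kou, Markowsky and Berman. It assumes networkx is available and a weighted graph with a set of terminal nodes; it is not the authors' method.

```python
# Classical Steiner-tree heuristic (KMB style): distance graph on terminals,
# MST of that graph expanded into shortest paths, MST again, prune leaves.
import itertools
import networkx as nx

def steiner_heuristic(G: nx.Graph, terminals, weight="weight") -> nx.Graph:
    # 1. Complete "distance graph" on the terminals, weighted by shortest paths.
    dist, paths = nx.Graph(), {}
    for u, v in itertools.combinations(terminals, 2):
        length, path = nx.single_source_dijkstra(G, u, v, weight=weight)
        dist.add_edge(u, v, weight=length)
        paths[(u, v)] = path
    # 2. MST of the distance graph, expanded back into edges of G.
    expanded = nx.Graph()
    for u, v in nx.minimum_spanning_tree(dist, weight=weight).edges():
        nx.add_path(expanded, paths.get((u, v)) or paths[(v, u)])
    for u, v in expanded.edges():
        expanded[u][v][weight] = G[u][v].get(weight, 1)
    # 3. MST of the expanded subgraph, then prune non-terminal leaves.
    tree = nx.minimum_spanning_tree(expanded, weight=weight)
    leaves = [n for n in tree if tree.degree(n) == 1 and n not in terminals]
    while leaves:
        tree.remove_nodes_from(leaves)
        leaves = [n for n in tree if tree.degree(n) == 1 and n not in terminals]
    return tree

# Example usage on a small weighted grid graph with four terminals (hypothetical).
G = nx.grid_2d_graph(5, 5)
nx.set_edge_attributes(G, 1, "weight")
T = steiner_heuristic(G, terminals=[(0, 0), (0, 4), (4, 0), (4, 4)])
print(sorted(T.edges()))
```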
