81 results for Metric
Abstract:
Across Europe, elevated phosphorus (P) concentrations in lowland rivers have made them particularly susceptible to eutrophication. This is compounded in the southern and central UK by increasing pressures on water resources, which may be further enhanced by the potential effects of climate change. The EU Water Framework Directive requires an integrated approach to water resources management at the catchment scale and highlights the need for modelling tools that can distinguish relative contributions from multiple nutrient sources and are consistent with the information content of the available data. Two such models are introduced and evaluated within a stochastic framework using daily flow and total phosphorus concentrations recorded in a clay catchment typical of many areas of the lowland UK. Both models disaggregate empirical annual load estimates, derived from land use data, as a function of surface/near-surface runoff, generated using a simple conceptual rainfall-runoff model. Estimates of the daily load from agricultural land, together with those from baseflow and point sources, feed into an in-stream routing algorithm. The first model assumes constant concentrations in runoff via surface/near-surface pathways and incorporates an additional P store in the river-bed sediments, depleted above a critical discharge, to explicitly simulate resuspension. The second model, which is simpler, simulates P concentrations as a function of surface/near-surface runoff, thus emphasising the influence of non-point source loads during flow peaks and the mixing of baseflow and point sources during low flows. The temporal consistency of parameter estimates, and thus the suitability of each approach, is assessed dynamically using a new method based on Monte Carlo analysis.
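As a rough illustration of the second model's mixing behaviour, the sketch below computes an in-stream TP concentration as a flow-weighted mixture of baseflow, point-source and runoff-driven non-point contributions. All function names and parameter values are illustrative assumptions, not the paper's calibrated model.

```python
# Minimal sketch of the mixing idea in the second model: in-stream total
# phosphorus (TP) is the flow-weighted mixture of baseflow, point-source and
# runoff-driven non-point loads. Values below are made up for illustration.

def tp_concentration(q_surface, q_base, c_runoff, c_base, point_load):
    """Daily mean TP concentration (mg/l) in the reach.

    q_surface  -- surface/near-surface runoff (m^3/day)
    q_base     -- baseflow (m^3/day)
    c_runoff   -- assumed TP concentration in runoff (mg/l == g/m^3)
    c_base     -- assumed TP concentration in baseflow (mg/l)
    point_load -- point-source load (g/day), e.g. sewage effluent
    """
    q_total = q_surface + q_base
    load = q_surface * c_runoff + q_base * c_base + point_load
    return load / q_total  # g/m^3 == mg/l

# High-flow day: non-point runoff dominates the mix.
print(tp_concentration(q_surface=40000.0, q_base=10000.0,
                       c_runoff=0.5, c_base=0.05, point_load=2000.0))
# Low-flow day: baseflow and point sources dominate.
print(tp_concentration(q_surface=1000.0, q_base=9000.0,
                       c_runoff=0.5, c_base=0.05, point_load=2000.0))
```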
Abstract:
Empirical Mode Decomposition (EMD) is a data-driven technique for extracting oscillatory components from data. Although it was introduced over 15 years ago, its mathematical foundations are still missing, which also implies a lack of objective metrics for evaluating the decomposed set. The most common technique for assessing EMD results is visual inspection, which is very subjective. This article provides objective measures for assessing EMD results based on the original definition of oscillatory components.
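One objective check follows directly from the original definition of an oscillatory component: an intrinsic mode function's numbers of extrema and zero crossings must differ by at most one. The sketch below implements that count; it is an illustrative test only, not the measures proposed in the article.

```python
import numpy as np

def imf_counts(x):
    """Count local extrema and zero crossings of a 1-D signal.

    By the original IMF definition, the two counts differ by at most one,
    so comparing them gives one simple, objective check on an EMD output.
    """
    dx = np.diff(x)
    # sign changes of the first difference mark local maxima/minima
    extrema = np.sum(np.sign(dx[:-1]) != np.sign(dx[1:]))
    zero_crossings = np.sum(np.sign(x[:-1]) != np.sign(x[1:]))
    return extrema, zero_crossings

def looks_like_imf(x):
    extrema, zeros = imf_counts(x)
    return abs(extrema - zeros) <= 1

t = np.linspace(0, 1, 1000)
print(looks_like_imf(np.sin(2 * np.pi * 5 * t)))        # True: a pure tone
print(looks_like_imf(np.sin(2 * np.pi * 5 * t) + 2.0))  # False: offset removes zero crossings
```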
Abstract:
While Nalimov’s endgame tables for Western Chess are the most used today, their Depth-to-Mate metric is not the most efficient or effective in use. The authors have developed and used new programs to create tables to alternative metrics and recommend better strategies for endgame play.
Abstract:
While Nalimov’s endgame tables for Western Chess are the most used today, their Depth-to-Mate metric is not the only one and not the most effective in use. The authors have developed and used new programs to create tables to alternative metrics and recommend better strategies for endgame play.
Abstract:
Chess endgame tables should efficiently provide the value and depth of any required position during play. The indexing of an endgame’s positions is crucial to meeting this objective. This paper updates Heinz’ previous review of approaches to indexing and describes the latest approach by the first and third authors. Heinz’ and Nalimov’s endgame tables (EGTs) encompass the en passant rule and have the most compact index schemes to date. Nalimov’s EGTs, to the Distance-to-Mate (DTM) metric, require only 30.6 × 10^9 elements in total for all the 3-to-5-man endgames and are individually more compact than previous tables. His new index scheme has proved itself while generating the tables and in the 1999 World Computer Chess Championship, where many of the top programs used the new suite of EGTs.
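To make the indexing problem concrete, the sketch below shows a naive bijective index for a 3-man endgame. Real schemes such as Nalimov's are far more compact because they exploit board symmetries and legality constraints (e.g. confining one king to the a1-d1-d4 triangle); this is only an illustration of the basic idea.

```python
# Naive index for a KQK position (White king, White queen, Black king):
# every square triple maps to a unique slot in a table of 64^3 entries.
# Real index schemes shrink this sharply via symmetry and legality.

def naive_index(wk, wq, bk):
    """Squares are 0..63; returns a unique index in [0, 64**3)."""
    return (wk * 64 + wq) * 64 + bk

def naive_unindex(i):
    i, bk = divmod(i, 64)
    wk, wq = divmod(i, 64)
    return wk, wq, bk

i = naive_index(wk=4, wq=27, bk=60)   # e1, d4, e8
assert naive_unindex(i) == (4, 27, 60)
print(i, 64**3)                       # index vs. total naive table size
```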
Abstract:
The latest 6-man chess endgame results confirm that there are many deep forced mates beyond the 50-move rule. Players with potential wins near this limit naturally want to avoid a claim for a draw: optimal play to current metrics does not guarantee feasible wins or maximise the chances of winning against fallible opposition. A new metric and further strategies are defined which support players’ aspirations and improve their prospects of securing wins in the context of a k-move rule.
Abstract:
This note corrects a previous treatment of algorithms for the metric DTR, Depth by the Rule.
Abstract:
The recent G8 Gleneagles climate statement signed on 8 July 2005 specifically mentions a determination to lessen the impact of aviation on climate [Gleneagles, 2005. The Gleneagles communique: climate change, energy and sustainable development. http://www.fco.gov.uk/Files/kfile/PostG8_Gleneagles_Communique.pdf]. In January 2005 the European Union Emission Trading Scheme (ETS) commenced operation as the largest multi-country, multi-sector ETS in the world, albeit currently limited only to CO2 emissions. At present the scheme makes no provision for aircraft emissions. However, the UK Government would like to see aircraft included in the ETS and plans to use its Presidencies of both the EU and G8 in 2005 to implement these schemes within the EU and perhaps internationally. Non-CO2 effects have been included in some policy-orientated studies of the impact of aviation, but we argue that the inclusion of such effects in any ETS scheme is premature; we specifically argue that use of the Radiative Forcing Index for comparing emissions from different sources is inappropriate, and that there is currently no metric for such a purpose that is likely to enable their inclusion in the near future.
Abstract:
New conceptual ideas on network architectures have been proposed in the recent past. Current store-and-forward routers are replaced by active intermediate systems, which are able to perform computations on transient packets in a way that proves very helpful for developing and deploying new protocols in a short time. This paper introduces a new routing algorithm, based on a congestion metric and inspired by the behavior of ants in nature. The use of the Active Networks paradigm, associated with a cooperative learning environment, produces a robust, decentralized algorithm capable of adapting quickly to changing conditions.
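A minimal sketch of the ant-inspired routing idea, assuming an AntNet-style per-destination pheromone table and a made-up reinforcement rule; the paper's exact algorithm is not reproduced here.

```python
import random

# Each node keeps a pheromone table giving, per destination, a probability
# for each neighbour; ants pick next hops stochastically and reinforce
# links in inverse proportion to the congestion they experienced.

class AntRouter:
    def __init__(self, neighbours):
        # start with uniform pheromone over neighbours (single destination)
        self.pheromone = {n: 1.0 / len(neighbours) for n in neighbours}

    def next_hop(self):
        hops, weights = zip(*self.pheromone.items())
        return random.choices(hops, weights=weights)[0]

    def reinforce(self, hop, congestion, rate=0.3):
        """Reward a hop that saw low congestion (congestion in (0, 1])."""
        self.pheromone[hop] += rate * (1.0 - congestion)
        total = sum(self.pheromone.values())
        for n in self.pheromone:          # renormalise to probabilities
            self.pheromone[n] /= total

router = AntRouter(neighbours=["B", "C", "D"])
for _ in range(50):                       # ants keep finding C uncongested
    router.reinforce("C", congestion=0.1)
print(router.pheromone)                   # probability mass shifts towards C
```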
Abstract:
The Bureau International des Poids et Mesures, the BIPM, was established by Article 1 of the Convention du Mètre, on 20 May 1875, and is charged with providing the basis for a single, coherent system of measurements to be used throughout the world. The decimal metric system, dating from the time of the French Revolution, was based on the metre and the kilogram. Under the terms of the 1875 Convention, new international prototypes of the metre and kilogram were made and formally adopted by the first Conférence Générale des Poids et Mesures (CGPM) in 1889. Over time this system developed, so that it now includes seven base units. In 1960 it was decided at the 11th CGPM that it should be called the Système International d’Unités, the SI (in English: the International System of Units). The SI is not static but evolves to match the world’s increasingly demanding requirements for measurements at all levels of precision and in all areas of science, technology, and human endeavour. This document is a summary of the SI Brochure, a publication of the BIPM which is a statement of the current status of the SI. The seven base units of the SI, listed in Table 1, provide the reference used to define all the measurement units of the International System. As science advances, and methods of measurement are refined, their definitions have to be revised. The more accurate the measurements, the greater the care required in the realization of the units of measurement.
Abstract:
An equilibrium study of complex formation of Co(II), Ni(II), Cu(II) and Zn(II), hereafter M(II), with the quadridentate (O-, N, O-, N) donor ligand N-(2-hydroxybenzyl)-L-histidine (H2hb-L-his, hereafter H2L), in the absence and in the presence of typical (N, N) donor bidentate ligands, 1,10-phenanthroline (phen), 2,2'-bipyridine (bipy) and ethylenediamine (en), hereafter B, was carried out in aqueous solution at 25 ± 1 °C and fixed ionic strength, I = 0.1 mol dm^-3 (NaNO3). Combined pH-metric, UV-Vis and EPR measurements provide evidence for the formation of mononuclear and dinuclear binary and mixed-ligand complexes of the types M(L), M(L)2^2-, M2(L)^2+, M2(H-1L)^+, M(L)(B) and (B)M(H-1L)M(B)^+. The imidazole moiety of the ligand is found to act as a bridging bidentate ligand in the dinuclear M2(L)^2+, M2(H-1L)^+ and (B)M(H-1L)M(B)^+ complexes, using its N3 atom and its deprotonated N1-H moiety. Stability constants of the complexes provide evidence of discrimination of Cu(II) from the other M(II) ions by this ligand. Solid complexes [Ni(L)(H2O)2] (1), [Cu(L)(H2O)] (2) and [Ni(L)(bipy)]·H2O (3) have been isolated and characterized by various physicochemical studies. Single-crystal X-ray diffraction of the ternary complex 3 shows an octahedral [(O-,N,N,O-)(N,N)] geometry with extensive π-π stacking of the aromatic rings and H-bonding involving the imidazole N1-H, the secondary amino N atom, the lattice H2O molecule, and the carboxylate and phenolate O atoms.
Abstract:
This paper develops cycle-level FPGA circuits of an organization for a fast path-based neural branch predictor. Our results suggest that practical sizes of prediction tables are limited to around 32 KB to 64 KB in current FPGA technology, due mainly to the FPGA logic resources required to maintain the tables. However, the predictor scales well in terms of prediction speed. Table size alone should not be used as the metric for hardware budget when comparing neural-based predictors to predictors of totally different organizations. This paper also gives early evidence for shifting attention to the latency of recovery from misprediction, rather than prediction latency, as the most critical factor impacting prediction accuracy for this class of branch predictors.
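For context, the sketch below implements the basic perceptron predictor from which path-based neural predictors derive, which makes the table-budget arithmetic concrete: the table is N_PERCEPTRONS × (HISTORY + 1) small signed counters. Parameters are illustrative assumptions, and this is not the paper's path-based organization.

```python
# Basic (non-path-based) perceptron branch predictor sketch: a perceptron
# selected by the branch address takes the dot product of its weights with
# the global history and predicts taken if the sum is non-negative.

HISTORY = 16
N_PERCEPTRONS = 512
THETA = int(1.93 * HISTORY + 14)      # training threshold from the literature

weights = [[0] * (HISTORY + 1) for _ in range(N_PERCEPTRONS)]
history = [1] * HISTORY               # +1 = taken, -1 = not taken

def predict(pc):
    w = weights[pc % N_PERCEPTRONS]
    y = w[0] + sum(wi * hi for wi, hi in zip(w[1:], history))
    return y, y >= 0

def train(pc, taken):
    y, pred = predict(pc)
    t = 1 if taken else -1
    if pred != taken or abs(y) <= THETA:  # update on mispredict or low confidence
        w = weights[pc % N_PERCEPTRONS]
        w[0] += t
        for i in range(HISTORY):
            w[i + 1] += t * history[i]
    history.pop(0)                        # shift the global history
    history.append(t)

train(pc=0x400A, taken=True)
print(predict(0x400A))
```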
Abstract:
Biological crossover occurs during the early stages of meiosis. During this process the chromosomes undergoing crossover are synapsed together at a number of homologous sequence sections, and it is within such synapsed sections that crossover occurs. The SVLC (Synapsing Variable Length Crossover) algorithm recurrently synapses homologous genetic sequences together in order of length. The genomes are considered to be flexible, with crossover only being permitted within the synapsed sections. Consequently, common sequences are automatically preserved, with only the genetic differences being exchanged, independent of the length of such differences. In addition to providing a rationale for variable length crossover, the algorithm provides a genotypic similarity metric for variable length genomes, enabling standard niche formation techniques to be utilised. In a simple variable length test problem the SVLC algorithm outperforms current variable length crossover techniques.
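A simplified sketch of the synapsing idea, using difflib's matching blocks as a stand-in for the SVLC synapsing procedure (an assumption for illustration, not the authors' algorithm): the parents are aligned on their common ("synapsed") sections, and only the differing segments between those sections are exchanged, so shared sequences survive crossover intact regardless of parent lengths.

```python
import random
from difflib import SequenceMatcher

def synapsing_crossover(a, b, rng=random):
    """Exchange only the differing segments between two variable-length
    parent strings; common sections are preserved in both children."""
    child_a, child_b = [], []
    matcher = SequenceMatcher(None, a, b, autojunk=False)
    ia = ib = 0
    for ma, mb, size in matcher.get_matching_blocks():
        gap_a, gap_b = a[ia:ma], b[ib:mb]   # differing segments before the synapse
        if rng.random() < 0.5:              # exchange the differences at random
            gap_a, gap_b = gap_b, gap_a
        child_a.append(gap_a); child_b.append(gap_b)
        child_a.append(a[ma:ma + size])     # common section kept in both children
        child_b.append(b[mb:mb + size])
        ia, ib = ma + size, mb + size
    return "".join(child_a), "".join(child_b)

# Shared AA/BB/CC sections are preserved; only XX/pp and YY/qqq can swap.
print(synapsing_crossover("AAXXBBYYCC", "AAppBBqqqCC"))
```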