24 results for Heinz bodies

in CentAUR: Central Archive University of Reading - UK


Relevance:

10.00%

Publisher:

Abstract:

While Nalimov’s endgame tables for Western Chess are the most widely used today, their Depth-to-Mate metric is neither the only one available nor the most effective in use. The authors have developed and used new programs to create tables for alternative metrics and to recommend better strategies for endgame play.
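
To illustrate the kind of computation behind such tables (not the authors’ actual programs), the following Python sketch builds a depth-to-mate table by retrograde analysis over an abstract game graph; the graph representation, the terminal `mated` set, and the treatment of draws are simplifying assumptions.

    from collections import defaultdict, deque

    def depth_to_mate(positions, moves, mated):
        """Retrograde analysis over an abstract game graph.

        positions : all reachable positions (hashable identifiers)
        moves     : dict mapping a position to the successor positions the
                    side to move can reach (no duplicate entries assumed)
        mated     : set of positions in which the side to move is checkmated

        Returns a dict mapping each decided position to its depth-to-mate in
        plies (even = side to move loses, odd = side to move wins); positions
        missing from the result are treated as draws in this toy model.
        """
        preds = defaultdict(list)
        for p in positions:
            for q in moves.get(p, []):
                preds[q].append(p)

        dtm = {p: 0 for p in mated}                      # checkmated now
        unresolved = {p: len(moves.get(p, [])) for p in positions}
        queue = deque(mated)

        while queue:
            q = queue.popleft()
            for p in preds[q]:
                if p in dtm:
                    continue
                if dtm[q] % 2 == 0:
                    # p has a move into a lost position, so p is won
                    dtm[p] = dtm[q] + 1
                    queue.append(p)
                else:
                    # q is won for the opponent; p is lost only if every move is
                    unresolved[p] -= 1
                    if unresolved[p] == 0:
                        dtm[p] = 1 + max(dtm[r] for r in moves[p])
                        queue.append(p)
        return dtm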

Relevance:

10.00%

Publisher:

Abstract:

A reference model of Fallible Endgame Play has been implemented and exercised with the chess engine WILHELM. Various experiments have demonstrated the value of the model and the robustness of decisions based on it. Experimental results have also been compared with the theoretical predictions of a Markov model of the endgame and found to be in close agreement.
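
As a rough illustration of how a Markov model yields predictions about fallible endgame play (this is not the reference model or WILHELM itself), the sketch below treats depth-to-mate as the state of an absorbing Markov chain in which the attacker finds a best move with probability p_best; the assumed two-ply cost of an error is arbitrary.

    import numpy as np

    def expected_plies_to_mate(max_dtm, p_best):
        """Toy absorbing Markov chain for fallible play of a won endgame.

        States 1..max_dtm mean 'win in d plies' for the attacker; mate is the
        absorbing state.  With probability p_best the attacker plays a best
        move (d -> d-1); otherwise the error is assumed to cost two plies
        (d -> d+1), except at the deepest state, where it merely stalls.

        Returns an array E with E[d] = expected plies to mate from depth d.
        """
        n = max_dtm
        Q = np.zeros((n, n))                    # transitions among transient states
        for d in range(1, n + 1):
            if d > 1:
                Q[d - 1, d - 2] = p_best        # best move: one ply closer to mate
            if d < n:
                Q[d - 1, d] = 1.0 - p_best      # error: depth increases instead
            else:
                Q[d - 1, d - 1] = 1.0 - p_best  # error at the deepest state: stall
        # from d = 1 a best move delivers mate, i.e. leaves the transient states
        N = np.linalg.inv(np.eye(n) - Q)        # fundamental matrix of the chain
        return np.concatenate(([0.0], N @ np.ones(n)))

    # e.g. expected_plies_to_mate(20, 0.9)[10]: expected game length from DTM 10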

Relevance:

10.00%

Publisher:

Abstract:

Heinz recently completed a comprehensive experiment in self-play using the FRITZ chess engine to establish the ‘decreasing returns’ hypothesis with specific levels of statistical confidence. This note revisits the results and recalculates the confidence levels of this and other hypotheses. These appear to be better than Heinz’ initial analysis suggests.
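
A minimal sketch of the kind of confidence calculation involved, assuming each game is reduced to a win or a loss for the deeper-searching side (draws are set aside here; Heinz’ actual experiment and scoring are richer than this):

    from math import comb

    def one_sided_binomial_pvalue(wins, games, p0=0.5):
        """Exact one-sided binomial tail P(X >= wins) for n = games, p = p0."""
        return sum(comb(games, k) * p0**k * (1.0 - p0)**(games - k)
                   for k in range(wins, games + 1))

    def confidence_of_superiority(wins, games):
        """Confidence, in percent, that the deeper-searching side really scores
        above 50% against its shallower self, given `wins` decisive wins out of
        `games` decisive games."""
        return 100.0 * (1.0 - one_sided_binomial_pvalue(wins, games))

    # e.g. confidence_of_superiority(620, 1000) for a hypothetical 620 wins
    # in 1000 decisive games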

Relevance:

10.00%

Publisher:

Abstract:

From the beginning, the world of game-playing by machine has been fortunate in attracting contributions from the leading names of computer science. Charles Babbage, Konrad Zuse, Claude Shannon, Alan Turing, John von Neumann, John McCarthy, Allen Newell, Herb Simon and Ken Thompson all come to mind, and each reader will wish to add to this list. Recently, the Journal has saluted both Claude Shannon and Herb Simon. Ken’s retirement from Lucent Technologies’ Bell Labs to the start-up Entrisphere is also a good moment for reflection.

Relevance:

10.00%

Publisher:

Abstract:

Chess endgame tables should efficiently provide the value and depth of any required position during play. The indexing of an endgame’s positions is crucial to meeting this objective. This paper updates Heinz’ previous review of approaches to indexing and describes the latest approach by the first and third authors. Heinz’ and Nalimov’s endgame tables (EGTs) encompass the en passant rule and have the most compact index schemes to date. Nalimov’s EGTs, to the Distance-to-Mate (DTM) metric, require only 30.6 × 10^9 elements in total for all the 3-to-5-man endgames and are individually more compact than previous tables. His new index scheme has proved itself both in generating the tables and at the 1999 World Computer Chess Championship, where many of the top programs used the new suite of EGTs.
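
For orientation only, the sketch below shows a deliberately naive index scheme for a three-man endgame; it ignores the symmetry reductions, broken-position elimination, and en passant handling that make Heinz’ and Nalimov’s schemes compact, but it makes the basic position-to-slot mapping concrete.

    def naive_kxk_index(stm, wk, bk, x):
        """Naive index for a three-man pawnless endgame (e.g. KQK).

        Maps (side to move, white king square, black king square, extra piece
        square), each square in 0..63, to a unique slot in a table of
        2 * 64**3 entries.  Real schemes such as Heinz' or Nalimov's are far
        more compact: they exploit board symmetry (in pawnless endgames the
        white king can be confined to a 10-square triangle), discard broken
        or illegal positions, and fold in en passant rights where relevant.
        """
        assert stm in (0, 1) and all(0 <= s < 64 for s in (wk, bk, x))
        return ((stm * 64 + wk) * 64 + bk) * 64 + x

    def naive_kxk_unindex(i):
        """Inverse mapping: recover (stm, wk, bk, x) from a table index."""
        i, x = divmod(i, 64)
        i, bk = divmod(i, 64)
        stm, wk = divmod(i, 64)
        return stm, wk, bk, x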

Relevance:

10.00%

Publisher:

Abstract:

The paper considers meta-analysis of diagnostic studies that use a continuous score for classification of study participants into healthy or diseased groups. Classification is often done on the basis of a threshold or cut-off value, which might vary between studies. Consequently, conventional meta-analysis methodology focusing solely on separate analyses of sensitivity and specificity might be confounded by a potentially unknown variation of the cut-off value. To cope with this phenomenon it is suggested to use, instead, an overall estimate of the misclassification error previously suggested and used as Youden’s index; furthermore, it is argued that this index is less prone to between-study variation of cut-off values. A simple Mantel–Haenszel estimator, which adjusts for a potential study effect, is suggested as a summary measure of the overall misclassification error. The measure of the misclassification error based on Youden’s index is advantageous in that it easily allows an extension to a likelihood approach, which is then able to cope with unobserved heterogeneity via a nonparametric mixture model. All methods are illustrated with an example of a diagnostic meta-analysis on duplex Doppler ultrasound, with angiography as the standard, for stroke prevention.
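
As a concrete illustration (not the paper’s Mantel–Haenszel estimator or likelihood model), the sketch below computes Youden’s index per study from a 2x2 table and pools it with simple study-size weights; the weighting is an assumption made only to keep the example short.

    def youden_index(tp, fn, tn, fp):
        """Youden's index J = sensitivity + specificity - 1 for one 2x2 table."""
        sensitivity = tp / (tp + fn)
        specificity = tn / (tn + fp)
        return sensitivity + specificity - 1.0

    def pooled_youden(studies):
        """Crude pooled Youden index across studies, weighted by study size.

        `studies` is a list of (tp, fn, tn, fp) tuples, one per study.  J is
        computed within each study at that study's own cut-off, which is why a
        Youden-based summary is less disturbed by between-study variation of
        the threshold than pooling sensitivity and specificity separately.
        The size weighting here is a placeholder, not the Mantel-Haenszel
        estimator of the paper.
        """
        weights = [tp + fn + tn + fp for tp, fn, tn, fp in studies]
        values = [youden_index(*s) for s in studies]
        return sum(w * v for w, v in zip(weights, values)) / sum(weights)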

Relevance:

10.00%

Publisher:

Abstract:

Can human social cognitive processes and social motives be grasped by the methods of experimental economics? Experimental studies of strategic cognition and social preferences contribute to our understanding of the social aspects of economic decision-making. Yet, papers in this issue argue that the social aspects of decision-making introduce several difficulties for interpreting the results of economic experiments. In particular, the laboratory is itself a social context, and in many respects a rather distinctive one, which raises questions of external validity.

Relevance:

10.00%

Publisher:

Abstract:

We consider the two-point boundary value problem for stiff systems of ordinary differential equations. For systems that can be transformed to essentially diagonally dominant form with appropriate smoothness conditions, a priori estimates are obtained. Problems with turning points can be treated with this theory, and we discuss this in detail. We give robust difference approximations and present error estimates for these schemes. In particular we give a detailed description of how to transform a general system to essentially diagonally dominant form and then stretch the independent variable so that the system will satisfy the correct smoothness conditions. Numerical examples are presented for both linear and nonlinear problems.
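
The following sketch applies the general idea to a standard model problem; the equation, the uniform grid, and the simple upwind differencing are illustrative assumptions, not the paper’s scheme or its transformation and stretching procedure.

    import numpy as np

    def solve_stiff_bvp(eps=1e-2, n=200):
        """Upwind finite differences for the model stiff two-point BVP

            eps * y'' + y' = 0,   y(0) = 0,   y(1) = 1,

        which has a boundary layer of width O(eps) at x = 0.  The convective
        term is differenced in the upwind direction so the tridiagonal system
        is diagonally dominant, echoing in a very simplified way the
        'essentially diagonally dominant form' referred to in the abstract.
        """
        h = 1.0 / n
        x = np.linspace(0.0, 1.0, n + 1)
        A = np.zeros((n - 1, n - 1))
        b = np.zeros(n - 1)
        for i in range(1, n):                           # interior nodes x_1 .. x_{n-1}
            r = i - 1
            A[r, r] = -2.0 * eps / h**2 - 1.0 / h       # coefficient of y_i
            if r > 0:
                A[r, r - 1] = eps / h**2                # coefficient of y_{i-1}
            if r < n - 2:
                A[r, r + 1] = eps / h**2 + 1.0 / h      # coefficient of y_{i+1}
        # boundary values: y_0 = 0 contributes nothing; y_n = 1 enters the last row
        b[-1] -= (eps / h**2 + 1.0 / h) * 1.0
        y = np.concatenate(([0.0], np.linalg.solve(A, b), [1.0]))
        return x, y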

Relevance:

10.00%

Publisher:

Abstract:

A detailed analysis is undertaken of the Atlantic-European climate using data from 500-year-long proxy-based climate reconstructions, a long climate simulation with perpetual 1990 forcing, as well as two global and one regional climate change scenarios. The observed and simulated interannual variability and teleconnectivity are compared and interpreted in order to improve the understanding of natural climate variability on interannual to decadal time scales for the late Holocene. The focus is on the Atlantic-European and Alpine regions during the winter and summer seasons, using temperature, precipitation, and 500 hPa geopotential height fields. The climate reconstruction shows pronounced interdecadal variations that appear to “lock” the atmospheric circulation in quasi-steady long-term patterns over multi-decadal periods, controlling at least part of the temperature and precipitation variability. Different circulation patterns are persistent over several decades for the period 1500 to 1900. The 500-year-long simulation with perpetual 1990 forcing shows some substantial differences, with a more unsteady teleconnectivity behaviour. Two global scenario simulations indicate a transition towards more stable teleconnectivity for the next 100 years. Time series of reconstructed and simulated temperature and precipitation over the Alpine region show comparatively small changes in interannual variability within the time frame considered, with the exception of the summer season, where a substantial increase in interannual variability is simulated by regional climate models.

Relevance:

10.00%

Publisher:

Abstract:

A quasi-optical interferometric technique capable of measuring antenna phase patterns without the need for a heterodyne receiver is presented. It is particularly suited to the characterization of terahertz antennas feeding power detectors or mixers employing quasi-optical local oscillator injection. Examples of recorded antenna phase patterns at frequencies of 1.4 and 2.5 THz using homodyne detectors are presented. To our knowledge, these are the highest-frequency antenna phase patterns ever recovered. Knowledge of both the amplitude and phase patterns in the far field enables a Gauss-Hermite or Gauss-Laguerre beam-mode analysis to be carried out for the antenna, which is of importance in performance optimization calculations, such as antenna gain and beam efficiency parameters, at the design and prototype stage of antenna development. A full description of the beam would also be required if the antenna is to be used to feed a quasi-optical system in the near-field to far-field transition region. This situation could often arise when the device is fitted directly at the back of telescopes in flying observatories. A further benefit of the proposed technique is its simplicity for characterizing systems in situ, an advantage of considerable importance as, in many situations, the components may not be removable for further characterization once assembled. The proposed methodology is generic and should be useful across the wider sensing community, e.g., in single-detector acoustic imaging or in adaptive imaging array applications. Furthermore, it is applicable across other frequencies of the EM spectrum, provided adequate spatial and temporal phase stability of the source can be maintained throughout the measurement process. Phase information retrieval is also of importance to emergent research areas, such as band-gap structure characterization, meta-materials research, electromagnetic cloaking, slow light, super-lens design, as well as near-field and virtual imaging applications.
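
To make the beam-mode analysis step concrete, the sketch below projects a measured complex field on a one-dimensional cut onto orthonormal Gauss-Hermite modes; the mode scale w, the uniform sampling grid, and the restriction to one dimension are simplifying assumptions (a real antenna characterization would fit the beam waist and phase-front curvature and use two-dimensional modes).

    import numpy as np
    from numpy.polynomial.hermite import hermval
    from math import factorial, pi, sqrt

    def hermite_gauss(n, x, w):
        """Orthonormal 1-D Gauss-Hermite mode of order n with scale w:
        psi_n(x) = (2**n * n! * sqrt(pi) * w)**(-1/2) * H_n(x/w) * exp(-x**2/(2*w**2)),
        so that the modes are orthonormal under integration over x."""
        c = np.zeros(n + 1)
        c[n] = 1.0                                    # select the physicists' H_n
        norm = 1.0 / sqrt(2.0**n * factorial(n) * sqrt(pi) * w)
        return norm * hermval(x / w, c) * np.exp(-x**2 / (2.0 * w**2))

    def mode_coefficients(x, field, w, n_modes=6):
        """Project a sampled complex far-field (amplitude * exp(1j*phase)) on a
        1-D cut onto the first n_modes Gauss-Hermite modes.  The fraction of
        power captured is sum(|a_n|**2) divided by the integrated beam power."""
        dx = x[1] - x[0]                              # uniform sampling grid assumed
        coeffs = []
        for n in range(n_modes):
            psi = hermite_gauss(n, x, w)
            coeffs.append(np.sum(field * psi) * dx)   # psi is real: no conjugate needed
        return np.array(coeffs)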