254 results for Murdock, Guy
Abstract:
The ICGA and its members can now communicate with each other and a wider public, as never before. The ICGA website ICGA_W, www.icga.org, will complement the ICGA Journal, providing more space and access to an evolving and wider range of items including both topical news and definitive reference material.
Abstract:
It is now 32 years since Ströhlein’s pioneering computation of KRKN and ten years since the publication of Nunn’s Secrets of Rook Endings. This book defined a new genre under his authorship and editorship (Nunn, 1992, 1994, 1995; Müller and Lamprecht, 1999, 2001) and has merited a second edition. Now comes the second edition of Secrets of Pawnless Endings.
Abstract:
Our ability to identify, acquire, store, enquire on and analyse data is increasing as never before, especially in the GIS field. Technologies are becoming available to manage a wider variety of data and to make intelligent inferences on that data. The mainstream arrival of large-scale database engines is not far away. The experience of using the first such products tells us that they will radically change data management in the GIS field.
Abstract:
The EP2025 EDS project develops a highly parallel information server that supports established high-value interfaces. We describe the motivation for the project, the architecture of the system, and the design and application of its database and language subsystems. The Elipsys logic programming language, its advanced applications, EDS Lisp, and the Metal machine translation system are examined.
Abstract:
Within the Information Technology degree programme of the University of Reading, the students undertake a major project in their final year. The module is both a hurdle to an honours degree and significant in terms of assessment weighting. The two-year history so far has shown that bad citation and plagiarism are issues, and in one case called for the due referral of a project report. In the light of experience to date, we are focusing firstly on plagiarism prevention, giving generic advice on report writing and citation practice, and secondly on detection. In the longer term, I believe we need to reflect on what capabilities we should be creating in our undergraduates and therefore what and how we should be assessing them.
Abstract:
Presentation on pre-emption, detection and redirection in the context of the contract cheating form of plagiarism.
Abstract:
From the beginning, the world of game-playing by machine has been fortunate in attracting contributions from the leading names of computer science. Charles Babbage, Konrad Zuse, Claude Shannon, Alan Turing, John von Neumann, John McCarthy, Allen Newell, Herb Simon and Ken Thompson all come to mind, and each reader will wish to add to this list. Recently, the Journal has saluted both Claude Shannon and Herb Simon. Ken’s retirement from Lucent Technologies’ Bell Labs to the start-up Entrisphere is also a good moment for reflection.
Abstract:
Ken Thompson recently communicated some results mined from his set of 64 6-man endgame tables. These list some positions of interest, namely, mutual zugzwangs and those of maximum depth. The results have been analysed by the authors and found to be identical or compatible with the available or published findings of Karrer, Nalimov, Stiller and Wirth.
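For readers unfamiliar with the term, a mutual zugzwang is a position in which whichever side is to move obtains a strictly worse result than it would if the obligation to move lay with the opponent. The following sketch only assumes a hypothetical lookup value(position, side_to_move) returning the result from White's point of view (+1 win, 0 draw, -1 loss); it illustrates the test and is not Thompson's code.

```python
# Hypothetical value(position, side) lookup: game-theoretic result from
# White's point of view (+1 White wins, 0 draw, -1 Black wins).
def is_mutual_zugzwang(position, value):
    white_to_move = value(position, "white")   # result with White to move
    black_to_move = value(position, "black")   # result with Black to move
    # If the result is strictly worse for White when White must move, then by
    # symmetry it is also strictly worse for Black when Black must move, so a
    # single inequality captures both halves of the definition.
    return white_to_move < black_to_move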
Abstract:
Chess endgame tables should provide efficiently the value and depth of any required position during play. The indexing of an endgame’s positions is crucial to meeting this objective. This paper updates Heinz’ previous review of approaches to indexing and describes the latest approach by the first and third authors. Heinz’ and Nalimov’s endgame tables (EGTs) encompass the en passant rule and have the most compact index schemes to date. Nalimov’s EGTs, to the Distance-to-Mate (DTM) metric, require only 30.6 × 10^9 elements in total for all the 3-to-5-man endgames and are individually more compact than previous tables. His new index scheme has proved itself while generating the tables and in the 1999 World Computer Chess Championship where many of the top programs used the new suite of EGTs.
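As a rough illustration of what an index scheme does, the sketch below maps the side to move and the piece squares of a position to a single integer by mixed-radix encoding. This naive scheme is neither Heinz's nor Nalimov's: practical schemes gain their compactness by exploiting board symmetries, excluding illegal placements and handling the en passant cases.

```python
def naive_index(side_to_move, squares):
    """side_to_move: 0 (White) or 1 (Black); squares: one 0..63 entry per piece,
    in a fixed piece order agreed between table generator and prober."""
    index = side_to_move
    for sq in squares:
        index = index * 64 + sq        # one radix-64 digit per piece
    return index

# A 5-man endgame addressed this way needs 2 * 64**5, about 2.1e9, slots;
# symmetry-aware schemes such as those reviewed in the paper are far smaller.
print(naive_index(0, [4, 60, 0, 7, 28]))
```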
Abstract:
The 1999 Kasparov-World game for the first time enabled anyone to join a team playing against a World Chess Champion via the web. It included a surprise in the opening, complex middle-game strategy and a deep ending. As the game headed for its mysterious finale, the World Team requested a KQQKQQ endgame table and was provided with two by the authors. This paper describes their work, compares the methods used, examines the issues raised and summarises the concepts involved for the benefit of future workers in the endgame field. It also notes the contribution of this endgame to chess itself.
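One of the concepts involved is retrograde analysis, the backward-induction idea underlying endgame-table generation in general. The sketch below is a minimal fixed-point version that computes values only (win/draw/loss for the side to move), assuming hypothetical helpers is_checkmate(p) and legal_moves(p); it illustrates the technique rather than the authors' KQQKQQ generators, and depth metrics such as DTM need a level-by-level refinement of the same sweep.

```python
def solve(all_positions, is_checkmate, legal_moves):
    """Game-theoretic value for the side to move in each position."""
    value = {p: "loss" for p in all_positions if is_checkmate(p)}   # mated now
    changed = True
    while changed:
        changed = False
        for p in all_positions:
            if p in value:
                continue
            succ = legal_moves(p)
            if any(value.get(q) == "loss" for q in succ):
                value[p] = "win"      # some move hands the opponent a lost position
                changed = True
            elif succ and all(value.get(q) == "win" for q in succ):
                value[p] = "loss"     # every move hands the opponent a won position
                changed = True
    for p in all_positions:
        value.setdefault(p, "draw")   # stalemates and unresolvable positions
    return value
```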
Abstract:
The Kasparov-World match was initiated by Microsoft with sponsorship from the bank First USA. The concept was that Garry Kasparov as White would play the rest of the world on the Web: one ply would be played per day and the World Team was to vote for its move. The Kasparov-World game was a success from many points of view. It certainly gave thousands the feeling of facing the world’s best player across the board and did much for the future of the game. Described by Kasparov as “phenomenal ... the most complex in chess history”, it is probably a worthy ‘Greatest Game’ candidate. Computer technology has given chess a new mode of play and taken it to new heights: the experiment deserves to be repeated. We look forward to another game and experience of this quality although it will be difficult to surpass the event we have just enjoyed. We salute and thank all those who contributed - sponsors, moderator, coaches, unofficial analysts, organisers, technologists, voters and our new friends.
Abstract:
The latest 6-man chess endgame results confirm that there are many deep forced mates beyond the 50-move rule. Players with potential wins near this limit naturally want to avoid a claim for a draw: optimal play to current metrics does not guarantee feasible wins or maximise the chances of winning against fallible opposition. A new metric and further strategies are defined which support players’ aspirations and improve their prospects of securing wins in the context of a k-move rule.
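The metric and strategies themselves are defined in the paper; the fragment below merely illustrates the kind of feasibility constraint a k-move rule imposes, using a hypothetical dtz(position) lookup giving the number of moves the winner needs to force the next capture or pawn move (which resets the rule's counter), measured in the same units as that counter.

```python
def win_remains_feasible(position, moves_since_last_zeroing, dtz, k=50):
    """Under a k-move rule, a win survives only if the counter can be reset
    (by a forced capture or pawn move) before a draw can be claimed."""
    moves_left = k - moves_since_last_zeroing
    return dtz(position) <= moves_left
```

Play that is optimal in a metric such as DTM alone can fail this test while a slightly slower line passes it, which is the gap the abstract points to.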
Abstract:
An examination of the deepest win in KRNKNN in the context of Ken Thompson's results.
Abstract:
In multi-tasking systems, when it is not possible to guarantee completion of all activities by specified times, the scheduling problem is not straightforward. Examples of this situation in real-time programming include the occurrence of alarm conditions and the buffering of output to peripherals in on-line facilities. The latter case is studied here in the hope of indicating one solution to the general problem.
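As a modern, illustrative restatement of the buffering situation (not the solution proposed in the paper), output destined for a slow peripheral can be pushed through a bounded queue and drained by a separate writer task, so that the producing tasks are not held up by the device.

```python
import queue
import threading

out = queue.Queue(maxsize=1024)            # bounded buffer between tasks and the peripheral

def writer():
    while True:
        item = out.get()
        if item is None:                   # sentinel marking end of output
            break
        print(item)                        # stand-in for a slow peripheral write

t = threading.Thread(target=writer)
t.start()
for line in ("report line 1", "report line 2"):
    out.put(line)                          # producing tasks enqueue and continue
out.put(None)
t.join()
```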