512 results for Balducci, Lemmo, -1389.


Relevance: 10.00%
Abstract:

Van der Heijden’s ENDGAME STUDY DATABASE IV, HhdbIV, is the definitive collection of 76,132 chess studies. In each one, White is to achieve the stipulated goal, win or draw: study solutions should be essentially unique with minor alternatives at most. In this second note on the mining of the database, we use the definitive Nalimov endgame tables to benchmark White’s moves in sub-7-man chess against this standard of uniqueness. Amongst goal-compatible mainline positions and goal-achieving moves, we identify the occurrence of absolutely unique moves and analyse the frequency and lengths of absolutely-unique-move sequences, AUMSs. We identify the occurrence of equi-optimal moves and suboptimal moves and refer to a defined method for classifying their significance.
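To make the benchmarking concrete: a minimal sketch, assuming a local tablebase is available, of how one might count the goal-achieving moves in a sub-7-man position and flag the 'absolutely unique' case. The note itself uses Nalimov DTM tables; this example substitutes Syzygy WDL probes via the python-chess library, and the tablebase path and FEN are placeholders.

```python
# Minimal sketch (not the note's pipeline): flag positions in which exactly one
# White move achieves the stipulated goal (here, the win). Syzygy WDL tables are
# used as a stand-in for the Nalimov DTM tables used in the study.
import chess
import chess.syzygy

def goal_achieving_moves(board: chess.Board, tb: chess.syzygy.Tablebase):
    """Return the moves after which the side then to move (the defender) is lost."""
    keep = []
    for move in board.legal_moves:
        board.push(move)
        try:
            if tb.probe_wdl(board) < 0:   # WDL is from the viewpoint of the side to move
                keep.append(move)
        finally:
            board.pop()
    return keep

# "./syzygy" and the FEN below are placeholders for illustration only.
with chess.syzygy.open_tablebase("./syzygy") as tb:
    board = chess.Board("8/8/8/8/3k4/8/3P4/3K4 w - - 0 1")
    wins = goal_achieving_moves(board, tb)
    print(len(wins) == 1, [board.san(m) for m in wins])   # True => an absolutely unique move
```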

Relevance: 10.00%
Abstract:

Van der Heijden’s ENDGAME STUDY DATABASE IV, HHDBIV, is the definitive collection of 76,132 chess studies. The zugzwang position or zug, one in which the side to move would prefer not to, is a frequent theme in the literature of chess studies. In this third data-mining of HHDBIV, we report on the occurrence of sub-7-man zugs there as discovered by the use of CQL and Nalimov endgame tables (EGTs). We also mine those Zugzwang Studies in which a zug more significantly appears in both its White-to-move (wtm) and Black-to-move (btm) forms. We provide some illustrative and extreme examples of zugzwangs in studies.
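For readers curious how such zugs can be detected mechanically: a minimal sketch, assuming a local tablebase, that flags a position as a zug when the side to move would score better if it could pass. The article's own mining used CQL together with Nalimov DTM tables; this stand-in uses Syzygy WDL probes via python-chess, so it only distinguishes win/draw/loss zugs.

```python
# Illustrative stand-in, not the article's CQL/Nalimov method.
import chess
import chess.syzygy

def null_move_board(board: chess.Board):
    """The same position with the other side to move, or None if that is illegal."""
    b = board.copy(stack=False)
    b.turn = not b.turn
    b.ep_square = None                 # a passed move cannot carry an en-passant right
    return b if b.is_valid() else None

def is_zug(board: chess.Board, tb: chess.syzygy.Tablebase) -> bool:
    """True when the side to move would prefer to hand the move to the opponent."""
    passed = null_move_board(board)
    if passed is None:
        return False
    value_here = tb.probe_wdl(board)          # value for the side to move
    value_if_passed = -tb.probe_wdl(passed)   # its value if the opponent had to move
    return value_here < value_if_passed

# "./syzygy" is a placeholder path; the FEN is an arbitrary sub-7-man position.
with chess.syzygy.open_tablebase("./syzygy") as tb:
    print(is_zug(chess.Board("8/8/8/3k4/8/3KP3/8/8 b - - 0 1"), tb))
```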

Relevance: 10.00%
Abstract:

Reports the availability on the web of the entire run of Beasley's 'British Endgame Study News', and reviews a recent report by Bourzutschky and Konoval on their discoveries with 7-man endgame tables.

Relevance: 10.00%
Abstract:

In this contribution we aim to anchor Agent-Based Modeling (ABM) simulations in actual models of human psychology. More specifically, we apply unidirectional ABM to social psychological models using low-level (i.e., intra-individual) agents, to examine whether they generate better predictions of behavioral intentions and of the behavior itself than standard statistical approaches. Moreover, this contribution tests to what extent the predictive validity of models of attitude such as the Theory of Planned Behavior (TPB) or the Model of Goal-directed Behavior (MGB) depends on the assumption that people's decisions and actions are purely rational. Simulations were therefore run with agents deviating from rationality to varying degrees, using a trembling-hand method. Two data sets were used, concerning the consumption of soft drinks and physical activity respectively. Three key findings emerged from the simulations. First, compared to the standard statistical approach, the agent-based simulation generally improves the prediction of behavior from intention. Second, the improvement in prediction is inversely proportional to the complexity of the underlying theoretical model. Finally, introducing varying degrees of deviation from rationality into agents' behavior can improve the goodness of fit of the simulations. By demonstrating the potential of ABM as a complementary perspective for evaluating social psychological models, this contribution underlines the necessity of better defining agents in terms of psychological processes before examining higher levels such as the interactions between individuals.
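To illustrate the trembling-hand idea in isolation: a minimal sketch of an intra-individual agent whose intention follows a TPB-style weighted sum and whose behavior occasionally deviates from that intention at random. The weights, threshold and deviation rate are invented for illustration and are not the parameters of the study's models.

```python
import random

def tpb_intention(attitude, norm, control, w=(0.4, 0.3, 0.3)):
    """Intention as a weighted sum of TPB components, each scaled to 0-1 (illustrative weights)."""
    return w[0] * attitude + w[1] * norm + w[2] * control

def act(intention, epsilon=0.1, threshold=0.5):
    """Perform the behavior when intention clears the threshold, except that with
    probability epsilon the agent 'trembles' and acts at random."""
    if random.random() < epsilon:
        return random.choice([True, False])
    return intention >= threshold

intention = tpb_intention(attitude=0.8, norm=0.6, control=0.7)
behaviors = [act(intention, epsilon=0.2) for _ in range(1000)]
print(sum(behaviors) / len(behaviors))   # observed behavior rate under the trembling hand
```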

Relevance: 10.00%
Abstract:

This paper explores the impact of the re-introduction of access restrictions to forests in Tanzania, introduced through participatory forest management (PFM), which have excluded villagers from forests to which they have traditionally, albeit illegally, had access to collect non-timber forest products (NTFPs). Motivated by our fieldwork, and using a spatial–temporal model, we focus on the paths of forest degradation and regeneration and on villagers' utility before and after an access restriction is introduced. Our paper illustrates a number of key points for policy makers. First, the benefits of forest conservation tend to be greatest in the first few periods after an access restriction is introduced, after which overall forest quality often declines. Second, villagers may displace their NTFP collection into more distant forests that may have been completely protected by distance alone before access to a closer forest was restricted. Third, permitting villagers to collect limited amounts of NTFPs for a fee, or alternatively fining villagers caught collecting illegally from the protected forest, and returning the fee or fine revenue to the villagers, can improve both forest quality and villagers' livelihoods.
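A toy numerical sketch of the displacement effect described above, under invented dynamics and parameters (logistic regrowth, a fixed harvest, a travel cost once collection is displaced); it is not the paper's spatial-temporal model.

```python
# Toy illustration only: closing the near forest displaces NTFP collection
# to a more distant forest; quality indices lie in [0, 1].
def simulate(periods=40, closure_at=10, regrow=0.10, harvest=0.06, travel_cost=0.02):
    near, far = 0.8, 1.0
    for t in range(periods):
        closed = t >= closure_at
        take_near = 0.0 if closed else harvest
        take_far = harvest if closed else 0.0
        near = min(1.0, max(0.0, near + regrow * near * (1.0 - near) - take_near))
        far = min(1.0, max(0.0, far + regrow * far * (1.0 - far) - take_far))
        # Villagers' per-period payoff: NTFPs collected minus extra travel once displaced.
        utility = harvest - (travel_cost if closed else 0.0)
        yield t, round(near, 3), round(far, 3), round(utility, 3)

for row in simulate():
    if row[0] % 10 == 0:
        print(row)   # (period, near-forest quality, far-forest quality, utility)
```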

Relevance: 10.00%
Abstract:

Biomass partitioning of cacao (Theobroma cacao L.) was studied in seven clones and five hybrids in a replicated experiment in Bahia, Brazil. Over an eighteen-month period, a seven-fold difference in dry bean yield was demonstrated between genotypes, ranging from the equivalent of 200 to 1389 kg ha⁻¹. During the same interval, the increase in trunk cross-sectional area ranged from 11.1 cm² for clone EEG-29 to 27.6 cm² for hybrid PA-150 × MA-15. Yield efficiency increment (the ratio of cumulative yield to the increase in trunk cross-sectional area), which indicated partitioning between the vegetative and reproductive components, ranged from 0.008 kg cm⁻² for clone CP-82 to 0.08 kg cm⁻² for clone EEG-29. An examination of biomass partitioning within the pod of the seven clones revealed that the beans accounted for between 32.0% (CP-82) and 44.5% (ICS-9) of the pod biomass. The study demonstrated the potential for yield improvement in cacao by selectively breeding for more efficient partitioning to the yield component.
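For reference, the yield efficiency increment quoted above is simply the cumulative dry bean yield divided by the gain in trunk cross-sectional area over the same interval; in illustrative notation (the symbols are not the paper's):

```latex
\mathrm{YEI} \;=\; \frac{Y_{\mathrm{cum}}}{A(t_1) - A(t_0)} \qquad \left[\mathrm{kg\,cm^{-2}}\right]
```

where Y_cum is the cumulative dry bean yield (kg) and A(t) the trunk cross-sectional area (cm²) at time t.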

Relevance: 10.00%
Abstract:

The feasibility of halving greenhouse gas emissions from hotels by 2030 has been studied as part of the Carbon Vision Buildings Programme. The aim of that programme was to study ways of reducing emissions from the existing stock, because it will be responsible for the majority of building emissions over the next few decades. The work was carried out by detailed computer simulation using the ESP-r tool. Two hotels were studied, one older and converted and the other newer and purpose-built, with the aim of representing the most common UK hotel types. The effects of interventions expected to be available in 2030 were studied, including fabric improvements, HVAC changes, lighting and appliance improvements and renewable energy generation. The main finding was that it is technically feasible to reduce emissions by 50% without compromising guest comfort. Ranking of the interventions was problematic for several reasons, including interdependence and the impact of large reductions in the heating load on boiler sizing.

Relevance: 10.00%
Abstract:

This article reviews the KQPKQP endgame of the ROOKIE-BARON game of the World Computer Chess Championship, 2011. It also reviews the decisive KRNPKBP endgame in the second Anand-Gelfand rapid game of the World Chess Championship 2012. There is a review of parts 2-3 of the Bourzutschky-Konoval 7-man endgame series in EG, of the new endgame software tool FinalGen, and of the 'Lomonosov' endgame table generation programme in Moscow.

Relevance: 10.00%
Abstract:

It has long been supposed that preference judgments between sets of to-be-considered possibilities are made by initially winnowing the options down to the most promising-looking alternatives, forming smaller “consideration sets” (Howard, 1963; Wright & Barbour, 1977). In preference choices with >2 options, it is standard to assume that a “consideration set”, based upon some simple criterion, is established to reduce the options available. Inferential judgments, in contrast, have more frequently been investigated in situations in which only two possibilities need to be considered (e.g., which of these two cities is the larger?). Proponents of the “fast and frugal” approach to decision-making suggest that such judgments are also made on the basis of limited, simple criteria. For example, if only one of two cities is recognized and the task is to judge which city has the larger population, the recognition heuristic states that the recognized city should be selected. A multinomial processing tree model is outlined which provides the basis for estimating the extent to which recognition is used as a criterion in establishing a consideration set for inferential judgments between three possible options.
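A minimal sketch of the decision rule whose usage rate such a model is designed to estimate: recognition fixes the consideration set, and only otherwise is further knowledge consulted. The probability parameter, cue values and fallback rule are illustrative assumptions, not the paper's multinomial processing tree model.

```python
import random

def judge(options, recognized, knowledge, p_use_recognition=0.8):
    """Pick one of three options (e.g., the largest city), using recognition
    to form a consideration set with probability p_use_recognition."""
    consider = [o for o in options if o in recognized] or list(options)
    if len(consider) < len(options) and random.random() < p_use_recognition:
        return random.choice(consider)          # recognition alone decides the set
    return max(options, key=lambda o: knowledge.get(o, 0.0))   # fall back on knowledge cues

cities = ["A", "B", "C"]
print(judge(cities, recognized={"B"}, knowledge={"A": 0.2, "B": 0.5, "C": 0.9}))
```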

Relevance: 10.00%
Abstract:

When ε-nitro-α,β-unsaturated esters are added to conjugated cyanosulfones in the presence of a bifunctional thiourea catalyst, a highly stereoselective domino reaction occurs to generate complex cyclohexanes with up to four stereogenic centers, one of which is quaternary in nature. Therefore, it is demonstrated that, like nitro compounds, sulfones can undergo an asymmetric intramolecular conjugate addition to α,β-unsaturated esters in the presence of a bifunctional organocatalyst.

Relevance: 10.00%
Abstract:

This review starts with a demonstration of the power of FinalGen and the new Lomonosov 7-man endgame tables, each giving an alternative 'bionic' ending to the 'five Queens' Hao-Carlsen (Tata Chess 2013) game. The completion of the Lomonosov 7-man DTM EGTs is announced. The final two parts of the Bourzutschky-Konoval 7-man-chess series in EG are summarised.

Relevance: 10.00%
Abstract:

The 'Turing 100' Conference in Manchester was the main event of the Turing Centenary Year in 2012. This is a report and reflection on Kasparov's popular talk. Within it, he explained how Turing had influenced computer chess, his career and the chess community. Kasparov also played Chessbase's 'TURING' emulation of Turing's second paper chess engine, here labelled 'AT2'. Quasi Turing-tests, computer contributions to world championship chess, and suspected cheating in chess are also mentioned.

Relevance: 10.00%
Abstract:

Theorem-proving is a one-player game. The history of computer programs as the players goes back to 1956 and the ‘LT’ LOGIC THEORY MACHINE of Newell, Shaw and Simon. In game-playing terms, the ‘initial position’ is the core set of axioms chosen for the particular logic and the ‘moves’ are the rules of inference. Now, the Univalent Foundations Program at IAS Princeton and the resulting ‘HoTT’ book on Homotopy Type Theory have demonstrated the success of a new kind of experimental mathematics using computer theorem-proving.

Relevance: 10.00%
Abstract:

Three topics are discussed. First, an issue in the epistemology of computer simulation - that of the chess endgame 'becoming' what computer-generated data says it is. Secondly, the endgames of the longest known games are discussed, and the concept of a Bionic Game is defined. Lastly, the set of record-depth positions published by Bourzutschky and Konoval are evaluated by the new MVL tables in Moscow - alongside the deepest known mate of 549 moves.

Relevance: 10.00%
Abstract:

The 20th World Computer Chess Championship took place in Yokohama, Japan during August 2013. It was narrowly won by JUNIOR from JONNY with HIARCS, PANDIX, SHREDDER and MERLIN occupying the remaining positions. There are references to the detailed chess biographies of the engines and engine-authors in the Chessprogramming Wiki. The games, occasionally annotated, are available here.