900 results for Chess, Four-handed.
Abstract:
Some 50,000 Win Studies in Chess challenge White to find an effectively unique route to a win. Judging the impact of less than absolute uniqueness requires both technical analysis and artistic judgment. Here, for the first time, an algorithm is defined to help analyse uniqueness in endgame positions objectively. The key idea is to examine how critical certain positions are to White in achieving the win. The algorithm uses sub-n-man endgame tables (EGTs) for both Chess and relevant, adjacent variants of Chess. It challenges authors of EGT generators to generalise them to create EGTs for these chess variants. It has already proved efficient and effective in an implementation for Starchess, itself a variant of chess. The approach also addresses a number of similar questions arising in endgame theory, games and compositions.
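The notion of an "effectively unique" winning move can be sketched with an EGT probe. Everything below is invented toy data (positions, moves, and depth-to-mate values); a real implementation would query Chess or chess-variant EGTs:

```python
# Sketch: judging move-uniqueness with an endgame table (EGT).
# Positions, moves, and DTM values are toy stand-ins -- a real
# implementation would probe Chess (or chess-variant) EGTs.

# Hypothetical depth-to-mate values from White's point of view:
# positive n = White mates in n moves; None = not a White win.
DTM = {"a": 3, "b": 2, "c": 4, "d": None}

# Hypothetical move generator: position -> successor positions.
MOVES = {"a": ["b", "c", "d"]}

def winning_moves(pos):
    """Moves that preserve the win and reduce depth-to-mate."""
    here = DTM[pos]
    if here is None:
        return []
    return [q for q in MOVES[pos]
            if DTM[q] is not None and DTM[q] < here]

def is_effectively_unique(pos):
    """A study-like position: exactly one move keeps the win on track."""
    return len(winning_moves(pos)) == 1

print(winning_moves("a"))          # only "b" shortens the win
print(is_effectively_unique("a"))
```

In this toy position "a", the moves to "c" and "d" either lengthen or throw away the win, so only the move to "b" counts, making the position critical in the sense the abstract describes.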
Abstract:
This paper presents evidence for several features of the population of chess players, and the distribution of their performances measured in terms of Elo ratings and by computer analysis of moves. Evidence that ratings have remained stable since the inception of the Elo system in the 1970s is given in several forms: by showing that the population of strong players fits a simple logistic-curve model without inflation, by plotting players’ average error against the FIDE category of tournaments over time, and by showing that skill parameters from a model employing computer analysis keep a nearly constant relation to Elo rating across that time. The distribution of the model’s Intrinsic Performance Ratings can hence be used to compare populations that have limited interaction, such as between players in a national chess federation and FIDE, and ascertain relative drift in their respective rating systems.
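The logistic-curve population model lends itself to a short illustration. The sketch below fits N(t) = K / (1 + exp(-r(t - t0))) to synthetic data with the carrying capacity K assumed known; all numbers are invented, not the paper's fitted values:

```python
import numpy as np

# Sketch of a logistic-growth model for the population of strong
# players: N(t) = K / (1 + exp(-r * (t - t0))). Synthetic data only.
K, r_true, t0_true = 1000.0, 0.12, 1990.0
years = np.arange(1975, 2010)
N = K / (1.0 + np.exp(-r_true * (years - t0_true)))

# With K assumed, the model linearises:
# log(K / N - 1) = -r * (t - t0), so a straight-line fit recovers r, t0.
y = np.log(K / N - 1.0)
slope, intercept = np.polyfit(years, y, 1)
r_fit = -slope
t0_fit = intercept / r_fit

print(round(r_fit, 3), round(t0_fit, 1))
```

On noise-free synthetic data the fit recovers the generating parameters exactly; testing for inflation on real rating lists would compare such fits across decades.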
Abstract:
The A-Train constellation of satellites provides a new capability to measure vertical cloud profiles that leads to more detailed information on ice-cloud microphysical properties than has been possible up to now. A variational radar–lidar ice-cloud retrieval algorithm (VarCloud) takes advantage of the complementary nature of the CloudSat radar and Cloud–Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO) lidar to provide a seamless retrieval of ice water content, effective radius, and extinction coefficient from the thinnest cirrus (seen only by the lidar) to the thickest ice cloud (penetrated only by the radar). In this paper, several versions of the VarCloud retrieval are compared with the CloudSat standard ice-only retrieval of ice water content, two empirical formulas that derive ice water content from radar reflectivity and temperature, and retrievals of vertically integrated properties from the Moderate Resolution Imaging Spectroradiometer (MODIS) radiometer. The retrieved variables typically agree to within a factor of 2, on average, and most of the differences can be explained by the different microphysical assumptions. For example, the ice water content comparison illustrates the sensitivity of the retrievals to the assumed ice particle shape. If ice particles are modeled as oblate spheroids rather than spheres for radar scattering, then the retrieved ice water content is reduced on average by 50% in clouds with a reflectivity factor larger than 0 dBZ. VarCloud retrieves optical depths that are on average a factor of 2 lower than those from MODIS, which can be explained by the different assumptions on particle mass and area; if VarCloud mimics the MODIS assumptions, then better agreement is found in effective radius, and optical depth is overestimated.
However, MODIS predicts the mean vertically integrated ice water content to be around a factor of 3 lower than that from VarCloud for the same retrievals, because the MODIS algorithm assumes that its retrieved effective radius (which is mostly representative of cloud top) is constant throughout the depth of the cloud. These comparisons highlight the need to refine microphysical assumptions in all retrieval algorithms and also for future studies to compare not only the mean values but also the full probability density function.
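The variational idea behind a radar–lidar retrieval can be illustrated with a linear toy problem: minimise a cost function combining both instruments' observations, weighted by their error covariances, plus a prior term. The forward operators, covariances, and observation values below are invented; the real VarCloud scheme uses nonlinear scattering physics:

```python
import numpy as np

# Toy variational retrieval: find the state x (e.g. log ice water
# content per layer) minimising
#   J(x) = (y - H x)^T R^-1 (y - H x) + (x - xb)^T B^-1 (x - xb)
# All operators and numbers are invented for illustration.

n = 3                          # three cloud layers
xb = np.zeros(n)               # prior (background) state
B = np.eye(n) * 4.0            # loose prior covariance

H = np.vstack([np.eye(n),      # "radar" observes every layer
               np.eye(n)[:1]]) # "lidar" sees only the top layer
y = np.array([1.0, 2.0, 3.0, 1.2])
R = np.diag([0.5, 0.5, 0.5, 0.1])

# For a linear forward model the minimiser has a closed form:
# x = xb + (B^-1 + H^T R^-1 H)^-1 H^T R^-1 (y - H xb)
A = np.linalg.inv(B) + H.T @ np.linalg.inv(R) @ H
x = xb + np.linalg.solve(A, H.T @ np.linalg.inv(R) @ (y - H @ xb))
print(np.round(x, 2))
```

The top layer, observed twice, is pulled towards the more precise "lidar" value, which is the seamless-combination behaviour the abstract attributes to VarCloud; a nonlinear scheme iterates this step with a linearised forward model.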
Provider diversity in the English NHS: a study of recent developments in four local health economies
Abstract:
Objectives: The overall objective of the research was to assess the impact of provider diversity on quality and innovation in the English NHS. The aims were to map the extent of diverse provider activity, to identify the differences in performance between Third Sector Organisations (TSOs), for-profit private enterprises, and incumbent organisations within the NHS, and to identify the factors that affect the entry and growth of new private providers and TSOs. Methods: Case studies of four Local Health Economies (LHEs). Data included: semi-structured interviews with 48 managerial and clinical staff from NHS organisations and providers from the private and Third Sector; some documentary evidence; a focus group with service users; and routine data from the Care Quality Commission and Companies House. Data collection was mainly between November 2008 and November 2009. Results: Involvement of diverse providers in the NHS is limited. Commissioners’ local strategies influence degrees of diversity. Barriers to entry for TSOs include lack of economies of scale in the bidding process. Private providers show greater concern to improve patient pathways and patient experience, whereas TSOs deliver quality improvements by using a more holistic approach and a greater degree of community involvement. Entry of new providers drives NHS Trusts to respond by making improvements. Information sharing diminishes as competition intensifies. Conclusions: There is scope to increase the participation of diverse providers in the NHS, but care must be taken not to damage public accountability, overall productivity, equity and NHS providers (especially acute hospitals, which are likely to remain in the NHS) in the process.
Abstract:
This spreadsheet contains key data about that part of the endgame of Western Chess for which Endgame Tables (EGTs) have been generated by computer. It is derived from the EGT work since 1969 of Thomas Ströhlein, Ken Thompson, Christopher Wirth, Eugene Nalimov, Marc Bourzutschky, John Tamplin and Yakov Konoval. The data includes percentages of wins, draws and losses (wtm and btm), the maximum and average depths of win under various metrics (DTC = Depth to Conversion, DTM = Depth to Mate, DTZ = Depth to Conversion or Pawn-push), and examples of positions of maximum depth. It is essentially about sub-7-man Chess but is updated as news comes in of 7-man EGT computations.
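The DTM metric tabulated in such spreadsheets is computed by retrograde analysis. A minimal sketch on an abstract game graph follows; the positions and moves are invented toy data, and depths are counted in plies rather than moves:

```python
# Sketch of retrograde depth-to-mate (DTM) generation on an abstract
# game graph. A real EGT generator enumerates every chess position for
# a given material balance and sweeps them in order of increasing
# depth, which guarantees minimal DTMs; this simple fixpoint is enough
# for the toy graph below.

GRAPH = {            # position -> positions reachable by the side to move
    "mate":  [],             # no moves: the side to move is checkmated
    "p1":    ["mate"],       # p1 wins by moving to "mate"
    "p2":    ["p1"],         # p2 is lost: its only move loses
    "p3":    ["p2", "stale"],  # p3 wins by steering into p2
    "stale": ["stale"],      # never resolves: a draw
}

def solve_dtm(graph):
    """Return {pos: dtm}: +n = win in n plies for the side to move,
    -n = loss in n plies, 0 = mated now, absent = draw."""
    dtm = {p: 0 for p, succ in graph.items() if not succ}
    changed = True
    while changed:
        changed = False
        for p, succ in graph.items():
            if p in dtm or not succ:
                continue
            losses = [-dtm[q] for q in succ if q in dtm and dtm[q] <= 0]
            if losses:                    # can leave the opponent lost
                dtm[p] = 1 + min(losses)
                changed = True
            elif all(q in dtm and dtm[q] > 0 for q in succ):
                dtm[p] = -(1 + max(dtm[q] for q in succ))  # every move loses
                changed = True
    return dtm

TABLE = solve_dtm(GRAPH)
print(TABLE)   # "stale" is absent: it is a draw
```

Maximum and average win depths of the kind the spreadsheet reports are then simple aggregations over the positive entries of such a table.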
Abstract:
This data is derived from Eugene Nalimov's Depth-to-Mate Endgame Tables for Western Chess. While having the move is normally advantageous, there are positions where the side-to-move would have a better theoretical result if it were the other side to move. These are (Type A) 'zugzwang' positions where the 'obligation to act' is unwelcome. This data provides lists of all zugzwangs in sub-7-man chess, and summary data about those sets of zugzwangs including exemplar zugzwangs of maximum depth.
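The Type A zugzwang test itself is simple once both-sides results are available from an EGT. A sketch with an invented value table (results given from White's point of view):

```python
# Sketch of Type A zugzwang detection. Results are the theoretical
# outcome from White's point of view: 1 = White wins, 0 = draw,
# -1 = Black wins. The table is invented; real zugzwang lists are
# mined from Nalimov-style DTM endgame tables.

EGT = {
    # position: (result with White to move, result with Black to move)
    "q1": (0, 1),    # White to move only draws, yet Black to move loses
    "q2": (1, 1),    # White wins either way: normal position
    "q3": (1, 0),    # having the move helps White: the usual advantage
}

def is_zugzwang(pos):
    """True when the side to move would score better if the obligation
    to move fell on the opponent, i.e. the btm result exceeds the wtm
    result (from White's point of view this covers both sides)."""
    wtm, btm = EGT[pos]
    return btm > wtm

zugzwangs = [p for p in EGT if is_zugzwang(p)]
print(zugzwangs)   # only "q1"
```

Running this filter over every position in a sub-7-man table, rather than a three-entry dictionary, yields the exhaustive zugzwang lists the abstract describes.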
Abstract:
The names Opuntia bulbispina, O. clavata, O. emoryi and O. grahamii, originally proposed by George Engelmann between 1848 and 1856, are reviewed and typified after new findings of previously unknown voucher specimens. Original materials collected by some of the collaborators employed by Engelmann during the Mexican Boundary Survey were discovered in a loan from the Torrey Herbarium at the New York Botanical Garden (NY). Many of the materials include fragments of stems and fruits, and others include only sectioned flowers and some seeds. Particularly good descriptions of the species here concerned were published in Engelmann’s “Synopsis of the Cactaceae” in 1857, and exceptional illustrations were produced by Paulus Roetter and printed in “Cactaceae of the Boundary” in 1859. The problems surrounding some previous typifications of these names range from typification of joint lectotypes to illegitimate typifications of illustrations when original material was known to exist. The materials selected for typification were collected by the Mexican Boundary Survey and are lodged at the herbaria of the Missouri Botanical Garden (MO) and the New York Botanical Garden (NY); some are illustrations published by Engelmann.
Adaptive evolution of four microcephaly genes and the evolution of brain size in anthropoid primates
Abstract:
The anatomical basis and adaptive function of the expansion in primate brain size have long been studied; however, we are only beginning to understand the genetic basis of these evolutionary changes. Genes linked to human primary microcephaly have received much attention as they have accelerated evolutionary rates along lineages leading to humans. However, these studies focus narrowly on apes, and the link between microcephaly gene evolution and brain evolution is disputed. We analyzed the molecular evolution of four genes associated with microcephaly (ASPM, CDK5RAP2, CENPJ, MCPH1) across 21 species representing all major clades of anthropoid primates. Contrary to prevailing assumptions, positive selection was not limited to or intensified along the lineage leading to humans. In fact we show that all four loci were subject to positive selection across the anthropoid primate phylogeny. We developed clearly defined hypotheses to explicitly test if selection on these loci was associated with the evolution of brain size. We found positive relationships between both CDK5RAP2 and ASPM and neonatal brain mass and somewhat weaker relationships between these genes and adult brain size. In contrast, there is no evidence linking CENPJ and MCPH1 to brain size evolution. The stronger association of ASPM and CDK5RAP2 evolution with neonatal brain size than with adult brain size is consistent with these loci having a direct effect on prenatal neuronal proliferation. These results suggest that primate brain size may have at least a partially conserved genetic basis. Our results contradict a previous study that linked adaptive evolution of ASPM to changes in relative cortex size; however, our analysis indicates that this conclusion is not robust. 
Our finding that the coding regions of two widely expressed loci have experienced pervasive positive selection in relation to a complex, quantitative developmental phenotype provides a notable counterexample to the commonly asserted hypothesis that cis-regulatory regions play a dominant role in phenotypic evolution. Key words: ASPM, MCPH1, CDK5RAP2, CENPJ, brain, neurogenesis, primates.
Abstract:
Within a changing climate, Mediterranean ‘Garrigue’ xerophytes are increasingly recommended as suitable urban landscape plants in north-west Europe, based on their capacity to tolerate high temperature and reduced water availability during summer. Such species, however, have a poor reputation for tolerating waterlogged soils; paradoxically a phenomenon that may also increase in north-west Europe due to predictions for both higher volumes of winter precipitation, and short but intensive periods of summer rainfall. This study investigated flooding tolerance in four landscape ‘Garrigue’ species, Stachys byzantina, Cistus × hybridus, Lavandula angustifolia and Salvia officinalis. Despite evolving in a dry habitat, the four species tested proved remarkably resilient to flooding. All species survived 17 days of flooding in winter, with Stachys and Lavandula also surviving an equivalent flooding duration during summer. Photosynthesis and biomass production, however, were strongly inhibited by flooding, although the most tolerant species, Stachys, quickly restored its photosynthetic capacity on termination of flooding. Overall, survival rates were comparable to previous studies on other terrestrial (including wetland) species. Subsequent experiments using Salvia (a species we identified as ‘intermediate’ in tolerance) clearly demonstrated adaptations to waterlogging, e.g. acclimation against anoxia when pre-treated with hypoxia. Despite anecdotal information to the contrary, we found no evidence to suggest that these xerophytic species are particularly intolerant of waterlogging. Other climatic and biotic factors may restrict the viability and distribution of these species within the urban conurbations of north-west Europe, but we believe increased incidence of flooding per se should not preclude their consideration.
Abstract:
This article reviews the KQPKQP endgame of the ROOKIE-BARON game of the World Computer Chess Championship, 2011. It also reviews the decisive KRNPKBP endgame in the second Anand-Gelfand rapid game of the World Chess Championship 2012. There is a review of parts 2-3 of the Bourzutschky-Konoval 7-man endgame series in EG, of the new endgame software tool FinalGen, and of the 'Lomonosov' endgame table generation programme in Moscow.
Abstract:
Ozone (O3) precursor emissions influence regional and global climate and air quality through changes in tropospheric O3 and oxidants, which also influence methane (CH4) and sulfate aerosols (SO4(2−)). We examine changes in the tropospheric composition of O3, CH4, SO4(2−) and global net radiative forcing (RF) for 20% reductions in global CH4 burden and in anthropogenic O3 precursor emissions (NOx, NMVOC, and CO) from four regions (East Asia, Europe and Northern Africa, North America, and South Asia) using the Task Force on Hemispheric Transport of Air Pollution Source-Receptor global chemical transport model (CTM) simulations, assessing uncertainty (mean ± 1 standard deviation) across multiple CTMs. We evaluate steady state O3 responses, including long-term feedbacks via CH4. With a radiative transfer model that includes greenhouse gases and the aerosol direct effect, we find that regional NOx reductions produce global, annually averaged positive net RFs (0.2 ± 0.6 to 1.7 ± 2 mW m−2/Tg N yr−1), with some variation among models. Negative net RFs result from reductions in global CH4 (−162.6 ± 2 mW m−2 for a change from 1760 to 1408 ppbv CH4) and regional NMVOC (−0.4 ± 0.2 to −0.7 ± 0.2 mW m−2/Tg C yr−1) and CO emissions (−0.13 ± 0.02 to −0.15 ± 0.02 mW m−2/Tg CO yr−1). Including the effect of O3 on CO2 uptake by vegetation likely makes these net RFs more negative by −1.9 to −5.2 mW m−2/Tg N yr−1, −0.2 to −0.7 mW m−2/Tg C yr−1, and −0.02 to −0.05 mW m−2/Tg CO yr−1. Net RF impacts reflect the distribution of concentration changes, where RF is affected locally by changes in SO4(2−), regionally to hemispherically by O3, and globally by CH4. Global annual average SO4(2−) responses to oxidant changes range from 0.4 ± 2.6 to −1.9 ± 1.3 Gg for NOx reductions, 0.1 ± 1.2 to −0.9 ± 0.8 Gg for NMVOC reductions, and −0.09 ± 0.5 to −0.9 ± 0.8 Gg for CO reductions, suggesting additional research is needed. 
The 100-year global warming potentials (GWP100) are calculated for the global CH4 reduction (20.9 ± 3.7 without stratospheric O3 or water vapor, 24.2 ± 4.2 including those components), and for the regional NOx, NMVOC, and CO reductions (−18.7 ± 25.9 to −1.9 ± 8.7 for NOx, 4.8 ± 1.7 to 8.3 ± 1.9 for NMVOC, and 1.5 ± 0.4 to 1.7 ± 0.5 for CO). Variation in GWP100 for NOx, NMVOC, and CO suggests that regionally specific GWPs may be necessary and could support the inclusion of O3 precursors in future policies that address air quality and climate change simultaneously. Both global net RF and GWP100 are more sensitive to NOx and NMVOC reductions from South Asia than the other three regions.
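The structure of a GWP100 calculation can be sketched for the global CH4 reduction. The constants below are illustrative AR5-style values, and the sketch covers only the direct CH4 forcing; the paper's GWPs also include indirect O3 and stratospheric water vapor effects, so its reported numbers differ:

```python
import math

# Sketch of a 100-year global warming potential (GWP100) for a CH4
# pulse relative to CO2. Constants are illustrative AR5-style values
# (radiative efficiencies in W m-2 ppb-1, lifetimes in years); this
# direct-only sketch omits the indirect effects included in the paper.

H = 100.0                      # time horizon, years
RE_CH4, RE_CO2 = 3.63e-4, 1.37e-5
M_CO2, M_CH4 = 44.01, 16.04    # g/mol, converts per-ppb to per-kg
TAU_CH4 = 12.4

# A CH4 pulse decays as a single exponential, so its absolute GWP
# (AGWP) integrates to RE * tau * (1 - exp(-H / tau)):
agwp_ch4 = RE_CH4 * TAU_CH4 * (1 - math.exp(-H / TAU_CH4))

# CO2 impulse response as a sum of exponentials (AR5-style fit):
a = [0.2173, 0.2240, 0.2824, 0.2763]
tau = [None, 394.4, 36.54, 4.304]   # first fraction never decays
agwp_co2 = RE_CO2 * (a[0] * H + sum(
    ai * ti * (1 - math.exp(-H / ti)) for ai, ti in zip(a[1:], tau[1:])))

gwp100 = (agwp_ch4 / agwp_co2) * (M_CO2 / M_CH4)
print(round(gwp100, 1))
```

Adding the O3 and stratospheric water vapor terms, as the paper does, raises the CH4 GWP100 substantially, which is why its values of 20.9 and 24.2 bracket this direct-only estimate from above.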
Abstract:
Between 1972 and 2001, the English late-modernist poet Roy Fisher provided the text for nine separate artist's books produced by Ron King at the Circle Press. Taken together, as Andrew Lambirth has written, the Fisher-King collaborations represent a sustained investigation of the various ways in which text and image can be integrated, breaking the mould of the codex or folio edition, and turning the book into a sculptural object. From the three-dimensional pop-up designs of Bluebeard's Castle (1973), each representing a part of the edifice (the portcullis, the armoury and so on), to ‘alphabet books’ such as The Half-Year Letters (1983), held in an ingenious french-folded concertina which can be stretched to over a metre long or compacted to a pocketbook, the project of these art books is to complicate their own bibliographic codes, and rethink what a book can be. Their folds and reduplications give a material form to the processes by which meanings are produced: from the discovery, in Top Down, Bottom Up (1990), of how to draw on both sides of the page at the same time, to the developments of The Left-Handed Punch (1987) and Anansi Company (1992), where the book becomes first a four-dimensional theatre space, in which a new version of Punch and Judy is played out by twelve articulated puppets, and then a location for characters that are self-contained and removable, in the form of thirteen hand-made wire and card rod-puppets. Finally, in Tabernacle (2001), a seven-drawer black wooden cabinet that stands foursquare like a sculpture (and sells to galleries and collectors for over three thousand pounds), the conception of the book and the material history of print are fully undone and reconstituted. 
This paper analyses how the Fisher-King art books work out their radically material poetics of the book; how their emphasis on collaboration, between artist and poet, image and text, and also book and reader – the construction of meaning becoming a co-implicated process – continuously challenges hierarchies and fixities in our conception of authorship; and how they rethink the status of the poetic text and the construction of the book as material object.