804 results for Game of rules
Abstract:
This article analyses the supremacy of the Constitution, in the Bolivian setting, over the other norms that make up the legal order. It studies in particular international treaties, especially those that form part of International Human Rights Law, in relation to the debate over their hierarchy within the Bolivian legal order and, specifically, vis-a-vis the Constitution. Finally, it specifies the hierarchical position that corresponds to the norms of the indigenous, native and peasant peoples within the Bolivian legal order.
Abstract:
Ponce comments on and reflects upon the fifth novel by the writer Francisco Proaño Arandi, El sabor de la condena, published in Quito in 2009 by Editorial El Conejo. The review sets out some of the keys that shape this story, in which Proaño again stages the obsessions that run through his other novels. The characters, Javier and Male, move within a play of signs and symbols that allows the reader to approach the mystery of their lives, a fabric of continuous questions and appearances. As in much of his fiction, the world here is reduced to a house and to a city as enigmatic and bewitching as Quito, which once again becomes the central point of reference. Proaño, in Ponce's words, remains faithful to his style and to a conception of narrative that moves between the dense and the slow-paced, a conception that in him is by now classic.
Abstract:
This is a review of progress in the Chess Endgame field. It includes news of the promulgation of Endgame Tables, their use, non-use and potential runtime creation; of data-mining achievements related to 7-man chess and to the field of Chess Studies; and of an algorithm to create Endgame Tables for variants of the normal game of chess.
Abstract:
Purpose: Acquiring details of the kinetic parameters of enzymes is crucial to biochemical understanding, drug development, and clinical diagnosis in ocular diseases. The correct design of an experiment is critical to collecting data suitable for analysis, modelling and deriving the correct information. As classical design methods are not targeted at the more complex kinetics now frequently studied, attention is needed to estimate the parameters of such models with low variance. Methods: We have developed Bayesian utility functions to minimise kinetic parameter variance, involving differentiation of the model expressions and matrix inversion. These have been applied to the simple kinetics of the enzymes in the glyoxalase pathway (of importance in post-translational modification of proteins in cataract), and to the complex kinetics of lens aldehyde dehydrogenase (also of relevance to cataract). Results: Our successful application of Bayesian statistics has allowed us to identify a set of rules for designing optimum kinetic experiments iteratively. Most importantly, the distribution of points across the range is critical; it is not simply a matter of an even spread or multiplicative increases. At least 60% of the points must lie below the KM (or KMs, if there is more than one dissociation constant) and 40% above. This choice halves the variance obtained with a simple even spread across the range. With both the glyoxalase system and lens aldehyde dehydrogenase we have significantly improved the variance of kinetic parameter estimation while reducing the number and cost of experiments. Conclusions: We have developed an optimal and iterative method for selecting features of design such as substrate range, number of measurements and choice of intermediate points. Our novel approach minimises parameter error and costs, and maximises experimental efficiency. It is applicable to many areas of ocular drug design, including receptor-ligand binding and immunoglobulin binding, and should be an important tool in ocular drug discovery.
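The point-placement rule above can be checked with a small Monte Carlo sketch: fit Michaelis-Menten parameters to noisy simulated rates under an even spread of substrate concentrations and under a design with roughly 60% of the points below KM, and compare the spread of the estimates. The "true" parameters, noise level and concentration ranges below are hypothetical assumptions, and this is not the authors' Bayesian utility-function machinery.

```python
# Illustrative comparison of two substrate-concentration designs for
# Michaelis-Menten fitting; parameter values and noise are assumptions.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)
Vmax_true, Km_true, sigma = 1.0, 2.0, 0.02   # hypothetical "true" values

def mm(S, Vmax, Km):
    return Vmax * S / (Km + S)

def design_variance(S, n_rep=500):
    """Fit (Vmax, Km) to noisy rates many times; return the std of the estimates."""
    estimates = []
    for _ in range(n_rep):
        v = mm(S, Vmax_true, Km_true) + rng.normal(0, sigma, S.size)
        popt, _ = curve_fit(mm, S, v, p0=[1.0, 1.0], maxfev=5000)
        estimates.append(popt)
    return np.std(estimates, axis=0)

even = np.linspace(0.2, 10.0, 10)                       # even spread
skewed = np.concatenate([np.linspace(0.2, Km_true, 6),  # ~60% below Km
                         np.linspace(3.0, 10.0, 4)])    # ~40% above

print("even design   std(Vmax), std(Km):", design_variance(even))
print("skewed design std(Vmax), std(Km):", design_variance(skewed))
```

The skewed design should show a visibly smaller standard deviation for the KM estimate, in line with the rule stated in the abstract.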
Abstract:
Web service composition can be facilitated by an automatic process consisting of rules, conditions and actions. This research adapts Elementary Petri Nets (EPN) to analyse and model web services and their composition. The paper describes a set of techniques for representing transition rules, an algorithm and a workflow so that web service composition can be carried out automatically.
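A minimal sketch of the EPN firing rule applied to service composition may help: a transition is enabled when every input place holds a token, and firing it moves tokens to the output places. The place and transition names below are hypothetical examples, not the paper's formalisation.

```python
# Toy Elementary Petri Net: boolean marking (a set of marked places),
# transitions fire when all their input places are marked.
transitions = {
    "invoke_search":  {"in": {"request_received"}, "out": {"results_ready"}},
    "invoke_booking": {"in": {"results_ready"},    "out": {"booking_done"}},
}
marking = {"request_received"}              # initial marking

def enabled(t):
    return transitions[t]["in"] <= marking  # all input places marked

def fire(t):
    assert enabled(t), f"{t} is not enabled"
    marking.difference_update(transitions[t]["in"])
    marking.update(transitions[t]["out"])

# Drive the composition until no transition is enabled.
while True:
    ready = [t for t in transitions if enabled(t)]
    if not ready:
        break
    fire(ready[0])

print(marking)   # {'booking_done'}
```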
Abstract:
Deception-detection is the crux of Turing’s experiment to examine machine thinking conveyed through a capacity to respond with sustained and satisfactory answers to unrestricted questions put by a human interrogator. However, in the 60 years to the month since the publication of Computing Machinery and Intelligence, little agreement exists on a canonical format for Turing’s textual game of imitation, deception and machine intelligence. From the trapped mine of philosophical claims, counter-claims and rebuttals, this research raises Turing’s own distinct five-minute question-answer imitation game, which he envisioned practicalised in two different ways: a) a two-participant, interrogator-witness viva voce; b) a three-participant comparison of a machine with a human, both questioned simultaneously by a human interrogator. Using Loebner’s 18th Prize for Artificial Intelligence contest, and Colby et al.’s 1972 transcript analysis paradigm, this research practicalised Turing’s imitation game with over 400 human participants and 13 machines across three original experiments. Results show that, at the current state of technology, a deception rate of 8.33% was achieved by machines in 60 human-machine simultaneous comparison tests. Results also show that more than 1 in 3 reviewers succumbed to hidden interlocutor misidentification after reading transcripts from experiment 2. Deception-detection is essential to uncover the increasing number of malfeasant programmes, such as CyberLover, developed to steal identity and financially defraud users in chatrooms across the Internet. Practicalising Turing’s two tests can assist in understanding natural dialogue and mitigate the risk from cybercrime.
Abstract:
This article reviews the KQPKQP endgame of the ROOKIE-BARON game of the World Computer Chess Championship, 2011. It also reviews the decisive KRNPKBP endgame in the second Anand-Gelfand rapid game of the World Chess Championship 2012. There is a review of parts 2-3 of the Bourzutschky-Konoval 7-man endgame series in EG, of the new endgame software tool FinalGen, and of the 'Lomonosov' endgame table generation programme in Moscow.
Abstract:
Norms are a set of rules that govern the behaviour of human agents and determine how they behave under given conditions. This paper investigates the overlapping of information fields (sets of shared norms) in the Context State Transition Model, and how these overlapping fields may affect the choices and actions of human agents. The paper also discusses the implementation of new conflict resolution strategies based on the situation specification. The reasoning about conflicting norms in multiple information fields is discussed in detail.
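To make the idea of conflicting norms across overlapping information fields concrete, here is a toy sketch: a conflict arises when one applicable field obliges an action that another applicable field prohibits. The field names, norms and modalities are hypothetical and are not taken from the Context State Transition Model itself.

```python
# Detect opposing modalities for the same action across overlapping fields.
from itertools import combinations

fields = {
    "hospital": {("share_patient_data", "prohibited"),
                 ("report_incident", "obliged")},
    "research": {("share_patient_data", "obliged")},
}

def conflicts(active_fields):
    """Return (action, field_a, field_b) triples whose modalities disagree."""
    found = []
    for fa, fb in combinations(active_fields, 2):
        for action_a, mod_a in fields[fa]:
            for action_b, mod_b in fields[fb]:
                if action_a == action_b and mod_a != mod_b:
                    found.append((action_a, fa, fb))
    return found

print(conflicts(["hospital", "research"]))
# [('share_patient_data', 'hospital', 'research')]
```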
Abstract:
Advances in hardware and software over the past decade make it possible to capture, record and process fast data streams at a large scale. The research area of data stream mining has emerged as a consequence of these advances, in order to cope with the real-time analysis of potentially large and changing data streams. Examples of data streams include Google searches, credit card transactions, telemetric data and data from continuous chemical production processes. In some cases the data can be processed in batches by traditional data mining approaches. However, some applications require the data to be analysed in real time, as soon as it is captured, for example when the data stream is infinite, fast changing, or simply too large to be stored. One of the most important data mining techniques on data streams is classification. This involves training the classifier on the data stream in real time and adapting it to concept drift. Most data stream classifiers are based on decision trees. However, it is well known in the data mining community that there is no single optimal algorithm: an algorithm may work well on one or several datasets but badly on others. This paper introduces eRules, a new rule-based adaptive classifier for data streams, based on an evolving set of rules. eRules induces a set of rules that is constantly evaluated and adapted to changes in the data stream by adding new rules and removing old ones. It differs from the more popular decision-tree-based classifiers in that it tends to leave data instances unclassified rather than force a classification that could be wrong. The ongoing development of eRules aims to improve its accuracy further through dynamic parameter setting, which will also address the problem of changing feature domain values.
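The abstain-prune-induce loop described above can be illustrated with a heavily simplified sketch. This is not the published eRules algorithm: the synthetic stream, the interval-rule representation, the pruning threshold and the buffer size are all assumptions made for illustration only.

```python
# Simplified rule-based stream classifier: rules abstain on uncovered
# instances, weak rules are pruned, new rules come from a buffer of
# unclassified examples.
import numpy as np

class IntervalRule:
    """Predict `label` when low <= x[feature] <= high; otherwise abstain."""
    def __init__(self, feature, low, high, label):
        self.feature, self.low, self.high, self.label = feature, low, high, label
        self.hits, self.misses = 0, 0

    def covers(self, x):
        return self.low <= x[self.feature] <= self.high

    def accuracy(self):
        total = self.hits + self.misses
        return self.hits / total if total else 1.0

def induce_rule(buf_x, buf_y):
    """Toy induction: wrap the buffered majority class on its tightest feature."""
    X, y = np.array(buf_x), np.array(buf_y)
    label = int(np.bincount(y).argmax())
    Xc = X[y == label]
    feature = int(np.argmin(Xc.std(axis=0)))
    return IntervalRule(feature, Xc[:, feature].min(), Xc[:, feature].max(), label)

def toy_stream(n=2000, seed=0):
    rng = np.random.default_rng(seed)
    for _ in range(n):
        x = rng.uniform(0, 1, size=2)
        yield x, int(x[0] > 0.5)          # simple underlying concept

rules, buf_x, buf_y = [], [], []
covered, correct = 0, 0
for x, y in toy_stream():
    matching = [r for r in rules if r.covers(x)]
    if matching:
        covered += 1
        correct += int(matching[0].label == y)
        for r in matching:                 # update rule statistics
            r.hits += int(r.label == y)
            r.misses += int(r.label != y)
        rules = [r for r in rules if r.accuracy() > 0.6]   # prune weak rules
    else:                                  # abstain and remember the example
        buf_x.append(x)
        buf_y.append(y)
        if len(buf_x) >= 50:               # induce a new rule from the buffer
            rules.append(induce_rule(buf_x, buf_y))
            buf_x, buf_y = [], []

print(f"covered: {covered}, accuracy on covered: {correct / max(covered, 1):.2f}")
```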
Abstract:
There is an increasing interest in the application of Evolutionary Algorithms (EAs) to induce classification rules. This hybrid approach can benefit areas where classical methods for rule induction have not been very successful. One example is the induction of classification rules in imbalanced domains. Imbalanced data occur when one or more classes heavily outnumber other classes. Frequently, classical machine learning (ML) classifiers are not able to learn in the presence of imbalanced data sets, inducing classification models that always predict the most numerous classes. In this work, we propose a novel hybrid approach to deal with this problem. We create several balanced data sets with all minority class cases and a random sample of majority class cases. These balanced data sets are fed to classical ML systems that produce rule sets. The rule sets are combined creating a pool of rules and an EA is used to build a classifier from this pool of rules. This hybrid approach has some advantages over undersampling, since it reduces the amount of discarded information, and some advantages over oversampling, since it avoids overfitting. The proposed approach was experimentally analysed and the experimental results show an improvement in the classification performance measured as the area under the receiver operating characteristics (ROC) curve.
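The balanced-bag construction described above is the part that is easy to make concrete: every bag keeps all minority-class cases plus an equally sized random sample of the majority class. The sketch below shows only this step on synthetic data; the rule extraction and the evolutionary search over the pooled rules are not shown, and all names and sizes are illustrative assumptions.

```python
# Build several balanced subsets from an imbalanced data set.
import numpy as np

def balanced_bags(X, y, minority_label=1, n_bags=5, seed=0):
    rng = np.random.default_rng(seed)
    minority = np.flatnonzero(y == minority_label)
    majority = np.flatnonzero(y != minority_label)
    bags = []
    for _ in range(n_bags):
        sampled = rng.choice(majority, size=minority.size, replace=False)
        idx = np.concatenate([minority, sampled])
        bags.append((X[idx], y[idx]))
    return bags

# Example: a roughly 95/5 imbalanced toy data set
rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 4))
y = (rng.uniform(size=1000) < 0.05).astype(int)
for Xb, yb in balanced_bags(X, y):
    print(Xb.shape, np.bincount(yb))     # each bag is roughly 50/50
```

Because each bag discards a different random portion of the majority class, pooling the rules learned from all bags retains more of the majority-class information than a single undersampled set would.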
Abstract:
The aim of this work was to design a set of rules for levodopa infusion dose adjustment in Parkinson’s disease based on simulation experiments. Using this simulator, optimal infusion doses under different conditions were calculated. There are seven conditions (-3 to +3) on a rating scale for Parkinson’s disease patients. By taking the mean of the differences between the conditions and the optimal dose, two sets of rules were designed, and the rules were then refined through repeated testing. The usefulness of rule-based reasoning for optimizing the titration procedure for new infusion patients was investigated. Results show that both the number of steps and the error in finding the optimal dose were reduced by the new rules. Finally, the new rules predicted the dose well on each single occasion for the majority of patients in the simulation experiments.
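The general shape of such rule-based titration can be sketched as a loop that maps the rating-scale condition to a dose step and repeats until the rating reaches 0. The step sizes and the stand-in patient response model below are hypothetical assumptions, not the rules derived in the study.

```python
# Toy rule-based dose titration driven by a rating-scale condition.
ADJUSTMENT = {-3: +30, -2: +20, -1: +10, 0: 0, 1: -10, 2: -20, 3: -30}

def observed_condition(dose, optimal_dose):
    """Stand-in patient model: rating reflects distance from the optimum."""
    return max(-3, min(3, round((dose - optimal_dose) / 15)))

def titrate(start_dose, optimal_dose, max_steps=10):
    """Apply the rules until the rating reaches 0 or the step budget is spent."""
    dose = start_dose
    for step in range(max_steps):
        condition = observed_condition(dose, optimal_dose)
        if condition == 0:
            return dose, step
        dose += ADJUSTMENT[condition]
    return dose, max_steps

print(titrate(start_dose=40, optimal_dose=100))    # -> (100, 3)
```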
Abstract:
The Open Provenance Model is a model of provenance that is designed to meet the following requirements: (1) to allow provenance information to be exchanged between systems, by means of a compatibility layer based on a shared provenance model; (2) to allow developers to build and share tools that operate on such a provenance model; (3) to define provenance in a precise, technology-agnostic manner; (4) to support a digital representation of provenance for any 'thing', whether produced by computer systems or not; (5) to allow multiple levels of description to coexist; (6) to define a core set of rules that identify the valid inferences that can be made on a provenance representation. This document contains the specification of the Open Provenance Model (v1.1), resulting from a community effort to achieve interoperability in the Provenance Challenge series.
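As a flavour of requirement (6), the sketch below computes multi-step derivations from one-step wasDerivedFrom edges between artifacts. The edge name is OPM-like, but the graph, the artifact names and the closure rule shown here are an illustrative assumption, not the v1.1 specification's inference rules.

```python
# Toy provenance inference: transitive closure of one-step derivations.
was_derived_from = {("report.pdf", "clean.csv"), ("clean.csv", "raw.csv")}

def multi_step(edges):
    """Return all (A, C) pairs where A is directly or indirectly derived from C."""
    closure = set(edges)
    changed = True
    while changed:
        new = {(a, c) for (a, b) in closure for (b2, c) in closure if b == b2}
        changed = not new <= closure
        closure |= new
    return closure

print(sorted(multi_step(was_derived_from)))
# [('clean.csv', 'raw.csv'), ('report.pdf', 'clean.csv'), ('report.pdf', 'raw.csv')]
```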
Abstract:
In her doctoral thesis, the author describes research carried out in a small town in France, where she observed the opposition between the discourse of the planner, concerned with introducing a new logic into the daily practices of the residents of a workers' village, and the discourse of the retired people who lived there, whose experience of the neighbourhood spaces was impregnated with their own life histories. It was from this experience that the author came to argue that no intervention in urban space should be carried out without fine-grained, in-depth observation of daily practices. She starts from the premise that the urbanisation and growth of cities increasingly follow the logic of planning, imposing on inhabitants a passage from private to public space that is almost always abrupt and hostile, since it treats circulation as a flow, inhibiting the development of "transition spaces" and modifying the conception of sociability in neighbourhood spaces. This research aims to find some "localities" within a large metropolis such as São Paulo where the transition between private and public life can be observed and studied. For the author, the study of the rules and norms of social life in these spaces, called at times intermediate and at times transitional, should serve to compose what she calls a culture of neighbourhood, which varies greatly between localities of the same city.
Abstract:
This work analyses the challenges that the National Petrol Agency faces in regulating the petroleum industry in Brazil after the end of the monopoly, in the period from 1997 to 2005. Owing to the need to adapt its policy strategies to the rules that govern international economic flows, Brazil was forced to resort to economic regulation in order to control the market. The regulation established in Brazil is not indifferent to imperfect markets; thus a conflict of interests can be found among companies, the government and consumers within this regulatory process. The agency that was established does not have enough autonomy to administer regulation. The State, with its paternalistic power, does not allow the agency to fulfil the function for which it was established, even though that function is set by law. A clearly defined regulatory policy will establish a strong and independent agency with a clear delimitation of its competences, avoiding divergent interpretations, prioritizing investment and promoting economic development. The agency will face the challenge of regulating the companies that enter the sector, opening the market to new investment initiatives that contribute to the welfare of the country while breaking the monopoly held by Petrobras since 1953. Combining a stable set of rules with the agility to adapt to change will give the regulator great decision-making power. Flexibility in regulation will make it possible to correct, more efficiently, the rules set at the outset, on the basis of acquired experience and achieved results. The structure of the agency and the flexibility of the regulation should be oriented towards promoting competition in order to achieve economic and social development.
Abstract:
In 1964, the year of the military coup, the Brazilian government established a housing finance system with the intention of reducing the housing shortage that had persisted for decades. To reach this goal, the government created the Housing Finance System (acronym in Portuguese: SFH), a set of rules intended to set up a regulated market through standardized contracts and compulsory sources of funds. The system survived for some time, owing to state control of prices and salaries under the authoritarian regime. However, increasing inflationary pressure obliged the government to adopt a populist subsidy policy, which left as a consequence outstanding balances at the end of the contracts that very often exceeded the value of the financed units. The solution adopted was to create a fund to settle these residual balances. The fund was to be capitalized by the government and by compulsory contributions from borrowers and financial institutions. Since the government did not make its contributions, the debt of this fund grew year after year, reaching around 3.5% of Brazil's GDP on December 31, 2006. Owing to the decline of private investment in the housing finance system, this debt became concentrated mostly in public and state-owned companies, government agencies and public funds. The outcome of this policy is the Salary Variations Compensation Fund (acronym in Portuguese: FCVS), which has a negative net equity of 76 billion reais, costs 100 million reais per year to manage, and whose main creditor is the Federal Government itself.