926 results for Combinatorial reasoning
Abstract:
We present an educational proposal for the study of combinatorial reasoning and the calculation of probabilities, based on a game and a problem-solving methodology, aiming to support the teaching of mathematical content. A review of the literature on the teaching and learning of combinatorial reasoning and probability is presented. The game is original: it is played on a board similar to that of Tic-tac-toe, and the movements of its pieces resemble, in particular, those of the rook and pawn in chess. We formulate various activities (problems) involving the game which, in the process of solving them with the problem-solving methodology and appropriate teacher intervention, encourage students to develop counting strategies, an indispensable tool in the initial study of Combinatorial Analysis and the Calculation of Probabilities.
Abstract:
The verified security methodology is an emerging approach to building high-assurance proofs about security properties of computer systems. Computer systems are modeled as probabilistic programs, and one relies on rigorous program-semantics techniques to prove that they comply with a given security goal. In particular, the methodology advocates the use of interactive or automated theorem provers to build fully formal, machine-checked versions of these security proofs. Verified security has proved successful in modeling and reasoning about several standard security notions in the area of cryptography. However, it has fallen short of covering an important class of approximate, quantitative security notions. The distinguishing characteristic of this class of security notions is that they are stated as a "similarity" condition between the output distributions of two probabilistic programs, and this similarity is quantified using some notion of distance between probability distributions.
This class comprises prominent security notions from multiple areas, such as private data analysis, information-flow analysis, and cryptography. These include, for instance, indifferentiability, which enables securely replacing an idealized component of a system with a concrete implementation, and differential privacy, a notion of privacy-preserving data mining that has received a great deal of attention in the last few years. The lack of rigorous techniques for verifying these properties is thus an important open problem that needs to be addressed. In this dissertation we introduce several quantitative program logics to reason about this class of security notions. Our main theoretical contribution is, in particular, a quantitative variant of a full-fledged relational Hoare logic for probabilistic programs. The soundness of these logics is fully formalized in the Coq proof assistant, and tool support is available through an extension of CertiCrypt, a framework for verifying cryptographic proofs in Coq. We validate the applicability of our approach by building fully machine-checked proofs for several systems that were out of reach of the verified security methodology. These comprise, among others, a construction to build "safe" hash functions into elliptic curves and differentially private algorithms for several combinatorial optimization problems from the recent literature.
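To make the "similarity between output distributions" condition concrete, here is a minimal Python sketch of ε-differential privacy for the classic randomized-response mechanism. This is a standard textbook example, not one of the dissertation's case studies, and all names in the sketch are illustrative:

```python
import math

def randomized_response_probs(true_bit, epsilon):
    # Randomized response: report the true bit with probability
    # e^eps / (1 + e^eps), otherwise report the flipped bit.
    p = math.exp(epsilon) / (1.0 + math.exp(epsilon))
    return {true_bit: p, 1 - true_bit: 1.0 - p}

def max_privacy_ratio(epsilon):
    # For the neighboring inputs 0 and 1, the worst-case ratio of
    # output probabilities bounds the privacy loss; eps-differential
    # privacy requires this ratio to be at most e^eps.
    d0 = randomized_response_probs(0, epsilon)
    d1 = randomized_response_probs(1, epsilon)
    return max(d0[o] / d1[o] for o in (0, 1))

eps = 0.5
# The mechanism meets its eps-DP bound with equality here.
assert abs(max_privacy_ratio(eps) - math.exp(eps)) < 1e-9
```

The "distance" quantifying similarity here is the worst-case log-ratio of output probabilities; other notions in the class swap in other distances between distributions.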
Biased Random-Key Genetic Algorithms for the Winner Determination Problem in Combinatorial Auctions.
Abstract:
In this paper, we address the problem of picking a subset of bids in a general combinatorial auction so as to maximize the overall profit under the first-price model. This winner determination problem assumes that a single bidding round is held to determine both the winners and the prices to be paid. We introduce six variants of biased random-key genetic algorithms for this problem. Three of them use a novel initialization technique that employs solutions of intermediate linear programming relaxations of an exact mixed integer linear programming model as initial chromosomes of the population. An experimental evaluation compares the effectiveness of the proposed algorithms with the standard mixed integer linear programming formulation, a specialized exact algorithm, and the best-performing heuristics proposed for this problem. The proposed algorithms are competitive and offer strong results, mainly for large-scale auctions.
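The winner determination problem itself can be illustrated with a tiny exhaustive-search sketch in Python. This is illustrative only, with made-up bid data; the abstract's point is precisely that heuristics such as biased random-key genetic algorithms are needed on large instances where enumeration is infeasible:

```python
from itertools import combinations

def winner_determination(bids):
    # bids: list of (item_set, price) pairs. Pick a conflict-free
    # subset (no item sold twice) maximizing total first-price revenue.
    best_value, best_subset = 0, ()
    for r in range(len(bids) + 1):
        for subset in combinations(range(len(bids)), r):
            items = [i for b in subset for i in bids[b][0]]
            if len(items) == len(set(items)):   # no item appears twice
                value = sum(bids[b][1] for b in subset)
                if value > best_value:
                    best_value, best_subset = value, subset
    return best_value, best_subset

bids = [({'a', 'b'}, 10), ({'b', 'c'}, 8), ({'c'}, 5), ({'a'}, 4)]
value, chosen = winner_determination(bids)
# Bids 0 and 2 ({a,b} for 10 plus {c} for 5) win with revenue 15.
assert value == 15 and set(chosen) == {0, 2}
```

The search space grows as 2^n in the number of bids, which is why the problem is NP-hard and exact enumeration stops scaling almost immediately.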
Abstract:
We investigate the performance of a variant of Axelrod's model for the dissemination of culture, the Adaptive Culture Heuristic (ACH), on solving an NP-complete optimization problem, namely, the classification of binary input patterns of size F by a Boolean binary perceptron. In this heuristic, N agents, characterized by binary strings of length F which represent possible solutions to the optimization problem, are fixed at the sites of a square lattice and interact with their nearest neighbors only. The interactions are such that the agents' strings (or cultures) become more similar to the low-cost strings of their neighbors, resulting in the dissemination of these strings across the lattice. Eventually the dynamics freezes into a homogeneous absorbing configuration in which all agents exhibit identical solutions to the optimization problem. We find through extensive simulations that the probability of finding the optimal solution is a function of the reduced variable F/N^(1/4), so that the number of agents must increase with the fourth power of the problem size, N ∝ F^4, to guarantee a fixed probability of success. In this case, we find that the relaxation time to reach an absorbing configuration scales with F^6, which can be interpreted as the overall computational cost of the ACH to find an optimal set of weights for a Boolean binary perceptron, given a fixed probability of success.
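A much-simplified Python sketch of lattice dynamics in this spirit is shown below. It is not the authors' exact update rule: a Hamming-distance cost to a hidden target string stands in for the perceptron classification cost, and the lattice size, string length, and step count are illustrative choices:

```python
import random

def ach_step(lattice, L, F, cost):
    # One update: a randomly chosen agent copies one differing bit
    # from its lowest-cost nearest neighbor (periodic von Neumann
    # neighborhood), so low-cost strings spread across the lattice.
    x, y = random.randrange(L), random.randrange(L)
    neighbors = [((x + dx) % L, (y + dy) % L)
                 for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))]
    nx, ny = min(neighbors, key=lambda p: cost(lattice[p]))
    agent, best = lattice[(x, y)], lattice[(nx, ny)]
    diffs = [i for i in range(F) if agent[i] != best[i]]
    if diffs and cost(best) < cost(agent):
        i = random.choice(diffs)
        agent[i] = best[i]

def run_ach(L=4, F=8, steps=20000, seed=0):
    random.seed(seed)
    target = [1] * F   # hidden optimum; cost = Hamming distance to it
    cost = lambda s: sum(a != b for a, b in zip(s, target))
    lattice = {(x, y): [random.randint(0, 1) for _ in range(F)]
               for x in range(L) for y in range(L)}
    for _ in range(steps):
        ach_step(lattice, L, F, cost)
    # Best cost found anywhere on the lattice after the dynamics.
    return min(cost(s) for s in lattice.values())
```

Under the scaling reported in the abstract, reaching a fixed success probability requires N ∝ F^4 agents, so a lattice for F = 100 would already need on the order of 10^8 sites; the sketch above is only meant to show the shape of the update loop.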
Abstract:
Background: Continuing education courses related to critical thinking and clinical reasoning are needed to improve the accuracy of diagnosis. Method: This study evaluated a 4-day, 16-hour continuing education course conducted in Brazil. Thirty-nine nurses completed a pretest and a posttest consisting of two written case studies designed to measure the accuracy of nurses' diagnoses. Results: There were significant differences in accuracy from pretest to posttest for case 1 (p = .008), case 2 (p = .042), and overall (p = .001). Conclusion: Continuing education courses should be implemented to improve the accuracy of nurses' diagnoses. J Contin Educ Nurs 2009;40(3):121-127.
Abstract:
A combination of deductive reasoning, clustering, and inductive learning is given as an example of a hybrid system for exploratory data analysis. Visualization is replaced by a dialogue with the data.
Abstract:
A major challenge associated with using large chemical libraries synthesized on microscopic solid-support beads is the rapid discrimination of individual compounds in these libraries. This challenge can be overcome by encoding the beads with 1 μm silica colloidal particles (reporters) that contain specific and identifiable combinations of fluorescent dyes. The colored bar code generated on support beads during combinatorial library synthesis can be easily, rapidly, and inexpensively decoded through the use of fluorescence microscopy. All reporters are precoated with polyelectrolytes [poly(acrylic acid), PAA; poly(sodium 4-styrenesulfonate), PSSS; polyethylenimine, PEI; and/or poly(diallyldimethylammonium chloride), PDADMAC] with the aim of enhancing surface charge, promoting electrostatic attraction to the bead, and facilitating polymer bridging between the bead and reporter for permanent adhesion. As shown in this article, reporters coated with polyelectrolytes clearly outperform uncoated reporters with regard to the quantity of attached reporters per bead (54 ± 23 in a 2500 μm² area for PEI/PAA-coated versus 11 ± 6 for uncoated reporters) and the minimization of cross-contamination (1 red reporter in a 2500 μm² area of a green-labeled bead for PEI/PAA-coated versus 26 ± 15 red reporters on green-labeled beads for uncoated reporters after 10 days). Examination of various polyelectrolyte systems shows that the magnitude of the ζ-potential of polyelectrolyte-coated reporters (-64 mV for PDADMAC/PSSS and -42 mV for PEI/PAA-coated reporters) has no correlation with the number of reporters that adhere to the solid-support beads (21 ± 16 in a 2500 μm² area for PDADMAC/PSSS versus 54 ± 23 for PEI/PAA-coated reporters).
Polymer bridging contributes far more to the adhesion than electrostatic attraction does, as demonstrated by modification of the polyelectrolyte multilayers using gamma irradiation of precoated reporters either in aqueous solution or in polyelectrolyte solution.
Abstract:
Substance dependence is highly associated with executive cognitive function (ECF) impairments. However, considering that it is difficult to assess ECF clinically, the aim of the present study was to examine the feasibility of a brief neuropsychological tool, the Frontal Assessment Battery (FAB), to detect specific ECF impairments in a sample of substance-dependent individuals (SDI). Sixty-two subjects participated in this study. Thirty DSM-IV-diagnosed SDI, after 2 weeks of abstinence, and 32 healthy individuals (control group) were evaluated with the FAB and other ECF-related tasks: digits forward (DF), digits backward (DB), the Stroop Color Word Test (SCWT), and the Wisconsin Card Sorting Test (WCST). SDI did not differ from the control group on sociodemographic variables or IQ. However, SDI performed below the controls on DF, DB, and the FAB. The SDI were cognitively impaired in 3 of the 6 cognitive domains assessed by the FAB: abstract reasoning, motor programming, and cognitive flexibility. The FAB correlated with DF, the SCWT, and the WCST. In addition, some neuropsychological measures were correlated with the amount of alcohol, cannabis, and cocaine use. In conclusion, SDI performed more poorly than the comparison group on the FAB, and the FAB's results were associated with other ECF-related tasks. The results suggest a negative impact of alcohol, cannabis, and cocaine use on ECF. The FAB may be useful in assisting professionals as a screening instrument for ECF-related deficits in SDI. (C) 2010 Elsevier Ltd. All rights reserved.
Abstract:
Large chemical libraries can be synthesized on solid-support beads by the combinatorial split-and-mix method. A major challenge associated with this type of library synthesis is distinguishing between the beads and their attached compounds. A new method of encoding these solid-support beads, 'colloidal bar-coding', involves attaching fluorescent silica colloids ('reporters') to the beads as they pass through the compound synthesis, thereby creating a fluorescent bar code on each bead. In order to obtain sufficient reporter varieties to bar-code extremely large libraries, many of the reporters must contain multiple fluorescent dyes. We describe here the synthesis and spectroscopic analysis of various mono- and multi-fluorescent silica particles for this purpose. It was found that by increasing the amount of a single dye introduced into the particle reaction mixture, mono-fluorescent silica particles of increasing intensities could be prepared. This increase was highly reproducible and was observed for six different fluorescent dyes. Multi-fluorescent silica particles containing up to six fluorescent dyes were also prepared. The resultant emission intensity of each dye in the multi-fluorescent particles was found to depend on a number of factors: the hydrolysis rate of each silane-dye conjugate, the magnitude of the inherent emission intensity of each dye within the silica matrix, and energy-transfer effects between dyes. We show that by varying the relative concentration of each silane-dye conjugate in the synthesis of multi-fluorescent particles, it is possible to change and optimize the resultant emission intensity of each dye to enable viewing in a fluorescence detection instrument.
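The combinatorial payoff of multi-dye encoding can be made concrete with a back-of-the-envelope count in Python. The dye and intensity-level numbers below are illustrative assumptions, not figures from the article:

```python
def barcode_capacity(num_dyes, intensity_levels):
    # Each dye independently contributes one of `intensity_levels`
    # distinguishable states (including "absent"), so the number of
    # distinct bar codes multiplies per dye.
    return intensity_levels ** num_dyes

# With 6 dyes, each resolvable at 4 intensity levels, the code space
# is 4**6 = 4096 distinct reporter bar codes.
assert barcode_capacity(6, 4) == 4096
```

This exponential growth in the code space is why reproducible control over per-dye intensity, as described in the abstract, matters: each extra distinguishable level multiplies the number of beads that can be uniquely tagged.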