836 results for cryptography, mixnet, EasyCrypt, game-based proofs, sequence of games, computation-aided proofs
Abstract:
In recent decades, intensive research has been carried out to replace oil-based polymers with bio-based polymers due to growing environmental concerns. So far, most of the barrier materials used in food packaging are petroleum-based. The purpose of the barrier is to protect the packaged food from oxygen, water vapour, water and fat. The mechanical and barrier properties of coatings based on starch-plasticizer and starch-poly(vinyl alcohol) (PVOH)-plasticizer blends have been studied in the work described in this thesis. The plasticizers used were glycerol, polyethylene glycol and citric acid. In a second step, polyethylene coatings were extruded onto paperboard pre-coated with a starch-PVOH-plasticizer blend. The addition of PVOH to the starch increased the flexibility of the film. Curing of the film led to a decrease in flexibility and an increase in tensile strength. The flexibility of the starch-PVOH films was increased more by the addition of glycerol or polyethylene glycol than by that of citric acid. The storage modulus of the starch-PVOH films containing citric acid increased substantially at high temperature. The addition of polyethylene glycol or citric acid to the starch-PVOH blend resulted in an enrichment of PVOH at the surface of the films. Tensile tests on the films indicated that citric acid acted as a compatibilizer and increased the compatibility of the starch and PVOH in the blend. The addition of citric acid to the coating recipe substantially decreased the water vapour transmission rate through the starch-PVOH-coated paperboard, which indicated that citric acid acts as a cross-linker for starch and/or PVOH. The starch-PVOH coatings containing citric acid showed oxygen-barrier properties similar to those of pure PVOH or of a starch-PVOH blend without plasticizer when four coating layers were applied to the paperboard.
The oxygen-barrier properties of coatings based on a starch-PVOH blend containing citric acid indicated cross-linking and increased compatibility of the starch-PVOH blends. Polyethylene extrusion coating on a pre-coated paperboard resulted in a clear reduction in the oxygen transmission rate for all the pre-coating formulations containing plasticizers. The addition of a plasticizer to the pre-coating reduced the adhesion of polyethylene to the pre-coated board. Polyethylene extrusion coating gave a board with a lower oxygen transmission rate when the paperboard was pre-coated with a polyethylene-glycol-containing formulation than with a citric-acid-containing formulation. The addition of polyethylene glycol to the pre-coatings indicated an increase in the wetting of the pre-coated paperboard by the polyethylene melt, which may have sealed small defects in the pre-coating, leading to a low oxygen transmission rate. The increased brittleness of starch-PVOH films containing citric acid at high temperature seemed to have a dominating effect on the barrier properties developed by the extrusion coating process.
Abstract:
Thrombophilia stands for a genetic or an acquired tendency to hypercoagulable states that increase the risk of venous and arterial thromboses. Indeed, venous thromboembolism is often a chronic illness, mainly in the form of deep venous thrombosis and pulmonary embolism, requiring lifelong prevention strategies. It is therefore crucial to identify the cause of the disease, the most appropriate treatment, the length of treatment, and how to prevent thrombotic recurrence. Thus, this work focuses on the development of a diagnosis decision support system in terms of a formal agenda built on a logic programming approach to knowledge representation and reasoning, complemented with a case-based approach to computing. The proposed model has been quite accurate in the assessment of thrombophilia predisposition risk, with an overall accuracy higher than 90% and a sensitivity ranging in the interval [86.5%, 88.1%]. The main strength of the proposed solution is its ability to deal explicitly with incomplete, unknown, or even self-contradictory information.
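The reported figures follow the standard confusion-matrix definitions; a minimal sketch with purely illustrative counts (not data from the thesis):

```python
def accuracy(tp, tn, fp, fn):
    """Fraction of all assessments that were correct."""
    return (tp + tn) / (tp + tn + fp + fn)

def sensitivity(tp, fn):
    """Fraction of the true at-risk cases that the model identified."""
    return tp / (tp + fn)

# Illustrative counts only, chosen to fall within the reported ranges.
tp, tn, fp, fn = 87, 95, 5, 13
assert accuracy(tp, tn, fp, fn) > 0.90
assert 0.865 <= sensitivity(tp, fn) <= 0.881
```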
Abstract:
2016
Abstract:
Localization of technology is now widely applied to the preservation and revival of the culture of indigenous peoples around the world, most commonly through translation into indigenous languages, which has been proven to increase the adoption of technology. However, this current form of localization excludes two demographic groups which are key to the effectiveness of localization efforts in the African context: the younger generation (under the age of thirty), with an Anglo-American cultural view, who have no need of or interest in their indigenous culture; and the older generation (over the age of fifty), who are very knowledgeable about their indigenous culture but have little or no knowledge of the use of a computer. This paper presents the design of a computer game engine that can be used to provide an interface for both technology and indigenous culture learning for both generations. Four indigenous Ugandan games are analyzed and identified for their attractiveness to both generations, to both rural and urban populations, and for their propensity to develop IT skills in older generations.
Abstract:
The co-curing process for advanced grid-stiffened (AGS) composite structures is a promising manufacturing process which could reduce the manufacturing cost, enhance the advantages and improve the performance of the AGS composite structure. An improved method named the soft-mold aided co-curing process, which replaces the expansion molds with a single rubber mold, is adopted in this paper. This co-curing process is capable of co-curing a typical AGS composite structure with the manufacturer's recommended cure cycle (MRCC). Numerical models are developed to evaluate the variation of temperature and the degree of cure in the AGS composite structure during the soft-mold aided co-curing process. The simulation results were validated by experimental results obtained from embedded temperature sensors. Based on the validated modeling framework, the cure cycle can be optimized, cutting the MRCC time by more than half while still obtaining a reliable degree of cure. The shape and size effects of the AGS composite structure on the distribution of temperature and degree of cure are also investigated to provide insights for the optimization of the soft-mold aided co-curing process.
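Numerical cure models of this kind typically couple heat transfer with Arrhenius cure kinetics. The kinetic half can be sketched with an nth-order model; all parameter values below are illustrative placeholders, not the paper's material data:

```python
import math

def cure_rate(alpha, T, A=1e5, E=60e3, n=1.5, R=8.314):
    """nth-order Arrhenius cure kinetics:
    d(alpha)/dt = A * exp(-E / (R * T)) * (1 - alpha)**n."""
    return A * math.exp(-E / (R * T)) * (1.0 - alpha) ** n

def degree_of_cure(T_profile, dt=1.0):
    """Explicit-Euler integration of the degree of cure over a
    temperature profile given in kelvin, one entry per time step."""
    alpha = 0.0
    for T in T_profile:
        alpha = min(alpha + dt * cure_rate(alpha, T), 1.0)
    return alpha

# Isothermal hold at 450 K for two hours, 1 s time steps.
final = degree_of_cure([450.0] * 7200)
# final approaches (but never quite reaches) full cure under this model.
```

Optimizing a cure cycle then amounts to searching over temperature profiles for the shortest one that still reaches a target degree of cure everywhere in the part.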
Abstract:
We study the supercore of a system derived from a normal form game. For the case of a finite game with pure strategies, we define a sequence of games and show that the supercore of that system coincides with the set of Nash equilibrium strategy profiles of the last game in the sequence. This result is illustrated with the characterization of the supercore for the n-person prisoners' dilemma. With regard to the mixed extension of a normal form game, we show that the set of Nash equilibrium profiles coincides with the supercore for games with a finite number of Nash equilibria. For games with an infinite number of Nash equilibria this need no longer be the case. Yet, it is not difficult to find a binary relation which guarantees the coincidence of these two sets.
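For finite games with pure strategies, the set of Nash equilibrium profiles referred to above can be enumerated by brute force. A generic sketch (not the paper's construction), applied to the two-person prisoners' dilemma:

```python
from itertools import product

def pure_nash_equilibria(payoffs, strategies):
    """Enumerate pure-strategy Nash profiles of a finite normal form game.
    `payoffs` maps a strategy profile (tuple) to a tuple of payoffs."""
    n = len(strategies)
    equilibria = []
    for profile in product(*strategies):
        # A profile is a Nash equilibrium if no player gains by a
        # unilateral deviation to any alternative strategy.
        stable = all(
            payoffs[profile][i] >= payoffs[profile[:i] + (alt,) + profile[i + 1:]][i]
            for i in range(n)
            for alt in strategies[i]
        )
        if stable:
            equilibria.append(profile)
    return equilibria

# Two-person prisoners' dilemma: mutual defection is the unique equilibrium.
strategies = [("C", "D"), ("C", "D")]
pd = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
      ("D", "C"): (5, 0), ("D", "D"): (1, 1)}
# pure_nash_equilibria(pd, strategies) == [("D", "D")]
```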
Abstract:
This master's thesis examines the spatiality of video games and the video-game adaptation of real places. It focuses on one specific place, the subway, and on its representation in the horror genre. The research distinguishes three levels of adaptation of place and of spatial creation: systemic adaptation, socio-historical adaptation and, finally, technological adaptation. Drawing on examples of games compared with the concrete realities of the subway, these three levels of adaptation are analyzed in order to explore both the impact of the virtual place on the play experience and the external influences that guide video-game design.
Abstract:
There are problems that seem impossible to solve without the help of an honest third party. How can two millionaires learn who is richer without telling each other the value of their assets? What can be done to prevent satellite collisions when the trajectories are secret? How can researchers learn the links between drugs and diseases without compromising patients' privacy rights? How can an organization prevent the government from abusing the information it holds, given that the organization must have no access to that information? Multiparty computation, a branch of cryptography, studies how to build protocols that accomplish such tasks without an honest third party. The protocols must be private, correct, efficient and robust. A protocol is private if an adversary learns nothing more than what an honest third party would give him. A protocol is correct if an honest player receives what an honest third party would give him. A protocol should, of course, be efficient. Robustness means that a protocol works even if a small set of players cheats. We show that, under the assumption of a simultaneous broadcast channel, robustness can be traded for validity, and privacy can be traded against certain adversary sets. Multiparty computation has four basic tools: oblivious transfer, commitment, secret sharing and circuit garbling. Multiparty computation protocols can be built from these tools alone. Protocols can also be built from computational assumptions. Protocols built from these basic tools are flexible and can withstand technological change and algorithmic improvements. We ask whether efficiency requires computational assumptions.
We show that it does not, by constructing efficient protocols from these basic tools. This thesis consists of four articles written in collaboration with other researchers; they constitute the mature part of my research and are my main contributions over this period. In the first work presented in this thesis, we study the commitment capacity of noisy channels. We first prove a strict lower bound which implies that, unlike oblivious transfer, there exists no constant-rate protocol for bit commitment. We then show that, by restricting the way commitments can be opened, we can do better, and even achieve a constant rate in some cases. This is done by exploiting the notion of cover-free families. In the second article, we show that for some problems there is a trade-off between robustness, validity and privacy. It is obtained using verifiable secret sharing, a zero-knowledge proof, the concept of ghosts, and a technique we call balls and bins. In our third contribution, we show that a large number of protocols in the literature based on computational assumptions can be instantiated from a primitive called Verifiable Oblivious Transfer, via the concept of Generalized Oblivious Transfer. The protocol uses secret sharing as a basic tool. In the last publication, we construct an efficient constant-round protocol for two-party computation. The protocol's efficiency derives from replacing the core of a standard protocol with a primitive that works only moderately well but is very cheap. We protect the protocol against defects using the concept of privacy amplification.
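Of the four basic tools listed above, secret sharing is the easiest to illustrate. A minimal additive scheme (a textbook sketch, not one of the thesis's protocols) already exhibits the homomorphic property that multiparty protocols exploit:

```python
import secrets

P = 2**61 - 1  # prime modulus; an illustrative choice

def share(secret, n):
    """Split `secret` into n additive shares modulo P; any n-1 shares
    are uniformly random and reveal nothing about the secret."""
    parts = [secrets.randbelow(P) for _ in range(n - 1)]
    parts.append((secret - sum(parts)) % P)
    return parts

def reconstruct(parts):
    return sum(parts) % P

shares = share(42, 5)
assert reconstruct(shares) == 42

# Additive homomorphism: players add their shares of two secrets locally,
# obtaining shares of the sum without ever seeing either secret.
a, b = share(10, 3), share(20, 3)
c = [(x + y) % P for x, y in zip(a, b)]
assert reconstruct(c) == 30
```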
Abstract:
We transform a non-cooperative game into a Bayesian decision problem for each player, where the uncertainty faced by a player consists of the strategy choices of the other players, the priors of the other players on the choices of the other players, the priors over priors, and so on. We provide a complete characterization between the extent of knowledge about the rationality of players and their ability to successfully eliminate strategies which are not best responses. This paper therefore provides the informational foundations of iteratively undominated strategies and rationalizable strategic behavior (Bernheim (1984) and Pearce (1984)). Moreover, sufficient conditions are also found for Nash equilibrium behavior. We also provide Aumann's (1985) results on correlated equilibria.
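The iterative elimination procedure whose informational foundations the paper studies can be sketched mechanically for finite two-player games (a generic illustration, not the paper's epistemic construction):

```python
def iterated_elimination(payoffs, strategies):
    """Iteratively remove strictly dominated pure strategies in a
    two-player game; payoffs[(s1, s2)] = (u1, u2)."""
    s1, s2 = list(strategies[0]), list(strategies[1])
    own_sets = (s1, s2)
    changed = True
    while changed:
        changed = False
        for p in (0, 1):
            own, other = own_sets[p], own_sets[1 - p]

            def u(mine, theirs):
                prof = (mine, theirs) if p == 0 else (theirs, mine)
                return payoffs[prof][p]

            for cand in list(own):
                # Remove cand if some surviving strategy beats it against
                # every surviving strategy of the opponent.
                if any(all(u(dom, t) > u(cand, t) for t in other)
                       for dom in own if dom != cand):
                    own.remove(cand)
                    changed = True
    return s1, s2

# Prisoners' dilemma: defection strictly dominates cooperation for both.
pd = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
      ("D", "C"): (5, 0), ("D", "D"): (1, 1)}
surviving = iterated_elimination(pd, [("C", "D"), ("C", "D")])
# surviving == (["D"], ["D"])
```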
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
We introduce the need for a distributed guideline-based decision-support (DSS) process, describe its characteristics, and explain how we implemented this process within the European Union's MobiGuide project. In particular, we have developed a mechanism of sequential, piecemeal projection, i.e., 'downloading' small portions of the guideline from the central DSS server to the local DSS in the patient's mobile device, which then applies that portion using the mobile device's local resources. The mobile device sends a callback to the central DSS when it encounters a triggering pattern predefined in the projected module, which leads to an appropriate predefined action by the central DSS, including sending a new projected module or directly controlling the rest of the workflow. We suggest that such a distributed architecture, which explicitly defines a dialog between a central DSS server and a local DSS module, better balances the computational load and exploits the relative advantages of the central server and of the local mobile device.
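The projection/callback dialog described above can be caricatured in a few lines; every class, field, and trigger name below is hypothetical, not the MobiGuide API:

```python
# Hypothetical sketch of the sequential projection/callback dialog.
class CentralDSS:
    def __init__(self, modules):
        self.modules = modules   # ordered guideline portions to project
        self.next = 0

    def project(self):
        """Send the next small guideline portion to the local DSS, if any."""
        if self.next >= len(self.modules):
            return None
        module = self.modules[self.next]
        self.next += 1
        return module

class LocalDSS:
    def __init__(self, central):
        self.central = central
        self.module = central.project()

    def observe(self, observation):
        """Apply the projected module locally; call back on a trigger."""
        if self.module and observation == self.module["trigger"]:
            action = self.module["action"]
            self.module = self.central.project()  # callback -> new projection
            return action
        return "monitor"

central = CentralDSS([
    {"trigger": "high_bp", "action": "alert_clinician"},
    {"trigger": "normal_bp", "action": "resume_plan"},
])
local = LocalDSS(central)
log = [local.observe(o) for o in ["ok", "high_bp", "ok", "normal_bp"]]
# log == ["monitor", "alert_clinician", "monitor", "resume_plan"]
```

The point of the split is that routine monitoring runs entirely on the device, and the central server is contacted only at predefined trigger patterns.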
Abstract:
A simple evolutionary process can discover sophisticated methods for emergent information processing in decentralized spatially extended systems. The mechanisms underlying the resulting emergent computation are explicated by a technique for analyzing particle-based logic embedded in pattern-forming systems. Understanding how globally coordinated computation can emerge in evolution is relevant both for the scientific understanding of natural information processing and for engineering new forms of parallel computing systems.
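For contrast with the evolved particle-based strategies, a hand-designed purely local rule for such a decentralized system can be sketched directly (an illustrative radius-1 binary cellular automaton, not the paper's evolved rules):

```python
import random
from itertools import product

def ca_step(state, rule):
    """One synchronous update of a radius-1 binary CA on a ring;
    `rule` maps each 3-cell neighborhood to the next cell state."""
    n = len(state)
    return [rule[(state[i - 1], state[i], state[(i + 1) % n])] for i in range(n)]

# Local majority vote: each cell adopts the majority value of its neighborhood.
majority = {nbhd: int(sum(nbhd) >= 2) for nbhd in product((0, 1), repeat=3)}

random.seed(0)
state = [random.randint(0, 1) for _ in range(59)]
for _ in range(30):
    state = ca_step(state, majority)
# The purely local majority rule freezes into stable blocks instead of
# reaching a global consensus, which is why evolved rules rely on
# propagating particles to carry information across the lattice.
```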
Abstract:
Description based on: 1890.
Abstract:
Computer Game Playing has been an active area of research since Samuel’s first Checkers player (Samuel 1959). Recently, interest beyond the classic games of Chess and Checkers has led to competitions such as the General Game Playing competition, in which players have no beforehand knowledge of the games they are to play, and the Computer Poker Competition, which forces players to reason about imperfect information under conditions of uncertainty. The purpose of this dissertation is to explore the area of General Game Playing both specifically and generally. On the specific side, we describe the design and implementation of our General Game Playing system OGRE. This system includes an innovative method for feature extraction that helped it to achieve second and fourth place in two international General Game Playing competitions. On the more general side, we also introduce the Regular Game Language, which goes beyond current work to provide support for both stochastic and imperfect information games as well as the more traditional games.