904 results for Secure multiparty computation cryptography


Relevance:

20.00%

Publisher:

Abstract:

The Division of Professional and Occupational Licensing is an administrative unit within the Department of Labor, Licensing, and Regulation that provides support to 40 professional and occupational regulatory boards. All board members need documents to review during meetings. Some boards use a fleet of dated laptops during the meetings to give board members the meeting materials in PDF format; however, many boards still print the meeting materials and put them into binders. In today's age, there has to be a more efficient way to conduct meetings without all of the paper. This paper explores alternatives to paper and laptops for use at these meetings. It was concluded that a tablet running the Windows operating system would be the best option.

Relevance:

20.00%

Publisher:

Abstract:

Reinforcement learning is a paradigm of machine learning that has recently proved, time and again, to be a very effective and powerful approach. Cryptography, on the other hand, usually takes the opposite direction: while machine learning aims at analyzing data, cryptography aims at preserving its privacy by hiding it. The two techniques can nevertheless be combined to create privacy-preserving models, able to make inferences on data without leaking sensitive information. Despite the numerous studies on machine learning and cryptography, reinforcement learning in particular had never been applied in such settings before. Successfully using reinforcement learning in an encrypted scenario would allow us to create an agent that efficiently controls a system without giving it full knowledge of the environment it operates in, opening the way to many possible use cases. We have therefore applied the reinforcement learning paradigm to encrypted data. In this project we applied one of the best-known reinforcement learning algorithms, Deep Q-Learning, to simple simulated environments and studied how encryption affects the training performance of the agent, in order to see whether it can still learn how to behave even when the input data is no longer human-readable. The results of this work highlight that the agent still learns without issue in small state spaces with non-secure encryption, such as AES in ECB mode. For fixed environments, it also reaches a suboptimal solution even in the presence of secure modes, such as AES in CBC mode, improving significantly on a random agent; however, its ability to generalize in stochastic environments or large state spaces suffers greatly.
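The key observation behind the ECB-mode result can be sketched in a few lines: a deterministic, ECB-like encryption maps each state to a single opaque token, so an agent that indexes its value table by ciphertexts can still learn. This is an illustrative toy, not the thesis's setup: SHA-256 hashing stands in for deterministic encryption, and plain tabular Q-learning on a small chain stands in for Deep Q-Learning.

```python
import hashlib
import random

def encrypt(state):
    # Stand-in for a deterministic (ECB-like) cipher: equal plaintexts give
    # equal ciphertexts, so the agent can index its Q-table by the token.
    return hashlib.sha256(str(state).encode()).hexdigest()

def train(n_states=5, episodes=800, alpha=1.0, gamma=0.9, eps=0.2, seed=0):
    """Tabular Q-learning on a chain: action +1 moves toward the goal state.
    The environment is deterministic, so full-step updates (alpha = 1)
    behave like asynchronous value iteration and converge exactly."""
    rng = random.Random(seed)
    q = {}  # Q-table keyed by (ciphertext, action)
    act = lambda tok: max([-1, 1], key=lambda a: q.get((tok, a), 0.0))
    for _ in range(episodes):
        s = rng.randrange(n_states - 1)       # random start for fast coverage
        for _ in range(100):                  # step cap per episode
            tok = encrypt(s)
            a = rng.choice([-1, 1]) if rng.random() < eps else act(tok)
            s2 = min(max(s + a, 0), n_states - 1)
            r = 1.0 if s2 == n_states - 1 else 0.0
            best = max(q.get((encrypt(s2), b), 0.0) for b in [-1, 1])
            q[(tok, a)] = (1 - alpha) * q.get((tok, a), 0.0) \
                          + alpha * (r + gamma * best)
            s = s2
            if s == n_states - 1:
                break
    return [act(encrypt(s)) for s in range(n_states - 1)]

policy = train()  # greedy policy learned purely from ciphertext states
```

Because the state-to-ciphertext map is a bijection, the learned greedy policy chooses "move right" everywhere, exactly as it would on plaintext states; a probabilistic (CBC-like) cipher would break this bijection and degrade learning, as the abstract reports.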

Relevance:

20.00%

Publisher:

Abstract:

Nowadays, information security is a very important topic. Wireless networks in particular are experiencing widespread diffusion, also thanks to the increasing number of Internet of Things devices, which generate and transmit large amounts of data: protecting wireless communications is of fundamental importance, ideally through a simple yet secure method. Physical Layer Security is an umbrella of techniques that leverage the characteristics of the wireless channel to secure the transmission. In particular, physical-layer key generation aims at allowing two users to generate a random symmetric key autonomously, i.e. without the aid of a trusted third party. Physical-layer key generation relies on observations of the wireless channel, from which entropy is harvested; however, an attacker might possess a channel simulator, for example a ray tracing simulator, with which to replicate the channel between the legitimate users, guess the secret key, and break the security of the communication. This thesis focuses on the feasibility of such a ray tracing attack: the assessment method consists of a set of channel measurements, under different channel conditions, which are then compared with the simulated channel from the ray tracer by computing the mutual information between measurements and simulations. Furthermore, ray tracing is also presented as a tool to evaluate the impact of channel parameters (e.g. the bandwidth or the directivity of the antenna) on physical-layer key generation. The measurements were carried out at the Barkhausen Institut gGmbH in Dresden, Germany, in the framework of the existing cooperation agreement between BI and the Dept. of Electrical, Electronics and Information Engineering "G. Marconi" (DEI) at the University of Bologna.
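The assessment metric above, mutual information between measured and simulated channel gains, can be estimated with a simple joint-histogram approach. A minimal sketch, not the thesis's code; the bin count and the toy Gaussian "channels" are illustrative assumptions:

```python
import math
import random

def mutual_information(x, y, bins=8):
    """Estimate I(X;Y) in bits from two equal-length samples by quantizing
    each into equal-width bins and building a joint histogram."""
    def quantize(v):
        lo, hi = min(v), max(v)
        w = (hi - lo) / bins or 1.0
        return [min(int((s - lo) / w), bins - 1) for s in v]
    qx, qy = quantize(x), quantize(y)
    n = len(x)
    joint, px, py = {}, {}, {}
    for a, b in zip(qx, qy):
        joint[(a, b)] = joint.get((a, b), 0) + 1
        px[a] = px.get(a, 0) + 1
        py[b] = py.get(b, 0) + 1
    mi = 0.0
    for (a, b), c in joint.items():
        p_ab = c / n
        # p_ab / (p_a * p_b), written with the raw counts
        mi += p_ab * math.log2(p_ab * n * n / (px[a] * py[b]))
    return mi

rng = random.Random(1)
measured = [rng.gauss(0, 1) for _ in range(5000)]          # measured gains
correlated = [m + rng.gauss(0, 0.1) for m in measured]     # faithful simulator
independent = [rng.gauss(0, 1) for _ in range(5000)]       # useless simulator
```

A high mutual information between the measurement and the ray-traced prediction would indicate that the simulator leaks usable key material; near-zero mutual information means the attack fails.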

Relevance:

20.00%

Publisher:

Abstract:

This dissertation investigates the relations between logic and TCS in the probabilistic setting. It is motivated by two main considerations. On the one hand, since their appearance in the 1960s-1970s, probabilistic models have become increasingly pervasive in several fast-growing areas of CS. On the other, the study and development of (deterministic) computational models has considerably benefitted from the mutual interchanges between logic and CS. Nevertheless, probabilistic computation was only marginally touched by such fruitful interactions. The goal of this thesis is precisely to start bridging this gap, by developing logical systems corresponding to specific aspects of randomized computation and, thereby, generalizing standard achievements to the probabilistic realm. To do so, our key ingredient is the introduction of new, measure-sensitive quantifiers associated with quantitative interpretations. The dissertation is tripartite. In the first part, we focus on the relation between logic and counting complexity classes. We show that, by means of our classical counting propositional logic, the standard results by Cook and by Meyer and Stockmeyer linking propositional logic and the polynomial hierarchy can be generalized to counting classes. Indeed, we show that the validity problem for counting-quantified formulae captures the corresponding level in Wagner's hierarchy. In the second part, we consider programming language theory. Type systems for randomized λ-calculi, also guaranteeing various forms of termination, were introduced in recent decades, but they are not "logically oriented" and no Curry-Howard correspondence is known for them. Following intuitions coming from counting logics, we define the first probabilistic version of the correspondence. Finally, we consider the relationship between arithmetic and computation. We present a quantitative extension of the language of arithmetic able to formalize basic results from probability theory. This language is also our starting point for defining randomized bounded theories and, in turn, generalizing canonical results by Buss.
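A plausible formal rendering of the counting quantifier mentioned above, reconstructed from the abstract rather than taken from the dissertation's exact definitions: C^q asserts that the set of valuations satisfying a formula is large enough.

```latex
% Hedged reconstruction: counting quantification over propositional valuations.
\[
  \models \mathbf{C}^{q} F
  \quad\Longleftrightarrow\quad
  \mu\bigl(\{\,\sigma \mid \sigma \models F\,\}\bigr) \;\ge\; q,
\]
% where \mu is the uniform (counting) measure on the valuations \sigma of the
% quantified atoms. Nesting and alternating such quantifiers is what allows
% the validity problem to climb the levels of Wagner's counting hierarchy.
```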

Relevance:

20.00%

Publisher:

Abstract:

One of the main practical implications of quantum mechanics is quantum computing, and therefore the quantum computer. Quantum computing (for example, via Shor's algorithm) challenges the computational hardness assumptions, such as the factoring problem and the discrete logarithm problem, that anchor the security of today's cryptosystems. The scientific community is therefore studying how to defend cryptography, along two strategies: quantum cryptography (which involves the use of quantum cryptographic algorithms on quantum computers) and post-quantum cryptography (based on classical cryptographic algorithms that are resistant to quantum computers). For example, the National Institute of Standards and Technology (NIST) is collecting and standardizing post-quantum ciphers, just as it established DES and AES as symmetric cipher standards in the past. This thesis gives an introduction to quantum mechanics, in order to discuss quantum computing and analyze Shor's algorithm. The differences between quantum and post-quantum cryptography are then analyzed. Subsequently, the focus moves to the mathematical problems assumed to be resistant to quantum computers. To conclude, the post-quantum digital signature algorithms selected by NIST are studied and compared with a view to their use in everyday life.
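The structure of Shor's algorithm is easiest to see from its classical skeleton: only the order-finding subroutine needs a quantum computer, and everything else is classical number theory. A sketch (order finding is brute-forced here, so only tiny n are feasible):

```python
import math
import random

def multiplicative_order(a, n):
    """Smallest r > 0 with a^r = 1 (mod n). Brute-forced here: this is
    exactly the step Shor's algorithm delegates to the quantum computer."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n, seed=0):
    """Classical skeleton of Shor's factoring algorithm for an odd composite
    n that is not a prime power: returns a non-trivial factor of n."""
    rng = random.Random(seed)
    while True:
        a = rng.randrange(2, n)
        g = math.gcd(a, n)
        if g > 1:
            return g                       # lucky guess already shares a factor
        r = multiplicative_order(a, n)
        x = pow(a, r // 2, n)
        if r % 2 == 0 and x != n - 1:
            # x^2 = 1 (mod n) with x != +-1, so gcd(x - 1, n) is non-trivial
            return math.gcd(x - 1, n)

factor = shor_classical(15)
```

Since brute-force order finding takes exponential time in the bit length of n, replacing it with the quantum period-finding circuit is what makes the full algorithm polynomial, and what motivates the post-quantum schemes the thesis surveys.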

Relevance:

20.00%

Publisher:

Abstract:

In this work, we develop a randomized bounded arithmetic for probabilistic computation, following the approach adopted by Buss for non-randomized computation. This work relies on a notion of representability inspired by Buss's, but depending on a non-standard quantitative and measurable semantics. We then establish that the representable functions are exactly those in PPT. Finally, we extend the language of our arithmetic with a measure quantifier, which is true if and only if the semantics of the quantified formula has measure greater than a given threshold. This allows us to give purely logical characterizations of standard probabilistic complexity classes such as BPP, RP, co-RP and ZPP.
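Schematically, the measure quantifier and the resulting BPP-style characterization can be rendered as follows; this is a hedged reconstruction from the abstract, and the thesis's exact thresholds and syntax may differ.

```latex
% Measure quantifier: true when the quantified formula holds on a
% large-enough set of random choices \omega.
\[
  \models \mathbf{C}_{\ge t}\, F
  \quad\Longleftrightarrow\quad
  \mu\bigl(\{\,\omega \mid \omega \models F\,\}\bigr) \;\ge\; t.
\]
% Schematically, a language L is then in BPP when some polynomial-time
% predicate F satisfies, for every input x:
%   C_{>= 2/3} F(x)        if x is in L, and
%   C_{>= 2/3} \neg F(x)   otherwise
% (the usual 2/3 bound on two-sided error).
```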

Relevance:

10.00%

Publisher:

Abstract:

In this paper, we address the problem of picking a subset of bids in a general combinatorial auction so as to maximize the overall profit under the first-price model. This winner determination problem assumes that a single bidding round is held to determine both the winners and the prices to be paid. We introduce six variants of biased random-key genetic algorithms for this problem. Three of them use a novel initialization technique that takes solutions of intermediate linear programming relaxations of an exact mixed integer linear programming model as the initial chromosomes of the population. An experimental evaluation compares the effectiveness of the proposed algorithms with the standard mixed integer linear programming formulation, a specialized exact algorithm, and the best-performing heuristics proposed for this problem. The proposed algorithms are competitive and offer strong results, mainly for large-scale auctions.
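The biased random-key idea can be illustrated compactly: a chromosome is a vector of keys in [0, 1), a decoder turns the key ordering into a feasible set of winning bids, and crossover is biased toward the elite parent. A minimal sketch under my own assumptions (tiny instance, illustrative parameters, no LP-relaxation initialization):

```python
import random

def decode(keys, bids):
    """Decoder: visit bids in the order induced by their random keys and
    greedily accept each bid whose items are all still free.
    First-price model: profit is the sum of accepted bid prices."""
    order = sorted(range(len(bids)), key=lambda i: keys[i])
    taken, profit = set(), 0.0
    for i in order:
        items, price = bids[i]
        if taken.isdisjoint(items):
            taken |= set(items)
            profit += price
    return profit

def brkga_sketch(bids, pop=30, gens=40, elite=0.2, mutant=0.1, rho=0.7, seed=0):
    """A compact biased random-key GA: elitism, random mutants, and
    elite-biased uniform crossover over key vectors."""
    rng = random.Random(seed)
    n = len(bids)
    new_key = lambda: [rng.random() for _ in range(n)]
    population = [new_key() for _ in range(pop)]
    n_elite, n_mut = int(pop * elite), int(pop * mutant)
    for _ in range(gens):
        population.sort(key=lambda k: -decode(k, bids))
        nxt = population[:n_elite] + [new_key() for _ in range(n_mut)]
        while len(nxt) < pop:
            e = rng.choice(population[:n_elite])      # elite parent
            o = rng.choice(population[n_elite:])      # non-elite parent
            # biased crossover: inherit each key from the elite with prob. rho
            nxt.append([ek if rng.random() < rho else ok
                        for ek, ok in zip(e, o)])
        population = nxt
    population.sort(key=lambda k: -decode(k, bids))
    return decode(population[0], bids)

# illustrative instance: bids are (item set, price); optimum is 13.0
bids = [({1, 2}, 8.0), ({2, 3}, 6.0), ({1}, 5.0), ({3}, 5.0), ({2}, 3.0)]
best = brkga_sketch(bids)
```

The decoder guarantees feasibility for any chromosome, which is the main appeal of random-key encodings for constrained problems like winner determination.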

Relevance:

10.00%

Publisher:

Abstract:

This work studies the forced-air cooling of strawberries by numerical simulation. The mathematical model describes the heat transfer process, based on Fourier's law, in spherical coordinates, simplified to a one-dimensional description. To solve the model equation, an algorithm based on the explicit scheme of the finite-difference method was developed and implemented in the scientific computing environment MATLAB 6.1. The mathematical model was validated by comparing theoretical and experimental data from strawberries cooled with forced air. The results showed that the convective heat transfer coefficient can be determined by fitting the numerical results to the experimental data. Numerical simulation thus proved a promising tool to support decisions on using or developing equipment for the forced-air cooling of spherical fruits.
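The scheme described above can be sketched as follows: an explicit (FTCS) finite-difference discretization of one-dimensional radial conduction in a sphere with a convective surface boundary. The property values and grid parameters are illustrative assumptions, not the paper's data, and the language is Python rather than the paper's MATLAB.

```python
def cool_sphere(radius=0.015, alpha=1.3e-7, k=0.55, h=25.0,
                t_init=20.0, t_air=1.0, minutes=30.0, nr=20):
    """Explicit (FTCS) finite differences for 1-D radial conduction in a
    sphere, dT/dt = alpha*(d2T/dr2 + (2/r)*dT/dr), with convective cooling
    at the surface. Illustrative strawberry-like property values."""
    dr = radius / nr
    # the center node requires 6*alpha*dt/dr**2 <= 1 for stability
    dt = 0.15 * dr * dr / alpha
    T = [t_init] * (nr + 1)
    for _ in range(int(minutes * 60.0 / dt)):
        Tn = T[:]
        # center node: by symmetry, dT/dt = 6*alpha*(T[1] - T[0])/dr**2
        Tn[0] = T[0] + 6.0 * alpha * dt / dr**2 * (T[1] - T[0])
        for i in range(1, nr):
            r = i * dr
            d2 = (T[i + 1] - 2.0 * T[i] + T[i - 1]) / dr**2
            d1 = (T[i + 1] - T[i - 1]) / (2.0 * dr)
            Tn[i] = T[i] + alpha * dt * (d2 + 2.0 / r * d1)
        # convective surface node, from -k*dT/dr = h*(T_surface - T_air)
        bi = h * dr / k
        Tn[nr] = (Tn[nr - 1] + bi * t_air) / (1.0 + bi)
        T = Tn
    return T  # radial profile: T[0] is the center, T[-1] the surface

profile = cool_sphere()
```

Fitting h to measured cooling curves, as the paper does, amounts to rerunning this simulation for candidate values of h and minimizing the mismatch with the experimental temperatures.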

Relevance:

10.00%

Publisher:

Abstract:

Universidade Estadual de Campinas. Faculdade de Educação Física

Relevance:

10.00%

Publisher:

Abstract:

Universidade Estadual de Campinas. Faculdade de Educação Física

Relevance:

10.00%

Publisher:

Abstract:

OBJECTIVE: To estimate the spatial intensity of urban violence events using wavelet-based methods and emergency room data. METHODS: Information on victims attended at the emergency room of a public hospital in the city of São Paulo, Southeastern Brazil, from January 1, 2002 to January 11, 2003 was obtained from hospital records. The spatial distribution of 3,540 events was recorded, and a uniform random procedure was used to allocate records with incomplete addresses. Point-process and wavelet analysis techniques were used to estimate the spatial intensity, defined as the expected number of events per unit area. RESULTS: Of all georeferenced points, 59% were accidents and 40% were assaults. The spatial distribution of events is non-homogeneous, with high concentration in two districts and along three large avenues in the southern area of the city of São Paulo. CONCLUSIONS: Hospital records combined with methodological tools to estimate the intensity of events are useful for studying urban violence. Wavelet analysis is useful for computing the expected number of events and their respective confidence bands for any sub-region and, consequently, for specifying risk estimates that could be used in decision-making processes for public policies.
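The idea of a wavelet-based intensity estimate can be sketched with one level of the 2-D Haar transform: bin the events on a grid, shrink small detail coefficients, and invert. This is a deliberately minimal stand-in for the paper's estimator; the unit square domain, grid size and threshold are my own illustrative assumptions.

```python
import random

def haar_intensity(points, grid=16, cell_area=1.0, threshold=2.0):
    """Estimate spatial intensity (events per unit area) from points with
    coordinates in [0, 1): bin on a grid, apply one level of the 2-D Haar
    transform per 2x2 block, zero small detail coefficients, and invert."""
    counts = [[0.0] * grid for _ in range(grid)]
    for x, y in points:
        i = min(int(x * grid), grid - 1)
        j = min(int(y * grid), grid - 1)
        counts[i][j] += 1.0
    out = [[0.0] * grid for _ in range(grid)]
    for i in range(0, grid, 2):
        for j in range(0, grid, 2):
            a, b = counts[i][j], counts[i][j + 1]
            c, d = counts[i + 1][j], counts[i + 1][j + 1]
            avg = (a + b + c + d) / 4.0
            # Haar details: horizontal, vertical, and diagonal contrasts
            dh = (a - b + c - d) / 4.0
            dv = (a + b - c - d) / 4.0
            dd = (a - b - c + d) / 4.0
            if abs(dh) < threshold: dh = 0.0   # shrink small details:
            if abs(dv) < threshold: dv = 0.0   # keeps broad structure,
            if abs(dd) < threshold: dd = 0.0   # suppresses count noise
            out[i][j]         = avg + dh + dv + dd
            out[i][j + 1]     = avg - dh + dv - dd
            out[i + 1][j]     = avg + dh - dv - dd
            out[i + 1][j + 1] = avg - dh - dv + dd
    return [[v / cell_area for v in row] for row in out]

rng = random.Random(2)
events = [(rng.random(), rng.random()) for _ in range(400)]
intensity = haar_intensity(events)   # smoothed events per unit area, per cell
```

Because the block averages are untouched, thresholding preserves the total event count while flattening cell-to-cell noise, which is the basic trade-off behind wavelet intensity estimation.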

Relevance:

10.00%

Publisher:

Abstract:

The advent of highly active antiretroviral therapy (HAART) changed the natural history of AIDS, reducing mortality and the incidence of opportunistic diseases and increasing the life expectancy of people living with AIDS. With AIDS as a chronic disease, other questions become relevant, among them treatment adherence, adverse effects, and the quality of life of people in this condition. The ICF (International Classification of Functioning, Disability and Health) is an adequate instrument to identify the characteristics of functioning, environment, and personal conditions that affect quality of life. Instruments for its application, core sets, have been developed for various health conditions. In order to propose a core set for AIDS, two preliminary stages of the model proposed for building such instruments were carried out. The first stage, a systematic review, searched MEDLINE for articles with the descriptors HAART and quality of life, published in English from 2000 to 2004. Thirty-one studies were selected, yielding 87 concepts, of which 66 could be identified as ICF categories. These formed the questions of an interview applied to 42 volunteers, patients of a reference center for STD and AIDS in São Paulo. Among the conditions most frequently associated with treatment are changes in body image, a consequence of lipodystrophy, reported in 84 percent of the studies and 93 percent of the interviews. Alterations of digestive functions, intimate relationships, and sexual functions were important conditions identified in the study. The two stages defined 40 ICF categories as a preliminary proposal for a core set for patients with AIDS.

Relevance:

10.00%

Publisher:

Abstract:

Objectives: To evaluate activity limitation and social participation in individuals with type 2 diabetes mellitus. Methods: 79 patients were evaluated using the SALSA scale (Screening of Activity Limitation and Safety Awareness) and the Participation scale, which covers eight of the nine main life areas defined in the WHO International Classification of Functioning, Disability and Health (ICF). Results: The mean age of participants was 61.6 ± 9.8 years; 55.7 percent were female, 68.4 percent had a partner, 32.9 percent had an income of up to 3 minimum wages, and in 13.9 percent diabetes had affected their occupation. Mean disease duration was 10.3 ± 8.9 years. 39.3 percent of participants were treated with insulin, 70.9 percent with oral medication, 51.9 percent with diet, and 45.6 percent with physical exercise; 48.1 percent had some complication of the disease. The mean SALSA score was 26.5 ± 11.6, with higher scores when disease duration exceeded 10 years. As diabetes progresses, insulin therapy may become necessary and complications appear, which can interfere with occupation; these factors seem to contribute to activity limitation. The mean Participation scale score was 9.8 ± 10.9, with higher scores when interviewees considered their physical health to have worsened in the past year and used insulin. Conclusions: Activity limitation in type 2 diabetes mellitus was associated with disease duration, with a possible contribution from factors arising over its course. Self-assessed worsened physical health and insulin therapy were associated with social restriction.

Relevance:

10.00%

Publisher:

Abstract:

Since the first experimental evidence of active conductances in dendrites, most neurons have been shown to exhibit dendritic excitability through the expression of a variety of voltage-gated ion channels. However, despite experimental and theoretical efforts over the past decades, the role of this excitability in dendritic computation has remained elusive. Here we show that, owing to very general properties of excitable media, the average output of a model of an active dendritic tree is a highly non-linear function of its afferent rate, attaining extremely large dynamic ranges (above 50 dB). Moreover, the model yields double-sigmoid response functions, as experimentally observed in retinal ganglion cells. We claim that enhancement of dynamic range is the primary functional role of active dendritic conductances. We predict that neurons with larger dendritic trees should have larger dynamic ranges and that blocking active conductances should decrease the dynamic range.
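The mechanism behind the claimed dynamic-range enhancement, weak stimuli amplified by activity propagating through an excitable medium, can be illustrated with a toy model. This is my own simplification (a chain of three-state excitable units rather than the paper's dendritic tree; all parameters illustrative):

```python
import random

def mean_activity(h, n=200, p=0.9, steps=400, seed=0):
    """Mean fraction of active units in a chain of excitable elements
    (0 = quiescent, 1 = active, 2 = refractory). A quiescent unit fires
    from external drive (prob. h per step) or from an active neighbour
    (prob. p); p > 0 plays the role of the active dendritic conductances."""
    rng = random.Random(seed)
    state = [0] * n
    active = 0
    for _ in range(steps):
        new = [0] * n
        for i in range(n):
            if state[i] == 0:
                neighbour = (i > 0 and state[i - 1] == 1) or \
                            (i < n - 1 and state[i + 1] == 1)
                if rng.random() < h or (neighbour and rng.random() < p):
                    new[i] = 1
            elif state[i] == 1:
                new[i] = 2            # just fired: now refractory
            # state 2 relaxes back to quiescent (new[i] stays 0)
        state = new
        active += state.count(1)
    return active / (n * steps)

weak = 1e-3
amplified = mean_activity(weak)         # excitable medium: activity spreads
passive = mean_activity(weak, p=0.0)    # propagation blocked
```

With propagation enabled, each rare external activation triggers a travelling wave, so the response to a weak input is amplified by an order of magnitude, while strong inputs saturate because of the refractory period; compressing weak inputs upward and strong inputs downward is exactly what widens the dynamic range.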

Relevance:

10.00%

Publisher:

Abstract:

Background: Analyses of population structure and breed diversity have provided insight into the origin and evolution of cattle. Previous studies used a low density of microsatellite markers; however, with the large number of single nucleotide polymorphism (SNP) markers now available, it is possible to perform genome-wide population genetic analyses in cattle. In this study, we used a high-density panel of SNP markers to examine population structure and diversity among eight cattle breeds sampled from Bos indicus and Bos taurus. Results: 2,641 SNPs spanning the entire bovine autosomal genome were genotyped in Angus, Brahman, Charolais, Dutch Black and White Dairy, Holstein, Japanese Black, Limousin and Nelore cattle. Population structure was examined using the linkage model in the program STRUCTURE, and Fst estimates were used to construct a neighbor-joining tree representing the phylogenetic relationships among these breeds. Conclusion: The whole-genome SNP panel identified several levels of population substructure in the set of examined cattle breeds. The greatest genetic differentiation was detected between the Bos taurus and Bos indicus breeds. When the Bos indicus breeds were excluded from the analysis, genetic differences between beef and dairy and between European and Asian breeds were detected among the Bos taurus breeds. Exploring the number of SNP loci required to differentiate between breeds showed that with 100 SNP loci individuals could be correctly clustered into breeds only 50% of the time; thus, a large number of SNP markers is required to replace the 30 microsatellite markers currently in common use in genetic diversity studies.
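The Fst estimates used to build the neighbor-joining tree can be illustrated with Wright's classical formulation for biallelic SNPs; the allele frequencies below are invented for illustration, not the study's data.

```python
def fst(p1, p2):
    """Wright's Fst for one biallelic SNP from the allele frequencies in two
    populations: (HT - HS) / HT, where HT is the expected heterozygosity of
    the pooled population and HS the mean within-population heterozygosity."""
    p_bar = (p1 + p2) / 2.0
    ht = 2.0 * p_bar * (1.0 - p_bar)
    hs = (2.0 * p1 * (1.0 - p1) + 2.0 * p2 * (1.0 - p2)) / 2.0
    return 0.0 if ht == 0.0 else (ht - hs) / ht

def mean_fst(freqs1, freqs2):
    """Average Fst across SNP loci, e.g. between two breeds; such pairwise
    values feed the distance matrix of a neighbor-joining tree."""
    values = [fst(a, b) for a, b in zip(freqs1, freqs2)]
    return sum(values) / len(values)

# illustrative frequencies: a strongly diverged pair of breeds (think
# Bos taurus vs. Bos indicus) versus a closely related pair
diverged = mean_fst([0.9, 0.8, 0.1], [0.1, 0.2, 0.9])
similar  = mean_fst([0.5, 0.6, 0.4], [0.5, 0.55, 0.45])
```

Averaging over thousands of SNPs, as the study does with its 2,641-marker panel, stabilizes these per-locus estimates enough to resolve the taurus/indicus split and the finer beef/dairy and European/Asian substructure.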