937 results for Threshold cryptographic schemes and algorithms


Relevance: 100.00%

Publisher:

Abstract:

Humans communicate through several kinds of channels: words, voice, body gestures, emotions, and so on. For this reason, a computer must perceive these various communication channels in order to interact intelligently with humans, for example by making use of microphones and webcams. In this thesis, we are interested in determining human emotions from images or video of faces, so as to then use this information in different application domains. The thesis begins with a brief introduction to machine learning, focusing on the models and algorithms we used, such as multilayer perceptrons, convolutional neural networks and autoencoders. It then presents the results of applying these models to several facial expression and emotion datasets. We concentrate on the study of different types of autoencoders (denoising autoencoder, contractive autoencoder, etc.) in order to reveal some of their limitations, such as the possibility of co-adaptation between filters or of obtaining an overly smooth spectral curve, and we study new ideas to address these problems. We also propose a new approach to overcome a limitation of autoencoders trained in the traditional, purely unsupervised way, that is, without using any knowledge of the task we ultimately want to solve (such as predicting class labels): we develop a new semi-supervised training criterion that exploits a small number of labelled examples in combination with a large quantity of unlabelled data in order to learn a representation suited to the classification task and to obtain better classification performance. Finally, we describe the general operation of our emotion detection system and propose new ideas that could lead to future work.
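As an illustrative sketch only, not the thesis's implementation: a minimal denoising autoencoder in numpy, where the input is corrupted with masking noise and the network is trained to reconstruct the clean input. The array sizes, corruption rate and single tied-weight hidden layer are assumptions made for the example.

import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy data standing in for flattened face images (values in [0, 1]).
X = rng.random((500, 64))

n_hidden, lr, corruption = 32, 0.1, 0.3
W = rng.normal(0, 0.1, (64, n_hidden))   # tied weights: W encodes, W.T decodes
b_h = np.zeros(n_hidden)
b_v = np.zeros(64)

for epoch in range(50):
    # Corrupt the input with masking noise, then reconstruct the *clean* input.
    mask = rng.random(X.shape) > corruption
    X_tilde = X * mask
    H = sigmoid(X_tilde @ W + b_h)        # encoder
    X_hat = sigmoid(H @ W.T + b_v)        # decoder (tied weights)

    # Cross-entropy reconstruction loss; gradients by standard backpropagation.
    delta_v = (X_hat - X) / len(X)
    delta_h = (delta_v @ W) * H * (1 - H)
    W -= lr * (X_tilde.T @ delta_h + delta_v.T @ H)
    b_v -= lr * delta_v.sum(axis=0)
    b_h -= lr * delta_h.sum(axis=0)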

Relevance: 100.00%

Publisher:

Abstract:

In this thesis, we focus in particular on a cryptographic primitive known as secret sharing. We explore both the classical and the quantum settings of these primitives, crowning our study with the presentation of a new quantum secret sharing protocol requiring a minimal number of quantum shares, i.e., a single quantum share per participant. Our study opens, in the preliminary chapter, with an overview of the mathematical notions underlying quantum information theory, whose primary purpose is to establish the notation used in this manuscript, together with a summary of the mathematical properties of the Greenberger-Horne-Zeilinger (GHZ) state, which is frequently used in quantum cryptography and in quantum communication games. As mentioned above, however, cryptography remains the focal point of this study. In the second chapter, we turn to the theory of classical and quantum error-correcting codes, which will in turn be of central importance when quantum secret sharing theory is introduced in the following chapter. In the first part of the third chapter, we concentrate on classical secret sharing, presenting a general theoretical framework for constructing these primitives and illustrating the concepts throughout with examples chosen for their historical as well as pedagogical interest. This prepares the ground for our exposition of quantum secret sharing theory, the focus of the second part of the same chapter. We then present the most general theorems and definitions known to date concerning the construction of these primitives, paying particular attention to quantum threshold sharing. We show the close link between the theory of quantum error-correcting codes and that of secret sharing; the link is so close that quantum error-correcting codes are considered closer analogues of quantum secret sharing schemes than classical secret sharing schemes are. Finally, we present one of our three results published in A. Broadbent, P.-R. Chouha, A. Tapp (2009): a secure and minimal quantum threshold secret sharing protocol (the two other results, not treated here, concern communication complexity and the classical simulation of the GHZ state).
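Since quantum threshold schemes are the quantum analogue of classical (t, n) threshold sharing, a short sketch of the classical construction may help fix ideas. This is the standard Shamir scheme over a prime field, not the quantum protocol of the thesis; the prime and the parameters are arbitrary choices for the example.

import random

P = 2_147_483_647  # a prime; the field size is an arbitrary choice for this sketch

def make_shares(secret, t, n):
    """Split `secret` into n shares; any t of them reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    # Share i is the degree-(t-1) polynomial evaluated at x = i, for i = 1..n.
    return [(i, sum(c * pow(i, k, P) for k, c in enumerate(coeffs)) % P)
            for i in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 over the prime field."""
    secret = 0
    for i, (x_i, y_i) in enumerate(shares):
        num, den = 1, 1
        for j, (x_j, _) in enumerate(shares):
            if i != j:
                num = num * (-x_j) % P
                den = den * (x_i - x_j) % P
        secret = (secret + y_i * num * pow(den, P - 2, P)) % P
    return secret

shares = make_shares(secret=123456, t=3, n=5)
assert reconstruct(shares[:3]) == 123456   # any 3 of the 5 shares suffice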

Relevance: 100.00%

Publisher:

Abstract:

Some problems seem impossible to solve without the help of an honest third party. How can two millionaires learn who is richer without telling each other the value of their assets? What can be done to prevent satellite collisions when the trajectories are secret? How can researchers learn about links between drugs and diseases without compromising patients' privacy? How can an organization prevent the government from abusing the information it holds, given that the organization must have no access to that information? Multiparty computation, a branch of cryptography, studies how to build protocols for carrying out such tasks without an honest third party. Protocols must be private, correct, efficient and robust. A protocol is private if an adversary learns nothing more than what an honest third party would give it. A protocol is correct if an honest player receives what an honest third party would give it. A protocol should, of course, be efficient. Robustness means that a protocol works even if a small set of players cheats. We show that, under the assumption of a simultaneous broadcast channel, one can trade robustness for validity and for privacy against certain sets of adversaries. Multiparty computation has four basic tools: oblivious transfer, commitment, secret sharing and circuit garbling. Multiparty computation protocols can be built using only these tools; they can also be built from computational assumptions. Protocols built from these tools are flexible and can withstand technological change and algorithmic improvements. We ask whether efficiency requires computational assumptions, and we show that it does not by constructing efficient protocols from these basic tools. This thesis consists of four articles written in collaboration with other researchers; they constitute the mature part of my research and are my main contributions over this period. In the first work presented in this thesis, we study the commitment capacity of noisy channels. We first prove a strict lower bound which implies that, unlike oblivious transfer, there is no constant-rate protocol for bit commitment. We then show that, by restricting the way commitments can be opened, we can do better, and even achieve a constant rate in some cases. This is done by exploiting the notion of cover-free families. In the second article, we show that for certain problems there is a trade-off between robustness, validity and privacy. It is obtained using verifiable secret sharing, a zero-knowledge proof, the concept of ghosts, and a technique we call balls and bins. In our third contribution, we show that a large number of protocols in the literature based on computational assumptions can be instantiated from a primitive called Verifiable Oblivious Transfer, via the concept of Generalized Oblivious Transfer. The protocol uses secret sharing as a basic tool.
In the last publication, we construct an efficient constant-round protocol for two-party computation. The protocol's efficiency comes from replacing the core of a standard protocol with a primitive that works only imperfectly but is very cheap. The protocol is protected against these defects using the concept of privacy amplification.
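To make the commitment primitive concrete (this is not the noisy-channel construction studied in the first article, only the generic commit/open interface): a hash-based commitment in which a player binds itself to a bit and reveals it later. The use of SHA-256 with a random nonce is an assumption made purely for illustration.

import hashlib
import secrets

def commit(bit: int):
    """Commit phase: publish a digest that hides the bit but binds the sender to it."""
    nonce = secrets.token_bytes(32)
    digest = hashlib.sha256(nonce + bytes([bit])).hexdigest()
    return digest, nonce            # digest is sent now, nonce is kept until opening

def open_commitment(digest: str, nonce: bytes, bit: int) -> bool:
    """Open phase: the receiver checks the revealed (nonce, bit) against the digest."""
    return hashlib.sha256(nonce + bytes([bit])).hexdigest() == digest

digest, nonce = commit(1)                     # sender commits to the bit 1
assert open_commitment(digest, nonce, 1)      # honest opening verifies
assert not open_commitment(digest, nonce, 0)  # sender cannot claim the other bit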

Relevance: 100.00%

Publisher:

Abstract:

Severe local storms, including tornadoes, damaging hail and wind gusts, frequently occur over the eastern and northeastern states of India during the pre-monsoon season (March-May). Forecasting thunderstorms is one of the most difficult tasks in weather prediction, due to their rather small spatial and temporal extension and the inherent non-linearity of their dynamics and physics. In this paper, sensitivity experiments are conducted with the WRF-NMM model to test the impact of convective parameterization schemes on simulating severe thunderstorms that occurred over Kolkata on 20 May 2006 and 21 May 2007, and the model results are validated against observations. In addition, a simulation without any convective parameterization scheme was performed for each case to determine whether the model could simulate the convection explicitly. A statistical analysis based on mean absolute error, root mean square error and correlation coefficient is performed to compare the simulated and observed data across the different convective schemes. This study shows that the prediction of thunderstorm-affected parameters is sensitive to the convective scheme. The Grell-Devenyi cloud ensemble convective scheme simulated the thunderstorm activity well in terms of timing, intensity and the region of occurrence of the events, compared with the other convective schemes and with the explicit run.
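The verification statistics named in the abstract are standard; a minimal numpy sketch, with made-up arrays standing in for simulated and observed values of a thunderstorm-affected parameter, could look like this.

import numpy as np

# Hypothetical simulated and observed values of one thunderstorm-affected parameter
sim = np.array([4.1, 6.8, 5.2, 7.9, 3.3])
obs = np.array([3.8, 7.2, 4.9, 8.5, 3.0])

mae  = np.mean(np.abs(sim - obs))            # mean absolute error
rmse = np.sqrt(np.mean((sim - obs) ** 2))    # root mean square error
corr = np.corrcoef(sim, obs)[0, 1]           # Pearson correlation coefficient

print(f"MAE={mae:.2f}  RMSE={rmse:.2f}  r={corr:.2f}")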

Relevance: 100.00%

Publisher:

Abstract:

The paper unfolds the paradox that exists in the tribal community with respect to development indicators and hence tries to cull out the differences in the standard of living of the tribes in a dichotomous framework, forward and backward. Four variables have been considered for ascertaining the standard of living and socio-economic conditions of the tribes. The data for the study were obtained from a primary survey in three tribal-predominant districts: Wayanad, Idukki and Palakkad. Wayanad was selected for studying six tribal communities (Paniya, Adiya, Kuruma, Kurichya, Urali and Kattunaika), Idukki for two communities (Malayarayan and Muthuvan) and Palakkad for one community (Irula). 500 samples from 9 prominent tribal communities of Kerala were collected according to a multistage proportionate random sampling framework. The analysis highlights the disproportionate nature of socio-economic indicators within the tribes in Kerala, owing to the failure of governmental schemes and assistance meant for their empowerment. The socio-economic variables, such as education, health and livelihood, are combined with the standard of living index (SLI); the correlation analysis yields interesting inferences for policy options, as highly educated tribal communities are positively correlated with high SLI and livelihood. Further, each of the SLI variables is decomposed using correlation and correspondence analysis to understand the relative standing of the nine tribal sub-communities in the three-dimensional framework of high, medium and low SLI levels. Tribes with good education and employment (Malayarayan, Kuruma and Kurichya) have a better living standard and hence can generally be termed forward tribes, whereas those with low or poor education, employment and living standard indicators (Paniya, Adiya, Urali, Kattunaika, Muthuvan and Irula) are categorized as backward tribes.

Relevance: 100.00%

Publisher:

Abstract:

Little is known about the heavy metal and microbial contamination of vegetables produced in Central Asian cities. We therefore measured the concentrations of cadmium (Cd), copper (Cu), lead (Pb), and zinc (Zn) and of faecal pathogens (coliform bacteria, Salmonella sp., Shigella sp., Ascaris lumbricoides, Entamoeba sp. and pinworms [Oxyuris vermicularis syn. Enterobius vermicularis]) in soil, irrigation water, and marketed vegetables of Kabul City, Afghanistan. Leaf Pb and Zn concentrations of leafy vegetables were, at 1–5 and 33–160 mg kg^{-1} dry weight (DW), several-fold above the respective international thresholds of 0.3 mg Pb kg^{-1} and 50 mg Zn kg^{-1}. The tissue concentration of Cu was below threshold limits in all samples except for spinach on one farm. Above-threshold loads of microbes and parasites on vegetables were found in five out of six gardens, with coliforms ranging from 0.5–2 × 10^7 cells 100 g^{-1} fresh weight (FW), but no Salmonella or Shigella were found. Contamination with 0.2 × 10^7 eggs 100 g^{-1} FW of Ascaris was detected on the produce of three farms, and critical concentrations of Entamoeba in a single case, while Oxyuris vermicularis and Enterobius vermicularis were found on the produce of three and four farms, respectively. Irrigation water contained Ascaris, coliforms, Salmonella, Shigella, Entamoeba, and Oxyuris vermicularis syn. Enterobius vermicularis ranging from 0.35 × 10^7 to 2 × 10^7 cells l^{-1}. The heavy metal and microbial loads on fresh UPA vegetables are likely the result of contamination from rising traffic, residues of the past decades of war and a lack of sewage treatment, which needs urgent attention.

Relevance: 100.00%

Publisher:

Abstract:

This paper examines the strategies and techniques researched and implemented by the International Union for Conservation of Nature (IUCN) in villages in the vicinity of Doi Mae Salong in Chiang Rai Province, Thailand. The strategies revolve around the paradigm linking poverty alleviation, conservation and landscape restoration. IUCN and its partners specifically researched and implemented schemes directed toward diversification of the household economy through alternative and sustainable intensified agriculture techniques based on balancing conservation and livelihood objectives. The projects aimed to reduce poverty and build the resilience of smallholders through decentralised governance arrangements including land use planning schemes and stakeholder negotiation. Considering the agro-ecological system on a catchment-wide scale enhances the conceptual understanding of each component, collectively forming a landscape matrix with requisite benefits for biodiversity, smallholder livelihoods and ecosystem services. In particular, the role of enhancing ecosystem services and functions in building socio-ecological resilience to vulnerabilities such as climate and economic variability is paramount in the process.

Relevance: 100.00%

Publisher:

Abstract:

Recent research on payments for environmental services (PES) has observed that high transaction costs (TCs) are incurred through the implementation of PES schemes and farmer participation. TCs incurred by households are considered to be an obstacle to the participation in and efficiency of PES policies. This study aims to understand transactions related to previous forest plantation programmes and to estimate the actual TCs incurred by farmers who participated in these programmes in a mountainous area of northwestern Vietnam. In addition, this study examines determinants of households’ TCs to test the hypothesis of whether the amount of TCs varies according to household characteristics. Results show that average TCs are not likely to be a constraint for participation since they are about 200,000 VND (USD 10) per household per contract, which is equivalent to one person’s average earnings for about two days of labour. However, TCs amount to more than one-third of the programmes’ benefits, which is relatively high compared to PES programmes in developed countries. This implies that rather than aiming to reduce TCs, an appropriate agenda for policy improvement is to balance the level of TCs with PES programme benefits to enhance the overall attractiveness of afforestation programmes for smallholder farmers. Regression analysis reveals that education, gender and perception towards PES programmes have significant effects on the magnitude of TCs. The analysis also points out the importance of local conditions on the level of TCs, with some unexpected results.
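As a hedged illustration of the determinants analysis (the actual regression specification is not given in the abstract): an ordinary least squares fit of household transaction costs on education, gender and a perception score, with synthetic data and variable names chosen only for the example.

import numpy as np

rng = np.random.default_rng(1)
n = 200

# Synthetic household data: years of education, gender dummy, perception score (1-5)
education  = rng.integers(0, 13, n)
gender     = rng.integers(0, 2, n)
perception = rng.integers(1, 6, n)
tc = (250_000 - 8_000 * education - 15_000 * gender - 10_000 * perception
      + rng.normal(0, 20_000, n))              # transaction costs in VND

X = np.column_stack([np.ones(n), education, gender, perception])
beta, *_ = np.linalg.lstsq(X, tc, rcond=None)  # OLS coefficients

for name, b in zip(["intercept", "education", "gender", "perception"], beta):
    print(f"{name:>10}: {b:,.0f} VND")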

Relevance: 100.00%

Publisher:

Abstract:

This paper describes a trainable system capable of tracking faces and facial features like eyes and nostrils and estimating basic mouth features such as degree of openness and smile in real time. In developing this system, we have addressed the twin issues of image representation and algorithms for learning. We have used the invariance properties of image representations based on Haar wavelets to robustly capture various facial features. Similarly, unlike previous approaches, this system is entirely trained using examples and does not rely on a priori (hand-crafted) models of facial features based on optical flow or facial musculature. The system works in several stages that begin with face detection, followed by localization of facial features and estimation of mouth parameters. Each of these stages is formulated as a problem in supervised learning from examples. We apply the new and robust technique of support vector machines (SVM) for classification in the stages of skin segmentation, face detection and eye detection. Estimation of mouth parameters is modeled as a regression from a sparse subset of coefficients (basis functions) of an overcomplete dictionary of Haar wavelets.
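A minimal scikit-learn sketch of the SVM classification stage: in the real system the rows would be Haar-wavelet features of image patches; here random feature vectors stand in for them, and scikit-learn rather than the authors' original SVM code is assumed.

import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Stand-in feature vectors: rows would be Haar-wavelet coefficients of patches,
# labels 1 = face patch, 0 = non-face patch.
X = rng.normal(size=(400, 64))
y = (X[:, :8].sum(axis=1) > 0).astype(int)      # synthetic labels for the sketch

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))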

Relevance: 100.00%

Publisher:

Abstract:

The Kineticist's Workbench is a program that simulates chemical reaction mechanisms by predicting, generating, and interpreting numerical data. Prior to simulation, it analyzes a given mechanism to predict that mechanism's behavior; it then simulates the mechanism numerically; and afterward, it interprets and summarizes the data it has generated. In performing these tasks, the Workbench uses a variety of techniques: graph-theoretic algorithms (for analyzing mechanisms), traditional numerical simulation methods, and algorithms that examine simulation results and reinterpret them in qualitative terms. The Workbench thus serves as a prototype for a new class of scientific computational tools: tools that provide symbiotic collaborations between qualitative and quantitative methods.
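A toy sketch of the analyze-simulate-summarize pipeline described above, using a made-up two-step mechanism A -> B -> C with first-order mass-action kinetics; the representation and the qualitative summary rule are assumptions for illustration, not the Workbench's actual algorithms.

import numpy as np

# Mechanism as (reactant, product, rate constant) edges of a simple reaction graph.
mechanism = [("A", "B", 1.0), ("B", "C", 0.5)]
species = sorted({s for r, p, _ in mechanism for s in (r, p)})
idx = {s: i for i, s in enumerate(species)}

# "Analysis": which species are pure sources (never produced) or pure sinks (never consumed)?
produced = {p for _, p, _ in mechanism}
consumed = {r for r, _, _ in mechanism}
print("sources:", set(species) - produced, "sinks:", set(species) - consumed)

# Numerical simulation: forward-Euler integration of first-order mass-action ODEs.
conc = np.array([1.0, 0.0, 0.0])          # initial concentrations of A, B, C
dt, history = 0.01, [conc.copy()]
for _ in range(1000):
    dcdt = np.zeros_like(conc)
    for r, p, k in mechanism:
        flux = k * conc[idx[r]]
        dcdt[idx[r]] -= flux
        dcdt[idx[p]] += flux
    conc = conc + dt * dcdt
    history.append(conc.copy())
history = np.array(history)

# Qualitative summary: classify each concentration trajectory.
for s in species:
    traj = history[:, idx[s]]
    peak = traj.argmax()
    kind = ("monotone rise" if peak == len(traj) - 1
            else "monotone fall" if peak == 0
            else "rises then falls")
    print(f"{s}: {kind} (final {traj[-1]:.2f})")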

Relevance: 100.00%

Publisher:

Abstract:

This class introduces the basics of web mining and information retrieval, including, for example, an introduction to the Vector Space Model and Text Mining. Guest Lecturer: Dr. Michael Granitzer. Optional reading: Modeling the Internet and the Web: Probabilistic Methods and Algorithms, Pierre Baldi, Paolo Frasconi, Padhraic Smyth, Wiley, 2003 (Chapter 4, Text Analysis).
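A small sketch of the vector space model covered in the lecture: documents as TF-IDF vectors compared by cosine similarity, with a toy corpus invented for the example.

import numpy as np
from collections import Counter

docs = ["web mining finds patterns in web data",
        "information retrieval ranks documents for a query",
        "text mining extracts information from text"]

# Vocabulary and raw term frequencies per document.
vocab = sorted({w for d in docs for w in d.split()})
tf = np.array([[Counter(d.split())[w] for w in vocab] for d in docs], dtype=float)

# Inverse document frequency and TF-IDF weighting.
df = (tf > 0).sum(axis=0)
idf = np.log(len(docs) / df)
tfidf = tf * idf

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

query = tfidf[0]                  # use the first document as a stand-in query
for d, vec in zip(docs, tfidf):
    print(f"{cosine(query, vec):.2f}  {d}")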

Relevance: 100.00%

Publisher:

Abstract:

An emerging consensus in cognitive science views the biological brain as a hierarchically-organized predictive processing system. This is a system in which higher-order regions are continuously attempting to predict the activity of lower-order regions at a variety of (increasingly abstract) spatial and temporal scales. The brain is thus revealed as a hierarchical prediction machine that is constantly engaged in the effort to predict the flow of information originating from the sensory surfaces. Such a view seems to afford a great deal of explanatory leverage when it comes to a broad swathe of seemingly disparate psychological phenomena (e.g., learning, memory, perception, action, emotion, planning, reason, imagination, and conscious experience). In the most positive case, the predictive processing story seems to provide our first glimpse at what a unified (computationally tractable and neurobiologically plausible) account of human psychology might look like. This obviously marks out one reason why such models should be the focus of current empirical and theoretical attention. Another reason, however, is rooted in the potential of such models to advance the current state-of-the-art in machine intelligence and machine learning. Interestingly, the vision of the brain as a hierarchical prediction machine is one that establishes contact with work that goes under the heading of 'deep learning'. Deep learning systems thus often attempt to make use of predictive processing schemes and (increasingly abstract) generative models as a means of supporting the analysis of large data sets. But are such computational systems sufficient (by themselves) to provide a route to general human-level analytic capabilities? I will argue that they are not and that closer attention to a broader range of forces and factors (many of which are not confined to the neural realm) may be required to understand what it is that gives human cognition its distinctive (and largely unique) flavour. The vision that emerges is one of 'homomimetic deep learning systems', systems that situate a hierarchically-organized predictive processing core within a larger nexus of developmental, behavioural, symbolic, technological and social influences. Relative to that vision, I suggest that we should see the Web as a form of 'cognitive ecology', one that is as much involved with the transformation of machine intelligence as it is with the progressive reshaping of our own cognitive capabilities.
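A heavily simplified sketch of the hierarchical prediction idea (one latent level predicting a sensory level, with the latent estimate updated to reduce prediction error); the linear generative model and the gradient update are assumptions made for illustration, not a claim about any specific neural or deep learning model.

import numpy as np

rng = np.random.default_rng(0)

# Generative model assumed by the "higher level": sensory input x is roughly W @ z.
W = rng.normal(size=(16, 4))              # fixed top-down weights (learned in a real system)
z_true = rng.normal(size=4)
x = W @ z_true + 0.05 * rng.normal(size=16)   # noisy "sensory" input

# Inference by prediction-error minimisation: iteratively update the latent estimate z.
z = np.zeros(4)
lr = 0.02
for step in range(500):
    prediction = W @ z                    # top-down prediction of sensory activity
    error = x - prediction                # bottom-up prediction error
    z += lr * (W.T @ error)               # adjust the higher-level estimate to reduce error

print("remaining prediction error:", np.linalg.norm(x - W @ z))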

Relevance: 100.00%

Publisher:

Abstract:

The introduction of my contribution contains brief information on the Faculty of Architecture of the Slovak University of Technology in Bratislava (FA STU) and the architectural research performed at this institution. The schemes and priorities of our research in architecture have changed several times since its very beginning in the early 1950s. The most significant change occurred after “the velvet revolution” in 1989; since 1990 there have been several sources of support for research at universities. The significant part of my contribution is rooted in my own research experience since I joined FA STU in 1975 as a young architect and researcher. The 1980s were characterized by my first unintentional attempts to do “research by design” and by “scientific” achievements that were by-products of my design work. Some of them resulted in the following: the conception of mezzo-space, a theory of the complex perception of architectural space, and a definition of the basic principles of ecologically conscious architecture. Nowadays I continue my research by design with my students, applying the so-called solar envelope at the urban scale.

Relevance: 100.00%

Publisher:

Abstract:

This study examines current and forthcoming measures related to the exchange of data and information in EU Justice and Home Affairs policies, with a focus on the ‘smart borders’ initiative. It argues that there is no reversibility in the growing reliance on such schemes and asks whether current and forthcoming proposals are necessary and original. It outlines the main challenges raised by the proposals, including issues related to the right to data protection, but also to privacy and non-discrimination.

Relevance: 100.00%

Publisher:

Abstract:

Seven groups have participated in an intercomparison study of calculations of radiative forcing (RF) due to stratospheric water vapour (SWV) and contrails. A combination of detailed radiative transfer schemes and codes for global-scale calculations has been used, as well as a combination of idealized simulations and more realistic global-scale changes in stratospheric water vapour and contrails. Detailed line-by-line codes agree within about 15 % for longwave (LW) and shortwave (SW) RF, except in one case where the difference is 30 %. Since the LW and SW RF due to contrails and SWV changes are of opposite sign, the differences between the models seen in the individual LW and SW components can be either compensated or strengthened in the net RF, and thus in relative terms the uncertainties are much larger for the net RF. Some of the models used for global-scale simulations of changes in SWV and contrails differ substantially in RF from the more detailed radiative transfer schemes. For the global-scale calculations we use a method of weighting the results to calculate a best estimate based on their performance compared to the more detailed radiative transfer schemes in the idealized simulations.
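The weighting method itself is not specified in the abstract; as one plausible reading, a sketch in which each global model's RF estimate is weighted by the inverse of its absolute error against the line-by-line result in the idealized case. All numbers are invented for illustration.

import numpy as np

# Invented numbers: net RF (W m^-2) from global-scale models in the realistic case,
# and each model's absolute error against the line-by-line benchmark in the idealized case.
rf_global   = np.array([0.012, 0.009, 0.015, 0.011])
bench_error = np.array([0.002, 0.004, 0.001, 0.003])

weights = 1.0 / bench_error
weights /= weights.sum()                  # normalize weights to sum to one

best_estimate = np.sum(weights * rf_global)
print(f"performance-weighted best estimate: {best_estimate:.4f} W m^-2")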