960 results for Coarse Authentication
Abstract:
Authentication plays an important role in how we interact with computers, mobile devices, the web, etc. The idea of authentication is to uniquely identify a user before granting access to system privileges. For example, in recent years more corporate information and applications have become accessible via the Internet and intranets. Many employees work from remote locations and need access to secure corporate files. During this time, it is possible for malicious or unauthorized users to gain access to the system. For this reason, it is logical to have some mechanism in place to detect whether the logged-in user is the same user who is in control of the session. Therefore, highly secure authentication methods must be used. We posit that each of us is unique in our use of computer systems. It is this uniqueness that is leveraged to "continuously authenticate users" while they use web software. To monitor user behavior, n-gram models are used to capture user interactions with web-based software. This statistical language model essentially captures the sequences and sub-sequences of user actions, their orderings, and the temporal relationships that make them unique, providing a model of how each user typically behaves. Users are then continuously monitored during software operation. Large deviations from "normal behavior" can indicate malicious or unintended behavior. This approach is implemented in a system called Intruder Detector (ID) that models user actions as embodied in the web logs generated in response to those actions. User identification through web logs is cost-effective and non-intrusive. We perform experiments on a large fielded system with web logs of approximately 4000 users. For these experiments, we use two classification techniques: binary and multi-class classification. We evaluate model-specific differences in user behavior based on coarse-grain (i.e., role-based) and fine-grain (i.e., individual) analysis. A specific set of metrics is used to provide valuable insight into how each model performs. Intruder Detector achieves accurate results when identifying legitimate users and user types, and it is also able to detect outliers in role-based user behavior with optimal performance. In addition to web applications, this continuous monitoring technique can be applied to other user-based systems such as mobile devices and the analysis of network traffic.
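To make the n-gram idea concrete, here is a minimal sketch of profiling a user's action sequences and scoring a new session against that profile. The function names, add-one smoothing and vocabulary size are illustrative assumptions, not the actual Intruder Detector implementation:

```python
# Minimal sketch of the n-gram user-behavior model described above.
from collections import Counter
from math import log

def train_profile(sessions, n=3):
    """Count n-grams of user actions across a user's training sessions."""
    grams = Counter()
    for actions in sessions:
        padded = ["<s>"] * (n - 1) + actions
        for i in range(len(actions)):
            grams[tuple(padded[i:i + n])] += 1
    return grams, sum(grams.values())

def score_session(actions, profile, n=3, vocab_size=1000):
    """Average log-probability of a session under a profile (add-one smoothed)."""
    grams, total = profile
    padded = ["<s>"] * (n - 1) + actions
    logp = 0.0
    for i in range(len(actions)):
        logp += log((grams[tuple(padded[i:i + n])] + 1) / (total + vocab_size))
    return logp / max(len(actions), 1)

# A session whose score falls far below the user's historical scores is
# flagged as a possible intruder (threshold chosen on validation data).
profile = train_profile([["login", "search", "open", "edit", "save", "logout"]])
print(score_session(["login", "delete", "delete", "logout"], profile))
```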
Abstract:
Composition and orientation effects on the final recrystallization texture of three coarse-grained Nb-containing AISI 430 ferritic stainless steels (FSSs) were investigated. Hot bands of steels containing distinct amounts of niobium, carbon and nitrogen were annealed at 1250 °C for 2 h to promote grain growth. In particular, the amount of Nb in solid solution varies from one grade to another. For purposes of comparison, the texture evolution of a hot-band sheet annealed at 1030 °C for 1 min (finer grain structure) was also investigated. Subsequently, the four sheets were cold rolled to an 80% reduction and then annealed at 800 °C for 15 min. Texture was determined using X-ray diffraction and electron backscatter diffraction (EBSD). Noticeable differences in the final recrystallization texture and microstructure were observed among the four investigated grades. The results suggest that distinct nucleation mechanisms take place within these large grains, leading to the development of different final recrystallization textures.
Abstract:
Higher-order (2,4) FDTD schemes used for numerical solutions of Maxwell's equations focus on diminishing the truncation errors caused by the Taylor series expansion of the spatial derivatives. These schemes use a larger computational stencil, which generally makes use of two constant coefficients, C1 and C2, for the four-point central-difference operators. In this paper we propose a novel way to diminish these truncation errors in order to obtain more accurate numerical solutions of Maxwell's equations. For this purpose, we present a method to individually optimize the pair of coefficients, C1 and C2, based on any desired grid resolution and time-step size. In particular, we are interested in using coarser grid discretizations to be able to simulate electrically large domains. The results of our optimization algorithm show a significant reduction in dispersion error and numerical anisotropy for all modeled grid resolutions. Numerical simulations of free-space propagation verify the very promising theoretical results. The model is also shown to perform well in more complex, realistic scenarios.
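For intuition, the four-point operator in question is the staggered fourth-order central difference, whose standard Taylor coefficients C1 = 9/8 and C2 = -1/24 cancel the leading error term. A minimal numerical sketch (the paper's optimization procedure itself is not reproduced; grid values are illustrative):

```python
# Staggered four-point central difference with tunable coefficients, as used
# by (2,4) FDTD schemes. Standard Taylor values: c1 = 9/8, c2 = -1/24.
import numpy as np

def d_dx_24(u, dx, c1=9/8, c2=-1/24):
    """d/dx at integer grid points from samples at half-integer points."""
    return (c1 * (u[2:-1] - u[1:-2]) + c2 * (u[3:] - u[:-3])) / dx

dx, n = 0.1, 64
x_half = (np.arange(n) - 1.5) * dx          # staggered field samples
x_int = np.arange(n - 3) * dx               # derivative collocation points
err = d_dx_24(np.sin(x_half), dx) - np.cos(x_int)
print(np.max(np.abs(err)))                  # ~5e-7: O(dx^4) truncation error
```

The paper's idea is then to retune (c1, c2) per grid resolution and time step so that dispersion error, rather than the formal Taylor order, is minimized on coarse grids.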
Abstract:
The hot melt granulation of a coarse pharmaceutical powder in a top-spray spouted bed is described. The substrate consisted of lactose-polyvinylpyrrolidone particles, with or without acetaminophen as a model drug. Polyethylene glycol (MW 4000), used as binder, was atomized onto the bed by a two-fluid spray nozzle. The granulation experiments followed a 2^3 factorial design with triplicates at the center point and were carried out by varying the spray nozzle vertical position, the atomizing air flow rate and the binder feed rate. Granules were evaluated by pharmacotechnical properties such as size distribution, bulk and tapped densities, Carr index, Hausner ratio and tableting characteristics. Analysis of variance showed that granule sizes were affected by the PEG feed rate and atomizing air pressure at significance levels of 1.0% and 5.0%, respectively, but the spray nozzle distance to the substrate bed was not significant. The spray conditions also affected granule flow and consolidation properties, as measured by the Carr index and Hausner ratio. The measured densities, Carr indexes and Hausner ratios showed that granule flowability and consolidation properties are adequate for pharmaceutical processing and tableting. Tablets prepared with acetaminophen-containing granules showed good properties and adequate release profiles in in vitro dissolution tests. The results indicate the suitability of spouted beds for the hot melt granulation of coarse pharmaceutical powders.
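As a reminder of how main effects fall out of such a 2^3 design, here is a hedged sketch with invented response values (the granule sizes below are made up for illustration and are not the paper's data):

```python
# Estimating main effects in a 2^3 factorial design. Coded levels and the
# response values are hypothetical; only the arithmetic is the point.
import itertools
import numpy as np

# Coded (-1, +1) levels for: nozzle height, atomizing air flow, binder feed rate
runs = np.array(list(itertools.product([-1, 1], repeat=3)))
# Hypothetical mean granule sizes (um) for the 8 runs, in product() order
y = np.array([300, 420, 320, 445, 305, 430, 325, 450], dtype=float)

# Main effect = mean response at the +1 level minus mean response at -1
for name, col in zip(["nozzle height", "air flow", "binder feed"], runs.T):
    effect = y[col == 1].mean() - y[col == -1].mean()
    print(f"{name:>13}: {effect:+.2f} um")
# Qualitatively mirrors the paper's finding: binder feed dominates, air flow
# matters moderately, nozzle height is negligible.
```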
Abstract:
It is difficult to reach a decision after many users meet in the same place: reconciling their various opinions consumes too much time. TAmI (Group Decision Making Toolkit) is a system for group decision-making in Ambient Intelligence [1]. The program is composed of IGATA [2], WebMeeting and the related database system. However, because the IP address and password are sent without any encryption, they can be exposed to an attacker, who can use them for malicious purposes. As a result, even if an attacker corrupts the outcome, the legitimate members cannot detect it. Therefore, in this paper, we study a method for applying user authentication to TAmI.
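The concern here is credentials crossing the network in the clear. One standard remedy, sketched below under the assumption of a pre-shared secret (the function names and message layout are illustrative, not TAmI's actual protocol), is a challenge-response exchange so the password itself is never transmitted:

```python
# Challenge-response with an HMAC: the password never leaves the client,
# and a replayed response fails against a fresh challenge.
import hmac, hashlib, os

def server_issue_challenge():
    return os.urandom(16)                       # random nonce per login attempt

def client_response(password: bytes, challenge: bytes) -> bytes:
    return hmac.new(password, challenge, hashlib.sha256).digest()

def server_verify(stored_password: bytes, challenge: bytes, resp: bytes) -> bool:
    expected = hmac.new(stored_password, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, resp)  # constant-time comparison

challenge = server_issue_challenge()
resp = client_response(b"secret", challenge)
print(server_verify(b"secret", challenge, resp))   # True
```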
Abstract:
Work presented within the scope of the Master's in Informatics Engineering, as a partial requirement for obtaining the degree of Master in Informatics Engineering.
Abstract:
Coccidioidomycosis is an emerging fungal disease in Brazil; adequate maintenance and authentication of Coccidioides isolates are essential for research into the genetic diversity of the environmental organisms, as well as for understanding the human disease. Seventeen Coccidioides isolates maintained under mineral oil since 1975 in the Instituto de Medicina Tropical de São Paulo (IMTSP) culture collection, Brazil, were evaluated with respect to their viability, morphological characteristics and genetic features in order to authenticate these fungal cultures. Only five isolates were viable after almost 30 years, showing typical morphological characteristics, and sequencing analysis using the Coi-F and Coi-R primers revealed 99% identity with the genus Coccidioides. These five isolates were then preserved in liquid nitrogen and sterile water, and they remained viable after two years of storage under these conditions, maintaining the same features.
Abstract:
Twenty Coccidioides immitis strains were evaluated. Only 5 of the 20 strains kept under mineral oil maintained their viability, while all 5 subcultures preserved in water remained viable, and none of the 13 subcultures kept in soil were viable. A 519 bp PCR product from the csa gene confirmed the identity of the strains.
Abstract:
Nowadays, authentication studies for paintings require a multidisciplinary approach, based on the contribution of visual feature analysis but also on the characterization of materials and techniques. Moreover, it is important that the assessment of the authorship of a painting is supported by technical studies of a selected number of original artworks that cover the entire career of an artist. This dissertation is concerned with the work of the modernist painter Amadeo de Souza-Cardoso. It is divided into three parts. In the first part, we propose a tool based on image processing that combines information obtained by brushstroke and materials analysis. The resulting tool provides a qualitative and quantitative evaluation of the authorship of paintings; the quantitative element is particularly relevant, as it could be crucial in solving authorship controversies, such as judicial disputes. The brushstroke analysis was performed by combining two algorithms for feature detection, namely the Gabor filter and the Scale Invariant Feature Transform. Thanks to this combination (and to the use of the Bag-of-Features model), the proposed method achieves an accuracy higher than 90% in distinguishing between images of Amadeo's paintings and images of artworks by other contemporary artists. For the molecular analysis, we implemented a semi-automatic system that uses hyperspectral imaging and elemental analysis. The system provides as output an image that depicts the mapping of the pigments present, together with any areas made using materials not consistent with Amadeo's palette. This visual output is a simple and effective way of assessing the results of the system. The tool based on the combination of brushstroke and molecular information was tested on twelve paintings, obtaining promising results. The second part of the thesis presents a systematic study of four selected paintings made by Amadeo in 1917. Although untitled, three of these paintings are commonly known as BRUT, Entrada and Coty; they are considered among his most successful and genuine works. The materials and techniques of these artworks had never been studied before. The paintings were studied with a multi-analytical approach using micro-Energy Dispersive X-ray Fluorescence spectroscopy, micro-Infrared and Raman Spectroscopy, micro-Spectrofluorimetry and Scanning Electron Microscopy. The characterization of the materials and techniques used in Amadeo's last paintings, as well as the investigation of some of the conservation problems that affect them, is essential to enrich the knowledge on this artist. Moreover, the study of the materials in the four paintings reveals commonalities between the paintings BRUT and Entrada. This observation is also supported by the analysis of the elements present in a photograph of a collage (conserved at the Art Library of the Calouste Gulbenkian Foundation), the only remaining evidence of a supposed maquette of these paintings. The final part of the thesis describes the application of the image processing tools developed in the first part to a set of case studies; this experience demonstrates the potential of the tool to support painting analysis and authentication studies. The brushstroke analysis was used as an additional analysis in the evaluation process of four paintings attributed to Amadeo, and the system based on hyperspectral analysis was applied to the painting dated 1917. The case studies therefore serve as a bridge between the first two parts of the dissertation.
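To make the brushstroke pipeline concrete, here is a hedged sketch of a SIFT-plus-Bag-of-Features classifier of the kind described (the Gabor-filter stage is omitted, file names are placeholders, and the dissertation's actual parameters are not reproduced):

```python
# Bag-of-Features brushstroke classification sketch: SIFT descriptors are
# quantized against a learned codebook, and an SVM separates the two classes.
import cv2
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

def sift_descriptors(path):
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    _, des = cv2.SIFT_create().detectAndCompute(img, None)
    return des if des is not None else np.empty((0, 128), np.float32)

def bof_histogram(des, codebook):
    words = codebook.predict(des.astype(np.float32))
    hist, _ = np.histogram(words, bins=np.arange(codebook.n_clusters + 1))
    return hist / max(hist.sum(), 1)

amadeo_paths = ["amadeo_01.jpg", "amadeo_02.jpg"]   # placeholder image files
other_paths = ["other_01.jpg", "other_02.jpg"]
paths = amadeo_paths + other_paths
labels = [1] * len(amadeo_paths) + [0] * len(other_paths)

all_des = np.vstack([sift_descriptors(p) for p in paths])
codebook = KMeans(n_clusters=200, n_init=10).fit(all_des)
X = np.array([bof_histogram(sift_descriptors(p), codebook) for p in paths])
clf = SVC(kernel="rbf").fit(X, labels)      # 1 = Amadeo, 0 = other artist
```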
Abstract:
Introduction: In past decades, the leishmaniasis burden has been low across Egypt; however, changing environments and land use have placed several parts of the country at risk. As a consequence, leishmaniasis has become a particularly difficult health problem, both for local inhabitants and for multinational military personnel. Methods: To evaluate coarse-resolution aspects of the ecology of leishmaniasis transmission, collection records for sandflies and Leishmania species were obtained from diverse sources. To characterize environmental variation across the country, we used multitemporal Land Surface Temperature (LST) and Normalized Difference Vegetation Index (NDVI) data from the Moderate Resolution Imaging Spectroradiometer (MODIS) for 2005-2011. Ecological niche models were generated using MaxEnt, and results were analyzed using background similarity tests to assess whether associations among vectors and parasites (i.e., niche similarity) can be detected across broad geographic regions. Results: We found niche similarity only between one vector species and its corresponding parasite species (i.e., Phlebotomus papatasi with Leishmania major), suggesting that the geographic ranges of zoonotic cutaneous leishmaniasis and its potential vector may overlap, but under distinct environmental associations. Other associations (e.g., P. sergenti with L. major) were not supported. Mapping suitable areas for each species suggested that northeastern Egypt is particularly at risk because both parasites have the potential to circulate there. Conclusions: Ecological niche modeling approaches can be used as a first-pass assessment of vector-parasite interactions, offering useful insights into constraints on the geography of transmission patterns of leishmaniasis.
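As a rough illustration of the presence-background logic behind such niche models (the study itself uses MaxEnt on MODIS layers; the logistic stand-in and all covariate values below are invented):

```python
# Simplified presence-background sketch: a logistic model on made-up
# (LST, NDVI) covariates stands in for MaxEnt, purely to show the workflow.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# Hypothetical (LST, NDVI) values at presence and background points
presence = rng.normal([30.0, 0.35], [2.0, 0.05], size=(50, 2))
background = rng.uniform([15.0, 0.0], [45.0, 0.8], size=(500, 2))

X = np.vstack([presence, background])
y = np.r_[np.ones(len(presence)), np.zeros(len(background))]
model = LogisticRegression().fit(X, y)

# Relative suitability: higher probability = more suitable conditions
grid = np.array([[28.0, 0.30], [40.0, 0.10]])
print(model.predict_proba(grid)[:, 1])   # high for the first point
```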
Abstract:
Immune systems have been used in recent years to inspire approaches to several computational problems. This paper focuses on enhancing the accuracy of behavioural biometric authentication algorithms by applying them more than once, with different thresholds, in order to first simulate the protection provided by the skin and then look for known outside entities, as lymphocytes do. The paper describes the principles that support the application of this approach to Keystroke Dynamics, a biometric authentication technology that decides on the legitimacy of a user based on the typing pattern captured as the username and/or the password is entered. As a proof of concept, the accuracy levels of one keystroke dynamics algorithm applied to five legitimate users of a system are calculated for both the traditional and the immune-inspired approaches, and the obtained results are compared.
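A minimal sketch of the two-layer thresholding idea, using a simple mean-absolute-deviation score over key-hold times. The thresholds, feature choice and data are illustrative assumptions, not the paper's algorithm:

```python
# Two-pass keystroke check: a loose "skin" threshold, then a strict
# "lymphocyte" threshold, both over timing deviation in milliseconds.
import numpy as np

def score(sample, template):
    """Distance between a login attempt and the enrolled timing template."""
    return float(np.mean(np.abs(np.asarray(sample) - np.asarray(template))))

def authenticate(sample, template, outer=40.0, inner=25.0):
    s = score(sample, template)
    if s > outer:
        return "reject"            # clearly not the legitimate user
    if s > inner:
        return "challenge"         # suspicious: ask for a second factor
    return "accept"

template = [95, 110, 88, 130, 102]          # enrolled hold times (ms)
print(authenticate([98, 112, 90, 127, 99], template))   # accept
print(authenticate([60, 180, 40, 200, 55], template))   # reject
```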
Abstract:
Named entity recognizers are unable to distinguish whether a term is a general concept, such as "scientist", or an individual, such as "Einstein". In this paper we explore the possibility of reaching this goal by combining two basic approaches: (i) Super Sense Tagging (SST) and (ii) YAGO. Thanks to these two powerful tools we were able to automatically create a corpus with which to train the SuperSense Tagger. The overall F1 is over 76% and the model is publicly available.
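A toy sketch of the concept-versus-individual distinction being targeted (a real system trains the SuperSense Tagger on a corpus auto-labelled from YAGO; the tiny lexicons and back-off heuristic here are purely illustrative):

```python
# Lookup-plus-heuristic stand-in for the concept/individual distinction.
INDIVIDUALS = {"einstein", "curie", "lisbon"}     # stand-in for YAGO entities
CONCEPTS = {"scientist", "city", "theory"}        # stand-in for supersenses

def classify(token: str) -> str:
    t = token.lower()
    if t in INDIVIDUALS:
        return "INDIVIDUAL"
    if t in CONCEPTS:
        return "CONCEPT"
    # Back off to a weak orthographic cue when the lexicons miss a token
    return "INDIVIDUAL" if token[:1].isupper() else "CONCEPT"

print([f"{w}:{classify(w)}" for w in ["Einstein", "scientist", "Lisbon"]])
```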
Abstract:
This paper presents the design and implementation of QRP, an open source proof-of-concept authentication system that uses two-factor authentication by combining a password and a camera-equipped mobile phone acting as an authentication token. QRP is extremely secure, as all the sensitive information stored and transmitted is encrypted, but it is also an easy-to-use and cost-efficient solution. QRP is portable and can be used securely on untrusted computers. Finally, QRP is able to successfully authenticate even when the phone is offline.
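A hedged sketch of how such a login can work offline: the server shows a challenge (in practice rendered as a QR code), the phone holding a shared secret derives a one-time code, and the user types that code into the untrusted computer. Names and message formats are illustrative; QRP's actual protocol may differ:

```python
# Challenge string -> offline HMAC-derived one-time code on the phone.
import hmac, hashlib, os, base64

SECRET = os.urandom(32)        # provisioned to the phone at enrollment time

def server_challenge() -> str:
    nonce = base64.b32encode(os.urandom(10)).decode()
    return f"qrp://login?nonce={nonce}"       # payload the QR code would carry

def phone_response(qr_payload: str) -> str:
    """Runs offline on the phone: derive a short code from secret + challenge."""
    mac = hmac.new(SECRET, qr_payload.encode(), hashlib.sha256).digest()
    return str(int.from_bytes(mac[:4], "big") % 1_000_000).zfill(6)

challenge = server_challenge()
code = phone_response(challenge)              # user reads this 6-digit code
print(challenge, code)
# The server recomputes phone_response(challenge) and compares; the password
# forms the second factor, so a stolen phone alone is not enough to log in.
```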
Real-Time implementation of a blind authentication method using self-synchronous speech watermarking
Abstract:
A blind speech watermarking scheme that meets hard real-time deadlines is presented and implemented. One of the key issues in these block-oriented watermarking techniques is to preserve synchronization, namely to recover the exact position of each block in the mark extraction process. The presented scheme can be split into two distinct parts: the synchronization method and the information mark method. The former is embedded into the time domain and is fast enough to run within real-time requirements. The latter contains the authentication information and is embedded into the wavelet domain. The synchronization and information mark techniques are both tunable in order to allow a configurable method. Thus, capacity, transparency and robustness can be configured depending on the needs. This makes the scheme useful for professional applications, such as telephony authentication, or even for sending information through radio applications.
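An illustrative sketch of the time-domain synchronization idea: a known pseudo-random marker is weakly added to the signal, and the detector finds block boundaries by correlation. The wavelet-domain information mark of the actual scheme is not reproduced here, and all parameters are assumptions:

```python
# Blind synchronization via a faint pseudo-random marker; the embedding
# strength alpha trades transparency against detection robustness.
import numpy as np

rng = np.random.default_rng(42)
SYNC = rng.choice([-1.0, 1.0], size=256)       # shared pseudo-random marker

def embed_sync(signal, start, alpha=0.05):
    """Weakly add the marker at a block boundary."""
    out = signal.copy()
    out[start:start + len(SYNC)] += alpha * SYNC
    return out

def find_sync(signal):
    """Blind detection: locate the marker by cross-correlation."""
    corr = np.correlate(signal, SYNC, mode="valid")
    return int(np.argmax(np.abs(corr)))

audio = rng.normal(0, 0.1, 8000)               # stand-in for a speech block
marked = embed_sync(audio, start=3000)
print(find_sync(marked))                       # recovers ~3000, the block start
```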
Abstract:
The fatty acids of olive oils of distinct quality grades from the most important European Union (EU) producer countries were chemically and isotopically characterized. The analytical approach combined capillary-column gas chromatography-mass spectrometry (GC/MS) and the novel technique of compound-specific isotope analysis (CSIA) through gas chromatography coupled to a stable isotope ratio mass spectrometer (IRMS) via a combustion (C) interface (GC/C/IRMS). This approach provides further insights into the control of the purity and geographical origin of oils sold as cold-pressed extra virgin olive oil with a certified origin appellation. The results indicate that substantial enrichment in the heavy carbon isotope (13C) of the bulk oil and of individual fatty acids is related to (1) thermally induced degradation due to deodorization or steam washing of the olive oils and (2) potential blending with refined olive oil or other vegetable oils. The interpretation of the data is based on principal component analysis of the fatty acid concentrations and isotopic data (δ13C of the bulk oil, of 16:0 and of 18:1) and on the δ13C(16:0) vs δ13C(18:1) covariations. The differences in the δ13C values of palmitic and oleic acids are discussed in terms of the biosynthesis of these acids in the plant tissue and the admixture of distinct oils.
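A sketch of the multivariate step described above: PCA over fatty acid concentrations and δ13C values to separate oil classes. The numbers below are invented stand-ins, not the paper's measurements:

```python
# PCA on combined compositional and isotopic features of oil samples.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Columns: [%16:0, %18:1, d13C_oil, d13C_16:0, d13C_18:1] per oil sample
samples = np.array([
    [11.2, 73.5, -29.5, -30.1, -29.0],   # extra virgin (hypothetical)
    [11.0, 74.0, -29.7, -30.3, -29.2],
    [12.8, 70.1, -28.2, -28.9, -27.8],   # suspected blend (hypothetical)
    [13.1, 69.5, -28.0, -28.6, -27.6],
])
scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(samples))
print(scores)   # the two groups separate along the first principal component
```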