826 results for 2D barcode based authentication scheme


Relevance:

40.00%

Publisher:

Abstract:

The precise arraying of functional entities in morphologically well-defined shapes remains one of the key challenges in the processing of organic molecules [1]. Among various π-conjugated species, pyrene exhibits a set of unique properties that make it an attractive compound for use in materials science [2]. In this contribution we report on the properties of self-assembled structures prepared from amphiphilic pyrene trimers (Py3) consisting of phosphodiester-linked pyrenes. Depending on the substitution geometry of the pyrene core (1,6-, 1,8- or 2,7-type, see Scheme), thermally controlled self-assembly allows the preparation of supramolecular architectures of different morphologies in a bottom-up approach: two-dimensional (2D) nanosheets [3] are formed in the case of 1,6- and 2,7-substitution [4], whereas one-dimensional (1D) fibers are built from the 1,8-substituted isomer. The morphologies of the assemblies are established by AFM and TEM, and the results are further correlated with spectroscopic and scattering data. The two-dimensional assemblies consist of an inner layer of hydrophobic pyrenes sandwiched between a net of phosphates. Due to the repulsion of the negative charges, the 2D assemblies exist mostly as free-standing sheets. The internal alignment of the pyrenes leads to strong exciton coupling, with an unprecedented observation: the simultaneous development of J- and H-bands from two different electronic transitions. Despite the similarity in spectroscopic properties, the structural parameters of the 2D aggregates depend strongly on the preparation procedure. Under certain conditions, extra-large sheets (2 nm thick, area/thickness aspect ratio ~10⁷) are formed in aqueous solution [4B]. Finally, the one-dimensional assemblies are formed as micrometer-long and nanometer-thick fibers. Both planar and linear structures are intriguing objects for the creation of conductive nanowires that may be of interest for applications in supramolecular electronics.

Relevance:

40.00%

Publisher:

Abstract:

This chapter proposes a personalized X-ray reconstruction-based planning and post-operative treatment evaluation framework, called iJoint, for advancing modern Total Hip Arthroplasty (THA). Based on a mobile X-ray image calibration phantom and a unique 2D-3D reconstruction technique, iJoint can generate patient-specific models of the hip joint by non-rigidly matching statistical shape models to the X-ray radiographs. Such a reconstruction enables true 3D planning and treatment evaluation of hip arthroplasty from just the 2D X-ray radiographs whose acquisition is part of the standard diagnostic and treatment loop. As part of the system, a 3D model-based planning environment provides surgeons with hip arthroplasty related parameters such as implant type, size, position, offset and leg length equalization. With this newly developed system, we are able to provide true 3D solutions for computer assisted planning of THA using only 2D X-ray radiographs, which is not only innovative but also cost-effective.
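As an illustration of the general idea behind such a 2D-3D reconstruction (not the iJoint implementation itself), the following Python sketch fits the coefficients of a statistical shape model so that its projection matches 2D landmarks detected on a calibrated radiograph; the names mean_shape, modes, the 3x4 camera matrix P and the landmark array are hypothetical placeholders.

# Illustrative sketch (not the iJoint implementation): fitting the shape
# coefficients of a statistical shape model (SSM) so that its projection
# matches 2D landmarks detected in a calibrated X-ray radiograph.
import numpy as np
from scipy.optimize import least_squares

def project(points_3d, P):
    """Project Nx3 points with a 3x4 camera matrix P (from phantom calibration)."""
    homog = np.hstack([points_3d, np.ones((len(points_3d), 1))])
    uvw = homog @ P.T
    return uvw[:, :2] / uvw[:, 2:3]

def fit_ssm_to_xray(mean_shape, modes, P, landmarks_2d):
    """mean_shape: Nx3, modes: K x N x 3 principal modes, landmarks_2d: Nx2."""
    def residuals(b):
        shape = mean_shape + np.tensordot(b, modes, axes=1)  # SSM instance
        return (project(shape, P) - landmarks_2d).ravel()
    b0 = np.zeros(modes.shape[0])
    return least_squares(residuals, b0).x  # fitted shape coefficients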

Relevance:

40.00%

Publisher:

Abstract:

We have recently demonstrated a biosensor based on a lattice of SU8 pillars on a 1 μm SiO2/Si wafer, interrogated by measuring the vertical reflectivity as a function of wavelength. Biodetection has been proven with the combination of Bovine Serum Albumin (BSA) protein and its antibody (antiBSA). A BSA layer is attached to the pillars; the biorecognition of antiBSA produces a shift in the reflectivity curve that is related to the concentration of antiBSA. A detection limit on the order of 2 ng/ml is achieved for a rhombic lattice of pillars with a lattice parameter (a) of 800 nm, a height (h) of 420 nm and a diameter (d) of 200 nm. These results correlate with calculations using the 3D finite-difference time-domain (FDTD) method. A simplified 2D model is proposed, consisting of a multilayer stack in which the pillars are replaced by a 420 nm layer with an effective refractive index obtained using a Beam Propagation Method (BPM) algorithm. Results provided by this model are in good agreement with the experimental data, while reducing the computation time from one day to 15 minutes; this yields a fast but accurate tool to optimize the design, maximize the sensitivity and analyze the influence of the different variables (diameter, height and lattice parameter). The sensitivity is obtained for a variety of configurations, reaching a limit of detection under 1 ng/ml. The optimum design is chosen not only for its sensitivity but also for its feasibility, from both the fabrication (limited by the aspect ratio and proximity of the pillars) and the fluidic points of view. (© 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)
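The simplified multilayer model lends itself to a very compact calculation. The following Python sketch computes the normal-incidence reflectivity of an effective-index layer on SiO2/Si with the standard transfer-matrix method; the refractive-index values, including the effective index of the pillar layer, are placeholder assumptions and not the values used in the paper.

# Minimal sketch: normal-incidence reflectivity of the simplified multilayer
# (effective "pillar" layer / 1 um SiO2 / Si substrate) via the transfer-matrix
# method. Index values below are placeholders, not those of the paper.
import numpy as np

def reflectivity(wavelength_nm, layers, n_in=1.0, n_sub=3.9):
    """layers: list of (refractive_index, thickness_nm), from the air side down."""
    M = np.eye(2, dtype=complex)
    for n, d in layers:
        delta = 2 * np.pi * n * d / wavelength_nm          # phase thickness
        M = M @ np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                          [1j * n * np.sin(delta), np.cos(delta)]])
    B, C = M @ np.array([1.0, n_sub])
    r = (n_in * B - C) / (n_in * B + C)
    return abs(r) ** 2

# Example: 420 nm effective layer (n_eff assumed ~1.30) on 1000 nm SiO2 (n ~1.46)
wl = np.linspace(500, 900, 401)
R = [reflectivity(w, [(1.30, 420.0), (1.46, 1000.0)]) for w in wl]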

Relevance:

40.00%

Publisher:

Abstract:

A video-aware unequal loss protection (ULP) system for protecting RTP video streaming over bursty packet-loss networks is proposed. Considering only the relevance of each frame, the state of the channel and the bitrate constraints of the protection bitstream, our algorithm selects in real time the most suitable frames to be protected through forward error correction (FEC) techniques. It benefits from a careful RTP encapsulation that allows working at the frame level without requiring any processing beyond parsing the RTP headers, so it is well suited for inclusion in commercial transmitters. The simulation results show that our proposed ULP technique outperforms non-content-aware schemes.
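As a rough illustration of the kind of decision the algorithm makes (not the paper's actual selection rule), the following Python sketch greedily protects the most relevant frames, weighted by the channel loss rate, until the protection bitrate budget is exhausted; the relevance scores, frame sizes and overhead model are invented for the example.

# Illustrative sketch of relevance-driven frame selection under a FEC bitrate
# budget. Scores, loss model and overhead accounting are placeholders; the
# paper's actual selection algorithm is not reproduced here.
def select_frames_to_protect(frames, fec_budget_bits, loss_rate):
    """frames: list of dicts with 'relevance' (e.g. I > P > B) and 'size_bits'."""
    # Prioritise frames whose loss would hurt most, weighted by channel state.
    ranked = sorted(frames, key=lambda f: f['relevance'] * loss_rate, reverse=True)
    protected, used = [], 0
    for frame in ranked:
        fec_cost = frame['size_bits']          # assume parity ~ frame size
        if used + fec_cost <= fec_budget_bits:
            protected.append(frame)
            used += fec_cost
    return protected

frames = [{'id': 0, 'relevance': 3.0, 'size_bits': 40000},   # e.g. I-frame
          {'id': 1, 'relevance': 1.0, 'size_bits': 12000},   # e.g. B-frame
          {'id': 2, 'relevance': 2.0, 'size_bits': 20000}]   # e.g. P-frame
chosen = select_frames_to_protect(frames, fec_budget_bits=50000, loss_rate=0.1)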

Relevance:

40.00%

Publisher:

Abstract:

This article evaluates a gesture-based authentication technique for mobile devices. Users create a memorable identifying gesture to be used as their in-air signature. This work analyzes a database of 120 gestures of varying vulnerability, obtaining an Equal Error Rate (EER) of 9.19% when the robustness of the gestures is not verified. Most of the errors behind this EER come from very simple and easily forgeable gestures that should be discarded at the enrollment phase. Therefore, an in-air signature robustness verification system based on Linear Discriminant Analysis is proposed to infer automatically whether a gesture is secure or not. Different configurations have been tested, obtaining a lowest EER of 4.01% when 45.02% of the gestures were discarded, and an optimal compromise of a 4.82% EER when 19.19% of the gestures were automatically rejected.
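A minimal sketch of the robustness-verification step, assuming some hypothetical per-gesture features (the article does not specify the feature set used below), could look as follows with scikit-learn's Linear Discriminant Analysis:

# Sketch with assumed features, not the paper's: classifying an enrollment
# gesture as robust or easily forgeable with Linear Discriminant Analysis.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Hypothetical per-gesture features, e.g. duration, energy, axis variance.
X_train = np.array([[1.2, 0.8, 0.3], [0.4, 0.2, 0.1], [1.5, 1.1, 0.6],
                    [0.3, 0.1, 0.1], [1.1, 0.9, 0.4], [0.5, 0.3, 0.2]])
y_train = np.array([1, 0, 1, 0, 1, 0])   # 1 = robust, 0 = easily forgeable

lda = LinearDiscriminantAnalysis()
lda.fit(X_train, y_train)

new_gesture = np.array([[1.0, 0.7, 0.35]])
if lda.predict(new_gesture)[0] == 0:
    print("Gesture too simple: ask the user to enroll a more complex one")
else:
    print("Gesture accepted as in-air signature")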

Relevance:

40.00%

Publisher:

Abstract:

This article focuses on the evaluation of a biometric technique based on performing an identifying gesture while holding a telephone with an embedded accelerometer in the hand. The acceleration signals obtained when users perform gestures are analyzed with a mathematical method based on global sequence alignment. Eight different scores are proposed and evaluated in order to quantify the differences between gestures, obtaining an optimal EER of 3.42% when analyzing a random set of 40 users from a database of 80 users that includes real forgery attempts. Moreover, a temporal study of the technique is presented, leading to the need to update the template in order to adapt to the way users modify how they perform their identifying gesture over time. Six updating schemes have been assessed on a database of 22 users repeating their identifying gesture in 20 sessions over 4 months, concluding that the more often the template is updated, the better and more stable the performance of the technique.
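For illustration, a minimal Needleman-Wunsch-style global alignment between two acceleration sequences could be scored as in the Python sketch below; the local similarity function and gap penalty are placeholder choices and do not correspond to the eight scores evaluated in the article.

# Illustrative global sequence alignment (Needleman-Wunsch style) between two
# acceleration sequences; similarity function and gap penalty are placeholders.
import numpy as np

def align_score(a, b, gap=-1.0):
    """a, b: 1-D arrays of acceleration samples; returns a global alignment score."""
    def sim(x, y):                      # higher when samples are close
        return 1.0 - abs(x - y)
    n, m = len(a), len(b)
    H = np.zeros((n + 1, m + 1))
    H[:, 0] = gap * np.arange(n + 1)
    H[0, :] = gap * np.arange(m + 1)
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            H[i, j] = max(H[i - 1, j - 1] + sim(a[i - 1], b[j - 1]),
                          H[i - 1, j] + gap,
                          H[i, j - 1] + gap)
    return H[n, m]

# Compare a probe gesture against a stored template (e.g. one axis, normalised)
template = np.array([0.0, 0.2, 0.9, 0.4, -0.3])
probe    = np.array([0.1, 0.3, 0.8, 0.2, -0.2])
print(align_score(probe, template))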

Relevance:

40.00%

Publisher:

Abstract:

The opportunities offered by high-performance computing hold significant promise for enhancing the performance of real-time flood forecasting systems. In this paper, a real-time framework for probabilistic flood forecasting through data assimilation is presented. The distributed rainfall-runoff Real-time Interactive Basin Simulator (RIBS) model is selected to simulate the hydrological processes in the basin. Although the RIBS model is deterministic, it is run in a probabilistic way using the results of a calibration developed in previous work by the authors, which identified the probability distribution functions that best characterise the most relevant model parameters. Adaptive techniques improve the flood forecasts because the model can be adapted to observations in real time as new information becomes available. The new adaptive forecast model, which uses genetic programming as a data assimilation technique, is compared with the previously developed flood forecast model based on the calibration results. Both models are probabilistic, as they generate an ensemble of hydrographs that takes into account the different uncertainties inherent in any forecast process. The Manzanares River basin was selected as a case study; the process is computationally intensive, as it requires simulating many replicas of the ensemble in real time.
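A minimal sketch of how such a probabilistic ensemble can be generated, by sampling the calibrated parameter distributions and running one model replica per sample, is shown below; the parameter names, distributions and the run_ribs() stand-in are hypothetical and do not reproduce the RIBS model.

# Minimal sketch of probabilistic ensemble generation: sample the calibrated
# parameter distributions and run one model replica per sample.
import numpy as np

rng = np.random.default_rng(0)

def sample_parameters():
    # Hypothetical calibrated distributions for two "relevant" parameters.
    return {"infiltration": rng.lognormal(mean=0.0, sigma=0.3),
            "velocity_coeff": rng.normal(loc=1.0, scale=0.1)}

def run_ribs(params, rainfall):
    # Placeholder for a RIBS run: a toy linear-reservoir response, not the real model.
    kernel = np.exp(-np.arange(24) / (5.0 * params["infiltration"]))
    return params["velocity_coeff"] * np.convolve(rainfall, kernel)[:len(rainfall)]

rainfall = rng.gamma(shape=2.0, scale=1.5, size=48)            # observed forcing
ensemble = [run_ribs(sample_parameters(), rainfall) for _ in range(100)]
quantiles = np.percentile(np.vstack(ensemble), [5, 50, 95], axis=0)  # forecast bands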

Relevance:

40.00%

Publisher:

Abstract:

This article presents a portable and extensible solution to the problem of strong authentication, using a combination of Java technology and the storage of X.509 digital certificates on Java Cards to access services offered by an institution, in this case the Technological University of Panama, ensuring authenticity, confidentiality, integrity and non-repudiation.
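The article describes a Java/Java Card solution; purely as an illustration of the underlying idea (and keeping the examples in this document in Python), the sketch below shows a server-side challenge-response check against the X.509 certificate stored on the card, assuming an RSA key with PKCS#1 v1.5 padding and SHA-256, which the article does not specify.

# Server-side sketch of certificate-based challenge-response verification.
# The article uses Java / Java Card; this Python sketch only illustrates the
# idea. RSA with PKCS#1 v1.5 and SHA-256 is an assumption, not the paper's spec.
from cryptography import x509
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding
from cryptography.exceptions import InvalidSignature

def verify_card_response(cert_pem: bytes, challenge: bytes, signature: bytes) -> bool:
    """cert_pem: X.509 certificate read from the card;
    signature: the challenge signed on-card with the matching private key."""
    cert = x509.load_pem_x509_certificate(cert_pem)
    try:
        cert.public_key().verify(signature, challenge,
                                 padding.PKCS1v15(), hashes.SHA256())
        return True          # card holder proved possession of the private key
    except InvalidSignature:
        return False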

Relevance:

40.00%

Publisher:

Abstract:

Liquid crystal devices are used in many non-display applications in order to build small devices controlled by low-voltage electronics and without mechanical components. In this work, we present a novel liquid crystal device for laser beam steering. In this device the orientation of the liquid crystal molecules can be controlled, and a change in the liquid crystal orientation results in a change of the refractive index. When a laser beam passes through the device, the beam is deviated (Fig. 1) and the device works as a prism. The main difference between this device and a prism is that the orientation profile of the liquid crystal molecules can be modified, so the laser beam can be deviated by a required angle: the device is tuneable.
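As a back-of-the-envelope illustration of why tuning the refractive index steers the beam, the thin-prism approximation gives a deviation of roughly (n - 1)·A for an apex angle A; the values below are arbitrary and not those of the actual device.

# Back-of-the-envelope illustration: thin-prism deviation delta ~ (n - 1) * A.
# Apex angle and index range are arbitrary, not the device's actual values.
import numpy as np

apex_angle_deg = 2.0                          # equivalent prism apex angle (assumed)
n_lc = np.linspace(1.52, 1.70, 5)             # tunable effective LC index (assumed)
deviation_deg = (n_lc - 1.0) * apex_angle_deg # small-angle approximation
for n, d in zip(n_lc, deviation_deg):
    print(f"n_eff = {n:.2f} -> beam deviated by ~{d:.2f} degrees")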

Relevance:

40.00%

Publisher:

Abstract:

A unified, low-complexity, sign-bit-correlation-based symbol timing synchronization scheme for a Multiband Orthogonal Frequency Division Multiplexing (MB-OFDM) Ultra Wideband (UWB) receiver system is proposed. By using the time-domain sequence of the packet/frame synchronization preamble, the proposed scheme detects the upcoming MB-OFDM symbol and estimates the exact boundary of the start of the Fast Fourier Transform (FFT) window. The proposed algorithm is implemented using an efficient hardware-software co-simulation methodology. The effectiveness of the proposed synchronization scheme and of the optimization criteria is confirmed by hardware implementation results.
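A minimal sketch of the sign-bit correlation idea on synthetic data is shown below: the received samples are quantised to their signs and correlated against the known time-domain preamble, and the correlation peak marks the start of the FFT window. The preamble, noise levels and peak-picking rule are placeholders, not the implemented design.

# Sketch of sign-bit correlation for symbol timing: correlate the signs of the
# received samples against the known time-domain preamble and take the peak as
# the FFT window start. Preamble and signal here are synthetic.
import numpy as np

rng = np.random.default_rng(1)
preamble = np.sign(rng.standard_normal(128))          # known base sequence (placeholder)
offset = 37
rx = np.concatenate([0.1 * rng.standard_normal(offset),
                     preamble + 0.3 * rng.standard_normal(128),
                     0.1 * rng.standard_normal(64)])

rx_sign = np.sign(rx)                                 # 1-bit quantisation (low complexity)
corr = np.array([np.sum(rx_sign[k:k + len(preamble)] * preamble)
                 for k in range(len(rx) - len(preamble))])
fft_window_start = int(np.argmax(np.abs(corr)))       # should recover 'offset'
print(fft_window_start)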

Relevance:

40.00%

Publisher:

Abstract:

We present an adaptive unequal error protection (UEP) strategy built on the 1-D interleaved parity Application Layer Forward Error Correction (AL-FEC) code for protecting the transmission of stereoscopic 3D video content encoded with Multiview Video Coding (MVC) through IP-based networks. Our scheme targets the minimization of quality degradation produced by packet losses during video transmission in time-sensitive application scenarios. To that end, based on a novel packet-level distortion model, it selects in real time the most suitable packets within each Group of Pictures (GOP) to be protected and the most convenient FEC technique parameters, i.e., the size of the FEC generator matrix. In order to make these decisions, it considers the relevance of the packet, the behavior of the channel, and the available bitrate for protection purposes. Simulation results validate both the distortion model introduced to estimate the importance of packets and the optimization of the FEC technique parameter values.
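For reference, a minimal sketch of 1-D interleaved parity generation (the column-wise XOR that underlies this family of AL-FEC codes) is given below; packet headers, padding and the adaptive choice of the matrix size are omitted.

# Sketch of 1-D interleaved parity AL-FEC: arrange media packets in an L x D
# matrix and XOR each column into one parity packet, so a burst of up to L
# consecutive losses is recoverable. Headers and padding are omitted.
def interleaved_parity(packets, L, D):
    """packets: list of L*D equal-length byte strings; returns L parity packets."""
    assert len(packets) == L * D
    parity = []
    for col in range(L):
        acc = bytearray(packets[col])
        for row in range(1, D):
            for i, b in enumerate(packets[row * L + col]):
                acc[i] ^= b
        parity.append(bytes(acc))
    return parity

media = [bytes([k]) * 8 for k in range(12)]      # 12 dummy packets of 8 bytes
fec = interleaved_parity(media, L=4, D=3)        # matrix size = adapted FEC parameter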

Relevance:

40.00%

Publisher:

Abstract:

Automatic 2D-to-3D conversion is an important application for filling the gap between the increasing number of 3D displays and the still scant 3D content. However, existing approaches have an excessive computational cost that complicates their practical application. In this paper, a fast automatic 2D-to-3D conversion technique is proposed, which uses a machine learning framework to infer the 3D structure of a query color image from a training database of color and depth images. Assuming that photometrically similar images have analogous 3D structures, a depth map is estimated by searching for the most similar color images in the database and fusing the corresponding depth maps. Large databases are desirable to achieve better results, but the computational cost also increases. A clustering-based hierarchical search using compact SURF descriptors to characterize the images is proposed to drastically reduce search times. A significant improvement in computational time has been obtained with respect to other state-of-the-art approaches, while maintaining the quality of the results.
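A minimal sketch of such a clustering-based hierarchical search is shown below: database descriptors are clustered offline with k-means, and at query time only the nearest cluster is searched. Random vectors stand in for the compact SURF-based descriptors, and the cluster count and database size are arbitrary.

# Sketch of clustering-based hierarchical search: cluster database descriptors
# offline with k-means, then search only inside the query's nearest cluster.
# Random vectors stand in for the compact SURF-based image descriptors.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
db_desc = rng.standard_normal((5000, 64))        # one descriptor per color+depth image

kmeans = KMeans(n_clusters=50, n_init=10, random_state=0).fit(db_desc)   # offline step
per_cluster = {c: np.where(kmeans.labels_ == c)[0] for c in range(50)}

def query_similar(query_desc, k=5):
    c = int(kmeans.predict(query_desc[None, :])[0])          # coarse step: nearest centroid
    idx = per_cluster[c]
    nn = NearestNeighbors(n_neighbors=min(k, len(idx))).fit(db_desc[idx])
    _, local = nn.kneighbors(query_desc[None, :])
    return idx[local[0]]     # database images whose depth maps would be fused

similar_ids = query_similar(rng.standard_normal(64))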

Relevance:

40.00%

Publisher:

Abstract:

The backdrop of the current problems surrounding the implementation of Information Technology (IT) service management in Small and Medium Enterprises (SMEs) is described. The reasons why reaching a maturity/capability level through well-known standards, or implementing good software engineering practices by means of the IT Infrastructure Library, is so difficult for SMEs to achieve are set out, together with solutions to the exposed problems. The master thesis goals are then presented in terms of purpose, research questions, research goals, objectives and scope. Finally, the thesis structure is described.

Relevance:

40.00%

Publisher:

Abstract:

The formulation of thermodynamically consistent (TC) time integration methods was previously introduced through a general procedure based on the GENERIC form of the evolution equations for thermo-mechanical problems. The entropy was reported to be the best choice of thermodynamical variable for easily providing TC integrators, and the use of the internal energy was shown not to involve excessive complications. However, attempts to use the temperature in the design of GENERIC-based TC schemes have so far been unfruitful. This paper complements the said procedure for attaining TC integrators by presenting a TC scheme based on the temperature as the thermodynamical state variable. As a result, the problems that arise from the use of the entropy, mainly the definition of boundary conditions, are overcome. What is more, the newly proposed method exhibits the enhanced numerical stability and robustness properties of the entropy formulation.
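For reference, a minimal statement of the GENERIC structure referred to above, written in its standard abstract form (the notation below is the usual textbook one, not necessarily that of the paper):

\[
\dot{z} \;=\; L(z)\,\frac{\partial E}{\partial z} \;+\; M(z)\,\frac{\partial S}{\partial z},
\qquad
L(z)\,\frac{\partial S}{\partial z} = 0,
\qquad
M(z)\,\frac{\partial E}{\partial z} = 0,
\]

where z collects the state variables (with entropy, internal energy or, as proposed here, temperature as the thermodynamical variable), E and S are the total energy and entropy, L is a skew-symmetric (Poisson) operator and M is a symmetric, positive semi-definite (dissipative) operator. The two degeneracy conditions are what a TC integrator must reproduce at the discrete level: exact energy conservation and non-negative entropy production.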

Relevance:

40.00%

Publisher:

Abstract:

The extraordinary growth of new information technologies, the development of the Internet of Things, electronic commerce, social networks, mobile telephony, and cloud computing and storage has provided great benefits in all areas of society. Alongside these benefits, new challenges arise for the protection and privacy of information and its content, such as identity impersonation and the loss of confidentiality and integrity of electronic documents and communications. This is exacerbated by the lack of a clear boundary between the personal and the business worlds with regard to access to information.

In both worlds, Cryptography has played a key role by providing the tools needed to ensure the confidentiality, integrity and availability of both personal data and information. Biometrics, on the other hand, has proposed and offered different techniques aimed at authenticating individuals through their biometric traits, such as fingerprints, iris, hand geometry, voice, gait, etc. Each of these two sciences provides solutions to specific problems of data protection and user authentication, which would be greatly strengthened if certain characteristics of both were combined towards common objectives. It is therefore imperative to intensify research in this area, combining the mathematical algorithms and primitives of Cryptography with Biometrics, in order to meet the growing demand for more technical, secure and easy-to-use solutions that simultaneously strengthen data protection and user identification. In this combination, the concept of cancelable biometrics has become a cornerstone of the user authentication and identification process, since it provides revocation and cancellation properties to the biometric traits.

The contribution of this thesis addresses the main aspect of Biometrics, namely the secure and efficient authentication of users through their biometric traits, following three different approaches:
1. The design of a fuzzy crypto-biometric scheme that implements the principles of cancelable biometrics to identify users, taking advantage of the fuzziness of the biometric templates while dealing with the problems arising from intra- and inter-user variability, without compromising the templates extracted from legitimate users.
2. The design of a new Similarity Preserving Hash Function (SPHF). Such functions are currently used in the digital forensics field to find similarities in the content of distinct but similar files, in order to determine to what extent those files could be considered equal. The function defined in this research work, besides improving the results of the main functions developed to date, seeks to extend its use to the comparison of iris templates.
3. The development of a new iris template comparison mechanism that treats the templates as signals and compares them using the Walsh-Hadamard transform (complemented with three other algorithms). The results obtained are excellent, taking into account the security and privacy requirements mentioned above.

Each of the three schemes designed has been implemented in order to run experiments and test its operational efficacy in scenarios that simulate real situations: the fuzzy crypto-biometric scheme and the SPHF have been implemented in Java, while the process based on the Walsh-Hadamard transform has been implemented in Matlab. The experiments used a database of iris images (CASIA-IrisV2) to simulate a population of system users. In the particular case of the SPHF, additional experiments were carried out to verify its usefulness in the digital forensics field by comparing files and images with similar and dissimilar content. For each of the schemes, the False Non-Match and False Match rates have been calculated with different parameters and cases to analyse their behaviour.
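As a minimal illustration of the signal-based comparison (not the thesis' full method, which combines the transform with three other algorithms), the Python sketch below maps two binary iris codes to ±1, applies the Walsh-Hadamard transform and scores their spectral correlation; the code length, mapping and score are illustrative assumptions.

# Minimal sketch: compare two binary iris templates as signals via the
# Walsh-Hadamard transform. The +/-1 mapping, the spectral correlation score
# and the template length are illustrative assumptions.
import numpy as np
from scipy.linalg import hadamard

def wht_similarity(code_a, code_b):
    """code_a, code_b: binary iris codes of the same power-of-two length."""
    n = len(code_a)
    H = hadamard(n)
    a = H @ (2.0 * np.asarray(code_a) - 1.0)   # WHT of the +/-1 mapped template
    b = H @ (2.0 * np.asarray(code_b) - 1.0)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(0)
template = rng.integers(0, 2, size=256)
probe = template.copy()
flip = rng.choice(256, size=20, replace=False)  # simulate intra-user variability
probe[flip] ^= 1
print(wht_similarity(template, probe))          # close to 1 for the same eye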