997 results for integrity verification technique
Abstract:
Dissertation submitted for the degree of Doctor in Informatics Engineering
Abstract:
Introduction: Diffusion tensor imaging (DTI) has proven to be an exquisite biomarker of tissue microstructure integrity. The technique has been applied successfully to schizophrenia, showing that fractional anisotropy (FA, a marker of white matter integrity) is diminished in several areas of the brain (Kyriakopoulos M et al. (2008)). New ways of representing diffusion data have emerged recently and have made it possible to create structural connectivity maps of healthy brains (Hagmann P et al. (2008)). These maps allow alterations to be studied over the entire brain at the connection and network level, which is of high interest in complex disconnection diseases such as schizophrenia. We report on the specific network alterations found in schizophrenic patients. Methods: 13 patients with chronic schizophrenia were recruited from in-patient, day-treatment, and out-patient clinics. Comparison subjects were recruited and group-matched to patients on age, sex, handedness, and parental socioeconomic status. The study was approved by the local IRB and all subjects gave written informed consent. Subjects were scanned with a 3T clinical MRI scanner; DTI and high-resolution anatomical T1w imaging were performed during the same session. The path from diffusion MRI to multi-resolution structural connection matrices of the entire brain is a five-step process performed as described in Hagmann P et al. (2008): (1) DTI and T1w MRI of the brain, (2) segmentation of white and gray matter, (3) white matter tractography, (4) segmentation of the cortex into 242 ROIs of equal surface area covering the entire cortex (Fig 1), and (5) construction of the connection network by measuring, for each ROI-to-ROI connection, the average FA along the corresponding tract. Results: For every connection between two ROIs of the network we tested the hypothesis H0: "average FA along the fiber pathway is greater than or equal in patients compared with controls". H0 was rejected for connections where average FA was significantly lower in patients than in controls; the threshold p-value was 0.01, corrected for multiple comparisons using the false discovery rate. We consistently identified alterations in temporal, occipito-temporal, precuneo-temporal, frontal inferior, and precuneo-cingulate connections (Fig 2: significant connections in yellow). This agrees with the literature, which has shown across several studies that FA is diminished in several areas of the brain; more precisely, abnormalities have been reported in prefrontal and temporal white matter and, to some extent, in parietal and occipital regions. The alterations reported in the literature specifically include the corpus callosum, the arcuate fasciculus, and the cingulum bundle, which was the case here as well. In addition, small-world indexes are significantly reduced in patients (p<0.01) (Fig. 3). Conclusions: Using connectome mapping to characterize differences in structural connectivity between healthy and diseased subjects, we were able to show widespread connectional alterations in schizophrenia patients and a systematic decrease in small-worldness, a marker of network disorganization. More generally, we describe a method with the capacity to sensitively identify structural alterations in complex disconnection syndromes where lesions are widespread throughout the connectional network.
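The edge-wise testing procedure described above (a one-sided test per connection, with false-discovery-rate control at 0.01) can be illustrated with a short sketch. This is a minimal illustration, not the authors' pipeline: it assumes per-connection average-FA samples for both groups are already available as arrays, uses a one-sided t-test from SciPy, and applies a hand-rolled Benjamini-Hochberg correction.

```python
import numpy as np
from scipy.stats import ttest_ind

def benjamini_hochberg(pvals, q=0.01):
    """Return a boolean mask of hypotheses rejected at FDR level q."""
    p = np.asarray(pvals)
    order = np.argsort(p)
    m = len(p)
    # Largest rank k with p_(k) <= (k/m) * q; reject all hypotheses up to k.
    thresh = (np.arange(1, m + 1) / m) * q
    below = p[order] <= thresh
    reject = np.zeros(m, dtype=bool)
    if below.any():
        k = np.max(np.nonzero(below)[0])
        reject[order[:k + 1]] = True
    return reject

# Hypothetical data: average FA along each of 1000 connections,
# for 13 patients and 13 matched controls.
rng = np.random.default_rng(0)
fa_controls = rng.normal(0.45, 0.05, size=(1000, 13))
fa_patients = rng.normal(0.44, 0.05, size=(1000, 13))

# One-sided test per connection: H0 "FA_patients >= FA_controls".
pvals = [ttest_ind(fa_patients[c], fa_controls[c], alternative="less").pvalue
         for c in range(fa_patients.shape[0])]
altered = benjamini_hochberg(pvals, q=0.01)
print(f"{altered.sum()} connections with significantly lower FA in patients")
```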
Abstract:
With many operational centers moving toward order-1-km grid length models for routine weather forecasting, this paper presents a systematic investigation of the properties of high-resolution versions of the Met Office Unified Model for short-range forecasting of convective rainfall events. The authors describe a suite of configurations of the Met Office Unified Model running with grid lengths of 12, 4, and 1 km and analyze results from these models for a number of convective cases from the summers of 2003, 2004, and 2005. The analysis includes subjective evaluation of the rainfall fields and comparisons of rainfall amounts, initiation, cell statistics, and a scale-selective verification technique. It is shown that the 4- and 1-km grid length models often give more realistic-looking precipitation fields because convection is represented explicitly rather than parameterized. However, the 4-km model suffers from overly large convective cells and delayed initiation because the grid length is too long to correctly reproduce the convection explicitly. These problems are not as evident in the 1-km model, although it does produce too many small cells in some situations. Both the 4- and 1-km models suffer from poor representation at the start of the forecast, in the period when the high-resolution detail is spinning up from the lower-resolution (12 km) starting data. A scale-selective precipitation verification technique implies that, for later times in the forecasts (after the spin-up period), the 1-km model performs better than the 12- and 4-km models for lower rainfall thresholds. For higher thresholds the 4-km model scores almost as well as the 1-km model, and both do better than the 12-km model.
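A widely used scale-selective precipitation verification measure of the kind referred to here is the fractions skill score (FSS), which compares neighbourhood exceedance fractions between forecast and observed fields at increasing spatial scales. The sketch below is an illustrative FSS implementation under that assumption, not necessarily the exact technique used in the paper; the fields and parameters are hypothetical.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def fss(forecast, observed, threshold, scale):
    """Fractions skill score for one rainfall threshold and one
    neighbourhood size (in grid points)."""
    f = (forecast >= threshold).astype(float)
    o = (observed >= threshold).astype(float)
    # Neighbourhood fractions: mean occurrence in a scale x scale window.
    pf = uniform_filter(f, size=scale, mode="constant")
    po = uniform_filter(o, size=scale, mode="constant")
    mse = np.mean((pf - po) ** 2)
    mse_ref = np.mean(pf ** 2) + np.mean(po ** 2)  # worst-case reference
    return 1.0 - mse / mse_ref if mse_ref > 0 else np.nan

# Hypothetical hourly rain fields (mm/h) on a model grid:
rng = np.random.default_rng(1)
obs = rng.gamma(0.5, 2.0, size=(200, 200))
fcst = np.roll(obs, 5, axis=1)  # a displaced but otherwise perfect forecast
for scale in (1, 5, 25):
    print(scale, round(fss(fcst, obs, threshold=4.0, scale=scale), 3))
```

The displaced-forecast example shows the scale-selective behaviour: the score is poor at grid scale but recovers as the neighbourhood grows past the displacement, which is how such techniques credit high-resolution models for near-miss convective cells.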
Abstract:
A new fragile logo watermarking scheme is proposed for public authentication and integrity verification of images. The security of the proposed block-wise scheme relies on a public encryption algorithm and a hash function. The encoding and decoding methods can provide public detection capabilities even in the absence of the image indices and the original logos. Furthermore, the detector automatically authenticates input images and extracts possible multiple logos and image indices, which can be used not only to localise tampered regions, but also to identify the original source of images used to generate counterfeit images. Results are reported to illustrate the effectiveness of the proposed method.
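As a rough illustration of block-wise fragile watermarking with a hash function (the public-key encryption and logo-embedding steps of the actual scheme are omitted), the sketch below embeds a truncated SHA-256 digest of each block's most significant bits into that block's least significant bits; any tampering changes the digest and localises the modified block. All names and parameters here are hypothetical.

```python
import hashlib
import numpy as np

BLOCK = 8  # 8x8 blocks -> 64 LSBs per block, enough for a 64-bit digest

def _digest_bits(msb_block, idx):
    # Hash the 7 most significant bit-planes together with the block index.
    h = hashlib.sha256(msb_block.tobytes() + idx.to_bytes(4, "big"))
    return np.unpackbits(np.frombuffer(h.digest(), np.uint8))[:BLOCK * BLOCK]

def embed(img):
    """Write each block's digest into that block's least significant bits."""
    out = img.copy()
    idx = 0
    for r in range(0, img.shape[0] - BLOCK + 1, BLOCK):
        for c in range(0, img.shape[1] - BLOCK + 1, BLOCK):
            blk = out[r:r + BLOCK, c:c + BLOCK]
            bits = _digest_bits(blk & 0xFE, idx)
            blk[:] = (blk & 0xFE) | bits.reshape(BLOCK, BLOCK)
            idx += 1
    return out

def verify(img):
    """Return indices of blocks whose stored digest no longer matches."""
    tampered, idx = [], 0
    for r in range(0, img.shape[0] - BLOCK + 1, BLOCK):
        for c in range(0, img.shape[1] - BLOCK + 1, BLOCK):
            blk = img[r:r + BLOCK, c:c + BLOCK]
            expected = _digest_bits(blk & 0xFE, idx).reshape(BLOCK, BLOCK)
            if not np.array_equal(blk & 1, expected):
                tampered.append(idx)
            idx += 1
    return tampered

img = np.random.default_rng(2).integers(0, 256, (64, 64), dtype=np.uint8)
wm = embed(img)
wm[10, 10] ^= 0x80      # simulate tampering with one pixel
print(verify(wm))       # -> the block containing pixel (10, 10)
```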
Abstract:
This thesis proposes a biometric verification technique for mobile phones that consists of making a signature in the air with the hand holding the phone. The accelerometers integrated in the device sample the accelerations of the in-air signature movement, generating three temporal signals that can be used to verify the user. Several approaches are proposed for implementing the verification system, based on the approaches most widely used in handwritten signature biometrics: template matching, with variants of the Needleman-Wunsch (NW) and Dynamic Time Warping (DTW) algorithms; Hidden Markov Models (HMM); and a statistical classifier based on Support Vector Machines (SVM). As there are no public databases of in-air signatures, two databases with different characteristics were captured in order to evaluate the methods proposed in this thesis: one with real forgeries produced after studying recordings of genuine users making their signatures, and another with samples from users obtained in different sessions over a long period of time. Using these databases, a large number of algorithms for implementing a verification system based on in-air signatures were evaluated. This evaluation was conducted according to the ISO/IEC 19795 standard, adding the open-set verification scenario not included in the norm. In addition, the characteristics that make a signature sufficiently secure were analysed, and the permanence of in-air signatures over time was studied, proposing several updating methods, based on dynamic template adaptation, to improve performance. Finally, a prototype of the in-air signature technique was implemented for Android and iOS phones. The results of this thesis have had a high impact, generating several publications in international journals, conferences, and books. The in-air signature has also been featured in popular-science magazines, news portals, and television. Additionally, the technique has won several awards in innovation competitions, and an exploitation agreement for the technology has been signed with a foreign company.
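Among the template-matching approaches mentioned, DTW is straightforward to sketch: it aligns two variable-length acceleration sequences by dynamic programming and returns a cumulative alignment cost that can be thresholded for an accept/reject decision. The implementation below is generic textbook DTW over three-axis accelerometer signals, not the thesis' exact variant; the threshold and signals are hypothetical.

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic Time Warping cost between two sequences of shape (n, 3),
    e.g. (x, y, z) accelerometer samples of an in-air signature."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])  # local Euclidean distance
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m] / (n + m)  # length-normalised alignment cost

# Hypothetical enrolment template and verification attempt:
rng = np.random.default_rng(3)
template = rng.standard_normal((120, 3)).cumsum(axis=0)
attempt = template[::2] + 0.05 * rng.standard_normal((60, 3))

THRESHOLD = 1.0  # in practice tuned on a development set (EER point)
score = dtw_distance(template, attempt)
print("accept" if score < THRESHOLD else "reject", round(score, 3))
```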
Abstract:
Encryption and integrity trees guard against physical attacks, but harm performance. Prior academic work has speculated around the latency of integrity verification, but has done so in an insecure manner. No industrial implementations of secure processors have included speculation. This work presents PoisonIvy, a mechanism which speculatively uses data before its integrity has been verified while preserving security and closing address-based side-channels. PoisonIvy reduces performance overheads from 40% to 20% for memory-intensive workloads and down to 1.8% on average.
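For readers unfamiliar with the integrity trees mentioned above: memory blocks are hashed into a tree whose root is kept in trusted on-chip storage, so verifying a block means recomputing hashes along the path to the root; speculation schemes like PoisonIvy let the processor consume data while this walk completes. The sketch below is a minimal software hash-tree build/verify, purely illustrative of the mechanism, not PoisonIvy itself.

```python
import hashlib

def h(*parts: bytes) -> bytes:
    return hashlib.sha256(b"".join(parts)).digest()

def build_tree(blocks):
    """Binary hash tree over memory blocks; levels[0] are the leaf hashes,
    the last level holds the single root kept in trusted storage."""
    levels = [[h(b) for b in blocks]]
    while len(levels[-1]) > 1:
        prev = levels[-1]
        levels.append([h(prev[i], prev[i + 1]) for i in range(0, len(prev), 2)])
    return levels

def verify_block(blocks, levels, idx, trusted_root):
    """Recompute the path from block idx up to the root; this walk is the
    latency that speculative schemes overlap with use of the data."""
    node = h(blocks[idx])
    for level in levels[:-1]:
        sibling = level[idx ^ 1]  # neighbour node at this level
        node = h(node, sibling) if idx % 2 == 0 else h(sibling, node)
        idx //= 2
    return node == trusted_root

blocks = [bytes([i]) * 64 for i in range(8)]   # 8 fake 64-byte memory blocks
levels = build_tree(blocks)
root = levels[-1][0]
print(verify_block(blocks, levels, 3, root))   # True
blocks[3] = b"tampered".ljust(64, b"\0")       # simulate a physical attack
print(verify_block(blocks, levels, 3, root))   # False
```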
Abstract:
Faculty of Mathematics and Computer Science, UAM
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
The clinical advantage of protons over conventional high-energy x-rays stems from their unique depth-dose distribution, which delivers essentially no dose beyond the end of range. To achieve this advantage, accurate localization of the tumor volume relative to the proton beam is necessary. For tumors that move with respiration, the resultant dose distribution is sensitive to that motion; one way to reduce the resulting uncertainty is to use gated beam delivery. The main goal of this dissertation is to evaluate the respiratory gating technique in both passive scattering and scanning delivery modes. Our hypothesis was that optimization of the parameters of synchrotron operation and respiratory gating can lead to greater efficiency and accuracy of respiratory gating for all modes of synchrotron-based proton treatment delivery. The hypothesis is tested through two specific aims. Specific aim #1 is to assess the efficiency of respiratory-gated proton beam delivery and to optimize synchrotron operation for gated proton therapy. A simulation study was performed, introducing an efficient synchrotron operation pattern called variable Tcyc; the same study also estimated the efficiency of the respiratory-gated scanning beam delivery mode. Specific aim #2 is to assess the accuracy of beam delivery in respiratory-gated proton therapy. The simulation study was extended to the passive scattering mode to estimate the quality of pulsed beam delivery to the residual motion for several synchrotron operation patterns used with the gating technique. The results showed that variable Tcyc operation can offer reproducible beam delivery to the residual motion at a given phase of the motion. For respiratory-gated scanning beam delivery, the impact of motion on the dose distributions produced by scanned beams was investigated by measurement. The results established a motion threshold for a variety of scan patterns and the appropriate number of paintings for normal and respiratory-gated beam deliveries. Together, the results of specific aims 1 and 2 provide supporting data for implementing respiratory-gated beam delivery in both passive and scanning modes, and validate the hypothesis.
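The efficiency question in specific aim #1 (how much of each respiratory gate a synchrotron can actually fill with beam, given its spill cycle) lends itself to a toy simulation. The sketch below is a deliberately simplified model with hypothetical parameters, not the dissertation's simulation: beam is available only during a periodic flat-top portion of each cycle of length Tcyc, a gate opens near end-exhale, and efficiency is the fraction of time when both conditions overlap.

```python
import numpy as np

def gating_efficiency(t_cyc, spill_frac, resp_period=4.0, gate_frac=0.3,
                      duration=600.0, dt=1e-3):
    """Fraction of time with beam deliverable: gate open AND spill on.

    t_cyc:       synchrotron cycle length (s); beam available during the
                 first spill_frac of each cycle (flat-top), off otherwise.
    resp_period: breathing period (s); the gate opens around end-exhale
                 for gate_frac of each breath (amplitude-threshold gating).
    """
    t = np.arange(0.0, duration, dt)
    spill_on = (t % t_cyc) < spill_frac * t_cyc
    resp = np.cos(2 * np.pi * t / resp_period)    # 1 at end-exhale (toy model)
    gate_open = resp > np.cos(np.pi * gate_frac)  # ~gate_frac duty cycle
    return np.mean(spill_on & gate_open)

# A fixed long cycle vs. a shorter cycle (a crude stand-in for a more
# responsive operation pattern such as variable Tcyc):
for t_cyc in (6.0, 2.0):
    print(f"Tcyc={t_cyc}s -> efficiency {gating_efficiency(t_cyc, 0.5):.2%}")
```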
Abstract:
The purpose of this paper was to study the main effects of turning on the surface integrity of the duplex stainless steel ASTM A890-6A. The tests were conducted on a turning centre with carbide tools, and the main input variables were: tool material class, feed rate, depth of cut, cutting speed, and the use of cutting fluid. The following responses were analysed: microstructure, by optical microscopy and x-ray diffraction; cutting forces, measured with a piezoelectric dynamometer; surface roughness; residual stress, by the x-ray diffraction technique; and microhardness. The results show no change in the microstructure of the material, even when the highest cutting parameters were used. The lowest feed rate (0.1 mm/rev), the lowest cutting speed (110 m/min), and the greatest depth of cut (0.5 mm) provided the lowest tensile residual stress, the lowest surface roughness, and the greatest microhardness.
Abstract:
Conventional procedures used to assess the integrity of corroded piping systems with axial defects generally employ simplified failure criteria based upon a plastic collapse failure mechanism incorporating the tensile properties of the pipe material. These methods establish acceptance criteria for defects based on limited experimental data for low-strength structural steels, which do not necessarily address specific requirements of the high-grade steels currently in use. For such steels, failure assessments may be overly conservative or show significant scatter in their predictions, leading to unnecessary repair or replacement of in-service pipelines. Motivated by these observations, this study examines the applicability of a stress-based criterion, built upon plastic instability analysis, to predict the failure pressure of corroded pipelines with axial defects. A central focus is to gain additional insight into the effects of defect geometry and material properties on the attainment of a local limit load, to support the development of stress-based burst strength criteria. The work provides an extensive body of results which lend further support to adopting failure criteria for corroded pipelines based upon ligament instability analyses. A verification study conducted on burst tests of large-diameter pipe specimens with different defect lengths shows the effectiveness of a stress-based criterion using local ligament instability in burst pressure predictions, even though the adopted burst criterion exhibits a potential dependence on defect geometry and possibly on the material's strain hardening capacity. Overall, the results presented here suggest that stress-based criteria based upon plastic instability analysis of the defect ligament are a valid engineering tool for integrity assessments of pipelines with axial corrosion defects.
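As a concrete point of reference for the simplified plastic-collapse (flow-stress) criteria this study re-examines, the sketch below implements a classical B31G-type failure-pressure estimate for a single axial defect. It is offered only as an illustration of that family of criteria, under the usual textbook assumptions (parabolic defect area, Folias bulging factor); it is not the authors' stress-based ligament-instability criterion, and the example inputs are hypothetical.

```python
import math

def b31g_failure_pressure(D, t, d, L, smys):
    """Classical B31G-style burst pressure estimate for an axial defect.

    D, t : pipe outer diameter and wall thickness (mm)
    d, L : maximum defect depth and axial defect length (mm)
    smys : specified minimum yield strength (MPa)
    Returns an estimated failure pressure in MPa.
    """
    flow_stress = 1.1 * smys                     # B31G flow-stress definition
    m = math.sqrt(1.0 + 0.8 * L**2 / (D * t))    # Folias (bulging) factor
    a_ratio = (2.0 / 3.0) * d / t                # parabolic area assumption
    hoop_failure = flow_stress * (1 - a_ratio) / (1 - a_ratio / m)
    return 2.0 * hoop_failure * t / D            # thin-wall hoop relation

# Hypothetical high-grade line pipe with a 50%-deep, 200 mm long defect:
p_f = b31g_failure_pressure(D=508.0, t=14.3, d=7.15, L=200.0, smys=448.0)
print(f"estimated failure pressure: {p_f:.2f} MPa")
```

The scatter and conservatism discussed in the abstract arise precisely because criteria of this form tie failure to a flow stress and an empirical defect-area term, rather than to a local limit-load (ligament instability) condition.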
Abstract:
This paper presents a novel algorithm to achieve viable integrity and authenticity addition and verification for n-frame DICOM medical images using cryptographic mechanisms. The aim of this work is the enhancement of DICOM security measures, especially for multiframe images, since current approaches have limitations that should be properly addressed for improved security. The proposed algorithm uses data encryption, together with a digital signature, to provide integrity and authenticity. Relevant header data and the digital signature are used as inputs to cipher the image, so the original data can be retrieved if and only if both the images and the inputs are correct. The encryption process itself is a cascading scheme, in which each frame is ciphered with data related to the previous frames, also generating additional data on image integrity and authenticity. Decryption mirrors encryption and also performs the standard security verification of the image. The implementation was done in Java, and a performance evaluation was carried out comparing the speed of the algorithm with other existing approaches. The evaluation showed good performance, an encouraging result for use in a real environment.
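The cascading idea (each frame ciphered with material derived from the previous frames, plus header data and a signature digest) can be sketched generically. The code below is an assumed reconstruction for illustration only, using AES-CTR from the `cryptography` package and SHA-256 key chaining; it is not the paper's algorithm, and all names and inputs are hypothetical.

```python
import hashlib
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def _aes_ctr(key, nonce, data):
    enc = Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor()
    return enc.update(data) + enc.finalize()

def encrypt_frames(frames, header_bytes, signature):
    """Cascade-encrypt multiframe data: the key for frame i is derived from
    the header digest, the signature, and the ciphertext of frame i-1, so a
    frame decrypts correctly only if everything before it is intact."""
    key, out = hashlib.sha256(header_bytes + signature).digest(), []
    for frame in frames:
        nonce = os.urandom(16)
        ct = _aes_ctr(key, nonce, frame)
        out.append((nonce, ct))
        key = hashlib.sha256(key + ct).digest()  # chain into the next frame
    return out

def decrypt_frames(encrypted, header_bytes, signature):
    key, frames = hashlib.sha256(header_bytes + signature).digest(), []
    for nonce, ct in encrypted:
        frames.append(_aes_ctr(key, nonce, ct))  # CTR decrypt == encrypt
        key = hashlib.sha256(key + ct).digest()
    return frames

frames = [b"frame-0" * 100, b"frame-1" * 100, b"frame-2" * 100]
enc = encrypt_frames(frames, header_bytes=b"PatientID|StudyUID", signature=b"sig")
assert decrypt_frames(enc, b"PatientID|StudyUID", b"sig") == frames
# A wrong header (or signature) garbles every frame, which is what ties
# decryption to integrity and authenticity:
assert decrypt_frames(enc, b"tampered-header", b"sig") != frames
```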
Abstract:
Dentin bonding performed with hydrophobic resins using ethanol-wet bonding should be less susceptible to degradation, but this hypothesis had never been validated. Objectives. This in vitro study evaluated the stability of resin-dentin bonds created with an experimental three-step BisGMA/TEGDMA hydrophobic adhesive or a three-step hydrophilic adhesive after one year of accelerated aging in artificial saliva. Methods. Flat surfaces in mid-coronal dentin were obtained from 45 sound human molars and randomly divided into three groups (n = 15): an experimental three-step BisGMA/TEGDMA hydrophobic adhesive applied to ethanol-saturated dentin (ethanol-wet bonding, GI) or to water-saturated dentin (water-wet bonding, GII), and Adper Scotchbond Multi-Purpose (MP, GIII) applied, according to the manufacturer's instructions, to water-saturated dentin. Resin composite crowns were incrementally formed and light-cured to approximately 5 mm in height. Bonded specimens were stored in artificial saliva at 37 degrees C for 24 h and sectioned into sticks, then subjected to microtensile bond testing and TEM analysis immediately and after one year. Data were analyzed with two-way ANOVA and Tukey tests. Results. MP exhibited a significant reduction in microtensile bond strength after aging (24 h: 40.6 +/- 2.5(a); one year: 27.5 +/- 3.3(b); in MPa). Hybrid layer degradation was evident in all specimens examined by TEM. The hydrophobic adhesive with ethanol-wet bonding preserved bond strength (24 h: 43.7 +/- 7.4(a); one year: 39.8 +/- 2.7(a)) and hybrid layer integrity, with the latter demonstrating intact collagen fibrils and wide interfibrillar spaces. Significance. Coaxing hydrophobic resins into acid-etched dentin using ethanol-wet bonding preserves resin-dentin bond integrity without the adjunctive use of MMP inhibitors and warrants further biocompatibility and patient safety studies and clinical testing.
Abstract:
Dissertation presented at the Faculdade de Ciências e Tecnologia, Universidade Nova de Lisboa, for the degree of Master in Conservation and Restoration, specialisation in painting on canvas
Abstract:
Concurrent programming remains a difficult task even for very experienced programmers. Concurrency research has provided a rich set of tools and mechanisms for dealing with data races and deadlocks, problems that arise from the incorrect use of synchronization. Verifying interesting properties of concurrent programs is considerably harder than for sequential programs because of the intrinsically non-deterministic nature of concurrent execution, which results in an explosion in the number of possible program states and makes manual treatment, or even computer-assisted treatment, a serious challenge. Some approaches create programming languages with higher-level constructs for expressing concurrency and synchronization. Other approaches develop reasoning techniques and methods to prove properties, using general theorem provers, model checking, or specific algorithms over a given type system. Lightweight static analysis approaches apply techniques such as abstract interpretation to detect certain kinds of errors in a conservative way; these techniques generally scale well enough for large software projects, but the kinds of errors they can detect are limited. Some interesting properties are related to data races and deadlocks, while others concern system security, such as confidentiality and integrity of data. The main goals of this proposal are to identify properties of interest to verify in concurrent systems and to develop techniques and tools to verify them fully automatically. To achieve these goals, emphasis will be placed on the study and development of type systems such as dependent types, type-and-effect systems, and flow-sensitive effect types. These type systems will be applied to models of concurrent programming such as Simple Concurrent Object-Oriented Programming (SCOOP) and Java. Security properties will also be addressed using specific type systems.
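As a minimal illustration of the kind of defect such verification targets (Python is used here only for brevity; the proposal itself targets models such as SCOOP and Java), the sketch below shows a classic data race: threads performing unsynchronized read-modify-write updates on a shared counter, alongside the lock-protected version that restores atomicity.

```python
import threading

counter = 0
lock = threading.Lock()

def unsafe_increment(n):
    global counter
    for _ in range(n):
        tmp = counter          # read
        tmp += 1               # modify
        counter = tmp          # write: another thread may have run in between

def safe_increment(n):
    global counter
    for _ in range(n):
        with lock:             # critical section: read-modify-write is atomic
            counter += 1

for worker in (unsafe_increment, safe_increment):
    counter = 0
    threads = [threading.Thread(target=worker, args=(100_000,)) for _ in range(4)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    print(worker.__name__, counter)  # the unsafe run usually loses updates
```

A type-and-effect or flow-sensitive type system of the kind proposed would reject the unsynchronized version statically, instead of relying on the non-deterministic failure being observed at run time.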