917 results for Variant hemoglobin


Relevance:

10.00%

Publisher:

Abstract:

SIMON is a family of 10 lightweight block ciphers published by Beaulieu et al. from the United States National Security Agency (NSA). A cipher in this family with a K-bit key and an N-bit block is called SIMON N/K. We present several linear characteristics for reduced-round SIMON32/64 that can be used for a key-recovery attack, and extend them further to attack other variants of SIMON. Moreover, we provide results of key-recovery analysis using several impossible differential characteristics, starting from 14 out of 32 rounds for SIMON32/64 up to 22 out of 72 rounds for SIMON128/256. In some cases the presented observations do not directly yield an attack, but they provide a basis for further analysis of the specific SIMON variant. Finally, we exploit a connection between linear and differential characteristics for SIMON to construct linear characteristics for different variants of reduced-round SIMON. Our attacks extend to all variants of SIMON and cover more rounds than any known results using linear cryptanalysis. We present a key-recovery attack against SIMON128/256 which covers 35 out of 72 rounds with data complexity 2^123. We have implemented our attacks for small-scale variants of SIMON, and our experiments confirm the theoretical bias presented in this work.
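For readers who want the round structure concrete, the sketch below (a minimal Python illustration, not a full implementation) shows SIMON's Feistel round with the 16-bit word size of SIMON32/64; the round keys are placeholders, since the key schedule that expands the 64-bit master key is omitted.

```python
# Minimal sketch of the SIMON round structure (16-bit words, as in SIMON32/64).
# The round keys are placeholders; the real cipher derives 32 round keys from
# the 64-bit master key via SIMON's key schedule, omitted here.
WORD_BITS = 16
MASK = (1 << WORD_BITS) - 1

def rotl(x, r):
    return ((x << r) | (x >> (WORD_BITS - r))) & MASK

def f(x):
    # SIMON's non-linear round function: ((x <<< 1) & (x <<< 8)) ^ (x <<< 2)
    return (rotl(x, 1) & rotl(x, 8)) ^ rotl(x, 2)

def encrypt(block, round_keys):
    x, y = (block >> WORD_BITS) & MASK, block & MASK
    for k in round_keys:
        x, y = y ^ f(x) ^ k, x          # one Feistel round
    return (x << WORD_BITS) | y

placeholder_keys = list(range(32))       # 32 rounds for SIMON32/64
print(hex(encrypt(0x65656877, placeholder_keys)))
```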

Relevance:

10.00%

Publisher:

Abstract:

At CRYPTO 2006, Halevi and Krawczyk proposed two randomized hash function modes and analyzed the security of digital signature algorithms based on these constructions. They showed that the security of signature schemes based on the two randomized hash function modes relies on properties similar to second preimage resistance rather than on the collision resistance of the hash functions. One of the randomized hash function modes was named the RMX hash function mode and was recommended for practical purposes. The National Institute of Standards and Technology (NIST), USA, standardized a variant of the RMX hash function mode and published this standard in Special Publication (SP) 800-106. In this article, we first discuss a generic online birthday existential forgery attack of Dang and Perlner on the RMX-hash-then-sign schemes. We show that a variant of this attack can be applied to forge the other randomize-hash-then-sign schemes. We point out practical limitations of the generic forgery attack on the RMX-hash-then-sign schemes. We then show that these limitations can be overcome for the RMX-hash-then-sign schemes if it is easy to find fixed points for the underlying compression functions, as is the case for the Davies-Meyer construction used in popular hash functions such as MD5, designed by Rivest, and the SHA family of hash functions, designed by the National Security Agency (NSA), USA and published by NIST in the Federal Information Processing Standards (FIPS). We show an online birthday forgery attack on this class of signatures by using a variant of Dean’s method of finding fixed-point expandable messages for hash functions based on the Davies-Meyer construction. This forgery attack also applies to signature schemes based on the variant of RMX standardized by NIST in SP 800-106. We discuss some important applications of our attacks and their applicability to signature schemes based on hash functions with ‘built-in’ randomization. Finally, we compare our attacks on randomize-hash-then-sign schemes with the generic forgery attacks on the standard hash-based message authentication code (HMAC).
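The fixed-point property mentioned above is simple to illustrate: for a Davies-Meyer compression function h' = E_m(h) XOR h, decrypting zero under any message block m gives a chaining value that the function maps to itself. The sketch below uses a toy invertible permutation as a stand-in for a real block cipher, so it demonstrates only the mechanism, not an attack on any deployed hash.

```python
# Davies-Meyer fixed points: h' = E_m(h) ^ h, so h* = D_m(0) satisfies h' = h*.
# toy_E is a small invertible permutation standing in for a real block cipher.
MASK = 0xFFFFFFFF

def rotl(x, r): return ((x << r) | (x >> (32 - r))) & MASK
def rotr(x, r): return ((x >> r) | (x << (32 - r))) & MASK

def toy_E(k, x):                 # toy "encryption" (NOT a real cipher)
    return (rotl(x ^ k, 3) + k) & MASK

def toy_D(k, y):                 # its inverse ("decryption")
    return rotr((y - k) & MASK, 3) ^ k

def davies_meyer(m, h):          # compression: h' = E_m(h) XOR h
    return toy_E(m, h) ^ h

m = 0xDEADBEEF                   # arbitrary message block
h_fixed = toy_D(m, 0)            # E_m(h_fixed) = 0, so the output equals h_fixed
assert davies_meyer(m, h_fixed) == h_fixed
```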

Relevance:

10.00%

Publisher:

Abstract:

In this paper we present concrete collision and preimage attacks on a large class of compression function constructions that make two calls to their underlying ideal primitives. The complexity of the collision attack is above the theoretical lower bound for constructions of this type, but below the birthday complexity; the complexity of the preimage attack, however, equals the theoretical lower bound. We also present undesirable properties of some of Stam’s compression functions proposed at CRYPTO ’08. We show that when one of the n-bit to n-bit components of the proposed 2n-bit to n-bit compression function is replaced by a fixed-key cipher in the Davies-Meyer mode, the complexity of finding a preimage is 2^(n/3). We also show that the complexity of finding a collision in a variant of the 3n-bit to 2n-bit scheme with its output truncated to 3n/2 bits is 2^(n/2). The complexity of our preimage attack on this hash function is about 2^n. Finally, we present a collision attack on a variant of the proposed (m + s)-bit to s-bit scheme, truncated to s − 1 bits, with a complexity of O(1). However, none of our results compromise Stam’s security claims.
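As a point of reference for the complexities quoted above, a generic birthday search against any n-bit output takes roughly 2^(n/2) evaluations. The toy sketch below finds a collision on a 32-bit truncation of SHA-256 standing in for an ideal short-output compression function; it illustrates birthday complexity only and is unrelated to Stam’s constructions.

```python
# Generic birthday collision search on a 32-bit truncated hash (~2^16 trials expected).
import hashlib, itertools

def trunc_hash(data: bytes, nbytes: int = 4) -> int:
    # toy stand-in for a compression function with a short (32-bit) output
    return int.from_bytes(hashlib.sha256(data).digest()[:nbytes], "big")

seen = {}
for i in itertools.count():
    msg = i.to_bytes(8, "big")
    h = trunc_hash(msg)
    if h in seen:                         # two distinct messages, same output
        print("collision:", seen[h].hex(), msg.hex(), "->", hex(h))
        break
    seen[h] = msg
```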

Relevance:

10.00%

Publisher:

Abstract:

Halevi and Krawczyk proposed a message randomization algorithm called RMX as a front-end tool to hash-then-sign digital signature schemes such as DSS and RSA, in order to free them from reliance on the collision resistance of the hash functions. They showed that to forge an RMX-hash-then-sign signature, one has to solve a cryptanalytic task related to finding second preimages for the hash function. In this article, we show how to use Dean’s method of finding expandable messages (devised for finding second preimages in the Merkle-Damgård hash function) to existentially forge a signature scheme based on a t-bit RMX hash function that uses Davies-Meyer compression functions (e.g., MD4, MD5, the SHA family) in 2^(t/2) chosen messages plus 2^(t/2+1) off-line operations of the compression function and a similar amount of memory. This forgery attack also works on signature schemes that use Davies-Meyer schemes and a variant of RMX published by NIST in its Draft Special Publication (SP) 800-106. We discuss some important applications of our attack.
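A rough sketch of the randomize-then-hash idea is given below: the salt is prepended and XORed into every message block before the ordinary hash is applied. The exact block padding, salt length and encoding mandated by RMX and SP 800-106 are omitted, so this shows only the general structure, not the standardized transform.

```python
# Rough sketch of randomize-then-hash: prepend a fresh salt and XOR it into
# each message block before hashing. Padding/encoding details of RMX and
# SP 800-106 are deliberately omitted.
import hashlib, os

BLOCK = 64  # SHA-256 message-block size in bytes

def randomize_then_hash(message: bytes, salt: bytes) -> bytes:
    padded = message + b"\x00" * (-len(message) % BLOCK)
    blocks = [padded[i:i + BLOCK] for i in range(0, len(padded), BLOCK)]
    randomized = b"".join(bytes(a ^ b for a, b in zip(blk, salt)) for blk in blocks)
    return hashlib.sha256(salt + randomized).digest()

salt = os.urandom(BLOCK)                       # fresh per-signature randomness
digest = randomize_then_hash(b"message to be signed", salt)
# a hash-then-sign scheme would sign `digest` and transmit `salt` with the signature
```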

Relevance:

10.00%

Publisher:

Abstract:

NIST’s forthcoming Advanced Hash Standard (AHS) competition to select the SHA-3 hash function requires that each candidate hash function submission have at least one construction supporting the FIPS 198 HMAC application. As part of its evaluation, NIST aims to select either a candidate hash function that is more resistant to known side-channel attacks (SCA) when plugged into HMAC, or one that has an alternative MAC mode more resistant to known SCA than the other submitted alternatives. In response, we perform differential power analysis (DPA) on possible smart-card implementations of some of the recently proposed MAC alternatives to the NMAC (a fully analyzed variant of HMAC) and HMAC algorithms, and on NMAC/HMAC versions of some recently proposed hash and compression function modes. We show that the recently proposed BNMAC and KMDP MAC schemes are even weaker than NMAC/HMAC against DPA attacks, whereas multi-lane NMAC, EMD MAC and the keyed wide-pipe hash offer security similar to NMAC against DPA attacks. Our DPA attacks do not work on the NMAC settings of the MDC-2, Grindahl and MAME compression functions.
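For reference, HMAC (with NMAC as its two-key abstraction) nests two hash calls keyed through the inner and outer pads. The sketch below reproduces the FIPS 198 construction for SHA-256 and checks it against Python's standard hmac module.

```python
# HMAC(K, m) = H((K' ^ opad) || H((K' ^ ipad) || m)), block size 64 bytes for SHA-256.
import hashlib, hmac

def hmac_sha256(key: bytes, msg: bytes) -> bytes:
    block = 64
    if len(key) > block:                      # long keys are hashed first
        key = hashlib.sha256(key).digest()
    key = key.ljust(block, b"\x00")           # then zero-padded to the block size
    ipad = bytes(b ^ 0x36 for b in key)
    opad = bytes(b ^ 0x5C for b in key)
    inner = hashlib.sha256(ipad + msg).digest()
    return hashlib.sha256(opad + inner).digest()

key, msg = b"secret key", b"message"
assert hmac_sha256(key, msg) == hmac.new(key, msg, hashlib.sha256).digest()
```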

Relevance:

10.00%

Publisher:

Abstract:

This paper introduces our dedicated authenticated encryption scheme ICEPOLE. ICEPOLE is a high-speed, hardware-oriented scheme suitable for high-throughput network nodes or, more generally, any environment where specialized hardware (such as FPGAs or ASICs) can be used to provide high data-processing rates. ICEPOLE-128 (the primary ICEPOLE variant) is very fast: on the modern FPGA device Virtex 6, a basic iterative architecture of ICEPOLE reaches 41 Gbit/s, over 10 times faster than the equivalent implementation of AES-128-GCM. The throughput-to-area ratio is also substantially better than that of AES-128-GCM. We have carefully examined the security of the algorithm through a range of cryptanalytic techniques, and our findings indicate that ICEPOLE offers a high security level.
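The AES-128-GCM baseline mentioned above can be approximated in software with the widely used Python cryptography package, as in the sketch below; this measures a software implementation only and is not comparable to the FPGA figures reported for ICEPOLE.

```python
# Rough software throughput measurement for AES-128-GCM (baseline only; the
# paper's comparison is against hardware implementations on a Virtex 6 FPGA).
import os, time
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=128)
aead = AESGCM(key)
nonce = os.urandom(12)
payload = os.urandom(1 << 20)                     # 1 MiB of random data

start = time.perf_counter()
ciphertext = aead.encrypt(nonce, payload, None)   # returns ciphertext || tag
elapsed = time.perf_counter() - start
print(f"AES-128-GCM (software): {len(payload) / elapsed / 1e6:.1f} MB/s")
```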

Relevance:

10.00%

Publisher:

Abstract:

This thesis focused on the development of improved capacity analysis and capacity planning techniques for railways. A number of innovations were made and tested in a case study of a real national railway. These techniques can reduce the time required for the decision-making activities that planners and managers need to perform. As railways need to be expanded to meet increasing demand, the premise that analytical capacity models can be used to identify how best to improve an existing network at least cost was fully investigated. Track duplication was the mechanism used to expand a network's capacity, and two variant capacity expansion models were formulated. Another outcome of this thesis is the development and validation of bi-objective models for capacity analysis. These models regulate the competition for track access and perform a trade-off analysis. An opportunity to develop more general multi-objective approaches was identified.

Relevance:

10.00%

Publisher:

Abstract:

The development of whole-body imaging at single-cell resolution enables system-level approaches to studying cellular circuits in organisms. Previous clearing methods focused on homogenizing the mismatched refractive indices of individual tissues, enabling reductions in opacity but falling short of achieving transparency. Here, we show that an aminoalcohol decolorizes blood by efficiently eluting the heme chromophore from hemoglobin. Direct transcardial perfusion of an aminoalcohol-containing cocktail that we previously termed CUBIC, coupled with a 10-day to 2-week clearing protocol, decolorized and rendered nearly transparent almost all organs of adult mice, as well as the entire body of infant and adult mice. This CUBIC-perfusion protocol enables rapid whole-body and whole-organ imaging at single-cell resolution using light-sheet fluorescence microscopy. The CUBIC protocol is also applicable to 3D pathology, anatomy, and immunohistochemistry of various organs. These results suggest that whole-body imaging of colorless tissues at high resolution will contribute to organism-level systems biology.

Relevance:

10.00%

Publisher:

Abstract:

Railways are an important mode of transportation. They are, however, large and complex, and their construction, management and operation are time-consuming and costly. Planning current and future activities is therefore vital, and part of that planning process is an analysis of capacity. To determine what volume of traffic can be achieved over time, a variety of railway capacity analysis techniques have been created; a generic analytical approach that incorporates more complex train paths, however, had yet to be provided. This article provides such an approach by extending a mathematical model for determining the theoretical capacity of a railway network. The main contribution of this paper is the modelling of more complex train paths, whereby each section can be visited many times in the course of a train’s journey. Three variant models are formulated and then demonstrated in a case study. The article’s numerical investigations show the applicability of the proposed models and how they may be used to gain insights into system performance.
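As a hedged illustration of the general analytic idea (not the models formulated in this article), the sketch below treats capacity as the largest mix of train journeys whose total occupation time on each track section fits within the planning horizon, expressed as a small linear program; the occupation times are invented numbers.

```python
# Toy section-occupation capacity model: choose train counts x_j so that the
# total time each section is occupied stays within the planning horizon.
# Illustrative only; not the formulation used in the article.
from scipy.optimize import linprog

# occupancy[i][j] = minutes that one journey of train type j occupies section i
occupancy = [[4.0, 6.0],
             [3.0, 5.0],
             [5.0, 2.0]]
horizon = 24 * 60                              # minutes available per section per day

# maximize x1 + x2  ->  minimize -(x1 + x2), subject to occupancy * x <= horizon
res = linprog(c=[-1.0, -1.0], A_ub=occupancy, b_ub=[horizon] * len(occupancy),
              bounds=[(0, None), (0, None)])
print("journeys per day by train type:", res.x, "total:", -res.fun)
```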

Relevance:

10.00%

Publisher:

Abstract:

In this paper, Bree Hadley discusses The Ex/centric Fixations Project, a practice-led research project which explores the inadequacy of language as a technology for expressing human experiences of difference, discrimination or marginalisation within mainstream cultures. The project asks questions about the way experience, memory and the public discourses available to express them are bound together, about the silences, failures and falsehoods embedded in any effort to convey human experience via public discourses, and about how these failures might form the basis of a performative writing method. It has, to date, focused on developing a method that expresses experience through improvised, intertextual and discontinuous collages of language drawn from a variety of public discourses. Aesthetically, this method works with what Hans-Thies Lehmann (Postdramatic Theatre p. 17) calls a “textual variant” of the postdramatic “in which language appears not as the speech of characters – if there are still definable characters at all – but as an autonomous theatricality” (Ibid. 18). It is defined by what Lehmann, following Julia Kristeva, calls a “polylogue”, which presents experience as a conflicted, discontinuous and circular phenomenon, akin to a musical fugue, breaking away from “an order centred on one logos” (Ibid. 32). The texts function simultaneously as a series of parts and as wholes: interwoven voices seeming almost to connect, almost to respond to each other, and almost to tell – or to challenge each other’s telling of – a story. In this paper, Hadley offers a performative demonstration, together with descriptions of the way spectators respond, including the way the texts’ playful, polyvocal texture shapes engagement, and the way the presence or non-presence of performing bodies to which the depicted experiences can be attached shapes engagement. She suggests that the improvised, intertextual and experimental enactments of self embodied in the texts encourage spectators to engage at an emotional level and to make meaning based primarily on memories they recall in the moment, and thus have the potential to counter the risk that people may read depictions of experiences radically different from their own in reductive, essentialised ways.

Relevance:

10.00%

Publisher:

Abstract:

Background: Serum lutein (L) and zeaxanthin (Z) correlate positively with macular pigment optical density (MPOD), so the latter is a valuable indirect measure of L and Z content in the macula. L and Z have been attributed antioxidant capacity and protection against certain retinal diseases, but their uptake within the eye is thought to depend on genetic, age and environmental factors. In particular, gene variants within beta-carotene monooxygenase 1 (BCMO1) are thought to modulate MPOD in the macula. Objectives: To determine the effect of the BCMO1 single nucleotide polymorphisms (SNPs) rs11645428, rs6420424 and rs6564851 on macular pigment optical density (MPOD) in a cohort of young, healthy participants of Caucasian origin with normal ocular health. Design: In this cohort study, MPOD was assessed in 46 healthy participants (22 male and 24 female) with a mean age of 24 ± 4.0 years (range 19-33). The three SNPs rs11645428, rs6420424 and rs6564851, which have established associations with MPOD, were genotyped using the Sequenom MassEXTEND (hME) assay. One-way analysis of variance (ANOVA) was performed on groups segregated into homozygous and heterozygous BCMO1 genotypes. Correlations between body mass index (BMI), iris colour, gender, central retinal thickness (CRT), diet and MPOD were investigated. Results: MPOD did not vary significantly with BCMO1 rs11645428 (F(2,41) = 0.700, p = 0.503), rs6420424 (F(2,41) = 0.210, p = 0.801) or rs6564851 (F(2,41) = 0.13, p = 0.88) homozygous or heterozygous genotypes in this young, healthy cohort. Combining the three SNPs into triple genotypes based on plasma conversion efficiency did not affect MPOD (F(2,41) = 0.07, p = 0.9). There was a significant negative correlation between MPOD and central retinal thickness (r = -0.39, p = 0.01), but no significant correlation between BMI, iris colour or gender and MPOD. Conclusion: Our results indicate that macular pigment deposition within the central retina does not depend on these BCMO1 gene variants in young, healthy people. We propose that MPOD is saturated in younger persons and/or that other gene variant combinations determine its deposition.
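A minimal sketch of the genotype-group comparison and the MPOD-CRT correlation described above is given below; the data are synthetic stand-ins for the study's measurements, and covariates are omitted.

```python
# One-way ANOVA across genotype groups and a Pearson correlation, on synthetic data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# hypothetical MPOD readings for three rs11645428 genotype groups
mpod_by_genotype = {"GG": rng.normal(0.45, 0.12, 18),
                    "GA": rng.normal(0.44, 0.12, 20),
                    "AA": rng.normal(0.46, 0.12, 8)}
f_stat, p_val = stats.f_oneway(*mpod_by_genotype.values())
print(f"one-way ANOVA: F = {f_stat:.2f}, p = {p_val:.3f}")

# correlation between central retinal thickness and MPOD (synthetic values)
crt = rng.normal(250, 20, 46)
mpod = 0.5 - 0.001 * (crt - 250) + rng.normal(0, 0.05, 46)
r, p = stats.pearsonr(crt, mpod)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")
```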

Relevance:

10.00%

Publisher:

Abstract:

In this paper, we develop and validate a new Statistically Assisted Fluid Registration Algorithm (SAFIRA) for brain images. A non-statistical version of this algorithm was first implemented in [2] and re-formulated using Lagrangian mechanics in [3]. Here we extend this algorithm to 3D: given 3D brain images from a population, vector fields and their corresponding deformation matrices are computed in a first round of registrations using the non-statistical implementation. Covariance matrices for both the deformation matrices and the vector fields are then obtained and incorporated (separately or jointly) into the regularizing (i.e., non-conservative Lagrangian) terms, creating four versions of the algorithm. We evaluated the accuracy of each algorithm variant using the manually labeled LPBA40 dataset, which provides ground-truth anatomical segmentations. We also compared the power of the different algorithms using tensor-based morphometry (a technique for analyzing local volumetric differences in brain structure) applied to 46 3D brain scans from healthy monozygotic twins.
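The statistical ingredient described above can be sketched as follows: after the first round of registrations, a covariance matrix of the displacement vectors is estimated at every voxel and later fed into the regularizer. Array shapes and names below are illustrative assumptions, not SAFIRA's actual implementation.

```python
# Voxel-wise covariance of displacement vectors pooled across subjects (illustrative).
import numpy as np

n_subjects, X, Y, Z = 10, 32, 32, 32
# displacement fields from a first (non-statistical) registration round
fields = np.random.randn(n_subjects, X, Y, Z, 3)

mean_field = fields.mean(axis=0)
centered = fields - mean_field
# one 3x3 covariance matrix per voxel
cov = np.einsum('s...i,s...j->...ij', centered, centered) / (n_subjects - 1)
print(cov.shape)                      # (32, 32, 32, 3, 3)
```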

Relevance:

10.00%

Publisher:

Abstract:

In this paper, we used a nonconservative Lagrangian mechanics approach to formulate a new statistical algorithm for fluid registration of 3-D brain images. The algorithm is named SAFIRA, an acronym for statistically assisted fluid image registration algorithm. A nonstatistical version of this algorithm was implemented previously, in which the deformation was regularized by penalizing deviations from a zero rate of strain; in an earlier statistical formulation, the terms regularizing the deformation included the covariance of the deformation matrices (Σ) and of the vector fields (q). Here, we used a Lagrangian framework to reformulate this algorithm, showing that the regularizing terms essentially allow nonconservative work to occur during the flow. Given 3-D brain images from a group of subjects, vector fields and their corresponding deformation matrices are computed in a first round of registrations using the nonstatistical implementation. Covariance matrices for both the deformation matrices and the vector fields are then obtained and incorporated (separately or jointly) into the nonconservative terms, creating four versions of SAFIRA. We evaluated and compared our algorithms' performance on 92 3-D brain scans from healthy monozygotic and dizygotic twins; 2-D validations are also shown for corpus callosum shapes delineated at the midline in the same subjects. After preliminary tests to demonstrate each method, we compared their detection power using tensor-based morphometry (TBM), a technique for analyzing local volumetric differences in brain structure, and compared the accuracy of each algorithm variant using various statistical metrics derived from the images and deformation fields. All these tests were also run with a traditional fluid method, which has been widely used in TBM studies. The versions incorporating vector-based empirical statistics on brain variation were consistently more accurate than their counterparts when used for automated volumetric quantification in new brain images, suggesting the advantages of this approach for large-scale neuroimaging studies.
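Tensor-based morphometry, used here to compare detection power, reads local volume change off the Jacobian determinant of the recovered deformation; the small numpy sketch below uses a synthetic displacement field and is an illustration of that step only.

```python
# Log-Jacobian map of a (synthetic) displacement field: >0 local expansion, <0 contraction.
import numpy as np

X, Y, Z = 32, 32, 32
displacement = np.random.randn(X, Y, Z, 3) * 0.1    # toy displacement field (in voxels)

# spatial gradients of each displacement component: J = I + du/dx
grads = np.stack([np.stack(np.gradient(displacement[..., c]), axis=-1)
                  for c in range(3)], axis=-2)       # shape (X, Y, Z, 3, 3)
jacobian = np.eye(3) + grads
log_jac_det = np.log(np.linalg.det(jacobian))
print(log_jac_det.shape)                             # (32, 32, 32)
```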

Relevance:

10.00%

Publisher:

Abstract:

Brain-derived neurotrophic factor (BDNF) plays a key role in learning and memory, but its effects on the fiber architecture of the living brain are unknown. We genotyped 455 healthy adult twins and their non-twin siblings (188 males/267 females; age: 23.7 ± 2.1 years, mean ± SD) and scanned them with high angular resolution diffusion tensor imaging (DTI) to assess how the BDNF Val66Met polymorphism affects white matter microstructure. By applying genetic association analysis to every 3D point in the brain images, we found that the Val-BDNF genetic variant was associated with lower white matter integrity in the splenium of the corpus callosum, left optic radiation, inferior fronto-occipital fasciculus, and superior corona radiata. Normal BDNF variation influenced the association between subjects' performance intellectual ability (as measured by the Object Assembly subtest) and fiber integrity (as measured by fractional anisotropy, FA) in the callosal splenium and pons. The BDNF gene may affect intellectual performance by modulating white matter development. This combination of genetic association analysis and large-scale diffusion imaging directly relates a specific gene to the fiber microstructure of the living brain and to human intelligence.
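The voxel-wise genetic association step can be sketched as a per-voxel regression of FA on Met-allele count, as below; the data are synthetic, and the covariates, family structure and multiple-comparison correction used in the study are omitted.

```python
# Per-voxel linear regression of FA on allele count (synthetic data, no correction).
import numpy as np
from scipy import stats

n_subjects, n_voxels = 455, 1000
rng = np.random.default_rng(1)
met_alleles = rng.integers(0, 3, n_subjects)          # 0, 1 or 2 Met alleles
fa = rng.normal(0.45, 0.05, (n_subjects, n_voxels))   # fractional anisotropy values

p_values = np.array([stats.linregress(met_alleles, fa[:, v]).pvalue
                     for v in range(n_voxels)])
print("voxels with p < 0.05 (uncorrected):", int((p_values < 0.05).sum()))
```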

Relevance:

10.00%

Publisher:

Abstract:

Deficits in lentiform nucleus volume and morphometry are implicated in a number of genetically influenced disorders, including Parkinson's disease, schizophrenia, and ADHD. Here we performed genome-wide searches to discover common genetic variants associated with differences in lentiform nucleus volume in human populations. We assessed structural MRI scans of the brain in two large genotyped samples: the Alzheimer's Disease Neuroimaging Initiative (ADNI; N = 706) and the Queensland Twin Imaging Study (QTIM; N = 639). Statistics of association from each cohort were combined meta-analytically using a fixed-effects model to boost power and to reduce the prevalence of false positive findings. We identified a number of associations in and around the flavin-containing monooxygenase (FMO) gene cluster. The most highly associated SNP, rs1795240, was located in the FMO3 gene; after meta-analysis, it showed genome-wide significant evidence of association with lentiform nucleus volume (P_MA = 4.79 × 10^-8). This commonly carried genetic variant accounted for 2.68% and 0.84% of the trait variability in the ADNI and QTIM samples, respectively, even though the QTIM sample was on average 50 years younger. Pathway enrichment analysis revealed significant contributions of this gene to the cytochrome P450 pathway, which is involved in metabolizing numerous therapeutic drugs for pain, seizures, mania, depression, anxiety, and psychosis. The genetic variants we identified provide replicated, genome-wide significant evidence for the FMO gene cluster's involvement in lentiform nucleus volume differences in human populations.
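The fixed-effects combination of the two cohorts follows standard inverse-variance weighting; the sketch below shows the calculation for a single SNP with placeholder effect sizes and standard errors, not the actual ADNI/QTIM estimates.

```python
# Inverse-variance-weighted fixed-effects meta-analysis for one SNP (placeholder numbers).
import numpy as np
from scipy import stats

betas = np.array([12.5, 9.8])        # hypothetical per-cohort effects of rs1795240
ses = np.array([3.1, 3.9])           # hypothetical standard errors

weights = 1.0 / ses**2
beta_meta = np.sum(weights * betas) / np.sum(weights)
se_meta = np.sqrt(1.0 / np.sum(weights))
z = beta_meta / se_meta
p_meta = 2 * stats.norm.sf(abs(z))   # two-sided p-value
print(f"combined beta = {beta_meta:.2f}, z = {z:.2f}, p = {p_meta:.2e}")
```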