286 results for histone H2A variant
Abstract:
BACKGROUND: About 1-5% of cancer patients suffer from significant normal tissue reactions as a result of radiotherapy (RT). It is not possible at this time to predict how most patients' normal tissues will respond to RT. DNA repair dysfunction is implicated in sensitivity to RT, particularly in genes that mediate the repair of DNA double-strand breaks (DSBs). Phosphorylation of histone H2AX (the phosphorylated form is known as gammaH2AX) occurs rapidly in response to DNA DSBs and, among its other roles, contributes to the recruitment of repair proteins to these damaged sites. Mammalian cell lines have also been crucial in facilitating the successful cloning of many DNA DSB repair genes; yet very few mutant cell lines exist for non-syndromic clinical radiosensitivity (RS). METHODS: Here, we survey DNA DSB induction and repair in whole cells from RS patients, as revealed by gammaH2AX foci assays, as potential predictive markers of clinical radiation response. RESULTS: With one exception, both DNA DSB focus induction and repair in cell lines from RS patients were comparable with controls. Using gammaH2AX foci assays, we identified an RS cancer patient cell line with a novel ionising radiation-induced DNA DSB repair defect; these data were confirmed by an independent DNA DSB repair assay. CONCLUSION: gammaH2AX focus measurement has limited scope as a pre-RT predictive assay in lymphoblast cell lines from RT patients; however, the assay can successfully identify novel DNA DSB repair-defective patient cell lines, thus potentially facilitating the discovery of novel constitutional contributions to clinical RS.
Abstract:
The central dogma in radiation biology is that nuclear DNA is the critical target with respect to radiosensitivity. In accordance with the theoretical expectations, and in the absence of a conclusive model, the general consensus in the field has been to view chromatin as a homogeneous template for DNA damage and repair. This paradigm has been called into question by recent findings indicating a disparity in γ-irradiation-induced γH2AX foci formation in euchromatin and heterochromatin. Here, we have extended those studies and provide evidence that γH2AX foci form preferentially in actively transcribing euchromatin following γ-irradiation.
Abstract:
Hyperphenylalaninemia is a variant of phenylketonuria, and debate remains as to what, if any, active management of this condition is required to preserve cognitive function and psychological well-being. This study is the first to longitudinally examine executive function (EF) in adolescents with hyperphenylalaninemia. Two sibling pairs with mild hyperphenylalaninemia underwent neuropsychological examination in early childhood and again in adolescence using EF tests that are highly sensitive to phenylalanine exposure. By early adolescence, none of the 4 children demonstrated EF impairment. The children demonstrated a typical developmental trajectory of EF from childhood to adolescence, given phenylalanine exposure consistent with their condition.
Abstract:
Synthesis of imines from amines and aliphatic alcohols (C1-C6) in the presence of base on supported palladium nanoparticles has been achieved for the first time. The catalytic system shows high activity and selectivity in open air at room temperature.
Abstract:
As an example of the isostructural Ln3Sb3Co2O14 (Ln: La, Pr, Nd, Sm-Ho) series with an ordered pyrochlore structure, the La variant is prepared by a citrate complex method employing stoichiometric amounts of La(NO3)3, Co(NO3)2, and Sb tartrate together with citric acid with a metal/citrate molar ratio of 1:2.
Abstract:
SIMON is a family of 10 lightweight block ciphers published by Beaulieu et al. from the United States National Security Agency (NSA). A cipher in this family with a K-bit key and an N-bit block is called SIMON N/K. We present several linear characteristics for reduced-round SIMON32/64 that can be used for a key-recovery attack, and extend them further to attack other variants of SIMON. Moreover, we provide results of key-recovery analysis using several impossible differential characteristics, covering from 14 out of 32 rounds for SIMON32/64 to 22 out of 72 rounds for SIMON128/256. In some cases the presented observations do not directly yield an attack, but provide a basis for further analysis of the specific SIMON variant. Finally, we exploit a connection between linear and differential characteristics for SIMON to construct linear characteristics for different variants of reduced-round SIMON. Our attacks extend to all variants of SIMON, covering more rounds than any known results using linear cryptanalysis. We present a key-recovery attack against SIMON128/256 which covers 35 out of 72 rounds with data complexity 2^123. We have implemented our attacks for small-scale variants of SIMON, and our experiments confirm the theoretical bias presented in this work.
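For readers unfamiliar with the cipher, a minimal Python sketch of the SIMON round function follows, per the Beaulieu et al. specification; the key schedule is omitted and the round keys are assumed to be precomputed.

```python
# A minimal sketch of the SIMON round function (Beaulieu et al. spec).
# SIMON32/64 uses 16-bit words and 32 rounds; the key schedule is
# omitted here -- precomputed round keys are passed in directly.

WORD = 16
MASK = (1 << WORD) - 1

def rol(x, r):
    """Rotate a 16-bit word left by r bits."""
    return ((x << r) | (x >> (WORD - r))) & MASK

def simon_round(x, y, k):
    """One Feistel round: (x, y) -> (y ^ f(x) ^ k, x)."""
    f = (rol(x, 1) & rol(x, 8)) ^ rol(x, 2)
    return (y ^ f ^ k) & MASK, x

def simon_encrypt(x, y, round_keys):
    """Encrypt one 32-bit block (two 16-bit words) given the round keys."""
    for k in round_keys:
        x, y = simon_round(x, y, k)
    return x, y
```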
Abstract:
At CRYPTO 2006, Halevi and Krawczyk proposed two randomized hash function modes and analyzed the security of digital signature algorithms based on these constructions. They showed that the security of signature schemes based on the two randomized hash function modes relies on properties similar to second preimage resistance rather than on the collision resistance of the hash functions. One of the randomized hash function modes was named the RMX hash function mode and was recommended for practical purposes. The National Institute of Standards and Technology (NIST), USA, standardized a variant of the RMX hash function mode and published this standard in Special Publication (SP) 800-106. In this article, we first discuss a generic online birthday existential forgery attack by Dang and Perlner on RMX-hash-then-sign schemes. We show that a variant of this attack can be applied to forge other randomize-hash-then-sign schemes. We point out practical limitations of the generic forgery attack on RMX-hash-then-sign schemes. We then show that these limitations can be overcome for RMX-hash-then-sign schemes if it is easy to find fixed points for the underlying compression functions, as is the case for the Davies-Meyer construction used in popular hash functions such as MD5, designed by Rivest, and the SHA family of hash functions, designed by the National Security Agency (NSA), USA and published by NIST in the Federal Information Processing Standards (FIPS). We show an online birthday forgery attack on this class of signatures by using a variant of Dean's method of finding fixed-point expandable messages for hash functions based on the Davies-Meyer construction. This forgery attack is also applicable to signature schemes based on the variant of RMX standardized by NIST in SP 800-106. We discuss some important applications of our attacks and their applicability to signature schemes based on hash functions with 'built-in' randomization. Finally, we compare our attacks on randomize-hash-then-sign schemes with the generic forgery attacks on the standard hash-based message authentication code (HMAC).
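As a concrete illustration of why fixed points are easy to find for Davies-Meyer compression functions, here is a minimal Python sketch (not the paper's construction): the compression function is h_i = E_m(h_{i-1}) XOR h_{i-1}, so for any message block m, h = E_m^{-1}(0) is a fixed point. AES-128 stands in for the block cipher, and the pycryptodome package is assumed.

```python
# Minimal sketch of a Davies-Meyer fixed point. For any block m,
# decrypting the all-zero block gives h with E_m(h) = 0, so
# DM(h, m) = E_m(h) XOR h = h. AES-128 is an illustrative stand-in.
from Crypto.Cipher import AES

def davies_meyer(h, m):
    """Davies-Meyer compression: E_m(h) XOR h."""
    out = AES.new(m, AES.MODE_ECB).encrypt(h)
    return bytes(a ^ b for a, b in zip(out, h))

m = b'any 16-byte blk!'                           # message block = cipher key
h = AES.new(m, AES.MODE_ECB).decrypt(bytes(16))   # h = D_m(0)
assert davies_meyer(h, m) == h                    # h is a fixed point
```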
Abstract:
In this paper we present concrete collision and preimage attacks on a large class of compression function constructions making two calls to the underlying ideal primitives. The complexity of the collision attack is above the theoretical lower bound for constructions of this type, but below the birthday complexity; the complexity of the preimage attack, however, is equal to the theoretical lower bound. We also present undesirable properties of some of Stam's compression functions proposed at CRYPTO '08. We show that when one of the n-bit to n-bit components of the proposed 2n-bit to n-bit compression function is replaced by a fixed-key cipher in the Davies-Meyer mode, the complexity of finding a preimage would be 2^(n/3). We also show that the complexity of finding a collision in a variant of the 3n-bit to 2n-bit scheme with its output truncated to 3n/2 bits is 2^(n/2). The complexity of our preimage attack on this hash function is about 2^n. Finally, we present a collision attack on a variant of the proposed (m+s)-bit to s-bit scheme, truncated to s-1 bits, with a complexity of O(1). However, none of our results compromise Stam's security claims.
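For context on the "birthday complexity" benchmark used above, here is a generic birthday collision search against a t-bit truncation of SHA-256, a hypothetical stand-in hash; it finds a collision in roughly 2^(t/2) evaluations, with t kept small so the sketch terminates quickly.

```python
# Generic birthday collision search on a t-bit truncation of SHA-256,
# illustrating the ~2^(t/2) birthday bound. Illustrative only; t is tiny.
import hashlib, itertools

def trunc_hash(msg, t=32):
    """First t bits of SHA-256, as an int."""
    d = hashlib.sha256(msg).digest()
    return int.from_bytes(d, 'big') >> (256 - t)

def birthday_collision(t=32):
    seen = {}
    for i in itertools.count():
        m = i.to_bytes(8, 'big')
        h = trunc_hash(m, t)
        if h in seen:
            return seen[h], m      # two distinct messages, same t-bit hash
        seen[h] = m

m1, m2 = birthday_collision(t=32)  # ~2^16 evaluations in expectation
assert m1 != m2 and trunc_hash(m1) == trunc_hash(m2)
```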
Abstract:
Halevi and Krawczyk proposed a message randomization algorithm called RMX as a front-end tool to hash-then-sign digital signature schemes such as DSS and RSA, in order to free their reliance on the collision resistance of the hash functions. They showed that to forge an RMX-hash-then-sign signature scheme, one has to solve a cryptanalytical task related to finding second preimages for the hash function. In this article, we show how to use Dean's method of finding expandable messages (used for finding a second preimage in the Merkle-Damgård hash function) to existentially forge a signature scheme based on a t-bit RMX-hash function which uses Davies-Meyer compression functions (e.g., MD4, MD5, the SHA family) in 2^(t/2) chosen messages plus 2^(t/2)+1 off-line operations of the compression function and a similar amount of memory. This forgery attack also works on signature schemes that use Davies-Meyer schemes and a variant of RMX published by NIST in its Draft Special Publication (SP) 800-106. We discuss some important applications of our attack.
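Continuing the Davies-Meyer fixed-point idea sketched earlier, the toy below (again with AES-128 standing in for the block cipher, and length padding ignored) shows the core of Dean's expandable messages: once DM(h, m) = h, the block m can be inserted any number of times without changing the Merkle-Damgård chaining value.

```python
# Toy illustration of Dean's expandable messages: a fixed point (h, m)
# of the Davies-Meyer compression lets us repeat m arbitrarily often
# while the chaining value stays put (padding ignored in this sketch).
from Crypto.Cipher import AES

def davies_meyer(h, m):
    out = AES.new(m, AES.MODE_ECB).encrypt(h)
    return bytes(a ^ b for a, b in zip(out, h))

m = b'any 16-byte blk!'
h = AES.new(m, AES.MODE_ECB).decrypt(bytes(16))  # fixed point: DM(h, m) == h

state = h
for _ in range(1000):          # insert m as many times as we like...
    state = davies_meyer(state, m)
assert state == h              # ...the chaining value never moves
```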
Abstract:
The forthcoming NIST Advanced Hash Standard (AHS) competition to select the SHA-3 hash function requires that each candidate hash function submission have at least one construction supporting the FIPS 198 HMAC application. As part of its evaluation, NIST aims to select either a candidate hash function which is more resistant to known side-channel attacks (SCA) when plugged into HMAC, or one that has an alternative MAC mode which is more resistant to known SCA than the other submitted alternatives. In response to this, we perform differential power analysis (DPA) on possible smart card implementations of some recently proposed MAC alternatives to NMAC (a fully analyzed variant of HMAC) and HMAC algorithms, and of NMAC/HMAC versions of some recently proposed hash and compression function modes. We show that the recently proposed BNMAC and KMDP MAC schemes are even weaker than NMAC/HMAC against DPA attacks, whereas multi-lane NMAC, EMD MAC and the keyed wide-pipe hash have security similar to NMAC against DPA attacks. Our DPA attacks do not work on the NMAC setting of the MDC-2, Grindahl and MAME compression functions.
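For reference, here is a from-scratch sketch of the FIPS 198 HMAC envelope over SHA-256, the structure these DPA attacks target; NMAC is the same two-layer construction with two independent keys in place of the ipad/opad-derived ones. The result is cross-checked against Python's standard library.

```python
# FIPS 198 HMAC over SHA-256, written out to show the two-layer
# envelope: H((K ^ opad) || H((K ^ ipad) || msg)).
import hashlib

BLOCK = 64  # SHA-256 block size in bytes

def hmac_sha256(key, msg):
    if len(key) > BLOCK:                       # long keys are hashed first
        key = hashlib.sha256(key).digest()
    key = key.ljust(BLOCK, b'\x00')            # then zero-padded to a block
    ipad = bytes(b ^ 0x36 for b in key)
    opad = bytes(b ^ 0x5c for b in key)
    inner = hashlib.sha256(ipad + msg).digest()
    return hashlib.sha256(opad + inner).digest()

import hmac  # cross-check against the standard library implementation
assert hmac_sha256(b'k', b'm') == hmac.new(b'k', b'm', hashlib.sha256).digest()
```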
Abstract:
This paper introduces our dedicated authenticated encryption scheme ICEPOLE. ICEPOLE is a high-speed hardware-oriented scheme, suitable for high-throughput network nodes or, generally, any environment where specialized hardware (such as FPGAs or ASICs) can be used to provide high data processing rates. ICEPOLE-128 (the primary ICEPOLE variant) is very fast: on the modern FPGA device Virtex 6, a basic iterative architecture of ICEPOLE reaches 41 Gbit/s, over 10 times faster than the equivalent implementation of AES-128-GCM. The throughput-to-area ratio is also substantially better than that of AES-128-GCM. We have carefully examined the security of the algorithm through a range of cryptanalytic techniques, and our findings indicate that ICEPOLE offers a high security level.
Abstract:
This thesis focused upon the development of improved capacity analysis and capacity planning techniques for railways. A number of innovations were made and were tested on a case study of a real national railway. These techniques can reduce the time required to perform the decision-making activities that planners and managers need to carry out. As all railways need to be expanded to meet increasing demands, the presumption that analytical capacity models can be used to identify how best to improve an existing network at least cost was fully investigated. Track duplication was the mechanism used to expand a network's capacity, and two variant capacity expansion models were formulated. Another outcome of this thesis is the development and validation of bi-objective models for capacity analysis. These models regulate the competition for track access and perform a trade-off analysis. An opportunity to develop more general multi-objective approaches was identified.
Abstract:
Characterization of the epigenetic profile of humans since the initial breakthrough of the Human Genome Project has strongly established the key role of histone modifications and DNA methylation. These dynamic elements interact to determine the normal level of expression or the methylation status of the constituent genes in the genome. Recently, considerable evidence has been put forward to demonstrate that environmental stress implicitly alters epigenetic patterns, causing imbalances that can lead to cancer initiation. This chain of consequences has motivated attempts to computationally model the influence of histone modification and DNA methylation on gene expression and to investigate their intrinsic interdependency. In this paper, we explore the relation between DNA methylation and transcription and characterize in detail the histone modifications for specific DNA methylation levels using a stochastic approach.
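As a purely hypothetical illustration of the stochastic approach (not the paper's actual model), one could simulate a two-state promoter whose activation rate is damped by its methylation level and estimate mean expression by Monte Carlo; all rates and the functional form below are invented for illustration.

```python
# Hypothetical stochastic gene-expression toy (not the paper's model):
# a promoter switches on/off, with the on-rate damped by its DNA
# methylation level; mean expression is estimated by simulation.
import random

def simulate_expression(methylation, steps=100000, k_on=0.1, k_off=0.05):
    """Fraction of time the promoter spends 'on', for methylation in [0, 1]."""
    on, time_on = False, 0
    for _ in range(steps):
        if on:
            if random.random() < k_off:
                on = False
        elif random.random() < k_on * (1.0 - methylation):
            on = True
        time_on += on
    return time_on / steps

for m in (0.0, 0.5, 0.9):
    print(f"methylation={m:.1f}  mean expression ~ {simulate_expression(m):.3f}")
```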
Abstract:
Over the last few years, investigations of human epigenetic profiles have identified the key elements of change to be histone modifications, stable and heritable DNA methylation, and chromatin remodeling. These factors determine gene expression levels and characterise conditions leading to disease. In order to extract information embedded in long DNA sequences, data mining and pattern recognition tools are widely used, but efforts to date have been limited with respect to analyzing epigenetic changes and their role as catalysts in disease onset. Useful insight, however, can be gained by investigation of the associated dinucleotide distributions. The focus of this paper is to explore specific dinucleotide frequencies across defined regions within the human genome, and to identify new patterns between epigenetic mechanisms and DNA content. Signal processing methods, including Fourier and wavelet transforms, are employed, and the principal results are reported.
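A minimal sketch of this kind of analysis follows, assuming a sliding-window dinucleotide frequency signal and its Fourier power spectrum; the window size, the focus on the CG (CpG) dinucleotide, and the random stand-in sequence are illustrative choices, not the paper's parameters.

```python
# Slide a window along a DNA sequence, measure CpG ('CG') dinucleotide
# frequency, and take the Fourier power spectrum of the resulting signal.
import numpy as np

def dinucleotide_signal(seq, dinuc='CG', window=200):
    """Per-window frequency of a given dinucleotide along the sequence."""
    hits = np.array([seq[i:i+2] == dinuc for i in range(len(seq) - 1)], float)
    kernel = np.ones(window) / window
    return np.convolve(hits, kernel, mode='valid')

def power_spectrum(signal):
    """Power spectrum of the mean-centred signal."""
    return np.abs(np.fft.rfft(signal - signal.mean())) ** 2

seq = ''.join(np.random.choice(list('ACGT'), 5000))   # stand-in sequence
spec = power_spectrum(dinucleotide_signal(seq))
```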
Abstract:
Railways are an important mode of transportation. They are, however, large and complex, and their construction, management and operation are time-consuming and costly. Evidently, planning current and future activities is vital. Part of that planning process is an analysis of capacity. To determine what volume of traffic can be achieved over time, a variety of railway capacity analysis techniques have been created; a generic analytical approach that incorporates more complex train paths, however, has yet to be provided. This article provides such an approach by extending a mathematical model for determining the theoretical capacity of a railway network. The main contribution of this paper is the modelling of more complex train paths whereby each section can be visited many times in the course of a train's journey. Three variant models are formulated and then demonstrated in a case study. This article's numerical investigations have successfully shown the applicability of the proposed models and how they may be used to gain insights into system performance.
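The article's formulations are not reproduced here, but a toy in the spirit of analytic capacity models can be sketched as a small integer program: choose train counts so that each section's total occupation time (which may include repeated visits by the same train path) fits within the planning horizon. All numbers below are invented, and the PuLP package is assumed.

```python
# Hypothetical toy capacity model (not the article's formulation):
# maximize the number of trains run subject to each section's total
# occupation time staying within the planning horizon.
import pulp

sections = ['S1', 'S2', 'S3']
train_types = ['freight', 'passenger']
# minutes each train type occupies each section over its whole path
# (totals, so a path revisiting a section is already accounted for)
occupation = {('freight', 'S1'): 12, ('freight', 'S2'): 18, ('freight', 'S3'): 10,
              ('passenger', 'S1'): 6, ('passenger', 'S2'): 8, ('passenger', 'S3'): 5}
horizon = 24 * 60  # one-day planning horizon in minutes

prob = pulp.LpProblem('theoretical_capacity', pulp.LpMaximize)
n = {t: pulp.LpVariable(f'n_{t}', lowBound=0, cat='Integer') for t in train_types}
prob += pulp.lpSum(n.values())  # objective: maximize total trains run
for s in sections:
    prob += pulp.lpSum(occupation[t, s] * n[t] for t in train_types) <= horizon

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print({t: int(n[t].value()) for t in train_types})
```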
Abstract:
In this paper, Bree Hadley discusses The Ex/centric Fixations Project, a practice-led research project which explores the inadequacy of language as a technology for expressing human experiences of difference, discrimination or marginalisation within mainstream cultures. The project asks questions about the way experience, memory and the public discourses available to express them are bound together, about the silences, failures and falsehoods embedded in any effort to convey human experience via public discourses, and about how these failures might form the basis of a performative writing method. It has, to date, focused on developing a method that expresses experience through improvised, intertextual and discontinuous collages of language drawn from a variety of public discourses. Aesthetically, this method works with what Hans-Thies Lehmann (Postdramatic Theatre, p. 17) calls a "textual variant" of the postdramatic "in which language appears not as the speech of characters – if there are still definable characters at all – but as an autonomous theatricality" (Ibid., p. 18). It is defined by what Lehmann, following Julia Kristeva, calls a "polylogue", which presents experience as a conflicted, discontinuous and circular phenomenon, akin to a musical fugue, to break away from "an order centred on one logos" (Ibid., p. 32). The texts function simultaneously as a series of parts and as wholes: interwoven voices seeming almost to connect, almost to respond to each other, and almost to tell a story, or to challenge each other's telling of one. In this paper, Hadley offers a performative demonstration, together with descriptions of the way spectators respond, including the way the texts' playful, polyvocal texture impacts on engagement, and the way the presence or non-presence of performing bodies to which the experiences depicted can be attached impacts on engagement. She suggests that the improvised, intertextual and experimental enactments of self embodied in the texts encourage spectators to engage at an emotional level and to make meaning based primarily on memories they recall in the moment, and thus have the potential to counter the risk that people may read depictions of experiences radically different from their own in reductive, essentialised ways.