206 results for Josef Oriol, Beato, 1650-1702


Relevance:

10.00%

Publisher:

Abstract:

Tobacco use is causally associated with head and neck squamous cell cancer (HNSCC). Here, we present the results of a case-control study that investigated the effects that genetic variants of the cytochrome (CYP)1A1, CYP1B1, glutathione-S-transferase (GST)M1, GSTT1, and GSTP1 genes have on modifying the risk of smoking-related HNSCC. Polymorphisms of the CYP1A1, GSTM1, and GSTT1 genes alone were not associated with an increased risk. The CYP1B1 codon 432 polymorphism was found to be a putative susceptibility factor in smoking-related HNSCC. The frequency of the CYP1B1 polymorphism was significantly higher (P < 0.001) in smoking cases than in smoking controls. Additionally, an odds ratio (OR) of 4.53 (2.62-7.98) was found when smoking and nonsmoking cases carrying the susceptible genotype CYP1B1*2/*2 were compared with carriers of the wild-type genotype. In combination with polymorphic variants of the GST genes, a synergistic effect on the OR was observed. The calculated OR for the combined genotype CYP1B1*2/*2 and GSTM1*2/*2 was 12.8 (4.09-49.7); it was 13.4 (2.92-97.7) for CYP1B1*2/*2 and GSTT1*2/*2, and 24.1 (9.36-70.5) for the combination of CYP1B1*2/*2 and GSTT1 expressors. The impact of the polymorphic variants of the CYP1B1 gene on HNSCC risk is reflected by the strong association with the frequency of somatic mutations of the p53 gene: smokers with the susceptible genotype CYP1B1*2/*2 were 20 times more likely to show evidence of p53 mutations than those with wild-type CYP1B1. Combined genotype analysis of CYP1B1 and GSTM1 or GSTT1 revealed interactive effects on the occurrence of p53 gene mutations. The results of the present study indicate that polymorphic variants of CYP1B1 relate significantly to the individual susceptibility of smokers to HNSCC.
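The reported associations are expressed as odds ratios with interval estimates. As a quick illustration of how such figures are computed from a 2x2 case-control table, here is a minimal Python sketch; the counts are hypothetical and are not the study's data.

    import math

    def odds_ratio(a, b, c, d):
        # a = exposed cases, b = unexposed cases,
        # c = exposed controls, d = unexposed controls
        or_ = (a * d) / (b * c)
        # 95% confidence interval via the normal approximation on log(OR)
        se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
        lower = math.exp(math.log(or_) - 1.96 * se)
        upper = math.exp(math.log(or_) + 1.96 * se)
        return or_, (lower, upper)

    # Hypothetical counts, for illustration only
    print(odds_ratio(60, 25, 40, 75))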

Relevance:

10.00%

Publisher:

Abstract:

We show the first deterministic construction of an unconditionally secure multiparty computation (MPC) protocol in the passive adversarial model over black-box non-Abelian groups which is both optimal (secure against an adversary who possesses any t

Relevance:

10.00%

Publisher:

Abstract:

The NLM stream cipher, designed by Hoon Jae Lee, Sang Min Sung and Hyeong Rag Kim, is a strengthened version of the LM summation generator that combines linear and non-linear feedback shift registers. In recent works, the NLM cipher has been used for message authentication in lightweight communication over wireless sensor networks and for RFID authentication protocols. This work analyses the security of the NLM stream cipher and of the NLM-MAC scheme that is built on top of the NLM cipher. We first show that the NLM cipher suffers from two major weaknesses that lead to key recovery and forgery attacks. We prove that the internal state of the NLM cipher can be recovered with time complexity of about n^(log 7) × 2, where the total length of the internal state is 2·n+2 bits. The attack needs about n^2 keystream bits. We also show that an adversary can forge any MAC tag very efficiently, given only one (MAC tag, ciphertext) pair. The proposed attacks are practical and break the scheme with a negligible error probability.
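The NLM generator combines linear and non-linear feedback shift registers. As background for the linear component only, here is a minimal Python sketch of a Fibonacci LFSR keystream generator; the register length, tap positions and output rule are illustrative and are not those of NLM.

    def lfsr_keystream(state, taps, nbits):
        # state: list of 0/1 bits (initial fill); taps: positions XORed
        # into the feedback. Purely illustrative, not the NLM cipher.
        state = list(state)
        out = []
        for _ in range(nbits):
            out.append(state[-1])            # output the last stage
            feedback = 0
            for t in taps:
                feedback ^= state[t]         # linear (XOR) feedback
            state = [feedback] + state[:-1]  # shift the register
        return out

    print(lfsr_keystream([1, 0, 1, 1], taps=[0, 3], nbits=8))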

Relevance:

10.00%

Publisher:

Abstract:

A crucial issue with hybrid quantum secret sharing schemes is the amount of data that is allocated to the participants: the smaller the amount of allocated data, the better the performance of a scheme. Moreover, quantum data are very hard and expensive to deal with, so it is desirable to use as little quantum data as possible. To achieve this goal, we first construct extended unitary operations as the tensor product of n (n ≥ 2) basic unitary operations, and then use those extended operations to design two quantum secret sharing schemes. The resulting dual compressible hybrid quantum secret sharing schemes, in which classical data play a complementary role to quantum data, range from threshold schemes to general access structures. Compared with existing hybrid quantum secret sharing schemes, our proposed schemes reduce not only the number of quantum participants, but also the number of particles and the size of the classical shares. To be exact, the number of particles that carry quantum data is reduced to 1, while the size of the classical secret shares is reduced to (l−2)/(m−1) in the ((m+1, n′)) threshold scheme and to (l−2)/r2 (where r2 is the number of maximal unqualified sets) in the adversary-structure scheme. Consequently, our proposed schemes can greatly reduce the cost and difficulty of generating and storing EPR pairs and lower the risk of transmitting encoded particles.
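For context on the classical side of such hybrid schemes, the sketch below shows generic Shamir (t, n) threshold sharing over a prime field in Python. It is not the authors' quantum construction; it only illustrates how any t classical shares reconstruct a secret while fewer reveal nothing.

    import random

    P = 2**127 - 1  # Mersenne prime used as the field modulus

    def share(secret, t, n):
        # Evaluate a random degree-(t-1) polynomial with the secret as
        # constant term at points x = 1..n.
        coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
        return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
                for x in range(1, n + 1)]

    def reconstruct(shares):
        # Lagrange interpolation at x = 0 over the prime field.
        secret = 0
        for xi, yi in shares:
            num, den = 1, 1
            for xj, _ in shares:
                if xj != xi:
                    num = num * (-xj) % P
                    den = den * (xi - xj) % P
            secret = (secret + yi * num * pow(den, -1, P)) % P
        return secret

    shares = share(123456789, t=3, n=5)
    print(reconstruct(shares[:3]))  # any 3 of the 5 shares recover the secret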

Relevance:

10.00%

Publisher:

Abstract:

Integration of biometrics is considered an attractive solution to the issues associated with password-based human authentication, as well as to the secure storage and release of cryptographic keys, which is one of the critical issues in modern cryptography. However, the widespread adoption of bio-cryptographic solutions is somewhat restricted by the fuzziness associated with biometric measurements. Therefore, error control mechanisms must be adopted to make sure that the fuzziness of biometric inputs can be sufficiently countered. In this paper, we outline the existing error control techniques used in bio-cryptography and explain how they are deployed in different types of solutions. Finally, we elaborate on the important factors to be considered when choosing an appropriate error correction mechanism for a particular biometric-based solution.
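To make the role of error control concrete, the Python sketch below corrects a bit flip in a biometric-derived bit string with a simple repetition code, in the spirit of a fuzzy-commitment construction. It is purely illustrative; practical bio-cryptosystems typically rely on stronger codes such as BCH or Reed-Solomon.

    def encode(bits, r=3):
        # Repetition code: repeat every bit r times.
        return [b for b in bits for _ in range(r)]

    def decode(codeword, r=3):
        # Majority vote per block corrects up to (r - 1) // 2 flips per bit.
        return [int(sum(codeword[i:i + r]) > r // 2)
                for i in range(0, len(codeword), r)]

    key_bits = [1, 0, 1, 1, 0, 0, 1, 0]
    noisy = encode(key_bits)
    noisy[4] ^= 1                  # simulate fuzziness in a repeated reading
    assert decode(noisy) == key_bits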

Relevance:

10.00%

Publisher:

Abstract:

So far, low-probability differentials in the key schedule of block ciphers have been used as a straightforward proof of security against related-key differential analysis. To achieve resistance, it is believed that for a cipher with a k-bit key it suffices for the probability of such differentials to be upper-bounded by 2^-k. Surprisingly, we show that this reasonable assumption is incorrect, and that the probability should be (much) lower than 2^-k. Our counterexample is a related-key differential analysis of the well-established block cipher CLEFIA-128. We show that although the key schedule of CLEFIA-128 prevents differentials with a probability higher than 2^-128, the linear part of the key schedule that produces the round keys, together with the Feistel structure of the cipher, allows particularly chosen differentials with a probability as low as 2^-128 to be exploited. CLEFIA-128 has 2^14 such differentials, which translate into 2^14 pairs of weak keys. The probability of each differential is too low on its own, but the weak keys have a special structure which allows a divide-and-conquer approach to gain an advantage of 2^7 over generic analysis. We exploit this advantage to give a membership test for the weak-key class and provide an analysis of the hashing modes. The proposed analysis has been tested with computer experiments on small-scale variants of CLEFIA-128. Our results do not threaten the practical use of CLEFIA.
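The argument hinges on the linearity of the part of the key schedule that produces the round keys: for a purely linear schedule, a chosen key difference propagates to round-key differences deterministically, independently of the key value. The toy Python sketch below (a made-up rotate-and-XOR schedule, not CLEFIA's) illustrates that property.

    MASK = (1 << 32) - 1

    def rotl(x, r):
        return ((x << r) | (x >> (32 - r))) & MASK

    def toy_linear_key_schedule(k, rounds=4):
        # Rotations and XOR with round constants only, hence linear in k.
        rks, state = [], k
        for i in range(rounds):
            state = rotl(state, 7) ^ (0x9E3779B9 + i)
            rks.append(state)
        return rks

    delta = 0x80000001
    for key in (0x01234567, 0xDEADBEEF):
        rk0 = toy_linear_key_schedule(key)
        rk1 = toy_linear_key_schedule(key ^ delta)
        # The round-key differences depend only on delta, never on the key,
        # so the same pattern appears for every key (probability 1).
        print([f"{a ^ b:08x}" for a, b in zip(rk0, rk1)])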

Relevance:

10.00%

Publisher:

Abstract:

This paper introduces our dedicated authenticated encryption scheme ICEPOLE. ICEPOLE is a high-speed, hardware-oriented scheme, suitable for high-throughput network nodes or, more generally, any environment where specialized hardware (such as FPGAs or ASICs) can be used to provide high data processing rates. ICEPOLE-128 (the primary ICEPOLE variant) is very fast: on the modern FPGA device Virtex-6, a basic iterative architecture of ICEPOLE reaches 41 Gbit/s, which is over 10 times faster than the equivalent implementation of AES-128-GCM. The throughput-to-area ratio is also substantially better than that of AES-128-GCM. We have carefully examined the security of the algorithm through a range of cryptanalytic techniques, and our findings indicate that ICEPOLE offers a high security level.
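AES-128-GCM, the baseline in the comparison, exposes the same authenticated-encryption (AEAD) interface that ICEPOLE targets. The snippet below shows that generic interface using the PyCA cryptography package; it does not implement ICEPOLE itself.

    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    key = AESGCM.generate_key(bit_length=128)
    aead = AESGCM(key)
    nonce = os.urandom(12)        # 96-bit nonce; never reuse under one key
    ct = aead.encrypt(nonce, b"payload", b"associated data")
    assert aead.decrypt(nonce, ct, b"associated data") == b"payload"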

Relevance:

10.00%

Publisher:

Abstract:

In this paper we attack the round-reduced Keccak hash function with a technique called rotational cryptanalysis. We focus on the Keccak variants proposed as SHA-3 candidates in NIST's contest for a new cryptographic hash function standard. Our main results are a preimage attack on 4-round Keccak and a 5-round distinguisher on the Keccak-f[1600] permutation, the main building block of the Keccak hash function.
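Rotational cryptanalysis tracks pairs (x, x <<< r) through the permutation, relying on the fact that rotation commutes with the XOR and rotation operations inside Keccak-f. A short Python check of this property on 64-bit lanes:

    MASK = (1 << 64) - 1

    def rotl(x, r):
        return ((x << r) | (x >> (64 - r))) & MASK

    x, y, r = 0x0123456789ABCDEF, 0xFEDCBA9876543210, 13
    # XOR of a rotational pair is the rotation of the XOR ...
    assert rotl(x ^ y, r) == rotl(x, r) ^ rotl(y, r)
    # ... and rotations by different amounts commute.
    assert rotl(rotl(x, 5), r) == rotl(rotl(x, r), 5)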

Relevance:

10.00%

Publisher:

Abstract:

Ultrafine particles are particles less than 0.1 micrometres (µm) in diameter. Due to their very small size they can penetrate deep into the lungs and potentially cause more damage than larger particles. The Ultrafine Particles from Traffic Emissions and Children's Health (UPTECH) study is the first Australian epidemiological study to assess the effects of ultrafine particles on children's health in general, and on the peripheral airways in particular. The study is being conducted in Brisbane, Australia. Continuous indoor and outdoor air pollution monitoring was conducted at each of the twenty-five participating school campuses to measure particulate matter, including in the ultrafine size range, and gases. Respiratory health effects were evaluated by conducting the following tests on participating children at each school: spirometry, the forced oscillation technique (FOT) and the multiple breath nitrogen washout test (MBNW) (to assess airway function), fraction of exhaled nitric oxide (FeNO, to assess airway inflammation), blood cotinine levels (to assess exposure to second-hand tobacco smoke), and serum C-reactive protein (CRP) levels (to measure systemic inflammation). A pilot study was conducted prior to commencing the main study to assess the feasibility and reliability of measurement of some of the clinical tests proposed for the main study. Air pollutant exposure measurements were not included in the pilot study.

Relevance:

10.00%

Publisher:

Abstract:

In 2005, Ginger Myles and Hongxia Jin proposed a software watermarking scheme based on replacing jump instructions, or unconditional branch statements (UBSs), with calls to a fingerprint branch function (FBF) that computes the correct target address of the UBS as a function of the generated fingerprint and an integrity check. If the program is tampered with, the fingerprint and integrity-check values change and the target address is no longer computed correctly. In this paper, we present an attack based on tracking stack pointer modifications that breaks the scheme, and we provide implementation details. The key element of the attack is to remove the fingerprint and integrity-check generating code from the program after disassociating the target address from the fingerprint and integrity values. Using debugging tools that give the attacker extensive control to track stack pointer operations, we perform both subtractive and watermark replacement attacks. The major steps of the attack are automated, resulting in a fast and low-cost attack.
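As a rough model of the construction under attack, the toy Python sketch below derives a jump target from the fingerprint and integrity-check values; the hash and the address arithmetic are invented for illustration and are not the actual Myles-Jin FBF.

    import hashlib

    def toy_fbf(fingerprint, integrity, base=0x401000):
        # Toy fingerprint branch function: the jump target is an offset
        # from `base` derived from the fingerprint and integrity values.
        digest = hashlib.sha256(fingerprint + integrity).digest()
        return base + int.from_bytes(digest[:2], "big")

    good = toy_fbf(b"user-42", b"integrity-ok")
    bad = toy_fbf(b"user-42", b"integrity-TAMPERED")
    # Tampering changes the integrity value, so the computed target
    # is almost certainly wrong and the program misbehaves.
    print(hex(good), hex(bad))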

Relevance:

10.00%

Publisher:

Abstract:

Preface

The 9th Australasian Conference on Information Security and Privacy (ACISP 2004) was held in Sydney, 13–15 July, 2004. The conference was sponsored by the Centre for Advanced Computing – Algorithms and Cryptography (ACAC), Information and Networked Security Systems Research (INSS), Macquarie University and the Australian Computer Society. The aims of the conference are to bring together researchers and practitioners working in areas of information security and privacy from universities, industry and government sectors. The conference program covered a range of aspects including cryptography, cryptanalysis, systems and network security.

The program committee accepted 41 papers from 195 submissions. The reviewing process took six weeks and each paper was carefully evaluated by at least three members of the program committee. We appreciate the hard work of the members of the program committee and external referees who gave many hours of their valuable time. Of the accepted papers, there were nine from Korea, six from Australia, five each from Japan and the USA, three each from China and Singapore, two each from Canada and Switzerland, and one each from Belgium, France, Germany, Taiwan, The Netherlands and the UK. All the authors, whether or not their papers were accepted, made valued contributions to the conference.

In addition to the contributed papers, Dr Arjen Lenstra gave an invited talk, entitled Likely and Unlikely Progress in Factoring. This year the program committee introduced the Best Student Paper Award. The winner of the prize for the Best Student Paper was Yan-Cheng Chang from Harvard University for his paper Single Database Private Information Retrieval with Logarithmic Communication.

We would like to thank all the people involved in organizing this conference. In particular we would like to thank members of the organizing committee for their time and efforts, Andrina Brennan, Vijayakrishnan Pasupathinathan, Hartono Kurnio, Cecily Lenton, and members from ACAC and INSS.

Relevance:

10.00%

Publisher:

Abstract:

RFID is an important technology for building the ubiquitous society. However, an RFID system uses an open radio frequency signal to transfer information, which poses many serious threats to privacy and security. In general, the computing and storage resources of an RFID tag are very limited, and this makes it difficult to solve its security and privacy problems, especially for low-cost RFID tags. In order to ensure the security and privacy of low-cost RFID systems, we propose a lightweight authentication protocol based on a hash function. The protocol ensures forward security and prevents information leakage, location tracing, eavesdropping, replay attacks and spoofing. It achieves strong authentication of the reader to the tag by authenticating twice, and it transfers only part of the encrypted tag identifier in each session, so it is difficult for an adversary to intercept the whole identifier of a tag. The protocol is simple, requires little computing and storage resources, and is well suited to low-cost RFID systems.
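A generic flavour of such hash-based, partial-disclosure authentication can be sketched in a few lines of Python. The message layout and the half-digest disclosure below are illustrative assumptions, not the exact protocol proposed in the paper.

    import hashlib
    import os

    def h(*parts):
        return hashlib.sha256(b"|".join(parts)).hexdigest()

    tag_id = b"TAG-0001"           # secret identifier shared with the back end
    reader_nonce = os.urandom(8)   # fresh challenge prevents replay

    # The tag answers with only part of the digest, so an eavesdropper
    # never observes a full identifier-dependent value in one session.
    tag_response = h(tag_id, reader_nonce)[:16]

    # The back-end server, which knows tag_id, verifies the partial digest.
    assert tag_response == h(tag_id, reader_nonce)[:16]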

Relevance:

10.00%

Publisher:

Abstract:

We investigate the terminating BKZ reduction concept first introduced by Hanrot et al. [Crypto'11] and carry out extensive experiments to predict the number of tours necessary to obtain the best possible trade-off between reduction time and quality. We then improve Buchmann and Lindner's result [Indocrypt'09] for finding sub-lattice collisions in SWIFFT. We show that further improvement in running time is possible through a special setting of the SWIFFT parameters and by adaptively combining different reduction parameters. Our contributions also include a probabilistic simulation approach, built on top of the deterministic simulation described by Chen and Nguyen [Asiacrypt'11], that can predict the Gram-Schmidt norms more accurately for large block sizes.
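The Gram-Schmidt norms of the basis vectors are exactly the quantities such simulations try to predict. A small NumPy sketch computing them for an arbitrary toy basis (rows), not the SWIFFT lattice:

    import numpy as np

    def gram_schmidt_norms(basis):
        # Norms of the Gram-Schmidt orthogonalisation of the basis rows.
        B = np.array(basis, dtype=float)
        ortho, norms = [], []
        for b in B:
            v = b - sum((b @ u) / (u @ u) * u for u in ortho)
            ortho.append(v)
            norms.append(np.linalg.norm(v))
        return norms

    print(gram_schmidt_norms([[4, 1, 0], [1, 3, 1], [0, 1, 5]]))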

Relevance:

10.00%

Publisher:

Abstract:

Tourism plays an important role in the development of the Cook Islands. In this paper we examine the nexus between tourism and growth using quarterly data over the period 2009Q1–2014Q2 and the recently upgraded ARDL bounds testing approach to cointegration implemented in Microfit 5.01, which provides sample-adjusted bounds and is hence more reliable for small-sample studies. We test for cointegration using the ARDL bounds test and examine the direction of causality. Using per capita visitor arrivals and per capita output as proxies for tourism development and growth, respectively, we examine the long-run association and report the elasticity coefficient of tourism and the causality nexus. Using unit root break tests, we find that 2011Q1 and 2011Q2 are structural break periods in the output series; however, this period is not statistically significant in the ARDL model and is hence excluded from the estimation. The regression results show that the two series are cointegrated. The long-run elasticity coefficient of tourism is estimated to be 0.83 and the short-run coefficient is 0.73. A bidirectional causality between tourism and income is noted for the Cook Islands, which indicates that tourism development and income mutually reinforce each other. In light of this, socio-economic policies need to focus on broad-based, inclusive and income-generating tourism development projects, which are expected to have a feedback effect.
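The causality step can be illustrated with a standard Granger test in statsmodels; the sketch below uses synthetic quarterly series (not the Cook Islands data) and a plain Granger test rather than the bounds-testing framework used in the paper.

    import numpy as np
    import pandas as pd
    from statsmodels.tsa.stattools import grangercausalitytests

    rng = np.random.default_rng(0)
    n = 22                                   # 2009Q1-2014Q2 is 22 quarters
    tourism = np.cumsum(rng.normal(0.02, 0.05, n))    # log visitor arrivals
    output = 0.8 * np.roll(tourism, 1) + rng.normal(0, 0.02, n)  # lags tourism
    df = pd.DataFrame({"output": output, "tourism": tourism}).iloc[1:]

    # Tests whether the second column Granger-causes the first;
    # swap the columns to test the reverse direction.
    grangercausalitytests(df[["output", "tourism"]], maxlag=2)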

Relevance:

10.00%

Publisher:

Abstract:

South Africa is an emerging, industrializing economy that is experiencing remarkable progress. We contend that, amidst these developments, the roles of energy, trade openness and financial development are critical, and in this article we revisit the pivotal role of these factors. We use the ARDL bounds test [72], the Bayer and Hanck [11] cointegration technique, and an extended Cobb–Douglas framework to examine the long-run association with output per worker over the sample period 1971–2011. The results support a long-run association between output per worker, capital per worker and the shift parameters. The short-run elasticity coefficients are: energy (0.24), trade openness (0.07) and financial development (−0.03). In the long run, the elasticity coefficients are: trade openness (0.05), energy (0.29) and financial development (−0.04). In both the short run and the long run, we note that the post-2000 period has a marginal positive effect on the economy. The Toda and Yamamoto [91] Granger causality results show unidirectional causality from capital stock and energy consumption to output, and from capital stock to trade openness; bidirectional causality between trade openness and output; and an absence (neutrality) of causality between financial development and output, indicating that these last two variables evolve independently of each other.
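Because the model is log-linear (an extended Cobb-Douglas form), the estimated slope coefficients are elasticities. The statsmodels sketch below shows the idea on synthetic data, not the South African series; the generating slopes are set near the reported long-run values purely for illustration.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n = 41                                   # 1971-2011
    ln_capital = rng.normal(8.5, 0.3, n)
    ln_energy = rng.normal(6.4, 0.3, n)
    ln_trade = rng.normal(3.3, 0.3, n)
    ln_output = (1.0 + 0.5 * ln_capital + 0.29 * ln_energy
                 + 0.05 * ln_trade + rng.normal(0, 0.03, n))

    X = sm.add_constant(pd.DataFrame({"ln_capital": ln_capital,
                                      "ln_energy": ln_energy,
                                      "ln_trade": ln_trade}))
    # Fitted slopes approximate the elasticities used to generate the data.
    print(sm.OLS(ln_output, X).fit().params)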