458 results for memory access complexity
at Queensland University of Technology - ePrints Archive
Abstract:
We introduce multiple-control fuzzy vaults allowing generalised threshold, compartmented and multilevel access structures. The presented schemes enable many useful applications employing multiple users and/or multiple locking sets. Revisiting the original single-control fuzzy vault of Juels and Sudan, we identify several similarities and differences between their vault and secret sharing schemes which influence how best to obtain working generalisations. We design multiple-control fuzzy vaults and suggest applications that use biometric credentials as locking and unlocking values. Furthermore, we assess the security of the obtained generalisations against insider/outsider attacks and examine the access complexity for legitimate vault owners.
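For orientation, the single-control construction being generalised locks a secret polynomial behind genuine evaluation points mixed with random chaff. Below is a minimal sketch of the Juels-Sudan lock step; the field size, polynomial degree, chaff count and function names are illustrative choices, not parameters from the paper.

```python
import random

# Minimal sketch of the single-control Juels-Sudan fuzzy vault "lock" step.
# Field size and chaff count are illustrative only.
P = 2**13 - 1  # small prime field GF(P)

def poly_eval(coeffs, x, p=P):
    """Evaluate a polynomial (ascending coefficients) at x over GF(p)."""
    y = 0
    for c in reversed(coeffs):
        y = (y * x + c) % p
    return y

def lock(secret_coeffs, locking_set, n_chaff=200):
    """Hide a secret polynomial behind genuine points plus random chaff."""
    genuine = [(x, poly_eval(secret_coeffs, x)) for x in locking_set]
    chaff, used = [], set(locking_set)
    while len(chaff) < n_chaff:
        x = random.randrange(P)
        if x in used:
            continue
        y = random.randrange(P)
        if y == poly_eval(secret_coeffs, x):  # chaff must lie off the polynomial
            continue
        used.add(x)
        chaff.append((x, y))
    vault = genuine + chaff
    random.shuffle(vault)  # genuine and chaff points are indistinguishable
    return vault

# Example: degree-2 secret polynomial locked under a 5-element locking set.
# Unlocking (omitted) recovers the polynomial by Reed-Solomon decoding from
# any unlocking set that overlaps the locking set sufficiently.
vault = lock([7, 42, 3], locking_set=[11, 23, 35, 47, 59])
```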
Abstract:
X-ray microtomography (micro-CT) with micron resolution enables new ways of characterizing microstructures and opens pathways for forward calculations of multiscale rock properties. A quantitative characterization of the microstructure is the first step in this challenge. We developed a new approach to extract scale-dependent characteristics of porosity, percolation, and anisotropic permeability from 3-D microstructural models of rocks. The Hoshen-Kopelman algorithm of percolation theory is employed for a standard percolation analysis. The anisotropy of permeability is calculated by means of the star volume distribution approach. The local porosity distribution and local percolation probability are obtained by using local porosity theory. Additionally, the local anisotropy distribution is defined and analyzed through two empirical probability density functions, the isotropy index and the elongation index. For such a high-resolution data set, the typical sizes of the CT images are on the order of gigabytes to tens of gigabytes, so an extremely large number of calculations is required. To resolve this large-memory problem, parallelization in OpenMP was used to optimally harness the shared-memory infrastructure of cache-coherent Non-Uniform Memory Access (ccNUMA) machines such as the iVEC SGI Altix 3700 Bx2 supercomputer. We see adequate visualization of the results as an important element of this first pioneering study.
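As a pointer to how a standard percolation analysis of this kind can look in practice, the sketch below labels pore clusters in a 3-D binary image and checks for a spanning cluster. It uses scipy.ndimage.label, which performs the same connected-component labelling that Hoshen-Kopelman describes; the random test volume and porosity are illustrative stand-ins, not micro-CT data.

```python
import numpy as np
from scipy import ndimage

# Illustrative percolation check on a synthetic 3-D binary pore image.
rng = np.random.default_rng(0)
pores = rng.random((64, 64, 64)) < 0.3     # True = pore voxel, ~30% porosity

# Hoshen-Kopelman-style cluster labelling (default 6-connectivity in 3-D).
labels, n_clusters = ndimage.label(pores)

# The sample percolates along z if some cluster touches both the top and
# bottom faces of the volume; label 0 is the solid matrix.
top = set(np.unique(labels[0]))
bottom = set(np.unique(labels[-1]))
spanning = (top & bottom) - {0}
print("porosity:", pores.mean(), "percolates in z:", bool(spanning))
```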
Abstract:
In this paper we present a cryptanalysis of a new 256-bit hash function, FORK-256, proposed by Hong et al. at FSE 2006. This cryptanalysis is based on some unexpected differentials existing for the step transformation. We show their possible uses in different attack scenarios by giving a 1-bit (resp. 2-bit) near-collision attack against the full compression function of FORK-256 running with complexity 2^125 (resp. 2^120) and with negligible memory, and by exhibiting a 22-bit near pseudo-collision. We also show that we can find collisions for the full compression function with a small amount of memory with complexity not exceeding 2^126.6 hash evaluations. We further show how to reduce this complexity to 2^109.6 hash computations by using 2^73 memory words. Finally, we show that this attack can be extended at no additional cost to find collisions for the full hash function, i.e. with the predefined IV.
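Because the best generic collision attack on an n-bit hash is the birthday search at roughly 2^(n/2) evaluations, the 2^126.6 figure above narrowly undercuts the 2^128 generic bound for a 256-bit function. A quick back-of-the-envelope check (standard birthday arithmetic, not material from the paper):

```python
# Generic (birthday) collision cost for an n-bit hash is ~2^(n/2) evaluations;
# a dedicated attack must undercut this bound to count as a break.
n = 256
generic_log2 = n / 2                # 128.0 for a 256-bit hash like FORK-256
attack_log2 = 126.6                 # memoryless collision attack quoted above
advantage = 2 ** (generic_log2 - attack_log2)
print(f"attack beats the generic 2^{generic_log2:.0f} bound by ~{advantage:.1f}x")
```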
Abstract:
In this paper, we analyze the SHAvite-3-512 hash function, as proposed and tweaked for round 2 of the SHA-3 competition. We present cryptanalytic results on 10 out of 14 rounds of the hash function SHAvite-3-512, and on the full 14-round compression function of SHAvite-3-512. We show a second preimage attack on the hash function reduced to 10 rounds with a complexity of 2^497 compression function evaluations and 2^16 memory. For the full 14-round compression function, we give a chosen-counter, chosen-salt preimage attack with 2^384 compression function evaluations and 2^128 memory (or complexity 2^448 without memory), and a collision attack with 2^192 compression function evaluations and 2^128 memory.
Abstract:
There is considerable evidence that working memory impairment is a common feature of schizophrenia. The present study assessed working memory and executive function in 54 participants with schizophrenia and a group of 54 normal controls matched to the patients on age, gender and estimated premorbid IQ, using traditional and newer measures of executive function and two dual tasks: Telephone Search with Counting, and the Memory Span and Tracking Task. Results indicated that participants with schizophrenia were significantly impaired on all standardised measures of executive function, with the exception of a composite measure of the Trail Making Test. Results for the dual-task measures demonstrated that while the participants with schizophrenia were unimpaired on immediate digit span recall over a 2-min period, they recalled fewer digit strings and performed more poorly on a tracking task (box-crossing task) compared with controls. In addition, participants with schizophrenia performed more poorly on the tracking task when they were required to simultaneously recall digit strings than when they performed the task alone. Contrary to expectation, results of the telephone search task under dual conditions were not significantly different between groups. These results may reflect the insufficient complexity of the tone-counting task as an interference task. Overall, the present study showed that participants with schizophrenia appear to have a restricted impairment of their working memory system that is evident in tasks in which the visuospatial sketchpad slave system requires central executive control.
Abstract:
Rapid advances in information and communications technology (ICT), particularly the development of online technologies, have transformed the nature of economic, social and cultural relations across the globe. In the context of higher education in post-industrial societies, technological change has had a significant impact on university operating environments. In a broad sense, technological advancement has contributed significantly to the increasing complexity of global economies and societies, which is reflected in the rise of lifelong learning discourses with which universities are engaging. More specifically, the ever-expanding array of ICT available within the university sector has generated new management and pedagogical imperatives for higher education in the information age.
Abstract:
The process of researching children's literature from the past is a growing challenge as resources age and are increasingly treated as rare items, stored away within libraries and other research centres. In Australia, researchers and librarians have collaborated with the bibliographic database AustLit: The Australian Literature Resource to produce the Australian Children's Literature Digital Resources Project (CLDR). This project aims to address the growing demand for online access to rare children's literature resources, and demonstrates the research potential of early Australian children's literature by supplementing the collection with relevant critical articles. The CLDR project has a specific focus, providing access to full-text Australian children's literature from European settlement to 1945. The collection reflects a need and desire to preserve literary treasures so that such collections are not lost in a digital age. It covers many themes relevant to the conference, including trauma, survival, memory, hauntings, and histories. The resource provides new ways to research children's literature from the past and offers a fascinating repository to scholars and professionals across a range of disciplines who are interested in Australian children's literature.
Abstract:
Using a quasi-natural voting experiment encompassing a 160-year period (1848–2009) in Switzerland, we investigate whether a higher level of complexity leads to increased reliance on trusted parliamentary representatives. We find that when more referenda are held on the same day, constituents are more likely to refer to parliamentary recommendations when making their decisions. This finding holds true even when we narrow our focus to referenda with a relatively lower voter turnout on days on which more than one referendum is held. We also demonstrate that when constituents face a higher level of complexity, they follow the parliamentary recommendations rather than those of interest groups. "Viewed as a geometric figure, the ant’s path is irregular, complex, hard to describe. But its complexity is really a complexity in the surface of the beach, not a complexity in the ant." ([1] p. 51)
Abstract:
A multi-secret sharing scheme allows several secrets to be shared amongst a group of participants. In 2005, Shao and Cao developed a verifiable multi-secret sharing scheme in which each participant's share can be used several times, which reduces the number of interactions between the dealer and the group members. In addition, some secrets may require a higher security level than others, calling for different threshold values. Recently, Chan and Chang designed such a scheme, but their construction allows only a single secret to be shared per threshold value. In this article we combine the two previous approaches to design a multiple-time verifiable multi-secret sharing scheme in which several secrets can be shared for each threshold value. Since running time is an important factor for practical applications, we also provide a complexity comparison of our combined approach with respect to the previous schemes.
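As background, the threshold schemes being combined here descend from Shamir's polynomial construction, in which any t of n shares reconstruct the secret. A minimal single-secret sketch over a prime field follows (the modulus and parameters are illustrative; this is not the Shao-Cao or Chan-Chang construction):

```python
import random

# Minimal Shamir (t, n) threshold sharing over GF(P); the verifiable,
# multi-use, multi-secret schemes above build on this basic idea.
P = 2**61 - 1  # prime modulus, illustration only

def share(secret, t, n):
    """Split `secret` into n shares, any t of which reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    def f(x):
        y = 0
        for c in reversed(coeffs):   # Horner evaluation mod P
            y = (y * x + c) % P
        return y
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

shares = share(123456789, t=3, n=5)
assert reconstruct(shares[:3]) == 123456789   # any 3 shares suffice
```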
Abstract:
This paper addresses the problem of joint identification of infinite-frequency added mass and fluid-memory models of marine structures from finite-frequency data. This problem is relevant for cases where the code used to compute the hydrodynamic coefficients of the marine structure does not give the infinite-frequency added mass. This case is typical of codes based on 2D potential theory, since most 3D potential-theory codes solve the boundary value problem associated with the infinite frequency. The method proposed in this paper presents a simpler alternative to other methods previously presented in the literature. Its advantage is that the same identification procedure can be used to identify the fluid-memory models with or without access to the infinite-frequency added mass coefficient; it therefore provides an extension that puts the two identification problems into the same framework. The method also exploits constraints related to the relative degree and low-frequency asymptotic values of the hydrodynamic coefficients, derived from the physics of the problem, which are used as prior information to refine the obtained models.
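To make the identification task concrete, the sketch below fits a rational transfer function to frequency-response samples by a Levy-style linearised least-squares step, fixing the relative degree as prior knowledge. This is a generic illustration under stated assumptions, not the paper's procedure; the synthetic data and model orders are made up.

```python
import numpy as np

def fit_rational(w, K, n_den):
    """Fit K(jw) ~ P(jw)/Q(jw) by linearised least squares (Levy's method).
    Choosing deg P = deg Q - 1 enforces the relative degree one expected
    of fluid-memory models."""
    s = 1j * w
    n_num = n_den - 1
    A_num = np.vander(s, n_num + 1, increasing=True)        # p_0 .. p_{n_num}
    A_den = -K[:, None] * np.vander(s, n_den, increasing=True)  # q_0 .. q_{n_den-1}
    A = np.hstack([A_num, A_den])
    b = K * s**n_den                      # Q taken monic, leading term to RHS
    A_ri = np.vstack([A.real, A.imag])    # stack real/imag parts for real LS
    b_ri = np.concatenate([b.real, b.imag])
    theta, *_ = np.linalg.lstsq(A_ri, b_ri, rcond=None)
    p = theta[: n_num + 1]
    q = np.append(theta[n_num + 1:], 1.0)  # ascending powers, monic
    return p, q

# Synthetic check: recover K(s) = 2s / (s^2 + 0.4s + 1) from samples.
w = np.linspace(0.1, 5.0, 200)
K_true = (2 * 1j * w) / ((1j * w) ** 2 + 0.4 * (1j * w) + 1)
p, q = fit_rational(w, K_true, n_den=2)
print("numerator:", np.round(p.real, 3), "denominator:", np.round(q.real, 3))
```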
Abstract:
Double-strand breaks represent an extremely cytolethal form of DNA damage and thus pose a serious threat to the preservation of genetic and epigenetic information. Though it is well known that double-strand breaks, such as those generated by ionising radiation, are among the principal causative factors behind mutations, chromosomal aberrations, genetic instability and carcinogenesis, significantly less is known about the epigenetic consequences of double-strand break formation and repair for carcinogenesis. Double-strand break repair is a highly coordinated process that requires the unravelling of the compacted chromatin structure to facilitate repair machinery access, followed by restoration of the original undamaged chromatin state. Recent experimental findings have pointed to a potential mechanism for double-strand break-induced epigenetic silencing. This review discusses some of the key epigenetic regulatory processes involved in double-strand break (DSB) repair and how incomplete or incorrect restoration of chromatin structure can leave a DSB-induced epigenetic memory of damage with potentially pathological repercussions.
Abstract:
In recent years the Australian government has dedicated considerable project funds to establishing public Internet access points in rural and regional communities. Drawing on data from a major Australian study of the social and economic impact of new technologies on rural areas, this paper explores some of the difficulties rural communities have faced in setting up public access points and sustaining them beyond their project funding. Of particular concern is the way that economic sustainability has been positioned as a measure of the success of such ventures. Government funding has been allocated on the basis that these rural public access points will become economically self-sustaining, which is problematic on a number of counts. It is therefore argued that these public access points should be reconceptualised as essential community infrastructure, like schools and libraries, rather than as potential economic enterprises. Keywords: Internet; Public access; Sustainability; Digital divide; Rural Australia
Abstract:
We generalize the classical notion of Vapnik-Chervonenkis (VC) dimension to ordinal VC-dimension, in the context of logical learning paradigms. Logical learning paradigms encompass the numerical learning paradigms commonly studied in Inductive Inference. A logical learning paradigm is defined as a set W of structures over some vocabulary, and a set D of first-order formulas that represent data. The sets of models of φ in W, where φ varies over D, generate a natural topology on W. We show that if D is closed under boolean operators, then the notion of ordinal VC-dimension offers a perfect characterization of the problem of predicting the truth of the members of D in a member of W, with an ordinal bound on the number of mistakes. This shows that the notion of VC-dimension has a natural interpretation in Inductive Inference when cast into a logical setting. We also study the relationships between predictive complexity, selective complexity (a variation on predictive complexity), and mind change complexity. The assumptions that D is closed under boolean operators and that the topology on W is compact often play a crucial role in establishing connections between these concepts. We then consider a computable setting with effective versions of the complexity measures, and show that the equivalence between ordinal VC-dimension and predictive complexity fails. More precisely, we prove that the effective ordinal VC-dimension of a paradigm can be defined when all other effective notions of complexity are undefined. On a better note, when the topology on W is compact, all effective notions of complexity are defined, though they are not related as in the noncomputable version of the framework.
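For orientation, the classical notion being generalised is the following standard definition: a set S is shattered by a class C when every subset of S is cut out by some member of C, and the VC dimension is the largest cardinality of a shattered set.

```latex
% Classical Vapnik-Chervonenkis dimension, the base case of the ordinal
% generalisation introduced in the abstract above (standard definition).
\[
  \operatorname{VCdim}(\mathcal{C})
  \;=\;
  \sup \bigl\{\, |S| \;:\; \{\, S \cap C : C \in \mathcal{C} \,\} = 2^{S} \,\bigr\}
\]
```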