142 results for Sampling schemes
Abstract:
Authenticated Encryption (AE) is the cryptographic process of providing simultaneous confidentiality and integrity protection to messages. This approach is more efficient than a two-step process that first provides confidentiality by encrypting the message and then, in a separate pass, provides integrity protection by generating a Message Authentication Code (MAC). AE using symmetric ciphers can be provided either by stream ciphers with built-in authentication mechanisms or by block ciphers using appropriate modes of operation. However, stream ciphers have the potential for higher performance and a smaller footprint in hardware and/or software than block ciphers. This property makes stream ciphers suitable for resource-constrained environments, where storage and computational power are limited. There have been several recent stream cipher proposals that claim to provide AE. These ciphers can be analysed using existing techniques that consider confidentiality or integrity separately; however, there is currently no framework for the analysis of AE stream ciphers that considers these two properties simultaneously. This thesis introduces a novel framework for the analysis of AE using stream cipher algorithms. It analyses the mechanisms for providing confidentiality and for providing integrity in AE algorithms using stream ciphers. There is a greater emphasis on the analysis of the integrity mechanisms, as there is little in the public literature on this in the context of authenticated encryption. The thesis has four main contributions, as follows. The first contribution is the design of a framework that can be used to classify AE stream ciphers based on three characteristics. The first classification applies Bellare and Namprempre's work on the order in which the encryption and authentication processes take place. The second classification is based on the method used for accumulating the input message (either directly or indirectly) into the internal states of the cipher to generate a MAC. The third classification is based on whether the sequence that is used to provide encryption and authentication is generated using a single key and initial vector, or two keys and two initial vectors. The second contribution is the application of an existing algebraic method to analyse the confidentiality algorithms of two AE stream ciphers, namely SSS and ZUC. The algebraic method is based on considering the nonlinear filter (NLF) of these ciphers as a combiner with memory. This method enables us to construct equations for the NLF that relate the inputs, outputs and memory of the combiner to the output keystream. We show that both of these ciphers are secure against this type of algebraic attack. We conclude that using a key-dependent S-box in the NLF twice, and using two different S-boxes in the NLF of ZUC, prevents this type of algebraic attack. The third contribution is a new general matrix-based model for MAC generation where the input message is injected directly into the internal state. This model describes the accumulation process when the input message is injected directly into the internal state of a nonlinear filter generator. We show that three recently proposed AE stream ciphers, namely SSS, NLSv2 and SOBER-128, can be considered as instances of this model. Our model is more general than previous investigations into direct injection. Possible forgery attacks against this model are investigated. It is shown that using a nonlinear filter in the accumulation of the input message, when either the input message or the initial state of the register is unknown, prevents forgery attacks based on collisions. The last contribution is a new general matrix-based model for MAC generation where the input message is injected indirectly into the internal state. This model uses the input message as a controller to accumulate a keystream sequence into an accumulation register. We show that three current AE stream ciphers, namely ZUC, Grain-128a and Sfinks, can be considered as instances of this model. We establish the conditions under which the model is susceptible to forgery and side-channel attacks.
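As a hedged illustration of the direct-injection idea only (not the thesis's matrix-based model or the actual SSS, NLSv2 or SOBER-128 designs), the toy Python sketch below XORs message bits directly into the state of a small shift register and accumulates the output of a simple nonlinear filter into a tag register; the register length, feedback taps and filter function are all assumptions chosen purely for illustration.

```python
# Toy sketch: message bits injected directly into a shift-register state,
# with a nonlinear filter output accumulated into a MAC register.
# Register length, taps and filter are illustrative assumptions only.

def nlf(state):
    # A simple nonlinear filter over a few state bits (illustrative only).
    return (state[0] & state[3]) ^ (state[5] & state[7]) ^ state[11]

def direct_injection_mac(key_state, message_bits, reg_len=16, tag_len=32):
    state = list(key_state)            # keyed initial state (reg_len bits)
    acc = [0] * tag_len                # accumulation register for the tag
    for i, m in enumerate(message_bits):
        feedback = state[0] ^ state[2] ^ state[reg_len - 1] ^ m  # inject message bit
        state = state[1:] + [feedback]                           # clock the register
        acc[i % tag_len] ^= nlf(state)                           # accumulate filter output
    return acc

if __name__ == "__main__":
    import random
    random.seed(1)
    key_state = [random.getrandbits(1) for _ in range(16)]
    msg = [1, 0, 1, 1, 0, 0, 1, 0] * 8
    print(direct_injection_mac(key_state, msg))
```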
Abstract:
In this paper, we present three counterfeiting attacks on block-wise dependent fragile watermarking schemes. We consider vulnerabilities such as the exploitation of a weak correlation among block-wise dependent watermarks to modify valid watermarked (medical or other digital) images such that they can still be verified as authentic, even though they are not. Experimental results successfully demonstrate the practicability and consequences of the proposed attacks for some relevant schemes. The proposed attack models can be used as a means to systematically examine the security levels of similar watermarking schemes.
Abstract:
Background: Developing sampling strategies to target biological pests such as insects in stored grain is inherently difficult owing to species biology and behavioural characteristics. The design of robust sampling programmes should be based on an underlying statistical distribution that is sufficiently flexible to capture variations in the spatial distribution of the target species. Results: Comparisons are made of the accuracy of four probability-of-detection sampling models - the negative binomial model [1], the Poisson model [1], the double logarithmic model [2] and the compound model [3] - for detection of insects over a broad range of insect densities. Although the double logarithmic and negative binomial models performed well under specific conditions, it is shown that, of the four models examined, the compound model performed best over a broad range of insect spatial distributions and densities. In particular, this model accurately predicted the number of samples required when insect density was high and clumped within experimental storages. Conclusions: This paper reinforces the need for effective sampling programmes designed to detect insects over a broad range of spatial distributions. The compound model is robust over a broad range of insect densities and leads to substantial improvement in detection probabilities within highly variable systems such as grain storage.
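For orientation, a minimal sketch of a probability-of-detection calculation under the negative binomial model (one of the four models compared above, not the compound model favoured by the paper) is given below. The per-sample detection probability 1 - (1 + m/k)^(-k), with mean density m and clumping parameter k, is the standard negative binomial zero-class result; all numerical values are assumptions used only to show how the required number of samples grows as insects become more clumped.

```python
import math

def p_detect_negbin(mean_density, k):
    """Probability that one sample contains at least one insect,
    assuming counts follow a negative binomial distribution with
    mean `mean_density` and clumping (dispersion) parameter `k`."""
    return 1.0 - (1.0 + mean_density / k) ** (-k)

def samples_required(mean_density, k, target=0.95):
    """Number of independent samples needed to reach a target
    probability of detecting at least one insect."""
    p = p_detect_negbin(mean_density, k)
    return math.ceil(math.log(1.0 - target) / math.log(1.0 - p))

if __name__ == "__main__":
    for k in (0.1, 1.0, 10.0):   # smaller k means a more clumped distribution
        print(f"k = {k}: {samples_required(mean_density=0.05, k=k)} samples")
```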
Abstract:
Acoustic sensors can be used to estimate species richness for vocal species such as birds. They can continuously and passively record large volumes of data over extended periods. These data must subsequently be analyzed to detect the presence of vocal species. Automated analysis of acoustic data for large numbers of species is complex and can be subject to high levels of false positive and false negative results. Manual analysis by experienced surveyors can produce accurate results; however, the time and effort required to process even small volumes of data can make manual analysis prohibitive. This study examined the use of sampling methods to reduce the cost of analyzing large volumes of acoustic sensor data while retaining high levels of species detection accuracy. Utilizing five days of manually analyzed acoustic sensor data from four sites, we examined a range of sampling frequencies and methods, including random, stratified, and biologically informed. We found that randomly selecting 120 one-minute samples from the three hours immediately following dawn over five days of recordings detected the highest number of species. On average, this method detected 62% of total species from 120 one-minute samples, compared to 34% of total species detected by traditional area search methods. Our results demonstrate that targeted sampling methods can provide an effective means for analyzing large volumes of acoustic sensor data efficiently and accurately. Development of automated and semi-automated techniques is required to assist in analyzing large volumes of acoustic sensor data.
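As a hedged sketch of the sampling step described above, the code below randomly draws 120 one-minute segments from the three hours after dawn across five days of recordings. The way days and minute offsets are indexed, and the fixed random seed, are assumptions for illustration, not details from the study.

```python
import random

def dawn_sample(days=5, minutes_after_dawn=180, n_samples=120, seed=42):
    """Randomly select one-minute samples from the post-dawn period.
    Each candidate is a (day, minutes-after-dawn) pair; the pool size
    here is days * minutes_after_dawn = 900 candidate minutes."""
    rng = random.Random(seed)
    pool = [(day, minute) for day in range(1, days + 1)
                          for minute in range(minutes_after_dawn)]
    return sorted(rng.sample(pool, n_samples))

if __name__ == "__main__":
    for day, minute in dawn_sample()[:5]:
        print(f"day {day}, {minute} min after dawn")
```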
Abstract:
The detached housing scheme is a unique and exclusive segment of the residential property market in Malaysia. Generally, the product is expensive, and for the many Malaysians who can afford one, owning a detached house is a once-in-a-lifetime opportunity. In spite of this, most owners fail to fully comprehend the specific needs of this type of housing scheme, increasing the risk of it becoming a problematic undertaking. Unlike other types of pre-designed "mass housing" schemes, a detached housing scheme may be built specifically to cater to the needs and demands of its owner. Therefore, owner participation during critical development stages is vital to guarantee the success of the development as a whole. In addition, due to its unique design, the house has to individually comply with the requirements and regulations of the relevant authorities. Failure by the owner to recognise this will result in delays, penalties, disputes and, ultimately, cost overruns. These circumstances highlight the need for research to guide the owner through participation during the critical development stages of a detached house. Therefore, this research aims to develop a guideline to improve owner participation for successful detached house development in Malaysia. To achieve this aim, questionnaire surveys and semi-structured interviews were employed to collect detached house owners', consultants' and contractors' responses based on their experiences in developing detached houses in Malaysia. Stratified and random sampling were utilised to gather information from both parties to represent Malaysian detached house participants. The questionnaire responses were analysed through the application of quantitative analyses such as descriptive analysis, factor analysis and structural equation modelling, which were substantiated through qualitative analysis procedures such as content analysis. This research identified that, in order to produce a successful outcome, detached house owners are required to participate during critical stages of the development. In the planning stage, the owner needs to provide proper, specific input to the consultant regarding his/her expectations of the cost of the entire development, its detailed specification, and a general idea of the internal and external design of the detached house and its compound. In the contracting stage, the owner must make the appropriate choice in selecting the right contractor for the job. This decision may be taken after recommendations from the consultants or from the owner's personal contacts or experiences, but it is not recommended for the owner to select a contractor primarily on the basis of the lowest bid. In the completion stage, the owner may need to attend a number of important site meetings to ensure that the progress of the works is according to what has been planned and that the completion date is achievable. By having owners undertake an active role during critical stages of the development, not only are the quality and delivery of the development improved, but the owners' own satisfaction also increases.
Abstract:
Basing signature schemes on strong lattice problems has been a long-standing open issue. Today, two families of lattice-based signature schemes are known: those based on the hash-and-sign construction of Gentry et al., and Lyubashevsky's schemes, which are based on the Fiat-Shamir framework. In this paper we show for the first time how to adapt the schemes of Lyubashevsky to the ring signature setting. In particular, we transform the scheme of ASIACRYPT 2009 into a ring signature scheme that provides strong security properties in the random oracle model. Anonymity is ensured in the sense that signatures of different users are within negligible statistical distance even under full key exposure. In fact, the scheme satisfies a notion that is stronger than the classical full key exposure setting: even if the keypair of the signing user is adversarially chosen, the statistical distance between signatures of different users remains negligible. Considering unforgeability, the best lattice-based ring signature schemes provide either unforgeability against arbitrary chosen-subring attacks or unforgeability against insider corruption in log-sized rings. In this paper we present two variants of our scheme. In the basic one, unforgeability is ensured in both of those settings. By increasing the signature and key sizes by a factor k (typically 80-100), we provide a variant in which unforgeability is ensured against insider corruption attacks for arbitrary rings. The technique used is quite general and can be adapted to other existing schemes.
Abstract:
Monitoring stream networks through time provides important ecological information. The sampling design problem is to choose the locations where measurements are taken so as to maximise the information gathered about physicochemical and biological variables on the stream network. This paper uses a pseudo-Bayesian approach, averaging a utility function over a prior distribution, to find a design that maximises the average utility. We use models for correlations of observations on the stream network that are based on stream network distances and described by moving average error models. The utility functions used reflect the needs of the experimenter, such as prediction of location values or estimation of parameters. We propose an algorithmic approach to design in which the mean utility of a design is estimated using Monte Carlo techniques and an exchange algorithm is used to search for optimal sampling designs. In particular, we focus on the problem of finding an optimal design from a set of fixed designs and finding an optimal subset of a given set of sampling locations. As there are many different variables to measure, such as chemical, physical and biological measurements at each location, designs are derived from models based on different types of response variables: continuous, counts and proportions. We apply the methodology to a synthetic example and to the Lake Eacham stream network on the Atherton Tablelands in Queensland, Australia. We show that the optimal designs depend very much on the choice of utility function, varying from space-filling to clustered designs and mixtures of these, but, given the utility function, the designs are relatively robust to the type of response variable.
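A minimal sketch of the exchange-algorithm idea is given below: the expected utility of a design (a subset of candidate sampling locations) is estimated by Monte Carlo, and single locations are swapped in and out while the estimate improves. The utility shown (negative average squared prediction error under an assumed exponential spatial covariance on Euclidean coordinates) and all numerical settings are placeholder assumptions; they are not the stream-distance moving average models or utilities used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Candidate sampling locations (2-D coordinates); placeholder values.
coords = rng.uniform(0, 10, size=(30, 2))

def exp_cov(a, b, sill=1.0, range_par=3.0):
    # Exponential covariance on Euclidean distance (an illustrative assumption).
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
    return sill * np.exp(-d / range_par)

K_all = exp_cov(coords, coords) + 1e-8 * np.eye(len(coords))
L = np.linalg.cholesky(K_all)          # used to simulate Gaussian field realisations

def utility(design, n_sims=200):
    """Monte Carlo estimate of utility: negative mean squared error of
    simple-kriging predictions at the non-design sites (higher is better)."""
    design = sorted(design)
    others = [i for i in range(len(coords)) if i not in design]
    K_dd = K_all[np.ix_(design, design)]
    K_od = K_all[np.ix_(others, design)]
    weights = K_od @ np.linalg.inv(K_dd)        # simple-kriging weights
    errs = []
    for _ in range(n_sims):
        z = L @ rng.standard_normal(len(coords))    # one field realisation
        errs.append(np.mean((z[others] - weights @ z[design]) ** 2))
    return -np.mean(errs)

def exchange(n_design=8, iters=50):
    """Greedy exchange: swap one design point for one candidate at a time,
    keeping the swap only if the Monte Carlo utility estimate improves."""
    design = list(rng.choice(len(coords), n_design, replace=False))
    best_u = utility(design)
    for _ in range(iters):
        i = rng.integers(n_design)
        j = int(rng.choice([c for c in range(len(coords)) if c not in design]))
        trial = design.copy()
        trial[i] = j
        u = utility(trial)
        if u > best_u:
            design, best_u = trial, u
    return design, best_u

if __name__ == "__main__":
    print(exchange())
```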
Abstract:
The deposition of biological material (biofouling) onto polymeric contact lenses is thought to be a major contributor to lens discomfort and hence discontinuation of wear. We describe a method to characterize lipid deposits directly from worn contact lenses utilizing liquid extraction surface analysis coupled to tandem mass spectrometry (LESA-MS/MS). This technique effected facile and reproducible extraction of lipids from the contact lens surfaces and identified lipid molecular species representing all major classes present in human tear film. Our data show that LESA-MS/MS is a rapid and comprehensive technique for the characterization of lipid-related biofouling on polymer surfaces.
Abstract:
Multiple-time signatures are digital signature schemes in which the signer is able to sign a predetermined number of messages. They are interesting cryptographic primitives because they allow many important cryptographic problems to be solved, and at the same time they offer a substantial efficiency advantage over ordinary digital signature schemes such as RSA. Multiple-time signature schemes have found numerous applications in ordinary, on-line/off-line and forward-secure signatures, as well as in multicast/stream authentication. We propose a multiple-time signature scheme with very efficient signing and verification. Our construction is based on a combination of one-way functions and cover-free families, and it is secure against adaptive chosen-message attacks.
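The construction above combines one-way functions with cover-free families; the sketch below shows only the simpler, well-known Lamport one-time signature built from a hash function, as a hedged illustration of how one-way functions alone already yield a (single-use) signature scheme. It is not the proposed multiple-time scheme.

```python
import hashlib
import os

H = lambda b: hashlib.sha256(b).digest()   # hash used as the one-way function

def keygen(n_bits=256):
    """Lamport one-time keys: two random preimages per message-digest bit."""
    sk = [(os.urandom(32), os.urandom(32)) for _ in range(n_bits)]
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def sign(sk, message: bytes):
    # Reveal one preimage per bit of the message digest.
    digest = H(message)
    bits = [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(len(sk))]
    return [sk[i][bit] for i, bit in enumerate(bits)]

def verify(pk, message: bytes, sig):
    digest = H(message)
    bits = [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(len(pk))]
    return all(H(s) == pk[i][bit] for i, (s, bit) in enumerate(zip(sig, bits)))

if __name__ == "__main__":
    sk, pk = keygen()
    sig = sign(sk, b"hello")
    print(verify(pk, b"hello", sig), verify(pk, b"tampered", sig))
```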
Abstract:
The purpose of this paper is to describe a new decomposition construction for perfect secret sharing schemes with graph access structures. The previous decomposition construction proposed by Stinson is a recursive method that uses small secret sharing schemes as building blocks in the construction of larger schemes. When the Stinson method is applied to graph access structures, the number of such "small" schemes is typically exponential in the number of participants, resulting in an exponential-time algorithm. Our method has the same flavor as the Stinson decomposition construction; however, the linear programming problem involved in the construction is formulated in such a way that the number of "small" schemes is polynomial in the number of participants, which in turn gives rise to a polynomial-time construction. We also show that if we apply the Stinson construction to the "small" schemes arising from our new construction, both constructions have the same information rate.
Abstract:
We consider the problem of increasing the threshold parameter of a secret-sharing scheme after the setup (share distribution) phase, without further communication between the dealer and the shareholders. Previous solutions to this problem require one to start off with a nonstandard scheme designed specifically for this purpose, or to have communication between shareholders. In contrast, we show how to increase the threshold parameter of the standard Shamir secret-sharing scheme without communication between the shareholders. Our technique can thus be applied to existing Shamir schemes even if they were set up without consideration of future threshold increases. Our method is a new positive cryptographic application for lattice reduction algorithms, inspired by recent work on lattice-based list decoding of Reed-Solomon codes with noise bounded in the Lee norm. We use fundamental results from the theory of lattices (geometry of numbers) to prove quantitative statements about the information-theoretic security of our construction. These lattice-based security proof techniques may be of independent interest.
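For reference, a minimal sketch of the standard Shamir (k, n) threshold scheme that the paper starts from is shown below: shares are points on a random degree-(k-1) polynomial over a prime field, and any k shares recover the secret by Lagrange interpolation at zero. The prime and parameters are illustrative choices; the paper's lattice-based technique for raising the threshold after setup is not reproduced here.

```python
import random

P = 2**127 - 1   # a Mersenne prime, large enough for this toy example

def make_shares(secret, k, n):
    """Shamir (k, n) sharing: evaluate a random degree-(k-1) polynomial."""
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Recover the secret by Lagrange interpolation at x = 0 over GF(P)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

if __name__ == "__main__":
    shares = make_shares(secret=123456789, k=3, n=5)
    print(reconstruct(shares[:3]))   # any 3 of the 5 shares suffice
```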
Abstract:
Enterprise resource planning (ERP) systems are rapidly being combined with "big data" analytics processes and publicly available "open data sets", which are usually outside the arena of the enterprise, to expand activity through better service to current clients as well as by identifying new opportunities. Moreover, these activities are now largely based around relevant software systems hosted in a "cloud computing" environment. The over 50-year-old phrase expressing mistrust in computer systems, namely "garbage in, garbage out" or "GIGO", is used to describe problems of unqualified and unquestioning dependency on information systems. However, a more relevant GIGO interpretation arose some time later, namely "garbage in, gospel out", signifying that with large-scale information systems based around ERP and open data sets as well as "big data" analytics, particularly in a cloud environment, the ability to verify the authenticity and integrity of the data sets used may be almost impossible. In turn, this may easily result in decision making based upon questionable results that are unverifiable. Illicit "impersonation" of, and modifications to, legitimate data sets may become a reality, while at the same time the ability to audit any derived results of analysis may be an important requirement, particularly in the public sector. The pressing need for enhancement of identity, reliability, authenticity and audit services, including naming and addressing services, in this emerging environment is discussed in this paper. Some appropriate technologies currently being offered are also examined. However, severe limitations in addressing the problems identified are found, and the paper proposes further necessary research work for the area. (Note: This paper is based on an earlier unpublished paper/presentation "Identity, Addressing, Authenticity and Audit Requirements for Trust in ERP, Analytics and Big/Open Data in a 'Cloud' Computing Environment: A Review and Proposal" presented to the Department of Accounting and IT, College of Management, National Chung Chen University, 20 November 2013.)
Abstract:
A multi-secret sharing scheme allows several secrets to be shared amongst a group of participants. In 2005, Shao and Cao developed a verifiable multi-secret sharing scheme in which each participant's share can be used several times, which reduces the number of interactions between the dealer and the group members. In addition, some secrets may require a higher security level than others, creating the need for different threshold values. Recently, Chan and Chang designed such a scheme, but their construction only allows a single secret to be shared per threshold value. In this article we combine the previous two approaches to design a multiple-time verifiable multi-secret sharing scheme in which several secrets can be shared for each threshold value. Since running time is an important factor for practical applications, we also provide a complexity comparison of our combined approach with respect to the previous schemes.
Abstract:
The construction and operation of infrastructure assets can have a significant impact on society and the region. Using a sustainability assessment framework can be an effective means of building sustainability aspects into the design, construction and operation of infrastructure assets. The conventional evaluation processes and procedures for infrastructure projects do not necessarily measure the qualitative/quantitative effectiveness of all aspects of sustainability: environment, social wellbeing and economy. As a result, a few infrastructure sustainability rating schemes have been developed with a view to assessing the level of sustainability attained in infrastructure projects. These include Infrastructure Sustainability (Australia), CEEQUAL (UK) and Envision (USA). In addition, road-sector-specific sustainability rating schemes such as Greenroads (USA) and Invest (Australia) have also been developed. These schemes address several aspects of sustainability with varying emphasis (weightings) on areas such as use of resources; emission, pollution and waste; ecology; people and place; management and governance; and innovation. The attainment of sustainability in an infrastructure project depends largely on addressing whole-of-life environmental issues. This study has analysed the rating schemes' coverage of the different environmental components of road infrastructure under the five phases of a project: material, construction, use, maintenance and end-of-life. This is based on a comprehensive life cycle assessment (LCA) system boundary. The findings indicate that there is a need for the schemes to consider key (high-impact) life cycle environmental components, such as traffic congestion during construction, rolling resistance due to surface roughness and structural stiffness of the pavement, albedo, lighting, and end-of-life management (recycling), in order to deliver sustainable road projects.
Abstract:
Mammographic density (MD) adjusted for age and body mass index (BMI) is a strong heritable breast cancer risk factor; however, its biological basis remains elusive. Previous studies assessed MD-associated histology using random sampling approaches, despite evidence that high and low MD areas exist within a breast and are negatively correlated with respect to one another. We have used an image-guided approach to sample high and low MD tissues from within individual breasts to examine the relationship between histology and degree of MD. Image-guided sampling was performed using two different methodologies on mastectomy tissues (n = 12): (1) sampling of high and low MD regions within a slice, guided by bright (high MD) and dark (low MD) areas in a slice X-ray film; and (2) sampling of high and low MD regions within a whole breast using a stereotactically guided vacuum-assisted core biopsy technique. Pairwise analysis accounting for potential confounders (e.g. age, BMI and menopausal status) provides appropriate power for analysis despite the small sample size. High MD tissues had higher stromal (P = 0.002) and lower fat (P = 0.002) compositions, but there was no evidence of a difference in glandular areas (P = 0.084) compared to low MD tissues from the same breast. High MD regions had higher relative gland counts (P = 0.023), and a preponderance of Type I lobules in high MD compared to low MD regions was observed in 58% of subjects (n = 7), although this did not reach statistical significance. These findings clarify the histologic nature of high MD tissue and support hypotheses regarding the biophysical impact of dense connective tissue on mammary malignancy. They also provide important terms of reference for ongoing analyses of the underlying genetics of MD.