905 results for Fold Block-designs
Abstract:
In this paper we investigate the differential properties of block ciphers in hash function modes of operation. First we show the impact of differential trails for block ciphers on collision attacks for various hash function constructions based on block ciphers. Further, we prove a lower bound on the complexity of finding a pair that follows a given truncated differential in the case of a random permutation. Then we present open-key differential distinguishers for some well-known round-reduced block ciphers.
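The lower bound mentioned above concerns how often a random permutation sends a pair with a fixed input difference into a truncated output difference class. A minimal sketch of the underlying probability, assuming toy 8-bit permutations and a hypothetical truncation mask (not the paper's parameters), estimates it by simulation:

```python
import random

def truncated_match_prob(n_bits, mask, trials=5000, seed=1):
    """Estimate the probability that a fresh random permutation maps a
    fixed nonzero input difference to an output difference that is zero
    on the bits of `mask`. For an ideal permutation this is close to
    2**-popcount(mask)."""
    rng = random.Random(seed)
    size = 1 << n_bits
    hits = 0
    for _ in range(trials):
        perm = list(range(size))
        rng.shuffle(perm)          # a fresh random permutation each trial
        x = rng.randrange(size)
        y = x ^ 1                  # fixed input difference of 1
        if (perm[x] ^ perm[y]) & mask == 0:
            hits += 1
    return hits / trials
```

With a 4-bit mask the estimate should sit near 15/255 ≈ 0.059, slightly below 2⁻⁴ because a permutation never produces a zero output difference.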
Abstract:
Dose-finding designs estimate the dose level of a drug based on observed adverse events. Relatedness of the adverse event to the drug has been generally ignored in all proposed design methodologies. These designs assume that the adverse events observed during a trial are definitely related to the drug, which can lead to flawed dose-level estimation. We incorporate adverse event relatedness into the so-called continual reassessment method. Adverse events that have ‘doubtful’ or ‘possible’ relationships to the drug are modelled using a two-parameter logistic model with an additive probability mass. Adverse events ‘probably’ or ‘definitely’ related to the drug are modelled using a cumulative logistic model. To search for the maximum tolerated dose, we use the maximum estimated toxicity probability of these two adverse event relatedness categories. We conduct a simulation study that illustrates the characteristics of the design under various scenarios. This article demonstrates that adverse event relatedness is important for improved dose estimation. It opens up further research pathways into continual reassessment design methodologies.
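The selection rule described above can be sketched in a few lines. This is only an illustrative toy, with made-up logistic parameters and an arbitrary additive mass (the authors' fitted models are not reproduced here): one curve for doubtful/possible adverse events, one for probable/definite events, and the dose whose maximum of the two estimated toxicity probabilities is closest to the target is selected.

```python
import math

def logistic(a, b, dose):
    return 1.0 / (1.0 + math.exp(-(a + b * dose)))

def select_dose(doses, target=0.25, a1=-3.0, b1=0.8, mass=0.05, a2=-4.0, b2=1.0):
    """Toy dose-selection rule: doubtful/possible AEs follow a two-parameter
    logistic plus an additive probability mass; probable/definite AEs follow
    a second logistic. The dose whose maximum estimated toxicity is closest
    to the target is chosen. All numeric parameters are hypothetical."""
    def max_tox(d):
        p_doubtful = min(1.0, logistic(a1, b1, d) + mass)
        p_definite = logistic(a2, b2, d)
        return max(p_doubtful, p_definite)
    return min(doses, key=lambda d: abs(max_tox(d) - target))
```

For the default toy parameters and a 25% toxicity target over doses 1-5, dose 2 is chosen (its maximum estimated toxicity, about 0.248, is nearest the target).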
Abstract:
This paper presents a framework for the design of a joint motion controller and a control allocation strategy for dynamic positioning of marine vehicles. The key aspects of the proposed designs are a systematic approach to dealing with actuator saturation and to informing the motion controller about saturation. The proposed system uses a mapping that translates the actuator constraint sets into constraint sets at the motion controller level. Hence, while the motion controller addresses the constraints, the control allocation algorithm can solve an unconstrained optimisation problem. The constrained control design is approached using a multivariable anti-windup strategy for strictly proper controllers, which is applicable to the implementation of PI- and PID-type motion controllers.
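A minimal single-input sketch of the anti-windup idea, in its generic textbook back-calculation form rather than the paper's multivariable scheme: the integrator is driven by the tracking error plus a term proportional to the gap between the saturated and unsaturated control, so it unwinds whenever the actuator saturates.

```python
def pi_antiwindup_step(integ, error, kp, ki, kaw, u_min, u_max, dt):
    """One step of a PI controller with back-calculation anti-windup.
    `integ` is the integrator state; `kaw` is the back-calculation gain.
    Returns the updated integrator state and the saturated command."""
    u = kp * error + ki * integ            # unconstrained PI output
    u_sat = min(max(u, u_min), u_max)      # actuator constraint set
    # integrator sees the saturation deficit, so it stops winding up:
    integ += dt * (error + kaw * (u_sat - u))
    return integ, u_sat
```

With a large error and the actuator pinned at its limit, the (u_sat - u) term nearly cancels the error drive, keeping the integrator small.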
Abstract:
This chapter investigates a variety of water quality assessment tools for reservoirs with balanced or unbalanced monitoring designs, and focuses on providing informative water quality assessments so that decision-makers can make risk-informed management decisions about reservoir health. In particular, two water quality assessment methods are described: non-compliance (the probability of the indicator exceeding the recommended guideline) and amplitude (the degree of departure from the guideline). Strengths and weaknesses of current and alternative water quality methods are discussed. The proposed methodology is particularly applicable to unbalanced designs with or without missing values; it reflects general conditions and is not swayed too heavily by the occasional extreme value (very high or very low quality). To investigate the issues in greater detail, we use a reservoir within South-East Queensland (SEQ), Australia, as a case study. The purpose is to obtain an annual score that reflects the overall water quality, temporally, spatially and across water quality indicators, for each reservoir.
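One way the two scores could be computed (the chapter's exact formulas are not reproduced here; this is a hedged sketch that skips missing values, as an unbalanced design requires, and uses the median so occasional extremes do not dominate):

```python
import statistics

def non_compliance(values, guideline):
    """Fraction of available observations exceeding the guideline.
    `None` entries stand in for missing values in an unbalanced design."""
    obs = [v for v in values if v is not None]
    return sum(v > guideline for v in obs) / len(obs)

def amplitude(values, guideline):
    """Median degree of departure from the guideline among exceedances;
    the median keeps the score robust to the occasional extreme value."""
    exc = [v - guideline for v in values if v is not None and v > guideline]
    return statistics.median(exc) if exc else 0.0
```

For example, observations [1.0, 2.5, missing, 3.0, 0.5] against a guideline of 2.0 give a non-compliance of 0.5 and an amplitude of 0.75.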
Abstract:
The paper provides a systematic approach to designing the laboratory phase of a multiphase experiment, taking into account previous phases. General principles are outlined for experiments in which orthogonal designs can be employed. Multiphase experiments occur widely, although their multiphase nature is often not recognized. The need to randomize the material produced from the first phase in the laboratory phase is emphasized. Factor-allocation diagrams are used to depict the randomizations in a design, and the use of skeleton analysis-of-variance (ANOVA) tables to evaluate their properties is discussed. The methods are illustrated using a scenario and a case study. A basis for categorizing designs is suggested. This article has supplementary material online.
Abstract:
Several recently proposed ciphers, for example Rijndael and Serpent, are built with layers of small S-boxes interconnected by linear key-dependent layers. Their security relies on the fact that the classical methods of cryptanalysis (e.g. linear or differential attacks) are based on probabilistic characteristics, which makes their security grow exponentially with the number of rounds N_r. In this paper we study the security of such ciphers under an additional hypothesis: the S-box can be described by an overdefined system of algebraic equations (true with probability 1). We show that this is true for both Serpent (due to the small size of its S-boxes) and Rijndael (due to unexpected algebraic properties). We study general methods known for solving overdefined systems of equations, such as XL from Eurocrypt '00, and show their inefficiency. Then we introduce a new method called XSL that uses the sparsity of the equations and their specific structure. The XSL attack uses only relations true with probability 1, and thus the security does not have to grow exponentially in the number of rounds. XSL has a parameter P, and from our estimations it seems that P should be a constant or grow very slowly with the number of rounds. The XSL attack would then be polynomial (or subexponential) in N_r, with a huge constant that is double-exponential in the size of the S-box. The exact complexity of such attacks is not known due to the redundant equations. Though the presented version of the XSL attack always costs more than exhaustive search for Rijndael, it seems to (marginally) break 256-bit Serpent. We suggest a new criterion for the design of S-boxes in block ciphers: they should not be describable by a system of polynomial equations that is too small or too overdefined.
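The "overdefined system" property is easy to check mechanically. A minimal sketch, assuming a toy 3-bit S-box (an arbitrary permutation, not Serpent's or Rijndael's): evaluate every quadratic monomial in the input and output bits at all inputs, and count the linear dependencies over GF(2); each dependency is a quadratic equation the S-box satisfies with probability 1.

```python
from itertools import combinations

def gf2_rank(rows_int):
    """Rank of GF(2) row vectors, each packed into an int (xor basis)."""
    basis = {}  # highest set bit -> basis vector
    rank = 0
    for v in rows_int:
        while v:
            h = v.bit_length() - 1
            if h not in basis:
                basis[h] = v
                rank += 1
                break
            v ^= basis[h]
    return rank

def quadratic_equation_count(sbox, n):
    """Number of linearly independent quadratic equations over GF(2) that
    an n-bit S-box satisfies with probability 1. One row per input x,
    holding the values of the monomials 1, x_i, y_i and all degree-2
    products of those bits; dependencies among the columns are equations."""
    rows = []
    for x in range(1 << n):
        y = sbox[x]
        bits = [(x >> i) & 1 for i in range(n)] + [(y >> i) & 1 for i in range(n)]
        row = [1] + bits + [a & b for a, b in combinations(bits, 2)]
        rows.append(sum(bit << j for j, bit in enumerate(row)))
    n_monomials = 1 + 2 * n + (2 * n) * (2 * n - 1) // 2
    return n_monomials - gf2_rank(rows)
```

For n = 3 there are 22 monomials but only 8 inputs, so at least 14 independent quadratic equations always exist: any small S-box is overdefined in this sense, which is the abstract's design criterion in miniature.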
Abstract:
The deformation of a rectangular block into an annular wedge is studied with respect to the state of swelling interior to the block. Nonuniform swelling fields are shown to generate these flexure deformations in the absence of resultant forces and bending moments. Analytical expressions for the deformation fields demonstrate these effects for both incompressible and compressible generalizations of conventional hyperelastic materials. Existing results in the absence of a swelling agent are recovered as special cases.
Abstract:
Unique features and benefits of the plasma-aided nanofabrication are considered by using the "plasma-building block" approach, which is based on plasma diagnostics and nanofilm characterization, cross-referenced by numerical simulation of generation and dynamics of building blocks in the gas phase, their interaction with nanostructured surfaces, and ab initio simulation of chemical structure of relevant nanoassemblies. The examples include carbon nanotip microemitter structures, semiconductor quantum dots and nanowires synthesized in the integrated plasma-aided nanofabrication facility.
Abstract:
A computationally efficient sequential Monte Carlo algorithm is proposed for the sequential design of experiments for the collection of block data described by mixed effects models. The difficulty in applying a sequential Monte Carlo algorithm in such settings is the need to evaluate the observed data likelihood, which is typically intractable for all but linear Gaussian models. To overcome this difficulty, we propose to unbiasedly estimate the likelihood, and to perform inference and make decisions based on an exact-approximate algorithm. Two estimators are proposed: one using quasi-Monte Carlo methods and one using the Laplace approximation with importance sampling. Both of these approaches can be computationally expensive, so we propose exploiting parallel computational architectures to ensure designs can be derived in a timely manner. We also extend our approach to allow for model uncertainty. This research is motivated by important pharmacological studies related to the treatment of critically ill patients.
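The key ingredient, an unbiased estimate of the intractable observed-data likelihood, can be illustrated with plain Monte Carlo rather than the paper's quasi-Monte Carlo or Laplace-with-importance-sampling estimators. A minimal sketch, assuming a toy logistic random-intercept model for one block (hypothetical model, not the pharmacological application):

```python
import math
import random

def mc_likelihood(y_block, beta, sigma, draws=2000, seed=0):
    """Unbiased Monte Carlo estimate of the observed-data likelihood of
    one block under b ~ N(0, sigma^2), y_j | b ~ Bernoulli(logit^-1(beta + b)).
    Averaging p(y | b_i) over prior draws b_i is unbiased for the integral
    that an exact-approximate SMC algorithm needs."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(draws):
        b = rng.gauss(0.0, sigma)                 # draw the random effect
        p = 1.0 / (1.0 + math.exp(-(beta + b)))   # within-block success prob.
        lik = 1.0
        for yj in y_block:                        # conditional likelihood
            lik *= p if yj == 1 else (1.0 - p)
        total += lik
    return total / draws
```

When sigma = 0 the random effect vanishes and the estimate collapses to the exact likelihood, which makes the estimator easy to sanity-check.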
Principles in the design of multiphase experiments with a later laboratory phase: Orthogonal designs
Abstract:
A critical stage in open-pit mining is determining the optimal extraction sequence of blocks, which has a significant impact on mining profitability. In this paper, a more comprehensive block sequencing optimisation model is developed for open-pit mines. In the model, material characteristics of blocks, grade control, excavators and block sequencing are investigated and integrated to maximise the short-term benefit of mining. Several case studies are modelled and solved with the CPLEX MIP and CP engines. Numerical investigations are presented to illustrate and validate the proposed methodology.
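The paper solves its model with CPLEX MIP and CP engines; as a hedged illustration of the sequencing objective only, a toy greedy sequencer (hypothetical block data, no grade-control or excavator constraints) extracts the most valuable currently accessible block, where a block becomes accessible once its precedence set, e.g. the blocks above it, has been removed:

```python
def greedy_sequence(values, precedes):
    """values: block -> economic value.
    precedes: block -> set of blocks that must be extracted first.
    Repeatedly extract the accessible block of highest value."""
    done, order = set(), []
    remaining = set(values)
    while remaining:
        ready = [b for b in remaining if precedes.get(b, set()) <= done]
        if not ready:
            break  # every remaining block is still covered
        best = max(ready, key=lambda b: values[b])
        order.append(best)
        done.add(best)
        remaining.remove(best)
    return order
```

Greedy sequencing is myopic; the point of a MIP/CP formulation is precisely to trade off such local choices against the whole short-term plan.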
Abstract:
Background: Rapid diagnostic tests (RDTs) for detection of Plasmodium falciparum infection that target P. falciparum histidine-rich protein 2 (PfHRP2), a protein that circulates in the blood of patients infected with this species of malaria, are widely used to guide case management. Understanding the determinants of PfHRP2 availability in circulation is therefore essential to understanding the performance of PfHRP2-detecting RDTs.
Methods: This study investigated the possibility that pre-formed host anti-PfHRP2 antibodies may block target antigen detection, thereby causing false negative test results.
Results: Anti-PfHRP2 antibodies were detected in 19/75 (25%) of plasma samples collected from patients with acute malaria from Cambodia, Nigeria and the Philippines, as well as in 3/28 (10.7%) of asymptomatic Solomon Islands residents. Pre-incubation of plasma samples from subjects with high-titre anti-PfHRP2 antibodies with soluble PfHRP2 blocked the detection of the target antigen on two of the three brands of RDTs tested, leading to false negative results. Pre-incubation of the plasma with intact parasitized erythrocytes resulted in a reduction of band intensity at the highest parasite density, and a ten-fold reduction of the lower detection threshold on all three brands of RDTs tested.
Conclusions: These observations indicate possibly reduced sensitivity of P. falciparum malaria diagnosis using PfHRP2-detecting RDTs among people with high levels of specific antibodies and low-density infection, as well as possible interference with tests configured to detect soluble PfHRP2 in saliva or urine samples. Further investigations are required to assess the impact of pre-formed anti-PfHRP2 antibodies on RDT performance in different transmission settings.
Abstract:
There is a wide range of potential study designs for intervention studies to decrease nosocomial infections in hospitals. The analysis is complex due to competing events, clustering, multiple timescales and time-dependent period and intervention variables. This review considers the popular pre-post quasi-experimental design and compares it with randomized designs. Randomization can be done in several ways: randomization of the cluster [intensive care unit (ICU) or hospital] in a parallel design; randomization of the sequence in a cross-over design; and randomization of the time of intervention in a stepped-wedge design. We introduce each design in the context of nosocomial infections and discuss the designs with respect to the following key points: bias, control for nonintervention factors, and generalizability. Statistical issues are discussed. A pre-post-intervention design is often the only choice that will be informative for a retrospective analysis of an outbreak setting. It can be seen as a pilot study with further, more rigorous designs needed to establish causality. To yield internally valid results, randomization is needed. Generally, the first choice in terms of the internal validity should be a parallel cluster randomized trial. However, generalizability might be stronger in a stepped-wedge design because a wider range of ICU clinicians may be convinced to participate, especially if there are pilot studies with promising results. For analysis, the use of extended competing risk models is recommended.
Abstract:
Preneel, Govaerts and Vandewalle (PGV) analysed the security of single-block-length block cipher based compression functions assuming that the underlying block cipher has no weaknesses. They showed that 12 out of 64 possible compression functions are collision and (second) preimage resistant. Black, Rogaway and Shrimpton formally proved this result in the ideal cipher model. However, in the indifferentiability security framework introduced by Maurer, Renner and Holenstein, all these 12 schemes are easily differentiable from a fixed input-length random oracle (FIL-RO) even when their underlying block cipher is ideal. We address the problem of building indifferentiable compression functions from the PGV compression functions. We consider a general form of 64 PGV compression functions and replace the linear feed-forward operation in this generic PGV compression function with an ideal block cipher independent of the one used in the generic PGV construction. This modified construction is called a generic modified PGV (MPGV). We analyse indifferentiability of the generic MPGV construction in the ideal cipher model and show that 12 out of 64 MPGV compression functions in this framework are indifferentiable from a FIL-RO. To our knowledge, this is the first result showing that two independent block ciphers are sufficient to design indifferentiable single-block-length compression functions.
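The data flow can be sketched for one of the 12 secure PGV schemes (Davies-Meyer) against one plausible reading of the modified construction, in which the linear feed-forward of the chaining value is replaced by a call to a second, independent cipher. The "cipher" below is a hash-derived stand-in for illustration only: it is not a real block cipher, not even invertible, and the exact MPGV composition is not pinned down by the abstract.

```python
import hashlib

def toy_cipher(tag, key, block):
    """Stand-in for an ideal 8-byte block cipher (NOT a real cipher):
    output derived from a hash of (tag, key, block). `tag` distinguishes
    the two independent ciphers the modified construction requires."""
    return hashlib.sha256(tag + key + block).digest()[:8]

def davies_meyer(h, m):
    """Classic single-block-length PGV scheme: H_i = E_m(H_{i-1}) xor H_{i-1},
    i.e. a linear feed-forward of the chaining value."""
    c = toy_cipher(b"E1", m, h)
    return bytes(a ^ b for a, b in zip(c, h))

def mpgv(h, m):
    """Hypothetical reading of the modification: the feed-forward xor of
    H_{i-1} is replaced by the output of an independent second cipher."""
    c = toy_cipher(b"E1", m, h)
    return bytes(a ^ b for a, b in zip(c, toy_cipher(b"E2", m, h)))
```

Both maps compress a chaining value and a message block to a fixed 8-byte output; the indifferentiability result concerns only constructions of the second shape, with two independent ideal ciphers.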