402 results for Gaussian probability function
Abstract:
In this paper, we analyze the SHAvite-3-512 hash function, as proposed and tweaked for round 2 of the SHA-3 competition. We present cryptanalytic results on 10 out of 14 rounds of the hash function SHAvite-3-512, and on the full 14-round compression function of SHAvite-3-512. We show a second preimage attack on the hash function reduced to 10 rounds with a complexity of 2^497 compression function evaluations and 2^16 memory. For the full 14-round compression function, we give a chosen-counter, chosen-salt preimage attack with 2^384 compression function evaluations and 2^128 memory (or complexity 2^448 without memory), and a collision attack with 2^192 compression function evaluations and 2^128 memory.
Abstract:
Many RFID protocols use cryptographic hash functions for their security. The resource-constrained nature of RFID systems forces the use of lightweight cryptographic algorithms. Tav-128 is one such 128-bit lightweight hash function, proposed by Peris-Lopez et al. for a low-cost RFID tag authentication protocol. Apart from some statistical tests for randomness by the designers themselves, Tav-128 has not undergone any other thorough security analysis. Based on these tests, the designers claimed that Tav-128 does not possess any trivial weaknesses. In this article, we carry out the first third-party security analysis of Tav-128 and show that this hash function is neither collision resistant nor second preimage resistant. First, we show a practical collision attack on Tav-128 with a complexity of 2^37 calls to the compression function, producing message pairs of arbitrary length which hash to the same value under this hash function. We then show a second preimage attack on Tav-128 which succeeds with a complexity of 2^62 calls to the compression function. Finally, we study the constituent functions of Tav-128 and show that the concatenation of the nonlinear functions A and B produces a 64-bit permutation from 32-bit messages. This could be a useful lightweight primitive for future RFID protocols.
Abstract:
The effects of reductions in cell wall lignin content, achieved by RNA interference suppression of coumaroyl 3'-hydroxylase, on plant growth, water transport, gas exchange, and photosynthesis were evaluated in hybrid poplar trees (Populus alba × grandidentata). The growth characteristics of the reduced-lignin trees were significantly impaired, resulting in smaller stems and reduced root biomass when compared to wild-type trees, as well as altered leaf morphology and architecture. The severe inhibition of cell wall lignification produced trees with a collapsed xylem phenotype and compromised vascular integrity; these trees displayed reduced hydraulic conductivity and a greater susceptibility to wall failure and cavitation. In the reduced-lignin trees, photosynthetic carbon assimilation and stomatal conductance were also greatly reduced; however, shoot xylem pressure potential and carbon isotope discrimination were higher and water-use efficiency was lower, inconsistent with water stress. Reductions in assimilation rate could not be ascribed to increased stomatal limitation. Analysis of starch and soluble sugars in leaves revealed that photosynthate was accumulating to high levels, suggesting that the trees with substantially reduced cell wall lignin were not carbon limited and that reductions in sink strength were, instead, limiting photosynthesis.
Abstract:
The commercialization of aerial image processing depends heavily on platforms such as UAVs (Unmanned Aerial Vehicles). However, the lack of an automated UAV forced landing site detection system has been identified as one of the main impediments to allowing UAV flight over populated areas in civilian airspace. This article proposes a UAV forced landing site detection system based on machine learning approaches, including the Gaussian Mixture Model and the Support Vector Machine. A range of learning parameters are analysed, including the number of Gaussian mixtures; the support vector kernel, covering linear, radial basis function (RBF), and polynomial kernels; and the order of the RBF and polynomial kernels. Moreover, a modified footprint operator is employed during feature extraction to better describe the geometric characteristics of the local area surrounding a pixel. The performance of the presented system is compared to a baseline UAV forced landing site detection system which uses edge features and an Artificial Neural Network (ANN) region-type classifier. Experiments conducted on aerial image datasets captured over typical urban environments reveal that improved landing site detection can be achieved with an SVM classifier with an RBF kernel using a combination of colour and texture features. Compared to the baseline system, the proposed system provides a significant improvement in the chance of detecting a safe landing area, and its performance is more stable than the baseline's in the presence of changes to the UAV altitude.
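The best-performing configuration reported above (an RBF-kernel SVM over colour and texture features) can be sketched with scikit-learn. A minimal sketch, assuming stand-in synthetic features — the paper's footprint operator and exact feature definitions are not given here:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Stand-in features: each candidate region is described by a 4-dimensional
# colour/texture vector; "safe" regions cluster away from "unsafe" ones.
safe = rng.normal(loc=0.0, scale=0.5, size=(50, 4))
unsafe = rng.normal(loc=3.0, scale=0.5, size=(50, 4))
X = np.vstack([safe, unsafe])
y = np.array([1] * 50 + [0] * 50)  # 1 = safe landing area

# RBF-kernel SVM region classifier, as in the configuration the
# abstract reports performing best.
clf = SVC(kernel="rbf", gamma="scale", C=1.0)
clf.fit(X, y)

# Classify a new feature vector near the "safe" cluster.
print(clf.predict([[0.1, -0.2, 0.0, 0.3]])[0])  # → 1
```

Swapping `kernel="rbf"` for `"linear"` or `"poly"` (with a `degree` parameter) reproduces the other kernel choices the article compares.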
Abstract:
One of the main challenges facing online and offline path planners is uncertainty in the magnitude and direction of the environmental energy, which is dynamic, changes with time, and is hard to forecast. This thesis develops an artificial intelligence method that enables a mobile robot to learn from historical or forecasted data on the environmental energy available in the area of interest, supporting persistent monitoring under uncertainty using the developed algorithm.
Abstract:
The NLM stream cipher, designed by Hoon Jae Lee, Sang Min Sung, and Hyeong Rag Kim, is a strengthened version of the LM summation generator that combines linear and non-linear feedback shift registers. In recent works, the NLM cipher has been used for message authentication in lightweight communication over wireless sensor networks and for RFID authentication protocols. This work analyses the security of the NLM stream cipher and of the NLM-MAC scheme that is built on top of the NLM cipher. We first show that the NLM cipher suffers from two major weaknesses that lead to key recovery and forgery attacks. We prove that the internal state of the NLM cipher can be recovered with time complexity about n^(log 7) × 2, where the total length of the internal state is 2n + 2 bits. The attack needs about n^2 key-stream bits. We also show that an adversary is able to forge any MAC tag very efficiently given only one (MAC tag, ciphertext) pair. The proposed attacks are practical and break the scheme with a negligible error probability.
Abstract:
So far, low-probability differentials for the key schedule of block ciphers have been used as a straightforward proof of security against related-key differential analysis. To achieve resistance, it is believed that for a cipher with a k-bit key it suffices for the upper bound on the probability to be 2^-k. Surprisingly, we show that this reasonable assumption is incorrect, and that the probability should be (much) lower than 2^-k. Our counterexample is a related-key differential analysis of the well-established block cipher CLEFIA-128. We show that although the key schedule of CLEFIA-128 prevents differentials with a probability higher than 2^-128, the linear part of the key schedule that produces the round keys, and the Feistel structure of the cipher, allow particularly chosen differentials with a probability as low as 2^-128 to be exploited. CLEFIA-128 has 2^14 such differentials, which translate to 2^14 pairs of weak keys. The probability of each differential is too low, but the weak keys have a special structure which allows a divide-and-conquer approach to gain an advantage of 2^7 over generic analysis. We exploit this advantage to give a membership test for the weak-key class and provide analysis of the hashing modes. The proposed analysis has been tested with computer experiments on small-scale variants of CLEFIA-128. Our results do not threaten the practical use of CLEFIA.
Abstract:
We consider online prediction problems where the loss between the prediction and the outcome is measured by the squared Euclidean distance or its generalization, the squared Mahalanobis distance. We derive the minimax solutions for the case where the prediction and action spaces are the simplex (this setup is sometimes called the Brier game) and the ℓ_2 ball (this setup is related to Gaussian density estimation). We show that in both cases the value of each sub-game is a quadratic function of a simple statistic of the state, with coefficients that can be computed efficiently using an explicit recurrence relation. The resulting deterministic minimax strategy and randomized maximin strategy are linear functions of the statistic.
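The two loss functions named above are straightforward to compute; a minimal numpy sketch, with illustrative vectors that are not taken from the paper:

```python
import numpy as np

def squared_euclidean(p, y):
    """Squared Euclidean distance ||p - y||^2."""
    d = np.asarray(p, dtype=float) - np.asarray(y, dtype=float)
    return float(d @ d)

def squared_mahalanobis(p, y, W):
    """Squared Mahalanobis distance (p - y)^T W (p - y) for a symmetric
    positive-definite W; W = I recovers the squared Euclidean distance."""
    d = np.asarray(p, dtype=float) - np.asarray(y, dtype=float)
    return float(d @ W @ d)

# Prediction on the simplex vs. a one-hot outcome (the Brier-game setting).
p = np.array([0.5, 0.3, 0.2])
y = np.array([1.0, 0.0, 0.0])
print(squared_euclidean(p, y))               # ≈ 0.38
print(squared_mahalanobis(p, y, np.eye(3)))  # ≈ 0.38 (W = I)
```

With a non-identity W, the Mahalanobis loss reweights the coordinates, which is the generalization the abstract refers to.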
Abstract:
Models of the mammalian clock have traditionally been based around two feedback loops: the self-repression of Per/Cry by interfering with activation by BMAL/CLOCK, and the repression of Bmal/Clock by the REV-ERB proteins. Recent experimental evidence suggests that the D-box, a transcription factor binding site associated with daytime expression, plays a larger role in clock function than has previously been understood. We present a simplified clock model that highlights the role of the D-box and illustrate an approach for finding maximum-entropy ensembles of model parameters, given experimentally imposed constraints. Parameter variability can be mitigated using prior probability distributions derived from genome-wide studies of cellular kinetics. Our model reproduces predictions concerning the dual regulation of Cry1 by the D-box and Rev-ErbA/ROR response element (RRE) promoter elements and allows for ensemble-based predictions of phase response curves (PRCs). Nonphotic signals such as Neuropeptide Y (NPY) may act by promoting Cry1 expression, whereas photic signals likely act by stimulating expression from the E/E' box. Ensemble generation with parameter probability restraints reveals more about a model's behavior than a single optimal parameter set.
Abstract:
It is commonly assumed that psychiatric violence is motivated by delusions, but here the concept of a reversed impetus is explored, to understand whether delusions are formed as ad-hoc or post-hoc rationalizations of behaviour or in advance of the actus reus. The reflexive violence model proposes that perceptual stimuli have motivational power and may trigger unwanted actions and hallucinations. The model is based on the theory of ecological perception, in which the opportunities enabled by an object are cues to act. As an apple triggers a desire to eat, a gun triggers a desire to shoot. These affordances (as they are called) are part of the perceptual apparatus; they allow the direct recognition of objects, and in emergencies they enable the fastest possible reactions. Even under normal circumstances, the presence of a weapon will trigger inhibited violent impulses, as will the presence of a victim. Under normal circumstances, however, these affordances do not become violent because negative action impulses are totally inhibited, whereas in psychotic illness negative action impulses are treated as emergencies and bypass frontal inhibitory circuits. What would have been object recognition becomes a blind automatic action. A range of mental illnesses can cause inhibition to be bypassed. At its most innocuous, this causes simple hallucinations (where the motivational power of an object is misattributed). But ecological perception may also have the power to trigger serious violence, of a kind that is devoid of motives or planning and is often shrouded in amnesia or post-rational delusions.
Abstract:
This paper presents an efficient noniterative method for distribution state estimation using the conditional multivariate complex Gaussian distribution (CMCGD). In the proposed method, the mean and standard deviation (SD) of the state variables are obtained in one step, considering load uncertainties, measurement errors, and load correlations. In this method, the bus voltages, branch currents, and injection currents are first represented by an MCGD using a direct load flow and a linear transformation. Then, the mean and SD of the bus voltages, or of other states, are calculated using the CMCGD and the estimation-of-variance method. The mean and SD of the pseudo measurements, as well as the spatial correlations between pseudo measurements, are modeled from historical data for different levels of the load duration curve. The proposed method can handle load uncertainties without resorting to time-consuming approaches such as Monte Carlo simulation. Simulation results for two case studies, a six-bus and a realistic 747-bus distribution network, show the effectiveness of the proposed method in terms of speed, accuracy, and quality compared with the conventional approach.
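The conditioning step underlying such methods uses the standard formulas for a partitioned jointly Gaussian vector: the conditional mean is mu1 + S12 S22^(-1) (x2 - mu2) and the conditional covariance is S11 - S12 S22^(-1) S21. A real-valued numpy sketch with illustrative numbers (the paper itself works with complex Gaussians over network states):

```python
import numpy as np

# Joint Gaussian over (x1, x2), partitioned so x1 is the state of
# interest (e.g. a bus voltage) and x2 is observed (e.g. a pseudo
# measurement). Values here are illustrative only.
mu1, mu2 = np.array([1.0]), np.array([0.0])
S11 = np.array([[2.0]])   # Cov(x1, x1)
S12 = np.array([[1.0]])   # Cov(x1, x2)
S22 = np.array([[4.0]])   # Cov(x2, x2)

x2 = np.array([2.0])      # observed value

# Conditional mean and covariance of x1 given x2 = x2_obs:
#   mu_c    = mu1 + S12 S22^{-1} (x2 - mu2)
#   Sigma_c = S11 - S12 S22^{-1} S21
K = S12 @ np.linalg.inv(S22)
mu_c = mu1 + K @ (x2 - mu2)
Sigma_c = S11 - K @ S12.T

print(mu_c[0])           # 1.5
print(Sigma_c[0, 0])     # 1.75, so conditional SD = sqrt(1.75)
```

The one-step, noniterative character of the approach comes from the fact that these are closed-form linear-algebra operations, with no sampling loop as in Monte Carlo.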