97 results for RANDOM GRAPHS


Relevance:

20.00%

Publisher:

Abstract:

True random number generation is crucial in hardware security applications. Proposed is a voltage-controlled true random number generator that is inherently field-programmable. The availability of more than one configuration state increases the entropy of the randomness source and lends itself to more compact, low-power architectures. The design is evaluated through electrical characterisation and statistically through industry-standard randomness tests. To the best of the author's knowledge, it is one of the most efficient designs to date with respect to hardware design metrics.
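As an illustration of the kind of industry-standard randomness testing mentioned above, here is a minimal sketch of the NIST SP 800-22 frequency (monobit) test; the bitstream is stood in for by Python's PRNG, and the function name is ours, not taken from the paper.

```python
import math
import random

def monobit_frequency_test(bits):
    """NIST SP 800-22 frequency (monobit) test: the proportions of ones and
    zeros in a random bitstream should be approximately equal."""
    n = len(bits)
    s = sum(1 if b else -1 for b in bits)    # map 1 -> +1, 0 -> -1 and sum
    s_obs = abs(s) / math.sqrt(n)
    return math.erfc(s_obs / math.sqrt(2))   # p-value; pass if >= 0.01

# Stand-in bitstream from Python's PRNG in place of real generator output.
print(monobit_frequency_test([random.getrandbits(1) for _ in range(10_000)]))
```

A full evaluation would run the complete SP 800-22 battery on much longer sequences captured from the hardware.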

Relevance:

20.00%

Publisher:

Abstract:

This paper introduces the discrete-choice paradigm of Random Regret Minimization (RRM) to the field of environmental and resource economics. The RRM approach has recently been developed in the context of travel demand modelling and presents a tractable, regret-based alternative to the dominant choice-modelling paradigm based on Random Utility Maximization (RUM) theory. We highlight how RRM-based models provide closed-form, logit-type formulations for choice probabilities that capture semi-compensatory behaviour and choice-set composition effects while being as parsimonious as their utilitarian counterparts. Using data from a Stated Choice experiment aimed at identifying valuations of characteristics of nature parks, we compare RRM-based and RUM-based models in terms of parameter estimates, goodness of fit, elasticities and the resulting policy implications.
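For concreteness, a minimal sketch of the classical logit-form RRM choice probabilities is given below; the attribute values, parameter values and function name are hypothetical and not taken from the paper's Stated Choice data.

```python
import numpy as np

def rrm_choice_probabilities(X, beta):
    """Classical RRM choice probabilities: the regret of alternative i sums,
    over competitors j and attributes m, ln(1 + exp(beta_m*(x_jm - x_im)));
    P(i) = exp(-R_i) / sum_j exp(-R_j)."""
    J = X.shape[0]
    R = np.zeros(J)
    for i in range(J):
        for j in range(J):
            if j != i:
                R[i] += np.log1p(np.exp(beta * (X[j] - X[i]))).sum()
    expneg = np.exp(-R)
    return expneg / expneg.sum()

# Toy example: three hypothetical nature parks described by two attributes.
X = np.array([[3.0, 1.0], [2.0, 2.0], [1.0, 3.0]])
beta = np.array([0.8, 0.5])
print(rrm_choice_probabilities(X, beta))
```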

Relevance:

20.00%

Publisher:

Abstract:

A conceptual model is described for generating distributions of grazing animals, according to their searching behavior, to investigate the mechanisms animals may use to achieve their distributions. The model simulates behaviors ranging from random diffusion, through taxis and cognitively aided navigation (i.e., using memory), to the optimization extreme of the Ideal Free Distribution. These behaviors are generated from simulation of biased diffusion that operates at multiple scales simultaneously, formalizing ideas of multiple-scale foraging behavior. It uses probabilistic bias to represent decisions, allowing multiple search goals to be combined (e.g., foraging and social goals) and the representation of suboptimal behavior. By allowing bias to arise at multiple scales within the environment, each weighted relative to the others, the model can represent different scales of simultaneous decision-making and scale-dependent behavior. The model also allows different constraints to be applied to the animal's ability (e.g., applying food-patch accessibility and information limits). Simulations show that foraging-decision randomness and spatial scale of decision bias have potentially profound effects on both animal intake rate and the distribution of resources in the environment. Spatial variograms show that foraging strategies can differentially change the spatial pattern of resource abundance in the environment to one characteristic of the foraging strategy.
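A toy, single-scale sketch of biased diffusion toward resources may help fix ideas; the paper's model combines such biases across multiple scales and goals, whereas the grid, parameter values and depletion rule below are our own illustrative assumptions.

```python
import numpy as np

def biased_forage(resource, steps=1000, bias=2.0, seed=0):
    """Biased random walk on a grid: at each step the animal moves to one of
    the four neighbouring cells with probability proportional to
    exp(bias * resource), then depletes the cell it lands on.
    bias = 0 gives pure random diffusion; a large bias approaches taxis."""
    rng = np.random.default_rng(seed)
    ny, nx = resource.shape
    y, x = ny // 2, nx // 2
    intake = 0.0
    for _ in range(steps):
        nbrs = [((y - 1) % ny, x), ((y + 1) % ny, x),
                (y, (x - 1) % nx), (y, (x + 1) % nx)]
        w = np.array([np.exp(bias * resource[p]) for p in nbrs])
        y, x = nbrs[rng.choice(4, p=w / w.sum())]
        gain = 0.5 * resource[y, x]          # consume half of what is found
        resource[y, x] -= gain
        intake += gain
    return intake

resource = np.random.default_rng(1).random((50, 50))
print(biased_forage(resource))
```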

Relevance:

20.00%

Publisher:

Abstract:

A multivariate Fokker-Planck-type kinetic equation modeling a test particle weakly interacting with an electrostatic plasma, in the presence of a magnetic field B, is analytically solved in an Ornstein-Uhlenbeck-type approximation. A new set of analytic expressions is obtained for the variable moments and particle density as functions of time. The process is diffusive.
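For orientation, the scalar Ornstein-Uhlenbeck prototype behind such an approximation is sketched below in generic notation (gamma, D, v_0 are not the paper's symbols, and the paper treats the full multivariate, magnetised case).

```latex
% Scalar Ornstein--Uhlenbeck prototype (generic symbols \gamma, D, v_0)
\[
  dv = -\gamma\, v\, dt + \sqrt{2D}\, dW_t, \qquad
  \langle v(t)\rangle = v_0\, e^{-\gamma t}, \qquad
  \operatorname{Var} v(t) = \frac{D}{\gamma}\bigl(1 - e^{-2\gamma t}\bigr),
\]
\[
  \langle (\Delta x)^2 \rangle \simeq \frac{2D}{\gamma^{2}}\, t
  \quad (t \gg 1/\gamma), \quad \text{i.e. diffusive at long times.}
\]
```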

Relevance:

20.00%

Publisher:

Abstract:

In order to relate macroscopic random motion (described e.g. by Langevin-type theories) to microscopic dynamics, we have undertaken the derivation of a Fokker-Planck-type equation from first microscopic principles. Both subsystems are subject to an external force field. Explicit expressions for the diffusion and drift coefficients are obtained, in terms of the field.
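As a reference point, a derivation of this kind supplies the drift and diffusion coefficients of a generic Fokker-Planck form such as the one-dimensional version below (generic notation, not the paper's expressions).

```latex
% Generic one-dimensional Fokker--Planck form; A(v) and D(v) are the drift
% and diffusion coefficients that such a microscopic derivation supplies.
\[
  \frac{\partial f(v,t)}{\partial t}
    = -\frac{\partial}{\partial v}\bigl[A(v)\, f(v,t)\bigr]
      + \frac{\partial^{2}}{\partial v^{2}}\bigl[D(v)\, f(v,t)\bigr].
\]
```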

Relevance:

20.00%

Publisher:

Abstract:

A new nonlinear theory for the perpendicular transport of charged particles is presented. This approach is based on an improved nonlinear treatment of field line random walk in combination with a generalized compound diffusion model, which is much more systematic and reliable than previous theories. Furthermore, the new theory shows remarkably good agreement with test-particle simulations and heliospheric observations.
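As background, the classical compound-diffusion heuristic that such models generalize can be sketched as follows (a standard scaling argument in generic notation, not the paper's result).

```latex
% Field lines random-walk with coefficient \kappa_{\mathrm{FL}} per unit
% length along the mean field; particles diffuse along the field with
% coefficient \kappa_\parallel.
\[
  \langle (\Delta x)^{2} \rangle
    \simeq 2\,\kappa_{\mathrm{FL}}\,\langle |\Delta z| \rangle
    \propto \kappa_{\mathrm{FL}} \sqrt{\kappa_\parallel\, t},
\]
% i.e. perpendicular transport is subdiffusive (\propto \sqrt{t}) in this
% limit; the nonlinear theory refines this picture.
```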

Relevance:

20.00%

Publisher:

Abstract:

We compare two approaches for estimating the distribution of consumers' willingness to pay (WTP) in discrete choice models. The usual procedure is to estimate the distribution of the utility coefficients and then derive the distribution of WTP, which is the ratio of coefficients. The alternative is to estimate the distribution of WTP directly. We apply both approaches to data on site choice in the Alps. We find that the alternative approach fits the data better, reduces the incidence of exceedingly large estimated WTP values, and provides the analyst with greater control in specifying and testing the distribution of WTP. © 2008 Agricultural and Applied Economics Association.
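A brief sketch of the two parameterizations being compared, in generic mixed-logit notation (the paper's exact specification may differ): in the first, "preference-space" form WTP is a ratio of random coefficients; in the second, "WTP-space" form a distribution is specified directly for the WTP vector.

```latex
% Preference-space vs. WTP-space parameterisations (generic notation):
% p is price, x the non-price attributes.
\[
  U_{nj} = -\alpha_n\, p_{nj} + \beta_n' x_{nj} + \varepsilon_{nj},
  \qquad \mathrm{WTP}_n = \beta_n / \alpha_n ,
\]
\[
  U_{nj} = -\alpha_n \bigl( p_{nj} - w_n' x_{nj} \bigr) + \varepsilon_{nj},
  \qquad \text{with } w_n \text{ distributed directly.}
\]
```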

Relevance:

20.00%

Publisher:

Abstract:

Coronary heart disease is the commonest cause of death in Northern Ireland, but few data exist on the incidence of risk factors in young adult students and non-students.

Relevance:

20.00%

Publisher:

Abstract:

The use of bit-level systolic array circuits as building blocks in the construction of larger word-level systolic systems is investigated. It is shown that the overall structure and detailed timing of such systems may be derived quite simply using the dependence graph and cut-set procedure developed by S. Y. Kung (1988). This provides an attractive and intuitive approach to the bit-level design of many VLSI signal processing components. The technique can be applied to ripple-through and partly pipelined circuits as well as fully systolic designs. It therefore provides a means of examining the relative tradeoff between levels of pipelining, chip area, power consumption, and throughput rate within a given VLSI design.
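To illustrate the systolic principle (at word level, not the paper's bit-level circuits), here is a cycle-accurate toy model of a fully systolic FIR filter in which each processing element holds one coefficient and both data and partial sums move through registers; the function name and register arrangement are our own illustrative choices.

```python
def systolic_fir(h, x):
    """Cycle-accurate toy model of a fully systolic FIR filter: each PE holds
    one coefficient h[k]; input samples pass through two registers per stage
    and partial sums through one, so adjacent PEs see consecutive samples."""
    n = len(h)
    x_pipe = [0] * (2 * n)          # x-path registers (two per PE)
    y_pipe = [0] * n                # partial-sum registers (one per PE)
    out = []
    latency = n - 1
    for sample in list(x) + [0] * latency:      # pad to flush the pipeline
        new_y = [0] * n
        for k in range(n):
            x_k = sample if k == 0 else x_pipe[2 * k - 1]
            y_in = 0 if k == 0 else y_pipe[k - 1]
            new_y[k] = y_in + h[k] * x_k        # multiply-accumulate in PE k
        x_pipe = [sample] + x_pipe[:-1]         # clock edge: shift x registers
        y_pipe = new_y                          # latch partial sums
        out.append(y_pipe[-1])
    return out[latency:]                        # drop pipeline-fill outputs

# y[n] = sum_k h[k]*x[n-k]; e.g. systolic_fir([1, 2], [1, 0, 0]) -> [1, 2, 0]
print(systolic_fir([1, 2], [1, 0, 0]))
```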

Relevance:

20.00%

Publisher:

Abstract:

The highly structured nature of many digital signal processing operations allows these to be directly implemented as regular VLSI circuits. This feature has been successfully exploited in the design of a number of commercial chips, some examples of which are described. While many of the architectures on which such chips are based were originally derived on a heuristic basis, there is an increasing interest in the development of systematic design techniques for the direct mapping of computations onto regular VLSI arrays. The purpose of this paper is to show how the technique proposed by Kung can be readily extended to the design of VLSI signal processing chips where the organisation of computations at the level of individual data bits is of paramount importance. The technique in question allows architectures to be derived using the projection and retiming of data dependence graphs.
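A minimal sketch of the projection-and-scheduling idea, using a matrix-vector product as a stand-in example (our own toy, not one of the chips described): projecting the 2-D dependence graph along one index gives one processing element per row, and a linear schedule t(i, j) = i + j orders the multiply-accumulates.

```python
def systolic_matvec(A, x):
    """Toy space-time mapping of y[i] = sum_j A[i][j]*x[j]: projecting the
    2-D dependence graph along j gives one PE per row i, and the linear
    schedule t(i, j) = i + j fixes when each multiply-accumulate fires."""
    n = len(A)
    y = [0] * n
    for t in range(2 * n - 1):            # global clock
        for i in range(n):                # PE i
            j = t - i                     # node (i, j) scheduled at t = i + j
            if 0 <= j < n:
                y[i] += A[i][j] * x[j]    # one MAC per PE per cycle
    return y

print(systolic_matvec([[1, 2], [3, 4]], [5, 6]))   # -> [17, 39]
```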

Relevance:

20.00%

Publisher:

Abstract:

The treatment of the Random-Phase Approximation Hamiltonians encountered in different frameworks, such as time-dependent density functional theory or the Bethe-Salpeter equation, is complicated by their non-Hermiticity. Compared to their Hermitian counterparts, computational methods for the treatment of non-Hermitian Hamiltonians are often less efficient and less stable, sometimes leading to the breakdown of the method. Recently [Gruning et al., Nano Lett. 8 (2009) 2820], we have identified that such Hamiltonians are usually pseudo-Hermitian. Exploiting this property, we have implemented an algorithm of the Lanczos type for Random-Phase Approximation Hamiltonians that benefits from the same stability and computational load as its Hermitian counterpart, and applied it to the study of the optical response of carbon nanotubes. We present here the related theoretical grounds and technical details, and study the performance of the algorithm for the calculation of the optical absorption of a molecule within the Bethe-Salpeter equation framework. (C) 2011 Elsevier B.V. All rights reserved.
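For context, the pseudo-Hermitian structure being exploited can be sketched as follows (the standard RPA/BSE block form, in generic notation): with A Hermitian and B symmetric, the Hamiltonian satisfies a similarity relation with its adjoint, which is what allows a Hermitian-like Lanczos recursion to be used.

```latex
% Standard RPA/BSE block structure (A Hermitian, B symmetric), showing the
% pseudo-Hermiticity exploited by the Lanczos-type algorithm.
\[
  H = \begin{pmatrix} A & B \\ -B^{*} & -A^{*} \end{pmatrix},
  \qquad
  \eta = \begin{pmatrix} \mathbb{1} & 0 \\ 0 & -\mathbb{1} \end{pmatrix},
  \qquad
  \eta\, H\, \eta^{-1} = H^{\dagger}.
\]
```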

Relevance:

20.00%

Publisher:

Abstract:

A numerical method is developed to simulate complex two-dimensional crack propagation in quasi-brittle materials, taking into account random heterogeneous fracture properties. Potential cracks are represented by pre-inserted cohesive elements whose tension and shear softening constitutive laws are modelled by spatially varying Weibull random fields. Monte Carlo simulations of a concrete specimen under uni-axial tension were carried out, with extensive investigation of the effects of important numerical algorithms and material properties on numerical efficiency and stability, crack propagation processes and load-carrying capacities. It was found that the homogeneous model led to incorrect crack patterns and load-displacement curves with strong mesh-dependence, whereas the heterogeneous model predicted realistic, complicated fracture processes and load-carrying capacities with little mesh-dependence. Increasing the variance of the tensile-strength random fields, i.e. the degree of heterogeneity, reduced the mean peak load and increased its standard deviation. The developed method provides a simple but effective tool for assessing structural reliability and calculating characteristic material strengths for structural design.
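As a sketch of one common way to realise spatially varying random fields with Weibull marginals (a translated, smoothed Gaussian field; the paper's generation method may differ, and all parameter values and names below are illustrative):

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from scipy.stats import norm, weibull_min

def weibull_strength_field(shape=(64, 64), corr_len=4.0,
                           shape_k=6.0, scale_lam=3.0, seed=0):
    """Spatially correlated tensile-strength field with Weibull marginals,
    built by smoothing a Gaussian field and mapping it through the Weibull
    inverse CDF (illustrative only)."""
    rng = np.random.default_rng(seed)
    g = gaussian_filter(rng.standard_normal(shape), sigma=corr_len)
    g = (g - g.mean()) / g.std()              # re-standardise after smoothing
    u = norm.cdf(g)                           # uniform marginals via copula
    return weibull_min.ppf(u, shape_k, scale=scale_lam)

# Each Monte Carlo realisation would assign these values to the cohesive
# elements' tensile strengths before running the fracture simulation.
strengths = weibull_strength_field()
print(strengths.mean(), strengths.std())
```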