11 results for coalescing random walk
at Cochin University of Science
Abstract:
In a number of situations the observations one encounters are directions. The first inferential question to answer when dealing with such data is, "Are they isotropic or uniformly distributed?" The answer to this question goes back in history, which we shall retrace briefly before providing exact and approximate solutions to the so-called "Pearson's Random Walk" problem.
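For illustration only (not part of the abstract), the following Python sketch simulates a Pearson random walk of unit steps in uniformly random directions and applies the large-sample Rayleigh test of isotropy to the step directions; the function names and the chi-square approximation (under uniformity, 2nR̄² is approximately chi-squared with 2 degrees of freedom) are standard assumptions here, not the thesis's exact solution.

# Sketch: Pearson random walk and a large-sample Rayleigh test of isotropy.
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(0)

def pearson_walk(n_steps):
    """Angles of n_steps unit steps in uniformly random directions, plus the end point."""
    theta = rng.uniform(0.0, 2.0 * np.pi, size=n_steps)
    end_point = np.array([np.cos(theta).sum(), np.sin(theta).sum()])
    return theta, end_point

def rayleigh_test(theta):
    """Large-sample Rayleigh test: under uniformity, 2*n*Rbar^2 ~ chi2 with 2 df."""
    n = len(theta)
    C, S = np.cos(theta).sum(), np.sin(theta).sum()
    R_bar = np.hypot(C, S) / n          # mean resultant length
    stat = 2.0 * n * R_bar ** 2
    return R_bar, chi2.sf(stat, df=2)

theta, end_point = pearson_walk(1000)
R_bar, p_value = rayleigh_test(theta)
print("mean resultant length:", R_bar, " p-value for uniformity:", p_value)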
Abstract:
Routine activity theory, introduced by Cohen & Felson in 1979, states that criminal acts arise from the presence of criminals and victims and the absence of guardians in time and place. As the number of collisions of these elements in place and time increases, criminal acts will also increase, even if the number of criminals or civilians remains the same within the vicinity of a city. Street robbery is a typical example of routine activity theory, and its occurrence can be predicted using routine activity theory. Agent-based models allow simulation of diversity among individuals. Therefore, agent-based simulation of street robbery can be used to visualize how chronological aspects of human activity influence the incidence of street robbery. The conceptual model identifies three classes of people: criminals, civilians and police, with certain activity areas for each. Police exist only as agents of formal guardianship. Criminals with a tendency for crime will be in search of their victims. Civilians without criminal tendency can be either victims or guardians. In addition to criminal tendency, each civilian in the model has a unique set of characteristics such as wealth, employment status and ability for guardianship. These agents are subjected to a random walk through a street environment guided by a Q-learning module, and the possible outcomes are analyzed.
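The following is an illustrative Python sketch only of a single Q-learning agent taking a guided random walk on a small grid "street"; the class name, reward scheme and parameters are hypothetical and are not the model described in the abstract.

# Sketch: one Q-learning agent performing a guided random walk on a grid.
import random

ACTIONS = [(-1, 0), (1, 0), (0, -1), (0, 1)]  # up, down, left, right

class QWalker:
    def __init__(self, size=10, alpha=0.1, gamma=0.9, epsilon=0.2):
        self.size = size
        self.alpha, self.gamma, self.epsilon = alpha, gamma, epsilon
        self.q = {}  # (state, action) -> estimated value

    def choose(self, state):
        if random.random() < self.epsilon:
            return random.choice(ACTIONS)  # explore: random step
        return max(ACTIONS, key=lambda a: self.q.get((state, a), 0.0))  # exploit

    def step(self, state, action):
        x, y = state
        dx, dy = action
        # stay inside the grid
        return (min(max(x + dx, 0), self.size - 1),
                min(max(y + dy, 0), self.size - 1))

    def update(self, state, action, reward, next_state):
        best_next = max(self.q.get((next_state, a), 0.0) for a in ACTIONS)
        old = self.q.get((state, action), 0.0)
        self.q[(state, action)] = old + self.alpha * (reward + self.gamma * best_next - old)

# Toy run: the agent is rewarded when it reaches a hypothetical target cell.
agent, target = QWalker(), (7, 7)
for episode in range(200):
    state = (0, 0)
    for _ in range(100):
        action = agent.choose(state)
        next_state = agent.step(state, action)
        reward = 1.0 if next_state == target else -0.01
        agent.update(state, action, reward, next_state)
        state = next_state
        if state == target:
            break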
Abstract:
This study is about the stability of random sums and extremes. The difficulty in finding exact sampling distributions resulted in considerable problems in computing probabilities concerning sums involving a large number of terms. Functions of sample observations that are of natural interest other than the sum are the extremes, that is, the minimum and the maximum of the observations. Extreme value distributions also arise in problems like the study of the size effect on material strengths, the reliability of parallel and series systems made up of a large number of components, record values, and the assessment of levels of air pollution. It may be noticed that the theories of sums and extremes are mutually connected. For instance, in the search for asymptotic normality of sums, it is assumed that at least the variance of the population is finite. In such cases the contribution of the extremes to the sum of independent and identically distributed (i.i.d.) random variables is negligible.
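A quick illustrative check of that last point (an assumed example, not taken from the thesis): for i.i.d. samples with finite variance the maximum contributes a vanishing fraction of the sum, whereas for a heavy-tailed law with infinite variance it does not.

# Sketch: share of the sum contributed by the sample maximum.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

finite_var = rng.exponential(scale=1.0, size=n)   # finite mean and variance
heavy_tail = rng.pareto(a=0.8, size=n) + 1.0      # Pareto with infinite mean and variance

for name, x in [("exponential", finite_var), ("Pareto(0.8)", heavy_tail)]:
    print(name, "max / sum =", x.max() / x.sum())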
Abstract:
The present research problem is to study the existing encryption methods and to develop a new technique that is superior in performance to other existing techniques and that, at the same time, can be readily incorporated into the communication channels of fault-tolerant hard real-time systems along with existing error-checking/error-correcting codes, so that attempts at eavesdropping can be defeated. There are many encryption methods available now. Each method has its own merits and demerits. Similarly, many cryptanalysis techniques used by adversaries are also available.
Abstract:
The present study emphasized characterizing continuous probability distributions and their weighted versions in the univariate setup. A possible line of further work in this direction is to study the properties of weighted distributions for truncated random variables in the discrete setup. The problem of extending the measures to higher dimensions, as well as their weighted versions, is yet to be examined. As the present study focused attention on length-biased models, the problem of studying the properties of weighted models with various other weight functions and their functional relationships also remains to be examined.
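For reference, the weighted and length-biased densities referred to here take the standard forms used throughout this literature (assumed, not quoted from the thesis): for a baseline density f and a nonnegative weight function w with finite mean weight,

\[
  f_w(x) \;=\; \frac{w(x)\, f(x)}{E\!\left[w(X)\right]},
  \qquad 0 < E\!\left[w(X)\right] = \int w(x)\, f(x)\, dx < \infty ,
\]
\[
  \text{length-biased case } (w(x) = x): \qquad
  f_L(x) \;=\; \frac{x\, f(x)}{\mu}, \qquad \mu = E[X].
\]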
Abstract:
In this article, we study reliability measures such as the geometric vitality function and conditional Shannon's measures of uncertainty, proposed by Ebrahimi (1996) and Sankaran and Gupta (1999) respectively, for doubly (interval) truncated random variables. In survival analysis and reliability engineering, these measures play a significant role in studying the various characteristics of a system/component when it fails between two time points. The interrelationships among these uncertainty measures are derived for various distributions, and characterization theorems arising out of them are proved.
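As an aid to the reader, the doubly truncated forms of these measures, in the shapes commonly used in this literature (assumed here, not quoted from the article), are: for a random variable X with density f and distribution function F, conditioned on t1 < X < t2,

\[
  \log G(t_1, t_2) \;=\; E\!\left[\log X \mid t_1 < X < t_2\right],
\]
\[
  H(t_1, t_2) \;=\; -\int_{t_1}^{t_2}
    \frac{f(x)}{F(t_2) - F(t_1)}\,
    \log\!\left(\frac{f(x)}{F(t_2) - F(t_1)}\right) dx ,
\]
where G is the geometric vitality function and H the interval (Shannon) uncertainty.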
Abstract:
In this paper, we study the relationship between the failure rate and the mean residual life of doubly truncated random variables. Accordingly, we develop characterizations for the exponential, Pareto II and beta distributions. Further, we generalize the identities for the Pearson and the exponential family of distributions given respectively in Nair and Sankaran (1991) and Consul (1995). Applications of these measures in the context of length-biased models are also explored.
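For context, the doubly truncated failure rates and mean residual life are commonly defined as follows (these are the usual forms, assumed rather than quoted from the paper): with density f and distribution function F,

\[
  h_1(t_1, t_2) \;=\; \frac{f(t_1)}{F(t_2) - F(t_1)},
  \qquad
  h_2(t_1, t_2) \;=\; \frac{f(t_2)}{F(t_2) - F(t_1)},
\]
\[
  m(t_1, t_2) \;=\; E\!\left[X - t_1 \mid t_1 \le X \le t_2\right]
  \;=\; \frac{\int_{t_1}^{t_2} \big(F(t_2) - F(x)\big)\, dx}{F(t_2) - F(t_1)} .
\]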
Abstract:
Nanocrystalline Fe–Ni thin films were prepared by partial crystallization of vapour deposited amorphous precursors. The microstructure was controlled by annealing the films at different temperatures. X-ray diffraction, transmission electron microscopy and energy dispersive x-ray spectroscopy investigations showed that the nanocrystalline phase was that of Fe–Ni. Grain growth was observed with an increase in the annealing temperature. X-ray photoelectron spectroscopy observations showed the presence of a native oxide layer on the surface of the films. Scanning tunnelling microscopy investigations support the biphasic nature of the nanocrystalline microstructure, which consists of a crystalline phase along with an amorphous phase. Magnetic studies using a vibrating sample magnetometer show that coercivity has a strong dependence on grain size. This is attributed to the random magnetic anisotropy characteristic of the system. The observed coercivity dependence on the grain size is explained using a modified random anisotropy model.
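For orientation only: in the unmodified random anisotropy (Herzer) picture, for grain sizes D below the exchange length, the coercivity is expected to scale roughly as

\[
  H_c \;\approx\; p_c\, \frac{K_1^4\, D^6}{J_s\, A^3},
\]

where K1 is the magnetocrystalline anisotropy constant, A the exchange stiffness, Js the saturation polarization and pc a dimensionless prefactor; the modified model referred to in the abstract is not reproduced here.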
Abstract:
In many situations probability models are more realistic than deterministic models. Several phenomena occurring in physics are studied as random phenomena changing with time and space. Stochastic processes originated from the needs of physicists. Let X(t) be a random variable, where t is a parameter assuming values from a set T. Then the collection of random variables {X(t), t ∈ T} is called a stochastic process. We denote the state of the process at time t by X(t), and the collection of all possible values that X(t) can assume is called the state space.
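As a minimal illustration (an assumed example, not from the abstract), a simple symmetric random walk is a stochastic process {X(t), t ∈ T} with T = {0, 1, 2, ...} and state space the set of integers:

# Sketch: a simple symmetric random walk as a discrete-time stochastic process.
import random

def simple_random_walk(n_steps, seed=0):
    random.seed(seed)
    x, path = 0, [0]
    for _ in range(n_steps):
        x += random.choice([-1, 1])  # each step is +1 or -1 with equal probability
        path.append(x)
    return path                      # path[t] is the state X(t)

print(simple_random_walk(10))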