12 results for Random Allocation
at Cochin University of Science
Abstract:
This study concerns the stability of random sums and extremes. The difficulty of finding exact sampling distributions has led to considerable problems in computing probabilities concerning sums that involve a large number of terms. Functions of the sample observations that are of natural interest, other than the sum, are the extremes, that is, the minimum and the maximum of the observations. Extreme value distributions also arise in problems such as the study of the size effect on material strengths, the reliability of parallel and series systems made up of a large number of components, record values, and the assessment of air pollution levels. It may be noticed that the theories of sums and extremes are mutually connected. For instance, in the search for asymptotic normality of sums, it is assumed that at least the variance of the population is finite. In such cases the contribution of the extremes to the sum of independent and identically distributed (i.i.d.) random variables is negligible.
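The claim that the extremes contribute negligibly to the sum under a finite variance can be illustrated by simulation. The following sketch (not taken from the thesis; the exponential distribution and trial counts are illustrative choices) estimates the average share of the sample maximum in the sample sum as the number of terms grows:

```python
import random

# Sketch: for i.i.d. samples from a finite-variance distribution,
# the maximum's share of the sum shrinks as the sample size grows.
random.seed(1)

def max_share(n, trials=200):
    """Average ratio max(sample)/sum(sample) over several trials."""
    total = 0.0
    for _ in range(trials):
        # Exponential(1): finite mean and variance
        sample = [random.expovariate(1.0) for _ in range(n)]
        total += max(sample) / sum(sample)
    return total / trials

for n in (10, 100, 1000):
    print(n, round(max_share(n), 4))
```

For the exponential, the expected maximum grows only like log n while the sum grows like n, so the printed ratios decay towards zero, consistent with the abstract's remark.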
Abstract:
The present research problem is to study the existing encryption methods and to develop a new technique which is superior in performance to other existing techniques and which, at the same time, can be readily incorporated into the communication channels of fault-tolerant hard real-time systems along with existing error-checking/error-correcting codes, so that attempts at eavesdropping can be defeated. Many encryption methods are available at present, each with its own merits and demerits. Similarly, many cryptanalysis techniques used by adversaries are also available.
Abstract:
The present study gave emphasis to characterizing continuous probability distributions and their weighted versions in the univariate setup. A possible direction for further work is therefore to study the properties of weighted distributions for truncated random variables in the discrete setup. The problem of extending the measures to higher dimensions, as well as their weighted versions, is yet to be examined. As the present study focused on length-biased models, the problem of studying the properties of weighted models with various other weight functions and their functional relationships also remains to be examined.
Abstract:
The thesis is entitled An Evaluation of the Primary Health Care System in Kerala. The present study is intended to examine the working of the primary health care system and its impact on the health status of the people. The hypotheses tested in the thesis include: a. the changes in the health profile require reallocation of the resources of the primary health care system; b. the rate of utilization depends on the quality of services provided by primary health centers; and c. there is a significant decline in the operational efficiency of the primary health care system. The major elements of primary health care stated in the report of the Alma-Ata International Conference on Primary Health Care (WHO, 1994) are studied on the basis of a classification of the elements into three groups: preventive, promotive, and curative measures. Preventive measures include maternal and child health care, including family planning. Provision of water and sanitation is reviewed under promotive measures. Curative measures are studied using the disease profile of the study area. Primary data were collected through a sample survey of households in the study area, using a pre-tested interview schedule. A multi-stage random sampling design was used for selecting the sample. The design of the present study is both descriptive and analytical in nature. As far as the analytical tools are concerned, growth indices, percentages, ratios, rates, time series analysis, analysis of variance, the chi-square test and the Z test were used for analyzing the data. The present study revealed that no one in these areas was covered under any type of health insurance. The conclusion states that, considering the present changes in the health profile, the traditional pattern of resource allocation should be altered to meet the urgent health care needs of the people. Preventive and promotive measures, such as health education to create awareness among people about changing health habits, diet pattern, lifestyle, etc., are to be developed.
Proper diagnosis and treatment of a disease at its early stage may help to cure the majority of diseases. For that, public health policy must ensure primary health care as enunciated at the Alma-Ata International Conference. At the same time, public health is not to be treated as the sole responsibility of the government. Active community participation is an essential means to attain these goals.
Abstract:
In this article, we study reliability measures such as the geometric vitality function and the conditional Shannon measures of uncertainty proposed by Ebrahimi (1996) and Sankaran and Gupta (1999), respectively, for doubly (interval) truncated random variables. In survival analysis and reliability engineering, these measures play a significant role in studying the various characteristics of a system/component when it fails between two time points. The interrelationships among these uncertainty measures are derived for various distributions, and characterization theorems arising out of them are proved.
Abstract:
In this paper, we study the relationship between the failure rate and the mean residual life of doubly truncated random variables. Accordingly, we develop characterizations for the exponential, Pareto II and beta distributions. Further, we generalize the identities for the Pearson and the exponential family of distributions given respectively in Nair and Sankaran (1991) and Consul (1995). Applications of these measures in the context of length-biased models are also explored.
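The two quantities the paper relates can be computed concretely. The sketch below uses one common convention for the doubly truncated failure rate, h(t) = f(t)/(F(t2) − F(t)), and mean residual life, m(t) = E[X − t | t < X < t2]; these definitions and the Exponential(λ) example are illustrative assumptions, not the paper's own derivation:

```python
import math

# Doubly truncated Exponential(lam) on (t1, t2), evaluated numerically.
# Assumed conventions (for illustration only):
#   h(t) = f(t) / (F(t2) - F(t))
#   m(t) = integral_t^{t2} (F(t2) - F(x)) dx / (F(t2) - F(t))
lam, t1, t2 = 0.5, 1.0, 4.0
F = lambda x: 1.0 - math.exp(-lam * x)
f = lambda x: lam * math.exp(-lam * x)

def failure_rate(t):
    return f(t) / (F(t2) - F(t))

def mean_residual_life(t, steps=20000):
    # Midpoint-rule integration of the survival mass on (t, t2)
    dx = (t2 - t) / steps
    num = sum((F(t2) - F(t + (i + 0.5) * dx)) * dx for i in range(steps))
    return num / (F(t2) - F(t))

t = 2.0
u = t2 - t
# Closed forms for the exponential case, for comparison:
h_exact = lam / (1.0 - math.exp(-lam * u))
m_exact = 1.0 / lam - u / (math.exp(lam * u) - 1.0)
print(failure_rate(t), h_exact)
print(mean_residual_life(t), m_exact)
```

Note that for the exponential both closed forms depend only on the distance u = t2 − t to the upper truncation point, a shift-invariance that echoes the memoryless property the paper's characterization builds on.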
Abstract:
Nanocrystalline Fe–Ni thin films were prepared by partial crystallization of vapour deposited amorphous precursors. The microstructure was controlled by annealing the films at different temperatures. X-ray diffraction, transmission electron microscopy and energy dispersive x-ray spectroscopy investigations showed that the nanocrystalline phase was that of Fe–Ni. Grain growth was observed with an increase in the annealing temperature. X-ray photoelectron spectroscopy observations showed the presence of a native oxide layer on the surface of the films. Scanning tunnelling microscopy investigations support the biphasic nature of the nanocrystalline microstructure, which consists of a crystalline phase along with an amorphous phase. Magnetic studies using a vibrating sample magnetometer show that the coercivity has a strong dependence on grain size. This is attributed to the random magnetic anisotropy characteristic of the system. The observed dependence of coercivity on grain size is explained using a modified random anisotropy model.
Abstract:
Bank switching in embedded processors with a partitioned memory architecture results in code size as well as run-time overhead. An algorithm, and its application in assisting the compiler both to eliminate the redundant bank-switching code introduced and to decide the optimum data allocation to banked memory, is presented in this work. A relation matrix, formed for the memory-bank state transition corresponding to each bank-selection instruction, is used for the detection of redundant code. Data allocation to memory is done by considering all possible permutations of memory banks and combinations of data. The compiler output corresponding to each data-mapping scheme is subjected to a static machine-code analysis, which identifies the one with the minimum number of bank-switching instructions. Even though the method is compiler independent, the algorithm utilizes certain architectural features of the target processor. A prototype based on PIC 16F87X microcontrollers is described. The method scales well to larger numbers of memory banks and to other architectures, so that high-performance compilers can integrate this technique for efficient code generation. The technique is illustrated with an example.
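The core idea of redundant-switch detection can be sketched in a few lines. The instruction tuples and the "BANKSEL" mnemonic below are illustrative assumptions (not the PIC 16F87X encoding or the paper's relation-matrix formulation), and the sketch handles only straight-line code, whereas the actual algorithm tracks bank state across the whole program:

```python
# Sketch: scan a linear instruction list and drop bank-select
# instructions that re-select the bank already active.

def eliminate_redundant_switches(instructions):
    """Keep a BANKSEL only when it changes the tracked bank state."""
    current_bank = None
    optimized = []
    for instr in instructions:
        if instr[0] == "BANKSEL":
            if instr[1] == current_bank:
                continue          # redundant: this bank is already selected
            current_bank = instr[1]
        optimized.append(instr)
    return optimized

# Hypothetical program fragment with one redundant bank selection
program = [
    ("BANKSEL", 0), ("MOVWF", "x"),
    ("BANKSEL", 0), ("MOVWF", "y"),   # re-selects bank 0: removable
    ("BANKSEL", 1), ("MOVWF", "z"),
]
print(eliminate_redundant_switches(program))
```

In the paper's setting the same bookkeeping is expressed through the relation matrix over bank-state transitions, which also covers control-flow joins that this linear scan ignores.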
Abstract:
In a number of situations, the observations one comes across are directions. The first inferential question to answer when dealing with such data is, "Are they isotropic, or uniformly distributed?" The answer to this question goes back in history, which we shall retrace a bit before providing exact and approximate solutions to this so-called "Pearson's Random Walk" problem.
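A standard screening tool for the isotropy question is the Rayleigh statistic, the squared length of the resultant of the unit observation vectors. The sketch below is an illustration, not the article's own solution; the sample sizes and the wrapped-normal-like clustered sample are assumed for demonstration:

```python
import math
import random

# Sketch: Rayleigh statistic for testing isotropy of directions
# theta_1..theta_n. Under uniformity, 2*R^2/n is approximately
# chi-square with 2 degrees of freedom (a classical large-sample result).

def rayleigh_statistic(angles):
    c = sum(math.cos(a) for a in angles)
    s = sum(math.sin(a) for a in angles)
    n = len(angles)
    return 2.0 * (c * c + s * s) / n

random.seed(7)
uniform = [random.uniform(0.0, 2.0 * math.pi) for _ in range(500)]
clustered = [random.gauss(0.0, 0.3) for _ in range(500)]  # concentrated near 0

print(rayleigh_statistic(uniform))    # small: consistent with isotropy
print(rayleigh_statistic(clustered))  # large: directions are concentrated
```

Large values of the statistic reject uniformity; the exact small-sample distribution of the resultant length is precisely the "Pearson's Random Walk" problem the article revisits.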
Abstract:
In many situations probability models are more realistic than deterministic models, and several phenomena occurring in physics are studied as random phenomena changing with time and space. Stochastic processes originated from the needs of physicists. Let X(t) be a random variable, where t is a parameter assuming values from a set T. The collection of random variables {X(t), t ∈ T} is then called a stochastic process. We denote the state of the process at time t by X(t), and the collection of all possible values that X(t) can assume is called the state space.
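A minimal concrete instance of this definition is the simple symmetric random walk, with index set T = {0, 1, ..., n} and state space the integers. The sketch below (an illustrative example, not from the abstract) generates one realization of such a process:

```python
import random

# A simple symmetric random walk as a stochastic process {X(t), t in T}:
# T = {0, 1, ..., n} and the state space is the set of integers.

def random_walk(n, seed=0):
    rng = random.Random(seed)
    x, path = 0, [0]
    for _ in range(n):
        x += rng.choice((-1, 1))   # each step is +1 or -1 with equal chance
        path.append(x)
    return path

path = random_walk(10)
print(path)       # one realization: X(0), X(1), ..., X(10)
print(set(path))  # the states visited, a subset of the state space
```

Each call with a different seed yields a different sample path, while the index set and state space stay fixed, which is exactly the distinction the definition draws between the process and its realizations.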