16 results for Random equivalent availability

at Cochin University of Science and Technology


Relevance:

20.00%

Publisher:

Abstract:

Two graphs G and H are Turker equivalent if they have the same set of Turker angles. In this paper some Turker equivalent families of graphs are obtained.

Relevance:

20.00%

Publisher:

Abstract:

This study is about the stability of random sums and extremes. The difficulty in finding exact sampling distributions results in considerable problems when computing probabilities concerning sums that involve a large number of terms. Functions of the sample observations that are of natural interest, other than the sum, are the extremes, that is, the minimum and the maximum of the observations. Extreme value distributions also arise in problems such as the study of size effects on material strengths, the reliability of parallel and series systems made up of a large number of components, record values, and the assessment of air pollution levels. It may be noticed that the theories of sums and extremes are mutually connected. For instance, in the search for asymptotic normality of sums, it is assumed that at least the variance of the population is finite. In such cases the contribution of the extremes to the sum of independent and identically distributed (i.i.d.) random variables is negligible.
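
As a quick numerical illustration of that last remark (a minimal sketch, not part of the study; distributions and sample sizes are chosen purely for illustration), the following Python snippet compares the share of the sample maximum in the sum for a finite-variance distribution and for a heavy-tailed one:

```python
import numpy as np

rng = np.random.default_rng(0)

for n in [10**2, 10**4, 10**6]:
    light = rng.exponential(1.0, n)    # exponential: finite variance
    heavy = rng.pareto(0.8, n) + 1.0   # Pareto with shape 0.8: infinite mean/variance
    print(f"n={n:>7}  exponential max/sum = {light.max()/light.sum():.1e}  "
          f"Pareto(0.8) max/sum = {heavy.max()/heavy.sum():.2f}")
```

In the finite-variance case the ratio shrinks towards zero as n grows, whereas for the heavy-tailed case a single extreme observation keeps contributing a non-negligible fraction of the sum.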

Relevance:

20.00%

Publisher:

Abstract:

In this Letter a new physical model for metal-insulator-metal (MIM) CMOS capacitors is presented. In the model, the circuit parameters are derived from the physical structural details. Physical effects due to the metal skin effect and inductance have been considered. The model has been confirmed by a 3D EM simulator, and design rules are proposed. The model presented is scalable with capacitor geometry, allowing designers to predict and optimize the quality factor. The approach has been verified for MIM CMOS capacitors.
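
The Letter's circuit model is not reproduced here; as a rough, hedged illustration of how a lumped model yields a frequency-dependent quality factor, the sketch below evaluates Q = |X|/R for a simple series R-L-C approximation of a MIM capacitor, with the series resistance scaled as the square root of frequency to mimic the skin effect. All element values are placeholder assumptions.

```python
import numpy as np

# Placeholder lumped elements for a MIM capacitor (illustrative only).
C = 1e-12        # capacitance, F
L = 50e-12       # parasitic series inductance, H
R_ref = 0.2      # series resistance at the reference frequency, ohm
f_ref = 1e9      # reference frequency, Hz

def q_factor(f):
    """Q = |net reactance| / resistance for a series R-L-C branch."""
    w = 2 * np.pi * f
    r = R_ref * np.sqrt(f / f_ref)     # crude skin-effect scaling
    x = w * L - 1.0 / (w * C)          # net reactance
    return abs(x) / r

for f in [1e9, 5e9, 10e9, 20e9]:
    print(f"{f/1e9:>5.1f} GHz  Q = {q_factor(f):.1f}")
```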

Relevance:

20.00%

Publisher:

Abstract:

The present research problem is to study the existing encryption methods and to develop a new technique that is performance-wise superior to other existing techniques and, at the same time, can be incorporated into the communication channels of fault-tolerant hard real-time systems along with existing error-checking/error-correcting codes, so that attempts at eavesdropping can be defeated. Many encryption methods are available now. Each method has its own merits and demerits. Similarly, many cryptanalysis techniques that adversaries use are also available.

Relevance:

20.00%

Publisher:

Abstract:

Department of Statistics, Cochin University of Science and Technology

Relevance:

20.00%

Publisher:

Abstract:

The hazards associated with major accident hazard (MAH) industries are fire, explosion and toxic gas releases. Of these, toxic gas release is the worst as it has the potential to cause extensive fatalities. Qualitative and quantitative hazard analyses are essential for the identification and quantification of the hazards associated with chemical industries. This research work presents the results of a consequence analysis carried out to assess the damage potential of the hazardous material storages in an industrial area of central Kerala, India. A survey carried out in the major accident hazard (MAH) units in the industrial belt revealed that the major hazardous chemicals stored by the various industrial units are ammonia, chlorine, benzene, naphtha, cyclohexane, cyclohexanone and LPG. The damage potential of the above chemicals is assessed using consequence modelling. Modelling of pool fires for naphtha, cyclohexane, cyclohexanone, benzene and ammonia is carried out using the TNO model. Vapor cloud explosion (VCE) modelling of LPG, cyclohexane and benzene is carried out using the TNT equivalent model. Boiling liquid expanding vapor explosion (BLEVE) modelling of LPG is also carried out. Dispersion modelling of toxic chemicals like chlorine, ammonia and benzene is carried out using the ALOHA air quality model. Threat zones for the different hazardous storages are estimated based on the consequence modelling. The distance covered by the threat zone was found to be maximum for a chlorine release from a chlor-alkali industry located in the area. The results of the consequence modelling are useful for the estimation of individual risk and societal risk in the above industrial area.

Vulnerability assessment is carried out using probit functions for toxic, thermal and pressure loads. Individual and societal risks are also estimated at different locations. Mapping of threat zones due to different incident outcome cases from different MAH industries is done with the help of ArcGIS.

Fault Tree Analysis (FTA) is an established technique for hazard evaluation. This technique has the advantage of being both qualitative and quantitative, if the probabilities and frequencies of the basic events are known. However, it is often difficult to estimate the failure probability of the components precisely, due to insufficient data or the vague characteristics of the basic events. It has been reported that the availability of failure probability data pertaining to local conditions is surprisingly limited in India. This thesis outlines the generation of failure probability values of the basic events that lead to the release of chlorine from the storage and filling facility of a major chlor-alkali industry located in the area, using expert elicitation and proven fuzzy logic. Sensitivity analysis has been done to evaluate the percentage contribution of each basic event that could lead to a chlorine release. Two-dimensional fuzzy fault tree analysis (TDFFTA) has been proposed for balancing the hesitation factor involved in expert elicitation.
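
As a hedged illustration of the TNT-equivalence approach mentioned above for the VCE calculations (not the thesis code; the inventory, heat of combustion and yield factor are assumed values), the sketch below converts a flammable inventory into an equivalent TNT mass and the corresponding Hopkinson-Cranz scaled distance; the overpressure at each distance would then be read from published blast charts.

```python
def tnt_equivalent_mass(mass_kg, heat_of_combustion_mj_per_kg,
                        yield_factor=0.03, h_tnt_mj_per_kg=4.68):
    """Equivalent TNT mass: W = eta * m * Hc / H_TNT (standard TNT-equivalence form)."""
    return yield_factor * mass_kg * heat_of_combustion_mj_per_kg / h_tnt_mj_per_kg

def scaled_distance(r_m, w_tnt_kg):
    """Hopkinson-Cranz scaled distance Z = R / W^(1/3), in m/kg^(1/3)."""
    return r_m / w_tnt_kg ** (1.0 / 3.0)

# Illustrative LPG release: 10 t inventory, Hc ~ 46 MJ/kg, 3 % explosion yield.
w = tnt_equivalent_mass(10_000, 46.0)
print(f"equivalent TNT mass: {w:.0f} kg")
for r in (100, 250, 500):
    print(f"R = {r:>3} m  ->  Z = {scaled_distance(r, w):.2f} m/kg^(1/3)")
```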

Relevance:

20.00%

Publisher:

Abstract:

The present study placed emphasis on characterizing continuous probability distributions and their weighted versions in the univariate setup. A possible extension of this work is therefore to study the properties of weighted distributions for truncated random variables in the discrete setup. The problem of extending the measures to higher dimensions, as well as their weighted versions, is yet to be examined. As the present study focused attention on length-biased models, the problem of studying the properties of weighted models with various other weight functions and their functional relationships is also yet to be examined.
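
As a small numerical illustration of length-biased weighting (a sketch under stated assumptions, not taken from the study), the following code draws from an exponential distribution and from its length-biased version. For a baseline density f(x), the length-biased density is x f(x) / E[X]; for the exponential this is a gamma distribution with shape 2, so the mean doubles.

```python
import numpy as np

rng = np.random.default_rng(1)
lam = 0.5                      # rate of the baseline exponential

# Baseline X ~ Exp(lam): density lam * exp(-lam x), mean 1/lam.
x = rng.exponential(1.0 / lam, size=100_000)

# Length-biased version: density x * f(x) / E[X], which for the
# exponential is Gamma(shape=2, scale=1/lam), mean 2/lam.
x_lb = rng.gamma(shape=2.0, scale=1.0 / lam, size=100_000)

print("baseline mean:      ", round(x.mean(), 3), " (theory:", 1 / lam, ")")
print("length-biased mean: ", round(x_lb.mean(), 3), " (theory:", 2 / lam, ")")
```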

Relevance:

20.00%

Publisher:

Abstract:

The average availability of a repairable system is the expected proportion of time that the system is operating in the interval [0, t]. The present article discusses the nonparametric estimation of the average availability when (i) data on 'n' complete cycles of system operation are available, (ii) the data are subject to right censorship, and (iii) the process is observed up to a specified time 'T'. In each case, a nonparametric confidence interval for the average availability is also constructed. Simulations are conducted to assess the performance of the estimators.
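
To make the quantity concrete, here is a hedged simulation sketch (not the estimators of the article; the up- and down-time distributions and parameters are illustrative assumptions): an alternating up/down process is generated on [0, t], and the average availability is estimated by the observed fraction of up time.

```python
import numpy as np

rng = np.random.default_rng(2)

def uptime_fraction(t, mean_up=10.0, mean_down=2.0):
    """Simulate one alternating renewal path on [0, t] and return
    the observed proportion of time the system is up."""
    clock, up_time, system_up = 0.0, 0.0, True
    while clock < t:
        duration = rng.exponential(mean_up if system_up else mean_down)
        duration = min(duration, t - clock)   # truncate the last cycle at t
        if system_up:
            up_time += duration
        clock += duration
        system_up = not system_up
    return up_time / t

samples = np.array([uptime_fraction(t=1000.0) for _ in range(500)])
print("estimated average availability:", round(samples.mean(), 4))
print("limiting value mean_up/(mean_up+mean_down):", round(10.0 / 12.0, 4))
```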

Relevance:

20.00%

Publisher:

Abstract:

In this article, we study reliability measures such as the geometric vitality function and the conditional Shannon measures of uncertainty, proposed by Ebrahimi (1996) and by Sankaran and Gupta (1999), respectively, for doubly (interval) truncated random variables. In survival analysis and reliability engineering, these measures play a significant role in studying the various characteristics of a system/component when it fails between two time points. The interrelationships among these uncertainty measures are derived for various distributions, and characterization theorems arising out of them are proved.

Relevance:

20.00%

Publisher:

Abstract:

In this paper, we study the relationship between the failure rate and the mean residual life of doubly truncated random variables. Accordingly, we develop characterizations for the exponential, Pareto II and beta distributions. Further, we generalize the identities for the Pearson and the exponential family of distributions given in Nair and Sankaran (1991) and Consul (1995), respectively. Applications of these measures in the context of length-biased models are also explored.

Relevance:

20.00%

Publisher:

Abstract:

Nanocrystalline Fe–Ni thin films were prepared by partial crystallization of vapour-deposited amorphous precursors. The microstructure was controlled by annealing the films at different temperatures. X-ray diffraction, transmission electron microscopy and energy dispersive x-ray spectroscopy investigations showed that the nanocrystalline phase was that of Fe–Ni. Grain growth was observed with an increase in the annealing temperature. X-ray photoelectron spectroscopy observations showed the presence of a native oxide layer on the surface of the films. Scanning tunnelling microscopy investigations support the biphasic nature of the nanocrystalline microstructure, which consists of a crystalline phase along with an amorphous phase. Magnetic studies using a vibrating sample magnetometer show that the coercivity has a strong dependence on grain size. This is attributed to the random magnetic anisotropy characteristic of the system. The observed dependence of coercivity on grain size is explained using a modified random anisotropy model.
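
For orientation only, the sketch below evaluates the coercivity scaling predicted by the standard random anisotropy (Herzer) model, Hc ~ p_c K1^4 D^6 / (Js A^3) for grain sizes below the exchange length; the prefactor and material constants are generic soft-magnetic values assumed for illustration, not the modified model or the parameters of this paper.

```python
# Generic soft-magnetic constants (illustrative assumptions only).
K1 = 8e3        # magnetocrystalline anisotropy constant, J/m^3
A = 1e-11       # exchange stiffness, J/m
Js = 1.0        # saturation polarization, T
p_c = 0.13      # dimensionless prefactor

def coercivity(d_m):
    """Random-anisotropy estimate Hc = p_c * K1^4 * D^6 / (Js * A^3), in A/m."""
    return p_c * K1**4 * d_m**6 / (Js * A**3)

# The D^6 dependence makes the coercivity rise steeply with grain size.
for d_nm in (5, 10, 20, 40):
    print(f"D = {d_nm:>2} nm  ->  Hc = {coercivity(d_nm * 1e-9):10.2f} A/m")
```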

Relevance:

20.00%

Publisher:

Abstract:

One comes across directions as observations in a number of situations. The first inferential question that one should answer when dealing with such data is, “Are they isotropic or uniformly distributed?” The answer to this question goes back in history, which we shall retrace a bit, and we provide exact and approximate solutions to this so-called “Pearson's Random Walk” problem.
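
As a hedged numerical companion (not taken from the paper; step count and thresholds are arbitrary), the following sketch simulates Pearson's random walk with n unit steps in uniformly random planar directions and compares the Monte Carlo tail probability of the resultant length R with Rayleigh's classical approximation P(R >= r) ~ exp(-r^2 / n).

```python
import numpy as np

rng = np.random.default_rng(3)

def resultant_lengths(n_steps, n_walks):
    """Resultant length of planar random walks with unit steps
    taken in independent, uniformly distributed directions."""
    theta = rng.uniform(0.0, 2.0 * np.pi, size=(n_walks, n_steps))
    x = np.cos(theta).sum(axis=1)
    y = np.sin(theta).sum(axis=1)
    return np.hypot(x, y)

n = 50
R = resultant_lengths(n, n_walks=200_000)
for r in (2.0, 5.0, 10.0):
    mc = (R >= r).mean()
    rayleigh = np.exp(-r * r / n)
    print(f"r = {r:4.1f}   Monte Carlo: {mc:.4f}   Rayleigh approximation: {rayleigh:.4f}")
```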