994 results for reliability models


Relevance: 30.00%

Abstract:

The application of queueing theory in areas such as computer networking, ATM facilities, telecommunications and many other situations has led to extensive study of queueing models, and the field has become an ever-expanding branch of applied probability. The thesis discusses the reliability of a k-out-of-n system in which the server also attends external customers when there are no failed components (main customers), under a retrial policy. It then studies a multi-server infinite-capacity queueing system in which each customer arrives as an ordinary customer but may turn into a priority customer while waiting in the queue. The study also examines a finite-capacity multi-server queueing system with self-generation of priority customers, and a single-server infinite-capacity retrial queue in which a customer in the orbit may turn into a priority customer; such a customer leaves the system if the server is already busy with a priority-generated customer, and otherwise is taken into service immediately. The arrival process follows a Markovian arrival process (MAP) and service times follow a Markovian service process (MSP).
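As a point of reference for the k-out-of-n structure studied here, the sketch below computes the static reliability of a k-out-of-n system of independent, identical components; the thesis adds repair, retrials and external customers on top of this baseline, none of which is modelled here.

```python
# Minimal sketch (not the thesis model): static reliability of a
# k-out-of-n system with independent components, each up with
# probability p.
from scipy.stats import binom

def k_out_of_n_reliability(k: int, n: int, p: float) -> float:
    """P(at least k of n independent components are up)."""
    # sf(k - 1) = P(X >= k) for X ~ Binomial(n, p)
    return binom.sf(k - 1, n, p)

print(k_out_of_n_reliability(k=3, n=5, p=0.9))  # ~0.99144
```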

Relevance: 30.00%

Abstract:

The service quality of any sector has two major aspects, namely technical and functional. Technical quality can be attained by maintaining the technical specifications decided by the organization. Functional quality refers to the manner in which the service is delivered to the customer, and can be assessed from customer feedback. A field survey was conducted based on the SERVQUAL management tool, with 28 constructs designed under 7 dimensions of service quality. Stratified sampling was used to obtain 336 valid responses, and the gap scores between expectations and perceptions were analysed with statistical techniques to identify the weakest dimension. To assess the technical aspect of availability, six months of live outage data from base transceiver stations were collected, and statistical and exploratory techniques were used to model network performance. The failure patterns were modelled with competing-risk models, and the probability distributions of service outage and restoration times were parameterized. Since network availability is a function of the reliability and maintainability of the network elements, any service provider who wishes to keep to service-level agreements on availability should be aware of the variability of these elements and the effects of their interactions. Availability variations were studied with a discrete-event simulation model with probabilistic input parameters. The distribution parameters obtained from the live data analysis were used to design experiments defining the availability domain of the network under consideration, which can serve as a reference for planning and implementing maintenance activities. A new metric is proposed that incorporates a consistency index along with key service parameters and can be used to compare the performance of different service providers. The developed tool can be used for reliability analysis of mobile communication systems and assumes greater significance in the wake of mobile number portability. It also yields a relative measure of the effectiveness of different service providers.
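The SERVQUAL gap analysis described above reduces to averaging perception-minus-expectation scores per dimension. A minimal sketch, with invented dimension names and Likert scores rather than the paper's 28-construct instrument:

```python
# Minimal sketch of SERVQUAL-style gap scoring (illustrative data and
# dimension names are assumptions, not the paper's instrument).
import numpy as np

dimensions = ["tangibles", "reliability", "responsiveness"]
# rows = respondents, columns = dimensions; 7-point Likert scores
expectations = np.array([[6.5, 6.8, 6.2],
                         [6.9, 6.7, 6.4],
                         [6.2, 6.9, 6.1]])
perceptions = np.array([[5.1, 5.9, 5.8],
                        [5.5, 6.0, 5.2],
                        [4.9, 6.2, 5.5]])

gaps = (perceptions - expectations).mean(axis=0)  # mean gap per dimension
for dim, gap in zip(dimensions, gaps):
    print(f"{dim:15s} gap = {gap:+.2f}")
# The most negative gap flags the weakest dimension.
print("weakest:", dimensions[int(np.argmin(gaps))])
```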

Relevance: 30.00%

Abstract:

So far, in the bivariate set-up, the analysis of lifetime (failure time) data with multiple causes of failure has been done by treating each cause of failure separately, with failures from other causes considered as independent censoring. This approach is unrealistic in many situations. For example, in the analysis of mortality data on married couples, one would be interested in comparing the hazards for the same cause of death as well as in checking whether death due to one cause is more important for the partner's risk of death from other causes. In reliability analysis, one often has systems with more than one component, and many systems, subsystems and components have more than one cause of failure. The design of high-reliability systems generally requires that the individual components have extremely high reliability even after long periods of time. Knowledge of the failure behaviour of a component can lead to savings in its cost of production and maintenance and, in some cases, to the preservation of human life. To improve reliability, it is necessary to identify the cause of failure down to the component level. By treating each cause of failure separately, with failures from other causes considered as independent censoring, the analysis of lifetime data would be incomplete. Motivated by this, we introduce a new approach for the analysis of bivariate competing-risk data using the bivariate vector hazard rate of Johnson and Kotz (1975).
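For intuition, a minimal sketch of the univariate building block: nonparametric cumulative cause-specific hazards estimated Nelson-Aalen style from uncensored competing-risks data. The data are invented, and the paper's bivariate vector hazard generalizes this idea to paired lifetimes.

```python
# Minimal sketch (assumed illustrative data, no censoring, no ties):
# cumulative cause-specific hazards in a univariate competing-risks
# setting, via Nelson-Aalen increments dN_j(t) / Y(t).
import numpy as np

times = np.array([2.0, 3.5, 4.1, 5.0, 7.5, 8.0])   # failure times
causes = np.array([1, 2, 1, 1, 2, 1])               # cause of failure

def cumulative_cause_specific_hazard(times, causes, cause):
    order = np.argsort(times)
    t, c = times[order], causes[order]
    at_risk = len(t) - np.arange(len(t))  # risk-set size at each time
    increments = (c == cause) / at_risk   # Nelson-Aalen increments
    return t, np.cumsum(increments)

t, H1 = cumulative_cause_specific_hazard(times, causes, cause=1)
print(list(zip(t, np.round(H1, 3))))
```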

Relevance: 30.00%

Abstract:

The term reliability of an equipment or device is often meant to indicate the probability that it carries out its intended functions adequately, without failure and within specified performance limits, at a given age and for a desired mission time, under the designated application and operating environmental stress. The approaches employed in reliability studies can be broadly classified as probabilistic and deterministic. The main interest in the former is to devise tools and methods to identify the random mechanism governing the failure process within a proper statistical framework, while the latter addresses the question of finding the causes of failure and the steps needed to reduce individual failures, thereby enhancing reliability. In the probabilistic approach, to which the present study subscribes, the concept of a life distribution, a mathematical idealisation that describes the failure times, is fundamental, and a basic question a reliability analyst has to settle is the form of the life distribution. It is for this reason that a major share of the literature on the mathematical theory of reliability focuses on methods of arriving at reasonable models of failure times and on showing the failure patterns that induce such models. The application of lifetime-distribution methodology is not confined to assessing the endurance of equipment and systems; it ranges over a wide variety of scientific investigations in which the term lifetime may not refer to length of life in the literal sense but can be conceived in its most general form as a non-negative random variable. Thus the tools developed in connection with modelling lifetime data have found applications in other areas of research such as actuarial science, engineering, biomedical sciences, economics and extreme value theory.
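To make the model-selection question concrete, here is a minimal sketch of fitting one candidate life distribution, the Weibull, to failure times by maximum likelihood; the data and parameter values are illustrative, not from the study.

```python
# Minimal sketch: fitting a candidate life distribution to
# failure-time data. Data are simulated, not from the study.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
failure_times = stats.weibull_min.rvs(c=1.8, scale=100.0,
                                      size=200, random_state=rng)

# Maximum-likelihood fit with the location fixed at 0 (lifetimes >= 0)
shape, loc, scale = stats.weibull_min.fit(failure_times, floc=0)
print(f"shape={shape:.2f}, scale={scale:.1f}")

# shape > 1 indicates an increasing failure rate (wear-out);
# shape < 1 would indicate infant-mortality behaviour.
```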

Relevance: 30.00%

Abstract:

Recently, the reciprocal subtangent has been used as a useful tool to describe the behaviour of a density curve. Motivated by this, in the present article we extend the concept to weighted models. Characterization results are proved for several models, viz. the gamma, Rayleigh, equilibrium, residual lifetime and proportional hazards models. An identity under the weighted distribution is also obtained when the reciprocal subtangent takes the form of a general class of distributions. Finally, extensions of the reciprocal subtangent for weighted models to the bivariate and multivariate cases are introduced and some useful results are proved.
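A minimal symbolic sketch, assuming the common definition of the reciprocal subtangent of a density f as eta(x) = -f'(x)/f(x) (verify against the article's convention before relying on it): it is constant for the exponential density and varies with x for the Rayleigh.

```python
# Minimal sketch, under the assumed definition eta(x) = -f'(x)/f(x).
import sympy as sp

x, lam = sp.symbols("x lambda", positive=True)

f_exp = lam * sp.exp(-lam * x)                 # exponential density
eta_exp = sp.simplify(-sp.diff(f_exp, x) / f_exp)
print(eta_exp)  # lambda: constant, characterizing the exponential

f_ray = (x / lam**2) * sp.exp(-x**2 / (2 * lam**2))  # Rayleigh density
eta_ray = sp.simplify(-sp.diff(f_ray, x) / f_ray)
print(eta_ray)  # x/lambda**2 - 1/x: non-constant in x
```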

Relevance: 30.00%

Abstract:

Partial moments are extensively used in the literature for modelling and analysis of lifetime data. In this paper, we study properties of partial moments using quantile functions. The quantile-based measure determines the underlying distribution uniquely. We then characterize certain lifetime quantile function models. The proposed measure provides alternative definitions for ageing criteria. Finally, we explore the utility of the measure in comparing the characteristics of two lifetime distributions.
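A minimal numerical sketch of the quantile form of a partial moment, E[(X - t)_+^r] = integral from u0 to 1 of (Q(p) - Q(u0))^r dp with t = Q(u0), checked against the known closed form for the exponential distribution; the distribution choice is illustrative.

```python
# Minimal sketch: quantile form of the partial (stop-loss) moment,
# verified for the exponential distribution with r = 1.
import numpy as np
from scipy.integrate import quad

lam, r, u0 = 0.5, 1, 0.7
Q = lambda p: -np.log1p(-p) / lam      # exponential quantile function
t = Q(u0)

quantile_form, _ = quad(lambda p: (Q(p) - t) ** r, u0, 1)
closed_form = np.exp(-lam * t) / lam   # known result for r = 1
print(quantile_form, closed_form)      # both ~ (1 - u0)/lam = 0.6
```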

Relevance: 30.00%

Abstract:

Refiners today operate their equipment for prolonged periods without shutdown, primarily because increased market pressures have extended shutdown-to-shutdown intervals. This places extreme demands on the reliability of plant equipment. The traditional methods of reliability assurance, such as Preventive Maintenance, Predictive Maintenance and Condition Based Maintenance, become inadequate in the face of such demands. The alternative approaches to reliability improvement being adopted the world over are the implementation of RCFA programs and Reliability Centered Maintenance (RCM). However, refiners and process plants find it difficult to adopt the standardized RCM methodology, mainly because of its complexity and the large amount of analysis that needs to be done, resulting in long drawn-out implementations that require the services of a number of skilled people. This results in implementations that are either restricted to only a few equipment items or non-standard. The paper presents the current models in use, the core requirements of a standard RCM model, the limitations of the existing models and the available alternatives to classical RCM, and then presents an 'Accelerated' approach to RCM implementation that, while ensuring close conformance to the standard, does not place a large burden on the implementers.

Relevance: 30.00%

Abstract:

Software systems are progressively being deployed in many facets of human life, and the failure of such systems can have a wide-ranging impact on their users. The fundamental aspect that underpins a software system is a focus on quality. Reliability describes the ability of a system to function in a specified environment for a specified period of time, and is used to measure quality objectively. Evaluating the reliability of a computing system involves computing both hardware and software reliability. Most earlier work focused on software reliability with no consideration of the hardware, or vice versa. However, a complete estimation of the reliability of a computing system requires these two elements to be considered together, and thus demands a combined approach. The present work focuses on this and presents a model for evaluating the reliability of a computing system. The method involves collecting failure data for the hardware and software components and building a model on that data to predict reliability. In developing the model, attention is given to systems based on Open Source Software, since there is an increasing trend towards its use and only a few studies have been reported on modelling and measuring the reliability of such products. The present work includes a thorough study of the role of Free and Open Source Software, an evaluation of reliability growth models, and an integrated model for predicting the reliability of a computational system. The developed model has been compared with existing models and its usefulness is discussed.
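A minimal sketch of the combined idea, assuming independence and a series structure: software reliability from a reliability growth model (Goel-Okumoto is used here as a generic stand-in, not the paper's integrated model) multiplied by an exponential hardware reliability. All parameter values are invented.

```python
# Minimal sketch of a combined hardware/software reliability estimate
# under independence (series structure); parameters are illustrative.
import numpy as np

def goel_okumoto_reliability(x, t, a, b):
    """Software reliability over mission (t, t+x) under the
    Goel-Okumoto growth model m(t) = a * (1 - exp(-b t))."""
    m = lambda s: a * (1.0 - np.exp(-b * s))
    return np.exp(-(m(t + x) - m(t)))

def hardware_reliability(x, lam):
    """Exponential (constant failure rate) hardware reliability."""
    return np.exp(-lam * x)

x, t = 100.0, 1000.0            # mission length, accumulated test time
R_sw = goel_okumoto_reliability(x, t, a=120.0, b=0.002)
R_hw = hardware_reliability(x, lam=1e-4)
print(R_sw, R_hw, R_sw * R_hw)  # series system: product of the two
```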

Relevance: 30.00%

Abstract:

Several methods have been suggested to estimate non-linear models with interaction terms in the presence of measurement error. Structural equation models eliminate measurement error bias but require large samples. Ordinary least squares regression on summated scales, regression on factor scores and partial least squares are appropriate for small samples but do not correct for measurement error bias. Two-stage least squares regression does correct for measurement error bias, but the results depend strongly on the choice of instrumental variables. This article discusses the old disattenuated regression method as an alternative for correcting measurement error in small samples. The method is extended to the case of interaction terms and is illustrated on a model that examines the interaction effect of innovation and style of budget use on business performance. Alternative reliability estimates that can be used to disattenuate the estimates are discussed, and a comparison is made with the alternative methods. Methods that do not correct for measurement error bias perform very similarly to one another, and considerably worse than disattenuated regression.
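The classical disattenuation correction at the heart of this approach divides the observed association by the square root of the product of the measures' reliabilities. A minimal sketch with invented numbers:

```python
# Minimal sketch of classical disattenuation (illustrative values):
# correcting a correlation for measurement error using the scale
# reliabilities of the two measures.
import numpy as np

r_xy = 0.42                  # observed correlation between scales
rel_x, rel_y = 0.80, 0.75    # reliability estimates (e.g. Cronbach's alpha)

r_true = r_xy / np.sqrt(rel_x * rel_y)   # disattenuated correlation
print(round(r_true, 3))                  # ~0.542

# For a simple regression with error only in x, the analogous
# correction divides the observed slope by the reliability of x:
b_obs = 0.30
print(b_obs / rel_x)                     # disattenuated slope = 0.375
```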

Relevance: 30.00%

Abstract:

In real-world environments it is usually difficult to specify the quality of a preventive maintenance (PM) action precisely. This uncertainty makes it problematic to optimise maintenance policy. This problem is tackled in this paper by assuming that the quality of a PM action is a random variable following a probability distribution. Two frequently studied PM models, a failure rate PM model and an age reduction PM model, are investigated. The optimal PM policies are presented and optimised. Numerical examples are also given.
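A minimal Monte Carlo sketch in the spirit of the age reduction model with random PM quality: the age-reduction factor after each PM is drawn from a distribution, failures between PMs are minimal repairs accumulated through a Weibull cumulative hazard, and the long-run cost rate is compared across PM intervals. All distributions and costs are assumptions, not the paper's.

```python
# Minimal sketch (assumed cost and Weibull parameters): expected cost
# rate of a periodic PM policy where PM quality is random.
import numpy as np

rng = np.random.default_rng(1)
beta, theta = 2.5, 1000.0               # Weibull hazard parameters
c_pm, c_f = 1.0, 10.0                   # PM cost, minimal-repair cost
H = lambda t: (t / theta) ** beta       # cumulative hazard

def cost_rate(T, n_pm=20, n_sim=2000):
    rates = []
    for _ in range(n_sim):
        age, failures = 0.0, 0.0
        for _ in range(n_pm):
            failures += H(age + T) - H(age)   # minimal repairs in cycle
            delta = rng.uniform(0.2, 0.8)     # random PM quality
            age = delta * (age + T)           # imperfect age reduction
        rates.append((n_pm * c_pm + c_f * failures) / (n_pm * T))
    return np.mean(rates)

for T in (200.0, 400.0, 600.0):
    print(T, round(cost_rate(T), 5))    # pick T with the lowest rate
```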

Relevance: 30.00%

Abstract:

The CMIP3 (IPCC AR4) models show a consistent intensification and poleward shift of the westerly winds over the Southern Ocean during the 21st century. However, the responses of the Antarctic Circumpolar Current (ACC) show great diversity in these models, with many even showing reductions in transport. To obtain some understanding of these diverse responses, we investigate both the external atmospheric and the internal oceanic processes that control the ACC transport response in these models. While the strengthened westerlies act to increase the tilt of isopycnal surfaces, and hence the ACC transport, through Ekman pumping effects, the associated changes in buoyancy forcing generally tend to reduce the surface meridional density gradient. The steepening of isopycnal surfaces induced by increased wind forcing leads to enhanced (parameterized) eddy-induced transports that act to reduce the isopycnal slopes. There is also considerable narrowing of the ACC that tends to reduce its transport, caused mainly by the poleward shifts of the subtropical gyres and, to a lesser extent, by the equatorward expansions of the subpolar gyres in some models. If the combined effect of these retarding processes is larger than that of enhanced Ekman pumping, the ACC transport is reduced. In addition, the effect of Ekman pumping on the ACC is weaker in weakly stratified models. These findings give insight into the reliability of IPCC-class model predictions of the Southern Ocean circulation, and into the observed steadiness of the ACC transport on decadal scales.

Relevance: 30.00%

Abstract:

The Asian monsoon system, including the western North Pacific (WNP), East Asian and Indian monsoons, dominates the climate of the Asia-Indian Ocean-Pacific region and plays a significant role in the global hydrological and energy cycles. The prediction of monsoons and associated climate features is a major challenge in climate forecasting on the seasonal time scale. In this study, a comprehensive assessment of the interannual predictability of the WNP summer climate has been performed using the 1-month-lead retrospective forecasts (hindcasts) of five state-of-the-art coupled models from ENSEMBLES for the period 1960–2005. The spatial distribution of the temporal correlation coefficients shows that the interannual variation of precipitation is well predicted around the Maritime Continent and east of the Philippines, while the high skill for the lower-tropospheric circulation and sea surface temperature (SST) extends over almost the whole WNP. These results indicate that the models in general successfully predict the interannual variation of the WNP summer climate. Two typical indices, the WNP summer precipitation index and the WNP lower-tropospheric circulation index (WNPMI), have been used to quantify the forecast skill. The correlation coefficient between the five models' multi-model ensemble (MME) mean prediction and observations reaches 0.66 for the WNP summer precipitation index during 1979–2005 and 0.68 for the WNPMI during 1960–2005. The WNPMI-regressed anomalies of lower-tropospheric winds, SST and precipitation are similar between observations and the MME. Further analysis suggests that the prediction reliability of the WNP summer climate arises mainly from atmosphere–ocean interaction over the tropical Indian and tropical Pacific Oceans, implying that continued improvement in the representation of air–sea interaction over these regions in CGCMs is key for long-lead seasonal forecasts over the WNP and East Asia. On the other hand, the prediction of WNP summer climate anomalies exhibits a remarkable spread resulting from uncertainty in the initial conditions. The summer anomalies related to this prediction spread, including the lower-tropospheric circulation, SST and precipitation anomalies, show a Pacific-Japan or East Asia-Pacific pattern in the meridional direction over the WNP. Our further investigations suggest that the WNPMI prediction spread arises mainly from internal dynamics of the air–sea interaction over the WNP and Indian Ocean, since the local relationships among the anomalous SST, circulation and precipitation associated with the spread are similar to those associated with the interannual variation of the WNPMI in both observations and the MME. However, the magnitudes of the anomalies related to the spread are weaker, ranging from one third to one half of the anomalies associated with the interannual variation of the WNPMI in the MME over the tropical Indian Ocean and subtropical WNP. These results further support the view that improving the representation of air–sea interaction over the tropical Indian Ocean and subtropical WNP in CGCMs is key to reducing the prediction spread and improving long-lead seasonal forecasts over the WNP and East Asia.
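The headline skill measure used above is the temporal correlation between the MME mean of an index and its observed counterpart. A minimal sketch with synthetic data, illustrating why ensemble averaging raises the correlation:

```python
# Minimal sketch (synthetic data): temporal correlation skill of a
# multi-model ensemble (MME) mean against observations.
import numpy as np

rng = np.random.default_rng(0)
years = 46                                   # e.g. 1960-2005
obs = rng.standard_normal(years)             # observed index (synthetic)
# five models' hindcasts = common signal + independent noise
models = obs + rng.standard_normal((5, years)) * 1.2

mme = models.mean(axis=0)                    # MME mean prediction
skill = np.corrcoef(mme, obs)[0, 1]
print(round(skill, 2))                       # averaging damps the noise,
                                             # raising the correlation
```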

Relevance: 30.00%

Abstract:

The Richards equation has been widely used for simulating soil water movement. However, the take-up of agro-hydrological models using the basic theory of soil water flow for optimizing irrigation, fertilizer and pesticide practices is still low. This is partly due to the difficulty of obtaining accurate values for soil hydraulic properties at the field scale. Here, we use an inverse technique, based on a robust micro genetic algorithm, to deduce the effective soil hydraulic properties from the changes in the distribution of soil water with depth measured in a fallow field over a long period under natural rainfall and evaporation. A new objective function was constructed from the soil water contents at different depths and the soil water content at field capacity. The deduced soil water retention curve was approximately parallel to, but higher than, that derived from published pedo-transfer functions for a given soil pressure head. The water contents calculated from the deduced soil hydraulic properties were in good agreement with the measured values. The reliability of the deduced soil hydraulic properties was tested by reproducing data measured in an independent experiment on the same soil cropped with leek, in which the calculation of root water uptake took account of both soil water potential and root density distribution. Results show that the predicted soil water contents at various depths agree well with the measurements, indicating that the inverse analysis is an effective and reliable approach to estimating soil hydraulic properties, and thus permits accurate simulation of soil water dynamics in both cropped and fallow field soils.
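A minimal sketch of the inverse-estimation idea: recover van Genuchten retention parameters by minimizing the misfit between "measured" and modelled water contents, with SciPy's differential evolution standing in for the paper's micro genetic algorithm. The data, noise level and true parameters are synthetic assumptions.

```python
# Minimal sketch: inverse estimation of van Genuchten retention
# parameters from synthetic retention data.
import numpy as np
from scipy.optimize import differential_evolution

def van_genuchten(h, theta_r, theta_s, alpha, n):
    m = 1.0 - 1.0 / n
    return theta_r + (theta_s - theta_r) / (1.0 + (alpha * np.abs(h)) ** n) ** m

h = np.geomspace(1.0, 1.5e4, 25)                   # pressure heads (cm)
rng = np.random.default_rng(3)
theta_obs = van_genuchten(h, 0.08, 0.43, 0.04, 1.6) \
            + rng.normal(0.0, 0.005, h.size)       # "measured" contents

def misfit(p):
    return np.sum((van_genuchten(h, *p) - theta_obs) ** 2)

bounds = [(0.0, 0.2), (0.3, 0.6), (0.001, 0.2), (1.1, 3.0)]
result = differential_evolution(misfit, bounds, seed=0)
print(np.round(result.x, 3))  # close to the true (0.08, 0.43, 0.04, 1.6)
```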

Relevance: 30.00%

Abstract:

Extratropical cyclones produce the majority of precipitation in many regions of the extratropics. This study evaluates the ability of a climate model, HiGEM, to reproduce the precipitation associated with extratropical cyclones, using the ERA-Interim reanalysis and the GPCP dataset as references. The analysis employs a cyclone-centred compositing technique, evaluates composites across a range of geographical areas and cyclone intensities, and investigates the ability of the model to reproduce the climatological distribution of cyclone-associated precipitation across the Northern Hemisphere. This phenomenon-centred approach makes it possible to identify the processes responsible for climatological biases in the model. Composite precipitation intensities are found to be comparable when all cyclones across the Northern Hemisphere are included. When the cyclones are filtered by region or intensity, differences emerge; in particular, HiGEM produces too much precipitation in its most intense cyclones relative to ERA-Interim and GPCP. Biases in the climatological distribution of cyclone-associated precipitation are also found around the storm-track regions, associated both with the number of cyclones in HiGEM and with their average precipitation intensity. These results have implications for the reliability of the model's future projections of extratropical precipitation.
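A minimal sketch of cyclone-centred compositing: cut a fixed window of the precipitation field around each tracked cyclone centre and average over cyclones. The field and track positions here are synthetic assumptions.

```python
# Minimal sketch of cyclone-centred compositing on a synthetic
# precipitation field (time, lat, lon).
import numpy as np

rng = np.random.default_rng(2)
field = rng.gamma(2.0, 1.0, size=(50, 200, 300))   # synthetic field
# one (t, j, i) centre per tracked cyclone, kept away from the edges
centres = [(t, rng.integers(20, 180), rng.integers(20, 280))
           for t in range(50)]

half = 15                                          # window half-width
windows = [field[t, j - half:j + half + 1, i - half:i + half + 1]
           for t, j, i in centres]
composite = np.mean(windows, axis=0)               # (31, 31) composite
print(composite.shape, composite.mean())
```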

Relevance: 30.00%

Abstract:

In this paper, we consider some non-homogeneous Poisson models to estimate the probability that an air quality standard is exceeded a given number of times in a time interval of interest. We assume that the exceedances occur according to a non-homogeneous Poisson process (NHPP) with rate function λ(t), t ≥ 0, which depends on parameters that must be estimated. We consider two forms of rate function, the Weibull and the Goel-Okumoto, and models with and without change-points. When change-points are assumed present, there may be one, two or three of them, depending on the data set. The parameters of the rate functions are estimated using a Gibbs sampling algorithm. The results are applied to ozone data provided by the Mexico City monitoring network. In a first instance, we assume that no change-points are present; depending on the fit of the model, we then allow one, two or three change-points.
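A minimal sketch of the forward calculation under the Weibull rate (parameters invented; the paper estimates them by Gibbs sampling and allows change-points): with cumulative rate Λ(T) = (T/σ)^β, the number of exceedances in (0, T] is Poisson with mean Λ(T).

```python
# Minimal sketch (illustrative parameters): exceedance-count
# probabilities under an NHPP with Weibull rate
# lambda(t) = (beta/sigma) * (t/sigma)**(beta - 1),
# so the mean count is Lambda(T) = (T/sigma)**beta.
from scipy.stats import poisson

beta, sigma, T = 0.8, 30.0, 365.0
mean_count = (T / sigma) ** beta

for k in range(4):
    print(k, round(poisson.pmf(k, mean_count), 4))   # P(N(T) = k)
print("P(N(T) > 10) =", round(poisson.sf(10, mean_count), 4))
```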