992 results for Reversed hazard rate


Relevance:

80.00%

Abstract:

This paper presents a methodology and software for hazard rate analysis of induction type watt-hour meters, considering the main variables related to the degradation process of these meters, for Elektro Electricity and Services SA. The model developed to calculate the watt-hour meter hazard rate was implemented in a user-friendly tool written in the Delphi language. The tool provides not only hazard rate analysis but also classification by risk range and localization of the installation site of the analyzed meters, and, through an expert system based on the risk model developed with artificial intelligence, it supports the sampling of induction type watt-hour meters, with the main goal of following and managing the degradation, maintenance and replacement processes of these meters. © 2010 IEEE.

Relevance:

80.00%

Abstract:

This paper presents a methodology and a specialist tool for failure probability analysis of induction type watt-hour meters, considering the main variables related to their measurement degradation processes. The database of the metering park of a distribution company, Elektro Electricity and Services Co., was used to determine the most relevant variables and to feed the data into the software. The model developed to calculate the watt-hour meter probability of failure was implemented in a user-friendly tool written in the Delphi language. Among the main features of this tool are: analysis of probability of failure by risk range; geographical localization of the meters in the metering park; and automatic sampling of induction type watt-hour meters, based on a risk classification expert system, in order to obtain information to aid the management of these meters. The main goals of the specialist tool are following and managing the measurement degradation, maintenance and replacement processes for induction watt-hour meters. © 2011 IEEE.
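The abstract does not state the underlying lifetime model, so as a minimal sketch of risk-range classification, the following fits a Weibull distribution to hypothetical meter failure ages and bins meters by estimated failure probability; all data, thresholds, and function names are invented for illustration.

```python
import numpy as np
from scipy.stats import weibull_min

# Hypothetical ages (in years) at which induction meters were observed to fail.
failure_ages = np.array([8.2, 12.5, 15.1, 9.7, 20.3, 17.8, 11.0, 14.6, 22.4, 13.3])

# Fit a two-parameter Weibull (location fixed at 0) to the failure ages.
shape, loc, scale = weibull_min.fit(failure_ages, floc=0)

def failure_probability(age_years):
    """Probability that a meter has failed by a given age (Weibull CDF)."""
    return weibull_min.cdf(age_years, shape, loc=0, scale=scale)

def risk_range(p):
    """Map a failure probability to a coarse risk class (thresholds invented)."""
    if p < 0.25:
        return "low"
    elif p < 0.60:
        return "medium"
    return "high"

for age in (5, 10, 15, 20):
    p = failure_probability(age)
    print(f"age {age:>2} yr: P(failure) = {p:.2f} -> {risk_range(p)} risk")
```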

Relevance:

80.00%

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

80.00%

Abstract:

The Gumbel distribution is perhaps the most widely applied statistical distribution for problems in engineering. We propose a generalization, referred to as the Kumaraswamy Gumbel distribution, and provide a comprehensive treatment of its structural properties. We obtain the analytical shapes of the density and hazard rate functions. We calculate explicit expressions for the moments and the generating function. The variation of the skewness and kurtosis measures is examined, and the asymptotic distribution of the extreme values is investigated. Explicit expressions are also derived for the moments of order statistics. The methods of maximum likelihood and parametric bootstrap and a Bayesian procedure are proposed for estimating the model parameters. We obtain the expected information matrix. An application to a real dataset illustrates the potential of the new model. Two bivariate generalizations of the model are proposed.
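Assuming the standard Kumaraswamy-G construction, F(x) = 1 - (1 - G(x)^a)^b with G the Gumbel CDF, the density and hazard rate can be sketched directly; the parameter values below are arbitrary.

```python
import numpy as np

def gumbel_cdf(x, mu=0.0, sigma=1.0):
    z = (x - mu) / sigma
    return np.exp(-np.exp(-z))

def gumbel_pdf(x, mu=0.0, sigma=1.0):
    z = (x - mu) / sigma
    return np.exp(-z - np.exp(-z)) / sigma

def kw_gumbel_cdf(x, a, b, mu=0.0, sigma=1.0):
    # Kumaraswamy-G: F(x) = 1 - (1 - G(x)^a)^b
    G = gumbel_cdf(x, mu, sigma)
    return 1.0 - (1.0 - G**a) ** b

def kw_gumbel_pdf(x, a, b, mu=0.0, sigma=1.0):
    # f(x) = a * b * g(x) * G(x)^(a-1) * (1 - G(x)^a)^(b-1)
    G = gumbel_cdf(x, mu, sigma)
    g = gumbel_pdf(x, mu, sigma)
    return a * b * g * G ** (a - 1) * (1.0 - G**a) ** (b - 1)

def kw_gumbel_hazard(x, a, b, mu=0.0, sigma=1.0):
    return kw_gumbel_pdf(x, a, b, mu, sigma) / (1.0 - kw_gumbel_cdf(x, a, b, mu, sigma))

x = np.linspace(-2, 6, 5)
print(kw_gumbel_hazard(x, a=2.0, b=0.5))
```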

Relevance:

80.00%

Abstract:

In this paper we introduce an extension of the Lindley distribution which offers a more flexible model for lifetime data. Several statistical properties of the distribution are explored, such as the density, the (reversed) failure rate, the (reversed) mean residual lifetime, moments, order statistics, and the Bonferroni and Lorenz curves. Maximum likelihood estimation and inference for a random sample from the distribution are investigated. A real data application illustrates the performance of the distribution. (C) 2011 The Korean Statistical Society. Published by Elsevier B.V. All rights reserved.
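The abstract does not specify the particular extension, so as a baseline, here is a sketch of the plain one-parameter Lindley density together with its failure rate and reversed failure rate (the topic of this search); the formulas are the standard ones for the base distribution, not the paper's extended model.

```python
import numpy as np

def lindley_pdf(x, theta):
    # f(x) = theta^2 / (theta + 1) * (1 + x) * exp(-theta * x), x > 0
    return theta**2 / (theta + 1.0) * (1.0 + x) * np.exp(-theta * x)

def lindley_cdf(x, theta):
    # F(x) = 1 - (theta + 1 + theta * x) / (theta + 1) * exp(-theta * x)
    return 1.0 - (theta + 1.0 + theta * x) / (theta + 1.0) * np.exp(-theta * x)

def failure_rate(x, theta):
    # h(x) = f(x) / (1 - F(x)) = theta^2 (1 + x) / (theta + 1 + theta * x)
    return lindley_pdf(x, theta) / (1.0 - lindley_cdf(x, theta))

def reversed_failure_rate(x, theta):
    # r(x) = f(x) / F(x)
    return lindley_pdf(x, theta) / lindley_cdf(x, theta)

x = np.linspace(0.1, 5.0, 5)
print(failure_rate(x, theta=1.5))
print(reversed_failure_rate(x, theta=1.5))
```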

Relevance:

80.00%

Abstract:

In this article, for the first time, we propose the negative binomial-beta Weibull (BW) regression model for studying the recurrence of prostate cancer and for predicting the cure fraction of patients with clinically localized prostate cancer treated by open radical prostatectomy. The cure model assumes that a fraction of the survivors are cured of the disease. The survival function for the population of patients can be modeled by a parametric cure model using the BW distribution. We derive an explicit expansion for the moments of the recurrence time distribution of the uncured individuals. The proposed distribution can be used to model survival data when the hazard rate function is increasing, decreasing, unimodal or bathtub shaped. Another advantage is that the proposed model includes as special sub-models some of the well-known cure rate models discussed in the literature. We derive the appropriate matrices for assessing local influence on the parameter estimates under different perturbation schemes. We analyze a real data set of localized prostate cancer patients after open radical prostatectomy.
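The flexibility claim rests on the beta Weibull hazard. A minimal sketch, assuming the usual beta-G construction in which the Weibull CDF is plugged into a regularized incomplete beta function; the parameter choices are invented to tilt the hazard toward different shapes.

```python
import numpy as np
from scipy.special import betainc, beta as beta_fn

def weibull_cdf(x, k, lam):
    return 1.0 - np.exp(-(x / lam) ** k)

def weibull_pdf(x, k, lam):
    return (k / lam) * (x / lam) ** (k - 1) * np.exp(-(x / lam) ** k)

def bw_pdf(x, a, b, k, lam):
    # Beta-G density: f(x) = g(x) * G^(a-1) * (1-G)^(b-1) / B(a, b)
    G = weibull_cdf(x, k, lam)
    return weibull_pdf(x, k, lam) * G ** (a - 1) * (1 - G) ** (b - 1) / beta_fn(a, b)

def bw_hazard(x, a, b, k, lam):
    # Survival is 1 minus the regularized incomplete beta evaluated at G(x).
    G = weibull_cdf(x, k, lam)
    return bw_pdf(x, a, b, k, lam) / (1.0 - betainc(a, b, G))

x = np.linspace(0.05, 3.0, 6)
# Different (a, b, k) choices produce increasing, decreasing,
# unimodal, or bathtub-shaped hazards.
for a, b, k in [(0.5, 0.5, 0.5), (2.0, 0.8, 1.5)]:
    print(bw_hazard(x, a, b, k, lam=1.0))
```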

Relevance:

80.00%

Abstract:

In my PhD thesis I propose a Bayesian nonparametric estimation method for structural econometric models where the functional parameter of interest describes the economic agent's behavior. The structural parameter is characterized as the solution of a functional equation or, in more technical words, as the solution of an inverse problem that can be either ill-posed or well-posed. From a Bayesian point of view, the parameter of interest is a random function and the solution to the inference problem is the posterior distribution of this parameter. A regular version of the posterior distribution in functional spaces is characterized. However, the infinite dimension of the considered spaces causes a problem of discontinuity of the solution and hence a problem of inconsistency of the posterior distribution from a frequentist point of view (i.e., a problem of ill-posedness). The contribution of this essay is to propose new methods to deal with this ill-posedness. The first consists of adopting a Tikhonov regularization scheme in the construction of the posterior distribution, yielding a new object that I call the regularized posterior distribution and that I propose as a solution of the inverse problem. The second approach consists of specifying a g-type prior distribution on the parameter of interest. I then identify a class of models for which this prior distribution is able to correct for the ill-posedness even in infinite dimensional problems. I study the asymptotic properties of these proposed solutions and prove that, under some regularity conditions satisfied by the true value of the parameter of interest, they are consistent in a frequentist sense. Having set out the general theory, I apply my Bayesian nonparametric methodology to different estimation problems. First, I apply the estimator to deconvolution and to hazard rate, density and regression estimation. Then, I consider the estimation of an instrumental regression, which is useful in microeconometrics when dealing with problems of endogeneity. Finally, I develop an application in finance: I obtain the Bayesian estimator of the equilibrium asset pricing functional by using the Euler equation defined in Lucas' (1978) tree-type models.
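The Tikhonov step has a transparent finite-dimensional analogue. In the sketch below (smoothing operator, noise level, and regularization parameter all invented), naive inversion of the near-singular system would amplify noise, while the regularized solution (K'K + alpha*I)^(-1) K'y stays stable.

```python
import numpy as np

rng = np.random.default_rng(0)

# Discretized ill-posed problem y = K @ phi + noise: K is a smoothing
# (near-singular) operator, so naive inversion amplifies the noise.
n = 100
s = np.linspace(0, 1, n)
K = np.exp(-50.0 * (s[:, None] - s[None, :]) ** 2)  # Gaussian smoothing kernel
K /= K.sum(axis=1, keepdims=True)                   # rows sum to one
phi_true = np.sin(2 * np.pi * s)
y = K @ phi_true + 0.01 * rng.standard_normal(n)

alpha = 1e-3  # Tikhonov regularization parameter (tuning it is the hard part)
phi_hat = np.linalg.solve(K.T @ K + alpha * np.eye(n), K.T @ y)
print("max error of regularized solution:", np.max(np.abs(phi_hat - phi_true)))
```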

Relevance:

80.00%

Abstract:

The generalized failure rate of a continuous random variable has demonstrable importance in operations management. If the valuation distribution of a product has an increasing generalized failure rate (that is, the distribution is IGFR), then the associated revenue function is unimodal, and when the generalized failure rate is strictly increasing, the global maximum is uniquely specified. The assumption that the distribution is IGFR is thus useful and frequently invoked in the recent pricing, revenue, and supply chain management literature. This note contributes to the IGFR literature in several ways. First, we investigate the prevalence of the IGFR property for left and right truncations of valuation distributions. Second, we extend the IGFR notion to discrete distributions and contrast it with the continuous case. The note also addresses two errors in the previous IGFR literature. Finally, for future reference, we analyze all common (continuous and discrete) distributions for the prevalence of the IGFR property, and derive and tabulate their generalized failure rates.
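For a distribution with density f and CDF F, the generalized failure rate is g(x) = x f(x) / (1 - F(x)), and the revenue R(p) = p(1 - F(p)) satisfies R'(p) = (1 - F(p))(1 - g(p)), which is why an increasing g yields a unimodal revenue. A short numerical sketch (the lognormal parameters are arbitrary) checks monotonicity of g on a grid and locates the revenue maximizer.

```python
import numpy as np
from scipy.stats import lognorm

# Generalized failure rate g(x) = x * f(x) / (1 - F(x)), here for a
# lognormal valuation distribution, a standard example in this literature.
dist = lognorm(s=0.8)

def gfr(x):
    return x * dist.pdf(x) / dist.sf(x)

x = np.linspace(0.1, 10.0, 500)
g = gfr(x)
print("g(x) nondecreasing on grid:", bool(np.all(np.diff(g) >= -1e-12)))

# Revenue R(p) = p * (1 - F(p)); its derivative is (1 - F(p)) * (1 - g(p)),
# so with strictly increasing g the interior maximizer is unique.
revenue = x * dist.sf(x)
print("revenue maximized near p =", x[np.argmax(revenue)])
```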

Relevance:

80.00%

Abstract:

It is of interest in some applications to determine whether there is a relationship between a hazard rate function (or a cumulative incidence function) and a mark variable which is only observed at uncensored failure times. We develop nonparametric tests for this problem when the mark variable is continuous. Tests are developed for the null hypothesis that the mark-specific hazard rate is independent of the mark versus ordered and two-sided alternatives expressed in terms of mark-specific hazard functions and mark-specific cumulative incidence functions. The test statistics are based on functionals of a bivariate test process equal to a weighted average of differences between a Nelson-Aalen-type estimator of the mark-specific cumulative hazard function and a nonparametric estimator of this function under the null hypothesis. The weight function in the test process can be chosen so that the test statistics are asymptotically distribution-free. Asymptotically correct critical values are obtained through a simple simulation procedure. The testing procedures are shown to perform well in numerical studies, and are illustrated with an AIDS clinical trial example. Specifically, the tests are used to assess if the instantaneous or absolute risk of treatment failure depends on the amount of accumulation of drug resistance mutations in a subject's HIV virus. This assessment helps guide development of anti-HIV therapies that surmount the problem of drug resistance.
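The building block of the test process is a Nelson-Aalen-type cumulative hazard estimator. Below is a minimal sketch of the basic (unmarked, unweighted) estimator H(t) = sum of d_i / n_i over failure times up to t, with invented data; ties are handled by decrementing the risk set one observation at a time, which is adequate for illustration.

```python
import numpy as np

def nelson_aalen(times, events):
    """Nelson-Aalen estimate of the cumulative hazard H(t) = sum d_i / n_i."""
    order = np.argsort(times)
    times, events = np.asarray(times)[order], np.asarray(events)[order]
    n_at_risk = len(times)
    t_out, H, cum = [], [], 0.0
    for t, d in zip(times, events):
        if d:  # uncensored failure contributes 1 / (number still at risk)
            cum += 1.0 / n_at_risk
            t_out.append(t)
            H.append(cum)
        n_at_risk -= 1  # failures and censorings both leave the risk set
    return np.array(t_out), np.array(H)

# Invented follow-up times; event=1 means observed failure, 0 means censored.
t, H = nelson_aalen([2.0, 3.5, 3.5, 5.0, 7.1], [1, 1, 0, 1, 0])
print(dict(zip(t, np.round(H, 3))))
```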

Relevance:

80.00%

Abstract:

BACKGROUND: Prediction studies in subjects at Clinical High Risk (CHR) for psychosis are hampered by a high proportion of uncertain outcomes. We therefore investigated whether quantitative EEG (QEEG) parameters can contribute to an improved identification of CHR subjects with a later conversion to psychosis. METHODS: This investigation was a project within the European Prediction of Psychosis Study (EPOS), a prospective multicenter, naturalistic field study with an 18-month follow-up period. QEEG spectral power and alpha peak frequencies (APF) were determined in 113 CHR subjects. The primary outcome measure was conversion to psychosis. RESULTS: Cox regression yielded a model including frontal theta (HR=1.82; 95% CI 1.00-3.32) and delta (HR=2.60; 95% CI 1.30-5.20) power, and occipital-parietal APF (HR=0.52; 95% CI 0.35-0.80) as predictors of conversion to psychosis. The resulting equation enabled the development of a prognostic index with three risk classes (hazard rates 0.057 to 0.81). CONCLUSIONS: Power in the theta and delta ranges and APF contribute to the short-term prediction of psychosis and enable a further stratification of risk in CHR samples. Combined with (other) clinical ratings, EEG parameters may therefore be a useful tool for individualized risk estimation and, consequently, targeted prevention.
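From the hazard ratios reported above one can reconstruct the Cox linear predictor (prognostic index) up to the unknown scaling of the predictors; the sketch below treats the inputs as standardized values, which is an assumption, not something stated in the abstract.

```python
import numpy as np

# Log hazard ratios taken from the reported HRs; the per-unit scaling used
# for each QEEG predictor in EPOS is not given here, so this is indicative.
beta = {
    "frontal_theta": np.log(1.82),
    "frontal_delta": np.log(2.60),
    "occipital_parietal_apf": np.log(0.52),
}

def prognostic_index(theta, delta, apf):
    """Cox linear predictor; higher values indicate higher conversion risk."""
    return (beta["frontal_theta"] * theta
            + beta["frontal_delta"] * delta
            + beta["occipital_parietal_apf"] * apf)

# Hypothetical standardized values for one CHR subject.
print(round(prognostic_index(theta=1.2, delta=0.4, apf=-0.8), 3))
```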

Relevance:

80.00%

Abstract:

This paper is concerned with the analysis of zero-inflated count data when time of exposure varies. It proposes a modified zero-inflated count data model where the probability of an extra zero is derived from an underlying duration model with Weibull hazard rate. The new model is compared to the standard Poisson model with logit zero inflation in an application to the effect of treatment with thiotepa on the number of new bladder tumors.
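One plausible reading of the model, sketched below with invented parameter values: the extra-zero probability is the Weibull survival probability over the exposure time t, and the Poisson mean scales with t. This illustrates the idea, not the paper's exact specification.

```python
import numpy as np
from scipy.special import gammaln

def loglik_zip_weibull(params, y, t):
    """Log-likelihood of a zero-inflated Poisson where the probability of an
    extra zero is the Weibull survival evaluated at the exposure time t."""
    log_mu, log_k, log_lam = params
    mu, k, lam = np.exp(log_mu), np.exp(log_k), np.exp(log_lam)
    pi = np.exp(-(t / lam) ** k)   # P(still "structurally zero" at exposure t)
    rate = mu * t                  # Poisson mean scales with exposure
    log_pois = y * np.log(rate) - rate - gammaln(y + 1)
    ll = np.where(y == 0,
                  np.log(pi + (1 - pi) * np.exp(-rate)),
                  np.log1p(-pi) + log_pois)
    return ll.sum()

# Invented counts and exposure times; in practice one would maximize this
# log-likelihood over the three (log-)parameters.
y = np.array([0, 0, 2, 1, 0, 4])
t = np.array([0.5, 1.0, 2.0, 1.5, 0.3, 3.0])
print(loglik_zip_weibull(np.log([1.0, 1.2, 2.0]), y, t))
```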

Relevance:

80.00%

Abstract:

Over the past three decades Germany has repeatedly deregulated the law on temporary agency work by stepwise increasing the maximum period for hiring-out employees and allowing temporary work agencies to conclude fixed-term contracts. These reforms should have had an effect on employment duration within temporary work agencies. Based on an informative administrative data set we use a mixed proportional hazard rate model to examine whether employment duration has changed in response to these reforms. We find that the repeated prolongation of the maximum period for hiring-out employees significantly increased average employment duration while the authorization of fixed-term contracts reduced employment tenure.
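A mixed proportional hazard (MPH) model multiplies a baseline hazard, a covariate effect, and an unobserved frailty term. A minimal simulation sketch (Weibull baseline, gamma frailty, one binary covariate; all values invented) shows how durations are generated under such a model.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000
alpha, beta = 1.3, -0.5            # Weibull baseline shape; covariate effect

x = rng.binomial(1, 0.5, n)        # e.g., hired under a fixed-term contract
v = rng.gamma(2.0, 0.5, n)         # unobserved heterogeneity (mean-one gamma frailty)
e = rng.exponential(1.0, n)

# MPH hazard: theta(t | x, v) = alpha * t^(alpha-1) * exp(x*beta) * v.
# Inverting the integrated hazard Lambda(t) = t^alpha * exp(x*beta) * v gives:
T = (e / (np.exp(x * beta) * v)) ** (1.0 / alpha)

print("mean duration, x=0:", T[x == 0].mean().round(2))
print("mean duration, x=1:", T[x == 1].mean().round(2))
```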

Relevance:

80.00%

Abstract:

This dissertation consists of three separate essays on job search and labor market dynamics. In the first essay, “The Impact of Labor Market Conditions on Job Creation: Evidence from Firm Level Data”, I study how much changes in labor market conditions reduce employment fluctuations over the business cycle. Changes in labor market conditions make hiring more expensive during expansions and cheaper during recessions, creating counter-cyclical incentives for job creation. I estimate firm level elasticities of labor demand with respect to changes in labor market conditions, considering two margins: changes in labor market tightness and changes in wages. Using employer-employee matched data from Brazil, I find that all firms are more sensitive to changes in wages than to changes in labor market tightness, and that there is substantial heterogeneity in labor demand elasticity across regions. Based on these results, I demonstrate that changes in labor market conditions reduce the variance of employment growth over the business cycle by 20% in the median region, and this effect is equally driven by changes along each margin. Moreover, I show that the magnitude of the effect of labor market conditions on employment growth can be significantly affected by economic policy. In particular, I document that the rapid growth of the national minimum wage in Brazil in 1997-2010 amplified the impact of the change in labor market conditions during local expansions and diminished this impact during local recessions.

In the second essay, “A Framework for Estimating Persistence of Local Labor Demand Shocks”, I propose a decomposition which allows me to study the persistence of local labor demand shocks. Persistence of labor demand shocks varies across industries, and the incidence of shocks in a region depends on the regional industrial composition. As a result, less diverse regions are more likely to experience deeper shocks, but not necessarily more long-lasting shocks. Building on this idea, I propose a decomposition of local labor demand shocks into idiosyncratic location shocks and nationwide industry shocks and estimate the variance and the persistence of these shocks using the Quarterly Census of Employment and Wages (QCEW) in 1990-2013.
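A stylized, equal-weight illustration of such a decomposition (not the essay's estimator, which will differ in weighting and identification): extract the industry component as a cross-region mean and the location component as the residual cross-industry mean, on simulated data.

```python
import numpy as np

rng = np.random.default_rng(3)
n_ind, n_reg, n_t = 20, 50, 24

# Simulated log-employment growth: nationwide industry shocks plus
# idiosyncratic location shocks plus noise (all magnitudes invented).
industry_shock = rng.normal(0, 0.02, (n_ind, 1, n_t))
location_shock = rng.normal(0, 0.03, (1, n_reg, n_t))
growth = industry_shock + location_shock + rng.normal(0, 0.01, (n_ind, n_reg, n_t))

# Industry component: cross-region mean within an industry-period.
ind_component = growth.mean(axis=1, keepdims=True)
# Location component: cross-industry mean of the residual within a region-period.
loc_component = (growth - ind_component).mean(axis=0, keepdims=True)

print("variance shares (industry, location):",
      round(ind_component.var() / growth.var(), 2),
      round(loc_component.var() / growth.var(), 2))
```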

In the third essay, “Conditional Choice Probability Estimation of Continuous-Time Job Search Models”, co-authored with Peter Arcidiacono and Arnaud Maurel, we propose a novel, computationally feasible method of estimating non-stationary job search models. Non-stationary job search models arise in many applications where a policy change can be anticipated by workers; the most prominent example of such a policy is the expiration of unemployment benefits. Estimating these models poses a considerable computational challenge, because a differential equation must be solved numerically at each step of the optimization routine. We overcome this challenge by adapting conditional choice probability methods, widely used in the dynamic discrete choice literature, to job search models, and we show how the hazard rate out of unemployment and the distribution of accepted wages, which can be estimated in many datasets, can be used to infer the value of unemployment. We demonstrate how to apply our method by analyzing the effect of unemployment benefit expiration on the duration of unemployment, using data from the Survey of Income and Program Participation (SIPP) in 1996-2007.
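In the stationary benchmark (the McCall model; the essay's non-stationary setting is more involved), the hazard out of unemployment is the offer arrival rate times the acceptance probability, h = lam * (1 - F(w*)), with the reservation wage w* pinned down by an indifference condition. A sketch with invented parameters:

```python
import numpy as np
from scipy.stats import lognorm
from scipy.integrate import quad
from scipy.optimize import brentq

b, lam, r = 0.4, 0.8, 0.05        # flow benefit, offer arrival rate, discount rate
F = lognorm(s=0.5)                 # wage-offer distribution (hypothetical)

def surplus(wstar):
    """E[max(w - w*, 0)] under the offer distribution."""
    val, _ = quad(lambda w: (w - wstar) * F.pdf(w), wstar, np.inf)
    return val

# w* solves w* = b + (lam / r) * E[max(w - w*, 0)]; the left side minus the
# right side is strictly increasing in w*, so root-finding is straightforward.
wstar = brentq(lambda w: w - b - (lam / r) * surplus(w), b, 50.0)
print("reservation wage:", round(wstar, 3))
print("hazard out of unemployment:", round(lam * F.sf(wstar), 4))
```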

Relevance:

80.00%

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevance:

80.00%

Abstract:

We present a brief historical note on the evolution of line transect sampling and its theoretical developments. We describe line transect sampling theory as proposed by Buckland (1992) and present the most relevant issues concerning the modeling of the detection function. We describe the CDM principle (Rissanen, 1978) and its application to histogram density estimation (Kontkanen and Myllymäki, 2006), with a practical example using a mixture of densities. We then apply it to estimate the probability of detection and the animal population density in the context of line transect sampling. Two classical examples from the distance sampling literature are analyzed and compared. In order to evaluate the proposed methodology, we carry out a simulation study based on a wooden stakes example, using the half-normal, hazard-rate, exponential, and uniform-with-cosine detection functions. The results were obtained with the program DISTANCE (Thomas et al., in press) and an algorithm written in the C language, kindly provided by Professor Petri Kontkanen (Department of Computer Science, University of Helsinki). We also developed programs to compute confidence intervals using the bootstrap technique (Efron, 1978). Finally, the results are discussed and suggestions for future developments are presented.
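The four detection functions named above have standard distance sampling forms. The sketch below (parameters, detection count, and effort all invented) evaluates each, integrates it to the effective strip half-width mu, and forms the usual line transect density estimate D = n / (2 * mu * L).

```python
import numpy as np
from scipy.integrate import quad

w = 20.0  # truncation distance of the strip, in metres (invented)

# Standard detection function forms; all parameter values are invented.
detection = {
    "half-normal":      lambda x, s=8.0: np.exp(-x**2 / (2 * s**2)),
    "hazard-rate":      lambda x, s=8.0, b=2.5: 1.0 - np.exp(-(x / s) ** (-b)),
    "exponential":      lambda x, s=8.0: np.exp(-x / s),
    "uniform + cosine": lambda x, a=0.6: (1.0 + a * np.cos(np.pi * x / w)) / (1.0 + a),
}

n, L = 68, 4.8  # number of detections and total transect length in km (invented)
for name, g in detection.items():
    mu, _ = quad(g, 0, w)                # effective strip half-width in metres
    p_hat = mu / w                       # average detection probability in the strip
    d_hat = n / (2 * (mu / 1000.0) * L)  # animals per square km
    print(f"{name:>16}: mu = {mu:5.2f} m, p-hat = {p_hat:.2f}, D-hat = {d_hat:6.1f} /km^2")
```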