975 results for parametric duration models


Relevance: 80.00%

Abstract:

Hazard models, also known as time-to-failure or duration models, are used to determine which independent variables have the greatest explanatory power in predicting corporate bankruptcy. They are an alternative approach to binary logit and probit models and to discriminant analysis. Duration models should be more efficient than discrete-choice models because they take survival time into account when estimating the instantaneous probability of failure for a set of observations on an independent variable. Discrete-choice models typically ignore time-to-failure information and provide only an estimate of the probability of failing within a given time interval. The question discussed in this work is how to use hazard models to project default rates and to build migration matrices conditioned on the state of the economy. Conceptually, the model is closely analogous to the historical default and mortality rates used in the credit literature. The semiparametric Cox proportional hazards model is tested on Brazilian firms outside the financial sector, and the probability of default is found to fall markedly after the third year following loan issuance. The mean and standard deviation of default probabilities are also found to be affected by economic cycles. We discuss how the Cox proportional hazards model can be incorporated into the four best-known credit risk management models in use today, CreditRisk+, KMV, CreditPortfolio View, and CreditMetrics, and the improvements that result from this incorporation.
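The core step above, a Cox proportional hazards fit on firm survival data, can be sketched with a standard survival library. A minimal illustration in Python using lifelines, on made-up data with hypothetical column names (the study's Brazilian loan data are not reproduced):

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical firm-level records: years from loan issuance to default,
# a default indicator, and two illustrative covariates.
df = pd.DataFrame({
    "years_to_default": [1.2, 3.5, 0.8, 5.0, 2.1, 4.4, 1.7, 3.0, 2.6, 4.9],
    "defaulted":        [1,   0,   1,   0,   1,   0,   1,   1,   0,   0],
    "leverage":         [0.9, 0.7, 1.1, 0.3, 0.5, 0.8, 1.0, 0.6, 0.4, 0.2],
    "gdp_growth":       [-1.0, 2.5, -0.5, 3.0, 0.2, 1.8, -0.8, 0.5, 2.0, 2.8],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="years_to_default", event_col="defaulted")
cph.print_summary()  # hazard ratios for the firm-level and macro covariates
```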

Relevance: 80.00%

Abstract:

In survival analysis, long-duration (cure rate) models allow estimation of the cure fraction, which represents the proportion of the population immune to the event of interest. Here we address classical and Bayesian estimation based on mixture models and promotion time models, using different distributions (exponential, Weibull and Pareto) to model failure time. The database used to illustrate the implementations is described in Kersey et al. (1987) and consists of a group of leukemia patients who underwent a certain type of transplant. The specific implementations used were numerical optimization by BFGS as implemented in R (base::optim), a Laplace approximation (our own implementation), and Gibbs sampling as implemented in WinBUGS. We describe the main features of the models used, the estimation methods, and the computational aspects. We also discuss how different prior information can affect the Bayesian estimates.
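As a rough analogue of the BFGS route via R's base::optim, the following is a minimal sketch of classical (maximum likelihood) estimation of an exponential mixture cure model on simulated censored data; the Kersey et al. (1987) data, the Weibull and Pareto variants, and the Bayesian fits are not reproduced here.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

rng = np.random.default_rng(0)
# Toy right-censored data: failure times and event indicators (1 = observed).
t = rng.exponential(2.0, 50)
d = rng.integers(0, 2, 50)

def negloglik(theta):
    # Mixture cure model with exponential failure times:
    # S(t) = p + (1 - p) * exp(-lam * t), where p is the cure fraction.
    p = expit(theta[0])      # keeps the cure fraction in (0, 1)
    lam = np.exp(theta[1])   # keeps the rate positive
    f = (1 - p) * lam * np.exp(-lam * t)    # density contribution (events)
    S = p + (1 - p) * np.exp(-lam * t)      # improper survival (censored)
    return -np.sum(d * np.log(f) + (1 - d) * np.log(S))

fit = minimize(negloglik, x0=[0.0, 0.0], method="BFGS")
p_hat = expit(fit.x[0])  # estimated cure fraction
```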

Relevance: 80.00%

Abstract:

G.fast is a new International Telecommunication Union standard that aims to deliver 1 Gb/s over short copper links using frequencies of up to 212 MHz. This new technology requires accurate parametric cable models for design, simulation, and performance-evaluation testing. Most copper cable models were developed with the VDSL spectrum in mind (that is, frequencies up to 30 MHz) and adopt assumptions that are violated when the frequency range is extended to G.fast frequencies. This thesis presents new, simple, causal cable models capable of accurately characterizing copper links composed of single or multiple segments, in both the time domain and the frequency domain. Results using the reference topologies of the G.fast standard show that, besides being accurate, the new models are attractive because of their low computational cost and closed-form formulas for fitting their parameters to measured data.
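The thesis's models themselves are not reproduced here, but the standard building block they refine is the RLCG description of a copper segment, whose transfer function follows from textbook transmission-line theory. A sketch with illustrative, non-calibrated constants (in practice R, L, G and C are frequency dependent):

```python
import numpy as np

def transfer_function(f, R, L, G, C, d):
    """Voltage transfer function of a matched copper segment of length d (m),
    from per-unit-length RLCG parameters (standard transmission-line theory)."""
    w = 2 * np.pi * f
    gamma = np.sqrt((R + 1j * w * L) * (G + 1j * w * C))  # propagation constant
    return np.exp(-gamma * d)

# Illustrative constants for a 100 m segment, evaluated up to 212 MHz.
f = np.linspace(1e6, 212e6, 500)
H = transfer_function(f, R=0.18, L=0.6e-6, G=1e-9, C=50e-12, d=100.0)
insertion_loss_db = -20 * np.log10(np.abs(H))
```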

Relevance: 80.00%

Abstract:

Time series models relating short-term changes in air pollution levels to daily mortality counts typically assume that the effects of air pollution on the log relative rate of mortality do not vary with time. However, these short-term effects might plausibly vary by season: changes in the sources of air pollution and in meteorology can change the characteristics of the air pollution mixture across seasons. The authors develop Bayesian semi-parametric hierarchical models for estimating time-varying effects of pollution on mortality in multi-site time series studies. The methods are applied to the updated National Morbidity and Mortality Air Pollution Study database for the period 1987-2000, which includes data for 100 U.S. cities. At the national level, a 10 µg/m3 increase in PM10 at lag 1 is associated with a 0.15 (95% posterior interval: -0.08, 0.39), 0.14 (-0.14, 0.42), 0.36 (0.11, 0.61), and 0.14 (-0.06, 0.34) percent increase in mortality for winter, spring, summer, and fall, respectively. An analysis by geographical region finds a strong seasonal pattern in the northeast (with a peak in summer) and little seasonal variation in the southern regions of the country. These results provide useful information for understanding particle toxicity and guiding future analyses of particle constituent data.
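The model above is Bayesian and hierarchical across the 100 cities; as a much simpler frequentist stand-in for a single city, season-specific log relative rates can be read off a Poisson log-linear regression with a pollution-by-season interaction. A sketch on simulated data (all names and values are made up):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 400
# Hypothetical single-city daily series: death counts and lag-1 PM10.
df = pd.DataFrame({
    "deaths": rng.poisson(30, n),
    "pm10_lag1": rng.gamma(4, 8, n),
    "season": np.tile(["winter", "spring", "summer", "fall"], n // 4),
})

# One PM10 slope per season, per 10 ug/m3 increment, on the log-rate scale.
X = pd.get_dummies(df["season"]).astype(float)
X = X.mul(df["pm10_lag1"] / 10, axis=0)
X = sm.add_constant(X)
fit = sm.GLM(df["deaths"], X, family=sm.families.Poisson()).fit()
print(fit.params)  # intercept plus fall/spring/summer/winter slopes
```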

Relevance: 80.00%

Abstract:

OBJECTIVE: To estimate the prognosis over 5 years of HIV-1-infected, treatment-naive patients starting HAART, taking into account the immunological and virological response to therapy. DESIGN: A collaborative analysis of data from 12 cohorts in Europe and North America on 20,379 adults who started HAART between 1995 and 2003. METHODS: Parametric survival models were used to predict the cumulative incidence at 5 years of a new AIDS-defining event or death, and of death alone, first from the start of HAART and second from 6 months after the start of HAART. Data were analysed by intention to continue treatment, ignoring treatment changes and interruptions. RESULTS: During 61,798 person-years of follow-up, 1,005 patients died and an additional 1,303 developed AIDS. A total of 10,046 (49%) patients started HAART either with a CD4 cell count of less than 200 cells/microl or with a diagnosis of AIDS. The 5-year risk of AIDS or death (death alone) from the start of HAART ranged from 5.6 to 77% (1.8-65%), depending on age, CD4 cell count, HIV-1 RNA level, clinical stage, and history of injection drug use. From 6 months, the corresponding figures were 4.1-99% for AIDS or death and 1.3-96% for death alone. CONCLUSION: On the basis of data collected routinely in HIV care, prognostic models with high discriminatory power over 5 years were developed for patients starting HAART in industrialized countries. A risk calculator that produces estimates of progression rates at years 1 to 5 after starting HAART is available from www.art-cohort-collaboration.org.
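The abstract does not specify which parametric survival models were used; as an illustration of the general recipe (fit a parametric model with baseline covariates, then read off the 5-year cumulative incidence as 1 - S(5)), here is a sketch with a Weibull accelerated failure time model from lifelines, on toy data with hypothetical covariate names:

```python
import pandas as pd
from lifelines import WeibullAFTFitter

# Toy analogue of the prognostic setting: years to AIDS/death, an event
# flag, and two illustrative baseline covariates.
df = pd.DataFrame({
    "years":     [0.5, 2.0, 4.5, 5.0, 1.1, 3.7, 5.0, 2.8],
    "event":     [1,   1,   0,   0,   1,   1,   0,   0],
    "cd4_lt200": [1,   1,   0,   0,   1,   0,   0,   1],
    "age50plus": [0,   1,   0,   1,   1,   0,   0,   1],
})

aft = WeibullAFTFitter()
aft.fit(df, duration_col="years", event_col="event")
# 5-year cumulative incidence for each covariate profile is 1 - S(5).
risk_5y = 1 - aft.predict_survival_function(df, times=[5.0]).T[5.0]
```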

Relevance: 80.00%

Abstract:

Parameter estimates from commonly used multivariable parametric survival regression models do not directly quantify differences in years of life expectancy. Gaussian linear regression models give results in terms of absolute mean differences, but they are not appropriate for modeling life expectancy because time to death often has a negatively skewed distribution. A regression approach using a skew-normal distribution is an alternative to parametric survival models for modeling life expectancy, because parameter estimates can be interpreted in terms of survival time differences while allowing for skewness of the distribution. In this paper we show how to use skew-normal regression so that censored and left-truncated observations are accounted for. We then model differences in life expectancy using data from the Swiss National Cohort Study and from official life expectancy estimates, and compare the results with those derived from commonly used survival regression models. We conclude that a censored skew-normal survival regression approach for left-truncated observations can be used to model differences in life expectancy across covariates of interest.
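A minimal sketch of the key likelihood ingredients described above: a skew-normal density for observed deaths, the survival function for right-censored subjects, and division by the probability of surviving past the entry age to handle left truncation. The data, parameterization and optimizer are illustrative, and the paper's covariate (regression) structure is omitted:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import skewnorm

rng = np.random.default_rng(2)
# Toy cohort: entry ages (left truncation) and ages at death.
entry = rng.uniform(40, 60, 200)
y = skewnorm.rvs(-4, loc=85, scale=10, size=200, random_state=rng)
keep = y > entry                       # only subjects alive at entry are seen
y, entry = y[keep], entry[keep]
c = np.minimum(y, 95.0)                # administrative censoring at age 95
d = (y <= 95.0).astype(float)          # 1 = death observed

def negloglik(theta):
    a, loc, scale = theta[0], theta[1], np.exp(theta[2])
    # Events contribute the density, censored subjects the survival function;
    # both are divided by P(Y > entry) to account for left truncation.
    ll = (d * skewnorm.logpdf(c, a, loc, scale)
          + (1 - d) * skewnorm.logsf(c, a, loc, scale)
          - skewnorm.logsf(entry, a, loc, scale))
    return -np.sum(ll)

fit = minimize(negloglik, x0=[-1.0, 80.0, np.log(10.0)], method="Nelder-Mead")
```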

Relevance: 80.00%

Abstract:

We developed an anatomical mapping technique to detect hippocampal and ventricular changes in Alzheimer disease (AD). The resulting maps are sensitive to longitudinal changes in brain structure as the disease progresses. An anatomical surface modeling approach was combined with surface-based statistics to visualize the region and rate of atrophy in serial MRI scans and to isolate where these changes are linked with cognitive decline. Fifty-two high-resolution MRI scans were acquired from 12 AD patients (age: 68.4 ± 1.9 years) and 14 matched controls (age: 71.4 ± 0.9 years), each scanned twice (2.1 ± 0.4 years apart). 3D parametric mesh models of the hippocampus and temporal horns were created in sequential scans and averaged across subjects to identify systematic patterns of atrophy. As an index of radial atrophy, 3D distance fields were generated relating each anatomical surface point to a medial curve threading down the medial axis of each structure. Hippocampal atrophy rates and ventricular expansion were assessed statistically using surface-based permutation testing and were faster in AD than in controls. Using color-coded maps and video sequences, these changes were visualized as they progressed anatomically over time. Additional maps localized regions where atrophic changes were linked with cognitive decline. Temporal horn expansion maps were more sensitive to AD progression than maps of hippocampal atrophy, but both correlated with clinical deterioration. These quantitative, dynamic visualizations of hippocampal atrophy and ventricular expansion rates in aging and AD may provide a promising measure for tracking AD progression in drug trials.
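At a single surface point, the surface-based permutation testing used above reduces to a label-permutation test of the group difference in atrophy rates. A toy sketch with made-up numbers for one point only (the study runs such tests across entire 3D meshes):

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical per-subject radial atrophy rates at one surface point:
# 12 AD patients and 14 controls.
ad = rng.normal(1.2, 0.5, 12)
ctl = rng.normal(0.6, 0.5, 14)
obs = ad.mean() - ctl.mean()

pooled = np.concatenate([ad, ctl])
n_perm, count = 10_000, 0
for _ in range(n_perm):
    rng.shuffle(pooled)                                  # permute group labels
    count += (pooled[:12].mean() - pooled[12:].mean()) >= obs
p_value = (count + 1) / (n_perm + 1)                     # one-sided p-value
```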

Relevance: 80.00%

Abstract:

The paper investigates teachers' decisions to leave the profession, asking two questions: what role do earnings and alternative earnings opportunities play in these decisions, and how did the 2002 Hungarian public-sector pay increase affect teacher attrition? Duration models were estimated on a large linked administrative dataset (OEP-ONYF-FH): first, binary-choice Cox proportional hazard models (leaving the teaching profession or not), then competing-risks models distinguishing exits to another occupation from other exit routes, including exits to non-employment. The results show that earnings matter. Higher pay and higher relative earnings reduce the probability that a teacher leaves the profession, whether to another occupation or to non-employment. The public-sector pay increase temporarily reduced the probability of young teachers leaving, but the effect disappeared within one or two years. For teachers over 51 years old, the pay increase kept more of them in the profession, reducing the probability both of moving to another occupation and of exiting to non-employment.
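Both model classes above can be fitted with cause-specific Cox regressions: the competing-risks specification treats each exit route in turn as the event and the competing route as censoring. A minimal sketch in lifelines on toy spells with a hypothetical relative-wage covariate:

```python
import pandas as pd
from lifelines import CoxPHFitter

# Toy teacher spells: years in teaching and an exit code (0 = still teaching,
# 1 = moved to another occupation, 2 = moved to non-employment).
df = pd.DataFrame({
    "years":    [2,   5,   1,   7,   3,   4,    6,   2],
    "exit":     [1,   0,   2,   0,   1,   2,    0,   1],
    "rel_wage": [0.8, 1.2, 0.7, 1.3, 0.9, 0.85, 1.1, 0.75],
})

# One cause-specific Cox fit per exit route; the other route is censoring.
fits = {}
for cause in (1, 2):
    d = df.assign(event=(df["exit"] == cause).astype(int))
    fits[cause] = CoxPHFitter().fit(
        d[["years", "event", "rel_wage"]],
        duration_col="years", event_col="event")
```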

Relevance: 80.00%

Abstract:

This dissertation consists of three separate essays on job search and labor market dynamics. In the first essay, “The Impact of Labor Market Conditions on Job Creation: Evidence from Firm Level Data”, I study how much changes in labor market conditions reduce employment fluctuations over the business cycle. Changes in labor market conditions make hiring more expensive during expansions and cheaper during recessions, creating counter-cyclical incentives for job creation. I estimate firm-level elasticities of labor demand with respect to changes in labor market conditions, considering two margins: changes in labor market tightness and changes in wages. Using employer-employee matched data from Brazil, I find that firms are more sensitive to changes in wages than to changes in labor market tightness, and that there is substantial heterogeneity in labor demand elasticity across regions. Based on these results, I demonstrate that changes in labor market conditions reduce the variance of employment growth over the business cycle by 20% in the median region, an effect driven equally by changes along each margin. Moreover, I show that the magnitude of the effect of labor market conditions on employment growth can be significantly affected by economic policy. In particular, I document that the rapid growth of the national minimum wage in Brazil in 1997-2010 amplified the impact of changes in labor market conditions during local expansions and diminished it during local recessions.

In the second essay, “A Framework for Estimating Persistence of Local Labor Demand Shocks”, I propose a decomposition which allows me to study the persistence of local labor demand shocks. Persistence of labor demand shocks varies across industries, and the incidence of shocks in a region depends on the regional industrial composition. As a result, less diverse regions are more likely to experience deeper shocks, but not necessarily longer-lasting shocks. Building on this idea, I decompose local labor demand shocks into idiosyncratic location shocks and nationwide industry shocks, and estimate the variance and the persistence of these shocks using the Quarterly Census of Employment and Wages (QCEW) in 1990-2013.

In the third essay, “Conditional Choice Probability Estimation of Continuous-Time Job Search Models”, co-authored with Peter Arcidiacono and Arnaud Maurel, we propose a novel, computationally feasible method of estimating non-stationary job search models. Non-stationary job search models arise in many applications where a policy change can be anticipated by workers; the most prominent example of such a policy is the expiration of unemployment benefits. Estimating these models nevertheless poses a considerable computational challenge, because a differential equation must be solved numerically at each step of the optimization routine. We overcome this challenge by adapting conditional choice probability (CCP) methods, widely used in the dynamic discrete choice literature, to job search models, and show how the hazard rate out of unemployment and the distribution of accepted wages, both of which can be estimated in many datasets, can be used to infer the value of unemployment. We demonstrate how to apply our method by analyzing the effect of unemployment benefit expiration on the duration of unemployment, using data from the Survey of Income and Program Participation (SIPP) in 1996-2007.
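The CCP inversion itself is not reproduced here, but its two empirical inputs, the hazard rate out of unemployment and the distribution of accepted wages, can be computed directly from spell data. A toy sketch with hypothetical spells measured in weeks:

```python
import numpy as np
import pandas as pd

spells = pd.DataFrame({
    "weeks":    [4, 10, 26, 26, 8, 15, 26, 3, 20, 12],
    "exited":   [1,  1,  0,  1, 1,  1,  0, 1,  1,  1],   # 0 = censored
    "acc_wage": [12, 15, np.nan, 18, 11, 14, np.nan, 10, 16, 13],
})

def at_risk(t):
    return int((spells["weeks"] >= t).sum())

# Empirical hazard at week t: exits at t over spells still at risk at t.
hazard = {
    t: ((spells["weeks"] == t) & (spells["exited"] == 1)).sum() / at_risk(t)
    for t in sorted(spells["weeks"].unique())
}
mean_accepted_wage = spells["acc_wage"].mean()  # accepted-wage distribution
```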

Relevance: 80.00%

Abstract:

This research aims to analyze tools and methods for applying H-BIM, understanding its critical issues and providing useful solutions in this field. At the same time, the goal is not limited to producing semantically structured, parametric 3D models from a point cloud obtained through a digital survey; it also seeks to define the criteria and methods for applying H-BIM within the entire process. The chosen methodological approach starts from the state of the art in H-BIM, studying the current regulations on the subject and the most relevant case studies. A complete critical review of the literature on BIM and H-BIM technology was carried out, analyzing experiences with BIM in the global construction sector. Furthermore, in order to promote intelligent solutions within Facility Management, it was necessary to analyze the critical issues in current procedures, review the processes and methods for collecting and managing data, and identify procedures suited to ensuring successful implementation. The procedural and operational potential of the systematic use of digital innovations from a Facility Management perspective was highlighted, alongside the study of data acquisition, processing, and post-production tools. Testing was carried out on specific cases for the Scan-to-BIM phase, differentiated by type of use, construction date, ownership, and location. This path made it possible to highlight the meaning and implications of using BIM in Facility Management, based on differentiating the applications of the BIM model as the underlying conditions vary. Finally, conclusions were drawn and recommendations formulated regarding the future use of H-BIM technology in the construction sector, in particular by defining the emerging frontier of the Digital Twin as a necessary vehicle for the future of Construction 4.0.

Relevance: 50.00%

Abstract:

In this paper, we propose a class of ACD-type models that accommodates overdispersion, intermittent dynamics, multiple regimes, and sign and size asymmetries in financial durations. In particular, our functional coefficient autoregressive conditional duration (FC-ACD) model relies on a smooth-transition autoregressive specification, motivated by the fact that the latter yields a universal approximator if one lets the number of regimes grow without bound. After establishing that the sufficient conditions for strict stationarity do not exclude explosive regimes, we address model identifiability as well as the existence, consistency, and asymptotic normality of the quasi-maximum likelihood (QML) estimator for the FC-ACD model with a fixed number of regimes. In addition, we discuss how to consistently estimate, using a sieve approach, a semiparametric variant of the FC-ACD model that takes the number of regimes to infinity. An empirical illustration indicates that our functional coefficient model is flexible enough to model IBM price durations.
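To make the smooth-transition mechanics concrete, here is a sketch of the data-generating process for a two-regime version, with a logistic transition driven by the size of the previous duration. The parameter values are made up, and the paper's QML and sieve estimation are not reproduced:

```python
import numpy as np

rng = np.random.default_rng(4)

def simulate_stacd(n, w=(0.1, 0.3), a=(0.05, 0.15), b=(0.8, 0.6),
                   gamma=5.0, c=1.0):
    """Simulate x_i = psi_i * eps_i, where psi_i mixes two ACD(1,1) regimes
    through a logistic transition function of the previous duration."""
    x, psi = np.empty(n), np.empty(n)
    x[0] = psi[0] = 1.0
    for i in range(1, n):
        G = 1.0 / (1.0 + np.exp(-gamma * (x[i - 1] - c)))   # transition weight
        psi[i] = ((1 - G) * (w[0] + a[0] * x[i - 1] + b[0] * psi[i - 1])
                  + G * (w[1] + a[1] * x[i - 1] + b[1] * psi[i - 1]))
        x[i] = psi[i] * rng.exponential(1.0)   # unit-mean exponential errors
    return x

durations = simulate_stacd(5000)
```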

Relevance: 40.00%

Abstract:

Leaf wetness duration (LWD) models based on empirical approaches offer practical advantages over physically based models in agricultural applications, but their spatial portability is questionable because they may be biased toward the climatic conditions under which they were developed. In our study, the spatial portability of three LWD models with empirical characteristics (an RH threshold model, a decision tree model with wind speed correction, and a fuzzy logic model) was evaluated using weather data collected in Brazil, Canada, Costa Rica, Italy and the USA. The fuzzy logic model was more accurate than the other models in estimating LWD measured by painted leaf wetness sensors. The fraction of correct estimates for the fuzzy logic model was greater (0.87) than for the other models (0.85-0.86) across the 28 sites where painted sensors were installed, and the kappa statistic of agreement between the model and the painted sensors was greater for the fuzzy logic model (0.71) than for the other models (0.64-0.66). Values of the kappa statistic for the fuzzy logic model were also less variable across sites than those of the other models. When model estimates were compared with measurements from unpainted leaf wetness sensors, the fuzzy logic model had a smaller mean absolute error (2.5 h/day) than the other models (2.6-2.7 h/day) after it was calibrated for the unpainted sensors. The results suggest that the fuzzy logic model has greater spatial portability than the other models evaluated and merits further validation against physical models under a wider range of climate conditions.
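The simplest of the three empirical models compared above is the RH threshold model: an hour is classified as wet when relative humidity reaches a threshold, commonly 90%. A minimal sketch (the day of hourly readings is invented):

```python
import numpy as np

def lwd_rh_threshold(rh_hourly, threshold=90.0):
    """Return leaf wetness duration in hours: the count of hours whose
    relative humidity meets or exceeds the threshold."""
    return int(np.sum(np.asarray(rh_hourly, dtype=float) >= threshold))

# One hypothetical day of hourly relative humidity readings (%).
rh = [88, 91, 93, 95, 96, 94, 92, 89, 85, 80, 78, 75,
      74, 76, 79, 83, 86, 90, 92, 94, 95, 96, 93, 90]
print(lwd_rh_threshold(rh))  # hours classified as wet
```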

Relevance: 40.00%

Abstract:

The objective of the present study was to estimate milk yield genetic parameters applying random regression models and parametric correlation functions combined with a variance function to model animal permanent environmental effects. A total of 152,145 test-day milk yields from 7,317 first lactations of Holstein cows belonging to herds located in the southeastern region of Brazil were analyzed. Test-day milk yields were divided into 44 weekly classes of days in milk. Contemporary groups were defined by herd-test-day comprising a total of 2,539 classes. The model included direct additive genetic, permanent environmental, and residual random effects. The following fixed effects were considered: contemporary group, age of cow at calving (linear and quadratic regressions), and the population average lactation curve modeled by fourth-order orthogonal Legendre polynomial. Additive genetic effects were modeled by random regression on orthogonal Legendre polynomials of days in milk, whereas permanent environmental effects were estimated using a stationary or nonstationary parametric correlation function combined with a variance function of different orders. The structure of residual variances was modeled using a step function containing 6 variance classes. The genetic parameter estimates obtained with the model using a stationary correlation function associated with a variance function to model permanent environmental effects were similar to those obtained with models employing orthogonal Legendre polynomials for the same effect. A model using a sixth-order polynomial for additive effects and a stationary parametric correlation function associated with a seventh-order variance function to model permanent environmental effects would be sufficient for data fitting.
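The random regression covariables above are orthogonal Legendre polynomials evaluated at standardized days in milk. A sketch of building such a design matrix in Python; the standardization range (5 to 305 days here) and the polynomial order are assumptions for illustration:

```python
import numpy as np

def legendre_design(dim, order, t_min=5.0, t_max=305.0):
    """Design matrix of Legendre polynomials P0..P_order evaluated at days
    in milk rescaled to [-1, 1]."""
    z = 2.0 * (np.asarray(dim, dtype=float) - t_min) / (t_max - t_min) - 1.0
    return np.polynomial.legendre.legvander(z, order)

# Covariables for a fourth-order lactation curve at a few test days.
X = legendre_design([10, 60, 120, 200, 290], order=4)  # shape (5, 5)
```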

Relevance: 40.00%

Abstract:

This comment corrects errors in the estimation process that appear in Martins (2001). The first error is in the parametric probit estimation: the previously presented results do not maximize the log-likelihood function, and at the global maximum more variables become significant. As for the semiparametric estimation method, the kernel function used in Martins (2001) can take on both positive and negative values, which implies that the participation probability estimates may fall outside the interval [0,1]. We solve this problem by applying local smoothing in the kernel estimation, as suggested by Klein and Spady (1993).
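Both issues can be made concrete in a few lines: a probit log-likelihood whose convergence is checked explicitly, and a Nadaraya-Watson probability estimate with a nonnegative (Gaussian) kernel, whose weights guarantee fitted probabilities inside [0,1]. This is an illustration on simulated data, not a reconstruction of the Klein and Spady (1993) estimator:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(5)
X = np.column_stack([np.ones(300), rng.normal(size=(300, 2))])
y = (X @ np.array([0.2, 1.0, -0.5]) + rng.normal(size=300) > 0).astype(float)

def negloglik(b):
    p = np.clip(norm.cdf(X @ b), 1e-10, 1 - 1e-10)
    return -np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

fit = minimize(negloglik, x0=np.zeros(3), method="BFGS")
assert fit.success  # guard against reporting a point that is not the maximum

# Nadaraya-Watson estimate of P(y = 1 | index): nonnegative kernel weights
# keep every estimated probability inside [0, 1].
def nw_probability(index, grid, h=0.3):
    w = norm.pdf((grid[:, None] - index[None, :]) / h)
    return (w @ y) / w.sum(axis=1)

probs = nw_probability(X @ fit.x, np.linspace(-3.0, 3.0, 61))
```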

Relevance: 40.00%

Abstract:

We present a real data set of claim amounts in which costs related to damage are recorded separately from those related to medical expenses. Only claims with positive costs are considered here. Two approaches to density estimation are presented: a classical parametric method and a semi-parametric method based on transformation kernel density estimation. We explore the data set with standard univariate methods, and we propose Bayesian ways to select the bandwidth and transformation parameters in the univariate case. We indicate how to compare the results of alternative methods, both by looking at the shape of the density over its whole domain and by examining the density estimates in the right tail.
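The basic transformation kernel idea can be sketched with a log transform: estimate the density of the transformed claims, then map it back with the change-of-variables Jacobian 1/x. The paper's transformation family and Bayesian selection of the bandwidth and transformation parameters are richer than this, and the data below are simulated:

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(6)
claims = rng.lognormal(mean=7.0, sigma=1.2, size=1000)  # toy positive costs

# Kernel density estimate on the log scale, transformed back: if z = log(x),
# then f_X(x) = f_Z(log x) / x.
kde_log = gaussian_kde(np.log(claims))

def density(x):
    x = np.asarray(x, dtype=float)
    return kde_log(np.log(x)) / x

grid = np.linspace(claims.min(), np.quantile(claims, 0.99), 200)
fx = density(grid)  # heavy right tail handled on the transformed scale
```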