977 results for Bivariate BEKK-GARCH


Relevance:

20.00%

Publisher:

Abstract:

Department of Statistics, Cochin University of Science and Technology

Relevance:

20.00%

Publisher:

Abstract:

Multivariate lifetime data arise in various forms, including recurrent event data, when individuals are followed to observe the sequence of occurrences of a certain type of event, and correlated lifetimes, when an individual is followed for the occurrence of two or more types of events or when distinct individuals have dependent event times. In most studies there are covariates, such as treatments, group indicators, individual characteristics, or environmental conditions, whose relationship to lifetime is of interest. This leads to the consideration of regression models. The well-known Cox proportional hazards model and its variations, which use marginal hazard functions for the analysis of multivariate survival data in the literature, are not sufficient to explain the complete dependence structure of a pair of lifetimes on the covariate vector. Motivated by this, in Chapter 2 we introduced a bivariate proportional hazards model using the vector hazard function of Johnson and Kotz (1975), in which the covariates under study have different effects on the two components of the vector hazard function. The proposed model is useful in real-life situations for studying the dependence structure of a pair of lifetimes on the covariate vector. The well-known partial likelihood approach is used for the estimation of the parameter vectors. We then introduced, in Chapter 3, a bivariate proportional hazards model for gap times of recurrent events. The model incorporates both marginal and joint dependence of the distribution of gap times on the covariate vector. In many fields of application, the mean residual life function is considered superior to the hazard function. Motivated by this, in Chapter 4 we considered a new semi-parametric model, the bivariate proportional mean residual life model, to assess the relationship between mean residual life and covariates for the gap times of recurrent events. The counting process approach is used for the inference procedures for the gap times of recurrent events. In many survival studies, the distribution of lifetime may depend on the distribution of the censoring time. In Chapter 5 we introduced a proportional hazards model for duration times and developed inference procedures under dependent (informative) censoring. In Chapter 6 we introduced a bivariate proportional hazards model for competing risks data under right censoring. The asymptotic properties of the estimators of the parameters of the models developed in the previous chapters were studied, and the proposed models were applied to various real-life situations.
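
A minimal sketch of the type of specification described above, assuming the Johnson–Kotz vector hazard and a Cox-type form in which each component carries its own regression coefficients (the symbols h_{0j} and \beta_j are illustrative notation, not taken from the thesis):

h_j(t_1, t_2 \mid x) = h_{0j}(t_1, t_2)\,\exp\{\beta_j^{\top} x\}, \qquad j = 1, 2,
\quad\text{where}\quad
h_j(t_1, t_2) = -\frac{\partial}{\partial t_j} \log S(t_1, t_2)

and S is the joint survival function; allowing \beta_1 \neq \beta_2 is what lets the covariates act differently on the two components of the vector hazard.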

Relevance:

20.00%

Publisher:

Abstract:

We propose a novel, simple, efficient and distribution-free re-sampling technique for developing prediction intervals for returns and volatilities following ARCH/GARCH models. In particular, our key idea is to employ a Box–Jenkins linear representation of an ARCH/GARCH equation and then to adapt a sieve bootstrap procedure to the nonlinear GARCH framework. Our simulation studies indicate that the new re-sampling method provides sharp and well calibrated prediction intervals for both returns and volatilities while reducing computational costs by up to 100 times, compared to other available re-sampling techniques for ARCH/GARCH models. The proposed procedure is illustrated by an application to Yen/U.S. dollar daily exchange rate data.
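
A rough, self-contained numpy sketch of the key idea, assuming simulated GARCH(1,1) data, an arbitrary sieve order, and a one-step-ahead interval for the squared return; the authors' actual procedure also covers returns and multi-step horizons, so everything below is illustrative rather than their algorithm.

import numpy as np

rng = np.random.default_rng(0)

# --- simulate a GARCH(1,1) return series as a stand-in for real data ---
def simulate_garch(n, omega=0.05, alpha=0.08, beta=0.90):
    r = np.empty(n)
    sigma2 = omega / (1 - alpha - beta)          # start at the unconditional variance
    for t in range(n):
        r[t] = np.sqrt(sigma2) * rng.standard_normal()
        sigma2 = omega + alpha * r[t] ** 2 + beta * sigma2
    return r

r = simulate_garch(2000)

# --- sieve step: approximate the Box-Jenkins linear representation of the ---
# --- GARCH equation by fitting a long autoregression to the squared returns ---
x = r ** 2
p = 20                                           # sieve order (illustrative choice)
Y = x[p:]
X = np.column_stack([np.ones(len(Y))] + [x[p - k:-k] for k in range(1, p + 1)])
phi, *_ = np.linalg.lstsq(X, Y, rcond=None)
resid = Y - X @ phi
resid -= resid.mean()                            # centre residuals before resampling

# --- bootstrap one-step-ahead squared returns and form a prediction interval ---
B = 2000
boot = np.empty(B)
last = x[-p:][::-1]                              # most recent p squared returns, lag order
for b in range(B):
    eps = rng.choice(resid)
    boot[b] = max(phi[0] + phi[1:] @ last + eps, 0.0)   # keep the variance proxy non-negative
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"95% prediction interval for the next squared return: [{lo:.3f}, {hi:.3f}]")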

Relevance:

20.00%

Publisher:

Abstract:

So far, in the bivariate set-up, the analysis of lifetime (failure time) data with multiple causes of failure has been done by treating each cause of failure separately, with failures from other causes considered as independent censoring. This approach is unrealistic in many situations. For example, in the analysis of mortality data on married couples, one would be interested in comparing the hazards for the same cause of death as well as in checking whether death due to one cause is more important for the partner's risk of death from other causes. In reliability analysis, one often has systems with more than one component, and many systems, subsystems and components have more than one cause of failure. The design of high-reliability systems generally requires that the individual system components have extremely high reliability, even after long periods of time. Knowledge of the failure behaviour of a component can lead to savings in its cost of production and maintenance and, in some cases, to the preservation of human life. For the purpose of improving reliability, it is necessary to identify the cause of failure down to the component level. By treating each cause of failure separately, with failures from other causes considered as independent censoring, the analysis of lifetime data would be incomplete. Motivated by this, we introduce a new approach for the analysis of bivariate competing risks data using the bivariate vector hazard rate of Johnson and Kotz (1975).

Relevance:

20.00%

Publisher:

Abstract:

It is highly desirable that any multivariate distribution possesses characteristic properties that are, in some sense, generalisations of the corresponding results in the univariate case. It is therefore of interest to examine whether a multivariate distribution can admit such characterizations. In the exponential context, the question to be answered is: in what meaningful way can one extend the unique properties of the univariate case to a bivariate set-up? Since the lack of memory property is the best studied and most useful property of the exponential law, our first endeavour in the present thesis is to suitably extend this property and its equivalent forms so as to characterize Gumbel's bivariate exponential distribution. Though there are many forms of bivariate exponential distributions, a matching interest has not been shown in developing corresponding discrete versions in the form of bivariate geometric distributions. Accordingly, an attempt is also made to introduce the geometric version of the Gumbel distribution and to examine several of its characteristic properties. Since a major area where exponential models are successfully applied is reliability theory, we also look into the role of these bivariate laws in that context. The present thesis is organised into five chapters.
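
For reference, a hedged sketch of the two ingredients named above: the univariate lack of memory property, and the standardised form of Gumbel's (Type I) bivariate exponential distribution (the thesis may work with a different parametrisation or member of the Gumbel family):

P(X > x + t \mid X > t) = P(X > x) \quad \text{for all } x, t \ge 0,
\qquad
\bar F(x_1, x_2) = \exp\{-x_1 - x_2 - \theta x_1 x_2\}, \quad x_1, x_2 \ge 0,\ 0 \le \theta \le 1.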

Relevance:

20.00%

Publisher:

Abstract:

The term reliability of an equipment or device is often meant to indicate the probability that it carries out the functions expected of it adequately, or without failure, within specified performance limits at a given age, for a desired mission time, when put to use under the designated application and operating environmental stresses. A broad classification of the approaches employed in reliability studies can be made as probabilistic and deterministic, where the main interest in the former is to devise tools and methods to identify the random mechanism governing the failure process through a proper statistical framework, while the latter addresses the question of finding the causes of failure and steps to reduce individual failures, thereby enhancing reliability. In the probabilistic approach, to which the present study subscribes, the concept of a life distribution, a mathematical idealisation that describes the failure times, is fundamental, and a basic question a reliability analyst has to settle is the form of the life distribution. It is for no other reason that a major share of the literature on the mathematical theory of reliability is focused on methods of arriving at reasonable models of failure times and on identifying the failure patterns that induce such models. The application of the methodology of lifetime distributions is not confined to the assessment of the endurance of equipment and systems; it ranges over a wide variety of scientific investigations where the word lifetime may not refer to the length of life in the literal sense, but can be conceived in its most general form as a non-negative random variable. Thus the tools developed in connection with modelling lifetime data have found applications in other areas of research such as actuarial science, engineering, biomedical sciences, economics, and extreme value theory.

Relevance:

20.00%

Publisher:

Abstract:

In this paper the class of continuous bivariate distributions that has a form-invariant weighted distribution with weight function w(x_1, x_2) = x_1^{a_1} x_2^{a_2} is identified. It is shown that the class includes some well-known bivariate models. Bayesian inference on the parameters of the class is considered, and it is shown that there exist natural conjugate priors for the parameters.
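
As a hedged reminder of the construction (this is the standard definition of a weighted bivariate density, not a formula quoted from the paper): given a base density f(x_1, x_2) and the weight above, the weighted density is

f_w(x_1, x_2) = \frac{w(x_1, x_2)\, f(x_1, x_2)}{E[w(X_1, X_2)]},
\qquad w(x_1, x_2) = x_1^{a_1} x_2^{a_2},

and form-invariance means that f_w stays within the same parametric family as f, with possibly shifted parameters.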

Relevance:

20.00%

Publisher:

Abstract:

In this paper, the residual Kullback–Leibler discrimination information measure is extended to conditionally specified models. The extension is used to characterize some bivariate distributions. These distributions are also characterized in terms of proportional hazard rate models and weighted distributions. Moreover, we also obtain some bounds for this dynamic discrimination function by using the likelihood ratio order and some preceding results.
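
For orientation, a hedged sketch of the univariate residual (dynamic) Kullback–Leibler discrimination measure that the extension presumably starts from, written for lifetimes with densities f, g and survival functions \bar F, \bar G (notation assumed):

KL(f, g; t) = \int_t^{\infty} \frac{f(x)}{\bar F(t)} \log\!\left[\frac{f(x)/\bar F(t)}{g(x)/\bar G(t)}\right] dx,

i.e. the Kullback–Leibler divergence between the residual lifetime distributions beyond time t.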

Relevance:

20.00%

Publisher:

Abstract:

In this paper, a family of bivariate distributions whose marginals are weighted distributions in the original variables is studied. The relationships between the failure rates of the derived and the original models are obtained. These relationships are used to provide some characterizations of specific bivariate models.

Relevance:

20.00%

Publisher:

Abstract:

In this article, we study some relevant information divergence measures, viz. the Renyi divergence and Kerridge's inaccuracy measure. These measures are extended to conditionally specified models, and they are used to characterize some bivariate distributions using the concepts of weighted and proportional hazard rate models. Moreover, some bounds are obtained for these measures using the likelihood ratio order.
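
For reference, hedged textbook forms of the two measures named above (continuous case, densities f and g; the article's conditional and dynamic versions build on these):

D_\alpha(f \,\|\, g) = \frac{1}{\alpha - 1} \log \int f^{\alpha}(x)\, g^{1-\alpha}(x)\, dx, \quad \alpha > 0,\ \alpha \neq 1,
\qquad
K(f, g) = -\int f(x) \log g(x)\, dx.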

Relevance:

20.00%

Publisher:

Abstract:

A joint distribution of two discrete random variables with finite support can be displayed as a two-way table of probabilities adding to one. Assume that this table has n rows and m columns and that all probabilities are non-null. Such a table can be seen as an element of the simplex of n · m parts. In this context, the marginals are identified as compositional amalgams and the conditionals (rows or columns) as subcompositions. Also, simplicial perturbation appears as Bayes' theorem. However, the Euclidean elements of the Aitchison geometry of the simplex can also be translated into the table of probabilities: subspaces, orthogonal projections, distances. Two important questions are addressed: (a) given a table of probabilities, which is the nearest independent table to the initial one? (b) which is the largest orthogonal projection of a row onto a column, or, equivalently, which is the information in a row explained by a column, thus explaining the interaction? To answer these questions three orthogonal decompositions are presented: (1) by columns and a row-wise geometric marginal, (2) by rows and a column-wise geometric marginal, (3) by independent two-way tables and fully dependent tables representing row-column interaction. An important result is that the nearest independent table is the product of the two (row- and column-wise) geometric marginal tables. A corollary is that, in an independent table, the geometric marginals agree, after closure, with the traditional (arithmetic) marginals. These decompositions can be compared with standard log-linear models. Key words: balance, compositional data, simplex, Aitchison geometry, composition, orthonormal basis, arithmetic and geometric marginals, amalgam, dependence measure, contingency table.
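
A small numpy sketch of the quoted result about the nearest independent table, under one reading of the row- and column-wise geometric marginals (the example table P and the helper functions are illustrative, not taken from the paper):

import numpy as np

# Hypothetical 3x4 table of joint probabilities (all entries non-null, summing to one).
P = np.array([[0.10, 0.05, 0.05, 0.05],
              [0.05, 0.15, 0.10, 0.05],
              [0.05, 0.05, 0.10, 0.20]])

def closure(x):
    # Rescale a positive array so that its entries sum to one (compositional closure).
    return x / x.sum()

def clr(x):
    # Centred log-ratio transform of a composition given as a 1-D array.
    logx = np.log(x)
    return logx - logx.mean()

# Column-wise and row-wise geometric marginals: closed geometric means
# taken down the columns and along the rows, respectively.
geo_col = closure(np.exp(np.log(P).mean(axis=0)))   # one entry per column
geo_row = closure(np.exp(np.log(P).mean(axis=1)))   # one entry per row

# Nearest independent table in the Aitchison geometry:
# the closed outer product of the two geometric marginals.
P_indep = closure(np.outer(geo_row, geo_col))

# Aitchison distance between the table and its independent part,
# via centred log-ratios of the vectorised tables.
dist = np.linalg.norm(clr(P.ravel()) - clr(P_indep.ravel()))

print(P_indep.round(4))
print(f"Aitchison distance to the nearest independent table: {dist:.4f}")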

Relevance:

20.00%

Publisher:

Abstract:

In this paper we examine whether the expectations theory with constant liquidity premia can explain the term structure of short-maturity interest rates in the Spanish interbank deposit market, using monthly data from 1977 to 1995. We apply the Campbell and Shiller (1987) test based on a cointegrated VAR model. From consistent estimates of this model we obtain the magnitude and persistence of shocks through impulse-response simulation, as well as efficient parameter estimates by modelling the time-varying conditional variance. In this respect, several volatility schemes are proposed that allow different approximations to uncertainty in a multi-equation GARCH setting and that are based on the proposed expectations model. The empirical evidence shows that the expectations theory is rejected, that there is a joint short-run dynamic for the interest rates and the spread defined by a VAR(4)-GARCH(1,1)-BEKK model (which is close to being integrated in variance), and that different risk factors affect the premia at the maturities studied.
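
For reference, a hedged sketch of the bivariate BEKK(1,1) covariance recursion referred to above (standard Engle–Kroner form; the placement of the transposes varies across texts and the symbols are assumed notation):

H_t = C^{\top} C + A^{\top} \varepsilon_{t-1} \varepsilon_{t-1}^{\top} A + B^{\top} H_{t-1} B,

where H_t is the 2 x 2 conditional covariance matrix of the VAR innovations and C is upper triangular; "close to integrability in variance" corresponds to the largest eigenvalue of A ⊗ A + B ⊗ B approaching one.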

Relevance:

20.00%

Publisher:

Abstract:

In this paper we study the behaviour of the returns of the three main Colombian stock indices: the IBB of the Bolsa de Bogotá, the IBOMED of the Bolsa de Medellín, and the IGBC of the Bolsa de Valores de Colombia. Using a STAR GARCH model, two extreme states or regimes are identified: in the first, index returns are low in absolute terms and the processes are stationary; in the second, there are large losses or gains and the effects of shocks are permanent. Although the day-of-the-week effect differs across regimes, the results indicate that for all three indices there is a day-of-the-week effect in the mean, and a day effect in the variance for the Bolsa de Bogotá and the Bolsa de Valores de Colombia. The results contradict the hypothesis of an informationally efficient stock market.
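
A hedged sketch of one common way to write a two-regime logistic STAR mean equation with GARCH(1,1) errors, in line with the model mentioned above (all symbols are assumed notation, not taken from the paper):

y_t = \phi_1^{\top} w_t\,[1 - G(s_t)] + \phi_2^{\top} w_t\, G(s_t) + \varepsilon_t,
\qquad G(s_t; \gamma, c) = \frac{1}{1 + \exp\{-\gamma (s_t - c)\}},

\varepsilon_t = \sigma_t z_t, \qquad \sigma_t^2 = \omega + \alpha\, \varepsilon_{t-1}^2 + \beta\, \sigma_{t-1}^2,

where w_t collects lagged returns (and possibly day-of-the-week dummies) and s_t is the transition variable separating the calm and turbulent regimes.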

Relevance:

20.00%

Publisher:

Abstract:

A set of multivariate GARCH models is estimated and their empirical validity compared through the calculation of the VaR measure, for the daily returns of the nominal exchange rate of the Colombian peso against the US dollar, the euro, the pound sterling, and the Japanese yen over the period 1999–2005. The comparison of the estimates of the conditional covariance matrix, together with the results obtained for the failure rate and the dynamic quantile test of Engle and Manganelli (2004), provides evidence in favour of the constant conditional correlation model.
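
For reference, a hedged sketch of the constant conditional correlation (CCC) specification favoured by the comparison (standard Bollerslev-type form; parameter names are assumed):

H_t = D_t R D_t, \qquad D_t = \operatorname{diag}(\sigma_{1t}, \ldots, \sigma_{Nt}), \qquad
\sigma_{it}^2 = \omega_i + \alpha_i\, \varepsilon_{i,t-1}^2 + \beta_i\, \sigma_{i,t-1}^2,

with R a constant correlation matrix, so that time variation in the conditional covariances comes only through the univariate GARCH(1,1) volatilities on the diagonal of D_t.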
