928 results for Multivariate GARCH models


Relevance:

30.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-06

Relevance:

30.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-06

Relevance:

30.00%

Publisher:

Abstract:

Orthotopic liver retransplantation (re-OLT) is highly controversial. The objectives of this study were to determine the validity of a recently developed United Network for Organ Sharing (UNOS) multivariate model using an independent cohort of patients undergoing re-OLT outside the United States, to determine whether incorporation of other variables that were incomplete in the UNOS registry would provide additional prognostic information, to develop new models combining data sets from both cohorts, and to evaluate the validity of the model for end-stage liver disease (MELD) in patients undergoing re-OLT. Two hundred eighty-one adult patients undergoing re-OLT (between 1986 and 1999) at 6 foreign transplant centers constituted the validation cohort. We found good agreement between actual survival and predicted survival in the validation cohort; 1-year patient survival rates in the low-, intermediate-, and high-risk groups (as assigned by the original UNOS model) were 72%, 68%, and 36%, respectively (P < .0001). In the patients for whom the international normalized ratio (INR) of prothrombin time was available, MELD correlated with outcome following re-OLT; the median MELD scores for patients surviving at least 90 days and for those dying within 90 days were 20.75 versus 25.9, respectively (P = .004). Utilizing both patient cohorts (n = 979), a new model, based on recipient age, total serum bilirubin, creatinine, and interval to re-OLT, was constructed (whole-model χ² = 105, P < .0001). Using the c-statistic with 30-day, 90-day, 1-year, and 3-year mortality as the end points, the areas under the receiver operating characteristic (ROC) curves for 4 different models were compared. In conclusion, prospective validation and use of these models as adjuncts to clinical decision making in the management of patients being considered for re-OLT are warranted.
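
As an illustration of the comparison metric, the sketch below computes a c-statistic (area under the ROC curve) for 90-day mortality from simulated risk scores; the scores, outcomes, cohort size, and the use of scikit-learn's roc_auc_score are assumptions for the example, not the study's data or code.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Simulated stand-in for a re-OLT risk score (e.g. MELD) and 90-day outcomes;
# these numbers are illustrative only.
rng = np.random.default_rng(10)
n_patients = 281
score = rng.normal(loc=22, scale=6, size=n_patients)   # hypothetical risk scores
p_death = 1 / (1 + np.exp(-(score - 24) / 3))           # higher score, higher risk
died_within_90d = rng.random(n_patients) < p_death

# c-statistic: probability that a randomly chosen non-survivor has a higher
# score than a randomly chosen survivor (equivalent to ROC AUC).
c_statistic = roc_auc_score(died_within_90d, score)
print(round(c_statistic, 3))
```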

Relevance:

30.00%

Publisher:

Abstract:

In this paper we propose a new identification method, based on the residual white noise autoregressive criterion (Pukkila et al., 1990), to select the order of VARMA structures. Results from extensive simulation experiments based on different model structures, with varying numbers of observations and component series, are used to demonstrate the performance of this new procedure. We also use economic and business data to compare the model structures selected by this order selection method with those identified in other published studies.
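
The general idea can be sketched as follows: fit candidate VARMA(p, q) structures and retain an order only when the residuals behave like white noise. The sketch below is a loose stand-in, not the residual white noise autoregressive criterion itself: it uses statsmodels' VARMAX on simulated data and a Ljung-Box whiteness check on the residuals.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.varmax import VARMAX
from statsmodels.stats.diagnostic import acorr_ljungbox

rng = np.random.default_rng(0)

# Simulate a small bivariate VAR(1) series as stand-in data.
n, k = 300, 2
A = np.array([[0.5, 0.1], [0.2, 0.4]])
y = np.zeros((n, k))
for t in range(1, n):
    y[t] = A @ y[t - 1] + rng.normal(scale=0.5, size=k)
data = pd.DataFrame(y, columns=["y1", "y2"])

# Fit candidate VARMA(p, q) structures and check residual whiteness:
# an order is a plausible choice when no residual series shows significant
# autocorrelation (all Ljung-Box p-values above 0.05).
for p, q in [(1, 0), (1, 1), (2, 0)]:
    res = VARMAX(data, order=(p, q)).fit(disp=False)
    resid = np.asarray(res.resid)
    pvals = [
        acorr_ljungbox(resid[:, i], lags=[10], return_df=True)["lb_pvalue"].iloc[0]
        for i in range(k)
    ]
    print(f"VARMA({p},{q}): AIC={res.aic:.1f}, Ljung-Box p-values={np.round(pvals, 3)}")
```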

Relevance:

30.00%

Publisher:

Abstract:

Single male sexually selected traits have been found to exhibit substantial genetic variance, even though natural and sexual selection are predicted to deplete genetic variance in these traits. We tested whether genetic variance in multiple male display traits of Drosophila serrata was maintained under field conditions. A breeding design involving 300 field-reared males and their laboratory-reared offspring allowed the estimation of the genetic variance-covariance matrix for six male cuticular hydrocarbons (CHCs) under field conditions. Despite individual CHCs displaying substantial genetic variance under field conditions, the vast majority of genetic variance in CHCs was not closely associated with the direction of sexual selection measured on field phenotypes. Relative concentrations of three CHCs correlated positively with body size in the field, but not under laboratory conditions, suggesting condition-dependent expression of CHCs under field conditions. Therefore condition dependence may not maintain genetic variance in preferred combinations of male CHCs under field conditions, suggesting that the large mutational target supplied by the evolution of condition dependence may not provide a solution to the lek paradox in this species. Sustained sexual selection may be adequate to deplete genetic variance in the direction of selection, perhaps as a consequence of the low rate of favorable mutations expected in multiple trait systems.
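
The central quantity here, the genetic variance in the direction of sexual selection, is the projection of the genetic variance-covariance matrix G onto the selection gradient. A minimal sketch with purely illustrative numbers (not the D. serrata estimates):

```python
import numpy as np

# Hypothetical 6x6 genetic variance-covariance matrix G for six CHCs and a
# selection gradient beta; both are invented for illustration.
rng = np.random.default_rng(1)
L = rng.normal(size=(6, 6))
G = L @ L.T / 6.0                       # any symmetric positive-definite matrix
beta = rng.normal(size=6)

# Genetic variance along the (unit-length) direction of sexual selection.
b = beta / np.linalg.norm(beta)
var_along_selection = float(b @ G @ b)

# Compare with the total genetic variance (trace of G): a small ratio means
# most genetic variance lies in directions other than the one selection favours.
print(round(var_along_selection / np.trace(G), 3))
```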

Relevance:

30.00%

Publisher:

Abstract:

Traditional vegetation mapping methods use high-cost, labour-intensive aerial photography interpretation. This approach can be subjective and is limited by factors such as the extent of remnant vegetation, and the differing scale and quality of aerial photography over time. An alternative approach is proposed which integrates a data model, a statistical model and an ecological model using sophisticated Geographic Information Systems (GIS) techniques and rule-based systems to support fine-scale vegetation community modelling. This approach is based on a more realistic representation of vegetation patterns with transitional gradients from one vegetation community to another. Arbitrary, though often unrealistic, sharp boundaries can be imposed on the model by the application of statistical methods. This GIS-integrated multivariate approach is applied to the problem of vegetation mapping in the complex vegetation communities of the Innisfail Lowlands in the Wet Tropics bioregion of Northeastern Australia. The paper presents the full cycle of this vegetation modelling approach, including sampling sites, variable selection, model selection, model implementation, internal model assessment, model prediction assessments, integration of discrete vegetation community models to generate a composite pre-clearing vegetation map, independent data set model validation, and scale assessments of model predictions. An accurate pre-clearing vegetation map of the Innisfail Lowlands was generated (r² = 0.83) through GIS integration of 28 separate statistical models. This modelling approach has good potential for wider application, including provision of vital information for conservation planning and management; a scientific basis for rehabilitation of disturbed and cleared areas; and a viable method for the production of adequate vegetation maps for conservation and forestry planning of poorly studied areas. (c) 2006 Elsevier B.V. All rights reserved.
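
A hedged sketch of the integration step: given one modelled probability surface per community (random stand-ins below, not the paper's 28 fitted models), a composite map can be produced by assigning each grid cell to the community with the highest modelled probability.

```python
import numpy as np

rng = np.random.default_rng(2)
n_communities, rows, cols = 28, 100, 100

# Stand-ins for the per-community statistical models: one probability surface each.
prob_surfaces = rng.random((n_communities, rows, cols))
prob_surfaces /= prob_surfaces.sum(axis=0, keepdims=True)    # normalise per cell

# Composite pre-clearing map: the winning community index in each cell.
composite_map = prob_surfaces.argmax(axis=0)

# Simple cell-by-cell agreement check against an (illustrative) independent
# validation layer.
validation = rng.integers(0, n_communities, size=(rows, cols))
accuracy = (composite_map == validation).mean()
print(composite_map.shape, round(float(accuracy), 3))
```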

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a forecasting technique for forward energy prices, one day ahead. This technique combines a wavelet transform and forecasting models such as the multi-layer perceptron, linear regression, or GARCH. These techniques are applied to real data from the UK gas markets to evaluate their performance. The results show that the forecasting accuracy is improved significantly by using the wavelet transform. The methodology can also be applied to forecasting market clearing prices and electricity/gas loads.
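
A minimal sketch of a wavelet-plus-regression forecast of this kind, assuming the PyWavelets and scikit-learn libraries and simulated prices rather than UK gas data; linear regression stands in for the MLP and GARCH variants:

```python
import numpy as np
import pywt
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)
prices = np.cumsum(rng.normal(size=512)) + 50.0    # stand-in for forward prices

# Split the series into additive wavelet components: reconstruct each level
# with every other coefficient array zeroed, so the components sum back to
# (approximately) the original series.
coeffs = pywt.wavedec(prices, "db4", level=3)
components = []
for i in range(len(coeffs)):
    kept = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
    components.append(pywt.waverec(kept, "db4")[: len(prices)])

def forecast_next(x, n_lags=5):
    """One-step-ahead forecast of a 1-D series from its own lags."""
    X = np.column_stack([x[i: len(x) - n_lags + i] for i in range(n_lags)])
    y = x[n_lags:]
    return float(LinearRegression().fit(X, y).predict(x[-n_lags:].reshape(1, -1)))

# One-day-ahead forecast: forecast each component and sum the results.
price_forecast = sum(forecast_next(c) for c in components)
print(round(price_forecast, 2))
```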

Relevance:

30.00%

Publisher:

Abstract:

Most traditional methods for extracting the relationships between two time series are based on cross-correlation. In a non-linear non-stationary environment, these techniques are not sufficient. We show in this paper how to use hidden Markov models (HMMs) to identify the lag (or delay) between different variables for such data. We first present a method using maximum likelihood estimation and propose a simple algorithm which is capable of identifying associations between variables. We also adopt an information-theoretic approach and develop a novel procedure for training HMMs to maximise the mutual information between delayed time series. Both methods are successfully applied to real data. We model the oil drilling process with HMMs and estimate a crucial parameter, namely the lag for return.
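
For contrast with the HMM-based methods, a much simpler delay estimator is a scan over candidate lags that maximises an estimate of the mutual information between the two series. The sketch below applies that simpler scan to toy data; it is not the training procedure developed in the paper.

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Histogram estimate of the mutual information between two series."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    p_xy = joint / joint.sum()
    p_x = p_xy.sum(axis=1, keepdims=True)
    p_y = p_xy.sum(axis=0, keepdims=True)
    nz = p_xy > 0
    return float(np.sum(p_xy[nz] * np.log(p_xy[nz] / (p_x @ p_y)[nz])))

# Toy pair of series in which y lags x by 7 samples, plus noise.
rng = np.random.default_rng(4)
x = np.sin(np.linspace(0, 40, 1000)) + 0.3 * rng.normal(size=1000)
true_lag = 7
y = np.roll(x, true_lag) + 0.3 * rng.normal(size=1000)

# Scan candidate lags and keep the one that maximises mutual information.
lags = range(0, 30)
scores = [mutual_information(x[: len(x) - d], y[d:]) for d in lags]
print("estimated lag:", lags[int(np.argmax(scores))])
```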

Relevance:

30.00%

Publisher:

Abstract:

This thesis is a study of three techniques to improve the performance of some standard forecasting models, with application to energy demand and prices. We focus on forecasting demand and price one day ahead. First, the wavelet transform was used as a pre-processing procedure with two approaches: multicomponent forecasts and direct forecasts. We empirically compared these approaches and found that the former consistently outperformed the latter. Second, adaptive models were introduced to continuously update model parameters in the testing period by combining filters with standard forecasting methods. Among these adaptive models, the adaptive LR-GARCH model was proposed for the first time in this thesis. Third, with regard to the noise distributions of the dependent variables in the forecasting models, we used either Gaussian or Student-t distributions. This thesis proposes a novel algorithm to infer the parameters of Student-t noise models. The method is an extension of earlier work for models that are linear in their parameters to the non-linear multilayer perceptron; it therefore broadens the range of models that can use a Student-t noise distribution. Because these techniques cannot stand alone, they must be combined with prediction models to improve performance. We combined them with some standard forecasting models: the multilayer perceptron, radial basis functions, linear regression, and linear regression with GARCH. These techniques and forecasting models were applied to two datasets from the UK energy markets: daily electricity demand (which is stationary) and gas forward prices (non-stationary). The results showed that these techniques provided good improvements in prediction performance.
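
A hedged sketch of the Student-t noise idea, not the thesis's inference algorithm: fit a model that is linear in its parameters by maximising a Student-t likelihood, using SciPy for the distribution and optimiser; all data below are simulated.

```python
import numpy as np
from scipy import optimize, stats

# Simulated regression data with heavy-tailed (Student-t) noise.
rng = np.random.default_rng(5)
X = np.column_stack([np.ones(200), rng.normal(size=200)])
true_w = np.array([1.0, 2.0])
y = X @ true_w + stats.t.rvs(df=3, scale=0.5, size=200, random_state=6)

def neg_log_lik(params):
    """Negative Student-t log-likelihood; scale and degrees of freedom are
    kept positive by optimising their logarithms."""
    w, log_scale, log_df = params[:2], params[2], params[3]
    resid = y - X @ w
    return -np.sum(stats.t.logpdf(resid, df=np.exp(log_df), scale=np.exp(log_scale)))

start = np.array([0.0, 0.0, 0.0, np.log(5.0)])
result = optimize.minimize(neg_log_lik, start, method="Nelder-Mead",
                           options={"maxiter": 5000})
w_hat = result.x[:2]
scale_hat, df_hat = np.exp(result.x[2]), np.exp(result.x[3])
print(np.round(w_hat, 2), round(scale_hat, 2), round(df_hat, 1))
```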

Relevance:

30.00%

Publisher:

Abstract:

This paper presents some forecasting techniques for energy demand and price prediction, one day ahead. These techniques combine wavelet transform (WT) with fixed and adaptive machine learning/time series models (multi-layer perceptron (MLP), radial basis functions, linear regression, or GARCH). To create an adaptive model, we use an extended Kalman filter or particle filter to update the parameters continuously on the test set. The adaptive GARCH model is a new contribution, broadening the applicability of GARCH methods. We empirically compared two approaches of combining the WT with prediction models: multicomponent forecasts and direct forecasts. These techniques are applied to large sets of real data (both stationary and non-stationary) from the UK energy markets, so as to provide comparative results that are statistically stronger than those previously reported. The results showed that the forecasting accuracy is significantly improved by using the WT and adaptive models. The best models on the electricity demand/gas price forecast are the adaptive MLP/GARCH with the multicomponent forecast; their MSEs are 0.02314 and 0.15384 respectively.
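
The adaptive-model idea can be illustrated with a linear model whose weights are updated sequentially by a standard Kalman filter; this is a simplified stand-in for the extended Kalman and particle filters used in the paper, run on simulated data.

```python
import numpy as np

# Simulated regression data; the true weights are what the filter should recover.
rng = np.random.default_rng(7)
n, d = 500, 3
X = rng.normal(size=(n, d))
w_true = np.array([1.0, -0.5, 2.0])
y = X @ w_true + 0.2 * rng.normal(size=n)

w = np.zeros(d)              # weight estimate (state mean)
P = np.eye(d)                # state covariance
Q = 1e-4 * np.eye(d)         # random-walk (process) noise covariance
R = 0.04                     # observation noise variance

for x_t, y_t in zip(X, y):
    # Predict: the weights follow a random walk, so only the covariance grows.
    P = P + Q
    # Update with the new observation y_t = x_t @ w + noise.
    S = x_t @ P @ x_t + R                 # innovation variance
    K = P @ x_t / S                       # Kalman gain
    w = w + K * (y_t - x_t @ w)
    P = P - np.outer(K, x_t) @ P

print(np.round(w, 2))        # should be close to w_true
```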

Relevance:

30.00%

Publisher:

Abstract:

Amongst all the objectives in the study of time series, uncovering the dynamic law of their generation is probably the most important. When the underlying dynamics are not available, time series modelling consists of developing a model which best explains a sequence of observations. In this thesis, we consider hidden space models for analysing and describing time series. We first provide an introduction to the principal concepts of hidden state models and draw an analogy between hidden Markov models and state space models. Central ideas such as hidden state inference or parameter estimation are reviewed in detail. A key part of multivariate time series analysis is identifying the delay between different variables. We present a novel approach for time delay estimation in a non-stationary environment. The technique makes use of hidden Markov models, and we demonstrate its application for estimating a crucial parameter in the oil industry. We then focus on hybrid models that we call dynamical local models. These models combine and generalise hidden Markov models and state space models. Probabilistic inference is unfortunately computationally intractable, and we show how to make use of variational techniques for approximating the posterior distribution over the hidden state variables. Experimental simulations on synthetic and real-world data demonstrate the application of dynamical local models for segmenting a time series into regimes and providing predictive distributions.
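
As a concrete example of hidden state inference in the simplest setting, the sketch below implements the textbook forward algorithm for a discrete hidden Markov model; it is not the thesis's dynamical local models or their variational approximation, and the parameters are invented.

```python
import numpy as np

def forward(pi, A, B, obs):
    """Filtered state probabilities P(state_t | obs_1..t) for a discrete HMM.
    pi: initial state probabilities (K,), A: transition matrix (K, K),
    B: emission probabilities (K, M), obs: sequence of observation indices."""
    alpha = np.zeros((len(obs), len(pi)))
    alpha[0] = pi * B[:, obs[0]]
    alpha[0] /= alpha[0].sum()                       # normalise to avoid underflow
    for t in range(1, len(obs)):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
        alpha[t] /= alpha[t].sum()
    return alpha

pi = np.array([0.6, 0.4])
A = np.array([[0.9, 0.1], [0.2, 0.8]])
B = np.array([[0.7, 0.3], [0.1, 0.9]])
print(forward(pi, A, B, obs=[0, 0, 1, 1, 1]).round(3))
```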

Relevance:

30.00%

Publisher:

Abstract:

This empirical study employs a different methodology to examine the change in wealth associated with mergers and acquisitions (M&As) for US firms. Specifically, we employ the standard CAPM, the Fama-French three-factor model and the Carhart four-factor model within the OLS and GJR-GARCH estimation methods to test the behaviour of the cumulative abnormal returns (CARs). Whilst the standard CAPM captures the variability of stock returns with the overall market, the Fama-French factors capture the risk factors that are important to investors. Additionally, augmenting the Fama-French three-factor model with the Carhart momentum factor to generate the four-factor model captures additional pricing elements that may affect stock returns. Traditionally, estimates of abnormal returns (ARs) in M&A situations rely on the standard OLS estimation method. However, standard OLS will provide inefficient estimates of the ARs if the data contain ARCH and asymmetric effects. To minimise this problem of estimation efficiency, we re-estimated the ARs using the GJR-GARCH estimation method. We find that the results vary with both the choice of model and the estimation method. Beyond these variations, we also tested whether the ARs are affected by the degree of liquidity of the stocks and the size of the firm. We document significant positive post-announcement CARs for target firm shareholders under both the OLS and GJR-GARCH methods across all three methodologies. However, post-event CARs for acquiring firm shareholders were insignificant for both estimation methods under the three methodologies. The GJR-GARCH method seems to generate larger CARs than the OLS method. Using both market capitalization and trading volume as measures of liquidity and firm size, we observed strong return continuations in medium firms relative to small and large firms for target shareholders. We consistently observed market efficiency in small and large firms, which implies that small and large target firms overreact to new information, resulting in a more efficient market. For acquirer firms, our measure of liquidity captures strong return continuations for small firms under the OLS estimates for both the CAPM and the Fama-French three-factor model, whilst under the GJR-GARCH estimates only for the Carhart model. Post-announcement bootstrapping simulations of the CARs confirmed our earlier results.
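
A stripped-down sketch of the event-study mechanics on toy data: estimate a Fama-French-style three-factor regression by OLS over a pre-event window, then cumulate abnormal returns over the event window. The factor values, betas, and window lengths are illustrative, and the paper's GJR-GARCH estimation is not reproduced here.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(8)
n_est, n_event = 250, 21                         # estimation and event windows
factors = rng.normal(scale=0.01, size=(n_est + n_event, 3))   # Mkt-RF, SMB, HML
betas = np.array([1.1, 0.4, -0.2])
returns = 0.0002 + factors @ betas + rng.normal(scale=0.015, size=n_est + n_event)

# Estimate the factor loadings on the pre-event window only.
X_est = sm.add_constant(factors[:n_est])
ols = sm.OLS(returns[:n_est], X_est).fit()

# Abnormal returns in the event window: actual minus model-predicted returns.
X_event = sm.add_constant(factors[n_est:])
abnormal = returns[n_est:] - ols.predict(X_event)
car = abnormal.cumsum()                          # cumulative abnormal returns

print(round(float(car[-1]), 4))                  # CAR over the event window
```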

Relevance:

30.00%

Publisher:

Abstract:

The accurate identification of T-cell epitopes remains a principal goal of bioinformatics within immunology. As the immunogenicity of peptide epitopes is dependent on their binding to major histocompatibility complex (MHC) molecules, the prediction of binding affinity is a prerequisite to the reliable prediction of epitopes. The iterative self-consistent (ISC) partial-least-squares (PLS)-based additive method is a recently developed bioinformatic approach for predicting class II peptide-MHC binding affinity. The ISC-PLS method overcomes many of the conceptual difficulties inherent in the prediction of class II peptide-MHC affinity, such as the binding of a mixed population of peptide lengths due to the open-ended class II binding site. The method has applications in both the accurate prediction of class II epitopes and the manipulation of affinity for heteroclitic and competitor peptides. The method is applied here to six class II mouse alleles (I-Ab, I-Ad, I-Ak, I-As, I-Ed, and I-Ek) and included peptides up to 25 amino acids in length. A series of regression equations highlighting the quantitative contributions of individual amino acids at each peptide position was established. The initial model for each allele exhibited only moderate predictivity. Once the set of selected peptide subsequences had converged, the final models exhibited a satisfactory predictive power. Convergence was reached between the 4th and 17th iterations, and the leave-one-out cross-validation statistical terms (q², SEP, and NC) ranged between 0.732 and 0.925, 0.418 and 0.816, and 1 and 6, respectively. The non-cross-validated statistical terms r² and SEE ranged between 0.98 and 0.995 and 0.089 and 0.180, respectively. The peptides used in this study are available from the AntiJen database (http://www.jenner.ac.uk/AntiJen). The PLS method is available commercially in the SYBYL molecular modeling software package. The resulting models, which can be used for accurate T-cell epitope prediction, will be made freely available online (http://www.jenner.ac.uk/MHCPred).
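
The cross-validation statistics can be illustrated with a generic PLS regression and leave-one-out predictions on simulated descriptors; scikit-learn is assumed, and this is not the ISC-PLS additive method itself.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

# Toy peptide descriptor matrix and binding affinities (not the AntiJen data).
rng = np.random.default_rng(9)
n_peptides, n_descriptors = 60, 40
X = rng.normal(size=(n_peptides, n_descriptors))
true_w = rng.normal(size=n_descriptors) * (rng.random(n_descriptors) < 0.3)
y = X @ true_w + 0.5 * rng.normal(size=n_peptides)   # stand-in for log affinity

pls = PLSRegression(n_components=4)                  # NC, normally chosen by CV
y_loo = cross_val_predict(pls, X, y, cv=LeaveOneOut()).ravel()

press = np.sum((y - y_loo) ** 2)                     # predictive residual sum of squares
q2 = 1.0 - press / np.sum((y - y.mean()) ** 2)       # cross-validated q2
sep = np.sqrt(press / n_peptides)                    # standard error of prediction
print(round(q2, 3), round(sep, 3))
```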

Relevance:

30.00%

Publisher:

Abstract:

Our approach to knowledge representation is based on the idea of an expert system shell. First, we build a graph shell of both possible dependencies and possible actions. Then, reasoning by means of log-linear models, we activate some of the nodes and some of the directed links. In this way, a Bayesian network and networks representing the log-linear models are generated.
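
A minimal sketch of the graph-shell idea as a data structure; the candidate links and the "supported" set below are invented for illustration, and the log-linear analysis itself is not implemented.

```python
# Shell of all possible directed dependencies between variables.
possible_links = {
    ("smoking", "cancer"),
    ("smoking", "bronchitis"),
    ("pollution", "cancer"),
    ("cancer", "xray"),
}

def activate(shell, supported):
    """Keep only the candidate links that the (log-linear) analysis supports."""
    return {link for link in shell if link in supported}

# Links judged significant by some association analysis (assumed here).
supported_links = {("smoking", "cancer"), ("cancer", "xray")}
network_edges = activate(possible_links, supported_links)
print(sorted(network_edges))
```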