913 results for Serial correlation
Abstract:
Multiparameter cerebral monitoring has been widely applied in traumatic brain injury to study posttraumatic pathophysiology and to manage head-injured patients (e.g., combining O2 and pH sensors with cerebral microdialysis). Because a comprehensive approach towards understanding injury processes will also require functional measures, we have added electrophysiology to these monitoring modalities by attaching a recording electrode to the microdialysis probe. These dual-function (microdialysis/electrophysiology) probes were placed in rats following experimental fluid percussion brain injuries, and in a series of severely head-injured human patients. Electrical activity (cell firing, EEG) was monitored concurrently with microdialysis sampling of extracellular glutamate, glucose and lactate. Electrophysiological parameters (firing rate, serial correlation, field potential occurrences) were analyzed offline and compared to dialysate concentrations. In rats, these probes demonstrated an injury-induced suppression of neuronal firing (from a control level of 2.87 to 0.41 spikes/sec postinjury), which was associated with increases in extracellular glutamate and lactate, and decreases in glucose levels. When placed in human patients, the probes detected sparse and slowly firing cells (mean = 0.21 spikes/sec), with most units (70%) exhibiting a lack of serial correlation in the spike train. In some patients, spontaneous field potentials were observed, suggesting synchronously firing neuronal populations. In both the experimental and clinical applications, the addition of the recording electrode did not appreciably affect the performance of the microdialysis probe. The results suggest that this technique provides a functional monitoring capability that cannot be obtained when electrophysiology is measured with surface or epidural EEG alone.
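For readers unfamiliar with the spike-train statistics mentioned here, below is a minimal sketch of how firing rate and serial correlation of interspike intervals can be computed (Python; the spike times are simulated stand-ins, not the study's data):

```python
# Sketch: firing rate and lag-1 serial correlation of a spike train,
# in the spirit of the offline analysis described above. Hypothetical
# data; spike_times is an array of spike timestamps in seconds.
import numpy as np

spike_times = np.sort(np.random.default_rng(0).uniform(0, 60, 13))  # ~0.2 spikes/sec

firing_rate = len(spike_times) / (spike_times[-1] - spike_times[0])

# Serial correlation of successive interspike intervals (ISIs):
# a lag-1 Pearson correlation near zero suggests a renewal-like
# (memoryless) spike train, as reported for most human units here.
isi = np.diff(spike_times)
r_lag1 = np.corrcoef(isi[:-1], isi[1:])[0, 1]

print(f"firing rate = {firing_rate:.2f} spikes/sec, lag-1 ISI correlation = {r_lag1:.2f}")
```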
Abstract:
A detailed paleomagnetic and rock-magnetic investigation was conducted on thirty-six basaltic flows of the ~1095 Ma Portage Lake Volcanics. The flows were sampled along the East Adit of the Quincy Mine (Hancock, MI). Thirty-two flows yielded well-defined primary magnetization directions carried by magnetite. A secondary magnetization component carried by hematite was also found in twenty-nine flows. After correction for serial correlation between the flows, nineteen independent mean directions were calculated. The corresponding paleomagnetic pole is located at 25.5°N, 182.1°W (A95 = 3.5°). The new pole overlaps with the pole from the ~1087 Ma Lake Shore Traps, suggesting a standstill of the North American plate during that time period. The low angular dispersion of virtual geomagnetic poles (S = 7.9°) suggests that the flows were erupted within a short time period, or that the strength of geomagnetic secular variation was lower than that of the recent field.
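The angular dispersion statistic S quoted above is a standard paleomagnetic quantity. Below is a minimal sketch of its computation, assuming hypothetical VGP positions rather than the study's data:

```python
# Sketch: angular dispersion S of virtual geomagnetic poles (VGPs),
# the statistic quoted above (S = 7.9 deg). The VGP latitudes and
# longitudes below are hypothetical illustration data.
import numpy as np

def to_cartesian(lat_deg, lon_deg):
    lat, lon = np.radians(lat_deg), np.radians(lon_deg)
    return np.array([np.cos(lat) * np.cos(lon),
                     np.cos(lat) * np.sin(lon),
                     np.sin(lat)])

vgps = np.array([to_cartesian(lat, lon)
                 for lat, lon in [(84, 200), (80, 175), (86, 160), (82, 190)]])

# Mean pole: normalized vector sum of the unit poles (Fisher mean).
mean_pole = vgps.sum(axis=0)
mean_pole /= np.linalg.norm(mean_pole)

# Angular distance of each VGP from the mean pole, in degrees.
delta = np.degrees(np.arccos(np.clip(vgps @ mean_pole, -1.0, 1.0)))

# Angular standard deviation S, with N-1 in the denominator.
S = np.sqrt(np.sum(delta**2) / (len(delta) - 1))
print(f"S = {S:.1f} degrees")
```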
Abstract:
We introduce an algorithm (called REDFITmc2) for spectrum estimation in the presence of timescale errors. It is based on the Lomb-Scargle periodogram for unevenly spaced time series, in combination with Welch's Overlapped Segment Averaging procedure, bootstrap bias correction and persistence estimation. The timescale errors are modelled parametrically and included in the simulations for determining (1) the upper levels of the spectrum of the red-noise AR(1) alternative and (2) the uncertainty of the frequency of a spectral peak. Application of REDFITmc2 to ice core and stalagmite records of palaeoclimate allowed a more realistic evaluation of spectral peaks than when this source of uncertainty is ignored. The results support qualitatively the intuition that stronger effects on the spectrum estimate (decreased detectability and increased frequency uncertainty) occur at higher frequencies. The added value of REDFITmc2 is that it quantifies those effects. Regarding timescale construction, not only the fixpoints, dating errors and the functional form of the age-depth model play a role; the joint distribution of all time points (serial correlation, stratigraphic order) also determines the spectrum estimate.
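Below is a minimal sketch of the Lomb-Scargle building block underlying REDFITmc2, omitting the WOSA segmenting, bias correction and timescale-error simulations that are the algorithm's actual contribution; the data are synthetic:

```python
# Sketch: Lomb-Scargle periodogram of an unevenly spaced series, the
# core ingredient of REDFITmc2. Synthetic data with a 100-unit period.
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0, 1000, 300))          # uneven sampling times
y = np.sin(2 * np.pi * t / 100) + rng.normal(0, 1, t.size)
y -= y.mean()                                   # remove the mean first

freqs = np.linspace(0.001, 0.05, 500)           # cycles per time unit
power = lombscargle(t, y, 2 * np.pi * freqs)    # expects angular freq.

print(f"peak at frequency ~{freqs[np.argmax(power)]:.4f} (true: 0.0100)")
```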
Abstract:
Serial correlation of extreme midlatitude cyclones observed at the storm track exits is explained by deviations from a Poisson process. To model these deviations, we apply fractional Poisson processes (FPPs) to extreme midlatitude cyclones, which are defined by the 850 hPa relative vorticity of the ERA-Interim reanalysis during boreal winter (DJF) and summer (JJA) seasons. Extremes are defined by a 99% quantile threshold in the grid-point time series. In general, FPPs are based on long-term memory and lead to non-exponential return time distributions. The return times are described by a Weibull distribution to approximate the Mittag–Leffler function in the FPPs. The Weibull shape parameter yields a dispersion parameter that agrees with results found for midlatitude cyclones. The memory of the FPP, which is determined by detrended fluctuation analysis, provides an independent estimate of the shape parameter. Thus, the analysis provides a concise framework linking the deviation from Poisson statistics (via a dispersion parameter), non-exponential return times and memory (correlation) through a single parameter. The results have potential implications for the predictability of extreme cyclones.
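Below is a minimal sketch of the Weibull fit to return times described above, on synthetic data; the shape parameter plays the role of the dispersion parameter, with values below 1 indicating clustering relative to a Poisson process:

```python
# Sketch: fitting a Weibull distribution to extreme-event return times,
# as used above to approximate the Mittag-Leffler return-time law of a
# fractional Poisson process. Synthetic return times for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
return_times = stats.weibull_min.rvs(c=0.8, scale=50, size=2000, random_state=rng)

# Fix the location at zero; a shape parameter c < 1 indicates
# clustering (over-dispersion) relative to a Poisson process (c = 1).
shape, loc, scale = stats.weibull_min.fit(return_times, floc=0)
print(f"Weibull shape = {shape:.2f} (c < 1 implies serial clustering)")
```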
Abstract:
This paper analyses the time series behaviour of the initial public offering (IPO) market using an equilibrium model of demand and supply that incorporates the number of new issues, average underpricing, and general market conditions. Model predictions include the existence of serial correlation in both the number of new issues and the average level of underpricing, as well as interactions between these variables and the impact of general market conditions. The model is tested using 40 years of monthly IPO data. The empirical results are generally consistent with predictions.
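Below is a minimal sketch of how serial correlation in a monthly new-issue series might be tested, using a Ljung-Box test on simulated AR(1) data standing in for actual IPO counts (not the paper's method or data):

```python
# Sketch: testing for serial correlation in a monthly IPO-count series,
# one of the model's predictions above. Simulated AR(1) data stand in
# for actual new-issue counts.
import numpy as np
from statsmodels.stats.diagnostic import acorr_ljungbox

rng = np.random.default_rng(3)
n = 480                          # 40 years of monthly observations
counts = np.zeros(n)
for i in range(1, n):
    counts[i] = 0.7 * counts[i - 1] + rng.normal()

# Ljung-Box test: small p-values reject the null of no autocorrelation.
print(acorr_ljungbox(counts, lags=[12]))
```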
Abstract:
In this article we investigate the asymptotic and finite-sample properties of predictors of regression models with autocorrelated errors. We prove new theorems on the predictive efficiency of generalized least squares (GLS) and incorrectly structured GLS predictors. We also derive their predictive mean squared errors and establish the magnitude of these errors relative to each other and to those generated by the ordinary least squares (OLS) predictor. A large simulation study is used to evaluate the finite-sample performance of forecasts generated from models using different corrections for the serial correlation.
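Below is a minimal sketch contrasting OLS with feasible GLS under AR(1) errors, the setting studied above; the data-generating process and parameter values are illustrative assumptions:

```python
# Sketch: comparing OLS and feasible GLS estimates on a regression
# with AR(1) errors. statsmodels' GLSAR iterates between estimating
# the AR coefficient from residuals and refitting by GLS.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 200
x = rng.normal(size=n)
e = np.zeros(n)
for i in range(1, n):                 # AR(1) errors with rho = 0.8
    e[i] = 0.8 * e[i - 1] + rng.normal()
y = 1.0 + 2.0 * x + e

X = sm.add_constant(x)
ols = sm.OLS(y, X).fit()
gls = sm.GLSAR(y, X, rho=1).iterative_fit(maxiter=10)  # AR(1) correction

print("OLS betas:  ", ols.params)
print("GLSAR betas:", gls.params, " estimated rho:", gls.model.rho)
```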
Abstract:
In this article a partial-adjustment model, which shows how equity prices fail to adjust instantaneously to new information, is estimated using a Kalman filter. For the components of the Dow Jones Industrial 30 index, I aim to identify whether overreaction or noise causes the serial correlation and high volatility associated with opening returns. I find that the tendency for overreaction in opening prices is much stronger than for closing prices; overreaction rather than noise may therefore account for the differences in the behavior of opening and closing returns.
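Below is a minimal sketch of a scalar Kalman filter applied to a partial-adjustment price model; the state-space parameterization and all parameter values are illustrative assumptions, not the article's specification:

```python
# Sketch: a scalar Kalman filter for a partial-adjustment price model.
# The latent "true value" v follows a random walk; the observed price p
# adjusts only partially toward it each period. All parameters below
# are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(5)
n, g = 500, 0.6                       # g = partial-adjustment speed
v = np.cumsum(rng.normal(0, 0.5, n))  # latent value (random walk)
p = np.zeros(n)
for t in range(1, n):
    p[t] = p[t - 1] + g * (v[t] - p[t - 1]) + rng.normal(0, 0.2)

# Local-level Kalman filter for the latent value given observed prices.
v_hat, P, q, r = 0.0, 1.0, 0.25, 0.04     # state, variance, Q, R
estimates = []
for obs in p:
    P += q                                # predict (random-walk state)
    k = P / (P + r)                       # Kalman gain
    v_hat += k * (obs - v_hat)            # update with the observation
    P *= (1 - k)
    estimates.append(v_hat)

print(f"corr(filtered, latent) = {np.corrcoef(estimates, v)[0, 1]:.2f}")
```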
Abstract:
The techniques and insights from two distinct areas of financial economic modelling are combined to provide evidence of the influence of firm size on the volatility of stock portfolio returns. Portfolio returns are characterized by positive serial correlation induced by the varying levels of non-synchronous trading among the component stocks. This serial correlation is greatest for portfolios of small firms. The conditional volatility of stock returns has been shown to be well represented by the GARCH family of statistical processes. Using a GARCH model of the variance of capitalization-based portfolio returns, conditioned on the autocorrelation structure in the conditional mean, striking differences related to firm size are uncovered.
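Below is a minimal sketch of the AR(1)-GARCH(1,1) specification described above, conditional variance with autocorrelation in the conditional mean, using the arch package on placeholder returns:

```python
# Sketch: an AR(1)-GARCH(1,1) model for portfolio returns. mean='AR'
# with lags=1 captures the non-synchronous-trading autocorrelation in
# the conditional mean; vol='GARCH', p=1, q=1 is the standard
# GARCH(1,1) conditional variance. Returns here are simulated.
import numpy as np
from arch import arch_model

rng = np.random.default_rng(6)
returns = rng.normal(0, 1, 1000)      # placeholder for portfolio returns

model = arch_model(returns, mean="AR", lags=1, vol="GARCH", p=1, q=1)
result = model.fit(disp="off")
print(result.params)                  # AR coefficient + omega, alpha, beta
```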
Abstract:
This thesis develops bootstrap methods for the factor models that have been widely used to generate forecasts since the pioneering article of Stock and Watson (2002) on diffusion indices. These models accommodate a large number of macroeconomic and financial variables as predictors, a useful feature for incorporating the diverse information available to economic agents. The thesis therefore proposes econometric tools that improve inference in factor models using latent factors extracted from a large panel of observed predictors. It is divided into three complementary chapters, the first two in collaboration with Sílvia Gonçalves and Benoit Perron. In the first chapter, we study how bootstrap methods can be used for inference in models forecasting h periods into the future. To this end, the chapter examines bootstrap inference in a factor-augmented regression setting where the errors may be autocorrelated. It generalizes the results of Gonçalves and Perron (2014) and proposes and justifies two residual-based approaches: the block wild bootstrap and the dependent wild bootstrap. Our simulations show improved coverage rates for confidence intervals on the estimated coefficients using these approaches, compared with asymptotic theory and the wild bootstrap, in the presence of serial correlation in the regression errors. The second chapter proposes bootstrap methods for constructing prediction intervals that relax the assumption of Gaussian innovations. We propose bootstrap prediction intervals for an observation h periods ahead and for its conditional mean. We assume these forecasts are made using a set of factors extracted from a large panel of variables. Because we treat the factors as latent, our forecasts depend on both the estimated factors and the estimated regression coefficients. Under regularity conditions, Bai and Ng (2006) proposed asymptotic intervals constructed under the assumption of Gaussian innovations. The bootstrap allows us to relax this assumption and to construct prediction intervals that are valid under more general conditions. Moreover, even under Gaussianity, the bootstrap yields more accurate intervals when the cross-sectional dimension is relatively small, because it accounts for the bias of the ordinary least squares estimator, as shown in a recent study by Gonçalves and Perron (2014). In the third chapter, we propose consistent selection procedures for factor-augmented regressions in finite samples. We first show that the usual cross-validation method is inconsistent, but that its generalization, leave-d-out cross-validation, selects the smallest set of estimated factors spanning the space generated by the true factors. The second criterion, whose validity we also establish, generalizes the bootstrap approximation of Shao (1996) to factor-augmented regressions. Simulations show an improved probability of parsimoniously selecting the estimated factors compared with the available selection methods.
The empirical application revisits the relationship between macroeconomic and financial factors and excess returns on the US stock market. Among the factors estimated from a large panel of US macroeconomic and financial data, those strongly correlated with interest rate spreads and the Fama-French factors have good predictive power for excess returns.
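Below is a minimal sketch of a residual wild bootstrap in a factor-augmented regression with PCA-estimated factors; the block wild and dependent wild bootstraps proposed in the thesis additionally preserve serial dependence in the residuals, which this plain baseline does not:

```python
# Sketch: residual wild bootstrap for a factor-augmented regression,
# with factors estimated as principal components of a large panel.
# Everything here (panel, coefficients, B = 999) is illustrative.
import numpy as np

rng = np.random.default_rng(7)
T, N, k = 200, 50, 2
panel = rng.normal(size=(T, N))                 # stand-in predictor panel

# Estimate factors as the first k principal components of the panel.
panel_c = panel - panel.mean(axis=0)
_, _, vt = np.linalg.svd(panel_c, full_matrices=False)
F = panel_c @ vt[:k].T

y = F @ np.array([1.0, -0.5]) + rng.normal(size=T)
X = np.column_stack([np.ones(T), F])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ beta

betas = []
for _ in range(999):
    # Rademacher multipliers: flip residual signs at random, which
    # preserves heteroskedasticity but not serial correlation.
    y_star = X @ beta + resid * rng.choice([-1.0, 1.0], size=T)
    betas.append(np.linalg.lstsq(X, y_star, rcond=None)[0])

print("bootstrap s.e. of coefficients:", np.round(np.std(betas, axis=0), 3))
```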
Abstract:
PURPOSE The aim was to assess changes in tumour hypoxia during primary radiochemotherapy (RCT) for head and neck cancer (HNC) and to evaluate their relationship with treatment outcome. MATERIAL AND METHODS Hypoxia was assessed by FMISO-PET in weeks 0, 2 and 5 of RCT. The tumour volume (TV) was determined using FDG-PET/MRI/CT co-registered images. The level of hypoxia was quantified on FMISO-PET as TBRmax (SUVmax of the TV / SUVmean of background). The hypoxic subvolume (HSV) was defined as the part of the TV showing FMISO uptake ≥ 1.4 times blood pool activity. RESULTS Sixteen consecutive patients (T3-4, N+, M0) were included (mean follow-up 31 months, median 44 months). Mean TBRmax decreased significantly (p < 0.05) from 1.94 to 1.57 (week 2) and 1.27 (week 5). Mean HSVs in week 2 and week 5 (HSV2 = 5.8 ml, HSV3 = 0.3 ml) were significantly (p < 0.05) smaller than at baseline (HSV1 = 15.8 ml). Kaplan-Meier plots of local recurrence-free survival stratified at the median TBRmax showed superior local control for less hypoxic tumours, the difference being significant at baseline and after 2 weeks (p = 0.031, p = 0.016). CONCLUSIONS FMISO-PET showed that in most HNCs reoxygenation starts early during RCT and correlates with better outcome.
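Below is a minimal sketch of the TBRmax and hypoxic-subvolume calculations defined above; the voxel arrays, blood-pool value and voxel size are hypothetical:

```python
# Sketch: quantifying hypoxia on FMISO-PET as defined above. TBRmax is
# the tumour SUVmax over the background mean; the hypoxic subvolume is
# tumour tissue with uptake >= 1.4x blood-pool activity. All inputs
# here are simulated stand-ins for real voxel data.
import numpy as np

rng = np.random.default_rng(8)
tumour_suv = rng.gamma(2.0, 0.6, size=5000)       # tumour voxel SUVs
background_suv = rng.normal(1.0, 0.1, size=5000)  # background voxel SUVs
blood_pool_mean = 1.05                            # hypothetical value
voxel_volume_ml = 0.008                           # 2 x 2 x 2 mm voxels

tbr_max = tumour_suv.max() / background_suv.mean()

hypoxic_voxels = np.sum(tumour_suv >= 1.4 * blood_pool_mean)
hsv_ml = hypoxic_voxels * voxel_volume_ml

print(f"TBRmax = {tbr_max:.2f}, hypoxic subvolume = {hsv_ml:.1f} ml")
```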
Abstract:
Background Tumour necrosis (TN) is recognized to be a consequence of chronic cellular hypoxia. TN and hypoxia correlate with poor prognosis in solid tumours. Methods In a retrospective study, the prognostic implications of the extent of TN were evaluated in non-small cell lung cancer (NSCLC) and correlated with clinicopathological variables and expression of epidermal growth factor receptor, Bcl-2, p53 and matrix metalloproteinase-9 (MMP-9). Tissue specimens from 178 surgically resected cases of stage I-IIIA NSCLC with curative intent were studied. The specimens were routinely processed, formalin-fixed and paraffin-embedded. TN was graded as either extensive or limited/absent by two independent observers; disagreements were resolved using a double-headed microscope. The degree of reproducibility was estimated by re-interpreting 40 randomly selected cases after a 4-month interval. Results Reproducibility was attained in 36/40 cases (kappa score = 0.8, P < 0.001). TN correlated with T-stage (P = 0.001), platelet count (P = 0.004) and p53 expression (P = 0.031). Near-significant associations of TN with N-stage (P = 0.063) and MMP-9 expression (P = 0.058) were seen. No association was found with angiogenesis (P = 0.98). On univariate (P = 0.0016) and multivariate analysis (P = 0.023), TN was prognostic. Conclusion These results indicate that extensive TN reflects an aggressive tumour phenotype in NSCLC and may improve the predictive power of the TNM staging system. The lack of association between TN and angiogenesis may be important, although these variables were not evaluated on serial sections.
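Below is a minimal sketch of the inter-observer agreement statistic quoted above (Cohen's kappa), computed on hypothetical binary gradings:

```python
# Sketch: inter-observer agreement via Cohen's kappa, the statistic
# quoted above (kappa = 0.8 on 40 re-read cases). The labels are
# hypothetical gradings (extensive vs limited/absent necrosis).
from sklearn.metrics import cohen_kappa_score

observer_1 = ["ext", "lim", "ext", "lim", "lim", "ext", "ext", "lim",
              "ext", "lim"] * 4          # 40 re-interpreted cases
observer_2 = list(observer_1)
observer_2[3] = "ext"                    # a few disagreements
observer_2[17] = "ext"

kappa = cohen_kappa_score(observer_1, observer_2)
print(f"Cohen's kappa = {kappa:.2f}")
```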
Abstract:
Ischemic stroke (IS) is a heterogeneous disease in which outcome is influenced by many factors. The hemostatic system is activated in association with cerebral ischemia, and thus, markers measuring coagulation, fibrinolysis, and vasoactivity could be useful tools in clinical practice. We investigated whether repeated measurements of these markers reveal patterns that might help in evaluating IS patients, including the early diagnosis of stroke subtypes, in estimating prognosis and risk of recurrence, and in selecting a treatment for secondary prevention of stroke. Vasoconstrictor peptide endothelin-1 (ET-1), homocysteine (Hcy), indicators of thrombin formation and activation (prothrombin fragment 1+2/F1+2, thrombin-antithrombin complex/TAT), indicators of plasmin formation and fibrinolysis (tissue plasminogen activator/t-PA, plasminogen activator inhibitor-1/PAI-1, and D-dimer), and natural anticoagulants (antithrombin/AT, protein C/PC, and protein S/PS) were measured in 102 consecutive mild to moderate IS patients on four occasions: on admission and at 1 week, 1 month, and 3 months after stroke, and once in controls. All patients underwent neurological examination and blood sampling in the same session. Furthermore, 42 IS patients with heterozygous factor V Leiden mutation (FVLm) were selected from 740 IS patients without an obvious etiology and evaluated in detail for specific clinical, laboratory, and radiological features. Measurements of ET-1 and Hcy levels did not disclose information that could aid in the diagnostic evaluation of IS patients. The F1+2 level at 3 months after IS correlated positively with recurrence of thromboembolic events, and thus may be used as a predictive marker of subsequent cerebral events. The D-dimer and AT levels on admission and 1 week after IS were strongly associated with stroke severity, outcome, and disability. IS patients with FVLm more often had a positive family history of thrombosis, a higher prevalence of peripheral vascular disease, and multiple infarctions on brain imaging, most of which were 'silent infarcts'. Results of this study support the view that IS patients with sustained activation of both the fibrinolytic and the coagulation systems and increased thrombin generation may have an unfavorable prognosis. The level of activation may reflect the ongoing thrombotic process and the extent of thrombosis. Changes in these markers could be useful in predicting the prognosis of IS patients. A clear need exists for a randomized prospective study to determine whether a subgroup of IS patients with markers indicating activation of the fibrinolytic and coagulation systems might benefit from more aggressive secondary prevention of IS.
Abstract:
As System-on-Chip (SoC) designs migrate to the 28 nm process node and beyond, the electromagnetic (EM) co-interactions of the chip, package, and printed circuit board (PCB) become critical and require accurate and efficient characterization and verification. In this paper a fast, scalable, and parallelized boundary-element-based integral EM solution to Maxwell's equations is presented. The accuracy of the full-wave formulation, for complete EM characterization, has been validated on both canonical structures and a real-world 3-D system (chip + package + PCB). Good correlation between numerical simulation and measurement has been achieved. A few examples of the applicability of the formulation to high-speed digital and analog serial interfaces on a 45 nm SoC are also presented.
Abstract:
Ninety-one patients were studied serially for chimeric status following allogeneic stem cell transplantation (SCT) for severe aplastic anaemia (SAA) or Fanconi anaemia (FA). Short tandem repeat polymerase chain reaction (STR-PCR) was used to stratify patients into five groups: (A) complete donor chimeras (n = 39), (B) transient mixed chimeras (n = 15), (C) stable mixed chimeras (n = 18), (D) progressive mixed chimeras (n = 14), and (E) recipient chimeras with early graft rejection (n = 5). As serial sampling was not possible in group E, serial chimerism results were available for analysis in 86 patients. The following factors were analysed for association with chimeric status: age, sex match, donor type, aetiology of aplasia, source of stem cells, number of cells engrafted, conditioning regimen, graft-versus-host disease (GvHD) prophylaxis, occurrence of acute and chronic GvHD, and survival. Progressive mixed chimeras (PMCs) were at high risk of late graft rejection (n = 10, P < 0.0001). Seven of these patients lost their graft during withdrawal of immunosuppressive therapy. STR-PCR indicated an inverse correlation between detection of recipient cells post-SCT and occurrence of acute GvHD (P = 0.008). PMC was a poor prognostic indicator of survival (P = 0.003). Monitoring of chimeric status during cyclosporin withdrawal may facilitate therapeutic intervention to prevent late graft rejection in patients transplanted for SAA.