147 results for Robust estimates

in QUB Research Portal - Research Directory and Institutional Repository for Queen's University Belfast


Relevance:

60.00%

Publisher:

Abstract:

Value-at-risk (VaR) forecasting generally relies on a parametric density function of portfolio returns that ignores higher moments or assumes them constant. In this paper, we propose a simple approach to forecasting portfolio VaR. We employ the Gram-Charlier expansion (GCE), augmenting the standard normal distribution with the first four moments, which are allowed to vary over time. In an extensive empirical study, we compare the GCE approach to other models of VaR forecasting and conclude that it provides accurate and robust estimates of the realized VaR. In spite of its simplicity, on our dataset the GCE outperforms estimates generated by both constant and time-varying higher-moments models.
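As a rough illustration of the idea (not the authors' implementation), the Gram-Charlier expansion augments the normal density with Hermite-polynomial terms in skewness and excess kurtosis, and a VaR forecast can be read off by numerically inverting the resulting CDF. The function names and the bisection inversion below are our own assumptions:

```python
import math

def gc_cdf(z, skew, kurt):
    """CDF of the Gram-Charlier expanded standard normal using the first
    four moments: F(z) = Phi(z) - phi(z)[(s/6)He2(z) + ((k-3)/24)He3(z)].
    Note the GC density can be slightly non-monotone for extreme moments."""
    phi = math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)
    Phi = 0.5 * (1 + math.erf(z / math.sqrt(2)))
    he2 = z * z - 1            # Hermite polynomial He2
    he3 = z ** 3 - 3 * z       # Hermite polynomial He3
    return Phi - phi * (skew / 6 * he2 + (kurt - 3) / 24 * he3)

def gce_var(mu, sigma, skew, kurt, alpha=0.01):
    """VaR at level alpha: invert the GC CDF by bisection, then map the
    standardized quantile back to the return scale and flip the sign."""
    lo, hi = -10.0, 10.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if gc_cdf(mid, skew, kurt) < alpha:
            lo = mid
        else:
            hi = mid
    z_alpha = 0.5 * (lo + hi)
    return -(mu + sigma * z_alpha)

# With zero skewness and kurtosis 3, the GCE collapses to the normal case:
print(round(gce_var(0.0, 1.0, 0.0, 3.0, alpha=0.05), 3))  # ≈ 1.645
```

With negative skewness or excess kurtosis the inverted quantile moves further into the left tail, producing the larger VaR that a normal-only model would miss.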

Relevance:

60.00%

Publisher:

Abstract:

Soil carbon stores are a major component of the annual returns required by EU governments to the Intergovernmental Panel on Climate Change. Peat accounts for a high proportion of soil carbon due to the relatively high carbon density of peat and organic-rich soils. For this reason it has become increasingly important to measure and model soil carbon stores and changes in peat stocks to facilitate the management of carbon changes over time. The approach investigated in this research evaluates the use of airborne geophysical (radiometric) data to estimate peat thickness, using the attenuation of bedrock geology radioactivity by superficial peat cover. Remotely sensed radiometric data are validated with ground peat depth measurements combined with non-invasive geophysical surveys. Two field-based case studies exemplify and validate the results. Variography and kriging are used to predict peat thickness from point measurements of peat depth and airborne radiometric data, and provide an estimate of uncertainty in the predictions. Cokriging, by assessing the degree of spatial correlation between recent remotely sensed geophysical monitoring and previous peat depth models, is used to examine changes in peat stocks over time. The significance of the coregionalisation is that the spatial cross-correlation between the remote and ground-based data can be used to update the model of peat depth. The result is that, by integrating remotely sensed data with ground geophysics, the need for extensive ground-based monitoring and invasive peat depth measurements is reduced. The overall goal is to provide robust estimates of peat thickness to improve estimates of carbon stocks. The implications of the research have broader significance, promoting a reduction in the need for damaging on-site peat thickness measurement and greater use of remotely sensed data for carbon stock estimation.
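The kriging step can be sketched in a few lines. The exponential variogram model, its parameter values, the toy depth probes, and the function names below are illustrative assumptions, not the survey's actual geostatistical workflow:

```python
import numpy as np

def exp_variogram(h, nugget=0.0, sill=1.0, rng=1.0):
    """Exponential variogram model gamma(h) with practical range rng."""
    return nugget + (sill - nugget) * (1 - np.exp(-3 * h / rng))

def ordinary_kriging(xy, z, xy0, **vp):
    """Ordinary kriging prediction and variance at one target point xy0.

    xy: (n,2) sample coordinates, z: (n,) peat depths, xy0: (2,) target.
    Solves the standard OK system with a Lagrange multiplier enforcing
    that the weights sum to one."""
    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = exp_variogram(d, **vp)
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = exp_variogram(np.linalg.norm(xy - xy0, axis=1), **vp)
    w = np.linalg.solve(A, b)
    pred = w[:n] @ z
    var = w @ b          # kriging variance (includes Lagrange multiplier)
    return pred, var

# Toy example: three depth probes (metres), predict near the centroid
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
depths = np.array([2.0, 3.0, 2.5])
pred, var = ordinary_kriging(pts, depths, np.array([1/3, 1/3]), sill=1.0, rng=2.0)
print(round(pred, 2), round(var, 3))
```

With a zero nugget, kriging is an exact interpolator: predicting at a probe location returns the measured depth with zero variance, while predictions between probes carry a variance that quantifies the uncertainty mentioned in the abstract.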

Relevance:

60.00%

Publisher:

Abstract:

BACKGROUND: "Cumulative meta-analysis" describes a statistical procedure that recalculates summary estimates from the results of similar trials every time the results of a further trial in the series become available. In the early 1990s, comparisons of cumulative meta-analyses of treatments for myocardial infarction with advice promulgated through medical textbooks showed that research had continued long after robust estimates of treatment effects had accumulated, and that medical textbooks had overlooked strong, existing evidence from trials. Cumulative meta-analyses have subsequently been used to assess what could have been known had new studies been informed by systematic reviews of relevant existing evidence, and how waste might have been reduced.
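The procedure can be illustrated with a minimal fixed-effect (inverse-variance) version, in which the pooled estimate is updated after each new trial. The trial values below are invented, and real cumulative meta-analyses may instead use random-effects models:

```python
import math

def cumulative_meta(effects, ses):
    """Fixed-effect inverse-variance cumulative meta-analysis.

    effects: per-trial effect estimates (e.g. log odds ratios)
    ses:     their standard errors
    Returns the (pooled estimate, pooled SE) after each successive trial."""
    out = []
    sum_w = sum_wx = 0.0
    for x, se in zip(effects, ses):
        w = 1.0 / se ** 2          # inverse-variance weight
        sum_w += w
        sum_wx += w * x
        out.append((sum_wx / sum_w, math.sqrt(1.0 / sum_w)))
    return out

# Invented series of log odds ratios: the pooled SE shrinks as trials accrue
trials = cumulative_meta([-0.4, -0.2, -0.35, -0.3], [0.30, 0.25, 0.20, 0.15])
for est, se in trials:
    print(f"{est:+.3f} ± {se:.3f}")
```

The shrinking pooled standard error is exactly what the early-1990s comparisons exploited: once the cumulative interval excludes no effect, further trials add little information.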

METHODS AND FINDINGS: We used a systematic approach to identify and summarise the findings of cumulative meta-analyses of studies of the effects of clinical interventions, published from 1992 to 2012. Searches were done of PubMed, MEDLINE, EMBASE, the Cochrane Methodology Register and Science Citation Index. A total of 50 eligible reports were identified, including more than 1,500 cumulative meta-analyses. A variety of themes are illustrated with specific examples. The studies showed that initially positive results became null or negative in meta-analyses as more trials were done; that early null or negative results were overturned; that stable results (beneficial, harmful and neutral) would have been seen had a meta-analysis been done before the new trial; and that additional trials had been much too small to resolve the remaining uncertainties.

CONCLUSIONS: This large, unique collection of cumulative meta-analyses highlights how a review of the existing evidence might have helped researchers, practitioners, patients and funders make more informed decisions and choices about new trials over decades of research. This would have led to earlier uptake of effective interventions in practice, less exposure of trial participants to less effective treatments, and reduced waste resulting from unjustified research.

Relevance:

30.00%

Publisher:

Abstract:

Robust joint modelling is an emerging field of research. Through advancements in electronic patient healthcare records, the popularity of joint modelling approaches, which provide simultaneous analysis of longitudinal and survival data, has grown rapidly in recent years. This research advances previous work through the development of a novel robust joint modelling methodology for one of the most common types of standard joint model: that which links a linear mixed model with a Cox proportional hazards model. Through t-distributional assumptions, longitudinal outliers are accommodated, with their detrimental impact down-weighted, thus providing more efficient and reliable estimates. The robust joint modelling technique and its major benefits are showcased through the analysis of Northern Irish end-stage renal disease patients. With an ageing population and the growing prevalence of chronic kidney disease within the United Kingdom, there is a pressing demand to investigate the detrimental relationship between the changing haemoglobin levels of haemodialysis patients and their survival. As outliers within the NI renal data were found to have significantly worse survival, identification of outlying individuals through robust joint modelling may aid nephrologists in improving patients' survival. A simulation study was also undertaken to explore the difference between robust and standard joint models in the presence of increasing proportions and extremity of longitudinal outliers. More efficient and reliable estimates were obtained by robust joint models, with increasing contrast between the robust and standard joint models when a greater proportion of more extreme outliers was present. Through illustration of the gains in efficiency and reliability of parameters when outliers exist, the potential of robust joint modelling is evident. The research presented in this thesis highlights the benefits of, and stresses the need to utilise, a more robust approach to joint modelling in the presence of longitudinal outliers.
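The down-weighting mechanism behind the t-distributional assumption can be sketched for the simplest possible case: a robust location estimate fitted by EM-style reweighting. This toy construction is our own illustration, not the thesis methodology; observations far from the centre receive the standard t EM weight (df + 1)/(df + r²), so outliers are retained but contribute little:

```python
import math

def t_location(y, df=4.0, tol=1e-8, max_iter=200):
    """Robust location estimate under a Student-t likelihood via EM/IRLS.

    Each iteration weights residual r_i = (y_i - mu)/scale by
    (df + 1)/(df + r_i^2), so gross outliers are down-weighted rather
    than discarded."""
    mu = sorted(y)[len(y) // 2]      # start at (a) median value
    scale = 1.0
    for _ in range(max_iter):
        w = [(df + 1) / (df + ((v - mu) / scale) ** 2) for v in y]
        new_mu = sum(wi * v for wi, v in zip(w, y)) / sum(w)
        scale = math.sqrt(sum(wi * (v - new_mu) ** 2
                              for wi, v in zip(w, y)) / len(y))
        if abs(new_mu - mu) < tol:
            mu = new_mu
            break
        mu = new_mu
    return mu

data = [10.1, 9.8, 10.3, 9.9, 10.0, 25.0]   # one gross outlier
print(round(t_location(data), 2))            # stays close to 10; the mean is ~12.5
```

The same reweighting idea, applied to the longitudinal residuals of a linear mixed model inside the joint likelihood, is what yields the more efficient and reliable estimates described above.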

Relevance:

20.00%

Publisher:

Abstract:

The paper outlines the effects of polymer conditioning on alum sludge properties such as floc size, density, fractal dimension (DF) and rheological properties. Experimental results demonstrate that polymer conditioning of alum sludge leads to: larger floc size, with a plateau reached at higher doses; higher densities associated with higher doses; an increased degree of compactness; and an initial decrease followed by an increase in supernatant viscosity with continued increases in polymer dose. The secondary focus of this paper is a comparison of estimates of the optimum dose using different criteria derived from established dewatering tests such as CST, SRF, liquid-phase viscosity and modified SRF, as well as a simple settlement test in terms of CML30. The alum sludge was derived from a water works treating coloured, low-turbidity raw waters.

Relevance:

20.00%

Publisher:

Abstract:

This paper presents a method for generating Pareto-optimal solutions in multi-party negotiations. In this iterative method, decision makers (DMs) formulate proposals that yield a minimum payoff to their opponents. Each proposal belongs to the efficient frontier, and the DMs try to adjust towards a common one. In this setting, each DM is assumed to have a given bargaining power; more precisely, each DM is assumed to hold a subjective estimate of the power of the different parties. We study the convergence of the method, and provide examples in which no agreement results from it.
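A minimal sketch of the Pareto-dominance check that underlies the efficient frontier (the payoff tuples and function name are illustrative assumptions, not the paper's procedure):

```python
def pareto_frontier(proposals):
    """Return the proposals not dominated by any other proposal.

    Each proposal is a tuple of payoffs, one per decision maker. Proposal p
    dominates q if p is at least as good for every DM and strictly better
    for at least one."""
    def dominates(p, q):
        return (all(a >= b for a, b in zip(p, q))
                and any(a > b for a, b in zip(p, q)))
    return [p for p in proposals if not any(dominates(q, p) for q in proposals)]

# Hypothetical two-party offers: (payoff to DM1, payoff to DM2)
offers = [(3, 1), (2, 2), (1, 3), (1, 1), (2, 1)]
print(pareto_frontier(offers))  # → [(3, 1), (2, 2), (1, 3)]
```

Only the non-dominated proposals remain as candidate agreements; the negotiation method described above then concerns how the DMs move along this frontier towards a common point.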