951 results for isometric log ratios (ilr)
Abstract:
There is widespread evidence that the volatility of stock returns displays an asymmetric response to good and bad news. This article considers the impact of asymmetry on time-varying hedges for financial futures. An asymmetric model that allows forecasts of cash and futures return volatility to respond differently to positive and negative return innovations gives superior in-sample hedging performance. However, the simpler symmetric model is not inferior in a hold-out sample. A method for evaluating the models in a modern risk-management framework is presented, highlighting the importance of allowing optimal hedge ratios to be both time-varying and asymmetric.
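As a concrete illustration of the core quantity involved — a minimal sketch, not the article's asymmetric GARCH model — the minimum-variance hedge ratio is the conditional covariance between cash and futures returns divided by the conditional variance of futures returns. The rolling-window estimator and the simulated data below are hypothetical stand-ins for the model-based forecasts:

```python
import numpy as np

def rolling_hedge_ratios(cash, futures, window=60):
    """Time-varying minimum-variance hedge ratio h_t = Cov(cash, fut) / Var(fut),
    estimated over a trailing window (an illustrative stand-in for the
    bivariate GARCH forecasts evaluated in the article)."""
    h = np.full(len(cash), np.nan)
    for t in range(window, len(cash)):
        c, f = cash[t - window:t], futures[t - window:t]
        h[t] = np.cov(c, f)[0, 1] / np.var(f, ddof=1)
    return h

# Hypothetical simulated returns; the true hedge ratio is 0.9
rng = np.random.default_rng(0)
fut = rng.normal(0.0, 0.01, 500)
cash = 0.9 * fut + rng.normal(0.0, 0.004, 500)
print(rolling_hedge_ratios(cash, fut)[-1])  # close to 0.9
```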
Abstract:
For forecasting and economic analysis many variables are used in logarithms (logs). In time series analysis, this transformation is often considered to stabilize the variance of a series. We investigate under which conditions taking logs is beneficial for forecasting. Forecasts based on the original series are compared to forecasts based on logs. For a range of economic variables, substantial forecasting improvements from taking logs are found if the log transformation actually stabilizes the variance of the underlying series. Using logs can be damaging for the forecast precision if a stable variance is not achieved.
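To make the log-versus-levels comparison concrete — a hedged sketch, not the authors' procedure — note that naively exponentiating a log-scale forecast underestimates the conditional mean; under approximately log-normal errors the standard correction adds half the residual variance before exponentiating:

```python
import numpy as np

def back_transform(log_forecast, resid_var):
    """Naive vs. bias-corrected back-transformation of a log-scale forecast.
    If log y is normal with mean mu and variance s2, then E[y] = exp(mu + s2/2)."""
    naive = np.exp(log_forecast)
    corrected = np.exp(log_forecast + resid_var / 2.0)
    return naive, corrected

naive, corrected = back_transform(np.log(100.0), 0.04)
print(naive, corrected)  # 100.0 vs. roughly 102.02
```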
Abstract:
Purpose: Meat and fish consumption are associated with changes in the risk of chronic diseases. Intake is mainly assessed using self-reporting, as no true quantitative nutritional biomarker is available. The measurement of plasma fatty acids, often used as an alternative, is expensive and time-consuming. As meat and fish differ in their stable isotope ratios, δ13C and δ15N have been proposed as biomarkers. However, they have never been investigated in controlled human dietary intervention studies. Objective: In a short-term feeding study, we investigated the suitability of δ13C and δ15N in blood, urine and faeces as biomarkers of meat and fish intake. Methods: The dietary intervention study (n = 14) followed a randomised cross-over design with three eight-day dietary periods (meat, fish and half-meat–half-fish). In addition, 4 participants completed a vegetarian control period. At the end of each period, 24-h urine, fasting venous blood and faeces were collected and their δ13C and δ15N analysed. Results: There was a significant difference between diets in isotope ratios in faeces and urine samples, but not in blood samples (Kruskal–Wallis test, p < 0.0001). In pairwise comparisons, δ13C and δ15N were significantly higher in urine and faecal samples following a fish diet when compared with all other diets, and significantly lower following a vegetarian diet. There was no significant difference in isotope ratio between meat and half-meat–half-fish diets for blood, urine or faecal samples. Conclusions: The results of this study show that urinary and faecal δ13C and δ15N are suitable candidate biomarkers for short-term meat and fish intake.
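A minimal sketch of the kind of test reported above, with hypothetical numbers (scipy assumed available; the study's actual data are not reproduced here):

```python
from scipy import stats

# Hypothetical urinary delta-15N values (per mil) for three dietary periods
meat = [9.1, 9.4, 9.0, 9.3]
fish = [11.2, 11.6, 11.0, 11.4]
half_meat_half_fish = [9.8, 10.1, 9.7, 10.0]

# Kruskal-Wallis test for a difference between the diet groups
h, p = stats.kruskal(meat, fish, half_meat_half_fish)
print(f"Kruskal-Wallis H = {h:.2f}, p = {p:.4f}")
```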
Abstract:
Seasonal sea-surface temperature variability for the Neoglacial (3300–2500 BP) and Roman Warm Period (RWP; 2500–1600 BP), which correspond to the Bronze and Iron Ages, respectively, was estimated using oxygen isotope ratios obtained from high-resolution samples micromilled from radiocarbon-dated, archaeological limpet (Patella vulgata) shells. The coldest winter months recorded in Neoglacial shells averaged 6.6 ± 0.3 °C, and the warmest summer months averaged 14.7 ± 0.4 °C. One Neoglacial shell captured a year without a summer, which may have resulted from a dust veil from a volcanic eruption in the Katla volcanic system in Iceland. RWP shells record average winter and summer monthly temperatures of 6.3 ± 0.1 °C and 13.3 ± 0.3 °C, respectively. These results capture a cooling transition from the Neoglacial to the RWP, which is further supported by earlier studies of pine history in Scotland, pollen type analyses in northeast Scotland, and European glacial events. The cooling transition observed at the boundary between the Neoglacial and RWP in our study also agrees with the abrupt climate deterioration at 2800–2700 BP (also referred to as the Subboreal/Subatlantic transition) and therefore may have been driven by decreased solar radiation and weakened North Atlantic Oscillation conditions.
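For context, reconstructions of this kind rest on a carbonate–water oxygen-isotope palaeotemperature equation; one commonly cited calcite calibration (Epstein et al. 1953, as revised by Shackleton 1974 — not necessarily the calibration used in this study) has the form:

```latex
% delta_c: shell calcite delta18O; delta_w: ambient seawater delta18O
T(^{\circ}\mathrm{C}) = 16.9 - 4.38\,(\delta_c - \delta_w) + 0.10\,(\delta_c - \delta_w)^{2}
```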
Abstract:
Melting of the Greenland Ice Sheet (GrIS) is accelerating and will contribute significantly to global sea level rise during the 21st century. Instrumental data on GrIS melting only cover the last few decades, and proxy data extending our knowledge into the past are vital for validating models predicting the influence of ongoing climate change. We investigated a potential meltwater proxy in Godthåbsfjord (West Greenland), where glacier meltwater causes seasonal excursions towards lower oxygen isotope water (δ18Ow) values and salinity. The blue mussel (Mytilus edulis) potentially records these variations, because it precipitates its shell calcite in oxygen isotopic equilibrium with ambient seawater. As M. edulis shells are known to occur in raised shorelines and archaeological shell middens from previous Holocene warm periods, this species may be ideal for reconstructing past meltwater dynamics. Here we investigate its potential as a palaeo-meltwater proxy. First, we confirmed that M. edulis shell calcite oxygen isotope (δ18Oc) values are in equilibrium with ambient water and generally reflect meltwater conditions. Subsequently, we investigated whether this species recorded the full range of δ18Ow values occurring during the years 2007 to 2010. Results show that δ18Ow values were not recorded at very low salinities (< ~19), because the mussels appear to cease growing. This implies that Mytilus edulis δ18Oc values are suitable for reconstructing past meltwater amounts in most cases, but care must be taken that shells are collected not too close to a glacier, but rather in the mid-region or mouth of the fjord. Future research will expand the geographical and temporal range of the shell measurements by sampling mussels in other fjords in Greenland along a south–north gradient, and by sampling shells from raised shorelines and archaeological shell middens from prehistoric settlements in Greenland.
Abstract:
Background: Expression microarrays are increasingly used to obtain large scale transcriptomic information on a wide range of biological samples. Nevertheless, there is still much debate on the best ways to process data, to design experiments and to analyse the output. Furthermore, many of the more sophisticated mathematical approaches to data analysis in the literature remain inaccessible to much of the biological research community. In this study we examine ways of extracting and analysing a large data set obtained using the Agilent long oligonucleotide transcriptomics platform, applied to a set of human macrophage and dendritic cell samples. Results: We describe and validate a series of data extraction, transformation and normalisation steps which are implemented via a new R function. Analysis of replicate normalised reference data demonstrates that intra-array variability is small (only around 2% of the mean log signal), while inter-array variability from replicate array measurements has a standard deviation (SD) of around 0.5 log2 units (6% of the mean). The common practice of working with ratios of Cy5/Cy3 signals offers little further improvement in terms of reducing error. Comparison to expression data obtained using Arabidopsis samples demonstrates that the large number of genes in each sample showing a low level of transcription reflects the real complexity of the cellular transcriptome. Multidimensional scaling is used to show that the processed data identify an underlying structure which reflects some of the key biological variables that define the data set. This structure is robust, allowing reliable comparison of samples collected over a number of years and by a variety of operators. Conclusions: This study outlines a robust and easily implemented pipeline for extracting, transforming, normalising and visualising transcriptomic array data from the Agilent expression platform. The analysis is used to obtain quantitative estimates of the SD arising from experimental (non-biological) intra- and inter-array variability, and a lower threshold for determining whether an individual gene is expressed. The study provides a reliable basis for further, more extensive studies of the systems biology of eukaryotic cells.
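A minimal numpy sketch (not the authors' R function) of how the inter-array SD of replicate log2 signals might be estimated; the matrix shape and values are hypothetical:

```python
import numpy as np

def replicate_variability(log2_signals):
    """log2_signals: genes x replicate-arrays matrix of log2 intensities.
    Returns the per-gene inter-array SD and its mean across genes."""
    sd_per_gene = log2_signals.std(axis=1, ddof=1)
    return sd_per_gene, sd_per_gene.mean()

rng = np.random.default_rng(1)
data = rng.normal(8.0, 0.5, size=(1000, 4))  # hypothetical replicate arrays
_, mean_sd = replicate_variability(data)
print(f"mean inter-array SD ~ {mean_sd:.2f} log2 units")  # about 0.5
```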
Abstract:
The objective of this paper is to apply the mis-specification (M-S) encompassing perspective to the problem of choosing between linear and log-linear unit-root models. A simple M-S encompassing test, based on an auxiliary regression stemming from the conditional second moment, is proposed and its empirical size and power are investigated using Monte Carlo simulations. It is shown that by focusing on the conditional process the sampling distributions of the relevant statistics are well behaved under both the null and alternative hypotheses. The proposed M-S encompassing test is illustrated using US total disposable income quarterly data.
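A hedged sketch of the general mechanics (the paper's exact auxiliary regression is not reproduced here): fit the linear unit-root model in levels, then regress its squared residuals on the lagged level; a significant slope signals the heteroskedasticity that a log-linear data-generating process induces in the levels model. The series and names below are hypothetical:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
y = np.exp(np.cumsum(rng.normal(0.001, 0.01, 400)))  # log random walk

# Linear unit-root model in levels: y_t = a + b*y_{t-1} + e_t
res_lin = sm.OLS(y[1:], sm.add_constant(y[:-1])).fit()

# Auxiliary regression from the conditional second moment
aux = sm.OLS(res_lin.resid ** 2, sm.add_constant(y[:-1])).fit()
print(aux.tvalues[1])  # a large |t| points toward the log-linear model
```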
Abstract:
Studies with a diverse array of 22 purified condensed tannin (CT) samples from nine plant species demonstrated that procyanidin/prodelphinidin (PC/PD) and cis/trans-flavan-3-ol ratios can be appraised by 1H-13C HSQC NMR spectroscopy. The method was developed from samples containing 44 to ~100% CT, PC/PD ratios ranging from 0/100 to 99/1, and cis/trans ratios from 58/42 to 95/5 as determined by thiolysis with benzyl mercaptan. Integration of cross-peak contours of H/C-6' signals from PC and of H/C-2',6' signals from PD yielded nuclei adjusted estimates that were highly correlated with PC/PD ratios obtained by thiolysis (R2 = 0.99). Cis/trans-flavan-3-ol ratios, obtained by integration of the respective H/C-4 cross-peak contours, were also related to determinations made by thiolysis (R2 = 0.89). Overall, 1H-13C HSQC NMR spectroscopy appears to be a viable alternative to thiolysis for estimating PC/PD and cis/trans ratios of CT, if precautions are taken to avoid integration of cross-peak contours of contaminants.
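A toy illustration of the nuclei adjustment described above, assuming the PD cross-peak (H/C-2',6') integrates two equivalent nuclei against one for the PC cross-peak (H/C-6'); the integrals are hypothetical:

```python
def pc_pd_ratio(pc_integral, pd_integral):
    """Nuclei-adjusted PC/PD ratio (as percentages) from HSQC cross-peak
    integrals; the PD integral is halved because H/C-2',6' spans two nuclei."""
    pd_adjusted = pd_integral / 2.0
    total = pc_integral + pd_adjusted
    return 100.0 * pc_integral / total, 100.0 * pd_adjusted / total

print(pc_pd_ratio(80.0, 40.0))  # -> (80.0, 20.0), i.e. PC/PD = 80/20
```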
Abstract:
It is a well-known result that for β ∈ (1, (1+√5)/2) and x ∈ (0, 1/(β−1)) there exist uncountably many sequences (ε_i)_{i=1}^∞ ∈ {0,1}^ℕ such that x = Σ_{i=1}^∞ ε_i β^{−i}. When β ∈ ((1+√5)/2, 2] there exists x ∈ (0, 1/(β−1)) for which there is a unique (ε_i)_{i=1}^∞ ∈ {0,1}^ℕ such that x = Σ_{i=1}^∞ ε_i β^{−i}. In this paper we consider the more general case when our sequences are elements of {0, …, m}^ℕ. We show that an analogue of the golden ratio exists and give an explicit formula for it.
Abstract:
The relationships between the four radiant fluxes are analyzed based on a 4-year archive of hourly and daily global ultraviolet (I(UV)), photosynthetically active (I(PAR)), near-infrared (I(NIR)) and broadband global solar (I(G)) radiation data collected at Botucatu, Brazil. These data are used to establish both the fractions of the spectral components in global solar radiation and the proposed linear regression models. Verification results indicate that the proposed regression models accurately predict the spectral radiant fluxes, at least for the Brazilian environment. Finally, the results obtained in this analysis agree well with most published results in the literature.
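An illustrative sketch of the kind of linear model proposed (all numbers hypothetical; the paper's fitted coefficients are site-specific):

```python
import numpy as np

# Hypothetical hourly irradiance observations: global and PAR components
i_g = np.array([0.5, 1.2, 2.0, 2.8, 3.1])
i_par = np.array([0.24, 0.58, 0.97, 1.36, 1.51])

slope, intercept = np.polyfit(i_g, i_par, 1)
print(f"I(PAR) ~ {slope:.3f} * I(G) + {intercept:.3f}")
# the slope estimates the PAR fraction of global solar radiation
```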
Abstract:
In this paper we prove an existence result for local and global isometric immersions of semi-Riemannian surfaces into the three-dimensional Heisenberg group endowed with a homogeneous left-invariant Lorentzian metric. As a corollary, we prove a rigidity result for such immersions.
Abstract:
The purpose of this paper is to develop a Bayesian approach for log-Birnbaum–Saunders Student-t regression models under right-censored survival data. Markov chain Monte Carlo (MCMC) methods are used to develop a Bayesian procedure for the considered model. In order to attenuate the influence of outlying observations on the parameter estimates, we present Birnbaum–Saunders models in which a Student-t distribution is assumed to explain the cumulative damage. We also discuss model selection criteria for comparing the fitted models, and we develop case-deletion influence diagnostics for the joint posterior distribution based on the Kullback–Leibler divergence. The developed procedures are illustrated with a real data set.
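A minimal sketch of the log-Birnbaum–Saunders Student-t log-density via the sinh representation (illustrative, not the paper's code; here Y = log T with location mu and shape alpha, and (2/alpha)·sinh((Y − mu)/2) is taken to follow a Student-t with nu degrees of freedom):

```python
import numpy as np
from scipy import stats

def logbs_t_logpdf(y, mu, alpha, nu):
    """Log-density of a log-Birnbaum-Saunders Student-t variable:
    the standardised variate xi follows t(nu), and the Jacobian of
    y -> xi contributes cosh((y - mu)/2) / alpha."""
    xi = (2.0 / alpha) * np.sinh((y - mu) / 2.0)
    log_jac = np.log(np.cosh((y - mu) / 2.0) / alpha)
    return stats.t.logpdf(xi, df=nu) + log_jac

print(logbs_t_logpdf(1.0, mu=1.0, alpha=0.5, nu=4.0))  # mode at y = mu
```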
Abstract:
In interval-censored survival data, the event of interest is not observed exactly but is only known to occur within some time interval. Such data appear very frequently. In this paper, we are concerned only with parametric forms, and so a location-scale regression model based on the exponentiated Weibull distribution is proposed for modeling interval-censored data. We show that the proposed log-exponentiated Weibull regression model for interval-censored data represents a parametric family of models that includes other regression models that are broadly used in lifetime data analysis. Assuming the use of interval-censored data, we employ a frequentist analysis, a jackknife estimator, a parametric bootstrap and a Bayesian analysis for the parameters of the proposed model. We derive the appropriate matrices for assessing local influences on the parameter estimates under different perturbation schemes and present some ways to assess global influences. Furthermore, various simulations are performed for different parameter settings, sample sizes and censoring percentages; in addition, the empirical distributions of some modified residuals are displayed and compared with the standard normal distribution. These studies suggest that the residual analysis usually performed in normal linear regression models can be straightforwardly extended to a modified deviance residual in log-exponentiated Weibull regression models for interval-censored data.
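A minimal sketch of an interval-censored log-likelihood under the exponentiated Weibull (scipy's exponweib parameterisation assumed; data and parameter values hypothetical): each observation known only to fall in (left_i, right_i] contributes log[F(right_i) − F(left_i)].

```python
import numpy as np
from scipy import stats

def interval_censored_loglik(params, left, right):
    """Interval-censored log-likelihood for the exponentiated Weibull:
    a is the exponentiation shape, c the Weibull shape, scale the scale."""
    a, c, scale = params
    cdf_r = stats.exponweib.cdf(right, a, c, scale=scale)
    cdf_l = stats.exponweib.cdf(left, a, c, scale=scale)
    return np.sum(np.log(cdf_r - cdf_l))

# Hypothetical inspection intervals (months)
left = np.array([2.0, 5.0, 1.0, 8.0])
right = np.array([4.0, 9.0, 3.0, 12.0])
print(interval_censored_loglik((1.5, 1.2, 6.0), left, right))
```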