956 results for Conditional CAPM


Relevance: 10.00%

Abstract:

Abnormal expansion or depletion of particular lymphocyte subsets is associated with clinical manifestations such as HIV progression to AIDS and autoimmune disease. We sought to identify genetic predictors of lymphocyte levels and reasoned that these may play a role in immune-related diseases. We tested 2.3 million variants for association with five lymphocyte subsets, measured in 2538 individuals from the general population, including CD4+ T cells, CD8+ T cells, CD56+ natural killer (NK) cells, and the derived measure CD4:CD8 ratio. We identified two regions of strong association. The first was located in the major histocompatibility complex (MHC), with multiple SNPs strongly associated with CD4:CD8 ratio (rs2524054, p = 2.1 × 10^-28). The second region was centered within a cluster of genes from the Schlafen family and was associated with NK cell levels (rs1838149, p = 6.1 × 10^-14). The MHC association with CD4:CD8 replicated convincingly (p = 1.4 × 10^-9) in an independent panel of 988 individuals. Conditional analyses indicate that there are two major independent quantitative trait loci (QTL) in the MHC region that regulate CD4:CD8 ratio: one is located in the class I cluster and influences CD8 levels, whereas the second is located in the class II cluster and regulates CD4 levels. Jointly, the two QTL explained 8% of the variance in CD4:CD8 ratio. The class I variants are also strongly associated with durable host control of HIV, and class II variants are associated with type 1 diabetes, suggesting that genetic variation at the MHC may predispose to immune-related diseases partly through dysregulation of T cell homeostasis.
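The conditional analyses mentioned above can be illustrated with a small sketch: test one variant while including the other as a covariate, and a signal that survives the adjustment is an independent QTL. A minimal sketch in Python, assuming simulated genotypes and a simulated trait (the effect sizes and the plain OLS test are illustrative, not the study's method):

```python
# Toy conditional association test: is snp2 still associated with the trait
# after adjusting for snp1? If so, the two signals are independent QTL, as
# the abstract reports for the class I and class II MHC variants.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 2538                                       # sample size quoted in the abstract
snp1 = rng.binomial(2, 0.3, n).astype(float)   # additive genotype coding 0/1/2
snp2 = rng.binomial(2, 0.4, n).astype(float)
ratio = 0.4 * snp1 + 0.3 * snp2 + rng.normal(0, 1, n)   # CD4:CD8-like trait

def assoc_p(y, test_snp, covariates):
    """Two-sided OLS t-test p-value for test_snp, adjusting for covariates."""
    X = np.column_stack([np.ones(len(y)), covariates, test_snp])
    beta, rss, *_ = np.linalg.lstsq(X, y, rcond=None)
    df = len(y) - X.shape[1]
    cov = (rss[0] / df) * np.linalg.inv(X.T @ X)
    t = beta[-1] / np.sqrt(cov[-1, -1])
    return 2 * stats.t.sf(abs(t), df)

print("marginal p(snp2):   ", assoc_p(ratio, snp2, np.empty((n, 0))))
print("conditional p(snp2):", assoc_p(ratio, snp2, snp1[:, None]))
```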

Relevance: 10.00%

Abstract:

Photovoltaic (PV) panels and electric domestic water heaters with storage (DWH) are widely used in households in many countries, yet DWH should be explored as an energy storage mechanism before batteries when households have excess PV energy. Through a residential case study in Queensland, Australia, this paper presents a new optimized design and control solution that reduces water heating costs by utilizing existing DWH energy storage capacity and increasing PV self-consumption for water heating. The solution is produced by evaluating the case study's energy profile and numerically maximizing the use of PV for the DWH. A conditional probability matrix over days with different solar insolation and hot water usage is developed to test the solution. Compared to other tariffs, this solution achieves cost reductions of 20.8% to 63.3%. This new solution could encourage solar households to move to a more economical and carbon-neutral water heating method.
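The conditional probability matrix described above can be sketched as follows: classify each day by solar insolation and by hot water usage, count the joint occurrences, and normalize the rows. A minimal sketch, assuming synthetic daily data and illustrative class boundaries (none of the values come from the case study):

```python
# P(hot-water-usage class | solar-insolation class), estimated from daily
# observations; the bins and simulated data are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
days = 365
insolation = rng.gamma(4.0, 1.5, days)        # kWh/m^2/day, synthetic
hot_water = rng.normal(160, 40, days)         # litres/day, synthetic

# Classify each day into low/medium/high bins for both variables.
ins_class = np.digitize(insolation, [4.0, 7.0])   # 0 = low, 1 = med, 2 = high
hw_class = np.digitize(hot_water, [130.0, 190.0])

# Joint counts, then normalize each row to a conditional distribution.
joint = np.zeros((3, 3))
np.add.at(joint, (ins_class, hw_class), 1)
cond = joint / joint.sum(axis=1, keepdims=True)
print(np.round(cond, 2))   # row i: usage distribution given insolation class i
```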

Relevance: 10.00%

Abstract:

Several lines of evidence have implicated the catechol-O-methyltransferase (COMT) gene as a candidate for schizophrenia (SZ) susceptibility, not only because it encodes a key dopamine catabolic enzyme but also because it maps to the velocardiofacial syndrome region of chromosome 22q11 which has long been associated with SZ predisposition. The interest in COMT as a candidate SZ risk factor has led to numerous case-control and family-based studies, with the majority placing emphasis on examining a functional Val/Met polymorphism within this enzyme. Unfortunately, these studies have continually produced conflicting results. To assess the genetic contribution of other COMT variants to SZ susceptibility, we investigated three single-nucleotide polymorphisms (SNPs) (rs737865, rs4633, rs165599) in addition to the Val/Met variant (rs4680) in a highly selected sample of Australian Caucasian families containing 107 patients with SZ. The Val/Met and rs4633 variants showed nominally significant associations with SZ (P<0.05), although neither of the individual SNPs remained significant after adjusting for multiple testing (most significant P=0.1174). However, haplotype analyses showed strong evidence of an association; the most significant being the three-marker haplotype rs737865-rs4680-rs165599 (global P=0.0022), which spans more than 26 kb. Importantly, conditional analyses indicated the presence of two separate and interacting effects within this haplotype, irrespective of gender. In addition, our results indicate the Val/Met polymorphism is not disease-causing and is simply in strong linkage disequilibrium with a causative effect, which interacts with another as yet unidentified variant approximately 20 kb away. These results may help explain the inconsistent results reported on the Val/Met polymorphism and have important implications for future investigations into the role of COMT in SZ susceptibility.
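The multiple-testing adjustment referred to above can be illustrated with a permutation-based sketch: shuffle the case/control labels, record the maximum association statistic across all SNPs, and compare the observed statistics with that null distribution. This is a generic max-statistic procedure on simulated data, not the study's exact method:

```python
# Family-wise adjusted p-values for four SNPs via label permutation.
import numpy as np

rng = np.random.default_rng(7)
n, n_snps = 500, 4                             # four SNPs, as in the study design
snps = rng.binomial(2, 0.4, (n, n_snps)).astype(float)
risk = 0.35 + 0.12 * snps[:, 0]                # SNP 0 truly raises case probability
status = (rng.random(n) < risk).astype(float)

def stat(snp, y):
    """n * r^2: a simple 1-df association statistic."""
    return n * np.corrcoef(snp, y)[0, 1] ** 2

obs = np.array([stat(snps[:, j], status) for j in range(n_snps)])
max_null = np.array([max(stat(snps[:, j], rng.permutation(status))
                         for j in range(n_snps))
                     for _ in range(2000)])
adj_p = [(max_null >= o).mean() for o in obs]  # family-wise adjusted p-values
print(np.round(adj_p, 3))                      # only SNP 0 should survive
```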

Relevance: 10.00%

Abstract:

Background: A pandemic strain of influenza A, now referred to as pandemic (H1N1) 2009, spread rapidly around the world in 2009. This study aimed to examine the spatiotemporal variation in the transmission rate of pandemic (H1N1) 2009 associated with changes in local socio-environmental conditions from May 7 to December 31, 2009, at the postal area level in Queensland, Australia.

Methods: We used data on laboratory-confirmed H1N1 cases to examine the spatiotemporal dynamics of transmission using a flexible Bayesian space-time Susceptible-Infected-Recovered (SIR) modelling approach. The model incorporated parameters describing spatiotemporal variation in H1N1 infection and local socio-environmental factors.

Results: The weekly transmission rate of pandemic (H1N1) 2009 was negatively associated with the weekly area-mean maximum temperature at a lag of 1 week (LMXT) (posterior mean: −0.341; 95% credible interval (CI): −0.370 to −0.311) and with the socio-economic index for areas (SEIFA) (posterior mean: −0.003; 95% CI: −0.004 to −0.001), and was positively associated with the product of LMXT and the weekly area-mean vapour pressure at a lag of 1 week (LVAP) (posterior mean: 0.008; 95% CI: 0.007 to 0.009). There was substantial spatiotemporal variation in the transmission rate of pandemic (H1N1) 2009 across Queensland over the epidemic period. Large random effects on estimated transmission rates were apparent in remote areas and in some postal areas with a higher proportion of Indigenous residents and smaller overall populations.

Conclusions: Local SEIFA and local atmospheric conditions were associated with the transmission rate of pandemic (H1N1) 2009. The more populated regions displayed consistent and synchronized epidemics with low average transmission rates, whereas the less populated regions had high average transmission rates with more variation during the H1N1 epidemic period.
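The transmission-rate component of such a model can be sketched with a log-linear link driven by the covariates named above. The slope coefficients below echo the posterior means quoted in the abstract, but the intercept, the covariate values, the recovery rate and the weekly SIR update are all illustrative assumptions:

```python
import numpy as np

def transmission_rate(lmxt, seifa, lvap,
                      b0=8.8, b1=-0.341, b2=-0.003, b3=0.008):
    """log(beta_t) = b0 + b1*LMXT + b2*SEIFA + b3*LMXT*LVAP.
    b1-b3 echo the abstract's posterior means; b0 is an assumed intercept."""
    return np.exp(b0 + b1 * lmxt + b2 * seifa + b3 * lmxt * lvap)

def sir_step(S, I, R, beta, gamma=0.5):
    """One weekly SIR update for a single area; gamma is an assumed recovery rate."""
    N = S + I + R
    new_inf = min(beta * S * I / N, S)
    new_rec = gamma * I
    return S - new_inf, I + new_inf - new_rec, R + new_rec

S, I, R = 9990.0, 10.0, 0.0
for week in range(10):
    beta = transmission_rate(lmxt=28.0, seifa=1000.0, lvap=20.0)
    S, I, R = sir_step(S, I, R, beta)
    print(f"week {week + 1}: infectious = {I:7.1f}")
```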

Relevance: 10.00%

Abstract:

We derive a new method for determining size-transition matrices (STMs) that eliminates probabilities of negative growth and accounts for individual variability. STMs are an important part of size-structured models, which are used in the stock assessment of aquatic species. The elements of STMs represent the probability of growth from one size class to another over a given time step. The growth increment over this time step can be modelled with a variety of methods, but when a population construct is assumed for the underlying growth model, the resulting STM may contain entries that predict negative growth. To solve this problem, we use a maximum likelihood method that incorporates individual variability in the asymptotic length, relative age at tagging, and measurement error to obtain von Bertalanffy growth model parameter estimates. The statistical moments for the future length, given an individual's previous length measurement and time at liberty, are then derived. We moment-match the true conditional distributions with skew-normal distributions and use these to accurately estimate the elements of the STMs. The method is investigated with simulated tag-recapture data and with tag-recapture data gathered from the Australian eastern king prawn (Melicertus plebejus).
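The construction of an STM row can be sketched by integrating a conditional growth distribution over the size-class bins. The sketch below uses scipy's skew-normal with its location shifted so that the mean matches the expected von Bertalanffy increment; the parameters, bin edges and coefficient of variation are illustrative assumptions, and the paper's full moment-matching (variance and skewness included) is not reproduced:

```python
import numpy as np
from scipy.stats import skewnorm

edges = np.arange(10.0, 61.0, 5.0)       # size-class boundaries (e.g. mm CL)
mids = 0.5 * (edges[:-1] + edges[1:])
Linf, k, dt = 55.0, 0.8, 0.25            # assumed von Bertalanffy parameters

def stm_row(L0, a=2.0, cv=0.10):
    """P(next size class | current length L0) via skew-normal CDF differences."""
    growth = max((Linf - L0) * (1.0 - np.exp(-k * dt)), 0.0)  # no shrinkage
    mean, scale = L0 + growth, max(cv * growth, 0.25)
    delta = a / np.sqrt(1.0 + a * a)
    loc = mean - scale * delta * np.sqrt(2.0 / np.pi)   # shift so E[L'] = mean
    cdf = skewnorm.cdf(edges, a, loc=loc, scale=scale)
    row = np.diff(cdf)
    row[0] += cdf[0]                     # fold the tails into the end classes
    row[-1] += 1.0 - cdf[-1]
    return row / row.sum()

STM = np.vstack([stm_row(m) for m in mids])
print(np.round(STM, 2))                  # each row sums to 1
```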

Relevance: 10.00%

Abstract:

VHF nighttime scintillations, recorded during a high solar activity period at a meridian chain of stations covering a magnetic latitude belt of 3°–21°N (420 km subionospheric points), are analyzed to investigate the influence of equatorial spread F irregularities on the occurrence of scintillation at latitudes away from the equator. Observations show that saturated amplitude scintillations start abruptly about one and a half hours after ground sunset and that their onset is almost simultaneous at stations whose subionospheric points lie within 12°N of the magnetic equator, but is delayed by 15 min to 4 hours at a station whose subionospheric point is at 21°N magnetic latitude. In addition, the occurrence of postsunset scintillations at all the stations is found to be conditional on their prior occurrence at the equatorial station: if no postsunset scintillation activity is seen at the equatorial station, no scintillations are seen at the other stations either. The occurrence of scintillations is explained as being caused by rising plasma bubbles and associated irregularities over the magnetic equator and the subsequent mapping of these irregularities down the magnetic field lines to the F region at higher latitudes through some instantaneous mechanism; an equatorial control is thus established over the generation of postsunset scintillation-producing irregularities in the entire low-latitude belt.

Relevance: 10.00%

Abstract:

Correlations between oil and agricultural commodities have varied over recent decades, shaped by renewable fuels policy and turbulent economic conditions. We estimate smooth transition conditional correlation models for 12 agricultural commodities and WTI crude oil. While a structural change in correlations occurred concurrently with the introduction of biofuel policy, oil and food price levels are also key influences: high correlation between biofuel feedstocks and oil is more likely to occur when food and oil price levels are high. Correlation with oil returns is strong for biofuel feedstocks but not for other agricultural futures, suggesting that contagion from energy to food markets is limited.
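The core of a smooth transition conditional correlation model is a correlation that moves between two regimes as a transition variable crosses a threshold. A minimal sketch, with an assumed price-level index as the transition variable and illustrative parameter values:

```python
import numpy as np

def stcc_correlation(s, rho_low=0.1, rho_high=0.6, gamma=0.3, c=100.0):
    """rho_t = (1 - G) * rho_low + G * rho_high, with logistic transition G(s_t)."""
    G = 1.0 / (1.0 + np.exp(-gamma * (s - c)))
    return (1.0 - G) * rho_low + G * rho_high

price_index = np.linspace(80, 120, 9)    # transition variable (illustrative)
for s, r in zip(price_index, stcc_correlation(price_index)):
    print(f"price level {s:6.1f} -> conditional correlation {r:.2f}")
```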

Relevance: 10.00%

Abstract:

Sequential firings with fixed time delays are frequently observed in simultaneous recordings from multiple neurons. Such temporal patterns are potentially indicative of underlying microcircuits and it is important to know when a repeatedly occurring pattern is statistically significant. These sequences are typically identified through correlation counts. In this paper we present a method for assessing the significance of such correlations. We specify the null hypothesis in terms of a bound on the conditional probabilities that characterize the influence of one neuron on another. This method of testing significance is more general than the currently available methods since under our null hypothesis we do not assume that the spiking processes of different neurons are independent. The structure of our null hypothesis also allows us to rank order the detected patterns. We demonstrate our method on simulated spike trains.
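The test can be sketched as follows: count delayed coincidences and note that, under the null hypothesis P(B fires at t + d | A fired at t) ≤ p0, the count is stochastically dominated by a binomial, so a one-sided binomial tail bounds the p-value. The spike trains, delay and bound below are all illustrative:

```python
import numpy as np
from scipy.stats import binom

rng = np.random.default_rng(2)
T, d, p0 = 10_000, 5, 0.05               # time bins, fixed delay, null bound
a = rng.random(T) < 0.02                 # neuron A spike train
b = rng.random(T) < 0.02                 # neuron B background spikes
b[d:] |= a[:-d] & (rng.random(T - d) < 0.15)   # inject an A -> B influence

n_a = int(a[:-d].sum())                  # A spikes with room for the lag
count = int((a[:-d] & b[d:]).sum())      # delayed coincidences observed
p_value = binom.sf(count - 1, n_a, p0)   # P(X >= count) under Binomial(n_a, p0)
print(f"{count}/{n_a} delayed coincidences, p <= {p_value:.2e}")
```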

Relevance: 10.00%

Abstract:

Whether a statistician wants to complement a probability model for observed data with a prior distribution and carry out fully probabilistic inference, or to base the inference only on the likelihood function, may be a fundamental question in theory, but in practice it may well be of less importance if the likelihood contains much more information than the prior. Maximum likelihood inference can be justified as a Gaussian approximation at the posterior mode, using flat priors. However, in situations where the parametric assumptions of standard statistical models would be too rigid, more flexible model formulation, combined with fully probabilistic inference, can be achieved using hierarchical Bayesian parametrization.

This work includes five articles, all of which apply probability modeling to problems involving incomplete observation. Three of the papers apply maximum likelihood estimation and two of them hierarchical Bayesian modeling. Because maximum likelihood can be presented as a special case of Bayesian inference, but not the other way around, the introductory part of this work presents a framework for probability-based inference using only Bayesian concepts. Some results presented in the original articles are also re-derived using the toolbox provided herein, to show that they are justifiable under this more general framework. The assumption of exchangeability and de Finetti's representation theorem are applied repeatedly to justify the use of standard parametric probability models with conditionally independent likelihood contributions, and it is argued that the same reasoning also applies under sampling from a finite population.

The main emphasis is on probability-based inference under incomplete observation due to study design, illustrated using a generic two-phase cohort sampling design as an example. The alternative approaches presented for the analysis of such a design are full likelihood, which utilizes all observed information, and conditional likelihood, which is restricted to a completely observed set, conditioning on the rule that generated that set. Conditional likelihood inference is also applied to a joint analysis of prevalence and incidence data, a situation subject to both left censoring and left truncation. Other topics covered are model uncertainty and causal inference using posterior predictive distributions. Finally, a non-parametric monotonic regression model for one or more covariates is formulated, together with a Bayesian estimation procedure, and applied in the context of optimal sequential treatment regimes, demonstrating that inference based on posterior predictive distributions is feasible in this case as well.
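The remark that maximum likelihood is a special case of Bayesian inference can be made concrete with a toy binomial example: under a flat prior, the posterior mode coincides with the MLE. A minimal numeric check (the data are arbitrary):

```python
from scipy.optimize import minimize_scalar
from scipy.stats import binom

k, n = 7, 20                     # 7 successes in 20 trials (arbitrary data)
mle = k / n                      # closed-form binomial MLE

# Flat prior Beta(1, 1) gives posterior Beta(k + 1, n - k + 1),
# whose mode is (k + 1 - 1) / (n + 2 - 2) = k / n: exactly the MLE.
posterior_mode = (k + 1 - 1) / (n + 2 - 2)

# The same value found numerically by maximizing the log-likelihood.
numeric = minimize_scalar(lambda p: -binom.logpmf(k, n, p),
                          bounds=(1e-6, 1 - 1e-6), method="bounded").x
print(mle, posterior_mode, round(numeric, 4))   # all three agree at 0.35
```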

Relevance: 10.00%

Abstract:

Frictions are factors that hinder the trading of securities in financial markets. Typical frictions include limited market depth, transaction costs, lack of infinite divisibility of securities, and taxes. Conventional models used in mathematical finance often gloss over these issues, which affect almost all financial markets, by arguing that the impact of frictions is negligible and that, consequently, frictionless models are valid approximations. This dissertation consists of three research papers related to the study of the validity of such approximations in two distinct modeling problems.

Models of price dynamics based on diffusion processes, i.e., continuous strong Markov processes, are widely used in the frictionless scenario. The first paper establishes that diffusion models can indeed be understood as approximations of price dynamics in markets with frictions. This is achieved by introducing an agent-based model of a financial market where finitely many agents trade a financial security, the price of which evolves according to the price impacts generated by trades. It is shown that if the number of agents is large, then under certain assumptions the price process of the security, which is a pure-jump process, can be approximated by a one-dimensional diffusion process. In a slightly extended model, in which agents may exhibit herd behavior, the approximating diffusion model turns out to be a stochastic volatility model. Finally, it is shown that when the agents' tendency to herd is strong, logarithmic returns in the approximating stochastic volatility model are heavy-tailed.

The remaining papers concern no-arbitrage criteria and superhedging in continuous-time option pricing models under small-transaction-cost asymptotics. Guasoni, Rásonyi, and Schachermayer have recently shown that, in such a setting, a financial security admits no arbitrage opportunities and there exist no feasible superhedging strategies for European call and put options written on it, as long as its price process is continuous and has the so-called conditional full support (CFS) property. Motivated by this result, the two papers establish CFS for certain stochastic integrals and for a subclass of Brownian semistationary processes. As a consequence, a wide range of possibly non-Markovian local and stochastic volatility models are shown to have the CFS property.
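The first paper's diffusion-approximation theme can be illustrated with a toy simulation: each agent trades as a Poisson process and moves the log-price by a per-trade impact that shrinks as the market grows, so the pure-jump path accumulates ever more, ever smaller moves. The rates and scaling below are illustrative assumptions, not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(3)

def jump_price_moves(n_agents, T=1.0, rate=50.0, steps=1000):
    """Per-step moves of a pure-jump log-price: Poisson trade counts with
    +/- impact of size 1/sqrt(n_agents), keeping total variance O(1)."""
    dt = T / steps
    impact = 1.0 / np.sqrt(n_agents)
    n_jumps = rng.poisson(n_agents * rate * dt, steps)
    return np.array([impact * (2 * rng.binomial(k, 0.5) - k) for k in n_jumps])

for n in (10, 100, 10_000):
    moves = jump_price_moves(n)
    print(f"{n:6d} agents: terminal log-price {moves.sum():+7.2f}, "
          f"largest single move {np.abs(moves).max():.4f}")
```

As the number of agents grows, the largest single move shrinks while the terminal value stays of the same order, which is the qualitative content of the diffusion limit.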

Relevance: 10.00%

Abstract:

This thesis addresses the modeling of financial time series, especially stock market returns and daily price ranges. Data of this kind can be approached with so-called multiplicative error models (MEM). These models nest several well-known time series models such as GARCH, ACD and CARR models, and they are able to capture many well-established features of financial time series, including volatility clustering and leptokurtosis. In contrast to these phenomena, different kinds of asymmetries have received relatively little attention in the existing literature. In this thesis asymmetries arise from various sources: they are observed in both conditional and unconditional distributions, for variables with non-negative values and for variables taking values on the real line. In the multivariate context, asymmetries can be observed in the marginal distributions as well as in the relationships between the variables modeled. New methods for all these cases are proposed.

Chapter 2 considers GARCH models and the modeling of returns of two stock market indices. The chapter introduces the so-called generalized hyperbolic (GH) GARCH model to account for asymmetries in both the conditional and the unconditional distribution. In particular, two special cases of the GARCH-GH model which describe the data most accurately are proposed. They are found to improve the fit of the model when compared to symmetric GARCH models. The advantages of accounting for asymmetries are also observed through Value-at-Risk applications.

Both theoretical and empirical contributions are provided in Chapter 3. This chapter introduces the so-called mixture conditional autoregressive range (MCARR) model, which is examined and applied to daily price ranges of the Hang Seng Index. The conditions for the strict and weak stationarity of the model, as well as an expression for the autocorrelation function, are obtained by writing the MCARR model as a first-order autoregressive process with random coefficients. The chapter also introduces the inverse gamma (IG) distribution to CARR models. The advantages of the CARR-IG and MCARR-IG specifications over conventional CARR models are found in the empirical application both in- and out-of-sample.

Chapter 4 discusses the simultaneous modeling of absolute returns and daily price ranges. In this part of the thesis a vector multiplicative error model (VMEM) with an asymmetric Gumbel copula is found to provide substantial benefits over existing VMEM models based on elliptical copulas. The proposed specification is able to capture the highly asymmetric dependence of the modeled variables, thereby improving the performance of the model considerably. The economic significance of the results is established by examining the information content of the derived volatility forecasts.
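The MEM family that nests GARCH, ACD and CARR can be sketched in a few lines: a non-negative variable (here standing in for a daily price range) is the product of a conditional mean and a unit-mean positive innovation. The parameter values and the exponential innovation are illustrative; the thesis's GH, inverse-gamma and mixture variants are not reproduced:

```python
import numpy as np

rng = np.random.default_rng(4)

def simulate_mem(T=500, omega=0.05, alpha=0.20, beta=0.75):
    """x_t = mu_t * eps_t with mu_t = omega + alpha * x_{t-1} + beta * mu_{t-1}."""
    x = np.empty(T)
    mu = omega / (1.0 - alpha - beta)    # start at the unconditional mean
    for t in range(T):
        eps = rng.exponential(1.0)       # unit-mean non-negative innovation
        x[t] = mu * eps
        mu = omega + alpha * x[t] + beta * mu
    return x

ranges = simulate_mem()
print(f"mean {ranges.mean():.2f}, max {ranges.max():.2f}")  # clustered, positive
```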

Relevance: 10.00%

Abstract:

Stochastic filtering is, in general, the estimation of indirectly observed states given observed data; in the setting of a probability space, this amounts to computing conditional expectations given the observations, which provide the most accurate estimates in the mean-square sense. This thesis presents the theory of filtering for two different kinds of observation process: a diffusion process, discussed in the first chapter, and a counting process, introduced in the third chapter. Most of the fundamental results of stochastic filtering are stated in the form of equations, such as the unnormalized Zakai equation, which leads to the Kushner-Stratonovich equation. The latter, also known as the normalized Zakai equation or the Fujisaki-Kallianpur-Kunita (FKK) equation, shows how the estimates differ between diffusion and counting observation processes. An example is also given for the linear Gaussian case, which is the setting underlying the so-called Kalman-Bucy filter. Since the unnormalized and normalized Zakai equations are expressed in terms of the conditional distribution, a density for this distribution is developed from these equations and stated in Kushner's theorem. Kushner's theorem, however, takes the form of a stochastic partial differential equation whose solution must be verified for existence and uniqueness; this is covered in the second chapter.
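The linear Gaussian case mentioned above has a simple discrete-time counterpart: the Kalman filter, whose predict/update recursion computes the conditional mean and variance of the hidden state. The model coefficients and noise levels below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(5)
a, c = 0.95, 1.0                 # state transition and observation coefficients
q, r = 0.1, 0.5                  # process and observation noise variances

# Simulate a hidden AR(1) state and noisy observations of it.
T = 200
x = np.zeros(T)
y = np.zeros(T)
for t in range(1, T):
    x[t] = a * x[t - 1] + rng.normal(0.0, np.sqrt(q))
    y[t] = c * x[t] + rng.normal(0.0, np.sqrt(r))

# Kalman filter: m_t = E[x_t | y_1..y_t], p_t = Var[x_t | y_1..y_t].
m, p, sq_err = 0.0, 1.0, 0.0
for t in range(1, T):
    m_pred, p_pred = a * m, a * a * p + q            # predict
    K = p_pred * c / (c * c * p_pred + r)            # Kalman gain
    m = m_pred + K * (y[t] - c * m_pred)             # update with the innovation
    p = (1.0 - K * c) * p_pred
    sq_err += (m - x[t]) ** 2
print(f"filter RMSE {np.sqrt(sq_err / (T - 1)):.3f} vs "
      f"observation noise sd {np.sqrt(r):.3f}")
```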

Relevance: 10.00%

Abstract:

The aim of this thesis is to analyse the key ecumenical dialogues between Methodists and Lutherans from the perspective of Arminian soteriology and Methodist theology in general. The primary research question is defined as: "To what extent do the dialogues under analysis relate to Arminian soteriology?" By seeking an answer to this question, new knowledge is sought on the current soteriological position of the Methodist-Lutheran dialogues, on contemporary Methodist theology and on the commonalities between the Lutheran and Arminian understandings of soteriology. In this way the soteriological picture of the Methodist-Lutheran discussions is clarified.

The dialogues under analysis were selected on the basis of their diversity. Firstly, the sole world-organisation-level dialogue was chosen: The Church – Community of Grace. Additionally, the document World Methodist Council and the Joint Declaration on the Doctrine of Justification is analysed as a supporting document. Secondly, a document concerning the discussions between two main-line churches in the United States of America was selected: Confessing Our Faith Together. Thirdly, two dialogues between non-main-line Methodist churches and main-line Lutheran national churches in Europe were chosen: Fellowship of Grace from Norway and Kristuksesta osalliset from Finland.

The theoretical approach of this thesis is systematic analysis. The Remonstrant articles of Arminian soteriology are utilised as an analysis tool to examine the soteriological positions of the dialogues. New knowledge is sought by analysing the stances of the dialogues concerning the doctrines of partial depravity, conditional election, universal atonement, resistible grace and the conditional perseverance of the saints. In this way information is also provided for approaching the Calvinist-Arminian controversy from new perspectives.

The results of this thesis show that the current soteriological position of the Methodist-Lutheran dialogues is closer to Arminianism than to Calvinism. The dialogues relate to Arminian soteriology especially concerning the doctrines of universal atonement, resistible grace and the conditional perseverance of the saints. The commonalities between the Lutheran and Arminian understandings of soteriology exist mainly in these three doctrines, as they are uniformly favoured in the dialogues. The most discussed area of soteriology is human depravity, in which the largest diversity of stances also occurs; divine election, on the other hand, is the least discussed topic. The overall perspective provided by the results of the analysis indicates that the Lutherans, together with the Methodists, could approach the Calvinist churches with a wider theological perspective and understanding when soteriological issues are considered central. Human depravity is identified as the area of soteriology that requires the most work in future ecumenical dialogues. However, the detected Lutheran hybrid notion of depravity (a Calvinist-Arminian mixture) appears to provide a useful new perspective for Calvinist-Arminian ecumenism and offers potentially fruitful considerations for future ecumenical dialogues.

Relevance: 10.00%

Abstract:

This paper describes a detailed study of the structure of turbulence in boundary layers along mildly curved convex and concave surfaces. The surface curvature studied corresponds to δ/Rw = ±0.01, δ being the boundary-layer thickness and Rw the radius of curvature of the wall, taken as positive for convex and negative for concave curvature. Measurements of the turbulent energy balance, autocorrelations, auto- and cross-power spectra, amplitude probability distributions and conditional correlations are reported. It is observed that even mild curvature has very strong effects on the various aspects of the turbulent structure. For example, convex curvature suppresses the diffusion of turbulent energy away from the wall, drastically reduces the integral time scales and shifts the spectral distributions of turbulent energy and Reynolds shear stress towards high wavenumbers. Exactly opposite effects, though generally of a smaller magnitude, are produced by concave wall curvature. It is also found that curvature of either sign affects the v fluctuations more strongly than the u fluctuations, and that curvature effects are more significant in the outer region of the boundary layer than in the region close to the wall. The data on the conditional correlations are used to study in detail the mechanism of turbulent transport in curved boundary layers.

Relevance: 10.00%

Abstract:

Volatility is central to options pricing and risk management. It reflects the uncertainty of investors and the inherent instability of the economy. Time series methods are among the most widely applied scientific methods for analyzing and predicting volatility. Very frequently sampled data contain much valuable information about the different elements of volatility and may ultimately reveal the reasons for time-varying volatility. The use of such ultra-high-frequency data is common to all three essays of this dissertation, which belongs to the field of financial econometrics.

The first essay uses wavelet methods to study the time-varying behavior of scaling laws and long memory in the five-minute volatility series of Nokia on the Helsinki Stock Exchange around the burst of the IT bubble. The essay is motivated by earlier findings which suggest that different scaling laws may apply at intraday time scales and at larger time scales, implying that the so-called annualized volatility depends on the data sampling frequency. The empirical results confirm the appearance of time-varying long memory and of different scaling laws that, to a significant extent, can be attributed to investor irrationality and to an intraday volatility periodicity called the New York effect. The findings have potentially important consequences for options pricing and risk management, which commonly assume constant memory and scaling.

The second essay investigates the modelling of durations between trades in stock markets. Durations convey information about investor intentions and provide an alternative view of volatility. Generalizations of standard autoregressive conditional duration (ACD) models are developed to meet needs observed in previous applications of the standard models. According to the empirical results, based on data for actively traded stocks on the New York Stock Exchange and the Helsinki Stock Exchange, the proposed generalization clearly outperforms the standard models and also performs well in comparison with another recently proposed alternative. The distribution used to derive the generalization may also prove valuable in other areas of risk management.

The third essay studies empirically the effect of decimalization on volatility and market microstructure noise. Decimalization refers to the change from fractional to decimal pricing, carried out on the New York Stock Exchange in January 2001. The methods used here are more accurate than those of earlier studies and put more weight on market microstructure. The main result is that decimalization decreased observed volatility by reducing noise variance, especially for highly active stocks. The results help in risk management and in the design of market mechanisms.
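The scaling-law estimation in the first essay can be sketched as follows: aggregate returns to several time scales and regress the log mean absolute return on the log scale; for i.i.d. Gaussian returns the slope is close to 0.5, and deviations on real data signal long memory or time-varying scaling. The synthetic data below stand in for the five-minute series:

```python
import numpy as np

rng = np.random.default_rng(6)
r = rng.normal(0.0, 0.001, 2**16)        # synthetic five-minute returns

scales = [1, 2, 4, 8, 16, 32, 64]
mean_abs = []
for s in scales:
    agg = r[: len(r) // s * s].reshape(-1, s).sum(axis=1)   # s-period returns
    mean_abs.append(np.abs(agg).mean())

slope, _ = np.polyfit(np.log(scales), np.log(mean_abs), 1)
print(f"estimated scaling exponent: {slope:.3f} (Brownian benchmark: 0.5)")
```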