859 results for conditional expected utility
Abstract:
A better understanding of stock price changes is important in guiding many economic activities. Since prices rarely change without reason, the search for related explanatory variables has attracted many researchers. This book seeks answers in prices themselves by relating price changes to their conditional moments, based on the belief that prices are the product of a complex psychological and economic process and that their conditional moments ultimately derive from these psychological and economic shocks. Using information about conditional moments is therefore an attractive alternative to using other selected financial variables to explain price changes. The first paper examines the relation between the conditional mean and the conditional variance using information about moments in three types of conditional distributions; it finds that the significance of the estimated mean-variance ratio can be affected by the assumed distribution and by time variation in skewness. The second paper decomposes conditional industry volatility into a concurrent market component and an industry-specific component; it finds that market volatility is on average responsible for a rather small share of total industry volatility: 6 to 9 percent in the UK and 2 to 3 percent in Germany. The third paper looks at the heteroskedasticity in stock returns through an ARCH process supplemented with a set of conditioning information variables; it finds that stock returns exhibit several forms of heteroskedasticity, including deterministic changes in variance due to seasonal factors, random adjustments in variance due to market and macro factors, and ARCH effects driven by past information. The fourth paper examines the role of higher moments, especially skewness and kurtosis, in determining expected returns; it finds that total skewness and total kurtosis are more relevant non-beta risk measures and that they are costly to diversify, either because their desirable components may be eliminated or because diversification strategies based on them are unsustainable.
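The mean-variance relation studied in the first paper is commonly cast as a GARCH-in-mean model; the schematic specification below (notation chosen here for illustration, not quoted from the book) shows how the conditional mean is linked to the conditional variance and where the assumed conditional distribution, and hence its skewness, enters.

```latex
% Schematic GARCH(1,1)-in-mean specification (illustrative notation):
% \lambda is the mean-variance ratio whose estimated significance depends on the
% assumed conditional distribution D and on whether its skewness varies over time.
r_t = \mu + \lambda \sigma_t^2 + \varepsilon_t, \qquad
\varepsilon_t \mid \mathcal{F}_{t-1} \sim D\!\left(0, \sigma_t^2\right), \qquad
\sigma_t^2 = \omega + \alpha \varepsilon_{t-1}^2 + \beta \sigma_{t-1}^2 .
```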
Abstract:
The complete internal transcribed spacer 1 (ITS1), 5.8S ribosomal DNA, and ITS2 region of the ribosomal DNA from 60 specimens belonging to two closely related bucephalid digeneans (Dollfustrema vaneyi and Dollfustrema hefeiensis) from different localities, hosts, and microhabitat sites were cloned to examine the level of sequence variation and the taxonomic levels at which these markers are useful for species identification and phylogeny estimation. Our data show that these molecular markers can help to discriminate the two species, which are morphologically very close and difficult to separate by classical methods. We found 21 haplotypes defined by 44 polymorphic positions in 38 individuals of D. vaneyi, and 16 haplotypes defined by 43 polymorphic positions in 22 individuals of D. hefeiensis. There are no haplotypes shared between the two species. Haplotype diversity, but not nucleotide diversity, is similar between the two species. Phylogenetic analyses reveal two robustly supported clades, one corresponding to D. vaneyi and the other to D. hefeiensis. However, the population structures of the two species seem to be incongruent and show no geographic or host-specific pattern, further indicating that the two species may have had a more complex evolutionary history than expected.
Abstract:
This paper considers forecasting the conditional mean and variance from a single-equation dynamic model with autocorrelated disturbances following an ARMA process, and innovations with time-dependent conditional heteroskedasticity as represented by a linear GARCH process. Expressions for the minimum MSE predictor and the conditional MSE are presented. We also derive formulas for all the theoretical moments of the prediction error distribution from a general dynamic model with GARCH(1, 1) innovations. These results are then used in the construction of ex ante prediction confidence intervals by means of the Cornish-Fisher asymptotic expansion. An empirical example relating to the uncertainty of the expected depreciation of foreign exchange rates illustrates the usefulness of the results. © 1992.
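For context, the multi-step forecast of a GARCH(1,1) conditional variance, which underlies prediction intervals of this kind, has a standard recursive form (a textbook statement, not taken from the paper):

```latex
% h-step-ahead conditional variance forecast for
% \sigma_t^2 = \omega + \alpha\varepsilon_{t-1}^2 + \beta\sigma_{t-1}^2,
% assuming covariance stationarity (\alpha + \beta < 1); \sigma_{t+1}^2 is known at time t.
\mathrm{E}_t\!\left[\sigma_{t+h}^2\right]
  = \bar{\sigma}^2 + (\alpha + \beta)^{h-1}\!\left(\sigma_{t+1}^2 - \bar{\sigma}^2\right),
\qquad
\bar{\sigma}^2 = \frac{\omega}{1 - \alpha - \beta}.
```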
Abstract:
This paper proposes a discrete mixture model which assigns individuals, up to a probability, to either a class of random utility (RU) maximizers or a class of random regret (RR) minimizers, on the basis of their sequence of observed choices. Our proposed model advances the state of the art of RU-RR mixture models by (i) adding and simultaneously estimating a membership model which predicts the probability of belonging to a RU or RR class; (ii) adding a layer of random taste heterogeneity within each behavioural class; and (iii) deriving a welfare measure associated with the RU-RR mixture model and consistent with referendum voting, which is the appropriate mechanism of provision for such local public goods. The context of our empirical application is a stated choice experiment concerning traffic calming schemes. We find that the random parameter RU-RR mixture model not only outperforms its fixed-coefficient counterpart in terms of fit, as expected, but also in terms of the plausibility of the determinants of behavioural class membership. In line with psychological theories of regret, we find that, compared to respondents who are familiar with the choice context (i.e. the traffic calming scheme), unfamiliar respondents are more likely to be regret minimizers than utility maximizers. © 2014 Elsevier Ltd.
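The backbone of such a latent-class model can be summarized compactly; the schematic likelihood below (with notation introduced here, not the authors') shows the class-membership probability weighting the class-specific choice probabilities, each integrated over its own random taste heterogeneity.

```latex
% Schematic likelihood for individual n in a two-class RU-RR mixture (illustrative notation):
% \pi_n is the RU-membership probability from the membership model, y_{nt} the observed
% choice in task t, and f, g the mixing densities of the random tastes within each class.
L_n = \pi_n \int \prod_{t=1}^{T_n} P^{\mathrm{RU}}_{nt}\!\left(y_{nt} \mid \beta\right) f(\beta)\, d\beta
      \;+\; \left(1 - \pi_n\right) \int \prod_{t=1}^{T_n} P^{\mathrm{RR}}_{nt}\!\left(y_{nt} \mid \gamma\right) g(\gamma)\, d\gamma .
```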
Abstract:
Research into efficient, reliable and environmentally friendly power supply solutions is generating growing interest in the “Smart Grid” approach to developing electricity networks and managing increasing energy consumption. One of the novel approaches is the LVDC microgrid. The purpose of this research is to analyze the possibilities for implementing LVDC microgrids in public distribution networks in Russia. The research contains an analysis of the modern Russian electric power industry, electricity market, electricity distribution business, and the regulatory framework and standardization related to the implementation of the LVDC microgrid concept. To estimate economic feasibility, a theoretical case study comparing low-voltage AC and medium-voltage AC solutions with an LVDC microgrid for a small settlement in Russia is presented. The results of the market and regulatory framework analysis, along with the economic comparison of AC and DC solutions, show that implementation of the LVDC microgrid concept in Russia is possible and can be economically feasible. From the electric power industry and regulatory framework point of view, there are no serious obstacles to LVDC microgrids in Russian distribution networks. However, the most suitable use cases at the moment are expected to be found in the electrification of remote settlements that are isolated from the Unified Energy System of Russia.
Abstract:
We discuss the utility of single nucleotide polymorphism loci for full-trio and mother-unavailable paternity testing cases in the presence of population substructure and relatedness between the putative and actual fathers. We focus primarily on the expected number of loci required to achieve specified mismatch probabilities, and report the expected proportion of paternity indices greater than three threshold values for these loci. (c) 2004 Elsevier Ireland Ltd. All rights reserved.
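For reference, the paternity index evaluated against these thresholds has the usual likelihood-ratio form; the standard textbook definition (not specific to this paper, and prior to any substructure or relatedness correction) is:

```latex
% Single-locus paternity index and its combination across loci assuming independence;
% population substructure and relatedness of the putative and actual fathers alter the
% denominator probabilities and hence the number of loci needed.
PI_\ell = \frac{P\!\left(\text{child's genotype at locus } \ell \mid \text{mother, alleged father}\right)}
               {P\!\left(\text{child's genotype at locus } \ell \mid \text{mother, random man}\right)},
\qquad
PI = \prod_{\ell} PI_\ell .
```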
Abstract:
The evaluation of investment fund performance has been one of the main developments of modern portfolio theory. Most studies employ the technique developed by Jensen (1968), which compares a particular fund's returns to a benchmark portfolio of equal risk. However, the standard measures of fund manager performance are known to suffer from a number of problems in practice. In particular, previous studies implicitly assume that the risk level of the portfolio is stationary through the evaluation period; that is, unconditional measures of performance do not account for the fact that risk and expected returns may vary with the state of the economy. Therefore, many of the problems encountered in previous performance studies reflect the inability of traditional measures to handle the dynamic behaviour of returns. As a consequence, Ferson and Schadt (1996) suggest an approach called conditional performance evaluation, which is designed to address this problem. This paper applies such a conditional measure of performance to a sample of 27 UK property funds over the period 1987-1998. The results suggest that once the time-varying nature of the funds' betas is corrected for, by the addition of the market indicators, average fund performance shows an improvement over that reported by traditional methods of analysis.
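The conditional measure used here is usually written as a regression in which the fund's beta moves linearly with lagged public information; the sketch below follows the general form of Ferson and Schadt (1996), with symbols chosen for illustration rather than quoted from this paper.

```latex
% Schematic conditional performance regression: r_{p,t} and r_{b,t} are fund and benchmark
% excess returns, z_{t-1} a vector of demeaned lagged information (market indicator) variables,
% so the fund's beta is \beta_0 + B'z_{t-1} and \alpha_p is the conditional performance measure.
r_{p,t} = \alpha_p + \beta_0\, r_{b,t} + B'\!\left(z_{t-1} \otimes r_{b,t}\right) + \varepsilon_{p,t}.
```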
Abstract:
Techniques are proposed for evaluating forecast probabilities of events. The tools are especially useful when, as in the case of the Survey of Professional Forecasters (SPF) probability distributions of expected inflation, the method by which the forecasts were constructed cannot be drawn upon in their evaluation. The tests of efficiency and conditional efficiency are applied to the forecast probabilities of events of interest derived from the SPF distributions, and supplement a whole-density evaluation of the SPF distributions based on the probability integral transform approach.
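The whole-density evaluation mentioned here rests on the probability integral transform; its standard definition (not taken from the paper) is:

```latex
% Probability integral transform (PIT) of the realized outcome y_t under the forecast
% density p_t (with CDF P_t): if the forecast densities are correctly specified,
% the z_t are i.i.d. uniform on (0, 1), which is what the whole-density evaluation tests.
z_t = \int_{-\infty}^{y_t} p_t(u)\, du = P_t(y_t).
```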
Abstract:
In a natural experiment, this paper studies the impact of an informal sanctioning mechanism on individuals’ voluntary contributions to a public good. Cross-country skiers’ actual cash contributions in two ski resorts, one with and one without an informal sanctioning system, are used. I find the contributing share to be higher in the informal sanctioning system (79 percent) than in the non-sanctioning system (36 percent). Previous studies of one-shot public good situations have found an increasing conditional contribution (CC) function, i.e. an increasing relationship between the expected average contribution of other group members and the individual’s own contribution. In contrast, the present results suggest that the CC function in the non-sanctioning system is non-increasing at high perceived levels of others’ contributions. This relationship deserves further testing in the lab.
Abstract:
Portal hypertension (PH) is a common complication and a leading cause of death in patients with chronic liver diseases. PH is underlain by structural and functional derangement of the liver sinusoidal vessels and their fenestrated endothelium. Because in most clinical settings PH is accompanied by parenchymal injury, it has been difficult to determine the precise role of microvascular perturbations in causing PH. Reasoning that Vascular Endothelial Growth Factor (VEGF) is required to maintain the functional integrity of the hepatic microcirculation, we developed a transgenic mouse system for liver-specific, reversible VEGF inhibition. The system is based on conditional induction and de-induction of a VEGF decoy receptor that sequesters VEGF and precludes signaling. VEGF blockade results in closure of sinusoidal endothelial cell (SEC) fenestrations and in accumulation and transformation of the normally quiescent hepatic stellate cells, i.e. it provokes the two processes underlying sinusoidal capillarization. Importantly, sinusoidal capillarization was sufficient to cause PH and its typical sequelae (ascites, splenomegaly and venous collateralization) without inflicting parenchymal damage or fibrosis. Remarkably, these dramatic phenotypes were fully reversed within a few days of lifting the VEGF blockade and the resultant re-opening of SEC fenestrations. This study not only uncovers an indispensable role for VEGF in maintaining the structure and function of mature SECs, but also highlights the vasculo-centric nature of PH pathogenesis. The unprecedented ability to rescue PH and its secondary manifestations by manipulating a single vascular factor may also be harnessed for examining the potential utility of de-capillarization treatment modalities.
Abstract:
Several studies have shown that HER-2/neu (erbB-2) blocking therapy strategies can cause tumor remission. However, the responsible molecular mechanisms are not yet known. Both ERK1/2 and Akt/PKB are critical for HER-2-mediated signal transduction. Therefore, we used a mouse tumor model that allows downregulation of HER-2 in tumor tissue by administration of anhydrotetracycline (ATc). Switching off HER-2 caused a rapid tumor remission of more than 95% within 7 days of ATc administration, compared to the volume before switching off HER-2. Interestingly, HER-2 downregulation caused a dephosphorylation of p-ERK1/2 by more than 80% even before tumor remission occurred. Levels of total ERK protein were not influenced. In contrast, dephosphorylation of p-Akt occurred later, when the tumor was already in remission. These data suggest that in our HER-2 tumor model dephosphorylation of p-ERK1/2 may be more critical for tumor remission than dephosphorylation of p-Akt. To test this hypothesis we used a second mouse tumor model that allows ATc-controlled expression of BXB-Raf1, because the latter constitutively signals to ERK1/2 but cannot activate Akt/PKB. As expected, downregulation of BXB-Raf1 in tumor tissue caused a strong dephosphorylation of p-ERK1/2 but did not decrease levels of p-Akt. Interestingly, tumor remission after switching off BXB-Raf1 was as efficient as that after HER-2 downregulation, despite the lack of p-Akt dephosphorylation. In conclusion, two lines of evidence strongly suggest that dephosphorylation of p-ERK1/2, and not that of p-Akt, is critical for the rapid tumor remission after downregulation of HER-2 or BXB-Raf1 in our tumor model: (i) dephosphorylation of p-ERK1/2, but not that of p-Akt, precedes tumor remission after switching off HER-2, and (ii) downregulation of BXB-Raf1 leads to a tumor remission as efficient as that caused by downregulation of HER-2, although no p-Akt dephosphorylation was observed after switching off BXB-Raf1.
Abstract:
Mouse cell lines were immortalized by introduction of specific immortalizing genes. Embryonic and adult animals and an embryonal stem cell line were used as sources of primary cells. The immortalizing genes were introduced either by DNA transfection or by ecotropic retrovirus transduction. Fibroblasts were obtained by expression of the SV40 virus large T antigen (TAg). The properties of the resulting fibroblast cell lines were reproducible and independent of the donor mouse strains employed; the cells showed no transformed properties in vitro and did not form tumors in vivo. Endothelial cell lines were generated by Polyoma virus middle T antigen expression in primary embryonal cells. These cell lines consistently expressed the relevant endothelial cell surface markers. Since the expression of the immortalizing genes was expected to strongly influence the cellular characteristics, fibroblastoid cells were reversibly immortalized using a vector that allows conditional expression of the TAg. Under inducing conditions, these cells exhibited properties highly similar to those of constitutively immortalized cells. In the absence of TAg expression, cell proliferation stops; cell growth resumes when TAg expression is restored. Gene expression profiling indicates that TAg influences the expression levels of more than 1000 genes involved in diverse cellular processes. The data show that conditionally immortalized cell lines have several advantageous properties over constitutively immortalized cells.
Abstract:
A key energy-saving adaptation to chronic hypoxia that enables cardiomyocytes to withstand severe ischemic insults is hibernation, i.e., a reversible arrest of contractile function. Although hibernating cardiomyocytes represent a critical reserve of dysfunctional cells that can potentially be rescued, the lack of a suitable animal model has hampered insight into this medically important condition. We developed a transgenic mouse system for conditional induction of long-term hibernation and a system to rescue hibernating cardiomyocytes at will. Via myocardium-specific induction (and, in turn, deinduction) of a VEGF-sequestering soluble receptor, we show that VEGF is indispensable for adjusting the coronary vasculature to match increased oxygen consumption, and we exploit this finding to generate a hypoperfused heart. Importantly, the ensuing ischemia is tunable to a level at which large cohorts of cardiomyocytes are driven to enter a hibernation mode, without cardiac cell death. Relieving the VEGF blockade even months later resulted in rapid revascularization and full recovery of contractile function. Furthermore, we show that the left ventricular remodeling associated with hibernation is also fully reversible. The unique opportunity to uncouple hibernation from other ischemic heart phenotypes (e.g., infarction) was used to determine the genetic program of hibernation, uncovering hypoxia-inducible factor target genes associated with metabolic adjustments and induced expression of several cardioprotective genes. Autophagy, specifically self-digestion of mitochondria, was identified as a key prosurvival mechanism in hibernating cardiomyocytes. This system may lend itself to examining the potential utility of treatments to rescue dysfunctional cardiomyocytes and reverse maladaptive remodeling.
Abstract:
Multi-objective optimization algorithms aim at finding Pareto-optimal solutions. Recovering Pareto fronts or Pareto sets from a limited number of function evaluations is a challenging problem. A popular approach in the case of expensive-to-evaluate functions is to appeal to metamodels. Kriging has been shown to be an efficient basis for sequential multi-objective optimization, notably through infill sampling criteria balancing exploitation and exploration, such as the Expected Hypervolume Improvement. Here we consider Kriging metamodels not only for selecting new points, but as a tool for estimating the whole Pareto front and quantifying how much uncertainty remains on it at any stage of Kriging-based multi-objective optimization algorithms. Our approach relies on the Gaussian process interpretation of Kriging and builds upon conditional simulations. Using concepts from random set theory, we propose to adapt the Vorob’ev expectation and deviation to capture the variability of the set of non-dominated points. Numerical experiments illustrate the potential of the proposed workflow, and it is shown on examples how Gaussian process simulations and the estimated Vorob’ev deviation can be used to monitor the ability of Kriging-based multi-objective optimization algorithms to accurately learn the Pareto front.
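A minimal sketch of this idea, assuming independent scikit-learn Gaussian-process metamodels on a toy bi-objective minimization problem (all functions, grids and settings below are illustrative assumptions, not the authors' implementation): conditional simulations of the objectives yield an empirical coverage function of the attained (dominated) region, from which a discretized Vorob'ev expectation and deviation are computed.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

# Toy bi-objective minimization problem on [0, 1] (illustrative only).
def f1(x):
    return np.sin(3.0 * x) + x

def f2(x):
    return np.cos(3.0 * x) - 0.5 * x

rng = np.random.default_rng(0)
X_train = rng.uniform(0.0, 1.0, size=(12, 1))
y1, y2 = f1(X_train).ravel(), f2(X_train).ravel()

# One Kriging (Gaussian process) metamodel per objective.
gp1 = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(X_train, y1)
gp2 = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(X_train, y2)

# Conditional simulations: posterior draws of both objectives over a fine design grid.
X_grid = np.linspace(0.0, 1.0, 200).reshape(-1, 1)
n_sim = 100
sims1 = gp1.sample_y(X_grid, n_samples=n_sim, random_state=1)  # shape (200, n_sim)
sims2 = gp2.sample_y(X_grid, n_samples=n_sim, random_state=2)

# Reference grid in objective space on which the attained (dominated) region is tracked.
g1, g2 = np.meshgrid(np.linspace(-1.5, 2.5, 80), np.linspace(-2.0, 1.5, 80))
ref = np.column_stack([g1.ravel(), g2.ravel()])

# For each simulation, flag the reference points dominated by some simulated objective vector.
dominated = np.zeros((n_sim, len(ref)), dtype=bool)
for k in range(n_sim):
    pts = np.column_stack([sims1[:, k], sims2[:, k]])  # simulated objective vectors
    dominated[k] = np.any(
        (pts[:, None, 0] <= ref[None, :, 0]) & (pts[:, None, 1] <= ref[None, :, 1]), axis=0
    )

coverage = dominated.mean(axis=0)  # empirical coverage function of the random attained set
mean_measure = dominated.mean()    # mean (normalized) measure of the attained set

# Vorob'ev expectation: coverage excursion set whose measure best matches the mean measure.
betas = np.linspace(0.0, 1.0, 201)
areas = np.array([(coverage >= b).mean() for b in betas])
beta_star = betas[np.argmin(np.abs(areas - mean_measure))]
vorobev_set = coverage >= beta_star

# Vorob'ev deviation: expected measure of the symmetric difference with the Vorob'ev expectation.
vorobev_dev = np.mean([(dominated[k] ^ vorobev_set).mean() for k in range(n_sim)])
print(f"beta* = {beta_star:.2f}, normalized Vorob'ev deviation = {vorobev_dev:.3f}")
```

Tracking the Vorob'ev deviation across optimization iterations gives the kind of uncertainty monitor described in the abstract: it shrinks as the simulated fronts concentrate around the true Pareto front.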