867 results for Conditional expected utility
Abstract:
Doctorate in Mathematics
Abstract:
Master's dissertation, Universidade de Brasília, Universidade Federal da Paraíba, Universidade Federal do Rio Grande do Norte, Multi-institutional and Inter-regional Graduate Program in Accounting Sciences, 2016.
Abstract:
Executive Summary: The unifying theme of this thesis is the pursuit of a satisfactory way to quantify the risk-reward trade-off in financial economics, first in the context of a general asset pricing model, then across models, and finally across country borders. The guiding principle in that pursuit was to seek innovative solutions by combining ideas from different fields in economics and broader scientific research. For example, in the first part of this thesis we sought a fruitful application of strong existence results in utility theory to topics in asset pricing. In the second part we implement an idea from the field of fuzzy set theory in the optimal portfolio selection problem, while the third part of this thesis is, to the best of our knowledge, the first empirical application of some general results on asset pricing in incomplete markets to the important topic of measuring financial integration. While the first two parts of this thesis effectively combine well-known ways to quantify risk-reward trade-offs, the third can be viewed as an empirical verification of the usefulness of the so-called "good deal bounds" theory in designing risk-sensitive pricing bounds.

Chapter 1 develops a discrete-time asset pricing model based on a novel ordinally equivalent representation of recursive utility. To the best of our knowledge, we are the first to use a member of a novel class of recursive utility generators to construct a representative-agent model that addresses some long-standing issues in asset pricing. Applying strong representation results allows us to show that the model features countercyclical risk premia, for both consumption and financial risk, together with a low and procyclical risk-free rate. Since the recursive utility used nests the well-known time- and state-separable utility as a special case, all results nest the corresponding ones from the standard model and thus shed light on its well-known shortcomings. The empirical investigation undertaken to support these theoretical results, however, showed that as long as one resorts to econometric methods based on approximating conditional moments with unconditional ones, it is not possible to distinguish the model we propose from the standard one.

Chapter 2 is joint work with Sergei Sontchik. There we provide theoretical and empirical motivation for the aggregation of performance measures. The main idea is that, just as it makes sense to apply several performance measures ex post, it also makes sense to base optimal portfolio selection on ex-ante maximization of as many performance measures as desired. We thus offer a concrete algorithm for optimal portfolio selection via ex-ante optimization, over different horizons, of several risk-return trade-offs simultaneously. An empirical application of that algorithm, using seven popular performance measures, suggests that the realized returns have better distributional characteristics than the realized returns of portfolio strategies that are optimal with respect to a single performance measure. When comparing the distributions of realized returns we used two partial risk-reward orderings: first- and second-order stochastic dominance.
We first used the Kolmogorov-Smirnov test to determine whether the two distributions are indeed different, which, combined with a visual inspection, allowed us to demonstrate that the way we propose to aggregate performance measures leads to realized portfolio returns that first-order stochastically dominate those resulting from optimization with respect to only, for example, the Treynor ratio or Jensen's alpha. We checked for second-order stochastic dominance via pointwise comparison of the so-called absolute Lorenz curves, i.e. the sequence of expected shortfalls over a range of quantiles. Since the plot of the absolute Lorenz curve for the aggregated performance measures lay above the one corresponding to each individual measure, we were led to conclude that the algorithm we propose leads to a portfolio return distribution that second-order stochastically dominates those of virtually all performance measures considered.

Chapter 3 proposes a measure of financial integration based on recent advances in asset pricing in incomplete markets. Given a base market (a set of traded assets) and an index of another market, we propose to measure financial integration through time by the size of the spread between the pricing bounds of the market index relative to the base market: the bigger the spread around country index A, viewed from market B, the less integrated markets A and B are. We investigate the presence of structural breaks in the size of the spread for EMU member-country indices before and after the introduction of the Euro. We find evidence that both the level and the volatility of our financial integration measure increased after the introduction of the Euro. That counterintuitive result suggests an inherent weakness in the attempt to measure financial integration independently of economic fundamentals. Nevertheless, the results on the bounds on the risk-free rate appear plausible from the viewpoint of existing economic theory about the impact of integration on interest rates.
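As an illustration of the dominance checks described above (not the thesis's actual code or data), the following Python sketch compares two simulated return series: a two-sample Kolmogorov-Smirnov test and an empirical-CDF comparison for first-order dominance, and a pointwise comparison of absolute Lorenz curves (the quantile function integrated up to each level p, i.e. p times the expected shortfall at level p) for second-order dominance. All sample sizes and return parameters are invented placeholders.

```python
import numpy as np
from scipy import stats

def first_order_dominates(x, y, tol=1e-12):
    """x first-order stochastically dominates y if the empirical CDF of x
    lies at or below that of y everywhere (larger returns are more likely)."""
    grid = np.union1d(x, y)
    cdf_x = np.searchsorted(np.sort(x), grid, side="right") / x.size
    cdf_y = np.searchsorted(np.sort(y), grid, side="right") / y.size
    return bool(np.all(cdf_x <= cdf_y + tol))

def absolute_lorenz(x, probs):
    """Absolute Lorenz curve L(p) = integral_0^p F^{-1}(u) du,
    approximated by partial sums of the sorted sample."""
    xs = np.sort(x)
    cum = np.concatenate(([0.0], np.cumsum(xs))) / xs.size
    return cum[np.floor(probs * xs.size).astype(int)]

def second_order_dominates(x, y, n_levels=99, tol=1e-12):
    """x second-order stochastically dominates y if its absolute Lorenz curve
    lies at or above that of y at every quantile level considered."""
    probs = np.linspace(0.01, 1.0, n_levels)
    return bool(np.all(absolute_lorenz(x, probs) >= absolute_lorenz(y, probs) - tol))

# Toy usage with simulated daily returns (placeholders, not the thesis data)
rng = np.random.default_rng(0)
r_aggregated = rng.normal(0.0006, 0.010, 2500)  # hypothetical aggregated-measure portfolio
r_single = rng.normal(0.0004, 0.012, 2500)      # hypothetical single-measure portfolio
print(stats.ks_2samp(r_aggregated, r_single))   # are the two distributions different?
print(first_order_dominates(r_aggregated, r_single),
      second_order_dominates(r_aggregated, r_single))
```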
Abstract:
We consider the problem of estimating the mean hospital cost of stays of a class of patients (e.g., a diagnosis-related group) as a function of patient characteristics. The statistical analysis is complicated by the asymmetry of the cost distribution, the possibility of censoring on the cost variable, and the occurrence of outliers. These problems have often been treated separately in the literature, and a method offering a joint solution to all of them is still missing. Indirect procedures have been proposed, combining an estimate of the duration distribution with an estimate of the conditional cost for a given duration. We propose a parametric version of this approach, allowing for asymmetry and censoring in the cost distribution and providing a mean cost estimator that is robust in the presence of extreme values. In addition, the new method takes covariate information into account.
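As a rough illustration of the indirect approach, and not the authors' fitted model, the sketch below combines a hypothetical Weibull duration model with a hypothetical log-normal model of cost given duration and a covariate, and integrates the conditional mean cost over the duration distribution by Monte Carlo. All parameter values are invented for illustration; censoring and robust estimation are omitted.

```python
import numpy as np

# Illustrative parametric forms (assumptions, not the paper's fitted model):
#   duration  T | x ~ Weibull(shape k, scale lambda(x))
#   log cost  C | T, x ~ Normal(b0 + b1*log(T) + b2*x, s^2)   (log-normal cost)
def mean_cost(x, k=1.3, lam0=5.0, b=(6.0, 0.8, 0.15), s=0.4,
              n_draws=100_000, seed=1):
    """Indirect estimator of E[C | x]: average the conditional mean cost
    E[C | T, x] over draws from the (parametric) duration distribution."""
    rng = np.random.default_rng(seed)
    lam = lam0 * np.exp(0.1 * x)                   # duration scale depends on the covariate
    t = lam * rng.weibull(k, n_draws)              # Monte Carlo draws of stay duration
    log_mean = b[0] + b[1] * np.log(t) + b[2] * x  # conditional location of log cost
    return np.mean(np.exp(log_mean + 0.5 * s**2))  # log-normal conditional mean, averaged over T

print(mean_cost(x=1.0))
```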
Abstract:
Electrical Impedance Tomography (EIT) is an imaging method which enables a volume conductivity map of a subject to be produced from multiple impedance measurements. It has the potential to become a portable non-invasive imaging technique of particular use in imaging brain function. Accurate numerical forward models may be used to improve image reconstruction but, until now, have employed an assumption of isotropic tissue conductivity. This may be expected to introduce inaccuracy, as body tissues, especially those such as white matter and the skull in head imaging, are highly anisotropic. The purpose of this study was, for the first time, to develop a method for incorporating anisotropy in a forward numerical model for EIT of the head and to assess the resulting improvement in image quality for linear reconstruction in one example of the human head. A realistic Finite Element Model (FEM) of an adult human head with segments for the scalp, skull, CSF, and brain was produced from a structural MRI. Anisotropy of the brain was estimated from a diffusion tensor-MRI of the same subject and anisotropy of the skull was approximated from the structural information. A method for incorporating anisotropy in the forward model and its use in image reconstruction was produced. The improvement in reconstructed image quality was assessed in computer simulation by producing forward data and then performing linear reconstruction using a sensitivity matrix approach. The mean boundary data difference between anisotropic and isotropic forward models for a reference conductivity was 50%. Use of the correct anisotropic FEM in image reconstruction, as opposed to an isotropic one, corrected an error of 24 mm in imaging a 10% conductivity decrease located in the hippocampus, improved localisation of conductivity changes deep in the brain and due to epilepsy by 4-17 mm, and, overall, led to a substantial improvement in image quality. This suggests that incorporation of anisotropy in numerical models used for image reconstruction is likely to improve EIT image quality.
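The sensitivity-matrix linear reconstruction step mentioned above can be sketched as a one-step Tikhonov-regularized least-squares solve. The Jacobian below is a random stand-in rather than one computed from the anisotropic FEM, and the matrix sizes, element index, and regularization value are placeholders.

```python
import numpy as np

def linear_eit_reconstruction(J, dv, lam=1.0):
    """One-step linearized EIT reconstruction with a sensitivity (Jacobian) matrix J:
    solve the Tikhonov-regularized problem d_sigma = J^T (J J^T + lam*I)^{-1} d_v,
    mapping a boundary-voltage change d_v to a conductivity change d_sigma."""
    m = J.shape[0]                                      # number of boundary measurements
    return J.T @ np.linalg.solve(J @ J.T + lam * np.eye(m), dv)

# Synthetic example: 208 measurements, 5000 FEM elements (placeholder sizes)
rng = np.random.default_rng(0)
J = rng.normal(size=(208, 5000))                        # stand-in for an FEM-derived Jacobian
true_dsigma = np.zeros(5000)
true_dsigma[1234] = -0.1                                # a local 10% conductivity decrease
dv = J @ true_dsigma + 1e-3 * rng.normal(size=208)      # simulated noisy boundary data
dsigma_hat = linear_eit_reconstruction(J, dv)
print(int(np.argmin(dsigma_hat)))                       # index of the strongest reconstructed decrease
```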
Abstract:
The research towards efficient, reliable and environmentally friendly power supply solutions is generating growing interest in the "Smart Grid" approach to developing electricity networks and managing increasing energy consumption. One such novel approach is the LVDC microgrid. The purpose of this research is to analyze the possibilities for implementing LVDC microgrids in public distribution networks in Russia. The research contains an analysis of the modern Russian electric power industry, electricity market, electricity distribution business, and the regulatory framework and standardization related to the implementation of the LVDC microgrid concept. To estimate economic feasibility, a theoretical case study comparing low-voltage AC and medium-voltage AC with LVDC microgrid solutions for a small settlement in Russia is presented. The results of the market and regulatory framework analysis, along with the economic comparison of AC and DC solutions, show that implementation of the LVDC microgrid concept in Russia is possible and can be economically feasible. From the electric power industry and regulatory framework point of view, there are no serious obstacles to LVDC microgrids in Russian distribution networks. However, the most suitable use cases at the moment are expected to be found in the electrification of remote settlements that are isolated from the Unified Energy System of Russia.
Abstract:
We discuss the utility of single nucleotide polymorphism loci for full trio and mother-unavailable paternity testing cases, in the presence of population substructure and relatedness of putative and actual fathers. We focus primarily on the expected number of loci required to gain specified probabilities of mismatches, and report the expected proportion of paternity indices greater than three threshold values for these loci.
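For intuition only, and ignoring the population substructure and relatedness corrections that are the paper's actual focus, the number of independent loci needed to produce at least one mismatch with a target probability can be sketched as follows; the per-locus exclusion probability used is an invented illustrative value.

```python
import math

def loci_needed(per_locus_exclusion, target_prob):
    """Number of independent loci needed so that a random non-father mismatches
    at one or more loci with probability >= target_prob, assuming a common
    per-locus exclusion probability and no substructure or relatedness."""
    return math.ceil(math.log(1.0 - target_prob) / math.log(1.0 - per_locus_exclusion))

# Individual SNPs are weak exclusion markers, so many loci are required:
print(loci_needed(per_locus_exclusion=0.03, target_prob=0.999))  # 227 under these assumptions
```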
Abstract:
The evaluation of investment fund performance has been one of the main developments of modern portfolio theory. Most studies employ the technique developed by Jensen (1968), which compares a particular fund's returns to a benchmark portfolio of equal risk. However, the standard measures of fund manager performance are known to suffer from a number of problems in practice. In particular, previous studies implicitly assume that the risk level of the portfolio is stationary through the evaluation period; that is, unconditional measures of performance do not account for the fact that risk and expected returns may vary with the state of the economy. Therefore, many of the problems encountered in previous performance studies reflect the inability of traditional measures to handle the dynamic behaviour of returns. As a consequence, Ferson and Schadt (1996) suggest an approach called conditional performance evaluation, which is designed to address this problem. This paper applies such a conditional measure of performance to a sample of 27 UK property funds over the period 1987-1998. The results suggest that once the time-varying nature of the funds' betas is corrected for, by the addition of the market indicators, average fund performance shows an improvement over that obtained with traditional methods of analysis.
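A minimal sketch of a Ferson and Schadt (1996)-style conditional regression, in which the fund's beta varies linearly with lagged, demeaned public information variables, is shown below. The data, instruments, and coefficient values are simulated placeholders, not the 27 UK property funds studied in the paper.

```python
import numpy as np

def conditional_alpha(excess_fund, excess_market, lagged_info):
    """Conditional performance regression (sketch):
        r_p - r_f = alpha + b0*(r_m - r_f) + sum_j b_j * z_{j,t-1}*(r_m - r_f) + e,
    where z_{t-1} are demeaned, lagged information variables.
    Returns the conditional alpha and the full coefficient vector."""
    z = lagged_info - lagged_info.mean(axis=0)            # demean the instruments
    X = np.column_stack([np.ones_like(excess_market),
                         excess_market,
                         z * excess_market[:, None]])      # time-varying beta terms
    coef, *_ = np.linalg.lstsq(X, excess_fund, rcond=None)
    return coef[0], coef

# Toy monthly data (invented, for illustration only)
rng = np.random.default_rng(0)
T = 144
z = rng.normal(size=(T, 2))                 # e.g. lagged dividend yield and short rate
rm = 0.005 + 0.04 * rng.normal(size=T)      # excess market return
beta_t = 0.8 + 0.3 * z[:, 0]                # beta varies with the state of the economy
rp = 0.001 + beta_t * rm + 0.02 * rng.normal(size=T)
alpha, coef = conditional_alpha(rp, rm, z)
print(round(alpha, 4))
```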
Abstract:
Techniques are proposed for evaluating forecast probabilities of events. The tools are especially useful when, as in the case of the Survey of Professional Forecasters (SPF) expected probability distributions of inflation, recourse cannot be made to the method of construction in the evaluation of the forecasts. The tests of efficiency and conditional efficiency are applied to the forecast probabilities of events of interest derived from the SPF distributions, and supplement a whole-density evaluation of the SPF distributions based on the probability integral transform approach.
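The probability integral transform (PIT) step of such a whole-density evaluation can be sketched as follows: evaluate each forecast CDF at the realized outcome and test the resulting values for uniformity. The Gaussian forecast distributions and outcomes below are invented for illustration and are not SPF data.

```python
import numpy as np
from scipy import stats

def pit_values(realizations, forecast_cdfs):
    """Probability integral transform: evaluate each forecast CDF at the realized
    outcome. If the density forecasts are correct, the PIT values should be
    i.i.d. uniform on [0, 1]."""
    return np.array([cdf(y) for y, cdf in zip(realizations, forecast_cdfs)])

# Toy example: Gaussian density forecasts of inflation (illustrative numbers only)
rng = np.random.default_rng(0)
means = rng.normal(2.0, 0.5, 60)                    # forecast means
cdfs = [stats.norm(m, 1.0).cdf for m in means]      # forecast CDFs with stated sd = 1.0
outcomes = means + 0.6 * rng.normal(size=60)        # true sd 0.6: forecasts are too dispersed
u = pit_values(outcomes, cdfs)
print(stats.kstest(u, "uniform"))                   # test uniformity of the PIT values
```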
Abstract:
In a natural experiment, this paper studies the impact of an informal sanctioning mechanism on individuals' voluntary contributions to a public good. Cross-country skiers' actual cash contributions in two ski resorts, one with and one without an informal sanctioning system, are used. I find the contributing share to be higher under the informal sanctioning system (79 percent) than under the non-sanctioning system (36 percent). Previous studies of one-shot public good situations have found an increasing conditional contribution (CC) function, i.e. the relationship between the expected average contributions of other group members and the individual's own contribution. In contrast, the present results suggest that the CC function in the non-sanctioning system is non-increasing at high perceived levels of others' contributions. This relationship deserves further testing in the lab.
Abstract:
Portal hypertension (PH) is a common complication and a leading cause of death in patients with chronic liver diseases. PH is underlain by structural and functional derangement of the liver sinusoidal vessels and their fenestrated endothelium. Because in most clinical settings PH is accompanied by parenchymal injury, it has been difficult to determine the precise role of microvascular perturbations in causing PH. Reasoning that Vascular Endothelial Growth Factor (VEGF) is required to maintain the functional integrity of the hepatic microcirculation, we developed a transgenic mouse system for liver-specific, reversible VEGF inhibition. The system is based on conditional induction and de-induction of a VEGF decoy receptor that sequesters VEGF and precludes signaling. VEGF blockade results in closure of sinusoidal endothelial cell (SEC) fenestrations and in accumulation and transformation of the normally quiescent hepatic stellate cells, i.e. it provokes the two processes underlying sinusoidal capillarization. Importantly, sinusoidal capillarization was sufficient to cause PH and its typical sequelae, ascites, splenomegaly and venous collateralization, without inflicting parenchymal damage or fibrosis. Remarkably, these dramatic phenotypes were fully reversed within a few days of lifting the VEGF blockade and the resultant re-opening of SEC fenestrations. This study not only uncovered an indispensable role for VEGF in maintaining the structure and function of mature SECs, but also highlights the vasculo-centric nature of PH pathogenesis. The unprecedented ability to rescue PH and its secondary manifestations by manipulating a single vascular factor may also be harnessed for examining the potential utility of de-capillarization treatment modalities.
Abstract:
Several studies have shown that HER-2/neu (erbB-2) blocking therapy strategies can cause tumor remission. However, the responsible molecular mechanisms are not yet known. Both ERK1/2 and Akt/PKB are critical for HER-2-mediated signal transduction. Therefore, we used a mouse tumor model that allows downregulation of HER-2 in tumor tissue by administration of anhydrotetracycline (ATc). Switching off HER-2 caused a rapid tumor remission of more than 95% within 7 d of ATc administration, compared to the volume before switching off HER-2. Interestingly, HER-2 downregulation caused dephosphorylation of p-ERK1/2 by more than 80% already before tumor remission occurred. Levels of total ERK protein were not influenced. In contrast, dephosphorylation of p-Akt occurred later, when the tumor was already in remission. These data suggest that in our HER-2 tumor model dephosphorylation of p-ERK1/2 may be more critical for tumor remission than dephosphorylation of p-Akt. To test this hypothesis we used a second mouse tumor model that allows ATc-controlled expression of BXB-Raf1, because the latter constitutively signals to ERK1/2 but cannot activate Akt/PKB. As expected, downregulation of BXB-Raf1 in tumor tissue caused a strong dephosphorylation of p-ERK1/2, but did not decrease levels of p-Akt. Interestingly, tumor remission after switching off BXB-Raf1 was as efficient as that after HER-2 downregulation, despite the lack of p-Akt dephosphorylation. In conclusion, two lines of evidence strongly suggest that dephosphorylation of p-ERK1/2, and not that of p-Akt, is critical for the rapid tumor remission after downregulation of HER-2 or BXB-Raf1 in our tumor model: (i) dephosphorylation of p-ERK1/2, but not that of p-Akt, precedes tumor remission after switching off HER-2; and (ii) downregulation of BXB-Raf1 leads to a tumor remission as efficient as that caused by downregulation of HER-2, although no p-Akt dephosphorylation was observed after switching off BXB-Raf1.
Abstract:
Mouse cell lines were immortalized by the introduction of specific immortalizing genes. Embryonic and adult animals and an embryonal stem cell line were used as sources of primary cells. The immortalizing genes were introduced either by DNA transfection or by ecotropic retrovirus transduction. Fibroblasts were obtained by expression of the SV40 virus large T antigen (TAg). The properties of the resulting fibroblast cell lines were reproducible and independent of the donor mouse strains employed, and the cells showed no transformed properties in vitro and did not form tumors in vivo. Endothelial cell lines were generated by Polyoma virus middle T antigen expression in primary embryonal cells. These cell lines consistently expressed relevant endothelial cell surface markers. Since the expression of the immortalizing genes was expected to strongly influence the cellular characteristics, fibroblastoid cells were reversibly immortalized using a vector that allows conditional expression of the TAg. Under inducing conditions, these cells exhibited properties highly similar to those of constitutively immortalized cells. In the absence of TAg expression, cell proliferation stops; cell growth resumes when TAg expression is restored. Gene expression profiling indicates that TAg influences the expression levels of more than 1000 genes that are involved in diverse cellular processes. The data show that conditionally immortalized cell lines have several advantageous properties over constitutively immortalized cells.
Abstract:
A key energy-saving adaptation to chronic hypoxia that enables cardiomyocytes to withstand severe ischemic insults is hibernation, i.e., a reversible arrest of contractile function. Whereas hibernating cardiomyocytes represent the critical reserve of dysfunctional cells that can be potentially rescued, a lack of a suitable animal model has hampered insights into this medically important condition. We developed a transgenic mouse system for conditional induction of long-term hibernation and a system to rescue hibernating cardiomyocytes at will. Via myocardium-specific induction (and, in turn, deinduction) of a VEGF-sequestering soluble receptor, we show that VEGF is indispensable for adjusting the coronary vasculature to match increased oxygen consumption, and we exploit this finding to generate a hypoperfused heart. Importantly, the ensuing ischemia is tunable to a level at which large cohorts of cardiomyocytes are driven to enter a hibernation mode, without cardiac cell death. Relieving the VEGF blockade even months later resulted in rapid revascularization and full recovery of contractile function. Furthermore, we show that left ventricular remodeling associated with hibernation is also fully reversible. The unique opportunity to uncouple hibernation from other ischemic heart phenotypes (e.g., infarction) was used to determine the genetic program of hibernation, uncovering hypoxia-inducible factor target genes associated with metabolic adjustments and induced expression of several cardioprotective genes. Autophagy, specifically self-digestion of mitochondria, was identified as a key prosurvival mechanism in hibernating cardiomyocytes. This system may lend itself to examining the potential utility of treatments to rescue dysfunctional cardiomyocytes and reverse maladaptive remodeling.
Abstract:
Multi-objective optimization algorithms aim at finding Pareto-optimal solutions. Recovering Pareto fronts or Pareto sets from a limited number of function evaluations is a challenging problem. A popular approach in the case of expensive-to-evaluate functions is to appeal to metamodels. Kriging has been shown to be efficient as a basis for sequential multi-objective optimization, notably through infill sampling criteria that balance exploitation and exploration, such as the Expected Hypervolume Improvement. Here we consider Kriging metamodels not only for selecting new points, but as a tool for estimating the whole Pareto front and quantifying how much uncertainty remains on it at any stage of a Kriging-based multi-objective optimization algorithm. Our approach relies on the Gaussian process interpretation of Kriging and builds upon conditional simulations. Using concepts from random set theory, we propose to adapt the Vorob'ev expectation and deviation to capture the variability of the set of non-dominated points. Numerical experiments illustrate the potential of the proposed workflow, and it is shown on examples how Gaussian process simulations and the estimated Vorob'ev deviation can be used to monitor the ability of Kriging-based multi-objective optimization algorithms to accurately learn the Pareto front.
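A simplified, grid-based sketch of the Vorob'ev construction follows. From a handful of hypothetical simulated Pareto fronts (standing in for conditional Gaussian process simulations), it computes the attainment probability of the dominated region, picks the Vorob'ev threshold whose excursion set best matches the expected attained volume, and reports the Vorob'ev deviation as the expected volume of the symmetric difference. The fronts, grid resolution, and objective ranges are invented placeholders.

```python
import numpy as np

def attained(front, grid):
    """Indicator that each grid point is (weakly) dominated by some point of a
    simulated non-dominated front, assuming minimization of both objectives."""
    return np.any(np.all(grid[:, None, :] >= front[None, :, :], axis=2), axis=1)

def vorobev(fronts, grid, cell_volume):
    """Vorob'ev expectation and deviation of the attained (dominated) region,
    estimated on a fixed grid from simulated Pareto fronts."""
    A = np.array([attained(f, grid) for f in fronts])      # simulations x grid points
    p = A.mean(axis=0)                                     # attainment probability
    mean_vol = A.sum(axis=1).mean() * cell_volume          # expected attained volume
    # Vorob'ev threshold: excursion set whose volume best matches the expected volume
    betas = np.unique(p)
    vols = np.array([(p >= b).sum() * cell_volume for b in betas])
    beta_star = betas[np.argmin(np.abs(vols - mean_vol))]
    Q = p >= beta_star                                     # Vorob'ev expectation (grid mask)
    deviation = np.mean([(Q ^ a).sum() * cell_volume for a in A])  # expected symmetric difference
    return Q, beta_star, deviation

# Toy example: three hypothetical simulated fronts in [0, 1]^2
fronts = [np.array([[0.10, 0.80], [0.40, 0.40], [0.80, 0.10]]),
          np.array([[0.20, 0.70], [0.50, 0.30], [0.90, 0.05]]),
          np.array([[0.15, 0.90], [0.45, 0.35], [0.85, 0.12]])]
g = np.linspace(0.0, 1.0, 101)
grid = np.array([[x, y] for x in g for y in g])
Q, beta_star, dev = vorobev(fronts, grid, cell_volume=0.01 ** 2)
print(beta_star, round(dev, 4))
```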