867 results for Conditional expected utility
Abstract:
A Work Project, presented as part of the requirements for the award of a Master's degree in Finance from the NOVA – School of Business and Economics
Abstract:
Dissertation submitted for the degree of Master in Informatics Engineering
Abstract:
Emigration has long been a salient phenomenon in Portugal. Owing to the effects of the economic crisis and the Memorandum of Understanding policies, we have witnessed a significant yearly outflow of people searching for better conditions. This study measures the factors affecting this flow, as well as how the probability of emigrating evolved over the years 2006 to 2012. I model the decision to emigrate as discrete-choice random-utility maximization and use a conditional logit framework to model the choice probabilities for 31 OECD destination countries. Moreover, I compute the compensating variation required to adjust the 2012 choice probability back to its 2007 value, keeping all other variables constant, and I replicate this exercise using the unemployment rate instead of income. The most likely destination country is Luxembourg throughout the years analyzed, and the compensating variation obtained is approximately €1,700 in terms of income per capita and -11% in terms of the unemployment-rate adjustment.
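The conditional logit choice probabilities described above take a softmax form in the destinations' systematic utilities. A minimal sketch, with hypothetical utility values and only three destinations (not the thesis's estimates):

```python
import math

def conditional_logit_probs(utilities):
    """Conditional logit: P_j = exp(V_j) / sum_k exp(V_k)."""
    m = max(utilities.values())  # subtract the max for numerical stability
    exp_v = {j: math.exp(v - m) for j, v in utilities.items()}
    total = sum(exp_v.values())
    return {j: e / total for j, e in exp_v.items()}

# Hypothetical systematic utilities, not estimated values
v = {"Luxembourg": 1.2, "Germany": 0.8, "France": 0.5}
p = conditional_logit_probs(v)
```

The compensating variation described in the abstract can then be found numerically as the income adjustment that, fed back through the utility index, returns a destination's 2012 choice probability to its 2007 level.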
Abstract:
INTRODUCTION: Operational classification of leprosy based on the number of skin lesions was conceived to screen patients presenting severe forms of the disease so that they receive a more intensive multidrug regimen without having to undergo lymph smear testing. We evaluated the concordance between operational classification and bacilloscopy in defining multibacillary and paucibacillary leprosy. METHODS: We selected 1,213 records of individuals with untreated leprosy (new cases) admitted to a dermatology clinic in Recife, Brazil, from 2000 to 2005, who underwent bacteriological examination at diagnosis to confirm the operational classification. RESULTS: Compared to bacilloscopy, operational classification demonstrated 88.6% sensitivity, 76.9% specificity, a positive predictive value of 61.8%, and a negative predictive value of 94.1%, with 80% accuracy and a moderate kappa index. Among the bacilloscopy-negative cases, 23% had more than 5 skin lesions. Additionally, 11% of the bacilloscopy-positive cases had up to 5 lesions, which would have led to multibacillary cases being treated as paucibacillary leprosy had the operational classification not been confirmed by bacilloscopy. CONCLUSIONS: Operational classification has limitations that are most evident in borderline cases, suggesting that in these cases lymph smear testing is advisable to select true multibacillary cases for more intensive treatment, thereby helping to minimize resistant-strain selection and possible relapse.
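The screening metrics reported above all follow from a 2x2 table of operational classification against bacilloscopy. A sketch with toy counts (illustrative only, not the study's data), including the Cohen's kappa used for the concordance analysis:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV, NPV, accuracy and Cohen's kappa
    from a 2x2 table (test result vs. reference standard)."""
    n = tp + fp + fn + tn
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    accuracy = (tp + tn) / n
    # chance agreement for kappa: sum of products of marginal proportions
    p_chance = ((tp + fn) * (tp + fp) + (fp + tn) * (fn + tn)) / n ** 2
    kappa = (accuracy - p_chance) / (1 - p_chance)
    return sensitivity, specificity, ppv, npv, accuracy, kappa

# Toy counts, not the 1,213-record dataset
sens, spec, ppv, npv, acc, kappa = diagnostic_metrics(tp=90, fp=20, fn=10, tn=80)
```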
Abstract:
Economics is a social science which, therefore, focuses on people and on the decisions they make, be it in an individual context or in group situations. It studies human choices, in the face of needs to be fulfilled and a limited amount of resources to fulfill them. For a long time, there was a convergence between the normative and positive views of human behavior, in that the ideal and predicted decisions of agents in economic models were entangled in one single concept. That is, it was assumed that the best that could be done in each situation was exactly the choice that would prevail. Or, at least, that the facts that economics needed to explain could be understood in the light of models in which individual agents act as if they are able to make ideal decisions. However, in the last decades, the complexity of the environment in which economic decisions are made and the limits on the ability of agents to deal with it have been recognized, and incorporated into models of decision making in what came to be known as the bounded rationality paradigm. This was triggered by the incapacity of the unbounded rationality paradigm to explain observed phenomena and behavior. This thesis contributes to the literature in three different ways. Chapter 1 is a survey on bounded rationality, which gathers and organizes the contributions to the field since Simon (1955) first recognized the necessity to account for the limits on human rationality. The focus of the survey is on theoretical work rather than the experimental literature, which presents evidence of actual behavior that differs from what classic rationality predicts. The general framework is as follows. Given a set of exogenous variables, the economic agent needs to choose an element from the choice set that is available to him, in order to optimize the expected value of an objective function (assuming his preferences are representable by such a function).
If this problem is too complex for the agent to deal with, one or more of its elements is simplified. Each bounded rationality theory is categorized according to the most relevant element it simplifies. Chapter 2 proposes a novel theory of bounded rationality. Much in the same fashion as Conlisk (1980) and Gabaix (2014), we assume that thinking is costly in the sense that agents have to pay a cost for performing mental operations. In our model, if they choose not to think, that cost is avoided, but they are left with a single alternative, labeled the default choice. We exemplify the idea with a very simple model of consumer choice and introduce the concept of isofin curves, i.e., sets of default choices which generate the same utility net of thinking cost. Then, we apply the idea to a linear symmetric Cournot duopoly, in which the default choice can be interpreted as the most natural quantity to be produced in the market. We find that, as the thinking cost increases, the number of firms thinking in equilibrium decreases. More interestingly, for intermediate levels of thinking cost, there exists an equilibrium in which one of the firms chooses the default quantity and the other best responds to it, generating asymmetric choices in a symmetric model. Our model is able to explain well-known regularities identified in the Cournot experimental literature, such as the adoption of different strategies by players (Huck et al., 1999), the intertemporal rigidity of choices (Bosch-Domènech & Vriend, 2003) and the dispersion of quantities in the context of difficult decision making (Bosch-Domènech & Vriend, 2003). Chapter 3 applies a model of bounded rationality in a game-theoretic setting to the well-known turnout paradox: in large elections, pivotal probabilities vanish very quickly and no one should vote, in sharp contrast with the observed high levels of turnout.
Inspired by the concept of rhizomatic thinking, introduced by Bravo-Furtado & Côrte-Real (2009a), we assume that each person is self-delusional in the sense that, when making a decision, she believes that a fraction of the people who support the same party decide alike, even if no communication is established between them. This kind of belief simplifies the decision of the agent, as it reduces the number of players she believes to be playing against; it is thus a bounded rationality approach. Studying a two-party first-past-the-post election with a continuum of self-delusional agents, we show that the turnout rate is positive in all the possible equilibria, and that it can be as high as 100%. The game displays multiple equilibria, at least one of which entails a victory of the bigger party. The smaller one may also win, provided its relative size is not too small; more self-delusion among minority-party voters decreases this threshold size. Our model is able to explain some empirical facts, such as the possibility that a close election leads to low turnout (Geys, 2006), a lower margin of victory when turnout is higher (Geys, 2006) and high turnout rates favoring the minority (Bernhagen & Marsh, 1997).
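The costly-thinking mechanism of Chapter 2 can be illustrated numerically in a linear symmetric Cournot duopoly. All parameter values below (demand intercept, marginal cost, default quantity) are hypothetical, and the deviation check over think/default profiles is a simplified reading of the idea, not the thesis's formal treatment:

```python
from itertools import product

A, C = 10.0, 1.0        # hypothetical inverse-demand intercept and marginal cost
Q_DEFAULT = 2.0         # hypothetical default quantity
Q_NASH = (A - C) / 3.0  # standard Cournot-Nash quantity when both firms think

def profit(qi, qj):
    return (A - qi - qj - C) * qi

def best_response(qj):
    return max((A - C - qj) / 2.0, 0.0)

def quantities(t1, t2):
    """Quantities implied by a (think?, think?) profile."""
    if t1 and t2:
        return Q_NASH, Q_NASH
    if t1:
        return best_response(Q_DEFAULT), Q_DEFAULT
    if t2:
        return Q_DEFAULT, best_response(Q_DEFAULT)
    return Q_DEFAULT, Q_DEFAULT

def payoffs(t1, t2, k):
    q1, q2 = quantities(t1, t2)
    return profit(q1, q2) - k * t1, profit(q2, q1) - k * t2

def equilibria(k):
    """Profiles where neither firm gains by switching its think/default decision."""
    eqs = []
    for t1, t2 in product((False, True), repeat=2):
        u1, u2 = payoffs(t1, t2, k)
        if payoffs(not t1, t2, k)[0] <= u1 and payoffs(t1, not t2, k)[1] <= u2:
            eqs.append((t1, t2))
    return eqs
```

With these illustrative numbers, a high thinking cost supports the both-default profile, a low cost the both-think profile, and an intermediate cost only the asymmetric profiles, mirroring the asymmetric equilibrium in a symmetric model described above.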
Abstract:
The receiver-operating characteristic (ROC) curve is the most widely used measure for evaluating the performance of a diagnostic biomarker when predicting a binary disease outcome. The ROC curve displays the true positive rate (or sensitivity) and the false positive rate (or 1-specificity) for different cut-off values used to classify an individual as healthy or diseased. In time-to-event studies, however, the disease status (e.g. dead or alive) of an individual is not a fixed characteristic, and it may vary over the course of the study. In such cases, when evaluating the performance of the biomarker, several issues should be taken into account: first, the time-dependent nature of the disease status; and second, the presence of incomplete data (e.g. censored data typically present in survival studies). Accordingly, to assess the discrimination power of continuous biomarkers for time-dependent disease outcomes, time-dependent extensions of the true positive rate, the false positive rate, and the ROC curve have recently been proposed. In this work, we present new nonparametric estimators of the cumulative/dynamic time-dependent ROC curve that account for the possible modifying effect of current or past covariate measures on the discriminatory power of the biomarker. The proposed estimators can accommodate right-censored data, as well as covariate-dependent censoring. The behavior of the proposed estimators is explored through simulations and illustrated using data from a cohort of patients who suffered from acute coronary syndrome.
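For reference, the classical (time-fixed) ROC construction that these time-dependent extensions generalize can be sketched as follows, sweeping the cut-off over all observed biomarker values (toy data, not the cohort's, and no censoring handled here):

```python
def roc_points(scores, labels):
    """Empirical (FPR, TPR) pairs over all cut-offs of a biomarker score.
    labels must contain both classes (1 = diseased, 0 = healthy)."""
    thresholds = sorted(set(scores), reverse=True)
    pos = sum(labels)
    neg = len(labels) - pos
    pts = [(0.0, 0.0)]
    for t in thresholds:
        tp = sum(1 for s, y in zip(scores, labels) if s >= t and y == 1)
        fp = sum(1 for s, y in zip(scores, labels) if s >= t and y == 0)
        pts.append((fp / neg, tp / pos))
    return pts

def auc(pts):
    """Area under the ROC curve by the trapezoidal rule."""
    return sum((x2 - x1) * (y1 + y2) / 2.0
               for (x1, y1), (x2, y2) in zip(pts, pts[1:]))
```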
Abstract:
Using data from the airborne SAR R99 sensor, acquired in L-band (1.28 GHz) in amplitude with four polarizations (HH, VV, HV and VH), we evaluated the discrimination of várzea (floodplain) forest phytophysiognomies in the Amanã and Mamirauá Sustainable Development Reserves and adjacent areas, applying the Iterated Conditional Modes (ICM) algorithm for pointwise/contextual polarimetric classification. The results showed that the use of multivariate amplitude distributions, together with a texture band, produced classifications of higher quality than those obtained with uni/bivariate polarimetric data. This approach yielded a Kappa index of 0.8963, discriminating the three vegetation classes of interest and thus confirming the potential of SAR R99 data and the ICM algorithm for mapping Amazonian floodplain forests.
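The ICM idea can be sketched minimally: start from the maximum-likelihood labeling, then iterate pixelwise updates that trade off the data log-likelihood against a Potts-style neighbourhood agreement term. The parameterization below is illustrative only, not the paper's polarimetric amplitude model:

```python
import numpy as np

def icm(log_lik, beta=1.0, n_iter=5):
    """Iterated Conditional Modes on a label grid.

    log_lik: (H, W, K) array of pixelwise class log-likelihoods.
    beta: weight of the Potts-style contextual (neighbour agreement) term.
    """
    H, W, K = log_lik.shape
    labels = log_lik.argmax(axis=2)  # maximum-likelihood initialization
    for _ in range(n_iter):
        for i in range(H):
            for j in range(W):
                best_k, best_val = 0, -np.inf
                for k in range(K):
                    # number of 4-neighbours currently carrying label k
                    agree = sum(
                        labels[ni, nj] == k
                        for ni, nj in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                        if 0 <= ni < H and 0 <= nj < W
                    )
                    val = log_lik[i, j, k] + beta * agree
                    if val > best_val:
                        best_k, best_val = k, val
                labels[i, j] = best_k
    return labels
```

On a grid where one pixel's likelihood mildly favours the wrong class, the contextual term pulls it back to the label of its neighbours, which is exactly the smoothing behaviour that improves over pointwise classification.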
Abstract:
Sirtuins (Sirts) regulate several cellular mechanisms through deacetylation of several transcription factors and enzymes. Recently, Sirt2 was shown to prevent the development of inflammatory processes and its expression favors acute Listeria monocytogenes infection. The impact of this molecule in the context of chronic infections remains unknown. We found that specific Sirt2 deletion in the myeloid lineage transiently increased Mycobacterium tuberculosis load in the lungs and liver of conditional mice. Sirt2 did not affect long-term infection since no significant differences were observed in the bacterial burden at days 60 and 120 post-infection. The initial increase in M. tuberculosis growth was not due to differences in inflammatory cell infiltrates in the lung, myeloid or CD4+ T cells. The transcription levels of IFN-γ, IL-17, TNF, IL-6 and NOS2 were also not affected in the lungs by Sirt2-myeloid specific deletion. Overall, our results demonstrate that Sirt2 expression has a transitory effect in M. tuberculosis infection. Thus, modulation of Sirt2 activity in vivo is not expected to affect chronic infection with M. tuberculosis.
Abstract:
Master's dissertation in Systems Engineering
Abstract:
Inspired by natural structures, great attention has been devoted to the study and development of surfaces with extreme wettability properties. The meticulous study of natural systems revealed that the micro/nano-topography of the surface is critical to obtaining unique wettability features, including superhydrophobicity. However, surface chemistry also plays an important role in such surface characteristics. As the interaction of biomaterials with the biological milieu occurs at the surface of the materials, it is expected that synthetic substrates with extreme and controllable wettability, ranging from superhydrophilic to superhydrophobic regimes, could open the way to new investigations of cell-material interactions on nonconventional surfaces and to the development of alternative devices with biomedical utility. This first part of the review will describe in detail how proteins and cells interact with micro/nano-structured surfaces exhibiting extreme wettabilities.
Abstract:
Magdeburg, University, Faculty of Economics, Dissertation, 2011
Abstract:
Abstract Background: The revascularization strategy for left main disease is determinant for clinical outcomes. Objective: We sought to 1) validate and compare the performance of the SYNTAX Score 1 and 2 for predicting major cardiovascular events at 4 years in patients who underwent unprotected left main angioplasty, and 2) evaluate the long-term outcome according to the SYNTAX Score 2-recommended revascularization strategy. Methods: We retrospectively studied 132 patients from a single-centre registry who underwent unprotected left main angioplasty between March 1999 and December 2010. Discrimination and calibration of both models were assessed by ROC curve analysis, calibration curves and the Hosmer-Lemeshow test. Results: The total event rate was 26.5% at 4 years. The AUC for the SYNTAX Score 1 and SYNTAX Score 2 for percutaneous coronary intervention was 0.61 (95% CI: 0.49-0.73) and 0.67 (95% CI: 0.57-0.78), respectively. Despite a good overall adjustment for both models, the SYNTAX Score 2 tended to underpredict risk. In the 47 patients (36%) who should have undergone surgery according to the SYNTAX Score 2, the event rate was numerically higher (30% vs. 25%; p=0.54), and for those with a larger difference between the two SYNTAX Score 2 estimates (percutaneous coronary intervention vs. coronary artery bypass graft risk estimation greater than 5.7%), the event rate was almost double (40% vs. 22%; p=0.2). Conclusion: The SYNTAX Score 2 may allow a better and individualized risk stratification of patients who need revascularization of an unprotected left main coronary artery. Prospective studies are needed for further validation.
Abstract:
n.s. no.36(1987)
Abstract:
In microeconomic analysis, functions with diminishing returns to scale (DRS) have frequently been employed. Various properties of increasing quasiconcave aggregator functions with DRS are derived. Furthermore, duality in the classical sense, as well as of a new type, is studied for such aggregator functions in production and consumer theory. In particular, representation theorems for direct and indirect aggregator functions are obtained; these involve only small sets of generator functions. The study is carried out in the contemporary framework of abstract convexity and abstract concavity.
Abstract:
Ever since the appearance of the ARCH model [Engle (1982a)], an impressive array of variance specifications belonging to the same class of models has emerged [e.g. Bollerslev's (1986) GARCH; Nelson's (1990) EGARCH]. This recent domain has seen very successful developments. Nevertheless, several empirical studies seem to show that the performance of such models is not always adequate [Boulier (1992)]. In this paper we propose a new specification: the Quadratic Moving Average Conditional Heteroskedasticity (QMACH) model. Its statistical properties, such as kurtosis and symmetry, as well as two estimators (Method of Moments and Maximum Likelihood), are studied. Two statistical tests are presented: the first tests for homoskedasticity, and the second discriminates between the ARCH and QMACH specifications. A Monte Carlo study is presented in order to illustrate some of the theoretical results. An empirical study is undertaken for the DM-US exchange rate.
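The conditional heteroskedasticity that this family of models captures can be illustrated with a short simulation of Engle's ARCH(1), whose stationary kurtosis 3(1 - a1^2)/(1 - 3 a1^2) exceeds the Gaussian value of 3. The parameters below are hypothetical; the QMACH specification itself is not reproduced here:

```python
import random

def simulate_arch1(n, a0=0.2, a1=0.5, seed=0):
    """Simulate eps_t = sqrt(h_t) z_t with h_t = a0 + a1 * eps_{t-1}^2, z_t ~ N(0,1).
    Unconditional variance is a0 / (1 - a1); for a1 = 0.5 the stationary
    kurtosis is 3 * (1 - a1**2) / (1 - 3 * a1**2) = 9, i.e. heavy tails."""
    rng = random.Random(seed)
    eps, series = 0.0, []
    for _ in range(n):
        h = a0 + a1 * eps ** 2
        eps = h ** 0.5 * rng.gauss(0.0, 1.0)
        series.append(eps)
    return series

def kurtosis(xs):
    """Sample kurtosis m4 / m2^2 (Gaussian value: 3)."""
    n = len(xs)
    m = sum(xs) / n
    m2 = sum((x - m) ** 2 for x in xs) / n
    m4 = sum((x - m) ** 4 for x in xs) / n
    return m4 / m2 ** 2
```

A long simulated path shows the volatility clustering and fat tails that motivate ARCH-type specifications, with sample kurtosis well above 3.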