989 results for non-expected utility
Abstract:
The concept of Ambiguity designates those situations where the information available to the decision maker is insufficient to form a probabilistic view of the world. It has thus provided the motivation for departing from the Subjective Expected Utility (SEU) paradigm. Yet a formalization of the concept is missing. This is a grave omission, as it leaves non-expected utility models resting on shaky ground. In particular, it leaves unanswered such basic questions as: (1) Does Ambiguity exist? (2) If so, which situations should be labeled "ambiguous"? (3) Why should one depart from SEU in the presence of Ambiguity? (4) What kind of behavior should emerge in the presence of Ambiguity? The present paper fills these gaps. Specifically, it identifies those information structures that are incompatible with SEU theory and shows that their mathematical properties are the formal counterpart of the intuitive idea of insufficient information. These are used to give a formal definition of Ambiguity and, consequently, to distinguish between ambiguous and unambiguous situations. Finally, the paper shows that behavior not conforming to SEU theory must emerge in the presence of insufficient information, and it identifies the class of non-EU models that emerge in the face of Ambiguity. The paper also proposes a new comparative definition of Ambiguity and discusses its relation to some of the existing literature.
Abstract:
This paper provides a characterization of QALYs, the most important outcome measure in medical decision making, in the context of a general rank dependent utility model. We show that both for chronic and for nonchronic health states the characterization of QALYs depends on intuitive conditions. This facilitates the assessment of the validity of QALYs in rank dependent non-expected utility theories and a comparison with other utility based measures of health.
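For readers unfamiliar with rank-dependent evaluation, the setting of the abstract can be illustrated with a minimal numeric sketch. All parameters below, including the power weighting function w(p) = p^0.7 and the quality weights, are illustrative assumptions, not taken from the paper:

```python
def rdu_qaly(outcomes, probs, w=lambda p: p ** 0.7):
    """Rank-dependent utility of a lottery over QALY values.

    outcomes: QALY values (quality weight x years in the health state);
    probs: their probabilities. Outcomes are ranked from best to worst,
    and each receives the decision weight
    w(P(at least this good)) - w(P(strictly better)).
    """
    ranked = sorted(zip(outcomes, probs), key=lambda pair: -pair[0])
    total, cum = 0.0, 0.0
    for x, p in ranked:
        total += (w(cum + p) - w(cum)) * x
        cum += p
    return total

# Ten years in full health (quality weight 1.0) for sure, versus a 50/50
# gamble between ten years at quality weight 0.5 and ten years in full health.
certain = rdu_qaly([1.0 * 10], [1.0])
gamble = rdu_qaly([0.5 * 10, 1.0 * 10], [0.5, 0.5])
```

Because the concave weighting function overweights the best-ranked outcome, the gamble here is valued above its expected QALY of 7.5, which is exactly the kind of departure from expected utility the characterization in the paper addresses.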
Abstract:
Nowadays, Heating, Ventilation and Air Conditioning (HVAC) equipment plays a major role in the design, development and maintenance of any building, however small. Hence the pressing need to rationalise and optimise energy consumption. The high reliability demanded of these systems increasingly forces us to find ways of making their maintenance more efficient, so all faults that could harm the performance of these installations must be prevented proactively. It is therefore necessary to detect these faults/anomalies, and it is essential to anticipate such events by predicting their occurrence within a predefined time horizon, allowing action to be taken as early as possible. It is in this domain that the present dissertation seeks solutions so that the maintenance of this equipment can take place proactively and as effectively as possible. The central idea is to intervene while the problem is still at an incipient stage, automatically changing the behaviour of the monitored equipment by means of intelligent fault-diagnosis agents. In the case study, the operation of an Air Handling Unit (AHU) is automatically adapted to the detected deviations/anomalies, with a full system shutdown used only as a last resort. The architecture applied is based on artificial intelligence techniques, namely multi-agent systems. The algorithm used and tested was built in LabVIEW®, using an intelligent control toolkit for LabVIEW®. The proposed system is validated by means of a simulator capable of reproducing the real operating conditions of an AHU.
Abstract:
I consider a general specification of criminals' objective function and argue that, when general non-expected utility theory is substituted for traditional expected utility theory, the high-fine-low-probability result (Becker, 1968) only holds under specific and strong restrictions.
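The mechanism can be illustrated with a toy numeric example. A hypothetical risk-neutral offender and a Prelec-style weighting function are assumed here for illustration; none of these parameters come from the paper. Under expected utility only the expected fine p·f matters, so a high-probability-low-fine policy and a low-probability-high-fine policy with the same p·f deter equally; under probability weighting they do not:

```python
import math

def crime_payoff(p, f, W=100.0, g=20.0, weight=lambda p: p):
    """Value of committing the crime: wealth W, gain g, fine f,
    detection probability p, with a decision weight on being caught."""
    wp = weight(p)
    return wp * (W + g - f) + (1 - wp) * (W + g)

def prelec(p):
    # Prelec (1998) weighting function; 0.65 is an assumed parameter.
    return math.exp(-(-math.log(p)) ** 0.65)

# Two policies with the same expected fine p * f = 10:
eu_a = crime_payoff(p=0.5, f=20)      # high probability, low fine
eu_b = crime_payoff(p=0.01, f=1000)   # low probability, high fine
wt_a = crime_payoff(p=0.5, f=20, weight=prelec)
wt_b = crime_payoff(p=0.01, f=1000, weight=prelec)
```

Under expected utility the two policies give the offender identical payoffs, while the Prelec agent overweights the small detection probability, so the low-probability-high-fine policy deters more. Reversing the weighting (underweighting small probabilities) would reverse the ranking, which is why the Becker result survives only under restrictions.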
Abstract:
Go out into nature to recover, or reach for a nature simulation instead? Intuitively, most people would ascribe a greater restorative value to nature than to a simulation of it. But is nature actually more restorative? Restorative environment research frequently uses nature simulations to determine the restorative effect of nature. The problem is that their ecological validity and comparability have not yet been empirically established. The present work addresses this methodological and empirical gap. It tests both the ecological validity and the comparability of nature simulations. To this end, the restorative effect of two nature simulations is empirically examined and compared with that of physical nature. In addition, aspects of subjective experience and evaluation in the context of nature-based restoration are explored. Experienced artificiality/naturalness, which refers to the experiential quality of nature simulations and of physical nature, is regarded as an important mechanism: compared with physical nature, nature simulations offer a reduced experiential quality (experienced artificiality), e.g. reduced quality and quantity of sensory stimulation. If such a comparison is made not only with physical nature but across different types of nature simulation, differences in experienced artificiality appear there as well. For example, a nature photograph differs from a nature film in the absence of auditory and moving stimuli. This experienced artificiality can inhibit the restorative effect of nature, either directly or indirectly via evaluations. The main hypothesis is that the restorative effect of nature decreases as experienced artificiality increases.
The combined field and laboratory experiment is based on a single-factor pre-post design. The 117 participants first completed a cognitively and affectively demanding task, followed by a recovery phase. This consisted of a walk taken either in physical nature (an urban park), in one of two audio-visual nature simulations (a video-recorded vs. a computer-generated walk through the same urban park), or on a treadmill without any audio-visual presentation. Experienced artificiality/naturalness was thus operationalised as follows: physical nature represents experienced naturalness; the two nature simulations represent experienced artificiality; the computer-generated version is experientially more artificial than the video version, since it is less photorealistic. Participants were randomly assigned to one of the four experimental recovery settings. The effects of moderate exercise were controlled in the nature simulations by walking on the treadmill. Strain and recovery responses were measured at the cognitive (concentration, attentional performance), affective (3 mood scales: alertness, calmness, good mood) and physiological (alpha-amylase) levels in order to obtain a comprehensive picture of the responses. Overall, the results show that, despite differences in experienced artificiality/naturalness, the two nature simulations led to recovery responses fairly similar to those in physical nature. The exceptions are one of the three affective responses (alertness) and the physiological response: participants in the physical-nature condition reported being more alert and, contrary to expectations, showed higher physiological arousal. Physical nature is therefore not fundamentally more restorative than the nature simulations, and the hypothesis could not be confirmed.
Rather, complex recovery patterns, and thus different recovery qualities of the settings, emerge and require differentiated consideration. As for the ecological validity of nature simulations, they can be described as ecologically valid only with qualifications, i.e. for certain recovery responses but not for all. The two nature simulations likewise led to similar recovery responses despite differences in experienced artificiality and can therefore be treated as equivalent. Surprisingly, similar recovery responses occurred even though participants perceived the existing differences and evaluated the experientially more artificial computer-generated version more negatively. Given these unexpected results, the explanatory concept of experienced artificiality/naturalness must be called into question. Alternative explanatory concepts for the results ("uncertainty", mental spatial models), the emerging differences in the recovery qualities of the settings, methodological limitations, and the practical significance of the results are discussed critically.
Abstract:
The paper reviews recent models that have applied the techniques of behavioural economics to the analysis of the tax compliance choice of an individual taxpayer. The construction of these models is motivated by the failure of the Yitzhaki version of the Allingham–Sandmo model to predict correctly the proportion of taxpayers who will evade and the effect of an increase in the tax rate upon the chosen level of evasion. Recent approaches have applied non-expected utility theory to the compliance decision and have addressed social interaction. The models we describe are able to match the observed extent of evasion and correctly predict the tax effect but do not have the parsimony or precision of the Yitzhaki model.
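The Yitzhaki comparative static that motivates these models can be reproduced in a small numeric sketch. CRRA utility and all parameter values below are illustrative assumptions, not taken from the review; the penalty is levied on evaded tax, which is Yitzhaki's formulation:

```python
def optimal_declared(t, y=100.0, p=0.3, F=2.0, rho=2.0, grid=2001):
    """Declared income x that maximizes expected utility for a taxpayer
    with income y, tax rate t, audit probability p, penalty rate F on
    evaded tax, and CRRA coefficient rho (grid search for clarity)."""
    def u(c):
        return c ** (1 - rho) / (1 - rho)
    best_x, best_eu = 0.0, float("-inf")
    for i in range(grid):
        x = y * i / (grid - 1)          # declared income
        evaded_tax = t * (y - x)
        eu = (1 - p) * u(y - t * x) + p * u(y - t * x - F * evaded_tax)
        if eu > best_eu:
            best_x, best_eu = x, eu
    return best_x

low = optimal_declared(t=0.2)
high = optimal_declared(t=0.4)
```

Raising the tax rate from 0.2 to 0.4 raises optimal declared income, i.e. evasion falls: under decreasing absolute risk aversion the higher tax makes the taxpayer poorer and hence more risk averse. This is the counterfactual prediction that the non-expected utility and social-interaction models reviewed in the paper seek to overturn.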
Abstract:
Using a two-sector model of optimal capital accumulation in an open economy, we determine the impact of a permanent, unanticipated increase in the economy's productivity on the paths of the exchange rate, wages, investment, saving and, therefore, of external debt and the capital stock. In general, a permanent positive productivity shock is followed by a fall in saving, a deterioration of the current account and an appreciation of the exchange rate. From the model's point of view, all of these are intertemporal-equilibrium phenomena, consequences of the rise in permanent income and of the excess demand for domestic goods that follows the productivity gain. Assuming that the stabilization programs raised the economy's productivity, the analytical framework makes it possible to rationalize qualitatively the phenomena observed after those plans.
Abstract:
The Rational Agent model has been a foundational basis for theoretical fields such as Economics, Management Science, Artificial Intelligence and Game Theory, mainly through the "maximization under constraints" principle, e.g. the expected utility models, among them Savage's Subjective Expected Utility (SEU) theory, the most influential over the theoretical models we see nowadays, even though many other developments have followed, including in the field of non-expected utility theories. Starting from the "full rationality" assumption and moving to Simon's less idealistic "bounded rationality", or to the classical anomaly studies, such as the "heuristics and biases" analysis by Kahneman and Tversky, "Prospect Theory", also by Kahneman and Tversky, or Thaler's anomalies, and many others, what we see now is that the Rational Agent model is an example of "management by exceptions": each newly presented anomaly requires, in sequence, a "problem solving" development. This work is a theoretical essay which seeks to establish: (1) the rational model as a "set of exceptions"; (2) the unfeasibility of the current situation, since once an anomaly is identified a specific solution must be developed for it, and since the number of anomalies increases every year, the rational model becomes very difficult to manage; (3) that behaviors judged "irrational" or deviant by the Rational Model are truly not; (4) that this is the right moment for a theory to emerge that includes the mental processes used in decision making; and (5) an alternative model, based on analyses from cognitive and experimental psychology, covering conscious and unconscious processes, cognition, intuition, analogy-making, abstract roles, and others. Finally, we present conclusions and directions for future research, which call for deeper studies of this work's themes, for mathematical modelling, and for studies on the possible integration of rational analysis and cognitive models.
Abstract:
The goal of this work is to verify whether adjusting the interest parity conditions for market expectations (uncovered parity) and for risk premia (covered and uncovered parities) validates the underlying no-arbitrage relation, or at least yields econometric results closer to validating it. To this end, we combine rates of return on domestic and US fixed-income instruments and apply the econometric framework of time series analysis. As a first step, we test the interest parity conditions (uncovered and covered) in their traditional form. In the next step, we apply the econometric tests to the parity conditions adjusted for a risk premium. For uncovered interest parity (UIP) we did not obtain satisfactory results, even after adjusting for risk premia. The adjustment moved the signs of the coefficients in the right direction, but the magnitude of the coefficient on effective exchange rate depreciation came to diverge considerably from the magnitude of the other series. Although we obtained validity for covered interest parity (CIP) in its traditional form, we would not expect this result, since it would imply a zero country risk premium over the period. Once CIP is adjusted for the default risk premium, the series are no longer cointegrated; that is, the default risk premium behaves independently of the forward premium and the interest rate differential. Possible reasons for not obtaining the expected results are: a sample window shorter than 3 years, measurement error in the survey data, or attempts by the Central Bank to control the nominal exchange rate and domestic interest rates simultaneously.
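The two parity conditions being tested can be checked with simple arithmetic. The numbers below are invented for illustration and are not the paper's data; rates are per-period, in decimal form, with the exchange rate quoted as domestic currency per USD:

```python
i_dom, i_us = 0.15, 0.05        # domestic and US interest rates
spot, forward = 2.00, 2.19      # spot and forward exchange rates

# Covered interest parity (CIP): (1 + i) = (1 + i*) * F/S, up to a
# country risk premium. A positive deviation means domestic assets
# pay more than the fully hedged foreign position.
cip_dev = (1 + i_dom) - (1 + i_us) * forward / spot

# Uncovered interest parity (UIP): (1 + i) = (1 + i*) * E[S']/S, up to a
# currency risk premium, with E[S'] taken from survey expectations.
expected_spot = 2.15
uip_dev = (1 + i_dom) - (1 + i_us) * expected_spot / spot
```

In this made-up example CIP holds almost exactly while UIP leaves a positive residual, which is the kind of wedge the paper attributes to risk premia and then tests for cointegration with the interest rate differential.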
Abstract:
Employing a two-sector model of capital accumulation in an open economy, we determine the impact of a permanent, unanticipated increase in the economy's productivity on the paths of the following variables: the exchange rate, wages, investment, saving, and consequently external debt and the capital stock. After this permanent positive shock, the saving rate decreases, the current account deteriorates and the exchange rate appreciates. These are equilibrium phenomena from an intertemporal point of view, due to the rise in permanent income and to the excess demand for domestic goods that follows the productivity increase. Assuming that the stabilization programs raised the economy's productivity, the model can rationalize qualitatively the stylized facts witnessed after those programs.
Abstract:
We study an intertemporal asset pricing model in which a representative consumer maximizes expected utility derived from both the ratio of his consumption to some reference level and this level itself. If the reference consumption level is assumed to be determined by past consumption levels, the model generalizes the usual habit formation specifications. When the reference level growth rate is made dependent on the market portfolio return and on past consumption growth, the model mixes a consumption CAPM with habit formation together with the CAPM. It therefore provides, in an expected utility framework, a generalization of the non-expected recursive utility model of Epstein and Zin (1989). When we estimate this specification with aggregate per capita consumption, we obtain economically plausible values of the preference parameters, in contrast with the habit formation or the Epstein-Zin cases taken separately. All tests performed with various preference specifications confirm that the reference level enters significantly in the pricing kernel.
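As a stylized illustration of how a consumption-ratio reference level enters the pricing kernel, consider an Abel-style external habit with X_t = C_{t-1}^κ. This is a simplification: in the paper the reference level's growth also depends on the market return, and all parameter values below are assumptions. With u(C/X) and CRRA curvature γ, the stochastic discount factor is M_{t+1} = β (C_{t+1}/C_t)^{-γ} (C_t/C_{t-1})^{κ(γ-1)}:

```python
beta, gamma, kappa = 0.98, 2.0, 0.8  # assumed preference parameters

def sdf(g_next, g_curr):
    """Pricing kernel M_{t+1} as a function of next-period consumption
    growth g_next = C_{t+1}/C_t and current growth g_curr = C_t/C_{t-1}."""
    return beta * g_next ** (-gamma) * g_curr ** (kappa * (gamma - 1.0))

m_habit = sdf(g_next=1.01, g_curr=1.03)
m_crra = beta * 1.01 ** (-gamma)   # kappa = 0 benchmark: standard CRRA kernel
```

With κ > 0 and γ > 1, strong past consumption growth raises the current reference level and hence the kernel relative to the standard CRRA case, which is how the reference level can enter the pricing kernel significantly, as the estimates in the paper indicate.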
Abstract:
In this work I discuss several key aspects of welfare economics and policy analysis and I propose two original contributions to the growing field of behavioral public policymaking. After providing a historical perspective of welfare economics and an overview of policy analysis processes in the introductory chapter, in chapter 2 I discuss a debated issue of policymaking, the choice of the social welfare function. I contribute to this debate with an original methodological contribution based on the analysis of the quantitative relationship among different social welfare functional forms commonly used by policy analysts. In chapter 3 I then discuss a behavioral policy to counter indirect tax evasion based on the use of lotteries. I show that the predictions of my model based on non-expected utility are consistent with observed, and so far unexplained, empirical evidence of the policy's success. Finally, in chapter 4 I investigate by means of a laboratory experiment the effects of social influence on the individual likelihood to engage in altruistic punishment. I show that bystanders' decision to engage in punishment is influenced by the punishment behavior of their peers, and I suggest ways to enact behavioral policies that exploit this finding.
Abstract:
Individual analysis of functional Magnetic Resonance Imaging (fMRI) scans requires user adjustment of the statistical threshold in order to maximize true functional activity and eliminate false positives. In this study, we propose a novel technique that uses radiomic texture analysis (TA) features associated with heterogeneity to predict areas of true functional activity. Scans of 15 right-handed healthy volunteers were analyzed using SPM8. The resulting functional maps were thresholded to optimize visualization of language areas, resulting in 116 regions of interest (ROIs). A board-certified neuroradiologist classified the ROIs into Expected (E) and Non-Expected (NE) based on their anatomical locations. TA was performed using the mean Echo-Planar Imaging (EPI) volume, and 20 rotation-invariant texture features were obtained for each ROI. Using forward stepwise logistic regression, we built a predictive model that discriminated between E and NE areas of functional activity, with a cross-validation AUC and success rate of 79.84% and 80.19%, respectively (specificity/sensitivity of 78.34%/82.61%). This study found that radiomic TA of fMRI scans may allow for determination of areas of true functional activity, and thus eliminate clinician bias.
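The classification step can be sketched as follows. Synthetic one-feature data and plain gradient descent stand in here for the study's 20 radiomic features and stepwise regression; the clean separation of the two groups is assumed for illustration:

```python
import math
import random

random.seed(0)
# Assume, for illustration, that E ROIs have higher texture heterogeneity.
X = [random.gauss(1.0, 0.5) for _ in range(60)] + \
    [random.gauss(-1.0, 0.5) for _ in range(56)]   # 116 ROIs in total
y = [1] * 60 + [0] * 56                             # 1 = Expected, 0 = Non-Expected

# Logistic regression fitted by plain gradient descent on the log loss.
w, b, lr = 0.0, 0.0, 0.1
for _ in range(500):
    gw = gb = 0.0
    for xi, yi in zip(X, y):
        p = 1 / (1 + math.exp(-(w * xi + b)))
        gw += (p - yi) * xi
        gb += (p - yi)
    w -= lr * gw / len(X)
    b -= lr * gb / len(X)

def auc(scores, labels):
    """Area under the ROC curve via pairwise comparison of scores."""
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

scores = [w * xi + b for xi in X]
score = auc(scores, y)
```

On real data the AUC would of course be far lower than on this well-separated toy set; the point is only the shape of the pipeline: fit a logistic model on per-ROI texture features, then score discrimination between E and NE with AUC.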
Abstract:
We present a definition of increasing uncertainty, in which an elementary increase in the uncertainty of any act corresponds to the addition of an 'elementary bet' that increases consumption by a fixed amount in (relatively) 'good' states and decreases consumption by a fixed (and possibly different) amount in (relatively) 'bad' states. This definition naturally gives rise to a dual definition of comparative aversion to uncertainty. We characterize this definition for a popular class of generalized models of choice under uncertainty.
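A minimal two-state sketch of the definition (the states and amounts below are invented for illustration, not taken from the paper):

```python
# An act assigns consumption to states; rank states from 'good' to 'bad'.
f = {"good": 10.0, "bad": 6.0}

# An elementary bet adds a fixed amount a in the (relatively) good state
# and subtracts a possibly different fixed amount b in the (relatively)
# bad state, leaving the ranking of states intact.
a, b = 1.0, 2.0
g = {"good": f["good"] + a, "bad": f["bad"] - b}

spread_f = f["good"] - f["bad"]
spread_g = g["good"] - g["bad"]   # wider spread: an elementary increase in uncertainty
```

The dual definition of comparative uncertainty aversion then asks which decision makers dislike moving from f to g.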
Abstract:
The comparative study based on spectroscopic analysis of the materials used to produce four sixteenth-century Manueline Charters (the Charters of Alcochete, Terena, Alandroal and Évora) was performed following a systematic analytical approach. SEM–EDS, μ-Raman and μ-FTIR analyses highlighted interesting features among them, namely the use of different pigments and colourants (such as different green and yellow pigments), the presence of pigment alterations and the use of a non-expected extemporaneous material (the presence of titanium white in the Charter of Alcochete). Principal component analysis restricted to the C–H absorption region (3000–2840 cm−1) was applied to 36 infrared spectra of blue historical samples from the Charters of Alcochete, Terena, Alandroal and Évora, suggesting the use of a mixture of a triglyceride and a polysaccharide as binder.
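The chemometric step can be sketched as follows. Synthetic spectra with an assumed C–H band difference stand in for the 36 historical samples; the axis range, band position and group split are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
wavenumbers = np.arange(4000, 600, -2.0)        # full-range IR axis, cm^-1
spectra = rng.normal(0.0, 0.01, (36, wavenumbers.size))

# Two hypothetical binder groups differing in a C-H stretching band near
# 2920 cm^-1 (e.g. a more lipidic mixture in the first group):
band = np.exp(-((wavenumbers - 2920.0) / 15.0) ** 2)
spectra[:18] += 0.5 * band

# Restrict to the C-H absorption window (3000-2840 cm^-1), as in the paper.
window = (wavenumbers <= 3000) & (wavenumbers >= 2840)
X = spectra[:, window]

# PCA: mean-center, eigendecompose the covariance, project onto PC1.
Xc = X - X.mean(axis=0)
cov = Xc.T @ Xc / (len(X) - 1)
evals, evecs = np.linalg.eigh(cov)              # eigenvalues in ascending order
pc1 = Xc @ evecs[:, -1]                         # scores on the leading component
```

Restricting PCA to the C–H window focuses the decomposition on the binder's aliphatic signature, so the leading component separates the two assumed binder groups rather than unrelated spectral variation.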