976 results for non-expected utility
Abstract:
The concept of Ambiguity designates those situations in which the information available to the decision maker is insufficient to form a probabilistic view of the world. It has thus provided the motivation for departing from the Subjective Expected Utility (SEU) paradigm. Yet a formalization of the concept is missing. This is a serious omission, as it leaves non-expected utility models resting on shaky ground. In particular, it leaves unanswered such basic questions as: (1) Does Ambiguity exist? (2) If so, which situations should be labeled "ambiguous"? (3) Why should one depart from Subjective Expected Utility (SEU) in the presence of Ambiguity? (4) What kind of behavior should emerge in the presence of Ambiguity? The present paper fills these gaps. Specifically, it identifies those information structures that are incompatible with SEU theory and shows that their mathematical properties are the formal counterpart of the intuitive idea of insufficient information. These are used to give a formal definition of Ambiguity and, consequently, to distinguish between ambiguous and unambiguous situations. Finally, the paper shows that behavior not conforming to SEU theory must emerge when information is insufficient, and it identifies the class of non-EU models that emerge in the face of Ambiguity. The paper also proposes a new comparative definition of Ambiguity and discusses its relation to some of the existing literature.
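A standard illustration of such a situation, not taken from the paper but consistent with what it describes, is the three-colour Ellsberg urn; the probabilities p_R, p_B, p_Y below are the notation of this sketch only.

```latex
% An urn holds 30 red balls and 60 black or yellow balls in unknown proportion.
% Typical subjects prefer betting on red over black, yet prefer betting on
% black-or-yellow over red-or-yellow. Under SEU these choices would require
\[
  p_R > p_B
  \qquad\text{and}\qquad
  p_B + p_Y > p_R + p_Y \;\Longleftrightarrow\; p_B > p_R,
\]
% which is contradictory: no single subjective prior rationalizes the pattern,
% so "30 red, 60 black or yellow in unknown proportion" is a prototypical
% ambiguous information structure.
```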
Abstract:
Poor pharmacokinetics is one of the reasons for the withdrawal of drug candidates from clinical trials. There is an urgent need to investigate in vitro ADME (absorption, distribution, metabolism and excretion) properties and to recognise unsuitable drug candidates as early as possible in the drug development process. The current throughput of in vitro ADME profiling is insufficient because effective new synthesis techniques, such as in silico drug design and combinatorial synthesis, have vastly increased the number of drug candidates. Assay technologies for larger sets of compounds than are currently feasible are critically needed. The first part of this work focused on the evaluation of the cocktail strategy in studies of drug permeability and metabolic stability. N-in-one liquid chromatography-tandem mass spectrometry (LC/MS/MS) methods were developed and validated for the multiple-component analysis of samples from cocktail experiments. Together, cocktail dosing and LC/MS/MS were found to form an effective tool for increasing throughput. First, cocktail dosing, i.e. the use of a mixture of many test compounds, was applied in permeability experiments with Caco-2 cell culture, a widely used in vitro model of small intestinal absorption. A cocktail of 7-10 reference compounds was successfully evaluated for standardization and routine testing of the performance of Caco-2 cell cultures. Secondly, the cocktail strategy was used in metabolic stability studies of drugs with UGT isoenzymes, which are among the most important phase II drug-metabolizing enzymes. The study confirmed that the determination of intrinsic clearance (Clint) with a cocktail of seven substrates is possible. The LC/MS/MS methods that were developed were fast and reliable for the quantitative analysis of a heterogeneous set of drugs from Caco-2 permeability experiments and of the set of glucuronides from in vitro stability experiments. The performance of a new ionization technique, atmospheric pressure photoionization (APPI), was evaluated by comparison with electrospray ionization (ESI), with both techniques used for the analysis of Caco-2 samples. Like ESI, APPI proved to be a reliable technique for the analysis of Caco-2 samples, and it was even more flexible than ESI because of its wider linear dynamic range. The second part of the experimental study focused on metabolite profiling. Different mass spectrometric instruments and commercially available software tools were investigated for profiling metabolites in urine and hepatocyte samples. All the instruments tested (triple quadrupole, quadrupole time-of-flight, ion trap) showed both strengths and weaknesses in searching for and identifying expected and non-expected metabolites. Although current profiling software is helpful, it is still insufficient, and a time-consuming, largely manual approach is still required for metabolite profiling from complex biological matrices.
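The two throughput-limiting quantities mentioned above, Caco-2 permeability and intrinsic clearance, are usually computed from standard formulas; the sketch below is a minimal illustration with hypothetical numbers, not code from the thesis.

```python
# Minimal illustration (standard formulas, hypothetical numbers; not code from
# the thesis): apparent permeability from a Caco-2 transport experiment and
# intrinsic clearance from an in vitro half-life.
import math

def apparent_permeability(dq_dt, area_cm2, c0):
    # Papp = (dQ/dt) / (A * C0); with dQ/dt the permeation rate (amount/time),
    # A the filter area and C0 the initial donor concentration, consistent
    # units give Papp in cm/s.
    return dq_dt / (area_cm2 * c0)

def clint_from_half_life(t_half_min, incubation_volume_ul, protein_mg):
    # Clint = ln(2)/t1/2 * (incubation volume / amount of protein),
    # here in uL/min/mg protein (substrate-depletion half-life method).
    return (math.log(2) / t_half_min) * (incubation_volume_ul / protein_mg)

# Hypothetical example values
print(apparent_permeability(dq_dt=1.2e-6, area_cm2=1.12, c0=0.01))      # cm/s
print(clint_from_half_life(t_half_min=30.0, incubation_volume_ul=500.0,
                           protein_mg=0.5))                             # uL/min/mg
```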
Abstract:
Go out into nature for restoration, or reach for a nature simulation instead? Intuitively, most people would attribute a greater restorative value to nature than to a nature simulation. But is nature actually more restorative? In restorative environment research, nature simulations are frequently used to determine the restorative effect of nature. The problem is that their ecological validity and comparability have not yet been established empirically. The present work addresses this methodological and empirical gap. It examines both the ecological validity and the comparability of nature simulations. To this end, the restorative effect of two nature simulations is investigated empirically and compared with that of physical-material nature. In addition, aspects of subjective experience and evaluation in the context of nature-based restoration are explored. Experiential artificiality/naturalness, which refers to the experiential quality of nature simulations and of physical-material nature, is regarded as an important underlying mechanism: compared with physical-material nature, nature simulations offer a reduced experiential quality (experiential artificiality), e.g. a reduced quality and quantity of sensory stimulation. If such a comparison is made not only with physical-material nature but also between different types of nature simulation, differences in experiential artificiality emerge there as well. For example, a nature photograph differs from a nature film in the absence of auditory and moving stimuli. This experiential artificiality can inhibit the restorative effect of nature, directly or indirectly via evaluations. The main hypothesis is that the restorative effect of nature decreases as experiential artificiality increases. The combined field and laboratory experiment is based on a single-factor pre-post design. The 117 participants were first given a cognitively and affectively demanding task, followed by a restoration phase. This consisted of a walk that took place either in physical-material nature (an urban park), in one of two audio-visual nature simulations (a video-recorded vs. a computer-generated walk through the same urban park), or on a treadmill without any audio-visual presentation. Experiential artificiality/naturalness was thus operationalized as follows: physical nature represents experiential naturalness; the two nature simulations represent experiential artificiality; and the computer-generated version is experientially more artificial than the video version because it is less photorealistic. Assignment to one of the four experimental restoration settings was random. The effects of moderate exercise were controlled for in the nature simulations by walking on a treadmill. Strain and restoration responses were measured at the cognitive (concentration, attentional performance), affective (three mood scales: alertness, calmness, good mood) and physiological (alpha-amylase) levels in order to obtain a comprehensive picture of the responses. Overall, the results show that, despite differences in experiential artificiality/naturalness, the two nature simulations lead to restoration responses relatively similar to those of physical-material nature.
Exceptions are one of the three affective responses (alertness) and the physiological response: participants in the physical-material nature condition report being more alert and, contrary to expectations, show higher physiological arousal. Accordingly, physical-material nature is not fundamentally more restorative than the nature simulations, and the hypothesis could not be confirmed. Rather, complex restoration patterns, and thus different restorative qualities of the settings, emerge that require a differentiated view. Regarding the ecological validity of nature simulations, they can be described as ecologically valid only with qualifications, i.e. for certain but not for all restoration responses. The two nature simulations likewise lead to similar restoration responses despite their differences in experiential artificiality and can thus be treated as equivalent. Surprisingly, similar restoration responses occur here even though the existing differences are perceived by the participants and the experientially more artificial computer-generated version is evaluated more negatively. In light of these unexpected results, the explanatory concept of experiential artificiality/naturalness must be called into question. Alternative explanatory concepts for the results ("uncertainty", mental spatial models), the emerging differences in the restorative qualities of the settings, methodological limitations, and the practical relevance of the results are discussed critically.
Abstract:
The paper reviews recent models that have applied the techniques of behavioural economics to the analysis of the tax compliance choice of an individual taxpayer. The construction of these models is motivated by the failure of the Yitzhaki version of the Allingham–Sandmo model to predict correctly the proportion of taxpayers who will evade and the effect of an increase in the tax rate upon the chosen level of evasion. Recent approaches have applied non-expected utility theory to the compliance decision and have addressed social interaction. The models we describe are able to match the observed extent of evasion and correctly predict the tax effect but do not have the parsimony or precision of the Yitzhaki model.
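For reference, the two compliance problems contrasted above can be written in standard textbook notation (this is a restatement of the well-known models, not the paper's own formulation): a taxpayer with true income W declares X and faces tax rate t, audit probability p and utility U; the penalty rate π applies to undeclared income in Allingham-Sandmo, while the multiple F applies to evaded tax in the Yitzhaki version.

```latex
\[
  \text{Allingham--Sandmo:}\quad
  \max_X \; (1-p)\,U\!\big(W - tX\big) + p\,U\!\big(W - tX - \pi\,(W - X)\big),
\]
\[
  \text{Yitzhaki:}\quad
  \max_X \; (1-p)\,U\!\big(W - tX\big) + p\,U\!\big(W - tX - F\,t\,(W - X)\big).
\]
% Under decreasing absolute risk aversion the Yitzhaki version predicts that a
% higher tax rate t reduces evasion, the comparative static the review
% describes as empirically problematic.
```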
Abstract:
Using the two-sector model of optimal capital accumulation in an open economy, we determine the impact of a permanent and unanticipated increase in the economy's productivity on the paths of the exchange rate, wages, investment, saving and, therefore, external debt and the capital stock. In general, after a permanent positive productivity shock, saving falls, the current account of the balance of payments deteriorates, and the exchange rate appreciates. From the model's point of view, all of these are intertemporal equilibrium phenomena, a consequence of the rise in permanent income and of the excess demand for domestic goods that follows the productivity gain. Assuming that the stabilization programs raised the economy's productivity, the analytical framework constructed here can qualitatively rationalize the phenomena observed after those plans.
Abstract:
The Rational Agent model has been a foundation for theoretical work in Economics, Management Science, Artificial Intelligence and Game Theory, mainly through the "maximization under constraints" principle, e.g. the Expected Utility models, among them Savage's Subjective Expected Utility (SEU) theory, which remains the most influential framework in the theoretical models we see today, even though many other developments have taken place, including in the field of non-expected utility theories. Whether one keeps the "full rationality" assumption, moves to the less idealistic view of Simon's "bounded rationality", or turns to classical anomaly studies, such as the "heuristics and biases" analysis of Kahneman and Tversky, their Prospect Theory, Thaler's anomalies, and many others, what we can see now is that the Rational Agent model has become an example of "management by exceptions": for each newly presented anomaly, a specific "problem solving" development must follow. This work is a theoretical essay that tries to establish: (1) that the rational model has become a "set of exceptions"; (2) that the current situation is unsustainable, since each identified anomaly requires its own specific solution and the number of anomalies grows every year, making the rational model increasingly difficult to manage; (3) that behaviors judged "irrational" or deviant by the Rational Model are in fact not so; (4) that the time is right for a theory that includes the mental processes used in decision making; and (5) an alternative model, based on findings from cognitive and experimental psychology, such as conscious and unconscious processes, cognition, intuition, analogy-making, abstract roles, and others. Finally, we present conclusions and directions for future research, which call for deeper study of the themes of this work, for mathematical modelling, and for studies of the possible integration of rational analysis and cognitive models.
Abstract:
The aim of this work is to verify whether adjusting the interest rate parity conditions for market expectations (uncovered parity) and for risk premia (covered and uncovered parities) leads to validation of the underlying no-arbitrage relation, or at least to econometric results closer to its validation. To this end, we combine rates of return on domestic and US fixed-income instruments and apply the time-series econometric framework. As a first step, we test interest rate parity (uncovered and covered) in its traditional form. In the next step we apply the econometric tests to the parity conditions adjusted for a risk premium. In the case of uncovered interest parity (UIP), we did not obtain satisfactory results, even after adjusting for risk premia. This adjustment moved the signs of the coefficients in the right direction, but the magnitude of the coefficient on the actual exchange rate devaluation became far out of line with the magnitudes of the other series. Although we found covered interest parity (CIP) to hold in its traditional form, we would not have expected this result, since it would imply that the country risk premium was zero over this period. When CIP is adjusted for the default risk premium, cointegration between the series is no longer obtained; that is, the default risk premium would behave independently of the forward premium and of the interest rate differential. Possible reasons for not obtaining the expected results are a sample period shorter than three years, measurement error in the survey data, or the Central Bank's attempt to control the nominal exchange rate and domestic interest rates simultaneously.
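For reference, the parity conditions under test can be written in standard notation (a sketch, not the thesis's exact equations), with domestic rate i_t, foreign rate i_t*, spot rate S_t, forward rate F_t, log spot rate s_t and a risk premium ρ_t in the adjusted version.

```latex
\[
  \text{CIP:}\quad (1 + i_t) = (1 + i_t^{*})\,\frac{F_t}{S_t},
  \qquad
  \text{UIP:}\quad (1 + i_t) = (1 + i_t^{*})\,\frac{E_t[S_{t+1}]}{S_t},
\]
\[
  \text{risk-adjusted UIP (log form):}\quad
  i_t - i_t^{*} = E_t[\Delta s_{t+1}] + \rho_t .
\]
```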
Abstract:
Employing the two-sector model of capital accumulation in an open economy, we determine the impact of a permanent and non-expected increase in the economy's productivity on the paths of the following variables: the exchange rate, wages, investment, saving, and consequently external debt and the capital stock. After this positive shock, the saving rate decreases, the current account deteriorates, and the exchange rate appreciates. These are equilibrium phenomena from an intertemporal point of view, due to the rise in permanent income and to the excess demand for domestic goods that follows the productivity increase. Assuming that the stabilization programs augmented the economy's productivity, the model can qualitatively rationalize the stylized facts witnessed after those programs.
Abstract:
We study an intertemporal asset pricing model in which a representative consumer maximizes expected utility derived from both the ratio of his consumption to some reference level and from this level itself. If the reference consumption level is assumed to be determined by past consumption levels, the model generalizes the usual habit formation specifications. When the reference level growth rate is made dependent on the market portfolio return and on past consumption growth, the model combines a consumption CAPM with habit formation and the standard CAPM. It therefore provides, in an expected utility framework, a generalization of the non-expected recursive utility model of Epstein and Zin (1989). When we estimate this specification with aggregate per capita consumption, we obtain economically plausible values of the preference parameters, in contrast with the habit formation or Epstein-Zin cases taken separately. All tests performed with various preference specifications confirm that the reference level enters the pricing kernel significantly.
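As a minimal illustration of the ratio-habit ingredient (the paper's specification is richer, since the reference level also enters utility directly and its growth can depend on the market return), an external reference level X_t entering only through the ratio C_t/X_t gives the pricing kernel below; the notation is assumed for this sketch, not quoted from the paper.

```latex
\[
  u(C_t, X_t) = \frac{(C_t/X_t)^{1-\gamma}}{1-\gamma},
  \qquad
  M_{t+1} = \beta\,
  \left(\frac{C_{t+1}}{C_t}\right)^{-\gamma}
  \left(\frac{X_{t+1}}{X_t}\right)^{\gamma-1}.
\]
% With gamma = 0 the kernel collapses to the standard power-utility CCAPM case;
% the dependence of X on past consumption is what produces habit-type dynamics.
```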
Abstract:
In this work I discuss several key aspects of welfare economics and policy analysis and propose two original contributions to the growing field of behavioral public policymaking. After providing a historical perspective on welfare economics and an overview of policy analysis processes in the introductory chapter, in chapter 2 I discuss a debated issue in policymaking, the choice of the social welfare function. I contribute to this debate by proposing an original methodological contribution based on the analysis of the quantitative relationship among different social welfare functional forms commonly used by policy analysts. In chapter 3 I then discuss a behavioral policy to counter indirect tax evasion based on the use of lotteries. I show that the predictions of my model based on non-expected utility are consistent with observed, and so far unexplained, empirical evidence of the policy's success. Finally, in chapter 4 I investigate by means of a laboratory experiment the effects of social influence on the individual likelihood to engage in altruistic punishment. I show that bystanders' decision to engage in punishment is influenced by the punishment behavior of their peers, and I suggest ways to enact behavioral policies that exploit this finding.
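Typical social welfare functional forms compared in such analyses include the utilitarian, Atkinson (iso-elastic) and Rawlsian families; the forms below are standard examples and not necessarily the chapter's exact parameterizations.

```latex
\[
  W_{\text{util}} = \sum_i u_i,
  \qquad
  W_{\text{Atkinson}} = \frac{1}{1-\varepsilon}\sum_i u_i^{\,1-\varepsilon}
  \quad (\varepsilon \neq 1),
  \qquad
  W_{\text{Rawls}} = \min_i u_i .
\]
% The Atkinson family nests the utilitarian case (epsilon = 0) and approaches
% the Rawlsian maximin as epsilon grows large.
```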
Abstract:
Individual analysis of functional Magnetic Resonance Imaging (fMRI) scans requires user adjustment of the statistical threshold in order to maximize true functional activity and eliminate false positives. In this study, we propose a novel technique that uses radiomic texture analysis (TA) features associated with heterogeneity to predict areas of true functional activity. Scans of 15 right-handed healthy volunteers were analyzed using SPM8. The resulting functional maps were thresholded to optimize visualization of language areas, resulting in 116 regions of interest (ROIs). A board-certified neuroradiologist classified the ROIs into Expected (E) and Non-Expected (NE) based on their anatomical locations. TA was performed using the mean Echo-Planar Imaging (EPI) volume, and 20 rotation-invariant texture features were obtained for each ROI. Using forward stepwise logistic regression, we built a predictive model that discriminated between E and NE areas of functional activity, with a cross-validation AUC of 79.84% and a success rate of 80.19% (specificity/sensitivity of 78.34%/82.61%). This study found that radiomic TA of fMRI scans may allow determination of areas of true functional activity and thus eliminate clinician bias.
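A minimal sketch of the modelling step, assuming a hypothetical 116 x 20 feature matrix and binary E/NE labels; this is illustrative scikit-learn code, not the authors' pipeline.

```python
# Forward stepwise logistic regression over radiomic texture features,
# with cross-validated AUC and accuracy (hypothetical, randomly generated data).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.model_selection import cross_val_predict, StratifiedKFold
from sklearn.metrics import roc_auc_score, accuracy_score

rng = np.random.default_rng(0)
X = rng.normal(size=(116, 20))      # 116 ROIs, 20 rotation-invariant features
y = rng.integers(0, 2, size=116)    # 1 = Expected (E), 0 = Non-Expected (NE)

cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
base = LogisticRegression(max_iter=1000)

# Forward stepwise selection of a small feature subset
selector = SequentialFeatureSelector(base, n_features_to_select=5,
                                     direction="forward", scoring="roc_auc", cv=cv)
X_sel = selector.fit_transform(X, y)

# Cross-validated predictions on the selected features
proba = cross_val_predict(base, X_sel, y, cv=cv, method="predict_proba")[:, 1]
print("CV AUC:", roc_auc_score(y, proba))
print("CV accuracy:", accuracy_score(y, (proba >= 0.5).astype(int)))
```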
Abstract:
We present a definition of increasing uncertainty, in which an elementary increase in the uncertainty of any act corresponds to the addition of an 'elementary bet' that increases consumption by a fixed amount in (relatively) 'good' states and decreases consumption by a fixed (and possibly different) amount in (relatively) 'bad' states. This definition naturally gives rise to a dual definition of comparative aversion to uncertainty. We characterize this definition for a popular class of generalized models of choice under uncertainty.
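Schematically, the verbal definition above can be read as follows; the notation (act f, good states G, bad states B, amounts α, β) is assumed for this sketch, not quoted from the paper.

```latex
% Given an act f, a set of (relatively) good states G with complement B of
% (relatively) bad states, and fixed amounts alpha, beta > 0, the act
\[
  g(s) =
  \begin{cases}
    f(s) + \alpha, & s \in G,\\[2pt]
    f(s) - \beta,  & s \in B,
  \end{cases}
\]
% differs from f by an elementary bet and so represents an elementary increase
% in uncertainty relative to f.
```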
Abstract:
The comparative study, based on spectroscopic analysis of the materials used to produce four sixteenth-century Manueline Charters (the Charters of Alcochete, Terena, Alandroal and Évora), was performed following a systematic analytical approach. SEM–EDS, µ-Raman and µ-FTIR analyses highlighted interesting differences between them, namely the use of different pigments and colourants (such as different green and yellow pigments), the presence of pigment alterations, and the use of a non-expected extemporaneous material (the presence of titanium white in the Charter of Alcochete). Principal component analysis restricted to the C–H absorption region (3000–2840 cm⁻¹) was applied to 36 infrared spectra of blue historical samples from the Charters of Alcochete, Terena, Alandroal and Évora, suggesting the use of a mixture of a triglyceride and a polysaccharide as binder.
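A minimal sketch of the PCA step, assuming hypothetical spectra and a hypothetical wavenumber axis; this is illustrative code, not the authors' processing.

```python
# PCA restricted to the C-H stretching window (3000-2840 cm-1) of a set of
# infrared spectra (hypothetical, randomly generated data).
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
wavenumbers = np.arange(4000.0, 650.0, -2.0)     # hypothetical spectral axis (cm-1)
spectra = rng.random((36, wavenumbers.size))     # 36 absorbance spectra (rows)

# Keep only the 3000-2840 cm-1 region
mask = (wavenumbers <= 3000) & (wavenumbers >= 2840)
region = spectra[:, mask]

# Project onto the first two principal components (PCA centers the data internally)
pca = PCA(n_components=2)
scores = pca.fit_transform(region)
print("Explained variance ratios:", pca.explained_variance_ratio_)
print("Scores shape:", scores.shape)
```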
Abstract:
Hospital-acquired infections (HAI) are costly, but many are avoidable. Evaluating prevention programmes requires data on their costs and benefits. Estimating the actual costs of HAI (a measure of the cost savings due to prevention) is difficult because HAI increases cost by extending patient length of stay, yet length of stay is itself a major risk factor for HAI. This endogeneity bias can confound attempts to measure the cost of HAI accurately. We propose a two-stage instrumental variables estimation strategy that explicitly controls for the endogeneity between risk of HAI and length of stay. We find that a 10% reduction in the ex ante risk of HAI results in an expected saving of £693 (US$984).
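A minimal sketch of a generic two-stage (2SLS) estimation of the kind described, with hypothetical variable names and simulated data rather than the authors' dataset or exact specification.

```python
# Manual two-stage least squares: an endogenous regressor (here, HAI risk) is
# first projected on an instrument, and the fitted values are used in the
# second-stage cost regression (hypothetical, simulated data).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 1000
z = rng.normal(size=n)                               # instrument
u = rng.normal(size=n)                               # unobserved confounder (e.g. severity)
hai_risk = 0.5 * z + 0.5 * u + rng.normal(size=n)    # endogenous regressor
cost = 1000 + 800 * hai_risk + 600 * u + rng.normal(scale=100, size=n)

# Stage 1: project the endogenous regressor on the instrument
stage1 = sm.OLS(hai_risk, sm.add_constant(z)).fit()
hai_hat = stage1.fittedvalues

# Stage 2: regress cost on the first-stage fitted values
stage2 = sm.OLS(cost, sm.add_constant(hai_hat)).fit()
print("2SLS coefficient on HAI risk:", stage2.params[1])
# Note: for valid inference the second-stage standard errors need the usual
# 2SLS correction, or a dedicated IV estimator should be used.
```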