982 results for Utility-functions
Abstract:
The efficient set of a Multicriteria Decision-Making problem plays a fundamental role in the solution process, since the Decision Maker's most preferred choice should lie in this set. However, computing that set may be difficult, especially in continuous and/or nonlinear problems. Chapter one introduces Multicriteria Decision-Making and reviews the basic concepts and tools used in later developments.
Chapter two studies decision-making problems under certainty. The basic tool and starting point is the vector value function, which represents imprecision in the DM's preferences. We propose a characterization of the value efficient set and different approximations with nesting and convergence properties. Several interactive solution algorithms complement the theoretical results. Chapter three is devoted to problems under uncertainty. Its development is partially parallel to the former and uses vector utility functions to model the DM's preferences. Starting from simple distributions, we introduce utility efficiency, its characterization and approximations, which we then extend to discrete and continuous classes of distributions. Chapter four studies the problem under fuzziness, at an exploratory level. We conclude by suggesting several open problems.
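As a minimal illustration of the efficient set this abstract centers on (not the thesis's own algorithms, which handle continuous and nonlinear problems), the following sketch filters the non-dominated alternatives of a finite multicriteria problem, assuming every criterion is to be maximized:

```python
def is_dominated(x, y):
    """True if alternative y dominates x: y is at least as good on every
    criterion and strictly better on at least one (maximization)."""
    return (all(yi >= xi for xi, yi in zip(x, y))
            and any(yi > xi for xi, yi in zip(x, y)))

def efficient_set(alternatives):
    """Return the alternatives not dominated by any other alternative."""
    return [x for x in alternatives
            if not any(is_dominated(x, y) for y in alternatives if y != x)]

# Five alternatives scored on two criteria; (2, 2) is dominated by (2, 4).
points = [(1, 5), (2, 4), (3, 3), (2, 2), (4, 1)]
print(efficient_set(points))  # [(1, 5), (2, 4), (3, 3), (4, 1)]
```

The DM's most preferred choice is then sought inside this reduced set rather than among all alternatives.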
Abstract:
The thesis examines one of the most important aspects of managing the information society: understanding how a person values any given situation. This matters both for the individual performing the assessment and for the environment with which he or she interacts. An assessment is the result of comparison: identical values are assigned to similar alternatives, and higher values to alternatives that come out better in the comparison process. The patterns that guide an individual in making the comparison derive from his or her individual preferences (that is, opinions). The thesis presents several procedures for establishing a person's preference relations between alternatives. The assessment progresses until a numerical representation of those preferences is obtained. When the representation of preferences is homogeneous, it also allows personal preferences to be contrasted with those of other evaluators, which supports policy evaluation, the transfer of information between individuals, and the design of the alternative that best fits the identified preferences. With this information one can also build communities of people who share the same system of preferences on a particular issue. The thesis presents a case study applying this methodology: the optimization of labour policies in a real market. To support jobseekers (entering or returning to the labour market, or changing their field of activity), it is necessary to know their preferences over the occupations they are willing to perform. Moreover, for labour mediation to be effective, the occupations sought must be offered by the labour market, and the applicant must meet the conditions for access to those occupations.
Further development of these models leads to the procedures used to transform multiple preferences into an aggregate decision, considering both the opinion of each individual involved and their social interactions, all aimed at generating a solution that fits the point of view of the whole population as closely as possible. Decisions with multiple participants mainly concern: widening the scope to include people who have traditionally been left out of decision-making; aggregating the preferences of the multiple participants in collective decisions (by voting, through applications developed for the Web 2.0, and via interpersonal comparisons of utility); and, finally, self-organization, letting the participants in the assessment interact so that the final result is better than the mere aggregation of individual opinions. The thesis analyzes the e-democracy systems, and tools for implementing them, that are currently most widely used or most advanced. They are closely related to the Web 2.0, and their adoption is driving an evolution of present-day democracy. Collaborative Decision-Making (CDM) software applications, which help give sense and meaning to data, are also studied. These applications aim to coordinate the functions and features needed to reach timely collective decisions, allowing all stakeholders to participate in the process.
The thesis concludes by presenting a new model, or paradigm, for decision-making with multiple participants. The development rests on the calculation of empathic utility functions. It seeks collaboration between individuals to make decision-making more effective, and it also aims to increase the number of people involved. It studies the interactions and feedback among citizens, since the influence of citizens on one another is fundamental to collective decision-making and e-democracy processes. It also includes methods for detecting when the process has stalled and should be stopped. The model is applied to a municipal consultation on whether to introduce bike lanes and what characteristics they should have; the voting and the interaction among voters are simulated.
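To make the aggregation step concrete: a classical way to turn individual preference rankings into a collective decision is the Borda count. This is only an illustrative baseline, not the thesis's empathic-utility model, and the bike-lane alternatives below are hypothetical:

```python
from collections import defaultdict

def borda_aggregate(rankings):
    """Aggregate individual rankings into a collective order via Borda
    scores: an alternative in position p of an n-item ballot earns
    n - p - 1 points from that voter."""
    scores = defaultdict(int)
    for ranking in rankings:
        n = len(ranking)
        for pos, alt in enumerate(ranking):
            scores[alt] += n - pos - 1
    # Sort by descending score; break ties alphabetically.
    return sorted(scores, key=lambda a: (-scores[a], a))

# Three voters ranking bike-lane designs from most to least preferred.
ballots = [["protected", "painted", "none"],
           ["painted", "protected", "none"],
           ["protected", "none", "painted"]]
print(borda_aggregate(ballots))  # ['protected', 'painted', 'none']
```

Models like the one in the thesis go beyond such one-shot aggregation by letting voters interact and revise their opinions before the final tally.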
Abstract:
Cover crop selection should be oriented to the achievement of specific agrosystem benefits. The cover crop, catch crop, green manure and fodder uses were identified as possible targets for selection. The objective was to apply multi-criteria decision analysis to evaluate different species (Hordeum vulgare L., Secale cereale L., ×Triticosecale Wittm., Sinapis alba L., Vicia sativa L.) and cultivars according to their suitability to be used as cover crops in each of the uses. A field trial with 20 cultivars of the five species was conducted in Central Spain during two seasons (October–April). Measurements of ground cover, crop biomass, N uptake, N derived from the atmosphere, C/N, dietary fiber content and residue quality were collected. Aggregation of these variables through utility functions allowed ranking species and cultivars for each usage. Grasses were the most suitable for the cover crop, catch crop and fodder uses, while the vetches were the best as green manures. The mustard attained high ranks as cover and catch crop the first season, but decayed the second due to low performance in cold winters. Mustard and vetches obtained worse rankings than grasses as fodder. Hispanic was the most suitable barley cultivar as cover and catch crop, and Albacete as fodder. The triticale Titania attained the highest rank as cover and catch crop and as fodder. Vetches Aitana and BGE014897 showed good aptitudes as green manures and catch crops. This analysis allowed comparison among species and cultivars and might provide relevant information for cover crop selection and management.
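The aggregation described above can be sketched as an additive multi-attribute utility model. The variables, utility shapes, weights and measurement values below are hypothetical placeholders, not the study's actual functions or data:

```python
def rank_cultivars(measurements, utilities, weights):
    """Score each cultivar by a weighted sum of per-variable utilities
    and return the cultivars ranked best-first."""
    scores = {}
    for cultivar, values in measurements.items():
        scores[cultivar] = sum(weights[var] * utilities[var](values[var])
                               for var in weights)
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical data: ground cover (%) and N uptake (kg/ha) for three cultivars.
data = {"Hispanic": {"cover": 85, "n_uptake": 60},
        "Albacete": {"cover": 70, "n_uptake": 55},
        "Titania":  {"cover": 90, "n_uptake": 70}}
utils = {"cover": lambda v: v / 100,       # linear utility on [0, 100] %
         "n_uptake": lambda v: v / 80}     # linear utility up to 80 kg/ha
w = {"cover": 0.6, "n_uptake": 0.4}        # illustrative "catch crop" weights
print(rank_cultivars(data, utils, w))      # ['Titania', 'Hispanic', 'Albacete']
```

Changing the weight vector per usage (cover crop, catch crop, green manure, fodder) yields a separate ranking for each target, as in the study.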
Abstract:
Individuals exchange contracts for the delivery of commodities in competitive markets and, simultaneously, act strategically; actions affect utilities across individuals directly or through the payoffs of contracts. This encompasses economies with asymmetric information. Nash–Walras equilibria exist for large economies, even if utility functions are not quasi-concave and choice sets are not convex, which is the case in standard settings; the separation of the purchase from the sale of contracts and the pooling of the deliveries on contracts guarantee that the markets for commodities clear.
Abstract:
In this paper we study the Debreu Gap Lemma and its generalizations to totally ordered sets more general than (ℝ, ≤). We explain why it is important in economics to study utility functions which may not be real-valued, and we build the foundations of a theory of continuity for such generalized utility functions. (C) 2004 Published by Elsevier B.V.
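For context, the real-valued form of the lemma is usually stated as follows (a standard textbook formulation; the paper's generalizations to other totally ordered sets are more involved):

```latex
% Debreu's Open Gap Lemma (standard real-valued form).
% A gap of a set $T \subseteq \mathbb{R}$ is a maximal nondegenerate
% interval of $\mathbb{R} \setminus T$ bounded above and below by points of $T$.
\begin{quote}
For every subset $S \subseteq \mathbb{R}$ there exists a strictly increasing
function $g \colon S \to \mathbb{R}$ such that every gap of $g(S)$ is open.
\end{quote}
% Consequence: if $u$ is any utility representation of a preference order,
% composing it with such a $g$ yields a continuous representation $g \circ u$.
```

This is the step that turns mere representability of preferences into continuous representability, which is why the lemma matters for utility theory.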
Abstract:
Many automated negotiation models have been developed to resolve conflicts in distributed computational systems. However, the problem of finding win-win outcomes in multiattribute negotiation has not been tackled well. To address this issue, based on an evolutionary method of multiobjective optimization, this paper presents a negotiation model that can find win-win solutions over multiple attributes without requiring negotiating agents to reveal their private utility functions to their opponents or to a third-party mediator. Moreover, we equip our agents with a general type of utility function over interdependent attributes, which captures human intuition well. In addition, we develop a novel time-dependent concession strategy model that helps both sides settle on a final agreement among the set of win-win candidates. Finally, extensive experiments confirm that our negotiation model outperforms recently developed models, and show that it is stable and efficient at finding fair win-win outcomes, a problem seldom solved by existing models. © 2012 Wiley Periodicals, Inc.
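For orientation, the classical family of time-dependent concession tactics (Faratin et al.'s polynomial form, which this paper's novel strategy builds beyond) can be sketched as follows; the parameter names are the conventional ones, not the paper's:

```python
def concession_utility(t, t_max, u_min=0.0, u_max=1.0, beta=1.0):
    """Target utility an agent demands at time t under the standard
    polynomial time-dependent tactic: beta < 1 concedes late (Boulware),
    beta > 1 concedes early (Conceder), beta = 1 concedes linearly."""
    alpha = (min(t, t_max) / t_max) ** (1.0 / beta)
    return u_min + (1.0 - alpha) * (u_max - u_min)

# A Boulware agent (beta = 0.2) still demands ~0.969 at half the deadline,
# while a linear agent would already be down to 0.5.
print(round(concession_utility(5, 10, beta=0.2), 3))
print(round(concession_utility(5, 10, beta=1.0), 3))
```

At the deadline (t = t_max) every tactic in this family reaches its reservation utility u_min.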
Abstract:
The author briefly sums up the main concepts and problems related to the pricing of derivative products. The theory of derivative pricing exploits the redundancy among the products on a market to determine the relative prices of individual products. This, however, can be done only on a complete market, and so only with a complete market is it possible to omit the concept of utility functions from the theory and the practice built upon it; for that reason the principle of risk-neutral pricing is misleading. Put another way, the theory of derivative products can free itself from the concept of the utility function only at the price of imposing restrictions on the market structure that do not hold in reality. Emphasizing this is essential both in market practice and in teaching.
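A minimal sketch of the complete-market case the abstract refers to, where pricing indeed needs no utility function: in the one-period binomial model the claim is replicated exactly, so a risk-neutral probability prices it. The numbers are illustrative:

```python
def binomial_call_price(s0, k, u, d, r):
    """One-period binomial price of a European call with strike k.
    The market (stock moving to s0*u or s0*d, riskless rate r) is
    complete, so the risk-neutral probability q = (1 + r - d) / (u - d)
    prices the claim with no reference to any utility function."""
    q = (1 + r - d) / (u - d)
    payoff_up = max(s0 * u - k, 0.0)
    payoff_down = max(s0 * d - k, 0.0)
    return (q * payoff_up + (1 - q) * payoff_down) / (1 + r)

# Stock at 100 moves to 120 or 90; riskless rate 5%.
print(round(binomial_call_price(100, 100, 1.2, 0.9, 0.05), 4))  # 9.5238
```

The author's point is precisely that once the market is incomplete, this utility-free construction breaks down and preferences re-enter the pricing problem.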
Abstract:
This article examines how risk and the size of the customer base jointly affect the price of a product, comparing two kinds of market: an insurance market and a goods market. The most important difference between them is that in a goods market the seller's only risk is whether the product can be sold, whereas in an insurance market the seller continues to face risk after the product has been sold. The article shows that growth of the customer base can have opposite effects on the product's price in goods markets and in insurance markets. The study takes an economic approach to modeling insurance markets, focusing on a monopolistic market where one insurance company sells a product with predetermined benefits for the customers. The behavior of the company and of the insureds is outlined with utility functions, and the problem of policy pricing is investigated, with analytic tools, in relation to the number of clients the company acquires.
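As a sketch of how utility functions enter insurance pricing (an illustrative textbook calculation, not the article's own model): the maximum premium a risk-averse insured will pay solves an expected-utility indifference condition. Assuming CARA utility u(w) = -exp(-a·w), initial wealth cancels out:

```python
import math

def max_premium_cara(a, loss, p):
    """Maximum premium P a CARA insured (u(w) = -exp(-a*w)) will pay for
    full coverage of a loss occurring with probability p. Solves
    u(w0 - P) = p*u(w0 - loss) + (1 - p)*u(w0); w0 cancels for CARA,
    giving P = log(p*exp(a*loss) + 1 - p) / a."""
    return math.log(p * math.exp(a * loss) + (1 - p)) / a

# Risk aversion a = 0.01, possible loss 100 with probability 0.1.
premium = max_premium_cara(0.01, 100, 0.1)
print(round(premium, 2))  # exceeds the expected loss of 10
```

The gap between this reservation premium and the expected loss is the risk loading the monopolist can extract, which is where client numbers start to matter.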
Abstract:
This research sought to determine the implications of a non-traded differentiated commodity, produced with increasing returns to scale, for the welfare of countries that allowed free international migration. We developed two- and three-country Ricardian models in which labor was the only factor of production. The countries traded freely in homogeneous goods produced with constant returns to scale. Each also had a non-traded differentiated good sector in which production used increasing-returns-to-scale technology. We then allowed free international migration between two of the countries and observed what happened to welfare in both, as indicated by their per capita utilities in the new equilibrium relative to their pre-migration utilities. Preferences of consumers were represented by a two-tier utility function [Dixit and Stiglitz 1977]. As migration took place it affected utility in two ways. The expanding country enjoyed the positive effect of increased product diversity in the non-traded good sector, but it also suffered adverse terms of trade as its production cost declined. The converse was true for the contracting country. To determine the net impact on welfare we derived the countries' indirect per capita utility functions algebraically and graphically, then juxtaposed the graphs of the utility functions to obtain possible general equilibria, which we used to observe the welfare outcomes. We found that the most likely outcomes were either that both countries gained, or that one country lost while the other gained. We were, however, able to generate cases in which both countries lost as a result of allowing free inter-country migration. This was most likely to happen when the shares of income spent on each country's export good differed significantly.
In the three-country world, when we allowed two of the countries to engage in preferential trading arrangements while imposing a prohibitive tariff on imports from the third country, the welfare of the partner countries declined. When inter-union migration was permitted, welfare declined even further. We showed that this was due to the presence of the non-traded good sector.
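The two-tier utility function of Dixit and Stiglitz cited above is commonly written in the following CES form (a standard presentation; the dissertation's exact specification may differ). Here $x_0$ is the homogeneous traded good, $x_1,\dots,x_n$ are the varieties of the non-traded differentiated good, $\rho$ governs substitutability between varieties, and $s$ is the expenditure share of the differentiated sector:

```latex
U \;=\; x_0^{\,1-s}\,
        \Bigl(\sum_{i=1}^{n} x_i^{\rho}\Bigr)^{\! s/\rho},
\qquad 0 < \rho < 1,\quad 0 < s < 1 .
```

The "love of variety" driving the migration results is visible here: under symmetric consumption $x_i = x/n$ the inner tier equals $n^{(1-\rho)/\rho}\,x$, which is increasing in the number of varieties $n$, so an expanding non-traded sector raises utility directly.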
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08
Abstract:
The central motif of this work is prediction and optimization in the presence of multiple interacting intelligent agents. We use the phrase 'intelligent agents' to imply, in some sense, a 'bounded rationality', the exact meaning of which varies with the setting. Our agents may not be 'rational' in the classical game-theoretic sense, in that they don't always optimize a global objective; rather, they rely on heuristics, as is natural for human agents or even software agents operating in the real world. Within this broad framework we study the problem of influence maximization in social networks, where the behavior of agents is myopic but complication stems from the structure of the interaction networks. In this setting we generalize two well-known models and give new algorithms and hardness results for our models. We then move on to models where the agents reason strategically but face considerable uncertainty. For such games, we give a new solution concept and analyze a real-world game using our techniques. Finally, the richest model we consider is Network Cournot Competition, which deals with strategic resource allocation in hypergraphs, where agents reason strategically and their interaction is specified indirectly via the players' utility functions. For this model, we give the first equilibrium computability results. In all of the above problems, we assume that the payoffs for the agents are known. For real-world games, however, obtaining the payoffs can be quite challenging. To this end, we also study the inverse problem of inferring payoffs from game history. We propose and evaluate a data-analytic framework and show that it is fast and performant.
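For readers unfamiliar with influence maximization, the baseline approach the literature builds on (and which theses like this one generalize) is greedy seed selection against a spread function. The toy graph and coverage-style spread below are illustrative only:

```python
def greedy_influence_max(nodes, spread, k):
    """Standard greedy seed selection for influence maximization:
    repeatedly add the node with the largest marginal gain in spread.
    `spread(S)` estimates the influence of seed set S; when spread is
    monotone submodular, this greedy is a (1 - 1/e)-approximation."""
    seeds = set()
    for _ in range(k):
        best = max((n for n in nodes if n not in seeds),
                   key=lambda n: spread(seeds | {n}) - spread(seeds))
        seeds.add(best)
    return seeds

# Toy submodular spread: each seed "influences" itself and its neighbors.
graph = {0: {1, 2}, 1: {0}, 2: {0, 3}, 3: {2}, 4: set()}
cover = lambda s: len(set().union(*({n} | graph[n] for n in s)))
print(greedy_influence_max(graph, cover, 2))  # {0, 2}
```

Real spread functions (e.g. under the independent cascade model) are estimated by simulation, but the greedy skeleton is the same.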
Abstract:
Double Degree
Abstract:
We provide a nonparametric 'revealed preference' characterization of rational household behavior in terms of the collective consumption model, while accounting for general (possibly non-convex) individual preferences. We establish a Collective Axiom of Revealed Preference (CARP), which provides a necessary and sufficient condition for data consistency with collective rationality. Our main result takes the form of a 'collective' version of the Afriat Theorem for rational behavior in terms of the unitary model. This theorem has some interesting implications. With only a finite set of observations, the nature of consumption externalities (positive or negative) in the intra-household allocation process is non-testable. The same non-testability conclusion holds for privateness (with or without externalities) or publicness of consumption. By contrast, concavity of individual utility functions (representing convex preferences) turns out to be testable. In addition, monotonicity is testable for the model that assumes all household consumption is public.
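For intuition, the unitary-model counterpart of such a test is the Generalized Axiom of Revealed Preference (GARP), which by Afriat's Theorem characterizes consistency with utility maximization. The sketch below checks GARP on finite price/quantity data; the collective axiom (CARP) of this paper is more involved:

```python
from itertools import product

def satisfies_garp(prices, quantities):
    """Check GARP on observed (price, quantity) data: build the direct
    revealed-preference relation, take its transitive closure, and look
    for i revealed preferred to j while j is strictly directly revealed
    preferred to i."""
    n = len(prices)
    dot = lambda p, q: sum(a * b for a, b in zip(p, q))
    # Direct relation: i revealed preferred to j if, at prices p_i, the
    # chosen bundle q_i cost at least as much as bundle q_j.
    R = [[dot(prices[i], quantities[i]) >= dot(prices[i], quantities[j])
          for j in range(n)] for i in range(n)]
    # Warshall transitive closure.
    for k, i, j in product(range(n), repeat=3):
        R[i][j] = R[i][j] or (R[i][k] and R[k][j])
    # Violation: i R j while p_j.q_j > p_j.q_i (j strictly preferred q_j).
    return not any(R[i][j] and dot(prices[j], quantities[j]) > dot(prices[j], quantities[i])
                   for i in range(n) for j in range(n))

# Two observations with no revealed-preference cycle: consistent.
print(satisfies_garp([(1, 2), (2, 1)], [(4, 1), (1, 4)]))  # True
```

Collective rationality tests replace this single relation with conditions on a feasible decomposition of household consumption across members.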
Abstract:
Multivariate volatility forecasts are an important input to many financial applications, in particular portfolio optimisation problems. Given the number of models available and the range of loss functions for discriminating between them, selecting the optimal forecasting model is clearly challenging. The aim of this thesis is to investigate thoroughly how effective several commonly used statistical (MSE and QLIKE) and economic (portfolio variance and portfolio utility) loss functions are at discriminating between competing multivariate volatility forecasts. An analytical investigation of the loss functions is performed to determine whether they identify the correct forecast as the best forecast. This is followed by an extensive simulation study that examines the ability of the loss functions to rank forecasts consistently, and their statistical power within tests of predictive ability. For the tests of predictive ability, the model confidence set (MCS) approach of Hansen, Lunde and Nason (2003, 2011) is employed. An empirical study then investigates whether the simulation findings hold in a realistic setting. In light of these earlier studies, a major empirical study seeks to identify the set of superior multivariate volatility forecasting models from 43 models that use either daily squared returns or realised volatility to generate forecasts. This study also assesses how the choice of volatility proxy affects the ability of the statistical loss functions to discriminate between forecasts. Analysis of the loss functions shows that QLIKE, MSE and portfolio variance can discriminate between multivariate volatility forecasts, while portfolio utility cannot. An examination of the effective loss functions shows that all of them can identify the correct forecast at a point in time; however, their ability to discriminate between competing forecasts varies: QLIKE is identified as the most effective loss function, followed by portfolio variance and then MSE. The major empirical analysis reports that the optimal set of multivariate volatility forecasting models includes forecasts generated from both daily squared returns and realised volatility. Furthermore, it finds that the volatility proxy affects the statistical loss functions' ability to discriminate between forecasts in tests of predictive ability. These findings deepen our understanding of how to choose between competing multivariate volatility forecasts.
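The two statistical losses compared above are easy to state in their univariate forms (shown here for intuition; the thesis works with multivariate versions). QLIKE's asymmetry, which underlies its superior discriminating power, is visible in a two-line example:

```python
import math

def mse_loss(proxy, forecast):
    """Squared-error loss between a volatility proxy and a variance forecast."""
    return (proxy - forecast) ** 2

def qlike_loss(proxy, forecast):
    """QLIKE loss in its standard univariate form:
    L = proxy/forecast - log(proxy/forecast) - 1.
    It is minimized at forecast = proxy and penalizes under-prediction
    of variance far more heavily than over-prediction."""
    ratio = proxy / forecast
    return ratio - math.log(ratio) - 1

# True variance 4.0: compare an under-forecast (2.0) and an over-forecast (6.0).
print(qlike_loss(4.0, 2.0), qlike_loss(4.0, 6.0))  # under-forecast costs more
print(mse_loss(4.0, 2.0), mse_loss(4.0, 6.0))      # MSE treats both the same
```

Because volatility proxies such as squared returns are noisy, this asymmetry (together with QLIKE's robustness to proxy noise) helps explain its stronger ranking performance.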
Abstract:
Currently, mass spectrometry-based metabolomics studies extend beyond conventional chemical categorization and metabolic phenotype analysis to understanding gene function in various biological contexts (e.g., mammalian, plant, and microbial). These novel utilities have led to many innovative discoveries in the following areas: disease pathogenesis, therapeutic pathway or target identification, the biochemistry of animal and plant physiological and pathological activities in response to diverse stimuli, and molecular signatures of host-pathogen interactions during microbial infection. In this review, we critically evaluate representative applications of mass spectrometry-based metabolomics to better understand gene function in diverse biological contexts, with special emphasis on working principles, study protocols, and possible future development of the technique. Collectively, this review raises awareness within the biomedical community of the scientific value and applicability of mass spectrometry-based metabolomics strategies for understanding gene function, thus advancing this application's utility in a broad range of biological fields.