333 results for Discount airfare


Relevance: 10.00%

Abstract:

To be published in: Revista Internacional de Sociología (2011), Special Issue on Experimental and Behavioral Economics.

Relevance: 10.00%

Abstract:

The purpose of this article is to characterize dynamic optimal harvesting trajectories that maximize discounted utility, assuming an age-structured population model, along the lines of Tahvonen (2009). The main novelty of our study is that it uses, as the age-structured population model, the standard stochastic cohort framework applied in Virtual Population Analysis for fish stock assessment. This allows us to compare optimal harvesting in a discounted economic context with the standard reference points used by fisheries agencies for long-term management plans (e.g. Fmsy). Our main findings are the following. First, the optimal steady state is characterized, and sufficient conditions that guarantee its existence and uniqueness for the general case of n cohorts are shown. It is also proved that the optimal steady state coincides with the traditional target Fmsy when the utility function to be maximized is the yield and the discount rate is zero. Second, an algorithm to calculate the optimal path that easily drives the resource to the steady state is developed. Third, the algorithm is applied to the Northern Stock of hake. Results show that management plans based exclusively on traditional reference targets such as Fmsy may leave the fishery's economic results far from the optimum.
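
As a toy illustration of the objective described above, the sketch below computes the discounted yield from fishing a single cohort at a constant mortality rate F via the Baranov catch equation, then grid-searches for the maximizing F. All parameters (natural mortality M, weights-at-age, recruitment, discount rate) are illustrative assumptions, and a single cohort stands in for the paper's full multi-cohort VPA framework.

```python
import math

def discounted_yield(F, M=0.2, weights=(0.5, 1.0, 1.5, 2.0),
                     recruits=1000.0, rate=0.05):
    """Discounted yield from one cohort fished at constant mortality F.
    Catch-at-age follows the Baranov equation C = (F/Z) * N * (1 - e^-Z),
    with total mortality Z = F + M; illustrative parameters only."""
    N, total = recruits, 0.0
    Z = F + M
    for age, w in enumerate(weights):
        catch = (F / Z) * N * (1.0 - math.exp(-Z))
        total += w * catch / (1.0 + rate) ** age   # discount by age (= time)
        N *= math.exp(-Z)                          # survivors to next age
    return total

# Grid search for the F that maximizes discounted yield; with rate = 0
# this F plays the role of an Fmsy-style yield-maximizing target.
best_F = max((f / 100 for f in range(1, 201)), key=discounted_yield)
```

With a positive discount rate the maximizer generally differs from the zero-rate, yield-maximizing target — the wedge between economic optima and biological reference points that the abstract highlights.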

Relevance: 10.00%

Abstract:

The paper makes two major contributions to the theory of repeated games. First, we build a supergame oligopoly model in which firms compete in supply functions, and we show how collusion sustainability is affected by the presence of a convex cost function, the magnitude of the slope of market demand, and the number of rivals. We then compare the results with those of the traditional Cournot reversion under the same structural characteristics. We find that, depending on the number of firms and the slope of the linear demand, collusion sustainability is easier under supply-function competition than under Cournot competition. The conclusions of the models are simulated with data from the Spanish wholesale electricity market to predict lower bounds on the discount factors.
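
The "lower bounds of the discount factors" mentioned above come from the standard repeated-game sustainability condition. A minimal sketch, using the textbook grim-trigger benchmark with illustrative Cournot-duopoly numbers rather than the paper's supply-function model:

```python
def critical_discount_factor(pi_collusive, pi_deviation, pi_punishment):
    """Grim-trigger condition: collusion is sustainable iff the common
    discount factor delta satisfies
        delta >= (pi_deviation - pi_collusive) / (pi_deviation - pi_punishment)."""
    return (pi_deviation - pi_collusive) / (pi_deviation - pi_punishment)

# Textbook linear-demand duopoly (P = 1 - Q, zero cost): each firm earns
# 1/8 colluding, 9/64 by deviating, and 1/9 in the Cournot punishment phase.
delta_min = critical_discount_factor(1/8, 9/64, 1/9)  # = 9/17, about 0.53
```

Convex costs, the demand slope, and the number of rivals enter through the three profit levels, which is how the structural features listed in the abstract shift the bound.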

Relevance: 10.00%

Abstract:

I consider cooperation situations where players have network relations. Networks evolve according to a stationary transition probability matrix, and at each moment in time players receive payoffs from a stationary allocation rule. Players discount the future by a common factor. The pair formed by an allocation rule and a transition probability matrix is called expected fair if, for every link in the network, both participants gain, marginally and in discounted expected terms, the same from it; and it is called a pairwise network formation procedure if the probability that a link is created (or eliminated) is positive if the discounted expected gains to its two participants are positive too. The main result is the existence, for sufficiently small discount factors, of an expected fair and pairwise network formation procedure where the allocation rule is component balanced, meaning it distributes the total value of any maximal connected subnetwork among its participants. This existence result holds for all discount factors when the pairwise network formation procedure is restricted. Finally, I provide a comparison with previous models of farsighted network formation.

Relevance: 10.00%

Abstract:

We consider cooperation situations where players have network relations. Networks evolve according to a stationary transition probability matrix, and at each moment in time players receive payoffs from a stationary allocation rule. Players discount the future by a common factor. The pair formed by an allocation rule and a transition probability matrix is called a forward-looking network formation scheme if, first, the probability that a link is created is positive if the discounted, expected gains to its two participants are positive, and if, second, the probability that a link is eliminated is positive if the discounted, expected gains to at least one of its two participants are positive. The main result is the existence, for all discount factors and all value functions, of a forward-looking network formation scheme. Furthermore, we can always find a forward-looking network formation scheme such that (i) the allocation rule is component balanced and (ii) the transition probabilities increase in the difference in payoffs for the corresponding players responsible for the transition. We use this dynamic solution concept to explore the tension between efficiency and stability.

Relevance: 10.00%

Abstract:

This thesis belongs to the growing field of economic networks. In particular, we develop three essays in which we study the problem of bargaining, discrete choice representation, and pricing in the context of networked markets. Despite analyzing very different problems, the three essays share the common feature of making use of a network representation to describe the market of interest.

In Chapter 1 we present an analysis of bargaining in networked markets. We make two contributions. First, we characterize market equilibria in a bargaining model and find that players' equilibrium payoffs coincide with their degree of centrality in the network, as measured by Bonacich's centrality measure. This characterization allows us to map, in a simple way, network structures into market equilibrium outcomes, so that payoff dispersion in networked markets is driven by players' network positions. Second, we show that the market equilibrium for our model converges to the so-called eigenvector centrality measure. We show that the economic condition for reaching convergence is that the players' discount factor goes to one. In particular, we show how the discount factor, the matching technology, and the network structure interact in a very particular way so that eigenvector centrality emerges as the limiting case of our market equilibrium.

We point out that the eigenvector approach is a way of finding the most central or relevant players in terms of the "global" structure of the network, paying less attention to patterns that are more "local". Mathematically, eigenvector centrality captures the relevance of players in the bargaining process using the eigenvector associated with the largest eigenvalue of the adjacency matrix of a given network. Thus our result may be viewed as an economic justification of the eigenvector approach in the context of bargaining in networked markets.
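
A minimal sketch of the eigenvector centrality measure discussed above, computed by power iteration; the toy network and parameters are assumptions for illustration, not taken from the thesis.

```python
import numpy as np

def eigenvector_centrality(A, iters=1000, tol=1e-10):
    """Power iteration: converges to the eigenvector of the adjacency
    matrix A associated with its largest eigenvalue (Perron vector)."""
    x = np.ones(A.shape[0]) / np.sqrt(A.shape[0])
    for _ in range(iters):
        x_new = A @ x
        x_new /= np.linalg.norm(x_new)   # renormalize each step
        if np.linalg.norm(x_new - x) < tol:
            break
        x = x_new
    return x_new

# Toy network: nodes 0, 1, 2 form a triangle and node 3 hangs off node 2,
# so node 2 is the most central and node 3 the least.
A = np.array([[0., 1., 1., 0.],
              [1., 0., 1., 0.],
              [1., 1., 0., 1.],
              [0., 0., 1., 0.]])
c = eigenvector_centrality(A)
```

The returned vector is normalized to unit length; rankings, not magnitudes, are what the centrality interpretation uses.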

As an application, we analyze the special case of seller-buyer networks, showing how our framework may be useful for analyzing price dispersion as a function of sellers and buyers' network positions.

Finally, in Chapter 3 we study the problem of price competition and free entry in networked markets subject to congestion effects. In many environments, such as communication networks in which network flows are allocated, or transportation networks in which traffic is directed through the underlying road architecture, congestion plays an important role. In particular, we consider a network with multiple origins and a common destination node, where each link is owned by a firm that sets prices in order to maximize profits, while users want to minimize the total cost they face, which is the congestion cost plus the prices set by firms. In this environment, we introduce the notion of Markovian traffic equilibrium to establish the existence and uniqueness of a pure-strategy price equilibrium, without assuming that the demand functions are concave or imposing particular functional forms on the latency functions. We derive explicit conditions that guarantee the existence and uniqueness of equilibria. Given this result, we apply our framework to study entry decisions and welfare, and establish that in congested markets with free entry, the number of firms exceeds the social optimum.
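
The user-side equilibrium condition described above — users minimize congestion cost plus price — can be illustrated in a much simpler static setting than the chapter's Markovian traffic equilibrium: a two-link Wardrop split with assumed linear latency functions.

```python
def wardrop_split(p1, p2, a1=1.0, a2=1.0, demand=1.0):
    """Split demand across two parallel links so that the generalized
    cost (price + linear congestion cost a_i * x_i) is equalized:
        p1 + a1*x1 = p2 + a2*(demand - x1).
    Illustrative latency functions, not the chapter's model."""
    x1 = (p2 - p1 + a2 * demand) / (a1 + a2)
    x1 = min(max(x1, 0.0), demand)   # clamp to handle corner solutions
    return x1, demand - x1

# The cheaper link attracts more traffic until congestion equalizes costs.
x1, x2 = wardrop_split(p1=0.2, p2=0.4)
```

Firms' pricing then sits on top of this user response: each firm anticipates how its price shifts the split, which is the tension the chapter's equilibrium notion formalizes.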

Relevance: 10.00%

Abstract:

In the quest for a descriptive theory of decision-making, the rational actor model in economics imposes rather unrealistic expectations and abilities on human decision makers. The further we move from idealized scenarios, such as perfectly competitive markets, and ambitiously extend the reach of the theory to describe everyday decision-making situations, the less sense these assumptions make. Behavioural economics has instead proposed models based on assumptions that are more psychologically realistic, with the aim of gaining more precision and descriptive power. Increased psychological realism, however, comes at the cost of a greater number of parameters and model complexity. There is now a plethora of models, based on different assumptions and applicable in different contextual settings, and selecting the right model to use tends to be an ad hoc process. In this thesis, we develop optimal experimental design methods and evaluate different behavioural theories against evidence from lab and field experiments.

We look at evidence from controlled laboratory experiments. Subjects are presented with choices between monetary gambles, or lotteries. Different decision-making theories evaluate the choices differently and make distinct predictions about the subjects' choices; theories whose predictions are inconsistent with the actual choices can be systematically eliminated. Behavioural theories can have multiple parameters, requiring complex experimental designs with a very large number of possible choice tests. This imposes computational and economic constraints on using classical experimental design methods. We develop a methodology of adaptive tests, Bayesian Rapid Optimal Adaptive Designs (BROAD), that sequentially chooses the "most informative" test at each stage and, based on the response, updates its posterior beliefs over the theories, which in turn informs the next most informative test to run. BROAD utilizes the Equivalence Class Edge Cutting (EC2) criterion to select tests. We prove that the EC2 criterion is adaptively submodular, which allows us to prove theoretical guarantees against the Bayes-optimal testing sequence even in the presence of noisy responses. In simulated ground-truth experiments, we find that the EC2 criterion recovers the true hypotheses with significantly fewer tests than more widely used criteria such as Information Gain and Generalized Binary Search. We show, theoretically as well as experimentally, that these popular criteria can, surprisingly, perform poorly in the presence of noise or subject errors. Furthermore, we use the adaptive submodularity of EC2 to implement an accelerated greedy version of BROAD, which leads to orders-of-magnitude speedups over other methods.
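
The belief-updating step in the BROAD loop — observe a response, update the posterior over candidate theories, pick the next test — can be sketched as plain Bayesian updating with a response-error rate. The EC2 test-selection criterion itself is omitted, and all names and numbers below are illustrative assumptions.

```python
def update_posterior(prior, predictions, observed, error_rate=0.1):
    """Bayes update over theories after one binary choice.
    predictions[i] is the choice theory i predicts for this test; a
    subject follows the true theory's prediction with prob. 1 - error_rate."""
    likelihoods = [(1 - error_rate) if p == observed else error_rate
                   for p in predictions]
    unnorm = [pr * lk for pr, lk in zip(prior, likelihoods)]
    z = sum(unnorm)
    return [u / z for u in unnorm]

# Three candidate theories, uniform prior; theories 0 and 1 predict
# option "A" on this test, theory 2 predicts "B"; the subject chooses "A".
post = update_posterior([1/3, 1/3, 1/3], ["A", "A", "B"], "A")
```

An informative next test is one whose predicted choices separate the theories that still carry posterior mass — here, one on which theories 0 and 1 disagree.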

We use BROAD to perform two experiments. First, we compare the main classes of theories for decision-making under risk, namely expected value, prospect theory, constant relative risk aversion (CRRA), and moments models. Subjects are given an initial endowment and sequentially presented with choices between two lotteries, with the possibility of losses. The lotteries are selected using BROAD, and 57 subjects from Caltech and UCLA are incentivized by randomly realizing one of the lotteries chosen. Aggregate posterior probabilities over the theories show limited evidence in favour of the CRRA and moments models. Classifying the subjects into types showed that most subjects are described by prospect theory, followed by expected value. Adaptive experimental design raises the possibility that subjects could engage in strategic manipulation, i.e. mask their true preferences and choose differently in order to obtain more favourable tests in later rounds, thereby increasing their payoffs. We pay close attention to this problem; strategic manipulation is ruled out since it is infeasible in practice, and since we find no signatures of it in our data.

In the second experiment, we compare the main theories of time preference: exponential discounting, hyperbolic discounting, the "present bias" models (quasi-hyperbolic (α, β) discounting and fixed-cost discounting), and generalized-hyperbolic discounting. Forty subjects from UCLA were given choices between two options: a smaller but more immediate payoff versus a larger but later payoff. We found very limited evidence for the present-bias models and hyperbolic discounting; most subjects were classified as generalized-hyperbolic discounting types, followed by exponential discounting.
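
A minimal sketch of the competing discount functions named above, with illustrative parameter values (the thesis's estimated parameters are not reported in this abstract):

```python
import math

def exponential(d, r=0.1):
    """Exponential discounting: constant discount rate r per unit delay d."""
    return math.exp(-r * d)

def hyperbolic(d, k=0.1):
    """Hyperbolic discounting: implied discount rate declines with delay."""
    return 1.0 / (1.0 + k * d)

def quasi_hyperbolic(d, beta=0.7, delta=0.95):
    """'Present bias': full weight on d = 0, then a uniform beta penalty
    on all delayed payoffs on top of exponential discounting."""
    return 1.0 if d == 0 else beta * delta ** d
```

Since e^(-x) < 1/(1+x) for x > 0, the hyperbolic curve at the same rate always lies above the exponential one, and its declining implied discount rate is what generates temporal choice inconsistency.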

In these models the passage of time is linear. We instead consider a psychological model where the perception of time is subjective. We prove that when the biological (subjective) time is positively dependent, it gives rise to hyperbolic discounting and temporal choice inconsistency.

We also test the predictions of behavioural theories in the "wild". We focus on prospect theory, which emerged as the dominant theory in our lab experiments on risky choice. Loss aversion and reference dependence predict that consumers will behave in a way distinctly different from what the standard rational model predicts. Specifically, loss aversion predicts that when an item is offered at a discount, demand for it will be greater than its price elasticity explains; even more importantly, when the item is no longer discounted, demand for its close substitutes will increase excessively. We tested this prediction using a discrete choice model with a loss-averse utility function on data from a large eCommerce retailer. Not only did we identify loss aversion, but we also found that the effect decreased with consumers' experience. We outline the policy implications of consumer loss aversion, and strategies for competitive pricing.
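
The loss-averse utility function used in the demand estimation is not specified in this summary; as a sketch, the standard Kahneman–Tversky value function captures the two ingredients named above, reference dependence and loss aversion:

```python
def prospect_value(x, alpha=0.88, lam=2.25):
    """Prospect-theory value of a gain/loss x relative to a reference
    point: concave for gains, convex and steeper for losses."""
    return x ** alpha if x >= 0 else -lam * (-x) ** alpha

# With the classic Tversky-Kahneman estimates (alpha = 0.88, lambda = 2.25),
# a given loss looms roughly 2.25 times larger than an equal-sized gain —
# so losing a discount hurts more than the original price cut helped.
```

Here the reference point is implicit at zero; in the retail setting it would be the discounted price the consumer has adapted to, which is what drives the excess substitution when the discount ends.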

In future work, BROAD can be applied widely to test different behavioural models, e.g. in social preference and game theory, and in different contextual settings. Additional measurements beyond choice data, including biological measurements such as skin conductance, can be used to eliminate hypotheses more rapidly and speed up model comparison. Discrete choice models also provide a framework for testing behavioural models with field data, and encourage combined lab-field experiments.

Relevance: 10.00%

Abstract:

The following work explores the processes individuals utilize when making multi-attribute choices. With the exception of extremely simple or familiar choices, most decisions we face can be classified as multi-attribute choices. In order to evaluate and make choices in such an environment, we must be able to estimate and weight the particular attributes of an option. Hence, better understanding the mechanisms involved in this process is an important step for economists and psychologists. For example, when choosing between two meals that differ in taste and nutrition, what are the mechanisms that allow us to estimate and then weight attributes when constructing value? Furthermore, how can these mechanisms be influenced by variables such as attention or common physiological states, like hunger?

In order to investigate these and similar questions, we use a combination of choice and attentional data, where the attentional data was collected by recording eye movements as individuals made decisions. Chapter 1 designs and tests a neuroeconomic model of multi-attribute choice that makes predictions about choices, response time, and how these variables are correlated with attention. Chapter 2 applies the ideas in this model to intertemporal decision-making, and finds that attention causally affects discount rates. Chapter 3 explores how hunger, a common physiological state, alters the mechanisms we utilize as we make simple decisions about foods.

Relevance: 10.00%

Abstract:

Online communities have become a very popular meeting place for consumers, allowing them to share information. This article presents a novel research technique, netnography, and applies it to determine the positioning of food retail companies. After collecting and analysing 506 valid messages from the online community Ciao, we were able to identify the attributes associated with the six food stores analysed. Mercadona is associated with the quality of its private label and a limited variety of brands/products. The discount stores, Lidl and DIA, stand out for the room for improvement in store cleanliness and product placement. The hypermarkets, Eroski, Alcampo and Carrefour, are noted for their variety of brands/products and for being far from home. The most direct competitors of each company were also identified, with competition found between retail formats of the same type (intra-type). The use of netnography, a relatively recent technique, is the paper's main original contribution. Moreover, the conclusions obtained, which are consistent with previous studies, show that netnography can be a source of information for determining companies' commercial image and positioning.

Relevance: 10.00%

Abstract:

Worldwide, hepatitis caused by viral infections has been a major public health concern because of its chronic character, asymptomatic course, and capacity to cause loss of liver function. With the large-scale use of antiretroviral drugs, liver disease related to hepatitis C virus (HCV) infection contributed to a radical change in the natural history of human immunodeficiency virus (HIV) infection. The burden of HCV/HIV co-infection in Brazil is not precisely known, but evidence indicates that, regardless of geographic region, these individuals have greater difficulty clearing HCV after pharmacological treatment than mono-infected patients. Within the SUS (the Brazilian Unified Health System), the standard antiviral treatment for carriers of HCV genotype 1 and HIV is peginterferon combined with ribavirin. The two most recent therapeutic protocols diverge on the length of treatment and on which individuals should be included: the most recent guideline recommends treating early responders together with slow virological responders, whereas the immediately preceding guideline excludes, at week 12, individuals who have not responded completely. Based on this divergence, this study aimed to evaluate the cost-effectiveness of HCV treatment in individuals with genotype 1, co-infected with HIV, antiviral treatment-naive, non-cirrhotic and immunologically stable, under the treatment rules established by the two most recent therapeutic guidelines for care under the SUS. To this end, a mathematical decision model based on Markov chains was built, simulating the progression of liver disease with and without treatment. A hypothetical cohort of one thousand men over 40 years of age was followed.
The perspective of the Unified Health System was adopted, with a 30-year time horizon and a 5% discount rate for costs and clinical outcomes. Extending treatment to slow responders yielded a gain of 0.28 quality-adjusted life years (QALY), a 7% gain in survival, and a 60% increase in the number of individuals who cleared HCV. Beyond the expected efficacy benefits, including slow virological responders proved to be a cost-effective strategy, reaching an incremental cost-effectiveness ratio of R$ 44,171/QALY, below the acceptability threshold proposed by the World Health Organization (WHO) of R$ 63,756/QALY. The sensitivity analysis showed that the possible uncertainties in the model are unable to change the final result, demonstrating the robustness of the analysis. From a pharmacoeconomic point of view, including HCV/HIV co-infected slow virological responders in the treatment protocol is a strategy with a favourable cost-effectiveness ratio for the Unified Health System. Its adoption is fully compatible with the system's perspective, returning better health outcomes at costs below an acceptable budget ceiling, and with society's, by avoiding complications and hospitalizations to a greater degree than non-inclusion.
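
The cost-effectiveness arithmetic reported above reduces to the incremental cost-effectiveness ratio (ICER). A minimal sketch; the incremental cost is back-calculated from the reported figures and is an assumption, not a number given in the abstract.

```python
def icer(delta_cost, delta_qaly):
    """Incremental cost-effectiveness ratio: extra cost per extra QALY
    of the new strategy relative to the comparator."""
    return delta_cost / delta_qaly

# The abstract reports a 0.28-QALY gain and an ICER of R$ 44,171/QALY,
# which implies an incremental cost of roughly R$ 12,368 (back-calculated
# here for illustration; the actual cost figure is not in the abstract).
ratio = icer(delta_cost=12368.0, delta_qaly=0.28)
cost_effective = ratio < 63756.0  # WHO-based threshold cited above
```

A full analysis runs this on costs and QALYs accumulated by the Markov cohort model, discounting both at the stated 5% per year.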

Relevance: 10.00%

Abstract:

Solid waste has become a worrying issue, since it is generated in large quantities in the capitalist consumer society through product turnover and the creation of complex packaging. This affects the environment, because it is difficult to manage such waste properly without negative environmental impacts, given how slowly it degrades and the contaminants it may contain. Treatment and final disposal alternatives have been encouraged to mitigate the harm caused by solid waste. Recycling stands out as a mechanism for using solid waste as raw material, and selective collection is a fundamental tool for making recycling feasible. Public participation is therefore necessary, since consumers are the generators of the waste, and separating it right after consumption facilitates and improves the whole process. Brazilian municipalities, however, have shown low rates of recycling and selective collection. A literature review was conducted of successful cases of both practices in Brazil and worldwide; issues relevant to understanding all the aspects involved are also addressed through the theoretical framework. Residential buildings contribute greatly to waste generation, since they concentrate a large population in a small area. Niterói is an urban municipality in the metropolitan region of Rio de Janeiro, with a considerable population and many large residential buildings and condominiums. A municipal selective collection programme run by the Companhia de Limpeza Urbana de Niterói (CLIN), in place for more than fifteen years, directs recyclables to two waste-picker cooperatives in the municipality (COOPCANIT and Morro do Céu).
This study seeks to identify all the actors involved in CLIN's selective collection programme (residents, CLIN, COOPCANIT), to evaluate the programme's performance, and to suggest improvements based on the theoretical framework, drawing on technical visits, interviews with those responsible, and questionnaires assessing the environmental awareness of residents of condominiums with and without selective collection service. Shortcomings were found in the programme, such as its limited publicity and the absence of an environmental education programme for condominium residents, reflected in low public adherence. Positive aspects were also identified, such as the pursuit of partnerships, represented by the utility AMPLA, which offers a discount on the electricity bill to residents who deliver their waste to voluntary drop-off points. Some recommendations and suggestions are made to local managers, along with proposals for future work and studies relevant to the problem.