892 results for Demand (Economic theory)


Relevance:

30.00%

Publisher:

Abstract:

In three essays we examine user-generated product ratings with aggregation. While recommendation systems have been studied extensively, this simple type of recommendation system has been neglected, despite its prevalence in the field. We develop a novel theoretical model of user-generated ratings. This model improves upon previous work in three ways: it considers rational agents and allows them to abstain from rating when rating is costly; it incorporates rating aggregation (such as averaging ratings); and it considers the effect of multiple simultaneous raters on rating strategies. In the first essay we provide a partial characterization of equilibrium behavior. In the second essay we test this theoretical model in the laboratory, and in the third we apply established behavioral models to the data generated in the lab. This study provides clues to the prevalence of extreme-valued ratings in field implementations. We show theoretically that in equilibrium, ratings distributions do not represent the value distributions of sincere ratings. Indeed, we show that if rating strategies follow a set of regularity conditions, then in equilibrium the rate at which players participate is increasing in the extremity of agents' valuations of the product. This theoretical prediction is realized in the lab. We also find that human subjects show a disproportionate predilection for sincere rating, and that when they do send insincere ratings, those ratings are almost always in the direction of exaggeration. Both sincere and exaggerated ratings occur with great frequency despite the fact that such rating strategies are not in subjects' best interest. We therefore apply the behavioral concepts of quantal response equilibrium (QRE) and cursed equilibrium (CE) to the experimental data. Together, these theories explain the data significantly better than does a theory of rational, Bayesian behavior, accurately predicting key comparative statics. However, the theories fail to predict the high rates of sincerity, and it is clear that a better theory is needed.
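
The prediction that participation increases with valuation extremity can be illustrated with a minimal simulation. The participation rule below is a hypothetical threshold-style rule chosen for illustration, not the model's actual equilibrium strategies:

```python
import numpy as np

# Illustrative simulation (not the thesis model): sincere valuations are drawn
# around a neutral midpoint, and -- in the spirit of the regularity conditions
# described above -- the probability of submitting a rating increases with the
# extremity of the valuation.
rng = np.random.default_rng(0)

values = np.clip(rng.normal(loc=3.0, scale=1.0, size=100_000), 1.0, 5.0)

# Hypothetical participation rule: rate with probability proportional to
# distance from the neutral midpoint (3.0).
extremity = np.abs(values - 3.0) / 2.0
submitted = values[rng.random(values.size) < extremity]

print(f"share of extreme values (<2 or >4) among sincere valuations: "
      f"{((values < 2) | (values > 4)).mean():.3f}")
print(f"share of extreme values (<2 or >4) among submitted ratings:  "
      f"{((submitted < 2) | (submitted > 4)).mean():.3f}")
```

Under any such rule, the distribution of submitted ratings over-represents the tails relative to the distribution of sincere valuations, matching the extreme-valued ratings observed in field implementations.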

Relevance:

30.00%

Publisher:

Abstract:

In the quest for a descriptive theory of decision-making, the rational actor model in economics imposes rather unrealistic expectations and abilities on human decision makers. The further we move from idealized scenarios, such as perfectly competitive markets, and the more ambitiously we extend the reach of the theory to describe everyday decision-making situations, the less sense these assumptions make. Behavioural economics has instead proposed models based on assumptions that are more psychologically realistic, with the aim of gaining more precision and descriptive power. Increased psychological realism, however, comes at the cost of a greater number of parameters and greater model complexity. There is now a plethora of models, based on different assumptions and applicable in differing contextual settings, and selecting the right model to use tends to be an ad hoc process. In this thesis, we develop optimal experimental design methods and evaluate different behavioural theories against evidence from lab and field experiments.

We look at evidence from controlled laboratory experiments. Subjects are presented with choices between monetary gambles or lotteries. Different decision-making theories evaluate the choices differently and would make distinct predictions about the subjects' choices. Theories whose predictions are inconsistent with the actual choices can be systematically eliminated. Behavioural theories can have multiple parameters, requiring complex experimental designs with a very large number of possible choice tests. This imposes computational and economic constraints on using classical experimental design methods. We develop a methodology of adaptive tests, Bayesian Rapid Optimal Adaptive Designs (BROAD), that sequentially chooses the "most informative" test at each stage and, based on the response, updates its posterior beliefs over the theories, which informs the next most informative test to run. BROAD utilizes the Equivalence Class Edge Cutting (EC2) criterion to select tests. We prove that the EC2 criterion is adaptively submodular, which allows us to prove theoretical guarantees against the Bayes-optimal testing sequence even in the presence of noisy responses. In simulated ground-truth experiments, we find that the EC2 criterion recovers the true hypotheses with significantly fewer tests than more widely used criteria such as Information Gain and Generalized Binary Search. We show, theoretically as well as experimentally, that, surprisingly, these popular criteria can perform poorly in the presence of noise or subject errors. Furthermore, we use the adaptive submodularity of EC2 to implement an accelerated greedy version of BROAD, which leads to orders-of-magnitude speedups over other methods.
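
A schematic sketch of the greedy EC2 selection step may help fix ideas. The sketch assumes a simplified noiseless setting in which each theory deterministically predicts one outcome per test and each hypothesis forms its own equivalence class; the thesis's actual BROAD implementation handles noisy responses and uses an accelerated greedy loop:

```python
import itertools

def ec2_greedy_choose(prior, predictions, remaining_tests):
    """Pick the test with the largest expected cut of EC2 edge weight.

    prior       : dict hypothesis -> probability
    predictions : dict (hypothesis, test) -> predicted outcome

    In EC2, hypotheses in different equivalence classes (here: every
    hypothesis is its own class) are joined by edges of weight
    p(h_i) * p(h_j); observing a test outcome eliminates inconsistent
    hypotheses, "cutting" every edge incident to them.
    """
    hyps = list(prior)

    def edge_weight(live):
        return sum(prior[a] * prior[b]
                   for a, b in itertools.combinations(live, 2))

    total = edge_weight(hyps)
    best_test, best_gain = None, -1.0
    for t in remaining_tests:
        # Expected edge weight remaining after seeing t's outcome, where
        # outcome probabilities are induced by the prior over hypotheses.
        expected_remaining = 0.0
        for outcome in {predictions[h, t] for h in hyps}:
            live = [h for h in hyps if predictions[h, t] == outcome]
            p_outcome = sum(prior[h] for h in live)
            expected_remaining += p_outcome * edge_weight(live)
        gain = total - expected_remaining
        if gain > best_gain:
            best_test, best_gain = t, gain
    return best_test
```

In a BROAD-style loop, this greedy choice alternates with a Bayesian update of `prior` after each observed response, which in turn determines the next most informative test.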

We use BROAD to perform two experiments. First, we compare the main classes of theories for decision-making under risk, namely expected value, prospect theory, constant relative risk aversion (CRRA) and moments models. Subjects are given an initial endowment and sequentially presented with choices between two lotteries, with the possibility of losses. The lotteries are selected using BROAD, and 57 subjects from Caltech and UCLA are incentivized by randomly realizing one of the lotteries chosen. Aggregate posterior probabilities over the theories show limited evidence in favour of CRRA and moments models. Classifying the subjects into types shows that most subjects are described by prospect theory, followed by expected value. Adaptive experimental design raises the possibility that subjects could engage in strategic manipulation, i.e. mask their true preferences and choose differently in order to obtain more favourable tests in later rounds, thereby increasing their payoffs. We pay close attention to this problem; strategic manipulation is ruled out since it is infeasible in practice and since we find no signatures of it in our data.

In the second experiment, we compare the main theories of time preference: exponential discounting, hyperbolic discounting, the "present bias" models (quasi-hyperbolic (α, β) discounting and fixed cost discounting), and generalized-hyperbolic discounting. 40 subjects from UCLA were given choices between two options: a smaller but more immediate payoff versus a larger but later payoff. We found very limited evidence for present-bias models and hyperbolic discounting; most subjects were classified as generalized-hyperbolic discounting types, followed by exponential discounting.
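
For orientation, these families of discount functions (fixed cost discounting aside) are commonly written as follows; the notation is illustrative and the thesis's own parameterization, e.g. its (α, β) quasi-hyperbolic form, may differ:

```latex
D_{\mathrm{exp}}(t) = \delta^{t}, \qquad
D_{\mathrm{hyp}}(t) = \frac{1}{1 + k t}, \qquad
D_{\mathrm{quasi}}(t) = \begin{cases} 1 & t = 0,\\ \beta\,\delta^{t} & t > 0, \end{cases} \qquad
D_{\mathrm{gen}}(t) = (1 + \alpha t)^{-\beta/\alpha}.
```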

In these models the passage of time is linear. We instead consider a psychological model in which the perception of time is subjective. We prove that when biological (subjective) time is positively dependent, it gives rise to hyperbolic discounting and temporal choice inconsistency.

We also test the predictions of behavioural theories in the "wild". We pay attention to prospect theory, which emerged as the dominant theory in our lab experiments on risky choice. Loss aversion and reference dependence predict that consumers will behave in ways distinct from what the standard rational model predicts. Specifically, loss aversion predicts that when an item is offered at a discount, the demand for it will be greater than can be explained by its price elasticity. Even more importantly, when the item is no longer discounted, demand for its close substitute will increase excessively. We tested this prediction using a discrete choice model with a loss-averse utility function on data from a large eCommerce retailer. Not only did we identify loss aversion, but we also found that the effect decreased with consumers' experience. We outline the policy implications that consumer loss aversion entails, and strategies for competitive pricing.
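
An illustrative version of such a loss-averse discrete choice specification is sketched below; the functional form, parameter values and reference-price treatment are assumptions for demonstration, not the paper's estimated model:

```python
import numpy as np

def loss_averse_logit_probs(prices, ref_prices, alpha=1.0, lam=2.0):
    """Illustrative loss-averse multinomial logit. Utility falls with price,
    and paying more than the reference price incurs an extra penalty
    scaled by the loss-aversion parameter lam > 1."""
    prices = np.asarray(prices, dtype=float)
    ref_prices = np.asarray(ref_prices, dtype=float)
    gains = np.maximum(ref_prices - prices, 0.0)    # price below reference
    losses = np.maximum(prices - ref_prices, 0.0)   # price above reference
    utility = -alpha * prices + gains - lam * losses
    expu = np.exp(utility - utility.max())          # numerically stable softmax
    return expu / expu.sum()

# A discounted item (price below its reference) draws demand beyond what the
# price gap alone would imply under a reference-free utility.
print(loss_averse_logit_probs(prices=[8.0, 10.0], ref_prices=[10.0, 10.0]))
```

Symmetrically, once the discount ends the no-longer-discounted item's price registers as a loss relative to the adapted reference price, pushing choice probability toward its close substitute, which is the excess substitution effect described above.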

In future work, BROAD could be widely applied to testing different behavioural models, e.g. in social preference and game theory, and in different contextual settings. Additional measurements beyond choice data, including biological measurements such as skin conductance, could be used to eliminate hypotheses more rapidly and speed up model comparison. Discrete choice models also provide a framework for testing behavioural models with field data, and encourage combined lab-field experiments.

Relevance:

30.00%

Publisher:

Abstract:

The analysis of the evolution of the M3 money aggregate is an important element in the definition and implementation of monetary policy for the ECB. A well-defined and stable long-run demand function is an essential requisite for M3 to be a valid monetary tool. This paper therefore uses cointegration techniques to analyze the existence of a long-run money demand function, estimating it and testing its stability for the Euro Area and for ten of its member countries. Specifically, bearing in mind the high degree of monetary instability that the current economic crisis has created in the Euro Area, we also test whether the crisis has had a noticeable impact on the cointegration between real money demand and its determinants. The analysis gives evidence of the existence of a long-run relationship for the aggregated Euro Area and for six of the ten countries considered. However, these relationships have been highly unstable since the outbreak of the financial crisis, in some cases even leading to rejection of cointegration. All this suggests that the ECB's strategy of focusing on the M3 monetary aggregate may not be an appropriate approach under the current circumstances.
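
As a sketch of the kind of test involved, the Johansen trace test for a cointegrating (long-run) relationship can be run with statsmodels. The three series below are simulated stand-ins for a money demand system (real M3, a scale variable, an opportunity-cost variable), purely for illustration; the paper would use actual Euro Area data and its own lag and deterministic-term choices:

```python
import numpy as np
from statsmodels.tsa.vector_ar.vecm import coint_johansen

# Simulate a cointegrated system: two series share a stochastic trend.
rng = np.random.default_rng(0)
T = 200
common_trend = np.cumsum(rng.normal(size=T))          # shared random walk
real_m3 = common_trend + rng.normal(scale=0.5, size=T)
real_gdp = 0.8 * common_trend + rng.normal(scale=0.5, size=T)
interest = rng.normal(size=T)                          # stationary series
system = np.column_stack([real_m3, real_gdp, interest])

# det_order=0: constant term; k_ar_diff=2: lagged differences in the VECM.
result = coint_johansen(system, det_order=0, k_ar_diff=2)
for r, (trace, cv95) in enumerate(zip(result.lr1, result.cvt[:, 1])):
    print(f"H0: rank <= {r}: trace = {trace:.2f}, 5% critical value = {cv95:.2f}")
```

Stability of the relationship can then be probed by re-running such tests over rolling or split samples around the crisis outbreak, which is the spirit of the exercise reported here.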

Relevance:

30.00%

Publisher:

Abstract:

This thesis studies decision making under uncertainty and how economic agents respond to information. The classic model of subjective expected utility and Bayesian updating is often at odds with empirical and experimental results; people exhibit systematic biases in information processing and are often averse to ambiguity. The aim of this work is to develop simple models that capture observed biases and to study their economic implications.

In the first chapter I present an axiomatic model of cognitive dissonance, in which an agent's response to information explicitly depends upon past actions. I introduce novel behavioral axioms and derive a representation in which beliefs are directionally updated. The agent twists the information and overweights states in which his past actions provide a higher payoff. I then characterize two special cases of the representation. In the first case, the agent distorts the likelihood ratio of two states by a function of the utility values of the previous action in those states. In the second case, the agent's posterior beliefs are a convex combination of the Bayesian belief and the one which maximizes the conditional value of the previous action. Within the second case a unique parameter captures the agent's sensitivity to dissonance, and I characterize a way to compare sensitivity to dissonance between individuals. Lastly, I develop several simple applications and show that cognitive dissonance contributes to the equity premium and price volatility, asymmetric reaction to news, and belief polarization.
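
In illustrative notation (the thesis's own formalism may differ), the two special cases can be written as

```latex
\text{(i)}\quad \frac{\mu_a(s)}{\mu_a(s')} \;=\; \frac{\mu_B(s)}{\mu_B(s')}\,\phi\!\big(u(a,s),\,u(a,s')\big),
\qquad
\text{(ii)}\quad \mu_a \;=\; (1-\gamma)\,\mu_B \;+\; \gamma\,\nu_a^{*},
```

where μ_B is the Bayesian posterior, u(a, s) is the payoff of the past action a in state s, ν*_a is the belief (consistent with the observation) that maximizes the conditional value of a, and γ ∈ [0, 1] is the sensitivity-to-dissonance parameter used for interpersonal comparisons.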

The second chapter characterizes a decision maker with sticky beliefs, that is, a decision maker who does not update enough in response to information, where "enough" means as much as a Bayesian decision maker would. This chapter provides axiomatic foundations for sticky beliefs by weakening the standard axioms of dynamic consistency and consequentialism. I derive a representation in which updated beliefs are a convex combination of the prior and the Bayesian posterior. A unique parameter captures the weight on the prior and is interpreted as the agent's measure of belief stickiness or conservatism bias. This parameter is endogenously identified from preferences and is easily elicited from experimental data.
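
In symbols (notation illustrative), upon observing an event E the sticky posterior takes the form

```latex
\mu_E \;=\; \lambda\,\mu \;+\; (1-\lambda)\,\mathrm{Bayes}(\mu \mid E), \qquad \lambda \in [0,1],
```

where λ = 0 recovers standard Bayesian updating and larger λ corresponds to stickier beliefs.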

The third chapter deals with updating in the face of ambiguity, using the framework of Gilboa and Schmeidler. There is no consensus on the correct way to update a set of priors. Current methods either do not allow a decision maker to make an inference about her priors or require an extreme level of inference. In this chapter I propose and axiomatize a general model of updating a set of priors. A decision maker who updates her beliefs in accordance with the model can be thought of as one who chooses a threshold that is used to determine whether a prior is plausible, given some observation. She retains the plausible priors and applies Bayes' rule. This model includes generalized Bayesian updating and maximum likelihood updating as special cases.
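
One way to write such a threshold rule (again in illustrative notation): given a prior set C and an observation E, the updated set is

```latex
C_E \;=\; \Big\{\, \mathrm{Bayes}(p \mid E) \;:\; p \in C,\; p(E) \,\ge\, \theta \max_{q \in C} q(E) \,\Big\}, \qquad \theta \in (0, 1],
```

so that θ near 0 keeps every prior assigning E positive probability (generalized Bayesian updating), while θ = 1 keeps only the priors that maximize the likelihood of E (maximum likelihood updating).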

Relevance:

30.00%

Publisher:

Abstract:

The study of codes, classically motivated by the need to communicate information reliably in the presence of error, has found new life in fields as diverse as network communication and distributed data storage, and even has connections to the design of linear measurements used in compressive sensing. But in all contexts, a code typically involves exploiting the algebraic or geometric structure underlying an application. In this thesis, we examine several problems in coding theory and try to gain some insight into the algebraic structure behind them.

The first is the study of the entropy region - the space of all possible vectors of joint entropies which can arise from a set of discrete random variables. Understanding this region is essentially the key to optimizing network codes for a given network. To this end, we employ a group-theoretic method of constructing random variables producing so-called "group-characterizable" entropy vectors, which are capable of approximating any point in the entropy region. We show how small groups can be used to produce entropy vectors which violate the Ingleton inequality, a fundamental bound on entropy vectors arising from the random variables involved in linear network codes. We discuss the suitability of these groups to design codes for networks which could potentially outperform linear coding.
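
The Ingleton inequality mentioned above can be stated in entropy form: for random variables X1, X2, X3, X4,

```latex
I(X_1; X_2) \;\le\; I(X_1; X_2 \mid X_3) \;+\; I(X_1; X_2 \mid X_4) \;+\; I(X_3; X_4).
```

Entropy vectors arising from the random variables in linear network codes must satisfy it, so group-characterizable vectors that violate it mark regions of the entropy space reachable only by nonlinear codes.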

The second topic we discuss is the design of frames with low coherence, closely related to finding spherical codes in which the codewords are unit vectors spaced out around the unit sphere so as to minimize the magnitudes of their mutual inner products. We show how to build frames by selecting a cleverly chosen set of representations of a finite group to produce a "group code" as described by Slepian decades ago. We go on to reinterpret our method as selecting a subset of rows of a group Fourier matrix, allowing us to study and bound our frames' coherences using character theory. We discuss the usefulness of our frames in sparse signal recovery using linear measurements.
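
As a toy illustration of the row-selection view (not the thesis's construction), one can take a few rows of the Fourier matrix of the cyclic group Z_n, treat the resulting short columns as frame vectors, and compute the frame's coherence numerically:

```python
import numpy as np

def coherence(frame):
    """Maximum magnitude of pairwise inner products between unit-norm columns."""
    f = frame / np.linalg.norm(frame, axis=0, keepdims=True)
    gram = np.abs(f.conj().T @ f)
    np.fill_diagonal(gram, 0.0)
    return gram.max()

# Rows of the n x n DFT matrix are the characters of the cyclic group Z_n;
# selecting a subset of rows yields a frame of short column vectors. The row
# subset below is arbitrary and purely illustrative; the thesis chooses
# representations/rows via character theory to control coherence.
n = 7
dft = np.exp(-2j * np.pi * np.outer(np.arange(n), np.arange(n)) / n)
rows = [1, 2, 4]  # an arbitrary illustrative subset of characters
print(f"coherence: {coherence(dft[rows, :]):.4f}")
```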

The final problem we investigate is that of coding with constraints, most recently motivated by the demand for ways to encode large amounts of data using error-correcting codes so that any small loss can be recovered from a small set of surviving data. Most often, this involves using a systematic linear error-correcting code in which each parity symbol is constrained to be a function of some subset of the message symbols. We derive bounds on the minimum distance of such a code based on its constraints, and characterize when these bounds can be achieved using subcodes of Reed-Solomon codes.

Relevance:

30.00%

Publisher:

Abstract:

Time, risk, and attention are all integral to economic decision making. The aim of this work is to understand those key components of decision making using a variety of approaches: providing axiomatic characterizations to investigate time discounting, generating measures of visual attention to infer consumers' intentions, and examining data from unique field settings.

Chapter 2, co-authored with Federico Echenique and Kota Saito, presents the first revealed-preference characterizations of the exponentially discounted utility model and its generalizations. My characterizations provide non-parametric revealed-preference tests. I apply the tests to data from a recent experiment and find that the axiomatization delivers new insights on a dataset that had previously been analyzed by traditional parametric methods.

Chapter 3, co-authored with Min Jeong Kang and Colin Camerer, investigates whether "pre-choice" measures of visual attention improve the prediction of consumers' purchase intentions. We measure participants' visual attention using eyetracking or mousetracking while they make hypothetical as well as real purchase decisions. I find that different patterns of visual attention are associated with hypothetical and real decisions. I then demonstrate that including information on visual attention improves the prediction of purchase decisions when attention is measured with mousetracking.

Chapter 4 investigates individuals' attitudes towards risk in a high-stakes environment using data from a TV game show, Jeopardy!. I first quantify players' subjective beliefs about answering questions correctly. Using those beliefs in estimation, I find that the representative player is risk averse. I then find that trailing players tend to wager more than the "folk" strategies known among the community of contestants and fans would prescribe, and that this tendency is related to their confidence. I also find gender differences: male players take more risk than female players, and even more so when they are competing against two other male players.

Chapter 5, co-authored with Colin Camerer, investigates the dynamics of the favorite-longshot bias (FLB) using data on horse race betting from an online exchange that allows bettors to trade "in-play." I find that probabilistic forecasts implied by market prices before the start of the races are well calibrated, but that the degree of FLB increases significantly as the events approach their end.

Relevance:

30.00%

Publisher:

Abstract:

In the past, agricultural researchers tended to ignore the fisheries factor in global food and nutritional security. However, the role of fish is becoming critical as a result of changes in fisheries regimes, income distribution, demand and increasing international trade. Fish has become the fastest-growing food commodity in international trade, and this is raising concern about the supply of fish for poorer people. As a result, the impact of international trade regimes on fish supply and demand, and the consequences for the availability of fish in developing countries, need to be studied. Policies aimed at increasing export earnings are in conflict with those aimed at increasing food security in third world countries. Fisheries policy research will need to focus on three primary areas of impact on the marginal and poorer communities of developing countries: the impact of increased international demand for low-value fish on the supply available to poorer countries; the impact of improved aquaculture technologies and productivity on poorer and marginal farmers; and the impact of land and water allocation policy on productivity, food security and sustainability across farm, fishery and related sectors. The key to local food security lies in the integration of agriculture, aquaculture and natural resources, but an important focus of fisheries policy research will be to examine the linkages between societal, economic and natural systems in order to develop adequate and flexible solutions to achieve sustainable use of aquatic resource systems.

Relevance:

30.00%

Publisher:

Abstract:

It has been predicted that the global demand for fish for human consumption will increase by more than 50% over the next 15 years. The FAO has projected that the increase in supply will originate primarily from marine fisheries and aquaculture, and to a lesser extent from inland fisheries, but with a commensurate price increase. However, there are constraints to increased production in both marine and inland fisheries, such as overfishing, overexploitation, limited potential for increase, and environmental degradation due to industrialization. The author sees aquaculture as having the greatest potential for future expansion. Aquaculture practices vary depending on culture, environment, society and sources of fish. Inputs are generally low-cost and ecologically efficient, and the majority of aquaculture ventures are small-scale and family operated. In the future, advances in technology, genetic improvement of cultured species, and improvements in nutrition, disease management, reproduction control and environmental management are expected, along with opportunities for complementary activities with agriculture, industrial and wastewater linkages. The main constraints to aquaculture are reduced access to suitable land and good-quality water due to pollution and habitat degradation. Aquaculture itself carries minimal potential for aquatic pollution. State participation in fisheries production has not proven to be the best way to promote the fisheries sector. The role of governments is increasingly seen as creating an environment in which economic sectors can make an optimum contribution, through support in areas such as infrastructure, research, training and extension, and a legal framework. The author feels that a holistic approach integrating the natural and social sciences is called for when fisheries policy is being examined.

Relevance:

30.00%

Publisher:

Abstract:

The United States' increasing competitive advantage in international seafood trade in Alaska walleye pollock, Theragra chalcogramma, has contributed to higher prices for surimi-based goods and structural changes in seafood production and trade in Japan. The objectives of this analytical investigation include: 1) evaluation of the role reversal of Japan and the United States in international seafood trade and 2) quantification of the impact of rising prices of frozen surimi on household consumption of surimi-based foods in Japan. This study documents Japan's regression from "seafood self-sufficiency" to increasing dependence on imported products and raw materials. In particular, Japan's growing dependence on American fishermen and seafood producers is described. Surimi production by the United States, and its emerging dominance over Japanese sources of supply, are especially significant. Results of the analysis suggest that Japanese consumer demand for surimi-based foodstuffs correlates directly with "competitive" food prices, e.g., pork, chicken, and beef, and inversely with personal income. Also revealed is how rising household income and relative price shifts among competing animal protein sources in the Japanese diet have contributed to declining household consumption of surimi-based foods, specifically, and a shift away from seafoods in favor of beef, in general. The linkages between, for example, Japanese domestic seafood production and consumption, international trade in marine products, and resource management decisions in the U.S. EEZ present a picture of a changing global marketplace. Increasingly, actions in one arena will have perhaps profound implications for the others.
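
Demand relationships of the kind reported here are commonly estimated in a log-log form along the following lines (the variables and specification are illustrative, not necessarily the study's exact model):

```latex
\ln Q_{\mathrm{surimi}} \;=\; \beta_0 + \beta_1 \ln P_{\mathrm{surimi}} + \beta_2 \ln P_{\mathrm{pork}} + \beta_3 \ln P_{\mathrm{chicken}} + \beta_4 \ln P_{\mathrm{beef}} + \beta_5 \ln I + \varepsilon,
```

where the reported findings correspond to positive cross-price coefficients (β2, β3, β4 > 0) and a negative income coefficient (β5 < 0).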

Relevance:

30.00%

Publisher:

Abstract:

In this report we have attempted to evaluate the ecological and economic consequences of hypoxia in the northern Gulf of Mexico. Although our initial approach was to rely on published accounts, we quickly realized that the body of published literature dealing with hypoxia was limited, and that we would have to conduct our own exploratory analysis of existing Gulf data or rely on published accounts from other systems to infer possible or potential effects of hypoxia. For the economic analysis, we developed a conceptual model of how hypoxia-related impacts could affect fisheries. Our model included both supply and demand components. The supply model had two components: (1) a physical production function for fish or shrimp, and (2) the cost of fishing. If hypoxia causes the cost of a unit of fishing effort to change, then this will result in a shift in supply. The demand model considered how hypoxia might affect the quality of landed fish or shrimp. In particular, the market value per pound is lower for small shrimp than for large shrimp. Despite the limitations of the ecological assessment, the shallow continental shelf area affected by hypoxia does show signs of hypoxia-related stress. While current ecological conditions are a response to a variety of stressors, the effects of hypoxia are most obvious in the benthos, which experience mortality, elimination of larger long-lived species, and a shifting of productivity to nonhypoxic periods (energy pulsing). What is not known is whether hypoxia leads to higher productivity during productive periods, or simply to a reduction of productivity during oxygen-stressed periods. The economic assessment based on fisheries data, however, failed to detect effects attributable to hypoxia. Overall, fisheries landings statistics for at least the last few decades have been relatively constant. The failure to identify clear hypoxic effects in the fisheries statistics does not necessarily mean that they are absent. There are several possibilities: (1) hypoxic effects are small relative to the overall variability in the data sets evaluated; (2) the data and the power of the analyses are not adequate; and (3) there are currently no hypoxic effects on fisheries. Lack of identified hypoxic effects in available fisheries data does not imply that effects would not occur should conditions worsen. Experience with other hypoxic zones around the globe shows that both ecological and fisheries effects become progressively more severe as hypoxia increases. Several large systems around the globe have suffered serious ecological and economic consequences from seasonal summertime hypoxia; most notable are the Kattegat and Black Sea. The consequences range from localized loss of catch and recruitment failure to complete system-wide loss of fishery species. If experiences in other systems are applicable to the Gulf of Mexico, then in the face of worsening hypoxic conditions, at some point fisheries and other species will decline, perhaps precipitously.
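
A minimal way to formalize the supply component described above (an assumed textbook formulation, not the report's actual model): with harvest h(E) from fishing effort E, ex-vessel price p, and cost c per unit of effort, fishers expand effort until

```latex
p\,\frac{\partial h}{\partial E} \;=\; c,
```

so a hypoxia-induced increase in c (for example, longer trips to unaffected grounds) reduces the effort and harvest supplied at any given price, shifting supply inward. On the demand side, a hypoxia-induced shift of the catch toward small shrimp lowers the average value per pound of a given landed weight.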

Relevance:

30.00%

Publisher:

Abstract:

The fisheries of Lake Victoria have undergone a dramatic transformation during the last two decades. From being a locally based fishery with little intervention and capital investment from outside, the present fishery is dominated by national and international capital penetrating the industry. It is the explosion in the growth of Nile perch and the strong demand that developed for this fish in global markets which have transformed the fisheries of Lake Victoria. This report presents the results of a survey carried out between October 2001 and February 2002 on the fishery distribution patterns and their impacts on fisher communities of Lake Victoria. The fisheries distribution pattern of the lake is described, as well as the flows and benefits from the fisheries resource, the resource constraints and the sustainability options. A major part of the paper discusses some of the socio-economic impacts of the rapid changes that are responsible for the present fisheries. It particularly focuses on the effects of the Nile perch boom, its globalization and the development of the fish industry in Uganda on food security and employment for the local population.

Relevance:

30.00%

Publisher:

Abstract:

Six carp-based culture technologies, namely carp-pangas, carp polyculture, carp-golda, pangas monoculture, golda monoculture and nursery, were selected to determine the costs and returns of the respective technologies in Bangladesh. The sample farmers selected for these technologies numbered 55, 100, 65, 50, 51 and 55 respectively, giving a total sample size of 376. The study covered 7 districts of Bangladesh, namely Mymensingh, Bogra, Noakhali, Comilla, Jessore, Khulna and Bagerhat. Both primary and secondary data were used for this study. It was found that farmers used a large number of feeds for the selected technologies and maintained no standard doses for them. Remarkable differences were found among the prices of different feeds and other inputs used for different technologies in different locations. Prices of all inputs were found to be increasing, and the increase was greater in recent years than in previous years. Though all the technologies were found to be profitable, the feed situation was not satisfactory. Except for rice polish, all the local feeds were in deficit relative to national demand. If this situation persists and no proper measures are taken to secure the local feed supply, the present development of supplementary feed-based aquaculture will become fully dependent on imported feeds and will not be sustainable in the future. This study strongly suggests that the responsible authorities handle the matter with proper attention, considering its significant livelihood impact on the economy of the country.