996 results for 14 Economics


Relevance:

20.00%

Publisher:

Abstract:

One of the major problems in the mass production of sugpo is obtaining a constant supply of fry. Since it is ultimately the private sector that should produce sugpo fry to meet the needs of the industry, the Barangay Hatchery Project under the Prawn Program of the Aquaculture Department of SEAFDEC has scaled down the hatchery technology from large tanks to a level that can be adopted by the private sector, especially in the villages, with a minimum of financial and technical inputs. This guide to small-scale hatchery operations is expected to generate more enthusiasm among fish farmers interested in venturing into sugpo culture.

Relevance:

20.00%

Publisher:

Abstract:

This thesis belongs to the growing field of economic networks. In particular, we develop three essays in which we study the problems of bargaining, discrete choice representation, and pricing in the context of networked markets. Despite analyzing very different problems, the three essays share the common feature of using a network representation to describe the market of interest.

In Chapter 1 we present an analysis of bargaining in networked markets. We make two contributions. First, we characterize market equilibria in a bargaining model and find that players' equilibrium payoffs coincide with their degree of centrality in the network, as measured by Bonacich's centrality measure. This characterization allows us to map, in a simple way, network structures into market equilibrium outcomes, so that payoff dispersion in networked markets is driven by players' network positions. Second, we show that the market equilibrium for our model converges to the so-called eigenvector centrality measure. The economic condition for convergence is that the players' discount factor goes to one. In particular, we show how the discount factor, the matching technology, and the network structure interact so that eigenvector centrality emerges as the limiting case of our market equilibrium.

We point out that the eigenvector approach is a way of finding the most central or relevant players in terms of the "global" structure of the network, paying less attention to patterns that are more "local". Mathematically, eigenvector centrality captures the relevance of players in the bargaining process using the eigenvector associated with the largest eigenvalue of the adjacency matrix of a given network. Thus our result may be viewed as an economic justification of the eigenvector approach in the context of bargaining in networked markets.
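To make the relationship concrete, the sketch below (our own illustration, using a made-up four-node network and the standard Bonacich formula b(beta) = (I - beta*A)^(-1) A*1; it is not the thesis's bargaining model) computes Bonacich centrality for increasing values of the decay parameter beta and shows the vector aligning with the leading eigenvector of the adjacency matrix as beta approaches its upper limit 1/lambda_max:

```python
import numpy as np

# Hypothetical 4-node undirected network (illustration only).
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 1],
              [1, 1, 0, 1],
              [0, 1, 1, 0]], dtype=float)
ones = np.ones(A.shape[0])

def bonacich(beta):
    """Bonacich centrality b = (I - beta*A)^(-1) A 1, defined for beta < 1/lambda_max."""
    return np.linalg.solve(np.eye(A.shape[0]) - beta * A, A @ ones)

# Eigenvector centrality: leading eigenvector of the adjacency matrix.
vals, vecs = np.linalg.eigh(A)
v = np.abs(vecs[:, np.argmax(vals)])

# For this network lambda_max = (1 + sqrt(17))/2, so 1/lambda_max is about 0.390.
for beta in (0.10, 0.30, 0.38, 0.389):
    b = bonacich(beta)
    b /= np.linalg.norm(b)
    print(f"beta={beta:.3f}  angle to eigenvector = {np.arccos(np.clip(b @ v, -1, 1)):.4f} rad")
```

The angle shrinks toward zero as beta rises, illustrating the limiting behaviour described above; in the thesis the limit is reached as the players' discount factor goes to one.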

As an application, we analyze the special case of seller-buyer networks, showing how our framework may be useful for analyzing price dispersion as a function of sellers' and buyers' network positions.

Finally, in Chapter 3 we study the problem of price competition and free entry in networked markets subject to congestion effects. In many environments, such as communication networks in which network flows are allocated, or transportation networks in which traffic is routed through the underlying road architecture, congestion plays an important role. In particular, we consider a network with multiple origins and a common destination node, where each link is owned by a firm that sets prices to maximize profits, while users minimize the total cost they face, given by the congestion cost plus the prices set by firms. In this environment, we introduce the notion of Markovian traffic equilibrium to establish the existence and uniqueness of a pure-strategy price equilibrium, without assuming that demand functions are concave or imposing particular functional forms for the latency functions. We derive explicit conditions that guarantee existence and uniqueness of equilibria. Given this result, we apply our framework to study entry decisions and welfare, and establish that in congested markets with free entry, the number of firms exceeds the social optimum.
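As a stylized illustration of price competition under congestion (our own two-link construction with linear latencies and a unit mass of users, not the general model of the chapter), the sketch below computes the Wardrop traffic split for given prices and iterates the firms' best responses to a price equilibrium:

```python
import numpy as np

# Two parallel links from one origin to a common destination, with linear
# latencies l_i(x) = a_i * x and one profit-maximizing firm per link.
a = np.array([1.0, 2.0])   # congestion slopes (hypothetical values)

def demand(p):
    """Wardrop split: users equalize latency + price across used links."""
    x1 = np.clip((a[1] + p[1] - p[0]) / (a[0] + a[1]), 0.0, 1.0)
    return np.array([x1, 1.0 - x1])

def best_response(i, p, grid=np.linspace(0.0, 3.0, 3001)):
    """Firm i's profit-maximizing price against the rival's current price."""
    profits = [pi * demand(np.where(np.arange(2) == i, pi, p))[i] for pi in grid]
    return grid[int(np.argmax(profits))]

p = np.array([0.5, 0.5])
for _ in range(100):                      # best-response dynamics
    p = np.array([best_response(0, p), best_response(1, p)])
print("equilibrium prices:", p, "traffic split:", demand(p))
```

With slopes (1, 2) the dynamics settle near prices (5/3, 4/3): the less congestible link charges the higher price and still carries the larger share of traffic.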

Relevance:

20.00%

Publisher:

Abstract:

This report contains results from the second cruise of the MODIS Optical Characterization Experiment (MOCE). Data presented here were obtained on the Mexican research vessel El Puma between 29 March and 13 April along the Pacific coast of Baja California and in the Gulf of California. Three types of data are reported: high spectral resolution radiometry at three depths for 13 stations; salinity, temperature, beam attenuation, and chlorophyll-a fluorescence profiles at the same stations; and total suspended matter and suspended organic carbon and nitrogen. (PDF is 90 pages.)

Relevance:

20.00%

Publisher:

Abstract:

Roughly one half of the world's languages are in danger of extinction. The endangered languages, spoken by minorities, typically compete with powerful languages such as English or Spanish. Consequently, speakers of minority languages must take into account that not everybody can speak their language, turning the language choice into a strategic, coordination-like situation. We show experimentally that the displacement of minority languages may be partially explained by imperfect information about the linguistic type of the partner, leading to frequent failure to coordinate on the minority language even between two speakers who can, and prefer to, use it. The extent of miscoordination correlates with how minoritarian a language is and with the real-life linguistic condition of the subjects: the more endangered a language, the harder it is to coordinate on its use, and the people on whom the language's survival most relies acquire behavioral strategies that lower its use. Our game-theoretical treatment of the issue provides a new perspective for linguistic policies.
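The strategic situation can be captured in a minimal expected-payoff calculation (our own stylized game with hypothetical payoffs, not the paper's experimental design): a bilingual speaker who prefers the minority language meets a random partner who is bilingual with probability q and must choose which language to open in.

```python
# Stylized language-coordination game: opening in the minority language yields
# payoff b > a if the partner is also bilingual, but a - c (a switching cost)
# if not; the majority language always yields a. All values are illustrative.
def uses_minority(q, a=1.0, b=1.5, c=0.5):
    """Minority language is chosen iff q*b + (1-q)*(a-c) > a."""
    return q * b + (1 - q) * (a - c) > a

threshold = 0.5 / (1.5 - 1.0 + 0.5)   # c / (b - a + c) = 0.5 here
print("threshold q* =", threshold)
for q in (0.3, 0.5, 0.7):
    print(f"q={q}: opens in minority language -> {uses_minority(q)}")
```

The threshold q > c / (b - a + c) shows why more endangered languages (lower q) go unused even among speakers who can, and prefer to, use them.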

Relevance:

20.00%

Publisher:

Abstract:

In the quest for a descriptive theory of decision-making, the rational actor model in economics imposes rather unrealistic expectations and abilities on human decision makers. The further we move from idealized scenarios, such as perfectly competitive markets, and ambitiously extend the reach of the theory to describe everyday decision-making situations, the less sense these assumptions make. Behavioural economics has instead proposed models based on assumptions that are more psychologically realistic, with the aim of gaining more precision and descriptive power. Increased psychological realism, however, comes at the cost of a greater number of parameters and greater model complexity. There is now a plethora of models, based on different assumptions and applicable in differing contextual settings, and selecting the right model to use tends to be an ad hoc process. In this thesis, we develop optimal experimental design methods and evaluate different behavioural theories against evidence from lab and field experiments.

We look first at evidence from controlled laboratory experiments, in which subjects are presented with choices between monetary gambles or lotteries. Different decision-making theories evaluate the choices differently and make distinct predictions about the subjects' choices, so theories whose predictions are inconsistent with the actual choices can be systematically eliminated. Behavioural theories can have multiple parameters, requiring complex experimental designs with a very large number of possible choice tests; this imposes computational and economic constraints on classical experimental design methods. We develop a methodology of adaptive tests, Bayesian Rapid Optimal Adaptive Designs (BROAD), that sequentially chooses the "most informative" test at each stage and, based on the response, updates its posterior beliefs over the theories, which in turn informs the next test to run. BROAD uses the Equivalence Class Edge Cutting (EC2) criterion to select tests. We prove that the EC2 criterion is adaptively submodular, which allows us to prove theoretical guarantees relative to the Bayes-optimal testing sequence even in the presence of noisy responses. In simulated ground-truth experiments, we find that the EC2 criterion recovers the true hypotheses with significantly fewer tests than more widely used criteria such as Information Gain and Generalized Binary Search. We show, theoretically as well as experimentally, that, surprisingly, these popular criteria can perform poorly in the presence of noise or subject errors. Furthermore, we use the adaptive submodularity of EC2 to implement an accelerated greedy version of BROAD, which leads to orders-of-magnitude speedups over other methods.
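The following is a minimal sketch of the selection logic (our simplification: noiseless responses, deterministic theory predictions drawn at random, and each theory forming its own equivalence class; the actual BROAD machinery handles noise and parametric theories). The greedy step picks the test that cuts the most prior mass of edges between still-distinguishable hypotheses:

```python
import numpy as np

rng = np.random.default_rng(0)
n_theories, n_tests = 4, 20
pred = rng.integers(0, 2, size=(n_theories, n_tests))  # pred[h, t]: predicted choice
p = np.full(n_theories, 0.25)                          # prior over theories

def edge_weight(mass):
    """Total weight of edges between distinct hypotheses: sum_{i<j} p_i * p_j."""
    return (mass.sum() ** 2 - (mass ** 2).sum()) / 2.0

def expected_cut(t, mass):
    """Edge weight removed by test t: pairs predicting different responses."""
    remaining = 0.0
    for o in (0, 1):
        alive = mass * (pred[:, t] == o)   # hypotheses consistent with outcome o
        remaining += edge_weight(alive)
    return edge_weight(mass) - remaining

truth = 2                                  # simulated "true" theory
mass, asked = p.copy(), []
while edge_weight(mass) > 1e-12 and len(asked) < n_tests:
    t = max((t for t in range(n_tests) if t not in asked),
            key=lambda t: expected_cut(t, mass))
    asked.append(t)
    mass = mass * (pred[:, t] == pred[truth, t])  # eliminate inconsistent theories
print("tests run:", asked, "-> surviving theory:", int(np.argmax(mass)))
```

In the noiseless case a test cuts exactly the edges between theories that predict different responses to it; adaptive submodularity is what extends the greedy guarantee to the noisy setting.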

We use BROAD to perform two experiments. First, we compare the main classes of theories for decision-making under risk, namely expected value, prospect theory, constant relative risk aversion (CRRA), and moments models. Subjects are given an initial endowment and sequentially presented with choices between two lotteries, with the possibility of losses. The lotteries are selected using BROAD, and 57 subjects from Caltech and UCLA are incentivized by randomly realizing one of the lotteries chosen. Aggregate posterior probabilities over the theories show limited evidence in favour of CRRA and moments models. Classifying the subjects into types shows that most subjects are described by prospect theory, followed by expected value. Adaptive experimental design raises the possibility that subjects could engage in strategic manipulation, i.e. mask their true preferences and choose differently in order to obtain more favourable tests in later rounds, thereby increasing their payoffs. We pay close attention to this problem; strategic manipulation is ruled out because it is infeasible in practice, and because we do not find any signature of it in our data.
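To illustrate how the theory classes score the same gamble differently (textbook Kahneman-Tversky functional forms with commonly cited parameter estimates, not the parameters fitted in the thesis), consider a 50/50 gamble over a gain of 50 and a loss of 40:

```python
import numpy as np

def expected_value(outcomes, probs):
    return float(np.dot(outcomes, probs))

def prospect_value(outcomes, probs, alpha=0.88, lam=2.25, gamma=0.61):
    """Prospect theory: concave value for gains, convex and steeper (loss
    aversion, lam > 1) for losses, with probability weighting w(p).
    A single weighting function is used here for simplicity."""
    out = np.array(outcomes, dtype=float)
    v = np.where(out >= 0, np.abs(out) ** alpha, -lam * np.abs(out) ** alpha)
    pr = np.array(probs, dtype=float)
    w = pr ** gamma / (pr ** gamma + (1 - pr) ** gamma) ** (1 / gamma)
    return float(np.dot(v, w))

lottery = ([50.0, -40.0], [0.5, 0.5])      # gain 50 or lose 40, equal odds
print("EV:", expected_value(*lottery))     # +5.0: an EV type accepts
print("PT:", prospect_value(*lottery))     # negative: a loss-averse type declines
```

Expected value is +5, so an expected-value type accepts, while the prospect-theory value is negative because the loss looms larger; choices of this kind are what BROAD selects among to separate the theories.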

In the second experiment, we compare the main theories of time preference: exponential discounting, hyperbolic discounting, the "present bias" models of quasi-hyperbolic (α, β) discounting and fixed cost discounting, and generalized-hyperbolic discounting. 40 subjects from UCLA were given choices between two options: a smaller but more immediate payoff versus a larger but later payoff. We found very limited evidence for the present bias models and hyperbolic discounting; most subjects were classified as generalized-hyperbolic discounting types, followed by exponential discounting.
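For reference, the competing discount functions have simple closed forms (standard textbook parameterizations; the quasi-hyperbolic model is written in the common (β, δ) notation, and all parameter values below are arbitrary, for illustration only):

```python
import numpy as np

def exponential(t, delta=0.9):
    return delta ** t

def quasi_hyperbolic(t, beta=0.7, delta=0.95):
    """'Present bias': full weight today, an extra one-time discount beta after."""
    return np.where(t == 0, 1.0, beta * delta ** t)

def generalized_hyperbolic(t, alpha=1.0, beta=2.0):
    """Loewenstein-Prelec form; nests simple hyperbolic discounting."""
    return (1.0 + alpha * t) ** (-beta / alpha)

ts = np.arange(0, 6, dtype=float)
for name, f in [("exponential", exponential),
                ("quasi-hyperbolic", quasi_hyperbolic),
                ("generalized hyperbolic", generalized_hyperbolic)]:
    print(f"{name:24s}", np.round(f(ts), 3))
```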

In these models the passage of time is linear. We instead consider a psychological model in which the perception of time is subjective. We prove that when biological (subjective) time is positively dependent, it gives rise to hyperbolic discounting and temporal choice inconsistency.
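A standard way to see the connection (an illustrative special case, not the thesis's more general proof about positively dependent subjective time): exponential discounting at rate ρ over a logarithmically compressed, Weber-Fechner-style subjective clock yields a generalized-hyperbolic discount function in calendar time,

```latex
D(t) = e^{-\rho\,\tau(t)}, \qquad
\tau(t) = \frac{1}{k}\,\ln(1 + kt)
\quad\Longrightarrow\quad
D(t) = (1 + kt)^{-\rho/k},
```

which is dynamically inconsistent in calendar time even though it is perfectly consistent on the subjective clock.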

We also test the predictions of behavioural theories in the "wild", paying particular attention to prospect theory, which emerged as the dominant theory in our lab experiments on risky choice. Loss aversion and reference dependence predict that consumers will behave in ways distinct from what the standard rational model predicts. Specifically, loss aversion predicts that when an item is offered at a discount, the demand for it will be greater than its price elasticity alone explains. Even more importantly, when the item is no longer discounted, demand for its close substitutes should increase excessively. We tested this prediction using a discrete choice model with a loss-averse utility function on data from a large eCommerce retailer. Not only did we identify loss aversion, but we also found that the effect decreased with consumers' experience. We outline the policy implications of consumer loss aversion, and strategies for competitive pricing.
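A minimal version of such a model (our illustrative specification with hypothetical numbers, not the estimated model from the retailer data): a multinomial logit over two close substitutes, where utility penalizes prices above a reference point, here the last observed price, more heavily than it rewards prices below it:

```python
import numpy as np

def choice_probs(prices, ref_prices, quality, beta_p=1.0, lam=2.0, eta=0.5):
    """Multinomial logit shares with a loss-averse gain/loss term (lam > 1)."""
    gap = prices - ref_prices                       # positive gap = a "loss"
    gain_loss = np.where(gap > 0, -lam * eta * gap, -eta * gap)
    u = quality - beta_p * prices + gain_loss       # deterministic utility
    expu = np.exp(u - u.max())
    return expu / expu.sum()

quality = np.array([3.0, 3.0])                      # two close substitutes
# Item 0 discounted from 2.0 to 1.5: its share rises beyond the pure price effect.
print(choice_probs(np.array([1.5, 2.0]), np.array([2.0, 2.0]), quality))
# Discount ends: item 0 back at 2.0 registers as a loss against the 1.5 reference,
# shifting demand to the substitute.
print(choice_probs(np.array([2.0, 2.0]), np.array([1.5, 2.0]), quality))
```

The first line shows the discount boosting the discounted item's share; the second shows demand shifting to the close substitute once the discount ends, which is the asymmetry the field test looks for.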

In future work, BROAD can be applied widely for testing different behavioural models, e.g. in social preference and game theory, and in different contextual settings. Additional measurements beyond choice data, including biological measurements such as skin conductance, can be used to eliminate hypotheses more rapidly and speed up model comparison. Discrete choice models also provide a framework for testing behavioural models with field data, and encourage combined lab-field experiments.

Relevance:

20.00%

Publisher:

Abstract:

The Chesapeake and Delaware Canal is a man-made waterway connecting the upper Chesapeake Bay with the Delaware Bay. It opened in 1829 as a private barge canal with locks: two at the Delaware end and one at the Chesapeake end. For the most part, natural tidal and non-tidal waterways were connected by short dredged sections to form the original canal. In 1927 the C and D Canal was converted to a sea-level canal with a controlling depth of 14 feet and a width of 150 feet. In 1938 the canal was deepened to 27 feet, with a channel width of 250 feet. Channel side slopes were dredged at 2.5:1, making the total width of the waterway at least 385 feet in those segments representing new cuts or having shore spoil-area dykes rising above sea level. In 1954 Congress authorized a further enlargement of the canal to a depth of 35 feet and a channel width of 450 feet. (PDF contains 27 pages.)
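The 385-foot figure follows directly from the channel geometry: each 2.5:1 side slope adds 2.5 feet of width per foot of depth, so at the 27-foot depth the width at the waterline is at least 250 + 2 × (2.5 × 27) = 250 + 135 = 385 feet.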

Relevance:

20.00%

Publisher:

Abstract:

Recent research has shown that the biomass of bacteria in lakes and other water bodies can attain significant values. The huge production of bacteria is brought about by their great rate of reproduction, and in a number of cases their biomass exceeds the biomass of phytoplankton. Therefore, in a study of the biological productivity of water bodies it is necessary to calculate the biomass and production not only of the phyto- and zooplankton, but also of the bacteria. The authors use different methods and formulae to compare the generation time of the bacteria.
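A standard calculation of this kind (our illustration with made-up counts; the paper compares several such methods and formulae) infers the generation time from two counts taken a known interval apart, assuming exponential growth:

```python
import math

# Generation (doubling) time from two direct counts under exponential growth:
# N_t = N_0 * 2^(t/g)  =>  g = t * ln 2 / ln(N_t / N_0).
def generation_time(n0, nt, hours):
    return hours * math.log(2) / math.log(nt / n0)

# Hypothetical counts: 2e6 cells/ml growing to 8e6 cells/ml in 24 hours.
print(generation_time(2.0e6, 8.0e6, 24.0))   # 12.0 hours per generation
```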

Relevance:

20.00%

Publisher:

Abstract:

The original method proposed by Yentsch (1957) for determining chlorophyll directly in the cells attracts attention by its simplicity. To measure the chlorophyll content by this method, a measured volume of algal suspension is filtered through a membrane filter. The filter is dried slightly, clarified with immersion oil, clamped between two glass slides, and measured in a spectrophotometer. Extinction is read at two wavelengths: 670 millimicrons (near the absorption maximum of chlorophyll a in the cell) and 750 millimicrons (a correction for non-specific absorption and scattering of light by particles in the preparation). The method of Yentsch was employed by the authors for the determination of chlorophyll a in samples of phytoplankton. They conclude that, in spite of the simplicity and convenience of the determination, the method must be applied with sufficient care. It is more suitable for the analysis of algal cultures, where non-specific absorption of light is insignificant.
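The role of the second wavelength can be shown in a short calculation (hypothetical readings; the calibration constant and volume handling below are our assumptions, not Yentsch's published coefficients):

```python
# The 750-nm extinction estimates non-specific absorption and scattering and
# is subtracted from the 670-nm reading taken near the chlorophyll-a maximum.
e670 = 0.215   # extinction at 670 millimicrons (hypothetical reading)
e750 = 0.040   # extinction at 750 millimicrons: non-pigment baseline
corrected = e670 - e750

# Concentration then scales with the corrected extinction via a calibration
# factor K and the filtered volume V (both instrument- and method-specific).
K, V = 13.4, 0.5             # hypothetical calibration constant, litres filtered
chl = K * corrected / V      # chlorophyll a, in the calibration's units
print(corrected, chl)
```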

Relevance:

20.00%

Publisher:

Abstract:

As is known, copepods play an important role in the nutrition of fish. Therefore, with a view to facilitating research on the quantitative side of feeding, a considerable number of papers have recently appeared devoted to methods for determining the wet weight of these crustaceans. To facilitate such research further, it would be of great interest to clarify whether there is some regularity in the growth of the crustaceans during metamorphosis and, if there is, whether the length of the larvae at each stage can be determined not by measuring them but by using formulae derived from these regularities. This article examines the growth curves of different species of freshwater Copepoda, obtained from experimental observations in cultures or by measurement of mass material at all stages of development in samples from water bodies. The authors study in particular the ratio of the mean diameter of the eggs to the mean length of the egg-bearing females.
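One regularity of the kind sought here is geometric stage-to-stage growth, i.e. a roughly constant ratio between the lengths of successive stages (Brooks' rule). The sketch below (hypothetical lengths, our own fit, not the article's data) estimates that ratio and predicts stage lengths from it:

```python
import numpy as np

# Fit a constant stage-to-stage growth factor r by regressing log length on
# stage number; the lengths are made-up illustrative values in millimetres.
stages = np.arange(1, 7)
lengths = np.array([0.18, 0.23, 0.30, 0.38, 0.50, 0.64])

slope, intercept = np.polyfit(stages, np.log(lengths), 1)
r = np.exp(slope)
print(f"growth factor r = {r:.3f}")

# Fitted geometric curve: length at stage n = exp(intercept) * r**n.
predicted = np.exp(intercept + slope * stages)
print(np.round(predicted, 3))   # lengths at each stage without direct measurement
```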

Relevance:

20.00%

Publisher:

Abstract:

A person living in an industrialized society has almost no choice but to receive information daily with negative implications for himself or others: his attention is regularly drawn to the ups and downs of economic indicators or to the alleged misdeeds of leaders and organizations. Reacting to new information is central to economics, but economics typically ignores the affective aspect of the response, for example stress or anger. These essays present the results of examining how the affective aspect of the response can influence economic outcomes.

The first chapter presents an experiment in which individuals were presented with information about various non-profit organizations and allowed to take actions that rewarded or punished those organizations. When social interaction was introduced into this environment, an asymmetry between rewarding and punishing appeared: the net effects of punishment became greater and more variable, whereas the effects of reward were unchanged. Individuals were more strongly influenced by negative social information and used that information to target unpopular organizations. These behaviors contributed to an increase in inequality among the outcomes of the organizations.

The second and third chapters present empirical studies of reactions to negative information about local economic conditions. Economic factors are among the most prevalent stressors, and stress is known to have numerous negative effects on health. These chapters document localized, transient effects of announcements of large-scale job losses: news of mass layoffs and shutdowns of large military bases is found to decrease birth weights and gestational ages among babies born in the affected regions. The effect magnitudes are close to those estimated in similar studies of disasters.