878 results for bio-economics


Relevance:

20.00%

Publisher:

Abstract:

This paper deals with the economics of gasification facilities in general and IGCC power plants in particular. Regarding the prospects of these systems, passing the technological test is one thing; passing the economic test can be quite another. Traditional valuations assume constant input and/or output prices; since this is hardly realistic, we allow for uncertainty in prices. We naturally look at the markets where many of the products involved are regularly traded. Futures markets on commodities are particularly useful for valuing uncertain future cash flows, so revenues and variable costs can be assessed by means of sound financial concepts and actual market data. On the other hand, these complex systems provide a number of flexibility options (e.g., the choice among several inputs, outputs, modes of operation, etc.). Typically, flexibility contributes significantly to the overall value of real assets; indeed, maximizing the asset value requires the optimal exercise of any flexibility option available. Yet the economic value of flexibility is elusive, the more so under price uncertainty, and the right choice of input fuels and/or output products is a main concern for facility managers. As a particular application, we deal with the valuation of input flexibility, following the Real Options approach. In addition to economic variables, we also address technical and environmental issues such as energy efficiency, utility performance characteristics, and emissions (note that carbon constraints are looming). Lastly, we provide a brief introduction to some stochastic processes suitable for valuation purposes.
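As a minimal sketch of the kind of stochastic process the abstract alludes to, geometric Brownian motion is a common starting model for commodity prices under uncertainty; the process choice and the parameters below are illustrative assumptions, not taken from the paper.

```python
import math
import random

def simulate_gbm(s0, mu, sigma, t_years, n_steps, seed=0):
    """Simulate one geometric Brownian motion price path:
    S_{t+dt} = S_t * exp((mu - sigma^2/2) dt + sigma sqrt(dt) Z)."""
    rng = random.Random(seed)
    dt = t_years / n_steps
    path = [s0]
    for _ in range(n_steps):
        z = rng.gauss(0.0, 1.0)
        path.append(path[-1] * math.exp((mu - 0.5 * sigma ** 2) * dt
                                        + sigma * math.sqrt(dt) * z))
    return path

# One year of daily prices for a hypothetical fuel starting at 100
fuel_prices = simulate_gbm(100.0, 0.05, 0.2, 1.0, 252)
```

Simulated paths like this can feed a Monte Carlo valuation in which the flexibility option (e.g., switching input fuels) is exercised optimally along each path.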

Relevance:

20.00%

Publisher:

Abstract:

Nano-fibrillar arrays are fabricated from polystyrene; the average fiber diameter is about 300 nm. Experiments show that such a fibrillar surface is relatively hydrophobic, with a water contact angle of 142 degrees. We focus mainly on nanoscale friction properties. The friction force of polystyrene nano-fibrillar surfaces is markedly enhanced relative to smooth polystyrene surfaces. The apparent coefficient of friction increases with the applied load but is independent of the scanning speed. Interestingly, the friction force increases almost linearly with the real contact area, consistent with the fundamental Bowden-Tabor law of nanoscale friction.
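The Bowden-Tabor law referenced above states that friction force equals the interfacial shear strength times the real contact area, F = τ·A_real, so the apparent coefficient of friction is F/W. A minimal numerical sketch with illustrative values (not the experimental data):

```python
def friction_force(shear_strength_pa, real_contact_area_m2):
    """Bowden-Tabor law: F = tau * A_real."""
    return shear_strength_pa * real_contact_area_m2

def apparent_cof(friction_n, normal_load_n):
    """Apparent coefficient of friction: mu = F / W."""
    return friction_n / normal_load_n

# Illustrative numbers only: a larger real contact area at the same load
# yields a larger friction force and hence a larger apparent COF.
f = friction_force(2.0, 3.0)   # tau = 2 Pa, A_real = 3 m^2 -> F = 6 N
mu = apparent_cof(f, 12.0)     # W = 12 N -> mu = 0.5
```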

Relevance:

20.00%

Publisher:

Abstract:

How is climate change affecting our coastal environment? How can coastal communities adapt to sea level rise and increased storm risk? These questions have garnered tremendous interest from scientists and policy makers alike, as the dynamic coastal environment is particularly vulnerable to the impacts of climate change. Over half the world's population lives and works in a coastal zone less than 120 miles wide and is therefore continuously affected by changes in the coastal environment [6]. Housing markets are directly influenced by the physical processes that govern coastal systems. Beach towns like Oak Island in North Carolina (NC) face severe erosion, and the tax-assessed value of one coastal property there fell by 93% in 2007 [9]. With almost ninety percent of the sandy beaches in the US facing moderate to severe erosion [8], coastal communities often intervene to stabilize the shoreline and hold back the sea in order to protect coastal property and infrastructure. Beach nourishment, the periodic rebuilding of an eroding section of beach with sand dredged from another location, is a common erosion-control policy along the US Atlantic and Pacific coasts [3]. Beach nourishment projects in the United States are primarily federally funded and implemented by the Army Corps of Engineers (ACE) after a benefit-cost analysis. Benefits from beach nourishment include reduced storm damage and the recreational value of a wider beach. Costs include the expected cost of construction, the present value of periodic maintenance, and any external costs, such as the environmental cost associated with a nourishment project (NOAA). Federal appropriations for nourishment totaled $787 million from 1995 to 2002 [10]. Human interventions to stabilize shorelines and physical coastal dynamics are strongly coupled.
The value of the beach, in the form of storm protection and recreation amenities, is at least partly capitalized into property values. These beach values ultimately influence the benefit-cost analysis in support of shoreline stabilization policy, which, in turn, affects the shoreline dynamics. This paper explores the policy implications of this circularity. With a better understanding of the physical-economic feedbacks, policy makers can more effectively design climate change adaptation strategies. (PDF contains 4 pages)
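The benefit-cost comparison described above can be sketched as a present-value calculation over the project horizon; the cash-flow structure, interval, and discount rate below are illustrative assumptions, not figures from the paper.

```python
def present_value(cashflows, rate):
    """Discount a list of (year, amount) cash flows to present value."""
    return sum(amount / (1 + rate) ** year for year, amount in cashflows)

def nourishment_npv(initial_cost, renourish_cost, interval_years,
                    annual_benefit, horizon_years, rate):
    """NPV of a hypothetical nourishment project: annual storm-protection
    and recreation benefits, minus up-front construction and periodic
    re-nourishment costs (all values illustrative)."""
    benefits = [(t, annual_benefit) for t in range(1, horizon_years + 1)]
    costs = [(0, initial_cost)] + [
        (t, renourish_cost)
        for t in range(interval_years, horizon_years + 1, interval_years)
    ]
    return present_value(benefits, rate) - present_value(costs, rate)

# Hypothetical project: $1000 construction, $300 re-nourishment every
# 4 years, $200/year in benefits, 20-year horizon, 5% discount rate.
npv = nourishment_npv(1000.0, 300.0, 4, 200.0, 20, 0.05)
```

Because the beach value capitalized into property prices enters `annual_benefit`, the circularity discussed in the paper appears here directly: higher capitalized values raise the NPV, which supports more nourishment, which in turn shapes the shoreline and property values.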

Relevance:

20.00%

Publisher:

Abstract:

Regulatory action to protect California’s coastal water quality from degradation by copper from recreational boats’ antifouling paints interacts with efforts to prevent the transport of invasive, hull-fouling species. A copper regulatory program is in place for a major yacht basin in northern San Diego Bay and in progress for other major California boat basins. “Companion” fouling control strategies are used alongside copper-based antifouling paints, as some invasive species have developed resistance to the copper biocide; such strategies are critical for boats with less-toxic or nontoxic hull coatings. Boat traffic along more than 3,000 miles of coastline in California and Baja California increases the risk of invasive species transport. For example, 80% of boats in Baja California marinas are from the United States, especially California. Policy makers, boating businesses, and boat owners need information on costs and supply-side capacity for effective fouling control measures to co-manage water quality and invasive species concerns. (PDF contains 3 pages)

Relevance:

20.00%

Publisher:

Abstract:

This thesis examines foundational questions in behavioral economics (also called psychology and economics) and the neural foundations of varied sources of utility. We have three primary aims: first, to provide the field of behavioral economics with psychological theories of behavior that are derived from neuroscience, and to use those theories to identify novel evidence for behavioral biases; second, to provide neural and micro foundations for the behavioral preferences that give rise to well-documented empirical phenomena in behavioral economics; and finally, to show how a deep understanding of the neural foundations of these behavioral preferences can feed back into our theories of social preferences and reference-dependent utility.

The first chapter focuses on classical conditioning and its application in identifying the psychological underpinnings of a pricing phenomenon. We return to classical conditioning again in the third chapter where we use fMRI to identify varied sources of utility—here, reference dependent versus direct utility—and cross-validate our interpretation with a conditioning experiment. The second chapter engages social preferences and, more broadly, causative utility (wherein the decision-maker derives utility from making or avoiding particular choices).

Relevance:

20.00%

Publisher:

Abstract:

One of the major problems in the mass production of sugpo is how to obtain a constant supply of fry. Since ultimately it is the private sector which should produce the sugpo fry to fill the needs of the industry, the Barangay Hatchery Project under the Prawn Program of the Aquaculture Department of SEAFDEC has scaled down the hatchery technology from large tanks to a level which can be adopted by the private sector, especially in the villages, with a minimum of financial and technical inputs. This guide to small-scale hatchery operations is expected to generate more enthusiasm among fish farmers interested in venturing into sugpo culture.

Relevance:

20.00%

Publisher:

Abstract:

Biological machines are active devices composed of cells and other biological components. These functional devices are best suited to physiological environments that support cellular function and survival. Biological machines have the potential to revolutionize the engineering of biomedical devices intended for implantation, where the human body can provide the required physiological environment. For engineering such cell-based machines, bio-inspired design can serve as a guiding platform, as it provides functionally proven designs that are attainable by living cells. In the present work, a systematic approach was used to tissue-engineer one such machine by exclusively using biological building blocks and employing a bio-inspired design. Valveless impedance pumps were constructed based on the working principles of the embryonic vertebrate heart, using cells and tissue derived from rats. The function of these tissue-engineered muscular pumps was characterized by exploring their spatiotemporal and flow behavior in order to better understand the capabilities and limitations of cells when used as the engines of biological machines.

Relevance:

20.00%

Publisher:

Abstract:

This thesis belongs to the growing field of economic networks. In particular, we develop three essays in which we study the problem of bargaining, discrete choice representation, and pricing in the context of networked markets. Despite analyzing very different problems, the three essays share the common feature of making use of a network representation to describe the market of interest.

In Chapter 1 we present an analysis of bargaining in networked markets. We make two contributions. First, we characterize market equilibria in a bargaining model and find that players' equilibrium payoffs coincide with their degree of centrality in the network, as measured by Bonacich's centrality measure. This characterization allows us to map, in a simple way, network structures into market equilibrium outcomes, so that payoff dispersion in networked markets is driven by players' network positions. Second, we show that the market equilibrium for our model converges to the so-called eigenvector centrality measure. The economic condition for convergence is that the players' discount factor goes to one. In particular, we show how the discount factor, the matching technology, and the network structure interact to yield eigenvector centrality as the limiting case of our market equilibrium.

We point out that the eigenvector approach is a way of finding the most central or relevant players in terms of the “global” structure of the network, paying less attention to patterns that are more “local”. Mathematically, eigenvector centrality captures the relevance of players in the bargaining process using the eigenvector associated with the largest eigenvalue of the adjacency matrix of a given network. Our result may thus be viewed as an economic justification of the eigenvector approach in the context of bargaining in networked markets.
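For illustration, eigenvector centrality can be computed by power iteration on the adjacency matrix; this is a standard numerical method, not the chapter's bargaining model itself. The identity shift below is a common trick so that the iteration converges even on bipartite networks, where the largest and smallest eigenvalues have equal magnitude.

```python
def eigenvector_centrality(adj, iters=1000, tol=1e-12):
    """Power iteration on (A + I), which shares A's eigenvectors but
    makes the dominant eigenvalue strictly largest in magnitude.
    Returns the centrality vector normalized to sum to 1."""
    n = len(adj)
    x = [1.0 / n] * n
    for _ in range(iters):
        # y = (A + I) x
        y = [x[i] + sum(adj[i][j] * x[j] for j in range(n)) for i in range(n)]
        s = sum(y)
        y = [v / s for v in y]
        if max(abs(a - b) for a, b in zip(x, y)) < tol:
            break
        x = y
    return y

# Three players on a line: the middle player, who can bargain with both
# others, gets the highest centrality.
c = eigenvector_centrality([[0, 1, 0], [1, 0, 1], [0, 1, 0]])
```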

As an application, we analyze the special case of seller-buyer networks, showing how our framework may be useful for analyzing price dispersion as a function of sellers and buyers' network positions.

Finally, in Chapter 3 we study the problem of price competition and free entry in networked markets subject to congestion effects. In many environments, such as communication networks in which network flows are allocated, or transportation networks in which traffic is directed through the underlying road architecture, congestion plays an important role. In particular, we consider a network with multiple origins and a common destination node, where each link is owned by a firm that sets prices in order to maximize profits, whereas users want to minimize the total cost they face, which is given by the congestion cost plus the prices set by firms. In this environment, we introduce the notion of Markovian traffic equilibrium to establish the existence and uniqueness of a pure-strategy price equilibrium, without assuming that the demand functions are concave or imposing particular functional forms for the latency functions. We derive explicit conditions that guarantee existence and uniqueness of equilibria. Given this result, we apply our framework to study entry decisions and welfare, and establish that in congested markets with free entry, the number of firms exceeds the social optimum.
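As a toy illustration of the equilibrium logic (a simple Wardrop-style split, not the chapter's Markovian traffic equilibrium), consider two parallel links with latency functions and fixed prices: users divide so that the total per-user cost, latency plus price, is equalized across the links that carry traffic. All functional forms and numbers are illustrative.

```python
def two_link_equilibrium(demand, latency1, latency2, price1, price2, tol=1e-9):
    """Bisection for the flow x on link 1 such that
    latency1(x) + price1 == latency2(demand - x) + price2,
    or a corner solution if one link dominates everywhere."""
    def gap(x):
        return (latency1(x) + price1) - (latency2(demand - x) + price2)
    if gap(0.0) >= 0:        # link 1 costlier even when empty
        return 0.0
    if gap(demand) <= 0:     # link 1 cheaper even when fully loaded
        return demand
    lo, hi = 0.0, demand
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if gap(mid) < 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Symmetric links split demand evenly; raising one firm's price
# shifts traffic toward its competitor.
x = two_link_equilibrium(1.0, lambda f: f, lambda f: f, 0.0, 0.0)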

Relevance:

20.00%

Publisher:

Abstract:

Lipid bilayer membranes are models for cell membranes, the structures that help regulate cell function. Cell membranes are heterogeneous, and the coupling between composition and shape gives rise to complex behaviors that are important to regulation. This thesis seeks to systematically build and analyze complete models to understand the behavior of multi-component membranes.

We propose a model and use it to derive the equilibrium and stability conditions for a general class of closed multi-component biological membranes. Our analysis shows that the critical modes of these membranes have high frequencies, unlike single-component vesicles, and their stability depends on system size, unlike in systems undergoing spinodal decomposition in flat space. An important implication is that small perturbations may nucleate localized but very large deformations. We compare these results with experimental observations.

We also study open membranes to gain insight into long tubular membranes that arise, for example, in nerve cells. We derive a complete system of equations for open membranes using the principle of virtual work. Our linear stability analysis predicts that tubular membranes tend to adopt coiling shapes if the tension is small, cylindrical shapes if the tension is moderate, and beading shapes if the tension is large. This is consistent with experimental observations on nerve fibers reported in the literature. Further, we provide numerical solutions to the fully nonlinear equilibrium equations for some problems, and show that the observed mode shapes are consistent with those suggested by linear stability. Our work also proves that beading of nerve fibers can appear purely as a mechanical response of the membrane.

Relevance:

20.00%

Publisher:

Abstract:

Roughly half of the world's languages are in danger of extinction. Endangered languages, spoken by minorities, typically compete with powerful languages such as English or Spanish. Consequently, speakers of minority languages must consider that not everybody can speak their language, which turns language choice into a strategic, coordination-like situation. We show experimentally that the displacement of minority languages may be partially explained by imperfect information about the linguistic type of the partner, leading to frequent failure to coordinate on the minority language even between two speakers who can and prefer to use it. The extent of miscoordination correlates with how minoritarian a language is and with the real-life linguistic condition of the subjects: the more endangered a language, the harder it is to coordinate on its use, and the people on whom the language's survival most relies acquire behavioral strategies that lower its use. Our game-theoretical treatment of the issue provides a new perspective for linguistic policies.
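A minimal sketch of the coordination problem under imperfect information (the payoff numbers and functional form are illustrative, not the experimental parameters): opening in the minority language pays off only if the partner both speaks it and chooses it, while the majority language always coordinates.

```python
def expected_payoff_minority(p_bilingual, q_use_minority,
                             v_minority=3.0, v_fail=0.0):
    """Expected payoff of opening in the minority language when a
    fraction p_bilingual of partners speak it and a bilingual partner
    uses it with probability q_use_minority (illustrative payoffs)."""
    p_coord = p_bilingual * q_use_minority
    return p_coord * v_minority + (1 - p_coord) * v_fail

def expected_payoff_majority(v_majority=2.0):
    """The majority language coordinates with every partner."""
    return v_majority
```

Even though coordinating on the minority language is preferred (3 > 2), choosing it is only a best reply when `p_bilingual * q_use_minority` exceeds 2/3 here; as a language becomes more minoritarian, the safe majority language dominates, which mirrors the displacement mechanism described above.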

Relevance:

20.00%

Publisher:

Abstract:

Optical microscopy has become an indispensable tool for biological research since its invention, mostly owing to its sub-cellular spatial resolution, non-invasiveness, instrumental simplicity, and the intuitive observations it provides. Nonetheless, obtaining reliable, quantitative spatial information from conventional wide-field optical microscopy is not as straightforward as it appears. This is because, in the images acquired by wide-field optical microscopy, information about out-of-focus regions is spatially blurred and mixed with in-focus information. In other words, conventional wide-field optical microscopy transforms the three-dimensional (volumetric) information about an object into a two-dimensional form in each acquired image, and therefore distorts the spatial information about the object. Several fluorescence holography-based methods have demonstrated the ability to obtain three-dimensional information about objects, but these methods generally rely on decomposing stereoscopic visualizations to extract volumetric information and are unable to resolve complex three-dimensional structures such as a multi-layer sphere.

The concept of optical-sectioning techniques, on the other hand, is to detect only two-dimensional information about an object at each acquisition. Specifically, each image obtained by optical-sectioning techniques contains mainly the information about an optically thin layer inside the object, as if only a thin histological section is being observed at a time. Using such a methodology, obtaining undistorted volumetric information about the object simply requires taking images of the object at sequential depths.

Among existing methods of obtaining volumetric information, the practicability of optical sectioning has made it the most commonly used and most powerful one in biological science. However, when applied to imaging living biological systems, conventional single-point-scanning optical-sectioning techniques often cause some degree of photodamage because of the high focal intensity at the scanning point. To overcome this issue, several wide-field optical-sectioning techniques have been proposed and demonstrated, although not without introducing new limitations and compromises, such as low signal-to-background ratios and reduced axial resolution. As a result, single-point-scanning optical-sectioning techniques remain the most widely used instrumentation for volumetric imaging of living biological systems to date.

To develop wide-field optical-sectioning techniques with optical performance equivalent to that of single-point-scanning ones, this thesis first introduces the mechanisms and limitations of existing wide-field optical-sectioning techniques, and then presents our innovations aimed at overcoming these limitations. We demonstrate, theoretically and experimentally, that our proposed wide-field optical-sectioning techniques can achieve diffraction-limited optical sectioning, low out-of-focus excitation, and high-frame-rate imaging in living biological systems. In addition to these imaging capabilities, our proposed techniques can be instrumentally simple and economical, and are straightforward to implement on conventional wide-field microscopes. Together, these advantages show the potential of our innovations to be widely used for high-speed, volumetric fluorescence imaging of living biological systems.

Relevance:

20.00%

Publisher:

Abstract:

In the quest for a descriptive theory of decision-making, the rational actor model in economics imposes rather unrealistic expectations and abilities on human decision makers. The further we move from idealized scenarios, such as perfectly competitive markets, and ambitiously extend the reach of the theory to describe everyday decision-making situations, the less sense these assumptions make. Behavioural economics has instead proposed models based on assumptions that are more psychologically realistic, with the aim of gaining more precision and descriptive power. Increased psychological realism, however, comes at the cost of a greater number of parameters and greater model complexity. There is now a plethora of models, based on different assumptions and applicable in differing contextual settings, and selecting the right model to use tends to be an ad hoc process. In this thesis, we develop optimal experimental design methods and evaluate different behavioural theories against evidence from lab and field experiments.

We look at evidence from controlled laboratory experiments in which subjects are presented with choices between monetary gambles, or lotteries. Different decision-making theories evaluate the choices differently and make distinct predictions about the subjects' choices; theories whose predictions are inconsistent with the actual choices can be systematically eliminated. Behavioural theories can have multiple parameters, requiring complex experimental designs with a very large number of possible choice tests, which imposes computational and economic constraints on classical experimental design methods. We develop a methodology of adaptive tests, Bayesian Rapid Optimal Adaptive Designs (BROAD), that sequentially chooses the "most informative" test at each stage and, based on the response, updates its posterior beliefs over the theories, which in turn informs the next most informative test to run. BROAD uses the Equivalence Class Edge Cutting (EC2) criterion to select tests. We prove that the EC2 criterion is adaptively submodular, which allows us to prove theoretical guarantees relative to the Bayes-optimal testing sequence even in the presence of noisy responses. In simulated ground-truth experiments, we find that the EC2 criterion recovers the true hypotheses with significantly fewer tests than more widely used criteria such as Information Gain and Generalized Binary Search. We show, theoretically as well as experimentally, that, surprisingly, these popular criteria can perform poorly in the presence of noise or subject errors. Furthermore, we use the adaptive submodularity of EC2 to implement an accelerated greedy version of BROAD, which leads to orders-of-magnitude speedups over other methods.
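The posterior-updating step at the heart of an adaptive design like BROAD is a one-line application of Bayes' rule over the candidate theories; the theory names and likelihood values below are hypothetical placeholders, not the thesis's models.

```python
def update_posterior(prior, likelihoods, observed_choice):
    """Bayes update of beliefs over candidate theories after one
    binary choice. prior: {theory: probability};
    likelihoods: {theory: P(subject chooses option "A")}."""
    post = {}
    for theory, p in prior.items():
        like = likelihoods[theory] if observed_choice == "A" else 1.0 - likelihoods[theory]
        post[theory] = p * like
    z = sum(post.values())
    return {t: v / z for t, v in post.items()}

# Hypothetical example: two theories, "EV" strongly predicts choice A.
posterior = update_posterior({"EV": 0.5, "PT": 0.5},
                             {"EV": 0.9, "PT": 0.1}, "A")
```

In the adaptive loop, the updated posterior is what the test-selection criterion (EC2 in BROAD's case) consults to pick the next most informative lottery pair.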

We use BROAD to perform two experiments. First, we compare the main classes of theories for decision-making under risk, namely expected value, prospect theory, constant relative risk aversion (CRRA), and moments models. Subjects are given an initial endowment and sequentially presented with choices between two lotteries, with the possibility of losses. The lotteries are selected using BROAD, and 57 subjects from Caltech and UCLA are incentivized by randomly realizing one of the chosen lotteries. Aggregate posterior probabilities over the theories show limited evidence in favour of CRRA and moments models. Classifying the subjects into types shows that most subjects are described by prospect theory, followed by expected value. Adaptive experimental design raises the possibility that subjects could engage in strategic manipulation, i.e., mask their true preferences and choose differently in order to obtain more favourable tests in later rounds, thereby increasing their payoffs. We pay close attention to this problem; strategic manipulation is ruled out both because it is infeasible in practice and because we find no signatures of it in our data.

In the second experiment, we compare the main theories of time preference: exponential discounting, hyperbolic discounting, the "present bias" models of quasi-hyperbolic (α, β) discounting and fixed-cost discounting, and generalized-hyperbolic discounting. Forty subjects from UCLA were given choices between two options: a smaller but more immediate payoff versus a larger but later payoff. We found very limited evidence for present-bias models and hyperbolic discounting; most subjects were classified as generalized-hyperbolic discounting types, followed by exponential discounting.
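The basic discount functions being compared can be written compactly; these are the textbook forms of three of the theories named above, with illustrative parameter values in the usage comment.

```python
import math

def exponential_discount(delay, rate):
    """D(t) = e^{-rt}: constant discount rate, time-consistent."""
    return math.exp(-rate * delay)

def hyperbolic_discount(delay, k):
    """D(t) = 1 / (1 + kt): discount rate declines with delay,
    producing temporal choice inconsistency."""
    return 1.0 / (1.0 + k * delay)

def quasi_hyperbolic_discount(delay, beta, delta):
    """Beta-delta form: immediate payoffs are undiscounted, and every
    delayed payoff is additionally scaled by a present-bias factor beta."""
    return 1.0 if delay == 0 else beta * delta ** delay

# e.g. quasi_hyperbolic_discount(1, beta=0.7, delta=0.95) applies the
# one-off present-bias penalty plus one period of exponential discounting.
```

The time-inconsistency of the hyperbolic form is easy to see: the discount ratio between periods 0 and 1 is much smaller than between periods 10 and 11, so preferences between a fixed pair of rewards can reverse as they draw nearer.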

In these models the passage of time is linear. We instead consider a psychological model where the perception of time is subjective. We prove that when the biological (subjective) time is positively dependent, it gives rise to hyperbolic discounting and temporal choice inconsistency.

We also test the predictions of behavioral theories in the "wild", paying particular attention to prospect theory, which emerged as the dominant theory in our lab experiments on risky choice. Loss aversion and reference dependence predict that consumers will behave in ways distinct from what the standard rational model predicts. Specifically, loss aversion predicts that when an item is offered at a discount, the demand for it will be greater than its price elasticity alone explains; even more importantly, when the item is no longer discounted, demand for its close substitutes will increase excessively. We tested this prediction using a discrete choice model with a loss-averse utility function on data from a large eCommerce retailer. Not only did we identify loss aversion, but we also found that the effect decreased with consumers' experience. We outline the policy implications of consumer loss aversion and strategies for competitive pricing.
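The loss-averse utility underlying this prediction is commonly modeled with a prospect-theory value function: concave over gains and steeper over losses relative to a reference point (here, the discounted price the consumer has come to expect). The exponent and loss-aversion parameter below are the classic Kahneman-Tversky estimates, not the thesis's fitted values.

```python
def prospect_value(x, alpha=0.88, lam=2.25):
    """Prospect-theory value of an outcome x measured relative to a
    reference point: v(x) = x^alpha for gains, -lam * (-x)^alpha for
    losses, with lam > 1 capturing loss aversion."""
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** alpha

# A price increase of 10 (a loss relative to the reference) hurts more
# than an equal-sized discount helps, so demand responds asymmetrically
# when a discount is introduced versus withdrawn.
pain = prospect_value(-10.0)
pleasure = prospect_value(10.0)
```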

In future work, BROAD can be widely applied to testing different behavioural models, e.g., in social preference and game theory, and in different contextual settings. Additional measurements beyond choice data, including biological measurements such as skin conductance, can be used to eliminate hypotheses more rapidly and speed up model comparison. Discrete choice models also provide a framework for testing behavioural models with field data and encourage combined lab-field experiments.