846 results for "Efficient market theory"


Relevance:

30.00%

Publisher:

Abstract:

Consumer demand is revolutionizing the way products are produced, distributed and marketed. In the dairy sector of developing countries, milk quality is receiving more attention from both society and government. However, milk quality management needs to be better addressed in dairy production systems to guarantee stakeholders, mainly smallholders, access to dairy markets. The present study analyses the interaction of the upstream part of the dairy supply chain (farmers and dairies) in the Mantaro Valley (Peruvian central Andes), in order to understand the constraints both stakeholders face in implementing milk quality controls and practices, and to evaluate ex ante how different strategies suggested to improve milk quality could affect farmers' and processors' profits. The analysis is based on three complementary field studies conducted between 2012 and 2013. Our work has shown that the presence of a dual supply chain combining formal and informal markets has a direct impact on dairy production at the technical and organizational levels, limiting small formal dairy processors' ability to implement contracts, including agreements on milk quality standards. The analysis of milk quality management from farms to dairy plants highlighted the poor hygiene in the study area, even though average values of milk composition were usually high. Some husbandry practices evaluated at farm level proved cost-effective and had a large impact on hygienic quality; however, their regular application was limited, since small-scale farmers receive no bonus for producing hygienic milk. On the basis of these two results, we co-designed with formal small-scale dairy processors a simulation tool to explore prospective scenarios, in which they could select their best product portfolio and also design milk payment systems that reward farmers with high milk quality performance. This approach allowed dairy processors to realize the importance of including milk quality management in their collection and manufacturing processes, especially in a context of high competition for milk supply. We conclude that improving milk quality in a smallholder farming context requires a more coordinated effort among stakeholders. Successful implementation of strategies will depend on the willingness of small-scale dairy processors to reward farmers producing high-quality milk, but also on the State providing incentives to stakeholders in the formal sector.
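
A hedged sketch of the kind of quality-based payment rule such a simulation tool could embed; the bonus, penalty and threshold values below are hypothetical, not figures from the study:

```python
# Toy quality-based milk payment rule (hypothetical parameters, illustration only).
def milk_payment(liters, base_price, tbc, tbc_threshold=100_000,
                 bonus=0.05, penalty=0.10):
    """Pay a per-liter bonus if total bacterial count (TBC, cfu/ml) is below
    the threshold, and a per-liter penalty otherwise."""
    adjustment = bonus if tbc <= tbc_threshold else -penalty
    return liters * (base_price + adjustment)

# Example: 120 L delivered at a base price of 1.20 currency units per liter.
print(milk_payment(120, 1.20, tbc=80_000))   # hygienic milk earns the bonus
print(milk_payment(120, 1.20, tbc=500_000))  # poor hygiene incurs the penalty
```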

Relevance:

30.00%

Publisher:

Abstract:

Dilute bismide alloys, containing small fractions of bismuth (Bi), have recently attracted interest due to their potential for applications in a range of semiconductor devices. Experiments have revealed that dilute bismide alloys such as GaBi_xAs_{1-x}, in which a small fraction x of the atoms in the III-V semiconductor GaAs are replaced by Bi, exhibit a number of unusual and unique properties. For example, the band gap energy (E_g) decreases rapidly with increasing Bi composition x, by up to 90 meV per % Bi replacing As in the alloy. This band gap reduction is accompanied by a strong increase in the spin-orbit-splitting energy (Δ_SO) with increasing x, and both E_g and Δ_SO are characterised by strong, composition-dependent bowing. The existence of a Δ_SO > E_g regime in the GaBi_xAs_{1-x} alloy has been demonstrated for x ≳ 10%, a band structure condition which is promising for the development of highly efficient, temperature-stable semiconductor lasers that could lead to large energy savings in future optical communication networks. In addition to their potential for specific applications, dilute bismide alloys have also attracted interest from a fundamental perspective due to their unique properties. In this thesis we develop the theory of the electronic and optical properties of dilute bismide alloys. By adopting a multi-scale approach encompassing atomistic calculations of the electronic structure using the semi-empirical tight-binding method, as well as continuum calculations based on the k·p method, we develop a fundamental understanding of this unusual class of semiconductor alloys and identify general material properties which are promising for applications in semiconductor optoelectronic and photovoltaic devices. By performing detailed supercell calculations on both ordered and disordered alloys we explicitly demonstrate that Bi atoms act as isovalent impurities when incorporated in dilute quantities in III-V (In)GaAs(P) materials, strongly perturbing the electronic structure of the valence band. We identify and quantify the causes and consequences of the unusual electronic properties of GaBi_xAs_{1-x} and related alloys, and our analysis is reinforced throughout by a series of detailed comparisons to the results of experimental measurements. Our k·p models of the band structure of GaBi_xAs_{1-x} and related alloys, which we derive directly from detailed atomistic calculations, are ideally suited to the study of dilute bismide-based devices. We focus in the latter part of the thesis on calculations of the electronic and optical properties of dilute bismide quantum well lasers. In addition to developing an understanding of the effects of Bi incorporation on the operational characteristics of semiconductor lasers, we also present calculations which have been used explicitly in designing and optimising the first generation of GaBi_xAs_{1-x}-based devices.
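
For orientation, alloy band gaps of this kind are conventionally written with a quadratic bowing term; in GaBi_xAs_{1-x} the bowing is itself composition dependent, so the coefficient must be read as b(x) (a schematic form, not the full model developed in the thesis):

```latex
E_g(x) = (1 - x)\,E_g^{\mathrm{GaAs}} + x\,E_g^{\mathrm{GaBi}} - b(x)\,x(1 - x)
```

The quoted reduction of up to 90 meV per % Bi corresponds to an initial slope dE_g/dx of roughly -9 eV as x → 0.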

Relevance:

30.00%

Publisher:

Abstract:

Localized molecular orbitals (LMOs) are much more compact representations of electronic degrees of freedom than canonical molecular orbitals (CMOs). The most compact representation is provided by nonorthogonal localized molecular orbitals (NOLMOs), which are linearly independent but not orthogonal. Both LMOs and NOLMOs are thus useful for linear-scaling calculations of electronic structures for large systems. Recently, NOLMOs have been successfully applied to linear-scaling calculations with density functional theory (DFT) and to reformulating time-dependent density functional theory (TDDFT) for calculations of excited states and spectroscopy. However, a challenge remains, as NOLMO construction from CMOs is still inefficient for large systems. In this work, we develop an efficient method to accelerate the NOLMO construction by using predefined centroids of the NOLMOs and thereby removing the nonlinear equality constraints in the original method (J. Chem. Phys. 2004, 120, 9458 and J. Chem. Phys. 2000, 112, 4). Thus, NOLMO construction becomes an unconstrained optimization. Its efficiency is demonstrated for selected saturated and conjugated molecules. Our method for fast NOLMO construction should lead to efficient DFT and NOLMO-TDDFT applications to large systems.
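
A minimal toy illustration of the idea, assuming a 1D grid, random stand-in canonical orbitals, and hypothetical centroids: with the centroids fixed in advance, a localization functional can be minimized as a plain unconstrained problem. This is a sketch of the principle, not the authors' implementation:

```python
import numpy as np
from scipy.optimize import minimize

# 1D grid and a set of orthonormal "canonical" orbitals (random stand-ins).
rng = np.random.default_rng(0)
x = np.linspace(-5.0, 5.0, 200)
n_orb = 4
cmo, _ = np.linalg.qr(rng.standard_normal((x.size, n_orb)))

centroids = np.array([-3.0, -1.0, 1.0, 3.0])  # predefined NOLMO centroids

def spread(c_flat):
    """Sum of second moments of each localized orbital about its fixed
    centroid. The mixing matrix C is unconstrained: no orthogonality and
    no centroid equality constraints are imposed."""
    C = c_flat.reshape(n_orb, n_orb)
    lmo = cmo @ C                          # localized orbitals on the grid
    norms = np.sum(lmo**2, axis=0)
    moments = np.sum((x[:, None] - centroids) ** 2 * lmo**2, axis=0)
    return np.sum(moments / norms)

res = minimize(spread, np.eye(n_orb).ravel(), method="BFGS")
print("localization functional at optimum:", res.fun)
```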

Relevance:

30.00%

Publisher:

Abstract:

Theory suggests that economic instruments, such as pollution taxes or tradable permits, can provide more efficient technology-adoption incentives than conventional regulatory standards. We explore this issue for an important industry undergoing dramatic decreases in allowed pollution: the U.S. petroleum industry's phasedown of lead in gasoline. Using a duration model applied to a panel of refineries from 1971 to 1995, we find that the pattern of technology adoption is consistent with an economic response to market incentives, plant characteristics, and alternative policies. Importantly, the evidence suggests that the tradable permit system used during the phasedown provided incentives for more efficient technology-adoption decisions.
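
A schematic of how such a duration model can be estimated, using synthetic data and a Weibull proportional-hazards form; the specification, variables and numbers here are illustrative assumptions, not the paper's:

```python
import numpy as np
from scipy.optimize import minimize

# Synthetic panel: adoption times T, censoring flags d, one refinery covariate z.
rng = np.random.default_rng(1)
n = 300
z = rng.standard_normal(n)
T = rng.weibull(1.5, n) * np.exp(-0.5 * z)   # larger z -> earlier adoption
d = (T < 2.0).astype(float)                  # spells censored at t = 2
T = np.minimum(T, 2.0)

def negloglik(theta):
    """Weibull proportional-hazards likelihood with right censoring:
    h(t|z) = lam * k * t**(k-1) * exp(beta * z)."""
    log_lam, log_k, beta = theta
    lam, k = np.exp(log_lam), np.exp(log_k)
    log_h = np.log(lam) + np.log(k) + (k - 1) * np.log(T) + beta * z
    cum_h = lam * T**k * np.exp(beta * z)
    return -np.sum(d * log_h - cum_h)

res = minimize(negloglik, x0=np.zeros(3), method="BFGS")
log_lam, log_k, beta = res.x
print(f"shape k = {np.exp(log_k):.2f}, covariate effect beta = {beta:.2f}")
```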

Relevance:

30.00%

Publisher:

Abstract:

Market failures associated with environmental pollution interact with market failures associated with the innovation and diffusion of new technologies. These combined market failures provide a strong rationale for a portfolio of public policies that foster emissions reduction as well as the development and adoption of environmentally beneficial technology. Both theory and empirical evidence suggest that the rate and direction of technological advance are influenced by market and regulatory incentives, and can be cost-effectively harnessed through economic-incentive-based policy. In the presence of weak or nonexistent environmental policies, investments in the development and diffusion of new environmentally beneficial technologies are very likely to be less than socially desirable. Positive knowledge and adoption spillovers and information problems can further weaken innovation incentives. While environmental technology policy is fraught with difficulties, a long-term view suggests a strategy of experimenting with policy approaches and systematically evaluating their success.

Relevance:

30.00%

Publisher:

Abstract:

Scheduling a set of jobs over a collection of machines to optimize a certain quality-of-service measure is one of the most important research topics in both the theory and practice of computer science. In this thesis, we design algorithms that optimize the flow-time (or delay) of jobs for scheduling problems that arise in a wide range of applications. We consider the classical model of unrelated machine scheduling and resolve several long-standing open problems; we introduce new models that capture the novel algorithmic challenges of scheduling jobs in data centers or large clusters; we study the effect of selfish behavior in distributed and decentralized environments; and we design algorithms that strive to balance energy consumption and performance.
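
For concreteness, the flow-time of a job j released at time r_j and completed at time C_j is the standard quantity

```latex
F_j = C_j - r_j ,
```

and the objectives studied below are functions of it, such as the total (or average) weighted flow-time \sum_j w_j F_j and the maximum flow-time \max_j F_j.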

The technically interesting aspect of our work lies in the surprising connections we establish between approximation and online algorithms, economics, game theory, and queueing theory. It is the interplay of ideas from these different areas that lies at the heart of most of the algorithms presented in this thesis.

The main contributions of the thesis can be placed in one of the following categories.

1. Classical Unrelated Machine Scheduling: We give the first polylogarithmic approximation algorithms for minimizing the average flow-time and the maximum flow-time in the offline setting. In the online, non-clairvoyant setting, we design the first algorithm for minimizing the weighted flow-time in the resource augmentation model. Our work introduces the iterated rounding technique to offline flow-time optimization, and gives the first framework for analyzing non-clairvoyant algorithms on unrelated machines.

2. Polytope Scheduling Problem: To capture the multidimensional nature of scheduling problems that arise in practice, we introduce the Polytope Scheduling Problem (PSP). The PSP generalizes almost all classical scheduling models, and also captures hitherto unstudied scheduling problems such as routing multi-commodity flows, routing multicast (video-on-demand) trees, and multidimensional resource allocation. We design several competitive algorithms for the PSP and its variants for the objectives of minimizing flow-time and completion time. Our work establishes many interesting connections between scheduling and market equilibrium concepts, between fairness and non-clairvoyant scheduling, and between the queueing-theoretic notion of stability and resource augmentation analysis.

3. Energy-Efficient Scheduling: We give the first non-clairvoyant algorithm for minimizing the total flow-time + energy in the online, resource augmentation model, in the most general setting of unrelated machines.

4. Selfish Scheduling: We study the effect of selfish behavior in scheduling and routing problems. We define a fairness index for scheduling policies called bounded stretch, and show that, for the objective of minimizing the average (weighted) completion time, policies with small stretch lead to equilibrium outcomes with a small price of anarchy. Our work gives the first linear/convex-programming duality-based framework for bounding the price of anarchy for general equilibrium concepts such as coarse correlated equilibria.
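
For reference, the price of anarchy bounded in this last part is the standard worst-case ratio over equilibrium outcomes (a textbook definition, stated for concreteness):

```latex
\mathrm{PoA} \;=\; \max_{\sigma \,\in\, \mathrm{Eq}} \frac{\mathrm{cost}(\sigma)}{\mathrm{cost}(\sigma^{\ast})} ,
```

where Eq is the relevant set of equilibria (here as general as coarse correlated equilibria) and σ* is an optimal outcome.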

Relevance:

30.00%

Publisher:

Abstract:

Whether a small cell, a small genome or a minimal set of chemical reactions with self-replicating properties, simplicity is beguiling. As Leonardo da Vinci reportedly said, 'simplicity is the ultimate sophistication'. Two diverging views of simplicity have emerged in accounts of symbiotic and commensal bacteria and cosmopolitan free-living bacteria with small genomes. The small genomes of obligate insect endosymbionts have been attributed to genetic drift caused by small effective population sizes (Ne). In contrast, streamlining theory attributes small cells and genomes to selection for efficient use of nutrients in populations where Ne is large and nutrients limit growth. Regardless of the cause of genome reduction, lost coding potential eventually dictates loss of function. Consequences of reductive evolution in streamlined organisms include atypical patterns of prototrophy and the absence of common regulatory systems, which have been linked to difficulty in culturing these cells. Recent evidence from metagenomics suggests that streamlining is commonplace, may broadly explain the phenomenon of the uncultured microbial majority, and might also explain the highly interdependent (connected) behavior of many microbial ecosystems. Streamlining theory is belied by the observation that many successful bacteria are large cells with complex genomes. To fully appreciate streamlining, we must look to the life histories and adaptive strategies of cells, which impose minimum requirements for complexity that vary with niche.
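
The dividing line between these two accounts is the standard population-genetics condition that selection on a variant with selection coefficient s overcomes drift only when, roughly,

```latex
|s| \;\gtrsim\; \frac{1}{2 N_e} ,
```

so nearly neutral deletions can fix by drift when N_e is small, whereas streamlining requires the savings from a smaller cell or genome to be visible to selection at large N_e.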

Relevance:

30.00%

Publisher:

Abstract:

This paper provides algorithms that use an information-theoretic analysis to learn Bayesian network structures from data. Based on our three-phase learning framework, we develop efficient algorithms that can effectively learn Bayesian networks, requiring only a polynomial number of conditional independence (CI) tests in typical cases. We provide precise conditions under which these algorithms are guaranteed to be correct, as well as empirical evidence (from real-world applications and simulation tests) that these systems work efficiently and reliably in practice.
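
To make the basic primitive concrete, here is a minimal sketch (not the paper's algorithm) of a conditional-independence test X ⟂ Y | Z for discrete data, pooling chi-squared statistics across the strata of Z:

```python
import numpy as np
import pandas as pd
from scipy.stats import chi2, chi2_contingency

def ci_test(df, x, y, z, alpha=0.05):
    """Test X independent of Y given Z by summing per-stratum chi-squared
    statistics and degrees of freedom over the values of Z."""
    stat, dof = 0.0, 0
    for _, g in df.groupby(z):
        table = pd.crosstab(g[x], g[y])
        if table.shape[0] < 2 or table.shape[1] < 2:
            continue  # stratum carries no information about dependence
        s, _, d, _ = chi2_contingency(table)
        stat, dof = stat + s, dof + d
    p = 1.0 - chi2.cdf(stat, dof) if dof else 1.0
    return p >= alpha  # True -> accept conditional independence

# Synthetic check: X and Y are both noisy copies of Z, so X is independent
# of Y given Z, and the test should typically return True.
rng = np.random.default_rng(2)
Z = rng.integers(0, 2, 2000)
df = pd.DataFrame({"Z": Z,
                   "X": (Z + (rng.random(2000) < 0.3)) % 2,
                   "Y": (Z + (rng.random(2000) < 0.3)) % 2})
print(ci_test(df, "X", "Y", "Z"))
```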

Relevance:

30.00%

Publisher:

Abstract:

Extending the work presented in Prasad et al. (IEE Proceedings - Control Theory and Applications, 147, 523-37, 2000), this paper reports a hierarchical nonlinear physical-model-based control strategy to address the problems arising from the complex dynamics of the drum level and governor valve, and demonstrates its effectiveness in plant-wide disturbance handling. The strategy incorporates a two-level control structure consisting of lower-level conventional PI regulators and a higher-level nonlinear physical model predictive controller (NPMPC) used mainly for set-point manoeuvring. The lower-level PI loops help stabilise the unstable drum-boiler dynamics and allow faster governor valve action for power and grid-frequency regulation. The higher-level NPMPC provides optimal load-demand (or set-point) transitions by effectively handling plant-wide interactions and system disturbances. The strategy has been tested in a simulation of a 200-MW oil-fired power plant at Ballylumford in Northern Ireland. A novel approach was devised to test the disturbance-rejection capability under severe operating conditions. Low-frequency disturbances were created by making random changes in radiation heat flow on the boiler side, while condenser vacuum fluctuated randomly on the turbine side. To simulate high-frequency disturbances, pulse-type load disturbances were applied at instants that are not integral multiples of the NPMPC sampling period. Impressive results were obtained during both types of system disturbance and at extremely high rates of load change, right across the operating range. These results compared favourably with those from a conventional state-space generalized predictive control (GPC) method designed under similar conditions.
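
As a minimal illustration of the lower level only, here is a generic discrete PI loop regulating a first-order lag; the gains, sample time and plant are hypothetical stand-ins, not the Ballylumford model:

```python
# Minimal discrete PI loop on a first-order plant (illustrative values only).
kp, ki, dt = 2.0, 0.8, 0.1      # hypothetical gains and sample time
y, integ, setpoint = 0.0, 0.0, 1.0

for step in range(100):
    error = setpoint - y
    integ += error * dt
    u = kp * error + ki * integ          # PI control action
    y += dt * (-y + u)                   # first-order plant: dy/dt = -y + u

print(f"output after 10 s: {y:.3f}")     # settles near the setpoint
```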

Relevance:

30.00%

Publisher:

Abstract:

An efficient analysis and design of an electromagnetic-bandgap (EBG) waveguide with resonant loads is presented. Equivalent-circuit analysis is employed to demonstrate the differences between EBG waveguides with resonant and nonresonant loadings. As a result of the resonance, transmission zeros emerge at finite frequencies. The concept is demonstrated in E-plane waveguides. A generic, fast and efficient formulation is presented, which starts from the generalized scattering matrix of the unit cell and derives the dispersion properties of the infinite structure. Both real and imaginary parts of the propagation constant are derived and discussed. The Floquet wavelength and impedance are also presented. The theoretical results are validated by comparison with simulations of a finite structure and with experimental results. The application of the proposed EBG waveguide to suppressing the spurious passband of a conventional E-plane filter is demonstrated experimentally.
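
The dispersion step rests on the standard Floquet relation for periodic structures: once the unit cell of period p is reduced to two-port transmission (ABCD) parameters, the complex propagation constant γ = α + jβ of the infinite structure satisfies

```latex
\cosh(\gamma p) \;=\; \frac{A + D}{2} ,
```

from which both the attenuation constant α (stopbands) and the phase constant β (passbands) follow.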

Relevance:

30.00%

Publisher:

Abstract:

This paper presents an efficient modeling technique for deriving the dispersion characteristics of novel uniplanar metallodielectric periodic structures. The analysis is based on the method of moments and an interpolation scheme, which significantly accelerates the computations. Triangular basis functions are used, allowing arbitrarily shaped metallic elements to be modeled. Based on this method, novel uniplanar left-handed (LH) metamaterials are proposed. Variations of the split rectangular-loop element printed on a grounded dielectric substrate are demonstrated to possess LH propagation properties. Full-wave dispersion curves are presented. Based on the dual transmission-line concept, we study the distribution of the modal fields and the variation of series capacitance and shunt inductance for all the proposed elements. The left-handedness is verified by means of full-wave simulation of finite uniplanar arrays using commercial software (HFSS). The cell dimensions are a small fraction of the wavelength (approximately λ/24), so the structures can be considered a homogeneous effective medium. The structures are simple, readily scalable to higher frequencies, and compatible with low-cost fabrication techniques.
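
The dual transmission-line concept invoked here is the standard one: a line whose series element is a capacitance C'_L and whose shunt element is an inductance L'_L (the dual of the conventional series-L, shunt-C line) supports a backward wave whose idealized lossless dispersion is

```latex
\beta(\omega) \;=\; -\,\frac{1}{\omega \sqrt{L'_L\, C'_L}} ,
```

so phase and group velocities are antiparallel, which is the defining left-handed property.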

Relevance:

30.00%

Publisher:

Abstract:

Genes, species and ecosystems are often considered to be assets. The need to ensure a sufficient diversity of these assets is increasingly recognised today. Asset managers in banks and insurance companies face a similar challenge: they are asked to manage the assets of their investors by constructing efficient portfolios. In doing so they deliberately exploit a phenomenon observed in the formation of portfolios: returns are additive, while risks diversify. This phenomenon and its implications are at the heart of portfolio theory. Portfolio theory, like few other economic theories, has dramatically transformed the practical work of banks and insurance companies. Before portfolio theory was developed about 50 years ago, asset managers were confronted with a situation similar to the one biodiversity research faces today: while the need for diversification was generally accepted, a concept that linked risk and return at the portfolio level and showed the value of diversification was missing. Portfolio theory closed this gap. This article first explains the fundamentals of portfolio theory and transfers them to biodiversity. A large part of the article is then dedicated to some of the implications portfolio theory has for the valuation and management of biodiversity. The last section outlines three avenues for further research.
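
The phenomenon that "returns are additive, while risks diversify" is precisely the Markowitz identity: for portfolio weights w_i ≥ 0, expected returns μ_i, standard deviations σ_i and covariances σ_ij,

```latex
E[R_p] = \sum_i w_i \mu_i ,
\qquad
\sigma_p^2 = \sum_i \sum_j w_i w_j \sigma_{ij} \;\le\; \Bigl(\sum_i w_i \sigma_i\Bigr)^{2} ,
```

with equality only under perfect correlation; imperfectly correlated assets therefore lower portfolio risk without lowering expected return.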