628 results for Maximizing
Abstract:
The quest for sustainable resources to meet the demands of a rapidly rising global population while mitigating the risks of rising CO2 emissions and associated climate change represents a grand challenge for humanity. Biomass offers the most readily implemented and low-cost solution for sustainable transportation fuels, and the only non-petroleum route to organic molecules for the manufacture of bulk, fine and speciality chemicals and polymers. To be considered truly sustainable, biomass must be derived from resources which do not compete with agricultural land use for food production, or compromise the environment (e.g. via deforestation). Potential feedstocks include waste lignocellulosic or oil-based materials derived from plant or aquatic sources, with the so-called biorefinery concept offering the co-production of biofuels, platform chemicals and energy; analogous to today's petroleum refineries, which deliver both high-volume/low-value (e.g. fuels and commodity chemicals) and low-volume/high-value (e.g. fine/speciality chemicals) products, thereby maximizing biomass valorization. This article addresses the challenges to catalytic biomass processing and highlights recent successes in the rational design of heterogeneous catalysts facilitated by advances in nanotechnology and the synthesis of templated porous materials, as well as the use of tailored catalyst surfaces to generate bifunctional solid acid/base materials or tune hydrophobicity.
Abstract:
Photovoltaic (PV) solar power generation is proven to be effective and sustainable but is currently hampered by relatively high costs and low conversion efficiency. This paper addresses both issues by presenting a low-cost and efficient temperature distribution analysis for identifying PV module mismatch faults by thermography. Mismatch faults reduce the power output and cause potential damage to PV cells. This paper first defines three fault categories in terms of fault levels, which lead to different terminal characteristics of the PV modules. The three faults are investigated analytically and experimentally, and maintenance suggestions are provided for each fault type. The proposed methodology combines the electrical and thermal characteristics of PV cells subjected to different fault mechanisms through simulation and experimental tests. Furthermore, the fault diagnosis method can be incorporated into maximum power point tracking schemes to shift the operating point of the PV string. The developed technique improves on existing ones by locating the faulty cell with a thermal camera, providing a remedial measure, and maximizing the power output under faulty conditions.
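The mismatch mechanism that makes thermographic fault location worthwhile can be illustrated with a deliberately simplified series-string calculation. The sketch below is not the paper's electro-thermal fault model: it assumes a single series string with no bypass-diode conduction, a fixed per-cell maximum-power voltage, and made-up cell currents, and it only shows how one degraded cell caps the output of the whole string.

```python
# Illustrative only: a deliberately simplified series-string mismatch calculation,
# not the paper's electro-thermal fault model. It assumes no bypass-diode
# conduction, so the string current is capped by the weakest (e.g. shaded) cell,
# and a fixed maximum-power voltage per cell.

def string_power(cell_currents_a, v_mp_per_cell=0.5):
    """Approximate output power (W) of a single series string."""
    i_string = min(cell_currents_a)               # weakest cell limits the current
    v_string = v_mp_per_cell * len(cell_currents_a)
    return i_string * v_string

healthy = [8.0] * 60                              # 60 identical cells, 8 A each
mismatched = [8.0] * 59 + [4.0]                   # one cell shaded down to 4 A

print(f"healthy string:  {string_power(healthy):.0f} W")     # 240 W
print(f"one shaded cell: {string_power(mismatched):.0f} W")   # 120 W
```

Even in this crude picture a single shaded cell halves the string output, which is why locating the individual faulty cell (rather than just the faulty module) matters.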
Abstract:
Principal component analysis (PCA) is well recognized for dimensionality reduction, and kernel PCA (KPCA) has also been proposed for statistical data analysis. However, KPCA fails to detect the nonlinear structure of data well when outliers exist. To alleviate this problem, this paper presents a novel algorithm, named iterative robust KPCA (IRKPCA). IRKPCA deals well with outliers and can be carried out in an iterative manner, which makes it suitable for processing incremental input data. As in traditional robust PCA (RPCA), a binary field is employed to characterize the outlier process, and the optimization problem is formulated as maximizing the marginal distribution of a Gibbs distribution. In this paper, this optimization problem is solved by stochastic gradient descent techniques. In IRKPCA, the outlier process lies in a high-dimensional feature space, and therefore the kernel trick is used. IRKPCA can be regarded as a kernelized version of RPCA and a robust form of the kernel Hebbian algorithm. Experimental results on synthetic data demonstrate the effectiveness of IRKPCA. © 2010 Taylor & Francis.
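As background, the sketch below shows the standard kernel PCA step that a robust variant such as IRKPCA builds on; it is not the paper's iterative algorithm. The RBF kernel, the gamma value, and the synthetic outlier data are illustrative choices, and the closing comment only gestures at where an outlier field would enter.

```python
# A minimal kernel PCA sketch (NumPy only). This is the standard KPCA baseline,
# not IRKPCA itself; kernel choice, gamma, and the synthetic data are illustrative.
import numpy as np

def rbf_kernel(X, gamma=0.5):
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def kpca(X, n_components=2, gamma=0.5):
    K = rbf_kernel(X, gamma)
    n = K.shape[0]
    one = np.full((n, n), 1.0 / n)
    Kc = K - one @ K - K @ one + one @ K @ one    # center the kernel in feature space
    w, V = np.linalg.eigh(Kc)                     # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:n_components]      # keep the leading components
    alphas = V[:, idx] / np.sqrt(np.maximum(w[idx], 1e-12))
    return Kc @ alphas                            # projections of the training data

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
X[:5] += 10.0                                     # a few gross outliers
Z = kpca(X)
# A robust variant such as IRKPCA would, roughly speaking, down-weight or mask the
# outlier rows (a binary outlier field) and re-estimate the projection iteratively.
print(Z.shape)
```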
Abstract:
We investigate the amount of residual demand in a market consisting of only one consumer and two producers. Since there is only one consumer, we cannot really speak of a rationing rule, but we can ask whether a known rationing rule reflects the consumer's utility-maximizing behavior. We show that, if the consumer has a Cobb-Douglas utility function, then the amount purchased by the consumer from the high-price firm lies between the values determined according to the efficient rationing rule and the random rationing rule. We further show that, if the consumer has a quasilinear utility function, then in the economically interesting case his residual demand function will be equal to the residual demand function under efficient rationing.
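For readers unfamiliar with the two benchmarks named in the abstract, the standard textbook residual-demand expressions are written out below in generic notation (not the paper's): firm 1 charges the lower price and has capacity k_1, firm 2 charges the higher price, and D(.) is market demand.

```latex
% Standard textbook residual-demand benchmarks (illustrative notation, not the
% paper's): firm 1 charges the lower price p_1 with capacity k_1, firm 2 charges
% the higher price p_2, and D(\cdot) is market demand.
\[
  D_2^{\mathrm{eff}}(p_2) \;=\; \max\{\,D(p_2) - k_1,\; 0\,\}
  \qquad\text{(efficient rationing)}
\]
\[
  D_2^{\mathrm{rand}}(p_2) \;=\; D(p_2)\,\Bigl(1 - \frac{k_1}{D(p_1)}\Bigr)
  \qquad\text{(random / proportional rationing)}
\]
% The abstract's claim is that, for a Cobb-Douglas consumer, the quantity bought
% from the high-price firm lies between these two benchmark values.
```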
Abstract:
A correlation scheme (leading to a special equilibrium called “soft” correlated equilibrium) is applied for two-person finite games in extensive form with perfect information. Randomization by an umpire takes place over the leaves of the game tree. At every decision point players have the choice either to follow the recommendation of the umpire blindly or freely choose any other action except the one suggested. This scheme can lead to Pareto-improved outcomes of other correlated equilibria. Computational issues of maximizing a linear function over the set of soft correlated equilibria are considered and a linear-time algorithm in terms of the number of edges in the game tree is given for a special procedure called “subgame perfect optimization”.
Abstract:
Retail price maintenance has been a disputed issue in economic theory for decades, and a recent decision of the US Supreme Court, which ended the treatment of such price restrictions as illegal per se, has drawn renewed attention to the problem. This article deals with the hitherto neglected competition-enhancing effect of price fixing. Assuming a dynamic environment instead of the customary static models, we conclude that in many cases it is advantageous for a profit-maximizing producer to apply retail price maintenance in order to forestall the formation of a reseller cartel, and that doing so has a clearly positive effect not only on the producer's profit but also on consumer surplus. We argue that the per se illegality of such vertical price restraints, still prevailing in the competition rules of most countries, is unjustified.
Abstract:
The carbon-dioxide emissions of a gas-fuelled power plant subject to compliance under the EU Emissions Trading System are modelled with a real-options model on four underlying instruments (off-peak and peak electricity prices, the gas price, and the emission quota). The profit-maximizing plant produces, and therefore emits, only if the spread it can realize on the electricity produced is positive. The aggregate carbon-dioxide emissions of a future period can thus be represented as a sum of European-style binary spread options. Within this framework, the expected value and the probability-density function of emissions can be estimated; from the latter, the Value at Risk of the emission-quota position can be derived, which gives the cost of meeting the plant's compliance obligation at a given confidence level. In the stochastic model the underlying instruments follow geometric Ornstein-Uhlenbeck processes, which the author fitted to publicly available market data from the German energy exchange (EEX). Using the simulation model, the ceteris paribus effects of various technological and market factors on the cost of compliance and its Value at Risk are analysed.
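For reference, one common parameterization of the geometric (exponential) Ornstein-Uhlenbeck process mentioned in the abstract is written out below; the symbols are generic, not the author's notation.

```latex
% One common parameterization of the geometric Ornstein-Uhlenbeck process
% (illustrative symbols, not the author's notation). With X_t the price of an
% underlying instrument, the log-price is mean-reverting:
\[
  d\ln X_t \;=\; \kappa\,\bigl(\theta - \ln X_t\bigr)\,dt \;+\; \sigma\,dW_t ,
\]
% where \kappa is the speed of mean reversion, \theta the long-run log-price
% level, \sigma the volatility and W_t a standard Brownian motion.
% Equivalently, by Ito's lemma,
\[
  \frac{dX_t}{X_t} \;=\; \kappa\Bigl(\theta + \tfrac{\sigma^2}{2\kappa} - \ln X_t\Bigr)\,dt \;+\; \sigma\,dW_t .
\]
```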
Abstract:
Economic behavior is multifaceted and context-dependent. However, the so-called Homo Oeconomicus model states that agents are perfectly rational, self-interest-maximizing beings. This model can be criticized on both empirical and normative grounds. Understanding economic behavior requires a more complex and dynamic framework. In the "I & We" paradigm developed by Amitai Etzioni, economic behavior is co-determined by utility calculations and moral considerations. Two major factors can explain the ethicality of economic behavior: the moral character of the agents and the relative cost of ethical behavior. Economic agents are moral beings, but the ethical fabric of the economy determines which face of the Moral Economic Man predominates.
Abstract:
This research examines evolving issues in applied computer science and applies economic and business analysis to them. There are two main areas. The first is internetwork communications as embodied by the Internet. The goal of the research is to devise an efficient pricing, prioritization, and incentivization plan that could be realistically implemented on the existing infrastructure. Criteria include practical and economic efficiency, and proper incentives for both users and providers. Background information on the evolution and functional operation of the Internet is given, and relevant literature is surveyed and analyzed. Economic analysis is performed on the incentive implications of the current pricing structure and organization. The problems are identified, and minimally disruptive solutions are proposed for all levels of implementation down to the lowest-level protocol. Practical issues are considered and performance analyses are performed. The second area of research is mass-market software engineering and how it differs from classical software engineering. Software life-cycle revenues are analyzed, and software pricing and timing implications are derived. A profit-maximizing methodology is developed to select or defer the development of software features for inclusion in a given release. An iterative model of the stages of the software development process is developed, taking into account new communications capabilities as well as profitability.
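The "select or defer features" decision mentioned above can be illustrated, in a much simpler form than the thesis's profit-maximizing methodology, as a 0/1 knapsack over a development-effort budget. The sketch below is only that illustration: the function name, the feature list, and the effort and revenue figures are all made up.

```python
# Illustrative only: a 0/1 knapsack framing of "select or defer features for a
# release", not the methodology developed in this research. All data are made up.

def select_features(features, effort_budget):
    """features: list of (name, effort, expected_revenue); returns (chosen names, total revenue)."""
    n = len(features)
    best = [[0.0] * (effort_budget + 1) for _ in range(n + 1)]
    for i, (_, effort, revenue) in enumerate(features, start=1):
        for b in range(effort_budget + 1):
            best[i][b] = best[i - 1][b]                    # defer feature i
            if effort <= b:
                best[i][b] = max(best[i][b], best[i - 1][b - effort] + revenue)
    chosen, b = [], effort_budget
    for i in range(n, 0, -1):                              # trace back the chosen set
        if best[i][b] != best[i - 1][b]:                   # feature i was included
            name, effort, _ = features[i - 1]
            chosen.append(name)
            b -= effort
    return list(reversed(chosen)), best[n][effort_budget]

features = [("search", 3, 40.0), ("offline mode", 5, 55.0), ("themes", 2, 15.0)]
print(select_features(features, effort_budget=7))          # (['offline mode', 'themes'], 70.0)
```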
Abstract:
The purpose of this thesis was to identify the optimal design parameters for a jet nozzle that obtains a local maximum shear stress while maximizing the average shear stress on the floor of a fluid-filled system. This research examined how geometric parameters of a jet nozzle, such as the nozzle's angle, height, and orifice, influence the shear stress created on the bottom surface of a tank. Simulations were run using a Computational Fluid Dynamics (CFD) software package to determine shear stress values for a parameterized geometric domain including the jet nozzle. A response surface was created based on the shear stress values obtained from 112 simulated designs. Multi-objective optimization software used the response surface to generate designs with the best combination of parameters to achieve maximum shear stress and maximum average shear stress. The optimal configuration of parameters achieved larger shear stress values than a commercially available design.
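The response-surface step described above can be sketched in a few lines: fit a quadratic surface to sampled (design parameter, shear stress) points and search the fitted surface for the best design. The sample values, the two-parameter quadratic form, and the grid search below are illustrative stand-ins for the 112 CFD designs and the multi-objective optimization software used in the thesis.

```python
# Illustrative only: a quadratic response surface over (nozzle angle, nozzle height)
# fitted to hypothetical shear-stress samples, then maximized by a grid search.
import numpy as np

def quad_features(angle, height):
    return np.column_stack([np.ones_like(angle), angle, height,
                            angle**2, height**2, angle * height])

# Hypothetical design points (angle in degrees, height in mm) and responses (Pa).
angle  = np.array([15, 15, 30, 30, 45, 45, 60, 60], dtype=float)
height = np.array([5, 15, 5, 15, 5, 15, 5, 15], dtype=float)
shear  = np.array([2.1, 1.8, 3.4, 2.9, 3.1, 2.6, 2.0, 1.7])

coef, *_ = np.linalg.lstsq(quad_features(angle, height), shear, rcond=None)

# Evaluate the fitted surface on a fine grid and pick the best design.
a_grid, h_grid = np.meshgrid(np.linspace(15, 60, 200), np.linspace(5, 15, 200))
pred = quad_features(a_grid.ravel(), h_grid.ravel()) @ coef
best = np.argmax(pred)
print(f"best angle ~{a_grid.ravel()[best]:.1f} deg, "
      f"height ~{h_grid.ravel()[best]:.1f} mm, predicted shear {pred[best]:.2f} Pa")
```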
Abstract:
Management training in the hospitality industry is as important as employee training. There are a number of effective models and approaches for training effective managers. The author reviews these models and offers guidelines for maximizing the results from each of these approaches.
Abstract:
In the discussion “Indirect Cost Factors in Menu Pricing,” by David V. Pavesic, Associate Professor, Hotel, Restaurant and Travel Administration at Georgia State University, the author initially states: “Rational pricing methodologies have traditionally employed quantitative factors to mark up food and beverage or food and labor because these costs can be isolated and allocated to specific menu items. There are, however, a number of indirect costs that can influence the price charged because they provide added value to the customer or are affected by supply/demand factors.” The author discusses these costs and factors that must be taken into account in pricing decisions. Professor Pavesic offers as a given that menu pricing should cover costs, return a profit, reflect a value for the customer, and in the long run attract customers and market the establishment. “Prices that are too high will drive customers away, and prices that are too low will sacrifice profit,” Professor Pavesic puts it succinctly. Dovetailing with this premise, the author notes that although food cost figures heavily in menu pricing, other factors such as equipment utilization, popularity/demand, and marketing must also be considered. “… there is no single method that can be used to mark up every item on any given restaurant menu. One must employ a combination of methodologies and theories,” says Professor Pavesic. “Therefore, when properly carried out, prices will reflect food cost percentages, individual and/or weighted contribution margins, price points, and desired check averages, as well as factors driven by intuition, competition, and demand.” Additionally, Professor Pavesic wants you to know that value, as opposed to maximizing revenue, should be a primary motivating factor when designing menu pricing. This philosophy does come with certain caveats, and he explains them to you. Generically speaking, Professor Pavesic says, “The market ultimately determines the price one can charge.” But in fine-tuning that decree he further offers, “Lower prices do not automatically translate into value and bargain in the minds of the customers. Having the lowest prices in your market may not bring customers or profit.” “Too often operators engage in price wars through discount promotions and find that profits fall and their image in the marketplace is lowered,” Professor Pavesic warns. In reference to intangibles that influence menu pricing, service is at the top of the list. Ambience, location, amenities, product [i.e. food] presentation, and price elasticity are discussed as well. Be aware of price-value perception; Professor Pavesic explains this concept to you. Professor Pavesic closes with a brief overview of a la carte pricing and its pros and cons.
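As a point of reference for the quantitative starting point the article describes, a standard food-cost-percentage markup is shown below with hypothetical numbers; the article's point is precisely that indirect factors then adjust this starting figure.

```latex
% A standard food-cost-percentage markup (hypothetical numbers), shown only to
% illustrate the quantitative baseline that indirect factors then adjust.
\[
  \text{menu price} \;=\; \frac{\text{item food cost}}{\text{target food-cost \%}}
  \qquad\Rightarrow\qquad
  \frac{\$4.00}{0.30} \;\approx\; \$13.33 ,
\]
% which the operator would then round to a sensible price point and adjust for
% demand, competition, service level and the other indirect factors discussed.
```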
Abstract:
This research first evaluated the effects of the urban-wildland interface on the reproductive biology of the Big Pine Partridge Pea, Chamaecrista keyensis, an understory herb endemic to Big Pine Key, Florida. I found that C. keyensis was self-compatible but depended on bees for seed set. Furthermore, individuals of C. keyensis in urban habitats suffered higher seed predation and therefore set fewer seeds than forest interior plants. I then focused on the effects of fire at different times of the year, summer (wet season) and winter (dry season), on the population dynamics and population viability of C. keyensis. I found that the C. keyensis population recovered faster after winter burns and early summer burns (May–June) than after late summer burns (July–September), owing to better survival and seedling recruitment following the former fires. Fire intensity had positive effects on reproduction of C. keyensis. In contrast, no significant fire intensity effects were found on survival, growth, or seedling recruitment. This indicated that the better survival and seedling recruitment following winter and early summer burns (compared with late summer burns) were due to the reproductive phenology of the plant in relation to fires rather than to differences in fire intensity. Deterministic population modeling showed that time since fire significantly affected the finite population growth rate (λ). In particular, recently burned plots had the largest λ. In addition, the effects of timing of fires on λ were most pronounced in the year of the burn but not in subsequent years. The elasticity analyses suggested that maximizing survival is an effective way to minimize the reduction in the finite population growth rate in the year of a burn. Early summer fires or dry-season fires may achieve this objective. Finally, stochastic simulations indicated that the C. keyensis population had lower extinction risk and probability of population decline if burned in the winter than in the late summer. A fire frequency of approximately 7 years would give the lowest extinction probability for C. keyensis. A fire management regime including a wide range of burning seasons may be essential for the continued existence of C. keyensis and other endemic species of pine rockland on Big Pine Key.
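For readers outside population ecology, the quantities behind the λ and elasticity language above are the standard matrix-population-model definitions written out below (textbook notation, not the author's parameter values).

```latex
% Standard matrix-population-model quantities (textbook definitions): with n_t the
% stage-structured population vector and A the projection matrix,
\[
  \mathbf{n}_{t+1} \;=\; A\,\mathbf{n}_t ,
  \qquad
  \lambda \;=\; \text{dominant eigenvalue of } A
  \quad (\lambda > 1 \text{ growth},\ \lambda < 1 \text{ decline}),
\]
\[
  e_{ij} \;=\; \frac{a_{ij}}{\lambda}\,\frac{\partial \lambda}{\partial a_{ij}},
  \qquad \sum_{i,j} e_{ij} = 1 ,
\]
% so the elasticities e_{ij} measure the proportional sensitivity of lambda to each
% vital rate a_{ij}; large survival elasticities are what motivate the recommendation
% to protect survival in the year of a burn.
```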
Abstract:
We present our approach to real-time service-oriented scheduling problems with the objective of maximizing the total system utility. Unlike traditional utility accrual scheduling problems, in which each task is associated with only a single time utility function (TUF), we associate two different TUFs, a profit TUF and a penalty TUF, with each task in order to model real-time services that not only reward early completions but also penalize abortions or deadline misses. The scheduling heuristics proposed in this paper judiciously accept, schedule, and abort real-time services when necessary to maximize the accrued utility. Our extensive experimental results show that the proposed algorithms can significantly outperform traditional scheduling algorithms such as Earliest Deadline First (EDF), traditional utility accrual (UA) scheduling algorithms, and an earlier scheduling approach based on a similar model.
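The dual-TUF idea can be made concrete with a small sketch: a profit TUF that decays as the completion time approaches the deadline, and a constant penalty TUF charged on abortion or a deadline miss. The shapes and numbers below are made up, and the accept/schedule/abort heuristics of the paper are not reproduced.

```python
# Illustrative only: one way a profit TUF and a penalty TUF can be paired per task;
# the TUF shapes and values are made up, not the paper's.

def profit_tuf(completion_time, deadline, max_profit=10.0):
    """Utility earned if the service completes at or before its deadline."""
    if completion_time > deadline:
        return 0.0
    return max_profit * (1.0 - completion_time / deadline)   # earlier is better

def penalty_tuf(penalty=4.0):
    """Utility lost if the service is aborted or misses its deadline."""
    return -penalty

def accrued_utility(completion_time, deadline, aborted=False):
    if aborted or completion_time > deadline:
        return penalty_tuf()
    return profit_tuf(completion_time, deadline)

# A service with a deadline of 10 time units:
print(accrued_utility(3.0, 10.0))    # early completion, large profit (7.0)
print(accrued_utility(9.5, 10.0))    # late completion, small profit (0.5)
print(accrued_utility(12.0, 10.0))   # deadline miss, penalty applies (-4.0)
```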
Abstract:
Cloud computing realizes the long-held dream of converting computing capability into a type of utility. It has the potential to fundamentally change the landscape of the IT industry and our way of life. However, as cloud computing expands substantially in both scale and scope, ensuring its sustainable growth is a critical problem. Service providers have long suffered from high operational costs, especially those associated with the skyrocketing power consumption of large data centers. In the meantime, while efficient power/energy utilization is indispensable for the sustainable growth of cloud computing, service providers must also satisfy a user's quality of service (QoS) requirements. This problem becomes even more challenging considering the increasingly stringent power/energy and QoS constraints, as well as other factors such as the highly dynamic, heterogeneous, and distributed nature of the computing infrastructures. In this dissertation, we study the problem of delay-sensitive cloud service scheduling for the sustainable development of cloud computing. We first focus our research on the development of scheduling methods for delay-sensitive cloud services on a single server, with the goal of maximizing a service provider's profit. We then extend our study to scheduling cloud services in distributed environments. In particular, we develop a queue-based model and derive efficient request dispatching and processing decisions in a multi-electricity-market environment to improve the profits for service providers. We next study a problem of multi-tier service scheduling. By carefully assigning sub-deadlines to the service tiers, our approach can significantly improve resource usage efficiencies with statistically guaranteed QoS. Finally, we study the power-conscious resource provisioning problem for service requests with different QoS requirements. By properly sharing computing resources among different requests, our method statistically guarantees all QoS requirements with a minimized number of powered-on servers and thus minimized power consumption. The significance of our research is that it is one part of the integrated effort from both industry and academia to ensure the sustainable growth of cloud computing as it continues to evolve and change our society profoundly.