17 results for Counterfeit luxury goods
at Indian Institute of Science - Bangalore - India
Abstract:
We discuss a dynamic pricing model which will aid an automobile manufacturer in choosing the right price for each customer segment. Though the market structure is an oligopoly, customers get "locked" into a particular technology or company, which makes the situation virtually akin to a monopoly. There are associated network externalities and positive feedback. The key idea in monopoly pricing lies in extracting the customer surplus by exploiting the respective elasticities of demand. We present a Walrasian general equilibrium approach to determine the segment price. We compare the prices obtained from the optimization model with those from Walrasian dynamics. The results are encouraging and can serve as a critical factor in Customer Relationship Management (CRM), thereby helping to manage the lock-in effectively.
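For illustration only, here is a minimal sketch of a tâtonnement-style Walrasian price adjustment of the kind the abstract alludes to: each segment's price is raised in proportion to excess demand until the market approximately clears. The linear demand, fixed supply, and step size below are placeholder assumptions, not the paper's model.

```python
import numpy as np

# Minimal tatonnement sketch: adjust each segment's price in proportion to
# excess demand until the market (approximately) clears. The linear demand
# and fixed supply are illustrative placeholders, not the paper's model.

def excess_demand(prices, a=np.array([100.0, 80.0]), b=np.array([2.0, 1.5]),
                  supply=np.array([40.0, 30.0])):
    """Linear demand a - b*p per customer segment minus a fixed supply."""
    return a - b * prices - supply

def walrasian_tatonnement(p0, step=0.05, tol=1e-6, max_iter=10_000):
    p = np.asarray(p0, dtype=float)
    for _ in range(max_iter):
        z = excess_demand(p)
        if np.max(np.abs(z)) < tol:
            break
        p = np.maximum(p + step * z, 0.0)  # raise price where demand exceeds supply
    return p

if __name__ == "__main__":
    print(walrasian_tatonnement([10.0, 10.0]))  # approximate segment-clearing prices
```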
Abstract:
In this paper, we exploit the idea of decomposition to match buyers and sellers in an electronic exchange for trading large volumes of homogeneous goods, where the buyers and sellers specify marginal-decreasing piecewise constant price curves to capture volume discounts. Such exchanges are relevant for automated trading in many e-business applications. The problem of determining winners and Vickrey prices in such exchanges is known to have a worst-case complexity equal to that of as many as (1 + m + n) NP-hard problems, where m is the number of buyers and n is the number of sellers. Our method decomposes the overall exchange problem into two separate and simpler problems: 1) a forward auction and 2) a reverse auction, which turn out to be generalized knapsack problems. In the proposed approach, we first determine the quantity of units to be traded between the sellers and the buyers using fast heuristics developed by us. Next, we solve a forward auction and a reverse auction using fully polynomial time approximation schemes available in the literature. The proposed approach has worst-case polynomial time complexity, and our experimentation shows that the approach produces good quality solutions to the problem. Note to Practitioners: In recent times, electronic marketplaces have provided an efficient way for businesses and consumers to trade goods and services. The use of innovative mechanisms and algorithms has made it possible to improve the efficiency of electronic marketplaces by enabling optimization of revenues for the marketplace and of utilities for the buyers and sellers. In this paper, we look at single-item, multiunit electronic exchanges. These are electronic marketplaces where buyers submit bids and sellers submit asks for multiple units of a single item. We allow buyers and sellers to specify volume discounts using suitable functions. Such exchanges are relevant for high-volume business-to-business trading of standard products, such as silicon wafers, very large-scale integrated chips, desktops, telecommunications equipment, commoditized goods, etc. The problem of determining winners and prices in such exchanges is known to involve solving many NP-hard problems. Our paper exploits the familiar idea of decomposition, uses certain algorithms from the literature, and develops two fast heuristics to solve the problem in a near optimal way in worst-case polynomial time.
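As a toy illustration of the first step of such a decomposition (determining the quantity to be traded before the forward and reverse auctions are run), the sketch below greedily matches buyer bid steps against seller ask steps. It treats each (price, quantity) step independently, so it does not capture the paper's volume-discount curves or its actual heuristics.

```python
def match_quantity(buy_steps, sell_steps):
    """Greedy quantity determination over (price, quantity) steps.

    buy_steps:  (unit price a buyer is willing to pay, quantity) pairs
    sell_steps: (unit price a seller asks, quantity) pairs
    Units trade as long as the highest remaining bid price is at least the
    lowest remaining ask price. Steps are treated independently, which ignores
    the curve structure in the paper; this is only an illustration.
    """
    bids = sorted(([p, q] for p, q in buy_steps), key=lambda s: -s[0])
    asks = sorted(([p, q] for p, q in sell_steps), key=lambda s: s[0])
    traded, i, j = 0, 0, 0
    while i < len(bids) and j < len(asks) and bids[i][0] >= asks[j][0]:
        q = min(bids[i][1], asks[j][1])
        traded += q
        bids[i][1] -= q
        asks[j][1] -= q
        if bids[i][1] == 0:
            i += 1
        if asks[j][1] == 0:
            j += 1
    return traded


# Example with two buyer steps and two seller steps (illustrative numbers).
print(match_quantity([(10, 5), (8, 5)], [(6, 4), (9, 6)]))  # -> 5 units traded
```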
Abstract:
The association parameter in the diffusion equation due to Wilke and Chang has been interpreted in terms of determinable properties, thus permitting easy calculation of the same for unknown systems. The proposed equation also holds good for water as solute in organic solvents. The overall percentage error remains the same as that of the original equation.
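For reference, the standard form of the Wilke-Chang correlation discussed above is shown below; the abstract does not spell out the authors' reinterpretation of the association parameter, so only the original correlation is given.

```latex
% Standard Wilke-Chang correlation (original form; the paper's reinterpretation
% of the association parameter \phi is not reproduced in the abstract).
\[
  D_{AB} \;=\; 7.4\times 10^{-8}\,
  \frac{(\phi\, M_B)^{1/2}\, T}{\mu_B\, V_A^{0.6}}
\]
% D_{AB}: diffusivity of solute A in solvent B (cm^2/s), M_B: solvent molar mass
% (g/mol), T: temperature (K), \mu_B: solvent viscosity (cP), V_A: molar volume
% of A at its normal boiling point (cm^3/mol), \phi: association parameter.
```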
Abstract:
Vegetated coastal ecosystems provide goods and services to billions of people. In the aftermath of a series of recent natural disasters, including the Indian Ocean Tsunami, Hurricane Katrina and Cyclone Nargis, coastal vegetation has been widely promoted for the purpose of reducing the impact of large storm surges and tsunami. In this paper, we review the use of coastal vegetation as a "bioshield" against these extreme events. Our objective is to alter bioshield policy and reduce the long-term negative consequences for biodiversity and human capital. We begin with an overview of the scientific literature, in particular focusing on studies published since the Indian Ocean Tsunami in 2004 and discuss the science of wave attenuation by vegetation. We then explore case studies from the Indian subcontinent and evaluate the detrimental impacts bioshield plantations can have upon native ecosystems, drawing a distinction between coastal restoration and the introduction of exotic species in inappropriate locations. Finally, we place bioshield policies into a political context, and outline a new direction for coastal vegetation policy and research.
Abstract:
There are a number of large networks which occur in many problems dealing with the flow of power, communication signals, water, gas, transportable goods, etc. Both design and planning of these networks involve optimization problems. The first part of this paper introduces the common characteristics of a nonlinear network (the network may be linear, the objective function may be nonlinear, or both may be nonlinear). The second part develops a mathematical model trying to put together some important constraints based on the abstraction for a general network. The third part deals with solution procedures; it converts the network to a matrix-based system of equations, gives the characteristics of the matrix and suggests two solution procedures, one of them being a new one. The fourth part handles spatially distributed networks and evolves a number of decomposition techniques so that we can solve the problem with the help of a distributed computer system. Algorithms for parallel processors and spatially distributed systems have been described.

There are a number of common features that pertain to networks. A network consists of a set of nodes and arcs. In addition, at every node there is a possibility of an input (like power, water, messages, goods, etc.), an output, or none. Normally, the network equations describe the flows amongst nodes through the arcs. These network equations couple variables associated with nodes. Invariably, variables pertaining to arcs are constants; the result required will be the flows through the arcs. To solve the normal base problem, we are given input flows at nodes, output flows at nodes and certain physical constraints on other variables at nodes, and we should find out the flows through the network (variables at nodes will be referred to as across variables).

The optimization problem involves selecting inputs at nodes so as to optimize an objective function; the objective may be a cost function based on the inputs to be minimized, a loss function or an efficiency function. The above mathematical model can be solved using the Lagrange multiplier technique since the equalities are strong compared to the inequalities. The Lagrange multiplier technique divides the solution procedure into two stages per iteration. Stage one calculates the problem variables and stage two the Lagrange multipliers. It is shown that the Jacobian matrix used in stage one (for solving a nonlinear system of necessary conditions) occurs in stage two also.

A second solution procedure has also been embedded into the first one. This is called the total residue approach. It changes the equality constraints so that we can get faster convergence of the iterations. Both solution procedures are found to converge in 3 to 7 iterations for a sample network.

The availability of distributed computer systems, both LAN and WAN, suggests the need for algorithms to solve the optimization problems. Two types of algorithms have been proposed: one based on the physics of the network and the other on the property of the Jacobian matrix. Three algorithms have been devised, one of them for the local area case. These algorithms are called the regional distributed algorithm, the hierarchical regional distributed algorithm (both using the physical properties of the network), and the locally distributed algorithm (a multiprocessor-based approach with a local area network configuration). The approach used was to define an algorithm that is faster and uses minimum communication. These algorithms are found to converge at the same rate as the non-distributed (unitary) case.
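The abstract does not spell out the two-stage Lagrange-multiplier iteration, so the following is only a generic sketch: under the simplifying assumption of a quadratic flow cost with linear node-balance constraints, the flows and the multipliers are obtained together from the resulting KKT system.

```python
import numpy as np

# Sketch (not the paper's algorithm): for a quadratic flow-cost network problem
#   minimize 0.5*x^T Q x + c^T x   subject to  A x = b   (node balance),
# the necessary conditions couple the flows x with the Lagrange multipliers lam:
#   [Q  A^T] [x  ]   [-c]
#   [A   0 ] [lam] = [ b]
# Solving this linear system yields both the flows and the multipliers.

def solve_network_kkt(Q, A, c, b):
    n, m = Q.shape[0], A.shape[0]
    kkt = np.block([[Q, A.T], [A, np.zeros((m, m))]])
    rhs = np.concatenate([-c, b])
    sol = np.linalg.solve(kkt, rhs)
    return sol[:n], sol[n:]          # flows x, multipliers lambda

if __name__ == "__main__":
    # Tiny illustrative 3-arc network with one balance constraint (made-up data).
    Q = np.diag([2.0, 1.0, 4.0])
    A = np.array([[1.0, 1.0, 1.0]])  # total injection must equal demand
    c = np.array([1.0, 2.0, 0.5])
    b = np.array([10.0])
    x, lam = solve_network_kkt(Q, A, c, b)
    print(x, lam)
```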
Abstract:
We address the problem of allocating a single divisible good to a number of agents. The agents have concave valuation functions parameterized by a scalar type. The agents report only the type. The goal is to find allocatively efficient, strategy-proof, nearly budget-balanced mechanisms within the Groves class. Near budget balance is attained by returning as much of the received payments as rebates to agents. Two performance criteria are of interest: the maximum ratio of budget surplus to efficient surplus, and the expected budget surplus, within the class of linear rebate functions. The goal is to minimize them. Assuming that the valuation functions are known, we show that both problems reduce to convex optimization problems, where the convex constraint sets are characterized by a continuum of half-plane constraints parameterized by the vector of reported types. We then propose a randomized relaxation of these problems by sampling constraints. The relaxed problem is a linear programming problem (LP). We then identify the number of samples needed for "near-feasibility" of the relaxed constraint set. Under some conditions on the valuation function, we show that the value of the approximate LP is close to the optimal value. Simulation results show significant improvements of our proposed method over the Vickrey-Clarke-Groves (VCG) mechanism without rebates. In the special case of indivisible goods, the mechanisms in this paper fall back to those proposed by Moulin, by Guo and Conitzer, and by Gujar and Narahari, without any need for randomization. Extensions of the proposed mechanisms to situations when the valuation functions are not known to the central planner are also discussed. Note to Practitioners: Our results will be useful in all resource allocation problems that involve gathering of information privately held by strategic users, where the utilities are any concave function of the allocations, and where the resource planner is not interested in maximizing revenue but in efficient sharing of the resource. Such situations arise quite often in fair sharing of internet resources, fair sharing of funds across departments within the same parent organization, auctioning of public goods, etc. We study methods to achieve near budget balance by first collecting payments according to the celebrated VCG mechanism, and then returning as much of the collected money as rebates. Our focus on linear rebate functions allows for easy implementation. The resulting convex optimization problem is solved via relaxation to a randomized linear programming problem, for which several efficient solvers exist. This relaxation is enabled by constraint sampling. Keeping practitioners in mind, we identify the number of samples that assures a desired level of "near-feasibility" with the desired confidence level. Our methodology will occasionally require a subsidy from outside the system. We however demonstrate via simulation that, if the mechanism is repeated several times over independent instances, then past surplus can support the subsidy requirements. We also extend our results to situations where the strategic users' utility functions are not known to the allocating entity, a common situation in the context of internet users and other problems.
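As a bare-bones illustration of the constraint-sampling idea (not the paper's actual rebate LP), the sketch below samples type vectors, generates one half-plane constraint per sample, and solves the resulting linear program with a standard solver; the objective, the constraint generator and the dimensions are placeholders.

```python
import numpy as np
from scipy.optimize import linprog

# Constraint-sampling sketch (placeholders throughout, not the paper's exact LP):
# we want  min c^T w  s.t.  a(theta)^T w <= b(theta)  for all type vectors theta,
# and approximate the continuum of constraints by N sampled types.

rng = np.random.default_rng(0)
dim, n_samples = 5, 2000

def constraint_for(theta, dim=dim):
    """Illustrative half-plane a(theta)^T w <= b(theta) for a sampled type vector."""
    a = np.cos(np.outer(theta, np.arange(1, dim + 1))).sum(axis=0)
    b = 1.0 + theta.sum()
    return a, b

c = np.ones(dim)                       # placeholder objective
thetas = rng.uniform(0.0, 1.0, size=(n_samples, dim))
rows = [constraint_for(t) for t in thetas]
A_ub = np.array([r[0] for r in rows])
b_ub = np.array([r[1] for r in rows])

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(-10, 10)] * dim, method="highs")
print(res.status, res.x)               # status 0 means the sampled LP was solved
```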
Abstract:
Wetlands are among the most productive ecosystems, recognized globally for their vital role in sustaining a wide array of biodiversity and providing goods and services. However, despite their important role in maintaining the ecology and economy, wetlands in India are endangered by inattention and lack of appreciation for their role. Increased anthropogenic activities such as intensive agricultural practices and indiscriminate disposal of industrial effluents and sewage wastes have altered the physical, chemical as well as biological integrity of the ecosystem. This has resulted in ecological degradation, which is evident from the current ecosystem valuation of Varthur wetland. Global valuation of coastal wetland ecosystems shows an annual economic value of US$ 14,785/ha. An earlier study of a relatively pristine wetland in Bangalore shows a value of Rs. 10,435/ha/day, while a polluted wetland shows a value of Rs. 20/ha/day. In contrast, Varthur, a sewage-fed wetland, has a value of Rs. 118.9/ha/day. The pollutants and subsequent contamination of the wetland have telling effects such as disappearance of native species and dominance of invasive exotic species (such as African catfish), in addition to profuse breeding of disease vectors and pathogens. Water quality analysis revealed high phosphate levels (4.22-5.76 ppm) in addition to elevated BOD (119-140 ppm) and decreased DO (0-1.06 ppm). The amplified decline of ecosystem goods and services with degradation of water quality necessitates the implementation of sustainable management strategies to recover the lost wetland benefits.
Abstract:
The quest for prosperity has been a central motive in the life of man from the moment of his entrance into the worldly scene. And certain it is that the issue of prosperity, pursued at the cost of his very existence, has mounted in intensity and urgency with the unforeseen evolution of industrialization. The traditional paradigm of prosperity has rested on the classical theory of production economics. With increasing empiricism it is obvious that the rational model fails to grapple with the complexity of the concept. The paper addresses prosperity as a goal state resting on the conviction of harmony between the present generation and generations of humans to come. Sustainable prosperity involves more than growth in services and goods. It requires a change in the content of the growth, to make it less material- and energy-intensive and more equitable in its impact. The process of economic prosperity must be more soundly based upon the realities of the stock of capital that sustains it.
Abstract:
Land cover (LC) and land use (LU) dynamics induced by human and natural processes play a major role in global as well as regional patterns of landscapes, influencing biodiversity, hydrology, ecology and climate. Changes in LC features resulting in forest fragmentation have posed direct threats to biodiversity, endangering the sustainability of ecological goods and services. Habitat fragmentation is of added concern as the residual spatial patterns mitigate or exacerbate edge effects. LU dynamics are obtained by classifying temporal remotely sensed satellite imagery of different spatial and spectral resolutions. This paper reviews five different image classification algorithms using spatio-temporal data of a temperate watershed in Himachal Pradesh, India. The Gaussian Maximum Likelihood classifier was found to be apt for analysing spatial patterns at regional scale, based on accuracy assessment through error matrices and ROC (receiver operating characteristic) curves. The LU information thus derived was then used to assess spatial changes from temporal data using principal component analysis and correspondence analysis based image differencing. Forest area dynamics were further studied by analysing the different types of fragmentation through forest fragmentation models. The computed forest fragmentation and landscape metrics show a decline of interior intact forests with a substantial increase in patch forest during 1972-2007.
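A compact sketch of per-pixel Gaussian maximum-likelihood classification, the classifier the study found most apt, is given below; the band values, training samples and class labels are synthetic placeholders rather than the paper's data.

```python
import numpy as np

# Per-pixel Gaussian Maximum Likelihood classification sketch.
# Each class is modelled by the mean and covariance of its training pixels;
# a pixel is assigned to the class with the highest Gaussian log-likelihood.
# (Synthetic data below; real use would train on labelled multispectral pixels.)

def fit_classes(train_pixels, labels):
    params = {}
    for cls in np.unique(labels):
        X = train_pixels[labels == cls]
        mu = X.mean(axis=0)
        cov = np.cov(X, rowvar=False) + 1e-6 * np.eye(X.shape[1])  # regularize
        params[cls] = (mu, np.linalg.inv(cov), np.linalg.slogdet(cov)[1])
    return params

def classify(pixels, params):
    scores, classes = [], []
    for cls, (mu, cov_inv, logdet) in params.items():
        d = pixels - mu
        maha = np.einsum("ij,jk,ik->i", d, cov_inv, d)   # Mahalanobis distances
        scores.append(-0.5 * (maha + logdet))
        classes.append(cls)
    return np.array(classes)[np.argmax(np.stack(scores), axis=0)]

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    water = rng.normal([0.1, 0.2, 0.05], 0.02, size=(200, 3))
    forest = rng.normal([0.3, 0.5, 0.25], 0.05, size=(200, 3))
    X = np.vstack([water, forest])
    y = np.array([0] * 200 + [1] * 200)
    params = fit_classes(X, y)
    print(classify(np.array([[0.11, 0.21, 0.06], [0.32, 0.48, 0.24]]), params))
```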
Abstract:
Electronic exchanges are double-sided marketplaces that allow multiple buyers to trade with multiple sellers, with aggregation of demand and supply across the bids to maximize the revenue in the market. Two important issues in the design of exchanges are (1) trade determination (determining the number of goods traded between any buyer-seller pair) and (2) pricing. In this paper we address the trade determination issue for one-shot, multi-attribute exchanges that trade multiple units of the same good. The bids are configurable with separable additive price functions over the attributes and each function is continuous and piecewise linear. We model trade determination as mixed integer programming problems for different possible bid structures and show that even in two-attribute exchanges, trade determination is NP-hard for certain bid structures. We also make some observations on the pricing issues that are closely related to the mixed integer formulations.
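As a much-simplified illustration of trade determination, the sketch below solves a continuous, single-attribute, surplus-maximizing linear program; the paper's configurable multi-attribute bids and the integrality that makes the problem NP-hard are not modelled here.

```python
import numpy as np
from scipy.optimize import linprog

# Simplified trade determination: choose quantities q[i, j] traded between buyer i
# (unit valuation buy_val[i], demand cap) and seller j (unit cost sell_cost[j],
# supply cap) to maximize total surplus. A continuous single-attribute LP, far
# simpler than the paper's multi-attribute mixed integer formulations.

buy_val   = np.array([10.0, 8.0])     # per-unit valuations (illustrative)
demand    = np.array([5.0, 7.0])
sell_cost = np.array([6.0, 9.0])      # per-unit asks (illustrative)
supply    = np.array([4.0, 10.0])

nb, ns = len(buy_val), len(sell_cost)
# Flatten q[i, j] into a vector of length nb*ns; surplus of a unit i<-j is val - cost.
c = -(buy_val[:, None] - sell_cost[None, :]).ravel()   # negate: linprog minimizes

A_ub, b_ub = [], []
for i in range(nb):                    # each buyer's total purchases <= demand[i]
    row = np.zeros(nb * ns); row[i * ns:(i + 1) * ns] = 1.0
    A_ub.append(row); b_ub.append(demand[i])
for j in range(ns):                    # each seller's total sales <= supply[j]
    row = np.zeros(nb * ns); row[j::ns] = 1.0
    A_ub.append(row); b_ub.append(supply[j])

res = linprog(c, A_ub=np.array(A_ub), b_ub=b_ub, bounds=[(0, None)] * (nb * ns),
              method="highs")
print(res.x.reshape(nb, ns), -res.fun)  # traded quantities and total surplus
```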
Abstract:
Third World hinterlands provide most of the settings in which the quality of human life has improved the least over the decade since Our Common Future was published. This low quality of life promotes a desire for a large number of offspring, fuelling population growth and an exodus to the urban centres of the Third World. Enhancing the quality of life of these people in ways compatible with the health of their environments is therefore the most significant of the challenges from the perspective of sustainable development. Human quality of life may be viewed in terms of access to goods, services and a satisfying social role. The ongoing processes of globalization are enhancing flows of goods worldwide, but these hardly reach the poor of Third World countrysides. But processes of globalization have also vastly improved everybody's access to information, and there are excellent opportunities for putting this to good use to enhance the quality of life of the people of Third World countrysides through better access to education and health. More importantly, better access to information could promote a more satisfying social role through strengthening grass-roots involvement in development planning and management of natural resources. I illustrate these possibilities with the help of a series of concrete experiences from the south Indian state of Kerala. Such an effort does not call for large-scale material inputs; rather, it calls for a culture of inform-and-share in place of the prevalent culture of control-and-command. It calls for openness and transparency in transactions involving government agencies, NGOs, and national and transnational business enterprises. It calls for acceptance of accountability by such agencies.
Abstract:
For necessary goods like water, under supply constraints, fairness considerations lead to negative externalities. The objective of this paper is to design an infinite-horizon contract, or relational contract (a type of long-term contract), that ensures self-enforcing (instead of court-enforced) behaviour by the agents to mitigate the externality due to fairness issues. In this contract, the consumer is induced to consume at the firm-supply level using the threat of a higher fair price in future time periods. The pricing mechanism, computed in this paper, internalizes the externality and is shown to be economically efficient and to provide revenue sufficiency.
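The abstract does not give the contract's exact incentive constraint; the snippet below only illustrates the generic self-enforcement logic of relational contracts: a one-period gain from over-consuming must not exceed the discounted payoff lost under the higher price charged afterwards. The payoffs and the discount factor are made up.

```python
# Generic self-enforcement check for a relational (infinite-horizon) contract.
# The agent complies if the one-shot gain from deviating is no larger than the
# discounted stream of payoff lost under the harsher price that follows a
# deviation. The numbers are illustrative only.

def is_self_enforcing(deviation_gain, payoff_comply, payoff_punished, discount):
    """True if complying forever beats deviating once and being punished forever."""
    future_loss = discount / (1.0 - discount) * (payoff_comply - payoff_punished)
    return deviation_gain <= future_loss

# Example: a consumer gains 3 today by exceeding the firm-supply level, but then
# pays the higher fair price, lowering per-period payoff from 10 to 8.
print(is_self_enforcing(deviation_gain=3.0, payoff_comply=10.0,
                        payoff_punished=8.0, discount=0.9))   # True: 3 <= 18
```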
Abstract:
A supply chain ecosystem consists of the elements of the supply chain and the entities that influence the goods, information and financial flows through the supply chain. These influences come through government regulations; human, financial and natural resources; logistics infrastructure and management, etc., and thus affect supply chain performance. Similarly, all the ecosystem elements also contribute to the risk. The aim of this paper is to identify both performance-based and risk-based decision criteria which are important and critical to the supply chain. A two-step approach using fuzzy AHP and the fuzzy Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS) has been proposed for multi-criteria decision-making and is illustrated using a numerical example. The first step does the selection without considering risks; in the next step, suppliers are ranked according to their risk profiles. Later, the two rankings are consolidated into one. In a subsequent section, the method is also extended to multi-tier supplier selection. In short, we present in this paper a method for the design of a resilient supply chain.
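As an illustration of the ranking step, here is a crisp (non-fuzzy) TOPSIS sketch; the criteria, supplier scores and weights are placeholders, and the fuzzy AHP weighting and fuzzy arithmetic of the paper are not reproduced.

```python
import numpy as np

# Crisp TOPSIS sketch for ranking suppliers (the paper uses fuzzy AHP weights and
# fuzzy TOPSIS; here the weights and the decision matrix are crisp placeholders).

def topsis(decision_matrix, weights, benefit_mask):
    """Rank alternatives (rows) against criteria (columns).

    benefit_mask[k] is True if criterion k is 'larger is better' (e.g. quality),
    False if 'smaller is better' (e.g. cost or risk).
    """
    X = np.asarray(decision_matrix, dtype=float)
    norm = X / np.linalg.norm(X, axis=0)          # vector-normalize each criterion
    V = norm * weights                            # apply (e.g. AHP-derived) weights
    ideal = np.where(benefit_mask, V.max(axis=0), V.min(axis=0))
    anti  = np.where(benefit_mask, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - anti, axis=1)
    closeness = d_neg / (d_pos + d_neg)           # higher is better
    return np.argsort(-closeness), closeness

if __name__ == "__main__":
    # Three suppliers scored on cost, quality, and a risk index (made-up numbers).
    scores = [[250, 0.8, 0.3],
              [230, 0.7, 0.5],
              [280, 0.9, 0.2]]
    weights = np.array([0.4, 0.4, 0.2])           # placeholder criterion weights
    benefit = np.array([False, True, False])      # cost and risk: smaller is better
    order, cc = topsis(scores, weights, benefit)
    print(order, np.round(cc, 3))
```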