53 results for indivisible objects allocation
Abstract:
The earnings structure in science is known to be flat relative to that in the private sector, which could cause a brain drain toward the private sector. In this paper, we assume that agents value both money and fame and study the role of the institution of science in the allocation of talent between the science sector and the private sector. Following work in the sociology of science, we model the institution of science as a mechanism distributing fame (i.e., peer recognition). We show that since intrinsic performance is a less noisy signal of talent in the science sector than in the private sector, a good institution of science can mitigate the brain drain. We also find that providing extra monetary incentives through the market might undermine the incentives provided by the institution and thereby worsen the brain drain. Finally, we study the optimal balance between monetary and non-monetary incentives in science.
Abstract:
This article studies the effects of interest rate restrictions on loan allocation. The British government tightened the usury laws in 1714, reducing the maximum permissible interest rate from 6% to 5%. A sample of individual loan transactions reveals that average loan size and minimum loan size increased strongly, while access to credit worsened for those with little social capital. Collateralised credits, which had accounted for a declining share of total lending, returned to their former prominence. Our results suggest that the usury laws distorted credit markets significantly; we find no evidence that they offered a form of Pareto-improving social insurance.
Abstract:
The goal of this paper is to present an optimal resource allocation model for the regional allocation of public service inputs. The proposed solution maximises relative public service availability in regions located below the best-availability frontier, subject to exogenous budget restrictions and an equal-access-for-equal-need criterion (an equity-based notion of regional needs). The construction of non-parametric deficit indicators for public service availability is proposed through a novel application of Data Envelopment Analysis (DEA) models, whose results offer advantages for the evaluation and improvement of decentralised public resource allocation systems. The method introduced in this paper is relevant as a resource allocation guide for the majority of centrally funded public services in a given country, such as health care, basic and higher education, citizen safety, justice, transportation, environmental protection, leisure, culture, housing, and city planning.
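The abstract's DEA construction is not spelled out, but the idea of a deficit indicator relative to a best-availability frontier can be sketched in a drastically simplified single-input, single-output form. All region names and figures below are hypothetical, and this ratio-based frontier is a stand-in for the paper's DEA models:

```python
# Hypothetical per-region data: public service availability vs. need-adjusted input.
regions = {
    "A": {"availability": 80.0, "need": 100.0},
    "B": {"availability": 45.0, "need": 90.0},
    "C": {"availability": 60.0, "need": 60.0},
}

# Availability ratio per unit of need for each region.
ratios = {r: d["availability"] / d["need"] for r, d in regions.items()}
frontier = max(ratios.values())  # best observed availability ratio

# Deficit indicator: extra availability (in need units) required to reach the frontier.
deficits = {r: (frontier - ratios[r]) * regions[r]["need"] for r in regions}

for r in sorted(regions):
    print(f"region {r}: ratio={ratios[r]:.2f}, deficit={deficits[r]:.1f}")
```

Region C defines the frontier here (deficit 0), while A and B receive positive deficits that could guide budget reallocation under the paper's equity criterion.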
Abstract:
We postulate a two-region world comprised of North (calibrated after the US) and South (calibrated after China). Our optimization results show the compatibility of the following three desiderata: (1) global CO2 emissions follow a conservative path that leads to the stabilization of concentrations at 450 ppm; (2) North and South converge to a path of sustained growth at 1% per year (28.2% per generation) in 2075; (3) during the transition to the steady state, North also grows at 1% per year while South's rates of growth are markedly higher. The transition paths require a drastic reduction of the share of emissions allocated to North, large investments in knowledge in both North and South, and very large investments in education in South. Surprisingly, in order to sustain North's utility growth rate, some output must be transferred from South to North during the transition. Although undoubtedly subject to many caveats, our results support a degree of optimism by providing prima facie evidence of the possibility of tackling climate change in a way that is fair both across generations and across regions while allowing for positive rates of human development.
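The conversion between the two growth figures quoted above follows from simple compounding, assuming a 25-year generation (the generation length is not stated in the abstract but is the value that reproduces the 28.2% figure):

```python
# 1% annual growth compounded over an assumed 25-year generation
# reproduces the abstract's 28.2% per-generation figure.
annual = 0.01
years_per_generation = 25
per_generation = (1 + annual) ** years_per_generation - 1
print(f"{per_generation:.1%}")  # prints 28.2%
```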
Abstract:
In this paper we address the issue of locating hierarchical facilities in the presence of congestion. Two hierarchical models are presented, in which lower-level servers attend requests first, and some of the served customers are then referred to higher-level servers. In the first model, the objective is to find the minimum number of servers, and their locations, that will cover a given region within a distance or time standard. The second model is cast as a Maximal Covering Location formulation. A heuristic procedure is then presented together with computational experience. Finally, some extensions of these models that address other types of spatial configurations are offered.
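The abstract does not describe its heuristic, but a generic greedy heuristic for a maximal-covering location problem conveys the flavour of the second model. The candidate sites, coverage sets, and demand weights below are hypothetical:

```python
# Greedy heuristic for a maximal-covering location problem: choose p sites
# so as to cover the most demand within the coverage standard.
# cover[s] = set of demand points reachable from candidate site s (hypothetical).
cover = {
    "s1": {1, 2, 3},
    "s2": {3, 4},
    "s3": {4, 5, 6},
    "s4": {1, 6},
}
demand = {1: 10, 2: 5, 3: 5, 4: 20, 5: 5, 6: 10}

def greedy_mclp(cover, demand, p):
    chosen, covered = [], set()
    for _ in range(p):
        # Pick the site that adds the most not-yet-covered demand.
        best = max(cover, key=lambda s: sum(demand[d] for d in cover[s] - covered))
        chosen.append(best)
        covered |= cover[best]
    return chosen, sum(demand[d] for d in covered)

sites, served = greedy_mclp(cover, demand, p=2)
print(sites, served)  # prints ['s3', 's1'] 55
```

The greedy rule is a common baseline for covering formulations; the paper's actual procedure, which also handles congestion and server hierarchy, is necessarily more involved.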
Abstract:
This paper describes an optimized model to support QoS by means of congestion minimization on LSPs (Label Switched Paths). To build this model, we start from a CFA (Capacity and Flow Allocation) model. As that model does not consider buffer size when calculating the capacity cost, our model, named BCA (Buffer Capacity Allocation), takes this issue into account and improves on CFA's performance. To test our proposal, we ran several simulations; the results show that the BCA model minimizes LSP congestion and distributes flows uniformly over the network.
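The abstract gives no formulas for CFA or BCA, so the following is only a toy illustration of the stated difference: sizing link capacity from flow alone versus adding buffer-related headroom. The flow values, utilization target, and buffer factor are all hypothetical assumptions, not the paper's model:

```python
# Toy contrast between a CFA-style capacity assignment and a hypothetical
# buffer-aware variant; all parameters are illustrative assumptions.
flows = {"lsp1": 40.0, "lsp2": 25.0}  # mean flow per LSP (Mb/s), hypothetical
utilization = 0.8                     # target link utilization

def cfa_capacity(flow):
    # Capacity sized from flow and target utilization only (no buffer term).
    return flow / utilization

def bca_capacity(flow, buffer_factor=0.1):
    # Adds headroom proportional to flow as a stand-in for the buffer cost.
    return cfa_capacity(flow) * (1 + buffer_factor)

for lsp, f in flows.items():
    print(f"{lsp}: cfa={cfa_capacity(f):.2f}  bca={bca_capacity(f):.2f}")
```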
Abstract:
A subclass of games with population monotonic allocation schemes is studied, namely games with regular population monotonic allocation schemes (rpmas). We focus on the properties of these games and prove the coincidence between the core and both the Davis-Maschler bargaining set and the Mas-Colell bargaining set.
Abstract:
In the context of cooperative TU-games, and given an order of players, we consider the problem of distributing the worth of the grand coalition as a sequential decision problem. In each step of the process, upper and lower bounds for the payoffs of the players are required, related to successive reduced games. Sequentially compatible payoffs are defined as those allocation vectors that meet these recursive bounds. The core of the game is reinterpreted as the set of sequentially compatible payoffs when the Davis-Maschler reduced game is considered (Th. 1). Independently of the reduction, the core turns out to be the intersection of the family of sets of sequentially compatible payoffs corresponding to the different possible orderings (Th. 2), so it is in some sense order-independent. Finally, we analyze advantageous properties for the first player.
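The sequential-bounds construction above is not reproducible from the abstract alone, but the underlying object, the core of a TU-game, admits a direct membership check. The 3-player characteristic function below is a hypothetical example, not taken from the paper:

```python
# Core membership for a 3-player TU-game: an allocation x is in the core
# iff it is efficient and no coalition S can improve on it (sum over S >= v(S)).
from itertools import combinations

players = (1, 2, 3)
# Hypothetical characteristic function v(S), indexed by sorted player tuples.
v = {(1,): 0, (2,): 0, (3,): 0,
     (1, 2): 40, (1, 3): 30, (2, 3): 20, (1, 2, 3): 60}

def in_core(x):
    if sum(x.values()) != v[players]:           # efficiency
        return False
    for r in range(1, len(players)):
        for S in combinations(players, r):
            if sum(x[i] for i in S) < v[S]:     # coalitional rationality
                return False
    return True

print(in_core({1: 30, 2: 20, 3: 10}))  # prints True
print(in_core({1: 10, 2: 10, 3: 40}))  # prints False (coalition {1,2} objects)
```

Theorem 2 above says this set equals the intersection, over all player orderings, of the sets of sequentially compatible payoffs.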
Abstract:
The cost of operational risk refers to the capital needed to afford the losses generated by the ordinary activities of a firm. In this work we demonstrate how allocation principles can be used to subdivide the aggregate capital so that the firm can distribute this cost across the various constituents that generate operational risk. Several capital allocation principles are reviewed. Proportional allocation allows a relative risk premium to be charged to each unit. An example of fraud risk in the banking sector is presented and some correlation scenarios between business lines are compared.
Keywords: solvency, quantile, value at risk, copulas
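The proportional principle mentioned above is the simplest of the allocation rules: each business line is charged a share of the aggregate capital proportional to its stand-alone risk measure. The line names and Value-at-Risk figures below are hypothetical:

```python
# Proportional capital allocation: each line's share of the aggregate capital
# is proportional to its stand-alone risk measure (here, a stand-alone VaR).
standalone_var = {"retail": 30.0, "corporate": 50.0, "cards": 20.0}
aggregate_capital = 80.0  # diversified total, below the sum of stand-alone VaRs

total = sum(standalone_var.values())
allocation = {line: aggregate_capital * var / total
              for line, var in standalone_var.items()}

for line, cap in allocation.items():
    # The ratio cap / standalone_var[line] acts as a relative risk premium.
    print(f"{line}: allocated={cap:.1f}")
```

Note that the allocations sum back to the aggregate capital by construction, which is the defining "full allocation" property of such principles.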
Abstract:
Learning objects have long promised to provide people with high-quality learning resources. Initiatives such as MIT OpenCourseWare, MERLOT and others have shown the real possibilities of creating and sharing knowledge through the Internet. Thousands of educational resources are available through learning object repositories. We indeed live in an age of content abundance, and content can be considered infrastructure for building adaptive and personalized learning paths, promoting both formal and informal learning. Nevertheless, although most educational institutions are adopting a more open approach, publishing huge amounts of educational resources, the reality is that these resources are barely used in other educational contexts. This paradox can be partly explained by the difficulties in adapting such resources with respect to language, e-learning standards and specifications and, finally, granularity. Furthermore, if we want our learners to use and take advantage of learning object repositories, we need to provide them with services beyond just browsing and searching for resources. Social networks can be a first step towards creating an open social learning community around a topic or a subject. In this paper we discuss and analyze the process of using a learning object repository and building a social network on top of it, with respect to the information architecture needed to capture and store the interactions between learners and resources in the form of learning object metadata.
Abstract:
This paper presents a simple and fast solution to the problem of finding the time variations of the forces that maintain object equilibrium when a finger is removed from a three-contact-point grasp or a finger is added to a two-contact-point grasp, assuming the existence of an external perturbation force (which can be the object's weight itself). The procedure returns force set points for the control system of a manipulator device in a regrasping action. The approach was implemented, and a numerical example is included in the paper to illustrate how it works.
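The static force balance at the heart of the problem can be illustrated in a reduced 2-D, two-contact setting: find the contact force magnitudes that cancel an external perturbation. The contact directions and perturbation force below are hypothetical, and the full 3-D grasp problem with time-varying set points is of course richer:

```python
# Toy 2-D force balance: find magnitudes f1, f2 of two contact forces along
# known unit directions d1, d2 that cancel an external perturbation force w.
# Directions and forces are hypothetical illustration values.
d1 = (0.0, 1.0)    # finger 1 pushes straight up
d2 = (-0.6, 0.8)   # finger 2 pushes up and to the left
w = (3.0, -10.0)   # external force (sideways pull plus the object's weight)

# Equilibrium: f1*d1 + f2*d2 + w = 0, a 2x2 linear system solved by Cramer's rule.
bx, by = -w[0], -w[1]
det = d1[0] * d2[1] - d1[1] * d2[0]
f1 = (bx * d2[1] - d2[0] * by) / det
f2 = (d1[0] * by - bx * d1[1]) / det

print(f"f1={f1:.2f}, f2={f2:.2f}")  # prints f1=6.00, f2=5.00
```

Checking the residual f1*d1 + f2*d2 + w confirms equilibrium; in a regrasp, such magnitudes would be recomputed over time as contacts are added or removed.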
Abstract:
This paper presents a Bayesian approach to the design of transmit prefiltering matrices in closed-loop schemes robust to channel estimation errors. The algorithms are derived for a multiple-input multiple-output (MIMO) orthogonal frequency division multiplexing (OFDM) system. Two different optimization criteria are analyzed: the minimization of the mean square error and the minimization of the bit error rate. In both cases, the transmitter design is based on the singular value decomposition (SVD) of the conditional mean of the channel response, given the channel estimate. The performance of the proposed algorithms is analyzed, and their relationship with existing algorithms is indicated. As with other previously proposed solutions, the minimum bit error rate algorithm converges to the open-loop transmission scheme for very poor CSI estimates.
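The paper's Bayesian MIMO-OFDM design is beyond a short snippet, but its SVD building block can be sketched: the dominant right singular vector of the (conditional-mean) channel matrix gives the strongest transmit direction. The matrix H and the power-iteration routine below are illustrative assumptions, not the paper's algorithm:

```python
# Power-iteration sketch of extracting the dominant right singular vector of a
# channel matrix H (hypothetical real-valued 2x2 stand-in for the conditional
# channel mean). That vector is the strongest transmit direction in SVD precoding.
H = [[2.0, 0.5],
     [0.5, 1.0]]

def dominant_right_singular_vector(H, iters=200):
    n = len(H[0])
    # Gram matrix G = H^T H; its top eigenvector is H's top right singular vector.
    G = [[sum(H[k][i] * H[k][j] for k in range(len(H))) for j in range(n)]
         for i in range(n)]
    v = [1.0] * n
    for _ in range(iters):
        w = [sum(G[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]  # renormalize each iteration
    return v

v = dominant_right_singular_vector(H)
print(v)
```

A real system would use a library SVD over complex per-subcarrier channels; the iteration above only shows why the conditional mean's SVD yields the precoding directions.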
Abstract:
This paper describes a simple, low-cost approach to adding an element of haptic interaction within a virtual environment. Using off-the-shelf hardware and software, we describe a simple setup that can be used to physically explore virtual objects in space. This setup comprises a prototype glove with a number of vibrating actuators to provide the haptic feedback, a Kinect camera for tracking the user's hand, and a virtual reality development environment. As a proof of concept and to test the efficiency of the system as well as its potential applications, we developed a simple application in which we created four different shapes within a virtual environment and tried to explore them and guess their shape through touch alone.