137 results for "Least manipulable envy-free rules"
Abstract:
This paper describes a systematic survey of free software solutions and techniques for the problem of computer recognition of art imagery.
Abstract:
Is it important to negotiate on proportions rather than on numbers? To answer this question, we analyze the behavior of well-known bargaining solutions and the claims rules they induce when they are applied to a "proportionally transformed" bargaining set $S^P$, the so-called bargaining-in-proportions set. The idea of applying bargaining solutions to claims problems was already developed in Dagan and Volij (1993), who apply bargaining solutions to the bargaining set defined by the claims and the endowment. A comparison between our results and theirs is provided. Keywords: Bargaining problem, Claims problem, Proportional, Constrained Equal Awards, Constrained Equal Losses, Nash bargaining solution. JEL classification: C71, D63, D71.
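For reference, the three rules named in the keywords have standard definitions in the claims literature (stated here in textbook form, not quoted from the paper). For an endowment $E$ and claims $c_1,\dots,c_n$ with $\sum_i c_i \ge E$:

\[
P_i(c,E) = \frac{c_i}{\sum_j c_j}\,E, \qquad
CEA_i(c,E) = \min\{c_i,\lambda\}, \qquad
CEL_i(c,E) = \max\{0,\,c_i-\mu\},
\]

where $\lambda$ and $\mu$ are chosen so that the awards sum to $E$. The paper's question is which of these rules the Nash and related bargaining solutions induce once the bargaining set is transformed into proportions.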
Abstract:
The availability of induced pluripotent stem cells (iPSCs) has created extraordinary opportunities for modeling and perhaps treating human disease. However, all reprogramming protocols used to date involve the use of products of animal origin. Here, we set out to develop a protocol to generate and maintain human iPSCs that would be entirely devoid of xenobiotics. We first developed a xeno-free cell culture medium that supported the long-term propagation of human embryonic stem cells (hESCs) to a similar extent as conventional media containing animal-origin products or commercially available xeno-free medium. We also derived primary cultures of human dermal fibroblasts under strict xeno-free conditions (XF-HFF), and we show that they can be used both as the cell source for iPSC generation and as autologous feeder cells to support their growth. We also replaced other reagents of animal origin (trypsin, gelatin, Matrigel) with their recombinant equivalents. Finally, we used vesicular stomatitis virus G-pseudotyped retroviral particles expressing a polycistronic construct encoding Oct4, Sox2, Klf4, and GFP to reprogram XF-HFF cells under xeno-free conditions. A total of 10 xeno-free human iPSC lines were generated, which could be continuously passaged in xeno-free conditions and maintained characteristics indistinguishable from hESCs, including colony morphology and growth behavior, expression of pluripotency-associated markers, and pluripotent differentiation ability in vitro and in teratoma assays. Overall, the results presented here demonstrate that human iPSCs can be generated and maintained under strict xeno-free conditions and provide a path to good manufacturing practice (GMP) applicability that should facilitate the clinical translation of iPSC-based therapies.
Abstract:
This paper presents a new registration algorithm, called Temporal Diffeomorphic Free Form Deformation (TDFFD), and its application to motion and strain quantification from a sequence of 3D ultrasound (US) images. The originality of our approach resides in enforcing time consistency by representing the 4D velocity field as the sum of continuous spatiotemporal B-spline kernels. The spatiotemporal displacement field is then recovered through forward Eulerian integration of the non-stationary velocity field. The strain tensor is computed locally using the spatial derivatives of the reconstructed displacement field. The energy functional considered in this paper weighs two terms: the image similarity and a regularization term. The image similarity metric is the sum of squared differences between the intensities of each frame and a reference one. Any frame in the sequence can be chosen as reference. The regularization term is based on the incompressibility of myocardial tissue. TDFFD was compared to pairwise 3D FFD and 3D+t FFD, both on displacement and velocity fields, on a set of synthetic 3D US images with different noise levels. TDFFD showed increased robustness to noise compared to these two state-of-the-art algorithms. TDFFD also proved to be more resistant to a reduced temporal resolution when decimating this synthetic sequence. Finally, this synthetic dataset was used to determine optimal settings of the TDFFD algorithm. Subsequently, TDFFD was applied to a database of cardiac 3D US images of the left ventricle acquired from 9 healthy volunteers and 13 patients treated by Cardiac Resynchronization Therapy (CRT). On healthy cases, uniform strain patterns were observed over all myocardial segments, as physiologically expected. On all CRT patients, the improvement in synchrony of regional longitudinal strain correlated with CRT clinical outcome as quantified by the reduction of end-systolic left ventricular volume at follow-up (6 and 12 months), showing the potential of the proposed algorithm for the assessment of CRT.
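As a rough illustration of the forward Eulerian integration step described above, the following sketch integrates a generic non-stationary velocity field to recover point trajectories (a minimal sketch in plain NumPy; the callable name and sampling scheme are assumptions, not the authors' implementation, which parameterizes the velocity field with spatiotemporal B-spline kernels):

import numpy as np

def integrate_trajectories(velocity, points, times):
    """Forward Euler integration of a non-stationary velocity field.

    velocity(p, t): returns an (N, 3) array of velocities at positions p, time t
    points: (N, 3) array of initial material point positions
    times: increasing 1-D array of sample times
    Returns a (T, N, 3) array of positions; subtracting `points`
    gives the spatiotemporal displacement field.
    """
    trajectory = [points.copy()]
    for t0, t1 in zip(times[:-1], times[1:]):
        p = trajectory[-1]
        # x(t + dt) = x(t) + v(x(t), t) * dt
        trajectory.append(p + (t1 - t0) * velocity(p, t0))
    return np.stack(trajectory)

The strain computation would then differentiate this displacement field spatially, as the abstract describes.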
Abstract:
This paper describes a Computer-Supported Collaborative Learning (CSCL) case study in engineering education carried out within the context of a network management course. The case study shows that the use of two computing tools developed by the authors and based on Free and Open-Source Software (FOSS) provides significant educational benefits over traditional engineering pedagogical approaches, in terms of the acquisition of both concepts and engineering competencies. First, the Collage authoring tool guides and supports the course teacher in authoring computer-interpretable representations (using the IMS Learning Design standard notation) of effective collaborative pedagogical designs. Second, the Gridcole system supports the enactment of those designs by guiding the students through the prescribed sequence of learning activities. The paper introduces the goals and context of the case study, elaborates on how Collage and Gridcole were employed, describes the applied evaluation methodology, and discusses the most significant findings derived from the case study.
Abstract:
Can rules be used to shield public resources from political interference? The Brazilian constitution and national tax code stipulate that revenue-sharing transfers to municipal governments be determined by county size in terms of estimated population. In this paper I document that the population estimates that went into the transfer allocation formula for the year 1991 were manipulated, resulting in significant transfer differentials throughout the 1990s. I test whether, conditional on county characteristics that might account for the manipulation, center-local party alignment, party popularity, and the extent of interparty fragmentation at the county level are correlated with the estimated populations of 1991. Results suggest that revenue-sharing transfers were targeted at right-wing national deputies in electorally fragmented counties as well as at aligned local executives.
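Written out, the test described above amounts to a reduced-form regression of roughly the following shape (my notation, not the paper's):

\[
\widehat{pop}_{i,1991} = \alpha + \beta\,align_i + \gamma\,popularity_i + \delta\,frag_i + X_i'\theta + \varepsilon_i,
\]

where $X_i$ collects the county characteristics that might otherwise account for the manipulation, and the coefficients of interest are $\beta$, $\gamma$, and $\delta$.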
Abstract:
Using a new, long-run dataset on capital account regulations in a group of 16 countries over the period 1890-2001, we investigate why equity return correlations changed over the last century and show that correlations increase as financial markets are liberalized. These findings are robust to controlling for both the Forbes-Rigobon bias and global averages in equity return correlations. We further test the robustness of our conclusions and show that greater synchronization of fundamentals is not the main cause of the increasing correlations. These results imply that the home bias puzzle may be smaller than traditionally claimed.
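The Forbes-Rigobon bias mentioned above is the mechanical rise of sample correlations in high-volatility periods; the standard correction (stated here from the contagion literature generally, not from this paper) adjusts a conditional correlation $\rho$ to

\[
\rho^{*} = \frac{\rho}{\sqrt{1 + \delta\,(1-\rho^{2})}},
\]

where $\delta$ is the relative increase in the variance of returns in the high-volatility sample.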
Abstract:
The defaults of Philip II have attained mythical status as the origin of sovereign debt crises. Four times during his reign the king failed to honor his debts and had to renegotiate borrowing contracts. In this paper, we reassess the fiscal position of Habsburg Spain. New archival evidence allows us to derive comprehensive estimates of debt and revenue. These show that primary surpluses were sufficient to make the king's debt sustainable in most scenarios. Spain's debt burden was manageable up to the 1580s, and its fiscal position only deteriorated for good after the defeat of the "Invincible Armada." We also estimate fiscal policy reaction functions, and show that Spain under the Habsburgs was at least as "responsible" as the US in the 20th century or Britain in the 18th century. Our results suggest that the outcome of uncertain events such as wars may influence a history of default more than strict adherence to fiscal rules does.
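Fiscal policy reaction functions of the kind estimated here typically follow the Bohn specification (a standard form, assumed rather than quoted from the paper):

\[
ps_t = \alpha + \beta\,b_t + \varepsilon_t, \qquad b_{t+1} = \frac{1+r}{1+g}\,b_t - ps_t,
\]

where $ps_t$ is the primary surplus and $b_t$ the debt stock, both scaled by revenue or output; a positive response $\beta$ of surpluses to debt is the usual marker of "responsible" fiscal behavior.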
Abstract:
In this paper, I consider a general and informationally efficient approach to determining the optimal access rule and show that there exists a simple rule that achieves the Ramsey outcome as the unique equilibrium when networks compete in linear prices without network-based price discrimination. My approach is informationally efficient in the sense that the regulator is required to know only the marginal cost structure, i.e., the marginal cost of making and terminating a call. The approach is general in that access prices can depend not only on the marginal costs but also on the retail prices, which can be observed by consumers and therefore by the regulator as well. In particular, I consider the set of linear access pricing rules, which includes any fixed access price, the Efficient Component Pricing Rule (ECPR), and the Modified ECPR as special cases. I show that in this set there is a unique access rule that achieves the Ramsey outcome as the unique equilibrium as long as there is at least a mild degree of substitutability among the networks' services.
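For readers unfamiliar with the ECPR: in its textbook form (standard notation, not the paper's), it sets the access charge at the direct cost of termination plus the retail margin the access provider forgoes,

\[
a_{ECPR} = c_T + (p - c),
\]

where $c_T$ is the marginal termination cost, $p$ the retail price, and $c$ the total marginal cost of a call. The linear rules studied in the paper let the access price depend on both marginal costs and retail prices, nesting fixed access prices and the (Modified) ECPR as special cases.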
Abstract:
Manipulation of government finances for the benefit of narrowly defined groups is usually thought to be limited to the part of the budget over which politicians exercise discretion in the short run, such as earmarks. Analyzing a revenue-sharing program between the central and local governments in Brazil that uses an allocation formula based on local population estimates, I document two main results: first, that the population estimates entering the formula were manipulated and, second, that this manipulation was political in nature. Consistent with swing-voter targeting by the right-wing central government, I find that municipalities with roughly equal right-wing and non-right-wing vote shares benefited relative to opposition or conservative core-support municipalities. These findings suggest that the exclusive focus on discretionary transfers in the extant empirical literature on special-interest politics may understate the true scope of tactical redistribution that goes on under programmatic disguise.
Abstract:
One of the assumptions of the Capacitated Facility Location Problem (CFLP) is that demand is known and fixed. Most often, this is not the case when managers take strategic decisions such as locating facilities and assigning demand points to those facilities. In this paper we consider demand as stochastic and model each of the facilities as an independent queue. Stochastic models of manufacturing systems and deterministic location models are put together in order to obtain a formula for the backlogging probability at a potential facility location. Several solution techniques have been proposed to solve the CFLP. One of the most recently proposed heuristics, a Reactive Greedy Adaptive Search Procedure, is implemented in order to solve the model formulated. We present some computational experiments in order to evaluate the heuristic's performance and to illustrate the use of this new formulation of the CFLP. The paper finishes with a simple simulation exercise.
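The paper derives its own backlogging formula; purely as a stand-in to make the idea concrete, the sketch below computes the probability that an arriving order finds a facility full when the facility is modeled as an M/M/1 queue with a finite buffer of K orders (the M/M/1/K assumption is mine, not necessarily the paper's queueing model):

def backlog_probability(arrival_rate, service_rate, buffer_size):
    """Blocking probability of an M/M/1/K queue.

    By PASTA, an arriving order is backlogged with the stationary
    probability that the system already holds K = buffer_size orders:
        P_K = (1 - rho) * rho**K / (1 - rho**(K + 1))   for rho != 1
        P_K = 1 / (K + 1)                               for rho == 1
    """
    rho = arrival_rate / service_rate
    K = buffer_size
    if abs(rho - 1.0) < 1e-12:
        return 1.0 / (K + 1)
    return (1 - rho) * rho**K / (1 - rho**(K + 1))

# Example: a facility serving 10 orders/day with 8 arriving/day and room for 5
# backlog_probability(8.0, 10.0, 5) -> about 0.089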
Abstract:
We estimate a forward-looking monetary policy reaction function for the postwar United States economy, before and after Volcker's appointment as Fed Chairman in 1979. Our results point to substantial differences in the estimated rule across periods. In particular, interest rate policy in the Volcker-Greenspan period appears to have been much more sensitive to changes in expected inflation than in the pre-Volcker period. We then compare some of the implications of the estimated rules for the equilibrium properties of inflation and output, using a simple macroeconomic model, and show that the Volcker-Greenspan rule is stabilizing.
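Forward-looking reaction functions in this literature are usually written as (standard notation, given here for orientation rather than quoted from the paper):

\[
r_t^{*} = r^{*} + \beta\bigl(E[\pi_{t,k}\mid\Omega_t] - \pi^{*}\bigr) + \gamma\,E[x_{t,q}\mid\Omega_t],
\qquad
r_t = \rho\,r_{t-1} + (1-\rho)\,r_t^{*},
\]

where $\pi_{t,k}$ is inflation between $t$ and $t+k$, $x_{t,q}$ the output gap, $\Omega_t$ the information set, and $\rho$ a smoothing parameter; an inflation response $\beta > 1$ (the Taylor principle) is the usual condition for a stabilizing rule.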
Abstract:
Scoring rules that elicit an entire belief distribution through the elicitation of point beliefs are time-consuming and demand considerable cognitive effort. Moreover, the results are valid only when agents are risk-neutral or when one uses probabilistic rules. We investigate a class of rules in which the agent has to choose an interval and is rewarded (deterministically) on the basis of the chosen interval and the realization of the random variable. We formulate an efficiency criterion for such rules and present a specific interval scoring rule. For single-peaked beliefs, our rule gives information about both the location and the dispersion of the belief distribution. These results hold for all concave utility functions.
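As an illustrative example of a deterministic interval rule (not the specific rule presented in the paper): the agent reports an interval $I=[a,b]$ and, for realization $x$, receives

\[
S(I,x) = \nu\,\mathbf{1}\{x\in I\} - \kappa\,(b-a),
\]

trading off coverage of the realization against the width of the chosen interval; the reported location and width then carry information about the location and dispersion of the agent's beliefs.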
Abstract:
Minimax lower bounds for concept learning state, for example, that for each sample size $n$ and learning rule $g_n$, there exists a distribution of the observation $X$ and a concept $C$ to be learnt such that the expected error of $g_n$ is at least a constant times $V/n$, where $V$ is the VC dimension of the concept class. However, these bounds do not tell anything about the rate of decrease of the error for a {\sl fixed} distribution-concept pair. In this paper we investigate minimax lower bounds in such a stronger sense. We show that for several natural $k$-parameter concept classes, including the class of linear halfspaces, the class of balls, the class of polyhedra with a certain number of faces, and a class of neural networks, for any {\sl sequence} of learning rules $\{g_n\}$, there exists a fixed distribution of $X$ and a fixed concept $C$ such that the expected error is larger than a constant times $k/n$ for {\sl infinitely many} $n$. We also obtain such strong minimax lower bounds for the tail distribution of the probability of error, which extend the corresponding minimax lower bounds.
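In symbols, the "stronger sense" of the abstract reads: there is a constant $c>0$ (depending on the concept class) such that for every sequence of learning rules $\{g_n\}$ there exist a fixed distribution of $X$ and a fixed concept $C$ with

\[
E\,L(g_n) \ge c\,\frac{k}{n} \quad \text{for infinitely many } n,
\]

where $L(g_n)$ denotes the probability of error of $g_n$.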
Abstract:
Recent research on the dynamics of moral behavior has documented two contrasting phenomena: moral consistency and moral balancing. Moral balancing refers to the phenomenon whereby behaving (un)ethically decreases the likelihood of doing so again at a later time. Moral consistency describes the opposite pattern: engaging in (un)ethical behavior increases the likelihood of doing so later on. Three studies support the hypothesis that individuals' ethical mindset (i.e., outcome-based versus rule-based) moderates the impact of an initial (un)ethical act on the likelihood of behaving ethically on a subsequent occasion. More specifically, an outcome-based mindset facilitates moral balancing and a rule-based mindset facilitates moral consistency.