994 results for Equilibrium Problem


Relevance:

20.00%

Publisher:

Abstract:

The aim of this publication is to analyze the policy that Turkey pursues towards foreigners seeking shelter on its territory. The salience of the issue has been driven in recent years above all by the civil war in Syria, as a result of which more than 700 thousand Syrians have found themselves on Turkish territory. Particularly controversial in this context is Turkey's application of double standards when granting immigrants refugee status under the Convention. Upon acceding to the Convention Relating to the Status of Refugees and the New York Protocol, Turkey, as one of only four states in the world, reserved the right to apply the so-called geographical criterion in this matter. As a result, while refugee status may be granted to persons arriving from beyond Turkey's western borders, those fleeing countries such as Syria, Iran, or Iraq are, from a formal point of view, merely "asylum seekers" (Turkish: sığınmacı), which leaves them without Convention protection. The aim of the article is therefore not only to analyze the legal and actual situation of the victims of the Syrian civil war arriving on Turkish territory, but also to attempt to predict how this situation may develop. To make the analysis as reliable as possible, both English- and Turkish-language source materials were consulted.

Relevance:

20.00%

Publisher:

Abstract:

Resource Allocation Problems (RAPs) are concerned with the optimal allocation of resources to tasks. Problems in fields such as search theory, statistics, finance, economics, logistics, and sensor and wireless networks fit this formulation. In the literature, several centralized/synchronous algorithms have been proposed, including the recently proposed auction algorithm, RAP Auction. Here we present an asynchronous implementation of RAP Auction for distributed RAPs.
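The abstract does not specify RAP Auction itself; as a hedged illustration of the auction idea it builds on, here is a minimal synchronous auction algorithm for the classical assignment problem (in the spirit of Bertsekas), with a made-up payoff matrix:

```python
# Minimal auction algorithm for the assignment problem.
# Illustrative sketch only -- not the RAP Auction algorithm of the paper.

def auction_assignment(payoff, eps=0.01):
    """payoff[i][j]: value of assigning person i to object j (square matrix)."""
    n = len(payoff)
    prices = [0.0] * n
    owner = [None] * n          # owner[j] = person currently holding object j
    assigned = [None] * n       # assigned[i] = object held by person i
    unassigned = list(range(n))
    while unassigned:
        i = unassigned.pop()
        # Net value of each object for person i at current prices.
        values = [payoff[i][j] - prices[j] for j in range(n)]
        best = max(range(n), key=lambda j: values[j])
        second = max(v for j, v in enumerate(values) if j != best)
        # Bid raises the price so a later bidder must strictly outvalue it.
        prices[best] += values[best] - second + eps
        if owner[best] is not None:          # evict the previous owner
            assigned[owner[best]] = None
            unassigned.append(owner[best])
        owner[best] = i
        assigned[i] = best
    return assigned

payoff = [[10, 5, 2], [8, 9, 3], [4, 6, 7]]
print(auction_assignment(payoff))  # each person ends up on a distinct object
```

Asynchronous variants relax the round structure: bids arrive and prices update independently per resource, which is the regime the abstract targets.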

Relevance:

20.00%

Publisher:

Abstract:

Interdomain routing on the Internet is performed using route preference policies specified independently and arbitrarily by each Autonomous System in the network. These policies are used in the border gateway protocol (BGP) by each AS when selecting next-hop choices for routes to each destination. Conflicts between policies used by different ASs can lead to routing instabilities that, potentially, cannot be resolved no matter how long BGP is run. The Stable Paths Problem (SPP) is an abstract graph theoretic model of the problem of selecting next-hop routes for a destination. A stable solution to the problem is a set of next-hop choices, one for each AS, that is compatible with the policies of each AS. In a stable solution each AS has selected its best next-hop given that the next-hop choices of all neighbors are fixed. BGP can be viewed as a distributed algorithm for solving SPP. In this report we consider the stable paths problem, as well as a family of restricted variants of the stable paths problem, which we call F stable paths problems. We show that two very simple variants of the stable paths problem are also NP-complete. In addition we show that for networks with a DAG topology, there is an efficient centralized algorithm to solve the stable paths problem, and that BGP always efficiently converges to a stable solution on such networks.
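The DAG case lends itself to a short sketch. Assuming edges are oriented toward the destination and each AS ranks its usable neighbors (the topology and preference lists below are hypothetical, not from the report), one pass in topological order fixes a stable next hop for every AS:

```python
# Sketch: solving the stable paths problem on a DAG by a single pass.
# succ[u]: neighbors u may forward through (edges u -> v, oriented to dest);
# pref[u]: u's neighbors, most preferred first. Illustrative data only.

from graphlib import TopologicalSorter  # Python 3.9+

def stable_paths_dag(succ, pref, dest):
    # Treat successors as "dependencies" so every node is processed
    # after all nodes it can forward through.
    order = TopologicalSorter({u: succ.get(u, []) for u in succ}).static_order()
    has_route = {dest}
    next_hop = {}
    for u in order:
        if u == dest:
            continue
        # The most preferred neighbor that already has a route wins;
        # no later choice can destabilize it because the graph is acyclic.
        for v in pref.get(u, []):
            if v in has_route:
                next_hop[u] = v
                has_route.add(u)
                break
    return next_hop

succ = {"d": [], "a": ["d"], "b": ["a", "d"]}
pref = {"a": ["d"], "b": ["a", "d"]}   # b prefers routing via a
print(stable_paths_dag(succ, pref, "d"))
```

Each AS's choice is a best response given its successors' fixed choices, which is exactly the stability condition in the abstract.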

Relevance:

20.00%

Publisher:

Abstract:

We introduce Collocation Games as the basis of a general framework for modeling, analyzing, and facilitating the interactions between the various stakeholders in distributed systems in general, and in cloud computing environments in particular. Cloud computing enables fixed-capacity (processing, communication, and storage) resources to be offered by infrastructure providers as commodities for sale at a fixed cost in an open marketplace to independent, rational parties (players) interested in setting up their own applications over the Internet. Virtualization technologies enable the partitioning of such fixed-capacity resources so as to allow each player to dynamically acquire appropriate fractions of the resources for unencumbered use. In such a paradigm, the resource management problem reduces to that of partitioning the entire set of applications (players) into subsets, each of which is assigned to fixed-capacity cloud resources. If the infrastructure and the various applications are under a single administrative domain, this partitioning reduces to an optimization problem whose objective is to minimize the overall deployment cost. In a marketplace, in which the infrastructure provider is interested in maximizing its own profit, and in which each player is interested in minimizing its own cost, it should be evident that a global optimization is precisely the wrong framework. Rather, in this paper we use a game-theoretic framework in which the assignment of players to fixed-capacity resources is the outcome of a strategic "Collocation Game". Although we show that determining the existence of an equilibrium for collocation games in general is NP-hard, we present a number of simplified, practically-motivated variants of the collocation game for which we establish convergence to a Nash Equilibrium, and for which we derive convergence and price of anarchy bounds. 
In addition to these analytical results, we present an experimental evaluation of implementations of some of these variants for cloud infrastructures consisting of a collection of multidimensional resources of homogeneous or heterogeneous capacities. Experimental results using trace-driven simulations and synthetically generated datasets corroborate our analytical results and also illustrate how collocation games offer a feasible distributed resource management alternative for autonomic/self-organizing systems, in which the adoption of a global optimization approach (centralized or distributed) would be neither practical nor justifiable.
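The convergence results above can be illustrated with a toy cost-sharing collocation game (this is a generic sketch, not the paper's game): each player picks one of several fixed-cost, fixed-capacity machines, a machine's cost is split equally among its occupants, and best-response dynamics runs until no player can profitably deviate.

```python
# Best-response dynamics for a toy collocation game.
# Machine costs, capacities, and player demands are hypothetical.

def player_cost(place, i, cost):
    m = place[i]
    return cost[m] / place.count(m)          # equal cost share on machine m

def best_response_dynamics(demand, cost, capacity, max_rounds=100):
    n = len(demand)
    place = list(range(n))                   # each player starts alone
    for _ in range(max_rounds):
        moved = False
        for i in range(n):
            best_m, best_c = place[i], player_cost(place, i, cost)
            for m in range(len(cost)):
                if m == place[i]:
                    continue
                load = sum(demand[j] for j in range(n) if place[j] == m)
                if load + demand[i] > capacity[m]:
                    continue                 # move would violate capacity
                c = cost[m] / (place.count(m) + 1)  # share after joining m
                if c < best_c - 1e-12:
                    best_m, best_c = m, c
            if best_m != place[i]:
                place[i] = best_m
                moved = True
        if not moved:                        # no profitable deviation: Nash equilibrium
            return place
    return place

print(best_response_dynamics([1, 1, 1], [9, 9, 9], [3, 3, 3]))
```

In this symmetric instance the players consolidate onto one machine, sharing its fixed cost; cost-sharing games of this form are potential games, which is why the dynamics terminate.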

Relevance:

20.00%

Publisher:

Abstract:

In many networked applications, independent caching agents cooperate by servicing each other's miss streams, without revealing the operational details of the caching mechanisms they employ. Inference of such details could be instrumental for many other processes. For example, it could be used for optimized forwarding (or routing) of one's own miss stream (or content) to available proxy caches, or for making cache-aware resource management decisions. In this paper, we introduce the Cache Inference Problem (CIP) as that of inferring the characteristics of a caching agent, given the miss stream of that agent. While CIP is unsolvable in its most general form, there are special cases of practical importance in which it is solvable, including when the request stream follows an Independent Reference Model (IRM) with generalized power-law (GPL) demand distribution. To that end, we design two basic "litmus" tests that are able to detect LFU and LRU replacement policies, the effective size of the cache and of the object universe, and the skewness of the GPL demand for objects. Using extensive experiments under synthetic as well as real traces, we show that our methods infer such characteristics accurately and quite efficiently, and that they remain robust even when the IRM/GPL assumptions do not hold, and even when the underlying replacement policies are not "pure" LFU or LRU. We exemplify the value of our inference framework by considering example applications.
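The paper's litmus tests themselves are not reproduced here, but the setting is easy to sketch: simulate LRU and LFU caches over an IRM request stream with Zipf-like (GPL) popularity and look at the miss streams an outside observer would see. All parameters below are illustrative.

```python
# Sketch of the CIP setting: miss streams of LRU and LFU caches under an
# IRM request stream with power-law demand. Parameters are made up.

import random
from collections import Counter, OrderedDict

def zipf_stream(n_objects, alpha, length, seed=0):
    rng = random.Random(seed)
    weights = [1 / (i + 1) ** alpha for i in range(n_objects)]
    return rng.choices(range(n_objects), weights=weights, k=length)

def misses_lru(stream, size):
    cache, misses = OrderedDict(), []
    for x in stream:
        if x in cache:
            cache.move_to_end(x)             # refresh recency
        else:
            misses.append(x)
            cache[x] = True
            if len(cache) > size:
                cache.popitem(last=False)    # evict least recently used
    return misses

def misses_lfu(stream, size):
    cache, counts, misses = set(), Counter(), []
    for x in stream:
        counts[x] += 1
        if x not in cache:
            misses.append(x)
            cache.add(x)
            if len(cache) > size:
                cache.remove(min(cache, key=lambda y: counts[y]))  # least frequent out
    return misses

stream = zipf_stream(100, 0.8, 5000)
print(len(misses_lru(stream, 10)), len(misses_lfu(stream, 10)))
```

An inference procedure in the spirit of the paper would work in the opposite direction: given only one of these miss streams, decide which policy produced it and estimate the cache size and demand skewness.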

Relevance:

20.00%

Publisher:

Abstract:

The combinatorial Dirichlet problem is formulated, and an algorithm for solving it is presented. This provides an effective method for interpolating missing data on weighted graphs of arbitrary connectivity. Image processing examples are shown, and the relation to anisotropic diffusion is discussed.
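The idea can be sketched directly (the graph and boundary values below are illustrative): fix the known values and relax every interior node to the weighted average of its neighbors, yielding a discrete harmonic function that interpolates the missing data.

```python
# Combinatorial Dirichlet interpolation by iterative weighted averaging.
# Graph, weights, and boundary values are illustrative.

def dirichlet_interpolate(weights, boundary, n_iter=500):
    """weights[u][v]: edge weight between u and v; boundary: {node: fixed value}."""
    values = {u: boundary.get(u, 0.0) for u in weights}
    for _ in range(n_iter):
        for u in weights:
            if u in boundary:
                continue                     # boundary nodes stay fixed
            total = sum(weights[u].values())
            values[u] = sum(w * values[v] for v, w in weights[u].items()) / total
    return values

# Path graph a - b - c with unit weights; a = 0 and c = 1 are known.
w = {"a": {"b": 1.0}, "b": {"a": 1.0, "c": 1.0}, "c": {"b": 1.0}}
vals = dirichlet_interpolate(w, {"a": 0.0, "c": 1.0})
print(round(vals["b"], 3))  # the interior node converges to 0.5
```

In practice the same linear system is solved directly via the graph Laplacian rather than by relaxation, but the fixed point is identical.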

Relevance:

20.00%

Publisher:

Abstract:

An incremental, nonparametric probability estimation procedure using the fuzzy ARTMAP neural network is introduced. In slow-learning mode, fuzzy ARTMAP searches for patterns of data on which to build ever more accurate estimates. In max-nodes mode, the network initially learns a fixed number of categories, and weights are then adjusted gradually.
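Setting the ARTMAP machinery aside, the core notion of incremental nonparametric probability estimation can be illustrated with a running frequency table updated one sample at a time (the data below are made up; this is not the fuzzy ARTMAP procedure itself):

```python
# Minimal incremental nonparametric estimator: a running frequency table.
# Illustrative only -- fuzzy ARTMAP builds category structure on top of this idea.

from collections import defaultdict

class IncrementalEstimator:
    def __init__(self):
        self.counts = defaultdict(int)
        self.total = 0

    def update(self, label):
        """Incorporate one observation; no pass over past data is needed."""
        self.counts[label] += 1
        self.total += 1

    def prob(self, label):
        return self.counts[label] / self.total if self.total else 0.0

est = IncrementalEstimator()
for label in ["a", "a", "b", "a"]:
    est.update(label)
print(est.prob("a"))  # 3 of 4 samples -> 0.75
```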

Relevance:

20.00%

Publisher:

Abstract:

A neural network model of 3-D visual perception and figure-ground separation by visual cortex is introduced. The theory provides a unified explanation of how a 2-D image may generate a 3-D percept; how figures pop-out from cluttered backgrounds; how spatially sparse disparity cues can generate continuous surface representations at different perceived depths; how representations of occluded regions can be completed and recognized without usually being seen; how occluded regions can sometimes be seen during percepts of transparency; how high spatial frequency parts of an image may appear closer than low spatial frequency parts; how sharp targets are detected better against a figure and blurred targets are detected better against a background; how low spatial frequency parts of an image may be fused while high spatial frequency parts are rivalrous; how sparse blue cones can generate vivid blue surface percepts; how 3-D neon color spreading, visual phantoms, and tissue contrast percepts are generated; how conjunctions of color-and-depth may rapidly pop-out during visual search. These explanations are derived from an ecological analysis of how monocularly viewed parts of an image inherit the appropriate depth from contiguous binocularly viewed parts, as during DaVinci stereopsis. The model predicts the functional role and ordering of multiple interactions within and between the two parvocellular processing streams that join LGN to prestriate area V4. Interactions from cells representing larger scales and disparities to cells representing smaller scales and disparities are of particular importance.

Relevance:

20.00%

Publisher:

Abstract:

In this paper, we examine exchange rates in Vietnam's transitional economy. Evidence of long-run equilibrium is established in most cases through a single co-integrating vector among the endogenous variables that determine the real exchange rates. This supports relative PPP, in which the ECT of the system can be combined linearly into a stationary process, reducing deviation from PPP in the long run. Restricted coefficient vectors ß' = (1, 1, -1) for the real exchange rates of the currencies in question are not rejected. This evidence for relative PPP adds to findings by many researchers, including Flre et al. (1999), Lee (1999), Johnson (1990), Culver and Papell (1999), and Cuddington and Liang (2001). Instead of testing different time series against a common base currency, we use different base currencies (USD, GBP, JPY and EUR). By doing so we ask whether the theory posits significant differences against any one currency. We find consensus, given inevitable technical differences, even with the smaller data sample for EUR. Speeds of convergence to PPP and adjustment are faster than results reported by other studies for developed economies, using both observed and bootstrapped half-life (HL) measures. Perhaps a better explanation is the adjustment from the hyperinflation period, after which the theory indicates that the adjusting process actually accelerates. We observe that deviation appears to have been large in the early stages of the reform, mostly overvaluation. Over time, its correction took place, leading significant deviations to gradually disappear.
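The half-life (HL) measures mentioned above follow from a standard formula: if PPP deviations follow an AR(1) process q_t = ρ·q_{t-1} + e_t, the half-life of a shock is ln(0.5)/ln(ρ). The ρ values below are hypothetical, chosen only to show the computation.

```python
# Half-life of convergence to PPP under an AR(1) deviation process.
# The persistence parameters rho below are illustrative.

import math

def half_life(rho):
    """Periods until a deviation decays to half its size, given AR(1) rho."""
    return math.log(0.5) / math.log(rho)

for rho in (0.9, 0.95):
    print(round(half_life(rho), 2))   # ~6.58 and ~13.51 periods
```

A faster speed of convergence, as the abstract reports for Vietnam, corresponds to a smaller ρ and hence a shorter half-life.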

Relevance:

20.00%

Publisher:

Abstract:

While there is growing interest in measuring the size and scope of local spillovers, it is well understood that such spillovers cannot be distinguished from unobservable local attributes using solely the observed location decisions of individuals or firms. We propose an empirical strategy for recovering estimates of spillovers in the presence of unobserved local attributes for a broadly applicable class of equilibrium sorting models. Our approach relies on an IV strategy derived from the internal logic of the sorting model itself. We show practically how the strategy is implemented, provide intuition for our instruments, discuss the role of effective choice-set variation in identifying the model, and carry out a series of Monte Carlo simulations to demonstrate performance in small samples. © 2007 The Author(s). Journal compilation Royal Economic Society 2007.
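The paper derives its instruments from the sorting model itself; as a generic, hedged illustration of why an instrument is needed at all when a regressor is correlated with an unobserved local attribute, consider this synthetic Monte Carlo (all numbers are made up):

```python
# Synthetic illustration: OLS is biased when the regressor x is correlated
# with an unobservable u, while an instrument z uncorrelated with u recovers
# the true coefficient. Not the paper's estimator -- a generic IV sketch.

import random

def simulate(n=20000, beta=2.0, seed=1):
    rng = random.Random(seed)
    z = [rng.gauss(0, 1) for _ in range(n)]                    # exogenous instrument
    u = [rng.gauss(0, 1) for _ in range(n)]                    # unobserved attribute
    x = [zi + ui + rng.gauss(0, 1) for zi, ui in zip(z, u)]    # endogenous regressor
    y = [beta * xi + ui + rng.gauss(0, 1) for xi, ui in zip(x, u)]
    return z, x, y

def cov(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    return sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b)) / len(a)

z, x, y = simulate()
ols = cov(x, y) / cov(x, x)    # biased upward: u enters both x and y
iv = cov(z, y) / cov(z, x)     # consistent: z is uncorrelated with u
print(round(ols, 2), round(iv, 2))
```

With these parameters the OLS slope converges to about 2.33 rather than the true 2.0, while the IV estimate centers on 2.0; the paper's contribution is constructing valid z's from the equilibrium structure of the sorting model.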

Relevance:

20.00%

Publisher:

Abstract:

This paper studies the multiplicity-correction effect of standard Bayesian variable-selection priors in linear regression. Our first goal is to clarify when, and how, multiplicity correction happens automatically in Bayesian analysis, and to distinguish this correction from the Bayesian Ockham's-razor effect. Our second goal is to contrast empirical-Bayes and fully Bayesian approaches to variable selection through examples, theoretical results and simulations. Considerable differences between the two approaches are found. In particular, we prove a theorem that characterizes a surprising asymptotic discrepancy between fully Bayes and empirical Bayes. This discrepancy arises from a different source than the failure to account for hyperparameter uncertainty in the empirical-Bayes estimate. Indeed, even at the extreme, when the empirical-Bayes estimate converges asymptotically to the true variable-inclusion probability, the potential for a serious difference remains. © Institute of Mathematical Statistics, 2010.
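The automatic multiplicity correction can be made concrete with a standard calculation (a textbook beta-binomial setup, offered here as background rather than as the paper's specific prior): if each variable is included with probability p and p ~ Uniform(0, 1), then a model with k of m candidate variables receives prior mass 1/((m+1)·C(m, k)), so adding spurious candidates automatically shrinks each individual model's prior weight.

```python
# Beta-binomial prior over models: with inclusion probability p ~ Uniform(0,1),
# integrating p^k (1-p)^(m-k) over p gives 1 / ((m + 1) * C(m, k)).

from math import comb

def model_prior(k, m):
    """Prior mass of a specific model with k of m variables included."""
    return 1 / ((m + 1) * comb(m, k))

# Prior mass of one fixed 2-variable model shrinks as the candidate pool grows.
for m in (5, 10, 20):
    print(m, model_prior(2, m))
```

Summing over all C(m, k) models of each size confirms the masses total one, and the shrinking per-model mass is precisely the multiplicity penalty the paper analyzes.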

Relevance:

20.00%

Publisher:

Abstract:

Many consumer durable retailers often do not advertise their prices and instead ask consumers to call them for prices. It is easy to see that this practice increases the consumers' cost of learning the prices of products they are considering, yet firms commonly use such practices. Not advertising prices may reduce the firm's advertising costs, but the strategic effects of doing so are not clear. Our objective is to examine the strategic effects of this practice. In particular, how does making price discovery more difficult for consumers affect competing retailers' prices, service decisions, and profits? We develop a model in which a manufacturer sells its product through a high-service retailer and a low-service retailer. Consumers can obtain the retail service at the high-end retailer and purchase the product at the competing low-end retailer. Therefore, the high-end retailer faces a free-riding problem. A retailer first chooses its optimal service levels. Then, it chooses its optimal price levels. Finally, a retailer decides whether to advertise its prices. The model results in four structures: (1) both retailers advertise prices, (2) only the low-service retailer advertises price, (3) only the high-service retailer advertises price, and (4) neither retailer advertises price. We find that when a retailer does not advertise its price and makes price discovery more difficult for consumers, the competition between the retailers is less intense. However, the retailer is forced to charge a lower price. In addition, if the competing retailer does advertise its prices, then the competing retailer enjoys higher profit margins. We identify conditions under which each of the above four structures is an equilibrium and show that a low-service retailer not advertising its price is a more likely outcome than a high-service retailer doing so.
We then solve the manufacturer's problem and find that there are several instances when a retailer's advertising decisions are different from what the manufacturer would want. We describe the nature of this channel coordination problem and identify some solutions. © 2010 INFORMS.
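The four advertising structures are pure-strategy outcomes of a 2x2 game between the retailers. As a hedged sketch of how such structures are checked, here is a generic pure-Nash finder over a hypothetical payoff matrix (the profits below are invented, not derived from the paper's model):

```python
# Finding pure-strategy Nash equilibria of a 2x2 advertise/hide game by
# checking unilateral deviations. Payoffs are hypothetical.

def pure_nash(payoffs, actions=("advertise", "hide")):
    """payoffs[(a1, a2)] = (profit of retailer 1, profit of retailer 2)."""
    eqs = []
    for a1 in actions:
        for a2 in actions:
            p1, p2 = payoffs[(a1, a2)]
            # (a1, a2) is an equilibrium iff neither retailer gains by deviating.
            if all(payoffs[(d, a2)][0] <= p1 for d in actions) and \
               all(payoffs[(a1, d)][1] <= p2 for d in actions):
                eqs.append((a1, a2))
    return eqs

payoffs = {
    ("advertise", "advertise"): (3, 3),
    ("advertise", "hide"):      (5, 2),
    ("hide", "advertise"):      (2, 5),
    ("hide", "hide"):           (4, 4),
}
print(pure_nash(payoffs))
```

In this particular matrix only (advertise, advertise) survives the deviation checks; the paper's contribution is characterizing which of the four structures arises under which model conditions.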

Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND: Dropouts and missing data are nearly ubiquitous in obesity randomized controlled trials, threatening validity and generalizability of conclusions. Herein, we meta-analytically evaluate the extent of missing data, the frequency with which various analytic methods are employed to accommodate dropouts, and the performance of multiple statistical methods. METHODOLOGY/PRINCIPAL FINDINGS: We searched PubMed and Cochrane databases (2000-2006) for articles published in English and manually searched bibliographic references. Articles of pharmaceutical randomized controlled trials with weight loss or weight gain prevention as major endpoints were included. Two authors independently reviewed each publication for inclusion. 121 articles met the inclusion criteria. Two authors independently extracted treatment, sample size, drop-out rates, study duration, and statistical method used to handle missing data from all articles and resolved disagreements by consensus. In the meta-analysis, drop-out rates were substantial, with the survival (non-dropout) rates being approximated by an exponential decay curve e^(−λt), where λ was estimated to be .0088 (95% bootstrap confidence interval: .0076 to .0100) and t represents time in weeks. The estimated drop-out rate at 1 year was 37%. Most studies used last observation carried forward as the primary analytic method to handle missing data. We also obtained 12 raw obesity randomized controlled trial datasets for empirical analyses. Analyses of raw randomized controlled trial data suggested that both mixed models and multiple imputation performed well, but that multiple imputation may be more robust when missing data are extensive. CONCLUSION/SIGNIFICANCE: Our analysis offers an equation for predictions of dropout rates useful for future study planning.
Our raw data analyses suggest that multiple imputation is better than other methods for handling missing data in obesity randomized controlled trials, followed closely by mixed models. We suggest these methods supplant last observation carried forward as the primary method of analysis.
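The reported figures are internally consistent and easy to check: with survival e^(−λt), λ = 0.0088 per week, and t = 52 weeks, the implied one-year dropout rate is 1 − e^(−0.0088·52) ≈ 37%, matching the abstract.

```python
# Verifying the abstract's dropout equation: survival = e^(-lambda * t),
# with lambda = 0.0088 per week and t in weeks.

import math

def dropout_rate(lam, weeks):
    """Fraction of participants expected to have dropped out by the given week."""
    return 1 - math.exp(-lam * weeks)

print(round(dropout_rate(0.0088, 52), 2))  # → 0.37, the 1-year figure reported
```

The same formula supports the planning use the authors suggest, e.g. sizing enrollment for an expected attrition over any trial duration.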

Relevance:

20.00%

Publisher:

Abstract:

Gemstone Team GREEN JUSTICE

Relevance:

20.00%

Publisher:

Abstract:

This paper proposes that atherosclerosis is initiated by a signaling event that deposits calcium hydroxyapatite (Ca-HAP). This event is preceded by a loss of mechanical structure in the arterial wall. After Ca-HAP has been deposited, it is unlikely to be reabsorbed, because its solubility product constant (Ksp) is very small and the large stores of Ca²⁺ and PO₄³⁻ in the bones oppose any attempt to dissolve Ca-HAP by decreasing the common ions. The hydroxide ion (OH⁻) of Ca-HAP can be displaced in nature by fluoride (F⁻) and carbonate (CO₃²⁻) ions, and it is proposed that anions associated with cholesterol ester hydrolysis and, in very small quantities, the enolate of 7-ketocholesterol could also displace the OH⁻ of Ca-HAP, forming an ionic bond. The free energy of hydration of Ca-HAP at 310 K is most likely negative, and the ionic radii of the anions associated with the hydrolysis of cholesterol ester are compatible with the substitution. Furthermore, examination of the pathology of atherosclerotic lesions by Raman and NMR spectroscopy and confocal microscopy supports deposition of Ca-HAP associated with cholesterol. Investigating the affinity of intermediates of cholesterol hydrolysis for Ca-HAP, compared to lipoproteins such as HDL, LDL, and VLDL, using isothermal titration calorimetry could provide proof of this concept and may lead to the development of a new class of medications targeting the deposition of cholesterol within Ca-HAP. Treatment of acute ischemic events as a consequence of atherosclerosis with denitrogenation and oxygenation is discussed. © the author(s), publisher and licensee Libertas Academica Ltd.