56 results for Sequential Quadratic Programming
Abstract:
We present a model of price discrimination where a monopolist faces a consumer who is privately informed about the distribution of his valuation for an indivisible unit of good but has yet to learn privately the actual valuation. The monopolist sequentially screens the consumer with a menu of contracts: the consumer self-selects once by choosing a contract and then self-selects again when he learns the actual valuation. A deterministic sequential mechanism is a menu of refund contracts, each consisting of an advance payment and a refund amount in case of no consumption, but sequential mechanisms may involve randomization. We characterize the optimal sequential mechanism when some consumer types are more eager in the sense of first-order stochastic dominance, and when some types face greater valuation uncertainty in the sense of mean-preserving spread. We show that it can be optimal to subsidize consumer types with smaller valuation uncertainty (through low refund, as in airplane ticket pricing) in order to reduce the rent to those with greater uncertainty. The size of the distortion depends both on the type distribution and on how informative the consumer's initial private knowledge is about his valuation, but not on how much he initially knows about the valuation per se.
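To make the refund-contract structure concrete (the symbols $a$, $r$, $v$ and $F$ below are our notation, not the paper's): under a contract with advance payment $a$ and refund $r$, a consumer who has learned his valuation $v$ consumes exactly when $v \ge r$, since consuming means forgoing the refund. His interim expected payoff from that contract is therefore $\mathbb{E}[\max(v,r)] - a$, and the seller's expected revenue from it is $a - r\,F(r)$, where $F$ is the consumer's valuation distribution; a low refund thus resembles the largely non-refundable airline fares alluded to above.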
Abstract:
We consider adaptive sequential lossy coding of bounded individual sequences when the performance is measured by the sequentially accumulated mean squared distortion. The encoder and the decoder are connected via a noiseless channel of capacity $R$ and both are assumed to have zero delay. No probabilistic assumptions are made on how the sequence to be encoded is generated. For any bounded sequence of length $n$, the distortion redundancy is defined as the normalized cumulative distortion of the sequential scheme minus the normalized cumulative distortion of the best scalar quantizer of rate $R$ which is matched to this particular sequence. We demonstrate the existence of a zero-delay sequential scheme which uses common randomization in the encoder and the decoder such that the normalized maximum distortion redundancy converges to zero at a rate $n^{-1/5}\log n$ as the length of the encoded sequence $n$ increases without bound.
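In the abstract's terms (the reproduction values $\hat{x}_t$ and the quantizer variable $Q$ below are notation we introduce only for illustration), the distortion redundancy of a zero-delay scheme on a bounded sequence $x_1,\dots,x_n$ can be written as $D_n = \frac{1}{n}\sum_{t=1}^{n}(x_t-\hat{x}_t)^2 - \min_{Q\in\mathcal{Q}_R}\frac{1}{n}\sum_{t=1}^{n}(x_t-Q(x_t))^2$, where $\mathcal{Q}_R$ is the class of scalar quantizers of rate $R$; the result then states that a scheme with common randomization achieves $\max_{x_1,\dots,x_n}\mathbb{E}[D_n]=O(n^{-1/5}\log n)$.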
Abstract:
The aim of this project is to get used to another kind of programming. Until now, I have used very complex programming languages to develop applications or even to program microcontrollers, but the PicoCricket system is evidence that we do not need such complex development tools to build functional devices. The PicoCricket system is a clear example of simple programming that makes devices work the way we program them. It offers an easy but effective way to program small devices by simply stating what we want them to do. We cannot implement complex algorithms and mathematical operations, but we can program the devices in a short time. Nowadays, the easier and faster we produce, the more we earn, so the tendency is to develop quickly, cheaply and easily, and the PicoCricket system makes that possible.
Abstract:
A sequential weakly efficient two-auction game with entry costs, interdependence between objects, two potential bidders and the IPV assumption is presented here in order to give some theoretical predictions on the effects of geographical scale economies on local service privatization performance. It is shown that the seller of the first object profits from this interdependence. The interdependence externality raises effective competition for the first object, expressed as the probability of having more than one final bidder. Moreover, if there is more than one final bidder in the first auction, the seller extracts the bidders' entire expected future surplus differential between having won the first auction and having lost it. The consequences for the seller of the second object are less clear, reflecting the contradictory nature of the two main effects of object interdependence. On the one hand, the first-auction winner becomes "stronger", so that expected payments rise in a competitive environment. On the other hand, the first-auction loser becomes relatively "weaker", hence (probably) reducing effective competition for the second object. Additionally, some contributions to static auction theory with entry costs and asymmetric bidders are presented in the appendix.
Abstract:
In the context of cooperative TU-games, and given an order of the players, we consider the problem of distributing the worth of the grand coalition as a sequential decision problem. In each step of the process, upper and lower bounds for the payoffs of the players are required, related to successive reduced games. Sequentially compatible payoffs are defined as those allocation vectors that meet these recursive bounds. The core of the game is reinterpreted as a set of sequentially compatible payoffs when the Davis-Maschler reduced game is considered (Th.1). Independently of the reduction, the core turns out to be the intersection of the family of the sets of sequentially compatible payoffs corresponding to the different possible orderings (Th.2), so it is in some sense order-independent. Finally, we analyze advantageous properties for the first player.
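Schematically, and only as notation (the concrete bounds are the ones defined in the paper through the successive reduced games): for an order $\pi$ of the players, the set of sequentially compatible payoffs can be written as $SC^{\pi}(v)=\{x : l_k \le x_{\pi(k)} \le u_k,\ k=1,\dots,n\}$, where $l_k$ and $u_k$ are the lower and upper bounds obtained from the $k$-th reduced game; Th.1 identifies this set with the core $C(v)$ under the Davis-Maschler reduction, and Th.2 states that $\bigcap_{\pi} SC^{\pi}(v)=C(v)$ whatever reduction is used.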
Abstract:
We study the problem of the partition of a system of initial size $V$ into a sequence of fragments $s_1, s_2, s_3, \ldots$ By assuming a scaling hypothesis for the probability $p(s;V)$ of obtaining a fragment of a given size, we deduce that the final distribution of fragment sizes exhibits power-law behavior. This minimal model is useful for understanding the distribution of avalanche sizes in first-order phase transitions at low temperatures.
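The abstract does not spell out the scaling form; a typical ansatz of the kind it invokes (the exponent $\tau$ and scaling function $f$ are illustrative choices of ours, not taken from the paper) is $p(s;V)\sim s^{-\tau}f(s/V)$, so that iterating the fragmentation yields a final fragment-size distribution dominated by the power-law factor $s^{-\tau}$, with a cutoff set by the size of the piece still being broken up.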
Abstract:
High-sensitivity electron paramagnetic resonance experiments have been carried out in fresh and stressed Mn12 acetate single crystals for frequencies ranging from 40 GHz up to 110 GHz. The high number of crystal dislocations formed in the stressing process introduces an $E(S_x^2 - S_y^2)$ transverse anisotropy term in the spin Hamiltonian. From the dependence of the resonant absorptions on the applied transverse magnetic field we have obtained an average value of $E = 22$ mK, corresponding to a concentration of dislocations per unit cell of $c = 10^{-3}$.
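For reference, a minimal giant-spin Hamiltonian of the type implied here is $\mathcal{H} = -D S_z^2 + E\,(S_x^2 - S_y^2) + g\mu_B\,\mathbf{B}\cdot\mathbf{S}$; only the transverse term and the value $E \approx 22$ mK come from the abstract, while the axial term $-DS_z^2$ and the Zeeman term are the standard ingredients for Mn12 acetate and are added here for context. Since $S_x^2 - S_y^2 = (S_+^2 + S_-^2)/2$, the dislocation-induced term couples spin projections differing by $\Delta S_z = \pm 2$.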
Abstract:
Biometric system performance can be improved by means of data fusion. Several kinds of information can be fused in order to obtain a more accurate classification (identification or verification) of an input sample. In this paper we present a method for computing the weights of a weighted-sum fusion of scores by means of a likelihood model. The maximum likelihood estimation is set up as a linear programming problem. Each score is derived from a GMM classifier working on a different feature extractor. Our experimental results assessed the robustness of the system against changes over time (different sessions) and against a change of microphone. The improvements obtained were significantly better (error bars of two standard deviations) than a uniform weighted sum, a uniform weighted product, or the best single classifier. The proposed method scales computationally with the number of scores to be fused in the same way as the simplex method for linear programming.
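As a rough illustration of weighted-sum score fusion with weights obtained from a linear program, here is a minimal Python sketch. The objective used below (maximising the worst-case margin between fused genuine and impostor scores) is a simplified stand-in for the paper's likelihood model, which the abstract does not spell out, and all data, function names and parameters are hypothetical.

import numpy as np
from scipy.optimize import linprog

def lp_fusion_weights(genuine, impostor):
    # genuine, impostor: arrays of shape (n_samples, n_classifiers).
    # Decision vector: [w_1 .. w_n, mu, t]; maximise the margin t <=> minimise -t.
    n = genuine.shape[1]
    c = np.zeros(n + 2); c[-1] = -1.0
    ones_g = np.ones((len(genuine), 1)); ones_i = np.ones((len(impostor), 1))
    # Genuine samples:  w . g >= mu + t   ->  -w . g + mu + t <= 0
    A_gen = np.hstack([-genuine, ones_g, ones_g])
    # Impostor samples: w . m <= mu - t   ->   w . m - mu + t <= 0
    A_imp = np.hstack([impostor, -ones_i, ones_i])
    A_ub = np.vstack([A_gen, A_imp]); b_ub = np.zeros(len(A_ub))
    # Weights lie on the probability simplex; threshold mu and margin t are free.
    A_eq = np.hstack([np.ones((1, n)), np.zeros((1, 2))])
    bounds = [(0.0, 1.0)] * n + [(None, None), (None, None)]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=bounds, method="highs")
    return res.x[:n], res.x[n]  # fusion weights, decision threshold

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    genuine = rng.normal([0.80, 0.70, 0.60], 0.08, size=(200, 3))   # toy scores
    impostor = rng.normal([0.45, 0.50, 0.55], 0.08, size=(200, 3))
    w, mu = lp_fusion_weights(genuine, impostor)
    print("weights:", np.round(w, 3), " threshold:", round(mu, 3))
    print("fused score of a probe:", float(np.dot([0.75, 0.66, 0.58], w)))

SciPy's default "highs" backend selects among the HiGHS solvers; a simplex-type method, as referenced in the abstract, can be requested explicitly with method="highs-ds".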
Abstract:
We consider the numerical treatment of the optical flow problem by evaluating the performance of the trust region method versus the line search method. To the best of our knowledge, the trust region method is studied here for the first time for variational optical flow computation. Four different optical flow models are used to test the performance of the proposed algorithm, combining linear and nonlinear data terms with quadratic and TV regularization. We show that trust region often performs better than line search, especially in the presence of non-linearity and non-convexity in the model.
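For readers unfamiliar with the two strategies, the following toy Python sketch contrasts a trust-region method with a line-search method using SciPy; the objective is the Rosenbrock function, not a variational optical-flow energy, so it only illustrates the mechanics and says nothing about the paper's results.

import numpy as np
from scipy.optimize import minimize, rosen, rosen_der, rosen_hess

x0 = np.full(10, -1.2)  # a standard difficult starting point for Rosenbrock

# Line-search strategy: BFGS picks a descent direction, then searches for a step length.
ls = minimize(rosen, x0, jac=rosen_der, method="BFGS")

# Trust-region strategy: each step minimises a local quadratic model inside a region
# whose radius grows or shrinks according to how well the model predicted the decrease.
tr = minimize(rosen, x0, jac=rosen_der, hess=rosen_hess, method="trust-exact")

print("line search (BFGS):   f = %.2e, iterations = %d" % (ls.fun, ls.nit))
print("trust region (exact): f = %.2e, iterations = %d" % (tr.fun, tr.nit))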
Abstract:
We investigate under which dynamical conditions the Julia set of a quadratic rational map is a Sierpiński curve.
Abstract:
In this paper we propose a method for computing JPEG quantization matrices for a given mean squared error (MSE) or PSNR. We then employ our method to compute definition scripts for the JPEG standard progressive operation mode using a quantization approach. It is therefore no longer necessary to use a trial-and-error procedure to obtain a desired PSNR and/or definition script, which reduces cost. Firstly, we establish a relationship between a Laplacian source and its uniform quantization error. We apply this model to the coefficients obtained in the discrete cosine transform stage of the JPEG standard. An image may then be compressed using the JPEG standard under a global MSE (or PSNR) constraint and a set of local constraints determined by the JPEG standard and visual criteria. Secondly, we study the JPEG standard progressive operation mode from a quantization-based approach. A relationship between the measured image quality at a given stage of the coding process and a quantization matrix is found. Thus, the definition script construction problem can be reduced to a quantization problem. Simulations show that our method generates better quantization matrices than the classical method based on scaling the JPEG default quantization matrix. The PSNR estimate usually has an error smaller than 1 dB, and this error decreases for high PSNR values. Definition scripts may be generated that avoid an excessive number of stages and remove small stages that do not contribute a noticeable image quality improvement during the decoding process.
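To make the quantities involved concrete, here is a small Python sketch of the standard PSNR/MSE relation for 8-bit images and of the classical trial-and-error loop that the proposed method avoids; mse_for_matrix is a hypothetical callback (compress with the given quantization matrix, measure the MSE) and is not part of any real library.

import math

def psnr_from_mse(mse, peak=255.0):
    # Standard definition of PSNR in dB for a given mean squared error.
    return 10.0 * math.log10(peak * peak / mse)

def scale_matrix(q_matrix, scale):
    # Classical approach: scale a base quantization matrix, clipping entries to [1, 255].
    return [[min(255, max(1, round(q * scale))) for q in row] for row in q_matrix]

def find_scale_for_psnr(target_psnr, q_matrix, mse_for_matrix, lo=0.05, hi=8.0, iters=20):
    # Trial-and-error bisection over the scale factor: a larger scale means larger
    # quantization steps, hence larger MSE and lower PSNR (monotone in practice).
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        psnr = psnr_from_mse(mse_for_matrix(scale_matrix(q_matrix, mid)))
        if psnr > target_psnr:
            lo = mid   # quality still above the target: quantize more coarsely
        else:
            hi = mid
    return 0.5 * (lo + hi)

The paper's point is that modelling the DCT coefficients as Laplacian sources lets the MSE of a candidate matrix be predicted analytically, so a loop like the one above becomes unnecessary.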