80 results for Bounds


Relevance: 10.00%

Abstract:

In this paper I try to move away from the Extreme Bounds method of identifying ``robust'' empirical relations in the economic growth literature. Instead of analyzing the extreme bounds of the estimates of the coefficient of a particular variable, I analyze the entire distribution. My claim in this paper is that, if we do this, the picture emerging from the empirical growth literature is not the pessimistic ``Nothing is Robust'' that we get with the extreme bound analysis. Instead, we find that a substantial number of variables can be found to be strongly related to growth.
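
A minimal sketch of the idea (simulated data stand in for the cross-country growth data used in the literature, and all variable names are hypothetical): instead of recording only the extreme bounds of the coefficient on a variable of interest across regressions with different control sets, collect its full distribution of estimates.

# Sketch: distribution of the coefficient on a variable of interest across
# regressions with different control sets (simulated data, not growth data).
import itertools
import numpy as np

rng = np.random.default_rng(0)
n, n_controls = 100, 6
controls = rng.normal(size=(n, n_controls))
x = rng.normal(size=n) + 0.3 * controls[:, 0]        # variable of interest
growth = 0.5 * x + controls @ rng.normal(size=n_controls) * 0.2 + rng.normal(size=n)

betas = []
for k in range(n_controls + 1):
    for subset in itertools.combinations(range(n_controls), k):
        cols = [np.ones(n), x] + [controls[:, j] for j in subset]
        coef, *_ = np.linalg.lstsq(np.column_stack(cols), growth, rcond=None)
        betas.append(coef[1])                        # coefficient on x

betas = np.array(betas)
print(f"extreme bounds of the estimate: [{betas.min():.3f}, {betas.max():.3f}]")
print(f"share of regressions with a positive estimate: {(betas > 0).mean():.2%}")

Looking at the whole histogram of betas (rather than only its minimum and maximum) is what allows a variable to be judged strongly related to growth even when a few control sets flip the sign of the estimate.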

Relevance: 10.00%

Abstract:

Several studies have reported high performance of simple decision heuristics in multi-attribute decision making. In this paper, we focus on situations where attributes are binary and analyze the performance of Deterministic-Elimination-By-Aspects (DEBA) and similar decision heuristics. We consider non-increasing weights and two probabilistic models for the attribute values: one where attribute values are independent Bernoulli random variables; the other where they are binary random variables with inter-attribute positive correlations. Using these models, we show that the good performance of DEBA is explained by the presence of cumulative as opposed to simple dominance. We therefore introduce the concepts of cumulative dominance compliance and fully cumulative dominance compliance and show that DEBA satisfies these properties. We derive a lower bound on the probability with which cumulative dominance compliant heuristics choose a best alternative and show that, even with many attributes, this is not small. We also derive an upper bound for the expected loss of fully cumulative dominance compliant heuristics and show that this is moderate even when the number of attributes is large. Both bounds are independent of the values of the weights.
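
A small sketch of DEBA for binary attributes, following the usual description of the heuristic (attributes scanned in order of non-increasing weight, alternatives lacking the current aspect eliminated unless that would eliminate all remaining ones); the weighted-sum benchmark used here to check whether a best alternative was chosen is an illustrative assumption.

import numpy as np

def deba(attributes: np.ndarray) -> int:
    """DEBA on an (n_alternatives, n_attributes) 0/1 matrix.
    Columns are assumed to be ordered by non-increasing weight."""
    remaining = np.arange(attributes.shape[0])
    for j in range(attributes.shape[1]):
        has_aspect = remaining[attributes[remaining, j] == 1]
        if len(has_aspect) == 1:
            return int(has_aspect[0])
        if len(has_aspect) > 1:        # keep only alternatives with the aspect
            remaining = has_aspect
        # if no remaining alternative has the aspect, skip this attribute
    return int(remaining[0])           # tie-break: first remaining alternative

# Toy comparison against the weighted-sum best alternative.
rng = np.random.default_rng(1)
weights = np.array([0.4, 0.3, 0.2, 0.1])   # non-increasing weights
hits = 0
for _ in range(10_000):
    alts = rng.integers(0, 2, size=(5, 4))
    values = alts @ weights
    hits += values[deba(alts)] == values.max()
print("share of trials where DEBA picks a best alternative:", hits / 10_000)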

Relevance: 10.00%

Abstract:

We obtain minimax lower and upper bounds for the expected distortion redundancy of empirically designed vector quantizers. We show that the mean squared distortion of a vector quantizer designed from $n$ i.i.d. data points using any design algorithm is at least $\Omega (n^{-1/2})$ away from the optimal distortion for some distribution on a bounded subset of ${\cal R}^d$. Together with existing upper bounds this result shows that the minimax distortion redundancy for empirical quantizer design, as a function of the size of the training data, is asymptotically on the order of $n^{-1/2}$. We also derive a new upper bound for the performance of the empirically optimal quantizer.
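
In symbols (a sketch of the quantity involved, with ${\cal C}(Q)$ denoting the codebook of a quantizer $Q$, $Q_n$ the empirically designed quantizer and $Q^*$ an optimal one), the distortion redundancy being bounded is
\[
\mathbf{E}\, D(Q_n) - D(Q^*), \qquad D(Q) = \mathbf{E} \min_{c \in {\cal C}(Q)} \| X - c \|^2 ,
\]
so the lower bound says that for every design algorithm there is a distribution on a bounded subset of ${\cal R}^d$ for which this redundancy is $\Omega(n^{-1/2})$, matching the order of the existing upper bounds.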

Relevance: 10.00%

Abstract:

We present an exact test for whether two random variables that have known bounds on their support are negatively correlated. The alternative hypothesis is that they are not negatively correlated. No assumptions are made on the underlying distributions. We show by example that the Spearman rank correlation test, the competing exact test of correlation in nonparametric settings, rests on an additional assumption on the data generating process without which it is not valid as a test for correlation. We then show how to test for the significance of the slope in a linear regression analysis that involves a single independent variable and where outcomes of the dependent variable belong to a known bounded set.

Relevance: 10.00%

Abstract:

Given $n$ independent replicates of a jointly distributed pair $(X,Y)\in {\cal R}^d \times {\cal R}$, we wish to select from a fixed sequence of model classes ${\cal F}_1, {\cal F}_2, \ldots$ a deterministic prediction rule $f: {\cal R}^d \to {\cal R}$ whose risk is small. We investigate the possibility of empirically assessing the {\em complexity} of each model class, that is, the actual difficulty of the estimation problem within each class. The estimated complexities are in turn used to define an adaptive model selection procedure, which is based on complexity penalized empirical risk. The available data are divided into two parts. The first is used to form an empirical cover of each model class, and the second is used to select a candidate rule from each cover based on empirical risk. The covering radii are determined empirically to optimize a tight upper bound on the estimation error. An estimate is chosen from the list of candidates in order to minimize the sum of class complexity and empirical risk. A distinguishing feature of the approach is that the complexity of each model class is assessed empirically, based on the size of its empirical cover. Finite sample performance bounds are established for the estimates, and these bounds are applied to several non-parametric estimation problems. The estimates are shown to achieve a favorable tradeoff between approximation and estimation error, and to perform as well as if the distribution-dependent complexities of the model classes were known beforehand. In addition, it is shown that the estimate can be consistent, and even possess near optimal rates of convergence, when each model class has an infinite VC or pseudo dimension. For regression estimation with squared loss we modify our estimate to achieve a faster rate of convergence.
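
A highly simplified sketch of the two-part procedure described above, for finite model classes and squared loss. The covering radius and the complexity penalty used in the paper are replaced by placeholder choices, so this only illustrates the structure (data split, empirical cover, one candidate per class, penalized selection), not the paper's bounds.

from itertools import product
import numpy as np

rng = np.random.default_rng(2)

# Toy data: one-dimensional regression with squared loss.
n = 200
x = rng.uniform(-1, 1, size=n)
y = np.sin(3 * x) + 0.3 * rng.normal(size=n)
x1 = x[: n // 2]                      # first half: used only to build covers
x2, y2 = x[n // 2:], y[n // 2:]       # second half: used for empirical risk

# Finite model classes: polynomials of a given degree with coefficients on a grid.
def model_class(degree, grid=np.linspace(-2, 2, 9)):
    return [np.poly1d(c) for c in product(grid, repeat=degree + 1)]

def empirical_cover(fs, xs, radius):
    """Greedy cover of a finite class in the empirical L1 metric on xs."""
    cover = []
    for f in fs:
        if all(np.mean(np.abs(f(xs) - g(xs))) > radius for g in cover):
            cover.append(f)
    return cover

def emp_risk(f, xs, ys):
    return np.mean((f(xs) - ys) ** 2)

candidates = []
for degree in (0, 1, 2):
    cover = empirical_cover(model_class(degree), x1, radius=0.2)   # placeholder radius
    best = min(cover, key=lambda f: emp_risk(f, x2, y2))           # candidate rule
    penalty = np.sqrt(np.log(len(cover)) / len(x2))                # placeholder complexity
    candidates.append((emp_risk(best, x2, y2) + penalty, degree))

score, degree = min(candidates)
print(f"selected class: polynomials of degree {degree} (penalized score {score:.3f})")

The key feature carried over from the description above is that the penalty of each class is driven by the size of its empirical cover, computed from the data, rather than by a known combinatorial dimension.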

Relevance: 10.00%

Abstract:

We derive a new inequality for uniform deviations of averages from their means. The inequality is a common generalization of previous results of Vapnik and Chervonenkis (1974) and Pollard (1986). Using the new inequality we obtain tight bounds for empirical loss minimization learning.
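
For context, one standard form of the Vapnik-Chervonenkis uniform deviation bound that results of this kind generalize (stated for $\{0,1\}$-valued functions, with $S_{{\cal F}}(2n)$ the shatter coefficient of the class; constants vary between references, and this is not the new inequality itself) is
\[
\mathbf{P}\Bigl\{ \sup_{f \in {\cal F}} \Bigl| \frac{1}{n}\sum_{i=1}^{n} f(X_i) - \mathbf{E}\, f(X_1) \Bigr| > \epsilon \Bigr\} \le 4\, S_{{\cal F}}(2n)\, e^{-n\epsilon^2 / 8}.
\]
Any bound of this form controls empirical loss minimization, since on the complementary event the empirical minimizer's true loss is within $2\epsilon$ of the best loss in the class.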

Relevance: 10.00%

Abstract:

We address the performance optimization problem in a single-station multiclass queueing network with changeover times by means of the achievable region approach. This approach seeks to obtain performance bounds and scheduling policies from the solution of a mathematical program over a relaxation of the system's performance region. Relaxed formulations (including linear, convex, nonconvex and positive semidefinite constraints) of this region are developed by formulating equilibrium relations satisfied by the system, with the help of Palm calculus. Our contributions include: (1) new constraints formulating equilibrium relations on server dynamics; (2) a flow conservation interpretation of the constraints previously derived by the potential function method; (3) new positive semidefinite constraints; (4) new work decomposition laws for single-station multiclass queueing networks, which yield new convex constraints; (5) a unified buffer occupancy method of performance analysis obtained from the constraints; (6) heuristic scheduling policies from the solution of the relaxations.
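
A toy sketch of the achievable region idea in a much simpler setting, an M/G/1 priority queue without changeover times: a lower bound on the optimal weighted waiting cost is obtained by minimizing the cost over a relaxation consisting only of Kleinrock's work conservation law and nonnegativity. The class data below are made up for illustration; the relaxations developed in the paper for networks with changeovers are far richer.

import numpy as np
from scipy.optimize import linprog

# Hypothetical two-class M/G/1 data.
lam = np.array([0.3, 0.2])     # arrival rates
ES = np.array([1.0, 2.0])      # mean service times
ES2 = np.array([2.0, 8.0])     # second moments of service times
cost = np.array([3.0, 1.0])    # waiting cost per unit time per job

rho = lam * ES
W0 = np.sum(lam * ES2) / 2
# Work conservation: sum_k rho_k W_k is the same for every work-conserving policy.
conservation_rhs = rho.sum() * W0 / (1 - rho.sum())

# Minimize sum_k cost_k * lam_k * W_k over the relaxation
# { W >= 0 : sum_k rho_k W_k = conservation_rhs }.
res = linprog(c=cost * lam,
              A_eq=rho.reshape(1, -1), b_eq=[conservation_rhs],
              bounds=[(0, None)] * 2)
print("lower bound on achievable weighted waiting cost:", res.fun)

Because every feasible (work-conserving) policy produces waiting times inside the relaxed set, the optimum of the mathematical program is a valid performance bound; richer relaxations, as in the paper, tighten it.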

Relevance: 10.00%

Abstract:

We develop a mathematical programming approach for the classical PSPACE-hard restless bandit problem in stochastic optimization. We introduce a hierarchy of n (where n is the number of bandits) increasingly stronger linear programming relaxations, the last of which is exact and corresponds to the (exponential size) formulation of the problem as a Markov decision chain, while the other relaxations provide bounds and are efficiently computed. We also propose a priority-index heuristic scheduling policy from the solution to the first-order relaxation, where the indices are defined in terms of optimal dual variables. In this way we propose a policy and a suboptimality guarantee. We report results of computational experiments that suggest that the proposed heuristic policy is nearly optimal. Moreover, the second-order relaxation is found to provide strong bounds on the optimal value.
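
A sketch of what a first-order relaxation of this kind typically looks like, written in occupation measure form from general knowledge rather than from the paper: $x_i(s,a)$ is the long-run fraction of time bandit $i$ is in state $s$ while taking action $a \in \{0,1\}$ (passive/active), and $m$ is the number of bandits that may be active per period, enforced only on average,
\[
\max_{x \ge 0} \ \sum_{i=1}^{n} \sum_{s,a} r_i(s,a)\, x_i(s,a)
\quad \text{s.t.} \quad
\sum_{a} x_i(s',a) = \sum_{s,a} p_i(s' \mid s,a)\, x_i(s,a) \ \ \forall i, s', \qquad
\sum_{s,a} x_i(s,a) = 1 \ \ \forall i, \qquad
\sum_{i}\sum_{s} x_i(s,1) = m .
\]
Relaxing the per-period activation constraint to this time-average version is what decouples the bandits and keeps the linear program polynomial in size.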

Relevance: 10.00%

Abstract:

The classical binary classification problem is investigated when it is known in advance that the posterior probability function (or regression function) belongs to some class of functions. We introduce and analyze a method which effectively exploits this knowledge. The method is based on minimizing the empirical risk over a carefully selected ``skeleton'' of the class of regression functions. The skeleton is a covering of the class based on a data--dependent metric, especially fitted for classification. A new scale--sensitive dimension is introduced which is more useful for the studied classification problem than other, previously defined, dimension measures. This fact is demonstrated by performance bounds for the skeleton estimate in terms of the new dimension.

Relevance: 10.00%

Abstract:

Precise estimation of propagation parameters in precipitation media is of interest to improve the performance of communications systems and in remote sensing applications. In this paper, we present maximum-likelihood estimators of specific attenuation and specific differential phase in rain. The model used for obtaining the cited estimators assumes coherent propagation, reflection symmetry of the medium, and Gaussian statistics of the scattering matrix measurements. No assumptions about the microphysical properties of the medium are needed. The performance of the estimators is evaluated through simulated data. Results show negligible estimator bias and variances close to Cramér–Rao bounds.

Relevance: 10.00%

Abstract:

Projective homography sits at the heart of many problems in image registration. In addition to many methods for estimating the homography parameters (R.I. Hartley and A. Zisserman, 2000), analytical expressions to assess the accuracy of the transformation parameters have been proposed (A. Criminisi et al., 1999). We show that these expressions provide less accurate bounds than those based on the earlier results of Weng et al. (1989). The discrepancy becomes more critical in applications involving the integration of frame-to-frame homographies and their uncertainties, as in the reconstruction of terrain mosaics and the camera trajectory from flyover imagery. We demonstrate these issues through selected examples.
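
A small Monte Carlo illustration of why the choice of uncertainty expressions matters once frame-to-frame homographies are chained: sample homographies around nominal estimates using their parameter covariances, compose them, and look at the spread of a mapped point. The matrices and covariances below are made up for illustration; this is not the analytical propagation of either cited paper.

import numpy as np

rng = np.random.default_rng(3)

def sample_homography(H_nominal, cov, rng):
    """Perturb the 8 free parameters (H is normalized so that H[2, 2] = 1)."""
    h = H_nominal.flatten()[:8] + rng.multivariate_normal(np.zeros(8), cov)
    return np.append(h, 1.0).reshape(3, 3)

def apply(H, p):
    q = H @ np.array([p[0], p[1], 1.0])
    return q[:2] / q[2]

# Two hypothetical frame-to-frame homographies with small parameter covariances.
H01 = np.array([[1.0, 0.01, 5.0], [-0.01, 1.0, 2.0], [1e-4, 0.0, 1.0]])
H12 = np.array([[1.0, 0.02, 4.0], [-0.02, 1.0, 3.0], [0.0, 1e-4, 1.0]])
cov = np.diag([1e-4, 1e-4, 1e-2, 1e-4, 1e-4, 1e-2, 1e-8, 1e-8])

point = (100.0, 80.0)
mapped = np.array([apply(sample_homography(H12, cov, rng) @
                         sample_homography(H01, cov, rng), point)
                   for _ in range(5000)])
print("empirical covariance of the mapped point after composition:")
print(np.cov(mapped.T))

Over- or under-estimating the per-frame covariances feeds directly into the accumulated uncertainty of the mosaic and the recovered trajectory, which is why tighter analytical bounds matter in this setting.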

Relevance: 10.00%

Abstract:

In the context of cooperative TU-games, and given an order of the players, we consider the problem of distributing the worth of the grand coalition as a sequential decision problem. In each step of the process, upper and lower bounds for the payoffs of the players are required, related to successive reduced games. Sequentially compatible payoffs are defined as those allocation vectors that meet these recursive bounds. The core of the game is reinterpreted as a set of sequentially compatible payoffs when the Davis-Maschler reduced game is considered (Th. 1). Independently of the reduction, the core turns out to be the intersection of the family of the sets of sequentially compatible payoffs corresponding to the different possible orderings (Th. 2), so it is in some sense order-independent. Finally, we analyze advantageous properties for the first player.
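
For reference, a minimal sketch of the core condition that Theorem 1 reinterprets, for a hypothetical 3-player game: an allocation x is in the core when it is efficient and no coalition can improve on it, i.e. x(N) = v(N) and x(S) >= v(S) for every coalition S. The sequential bounds and the Davis-Maschler reduction themselves are not implemented here.

from itertools import combinations

# Hypothetical 3-player TU-game: v maps coalitions (frozensets) to worths.
players = (1, 2, 3)
v = {frozenset(): 0, frozenset({1}): 0, frozenset({2}): 0, frozenset({3}): 0,
     frozenset({1, 2}): 4, frozenset({1, 3}): 3, frozenset({2, 3}): 2,
     frozenset({1, 2, 3}): 6}

def in_core(x):
    """x: dict player -> payoff. Check efficiency and coalitional rationality."""
    if abs(sum(x.values()) - v[frozenset(players)]) > 1e-9:
        return False
    for size in range(1, len(players)):
        for S in combinations(players, size):
            if sum(x[i] for i in S) < v[frozenset(S)] - 1e-9:
                return False
    return True

print(in_core({1: 3, 2: 2, 3: 1}))   # True:  every coalition gets at least its worth
print(in_core({1: 1, 2: 1, 3: 4}))   # False: coalition {1, 2} gets 2 < v({1, 2}) = 4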

Relevance: 10.00%

Abstract:

The As Pontes basin (12 km²), NW Iberian Peninsula, is bounded by a double restraining bend of a dextral strike-slip fault, which is related to the western onshore end of the Pyrenean belt. Surface and subsurface data obtained from intensive coal exploration and mining in the basin since the 1960s together with additional structural and stratigraphic sequence analysis allowed us to determine the geometric relationships between tectonic structures and stratigraphic markers. The small size of the basin and the large amount of quality data make the As Pontes basin a unique natural laboratory for improving our understanding of the origin and evolution of restraining bends. The double restraining bend is the end stage of the structural evolution of a compressive underlapping stepover, where the basin was formed. During the first stage (stepover stage), which began ca. 30 Ma ago (latest Rupelian) and lasted 3.4 My, two small isolated basins bounded by thrusts and normal faults were formed. For 1.3 My, the strike-slip faults, which defined the stepover, grew towards each other until joining and forming the double restraining bend, which bounds one large As Pontes basin (transition stage). The history of the basin was controlled by the activity of the double restraining bend for a further 3.4 My (restraining bend stage) and ended in mid-Aquitanian times (ca. 22 Ma).

Relevance: 10.00%

Abstract:

In the quest to completely describe entanglement in the general case of a finite number of parties sharing a physical system of finite-dimensional Hilbert space, an entanglement magnitude is introduced for its pure and mixed states: robustness. It corresponds to the minimal amount of mixing with locally prepared states which washes out all entanglement. It quantifies in a sense the endurance of entanglement against noise and jamming. Its properties are studied comprehensively. Analytical expressions for the robustness are given for pure states of two-party systems, and analytical bounds for mixed states of two-party systems. Specific results are obtained mainly for the qubit-qubit system (qubit denotes quantum bit). As by-products, local pseudomixtures are generalized, a lower bound for the relative volume of separable states is deduced, and arguments for considering convexity a necessary condition of any entanglement measure are put forward.
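
One standard way to write the magnitude described here (a sketch of the definition, with ${\cal S}$ the set of separable, i.e. locally preparable, states of the system): the robustness of a state $\rho$ is the minimal weight of separable noise that washes out its entanglement,
\[
R(\rho) = \min \Bigl\{ s \ge 0 \ :\ \exists\, \rho_s \in {\cal S} \ \text{such that} \ \frac{\rho + s\,\rho_s}{1+s} \in {\cal S} \Bigr\},
\]
so $R(\rho) = 0$ exactly for separable states, and larger values indicate entanglement that survives more admixed noise.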

Relevance: 10.00%

Abstract:

We estimate the attainable limits on the coupling of a nonstandard Higgs boson to two photons taking into account the data collected by the Fermilab collaborations on diphoton events. We base our analysis on a general set of dimension-6 effective operators that give rise to anomalous couplings in the bosonic sector of the standard model. If the coefficients of all blind operators have the same magnitude, indirect bounds on the anomalous triple vector-boson couplings can also be inferred, provided there is no large cancellation in the Higgs-gamma-gamma coupling.