978 results for Robust Stochastic Optimization
Spanning tests in return and stochastic discount factor mean-variance frontiers: A unifying approach
Abstract:
We propose new spanning tests that assess whether the initial and additional assets share the economically meaningful cost and mean representing portfolios. We prove their asymptotic equivalence to existing tests under local alternatives. We also show that, unlike two-step or iterated procedures, single-step methods such as continuously updated GMM yield numerically identical overidentifying restrictions tests, so there is arguably a single spanning test. To prove these results, we extend optimal GMM inference to deal with singularities in the long-run second moment matrix of the influence functions. Finally, we test for spanning using size and book-to-market sorted US stock portfolios.
Abstract:
We address the problem of scheduling a multiclass $M/M/m$ queue with Bernoulli feedback on $m$ parallel servers to minimize time-average linear holding costs. We analyze the performance of a heuristic priority-index rule, which extends Klimov's optimal solution to the single-server case: servers preemptively select customers with larger Klimov indices. We present closed-form suboptimality bounds (approximate optimality) for Klimov's rule, which imply that its suboptimality gap is uniformly bounded above with respect to (i) external arrival rates, as long as they stay within system capacity; and (ii) the number of servers. It follows that its relative suboptimality gap vanishes in a heavy-traffic limit, as external arrival rates approach system capacity (heavy-traffic optimality). We obtain simpler expressions for the special no-feedback case, where the heuristic reduces to the classical $c \mu$ rule. Our analysis is based on comparing the expected cost of Klimov's rule to the value of a strong linear programming (LP) relaxation of the system's region of achievable performance of mean queue lengths. In order to obtain this relaxation, we derive and exploit a new set of work decomposition laws for the parallel-server system. We further report on the results of a computational study on the quality of the $c \mu$ rule for parallel scheduling.
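For the no-feedback special case mentioned above, a minimal sketch of the classical $c \mu$ priority rule is given below; the class names, holding costs, and service rates are illustrative assumptions, not values taken from the paper.

```python
# Minimal sketch of the classical c*mu priority rule (no-feedback case).
# Class parameters are illustrative assumptions, not values from the paper.
classes = {
    "A": {"c": 3.0, "mu": 1.0},  # c: holding cost per unit time, mu: service rate
    "B": {"c": 1.0, "mu": 2.0},
    "C": {"c": 2.0, "mu": 0.5},
}

def cmu_priority(waiting):
    """Return the waiting class with the largest c_i * mu_i index."""
    return max(waiting, key=lambda k: classes[k]["c"] * classes[k]["mu"])

# With all three classes waiting, class A is served first (index 3.0 > 2.0 > 1.0).
print(cmu_priority(["A", "B", "C"]))  # -> A
```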
Abstract:
This paper empirically analyzes the volatility of consumption-based stochastic discount factors as a measure of implicit economic fears by studying its relationship with future economic and stock market cycles. Time-varying economic fears seem to be well captured by the volatility of stochastic discount factors. In particular, the volatility of the recursive utility-based stochastic discount factor with contemporaneous growth explains between 9 and 34 percent of future changes in industrial production at short and long horizons, respectively. It also explains ex-ante uncertainty and risk aversion. However, future stock market cycles are better explained by a similar stochastic discount factor with long-run consumption growth. This specification of the stochastic discount factor presents higher volatility and lower pricing errors than the specification with contemporaneous consumption growth.
Abstract:
This paper estimates a translog stochastic frontier production function for all 48 contiguous U.S. states over the period 1970-1983, in an attempt to measure and explain changes in technical efficiency. The model allows technical inefficiency to vary over time, and inefficiency effects to be a function of a set of explanatory variables in which the level and composition of public capital plays an important role. Results indicated that U.S. state inefficiency levels were significantly and positively correlated with the ratio of public capital to private capital. The proportion of public capital devoted to highways was negatively correlated with technical inefficiency, suggesting that not only the level but also the composition of public capital influenced state efficiency.
Abstract:
In this paper we consider dynamic processes, in repeated games, that are subject to the natural informational restriction of uncoupledness. We study almost sure convergence to Nash equilibria and present a number of possibility and impossibility results. Basically, we show that if, in addition to random moves, some recall is introduced, then successful search procedures that are uncoupled can be devised. In particular, to get almost sure convergence to pure Nash equilibria when these exist, it suffices to recall the last two periods of play.
Abstract:
A general formalism on stochastic choice is presented. The Rationalizability and Recoverability (Identification) problems are discussed. For the identification issue, parametric examples are analyzed by means of techniques from mathematical tomography (Radon transforms).
Abstract:
Nonlinear regression problems can often be reduced to linearity by transforming the response variable (e.g., using the Box-Cox family of transformations). The classic estimates of the parameter defining the transformation as well as of the regression coefficients are based on the maximum likelihood criterion, assuming homoscedastic normal errors for the transformed response. These estimates are nonrobust in the presence of outliers and can be inconsistent when the errors are nonnormal or heteroscedastic. This article proposes new robust estimates that are consistent and asymptotically normal for any unimodal and homoscedastic error distribution. For this purpose, a robust version of conditional expectation is introduced for which the prediction mean squared error is replaced with an M scale. This concept is then used to develop a nonparametric criterion to estimate the transformation parameter as well as the regression coefficients. A finite sample estimate of this criterion based on a robust version of smearing is also proposed. Monte Carlo experiments show that the new estimates compare favorably with respect to the available competitors.
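As a point of reference for the transformation family discussed above, the snippet below shows the Box-Cox transformation and the classical (nonrobust) maximum-likelihood choice of its parameter via SciPy; the paper's robust M-scale criterion and smearing-based estimate are not reproduced here, and the simulated data are purely illustrative.

```python
# Box-Cox family: y(lambda) = (y**lambda - 1) / lambda, with log(y) at lambda = 0.
# The classical MLE of lambda below is the nonrobust baseline the paper improves on.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
y = np.exp(rng.normal(loc=1.0, scale=0.3, size=200))  # positive, right-skewed toy response

y_transformed, lam_mle = stats.boxcox(y)  # transformed data and ML estimate of lambda
print(f"classical MLE of the transformation parameter: {lam_mle:.2f}")
```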
Abstract:
PURPOSE: To suppress the noise, by sacrificing some of the signal homogeneity for numerical stability, in uniform T1-weighted (T1w) images obtained with the magnetization-prepared 2 rapid gradient echoes sequence (MP2RAGE), and to compare the clinical utility of these robust T1w images against the uniform T1w images. MATERIALS AND METHODS: 8 healthy subjects (29.0±4.1 years; 6 male), who provided written consent, underwent two scan sessions within a 24-hour period on a 7T head-only scanner. The uniform and robust T1w image volumes were calculated inline on the scanner. Two experienced radiologists qualitatively rated the images for general image quality, 7T-specific artefacts, and local structure definition. Voxel-based and volume-based morphometry packages were used to compare the segmentation quality between the uniform and robust images. Statistical differences were evaluated using a positive-sided Wilcoxon rank test. RESULTS: The robust image suppresses background noise inside and outside the skull. The inhomogeneity introduced was rated as mild. The robust image was ranked significantly higher than the uniform image by both observers (observer 1/2, p-value = 0.0006/0.0004). In particular, improved delineation of the pituitary gland and cerebellar lobes was observed in the robust versus uniform T1w image. The reproducibility of the segmentation results between repeat scans improved (p-value = 0.0004) from an average volumetric difference across structures of ≈6.6% for the uniform image to ≈2.4% for the robust T1w image. CONCLUSIONS: The robust T1w image enables MP2RAGE to produce clinically familiar T1w images, in addition to T1 maps, which can be readily used in standard morphometry packages.
Abstract:
We show that the price of a European call option in a stochastic volatility framework can be decomposed into the sum of four terms, which identify the main features of the market that affect option prices: the expected future volatility, the correlation between the volatility and the noise driving the stock prices, the market price of volatility risk, and the difference of the expected future volatility at different times. We also study some applications of this decomposition.
Abstract:
Monitoring and management of intracranial pressure (ICP) and cerebral perfusion pressure (CPP) is a standard of care after traumatic brain injury (TBI). However, the pathophysiology of so-called secondary brain injury, i.e., the cascade of potentially deleterious events that occur in the early phase following the initial cerebral insult after TBI, is complex, involving a subtle interplay between cerebral blood flow (CBF), oxygen delivery and utilization, and the supply of the main cerebral energy substrate (glucose) to the injured brain. Regulation of this interplay depends on the type of injury and may vary individually and over time. In this setting, patient management can be a challenging task, where standard ICP/CPP monitoring may become insufficient to prevent secondary brain injury. Growing clinical evidence demonstrates that so-called multimodal brain monitoring, including brain tissue oxygen (PbtO2), cerebral microdialysis, and transcranial Doppler, among others, might help to optimize CBF and the delivery of oxygen/energy substrate at the bedside, thereby improving the management of secondary brain injury. Looking beyond ICP and CPP, and applying a multimodal therapeutic approach to optimize CBF, oxygen delivery, and brain energy supply may eventually improve the overall care of patients with head injury. This review summarizes some of the important pathophysiological determinants of secondary cerebral damage after TBI and discusses novel approaches to optimize CBF and provide adequate oxygen and energy supply to the injured brain using multimodal brain monitoring.
Abstract:
The paper develops a method to solve higher-dimensional stochastic control problems in continuous time. A finite difference type approximation scheme is used on a coarse grid of low discrepancy points, while the value function at intermediate points is obtained by regression. The stability properties of the method are discussed, and applications are given to test problems of up to 10 dimensions. Accurate solutions to these problems can be obtained on a personal computer.
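A rough sketch of the "coarse low-discrepancy grid plus regression" ingredient is given below, assuming a Sobol grid and a simple quadratic regression on a stand-in function; the actual finite difference scheme, dynamics, and value functions of the paper are not reproduced here.

```python
# Illustration only: evaluate a stand-in function on a coarse Sobol grid and
# recover values at intermediate points by least-squares regression.
import numpy as np
from scipy.stats import qmc

d = 4                                              # illustrative dimension
grid = qmc.Sobol(d=d, scramble=False).random(256)  # coarse low-discrepancy grid
values = np.sum(grid**2, axis=1)                   # stand-in for the value function on the grid

def design(x):
    # Quadratic-in-coordinates design matrix for the regression step.
    return np.hstack([np.ones((len(x), 1)), x, x**2])

coef, *_ = np.linalg.lstsq(design(grid), values, rcond=None)

query = np.full((1, d), 0.3)          # an intermediate point off the grid
print(design(query) @ coef)           # regression estimate, close to 4 * 0.3**2 = 0.36
```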
Abstract:
Agent-based computational economics is becoming widely used in practice. This paper explores the consistency of some of its standard techniques. We focus in particular on prevailing wholesale electricity trading simulation methods. We include different supply and demand representations and propose the Experience-Weighted Attractions method to include several behavioural algorithms. We compare the results across assumptions and to economic theory predictions. The match is good under best-response and reinforcement learning but not under fictitious play. The simulations perform well under flat and upward-sloping supply bidding, and also for plausible demand elasticity assumptions. Learning is influenced by the number of bids per plant and the initial conditions. The overall conclusion is that agent-based simulation assumptions are far from innocuous. We link their performance to underlying features, and identify those that are better suited to model wholesale electricity markets.
Abstract:
We introduce a simple new hypothesis testing procedure which, based on an independent sample drawn from a certain density, detects which of $k$ nominal densities the true density is closest to under the total variation ($L_1$) distance. We obtain a density-free uniform exponential bound for the probability of false detection.
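A minimal sketch of the detection idea is given below, assuming a histogram density estimate and two Gaussian nominal densities; this is only an illustration, not the authors' density-free procedure or its exponential bound.

```python
# Pick the nominal density closest to the sample in (approximate) L1 distance.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
sample = rng.normal(loc=0.5, scale=1.0, size=2000)        # drawn from the "true" density

nominal = [stats.norm(0.0, 1.0), stats.norm(0.5, 1.0)]    # k = 2 nominal densities

edges = np.linspace(-5.0, 6.0, 221)                        # histogram cells
centers, widths = 0.5 * (edges[:-1] + edges[1:]), np.diff(edges)
hist, _ = np.histogram(sample, bins=edges, density=True)   # crude density estimate

def l1_distance(f):
    # Approximate integral of |f_hat - f| over the histogram cells.
    return float(np.sum(np.abs(hist - f.pdf(centers)) * widths))

closest = min(range(len(nominal)), key=lambda j: l1_distance(nominal[j]))
print(f"the sample is closest in L1 to nominal density #{closest}")  # expect #1
```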