897 results for Minimization Problem, Lattice Model


Relevance: 30.00%

Abstract:

Given $n$ independent replicates of a jointly distributed pair $(X,Y)\in {\cal R}^d \times {\cal R}$, we wish to select from a fixed sequence of model classes ${\cal F}_1, {\cal F}_2, \ldots$ a deterministic prediction rule $f: {\cal R}^d \to {\cal R}$ whose risk is small. We investigate the possibility of empirically assessing the {\em complexity} of each model class, that is, the actual difficulty of the estimation problem within each class. The estimated complexities are in turn used to define an adaptive model selection procedure, which is based on complexity penalized empirical risk. The available data are divided into two parts. The first is used to form an empirical cover of each model class, and the second is used to select a candidate rule from each cover based on empirical risk. The covering radii are determined empirically to optimize a tight upper bound on the estimation error. An estimate is chosen from the list of candidates in order to minimize the sum of class complexity and empirical risk. A distinguishing feature of the approach is that the complexity of each model class is assessed empirically, based on the size of its empirical cover. Finite sample performance bounds are established for the estimates, and these bounds are applied to several non-parametric estimation problems. The estimates are shown to achieve a favorable tradeoff between approximation and estimation error, and to perform as well as if the distribution-dependent complexities of the model classes were known beforehand. In addition, it is shown that the estimate can be consistent, and even possess near optimal rates of convergence, when each model class has an infinite VC or pseudo dimension. For regression estimation with squared loss we modify our estimate to achieve a faster rate of convergence.
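The two-stage procedure in this abstract can be sketched in a toy setting. In the sketch below, each model class is a finite grid of constant predictors (so the class is literally its own "empirical cover"), the first half of the data selects a candidate per class by empirical risk, and a log-cardinality complexity term penalizes the second-half risk. The data, the classes, and the penalty constant are all illustrative assumptions, not the paper's construction.

```python
import math
import random

random.seed(0)

# Toy data: y = 0.7 + noise; predictors are constant rules f(x) = c.
n = 200
data = [(random.random(), 0.7 + random.gauss(0, 0.1)) for _ in range(n)]
half1, half2 = data[:n // 2], data[n // 2:]

def risk(f, sample):
    # Squared-loss empirical risk of the constant predictor f.
    return sum((f - y) ** 2 for _, y in sample) / len(sample)

def cover(k):
    # "Empirical cover" of class F_k: a grid of 2**k constants in [0, 1].
    m = 2 ** k
    return [(i + 0.5) / m for i in range(m)]

best = None
for k in range(1, 6):
    grid = cover(k)                                   # cover of class F_k
    cand = min(grid, key=lambda f: risk(f, half1))    # candidate by ERM on half 1
    complexity = math.log(len(grid)) / len(half2)     # log-cardinality penalty
    score = risk(cand, half2) + complexity            # complexity-penalized risk
    if best is None or score < best[0]:
        best = (score, k, cand)

print("selected class:", best[1], "candidate:", round(best[2], 3))
```

Larger classes approximate the target mean better but pay a larger penalty; the selected candidate lands near the true conditional mean 0.7.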

Relevance: 30.00%

Abstract:

This paper presents several applications to interest rate risk management based on a two-factor continuous-time model of the term structure of interest rates previously presented in Moreno (1996). This model assumes that default-free discount bond prices are determined by the time to maturity and two factors, the long-term interest rate and the spread (the difference between the long-term rate and the short-term (instantaneous) riskless rate). Several new measures of ``generalized duration" are presented and applied in different situations in order to manage market risk and yield curve risk. By means of these measures, we are able to compute the hedging ratios that allow us to immunize a bond portfolio by means of options on bonds. Focusing on the hedging problem, it is shown that these new measures allow us to immunize a bond portfolio against changes (parallel and/or in the slope) in the yield curve. Finally, a proposal for overcoming the limitations of conventional duration by means of these new measures is presented and illustrated numerically.
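For contrast with the two-factor "generalized duration" measures, the conventional single-factor duration hedge that the abstract says they improve on can be sketched as follows. The bonds, yields, and face values are made-up illustrative numbers, not taken from Moreno (1996).

```python
# Conventional (single-factor) duration and a duration-matched hedge ratio.

def bond_price(face, coupon, ytm, years):
    # Annual-pay fixed-coupon bond priced by discounting each cash flow.
    cfs = [face * coupon] * years
    cfs[-1] += face
    return sum(cf / (1 + ytm) ** t for t, cf in enumerate(cfs, start=1))

def modified_duration(face, coupon, ytm, years, dy=1e-5):
    # Central finite difference: D = -(1/P) dP/dy.
    p0 = bond_price(face, coupon, ytm, years)
    up = bond_price(face, coupon, ytm + dy, years)
    dn = bond_price(face, coupon, ytm - dy, years)
    return -(up - dn) / (2 * dy * p0)

# Portfolio bond (10y, 5% coupon) hedged with a shorter bond (5y, 4% coupon).
y = 0.05
P = bond_price(100, 0.05, y, 10)
H = bond_price(100, 0.04, y, 5)
DP = modified_duration(100, 0.05, y, 10)
DH = modified_duration(100, 0.04, y, 5)

# Units of the hedge bond to sell per unit of portfolio so that the
# first-order exposure to a *parallel* yield shift cancels.
hedge_ratio = (DP * P) / (DH * H)
print(round(DP, 3), round(DH, 3), round(hedge_ratio, 3))
```

This hedge is immune only to parallel shifts; it is exactly the slope (spread) risk left over that motivates the two-factor generalized durations of the paper.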

Relevance: 30.00%

Abstract:

Many dynamic revenue management models divide the sale period into a finite number of periods T and assume, invoking a fine-enough grid of time, that each period sees at most one booking request. These Poisson-type assumptions restrict the variability of the demand in the model, but researchers and practitioners were willing to overlook this for the benefit of tractability of the models. In this paper, we criticize this model from another angle. Estimating the discrete finite-period model poses problems of indeterminacy and non-robustness: arbitrarily fixing T leads to arbitrary control values, while estimating T from data adds an additional layer of indeterminacy. To counter this, we first propose an alternate finite-population model that avoids this problem of fixing T and allows a wider range of demand distributions, while retaining the useful marginal-value properties of the finite-period model. The finite-population model still requires jointly estimating market size and the parameters of the customer purchase model without observing no-purchases. Estimation of market size when no-purchases are unobservable has rarely been attempted in the marketing or revenue management literature. Indeed, we point out that it is akin to the classical statistical problem of estimating the parameters of a binomial distribution with unknown population size and success probability, and hence likely to be challenging. However, when the purchase probabilities are given by a functional form such as a multinomial-logit model, we propose an estimation heuristic that exploits the specification of the functional form, the variety of the offer sets in a typical RM setting, and qualitative knowledge of arrival rates. Finally we perform simulations to show that the estimator is very promising in obtaining unbiased estimates of population size and the model parameters.
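The classical problem the abstract invokes, jointly estimating a binomial's population size N and success probability p, can be sketched with a profile-likelihood scan: for each candidate N the profile MLE of p is mean(k)/N, and the best N maximizes the profiled log-likelihood. The counts below are illustrative, not from the paper; the near-flatness of the likelihood in N is precisely what makes the problem delicate.

```python
import math

def log_lik(ks, N, p):
    # Binomial log-likelihood of counts ks under parameters (N, p).
    if any(k > N for k in ks) or not (0 < p < 1):
        return float("-inf")
    return sum(math.log(math.comb(N, k)) + k * math.log(p)
               + (N - k) * math.log(1 - p) for k in ks)

def fit(ks, n_max=500):
    # Scan candidate population sizes; profile out p at each N.
    best = (float("-inf"), None, None)
    kbar = sum(ks) / len(ks)
    for N in range(max(ks), n_max + 1):
        p = kbar / N                      # profile MLE of p given N
        ll = log_lik(ks, N, p)
        if ll > best[0]:
            best = (ll, N, p)
    return best

# Illustrative purchase counts (assumed data, not from the paper).
ks = [31, 27, 33, 29, 35, 26, 30, 32, 28, 34]
ll, N_hat, p_hat = fit(ks)
print(N_hat, round(p_hat, 3))
```

Note that N_hat * p_hat always reproduces the sample mean exactly, while N_hat itself is poorly pinned down; side information of the kind the paper's heuristic exploits (offer-set variety, arrival-rate knowledge) is what breaks the indeterminacy.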

Relevance: 30.00%

Abstract:

We study model selection strategies based on penalized empirical loss minimization. We point out a tight relationship between error estimation and data-based complexity penalization: any good error estimate may be converted into a data-based penalty function and the performance of the estimate is governed by the quality of the error estimate. We consider several penalty functions, involving error estimates on independent test data, empirical {\sc vc} dimension, empirical {\sc vc} entropy, and margin-based quantities. We also consider the maximal difference between the error on the first half of the training data and the second half, and the expected maximal discrepancy, a closely related capacity estimate that can be calculated by Monte Carlo integration. Maximal discrepancy penalty functions are appealing for pattern classification problems, since their computation is equivalent to empirical risk minimization over the training data with some labels flipped.
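The label-flipping equivalence in the last sentence can be verified directly on a small example. Below, the class is a finite grid of 1-d threshold classifiers (an illustrative stand-in), and the maximal discrepancy is computed two ways: (i) directly, as the largest gap between the error on the first half and on the second half; (ii) by empirical risk minimization on the sample with the first half's labels flipped. On the same grid the two agree exactly.

```python
import random

random.seed(1)

def predict(theta, x):
    # Threshold classifier: label 1 iff x >= theta.
    return 1 if x >= theta else 0

def err(theta, sample):
    # Fraction of misclassified points.
    return sum(predict(theta, x) != y for x, y in sample) / len(sample)

n = 100
sample = [(random.random(), random.randint(0, 1)) for _ in range(n)]
first, second = sample[:n // 2], sample[n // 2:]
thetas = [i / 50 for i in range(51)]     # finite grid standing in for the class

# (i) Direct maximal discrepancy between the two halves.
disc = max(err(t, first) - err(t, second) for t in thetas)

# (ii) ERM with the first half's labels flipped:
# err_flip(t) = (1 - err_first(t) + err_second(t)) / 2, so
# err_first(t) - err_second(t) = 1 - 2 * err_flip(t), and maximizing the
# discrepancy is the same as minimizing the flipped-sample error.
flipped = [(x, 1 - y) for x, y in first] + second
best = min(err(t, flipped) for t in thetas)
disc_via_flip = 1 - 2 * best

print(round(disc, 3), round(disc_via_flip, 3))
```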

Relevance: 30.00%

Abstract:

This article builds on the recent policy diffusion literature and attempts to overcome one of its major problems, namely the lack of a coherent theoretical framework. The literature defines policy diffusion as a process where policy choices are interdependent, and identifies several diffusion mechanisms that specify the link between the policy choices of the various actors. As these mechanisms are grounded in different theories, theoretical accounts of diffusion currently have little internal coherence. In this article we put forward an expected-utility model of policy change that is able to subsume all the diffusion mechanisms. We argue that the expected utility of a policy depends on both its effectiveness and the payoffs it yields, and we show that the various diffusion mechanisms operate by altering these two parameters. Each mechanism affects one of the two parameters, and does so in distinct ways. To account for aggregate patterns of diffusion, we embed our model in a simple threshold model of diffusion. Given the high complexity of the process that results, strong analytical conclusions on aggregate patterns cannot be drawn without more extensive analysis which is beyond the scope of this article. However, preliminary considerations indicate that a wide range of diffusion processes may exist and that convergence is only one possible outcome.
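The aggregation device the article embeds its expected-utility model in, a simple threshold model of diffusion, can be sketched as follows: each actor adopts the policy once the fraction of prior adopters exceeds its private threshold. The thresholds below are illustrative; in the article they would derive from the expected-utility calculation. The two runs illustrate the closing point that convergence is only one possible outcome.

```python
import random

random.seed(5)

def diffuse(thresholds):
    # Iterate adoption until no actor's threshold is newly exceeded;
    # return the final adoption share.
    adopted = [False] * len(thresholds)
    while True:
        share = sum(adopted) / len(adopted)
        newly = [i for i, t in enumerate(thresholds)
                 if not adopted[i] and t <= share]
        if not newly:
            return share
        for i in newly:
            adopted[i] = True

# A seed of zero-threshold actors plus a dense ladder of thresholds
# produces a full cascade (convergence)...
cascade = diffuse([0.0] * 5 + [i / 100 for i in range(95)])

# ...while uniformly higher thresholds stall at zero adoption.
stalled = diffuse([0.3 + 0.7 * random.random() for _ in range(100)])
print(cascade, stalled)
```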

Relevance: 30.00%

Abstract:

Protein-ligand docking has made important progress during the last decade and has become a powerful tool for drug development, opening the way to virtual high-throughput screening and in silico structure-based ligand design. Despite this flattering picture, recent publications have shown that the docking problem is far from solved, and that further developments are still needed to achieve high prediction success rates and accuracy. Introducing an accurate description of the solvation effect upon binding is thought to be essential to achieve this goal. In particular, the EADock docking program uses the Generalized Born Molecular Volume 2 (GBMV2) solvent model, which has been shown to accurately reproduce the desolvation energies calculated by solving the Poisson equation. Here, the implementation of the Fast Analytical Continuum Treatment of Solvation (FACTS) as an implicit solvation model in small-molecule docking calculations has been assessed using EADock. Our results strongly support the use of FACTS for docking. The success rates of EADock/FACTS and EADock/GBMV2 are similar, i.e. around 75% for local docking and 65% for blind docking. However, these results come at a much lower computational cost: FACTS is 10 times faster than GBMV2 in calculating the total electrostatic energy, and speeds up EADock by a factor of 4. This study also supports the EADock development strategy of relying on the CHARMM package for energy calculations, which enables straightforward implementation and testing of the latest developments in the field of Molecular Modeling.

Relevance: 30.00%

Abstract:

In a recent paper [Phys. Rev. B 50, 3477 (1994)], P. Fratzl and O. Penrose present the results of the Monte Carlo simulation of the spinodal decomposition problem (phase separation) using the vacancy dynamics mechanism. They observe that the $t^{1/3}$ growth regime is reached faster than when using the standard Kawasaki dynamics. In this Comment we provide a simple explanation for the phenomenon based on the role of interface diffusion, which they claim is irrelevant for the observed behavior.

Relevance: 30.00%

Abstract:

A numerical study is presented of the three-dimensional Gaussian random-field Ising model at T=0 driven by an external field. Standard synchronous relaxation dynamics is employed to obtain the magnetization versus field hysteresis loops. The focus is on the analysis of the number and size distribution of the magnetization avalanches. They are classified as being nonspanning, one-dimensional-spanning, two-dimensional-spanning, or three-dimensional-spanning depending on whether or not they span the whole lattice in different space directions. Moreover, finite-size scaling analysis enables identification of two different types of nonspanning avalanches (critical and noncritical) and two different types of three-dimensional-spanning avalanches (critical and subcritical), whose numbers increase with L as a power law with different exponents. We conclude by giving a scenario for avalanche behavior in the thermodynamic limit.
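The basic ingredients of such a study, a T=0 random-field Ising model on a cubic lattice, synchronous relaxation, and a slowly ramped external field, can be sketched in miniature. The lattice size, field distribution width, and ramp step below are illustrative choices far smaller than any production run; the sketch records avalanche sizes and one branch of the hysteresis loop.

```python
import random

random.seed(2)

L = 6                                  # tiny lattice for illustration only
N = L ** 3
fields = [random.gauss(0.0, 2.0) for _ in range(N)]   # quenched random fields
spins = [-1] * N                       # start fully magnetized down

def neighbors(i):
    # Periodic cubic-lattice neighbors of site i.
    x, y, z = i % L, (i // L) % L, i // (L * L)
    for dx, dy, dz in ((1, 0, 0), (-1, 0, 0), (0, 1, 0),
                       (0, -1, 0), (0, 0, 1), (0, 0, -1)):
        yield ((x + dx) % L) + ((y + dy) % L) * L + ((z + dz) % L) * L * L

def relax(H):
    # Synchronous relaxation: update every spin in parallel until stable.
    # Since H is only ramped upward, all flips are upward and this terminates.
    flipped = 0
    while True:
        new = [1 if sum(spins[j] for j in neighbors(i)) + fields[i] + H > 0
               else -1 for i in range(N)]
        changed = sum(1 for a, b in zip(spins, new) if a != b)
        spins[:] = new
        if changed == 0:
            return flipped
        flipped += changed

sizes, loop = [], []
H = -4.0
while H <= 4.0:
    sizes.append(relax(H))             # avalanche size triggered at this field
    loop.append((H, sum(spins) / N))   # ascending branch of the hysteresis loop
    H += 0.25
print(loop[0][1], loop[-1][1], max(sizes))
```

A real avalanche study would then classify each avalanche by whether it spans the lattice in one, two, or three directions and repeat over many disorder realizations and sizes L.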

Relevance: 30.00%

Abstract:

The influence of vacancy concentration on the behavior of the three-dimensional random-field Ising model with metastable dynamics is studied. We have focused our analysis on the number of spanning avalanches, which allows a clear determination of the critical line where the hysteresis loops change from continuous to discontinuous. By a detailed finite-size scaling analysis we determine the phase diagram and numerically estimate the critical exponents along the whole critical line. Finally, we discuss the origin of the curvature of the critical line at high vacancy concentration.

Relevance: 30.00%

Abstract:

Isotopic and isotonic chains of superheavy nuclei are analyzed to search for spherical double shell closures beyond Z=82 and N=126 within the new effective field theory model of Furnstahl, Serot, and Tang for the relativistic nuclear many-body problem. We take into account several indicators to identify the occurrence of possible shell closures, such as two-nucleon separation energies, two-nucleon shell gaps, average pairing gaps, and the shell correction energy. The effective Lagrangian model predicts (N=172, Z=120) and (N=258, Z=120) as spherical doubly magic superheavy nuclei, whereas (N=184, Z=114) shows some magic character depending on the parameter set. The magicity of a particular neutron (proton) number in the analyzed mass region is found to depend on the number of protons (neutrons) present in the nucleus.
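Two of the indicators named above are simple differences of binding energies: the two-neutron separation energy S_2n(N,Z) = B(N,Z) - B(N-2,Z) and the two-neutron shell gap S_2n(N,Z) - S_2n(N+2,Z), whose pronounced peaks signal shell closures. The sketch below computes both from the textbook semi-empirical mass formula, a deliberately smooth stand-in for the relativistic mean-field binding energies of the paper: being smooth, it shows no shell-gap peak, which is exactly the featureless baseline the shell-closure indicators are compared against.

```python
import math

def B(A, Z):
    # Semi-empirical mass formula in MeV, usual textbook coefficients
    # (even-even pairing term only, which suffices for the chains below).
    N = A - Z
    pairing = 11.18 / math.sqrt(A) if N % 2 == 0 and Z % 2 == 0 else 0.0
    return (15.75 * A - 17.8 * A ** (2 / 3)
            - 0.711 * Z * (Z - 1) / A ** (1 / 3)
            - 23.7 * (A - 2 * Z) ** 2 / A + pairing)

def S2n(N, Z):
    # Two-neutron separation energy.
    return B(N + Z, Z) - B(N - 2 + Z, Z)

def gap2n(N, Z):
    # Two-neutron shell gap; a sharp peak versus N signals a closure.
    return S2n(N, Z) - S2n(N + 2, Z)

# At the known closure N=126 (Z=82) the smooth formula gives only a small,
# featureless gap; a model with shell structure would show a large spike.
print(round(S2n(126, 82), 2), round(gap2n(126, 82), 2))
```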

Relevance: 30.00%

Abstract:

The debate about Free Will has occupied the human mind for centuries, but it has become even more intense with recent scientific findings shedding new light on the problem. This interdisciplinary explosion of interest in the topic has brought much insightful knowledge, but also a great many epistemological problems. We think that those epistemological problems are deeply related to the very definition of Free Will and how this definition interacts with the interpretations of experimental results. We will thus outline a few of these problems and then propose a definition of Free Will that takes those epistemological pitfalls into account.

Relevance: 30.00%

Abstract:

In humans, neuronal migration disorders are commonly associated with developmental delay, mental retardation, and epilepsy. We describe here a new mouse mutant that develops a heterotopic cortex (HeCo) lying in the dorsolateral hemispheric region, between the homotopic cortex (HoCo) and the subcortical white matter. Cross-breeding demonstrated an autosomal recessive transmission. Birthdating studies and immunochemistry for layer-specific markers revealed that HeCo formation was due to a transit problem in the intermediate zone affecting both radially and tangentially migrating neurons. Neither the scaffold of radial glial fibers nor the expression of doublecortin is altered in the mutant. Neurons within the HeCo are generated at a late embryonic age (E18), and the superficial layers of the HoCo have a correspondingly lower cell density and layer thickness. Parvalbumin immunohistochemistry showed the presence of gamma-aminobutyric acidergic cells in the HeCo, and the mutant mice have a lowered threshold for the induction of epileptic seizures. The mutant showed a developmental delay but, in contrast, memory function was relatively spared. Therefore, this unique mouse model resembles the subcortical band heterotopia observed in humans. This model represents a new and rare tool to better understand cortical development and to investigate future therapeutic strategies for refractory epilepsy.

Relevance: 30.00%

Abstract:

Synchronization phenomena in large populations of interacting elements are the subject of intense research efforts in physical, biological, chemical, and social systems. A successful approach to the problem of synchronization consists of modeling each member of the population as a phase oscillator. In this review, synchronization is analyzed in one of the most representative models of coupled phase oscillators, the Kuramoto model. A rigorous mathematical treatment, specific numerical methods, and many variations and extensions of the original model that have appeared in the last few years are presented. Relevant applications of the model in different contexts are also included.
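A minimal simulation of the model reviewed above can be sketched as follows: N phase oscillators with Gaussian natural frequencies, all-to-all coupling of strength K, forward-Euler integration in the mean-field form, and the usual order parameter r = |mean(exp(i*theta))| as the synchrony measure. N, dt, and the two K values are illustrative choices; below the critical coupling r stays at finite-size noise level, above it the population locks.

```python
import cmath
import math
import random

random.seed(3)

def simulate(K, N=200, dt=0.05, steps=2000):
    # Kuramoto model: d(theta_i)/dt = omega_i + K * r * sin(psi - theta_i),
    # using the mean-field (r, psi) form of the all-to-all coupling.
    omega = [random.gauss(0.0, 1.0) for _ in range(N)]
    theta = [random.uniform(0, 2 * math.pi) for _ in range(N)]
    for _ in range(steps):
        z = sum(cmath.exp(1j * t) for t in theta) / N   # complex order parameter
        r, psi = abs(z), cmath.phase(z)
        theta = [t + dt * (w + K * r * math.sin(psi - t))
                 for t, w in zip(theta, omega)]
    z = sum(cmath.exp(1j * t) for t in theta) / N
    return abs(z)                                       # final synchrony r

r_low, r_high = simulate(K=0.5), simulate(K=4.0)
print(round(r_low, 2), round(r_high, 2))
```

For a unit-width Gaussian frequency distribution the critical coupling is K_c = 2 / (pi * g(0)) ≈ 1.6, so K=0.5 sits in the incoherent phase and K=4 well inside the synchronized one.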

Relevance: 30.00%

Abstract:

We propose a generalization of the persistent random walk for dimensions greater than 1. Based on a cubic lattice, the model is suitable for an arbitrary dimension d. We study the continuum limit and obtain the equation satisfied by the probability density function for the position of the random walker. An exact solution is obtained for the projected motion along an axis. This solution, which is written in terms of the free-space solution of the one-dimensional telegrapher's equation, may open a new way to address the problem of light propagation through thin slabs.
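The lattice model described above can be sketched directly: at each step the walker repeats its previous direction with probability p and otherwise redraws uniformly among the 2d lattice directions. The parameters below (d=3, the two p values, step counts) are illustrative; the run simply confirms that stronger persistence yields a larger mean-square displacement, and the x-projection of such a walk is the 1-d telegrapher-like motion the abstract refers to.

```python
import random

random.seed(4)

def walk(d, p, steps):
    # Persistent random walk on the d-dimensional cubic lattice.
    dirs = [tuple(s if k == axis else 0 for k in range(d))
            for axis in range(d) for s in (1, -1)]
    pos = [0] * d
    cur = random.choice(dirs)
    for _ in range(steps):
        if random.random() >= p:          # persistence lost: redraw direction
            cur = random.choice(dirs)
        pos = [a + b for a, b in zip(pos, cur)]
    return pos

def msd(d, p, steps, trials=400):
    # Monte Carlo estimate of the mean-square displacement.
    tot = 0.0
    for _ in range(trials):
        pos = walk(d, p, steps)
        tot += sum(c * c for c in pos)
    return tot / trials

# Stronger persistence -> larger mean-square displacement per step.
m_weak, m_strong = msd(3, 0.2, 200), msd(3, 0.8, 200)
print(round(m_weak, 1), round(m_strong, 1))
```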

Relevance: 30.00%

Abstract:

The short-range resonating-valence-bond (RVB) wave function with nearest-neighbor (NN) spin pairings only is investigated as a possible description for the Heisenberg model on a square-planar lattice. A type of long-range order associated with this RVB Ansatz is identified, along with some qualitative consequences involving lattice distortions, excitations, and their coupling.