907 results for computationally efficient algorithm
Abstract:
This paper revisits the problem of adverse selection in the insurance market of Rothschild and Stiglitz [28]. We propose a simple extension of the game-theoretic structure in Hellwig [14] under which Nash-type strategic interaction between the informed customers and the uninformed firms always results in a particular separating equilibrium. The equilibrium allocation is unique and Pareto-efficient in the interim sense, subject to incentive compatibility and individual rationality. In fact, it is the unique neutral optimum in the sense of Myerson [22].
Abstract:
We consider a frictional two-sided matching market in which one side uses public cheap talk announcements so as to attract the other side. We show that if the first-price auction is adopted as the trading protocol, then cheap talk can be perfectly informative, and the resulting market outcome is efficient, constrained only by search frictions. We also show that the performance of an alternative trading protocol in the cheap-talk environment depends on the level of price dispersion generated by the protocol: If a trading protocol compresses (spreads) the distribution of prices relative to the first-price auction, then an efficient fully revealing equilibrium always (never) exists. Our results identify the settings in which cheap talk can serve as an efficient competitive instrument, in the sense that the central insights from the literature on competing auctions and competitive search continue to hold unaltered even without ex ante price commitment.
Abstract:
Vector Autoregressive Moving Average (VARMA) models have many theoretical properties which should make them popular among empirical macroeconomists. However, they are rarely used in practice due to over-parameterization concerns, difficulties in ensuring identification, and computational challenges. With the growing interest in multivariate time series models of high dimension, these problems with VARMAs become even more acute, accounting for the dominance of VARs in this field. In this paper, we develop a Bayesian approach for inference in VARMAs which surmounts these problems. It jointly ensures identification and parsimony in the context of an efficient Markov chain Monte Carlo (MCMC) algorithm. We use this approach in a macroeconomic application involving up to twelve dependent variables. We find that our algorithm works successfully and provides insights beyond those offered by VARs.
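The paper's identification-plus-parsimony MCMC sampler is not available in standard libraries, but the model class it targets is. As a hedged point of reference, the sketch below fits a small VARMA by maximum likelihood with statsmodels' VARMAX; the data and the (p, q) = (1, 1) order are placeholders, not the paper's twelve-variable application.

```python
# Hedged sketch: maximum-likelihood fit of a small VARMA(1,1) with
# statsmodels' VARMAX. The paper's Bayesian MCMC sampler is not part of
# any standard library; this only illustrates the model class.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
y = rng.standard_normal((200, 3))          # placeholder data: 3 stationary series

model = sm.tsa.VARMAX(y, order=(1, 1))     # VARMA(p=1, q=1)
res = model.fit(disp=False)                # quasi-Newton MLE
print(res.summary())
```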
Abstract:
As technology evolves, computing capabilities keep increasing, and problems that were intractable in the past cease to be so with current resources. Most applications that tackle such problems are complex, since achieving high performance rates requires using as many resources as possible, which gives these applications an inherently distributed architecture. Following the trend in the research community, this work proposes an architecture for grid environments based on resource virtualization that enables the efficient management of those resources. The experiments carried out confirm the viability of this architecture and the management improvements that the use of virtual machines provides.
Abstract:
The implicit projection algorithm of isotropic plasticity is extended to an objective anisotropic elastic perfectly plastic model. The recursion formula developed to project the trial stress onto the yield surface is applicable to any nonlinear elastic law and any plastic yield function. A curvilinear transverse isotropic model, based on a quadratic elastic potential and on Hill's quadratic yield criterion, is then developed and implemented in a computer program for bone mechanics applications. The paper concludes with a numerical study of a schematic bone-prosthesis system to illustrate the potential of the model.
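The abstract's anisotropic extension is not reproduced here, but the underlying implicit projection (return mapping) idea can be sketched for the isotropic starting point it builds on. A minimal sketch, assuming small strains and J2 (von Mises) perfect plasticity; all names and material constants are illustrative.

```python
# Hedged sketch: implicit projection (return mapping) for an isotropic,
# linear-elastic, J2 perfectly plastic material, the starting point the
# abstract extends to anisotropy. Small strains; names are illustrative.
import numpy as np

def radial_return(eps, eps_p, mu, lam, sigma_y):
    """Strain-driven update: elastic trial stress, then projection onto the yield surface."""
    eps_e = eps - eps_p                                      # trial elastic strain
    sig_tr = lam * np.trace(eps_e) * np.eye(3) + 2.0 * mu * eps_e
    s_tr = sig_tr - np.trace(sig_tr) / 3.0 * np.eye(3)       # deviatoric trial stress
    f = np.linalg.norm(s_tr) - np.sqrt(2.0 / 3.0) * sigma_y  # von Mises yield function
    if f <= 0.0:
        return sig_tr, eps_p                                 # elastic step, no projection
    n = s_tr / np.linalg.norm(s_tr)                          # return direction
    dgamma = f / (2.0 * mu)                                  # plastic multiplier (perfect plasticity)
    return sig_tr - 2.0 * mu * dgamma * n, eps_p + dgamma * n

# usage: one uniaxial strain step with MPa-scale constants (illustrative)
sig, eps_p = radial_return(np.diag([2e-3, 0.0, 0.0]), np.zeros((3, 3)),
                           mu=80e3, lam=120e3, sigma_y=250.0)
```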
Abstract:
Innate immune responses play a central role in neuroprotection and neurotoxicity during inflammatory processes that are triggered by pathogen-associated molecular pattern-exhibiting agents such as bacterial lipopolysaccharide (LPS) and that are modulated by inflammatory cytokines such as interferon γ (IFNγ). Recent findings describing the unexpected complexity of mammalian genomes and transcriptomes have stimulated further identification of novel transcripts involved in specific physiological and pathological processes, such as the neural innate immune response that alters the expression of many genes. We developed a system for efficient subtractive cloning that employs both sense and antisense cRNA drivers, and coupled it with in-house cDNA microarray analysis. This system enabled effective direct cloning of differentially expressed transcripts from a small amount (0.5 µg) of total RNA. We applied this system to the isolation of genes activated by LPS and IFNγ in primary-cultured cortical cells derived from newborn mice, to investigate the mechanisms involved in neuroprotection and neurotoxicity in maternal/perinatal infections that cause various brain injuries, including periventricular leukomalacia. A number of genes involved in the immune and inflammatory response were identified, showing that neonatal neuronal/glial cells are highly responsive to LPS and IFNγ. Subsequent RNA blot analysis revealed that the identified genes were activated by LPS and IFNγ in a cooperative or distinctive manner, supporting the notion that these bacterial and cellular inflammatory mediators can affect the brain through direct but complicated pathways. We also identified several novel clones of apparently non-coding RNAs that potentially harbor various regulatory functions. Characterization of the presently identified genes will give insights into mechanisms and interventions not only for perinatal infection-induced brain damage, but also for many other innate immunity-related brain disorders.
Abstract:
Defining an efficient training set is one of the most delicate phases for the success of remote sensing image classification routines. The complexity of the problem, the limited temporal and financial resources, as well as the high intraclass variance can make an algorithm fail if it is trained with a suboptimal dataset. Active learning aims at building efficient training sets by iteratively improving the model performance through sampling. A user-defined heuristic ranks the unlabeled pixels according to a function of the uncertainty of their class membership, and the user is then asked to provide labels for the most uncertain pixels. This paper reviews and tests the main families of active learning algorithms: committee, large margin, and posterior probability-based. For each of them, the most recent advances in the remote sensing community are discussed and some heuristics are detailed and tested. Several challenging remote sensing scenarios are considered, including very high spatial resolution and hyperspectral image classification. Finally, guidelines for choosing a suitable architecture are provided for new and/or inexperienced users.
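As a concrete instance of the large-margin family reviewed above, one margin-sampling query step can be sketched with scikit-learn. The classifier, the pools, and the query size are illustrative placeholders, not the paper's remote sensing benchmarks.

```python
# Hedged sketch: one margin-sampling query step (the "large margin" family).
# SVC, the pools, and n_queries are placeholders, not the paper's benchmarks.
import numpy as np
from sklearn.svm import SVC

def margin_sampling(clf, X_pool, n_queries=10):
    """Rank unlabeled samples by decision margin; smallest margin = most uncertain."""
    scores = clf.decision_function(X_pool)
    if scores.ndim == 1:                        # binary: distance to the hyperplane
        margin = np.abs(scores)
    else:                                       # multiclass (ovr): gap between top-2 scores
        top2 = np.sort(scores, axis=1)[:, -2:]
        margin = top2[:, 1] - top2[:, 0]
    return np.argsort(margin)[:n_queries]       # indices to hand to the human labeler

rng = np.random.default_rng(0)
X_train, y_train = rng.normal(size=(40, 5)), rng.integers(0, 2, 40)
X_pool = rng.normal(size=(500, 5))
clf = SVC(kernel="rbf").fit(X_train, y_train)
query_idx = margin_sampling(clf, X_pool)        # label these, retrain, repeat
```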
Abstract:
Astrocytes are now considered key players in brain information processing because of their newly discovered roles in synapse formation and plasticity, energy metabolism and blood flow regulation. However, our understanding of astrocyte function is still fragmented compared to other brain cell types. A better appreciation of the biology of astrocytes requires the development of tools to generate animal models in which astrocyte-specific proteins and pathways can be manipulated. In addition, it is becoming increasingly evident that astrocytes are also important players in many neurological disorders. Targeted modulation of protein expression in astrocytes would be critical for the development of new therapeutic strategies. Gene transfer is valuable for targeting a subpopulation of cells and exploring their function in experimental models. In particular, viral-mediated gene transfer provides a rapid, highly flexible and cost-effective in vivo paradigm for studying the impact of genes of interest during central nervous system development or in adult animals. We review the different strategies that led to the recent development of efficient viral vectors that can be successfully used to selectively transduce astrocytes in the mammalian brain.
Abstract:
A family of nonempty closed convex sets is built using the data of the Generalized Nash equilibrium problem (GNEP). The sets are selected iteratively so that the intersection of the selected sets contains the solutions of the GNEP. The algorithm introduced by Iusem and Sosa (2003) is adapted to obtain solutions of the GNEP. Finally, some numerical experiments are given to illustrate the numerical behavior of the algorithm.
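The adapted Iusem-Sosa scheme itself is not reproduced here; the sketch below only illustrates the geometric idea the abstract relies on, namely approaching a point in the intersection of a family of closed convex sets by successive projections. The half-space sets are a toy choice.

```python
# Hedged sketch: successive projections onto a family of closed convex sets
# (here half-spaces a.x <= b), the geometric construction the abstract uses;
# this is not the adapted Iusem-Sosa algorithm itself.
import numpy as np

def project_halfspace(x, a, b):
    """Euclidean projection of x onto {z : a . z <= b}."""
    viol = a @ x - b
    return x if viol <= 0.0 else x - viol * a / (a @ a)

def cyclic_projections(x0, halfspaces, iters=100):
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        for a, b in halfspaces:
            x = project_halfspace(x, np.asarray(a, dtype=float), b)
    return x                                   # approaches the intersection

x = cyclic_projections([3.0, 4.0], [([1.0, 0.0], 1.0), ([0.0, 1.0], 1.0)])
print(x)                                       # -> close to [1.0, 1.0]
```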
Abstract:
We compared the influence of bug density on the capacity of Triatoma infestans and Panstrongylus megistus to obtain blood meals from non-anaesthetized mice. The regression analysis of increase in body weight (mg) versus density (no. of bugs/mouse) showed that in experiments with anaesthetized mice (AM) no correlation was observed. In experiments with non-anaesthetized mice (NAM), the weight increase was inversely proportional to density. The regression slope for blood meal size on density was less steep for T. infestans than for P. megistus (-1.9 and -3.0, respectively). The average weight increase of P. megistus nymphs in experiments with AM was higher than that of T. infestans nymphs; however, in experiments with NAM these results were inverted. Mortality of P. megistus was significantly higher than that of T. infestans with NAM; however, in experiments with AM very low mortality was observed. Considering the mortality and the slope of the regression line for NAM, T. infestans is more efficient than P. megistus at obtaining blood meals at similar densities, possibly because it causes less irritation to the mice. The better exploitation of the blood source by T. infestans compared with P. megistus at similar densities favours the maintenance of a better nutritional status at higher densities. This could explain epidemiological findings in which T. infestans not only succeeds in establishing larger colonies but also dislodges P. megistus from human dwellings when it is introduced into areas where the latter species prevails.
Abstract:
Given a sample from a fully specified parametric model, let Zn be a given finite-dimensional statistic - for example, an initial estimator or a set of sample moments. We propose to (re-)estimate the parameters of the model by maximizing the likelihood of Zn. We call this the maximum indirect likelihood (MIL) estimator. We also propose a computationally tractable Bayesian version of the estimator, which we refer to as a Bayesian Indirect Likelihood (BIL) estimator. In most cases, the density of the statistic will be of unknown form, and we develop simulated versions of the MIL and BIL estimators. We show that the indirect likelihood estimators are consistent and asymptotically normally distributed, with the same asymptotic variance as that of the corresponding efficient two-step GMM estimator based on the same statistic. However, our likelihood-based estimators, by taking into account the full finite-sample distribution of the statistic, are higher-order efficient relative to GMM-type estimators. Furthermore, in many cases they enjoy a bias reduction property similar to that of the indirect inference estimator. Monte Carlo results for a number of applications, including dynamic and nonlinear panel data models, a structural auction model and two DSGE models, show that the proposed estimators indeed have attractive finite sample properties.
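A minimal sketch of the simulated MIL idea on a toy model, assuming X_i ~ N(theta, 1) with the sample mean as the statistic Zn; a kernel density estimate stands in for the generally unknown density of the statistic, and common random numbers keep the simulated objective smooth. All settings are illustrative, not the paper's applications.

```python
# Hedged sketch: simulated maximum indirect likelihood (MIL) on a toy model
# X_i ~ N(theta, 1) with statistic Z_n = sample mean. gaussian_kde stands in
# for the unknown density of the statistic; all settings are illustrative.
import numpy as np
from scipy.stats import gaussian_kde
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(1)
n, S = 100, 500                                # sample size, simulations per theta
z_obs = rng.normal(0.7, 1.0, n).mean()         # observed statistic (true theta = 0.7)
eps = rng.standard_normal((S, n))              # common random numbers across theta

def neg_log_indirect_lik(theta):
    z_sim = (theta + eps).mean(axis=1)         # simulate the statistic S times
    dens = gaussian_kde(z_sim)(z_obs)[0]       # KDE estimate of its density at z_obs
    return -np.log(dens + 1e-300)              # guard against log(0) in the far tail

res = minimize_scalar(neg_log_indirect_lik, bounds=(-2.0, 3.0), method="bounded")
print(res.x)                                   # MIL estimate, close to 0.7
```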
Abstract:
We evaluate the performance of different optimization techniques developed in the context of optical flow computation with different variational models. In particular, building on truncated Newton methods (TN), which have been an effective approach for large-scale unconstrained optimization, we develop the use of efficient multilevel schemes for computing the optical flow. More precisely, we compare the performance of a standard unidirectional multilevel algorithm, called multiresolution optimization (MR/OPT), with a bidirectional multilevel algorithm, called full multigrid optimization (FMG/OPT). The FMG/OPT algorithm treats the coarse grid correction as an optimization search direction and eventually scales it using a line search. Experimental results on different image sequences using four models of optical flow computation show that the FMG/OPT algorithm outperforms both the TN and MR/OPT algorithms in terms of computational work and the quality of the optical flow estimation.
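A minimal sketch of the unidirectional coarse-to-fine (MR/OPT) pattern: optimize on a coarse grid, prolong the solution, and warm-start the next finer level. The quadratic energy is a stand-in for a variational optical-flow functional, and scipy's TNC method plays the role of the truncated Newton solver; grid sizes and the regularization weight are illustrative.

```python
# Hedged sketch of the unidirectional MR/OPT pattern: optimize on a coarse
# grid, prolong, warm-start the finer level. The quadratic energy stands in
# for a variational optical-flow functional; scipy's TNC (truncated Newton)
# plays the role of the TN solver. Sizes and weights are illustrative.
import numpy as np
from scipy.ndimage import zoom
from scipy.optimize import minimize

def energy(u_flat, target, alpha=0.1):
    u = u_flat.reshape(target.shape)
    data = np.sum((u - target) ** 2)                       # data-fidelity term
    smooth = np.sum(np.diff(u, axis=0) ** 2) + np.sum(np.diff(u, axis=1) ** 2)
    return data + alpha * smooth                           # regularized energy

target = np.fromfunction(lambda i, j: np.sin(i / 4.0) + np.cos(j / 4.0), (16, 16))
u = np.zeros((4, 4))                                       # coarsest-level initial guess
for size in (4, 8, 16):                                    # coarse-to-fine sweep
    u = zoom(u, size / u.shape[0], order=1)                # prolong previous solution
    tgt = zoom(target, size / target.shape[0], order=1)    # restrict the data
    res = minimize(energy, u.ravel(), args=(tgt,), method="TNC")
    u = res.x.reshape(size, size)
print(energy(u.ravel(), target))                           # final fine-grid energy
```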
Abstract:
"See the abstract at the beginning of the document in the attached file."
Abstract:
This paper discusses the use of probabilistic or randomized algorithms for solving combinatorial optimization problems. Our approach employs non-uniform probability distributions to add a biased random behavior to classical heuristics, so that a large set of alternative good solutions can be quickly obtained in a natural way and without complex configuration processes. This procedure is especially useful in problems where properties such as non-smoothness or non-convexity lead to a highly irregular solution space, for which traditional optimization methods, both of exact and approximate nature, may fail to reach their full potential. The results obtained are promising enough to suggest that randomizing classical heuristics is a powerful method that can be successfully applied in a variety of cases.
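A minimal sketch of the biased-randomization idea on a toy nearest-neighbour TSP heuristic: the next element is drawn from a geometric (non-uniform) distribution over the greedy-sorted candidate list, so repeated runs quickly produce many distinct good solutions. The beta parameter and the instance are illustrative.

```python
# Hedged sketch of biased randomization: the greedy choice is replaced by a
# draw from a geometric (non-uniform) distribution over the sorted candidate
# list, shown on a toy nearest-neighbour TSP tour. beta and the instance are
# illustrative.
import math
import random

def biased_pick(ranked, beta=0.3):
    """Geometric-biased index: rank 0 (the pure greedy choice) is most likely."""
    k = int(math.log(random.random()) / math.log(1.0 - beta)) % len(ranked)
    return ranked[k]

def biased_greedy_tour(dist, start=0, beta=0.3):
    unvisited, tour = set(range(len(dist))) - {start}, [start]
    while unvisited:
        ranked = sorted(unvisited, key=lambda j: dist[tour[-1]][j])  # greedy order
        nxt = biased_pick(ranked, beta)
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

random.seed(42)
pts = [(random.random(), random.random()) for _ in range(10)]
dist = [[math.dist(p, q) for q in pts] for p in pts]
print(biased_greedy_tour(dist))     # re-run (new seed) for alternative good tours
```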