932 results for "efficient algorithm"
Abstract:
We consider collective choice problems where a set of agents have to choose an alternative from a finite set and agents may or may not become users of the chosen alternative. An allocation is a pair given by the chosen alternative and the set of its users. Agents have gregarious preferences over allocations: given an allocation, they prefer that the set of users becomes larger. We require that the final allocation be efficient and stable (no agent can be forced to be a user and no agent who wants to be a user can be excluded). We propose a two-stage sequential mechanism whose unique subgame perfect equilibrium outcome is an efficient and stable allocation which also satisfies a maximal participation property.
Abstract:
The algorithmic approach to data modelling has developed rapidly in recent years; in particular, methods based on data mining and machine learning have been used in a growing number of applications. These methods follow a data-driven methodology, aiming to provide the best possible generalization and predictive ability rather than concentrating on the properties of the data model. One of the most successful groups of such methods is known as Support Vector algorithms. Following the fruitful developments in applying Support Vector algorithms to spatial data, this paper introduces a new extension of the traditional support vector regression (SVR) algorithm. This extension allows for the simultaneous modelling of environmental data at several spatial scales. The joint influence of environmental processes presenting different patterns at different scales is here learned automatically from data, providing the optimum mixture of short- and large-scale models. The method is adaptive to the spatial scale of the data. With this advantage, it can provide efficient means to model local anomalies that typically arise in the early phase of an environmental emergency. However, the proposed approach still requires some prior knowledge of the possible existence of such short-scale patterns, which is a possible limitation of the method for its implementation in early warning systems. The purpose of this paper is to present the multi-scale SVR model and to illustrate its use with an application to the mapping of 137Cs activity, given the measurements taken in the region of Briansk following the Chernobyl accident.
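The multi-scale idea can be illustrated with a toy sketch: a convex mixture of a short-bandwidth and a long-bandwidth RBF kernel. Kernel ridge regression is used here as a numpy-only stand-in for SVR (the kernel-mixture construction carries over); all data, bandwidths and weights below are invented for the example, not taken from the paper.

```python
import numpy as np

def rbf_gram(X1, X2, bw):
    # Gaussian RBF Gram matrix: K[i, j] = exp(-||x_i - x_j||^2 / (2 bw^2)).
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * bw ** 2))

def multiscale_gram(X1, X2, short_bw, long_bw, w):
    # Convex mixture of a short-scale and a long-scale kernel.
    return w * rbf_gram(X1, X2, short_bw) + (1.0 - w) * rbf_gram(X1, X2, long_bw)

def fit_kernel_ridge(X, y, gram, lam=1e-3):
    alpha = np.linalg.solve(gram(X, X) + lam * np.eye(len(X)), y)
    return lambda Xnew: gram(Xnew, X) @ alpha

# Toy 1-D field: a smooth large-scale trend plus one sharp local anomaly.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 10.0, size=(200, 1))
y = np.sin(X[:, 0] / 2.0) + 2.0 * np.exp(-(X[:, 0] - 5.0) ** 2 / 0.05)

predict = fit_kernel_ridge(X, y, lambda A, B: multiscale_gram(A, B, 0.1, 2.0, 0.5))
mse = float(np.mean((predict(X) - y) ** 2))
print(mse)  # small in-sample error: the mixture captures both scales
```

A single long-bandwidth kernel would smooth the anomaly away, while a single short-bandwidth kernel would overfit the trend; the mixture handles both.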
Abstract:
Forest fires are a serious threat to humans and nature from an ecological, social and economic point of view. Predicting their behaviour by simulation still delivers unreliable results and remains a challenging task. The latest approaches try to calibrate input variables, often tainted with imprecision, using optimisation techniques such as Genetic Algorithms (GAs). To converge faster towards fitter solutions, the GA is guided with knowledge obtained from historical or synthetic fires. We developed a robust and efficient knowledge storage and retrieval method. Nearest neighbour search is applied to find the fire configuration in the knowledge base that is most similar to the current configuration. To this end, a distance measure was elaborated and implemented in several ways. Experiments show the performance of the different implementations with regard to occupied storage and retrieval time, with highly satisfactory results.
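A minimal sketch of the retrieval step, assuming fire configurations are stored as fixed-length numeric feature vectors; the feature names (wind speed, wind direction, fuel moisture) and all values are invented for illustration, not taken from the paper.

```python
import numpy as np

# Hypothetical stored fire configurations; each row is a feature vector
# (wind speed, wind direction, fuel moisture) -- names and values are invented.
knowledge_base = np.array([
    [12.0,  90.0, 0.10],
    [ 3.0,  45.0, 0.25],
    [20.0, 180.0, 0.05],
])

def nearest_configuration(query, base, weights=None):
    """Index of the stored configuration closest to `query` under a
    (optionally feature-weighted) Euclidean distance."""
    w = np.ones(base.shape[1]) if weights is None else np.asarray(weights)
    d = np.sqrt((((base - query) ** 2) * w).sum(axis=1))
    k = int(np.argmin(d))
    return k, float(d[k])

idx, dist = nearest_configuration(np.array([11.0, 85.0, 0.12]), knowledge_base)
print(idx)  # the first stored fire is the most similar
```

The `weights` argument hints at one of the abstract's points: features with different units and ranges need a carefully elaborated distance measure, not raw Euclidean distance.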
Abstract:
The implementation of public programs to support business R&D projects requires the establishment of a selection process. This selection process faces various difficulties, which include the measurement of the impact of the R&D projects as well as selection process optimization among projects with multiple, and sometimes incomparable, performance indicators. To this end, public agencies generally use the peer review method, which, while presenting some advantages, also demonstrates significant drawbacks. Private firms, on the other hand, tend toward more quantitative methods, such as Data Envelopment Analysis (DEA), in their pursuit of R&D investment optimization. In this paper, the performance of a public agency peer review method of project selection is compared with an alternative DEA method.
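For readers unfamiliar with DEA, the sketch below solves the classical input-oriented CCR multiplier model with scipy's linear programming solver for three invented projects (one input, one output). This is the textbook formulation, not the agency's or the paper's actual selection procedure.

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_efficiency(inputs, outputs, o):
    """Input-oriented CCR efficiency of unit `o` (multiplier form):
    max u.y_o  s.t.  v.x_o = 1,  u.y_j - v.x_j <= 0 for all j,  u, v >= 0."""
    n, m = inputs.shape
    _, s = outputs.shape
    c = np.concatenate([-outputs[o], np.zeros(m)])            # maximize u.y_o
    A_ub = np.hstack([outputs, -inputs])                      # u.y_j - v.x_j <= 0
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.zeros(s), inputs[o]])[None, :]  # v.x_o = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (s + m))
    return -res.fun

# Three hypothetical R&D projects: one input (budget) and one output (impact).
x = np.array([[10.0], [20.0], [15.0]])
y = np.array([[5.0], [8.0], [9.0]])
scores = [round(dea_ccr_efficiency(x, y, o), 3) for o in range(3)]
print(scores)  # project 2 defines the frontier (score 1.0)
```

With multiple, possibly incomparable indicators (the setting in the abstract), the same LP simply gains more input/output columns; the solver finds the weights most favourable to each project.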
Abstract:
During the last two decades there has been an increase in the use of dynamic tariffs for billing household electricity consumption. This has called into question the suitability of traditional pricing schemes, such as two-part tariffs, since they contribute to creating marked peak and off-peak demands. The aim of this paper is to assess whether two-part tariffs are an efficient pricing scheme, using Spanish household electricity microdata. An ordered probit model with instrumental variables on the determinants of power-level choice, together with non-parametric spline regressions on the electricity price distribution, allows us to distinguish between the tariff structure choice and the simultaneous demand decisions. We conclude that electricity consumption and dwellings' and individuals' characteristics are key determinants of the fixed charge paid by Spanish households. Finally, the results point to the inefficiency of the two-part tariff, as those consumers who consume more electricity pay a lower price than the others.
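The ordered probit ingredient can be sketched on synthetic data (without the instrumental-variable step, which the paper adds). Here the regressor, the thresholds and the three-level "power level" outcome are all invented; the model is fitted by maximum likelihood with scipy.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(1)
n = 2000
x = rng.normal(size=n)                     # e.g. standardized electricity consumption
latent = 1.0 * x + rng.normal(size=n)      # latent propensity for a higher power level
cuts_true = np.array([-0.5, 0.8])          # thresholds separating the 3 power levels
y = np.searchsorted(cuts_true, latent)     # observed ordered choice in {0, 1, 2}

def negloglik(theta):
    beta, c1, log_gap = theta
    cuts = np.array([c1, c1 + np.exp(log_gap)])   # enforce increasing cutpoints
    lo = np.concatenate(([-np.inf], cuts))[y]
    hi = np.concatenate((cuts, [np.inf]))[y]
    p = norm.cdf(hi - beta * x) - norm.cdf(lo - beta * x)
    return -np.log(np.clip(p, 1e-12, None)).sum()

fit = minimize(negloglik, x0=np.zeros(3), method="BFGS")
beta_hat = fit.x[0]
print(round(beta_hat, 2))  # close to the true coefficient 1.0
```

Each observation contributes the probability that its latent index falls between the two cutpoints bracketing its observed category, which is exactly the term inside the log-likelihood above.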
Abstract:
The recent developments in high-magnetic-field 13C magnetic resonance spectroscopy with improved localization and shimming techniques have led to important gains in sensitivity and spectral resolution of 13C in vivo spectra in the rodent brain, enabling the separation of several 13C isotopomers of glutamate and glutamine. In this context, the assumptions used in spectral quantification might have a significant impact on the determination of the 13C concentrations and the related metabolic fluxes. In this study, the time-domain spectral quantification algorithm AMARES (advanced method for accurate, robust and efficient spectral fitting) was applied to 13C magnetic resonance spectroscopy spectra acquired in the rat brain at 9.4 T, following infusion of [1,6-13C2]glucose. Using both Monte Carlo simulations and in vivo data, the goals of this work were: (1) to validate the quantification of in vivo 13C isotopomers using AMARES; (2) to assess the impact of the prior knowledge on the quantification of in vivo 13C isotopomers using AMARES; (3) to compare AMARES and LCModel (linear combination of model spectra) for the quantification of in vivo 13C spectra. AMARES led to accurate and reliable 13C spectral quantification, with results similar to those obtained using LCModel, when the frequency shifts, J-coupling constants and phase patterns of the different 13C isotopomers were included as prior knowledge in the analysis.
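AMARES fits a sum of exponentially damped sinusoids directly in the time domain, with prior knowledge entering as constraints on the parameters. A heavily simplified single-resonance sketch using scipy's bounded least squares; the sampling rate, frequency, damping and noise level are invented for the toy signal.

```python
import numpy as np
from scipy.optimize import least_squares

# Synthetic FID: one exponentially damped sinusoid standing in for a 13C resonance.
t = np.arange(0.0, 0.5, 1e-3)                 # 0.5 s acquisition, 1 kHz sampling
a0, f0, d0, phi0 = 2.0, 35.0, 8.0, 0.4        # amplitude, Hz, 1/s damping, phase
fid = a0 * np.exp(-d0 * t) * np.cos(2 * np.pi * f0 * t + phi0)
fid += 0.05 * np.random.default_rng(2).normal(size=t.size)

def residual(p):
    a, f, d, phi = p
    return a * np.exp(-d * t) * np.cos(2 * np.pi * f * t + phi) - fid

# Prior knowledge enters through bounds and starting values, as in AMARES-style fits.
fit = least_squares(residual, x0=[1.0, 34.0, 5.0, 0.0],
                    bounds=([0.0, 20.0, 0.0, -np.pi], [10.0, 50.0, 50.0, np.pi]))
a_hat, f_hat, d_hat, phi_hat = fit.x
print(round(f_hat, 1))  # recovered frequency, close to 35.0 Hz
```

In the real multi-isotopomer setting, the model is a sum of such terms and the prior knowledge additionally ties frequencies, J-coupling splittings and phases across peaks, which is the point tested in the paper.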
Abstract:
This paper revisits the problem of adverse selection in the insurance market of Rothschild and Stiglitz [28]. We propose a simple extension of the game-theoretic structure in Hellwig [14] under which Nash-type strategic interaction between the informed customers and the uninformed firms always results in a particular separating equilibrium. The equilibrium allocation is unique and Pareto-efficient in the interim sense subject to incentive-compatibility and individual rationality. In fact, it is the unique neutral optimum in the sense of Myerson [22].
Abstract:
We consider a frictional two-sided matching market in which one side uses public cheap talk announcements so as to attract the other side. We show that if the first-price auction is adopted as the trading protocol, then cheap talk can be perfectly informative, and the resulting market outcome is efficient, constrained only by search frictions. We also show that the performance of an alternative trading protocol in the cheap-talk environment depends on the level of price dispersion generated by the protocol: If a trading protocol compresses (spreads) the distribution of prices relative to the first-price auction, then an efficient fully revealing equilibrium always (never) exists. Our results identify the settings in which cheap talk can serve as an efficient competitive instrument, in the sense that the central insights from the literature on competing auctions and competitive search continue to hold unaltered even without ex ante price commitment.
Abstract:
Vector Autoregressive Moving Average (VARMA) models have many theoretical properties which should make them popular among empirical macroeconomists. However, they are rarely used in practice due to over-parameterization concerns, difficulties in ensuring identification and computational challenges. With the growing interest in multivariate time series models of high dimension, these problems with VARMAs become even more acute, accounting for the dominance of VARs in this field. In this paper, we develop a Bayesian approach for inference in VARMAs which surmounts these problems. It jointly ensures identification and parsimony in the context of an efficient Markov chain Monte Carlo (MCMC) algorithm. We use this approach in a macroeconomic application involving up to twelve dependent variables. We find our algorithm to work successfully and provide insights beyond those provided by VARs.
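The MCMC ingredient can be illustrated at toy scale: random-walk Metropolis over the coefficients of a bivariate VAR(1) with known unit error covariance and standard normal priors. The authors' sampler for VARMAs is far more elaborate (joint identification and parsimony restrictions); this sketch only shows the generic mechanism, with all dimensions and values invented.

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulate a bivariate VAR(1): y_t = A y_{t-1} + e_t with e_t ~ N(0, I).
A_true = np.array([[0.5, 0.1],
                   [0.0, 0.3]])
T = 400
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = A_true @ y[t - 1] + rng.normal(size=2)

def log_post(A):
    # Gaussian likelihood (unit error covariance) plus N(0, 1) priors.
    resid = y[1:] - y[:-1] @ A.T
    return -0.5 * (resid ** 2).sum() - 0.5 * (A ** 2).sum()

# Random-walk Metropolis over the four VAR coefficients.
A, lp = np.zeros((2, 2)), log_post(np.zeros((2, 2)))
draws = []
for it in range(6000):
    prop = A + 0.05 * rng.normal(size=(2, 2))
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:   # Metropolis accept/reject
        A, lp = prop, lp_prop
    if it >= 2000:                             # discard burn-in
        draws.append(A.copy())

A_mean = np.mean(draws, axis=0)
print(np.round(A_mean, 2))  # posterior mean close to A_true
```

Adding MA terms is what creates the identification problems the paper addresses: different (AR, MA) pairs can imply nearly the same likelihood, so the prior and sampler must rule out redundant representations.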
Abstract:
The implicit projection algorithm of isotropic plasticity is extended to an objective anisotropic elastic perfectly plastic model. The recursion formula developed to project the trial stress onto the yield surface is applicable to any nonlinear elastic law and any plastic yield function. A curvilinear transverse isotropic model, based on a quadratic elastic potential and on Hill's quadratic yield criterion, is then developed and implemented in a computer program for bone mechanics perspectives. The paper concludes with a numerical study of a schematic bone-prosthesis system to illustrate the potential of the model.
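The isotropic starting point, the implicit projection (return-mapping) step, can be sketched for von Mises perfect plasticity; the paper's anisotropic extension replaces the elastic law and the yield function with Hill's criterion. The material constants below are generic illustrative values, not taken from the paper.

```python
import numpy as np

def radial_return(strain_inc, stress_old, mu=80e3, kappa=160e3, sigma_y=250.0):
    """One implicit projection (return-mapping) step for isotropic,
    elastic-perfectly-plastic von Mises plasticity. 3x3 tensors; shear/bulk
    moduli and yield stress in MPa (illustrative values)."""
    I = np.eye(3)
    dev_inc = strain_inc - np.trace(strain_inc) / 3.0 * I
    # Elastic trial stress.
    stress_trial = stress_old + kappa * np.trace(strain_inc) * I + 2.0 * mu * dev_inc
    s = stress_trial - np.trace(stress_trial) / 3.0 * I       # deviatoric part
    s_norm = np.sqrt((s * s).sum())
    f_trial = s_norm - np.sqrt(2.0 / 3.0) * sigma_y           # von Mises yield function
    if f_trial <= 0.0:
        return stress_trial                                    # purely elastic step
    # Plastic correction: project the trial stress radially onto the yield surface.
    return stress_trial - f_trial * s / s_norm

# A uniaxial strain increment large enough to cause yielding.
eps = np.zeros((3, 3)); eps[0, 0] = 0.01
sig = radial_return(eps, np.zeros((3, 3)))
s_dev = sig - np.trace(sig) / 3.0 * np.eye(3)
dev_norm = float(np.sqrt((s_dev ** 2).sum()))
print(round(dev_norm, 3))  # sqrt(2/3) * sigma_y: the stress lies on the yield surface
```

For von Mises plasticity the projection is literally radial in deviatoric stress space; with an anisotropic criterion like Hill's, the projection direction follows the yield-surface normal and generally requires the iterative recursion the paper develops.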
Abstract:
Innate immune responses play a central role in neuroprotection and neurotoxicity during inflammatory processes that are triggered by pathogen-associated molecular pattern-exhibiting agents such as bacterial lipopolysaccharide (LPS) and that are modulated by inflammatory cytokines such as interferon γ (IFNγ). Recent findings describing the unexpected complexity of mammalian genomes and transcriptomes have stimulated further identification of novel transcripts involved in specific physiological and pathological processes, such as the neural innate immune response that alters the expression of many genes. We developed a system for efficient subtractive cloning that employs both sense and antisense cRNA drivers, and coupled it with in-house cDNA microarray analysis. This system enabled effective direct cloning of differentially expressed transcripts, from a small amount (0.5 µg) of total RNA. We applied this system to isolation of genes activated by LPS and IFNγ in primary-cultured cortical cells that were derived from newborn mice, to investigate the mechanisms involved in neuroprotection and neurotoxicity in maternal/perinatal infections that cause various brain injuries including periventricular leukomalacia. A number of genes involved in the immune and inflammatory response were identified, showing that neonatal neuronal/glial cells are highly responsive to LPS and IFNγ. Subsequent RNA blot analysis revealed that the identified genes were activated by LPS and IFNγ in a cooperative or distinctive manner, thereby supporting the notion that these bacterial and cellular inflammatory mediators can affect the brain through direct but complicated pathways. We also identified several novel clones of apparently non-coding RNAs that potentially harbor various regulatory functions. 
Characterization of the presently identified genes will give insights into mechanisms and interventions not only for perinatal infection-induced brain damage, but also for many other innate immunity-related brain disorders.
Abstract:
Defining an efficient training set is one of the most delicate phases for the success of remote sensing image classification routines. The complexity of the problem, the limited temporal and financial resources, as well as the high intraclass variance, can make an algorithm fail if it is trained with a suboptimal dataset. Active learning aims at building efficient training sets by iteratively improving the model performance through sampling. A user-defined heuristic ranks the unlabeled pixels according to a function of the uncertainty of their class membership, and the user is then asked to provide labels for the most uncertain pixels. This paper reviews and tests the main families of active learning algorithms: committee-based, large-margin-based, and posterior-probability-based. For each family, the most recent advances in the remote sensing community are discussed and some heuristics are detailed and tested. Several challenging remote sensing scenarios are considered, including very high spatial resolution and hyperspectral image classification. Finally, guidelines for choosing a suitable architecture are provided for new and/or inexperienced users.
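A posterior-probability heuristic, breaking ties, can be sketched with a deliberately simple class-probability model (softmax over distances to class centroids) standing in for a real classifier. The two "land-cover" classes and all data below are synthetic; the heuristic itself, querying the pixel whose two best class posteriors are closest, is standard.

```python
import numpy as np

rng = np.random.default_rng(4)

# Two synthetic "land-cover" classes as Gaussian blobs in a 2-D feature space.
pool = np.vstack([rng.normal(-1.0, 1.0, size=(100, 2)),
                  rng.normal(1.0, 1.0, size=(100, 2))])
true_labels = np.array([0] * 100 + [1] * 100)
labeled = [0, 100]                         # start with one labeled pixel per class

def posteriors(X, centroids):
    # Softmax over negative squared distances: a minimal class-probability model.
    d2 = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(-1)
    e = np.exp(-d2)
    return e / e.sum(axis=1, keepdims=True)

def centroids_from(labels_idx):
    return np.array([pool[[i for i in labels_idx if true_labels[i] == c]].mean(axis=0)
                     for c in (0, 1)])

for _ in range(10):
    p = posteriors(pool, centroids_from(labeled))
    gap = np.abs(p[:, 0] - p[:, 1])        # breaking ties: smallest posterior gap
    gap[labeled] = np.inf                  # never re-query an already labeled pixel
    labeled.append(int(np.argmin(gap)))    # the "oracle" then supplies the true label

acc = float((posteriors(pool, centroids_from(labeled)).argmax(axis=1)
             == true_labels).mean())
print(round(acc, 2))  # most pixels classified correctly after only 10 queries
```

The loop concentrates queries near the decision boundary, which is exactly why active learning reaches good accuracy with far fewer labels than random sampling.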
Abstract:
Astrocytes are now considered as key players in brain information processing because of their newly discovered roles in synapse formation and plasticity, energy metabolism and blood flow regulation. However, our understanding of astrocyte function is still fragmented compared to other brain cell types. A better appreciation of the biology of astrocytes requires the development of tools to generate animal models in which astrocyte-specific proteins and pathways can be manipulated. In addition, it is becoming increasingly evident that astrocytes are also important players in many neurological disorders. Targeted modulation of protein expression in astrocytes would be critical for the development of new therapeutic strategies. Gene transfer is valuable to target a subpopulation of cells and explore their function in experimental models. In particular, viral-mediated gene transfer provides a rapid, highly flexible and cost-effective, in vivo paradigm to study the impact of genes of interest during central nervous system development or in adult animals. We will review the different strategies that led to the recent development of efficient viral vectors that can be successfully used to selectively transduce astrocytes in the mammalian brain.
Abstract:
A family of nonempty closed convex sets is built using the data of the Generalized Nash equilibrium problem (GNEP). The sets are selected iteratively such that the intersection of the selected sets contains solutions of the GNEP. The algorithm introduced by Iusem and Sosa (2003) is adapted to obtain solutions of the GNEP. Finally, some numerical experiments are given to illustrate the numerical behavior of the algorithm.
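The underlying idea, finding a point in the intersection of convex sets, can be illustrated with cyclic Euclidean projections onto halfspaces. This is a much simpler scheme than the Iusem-Sosa method (whose sets are built from the GNEP data); the three halfspaces below are invented purely for illustration.

```python
import numpy as np

# Halfspaces H_k = {x : a_k . x <= b_k}; the target set is their intersection.
A = np.array([[ 1.0,  0.0],
              [ 0.0,  1.0],
              [-1.0, -1.0]])
b = np.array([1.0, 1.0, -1.0])             # x <= 1, y <= 1, x + y >= 1

def project_halfspace(x, a, bk):
    # Euclidean projection of x onto {z : a . z <= bk}.
    viol = a @ x - bk
    return x if viol <= 0.0 else x - viol * a / (a @ a)

x = np.array([3.0, -2.0])                  # arbitrary starting point
for _ in range(200):                       # cyclic projections
    for a, bk in zip(A, b):
        x = project_halfspace(x, a, bk)

print(np.round(x, 3), bool(np.all(A @ x <= b + 1e-6)))  # a point in the intersection
```

Projection-type methods of this kind converge to a point of the intersection whenever it is nonempty, which is why selecting sets whose intersection contains the GNEP solutions yields a solution method.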