105 results for EFFICIENT SIMULATION
Abstract:
The Pseudo-Spectral Time Domain (PSTD) method is an alternative time-marching method to classical leapfrog finite difference schemes in the simulation of wave-like propagating phenomena. It is based on the fundamentals of the Fourier transform to compute the spatial derivatives of hyperbolic differential equations. Therefore, it results in an isotropic operator that can be implemented efficiently for room acoustics simulations. However, one of the first issues to be solved consists in modeling wall absorption; unfortunately, there are no references in the technical literature concerning this problem. In this paper, assuming real and constant locally reacting impedances, several proposals to overcome this problem are presented, validated, and compared to analytical solutions in different scenarios.
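To make the core idea concrete: a minimal sketch of the FFT-based spatial derivative that underlies pseudo-spectral schemes, assuming a one-dimensional periodic grid (the function name and setup are illustrative, not from the paper).

```python
import numpy as np

def spectral_derivative(u, dx):
    """First spatial derivative of a periodic 1-D field via the FFT,
    the basic operation of pseudo-spectral (PSTD) schemes."""
    n = u.size
    k = 2.0 * np.pi * np.fft.fftfreq(n, d=dx)   # angular wavenumbers
    return np.real(np.fft.ifft(1j * k * np.fft.fft(u)))

# Sanity check: d/dx sin(x) = cos(x) on a periodic grid.
x = np.linspace(0.0, 2.0 * np.pi, 64, endpoint=False)
du = spectral_derivative(np.sin(x), x[1] - x[0])
print(np.max(np.abs(du - np.cos(x))))   # close to machine precision
```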
Abstract:
From a managerial point of view, the more efficient, simple, and parameter-free (ESP) an algorithm is, the more likely it will be used in practice for solving real-life problems. Following this principle, an ESP algorithm for solving the Permutation Flowshop Sequencing Problem (PFSP) is proposed in this article. Using an Iterated Local Search (ILS) framework, the so-called ILS-ESP algorithm is able to compete in performance with other well-known ILS-based approaches, which are considered among the most efficient algorithms for the PFSP. However, while other similar approaches still employ several parameters that can affect their performance if not properly chosen, our algorithm does not require any particular fine-tuning process since it uses basic "common sense" rules for the local search, perturbation, and acceptance criterion stages of the ILS metaheuristic. Our approach defines a new operator for the ILS perturbation process, a new acceptance criterion based on extremely simple and transparent rules, and a biased randomization process of the initial solution to randomly generate different alternative initial solutions of similar quality, which is attained by applying a biased randomization to a classical PFSP heuristic. This diversification of the initial solution aims at avoiding poorly designed starting points and, thus, allows the methodology to take advantage of current trends in parallel and distributed computing. A set of extensive tests, based on literature benchmarks, has been carried out in order to validate our algorithm and compare it against other approaches. These tests show that our parameter-free algorithm is able to compete with state-of-the-art metaheuristics for the PFSP. Also, the experiments show that, when using parallel computing, it is possible to improve the top ILS-based metaheuristic by just incorporating into it our biased randomization process with a high-quality pseudo-random number generator.
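The ILS skeleton the abstract refers to can be summarized in a few lines; this is a generic sketch, with all stage functions supplied by the caller rather than the paper's specific ILS-ESP operators.

```python
def iterated_local_search(start, local_search, perturb, accept, cost,
                          max_iters=1000):
    """Generic Iterated Local Search loop with the three stages named in
    the abstract: perturbation, local search, and acceptance criterion.
    The stage functions are placeholders, not the ILS-ESP operators."""
    best = current = local_search(start)
    for _ in range(max_iters):
        candidate = local_search(perturb(current))
        if accept(cost(candidate), cost(current)):
            current = candidate
        if cost(current) < cost(best):
            best = current
    return best
```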
Abstract:
In this paper, the core functions of an artificial intelligence (AI) for controlling a debris collector robot are designed and implemented. Using the Robot Operating System (ROS) as the basis of this work, a multi-agent system with task-planning abilities is built.
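A minimal rospy node skeleton of the kind such a controller could be built on is sketched below; the node name, topic, and control logic are assumptions for illustration, not the paper's actual interfaces.

```python
#!/usr/bin/env python
import rospy
from geometry_msgs.msg import Twist

def main():
    rospy.init_node('debris_collector_agent')        # hypothetical name
    cmd_pub = rospy.Publisher('/cmd_vel', Twist, queue_size=10)
    rate = rospy.Rate(10)                            # 10 Hz control loop
    while not rospy.is_shutdown():
        cmd = Twist()
        cmd.linear.x = 0.1   # placeholder forward velocity command
        cmd_pub.publish(cmd)
        rate.sleep()

if __name__ == '__main__':
    main()
```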
Abstract:
Scoring rules that elicit an entire belief distribution through the elicitation of point beliefs are time-consuming and demand considerable cognitive effort. Moreover, the results are valid only when agents are risk-neutral or when one uses probabilistic rules. We investigate a class of rules in which the agent has to choose an interval and is rewarded (deterministically) on the basis of the chosen interval and the realization of the random variable. We formulate an efficiency criterion for such rules and present a specific interval scoring rule. For single-peaked beliefs, our rule gives information about both the location and the dispersion of the belief distribution. These results hold for all concave utility functions.
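For intuition only, a hypothetical member of this class of rules (the paper's specific rule is not reproduced here): the reward depends solely on the chosen interval and the realization, paying a prize when the realization falls inside the interval minus a penalty proportional to its width.

```python
def interval_score(lo, hi, x, prize=1.0, width_penalty=0.5):
    """Hypothetical interval scoring rule: a deterministic reward that
    depends only on the chosen interval [lo, hi] and the realization x.
    Not the specific rule proposed in the paper."""
    hit = prize if lo <= x <= hi else 0.0     # reward for covering x
    return hit - width_penalty * (hi - lo)    # discourage wide intervals
```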
Abstract:
I show that intellectual property rights yield static efficiency gains, irrespective of their dynamic role in fostering innovation. I develop a property-rights model of firm organization with two dimensions of non-contractible investment. In equilibrium, the first best is attained if and only if ownership of tangible and intangible assets is equally protected. If IP rights are weaker, firm structure is distorted and efficiency declines: the entrepreneur must either integrate her suppliers, which prompts a decline in their investment; or else risk their defection, which entails a waste of her human capital. My model predicts greater prevalence of vertical integration where IP rights are weaker, and a switch from integration to outsourcing over the product cycle. Both empirical predictions are consistent with evidence on multinational companies. As a normative implication, I find that IP rights should be strong but narrowly defined, to protect a business without holding up its potential spin-offs.
Abstract:
We develop a general error analysis framework for the Monte Carlo simulation of densities for functionals in Wiener space. We also study variance reduction methods with the help of Malliavin derivatives. For this, we give some general heuristic principles which are applied to diffusion processes. A comparison with kernel density estimates is made.
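As a point of reference, the kernel-density benchmark the abstract mentions is easy to set up for a diffusion; the sketch below simulates Euler endpoints of a geometric Brownian motion (parameters arbitrary, not from the paper) and estimates the density of the terminal value.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

# Euler scheme for an illustrative diffusion (geometric Brownian motion).
n_paths, n_steps, T = 50_000, 100, 1.0
mu, sigma, x0 = 0.05, 0.2, 1.0
dt = T / n_steps
x = np.full(n_paths, x0)
for _ in range(n_steps):
    x += mu * x * dt + sigma * x * np.sqrt(dt) * rng.standard_normal(n_paths)

# Kernel density estimate of the law of X_T, the benchmark against
# which Malliavin-based density estimators are typically compared.
density = gaussian_kde(x)
print(density(np.array([0.9, 1.0, 1.1])))
```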
Abstract:
We study a retail benchmarking approach to determine access prices for interconnected networks. Instead of considering fixed access charges as in the existing literature, we study access pricing rules that determine the access price that network i pays to network j as a linear function of the marginal costs and the retail prices set by both networks. In the case of competition in linear prices, we show that there is a unique linear rule that implements the Ramsey outcome as the unique equilibrium, independently of the underlying demand conditions. In the case of competition in two-part tariffs, we consider a class of access pricing rules, similar to the optimal one under linear prices but based on average retail prices. We show that firms choose the variable price equal to the marginal cost under this class of rules. Therefore, the regulator (or the competition authority) can choose one among the rules to pursue additional objectives such as consumer surplus, network coverage or investment: for instance, we show that both static and dynamic efficiency can be achieved at the same time.
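Written out under hypothetical coefficient names, the class of rules described (the access price network i pays network j as a linear function of both networks' marginal costs and retail prices) takes the form

```latex
a_{ij} = \alpha_0 + \alpha_1 c_i + \alpha_2 c_j + \beta_1 p_i + \beta_2 p_j ,
```

where c_i and c_j are the marginal costs and p_i and p_j the retail prices (or, under two-part tariffs, average retail prices) of the two networks; the coefficient names are illustrative, not the paper's notation.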
Abstract:
This article investigates the main sources of heterogeneity in regional efficiency. We estimate a translog stochastic frontier production function in the analysis of Spanish regions over the period 1964-1996, in order to measure and explain changes in technical efficiency. Our results confirm that regional inefficiency is significantly and positively correlated with the ratio of public capital to private capital. The proportion of service industries in the private capital, the proportion of public capital devoted to transport infrastructures, the industrial specialization, and spatial spillovers from transport infrastructures in neighbouring regions contributed significantly to improving regional efficiency.
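For reference, a translog stochastic frontier production function has the standard form (the paper's exact regressors and inefficiency specification are not reproduced here):

```latex
\ln y_{it} = \beta_0 + \sum_{k}\beta_k \ln x_{kit}
  + \tfrac{1}{2}\sum_{k}\sum_{l}\beta_{kl}\,\ln x_{kit}\,\ln x_{lit}
  + v_{it} - u_{it},
```

where y is output, the x_k are inputs, v_{it} is random noise, and u_{it} >= 0 is the technical inefficiency term.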
Abstract:
This paper extends existing insurance results on the type of insurance contracts needed for insurance market efficiency to a dynamic setting. It introduces continuously open markets that allow for more efficient asset allocation. It also eliminates the role of preferences and endowments in the classification of risks, which is done primarily in terms of the actuarial properties of the underlying risk process. The paper further extends insurability to include correlated and catastrophic events. Under these very general conditions the paper defines a condition that determines whether a small number of standard insurance contracts (together with aggregate assets) suffice to complete markets or whether one needs to introduce such assets as mutual insurance.
Abstract:
Nominal unification is an extension of first-order unification where terms can contain binders and unification is performed modulo α-equivalence. Here we prove that the existence of nominal unifiers can be decided in quadratic time. First, we linearly reduce nominal unification problems to a sequence of freshness and equality constraints between atoms, modulo a permutation, using ideas of Paterson and Wegman for first-order unification. Second, we prove that solvability of these reduced problems can be checked in quadratic time. Finally, we point out how, using ideas of Brown and Tarjan for unbalanced merging, these reduced problems could be solved more efficiently.
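A minimal illustration of the permutation (atom-swapping) action at the heart of nominal unification; the term representation below is my own, for illustration only, not the paper's.

```python
def swap(a, b, term):
    """Apply the atom swapping (a b) to a nominal term.
    Terms are atoms (strings) or tuples of subterms; this
    representation is illustrative only."""
    if isinstance(term, str):                      # an atom
        return b if term == a else a if term == b else term
    return tuple(swap(a, b, t) for t in term)      # descend structurally

# (a b) applied to f(a, g(b, c)) yields f(b, g(a, c))
print(swap('a', 'b', ('f', 'a', ('g', 'b', 'c'))))
```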
Abstract:
The computer code system PENELOPE (version 2008) performs Monte Carlo simulation of coupled electron-photon transport in arbitrary materials for a wide energy range, from a few hundred eV to about 1 GeV. Photon transport is simulated by means of the standard, detailed simulation scheme. Electron and positron histories are generated on the basis of a mixed procedure, which combines detailed simulation of hard events with condensed simulation of soft interactions. A geometry package called PENGEOM permits the generation of random electron-photon showers in material systems consisting of homogeneous bodies limited by quadric surfaces, i.e., planes, spheres, cylinders, etc. This report is intended not only to serve as a manual of the PENELOPE code system, but also to provide the user with the necessary information to understand the details of the Monte Carlo algorithm.
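As a toy illustration of the basic Monte Carlo step in photon transport (a textbook sampling step, not PENELOPE's actual routines): free flight lengths between interactions follow the exponential attenuation law and are sampled by inversion.

```python
import numpy as np

rng = np.random.default_rng(1)

def free_path_lengths(mu, n):
    """Sample n photon free flight lengths from the exponential
    attenuation law p(s) = mu * exp(-mu * s) by inversion:
    s = -ln(1 - xi) / mu with xi uniform on [0, 1).
    A toy step for illustration, not PENELOPE's sampling code."""
    xi = rng.random(n)
    return -np.log(1.0 - xi) / mu

paths = free_path_lengths(mu=0.2, n=100_000)  # mu: attenuation coefficient
print(paths.mean())                           # ~ 1/mu, the mean free path
```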
Abstract:
Implementing a program of public subsidies for business R&D projects requires establishing a project selection system. This selection faces significant problems, such as measuring the potential returns of R&D projects and optimizing the selection process among projects with multiple and sometimes incomparable outcome measures. Public agencies mostly use the peer review method which, although it has advantages, is not free from criticism. In contrast, private firms, aiming to optimize their R&D investment, use more quantitative methods such as Data Envelopment Analysis (DEA). This paper compares the performance of the evaluators of a public agency (peer review) with an alternative project selection methodology, DEA.
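A minimal sketch of an input-oriented CCR DEA model of the kind mentioned above, solved as a linear program with scipy; the data are illustrative, not the agency's projects.

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_efficiency(X, Y, o):
    """Input-oriented CCR DEA efficiency score of unit o.
    X: (m inputs x n units), Y: (s outputs x n units).
    Decision variables: [theta, lambda_1, ..., lambda_n]."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                  # minimize theta
    A_in = np.hstack([-X[:, [o]], X])            # X @ lam <= theta * x_o
    A_out = np.hstack([np.zeros((s, 1)), -Y])    # Y @ lam >= y_o
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[:, o]]
    bounds = [(None, None)] + [(0.0, None)] * n  # lambda >= 0
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
    return res.x[0]

# Toy data: 2 inputs, 1 output, 4 decision-making units (illustrative).
X = np.array([[2.0, 4.0, 3.0, 5.0],
              [3.0, 1.0, 2.0, 4.0]])
Y = np.array([[1.0, 1.0, 1.0, 1.0]])
print([round(dea_ccr_efficiency(X, Y, o), 3) for o in range(4)])
```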