42 results for Computer Science, theory and methods
at the Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
We report on a study of nonequilibrium ordering in the reaction-diffusion lattice gas. This is a kinetic model that relaxes towards steady states under the simultaneous competition of a thermally activated creation-annihilation (reaction) process at temperature T and a diffusion process driven by a heat bath at a different temperature T′. We investigate the phase diagram as one varies T and T′, the system dimension d, the relative a priori probabilities of the two processes, and their dynamical rates. We compare mean-field theory, new Monte Carlo data, and known exact results for some limiting cases. In particular, for d=2 with Metropolis rates we find no numerical evidence of classical (Landau) critical behavior, but rather Onsager critical points and a variety of first-order phase transitions.
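As a rough illustration of how such competing dynamics are simulated, here is a minimal 2D sketch assuming an Ising-like lattice gas with Metropolis rates; the parameter names (`beta_reac`, `beta_diff`, `p_reac`) and the specific move set are illustrative choices, not taken from the paper.

```python
import numpy as np

def competing_dynamics_step(lattice, beta_reac, beta_diff, p_reac, J=1.0,
                            rng=None):
    """One update of a 2D lattice under two competing processes:
    - with probability p_reac: a creation-annihilation ("reaction") move,
      a Metropolis spin flip at inverse temperature beta_reac;
    - otherwise: a diffusion move, a nearest-neighbour exchange accepted
      with a Metropolis rate at inverse temperature beta_diff."""
    rng = rng if rng is not None else np.random.default_rng()
    L = lattice.shape[0]
    i, j = rng.integers(0, L, size=2)
    moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    # sum of the four nearest neighbours (periodic boundaries)
    field = sum(lattice[(i + di) % L, (j + dj) % L] for di, dj in moves)
    if rng.random() < p_reac:
        # reaction: flip the occupation variable at temperature 1/beta_reac
        dE = 2.0 * J * lattice[i, j] * field
        if rng.random() < min(1.0, np.exp(-beta_reac * dE)):
            lattice[i, j] = -lattice[i, j]
    else:
        # diffusion: try to exchange with a random neighbour
        di, dj = moves[rng.integers(4)]
        ni, nj = (i + di) % L, (j + dj) % L
        s1, s2 = lattice[i, j], lattice[ni, nj]
        if s1 != s2:
            nfield = sum(lattice[(ni + a) % L, (nj + b) % L]
                         for a, b in moves)
            # energy change of swapping s1 and s2 (mutual bond unchanged)
            dE = J * (s1 - s2) * ((field - s2) - (nfield - s1))
            if rng.random() < min(1.0, np.exp(-beta_diff * dE)):
                lattice[i, j], lattice[ni, nj] = s2, s1
    return lattice
```

Note that with `p_reac=0` only exchanges occur, so the particle number is conserved, while with `p_reac=1` the dynamics reduces to plain Metropolis spin flips at temperature T.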
Abstract:
Bimodal dispersal probability distributions with characteristic distances differing by several orders of magnitude have been derived and compared favorably to observations by Nathan [Nature (London) 418, 409 (2002)]. For such bimodal kernels, we show that two-dimensional molecular dynamics computer simulations are unable to yield accurate front speeds. Analytically, the usual continuous-space random walks (CSRWs) are applied to two dimensions. We also introduce discrete-space random walks and use them to check the CSRW results (because of the inefficiency of the numerical simulations). The physical results reported are shown to predict front speeds high enough to possibly explain Reid's paradox of rapid tree migration. We also show that, for a time-ordered evolution equation, fronts are always slower in two dimensions than in one dimension and that this difference is important both for unimodal and for bimodal kernels.
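To see why a small fraction of long-distance jumps changes front speeds so much, here is a deterministic 1D integro-difference sketch (not the authors' CSRW calculation; all parameter values, the growth rule, and the 0.5 front threshold are illustrative assumptions):

```python
import numpy as np

def front_speed(d_short=1, d_long=50, p_long=0.1, R0=1.5, L=4000, T=60):
    """Front speed (cells per generation) of a deterministic 1D
    integro-difference model n_{t+1} = min(1, K * (R0 * n_t)) with a
    bimodal jump kernel: a fraction 1 - p_long of offspring disperse
    d_short cells and a fraction p_long disperse d_long cells, each
    direction being equally likely."""
    kernel = np.zeros(2 * d_long + 1)
    c = d_long                               # kernel centre
    kernel[c - d_short] = kernel[c + d_short] = (1.0 - p_long) / 2.0
    kernel[c - d_long] = kernel[c + d_long] = p_long / 2.0
    n = np.zeros(L)
    n[:10] = 1.0                             # initially occupied strip
    fronts = []
    for _ in range(T):
        # one generation: growth, dispersal, then saturation at carrying capacity
        n = np.minimum(np.convolve(R0 * n, kernel, mode="same"), 1.0)
        occupied = np.where(n > 0.5)[0]
        fronts.append(occupied[-1] if occupied.size else 0)
    # asymptotic speed: slope of the front position over the last 20 generations
    return (fronts[-1] - fronts[-21]) / 20.0
```

Comparing `front_speed()` against `front_speed(p_long=0.0)` shows the bimodal kernel yielding a much faster front than the short-range kernel alone, the qualitative effect behind Reid's paradox.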
Abstract:
In this paper we propose a simple and general model for computing the Ramsey optimal inflation tax, which includes several models from the previous literature as special cases. We show that it cannot be claimed that the Friedman rule is always optimal (or always non-optimal) on theoretical grounds; whether it is optimal depends on conditions related to the shape of various relevant functions. One contribution of this paper is to relate these conditions to measurable variables such as the interest rate or the consumption elasticity of money demand. We find that it tends to be optimal to tax money when there are economies of scale in the demand for money (the scale elasticity is smaller than one) and/or when money is required for the payment of consumption or wage taxes. We find that it tends to be optimal to tax money more heavily when the interest elasticity of money demand is small. We present empirical evidence on the parameters that determine the optimal inflation tax. Calibrating the model to a variety of empirical studies yields an optimal nominal interest rate of less than 1% per year, although that finding is sensitive to the calibration.
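The conditions above refer to elasticities one can measure. As a toy numerical illustration (not the authors' model), the textbook Baumol-Tobin money demand has a consumption (scale) elasticity of exactly one half, the kind of economies of scale the paper associates with taxing money; the function names and the parameter `b` are made-up for this sketch.

```python
import numpy as np

def baumol_tobin_money_demand(c, i, b=0.1):
    """Textbook Baumol-Tobin money demand m = sqrt(b*c / (2*i)), where c is
    consumption spending, i the nominal interest rate, and b a fixed
    transaction cost. Its scale elasticity is 1/2 < 1 (economies of scale)."""
    return np.sqrt(b * c / (2.0 * i))

def elasticity(f, x, dx=1e-6):
    """Numerical elasticity d ln f / d ln x at the point x."""
    return (np.log(f(x * (1 + dx))) - np.log(f(x))) / np.log(1 + dx)
```

Here `elasticity(lambda c: baumol_tobin_money_demand(c, i=0.05), 1.0)` recovers the scale elasticity 0.5, and the interest elasticity comes out as -0.5.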
Endogenous matching in university-industry collaboration: Theory and empirical evidence from the UK
Abstract:
We develop a two-sided matching model to analyze collaboration between heterogeneous academics and firms. We predict a positive assortative matching in terms of both scientific ability and affinity for type of research, but a negative assortative matching in terms of ability on one side and affinity on the other. In addition, the most able and most applied academics and the most able and most basic firms collaborate rather than stay independent. Our predictions receive strong support from the analysis of the teams of academics and firms that propose research projects to the UK's Engineering and Physical Sciences Research Council.
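As a side illustration of the assortative-matching logic (not the authors' model), positive sorting under complementarity can be checked by brute force: with a supermodular surplus such as s(a, f) = a·f, the surplus-maximizing matching pairs like ranks with like, while flipping the sign produces negative sorting. The function names and inputs here are hypothetical.

```python
import itertools

def optimal_matching(academics, firms, surplus):
    """Brute-force the one-to-one academic-firm matching that maximizes
    total surplus (feasible only for small n). academics and firms are
    lists of ability levels; surplus(a, f) is the pair's joint payoff.
    Returns, for each academic in order, the index of the matched firm."""
    best_perm, best_val = None, float("-inf")
    for perm in itertools.permutations(range(len(firms))):
        val = sum(surplus(a, firms[p]) for a, p in zip(academics, perm))
        if val > best_val:
            best_perm, best_val = perm, val
    return list(best_perm)
```

With abilities `[1, 2, 3, 4]` on both sides and `surplus=lambda a, f: a * f`, the optimum is the identity matching (positive assortative, by the rearrangement inequality); with `surplus=lambda a, f: -a * f` it is the reversed matching.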
Abstract:
The self-intermediate dynamic structure factor Fs(k,t) of liquid lithium near the melting temperature is calculated by molecular dynamics. The results are compared with the predictions of several theoretical approaches, paying special attention to the Lovesey model and the Wahnström and Sjögren mode-coupling theory. To this end the results for the Fs(k,t) second memory function predicted by both models are compared with the ones calculated from the simulations.
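For readers unfamiliar with the quantity, Fs(k,t) is typically estimated from an MD trajectory as an average over particles and time origins. The sketch below shows one common way to do this (not the authors' code; the array layout and averaging choices are illustrative):

```python
import numpy as np

def self_intermediate_sf(traj, k, n_origins=50):
    """Estimate the self-intermediate scattering function
        Fs(k, t) = < exp(i k . [r_j(t0 + t) - r_j(t0)]) >,
    averaged over particles j and time origins t0, from an unwrapped
    trajectory `traj` of shape (n_frames, n_particles, 3); k is a 3-vector."""
    k = np.asarray(k, dtype=float)
    n_frames = traj.shape[0]
    max_lag = n_frames // 2
    origins = np.linspace(0, n_frames - max_lag - 1, n_origins, dtype=int)
    fs = np.zeros(max_lag)
    for t0 in origins:
        disp = traj[t0:t0 + max_lag] - traj[t0]  # displacements, (max_lag, N, 3)
        phase = disp @ k                          # k . dr, shape (max_lag, N)
        fs += np.cos(phase).mean(axis=1)          # real part of the average
    return fs / len(origins)
```

By construction Fs(k,0) = 1, and for free Brownian motion it decays as exp(-k²Dt), which makes random-walk trajectories a convenient sanity check.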
Abstract:
A Fundamentals of Computing Theory course involves different topics that are core to the Computer Science curricula and whose level of abstraction makes them difficult both to teach and to learn. Such difficulty stems from the complexity of the abstract notions involved and the required mathematical background. Surveys conducted among our students showed that many of them were applying some theoretical concepts mechanically rather than developing significant learning. This paper shows a number of didactic strategies that we introduced in the Fundamentals of Computing Theory curricula to cope with the above problem. The proposed strategies were based on a stronger use of technology and a constructivist approach. The final goal was to promote more significant learning of the course topics.
Abstract:
This paper surveys the recent literature on convergence across countries and regions. I discuss the main convergence and divergence mechanisms identified in the literature and develop a simple model that illustrates their implications for income dynamics. I then review the existing empirical evidence and discuss its theoretical implications. Early optimism concerning the ability of a human capital-augmented neoclassical model to explain productivity differences across economies has been questioned on the basis of more recent contributions that make use of panel data techniques and obtain theoretically implausible results. Some recent research in this area tries to reconcile these findings with sensible theoretical models by exploring the role of alternative convergence mechanisms and the possible shortcomings of panel data techniques for convergence analysis.
Abstract:
We study the process by which subordinated regions of a country can obtain a more favourable political status. In our theoretical model a dominant and a dominated region first interact through a voting process that can lead to different degrees of autonomy. If this process fails then both regions engage in a costly political conflict which can only lead to the maintenance of the initial subordination of the region in question or to its complete independence. In the subgame-perfect equilibrium the voting process always leads to an intermediate arrangement acceptable for both parts. Hence, the costly political struggle never occurs. In contrast, in our experiments we observe a large amount of fighting involving high material losses, even in a case in which the possibilities for an arrangement without conflict are very salient. In our experimental environment intermediate solutions are feasible and stable, but purely emotional elements prevent them from being reached.
Abstract:
Is there a link between decentralized governance and conflict prevention? This article tries to answer the question by presenting the state of the art of the intersection of both concepts. Provided that social conflict is inevitable and given the appearance of new threats and types of violence, as well as new demands for security based on people (human security), our societies should focus on promoting peaceful change. Through an extensive analysis of the existing literature and the study of several cases, this paper suggests that decentralized governance can contribute to these efforts by transforming conflicts and bringing about power-sharing and incentives for the inclusion of minority groups. Despite the complexity of assessing its impact on conflict prevention, it can be contended that decentralized governance might have very positive effects on the reduction of the causes that bring about conflicts, owing to its ability to foster the creation of war/violence preventers. More specifically, this paper argues that decentralization can have a positive impact on the so-called triggers and accelerators (short- and medium-term causes).
Abstract:
Descriptive set theory is mainly concerned with studying subsets of the space of all countable binary sequences. In this paper we study the generalization where countable is replaced by uncountable. We explore properties of generalized Baire and Cantor spaces, equivalence relations, and their Borel reducibility. The study shows that descriptive set theory looks very different in this generalized setting compared to the classical, countable case. We also draw the connection between the stability-theoretic complexity of first-order theories and the descriptive set-theoretic complexity of their isomorphism relations. Our results suggest that Borel reducibility on uncountable structures is a model-theoretically natural way to compare the complexity of isomorphism relations.
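For readers outside the area, the central comparison notion can be stated compactly (this is the standard definition of Borel reducibility, not a result specific to this paper):

```latex
% Borel reducibility of equivalence relations E on X and F on Y:
E \le_B F \iff \exists\, f : X \to Y \text{ Borel such that }
\forall x_1, x_2 \in X:\quad x_1 \mathrel{E} x_2 \iff f(x_1) \mathrel{F} f(x_2).
```

Intuitively, E ≤_B F says that classifying structures up to E is no harder than classifying them up to F.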
Abstract:
This paper shows how instructors can use the problem-based learning method to introduce producer theory and market structure in intermediate microeconomics courses. The paper proposes a framework where different decision problems are presented to students, who are asked to imagine that they are the managers of a firm and need to solve a problem in a particular business setting. In this setting, the instructors' role is to provide both guidance to facilitate student learning and content knowledge on a just-in-time basis.
Abstract:
Some considerations on the application of fuzzy set theory to quantum chemistry are briefly discussed. It is shown here that many chemical concepts associated with the theory are well suited to be connected with the structure of fuzzy sets. It is also explained how some theoretical descriptions of quantum observables are enhanced by treating them with the tools associated with fuzzy sets. The density function is taken as an example of the use of possibility distributions alongside quantum probability distributions.
Abstract:
The biplot has proved to be a powerful descriptive and analytical tool in many areas of applications of statistics. For compositional data the necessary theoretical adaptation has been provided, with illustrative applications, by Aitchison (1990) and Aitchison and Greenacre (2002). These papers were restricted to the interpretation of simple compositional data sets. In many situations the problem has to be described in some form of conditional modelling. For example, in a clinical trial where interest is in how patients' steroid metabolite compositions may change as a result of different treatment regimes, interest is in relating the compositions after treatment to the compositions before treatment and the nature of the treatments applied. To study this through a biplot technique requires the development of some form of conditional compositional biplot. This is the purpose of this paper. We choose as a motivating application an analysis of the 1992 US Presidential Election, where interest may be in how the three-part composition, the percentage division among the three candidates - Bush, Clinton and Perot - of the presidential vote in each state, depends on the ethnic composition and on the urban-rural composition of the state. The methodology of conditional compositional biplots is first developed and a detailed interpretation of the 1992 US Presidential Election provided. We use a second application involving the conditional variability of tektite mineral compositions with respect to major oxide compositions to demonstrate some hazards of simplistic interpretation of biplots. Finally we conjecture on further possible applications of conditional compositional biplots.
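The paper's conditional biplot builds on the unconditional compositional biplot of Aitchison and Greenacre. A minimal sketch of that starting point, assuming the centred log-ratio (clr) transform followed by an SVD (the scaling conventions here are one common choice, not necessarily the paper's):

```python
import numpy as np

def clr(X):
    """Centred log-ratio transform of compositions (rows are compositions
    with strictly positive parts)."""
    logX = np.log(X)
    return logX - logX.mean(axis=1, keepdims=True)

def compositional_biplot(X):
    """Two-dimensional biplot coordinates for compositional data:
    SVD of the column-centred clr matrix. Returns row (sample) scores
    and column (part) loadings whose product reconstructs the rank-2
    approximation of the centred clr data."""
    Z = clr(X)
    Z = Z - Z.mean(axis=0)                      # centre over samples
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    n = X.shape[0]
    scores = U[:, :2] * np.sqrt(n - 1)              # standardized row markers
    loadings = (Vt[:2].T * s[:2]) / np.sqrt(n - 1)  # covariance column markers
    return scores, loadings
```

For three-part compositions, such as the Bush/Clinton/Perot vote shares in the paper's example, the centred clr matrix has rank at most two, so the two biplot axes reproduce it exactly.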