146 results for Random noise theory


Relevance: 20.00%

Abstract:

We study the concept of propagation connectivity on random 3-uniform hypergraphs. This concept is inspired by a simple linear time algorithm for solving instances of certain constraint satisfaction problems. We derive upper and lower bounds for the propagation connectivity threshold, and point out some algorithmic implications.
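As a rough illustration of the propagation rule underlying this connectivity notion: the standard definition marks a starting pair of vertices and, whenever a hyperedge has two marked vertices, marks the third. The brute-force fixpoint sketch below follows that definition only for illustration; the paper's algorithm achieves linear time with an index over vertex pairs, which this version omits.

```python
from itertools import combinations

def propagates(n, edges, start_pair):
    """Fixpoint of the rule: if a 3-edge has two marked vertices,
    mark the third.  Returns True if every vertex ends up marked."""
    marked = set(start_pair)
    changed = True
    while changed:
        changed = False
        for e in edges:
            unmarked = [v for v in e if v not in marked]
            if len(unmarked) == 1:
                marked.add(unmarked[0])
                changed = True
    return len(marked) == n

def propagation_connected(n, edges):
    """A 3-uniform hypergraph on n vertices is propagation connected
    if some starting pair propagates to all vertices."""
    return any(propagates(n, edges, p) for p in combinations(range(n), 2))
```

For example, with edges {0,1,2} and {1,2,3} on four vertices, the pair (0,1) marks vertex 2 and then vertex 3, so the hypergraph is propagation connected; dropping the second edge leaves vertex 3 unreachable from every starting pair.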

Relevance: 20.00%

Abstract:

The project consisted of automatically generating statistical noise charts for Europe with Open Source technologies inside the Noise Map Viewer for Europe of ETC-LUSI. The library used for this process was JFreeChart, and the programming language was Java (object-oriented programming) within the Eclipse integrated development environment. The database used was PostgreSQL. Apache (HTTP server) and Tomcat (application-container server) were used as servers. Once the process was completed, it was integrated into MapFish by modifying the corresponding JavaScript code of the original website.

Relevance: 20.00%

Abstract:

Is there a link between decentralized governance and conflict prevention? This article tries to answer the question by presenting the state of the art at the intersection of both concepts. Given that social conflict is inevitable, and in view of the appearance of new threats and types of violence, as well as new demands for security centred on people (human security), our societies should focus on promoting peaceful change. Through an extensive analysis of the existing literature and the study of several cases, this paper suggests that decentralized governance can contribute to these efforts by transforming conflicts and by creating power-sharing arrangements and incentives for the inclusion of minority groups. Despite the complexity of assessing its impact on conflict prevention, it can be contended that decentralized governance may have very positive effects on reducing the causes that bring about conflicts, owing to its ability to foster the creation of war/violence preventors. More specifically, this paper argues that decentralization can have a positive impact on the so-called triggers and accelerators (short- and medium-term causes).

Relevance: 20.00%

Abstract:

This article addresses the normative dilemma located within the application of 'securitization' as a method of understanding the social construction of threats and security policies. Securitization as a theoretical and practical undertaking is being used increasingly by scholars and practitioners. This article aims to provide those wishing to engage with securitization with an alternative application of the theory, one which is sensitive to, and self-reflective of, the possible normative consequences of its employment. It argues that discussing and analyzing securitization processes has normative implications, understood here as the negative securitization of a referent. The negative securitization of a referent is asserted to be carried out through the unchallenged analysis of securitization processes that have emerged through relations of exclusion and power. The article then offers a critical understanding and application of securitization studies as a way of overcoming the identified normative dilemma. First, it examines how the Copenhagen School's formulation of securitization theory gives rise to a normative dilemma, which is situated in the performative and symbolic power of security as a political invocation and theoretical concept. Second, it evaluates previous attempts to overcome the normative dilemma of securitization studies, outlining the obstacles that each individual proposal faces. Third, it argues that the normative dilemma of applying securitization can be avoided, first, by deconstructing the institutional power of security actors and dominant security subjectivities and, second, by addressing countering or alternative approaches to security and incorporating different security subjectivities. Examples of the securitization of international terrorism and immigration are used throughout.

Relevance: 20.00%

Abstract:

The General Theory of Relativity predicts that a massive object subjected to a certain acceleration under certain conditions must emit gravitational waves. These waves are highly energetic but interact with matter extremely weakly, and their emission points are very distant, which makes their detection an extraordinarily difficult task. Consequently, detecting these waves is believed to be much more feasible using instruments placed in space. The LISA (Laser Interferometer Space Antenna) mission was conceived with this goal: a joint mission between NASA and ESA with a planned launch in 2020-2025. Given the risks involved in first flying untested technology, together with the high economic cost of the LISA mission, a precursor mission will carry very advanced instruments: the LTP (LISA Technology Package), developed by the European Union, which will test LISA's technology, and the Drag-Free flying system, which will test a set of thrusters used for attitude and position control of the satellite with nanometre precision. In particular, the LTP consists of two test masses separated by 35 centimetres and a laser interferometer that measures the variation of the relative distance between them. In this way, the LTP will measure the performance of the equipment and the possible interferences that affect the measurement. The noise sources include, among others, solar wind and radiation pressure, electrostatic charges, thermal gradients, voltage fluctuations and internal forces. One of the possible noise sources is the object of study of this doctoral thesis project: the presence of magnetic fields inside the LTP, which exert a force on the test masses, their estimation and their control, taking into account the magnetic characteristics of the experiment and the dynamics of the satellite.

Relevance: 20.00%

Abstract:

I study large random assignment economies with a continuum of agents and a finite number of object types. I consider the existence of weak priorities discriminating among agents with respect to their rights concerning the final assignment. The respect for priorities ex ante (ex-ante stability) usually precludes ex-ante envy-freeness. Therefore I define a new concept of fairness, called no unjustified lower chances: priorities with respect to one object type cannot justify different achievable chances regarding another object type. This concept, which applies to the assignment mechanism rather than to the assignment itself, implies ex-ante envy-freeness among agents of the same priority type. I propose a variation of Hylland and Zeckhauser's (1979) pseudomarket that meets ex-ante stability, no unjustified lower chances and ex-ante efficiency among agents of the same priority type. Assuming enough richness in preferences and priorities, the converse is also true: any random assignment with these properties could be achieved through an equilibrium in a pseudomarket with priorities. If priorities are acyclical (the ordering of agents is the same for each object type), this pseudomarket achieves ex-ante efficient random assignments.

Relevance: 20.00%

Abstract:

In this paper we prove a formula for the analytic index of a basic Dirac-type operator on a Riemannian foliation, solving a problem that has been open for many years. We also consider more general indices given by twisting the basic Dirac operator by a representation of the orthogonal group. The formula is a sum of integrals over blowups of the strata of the foliation and also involves eta invariants of associated elliptic operators. As a special case, a Gauss-Bonnet formula for the basic Euler characteristic is obtained using two independent proofs.

Relevance: 20.00%

Abstract:

Descriptive set theory is mainly concerned with studying subsets of the space of all countable binary sequences. In this paper we study the generalization where countable is replaced by uncountable. We explore properties of generalized Baire and Cantor spaces, equivalence relations and their Borel reducibility. The study shows that descriptive set theory looks very different in this generalized setting compared to the classical, countable case. We also draw the connection between the stability-theoretic complexity of first-order theories and the descriptive set-theoretic complexity of their isomorphism relations. Our results suggest that Borel reducibility on uncountable structures is a model-theoretically natural way to compare the complexity of isomorphism relations.

Relevance: 20.00%

Abstract:

We give the first systematic study of strong isomorphism reductions, a notion of reduction more appropriate than polynomial-time reduction when, for example, comparing the computational complexity of the isomorphism problem for different classes of structures. We show that the partial ordering of its degrees is quite rich. We analyze its relationship to a further type of reduction between classes of structures based on purely comparing, for every n, the number of nonisomorphic structures of cardinality at most n in both classes. Furthermore, in a more general setting we address the question of the existence of a maximal element in the partial ordering of the degrees.

Relevance: 20.00%

Abstract:

Vintage capital growth models were at the heart of growth theory in the 1960s. This research line collapsed in the late 1960s with the so-called embodiment controversy and the technical sophistication of the vintage models. This paper analyzes the astonishing revival of this literature in the 1990s. In particular, it outlines three methodological breakthroughs explaining this resurgence: a growth accounting revolution, taking advantage of the availability of new time series; an optimal control revolution, making it possible to safely study vintage capital optimal growth models; and a vintage human capital revolution which, along with the rise of economic demography, accounts for the vintage structure of human capital similarly to the age structuring of physical capital. The related literature is surveyed.

Relevance: 20.00%

Abstract:

This article analyzes empirically the main existing theories on income and population city growth: increasing returns to scale, locational fundamentals, and random growth. To do this, we implement a threshold nonlinearity test that extends standard linear growth regression models to a dataset of urban, climatological, and macroeconomic variables on 1,175 U.S. cities. Our analysis reveals the existence of increasing returns when per-capita income levels are beyond $19,264. Despite this, income growth is mostly explained by social and locational fundamentals. Population growth also exhibits two distinct equilibria, determined by a threshold value of 116,300 inhabitants beyond which city population grows at a higher rate. Income and population growth do not go hand in hand, implying an optimal level of population beyond which income growth stagnates or deteriorates.
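The threshold idea can be illustrated with a toy two-regime split: grid-search the value of the explanatory variable that best divides the sample into two regimes. This is only a sketch of the mechanics under a constant-mean-per-regime assumption, not the formal threshold nonlinearity test used in the article; the function name and grid search are illustrative.

```python
import numpy as np

def two_regime_split(x, y):
    """Grid-search the threshold in x that best splits y into two
    constant-mean regimes (minimal total squared error)."""
    order = np.argsort(x)
    xs, ys = x[order], y[order]
    best_sse, best_threshold = np.inf, None
    for i in range(1, len(xs)):            # candidate split after i-th point
        lo, hi = ys[:i], ys[i:]
        sse = ((lo - lo.mean()) ** 2).sum() + ((hi - hi.mean()) ** 2).sum()
        if sse < best_sse:
            best_sse, best_threshold = sse, xs[i - 1]   # last x in low regime
    return best_threshold
```

On data with a clean jump in the outcome, the recovered threshold is the last observation of the low regime; a full test would additionally assess the statistical significance of the regime difference.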

Relevance: 20.00%

Abstract:

After a historical survey of temperament in the Well-Tempered Clavier by Johann Sebastian Bach, an analysis of the work has been made by applying a number of historical good temperaments as well as some recent proposals. The results obtained show that the global dissonance for all preludes and fugues in major keys can be minimized using the Kirnberger II temperament. The method of analysis used for this research is based on the mathematical theories of sensory dissonance developed by authors such as Hermann Ludwig Ferdinand von Helmholtz, Harry Partch, Reinier Plomp, Willem J. M. Levelt, and William A. Sethares.
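The Plomp-Levelt sensory-dissonance curve, in Sethares' commonly used parametrization, can be sketched as follows. The parameter values are the illustrative ones from that parametrization, not figures taken from this study; summing the curve over all pairs of partials of a chord is the usual way such "global dissonance" measures are built.

```python
import math

# Illustrative constants from Sethares' fit of the Plomp--Levelt curve
B1, B2, XSTAR, S1, S2 = 3.5, 5.75, 0.24, 0.0207, 18.96

def pair_dissonance(f1, a1, f2, a2):
    """Sensory dissonance of two sine partials (frequency, amplitude)."""
    if f2 < f1:
        f1, a1, f2, a2 = f2, a2, f1, a1
    s = XSTAR / (S1 * f1 + S2)          # critical-bandwidth scaling
    d = f2 - f1
    return a1 * a2 * (math.exp(-B1 * s * d) - math.exp(-B2 * s * d))

def total_dissonance(partials):
    """Sum pairwise dissonance over all partials of a chord."""
    return sum(pair_dissonance(f1, a1, f2, a2)
               for i, (f1, a1) in enumerate(partials)
               for (f2, a2) in partials[i + 1:])
```

A unison scores zero, a rough semitone-like interval scores high, and widely separated partials score near zero, matching the familiar shape of the dissonance curve.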

Relevance: 20.00%

Abstract:

This paper studies frequent monitoring in an infinitely repeated game with imperfect public information and discounting, where players observe the state of a continuous-time Brownian process at moments in time of length Δ. It shows that a limit folk theorem can be achieved with imperfect public monitoring when players monitor each other at the highest frequency, i.e., as Δ → 0. The approach assumes that the expected joint output depends exclusively on the action profile simultaneously and privately decided by the players at the beginning of each period of the game, but not on Δ. The strong decreasing effect on the expected immediate gains from deviation when the interval between actions shrinks, together with the associated increased precision of the public signals, makes the result possible in the limit. JEL: C72/73, D82, L20. KEYWORDS: Repeated Games, Frequent Monitoring, Public Monitoring, Brownian Motion.

Relevance: 20.00%

Abstract:

A parts-based model is a parametrization of an object class using a collection of landmarks following the object structure. The matching of parts-based models is one of the problems where pairwise Conditional Random Fields have been successfully applied. The main reason for their effectiveness is tractable inference and learning, due to the simplicity of the involved graphs, usually trees. However, these models do not consider possible patterns of statistics among sets of landmarks, and thus they suffer from using overly myopic information. To overcome this limitation, we propose a novel structure based on hierarchical Conditional Random Fields, which we explain in the first part of this thesis. We build a hierarchy of combinations of landmarks, where matching is performed taking the whole hierarchy into account. To preserve tractable inference, we effectively sample the label set. We test our method on facial feature selection and human pose estimation on two challenging datasets: Buffy and MultiPIE. In the second part of this thesis, we present a novel approach to multiple kernel combination that relies on stacked classification. This method can be used to evaluate the landmarks of the parts-based model approach. Our method is based on combining the responses of a set of independent classifiers for each individual kernel. Unlike earlier approaches that linearly combine kernel responses, our approach uses them as inputs to another set of classifiers. We show that we outperform state-of-the-art methods on most of the standard benchmark datasets.
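The stacked kernel-combination idea described in the second part can be sketched as follows. Everything here is illustrative rather than the author's implementation: a nearest-centroid scorer stands in for a per-kernel base classifier, each feature view plays the role of one kernel, and the meta classifier is again a nearest-centroid model trained on the stacked base scores instead of a fixed linear combination of kernel responses.

```python
import numpy as np

def centroid_scores(X_train, y_train, X):
    """Signed score of a nearest-centroid base classifier:
    positive means X is closer to the class-1 centroid."""
    c0 = X_train[y_train == 0].mean(axis=0)
    c1 = X_train[y_train == 1].mean(axis=0)
    return np.linalg.norm(X - c0, axis=1) - np.linalg.norm(X - c1, axis=1)

def stacked_predict(views_train, y_train, views_test):
    """Stage 1: one base classifier per feature view ('kernel') emits a
    score.  Stage 2: a meta classifier on the stacked score vectors
    combines them, instead of a fixed linear kernel combination."""
    meta_train = np.column_stack([centroid_scores(Xtr, y_train, Xtr)
                                  for Xtr in views_train])
    meta_test = np.column_stack([centroid_scores(Xtr, y_train, Xte)
                                 for Xtr, Xte in zip(views_train, views_test)])
    return (centroid_scores(meta_train, y_train, meta_test) > 0).astype(int)
```

In practice the base scores for the meta training set would be produced with cross-validation to avoid overfitting; that step is omitted here to keep the sketch short.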