136 results for Critical coupling parameter
Abstract:
We analyze the two-dimensional parabolic-elliptic Patlak-Keller-Segel model in the whole Euclidean space R^2. Under the hypotheses of integrable initial data with finite second moment and entropy, we first show local-in-time existence for any mass of "free-energy solutions", namely weak solutions satisfying suitable free-energy estimates. We also prove that the solution exists as long as the entropy is controlled from above. The main result of the paper is the global existence of free-energy solutions, with initial data as above, for the critical mass 8π/χ. We then prove that these solutions blow up as a Dirac delta at the center of mass as t→∞, while their second moment remains constant for all time. Furthermore, all moments of order larger than 2 blow up as t→∞ if they are initially bounded.
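For reference, the parabolic-elliptic Patlak-Keller-Segel system discussed above is commonly written (with one standard normalization of the chemotactic sensitivity χ, which may differ from the paper's own convention) as
\[
\partial_t \rho = \Delta \rho - \chi\, \nabla\!\cdot(\rho\, \nabla c), \qquad -\Delta c = \rho, \qquad x \in \mathbb{R}^2,\ t>0,
\]
so that c = -\frac{1}{2\pi}\log|\cdot| * \rho. With this normalization the critical mass separating global existence from finite-time blow-up is M_c = 8\pi/\chi, the borderline value considered in the abstract.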
Abstract:
Variational steepest-descent approximation schemes for the modified Patlak-Keller-Segel equation with a logarithmic interaction kernel in any dimension are considered. We prove the convergence of a suitably time-interpolated implicit Euler scheme, defined in terms of the Euclidean Wasserstein distance and associated with this equation, for sub-critical masses. As a consequence, we recover the recent result on the global-in-time existence of weak solutions to the modified Patlak-Keller-Segel equation with logarithmic interaction kernel in any dimension in the sub-critical case. Moreover, we show how this method performs numerically in one dimension. In this particular case, the numerical scheme corresponds to a standard implicit Euler method for the pseudo-inverse of the cumulative distribution function. We demonstrate its ability to reproduce the blow-up of solutions for super-critical masses easily and without the need for mesh refinement.
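The variational implicit Euler (JKO-type) scheme referred to above can be sketched as follows; the precise free energy and its normalization constants are those of the paper and are only indicated schematically here. Given a time step τ > 0 and the iterate ρ^k, one sets
\[
\rho^{k+1} \in \operatorname*{argmin}_{\rho} \left\{ \frac{1}{2\tau}\, W_2^2(\rho, \rho^k) + \mathcal{F}[\rho] \right\},
\qquad
\mathcal{F}[\rho] = \int \rho \log \rho \, dx + \frac{\chi}{2} \iint \rho(x)\,\log|x-y|\,\rho(y)\, dx\, dy,
\]
where W_2 is the Euclidean Wasserstein distance; the suitably interpolated-in-time curve built from these iterates converges to a weak solution as τ → 0 in the sub-critical regime.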
Abstract:
"See the abstract at the beginning of the document in the attached file."
Abstract:
The appeal to ideas as causal variables and/or constitutive features of political processes increasingly characterises political analysis. Yet, perhaps because of the pace of this ideational intrusion, too often ideas have simply been grafted onto pre-existing explanatory theories at precisely the point at which those theories seem to get into difficulties, with little or no consideration either of the status of such ideational variables or of the character or consistency of the resulting theoretical hybrid. This is particularly problematic, for ideas are far from innocent variables and can rarely, if ever, be incorporated seamlessly within existing explanatory and/or constitutive theories without ontological and epistemological consequence. We contend that this tendency, along with the limitations of the prevailing Humean conception of causality and the associated epistemological polemic between causal and constitutive logics, continues to plague almost all of the literature that strives to accord an explanatory role to ideas. In trying to move beyond the current vogue for epistemological polemic, we argue that the incommensurability thesis between causal and constitutive logics is only credible in the context of a narrow, Humean, conception of causation. If we reject this in favour of a more inclusive (and ontologically realist) understanding, then it is perfectly possible to chart the causal significance of constitutive processes and reconstrue the explanatory role of ideas as causally constitutive.
Abstract:
"See the abstract at the beginning of the document in the attached file."
Abstract:
The main purpose of this work is to survey the main monotonicity properties of queueing processes based on the coupling method. The literature on this topic is quite extensive, and we do not consider all aspects of it. Our more concrete goal is to select the most interesting basic monotonicity results and give simple and elegant proofs. We also give new (or revised) proofs of a few important monotonicity properties of the queue-size and workload processes, both in single-server and multi-server systems. The paper is organized as follows. In Section 1, the basic notions and results on the coupling method are given. Section 2 contains known coupling results for renewal processes, with a focus on the construction of synchronized renewal instants for a superposition of independent renewal processes. In Section 3, we present basic monotonicity results for the queue-size and workload processes. We consider both discrete- and continuous-time queueing systems with single and multiple servers. Less-known results on the monotonicity of queueing processes with dependent service times and interarrival times are also presented. Section 4 is devoted to the monotonicity of general Jackson-type queueing networks with Markovian routing. This section is based on the notable paper [17]. Finally, Section 5 contains elements of the stability analysis of regenerative queues and networks, where coupling and monotonicity results play a crucial role in establishing minimal sufficient stability conditions. In addition, we present some new monotonicity results for tandem networks.
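As a toy illustration of the kind of coupling argument behind such monotonicity results (a generic sketch, not code from the paper; the distributions and parameters are arbitrary choices), the following Python snippet drives two single-server FIFO queues with the same interarrival and service times. Under this coupling the Lindley recursion preserves the ordering of the initial workloads pathwise.

```python
import random

def lindley_path(w0, services, interarrivals):
    """Workload path of a single-server FIFO queue via the Lindley
    recursion W_{n+1} = max(W_n + S_n - T_n, 0)."""
    w, path = w0, [w0]
    for s, t in zip(services, interarrivals):
        w = max(w + s - t, 0.0)
        path.append(w)
    return path

random.seed(42)
n = 10_000
# Common driving sequences for both queues: this is the coupling.
services = [random.expovariate(1.2) for _ in range(n)]       # service rate 1.2
interarrivals = [random.expovariate(1.0) for _ in range(n)]  # arrival rate 1.0

low = lindley_path(0.0, services, interarrivals)   # starts empty
high = lindley_path(5.0, services, interarrivals)  # starts with backlog 5

# Pathwise monotonicity in the initial state under the coupling:
print("ordering preserved at every step:",
      all(a <= b for a, b in zip(low, high)))
```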
Abstract:
Research project carried out during a stay at the University of Nottingham, Great Britain, between March and April 2007. This work focused on the application of compounds derived from D-(+)-glucose, D-(+)-fructose and D-galactose as ligands for chiral homogeneous catalysts in two asymmetric reactions: nickel-catalyzed 1,2-addition to aldehydes and copper-catalyzed conjugate 1,4-addition (see the figure attached at the end of the document). First, the application of compounds L1-L6 to the nickel-catalyzed 1,2-addition to aldehydes was studied. It was observed that the selectivity of the process depends mainly on the functional group attached to the ligand backbone, on the steric properties of the substituent on the oxazoline function, and on the structure of the substrate. Enantiomeric excesses of up to 59% were obtained using the catalyst precursor containing ligand L3a. Secondly, this work describes the application of the three families of compounds (L1-L11) as ligands in the copper-catalyzed 1,4-addition of organometallic reagents to enones with different steric properties. The libraries of phosphite-oxazoline (L1-L5) and phosphite-phosphoramidite (L6) compounds provided good enantioselectivities (up to 80%) in the addition of trialkylaluminium reagents to various enones. In contrast, the library of monophosphite compounds (L7-L11) showed good activities but enantioselectivities of only up to 57%.
Abstract:
The empirical finding of an inverse U-shaped relationship between per capita income and pollution, the so-called Environmental Kuznets Curve (EKC), suggests that as countries experience economic growth, environmental deterioration decelerates and thus becomes less of an issue. Focusing on the prime example of carbon emissions, the present article provides a critical review of the new econometric techniques that have questioned the baseline polynomial specification in the EKC literature. We discuss issues related to the functional form, heterogeneity, "spurious" regressions and spatial dependence to address whether and to what extent the EKC can be observed. Despite these new approaches, there is still no clear-cut evidence supporting the existence of the EKC for carbon emissions. JEL classifications: C20; Q32; Q50; O13. Keywords: Environmental Kuznets Curve; Carbon emissions; Functional form; Heterogeneity; "Spurious" regressions; Spatial dependence.
Abstract:
Residential satisfaction is often used as a barometer to assess the performance of public policy and of programmes designed to raise individuals' well-being. However, the fact that responses elicited from residents might be biased by subjective, non-observable factors casts doubt on whether these responses can be taken as trustworthy indicators of individuals' housing situation. Emotional factors such as aspirations or expectations might affect individuals' cognitions of their true residential situation. To disentangle this puzzle, we investigated whether identical residential attributes can be perceived differently depending on tenure status. Our results indicate that tenure status is crucial not only in determining the level of housing satisfaction, but also in how dwellers perceive their housing characteristics. Keywords: Housing satisfaction, subjective well-being, homeownership. JEL classification: D1, R2.
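For context on the "baseline polynomial specification" questioned in the EKC abstract above (the exact variables and controls vary by study), the reduced form commonly estimated in this literature is of the type
\[
\ln E_{it} = \alpha_i + \beta_1 \ln y_{it} + \beta_2 (\ln y_{it})^2 + \varepsilon_{it},
\]
where E_{it} denotes per capita emissions and y_{it} per capita income; an inverse-U shape requires β_1 > 0 and β_2 < 0, with turning point y^* = \exp(-\beta_1 / (2\beta_2)).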
Abstract:
When dealing with sustainability we are concerned with the biophysical as well as the monetary aspects of economic and ecological interactions. This multidimensional approach requires that special attention be given to dimensional issues in relation to curve-fitting practice in economics. Unfortunately, many empirical and theoretical studies in economics, as well as in ecological economics, apply dimensional numbers in exponential or logarithmic functions. We show that it is an analytical error to put a dimensional quantity x into exponential functions (a^x) or logarithmic functions (log_a x). Secondly, we investigate the conditions on data sets under which a particular logarithmic specification is superior to the usual regression specification. This analysis shows that the superiority of the logarithmic specification in terms of the least-squares norm depends heavily on the available data set. The last section deals with economists' "curve-fitting fetishism". We propose that a distinction be made between curve fitting over past observations and the development of a theoretical or empirical law capable of maintaining its fitting power for any future observations. Finally, we conclude the paper with several epistemological issues in relation to dimensions and curve-fitting practice in economics.
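A standard way to see the dimensional point made above (not necessarily the paper's own derivation): expanding the exponential as a power series,
\[
a^{x} = e^{x \ln a} = \sum_{k \ge 0} \frac{(x \ln a)^{k}}{k!},
\]
would require adding terms of different physical dimensions whenever x carries a unit, which is meaningless; the argument must first be made dimensionless, for instance by writing \log(x/x_0) for some reference quantity x_0 of the same dimension as x.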
Abstract:
This article addresses the normative dilemma located within the application of 'securitization' as a method of understanding the social construction of threats and security policies. Securitization as a theoretical and practical undertaking is being increasingly used by scholars and practitioners. This scholarly endeavour seeks to provide those wishing to engage with securitization with an alternative application of this theory, one which is sensitive to and self-reflective of the possible normative consequences of its employment. This article argues that discussing and analyzing securitization processes has normative implications, understood here as the negative securitization of a referent. The negative securitization of a referent is asserted to be carried out through the unchallenged analysis of securitization processes that have emerged through relations of exclusion and power. The article then offers a critical understanding and application of securitization studies as a way of overcoming the identified normative dilemma. First, it examines how the Copenhagen School's formation of securitization theory gives rise to a normative dilemma, which is situated in the performative and symbolic power of security as a political invocation and theoretical concept. Second, it evaluates previous attempts to overcome the normative dilemma of securitization studies, outlining the obstacles that each individual proposal faces. Third, this article argues that the normative dilemma of applying securitization can be avoided by, first, deconstructing the institutional power of security actors and dominant security subjectivities and, second, addressing countering or alternative approaches to security and incorporating different security subjectivities. Examples of the securitization of international terrorism and immigration are prominent throughout.
Abstract:
Nonlinear Noisy Leaky Integrate and Fire (NNLIF) models for networks of neurons can be written as Fokker-Planck-Kolmogorov equations for the probability density of neurons, the main parameters in the model being the connectivity of the network and the noise. We analyse several aspects of the NNLIF model: the number of steady states, a priori estimates, blow-up issues and convergence toward equilibrium in the linear case. In particular, for excitatory networks, blow-up always occurs for initial data concentrated close to the firing potential. These results show how critical the balance between noise and excitatory/inhibitory interactions is with respect to the connectivity parameter.
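In the form used in closely related work on this model (notation and normalizations may differ slightly from this paper's), the NNLIF Fokker-Planck equation for the density p(v,t) of neurons at membrane potential v reads
\[
\partial_t p + \partial_v\!\big[(-v + b\,N(t))\,p\big] - a(N(t))\,\partial^2_{vv} p = \delta(v - V_R)\,N(t), \qquad v \le V_F,
\]
with boundary condition p(V_F,t) = 0 and firing rate N(t) = -a(N(t))\,\partial_v p(V_F,t) \ge 0, where V_F is the firing potential and V_R the reset potential. The connectivity parameter is b (b > 0 excitatory, b < 0 inhibitory), and a(N) models the noise.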
Abstract:
This paper discusses the use of probabilistic or randomized algorithms for solving combinatorial optimization problems. Our approach employs non-uniform probability distributions to add a biased random behavior to classical heuristics, so that a large set of alternative good solutions can be obtained quickly, in a natural way and without complex configuration processes. This procedure is especially useful in problems where properties such as non-smoothness or non-convexity lead to a highly irregular solution space, for which traditional optimization methods, both exact and approximate, may fail to reach their full potential. The results obtained are promising enough to suggest that randomizing classical heuristics is a powerful method that can be successfully applied in a variety of cases.
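As an illustration of the general idea described above (a generic sketch, not the authors' implementation; the problem, distribution and parameters are arbitrary choices), the following Python snippet biases a greedy constructive heuristic for the 0/1 knapsack problem with a geometric distribution over the ranked candidate list, so that repeated runs yield a diverse set of good solutions.

```python
import math
import random

def biased_randomized_knapsack(values, weights, capacity, beta=0.3, runs=100, seed=0):
    """Biased-randomized greedy heuristic for the 0/1 knapsack problem.

    Items are ranked by value/weight ratio; instead of always picking the best
    candidate, the next item is drawn from a geometric distribution over the
    ranked list (parameter beta), keeping the greedy bias while adding
    diversification across runs."""
    rng = random.Random(seed)
    best_value, best_items = 0, []
    for _ in range(runs):
        candidates = sorted(range(len(values)),
                            key=lambda i: values[i] / weights[i], reverse=True)
        remaining, chosen, total = capacity, [], 0
        while candidates:
            # Geometric pick: rank 0 is most likely, deeper ranks exponentially less so.
            u = 1.0 - rng.random()  # in (0, 1], avoids log(0)
            k = min(int(math.log(u) / math.log(1.0 - beta)), len(candidates) - 1)
            item = candidates.pop(k)
            if weights[item] <= remaining:
                chosen.append(item)
                remaining -= weights[item]
                total += values[item]
        if total > best_value:
            best_value, best_items = total, chosen
    return best_value, best_items

values = [60, 100, 120, 30, 75]
weights = [10, 20, 30, 5, 15]
print(biased_randomized_knapsack(values, weights, capacity=50))
```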
Abstract:
It has been proposed that the number of tropical cyclones as a function of the energy they release is a decreasing power-law function, up to a characteristic energy cutoff determined by the spatial size of the ocean basin in which the storm occurs. This means that no characteristic scale exists for the energy of tropical cyclones, except for the finite-size effects induced by the boundaries of the basins. This has important implications for the physics of tropical cyclones. We discuss up to what point tropical cyclones are related to critical phenomena (in the same way as earthquakes, rainfall, etc.), providing a consistent picture of the energy balance in the system. Moreover, this perspective allows one to visualize more clearly the effects of global warming on tropical-cyclone occurrence.
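Schematically (the precise statistic and exponent are those of the underlying studies and are not specified in the abstract), the proposal is that the number density of tropical cyclones as a function of their released energy E follows a power law with a basin-dependent cutoff,
\[
D(E) \propto E^{-\alpha}\, \exp(-E/E_c),
\]
where the characteristic energy E_c is set by the finite spatial size of the ocean basin, so that no intrinsic energy scale appears below E_c.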
Abstract:
The amalgamation operation is frequently used to reduce the number of parts of compositional data, but it is a non-linear operation in the simplex with the usual geometry, the Aitchison geometry. The concept of balances between groups, a particular coordinate system designed over binary partitions of the parts, could be an alternative to amalgamation in some cases. In this work we discuss the proper application of both concepts using a real data set corresponding to behavioral measures of pregnant sows.
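For reference (standard definitions in compositional data analysis, not restated in the abstract): amalgamation replaces a group of parts by their sum, e.g. (x_1, ..., x_D) → (x_1 + x_2, x_3, ..., x_D), whereas the balance between two groups of parts of sizes r and s is the log-ratio coordinate
\[
b = \sqrt{\frac{rs}{r+s}}\; \ln \frac{g(x_{i_1},\dots,x_{i_r})}{g(x_{j_1},\dots,x_{j_s})},
\]
with g(·) the geometric mean of the indicated parts. Balances are orthonormal (isometric log-ratio) coordinates and hence compatible with the Aitchison geometry, whereas amalgamation is not a linear operation in that geometry.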
Abstract:
The literature on skew-normal distributions has grown rapidly in recent years, but at the moment few applications concern the description of natural phenomena with this type of probability model, or the interpretation of their parameters. The skew-normal family is an extension of the normal family to which a parameter (λ) has been added to regulate the skewness. The development of this theoretical field has followed the general tendency in Statistics towards more flexible methods that represent features of the data as adequately as possible and reduce unrealistic assumptions, such as the normality that underlies most methods of univariate and multivariate analysis. In this paper an investigation of the shape of the frequency distribution of the logratio ln(Cl−/Na+), whose components are related to the composition of waters from 26 wells, has been performed. Samples have been collected around the active center of Vulcano island (Aeolian archipelago, southern Italy) from 1977 up to now, at time intervals of about six months. The logratio data have been tentatively modeled by evaluating the performance of the skew-normal model for each well. Values of the λ parameter have been compared by considering the temperature and spatial position of the sampling points. Preliminary results indicate that changes in λ values can be related to the nature of the environmental processes affecting the data.
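For reference, in the standard parametrization (location and scale conventions may differ from the paper's), the skew-normal density with shape parameter λ is
\[
f(x;\lambda) = 2\,\phi(x)\,\Phi(\lambda x),
\]
where φ and Φ are the standard normal density and distribution function; λ = 0 recovers the normal distribution, and the sign of λ controls the direction of the skewness.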