88 results for critical approaches
at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
The empirical finding of an inverse U-shaped relationship between per capita income and pollution, the so-called Environmental Kuznets Curve (EKC), suggests that as countries experience economic growth, environmental deterioration decelerates and thus becomes less of an issue. Focusing on the prime example of carbon emissions, the present article provides a critical review of the new econometric techniques that have questioned the baseline polynomial specification in the EKC literature. We discuss issues related to the functional form, heterogeneity, "spurious" regressions and spatial dependence to address whether and to what extent the EKC can be observed. Despite these new approaches, there is still no clear-cut evidence supporting the existence of the EKC for carbon emissions. JEL classifications: C20; Q32; Q50; O13. Keywords: Environmental Kuznets Curve; Carbon emissions; Functional form; Heterogeneity; "Spurious" regressions; Spatial dependence.
Abstract:
Residential satisfaction is often used as a barometer to assess the performance of public policy and programmes designed to raise individuals' well-being. However, the fact that responses elicited from residents might be biased by subjective, non-observable factors casts doubt on whether these responses can be taken as trustworthy indicators of the individuals' housing situation. Emotional factors such as aspirations or expectations might affect individuals' cognitions of their true residential situation. To disentangle this puzzle, we investigated whether identical residential attributes can be perceived differently depending on tenure status. Our results indicate that tenure status is crucial not only in determining the level of housing satisfaction, but also in how dwellers perceive their housing characteristics. Keywords: Housing satisfaction, subjective well-being, homeownership. JEL classification: D1, R2.
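As context for the EKC abstract above: the "baseline polynomial specification" being critiqued is, in most of the literature, a quadratic in log income, roughly of the form (notation ours, not the article's):

\log E_{it} = \alpha_i + \beta_1 \log y_{it} + \beta_2 (\log y_{it})^2 + \varepsilon_{it}

where E_{it} is per capita emissions and y_{it} per capita income. An inverse U requires \beta_1 > 0 and \beta_2 < 0, with the turning point at y^* = \exp(-\beta_1 / 2\beta_2); the techniques surveyed in the article question this functional form rather than assume it.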
Abstract:
This article addresses the normative dilemma located within the application of 'securitization' as a method of understanding the social construction of threats and security policies. Securitization as a theoretical and practical undertaking is being increasingly used by scholars and practitioners. This article aims to provide those wishing to engage with securitization with an alternative application of the theory, one which is sensitive to, and self-reflective of, the possible normative consequences of its employment. It argues that discussing and analyzing securitization processes has normative implications, understood here as the negative securitization of a referent. The negative securitization of a referent is asserted to be carried out through the unchallenged analysis of securitization processes which have emerged through relations of exclusion and power. The article then offers a critical understanding and application of securitization studies as a way of overcoming the identified normative dilemma. First, it examines how the Copenhagen School's formation of securitization theory gives rise to a normative dilemma, which is situated in the performative and symbolic power of security as a political invocation and theoretical concept. Second, it evaluates previous attempts to overcome the normative dilemma of securitization studies, outlining the obstacles that each individual proposal faces. Third, it argues that the normative dilemma of applying securitization can be avoided, first, by deconstructing the institutional power of security actors and dominant security subjectivities and, second, by addressing countering or alternative approaches to security and incorporating different security subjectivities. Examples of the securitization of international terrorism and immigration are prominent throughout.
Abstract:
Different procedures to obtain atom-condensed Fukui functions are described. It is shown how the resulting values may differ depending on the exact approach to atom-condensed Fukui functions. The condensed Fukui function can be computed using either the fragment-of-molecular-response approach or the response-of-molecular-fragment approach. The two approaches are nonequivalent; only the latter corresponds in general to a population-difference expression. The Mulliken approach does not depend on the approach taken but has some computational drawbacks. The different resulting expressions are tested for a wide set of molecules. In practice one must make seemingly arbitrary choices about how to compute condensed Fukui functions, which calls into question the role of these indicators in conceptual density-functional theory.
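For orientation, the population-difference expressions referred to above are, in the common finite-difference form (our transcription of standard usage, not a quotation from the article):

f_k^+ = p_k(N+1) - p_k(N)            (nucleophilic attack)
f_k^- = p_k(N) - p_k(N-1)            (electrophilic attack)
f_k^0 = [p_k(N+1) - p_k(N-1)] / 2    (radical attack)

where p_k(M) is the electron population assigned to atom k in the M-electron system. The article's point is that the value of p_k, and hence of f_k, depends on whether one condenses the response of the whole molecule onto fragments or takes the response of a molecular fragment directly, as well as on the population-analysis scheme chosen.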
Abstract:
The existence of a liquid-gas phase transition for hot nuclear systems at subsaturation densities is a well-established prediction of finite-temperature nuclear many-body theory. In this paper, we discuss for the first time the properties of such a phase transition for homogeneous nuclear matter within the self-consistent Green's function approach. We find a substantial decrease of the critical temperature with respect to the Brueckner-Hartree-Fock approximation. Even within the same approximation, the use of two different realistic nucleon-nucleon interactions gives rise to large differences in the properties of the critical point.
Abstract:
Today, most software development teams use free and open source software (FOSS) components, because doing so increases the speed and quality of development. Many open source components are the de facto standard of their category. However, FOSS comes with licensing restrictions, and corporate organizations usually maintain a list of allowed and forbidden licenses. But how do you enforce this policy? How can you make sure that ALL files in your source depot either belong to you or fit your licensing policy? A first, preventive approach is to train the development team and increase its awareness of these licensing issues. Depending on the size of the team, this may be costly but necessary. However, it does not ensure that a single individual will not commit a forbidden icon or library and jeopardize the legal status of the whole release, if not of the company, since software is becoming more and more of a critical asset. Another approach is to verify what is included in the source repository and check whether it belongs to the open-source world. This can be done on the fly, whenever a new file is added to the source depot. It can also be part of the release process, as a verification step before publishing the release. In both cases, there are tools and databases to automate the detection process. We will present the various options regarding FOSS detection, how this process can be integrated into the "software factory", and how the results can be displayed in a usable and efficient way.
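As a minimal sketch of the on-the-fly check described above (in Python; the forbidden-license list, the header-string heuristic, and the pre-commit wiring are our placeholder assumptions, not details from the talk):

    #!/usr/bin/env python3
    """Hypothetical pre-commit hook: block commits of files whose headers
    mention a forbidden license. A real tool would query a fingerprint
    database instead of matching strings."""
    import subprocess
    import sys

    # Example corporate policy: licenses that must not enter the depot.
    FORBIDDEN = ("GPL-3.0", "AGPL-3.0", "SSPL-1.0")

    def staged_files():
        """Paths of files staged for commit (added/copied/modified)."""
        out = subprocess.run(
            ["git", "diff", "--cached", "--name-only", "--diff-filter=ACM"],
            capture_output=True, text=True, check=True,
        )
        return [p for p in out.stdout.splitlines() if p]

    def scan(path):
        """Naive heuristic: look for forbidden license identifiers in the
        first few kilobytes of the file."""
        try:
            with open(path, "r", errors="ignore") as fh:
                head = fh.read(4096)
        except OSError:
            return None
        for lic in FORBIDDEN:
            if lic in head:
                return lic
        return None

    def main():
        violations = [(p, lic) for p in staged_files() if (lic := scan(p))]
        for path, lic in violations:
            print(f"policy violation: {path} appears to carry {lic}")
        sys.exit(1 if violations else 0)

    if __name__ == "__main__":
        main()

A production deployment would replace the string heuristic with lookups against a provenance database, in line with the detection tools and databases the talk refers to.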
Abstract:
Analysis of stratigraphic terminology and classification shows that time-related stratigraphic units, which by definition have a global extent, are the concern of international commissions and committees of the International Union of Geological Sciences (IUGS). In contrast, lithostratigraphic and other closely related units are regional in extent and are catalogued in the International Stratigraphic Lexicon (ISL), the last volume of which was published in 1987. The International Commission on Stratigraphy (ICS) is currently attempting to revitalize the publication of the ISL, given that the information contained in published volumes has never been updated and that there has been a significant increase in stratigraphic research in recent decades. The proliferation of named units in the South Pyrenean and Ebro Basin Paleogene is evaluated to illustrate the extent of the problem. Moreover, new approaches to stratigraphic analysis have led to the naming of genetic units according to guidelines similar to those followed in the naming of descriptive or lithostratigraphic units. This has led to considerable confusion. The proposal to revitalize the ISL is accepted as part of the solution, which should also include the publication of critical catalogues and the creation of norms for genetic unit terminology.
Abstract:
We prove that any subanalytic locally Lipschitz function has the Sard property. Such functions are typically nonsmooth, and their lack of regularity necessitates the choice of some generalized notion of gradient and of critical point. In our framework these notions are defined in terms of the Clarke and the convex-stable subdifferentials. The main result of this note asserts that for any subanalytic locally Lipschitz function the set of its Clarke critical values is locally finite. The proof relies on Pawlucki's extension of the Puiseux lemma. In the last section we give an example of a continuous subanalytic function which is not constant on a segment of "broadly critical" points, that is, points for which we can find arbitrarily short convex combinations of gradients at nearby points.
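For readers outside nonsmooth analysis, the notions involved can be sketched as follows (our notation; the note also works with the convex-stable subdifferential, which we omit here). For a locally Lipschitz f, the Clarke subdifferential at x is

\partial f(x) = \mathrm{co}\,\{ \lim_i \nabla f(x_i) : x_i \to x, \ f \text{ differentiable at } x_i \},

a point x is Clarke critical when 0 \in \partial f(x), and the Sard-type conclusion of the note is that the set of values f takes at such points is locally finite.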
Abstract:
We review recent likelihood-based approaches to modeling demand for medical care. A semi-nonparametric model along the lines of Cameron and Johansson's Poisson polynomial model, but using a negative binomial baseline model, is introduced. We apply these models, as well as semiparametric Poisson, hurdle semiparametric Poisson, and finite mixtures of negative binomial models, to six measures of health care usage taken from the Medical Expenditure Panel Survey. We conclude that most of the models lead to statistically similar results, both in terms of information criteria and of conditional and unconditional prediction. This suggests that applied researchers may not need to be overly concerned with the choice of which of these models to use when analyzing data on health care demand.
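A minimal sketch of the kind of baseline comparison described, in Python with synthetic data (the data-generating values and variable names are our assumptions; the paper itself uses MEPS usage measures and the richer semi-nonparametric, hurdle, and mixture specifications):

    import numpy as np
    import statsmodels.api as sm

    # Synthetic overdispersed counts standing in for, e.g., doctor visits.
    rng = np.random.default_rng(0)
    n = 2000
    X = sm.add_constant(rng.normal(size=(n, 2)))   # intercept + two covariates
    mu = np.exp(X @ np.array([0.5, 0.3, -0.2]))
    y = rng.negative_binomial(n=2.0, p=2.0 / (2.0 + mu))  # NB2, alpha = 0.5

    # Fit Poisson and NB2 baselines; compare by information criteria,
    # as the abstract does across its larger model set.
    poisson_res = sm.Poisson(y, X).fit(disp=False)
    nb2_res = sm.NegativeBinomial(y, X, loglike_method="nb2").fit(disp=False)
    print("Poisson AIC:", poisson_res.aic)
    print("NB2 AIC:    ", nb2_res.aic)

Comparing fitted models by AIC/BIC in this way mirrors the information-criteria comparison the abstract reports.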
Abstract:
"See the abstract at the beginning of the document in the attached file."
Abstract:
The paper is divided into four sections. The first offers a critical assessment of the rationalist and constructivist explanations currently dominating European studies and assesses the notion of path dependence. The second and third sections analyse the role of both material interests and polity ideas in EU enlargement to Turkey, and conclude that explanations based exclusively on either strategic calculations or values and identities have significant shortcomings. The fourth section examines the institutional path of Turkey's candidacy to show how the course of action begun at Helsinki restricted the range of possible and legitimate options three years later in Copenhagen.
Abstract:
We analyze the two-dimensional parabolic-elliptic Patlak-Keller-Segel model in the whole Euclidean space R^2. Under the hypotheses of integrable initial data with finite second moment and entropy, we first show local-in-time existence for any mass of "free-energy solutions", namely weak solutions with some free energy estimates. We also prove that the solution exists as long as the entropy is controlled from above. The main result of the paper is the global existence of free-energy solutions with initial data as before for the critical mass 8π/χ. Actually, we prove that solutions blow up as a Dirac delta at the center of mass as t→∞, keeping their second moment constant at all times. Furthermore, all moments larger than 2 blow up as t→∞ if initially bounded.
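For reference, in the normalization that makes 8π/χ the critical mass (our transcription of the standard form; the paper's own scaling may differ), the parabolic-elliptic Patlak-Keller-Segel system reads

\partial_t \rho = \Delta \rho - \chi \, \nabla \cdot (\rho \nabla c), \qquad -\Delta c = \rho, \qquad x \in \mathbb{R}^2,

with conserved total mass M = \int \rho_0 \, dx: sub-critical masses M < 8\pi/\chi yield global solutions, super-critical masses blow up in finite time, and M = 8\pi/\chi is the borderline case studied here. The free energy behind "free-energy solutions" is

F[\rho] = \int \rho \log \rho \, dx - \frac{\chi}{2} \int \rho \, c \, dx.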
Abstract:
Variational steepest descent approximation schemes for the modified Patlak-Keller-Segel equation with a logarithmic interaction kernel in any dimension are considered. We prove the convergence of the suitably time-interpolated implicit Euler scheme, defined in terms of the Euclidean Wasserstein distance, associated to this equation for sub-critical masses. As a consequence, we recover the recent result on the global-in-time existence of weak solutions to the modified Patlak-Keller-Segel equation with the logarithmic interaction kernel in any dimension in the sub-critical case. Moreover, we show how this method performs numerically in one dimension. In this particular case, the numerical scheme corresponds to a standard implicit Euler method for the pseudo-inverse of the cumulative distribution function. We demonstrate its ability to reproduce the blow-up of solutions for super-critical masses easily, without the need for mesh refinement.
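Schematically, the variational scheme in question is a Jordan-Kinderlehrer-Otto type iteration (our notation):

\rho^{k+1} \in \operatorname{argmin}_{\rho} \left\{ \frac{1}{2\tau} W_2^2(\rho, \rho^k) + F[\rho] \right\},

where τ is the time step, W_2 the Euclidean Wasserstein distance, and F the free energy of the modified Patlak-Keller-Segel equation. In one dimension, W_2 reduces to the L^2(0,1) distance between pseudo-inverses of cumulative distribution functions,

W_2(\rho, \mu) = \| F_\rho^{-1} - F_\mu^{-1} \|_{L^2(0,1)},

which is why the scheme there becomes a standard implicit Euler method for the pseudo-inverse of the CDF.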
Abstract:
"See the abstract at the beginning of the document in the attached file."
Abstract:
The appeal to ideas as causal variables and/or constitutive features of political processes increasingly characterises political analysis. Yet, perhaps because of the pace of this ideational intrusion, too often ideas have simply been grafted onto pre-existing explanatory theories at precisely the point at which those theories seem to get into difficulties, with little or no consideration either of the status of such ideational variables or of the character or consistency of the resulting theoretical hybrid. This is particularly problematic, for ideas are far from innocent variables and can rarely, if ever, be incorporated seamlessly within existing explanatory and/or constitutive theories without ontological and epistemological consequence. We contend that this tendency, along with the limitations of the prevailing Humean conception of causality and the associated epistemological polemic between causal and constitutive logics, continues to plague almost all of the literature that strives to accord an explanatory role to ideas. In trying to move beyond the current vogue for epistemological polemic, we argue that the incommensurability thesis between causal and constitutive logics is only credible in the context of a narrow, Humean conception of causation. If we reject this in favour of a more inclusive (and ontologically realist) understanding, then it is perfectly possible to chart the causal significance of constitutive processes and to reconstrue the explanatory role of ideas as causally constitutive.