69 results for critic of critical theory
at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
This article addresses the normative dilemma located within the application of 'securitization' as a method of understanding the social construction of threats and security policies. Securitization as a theoretical and practical undertaking is being increasingly used by scholars and practitioners. This article aims to provide those wishing to engage with securitization with an alternative application of the theory, one which is sensitive to, and self-reflective of, the possible normative consequences of its employment. It argues that discussing and analyzing securitization processes has normative implications, understood here as the negative securitization of a referent. The negative securitization of a referent is asserted to be carried out through the unchallenged analysis of securitization processes which have emerged through relations of exclusion and power. The article then offers a critical understanding and application of securitization studies as a way of overcoming the identified normative dilemma. First, it examines how the Copenhagen School's formulation of securitization theory gives rise to a normative dilemma, which is situated in the performative and symbolic power of security as a political invocation and theoretical concept. Second, it evaluates previous attempts to overcome the normative dilemma of securitization studies, outlining the obstacles that each individual proposal faces. Third, it argues that the normative dilemma of applying securitization can be avoided, first, by deconstructing the institutional power of security actors and dominant security subjectivities and, second, by addressing countering or alternative approaches to security and incorporating different security subjectivities. Examples of the securitization of international terrorism and immigration are prominent throughout.
Abstract:
Aim: To investigate and understand patients' satisfaction with nursing care in the intensive care unit, to identify the dimensions of the concept of "satisfaction" from the patient's point of view, and to design and validate a questionnaire that measures satisfaction levels in critical patients. Background: There are many instruments capable of measuring satisfaction with nursing care; however, they do not address the reality for critical patients, nor are they applicable in our context. Design: A dual-approach study comprising a qualitative phase employing Grounded Theory and a quantitative, descriptive phase to prepare and validate the questionnaire. Methods: Data collection in the qualitative phase will consist of in-depth interviews following theoretical sampling, an on-site diary, and an expert discussion group. The sample size will depend on the expected theoretical saturation (n = 27-36). Analysis will be based on Grounded Theory. For the quantitative phase, sampling will be by convenience (n = 200). A questionnaire will be designed on the basis of the qualitative data, and descriptive and inferential statistics will be used. Validation will address content validity, construct criteria, and the reliability of the instrument, the latter via Cronbach's alpha and a test-retest approach. Approval date for this protocol was November 2010. Discussion: Self-perceptions, beliefs, experiences, and demographic, socio-cultural, epistemological, and political factors are determinants of satisfaction, and these should be taken into account when compiling a questionnaire on satisfaction with nursing care among critical patients.
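Since the protocol names Cronbach's alpha as its reliability measure, a minimal sketch of that computation may be useful. The function and the example response matrix below are hypothetical illustrations, not data or code from the study.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) response matrix."""
    k = items.shape[1]                          # number of questionnaire items
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of respondents' total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical responses: 5 patients x 4 Likert-scale items
responses = np.array([
    [4, 5, 4, 4],
    [3, 3, 2, 3],
    [5, 5, 5, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
])
print(f"alpha = {cronbach_alpha(responses):.3f}")  # ~0.94 for this toy matrix
```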
Abstract:
A growing literature integrates theories of debt management into models of optimal fiscal policy. One promising theory argues that the composition of government debt should be chosen so that fluctuations in the market value of debt offset changes in expected future deficits. This complete markets approach to debt management is valid even when the government only issues non-contingent bonds. A number of authors conclude from this approach that governments should issue long term debt and invest in short term assets. We argue that the conclusions of this approach are too fragile to serve as a basis for policy recommendations, because bonds at different maturities have highly correlated returns, making the determination of the optimal portfolio ill-conditioned. To make this point concrete we examine the implications of this approach to debt management in various models, both analytically and using numerical methods calibrated to the US economy. We find the complete markets approach recommends asset positions which are huge multiples of GDP. Introducing persistent shocks or capital accumulation only worsens this problem. Increasing the volatility of interest rates through habits partly reduces the size of these positions. Across our simulations we find no presumption that governments should issue long term debt: policy recommendations can be easily reversed through small perturbations in the specification of shocks or small variations in the maturity of bonds issued. We further extend the literature by removing the assumption that governments costlessly repurchase all outstanding debt every period. This exacerbates the size of the required positions, worsens their volatility, and in some cases produces instability in debt holdings. We conclude that it is very difficult to insulate fiscal policy from shocks by using the complete markets approach to debt management. Given the limited variability of the yield curve, using maturities is a poor substitute for state-contingent debt. As a result, the positions recommended by this approach conflict with a number of features that we believe are important in making bond markets incomplete, e.g. transaction costs and liquidity effects. Until these features are fully incorporated, we remain in search of a theory of debt management capable of providing robust policy insights.
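The ill-conditioning argument is easy to reproduce numerically. The sketch below is purely illustrative: the two-bond return volatilities, the correlation, and the deficit covariances are invented numbers, not the paper's calibration, but they show how near-collinear bond returns produce enormous positions that flip under tiny perturbations.

```python
import numpy as np

# Invented numbers: two bond maturities with near-identical return
# innovations, and a deficit shock the portfolio is meant to hedge.
sigma = np.array([0.020, 0.021])       # return std devs (short, long bond)
rho = 0.999                            # correlation between bond returns
cov_R = np.array([[sigma[0]**2,               rho * sigma[0] * sigma[1]],
                  [rho * sigma[0] * sigma[1], sigma[1]**2]])
cov_Rd = np.array([0.0010, 0.0011])    # Cov(bond return, deficit shock)

# Variance-minimizing hedge positions (as fractions of GDP):
q = np.linalg.solve(cov_R, cov_Rd)
print("condition number:", np.linalg.cond(cov_R))  # large: returns nearly collinear
print("positions:", q)                  # huge offsetting long/short multiples of GDP

# A tiny perturbation of the covariances reverses the recommendation.
q2 = np.linalg.solve(cov_R, cov_Rd + np.array([0.00002, -0.00002]))
print("perturbed positions:", q2)       # dramatically different portfolio
```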
Abstract:
"Vegeu el resum a l'inici del document del fitxer adjunt."
Abstract:
In this paper we propose a simple and general model for computing the Ramsey optimal inflation tax, which includes several models from the previous literature as special cases. We show that it cannot be claimed that the Friedman rule is always optimal (or always non-optimal) on theoretical grounds: the Friedman rule is optimal or not depending on conditions related to the shape of various relevant functions. One contribution of this paper is to relate these conditions to measurable variables such as the interest rate or the consumption elasticity of money demand. We find that it tends to be optimal to tax money when there are economies of scale in the demand for money (the scale elasticity is smaller than one) and/or when money is required for the payment of consumption or wage taxes. We find that it tends to be optimal to tax money more heavily when the interest elasticity of money demand is small. We present empirical evidence on the parameters that determine the optimal inflation tax. Calibrating the model to a variety of empirical studies yields an optimal nominal interest rate of less than 1%/year, although that finding is sensitive to the calibration.
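As a concrete reading of the elasticity conditions, money demand can be written in a constant-elasticity form. This specification is an illustrative assumption for exposition, not necessarily the paper's own model:

```latex
% Schematic constant-elasticity money demand (illustrative assumption):
\[
  m = A \, c^{\eta} \, R^{-\epsilon}, \qquad \eta, \epsilon > 0,
\]
% where m is real money balances, c is consumption, R is the nominal
% interest rate, \eta is the consumption (scale) elasticity, and \epsilon
% is the interest elasticity. The abstract's findings then read: taxing
% money tends to be optimal when \eta < 1 (economies of scale in money
% demand), and the optimal tax is heavier when \epsilon is small.
```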
Abstract:
Following a scheme of Levin, we describe the values that functions in Fock spaces take on lattices of critical density, in terms of both the size of the values and a cancellation condition that involves discrete versions of the Cauchy and Beurling-Ahlfors transforms.
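For orientation, a standard setup for this problem is sketched below; the normalization is one common convention, stated here as an assumption about the paper's setting rather than its actual notation:

```latex
% Bargmann-Fock space with Gaussian weight (one common normalization):
\[
  \mathcal{F}^2 = \Big\{ f \ \text{entire} :
    \|f\|^2 = \int_{\mathbb{C}} |f(z)|^2 e^{-\pi |z|^2} \, dA(z) < \infty \Big\}.
\]
% With this weight, a lattice \Lambda \subset \mathbb{C} has critical
% density when its density equals 1 (e.g. \Lambda = \mathbb{Z} + i\mathbb{Z}):
% denser lattices are sampling, sparser ones are interpolating, and the
% critical case is neither, which is why describing the trace f|_\Lambda
% requires both a size condition and a cancellation condition.
```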
Abstract:
The object of this project is to schedule a fictitious European basketball competition with many teams situated at long distances from one another. The schedule must be fair, feasible and economical, which means that the total distance traveled by every team must be the minimum possible. First, we define the sport competition terminology and study different competition systems, focusing on the NBA and Euroleague systems. Then we define the concepts of graph theory and spherical distance that will be needed. Next we propose a competition system, explaining where the teams will be located and how the scheduling will work. We then describe the programs that have been implemented and, finally, display the complete schedule and mention some possible improvements.
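Since the project relies on spherical distances between team cities, a minimal sketch of the great-circle (haversine) computation may help; the coordinates in the usage line are approximate and purely illustrative.

```python
import math

def spherical_distance_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points via the haversine formula."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# Illustrative: Barcelona to Moscow, roughly 3,000 km
print(f"{spherical_distance_km(41.39, 2.17, 55.76, 37.62):,.0f} km")
```

Pairwise distances computed this way become edge weights of the complete graph over the teams, which is the natural input for a travel-minimizing schedule.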
Abstract:
A Fundamentals of Computing Theory course involves different topics that are core to the Computer Science curricula and whose level of abstraction makes them difficult both to teach and to learn. Such difficulty stems from the complexity of the abstract notions involved and the required mathematical background. Surveys conducted among our students showed that many of them were applying some theoretical concepts mechanically rather than developing significant learning. This paper presents a number of didactic strategies that we introduced in the Fundamentals of Computing Theory curriculum to cope with this problem. The proposed strategies were based on a stronger use of technology and a constructivist approach. The final goal was to promote more significant learning of the course topics.
Abstract:
We analyze the results for infinite nuclear and neutron matter using the standard relativistic mean field model and its recent effective field theory motivated generalization. For the first time, we show quantitatively that the inclusion in the effective theory of vector meson self-interactions and scalar-vector cross-interactions explains naturally the recent experimental observations of the softness of the nuclear equation of state, without losing the advantages of the standard relativistic model for finite nuclei.
Abstract:
In this article we summarize four experiments of an interdisciplinary nature carried out in four different secondary education centres. The common thread of these didactic proposals is an examination of values in sport and the critical capacity of students from distinct perspectives: violence, mass media, politics and gender, and the treatment of the body in our society.
Abstract:
By means of computer simulations and solution of the equations of the mode coupling theory (MCT), we investigate the role of the intramolecular barriers on several dynamic aspects of nonentangled polymers. The investigated dynamic range extends from the caging regime characteristic of glass-formers to the relaxation of the chain Rouse modes. We review our recent work on this question, provide new results, and critically discuss the limitations of the theory. Solutions of the MCT for the structural relaxation reproduce qualitative trends of simulations for weak and moderate barriers. However, a progressive discrepancy is revealed as the limit of stiff chains is approached. This disagreement does not seem related to dynamic heterogeneities, which indeed are not enhanced by increasing the barrier strength. Nor is it connected with the breakdown of the convolution approximation for three-point static correlations, which retains its validity for stiff chains. These findings suggest the need for an improvement of the MCT equations for polymer melts. Concerning the relaxation of the chain degrees of freedom, MCT provides a microscopic basis for time scales from chain reorientation down to the caging regime. It rationalizes, from first principles, the observed deviations from the Rouse model on increasing the barrier strength. These include anomalous scaling of relaxation times, long-time plateaux, and a nonmonotonic wavelength dependence of the mode correlators.
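For reference, the Rouse-mode analysis mentioned here is conventionally set up as follows; this is a standard textbook form given as background, not necessarily the paper's exact notation:

```latex
% Rouse modes of a chain of N monomers with positions r_n(t)
% (one standard discrete convention):
\[
  \mathbf{X}_p(t) = \frac{1}{N} \sum_{n=1}^{N} \mathbf{r}_n(t)
    \cos\!\left( \frac{(n - \tfrac{1}{2}) \, p \, \pi}{N} \right),
  \qquad p = 1, \dots, N-1.
\]
% In the ideal Rouse model the mode correlators decay exponentially,
% with relaxation times scaling as \tau_p \propto (N/p)^2:
\[
  \langle \mathbf{X}_p(t) \cdot \mathbf{X}_p(0) \rangle \propto e^{-t/\tau_p}.
\]
% The "anomalous scaling" and long-time plateaux in the abstract are
% deviations from this ideal behavior as the barrier strength increases.
```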
Abstract:
We report on the onset of fluid entrainment when a contact line is forced to advance over a dry solid of arbitrary wettability. We show that entrainment occurs at a critical advancing speed beyond which the balance between capillary, viscous, and contact-line forces sustaining the shape of the interface is no longer satisfied. Wetting couples to the hydrodynamics by setting both the morphology of the interface at small scales and the viscous friction of the front. We find that the critical deformation that the interface can sustain is controlled by the friction at the contact line and the viscosity contrast between the displacing and displaced fluids, leading to a rich variety of wetting-entrainment regimes. We discuss the potential use of our theory to measure contact-line forces using atomic force microscopy and to study entrainment under microfluidic conditions exploiting colloid-polymer fluids of ultralow surface tension.
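The force balance the authors invoke is usually expressed through the capillary number; the notation below is a standard convention, not necessarily the paper's:

```latex
% Capillary number: ratio of viscous to capillary (surface-tension)
% forces for a contact line advancing at speed U:
\[
  \mathrm{Ca} = \frac{\eta U}{\gamma},
\]
% where \eta is the viscosity of the displacing fluid and \gamma the
% surface tension. The onset of entrainment then corresponds to a
% critical value \mathrm{Ca}^{*} at which the capillary-viscous-contact-line
% balance sustaining the interface shape can no longer be met; per the
% abstract, \mathrm{Ca}^{*} is controlled by contact-line friction and by
% the viscosity contrast between the displacing and displaced fluids.
```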
Abstract:
We prove that any subanalytic locally Lipschitz function has the Sard property. Such functions are typically nonsmooth and their lack of regularity necessitates the choice of some generalized notion of gradient and of critical point. In our framework these notions are defined in terms of the Clarke and of the convex-stable subdifferentials. The main result of this note asserts that for any subanalytic locally Lipschitz function the set of its Clarke critical values is locally finite. The proof relies on Pawlucki's extension of the Puiseux lemma. In the last section we give an example of a continuous subanalytic function which is not constant on a segment of "broadly critical" points, that is, points for which we can find arbitrarily short convex combinations of gradients at nearby points.
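For readers unfamiliar with the nonsmooth notions used, the Clarke subdifferential of a locally Lipschitz function admits a standard characterization via Rademacher's theorem, stated here as background:

```latex
% Clarke subdifferential of a locally Lipschitz f : R^n -> R at x,
% as the convex hull of limits of gradients at nearby points of
% differentiability (which exist a.e. by Rademacher's theorem):
\[
  \partial f(x) = \mathrm{co} \Big\{ \lim_{i \to \infty} \nabla f(x_i) :
    x_i \to x, \ f \ \text{differentiable at} \ x_i \Big\}.
\]
% A point x is Clarke critical when 0 \in \partial f(x); the Sard
% property asserted in the abstract says the set of values f takes at
% such points is small: here, locally finite.
```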
Abstract:
We present existence, uniqueness and continuous dependence results for some kinetic equations motivated by models for the collective behavior of large groups of individuals. Models of this kind have been recently proposed to study the behavior of large groups of animals, such as flocks of birds, swarms, or schools of fish. Our aim is to give a well-posedness theory for general models which possibly include a variety of effects: an interaction through a potential, such as a short-range repulsion and long-range attraction; a velocity-averaging effect where individuals try to adapt their own velocity to that of other individuals in their surroundings; and self-propulsion effects, which take into account effects on one individual that are independent of the others. We develop our theory in a space of measures, using mass transportation distances. As consequences of our theory we show also the convergence of particle systems to their corresponding kinetic equations, and the local-in-time convergence to the hydrodynamic limit for one of the models.
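The mass transportation distances referred to are typically Wasserstein distances; one standard definition is given below as background, since the abstract does not fix the exponent:

```latex
% p-Wasserstein distance between probability measures \mu, \nu on R^d:
\[
  W_p(\mu, \nu) = \Big( \inf_{\pi \in \Pi(\mu, \nu)}
    \int_{\mathbb{R}^d \times \mathbb{R}^d} |x - y|^p \, d\pi(x, y) \Big)^{1/p},
\]
% where \Pi(\mu, \nu) is the set of couplings of \mu and \nu. Stability
% of the kinetic equation in such a metric is what yields mean-field
% limits: the empirical measures of the particle system converge to the
% kinetic solution in W_p.
```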