984 results for representation theory


Relevance:

20.00%

Publisher:

Abstract:

In this article, we present a new approach to Nekhoroshev theory for a generic unperturbed Hamiltonian which completely avoids small-divisor problems. The proof extends a method introduced by P. Lochak, combining averaging along periodic orbits with simultaneous Diophantine approximation, and uses geometric arguments designed by the second author to handle generic integrable Hamiltonians. This method allows us to deal with generic non-analytic Hamiltonians and to obtain new generic stability results around linearly stable tori.
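For context, the estimates at the heart of Nekhoroshev theory (not spelled out in the abstract) have the standard exponential-stability form: for a perturbed Hamiltonian $H(I,\theta) = h(I) + \varepsilon f(I,\theta)$, the action variables drift only over exponentially long times,
\[
|I(t) - I(0)| \le C\,\varepsilon^{b}
\quad \text{for all } |t| \le T \exp\!\left(c\,\varepsilon^{-a}\right),
\]
where the positive exponents $a, b$ and constants $c, C, T$ depend on the steepness of $h$ and the number of degrees of freedom; the specific exponents obtained in the article may differ.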

Relevance:

20.00%

Publisher:

Abstract:

Is there a link between decentralized governance and conflict prevention? This article tries to answer that question by presenting the state of the art at the intersection of the two concepts. Given that social conflict is inevitable, and given the appearance of new threats and types of violence as well as new demands for people-centred security (human security), our societies should focus on promoting peaceful change. Through an extensive analysis of the existing literature and the study of several cases, this paper suggests that decentralized governance can contribute to these efforts by transforming conflicts and by creating power-sharing and inclusion incentives for minority groups. Despite the complexity of assessing its impact on conflict prevention, it can be contended that decentralized governance may have very positive effects on reducing the causes that bring about conflicts, owing to its ability to foster the creation of war/violence preventers. More specifically, this paper argues that decentralization can have a positive impact on the so-called triggers and accelerators (short- and medium-term causes).

Relevance:

20.00%

Publisher:

Abstract:

In this paper I review a series of theoretical concepts that are relevant to the integrated assessment of agricultural sustainability but are not generally included in the curricula of the various scientific disciplines dealing with quantitative analysis of agriculture. I first illustrate, with plain narratives and concrete examples, that sustainability is an extremely complex issue requiring the simultaneous consideration of several aspects that cannot be reduced to a single performance indicator. Next, I justify this evident need for multi-criteria analysis with theoretical concepts concerning the epistemological predicament of complexity, moving from classic philosophical lessons to recent developments in complex systems theory, in particular Rosen's theory of the modelling relation, which is essential for analyzing the quality of any quantitative representation. The implications of these theoretical concepts are then illustrated with applications of multi-criteria analysis to the sustainability of agriculture. I wrap up by pointing out the crucial difference between "integrated assessment" and "integrated analysis". An integrated analysis is a set of indicators and analytical models generating an analytical output. An integrated assessment is much more than that: it is about finding an effective way to deal with three key issues: (i) legitimacy – how to handle the unavoidable existence of legitimate but contrasting points of view about the different meanings given by social actors to the word "development"; (ii) pertinence – how to handle in a coherent way scientific analyses referring to different scales and dimensions; and (iii) credibility – how to handle the unavoidable existence of uncertainty and genuine ignorance when dealing with the analysis of future scenarios.

Relevance:

20.00%

Publisher:

Abstract:

This article addresses the normative dilemma located within the application of 'securitization' as a method of understanding the social construction of threats and security policies. Securitization, as a theoretical and practical undertaking, is being increasingly used by scholars and practitioners. This article aims to offer those wishing to engage with securitization an alternative application of the theory, one which is sensitive to, and self-reflective about, the possible normative consequences of its employment. It argues that discussing and analyzing securitization processes has normative implications, understood here as the negative securitization of a referent. The negative securitization of a referent is asserted to be carried out through the unchallenged analysis of securitization processes which have emerged through relations of exclusion and power. The article then offers a critical understanding and application of securitization studies as a way of overcoming the identified normative dilemma. First, it examines how the Copenhagen School's formulation of securitization theory gives rise to a normative dilemma, which is situated in the performative and symbolic power of security as a political invocation and theoretical concept. Second, it evaluates previous attempts to overcome the normative dilemma of securitization studies, outlining the obstacles each individual proposal faces. Third, it argues that the normative dilemma of applying securitization can be avoided, first, by deconstructing the institutional power of security actors and dominant security subjectivities and, second, by addressing countering or alternative approaches to security and incorporating different security subjectivities. Examples of the securitization of international terrorism and immigration feature prominently throughout.

Relevance:

20.00%

Publisher:

Abstract:

The accelerated invention of new hardware and software modifies, almost daily, our perception of the world and therefore cultural production, blurring the boundaries between concepts such as art/literature, painting/book, and image/text. Although these pairs have always been objects of theoretical discourse, the discussion takes on a growing urgency now that new technologies expose what used to be sheltered in the realm of theory. Our very way of understanding reality is affected by the immediacy of these media. The research analyzes the work of several new-media authors who engage with the problem of the representation of memory from this contemporary perspective. The research carried out in this doctoral thesis focuses on the representation of memory as it appears in the work of Chris Marker. Of particular interest are the latest devices created by the author within the framework of the so-called new technologies and the new exhibition spaces for cinema. The project proposes an analysis of the memory that these discourses suggest through the themes proper to them: the archive, cultural identities, the spectator's contribution, databases, and the technological treatment of information. The work of Chris Marker was selected because its modes of production and discourse allow a broad discussion of the so-called new technologies and the world they represent in the new hybrid space built between the visual arts, literature, and technology.

Relevance:

20.00%

Publisher:

Abstract:

A growing number of studies have addressed the relationship between theory of mind (TOM) and executive functions (EF) in patients with acquired neurological pathology. In order to provide a global overview of the main findings, we conducted a systematic review of group studies in which we aimed to (1) evaluate the patterns of impaired and preserved abilities in both TOM and EF in groups of patients with acquired neurological pathology and (2) investigate the existence of particular relations between different EF domains and TOM tasks. The search was conducted in Pubmed/Medline. A total of 24 articles met the inclusion criteria. We considered for analysis classical, clinically accepted TOM tasks (first- and second-order false belief stories, the Faux Pas test, Happe's stories, the Mind in the Eyes task, and cartoon tasks) and EF domains (updating, shifting, inhibition, and access). The review suggests that (1) EF and TOM appear tightly associated, although the few dissociations observed suggest they cannot be reduced to a single function; (2) no executive subprocess could be specifically associated with TOM performance; and (3) the first-order false belief task and Happe's story task seem to be less sensitive to neurological pathologies and less associated with EF. Even though the analysis of the reviewed studies demonstrates a close relationship between TOM and EF in patients with acquired neurological pathology, the nature of this relationship must be further investigated. Studies investigating the ecological consequences of TOM and EF deficits, as well as intervention research, may bring further contributions to this question.

Relevance:

20.00%

Publisher:

Abstract:

Descriptive set theory is mainly concerned with studying subsets of the space of all countable binary sequences. In this paper we study the generalization in which "countable" is replaced by "uncountable". We explore properties of generalized Baire and Cantor spaces, equivalence relations, and their Borel reducibility. The study shows that descriptive set theory looks very different in this generalized setting compared to the classical, countable case. We also draw a connection between the stability-theoretic complexity of first-order theories and the descriptive set-theoretic complexity of their isomorphism relations. Our results suggest that Borel reducibility on uncountable structures is a model-theoretically natural way to compare the complexity of isomorphism relations.
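As a reminder of the central notion invoked above, Borel reducibility (stated here in its standard classical form; the paper works with its analogue on generalized Baire spaces) compares equivalence relations $E$ on $X$ and $F$ on $Y$ as follows:
\[
E \le_B F \quad \Longleftrightarrow \quad \exists\, \text{Borel } f : X \to Y \ \text{such that} \ x \mathrel{E} y \Leftrightarrow f(x) \mathrel{F} f(y).
\]
Intuitively, $E \le_B F$ says that classifying objects up to $E$ is no harder than classifying them up to $F$, which is what makes it a natural yardstick for isomorphism relations.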

Relevance:

20.00%

Publisher:

Abstract:

Ultrasound segmentation is a challenging problem due to the inherent speckle and artifacts such as shadows, attenuation, and signal dropout. Existing methods need to include strong priors, such as shape priors or analytical intensity models, to succeed in the segmentation. However, such priors tend to limit these methods to a specific target or imaging setting, and they are not always applicable to pathological cases. This work introduces a semi-supervised segmentation framework for ultrasound imaging that alleviates this limitation of fully automatic segmentation; that is, it is applicable to any kind of target and imaging setting. Our methodology represents the ultrasound image as a graph of image patches and uses user-assisted initialization with labels, which act as soft priors. The segmentation problem is formulated as a continuous minimum-cut problem and solved with an efficient optimization algorithm. We validate our segmentation framework on clinical ultrasound images (prostate, fetus, and tumors of the liver and eye). We obtain high agreement with the ground truth provided by medical expert delineations in all applications (94% Dice values on average), and the proposed algorithm compares favorably with the literature.
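The Dice value reported above is the standard overlap measure between a computed segmentation and an expert delineation. A minimal sketch for binary masks (the function name and toy masks are illustrative, not from the paper):

```python
def dice(mask_a, mask_b):
    """Dice similarity coefficient: 2|A ∩ B| / (|A| + |B|) for binary masks."""
    inter = sum(a and b for a, b in zip(mask_a, mask_b))
    total = sum(mask_a) + sum(mask_b)
    # Convention: two empty masks are in perfect agreement.
    return 2.0 * inter / total if total else 1.0

# Toy example: two 5-pixel masks overlapping in 2 pixels.
print(dice([1, 1, 1, 0, 0], [0, 1, 1, 1, 0]))  # 2*2 / (3+3) ≈ 0.667
```

On real images the masks would be flattened 2D label maps; the formula is unchanged, and a Dice of 0.94 corresponds to the average expert agreement the paper reports.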

Relevance:

20.00%

Publisher:

Abstract:

We give the first systematic study of strong isomorphism reductions, a notion of reduction more appropriate than polynomial-time reduction when, for example, comparing the computational complexity of the isomorphism problem for different classes of structures. We show that the partial ordering of its degrees is quite rich. We analyze its relationship to a further type of reduction between classes of structures based purely on comparing, for every n, the number of nonisomorphic structures of cardinality at most n in both classes. Furthermore, in a more general setting we address the question of the existence of a maximal element in the partial ordering of the degrees.
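A plausible formalization of the counting-based comparison described above (the notation is mine; the paper's precise definition may differ in details): for classes of structures $C$ and $D$, write
\[
C \preceq D \quad \Longleftrightarrow \quad \forall n \;\; \#\{\,A \in C : |A| \le n\,\}/{\cong} \;\le\; \#\{\,B \in D : |B| \le n\,\}/{\cong},
\]
i.e. for every $n$, the number of isomorphism classes of structures of cardinality at most $n$ in $C$ is bounded by the corresponding count for $D$.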

Relevance:

20.00%

Publisher:

Abstract:

Vintage capital growth models were at the heart of growth theory in the 1960s. This research line collapsed in the late 1960s with the so-called embodiment controversy and the technical sophistication of the vintage models. This paper analyzes the astonishing revival of this literature in the 1990s. In particular, it outlines three methodological breakthroughs explaining this resurgence: a growth-accounting revolution, taking advantage of the availability of new time series; an optimal-control revolution, allowing vintage capital optimal growth models to be studied safely; and a vintage human capital revolution which, along with the rise of economic demography, accounts for the vintage structure of human capital in the same way as the age structuring of physical capital. The related literature is surveyed.

Relevance:

20.00%

Publisher:

Abstract:

Introduction. There is some cross-sectional evidence that theory of mind ability is associated with social functioning in people with psychosis, but the direction of this relationship is unknown. This study investigates the longitudinal association between both theory of mind and psychotic symptoms and social functioning outcome in first-episode psychosis. Methods. Fifty-four people with first-episode psychosis were followed up at 6 and 12 months. Random-effects regression models were used to estimate the stability of theory of mind over time and the association between baseline theory of mind and psychotic symptoms and social functioning outcome. Results. Neither baseline theory of mind ability (regression coefficients: Hinting test 1.07, 95% CI 0.74, 2.88; Visual Cartoon test 2.91, 95% CI 7.32, 1.51) nor baseline symptoms (regression coefficients: positive symptoms 0.04, 95% CI 1.24, 1.16; selected negative symptoms 0.15, 95% CI 2.63, 2.32) was associated with social functioning outcome. There was evidence that theory of mind ability was stable over time (regression coefficients: Hinting test 5.92, 95% CI 6.66, 8.92; Visual Cartoon test score 0.13, 95% CI 0.17, 0.44). Conclusions. Neither baseline theory of mind ability nor psychotic symptoms are associated with social functioning outcome. Further longitudinal work is needed to understand the origin of social functioning deficits in psychosis.
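The confidence intervals quoted above are standard Wald intervals around regression coefficients. A minimal sketch of how such an interval is formed from a coefficient and its standard error (the numbers below are illustrative, not the study's data):

```python
def wald_ci(coef, se, z=1.96):
    """Approximate 95% Wald confidence interval: coef ± z·SE."""
    return (coef - z * se, coef + z * se)

# Illustrative: a coefficient of 1.0 with a (made-up) standard error of 0.5.
lo, hi = wald_ci(1.0, 0.5)
print(round(lo, 2), round(hi, 2))  # 0.02 1.98 — interval crosses neither bound tightly
```

When an interval of this form includes zero, the coefficient is not statistically distinguishable from no association, which is the pattern the study reports for baseline theory of mind and symptoms.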