898 results for nonlocal theories and models
Abstract:
The behavior of the pion transition form factor for the processes gamma* gamma -> pi0 and gamma* gamma* -> pi0 at large values of space-like photon momenta is estimated within the nonlocal covariant quark-pion model. It is shown that, in general, the coefficient of the leading asymptotic term depends dynamically on the ratio of the constituent quark mass to the average virtuality of quarks in the vacuum, and kinematically on the ratio of photon virtualities. The kinematic dependence of the transition form factor allows us to obtain a relation between the pion light-cone distribution amplitude and the quark-pion vertex function. The dynamic dependence indicates that the transition form factor of gamma* gamma -> pi0 at high momentum transfers is very sensitive to the nonlocality size of nonperturbative fluctuations in the QCD vacuum. (C) 2000 Elsevier B.V. All rights reserved.
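For reference, the standard perturbative-QCD benchmark against which such estimates are compared is the Brodsky-Lepage asymptotic limit for one real photon (stated here from the general literature, not from this abstract):

```latex
\lim_{Q^2 \to \infty} Q^2\, F_{\gamma^*\gamma \to \pi^0}(Q^2) \;=\; 2 f_\pi \simeq 0.185\ \text{GeV},
\qquad f_\pi \simeq 92\ \text{MeV}.
```

The abstract's point is that in the nonlocal model this coefficient acquires an additional dynamical dependence on the ratio of the constituent quark mass to the average vacuum virtuality of quarks.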
Abstract:
Behavioral finance, or behavioral economics, is a theoretical field of research holding that psychological and behavioral variables are involved in financial activities such as corporate finance and investment decisions (i.e. asset allocation, portfolio management and so on). The field has attracted increasing interest from scholars and financial professionals since the episodes of multiple speculative bubbles and financial crises. Indeed, practical inconsistencies between economic events and traditional neoclassical financial theories have pushed more and more researchers to look for new and broader models and theories. The purpose of this work is to present this field of research, still little known to a vast majority. This work is thus a survey that introduces its origins and its main theories, while contrasting them with the traditional finance theories still predominant nowadays. The main question guiding this work is whether this area of inquiry can provide better explanations for real-life market phenomena. To that end, the study presents some market anomalies unsolved by traditional theories which have recently been addressed by behavioral finance researchers. In addition, it presents a practical application to portfolio management, comparing asset allocation under the traditional Markowitz approach with the Black-Litterman model, which incorporates some features of behavioral finance.
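The closing comparison can be sketched numerically. The following is a minimal illustration with entirely hypothetical covariances, weights, and views (the work's actual data and calibration are not given here): it computes Markowitz mean-variance weights from equilibrium returns, then the Black-Litterman posterior after blending in one subjective view.

```python
import numpy as np

# All numbers below are hypothetical, for illustration only.
Sigma = np.array([[0.04, 0.006],
                  [0.006, 0.01]])   # covariance matrix of asset returns
w_mkt = np.array([0.6, 0.4])        # market-capitalization weights
delta = 2.5                         # investor risk aversion
tau = 0.05                          # scales uncertainty of the equilibrium prior

# Markowitz step: implied equilibrium excess returns (the prior).
pi = delta * Sigma @ w_mkt

# One subjective view: "asset 0 outperforms asset 1 by 8%",
# more bullish on the spread than equilibrium implies.
P = np.array([[1.0, -1.0]])
q = np.array([0.08])
Omega = P @ (tau * Sigma) @ P.T     # view uncertainty (a common default choice)

# Black-Litterman posterior mean (the "master formula").
inv = np.linalg.inv
mu_bl = inv(inv(tau * Sigma) + P.T @ inv(Omega) @ P) @ (
    inv(tau * Sigma) @ pi + P.T @ inv(Omega) @ q)

# Unconstrained mean-variance weights w = Sigma^-1 mu / delta.
w_markowitz = inv(Sigma) @ pi / delta   # recovers w_mkt by construction
w_bl = inv(Sigma) @ mu_bl / delta       # tilts toward asset 0, per the view

print(w_markowitz, w_bl)
```

In this toy setup the Markowitz weights simply reproduce the market portfolio, while the Black-Litterman weights shift toward the asset favoured by the view; how far they shift is governed by tau and Omega.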
Abstract:
The aim of this study was to describe and quantify the effect of aquatic pollution on the fish assemblage structure of the Corumbatai River (Brazil) by comparing two sites with different water quality characteristics. The results revealed that the abundance of individuals was low at the polluted site (B). However, the two sites did not differ significantly in species richness (total and average). This fact contradicts theories stating that stretches where the transverse area of the channel is larger should present higher biological richness. It was also observed that the ichthyofauna of site B had higher evenness and, consequently, a tendency toward higher diversity than that of site A. This demonstrates that diversity estimates should be used cautiously in environmental impact studies, as they do not necessarily indicate better conditions of the communities living in more preserved environments.
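The richness, evenness, and diversity comparison rests on standard community indices. A minimal sketch, assuming the commonly used Shannon-Wiener diversity and Pielou evenness (the abstract does not name the exact indices, and the counts below are hypothetical):

```python
import math

def shannon_diversity(counts):
    """Shannon-Wiener index H' = -sum(p_i * ln(p_i))."""
    n = sum(counts)
    return -sum((c / n) * math.log(c / n) for c in counts if c > 0)

def pielou_evenness(counts):
    """Pielou's J' = H' / ln(S), where S is the species richness."""
    s = sum(1 for c in counts if c > 0)
    return shannon_diversity(counts) / math.log(s)

# Hypothetical abundances: same richness (4 species) at both sites,
# but site B's individuals are spread more evenly than site A's.
site_a = [120, 10, 5, 5]   # one dominant species -> low evenness
site_b = [12, 9, 8, 6]     # similar abundances   -> high evenness

print(pielou_evenness(site_a))  # low
print(pielou_evenness(site_b))  # close to 1
```

Equal richness with higher evenness yields the higher diversity tendency the study reports for the polluted site, which is why diversity alone cannot certify a healthier community.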
Abstract:
We consider a two-parameter family of Z(2) gauge theories on a lattice discretization T(M) of a three-manifold M and its relation to topological field theories. Familiar models such as the spin-gauge model are curves on a parameter space Gamma. We show that there is a region Gamma_0 ⊂ Gamma where the partition function and the expectation value <W_R(gamma)> of the Wilson loop can be computed exactly. Depending on the point of Gamma_0, the model behaves as topological or quasi-topological. The partition function is, up to a scaling factor, a topological number of M. The Wilson loop, on the other hand, does not depend on the topology of gamma. However, for a subset of Gamma_0, <W_R(gamma)> depends on the size of gamma and follows a discrete version of an area law. In the zero-temperature limit, the spin-gauge model approaches the topological or the quasi-topological region depending on the sign of the coupling constant.
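The "discrete version of an area law" mentioned above takes the usual schematic form (generic notation, not taken from the paper):

```latex
\langle W_R(\gamma) \rangle \;\sim\; e^{-\sigma\, A(\gamma)},
```

where A(gamma) counts the plaquettes of a minimal surface spanning the loop gamma and the string tension sigma depends on the point of Gamma_0; in the topological region sigma = 0, so the expectation value is insensitive to the size of gamma.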
Abstract:
In this thesis we develop further the functional renormalization group (RG) approach to quantum field theory (QFT) based on the effective average action (EAA) and on the exact flow equation that it satisfies. The EAA is a generalization of the standard effective action that interpolates smoothly between the bare action for k -> infinity and the standard effective action for k -> 0. In this way, the problem of performing the functional integral is converted into the problem of integrating the exact flow of the EAA from the UV to the IR. The EAA formalism deals naturally with several different aspects of a QFT. One aspect is related to the discovery of non-Gaussian fixed points of the RG flow that can be used to construct continuum limits. In particular, the EAA framework is a useful setting in which to search for Asymptotically Safe theories, i.e. theories valid up to arbitrarily high energies. A second aspect in which the EAA reveals its usefulness is non-perturbative calculations. In fact, the exact flow that it satisfies is a valuable starting point for devising new approximation schemes. In the first part of this thesis we review and extend the formalism; in particular, we derive the exact RG flow equation for the EAA and the related hierarchy of coupled flow equations for the proper vertices. We show how standard perturbation theory emerges as a particular way to iteratively solve the flow equation if the starting point is the bare action. Next, we explore both technical and conceptual issues by means of three different applications of the formalism: to QED, to general non-linear sigma models (NLσM) and to matter fields on curved spacetimes. In the main part of this thesis we construct the EAA for non-abelian gauge theories and for quantum Einstein gravity (QEG), using the background field method to implement the coarse-graining procedure in a gauge invariant way.
We propose a new truncation scheme where the EAA is expanded in powers of the curvature or field strength. Crucial to the practical use of this expansion is the development of new techniques to manage functional traces, such as the algorithm proposed in this thesis. This allows one to project out the flow of all terms in the EAA that are analytic in the fields. As an application, we show how the low-energy effective action for quantum gravity emerges as the result of integrating the RG flow. In any treatment of theories with local symmetries that introduces a reference scale, the question of preserving gauge invariance along the flow becomes paramount. In the EAA framework this problem is handled using the background field formalism. This comes at the cost of enlarging the theory space where the EAA lives to the space of functionals of both fluctuation and background fields. In this thesis, we study how the identities dictated by the symmetries are modified by the introduction of the cutoff, and we study so-called bimetric truncations of the EAA that contain both fluctuation and background couplings. In particular, we confirm the existence of a non-Gaussian fixed point for QEG, which is at the heart of the Asymptotic Safety scenario in quantum gravity, in the enlarged bimetric theory space where the running of the cosmological constant and of Newton's constant is influenced by fluctuation couplings.
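The exact flow equation satisfied by the EAA is the standard Wetterich equation, stated here in its textbook form (Gamma_k is the EAA, R_k the infrared cutoff):

```latex
\partial_t \Gamma_k[\varphi] \;=\; \frac{1}{2}\,\mathrm{Tr}\!\left[\left(\Gamma_k^{(2)}[\varphi] + R_k\right)^{-1} \partial_t R_k\right],
\qquad t = \ln k .
```

Functional differentiation of this equation with respect to the fields generates the hierarchy of coupled flow equations for the proper vertices.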
Abstract:
Leadership categorisation theory suggests that followers rely on a hierarchical cognitive structure in perceiving leaders and the leadership process, which consists of three levels: superordinate, basic and subordinate. The predominant view is that followers rely on Implicit Leadership Theories (ILTs) at the basic level in making judgments about managers. The thesis examines whether this presumption is true by proposing and testing two competing conceptualisations: the congruence between basic-level ILTs (general leader) and actual manager perceptions, and between subordinate-level ILTs (job-specific leader) and the actual manager. The conceptualisation at the job-specific level builds on context-related assertions of the ILT explanatory models: leadership categorisation, information processing and connectionist network theories. Further, the thesis addresses the effects of ILT congruence at the group level. The hypothesised model suggests that Leader-Member Exchange (LMX) acts as a mediator between ILT congruence and outcomes. Three studies examined the proposed model. The first was cross-sectional, with 175 students reporting on work experience during a 1-year industrial placement. The second was longitudinal, with a sample of 343 students engaging in a business simulation in groups with formal leadership. The final study was a cross-sectional survey in several organisations, with a sample of 178. A novel approach was taken to congruence analysis: the hypothesised models were tested using Latent Congruence Modelling (LCM), which accounts for measurement error and overcomes the majority of limitations of traditional approaches.
The first two studies confirm the traditionally theorised view that employees rely on basic-level ILTs in making judgments about their managers, and show that LMX mediates the relationship between ILT congruence and work-related outcomes (performance, job satisfaction, well-being, task satisfaction, intragroup conflict, group satisfaction, team realness, team-member exchange, group performance). The third study confirms this with conflict, well-being, self-rated performance and commitment as outcomes.
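The mediation claim (ILT congruence -> LMX -> outcomes) can be illustrated with a simple product-of-coefficients estimate. This is a deliberately simplified sketch on simulated data with hypothetical effect sizes; it is not the Latent Congruence Modelling actually used in the thesis:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Simulated data (hypothetical effect sizes): congruence -> LMX -> outcome.
congruence = rng.normal(size=n)
lmx = 0.5 * congruence + rng.normal(scale=0.8, size=n)                  # path a
outcome = 0.6 * lmx + 0.1 * congruence + rng.normal(scale=0.8, size=n)  # paths b, c'

def ols_slope(y, *xs):
    """Least-squares slopes of y on an intercept plus the given regressors."""
    X = np.column_stack([np.ones(len(y)), *xs])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1:]

a = ols_slope(lmx, congruence)[0]            # congruence -> LMX
b, c_prime = ols_slope(outcome, lmx, congruence)
indirect = a * b                             # mediated (indirect) effect
total = ols_slope(outcome, congruence)[0]    # total effect c

print(f"indirect a*b = {indirect:.3f}, direct c' = {c_prime:.3f}, total c = {total:.3f}")
```

In linear OLS the total effect decomposes exactly as c = c' + a*b; LCM additionally treats congruence and the constructs as latent variables with measurement error, which this sketch ignores.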
Abstract:
A significant body of scholarly and practitioner-based research has developed in recent years that seeks both to theorize upon and to empirically measure the competitiveness of regions. However, the disparate and fragmented nature of this work has left the various analyses and measurement methodologies without a substantive theoretical foundation. The aim of this paper is to place the regional competitiveness discourse within the context of theories of economic growth and, more particularly, those concerning regional economic growth. It is argued that regional competitiveness models are usually implicitly constructed in the lineage of endogenous growth frameworks, whereby deliberate investments in factors such as human capital and knowledge are considered key drivers of growth differentials. This leads to the suggestion that regional competitiveness can usefully be defined as the capacity and capability of regions to achieve economic growth relative to other regions at a similar overall stage of economic development, usually within their own nation or continental bloc. The paper further assesses future avenues for theoretical and methodological exploration, highlighting the role of institutions, resilience and well-being in understanding how the competitiveness of regions influences their long-term evolution.
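For concreteness, the endogenous growth lineage invoked here typically starts from a production function of the broad Romer/Lucas type (a generic textbook form, not one proposed in the paper), in which output depends on deliberately accumulated knowledge A and human capital h:

```latex
Y \;=\; K^{\alpha}\,\bigl(A\, h\, L\bigr)^{1-\alpha}, \qquad 0 < \alpha < 1,
```

with A and h growing through purposeful investment rather than arriving as exogenous technical change, so that regional differences in such investment translate directly into growth differentials.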
Abstract:
One of the fundamental questions of modern accounting is how the addressees of financial reporting, the stakeholders, can be identified. This endeavour already played a central role in the classical, since-superseded theories, and it has become crucial in modern and post-modern theories. Experience shows that the circle of identified stakeholders has been modified and widened. In examining this evolution, numerous characteristics of accounting were identified with whose help the relevant rules can be improved. The study of this evolution also made it directly observable under what conditions the necessity of a power regulating accounting externally can be justified. The examination identified situations in which the accounting regulator and "externally directed" financial reporting lead to a suboptimal outcome. The article presents the development of stakeholder theories starting from the classical conceptions. It reveals what is new in the modern, currently accepted coalition view of the firm, and above all how that view called the external regulator into being. _____ One of the key problems of modern financial accounting is how to define the stakeholders. This was already a key issue in the now-outdated classical stakeholder theories. Research and experience have shown that the group of stakeholders has widened and been modified. Through this evolution, researchers identified many characteristics of financial reporting through which regulation could be improved. This advance pointed out the situations in which the existence of an external accounting regulator may be justified, since under given circumstances externally directed reporting led to a suboptimal scenario. This paper deals with the stakeholder theories, starting with the classical ones.
The article points out how the currently accepted theory changed the assertions of the previous one, and how the external regulator was created as an inevitable consequence. The paper also highlights the main issues raised by the post-modern theories, those which try to fit current questions into the current stakeholder models. The article also provides Hungarian evidence for the previously mentioned suboptimal scenario, in which regulation that is not tax-driven proves to be suboptimal.
Abstract:
Resource allocation decisions are made to serve the current emergency without knowing which future emergency will occur. Different ordered combinations of emergencies result in different performance outcomes. Even though future decisions can be anticipated with scenarios, previous models assume that events over a time interval are independent. This dissertation assumes instead that events are interdependent, because speed reduction and rubbernecking due to an initial incident provoke secondary incidents. The misconception that secondary incidents are uncommon has led to the look-ahead concept being overlooked. This dissertation pioneers the relaxation of the structural assumption of independence in the assignment of emergency vehicles. When an emergency is detected and a request arrives, an appropriate emergency vehicle is immediately dispatched. We provide tools for quantifying impacts based on the fundamentals of incident occurrence through the identification, prediction, and interpretation of secondary incidents. The proposed online dispatching model minimizes the cost of moving the next emergency unit while keeping the response as close to optimal as possible. Using the look-ahead concept, the online model flexibly re-computes the solution, basing future decisions on present requests. We introduce various online dispatching strategies with visualization of the algorithms and provide insights into their differences in behavior and solution quality. The experimental evidence indicates that the algorithm works well in practice. After serving a designated request, the available and/or remaining vehicles are relocated to a new base for the next emergency. System costs will be excessive if the delay in dispatching decisions is ignored when relocating response units.
This dissertation presents an integrated method whose principle is to begin with a location phase to manage initial incidents and progress through a dispatching phase to manage the stochastic occurrence of subsequent incidents. Previous studies used the frequency of independent incidents and ignored scenarios in which two incidents occurred within proximal regions and intervals. The proposed analytical model relaxes the structural assumptions of the Poisson process (independent increments) and incorporates the evolution of primary and secondary incident probabilities over time. The mathematical model overcomes several limiting assumptions of previous models, such as no waiting time, a rule of returning to the original depot, and fixed depots. The temporary unit locations made flexible by look-ahead are compared with current practice, which locates units at depots based on Poisson theory. A linearization of the formulation is presented, and an efficient heuristic algorithm is implemented to deal with a large-scale problem in real time.
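The contrast between myopic dispatching and look-ahead can be made concrete with a toy baseline. Below is a minimal sketch of a purely greedy, nearest-available-unit dispatcher on a 1-D road segment; positions and requests are hypothetical, and the dissertation's actual model additionally anticipates secondary incidents when choosing and relocating units:

```python
def dispatch_greedy(vehicle_positions, requests):
    """Assign each incoming request to the nearest vehicle (1-D positions
    for simplicity); the chosen vehicle relocates to the incident site.
    This is the myopic baseline that a look-ahead policy improves upon."""
    positions = list(vehicle_positions)
    assignments = []
    for r in requests:
        i = min(range(len(positions)), key=lambda j: abs(positions[j] - r))
        assignments.append(i)
        positions[i] = r   # vehicle is now based at the incident location
    return assignments

# Hypothetical: three depots at 0, 10, 20; incidents arrive in order.
print(dispatch_greedy([0, 10, 20], [12, 13, 1]))  # -> [1, 1, 0]
```

Note how the second incident at 13 (plausibly a secondary incident near the first) is again served by unit 1, now already on scene; a look-ahead policy would weigh such dependent follow-up incidents when deciding where units wait.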
Abstract:
In this thesis I present a new three-way connection we found between quantum integrability, N=2 supersymmetric gauge theories and black hole perturbation theory. I use the approach of the ODE/IM correspondence between Ordinary Differential Equations (ODE) and Integrable Models (IM), first to connect basic integrability functions, Baxter's Q, T and Y functions, to the gauge theory periods. This fundamental identification allows several new results for both theories, for example: an exact nonlinear integral equation (Thermodynamic Bethe Ansatz, TBA) for the gauge periods; an interpretation of the integrability functional relations as new exact R-symmetry relations for the periods; and new formulas for the local integrals of motion in terms of gauge periods. I develop this in full detail for the SU(2) gauge theory with Nf=0,1,2 matter flavours. Still through the ODE/IM correspondence, I connect the mathematically precise definition of quasinormal modes of black holes (which play an important role in gravitational-wave observations) with quantization conditions on the Q, Y functions. In this way I also give a mathematical explanation of the recently found connection between quasinormal modes and N=2 supersymmetric gauge theories. Moreover, there follows a new, simple and effective method, the TBA, to numerically compute the quasinormal modes, which I compare with other standard methods. The spacetimes for which I work this out in full detail are, in the simplest Nf=0 case, the D3 brane and, in the Nf=1,2 cases, a generalization of extremal Reissner-Nordström (charged) black holes. I then begin to treat the Nf=3,4 theories as well, and argue how our integrability-gauge-gravity correspondence can generalize to other types of black holes in either asymptotically flat (Nf=3) or Anti-de-Sitter (Nf=4) spacetime. Finally, I begin to show the extension to a four-fold correspondence including Conformal Field Theory (CFT), through the renowned AdS/CFT correspondence.
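For orientation, a TBA equation of the kind invoked here has the schematic single-function form (generic textbook notation; the kernel, mass scale and pseudo-energy of the actual gauge-period TBA are specific to the thesis and not reproduced here):

```latex
\varepsilon(\theta) \;=\; m L \cosh\theta \;-\; \int_{-\infty}^{\infty} \frac{d\theta'}{2\pi}\,\varphi(\theta-\theta')\,\ln\!\left(1+e^{-\varepsilon(\theta')}\right),
```

a closed nonlinear integral equation that can be solved efficiently by iteration, which is what makes the TBA attractive as a numerical method for quasinormal modes.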
Abstract:
This article intends to rationally reconstruct Locke's theory of knowledge as incorporated in a research program concerning the nature and structure of theories and models of rationality. In previous articles we argued that the rationalist program can be subdivided into the classical rationalistic subprogram, which includes the knowledge theories of Descartes, Locke, Hume and Kant; the neoclassical subprogram, which includes the approaches of Duhem, Poincaré and Mach; and the critical subprogram of Popper. The subdivision results from the different views of rationality proposed by each of these subprograms, as well as from the tools each makes available, containing theoretical instruments used to arrange, organize and develop the discussion on rationality, chief among which is the structure of problem solving. In this essay we intend to reconstruct the assumptions of Locke's theory of knowledge, which in our view belongs to the classical rationalistic subprogram because it shares with it the thesis of the identity of (scientific) knowledge and certain knowledge.