979 results for Theories and models


Relevance:

100.00%

Publisher:

Abstract:

We consider a two-parameter family of Z(2) gauge theories on a lattice discretization T(M) of a three-manifold M and its relation to topological field theories. Familiar models such as the spin-gauge model are curves on a parameter space Gamma. We show that there is a region Gamma_0 ⊂ Gamma where the partition function and the expectation value <W_R(gamma)> of the Wilson loop can be computed exactly. Depending on the point of Gamma_0, the model behaves as topological or quasi-topological. The partition function is, up to a scaling factor, a topological number of M. The Wilson loop, on the other hand, does not depend on the topology of gamma. However, for a subset of Gamma_0, <W_R(gamma)> depends on the size of gamma and follows a discrete version of an area law. In the zero-temperature limit, the spin-gauge model approaches the topological or the quasi-topological region depending on the sign of the coupling constant.
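
As a purely illustrative companion to this abstract (not code from the paper), the sketch below enumerates a Z(2) gauge theory exactly on a tiny 2x2 periodic square lattice — rather than on the triangulation T(M) of a three-manifold — and evaluates the partition function and a 1x1 Wilson loop. The lattice size, the single coupling beta, and the plaquette action are assumptions made only for the example.

```python
import itertools
import math

# Illustrative only: exact enumeration of a Z(2) gauge theory on a tiny
# 2x2 periodic square lattice (the paper works on a discretized 3-manifold,
# which this sketch does not attempt to reproduce).
L = 2        # lattice extent in each direction (toy size, assumption)
beta = 0.5   # coupling; the paper's parameter space is two-dimensional

sites = [(x, y) for x in range(L) for y in range(L)]
# Two links per site: direction 0 (+x) and direction 1 (+y).
links = [(x, y, d) for (x, y) in sites for d in (0, 1)]

def plaquette(cfg, x, y):
    """Product of the four Z(2) link variables around the plaquette at (x, y)."""
    return (cfg[(x, y, 0)] *
            cfg[((x + 1) % L, y, 1)] *
            cfg[(x, (y + 1) % L, 0)] *
            cfg[(x, y, 1)])

def action_weight(cfg):
    """Boltzmann weight exp(beta * sum of plaquettes)."""
    s = sum(plaquette(cfg, x, y) for (x, y) in sites)
    return math.exp(beta * s)

def wilson(cfg):
    """A 1x1 Wilson loop: the product of links around one elementary plaquette."""
    return plaquette(cfg, 0, 0)

Z = 0.0
W = 0.0
for values in itertools.product((+1, -1), repeat=len(links)):
    cfg = dict(zip(links, values))
    w = action_weight(cfg)
    Z += w
    W += wilson(cfg) * w

print("partition function Z =", Z)
print("<W_R> for a 1x1 loop =", W / Z)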

Relevance:

100.00%

Publisher:

Abstract:

In this thesis we develop further the functional renormalization group (RG) approach to quantum field theory (QFT) based on the effective average action (EAA) and on the exact flow equation that it satisfies. The EAA is a generalization of the standard effective action that interpolates smoothly between the bare action for k → ∞ and the standard effective action for k → 0. In this way, the problem of performing the functional integral is converted into the problem of integrating the exact flow of the EAA from the UV to the IR. The EAA formalism deals naturally with several different aspects of a QFT. One aspect is related to the discovery of non-Gaussian fixed points of the RG flow that can be used to construct continuum limits. In particular, the EAA framework is a useful setting to search for Asymptotically Safe theories, i.e. theories valid up to arbitrarily high energies. A second aspect in which the EAA reveals its usefulness is non-perturbative calculations. In fact, the exact flow that it satisfies is a valuable starting point for devising new approximation schemes. In the first part of this thesis we review and extend the formalism; in particular, we derive the exact RG flow equation for the EAA and the related hierarchy of coupled flow equations for the proper vertices. We show how standard perturbation theory emerges as a particular way to iteratively solve the flow equation, if the starting point is the bare action. Next, we explore both technical and conceptual issues by means of three different applications of the formalism: to QED, to general non-linear sigma models (NLσM) and to matter fields on curved spacetimes. In the main part of this thesis we construct the EAA for non-abelian gauge theories and for quantum Einstein gravity (QEG), using the background field method to implement the coarse-graining procedure in a gauge-invariant way. We propose a new truncation scheme where the EAA is expanded in powers of the curvature or field strength. Crucial to the practical use of this expansion is the development of new techniques to manage functional traces, such as the algorithm proposed in this thesis. This makes it possible to project the flow of all terms in the EAA which are analytic in the fields. As an application we show how the low-energy effective action for quantum gravity emerges as the result of integrating the RG flow. In any treatment of theories with local symmetries that introduces a reference scale, the question of preserving gauge invariance along the flow emerges as predominant. In the EAA framework this problem is dealt with through the use of the background field formalism. This comes at the cost of enlarging the theory space where the EAA lives to the space of functionals of both fluctuation and background fields. In this thesis, we study how the identities dictated by the symmetries are modified by the introduction of the cutoff, and we study so-called bimetric truncations of the EAA that contain both fluctuation and background couplings. In particular, we confirm the existence of a non-Gaussian fixed point for QEG, which is at the heart of the Asymptotic Safety scenario in quantum gravity, in the enlarged bimetric theory space, where the running of the cosmological constant and of Newton's constant is influenced by fluctuation couplings.
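
For reference, the exact flow equation satisfied by the EAA is conventionally written in the Wetterich form below; this is the standard expression from the functional RG literature, not necessarily in the thesis's own notation.

```latex
% Exact RG flow equation for the effective average action \Gamma_k
% (standard Wetterich form; R_k is the IR cutoff kernel, t = \ln k).
\partial_t \Gamma_k[\phi]
  = \frac{1}{2}\,\mathrm{STr}\!\left[\left(\Gamma_k^{(2)}[\phi] + R_k\right)^{-1}\partial_t R_k\right],
\qquad t \equiv \ln k .
```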

Relevance:

100.00%

Publisher:

Abstract:

The leadership categorisation theory suggests that followers rely on a hierarchical cognitive structure in perceiving leaders and the leadership process, which consists of three levels: superordinate, basic and subordinate. The predominant view is that followers rely on Implicit Leadership Theories (ILTs) at the basic level in making judgments about managers. The thesis examines whether this presumption is true by proposing and testing two competing conceptualisations; namely, the congruence between basic-level ILTs (general leader) and perceptions of the actual manager, and between subordinate-level ILTs (job-specific leader) and the actual manager. The conceptualisation at the job-specific level builds on context-related assertions of the ILT explanatory models: leadership categorisation, information processing and connectionist network theories. Further, the thesis addresses the effects of ILT congruence at the group level. The hypothesised model suggests that Leader-Member Exchange (LMX) will act as a mediator between ILT congruence and outcomes. Three studies examined the proposed model. The first was cross-sectional with 175 students reporting on work experience during a 1-year industrial placement. The second was longitudinal and had a sample of 343 students engaging in a business simulation in groups with formal leadership. The final study was a cross-sectional survey in several organisations with a sample of 178. A novel approach was taken to congruence analysis; the hypothesised models were tested using Latent Congruence Modelling (LCM), which accounts for measurement error and overcomes the majority of limitations of traditional approaches. The first two studies confirm the traditionally theorised view that employees rely on basic-level ILTs in making judgments about their managers, with important implications, and show that LMX mediates the relationship between ILT congruence and work-related outcomes (performance, job satisfaction, well-being, task satisfaction, intragroup conflict, group satisfaction, team realness, team-member exchange, group performance). The third study confirms this with conflict, well-being, self-rated performance and commitment as outcomes.

Relevance:

100.00%

Publisher:

Abstract:

A significant body of scholarly and practitioner-based research has developed in recent years that has sought both to theorize upon and to empirically measure the competitiveness of regions. However, the disparate and fragmented nature of this work has led to the lack of a substantive theoretical foundation underpinning the various analyses and measurement methodologies employed. The aim of this paper is to place the regional competitiveness discourse within the context of theories of economic growth, and more particularly those concerning regional economic growth. It is argued that regional competitiveness models are usually implicitly constructed in the lineage of endogenous growth frameworks, whereby deliberate investments in factors such as human capital and knowledge are considered to be key drivers of growth differentials. This leads to the suggestion that regional competitiveness can be usefully defined as the capacity and capability of regions to achieve economic growth relative to other regions at a similar overall stage of economic development, which will usually be within their own nation or continental bloc. The paper further assesses future avenues for theoretical and methodological exploration, highlighting the role of institutions, resilience and well-being in understanding how the competitiveness of regions influences their long-term evolution.
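
As a purely generic illustration of the endogenous-growth lineage referred to above (an assumed textbook-style specification, not a model taken from the paper), a regional competitiveness framework in this spirit might combine an augmented production function with knowledge accumulation driven by deliberate investment:

```latex
% Generic endogenous-growth-style specification (illustrative assumption only):
% regional output Y_r with physical capital K_r, human capital H_r, labour L_r,
% and a knowledge stock A_r whose growth depends on deliberate investment s_r.
Y_r = A_r\, K_r^{\alpha} H_r^{\beta} L_r^{1-\alpha-\beta},
\qquad
\frac{\dot{A}_r}{A_r} = g(s_r), \quad g'(s_r) > 0 .
```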

Relevance:

100.00%

Publisher:

Abstract:

One of the fundamental questions of modern accounting is how to identify the addressees of financial reporting – the stakeholders. This endeavour already played a central role in the classical theories, which have since become outdated, and it has become crucial in modern and post-modern theories. Experience shows that the set of identified stakeholders has been modified and has expanded. In examining this evolution it has been possible to identify several characteristics of accounting with the help of which the relevant rules can be improved. In addition, studying this evolution made it directly observable under what conditions the need for a power regulating accounting externally can be justified. The analysis also identified situations in which the accounting regulator and "externally directed" financial reporting lead to a suboptimal outcome. The article presents the evolution of stakeholder theories starting from the classical approaches. It reveals what was new in the modern – currently accepted – coalition view of the firm and, above all, how it brought the external regulator into being. _____ One of the key problems of modern financial accounting is how to define the stakeholders. This problem was already a key issue in the classical stakeholder theories, which have since become outdated. Research and experience have shown that the group of stakeholders has widened and been modified. Through this evolution researchers identified many characteristics of financial reporting through which the regulation could be improved. This advance pointed out the situations in which the existence of an external accounting regulator may be justified, since under given circumstances this existence led to a suboptimal scenario. This paper deals with the stakeholder theories, starting with the classical ones. The article points out how the currently accepted theory changed the assertions of the previous one and how the external regulator was created as an inevitable consequence. The paper also highlights the main issues raised by the post-modern theories, which try to fit the current questions into the current stakeholder models. The article also provides Hungarian evidence for the previously mentioned suboptimal scenario, in which regulation that is not tax-driven proves to be suboptimal.

Relevance:

100.00%

Publisher:

Abstract:

Resource allocation decisions are made to serve the current emergency without knowing which future emergency will occur. Different ordered combinations of emergencies result in different performance outcomes. Even though future decisions can be anticipated with scenarios, previous models assume that events over a time interval are independent. This dissertation instead assumes that events are interdependent, because the speed reduction and rubbernecking caused by an initial incident provoke secondary incidents. The misconception that secondary incidents are uncommon has resulted in the look-ahead concept being overlooked. This dissertation pioneers the relaxation of the structural assumption of independence in the assignment of emergency vehicles. When an emergency is detected and a request arrives, an appropriate emergency vehicle is immediately dispatched. We provide tools for quantifying impacts based on the fundamentals of incident occurrence through identification, prediction, and interpretation of secondary incidents. A proposed online dispatching model minimizes the cost of moving the next emergency unit while keeping the response as close to optimal as possible. Using the look-ahead concept, the online model flexibly re-computes the solution, basing future decisions on present requests. We introduce various online dispatching strategies with visualizations of the algorithms, and provide insights into their differences in behavior and solution quality. The experimental evidence indicates that the algorithm works well in practice. After a designated request has been served, the available and/or remaining vehicles are relocated to a new base for the next emergency. System costs will be excessive if the delay associated with dispatching decisions is ignored when relocating response units. This dissertation presents an integrated method that begins with a location phase to manage initial incidents and progresses through a dispatching phase to manage the stochastic occurrence of subsequent incidents. Previous studies used the frequency of independent incidents and ignored scenarios in which two incidents occurred within proximal regions and intervals. The proposed analytical model relaxes the structural assumptions of the Poisson process (independent increments) and incorporates the evolution of primary and secondary incident probabilities over time. The mathematical model overcomes several limiting assumptions of previous models, such as no waiting time, a rule of returning to the original depot, and fixed depots. The temporal locations made flexible by look-ahead are compared with current practice, which locates units at depots based on Poisson theory. A linearization of the formulation is presented, and an efficient heuristic algorithm is implemented to deal with a large-scale problem in real time.
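
Purely as an illustration of the look-ahead idea described above (not the dissertation's actual formulation), the sketch below implements a greedy online dispatcher in which the immediate travel cost of a candidate unit is augmented by a penalty reflecting the chance that the incident spawns a secondary incident nearby. The function names, the risk model and the weights are assumptions made only for the example.

```python
import math

# Illustrative sketch only: a greedy online dispatcher with a simple one-step
# look-ahead penalty for secondary incidents. The dissertation's model
# (probability evolution, relocation phase, linearized formulation) is richer.

def travel_cost(unit_pos, incident_pos):
    """Euclidean travel cost between a unit and an incident location."""
    return math.dist(unit_pos, incident_pos)

def secondary_risk(incident, elapsed_min):
    """Hypothetical decaying probability that the incident spawns a secondary one."""
    return incident["severity"] * math.exp(-elapsed_min / 30.0)

def dispatch(available_units, incident, lookahead_weight=1.0):
    """Pick the unit minimizing immediate cost plus a look-ahead penalty.

    The penalty approximates the cost of leaving the remaining fleet poorly
    placed for a likely secondary incident near the current one.
    """
    best_unit, best_cost = None, float("inf")
    for unit in available_units:
        immediate = travel_cost(unit["pos"], incident["pos"])
        remaining = [u for u in available_units if u is not unit]
        if remaining:
            # Expected extra cost if a secondary incident occurs nearby.
            nearest = min(travel_cost(u["pos"], incident["pos"]) for u in remaining)
            penalty = lookahead_weight * secondary_risk(incident, elapsed_min=0) * nearest
        else:
            penalty = 0.0
        total = immediate + penalty
        if total < best_cost:
            best_unit, best_cost = unit, total
    return best_unit, best_cost

# Toy usage: three units, one detected incident.
units = [{"id": 1, "pos": (0, 0)}, {"id": 2, "pos": (5, 5)}, {"id": 3, "pos": (9, 1)}]
event = {"pos": (4, 4), "severity": 0.6}
chosen, cost = dispatch(units, event)
print("dispatch unit", chosen["id"], "estimated cost", round(cost, 2))
```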

Relevance:

100.00%

Publisher:

Abstract:

This paper summarizes the papers presented in the thematic stream Models for the Analysis of Individual and Group Needs, at the 2007 IAEVG-SVP-NCDA Symposium: Vocational Psychology and Career Guidance Practice: An International Partnership. The predominant theme which emerged from the papers was that theory and practice need to be positioned within their contexts. For this paper, context has been formulated as a dimension ranging from the individual’s experience of himself or herself in conversations, including interpersonal transactions and body culture, through to broad higher levels of education, work, nation, and economy.

Relevance:

100.00%

Publisher:

Abstract:

Theories provide us with a frame of reference or model of how something works. Theoreticians who focus on the human state try to make a best-fit model. They try to imagine a typical case and generate a set of frameworks that might assist us to predict behaviour or some outcome, or simply explain how things work. They aim to understand how elements of interest might impact upon each other, and give rise to or predict behavioural, emotional, moral, physical, cognitive or social change for individuals and groups. Theories help give us insight. However, theories do not provide the templates for growth and change. They are simply someone’s informed and researched view regarding what might happen as people grow and interact with the physical and social world.

Relevance:

100.00%

Publisher:

Abstract:

In today's fiercely competitive product market, product warranty has started to play an important role. The warranty period offered by the manufacturer/dealer has been progressively increasing since the beginning of the 20th century. Currently, a large number of products are sold with long-term warranty policies in the form of extended warranties, warranties for used products, service contracts and lifetime warranty policies. Lifetime warranties are a relatively new concept. Modelling failures during the warranty period and the costs of such policies is complex, since the lifespan covered by these policies is not well defined and it is often difficult to characterise life measures over the longer coverage period, owing to usage patterns, the maintenance activities undertaken and uncertainties in costs over the period. This paper focuses on defining lifetime, developing lifetime warranty policies, and building models for predicting failures and estimating costs for lifetime warranty policies.
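
One common way to approximate costs for such long-coverage policies is sketched below; it assumes minimal repairs and a power-law (Weibull-type) failure intensity, which are stated assumptions for the example rather than the models developed in the paper. All parameter values are hypothetical.

```python
# Illustrative sketch, not the paper's model: expected cost of a long-term
# warranty under a minimal-repair assumption, where failures follow a
# non-homogeneous Poisson process with power-law (Weibull-type) intensity.

def expected_failures(coverage_years, eta, beta):
    """Expected number of failures over the coverage period:
    cumulative intensity Lambda(T) = (T / eta) ** beta."""
    return (coverage_years / eta) ** beta

def expected_warranty_cost(coverage_years, eta, beta, cost_per_repair):
    """Expected cost = cost per (minimal) repair * expected number of failures."""
    return cost_per_repair * expected_failures(coverage_years, eta, beta)

# Toy parameters (assumptions for the example): characteristic life 8 years,
# mild wear-out (beta > 1), average repair cost of 120 currency units.
for T in (5, 10, 20, 40):   # increasingly long coverage horizons
    cost = expected_warranty_cost(T, eta=8.0, beta=1.4, cost_per_repair=120.0)
    print(T, "years of coverage -> expected cost", round(cost, 1))
```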

Relevance:

100.00%

Publisher:

Abstract:

Any theory of thinking or teaching or learning rests on an underlying philosophy of knowledge. Mathematics education is situated at the nexus of two fields of inquiry, namely mathematics and education. However, numerous other disciplines interact with these two fields, which compounds the complexity of developing theories that define mathematics education. We first address the issue of clarifying a philosophy of mathematics education before attempting to answer whether theories of mathematics education are constructible. In doing so we draw on the foundational writings of Lincoln and Guba (1994), in which they clearly posit that any discipline within education, in our case mathematics education, needs to clarify for itself the following questions: (1) What is reality? Or, what is the nature of the world around us? (2) How do we go about knowing the world around us? [the methodological question, which presents possibilities to various disciplines to develop methodological paradigms] and (3) How can we be certain of the “truth” of what we know? [the epistemological question]

Relevance:

100.00%

Publisher:

Abstract:

This thesis addresses computational challenges arising from Bayesian analysis of complex real-world problems. Many of the models and algorithms designed for such analysis are ‘hybrid’ in nature, in that they are a composition of components whose individual properties may be easily described, but the performance of the model or algorithm as a whole is less well understood. The aim of this research project is to offer a better understanding of the performance of hybrid models and algorithms. The goal of this thesis is to analyse the computational aspects of hybrid models and hybrid algorithms in the Bayesian context. The first objective of the research focuses on computational aspects of hybrid models, notably a continuous finite mixture of t-distributions. In the mixture model, an inference of interest is the number of components, as this may relate both to the quality of model fit to data and to the computational workload. The analysis of t-mixtures using Markov chain Monte Carlo (MCMC) is described and the model is compared to the Normal case based on goodness of fit. Through simulation studies, it is demonstrated that the t-mixture model can be more flexible and more parsimonious in terms of the number of components, particularly for skewed and heavy-tailed data. The study also reveals important computational issues associated with the use of t-mixtures, which have not been adequately considered in the literature. The second objective of the research focuses on computational aspects of hybrid algorithms for Bayesian analysis. Two approaches are considered: a formal comparison of the performance of a range of hybrid algorithms, and a theoretical investigation of the performance of one of these algorithms in high dimensions. For the first approach, the delayed rejection algorithm, the pinball sampler, the Metropolis-adjusted Langevin algorithm and the hybrid version of the population Monte Carlo (PMC) algorithm are selected as a set of examples of hybrid algorithms. The statistical literature shows how statistical efficiency is often the only criterion for an efficient algorithm. In this thesis the algorithms are also considered and compared from a more practical perspective. This extends to the study of how individual algorithms contribute to the overall efficiency of hybrid algorithms, and highlights weaknesses that may be introduced by the process of combining these components in a single algorithm. The second approach to considering computational aspects of hybrid algorithms involves an investigation of the performance of the PMC in high dimensions. It is well known that as a model becomes more complex, computation may become increasingly difficult in real time. In particular, importance-sampling-based algorithms, including the PMC, are known to be unstable in high dimensions. This thesis examines the PMC algorithm in a simplified setting, a single step of the general sampling, and explores a fundamental problem that occurs when applying importance sampling to a high-dimensional problem. The precision of the computed estimate in the simplified setting is measured by the asymptotic variance of the estimate under conditions on the importance function. Additionally, the exponential growth of the asymptotic variance with the dimension is demonstrated, and we illustrate that the optimal covariance matrix for the importance function can be estimated in a special case.
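
The instability of importance sampling in high dimensions mentioned above can be illustrated with a small, self-contained experiment (an assumed toy setting, not the thesis's own): a standard Gaussian target with a slightly over-dispersed Gaussian proposal, where the effective sample size collapses as the dimension grows.

```python
import numpy as np

# Illustrative experiment: importance sampling with a slightly mismatched
# Gaussian proposal. As the dimension d grows, the weight variance blows up
# and the effective sample size (ESS) collapses — the instability noted above.
rng = np.random.default_rng(0)
n = 10_000            # number of importance samples
sigma = 1.2           # proposal standard deviation (target has std 1)

for d in (1, 5, 10, 25, 50):
    x = rng.normal(0.0, sigma, size=(n, d))          # draws from the proposal
    # log importance weights: log N(x; 0, I) - log N(x; 0, sigma^2 I)
    log_w = -0.5 * np.sum(x**2, axis=1) \
            - (-0.5 * np.sum((x / sigma)**2, axis=1) - d * np.log(sigma))
    w = np.exp(log_w - log_w.max())                  # stabilised weights
    w /= w.sum()
    ess = 1.0 / np.sum(w**2)                         # effective sample size
    print(f"d = {d:3d}   ESS ~ {ess:8.1f} out of {n}")
```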

Relevance:

100.00%

Publisher:

Abstract:

In 2005, Stephen Abram, vice president of Innovation at SirsiDynix, challenged library and information science (LIS) professionals to start becoming “librarian 2.0.” In the last few years, discussion and debate about the “core competencies” needed by librarian 2.0 have appeared in the “biblioblogosphere” (blogs written by LIS professionals). However, beyond these informal blog discussions few systematic and empirically based studies have taken place. A project funded by the Australian Learning and Teaching Council fills this gap. The project identifies the key skills, knowledge, and attributes required by “librarian 2.0.” Eighty-one members of the Australian LIS profession participated in a series of focus groups. Eight themes emerged as being critical to “librarian 2.0”: technology, communication, teamwork, user focus, business savvy, evidence-based practice, learning and education, and personal traits. Guided by these findings, interviews with 36 LIS educators explored the current approaches used within contemporary LIS education to prepare graduates to become “librarian 2.0”. This video presents an example of ‘great practice’ in current LIS educative practice in helping to foster web 2.0 professionals.

Relevance:

100.00%

Publisher:

Abstract:

In 2005, Stephen Abram, vice president of Innovation at SirsiDynix, challenged library and information science (LIS) professionals to start becoming “librarian 2.0.” In the last few years, discussion and debate about the “core competencies” needed by librarian 2.0 have appeared in the “biblioblogosphere” (blogs written by LIS professionals). However, beyond these informal blog discussions few systematic and empirically based studies have taken place. A project funded by the Australian Learning and Teaching Council fills this gap. The project identifies the key skills, knowledge, and attributes required by “librarian 2.0.” Eighty-one members of the Australian LIS profession participated in a series of focus groups. Eight themes emerged as being critical to “librarian 2.0”: technology, communication, teamwork, user focus, business savvy, evidence-based practice, learning and education, and personal traits. Guided by these findings, interviews with 36 LIS educators explored the current approaches used within contemporary LIS education to prepare graduates to become “librarian 2.0”. This video presents an example of ‘great practice’ in current LIS education as it strives to foster web 2.0 professionals.

Relevance:

100.00%

Publisher:

Abstract:

In 2005, Stephen Abram, vice president of Innovation at SirsiDynix, challenged library and information science (LIS) professionals to start becoming “librarian 2.0.” In the last few years, discussion and debate about the “core competencies” needed by librarian 2.0 have appeared in the “biblioblogosphere” (blogs written by LIS professionals). However, beyond these informal blog discussions few systematic and empirically based studies have taken place. A project funded by the Australian Learning and Teaching Council fills this gap. The project identifies the key skills, knowledge, and attributes required by “librarian 2.0.” Eighty-one members of the Australian LIS profession participated in a series of focus groups. Eight themes emerged as being critical to “librarian 2.0”: technology, communication, teamwork, user focus, business savvy, evidence-based practice, learning and education, and personal traits. Guided by these findings, interviews with 36 LIS educators explored the current approaches used within contemporary LIS education to prepare graduates to become “librarian 2.0”. This video presents an example of ‘great practice’ in current LIS education as it strives to foster web 2.0 professionals.