928 results for heavy-quark effective theory
Abstract:
The diagrammatic strong-coupling perturbation theory (SCPT) for correlated electron systems is developed for intersite Coulomb interaction and for a nonorthogonal basis set. The construction is based on iterations of exact closed equations for many-electron Green functions (GFs) for Hubbard operators in terms of functional derivatives with respect to external sources. The graphs which do not contain contributions from fluctuations of the local population numbers of the ion states play a special role: a one-to-one correspondence is found between the subset of such graphs for the many-electron GFs and the complete set of Feynman graphs of weak-coupling perturbation theory (WCPT) for single-electron GFs. This fact is used to formulate the approximation of renormalized Fermions (ARF), in which the many-electron quasi-particles behave analogously to normal Fermions. Then, by analyzing (a) Sham's equation, which connects the self-energy and the exchange-correlation potential in density functional theory (DFT), and (b) the Galitskii and Migdal expressions for the total energy, written within WCPT and within ARF SCPT, we suggest a method to improve the description of systems with correlated electrons within the local density approximation (LDA) to DFT. The formulation, in terms of renormalized Fermions LDA (RF LDA), is obtained by introducing the spectral weights of the many-electron GFs into the definitions of the charge density, the overlap matrices, and the effective mixing and hopping matrix elements in existing electronic structure codes, whereas the weights themselves have to be found from an additional set of equations. Compared with the LDA+U and self-interaction correction (SIC) methods, RF LDA has the advantage of taking into account the transfer of spectral weights and, when formulated in terms of GFs, also allows for consideration of excitations and nonzero temperature. 
Going beyond the ARF SCPT, as well as RF LDA, and taking into account the fluctuations of ion population numbers would require writing completely new codes for ab initio calculations. The application of RF LDA to ab initio band structure calculations for rare earth metals is presented in part II of this study (this issue). (c) 2005 Wiley Periodicals, Inc.
Abstract:
We consider the problem of estimating P(Y1 + ... + Yn > x) by importance sampling when the Yi are i.i.d. and heavy-tailed. The idea is to exploit the cross-entropy method as a tool for choosing good parameters in the importance sampling distribution; in doing so, we use the asymptotic description that, given Y1 + ... + Yn > x, n - 1 of the Yi have distribution F and one has the conditional distribution of Y given Y > x. We show in some specific parametric examples (Pareto and Weibull) how this leads to precise answers which, as demonstrated numerically, are close to being variance minimal within the parametric class under consideration. Related problems for M/G/1 and GI/G/1 queues are also discussed.
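The importance-sampling scheme summarised above can be illustrated for the Pareto case. This is a minimal sketch under stated assumptions, not the paper's method: the function names are hypothetical, and the fixed tilted parameter alpha_is merely stands in for the cross-entropy-optimised choice the abstract refers to.

```python
import random

def pareto_sample(rng, alpha):
    """Inverse-transform draw from a Pareto(alpha) variable on [1, inf)."""
    return (1.0 - rng.random()) ** (-1.0 / alpha)

def pareto_pdf(y, alpha):
    """Density of Pareto(alpha) on [1, inf)."""
    return alpha * y ** (-alpha - 1.0) if y >= 1.0 else 0.0

def is_estimator(n, x, alpha, alpha_is, reps=5000, seed=1):
    """Importance-sampling estimate of P(Y1 + ... + Yn > x) for i.i.d.
    Pareto(alpha) summands: sample from the heavier-tailed Pareto(alpha_is)
    and reweight each accepted sample by the likelihood ratio."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(reps):
        ys = [pareto_sample(rng, alpha_is) for _ in range(n)]
        if sum(ys) > x:
            w = 1.0
            for y in ys:
                w *= pareto_pdf(y, alpha) / pareto_pdf(y, alpha_is)
            total += w
    return total / reps
```

With alpha_is < alpha the proposal places more mass on the rare event, so the indicator fires often and the likelihood-ratio weights stay bounded on the event of interest, which is the variance-reduction mechanism the abstract exploits.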
Abstract:
The schema of an information system can significantly impact the ability of end users to efficiently and effectively retrieve the information they need. Quickly obtaining the appropriate data increases the likelihood that an organization will make good decisions and respond adeptly to challenges. This research presents and validates a methodology for evaluating, ex ante, the relative desirability of alternative instantiations of a model of data. In contrast to prior research, each instantiation is based on a different formal theory. This research theorizes that the instantiation that yields the lowest weighted average query complexity for a representative sample of information requests is the most desirable instantiation for end-user queries. The theory was validated by an experiment that compared end-user performance using an instantiation of a data structure based on the relational model of data with performance using the corresponding instantiation of the data structure based on the object-relational model of data. Complexity was measured using three different Halstead metrics: program length, difficulty, and effort. For a representative sample of queries, the average complexity using each instantiation was calculated. As theorized, end users querying the instantiation with the lower average complexity made fewer semantic errors, i.e., were more effective at composing queries. (c) 2005 Elsevier B.V. All rights reserved.
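The three Halstead metrics named above are standard functions of operator and operand counts. A small helper (hypothetical name, not taken from the study) shows how length, difficulty, and effort follow from the four counts:

```python
import math

def halstead_metrics(n1, n2, N1, N2):
    """Halstead program length, difficulty and effort, computed from
    distinct operators (n1), distinct operands (n2),
    total operators (N1) and total operands (N2)."""
    length = N1 + N2                        # program length N
    vocabulary = n1 + n2                    # vocabulary n
    volume = length * math.log2(vocabulary) # volume V = N * log2(n)
    difficulty = (n1 / 2.0) * (N2 / n2)     # difficulty D
    effort = difficulty * volume            # effort E = D * V
    return {"length": length, "difficulty": difficulty, "effort": effort}
```

For a query, the tokens are classified into operators and operands, counted, and the resulting metrics averaged over the sample of queries, as in the evaluation described above.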
Abstract:
A self-consistent theory is derived to describe the BCS-Bose-Einstein-condensate crossover for a strongly interacting Fermi gas with a Feshbach resonance. In the theory the fluctuation of the dressed molecules, consisting of both preformed Cooper pairs and bare Feshbach molecules, has been included within a self-consistent T-matrix approximation, beyond the Nozieres and Schmitt-Rink strategy considered by Ohashi and Griffin. The resulting self-consistent equations are solved numerically to investigate the normal-state properties of the crossover at various resonance widths. It is found that the superfluid transition temperature T-c increases monotonically at all widths as the effective interaction between atoms becomes more attractive. Furthermore, a residue factor Z(m) of the molecule's Green function and a complex effective mass have been determined to characterize the fraction and lifetime of Feshbach molecules at T-c. Our many-body calculations of Z(m) agree qualitatively well with recent measurements of the gas of Li-6 atoms near the broad resonance at 834 G. The crossover from narrow to broad resonances has also been studied.
Abstract:
The estimation of P(Sn > u) by simulation, where Sn is the sum of independent, identically distributed random variables Y1, ..., Yn, is of importance in many applications. We propose two simulation estimators based upon the identity P(Sn > u) = nP(Sn > u, Mn = Yn), where Mn = max(Y1, ..., Yn). One estimator uses importance sampling (for Yn only), and the other uses conditional Monte Carlo, conditioning upon Y1, ..., Yn-1. Properties of the relative error of the estimators are derived and a numerical study is given in terms of the M/G/1 queue, in which n is replaced by an independent geometric random variable N. The conclusion is that the new estimators compare extremely favorably with previous ones. In particular, the conditional Monte Carlo estimator is the first heavy-tailed example of an estimator with bounded relative error. Further improvements are obtained in the random-N case by incorporating control variates and stratification techniques into the new estimation procedures.
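The conditional Monte Carlo estimator built on the identity P(Sn > u) = nP(Sn > u, Mn = Yn) can be sketched for Pareto summands. This is a minimal illustration only (hypothetical function names, Pareto chosen as one of the heavy-tailed cases); the random-N, control-variate and stratification refinements described above are omitted.

```python
import random

def pareto_tail(x, alpha):
    """Survival function of a Pareto(alpha) variable on [1, inf)."""
    return 1.0 if x < 1.0 else x ** (-alpha)

def cond_mc(n, u, alpha, reps=10000, seed=0):
    """Conditional Monte Carlo estimate of P(Sn > u) for n >= 2:
    draw Y1..Y(n-1) and integrate out Yn analytically. Given the
    first n-1 summands, Yn must exceed both their maximum (so that
    Mn = Yn) and u minus their sum (so that Sn > u)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(reps):
        ys = [(1.0 - rng.random()) ** (-1.0 / alpha) for _ in range(n - 1)]
        total += n * pareto_tail(max(max(ys), u - sum(ys)), alpha)
    return total / reps
```

Because each sample evaluates a smooth tail probability rather than a 0/1 indicator, no sample is wasted, which is the intuition behind the bounded relative error reported above.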
Abstract:
In this thesis work we develop a new generative model of social networks belonging to the family of time-varying networks. Correctly modelling the mechanisms shaping the growth of a network and the dynamics of edge activation and inactivation is of central importance in network science. Indeed, by means of generative models that mimic the real-world dynamics of contacts in social networks it is possible to forecast the outcome of an epidemic process, optimize an immunization campaign or optimally spread information among individuals. This task can now be tackled by taking advantage of the recent availability of large-scale, high-quality and time-resolved datasets. This wealth of digital data has allowed us to deepen our understanding of the structure and properties of many real-world networks. Moreover, the empirical evidence of a temporal dimension in networks prompted a switch of paradigm from a static representation of graphs to a time-varying one. In this work we exploit the activity-driven paradigm (a modeling tool belonging to the family of time-varying networks) to develop a general dynamical model that encodes the fundamental mechanisms shaping the social network's topology and temporal structure: social capital allocation and burstiness. The former accounts for the fact that individuals do not invest their time and social interactions at random, but rather allocate them toward already known nodes of the network. The latter accounts for the heavy-tailed distributions of inter-event times in social networks. We then empirically measure the properties of these two mechanisms in seven real-world datasets, develop a data-driven model, and solve it analytically. We check the results against numerical simulations and test our predictions on real-world datasets, finding good agreement between the two. 
Moreover, we find and characterize a non-trivial interplay between burstiness and social capital allocation in the parameter phase space. Finally, we present a novel approach to the development of a complete generative model of time-varying networks. This model is inspired by Kauffman's adjacent-possible theory and is based on a generalized version of Polya's urn. Remarkably, most of the complex and heterogeneous features of real-world social networks are naturally reproduced by this dynamical model, together with many higher-order topological properties (clustering coefficient, community structure, etc.).
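The social-capital-allocation mechanism described above can be sketched as one step of an activity-driven network with memory. This is an assumption-laden sketch, not the thesis's model: the function names are invented, and the reinforcement kernel c/(k+c), under which a node that already knows k others contacts a new node with probability c/(k+c), is one plausible form of the "allocate toward already known nodes" rule.

```python
import random

def memory_step(n_nodes, activities, contacts, rng, c=1.0):
    """One time step of an activity-driven network with memory (sketch).
    Each node i fires with probability activities[i]; an active node
    either reaches a brand-new node (probability c/(k+c), where k is
    the number of nodes it already knows) or revisits a known contact."""
    edges = []
    for i in range(n_nodes):
        if rng.random() >= activities[i]:
            continue  # node i is inactive at this time step
        known = contacts[i]
        if not known or rng.random() < c / (len(known) + c):
            j = rng.randrange(n_nodes)      # explore: contact a random node
            while j == i:
                j = rng.randrange(n_nodes)
        else:
            j = rng.choice(sorted(known))   # exploit: revisit a known contact
        contacts[i].add(j)
        contacts[j].add(i)
        edges.append((i, j))
    return edges
```

Burstiness would enter by drawing the times between a node's activations from a heavy-tailed distribution instead of firing with a constant per-step probability; that extension is omitted here.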
Abstract:
Most advanced economies offer publicly financed advice services to start-up firms and SMEs. In England, local or regional Business Links organisations have provided these services, and divided their support into nonintensive one-off contacts providing information or advice and more intensive support involving a diagnostic process and repeated interaction with firms. A key choice for Business Link managers is how to shape their intervention strategies, balancing resources between intensive and nonintensive support. Drawing on resource dependency theory, we develop a typology of intervention strategies for Business Links in England which reflects differences in the breadth and depth of the support provided. We then test the impacts of these alternative intervention models on client companies using both subjective assessments by firms and econometric treatment models that allow for selection bias. Our key empirical result is that Business Links’ choice of intervention strategy has a significant effect both on actual and on perceived business outcomes, with our results emphasising the value of depth over breadth. The implication is that where additional resources are available for business support these should be used to deepen the assistance provided rather than extend assistance to a wider group of firms.
Abstract:
A major problem in modern probabilistic modeling is the huge computational complexity involved in typical calculations with multivariate probability distributions when the number of random variables is large. Because exact computations are infeasible in such cases and Monte Carlo sampling techniques may reach their limits, there is a need for methods that allow for efficient approximate computations. One of the simplest approximations is based on the mean field method, which has a long history in statistical physics. The method is widely used, particularly in the growing field of graphical models. Researchers from disciplines such as statistical physics, computer science, and mathematical statistics are studying ways to improve this and related methods and are exploring novel application areas. Leading approaches include the variational approach, which goes beyond factorizable distributions to achieve systematic improvements; the TAP (Thouless-Anderson-Palmer) approach, which incorporates correlations by including effective reaction terms in the mean field theory; and the more general methods of graphical models. Bringing together ideas and techniques from these diverse disciplines, this book covers the theoretical foundations of advanced mean field methods, explores the relation between the different approaches, examines the quality of the approximation obtained, and demonstrates their application to various areas of probabilistic modeling.
Abstract:
Many see the absence of conflict between groups as indicative of effective intergroup relations. Others consider its management a suitable effectiveness criterion. In this article we demarcate a different approach and propose that these views are deficient in describing effective intergroup relations. The article theorizes alternative criteria of intergroup effectiveness rooted in team representatives' subjective value judgements and assesses the psychometric characteristics of a short measure based on these criteria. Results on empirical validity suggest the measure to be a potential alternative outcome of organizational conflict. Implications for both the study of intergroup relations and conflict theory are discussed. © 2005 Psychology Press Ltd.
Abstract:
Gain insight into crucial British mental health approaches for LGB individuals. There is very little collaborative literature between LGB-affirmative psychologists and psychotherapists in the United States and the United Kingdom. British Lesbian, Gay, and Bisexual Psychologies: Theory, Research, and Practice may well be a crucial beginning step in building dialogue between these two countries on important LGB psychotherapy developments. Leading authorities comprehensively examine the latest studies and effective therapies for LGB individuals in the United Kingdom. Practitioners will discover an extensive survey of the most current developments to supplement their own work, while educators and students will find diverse expert perspectives on which to consider and broaden their own viewpoints. This unique book offers an informative introduction to British psychosocial perspectives on theory, research, and practice. British Lesbian, Gay, and Bisexual Psychologies provides a critical exploration of the recent history of LGB psychology and psychotherapy in the United Kingdom, focusing on key publications and outlining the current terrain. Other chapters are organized into two thematic sections. The first section explores theoretical frameworks in United Kingdom therapeutic practice, while the second section examines sexual minority identities and their needs for support and community. 
Topics in British Lesbian, Gay, and Bisexual Psychologies include: - similarities and differences between LGBT psychology and psychotherapy in the United States and United Kingdom - gay affirmative therapy (GAT) as a positive framework - existential-phenomenological approach to psychotherapy - core issues in the anxiety about whether or not to “come out” - object relations theory - exploring homo-negativity in the therapeutic process - aspects of psychotherapy that lesbians and gay men find helpful - research into how the mainstreaming of lesbian and gay culture has affected the lives of LGB individuals - study into LGB youth issues - difficulties of gay men with learning disabilities—with suggestions on how to offer the best psychological service - a study on gay athletes’ experiences of coming out in a heterosexist world British Lesbian, Gay, and Bisexual Psychologies takes a needed step toward sharing valuable psychosocial perspectives between countries. This useful, enlightening text is perfect for educators, students, psychologists, psychotherapists, and counselors working in the field of sexuality.
Abstract:
A range of chromia pillared montmorillonite and tin oxide pillared laponite clay catalysts, as well as new pillared clay materials such as cerium and europium oxide pillared montmorillonites, were synthesised. Methods included both conventional ion exchange techniques and microwave enhanced methods to improve performance and/or reduce preparation time. These catalytic materials were characterised in detail both before and after use in order to study the effect of the preparation parameters (starting material, preparation method, pillaring species, hydroxyl to metal ratio, etc.) and the hydrocracking procedure on their properties. This led to a better understanding of the nature of their structure and catalytic operation. These catalysts were evaluated with regard to their performance in hydrocracking coal derived liquids in a conventional microbomb reactor (carried out at Imperial College). Nearly all catalysts displayed better conversions when reused. The chromia pillared montmorillonite CM3 and the tin oxide pillared laponite SL2a showed the best "conversions". The intercalation of chromium in the form of chromia (Cr2O3) in the interlayer clearly increased conversion. This was attributed to the redox activity of the chromia pillar. However, this increase was not proportional to the increase in chromium content or basal spacing. In the case of tin oxide pillared laponite, the catalytic activity might have been a result of better access to the acid sites due to the delaminated nature of laponite, whose activity was promoted by the presence of tin oxide. The manipulation of the structural properties of the catalysts via pillaring did not seem to have any effect on the catalysts' activity. This was probably due to the collapse of the pillars under hydrocracking conditions, as indicated by the similar basal spacing of the catalysts after use. However, the type of the pillaring species had a significant effect on conversion. 
Whereas pillaring with chromium and tin oxides increased the conversion exhibited by the parent clays, pillaring with cerium and europium oxides appeared to have a detrimental effect. The relatively good performance of the parent clays was attributed to their acid sites, coupled with their macropores, which are able to accommodate the very high molecular mass of coal derived liquids. A microwave reactor operating at moderate conditions was modified for hydrocracking coal derived liquids and tested with the conventional catalyst NiMo on alumina. It was thought that microwave irradiation could enable conversion to occur at milder conditions than those conventionally used, coupled with a more effective use of hydrogen. The latter could lead to lower operating costs, making the process cost effective. However, in practice excessive coke deposition took place, leading to negative total conversion. This was probably due to a very low hydrogen pressure, unable to have any hydrocracking effect even under microwave irradiation. The decomposition of bio-oil under microwave irradiation was studied, aiming to identify the extent to which the properties of bio-oil change as a function of time, temperature, mode of heating, and presence of char and catalyst. This information would be helpful not only for upgrading bio-oil to transport fuels, but also for any potential fuel application. During this study the rate constants of bio-oil's decomposition were calculated assuming first order kinetics.
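The first-order-kinetics assumption mentioned at the end reduces rate-constant extraction to a one-line formula. A minimal sketch (hypothetical helper, not the thesis's calculation) of how a rate constant follows from two concentration measurements:

```python
import math

def first_order_k(c0, c, t):
    """Rate constant k for first-order decay C(t) = C0 * exp(-k * t),
    recovered from an initial concentration c0 and a later
    concentration c measured at time t: k = ln(c0 / c) / t."""
    return math.log(c0 / c) / t
```

In practice one would fit ln C(t) against t over a series of measurements rather than use a single pair of points, but the slope recovered is the same k.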
Abstract:
This paper presents the design and results of a task-based user study, based on Information Foraging Theory, on a novel user interaction framework - uInteract - for content-based image retrieval (CBIR). The framework includes a four-factor user interaction model and an interactive interface. The user study involves three focused evaluations, 12 simulated real life search tasks with different complexity levels, 12 comparative systems and 50 subjects. Information Foraging Theory is applied to the user study design and the quantitative data analysis. The systematic findings have not only shown how effective and easy to use the uInteract framework is, but also illustrate the value of Information Foraging Theory for interpreting user interaction with CBIR. © 2011 Springer-Verlag Berlin Heidelberg.
Abstract:
A large number of studies have been devoted to modeling the contents and interactions between users on Twitter. In this paper, we propose a method inspired by Social Role Theory (SRT), which assumes that a user behaves differently in different roles in the generation process of Twitter content. We consider the two most distinctive social roles on Twitter: originator and propagator, who respectively posts original messages and retweets or forwards the messages from others. In addition, we also consider role-specific social interactions, especially implicit interactions between users who share some common interests. All the above elements are integrated into a novel regularized topic model. We evaluate the proposed method on real Twitter data. The results show that our method is more effective than the existing ones which do not distinguish social roles. Copyright 2013 ACM.
Abstract:
Purpose - The purpose of this paper is to demonstrate analytically how entrepreneurial action as learning relating to diversifying into technical clothing - i.e. a high-value manufacturing sector - can take place. This is particularly relevant to recent discussion and debate in academic and policy-making circles concerning the survival of the clothing manufacture industry in developed industrialised countries. Design/methodology/approach - Using situated learning theory (SLT) as the major analytical lens, this case study examines an episode of entrepreneurial action relating to diversification into a high-value manufacturing sector. It is considered on instrumentality grounds, revealing wider tendencies in the management of knowledge and capabilities requisite for effective entrepreneurial action of this kind. Findings - Boundary events, brokers, boundary objects, membership structures and inclusive participation that addresses power asymmetries are found to be crucial organisational design elements, enabling the development of inter- and intracommunal capacities. These together constitute a dynamic learning capability, which underpins entrepreneurial action, such as diversification into high-value manufacturing sectors. Originality/value - Through a refinement of SLT in the context of entrepreneurial action, the paper contributes to an advancement of a substantive theory of managing technological knowledge and capabilities for effective diversification into high-value manufacturing sectors. Copyright © 2014 Emerald Group Publishing Limited. All rights reserved.
Abstract:
Purpose: The purpose of the research described in this paper is to disentangle the rhetoric from the reality in relation to supply chain management (SCM) adoption in practice. There is significant evidence of a divergence between theory and practice in the field of SCM. Design/methodology/approach: Based on a review of extant theory, the authors posit a new definitional construct for SCM – the Four Fundamentals – and investigated four research questions (RQs) that emerged from the theoretical review. The empirical work comprised three main phases: focussed interviews, focus groups and a questionnaire survey. Each phase used the authors’ definitional construct as its basis. While the context of the paper’s empirical work is Ireland, the insights and results are generalisable to other geographical contexts. Findings: The data collected during the various stages of the empirical research supported the essence of the definitional construct and allowed it to be further developed and refined. In addition, the findings suggest that, while levels of SCM understanding are generally quite high, there is room for improvement in relation to how this understanding is translated into practice. Research limitations/implications: Expansion of the research design to incorporate case studies, grounded theory and action research has the potential to generate new SCM theory that builds on the Four Fundamentals construct, thus facilitating a deeper and richer understanding of SCM phenomena. The use of longitudinal studies would enable a barometer of progress to be developed over time. Practical implications: The authors’ definitional construct supports improvement in the cohesion of SCM practices, thereby promoting the effective implementation of supply chain strategies. 
A number of critical success factors and/or barriers to implementation of SCM theory in practice are identified, as are a number of practical measures that could be implemented at policy/supply chain/firm level to improve the level of effective SCM adoption. Originality/value: The authors’ robust definitional construct supports a more cohesive approach to the development of a unified theory of SCM. In addition to a profile of SCM understanding and adoption by firms in Ireland, the related critical success factors and/or inhibitors to success, as well as possible interventions, are identified.