51 results for off-shell triangle diagram
Abstract:
We study the properties of K̄* mesons in nuclear matter using a unitary approach in coupled channels within the framework of the local hidden gauge formalism, incorporating the K̄π decay channel in the medium. The in-medium K̄*N interaction accounts for Pauli blocking effects and incorporates the K̄* self-energy in a self-consistent manner. We also obtain the K̄* (off-shell) spectral function and analyze its behavior at finite density and momentum. At normal nuclear matter density, the K̄* meson feels a moderately attractive potential, while its width becomes five times larger than in free space. We estimate the transparency ratio of the γ A → K⁺ K*⁻ A′ reaction, which we propose as a feasible scenario at present facilities for detecting changes in the properties of the K̄* meson in the nuclear medium.
Abstract:
A generalized off-shell unitarity relation for the two-body scattering T matrix in a many-body medium at finite temperature is derived through a consistent real-time perturbation expansion by means of Feynman diagrams. We comment on perturbation schemes at finite temperature in connection with an erroneous formulation of the Dyson equation in a recently published paper.
Abstract:
"See the abstract at the beginning of the document in the attached file."
Abstract:
Inductive learning aims at finding general rules that hold true in a database. Targeted learning seeks rules for predicting the value of a variable from the values of others, as in linear or non-parametric regression analysis. Non-targeted learning finds regularities without a specific prediction goal. We model the product of non-targeted learning as rules stating that a certain phenomenon never happens, or that certain conditions necessitate another. For all types of rules there is a trade-off between a rule's accuracy and its simplicity, so rule selection can be viewed as a choice problem among pairs of degree of accuracy and degree of complexity. However, one cannot in general tell what the feasible set in the accuracy-complexity space is. Formally, we show that deciding whether a point belongs to this set is computationally hard. In particular, in the context of linear regression, finding a small set of variables that attains a certain value of R² is computationally hard. Computational complexity may explain why a person is not always aware of rules that, if asked, she would find valid. This, in turn, may explain why one can change other people's minds (opinions, beliefs) without providing new information.
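A toy illustration of the hardness claim (our own sketch, not code from the paper; the data and target R² are made up): the obvious general way to find the smallest set of regressors reaching a given R² is to search over subsets, and the number of subsets grows exponentially with the number of variables.

```python
# Illustrative sketch, not from the paper: brute-force search for the smallest
# subset of regressors whose OLS fit reaches a target R^2. The 2^p candidate
# subsets make this approach intractable for large p, echoing the hardness result.
import itertools
import numpy as np

def smallest_subset_for_r2(X, y, target_r2):
    """Smallest column subset of X whose least-squares fit attains target_r2."""
    n, p = X.shape
    tss = np.sum((y - y.mean()) ** 2)           # total sum of squares
    for k in range(1, p + 1):                   # subsets of increasing size
        for cols in itertools.combinations(range(p), k):
            Xs = np.column_stack([np.ones(n), X[:, cols]])
            beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
            rss = np.sum((y - Xs @ beta) ** 2)  # residual sum of squares
            if 1 - rss / tss >= target_r2:
                return cols
    return None

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 6))
y = 2 * X[:, 1] - X[:, 4] + 0.1 * rng.normal(size=100)
found = smallest_subset_for_r2(X, y, target_r2=0.9)
print(found)
```

Here neither informative column alone explains 90% of the variance, so the search must reach subsets of size two before it succeeds.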
Abstract:
This paper analyzes the joint dynamics of two key macroeconomic variables for the conduct of monetary policy: inflation and the aggregate capacity utilization rate. An econometric procedure useful for estimating dynamic rational expectations models with unobserved components is developed and applied in this context. The method combines the flexibility of the unobserved components approach, based on the Kalman recursion, with the power of the generalized method of moments estimation procedure. A 'hybrid' Phillips curve relating inflation to the capacity utilization gap and incorporating forward- and backward-looking components is estimated. The results show that this relationship is non-linear: the slope of the Phillips curve depends significantly on the magnitude of the capacity gap. These findings provide support for studying the implications of asymmetric monetary policy rules.
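The Kalman recursion mentioned above fits in a few lines; the model below (a random-walk unobserved component observed with noise, with made-up variances) is only a minimal illustration, not the paper's estimator.

```python
# Minimal univariate Kalman filter for x_t = x_{t-1} + w_t (latent state) and
# y_t = x_t + e_t (observation). Variances q and r are illustrative values.
def kalman_filter(ys, q=0.1, r=1.0):
    x, p = 0.0, 1.0                 # prior mean and variance of the state
    filtered = []
    for y in ys:
        p = p + q                   # prediction step: variance grows by q
        k = p / (p + r)             # Kalman gain
        x = x + k * (y - x)         # update with the new observation
        p = (1 - k) * p
        filtered.append(x)
    return filtered

estimates = kalman_filter([1.0, 1.2, 0.9, 1.1])
print(estimates[-1])
```

In the paper's setting the filtered state would play the role of the unobserved capacity gap entering the moment conditions.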
Abstract:
The aim of this paper is to measure the returns to human capital. We use a unique data set consisting of matched employer-employee information. Data on individuals' human capital include a set of 26 competences that capture the utilization of workers' skills in a very detailed way. Thus, we can expand the concept of human capital and discuss which types of skills are more productive in the workplace and, hence, generate a higher payoff for workers. The rich information on firm and workplace characteristics allows us to introduce a broad range of controls and to improve on previous research in this field. This paper gives evidence that the returns to generic competences differ depending on the position of the worker in the firm. Only numeracy skills are rewarded independently of the occupational status of the worker. The level of technology used by the firm in the production process does not directly increase workers' pay, but it influences the pay-off to some of the competences. JEL Classification: J24, J31
Abstract:
This doctoral project is an interdisciplinary effort aimed at obtaining new functional nanocomposites (NCs) synthesized from ion-exchange polymeric materials modified with metal nanoparticles (MNPs) of different compositions. The materials developed are evaluated for two possible applications: 1) as catalysts for organic reactions of current interest (palladium-based NCs) and 2) as bactericidal agents for the treatment of domestic or industrial water (silver-based NCs). The development of nanomaterials is currently of great interest because of their special properties, whose exploitation is the driving force behind the fabrication of new NCs. Polymer-stabilized metal nanoparticles (PSNPM) were prepared by the in-situ intermatrix synthesis (IMS) technique, which consists of the sequential loading of the functional groups of the polymeric matrices with metal ions, followed by their chemical reduction inside the ion-exchange polymeric matrix. Stabilization in polymeric matrices prevents self-aggregation, one of the main known problems of NPs. For the development of this methodology, different types of ion-exchange polymeric matrices were used: sulfonated poly(ether ether ketone) (SPEEK) membranes, as well as polypropylene-based synthetic fibers with different types of functional groups, which allow their use as filters for the disinfection of aqueous solutions or as catalytic materials. Over the course of the project, the final nanocomposite material was progressively optimized for the applications of interest, in terms of nanoparticle activity and functionality and nanocomposite stability.
Thus, the synthesis of NPs stabilized in ion-exchange resins was optimized by screening different types of resins and evaluating them in industrial applications of interest.
Abstract:
Almost 182 million citizens of the European Union (37.5% of the total population) live in roughly 130 border and cross-border regions. These regions contribute significantly to the European integration process. Their importance is documented by the 2007-2013 Structural Funds package, presented by the European Commission and recently approved by the European Parliament. Whereas the EU spent some €4,875 million on cross-border, transnational and interregional cooperation under the Interreg initiative for the 2000-2006 period, European territorial cooperation will become one of the three objectives of the Structural Funds and will receive €7,750 million (€5,570 million for cross-border cooperation alone) for the 2007-2013 period (European Commission, 2006a, 2006b). In addition, a new set of rules for establishing a "European grouping of territorial cooperation" (EGTC) has been adopted, which will facilitate cross-border, transnational and interregional cooperation in the EU. This paper deals with the structures of institutionalization, decision-making and implementation, and with the policies, of the "Greater Region" / "Großregion" (hereinafter: GR or Greater Region).
Abstract:
This project was carried out to improve, in several respects, the Honda GX35 engine of the low-consumption vehicle of the University of Girona (UdG). This is a gasoline internal combustion engine (Otto cycle). The objective is the design of a cylinder head that minimizes gasoline consumption and can be coupled to the Honda GX35 engine, which must first be modified so that the new cylinder head can be installed.
Abstract:
Throughout the history of Electrical Engineering education, vector and phasor diagrams have been used as a fundamental learning tool. At present, computational power has replaced them with long data lists, the result of solving systems of equations by numerical methods. Diagrams have thus been relegated to the academic background and, although explained in theory, are not used in a practical way in specific examples. This may work against students' understanding of the complex behavior of electrical power systems. This article proposes a modification of the classical Perrine-Baum diagram construction that allows both a more practical representation and a better understanding of the behavior of a high-voltage electric line under different load levels. At the same time, this modification allows forecasting the obsolescence of this behavior and the line's loading capacity. In addition, we evaluate the impact of this tool on the learning process, showing comparative undergraduate results over three academic years.
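The phasor arithmetic underlying such diagram constructions can be illustrated as follows; the generalized line constants and load levels below are hypothetical values, not data from the article.

```python
# Hypothetical example: sending-end voltage Vs = A*Vr + B*Ir of a transmission
# line at several load levels, 0.9 power factor lagging. All parameter values
# are illustrative, not taken from the article.
import cmath
import math

A = cmath.rect(0.98, math.radians(0.3))    # generalized constant A (dimensionless)
B = cmath.rect(40.0, math.radians(78.0))   # generalized constant B (ohms)
Vr = 132e3 / math.sqrt(3)                  # receiving-end phase voltage (V)

vs_kv = []
for load_mw in (20, 40, 60):
    p_phase = load_mw * 1e6 / 3            # active power per phase (W)
    i_mag = (p_phase / 0.9) / Vr           # current magnitude (A)
    Ir = cmath.rect(i_mag, -math.acos(0.9))
    Vs = A * Vr + B * Ir
    vs_kv.append(abs(Vs) / 1e3)
    print(f"{load_mw} MW: |Vs| = {abs(Vs) / 1e3:.1f} kV per phase")
```

Plotting the resulting Vs phasors for a sweep of load levels is exactly the kind of construction a Perrine-Baum-style diagram makes visual.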
Abstract:
The basis set superposition error-free second-order Møller-Plesset perturbation theory of intermolecular interactions was studied. The difficulties of the counterpoise (CP) correction in open-shell systems were also discussed. The calculations were performed with a program used for testing the new variants of the theory. It was shown that the CP correction for the diabatic surfaces should be preferred to that for the adiabatic ones.
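For reference, the counterpoise recipe itself can be written down in a few lines; the energies below are made-up numbers, not results from the study.

```python
# Sketch of the Boys-Bernardi counterpoise (CP) correction: each monomer
# energy is recomputed in the full dimer basis (with "ghost" functions on the
# partner site) before forming the interaction energy.
def interaction_energy_cp(e_dimer, e_a_dimer_basis, e_b_dimer_basis):
    """CP-corrected interaction energy: E_AB - E_A(AB basis) - E_B(AB basis)."""
    return e_dimer - e_a_dimer_basis - e_b_dimer_basis

# Illustrative MP2 total energies in hartree (fabricated for the example):
e_int = interaction_energy_cp(-152.51023, -76.25401, -76.25390)
print(f"E_int = {e_int:.5f} hartree")
```

Because the monomer energies are lowered by the extra ghost functions, the CP-corrected interaction energy is less attractive than the uncorrected one.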
Abstract:
The Person Trade-Off (PTO) is a methodology aimed at measuring the social value of health states. Other methodologies measure individual utility and are less appropriate for resource allocation decisions. However, few studies have been conducted to test the validity of the method. We present a pilot study with this objective. The study is based on the results of interviews with 30 undergraduate students in Economics. We judge the validity of PTO answers by their adequacy to three hypotheses of rationality. First, we show that, given certain rationality assumptions, PTO answers should be predictable from answers to Standard Gamble questions. This first hypothesis is not verified. The second hypothesis is that PTO answers should not vary across different framings of equivalent PTO questions. This second hypothesis is also not verified. Our third hypothesis is that PTO values should predict social preferences for allocating resources between patients. This hypothesis is verified. The evidence on the validity of the method is therefore conflicting.
Abstract:
We present a new unifying framework for investigating throughput-WIP (Work-in-Process) optimal control problems in queueing systems, based on reformulating them as linear programming (LP) problems with special structure. We show that if a throughput-WIP performance pair in a stochastic system satisfies the Threshold Property we introduce in this paper, then we can reformulate the problem of optimizing a linear objective of throughput-WIP performance as a (semi-infinite) LP problem over a polygon with special structure (a threshold polygon). The strong structural properties of such polygons explain the optimality of threshold policies for optimizing linear performance objectives: their vertices correspond to the performance pairs of threshold policies. We analyze in this framework the versatile input-output queueing intensity control model introduced by Chen and Yao (1990), obtaining a variety of new results, including (a) an exact reformulation of the control problem as an LP problem over a threshold polygon; (b) an analytical characterization of the Min WIP function (giving the minimum WIP level required to attain a target throughput level); (c) an LP Value Decomposition Theorem that relates the objective value under an arbitrary policy with that of a given threshold policy (thus revealing the LP interpretation of Chen and Yao's optimality conditions); (d) diminishing returns and invariance properties of throughput-WIP performance, which underlie threshold optimality; (e) a unified treatment of the time-discounted and time-average cases.
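A concrete instance of the threshold-policy idea (our own minimal example, assuming a simple M/M/1 queue with admission threshold K rather than the Chen-Yao model itself): each threshold K induces one throughput-WIP performance pair, and sweeping K traces out the vertices of a threshold polygon.

```python
# Illustrative M/M/1/K admission-threshold policy: arrivals are admitted only
# while fewer than K jobs are present. Each K yields one throughput-WIP pair.
def threshold_performance(lam, mu, K):
    """Stationary throughput and mean WIP under admission threshold K."""
    rho = lam / mu
    weights = [rho ** n for n in range(K + 1)]
    z = sum(weights)
    pi = [w / z for w in weights]            # stationary queue-length distribution
    throughput = lam * (1 - pi[K])           # rate of admitted (= served) jobs
    wip = sum(n * pi[n] for n in range(K + 1))
    return throughput, wip

pairs = [threshold_performance(lam=0.8, mu=1.0, K=K) for K in (1, 2, 5, 10)]
for K, (tp, w) in zip((1, 2, 5, 10), pairs):
    print(f"K={K}: throughput={tp:.3f}, WIP={w:.3f}")
```

Raising K buys extra throughput at the cost of extra WIP, with diminishing returns, which is the trade-off the threshold polygon captures.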
Abstract:
This paper tests the internal consistency of time trade-off utilities. We find significant violations of consistency in the direction predicted by loss aversion. The violations disappear for higher gauge durations. We show that loss aversion can also explain why, for short gauge durations, time trade-off utilities exceed standard gamble utilities. Our results suggest that time trade-off measurements that use relatively short gauge durations, like the widely used EuroQol algorithm (Dolan 1997), are affected by loss aversion and lead to utilities that are too high.
Abstract:
This paper presents a test of the predictive validity of various classes of QALY models (i.e., linear, power and exponential models). We first estimated TTO utilities for 43 EQ-5D chronic health states, and these states were then embedded in health profiles. The chronic TTO utilities were then used to predict the responses to TTO questions with health profiles. We find that the power QALY model clearly outperforms the linear and exponential QALY models. The optimal power coefficient is 0.65. Our results suggest that TTO-based QALY calculations may be biased; this bias can be avoided by using a power QALY model.
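As a worked illustration of the power model (our own numerical example; only the coefficient 0.65 comes from the abstract): in a TTO question where x years in full health are judged equivalent to t years in a chronic state, the linear model implies a state value of x/t, while the power model implies (x/t)^0.65.

```python
# State value implied by a TTO indifference "x years healthy ~ t years in
# state q" under U(q, t) = v(q) * t**r. The linear QALY model is r = 1;
# r = 0.65 is the power coefficient reported in the abstract.
def tto_value(x, t, r):
    return (x / t) ** r

v_linear = tto_value(6, 10, r=1.0)   # classical linear QALY assumption
v_power = tto_value(6, 10, r=0.65)   # power QALY model
print(v_linear, round(v_power, 4))
```

For the same indifference, the power model implies a higher state value than the linear model, which is one way the linear TTO-based calculation can be biased.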