934 results for upscale extensions
Abstract:
We study extensions of the standard model with a strongly coupled fourth generation. This occurs in models where electroweak symmetry breaking is triggered by the condensation of at least some of the fourth-generation fermions. With focus on the phenomenology at the LHC, we study the pair production of fourth-generation down quarks, D(4). We consider the typical masses that could be associated with a strongly coupled fermion sector, in the range (300-600) GeV. We show that the production and subsequent decay of these heavy quarks into final states with same-sign dileptons, trileptons, and four leptons can easily be seen above background with relatively low luminosity. On the other hand, in order to confirm the presence of a new strong interaction responsible for fourth-generation condensation, we study its contribution to D(4) pair production, and the potential to separate it from standard QCD-induced heavy quark production. We show that this separation might require large amounts of data. This is true even if it is assumed that the new interaction is mediated by a massive colored vector boson, since its strong coupling to the fourth generation makes its width of the order of its mass. We conclude that, although this class of models can be falsified at early stages of the LHC running, its confirmation would require high integrated luminosities.
Abstract:
The appearance of spin-1 resonances associated with the electroweak symmetry breaking sector is expected in many extensions of the standard model. We analyze the CERN Large Hadron Collider potential to probe the spin of possible new charged and neutral vector resonances through the purely leptonic processes pp -> Z' -> l(+) l'(-) E(T), and pp -> W' -> l'(+/-) l(+) l(-) E(T), with l, l' = e or mu. We perform a model-independent analysis and demonstrate that the spin of the new states can be determined at 99% C.L. in a large fraction of the parameter space where these resonances can be observed with 100 fb(-1). We show that the best sensitivity to the spin is obtained by directly studying correlations between the final-state leptons, without the need to reconstruct the events in their center-of-mass frames.
Abstract:
We solve the operator ordering problem for the quantum continuous integrable su(1,1) Landau-Lifshitz model, and give a prescription for obtaining the quantum trace identities and the spectrum of the higher-order local charges. We also show that this method, based on operator regularization and renormalization, which guarantees quantum integrability as well as the construction of self-adjoint extensions, can be used as an alternative to the discretization procedure and, unlike the latter, is based only on integrable representations. (C) 2010 American Institute of Physics. [doi:10.1063/1.3509374]
Abstract:
We investigate the quantum integrability of the Landau-Lifshitz (LL) model and solve the long-standing problem of finding the local quantum Hamiltonian for the arbitrary n-particle sector. The particular difficulty of the LL model quantization, which arises due to the ill-defined operator product, is dealt with by simultaneously regularizing the operator product and constructing self-adjoint extensions of a very particular structure. The difficulties in diagonalizing the LL Hamiltonian, which stem from its highly singular nature, are also resolved in our method for the arbitrary n-particle sector. We explicitly demonstrate the consistency of our construction with the quantum inverse scattering method due to Sklyanin [Lett. Math. Phys. 15, 357 (1988)] and give a prescription to systematically construct the general solution, which explains and generalizes the puzzling results of Sklyanin for the particular two-particle sector case. Moreover, we demonstrate the S-matrix factorization and show that it is a consequence of the discontinuity conditions on the functions involved in the construction of the self-adjoint extensions.
Abstract:
In this report, the application of a class of separated local field NMR experiments named dipolar chemical shift correlation (DIPSHIFT) for probing motions in the intermediate regime is discussed. Simple analytical procedures based on the Anderson-Weiss (AW) approximation are presented. In order to establish the limits of validity of the AW-based formulas, a comparison with spin dynamics simulations based on the solution of the stochastic Liouville-von Neumann equation is presented. It is shown that at short evolution times (less than 30% of the rotor period), the AW-based formulas are suitable for fitting the DIPSHIFT curves and extracting kinetic parameters even in the case of jumplike motions. However, full spin dynamics simulations provide a more reliable treatment and extend the frequency range of the molecular motions accessible by DIPSHIFT experiments. As an experimental test, molecular jumps of imidazole methyl sulfonate and trimethylsulfoxonium iodide, as well as the side-chain motions in the photoluminescent polymer poly[2-methoxy-5-(2'-ethylhexyloxy)-1,4-phenylenevinylene], were characterized. Possible extensions are also discussed. (c) 2008 American Institute of Physics.
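As a generic illustration of the Anderson-Weiss idea (not the paper's specific DIPSHIFT expressions), the standard AW result for an exponentially decaying correlation function C(tau) = exp(-tau/tau_c) gives ln S(t) = -M2 tau_c^2 (exp(-t/tau_c) - 1 + t/tau_c), where M2 is the second moment of the coupling. The parameters below are invented for the sketch:

```python
import math

def aw_decay(t, m2, tau_c):
    """Anderson-Weiss decay for an exponential correlation function
    C(tau) = exp(-tau/tau_c):
    ln S(t) = -M2 * tau_c**2 * (exp(-t/tau_c) - 1 + t/tau_c)."""
    x = t / tau_c
    return math.exp(-m2 * tau_c**2 * (math.exp(-x) - 1.0 + x))

# Illustrative (not experimental) parameters: M2 in rad^2/ms^2, times in ms.
M2, TAU_C = 4.0, 0.5

# Short-time limit: the motion has not yet averaged anything, so the
# decay is Gaussian, S(t) ~ exp(-M2 * t**2 / 2).
t_short = 0.01
gaussian = math.exp(-M2 * t_short**2 / 2)
print(aw_decay(t_short, M2, TAU_C), gaussian)

# Long-time limit: motional narrowing, exponential decay with rate M2*tau_c.
print(aw_decay(2.0, M2, TAU_C))
```

The two limiting behaviors are what allow kinetic parameters (here tau_c) to be extracted by fitting the intermediate regime.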
Abstract:
This paper is a continuation and a complement of our previous work on the isomorphic classification of some spaces of compact operators. We improve the main result concerning extensions of the classical isomorphic classification of the Banach spaces of continuous functions on ordinals. As an application, fixing an ordinal alpha and denoting by X(xi), omega(alpha) <= xi < omega(alpha+1), the Banach space of all X-valued continuous functions defined on the interval of ordinals [0,xi] and equipped with the supremum norm, we provide complete isomorphic classifications of some Banach spaces K(X(xi),Y(eta)) of compact operators from X(xi) to Y(eta), eta >= omega. It is relatively consistent with ZFC (Zermelo-Fraenkel set theory with the axiom of choice) that these results include the following cases: 1. X* contains no copy of c(0) and has the Mazur property, and Y = c(0)(J) for every set J. 2. X = c(0)(I) and Y = l(q)(J) for any infinite sets I and J and 1 <= q < infinity. 3. X = l(p)(I) and Y = l(q)(J) for any infinite sets I and J and 1 <= q < p < infinity.
Abstract:
We construct a family of examples of increasing homeomorphisms of the real line whose local quasi-symmetric distortion blows up almost everywhere, which nevertheless can be realized as the boundary values of David homeomorphisms of the upper half-plane. The construction of such David extensions uses Carleson boxes.
Abstract:
It has been suggested that muscle tension plays a major role in the activation of intracellular pathways for skeletal muscle hypertrophy via an increase in mechano growth factor (MGF) and other downstream targets. Eccentric exercise (EE) imposes a greater amount of tension on the active muscle. In particular, high-speed EE seems to exert an additional effect on muscle tension and, thus, on muscle hypertrophy. However, little is known about the effect of EE velocity on hypertrophy signaling. This study investigated the effect of acute EE-velocity manipulation on the Akt/mTORC1/p70(S6K) hypertrophy pathway. Twenty subjects were assigned to either a slow (20 degrees.s(-1); ES) or fast EE (210 degrees.s(-1); EF) group. Biopsies were taken from the vastus lateralis at baseline (B), immediately after (T1), and 2 h after (T2) the completion of 5 sets of 8 repetitions of eccentric knee extensions. Akt, mTOR, and p70(S6K) total protein were similar between groups, and did not change postintervention. Further, Akt and p70(S6K) protein phosphorylation were higher at T2 than at B for ES and EF. MGF messenger RNA was similar between groups, and only significantly higher at T2 than at B in ES. The acute manipulation of EE velocity does not seem to differentially influence intracellular hypertrophy signaling through the Akt/mTORC1/p70(S6K) pathway.
Abstract:
Background and Study Aim: The ability to develop a strong grip and maintain it during a judo match has become an important element for judo athletes. Therefore, the purpose of this investigation was to examine differences between measurements of maximal isometric time on the judogi pull-up and the number of repetitions during the dynamic judogi pull-up. Material/Methods: The sample was composed of two groups: 16 high-level judo athletes from the male Brazilian National Team and 12 male state-level judo athletes, with at least one athlete per weight category. The tests were compared through analysis of covariance (body mass as covariate), followed by a post-hoc test (Scheffe). The significance level was set at 5%. Results: No difference was found in the isometric test: Brazilian Team: 35 +/- 18 s; state-level: 39 +/- 14 s. However, the Brazilian Team performed a higher number of repetitions (12 +/- 5 reps) than the state-level group (9 +/- 4 reps) during the dynamic grip strength endurance test. Conclusions: Thus, dynamic grip strength endurance seems to be a discriminating variable between judo athletes, probably because judo combat involves many elbow extensions and flexions in order to avoid the opponent's grip and to subdue them.
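The covariate adjustment behind the analysis described above can be sketched in a simplified form: regress the outcome on body mass (pooled across groups) and compare the group means of the residuals. This is only an illustration with invented data; the study itself used a full ANCOVA with a Scheffe post-hoc test.

```python
# Simplified covariate adjustment in the spirit of an ANCOVA:
# remove the pooled body-mass trend, then compare group means of the
# residuals. All data below are invented for illustration.

def linreg(x, y):
    """Ordinary least squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    b = sxy / sxx
    return b, my - b * mx

# (body mass in kg, pull-up repetitions): hypothetical athletes
national = [(66, 15), (73, 13), (81, 12), (90, 10), (100, 9)]
state    = [(60, 11), (70, 10), (82, 8),  (91, 7),  (104, 6)]

mass = [m for m, _ in national + state]
reps = [r for _, r in national + state]
slope, intercept = linreg(mass, reps)

def adjusted_mean(group):
    """Group mean residual after removing the body-mass trend."""
    return sum(r - (slope * m + intercept) for m, r in group) / len(group)

print(adjusted_mean(national), adjusted_mean(state))
```

With equal group sizes the two adjusted means are equal and opposite (pooled OLS residuals sum to zero), so a positive national-team value directly mirrors the group difference after controlling for body mass.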
Abstract:
The exact vibration modes and natural frequencies of planar structures and mechanisms composed of Euler-Bernoulli beams are obtained by solving a transcendental, nonlinear eigenvalue problem stated by the dynamic stiffness matrix (DSM). To solve this kind of problem, the most widely employed technique is the Wittrick-Williams algorithm, developed in the early seventies. By formulating a new type of eigenvalue problem, which preserves the internal degrees of freedom for all members in the model, the present study offers an alternative to the use of this algorithm. The newly proposed eigenvalue problem presents no poles, so the roots of the problem can be found by any suitable iterative numerical method. By avoiding a standard formulation for the DSM, the local mode shapes are directly calculated and any extension to the beam theory can be easily incorporated. It is shown that the method adopted here leads to exact solutions, as confirmed by various examples. Extensions of the formulation are also given, in which rotary inertia, end release, skewed edges and rigid offsets are all included. (C) 2008 Elsevier Ltd. All rights reserved.
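As a concrete toy instance of the transcendental frequency equations that arise in this setting (not the paper's DSM formulation), the clamped-free Euler-Bernoulli beam has the classic characteristic equation 1 + cos(bL)cosh(bL) = 0, whose roots give the natural frequencies. It can be solved by any bracketing root finder:

```python
import math

def char_eq(bl):
    """Frequency equation of a clamped-free Euler-Bernoulli beam:
    1 + cos(beta*L) * cosh(beta*L) = 0."""
    return 1.0 + math.cos(bl) * math.cosh(bl)

def bisect(f, a, b, tol=1e-12):
    """Plain bisection; assumes f(a) and f(b) have opposite signs."""
    fa = f(a)
    while b - a > tol:
        m = 0.5 * (a + b)
        if fa * f(m) <= 0:
            b = m
        else:
            a, fa = m, f(m)
    return 0.5 * (a + b)

# First three roots beta_n * L; the natural frequencies then follow
# from omega_n = (beta_n*L)**2 / L**2 * sqrt(E*I / (rho*A)).
roots = [bisect(char_eq, lo, hi) for lo, hi in [(1.0, 3.0), (4.0, 5.0), (7.0, 8.0)]]
print(roots)  # approx [1.8751, 4.6941, 7.8548]
```

Because the characteristic function here has no poles, bisection converges without the root-counting machinery the Wittrick-Williams algorithm provides for the pole-afflicted DSM determinant.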
Abstract:
Lysozyme precipitation induced by the addition of the volatile salt ammonium carbamate was studied through cloud-point measurements and precipitation assays. Phase equilibrium experiments were carried out at 5.0, 15.0 and 25.0 degrees C and the compositions of the coexisting phases were determined. A complete separation of the coexisting liquid and solid phases could not be achieved. Nevertheless, it was possible to determine the composition of the solid precipitate through the extensions of the experimental tie lines. The same precipitate was found at all temperatures. Lysozyme enzymatic activities of the supernatant and precipitate phases were also determined. The activity balance suggests that ammonium carbamate preserves lysozyme activity after the salting-out precipitation. (C) 2010 Elsevier B.V. All rights reserved.
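The geometric idea behind determining a solid composition from tie-line extensions can be sketched: each tie line through an overall-mixture/supernatant pair of points is extended, and the precipitate composition is estimated as the common intersection of two such lines. The compositions below are hypothetical, not the paper's data:

```python
def line_through(p, q):
    """Return (slope, intercept) of the line through points p and q
    in (salt wt%, protein wt%) coordinates."""
    (x1, y1), (x2, y2) = p, q
    m = (y2 - y1) / (x2 - x1)
    return m, y1 - m * x1

def intersection(l1, l2):
    """Intersection point of two lines given as (slope, intercept)."""
    m1, c1 = l1
    m2, c2 = l2
    x = (c2 - c1) / (m1 - m2)
    return x, m1 * x + c1

# Hypothetical (overall mixture, supernatant) anchor points per tie line
tie1 = line_through((10.0, 8.0), (12.0, 4.0))
tie2 = line_through((14.0, 9.0), (16.0, 7.0))

solid = intersection(tie1, tie2)
print(solid)  # estimated precipitate composition (salt wt%, protein wt%)
```

With real data, more than two tie lines are available and their pairwise intersections scatter; a least-squares common point would then be the natural estimator.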
Abstract:
The complex interactions among endangered ecosystems, landowners' interests, and different models of land tenure and use constitute an important series of challenges for those seeking to maintain and restore biodiversity and augment the flow of ecosystem services. Over the past 10 years, we have developed a data-based approach to address these challenges and to achieve medium and large-scale ecological restoration of riparian areas on private lands in the state of Sao Paulo, southeastern Brazil. Given varying motivations for ecological restoration, the location of riparian areas within landholdings, and the environmental zoning of different riparian areas, best-practice restoration methods were developed for each situation. A total of 32 ongoing projects, covering 527,982 ha, were evaluated in large sugarcane farms and small mixed farms, and six different restoration techniques have been developed to help upscale the effort. Small mixed farms had higher portions of land requiring protection as riparian areas (13.3%), and lower forest cover of riparian areas (18.3%), than large sugarcane farms (10.0% and 36.9%, respectively, for riparian areas and forest cover values). In both types of farms, forest fragments required some degree of restoration. Historical anthropogenic degradation has compromised forest ecosystem structure and functioning, despite their high diversity of native tree and shrub species. Notably, land use patterns in riparian areas differed markedly. Large sugarcane farms had higher portions of riparian areas occupied by highly mechanized agriculture, abandoned fields, and anthropogenic wet fields created by siltation in water courses. In contrast, in small mixed crop farms, low or non-mechanized agriculture and pasturelands were predominant.
Despite these differences, plantations of native tree species covering the entire area were by far the main restoration method needed, both in large sugarcane farms (76.0%) and small mixed farms (92.4%), in view of the low resilience of target sites, reduced forest cover, and high fragmentation, all of which limit the potential for autogenic restoration. We propose that plantations should be carried out with a high diversity of native species in order to create biologically viable restored forests, and to assist long-term biodiversity persistence at the landscape scale. Finally, we propose strategies to integrate the political, socio-economic and methodological aspects needed to upscale restoration efforts in tropical forest regions throughout Latin America and elsewhere. (C) 2010 Elsevier B.V. All rights reserved.
Abstract:
The one-way quantum computing model introduced by Raussendorf and Briegel [Phys. Rev. Lett. 86, 5188 (2001)] shows that it is possible to quantum compute using only a fixed entangled resource known as a cluster state, together with adaptive single-qubit measurements. This model is the basis for several practical proposals for quantum computation, including a promising proposal for optical quantum computation based on cluster states [M. A. Nielsen, Phys. Rev. Lett. (to be published), quant-ph/0402005]. A significant open question is whether such proposals are scalable in the presence of physically realistic noise. In this paper we prove two threshold theorems which show that scalable fault-tolerant quantum computation may be achieved in implementations based on cluster states, provided the noise in the implementations is below some constant threshold value. Our first threshold theorem applies to a class of implementations in which entangling gates are applied deterministically, but with a small amount of noise. We expect this threshold to be applicable in a wide variety of physical systems. Our second threshold theorem is specifically adapted to proposals such as the optical cluster-state proposal, in which nondeterministic entangling gates are used. A critical technical component of our proofs is a pair of powerful theorems which relate the properties of noisy unitary operations restricted to act on a subspace of state space to extensions of those operations acting on the entire state space. We expect these theorems to have a variety of applications in other areas of quantum-information science.
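The elementary primitive behind cluster-state computation can be sketched with plain linear algebra: attach an ancilla in |+> to an input qubit, apply a controlled-Z gate, and measure the input in the X basis; the ancilla then carries X^m H|psi>, where m is the measurement outcome. The toy below illustrates that identity (it is not the paper's fault-tolerance construction):

```python
import math

SQ2 = 1 / math.sqrt(2)

def one_way_step(a, b):
    """Push |psi> = a|0> + b|1> through a 2-qubit cluster:
    attach |+>, apply CZ, project qubit 1 onto the X basis.
    Returns the (unnormalized) qubit-2 states for outcomes m = 0, 1."""
    # Amplitudes indexed as |q1 q2>: q1 = input, q2 = ancilla in |+>.
    s = [a * SQ2, a * SQ2, b * SQ2, b * SQ2]   # |00>, |01>, |10>, |11>
    s[3] = -s[3]                               # CZ: sign flip on |11>
    plus  = [(s[0] + s[2]) * SQ2, (s[1] + s[3]) * SQ2]  # <+| on qubit 1
    minus = [(s[0] - s[2]) * SQ2, (s[1] - s[3]) * SQ2]  # <-| on qubit 1
    return plus, minus

a, b = 0.6, 0.8
m0, m1 = one_way_step(a, b)
# Up to normalization: outcome m=0 leaves H|psi>, m=1 leaves X H|psi>.
h_psi = [(a + b) * SQ2, (a - b) * SQ2]
print(m0, m1)
```

The outcome-dependent Pauli correction X^m is exactly the kind of adaptivity the measurement pattern of a one-way computation must track.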
Abstract:
The second edition of An Introduction to Efficiency and Productivity Analysis is designed to be a general introduction for those who wish to study efficiency and productivity analysis. The book provides an accessible, well-written introduction to the four principal methods involved: econometric estimation of average response models; index numbers; data envelopment analysis (DEA); and stochastic frontier analysis (SFA). For each method, a detailed introduction to the basic concepts is presented, numerical examples are provided, and some of the more important extensions to the basic methods are discussed. Of special interest is the systematic use of detailed empirical applications using real-world data throughout the book. In recent years, there have been a number of excellent advanced-level books published on performance measurement. This book, however, is the first systematic survey of performance measurement with the express purpose of introducing the field to a wide audience of students, researchers, and practitioners. Indeed, the 2nd Edition maintains its uniqueness: (1) It is a well-written introduction to the field. (2) It outlines, discusses and compares the four principal methods for efficiency and productivity analysis in a well-motivated presentation. (3) It provides detailed advice on computer programs that can be used to implement these performance measurement methods. The book contains computer instructions and output listings for the SHAZAM, LIMDEP, TFPIP, DEAP and FRONTIER computer programs. More extensive listings of data and computer instruction files are available on the book's website: (www.uq.edu.au/economics/cepa/crob2005).
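The simplest case of the DEA method the book covers can be shown in a few lines (this is a toy illustration, not the book's DEAP program): with a single input and a single output under constant returns to scale, DEA technical efficiency reduces to each firm's output/input ratio divided by the best ratio in the sample. Fuller multi-input, multi-output models require a linear-programming solver.

```python
# Single-input / single-output DEA under constant returns to scale:
# efficiency = (output/input ratio) / (best ratio in the sample).
# Firm data are invented for illustration only.

firms = {
    "A": (20.0, 10.0),   # (input, output)
    "B": (30.0, 24.0),
    "C": (50.0, 25.0),
}

best = max(y / x for x, y in firms.values())
efficiency = {name: (y / x) / best for name, (x, y) in firms.items()}
print(efficiency)  # firm B defines the frontier with efficiency 1.0
```

A score of 1.0 marks a frontier firm; scores below 1.0 measure the radial contraction of inputs (or expansion of outputs) needed to reach the frontier.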
Abstract:
The notion of salience was developed by Schelling in the context of the meeting-place problem of locating a partner in the absence of a pre-agreed meeting place. In this paper, we argue that a realistic specification of the meeting-place problem involves allowing a strategy of active search over a range of possible meeting places. We solve this extended problem, allowing for extensions such as repeated play, search costs and asymmetric payoffs. The result is a considerably richer, but more complex, notion of salience. (C) 1998 Elsevier Science B.V.
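The baseline version of the problem, before salience or active search enters, can be illustrated with a small simulation: if both players pick one of n places uniformly at random each round, the per-round meeting probability is 1/n and the expected number of rounds to meet is n. A salient focal point collapses this to a single round, which is the coordination value salience captures. This sketch is illustrative and not from the paper:

```python
import random

def rounds_to_meet(n_places, rng):
    """Both players pick a place uniformly at random each round
    until they coincide; return the number of rounds used."""
    rounds = 0
    while True:
        rounds += 1
        if rng.randrange(n_places) == rng.randrange(n_places):
            return rounds

rng = random.Random(0)
n, trials = 5, 20000
avg = sum(rounds_to_meet(n, rng) for _ in range(trials)) / trials
print(avg)  # the theoretical expected value is n = 5
```

The round count is geometric with success probability 1/n, so the Monte Carlo average should sit close to n; the gap between n rounds and 1 round quantifies what a focal point is worth.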