22 results for heavy-quark effective theory

in CentAUR: Central Archive University of Reading - UK


Relevance:

100.00%

Publisher:

Abstract:

We present the first calculation of the complete NLO QCD corrections to the production of heavy flavors with longitudinally polarized hadrons. This reaction can be used at RHIC to extract the gluon helicity density and may shed light on the "heavy quark enigma". The theoretical uncertainties are briefly discussed.
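The observable used for such an extraction is the double longitudinal spin asymmetry. The abstract does not quote it, but the standard definition (added here for orientation, not taken from the paper) is:

```latex
% Standard definition of the double longitudinal spin asymmetry used to
% access the gluon helicity density (not quoted from the abstract itself).
A_{LL} \;=\; \frac{\Delta\sigma}{\sigma}
       \;=\; \frac{\sigma^{++} - \sigma^{+-}}{\sigma^{++} + \sigma^{+-}} ,
```

where the superscripts denote the helicities of the two incoming hadrons; at RHIC the numerator is driven largely by gluon-gluon fusion and is therefore sensitive to Δg.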

Relevance:

100.00%

Publisher:

Abstract:

We present all relevant details of our calculation of the complete next-to-leading order O(α_s^2 α) QCD corrections to heavy flavor photoproduction with longitudinally polarized point-like photons and hadrons. In particular we provide analytical results for the virtual plus soft gluon cross section. We carefully address the relevance of remaining theoretical uncertainties by varying, for instance, the factorization and renormalization scales independently. Such studies are of importance for a meaningful first direct determination of the polarized gluon density Δg from the total charm production spin asymmetry by the upcoming COMPASS experiment. It is shown that the scale uncertainty is considerably reduced in next-to-leading order, but the dependence on the charm quark mass is sizable at fixed target energies. Finally, we study several differential single-inclusive heavy quark distributions and, for the polarized HERA option, the total bottom spin asymmetry.
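As an illustration of the kind of scale study described above (not code from the paper), the sketch below varies hypothetical factorization and renormalization scales independently around the charm quark mass and records the spread of a stand-in cross-section function; `toy_xsec` and the factor-of-two range are assumptions for demonstration only.

```python
# Illustrative sketch of an independent factorization/renormalization
# scale variation of the type described in the abstract. The function
# toy_xsec is a placeholder, NOT the paper's result.
import itertools
import math

M_CHARM = 1.5  # GeV, illustrative charm quark mass

def toy_xsec(mu_f, mu_r):
    """Placeholder cross section with a mild logarithmic scale dependence."""
    return 1.0 + 0.1 * math.log(mu_f / M_CHARM) - 0.15 * math.log(mu_r / M_CHARM)

# Vary both scales independently over the conventional factor-of-two range.
scales = [0.5 * M_CHARM, M_CHARM, 2.0 * M_CHARM]
values = [toy_xsec(mu_f, mu_r) for mu_f, mu_r in itertools.product(scales, scales)]

central = toy_xsec(M_CHARM, M_CHARM)
print(f"central = {central:.3f}, envelope = [{min(values):.3f}, {max(values):.3f}]")
```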

Relevance:

100.00%

Publisher:

Abstract:

The complete details of our calculation of the NLO QCD corrections to heavy flavor photo- and hadroproduction with longitudinally polarized initial states are presented. The main motivation for investigating these processes is the determination of the polarized gluon density at the COMPASS and RHIC experiments, respectively, in the near future. All methods used in the computation are extensively documented, providing a self-contained introduction to this type of calculation. Some of the tools employed may also be of general interest, e.g., the series expansion of hypergeometric functions. The relevant parton level results are collected and plotted in the form of scaling functions. However, the simplification of the obtained gluon-gluon virtual contributions has not yet been completed. Thus NLO phenomenological predictions are only given in the case of photoproduction. The theoretical uncertainties of these predictions, in particular with respect to the heavy quark mass, are carefully considered. It is also shown that transverse momentum cuts can considerably enhance the measured production asymmetries. Finally, unpolarized heavy quark production is reviewed in order to derive conditions for a successful interpretation of future spin-dependent experimental data.
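One of the tools mentioned, the series expansion of hypergeometric functions, can be illustrated with a textbook example chosen here for illustration (it is not taken from the paper): the expansion 2F1(ε, −ε; 1; z) = 1 − ε² Li₂(z) + O(ε³), checked numerically with mpmath.

```python
# Numerical check of a standard hypergeometric series expansion,
#   2F1(eps, -eps; 1; z) = 1 - eps^2 * Li2(z) + O(eps^3),
# of the type mentioned in the abstract (illustrative example only).
from mpmath import hyp2f1, polylog, mp

mp.dps = 30          # work at 30 significant digits
z, eps = 0.3, 1e-4   # small expansion parameter

exact = hyp2f1(eps, -eps, 1, z)
approx = 1 - eps**2 * polylog(2, z)

print("exact  =", exact)
print("approx =", approx)
print("difference ~", exact - approx)   # should be far below the eps^2 term
```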

Relevance:

30.00%

Publisher:

Abstract:

Recent radar and rain-gauge observations from the island of Dominica, which lies in the eastern Caribbean Sea at 15° N, show a strong orographic enhancement of trade-wind precipitation. The mechanisms behind this enhancement are investigated using idealized large-eddy simulations with a realistic representation of the shallow trade-wind cumuli over the open ocean upstream of the island. The dominant mechanism is found to be the rapid growth of convection by the bulk lifting of the inhomogeneous impinging flow. When rapidly lifted by the terrain, existing clouds and other moist parcels gain buoyancy relative to rising dry air because of their different adiabatic lapse rates. The resulting energetic, closely packed convection forms precipitation readily and brings frequent heavy showers to the high terrain. Despite this strong precipitation enhancement, only a small fraction (1%) of the impinging moisture flux is lost over the island. However, an extensive rain shadow forms to the lee of Dominica due to convective stabilization, forced descent, and wave breaking. A linear model is developed to explain the convective enhancement over the steep terrain.
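The buoyancy argument can be made concrete with a back-of-the-envelope estimate; the lapse rates, temperature, and lifting depth below are illustrative assumptions, not values from the study. A saturated parcel cooling at the moist-adiabatic rate ends up warmer than dry air lifted through the same depth at the dry-adiabatic rate.

```python
# Back-of-the-envelope estimate of the buoyancy a cloudy (saturated)
# parcel gains over dry air when both are lifted by the terrain.
# Lapse rates, temperature, and lifting depth are illustrative assumptions.
G = 9.81              # m s^-2, gravitational acceleration
GAMMA_DRY = 9.8e-3    # K m^-1, dry-adiabatic lapse rate
GAMMA_MOIST = 6.5e-3  # K m^-1, a typical moist-adiabatic lapse rate (assumed)
T_ENV = 295.0         # K, representative low-level temperature (assumed)
DZ = 1000.0           # m, bulk lifting by the terrain (assumed)

# Temperature excess of the cloudy parcel after both are lifted through DZ.
delta_T = (GAMMA_DRY - GAMMA_MOIST) * DZ      # ~3.3 K
buoyancy = G * delta_T / T_ENV                # ~0.11 m s^-2

print(f"warming of cloudy parcel relative to dry air: {delta_T:.1f} K")
print(f"resulting buoyancy acceleration: {buoyancy:.2f} m s^-2")
```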

Relevance:

30.00%

Publisher:

Abstract:

1. The management of threatened species is an important practical way in which conservationists can intervene in the extinction process and reduce the loss of biodiversity. Understanding the causes of population declines (past, present and future) is pivotal to designing effective practical management. This is the declining-population paradigm identified by Caughley. 2. There are three broad classes of ecological tool used by conservationists to guide management decisions for threatened species: statistical models of habitat use, demographic models and behaviour-based models. Each of these is described here, illustrated with a case study and evaluated critically in terms of its practical application. 3. These tools are fundamentally different. Statistical models of habitat use and demographic models both use descriptions of patterns in abundance and demography, in relation to a range of factors, to inform management decisions. In contrast, behaviour-based models describe the evolutionary processes underlying these patterns, and derive such patterns from the strategies employed by individuals when competing for resources under a specific set of environmental conditions. 4. Statistical models of habitat use and demographic models have been used successfully to make management recommendations for declining populations. To do this, assumptions are made about population growth or vital rates that will apply when environmental conditions are restored, based on either past data collected under favourable environmental conditions or estimates of these parameters when the agent of decline is removed. As a result, they can only be used to make reliable quantitative predictions about future environments when a comparable environment has been experienced by the population of interest in the past. 5. Many future changes in the environment driven by management will not have been experienced by a population in the past. Under these circumstances, vital rates and their relationship with population density will change in the future in a way that is not predictable from past patterns. Reliable quantitative predictions about population-level responses then need to be based on an explicit consideration of the evolutionary processes operating at the individual level. 6. Synthesis and applications. It is argued that evolutionary theory underpins Caughley’s declining-population paradigm, and that it needs to become much more widely used within mainstream conservation biology. This will help conservationists examine critically the reliability of the tools they have traditionally used to aid management decision-making. It will also give them access to alternative tools, particularly when predictions are required for changes in the environment that have not been experienced by a population in the past.
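The caveat in points 4 and 5, that demographic models extrapolate vital rates measured under past conditions, can be illustrated with a minimal projection; the vital rates below are invented for illustration and are not from the paper.

```python
# Minimal illustration of the "demographic model" class described in
# points 2-4: project abundance using fixed per-capita vital rates.
# All rate values are invented for illustration.
def project(n0, survival, fecundity, years):
    """Project population size with fixed per-capita vital rates."""
    n = n0
    for _ in range(years):
        n = n * survival + n * fecundity  # survivors plus new recruits
    return n

# Vital rates estimated from data collected under past, favourable conditions.
past_rates = dict(survival=0.80, fecundity=0.35)   # lambda = 1.15, growth
# Under a novel, managed environment the rates may differ in ways that
# past data cannot reveal (the central caveat in points 4-5).
novel_rates = dict(survival=0.70, fecundity=0.25)  # lambda = 0.95, decline

print(project(100, years=10, **past_rates))   # optimistic forecast
print(project(100, years=10, **novel_rates))  # possible reality
```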

Relevance:

30.00%

Publisher:

Abstract:

Building services are worth about 2% of GDP and are essential for the effective and efficient operation of a building. It is increasingly recognised that the value of a building is related to the way it supports the client organisation’s ongoing business operations. Building services are central to the functional performance of buildings and provide the necessary conditions for health, well-being, safety and security of the occupants. They frequently comprise several technologically distinct sub-systems, and their design and construction require the involvement of numerous disciplines and trades. Designers and contractors working on the same project are frequently employed by different companies. Materials and equipment are supplied by a diverse range of manufacturers. Facilities managers are responsible for the operation of building services in use. Coordination between these participants is crucial to achieving optimum performance, but it is too often neglected, leaving room for serious faults and underlining the need for effective integration. Modern technology offers increasing opportunities for integrated personal-control systems for lighting, ventilation and security, as well as interoperability between systems. Opportunities for a new mode of systems integration are provided by the emergence of PFI/PPP procurement frameworks. This paper attempts to establish how systems integration can be achieved in the process of designing, constructing and operating building services. The essence of the paper, therefore, is to envisage the emergent organisational responses to the realisation of building services as an interactive systems network.

Relevance:

30.00%

Publisher:

Abstract:

The assessment of building energy efficiency is one of the most effective measures for reducing building energy consumption. This paper proposes a holistic method (HMEEB) for assessing and certifying building energy efficiency based on the Dempster-Shafer (D-S) theory of evidence and the Evidential Reasoning (ER) approach. HMEEB has three main features: (i) it provides a method to assess and certify building energy efficiency and serves as an analytical tool to identify improvement opportunities; (ii) it combines a wealth of information on building energy efficiency assessment, including the identification of indicators and a weighting mechanism; and (iii) it provides a method to identify and deal with the inherent uncertainties within the assessment procedure. This paper demonstrates the robustness, flexibility and effectiveness of the proposed method, using two examples to assess the energy efficiency of two residential buildings, both located in the ‘Hot Summer and Cold Winter’ zone in China. The proposed certification method provides detailed recommendations for policymakers in the context of carbon emission reduction targets and promoting energy efficiency in the built environment. The method is transferable to other countries and regions by adjusting the indicator weighting system to reflect local climatic, economic and social factors.
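The evidence-combination machinery behind methods of this kind can be sketched generically. The example below implements Dempster's rule of combination for two bodies of evidence over a toy frame of discernment; the rating labels and mass values are invented for illustration and are not HMEEB's indicators.

```python
# Generic Dempster-Shafer combination of two bodies of evidence, of the
# kind used by evidential-reasoning assessment methods. The frame of
# discernment and mass values are invented for illustration.
from itertools import product

FRAME = frozenset({"efficient", "average", "poor"})

def combine(m1, m2):
    """Dempster's rule: combine two basic probability assignments."""
    combined, conflict = {}, 0.0
    for (a, w1), (b, w2) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + w1 * w2
        else:
            conflict += w1 * w2
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence")
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

# Evidence from two hypothetical indicators (e.g. envelope, HVAC).
m_envelope = {frozenset({"efficient"}): 0.6, FRAME: 0.4}
m_hvac = {frozenset({"efficient", "average"}): 0.7,
          frozenset({"poor"}): 0.1, FRAME: 0.2}

for subset, mass in combine(m_envelope, m_hvac).items():
    print(sorted(subset), round(mass, 3))
```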

Relevance:

30.00%

Publisher:

Abstract:

Recent research has shown that Lighthill–Ford spontaneous gravity wave generation theory, when applied to numerical model data, can help predict areas of clear-air turbulence. It is hypothesized that this is the case because spontaneously generated atmospheric gravity waves may initiate turbulence by locally modifying the stability and wind shear. As an improvement on the original research, this paper describes the creation of an ‘operational’ algorithm (ULTURB) with three modifications to the original method: (1) extending the altitude range for which the method is effective downward to the top of the boundary layer, (2) adding turbulent kinetic energy production from the environment to the locally produced turbulent kinetic energy production, and (3) transforming turbulent kinetic energy dissipation into eddy dissipation rate, the turbulence metric that is becoming the worldwide ‘standard’. In a comparison of ULTURB with the original method and with the Graphical Turbulence Guidance second version (GTG2) automated procedure for forecasting mid- and upper-level aircraft turbulence, ULTURB performed better for all turbulence intensities. Since ULTURB, unlike GTG2, is founded on a self-consistent dynamical theory, it may offer forecasters better insight into the causes of clear-air turbulence and may ultimately enhance its predictability.
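Modification (3) can be illustrated directly: the eddy dissipation rate used as the aviation turbulence metric is conventionally the cube root of the turbulent kinetic energy dissipation rate ε. A minimal conversion sketch follows; the example value and its verbal interpretation are illustrative, not thresholds from ULTURB.

```python
# Convert a turbulent kinetic energy dissipation rate (epsilon, m^2 s^-3)
# to the eddy dissipation rate EDR = epsilon**(1/3) (m^(2/3) s^-1), the
# metric referred to in the abstract as the emerging worldwide standard.
def edr_from_dissipation(epsilon):
    """Eddy dissipation rate from the TKE dissipation rate epsilon."""
    if epsilon < 0:
        raise ValueError("dissipation rate must be non-negative")
    return epsilon ** (1.0 / 3.0)

# Example: a dissipation rate of 0.008 m^2 s^-3 maps to EDR = 0.2, a value
# often associated with light-to-moderate turbulence (the interpretation,
# unlike the conversion, depends on aircraft type).
print(edr_from_dissipation(0.008))   # -> 0.2
```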

Relevance:

30.00%

Publisher:

Abstract:

The Counterinsurgency Manual FM 3-24 has been accused of being over-dependent on the counterinsurgency 'classics' Galula and Thompson. But comparison reveals that it is different in spirit. Galula and Thompson seek practical control; the Manual seeks to build 'legitimacy'. Its concept of legitimacy is superficially Weberian, but owes more to the writings of the American Max Manwaring. The Manual presupposes that a rights-based legal order can (other things being equal) be made to be cross-culturally attractive; 'effective governance' by itself can build legitimacy. The fusion of its methods with an ideology creates unrealistic criteria for success. Its weaknesses suggest a level of incapacity to think politically that will, in time, result in further failures.

Relevance:

30.00%

Publisher:

Abstract:

Evolutionary developmental genetics brings together systematists, morphologists and developmental geneticists; it will therefore impact on each of these component disciplines. The goals and methods of phylogenetic analysis are reviewed here, and the contribution of evolutionary developmental genetics to morphological systematics, in terms of character conceptualisation and primary homology assessment, is discussed. Evolutionary developmental genetics, like its component disciplines phylogenetic systematics and comparative morphology, is concerned with homology concepts. Phylogenetic concepts of homology and their limitations are considered here, and the need for independent homology statements at different levels of biological organisation is evaluated. The role of systematics in evolutionary developmental genetics is outlined. Phylogenetic systematics and comparative morphology will suggest effective sampling strategies to developmental geneticists. Phylogenetic systematics provides hypotheses of character evolution (including parallel evolution and convergence), stimulating investigations into the evolutionary gains and losses of morphologies. Comparative morphology identifies those structures that are not easily amenable to typological categorisation, and that may be of particular interest in terms of developmental genetics. The concepts of latent homology and genetic recall may also prove useful in the evolutionary interpretation of developmental genetic data.

Relevance:

30.00%

Publisher:

Abstract:

Airborne lidar provides accurate height information about objects on the Earth's surface and has been recognized as a reliable and accurate surveying tool in many applications. In particular, lidar data offer vital and significant features for urban land-cover classification, which is an important task in urban land-use studies. In this article, we present an effective approach in which lidar data are fused with co-registered images (i.e. aerial colour images containing red, green and blue (RGB) bands and near-infrared (NIR) images) and other derived features for accurate urban land-cover classification. The proposed approach begins with an initial classification performed by the Dempster–Shafer theory of evidence with a specifically designed basic probability assignment function. It outputs two results, i.e. the initial classification and pseudo-training samples, which are selected automatically according to the combined probability masses. Second, a support vector machine (SVM)-based probability estimator is adopted to compute the class conditional probability (CCP) for each pixel from the pseudo-training samples. Finally, a Markov random field (MRF) model is established to combine spatial contextual information into the classification. In this stage, the initial classification result and the CCP are exploited. An efficient belief propagation (EBP) algorithm is developed to search for the global minimum-energy solution of the maximum a posteriori (MAP)-MRF framework, in which three techniques are developed to speed up the standard belief propagation (BP) algorithm. Lidar and its co-registered data acquired by Toposys Falcon II are used in performance tests. The experimental results prove that fusing the height data and optical images is particularly suited to urban land-cover classification. No training samples are needed in the proposed approach, and the computational cost is relatively low. An average classification accuracy of 93.63% is achieved.
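The class-conditional-probability step of such a pipeline can be sketched generically with scikit-learn; the features, class labels, and sample counts below are invented placeholders, and the authors' pseudo-training selection and EBP/MRF stages are not reproduced here.

```python
# Sketch of the CCP-estimation step described in the abstract: fit an SVM
# on (automatically selected) pseudo-training samples and obtain per-pixel
# class probabilities. All data below are invented placeholders.
import numpy as np
from sklearn.svm import SVC

# Pseudo-training samples: rows are pixels, columns are features
# (e.g. lidar height, RGB, NIR); labels are land-cover classes.
rng = np.random.default_rng(0)
X_pseudo = rng.random((200, 5))
y_pseudo = rng.integers(0, 4, size=200)     # four hypothetical classes

svm = SVC(kernel="rbf", probability=True).fit(X_pseudo, y_pseudo)

# Class conditional probabilities for every pixel of a (flattened) scene,
# later combined with spatial context in the MRF stage (not shown).
X_scene = rng.random((1000, 5))
ccp = svm.predict_proba(X_scene)            # shape: (n_pixels, n_classes)
print(ccp.shape, ccp[0].round(3))
```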

Relevance:

30.00%

Publisher:

Abstract:

We present the complete next-to-leading order QCD corrections to the polarized hadroproduction of heavy flavors which soon will be studied experimentally in polarized pp collisions at the BNL Relativistic Heavy Ion Collider (RHIC) in order to constrain the polarized gluon density Δg. It is demonstrated that the dependence on unphysical renormalization and factorization scales is strongly reduced beyond the leading order. The sensitivity of the charm quark spin asymmetry to Δg is analyzed in some detail, including the limited detector acceptance for leptons from charm quark decays at the BNL RHIC.

Relevance:

30.00%

Publisher:

Abstract:

We present a calculation of the next-to-leading order ... QCD corrections to heavy flavor photoproduction with longitudinally polarized beams. We apply our results to study the longitudinal spin asymmetry for the total charm quark production cross section which will be utilized by the forthcoming COMPASS experiment at CERN to obtain first direct information on the polarized gluon density Δg. We also briefly discuss the main theoretical uncertainties inherent in this calculation. In particular we demonstrate that the factorization scale dependence is considerably reduced in next-to-leading order.

Relevance:

30.00%

Publisher:

Abstract:

We present the complete next-to-leading order QCD corrections to the polarized hadroproduction of heavy flavors. This reaction can be studied experimentally in polarized pp collisions at the JHF and at the BNL RHIC in order to constrain the polarized gluon density. It is demonstrated that the dependence on the unphysical renormalization and factorization scales is strongly reduced beyond the leading order. We also discuss how the high luminosity at the JHF can be used to control remaining theoretical uncertainties. An effective method for bridging the gap between theoretical predictions for heavy quarks and experimental measurements of heavy meson decay products is introduced briefly.
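One common way such a bridge between quark-level predictions and measured heavy-meson spectra is built is by smearing the parton-level spectrum with a fragmentation function. The sketch below uses the Peterson parametrization with a typical charm value of its parameter; both the parametrization and the toy quark spectrum are illustrative choices and not necessarily the method introduced in the paper.

```python
# Sketch of bridging a heavy-quark pT spectrum to a heavy-meson pT spectrum
# by convolving with a fragmentation function. The Peterson parametrization
# and its epsilon value are common illustrative choices, not the paper's method.
import numpy as np

EPS_PETERSON = 0.06   # typical charm value, assumed for illustration

def peterson(z, eps=EPS_PETERSON):
    """Peterson et al. fragmentation function (unnormalized)."""
    return 1.0 / (z * (1.0 - 1.0 / z - eps / (1.0 - z)) ** 2)

def meson_spectrum(pt_meson, quark_spectrum):
    """dN_meson/dpT = integral dz/z f(z) * dN_quark/dpT evaluated at pT/z."""
    z = np.linspace(0.05, 0.95, 400)
    weights = peterson(z)
    weights /= np.trapz(weights, z)          # normalize f(z) over the z range
    return np.trapz(weights * quark_spectrum(pt_meson / z) / z, z)

# Toy charm-quark pT spectrum (steeply falling power law), purely illustrative.
quark = lambda pt: (1.0 + pt / 2.0) ** -5
for pt in (2.0, 5.0, 10.0):
    print(pt, meson_spectrum(pt, quark))
```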