980 results for template overlap method top ATLAS
Abstract:
A measurement of the production cross sections of top quark pairs in association with a W or Z boson is presented. The measurement uses 20.3 fb−1 of data from proton-proton collisions at √s = 8 TeV collected by the ATLAS detector at the Large Hadron Collider. Four different final states are considered: two opposite-sign leptons, two same-sign leptons, three leptons, and four leptons. The tt̅W and tt̅Z cross sections are simultaneously extracted using a maximum likelihood fit over all the final states. The tt̅Z cross section is measured to be 176 +58/−52 fb, corresponding to a signal significance of 4.2σ. The tt̅W cross section is measured to be 369 +100/−91 fb, corresponding to a signal significance of 5.0σ. The results are consistent with next-to-leading-order calculations for the tt̅W and tt̅Z processes.
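The cross sections above are extracted with a simultaneous maximum-likelihood fit over the four final states. As a hedged illustration of that general technique only, the sketch below fits two signal strengths to Poisson counts in several channels; the yields, backgrounds and observed counts are invented placeholders, not ATLAS inputs.

```python
# Minimal sketch of a simultaneous Poisson maximum-likelihood fit for two
# signal strengths (mu_ttW, mu_ttZ) across several channels.
# All yields below are hypothetical placeholders, not ATLAS numbers.
import numpy as np
from scipy.optimize import minimize

# Per-channel expected signal yields at the reference cross sections,
# expected background, and observed counts (invented).
channels = {
    "2L-OS": {"s_ttW": 12.0, "s_ttZ":  3.0, "bkg": 40.0, "obs": 58},
    "2L-SS": {"s_ttW": 10.0, "s_ttZ":  2.0, "bkg": 15.0, "obs": 29},
    "3L":    {"s_ttW":  4.0, "s_ttZ": 14.0, "bkg": 20.0, "obs": 37},
    "4L":    {"s_ttW":  0.2, "s_ttZ":  5.0, "bkg":  1.0, "obs":  6},
}

def nll(mu):
    """Negative log-likelihood for signal strengths mu = (mu_ttW, mu_ttZ)."""
    mu_w, mu_z = mu
    total = 0.0
    for c in channels.values():
        lam = mu_w * c["s_ttW"] + mu_z * c["s_ttZ"] + c["bkg"]
        total -= c["obs"] * np.log(lam) - lam  # Poisson term up to a constant
    return total

fit = minimize(nll, x0=[1.0, 1.0], bounds=[(0, 10), (0, 10)])
print("best-fit (mu_ttW, mu_ttZ):", fit.x)
```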
Abstract:
Effective conservation and management of top predators requires a comprehensive understanding of their distributions and of the underlying biological and physical processes that affect these distributions. The Mid-Atlantic Bight shelf break system is a dynamic and productive region where at least 32 species of cetaceans have been recorded through various systematic and opportunistic marine mammal surveys from the 1970s through 2012. My dissertation characterizes the spatial distribution and habitat of cetaceans in the Mid-Atlantic Bight shelf break system by utilizing marine mammal line-transect survey data, synoptic multi-frequency active acoustic data, and fine-scale hydrographic data collected during the 2011 summer Atlantic Marine Assessment Program for Protected Species (AMAPPS) survey. Although studies describing cetacean habitat and distributions have been previously conducted in the Mid-Atlantic Bight, my research specifically focuses on the shelf break region to elucidate both the physical and biological processes that influence cetacean distribution patterns within this cetacean hotspot.
In Chapter One I review biologically important areas for cetaceans in the Atlantic waters of the United States. I describe the study area, the shelf break region of the Mid-Atlantic Bight, in terms of the general oceanography, productivity and biodiversity. According to recent habitat-based cetacean density models, the shelf break region is an area of high cetacean abundance and density, yet little research is directed at understanding the mechanisms that establish this region as a cetacean hotspot.
In Chapter Two I present the basic physical principles of sound in water and describe the methodology used to categorize opportunistically collected multi-frequency active acoustic data using frequency response techniques. Frequency response classification methods are usually employed in conjunction with net-tow data, but the logistics of the 2011 AMAPPS survey did not allow for appropriate net-tow data to be collected. Biologically meaningful information can be extracted from acoustic scattering regions by comparing the frequency response curves of acoustic regions to theoretical curves of known scattering models. Using the five frequencies on the EK60 system (18, 38, 70, 120, and 200 kHz), three categories of scatterers were defined: fish-like (with swim bladder), nekton-like (e.g., euphausiids), and plankton-like (e.g., copepods). I also employed a multi-frequency acoustic categorization method using three frequencies (18, 38, and 120 kHz) that has been used in the Gulf of Maine and Georges Bank and that is based on the presence or absence of volume backscatter above a threshold. This method is more objective than the comparison of frequency response curves because it uses an established backscatter value for the threshold. By removing all data below the threshold, only strong scattering information is retained.
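As a rough illustration of the threshold-based categorization mentioned above (presence or absence of volume backscatter above a fixed value at 18, 38 and 120 kHz), the sketch below classifies acoustic cells by a presence/absence tuple. The −66 dB threshold and the example Sv values are assumptions for illustration, not the published settings of Jech and Michaels (2006).

```python
# Sketch of presence/absence multi-frequency classification of volume
# backscatter (Sv, in dB re 1 m^-1). The threshold and example values are
# assumptions for illustration only.
import numpy as np

THRESHOLD_DB = -66.0
FREQS_KHZ = (18, 38, 120)

def classify_cell(sv_by_freq):
    """Return a presence/absence tuple for one acoustic cell."""
    return tuple(sv_by_freq[f] > THRESHOLD_DB for f in FREQS_KHZ)

# Example cells: Sv at 18, 38 and 120 kHz (invented values).
cells = [
    {18: -58.0, 38: -61.0, 120: -72.0},   # strong at the lower frequencies
    {18: -80.0, 38: -75.0, 120: -60.0},   # strong only at 120 kHz
]
for cell in cells:
    print(classify_cell(cell))
```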
In Chapter Three I analyze the distribution of the categorized acoustic regions of interest during the daytime cross-shelf transects. Over all transects, plankton-like acoustic regions of interest were detected most frequently, followed by fish-like acoustic regions and then nekton-like acoustic regions. Plankton-like detections were the only acoustic detections per kilometer to differ significantly, although nekton-like detections fell just short of significance. Using the threshold categorization method of Jech and Michaels (2006) provides a more conservative and discrete detection of acoustic scatterers and allows me to retrieve backscatter values along transects in areas that have been categorized. This provides continuous data values that can be integrated at discrete spatial increments for wavelet analysis. Wavelet analysis indicates that the significant spatial scales of interest for fish-like and nekton-like acoustic backscatter range from one to four kilometers and vary among transects.
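A minimal sketch of the wavelet step described above, computing a continuous wavelet transform of along-track integrated backscatter and reporting the dominant spatial scale. It assumes the PyWavelets package; the 0.1 km spacing and the synthetic signal are illustrative only.

```python
# Sketch of a continuous wavelet transform of backscatter sampled at regular
# along-track increments. Assumes PyWavelets; the signal is synthetic.
import numpy as np
import pywt

dx_km = 0.1                                   # along-track sample spacing
x = np.arange(0, 40, dx_km)                   # 40 km transect
signal = np.sin(2 * np.pi * x / 3.0) + 0.3 * np.random.randn(x.size)

scales = np.arange(1, 128)
coefs, freqs = pywt.cwt(signal, scales, "morl", sampling_period=dx_km)
power = np.abs(coefs) ** 2
dominant_scale_km = 1.0 / freqs[power.mean(axis=1).argmax()]
print(f"dominant spatial scale ~ {dominant_scale_km:.1f} km")
```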
In Chapter Four I analyze the fine-scale distribution of cetaceans in the shelf break system of the Mid-Atlantic Bight using corrected sightings per trackline region, classification trees, multidimensional scaling, and random forest analysis. I describe habitat for common dolphins, Risso’s dolphins and sperm whales. From the distribution of cetacean sightings, patterns of habitat start to emerge: within the shelf break region of the Mid-Atlantic Bight, common dolphins were sighted more prevalently over the shelf, while sperm whales were more frequently found in the deep waters offshore and Risso’s dolphins were most prevalent at the shelf break. Multidimensional scaling shows clear environmental separation among common dolphins, Risso’s dolphins, and sperm whales. The sperm whale random forest habitat model had the lowest misclassification error (0.30) and the Risso’s dolphin random forest habitat model had the greatest misclassification error (0.37). Shallow water depth (less than 148 meters) was the primary variable selected in the classification model for common dolphin habitat. Distance to surface density fronts and distance to surface temperature fronts were the primary variables selected in the classification models describing Risso’s dolphin habitat and sperm whale habitat, respectively. When mapped back into geographic space, these three cetacean species occupy different fine-scale habitats within the dynamic Mid-Atlantic Bight shelf break system.
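For readers unfamiliar with random-forest habitat modelling, the sketch below shows the general workflow (fit, out-of-bag error, cross-validation) with scikit-learn. The predictor names echo those in the abstract, but the data and the shallow-water presence rule are synthetic placeholders, not the dissertation's inputs.

```python
# Minimal sketch of a random-forest habitat (presence/absence) model.
# Data and the presence rule are synthetic placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 300
X = np.column_stack([
    rng.uniform(20, 3000, n),    # water depth (m)
    rng.uniform(0, 50, n),       # distance to surface temperature front (km)
    rng.uniform(0, 50, n),       # distance to surface density front (km)
])
# Hypothetical rule: presence more likely in shallow water (<148 m).
y = (X[:, 0] < 148) & (rng.random(n) < 0.8)

model = RandomForestClassifier(n_estimators=500, oob_score=True, random_state=0)
model.fit(X, y)
print("OOB misclassification error:", 1 - model.oob_score_)
print("5-fold CV accuracy:", cross_val_score(model, X, y, cv=5).mean())
```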
In Chapter Five I summarize the previous chapters and outline potential analytical steps to address ecological questions pertaining to the dynamic shelf break region. Taken together, the results of my dissertation demonstrate the use of opportunistically collected data in ecosystem studies; emphasize the need to incorporate middle trophic level data and oceanographic features into cetacean habitat models; and emphasize the importance of developing a more mechanistic understanding of dynamic ecosystems.
Abstract:
The MAREDAT atlas covers 11 types of plankton, ranging in size from bacteria to jellyfish. Together, these plankton groups determine the health and productivity of the global ocean and play a vital role in the global carbon cycle. Working within a uniform and consistent spatial and depth grid (map) of the global ocean, the researchers compiled thousands to tens of thousands of data points to identify regions of plankton abundance and scarcity as well as areas of data abundance and scarcity. At many of the grid points, the MAREDAT team accomplished the difficult conversion from abundance (numbers of organisms) to biomass (carbon mass of organisms). The MAREDAT atlas provides an unprecedented global data set for ecological and biogeochemical analysis and modeling, as well as a clear mandate for compiling additional existing data and for focusing future data-gathering efforts on key groups in key areas of the ocean. The present collection contains the original data sets used to compile the global distributions of diazotroph abundance, biomass and nitrogen fixation rates.
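As an illustration of the gridding and abundance-to-biomass conversion described above, the sketch below bins point observations onto a regular 1° grid and applies an assumed per-cell carbon content. The conversion factor and sample points are placeholders, not MAREDAT values.

```python
# Sketch of binning point observations onto a regular 1-degree global grid and
# converting abundance to biomass with an assumed carbon content per cell.
import numpy as np

lats = np.array([10.5, 10.7, -35.2, 60.1])          # observation latitudes
lons = np.array([-20.3, -20.1, 15.8, -45.0])        # observation longitudes
abundance = np.array([1.2e5, 8.0e4, 3.0e3, 5.5e4])  # cells per litre (invented)

CARBON_PER_CELL_PG = 0.05                       # assumed pg C per cell
biomass_ug_per_l = abundance * CARBON_PER_CELL_PG * 1e-6   # pg -> ug

lat_edges = np.arange(-90, 91, 1.0)
lon_edges = np.arange(-180, 181, 1.0)
grid_sum, _, _ = np.histogram2d(lats, lons, bins=[lat_edges, lon_edges],
                                weights=biomass_ug_per_l)
grid_count, _, _ = np.histogram2d(lats, lons, bins=[lat_edges, lon_edges])
with np.errstate(invalid="ignore"):
    grid_mean = np.where(grid_count > 0, grid_sum / grid_count, np.nan)
print("grid cells with data:", int((grid_count > 0).sum()))
```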
Abstract:
Planktic foraminifera have been used as recorders of the neodymium (Nd) isotopic composition of seawater, although there is still controversy over the precise provenance of the Nd signal. We present an extensive, multispecific plankton tow Nd/Ca data set from several geographic locations (SE Atlantic, NE Atlantic, Norwegian Sea, and western Mediterranean), together with core top samples from the Mediterranean region. The range of Nd/Ca ratios in plankton-towed foraminifera, cleaned only of organic material, from all regions (0.01-0.7 µmol/mol), is similar to previously published analyses of sedimentary foraminifera cleaned using both oxidative and reductive steps, with distribution coefficients (Kd) ranging between 4 and 302. For the Mediterranean, where core top and plankton tow data are both available, the range for plankton tows (0.05-0.7 µmol/mol) is essentially identical to that for the core tops (0.1-0.5 µmol/mol). Readsorption of Nd during cleaning is ruled out by the fact that the plankton tow samples underwent only an oxidative cleaning process. We find a relationship between manganese (Mn) and Nd in plankton tow samples that is mirrored by a similar correlation in core top samples. This relationship suggests that Fe-Mn coatings are of negligible importance to the Nd budgets of foraminifera as the Nd/Mn ratio it implies is over an order of magnitude greater than that seen in other Fe-Mn oxide phases. Rather, since both plankton tows and core tops present a similar behavior, the Nd/Mn relationship must originate in the upper water column. The data are consistent with the acquisition of Nd and Mn from the water column by binding to organic material and the fact that intratest organic material is shielded from both aggressive cleaning and diagenetic processes. Collectively, the results help to explain two abiding puzzles about Nd in sedimentary planktic foraminifera: their high REE contents and the fact that they record a surface water Nd isotopic signal, regardless of the cleaning procedure used.
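For reference, the distribution coefficient quoted above is conventionally defined from the measured metal/calcium ratios as:

```latex
K_d = \frac{(\mathrm{Nd}/\mathrm{Ca})_{\text{foraminiferal calcite}}}
           {(\mathrm{Nd}/\mathrm{Ca})_{\text{seawater}}}
```

so a Kd of 100 indicates hundred-fold enrichment of Nd, relative to Ca, in the foraminiferal calcite compared with seawater.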
Abstract:
X-ray computed tomography (CT) provides an insight into the progression of dissolution in the tests of planktonic foraminifera. Four species of foraminifera (G. ruber [white], G. sacculifer, N. dutertrei and P. obliquiloculata) from Pacific, Atlantic and Indian Ocean core-top samples were examined by CT and SEM. Inner chamber walls began to dissolve at Δ[CO3^2−] values of 12-14 µmol/kg. Close to the calcite saturation horizon, dissolution and precipitation of calcite may occur simultaneously. Inner calcite of G. sacculifer, N. dutertrei and P. obliquiloculata from such sites appeared altered or replaced, whereas outer crust calcite was dense with no pores. Unlike the other species, there was no distinction between inner and outer calcite in CT scans of G. ruber. Empty calcite crusts of N. dutertrei and P. obliquiloculata were most resistant to dissolution and were present in samples where Δ[CO3^2−] ≈ −20 µmol/kg. Five stages of preservation were identified in CT scans, and an empirical dissolution index, XDX, was established. XDX appears to be insensitive to initial test mass. Mass loss in response to dissolution was similar between species and sites at ~0.4 µg per µmol/kg. We provide calibrations to estimate Δ[CO3^2−] and initial test mass from XDX.
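The calibration mentioned in the final sentence can be pictured as a simple regression of Δ[CO3^2−] (and mass loss) against the XDX preservation stages. The sketch below shows that idea with invented data pairs; only the ~0.4 µg per µmol/kg loss rate is taken from the abstract.

```python
# Sketch of a linear calibration between a dissolution index (XDX) and
# Delta[CO3^2-]; the (XDX, Delta[CO3^2-]) pairs are invented placeholders.
import numpy as np

xdx = np.array([0, 1, 2, 3, 4], dtype=float)             # preservation stages
delta_co3 = np.array([15.0, 5.0, -5.0, -14.0, -22.0])    # umol/kg (placeholder)

slope, intercept = np.polyfit(xdx, delta_co3, 1)
print(f"Delta[CO3^2-] ~ {slope:.1f} * XDX + {intercept:.1f} umol/kg")

# Estimate initial test mass from measured mass and the ~0.4 ug per umol/kg
# loss rate quoted in the abstract (applied only where undersaturated).
measured_mass_ug, loss_rate = 28.0, 0.4
undersaturation = max(0.0, -(slope * 3 + intercept))      # at XDX = 3
print(f"estimated initial mass: {measured_mass_ug + loss_rate * undersaturation:.1f} ug")
```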
Abstract:
Body size is a key determinant of metabolic rate, but logistical constraints have led to a paucity of energetics measurements from large water-breathing animals. As a result, estimating energy requirements of large fish generally relies on extrapolation of metabolic rate from individuals of lower body mass using allometric relationships that are notoriously variable. Swim-tunnel respirometry is the ‘gold standard’ for measuring active metabolic rates in water-breathing animals, yet previous data are entirely derived from body masses <10 kg – at least one order of magnitude lower than the body masses of many top-order marine predators. Here, we describe the design and testing of a new method for measuring metabolic rates of large water-breathing animals: a c. 26 000 L seagoing ‘mega-flume’ swim-tunnel respirometer. We measured the swimming metabolic rate of a 2·1-m, 36-kg zebra shark Stegostoma fasciatum within this new mega-flume and compared the results to data we collected from other S. fasciatum (3·8–47·7 kg body mass) swimming in static respirometers and to previously published active metabolic rate measurements from other shark species. The mega-flume performed well during initial tests, with intra- and interspecific comparisons suggesting accurate metabolic rate measurements can be obtained with this new tool. Inclusion of our data showed that the scaling exponent of active metabolic rate with mass for sharks ranging from 0·13 to 47·7 kg was 0·79; a similar value to previous estimates for resting metabolic rates in smaller fishes. We describe the operation and usefulness of this new method in the context of our current uncertainties surrounding energy requirements of large water-breathing animals. We also highlight the sensitivity of mass-extrapolated energetic estimates in large aquatic animals and discuss the consequences for predicting ecosystem impacts such as trophic cascades.
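The scaling exponent of 0.79 quoted above comes from fitting an allometric relationship MR = a·M^b across individuals. A minimal sketch of such a log-log fit follows; the mass and metabolic-rate values are invented for illustration, not the shark data described here.

```python
# Sketch of estimating the mass-scaling exponent b in MR = a * M**b by
# ordinary least squares on log-transformed data; the data pairs are invented.
import numpy as np

mass_kg = np.array([0.13, 0.5, 2.0, 8.0, 36.0, 47.7])
mr_mgO2_per_h = np.array([12.0, 36.0, 110.0, 330.0, 1100.0, 1350.0])

b, log_a = np.polyfit(np.log10(mass_kg), np.log10(mr_mgO2_per_h), 1)
print(f"scaling exponent b ~ {b:.2f}, coefficient a ~ {10**log_a:.1f} mg O2 h^-1")
```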
Abstract:
Radon Affected Area potential in Northern Ireland was estimated using a joint mapping method. This method allows variation of radon potential both between and within geological units. The estimates are based on the results of radon measurements and geological information for more than 23,000 homes. Elevated radon potential is presented here as indicative maps based on the highest radon potential for each 1 kilometre square of the Irish grid. The full definitive detail is published as a digital dataset for geographical information systems, which can be licensed. The estimated radon potential for an individual home can be obtained through the Public Health England (PHE) UKradon website. The work was partially funded by the Northern Ireland Environment Agency and was prepared jointly by PHE and the British Geological Survey. This report replaces the 2009 review and atlas (HPA-RPD-061).
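The underlying radon-potential mapping approach is commonly described as modelling indoor radon within each geological unit as approximately lognormal and estimating the probability of exceeding the UK Action Level of 200 Bq m⁻³. The sketch below illustrates that calculation under those assumptions; the geometric means and geometric standard deviations are placeholders, not values from the Northern Ireland dataset.

```python
# Sketch of estimating radon potential as the probability that indoor radon in
# a grid square exceeds the UK Action Level (200 Bq m^-3), assuming
# concentrations are approximately lognormal within each geological unit.
# The lognormal parameters below are placeholders.
import math
from statistics import NormalDist

ACTION_LEVEL = 200.0  # Bq m^-3

def exceedance_probability(geometric_mean, geometric_sd):
    """P(radon > ACTION_LEVEL) for a lognormal distribution."""
    z = (math.log(ACTION_LEVEL) - math.log(geometric_mean)) / math.log(geometric_sd)
    return 1.0 - NormalDist().cdf(z)

# Hypothetical 1-km squares with unit-specific lognormal parameters.
for name, gm, gsd in [("square A", 45.0, 2.8), ("square B", 110.0, 3.0)]:
    p = exceedance_probability(gm, gsd)
    print(f"{name}: estimated {100 * p:.1f}% of homes above the Action Level")
```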
Abstract:
The West African Monsoon (WAM) and its representation in numerical models are strongly influenced by the Saharan Heat Low (SHL), a low-pressure system driven by radiative heating over the central Sahara and ventilated by the cold and moist inflow from adjacent oceans. It has recently been shown that a significant part of the southerly moisture flux into the SHL originates from convective cold pools over the Sahel. These density currents, driven by evaporation of rain, are largely absent in models with parameterized convection. This crucial issue has been hypothesized to contribute to the inability of many climate models to reproduce the variability of the WAM. Here, the role of convective cold pools approaching the SHL from the Atlas Mountains, which are a strong orographic trigger for deep convection in Northwest Africa, is analyzed. Knowledge about the frequency of these events, as well as their impact on large-scale dynamics, is required to understand their contribution to the variability of the SHL and to known model uncertainties. The first aspect is addressed through the development of an objective and automated method for the generation of multi-year climatologies that were not previously available. The algorithm combines freely available standard surface observations with satellite microwave data. Representativeness of stations and the influence of their spatial density are addressed by comparison to a satellite-only climatology. Applying this algorithm to data from automated weather stations and manned synoptic stations in and south of the Atlas Mountains reveals the frequent occurrence of such cold pool events: on the order of 6 events per month are detected from May to September, when the SHL is in its northernmost position. The events tend to cluster into convectively active periods lasting several days, often with strong events on consecutive days. This study is the first to diagnose the dynamical impacts of such periods on the SHL, based on simulations of two example cases using the Weather Research and Forecasting (WRF) model at convection-permitting resolution. Sensitivity experiments with artificially removed cold pools as well as different resolutions and parameterizations are conducted. Results indicate increases in surface pressure of more than 1 hPa and significant moisture transports into the desert over several days. This moisture affects radiative heating and thus the energy balance of the SHL. Even though cold pool events north of the SHL are less frequent than their Sahelian counterparts, it is shown that they gain importance due to their temporal clustering on synoptic timescales. Together with studies focusing on the Sahel, this work emphasizes the need for improved parameterization schemes for deep convection in order to produce more reliable climate projections for the WAM.
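The detection algorithm itself is not reproduced here; as a heavily simplified sketch of the surface-station part of such a scheme, the snippet below flags a rapid temperature drop accompanied by a pressure rise within a short window. The thresholds and the sample series are assumptions for illustration; the method described above additionally uses satellite microwave data.

```python
# Sketch of a simple cold-pool detection criterion from surface station time
# series: a temperature drop plus a pressure rise within a short window.
# Thresholds and data are assumptions for illustration only.
import numpy as np

def detect_cold_pool(temp_c, pres_hpa, window=3, dt_thresh=-4.0, dp_thresh=1.0):
    """Return start indices where T falls by at least |dt_thresh| and pressure
    rises by at least dp_thresh over `window` consecutive observations."""
    temp_c, pres_hpa = np.asarray(temp_c), np.asarray(pres_hpa)
    hits = []
    for i in range(len(temp_c) - window):
        if (temp_c[i + window] - temp_c[i] <= dt_thresh and
                pres_hpa[i + window] - pres_hpa[i] >= dp_thresh):
            hits.append(i)
    return hits

temp = [38.0, 37.5, 37.0, 31.0, 30.0, 30.5]          # degC, hourly (synthetic)
pres = [1008.0, 1008.0, 1008.5, 1010.0, 1011.0, 1010.5]  # hPa (synthetic)
print(detect_cold_pool(temp, pres))
```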
Abstract:
The work presented in this thesis is concerned with the dynamical behavior of a CBandola's acoustical box at low resonances. Two models, consisting of two and three coupled oscillators, are proposed in order to analyse the response at the first two and three resonances, respectively. These models describe the first resonances in a bandola as a combination of the lowest modes of vibration of the enclosed air, the top plate and the back plate. Physically, the coupling between these elements is caused by the fluid-structure interaction that gives rise to coupled modes of vibration for the assembled resonance box. In this sense, the coupling in the models is expressed in terms of the ratio of effective areas and masses of the elements, which is a useful parameter to control the coupling. Numerical models are developed for the analysis of modal coupling, which is performed using the Finite Element Method. First, the modal behavior of the separate elements is analysed: the enclosed air, the top plate and the back plate. This step is important to identify the participating modes in the coupling. Then, a numerical model of the resonance box is used to compute the coupled modes. The computation of normal modes of vibration was executed in the frequency range 0-800 Hz. Although the introduced models of coupled oscillators predict at most the first three resonances, they also allow a qualitative study of the coupling between the rest of the computed modes in the range. Considering that the dynamic response of a structure can be described in terms of its modal parameters, this work represents, to a good approximation, the basic behavior of a CBandola, although experimental measurements are suggested as further work to verify the obtained results and to get more information about some characteristics of the coupled modes, for instance, the phase of vibration of the air mode and the radiation efficiency.
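One common way to write such a coupled-oscillator model is as a generalized eigenvalue problem for effective masses and stiffnesses. The sketch below does this for a two-degree-of-freedom system (top plate plus enclosed air); the parameter values and the spring-type coupling are assumptions for illustration, not the thesis's area/mass-ratio formulation.

```python
# Sketch of a two-degree-of-freedom coupled-oscillator model (top plate +
# enclosed air) solved as a generalized eigenvalue problem K v = w^2 M v.
# Masses, stiffnesses and coupling constant are placeholders.
import numpy as np
from scipy.linalg import eigh

m_top, m_air = 0.05, 0.002        # effective masses (kg), assumed
k_top, k_air = 4.0e4, 1.2e4       # effective stiffnesses (N/m), assumed
k_c = 0.6e4                       # coupling stiffness (N/m), assumed

M = np.diag([m_top, m_air])
K = np.array([[k_top + k_c, -k_c],
              [-k_c, k_air + k_c]])

eigvals, eigvecs = eigh(K, M)               # eigenvalues are w^2
freqs_hz = np.sqrt(eigvals) / (2 * np.pi)
print("coupled resonance frequencies (Hz):", np.round(freqs_hz, 1))
```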
Abstract:
A two stage approach to performing ab initio calculations on medium and large sized molecules is described. The first step is to perform SCF calculations on small molecules or molecular fragments using the OPIT Program. This employs a small basis set of spherical and p-type Gaussian functions. The Gaussian functions can be identified very closely with atomic cores, bond pairs, lone pairs, etc. The position and exponent of any of the Gaussian functions can be varied by OPIT to produce a small but fully optimised basis set. The second stage is the molecular fragments method. As an example of this, Gaussian exponents and distances are taken from an OPIT calculation on ethylene and used unchanged in a single SCF calculation on benzene. Approximate ab initio calculations of this type give much useful information and are often preferable to semi-empirical approaches, since the nature of the approximations involved is much better defined.
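To illustrate the idea of treating Gaussian exponents as variational parameters, the sketch below optimises the exponent of a single normalised Gaussian representing a hydrogen-like 1s orbital, where the energy has the textbook closed form E(α) = 3α/2 − 2√(2α/π) hartree. This toy example is not the OPIT program; it only shows the kind of exponent optimisation described above.

```python
# Sketch of variationally optimising a single Gaussian exponent for a
# hydrogen-like 1s orbital (atomic units). Not the OPIT code.
import math
from scipy.optimize import minimize_scalar

def energy(alpha):
    # Kinetic energy 3*alpha/2 plus nuclear attraction -2*sqrt(2*alpha/pi)
    return 1.5 * alpha - 2.0 * math.sqrt(2.0 * alpha / math.pi)

res = minimize_scalar(energy, bounds=(1e-3, 10.0), method="bounded")
print(f"optimal exponent alpha = {res.x:.4f}, E = {res.fun:.4f} hartree")
# Analytic optimum: alpha = 8/(9*pi) ~ 0.2829, E ~ -0.4244 hartree,
# compared with the exact hydrogen ground-state energy of -0.5 hartree.
```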
Abstract:
Edge-labeled graphs have proliferated rapidly over the last decade due to the increased popularity of social networks and the Semantic Web. In social networks, relationships between people are represented by edges, and each edge is labeled with a semantic annotation. Hence, a huge single graph can express many different relationships between entities. The Semantic Web represents each single fragment of knowledge as a triple (subject, predicate, object), which is conceptually identical to an edge from subject to object labeled with the predicate. A set of triples constitutes an edge-labeled graph on which knowledge inference is performed. Subgraph matching has been extensively used as a query language for patterns in the context of edge-labeled graphs. For example, in social networks, users can specify a subgraph matching query to find all people that have certain neighborhood relationships. Heavily used fragments of the SPARQL query language for the Semantic Web and graph queries of other graph DBMSs can also be viewed as subgraph matching over large graphs. Though subgraph matching has been extensively studied as a query paradigm in the Semantic Web and in social networks, a user can get a large number of answers in response to a query. These answers can be shown to the user in accordance with an importance ranking. In this thesis proposal, we present four different scoring models along with scalable algorithms to find the top-k answers via a suite of intelligent pruning techniques. The suggested models consist of a practically important subset of the SPARQL query language augmented with some additional useful features. The first model, called Substitution Importance Query (SIQ), identifies the top-k answers whose scores are calculated from the properties of the matched vertices in each answer in accordance with a user-specified notion of importance. The second model, called Vertex Importance Query (VIQ), identifies important vertices in accordance with a user-defined scoring method that builds on top of various subgraphs articulated by the user. Approximate Importance Query (AIQ), our third model, allows partial and inexact matchings and returns the top-k of them under user-specified approximation terms and scoring functions. In the fourth model, called Probabilistic Importance Query (PIQ), a query consists of several sub-blocks: one mandatory block that must be mapped and other blocks that can be opportunistically mapped. The probability is calculated from various aspects of the answers, such as the number of mapped blocks and the properties of the vertices in each block, and the top-k most probable answers are returned. An important distinguishing feature of our work is that we allow the user a huge amount of freedom in specifying: (i) what pattern and approximation he considers important, (ii) how to score answers, irrespective of whether they are vertices or substitutions, and (iii) how to combine and aggregate scores generated by multiple patterns and/or multiple substitutions. Because so much power is given to the user, indexing is more challenging than in situations where additional restrictions are imposed on the queries the user can ask. The proposed algorithms for the first model can also be used for answering SPARQL queries with ORDER BY and LIMIT, and the method for the second model also works for SPARQL queries with GROUP BY, ORDER BY and LIMIT. We test our algorithms on multiple real-world graph databases, showing that our algorithms are far more efficient than popular triple stores.
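As a toy illustration of ranking subgraph-match answers, the sketch below enumerates matches of a small edge-labeled pattern with NetworkX and keeps the top-k by a user-defined score. The graph, labels and scoring function are invented; the SIQ/VIQ/AIQ/PIQ models and their pruning techniques are not reproduced here.

```python
# Sketch of scoring subgraph-match answers and keeping the top-k.
# Toy data graph, query and scoring function; not the thesis's algorithms.
import heapq
import networkx as nx
from networkx.algorithms import isomorphism

# Edge-labeled data graph with a per-vertex "importance" attribute.
G = nx.DiGraph()
G.add_edge("alice", "bob", label="follows")
G.add_edge("alice", "carol", label="follows")
G.add_edge("bob", "carol", label="mentions")
nx.set_node_attributes(G, {"alice": 10, "bob": 5, "carol": 8}, "importance")

# Query pattern: x --follows--> y
Q = nx.DiGraph()
Q.add_edge("x", "y", label="follows")

matcher = isomorphism.DiGraphMatcher(
    G, Q, edge_match=lambda e1, e2: e1["label"] == e2["label"])

def score(mapping):
    # mapping: data-graph node -> query node; score by summed vertex importance.
    return sum(G.nodes[v]["importance"] for v in mapping)

k = 2
answers = ((score(m), dict(m)) for m in matcher.subgraph_isomorphisms_iter())
print(heapq.nlargest(k, answers, key=lambda t: t[0]))
```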
Abstract:
The time-mean Argo float displacements and the World Ocean Atlas 2009 temperature–salinity climatology are used to obtain the total, top-to-bottom, mass transports. Outside of an equatorial band, the total transports are the sum of the vertical integrals of the geostrophic and wind-driven Ekman currents. However, these transports are generally divergent, and to obtain a mass-conserving circulation, a Poisson equation is solved for the streamfunction with Dirichlet boundary conditions at solid boundaries. The value of the streamfunction on islands is also part of the unknowns. This study presents and discusses an energetic circulation in three basins: the North Atlantic, the North Pacific, and the Southern Ocean. This global method leads to new estimates of the time-mean Eulerian western boundary current transport maxima of 97 Sverdrups (Sv; 1 Sv ≡ 10⁶ m³ s⁻¹) at 60°W for the Gulf Stream, 84 Sv at 157°E for the Kuroshio, 80 Sv for the Agulhas Current between 32° and 36°S, and finally 175 Sv for the Antarctic Circumpolar Current at Drake Passage. Although the large-scale structure and boundaries of the interior gyres are well predicted by the Sverdrup relation, the transports derived from the wind stress curl are lower than the observed transports in the interior by roughly a factor of 2, suggesting an important contribution from bottom torques. With additional Argo displacement data, the errors caused by the presence of remaining transient terms at the 1000-db reference level will continue to decrease, allowing this method to produce increasingly accurate results in the future.
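The mass-conserving step described above amounts to solving a Poisson equation for the streamfunction with Dirichlet boundary conditions. The sketch below illustrates that step on a toy Cartesian grid with plain Jacobi iteration; the grid spacing and the synthetic transport field are placeholders, not the Argo-derived transports.

```python
# Sketch of recovering a nondivergent streamfunction psi from a (generally
# divergent) transport field (U, V) by solving
#     laplacian(psi) = dV/dx - dU/dy
# with psi = 0 on the boundaries (Dirichlet), using Jacobi iteration.
# Grid spacing and the synthetic field are placeholders.
import numpy as np

ny, nx, d = 80, 100, 1.0e5                 # grid size and spacing (m), assumed
y, x = np.mgrid[0:ny, 0:nx]
U = np.sin(np.pi * y / ny)                 # synthetic zonal transport
V = 0.1 * np.cos(np.pi * x / nx)           # synthetic meridional transport

curl = np.gradient(V, d, axis=1) - np.gradient(U, d, axis=0)

psi = np.zeros((ny, nx))
for _ in range(5000):                      # Jacobi iterations
    psi[1:-1, 1:-1] = 0.25 * (psi[2:, 1:-1] + psi[:-2, 1:-1] +
                              psi[1:-1, 2:] + psi[1:-1, :-2] -
                              d * d * curl[1:-1, 1:-1])
print("max |psi|:", float(np.abs(psi).max()))
```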
Abstract:
Since it has been found that the MadGraph Monte Carlo generator offers superior flavour-matching capability as compared to Alpgen, the suitability of MadGraph for the generation of tt̅bb̅ events is explored, with a view to simulating this background in searches for the Standard Model Higgs production and decay process tt̅H, H → bb̅. Comparisons are performed between the output of MadGraph and that of Alpgen, showing that satisfactory agreement in their predictions can be obtained with the appropriate generator settings. A search for the Standard Model Higgs boson, produced in association with a top quark pair and decaying into a bb̅ pair, using 20.3 fb−1 of 8 TeV collision data collected in 2012 by the ATLAS experiment at CERN’s Large Hadron Collider, is presented. The GlaNtp analysis framework, together with the RooFit package and associated software, is used to obtain an expected 95% confidence-level limit of 4.2 +4.1/−2.0 times the Standard Model expectation, and the corresponding observed limit is found to be 5.9; this is within experimental uncertainty of the published result of the analysis performed by the ATLAS collaboration. A search for a heavy charged Higgs boson of mass mH± in the range 200 ≤ mH±/GeV ≤ 600, where the Higgs boson mediates the five-flavour beyond-the-Standard-Model physics process gb → tH± → ttb, with one top quark decaying leptonically and the other decaying hadronically, is presented, using the 20.3 fb−1 8 TeV ATLAS data set. Upper limits on the product of the production cross-section and the branching ratio of the H± boson are computed for six mass points, and these are found to be compatible within experimental uncertainty with those obtained by the corresponding published ATLAS analysis.