4 results for Branch Dependency Tracking
in ArchiMeD - Elektronische Publikationen der Universität Mainz - Germany
Abstract:
Quantitative branch determination in polyolefins by solid- and melt-state 13C NMR has been investigated. Both methods were optimised for sensitivity per unit time. While solid-state NMR was shown to give quick albeit only qualitative results, melt-state NMR allowed highly time-efficient, accurate branch quantification. Comparison of spectra obtained using spectrometers operating at 300, 500 and 700 MHz 1H Larmor frequency, with 4 and 7 mm MAS probeheads, showed that the best sensitivity was achieved at 500 MHz using a 7 mm 13C-1H optimised high-temperature probehead. For materials available in large quantities, static melt-state NMR, using large-diameter detection coils and high coil filling at 300 MHz, was shown to produce results comparable to melt-state MAS measurements in less time. While the use of J-coupling mediated polarisation transfer techniques was shown to be possible, direct polarisation via single-pulse excitation proved more suitable for branch quantification in the melt state. Artificial line broadening introduced by FID truncation could be reduced by π pulse-train heteronuclear dipolar decoupling. This decoupling method, when combined with an extended duty cycle, allowed a significant improvement in resolution. Standard setup, processing and analysis techniques were developed to minimise systematic errors contributing to the measured branch contents. The final optimised melt-state MAS NMR method was shown to allow time-efficient quantification of comonomer content and distribution in both polyethylene- and polypropylene-co-α-olefins. The sensitivity of the technique was demonstrated by quantifying branch concentrations of 8 branches per 100,000 CH2 for an industrial ‘linear’ polyethylene in only 13 hours. Even lower degrees of 3–8 long-chain branches per 100,000 carbons could be estimated in just 24 hours for a series of γ-irradiated polypropylene homopolymers.
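To make the truncation effect concrete, the following Python sketch (all parameters are invented for illustration and not taken from the thesis) synthesises a single-line FID and compares the spectrum of the fully decayed signal with that of a truncated one:

    import numpy as np

    sw = 5000.0                  # spectral width in Hz (assumed)
    n_full, n_trunc = 8192, 512  # full vs. truncated number of FID points
    t = np.arange(n_full) / sw   # acquisition time axis in s
    t2 = 0.2                     # transverse relaxation time T2 in s (assumed)

    # One resonance, 100 Hz off-centre, decaying with T2:
    fid = np.exp(2j * np.pi * 100.0 * t - t / t2)

    fid_trunc = fid.copy()
    fid_trunc[n_trunc:] = 0.0    # truncation: acquisition stopped before full decay

    spec_full = np.abs(np.fft.fftshift(np.fft.fft(fid)))
    spec_trunc = np.abs(np.fft.fftshift(np.fft.fft(fid_trunc)))

    # spec_trunc exhibits sinc-like wiggles and a broadened base; extending
    # the usable FID length (as the pi pulse-train decoupling above does)
    # restores the narrow line without apodization-induced broadening.

Comparing the two magnitude spectra shows why a longer usable FID translates directly into resolution.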
Abstract:
The Standard Model of elementary particle physics was developed to describe the fundamental particles which constitute matter and the interactions between them. The Large Hadron Collider (LHC) at CERN in Geneva was built to address some of the remaining open questions of the Standard Model and to explore physics beyond it, by colliding two proton beams at world-record centre-of-mass energies. The ATLAS experiment is designed to reconstruct particles and their decay products originating from these collisions. The precise reconstruction of particle trajectories plays an important role in the identification of particle jets which originate from bottom quarks (b-tagging). This thesis describes the step-wise commissioning of the ATLAS track reconstruction and b-tagging software and one of the first measurements of the b-jet production cross section in pp collisions at √s = 7 TeV with the ATLAS detector. The performance of the track reconstruction software was studied in great detail, first using data from cosmic ray showers and then collisions at √s = 900 GeV and 7 TeV. The good understanding of the track reconstruction software allowed a very early deployment of the b-tagging algorithms. First studies of these algorithms and the measurement of the b-tagging efficiency in the data are presented; they agree well with predictions from Monte Carlo simulations. The b-jet production cross section was measured with the 2010 dataset recorded by the ATLAS detector, employing muons in jets to estimate the fraction of b-jets. The measurement is in good agreement with the Standard Model predictions.
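For orientation, the per-bin cross-section extraction behind such a measurement follows the standard pattern sketched below; every number and name in this sketch is hypothetical and not taken from the thesis:

    # sigma_b = N_tagged * f_b / (eff_b * L), evaluated per jet-pT bin
    n_tagged = 12000   # jets passing the muon-in-jet b-tag selection (hypothetical)
    f_b = 0.35         # b-jet fraction, e.g. from a template fit to the
                       # muon pT,rel spectrum (hypothetical)
    eff_b = 0.50       # b-tagging and selection efficiency (hypothetical)
    lumi_nb = 3.0e3    # integrated luminosity in nb^-1 (hypothetical)

    sigma_nb = n_tagged * f_b / (eff_b * lumi_nb)  # cross section in nb
    print(f"sigma_b ~ {sigma_nb:.2f} nb")          # 2.80 nb with these inputs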
Abstract:
This paper presents the first full-fledged branch-and-price (bap) algorithm for the capacitated arc-routing problem (CARP). Prior exact solution techniques rely either on cutting planes or on the transformation of the CARP into a node-routing problem. The drawbacks are either models with inherent symmetry, dense underlying networks, or a formulation where edge flows in a potential solution do not allow the reconstruction of unique CARP tours. The proposed algorithm circumvents all these drawbacks by taking the beneficial ingredients from existing CARP methods and combining them in a new way. The first step is the solution of the one-index formulation of the CARP in order to produce strong cuts and an excellent lower bound. It is known that this bound is typically stronger than relaxations of a pure set-partitioning CARP model. Such a set-partitioning master program results from a Dantzig-Wolfe decomposition. In the second phase, the master program is initialized with the strong cuts, CARP tours are iteratively generated by a pricing procedure, and branching is required to produce integer solutions. This is a cut-first, bap-second algorithm, and its main task is, in effect, the splitting of edge flows into unique CARP tours.
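For reference, the set-partitioning master program mentioned above is commonly written as follows (a textbook form; the paper's exact side constraints, e.g. on the fleet size, may differ):

    \min \sum_{r \in \Omega} c_r \lambda_r
    \quad \text{s.t.} \quad \sum_{r \in \Omega} a_{er} \lambda_r = 1 \;\; \forall e \in E_R,
    \qquad \lambda_r \in \{0,1\} \;\; \forall r \in \Omega

where Ω denotes the set of feasible CARP tours, c_r the cost of tour r, E_R the set of required edges, and a_{er} = 1 if tour r services edge e. Pricing then searches Ω for tours with negative reduced cost under the current dual values, and branching resolves fractional λ.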
Abstract:
Data sets describing the state of the earth's atmosphere are of great importance in the atmospheric sciences. Over the last decades, the quality and sheer amount of the available data have increased significantly, resulting in a rising demand for new tools capable of handling and analysing these large, multidimensional sets of atmospheric data. The interdisciplinary work presented in this thesis covers the development and the application of practical software tools and efficient algorithms from the field of computer science, with the goal of enabling atmospheric scientists to analyse and to gain new insights from these large data sets. For this purpose, our tools combine novel techniques with well-established methods from different areas such as scientific visualization and data segmentation. In this thesis, three practical tools are presented. Two of these tools are software systems (Insight and IWAL) for different types of processing and interactive visualization of data; the third tool is an efficient algorithm for data segmentation implemented as part of Insight.

Insight is a toolkit for the interactive, three-dimensional visualization and processing of large sets of atmospheric data, originally developed as a testing environment for the novel segmentation algorithm. It provides a dynamic system for combining at runtime data from different sources, a variety of data processing algorithms, and several visualization techniques. Its modular architecture and flexible scripting support led to additional applications of the software, two examples of which are presented: the use of Insight as a WMS (web map service) server, and the automatic production of image sequences for the visualization of cyclone simulations.

The core application of Insight is the provision of the novel segmentation algorithm for the efficient detection and tracking of 3D features in large sets of atmospheric data, as well as for the precise localization of the occurring genesis, lysis, merging and splitting events. Data segmentation usually leads to a significant reduction of the size of the considered data. This enables a practical visualization of the data, statistical analyses of the features and their events, and the manual or automatic detection of interesting situations for subsequent detailed investigation. The concepts of the novel algorithm, its technical realization, and several extensions for avoiding under- and over-segmentation are discussed. As example applications, this thesis covers the setup and the results of the segmentation of upper-tropospheric jet streams and cyclones as full 3D objects.

Finally, IWAL is presented, a web application providing easy interactive access to meteorological data visualizations, primarily aimed at students. As a web application, it avoids the need to retrieve all input data sets and to install and maintain complex visualization tools on a local machine. The main challenge in providing customizable visualizations to large numbers of simultaneous users was to find an acceptable trade-off between the available visualization options and the performance of the application. Besides the implementation details, benchmarks and the results of a user survey are presented.
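To make the event taxonomy concrete, the following deliberately simplified sketch (a hypothetical overlap-based tracker, not the algorithm developed in the thesis) classifies genesis, lysis, merging and splitting between two consecutive time steps:

    def track_events(features_t0, features_t1):
        """features_*: dict mapping a feature id to its set of grid-cell indices."""
        succ = {a: [b for b, cells_b in features_t1.items() if cells_a & cells_b]
                for a, cells_a in features_t0.items()}
        pred = {b: [a for a, cells_a in features_t0.items() if cells_a & cells_b]
                for b, cells_b in features_t1.items()}
        events = []
        for a, successors in succ.items():
            if not successors:
                events.append(("lysis", a))            # feature disappears
            elif len(successors) > 1:
                events.append(("split", a, successors))
        for b, predecessors in pred.items():
            if not predecessors:
                events.append(("genesis", b))          # new feature appears
            elif len(predecessors) > 1:
                events.append(("merge", predecessors, b))
        return events

    # Tiny invented example: feature 1 splits, feature 2 vanishes, 12 appears.
    t0 = {1: {(0, 0), (0, 1)}, 2: {(5, 5)}}
    t1 = {10: {(0, 0)}, 11: {(0, 1), (0, 2)}, 12: {(9, 9)}}
    print(track_events(t0, t1))
    # [('split', 1, [10, 11]), ('lysis', 2), ('genesis', 12)]

A production tracker would additionally apply thresholds to the overlap to suppress spurious events, which is exactly the under- and over-segmentation problem that the extensions mentioned above address.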