4 results for Choruses, Secular (Mixed voices, 8 parts) with piano
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
The production rates of $b$ and $\bar{b}$ hadrons in $pp$ collisions are not expected to be strictly identical, owing to the imbalance between quarks and anti-quarks in the initial state. Naively, the $\bar{b}$ quark produced in the hard scattering can combine with a $u$ or $d$ valence quark from the colliding protons, whereas the same cannot happen for a $b$ quark. This thesis presents the analysis performed to determine the production asymmetries of the $B^0$ and $B^0_s$ mesons. The analysis relies on data samples collected by the LHCb detector at the Large Hadron Collider (LHC) during the 2011 and 2012 data-taking periods, at centre-of-mass energies of $\sqrt{s}=7$ TeV and $\sqrt{s}=8$ TeV, corresponding to integrated luminosities of 1 fb$^{-1}$ and 2 fb$^{-1}$, respectively. The production asymmetry is a key ingredient in measurements of $CP$ violation in $b$-hadron decays at the LHC, since $CP$ asymmetries must be disentangled from other sources of asymmetry. The production asymmetries are measured in bins of $p_\mathrm{T}$ and $\eta$ of the $B$ meson. Integrated over the ranges $4 < p_\mathrm{T} < 30$ GeV/$c$ and $2.5<\eta<4.5$, they are determined to be:
\begin{equation}
A_\mathrm{P}(B^0) = (-1.00\pm0.48\pm0.29)\%,\nonumber
\end{equation}
\begin{equation}
A_\mathrm{P}(B^0_s) = (\phantom{-}1.09\pm2.61\pm0.61)\%,\nonumber
\end{equation}
where the first uncertainty is statistical and the second systematic. The measurement of $A_\mathrm{P}(B^0)$ uses the full statistics collected by LHCb so far, corresponding to an integrated luminosity of 3 fb$^{-1}$, while the measurement of $A_\mathrm{P}(B^0_s)$ uses only the first 1 fb$^{-1}$, leaving room for improvement. No clear dependence on $p_\mathrm{T}$ or $\eta$ is observed. The results presented in this thesis are the most precise such measurements to date.
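The abstract does not spell out the definition of $A_\mathrm{P}$. In the convention commonly used in LHCb analyses (stated here as an assumption, not taken from the abstract), it is the normalized difference of the $\bar{B}$ and $B$ production cross-sections, so a negative $A_\mathrm{P}(B^0)$ corresponds to an excess of $B^0$ over $\bar{B}^0$, consistent with the valence-quark argument above:
\begin{equation}
A_\mathrm{P}(B) = \frac{\sigma(\bar{B}) - \sigma(B)}{\sigma(\bar{B}) + \sigma(B)}.\nonumber
\end{equation}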
Abstract:
Mixed integer programming is today one of the most widely used techniques for dealing with hard optimization problems. On the one hand, many practical optimization problems arising from real-world applications (e.g., scheduling, project planning, transportation, telecommunications, economics and finance, and timetabling) can be easily and effectively formulated as Mixed Integer linear Programs (MIPs). On the other hand, more than 50 years of intensive research have dramatically improved the capability of the current generation of MIP solvers to tackle hard problems in practice. However, many questions are still open and not fully understood, and the mixed integer programming community remains very active in trying to answer them. As a consequence, a huge number of papers are published every year, and new intriguing questions keep arising.

When dealing with MIPs, we have to distinguish between two different scenarios. The first occurs when we are asked to handle a general MIP and cannot assume any special structure for the given problem. In this case, a Linear Programming (LP) relaxation and some integrality requirements are all we have for tackling the problem, and we are ``forced'' to use general purpose techniques. The second occurs when mixed integer programming is used to address a problem with some special structure. In this context, polyhedral analysis and other theoretical and practical considerations are typically exploited to devise special purpose techniques. This thesis tries to give some insights into both of the above situations.

The first part of the work is focused on general purpose cutting planes, which are probably the key ingredient behind the success of the current generation of MIP solvers. Chapter 1 presents a quick overview of the main ingredients of a branch-and-cut algorithm, while Chapter 2 recalls some results from the literature on disjunctive cuts and their connections with Gomory mixed integer cuts. Chapter 3 presents a theoretical and computational investigation of disjunctive cuts. In particular, we analyze the connections between different normalization conditions (i.e., conditions to truncate the cone associated with disjunctive cutting planes) and other crucial aspects such as cut rank, cut density, and cut strength. We give a theoretical characterization of the weak rays of the disjunctive cone that lead to dominated cuts, and propose a practical method to strengthen the cuts arising from such weak extremal solutions. Further, we point out how redundant constraints can affect the quality of the generated disjunctive cuts, and discuss possible ways to cope with them. Finally, Chapter 4 presents some preliminary ideas in the context of multiple-row cuts. Very recently, a series of papers has drawn attention to the possibility of generating cuts using more than one row of the simplex tableau at a time. Several interesting theoretical results have been presented in this direction, often revisiting and recalling important results discovered more than 40 years ago. However, it is not at all clear how these results can be exploited in practice. As stated, the chapter is still a work in progress and simply presents a possible way of generating two-row cuts from the simplex tableau based on lattice-free triangles, together with some preliminary computational results.
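For concreteness, the objects discussed above can be sketched as follows (notation mine, not the thesis's). A MIP has the form
\begin{equation}
\min\{\, c^\mathsf{T} x \;:\; Ax \ge b,\ x \ge 0,\ x_j \in \mathbb{Z} \ \text{for } j \in J \,\},\nonumber
\end{equation}
and for any $\pi \in \mathbb{Z}^n$ with $\pi_j = 0$ for $j \notin J$ and any $\pi_0 \in \mathbb{Z}$, every feasible point satisfies the split disjunction
\begin{equation}
(\pi^\mathsf{T} x \le \pi_0) \ \lor\ (\pi^\mathsf{T} x \ge \pi_0 + 1).\nonumber
\end{equation}
A disjunctive cut is an inequality $\alpha^\mathsf{T} x \ge \beta$ valid for both terms of the disjunction; the set of valid pairs $(\alpha, \beta)$ forms an unbounded cone, which is why a normalization condition is needed to truncate it and make the cut-separation LP well posed.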
The second part of the thesis is instead focused on the heuristic and exact exploitation of integer programming techniques for hard combinatorial optimization problems in the context of routing applications. Chapters 5 and 6 present an integer linear programming local search algorithm for Vehicle Routing Problems (VRPs). The overall procedure follows a general destroy-and-repair paradigm (i.e., the current solution is first randomly destroyed and then repaired in an attempt to find a new improved solution), in which a class of exponential neighborhoods is iteratively explored by heuristically solving an integer programming formulation through a general purpose MIP solver; a sketch of this loop is given below. Chapters 7 and 8 deal with exact branch-and-cut methods. Chapter 7 presents an extended formulation for the Traveling Salesman Problem with Time Windows (TSPTW), a generalization of the well-known TSP in which each node must be visited within a given time window. The polyhedral approaches proposed for this problem in the literature typically follow the one that has proven extremely effective in the classical TSP context. Here we present a (quite) general idea based on a relaxed discretization of the time windows. This idea leads to a stronger formulation and to stronger valid inequalities, which are then separated within the classical branch-and-cut framework. Finally, Chapter 8 addresses branch-and-cut in the context of Generalized Minimum Spanning Tree Problems (GMSTPs), a class of NP-hard generalizations of the classical minimum spanning tree problem. In this chapter, we show how some basic ideas (and, in particular, the use of general purpose cutting planes) can be useful to improve on the branch-and-cut methods proposed in the literature.
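A minimal, self-contained sketch of that destroy-and-repair loop on a toy single-vehicle instance (all names and data are illustrative assumptions; a greedy cheapest-insertion repair stands in for the thesis's MIP-based exploration of exponential neighborhoods):

```python
import math
import random

# Toy instance: a depot plus 15 customers in the plane (illustrative data only).
random.seed(0)
points = [(0.0, 0.0)] + [(random.uniform(-10, 10), random.uniform(-10, 10))
                         for _ in range(15)]

def dist(i, j):
    (xi, yi), (xj, yj) = points[i], points[j]
    return math.hypot(xi - xj, yi - yj)

def tour_cost(tour):
    return sum(dist(tour[k], tour[k + 1]) for k in range(len(tour) - 1))

def destroy(tour, k):
    """Destroy step: randomly remove k customers from the current solution."""
    removed = random.sample(tour[1:-1], k)
    return [c for c in tour if c not in removed], removed

def repair(tour, removed):
    """Repair step: greedy cheapest insertion. The thesis instead solves an
    integer programming formulation with a general purpose MIP solver here."""
    for c in removed:
        best_pos = min(range(1, len(tour)),
                       key=lambda p: dist(tour[p - 1], c) + dist(c, tour[p])
                                     - dist(tour[p - 1], tour[p]))
        tour.insert(best_pos, c)
    return tour

# Start from a trivial depot-to-depot tour and iterate destroy-and-repair,
# keeping the candidate whenever it improves on the incumbent.
best = list(range(len(points))) + [0]
for _ in range(200):
    partial, removed = destroy(best[:], k=4)
    candidate = repair(partial, removed)
    if tour_cost(candidate) < tour_cost(best):
        best = candidate

print(round(tour_cost(best), 2))
```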
Abstract:
Background: Treatment of patients with relapsed/refractory (R/R) diffuse large B-cell lymphoma (DLBCL) who are not eligible for high-dose therapy represents an unmet medical need. Panobinostat has shown encouraging therapeutic activity in studies conducted in lymphoma cell lines and in vivo in patients with advanced hematologic malignancies.
Purpose: FIL-PanAL10 (NCT01523834) is a phase II, prospective multicenter trial of the Fondazione Italiana Linfomi (FIL) to evaluate the safety and efficacy of single-agent Panobinostat as salvage therapy for R/R DLBCL patients, and to evaluate possible relationships between response and biological features.
Patients and Methods: Patients with R/R DLBCL were included. The treatment plan comprised 6 induction courses of Panobinostat monotherapy followed by a further 6 consolidation courses. The primary objective was to evaluate Panobinostat activity in terms of overall response (OR); secondary objectives were CR rate, time to response (TTR), progression-free survival (PFS), and the safety and feasibility of Panobinostat. As explorative objectives, we evaluated the impact of pharmacogenetics, immunohistochemical patterns, and patient-specific gene expression and mutations as potential predictors of response to Panobinostat. To this aim, a new pre-enrollment tissue biopsy was mandatory.
Results: Thirty-five patients, 21 males (60%), were enrolled between June 2011 and March 2014. At the end of the induction phase, 7 responses (20%) were observed, including 4 CR (11%), while 28 patients (80%) discontinued treatment, due to progressive disease (PD) in 21 (60%) or adverse events in 7 (20%). Median TTR in 9 responders was 2.6 months (range 1.8-12). With a median follow-up of 6 months (range 1-34), the estimated 12-month PFS and OS were 27% and 30.5%, respectively. Grade 3-4 thrombocytopenia and neutropenia were the most common toxicities, occurring in 29 (83%) and 12 (34%) patients, respectively.
Conclusions: The results of this study indicate that Panobinostat may be remarkably active in some patients with R/R DLBCL, showing durable CRs.
Abstract:
This research proposes a solution for integrating RFID (Radio Frequency Identification) technology within structures based on CFRPs (Carbon Fiber Reinforced Polymers). The main objective is to use the technology to monitor and track composite components during manufacturing and service life. The study can be divided into two macro-areas. The first portion of the research evaluates the impact of the composite materials used on the transmission of the electromagnetic signal to and from the tag; RFID technology communicates through radio frequencies to track and trace the items associated with the tags. In the first instance, a feasibility study was carried out to assess the use of commercially available tags. Then, after evaluating different solutions, it was decided to incorporate the tags into coupons during production. The second portion of the research focuses on the impact of tag embedding on the composite material's mechanical resistance. It starts with the design of tensile test specimens through an FEM model with different housing configurations. Subsequently, the best configuration was tested at the Faculty of Aerospace Engineering of TU Delft, in particular in the Structure & Materials Laboratory, where two tests were conducted: the first based on ASTM D3039/D3039M-14 (Standard Test Method for Tensile Properties of Polymer Matrix Composite Materials), the second dividing the path to failure into intervals in a load-unload-reload sequence. Both tests were accompanied by instrumentation such as DIC, AE, C-Scan, and optical microscopy. The expected result of including RFID tags in composite components is that they bring added value to the parts with which they are associated without unduly affecting their mechanical properties, first of all through automatic identification during the production cycle and service life. As a result, improvements were made to the design of production facilities.
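As a side note on the data reduction behind such a tensile test, here is a minimal sketch (illustrative numbers, not the thesis's data) of the two basic ASTM D3039 quantities, ultimate tensile strength and chord modulus, computed from load and strain records:

```python
# Basic ASTM D3039-style data reduction (all values assumed for illustration).
# Ultimate strength: F_tu = P_max / A; chord modulus: E = d_sigma / d_epsilon,
# evaluated over the 0.001-0.003 strain range recommended by the standard.

width_mm, thickness_mm = 25.0, 2.5      # specimen cross-section (assumed)
area_mm2 = width_mm * thickness_mm

p_max_N = 55_000.0                      # maximum load before failure (assumed)
f_tu_MPa = p_max_N / area_mm2           # N/mm^2 == MPa

# Two (stress, strain) points bracketing the chord-modulus strain range.
sigma_1_MPa, eps_1 = 60.0, 0.001
sigma_2_MPa, eps_2 = 180.0, 0.003
e_chord_GPa = (sigma_2_MPa - sigma_1_MPa) / (eps_2 - eps_1) / 1000.0

print(f"F_tu = {f_tu_MPa:.0f} MPa, E_chord = {e_chord_GPa:.1f} GPa")
```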