934 results for Integrated production


Relevance: 30.00%

Abstract:

This study focuses on radio-frequency inductively coupled thermal plasma (ICP) synthesis of nanoparticles, combining experimental and modelling approaches towards process optimization and industrial scale-up, in the framework of the FP7-NMP SIMBA European project (Scaling-up of ICP technology for continuous production of Metallic nanopowders for Battery Applications). First, the state of the art of nanoparticle production through conventional and plasma routes is summarized; then results on the characterization of the plasma source and on the investigation of the nanoparticle synthesis process are presented, aiming to highlight the fundamental process parameters while adopting a design-oriented modelling approach. In particular, an energy balance of the torch and of the reaction chamber, employing a calorimetric method, is presented, while results from three- and two-dimensional modelling of an ICP system are compared with calorimetric and enthalpy-probe measurements to validate the temperature field predicted by the model and used to characterize the ICP system under powder-free conditions. Moreover, results from the modelling of critical phases of the ICP synthesis process, such as precursor evaporation, vapour conversion into nanoparticles and nanoparticle growth, are presented, with the aim of providing useful insights both on the design and optimization of the process and on the underlying physical phenomena. Indeed, precursor evaporation, one of the phases with the highest impact on the industrial feasibility of the process, is discussed: by employing models describing particle trajectories and thermal histories, adapted from those originally developed for other plasma technologies or applications, such as DC non-transferred arc torches and powder spheroidization, the evaporation of a micro-sized solid Si precursor in a laboratory-scale ICP system is investigated. Finally, a discussion of the role of the thermo-fluid dynamic fields in nanoparticle formation is presented, together with a study of the effect of the reaction-chamber geometry on the characteristics of the produced nanoparticles and on the process yield.
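
As a minimal sketch of the calorimetric energy balance mentioned above (a cooling-water balance of the kind commonly used for ICP torches; the circuit layout, variable names and all numbers are illustrative assumptions, not values from the thesis):

```python
# Minimal sketch of a cooling-water calorimetric energy balance for an ICP torch.
# All names and numbers are illustrative assumptions, not values from the thesis.

CP_WATER = 4186.0  # specific heat of water [J/(kg K)]

def circuit_heat_load(mass_flow_kg_s, t_in_c, t_out_c):
    """Heat removed by one cooling circuit, Q = m_dot * c_p * (T_out - T_in)."""
    return mass_flow_kg_s * CP_WATER * (t_out_c - t_in_c)

def torch_energy_balance(plate_power_w, circuits):
    """Return (power coupled to the gas, torch thermal efficiency)."""
    q_cooling = sum(circuit_heat_load(*c) for c in circuits)
    p_gas = plate_power_w - q_cooling
    return p_gas, p_gas / plate_power_w

if __name__ == "__main__":
    # (mass flow [kg/s], inlet T [degC], outlet T [degC]) for torch and chamber circuits
    circuits = [(0.20, 20.0, 28.0), (0.15, 20.0, 25.0)]
    p_gas, eta = torch_energy_balance(plate_power_w=15e3, circuits=circuits)
    print(f"Power to gas: {p_gas/1e3:.1f} kW, torch efficiency: {eta:.2%}")
```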

Relevance: 30.00%

Abstract:

Research on the use of ICT in education (TICE) frequently focuses on either the cognitive, the linguistic, or the cultural dimension. Most often, empirical studies set out to evaluate the direct effects of ICT on learners' language performance. Moreover, such research, especially in cognitive psychology, is usually carried out in the laboratory. This is why the work presented in this thesis sets out to place the use of ICT within an ecological perspective and to propose an integrated approach for the analysis of actual practices both in language teaching and in translation teaching. With regard to the cognitive aspects, we draw on a concept valued by practitioners: learning strategies. The first four chapters of this thesis are devoted to the elaboration of the theoretical framework within which our research is situated. We first address the disciplinary aspects, in particular the interdisciplinarity of our two fields of reference. We then deal with learning strategies and translation strategies. Third, we endeavour to define the two competences targeted by our research: written production and translation. Fourth, we examine the changes introduced by ICT into the teaching and learning practices of these two competences. The fifth chapter presents and analyses the data collected from groups of teachers and students of the French section of the SSLMIT. We first present our corpus, then proceed to the analysis of the data; finally, after an overall synthesis, we outline didactic and scientific avenues capable of extending this work.

Relevance: 30.00%

Abstract:

The production of the Z boson in proton-proton collisions at the LHC serves as a standard candle at the ATLAS experiment during early data-taking. The decay of the Z into an electron-positron pair gives a clean signature in the detector that allows for calibration and performance studies. The cross-section of about 1 nb allows for first LHC measurements of parton density functions. In this thesis, simulations of 10 TeV collisions at the ATLAS detector are studied. The challenges for an experimental measurement of the cross-section with an integrated luminosity of 100 pb−1 are discussed. In preparation for the cross-section determination, the single-electron efficiencies are determined via a simulation-based method and with a data-driven ansatz; the two methods agree very well, differing by at most ~3%. The ingredients of an inclusive and a differential Z production cross-section measurement at ATLAS are discussed and their possible contributions to systematic uncertainties are presented. For a combined sample of signal and background, the expected uncertainty on the inclusive cross-section for an integrated luminosity of 100 pb−1 is determined to be ±1.5% (stat) ± 4.2% (syst) ± 10% (lumi). The possibilities for single-differential cross-section measurements in rapidity and transverse momentum of the Z boson, which are important quantities because of their impact on parton density functions and their capability to check for non-perturbative effects in pQCD, are outlined. The issues of an efficiency correction based on electron efficiencies as a function of the electron's transverse momentum and pseudorapidity are studied. A possible alternative is demonstrated by expanding the two-dimensional efficiencies with the additional dimension of the invariant mass of the two leptons of the Z decay.
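
The ingredients listed above enter the usual relation sigma = (N_obs − N_bkg) / (A · ε · L). The snippet below is a minimal sketch of that arithmetic with a quadrature combination of the statistical, systematic and luminosity uncertainties; all numbers and names are illustrative assumptions, not results from the thesis.

```python
# Minimal sketch of an inclusive cross-section determination,
# sigma = (N_obs - N_bkg) / (acceptance * efficiency * luminosity).
# All numbers below are illustrative placeholders, not thesis results.
from math import sqrt

def cross_section(n_obs, n_bkg, acceptance, efficiency, lumi_pb):
    return (n_obs - n_bkg) / (acceptance * efficiency * lumi_pb)

n_obs, n_bkg = 30000.0, 1000.0
acceptance, efficiency = 0.40, 0.75
lumi_pb = 100.0  # pb^-1

sigma = cross_section(n_obs, n_bkg, acceptance, efficiency, lumi_pb)

# Relative uncertainties combined in quadrature (assuming no correlations).
rel_stat = sqrt(n_obs) / (n_obs - n_bkg)
rel_syst = 0.042   # e.g. efficiency and background modelling
rel_lumi = 0.10    # early-data luminosity uncertainty
rel_total = sqrt(rel_stat**2 + rel_syst**2 + rel_lumi**2)

print(f"sigma = {sigma:.1f} pb  (+/- {100*rel_total:.1f}% total)")
```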

Relevance: 30.00%

Abstract:

The LHCb experiment at the LHC, by exploiting the high production cross section for $c\overline{c}$ quark pairs, offers the possibility to investigate $\mathcal{CP}$ violation in the charm sector with very high precision. In this thesis a measurement of time-integrated $\mathcal{CP}$ violation using $D^0\rightarrow K^+K^-$ and $D^0\rightarrow \pi^+\pi^-$ decays at LHCb is presented. The measured quantity is the difference ($\Delta$) of the $\mathcal{CP}$ asymmetry ($\mathcal{A}_{\mathcal{CP}}$) between the decay rates of $D^0$ and $\overline{D}^0$ mesons into $K^+K^-$ and $\pi^+\pi^-$ pairs. The analysis is performed on 2011 data, collected at $\sqrt{s}=7$ TeV and corresponding to an integrated luminosity of 1 fb$^{-1}$, and on 2012 data, collected at $\sqrt{s}=8$ TeV and corresponding to an integrated luminosity of 2 fb$^{-1}$. A complete study of systematic uncertainties is beyond the scope of this thesis; however, the dominant systematic uncertainty of the previous analysis has been re-examined. We find that it was due to a statistical fluctuation and demonstrate that it no longer needs to be taken into account. By combining the 2011 and 2012 results, the final statistical precision is 0.08%. Once this analysis is completed and published, it will be the most precise single measurement in the search for $\mathcal{CP}$ violation in the charm sector.
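
For reference, the measured quantities correspond to the standard definitions below (the notation is added here for clarity, not quoted from the thesis), with $\Gamma$ the time-integrated decay rate:
\begin{equation}
\mathcal{A}_{\mathcal{CP}}(f) = \frac{\Gamma(D^0\rightarrow f)-\Gamma(\overline{D}^0\rightarrow f)}{\Gamma(D^0\rightarrow f)+\Gamma(\overline{D}^0\rightarrow f)}, \qquad \Delta\mathcal{A}_{\mathcal{CP}} = \mathcal{A}_{\mathcal{CP}}(K^+K^-) - \mathcal{A}_{\mathcal{CP}}(\pi^+\pi^-).\nonumber
\end{equation}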

Relevance: 30.00%

Abstract:

The production rates of $b$ and $\bar{b}$ hadrons in $pp$ collisions are not expected to be strictly identical, owing to the imbalance between quarks and anti-quarks in the initial state. This phenomenon can be naively related to the fact that the $\bar{b}$ quark produced in the hard scattering may combine with a $u$ or $d$ valence quark from the colliding protons, whereas the same cannot happen for a $b$ quark. This thesis presents the analysis performed to determine the production asymmetries of the $B^0$ and $B^0_s$ mesons. The analysis relies on data samples collected by the LHCb detector at the Large Hadron Collider (LHC) during the 2011 and 2012 data-taking periods, at centre-of-mass energies of $\sqrt{s}=7$ TeV and $\sqrt{s}=8$ TeV, corresponding respectively to integrated luminosities of 1 fb$^{-1}$ and 2 fb$^{-1}$. The production asymmetry is one of the key ingredients for measurements of $CP$ violation in b-hadron decays at the LHC, since $CP$ asymmetries must be disentangled from other sources of asymmetry. The measurements of the production asymmetries are performed in bins of $p_\mathrm{T}$ and $\eta$ of the $B$ meson. The values of the production asymmetries, integrated over the ranges $4 < p_\mathrm{T} < 30$ GeV/c and $2.5<\eta<4.5$, are determined to be
\begin{equation}
A_\mathrm{P}(B^0)= (-1.00\pm0.48\pm0.29)\%,\nonumber
\end{equation}
\begin{equation}
A_\mathrm{P}(B^0_s)= (\phantom{-}1.09\pm2.61\pm0.61)\%,\nonumber
\end{equation}
where the first uncertainty is statistical and the second is systematic. The measurement of $A_\mathrm{P}(B^0)$ uses the full statistics collected by LHCb so far, corresponding to an integrated luminosity of 3 fb$^{-1}$, while the measurement of $A_\mathrm{P}(B^0_s)$ is performed with the first 1 fb$^{-1}$, leaving room for improvement. No clear evidence of a dependence on $p_\mathrm{T}$ or $\eta$ is observed. The results presented in this thesis are the most precise measurements available to date.
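
In the convention commonly used for such measurements (added here for clarity; the sign convention is an assumption, not quoted from the thesis), the production asymmetry is defined in terms of the production cross sections as
\begin{equation}
A_\mathrm{P}(B) = \frac{\sigma(\overline{B}) - \sigma(B)}{\sigma(\overline{B}) + \sigma(B)},\nonumber
\end{equation}
so that the negative central value of $A_\mathrm{P}(B^0)$ would correspond to a slight excess of $B^0$ over $\overline{B}^0$ production, in line with the naive valence-quark argument above.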

Relevance: 30.00%

Abstract:

The aim of this work was to identify markers associated with production traits in the pig genome using different approaches. We focused on the Italian Large White pig breed, using Genome Wide Association Studies (GWAS) and applying a selective genotyping approach to increase the power of the analyses. Furthermore, we searched the pig genome using Next Generation Sequencing (NGS) Ion Torrent technology, combining the selective genotyping approach with deep sequencing for SNP discovery. Two other studies were carried out with a different approach: allele frequency changes for SNPs in candidate genes and at the genome-wide level were analysed to identify selection signatures driven by the selection program over the last 20 years. This approach confirmed that a great number of markers may affect production traits and that they are captured by classical selection programs. GWAS revealed 123 SNPs significantly or suggestively associated with Back Fat Thickness and 229 associated with Average Daily Gain. Sixteen Copy Number Variant Regions were more frequent in lean or fat pigs, and showed that different copy numbers of those regions could have a limited impact on fatness; these regions often appear to be involved in food intake and behaviour, besides affecting genes involved in metabolic pathways and their expression. By combining NGS with the selective genotyping approach, new variants were discovered, and at least 54 of them are worth analysing in association studies. The study of groups of pigs subjected to stringent selection showed that the allele frequencies of some loci can change drastically when those loci are linked to traits targeted by the selection schemes. These approaches could, in the future, be integrated into genomic selection plans.
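
As a rough illustration of the selective genotyping idea, i.e. contrasting allele frequencies between phenotypic extremes, the sketch below compares allele counts in "lean" and "fat" tails with a 2x2 chi-square test; the marker, counts and thresholds are invented for the example and are not from the thesis.

```python
# Minimal sketch of a selective-genotyping association test: compare allele
# counts at one SNP between the two phenotypic tails (e.g. extreme lean vs
# extreme fat pigs) with a 2x2 chi-square test. All counts are invented.

def chi2_2x2(a, b, c, d):
    """Pearson chi-square for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    expected = [(a + b) * (a + c) / n, (a + b) * (b + d) / n,
                (c + d) * (a + c) / n, (c + d) * (b + d) / n]
    observed = [a, b, c, d]
    return sum((obs - exp) ** 2 / exp for obs, exp in zip(observed, expected))

# allele A / allele B counts in each tail
lean_a, lean_b = 150, 50
fat_a, fat_b = 100, 100

chi2 = chi2_2x2(lean_a, lean_b, fat_a, fat_b)
print(f"allele A freq: lean={lean_a/(lean_a+lean_b):.2f}, fat={fat_a/(fat_a+fat_b):.2f}")
print(f"chi2 = {chi2:.1f} (1 dof); large values suggest association with fatness")
```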

Relevance: 30.00%

Abstract:

Nowadays microalgae are studied, and a number of species are already mass-cultivated, for their applications in many fields: food and feed, chemicals, pharmaceuticals, phytoremediation and renewable energy. Phytoremediation, in particular, can become a valid integrated process in many algal biomass production systems. This thesis focuses on the physiological and biochemical effects of different environmental factors, mainly macronutrients, light and temperature, on microalgae. The microalgal species were selected on the basis of their biotechnological potential, and nitrogen recurs in all chapters owing to its importance both physiologically and in applications. There are five chapters, ready for submission or in preparation, each addressing a specific question: (i) to measure the kinetic parameters and the nutrient removal efficiencies of a selected local strain of microalgae; (ii) to study the biochemical pathways of the microalga D. communis in the presence of nitrate and ammonium; (iii) to improve the growth and the removal efficiency of a specific green microalga under mixotrophic conditions; (iv) to optimize the productivity of some microalgae with low growth rates through phytohormones and other biostimulants; and (v) to apply the phyto-removal of ammonium to an effluent from anaerobic digestion. The results show how a physiological point of view is necessary to support and optimize existing biotechnologies and applications based on microalgae.
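
As a minimal illustration of the quantities in point (i), the sketch below computes a nutrient removal efficiency and a Monod-type specific growth rate; the parameter values, and the assumption of Monod kinetics, are ours and are not taken from the thesis.

```python
# Illustrative calculation of nutrient removal efficiency and a
# Monod-type specific growth rate. All parameter values are assumptions.

def removal_efficiency(c_in_mg_l, c_out_mg_l):
    """Fraction of the incoming nutrient removed by the culture."""
    return (c_in_mg_l - c_out_mg_l) / c_in_mg_l

def monod_growth_rate(substrate_mg_l, mu_max_per_day, k_s_mg_l):
    """Specific growth rate mu = mu_max * S / (K_s + S)."""
    return mu_max_per_day * substrate_mg_l / (k_s_mg_l + substrate_mg_l)

if __name__ == "__main__":
    eff = removal_efficiency(c_in_mg_l=40.0, c_out_mg_l=6.0)   # e.g. N-NH4+
    mu = monod_growth_rate(substrate_mg_l=40.0, mu_max_per_day=1.2, k_s_mg_l=5.0)
    print(f"Removal efficiency: {eff:.0%}, growth rate: {mu:.2f} 1/day")
```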

Relevance: 30.00%

Abstract:

In hadronic collisions, pairs of high-energy jets are produced in the majority of events with a large momentum transfer. Their production and properties can be predicted with high accuracy by perturbation theory in quantum chromodynamics (QCD). The production of bottom quarks in such collisions can be used as a benchmark to test the predictions of QCD, since these quarks reflect the dynamics of the production process at scales where a perturbative calculation is possible without restrictions. Owing to the high mass of particles containing a bottom quark, the measured hadronic final state retains most of the information about the quark production process. Because of their large production rate, they and their decay products play an important role as background in many analyses, in particular in searches for new physics. Given their prominent position in the third quark generation, signs of new phenomena could appear more strongly than for the lighter quarks. The ratio between the production of jets containing such bottom quarks, known as $b$-jets, and all detected jets is therefore an important indicator for new massive objects. In this work, the production rate and the correlations of $b$-jet pairs are determined, and the invariant mass spectrum of the $b$-jets is searched for first hints of a new massive particle not contained in the Standard Model. At the Large Hadron Collider (LHC), two proton beams collide at a centre-of-mass energy of $\sqrt{s} = 7$ TeV, producing many such $b$-jet pairs. This analysis uses the collisions recorded by the ATLAS detector; the integrated luminosity of the usable data amounts to 34 pb$^{-1}$. $b$-jets are identified by means of their long lifetime and their reconstructed charged decay products. For this analysis, the differences in behaviour between jets originating from light objects, such as gluons and light quarks, and these $b$-jets must in particular be taken into account. The energy scale of these $b$-jets is studied and the additional uncertainty on the jet energy measurement is determined. Effects in the jet reconstruction in the detector that are unique to $b$-jets are studied, so that the measurement can ultimately be quoted independently of the detector, at hadron level. The measurement is then compared to next-to-leading-order predictions, which turn out to be in agreement with the recorded data. From this it can be concluded that the underlying production mechanism remains valid in this newly accessible energy regime at the LHC. However, first hints of deficiencies in the description of the properties of these events are also found. Furthermore, no indication of a new resonance decaying into $b$-jet pairs is found in the invariant mass spectrum up to about 1.7 TeV. Model-independent limits are computed for the appearance of such a resonance with a Gaussian-shaped mass distribution.
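
Since the search above is carried out in the invariant mass spectrum of $b$-jet pairs, the short sketch below shows the standard dijet invariant mass computation from two jet four-momenta; the kinematic values are invented for illustration and this is not code from the analysis.

```python
# Compute the invariant mass of a jet pair from (pT, eta, phi, m) of each jet,
# m_jj^2 = (E1 + E2)^2 - |p1 + p2|^2. Kinematic values are illustrative only.
import math

def four_momentum(pt, eta, phi, mass):
    px, py = pt * math.cos(phi), pt * math.sin(phi)
    pz = pt * math.sinh(eta)
    e = math.sqrt(px**2 + py**2 + pz**2 + mass**2)
    return e, px, py, pz

def invariant_mass(jet1, jet2):
    e1, px1, py1, pz1 = four_momentum(*jet1)
    e2, px2, py2, pz2 = four_momentum(*jet2)
    m2 = (e1 + e2)**2 - (px1 + px2)**2 - (py1 + py2)**2 - (pz1 + pz2)**2
    return math.sqrt(max(m2, 0.0))

# Two b-jets in GeV: (pT, eta, phi, m)
m_jj = invariant_mass((180.0, 0.5, 0.1, 12.0), (160.0, -0.8, 3.0, 10.0))
print(f"m_jj = {m_jj:.1f} GeV")
```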

Relevance: 30.00%

Abstract:

Top quark studies play an important role in the physics program of the Large Hadron Collider (LHC). The energy and luminosity reached allow the collection of a large amount of data, especially in kinematic regions never studied before. This thesis presents the measurement of the ttbar production differential cross section with data collected by ATLAS in 2012 in proton-proton collisions at \sqrt{s} = 8 TeV, corresponding to an integrated luminosity of 20.3 fb^{−1}. The measurement is performed for ttbar events in the semileptonic channel in which the hadronically decaying top quark has a transverse momentum above 300 GeV. The hadronic top-quark decay is reconstructed as a single large-radius jet and identified using jet substructure properties. The final differential cross section has been compared with several theoretical predictions, showing a discrepancy of about 25% between data and predictions, depending on the MC generator. Furthermore, the kinematic distributions of the ttbar production process are very sensitive to the choice of the parton distribution function (PDF) set used in the simulations and could provide constraints on the gluon PDF. In particular, this thesis presents a systematic study of the proton PDFs, varying several PDF sets and checking which one best describes the experimental distributions. The boosted techniques applied in this measurement will be fundamental in the next data taking at \sqrt{s} = 13 TeV, when a large number of heavy particles with high momentum will be produced.
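
Per bin, a differential measurement of this kind reduces to counting corrected events and dividing by luminosity and bin width. The toy sketch below shows that arithmetic with a simple bin-by-bin correction factor; the binning, numbers and correction scheme are invented for illustration and may differ from the unfolding used in the thesis.

```python
# Toy sketch of a binned differential cross-section measurement:
# dsigma/dpT (bin i) = (N_obs_i - N_bkg_i) * C_i / (L * dpT_i),
# with C_i a bin-by-bin correction from detector to particle level.
# All numbers are invented placeholders.

lumi_fb = 20.3                      # integrated luminosity [fb^-1]
bins_gev = [(300, 350), (350, 450), (450, 700)]
n_obs = [5200.0, 3100.0, 900.0]     # observed events per bin
n_bkg = [1800.0, 1000.0, 300.0]     # estimated background per bin
corr = [2.1, 2.3, 2.6]              # ~1/(acceptance*efficiency) per bin

for (lo, hi), obs, bkg, c in zip(bins_gev, n_obs, n_bkg, corr):
    width = hi - lo
    dsigma = (obs - bkg) * c / (lumi_fb * width)   # fb / GeV
    print(f"pT [{lo},{hi}] GeV: dsigma/dpT = {dsigma:.2f} fb/GeV")
```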

Relevance: 30.00%

Abstract:

Measurements of the self-couplings of the gauge bosons are important to test the electroweak sector of the Standard Model (SM). The production of pairs of Z bosons through the s-channel is forbidden in the SM. The presence of physics beyond the SM could lead to a deviation from the expected production cross section for pairs of Z bosons through so-called anomalous Triple Gauge Couplings (aTGC). Proton-proton collision data recorded by the ATLAS detector at the Large Hadron Collider (LHC) at a center-of-mass energy of 8 TeV, corresponding to an integrated luminosity of 20.3 fb-1, were analyzed. Pairs of Z bosons decaying into two electron-positron pairs are searched for in the data sample. The effect of including detector regions at high values of the pseudorapidity was studied in order to enlarge the phase space available for the measurement of ZZ production. The number of ZZ candidates was determined and the ZZ production cross section was measured to be 7.3±1.0(stat.)±0.4(sys.)±0.2(lumi.) pb, which is consistent with the SM expectation of 7.2±0.3 pb. Limits on the aTGCs were derived from the observed yield; they are twice as stringent as the previous limits obtained by ATLAS at a center-of-mass energy of 7 TeV.
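
To make the quoted agreement concrete, the snippet below combines the uncertainties in quadrature and expresses the difference between the measured and predicted cross sections in standard deviations; this is a naive check that ignores correlations and any asymmetry of the uncertainties, and it is not part of the thesis.

```python
# Naive compatibility check between the measured ZZ cross section and the SM
# prediction quoted above, combining uncertainties in quadrature and ignoring
# correlations and asymmetric errors.
from math import sqrt

measured, stat, syst, lumi = 7.3, 1.0, 0.4, 0.2   # pb
predicted, theory_unc = 7.2, 0.3                  # pb

total_meas_unc = sqrt(stat**2 + syst**2 + lumi**2)
total_unc = sqrt(total_meas_unc**2 + theory_unc**2)
pull = (measured - predicted) / total_unc

print(f"measurement: {measured} +/- {total_meas_unc:.2f} pb, pull vs SM: {pull:.2f} sigma")
```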

Relevance: 30.00%

Abstract:

Our generation of computational scientists is living in an exciting time: not only do we get to pioneer important algorithms and computations, we also get to set standards for how computational research should be conducted and published. From Euclid's reasoning and Galileo's experiments, it took hundreds of years for the theoretical and experimental branches of science to develop standards for publication and peer review. Computational science, rightly regarded as the third branch, can walk the same road much faster. The success and credibility of science are anchored in the willingness of scientists to expose their ideas and results to independent testing and replication by other scientists. This requires the complete and open exchange of data, procedures and materials. The idea of "replication by other scientists" applied to computations is more commonly known as "reproducible research". In this context the journal "EAI Endorsed Transactions on Performance & Modeling, Simulation, Experimentation and Complex Systems" had the exciting and original idea of allowing scientists to submit, together with the article, the computational materials (software, data, etc.) that were used to produce its contents. The goal of this procedure is to allow the scientific community to verify the content of the paper by reproducing it on the platform, independently of the chosen OS, to confirm or invalidate it, and above all to allow its reuse to produce new results. This procedure is not helpful, however, without a minimum of methodological support: raw data sets and software are difficult to exploit without the logic that guided their use or their production. This led us to think that, in addition to the data sets and the software, one further element must be provided: the workflow that ties them all together.
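
As a toy illustration of such a workflow (not the journal's actual submission format), a single entry-point script can record the logic that connects raw data, software and published figures; the step names, scripts and paths below are placeholders.

```python
# Toy end-to-end workflow driver: a single script that records and re-runs the
# chain from raw data to published figures. Scripts and paths are illustrative
# placeholders (skipped if absent), not the journal's actual submission format.
import json
import pathlib
import subprocess

STEPS = [
    ("preprocess raw data", ["python", "preprocess.py", "data/raw.csv", "data/clean.csv"]),
    ("run simulation",      ["python", "simulate.py", "data/clean.csv", "results/output.json"]),
    ("produce figures",     ["python", "plot.py", "results/output.json", "figures/"]),
]

def run_workflow():
    pathlib.Path("results").mkdir(exist_ok=True)
    log = []
    for description, command in STEPS:
        if pathlib.Path(command[1]).exists():
            subprocess.run(command, check=True)   # fail loudly if a step breaks
            status = "ran"
        else:
            status = "skipped (placeholder script not present)"
        print(f"[workflow] {description}: {' '.join(command)} -> {status}")
        log.append({"step": description, "command": command, "status": status})
    # Record exactly what was executed, so reviewers can see the chain of steps.
    pathlib.Path("results/workflow_log.json").write_text(json.dumps(log, indent=2))

if __name__ == "__main__":
    run_workflow()
```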

Relevance: 30.00%

Abstract:

We measure the cross section and the difference in rapidities between photons and charged leptons for inclusive W -> lnu + gamma production in egamma and mugamma final states. Using data corresponding to an integrated luminosity of 4.2 fb-1 collected with the D0 detector at the Fermilab Tevatron Collider, the cross section multiplied by the branching fraction for the process ppbar -> Wgamma+X -> lnugamma+X, measured to be 15.8 +/- 0.8 (stat.) +/- 1.2 (syst.) pb, and the distribution of the charge-signed photon-lepton rapidity difference are both found to be in agreement with the standard model. These results provide the most stringent limits on anomalous WWgamma couplings from hadron collider data: -0.4 < Delta kappa_gamma < 0.4 and -0.08 < lambda_gamma < 0.07 at the 95% C.L.
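
The charge-signed rapidity difference referred to above is typically the lepton charge times the difference of the photon and lepton rapidities; a minimal sketch of that observable is given below, with event values invented for illustration.

```python
# Charge-signed photon-lepton rapidity difference, Q_l * (y_gamma - y_l),
# the observable in which the SM W+gamma radiation-amplitude zero shows up
# as a dip. Event values below are invented for illustration.

def charge_signed_dy(lepton_charge, y_photon, y_lepton):
    return lepton_charge * (y_photon - y_lepton)

events = [(+1, -0.2, 0.9), (-1, 0.5, -0.6), (+1, 1.1, 0.3)]  # (Q_l, y_gamma, y_l)
for q, y_g, y_l in events:
    print(f"Q_l*(y_gamma - y_l) = {charge_signed_dy(q, y_g, y_l):+.2f}")
```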

Relevance: 30.00%

Abstract:

This paper presents a measurement of the W+W- production cross section in pp collisions at sqrt(s) = 7 TeV. The leptonic decay channels are analyzed using data corresponding to an integrated luminosity of 4.6 fb^-1 collected with the ATLAS detector at the Large Hadron Collider. The W+W- production cross section sigma(pp -> W+W- + X) is measured to be 51.9 +/- 2.0 (stat) +/- 3.9 (syst) +/- 2.0 (lumi) pb, compatible with the Standard Model prediction of 44.7 +2.1 -1.9 pb. A measurement of the normalized fiducial cross section as a function of the leading-lepton transverse momentum is also presented. The reconstructed transverse momentum distribution of the leading lepton is used to extract limits on anomalous WWZ and WWgamma couplings.

Relevance: 30.00%

Abstract:

The epidemiology of an enrofloxacin-resistant Escherichia coli clone was investigated during two separate outbreaks of colibacillosis in Danish broiler production. In total, five flocks were reported as affected by the outbreaks, with recorded first-week mortalities in the range of 1.7-12.7%. The clone was first isolated from dead broilers and subsequently demonstrated in samples from the associated hatchers and from the parent flock and its embryonated eggs, suggesting vertical transmission from the parents. The second outbreak involved two broiler flocks unrelated to the flocks affected in the first outbreak; however, the clone could not be demonstrated in the associated parent flock. Furthermore, samples from grandparent flocks were negative for the outbreak clone. Clonality was evaluated by plasmid profiling and pulsed-field gel electrophoresis. None of the recognized virulence factors was demonstrated in the outbreak clone by microarray or PCR assay. The molecular background of the fluoroquinolone resistance was investigated, and point mutations in gyrA and parC leading to amino-acid substitutions in the quinolone resistance-determining regions of GyrA and ParC were demonstrated. Vertical transmission of enrofloxacin-resistant E. coli from healthy parents resulting in high first-week mortality in the offspring illustrates the potential for the emergence and spread of fluoroquinolone-resistant bacteria in animal husbandry, even when the use of fluoroquinolones is restricted.

Relevance: 30.00%

Abstract:

The demand for power generation from non-renewable resources, and the associated production costs, are increasing at an alarming rate. Solar energy is one of the renewable resources with the potential to minimize this increase. Utilization of solar energy has so far been concentrated mainly on heating applications. The use of solar energy in building cooling systems would greatly benefit the goal of minimizing non-renewable energy use. The approaches of solar energy heating system research conducted by institutions such as the University of Wisconsin at Madison, and of the building heat flow model research conducted by Oklahoma State University, can be used to develop and optimize solar cooling building systems. This research uses these two approaches to develop Graphical User Interface (GUI) software for an integrated solar absorption cooling building model, capable of simulating and optimizing an absorption cooling system that uses solar energy as the main energy source to drive the cycle. The software was then put through a number of litmus tests to verify its integrity. The tests were conducted on various building cooling system data sets from similar applications around the world, and the outputs obtained from the software were identical to the established experimental results from those data sets. Software developed in other research efforts caters to advanced users; the software developed in this research is not only reliable in its code integrity but, through its integrated approach, is also suited to new users. Hence, this dissertation aims to correctly model a complete building with an absorption cooling system in an appropriate climate as a cost-effective alternative to conventional vapor compression systems.
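
As a sketch of the kind of energy balance such a tool evaluates, the snippet below computes the thermal COP of a single-effect absorption chiller from its component heat flows, using the standard relation COP ≈ Q_evaporator / (Q_generator + W_pump); all values are invented and this is not the model implemented in the software described above.

```python
# First-law check of a single-effect absorption cooling cycle.
# COP_thermal = Q_evaporator / (Q_generator + W_pump); the heat rejected at the
# condenser and absorber must balance the inputs. All numbers are invented.

def absorption_cop(q_evap_kw, q_gen_kw, w_pump_kw=0.0):
    return q_evap_kw / (q_gen_kw + w_pump_kw)

def energy_balance_residual(q_evap_kw, q_gen_kw, w_pump_kw, q_cond_kw, q_abs_kw):
    """Energy in (evaporator + generator + pump) minus energy out (condenser + absorber)."""
    return (q_evap_kw + q_gen_kw + w_pump_kw) - (q_cond_kw + q_abs_kw)

if __name__ == "__main__":
    q_evap, q_gen, w_pump = 10.0, 14.0, 0.1   # kW, e.g. solar-heated generator
    q_cond, q_abs = 10.8, 13.3                # kW rejected to the environment
    print(f"COP = {absorption_cop(q_evap, q_gen, w_pump):.2f}")
    print(f"balance residual = "
          f"{energy_balance_residual(q_evap, q_gen, w_pump, q_cond, q_abs):.2f} kW")
```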