Abstract:
The thesis analyses the role of collective action as a viable alternative to traditional forms of intervention in agriculture for encouraging the provision of agri-environmental public goods. What are the main benefits of collective action, in terms of effectiveness and efficiency, compared to traditional market or public intervention policies? What are the drivers that encourage farmers to participate in collective action? To what extent is it possible to incorporate collective aspects into policies aimed at providing agri-environmental public goods? To address these research questions, the thesis is articulated on two levels: a theoretical analysis of the role of collective action in the provision of public goods, and a specific investigation of two local initiatives where a collective approach to the management of agri-environmental resources was successfully implemented. The first case study concerns a project named “Custodians of the Territory”, developed by the Tuscan local agency “Comunità Montana Media Valle del Serchio”, which reached an agreement with local farmers for the collective provision of environmental services related to the hydro-geological management of the district. The second case study concerns the territorial agri-environmental agreement trialled in Valdaso (Marche), where local farmers collectively adopted integrated pest management practices with the aim of reducing the environmental impact of their farming. The analysis of these initiatives, carried out through participatory methods (Rapid Rural Appraisal), allowed a theoretical discussion to be developed on the role of innovative tools (such as co-production and co-management) in the provision of agri-environmental public goods. The case studies also yielded recommendations on the government intervention and policies needed to promote successful collective action for the provision of agri-environmental public goods.
Abstract:
Research on the use of ICT in education (TICE) frequently concentrates on either the cognitive, the linguistic, or the cultural dimension alone. Most often, empirical studies set out to evaluate the direct effects of ICT on learners' language performance. Moreover, such research, especially in cognitive psychology, is usually carried out in the laboratory. This is why the work presented in this thesis places the use of ICT in an ecological perspective and proposes an integrated approach for analysing actual practices, in language teaching as well as in translation teaching. Regarding the cognitive aspects, we draw on a concept valued by practitioners: learning strategies. The first four chapters of this thesis are devoted to developing the theoretical framework in which our research is situated. We first address the disciplinary aspects, in particular the interdisciplinarity of our two fields of reference. We then deal with learning strategies and translation strategies. Third, we endeavour to define the two competences targeted by our research: written production and translation. Fourth, we consider the changes that ICT has introduced into the teaching and learning practices of these two competences. The fifth chapter presents and analyses the data collected from groups of teachers and students of the French section of the SSLMIT. We first present our corpus, then proceed to the analysis of the data; finally, after an overall synthesis, we outline didactic and scientific avenues for extending this work.
Abstract:
This thesis presents a comparative developmental study of inflorescences, focusing on the production of the terminal flower (TF). Morphometric attributes of inflorescence meristems (IMs) were obtained throughout the ontogeny of inflorescence buds with the aim of describing possible spatial constraints that could explain the failure to develop a TF. The study documents the inflorescence ontogeny of 20 species from five families of the eudicots (Berberidaceae, Papaveraceae-Fumarioideae, Rosaceae, Campanulaceae and Apiaceae), in which 745 buds of open (i.e. without TF) and closed (i.e. with TF) inflorescences were observed under the scanning electron microscope.

The study shows that TFs appear on IMs that are 2.75 (se = 0.38) times larger than the youngest lateral reproductive primordium. The shape of these IMs is characterised by a leaf arc (phyllotactic attribute) of 91.84° (se = 7.32) and a meristematic elevation of 27.93° (se = 5.42). IMs of open inflorescences show a significantly lower relative surface, averaging 1.09 (se = 0.26) times the size of the youngest primordium, which suggests their incapacity to produce TFs. The relatively lower size of open IMs is either a condition throughout the complete ontogeny (‘open I’) or the result of a drastic reduction of the meristematic surface after flower segregation (‘open II’).

It is concluded that a suitable bulge configuration of the IM is a prerequisite for TF formation. Observations in the TF-facultative species Daucus carota support this view, as the absence of the TF in certain umbellets is correlated with a reduction of their IM dimensions. A review of the literature on the histological development of IMs and the genetic regulation of inflorescences suggests that in ‘open I’ inflorescences the histological composition and molecular activity at the tip of the IM could impede TF differentiation. On the other hand, in ‘open II’ inflorescences the small final IM bulge could represent a spatial constraint that hinders TF differentiation. The existence of two distinct kinds of ontogeny of open inflorescences suggests two ways in which the loss of the TF could have occurred in the course of evolution.
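The quantitative criterion above, that TF formation requires an IM that is sufficiently large relative to the youngest lateral primordium, can be illustrated with the mean ratios reported in the abstract. The nearest-mean classification rule below is an invented illustration, not the thesis's statistical procedure:

```python
def classify_im(im_area, primordium_area, closed_mean=2.75, open_mean=1.09):
    """Classify an inflorescence meristem (IM) by its surface relative to
    the youngest lateral primordium, using the mean ratios reported in
    the abstract (2.75 for TF-forming 'closed', 1.09 for 'open' IMs).
    The nearest-mean rule is for illustration only."""
    ratio = im_area / primordium_area
    if abs(ratio - closed_mean) < abs(ratio - open_mean):
        label = "closed (TF expected)"
    else:
        label = "open (no TF)"
    return ratio, label

# hypothetical areas in arbitrary units
ratio, label = classify_im(300.0, 100.0)
print(round(ratio, 2), label)  # 3.0 closed (TF expected)
```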
Abstract:
The production of the Z boson in proton-proton collisions at the LHC serves as a standard candle for the ATLAS experiment during early data-taking. The decay of the Z into an electron-positron pair gives a clean signature in the detector that allows for calibration and performance studies. The cross-section of ~1 nb allows first LHC measurements of parton density functions. In this thesis, simulations of 10 TeV collisions at the ATLAS detector are studied. The challenges for an experimental measurement of the cross-section with an integrated luminosity of 100 pb−1 are discussed. In preparation for the cross-section determination, the single-electron efficiencies are determined via a simulation-based method and in a test of a data-driven ansatz. The two methods are in very good agreement, differing by at most ~3%. The ingredients of an inclusive and a differential Z production cross-section measurement at ATLAS are discussed, and their possible contributions to systematic uncertainties are presented. For a combined sample of signal and background, the expected uncertainty on the inclusive cross-section for an integrated luminosity of 100 pb−1 is determined to be ± 1.5% (stat) ± 4.2% (syst) ± 10% (lumi). The possibilities for single-differential cross-section measurements in the rapidity and transverse momentum of the Z boson are outlined; these are important quantities because of their impact on parton density functions and their capability to probe non-perturbative effects in pQCD. The issues of an efficiency correction based on electron efficiencies as a function of the electron's transverse momentum and pseudorapidity are studied. A possible alternative is demonstrated by expanding the two-dimensional efficiencies with the additional dimension of the invariant mass of the two leptons from the Z decay.
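An inclusive cross-section extraction of the kind described above boils down to correcting a background-subtracted event count by efficiency and integrated luminosity. A minimal sketch; all numbers are invented placeholders, not results from the thesis:

```python
def cross_section(n_obs, n_bkg, efficiency, lumi):
    """Inclusive cross section: sigma = (N_obs - N_bkg) / (eff * L).

    n_obs, n_bkg : observed and estimated background event counts
    efficiency   : total selection efficiency times acceptance
    lumi         : integrated luminosity in pb^-1 -> sigma in pb
    """
    return (n_obs - n_bkg) / (efficiency * lumi)

# Illustrative placeholder numbers: 25,000 selected events,
# 2,000 estimated background, 25% total efficiency, 100 pb^-1
sigma = cross_section(25_000, 2_000, 0.25, 100.0)
print(sigma)  # 920.0 (pb)
```

The statistical, systematic, and luminosity uncertainties quoted in the abstract would then be propagated onto each input of this formula separately.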
Abstract:
The benthic dinoflagellate O. ovata represents a serious threat to human health and to the ecology of its blooming areas: owing to its toxicity, this microalga has been responsible for several cases of human intoxication and for mass mortalities of benthic invertebrates. Despite the large number of studies on this dinoflagellate, the mechanisms underpinning O. ovata growth and toxin production are still far from fully understood. In this work we have enriched the dataset on this species by carrying out a new experiment on an Adriatic O. cf. ovata strain. Data from this experiment (named Beta) and from a comparable experiment previously conducted on the same strain (named Alpha) revealed some interesting aspects of this dinoflagellate: it is able to grow even under strong intracellular nutrient deficiency (C:P molar ratio > 400; C:N > 25), reaching extremely low values of the chlorophyll-a to carbon ratio (0.0004). A significant inverse relationship (|r| > 0.7) was also found between the cellular toxin-to-carbon and the cellular nutrient-to-carbon ratios of experiment Alpha. In the light of these results, we hypothesised that in O. cf. ovata nutrient-stress conditions (intended as intracellular nutrient deficiency) can cause: i) an increase in toxin production; ii) a strong decrease in chlorophyll-a synthesis; iii) a lowering of metabolism associated with the formation of a sort of resting stage. We then used a modelling approach to test and critically evaluate these hypotheses in a mechanistic way: a newly developed formulation describing toxin production and fate, together with ad hoc changes to the existing formulations describing chlorophyll synthesis, rest respiration, and mortality, was incorporated into a simplified version of the European Regional Seas Ecosystem Model (ERSEM), together with a new ad hoc parameterisation.
The adapted model was able to accurately reproduce many of the trends observed in the Alpha experiment, allowing us to support our hypotheses. The simulations of the Beta experiment, by contrast, were not fully satisfactory in quantitative terms. We explain this gap by the presumably different physiological behaviour of the algae in the two experiments, due to their different pre-experimental acclimation periods: the model was not able to reproduce acclimation processes in its simulations of the Beta experiment. We therefore attempted to simulate the acclimation of the algae to nutrient-stress conditions by manually adjusting some of the nutrient-stress threshold parameters, but obtained conflicting results. Further studies are required to shed light on this interesting aspect. In this work we also broadened the range of applicability of a state-of-the-art marine biogeochemical model (ERSEM) by implementing in it an ecologically relevant process, the production of toxic compounds.
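The first hypothesis, toxin production rising under intracellular nutrient deficiency, can be caricatured by a simple threshold rule of the kind a biogeochemical model might use. This is an invented toy formulation for illustration only, not the ERSEM equations developed in the thesis; the threshold reuses the C:N > 25 deficiency level quoted above, while `base_rate` and `boost` are made-up parameters:

```python
def toxin_rate(c_n_ratio, base_rate=1.0, stress_threshold=25.0, boost=3.0):
    """Toy rule: toxin production per unit carbon rises once the cellular
    C:N molar ratio exceeds a nutrient-stress threshold (25, the
    deficiency level mentioned in the abstract), saturating at `boost`
    times the baseline."""
    if c_n_ratio <= stress_threshold:
        return base_rate
    # linear increase with nutrient stress, capped at `boost` x baseline
    stress = min(1.0, (c_n_ratio - stress_threshold) / stress_threshold)
    return base_rate * (1.0 + (boost - 1.0) * stress)

print(toxin_rate(15.0))  # nutrient-replete cells: baseline 1.0
print(toxin_rate(50.0))  # strongly starved cells: saturated 3.0
```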
Abstract:
The Standard Model of elementary particle physics was developed to describe the fundamental particles which constitute matter and the interactions between them. The Large Hadron Collider (LHC) at CERN in Geneva was built to solve some of the remaining open questions in the Standard Model and to explore physics beyond it, by colliding two proton beams at world-record centre-of-mass energies. The ATLAS experiment is designed to reconstruct particles and their decay products originating from these collisions. The precise reconstruction of particle trajectories plays an important role in the identification of particle jets which originate from bottom quarks (b-tagging). This thesis describes the step-wise commissioning of the ATLAS track reconstruction and b-tagging software, and one of the first measurements of the b-jet production cross section in pp collisions at sqrt(s)=7 TeV with the ATLAS detector. The performance of the track reconstruction software was studied in great detail, first using data from cosmic-ray showers and then from collisions at sqrt(s)=900 GeV and 7 TeV. This good understanding of the track reconstruction software allowed a very early deployment of the b-tagging algorithms. First studies of these algorithms and the measurement of the b-tagging efficiency in data are presented; they agree well with predictions from Monte Carlo simulations. The b-jet production cross section was measured with the 2010 dataset recorded by the ATLAS detector, employing muons in jets to estimate the fraction of b-jets. The measurement is in good agreement with the Standard Model predictions.
Abstract:
Folates (vitamin B9) are essential water-soluble vitamins whose deficiency in humans may contribute to the onset of several diseases, such as anaemia, cancer, cardiovascular diseases, neurological problems, and defects in embryonic development. Humans and other mammals are unable to synthesise folate de novo and must obtain it from exogenous sources via intestinal absorption. Recently the gut microbiota has been identified as an important source of folates, and the selection and use of folate-producing microorganisms represents an innovative strategy to increase human folate levels. The aim of this thesis was to gain a fundamental understanding of folate metabolism in Bifidobacterium adolescentis. The work was subdivided into three main phases, each also aimed at solving different problems encountered when working with Bifidobacterium strains. First, a new identification method (based on PCR-RFLP of the hsp60 gene) was specifically developed to identify Bifidobacterium strains. Secondly, Bifidobacterium adolescentis biodiversity was explored in order to identify representative strains of this species to be screened for their folate production ability. The results showed that this species is characterised by wide variability and support the idea that a new taxonomic re-organisation may be required. Finally, B. adolescentis folate metabolism was studied using a twofold approach: a quantitative analysis of folate content was complemented by the examination of the expression levels of genes involved in folate-related pathways. For the normalisation step, required to increase the robustness of the qRT-PCR analysis, an appropriate set of reference genes was tested using two different algorithms. The results demonstrate that B. adolescentis strains may represent an endogenous source of natural folate and could be used to fortify fermented dairy products. This bio-fortification strategy presents many advantages for the consumer, providing native folate forms that are more bio-available and not implicated in the ongoing controversy concerning the safety of high intakes of synthetic folic acid.
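The reference-gene normalisation mentioned for the qRT-PCR analysis is conventionally a ΔΔCt-type calculation against a panel of reference genes. A minimal sketch, in which the gene identities, Ct values, and the assumption of 100% amplification efficiency (eff = 2.0) are all illustrative, not values from the thesis:

```python
def relative_expression(ct_target, ct_refs, ct_target_cal, ct_refs_cal, eff=2.0):
    """DeltaDeltaCt with multi-reference-gene normalisation.

    The target Ct is normalised against the arithmetic mean Ct of the
    reference genes (equivalent to the geometric mean of their expression
    levels), in both the sample and the calibrator condition.
    eff = 2.0 assumes perfect doubling per PCR cycle.
    """
    d_sample = ct_target - sum(ct_refs) / len(ct_refs)
    d_cal = ct_target_cal - sum(ct_refs_cal) / len(ct_refs_cal)
    return eff ** -(d_sample - d_cal)

# Invented example: a folate-pathway gene against two reference genes,
# comparing a test condition to a calibrator condition
fold = relative_expression(22.0, [18.0, 20.0], 24.0, [18.0, 20.0])
print(round(fold, 2))  # 4.0 -> four-fold up-regulation
```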
Abstract:
Precision Agriculture (PA) and the more specific branch of Precision Horticulture are two very promising sectors. They focus on the use of technologies in agriculture to optimise the use of inputs, so as to reach better efficiency and minimise the waste of resources. This important objective has motivated many researchers and companies to search for new technological solutions. Some of these efforts proved fruitful, others turned out to be unfeasible ideas; as a result PA, some 25 years after its birth, is still a “new” management approach, interesting for the future, yet experts and researchers still report a low adoption rate. This work aims to contribute to identifying the causes of this low adoption rate and to propose a methodological solution to the problem. The first step was to examine prior research on Precision Agriculture adoption from both an ex ante and an ex post perspective, on the assumption that it is important to find connections between these two phases of the purchase experience: the ex ante studies deal with potential consumers' perceptions before any usage experience, i.e. before purchasing a technology, while the ex post studies describe the drivers that made a farmer become an end-user of PA technology. An example of consumer research is then presented: an ex ante study focused on a pre-prototype technology for fruit production. This kind of research can give valuable information about consumer acceptance before an advanced development phase of the technology is reached, making it possible to change the design with the least financial impact. The final step was to develop the pre-prototype technology that was the subject of the consumer acceptance research and to test its technical characteristics.
Abstract:
The aim of this work was to identify markers associated with production traits in the pig genome using different approaches. We focused on the Italian Large White pig breed, using Genome Wide Association Studies (GWAS) and applying a selective genotyping approach to increase the power of the analyses. Furthermore, we searched the pig genome using Next Generation Sequencing (NGS) Ion Torrent technology, combining the selective genotyping approach with deep sequencing for SNP discovery. Two other studies were carried out with a different approach: allele frequency changes for SNPs affecting candidate genes, and at the genome-wide level, were analysed to identify selection signatures driven by the selection programme over the last 20 years. This approach confirmed that a great number of markers may affect production traits and that they are captured by classical selection programmes. GWAS revealed 123 significant or suggestively significant SNPs associated with back fat thickness and 229 associated with average daily gain. Sixteen copy number variant regions were found to be more frequent in lean or in fat pigs, showing that different copy numbers of those regions could have a limited impact on fatness; these regions often appear to be involved in food intake and behaviour, besides affecting genes involved in metabolic pathways and their expression. By combining NGS with the selective genotyping approach, new variants were discovered, of which at least 54 are worth analysing in association studies. The study of groups of pigs subjected to stringent selection showed that the allele frequencies of some loci can change drastically if they are linked to traits targeted by the selection schemes. These approaches could, in future, be integrated into genomic selection plans.
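The selective genotyping idea, genotyping only the phenotypic extremes and testing for allele-frequency shifts between the tails, can be sketched as a 2×2 contingency comparison. All counts below are invented for illustration; this is the generic test, not the thesis's analysis pipeline:

```python
def allele_freq_shift(high_tail, low_tail):
    """Compare allele counts between the two phenotypic tails of a
    selective-genotyping design.

    high_tail, low_tail: (count of allele A, count of allele a) in each
    tail. Returns the allele-A frequency in each tail and a 2x2
    chi-square statistic (1 d.o.f., no continuity correction).
    """
    a1, b1 = high_tail
    a2, b2 = low_tail
    n1, n2 = a1 + b1, a2 + b2
    total = n1 + n2
    observed = [a1, b1, a2, b2]
    expected = [n1 * (a1 + a2) / total, n1 * (b1 + b2) / total,
                n2 * (a1 + a2) / total, n2 * (b1 + b2) / total]
    chi2 = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
    return a1 / n1, a2 / n2, chi2

# Invented counts: 100 pigs (200 chromosomes) per phenotypic tail
p_fat, p_lean, chi2 = allele_freq_shift((120, 80), (80, 120))
print(p_fat, p_lean, chi2)  # 0.6 0.4 16.0
```

A chi-square of 16.0 with one degree of freedom would be highly significant, illustrating how contrasting only the extremes concentrates the statistical power.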
Abstract:
In hadronic collisions, the majority of events with large momentum transfer produce pairs of high-energy jets. Their production and properties can be predicted with high accuracy by perturbation theory in quantum chromodynamics (QCD). The production of \textit{bottom} quarks in such collisions can be used as a benchmark to test the predictions of QCD, since these quarks reflect the dynamics of the production process at scales where a perturbative calculation is possible without restrictions. Owing to the high mass of particles containing a \textit{bottom} quark, the measured hadronic final state retains most of the information about the production process of the quarks. Because of their large production rate, they and their decay products play an important role as background in many analyses, in particular in searches for new physics. Given their prominent position in the third quark generation, signs of new phenomena may show up more strongly for them than for the lighter quarks. The ratio of the production of jets containing such \textit{bottom} quarks, known as $b$-jets, to that of all detected jets is therefore an important indicator for new massive objects. In this thesis, the production rate and the correlations of pairs of $b$-jets are measured, and the invariant mass spectrum of the $b$-jets is searched for first hints of a new massive particle not contained in the Standard Model. At the Large Hadron Collider (LHC), two proton beams collide at a centre-of-mass energy of $\sqrt s = 7$ TeV, producing many such pairs of $b$-jets. This analysis uses the collisions recorded by the ATLAS detector, corresponding to a usable integrated luminosity of 34~pb$^{-1}$.

$b$-jets are identified by means of their long lifetime and their reconstructed charged decay products. For this analysis, the differences in behaviour between $b$-jets and jets originating from light objects such as gluons and light quarks must in particular be taken into account. The energy scale of the $b$-jets is studied, and the additional uncertainty in the jet energy measurement is determined. Detector effects in the jet reconstruction that are unique to $b$-jets are studied, so that the measurement can ultimately be unfolded to hadron level, independent of the detector. The measurement is then compared to next-to-leading-order predictions, which turn out to be in agreement with the recorded data. From this one can conclude that the underlying production mechanism remains valid in this newly accessible energy regime at the LHC. However, first hints of deficiencies in the description of the properties of these events are also found. Furthermore, no evidence for a new resonance decaying into pairs of $b$-jets is found in the invariant mass spectrum up to about 1.7~TeV. Model-independent limits are derived for the appearance of such a resonance with a Gaussian-shaped mass distribution.
Abstract:
Nuclear medicine is a modern and effective tool for the detection and treatment of oncological diseases. Molecular imaging based on the use of radiopharmaceuticals comprises single-photon emission computed tomography (SPECT) and positron emission tomography (PET) and enables the non-invasive visualisation of tumours at the nano- and picomolar level.

Currently, many new tracers for the more precise localisation of small tumours and metastases are being introduced and examined for their suitability. Most of them are protein-based biomolecules which nature itself produces as antigens for tumour cells. Antibodies and antibody fragments play an important role in tumour diagnostics and treatment; PET imaging with antibodies and antibody fragments is referred to as immuno-PET. An important aspect here is the need for radiopharmaceuticals whose half-life matches the half-life of the biomolecules.

Recent work has proposed 90Nb as a potential candidate for application in immuno-PET. Its half-life of 14.6 hours is suitable for use with antibody fragments and some intact antibodies. 90Nb has a relatively high positron branching ratio of 53% and an optimal β+ energy of 0.35 MeV, which enables both high imaging quality and a low administered activity of the radionuclide.

Initial fundamental investigations showed that 90Nb i) can be produced in sufficient quantity and purity by proton bombardment of a natural zirconium target, ii) can be isolated from the target material with adequate radiochemical purity, iii) can be used to label the monoclonal antibody (Rituximab), and iv) that this 90Nb-labelled mAb possesses high in vitro stability.

Furthermore, an alternative and fast separation method was developed which allows 90Nb to be purified within one hour with a radiochemical and radionuclidic purity suitable for the subsequent labelling of biomolecules. Finally, 90Nb-labelled biomolecules were investigated in vivo for the first time. Experiments were also carried out to find the optimal bifunctional chelator (BFC) for 90Nb: several BFCs were examined with respect to complex formation with Nb(V), and desferrioxamine (Df) proved to be the most suitable chelator for 90Nb. The monoclonal antibody bevacizumab (Avastin®) was labelled with 90Nb, and a biodistribution study and a PET scan were performed. All these results show that 90Nb is a promising radionuclide for immuno-PET, which even appears suitable for further commercial application in routine clinical use.
Abstract:
The thesis investigates the nucleon structure probed by the electromagnetic interaction. Among the most basic observables reflecting the electromagnetic structure of the nucleon are the form factors, which have been studied by means of elastic electron-proton scattering with ever-increasing precision for several decades. In the timelike region, corresponding to proton-antiproton annihilation into an electron-positron pair, the present experimental information is much less accurate; however, high-precision form factor measurements are planned for the near future. About 50 years after the first pioneering measurements of the electromagnetic form factors, polarization experiments stirred up the field, since the results were found to be in striking contradiction with the findings of previous form factor investigations from unpolarized measurements. Triggered by the conflicting results, a whole new field emerged, studying the influence of two-photon exchange corrections to elastic electron-proton scattering, which appeared to be the most likely explanation of the discrepancy. The main part of this thesis deals with theoretical studies of two-photon exchange, investigated particularly with regard to form factor measurements in the spacelike as well as the timelike region. An extraction of the two-photon amplitudes in the spacelike region through a combined analysis of unpolarized cross section measurements and polarization experiments is presented. Furthermore, predictions of the two-photon exchange effects on the e+p/e-p cross section ratio are given for several new experiments which are currently ongoing. The two-photon exchange corrections are also investigated in the timelike region in the process pbar{p} -> e+ e- by means of two factorization approaches. These corrections are found to be smaller than those obtained for the spacelike scattering process.
The influence of the two-photon exchange corrections on cross section measurements, as well as on asymmetries that allow direct access to the two-photon exchange contribution, is discussed. Furthermore, one of the factorization approaches is applied to investigate the two-boson exchange effects in parity-violating electron-proton scattering. In the last part of this work, the process pbar{p} -> pi0 e+e- is analyzed with the aim of determining the form factors in the so-called unphysical timelike region below the two-nucleon production threshold. For this purpose, a phenomenological model is used which provides a good description of the available data on the real photoproduction process pbar{p} -> pi0 gamma.
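The contradiction described above lies between the two standard ways of extracting the form-factor ratio; in the one-photon-exchange approximation (standard textbook notation, with $\tau = Q^2/4M^2$ and $\varepsilon$ the virtual-photon polarisation, not formulas specific to this thesis) the unpolarized reduced cross section and the polarization-transfer ratio read:

```latex
\sigma_{\mathrm{red}} \;=\; \tau\, G_M^2(Q^2) \;+\; \varepsilon\, G_E^2(Q^2),
\qquad
\frac{G_E}{G_M} \;=\; -\,\frac{P_t}{P_l}\,\frac{E+E'}{2M}\,\tan\frac{\theta_e}{2}.
```

Two-photon exchange modifies mainly the $\varepsilon$-dependence (the slope) of $\sigma_{\mathrm{red}}$, so the Rosenbluth separation of $G_E$ is affected far more strongly than the polarization ratio, which is why it is the leading candidate explanation of the discrepancy.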
Abstract:
Top quark studies play an important role in the physics programme of the Large Hadron Collider (LHC). The energy and luminosity reached allow the acquisition of a large amount of data, especially in kinematic regions never studied before. This thesis presents the measurement of the ttbar production differential cross section with data collected by ATLAS in 2012 in proton-proton collisions at \sqrt{s} = 8 TeV, corresponding to an integrated luminosity of 20.3 fb^{−1}. The measurement is performed for ttbar events in the semileptonic channel where the hadronically decaying top quark has a transverse momentum above 300 GeV. The hadronic top quark decay is reconstructed as a single large-radius jet and identified using jet substructure properties. The final differential cross section has been compared with several theoretical distributions, revealing a discrepancy of about 25% between data and predictions, depending on the MC generator. Furthermore, the kinematic distributions of the ttbar production process are very sensitive to the choice of the parton distribution function (PDF) set used in the simulations and could provide constraints on the gluon PDF. In particular, this thesis performs a systematic study of the proton PDFs, varying several PDF sets and checking which one best describes the experimental distributions. The boosted techniques applied in this measurement will be fundamental in the next data-taking at \sqrt{s}=13 TeV, when a large number of heavy particles with high momentum will be produced.
Abstract:
Volatile amines are prominent indicators of food freshness, as they are produced during many microbiological food degradation processes. Monitoring and indicating the volatile amine concentration within the food package by intelligent packaging solutions might therefore be a simple yet powerful way to control food safety throughout the distribution chain.

In this context, this work aims at the formation of colourimetric amine-sensing surfaces on different substrates, especially transparent PET packaging foil. The colour change of the deposited layers should ideally be discernible by the human eye to facilitate the determination by the end-user.

Different tailored zinc(II) and chromium(III) metalloporphyrins have been used as chromophores for the colourimetric detection of volatile amines. A new concept to increase the porphyrins' absorbance change upon exposure to amines is introduced. Moreover, the novel porphyrins' processability during the deposition process is increased by their enhanced solubility in non-polar solvents.

The porphyrin chromophores have successfully been incorporated into polysiloxane matrices on different substrates via dielectric-barrier-discharge-enhanced chemical vapour deposition. This process allows the use of nitrogen as a cheap and abundant plasma gas, produces only minor amounts of waste and by-products, and can easily be introduced into (existing) roll-to-roll production lines. The resulting hybrid sensing layers tightly incorporate the porphyrins and moreover form a porous structure that facilitates the amines' diffusion to, and interaction with, the chromophores.

The work is completed with a thorough analysis of the porphyrins' amine-sensing performance in solution as well as in the hybrid coatings. To reveal the underlying interaction mechanisms, the experimental results are supported by DFT calculations. The deposited layers could be used for the detection of NEt3 concentrations below 10 ppm in the gas phase. Moreover, the coated foils have been tested in preliminary food-storage experiments.

The mechanistic investigations on the interaction of amines with chromium(III) porphyrins revealed a novel pathway to the formation of chromium(IV) oxido porphyrins. This has been used for electrochemical epoxidation reactions with dioxygen as the formal terminal oxidant.
Abstract:
In this thesis, we develop high precision tools for the simulation of slepton pair production processes at hadron colliders and apply them to phenomenological studies at the LHC. Our approach is based on the POWHEG method for the matching of next-to-leading order results in perturbation theory to parton showers. We calculate matrix elements for slepton pair production and for the production of a slepton pair in association with a jet perturbatively at next-to-leading order in supersymmetric quantum chromodynamics. Both processes are subsequently implemented in the POWHEG BOX, a publicly available software tool that contains general parts of the POWHEG matching scheme. We investigate phenomenological consequences of our calculations in several setups that respect experimental exclusion limits for supersymmetric particles and provide precise predictions for slepton signatures at the LHC. The inclusion of QCD emissions in the partonic matrix elements allows for an accurate description of hard jets. Interfacing our codes to the multi-purpose Monte-Carlo event generator PYTHIA, we simulate parton showers and slepton decays in fully exclusive events. Advanced kinematical variables and specific search strategies are examined as means for slepton discovery in experimentally challenging setups.