268 results for Weiche Materie
Abstract:
The Standard Model of particle physics, which describes three of the four fundamental interactions, has so far agreed very well with the measurements of the experiments at CERN, Fermilab and other research facilities. However, not all questions of particle physics can be answered within this model. For example, the fourth fundamental force, gravitation, cannot be incorporated into the Standard Model. Moreover, the Standard Model offers no candidate for dark matter, which according to cosmological measurements makes up about 25% of our universe. Supersymmetry, which introduces a symmetry between fermions and bosons, is regarded as one of the most promising solutions to these open questions. This model gives rise to so-called supersymmetric particles, each of which has a Standard Model particle as its partner. If supersymmetry is realized in nature, one possible model of this symmetry is the R-parity-conserving mSUGRA model. In this model the lightest supersymmetric particle (LSP) is neutral and weakly interacting, so it cannot be detected directly in the detector; instead it must be detected indirectly via the energy carried away by the LSP, the missing transverse energy (etmiss).

The ATLAS experiment at the pp collider LHC will begin the search for new physics in 2010 at a center-of-mass energy of sqrt(s)=7-10 TeV and a luminosity of 10^32 cm^-2 s^-1. Because of the very high data rate, resulting from the roughly 10^8 readout channels of the ATLAS detector at a bunch-crossing rate of 40 MHz, a trigger system is required to reduce the volume of data to be stored. A compromise must be struck between the available trigger rate and a very high trigger efficiency for the interesting events, since only about one in 10^8 events is of interest for the search for new physics. To meet these requirements, the experiment uses a three-level trigger system, in which by far the largest data reduction takes place at the first trigger level.

Within this thesis, on the one hand, a substantial contribution has been made to the fundamental understanding of the properties of the missing transverse energy at the first trigger level. On the other hand, methods are presented with which the etmiss trigger efficiency for Standard Model processes and possible mSUGRA scenarios can be determined from data. For the optimization of the etmiss trigger thresholds at the first trigger level, the trigger rate at a luminosity of 10^33 cm^-2 s^-1 was fixed at 100 Hz. The trigger optimization required various simulations into which my own development work was incorporated. Using these simulations and the optimization algorithms developed here, it is shown that, despite the low trigger rate, the discovery potential (for a signal significance of at least 5 sigma) is increased by up to 66% relative to the existing ATLAS level-1 trigger menu by combining the etmiss threshold with lepton or jet trigger thresholds.
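The signal significance quoted above is not defined in the abstract; a common simplified estimator for a counting experiment, given here only as context (the thesis may use a different definition), is

\[
Z \approx \frac{S}{\sqrt{B}}, \qquad S = \varepsilon_{\mathrm{trig}}\,\sigma_{\mathrm{sig}}\,L_{\mathrm{int}}, \qquad B = \varepsilon_{\mathrm{trig}}^{\mathrm{bkg}}\,\sigma_{\mathrm{bkg}}\,L_{\mathrm{int}},
\]

where $S$ and $B$ are the expected signal and background event counts after the trigger selection, $\varepsilon$ the corresponding trigger efficiencies, $\sigma$ the cross sections and $L_{\mathrm{int}}$ the integrated luminosity; a 5 sigma discovery then corresponds to $Z \geq 5$.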
A new double laser pulse pumping scheme for transient collisionally excited plasma soft X-ray lasers
Abstract:
Within this thesis a new double laser pulse pumping scheme for plasma-based, transient collisionally excited soft x-ray lasers (SXRL) was developed, characterized and utilized for applications. SXRL operation from ~50 up to ~200 electron volts was demonstrated applying this concept. As a central technical tool, a special Mach-Zehnder interferometer in the chirped pulse amplification (CPA) laser front-end was developed for the generation of fully controllable double pulses to optimally pump SXRLs. This Mach-Zehnder device enables the creation of two CPA pulses of different pulse duration and variable energy balance with an adjustable time delay. Besides SXRL pumping, the double-pulse configuration was applied to determine the B-integral in the CPA laser system by amplifying a short pulse replica in the system, followed by an analysis in the time domain. The measurement of B-integral values in the 0.1 to 1.5 radian range, limited only by the reachable laser parameters, proved to be a promising tool to characterize nonlinear effects in CPA laser systems.

Regarding SXRL pumping, the double pulse was configured to optimally produce the gain medium for SXRL amplification. The focusing geometry of the two collinear pulses under the same grazing incidence angle on the target significantly improved the generation of the active plasma medium. On the one hand the effect was induced by the intrinsically guaranteed exact overlap of the two pulses on the target, and on the other hand by the grazing-incidence pre-pulse plasma generation, which allows SXRL operation at higher electron densities, enabling higher gain in longer-wavelength SXRLs and higher efficiency in shorter-wavelength SXRLs. The observed gain enhancement was confirmed by plasma hydrodynamic simulations.

The first implementation of double short-pulse single-beam grazing incidence pumping for SXRLs below 20 nanometers at the laser facility PHELIX in Darmstadt (Germany) resulted in reliable operation of a nickel-like palladium SXRL at 14.7 nanometers with a pump energy threshold strongly reduced to less than 500 millijoules. With the adaptation of the concept, namely double-pulse single-beam grazing incidence pumping (DGRIP), and the transfer of this technology to the laser facility LASERIX in Palaiseau (France), improved efficiency and stability of table-top high-repetition-rate soft x-ray lasers in the wavelength region below 20 nanometers was demonstrated. With a total pump laser energy below 1 joule on the target, 2 microjoules of nickel-like molybdenum soft x-ray laser emission at 18.9 nanometers was obtained at 10 hertz repetition rate, proving the attractiveness for high-average-power operation. An easy and rapid alignment procedure fulfilled the requirements for a sophisticated installation, and the highly stable output satisfied the need for a reliable, strong SXRL source. The qualities of the DGRIP scheme were confirmed in an irradiation campaign on user samples with over 50,000 shots, corresponding to a deposited energy of ~50 millijoules.

The generation of double pulses with high energies up to ~120 joules enabled the transfer to shorter-wavelength SXRL operation at the laser facility PHELIX. The application of DGRIP proved to be a simple and efficient method for the generation of soft x-ray lasers below 10 nanometers.
Nickel-like samarium soft x-ray lasing at 7.3 nanometers was achieved at a low total pump energy threshold of 36 joules, which confirmed the suitability of the applied pumping scheme. Thanks to the single-beam pumping geometry, reliable and stable SXRL operation was demonstrated despite the large optical apertures. The soft x-ray lasing of nickel-like samarium was an important milestone for the feasibility of applying the pumping scheme also at higher pump pulse energies, which are necessary to reach soft x-ray laser wavelengths in the water window. The reduction of the total pump energy to below 40 joules for short-wavelength lasing at 7.3 nanometers now fulfills the requirements for installation at the high-repetition-rate laser facility LASERIX.
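For reference, the B-integral measured with the double-pulse replica technique described in this abstract is conventionally defined as the accumulated nonlinear phase along the beam path (standard definition, given here as context rather than quoted from the thesis):

\[
B = \frac{2\pi}{\lambda} \int n_2\, I(z)\, \mathrm{d}z ,
\]

where $\lambda$ is the laser wavelength, $n_2$ the nonlinear refractive index of the amplifier material and $I(z)$ the intensity along the propagation axis; the quoted 0.1 to 1.5 radian values are therefore accumulated nonlinear phase shifts.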
Abstract:
The present work examines the aesthetics of form in the artistic thought of modernity. Significantly, certain aesthetic conceptions of anthropology usually arise in connection with the aesthetics of form. In the absence of suitable definitional approaches, form is considered, free of preconceptions, as a dynamic construct. Accordingly, the view of the anthropological concepts in the texts examined also changes. For all of the authors treated it becomes clear that the loss of absolute values, which becomes ever more differentiated in modernity, turns into the foundation of their view of art. The human being is exposed to a certain tragedy: henceforth the creator of his own reality, yet always bound to matter. This leads to a contradictoriness of human existence for which a 'pure identity' is unattainable. Liberation from this human inner division can seemingly be attained through aesthetic contemplation. Artistic form-giving relieves the burden of contradictory human reality, and art becomes the ideal of freedom. By means of creative power, the human being can shape and determine an image of his own humanity. Artistic form is accorded the status of a myth. Such a view of art harbors dangers, for with its elevation to the rank of a myth it becomes an absolute model for explaining the world, or even an ideology. In fact, however, any ethical grounding in lived reality is lacking. The selected authors illuminate this discourse and its inherent dangers in different ways: Friedrich Schiller, the thinkers of early Romanticism, Arthur Schopenhauer, Friedrich Nietzsche, Gottfried Benn, Thomas Mann, Theodor W. Adorno, Elias Canetti and the thinkers of postmodernism. The subversive influence of the various conceptions of form becomes apparent in more than one author. The question is therefore how an ideological appropriation of these constructs of ideas can be prevented. This leads to a redefinition of the aesthetics of form. The new fixed point must be the reality of thought; that is, reality and idea must not be viewed in contrast but in relation, for it is ultimately this relation that seems to determine the human being. The distance between aesthetics and reality must be questioned consciously and critically in order to lead to an 'aesthetics of the negative'.
Abstract:
The dissertation concerned the sustainability analysis of an agronomic system for producing vegetable oil for energy purposes on land rendered marginal by nematode infestation. The process investigated involved green manuring with a crop with biofumigant properties (a brassica) grown before the oilseed species (soybean and tobacco) in order to counteract the spread of the infestation in the soil. Through a life cycle assessment (LCA), this agronomic system was compared with a scenario in which the same oilseed species is grown without a preceding brassica crop but with the use of 1,3-dichloropropene as the nematode control measure. In order to complement the LCA with an assessment of the land use impact generated by the two scenarios, two models were built in the software to calculate the Soil Conditioning Index (SCI), a qualitative-quantitative indicator of soil quality defined by the United States Department of Agriculture (USDA).
Abstract:
In the course of my research work I focused on identifying strategies that allow resources to be saved at the building level and on developing a method for the environmental assessment of such strategies. The underlying conviction is that we must move beyond an anthropocentric vision in which everything around us is merchandise and material at man's disposal, and enter a new era of balance between the Earth's resources and the activities man carries out on the planet. I therefore addressed the theme of responsible building by exploring construction with straw bales and earth. I am convinced that industrial construction has a very short future ahead of it and will inevitably give way to non-conventional techniques involving materials that are easy to source and install. I am also convinced that the use of natural materials alone is no guarantee of reduced damage to the ecosystem. At the same time, I believe that mere energy certification is not synonymous with sustainability. For this reason I evaluated non-conventional technologies with an LCA (Life Cycle Assessment) approach, examining the impacts related to production, the transport of the materials, the type of installation, and their possible end-of-life scenarios. I also examined the IMPACT damage-calculation method, identifying a shortcoming in the system, which does not include a damage category related to changes in the hydrogeological conditions of the soil. The research was carried out through practical and experimental activities on non-conventional construction sites and through research and study on LCA at ENEA in Bologna (Ing. Paolo Neri).
Abstract:
The ability of block copolymers to spontaneously self-assemble into a variety of ordered nano-structures not only makes them a scientifically interesting system for the investigation of order-disorder phase transitions, but also offers a wide range of nano-technological applications. The architecture of a diblock is the simplest among the block copolymer systems, hence it is often used as a model system in both experiment and theory. We introduce a new soft-tetramer model for efficient computer simulations of diblock copolymer melts. The instantaneous non-spherical shape of polymer chains in the molten state is incorporated by modeling each of the two blocks as two soft spheres. The interactions between the spheres are modeled in such a way that the diblock melt tends to microphase separate with decreasing temperature. Using Monte Carlo simulations, we determine the equilibrium structures at variable values of the two relevant control parameters, the diblock composition and the incompatibility of unlike components. The simplicity of the model allows us to scan the control parameter space with a completeness that has not been reached in previous molecular simulations. The resulting phase diagram shows clear similarities with the phase diagram found in experiments. Moreover, we show that structural details of block copolymer chains can be reproduced by our simple model. We develop a novel method for the identification of the observed diblock copolymer mesophases that formalizes the usual approach of direct visual observation, using the characteristic geometry of the structures. A cluster analysis algorithm is used to determine clusters of each component of the diblock, and the number and shape of the clusters can be used to determine the mesophase. We also employ methods from integral geometry for the identification of mesophases and compare their usefulness to the cluster analysis approach. To probe the properties of our model in confinement, we perform molecular dynamics simulations of atomistic polyethylene melts confined between graphite surfaces. The results from these simulations are used as input for an iterative coarse-graining procedure that yields a surface interaction potential for the soft-tetramer model. Using the interaction potential derived in this way, we perform an initial study of the behavior of the soft-tetramer model in confinement. Comparing with experimental studies, we find that our model can reflect basic features of confined diblock copolymer melts.
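The cluster-based mesophase identification is only sketched in the abstract; the toy example below illustrates the general idea on a density grid of one block, using connected-component labeling. All function names, thresholds and the crude classification heuristics are illustrative assumptions, not the algorithm actually used in the thesis.

import numpy as np
from scipy import ndimage

def classify_mesophase(density, threshold=0.5):
    """Toy mesophase classifier based on counting clusters of one block.

    density   : 3D array with the local volume fraction of the minority block
    threshold : volume fraction above which a voxel is assigned to that block
    """
    occupied = density > threshold
    # scipy's label() assumes open boundaries; a production version would
    # handle the periodic simulation box explicitly
    labels, n_clusters = ndimage.label(occupied)
    if n_clusters == 0:
        return "disordered/empty"
    sizes = ndimage.sum(occupied, labels, index=range(1, n_clusters + 1))
    # crude heuristics: many small compact clusters suggest spheres, several
    # elongated clusters suggest cylinders, a single spanning cluster suggests
    # lamellar or bicontinuous order (shape analysis would distinguish them)
    if n_clusters > 20 and sizes.max() < 0.1 * occupied.sum():
        return "sphere-like"
    if n_clusters > 1:
        return "cylinder-like"
    return "lamellar or bicontinuous (requires shape analysis)"

A usage example would pass a binned bead-density field from one Monte Carlo snapshot and compare the resulting label across snapshots and parameter values.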
Abstract:
In the field of organic optoelectronics, the nanoscale structure of the materials has a huge impact on the device performance. Here, scanning force microscopy (SFM) techniques become increasingly important. In addition to topographic information, various surface properties can be recorded on a nanometer length scale, such as electrical conductivity (conductive scanning force microscopy, C-SFM) and surface potential (Kelvin probe force microscopy, KPFM).

In the context of this work, the electrical SFM modes were applied to study the interplay between morphology and electrical properties in hybrid optoelectronic structures, developed in the group of Prof. J. Gutmann (MPI-P Mainz). In particular, I investigated the working principle of a novel integrated electron blocking layer system. A structure of electrically conducting pathways along crystalline TiO2 particles in an insulating matrix of a polymer-derived ceramic was found, and insulating defect structures could be identified. In order to get insights into the internal structure of a device, I investigated a working hybrid solar cell by preparing a cross cut with focused ion beam polishing. With C-SFM, the functional layers could be identified and the charge transport properties of the novel active layer composite material could be studied.

In C-SFM, soft surfaces can be permanently damaged by (i) tip-induced forces, (ii) high electric fields and (iii) high current densities close to the SFM tip. Thus, an alternative operation based on torsion mode topography imaging in combination with current mapping was introduced. In torsion mode, the SFM tip vibrates laterally and in close proximity to the sample surface. Thus, an electrical contact between tip and sample can be established. In a series of reference experiments on standard surfaces, the working mechanism of scanning conductive torsion mode microscopy (SCTMM) was investigated. Moreover, I studied samples covered with free-standing semiconducting polymer nano-pillars that were developed in the group of Dr. P. Theato (University Mainz). The application of SCTMM allowed non-destructive imaging of the flexible surface at high resolution while measuring the conductance on individual pillars.

In order to study light-induced electrical effects on the level of single nanostructures, a new SFM setup was built. It is equipped with laser sample illumination and placed in inert atmosphere. With this photoelectric SFM, I investigated the light-induced response in functionalized nanorods that were developed in the group of Prof. R. Zentel (University Mainz). A block copolymer containing an anchor block and dye moiety and a semiconducting conjugated polymer moiety was synthesized and covalently bound to ZnO nanorods. This system forms an electron donor/acceptor interface and can thus be seen as a model system of a solar cell on the nanoscale. With a KPFM study on the illuminated samples, the light-induced charge separation between the nanorod and the polymeric corona could not only be visualized, but also quantified.

The results demonstrate that electrical scanning force microscopy can study fundamental processes in nanostructures and give invaluable feedback to the synthetic chemists for the optimization of functional nanomaterials.
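As context for the KPFM quantification mentioned above (textbook relation, not taken from the thesis): KPFM measures the contact potential difference between tip and sample,

\[
e\,V_{\mathrm{CPD}} = \phi_{\mathrm{tip}} - \phi_{\mathrm{sample}} ,
\]

so that light-induced charging of a nanostructure appears as a shift of the local surface potential $V_{\mathrm{CPD}}$ under illumination, which can be related to a charge estimate once the tip-sample geometry is modeled.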
Abstract:
After analyzing conflict, its functions and the ways it can be managed, the author first considers the various types of mediation and then focuses on civil and commercial mediation, highlighting the data available since the entry into force, at the end of 2013, of the mandatory mediation attempt as a condition of admissibility of judicial claims in civil matters.
Abstract:
Using molecular dynamics simulations, we investigate brush-like systems under good solvent conditions. Thanks to their versatile properties, which depend on molecular parameters and external conditions, these systems are important for many industrial applications. Polymer brushes are believed to play a decisive role in nature because of their unique lubrication properties. A coarse-grained model is used to investigate the structural and dynamical properties of two highly compressed polymer brushes that exhibit low friction. However, the lubrication properties of these systems, which are present in many biological settings, can be affected by embedded inclusions. We investigate so-called "soft colloids" embedded between the two polymer brushes and how these macro-objects act on the brushes.

Non-equilibrium molecular dynamics simulations are performed in which hydrodynamic interactions are taken into account by applying the DPD thermostat with explicit solvent molecules. We show that knowledge of the equilibrium properties of the system makes it possible to predict non-equilibrium dynamic properties of the bilayer.

We investigate how the effective interaction between colloidal inclusions is influenced by the presence of the brushes, as a function of the softness of the colloids and the grafting density of the brushes. As a next step, we investigate the rheological response of such complex bilayers to shear. We develop a scaling theory that predicts the dependence of the macroscopic transport properties and of the lateral extension of the grafted chains on the Weissenberg number above the regime in which linear response theory holds. The predictions of the theory agree well with our own and previous numerical results and with new experiments. Our theory offers the possibility of calculating the relaxation time of the bilayer. When this time is combined with a characteristic length scale, the "transient" (non-stationary) behavior can also be described.

We investigate the response of the pressure tensor and the deformation of the brushes during shear inversion at large Weissenberg numbers. We develop a prediction for the characteristic time after which the system reaches the steady state again.

Electrostatics plays an important role in many biological processes. The lubrication properties of polymer brushes are strongly influenced by the presence of long-range interactions. For different strengths of the electrostatic interactions, we investigate the rheological properties of the bilayer and compare with neutral systems. We study the continuous crossover of the system properties from neutral to strongly charged brushes by varying the Bjerrum length and the charge density.
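The Weissenberg number referred to above is, in its standard definition (the specific choice of relaxation time used in the thesis is an assumption here),

\[
\mathrm{Wi} = \dot{\gamma}\,\tau ,
\]

where $\dot{\gamma}$ is the applied shear rate and $\tau$ a characteristic relaxation time of the grafted chains; $\mathrm{Wi}\ll 1$ corresponds to the linear-response regime, while the scaling theory summarized above addresses $\mathrm{Wi}\gg 1$.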
Abstract:
Green chemistry can be defined as the application of the fundamental principles of sustainable development in order to minimize the use or formation of hazardous substances in the design, production and application of chemical products. It is in this context that the LCA (Life Cycle Assessment) methodology comes in, as a tool for analysis and evaluation. The present thesis work was carried out with the intention of offering an assessment of the environmental impacts associated with chemical processes of industrial interest from a life cycle perspective. In particular, the acrolein production process was studied by comparing two alternative synthesis routes: the traditional route, which uses propylene as raw material, and the alternative from glycerol obtained as a renewable by-product of industrial processes. The work is organized on two levels of study: a first, partial one, in which only the acrolein production process is examined, without considering the upstream stages for obtaining the starting raw materials; and a second, more detailed one, in which the system boundaries are extended to the entire production cycle. The results were compared and interpreted through three types of analysis: damage assessment, contribution analysis and uncertainty analysis. From the comparison between the two partial acrolein production scenarios, the glycerol-based process emerges as having globally higher impacts than the traditional one. This trend is attributable to the different energy and mass consumption of the process for obtaining acrolein. Subsequently, to obtain a complete picture of each scenario, the analysis was extended to include the upstream production stages of the two raw materials. From this comparison, the most impactful scenario turns out to be the production of acrolein from glycerol obtained by the trans-esterification of rapeseed oil. In contrast, the scenario using glycerol produced as a waste of tallow processing appears to be the model with the greatest environmental advantages. In order to identify the process stages with the greatest influence on the total burden and hence on the various midpoint impact categories, a contribution analysis was carried out by breaking each scenario down into its constituent sub-processes. Finally, an uncertainty analysis was performed using the Monte Carlo method, verifying the reproducibility of the results.
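The Monte Carlo uncertainty analysis mentioned at the end is not detailed in the abstract; the short sketch below only illustrates the general principle of propagating uncertain inventory data to an impact score. All flows, characterization factors and distributions are hypothetical and not taken from the thesis.

import numpy as np

rng = np.random.default_rng(42)

# hypothetical characterization factors (impact per kg of emitted flow)
cf = {"CO2": 1.0, "CH4": 28.0}                 # kg CO2-eq per kg

# hypothetical inventory per kg of acrolein: (mean, relative std) per flow
inventory = {"CO2": (2.1, 0.10), "CH4": (0.004, 0.30)}

n = 10_000
scores = np.zeros(n)
for flow, (mean, rel_std) in inventory.items():
    # lognormal sampling keeps flows positive, a common choice in LCA practice
    sigma = np.sqrt(np.log(1.0 + rel_std**2))
    mu = np.log(mean) - 0.5 * sigma**2
    scores += cf[flow] * rng.lognormal(mu, sigma, size=n)

low, high = np.percentile(scores, [2.5, 97.5])
print(f"impact score: {scores.mean():.2f} kg CO2-eq (95% interval {low:.2f}-{high:.2f})")

Repeating such a sampling for each scenario and comparing the resulting score distributions shows whether the ranking of the scenarios is robust against the input uncertainty.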
Abstract:
Faux leathers are multi-layer systems whose main component is PVC. Their production requires the use of a large number of additives; for each of these, the quantity to be used, its efficiency relative to its cost and, not least, its degree of toxicity must be evaluated. The additives currently used in faux-leather production that raise safety concerns are: phthalates (plasticizers), azodicarbonamide (blowing agent) and antimony trioxide (flame retardant). The thesis work focused on the study of these raw materials with the aim of finding valid alternatives to the standard compounds, in view of the current (and probable future) requirements of REACH, and of producing an "improved" faux leather in terms of performance and safety.
Abstract:
This thesis reports on the experimental realization, characterization and application of a novel microresonator design. The so-called “bottle microresonator” sustains whispering-gallery modes in which light fields are confined near the surface of the micron-sized silica structure by continuous total internal reflection. While whispering-gallery mode resonators in general exhibit outstanding properties in terms of both temporal and spatial confinement of light fields, their monolithic design makes tuning of their resonance frequency difficult. This impedes their use, e.g., in cavity quantum electrodynamics (CQED) experiments, which investigate the interaction of single quantum mechanical emitters of predetermined resonance frequency with a cavity mode. In contrast, the highly prolate shape of the bottle microresonators gives rise to a customizable mode structure, enabling full tunability. The thesis is organized as follows: In chapter I, I give a brief overview of different types of optical microresonators. Important quantities, such as the quality factor Q and the mode volume V, which characterize the temporal and spatial confinement of the light field are introduced. In chapter II, a wave equation calculation of the modes of a bottle microresonator is presented. The intensity distribution of different bottle modes is derived and their mode volume is calculated. A brief description of light propagation in ultra-thin optical fibers, which are used to couple light into and out of bottle modes, is given as well. The chapter concludes with a presentation of the fabrication techniques of both structures. Chapter III presents experimental results on highly efficient, nearly lossless coupling of light into bottle modes as well as their spatial and spectral characterization. Ultra-high intrinsic quality factors exceeding 360 million as well as full tunability are demonstrated. In chapter IV, the bottle microresonator in add-drop configuration, i.e., with two ultra-thin fibers coupled to one bottle mode, is discussed. The highly efficient, nearly lossless coupling characteristics of each fiber combined with the resonator's high intrinsic quality factor, enable resonant power transfers between both fibers with efficiencies exceeding 90%. Moreover, the favorable ratio of absorption and the nonlinear refractive index of silica yields optical Kerr bistability at record low powers on the order of 50 µW. Combined with the add-drop configuration, this allows one to route optical signals between the outputs of both ultra-thin fibers, simply by varying the input power, thereby enabling applications in all-optical signal processing. Finally, in chapter V, I discuss the potential of the bottle microresonator for CQED experiments with single atoms. Its Q/V-ratio, which determines the ratio of the atom-cavity coupling rate to the dissipative rates of the subsystems, aligns with the values obtained for state-of-the-art CQED microresonators. In combination with its full tunability and the possibility of highly efficient light transfer to and from the bottle mode, this makes the bottle microresonator a unique tool for quantum optics applications.
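As background to the Q/V figure of merit discussed in the last paragraph (standard CQED relations, with conventions that vary between references; not quoted from the thesis):

\[
C = \frac{g^{2}}{2\kappa\gamma}, \qquad g \propto \frac{1}{\sqrt{V}}, \qquad \kappa = \frac{\omega}{2Q} \quad\Longrightarrow\quad C \propto \frac{Q}{V},
\]

where $g$ is the atom-cavity coupling rate, $\kappa$ the cavity field decay rate, $\gamma$ the atomic dipole decay rate and $\omega$ the resonance frequency; a large cooperativity $C$ means that coherent coupling dominates over dissipation.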
Abstract:
The currently recognized close relationship between climate change and anthropogenic influence has long drawn attention to the greenhouse effect and global warming, as well as to the increase in the atmospheric concentrations of climatically active gases, first and foremost CO2. Radiocarbon is currently the environmental tracer par excellence, capable of providing, through a "top-down" approach, a valid control tool for discriminating and quantifying the carbon dioxide present in the atmosphere that is of fossil or biogenic origin. Thus, alongside the traditional fields of application of 14C, such as archaeometric dating, new areas are emerging, linked on the one hand to the energy sector, concerning problems associated with plant emissions, fuels and geological CO2 storage, and on the other to the rapidly growing market of so-called bio-based products made from renewable raw materials. In this thesis work the world of radiocarbon was therefore explored both from the strictly technical and methodological point of view and from the applicative point of view, covering its many and diverse fields of investigation. An analysis system based on the radiometric method was built and validated, using direct absorption of CO2 and liquid scintillation counting, introducing technological improvements and procedural refinements aimed at improving the performance of the method in terms of simplicity, sensitivity and reproducibility. Although it generally represents a good compromise with respect to the methodologies traditionally used for 14C analysis, the method is at present still inadequate for those application sectors where very high precision is required, but it is competitive for the analysis of modern samples with a high 14C concentration. Finally, the experimentation carried out on some ionic liquids, although preliminary and not conclusive, opens new lines of research on the possibility of using this new class of compounds as media for CO2 capture and 14C analysis in LSC.
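The fossil/biogenic discrimination described above rests on fossil carbon being essentially 14C-free; in the simplest two-source picture (a sketch of the principle, not the exact formalism of the thesis) the biogenic fraction of a carbon sample follows from its 14C content as

\[
f_{\mathrm{bio}} \approx \frac{{}^{14}a_{\mathrm{sample}}}{{}^{14}a_{\mathrm{bio}}}, \qquad f_{\mathrm{fossil}} = 1 - f_{\mathrm{bio}},
\]

where ${}^{14}a$ denotes the 14C specific activity (or fraction modern) of the sample and of a fully biogenic reference.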
Abstract:
The present work investigates the connection between scales in soft matter systems, which plays an important role in multiscale simulations. For this purpose, a method was developed that assesses the approximation of separability of variables for molecular dynamics and similar applications. The second and larger part of this work deals with the conceptual and technical extension of the Adaptive Resolution Scheme (AdResS), a method for the simultaneous simulation of systems at several levels of resolution. This method was extended to systems in which both classical and quantum-mechanical effects play a role.

The first method mentioned above requires only the analytical form of the potentials, as provided by most molecular dynamics packages. If its application to a specific problem succeeds, it gives a numerical indication of the validity of the separation of variables; if it does not succeed, it guarantees that no separation of the variables is possible. The method is applied, as examples, to a diatomic molecule on a surface and to the two-dimensional version of the rotational isomeric state (RIS) model of a polymer chain.

The second part of the work concerns the development of an algorithm for the adaptive simulation of systems in which quantum effects are taken into account. In the path-integral method, the quantum nature of an atom is represented by a classical ring polymer. The adaptive path-integral method is first tested for monatomic liquids and tetrahedral molecules under normal thermodynamic conditions. Finally, the stability of the method is verified by applying it to liquid para-hydrogen at low temperatures.
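The ring-polymer representation used in the adaptive path-integral method is the standard path-integral isomorphism, reproduced here in its textbook primitive form for a single particle as context (not a formula quoted from the thesis):

\[
Z \;\approx\; \left(\frac{mP}{2\pi\beta\hbar^{2}}\right)^{3P/2} \int \prod_{k=1}^{P} \mathrm{d}\mathbf{r}_{k}\; \exp\!\left[-\beta \sum_{k=1}^{P} \left( \frac{mP}{2\beta^{2}\hbar^{2}} \left(\mathbf{r}_{k}-\mathbf{r}_{k+1}\right)^{2} + \frac{1}{P}\,V(\mathbf{r}_{k}) \right)\right], \qquad \mathbf{r}_{P+1}=\mathbf{r}_{1},
\]

i.e. each quantum particle is mapped onto a closed ring of $P$ classical beads connected by harmonic springs of stiffness $mP/(\beta\hbar)^{2}$; the classical description is recovered for $P=1$, which is what an adaptive scheme can exploit by changing the effective resolution between regions.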
Abstract:
Heusler intermetallics Mn$_{2}Y$Ga and $X_{2}$MnGa ($X,Y$=Fe, Co, Ni) undergo tetragonal magnetostructural transitions that can result in half metallicity, magnetic shape memory, or the magnetocaloric effect. Understanding the magnetism and magnetic behavior in functional materials is often the most direct route to being able to optimize current materials and design future ones.

Synchrotron soft x-ray magnetic spectromicroscopy techniques are well suited to explore the competing effects of the magnetization and the lattice parameters in these materials, as they provide detailed element-, valence-, and site-specific information on the coupling of crystallographic ordering and electronic structure, as well as on the influence of external parameters like temperature and pressure on the bonding and exchange.

Fundamental work preparing the model systems of spintronic, multiferroic, and energy-related compositions is presented for context. The methodology of synchrotron spectroscopy is presented and applied not only to magnetic characterization but also to developing a systematic screening method for future examples of materials exhibiting any of the above effects.

Chapters include an introduction to the concepts and materials under consideration (Chapter 1); an overview of sample preparation techniques and results, and the kinds of characterization methods employed (Chapter 2); spectro- and microscopic explorations of $X_2$MnGa/Ge (Chapter 3); spectroscopic investigations of the composition series Mn$_{2}Y$Ga to the logical Mn$_3$Ga endpoint (Chapter 4); and a summary and overview of upcoming work (Chapter 5). Appendices include the results of a “Think Tank” for the Graduate School of Excellence MAINZ (Appendix A) and details of an imaging project now in progress on magnetic reversal and domain wall observation in the classical Heusler material Co$_2$FeSi (Appendix B).