893 results for Logic, Symbolic and mathematical
Abstract:
We propose a new general Bayesian latent class model, based on a computationally intensive approach, for evaluating the performance of multiple diagnostic tests in situations in which no gold standard test exists. The modeling is an attractive and suitable alternative to models with complex structures involving several conditionally independent diagnostic tests, covariates, and strata with different disease prevalences. Stratifying the population according to different disease prevalence rates adds no marked complexity to the modeling, but it makes the model more flexible and interpretable. To illustrate the proposed general model, we evaluate the performance of six diagnostic screening tests for Chagas disease, taking some epidemiological variables into account. Serology at the time of donation (negative, positive, inconclusive) was used as a stratification factor in the model. The general model with stratification of the population performed better than its counterparts without stratification. The group formed by the testing laboratory Biomanguinhos FIOCRUZ-kit (c-ELISA and rec-ELISA) is the best option in the confirmation process, presenting a false-negative rate of 0.0002% under the serial scheme. When both of these tests are negative the donor can be declared healthy, and when both are positive the donor can be declared infected with Chagas disease, with essentially complete certainty.
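The serial (two-test confirmation) scheme mentioned above can be made concrete with a short sketch. The formulas below are the textbook combination rules for two conditionally independent tests; the numerical sensitivities and specificities are illustrative assumptions, not the estimates from the study.

```python
# Combined error rates of a two-test serial confirmation scheme.
# The sensitivities/specificities below are made-up illustrative
# numbers, not the estimates reported for the FIOCRUZ kit.

def serial_scheme(se1, sp1, se2, sp2):
    """Disease is confirmed only if BOTH tests are positive.

    Combined sensitivity: a diseased donor must react on both tests,
    so it is the product se1*se2 (assuming conditional independence).
    Combined specificity: a healthy donor is misclassified only when
    both tests are falsely positive, so sp = 1 - (1-sp1)*(1-sp2).
    """
    return se1 * se2, 1 - (1 - sp1) * (1 - sp2)

se, sp = serial_scheme(0.99, 0.98, 0.995, 0.985)
print(f"combined sensitivity: {se:.5f}")  # below either single test
print(f"combined specificity: {sp:.5f}")  # above either single test
```

The trade-off is visible directly: serial confirmation raises specificity (fewer false positives) at the cost of sensitivity, which is why the scheme suits a confirmation step rather than a first screen.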
Abstract:
Using a network representation for real soil samples and mathematical models for microbial spread, we show that the structural heterogeneity of the soil habitat may have a very significant influence on the size of microbial invasions of the soil pore space. In particular, neglecting the soil structural heterogeneity may lead to a substantial underestimation of microbial invasion. Such effects are explained in terms of a crucial interplay between heterogeneity in microbial spread and heterogeneity in the topology of soil networks. The main influence of network topology on invasion is linked to the existence of long channels in soil networks that may act as bridges for transmission of microorganisms between distant parts of soil.
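The bridging role of long channels can be illustrated with a toy spread model on a pore network; this is an illustrative sketch only, not the paper's actual soil-network model, and the topologies and transmission probability are assumptions.

```python
import random

# Toy illustration: stochastic microbial spread over a pore network,
# where each edge transmits with probability p. Long-range channels
# (shortcuts) can act as bridges between distant parts of the network.

def invade(adjacency, seed, p_transmit, rng):
    """Breadth-first stochastic spread from a seed pore."""
    invaded = {seed}
    frontier = [seed]
    while frontier:
        nxt = []
        for node in frontier:
            for nb in adjacency[node]:
                if nb not in invaded and rng.random() < p_transmit:
                    invaded.add(nb)
                    nxt.append(nb)
        frontier = nxt
    return invaded

# A 1-D ring of 100 pores...
n = 100
ring = {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}
# ...and the same ring with a few long channels added.
with_channels = {i: list(nbrs) for i, nbrs in ring.items()}
for a, b in [(0, 50), (10, 60), (20, 70)]:
    with_channels[a].append(b)
    with_channels[b].append(a)

size_ring = len(invade(ring, 0, 0.8, random.Random(42)))
size_chan = len(invade(with_channels, 0, 0.8, random.Random(42)))
print(size_ring, size_chan)
```

Averaged over many runs, the network with long channels tends to sustain larger invasions at the same transmission probability, which is the qualitative effect the abstract describes.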
Abstract:
A common goal in gene expression data analysis is to identify, from a large pool of candidate genes, those that present significant changes in expression levels between a treatment and a control biological condition. Usually this is done with a test statistic and a cutoff value that separate the differentially from the non-differentially expressed genes. In this paper, we propose a Bayesian approach that identifies differentially expressed genes by sequentially calculating credibility intervals from predictive densities; these densities are constructed using the sampled mean treatment effect of all genes under study, excluding the treatment effects of genes previously identified as showing statistical evidence of a difference. We compare our Bayesian approach with standard ones based on the t-test and modified t-tests via a simulation study using the small sample sizes common in gene expression data analysis. The results provide evidence that the proposed approach performs better than the standard ones, especially in cases with mean differences and with treatment variance increased relative to control variance. We also apply the methodologies to a well-known publicly available data set on the bacterium Escherichia coli.
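A minimal sketch of the sequential idea follows, with simplified normal intervals standing in for the paper's predictive-density construction; the z cut-off, simulated data, and interval form are illustrative assumptions.

```python
import numpy as np

# Simplified sketch of the sequential screening idea: flag genes whose
# effect lies outside a credibility interval built from the REMAINING
# genes, exclude them, and recompute until no further gene is flagged.
# The normal interval and z cut-off below are illustrative stand-ins
# for the paper's predictive densities.

def sequential_flag(effects, z=3.0):
    flagged, remaining = [], list(range(len(effects)))
    while True:
        vals = effects[remaining]
        mu, sd = vals.mean(), vals.std(ddof=1)
        outliers = [i for i in remaining if abs(effects[i] - mu) > z * sd]
        if not outliers:
            return flagged
        flagged.extend(outliers)
        remaining = [i for i in remaining if i not in outliers]

rng = np.random.default_rng(0)
effects = rng.normal(0.0, 1.0, size=200)   # null genes
effects[:5] += 10.0                        # five truly shifted genes
print(sorted(sequential_flag(effects)))    # the five shifted genes are flagged
```

Excluding already-flagged genes before recomputing the interval is the key step: it stops strongly expressed genes from inflating the spread estimate and masking more moderate signals.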
Abstract:
Abstract Background Atherosclerosis causes millions of deaths and yields billions in expenses around the world annually. Intravascular Optical Coherence Tomography (IVOCT) is a medical imaging modality that displays high-resolution images of coronary cross-sections. Nonetheless, quantitative information can only be obtained with segmentation; consequently, more adequate diagnostics, therapies and interventions can be provided. Since it is a relatively new modality, many segmentation methods available in the literature for other modalities could be successfully applied to IVOCT images, improving accuracy and usability. Method An automatic lumen segmentation approach, based on the Wavelet Transform and Mathematical Morphology, is presented. The methodology is divided into three main parts. First, the preprocessing stage attenuates undesirable information and enhances important information. Second, in the feature extraction block, wavelet analysis is combined with an adapted version of the Otsu threshold; hence, tissue information is discriminated and binarized. Finally, binary morphological reconstruction improves the binary information and constructs the binary lumen object. Results The evaluation was carried out by segmenting 290 challenging images from human and pig coronaries and rabbit iliac arteries; the outcomes were compared with gold standards made by experts. The resulting accuracy was: True Positive (%) = 99.29 ± 2.96, False Positive (%) = 3.69 ± 2.88, False Negative (%) = 0.71 ± 2.96, Max False Positive Distance (mm) = 0.1 ± 0.07, Max False Negative Distance (mm) = 0.06 ± 0.1. Conclusions In conclusion, by segmenting a number of IVOCT images with various features, the proposed technique proved robust and more accurate than previously published approaches; in addition, the method is completely automatic, providing a new tool for IVOCT segmentation.
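The thresholding stage of the feature-extraction block can be sketched with the classic Otsu criterion on a synthetic bimodal intensity sample; this is the textbook algorithm, not the paper's adapted version, and the sample statistics and bin count are illustrative assumptions.

```python
import numpy as np

# Classic Otsu threshold on a synthetic bimodal intensity sample,
# standing in for the adapted-Otsu step of the feature-extraction
# block (the intensity distributions here are illustrative).

def otsu_threshold(img, nbins=256):
    """Pick the threshold that maximises between-class variance."""
    hist, edges = np.histogram(img, bins=nbins)
    p = hist.astype(float) / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2
    w0 = np.cumsum(p)                      # weight of the dark class
    mu0 = np.cumsum(p * centers)           # unnormalised dark-class mean
    mu_t = mu0[-1]                         # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        between = (mu_t * w0 - mu0) ** 2 / (w0 * (1 - w0))
    between[~np.isfinite(between)] = 0.0
    return centers[np.argmax(between)]

rng = np.random.default_rng(3)
lumen = rng.normal(0.15, 0.05, size=5000)    # dark lumen pixels
tissue = rng.normal(0.75, 0.08, size=5000)   # bright tissue pixels
t = otsu_threshold(np.concatenate([lumen, tissue]))
print(f"threshold = {t:.2f}")                # lands between the two modes
```

Binarising at this threshold would give the tissue mask that the subsequent morphological reconstruction stage cleans up into the lumen object.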
Abstract:
Salt deposits characterize the subsurface of Tuzla (Bosnia and Herzegovina) and have made it famous since ancient times. Archaeological discoveries demonstrate the presence of a Neolithic pile-dwelling settlement related to the existence of saltwater springs that contributed to making most of the area a swampy ground. Since Roman times, the town has been reported as "the City of Salt Deposits and Springs"; "tuz" is the Turkish word for salt, as the Ottomans renamed the settlement in the 15th century following their conquest of medieval Bosnia (Donia and Fine, 1994). Natural brine springs were located everywhere, and salt has been evaporated over hot charcoals since pre-Roman times. This ancient use of salt was only small-scale exploitation compared with the massive salt production carried out during the 20th century by classical mining methods and, especially, wild brine pumping. In the past, salt extraction was practised by tapping natural brine springs, while the modern technique consists of about 100 boreholes with pumps tapping the natural underground brine runs at an average depth of 400-500 m. The mining operation changed the hydrogeological conditions, enabling the downward flow of fresh water and causing additional salt dissolution. This process has induced severe ground subsidence over the last 60 years, reaching up to 10 meters of sinking in the most affected area. Stress and strain of the overlying rocks induced the formation of numerous fractures over a sizeable area (3 km²). Consequently, serious damage occurred to buildings and to infrastructure such as the water supply system, sewage networks and power lines. Downtown urban life was compromised by the destruction of more than 2000 buildings that collapsed or had to be demolished, forcing the resettlement of about 15000 inhabitants (Tatić, 1979).
Recently, salt extraction activities have been strongly reduced, but the underground water system is returning to its natural conditions, threatening to flood the most collapsed area. During the last 60 years, the local government developed a monitoring system for the phenomenon, collecting extensive data on geodetic measurements, the amount of brine pumped, piezometry, lithostratigraphy, the extension of the salt body, and geotechnical parameters. A database was created within a scientific cooperation between the municipality of Tuzla and the city of Rotterdam (D.O.O. Mining Institute Tuzla, 2000). The scientific investigation presented in this dissertation has been financially supported by a cooperation project between the Municipality of Tuzla, the University of Bologna (CIRSA) and the Province of Ravenna. The University of Tuzla (RGGF) gave important scientific support, in particular regarding the geological and hydrogeological features. Subsidence damage resulting from evaporite dissolution generates substantial losses throughout the world, but the causes are well understood only in a few areas (Gutierrez et al., 2008). The subject of this study is the collapse phenomenon occurring in the Tuzla area, with the aim of identifying and quantifying the several factors involved in the system and their correlations. The Tuzla subsidence phenomenon can be defined as a geohazard, representing the consequence of an adverse combination of geological processes and ground conditions precipitated by human activity, with the potential to cause harm (Rosenbaum and Culshaw, 2003). Where a hazard induces a risk to a vulnerable element, a risk management process is required. The single factors involved in the subsidence of Tuzla can each be considered hazards. The final objective of this dissertation is a preliminary risk assessment procedure and guidelines, developed in order to quantify the vulnerability of buildings in relation to the overall geohazard that affects the town.
The available historical database, never fully processed before, has been analyzed by means of geographic information systems and mathematical interpolators (PART I). Modern geomatic applications have been implemented to investigate the most relevant hazards in depth (PART II). In order to monitor and quantify the actual subsidence rates, geodetic GPS technologies have been employed, with four survey campaigns carried out once a year. The fracture system related to subsidence has been identified by means of field surveys and mathematical interpretation of the sinking surface, known as curvature analysis. The comparison of mapped and predicted fractures led to a better comprehension of the problem, and the results confirmed the reliability of fracture identification using curvature analysis applied to sinking data instead of topographic or seismic data. The evolution of urban changes has been reconstructed by analyzing topographic maps and satellite imagery, identifying the most damaged areas. This part of the investigation was very important for quantifying the vulnerability of buildings.
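The curvature-analysis idea (differentiating the sinking surface twice and flagging strongly curved zones as fracture-prone) can be sketched on a synthetic subsidence bowl; the grid, surface shape, and 1% flagging quantile are illustrative assumptions, not the dissertation's data or code.

```python
import numpy as np

# Curvature analysis sketched on a synthetic subsidence bowl: the
# sinking surface is differentiated twice, and the most strongly
# curved cells are flagged as candidate fracture locations.

x = np.linspace(-1.0, 1.0, 201)                    # toy coordinates
X, Y = np.meshgrid(x, x)
subsidence = -10.0 * np.exp(-5.0 * (X**2 + Y**2))  # up to ~10 m of sinking

dz_dy, dz_dx = np.gradient(subsidence, x, x)       # first derivatives
d2z_dy2 = np.gradient(dz_dy, x, axis=0)
d2z_dx2 = np.gradient(dz_dx, x, axis=1)
curvature = d2z_dx2 + d2z_dy2                      # Laplacian as a proxy

# Flag the most strongly curved 1 % of cells as fracture-prone.
threshold = np.quantile(np.abs(curvature), 0.99)
prone = np.abs(curvature) >= threshold
print(prone.sum(), "cells flagged of", prone.size)
```

On real data the same pipeline would run on the interpolated sinking grid, and the flagged cells would be compared with the field-mapped fractures.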
Abstract:
Our research takes place in the context of a discipline known as Communication for Development, situated within the field of Communication for Social Change and characterized by the use of interpersonal and mass communication theories and tools applied to international development cooperation. Our study aims to point out a change of paradigm in this field: our object is Public Administration's communication, and what we therefore suggest is a shift from Communication for Development to Development Communication. The object of our study thus becomes discourse itself, in its double action of representing and constructing reality. In particular, we are interested in the discourse's contribution to the creation of a collective imagination, which is the perspective towards which we have oriented the analysis, through a structuralist semiotics-based methodology integrated with a socio-semiotic approach. Considering that in our contemporary ('Western', 'First World') society the internet is a crucial public space for the mediation and management of collective imagination, we chose the websites of Public Bodies dedicated to International Cooperation as our analysis corpus, owing to their symbolic and ideological significance, as well as to the actual political responsibility we think these websites should carry. The results of our analysis allow us to identify some discursive strategies used in the websites of Public Bodies. In these sites, there is a tendency to shift discourses around international cooperation away from the ideological axis, thereby avoiding any explicit political statement about the causes of the injustices and imbalances which make support for development necessary (i.e. avoiding mention of values such as social justice and democracy while acknowledging the socio-economic institutions which contribute to fostering underdevelopment on a global scale), and towards the ethical axis, hence referring to moral values concerning the private sphere (human solidarity and charity), which is delegated mainly to non-governmental associations.
Abstract:
Many research fields are pushing the engineering of large-scale, mobile, and open systems towards the adoption of techniques inspired by self-organisation: pervasive computing, but also distributed artificial intelligence, multi-agent systems, social networks, peer-to-peer and grid architectures exploit adaptive techniques to make global system properties emerge in spite of the unpredictability of interactions and behaviour. Such a trend is also visible in coordination models and languages, whenever a coordination infrastructure needs to cope with managing interactions in highly dynamic and unpredictable environments. As a consequence, self-organisation can be regarded as a feasible metaphor for defining a radically new conceptual coordination framework. The resulting framework defines a novel coordination paradigm, called self-organising coordination, based on the idea of spreading coordination media over the network and charging them with services that manage interactions according to local criteria, so that desired and fruitful global coordination properties of the system emerge. Features like topology, locality, time-reactiveness, and stochastic behaviour play a key role both in the definition of this conceptual framework and in the consequent development of self-organising coordination services. According to this framework, the thesis presents several self-organising coordination techniques developed during the PhD course, mainly concerning data distribution in tuple-space-based coordination systems. Some of these techniques have also been implemented in ReSpecT, a coordination language for tuple spaces based on logic tuples and reactions to events occurring in a tuple space.
In addition, the key role played by simulation and formal verification has been investigated, analysing how automatic verification techniques such as probabilistic model checking can be exploited to formally prove the emergence of desired behaviours in coordination approaches based on self-organisation. To this end, a concrete case study is presented and discussed.
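To make the tuple-space vocabulary concrete, here is a toy Linda-style coordination medium; it is an illustrative sketch only and does not model ReSpecT's logic tuples or its event-triggered reactions.

```python
# A toy Linda-style tuple space in plain Python, illustrating the
# associative out/rd/in primitives that tuple-space coordination
# models build on. ReSpecT itself adds logic tuples and reactions
# to tuple-space events, which this minimal class does not model.

class TupleSpace:
    def __init__(self):
        self._tuples = []

    def out(self, tup):
        """Insert a tuple into the space."""
        self._tuples.append(tup)

    def _match(self, template, tup):
        # None in the template acts as a wildcard field.
        return len(template) == len(tup) and all(
            t is None or t == v for t, v in zip(template, tup))

    def rd(self, template):
        """Non-destructive associative read; None fields are wildcards."""
        for tup in self._tuples:
            if self._match(template, tup):
                return tup
        return None

    def in_(self, template):
        """Destructive read: return and remove the first match."""
        for i, tup in enumerate(self._tuples):
            if self._match(template, tup):
                return self._tuples.pop(i)
        return None

ts = TupleSpace()
ts.out(("temperature", "room1", 21))
ts.out(("temperature", "room2", 19))
print(ts.rd(("temperature", "room2", None)))   # ('temperature', 'room2', 19)
print(ts.in_(("temperature", None, None)))     # removes room1's tuple
```

Spreading many such media over a network and attaching local reaction rules to them is, in essence, the self-organising coordination architecture the abstract describes.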
Abstract:
This thesis deals with the automation of calculations of virtual radiative corrections in perturbative quantum field theories. Including such corrections at the multi-loop level of the perturbative expansion is indispensable today in order to keep pace with the growing precision of experimental results. In the general kinematic case, only the one-loop corrections can currently be regarded as theoretically solved; for higher orders only partial results exist. In Mainz, several novel methods for the integration of two-loop Feynman diagrams have been developed in recent years and implemented in algorithmic form, partly successfully, in the xloops package. The techniques used combine exact symbolic computation with numerical methods. Within this framework, the two-loop four-point functions constitute a new chapter: their large number of free kinematic parameters makes them hard to survey, and they also exceed previous requirements at the symbolic level. From an experimental point of view, however, they are of great interest for some scattering processes. In this thesis, based on an idea by Dirk Kreimer, a method was investigated which attempts to integrate the scalar four-point functions at the two-loop level entirely without boundary conditions on the parameter space. The structure of the terms arising after four residue integrations was largely clarified, and the complexity of the resulting expressions was reduced to the point where present-day computers can represent them. A fully automated implementation has, however, not yet been reached. All this is the subject of Chapter 2.
For a variety of reasons, extending xloops beyond the two-point functions no longer appeared sensible. Within this thesis a radical break was therefore made: together with C. Bauer and A. Frink, a program library was designed that serves as a vehicle for symbolic manipulation and allows us to replace the usual symbolic languages such as Maple with C++. The third chapter discusses the reasons why this change makes sense and introduces the GiNaC library. The fourth chapter presents the details of the implementation, and the fifth examines its practical suitability. Appendix A gives an overview of the tools of complex analysis that were used, and Appendix B describes a proven numerical instrument.
Abstract:
Ground-based Earth troposphere calibration systems play an important role in planetary exploration, especially in radio science experiments aimed at the estimation of planetary gravity fields. In these experiments, the main observable is the spacecraft (S/C) range rate, measured from the Doppler shift of an electromagnetic wave transmitted from the ground, received by the spacecraft, and coherently retransmitted back to the ground. Once solar corona and interplanetary plasma noise have been removed from the Doppler data, the Earth troposphere remains one of the main error sources in the tracking observables. Current Earth media calibration systems at NASA's Deep Space Network (DSN) stations are based upon a combination of weather data and multidirectional, dual-frequency GPS measurements acquired at each station complex. In order to support Cassini's cruise radio science experiments, a new generation of media calibration systems was developed, driven by the need to achieve an end-to-end Allan deviation of the radio link on the order of 3×10⁻¹⁵ at 1000 s integration time. The future ESA BepiColombo mission to Mercury carries scientific instrumentation for radio science experiments (a Ka-band transponder and a three-axis accelerometer) which, in combination with the S/C telecommunication system (an X/X/Ka transponder), will provide the most advanced tracking system ever flown on an interplanetary probe. The current error budget for MORE (Mercury Orbiter Radioscience Experiment) allows the residual uncalibrated troposphere to contribute 8×10⁻¹⁵ to the two-way Allan deviation at 1000 s integration time. The current standard ESA/ESTRACK calibration system is based on a combination of surface meteorological measurements and mathematical algorithms capable of reconstructing the Earth troposphere path delay, leaving an uncalibrated component of about 1-2% of the total delay.
In order to satisfy the stringent MORE requirements, the short-timescale variations of the Earth troposphere water vapor content must be calibrated at the ESA deep space antennas (DSA) with more precise and stable instruments (microwave radiometers). In parallel to these high-performance instruments, ESA ground stations should be upgraded with media calibration systems at least capable of calibrating both troposphere path delay components (dry and wet) at the sub-centimetre level, in order to reduce S/C navigation uncertainties. The natural choice is to provide continuous troposphere calibration by processing GNSS data acquired at each complex by the dual-frequency receivers already installed for station location purposes. The work presented here outlines a troposphere calibration technique supporting both deep space probe navigation and radio science experiments. After an introduction to deep space tracking techniques, observables, and error sources, Chapter 2 investigates the troposphere path delay in detail, reporting the estimation techniques and the state of the art of the ESA and NASA troposphere calibrations. Chapter 3 analyses the status and performance of the NASA Advanced Media Calibration (AMC) system as applied to the Cassini data analysis. Chapter 4 describes the current release of the GNSS software (S/W) developed to estimate the troposphere calibration for ESA S/C navigation purposes. During the development phase of the S/W, a test campaign was undertaken to evaluate its performance; the campaign and the main results are reported in Chapter 5. Chapter 6 presents a preliminary analysis of microwave radiometers to be used to support radio science experiments, carried out on radiometric measurements of the ESA/ESTEC instruments installed in Cabauw (NL) and compared with the requirements of MORE.
Finally, Chapter 7 summarizes the results obtained and defines some key technical aspects to be evaluated and taken into account for the development phase of future instrumentation.
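The Allan deviation used above as the stability figure of merit can be estimated in a few lines; this is the standard non-overlapping estimator applied to illustrative white frequency noise, not to real Doppler residuals.

```python
import numpy as np

# Non-overlapping Allan deviation estimator on illustrative white
# frequency noise. For white FM noise the Allan deviation falls off
# as 1/sqrt(averaging time), as the printout shows.

def allan_deviation(y, m):
    """Allan deviation of fractional-frequency samples y, averaged in
    non-overlapping groups of m consecutive samples."""
    n = len(y) // m
    means = y[: n * m].reshape(n, m).mean(axis=1)
    return np.sqrt(0.5 * np.mean(np.diff(means) ** 2))

rng = np.random.default_rng(1)
y = rng.normal(0.0, 1e-14, size=100_000)   # simulated fractional frequency
for m in (1, 10, 100):
    print(f"m = {m:4d}  ADEV = {allan_deviation(y, m):.2e}")
```

In the real experiments, y would be the fractional frequency residuals of the two-way link, and the value at the 1000 s averaging time would be compared against the 3×10⁻¹⁵ and 8×10⁻¹⁵ budget figures quoted above.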
Abstract:
This thesis investigates two physical flow experiments on nonwoven fabrics, intended to identify unknown hydraulic parameters of the material, such as the diffusivity or conductivity function, from measured data. The physical and mathematical modelling of these experiments leads to a Cauchy-Dirichlet problem with free boundary for the degenerate parabolic Richards equation in its saturation formulation, the so-called direct problem. From knowledge of the free boundary of this problem, the nonlinear diffusivity coefficient of the differential equation is to be reconstructed. For this inverse problem, we set up an output-least-squares functional and minimise it with iterative regularisation methods such as the Levenberg-Marquardt method and the IRGN method, based on a parametrisation of the coefficient space by quadratic B-splines. For the direct problem we prove, among other things, existence and uniqueness of the solution of the Cauchy-Dirichlet problem as well as existence of the free boundary. We then formally reduce the derivative of the free boundary with respect to the coefficient, which is needed for the numerical reconstruction method, to a linear degenerate parabolic boundary value problem. We describe the numerical realisation and implementation of our reconstruction method and finally present reconstruction results for synthetic data.
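The output-least-squares and Levenberg-Marquardt machinery can be illustrated on a one-parameter toy problem; the exponential forward model and all numbers below are illustrative stand-ins for the thesis's free-boundary problem and B-spline-parametrised coefficient space.

```python
import numpy as np

# One-parameter output-least-squares toy with a hand-rolled
# Levenberg-Marquardt iteration: recover the coefficient of a
# simple forward model from synthetic observations.

def model(a, t):
    return np.exp(-a * t)                       # toy "direct problem"

def levenberg_marquardt(a0, t, data, lam=1e-2, iters=50):
    a = a0
    for _ in range(iters):
        r = model(a, t) - data                  # output residual
        J = -t * np.exp(-a * t)                 # d(model)/da
        step = -(J @ r) / (J @ J + lam)         # damped normal equation
        if np.sum((model(a + step, t) - data) ** 2) < np.sum(r ** 2):
            a, lam = a + step, lam * 0.5        # accept, reduce damping
        else:
            lam *= 2.0                          # reject, damp harder
    return a

t = np.linspace(0.0, 2.0, 50)
data = model(1.7, t)                            # synthetic observations
a_hat = levenberg_marquardt(0.5, t, data)
print(a_hat)                                    # converges towards 1.7
```

The damping parameter interpolates between gradient descent (large lam) and Gauss-Newton (small lam); in the thesis the single scalar is replaced by the vector of B-spline coefficients, but the accept/damp logic is the same.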
Abstract:
This thesis examines how Greenaway's films reflect on the problematic nature of the filmic medium itself, i.e. how the illusory and artefactual character of film is thematised within the film. This self-reflection is investigated under three aspects: the systematically and artificially organised formal structure, the narrativity, and the viewer's mode of perception. On the formal level, Greenaway's films demonstrate: that the film images are discontinuous and non-uniform; how the individual visual, acoustic and technical signs are systematically and artificially organised; and finally how the discontinuous and non-uniform film images come to appear continuous and uniform through this systematic and artificial organisation of signs. On the allegorical, symbolic and metaphorical level, his films also address the relationship between the formal structure, the story and the viewer's interactive mode of perception, and the relationship between the viewer, the film and the film-maker. The male protagonist is a metaphor for the viewer. The female figures allegorise the two sides of film, its form and its content. On the metaphorical level, the sexual relationship between the male protagonist and the women encompasses the viewer's interactivity with the film.
Abstract:
Regional measurement of blood flow (perfusion) permits differentiated statements about the health and functionality of the lung. Thanks to new measurement techniques, magnetic resonance imaging (MRI) allows a non-invasive and radiation-free examination of perfusion. Although the feasibility of qualitative MRI perfusion measurements has already been demonstrated, a validated quantitative method is still lacking. The aim of this work was to optimise the existing measurement protocols and mathematical models for the absolute quantification of lung perfusion with magnetic resonance imaging, and to validate the methodology by comparison with an established reference method. Simulations and phantom measurements allowed optimal MRI measurement parameters and a standardised protocol to be defined. Furthermore, a generalised determination of the contrast-agent concentration from the measured signal intensities was presented, discussed and validated by volunteer measurements. On the basis of these developments, MRI perfusion measurement of the lung was compared intra-individually with positron emission tomography (PET) in animal experiments and validated. The results showed only small deviations and a statistically highly significant, strongly linear correlation. In summary, the developments presented in this work made it possible to optimise contrast-enhanced MRI perfusion measurement of the lung and to validate it for the first time.
Abstract:
Consumer society has made every sphere of life open to commodification. Capitalism long ago stripped productive practice of its heavier elements to take on a symbolic and cultural dimension. Consumption-turned-sign, the immaterial dimension of our material lives, has ended up colonising the sphere of experience and, with it, life itself. Consumption becomes, above all, an experience of life: a continually changing experience that lets us live numerous lives, each one different. What remains stable is the consumerist paradigm, which invests our very identity, the identity of places, the identity of the world. Nomadism is the most typical dimension of consumption, just as the perennial mobility of life is the characteristic dimension of the global era. Nevertheless, the new forms of political, ethical and responsible consumerism that have followed the rise of global risks, by acting precisely on the spheres of experience and consumption, help to change behaviour in a "reflexive" and "glocal" direction. Tourism, embodying at once the experiential paradigm and that of global mobility, can therefore become a privileged observatory for investigating these reflexive forms of consumption, which give an entirely new meaning both to experience and to consumption itself. Through a case study, this thesis explores how emerging forms of responsible tourism can offer a key of access to new forms of sustainable development of local territories in contexts of early modernisation.
Abstract:
This research, "The religious architecture of Luis Moya Blanco. Construction as a compositional principle", deals with the design and building of spaces for Christian worship that the Spanish architect planned and realised in Madrid between 1945 and 1970. The thesis investigates which principles underlying his architectural composition can be considered unchanged over the long period in which he worked. Starting from an analysis of his formative years and of the writings he produced, it concentrates in particular on his most recent projects, still little discussed by critics. The aim of this thesis is thus to make an original contribution on the compositional aspect of his architecture. In Moya, however, analysing composition means analysing construction, which, despite the succession of languages, remains the principal aspect of his works. Studying the buildings through categories drawn from his own writings (mathematics, number, geometry and regulating lines) brings out points of contact and continuity between the early churches, strongly marked by a Baroque layout, and the late projects, which instead seem to seek a dialogue with decidedly modern forms. These reflections, set in the context of his substantial essayistic production, converge in the final idea that construction becomes for Luis Moya Blanco the compositional principle that cannot be dispensed with, the rule that embodies number and geometry in matter.
If construction is thus the petrification of the geometric-mathematical laws underlying his plans, his recourse to centrally planned space answers not an intention to improve the liturgy but philosophical-idealist concerns, which match the supreme naturalness of divine perfection with the supreme perfection of the circular form or of one of its derivatives, such as the ellipse.
Abstract:
In this thesis, I present the realization of a fiber-optical interface using optically trapped cesium atoms, an efficient tool for coupling light and atoms. The basic principle of the presented scheme relies on trapping neutral cesium atoms in a two-color evanescent field surrounding a nanofiber. The strong confinement of the fiber-guided light, which also protrudes outside the nanofiber, provides strong confinement of the atoms as well as efficient coupling to near-resonant light propagating through the fiber. In chapter 1, the necessary physical and mathematical background describing the propagation of light in an optical fiber is presented. The exact solution of Maxwell's equations allows us to model the fiber-guided light fields which give rise to the trapping potentials and the atom-light coupling in the close vicinity of a nanofiber. Chapter 2 gives the theoretical background of light-atom interaction. A quantum mechanical model of the light-induced shifts of the relevant atomic levels is reviewed, which allows us to quantify the perturbation of the atomic states due to the presence of the trapping light fields. The experimental realization of the fiber-based atom trap is the focus of chapter 3. Here, I analyze the properties of the fiber-based trap in terms of the confinement of the atoms and the impact of several heating mechanisms. Furthermore, I demonstrate the transportation of the trapped atoms as a first step towards a deterministic delivery of individual atoms. In chapter 4, I present the successful interfacing of the trapped atomic ensemble and fiber-guided light. Three different approaches are discussed, involving the measurement of near-resonant scattering either in absorption or via the emission into the guided mode of the nanofiber. In the analysis of the spectroscopic properties of the trapped ensemble, we find good agreement with the predictions of the theoretical model discussed in chapter 2.
In addition, I introduce a non-destructive scheme for the interrogation of the atomic states which is sensitive to phase shifts of far-detuned fiber-guided light interacting with the trapped atoms. The inherent birefringence in our system, induced by the atoms, changes the state of polarization of the probe light and can thus be detected via a Stokes vector measurement.
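The Stokes-vector readout can be sketched as follows: an atom-induced differential phase between the circular components of the probe rotates a linear polarisation by half that phase, which then appears in the S1/S2 parameters. The sign conventions and the 10 mrad phase below are illustrative assumptions, not the experiment's values.

```python
import numpy as np

# Stokes-vector readout sketch: a differential phase dphi between the
# sigma+/sigma- components of the probe is equivalent to rotating a
# linear polarisation by dphi/2, which shows up in S1/S2.

def stokes(Ex, Ey):
    """Stokes parameters of a fully polarised field (Jones vector)."""
    S0 = abs(Ex) ** 2 + abs(Ey) ** 2
    S1 = abs(Ex) ** 2 - abs(Ey) ** 2
    S2 = 2.0 * (Ex * np.conj(Ey)).real
    S3 = -2.0 * (Ex * np.conj(Ey)).imag
    return np.array([S0, S1, S2, S3])

def circular_birefringence(Ex, Ey, dphi):
    """Differential circular phase dphi = polarisation rotation by dphi/2."""
    c, s = np.cos(dphi / 2.0), np.sin(dphi / 2.0)
    return c * Ex - s * Ey, s * Ex + c * Ey

Ex, Ey = 1.0 + 0j, 0.0 + 0j                 # horizontal probe light
print(stokes(Ex, Ey))                       # S = (1, 1, 0, 0)
Ex2, Ey2 = circular_birefringence(Ex, Ey, 0.01)
print(stokes(Ex2, Ey2))                     # S2 rises to sin(0.01)
```

Measuring S1 and S2 before and after the atoms thus gives a phase-sensitive, non-destructive signal proportional (for small angles) to the atom-induced phase shift.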