Abstract:
Despite the scientific achievements of the last decades in astrophysics and cosmology, the majority of the energy content of the Universe is still unknown. A potential solution to the "missing mass problem" is the existence of dark matter in the form of WIMPs. Because of the very small cross section for WIMP-nucleon interactions, the number of expected events is very limited (about 1 event/tonne/year), thus requiring detectors with a large target mass and a low background level. The aim of the XENON1T experiment, the first tonne-scale LXe-based detector, is to be sensitive to WIMP-nucleon cross sections as low as 10^-47 cm^2. To investigate whether such a detector can reach this goal, Monte Carlo simulations are mandatory to estimate the background. To this end, the GEANT4 toolkit has been used to implement the detector geometry and to simulate the decays from the various background sources, both electromagnetic and nuclear. From the analysis of the simulations, the background level has been found fully acceptable for the purposes of the experiment: about 1 background event in a 2 tonne-year exposure. Using the Maximum Gap method, the XENON1T sensitivity has been evaluated, and the minimum of the WIMP-nucleon cross section limit has been found at 1.87 x 10^-47 cm^2, at 90% CL, for a WIMP mass of 45 GeV/c^2. The results have been independently cross-checked with the Likelihood Ratio method, which confirmed them within less than a factor of two, an agreement that is entirely acceptable considering the intrinsic differences between the two statistical methods. Thus, this PhD thesis demonstrates that the XENON1T detector will be able to reach its design sensitivity, lowering the limits on the WIMP-nucleon cross section by about 2 orders of magnitude with respect to current experiments.
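The Maximum Gap construction used here can be sketched numerically. The snippet below implements Yellin's C0 formula and inverts it by bisection to obtain a 90% CL upper limit on the expected signal; the function names and the solver are illustrative choices, not code from the thesis.

```python
import math

def c0(x, mu):
    # Yellin's C0(x, mu): probability that, with mu expected signal events,
    # the largest gap between events is smaller than x
    # (both x and mu in expected-event units).
    m = int(mu // x)
    total = 0.0
    for k in range(m + 1):
        total += ((k * x - mu) ** k * math.exp(-k * x) / math.factorial(k)) \
                 * (1.0 + k / (mu - k * x))
    return total

def max_gap_upper_limit(gap_fraction, cl=0.90):
    # Smallest mu excluded at the given CL when the observed maximum gap
    # spans `gap_fraction` of the total expected-signal interval.
    lo, hi = 1e-3, 100.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if c0(gap_fraction * mid, mid) < cl:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Sanity check: if the gap is (almost) the whole interval, i.e. essentially
# no events were observed, the limit reduces to the zero-event Poisson
# value of about 2.30 expected events at 90% CL.
print(round(max_gap_upper_limit(0.9999), 2))
```

The appeal of the method, and the reason it suits a nearly background-free search like XENON1T, is that it needs no background model: only the largest empty interval in the observed spectrum enters the limit.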
Abstract:
Biosensors find wide application in clinical diagnostics, bioprocess control and environmental monitoring. They should not only show high specificity and reproducibility but also high sensitivity and signal stability. Here I introduce a novel sensor technology based on plasmonic nanoparticles that addresses both of the latter requirements. Plasmonic nanoparticles exhibit strong absorption and scattering in the visible and near-infrared spectral range. The plasmon resonance, the collective coherent oscillation mode of the conduction band electrons against the positively charged ionic lattice, is sensitive to the local environment of the particle. I monitor changes in the resonance wavelength with a new dark-field spectroscopy technique. Thanks to a strong light source and a highly sensitive detector, a temporal resolution in the microsecond regime is possible in combination with high spectral stability. This opens a window to investigate dynamics on the molecular level and to gain knowledge about fundamental biological processes. First, I investigate adsorption both away from and at equilibrium. I show the temporal evolution of single adsorption events of fibrinogen on the sensor surface on a millisecond timescale. Fibrinogen is a blood plasma protein with a unique shape that plays a central role in blood coagulation and is always involved in cell-biomaterial interactions. Further, I monitor equilibrium coverage fluctuations of sodium dodecyl sulfate and demonstrate a new approach to quantify the characteristic rate constants that is independent of mass-transfer interference and long-term drifts of the measured signal. This method had been investigated theoretically by Monte Carlo simulations, but so far no sensor technology offered a sufficient signal-to-noise ratio. Second, I apply plasmonic nanoparticles as sensors for the determination of diffusion coefficients. The sensing volume of a single, immobilized nanorod serves as the detection volume: when a diffusing particle enters it, a shift in the resonance wavelength is introduced. As no labeling of the analyte is necessary, the hydrodynamic radius, and thus the diffusion properties, are not altered and can be studied in their natural form. Compared to the conventional Fluorescence Correlation Spectroscopy technique, a volume reduction by a factor of 5000-10000 is achieved.
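Label-free sizing from a measured diffusion coefficient rests on the Stokes-Einstein relation, R_h = k_B T / (6 π η D). A minimal sketch; the numerical values are illustrative, not from the thesis:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def hydrodynamic_radius(diff_coeff, temperature=298.0, viscosity=8.9e-4):
    # Stokes-Einstein relation: R_h = k_B * T / (6 * pi * eta * D)
    # diff_coeff in m^2/s, viscosity in Pa*s (water at ~25 C), result in meters.
    return K_B * temperature / (6.0 * math.pi * viscosity * diff_coeff)

# Example: a protein-sized diffuser with D = 4.0e-11 m^2/s in water
# comes out at roughly 6 nm.
r = hydrodynamic_radius(4.0e-11)
print(f"{r * 1e9:.1f} nm")
```

Because the relation is exact only for a sphere in a continuum solvent, R_h is an effective radius; for non-spherical analytes such as fibrinogen it folds shape into a single size parameter.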
Abstract:
Poly(ethylene glycol) (PEG) is used in a broad range of applications due to its unique combination of properties and is approved for use in formulations for body-care products, edibles and medicine. This thesis aims at the synthesis and characterization of novel heterofunctional PEG structures and the establishment of diethyl squarate as a suitable linker for covalent attachment to proteins. Chapter 1 is an introduction to the properties and applications of PEG as well as the fascinating chemistry of squaric acid derivatives. In Chapter 1.1, the synthesis and properties of PEG are described, and the versatile applications of PEG derivatives in everyday products are emphasized, with a focus on PEG-based pharmaceuticals and nonionic surfactants. This chapter is written in German, as it was published in the German journal Chemie in unserer Zeit. Chapter 1.2 deals with PEG's major drawback, its non-biodegradability, which impedes parenteral administration of PEG conjugates with polyethers exceeding the renal excretion limit, although these would improve blood circulation times and passive tumor targeting. This section gives a comprehensive overview of the cleavable groups that have been implemented in the polyether backbone to tackle this issue, as well as of the synthetic strategies employed to accomplish this task. Chapter 1.3 briefly summarizes the chemical properties of alkyl squarates and the advantages that their use as a coupling agent brings to protein conjugation chemistry. In Chapter 2, the application of diethyl squarate as a coupling agent in the PEGylation of proteins is illustrated. Chapter 2.1 describes the straightforward synthesis and characterization of squaric acid ethyl ester amido PEGs with terminal hydroxyl functions or methoxy groups. The reactivity and selectivity of these activated PEGs are explored in kinetic studies of the reactions with lysine and other amino acid derivatives, followed by 1H NMR spectroscopy.
Further, the efficient attachment of the novel PEGs to a model protein, i.e., bovine serum albumin (BSA), demonstrates the usefulness of the new linker for PEGylation with heterofunctional PEGs. In Chapter 2.3, initial studies on the biocompatibility of polyether/BSA conjugates synthesized by squaric acid mediated PEGylation are presented. No cytotoxic effects on human umbilical vein endothelial cells exposed to various concentrations of the conjugates were observed in a WST-1 assay. A cell adhesion molecule enzyme immunosorbent assay did not reveal expression of E-selectin or ICAM-1, cell adhesion molecules involved in inflammation processes. The focus of Chapter 3 lies on the syntheses of novel heterofunctional PEG structures that are suitable candidates for squaric acid mediated PEGylation and exhibit superior features compared to established PEGs applied in bioconjugation. Chapter 3.1 describes the synthetic route to well-defined, linear heterobifunctional PEGs carrying a single acid-sensitive moiety either at the initiation site or at a tunable position in the polyether backbone. A universal concept for the implementation of acetal moieties into initiators for the anionic ring-opening polymerization (AROP) of epoxides is presented and shown to grant access to the targeted degradable PEG structures. The hydrolysis of the heterofunctional PEG with the acetal moiety at the initiation site is followed by 1H NMR spectroscopy in deuterium oxide at different pH values. In an exploratory study, the same polymer is attached to BSA via the squaric acid coupling and subsequently cleaved from the conjugate under acidic conditions. Furthermore, the concept for the generation of acetal-modified AROP initiators is demonstrated to be suitable for cholesterol, and the respective amphiphilic cholesteryl-PEG is cleaved at lowered pH. In Chapter 3.2, the straightforward synthesis of α-amino ω2-dihydroxyl star-shaped three-arm PEGs is described.
To ensure a symmetric length of the hydroxyl-terminated PEG arms, a novel AROP initiator is presented whose primary and secondary hydroxyl groups are separated by an acetal moiety. Upon polymerization of ethylene oxide from these functionalities and subsequent cleavage of the acid-labile unit, no difference in the degree of polymerization is seen between the two polyether fragments.
Abstract:
Nitrogen monoxide (NO) and nitrogen dioxide (NO2) play an important role in the self-cleansing capacity of the atmosphere. These trace gases govern the photochemical production of ozone (O3) and influence the abundance of hydroxyl (OH) and nitrate (NO3) radicals. During the day, when sufficient solar radiation and ozone are present, NO and NO2 are in a rapid photochemical equilibrium known as the photostationary state; their sum is therefore referred to as NOx. Previous studies of the photostationary state of NOx comprise measurements at a wide variety of sites, ranging from cities (characterized by strong air pollution) to remote regions (characterized by low air pollution). While the photochemical cycling of NO and NO2 is fundamentally understood under conditions of elevated NOx concentrations, significant gaps remain in the understanding of the underlying cycling processes in rural and remote regions characterized by lower NOx concentrations. These gaps could be caused by instrumental NO2 interferences, in particular for indirect detection methods, which can be affected by artifacts. At very low NOx concentrations, and when instrumental NO2 interferences can be excluded, it is often concluded that these gaps in understanding are linked to the existence of an "unknown oxidant". In this work, the photostationary state of NOx is analyzed with the aim of investigating the potential existence of hitherto unknown processes. A gas analyzer for the direct measurement of atmospheric NO2 by laser-induced fluorescence (LIF), GANDALF, was newly developed and deployed for field measurements for the first time during the PARADE 2011 campaign. The PARADE measurements were carried out in the summer of 2011 in a rural area in Germany. Extensive NO2 measurements using different techniques (DOAS, CLD and CRD) enabled a thorough and successful comparison of GANDALF with the other NO2 measurement techniques. Further relevant trace gases and meteorological parameters were measured in order to study the photostationary state of NOx, based on the NO2 measurements with GANDALF, in this environment. During PARADE, moderate NOx mixing ratios (10^2 - 10^4 pptv) were observed at the site. Mixing ratios of biogenic volatile organic compounds (BVOC) from the surrounding forest (mainly coniferous) were of the order of 10^2 pptv. The characteristics of the photostationary state of NOx at low NOx mixing ratios (10 - 10^3 pptv) were investigated at a second site in a boreal forest during the HUMPPA-COPEC 2010 campaign, carried out in the summer of 2010 at the SMEAR II station in Hyytiälä, southern Finland. The characteristics of the photostationary state of NOx in the two forest environments are compared in this work. Furthermore, the comprehensive data set, which includes measurements of trace gases relevant to radical chemistry (OH, HO2) as well as total OH reactivity, makes it possible to test and improve the current understanding of NOx photochemistry using a box model constrained by the measured data. Although NOx concentrations were lower and BVOC concentrations higher during HUMPPA-COPEC 2010 than during PARADE 2011, the cycling of NO and NO2 is fundamentally understood in both cases. The analysis of the photostationary state of NOx at the two very different sites shows no evidence of potentially unknown processes in either case. The current representation of NOx chemistry was simulated for HUMPPA-COPEC 2010 using the chemical mechanism MIM3*; the simulation results are consistent with the calculations based on the photostationary state of NOx.
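The photostationary-state analysis described above is commonly condensed into the Leighton ratio, φ = j(NO2)·[NO2] / (k·[NO]·[O3]); φ ≈ 1 means NO-NO2 cycling is fully explained by O3 alone, while φ significantly above 1 would hint at an additional oxidant. A sketch with illustrative midday values; the rate constant follows the JPL recommendation, and all numbers are examples, not campaign data:

```python
import math

def leighton_ratio(j_no2, no2_ppt, no_ppt, o3_ppb, temp_k=298.0, pressure_pa=101325.0):
    # Leighton ratio phi = j(NO2)*[NO2] / (k*[NO]*[O3]).
    # NO and NO2 mixing ratios enter only through their ratio.
    k_b = 1.380649e-23                        # Boltzmann constant, J/K
    m = pressure_pa / (k_b * temp_k) * 1e-6   # air number density, cm^-3
    k = 3.0e-12 * math.exp(-1500.0 / temp_k)  # NO + O3 rate constant, cm^3/s (JPL)
    o3 = o3_ppb * 1e-9 * m                    # O3 number density, cm^-3
    return j_no2 * (no2_ppt / no_ppt) / (k * o3)

# Illustrative values: j(NO2) = 8e-3 s^-1, NO2/NO = 3, 40 ppbv O3, 298 K.
phi = leighton_ratio(8e-3, 300.0, 100.0, 40.0)
print(round(phi, 2))
```

In practice, peroxy radicals (HO2, RO2) also convert NO to NO2, so a measured φ slightly above unity is expected and must be modeled before invoking any unknown oxidant.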
Abstract:
This thesis describes the development, construction and investigation of a magnetometer for the accurate and precise measurement of weak magnetic fields. This type of magnetometer is suited for application in high-precision physics experiments such as the search for the electric dipole moment of the neutron. The measurement method is based on the simultaneous detection of the free spin precession of nuclear-spin-polarized 3He gas by several optically pumped cesium magnetometers. It is shown that cesium magnetometers are a reliable and versatile method for measuring the 3He Larmor frequency and a convenient alternative to the use of SQUIDs for this purpose. A prototype of this magnetometer was built and its operation was investigated in the magnetically shielded room of the Physikalisch-Technische Bundesanstalt. The sensitivity of the magnetometer as a function of measurement duration was studied experimentally. It is shown that Cramér-Rao-limited measurements are possible for short measurement periods (< 500 s), while for longer measurements the sensitivity is limited by the stability of the applied magnetic field. Measurements of a 1 µT magnetic field with a relative accuracy of better than 5x10^-8 in 100 s are presented. It is shown that the measurement accuracy of the magnetometer can be scaled with the number of cesium magnetometers used to detect the 3He spin precession; in principle, this allows the accuracy to be adapted to any experimental requirements. A gradiometric measurement method is presented that makes it possible to suppress the influence of periodic magnetic disturbances on the measurement. The relation between the sensitivity of the combined magnetometer and the operating parameters of the cesium magnetometers used for spin detection is investigated theoretically, and application-specific advantages and disadvantages of different modes of operation are discussed. These relations are condensed into a formula that allows the expected sensitivity of the magnetometer to be calculated; its predictions are in perfect agreement with the experimental data. The intrinsic sensitivity of the magnetometer prototype is determined theoretically on the basis of this formula. In addition, the expected sensitivity for the application in the next-generation experiment to determine the electric dipole moment of the neutron at the Paul Scherrer Institute is estimated. Furthermore, a convenient experimental method for measuring the degree of polarization and the Rabi flip angle of the 3He nuclear spin polarization is presented. The latter measurement is very important for applications in high-precision experiments.
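The Cramér-Rao limit invoked above is the lower bound on the variance of a frequency estimate for a sampled sinusoid in white noise (Rife-Boorstyn form), and it is what gives the characteristic T^(-3/2) improvement of frequency sensitivity with measurement time before field drifts take over. A hedged sketch with illustrative parameters, not the experiment's values:

```python
import math

def crlb_freq_hz(amplitude, noise_rms, n_samples, dt):
    # Cramér-Rao lower bound on the standard deviation (in Hz) of the
    # frequency of a sinusoid in white Gaussian noise (Rife & Boorstyn):
    #   var(omega) >= 12 / (SNR * N * (N^2 - 1))   [omega in rad/sample]
    # with SNR = A^2 / (2 * sigma^2); converted to Hz via the sample time dt.
    snr = amplitude ** 2 / (2.0 * noise_rms ** 2)
    sigma_omega = math.sqrt(12.0 / (snr * n_samples * (n_samples ** 2 - 1)))
    return sigma_omega / (2.0 * math.pi * dt)

# Doubling the measurement time (N -> 2N at fixed sampling rate) improves
# the bound by ~2^(3/2) ≈ 2.83: the T^(-3/2) scaling of the text.
s1 = crlb_freq_hz(1.0, 0.1, 1000, 1e-3)
s2 = crlb_freq_hz(1.0, 0.1, 2000, 1e-3)
print(round(s1 / s2, 2))
```

This is also why the crossover near 500 s matters: beyond it the bound keeps falling, but the field stability, not the estimator, sets the achievable sensitivity.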
Abstract:
Today we know that ordinary matter represents only a small part of the total mass content of the Universe. The hypothesis of the existence of Dark Matter, a new kind of matter that interacts only gravitationally and, possibly, through the weak force, is supported by numerous pieces of evidence on both galactic and cosmological scales. Efforts in the search for so-called WIMPs (Weakly Interacting Massive Particles), the generic name given to Dark Matter particles, have multiplied in recent years. The XENON1T experiment, currently under construction at the Laboratori Nazionali del Gran Sasso (LNGS) and due to start data taking by the end of 2015, will mark a significant step forward in the direct search for Dark Matter, which is based on the detection of elastic collisions with target nuclei. XENON1T is the current phase of the XENON project, which has already operated the XENON10 (2005) and XENON100 (2008, still running) experiments and also foresees a further upgrade called XENONnT. The XENON1T detector uses about 3 tonnes of liquid xenon (LXe) and is based on a double-phase Time Projection Chamber (TPC). Detailed Monte Carlo simulations of the detector geometry, together with dedicated measurements of material radioactivity and estimates of the purity of the xenon used, have made it possible to predict the expected background accurately. In this thesis we present the study of the expected sensitivity of XENON1T, carried out with the statistical method known as the Profile Likelihood (PL) Ratio, which, within a frequentist approach, allows a proper treatment of systematic uncertainties. As a first step, the sensitivity was estimated using the simplified Likelihood Ratio method, which does not account for any systematics. This made it possible to assess the impact of the main systematic uncertainty for XENON1T, namely that on the scintillation light yield of xenon for low-energy nuclear recoils. The final results obtained with the PL method indicate that XENON1T will be able to significantly improve the current WIMP exclusion limits; the maximum sensitivity reaches a cross section σ = 1.2·10^-47 cm^2 for a WIMP mass of 50 GeV/c^2 and a nominal exposure of 2 tonne·years. These results are in line with XENON1T's ambitious goal of lowering the current limits on the WIMP cross section σ by two orders of magnitude. With such performance, and considering 1 tonne of LXe as fiducial mass, XENON1T will be able to surpass the current limits (LUX experiment, 2013) after only 5 days of data taking.
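The simplified Likelihood Ratio step can be illustrated on a single Poisson counting bin: compare L(s) to its maximum and exclude every signal s for which -2 ln λ(s) exceeds the one-sided 90% threshold of 2.706. This is a toy asymptotic sketch, not the analysis code of the thesis, which uses the full spectral likelihood:

```python
import math

def lr_upper_limit(n_obs, bkg, cl_threshold=2.706):
    # Asymptotic likelihood-ratio 90% CL upper limit on a Poisson signal s,
    # with known expected background bkg and n_obs observed events.
    def log_l(s):
        mu = s + bkg
        return (n_obs * math.log(mu) if n_obs > 0 else 0.0) - mu

    s_hat = max(0.0, n_obs - bkg)  # maximum-likelihood signal estimate (>= 0)
    lo, hi = s_hat, s_hat + 50.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if -2.0 * (log_l(mid) - log_l(s_hat)) < cl_threshold:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# With no events and no background, -2 ln lambda = 2s, so the asymptotic
# limit is 2.706 / 2 = 1.353 expected signal events.
print(round(lr_upper_limit(0, 0.0), 3))
```

The Profile Likelihood extension replaces the known `bkg` with nuisance parameters (here, chiefly the low-energy scintillation yield) that are maximized over at each tested s, which is what makes the frequentist treatment of systematics possible.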
Abstract:
The subject of this work was the study of Flow Assurance problems. In particular, it focuses on two aspects: i) a comparative evaluation of the different equations of state implemented in the multiphase simulator OLGA, in order to identify the one that yields the most conservative results; ii) the analysis of hydrate formation in systems characterized by the presence of gas and water. The first topic arises from the fact that, to guarantee flow continuity, the volumetric behavior of the fluid inside the pipelines must be known. For these studies, Flow Assurance relies on cubic equations of state. In particular, the following were compared: the Soave-Redlich-Kwong equation; the Peng-Robinson equation; the Peng-Robinson equation as modified by Peneloux. Four hydrocarbon fluids (two multiphase, one oil and one gas) with different compositions and phase conditions were analyzed. The variables considered were pressure, temperature, density and viscosity; the pressure drops, a fundamental parameter in the study of fluid transport, were then evaluated, showing that the Peng-Robinson equation is the most suitable for the thermodynamic characterization of the fluid during a design phase, as it provides the most conservative trend. After ascertaining the presence of hydrates in the multiphase fluids, the aim of the work was to analyze how the system responded to the addition of chemical inhibitors meant to move it out of the thermodynamic stability region of the hydrate. The inhibitors used were methanol and mono-ethylene glycol in aqueous solution. The analysis was carried out by comparing two methods: the analytical method of Hammerschmidt; an iterative method using PVTSim. The results showed that both inhibitors solve the hydrate formation problem by shifting the stability curve outside the range of pressures and temperatures encountered in the pipeline. In terms of the quantities to be injected, the Hammerschmidt method proves to be the more conservative, indicating higher injection rates than PVTSim, especially when methanol is added.
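Hammerschmidt's analytical method reduces to a single formula for the hydrate depression, ΔT = K·W / (M·(100 − W)), with W the inhibitor weight percent in the aqueous phase, M its molar mass and K an empirical constant (≈1297 for ΔT in °C). A sketch under these textbook assumptions, not the PVTSim setup of the thesis:

```python
def hammerschmidt_dt(wt_percent, molar_mass, k=1297.0):
    # Hydrate-formation temperature depression (deg C) produced by an
    # inhibitor at wt_percent in the aqueous phase; molar_mass in g/mol.
    return k * wt_percent / (molar_mass * (100.0 - wt_percent))

def required_wt_percent(delta_t, molar_mass, k=1297.0):
    # Inverse relation: weight percent needed for a target depression.
    return 100.0 * molar_mass * delta_t / (k + molar_mass * delta_t)

# 20 wt% methanol (M = 32 g/mol) gives ~10 deg C of depression; the same
# mass fraction of MEG (M = 62 g/mol) gives ~5 deg C, reflecting
# methanol's higher effectiveness per unit mass.
print(round(hammerschmidt_dt(20.0, 32.0), 1), round(hammerschmidt_dt(20.0, 62.0), 1))
```

The lower molar mass of methanol is the whole story in this correlation, which is also why the comparison with the iterative PVTSim calculation is most interesting for methanol dosing.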
Abstract:
The use of radial artery conduits in coronary artery bypass grafting (CABG) surgery is associated with improved long-term patency and patient survival rates compared with saphenous vein conduits. Despite their increasing popularity, the relative incidence of local harvest-site complications and the subjective perception of adverse long-term sequelae remain poorly described.
Abstract:
PURPOSE: The purpose of our study was to retrospectively evaluate the specificity, sensitivity and accuracy of computed tomography (CT), digital radiography (DR) and low-dose linear slit digital radiography (LSDR, Lodox(®)) in the detection of internal cocaine containers. METHODS: Institutional review board approval was obtained. The study consisted of 83 patients (76 males, 7 females, aged 16-45 years) suspected of having incorporated cocaine drug containers. All underwent radiological imaging; a total of 135 exams were performed: nCT=35, nDR=70, nLSDR=30. An overall calculation for all "drug mules" and a specific evaluation of body packers, pushers and stuffers were performed. The gold standard was stool examination in a dedicated holding cell equipped with a drug toilet. RESULTS: There were 54 drug mules identified in this study. CT of all drug carriers showed the highest diagnostic accuracy (97.1%), sensitivity (100%) and specificity (94.1%). DR in all cases was 71.4% accurate, 58.3% sensitive and 85.3% specific. LSDR of all patients with internal cocaine was 60% accurate, 57.9% sensitive and 63.4% specific. CONCLUSIONS: CT was the most accurate test studied. Therefore, the detection of internal cocaine drug packs should be performed by CT rather than by conventional X-ray, in order to apply the most sensitive exam in the medico-legal investigation of suspected drug carriers. Nevertheless, the higher radiation dose applied by CT compared with DR or LSDR needs to be considered. Future studies should evaluate low-dose CT protocols in order to address this trade-off and reduce the dose.
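The reported CT figures are mutually consistent with a 2x2 confusion matrix over the 35 CT exams; the short sketch below reproduces them from standard definitions. The counts are inferred from the published percentages for illustration, not taken from the paper:

```python
def diagnostic_metrics(tp, fp, tn, fn):
    # Standard definitions from the 2x2 confusion matrix.
    sensitivity = tp / (tp + fn)          # true positive rate
    specificity = tn / (tn + fp)          # true negative rate
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    return sensitivity, specificity, accuracy

# Counts consistent with the reported CT results over 35 exams:
# 18 true positives, 1 false positive, 16 true negatives, 0 false negatives
# give sensitivity 100%, specificity 94.1%, accuracy 97.1%.
sens, spec, acc = diagnostic_metrics(tp=18, fp=1, tn=16, fn=0)
print(f"sensitivity {sens:.1%}, specificity {spec:.1%}, accuracy {acc:.1%}")
```

Laying the numbers out this way makes the clinical point explicit: CT's zero false negatives (sensitivity 100%) is what matters most when a missed container can be fatal.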
Abstract:
Sphingosine kinases (SK) catalyse the formation of sphingosine 1-phosphate, which is a key lipid mediator regulating cell responses such as proliferation, survival and migration. Here we have investigated the effect of targeted inhibition of SK-1 on cell damage and elucidated the mechanisms involved.