945 results for Lakhi-Tor
Abstract:
For this work, six new benzodiazepine derivatives, TC07, TC08, TC09, TC10, TC11 and TC12, were synthesized. Their binding properties were examined by radioligand binding assays, both on membranes of rat cerebellum, hippocampus and cortex and on membranes of HEK293 cells transiently expressing recombinant GABAA receptors. In addition, competitive in situ receptor autoradiography was performed on rat brain sections with the ligands [3H]Ro15-4513 and [3H]Ro15-1788. Taken together, these experiments provided clear evidence that the compounds TC07, TC11 and TC12 are selective for GABAA receptors containing the α5 subunit, with α5 affinities in the low nanomolar range.

In vivo binding experiments in rats, with [3H]Ro15-1788 as tracer and TC07 as competitor, showed that TC07 displaces more [3H]Ro15-1788 in the forebrain than in the cerebellum. Taking into account the regional distribution of the α5 subunit of the GABAA receptor in the rat brain (very few α5 subunits in the cerebellum, about 20 % of the GABAA receptor subunits in the hippocampus), these results supported the assumption that TC07 may be α5-selective. These data also confirmed that TC07 can cross the blood-brain barrier.

For electrophysiological measurements with TC07 and TC12, the transiently transfected HEK293 cells mentioned above, expressing the GABAA receptor subunit combination α5β3γ2, were used. The dose-response behaviour showed no significant effect for TC12. The data for TC07, by contrast, indicate a weak negative modulatory effect, which, at least in theory, opens up the possibility of using TC07 as a so-called cognitive enhancer. The calculated Ki value was of the same order of magnitude as the Ki value derived from the binding assay data. Overall, the results obtained so far justify radiolabelling three of the six tested compounds with 18F, in the order TC07, TC12 and TC11.

Furthermore, [18F]MHMZ, a potentially 5-HT2A-selective ligand and PET tracer, including precursor and reference compounds, was synthesized in high yields (Herth, Debus et al. 2008). Autoradiography experiments with rat brain sections showed excellent in situ binding properties of the new compound. The data indicated high selectivity for 5-HT2A receptors combined with low non-specific binding. In vivo, [18F]MHMZ is rapidly metabolized, yielding a polar active metabolite that presumably cannot cross the blood-brain barrier.

Transversal, sagittal and coronal small-animal PET images of the rat brain showed high uptake in the frontal cortex and in the striatum, whereas virtually no uptake was detected in the cerebellum. This distribution matches the known distribution of 5-HT2A receptors. The in vivo uptake also appears to agree well with the distribution of the binding measured in the autoradiography experiments. According to calculations with the four-parameter reference tissue model, the binding potential (BP) for the frontal cortex is 1.45. The cortex-to-cerebellum ratio was determined to be 2.7 after 30 minutes of measurement time, which is remarkably close to the data published by Lundkvist et al. for [11C]MDL 100907.
Apart from the somewhat lower affinity, the measured in vitro, in situ and in vivo data were very similar to those of [3H]MDL 100907 and [11C]MDL 100907, so that we have an [18F] analogue at hand that combines the better selectivity of MDL 100907 compared with altanserin with the longer half-life and the more favourable properties of 18F for clinical routine compared with 11C. The results for [18F]MHMZ justify further experiments to make this ligand available for routine clinical use in humans.
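The abstract reports a Ki derived from the competition binding data; the Cheng-Prusoff relation is the standard way to obtain it, and is shown here only as a sketch of the calculation, since the abstract does not state which variant was actually applied:

```latex
% Cheng-Prusoff relation (standard formula; assumed here, not stated in the abstract):
% K_i   = inhibition constant of the competitor (e.g. TC07)
% IC_50 = competitor concentration displacing 50 % of the radioligand
% [L]   = free radioligand concentration, K_d = its dissociation constant
\[
  K_i \;=\; \frac{\mathrm{IC}_{50}}{1 + [L]/K_d}
\]
```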
Abstract:
In the field of organic optoelectronics, the nanoscale structure of the materials has a huge impact on device performance. Here, scanning force microscopy (SFM) techniques become increasingly important. In addition to topographic information, various surface properties can be recorded on a nanometer length scale, such as electrical conductivity (conductive scanning force microscopy, C-SFM) and surface potential (Kelvin probe force microscopy, KPFM).

In the context of this work, the electrical SFM modes were applied to study the interplay between morphology and electrical properties in hybrid optoelectronic structures developed in the group of Prof. J. Gutmann (MPI-P Mainz). In particular, I investigated the working principle of a novel integrated electron blocking layer system. A structure of electrically conducting pathways along crystalline TiO2 particles in an insulating matrix of a polymer-derived ceramic was found, and insulating defect structures could be identified. In order to gain insight into the internal structure of a device, I investigated a working hybrid solar cell by preparing a cross cut with focused ion beam polishing. With C-SFM, the functional layers could be identified and the charge transport properties of the novel active layer composite material could be studied.

In C-SFM, soft surfaces can be permanently damaged by (i) tip-induced forces, (ii) high electric fields and (iii) high current densities close to the SFM tip. Thus, an alternative operation based on torsion mode topography imaging in combination with current mapping was introduced. In torsion mode, the SFM tip vibrates laterally and in close proximity to the sample surface. Thus, an electrical contact between tip and sample can be established. In a series of reference experiments on standard surfaces, the working mechanism of scanning conductive torsion mode microscopy (SCTMM) was investigated. Moreover, I studied samples covered with free-standing semiconducting polymer nano-pillars that were developed in the group of Dr. P. Theato (University Mainz). The application of SCTMM allowed non-destructive imaging of the flexible surface at high resolution while measuring the conductance on individual pillars.

In order to study light-induced electrical effects on the level of single nanostructures, a new SFM setup was built. It is equipped with laser sample illumination and placed in an inert atmosphere. With this photoelectric SFM, I investigated the light-induced response in functionalized nanorods that were developed in the group of Prof. R. Zentel (University Mainz). A block copolymer containing an anchor block with a dye moiety and a semiconducting conjugated polymer moiety was synthesized and covalently bound to ZnO nanorods. This system forms an electron donor/acceptor interface and can thus be seen as a model system of a solar cell on the nanoscale. With a KPFM study on the illuminated samples, the light-induced charge separation between the nanorod and the polymeric corona could not only be visualized, but also quantified.

The results demonstrate that electrical scanning force microscopy can study fundamental processes in nanostructures and give invaluable feedback to synthetic chemists for the optimization of functional nanomaterials.
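As context for the KPFM measurements mentioned above: the surface potential recorded by KPFM is conventionally interpreted as the contact potential difference between tip and sample. The textbook relation below (not spelled out in the abstract, and stated up to a sign convention) is what links the measured potential to the local work function shift produced by light-induced charge separation:

```latex
% Contact potential difference in KPFM (standard relation, added for context only):
% \Phi_{tip}, \Phi_{sample} = work functions of tip and sample, e = elementary charge
\[
  V_{\mathrm{CPD}} \;=\; \frac{\Phi_{\mathrm{tip}} - \Phi_{\mathrm{sample}}}{e}
\]
```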
Abstract:
Bone metastases are responsible for different clinical complications, defined as skeletal-related events (SREs), such as pathologic fractures, spinal cord compression, hypercalcaemia, bone marrow infiltration and severe bone pain requiring palliative radiotherapy. The general aim of this three-year research period was to improve the management of patients with bone metastases through two different approaches of translational research. Firstly, in vitro preclinical tests were conducted on breast cancer cells and on an indirect co-culture of cancer cells and osteoclasts to evaluate bone-targeted therapy singly and in combination with conventional chemotherapy. The study suggests that zoledronic acid has antitumor activity in breast cancer cell lines. Its mechanism of action involves the decrease of RAS and RHO, as in osteoclasts. Repeated treatment enhances antitumor activity compared to non-repeated treatment. Furthermore, the combination of zoledronic acid + cisplatin induced high antitumor activity in the two triple-negative lines MDA-MB-231 and BRC-230. The p21, pMAPK and mTOR pathways were regulated by this combined treatment, particularly at lower cisplatin doses. A co-culture system to test the activity of bone-targeted molecules on monocytes conditioned by breast cancer cells was also developed. Another critical issue in the treatment of breast cancer patients is the selection of patients who will benefit from bone-targeted therapy in the adjuvant setting. A retrospective case-control study on breast cancer patients was performed to find new predictive markers of bone metastases in the primary tumors. Eight markers were evaluated, and TFF1 and CXCR4 were found to discriminate between patients who relapsed to bone and patients with no evidence of disease. In particular, TFF1 was the most accurate marker, reaching a sensitivity of 63% and a specificity of 79%. This marker could be a useful tool for clinicians to select patients who could benefit from bone-targeted therapy in the adjuvant setting.
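For reference, the reported sensitivity and specificity of TFF1 follow the usual definitions, added here only for clarity (TP, FP, TN, FN denote true/false positives/negatives with bone relapse as the positive class):

```latex
% Standard definitions behind the reported 63 % sensitivity and 79 % specificity of TFF1
\[
  \mathrm{sensitivity} = \frac{TP}{TP + FN}, \qquad
  \mathrm{specificity} = \frac{TN}{TN + FP}
\]
```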
Abstract:
In this work, a coarse-grained (CG) simulation model for peptides in aqueous solution is developed. In a CG approach, the number of degrees of freedom of the system is reduced so that larger systems can be studied on longer time scales. The interaction potentials of the CG model are constructed such that the peptide conformations of a higher-resolution (atomistic) model are reproduced.

This work examines the influence of different bonded interaction potentials in the CG simulation, in particular the extent to which the conformational equilibrium of the atomistic simulation can be reproduced. By construction, the CG procedure loses microscopic structural details of the peptide, for example correlations between degrees of freedom along the peptide chain. The thesis shows that these "lost" properties can be recovered by a backmapping procedure in which the atomistic degrees of freedom are reinserted into the CG structures. This works as long as the conformations of the CG model are fundamentally in good agreement with the atomistic level. The correlations mentioned above play a major role in the formation of secondary structure and are therefore of decisive importance for a realistic ensemble of peptide conformations. It is shown that special bonded interactions, such as 1-5 bond and 1,3,5 angle potentials, are required for good agreement between CG and atomistic chain conformations. The intramolecular parameters (i.e. bonds, angles, torsions) parametrized for short oligopeptides are transferable to longer peptide sequences. However, these bonded interactions can only be combined with the non-bonded interaction potentials that were used during the parametrization; they cannot, for example, simply be combined with a different water model. Since the energy landscape in CG simulations is smoother than in the atomistic model, the dynamics are accelerated. This acceleration differs between dynamical processes, for example between different types of motion (rotation and translation). This is an important aspect when studying the kinetics of structure formation processes, for example peptide aggregation.
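Structure-based CG potentials of this kind are commonly parametrized by Boltzmann inversion of the atomistic distributions. The abstract does not name the method used, so the relation below is given only as an illustrative sketch of how bonded CG potentials can be made to reproduce atomistic conformational statistics:

```latex
% Boltzmann inversion (illustrative assumption, not stated in the abstract):
% q = a bonded coordinate (bond length, angle or torsion),
% P_atomistic(q) = its distribution sampled in the atomistic reference simulation
\[
  V_{\mathrm{CG}}(q) \;=\; -\,k_B T \,\ln P_{\mathrm{atomistic}}(q) \;+\; \mathrm{const.}
\]
```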
Abstract:
Efficient coupling of light to quantum emitters, such as atoms, molecules or quantum dots, is one of the great challenges in current research. The interaction can be strongly enhanced by coupling the emitter to the evanescent field of subwavelength dielectric waveguides that offer strong lateral confinement of the guided light. In this context, subwavelength-diameter optical nanofibers as part of a tapered optical fiber (TOF) have proven to be a powerful tool, which also provides an efficient transfer of the light from the interaction region to an optical bus, that is to say, from the nanofiber to an optical fiber.

Another approach towards enhancing light-matter interaction is to employ an optical resonator in which the light is circulating and thus passes the emitters many times. Here, both approaches are combined by experimentally realizing a microresonator with an integrated nanofiber waist. This is achieved by building a fiber-integrated Fabry-Pérot type resonator from two fiber Bragg grating mirrors with a stop band near the cesium D2-line wavelength. The characteristics of this resonator fulfill the requirements of nonlinear optics, optical sensing, and cavity quantum electrodynamics in the strong-coupling regime. Together with its advantageous features, such as a constant high coupling strength over a large volume, tunability, high transmission outside the mirror stop band, and a monolithic design, this resonator is a promising tool for experiments with nanofiber-coupled atomic ensembles in the strong-coupling regime.

The resonator's high sensitivity to the optical properties of the nanofiber provides a probe for changes of physical parameters that affect the guided optical mode, e.g., the temperature via the thermo-optic effect of silica. Utilizing this detection scheme, the thermalization dynamics due to far-field heat radiation of a nanofiber is studied over a large temperature range. This investigation provides, for the first time, a measurement of the total radiated power of an object with a diameter smaller than all absorption lengths in the thermal spectrum, at the level of a single object of deterministic shape and material. The results show excellent agreement with an ab initio thermodynamic model that considers heat radiation as a volumetric effect and that takes the emitter shape and size relative to the emission wavelength into account. Modeling and investigating the thermalization of microscopic objects with arbitrary shape from first principles is of fundamental interest and has important applications, such as heat management in nano-devices or radiative forcing of aerosols in Earth's climate system.

Using a similar method, the effect of the TOF's mechanical modes on the polarization and phase of the fiber-guided light is studied. The measurement results show that in typical TOFs these quantities exhibit high-frequency thermal fluctuations. They originate from high-Q torsional oscillations that couple to the nanofiber-guided light via the strain-optic effect. An ab initio opto-mechanical model of the TOF is developed that provides an accurate quantitative prediction for the mode spectrum and the mechanically induced polarization and phase fluctuations. These high-frequency fluctuations may limit the ultimate ideality of fiber coupling into photonic structures. Furthermore, first estimations show that they may currently limit the storage time of nanofiber-based atom traps. The model, on the other hand, provides a method to design TOFs with tailored mechanical properties in order to meet experimental requirements.
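Two textbook relations characterize such a fiber Fabry-Pérot resonator and are added here only for orientation; the abstract quotes no numerical values for them. L is the mirror separation, n_eff the effective index of the guided mode, and Δν_FWHM the resonance linewidth:

```latex
% Standard Fabry-Perot relations (added for orientation, not quoted from the thesis):
\[
  \nu_{\mathrm{FSR}} \;=\; \frac{c}{2\, n_{\mathrm{eff}}\, L}, \qquad
  \mathcal{F} \;=\; \frac{\nu_{\mathrm{FSR}}}{\Delta\nu_{\mathrm{FWHM}}}
\]
```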
Abstract:
Cloud services are becoming ever more important for everyone's life. Cloud storage? Webmail? Yes, we don't need to be working in big IT companies to be surrounded by cloud services. Another thing that's growing in importance, or at least that should be considered ever more important, is the concept of privacy. The more we rely on services of which we know close to nothing, the more we should be worried about our privacy. In this work, I will analyze a prototype software based on a peer-to-peer architecture for offering cloud services, to see if it's possible to make it completely anonymous, meaning that not only the users using it will be anonymous, but also the peers composing it will not know each other's real identity. To make this possible, I will make use of anonymizing networks like Tor. I will start by studying the state of the art of cloud computing, looking at some real examples, followed by an analysis of the architecture of the prototype, trying to expose the differences between its distributed nature and the somewhat centralized solutions offered by the famous vendors. After that, I will get as deep as possible into the working principle of the anonymizing networks, because they are not something that can just be 'applied' mindlessly. Some de-anonymizing techniques are very subtle, so things must be studied carefully. I will then implement the required changes and test the new anonymized prototype to see how its performance differs from that of the standard one. The prototype will be run on many machines, orchestrated by a tester script that will automatically start and stop them and make all the required API calls. As to where to find all these machines, I will make use of Amazon EC2 cloud services and their on-demand instances.
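As an illustration of the kind of change involved, the sketch below routes an HTTP call through a local Tor SOCKS proxy. It is not the prototype's actual code: the endpoint URL is a hypothetical placeholder, and it assumes a Tor daemon listening on the default port 9050 plus the `requests[socks]` extra installed.

```python
# Minimal sketch: sending a prototype API call through Tor's SOCKS proxy.
# Assumptions: a local Tor daemon on 127.0.0.1:9050 and `pip install requests[socks]`.
# The URL used below is a hypothetical placeholder, not the prototype's real endpoint.
import requests

TOR_PROXY = {
    "http": "socks5h://127.0.0.1:9050",   # socks5h: resolve DNS through Tor as well
    "https": "socks5h://127.0.0.1:9050",
}

def call_peer_api(url: str, payload: dict) -> dict:
    """POST a JSON payload to a peer, with all traffic tunneled through Tor."""
    response = requests.post(url, json=payload, proxies=TOR_PROXY, timeout=60)
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    # Example call against a placeholder peer address (e.g. a .onion hidden service).
    result = call_peer_api("http://exampleonionaddress.onion/api/store", {"key": "value"})
    print(result)
```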
Abstract:
The mobilization of radioactive dust in the event of a loss-of-vacuum accident (LOVA) inside ITER (International Thermonuclear Experimental Reactor) is one of the safety problems raised during the construction of this reactor. The dust is generated by the continuous erosion of the containment material by the plasma, which leads to its accumulation inside the vacuum vessel. In the event of a LOVA, the release of this dust into the atmosphere represents a health risk for workers and the surrounding population. To collect data on this type of accident, a small facility, STARDUST, was built at the laboratory of the University of Tor Vergata, in which various experiments were carried out with initial conditions similar to those present inside ITER. One of these experiments in particular simulates the rupture of the vacuum vessel through the inflow of air into STARDUST, initially held at 100 Pa, with a pressurization rate of 300 Pa s−1. Inside the tank there are dusts which, in different proportions, are brought into suspension by the incoming fluid flow. In particular, the dusts consist of tungsten (W), stainless steel (SS-316) and carbon (C). The aim of the present work is to reproduce the velocity field generated inside STARDUST in the experiment just described and to evaluate the motion of the suspended particles and their subsequent deposition. This is done using a two-dimensional geometry created with Salome. On this geometry, different structured meshes are built depending on the type of simulation to be implemented. The mesh is then exported to the solver used, Code_Saturne. The simulations carried out can be divided into three main categories. In the first (Mesh A), an attempt was made to reproduce the numerical results presented by the authors of the experimental part, who used the commercial software Fluent. In the second, the velocity field inside STARDUST was reproduced on the basis of the available experimental data. Finally, in the last part of the simulations, the motion of the particles suspended in the fluid in STARDUST was reproduced and their deposition evaluated. The motion of the pressurizing fluid was treated as compressible. The flow regime was considered turbulent, given the high velocities that led to high Reynolds numbers for the air. The turbulence was treated with three different models. The velocity field obtained was slightly different in the three cases, and a comparison was made to evaluate the causes of these differences. The motion of the particles was treated using the Lagrangian particle tracking model implemented in Code_Saturne. Different simulations were carried out to take into account the various turbulence models adopted. Similarities and differences in the results obtained were then analyzed.
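The turbulence argument above rests on the Reynolds number of the inflowing air; the standard definition is added here only for reference, with no values taken from the thesis:

```latex
% Reynolds number of the inflowing air (standard definition, added for reference):
% \rho = air density, u = characteristic velocity, L = characteristic length,
% \mu = dynamic viscosity of air
\[
  \mathrm{Re} \;=\; \frac{\rho\, u\, L}{\mu}
\]
```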
Abstract:
Chapter 1: introduction on the creation of the Internet and its spread; Chapter 2: overview of the data available online and of the tools with which information can be extracted from them; Chapter 3: the concepts of privacy and anonymity applied to the Internet, some regulations, a summary of cookies and spyware; Chapter 4: the deep web, what it is and how to reach it; Chapter 5: the TOR project, a list of its components, an explanation of the protocol for creating anonymous connections, peculiarities and problematic aspects; Chapter 6: conclusions; an overview of projects related to TOR, statistics on the use of the anonymous Internet, considerations on the social effects of anonymity and on the inviolability of this system.
Abstract:
The aim of this work was to develop an app, in collaboration with the Università degli Studi di Roma Tor Vergata, to support potential buyers in assessing the eco-sustainability of the fish they purchase. More specifically, by eco-sustainability of a fish purchase we mainly mean two factors: the minimum length of the fish caught, and the attention paid to catching and buying fish in the right period of the year. With the first aspect, we want to draw attention to the fact that each fish must reach a certain minimum length before it may be caught and put on sale, while by the second factor we mean avoiding the fishing of certain species when they are in their reproductive season. The fundamental task of the app presented in this thesis is therefore to estimate the length of a purchased fish from a photograph of it taken with a smartphone and to verify whether it was sold in the right season, then not only informing the user accordingly but also saving and sending a report on the outcome of the operation after careful data collection. In the course of this document we will see all the development steps of this app and which aspects required particular attention in order to keep it both easy to use and quick yet efficient to implement.
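As a sketch of the two checks described above (not the app's actual implementation, and with a hypothetical species table and placeholder values), the length estimate reduces to a pixel-to-centimetre scale taken from a reference object of known size visible in the same photo:

```python
# Illustrative sketch only: not the app's real code.
# Assumptions: a reference object of known length appears in the same photo,
# and SPECIES_RULES is a hypothetical table of minimum lengths and closed months.
from datetime import date

SPECIES_RULES = {
    # species: (minimum legal length in cm, months in which buying should be avoided)
    "european hake": (20.0, {1, 2, 3}),   # hypothetical example values
    "red mullet": (11.0, {5, 6}),
}

def estimate_length_cm(fish_pixels: float, ref_pixels: float, ref_length_cm: float) -> float:
    """Convert the fish's length in pixels to centimetres via the reference object."""
    scale = ref_length_cm / ref_pixels    # centimetres per pixel
    return fish_pixels * scale

def check_purchase(species: str, fish_pixels: float, ref_pixels: float,
                   ref_length_cm: float, when: date) -> dict:
    """Apply the two eco-sustainability checks: minimum length and right season."""
    min_len, closed_months = SPECIES_RULES[species]
    length = estimate_length_cm(fish_pixels, ref_pixels, ref_length_cm)
    return {
        "estimated_length_cm": round(length, 1),
        "long_enough": length >= min_len,
        "right_season": when.month not in closed_months,
    }

if __name__ == "__main__":
    # Example: the reference object is the long side of a credit card (8.56 cm).
    print(check_purchase("european hake", fish_pixels=480, ref_pixels=120,
                         ref_length_cm=8.56, when=date(2014, 2, 10)))
```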
Abstract:
Simulation is defined as the representation of the behaviour of a system or process by means of the functioning of another or, alternatively, from the etymology of the verb "to simulate", as the reproduction of something fictitious and unreal as if it were real. Simulation allows us to model reality, to explore different solutions, to evaluate systems that cannot be built for various reasons, and moreover to carry out different evaluations that remain dynamic with respect to changing conditions. Simulation models can reach an extremely high degree of expressiveness, and a single computer will hardly be able to deliver the expected results in an acceptable time. A possible solution, given current technological trends, is to increase the computational capacity through a distributed architecture (exploiting, for example, the possibilities offered by cloud computing). This thesis will focus on this area, relating it to another topic that is gaining more and more relevance day by day: online anonymity. Recent news events have shown how a public network that is intrinsically insecure, such as today's Internet, is not suitable for preserving the confidentiality, integrity and, in some cases, availability of the assets we use: in the context of distributing computational resources that interact with each other, we cannot ignore the concrete and manifold risks; in some sensitive simulation contexts (e.g., military simulation, scientific research, etc.) we cannot afford the uncontrolled disclosure of our data or, even worse, the possibility of an attack on the availability of the resources involved. Being anonymous implies one extremely relevant aspect: being less attackable, insofar as one is not identifiable.
Abstract:
Aim: To analyze alcohol use, clinical data and laboratory parameters that may affect FIB-4, an index for measuring liver fibrosis, in HCV-monoinfected and HCV/HIV-coinfected drug users. Patients and Methods: Patients admitted for substance abuse treatment between 1994 and 2006 were studied. Socio-demographic data, alcohol and drug use characteristics and clinical variables were obtained through hospital records. Blood samples for biochemistry, liver function tests, CD4 cell count, and serology of HIV and HCV infection were collected at admission. Multivariate linear regression was used to analyze the predictors of FIB-4 increase. Results: A total of 472 (83% M, 17% F) patients were eligible. The median age at admission was 31 years (interquartile range (IQR) 27–35 years), and the median duration of drug use was 10 years (IQR 5.5–15 years). Unhealthy drinking (>50 grams/day) was reported in 32% of the patients. The FIB-4 scores were significantly greater in the HCV/HIV-coinfected patients (1.14, IQR 0.76–1.87) than in the HCV-monoinfected patients (0.75, IQR 0.56–1.11) (p<0.001). In the multivariate analysis, unhealthy drinking (p = 0.034), lower total cholesterol (p = 0.042), lower serum albumin (p<0.001), higher GGT (p<0.001) and a longer duration of addiction (p = 0.005) were independently associated with higher FIB-4 scores in the HCV-monoinfected drug users. The effect of unhealthy drinking on FIB-4 scores disappeared in the HCV/HIV-coinfected patients, whereas lower serum albumin (p<0.001), a lower CD4 cell count (p = 0.006), higher total bilirubin (p<0.001) and a longer drug addiction duration (p<0.001) were significantly associated with higher FIB-4 values. Conclusions: Unhealthy alcohol use in the HCV-monoinfected patients and HIV-related immunodeficiency in the HCV/HIV-coinfected patients are important risk factors associated with liver fibrosis in the respective populations.
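For reference, FIB-4 is conventionally computed from age, transaminases and platelet count; the standard definition (not restated in the abstract) is:

```latex
% Standard FIB-4 definition (added for reference; AST and ALT in U/L, platelets in 10^9/L)
\[
  \mathrm{FIB\text{-}4} \;=\; \frac{\text{age [years]} \times \mathrm{AST}}
                                   {\text{platelet count} \times \sqrt{\mathrm{ALT}}}
\]
```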