925 results for "Weighted histogram analysis method"


Relevance: 100.00%

Abstract:

Premise: In the literary works of our anthropological and cultural imagination, the various languages and discursive practices are not necessarily quoted, expressly alluded to or declared through clear expressive mechanisms; rather, they constitute a substratum, a background, now consolidated, which shines through the thematic and formal elements of each text with irony and intertextuality. The contaminations, hybridizations and promptings that we find in the expressive forms, the rhetorical procedures and the linguistic and thematic choices of post-modern literary texts take the shape of fluid and familiar categories. Exchanges and passages are no longer merely allowed but inevitable; the post-modern imagination is made up of an agglomeration of discourses that are no longer really separable, built up from texts that blend and quote one another, composing, each with its own specificities, the great family of the cultural products of our social scenario. A literary work, therefore, is not only a whole phenomenon, delimited hic et nunc by a beginning and an ending, but a fragment of the complex, dense and boundless network given by the continual interrelations between human forms of communication and symbolization.

The research hypothesis: A vision of comparative literature is delineated as a discipline attentive to the social contexts in which texts take shape and move, and to the media consistency that literary phenomena inevitably take on. Literature is thus seen as an open systematicity that chooses to be contaminated by other languages and other discursive practices of an imagination that is more polymorphic and irregular than ever. Within this interpretative framework, the aim is to focus the analysis on the relationship that post-modern literature establishes with advertising discourse. On one side, post-modern literature is inserted in the world of communication, loudly asserting the blending and reciprocal contamination of literary modes with media ones, absorbing their languages and signification practices and translating them now into thematic nuclei, motifs and sub-motifs, now into formal expedients and new narrative choices. On the other side, advertising is chosen as a signification practice of the media universe which, since the 1960s, has actively contributed to shaping the dynamics of our socio-cultural scenarios, in terms just as important as those of other discursive practices. Advertising has always been a form of communication and symbolization that draws on the collective imagination (myths, actors and values), turning it into specific narrative programs for its own texts. The aim is therefore to interpret and analyze this relationship both from a strictly thematic perspective (trying to understand what literature speaks about when it speaks about advertising, and seeking advertising quotations in post-modern fiction) and from a formal perspective, searching for parallels and discordances between the rhetorical procedures, the languages and the stylistic choices verifiable in the texts of the two signification practices. For the purpose of a constructive multiplication of perspectives, the chosen method of analysis approaches the analytical processes of semiotics, applying its instruments where possible in order to highlight the thematic and formal relationships between literature and advertising.

The corpus: The corpus of literary texts is made up of various novels and, although attention is focused on the post-modern period, there are also unavoidable references to essential authors whose works prompted various reflections: H. de Balzac, Zola, Fitzgerald, Joyce, Calvino, etc. The analysis, however, concentrates on three authors, Don DeLillo, Martin Amis and Aldo Nove, and in particular on the following novels: "Americana" (1971) and "Underworld" (1997) by Don DeLillo, "Money" (1984) by Martin Amis, and "Woobinda and Other Stories Without a Happy Ending" (1996) and "Superwoobinda" (1998) by Aldo Nove. The selection is restricted to these novels for two fundamental reasons: 1. assuming parameters of spatio-temporal evaluation, the texts are representative of different socio-cultural contexts and collective imaginations (from DeLillo's masterly glimpses of American life, to Nove's examples of contemporary Italian life, to Amis's English imagination) and of different historical moments (the 1970s of DeLillo's Americana, the 1980s of Amis, the 1990s of Nove, decades often used as criteria for dividing postmodernism into phases); 2. adopting a strictly thematic perspective, as mentioned in the research hypothesis, the variations and constants in the novels (thematic nuclei, topoi, images and narrative developments) frequently speak of advertising, and within the narrative plot they affirm its various expressions and realizations: in values, themes, texts, urban spaces, and so on. In these novels the themes and signification processes of advertising discourse pervade time, space and the relationships that the narrating character builds around himself. We are looking at "particle-characters" whose endless facets attest to the influence and contamination of advertising in a large part of the narrative developments of the plot: on everyday life, on the processes of acquiring and encoding reality, on ideological and cultural baggage, on relationships and exchanges with the other characters, and so on. Often the characters are victims of the implacable consequentiality of the advertising mechanism, which gets the upper hand over the usual processes of communication, overwhelming them wittingly or unwittingly (for example: disturbing openings in which the protagonist kills his or her parents on the basis of a TV commercial; former advertising men who live their lives encoding them through the commercial mechanisms of products; children of advertisers who, instead of playing outside, spent whole nights watching tapes of commercials). The analysis thus arises from the text and aims to show how far the developments and narrative plots of the novels encode, elaborate and recount the myths, values and narrative programs of advertising discourse, transforming them into novel components in their own right. Also starting from the text, a socio-cultural reference context is delineated, a collective imagination that differs now geographically, now historically; from the comparison between them the aim is to deduce the constants, similarities and variations in the relationship between literature and advertising.

Relevance: 100.00%

Abstract:

We present a study of the metal sites of different proteins through X-ray Absorption Fine Structure (XAFS) spectroscopy. First, the capabilities of XAFS analysis have been improved by ab initio simulation of the near-edge region of the spectra, and an original analysis method has been proposed. The method subsequently served as a tool to treat diverse biophysical problems, such as the inhibition of proton-translocating proteins by metal ions and the matrix effect exerted on photosynthetic proteins (the bacterial Reaction Center, RC) by strongly dehydrated sugar matrices. A time-resolved study of the Fe site of the RC with μs resolution has also been attempted. Finally, a further step aimed at improving the reliability of XAFS analysis has been taken by calculating the dynamical parameters of the metal-binding cluster by means of DFT methods; the theoretical results obtained for MbCO have been successfully compared with experimental data.
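While the thesis's own analysis method is not reproduced here, the quantity at the heart of any XAFS refinement is the standard single-scattering EXAFS signal. A minimal single-shell sketch in Python follows; the backscattering amplitude and phase are flat placeholders (real analyses tabulate them from ab initio codes), and all parameter values are invented:

```python
import numpy as np

def exafs_single_shell(k, N=4, S02=0.9, R=2.05, sigma2=0.005, lam=6.0):
    """Single-shell EXAFS model chi(k).

    N      : coordination number of the shell
    S02    : amplitude reduction factor
    R      : absorber-scatterer distance [A]
    sigma2 : Debye-Waller factor (mean-square relative displacement) [A^2]
    lam    : photoelectron mean free path [A]
    f(k) and phi(k) are flat placeholders; real analyses tabulate them
    from ab initio multiple-scattering codes.
    """
    f_k, phi_k = 1.0, 0.0
    return (N * S02 * f_k / (k * R**2)
            * np.exp(-2 * k**2 * sigma2)
            * np.exp(-2 * R / lam)
            * np.sin(2 * k * R + phi_k))

k = np.linspace(2.0, 14.0, 500)   # photoelectron wavenumber [1/A]
print(exafs_single_shell(k)[:5])
```

Fitting N, R and sigma2 of such a model to a measured chi(k) is what yields the coordination numbers and bond lengths of the metal site.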

Relevance: 100.00%

Abstract:

The goal of the NA48 experiment at CERN is the measurement of the direct-CP-violation parameter Re(ε'/ε) with an accuracy of 2×10^-4. Experimentally accessible is the double ratio R, formed from the decays of the K_L and K_S into two neutral and two charged pions, respectively. To good approximation, R = 1 − 6 Re(ε'/ε).
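For the reader's orientation, the double ratio R is conventionally built from the four decay rates as follows (standard definition, stated here for context; the approximation holds for |ε'/ε| ≪ 1):

```latex
R \;=\; \frac{\Gamma(K_L \to \pi^{0}\pi^{0})\,/\,\Gamma(K_S \to \pi^{0}\pi^{0})}
             {\Gamma(K_L \to \pi^{+}\pi^{-})\,/\,\Gamma(K_S \to \pi^{+}\pi^{-})}
  \;\approx\; 1 - 6\,\operatorname{Re}(\varepsilon'/\varepsilon)
```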
NA48 uses a weighting of the K_L events to reduce the sensitivity to the detector acceptance. As a cross-check of the standard analysis, an analysis without event weighting was carried out; its result is presented in this thesis. By dispensing with the event weighting, the statistical part of the total error can be reduced considerably. Since the limiting channel is the decay of the long-lived kaon into two neutral pions, using the full number of K_L decays is a worthwhile goal. In the course of this work, however, it turned out that the systematic error of the acceptance correction cancels this gain again.

The result of this work for the 1998 and 1999 data without event weighting is

Re(ε'/ε) = (17.91 ± 4.41 (syst.) ± 1.36 (stat.)) × 10^-4.

This clearly confirms the existence of direct CP violation. The result is compatible with the published NA48 result; the test of the existing NA48 analysis strategy has thus been carried out successfully.

Relevance: 100.00%

Abstract:

The research performed during the PhD and presented in this thesis allowed judgments to be made on the pushover analysis method and on its ability to capture the correct structural seismic response. The extensive critical review of existing pushover procedures (illustrated in Chapter 1) outlined the major issues related to the assumptions and hypotheses made in applying the method. Therefore, with the purpose of evaluating the effectiveness of pushover procedures, a wide numerical investigation has been performed, focusing in particular on structural irregularity in elevation, on the choice of the load vector and on the criteria for updating it. Eight pushover procedures have been considered, of which four are conventional, one is multi-modal and three are adaptive. Their effectiveness in identifying the correct dynamic structural response has been evaluated by performing several non-linear dynamic and static analyses on eight RC frames characterized by different properties in terms of regularity in elevation. The comparison of static and dynamic results then made it possible to assess the examined pushover procedures and to identify the margin of error to be expected from each of them. Both on the base shear-top displacement curves and on the storey parameters considered, the best agreement with the dynamic response was obtained by the Multi-Modal Pushover procedure. Attention was therefore focused on the Displacement-based Adaptive Pushover, for which an improvement strategy is defined, and on modal combination rules, for which an innovative method based on a quadratic combination of the modal shapes (QMC) is advanced. The latter has been implemented in a conventional pushover procedure, whose results have been compared with those obtained by other multi-modal procedures. The development of research on pushover analysis is very important because the objective is to arrive at a simple, effective and reliable analysis method, an indispensable tool in the seismic evaluation of new or existing structures.
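The QMC rule proposed in the thesis is its own contribution and is not reproduced here; as a generic illustration of how a multi-modal pushover load vector can be assembled, the sketch below combines modal storey forces with the classical SRSS rule (a quadratic combination of modal responses). Masses, mode shapes and spectral ordinates are invented example values:

```python
import numpy as np

# Made-up 4-storey example: lumped masses [t] and two mode shapes
m = np.array([50.0, 50.0, 50.0, 40.0])
phi = np.array([
    [0.25, 0.55, 0.80, 1.00],    # mode 1 (normalized to roof)
    [-0.60, -0.70, 0.10, 1.00],  # mode 2
])
Sa = np.array([0.8, 1.2])        # spectral accelerations per mode [g]

def modal_storey_forces(m, phi, Sa):
    """Storey force pattern F_ji = Gamma_j * m_i * phi_ji * Sa_j per mode."""
    M = np.diag(m)
    forces = []
    for j in range(phi.shape[0]):
        gamma = (phi[j] @ m) / (phi[j] @ M @ phi[j])  # participation factor
        forces.append(gamma * m * phi[j] * Sa[j])
    return np.array(forces)

F_modes = modal_storey_forces(m, phi, Sa)
F_srss = np.sqrt((F_modes**2).sum(axis=0))  # SRSS combination of modal forces
F_load = F_srss / F_srss.sum()              # normalized pushover load vector
print(F_load)
```

Note that SRSS discards the signs of the higher-mode forces, one of the recognized limitations that motivates alternative combination rules.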

Relevance: 100.00%

Abstract:

This thesis describes, among other things, the realization of an assay of microstructured and selectively functionalized artificial membrane segments on a chip. The structuring method combines the soft-lithographic technique of micromolding in capillaries with the vesicle-spreading technique and offers an elegant way to generate individually addressable lipid segments in the micrometer regime. Taking into account the hydrodynamic flow behavior and the stability criteria for PDMS elastomers, new structures were also developed that are optimized for the combined use of atomic force microscopy and fluorescence microscopy. The applicability of the lab-on-a-chip device as a biosensor was demonstrated by fluorescence and atomic force microscopy in two prominent protein-receptor binding studies. In the second part of the thesis, the mechanical and adhesive properties of selected lipid systems were investigated with a new characterization technique that quantitatively captures the contact mechanics of scanning probes and lipid membranes on the basis of digitized high-speed force curves and an automated multi-parameter analysis. This made it possible to analyze, on systems with varying head groups and chain lengths, the correlation between adhesion and the material-specific breakthrough lengths and breakthrough forces, which represent characteristic stability parameters of the lipid membrane. The procedure also allowed the simultaneous quantification of the elastic properties of the lipid bilayers. Simulations of the system response to the force curves were carried out, enabling a deeper understanding of the contrast formation.
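As an illustration of the kind of automated force-curve analysis described above (a toy sketch, not the thesis's implementation), the following locates the membrane breakthrough event in a simulated approach curve as the largest single-step force drop; the curve model and all numbers are invented:

```python
import numpy as np

def find_breakthrough(z, force):
    """Locate a membrane breakthrough in an approach force curve.

    z     : tip-sample distance [nm], decreasing along the approach
    force : measured force [nN]
    Returns (z at breakthrough, force just before the drop), defined
    here simply as the largest single-step force drop.
    """
    dF = np.diff(force)
    i = np.argmin(dF)              # most negative step = force drop
    return z[i], force[i]

# Simulated approach curve: elastic loading, bilayer rupture at z = 2 nm
z = np.linspace(10, 0, 1000)
force = np.where(z > 2, 0.05 * (10 - z), 0.16 + 0.8 * (2 - z))
force = force - 0.12 * (z <= 2)    # sudden drop when the bilayer ruptures
rng = np.random.default_rng(0)
force += rng.normal(0, 0.005, z.size)

zb, fb = find_breakthrough(z, force)
print(f"breakthrough at z = {zb:.2f} nm, force = {fb:.2f} nN")
```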

Relevance: 100.00%

Abstract:

This research has focused on the study of the behavior and collapse of masonry arch bridges. Recent decades have seen an increasing interest in this structural type, which is still present and in use despite the passage of time and the changes in means of transport. Several strategies have been developed over time to simulate the response of this type of structure, although even today there is no generally accepted standard for the assessment of masonry arch bridges. The aim of this thesis is to compare the principal analytical and numerical methods existing in the literature on case studies, trying to highlight strengths and weaknesses. The methods examined are mainly three: i) the Thrust Line Analysis Method; ii) the Mechanism Method; iii) the Finite Element Method. The Thrust Line Analysis Method and the Mechanism Method are analytical methods derived from two of the fundamental theorems of plastic analysis, while the Finite Element Method is a numerical method that uses different discretization strategies to analyze the structure. Every method is applied to the case studies through computer-based implementations that allow a user-friendly application of the principles explained. A particular closed-form approach based on an elasto-plastic material model and developed by Belgian researchers is also studied. To compare the three methods, two different case studies have been analyzed: i) a generic single-span masonry arch bridge; ii) a real masonry arch bridge, the Clemente Bridge, built over the Savio River in Cesena. In all the analyses the models are two-dimensional, so that the results of the different methods remain comparable. The methods have been compared with each other in terms of collapse load and hinge positions.
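For illustration, the line of thrust invoked by the first method can be computed with the classical funicular construction: its rise above the line joining the springings equals M(x)/H, where M(x) is the bending moment of an equivalent simply supported beam and H the horizontal thrust. A minimal sketch with invented geometry and loads (not the thesis's code):

```python
import numpy as np

def thrust_line(x_loads, W, span, H):
    """Line of thrust for vertical point loads on an arch of given span.

    Classic construction: the rise of the thrust line above the line
    joining the springings equals M(x)/H, with M(x) the bending moment
    of an equivalent simply supported beam and H the horizontal thrust.
    """
    x = np.linspace(0.0, span, 401)
    Rb = np.sum(W * x_loads) / span          # right support reaction
    Ra = np.sum(W) - Rb                      # left support reaction
    M = Ra * x - sum(w * np.clip(x - xl, 0, None)
                     for w, xl in zip(W, x_loads))
    return x, M / H

# Made-up example: 10 m span, three point loads, trial thrust H = 100 kN
x_loads = np.array([2.5, 5.0, 7.5])          # load positions [m]
W = np.array([40.0, 60.0, 40.0])             # load magnitudes [kN]
x, y = thrust_line(x_loads, W, span=10.0, H=100.0)

# Parabolic arch axis, 2.5 m rise, 0.5 m thickness (made-up geometry);
# the vertical offset is used as a simple proxy for the normal distance.
rise, t = 2.5, 0.5
axis = 4 * rise * x * (10.0 - x) / 10.0**2
inside = np.all(np.abs(y - axis) <= t / 2)
print("thrust line stays within the arch thickness:", inside)
```

By the safe theorem of plastic analysis, finding any H for which the line stays inside the thickness shows the arch can carry the loads; collapse corresponds to no such line existing.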

Relevance: 100.00%

Abstract:

This thesis is focused on the use of selected core-level X-ray spectroscopies to study semiconductor materials of great technological interest and on the development of a new implementation of appearance potential spectroscopy. Core-level spectroscopies can be exploited to study these materials with a local approach, since they are sensitive to the electronic structure localized on a given chemical species present in the sample. This approach provides important micro-structural information that is difficult to obtain with techniques sensitive only to the average properties of materials. In this work we present a novel approach to the study of semiconductors with core-level spectroscopies, based on an original analysis procedure that leads to an insightful understanding of the correlation between the local micro-structure and the observed spectral features. In particular, we studied the micro-structure of hydrogen-induced defects in nitride semiconductors, since the analysed materials show substantial variations of their optical and electronic properties as a consequence of H incorporation. Finally, we present a novel implementation of soft X-ray appearance potential spectroscopy, a core-level spectroscopy that uses electrons as the excitation source and has the great advantage of being an in-house technique. The original set-up illustrated was designed to reach a high signal-to-noise ratio, allowing the acquisition of good-quality spectra that can then be analyzed in the framework of real-space full multiple scattering theory. This technique had never been coupled with this analysis approach before; our work therefore unites a novel implementation with an original data analysis method, enlarging the field of application of the technique.
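As a generic illustration of the signal-to-noise consideration mentioned above (unrelated to the actual spectrometer electronics; the signal shape and noise level are invented), averaging N repeated sweeps improves the SNR by roughly √N:

```python
import numpy as np

rng = np.random.default_rng(1)
E = np.linspace(390, 410, 400)                  # energy axis [eV], arbitrary
signal = np.exp(-0.5 * ((E - 400) / 1.5) ** 2)  # idealized spectral feature

def snr(n_sweeps, noise_rms=2.0):
    """Empirical SNR of the peak after averaging n_sweeps noisy sweeps."""
    avg = np.mean([signal + rng.normal(0, noise_rms, E.size)
                   for _ in range(n_sweeps)], axis=0)
    baseline = avg[E < 394]                     # region without the feature
    return (avg.max() - baseline.mean()) / baseline.std()

for n in (1, 16, 256):
    print(n, round(snr(n), 2))   # SNR grows roughly as sqrt(n)
```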

Relevance: 100.00%

Abstract:

In July 2009 an experiment was carried out for the first time at the Mainz Microtron (MAMI) in which a polarized 3He target was probed with photons in the energy range from 200 to 800 MeV. The goal of this experiment was a test of the Gerasimov-Drell-Hearn (GDH) sum rule on the neutron. Owing to the spin structure of 3He, the data taken with the polarized 3He target give, compared with the existing deuteron data, a complementary and more direct access to the neutron. The measurement of the total helicity-dependent photoabsorption cross section was performed with an energy-tagged beam of circularly polarized photons impinging on the longitudinally polarized 3He target. The product detectors were the Crystal Ball (4π solid-angle coverage), TAPS (as a forward wall) and a threshold Cherenkov detector (online veto for the reduction of electromagnetic events). The planning and construction of the various components of the 3He experimental setup was a decisive part of this dissertation and is described in detail in this thesis. The detector system and the analysis methods were tested by measuring the unpolarized, total and inclusive photoabsorption cross section on liquid hydrogen; the results showed good agreement with previously published data. Preliminary results for the unpolarized total photoabsorption cross section and for the helicity-dependent difference of the photoabsorption cross sections on 3He are presented in comparison with different theoretical models.
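For orientation, the sum rule being tested has the standard textbook form (quoted here for context, not from the thesis): σ_3/2 and σ_1/2 are the total photoabsorption cross sections for photon and target spins parallel and antiparallel, ν₀ is the pion-production threshold, κ the anomalous magnetic moment and m the nucleon mass,

```latex
\int_{\nu_0}^{\infty}\frac{\sigma_{3/2}(\nu)-\sigma_{1/2}(\nu)}{\nu}\,\mathrm{d}\nu
  \;=\; \frac{2\pi^{2}\alpha}{m^{2}}\,\kappa^{2}
```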

Relevance: 100.00%

Abstract:

The atmosphere globally mediates the movement of heat and humidity between the continents and thus significantly affects climate variability. Information about atmospheric circulation is of major importance for the understanding of different climatic conditions. Dust deposits from maar lakes and dry maars of the Eifel Volcanic Field (Germany) are therefore used as proxy data for the reconstruction of past aeolian dynamics.

In this thesis two sediment cores from the Eifel region are examined: core SM3 from Lake Schalkenmehren and core DE3 from the Dehner dry maar. Both cores contain the tephra of the Laacher See eruption, dated to 12,900 years before present. Taken together the cores cover the last 60,000 years: SM3 the Holocene, and DE3 the marine isotope stages MIS-3 and MIS-2, respectively. The frequencies of glacial dust storm events and their paleo wind directions are detected by high-resolution grain-size and provenance analysis of the lake sediments. Two different methods are applied: geochemical measurements of the sediment using µXRF scanning, and the particle analysis method RADIUS (rapid particle analysis of digital images by ultra-high-resolution scanning of thin sections).

It is shown that single dust layers in the lake sediment are characterized by an increased content of aeolian-transported carbonate particles. The limestone-bearing Eifel North-South zone is the most likely source of the carbonate-rich aeolian dust in the lake sediments of the Dehner dry maar. The dry maar lies on the western side of the Eifel North-South zone, so carbonate-rich aeolian sediment is most likely transported towards the Dehner dry maar by easterly winds. A methodology is developed which limits the detection to the aeolian-transported carbonate particles in the sediment: the RADIUS-carbonate module.

In summary, during marine isotope stage MIS-3 both the storm frequency and the east wind frequency are increased in comparison to MIS-2. These results suggest that the atmospheric circulation during MIS-3 was affected by more turbulent conditions than the more stable circulation during the full glacial conditions of MIS-2.

The results of the investigations of the dust records are finally evaluated against a study of atmospheric general circulation models for a comprehensive interpretation. Here, AGCM experiments (ECHAM3 and ECHAM4) with different prescribed SST patterns are used to develop a synoptic interpretation of long-persisting east wind conditions and of east wind storm events, which are suggested to lead to an enhanced accumulation of sediment transported by easterly winds to the proxy site of the Dehner dry maar.

The basic observations made on the proxy record are also reflected in the 10 m wind vectors of the different model experiments under glacial conditions with different prescribed sea-surface temperature patterns. Furthermore, the analysis of long-persisting east wind conditions in the AGCM data shows a stronger seasonality under glacial conditions: all experiments are characterized by an increase in the relative importance of the LEWIC during spring and summer.
The different glacial experiments consistently show a shift of a long-lasting high from over the Baltic Sea towards the northwest, directly above the Scandinavian Ice Sheet, together with a contemporaneously enhanced westerly circulation over the North Atlantic.

This thesis is a comprehensive analysis of atmospheric circulation patterns during the last glacial period, and it has been possible to reconstruct important elements of the glacial paleoclimate in Central Europe. While the proxy data from the sediment cores yield only a binary signal of wind-direction changes (east versus west), the synoptic interpretation using atmospheric circulation models succeeds in showing a possible distribution of high- and low-pressure areas, and thus the direction and strength of the wind fields capable of transporting dust. In conclusion, the combination of numerical models, which enhance the understanding of processes in the climate system, with proxy data from the environmental record is the key to a comprehensive approach to paleoclimatic reconstruction.
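The RADIUS-carbonate module itself is not public code, but the core of any such particle-analysis step, detecting and sizing bright (e.g. carbonate) grains in a digitized thin-section image, can be sketched with thresholding and connected-component labelling (illustrative only; assumes scipy; synthetic test image):

```python
import numpy as np
from scipy import ndimage

def count_grains(image, threshold, min_pixels=5):
    """Count bright grains in a 2-D grayscale thin-section image.

    Pixels above `threshold` are treated as grain material; connected
    components smaller than `min_pixels` are discarded as noise.
    Returns the grain count and equivalent circular diameters [pixels].
    """
    mask = image > threshold
    labels, n = ndimage.label(mask)
    sizes = ndimage.sum(mask, labels, index=np.arange(1, n + 1))
    sizes = sizes[sizes >= min_pixels]
    diameters = 2 * np.sqrt(sizes / np.pi)   # equivalent circular diameter
    return len(sizes), diameters

# Synthetic test image: dark matrix with three bright grains
rng = np.random.default_rng(42)
img = rng.normal(0.2, 0.05, (256, 256))
for cy, cx, r in [(60, 60, 4), (128, 200, 7), (200, 90, 3)]:
    y, x = np.ogrid[:256, :256]
    img[(y - cy) ** 2 + (x - cx) ** 2 <= r ** 2] = 0.9

n, d = count_grains(img, threshold=0.6)
print(n, np.round(d, 1))
```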

Relevance: 100.00%

Abstract:

This thesis describes the development of new models and toolkits for orbit determination codes, to support and improve the precise radio tracking experiments of the Cassini-Huygens mission, an interplanetary mission to study the Saturn system. The core of the orbit determination process is the comparison between observed and computed observables; disturbances in either degrade the orbit determination. Chapter 2 describes a detailed study of the numerical errors in the Doppler observables computed by NASA's ODP and MONTE, and by ESA's AMFIN. A mathematical model of the numerical noise was developed and successfully validated against the Doppler observables computed by the ODP and MONTE, with typical relative errors smaller than 10%. The numerical noise proved to be, in general, an important source of noise in the orbit determination process and, in some conditions, may become the dominant noise source. Three different approaches to reducing the numerical noise are proposed. Chapter 3 describes the development of the multiarc library, which allows a multi-arc orbit determination to be performed with MONTE. The library was developed during the analysis of the Cassini radio science gravity experiments on Saturn's satellite Rhea. Chapter 4 presents the estimation of Rhea's gravity field obtained from a joint multi-arc analysis of the Cassini R1 and R4 fly-bys, describing in detail the spacecraft dynamical model used, the data selection and calibration procedure, and the analysis method followed. In particular, the full unconstrained quadrupole gravity field was estimated, giving a solution statistically incompatible with the condition of hydrostatic equilibrium. The solution proved to be stable and reliable. The normalized moment of inertia is in the range 0.37-0.4, indicating that Rhea may be almost homogeneous, or at least characterized by a small degree of differentiation.
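The numerical-noise issue can be made concrete in a few lines: a Doppler observable computed in double precision by differencing ranges inherits round-off of order the machine epsilon times the absolute range, which at interplanetary distances is not negligible. An illustrative sketch (not the ODP/MONTE formulation; all values invented):

```python
import numpy as np

AU = 1.495978707e11          # astronomical unit [m]

# Spacecraft at ~9 AU receding at 20 km/s; range sampled every 60 s
t = np.arange(0.0, 3600.0, 60.0)
rho = 9.0 * AU + 20_000.0 * t          # range [m], stored in float64

# Differenced-range Doppler: mean range rate over the count interval
range_rate = np.diff(rho) / np.diff(t)

# Round-off floor: eps * range / count time, in velocity units
eps_floor = np.finfo(float).eps * rho.max() / 60.0
print("computed rate [m/s]:", range_rate[:3])
print("round-off floor    :", eps_floor, "m/s")   # ~ a few 1e-6 m/s
```

A floor of a few 1e-6 m/s is comparable to the accuracy of high-quality Doppler tracking, which is why round-off in the computed observables can matter for the fit.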

Relevance: 100.00%

Abstract:

The interplay of hydrodynamic and electrostatic forces is of great importance for the understanding of colloidal dispersions. Theoretical descriptions are often based on the so-called standard electrokinetic model. This mean-field approach combines the Stokes equation for the hydrodynamic flow field, the Poisson equation for the electrostatics and a continuity equation describing the evolution of the ion concentration fields. In the first part of this thesis a new lattice method is presented to efficiently solve this set of non-linear equations for a charge-stabilized colloidal dispersion in the presence of an external electric field. Within this framework, the research is mainly focused on the calculation of the electrophoretic mobility. Since this transport coefficient is independent of the electric field only for small driving, the algorithm is based on a linearization of the governing equations: the zeroth order is the well-known Poisson-Boltzmann theory, and the first order is a coupled set of linear equations. This set of equations is further divided into several subproblems; a specialized solver is developed for each subproblem, and various tests and applications are discussed for every particular method. Finally, all solvers are combined in an iterative procedure and applied to several interesting questions, for example the effect of the screening mechanism on the electrophoretic mobility, or the charge dependence of the field-induced dipole moment and of the ion clouds surrounding a weakly charged sphere. In the second part a quantitative data analysis method is developed for a new experimental approach known as Total Internal Reflection Fluorescence Cross-Correlation Spectroscopy (TIR-FCCS). The TIR-FCCS setup is an optical method that uses fluorescent colloidal particles to analyze the flow field close to a solid-fluid interface. The interpretation of the experimental results requires a theoretical model, usually the solution of a convection-diffusion equation. Since an analytic solution is not available, owing to the form of the flow field and the boundary conditions, an alternative numerical approach is presented, based on stochastic methods, i.e. a combination of a Brownian dynamics algorithm and Monte Carlo techniques. Finally, experimental measurements for a hydrophilic surface are analyzed using this new numerical approach.
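The stochastic approach mentioned last is commonly realized with an Euler-Maruyama Brownian dynamics scheme; a minimal sketch for tracers advected by a linear shear flow above a reflecting wall (parameters are arbitrary and not those of the cited experiment):

```python
import numpy as np

rng = np.random.default_rng(7)

D = 2.0e-12          # tracer diffusion coefficient [m^2/s]
gamma_dot = 100.0    # shear rate of the near-wall flow [1/s]
dt = 1.0e-5          # time step [s]
n_steps = 10_000
n_part = 1_000

# start particles 100 nm above the wall (z = 0)
x = np.zeros(n_part)
z = np.full(n_part, 100e-9)

sigma = np.sqrt(2 * D * dt)       # rms Brownian step per coordinate
for _ in range(n_steps):
    # streamwise: advection by u(z) = gamma_dot * z, plus diffusion
    x += gamma_dot * z * dt + sigma * rng.normal(size=n_part)
    z += sigma * rng.normal(size=n_part)
    z = np.abs(z)                  # reflecting boundary at the wall

print("mean streamwise drift [m]:", x.mean())
```

From many such trajectories one can build the correlation functions to be compared with the measured TIR-FCCS curves, which is the role the stochastic model plays in the analysis described above.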

Relevance: 100.00%

Abstract:

Nanoparticulate drug delivery systems hold great potential for therapeutic applications. In this work, several fundamental aspects required for an extended biological understanding and for the development of further targeted strategies for pharmacotherapy with nanoparticles and nanocapsules were investigated. Cellular uptake experiments (in vitro and ex vivo) were carried out with various nanoparticles and nanocapsules made from diverse monomers and biocompatible macromolecules in immortalized cell lines, human mesenchymal stem cells and leukocytes, and were analyzed by flow cytometry and confocal laser scanning microscopy. The influence of the surface functionalization of the nanoparticulate systems, their toxicological effects and the influence of bovine serum albumin adsorbed onto functionalized polystyrene nanoparticles were examined with respect to cellular uptake. To study the multiple interactions of nanoparticles with components of human peripheral whole blood, a flow-cytometric assay in anticoagulated peripheral whole blood (ex vivo) was successfully developed. It could be shown that calcium-complexing anticoagulants lead to a reduction, and Li-heparin to an enhancement, of the cellular uptake of functionalized polystyrene nanoparticles in various leukocytes. For folic-acid-coupled hydroxyethyl starch nanocapsules (synthesized by Dr. Grit Baier), a size-dependent, selective, folate receptor α mediated cellular uptake pathway into HeLa cells was demonstrated. Hydrolyzable, non-cytotoxic polyester nanoparticles made of poly(5,6-benzo-2-methylene-1,3-dioxepane) (synthesized by Dr. Jörg Max Siebert) with embedded paclitaxel showed a pharmacological effect in HeLa cells comparable to that of commercially available paclitaxel formulations. The nanoparticles and nanocapsules used in this work have broad potential as drug delivery systems. It became evident that differences in size, size distribution, polymer and surface functionalization of the nanoparticles lead to significant differences in cellular uptake in various cell culture lines (in vitro) and in leukocytes in peripheral whole blood (ex vivo).

Relevance: 100.00%

Abstract:

This work presents the most precise and first direct high-precision measurement of the g-factor of a single proton. The measurement is based on the non-destructive determination of the cyclotron frequency and the Larmor frequency of a single proton stored in a Penning trap. To determine the Larmor frequency, the spin-flip probability is recorded as a function of an external spin-flip drive. For this purpose the continuous Stern-Gerlach effect is used, which couples the spin moment to the axial motion of the proton, so that a spin flip shows up as a jump in the axial oscillation frequency. The difficulty is to detect this frequency jump against a background of axial frequency fluctuations. To meet this challenge, novel methods and techniques were applied. On the one hand, superconducting detection circuits of the highest sensitivity were developed, allowing fast and thus precise frequency measurements. On the other hand, a spin-flip analysis method based on the statistical Bayes theorem was applied. With these improvements it was possible to observe individual spin flips of a single proton. This in turn enabled the application of the so-called double-trap method, and thus the above-mentioned measurement of the g-factor with a precision of 4.3 × 10^-9.
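A minimal sketch of Bayes-theorem spin-flip detection in the spirit described above (the real analysis is more elaborate; the fluctuation model and all numbers here are invented): an observed axial frequency jump is weighed against a Gaussian fluctuation background of width σ and a spin-flip signature of size δ of either sign:

```python
import numpy as np

def spin_flip_posterior(dnu, delta=0.2, sigma=0.1, prior=0.5):
    """Posterior probability that an axial frequency jump dnu [Hz]
    was caused by a spin flip.

    delta : expected |frequency jump| caused by a spin flip [Hz]
    sigma : rms of the background axial frequency fluctuations [Hz]
    prior : prior spin-flip probability of the drive pulse
    A spin flip shifts the frequency by +/-delta with equal probability.
    """
    def gauss(x, mu):
        return np.exp(-0.5 * ((x - mu) / sigma) ** 2)

    L_flip = 0.5 * (gauss(dnu, +delta) + gauss(dnu, -delta))
    L_noflip = gauss(dnu, 0.0)
    return prior * L_flip / (prior * L_flip + (1 - prior) * L_noflip)

for dnu in (0.02, 0.15, 0.25):
    print(f"jump {dnu:.2f} Hz -> P(spin flip) = {spin_flip_posterior(dnu):.2f}")
```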

Relevance: 100.00%

Abstract:

In recent years several previously unknown phenomena have been observed experimentally, such as the existence of distinct pre-nucleation structures. These have contributed to a new understanding of the processes occurring on the molecular level during the nucleation and growth of crystals. The effects of such pre-nucleation structures on the process of biomineralization are not yet sufficiently understood, and the mechanisms by which biomolecular modifiers such as peptides may interact with pre-nucleation structures and thereby influence the nucleation process of minerals are manifold. Molecular simulations are well suited to analyzing the formation of pre-nucleation structures in the presence of modifiers. This work describes an approach to analyzing the interaction of peptides with the dissolved constituents of the emerging crystals by means of molecular dynamics simulations.

To enable informative simulations, the quality of existing force fields with respect to the description of oligoglutamates interacting with calcium ions in aqueous solution was examined in a first step. Large discrepancies between established force fields emerged, and none of the force fields examined gave a realistic description of the ion pairing of these complex ions. A strategy for optimizing existing biomolecular force fields in this respect was therefore developed; relatively small changes of the parameters governing the ion-peptide van der Waals interactions sufficed to obtain a reliable model for the system under study.

Comprehensive sampling of the phase space of these systems poses a particular challenge owing to the numerous degrees of freedom and the strong interactions between calcium ions and glutamate in solution. The method of biasing potential replica exchange molecular dynamics was therefore adapted to the sampling of oligoglutamates, and peptides of different chain lengths were simulated in the presence of calcium ions. Using sketch-map analysis, numerous stable ion-peptide complexes that could influence the formation of pre-nucleation structures were identified in the simulations. Depending on the chain length of the peptide, these complexes exhibit characteristic distances between the calcium ions, which resemble some of the calcium-calcium distances in those phases of calcium oxalate crystals grown in the presence of oligoglutamates. This analogy between the calcium-ion distances in dissolved ion-peptide complexes and in calcium oxalate crystals may point to the importance of ion-peptide complexes in the nucleation and growth of biominerals, and offers a possible explanation for the experimentally observed ability of oligoglutamates to influence the phase of the forming crystal.
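The replica-exchange machinery referred to above rests on a Metropolis swap criterion; a minimal temperature-exchange sketch (textbook material; the thesis uses a biasing-potential variant whose exchange weights differ, and the energies below are invented):

```python
import numpy as np

kB = 0.0083145  # Boltzmann constant [kJ/(mol K)]
rng = np.random.default_rng(3)

def attempt_swap(E_i, E_j, T_i, T_j):
    """Metropolis criterion for swapping configurations between two
    replicas at temperatures T_i, T_j with potential energies E_i, E_j.

    Accept with probability min(1, exp(Delta)),
    Delta = (1/(kB*T_i) - 1/(kB*T_j)) * (E_i - E_j).
    """
    delta = (1.0 / (kB * T_i) - 1.0 / (kB * T_j)) * (E_i - E_j)
    return rng.random() < min(1.0, np.exp(delta))

# Example: neighbouring replicas at 300 K and 320 K, made-up energies
accepted = sum(attempt_swap(rng.normal(-500, 10), rng.normal(-495, 10),
                            300.0, 320.0)
               for _ in range(10_000))
print("acceptance ratio:", accepted / 10_000)
```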

Relevance: 100.00%

Abstract:

In many cases it is not possible to hold motorists to account for considerably exceeding the speed limit, because they deny being the driver on the speed-check photograph. An anthropological comparison of facial features by photo-to-photo comparison can be very difficult, depending on the quality of the photographs. One difficulty of that analysis method is that the comparison photographs of the presumed driver are taken with a different camera or lens, and from a different angle, than the speed-check photo, and taking a comparison photograph with exactly the same camera setup is almost impossible. Therefore only an imprecise comparison of the individual facial features is possible: the geometry and position of each facial feature, for example the distance between the eyes or the position of the ears, cannot be taken into consideration. We applied a new method using 3D laser scanning, optical surface digitization and photogrammetric calculation of the speed-check photo, which enables a geometric comparison. The influence of the focal length and of the distortion of the objective lens is thereby eliminated, and the precise position and viewing direction of the speed-check camera are calculated. Even for low-quality images, or when the face of the driver is partly hidden, this method delivers good results. The new method, Geometric Comparison, is evaluated and validated in a dedicated study described in this article.
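A geometric comparison of this kind ultimately reduces to projecting 3-D landmarks through a calibrated pinhole camera model; a minimal sketch with generic computer-vision math (invented landmark coordinates and intrinsics, not the authors' software):

```python
import numpy as np

def project(points_3d, K, R, t):
    """Project 3-D points [m] into pixel coordinates with a pinhole camera.

    K : 3x3 intrinsic matrix (focal lengths, principal point)
    R : 3x3 rotation, t : 3-vector translation (world -> camera)
    """
    p_cam = points_3d @ R.T + t          # world -> camera coordinates
    p_img = p_cam @ K.T                  # apply intrinsics
    return p_img[:, :2] / p_img[:, 2:3]  # perspective division

# Invented example: three facial landmarks from a 3-D head scan [m]
landmarks = np.array([[0.03, 0.0, 1.5],     # right eye
                      [-0.03, 0.0, 1.5],    # left eye
                      [0.0, -0.07, 1.52]])  # nose tip

K = np.array([[1200.0, 0.0, 640.0],         # fx, skew, cx [px]
              [0.0, 1200.0, 360.0],         # fy, cy
              [0.0, 0.0, 1.0]])
R, t = np.eye(3), np.zeros(3)               # camera at the world origin

pixels = project(landmarks, K, R, t)
print(np.round(pixels, 1))
# Distances between projected landmarks can now be compared with the
# corresponding distances measured on the speed-check photograph.
```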