Abstract:
The self-assembly of amphiphilic molecules is exploited to obtain nanoscopically structured materials in solution, on surfaces, in the solid phase, and at the liquid/solid interface. The aim is to combine the dynamics of low-molecular-weight amphiphiles with the stability of high-molecular-weight amphiphiles in order to control the self-assembly of the molecules. Three concepts for structuring carbon through self-assembly are presented. In the first concept, worm-like and fiber-like structures, respectively, are obtained in aqueous solution from hexaphenylbenzene-polyethylene glycol (HPB-PEG) and hexa-peri-hexabenzocoronene (HBC-PEG) derivatives. The water content of the hydrogel fibers formed from the HPB-PEG derivatives can be tuned via the substitution pattern of the amphiphiles and the length of the PEG chains. Unlike fibers produced by the fabrication techniques used so far (extrusion, microfluidic processing, or electrospinning), these hydrogel fibers resemble systems found in nature. Evidence for the formation of hydrogel fibers is provided by dedicated polarized and depolarized dynamic light scattering methods. In the second concept, electron irradiation and pyrolysis of 3',4',5',6'-tetraphenyl-[1,1':2',1''-terphenyl]-4,4''-dithiol yield homogeneous porous carbon membranes with potential applications in filtration, and in the third concept the self-assembly of an ortho-linked HPB trimer at the liquid/solid interface is investigated, yielding highly ordered lamellar structures. In all three concepts, the geometry and the size of the molecules are the decisive parameters for generating defined structures.
Abstract:
The aim of this work was to synthesize mannosylated polymer systems, based mainly on N-(hydroxy)propyl methacrylate, in order to specifically address cells of the immune system. To this end, various reactive-ester polymers based on pentafluorophenyl methacrylate (PFPMA) were first prepared by RAFT polymerization with narrow molecular weight distributions and varying fractions of lauryl methacrylate (LMA). To obtain precise information about the composition of a statistical PFPMA-LMA copolymer, the copolymerization of PFPMA and LMA was investigated by real-time 1H NMR kinetic measurements. This made it possible to calculate the copolymerization parameters and to draw accurate conclusions about the structure of a statistical PFPMA-LMA copolymer. The reactive-ester polymers thus obtained were then converted into the desired HPMA polymers in a polymer-analogous reaction that preserves the degree of polymerization. To verify quantitative conversion without side reactions, different reaction conditions were chosen and various analytical methods employed. It could thus be shown that the reactive-ester approach gives access to high-quality amphiphilic polymer systems that are difficult to synthesize and characterize by other routes. A further advantage of this synthetic route is that markers for visualizing the polymers in vitro and in vivo, as well as targeting ligands for addressing specific cells, can be introduced simultaneously. Mannose was attached as a simple sugar structure for this purpose, since mannosylated polymer systems are known to be taken up by cells of the immune system.
In addition, the mannosylated polymers could be loaded with a hydrophobic drug, and the stability of the loaded micelles was examined in more detail via the incorporation of a hydrophobic radioactive complex. Subsequent in vitro experiments with the mannosylated polymer micelles on dendritic cells showed, as expected, a mannose-specific and enhanced uptake. With a view to investigating these systems in vivo by PET, it was shown that HPMA polymers can be labeled radioactively, and first labeling experiments with a long-lived radionuclide for long-term biodistribution studies were also carried out.
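The copolymerization parameters mentioned above feed into the terminal-model (Mayo-Lewis) copolymer equation, which relates monomer feed to instantaneous copolymer composition. A minimal sketch follows; the reactivity ratios used here are illustrative placeholders, not the values determined in the work.

```python
# Mayo-Lewis (terminal-model) copolymer equation. The reactivity ratios
# passed in below are illustrative assumptions, not measured values.

def instantaneous_composition(f1: float, r1: float, r2: float) -> float:
    """Mole fraction F1 of monomer 1 in the copolymer formed
    at a monomer feed fraction f1 (terminal model)."""
    f2 = 1.0 - f1
    num = r1 * f1 ** 2 + f1 * f2
    den = r1 * f1 ** 2 + 2.0 * f1 * f2 + r2 * f2 ** 2
    return num / den

# With r1 = r2 = 1 the copolymer composition tracks the feed exactly
print(round(instantaneous_composition(0.3, 1.0, 1.0), 6))  # → 0.3
```

Fitting r1 and r2 to conversion data from real-time NMR is what allows statements about whether a statistical copolymer drifts in composition during polymerization.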
Abstract:
The rational choice approach (RCA) has spread widely across many social science disciplines over recent decades. In the last two decades in particular, there have been repeated efforts to apply the RCA to questions and topics in historical research. One interesting approach is an integrative methodology that has become known as the "Analytic Narrative", which seeks to combine the classical narrative form of explaining historical phenomena with game-theoretic modeling. Inspired by this approach, the present study examines in what form and under what circumstances the RCA may be suitable as an analytical foundation for historical topics and research questions. This is investigated not only theoretically but also through a historical example. The concrete subject of the work is the Fourth Crusade. More than 800 years ago, it ended with the conquest and sack of Constantinople and the dismemberment of the Byzantine Empire. For more than 150 years, historians have argued over the causes of these events. The basic theoretical positions taken by individual historians within this debate serve as the starting point for the investigation pursued here. It is shown that the data available on the Fourth Crusade open up the possibility of applying several RCA-based methods of analysis. The central aim of the analysis is to generate, from the extant sources, new insights into the strategic options of the actors relevant to the course of the crusade, and moreover to ensure the highest possible degree of verifiability.
Abstract:
Since the late 1990s the illicit drug market has undergone considerable change: alongside the traditional drugs of abuse that still dominate, more than 100 psychotropic substances designed to bypass controlled-substances legislation have appeared and led to intoxications and fatalities. Starting from the huge class of phenylalkylamines, which contains many subgroups, the spectrum of structures has grown from tryptamines, piperazines, phenylcyclohexyl derivatives and pyrrolidinophenones to synthetic cannabinoids and the first synthetic cocaine. Due to their low prevalence and the high number of unknown substances, the detection of new designer drugs is a challenge for clinical and forensic toxicologists. Standard screening procedures might fail because a recently discovered or as yet unknown substance has not been incorporated in the library used. Nevertheless, many metabolism studies, case reports, screening methods and substance-profiling papers concentrating on single compounds have been published. This review provides an overview of the bioanalytical and analytical methods developed, the matrices used, sample-preparation procedures and analyte concentrations in cases of intoxication, and also gives a résumé of immunoassay experiences. Additionally, six screening methods for biological matrices with a larger spectrum of analytes are described in more detail.
Abstract:
To check the effectiveness of campaigns preventing drug abuse or indicating local effects of efforts against drug trafficking, it is beneficial to know consumed amounts of substances in a high spatial and temporal resolution. The analysis of drugs of abuse in wastewater (WW) has the potential to provide this information. In this study, the reliability of WW drug consumption estimates is assessed and a novel method presented to calculate the total uncertainty in observed WW cocaine (COC) and benzoylecgonine (BE) loads. Specifically, uncertainties resulting from discharge measurements, chemical analysis and the applied sampling scheme were addressed and three approaches presented. These consist of (i) a generic model-based procedure to investigate the influence of the sampling scheme on the uncertainty of observed or expected drug loads, (ii) a comparative analysis of two analytical methods (high performance liquid chromatography-tandem mass spectrometry and gas chromatography-mass spectrometry), including an extended cross-validation by influent profiling over several days, and (iii) monitoring COC and BE concentrations in WW of the largest Swiss sewage treatment plants. In addition, the COC and BE loads observed in the sewage treatment plant of the city of Berne were used to back-calculate the COC consumption. The estimated mean daily consumed amount was 107 ± 21 g of pure COC, corresponding to 321 g of street-grade COC.
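The final back-calculation step can be sketched as follows. The excretion fraction and molar-mass correction used here are illustrative assumptions, not the parameters of the study; real estimates also propagate the uncertainties discussed above.

```python
# Sketch of a wastewater back-calculation of cocaine (COC) consumption
# from a measured benzoylecgonine (BE) load. All parameter values are
# illustrative assumptions, not those used in the study.

M_COC = 303.35  # molar mass of cocaine, g/mol
M_BE = 289.33   # molar mass of benzoylecgonine, g/mol

def coc_consumption_from_be(be_load_g_per_day: float,
                            excreted_as_be: float = 0.35) -> float:
    """Estimate consumed pure COC (g/day) from the BE mass flux at the
    treatment-plant inlet, assuming a fixed fraction of each dose is
    excreted as BE."""
    return be_load_g_per_day * (M_COC / M_BE) / excreted_as_be

# Example: a hypothetical BE load of 36 g/day
estimate = coc_consumption_from_be(36.0)
print(round(estimate, 1))
```

A street-grade figure would then follow by dividing the pure-COC estimate by an assumed purity.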
Abstract:
Concern over possible adverse effects of endocrine-disrupting compounds on fish has prompted the development of appropriate testing methods. In vitro screening assays may provide initial information on the endocrine activities of a test compound and thereby direct and optimize subsequent testing. Induction of vitellogenin (VTG) is used as a biomarker of fish exposure to estrogen-active substances. Since VTG induction can be measured not only in vivo but also in fish hepatocytes in vitro, the VTG induction response in isolated fish liver cells has been suggested as an in vitro screen for identifying estrogen-active substances. The main advantages of the hepatocyte VTG assay are considered to be its ability to detect effects of estrogenic metabolites, since hepatocytes in vitro remain metabolically competent, and its ability to detect both estrogenic and anti-estrogenic effects. In this article, we critically review the current knowledge on the VTG response of cultured fish hepatocytes to (anti)estrogenic substances. In particular, we discuss the sensitivity, specificity, and variability of the VTG hepatocyte assay. In addition, we review the available data on culture factors influencing basal and induced VTG production, the response to natural and synthetic estrogens as well as to xenoestrogens, the detection of indirect estrogens, and the sources of assay variability. VTG induction in cultured fish hepatocytes is clearly influenced by culture conditions (medium composition, temperature, etc.) and culture system (hepatocyte monolayers, aggregates, liver slices, etc.). The currently available database on estrogen-mediated VTG induction in cultured teleost hepatocytes is too small to support conclusive statements on whether systematic differences in the VTG response exist between in vitro culture systems, VTG analytical methods or fish species.
The VTG hepatocyte assay sensitively detects natural and synthetic estrogens, whereas the response to xenoestrogens appears to be more variable. The detection of weak estrogens can be critical because their effective concentrations may overlap with cytotoxic concentrations. Moreover, the VTG hepatocyte assay is able to detect antiestrogens as well as indirect estrogens, i.e., substances which require metabolic activation to induce an estrogenic response; nevertheless, more chemicals need to be analysed to corroborate this statement. It will be necessary to establish standardized protocols to minimize assay variability, and to develop a set of pass-fail criteria as well as cut-offs for designating positive and negative responses.
Abstract:
Light microscopy (LM) and electron microscopy (EM) techniques have had a major influence on the development and direction of cell biology, and particularly also on the investigation of complex host-parasite relationships. Earlier, microscopy was rather descriptive, but new technical and scientific advances have changed the situation. Microscopy has now become analytical, quantitative and three-dimensional, with greater emphasis on the analysis of live cells with fluorescent markers. The new or improved techniques that have become available include immunocytochemistry using immunogold labeling techniques or fluorescent probes, cryopreservation and cryosectioning, in situ hybridization, fluorescent reporters for subcellular localization, micro-analytical methods for elemental distribution, confocal laser scanning microscopy, scanning tunneling microscopy and live imaging. Taken together, these tools provide both researchers and students with a novel and multidimensional view of the intricate biological processes during parasite development in the host.
Abstract:
Students are now involved in a vastly different textual landscape than many English scholars, one that relies on the “reading” and interpretation of multiple channels of simultaneous information. As a response to these new kinds of literate practices, my dissertation adds to the growing body of research on multimodal literacies, narratology in new media, and rhetoric through an examination of the place of video games in English teaching and research. I describe in this dissertation a hybridized theoretical basis for incorporating video games in English classrooms. This framework for textual analysis includes elements from narrative theory in literary study, rhetorical theory, and literacy theory, and when combined to account for the multiple modalities and complexities of gaming, can provide new insights about those theories and practices across all kinds of media, whether in written texts, films, or video games. In creating this framework, I hope to encourage students to view texts from a meta-level perspective, encompassing textual construction, use, and interpretation. In order to foster meta-level learning in an English course, I use specific theoretical frameworks from the fields of literary studies, narratology, film theory, aural theory, reader-response criticism, game studies, and multiliteracies theory to analyze a particular video game: World of Goo. These theoretical frameworks inform pedagogical practices used in the classroom for textual analysis of multiple media. Examining a video game from these perspectives, I use analytical methods from each, including close reading, explication, textual analysis, and individual elements of multiliteracies theory and pedagogy. In undertaking an in-depth analysis of World of Goo, I demonstrate the possibilities for classroom instruction with a complex blend of theories and pedagogies in English courses. 
This blend of theories and practices is meant to foster literacy learning across media, helping students develop metaknowledge of their own literate practices in multiple modes. Finally, I outline a design for a multiliteracies course that would allow English scholars to use video games along with other texts to interrogate texts as systems of information. In doing so, students can hopefully view and transform systems in their own lives as audiences, citizens, and workers.
Abstract:
An extrusion die is used to continuously produce parts with a constant cross section, such as sheets, pipes, tire components and more complex shapes such as window seals. When polymers are used, the die is fed by a screw extruder, which melts, mixes and pressurizes the material through the rotation of either a single or a double screw. The polymer can then be continuously forced through the die, producing a long part in the shape of the die outlet; the extruded section is then cut to the desired length. Generally, the primary target of a well-designed die is to produce a uniform outlet velocity without excessively raising the pressure required to extrude the polymer through the die. Other properties such as temperature uniformity and residence time are also important but are not directly considered in this work. Designing dies for optimal outlet velocity variation using simple analytical equations is feasible for basic die geometries or simple channels. Due to the complexity of die geometries and of polymer material properties, the design of complex dies by analytical methods is difficult; for complex dies, iterative optimization methods must be used, and an automated iterative method is desired. To automate the design and optimization of an extrusion die, two issues must be dealt with. The first is how to generate a new mesh for each iteration. In this work, this is approached by modifying a Parasolid file that describes a CAD part; this file is then used in a commercial meshing software. Skewing the initial mesh to produce a new geometry was also employed as a second option. The second issue is an optimization problem in the presence of noise stemming from variations in the mesh and cumulative truncation errors. In this work, a simplex method and a modified trust region method were employed for automated optimization of die geometries. For the trust region, a discrete derivative and a BFGS Hessian approximation were used.
To deal with the noise in the function, the trust region method was modified to automatically adjust the discrete-derivative step size and the trust region based on changes in noise and function contour. Generally, the uniformity of velocity at the exit of the extrusion die can be improved by increasing resistance across the die, but this is limited by the pressure capabilities of the extruder. In the optimization, a penalty factor that increases exponentially from the pressure limit is applied. This penalty can be applied in two different ways: the first applies it only to designs which exceed the pressure limit, the second to designs both above and below the pressure limit. Both of these methods were tested and compared in this work.
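The two penalty strategies can be sketched as follows. The exponential form and scaling shown here are an illustrative assumption; the work's actual penalty expression may differ.

```python
import math

# Sketch of the two pressure-penalty strategies described above.
# The exponential form and the scale factor are illustrative assumptions.

def penalty_above_only(pressure: float, p_limit: float,
                       scale: float = 1.0) -> float:
    """Penalize only designs that exceed the pressure limit;
    feasible designs incur no cost."""
    if pressure <= p_limit:
        return 0.0
    return scale * (math.exp((pressure - p_limit) / p_limit) - 1.0)

def penalty_two_sided(pressure: float, p_limit: float,
                      scale: float = 1.0) -> float:
    """Penalize all designs, growing exponentially as pressure
    approaches and passes the limit."""
    return scale * math.exp((pressure - p_limit) / p_limit)

def objective(velocity_variation: float, pressure: float,
              p_limit: float, penalty) -> float:
    """Total cost: outlet-velocity nonuniformity plus pressure penalty."""
    return velocity_variation + penalty(pressure, p_limit)
```

The one-sided variant leaves the feasible region untouched, while the two-sided variant biases the optimizer away from the limit even for feasible designs, which is the trade-off the comparison above addresses.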
Abstract:
The estimation of the average travel distance in a low-level picker-to-part order picking system can be done by analytical methods in most cases. Often a uniform distribution of the access frequency over all bin locations is assumed in the storage system. This only applies if the bin location assignment is done randomly. If the access frequency of the articles is considered in the bin location assignment to reduce the average total travel distance of the picker, the access frequency over the bin locations of one aisle can be approximated by an exponential density function or any similar density function. All known calculation methods assume that the average number of orderlines per order is greater than the number of aisles of the storage system. In case of small orders this assumption is often invalid. This paper shows a new approach for calculating the average total travel distance taking into account that the average number of orderlines per order is lower than the total number of aisles in the storage system and the access frequency over the bin locations of an aisle can be approximated by any density function.
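The effect of a skewed access-frequency density can be illustrated with a small Monte Carlo sketch: under return routing, travel within one aisle is twice the farthest pick position, so concentrating fast movers near the aisle front shortens travel. All parameter values (aisle length, decay rate, picks per aisle) are illustrative assumptions, and this is not the analytical calculation method proposed in the paper.

```python
import math
import random

# Monte Carlo sketch: with return routing, travel in one aisle is twice
# the farthest pick position. Access frequency over the bin locations is
# modeled either as uniform or as a truncated exponential density.
# Parameter values are illustrative assumptions.

def draw_position_exponential(aisle_len: float = 30.0,
                              rate: float = 0.15) -> float:
    """Draw a pick position with exponentially decaying access
    frequency, truncated to the aisle length (inverse transform)."""
    u = random.random()
    cdf_max = 1.0 - math.exp(-rate * aisle_len)
    return -math.log(1.0 - u * cdf_max) / rate

def mean_aisle_travel(draw, picks_per_aisle: int = 3,
                      trials: int = 20000) -> float:
    """Average return-route travel: out to the farthest pick and back."""
    total = 0.0
    for _ in range(trials):
        farthest = max(draw() for _ in range(picks_per_aisle))
        total += 2.0 * farthest
    return total / trials

random.seed(1)
uniform = mean_aisle_travel(lambda: random.uniform(0.0, 30.0))
skewed = mean_aisle_travel(draw_position_exponential)
print(uniform > skewed)  # skewed storage shortens travel
```

The analytical methods discussed above replace such simulation by integrating over the assumed density, which becomes delicate precisely when orders have fewer order lines than there are aisles.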
Abstract:
Standard protocols are given for assessing metabolic stability in rainbow trout using the liver S9 fraction. These protocols describe the isolation of S9 fractions from trout livers, evaluation of metabolic stability using a substrate depletion approach, and expression of the result as in vivo intrinsic clearance. Additional guidance is provided on the care and handling of test animals, design and interpretation of preliminary studies, and development of analytical methods. Although initially developed to predict metabolism impacts on chemical accumulation by fish, these procedures can be used to support a broad range of scientific and risk assessment activities including evaluation of emerging chemical contaminants and improved interpretation of toxicity testing results. These protocols have been designed for rainbow trout and can be adapted to other species as long as species-specific considerations are modified accordingly (e.g., fish maintenance and incubation mixture temperature). Rainbow trout is a cold-water species. Protocols for other species (e.g., carp, a warm-water species) can be developed based on these procedures as long as the specific considerations are taken into account.
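The substrate-depletion calculation at the core of such protocols can be sketched as follows: fit a first-order decay to log-transformed substrate concentrations and express the rate as an in vitro intrinsic clearance. The data and the S9 protein concentration below are illustrative assumptions, and the further extrapolation to in vivo intrinsic clearance (via scaling factors) is not shown.

```python
import math

# Sketch of a substrate-depletion analysis: least-squares fit of
# ln(concentration) vs time gives the first-order rate constant, which
# is normalized by S9 protein content. Data values are illustrative.

def depletion_rate(times_h, concentrations):
    """Least-squares slope of ln(C) versus time; returns k in 1/h."""
    logs = [math.log(c) for c in concentrations]
    n = len(times_h)
    mt = sum(times_h) / n
    ml = sum(logs) / n
    num = sum((t - mt) * (l - ml) for t, l in zip(times_h, logs))
    den = sum((t - mt) ** 2 for t in times_h)
    return -num / den

def intrinsic_clearance(k_per_h: float,
                        s9_mg_per_ml: float = 1.0) -> float:
    """In vitro CLint in mL/h/mg S9 protein: rate constant divided by
    the protein concentration of the incubation (assumed 1 mg/mL)."""
    return k_per_h / s9_mg_per_ml

# Synthetic first-order data with k = 0.5 1/h
times = [0.0, 0.5, 1.0, 1.5, 2.0]
conc = [100.0 * math.exp(-0.5 * t) for t in times]
k = depletion_rate(times, conc)
print(round(k, 3), round(intrinsic_clearance(k), 3))
```

In practice the preliminary studies mentioned above serve to pick substrate concentrations and time points where depletion really is first order.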
Abstract:
The identification of plausible causes for water body status deterioration is much easier if it can build on available, reliable, extensive and comprehensive biogeochemical monitoring data (preferably aggregated in a database). A plausible identification of such causes is a prerequisite for well-informed decisions on which mitigation or remediation measures to take. In this chapter, a rationale for an extended monitoring programme is first provided and compared to the one required by the Water Framework Directive (WFD). This proposal includes a list of relevant parameters that are needed for an integrated, a priori status assessment. Secondly, a few sophisticated statistical tools are described that subsequently allow for the estimation of the magnitude of impairment as well as the likely relative importance of different stressors in a multiply stressed environment. The advantages and restrictions of these rather complicated analytical methods are discussed. Finally, the use of Decision Support Systems (DSS) is advocated with regard to the specific WFD implementation requirements.
Abstract:
Oxygenated polycyclic aromatic hydrocarbons (oxy-PAHs) and nitrogen heterocyclic polycyclic aromatic compounds (N-PACs) are toxic, highly leachable and often abundant at sites that are also contaminated with PAHs. However, due to lack of regulations and standardized methods for their analysis, they are seldom included in monitoring and risk-assessment programs. This intercomparison study constitutes an important step in the harmonization of the analytical methods currently used, and may also be considered a first step towards the certification of reference materials for these compounds. The results showed that the participants were able to determine oxy-PAHs with accuracy similar to PAHs, with average determined mass fractions agreeing well with the known levels in a spiked soil and acceptable inter- and intra-laboratory precisions for all soils analyzed. For the N-PACs, the results were less satisfactory, and have to be improved by using analytical methods more specifically optimized for these compounds.
Abstract:
For early diagnosis and therapy of alcohol-related disorders, alcohol biomarkers are highly valuable. Concerning specificity, indirect markers can be influenced by non-ethanol-related factors, whereas direct markers are only formed after ethanol consumption. The sensitivity of the direct markers depends on the cutoffs of the analytical methods and on the material analyzed, and plays an important role for their utilization in different fields of application. Until recently, the biomarker phosphatidylethanol has been used to differentiate between social drinking and alcohol abuse. After method optimization, the detection limit could be lowered, and phosphatidylethanol became sensitive enough to detect even the consumption of low amounts of alcohol. This perspective gives a summary of the most common alcohol biomarkers and summarizes new developments for monitoring alcohol consumption habits.
Abstract:
In a network of competing species, a competitive intransitivity occurs when the ranking of competitive abilities does not follow a linear hierarchy (A > B > C but C > A). A variety of mathematical models suggests that intransitive networks can prevent or slow down competitive exclusion and maintain biodiversity by enhancing species coexistence. However, it has been difficult to assess empirically the relative importance of intransitive competition because a large number of pairwise species competition experiments are needed to construct a competition matrix that is used to parameterize existing models. Here we introduce a statistical framework for evaluating the contribution of intransitivity to community structure using species abundance matrices that are commonly generated from replicated sampling of species assemblages. We provide metrics and analytical methods for using abundance matrices to estimate species competition and patch transition matrices by using reverse-engineering and a colonization-competition model. These matrices provide complementary metrics to estimate the degree of intransitivity in the competition network of the sampled communities. Benchmark tests reveal that the proposed methods could successfully detect intransitive competition networks, even in the absence of direct measures of pairwise competitive strength. To illustrate the approach, we analyzed patterns of abundance and biomass of five species of necrophagous Diptera and eight species of their hymenopteran parasitoids that co-occur in beech forests in Germany. We found evidence for a strong competitive hierarchy within communities of flies and parasitoids. However, for parasitoids, there was a tendency towards increasing intransitivity in higher weight classes, which represented larger resource patches. These tests provide novel methods for empirically estimating the degree of intransitivity in competitive networks from observational datasets. 
They can be applied to experimental measures of pairwise species interactions, as well as to spatio-temporal samples of assemblages in homogeneous environments or along environmental gradients.
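The defining intransitivity pattern (A > B > C but C > A) can be illustrated with a toy check on a pairwise competition matrix. This is only a sketch of what "intransitive triad" means, not the reverse-engineering method the paper introduces for abundance matrices.

```python
from itertools import permutations

# Toy illustration of an intransitive triad in a competition matrix.
# C[i][j] = 1 means species i outcompetes species j.

def has_intransitive_triad(C) -> bool:
    """True if some triple (a, b, c) satisfies a > b, b > c, but c > a."""
    n = len(C)
    for a, b, c in permutations(range(n), 3):
        if C[a][b] and C[b][c] and C[c][a]:
            return True
    return False

# A strict hierarchy A > B > C (transitive)
hierarchy = [[0, 1, 1],
             [0, 0, 1],
             [0, 0, 0]]

# A rock-paper-scissors loop: A > B, B > C, C > A
loop = [[0, 1, 0],
        [0, 0, 1],
        [1, 0, 0]]

print(has_intransitive_triad(hierarchy), has_intransitive_triad(loop))
```

Graded degrees of intransitivity, as estimated in the paper, generalize this binary check by scoring how far an inferred competition matrix departs from a linear hierarchy.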