5 results for Texture-based volume visualization
in ArchiMeD - Elektronische Publikationen der Universität Mainz - Germany
Abstract:
A system for the digital-holographic imaging of airborne objects, suitable for ground-based field measurements, was developed and constructed. Depending on the depth position, it allows the direct determination of the size of airborne objects above approx. 20 µm, as well as of their shape for sizes from approx. 100 µm up to the millimeter range. The development additionally comprised an algorithm for the automated improvement of hologram quality and for the semi-automatic distance determination of large objects. A way to intrinsically increase the efficiency of the depth-position determination by computing angle-averaged profiles was presented. Furthermore, a method was developed that, using an iterative approach for isolated objects, allows the recovery of the phase information and thereby the removal of the twin image. In addition, the effects of various limitations of digital holography, such as the finite pixel size, were investigated and discussed by means of simulations. The suitable representation of the three-dimensional spatial information poses a particular problem in digital holography, since the three-dimensional light field is not physically reconstructed. A method was developed and implemented that, by constructing a stereoscopic representation of the numerically reconstructed measurement volume, allows a quasi-three-dimensional, magnified viewing. Selected digital holograms recorded during field campaigns at the Jungfraujoch were reconstructed. In part, a very high fraction of irregular crystal shapes was found, in particular as a consequence of massive riming. Objects down to the range of ≤ 20 µm were also observed during periods with formally ice-subsaturated conditions. Furthermore, by applying the theory of the "phase edge effect" developed here, an object of only approx. 40 µm in size could be identified as an ice platelet. The greatest drawback of digital holography compared with conventional photographic imaging methods is the need for elaborate numerical reconstruction: a high computational effort is required to achieve a result comparable to a photograph. On the other hand, digital holography has unique advantages. Access to the three-dimensional spatial information can serve the local investigation of relative object distances. However, it turned out that the conditions of digital holography currently hamper the observation of sufficiently large numbers of objects on the basis of single holograms. It was demonstrated that complete object boundaries could be reconstructed even when an object was partially or entirely outside the geometric measurement volume. Furthermore, the sub-pixel reconstruction, first demonstrated in simulations, was applied to real holograms. It could be shown that quasi-point-like objects could in part be localized with sub-pixel accuracy, and that additional information could also be obtained for extended objects. Finally, interference patterns were observed on reconstructed ice crystals and in part tracked over time. At present, both internal reflection within the crystal and the existence of a (quasi-)liquid layer appear possible as explanations, with some arguments pointing toward the latter. As a result of this work, a system is now available that comprises a new measurement instrument and extensive algorithms.
S. M. F. Raupach, H.-J. Vössing, J. Curtius and S. Borrmann: Digital crossed-beam holography for in-situ imaging of atmospheric particles, J. Opt. A: Pure Appl. Opt. 8, 796-806 (2006)
S. M. F. Raupach: A cascaded adaptive mask algorithm for twin image removal and its application to digital holograms of ice crystals, Appl. Opt. 48, 287-301 (2009)
S. M. F. Raupach: Stereoscopic 3D visualization of particle fields reconstructed from digital inline holograms, Optik - Int. J. Light El. Optics (accepted for publication, 2009)
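The elaborate numerical reconstruction of inline holograms mentioned in this abstract is typically performed by propagating the recorded hologram plane to a chosen depth z. The thesis's actual implementation is not given in the abstract, so the following is only a minimal sketch of the standard angular-spectrum method; the function name and parameters are illustrative assumptions.

```python
import numpy as np

def reconstruct_plane(hologram, wavelength, pixel_pitch, z):
    """Numerically propagate an inline hologram to depth z using the
    angular-spectrum method (illustrative sketch, not the thesis code)."""
    ny, nx = hologram.shape
    # Spatial-frequency grids (cycles per unit length)
    fx = np.fft.fftfreq(nx, d=pixel_pitch)
    fy = np.fft.fftfreq(ny, d=pixel_pitch)
    FX, FY = np.meshgrid(fx, fy)
    # Free-space transfer function; evanescent components
    # (negative argument under the square root) are suppressed.
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    H = np.where(
        arg > 0,
        np.exp(2j * np.pi * z / wavelength * np.sqrt(np.maximum(arg, 0.0))),
        0.0,
    )
    spectrum = np.fft.fft2(hologram)
    return np.fft.ifft2(spectrum * H)
```

Scanning z over the measurement volume yields the stack of reconstruction planes from which depth positions and object boundaries are then extracted.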
Abstract:
Optical frequency comb technology has been used in this work for the first time to investigate the nuclear structure of light radioactive isotopes. To this end, three laser systems were stabilized to accurately known optical frequencies using different techniques and employed in two specialized experiments. Absolute transition frequency measurements of lithium and beryllium isotopes were performed with an accuracy on the order of 10^(−10). Such a high accuracy is required for the light elements, since the nuclear volume effect contributes only at the 10^(−9) level to the total transition frequency. For beryllium, the isotope shift was determined with an accuracy that is sufficient to extract information about the proton distribution inside the nucleus. Doppler-free two-photon spectroscopy of the stable lithium isotopes ^(6,7)Li was performed in order to determine the absolute frequency of the 2S → 3S transition. The achieved relative accuracy of 2×10^(−10) improves on previous measurements by one order of magnitude. The results provide an opportunity to determine the nuclear charge radii of the stable and short-lived isotopes in a purely optical way, but this requires an improvement of the theoretical calculations by two orders of magnitude. The second experiment presented here was performed at ISOLDE/CERN, where the absolute transition frequencies of the D1 and D2 lines in beryllium ions were measured for the isotopes ^(7,9,10,11)Be with an accuracy of about 1 MHz. To this end, an advanced collinear laser spectroscopy technique was developed, involving two counter-propagating frequency-stabilized laser beams of known absolute frequency. The extracted isotope shifts were combined with recent accurate mass shift calculations, and the root-mean-square nuclear charge radii of ^(7,10)Be and of the one-neutron halo nucleus ^(11)Be were determined. The obtained charge radii decrease from ^(7)Be to ^(10)Be and increase again for ^(11)Be.
While the monotonic decrease can be explained by nucleon clustering inside the nucleus, the pronounced increase between ^(10)Be and ^(11)Be can be interpreted as a combination of two contributions: the center-of-mass motion of the ^(10)Be core and a change of the intrinsic structure of the core. To disentangle these two contributions, results from nuclear reaction measurements were used; they indicate that the center-of-mass motion is the dominant effect. Additionally, the splitting isotope shift, i.e. the difference between the isotope shifts of the D1 and D2 fine structure transitions, was determined. It shows good consistency with the theoretical calculations and provides a valuable check of the beryllium experiment.
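The step from measured isotope shifts and calculated mass shifts to charge radii conventionally follows the standard decomposition of the isotope shift into a mass shift and a field shift. The abstract does not state the formula, so the symbols below are the conventional ones, given only as a hedged illustration:

```latex
% Isotope shift between isotopes A and A': mass shift plus field shift
\delta\nu_{\mathrm{IS}}^{A,A'} \;=\; \delta\nu_{\mathrm{MS}}^{A,A'}
  \;+\; F\,\delta\langle r_c^2\rangle^{A,A'}
\quad\Longrightarrow\quad
\delta\langle r_c^2\rangle^{A,A'} \;=\;
  \frac{\delta\nu_{\mathrm{IS}}^{A,A'} - \delta\nu_{\mathrm{MS}}^{A,A'}}{F}
```

With the mass shift taken from atomic theory and the field-shift constant F, the measured frequency differences yield the change of the mean-square charge radius between isotopes.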
Abstract:
Nuclear medicine imaging techniques such as PET are of increasing relevance in pharmaceutical research, being valuable (pre)clinical tools for non-invasively assessing drug performance in vivo. Therapeutic drugs, e.g. chemotherapeutics, often suffer from a poor balance between efficacy and toxicity. Here, polymer-based drug delivery systems can modulate the pharmacokinetics of low-molecular-weight therapeutics (prolonging blood circulation time, reducing toxic side effects, increasing target site accumulation), thereby leading to a more efficient therapy. In this regard, poly-N-(2-hydroxypropyl)-methacrylamide (HPMA) constitutes a promising biocompatible polymer. Towards the further development of these structures, non-invasive PET imaging allows insight into structure-property relationships in vivo. This powerful tool can guide design optimization towards more effective drug delivery. Hence, versatile radiolabeling strategies need to be developed; establishing 18F- as well as 131I-labeling of diverse HPMA architectures forms the basis for short- as well as long-term in vivo evaluations. By means of the prosthetic group [18F]FETos, 18F-labeling of distinct HPMA polymer architectures (homopolymers, amphiphilic copolymers as well as block copolymers) was successfully accomplished, enabling their systematic evaluation in tumor-bearing rats. These investigations revealed pronounced differences depending on individual polymer characteristics (molecular weight, amphiphilicity due to incorporated hydrophobic laurylmethacrylate (LMA) segments, architecture) as well as on the studied tumor model. Polymers showed higher uptake into Walker 256 tumors than into AT1 tumors for up to 4 h p.i. (correlating with a higher cellular uptake in vitro). The highest tumor concentrations were found for amphiphilic HPMA-ran-LMA copolymers in comparison to homopolymers and block copolymers.
Notably, the random LMA copolymer P4* (Mw = 55 kDa, 25% LMA) exhibited the most promising in vivo behavior, such as the highest blood retention as well as tumor uptake. Further studies concentrated on the influence of PEGylation ('stealth effect') on the drug delivery properties of defined polymeric micelles. Here, [18F]fluoroethylation of distinct PEGylated block copolymers (0%, 1%, 5%, 7%, 11% of incorporated PEG2kDa) made it possible to systematically study the impact of the PEG incorporation ratio and the respective architecture on the in vivo performance. Most strikingly, a higher PEG content caused prolonged blood circulation as well as a linear increase in tumor uptake (Walker 256 carcinoma). Due to the structural diversity of potential polymeric carrier systems, further versatile 18F-labeling strategies are needed. Therefore, a prosthetic 18F-labeling approach based on the Cu(I)-catalyzed click reaction was established for HPMA-based polymers, providing incorporation of fluorine-18 under mild conditions and in high yields. On this basis, a preliminary µPET study of an HPMA-based polymer, radiolabeled via the prosthetic group [18F]F-PEG3-N3, was successfully accomplished. By revealing early pharmacokinetics, 18F-labeling makes it possible to assess the potential of HPMA polymers for efficient drug delivery in a time-efficient manner. Yet, investigating the long-term fate is essential, especially regarding prolonged circulation properties and passive tumor accumulation (EPR effect). Therefore, radiolabeling of diverse HPMA copolymers with the longer-lived isotope iodine-131 was accomplished, enabling the in vivo evaluation of copolymer P4* over several days. In this study, tumor retention of 131I-P4* could be demonstrated over at least 48 h with concurrent blood clearance, thereby confirming the promising tumor targeting properties of amphiphilic HPMA copolymer systems based on the EPR effect.
Abstract:
In technical design processes in the automotive industry, digital prototypes rapidly gain importance because they allow for the detection of design errors in early development stages. The technical design process includes the computation of swept volumes for maintainability analyses and clearance checks. The swept volume is very useful, for example, to identify problem areas where a safety distance might not be kept. With the explicit construction of the swept volume, an engineer obtains evidence on how the shape of components that come too close has to be modified.
In this thesis a concept for the approximation of the outer boundary of a swept volume is developed. For safety reasons, it is essential that the approximation is conservative, i.e., that the swept volume is completely enclosed by the approximation. On the other hand, one wishes to approximate the swept volume as precisely as possible. In this work we show that the one-sided Hausdorff distance is the adequate error measure for the approximation when the intended usage is clearance checks, continuous collision detection, and maintainability analysis in CAD. We present two implementations that apply the concept and generate a manifold triangle mesh approximating the outer boundary of a swept volume. Both algorithms are two-phased: a sweeping phase, which generates a conservative voxelization of the swept volume, and the actual mesh generation, which is based on restricted Delaunay refinement. This approach ensures a high precision of the approximation while preserving conservativeness.
The benchmarks for our tests include, among others, real-world scenarios from the automotive industry.
Further, we introduce a method to relate parts of an already computed swept volume boundary to those triangles of the generator that come closest during the sweep. We use this both to verify and to colorize meshes resulting from our implementations.
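The one-sided Hausdorff distance used in this abstract as the approximation error measure can be illustrated, for finite point samples of two surfaces, with a minimal sketch (exact mesh-to-mesh computation is more involved; the function name is an assumption):

```python
import numpy as np

def one_sided_hausdorff(A, B):
    """Directed Hausdorff distance h(A, B) between two point samples:
    the largest distance from any point of A to its nearest neighbour
    in B.  Note that in general h(A, B) != h(B, A)."""
    A = np.asarray(A, dtype=float)
    B = np.asarray(B, dtype=float)
    # Pairwise distance matrix of shape (|A|, |B|)
    d = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=2)
    # For each point of A, its nearest neighbour in B; then the worst case
    return d.min(axis=1).max()
```

For a conservative approximation boundary A of a swept volume boundary B, bounding h(A, B) guarantees that the approximation nowhere strays farther than that bound from the true surface, which is exactly what clearance checks require.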
Abstract:
Geometric packing problems may be formulated mathematically as constrained optimization problems, but finding a good solution is a challenging task. The more complicated the geometry of the container or of the objects to be packed, the more complex the non-penetration constraints become. In this work we propose the use of a physics engine that simulates a system of colliding rigid bodies as a tool to resolve interpenetration conflicts and to optimize configurations locally. We develop an efficient and easy-to-implement physics engine that is specialized for collision detection and contact handling. In the course of developing this engine, a number of novel algorithms for distance calculation and intersection volume computation were designed and implemented, which are presented in this work. They are highly specialized to provide fast responses for cuboids and triangles as input geometry, whereas the concepts they are based on can easily be extended to other convex shapes. Especially noteworthy in this context is our ε-distance algorithm, a novel method that is not only very robust and fast but also compact in its implementation. Several state-of-the-art third-party implementations are presented, and we show that our implementations beat them in runtime and robustness. The packing algorithm that builds on top of the physics engine is a Monte Carlo based approach implemented for packing cuboids into a container described by a triangle soup. We give an implementation for the SAE J1100 variant of the trunk packing problem. We compare this implementation with several established approaches and show that it yields better results in less time than these existing implementations.
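The Monte Carlo idea underlying such a packing algorithm can be illustrated with a deliberately simplified sketch: axis-aligned cuboids are placed at random positions inside a box-shaped container, and a candidate is kept only if it does not overlap previously placed items. The thesis's actual method additionally uses the physics engine for local optimization and handles rotations and triangle-soup containers; all names and parameters below are illustrative assumptions.

```python
import random

def overlaps(a, b):
    """Axis-aligned boxes given as (x, y, z, w, h, d);
    True if the interiors intersect (touching faces do not count)."""
    return all(a[i] < b[i] + b[i + 3] and b[i] < a[i] + a[i + 3]
               for i in range(3))

def mc_pack(container, items, tries_per_item=2000, seed=0):
    """Greedy Monte Carlo packing: sample random axis-aligned positions
    for each cuboid and keep the first non-overlapping candidate."""
    rng = random.Random(seed)
    W, H, D = container
    placed = []
    for (w, h, d) in items:
        for _ in range(tries_per_item):
            cand = (rng.uniform(0, W - w), rng.uniform(0, H - h),
                    rng.uniform(0, D - d), w, h, d)
            if not any(overlaps(cand, p) for p in placed):
                placed.append(cand)
                break  # item placed; move on to the next one
    return placed
```

A full trunk-packing solver would replace the plain rejection step with physics-engine-driven resolution of interpenetrations, which is precisely the role the abstract assigns to the rigid-body simulation.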