829 results for Two Approaches
Abstract:
The suitability of hybrid materials based on zinc phosphate hydrate cements for use as corrosion-inhibiting inorganic pigments or in prosthetic and restorative bone and dental therapy has been investigated intensively and empirically worldwide since the 1990s. In the present work, reference samples, i.e. alpha- and beta-hopeite (abbreviated a-ZPT and b-ZPT), were first prepared by a hydrothermal crystallization procedure in aqueous medium at 20°C and 90°C. The crystal structure of both polymorphs of zinc phosphate tetrahydrate, Zn3(PO4)2·4H2O, was completely determined. Single-crystal structure analysis shows that the main difference between the alpha and beta forms of zinc phosphate tetrahydrate lies in two different arrangements of the hydrogen bonds. The corresponding three- and two-dimensional hydrogen-bond arrangements of a-ZPT and b-ZPT each induce different thermal behaviour on heating. While the alpha form loses its water of crystallization in two defined steps, the beta form yields an unstable dehydration product. This corresponds to two independent but concurrent dehydration mechanisms: (i) at low heating rates, a two-dimensional Johnson-Mehl-Avrami (JMA) mechanism on the (011) plane, which on the one hand takes place preferentially at crystal edges and on the other hand is governed by existing crystal defects on the surfaces; (ii) at high heating rates, a two-dimensional diffusion mechanism (D2), which proceeds first on the (101) plane and then on the (110) plane. By treating the ZPT dehydration as an irreversible heterogeneous stepwise solid-state reaction, the dehydration phase diagram was established by means of a "similar end product" protocol. It describes the possible relationships between the different hydration states and points to the existence of a transition state around 170°C (i.e. the reaction b-ZPT → a-ZPT).
In addition, a targeted chemical etching procedure with dilute H3PO4 and NH3 solutions was applied in order to examine in detail the first stage of zinc phosphate dissolution. Alpha- and beta-hopeite exhibit characteristic hexagonal and cubic etch pits, respectively, which widen under crystallographic control. A reliable description of the surface chemistry and topology was only possible through AFM and FFM experiments. At the same time, the surface defect density and distribution and the bulk dissolution rates of a-ZPT and b-ZPT could be determined in this way. In a second route, an innovative strategy was pursued for the preparation of basic zinc phosphate pigments of the first and second generation (i.e. NaZnPO4·H2O and Na2ZnPO4(OH)·2H2O) using, on the one hand, surface-modified polystyrene latices (e.g. produced by a miniemulsion polymerization process) and, on the other hand, dendrimers based on polyamidoamine (PAMAM). The resulting zeolite structure (ZPO) exhibits different controlled morphologies depending on increasing sodium and water content: hexagonal, cubic, heart-shaped, six-armed stars, lancet-shaped dendrites, etc. Carboxylated fluorescence-labelled latices were used for the quantitative evaluation of polymer incorporation into the crystal structure. It turns out that the polymer additives not only reduced growth down to 8 µm·min-1, but also appear to act as strong nucleation accelerators. Thanks to coordination chemistry (i.e. the formation of a six-centred complex L-COO-Zn-PO4·H2O with ligand exchange), two simple mechanisms for the action of latex particles in ZPO crystallization could be identified: (i) an intra-corona and (ii) an extra-corona nucleation mechanism.
Furthermore, the efficiency of short-term and long-term corrosion protection by tailor-made ZPO/ZPT pigments and the controlled release of phosphate ions was assessed in two approximations of the dissolution equilibrium: (i) by a leaching method (thermodynamic process) and (ii) by a pH-pulse method (kinetic process). The dissolution-precipitation mechanism (i.e. metamorphism) becomes particularly evident. The essential role of sodium ions in corrosion inhibition is described by a suitable composition-dependent dissolution model (ZAAM), which is consistent with the findings of the salt-spray and humidity-chamber tests. Finally, this work demonstrates the outstanding potential of functionalized latices (polymer) in controlled mineralization for the production of tailor-made zinc phosphate materials. Such hybrid materials are urgently needed in the development of environmentally friendly anti-corrosion pigments as well as in dental medicine.
Abstract:
The ever-increasing demand from users who want high-quality broadband services while on the move is straining the efficiency of current spectrum allocation paradigms, leading to an overall perception of spectrum scarcity. To circumvent this problem, two possible solutions are being investigated: (i) implementing new technologies capable of accessing temporarily/locally unused bands without interfering with licensed services, such as Cognitive Radios; (ii) releasing some spectrum bands thanks to new services providing higher spectral efficiency, e.g., DVB-T, and allocating them to new wireless systems. These two approaches are promising, but they also pose novel coexistence and interference management challenges. In particular, the deployment of devices such as Cognitive Radios, characterized by the inherently unplanned, irregular and random locations of the network nodes, requires advanced mathematical techniques to explicitly model their spatial distribution. In this context, system performance and optimization depend strongly on this spatial configuration. On the other hand, allocating released spectrum bands to other wireless services poses severe coexistence issues with all the pre-existing services on the same or adjacent spectrum bands. In this thesis, these methodologies for better spectrum usage are investigated. In particular, using Stochastic Geometry theory, a novel mathematical framework is introduced for cognitive networks, providing a closed-form expression for the coverage probability and single-integral forms for the average downlink rate and the Average Symbol Error Probability. Then, focusing on regulatory aspects, interference challenges between DVB-T and LTE systems are analysed, and a versatile methodology for their proper coexistence is proposed.
Moreover, the studies performed inside the CEPT SE43 working group on the amount of spectrum potentially available to Cognitive Radios and an analysis of the Hidden Node problem are provided. Finally, a study on the extension of cognitive technologies to Hybrid Satellite Terrestrial Systems is proposed.
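The stochastic-geometry framework above models the node locations as a point process. As a hedged numerical companion (not the thesis's closed-form derivation), the following Monte Carlo sketch estimates the downlink coverage probability of a Poisson network under Rayleigh fading; all parameter values are illustrative.

```python
import numpy as np

def coverage_probability(lam=1.0, alpha=4.0, T_dB=0.0, radius=10.0,
                         n_trials=500, seed=None):
    """Monte Carlo estimate of downlink coverage probability P(SIR > T)
    for a user at the origin, served by its nearest base station.  Base
    stations are drawn as a homogeneous Poisson point process of
    intensity `lam` in a disk, with i.i.d. Rayleigh fading; the model
    is interference-limited (no noise), and every parameter here is an
    illustrative assumption."""
    rng = np.random.default_rng(seed)
    T = 10.0 ** (T_dB / 10.0)
    hits = 0
    area = np.pi * radius ** 2
    for _ in range(n_trials):
        n = rng.poisson(lam * area)
        if n == 0:
            continue  # no base station in the window: count as outage
        # by isotropy only distances matter; r ~ radius * sqrt(U) is
        # uniform in the disk
        d = np.sort(radius * np.sqrt(rng.random(n)))
        h = rng.exponential(size=n)  # Rayleigh fading -> exp. power gains
        signal = h[0] * d[0] ** (-alpha)
        interference = np.sum(h[1:] * d[1:] ** (-alpha))
        if interference == 0 or signal > T * interference:
            hits += 1
    return hits / n_trials
```

With the defaults and alpha = 4, the estimate at T = 0 dB should land near the well-known closed-form value of roughly 0.56 for this setting; the window radius and trial count trade accuracy against runtime.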
Abstract:
The subject of this thesis is in the area of Applied Mathematics known as Inverse Problems. Inverse problems are those in which a set of measured data is analysed in order to extract as much information as possible about a model which is assumed to represent a system in the real world. We study two inverse problems in the fields of classical and quantum physics: QCD condensates from tau-decay data, and the inverse conductivity problem. Despite a concentrated effort by physicists extending over many years, an understanding of QCD from first principles continues to be elusive. Fortunately, data continue to appear which provide a rather direct probe of the inner workings of the strong interactions. We use a functional method which allows us to extract, within rather general assumptions, phenomenological parameters of QCD (the condensates) from a comparison of the time-like experimental data with asymptotic space-like results from theory. The price to be paid for the generality of the assumptions is relatively large errors in the values of the extracted parameters. Although we do not claim that our method is superior to other approaches, we hope that our results lend additional confidence to the numerical results obtained with the help of methods based on QCD sum rules. Electrical Impedance Tomography (EIT) is a technology developed to image the electrical conductivity distribution of a conductive medium. The technique works by performing simultaneous measurements of direct or alternating electric currents and voltages on the boundary of an object. These are the data used by an image reconstruction algorithm to determine the electrical conductivity distribution within the object. In this thesis, two approaches to EIT image reconstruction are proposed. The first is based on reformulating the inverse problem in terms of integral equations; this method uses only a single set of measurements for the reconstruction. The second approach is an algorithm based on linearisation which uses more than one set of measurements.
A promising result is that one can qualitatively reconstruct the conductivity inside the cross-section of a human chest. Even though the human volunteer is neither two-dimensional nor circular, such reconstructions can be useful in medical applications: monitoring for lung problems such as accumulating fluid or a collapsed lung and noninvasive monitoring of heart function and blood flow.
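The linearisation-based approach typically reduces, at each step, to a regularised linear least-squares problem. The sketch below shows a generic Tikhonov-regularised step on a synthetic sensitivity matrix; the matrix sizes, noise level and regularisation weight are arbitrary illustrations, and this is not the thesis's actual algorithm or data.

```python
import numpy as np

def tikhonov_reconstruct(J, b, lam=1e-2):
    """One linearised EIT-style reconstruction step: solve the
    regularised normal equations (J^T J + lam*I) x = J^T b for the
    conductivity perturbation x given boundary measurements b and a
    sensitivity (Jacobian) matrix J."""
    n = J.shape[1]
    return np.linalg.solve(J.T @ J + lam * np.eye(n), J.T @ b)

# synthetic illustration: a hypothetical sensitivity matrix and a
# known perturbation, recovered from noisy "measurements"
rng = np.random.default_rng(0)
J = rng.standard_normal((40, 20))   # 40 measurements, 20 pixels
x_true = np.zeros(20)
x_true[7] = 1.0                     # a single conductive inclusion
b = J @ x_true + 0.01 * rng.standard_normal(40)
x_rec = tikhonov_reconstruct(J, b)
```

In a real EIT solver J would come from a forward model of the current-voltage physics; here it is random purely to make the sketch self-contained.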
Abstract:
Apart from the article that forms the main content, most HTML documents on the WWW contain additional contents such as navigation menus, design elements or commercial banners. In the context of several applications it is necessary to distinguish automatically between main and additional content. Content extraction and template detection are the two approaches to this task. This thesis gives an extensive overview of existing algorithms from both areas. It contributes an objective way to measure and evaluate the performance of content extraction algorithms under different aspects. These evaluation measures allow the first objective comparison of existing extraction solutions. The newly introduced content code blurring algorithm overcomes several drawbacks of previous approaches and proves to be the best content extraction algorithm at the moment. An analysis of methods to cluster web documents according to their underlying templates is the third major contribution of this thesis. In combination with a localised crawling process, this clustering analysis can be used to automatically create sets of training documents for template detection algorithms. As the whole process can be automated, it allows template detection to be performed on a single document, thereby combining the advantages of single- and multi-document algorithms.
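To give a flavour of what a content extraction algorithm does, here is a minimal text-density heuristic. This is explicitly not the thesis's content code blurring algorithm, just a common baseline idea: lines dominated by markup (menus, banners) have a low ratio of visible text to total characters, while article lines have a high one.

```python
import re

def extract_main_content(html, threshold=0.5):
    """Keep only lines whose ratio of visible text (tags stripped) to
    total characters exceeds `threshold`.  A toy baseline heuristic,
    not a production content extractor."""
    kept = []
    for line in html.splitlines():
        text = re.sub(r"<[^>]+>", "", line).strip()
        stripped = line.strip()
        if stripped and len(text) / len(stripped) > threshold:
            kept.append(text)
    return "\n".join(kept)

page = """<div class="nav"><a href="/">Home</a><a href="/news">News</a></div>
<p>This article paragraph carries the actual content of the page.</p>
<div class="ad"><img src="banner.png"/></div>"""
main = extract_main_content(page)
```

The navigation and banner lines are mostly markup and fall below the threshold, so only the article paragraph survives.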
Abstract:
This thesis analyses problems related to the applicability, in business environments, of Process Mining tools and techniques. The first contribution is a presentation of the state of the art of Process Mining and a characterization of companies in terms of their "process awareness". The work continues by identifying circumstances where problems can emerge: data preparation; the actual mining; and results interpretation. Other problems are the configuration of parameters by non-expert users and computational complexity. We concentrate on two possible scenarios: "batch" and "on-line" Process Mining. Concerning batch Process Mining, we first investigated the data preparation problem and proposed a solution for the identification of "case-ids" whenever this field is not explicitly recorded. After that, we concentrated on problems at mining time and proposed a generalization of a well-known control-flow discovery algorithm that exploits non-instantaneous events. The use of interval-based recording leads to an important improvement in performance. Later on, we report our work on parameter configuration for non-expert users. We present two approaches to select the "best" parameter configuration: one is completely autonomous; the other requires human interaction to navigate a hierarchy of candidate models. Concerning data interpretation and results evaluation, we propose two metrics: a model-to-model metric and a model-to-log metric. Finally, we present an automatic approach for the extension of a control-flow model with social information, in order to simplify the analysis of these perspectives. The second part of this thesis deals with control-flow discovery algorithms in on-line settings. We propose a formal definition of the problem and two baseline approaches.
Two actual mining algorithms are proposed: the first is the adaptation of a frequency counting algorithm to the control-flow discovery problem; the second constitutes a framework of models which can be used for different kinds of streams (stationary versus evolving).
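A common way to adapt frequency counting to on-line control-flow discovery is to run Lossy Counting (Manku and Motwani) over directly-follows pairs extracted from the event stream. The sketch below illustrates that idea under those assumptions; it is not the thesis's exact algorithm, and the event stream is invented.

```python
from math import ceil

class LossyCounter:
    """Lossy Counting over a stream of items; here the items are
    directly-follows pairs (a, b) extracted from an event stream, which
    is one standard way to feed a control-flow discovery algorithm from
    a stream with bounded memory."""
    def __init__(self, epsilon=0.01):
        self.w = ceil(1.0 / epsilon)  # bucket width
        self.n = 0                    # items seen so far
        self.counts = {}              # item -> (count, bucket delta)

    def add(self, item):
        self.n += 1
        bucket = ceil(self.n / self.w)
        f, delta = self.counts.get(item, (0, bucket - 1))
        self.counts[item] = (f + 1, delta)
        if self.n % self.w == 0:      # end of bucket: prune rare items
            self.counts = {k: (f, d) for k, (f, d) in self.counts.items()
                           if f + d > bucket}

    def frequent(self, min_count):
        return {k for k, (f, _) in self.counts.items() if f >= min_count}

# feed directly-follows pairs from a stream of (case_id, activity) events
lc = LossyCounter(epsilon=0.1)
last = {}  # last observed activity per case
stream = [("c1", "A"), ("c2", "A"), ("c1", "B"), ("c2", "B"),
          ("c1", "C"), ("c2", "C")] * 5
for case, act in stream:
    if case in last:
        lc.add((last[case], act))
    last[case] = act
```

The approximate counts undercount each item by at most epsilon times the stream length, which is what makes the memory footprint bounded regardless of how long the stream runs.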
Abstract:
The only nuclear-model-independent method for the determination of nuclear charge radii of short-lived radioactive isotopes is the measurement of the isotope shift. For light elements (Z < 10), extremely high accuracy in both experiment and theory is required, and it has so far been reached only for He and Li. The nuclear charge radii of the lightest elements are of great interest because they include isotopes which exhibit so-called halo nuclei. Such nuclei are characterized by a very exotic nuclear structure: they have a compact core and an area of less dense nuclear matter that extends far from this core. Examples of halo nuclei are 6^He, 8^He, 11^Li and 11^Be, the last of which is investigated in this thesis. Furthermore, these isotopes are of interest because up to now the nuclear structure can be calculated ab initio only for such few-nucleon systems. At the Institut für Kernchemie at the Johannes Gutenberg-Universität Mainz, two approaches with different accuracy were developed, with the goal of measuring the isotope shifts between (7,10,11)^Be^+ and 9^Be^+ in the D1 line. The first approach is laser spectroscopy on laser-cooled Be^+ ions trapped in a linear Paul trap. The accessible accuracy should be on the order of a few 100 kHz. In this thesis, two types of linear Paul traps were developed for this purpose. Moreover, the peripheral experimental setup was simulated and constructed. It allows the efficient deceleration of fast ions with an initial energy of 60 keV down to a few eV and their efficient transport into the ion trap. For one of the Paul traps, ion trapping could already be demonstrated, while the optical detection of captured 9^Be^+ ions could not be completed, because the development work was delayed by the second approach. The second approach uses the technique of collinear laser spectroscopy, which has already been applied over the last 30 years to measure isotope shifts of plenty of heavier isotopes.
For light elements (Z < 10), it had so far not been possible to reach the accuracy required to extract information about nuclear charge radii. The combination of collinear laser spectroscopy with the most modern methods of frequency metrology finally permitted the first determination of the nuclear charge radii of (7,10)^Be and the one-neutron halo nucleus 11^Be at the COLLAPS experiment at ISOLDE/CERN. In the course of the work reported in this thesis, it was possible to measure the absolute transition frequencies and the isotope shifts in the D1 line for the Be isotopes mentioned above with an accuracy better than 2 MHz. Combination with the most recent calculations of the mass effect allowed the extraction of the nuclear charge radii of (7,10,11)^Be with a relative accuracy better than 1%. The nuclear charge radius decreases continuously from 7^Be to 10^Be and increases again for 11^Be. This result is compared with predictions of ab initio nuclear models, which reproduce the observed trend. In particular, the "Green's Function Monte Carlo" and "Fermionic Molecular Dynamics" models show very good agreement.
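The extraction step described above rests on the standard isotope-shift decomposition: the measured shift is the calculated mass shift plus a field-shift term proportional to the change in the mean-square charge radius, dnu_exp = dnu_MS + F * d<r^2>. The sketch below encodes that relation; every number in the example is purely illustrative, not one of the thesis's measured values.

```python
def delta_r2(dnu_exp_MHz, dnu_ms_MHz, F_MHz_per_fm2):
    """Change in mean-square nuclear charge radius from a measured
    isotope shift: dnu_exp = dnu_ms + F * d<r^2>, hence
    d<r^2> = (dnu_exp - dnu_ms) / F.  Shifts in MHz, F in MHz/fm^2."""
    return (dnu_exp_MHz - dnu_ms_MHz) / F_MHz_per_fm2

def charge_radius(r_ref_fm, d_r2_fm2):
    """Absolute rms charge radius of the isotope, given the reference
    isotope's radius and the extracted d<r^2>."""
    return (r_ref_fm ** 2 + d_r2_fm2) ** 0.5

# hypothetical numbers, for illustration only
d_r2 = delta_r2(100.0, 66.0, -17.0)      # fm^2
r = charge_radius(2.5, 0.39)             # fm
```

The precision of the extracted radius is limited by both the experimental shift and the theoretical mass-shift calculation, which is why the sub-2 MHz measurement accuracy quoted above matters.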
Abstract:
This work focused on the synthesis of novel monomers for the design of a series of oligo(p-benzamide)s following two approaches: iterative solution synthesis and automated solid-phase protocols. These approaches provide a useful route to the sequence-controlled synthesis of side-chain- and main-chain-functionalized oligomers for the preparation of an immense variety of nanoscaffolds. The challenge in the synthesis of such materials was their modification while maintaining their characteristic properties (physical-chemical properties, shape persistence and anisotropy). The strategy for the preparation of predictable superstructures was devoted to the selective control of noncovalent interactions, monodispersity and monomer sequence. In addition, the structure-property correlation of the prepared rod-like soluble materials was examined. The first approach involved solution-based aramide synthesis via the introduction of a 2,4-dimethoxybenzyl N-amide protective group in an iterative synthetic strategy. The second approach focused on the implementation of the salicylic acid scaffold to introduce substituents on the aromatic backbone for the stabilization of the OPBA rotamers. The prepared oligomers were analyzed regarding their solubility and aggregation properties by systematically changing the degree of rotational freedom of the amide bonds, the side-chain polarity, the monomer sequence and the degree of oligomerization. The syntheses were performed on a modified commercial peptide synthesizer using a combination of fluorenylmethoxycarbonyl (Fmoc) and aramide chemistry. The automated synthesis allowed the preparation of aramides with potential applications as nanoscaffolds in supramolecular chemistry, e.g. comb-like-
Abstract:
This thesis deals with the transformation of ethanol into acetonitrile. Two approaches are investigated: (a) the ammoxidation of ethanol to acetonitrile and (b) the amination of ethanol to acetonitrile. The ammoxidation of ethanol to acetonitrile has been studied using several catalytic systems, such as vanadyl pyrophosphate, supported vanadium oxide, and multimetal molybdates and antimonates. The main conclusions are: (I) The surface acidity must be very low, because acidity catalyzes several undesired reactions, such as the formation of ethylene and of heavy compounds. (II) Supported vanadium oxide is the catalyst showing the best catalytic behaviour, but the role of the support is of crucial importance. (III) Both metal molybdates and antimonates show interesting catalytic behaviour, but they are poorly active and probably require harsher conditions than those used with the V-oxide-based catalysts. (IV) One key point in the reaction network is the rate of the reaction between acetaldehyde (the first intermediate) and ammonia, compared to the parallel rates of acetaldehyde transformation into by-products (CO, CO2, HCN, heavy compounds). Concerning the non-oxidative process, two possible strategies are investigated: (a) ethanol ammonolysis to ethylamine coupled with ethylamine dehydrogenation, and (b) the direct non-reductive amination of ethanol to acetonitrile. Despite the good results obtained in each single step, the former route does not lead to good results in terms of yield of acetonitrile. The direct amination can be catalyzed with good acetonitrile yield over catalysts based on supported metal oxides. Strategies aimed at limiting catalyst deactivation have also been investigated.
Abstract:
One of the main problems recognized in the sustainable development goals and sustainable agricultural objectives is climate change. Farming contributes significantly to the overall greenhouse gas (GHG) burden of the atmosphere, approximately 10-12 percent of total GHG emissions; when land-use change is also taken into consideration, including deforestation driven by agricultural expansion for food, fibre and fuel, the number rises to approximately 30 percent (Smith et al., 2007). There are two distinct methodological approaches to environmental impact assessment: Life Cycle Assessment (a bottom-up approach) and Input-Output Analysis (a top-down approach). The two methodologies differ significantly, and there is no immediate choice between them if the scope of the study is at a sectoral level. Instead, as an alternative, hybrid approaches which combine the two have emerged. The aim of this study is to analyse in greater detail the agricultural sector's contribution to climate change caused by the consumption of food products: to identify the food products that have the greatest impact over their life cycle, to identify their hotspots, and to evaluate the mitigation possibilities for them. At the same time, methodological possibilities and models to be applied for this purpose are evaluated both at the EU level and at the country level (Italy).
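The top-down (Input-Output) side of the hybrid approach rests on the Leontief quantity model: total sectoral output x satisfies x = Ax + y for a technical-coefficient matrix A and final demand y, and multiplying by direct emission intensities gives a consumption-based GHG footprint. A minimal sketch with entirely made-up coefficients:

```python
import numpy as np

def footprint(A, f, y):
    """Environmentally extended input-output footprint: solve the
    Leontief balance x = A x + y for total output x, then multiply by
    the direct emission intensities f (emissions per unit output)."""
    n = A.shape[0]
    x = np.linalg.solve(np.eye(n) - A, y)
    return f * x

# toy 2-sector economy (agriculture, manufacturing); all numbers invented
A = np.array([[0.1, 0.2],    # inter-industry technical coefficients
              [0.3, 0.1]])
f = np.array([2.0, 0.5])     # kg CO2-eq per unit of output
y = np.array([100.0, 50.0])  # final demand, e.g. food consumption
emissions = footprint(A, f, y)
```

Note how the solved output x exceeds final demand y: the Leontief inverse captures the indirect upstream production, which is exactly what the top-down approach adds over a truncated bottom-up system boundary.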
Abstract:
This PhD thesis is focused on the development of fibrous polymeric scaffolds for tissue engineering applications and on the improvement of the scaffolds' biomimetic properties. Scaffolds were fabricated by electrospinning, which makes it possible to obtain scaffolds made of polymeric micro- or nanofibers. Biomimetic character was enhanced by following two approaches: (1) the use of natural biopolymers, and (2) the modification of the fiber surface chemistry. Gelatin was chosen for its bioactive properties and cellular affinity; however, it lacks mechanical strength. This problem was overcome by adding poly(lactic acid) to the scaffold through co-electrospinning, and the mechanical properties of the composite constructs were assessed. Gelatin effectively improves cell growth and viability and, notably, composite scaffolds of gelatin and poly(lactic acid) were more effective than a plain gelatin scaffold. Scaffolds made of pure collagen fibers were also fabricated, and the modification of the collagen triple-helix structure in electrospun collagen fibers was studied. Mechanical properties were evaluated before and after crosslinking. The crosslinking procedure was developed and optimized by using, for the first time on electrospun collagen fibers, the crosslinking reactant 1,4-butanediol diglycidyl ether, with good results in terms of fiber stabilization. Cell culture experiments showed good results in terms of cell adhesion and morphology. The fiber surface chemistry of an electrospun poly(lactic acid) scaffold was modified by plasma treatment. Plasma did not affect the thermal and mechanical properties of the scaffold, while it greatly increased its hydrophilicity through the introduction of carboxyl groups at the fiber surface. This fiber functionalization enhanced fibroblast cell viability and spreading. Surface modifications by chemical reactions were conducted on electrospun scaffolds made of a polysophorolipid, with the aim of introducing a biomolecule at the fiber surface.
By developing a series of chemical reactions, one oligopeptide for every three repeating units of the polysophorolipid was grafted onto the surface of the electrospun fibers.
Abstract:
Epoxy resins are mainly produced by reacting bisphenol A with epichlorohydrin. Growing concerns about the negative health effects of bisphenol A are urging researchers to find alternatives. In this work diphenolic acid is suggested, as it derives from levulinic acid, obtained from renewable resources. Nevertheless, it is also synthesized from phenol, a fossil resource, which in the present work has been substituted by plant-based phenols. Two interesting derivatives were identified: diphenolic acid from catechol and from resorcinol. Epichlorohydrin, on the other hand, is highly carcinogenic and volatile, leading to a tremendous risk of exposure. Thus, two alternative approaches have been investigated and compared with epichlorohydrin. The resulting resins have been characterized to find appropriate applications, as epoxies are commonly used for a wide range of products, ranging from composite materials for boats to films for food cans. Self-curing capacity was observed for the resin deriving from diphenolic acid from catechol. The glycidyl ether of the diphenolic acid from resorcinol, a fully renewable compound, was cured in isothermal and non-isothermal tests tracked by DSC. Two aliphatic amines were used, namely 1,4-butanediamine and 1,6-hexamethylenediamine, in order to determine the effect of chain length on the curing of an epoxy-amine system and to determine the kinetic parameters. The latter are crucial for planning any industrial application. Both diamines demonstrated superior properties compared to traditional bisphenol A-amine systems.
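Kinetic parameters from non-isothermal DSC curing data are commonly obtained with the classical Kissinger method: a linear fit of ln(beta/Tp^2) against 1/Tp over several heating rates has slope -Ea/R. The abstract does not state which kinetic model was actually used, so the sketch below is a hedged illustration with invented peak temperatures, not the thesis's data.

```python
import numpy as np

def kissinger_activation_energy(beta_K_per_min, Tp_K):
    """Kissinger analysis of non-isothermal DSC data: fit
    ln(beta / Tp^2) vs 1/Tp; the slope equals -Ea/R, giving the
    apparent activation energy Ea of the curing reaction in J/mol."""
    R = 8.314  # J/(mol K)
    beta = np.asarray(beta_K_per_min, float)
    Tp = np.asarray(Tp_K, float)
    slope, _ = np.polyfit(1.0 / Tp, np.log(beta / Tp ** 2), 1)
    return -slope * R

# invented exothermic peak temperatures at four heating rates
beta = [5.0, 10.0, 15.0, 20.0]   # K/min
Tp = [380.0, 391.0, 398.0, 403.0]  # K
Ea = kissinger_activation_energy(beta, Tp)
```

With these invented inputs Ea comes out around 70 kJ/mol, in the range typical of epoxy-amine curing, which is the kind of sanity check one would apply to real DSC peaks.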
Abstract:
In this thesis two approaches were applied to achieve a double general objective. The first chapter is dedicated to the study of the distribution of the expression of several bitter- and fat-receptor genes in several gastrointestinal tracts. A set of 7 genes for bitter taste and 3 genes for fat taste was amplified by real-time PCR from mRNA extracted from 5 gastrointestinal segments of weaned pigs. The presence of gene expression for several chemosensing receptors for bitter and fat taste in different compartments of the stomach confirms that this organ should be considered a player in the early detection of bolus composition. In the second chapter we investigated in young pigs the distribution of the butyrate-sensing olfactory receptor OR51E1 along the GIT, its relation with some endocrine markers, its variation with age, its response to interventions affecting the gut environment and intestinal microbiota in piglets, and its presence in different tissues. Our results indicate that OR51E1 is strictly related to normal GIT enteroendocrine activity. In the third chapter we investigated the differential gene expression between oxyntic and pyloric mucosa in seven starter pigs. The obtained data indicate that there is significant differential gene expression between the oxyntic and pyloric mucosa of the young pig, and further functional studies are needed to confirm its physiological importance. In the last chapter, thymol, which has been proposed as an oral alternative to antibiotics in the feed of pigs and broilers, was introduced directly into the stomach of 8 weaned pigs, and the gastric oxyntic and pyloric mucosa were sampled. The analysis of whole-transcript expression shows that the stimulation of gastric proliferative activity and the control of digestive activity by thymol can positively influence gastric maturation and function in weaned pigs.
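The abstract does not say how the real-time PCR signals were quantified; a common default for relative expression across tissues is the 2^-ddCt method of Livak and Schmittgen, sketched here with entirely hypothetical Ct values (target receptor vs. a housekeeping gene, one gastrointestinal segment vs. a control segment).

```python
def ddct_fold_change(ct_target_sample, ct_ref_sample,
                     ct_target_control, ct_ref_control):
    """Relative gene expression by the 2^-ddCt method: normalise the
    target Ct to a reference (housekeeping) gene in both sample and
    control, then exponentiate the difference.  A standard model, not
    necessarily the one used in the thesis."""
    dct_sample = ct_target_sample - ct_ref_sample
    dct_control = ct_target_control - ct_ref_control
    return 2.0 ** -(dct_sample - dct_control)

# hypothetical Ct values: stomach sample vs. jejunum control
fold = ddct_fold_change(24.0, 18.0, 27.0, 19.0)
```

A fold change above 1 would indicate higher relative expression of the receptor in the sampled segment than in the control segment, under the method's assumption of near-100% amplification efficiency.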
Abstract:
Liquid-crystalline elastomers (LCEs) exhibit a reversible contraction and are therefore also referred to in the literature as "artificial muscles". In this work they are equipped with an integrated heater in order to enable fast and precise actuation. They are then used as actuators to realize a technical replica of the human eye. The unique behaviour of liquid-crystalline elastomers is based on the combination of the entropy elasticity of the elastomer with the self-organization of the liquid-crystalline units (mesogens). These two properties enable a reversible, macroscopic deformation at the phase transition of the liquid crystal into the isotropic phase. It is important here to generate a homogeneous orientation of the mesogens, which in this work is achieved with a magnetic field. Since a thermotropic liquid-crystalline elastomer is used, two approaches to heating the LCE internally are presented: on the one hand, carbon nanotubes are incorporated so that the material can be heated by radiation or electric current; on the other hand, a flexible heating wire is integrated, which is likewise heated by electric current. To realize the technical replica of the human eye, the fabrication of a liquid-crystalline iris is demonstrated. For this purpose a radial magnetic field is set up, which enables a radial orientation of the mesogens and in turn a radial contraction. Furthermore, two concepts for deforming an elastomer lens are presented: in the first, the lens is pulled apart by a ring-shaped LCE and thereby flattened; in the second, eight actuators are attached to a lens via anchors, which likewise enlarge the lens. In both cases, LCEs with the previously presented integrated heating wire are used.
Finally, the assembly of the technical replica of the human eye is presented, together with images that were recorded with it.
Abstract:
New treatment options for Niemann-Pick Type C (NPC) have recently become available. To assess the efficiency and efficacy of these new treatments, markers for disease status and progression are needed. Both the diagnosis and the monitoring of disease progression are challenging and mostly rely on clinical impression and functional testing of horizontal eye movements. Diffusion tensor imaging (DTI) provides information about microstructural integrity, especially of white matter. We show here in a case report how DTI, and measures derived from this imaging method, can serve as adjunct quantitative markers for disease management in Niemann-Pick Type C. Two approaches are taken. First, we compare the fractional anisotropy (FA) in the white matter globally between a 29-year-old NPC patient and 18 healthy age-matched controls and show the remarkable difference in FA relatively early in the course of the disease. Second, a voxelwise comparison of FA values reveals where white matter integrity is compromised locally and demonstrates an individualized analysis of FA changes before and after 1 year of treatment with Miglustat. This method might be useful in future treatment trials for NPC to assess treatment effects.
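The FA values compared above are computed voxelwise from the three eigenvalues of the diffusion tensor. The standard formula can be sketched as follows; the eigenvalues in the example are invented for illustration, not taken from the case report.

```python
import math

def fractional_anisotropy(l1, l2, l3):
    """Fractional anisotropy of a diffusion tensor from its three
    eigenvalues: FA = sqrt(3/2) * ||lambda - mean(lambda)|| / ||lambda||.
    FA is 0 for isotropic diffusion and approaches 1 for diffusion
    restricted to a single direction, as along coherent white-matter
    fibre bundles."""
    m = (l1 + l2 + l3) / 3.0
    num = math.sqrt((l1 - m) ** 2 + (l2 - m) ** 2 + (l3 - m) ** 2)
    den = math.sqrt(l1 ** 2 + l2 ** 2 + l3 ** 2)
    return math.sqrt(1.5) * num / den if den > 0 else 0.0

# isotropic diffusion -> FA = 0; strongly directional -> FA near 1
fa_iso = fractional_anisotropy(1.0, 1.0, 1.0)
fa_fiber = fractional_anisotropy(1.7, 0.2, 0.2)  # white-matter-like values
```

A voxelwise group comparison, as in the second approach of the case report, simply evaluates this scalar in every voxel and tests the patient's value against the control distribution.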
Abstract:
Gross dissection for demonstrating anatomy of the human pelvis has traditionally involved one of two approaches, each with advantages and disadvantages. Classic hemisection in the median plane through the pelvic ring transects the visceral organs but maintains two symmetric pelvic halves. An alternative paramedial transection compromises one side of the bony pelvis but leaves the internal organs intact. The authors propose a modified technique that combines advantages of both classical dissections. This novel approach involves dividing the pubic symphysis and sacrum in the median plane after shifting all internal organs to one side. The hemipelvis without internal organs is immediately available for further dissection of the lower limb. The hemipelvis with intact internal organs is ideal for showing the complex spatial relationships of the pelvic organs and vessels relative to the intact pelvic floor.