985 results for echo-hiding


Relevance:

10.00%

Publisher:

Abstract:

In this thesis, two major topics inherent in medical ultrasound imaging are addressed: deconvolution and segmentation. In the first case, a deconvolution algorithm is described that restores statistically consistent maximum a posteriori (MAP) estimates of the tissue reflectivity. These estimates are shown to provide a reliable source of information for accurately characterizing biological tissues through the ultrasound echo. The second topic involves the definition of a semi-automatic algorithm for myocardium segmentation in 2D echocardiographic images. The results show that the proposed method can reduce inter- and intra-observer variability in myocardial contour delineation and is feasible and accurate even on clinical data.
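The MAP restoration idea can be illustrated in a minimal form: under a white-Gaussian-noise model and a Gaussian prior on the reflectivity, the MAP estimate reduces to a Wiener-type filter in the frequency domain. The sketch below is plain NumPy on a synthetic RF line, not the thesis algorithm (which uses a statistically richer reflectivity model).

```python
import numpy as np

def map_deconvolve(rf_line, psf, snr=100.0):
    """MAP estimate of tissue reflectivity for y = h * x + n.

    With white Gaussian noise and a Gaussian prior on x, the MAP
    solution is the Wiener filter: X = conj(H) Y / (|H|^2 + 1/SNR).
    """
    n = len(rf_line)
    H = np.fft.fft(psf, n)                    # transfer function of the pulse
    Y = np.fft.fft(rf_line)
    G = np.conj(H) / (np.abs(H) ** 2 + 1.0 / snr)
    return np.real(np.fft.ifft(G * Y))

# synthetic example: three point reflectors blurred by a Gaussian pulse
rng = np.random.default_rng(0)
x = np.zeros(256)
x[[40, 100, 180]] = [1.0, -0.7, 0.5]
psf = np.exp(-0.5 * ((np.arange(9) - 4.0) / 1.5) ** 2)
y = np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(psf, 256)))  # circular blur
y += 0.01 * rng.standard_normal(256)
x_hat = map_deconvolve(y, psf)
```

The regularization term 1/SNR keeps the filter stable where the pulse spectrum is weak, so the recovered spikes are slightly smoothed but land at the correct depths.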


Summary: Nanocomposites of polymers and layered silicates are usually prepared from natural clays such as montmorillonite. For NMR and EPR studies of the surfactant layer that compatibilizes the silicate with the polymer, however, the iron content of natural clays is detrimental, because it shortens the relaxation times and broadens the lines in the spectra. This problem was overcome by hydrothermally synthesizing iron-free, structurally well-defined magadiite as the silicate component and using it for composite formation. The morphology of the magadiite was characterized by scanning electron microscopy, and the degree of intercalation of melt-intercalated polymer nanocomposites was determined by wide-angle X-ray scattering. Polymers with carbonyl groups appear to intercalate more readily than those without. Polycaprolactone intercalated both into organomagadiites based on ammonium surfactants and into those based on phosphonium surfactants. The nanosecond-timescale dynamics and the structure of the surfactant layer were investigated with site-specifically spin-labeled surfactant probes, using continuous-wave (CW) and pulsed methods of electron paramagnetic resonance (EPR) spectroscopy. In addition, static 2H nuclear magnetic resonance (NMR) on specifically deuterated surfactants was applied to capture the surfactant dynamics on a complementary timescale between microseconds and milliseconds. Both the CW-EPR and the 2H-NMR results reveal an acceleration of the surfactant dynamics upon intercalation of polycaprolactone, whereas in the non-intercalated microcomposites with polystyrene the surfactant dynamics slow down. The rotational correlation times and activation energies reveal distinct regimes of surfactant dynamics.
In polystyrene microcomposites the transition temperature between the regimes corresponds to the glass transition temperature of polystyrene, whereas in polycaprolactone nanocomposites it lies at the melting temperature of polycaprolactone. Because the electron spin relaxation times are substantially prolonged when iron-free magadiite is used, high-quality data can be obtained from pulsed EPR experiments. In particular, four-pulse double electron-electron resonance (DEER), electron spin echo envelope modulation (ESEEM), and electron-nuclear double resonance (ENDOR) were applied to spin-labeled as well as specifically deuterated surfactants. The ENDOR results suggest a model of the surfactant layer in which, in addition to the surface layers on the silicate, a well-defined middle layer exists. This model also explains dilution effects by the polymer in composites with polycaprolactone and polystyrene. The extensive information from the magnetic resonance techniques complements that from conventional characterization techniques such as X-ray diffraction and transmission electron microscopy, leading to a more detailed picture of the structure and dynamics of the surfactant layer in nanocomposites of polymers and layered silicates.
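Dynamic regimes of this kind are typically identified from an Arrhenius analysis of the rotational correlation time: a change of slope in ln(τc) versus 1/T marks the transition temperature. A minimal sketch of that model, with illustrative parameter values rather than the measured ones:

```python
import math

R = 8.314  # gas constant, J mol^-1 K^-1

def rotational_correlation_time(temp_k, tau0_s, ea_j_per_mol):
    """Arrhenius model tau_c(T) = tau_0 * exp(Ea / (R T)).

    In an Arrhenius plot of ln(tau_c) vs 1/T this is a straight line
    whose slope (Ea/R) changes when the dynamics enter a new regime,
    e.g. at the matrix glass transition or melting temperature.
    """
    return tau0_s * math.exp(ea_j_per_mol / (R * temp_k))

# illustrative values only (not from the thesis): tau_0 = 1 ps, Ea = 30 kJ/mol
tau_300 = rotational_correlation_time(300.0, 1e-12, 30e3)
tau_350 = rotational_correlation_time(350.0, 1e-12, 30e3)
```

Fitting this expression separately above and below the transition yields the two activation energies that distinguish the regimes.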


Since the discovery of the nuclear magnetic resonance (NMR) phenomenon, countless NMR techniques have been developed that are today indispensable tools in physics, chemistry, biology, and medicine. As one of the main obstacles in NMR is its notorious lack of sensitivity, different hyperpolarization (HP) methods have been established to increase signals by up to several orders of magnitude. In this work, different aspects of magnetic resonance using HP noble gases are studied, thereby combining different disciplines of research. The first part examines new fundamental effects in NMR of HP gases, in theory and experiment. The spin echo phenomenon, which provides the basis of numerous modern experiments, is studied in detail in the gas phase. The changes of the echo signal in amplitude, shape, and position caused by the fast translational motion are described by an extension of the existing theory and by computer simulations. With this knowledge as a prerequisite, the detection of intermolecular double-quantum coherences was accomplished for the first time in the gas phase. The second part of this thesis focuses on the development of a practical method to enhance the dissolution of HP 129Xe without loss of polarization or shortening of T1. Two different setups for application in NMR spectroscopy and magnetic resonance imaging (MRI) are presented. Continuous operation allows biological and multidimensional spectroscopy in solutions. Also, first in vitro MRI images with dissolved HP 129Xe as a contrast agent were obtained on a clinical scanner.
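Why fast translational motion matters for gas-phase echoes can be seen from the classic leading-order result for a Hahn echo in a constant field gradient, S(2τ) = exp(-(2/3) D γ² G² τ³): gas-phase diffusion coefficients are several orders of magnitude larger than in liquids, so the echo decays far faster. A minimal sketch of that textbook formula (not the extended theory developed in the thesis); the parameter values are order-of-magnitude assumptions.

```python
import numpy as np

GAMMA_XE129 = 2 * np.pi * 11.777e6  # |gamma| of 129Xe, rad s^-1 T^-1

def hahn_echo_amplitude(tau, grad, diff):
    """Relative Hahn-echo amplitude at echo time 2*tau for free
    diffusion in a constant gradient:
        S = exp(-(2/3) * D * gamma^2 * G^2 * tau^3)
    tau in s, grad in T/m, diff in m^2/s."""
    return np.exp(-(2.0 / 3.0) * diff * GAMMA_XE129**2 * grad**2 * tau**3)

# illustrative diffusion coefficients: Xe gas vs. a typical liquid
d_gas, d_liquid = 5.7e-6, 2.3e-9   # m^2/s (order-of-magnitude values)
amp_gas = hahn_echo_amplitude(5e-3, 10e-3, d_gas)      # tau = 5 ms, G = 10 mT/m
amp_liquid = hahn_echo_amplitude(5e-3, 10e-3, d_liquid)
```

The τ³ dependence is what makes gradient-driven diffusive attenuation dominate echo amplitude, shape, and position in gases.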


Background: Cardiovascular disease (CVD) is a common cause of morbidity and mortality in childhood chronic kidney disease (CKD). Left ventricular hypertrophy (LVH) is known to be one of the earliest events in CVD development, and left ventricular diastolic function (DF) is thought to be impaired in children with CKD as well. Tissue Doppler imaging (TDI) provides an accurate measure of DF and is less load-dependent than conventional echocardiography. Aim: To evaluate LV mass and DF in a population of children with CKD. Methods: 37 patients, median age 10.4 years (range 3.3-19.8), were analyzed; underlying renal diseases were hypo/dysplasia (N=28), nephronophthisis (N=4), Alport syndrome (N=2), and ARPKD (N=3). Thirty-eight percent of the patients were in stage 1-2 of CKD, 38% in stage 3, and 16% in stage 4. Three patients were on dialysis. The factors most frequently related to CVD in CKD were studied. LVH was defined as a left ventricular mass index (LVMI) greater than 35.7 g/m^2.7. Results: Twenty-five patients (81%) had LVH. LVMI and the diastolic function index (E'/A') were significantly related to the glomerular filtration rate (p<0.003 and p<0.004). Moreover, LVMI was correlated with phosphorus and hemoglobin levels (p<0.0001 and p<0.004). LVH was present from the earliest stages of CKD (58% of patients were in stages 1-2). Early-diastolic myocardial velocity was reduced in 73% of our patients. We did not find any correlation between LVH and systemic hypertension. Conclusion: Echocardiographic evaluation with TDI is recommended also in children before dialysis and with normal blood pressure. If LVH is diagnosed, periodic follow-up is necessary, together with treatment of the modifiable risk factors (hypertension, disturbances of calcium, phosphorus and PTH, anemia).
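The LVH criterion above is a simple calculation: LV mass indexed to height raised to the power 2.7, compared against the 35.7 g/m^2.7 cut-off. A small sketch (the numeric inputs in the example are hypothetical, not patient data from the study):

```python
def lv_mass_index(lv_mass_g, height_m):
    """Left ventricular mass indexed to height^2.7, in g/m^2.7."""
    return lv_mass_g / height_m ** 2.7

def has_lvh(lv_mass_g, height_m, cutoff=35.7):
    """LVH as defined in the abstract: LVMI > 35.7 g/m^2.7."""
    return lv_mass_index(lv_mass_g, height_m) > cutoff

# hypothetical child: height 1.40 m, echo-derived LV mass 95 g
lvmi = lv_mass_index(95.0, 1.40)   # ~38.3 g/m^2.7 -> above the cut-off
```

Indexing to height^2.7 rather than body surface area avoids misclassifying obese children, which is why it is preferred in pediatric CKD.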


The continuous advancement of wireless systems is enabling new compelling scenarios in which mobile services adapt to the current execution context, represented by the computational resources available on the local device, the current physical location, people in physical proximity, and so forth. Such services, called context-aware services, require the timely delivery of all relevant information describing the current context, and this introduces several unsolved complexities, spanning from low-level context data transmission up to context data storage and replication in the mobile system. In addition, to ensure correct and scalable context provisioning, it is crucial to integrate and interoperate with different wireless technologies (WiFi, Bluetooth, etc.) and modes (infrastructure-based and ad-hoc), and to use decentralized solutions to store and replicate context data on mobile devices. These challenges call for novel middleware solutions, here called Context Data Distribution Infrastructures (CDDIs), capable of delivering relevant context data to mobile devices while hiding all the issues introduced by data distribution in heterogeneous and large-scale mobile settings. This dissertation thoroughly analyzes CDDIs for mobile systems, with the main goal of achieving a holistic approach to the design of this type of middleware. We discuss the main functions needed by context data distribution in large mobile systems, and we argue for the precise definition and strict enforcement of quality-based contracts between context consumers and the CDDI, used to reconfigure the main middleware components at runtime. We present the design and implementation of our proposals, in both simulation-based and real-world scenarios, along with an extensive evaluation that confirms the technical soundness of the proposed CDDI solutions. Finally, we consider three highly heterogeneous scenarios, namely disaster areas, smart campuses, and smart cities, to further demonstrate the broad technical validity of our analysis and solutions under different network deployments and quality constraints.
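A quality-based contract between a context consumer and the CDDI can be pictured as a small, checkable data structure. The sketch below is purely illustrative: the field names (staleness, coverage, latency) are hypothetical examples of delivery-quality dimensions, not the dissertation's actual contract model.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class QualityContract:
    """Hypothetical quality-based contract between a context consumer
    and a CDDI; field names are illustrative, not from the dissertation."""
    max_staleness_s: float   # how old delivered context data may be
    min_coverage: float      # fraction of relevant context sources included
    max_latency_ms: float    # delivery deadline

    def satisfied_by(self, staleness_s, coverage, latency_ms):
        """Check a delivery against the contract; a violation would
        trigger runtime reconfiguration of the middleware components."""
        return (staleness_s <= self.max_staleness_s
                and coverage >= self.min_coverage
                and latency_ms <= self.max_latency_ms)

contract = QualityContract(max_staleness_s=5.0, min_coverage=0.8,
                           max_latency_ms=200.0)
ok = contract.satisfied_by(staleness_s=2.0, coverage=0.9, latency_ms=150.0)
```

The point of making the contract explicit is that the middleware can monitor it and swap distribution strategies when it is violated, rather than hard-coding one policy.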


Structure and folding of membrane proteins are important issues in molecular and cell biology. In this work, new approaches are developed to characterize the structure of folded, unfolded, and partially folded membrane proteins. These approaches combine site-directed spin labeling and pulsed EPR techniques. The major plant light-harvesting complex LHCIIb was used as a model system. Measurements of the longitudinal and transverse relaxation times of electron spins, and of hyperfine couplings to neighboring nuclei by electron spin echo envelope modulation (ESEEM), provide complementary information about the local environment of a single spin label. By double electron-electron resonance (DEER), distances in the nanometer range between two spin labels can be determined. The results are analyzed in terms of the relative water accessibilities of different sites in LHCIIb and its geometry, and they reveal conformational changes as a function of micelle composition. This arsenal of methods is used to study protein folding during LHCIIb self-assembly, and a spatially and temporally resolved folding model is proposed. The approaches developed here are potentially applicable to studying the structure and folding of any protein or other self-assembling structure, provided that site-directed spin labeling is feasible and the timescale of folding is accessible to freeze-quench techniques.
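DEER measures the dipolar coupling between the two spin labels, and the distance follows from the point-dipole r⁻³ law. A minimal converter using the standard constant for nitroxide electron-spin pairs (a textbook relation, not code from this work):

```python
def deer_distance_nm(nu_dd_mhz):
    """Inter-spin distance from the perpendicular dipolar frequency,
    point-dipole approximation for two nitroxide electron spins:
        nu_dd [MHz] = 52.04 / r^3   (r in nanometers)."""
    return (52.04 / nu_dd_mhz) ** (1.0 / 3.0)

def deer_frequency_mhz(r_nm):
    """Inverse relation: dipolar frequency for a given distance."""
    return 52.04 / r_nm ** 3

# a 2 MHz dipolar oscillation corresponds to roughly a 3 nm separation
r = deer_distance_nm(2.0)
```

The cube-root dependence is why DEER is most sensitive in the 1.5-8 nm range typical of labeled membrane proteins.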


The major light-harvesting complex (LHCII) of the photosynthetic apparatus of higher plants is among the most abundant membrane proteins on Earth. Its crystal structure is known. The apoprotein can be overexpressed recombinantly in Escherichia coli and thus modified in many ways by molecular biology. In detergent solution, the denatured protein has the remarkable ability to organize spontaneously into functional protein-pigment complexes that are structurally nearly identical to native LHCII. The folding process takes place in vitro on a timescale of seconds to minutes and depends on the binding of the cofactors chlorophyll a and b as well as various carotenoids.
These properties make LHCII particularly suitable for structural studies by electron paramagnetic resonance (EPR) spectroscopy. EPR requires site-specific spin labeling of LHCII, which was first optimized in this work. Including contributions from others, a broad selection of more than 40 spin-labeled LHCII mutants was available, including an N-terminal "Cys walk". Neither the exchange of single amino acids required for this nor the attachment of the spin label impaired the function of LHCII. In addition, a protocol was developed for preparing heterogeneously spin-labeled LHCII trimers, i.e. trimers each containing only one spin-labeled monomer.
Spin-labeled samples of detergent-solubilized LHCII were analyzed structurally using various EPR techniques. Measuring the water accessibility of individual amino acid positions by electron spin echo envelope modulation (ESEEM) proved particularly informative. In combination with the established double electron-electron resonance (DEER) technique for detecting distances between two spin labels, the membrane-embedded core region of LHCII in solution was examined in detail and found to be structurally very similar to the crystal structure. Measurements of regions near the N-terminus not resolved crystallographically revealed the previously detected structural dynamics of this domain as a function of the degree of oligomerization. The new, still-to-be-completed data set of distance distributions and ESEEM water accessibilities of monomeric and trimeric samples should soon allow very accurate modeling of the N-terminal domain of LHCII.
In a further part of the work, the folding of the LHCII apoprotein during LHCII assembly in vitro was investigated. Previous fluorescence-spectroscopic work had shown that chlorophyll a and b bind in consecutive steps, on timescales of less than one minute and of several minutes, respectively. Both the water accessibility of individual amino acid positions and spin-spin distances changed on similar timescales. The data indicate that formation of the central transmembrane helix accompanies the faster chlorophyll a binding, whereas the superhelix formed by the other two transmembrane helices develops only in the slower step, together with chlorophyll b binding.


INTRODUCTION Echocardiography is the standard clinical approach for quantifying the severity of aortic stenosis (AS). A comprehensive examination of its overall reproducibility, with simultaneous estimation of the variance components due to multiple operators, readers, probe applications, and beats, had not previously been undertaken. METHOD AND RESULTS Twenty-seven subjects with AS were scanned over 7 months in the echo department by a median of 3 different operators. From each patient and each operator, multiple runs of beats from multiple probe positions were stored for later analysis by multiple readers. The coefficient of variation was 13.3%, 15.9%, 17.6%, and 20.2% for the aortic peak velocity (Vmax), the aortic velocity-time integral (VTI), and the left ventricular outflow tract (LVOT) Vmax and VTI, respectively. The largest individual contributors to the overall variability were beat-to-beat variability (9.0%, 9.3%, 9.5%, and 9.4%, respectively) and the inability of an individual operator to apply the probe to precisely the same position twice (8.3%, 9.4%, 12.9%, and 10.7%, respectively). The tracing (intra-reader), reader (inter-reader), and operator (inter-operator) contributions were less important. CONCLUSIONS Reproducibility of measurements in AS is poorer than often reported in the literature. The source of this variability does not appear, as traditionally believed, to result from a lack of training or from operator- and reader-specific factors. Rather, the unavoidable beat-to-beat biological variability and the inherent impossibility of applying the ultrasound probe in exactly the same position each time are the largest contributors. Consequently, guidelines suggesting greater standardization of procedures and further training for sonographers are unlikely to improve precision. Clinicians themselves should be wary of relying even on three-beat averages, whose expected coefficient of variation is 10.3% for the peak velocity at the aortic valve.
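The variance-component logic behind the three-beat-average argument can be sketched in a few lines: independent sources of variability add in quadrature, and averaging n beats shrinks only the beat-to-beat term, by √n. The sketch below uses just the two dominant components reported in the abstract for aortic Vmax (9.0% beat-to-beat, 8.3% probe placement); the paper's full decomposition includes the smaller reader and operator terms, so the exact 10.3% figure is not reproduced here.

```python
import math

def combined_cv(component_cvs, beat_cv, n_beats=1):
    """Combine independent variability sources (given as CVs, in %)
    in quadrature. Averaging n beats reduces only the beat-to-beat
    variance (by a factor n); the other terms are unaffected."""
    variance = beat_cv ** 2 / n_beats + sum(c ** 2 for c in component_cvs)
    return math.sqrt(variance)

# aortic Vmax, dominant terms only: beat-to-beat 9.0%, probe placement 8.3%
single_beat = combined_cv([8.3], beat_cv=9.0, n_beats=1)
three_beat = combined_cv([8.3], beat_cv=9.0, n_beats=3)
```

Even with infinite beats averaged, the probe-placement term puts a floor under the achievable CV, which is the abstract's central point.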


This study links media education, homosexual sociality, and the methodology of biographical analysis. Its starting point is a social-constructivist perspective on gender and (homo)sexuality, with a socio-historical contextualization of homosexuality that takes discrimination into account. The focus is on the coming-out process, which oscillates between showing and hiding and which finds in the Internet a space where new definitions of homosexual identities and new forms of homosexual sociality become possible. Communicative aspects of the Internet are explicated in detail and complemented by Marotzki's (2009) structural theory of media education in order to describe the educational processes possibly connected with them. Within this theory, four critical horizons of reflection (relating to knowledge, action, boundaries, and biography) are developed and related to the possibilities of articulation and presentation the Internet offers. It becomes clear that the Internet provides room for experimenting with identities, which holds potential for real identity constructions. This potential is made tangible through the media-pedagogical construct of the media biography as well as through concepts from educational biography research (the construct of Bildung after Marotzki, 1990a; the construct of the sexual biography after Scheuermann, 1999; 1995). Empirically, the study follows the methodology and methods of biographical research, grounded theory (Glaser/Strauss, 1967), and Schütze's narrative-structural procedure (1984, 1983). Specifically, the following research questions are addressed: How do learning and educational processes unfold for homosexual men in digital media worlds? What possibilities and creative opportunities exist for representing the (sexual) self on the Internet?
What effects do these virtual processes have on the real-life biography and on the self-world relationship of the individual homosexual men? Through the reconstruction of four case studies, the possibilities the Internet offers homosexual men for representation and identity formation are presented, highlighting both the malleability of constructions of sexual identity and the problem of the subject-environment relation. A contrastive comparison of the individual cases follows (dimensions: family, peer group, sexual-biographical development, media education processes, biographical case structure), leading to the construction of four ideal-typical process variants of sexual-biographical identity development. Four different modes of the Internet as a presentation space for one's own sexuality and for constructions of homosexual identity can thus be sketched (displacement into virtuality, purpose orientation, reflexive balancing, periodic self-actualization). Tentative educational and identity processes are thus possible in the virtuality of the Internet and can act back, recursively and circularly, on real identity development and on real access to specific social groups.


Nitrogen is an essential nutrient. For humans, animals, and plants it is a constituent element of proteins and nucleic acids. Although the majority of the Earth's atmosphere consists of elemental nitrogen (N2, 78%), only a few microorganisms can use it directly. To be useful for higher plants and animals, elemental nitrogen must be converted to a reactive, oxidized form. This conversion happens within the nitrogen cycle, through free-living microorganisms, symbiotic Rhizobium bacteria, or lightning. Since the beginning of the 20th century, humans have been able to synthesize reactive nitrogen through the Haber-Bosch process, which noticeably improved the food security of the world population. On the other hand, the increased nitrogen input results in acidification and eutrophication of ecosystems and in loss of biodiversity, and negative health effects for humans have arisen, such as those from fine particulate matter and summer smog. Furthermore, reactive nitrogen plays a decisive role in atmospheric chemistry and in the global cycles of pollutants and nutrients.

Nitrogen monoxide (NO) and nitrogen dioxide (NO2) belong to the reactive trace gases and are grouped under the generic term NOx. They are important components of atmospheric oxidative processes and influence the lifetime of various less reactive greenhouse gases. NO and NO2 are generated, among other sources, during combustion by oxidation of atmospheric nitrogen, as well as by biological processes within soil. In the atmosphere, NO is converted very quickly into NO2, which is then oxidized to nitrate (NO3-) and nitric acid (HNO3); the latter binds to aerosol particles, and the bound nitrate is finally washed out of the atmosphere by dry and wet deposition. Catalytic reactions of NOx are an important part of atmospheric chemistry, forming or decomposing tropospheric ozone (O3). In the atmosphere, NO, NO2, and O3 are in a photostationary equilibrium and are therefore referred to as the NO-NO2-O3 triad. In regions with elevated NO concentrations, reactions with air pollutants can form additional NO2, altering the equilibrium of ozone formation.

Plants take up the essential nutrient nitrogen mainly as dissolved NO3- entering through the roots. Atmospheric nitrogen is oxidized to NO3- within the soil by bacteria, via nitrogen fixation or ammonium formation and nitrification. In addition, atmospheric NO2 is taken up directly through the stomata. Inside the apoplast, NO2 disproportionates to nitrate and nitrite (NO2-), which can enter the plant's metabolic processes; the enzymes nitrate reductase and nitrite reductase convert nitrate and nitrite to ammonium (NH4+). NO2 gas exchange is controlled by pressure gradients inside the leaves, the stomatal aperture, and leaf resistances, and stomatal regulation is affected by climate factors such as light intensity, temperature, and water vapor pressure deficit.

This thesis aims to contribute to the understanding of the role of vegetation in the atmospheric NO2 cycle and to discuss the NO2 compensation point concentration (mcomp,NO2). To this end, NO2 exchange between the atmosphere and spruce (Picea abies) was measured at the leaf level with a dynamic plant chamber system under laboratory and field conditions. Measurements took place during the EGER project (June-July 2008). In addition, NO2 data collected on oak (Quercus robur) during the ECHO project (July 2003) were analyzed. The measuring system allowed the simultaneous determination of NO, NO2, O3, CO2, and H2O exchange rates. Calculations of the NO, NO2, and O3 fluxes were based on the generally small differences (∆mi) measured between the inlet and the outlet of the chamber, so high accuracy and specificity of the analyzer are necessary. To meet these requirements, a highly specific NO/NO2 analyzer was used and the whole measurement system was optimized for enduring measurement precision.

Data analysis yielded a significant mcomp,NO2 only if statistical significance of ∆mi was detected; consequently, the significance of ∆mi was used as a data quality criterion. Photochemical reactions of the NO-NO2-O3 triad within the volume of the dynamic plant chamber must be considered in the determination of the NO, NO2, and O3 exchange rates; otherwise the deposition velocity (vdep,NO2) and mcomp,NO2 will be overestimated. No significant mcomp,NO2 could be determined for spruce under laboratory conditions, but under field conditions mcomp,NO2 was identified between 0.17 and 0.65 ppb, with vdep,NO2 between 0.07 and 0.42 mm s-1. In the field data for oak, no NO2 compensation point concentration could be determined; vdep,NO2 ranged between 0.6 and 2.71 mm s-1. There is increasing indication that forests are mainly a sink for NO2 and that potential NO2 emissions are low. Only if high NO soil emissions are assumed can more NO2 be formed by reaction with O3 than the plants are able to take up; under these circumstances, forests can be a source of NO2.
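Two relations from the text can be written out compactly: the compensation point is the ambient mixing ratio at which the leaf-scale flux changes sign (standard big-leaf formulation, not the thesis code), and the photostationary state of the NO-NO2-O3 triad is the Leighton ratio. The numeric values in the example are illustrative, within the ranges reported above.

```python
def no2_flux(m_ambient_ppb, v_dep_mm_s, m_comp_ppb):
    """Big-leaf exchange with a compensation point:
        F = -v_dep * (m_ambient - m_comp)
    Negative F means net uptake (deposition); F is exactly zero
    when the ambient mixing ratio equals the compensation point."""
    return -v_dep_mm_s * (m_ambient_ppb - m_comp_ppb)

def leighton_ratio(k_no_o3, o3, j_no2):
    """Photostationary state of the NO-NO2-O3 triad:
        [NO2]/[NO] = k(NO+O3) * [O3] / J(NO2)."""
    return k_no_o3 * o3 / j_no2

# illustrative values: m_comp = 0.4 ppb, v_dep = 0.2 mm/s
f_uptake = no2_flux(5.0, 0.2, 0.4)    # ambient above m_comp -> deposition
f_zero = no2_flux(0.4, 0.2, 0.4)      # at the compensation point -> no net flux
ratio = leighton_ratio(1.8e-14, 1e12, 8e-3)  # cm^3/s, molecules/cm^3, 1/s
```

The sign convention makes the compensation point visible at a glance: below it the leaf emits, above it the leaf absorbs.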


During the last few decades, unprecedented technological growth has been at the center of embedded systems design, with Moore's Law as the leading factor of this trend. Today an ever-increasing number of cores can be integrated on the same die, marking the transition from state-of-the-art multi-core chips to the new many-core design paradigm. Despite their extraordinarily high computing power, the complexity of many-core chips opens the door to several challenges. As a result of the increased silicon density of modern Systems-on-a-Chip (SoC), the design space exploration needed to find the best design has exploded, and hardware designers face the problem of a huge design space. Virtual Platforms have always been used to enable hardware-software co-design, but today they must cope with the huge complexity of both hardware and software systems. In this thesis two different research works on Virtual Platforms are presented: the first is intended for the hardware developer, to easily allow complex cycle-accurate simulations of many-core SoCs; the second exploits the parallel computing power of off-the-shelf General Purpose Graphics Processing Units (GPGPUs) with the goal of increased simulation speed. The term virtualization can be used in the context of many-core systems not only to refer to the aforementioned hardware emulation tools (Virtual Platforms), but also for two other main purposes: 1) to help the programmer achieve the maximum possible performance of an application by hiding the complexity of the underlying hardware; 2) to efficiently exploit the highly parallel hardware of many-core chips in environments with multiple active Virtual Machines. This thesis focuses on virtualization techniques with the goal of mitigating, and where possible overcoming, some of the challenges introduced by the many-core design paradigm.


Hyperpolarization techniques enhance nuclear spin polarization and thus allow for new nuclear magnetic resonance applications such as in vivo metabolic imaging. One of these techniques is Parahydrogen Induced Polarization (PHIP). It leads to a hyperpolarized 1H spin state that can be transferred to a heteronucleus such as 13C by a radiofrequency (RF) pulse sequence. In this work, the timing of such a sequence was analyzed and optimized for the molecule hydroxyethyl propionate. The pulse sequence was adapted for use on a clinical magnetic resonance imaging (MRI) system, which is usually equipped with only a single RF transmit channel. Optimizations based on optimal control theory were performed to achieve an optimized polarization transfer. A drawback of hyperpolarization is its limited lifetime due to relaxation processes. The lifetime can be increased by storing the hyperpolarization in a spin singlet state. The second part of this work therefore addresses the spin singlet state of the Cs-symmetric molecule dimethyl maleate, which needs to be converted to the spin triplet state to be detectable. This conversion was realized on a clinical MRI system, both by field cycling and by two RF pulse sequences that were adapted and optimized for this purpose. Using multiple conversions enables determination of the lifetime of the singlet state as well as the conversion efficiency of the RF pulse sequence. Both the hyperpolarized 13C spin state and the converted singlet state were used for MR imaging. Careful choice of the echo time was shown to be crucial for both molecules.
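For reference, the singlet state and the central triplet state of a pair of coupled spin-1/2 nuclei are, in standard notation (not specific to this work):

```latex
|S_0\rangle = \frac{1}{\sqrt{2}}\bigl(|\alpha\beta\rangle - |\beta\alpha\rangle\bigr),
\qquad
|T_0\rangle = \frac{1}{\sqrt{2}}\bigl(|\alpha\beta\rangle + |\beta\alpha\rangle\bigr)
```

The antisymmetric singlet carries no net magnetization and is immune to the intra-pair dipolar relaxation that limits T1, which is why its lifetime can exceed T1; it becomes observable only after conversion into the triplet manifold, e.g. by the field cycling or RF sequences described above.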


The use of the multibeam echosounder (MBES) in shallow transitional environments with complex conditions, such as the Venice lagoon, is still under study, and the biological and sedimentological data on the channels of the Venice lagoon available in the literature are currently scarce and dated. This study aims to map the habitats and anthropogenic objects of a channel of the Venice lagoon in a depth range between 0.3 and 20 m (San Felice channel) by analyzing the bathymetric and backscatter data acquired by ISMAR-Venezia within the RITMARE project. To this end, the seabed of the San Felice channel (Venice) was characterized from the geomorphological, sedimentological, and biological points of view, also describing any anthropogenic objects present. The echosounder used was the Kongsberg EM2040 Dual-Compact Multibeam, capable of emitting 800 beams (400 per transducer) at a maximum frequency of 400 kHz, which yielded excellent results despite the particular characteristics of lagoon environments. The acquired data were processed with the CARIS Hydrographic Information Processing System (HIPS) & SIPS software, which applies tide and sound velocity corrections and improves the quality of the raw MBES data. The data were then converted to ESRI Grid, a format compatible with the ArcGIS 10.2.1 (2013) software that we used for interpretation and map production. Ground-truthing techniques, based on video footage and sediment sampling (7 l Van Veen grab), were used to validate the backscatter and proved very effective and satisfactory for describing the seabed in terms of biology and substrate, and hence the habitats of the lagoon channel.
All the information collected during this study was organized in a geodatabase created for data concerning the Venice lagoon.
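At its core, the tide reduction applied during processing is simple arithmetic: measured depths are referred to chart datum by removing the water level above datum at ping time, after sound-velocity effects have been accounted for. The sketch below is a schematic of that step, not the actual algorithm of CARIS HIPS (which interpolates tide in time and applies full ray-traced sound-velocity corrections per beam).

```python
def reduce_depth(raw_depth_m, tide_m, sv_correction_m=0.0):
    """Reduce a measured MBES depth to chart datum.

    raw_depth_m      depth measured below the transducer (m)
    tide_m           water level above chart datum at ping time (m)
    sv_correction_m  pre-computed sound-velocity correction (m),
                     positive if the raw depth was underestimated
    """
    return raw_depth_m - tide_m + sv_correction_m

# at high tide (+0.5 m above datum) a 12.0 m sounding reduces to 11.5 m
charted = reduce_depth(12.0, 0.5)
```

Applying this consistently across a survey is what makes bathymetric grids from different days and tide states comparable.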


The authors present the case of an 81-year-old patient with severe aortic stenosis who experienced left ventricular embolization of an aortic bioprosthesis during transapical aortic valve implantation. The authors discuss the reasons for prosthesis embolization and stress the importance of technical details and of the widespread use of multimodality imaging techniques. In this context, transesophageal echocardiography appears indispensable for the detection and management of procedure-related complications.


BACKGROUND: Physiologic data display is essential to decision making in critical care. Current displays echo first-generation hemodynamic monitors dating to the 1970s and have not kept pace with new insights into physiology or the needs of clinicians who must make progressively more complex decisions about their patients. The effectiveness of any redesign must be tested before deployment, so tools that compare current displays with novel presentations of processed physiologic data are required. Regenerating conventional physiologic displays from archived physiologic data is an essential first step. OBJECTIVES: The purposes of the study were to (1) describe the SSSI (single sensor, single indicator) paradigm currently used for physiologic signal displays, (2) identify and discuss possible extensions and enhancements of the SSSI paradigm, and (3) develop a general approach and a software prototype for constructing such "extended SSSI displays" from raw data. RESULTS: We present the Multi Wave Animator (MWA) framework, a set of open-source MATLAB (MathWorks, Inc., Natick, MA, USA) scripts aimed at creating dynamic visualizations (e.g., video files in AVI format) of patient vital signs recorded from bedside (intensive care unit or operating room) monitors. Multi Wave Animator creates animations in which vital signs are displayed to mimic their appearance on current bedside monitors. The source code of MWA is freely available online together with a detailed tutorial and sample data sets.
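The bedside-monitor look that MWA mimics is a sweep display: new samples overwrite the oldest ones at a moving cursor (the "erase bar") rather than scrolling the whole trace. MWA itself is MATLAB; the sketch below is a language-neutral illustration of that refresh logic in Python, not MWA code.

```python
import numpy as np

def sweep_update(screen, new_samples, cursor):
    """Update a monitor-style sweep display buffer in place.

    New samples overwrite old ones at the cursor position, wrapping
    around at the end of the screen like the moving erase bar on a
    bedside monitor. Returns the new cursor position.
    """
    n = len(screen)
    for s in new_samples:
        screen[cursor] = s
        cursor = (cursor + 1) % n
    return cursor

# toy trace redrawn over a 5-sample "screen"
screen = np.zeros(5)
cursor = sweep_update(screen, [1.0, 2.0, 3.0], 0)       # fills indices 0-2
cursor = sweep_update(screen, [4.0, 5.0, 6.0], cursor)  # wraps back to index 0
```

Rendering each buffer state to a video frame (as MWA does with MATLAB's video-writing facilities) turns this update loop into an animation that reproduces the familiar monitor appearance from archived data.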