849 results for adaptive blind source separation method
Abstract:
Recently, many chaos-based communication systems have been proposed. They can present many of the interesting properties of spread-spectrum modulations and can also represent a low-cost increase in security. However, their major drawback is that their bit error rate (BER) performance is generally worse than that of their conventional counterparts. In this paper, we review some innovative techniques that can be used to make chaos-based communication systems attain lower BER levels in non-ideal environments. In particular, we succinctly describe techniques to counter the effects of finite bandwidth, additive noise, and delay in the communication channel. Although much research is still necessary before chaos-based communication can compete with conventional techniques, the presented results are auspicious. (C) 2011 Elsevier B.V. All rights reserved.
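Not a technique from the paper itself, but a minimal illustration of the class of systems reviewed: antipodal chaos shift keying with a logistic-map spreading sequence and a coherent correlation receiver. The map parameter, chip count, and noise level are all invented for the sketch.

```python
import numpy as np

def logistic_map(x0, n, r=3.99):
    """Chaotic spreading sequence from the logistic map, centered on zero."""
    seq = np.empty(n)
    x = x0
    for k in range(n):
        x = r * x * (1.0 - x)
        seq[k] = x
    return 2.0 * seq - 1.0

def csk_transmit(bits, chips_per_bit, x0=0.3):
    """Antipodal chaos shift keying: each bit scales a block of chaotic chips."""
    ref = logistic_map(x0, len(bits) * chips_per_bit)
    symbols = np.repeat(2 * np.asarray(bits) - 1, chips_per_bit)
    return symbols * ref, ref

def csk_receive(rx, ref, chips_per_bit):
    """Coherent receiver: sign of the per-bit correlation with the reference."""
    corr = (rx * ref).reshape(-1, chips_per_bit).sum(axis=1)
    return (corr > 0).astype(int)

bits = [1, 0, 1, 1, 0]
tx, ref = csk_transmit(bits, chips_per_bit=64)
rx = tx + 0.5 * np.random.default_rng(0).standard_normal(tx.size)  # AWGN channel
decoded = csk_receive(rx, ref, chips_per_bit=64)
```

With 64 chips per bit the correlator margin is large, so the bits decode correctly despite the noise; shrinking the block or raising the noise drives the BER up, which is the trade-off the reviewed techniques target.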
Abstract:
Background: Transformed cells of Escherichia coli DH5-α carrying pGFPuv, induced by IPTG (isopropyl-β-d-thiogalactopyranoside), express the green fluorescent protein (GFPuv) during the growth phases. E. coli subjected to selective permeation by freezing/thawing/sonication cycles followed by three-phase partitioning (TPP) extraction was compared with direct application of TPP to the same E. coli culture for releasing GFPuv from the over-expressing cells. Material and Methods: Cultures (37°C/100 rpm/24 h; μ = 0.99 h-1 - 1.10 h-1) of transformed (pGFP) Escherichia coli DH5-α expressing the green fluorescent protein (GFPuv; absorbance at 394 nm, emission at 509 nm) were sonicated in successive intervals (25 vibrations/pulse) to determine the maximum amount of GFPuv released from the cells. For selective permeation, the transformed, previously frozen (-75°C) cells were subjected to three freeze/thaw cycles (-20°C/0.83°C/min) interlaid with sonication (3 pulses/6 seconds/25 vibrations). The intracellular permeate with GFPuv in extraction buffer (TE) solution (25 mM Tris-HCl, pH 8.0, 1 mM β-mercaptoethanol (β-ME), 0.1 mM PMSF) was subjected to the three-phase partitioning (TPP) method with t-butanol and 1.6 M ammonium sulfate. Sonication efficiency was also verified by applying it to cells previously treated by the TPP method. The intracellular releases were pooled and eluted through a methyl HIC column with a buffer solution (10 mM Tris-HCl, 10 mM EDTA, pH 8.0). Results: The maximum amount released from the cells by sonication was 327.67 μg GFPuv/mL (20.73 μg GFPuv/mg total proteins, BSA) after 9 min of treatment. Through selective permeation by three repeated freezing/thawing/sonication cycles, a comparable content of 241.19 μg GFPuv/mL (29.74 μg GFPuv/mg BSA) was obtained.
The specific mass range of GFPuv released from the same cultures by the three-phase partitioning (TPP) method, relative to total proteins, was higher, between 107.28 μg/mg and 135.10 μg/mg. Conclusions: Selective permeation of GFPuv by freezing/thawing/sonication followed by TPP separation was equivalent in yield to direct TPP extraction of the cells, although the selective-permeation extracts showed better elution through the HIC column.
Abstract:
The surface electrocardiogram (ECG) is an established diagnostic tool for the detection of abnormalities in the electrical activity of the heart. The interest of the ECG, however, extends beyond diagnosis. In recent years, studies in cognitive psychophysiology have related heart rate variability (HRV) to memory performance and mental workload. The aim of this thesis was to analyze the variability of rhythms derived from the surface ECG at two different time scales: the discrete-event time scale typical of beat-related features (Objective I), and the “continuous” time scale of sources separated from the ECG (Objective II), in selected scenarios relevant to psychophysiological and clinical research, respectively. Objective I) Joint time-frequency and non-linear analysis of HRV was carried out, with the goal of assessing psychophysiological workload (PPW) in response to tasks engaging working memory. Results from fourteen healthy young subjects suggest the potential use of the proposed indices for discriminating PPW levels in response to varying memory-search task difficulty. Objective II) A novel source-cancellation method based on morphology clustering was proposed for the estimation of the atrial wavefront in atrial fibrillation (AF) from body surface potential maps. A strong direct correlation between the spectral concentration (SC) of the atrial wavefront and the temporal variability of the spectral distribution was shown in persistent AF patients, suggesting that with higher SC a shorter observation time is required to collect the spectral distribution from which the fibrillatory rate is estimated. This could save time and cost in clinical decision-making. The results held for reduced lead sets, suggesting that a simplified setup could also be considered, further reducing costs. In designing the methods of this thesis, an online signal-processing approach was maintained, with the goal of contributing to real-world applicability.
An algorithm for automatic assessment of ambulatory ECG quality and an automatic ECG delineation algorithm were also designed and validated.
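Objective I rests on standard frequency-domain HRV indices. A numpy-only sketch of the most common one, the LF/HF ratio, follows; the band limits are the conventional ones rather than values taken from the thesis, and the resampling rate and test tachogram are assumptions for the example.

```python
import numpy as np

def lf_hf_ratio(rr_seconds, fs=4.0):
    """LF/HF ratio of an RR-interval series.

    The tachogram is resampled onto a uniform grid (linear interpolation
    at fs Hz), mean-removed, and its periodogram is integrated over the
    conventional LF (0.04-0.15 Hz) and HF (0.15-0.40 Hz) bands.
    """
    rr = np.asarray(rr_seconds, dtype=float)
    beat_times = np.cumsum(rr)
    grid = np.arange(beat_times[0], beat_times[-1], 1.0 / fs)
    tach = np.interp(grid, beat_times, rr)
    tach = tach - tach.mean()
    freqs = np.fft.rfftfreq(tach.size, d=1.0 / fs)
    power = np.abs(np.fft.rfft(tach)) ** 2
    lf = power[(freqs >= 0.04) & (freqs < 0.15)].sum()
    hf = power[(freqs >= 0.15) & (freqs < 0.40)].sum()
    return lf / hf

# Synthetic tachogram: 0.9 s mean RR modulated at ~0.3 Hz (respiratory/HF band)
beats = np.arange(512)
rr = 0.9 + 0.05 * np.sin(2.0 * np.pi * 0.3 * 0.9 * beats)
ratio = lf_hf_ratio(rr)
```

Because the synthetic modulation sits in the HF band, the computed ratio comes out well below one; a slower (LF-band) oscillation would push it above one.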
Abstract:
In recent years, element mass spectrometry has been applied very successfully to a variety of questions in bioanalysis. Hyphenated techniques, coupling separation methods such as liquid chromatography (LC) or capillary electrophoresis (CE) to inductively coupled plasma mass spectrometry (ICP-MS) as a multi-element detector with outstanding quantification capabilities, play a decisive role in the study of biopolymers and their interactions with various metals. For example, several methods have been developed for the separation and detection of metalloproteins or DNA-metal adducts in different sample materials. However, gel electrophoresis, the traditional and most powerful separation method for biopolymers of all kinds, had not yet been coupled online to ICP-MS for such investigations. Several attempts based on laser ablation have been made in this direction, but these techniques must be regarded as very cumbersome and time-consuming. This work describes for the first time the technical realization of an online coupling of gel electrophoresis with ICP-MS. The system is based on a principle from preparative gel electrophoresis, in which the separated components are continuously eluted from the gel while the electrophoresis is running. The eluted components are carried with the elution buffer directly into the nebulizer system of the ICP-MS. The first investigations were carried out on fragments of double-stranded DNA (dsDNA). Commercially available standard solutions were analyzed by online GE-ICP-MS via detection of 31P on a high-resolution mass spectrometer with a mass resolution of 4000. The separation conditions (e.g. pH or ionic strength of the buffer solutions) were optimized for the separation of dsDNA fragments in agarose gels and applied to various dsDNA fragments. In a next step, the quantification capabilities for biopolymers were investigated. Very small amounts of dsDNA could be quantified with a precision of better than 3%. Several external calibration approaches were employed for this purpose, such as calibration against a phosphate standard or against a commercially available quantitative dsDNA standard. To demonstrate the potential of the developed method for the study of biopolymer-metal interactions, oligonucleotides were incubated with cisplatin under physiological conditions and the reaction products were analyzed by online GE-ICP-MS with 31P and 195Pt detection. Several cisplatin-oligonucleotide adducts were observed in this way; their identification required the use of MALDI-TOF-MS as a complementary form of mass spectrometry. Finally, isotope dilution analysis was applied for quantification.
Abstract:
The element arsenic has a number of isotopes that can be produced in nearly no-carrier-added (nca) form and can therefore be used in radiopharmacy for diagnosis or endoradiotherapy. In positron emission tomography (PET) there is a certain gap in the supply of long-lived positron emitters suitable for studying slow physiological processes, such as the biodistribution and accumulation of antibodies in tumor tissue. The two arsenic isotopes 72As (T1/2 = 26 h, 88% β+) and 74As (T1/2 = 17.8 d, 29% β+) combine a long physical half-life with a high positron emission rate and are therefore suitable candidates. Since the behavior of radioactive arsenic and its use in molecular imaging have received comparatively little attention internationally, the radiochemistry of arsenic was advanced here, from isotope production at a nuclear reactor and a cyclotron, through the development of separation methods for germanium and arsenic, to the development of a solid labeling chemistry for antibodies. The areas addressed in this work are: 1. Production of the relevant arsenic isotopes (72/74/77As) was carried out at a nuclear reactor and a cyclotron by irradiation of GeO2 and germanium metal targets. Per 6 h of irradiation of 100 mg of germanium, about 2 MBq of 77As could be produced at the TRIGA reactor in Mainz. At the cyclotron of the DKFZ in Heidelberg, irradiation of germanium metal under optimized conditions (EP = 15 MeV, 20 µA, 200 µAh) produced about 4 GBq of 72As and about 400 MBq of 74As. 2. New methods were developed for separating nca 72/74/77As from macroscopic amounts of germanium. For the processing of GeO2 and germanium metal targets, a total of 8 different methods were employed, including solid-phase extraction, liquid-liquid extraction, distillation, and anion exchange chromatography. The yields achieved were between 31 and 56%, and germanium separation factors between 1000 and 1·10^6 were reached. All successful separation methods delivered *As(III) in 500 µl of PBS buffer at pH 7. This form of radioarsenic is suitable for labeling SH-modified molecules such as antibodies. 3. Methods were developed for determining the oxidation state of nca *As in organic, neutral aqueous, or strongly acidic media by radio-TLC and anion exchange chromatography, leading to a better understanding of the redox chemistry of nca *As. 4. SH-modified antibodies were labeled with 72/74/77As(III), comparing two methods (modification with SATA and with TCEP). While *As(III) reacted with the antibody in yields > 90% when TCEP was used, a wide range from 0% to > 90% was observed for SATA-modified antibodies, depending on the separation method employed. 5. Phantom measurements with 18F, 72As and 74As were performed on a µ-PET scanner to obtain first estimates of the resolution to be expected for the arsenic isotopes. The resolution of 74As is comparable to that of 18F, while that of 72As is noticeably worse.
Abstract:
Data obtained under routine conditions with two CZE assays for determining carbohydrate-deficient transferrin (CDT) in human serum, the CAPILLARYS CDT and the high-resolution CEofix (HR-CEofix) CDT methods, are in agreement for patient sera that do not exhibit interferences, high trisialo-transferrin (Tf) levels, or genetic variants. HR-CEofix CDT levels are somewhat higher than those obtained with the CAPILLARYS method, and this bias corresponds to the difference between the upper reference values of the two assays. The lower resolution between disialo-Tf and trisialo-Tf observed in the CAPILLARYS system (mean: 1.24) compared with HR-CEofix (mean: 1.74) is believed to be the key to this difference. For critical sera with high trisialo-Tf levels, genetic variants, or certain interferences in the beta-region, the HR-CEofix approach is demonstrated to perform better than CAPILLARYS. However, the determination of CDT with the HR-CEofix method can also be hampered by interferences. Results with disialo-Tf values larger than 3% in the absence of asialo-Tf should be evaluated with immunosubtraction of Tf and possibly also confirmed with another CZE method or by HPLC. Furthermore, data gathered with the N Latex CDT direct immunonephelometric assay suggest that this assay can be used for screening purposes. To reduce the number of false-negative results, CDT data above 2.0% should be confirmed using a separation method.
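The confirmation rules quoted in the abstract can be written down as a small triage helper. The function and its return strings are hypothetical (only the thresholds, 3% disialo-Tf and 2.0% CDT, come from the text above); this is an illustration, not a clinical tool.

```python
def cdt_followup(method, cdt_pct=None, disialo_pct=None, asialo_present=False):
    """Suggest confirmatory testing per the rules quoted in the abstract.

    method: "nephelometric" (N Latex CDT screen) or "cze" (HR-CEofix).
    Returns a short recommendation string.
    """
    if method == "nephelometric":
        # Screening results above 2.0% should be confirmed by a separation method.
        if cdt_pct is not None and cdt_pct > 2.0:
            return "confirm with separation method"
        return "no confirmation needed"
    if method == "cze":
        # Disialo-Tf > 3% without asialo-Tf: evaluate with immunosubtraction,
        # possibly confirm with another CZE method or HPLC.
        if disialo_pct is not None and disialo_pct > 3.0 and not asialo_present:
            return "evaluate with immunosubtraction; confirm by CZE or HPLC"
        return "report"
    raise ValueError("unknown method")
```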
Abstract:
All optical systems that operate in or through the atmosphere suffer from turbulence-induced image blur. Military and civilian surveillance, gun-sighting, and target-identification systems all require terrestrial imaging over very long horizontal paths, but atmospheric turbulence can blur the resulting images beyond usefulness. My dissertation explores the performance of a multi-frame blind deconvolution technique applied under anisoplanatic conditions for both Gaussian and Poisson noise model assumptions. The technique is evaluated for use in reconstructing images of scenes corrupted by turbulence in long horizontal-path imaging scenarios and compared with other speckle imaging techniques. Performance is evaluated via the reconstruction of a common object from three sets of simulated turbulence-degraded imagery representing low, moderate, and severe turbulence conditions; each set consists of 1000 simulated, turbulence-degraded images. The mean-square-error (MSE) performance of the estimator is evaluated as a function of the number of images and of the number of Zernike polynomial terms used to characterize the point spread function. I compare the MSE performance of speckle imaging methods and a maximum-likelihood, multi-frame blind deconvolution (MFBD) method applied to long-path horizontal imaging scenarios. Both methods are used to reconstruct a scene from simulated imagery featuring anisoplanatic turbulence-induced aberrations. This comparison is performed over three sets of 1000 simulated images each, for low, moderate, and severe turbulence-induced image degradation. The comparison shows that speckle imaging techniques reduce the MSE by 46 percent, 42 percent, and 47 percent on average for the low, moderate, and severe cases, respectively, using 15 input frames under daytime conditions and moderate frame rates. Similarly, the MFBD method provides 40 percent, 29 percent, and 36 percent improvements in MSE on average under the same conditions.
The comparison is repeated under low-light conditions (less than 100 photons per pixel), where improvements of 39 percent, 29 percent, and 27 percent are available using speckle imaging methods and 25 input frames, and of 38 percent, 34 percent, and 33 percent, respectively, for the MFBD method and 150 input frames. The MFBD estimator is applied to three sets of field data and the results are presented. Finally, a combined bispectrum-MFBD hybrid estimator is proposed and investigated. This technique consistently provides a lower MSE and a smaller variance in the estimate under all three simulated turbulence conditions.
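The percentages above are MSE reductions relative to the degraded input frames. A small helper makes that figure of merit explicit; the toy arrays are invented for the example and have nothing to do with the dissertation's data.

```python
import numpy as np

def mse(ref, est):
    """Mean squared error between a reference scene and an estimate."""
    ref = np.asarray(ref, dtype=float)
    est = np.asarray(est, dtype=float)
    return float(np.mean((ref - est) ** 2))

def mse_improvement_pct(ref, degraded, restored):
    """Percent MSE reduction of a restoration relative to the degraded frames."""
    return 100.0 * (1.0 - mse(ref, restored) / mse(ref, degraded))

# Toy 4x4 'scene': a uniform error of 1.0 halved in amplitude by restoration
scene = np.zeros((4, 4))
degraded = scene + 1.0          # MSE = 1.0
restored = scene + 0.5          # MSE = 0.25
gain = mse_improvement_pct(scene, degraded, restored)  # 75.0 percent
```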
Abstract:
In recent years, the implementation of 68Ga-radiometalated peptides for PET imaging of cancer has attracted the attention of clinicians. Herein, we propose the use of 44Sc (half-life = 3.97 h, average β+ energy [Eβ+av] = 632 keV) as a valuable alternative to 68Ga (half-life = 68 min, Eβ+av = 830 keV) for imaging and dosimetry before 177Lu-based radionuclide therapy. The aim of the study was the preclinical evaluation of a folate conjugate labeled with cyclotron-produced 44Sc and its in vitro and in vivo comparison with the 177Lu-labeled counterpart. Methods: 44Sc was produced via the 44Ca(p,n)44Sc nuclear reaction at a cyclotron (17.6 ± 1.8 MeV, 50 μA, 30 min) using an enriched 44Ca target (10 mg 44CaCO3, 97.00%). Separation from the target material was performed by a semiautomated process using extraction chromatography and cation exchange chromatography. Radiolabeling of a DOTA-folate conjugate (cm09) was performed at 95°C within 10 min. The stability of 44Sc-cm09 was tested in human plasma. 44Sc-cm09 was investigated in vitro using folate receptor-positive KB tumor cells and in vivo by PET/CT imaging of tumor-bearing mice. Results: Under the given irradiation conditions, 44Sc was obtained in a maximum yield of 350 MBq at high radionuclide purity (>99%). Semiautomated isolation of 44Sc from 44Ca targets allowed formulation of up to 300 MBq of 44Sc in a volume of 200–400 μL of ammonium acetate/HCl solution (1 M, pH 3.5–4.0) within 10 min. Radiolabeling of cm09 was achieved with a radiochemical yield of greater than 96% at a specific activity of 5.2 MBq/nmol. In vitro, 44Sc-cm09 was stable in human plasma over the whole period of investigation and showed folate receptor-specific binding to KB tumor cells. PET/CT images of mice injected with 44Sc-cm09 allowed excellent visualization of tumor xenografts. Comparison of cm09 labeled with 44Sc and 177Lu revealed almost identical pharmacokinetics.
Conclusion: This study presents high-yield production and an efficient separation method for 44Sc at a quality suitable for radiolabeling of DOTA-functionalized biomolecules. An in vivo proof-of-concept study using a DOTA-folate conjugate demonstrated the excellent features of 44Sc for PET imaging. Thus, 44Sc is a valid alternative to 68Ga for imaging and dosimetry before 177Lu radionuclide tumor therapy.
Abstract:
On the orbiter of the Rosetta spacecraft, the Cometary Secondary Ion Mass Analyser (COSIMA) will provide new in situ insights into the chemical composition of cometary grains throughout 67P/Churyumov–Gerasimenko's (67P/CG) journey, nominally until the end of December 2015. The aim of this paper is to present the pre-calibration that has already been performed, as well as the different methods developed to facilitate the interpretation of the COSIMA mass spectra, especially of their organic content. The first step was to establish a mass spectral library, in positive and negative ion modes, of targeted molecules and to determine the specific features of each compound and chemical family analyzed. As the exact nature of the refractory cometary organic matter is currently unknown, this library is obviously not exhaustive. Therefore, this library has also been the starting point for the search for indicators that highlight the presence of compounds containing a specific atom or structure. These indicators correspond to intensity ratios of specific peaks in the mass spectrum. They have allowed us to identify samples containing nitrogen atoms, aliphatic chains, or polyaromatic hydrocarbons. From these indicators, a preliminary calibration line, from which the N/C ratio can be derived, has also been established. The search for specific mass differences can also help to identify peaks related to quasi-molecular ions in an unknown mass spectrum. The Bayesian Positive Source Separation (BPSS) technique will also be very helpful for data analysis. This work is the starting point for the analysis of the cometary refractory organic matter. Nevertheless, calibration work will continue in order to reach the best possible interpretation of the COSIMA observations.
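The indicators described above are intensity ratios between specific peaks of a spectrum. A generic sketch of such an indicator follows; the spectrum, target m/z values, and tolerance are all invented for illustration and are not the COSIMA indicators themselves.

```python
def peak_ratio(spectrum, numerator_mz, denominator_mz, tol=0.005):
    """Intensity-ratio indicator between two peaks of a mass spectrum.

    spectrum: iterable of (m/z, intensity) pairs; intensities within `tol`
    of each target m/z are summed before taking the ratio.
    """
    def peak(target):
        return sum(i for mz, i in spectrum if abs(mz - target) <= tol)
    denom = peak(denominator_mz)
    return peak(numerator_mz) / denom if denom else float("inf")

# Toy positive-ion spectrum (m/z, counts); target peaks are arbitrary examples
spectrum = [(27.995, 120.0), (28.020, 300.0), (28.031, 150.0), (29.040, 80.0)]
indicator = peak_ratio(spectrum, 28.020, 28.031)
```

A tight m/z tolerance keeps the two nearly isobaric peaks separate, which is exactly what high mass resolution buys when computing such ratios.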
Abstract:
As a thermal separation method, distillation is one of the most important technologies in the chemical industry. Given its importance, it is no surprise that increasing efforts have been made to reduce its energy inefficiencies. A great deal of research focuses on the design and optimization of the divided-wall column (DWC). Its applications are still limited owing to distrust of its controllability. Previous works have studied the decentralized control of the DWC, but few papers deal with model predictive control (MPC). In this work we present both a decentralized control scheme for a DWC and its equivalent MPC scheme.
Abstract:
The aim of the novel experimental measures presented in this paper is to show the improvement achieved in computation time for a 2D self-adaptive hp finite element method (FEM) software accelerated by the Adaptive Cross Approximation (ACA) method. This algebraic method (ACA) was presented in a previous paper in the hp context for the analysis of open-region problems, where its robust behaviour, good accuracy, and high compression levels were demonstrated. The truncation of the infinite domain is handled through an iterative computation of the Integral Equation (IE) over a fictitious boundary which, despite its accuracy and efficiency, turns out to be the bottleneck of the code. It will be shown that in this context ACA drastically reduces the computational effort of the problem.
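To make the ACA idea concrete, here is a minimal full-pivoting cross approximation of a dense matrix. The paper's production algorithm uses partial pivoting and evaluates matrix entries on demand (which is where the compression pays off); keeping the residual explicitly, as below, is only for clarity.

```python
import numpy as np

def aca_full_pivot(A, tol=1e-10):
    """Cross approximation of A as U @ V via successive rank-1 pivots.

    Each step picks the largest residual entry, subtracts the corresponding
    scaled row/column cross, and stops once the residual is negligible.
    """
    R = np.array(A, dtype=float)
    U, V = [], []
    for _ in range(min(R.shape)):
        i, j = np.unravel_index(np.argmax(np.abs(R)), R.shape)
        pivot = R[i, j]
        if abs(pivot) < tol:
            break                     # residual negligible: converged
        U.append(R[:, j] / pivot)     # scaled pivot column
        V.append(R[i, :].copy())      # pivot row
        R = R - np.outer(U[-1], V[-1])
    return np.array(U).T, np.array(V)

# Exactly rank-2 test matrix: ACA should stop after two crosses
a = np.linspace(0.0, 1.0, 6)
A = np.outer(a, a) + np.outer(np.cos(a), np.sin(a))
U, V = aca_full_pivot(A)
```

Each cross removes exactly one unit of rank, so for this rank-2 matrix the factors reproduce A to machine precision with only two columns in U.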
Abstract:
The glottal source correlates reconstructed from the phonated parts of the voice may render interesting information with applicability in different fields. One of these is the detection of defective closure (gap). In this paper, the physical foundations of the defective glottal gap are reviewed. A method to estimate the defective gap is then presented, based on a wavelet description of the glottal source. The method is validated using results from the analysis of a gender-balanced speaker database. Normative values for the different parameters estimated are given. A set of study cases with deficient glottal closure is presented and discussed.
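As a self-contained illustration of what a "wavelet description" of a signal can mean, the sketch below computes per-scale Haar detail energies of a one-dimensional signal. The paper's actual wavelet description of the glottal source is not specified here; the wavelet choice, number of levels, and test signal are assumptions for the example.

```python
import numpy as np

def haar_energy_description(x, levels=4):
    """Per-scale detail energies of a 1-D signal via the Haar wavelet.

    Returns the detail energy at each level plus the remaining coarse
    approximation; the orthonormal Haar transform conserves total energy.
    """
    x = np.asarray(x, dtype=float)
    energies = []
    for _ in range(levels):
        n = (len(x) // 2) * 2                      # drop an odd last sample
        approx = (x[:n:2] + x[1:n:2]) / np.sqrt(2.0)
        detail = (x[:n:2] - x[1:n:2]) / np.sqrt(2.0)
        energies.append(float(np.sum(detail ** 2)))
        x = approx
    return energies, x

rng = np.random.default_rng(1)
sig = rng.standard_normal(16)
energies, coarse = haar_energy_description(sig, levels=4)
```

The vector of per-scale energies is the kind of compact descriptor on which normative values, as reported in the paper, can be established.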
Abstract:
Aircraft tracking plays a key role in the sense-and-avoid system of Unmanned Aerial Vehicles (UAVs). This paper presents a novel robust visual tracking algorithm for UAVs in midair to track an arbitrary aircraft at real-time frame rates, together with a unique evaluation system. The visual algorithm mainly consists of an adaptive discriminative visual tracking method, a Multiple-Instance (MI) learning approach, a Multiple-Classifier (MC) voting mechanism, and a Multiple-Resolution (MR) representation strategy; it is called the Adaptive M3 tracker, i.e. AM3. In this tracker, the importance of each test sample is integrated to improve the tracking stability, accuracy, and real-time performance. The experimental results show that this algorithm is more robust, efficient, and accurate than existing state-of-the-art trackers, overcoming the problems generated by challenging situations such as obvious appearance change, varying surrounding illumination, partial aircraft occlusion, motion blur, rapid pose variation, onboard mechanical vibration, low computation capacity, and delayed information communication between UAVs and the Ground Station (GS). To the best of our knowledge, this is the first work to present a tracker for online learning and tracking of an arbitrary aircraft/intruder from UAVs.
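The Multiple-Classifier voting idea can be reduced to a weighted majority vote over per-classifier labels. This is only a minimal stand-in for the paper's mechanism, whose actual weighting and classifier pool are richer; the labels and weights below are invented.

```python
from collections import Counter

def mc_vote(labels, weights=None):
    """Weighted majority vote across multiple classifiers' labels."""
    weights = weights if weights is not None else [1.0] * len(labels)
    tally = Counter()
    for label, w in zip(labels, weights):
        tally[label] += w
    return max(tally, key=tally.get)

# Three per-resolution classifiers disagree; the vote settles it
decision = mc_vote(["aircraft", "background", "aircraft"])
```

Weights let more reliable classifiers (e.g. those trained on higher-resolution representations) dominate: `mc_vote(["a", "b"], weights=[0.3, 0.9])` returns `"b"`.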
Abstract:
Computed tomography (CT) is the reference imaging modality for the study of lung diseases and the pulmonary vasculature. Lung vessel segmentation has been widely explored by the biomedical image processing community; however, the differentiation of arterial from venous irrigations is still an open problem. Indeed, the automatic separation of arterial and venous trees has been considered in recent years one of the main future challenges in the field. Artery-vein (AV) segmentation would be useful in different medical scenarios and in multiple pulmonary diseases or pathological states, allowing the study of the arterial and venous irrigations separately. Features such as the density, geometry, topology, and size of the blood vessels could be analyzed in diseases that involve remodeling of the pulmonary vasculature, even making possible the discovery of new specific biomarkers that remain hidden today. Differentiation between arteries and veins could also enhance or improve methods for processing other pulmonary structures. Nevertheless, despite its evident usefulness, the study of the effect of disease on the arterial and venous trees has so far been unfeasible. The enormous complexity of the pulmonary vascular trees makes a manual separation of both structures intractable in realistic time, further motivating the design of automatic or semiautomatic tools for the task. However, the lack of properly segmented and labeled cases severely limits the development of AV separation systems, for which reference images are needed both to train and to validate the algorithms.
For that reason, the design of synthetic lung CT images could overcome these difficulties by providing a database of pseudorealistic cases in a constrained and controlled scenario in which each part of the image (including arteries and veins) is unequivocally differentiated. In this Ph.D. thesis we address both of these closely interrelated problems. First, the design of a complete framework to automatically generate computational CT phantoms of the human lung is described. Starting from biological and image-based prior knowledge about the topology of, and the relationships between, the pulmonary structures, the system is able to generate synthetic pulmonary airways, arteries, and veins using iterative growth methods, which are then merged into a simulated lung with realistic features. These synthetic cases, together with real non-contrast CT images, have been used to develop a fully automatic pulmonary AV segmentation/separation method. The approach comprises a generic vessel extraction stage using scale-space particles, followed by an AV classification of those particles using Graph-Cuts (GC) based on arterial/venous similarity scores, obtained with a machine-learning pre-classification step, together with inter-particle connectivity information. Validation of the pulmonary phantoms, by visual inspection and by quantitative measurements of the intensity distributions, the dispersion of structures, and the relationship between arteries and airways, shows good correspondence between real and synthetic lungs. The evaluation of the AV segmentation algorithm, based on different strategies to assess the accuracy of vessel particle classification, reveals accurate differentiation between arteries and veins in both real and synthetic cases, opening a wide range of possibilities in the clinical study of cardiopulmonary diseases and in the development of methodologies and new algorithms for the analysis of pulmonary images.
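The graph-cut stage combines per-particle machine-learning scores with connectivity terms. As a self-contained illustration, the same kind of energy can be relaxed with iterated conditional modes (ICM) instead of an exact cut; the scores, edges, and β weight below are invented for the example, and the thesis itself minimizes the energy with graph cuts, not ICM.

```python
import numpy as np

def icm_av_labels(artery_prob, edges, beta=1.0, iters=20):
    """Binary artery/vein labeling of vessel particles (1 = artery, 0 = vein).

    Energy: sum_i -log p(label_i) + beta * #{(i, j) in edges : label_i != label_j},
    relaxed greedily by iterated conditional modes.
    """
    p = np.clip(np.asarray(artery_prob, dtype=float), 1e-6, 1.0 - 1e-6)
    labels = (p > 0.5).astype(int)
    nbrs = {i: [] for i in range(len(p))}
    for i, j in edges:
        nbrs[i].append(j)
        nbrs[j].append(i)
    for _ in range(iters):
        changed = False
        for i in range(len(p)):
            cost = {}
            for lab in (0, 1):
                unary = -np.log(p[i] if lab == 1 else 1.0 - p[i])
                pairwise = beta * sum(labels[j] != lab for j in nbrs[i])
                cost[lab] = unary + pairwise
            best = min(cost, key=cost.get)
            if best != labels[i]:
                labels[i] = best
                changed = True
        if not changed:
            break
    return labels

# A chain of four particles; the third has a weak, wrong ML score that
# the connectivity term corrects
labels = icm_av_labels([0.9, 0.9, 0.45, 0.9], [(0, 1), (1, 2), (2, 3)])
```

The ambiguous particle (score 0.45) is flipped to "artery" because both of its neighbors are confidently arterial, which is precisely the role the connectivity information plays in the classification.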