891 results for Computer Imaging, Vision, Pattern Recognition and Graphics


Relevance:

100.00%

Publisher:

Abstract:

The purpose of this study was to investigate the impact of in-plane coronary artery motion on coronary magnetic resonance angiography (MRA) and coronary MR vessel wall imaging. Free-breathing, navigator-gated, 3D segmented k-space turbo field echo/echo-planar imaging (TFE/EPI) coronary MRA and 2D fast spin-echo coronary vessel wall imaging of the right coronary artery (RCA) were performed in 15 healthy adult subjects. Images were acquired at two different diastolic time periods in each subject: 1) during a subject-specific diastasis period (in-plane velocity <4 cm/second) identified from analysis of in-plane coronary artery motion, and 2) using a diastolic trigger delay based on a previously implemented heart-rate-dependent empirical formula. RCA vessel wall imaging was only feasible with subject-specific mid-diastolic acquisition, while the coronary wall could not be identified with the heart-rate-dependent formula. For coronary MRA, RCA border definition was improved by 13% (P < 0.001) with the use of the subject-specific trigger delay (vs. the heart-rate-dependent delay). Subject-specific mid-diastolic image acquisition improves 3D TFE/EPI coronary MRA and is critical for RCA vessel wall imaging.
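
As a hedged illustration of the subject-specific timing step, the Python sketch below picks the longest cine interval during which the measured in-plane RCA velocity stays below the 4 cm/second diastasis criterion quoted above; the velocity trace is synthetic and the helper name is hypothetical, since the authors' motion-analysis pipeline is not described in this abstract.

```python
import numpy as np

def diastasis_window(times_ms, v_cm_s, vmax=4.0):
    """Longest contiguous interval with in-plane velocity below vmax (cm/s)."""
    runs, start = [], None
    for i, quiet in enumerate(v_cm_s < vmax):
        if quiet and start is None:
            start = i                      # a quiet run begins
        elif not quiet and start is not None:
            runs.append((start, i - 1))    # a quiet run ends
            start = None
    if start is not None:
        runs.append((start, len(v_cm_s) - 1))
    s, e = max(runs, key=lambda r: times_ms[r[1]] - times_ms[r[0]])
    return times_ms[s], times_ms[e]        # candidate trigger delay and window end

# Synthetic per-frame RCA velocity over one cardiac cycle (25-ms cine frames)
t = np.arange(0, 800, 25)
v = 6 + 5 * np.sin(t / 120.0)
print(diastasis_window(t, v))
```

If no frame falls below the threshold, max() would fail on an empty list; a real implementation would fall back to an empirical heart-rate-dependent delay, as the comparison arm of this study did.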

Relevance:

100.00%

Publisher:

Abstract:

Cutaneous leishmaniases have persisted for centuries as chronically disfiguring parasitic infections affecting millions of people across the subtropics. Symptoms range from the more prevalent single, self-healing cutaneous lesion to persistent, metastatic disease, in which ulcerations and granulomatous nodules can affect multiple secondary sites of the skin and delicate facial mucosa, sometimes even diffusing throughout the cutaneous system as a papular rash. The basis for such diverse pathologies is multifactorial, ranging from parasite phylogeny to host immunocompetence and various environmental factors. Although complex, these pathologies often prey on weaknesses in the innate immune system and its pattern recognition receptors. This review explores the observed and potential associations among the multifactorial perpetrators of infectious metastasis and components of the innate immune system.

Relevance:

100.00%

Publisher:

Abstract:

Purpose: To investigate the effect of the first and repeated intravitreal injections of ranibizumab (1.25 mg; 0.05 ml) on retrobulbar blood flow velocities in patients with wet age-related macular degeneration (AMD). Methods: This prospective, non-randomized study included twenty consecutive AMD patients. Time-averaged mean blood flow velocities (BFVs) in the central retinal, temporal posterior ciliary, and ophthalmic arteries (CRA, TPCA, and OA) were measured by ultrasound imaging before, 2 days after, and 3 weeks after the first injection of ranibizumab, and again at 6 months, after supplemental monthly injections when required. At each visit, a complete ophthalmological examination was performed, including best-corrected visual acuity measurement according to the ETDRS protocol and OCT. Results: In the treated eyes, ranibizumab injection was followed by a significant improvement in visual acuity (from 44.4±21.7 to 50.9±25.9 at month 6; p<0.01) and a decrease in mean central macular thickness (from 377±115 to 267±74 µm at month 6; p<0.001). At day 2, mean BFVs had decreased by 16% in the CRA and by 20% in the TPCA (p<0.001 for both), then remained stable. Mean BFVs in the OA did not change at day 2 but had decreased by 18% at week 3 (p<0.001). Supplemental injections did not lead to additional effects at month 6. No effect was observed in the fellow eye. Conclusions: We report an early decrease in mean BFV in the CRA and TPCA following intravitreal injections of ranibizumab, consistent with a vasoconstrictive effect of this drug. The decrease in mean BFV in all retrobulbar arteries from week 3 onward suggests that ranibizumab exerts local and regional vasoconstrictive and antiangiogenic effects after local diffusion. Thus, ranibizumab could induce actual hypoperfusion of the treated eye, which could constitute a vascular side effect.

Relevance:

100.00%

Publisher:

Abstract:

Machine Learning for geospatial data: algorithms, software tools and case studies

The thesis is devoted to the analysis, modeling and visualisation of spatial environmental data using machine learning algorithms. In a broad sense, machine learning can be considered a subfield of artificial intelligence concerned with the development of techniques and algorithms that allow computers to learn from data. In this thesis, machine learning algorithms are adapted to learn from spatial environmental data and to make spatial predictions. Why machine learning? In short, most machine learning algorithms are universal, adaptive, nonlinear, robust and efficient modeling tools. They can find solutions for classification, regression and probability density modeling problems in high-dimensional geo-feature spaces, composed of geographical coordinates and additional relevant spatially referenced variables ("geo-features"). They are well suited to implementation as predictive engines in decision support systems, for purposes of environmental data mining ranging from pattern recognition to modeling and prediction to automatic data mapping. Their efficiency is competitive with geostatistical models in low-dimensional geographical spaces, but they are indispensable in high-dimensional geo-feature spaces.

The most important and popular machine learning algorithms and models of interest to geo- and environmental sciences are presented in detail, from the theoretical description of the concepts to the software implementation. The main algorithms and models considered are the multilayer perceptron (MLP, a workhorse of machine learning), general regression neural networks (GRNN), probabilistic neural networks (PNN), self-organising (Kohonen) maps (SOM), Gaussian mixture models (GMM), radial basis function networks (RBF) and mixture density networks (MDN). This set of models covers machine learning tasks such as classification, regression and density estimation.

Exploratory data analysis (EDA) is the initial and a very important part of any data analysis. In this thesis, the concepts of exploratory spatial data analysis (ESDA) are considered using both the traditional geostatistical approach, experimental variography, and machine learning. Experimental variography, which studies the relationships between pairs of points, is a basic tool for the geostatistical analysis of anisotropic spatial correlations; it helps to detect the presence of spatial patterns, at least those describable by two-point statistics. A machine learning approach to ESDA is presented through the k-nearest neighbors (k-NN) method, which is very simple and has excellent interpretation and visualization properties.

An important part of the thesis deals with the topical problem of automatic mapping of geospatial data. The general regression neural network (GRNN) is proposed as an efficient model for this task. The performance of the GRNN is demonstrated on Spatial Interpolation Comparison (SIC) 2004 data, where it significantly outperformed all other approaches, especially under emergency conditions.

The thesis consists of four chapters: theory, applications, software tools, and how-to-do-it examples. An important part of the work is a collection of software tools, Machine Learning Office. These tools were developed during the last 15 years and have been used both in many teaching courses, including international workshops in China, France, Italy, Ireland and Switzerland, and in fundamental and applied research projects. The case studies considered cover a wide spectrum of real-life low- and high-dimensional geo- and environmental problems, such as air, soil and water pollution by radionuclides and heavy metals; classification of soil types and hydrogeological units; decision-oriented mapping with uncertainties; and natural hazard (landslides, avalanches) assessment and susceptibility mapping. Complementary tools for exploratory data analysis and visualisation were developed as well, with a user-friendly and easy-to-use interface.
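
Since the GRNN is the model the thesis proposes for automatic mapping, a minimal sketch may help: a GRNN is essentially Nadaraya-Watson kernel regression, predicting each location as a distance-weighted average of the training observations. This is a generic textbook formulation, not the Machine Learning Office implementation, and the bandwidth value is an illustrative assumption (in practice it would be tuned, e.g. by cross-validation).

```python
import numpy as np

def grnn_predict(X_train, y_train, X_query, sigma=0.1):
    """GRNN / Nadaraya-Watson prediction with a Gaussian kernel."""
    # Squared Euclidean distances between every query and training point
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / (2.0 * sigma ** 2))    # kernel weights
    return (w @ y_train) / w.sum(axis=1)    # weighted average of observed values

# Toy usage: interpolate a smooth field from 300 scattered 2-D measurements
rng = np.random.default_rng(0)
X = rng.random((300, 2))                     # measurement coordinates
y = np.sin(5 * X[:, 0]) + np.cos(3 * X[:, 1])
print(grnn_predict(X, y, rng.random((5, 2))))
```

The single smoothing parameter sigma is part of what makes the GRNN attractive for automatic mapping: tuning it requires no manual variogram modeling.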

Relevance:

100.00%

Publisher:

Abstract:

We tested 12 peptide derivative variants for antigen recognition and T cell receptor (TCR)-ligand binding on seven H-2Kd-restricted cytotoxic T lymphocyte (CTL) clones specific for a bifunctional photoreactive derivative of the Plasmodium berghei circumsporozoite (PbCS) peptide 252-260 (SYIPSAEKI). The derivative contained iodo-4-azidosalicylic acid in place of PbCS S-252 and 4-azidobenzoic acid on PbCS K-259. Selective photoactivation of the N-terminal photoreactive group allowed crosslinking to Kd molecules, and photoactivation of the orthogonal group allowed crosslinking to the TCR. TCR photoaffinity labeling with covalent Kd-peptide derivative complexes allowed direct assessment of TCR-ligand binding on living CTL. In most cases (over 80%), cytotoxicity (chromium release) and TCR-ligand binding differed by less than fivefold. The exceptions included (a) partial TCR agonists (8 cases), for which antigen recognition was five- to tenfold less efficient than TCR-ligand binding; (b) TCR antagonists (2 cases), which were not recognized and were capable of inhibiting recognition of the wild-type conjugate; (c) heteroclitic agonists (2 cases), for which antigen recognition was more efficient than TCR-ligand binding; and (d) one partial TCR agonist, which activated only Fas (CD95)-mediated, but not perforin/granzyme-mediated, cytotoxicity. There was no correlation between these divergences and the avidity of TCR-ligand binding, indicating that factors other than binding avidity determine the nature of the CTL response. An unexpected and novel finding was that CD8-dependent clones clearly incline more to TCR antagonism than CD8-independent ones. As there was no correlation between CD8 dependence and the avidity of TCR-ligand binding, we suggest the possibility that CD8 plays a critical role in aberrant CTL function.

Relevance:

100.00%

Publisher:

Abstract:

CREB is a cAMP-responsive nuclear DNA-binding protein that binds to cAMP response elements and stimulates gene transcription upon activation of the cAMP signalling pathway. The protein consists of an amino-terminal transcriptional transactivation domain and a carboxyl-terminal DNA-binding domain (bZIP domain) comprising a basic region and a leucine zipper involved in DNA recognition and dimerization, respectively. Recently, we discovered a testis-specific transcript of CREB that contains an alternatively spliced exon encoding multiple stop codons. CREB encoded by this transcript is a truncated protein lacking the bZIP domain. We postulated that the antigen detected by CREB antiserum in the cytoplasm of germinal cells is the truncated CREB, which must therefore also lack its nuclear translocation signal (NTS). To test this hypothesis, we prepared multiple expression plasmids encoding carboxyl-terminal deletions of CREB and transiently expressed them in COS-1 cells. By Western immunoblot analysis as well as immunocytochemistry of transfected cells, we show that CREB proteins truncated to amino acid 286 or shorter are sequestered in the cytoplasm, whereas a CREB of 295 amino acids is translocated into the nucleus. Chimeric CREBs containing a heterologous NTS fused to the first 248 or 261 amino acids of CREB are able to drive translocation of the protein into the nucleus. Thus, the nine amino acids in the basic region involved in DNA recognition between positions 287 and 295 (RRKKKEYVK) of CREB contain the NTS. Further, mutation of the lysine at position 290 of CREB to an asparagine diminishes nuclear translocation of the protein.

Relevance:

100.00%

Publisher:

Abstract:

Toll-like receptors (TLRs) are pattern recognition receptors that play a fundamental role in sensing microbial invasion and initiating innate and adaptive immune responses. TLRs are also triggered by danger signals released by injured or stressed cells during sepsis. Here we focus on studies developing TLR agonists and antagonists for the treatment of infectious diseases and sepsis. Positioned at the cell surface, TLR4 is essential for sensing the lipopolysaccharide of Gram-negative bacteria, TLR2 is involved in the recognition of a large panel of microbial ligands, and TLR5 recognizes flagellin. The endosomal TLR3, TLR7, TLR8, and TLR9 are specialized in the sensing of nucleic acids, produced notably during viral infections. TLR4 and TLR2 are favorite targets for developing anti-sepsis drugs, and antagonistic compounds have shown efficient protection from septic shock in pre-clinical models. We present results from clinical trials evaluating anti-TLR4 and anti-TLR2 approaches, and discuss the challenges of study design in sepsis and the future exploitation of these agents in infectious diseases. We also report results from studies suggesting that the TLR5 agonist flagellin may protect from infections of the gastrointestinal tract and that agonists of endosomal TLRs are very promising for treating chronic viral infections. Altogether, TLR-targeted therapies have strong potential for prevention and intervention in infectious diseases, notably sepsis.

Relevance:

100.00%

Publisher:

Abstract:

Multiple sclerosis (MS), a variable and diffuse disease affecting white and gray matter, is known to cause functional connectivity anomalies in patients. However, related studies published to date are post hoc; our hypothesis was that such alterations could discriminate between patients and healthy controls in a predictive setting, laying the groundwork for imaging-based prognosis. Using resting-state functional magnetic resonance imaging data from 22 minimally disabled MS patients and 14 controls, we developed a predictive model of connectivity alterations in MS: a whole-brain connectivity matrix was built for each subject from the slow oscillations (<0.11 Hz) of region-averaged time series, and a pattern recognition technique was used to learn a discriminant function indicating which particular functional connections are most affected by disease. Classification performance under strict cross-validation yielded a sensitivity of 82% (above chance at p<0.005) and a specificity of 86% (p<0.01) for distinguishing between MS patients and controls. The most discriminative connectivity changes were found in subcortical and temporal regions, and contralateral connections were more discriminative than ipsilateral connections. The pattern of decreased discriminative connections can be summarized post hoc in an index that correlates positively (ρ=0.61) with white matter lesion load, possibly indicating functional reorganisation to cope with increasing lesion load. These results are consistent with a subtle but widespread impact of lesions in white matter and in gray matter structures serving as high-level integrative hubs. These findings suggest that predictive models of resting-state fMRI can reveal specific anomalies due to MS with high sensitivity and specificity, potentially leading to new non-invasive markers.
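
A minimal sketch of this type of connectivity-based classification, under stated assumptions: region-averaged, low-pass-filtered time series are taken as given, the data below are synthetic stand-ins, and the linear classifier and cross-validation setup are generic choices rather than the authors' exact pattern recognition technique.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import LinearSVC

def connectivity_features(ts):
    """ts: (timepoints, regions) array of slow (<0.11 Hz) region-averaged signals.
    Returns the upper triangle of the region-by-region correlation matrix."""
    c = np.corrcoef(ts.T)
    return c[np.triu_indices_from(c, k=1)]

# Synthetic cohort: 22 'patients' and 14 'controls', 90 regions, 200 timepoints
rng = np.random.default_rng(0)
X = np.array([connectivity_features(rng.standard_normal((200, 90)))
              for _ in range(36)])
y = np.array([1] * 22 + [0] * 14)

# A linear discriminant: its weight vector indicates which functional
# connections contribute most to separating patients from controls
clf = LinearSVC(C=1.0, dual=False)
print(cross_val_score(clf, X, y, cv=5))
```

With real data, the per-connection weights of the fitted model play the role of the discriminant function described above.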

Relevance:

100.00%

Publisher:

Abstract:

Dissolved organic matter (DOM) is a complex mixture of organic compounds, ubiquitous in marine and freshwater systems. Fluorescence spectroscopy, by means of Excitation-Emission Matrices (EEMs), has become an indispensable tool for studying DOM sources, transport, and fate in aquatic ecosystems. However, the statistical treatment of large and heterogeneous EEM data sets still represents an important challenge for biogeochemists. Recently, Self-Organising Maps (SOMs) have been proposed as a tool to explore patterns in large EEM data sets. The SOM is a pattern recognition method which clusters and reduces the dimensionality of input EEMs without relying on any assumption about the data structure. In this paper, we show how a SOM, coupled with a correlation analysis of the component planes, can be used both to explore patterns among samples and to identify individual fluorescence components. We analysed a large and heterogeneous EEM data set, including samples from a river catchment collected under a range of hydrological conditions, along a 60-km downstream gradient, and under the influence of different degrees of anthropogenic impact. According to our results, chemical industry effluents appeared to have unique and distinctive spectral characteristics. On the other hand, river samples collected under flash flood conditions showed homogeneous EEM shapes. The correlation analysis of the component planes suggested the presence of four fluorescence components, consistent with DOM components previously described in the literature. A remarkable strength of this methodology is that outlier samples appear naturally integrated in the analysis. We conclude that a SOM coupled with a correlation analysis procedure is a promising tool for studying large and heterogeneous EEM data sets.
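
A minimal sketch of the SOM-plus-component-planes idea, assuming each EEM has been flattened to a vector; it uses the MiniSom package and synthetic data as stand-ins for real excitation-emission matrices, and the map size and training length are illustrative choices, not the paper's settings.

```python
import numpy as np
from minisom import MiniSom

# 200 synthetic 'EEMs', each a 20x30 matrix flattened to 600 values
rng = np.random.default_rng(1)
eems = rng.random((200, 600))

som = MiniSom(8, 8, input_len=600, sigma=1.5, learning_rate=0.5, random_seed=1)
som.train_random(eems, num_iteration=5000)

# Samples mapping to the same (or neighboring) units form the sample clusters
bmus = [som.winner(x) for x in eems]

# Component planes: one 8x8 weight plane per excitation-emission pair.
# Correlating planes reveals wavelength pairs that vary together, i.e.
# candidate underlying fluorescence components.
planes = som.get_weights().reshape(64, 600)
corr = np.corrcoef(planes.T)
```

Because the SOM places every sample somewhere on the map, outliers end up naturally integrated in the analysis rather than discarded, matching the strength noted above.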

Relevance:

100.00%

Publisher:

Abstract:

Quality inspection and assurance is a very important step before today's products are sold to markets. As products are produced in vast quantities, interest in automating quality inspection tasks has increased correspondingly. Quality inspection tasks usually require the detection of deficiencies, defined as irregularities in this thesis. Objects containing regular patterns appear quite frequently in certain industries and sciences, e.g. half-tone raster patterns in the printing industry, crystal lattice structures in solid state physics, and solder joints and components in the electronics industry. In this thesis, the problem of regular patterns and irregularities is described in analytical form and three different detection methods are proposed. All the methods are based on the ability of the Fourier transform to represent regular information compactly. The Fourier transform enables the separation of the regular and irregular parts of an image, but the three methods presented are shown to differ in generality and computational complexity. The need to detect fine and sparse details is common in quality inspection tasks, e.g., locating small fractures in components in the electronics industry or detecting tearing in paper samples in the printing industry. In this thesis, a general definition of such details is given by defining sufficient statistical properties in the histogram domain. The analytical definition allows a quantitative comparison of methods designed for detail detection. Based on the definition, the use of existing thresholding methods is shown to be well motivated. A comparison of thresholding methods shows that minimum error thresholding outperforms the other standard methods. The results are successfully applied to a paper printability and runnability inspection setup. Missing dots in a repeating raster pattern are detected in Heliotest strips, and small surface defects in IGT picking papers.
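
A minimal sketch of the Fourier-based separation idea: a strictly periodic pattern concentrates its energy in a few strong spectral peaks, so suppressing those peaks and transforming back leaves mostly the irregular part. The keep-the-strongest-coefficients rule below is an illustrative assumption, not one of the thesis' three methods.

```python
import numpy as np

def irregularity_map(img, regular_fraction=0.001):
    """Remove the dominant periodic content of img and return the residual."""
    F = np.fft.fft2(img)
    mag = np.abs(F)
    # Treat the strongest coefficients as the regular (periodic) part
    thresh = np.quantile(mag, 1.0 - regular_fraction)
    regular = np.where(mag >= thresh, F, 0)
    return np.abs(img - np.real(np.fft.ifft2(regular)))

# Synthetic raster with one missing dot: the residual peaks near the defect
xx, yy = np.meshgrid(np.arange(256), np.arange(256))
raster = (np.sin(xx / 4.0) * np.sin(yy / 4.0) > 0.5).astype(float)
raster[120:128, 120:128] = 0.0                    # simulated missing dot
res = irregularity_map(raster)
print(np.unravel_index(res.argmax(), res.shape))  # near the defect location
```

A detail detector such as minimum error thresholding, favored by the comparison above, could then be applied to the residual image.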

Relevance:

100.00%

Publisher:

Abstract:

This thesis gives an overview of the use of level set methods in the field of image science. The similar fast marching method is discussed for comparison, and the narrow band and particle level set methods are also introduced. The level set method is a numerical scheme for representing, deforming, and recovering structures in arbitrary dimensions. It approximates and tracks moving interfaces, dynamic curves, and surfaces. The level set method does not define how and why a boundary is advancing the way it is, but simply represents and tracks the boundary. The principal idea of the level set method is to represent an N-dimensional boundary in N+1 dimensions. This gives the generality to represent even complex boundaries. Level set methods can be powerful tools for representing dynamic boundaries, but they can require a lot of computing power. In particular, the basic level set method carries a considerable computational burden. This burden can be alleviated with more sophisticated versions of the level set algorithm, like the narrow band level set method, or with a programmable hardware implementation. A parallel approach can also be used in suitable applications. It is concluded that these methods can be used in quite a broad range of image applications, such as computer vision and graphics and scientific visualization, and also to solve problems in computational physics. Level set methods, and methods derived from and inspired by them, will remain at the front line of image processing in the future.
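
A minimal sketch of the core level set idea in 2-D, under simple assumptions: the interface is the zero level set of a function phi (here a signed distance to a circle), and motion at constant speed V along the normal follows the PDE phi_t + V|grad phi| = 0. This toy explicit scheme omits the upwinding and reinitialization a production implementation would need.

```python
import numpy as np

n, h = 128, 2.0 / 128              # grid size and spacing on [-1, 1]^2
dt, V = 0.005, 1.0                 # time step and constant normal speed
x, y = np.meshgrid(np.linspace(-1, 1, n), np.linspace(-1, 1, n))
phi = np.sqrt(x**2 + y**2) - 0.5   # signed distance: zero set is a circle

for _ in range(50):
    gy, gx = np.gradient(phi, h)   # spatial gradient of phi
    grad_norm = np.sqrt(gx**2 + gy**2)
    phi -= dt * V * grad_norm      # phi_t = -V |grad phi|: the circle expands

interface = np.abs(phi) < h        # cells crossed by the moving boundary
print(interface.sum())
```

The "N-dimensional boundary in N+1 dimensions" formulation above is exactly what phi provides: topology changes such as merging and splitting come for free, which is the method's main appeal.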

Relevance:

100.00%

Publisher:

Abstract:

Eosinophilic fasciitis is a rare condition, generally limited to the distal parts of the arms and legs. MRI is the ideal imaging modality for diagnosing and monitoring this condition. MRI findings typically show only fascial involvement, although less regularly signal abnormalities may also be observed in neighboring muscle tissue and hypodermic fat. Differential diagnosis of eosinophilic fasciitis by MRI requires the exclusion of several other superficial and deep soft tissue disorders.

Relevance:

100.00%

Publisher:

Abstract:

This paper describes an evaluation framework that allows a standardized and quantitative comparison of IVUS lumen and media segmentation algorithms. This framework was introduced at the MICCAI 2011 Computing and Visualization for (Intra)Vascular Imaging (CVII) workshop, comparing the results of the eight teams that participated. We describe the available database, comprising multi-center, multi-vendor, and multi-frequency IVUS datasets, their acquisition, the creation of the reference standard, and the evaluation measures. The approaches address segmentation of the lumen, the media, or both borders; semi- or fully-automatic operation; and 2-D vs. 3-D methodology. Three performance measures for quantitative analysis have been proposed. The results of the evaluation indicate that segmentation of the vessel lumen and media is possible with an accuracy comparable to manual annotation when semi-automatic methods are used, and that encouraging results can also be obtained in the case of fully-automatic segmentation. The analysis performed in this paper also highlights the challenges in IVUS segmentation that remain to be solved.
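
The three performance measures are not spelled out in this abstract, so as a hedged stand-in the sketch below shows one widely used overlap measure, the Jaccard index between an automatic and a reference (manually annotated) lumen mask.

```python
import numpy as np

def jaccard(auto_mask, ref_mask):
    """Region overlap between two boolean segmentation masks (1.0 = identical)."""
    inter = np.logical_and(auto_mask, ref_mask).sum()
    union = np.logical_or(auto_mask, ref_mask).sum()
    return inter / union if union else 1.0

# Toy usage with two offset square 'lumen' masks
a = np.zeros((64, 64), dtype=bool); a[16:48, 16:48] = True
b = np.zeros((64, 64), dtype=bool); b[20:52, 20:52] = True
print(jaccard(a, b))
```

Contour-distance measures (e.g. mean or Hausdorff distance between borders) are the other common family for this kind of lumen/media comparison.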

Relevance:

100.00%

Publisher:

Abstract:

PURPOSE: To improve coronary magnetic resonance angiography (MRA) by combining a two-dimensional (2D) spatially selective radiofrequency (RF) pulse with a T2-preparation module ("2D-T2-Prep"). METHODS: An adiabatic T2-Prep was modified so that the first and last pulses were of differing spatial selectivity. The first RF pulse was replaced by a 2D pulse, such that a pencil-beam volume is excited. The last RF pulse remains nonselective, thus restoring the T2-prepared pencil-beam while tipping the (formerly longitudinal) magnetization outside the pencil-beam into the transverse plane, where it is then spoiled. Thus, only a cylinder of T2-prepared tissue remains for imaging. Numerical simulations were followed by phantom validation and in vivo coronary MRA, where the technique was quantitatively evaluated. Reduced field-of-view (rFoV) images were similarly studied. RESULTS: In vivo, full field-of-view 2D-T2-Prep significantly improved vessel sharpness as compared with conventional T2-Prep, without adversely affecting the signal-to-noise (SNR) or contrast-to-noise ratio (CNR). It also reduced respiratory motion artifacts. In rFoV images, the SNR, CNR, and vessel sharpness decreased, although scan time was reduced by 60%. CONCLUSION: Compared with conventional T2-Prep, the 2D-T2-Prep improves vessel sharpness and decreases respiratory ghosting while preserving both SNR and CNR. It can also be used to acquire rFoV images for accelerated data acquisition.
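
To see why any T2-preparation module generates contrast in the first place, a minimal sketch of the standard mono-exponential weighting is given below; the preparation duration and T2 values are typical literature figures quoted only for illustration, not parameters from this study.

```python
import numpy as np

def t2prep_signal(m0, t2_ms, te_prep_ms=40.0):
    """Magnetization remaining after a T2 preparation of duration te_prep_ms."""
    return m0 * np.exp(-te_prep_ms / t2_ms)

print(t2prep_signal(1.0, 250.0))  # arterial blood (long T2): most signal kept
print(t2prep_signal(1.0, 40.0))   # myocardium (short T2): strongly suppressed
```

The 2D-T2-Prep described above adds spatial selectivity on top of this contrast mechanism, so that only the pencil-beam cylinder retains prepared magnetization.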