989 results for Quantitative imaging


Relevance:

20.00%

Publisher:

Abstract:

Independent research jointly commissioned by the Department of Health, Social Services and Public Safety (DHSSPS) and the HSC R&D Division.

Network analysis naturally relies on graph theory and, more particularly, on the use of node and edge metrics to identify the salient properties of graphs. When building visual maps of networks, these metrics are turned into useful visual cues or are used interactively, for instance to filter out parts of a graph while querying it. Over the years, analysts from different application domains have designed metrics to serve specific needs. Network science is an inherently cross-disciplinary field, which leads to the publication of metrics with similar goals; different names and descriptions of these analytics often mask the similarity between two metrics that originated in different fields. Here, we study a set of graph metrics and compare their relative values and behaviors in an effort to survey their potential contributions to the spatial analysis of networks.
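As a minimal illustration of comparing metric behaviors, the sketch below computes two classic node metrics on a toy graph and correlates them; the graph, the metric pair, and the use of Pearson correlation are illustrative choices, not the specific set studied in the paper.

```python
# Sketch: comparing two node metrics on a small toy graph. Pure Python;
# the graph and metric choices are illustrative assumptions.
from collections import deque

def bfs_distances(adj, src):
    """Hop distances from src to every reachable node."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def degree_centrality(adj):
    n = len(adj)
    return {u: len(nbrs) / (n - 1) for u, nbrs in adj.items()}

def closeness_centrality(adj):
    n = len(adj)
    out = {}
    for u in adj:
        d = bfs_distances(adj, u)
        # Standard definition; 0 if the graph is disconnected from u.
        out[u] = (n - 1) / sum(d.values()) if len(d) == n else 0.0
    return out

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy)

# A small undirected graph as adjacency sets (a clique with a tail).
adj = {0: {1, 2}, 1: {0, 2, 3}, 2: {0, 1, 3}, 3: {1, 2, 4}, 4: {3, 5}, 5: {4}}
deg = degree_centrality(adj)
clo = closeness_centrality(adj)
nodes = sorted(adj)
r = pearson([deg[u] for u in nodes], [clo[u] for u in nodes])
print(f"degree-closeness correlation: {r:.2f}")
```

Two metrics that rank nodes almost identically on a family of graphs are candidates for the kind of cross-domain redundancy the abstract describes.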

Recently, a new measure of the cooperative behavior of simultaneous time series was introduced (Carmeli et al., NeuroImage, 2005). This measure, called the S-estimator, is defined from the embedding dimension in a state space. The S-estimator quantifies the amount of synchronization within a data set by comparing the actual dimensionality of the set with the expected full dimensionality of the asynchronous set. It has the advantage of being a multivariate measure, unlike the bivariate measures of synchronization traditionally used in systems neuroscience. Multivariate measures of synchronization are of particular interest for applications in modern multichannel EEG research, since they readily allow mapping of local and/or regional synchronization and are compatible with other imaging techniques. We applied the S-estimator to the analysis of EEG synchronization in schizophrenia patients vs. matched controls. Whole-head mapping with the S-estimator revealed a specific pattern of local synchronization in schizophrenia patients. The differences in the synchronization landscape included decreased local synchronization over occipital and midline areas and increased synchronization over temporal areas. In frontal areas, the S-estimator revealed a tendency toward asymmetry: decreased S-values over the left hemisphere were adjacent to increased values over the right hemisphere. Separate calculations showed that this pattern was reproducible across the main EEG frequency bands. The maintenance of the same synchronization landscape across EEG frequencies probably implies structural changes in the cortical circuitry of schizophrenia patients. These changes are regionally specific and suggest that schizophrenia is a misconnectivity disorder rather than a hypo- or hyper-connectivity disorder.
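The general idea can be sketched as follows, assuming the commonly used spectral formulation of the S-estimator (entropy of the normalized eigenvalues of the zero-lag correlation matrix); the channel count and synthetic signals below are illustrative, not the study's EEG data.

```python
# Hedged sketch of the S-estimator idea: synchronization quantified from the
# eigenvalue spectrum of the channel correlation matrix. The signals are toy
# examples, not EEG recordings.
import numpy as np

def s_estimator(x):
    """x: (channels, samples). Returns S in [0, 1]; 1 = fully synchronized."""
    c = np.corrcoef(x)                           # channel-by-channel correlations
    lam = np.linalg.eigvalsh(c)
    lam = np.clip(lam, 1e-12, None)              # guard tiny negative eigenvalues
    lam = lam / lam.sum()                        # normalized eigenvalue spectrum
    n = c.shape[0]
    return 1.0 + np.sum(lam * np.log(lam)) / np.log(n)

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 1000)
common = np.sin(2 * np.pi * 10 * t)                        # shared 10 Hz rhythm
sync = common + 0.05 * rng.standard_normal((8, t.size))    # highly synchronized set
indep = rng.standard_normal((8, t.size))                   # independent noise
print(f"S(sync) = {s_estimator(sync):.2f}, S(indep) = {s_estimator(indep):.2f}")
```

A fully synchronized set collapses onto one dominant eigenvalue (S near 1), while independent channels spread the spectrum evenly (S near 0), which is what makes whole-head synchronization maps straightforward to compute per electrode cluster.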

This article describes the composition of fingermark residue as a complex system, with numerous compounds coming from different sources and evolving over time from the initial composition (the composition right after deposition) to the aged composition (the evolution of the initial composition over time). This complex system additionally varies under numerous influence factors grouped into five classes: the donor characteristics, the deposition conditions, the substrate nature, the environmental conditions and the applied enhancement techniques. The initial and aged compositions as well as the influence factors are thus considered in this article to provide a qualitative and quantitative review of all compounds identified in fingermark residue to date. The analytical techniques used to obtain these data are also enumerated. This review highlights the fact that, despite the numerous analytical processes that have already been proposed and tested to elucidate fingermark composition, advanced knowledge is still missing. Thus, there is a real need for future research on the composition of fingermark residue, focusing particularly on quantitative measurements, aging kinetics and the effects of influence factors. The results of such research are particularly important for advances in the development of fingermark enhancement and dating techniques.

This document provides general information about somatostatin receptor scintigraphy with (111)In-pentetreotide. This guideline should not be regarded as the only approach to visualise tumours expressing somatostatin receptors or as exclusive of other nuclear medicine procedures useful to obtain comparable results. The aim of this guideline is to assist nuclear medicine physicians in recommending, performing, reporting and interpreting the results of (111)In-pentetreotide scintigraphy.

The effect of copper (Cu) filtration on image quality and dose in different digital X-ray systems was investigated. Two computed radiography systems and one digital radiography detector were used. Three different polymethylmethacrylate blocks simulated the pediatric body. The effects of Cu filters of 0.1, 0.2, and 0.3 mm thickness on the entrance surface dose (ESD) and the corresponding effective doses (EDs) were measured at tube voltages of 60, 66, and 73 kV. Image quality was evaluated in a contrast-detail phantom with automated analyzer software. Cu filters of 0.1, 0.2, and 0.3 mm thickness decreased the ESD by 25-32%, 32-39%, and 40-44%, respectively, the ranges depending on tube voltage. There was no consistent decline in image quality with increasing Cu filtration. The estimated ED of anterior-posterior (AP) chest projections was reduced by up to 23%. No relevant reduction in the ED was noted in AP radiographs of the abdomen and pelvis or in posterior-anterior radiographs of the chest. Cu filtration reduces the ESD, but generally does not reduce the effective dose. Cu filters can help protect radiosensitive superficial organs, such as the mammary glands in AP chest projections.

Background/Purpose: Lemierre syndrome (LS) is defined by a recent oropharyngeal infection, the clinical presence or radiological demonstration of internal jugular vein (IJV) thrombosis, and a documented anaerobic organism, principally Fusobacterium necrophorum (Fn), leading to septicaemia and septic embolization. It is a rare infection, described since 1900, that nearly disappeared after the beginning of the antibiotic era. Although seldom described in the literature, this infection has been reappearing over the last 10 years, either because of increasing antibiotic resistance or because of changes in antibiotic prescription. The aim of this study is to describe the role of medical imaging in the diagnosis, staging and follow-up of Lemierre syndrome, as well as to describe the ultrasound (US), computed tomography (CT) and magnetic resonance imaging (MRI) findings of this rare disease. Patients and methods: Radiological and medical files of patients diagnosed with Lemierre syndrome over the past 6 years at the CHUV hospital were analysed retrospectively. The CT, US, colour Doppler US (CDUS) and MRI examinations performed were reviewed to define their specific imaging findings. Results: IJV thrombosis was demonstrated by US in 2 cases, by CT in 6 cases and by MRI in one case. Septic pulmonary emboli were detected by CT in 5 patients. Complications of LS were depicted by MRI in one case and by CT in one case. Conclusion: In the appropriate clinical setting, US, CT or MRI evidence of IJV thrombosis together with chest CT findings suggestive of septic emboli should lead the physician to consider the diagnosis of LS. Imaging thus allows a faster diagnosis and a more efficient treatment of this infection, which in the case of insufficient therapy can lead to death.

PURPOSE: To determine the diagnostic value of the intravascular contrast agent gadocoletic acid (B-22956) in three-dimensional, free-breathing coronary magnetic resonance angiography (MRA) for stenosis detection in patients with suspected or known coronary artery disease. METHODS: Eighteen patients underwent three-dimensional, free-breathing coronary MRA of the left and right coronary system before and after intravenous application of a single dose of gadocoletic acid (B-22956) using three different dose regimens (group A, 0.050 mmol/kg; group B, 0.075 mmol/kg; group C, 0.100 mmol/kg). Precontrast scanning followed a standard non-contrast coronary MRA T2-preparation/turbo-gradient-echo sequence (T2Prep); for postcontrast scanning, an inversion-recovery gradient-echo sequence was used (real-time navigator correction for both scans). In pre- and postcontrast scans, quantitative analysis of coronary MRA data was performed to determine the number of visible side branches, vessel length and vessel sharpness for each of the three coronary arteries (LAD, LCX, RCA). The number of assessable coronary artery segments was determined to calculate sensitivity and specificity for detection of stenosis ≥50% on a segment-to-segment basis (16-segment model) in pre- and postcontrast scans, with x-ray coronary angiography as the standard of reference. RESULTS: Dose group B (0.075 mmol/kg) was preferable with regard to improvement of MR angiographic parameters: in postcontrast scans, all MR angiographic parameters increased significantly except for the number of visible side branches of the left circumflex artery. In addition, assessability of coronary artery segments improved significantly postcontrast in this dose group (67 versus 88%, p < 0.01). Diagnostic performance (sensitivity, specificity, accuracy) was 83, 77 and 78% for precontrast and 86, 95 and 94% for postcontrast scans.
CONCLUSIONS: The use of gadocoletic acid (B-22956) results in an improvement of MR angiographic parameters, assessability of coronary segments and detection of coronary stenoses ≥50%.

Imaging mass spectrometry (IMS) is an emergent and innovative approach for measuring the composition, abundance and regioselectivity of molecules within an investigated area of fixed dimension. Although it provides unprecedented molecular information compared with conventional MS techniques, enhancement of the protein signature obtained by IMS is still necessary and challenging. This paper demonstrates the combination of conventional organic washes with an optimized aqueous buffer for tissue section preparation before matrix-assisted laser desorption/ionization (MALDI) IMS of proteins. Based on a 500 mM ammonium formate in water-acetonitrile (9:1, v/v; 0.1% trifluoroacetic acid, 0.1% Triton) solution, this buffer wash has been shown to significantly enhance the protein signature in profiling and IMS (~fourfold) when used after organic washes (70% EtOH followed by 90% EtOH), improving the quality and number of ion images obtained from mouse kidney and 14-day mouse fetus whole-body tissue sections, while maintaining reproducibility similar to conventional tissue rinsing. Although some protein losses were observed, data mining demonstrated that these were primarily low-abundance signals and that the number of new peaks found is greater with the described procedure. The proposed buffer has thus been demonstrated to be highly efficient for tissue section preparation, providing novel and complementary information for direct on-tissue MALDI analysis compared with conventional organic rinsing alone.

Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending corresponding approaches beyond the local scale still represents a major challenge, yet is critically important for the development of reliable groundwater flow and contaminant transport models. To address this issue, I have developed a hydrogeophysical data integration technique based on a two-step Bayesian sequential simulation procedure that is specifically targeted towards larger-scale problems. The objective is to simulate the distribution of a target hydraulic parameter based on spatially exhaustive, but poorly resolved, measurements of a pertinent geophysical parameter and locally highly resolved, but spatially sparse, measurements of the considered geophysical and hydraulic parameters. To this end, my algorithm links the low- and high-resolution geophysical data via a downscaling procedure before relating the downscaled regional-scale geophysical data to the high-resolution hydraulic parameter field. I first illustrate the application of this novel data integration approach to a realistic synthetic database consisting of collocated high-resolution borehole measurements of the hydraulic and electrical conductivities and spatially exhaustive, low-resolution electrical conductivity estimates obtained from electrical resistivity tomography (ERT). The overall viability of this method is tested and verified by performing and comparing flow and transport simulations through the original and simulated hydraulic conductivity fields.
The corresponding results indicate that the proposed data integration procedure does indeed allow for obtaining faithful estimates of the larger-scale hydraulic conductivity structure and reliable predictions of the transport characteristics over medium- to regional-scale distances. The approach is then applied to a corresponding field scenario consisting of collocated high-resolution measurements of the electrical conductivity, as measured using a cone penetrometer testing (CPT) system, and the hydraulic conductivity, as estimated from electromagnetic flowmeter and slug test measurements, in combination with spatially exhaustive low-resolution electrical conductivity estimates obtained from surface-based electrical resistivity tomography (ERT). The corresponding results indicate that the newly developed data integration approach is indeed capable of adequately capturing both the small-scale heterogeneity as well as the larger-scale trend of the prevailing hydraulic conductivity field. The results also indicate that this novel data integration approach is remarkably flexible and robust and hence can be expected to be applicable to a wide range of geophysical and hydrological data at all scale ranges. In the second part of my thesis, I evaluate in detail the viability of sequential geostatistical resampling as a proposal mechanism for Markov Chain Monte Carlo (MCMC) methods applied to high-dimensional geophysical and hydrological inverse problems in order to allow for a more accurate and realistic quantification of the uncertainty associated with the thus inferred models. Focusing on a series of pertinent crosshole georadar tomographic examples, I investigate two classes of geostatistical resampling strategies with regard to their ability to efficiently and accurately generate independent realizations from the Bayesian posterior distribution.
The corresponding results indicate that, despite its popularity, sequential resampling is rather inefficient at drawing independent posterior samples for realistic synthetic case studies, notably for the practically common and important scenario of pronounced spatial correlation between model parameters. To address this issue, I have developed a new gradual-deformation-based perturbation approach, which is flexible with regard to the number of model parameters as well as the perturbation strength. Compared to sequential resampling, this newly proposed approach was proven to be highly effective in decreasing the number of iterations required for drawing independent samples from the Bayesian posterior distribution.
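The gradual-deformation idea behind such a proposal can be sketched in a few lines: combining the current model with an independent prior realization via a rotation preserves a Gaussian prior, and the angle controls the perturbation strength. The field size and angle below are illustrative assumptions, not the thesis configuration.

```python
# Hedged sketch of a gradual-deformation MCMC proposal for a standardized
# (zero-mean, unit-variance) Gaussian parameter field.
import numpy as np

def gradual_deformation(m_current, m_independent, theta):
    """Propose a new field; theta=0 keeps m_current, theta=pi/2 gives m_independent.
    Since cos^2 + sin^2 = 1, the Gaussian prior is preserved."""
    return np.cos(theta) * m_current + np.sin(theta) * m_independent

rng = np.random.default_rng(42)
m_cur = rng.standard_normal((50, 50))   # current parameter field (toy example)
m_ind = rng.standard_normal((50, 50))   # independent draw from the same prior
m_new = gradual_deformation(m_cur, m_ind, theta=0.2)

# The proposal leaves the prior variance (approximately, for a finite sample) unchanged:
print(f"var(current) = {m_cur.var():.2f}, var(proposal) = {m_new.var():.2f}")
```

Small angles yield highly correlated proposals with high acceptance rates; larger angles decorrelate the chain faster at the cost of acceptance, which is the trade-off the perturbation-strength flexibility addresses.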

The tourism consumer’s purchase decision process is, to a great extent, conditioned by the image the tourist has of the different destinations that make up his or her choice set. In a highly competitive international tourist market, those responsible for destinations’ promotion and development policies seek differentiation strategies so that they may position the destinations in the most suitable market segments for their product in order to improve their attractiveness to visitors and increase or consolidate the economic benefits that tourism activity generates in their territory. To this end, the main objective we set ourselves in this paper is the empirical analysis of the factors that determine the image formation of Tarragona city as a cultural heritage destination. Without a doubt, UNESCO’s declaration of Tarragona’s artistic and monumental legacies as World Heritage site in the year 2000 meant important international recognition of the quality of the cultural and patrimonial elements offered by the city to the visitors who choose it as a tourist destination. It also represents a strategic opportunity to boost the city’s promotion of tourism and its consolidation as a unique destination given its cultural and patrimonial characteristics. Our work is based on the use of structured and unstructured techniques to identify the factors that determine Tarragona’s tourist destination image and that have a decisive influence on visitors’ process of choice of destination. In addition to being able to ascertain Tarragona’s global tourist image, we consider that the heterogeneity of its visitors requires a more detailed study that enables us to segment visitor typology. We consider that the information provided by these results may prove of great interest to those responsible for local tourism policy, both when designing products and when promoting the destination.

Anti-basal ganglia antibodies (ABGAs) have been suggested to be a hallmark of autoimmunity in Gilles de la Tourette's syndrome (GTS), possibly related to prior exposure to streptococcal infection. In order to detect whether the presence of ABGAs was associated with subtle structural changes in GTS, whole-brain analyses using independent sets of T1- and diffusion tensor imaging-based MRI methods were performed on 22 adults with GTS with (n = 9) and without (n = 13) detectable ABGAs in the serum. Voxel-based morphometry analysis failed to detect any significant difference in grey matter density between ABGA-positive and ABGA-negative groups in the caudate nuclei, putamina, thalami and frontal lobes. These results suggest that ABGA synthesis is not related to structural changes in grey and white matter (detectable with these methods) within frontostriatal circuits.

Purpose/Aim: To review the embryological basis of the wide spectrum of anorectal malformations (ARM), to provide anatomical schemas showing the possible locations of fistulas in boys and girls, and to present the typical imaging findings of these complex anomalies using various imaging methods, with emphasis on 3T MRI. Content Organization: 1. Embryology. 2. Imaging techniques. 3. Normal 3T-MRI pelvic anatomy. 4. Anorectal malformations in boys: classification; anatomical schemas of fistula locations; imaging studies. 5. Anorectal malformations in girls: classification; anatomical schemas of fistula locations; imaging studies. 6. Imaging of Currarino syndrome. 7. Imaging of VACTERL syndrome. Summary: ARM are a group of complex anatomical alterations characterized by an abnormal separation of the genitourinary system from the hindgut. The major teaching points of this pictorial essay are to show: the normal anatomy of the pelvic floor and the sphincter muscle complex on 3T MRI; anatomical schemas of the different types of ARM in boys and girls; and imaging findings of a wide spectrum of ARM using a multimodality approach, including colostogram, voiding cystourethrogram and MRI of the pelvis.

Whole-body (WB) planar imaging has long been one of the staple methods of dosimetry, and its quantification has been formalized by the MIRD Committee in Pamphlet No. 16. One of the issues not specifically addressed in the formalism occurs when the count rates reaching the detector are sufficiently high to result in camera count saturation. Camera dead-time effects have been extensively studied, but all of the developed correction methods assume static acquisitions. However, during WB planar (sweep) imaging, a variable amount of imaged activity exists in the detector's field of view as a function of time, and therefore the camera saturation is time dependent. A new time-dependent algorithm was developed to correct for dead-time effects during WB planar acquisitions that accounts for relative motion between the detector heads and the imaged object. Static camera dead-time parameters were acquired by imaging decaying activity in a phantom and obtaining a saturation curve. Using these parameters, an iterative algorithm akin to Newton's method was developed, which takes into account the variable count rate seen by the detector as a function of time. The algorithm was tested on simulated data as well as on a whole-body scan of high-activity Samarium-153 in an ellipsoid phantom. A complete set of parameters from unsaturated phantom data necessary for count-rate-to-activity conversion was also obtained, including build-up and attenuation coefficients, in order to convert corrected count rate values to activity. The algorithm proved successful in accounting for motion- and time-dependent saturation effects in both the simulated and measured data and converged to any desired degree of precision. The clearance half-life calculated from the ellipsoid phantom data was 45.1 h after dead-time correction and 51.4 h with no correction; the physical decay half-life of Samarium-153 is 46.3 h.
Accurate WB planar dosimetry of high activities relies on successfully compensating for camera saturation in a way that takes into account the variable activity in the field of view, i.e. time-dependent dead-time effects. The algorithm presented here accomplishes this task.
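The static building block of such a correction can be sketched as follows, assuming the standard paralyzable dead-time model; the dead-time constant and count rates below are illustrative, and the published algorithm additionally handles the time-varying count rate seen during the sweep.

```python
# Hedged sketch: inverting the paralyzable dead-time model
#   R_obs = R_true * exp(-R_true * tau)
# for R_true with Newton's method, on the lower (physical) branch.
import math

def true_rate(r_obs, tau, tol=1e-9, max_iter=100):
    """Recover the true count rate from the observed (saturated) rate."""
    r = r_obs  # starting guess: assume no saturation
    for _ in range(max_iter):
        f = r * math.exp(-r * tau) - r_obs          # residual of the model
        fp = (1.0 - r * tau) * math.exp(-r * tau)   # derivative w.r.t. r
        step = f / fp
        r -= step
        if abs(step) < tol:
            break
    return r

tau = 5e-6                 # 5 microsecond dead time (assumed value)
r_true = 40_000.0          # counts per second
r_obs = r_true * math.exp(-r_true * tau)   # simulate the saturated measurement
print(f"observed {r_obs:.0f} cps -> recovered {true_rate(r_obs, tau):.0f} cps")
```

In a time-dependent version, this inversion would be applied per time bin (or per sweep position) with the instantaneous observed rate, which is the essence of extending static dead-time correction to moving-detector acquisitions.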

OBJECTIVE: Prospective studies have shown that quantitative ultrasound (QUS) techniques predict the risk of fracture of the proximal femur with standardised risk ratios similar to those of dual-energy x-ray absorptiometry (DXA). Few studies have investigated these devices for the prediction of vertebral fractures. The Basel Osteoporosis Study (BOS) is a population-based prospective study to assess the performance of QUS devices and DXA in predicting incident vertebral fractures. METHODS: 432 women aged 60-80 years were followed up for 3 years. Incident vertebral fractures were assessed radiologically. Bone measurements using DXA (spine and hip) and QUS measurements (calcaneus and proximal phalanges) were performed. Measurements were assessed for their value in predicting incident vertebral fractures using logistic regression. RESULTS: QUS measurements at the calcaneus and DXA measurements discriminated between women with and without incident vertebral fracture (defined as a 20% reduction in vertebral height). The relative risks (RRs) for vertebral fracture, adjusted for age, were 2.3 for the Stiffness Index (SI) and 2.8 for the Quantitative Ultrasound Index (QUI) at the calcaneus, and 2.0 for bone mineral density at the lumbar spine. The predictive value (AUC (95% CI)) of QUS measurements at the calcaneus remained highly significant (0.70 for the SI, 0.72 for the QUI, and 0.67 for DXA at the lumbar spine) even after adjustment for other confounding variables. CONCLUSIONS: QUS of the calcaneus and bone mineral density measurements were shown to be significant predictors of incident vertebral fracture. The RRs for QUS measurements at the calcaneus are of similar magnitude to those for DXA measurements.
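The discrimination statistic reported above (the AUC) has a simple pair-counting (Mann-Whitney) interpretation: the probability that a randomly chosen fracture case has a lower measurement than a randomly chosen control, given that lower QUS/BMD values indicate higher risk. The stiffness-index values below are invented toy numbers, not BOS data.

```python
# Hedged sketch: AUC as the Mann-Whitney pair-counting probability, assuming
# lower measurement values indicate higher fracture risk. Toy values only.
def auc_lower_is_riskier(cases, controls):
    """Estimate P(case < control) + 0.5 * P(tie) over all case-control pairs."""
    pairs = wins = ties = 0
    for c in cases:
        for k in controls:
            pairs += 1
            if c < k:
                wins += 1
            elif c == k:
                ties += 1
    return (wins + 0.5 * ties) / pairs

stiffness_cases = [62, 70, 75, 80]         # e.g. SI in women with a new fracture
stiffness_controls = [78, 85, 88, 92, 95]  # SI in women without a fracture
auc = auc_lower_is_riskier(stiffness_cases, stiffness_controls)
print(f"AUC = {auc:.2f}")  # → 0.95
```

An AUC of 0.5 corresponds to no discrimination; the values of 0.67-0.72 reported in the abstract sit between chance and perfect separation.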