239 results for multilevel optimization multigrid PDE image restoration


Relevance:

20.00%

Publisher:

Abstract:

We want to shed some light on the development of person mobility by analysing the repeated cross-sectional data of the four National Travel Surveys (NTS) conducted in Germany since the mid-1970s. The driving forces mentioned above operate on different levels of the system that generates the spatial behaviour we observe: travel demand is derived from the needs and desires of individuals to participate in spatially separated activities. Individuals organise their lives in an interactive process within the context they live in, using the given infrastructure. Essential determinants of their demand are the individual's socio-demographic characteristics, but the opportunities and constraints defined by the household and the environment are also relevant for the behaviour that can ultimately be realised. To fully capture the context that determines individual behaviour, the (nested) hierarchy of persons within households within spatial settings has to be considered. The data we use for our analysis contain information on these three levels. With the analysis of these micro-data we attempt to improve our understanding of the macro-level developments summarised above. In addition, we investigate the predictive power of a few classic socio-demographic variables for the daily travel distance of individuals in the four NTS data sets, with a focus on the evolution of this predictive power. The further task of correctly measuring distances travelled by means of the NTS is threatened by the fact that, although these surveys measure the same variables, different sampling designs and data collection procedures were used. A further aim of the analysis is therefore to detect variables whose control corrects for the known measurement error, as a prerequisite for applying appropriate models to better understand the development of individual travel behaviour in a multilevel context. This task is complicated by the fact that variables that inform on survey procedures and outcomes are only provided with the data set for 2002 (see Infas and DIW Berlin, 2003).
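The nested person-household-region structure described above can be illustrated with a toy variance decomposition (all level variances and sample sizes below are invented for illustration; a real analysis would fit a proper multilevel model, e.g. by restricted maximum likelihood):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy three-level data: persons nested in households nested in regions,
# mimicking the NTS micro-data structure (all variances are invented).
n_regions, hh_per_region, pers_per_hh = 20, 30, 2
region_eff = rng.normal(0.0, 3.0, n_regions)                  # spatial level
hh_eff = rng.normal(0.0, 4.0, (n_regions, hh_per_region))     # household level
pers_eff = rng.normal(0.0, 5.0, (n_regions, hh_per_region, pers_per_hh))

# Daily travel distance (km) = grand mean + region + household + person effects.
dist = 40.0 + region_eff[:, None, None] + hh_eff[:, :, None] + pers_eff

# Crude ANOVA-style variance decomposition across the three levels.
var_total = dist.var()
var_region = dist.mean(axis=(1, 2)).var()      # between-region component
var_hh = dist.mean(axis=2).var() - var_region  # between-household component
icc_region = var_region / var_total            # share attributable to region
```

The intraclass correlation `icc_region` quantifies how much of the variation in travel distance is attributable to the spatial setting rather than to households or individuals.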

Relevance:

20.00%

Publisher:

Abstract:

We advocate the use of a novel compressed sensing technique for accelerating the magnetic resonance image acquisition process, coined spread spectrum MR imaging, or simply s2MRI. The method consists of pre-modulating the signal of interest by a linear chirp, resulting from the application of quadratic phase profiles, before random k-space under-sampling with uniform average density. The effectiveness of the procedure is theoretically underpinned by the optimization of the coherence between the sparsity and sensing bases. The application of the technique to single-coil acquisitions is thoroughly studied by means of numerical simulations as well as phantom and in vivo experiments on a 7T scanner. The corresponding results suggest a favorable comparison with state-of-the-art variable-density k-space under-sampling approaches.
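A minimal numerical sketch of the acquisition pipeline described above, with assumed values for the chirp rate and sampling density (the actual s2MRI reconstruction solves a sparse-recovery problem rather than the zero-filled inverse shown here):

```python
import numpy as np

rng = np.random.default_rng(0)

# Sparse 1D test signal standing in for an MR image (values illustrative).
n = 256
x = np.zeros(n)
x[60:90] = 1.0
x[140:150] = 0.5

# Pre-modulation by a linear chirp, i.e. a quadratic phase profile,
# which spreads the signal's energy across k-space.
t = np.linspace(-1.0, 1.0, n)
w = 0.25  # chirp rate (assumed value; spans the full band without aliasing)
chirp = np.exp(1j * np.pi * w * n * t ** 2)

# Random k-space under-sampling with uniform average density.
undersampling = 0.4
mask = rng.random(n) < undersampling
k_full = np.fft.fft(chirp * x)  # fully sampled, modulated k-space
k_meas = k_full * mask          # retained measurements

# Naive zero-filled reconstruction; the actual method would instead solve
# a sparse-recovery problem exploiting the optimized coherence.
x_zf = np.conj(chirp) * np.fft.ifft(k_meas)
```

Because the chirp has unit magnitude, the total k-space energy is preserved while the spectral peak is flattened, which is the spreading effect the technique relies on.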

Relevance:

20.00%

Publisher:

Abstract:

Type 2 diabetes has been related to a decrease in mitochondrial DNA (mtDNA) content. In this study, we show increased expression of the peroxisome proliferator-activated receptor-alpha (PPARalpha) and its target genes involved in fatty acid metabolism in skeletal muscle of Zucker Diabetic Fatty (ZDF) (fa/fa) rats. In contrast, the mRNA levels of genes involved in glucose transport and utilization (GLUT4 and phosphofructokinase) were decreased, whereas the expression of pyruvate dehydrogenase kinase 4 (PDK-4), which suppresses glucose oxidation, was increased. The shift from glucose to fatty acids as the energy source in skeletal muscle of ZDF rats was accompanied by a reduction of subunit 1 of complex I (NADH dehydrogenase subunit 1, ND1) and subunit II of complex IV (cytochrome c oxidase II, COII), two genes of the electron transport chain encoded by mtDNA. The transcript levels of PPARgamma Coactivator 1 (PGC-1) showed a significant reduction. Treatment with troglitazone (30 mg/kg/day) for 15 days reduced insulin values and reversed the increase in PDK-4 mRNA levels, suggesting improved insulin sensitivity. In addition, troglitazone treatment restored ND1 and PGC-1 expression in skeletal muscle. These results suggest that troglitazone may prevent mitochondrial metabolic derangement in skeletal muscle during the development of type 2 diabetes mellitus.

Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND: Atrial fibrillation (AF) is widely regarded as being initiated by left atrial (LA) dilatation, with subsequent dilatation of the right atrium (RA) in those who progress to chronic AF. We hypothesized that in adult patients with right-sided congenital heart disease (CHD) and AF, RA dilatation will predominate, with subsequent dilatation of the left atrium as a mirror image. METHODS: Adult patients with a diagnosis of right-sided CHD, ASD, or left-sided CHD who had undergone an echocardiographic study and electrocardiographic recording in 2007 were included. RA and LA area were measured from the apical view. AF was diagnosed from a 12-lead electrocardiogram or Holter recording. A multivariate logistic regression model was used to identify predictors of AF, and linear regression models were used to measure the relationship between RA and LA area and AF. RESULTS: A total of 291 patients were included in the study. Multivariate analysis showed that age (p=0.0001), RA area (p=0.025) and LA area (p=0.0016) were significantly related to AF. In patients with pure left-sided pathologies, there was progressive and predominant LA dilatation that paralleled the development of AF from none to paroxysmal to chronic AF. In patients with pure right-sided pathologies, there was a mirror image of progressive and predominant RA dilatation with the development of AF. CONCLUSION: We observed mirror-image atrial dilatation in patients with right-sided disease and AF. This may provide novel mechanistic insight into the origin of AF in these patients and deserves further study in the form of targeted electrophysiological studies.
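The multivariate logistic regression step can be sketched on simulated data (all cohort numbers and coefficients below are hypothetical, not the study's data; the fit uses plain gradient ascent rather than the standard iteratively reweighted least squares):

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated cohort (all numbers hypothetical, not the study's data):
# age, RA area and LA area as predictors of AF status.
n = 291
age = rng.normal(45.0, 15.0, n)
ra = rng.normal(18.0, 5.0, n)   # RA area, cm^2
la = rng.normal(20.0, 5.0, n)   # LA area, cm^2
X = np.column_stack([np.ones(n), age, ra, la])
beta_true = np.array([-12.0, 0.12, 0.15, 0.15])
p_true = 1.0 / (1.0 + np.exp(-X @ beta_true))
af = (rng.random(n) < p_true).astype(float)

# Multivariate logistic regression via plain gradient ascent on the
# log-likelihood (standardising predictors first for numerical stability).
Z = X.copy()
Z[:, 1:] = (X[:, 1:] - X[:, 1:].mean(0)) / X[:, 1:].std(0)
beta = np.zeros(4)
for _ in range(5000):
    p_hat = 1.0 / (1.0 + np.exp(-Z @ beta))
    beta += 0.5 * Z.T @ (af - p_hat) / n
pred = 1.0 / (1.0 + np.exp(-Z @ beta)) > 0.5
```

With standardized predictors, the fitted coefficients are directly comparable as log-odds changes per standard deviation of each atrial measurement.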

Relevance:

20.00%

Publisher:

Abstract:

Evaluation of segmentation methods is a crucial aspect of image processing, especially in the medical imaging field, where small differences between segmented anatomical regions can be of paramount importance. Usually, segmentation evaluation is based on a measure that depends on the number of segmented voxels inside and outside of reference regions called gold standards. Although other measures have also been used, in this work we propose a set of new similarity measures based on different features, such as the location and intensity values of the misclassified voxels, and the connectivity and boundaries of the segmented data. Using the multidimensional information provided by these measures, we propose a new evaluation method whose results are visualized by applying a Principal Component Analysis of the data, yielding a simplified graphical method for comparing different segmentation results. We have carried out an intensive study using several classic segmentation methods applied to a set of simulated brain MRI data with several noise and RF inhomogeneity levels, and also to real data. The results show that the new measures proposed here and the multidimensional evaluation improve the robustness of the evaluation and provide a better understanding of the differences between segmentation methods.
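A toy version of such a multidimensional evaluation, assuming just two of the proposed kinds of measures (an overlap measure and a crude boundary measure) and synthetic masks in place of real segmentations:

```python
import numpy as np

def dice(seg, ref):
    """Classic overlap measure between two binary masks."""
    inter = np.logical_and(seg, ref).sum()
    return 2.0 * inter / (seg.sum() + ref.sum())

def edges(m):
    """Pixels whose value differs from a 4-neighbour (crude boundary map)."""
    return (m ^ np.roll(m, 1, axis=0)) | (m ^ np.roll(m, 1, axis=1))

def boundary_mismatch(seg, ref):
    """Fraction of pixels where the two boundary maps disagree."""
    return np.logical_xor(edges(seg), edges(ref)).mean()

# Gold standard plus three competing "segmentations" (synthetic shifts).
ref = np.zeros((64, 64), dtype=bool)
ref[20:44, 20:44] = True
segs = [np.roll(ref, s, axis=0) for s in (0, 2, 6)]

# Multidimensional evaluation: one feature vector per method...
feats = np.array([[dice(s, ref), boundary_mismatch(s, ref)] for s in segs])

# ...projected with PCA (via SVD) for a simplified graphical comparison.
centered = feats - feats.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
scores = centered @ vt.T  # principal-component coordinates per method
```

Plotting the first two columns of `scores` gives the kind of simplified graphical comparison of methods the abstract describes.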

Relevance:

20.00%

Publisher:

Abstract:

This article examines the interplay of text and image in The Fairy Tales of Charles Perrault (1977), translated by Angela Carter and illustrated by Martin Ware, as a form of intersemiotic dialogue that sheds new light on Carter's work. It argues that Ware's highly original artwork based on the translation not only calls into question the association of fairy tales with children's literature (which still characterizes Carter's translation), but also captures an essential if heretofore neglected aspect of Carter's creative process, namely the dynamics between translating, illustrating and rewriting classic tales. Several elements from Ware's illustrations are indeed taken up and elaborated on in The Bloody Chamber and Other Stories (1979), the collection of "stories about fairy stories" that made Carter famous. These include visual details and strategies that she transposed to the realm of writing, giving rise to reflections on the relation between visuality and textuality.

Relevance:

20.00%

Publisher:

Abstract:

Computed tomography (CT) is an imaging technique in which interest has been growing since it first began to be used in the early 1970s. In the clinical environment, this imaging system has emerged as the gold-standard modality because of its high sensitivity in producing accurate diagnostic images. However, even if a direct benefit to patient healthcare is attributed to CT, the dramatic increase in the number of CT examinations performed has raised concerns about the potential negative effects of ionizing radiation on the population. To ensure a benefit-risk balance that works in favor of the patient, it is important to balance image quality and dose in order to avoid unnecessary patient exposure.

If this balance is important for adults, it should be an absolute priority for children undergoing CT examinations, especially for patients suffering from diseases requiring several follow-up examinations over the patient's lifetime. Indeed, children and young adults are more sensitive to ionizing radiation and have a longer life expectancy than adults. For this population, the risk of developing a cancer whose latency period exceeds 20 years is significantly higher than for adults. Assuming that each examination is justified, it then becomes a priority to optimize CT acquisition protocols in order to minimize the dose delivered to the patient. CT technology has been advancing at a rapid pace; since 2009, new iterative image reconstruction techniques, called statistical iterative reconstructions, have been introduced in order to decrease patient exposure and improve image quality.

The goal of the present work was to determine the potential of statistical iterative reconstructions to reduce the dose delivered in CT examinations of children and young adults as much as possible without compromising diagnostic image quality, in order to propose optimized protocols.

Optimizing a CT protocol requires evaluating both the delivered dose and the image quality useful for diagnosis. While the dose is estimated using CT indices (CTDIvol and DLP), the particularity of this research was to use two radically different approaches to evaluate image quality. The first, the "physical" approach, computes physical metrics (SD, MTF, NPS, etc.) measured on phantoms under well-defined conditions. Although this technique has limitations because it does not take the radiologist's perception into account, it enables the physical characterization of certain image properties in a simple and timely way. The second, the "clinical" approach, is based on the evaluation of anatomical structures (diagnostic criteria) present in patient images. Radiologists involved in the assessment step score the quality of the structures for diagnostic purposes using a simple rating scale. This approach is relatively complicated to implement and time-consuming; nevertheless, it has the advantage of being very close to the radiologist's practice and can be considered the reference method.

Among the main results of this work, the statistical iterative reconstruction algorithms studied in the clinic (ASIR and VEO) showed a strong potential to reduce CT dose (by up to 90%). However, by their mechanisms, they modify the appearance of the image, introducing a change in texture that may affect the quality of the diagnosis. By comparing the results of the "clinical" and "physical" approaches, it was shown that this change in texture corresponds to a modification of the noise frequency spectrum, whose analysis makes it possible to anticipate or avoid a loss of diagnostic quality. This work also demonstrates that integrating these new reconstruction techniques into clinical practice cannot simply be done on the basis of protocols designed for conventional reconstructions. The conclusions of this work and the image quality tools developed will be able to guide future studies in the field of image quality, such as texture analysis or model observers dedicated to CT.
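The noise power spectrum (NPS) analysis mentioned above can be sketched as follows, using synthetic white-noise ROIs in place of phantom acquisitions (normalisation conventions vary; this sketch normalises so that the NPS sums to the noise variance):

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic white-noise ROIs standing in for uniform-phantom CT measurements.
n_roi, size = 50, 64
rois = rng.normal(0.0, 10.0, (n_roi, size, size))

def nps2d(rois):
    """Ensemble-averaged 2D noise power spectrum of detrended ROIs.

    Normalised so the NPS sums to the noise variance (Parseval); a physical
    NPS in mm^2 HU^2 would additionally require the pixel size.
    """
    n, h, w = rois.shape
    acc = np.zeros((h, w))
    for roi in rois:
        d = roi - roi.mean()  # detrend: remove the mean (DC) term
        acc += np.abs(np.fft.fftshift(np.fft.fft2(d))) ** 2
    return acc / (n * (h * w) ** 2)

nps = nps2d(rois)
```

A texture change introduced by an iterative reconstruction would show up here as a shift of the NPS towards lower or higher spatial frequencies, which is exactly what the "physical" approach measures.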

Relevance:

20.00%

Publisher:

Abstract:

Individual-as-maximizing-agent analogies result in a simple understanding of the functioning of the biological world. Identifying the conditions under which individuals can be regarded as fitness-maximizing agents is thus of considerable interest to biologists. Here, we compare different concepts of fitness maximization, and discuss within a single framework the relationship between Hamilton's (J Theor Biol 7: 1-16, 1964) model of social interactions, Grafen's (J Evol Biol 20: 1243-1254, 2007a) formal Darwinism project, and the idea of evolutionarily stable strategies. We distinguish cases where phenotypic effects are additively separable from those where they are not, the latter not being covered by Grafen's analysis. In both cases it is possible to define a maximand, in the form of an objective function phi(z), whose argument is the phenotype of an individual and whose derivative is proportional to Hamilton's inclusive fitness effect. However, this maximand can be identified with the expression for fecundity or fitness only in the case of additively separable phenotypic effects, making individual-as-maximizing-agent analogies unattractive (although formally correct) under general situations of social interactions. We also note an inconsistency in Grafen's characterization of the solution of his maximization program by use of inclusive fitness arguments. His results are in conflict with those on evolutionarily stable strategies obtained by applying inclusive fitness theory, and can be repaired only by changing the definition of the problem.
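In marginal form, the relation described above between the maximand and Hamilton's inclusive fitness effect can be sketched as follows (notation illustrative, not the authors'; c and b are marginal personal cost and benefit to social partners, r is relatedness):

```latex
% Inclusive fitness effect of a marginal change in the phenotype z:
S(z) = -\,c'(z) + r\, b'(z)

% phi(z) is a maximand precisely when its derivative is proportional to S(z):
\phi'(z) \propto S(z)

% so an interior candidate ESS z^* satisfies the first-order condition
\phi'(z^*) = 0 \quad\Longleftrightarrow\quad c'(z^*) = r\, b'(z^*)
```

Only under additively separable phenotypic effects does such a phi(z) coincide with fecundity or fitness itself, which is the abstract's central caveat.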

Relevance:

20.00%

Publisher:

Abstract:

In medical imaging, merging automated segmentations obtained from multiple atlases has become a standard practice for improving accuracy. In this letter, we propose two new fusion methods: "Global Weighted Shape-Based Averaging" (GWSBA) and "Local Weighted Shape-Based Averaging" (LWSBA). These methods extend the well-known Shape-Based Averaging (SBA) by additionally incorporating the similarity between the reference (i.e., atlas) images and the target image to be segmented. We also propose a new spatially varying similarity-weighted neighborhood prior model, and an edge-preserving smoothness term that can be used with many existing fusion methods. We first present our new Markov Random Field (MRF) based fusion framework that models the above information. The proposed methods were evaluated on segmentation of lymph nodes in 3D head-and-neck CT images, where they produced more accurate segmentations than the existing SBA.
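A 1D sketch of the weighted shape-based averaging idea (the atlas masks and similarity weights below are invented; the paper's GWSBA/LWSBA derive the weights from image similarity and embed the fusion in an MRF framework):

```python
import numpy as np

def signed_distance(mask):
    """Signed distance on a 1D grid: negative inside the mask, positive outside."""
    idx = np.arange(mask.size)
    inside = np.flatnonzero(mask)
    outside = np.flatnonzero(~mask)
    d_to_inside = np.abs(idx[:, None] - inside[None, :]).min(axis=1)
    d_to_outside = np.abs(idx[:, None] - outside[None, :]).min(axis=1)
    return np.where(mask, -d_to_outside, d_to_inside)

# Three synthetic 1D atlas segmentations of the same structure, plus
# similarity weights between each atlas and the target (assumed given here).
n = 40
atlases = [np.zeros(n, dtype=bool) for _ in range(3)]
atlases[0][10:24] = True
atlases[1][12:26] = True
atlases[2][11:25] = True
weights = np.array([0.5, 0.2, 0.3])  # normalised similarity weights

# Weighted shape-based averaging: average the signed distance maps with the
# similarity weights, then threshold at zero to obtain the fused segmentation.
fused_sdm = sum(w * signed_distance(a) for w, a in zip(weights, atlases))
fused = fused_sdm < 0
```

Fusing in the signed-distance domain, rather than voting per voxel, is what gives SBA-style methods their smooth, shape-aware consensus boundaries.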

Relevance:

20.00%

Publisher:

Abstract:

The repeated presentation of simple objects as well as biologically salient objects can cause adaptation of behavioral and neural responses during the visual categorization of these objects. Mechanisms of response adaptation during repeated food viewing are of particular interest for better understanding food intake beyond energetic needs. Here, we measured visual evoked potentials (VEPs) and conducted neural source estimations for initial and repeated presentations of high-energy and low-energy foods as well as non-food images. The results of our study show that the behavioral and neural responses to food and food-related objects are not uniformly affected by repetition. While the repetition of images displaying low-energy foods and non-food objects modulated VEPs as well as their underlying neural sources and increased behavioral categorization accuracy, the responses to high-energy images remained largely invariant between initial and repeated encounters. Brain mechanisms engaged when viewing images of high-energy foods thus appear less susceptible to repetition effects than responses to low-energy and non-food images. This finding is likely related to the superior reward value of high-energy foods and might be one reason why high-energy foods in particular are indulged in despite their potentially detrimental health consequences.

Relevance:

20.00%

Publisher:

Abstract:

In the context of Systems Biology, computer simulations of gene regulatory networks provide a powerful tool to validate hypotheses and to explore possible system behaviors. Nevertheless, modeling a system poses challenges of its own: in particular, the model calibration step is often difficult due to insufficient data. When considering developmental systems, for example, mostly qualitative data describing the developmental trajectory are available, while common calibration techniques rely on high-resolution quantitative data. Focusing on the calibration of differential equation models for developmental systems, this study investigates different approaches to utilizing the available data to overcome these difficulties. More specifically, the fact that developmental processes are hierarchically organized is exploited to increase the convergence rate of the calibration process as well as to save computation time. Using a gene regulatory network model for stem cell homeostasis in Arabidopsis thaliana, the performance of the different approaches is evaluated, documenting considerable gains provided by the proposed hierarchical approach.
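The hierarchical calibration idea, fitting upstream parameters before downstream ones, can be sketched with a toy two-gene cascade (model, parameter values and grid search are illustrative, not the study's actual system or optimizer):

```python
import numpy as np

def simulate(k1, k2, t_end=10.0, dt=0.01):
    """Euler simulation of a toy two-gene cascade in which g1 drives g2."""
    steps = int(t_end / dt)
    g1 = g2 = 0.0
    traj = np.empty((steps, 2))
    for i in range(steps):
        g1 += dt * (k1 - 0.5 * g1)       # g1: production k1, linear decay
        g2 += dt * (k2 * g1 - 0.5 * g2)  # g2: induced by g1, linear decay
        traj[i] = g1, g2
    return traj

# "Observed" trajectory generated with known parameters.
true = simulate(k1=2.0, k2=0.8)

# Hierarchical calibration: fit the upstream parameter k1 against g1 first
# (g1 does not depend on k2), then fit k2 against g2 with k1 frozen,
# instead of searching the joint parameter space.
grid = np.linspace(0.1, 3.0, 60)
k1_hat = grid[np.argmin([((simulate(k, 1.0)[:, 0] - true[:, 0]) ** 2).sum()
                         for k in grid])]
k2_hat = grid[np.argmin([((simulate(k1_hat, k)[:, 1] - true[:, 1]) ** 2).sum()
                         for k in grid])]
```

Exploiting the hierarchy turns one 2D search into two 1D searches, which is the source of the convergence and runtime gains the abstract reports.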