Resumo:
INTRODUCTION: The nutrition of very low birth weight (VLBW) infants is aimed at promoting growth similar to that occurring in the uterus. However, in practice this is difficult to achieve, and extrauterine growth restriction is frequent. The current tendency is to avoid this restriction by means of early parenteral and enteral nutrition. Nonetheless, uncertainty about many nutrition-related practices has resulted in great variation in the way nutrition is undertaken. In 2009 and 2011 there was an unexpected increase in necrotizing enterocolitis in our hospital. To check whether our nutrition policy was involved, we undertook a systematic review and drew up clinical practice guidelines (CPG) on enteral feeding in VLBW infants. New considerations about the duration of fortification and the use of probiotics have led to an update of these CPG. METHODS: A total of 21 clinical questions were designed dealing with the type of milk, starting age, mode of administration, rate and volume of the increments, fortification, use of probiotics, and protocol. After conducting a systematic search of the available evidence, the information was contrasted and summarized in order to draw up the recommendations. The quality of the evidence and the strength of the recommendations were determined using the SIGN scale. COMMENT: These CPG aim to help physicians in their decision making. The protocolized application of well-proven measures reduces variation in clinical practice and improves results.
Resumo:
BACKGROUND: Atrial fibrillation (AF) is widely regarded as being initiated by left atrial (LA) dilatation, with subsequent dilatation of the right atrium (RA) in those who progress to chronic AF. We hypothesized that in adult patients with right-sided congenital heart disease (CHD) and AF, RA dilatation would predominate, with subsequent dilatation of the left atrium, as a mirror image. METHODS: Adult patients with a diagnosis of right-sided CHD, ASD or left-sided CHD who had undergone an echocardiographic study and electrocardiographic recording in 2007 were included. RA and LA areas were measured from the apical view. AF was diagnosed from a 12-lead electrocardiogram or Holter recording. A multivariate logistic regression model was used to identify predictors of AF, and linear regression models were used to measure the relationship between RA and LA area and AF. RESULTS: A total of 291 patients were included in the study. Multivariate analysis showed that age (p=0.0001), RA area (p=0.025) and LA area (p=0.0016) were significantly related to AF. In patients with purely left-sided pathologies, there was progressive and predominant LA dilatation that paralleled the development of AF from none to paroxysmal to chronic AF. In patients with purely right-sided pathologies, there was a mirror image of progressive and predominant RA dilatation with the development of AF. CONCLUSION: We observed mirror-image atrial dilatation in patients with right-sided disease and AF. This may provide novel mechanistic insight into the origin of AF in these patients and deserves further study in the form of targeted electrophysiological studies.
Resumo:
Evaluation of segmentation methods is a crucial aspect of image processing, especially in the medical imaging field, where small differences between segmented regions of the anatomy can be of paramount importance. Usually, segmentation evaluation is based on a measure that depends on the number of segmented voxels inside and outside certain reference regions, called gold standards. Although other measures have also been used, in this work we propose a set of new similarity measures based on different features, such as the location and intensity values of the misclassified voxels, and the connectivity and boundaries of the segmented data. Using the multidimensional information provided by these measures, we propose a new evaluation method whose results are visualized by applying a Principal Component Analysis of the data, yielding a simplified graphical method to compare different segmentation results. We have carried out an intensive study using several classic segmentation methods applied to a set of simulated brain MRI data with several noise and RF inhomogeneity levels, as well as to real data. The study shows that the new measures proposed here, and the results obtained from the multidimensional evaluation, improve the robustness of the evaluation and provide a better understanding of the differences between segmentation methods.
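As a hedged illustration of the multidimensional evaluation idea (the masks, the three measures and the PCA projection below are simplified stand-ins, not the paper's exact feature set), several complementary measures can be computed per segmentation and projected onto two principal components for graphical comparison:

```python
import numpy as np

def dice(a, b):
    """Dice overlap between two binary masks."""
    inter = np.logical_and(a, b).sum()
    return 2.0 * inter / (a.sum() + b.sum())

def jaccard(a, b):
    """Intersection-over-union of two binary masks."""
    return np.logical_and(a, b).sum() / np.logical_or(a, b).sum()

def misclassified_fraction(a, b):
    """Fraction of voxels on which the two masks disagree."""
    return np.logical_xor(a, b).mean()

def pca_2d(measures):
    """Project a (methods x measures) matrix onto its first two principal components."""
    x = measures - measures.mean(axis=0)
    _, _, vt = np.linalg.svd(x, full_matrices=False)
    return x @ vt[:2].T

# toy example: a reference mask and two candidate segmentations
ref = np.zeros((8, 8), bool); ref[2:6, 2:6] = True
seg1 = np.zeros_like(ref); seg1[2:6, 2:5] = True   # undersegmentation
seg2 = np.zeros_like(ref); seg2[1:7, 1:7] = True   # oversegmentation

m = np.array([[dice(s, ref), jaccard(s, ref), misclassified_fraction(s, ref)]
              for s in (seg1, seg2)])
coords = pca_2d(m)   # 2-D map in which similar methods cluster together
```

In the 2-D map, each segmentation method becomes a point, so methods with similar error profiles across all measures land close together.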
Resumo:
Mosaics have commonly been used as visual maps for undersea exploration and navigation. The position and orientation of an underwater vehicle can be calculated by integrating the apparent motion of the images that form the mosaic. A feature-based mosaicking method is proposed in this paper. The creation of the mosaic is accomplished in four stages: feature selection and matching, detection of points describing the dominant motion, homography computation, and mosaic construction. In this work we demonstrate that the use of color and texture as discriminative properties of the image can improve, to a large extent, the accuracy of the constructed mosaic. The system is able to provide 3D metric information concerning the vehicle motion using knowledge of the intrinsic parameters of the camera while integrating the measurements of an ultrasonic sensor. The method has been tested experimentally on real images acquired by the GARBI underwater vehicle.
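The homography-computation stage can be sketched with a standard Direct Linear Transform (DLT) estimate from matched points; the correspondences below are synthetic and exact, whereas a real mosaicking pipeline would feed in noisy feature matches and add robust outlier rejection:

```python
import numpy as np

def homography_dlt(src, dst):
    """Estimate the 3x3 homography mapping src -> dst (N >= 4 point pairs)
    via the Direct Linear Transform: the null space of the stacked
    constraint matrix gives the homography up to scale."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, vt = np.linalg.svd(np.asarray(A, float))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]          # normalise so H[2,2] == 1

def apply_h(H, pt):
    """Apply a homography to a 2-D point (homogeneous coordinates)."""
    p = H @ np.array([pt[0], pt[1], 1.0])
    return p[:2] / p[2]

# synthetic check: points related by a known transform
src = [(0, 0), (1, 0), (1, 1), (0, 1), (0.5, 0.25)]
true_H = np.array([[2.0, 0.0, 3.0],
                   [0.0, 2.0, -1.0],
                   [0.0, 0.0, 1.0]])
dst = [apply_h(true_H, p) for p in src]
H = homography_dlt(src, dst)    # recovers true_H
```

Chaining such frame-to-frame homographies is what lets the apparent image motion be integrated into a mosaic and a vehicle trajectory.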
Resumo:
The behaviour of the harmonic infrared frequency of diatomic molecules subjected to moderate static uniform electric fields is analysed. The potential energy expression is developed as a function of a static uniform electric field, which yields a formulation describing the frequency versus field strength curve. With the help of the first and second derivatives of the expressions obtained, which correspond to the first- and second-order Stark effects, it was possible to find the maxima of the frequency versus field strength curves for a series of molecules using a Newton-Raphson search. A method is proposed that requires only the calculation of a few energy derivatives at a particular value of the field strength. At the same time, the expression for the dependence of the interatomic distance on the electric field strength is derived, and the minimum of this curve is found for the same species. The derived expressions and numerical results are discussed and compared with other studies.
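The Newton-Raphson search for the maximum of the frequency versus field strength curve amounts to driving the first derivative to zero using the second derivative. A minimal sketch, with an illustrative quadratic Stark-like curve rather than computed energy derivatives:

```python
def newton_max(f, x0, h=1e-5, tol=1e-10, max_iter=50):
    """Locate a stationary point of f (e.g. the maximum of a
    frequency-vs-field-strength curve) by Newton-Raphson on the first
    derivative, estimated here by central finite differences."""
    x = x0
    for _ in range(max_iter):
        d1 = (f(x + h) - f(x - h)) / (2 * h)          # first derivative
        d2 = (f(x + h) - 2 * f(x) + f(x - h)) / (h * h)  # second derivative
        step = d1 / d2
        x -= step
        if abs(step) < tol:
            break
    return x

# illustrative curve: omega(F) = omega0 + a*F - b*F**2, maximum at F = a/(2b)
def omega(F):
    return 2000.0 + 30.0 * F - 5.0 * F ** 2

F_max = newton_max(omega, x0=1.0)   # analytic maximum at F = 3.0
```

For a quadratic curve a single Newton step suffices; for the real field-dependent frequency expressions the same iteration converges from a nearby starting field strength.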
Resumo:
Bilingual speakers have slower and less robust lexical access than monolinguals, even when speaking in their native and dominant language. This phenomenon, commonly called the "bilingual disadvantage", is also observed in second-language speakers compared with first-language speakers. One possible contributing cause of these disadvantages is the use of inhibitory control during language production: inhibiting co-activated words from the language not currently in use may prevent intrusions from that language, but at the same time slow down speech production. The first aim of the studies described in this report was to test this hypothesis through different predictions generated by theories of inhibitory language control. A second aim was to investigate the scope of the bilingual disadvantage within and beyond the production of isolated words, and to advance our understanding of the variables that modulate it. Regarding the first aim, the evidence obtained is incompatible with global inhibitory control, challenging the idea of bilingual-specific mechanisms for lexical selection. This implies that a common explanation for language control and for the bilingual disadvantage in lexical access is implausible. Regarding the second aim, the results show that (a) the bilingual disadvantage does not affect memory access; (b) the bilingual disadvantage extends to the production of connected speech; and (c) cross-language similarity at different levels of representation, as well as frequency of use, are factors that modulate the bilingual disadvantage.
Resumo:
The H∞ synchronization problem for the master-slave structure of second-order neutral systems with time-varying delays is presented in this paper. Delay-dependent sufficient conditions for the design of a delayed output-feedback controller are given by the Lyapunov-Krasovskii method in terms of a linear matrix inequality (LMI). A controller that guarantees H∞ synchronization of the master and slave structure using some free weighting matrices is then developed. A numerical example is given to show the effectiveness of the method.
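A minimal numerical sketch of master-slave synchronization for a second-order system (the time-varying delays, free weighting matrices and the H∞/LMI design of the paper are omitted; the feedback gains k1 and k2 are illustrative choices, not derived from the paper's conditions):

```python
def simulate(k1=8.0, k2=4.0, dt=1e-3, steps=20000):
    """Euler simulation of a master oscillator and a slave copy driven by
    a feedback correction on the synchronization error; returns the final
    synchronization error |xm - xs| + |vm - vs|."""
    xm, vm = 1.0, 0.0       # master state (position, velocity)
    xs, vs = -0.5, 0.3      # slave starts from a different state
    for _ in range(steps):
        am = -xm                                   # master: x'' = -x
        e = xm - xs                                # position error
        a_slave = -xs + k1 * e + k2 * (vm - vs)    # slave + feedback correction
        xm, vm = xm + dt * vm, vm + dt * am
        xs, vs = xs + dt * vs, vs + dt * a_slave
    return abs(xm - xs) + abs(vm - vs)

final_error = simulate()   # with feedback on, the error decays toward zero
```

With the gains set to zero the two oscillators never synchronize, which is exactly what the error-feedback term prevents; the LMI machinery of the paper serves to certify such gains under delays and disturbances.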
Resumo:
Abstract: This article examines the interplay of text and image in The Fairy Tales of Charles Perrault (1977), translated by Angela Carter and illustrated by Martin Ware, as a form of intersemiotic dialogue that sheds new light on Carter's work. It argues that Ware's highly original artwork based on the translation not only calls into question the association of fairy tales with children's literature (which still characterizes Carter's translation), but also captures an essential if heretofore neglected aspect of Carter's creative process, namely the dynamics between translating, illustrating and rewriting classic tales. Several elements from Ware's illustrations are indeed taken up and elaborated on in The Bloody Chamber and Other Stories (1979), the collection of "stories about fairy stories" that made Carter famous. These include visual details and strategies that she transposed to the realm of writing, giving rise to reflections on the relation between visuality and textuality.
Resumo:
Computed tomography (CT) is an imaging technique in which interest has been growing since it first came into use in the early 1970s. In the clinical environment, this imaging system has emerged as a gold-standard modality because of its high sensitivity in producing accurate diagnostic images. However, even if a direct benefit to patient healthcare is attributed to CT, the dramatic increase in the number of CT examinations performed has raised concerns about the potential negative effects of ionizing radiation on the population. To ensure a benefit-risk balance that works in favor of the patient, it is important to balance image quality and dose in order to avoid unnecessary patient exposure.

If this balance is important for adults, it should be an absolute priority for children undergoing CT examinations, especially for patients suffering from diseases requiring several follow-up examinations over their lifetime. Indeed, children and young adults are more sensitive to ionizing radiation and have a longer life expectancy than adults. For this population, the risk of developing a cancer whose latency period can exceed 20 years is significantly higher than for adults. Assuming that each examination is justified, it becomes a priority to optimize CT acquisition protocols in order to minimize the dose delivered to the patient. Over the past few years, CT technology has been advancing at a rapid pace. Since 2009, new iterative image reconstruction techniques, called statistical iterative reconstructions, have been introduced in order to decrease patient exposure and improve image quality.

The goal of the present work was to determine the potential of statistical iterative reconstructions to reduce dose as much as possible in CT examinations of children and young adults without compromising the image quality needed for diagnosis, in order to propose optimized protocols.

The optimization step requires evaluating both the delivered dose and the image quality useful for diagnosis. While the dose is estimated using CT indices (CTDIvol and DLP), the particularity of this research was to use two radically different approaches to evaluate image quality. The first approach, called the "physical approach", computed physical metrics (SD, MTF, NPS, etc.) measured on phantoms under well-defined conditions. Although this technique has limitations, because it does not take the radiologist's perspective into account, it enables the physical characterization of image properties in a simple and timely way. The second approach, called the "clinical approach", was based on the evaluation of anatomical structures (diagnostic criteria) present on patient images. Radiologists involved in the assessment step were asked to score the image quality of these structures for diagnostic purposes using a simple rating scale. This approach is relatively complicated to implement and time-consuming; nevertheless, it has the advantage of being very close to the radiologist's practice and can be considered a reference method.

Primarily, this work revealed that the statistical iterative reconstructions studied in the clinic (ASIR and Veo) have a strong potential to reduce CT dose (by up to 90%). However, by their mechanisms, they modify the appearance of the image, producing a change in texture that may affect the quality of the diagnosis. By comparing the results of the "clinical" and "physical" approaches, it was shown that this change in texture corresponds to a modification of the noise power spectrum, whose analysis makes it possible to anticipate or avoid a loss of diagnostic quality. This project also demonstrated that integrating these new statistical iterative reconstruction techniques into clinical practice is complex and cannot be done simply on the basis of protocols that use conventional reconstructions. The conclusions of this work and the image-quality tools developed will be able to guide future studies in the field of image quality, such as texture analysis or model observers dedicated to CT.
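The noise power spectrum (NPS) analysis mentioned above can be sketched as follows; the ROI size, pixel spacing and white-noise input are illustrative assumptions, not values from this work:

```python
import numpy as np

def radial_nps(noise_rois, pixel_mm=0.5):
    """2-D noise power spectrum averaged over square noise-only ROIs, then
    binned radially - a common simplification of the CT NPS estimate."""
    rois = [r - r.mean() for r in noise_rois]           # detrend each ROI
    n = rois[0].shape[0]
    nps2d = np.zeros((n, n))
    for r in rois:
        nps2d += np.abs(np.fft.fft2(r)) ** 2
    nps2d *= (pixel_mm ** 2) / (len(rois) * n * n)      # normalise (mm^2 units)
    nps2d = np.fft.fftshift(nps2d)                      # DC term to the centre
    # radial binning around the zero-frequency centre
    y, x = np.indices((n, n))
    rr = np.hypot(x - n // 2, y - n // 2).astype(int)
    counts = np.bincount(rr.ravel())
    sums = np.bincount(rr.ravel(), weights=nps2d.ravel())
    return sums[counts > 0] / counts[counts > 0]

# illustrative input: white Gaussian noise, for which the NPS is roughly flat;
# iterative reconstructions instead shift power between frequency bands
rng = np.random.default_rng(0)
rois = [rng.normal(0.0, 10.0, (64, 64)) for _ in range(8)]
nps = radial_nps(rois)
```

Comparing such radial NPS curves between a conventional and an iterative reconstruction is precisely what reveals the texture change described above: the total noise may drop while its frequency content shifts.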
Resumo:
Wear of polyethylene is associated with aseptic loosening of orthopaedic implants and has been observed in hip and knee prostheses and in anatomical implants for the shoulder. Reversed shoulder prostheses have not yet been assessed. We investigated the volumetric polyethylene wear of the reversed and anatomical Aequalis shoulder prostheses using a mathematical musculoskeletal model. Movement and joint stability were achieved by EMG-controlled activation of the muscles. A non-constant wear factor was considered. Simulated activities of daily living were estimated from in vivo recorded data. After one year of use, the volumetric wear was 8.4 mm(3) for the anatomical prosthesis, but 44.6 mm(3) for the reversed version. For the anatomical prosthesis the predictions for contact pressure and wear were consistent with biomechanical and clinical data. The abrasive wear of the polyethylene in reversed prostheses should not be underestimated, and further analysis, both experimental and clinical, is required.
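Volumetric wear estimates of this kind are often built on Archard-type terms; the sketch below, with a load-dependent (non-constant) wear factor, uses purely illustrative activity data and coefficients, not the model or values of this study:

```python
def wear_volume(activities, k0=1.0e-6, alpha=0.2):
    """Archard-type volumetric wear, V = sum_i k(p_i) * F_i * s_i, with a
    wear factor that grows with contact pressure (non-constant k).
    activities: list of (contact_pressure_MPa, load_N, sliding_distance_m).
    All coefficients here are illustrative placeholders."""
    total = 0.0
    for p, F, s in activities:
        k = k0 * (1.0 + alpha * p)   # pressure-dependent wear factor
        total += k * F * s           # wear contribution of one activity
    return total

# two hypothetical daily motions: (pressure, load, sliding distance)
daily = [(5.0, 300.0, 0.02), (12.0, 700.0, 0.01)]
annual = 365 * wear_volume(daily)    # accumulate over one year of use
```

Summing such terms over EMG-driven simulated activities of daily living, with contact pressures from the musculoskeletal model, is what yields annual wear volumes of the kind reported above.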
Resumo:
In medical imaging, merging automated segmentations obtained from multiple atlases has become standard practice for improving accuracy. In this letter, we propose two new fusion methods: "Global Weighted Shape-Based Averaging" (GWSBA) and "Local Weighted Shape-Based Averaging" (LWSBA). These methods extend the well-known Shape-Based Averaging (SBA) by additionally incorporating similarity information between the reference (i.e., atlas) images and the target image to be segmented. We also propose a new spatially-varying similarity-weighted neighborhood prior model and an edge-preserving smoothness term that can be used with many of the existing fusion methods. We first present our new Markov Random Field (MRF) based fusion framework that models the above-mentioned information. The proposed methods were evaluated in the context of segmentation of lymph nodes in 3D CT images of the head and neck, and they resulted in more accurate segmentations than the existing SBA.
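A simplified, hedged sketch of similarity-weighted Shape-Based Averaging on toy 2-D masks (brute-force distance maps; the actual GWSBA/LWSBA methods and the MRF framework of the letter are considerably more elaborate):

```python
import numpy as np

def signed_distance(mask):
    """Brute-force signed Euclidean distance map of a small binary mask:
    negative inside the object, positive outside."""
    pts_in = np.argwhere(mask)
    pts_out = np.argwhere(~mask)
    d = np.zeros(mask.shape)
    for idx in np.ndindex(mask.shape):
        tgt = pts_out if mask[idx] else pts_in
        d[idx] = np.sqrt(((tgt - np.array(idx)) ** 2).sum(axis=1)).min()
    return np.where(mask, -d, d)

def weighted_sba(atlas_masks, weights):
    """Similarity-weighted Shape-Based Averaging: fuse binary atlas
    segmentations by averaging their signed distance maps with
    per-atlas similarity weights, then thresholding at zero."""
    w = np.asarray(weights, float)
    w /= w.sum()
    fused = sum(wi * signed_distance(m) for wi, m in zip(w, atlas_masks))
    return fused < 0

# two overlapping toy atlas segmentations; the weights stand in for
# atlas-to-target similarity scores
a1 = np.zeros((9, 9), bool); a1[2:6, 2:6] = True
a2 = np.zeros((9, 9), bool); a2[3:7, 3:7] = True
fused = weighted_sba([a1, a2], weights=[0.9, 0.1])   # leans toward atlas a1
```

Because the fusion happens in the signed-distance domain rather than by voxel-wise voting, the fused boundary interpolates smoothly between the atlas shapes, with more similar atlases pulling it harder.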
Resumo:
Report for Iowa Utilities Board
Resumo:
IL-6 plays a central role in supporting pathological TH2 and TH17 cell development and in inhibiting protective regulatory T cells in allergic asthma. TH17 cells have been demonstrated to regulate allergic asthma in general, and T-bet-deficiency-induced asthma in particular. Here we found an inverse correlation between T-bet and Il-6 mRNA expression in asthmatic children. Moreover, experimental subcutaneous immunotherapy (SIT) in T-bet((-/-)) mice inhibited IL-6, IL-21R and lung TH17 cells in a setting of asthma. Finally, local delivery of an anti-IL-6R antibody in T-bet((-/-)) mice resulted in the resolution of this allergic trait. Notably, BATF, which is crucial for the immunoglobulin class switch and for TH2 and TH17 development, was found to be down-regulated in the lungs of T-bet((-/-)) mice after SIT and after treatment with anti-IL-6R antibody, indicating a critical role of IL-6 in controlling BATF/IRF4 integrated functions in TH2 cells, TH17 cells and B cells, also in a T-bet-independent fashion, in allergic asthma.
The role of energetic value in dynamic brain response adaptation during repeated food image viewing.
Resumo:
The repeated presentation of simple objects, as well as biologically salient objects, can cause adaptation of behavioral and neural responses during the visual categorization of these objects. Mechanisms of response adaptation during repeated food viewing are of particular interest for better understanding food intake beyond energetic needs. Here, we measured visual evoked potentials (VEPs) and conducted neural source estimations for initial and repeated presentations of high-energy and low-energy foods as well as non-food images. The results of our study show that behavioral and neural responses to food and food-related objects are not uniformly affected by repetition. While the repetition of images displaying low-energy foods and non-food objects modulated VEPs as well as their underlying neural sources and increased behavioral categorization accuracy, the responses to high-energy food images remained largely invariant between initial and repeated encounters. Brain mechanisms engaged when viewing images of high-energy foods thus appear less susceptible to repetition effects than responses to low-energy and non-food images. This finding is likely related to the superior reward value of high-energy foods, and might be one reason why high-energy foods in particular are indulged in even though they can lead to detrimental health consequences.