980 results for Document imaging system
Abstract:
The feasibility of three-dimensional (3D) whole-heart imaging of the coronary venous (CV) system was investigated. The hypothesis that coronary magnetic resonance venography (CMRV) can be improved by using an intravascular contrast agent (CA) was tested. A simplified model of the contrast in T(2)-prepared steady-state free precession (SSFP) imaging was applied to calculate optimal T(2)-preparation durations for the various deoxygenation levels expected in venous blood. Non-contrast-agent (nCA)- and CA-enhanced images were compared for the delineation of the coronary sinus (CS) and its main tributaries. A quantitative analysis of the resulting contrast-to-noise ratio (CNR) and signal-to-noise ratio (SNR) in both approaches was performed. Precontrast visualization of the CV system was limited by the poor CNR between large portions of the venous blood and the surrounding tissue. Postcontrast, a significant increase in CNR between the venous blood and the myocardium (Myo) resulted in a clear delineation of the target vessels. The CNR improvement was 347% (P < 0.05) for the CS, 260% (P < 0.01) for the mid cardiac vein (MCV), and 430% (P < 0.05) for the great cardiac vein (GCV). The improvement in SNR was on average 155%, but was not statistically significant for the CS and the MCV. The signal of the Myo could be significantly reduced to about 25% (P < 0.001).
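The CNR and SNR comparisons above reduce to simple arithmetic. A minimal sketch of how a percent CNR improvement like those reported is computed; the signal intensities below are hypothetical placeholders, not data from the study:

```python
# Illustrative CNR arithmetic for contrast-enhanced CMRV.
# All signal values are hypothetical, not taken from the study.

def cnr(signal_vessel, signal_background, noise_sd):
    """Contrast-to-noise ratio between vessel and background tissue."""
    return (signal_vessel - signal_background) / noise_sd

# Hypothetical pre- and post-contrast signal intensities (arbitrary units)
pre_post = {
    "pre":  {"blood": 120.0, "myo": 100.0, "noise": 10.0},
    "post": {"blood": 180.0, "myo": 45.0,  "noise": 10.0},
}

cnr_pre = cnr(pre_post["pre"]["blood"], pre_post["pre"]["myo"], pre_post["pre"]["noise"])
cnr_post = cnr(pre_post["post"]["blood"], pre_post["post"]["myo"], pre_post["post"]["noise"])

# Percent improvement, the quantity reported for the CS, MCV, and GCV
improvement = 100.0 * (cnr_post - cnr_pre) / cnr_pre
print(f"CNR pre: {cnr_pre:.1f}, post: {cnr_post:.1f}, improvement: {improvement:.0f}%")
```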
Abstract:
Electromagnetic radiation at terahertz frequencies (0.1 THz to 10 THz) occupies the band between the optical band and the radio band. The scientific community's interest in this band has grown because of its potential for innovative imaging systems. Terahertz waves can generate extremely short pulses that achieve good spatial resolution and good penetration, and they allow microscopic structures to be identified through spectral analysis. The work carried out during the grant period was based on the development of systems operating in this frequency band. The main system is a total-power radiometer working at 0.1 THz for security imaging; its development also provided insight into the behavior of the component subsystems at these frequencies. In addition, a vector network analyzer was used to characterize materials and to perform active raster imaging. A materials-measurement system was designed and used to measure properties such as permittivity, losses, and water concentration. Finally, the design of a terahertz time-domain spectrometer (THz-TDS) system was started. This system will allow tomographic measurements with very high depth resolution while also providing spectral characterization of the sample material. The range of applications of such systems is very wide: from identifying cancerous skin tissue to characterizing the thickness of a car's painted surface.
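The sensitivity of a total-power radiometer like the 0.1 THz system described above is governed by the standard radiometer equation, ΔT = T_sys / √(B·τ). A minimal sketch; the system temperature, bandwidth, and integration time below are assumed figures for illustration, not values from the report:

```python
import math

def radiometer_sensitivity(t_sys_k, bandwidth_hz, integration_s):
    """Ideal total-power radiometer sensitivity: dT = T_sys / sqrt(B * tau)."""
    return t_sys_k / math.sqrt(bandwidth_hz * integration_s)

# Hypothetical front-end figures for a 0.1 THz total-power radiometer
dT = radiometer_sensitivity(t_sys_k=2000.0, bandwidth_hz=10e9, integration_s=30e-3)
print(f"Temperature resolution: {dT * 1000:.0f} mK")
```

Longer integration or wider bandwidth improves (lowers) ΔT as the square root of their product.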
Abstract:
Determining the groundwater flow paths of infiltrated river water is necessary for studying biochemical processes in the riparian zone, but their characterization is complicated by strong temporal and spatial heterogeneity. We investigated to what extent repeated 3D surface electrical resistivity tomography (ERT) can be used to monitor the transport of a salt-tracer plume under close-to-natural gradient conditions. The aim is to estimate groundwater flow velocities and pathways at a site within a riparian groundwater system adjacent to the perialpine Thur River in northeastern Switzerland. Our ERT time-lapse images provide constraints on the plume's shape, flow direction, and velocity, and they allow the movement of the plume to be followed for 35 m. Although the hydraulic gradient is only 1.43 parts per thousand, the ERT time-lapse images demonstrate that the plume's center of mass and its front propagate with velocities of 2 × 10⁻⁴ m/s and 5 × 10⁻⁴ m/s, respectively. These velocities are compatible with groundwater resistivity monitoring data in two observation wells 5 m from the injection well. Five additional sensors at distances of 5-30 m did not detect the plume. Comparison of the ERT time-lapse images with a groundwater transport model and with time-lapse inversions of synthetic ERT data indicates that the movement of the plume can be described by a uniform transport model for the first 6 h after injection; subsurface heterogeneity changes the plume's direction and velocity at later times. Our results demonstrate the effectiveness of time-lapse 3D surface ERT for monitoring flow pathways in a challenging perialpine environment over larger scales than is practically possible with crosshole 3D ERT.
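The reported velocities can be sanity-checked with simple arithmetic: at the stated front and center-of-mass velocities, the expected arrival times at the observation wells 5 m from the injection point are a few hours, consistent with the ~6 h window over which uniform transport was observed:

```python
# Back-of-the-envelope check of the reported plume velocities:
# travel time to an observation well 5 m from the injection well.

v_center = 2e-4   # m/s, centre-of-mass velocity from the abstract
v_front = 5e-4    # m/s, front velocity from the abstract
distance = 5.0    # m, distance to the observation wells

t_front_h = distance / v_front / 3600.0
t_center_h = distance / v_center / 3600.0
print(f"Front arrives after ~{t_front_h:.1f} h, centre of mass after ~{t_center_h:.1f} h")
```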
Abstract:
Report for the scientific sojourn carried out at the University Medical Center, Switzerland, from 2010 to 2012. Abundant evidence suggests that negative emotional stimuli are prioritized in the perceptual systems, eliciting enhanced neural responses in early sensory regions compared with neutral information. The amygdala and other limbic regions, such as the orbitofrontal cortex, may play a critical role in this facilitated detection by sending modulatory projections onto the sensory cortices via direct or indirect feedback. The present project investigated two issues regarding these mechanisms of emotional attention by means of functional magnetic resonance imaging. In Study I, we examined the modulatory effects of visual emotional signals on the processing of task-irrelevant visual, auditory, and somatosensory input, that is, the intramodal and crossmodal effects of emotional attention. We observed that brain responses to auditory and tactile stimulation were enhanced during the processing of visual emotional stimuli, compared to neutral stimuli, in bilateral primary auditory and somatosensory cortices, respectively. However, brain responses to task-irrelevant visual stimulation were diminished in left primary and secondary visual cortices under the same conditions. The results also suggested the existence of a multimodal network associated with emotional attention, presumably involving mediofrontal, temporal, and orbitofrontal regions. Finally, Study II examined the different brain responses along the low-level visual pathways and in limbic regions as a function of the number of retinal spikes during visual emotional processing. The experiment used stimuli generated by an algorithm that simulates how the visual system perceives a visual input after a given number of retinal spikes.
The results validated the visual model in human subjects and suggested differential emotional responses in the amygdala and visual regions as a function of spike level. A list of publications resulting from work in the host laboratory is included in the report.
Abstract:
This final-year project presents the design principles and prototype implementation of BIMS (Biomedical Information Management System), a flexible software system that provides an infrastructure to manage all information required by biomedical research projects. The BIMS project was initiated to address several limitations in the medical data acquisition of research projects in which Universitat Pompeu Fabra takes part. These limitations, stemming from the lack of control mechanisms to constrain the information submitted by clinicians, degrade data quality. BIMS can easily be adapted to manage information from a wide variety of clinical studies and is not limited to a given clinical specialty. The software can manage both textual information, such as clinical data (measurements, demographics, diagnostics, etc.), and several kinds of medical images (magnetic resonance imaging, computed tomography, etc.). Moreover, BIMS provides a web-based graphical user interface and is designed to be deployed in a distributed, multiuser environment. It is built on top of open-source software products and frameworks. Specifically, BIMS has been used to represent all clinical data currently used within the CardioLab platform (an ongoing project managed by Universitat Pompeu Fabra), demonstrating that it is a solid software system that could fulfill the requirements of a real production environment.
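As a rough illustration of the kind of input-control mechanism the abstract says was missing from free-form data entry, a range-validation check might look like the sketch below; the field name and limits are hypothetical, not taken from BIMS:

```python
# Minimal sketch of an input-validation layer for clinician-entered data.
# Field names and acceptable ranges are hypothetical examples.
from dataclasses import dataclass

@dataclass
class Measurement:
    name: str
    value: float
    lower: float
    upper: float

    def validate(self):
        """Reject values outside the clinically plausible range."""
        if not (self.lower <= self.value <= self.upper):
            raise ValueError(f"{self.name}={self.value} outside [{self.lower}, {self.upper}]")

m = Measurement(name="heart_rate_bpm", value=72.0, lower=20.0, upper=250.0)
m.validate()  # an in-range value passes silently
```

Rejecting implausible values at submission time, rather than during later analysis, is the control mechanism that protects data quality.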
Abstract:
Three standard radiation qualities (RQA 3, RQA 5 and RQA 9) and two screens, Kodak Lanex Regular and Insight Skeletal, were used to compare the imaging performance and dose requirements of the new Kodak Hyper Speed G and the current Kodak T-MAT G/RA medical x-ray films. The noise equivalent quanta (NEQ) and detective quantum efficiencies (DQE) of the four screen-film combinations were measured at three gross optical densities and compared with the characteristics of the Kodak CR 9000 system with GP (general purpose) and HR (high resolution) phosphor plates. The new Hyper Speed G film has double the intrinsic sensitivity of the T-MAT G/RA film and a higher contrast in the high optical density range for comparable exposure latitude. By providing both high sensitivity and high spatial resolution, the new film significantly improves the compromise between dose and image quality. As expected, the new film has a higher noise level and a lower signal-to-noise ratio than the standard film, although in the high-frequency range this is compensated for by better resolution, giving better DQE results, especially at high optical density. Both screen-film systems outperform the phosphor plates in terms of MTF and DQE for standard imaging conditions (Regular screen at RQA 5 and RQA 9 beam qualities). At low energy (RQA 3), the CR system has a low-frequency DQE comparable to that of screen-film systems when used with a fine screen at low and middle optical densities, and a superior low-frequency DQE at high optical density.
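The NEQ and DQE quantities compared above follow the standard textbook relations NEQ(f) = S²·MTF(f)²/NPS(f) and DQE(f) = NEQ(f)/q, with q the incident photon fluence. A minimal sketch with hypothetical detector values, not measurements from the study:

```python
# Textbook NEQ/DQE relations used to compare detectors such as
# screen-film combinations and CR phosphor plates.
# All numbers below are hypothetical placeholders.

freqs = [0.5, 1.0, 2.0, 4.0]        # spatial frequency (cycles/mm)
mtf = [0.95, 0.85, 0.60, 0.30]      # modulation transfer factor at each frequency
nps = [4e-5, 3e-5, 2e-5, 1e-5]      # noise power spectrum (mm^2)
signal = 1.0                        # large-area signal (arbitrary units)
q = 2.5e5                           # incident photon fluence (photons/mm^2)

neq = [signal**2 * m**2 / n for m, n in zip(mtf, nps)]  # NEQ(f) = S^2 MTF^2 / NPS
dqe = [x / q for x in neq]                              # DQE(f) = NEQ(f) / q
for f, d in zip(freqs, dqe):
    print(f"DQE({f} cy/mm) = {d:.3f}")
```

A falling NPS can partially offset a falling MTF at high frequency, which is how the noisier film can still deliver better high-frequency DQE.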
Abstract:
The present document should be seen as one more contribution to the debate on the reform processes, and as a small guide to these processes and their latest outcomes.
Abstract:
An efficient high-resolution three-dimensional (3-D) seismic reflection system for small-scale targets in lacustrine settings was developed. In Lake Geneva, near the city of Lausanne, Switzerland, past high-resolution two-dimensional (2-D) investigations revealed a complex fault zone (the Paudèze thrust zone), which was subsequently chosen for testing our system. Observed structures include a thin (<40 m) layer of subhorizontal Quaternary sediments that unconformably overlie southeast-dipping Tertiary Molasse beds and the Paudèze thrust zone, which separates Plateau and Subalpine Molasse units. Two complete 3-D surveys have been conducted over this same test site, covering an area of about 1 km². In 1999, a pilot survey (Survey I), comprising 80 profiles, was carried out in 8 days with a single-streamer configuration. In 2001, a second survey (Survey II) used a newly developed three-streamer system with optimized design parameters, which provided an exceptionally high-quality data set of 180 common midpoint (CMP) lines in 9 days. The main improvements include a navigation and shot-triggering system with in-house navigation software that automatically fires the gun in combination with real-time control on navigation quality using differential GPS (dGPS) onboard and a reference base near the lake shore. Shots were triggered at 5-m intervals with a maximum non-cumulative error of 25 cm.
Whereas the single 48-channel streamer system of Survey I required extrapolation of receiver positions from the boat position, for Survey II they could be accurately calculated (error <20 cm) with the aid of three additional dGPS antennas mounted on rafts attached to the end of each of the 24-channel streamers. Towed at a distance of 75 m behind the vessel, they allow the determination of feathering due to cross-line currents or small course variations. Furthermore, two retractable booms hold the three streamers at a distance of 7.5 m from each other, which is the same distance as the sail line interval for Survey I. With a receiver spacing of 2.5 m, the bin dimension of the 3-D data of Survey II is 1.25 m in the in-line direction and 3.75 m in the cross-line direction. The greater cross-line versus in-line spacing is justified by the known structural trend of the fault zone perpendicular to the in-line direction. The data from Survey I showed some reflection discontinuity as a result of insufficiently accurate navigation and positioning and the subsequent binning errors. Observed aliasing in the 3-D migration was due to insufficient lateral sampling combined with the relatively high-frequency (<2000 Hz) content of the water gun source (operated at 140 bars and 0.3 m depth). These results motivated the use of a double-chamber bubble-canceling air gun for Survey II. A 15/15 Mini G.I air gun, operated at 80 bars and 1 m depth, proved to be better adapted for imaging the complexly faulted target area, which has reflectors dipping up to 30°. Although its frequencies do not exceed 650 Hz, this air gun combines penetration of non-aliased signal to depths of 300 m below the water bottom (versus 145 m for the water gun) with a maximum vertical resolution of 1.1 m.
While Survey I was shot in patches of alternating directions, the optimized surveying time of the new three-streamer system allowed acquisition in parallel geometry, which is preferable when using an asymmetric configuration (single source and receiver array); otherwise, the resulting stacks differ for opposite shooting directions. However, the shorter streamer configuration of Survey II reduced the nominal fold from 12 to 6. A conventional 3-D processing flow was adapted to the high sampling rates and complemented by two computer programs that convert the unconventional navigation data to industry-standard formats. Processing included trace editing, geometry assignment, bin harmonization (to compensate for uneven fold due to boat/streamer drift), spherical divergence correction, bandpass filtering, velocity analysis, 3-D DMO correction, stacking, and 3-D time migration. A detailed semblance velocity analysis was performed on the 12-fold data set for every second in-line and every 50th CMP, i.e., on a total of 600 spectra. According to this velocity analysis, interval velocities range from 1450-1650 m/s for the unconsolidated sediments and from 1650-3000 m/s for the consolidated sediments. Delineation of several horizons and fault surfaces reveals the potential for small-scale geologic and tectonic interpretation in three dimensions. Five major seismic facies and their detailed 3-D geometries can be distinguished in vertical and horizontal sections: lacustrine sediments (Holocene), glaciolacustrine sediments (Pleistocene), Plateau Molasse, Subalpine Molasse, and its thrust fault zone. Dips of beds within the Plateau and Subalpine Molasse are ~8° and ~20°, respectively. Within the fault zone, many highly deformed structures with dips around 30° are visible. Preliminary tests with 3-D preserved-amplitude prestack depth migration demonstrate that the excellent data quality of Survey II allows application of such sophisticated techniques even to high-resolution seismic surveys.
In general, the adaptation of the 3-D marine seismic reflection method, which to date has almost exclusively been used by the oil exploration industry, to a smaller geographical as well as financial scale has helped pave the way for applying this technique to environmental and engineering purposes.

Seismic reflection is a method of subsurface investigation with very high resolving power. It consists of sending vibrations into the ground and collecting the waves that reflect off geological discontinuities at different depths and then return to the surface, where they are recorded. The recorded signals provide information not only on the nature and geometry of the layers present, but also allow a geological interpretation of the subsurface. For example, in the case of sedimentary rocks, seismic reflection profiles make it possible to determine their mode of deposition, their possible deformations or fractures, and thus their tectonic history. Seismic reflection is the main method of petroleum exploration. For a long time, seismic reflection data were acquired along individual profiles that provide a two-dimensional image of the subsurface. The images obtained in this way are only partially accurate, since they do not account for the three-dimensional nature of geological structures. Over the past few decades, three-dimensional (3-D) seismics has brought new life to the study of the subsurface. Although the method is now fully mastered for imaging large geological structures both onshore and offshore, its adaptation to lacustrine or fluvial scales has so far been the subject of only rare studies. This thesis consisted of developing a seismic acquisition system similar to that used for offshore petroleum prospecting, but adapted to lakes. It is therefore smaller, lighter to deploy, and above all delivers final images of much higher resolution. Whereas the petroleum industry is often limited to a resolution on the order of ten meters, the instrument developed in this work can resolve details on the order of one meter. The new system relies on recording seismic reflections simultaneously on three seismic cables (streamers) of 24 channels each. To obtain 3-D data, it is essential to position the instruments on the water (the seismic source and receivers) with great precision. Software was specially developed to control navigation and trigger the shots of the seismic source using differential GPS (dGPS) receivers on the boat and at the end of each streamer, allowing the instruments to be positioned with an accuracy of about 20 cm. To test our system, we chose an area of Lake Geneva, near the city of Lausanne, crossed by the Paudèze fault, which separates the Plateau Molasse and Subalpine Molasse units. Two 3-D seismic surveys were carried out there over an area of about 1 km². The seismic recordings were then processed to turn them into interpretable images. We applied a 3-D processing sequence specially adapted to our data, particularly regarding positioning. After processing, the data reveal several main seismic facies corresponding notably to lacustrine sediments (Holocene), glaciolacustrine sediments (Pleistocene), the Plateau Molasse, the Subalpine Molasse of the fault zone, and the Subalpine Molasse south of this zone. The detailed 3-D geometry of the faults is visible on vertical and horizontal seismic sections. The excellent data quality and the interpretation of several horizons and fault surfaces demonstrate the potential of this technique for small-scale three-dimensional investigations, opening the way to its application in the environmental and civil engineering domains.
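The acquisition geometry described above fixes the CMP bin size directly: the in-line bin is half the receiver spacing, and the cross-line bin is half the streamer separation, reproducing the stated 1.25 m × 3.75 m bins:

```python
# Acquisition-geometry arithmetic for the three-streamer system (Survey II).
receiver_spacing = 2.5      # m, along each 24-channel streamer
streamer_separation = 7.5   # m, held by the two retractable booms

# CMP bin dimensions: half the receiver spacing in-line,
# half the streamer separation cross-line.
bin_inline = receiver_spacing / 2.0
bin_crossline = streamer_separation / 2.0
print(f"bin: {bin_inline} m in-line x {bin_crossline} m cross-line")
```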
Abstract:
Until recently, the study of textbooks has been neglected in the history of the sciences. However, textbooks can provide fruitful sources of information regarding the way authors work and the development of a particular discipline.
This dissertation reviews editions of a single textbook, the Sémiologie des affections du système nerveux (1914) by Jules Dejerine (1849-1917). This textbook enabled three axes of research. Firstly, by comparing the book to a first edition published as a chapter, one can recognize an extensive remodeling of the content of the book, suggesting a vast increase in knowledge over time. Secondly, by looking at the authors that Dejerine quotes repeatedly, it becomes possible to recreate his professional network, to review the works of these authors, and to determine how they cross-reference each other. Thirdly, these textbooks contain numerous medical illustrations, which are linked with the concept of "proof": the authors demonstrate a willingness to "show" the lesion or the pathology by publishing an image. Drawings, schematic representations, radiographs, and photographs of patients or of anatomical preparations all have their own purpose in describing the lesion and the progression of the disease, and they assist in the diagnosis of the pathology. By looking at all of these aspects, it is possible to conclude that a neurological textbook is a dynamic object that evolves through re-editions, through comments and references found in other textbooks, and through the circulation of parts of these books, such as the images. The illustrations also carry the author's authority, since their ongoing use implies that the image owner's work has been endorsed by others; at the same time, it validates the borrowers' arguments. By using medical illustrations from different authors worldwide, the authors are also laying claim to a common language, to a similar way of examining patients, and to a shared reliance on medical imagery to prove their points. In that sense, by focusing upon these textbooks, one can affirm that neurology already existed as a worldwide specialty at the turn of the twentieth century.
Much more than mere accompaniments to the text, images were of paramount importance to the unification of neurology.
Resumo:
The goal of this study was to investigate the performance of 3D synchrotron differential phase contrast (DPC) imaging for the visualization of both macroscopic and microscopic aspects of atherosclerosis in the mouse vasculature ex vivo. The hearts and aortas of 2 atherosclerotic and 2 wild-type control mice were scanned with DPC imaging with an isotropic resolution of 15 μm. The coronary artery vessel walls were segmented in the DPC datasets to assess their thickness, and histological staining was performed at the level of atherosclerotic plaques. The DPC imaging allowed for the visualization of complex structures such as the coronary arteries and their branches, the thin fibrous cap of atherosclerotic plaques as well as the chordae tendineae. The coronary vessel wall thickness ranged from 37.4 ± 5.6 μm in proximal coronary arteries to 13.6 ± 3.3 μm in distal branches. No consistent differences in coronary vessel wall thickness were detected between the wild-type and atherosclerotic hearts in this proof-of-concept study, although the standard deviation in the atherosclerotic mice was higher in most segments, consistent with the observation of occasional focal vessel wall thickening. Overall, DPC imaging of the cardiovascular system of the mice allowed for a simultaneous detailed 3D morphological assessment of both large structures and microscopic details.
Resumo:
The Breast Imaging Reporting and Data System (BI-RADS™) of the American College of Radiology was designed to standardize mammographic reports, reduce confounding factors in the description and interpretation of images, and facilitate monitoring of final outcomes. OBJECTIVE: To identify how BI-RADS™ has been used, generating information that may help the Colégio Brasileiro de Radiologia develop strategies to improve its use. MATERIALS AND METHODS: Data were collected in the city of Goiânia, GO, Brazil. All women presenting to the service for mammography between January 2003 and June 2003 were asked for their previous mammography reports. Previous examinations performed between July 1, 2001, and June 30, 2003, were included in the analysis. RESULTS: A total of 104 previous reports were collected, issued by 40 radiologists from 33 different services. Of the 104 reports, 77% (n = 80) used BI-RADS™. Of these, only 15% (n = 12) were concise, none followed the structure and organization recommended by the system, 98.75% (n = 79) did not adhere to the lexicon, and 65% (n = 51) made no management recommendation. CONCLUSION: Although widely used, BI-RADS™ was not recognized as a system for standardizing reports. It was used almost exclusively as a final classification of the examinations.
Resumo:
OBJECTIVE: To evaluate articles in the literature that assess the positive predictive value of Breast Imaging Reporting and Data System (BI-RADS®) categories 3, 4, and 5. MATERIALS AND METHODS: A Medline search was performed using the terms "predictive value" and "BI-RADS". Eleven articles were included in this review. RESULTS: The positive predictive value of categories 3, 4, and 5 ranged between 0% and 8%, 4% and 62%, and 54% and 100%, respectively. Three articles concurrently evaluated the morphological criteria of the lesions with the highest positive predictive value on mammography; a spiculated mass was the criterion with the highest positive predictive value. CONCLUSION: The positive predictive value of BI-RADS® categories 3, 4, and 5 varied widely across all studies, but methodological differences were identified that limited comparison between them.
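As an aside, the positive predictive value compared across these studies can be sketched numerically. A minimal sketch, assuming hypothetical biopsy counts (not data from the review): the PPV of a BI-RADS category is the fraction of biopsied lesions in that category that prove malignant.

```python
def ppv(malignant, benign):
    """Positive predictive value for one BI-RADS category:
    PPV = malignant / (malignant + benign)."""
    total = malignant + benign
    return malignant / total if total else 0.0

# Hypothetical biopsy outcomes per category (malignant, benign):
outcomes = {3: (2, 98), 4: (30, 70), 5: (90, 10)}
for cat, (mal, ben) in sorted(outcomes.items()):
    print(f"BI-RADS {cat}: PPV = {ppv(mal, ben):.0%}")
```

Differences in which lesions each study sent to biopsy change the denominator, which is one reason the reported PPV ranges vary so widely between articles.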
Resumo:
Stability of airborne nanoparticle agglomerates is important for occupational exposure and risk assessment when determining the particle size distribution of nanomaterials. In this study, we developed an integrated method to test the stability of aerosols created from different types of nanomaterials. An aerosolization method resembling an industrial fluidized-bed process was used to aerosolize dry nanopowders. We produced aerosols with stable particle number concentrations and size distributions, which was important for characterizing the aerosols' properties. Next, in order to test their potential for deagglomeration, a critical orifice was used to apply a range of shear forces to them. The mean particle size of the tested aerosols decreased, whereas the total number of particles generated increased: the fraction of particles in the lower size range grew, and the fraction in the upper size range shrank. The reproducibility and repeatability of the results were good. Transmission electron microscopy imaging showed that most of the nanoparticles were still agglomerated after passing through the orifice; however, primary particle geometry differed considerably between materials. These results are encouraging for the use of our system in routine tests of the deagglomeration potential of nanomaterials. Furthermore, the particle concentrations and the small quantities of raw material required suggest that our system might also serve as an alternative method for testing dustiness in existing processes.
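The size-distribution shift described above can be quantified simply. A minimal sketch, assuming hypothetical bin edges and counts (illustrative numbers, not data from the study): deagglomeration shows up as an increase in the fine number fraction of the distribution after the orifice.

```python
def size_fractions(bin_edges, counts, cutoff):
    """Split a binned number-size distribution at `cutoff` (nm),
    assigning each bin by its midpoint; returns (fine, coarse) fractions."""
    total = sum(counts)
    fine = sum(c for (lo, hi), c in zip(zip(bin_edges, bin_edges[1:]), counts)
               if (lo + hi) / 2 < cutoff)
    return fine / total, 1 - fine / total

edges = [20, 50, 100, 200, 400, 800]   # bin edges in nm (hypothetical)
before = [5, 20, 60, 80, 35]           # counts per bin, pre-orifice
after = [15, 45, 70, 50, 20]           # counts per bin, post-orifice

for label, counts in (("before", before), ("after", after)):
    fine, coarse = size_fractions(edges, counts, cutoff=150)
    print(f"{label}: fine fraction = {fine:.2f}, coarse fraction = {coarse:.2f}")
```

Comparing the fine fraction before and after the orifice at a fixed cutoff gives a single repeatable number for the deagglomeration effect of a given shear force.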