912 results for Image analysis method


Relevance:

90.00%

Publisher:

Abstract:

It has been known since the 1970s that laser beams are suitable for processing paper materials. In this thesis, the term "paper materials" covers all wood-fibre based materials, such as dried pulp, copy paper, newspaper, cardboard, corrugated board and tissue paper. Accordingly, "laser processing" here means all laser treatments that remove material, such as cutting, partial cutting, marking, creasing and perforation. Laser technology offers many advantages for processing paper materials: it is a non-contact method, it allows freedom of processing geometry, and it is a reliable technology for non-stop production. The packaging industry in particular is a very promising area for laser processing applications. However, even at the beginning of the 2010s there were only a few industrial laser processing applications worldwide. One reason for the small-scale use of lasers in paper material manufacturing is the shortage of published research and scientific articles. Another problem restraining the use of lasers for processing paper materials is the colouration of the material, i.e. the yellowish and/or greyish colour of the cut edge appearing during or after cutting. These are the main reasons why the topic of this thesis concerns the characterization of the interaction between a laser beam and paper materials.

This study was carried out in the Laboratory of Laser Processing at Lappeenranta University of Technology (Finland). The laser used was a TRUMPF TLF 2700 carbon dioxide laser producing a beam with a wavelength of 10.6 μm and a power range of 190-2500 W (laser power on the workpiece). The interaction between the laser beam and paper material was studied by treating dried kraft pulp (grammage 67 g/m²) with different laser power levels, focal plane position settings and interaction times. The interaction was detected with several monitoring devices, i.e. a spectrometer, a pyrometer and an active illumination imaging system. This made it possible to create an input-output parameter diagram and to study the effects of the input and output parameters. When the interaction phenomena are understood, process development and even new innovations become possible, and filling the gap in knowledge on interaction phenomena can pave the way for the wider use of laser technology in the paper making and converting industry.

It was concluded in this thesis that the interaction of a laser beam with paper material has two mechanisms that depend on the focal plane position. In the experimental set-up used, the assumed interaction mechanism B appears in the average focal plane position range of 3.4 mm to 2.4 mm and the assumed interaction mechanism A in the range of 0.4 mm to -0.6 mm; a focal plane position of 1.4 mm represents the midzone between the two mechanisms. Holes form gradually during the interaction: first a small hole is formed in the interaction area at the centre of the laser beam cross-section, and then, as a function of interaction time, the hole expands until the interaction between the laser beam and the dried kraft pulp ends. Image analysis shows that at the beginning of the interaction small holes of very good quality are formed, while black colour and a heat-affected zone appear as a function of interaction time. This reveals that there are further interaction phases within mechanisms A and B, appearing as a function of time and of the peak intensity of the laser beam. The limit peak intensity is the value that divides interaction mechanisms A and B from one-phase into dual-phase interaction: all peak intensity values below the limit belong to MAOM (interaction mechanism A one-phase mode) or MBOM (interaction mechanism B one-phase mode), and values above it belong to MADM (interaction mechanism A dual-phase mode) or MBDM (interaction mechanism B dual-phase mode).

The decomposition of cellulose between 380 and 500 °C proceeds by evolution of hydrocarbons: the long cellulose molecule is split into smaller volatile hydrocarbons in this temperature range. As the temperature increases, the decomposition process changes; in the range of 700-900 °C the cellulose molecule is mainly decomposed into H2 gas, which is why this range is called the evolution of hydrogen. Interaction in this range starts (as in MAOM and MBOM) when a small, good-quality hole is formed. This is due to "direct evaporation" of the pulp via the evolution of hydrogen, and it can be seen in the spectrometer as a high-intensity peak of yellow light (in the range of 588-589 nm), which corresponds to a temperature of ~1750 °C. The pyrometer does not detect this peak, since it cannot detect the physical phase change from solid kraft pulp to gaseous compounds. As the interaction between the laser beam and the dried kraft pulp continues, the hypothesis is that three auto-ignition processes occur. The auto-ignition temperature of a substance is the lowest temperature at which it spontaneously ignites in a normal atmosphere without an external source of ignition, such as a flame or spark. Three auto-ignition processes appear in the MADM and MBDM ranges: 1. the auto-ignition temperature of hydrogen (H2) is 500 °C, 2. that of carbon monoxide (CO) is 609 °C, and 3. that of carbon (C) is 700 °C. These three auto-ignition processes lead to the formation of a plasma plume with strong emission of radiation in the visible range.

The formation of this plasma plume can be seen as an increase of intensity in the wavelength range of ~475-652 nm, and the pyrometer shows its maximum temperature just after this ignition. The plasma plume is assumed to scatter the laser beam so that it interacts with a larger area of the dried kraft pulp than the actual beam cross-section, which also reduces the peak intensity. The results thus show that the presumably scattered light, with low peak intensity, interacts with a large area of the hole edges, and due to the low peak intensity this interaction happens at a low temperature. The interaction therefore turns from the evolution of hydrogen to the evolution of hydrocarbons, which explains the black colour of the hole edges.
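The mode nomenclature above can be summarized as a small decision rule. The sketch below is an illustrative reading of the abstract's thresholds (the focal plane position ranges and a limit peak intensity); the function, its parameter units and the fallback label are hypothetical, not part of the thesis:

```python
def classify_interaction(focal_plane_mm, peak_intensity, limit_peak_intensity):
    """Classify the laser / dried-kraft-pulp interaction mode.

    Focal plane ranges follow the abstract; the limit peak intensity is a
    free parameter of the experimental set-up (units are illustrative).
    """
    # Mechanism B: average focal plane position between 2.4 mm and 3.4 mm;
    # mechanism A: between -0.6 mm and 0.4 mm; 1.4 mm lies in the midzone.
    if 2.4 <= focal_plane_mm <= 3.4:
        mechanism = "B"
    elif -0.6 <= focal_plane_mm <= 0.4:
        mechanism = "A"
    else:
        return "midzone/undefined"
    # Below the limit peak intensity the interaction stays one-phase (OM);
    # above it, the interaction becomes dual-phase (DM).
    phase = "OM" if peak_intensity < limit_peak_intensity else "DM"
    return f"M{mechanism}{phase}"
```

For example, a focal plane position of 3.0 mm with a peak intensity below the limit would be labelled MBOM under these assumptions.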


The future of privacy in the information age is a highly debated topic. In particular, new and emerging technologies such as ICTs and cognitive technologies are seen as threats to privacy. This thesis explores images of the future of privacy among non-experts within the time frame from the present until the year 2050. The aims of the study are to conceptualise privacy as a social and dynamic phenomenon, to understand how privacy is conceptualised among citizens and to analyse ideal-typical images of the future of privacy using the causal layered analysis method. The theoretical background of the thesis combines critical futures studies and critical realism, and the empirical material is drawn from three focus group sessions held in spring 2012 as part of the PRACTIS project. From a critical realist perspective, privacy is conceptualised as a social institution which creates and maintains boundaries between normative circles and preserves the social freedom of individuals. Privacy changes when actors with particular interests engage in technology-enabled practices which challenge current privacy norms. The thesis adopts a position of technological realism as opposed to determinism or neutralism. In the empirical part, the focus group participants are divided into four clusters based on differences in privacy conceptions and perceived threats and solutions. The clusters are fundamentalists, pragmatists, individualists and collectivists. Correspondingly, four ideal-typical images of the future are composed: ‘drift to low privacy’, ‘continuity and benign evolution’, ‘privatised privacy and an uncertain future’, and ‘responsible future or moral decline’. The images are analysed using the four layers of causal layered analysis: litany, system, worldview and myth. Each image has its strengths and weaknesses. The individualistic images tend to be fatalistic in character while the collectivistic images are somewhat utopian. 
In addition, the images have two common weaknesses: lack of recognition of ongoing developments and simplistic conceptions of privacy based on a dichotomy between the individual and society. The thesis argues for a dialectical understanding of futures as present images of the future and as outcomes of real processes and mechanisms. The first steps in promoting desirable futures are the awareness of privacy as a social institution, the awareness of current images of the future, including their assumptions and weaknesses, and an attitude of responsibility where futures are seen as the consequences of present choices.


The actions of fibroblast growth factors (FGFs), particularly the basic form (bFGF), have been described in a large number of cells and include mitogenicity, angiogenicity and wound repair. The present review discusses the presence of the bFGF protein and messenger RNA, as well as the presence of the FGF receptor messenger RNA, in the rodent brain by means of semiquantitative radioactive in situ hybridization in combination with immunohistochemistry. Chemical and mechanical injuries to the brain trigger a reduction in neurotransmitter synthesis and neuronal death, which are accompanied by an astroglial reaction. The altered synthesis of bFGF following brain lesions or stimulation was analyzed. Lesions of the central nervous system trigger bFGF gene expression by neurons and/or activated astrocytes, depending on the type of lesion and the time post-manipulation. The changes in bFGF messenger RNA are frequently accompanied by a subsequent increase of bFGF immunoreactivity in astrocytes in the lesioned pathway. The reactive astrocytes and injured neurons synthesize increased amounts of bFGF, which may act as a paracrine/autocrine factor, protecting neurons from death and also stimulating neuronal plasticity and tissue repair.


The objective of the present investigation was to perform a 14-day time-course study of treatment with salbutamol, a β2-adrenoceptor agonist, on rat soleus muscle in order to assess fiber type selectivity in the hypertrophic response and fiber type composition. Male Wistar rats were divided into four groups: control (N = 10), treated with salbutamol (N = 30), denervated (N = 30), and treated with salbutamol after denervation (N = 30). Salbutamol was injected intraperitoneally into the rats of the 2nd and 4th groups at a dose of 0.3 mg/kg twice a day for 2 weeks. The muscles were denervated by the crush method using a Péan forceps. The animals were sacrificed 3, 6, 9, 12, and 14 days after treatment. Frozen cross-sections of soleus muscle were stained for myosin ATPase, pH 9.4. The cross-sectional area and percentage of muscle fibers were analyzed morphometrically by computerized image analysis. Treatment with salbutamol induced hypertrophy of all fiber types and a higher percentage of type II fibers (21%) in the healthy rat soleus muscle. Denervation caused marked atrophy of all fibers and conversion from type I to type II muscle fibers. Denervated muscles treated with salbutamol showed a cross-sectional area of type I muscle fibers 28.2% larger than that of denervated untreated muscle; moreover, the number of type I fibers was increased. These results indicate that administration of salbutamol is able to induce changes in cross-sectional area and fiber type distribution in the early phase of treatment. Since denervation-induced atrophy and conversion from type I to type II fibers were improved by salbutamol treatment, we propose that salbutamol, like other β2-adrenoceptor agonists, may have therapeutic potential in improving the condition of skeletal muscle after denervation.


In Brazil, scientific research is carried out mainly at universities, where professors coordinate research projects with the active participation of undergraduate and graduate students. However, there is no formal program for teaching and learning the scientific method. The objective of the present study was to evaluate the comprehension of the scientific method by health sciences students who participate in scientific projects in an academic research laboratory. An observational, descriptive, cross-sectional study was conducted using Edgar Morin's complexity as the theoretical reference. In a semi-structured interview, students were asked to solve an abstract logical puzzle, the Tangram. The collected data were analyzed using the hermeneutic-dialectic analysis method proposed by Minayo and discussed in terms of the theoretical reference of complexity. The students' concept of the scientific method is limited to participation in projects, stressing the execution of practical procedures as opposed to scientific thinking. The solving of the Tangram puzzle revealed that the students had difficulties in understanding questions and activities focused on subjects and their processes. Objective answers, even when dealing with personal issues, were also reflected in the students' opinions about the characteristics of a successful researcher. The students' difficulties concerning these issues may affect their scientific performance and result in poorly designed experiments. This is a preliminary study that should be extended to other centers of scientific research.


We investigated whether Ca2+/calmodulin-dependent kinase II (CaMKII) and calcineurin (CaN) are involved in myocardial hypertrophy induced by tumor necrosis factor α (TNF-α). The cardiomyocytes of neonatal Wistar rats (1-2 days old) were cultured and stimulated by TNF-α (100 μg/L), and Ca2+ signal transduction was blocked by several antagonists, including BAPTA (4 µM), KN-93 (0.2 µM) and cyclosporin A (CsA, 0.2 µM). Protein content, protein synthesis, cardiomyocyte volumes, [Ca2+]i transients, CaMKIIδB and CaN were evaluated by the Lowry method, [³H]-leucine incorporation, a computerized image analysis system, a Till imaging system, and Western blot analysis, respectively. TNF-α induced a significant increase in protein content in a dose-dependent manner from 10 µg/L (53.56 µg protein/well) to 100 μg/L (72.18 µg protein/well), and in a time-dependent manner from 12 h (37.42 µg protein/well) to 72 h (42.81 µg protein/well). TNF-α (100 μg/L) significantly increased the amplitude of spontaneous [Ca2+]i transients, the total protein content, cell size, and [³H]-leucine incorporation in cultured cardiomyocytes, which was abolished by 4 µM BAPTA, an intracellular Ca2+ chelator. The increases in protein content, cell size and [³H]-leucine incorporation were abolished by 0.2 µM KN-93 or 0.2 µM CsA. TNF-α increased the expression of CaMKIIδB by 35.21% and that of CaN by 22.22% compared to control. These effects were abolished by 4 µM BAPTA, which itself had no effect. These results suggest that TNF-α induces increases in [Ca2+]i, CaMKIIδB and CaN and promotes cardiac hypertrophy. Therefore, we hypothesize that the Ca2+/CaMKII- and CaN-dependent signaling pathways are involved in myocardial hypertrophy induced by TNF-α.


The aims of this study were to use the isotope analysis method to quantify the carbon of the C3 photosynthetic cycle in commercial apple nectars and to determine the legal limit for identifying beverages that do not conform to the safety standards established by the Brazilian Ministry of Agriculture, Livestock and Food Supply. Reference apple nectars were produced in the laboratory according to Brazilian legislation; adulterated nectars were also produced, with an amount of pulp juice below the permitted threshold. The δ13C values of the apple nectars and their fractions (pulp and purified sugar) were measured to quantify the C3 source percentage. To detect adulteration, the values found were compared with the limit values established by Brazilian law. All commercial apple nectars analyzed were within the legal limits and therefore in conformity with Brazilian law. The isotopic methodology developed proved efficient for quantifying carbon of C3 origin in commercial apple nectars.
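Quantifying the fraction of carbon from C3 plants from a δ13C measurement is conventionally done with a two-source isotope mass balance. The sketch below illustrates that calculation; the C3/C4 end-member values are generic textbook assumptions, not the values or legal limits used in the study:

```python
def c3_percentage(delta_sample, delta_c3=-28.0, delta_c4=-12.0):
    """Two-source isotope mixing: percentage of carbon from the C3 cycle.

    All delta values are δ13C in ‰ (vs. VPDB). The C3 and C4 end-member
    defaults are illustrative textbook values, not those of the study.
    """
    # Linear mixing model: delta_sample = f*delta_c3 + (1 - f)*delta_c4.
    return 100.0 * (delta_sample - delta_c4) / (delta_c3 - delta_c4)
```

A sample whose δ13C equals the C3 end-member would read as 100% C3 carbon under this model, and a value between the end-members interpolates linearly.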


The recent rapid development of biotechnological approaches has enabled the production of large whole-genome level biological data sets. In order to handle these data sets, reliable and efficient automated tools and methods for data processing and result interpretation are required. Bioinformatics, as the field of studying and processing biological data, tries to answer this need by combining methods and approaches across computer science, statistics, mathematics and engineering. The need is also increasing for tools that can be used by biological researchers themselves, who may not have a strong statistical or computational background, which requires creating tools and pipelines with intuitive user interfaces, robust analysis workflows and a strong emphasis on result reporting and visualization. Within this thesis, several data analysis tools and methods have been developed for analyzing high-throughput biological data sets. These approaches, covering several aspects of high-throughput data analysis, are specifically aimed at gene expression and genotyping data, although in principle they are suitable for analyzing other data types as well. Coherent handling of the data across the various data analysis steps is highly important in order to ensure robust and reliable results. Thus, robust data analysis workflows are also described, putting the developed tools and methods into a wider context. The choice of the correct analysis method may also depend on the properties of the specific data set, and therefore guidelines for choosing an optimal method are given. The data analysis tools, methods and workflows developed within this thesis have been applied to several research studies, of which two representative examples are included in the thesis. The first study focuses on spermatogenesis in murine testis and the second one examines cell lineage specification in mouse embryonic stem cells.


In this work, the image-processing program INCA Feature, used for analyzing particle size distributions, was tested. Particle size distributions were determined with INCA Feature from the particle projection images in electron microscope images, for a talc used as a coating pigment and for two different carbonate grades. In addition, particle size distributions were determined for the silica and alumina particles used as aids in filtration and purification. The distributions determined with the image-processing program were compared with those analyzed with a SediGraph 5100 analyzer, based on particle settling velocity (sedimentation), and with a Coulter LS 230 method based on laser diffraction. The SediGraph 5100 and the image analysis program gave very similar mean values for the talc particle size distribution, whereas the mean given by the Coulter LS 230 instrument deviated from these. All of the compared methods ranked the particles of the different samples in the same size order. However, the numerical results of the methods cannot be compared directly, since each analysis method bases its particle size measurement on a different particle property. Based on this work, all tested analysis methods are suitable for determining the particle size distributions of paper pigments. This work also determined the number of particles required for a reliable image analysis result: at least 300 particles must be analyzed. Too large a sample increases the scatter of the size distribution and extends the analysis time to several hours. Sample preparation still requires further study, as it is the most important and critical step of particle size analysis performed with SEM and an image analysis program.

The increasing availability of automated microscopes will make these analyses easier and faster, so the popularity of the method will also grow in paper pigment research. The high price of the instruments and the special expertise required of the user will, at least for now, limit their use to research institutes.
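When sizing particles from SEM projection images, a common convention is the equivalent circular diameter computed from the projected area, combined here with the roughly 300-particle minimum reported in the abstract. A minimal sketch; the function names and units are illustrative, not taken from INCA Feature:

```python
import math

def equivalent_diameters(projection_areas_um2):
    """Equivalent circular diameter (µm) for each projected particle area (µm²).

    d = 2 * sqrt(A / pi): the diameter of a circle with the same area.
    """
    return [2.0 * math.sqrt(a / math.pi) for a in projection_areas_um2]

def distribution_is_reliable(projection_areas_um2, min_particles=300):
    """The abstract reports that at least ~300 particles are needed for a
    reliable image-analysis size distribution."""
    return len(projection_areas_um2) >= min_particles
```

Note that this measures a different particle property than sedimentation (Stokes diameter) or laser diffraction, which is why the abstract warns that the methods' numerical results are not directly comparable.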


Nowadays, image analysis is one of the most modern tools for evaluating the physiological potential of seeds. This study aimed at verifying the efficiency of seedling image analysis for assessing the physiological potential of wheat seeds. Wheat seeds of the cultivars IAC 370 and IAC 380, each represented by five different lots, were stored for four months under natural environmental conditions of temperature and relative humidity in the municipality of Piracicaba, State of São Paulo, Brazil. Bimonthly assessments were performed to quantify the moisture content and physiological potential of the seeds by means of germination, first count, accelerated aging, electrical conductivity and seedling emergence tests, as well as computerized seedling analysis using the Seed Vigor Imaging System (SVIS®). It was concluded that computerized seedling analysis through growth and vigor indexes, using SVIS®, is efficient for assessing the physiological potential of wheat seeds.


Optical microscopy is experiencing a renaissance. The diffraction limit, although still physically real, now plays a minor role in the resolution achievable in far-field fluorescence microscopy: super-resolution techniques enable fluorescence microscopy at nearly molecular resolution. Modern (super-resolution) microscopy methods rely strongly on software. Software tools are needed all the way from data acquisition and storage, through image reconstruction, restoration and alignment, to quantitative image analysis and visualization. These tools play a key role in all aspects of microscopy today, and their importance will only increase in the coming years as microscopy transitions, little by little, from single cells to more complex and even living model systems. In this thesis, a series of bioimage informatics software tools are introduced for STED super-resolution microscopy. Tomographic reconstruction software, coupled with a novel image acquisition method, STED<, is shown to enable axial (3D) super-resolution imaging in a standard 2D-STED microscope. Software tools are introduced for STED super-resolution correlative imaging with transmission electron microscopes or atomic force microscopes. A novel method for automatically ranking image quality within microscope image datasets is introduced and used, for example, to select the best images in a STED microscope image dataset.
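The abstract does not specify the thesis's ranking metric. A common sharpness proxy used for automatic image-quality ranking is the variance of the Laplacian, sketched below under that assumption (the actual metric in the thesis may differ):

```python
import numpy as np

def sharpness(img):
    """Variance of a 4-neighbour Laplacian: a common focus/quality proxy.

    This is an illustrative stand-in, not the thesis's ranking metric.
    """
    img = np.asarray(img, dtype=float)
    # Discrete Laplacian over the image interior (edges are dropped).
    lap = (-4.0 * img[1:-1, 1:-1]
           + img[:-2, 1:-1] + img[2:, 1:-1]
           + img[1:-1, :-2] + img[1:-1, 2:])
    return lap.var()

def rank_images(images):
    """Return image indices sorted from sharpest to blurriest."""
    return sorted(range(len(images)),
                  key=lambda i: sharpness(images[i]),
                  reverse=True)
```

A flat (featureless or heavily blurred) image scores near zero, while images with fine structure score higher, so sorting by this score pushes the best-focused frames of a dataset to the front.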


Laser scribing is currently a growing material processing method in industry; its benefits are being studied, for example, for improving the efficiency of solar cells. Because of the high quality requirements of the fast scribing process, it is important to monitor the process in real time to detect possible defects as they occur. However, there is a lack of studies on real-time monitoring of laser scribing: commonly used monitoring methods developed for other laser processes, such as laser welding, are too slow, and existing applications cannot be transferred to fast laser scribing monitoring. The aim of this thesis is to find a method for monitoring laser scribing with a high-speed camera and to evaluate the reliability and performance of the developed monitoring system experimentally. The laser used in the experiments was an IPG ytterbium pulsed fiber laser with a maximum average power of 20 W, fitted with Scanlab Hurryscan 14 II scan-head optics and an f100 telecentric lens. The camera was connected to the laser scanner with a camera adapter so that it follows the laser process, and a powerful, fully programmable industrial computer was chosen to execute the image processing and analysis. Algorithms for defect analysis, based on particle analysis, were developed using the LabVIEW system design software. The performance of the algorithms was first measured by analyzing a static image of the scribing line with a resolution of 960×20 pixels; the maximum analysis speed was 560 frames per second. The reliability of the algorithm was then evaluated by imaging a scribing path containing a variable number of defects at 2000 mm/s with the laser turned off, at an image analysis speed of 430 frames per second. The experiment was successful: the algorithms detected all defects on the scribing path. The final monitoring experiment was performed during a laser process.

However, it proved challenging to make the active laser illumination work with the laser scanner because of the physical dimensions of the laser lens and the scanner. For reliable defect detection, the illumination system needs to be replaced.
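The thesis's LabVIEW particle-analysis algorithms are not reproduced here, but the core idea can be sketched as thresholding one scribe-line frame and counting connected defect-candidate regions above a minimum area. The threshold value, area limit and dark-equals-defect convention below are illustrative assumptions:

```python
import numpy as np
from scipy import ndimage

def detect_defects(frame, threshold=128, min_area=3):
    """Particle-analysis style defect count on one scribe-line frame
    (e.g. a 960x20 grayscale image).

    Pixels darker than `threshold` are treated as defect candidates, and
    connected candidate regions of at least `min_area` pixels count as
    defects. All parameter values here are illustrative.
    """
    candidates = np.asarray(frame) < threshold
    labels, _ = ndimage.label(candidates)        # connected-component labeling
    areas = np.bincount(labels.ravel())[1:]      # skip background label 0
    return int(np.sum(areas >= min_area))
```

The area filter mirrors how particle analysis rejects single-pixel noise while keeping genuine defects; at 960×20 pixels per frame, this kind of operation is cheap enough to be plausible at the frame rates reported above.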


Intravascular ultrasound (IVUS) imaging is a catheter-based medical technology that produces cross-sectional images of blood vessels. It makes it possible to quantify and study the morphology of atherosclerotic plaques, in addition to visualizing the structure of blood vessels (lumen, intima, plaque, media and adventitia) in three dimensions. In recent years, this imaging method has become a tool of choice in both research and clinical practice for the study of atherosclerotic disease. IVUS imaging is, however, affected by artifacts related to the characteristics of ultrasound transducers, by shadow cones caused by calcifications or collateral arteries, by plaques with heterogeneous appearance, and by blood speckle. The automated analysis of large IVUS sequences therefore represents an important challenge. A three-dimensional (3D) segmentation method based on the multiple-interface fast-marching algorithm is presented. The segmentation uses region and contour attributes of the IVUS images: a new interface propagation speed function is proposed that combines the gray-level probability density functions of the vessel wall components with the intensity gradient. The segmentation is largely automated, since the vessel lumen is detected fully automatically. In an original initialization procedure, a minimum of interaction is needed: automatically computed initial contours of the external vessel wall are proposed to the user for acceptance or correction on a limited number of longitudinal-section images.

The segmentation was validated with in vivo IVUS sequences of femoral arteries from different acquisition subgroups, i.e., pre-balloon-angioplasty, post-intervention, and at a follow-up examination one year after the intervention. The results were compared with reference contours traced manually by different experts in IVUS image analysis. The lumen and external vessel wall contours detected with the fast-marching method agree with the experts' manual tracings, since the area measurements are similar and the point-to-point differences between contours are small. Moreover, the 3D fast-marching segmentation was performed in a greatly reduced time compared with manual analysis. This is the first study reported in the literature to evaluate segmentation performance on different types of IVUS acquisitions. In conclusion, fast-marching segmentation combining gray-level distribution and intensity gradient information is accurate and efficient for analyzing large IVUS sequences. A robust segmentation tool could become widely adopted for the arduous and tedious task of analyzing this type of image.
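A speed function combining a region term (gray-level probability) with a contour term (intensity gradient), in the spirit of the abstract, can be sketched as follows. The Gaussian gray-level model, the weighting scheme and the functional form are assumptions for illustration; the thesis's exact formulation is not reproduced here:

```python
import numpy as np

def propagation_speed(image, mu, sigma, alpha=0.5):
    """Illustrative fast-marching interface speed map.

    High where pixels match a wall component's gray-level distribution
    (Gaussian with mean `mu`, std `sigma` assumed here) and where the
    intensity gradient is low, so the front moves fast inside homogeneous
    tissue and slows at edges. Weights and form are illustrative guesses.
    """
    img = np.asarray(image, dtype=float)
    pdf = np.exp(-0.5 * ((img - mu) / sigma) ** 2)   # region term, in (0, 1]
    gy, gx = np.gradient(img)
    edge = 1.0 / (1.0 + np.hypot(gx, gy))            # contour term, in (0, 1]
    return alpha * pdf + (1.0 - alpha) * edge
```

A fast-marching solver would then propagate the interfaces with arrival times governed by this map, so that fronts stall on strong gradients at tissue boundaries.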


Cells are able to detect spatial distributions of proteins and thereby migrate or extend in the appropriate direction. Understanding the cellular response to changes in these spatial protein distributions is essential for advancing knowledge in several research fields such as development, immunology and oncology. A particularly complex example is axon guidance during development of the nervous system, which requires several attractive or repulsive distributions of guidance molecules to correctly wire the complex network that is the nervous system. Since several guidance cues act together, it is particularly difficult to identify the individual contribution or the signaling pathway triggered in vivo; it is therefore necessary to use methods that reproduce these protein distributions in vitro. Several methods exist for producing gradients of soluble or substrate-bound proteins. Some methods for producing soluble gradients are already commonly used in many laboratories, but they limit studies to protein distributions that are normally secreted in vivo. Methods for producing substrate-bound distributions are particularly complex, which restricts their use to a few laboratories. First, we present a simple method that exploits the photobleaching of fluorescent molecules to create substrate-bound protein patterns: laser-assisted protein adsorption by photobleaching (LAPAP). This method makes it possible to produce complex protein patterns with micrometer resolution and a wide dynamic range.

The technique was characterized and, as a proof of functionality, axons of dorsal root ganglion neurons were guided on gradients of a laminin-derived peptide. Second, LAPAP was improved to allow fabrication of multi-component patterns by using lasers of different wavelengths and antibodies conjugated to fluorophores matching those wavelengths. Moreover, to speed up and simplify the fabrication process, we developed wide-field illumination LAPAP, which uses a spatial light modulator, a light-emitting diode and a standard microscope to print a protein pattern directly. This method is particularly simple compared with the original version of LAPAP, since it does not involve controlling laser power and motorized stages, but only sending the image of the desired pattern to the spatial light modulator. Finally, we used LAPAP to demonstrate that our technique can be used in high-content analyses to quantify the morphological changes resulting from neuronal growth on guidance protein gradients. We produced thousands of laminin-1 gradients with different slopes and analyzed the variations in neurite guidance of a neuronal cell line (RGC-5). An algorithm was developed to analyze images of cells on the gradients, detecting each cell and quantifying the position of the soma centroid as well as the initiation, final and turning angles of each neurite. These data showed that laminin gradients influence the initiation angle of RGC-5 neurites but do not influence their turning.

We believe the results presented in this thesis will facilitate the use of substrate-bound protein patterns in life science laboratories, since LAPAP can be performed with a confocal microscope or a slightly modified standard microscope. This could help increase the number of laboratories working on guidance with substrate-bound gradients, in order to reach the critical mass needed for major breakthroughs in neuroscience.
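The per-neurite angle measurements described above (initiation, final and turning angles relative to the soma centroid) can be sketched with plain trigonometry. The conventions below (degrees; turning defined as final minus initiation, wrapped to (-180, 180]) are assumptions for illustration, not the thesis's exact definitions:

```python
import math

def angle_deg(origin, point):
    """Angle of `point` as seen from `origin`, in degrees in (-180, 180]."""
    return math.degrees(math.atan2(point[1] - origin[1],
                                   point[0] - origin[0]))

def neurite_angles(soma_centroid, neurite_path):
    """Initiation, final and turning angles of one traced neurite path.

    The initiation angle is taken at the first traced point, the final
    angle at the last point, both relative to the soma centroid; turning
    is their wrapped difference. These conventions are illustrative.
    """
    initiation = angle_deg(soma_centroid, neurite_path[0])
    final = angle_deg(soma_centroid, neurite_path[-1])
    turning = (final - initiation + 180.0) % 360.0 - 180.0
    return initiation, final, turning
```

Aggregating such angles over thousands of cells per gradient slope is the kind of high-content readout the abstract describes for the RGC-5 experiments.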


This paper presents a method based on articulated models for the registration of spine data extracted from multimodal medical images of patients with scoliosis. With the ultimate aim of developing a complete geometrical model of the torso of a scoliotic patient, vertebral column data are registered using 3D magnetic resonance images (MRI) acquired in the prone position and X-ray data acquired in the standing position for five patients with scoliosis. The 3D shape of the vertebrae is estimated from both image modalities for each patient, and an articulated model is used to calculate the intervertebral transformations required to align the vertebrae between both postures. Euclidean distances between anatomical landmarks are calculated to assess the multimodal registration error. Results show a decrease in the Euclidean distance using the proposed method compared to rigid registration, and more physically realistic vertebra deformations compared to thin-plate-spline (TPS) registration, thus improving alignment.
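The registration error measure described above, Euclidean distances between corresponding anatomical landmarks, can be sketched as follows (taking the mean as the summary statistic is an assumption; the paper may report other statistics as well):

```python
import numpy as np

def landmark_error(landmarks_a, landmarks_b):
    """Mean Euclidean distance between corresponding 3D anatomical
    landmarks, as a simple multimodal registration error measure.

    Inputs are (N, 3) arrays of matched landmark coordinates; the mean
    as the summary statistic is an illustrative choice.
    """
    a = np.asarray(landmarks_a, dtype=float)
    b = np.asarray(landmarks_b, dtype=float)
    return float(np.linalg.norm(a - b, axis=1).mean())
```

Comparing this value across the articulated-model, rigid and TPS registrations is how the abstract's "decrease in the Euclidean distance" claim would be quantified.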