398 results for Subtraction


Relevance:

10.00%

Publisher:

Abstract:

The central aim of this research is to assess the values and possibilities of the covenant preached in Deuteronomy. To that end, I seek to capture the tension inherent in any kind of covenant. I carry out this exercise, first, within the field of hermeneutics itself. I propose a subaltern reading that brings together different struggles within liberationist interpretations (feminist, queer, and postcolonial). In the process, I develop the figure of the organic exegete, namely, the interpreter who articulates dissident voices to confront systemic structures of subordination. After this theoretical proposal, I assess Deuteronomy as a set of concatenated discourses in the form of an archive. The main suggestion is that the Deuteronomic texts were collected or produced in service of an ideal of berit (covenant). This ideal originates in the material now arranged in 4,44-26+28: an atavistic communal contract with Yhvh. This result is made possible by rhetorical criticism of the text and of its propagandistic interests from its archival origins. After an honest comparison with Ancient Near Eastern treaties, the pedagogy of obedience intrinsic to the contract can no longer be denied. I often call this the collusion of the holy people. Rhetorical criticism, however, does not merely lead to a reification of this ideal of berit; rather, it points to the internal debate of the community. A rhetorical contract, after all, holds within itself memories that were silenced so that the propaganda could take effect. It is at this point that I look for collisions of memories, especially within the prohibitive pericopes of the contract. All the Deuteronomic refuse, so to speak, is marked by two basic formulas: ki to'abat yhvh (it is an abomination to Yhvh) and ubi'arta ha-ra mi-qirbeka (you shall purge the per/verted one from your midst). I devote myself to the texts marked by these formulas, fostering an episodic unification of the "abominable" and the "per/verted."
I assess the particular struggle of each one in order to propose a subaltern agenda that promotes social justice through recognition and redistribution. The abominable and per/verted covenant within Deuteronomy presents a radically democratic proposal (i) in favor of a culture open to the Other and (ii) against pyramidal authoritarian structures. I therefore argue that, through this twofold tactic, the imperial values of hierarchization and of the subtraction of the Deuteronomic kinship are rhetorically brought into debate within the community.

Relevance:

10.00%

Publisher:

Abstract:

Water-based latices, used in the production of internal liners for beer/beverage cans, were investigated using a number of analytical techniques. The epoxy-graft-acrylic polymers, used to prepare the latices, and films, produced from those latices, were also examined. It was confirmed that acrylic polymer preferentially grafts onto higher molecular weight portions of the epoxy polymer. The amount of epoxy remaining ungrafted was determined to be 80%. This figure is higher than was previously thought. Molecular weight distribution studies were carried out on the epoxy and epoxy-g-acrylic resins. A quantitative method for determining copolymer composition using GPC was evaluated. The GPC method was also used to determine polymer composition as a function of molecular weight. IR spectroscopy was used to determine the total level of acrylic modification of the polymers and NMR was used to determine the level of grafting. Particle size determinations were carried out using transmission electron microscopy and dynamic light scattering. Levels of stabilising amine greatly affected the viscosity of the latex, the particle size and the amount of soluble polymer, but the core particle size, as determined using TEM, was unaffected. NMR spectra of the latices showed signals only from the solvents and amine modifiers. Using solid-state CP/MAS/freezing techniques, spectra from the epoxy component could be observed. FT-IR spectra of the latices were obtained after spectral subtraction of water. The only differences between the spectra of the latices and those of the dry film were due to the presence of the solvents in the former. A distinctive morphology in the films produced from the latices was observed. This suggested that the micelle structure of the latex survives the film forming process. If insufficient acrylic is present, large epoxy domains are produced which give rise to poor film characteristics. Casting the polymers from organic solutions failed to produce similar morphology.
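As a rough sketch of what such a spectral water subtraction can look like (the abstract does not give the actual procedure, so the arrays, the band choice, and the scaling rule below are hypothetical), a pure-water reference spectrum is scaled so that a water-only band is nulled on average, then subtracted from the sample spectrum:

```python
import numpy as np

def subtract_water(sample, water, water_band):
    """Subtract a scaled pure-water reference spectrum from a sample
    spectrum; the scale factor is chosen to null a water-only band."""
    sample = np.asarray(sample, dtype=float)
    water = np.asarray(water, dtype=float)
    k = sample[water_band].mean() / water[water_band].mean()
    return sample - k * water

# Synthetic demo: sample = polymer band + water background
x = np.linspace(0, 10, 200)
water = np.exp(-(x - 7.0) ** 2)            # water-only band near x = 7
polymer = 0.5 * np.exp(-(x - 3.0) ** 2)    # polymer band near x = 3
sample = polymer + 0.8 * water

band = (x > 6.0) & (x < 8.0)               # region with no polymer signal
corrected = subtract_water(sample, water, band)
```

After subtraction, the synthetic spectrum reduces to the polymer contribution alone, which is the point of the technique: residual differences then reveal genuine composition changes rather than the water background.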

Relevance:

10.00%

Publisher:

Abstract:

Phosphatase and tensin homolog (PTEN) is a redox-sensitive, dual-specificity protein phosphatase involved in regulating a number of cellular processes including metabolism, apoptosis, cell proliferation and survival. It acts as a tumor suppressor by negatively regulating the PI3K/Akt pathway. While direct evidence of a redox regulation of PTEN downstream signaling has been reported, the effect of cellular oxidative stress or direct PTEN oxidation on the PTEN interactome is still poorly defined. To investigate this, PTEN-GST fusion protein was prepared in its reduced form and an H2O2-oxidized form that was reversible by DTT treatment, and these were immobilized on a glutathione-sepharose-based support. The immobilized protein was incubated with cell lysate to capture interacting proteins. Captured proteins were eluted from the beads, analyzed by LC-MS/MS and comparatively quantified using label-free methods. After subtraction of interactors that were also present in the resin and GST controls, 97 individual protein interactors were identified, including several that are novel. Fourteen interactors that varied significantly with the redox status of PTEN were identified, including thioredoxin and peroxiredoxin-1. Except for one interactor, their binding was higher for oxidized PTEN. Using Western blotting, altered binding to PTEN was confirmed for 3 selected interactors (Prdx1, Trx, and Anxa2) and DDB1 was validated as a novel interactor with unaltered binding. Our results suggest that the redox status of PTEN causes a functional variation in the PTEN interactome which is important for the cellular function of PTEN. The resin capture method developed had distinct advantages in that the redox status of PTEN could be directly controlled and measured.
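The control subtraction step described above amounts, in its simplest set-based form, to removing anything also captured by the bare resin or by GST alone. The protein names below are illustrative only, and the actual study used quantitative label-free LC-MS/MS comparisons rather than simple set membership:

```python
# Hypothetical capture lists (gene symbols), not the study's data.
pten_capture = {"PRDX1", "TXN", "ANXA2", "DDB1", "GST", "HSPA8"}
resin_control = {"HSPA8"}   # nonspecific binding to the bare resin
gst_control = {"GST"}       # binding to the GST tag alone

# Keep only proteins specific to the PTEN bait:
specific_interactors = pten_capture - resin_control - gst_control
```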

Relevance:

10.00%

Publisher:

Abstract:

An approximate number is an ordered pair consisting of a (real) number and an error bound, briefly an error, which is a non-negative (real) number. To compute with approximate numbers, the arithmetic operations on errors must be well understood. To model computations with errors, one should suitably define and study arithmetic operations and order relations over the set of non-negative numbers. In this work we discuss the algebraic properties of non-negative numbers, starting from familiar properties of real numbers. We focus on certain operations on errors which seem not to have been sufficiently studied algebraically. We restrict ourselves to arithmetic operations for errors related to addition and multiplication by scalars. We pay special attention to subtractability-like properties of errors and the induced “distance-like” operation. This operation is implicitly used under different names in several contemporary fields of applied mathematics (inner subtraction and inner addition in interval analysis, the generalized Hukuhara difference in fuzzy set theory, etc.). Here we present some new results related to the algebraic properties of this operation.
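For intervals, the generalized Hukuhara difference mentioned above can be written down directly; on the error components (the interval radii), it induces exactly the distance-like operation |e1 − e2|. A minimal sketch, with function names of my own choosing rather than the authors':

```python
def ghukuhara_diff(a, b):
    """Generalized Hukuhara difference of intervals a = [a1, a2] and
    b = [b1, b2]: the interval c such that b + c = a (case 1) or
    a + (-1)*c = b (case 2)."""
    (a1, a2), (b1, b2) = a, b
    lo, hi = a1 - b1, a2 - b2
    return (min(lo, hi), max(lo, hi))

def error_inner_sub(e1, e2):
    """Induced 'distance-like' operation on (non-negative) errors."""
    return abs(e1 - e2)

# Case 1: widths satisfy width(a) >= width(b), so b + c = a holds:
c = ghukuhara_diff((1, 5), (0, 2))   # c = (1, 3); check: (0+1, 2+3) = (1, 5)
```

Note that the radius of the result, 1, equals `error_inner_sub(2, 1)`, i.e., the absolute difference of the two operand radii, which is the subtractability-like behavior the abstract refers to.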

Relevance:

10.00%

Publisher:

Abstract:

We propose a novel architecture for all-optical add-drop multiplexing of OFDM signals. Sub-channel extraction is achieved by means of waveform replication and coherent subtraction from the OFDM super-channel. Numerical simulations have been carried out to benchmark the performance of the architecture against critical design parameters.
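The waveform-replication-and-coherent-subtraction idea can be illustrated with an idealized baseband model (a perfect replica and no noise or channel impairments, which the actual architecture would have to contend with): replicating the waveform of the sub-channel to be dropped and subtracting it coherently leaves a super-channel containing only the remaining sub-channels.

```python
import numpy as np

N = 64                                  # FFT size / samples per OFDM symbol
data = np.zeros(N, dtype=complex)
data[3] = 1 + 1j                        # sub-channel to drop
data[10] = -1 + 1j                      # sub-channel to keep
superchannel = np.fft.ifft(data) * N    # time-domain OFDM symbol

# Replicate the drop sub-channel's waveform (ideal replica here):
replica = np.zeros(N, dtype=complex)
replica[3] = data[3]
drop_waveform = np.fft.ifft(replica) * N

# Coherent subtraction removes it from the super-channel:
remainder = superchannel - drop_waveform
spectrum = np.fft.fft(remainder) / N    # subcarrier 3 is now empty
```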

Relevance:

10.00%

Publisher:

Abstract:

The move from Standard Definition (SD) to High Definition (HD) represents a sixfold increase in the amount of data that needs to be processed. With expanding resolutions and evolving compression, there is a need for high-performance, flexible architectures that allow for quick upgradability. Technology continues to advance in image display resolution, compression techniques, and video intelligence. Software implementations of these systems can attain accuracy, with tradeoffs among processing performance (achieving specified frame rates while working on large image data sets), power, and cost constraints. There is a need for new architectures that keep pace with the fast innovations in video and imaging. This dissertation contains dedicated hardware implementations of the pixel- and frame-rate processes on a Field Programmable Gate Array (FPGA) to achieve real-time performance.

The following outlines the contributions of the dissertation. (1) We develop a target detection system by applying a novel running average mean threshold (RAMT) approach to globalize the threshold required for background subtraction. This approach adapts the threshold automatically to different environments (indoor and outdoor) and different targets (humans and vehicles). For low power consumption and better performance, we design the complete system on FPGA. (2) We introduce a safe distance factor and develop an algorithm for occlusion occurrence detection during target tracking. A novel mean threshold is calculated by motion-position analysis. (3) A new strategy for gesture recognition is developed using Combinational Neural Networks (CNN) based on a tree structure. Analysis of the method is done on American Sign Language (ASL) gestures. We introduce a novel points-of-interest approach to reduce the feature vector size and a gradient threshold approach for accurate classification.
(4) We design a gesture recognition system using a hardware/software co-simulation neural network for the high speed and low memory storage requirements provided by the FPGA. We develop an innovative maximum-distance algorithm which uses only 0.39% of the image as the feature vector to train and test the system design. The gestures in a database may vary between applications. Therefore, it is highly essential to keep the feature vector as small as possible while maintaining the same accuracy and performance.
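The RAMT details are not given in this summary, so the following is only a generic sketch of a running-average background model with a globally adapted, mean-based threshold, in the spirit of contribution (1); the update rate `alpha` and the factor `k` are illustrative parameters, not values from the dissertation:

```python
import numpy as np

def update_background(bg, frame, alpha=0.05):
    """Running-average background model: bg <- (1-alpha)*bg + alpha*frame."""
    return (1.0 - alpha) * bg + alpha * frame

def detect_foreground(bg, frame, k=2.0):
    """Flag pixels whose deviation from the background exceeds a
    global threshold derived from the frame's mean absolute deviation,
    so the threshold adapts automatically to the scene."""
    diff = np.abs(frame.astype(float) - bg)
    thresh = k * diff.mean()          # global, frame-adaptive threshold
    return diff > thresh

# Demo: static scene with a bright 'target' appearing in one frame
bg = np.full((8, 8), 50.0)
frame = bg.copy()
frame[2:4, 2:4] = 200.0               # 4 target pixels
mask = detect_foreground(bg, frame, k=2.0)
bg = update_background(bg, frame)     # background slowly absorbs changes
```

Because the threshold is recomputed per frame from the data itself, the same code behaves sensibly for both low-contrast (indoor) and high-contrast (outdoor) scenes, which is the globalization idea the abstract describes.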

Relevance:

10.00%

Publisher:

Abstract:

This study analyzed three fifth grade students’ misconceptions and error patterns when working with equivalence, addition and subtraction of fractions. The findings revealed that students used both conceptual and procedural knowledge to solve the problems. They used pictures, gave examples, and made connections to other mathematical concepts and to daily life topics. Error patterns found include using addition and subtraction of numerators and denominators, and finding the greatest common factor.
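The dominant error pattern the study reports, adding numerators and denominators separately, is easy to contrast with the correct computation using common denominators:

```python
from fractions import Fraction

def buggy_add(a_num, a_den, b_num, b_den):
    """The error pattern found in the study: add numerators and
    denominators separately (NOT valid fraction addition)."""
    return Fraction(a_num + b_num, a_den + b_den)

correct = Fraction(1, 2) + Fraction(1, 3)   # common denominator: 3/6 + 2/6
buggy = buggy_add(1, 2, 1, 3)               # (1+1)/(2+3)
```

The buggy result, 2/5, is actually smaller than either addend's value relative to 1/2, which is one way teachers can make the misconception visible to students.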

Relevance:

10.00%

Publisher:

Abstract:

This report is a review of additive and subtractive manufacturing techniques. Additive manufacturing has resided largely in the prototyping realm, where it provides methods for producing complex freeform solid objects directly from a computer model, without part-specific tooling or knowledge. But these technologies are evolving steadily and are beginning to encompass related systems of material addition, subtraction, assembly, and insertion of components made by other processes. Furthermore, these various additive processes are starting to evolve into rapid manufacturing techniques for mass-customized products, away from narrowly defined rapid prototyping. Taking this idea far enough down the line, and several years hence, a radical restructuring of manufacturing could take place: manufacturing itself would move from a resource base to a knowledge base, and from mass production of single-use products to mass-customized, high-value, life-cycle products. To date, the majority of research and development has focused on advanced development of existing technologies, improving processing performance, materials, modelling and simulation tools, and design tools to enable the transition from prototyping to manufacturing of end-use parts.

Relevance:

10.00%

Publisher:

Abstract:

X-ray computed tomography (CT) imaging constitutes one of the most widely used diagnostic tools in radiology today, with nearly 85 million CT examinations performed in the U.S. in 2011. CT imparts a relatively high amount of radiation dose to the patient compared to other x-ray imaging modalities, and as a result of this fact, coupled with its popularity, CT is currently the single largest source of medical radiation exposure to the U.S. population. For this reason, there is a critical need to optimize CT examinations such that the dose is minimized while the quality of the CT images is not degraded. This optimization can be difficult to achieve due to the relationship between dose and image quality. All things being held equal, reducing the dose degrades image quality and can impact the diagnostic value of the CT examination.

A recent push from the medical and scientific community towards using lower doses has spawned new dose reduction technologies such as automatic exposure control (i.e., tube current modulation) and iterative reconstruction algorithms. In theory, these technologies could allow for scanning at reduced doses while maintaining the image quality of the exam at an acceptable level. Therefore, there is a scientific need to establish the dose reduction potential of these new technologies in an objective and rigorous manner. Establishing these dose reduction potentials requires precise and clinically relevant metrics of CT image quality, as well as practical and efficient methodologies to measure such metrics on real CT systems. The currently established methodologies for assessing CT image quality are not appropriate to assess modern CT scanners that have implemented those aforementioned dose reduction technologies.

Thus the purpose of this doctoral project was to develop, assess, and implement new phantoms, image quality metrics, analysis techniques, and modeling tools that are appropriate for image quality assessment of modern clinical CT systems. The project developed image quality assessment methods in the context of three distinct paradigms, (a) uniform phantoms, (b) textured phantoms, and (c) clinical images.

The work in this dissertation used the “task-based” definition of image quality. That is, image quality was broadly defined as the effectiveness by which an image can be used for its intended task. Under this definition, any assessment of image quality requires three components: (1) a well-defined imaging task (e.g., detection of subtle lesions), (2) an “observer” to perform the task (e.g., a radiologist or a detection algorithm), and (3) a way to measure the observer’s performance in completing the task at hand (e.g., detection sensitivity/specificity).

First, this task-based image quality paradigm was implemented using a novel multi-sized phantom platform (with uniform background) developed specifically to assess modern CT systems (Mercury Phantom, v3.0, Duke University). A comprehensive evaluation was performed on a state-of-the-art CT system (SOMATOM Definition Force, Siemens Healthcare) in terms of noise, resolution, and detectability as a function of patient size, dose, tube energy (i.e., kVp), automatic exposure control, and reconstruction algorithm (i.e., Filtered Back-Projection (FBP) vs. Advanced Modeled Iterative Reconstruction (ADMIRE)). A mathematical observer model (i.e., computer detection algorithm) was implemented and used as the basis of image quality comparisons. It was found that image quality increased with increasing dose and decreasing phantom size. The CT system exhibited nonlinear noise and resolution properties, especially at very low doses, large phantom sizes, and for low-contrast objects. Objective image quality metrics generally increased with increasing dose and ADMIRE strength, and with decreasing phantom size. The ADMIRE algorithm could offer comparable image quality at reduced doses or improved image quality at the same dose (increase in detectability index by up to 163% depending on iterative strength). The use of automatic exposure control resulted in more consistent image quality with changing phantom size.

Based on those results, the dose reduction potential of ADMIRE was further assessed specifically for the task of detecting small (<=6 mm) low-contrast (<=20 HU) lesions. A new low-contrast detectability phantom (with uniform background) was designed and fabricated using a multi-material 3D printer. The phantom was imaged at multiple dose levels and images were reconstructed with FBP and ADMIRE. Human perception experiments were performed to measure the detection accuracy from FBP and ADMIRE images. It was found that ADMIRE had equivalent performance to FBP at 56% less dose.

Using the same image data as the previous study, a number of different mathematical observer models were implemented to assess which models would result in image quality metrics that best correlated with human detection performance. The models included naïve simple metrics of image quality such as contrast-to-noise ratio (CNR) and more sophisticated observer models such as the non-prewhitening matched filter observer model family and the channelized Hotelling observer model family. It was found that non-prewhitening matched filter observers and the channelized Hotelling observers both correlated strongly with human performance. Conversely, CNR was found to not correlate strongly with human performance, especially when comparing different reconstruction algorithms.
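The contrast between CNR and a matched-filter observer can be seen in a minimal white-noise sketch (a strong simplification of the channelized and non-prewhitening models used in the study): a larger lesion of the same contrast yields the same CNR but a higher non-prewhitening (NPW) detectability index, because the template integrates signal over more pixels.

```python
import numpy as np

def cnr(signal_roi, background_roi):
    """Contrast-to-noise ratio: |mean difference| over background std."""
    return abs(signal_roi.mean() - background_roi.mean()) / background_roi.std()

def npw_dprime_white(lesion_profile, noise_std):
    """NPW matched-filter detectability for stationary white noise:
    d' = sqrt(sum(s^2)) / sigma, with expected lesion profile s as
    the template. (Correlated noise requires the full covariance.)"""
    s = np.asarray(lesion_profile, dtype=float)
    return np.sqrt((s ** 2).sum()) / noise_std

# Same 10 HU contrast, different lesion areas:
small = np.full((4, 4), 10.0)      # 16-pixel lesion
large = np.full((8, 8), 10.0)      # 64-pixel lesion
d_small = npw_dprime_white(small, noise_std=5.0)
d_large = npw_dprime_white(large, noise_std=5.0)
```

This size dependence is one reason a scalar CNR can fail to track human detection performance, consistent with the finding reported above.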

The uniform background phantoms used in the previous studies provided a good first-order approximation of image quality. However, due to their simplicity and due to the complexity of iterative reconstruction algorithms, it is possible that such phantoms are not fully adequate to assess the clinical impact of iterative algorithms, because patient images obviously do not have smooth uniform backgrounds. To test this hypothesis, two textured phantoms (classified as gross texture and fine texture) and a uniform phantom of similar size were built and imaged on a SOMATOM Flash scanner (Siemens Healthcare). Images were reconstructed using FBP and Sinogram Affirmed Iterative Reconstruction (SAFIRE). Using an image subtraction technique, quantum noise was measured in all images of each phantom. It was found that for FBP, the noise was independent of the background (textured vs. uniform). However, for SAFIRE, noise increased by up to 44% in the textured phantoms compared to the uniform phantom. As a result, the noise reduction from SAFIRE was found to be up to 66% in the uniform phantom but as low as 29% in the textured phantoms. Based on this result, it was clear that further investigation was needed to understand the impact that background texture has on image quality when iterative reconstruction algorithms are used.
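A common form of the image subtraction technique for isolating quantum noise (the dissertation does not spell out its exact variant, so this is a generic sketch) subtracts two repeated acquisitions of the same phantom: the structured background cancels, and the standard deviation of the difference divided by sqrt(2) estimates the per-image noise.

```python
import numpy as np

def noise_from_pair(img1, img2):
    """Estimate quantum noise by subtracting two repeated images of the
    same phantom; the (textured) background cancels exactly, and
    std(diff)/sqrt(2) recovers the per-image noise magnitude."""
    diff = img1.astype(float) - img2.astype(float)
    return diff.std() / np.sqrt(2.0)

# Synthetic demo: a sinusoidal 'texture' plus white noise of sigma = 5
rng = np.random.default_rng(0)
texture = 100.0 + 20.0 * np.sin(np.linspace(0, 6, 256 * 256)).reshape(256, 256)
sigma = 5.0
scan1 = texture + rng.normal(0, sigma, texture.shape)
scan2 = texture + rng.normal(0, sigma, texture.shape)
noise = noise_from_pair(scan1, scan2)   # close to sigma = 5
```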

To further investigate this phenomenon with more realistic textures, two anthropomorphic textured phantoms were designed to mimic lung vasculature and fatty soft tissue texture. The phantoms (along with a corresponding uniform phantom) were fabricated with a multi-material 3D printer and imaged on the SOMATOM Flash scanner. Scans were repeated a total of 50 times in order to get ensemble statistics of the noise. A novel method of estimating the noise power spectrum (NPS) from irregularly shaped ROIs was developed. It was found that SAFIRE images had highly locally non-stationary noise patterns, with pixels near edges having higher noise than pixels in more uniform regions. Compared to FBP, SAFIRE images had 60% less noise on average in uniform regions; for edge pixels, noise was between 20% higher and 40% lower. The noise texture (i.e., NPS) was also highly dependent on the background texture for SAFIRE. Therefore, it was concluded that quantum noise properties in uniform phantoms are not representative of those in patients for iterative reconstruction algorithms, and texture should be considered when assessing image quality of iterative algorithms.
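For square ROIs, the standard ensemble NPS estimate (which the dissertation extends to irregularly shaped ROIs) is the averaged squared DFT of mean-subtracted noise realizations, normalized by ROI area; the sketch below applies it to synthetic white noise, for which the NPS should be flat with mean equal to the noise variance times the pixel area:

```python
import numpy as np

def nps_2d(rois, pixel_size=1.0):
    """Ensemble 2-D noise power spectrum from noise-only ROIs:
    NPS = (dx*dy / (Nx*Ny)) * <|DFT(roi - mean)|^2>."""
    rois = np.asarray(rois, dtype=float)
    n, ny, nx = rois.shape
    acc = np.zeros((ny, nx))
    for roi in rois:
        d = roi - roi.mean()                 # remove the mean (DC) level
        acc += np.abs(np.fft.fft2(d)) ** 2
    return acc / n * (pixel_size ** 2) / (nx * ny)

rng = np.random.default_rng(1)
rois = rng.normal(0.0, 3.0, size=(50, 32, 32))   # 50 white-noise ROIs, sigma = 3
nps = nps_2d(rois)
# By Parseval's theorem, nps.mean() ~ variance = 9 (for pixel_size = 1)
```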

To move beyond just assessing noise properties in textured phantoms towards assessing detectability, a series of new phantoms was designed specifically to measure low-contrast detectability in the presence of background texture. The textures used were optimized to match the texture in the liver regions of actual patient CT images using a genetic algorithm. The so-called “Clustered Lumpy Background” texture synthesis framework was used to generate the modeled texture. Three textured phantoms and a corresponding uniform phantom were fabricated with a multi-material 3D printer and imaged on the SOMATOM Flash scanner. Images were reconstructed with FBP and SAFIRE and analyzed using a multi-slice channelized Hotelling observer to measure detectability and the dose reduction potential of SAFIRE based on the uniform and textured phantoms. It was found that at the same dose, the improvement in detectability from SAFIRE (compared to FBP) was higher when measured in a uniform phantom compared to textured phantoms.

The final trajectory of this project aimed at developing methods to mathematically model lesions, as a means to help assess image quality directly from patient images. The mathematical modeling framework is first presented. The models describe a lesion’s morphology in terms of size, shape, contrast, and edge profile as an analytical equation. The models can be voxelized and inserted into patient images to create so-called “hybrid” images. These hybrid images can then be used to assess detectability or estimability with the advantage that the ground truth of the lesion morphology and location is known exactly. Based on this framework, a series of liver lesions, lung nodules, and kidney stones were modeled based on images of real lesions. The lesion models were virtually inserted into patient images to create a database of hybrid images to go along with the original database of real lesion images. ROI images from each database were assessed by radiologists in a blinded fashion to determine the realism of the hybrid images. It was found that the radiologists could not readily distinguish between real and virtual lesion images (area under the ROC curve was 0.55). This study provided evidence that the proposed mathematical lesion modeling framework could produce reasonably realistic lesion images.

Based on that result, two studies were conducted which demonstrated the utility of the lesion models. The first study used the modeling framework as a measurement tool to determine how dose and reconstruction algorithm affected the quantitative analysis of liver lesions, lung nodules, and renal stones in terms of their size, shape, attenuation, edge profile, and texture features. The same database of real lesion images used in the previous study was used for this study. That database contained images of the same patient at 2 dose levels (50% and 100%) along with 3 reconstruction algorithms from a GE 750HD CT system (GE Healthcare). The algorithms in question were FBP, Adaptive Statistical Iterative Reconstruction (ASiR), and Model-Based Iterative Reconstruction (MBIR). A total of 23 quantitative features were extracted from the lesions under each condition. It was found that both dose and reconstruction algorithm had a statistically significant effect on the feature measurements. In particular, radiation dose affected five, three, and four of the 23 features (related to lesion size, conspicuity, and pixel-value distribution) for liver lesions, lung nodules, and renal stones, respectively. MBIR significantly affected 9, 11, and 15 of the 23 features (including size, attenuation, and texture features) for liver lesions, lung nodules, and renal stones, respectively. Lesion texture was not significantly affected by radiation dose.

The second study demonstrating the utility of the lesion modeling framework focused on assessing detectability of very low-contrast liver lesions in abdominal imaging. Specifically, detectability was assessed as a function of dose and reconstruction algorithm. As part of a parallel clinical trial, images from 21 patients were collected at 6 dose levels per patient on a SOMATOM Flash scanner. Subtle liver lesion models (contrast = -15 HU) were inserted into the raw projection data from the patient scans. The projections were then reconstructed with FBP and SAFIRE (strength 5). Also, lesion-less images were reconstructed. Noise, contrast, CNR, and the detectability index of an observer model (non-prewhitening matched filter) were assessed. It was found that SAFIRE reduced noise by 52%, reduced contrast by 12%, increased CNR by 87%, and increased detectability index by 65% compared to FBP. Further, a 2AFC human perception experiment was performed to assess the dose reduction potential of SAFIRE, which was found to be 22% compared to the standard of care dose.

In conclusion, this dissertation provides to the scientific community a series of new methodologies, phantoms, analysis techniques, and modeling tools that can be used to rigorously assess image quality from modern CT systems. Specifically, methods to properly evaluate iterative reconstruction have been developed and are expected to aid in the safe clinical implementation of dose reduction technologies.

Relevance:

10.00%

Publisher:

Abstract:

At 24 stations in the Weser Estuary and the German Bight, the Most Probable Numbers (MPN/g dry wt. sediment) of nitrate-dissimilating (= denitrifying) and of nitrate-plus-nitrite-dissimilating bacteria were recorded. The numbers of nitrite-dissimilating bacteria, i.e., denitrifiers not capable of reducing nitrate to nitrite, were calculated by subtraction of the MPN of nitrate-dissimilating bacteria from the MPN of nitrate-plus-nitrite-dissimilating bacteria. By determining the percentages of these bacteria relative to the number of heterotrophs, the ecological importance of denitrification, especially nitrite dissimilation, was estimated. The results showed the MPN of nitrate-dissimilating bacteria to be in the range of 0-156 (up to 0.8% of heterotrophic bacteria). An exception was the sediment of one station with an MPN of 1849, or 5.2% of the heterotrophs. The amounts of nitrite-dissimilating bacteria were between 0 and 2352 (up to 13% of heterotrophic bacteria). In the estuary, the numbers of nitrate-dissimilating and of nitrite-dissimilating bacteria showed a decreasing tendency with distance from Bremerhaven. The highest numbers were found in the Weser off Bremerhaven and also at 3 stations in the German Bight, south of the Isle of Helgoland.
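The subtraction and percentage calculations described above are straightforward to express; the station values below are hypothetical, not taken from the paper's tables:

```python
def nitrite_only_mpn(mpn_nitrate, mpn_nitrate_plus_nitrite):
    """Nitrite-dissimilating bacteria = (nitrate + nitrite dissimilators)
    minus nitrate dissimilators, as described in the study."""
    return mpn_nitrate_plus_nitrite - mpn_nitrate

def percent_of_heterotrophs(mpn, mpn_heterotrophs):
    """Express an MPN as a percentage of the heterotrophic count."""
    return 100.0 * mpn / mpn_heterotrophs

# Hypothetical station values:
mpn_no3 = 120           # nitrate-dissimilating MPN
mpn_no3_no2 = 1500      # nitrate-plus-nitrite-dissimilating MPN
nitrite_only = nitrite_only_mpn(mpn_no3, mpn_no3_no2)   # 1380
```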

Relevance:

10.00%

Publisher:

Abstract:

Twenty-one core samples from DSDP/IPOD Leg 63 were analyzed for products of chlorophyll diagenesis. In addition to the tetrapyrrole pigments, perylene and carotenoid pigments were isolated and identified. The 16 core samples from the San Miguel Gap site (467) and the five from the Baja California borderland location (471) afforded the unique opportunity of examining tetrapyrrole diagenesis in clay-rich marine sediments that are very high in total organic matter. The chelation reaction, whereby free-base porphyrins give rise to metalloporphyrins (viz., nickel), is well documented within the downhole sequence of sediments from the San Miguel Gap (Site 467). Recognition of unique arrays of highly dealkylated copper and nickel ETIO-porphyrins, exhibiting nearly identical carbon-number homologies (viz., C-23 to C-30; mode = C-26), enabled subtraction of this component (thought to be derived from an allochthonous source) and thus permitted description of the actual in situ diagenesis of autochthonous chlorophyll derivatives.

Relevance:

10.00%

Publisher:

Abstract:

To define specific pathways important in the multistep transformation process of normal plasma cells (PCs) to monoclonal gammopathy of undetermined significance (MGUS) and multiple myeloma (MM), we have applied microarray analysis to PCs from 5 healthy donors (N), 7 patients with MGUS, and 24 patients with newly diagnosed MM. Unsupervised hierarchical clustering using 125 genes with a large variation across all samples defined 2 groups: N and MGUS/MM. Supervised analysis identified 263 genes differentially expressed between N and MGUS and 380 genes differentially expressed between N and MM, 197 of which were also differentially regulated between N and MGUS. Only 74 genes were differentially expressed between MGUS and MM samples, indicating that the differences between MGUS and MM are smaller than those between N and MM or N and MGUS. Differentially expressed genes included oncogenes/tumor-suppressor genes (LAF4, RB1, and disabled homolog 2), cell-signaling genes (RAS family members, B-cell signaling and NF-kappaB genes), DNA-binding and transcription-factor genes (XBP1, zinc finger proteins, forkhead box, and ring finger proteins), and developmental genes (WNT and SHH pathways). Understanding the molecular pathogenesis of MM by gene expression profiling has demonstrated sequential genetic changes from N to malignant PCs and highlighted important pathways involved in the transformation of MGUS to MM.

Relevance:

10.00%

Publisher:

Abstract:

Les maladies cardiovasculaires sont la première cause de mortalité dans le monde et les anévrismes de l’aorte abdominale (AAAs) font partie de ce lot déplorable. Un anévrisme est la dilatation d’une artère pouvant conduire à la mort. Une rupture d’AAA s’avère fatale près de 80% du temps. Un moyen de traiter les AAAs est l’insertion d’une endoprothèse (SG) dans l’aorte, communément appelée la réparation endovasculaire (EVAR), afin de réduire la pression exercée par le flux sanguin sur la paroi. L’efficacité de ce traitement est compromise par la survenue d’endofuites (flux sanguins entre la prothèse et le sac anévrismal) pouvant conduire à la rupture de l’anévrisme. Ces flux sanguins peuvent survenir à n’importe quel moment après le traitement EVAR. Une surveillance par tomodensitométrie (CT-scan) annuelle est donc requise, augmentant ainsi le coût du suivi post-EVAR et exposant le patient à la radiation ionisante et aux complications des contrastes iodés. L’endotension est le concept de dilatation de l’anévrisme sans la présence d’une endofuite apparente au CT-scan. Après le traitement EVAR, le sang dans le sac anévrismal coagule pour former un thrombus frais, qui deviendra progressivement un thrombus plus fibreux et plus organisé, donnant lieu à un rétrécissement de l’anévrisme. Il y a très peu de données dans la littérature pour étudier ce processus temporel et la relation entre le thrombus frais et l’endotension. L’étalon d’or du suivi post-EVAR, le CT-scan, ne peut pas détecter la présence de thrombus frais. Il y a donc un besoin d’investir dans une technique sécuritaire et moins coûteuse pour le suivi d’AAAs après EVAR. Une méthode récente, l’élastographie dynamique, mesure l’élasticité des tissus en temps réel. Le principe de cette technique repose sur la génération d’ondes de cisaillement et l’étude de leur propagation afin de remonter aux propriétés mécaniques du milieu étudié. 
Cette thèse vise l’application de l’élastographie dynamique pour la détection des endofuites ainsi que de la caractérisation mécanique des tissus du sac anévrismal après le traitement EVAR. Ce projet dévoile le potentiel de l’élastographie afin de réduire les dangers de la radiation, de l’utilisation d’agent de contraste ainsi que des coûts du post-EVAR des AAAs. L’élastographie dynamique utilisant le « Shear Wave Imaging » (SWI) est prometteuse. Cette modalité pourrait complémenter l’échographie-Doppler (DUS) déjà utilisée pour le suivi d’examen post-EVAR. Le SWI a le potentiel de fournir des informations sur l’organisation fibreuse du thrombus ainsi que sur la détection d’endofuites. Tout d’abord, le premier objectif de cette thèse consistait à tester le SWI sur des AAAs dans des modèles canins pour la détection d’endofuites et la caractérisation du thrombus. Des SGs furent implantées dans un groupe de 18 chiens avec un anévrisme créé au moyen de la veine jugulaire. 4 anévrismes avaient une endofuite de type I, 13 avaient une endofuite de type II et un anévrisme n’avait pas d’endofuite. Des examens échographiques, DUS et SWI ont été réalisés à l’implantation, puis 1 semaine, 1 mois, 3 mois et 6 mois après le traitement EVAR. Une angiographie, un CT-scan et des coupes macroscopiques ont été produits au sacrifice. Les régions d’endofuites, de thrombus frais et de thrombus organisé furent identifiées et segmentées. Les valeurs de rigidité données par le SWI des différentes régions furent comparées. Celles-ci furent différentes de façon significative (P < 0.001). Également, le SWI a pu détecter la présence d’endofuites où le CT-scan (1) et le DUS (3) ont échoué. Dans la continuité de ces travaux, le deuxième objectif de ce projet fut de caractériser l’évolution du thrombus dans le temps, de même que l’évolution des endofuites après embolisation dans des modèles canins. 
Eighteen aneurysms were created in the iliac arteries of nine canine models, followed by a type I endoleak after EVAR. Two embolizing gels (Chitosan (Chi) or Chitosan-Sodium-Tetradecyl-Sulfate (Chi-STS)) were injected into the aneurysm sac to promote healing. Ultrasound, DUS, and SWI examinations were performed at implantation and after 1 week, 1 month, 3 months, and 6 months. Angiography, a CT-scan, and a histological examination were performed at sacrifice to assess the presence, type, and size of the endoleak. Elastic-modulus values of the regions of interest were identified and segmented against the pathology data. Regions of endoleak and fresh thrombus differed significantly from the other regions (P < 0.001). The elasticity values of fresh thrombus at 1 week and at 3 months indicate that SWI can assess thrombus maturation, as well as characterize the evolution and degradation of the embolizing gels over time. SWI detected endoleaks where DUS had failed (2) and, unlike the CT-scan, detected the presence of fresh thrombus. Finally, the last step of the doctoral project was to apply SWI in a clinical phase, with human patients already carrying an AAA, for endoleak detection and characterization of tissue elasticity. Twenty-five patients were selected to participate in the study, and an imaging comparison was made between SWI, the CT-scan, and DUS. The stiffness values given by SWI for the different regions (endoleak, thrombus) were identified and segmented; these differed significantly (P < 0.001). SWI detected 5 of 6 endoleaks (sensitivity of 83.3%) and had 6 false positives (specificity of 76%). SWI detected endoleaks where the CT-scan (2) and DUS (2) had failed.
There was no notable statistical difference between the thrombus stiffness of AAAs with and without endoleaks. No significant correlation could be established between AAA diameters, or their variations, and thrombus elasticity. SWI has the potential to detect endoleaks and to characterize the thrombus by its mechanical properties. This technique could be incorporated into post-EVAR follow-up of AAAs, complementing DUS imaging and reducing cost and exposure to ionizing radiation and nephrotoxic contrast agents.
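The reported diagnostic figures (sensitivity 83.3%, specificity 76%) follow from the standard definitions. In the sketch below, the true-negative count of 19 is an inference that reproduces the reported specificity given the 6 false positives; the abstract does not state the raw counts, so treat it as an assumption.

```python
# Standard diagnostic-performance metrics from a 2x2 confusion table.

def sensitivity(tp: int, fn: int) -> float:
    """Fraction of true endoleaks that the test detects."""
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    """Fraction of endoleak-free cases the test correctly clears."""
    return tn / (tn + fp)

# Reported clinical phase: 5 of 6 endoleaks detected, 6 false positives.
# A true-negative count of 19 yields the reported 76% specificity
# (inferred, not stated in the abstract).
print(round(sensitivity(5, 1) * 100, 1))   # 83.3
print(round(specificity(19, 6) * 100, 1))  # 76.0
```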

Resumo:

Thesis (Ph.D.)--University of Washington, 2016-08

Resumo:

The fundamental arithmetic laws are central to students' development in algebra. It is therefore important that students are given the opportunity to discern them and develop an understanding of them. This qualitative interview study examined how 16 students in grades 2-5, working from a pre-designed instructional sequence, reason about, generalize, and use the associative property of addition. The study shows that many students do discern the associative property of addition through working with the instructional sequence, but that only a few apply the property when computing. It also shows that, after discerning associativity, several students can generalize the property in relation to addition or subtraction. The study concludes that an instructional sequence offering systematic variation and repetition of expressions with the same structure enables students to discern and describe associativity. Discernment is further enabled by asking students to view the expressions from several perspectives and to describe them in terms of similarities and differences.
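The property the instructional sequence targets can be stated as a worked example. The sketch below mirrors the idea of systematic variation (several expressions sharing one structure) and notes the contrast with subtraction; the particular numbers are illustrative choices, not taken from the study.

```python
# Associative property of addition: regrouping, (a + b) + c = a + (b + c),
# leaves the sum unchanged. Varying one term while keeping the structure
# fixed mirrors the study's "systematic variation" of expressions.

for a, b, c in [(2, 3, 5), (7, 3, 5), (12, 3, 5)]:
    left = (a + b) + c
    right = a + (b + c)
    assert left == right
    print(f"({a} + {b}) + {c} = {a} + ({b} + {c}) = {left}")

# Subtraction, by contrast, is not associative:
print((8 - 3) - 2)  # 3
print(8 - (3 - 2))  # 7
```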