973 results for microrna target systems


Relevance:

30.00%

Publisher:

Abstract:

Global connectivity for anyone, at any place, at any time, providing high-speed, high-quality, and reliable communication channels for mobile devices, is now becoming a reality. The credit goes mainly to recent technological advances in wireless communications, which comprise a wide range of technologies, services, and applications (Wi-Fi, WiMAX, and 3G/4G cellular systems) to fulfill the particular needs of end-users in different deployment scenarios. In such a heterogeneous wireless environment, one of the key ingredients for efficient ubiquitous computing with guaranteed quality and continuity of service is the design of intelligent handoff algorithms. Traditional single-metric handoff decision algorithms, such as those based on Received Signal Strength (RSS), are not efficient or intelligent enough to minimize the number of unnecessary handoffs, decision delays, and call-dropping and/or call-blocking probabilities. This research presented a novel approach for the design and implementation of a multi-criteria vertical handoff algorithm for heterogeneous wireless networks. Several parallel Fuzzy Logic Controllers were combined with different ranking algorithms and metric-weighting schemes to implement two major modules: the first estimated the necessity of a handoff, and the second selected the best network as the handoff target. Simulations based on different traffic classes and various types of wireless networks were carried out on a wireless test-bed inspired by the Rudimentary Network Emulator (RUNE) concept. The results indicated that the proposed scheme performed better in terms of minimizing unnecessary handoffs and the call-dropping, call-blocking, and handoff-blocking probabilities. When subjected to conversational traffic and compared against the RSS-based reference algorithm, the proposed scheme, using the FTOPSIS ranking algorithm, reduced the average outage probability of MSs moving at high speed by 17%, the new-call-blocking probability by 22%, the handoff-blocking probability by 16%, and the average handoff rate by 40%. The significant reduction in handoff rate gives the MS more efficient power consumption and longer battery life. These percentages indicate a higher probability of guaranteed session continuity and quality for the currently utilized service, resulting in higher user satisfaction.
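The dissertation pairs fuzzy logic controllers with ranking algorithms such as FTOPSIS; the sketch below shows only the crisp TOPSIS closeness ranking at the core of such schemes (the fuzzification step is omitted). The candidate networks, criteria, and weights are purely illustrative, not taken from the thesis.

```python
import numpy as np

def topsis_rank(matrix, weights, benefit):
    """Rank candidate networks with the TOPSIS closeness coefficient.

    matrix:  one row per candidate network, one column per criterion.
    weights: relative importance of each criterion (sums to 1).
    benefit: True where larger values are better (e.g. bandwidth),
             False where smaller is better (e.g. cost, delay).
    """
    m = np.asarray(matrix, dtype=float)
    # Vector-normalize each criterion column, then apply the weights.
    v = m / np.linalg.norm(m, axis=0) * np.asarray(weights, dtype=float)
    b = np.asarray(benefit)
    ideal = np.where(b, v.max(axis=0), v.min(axis=0))
    anti = np.where(b, v.min(axis=0), v.max(axis=0))
    # Closeness: near the ideal point, far from the anti-ideal point.
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    closeness = d_neg / (d_pos + d_neg)
    return np.argsort(-closeness), closeness

# Illustrative candidates: [RSS (dBm), bandwidth (Mbps), cost, delay (ms)]
nets = [[-60.0, 54.0, 3.0, 40.0],   # Wi-Fi
        [-75.0, 30.0, 2.0, 60.0],   # WiMAX
        [-85.0, 10.0, 5.0, 80.0]]   # cellular
order, scores = topsis_rank(nets, [0.3, 0.3, 0.2, 0.2],
                            [True, True, False, False])
print("best to worst:", order, scores.round(3))
```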

Relevance:

30.00%

Publisher:

Abstract:

The move from Standard Definition (SD) to High Definition (HD) represents a six-fold increase in the amount of data to be processed. With expanding resolutions and evolving compression, there is a need for high-performance, flexible architectures that allow quick upgradability. Technology continues to advance in image display resolution, compression techniques, and video intelligence. Software implementations of these systems can attain accuracy, with trade-offs among processing performance (achieving specified frame rates while working on large image data sets), power, and cost constraints. New architectures are needed to keep pace with the fast innovations in video and imaging. This dissertation contains dedicated hardware implementations of the pixel- and frame-rate processes on a Field Programmable Gate Array (FPGA) to achieve real-time performance. The contributions of the dissertation are as follows. (1) We develop a target detection system by applying a novel running average mean threshold (RAMT) approach to globalize the threshold required for background subtraction. This approach adapts the threshold automatically to different environments (indoor and outdoor) and different targets (humans and vehicles). For low power consumption and better performance, we design the complete system on an FPGA. (2) We introduce a safe-distance factor and develop an algorithm for detecting occlusion occurrence during target tracking. A novel mean threshold is calculated by motion-position analysis. (3) A new strategy for gesture recognition is developed using Combinational Neural Networks (CNN) based on a tree structure. The method is analyzed on American Sign Language (ASL) gestures. We introduce a novel points-of-interest approach to reduce the feature-vector size and a gradient-threshold approach for accurate classification. (4) We design a gesture recognition system using a hardware/software co-simulation neural network for the high speed and low memory-storage requirements provided by the FPGA. We develop an innovative maximum-distance algorithm which uses only 0.39% of the image as the feature vector to train and test the system design. The gestures involved in different applications may vary; it is therefore essential to keep the feature vector as small as possible while maintaining the same accuracy and performance.
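The exact RAMT formulation is the dissertation's contribution and is not reproduced here; the following minimal sketch only illustrates the general idea of background subtraction with a running-average background and a single, globally adaptive threshold. The parameters `alpha` and `k` are illustrative assumptions.

```python
import numpy as np

def update_background(background, frame, alpha=0.05):
    """Exponential running average of the scene background."""
    return (1.0 - alpha) * background + alpha * frame

def foreground_mask(background, frame, k=2.5):
    """Global, scene-adaptive threshold on |frame - background|.

    The threshold is derived from the mean of the absolute difference
    image, loosely in the spirit of a running-average-mean threshold;
    the dissertation's exact RAMT formulation differs.
    """
    diff = np.abs(frame.astype(float) - background)
    threshold = k * diff.mean()      # one global value per frame
    return diff > threshold          # boolean target mask
```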

Relevance:

30.00%

Publisher:

Abstract:

This poster presentation from the May 2015 Florida Library Association Conference, along with the Everglades Explorer discovery portal at http://ee.fiu.edu, demonstrates how traditional bibliographic and curatorial principles can be applied to: 1) the selection, cross-walking and aggregation of metadata linking end-users to widespread digital resources from multiple silos; 2) the harvesting of selected PDFs, HTML and media for web archiving and access; 3) the selection of CMS domains, sub-domains and folders for targeted searching using an API. Choosing content for this discovery portal is comparable to the past scholarly practice of creating and publishing subject bibliographies, except that the metadata and data are housed in relational databases. This new and yet traditional capacity coincides with the growth of bibliographic utilities (MarcEdit), the evolution of open-source discovery systems (eXtensible Catalog), the development of target-capable web crawling and archiving systems (Archive-it), and specialized search APIs (Google). At the same time, historical and technical changes, specifically the increasing fluidity and re-purposing of syndicated metadata, make this possible. It equally stems from the expansion of freely accessible digitized legacy and born-digital resources. Innovation principles helped frame the process by which the thematic Everglades discovery portal was created at Florida International University. The path to more effective searching and co-location of digital scientific, educational and historical material related to the Everglades is contextualized through five concepts found within Dyer and Christensen's "The Innovator's DNA: Mastering the Five Skills of Disruptive Innovators" (2011). The project also aligns with Ranganathan's Laws of Library Science, especially the Fourth Law: "save the time of the user."
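The poster does not name its harvesting machinery; one common route for this kind of cross-silo metadata aggregation is the OAI-PMH protocol, sketched below. The endpoint URL and set name are hypothetical, and the choice of OAI-PMH itself is an assumption, not something the poster states.

```python
import requests
import xml.etree.ElementTree as ET

# Hypothetical OAI-PMH endpoint and set; a portal would aggregate
# records from several such repositories into a relational store.
OAI_URL = "https://repository.example.edu/oai"
OAI_NS = "{http://www.openarchives.org/OAI/2.0/}"
DC = "{http://purl.org/dc/elements/1.1/}"

resp = requests.get(OAI_URL, params={
    "verb": "ListRecords",
    "metadataPrefix": "oai_dc",
    "set": "everglades",   # thematic subset, if the repository defines one
})
root = ET.fromstring(resp.content)

# Pull title and identifier from each Dublin Core record.
for record in root.iter(f"{OAI_NS}record"):
    title = record.findtext(f".//{DC}title")
    ident = record.findtext(f".//{DC}identifier")
    print(title, "->", ident)
```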

Relevance:

30.00%

Publisher:

Abstract:

Existing instrumental techniques must be adaptable to the analysis of novel explosives if science is to keep up with the practices of terrorists and criminals. The focus of this work has been the development of analytical techniques for the analysis of two types of novel explosives: ascorbic acid-based propellants, and improvised mixtures of concentrated hydrogen peroxide/fuel. In recent years, the use of these explosives in improvised explosive devices (IEDs) has increased. It is therefore important to develop methods which permit the identification of the nature of the original explosive from post-blast residues. Ascorbic acid-based propellants are low explosives which employ an ascorbic acid fuel source with a nitrate/perchlorate oxidizer. A method utilizing ion chromatography with indirect photometric detection was optimized for the analysis of intact propellants. Post-burn and post-blast residues of these propellants were analyzed. It was determined that the ascorbic acid fuel and nitrate oxidizer could be detected in intact propellants, as well as in the post-burn and post-blast residues. Degradation products of the nitrate and perchlorate oxidizers were also detected. With a quadrupole time-of-flight mass spectrometer (QToFMS), exact mass measurements are possible. When an HPLC instrument is coupled to a QToFMS, the combination of retention time with accurate mass measurements, mass spectral fragmentation information, and isotopic abundance patterns allows for the unequivocal identification of a target analyte. An optimized HPLC-ESI-QToFMS method was applied to the analysis of ascorbic acid-based propellants. Exact mass measurements were collected for the fuel and oxidizer anions and their degradation products. Ascorbic acid was detected in the intact samples and in half of the propellants subjected to open burning; the intact fuel molecule was not detected in any of the post-blast residues. Two methods were optimized for the analysis of trace levels of hydrogen peroxide: HPLC with fluorescence detection (HPLC-FD) and HPLC with electrochemical detection (HPLC-ED). Both techniques were extremely selective for hydrogen peroxide. Both methods were applied to the analysis of post-blast debris from improvised mixtures of concentrated hydrogen peroxide/fuel; hydrogen peroxide was detected on a variety of substrates. Hydrogen peroxide was also detected in the post-blast residues of the improvised explosives TATP and HMTD.
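As a brief illustration of why exact mass measurements are so discriminating: the sketch below computes the monoisotopic mass of ascorbic acid (C6H8O6) and the m/z of its deprotonated [M-H]- anion, as observed in negative-mode ESI. With a QToFMS tolerance of a few ppm, such values narrow the candidate formulae dramatically. The atomic masses are standard values; the code itself is illustrative, not from the dissertation.

```python
# Monoisotopic atomic masses (u), abbreviated from IUPAC values.
MASS = {"C": 12.0, "H": 1.00782503, "O": 15.99491462}
ELECTRON = 0.00054858

def monoisotopic(formula):
    """Exact (monoisotopic) mass of a formula given as {element: count}."""
    return sum(MASS[el] * n for el, n in formula.items())

ascorbic_acid = {"C": 6, "H": 8, "O": 6}        # C6H8O6
m = monoisotopic(ascorbic_acid)
mz = m - MASS["H"] + ELECTRON                   # deprotonated [M-H]- ion
print(f"C6H8O6 exact mass: {m:.4f} u")          # ~176.0321
print(f"[M-H]- m/z:        {mz:.4f}")           # ~175.0248
```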

Relevance:

30.00%

Publisher:

Abstract:

The maintenance and evolution of software systems has become a highly critical task in recent years due to the diversity of, and high demand for, functionality, devices and users. Understanding and analysing how new changes impact the quality attributes of a system's architecture is an essential prerequisite for avoiding quality deterioration during its evolution. This thesis proposes an automated approach for analysing variation in the performance quality attribute, measured as execution time (response time). It is implemented by a framework that adopts dynamic analysis and software repository mining techniques to provide an automated way of revealing potential sources (commits and issues) of performance variation in scenarios during software system evolution. The approach defines four phases: (i) preparation, choosing the scenarios and preparing the target releases; (ii) dynamic analysis, determining the performance of scenarios and methods by computing their execution times; (iii) variation analysis, processing and comparing the dynamic analysis results across different releases; and (iv) repository mining, identifying the issues and commits associated with the detected performance variation. Empirical studies were performed to evaluate the approach from different perspectives. An exploratory study analysed the feasibility of applying the approach to systems from different domains to automatically identify source code elements with performance variation and the changes that affected those elements during an evolution. This study analysed three systems: (i) SIGAA, a web system for academic management; (ii) ArgoUML, a UML modelling tool; and (iii) Netty, a framework for network applications. Another study performed an evolutionary analysis by applying the approach to multiple releases of Netty and of the web frameworks Wicket and Jetty. In this study, 21 releases (seven of each system) were analysed, totalling 57 scenarios. In summary, 14 scenarios with significant performance variation were found for Netty, 13 for Wicket and 9 for Jetty. Additionally, feedback was obtained from eight developers of these systems through an online form. Finally, in the last study, a performance regression model was developed to indicate the properties of commits that are most likely to cause performance degradation. Overall, 997 commits were mined: 103 were retrieved from degraded source code elements and 19 from optimised ones, while 875 had no impact on execution time. The number of days before the release and the day of the week proved to be the most relevant variables of performance-degrading commits in our model. The area under the Receiver Operating Characteristic (ROC) curve of the regression model is 60%, meaning that using the model to decide whether a commit will cause degradation is 10% better than a random decision.
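As a toy illustration of phase (iii), variation analysis, the sketch below compares one scenario's execution times across two releases. The thesis applies its own statistical criteria to dynamic analysis traces; the timings and the 10% cut-off here are purely illustrative assumptions.

```python
import statistics as st

def flag_variation(times_old, times_new, threshold=0.10):
    """Flag a scenario whose mean execution time shifted by more than
    `threshold` (fractional) between two consecutive releases.

    times_old / times_new: repeated timings (seconds) of the same
    scenario on each release, as produced by dynamic analysis.
    """
    mean_old, mean_new = st.mean(times_old), st.mean(times_new)
    change = (mean_new - mean_old) / mean_old
    return abs(change) > threshold, change

significant, delta = flag_variation([1.20, 1.22, 1.19], [1.55, 1.60, 1.58])
print(significant, f"{delta:+.0%}")   # True +31% -> candidate for mining
```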

Relevance:

30.00%

Publisher:

Abstract:

The presence of high phase noise, in addition to additive white Gaussian noise, in coherent optical systems affects the performance of forward error correction (FEC) schemes. In this paper, we propose a simple scheme for such systems, using block interleavers and binary Bose–Chaudhuri–Hocquenghem (BCH) codes. The block interleavers are specifically optimized for differential quadrature phase-shift keying (DQPSK) modulation. We propose a method for selecting BCH codes that, together with the interleavers, achieve a target post-FEC bit error rate (BER). This combination of interleavers and BCH codes has very low implementation complexity. In addition, our approach is straightforward, requiring only short pre-FEC simulations to parameterize a model, based on which we select codes analytically. We aim to correct a pre-FEC BER of around (Formula presented.). We evaluate the accuracy of our approach using numerical simulations. For a target post-FEC BER of (Formula presented.), codes selected using our method result in BERs around three times the target, and achieve the target with around 0.2 dB of extra signal-to-noise ratio.
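The paper's interleavers are specifically optimized for DQPSK; the sketch below shows only the generic block-interleaving principle (write row-wise, read column-wise) that spreads phase-noise-induced error bursts across BCH codewords. The dimensions are illustrative, not the paper's optimized ones.

```python
def block_interleave(bits, rows, cols):
    """Write row-wise, read column-wise: a classic block interleaver.

    A burst of consecutive channel errors is dispersed so that, after
    deinterleaving, a binary BCH decoder sees isolated errors instead.
    """
    assert len(bits) == rows * cols
    grid = [bits[r * cols:(r + 1) * cols] for r in range(rows)]
    return [grid[r][c] for c in range(cols) for r in range(rows)]

def block_deinterleave(bits, rows, cols):
    """Inverse permutation of block_interleave."""
    assert len(bits) == rows * cols
    grid = [bits[c * rows:(c + 1) * rows] for c in range(cols)]
    return [grid[c][r] for r in range(rows) for c in range(cols)]

data = list(range(12))
assert block_deinterleave(block_interleave(data, 3, 4), 3, 4) == data
```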

Relevance:

30.00%

Publisher:

Abstract:

Brain-computer interfaces (BCI) have the potential to restore communication or control abilities in individuals with severe neuromuscular limitations, such as those with amyotrophic lateral sclerosis (ALS). The role of a BCI is to extract and decode relevant information that conveys a user's intent directly from brain electro-physiological signals and translate this information into executable commands to control external devices. However, the BCI decision-making process is error-prone due to noisy electro-physiological data, representing the classic problem of efficiently transmitting and receiving information via a noisy communication channel.

This research focuses on P300-based BCIs, which rely predominantly on event-related potentials (ERPs) that are elicited as a function of a user's uncertainty regarding stimulus events, in either an acoustic or a visual oddball recognition task. The P300-based BCI system enables users to communicate messages from a set of choices by selecting a target character or icon that conveys a desired intent or action. P300-based BCIs have been widely researched as a communication alternative, especially for individuals with ALS, who represent a target BCI user population. For the P300-based BCI, repeated data measurements are required to enhance the low signal-to-noise ratio of the elicited ERPs embedded in electroencephalography (EEG) data, in order to improve the accuracy of the target character estimation process. As a result, BCIs have relatively slow speeds compared to other commercial assistive communication devices, and this limits BCI adoption by their target user population. The goal of this research is to develop algorithms that take into account the physical limitations of the target BCI population to improve the efficiency of ERP-based spellers for real-world communication.

In this work, it is hypothesised that building adaptive capabilities into the BCI framework can potentially give the BCI system the flexibility to improve performance by adjusting system parameters in response to changing user inputs. The research in this work addresses three potential areas for improvement within the P300 speller framework: information optimisation, target character estimation and error correction. The visual interface and its operation control the method by which the ERPs are elicited through the presentation of stimulus events. The parameters of the stimulus presentation paradigm can be modified to modulate and enhance the elicited ERPs. A new stimulus presentation paradigm is developed in order to maximise the information content that is presented to the user by tuning stimulus paradigm parameters to positively affect performance. Internally, the BCI system determines the amount of data to collect and the method by which these data are processed to estimate the user's target character. Algorithms that exploit language information are developed to enhance the target character estimation process and to correct erroneous BCI selections. In addition, a new model-based method to predict BCI performance is developed, an approach which is independent of stimulus presentation paradigm and accounts for dynamic data collection. The studies presented in this work provide evidence that the proposed methods for incorporating adaptive strategies in the three areas have the potential to significantly improve BCI communication rates, and the proposed method for predicting BCI performance provides a reliable means to pre-assess BCI performance without extensive online testing.
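None of the thesis algorithms are reproduced here; as a toy illustration of how dynamic data collection and language information can enter target character estimation, the sketch below performs a Bayesian update of character probabilities after each flash. The unit-variance Gaussian score likelihoods, the uniform prior, and the 0.9 stopping threshold are assumptions standing in for fitted EEG models and a language-model prior.

```python
import numpy as np

def bayes_update(log_post, score, flashed):
    """One flash of a row/column: raise the posterior of flashed
    characters when the classifier score looks P300-like.

    log_post: log-probabilities over the character grid.
    score:    scalar classifier output for this flash.
    flashed:  boolean mask of the characters just flashed.
    """
    ll_target = -0.5 * (score - 1.0) ** 2   # score ~ N(1, 1) if target flashed
    ll_other = -0.5 * (score - 0.0) ** 2    # score ~ N(0, 1) otherwise
    log_post = log_post + np.where(flashed, ll_target, ll_other)
    return log_post - np.logaddexp.reduce(log_post)   # renormalise

# A language model would replace this uniform prior over a 6x6 grid;
# data collection stops once one character clears the threshold.
log_post = np.full(36, -np.log(36.0))
flashed = np.zeros(36, dtype=bool)
flashed[0:6] = True                          # one row flashes
log_post = bayes_update(log_post, score=0.8, flashed=flashed)
stop = np.exp(log_post).max() > 0.9          # dynamic stopping rule
```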

Relevance:

30.00%

Publisher:

Abstract:

Human motion monitoring is an important function in numerous applications. In this dissertation, two systems for monitoring the motions of multiple human targets in wide-area indoor environments are discussed, both of which use radio frequency (RF) signals to detect, localize, and classify different types of human motion. In the first system, a coherent monostatic multiple-input multiple-output (MIMO) array is used, and a joint spatial-temporal adaptive processing method is developed to resolve micro-Doppler signatures at each location in a wide area for motion mapping. The downranges are obtained by estimating time delays from the targets, and the crossranges are obtained by coherently filtering the array's spatial signals. Motion classification is then applied to each target based on micro-Doppler analysis. In the second system, multiple noncoherent multistatic transmitters (Tx's) and receivers (Rx's) are distributed over a wide area, and motion mapping is achieved by noncoherently combining bistatic range profiles from multiple Tx-Rx pairs. Likewise, motion classification is applied to each target by noncoherently combining bistatic micro-Doppler signatures from multiple Tx-Rx pairs. For both systems, simulation and real-data results demonstrate the ability of the proposed methods to monitor patient repositioning activities for pressure ulcer prevention.
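As a rough illustration of the micro-Doppler signatures the classifiers operate on (not the dissertation's processing chain), the sketch below simulates the phase history of a target with an oscillating limb and exposes its micro-Doppler content with a short-time Fourier transform. All parameters (carrier, sampling rate, motion model) are illustrative assumptions.

```python
import numpy as np
from scipy.signal import stft

fs = 1_000                        # slow-time sampling rate (Hz)
fc, c = 5.8e9, 3e8                # carrier frequency (Hz), speed of light
t = np.arange(0, 2.0, 1.0 / fs)

# Toy target: torso receding at 0.2 m/s with a limb oscillating at 1.5 Hz.
r = 5.0 + 0.2 * t + 0.05 * np.sin(2 * np.pi * 1.5 * t)
echo = np.exp(-1j * 4 * np.pi * fc * r / c)   # phase history of the return

# The spectrogram shows a constant bulk-Doppler line plus the sinusoidal
# micro-Doppler modulation that motion classification feeds on.
f, tt, Z = stft(echo, fs=fs, nperseg=128, return_onesided=False)
spectrogram_db = 20 * np.log10(np.abs(Z) + 1e-12)
```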

Relevance:

30.00%

Publisher:

Abstract:

INTRODUCTION: The development of a therapy for bone metastases is of paramount importance for castration-resistant prostate cancer (CRPC). The osteomimetic properties of CRPC confer a propensity to metastasize to osseous sites. Micro-ribonucleic acids (miRNAs) are non-coding RNAs that act as post-transcriptional regulators of multiple proteins and their associated pathways. Therefore, the identification of miRNAs could reveal a valid third-generation therapy for CRPC. Areas covered: miR34a has been found to play an integral role in the progression of prostate cancer, particularly in the regulation of metastatic genes involved in migration, intravasation, extravasation, bone attachment and bone homeostasis. The correlation between miR34a down-regulation and metastatic progression has generated substantial interest in this field. Expert opinion: Examination of the evidence reveals that miR34a is an ideal target for gene therapy for metastatic CRPC. We also conclude that future studies should focus on the effects of miR34a upregulation in CRPC with respect to migration, translocation to the bone micro-environment and osteomimetic phenotype development. The success of miR34a as a therapeutic is reliant on the development of appropriate delivery systems and targeting to the bone micro-environment. In tandem with any therapeutic studies, biomarker serum levels should also be ascertained as an indicator of successful miR34a delivery.

Relevance:

30.00%

Publisher:

Abstract:

Network security monitoring remains a challenge. As global networks scale up in terms of traffic volume and speed, effective attribution of cyber attacks is increasingly difficult. The problem is compounded by a combination of other factors, including the architecture of the Internet, multi-stage attacks and increasing volumes of nonproductive traffic. This paper proposes to shift the focus of security monitoring from the source to the target. Simply put, resources devoted to detection and attribution should be redeployed to efficiently monitor for targeting and to prevent attacks. The effort of detection should aim to determine whether a node is under attack and, if so, to effectively prevent the attack. This paper contributes by systematically reviewing the structural, operational and legal reasons underlying this argument, and presents empirical evidence to support a shift away from attribution in favour of a target-centric monitoring approach. A carefully deployed set of experiments is presented and a detailed analysis of the results is given.

Relevance:

30.00%

Publisher:

Abstract:

Resource management policies are frequently designed and planned to target the specific needs of particular sectors, without taking into account the interests of other sectors that share the same resources. In a climate of resource depletion, population growth, increasing energy demand and climate change awareness, it is of great importance to promote the assessment of intersectoral linkages and, by doing so, understand their effects and implications. This need is further heightened when the shared use of resources is relevant not only at the national level but also when resources are distributed across different nations. This dissertation focuses on the study of the energy systems of five south-eastern European countries, which share the Sava River Basin (SRB), using a water-food(agriculture)-energy nexus approach. In the case of the electricity generation sector, the use of water is essential for the integrity of the energy systems, as electricity production in the riparian countries relies on two major technologies dependent on water resources: hydro and thermal power plants. For example, in 2012, an average of 37% of the electricity production in the SRB countries was generated by hydropower and 61% by thermal power plants. Focusing on the SRB, in terms of existing installed capacities, the basin accommodates close to a tenth of all hydropower capacity while providing cooling water to 42% of the net thermal power capacity currently in operation in the basin. This energy-oriented nexus study explores the dependency of the region's energy systems on the basin's water resources for the period between 2015 and 2030. To do so, a multi-country electricity model was developed to provide a quantitative grounding for the analysis, using the open-source modelling tool OSeMOSYS. Three main areas are subject to analysis: first, the impact of energy efficiency and renewable energy strategies on the electricity generation mix; second, the potential impacts of climate change under a moderate climate change projection scenario; and finally, deriving from the latter point, the cumulative impact of an increase in water demand for irrigation in the agriculture sector. Additionally, electricity trade dynamics are compared across the different scenarios under scrutiny, as an effort to investigate the implications of the aforementioned factors for the electricity markets in the region.

Relevance:

30.00%

Publisher:

Abstract:

The CQ Cotton Regional Extension project has been key to delivering emerging, cutting-edge research information and knowledge to the Central Queensland (CQ) cotton industry. The direct relevance of southern research to cotton production under the conditions experienced in CQ has always been an issue, one which could be addressed through regional assessment and adaptation. The project links national research to the region through development and extension, with a strong focus on the major industry production issues including, but not limited to, disease, Integrated Pest Management (IPM), soils, nutrition and integrated weed management. Susan Mass has supported the implementation of national industry-wide programs, particularly the industry Best Management Practices program (myBMP). This project has successfully transitioned to a focus on delivering national outcomes in target lead areas as part of the National Development and Delivery Team established by the Cotton CRC, CRDC and Cotton Australia, while maintaining a regional extension presence for Central Queensland cotton and grain farming systems. Susan Mass has very effectively merged strong regional extension support for cotton growers in Central Queensland with the delivery of industry extension priorities across the entire industry under the Development and Delivery Team model. Susan is the target lead for disease and farm hygiene. Recognising the challenge of having regionally relevant research in Central Queensland, this project has facilitated locally based research, including work on boll rot, Bt cotton resistance management and mealybug biology, through strong collaborations. This collaborative approach has included linkages to Department of Environment and Resource Management (DERM) groups and myBMP programs, resulting in high uptake in CQ.

Relevance:

30.00%

Publisher:

Abstract:

Two ideas taken from Bayesian optimization and classifier systems are presented for personnel scheduling, based on choosing a suitable scheduling rule from a set for each person's assignment. Unlike our previous work using genetic algorithms, in which learning is implicit, the learning in both approaches is explicit; i.e., we are able to identify building blocks directly. To achieve this target, the Bayesian optimization algorithm builds a Bayesian network of the joint probability distribution of the rules used to construct solutions, while the adapted classifier system assigns each rule a strength value that is constantly updated according to its usefulness in the current situation. Computational results from 52 real data instances of nurse scheduling demonstrate the success of both approaches. It is also suggested that the learning mechanisms in the proposed approaches may be suitable for other scheduling problems.
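A minimal sketch of the classifier-system half of the paper: each rule carries a strength, rules are chosen in proportion to strength, and the chosen rule's strength is updated according to the reward its assignment earns. The rule names, reward scale and learning rate here are illustrative assumptions, not the paper's.

```python
import random

# Hypothetical scheduling rules; the paper draws its rules from
# nurse-scheduling heuristics, but these names are illustrative only.
RULES = ["cheapest_assignment", "highest_cover", "least_violation"]

def choose_rule(strength):
    """Roulette-wheel selection: stronger rules are chosen more often."""
    pick, acc = random.uniform(0, sum(strength.values())), 0.0
    for rule, s in strength.items():
        acc += s
        if pick <= acc:
            return rule
    return rule   # guard against floating-point round-off

def reinforce(strength, rule, reward, beta=0.2):
    """Classifier-system style update: move the chosen rule's strength
    toward the reward its assignment just earned."""
    strength[rule] += beta * (reward - strength[rule])

strength = {r: 1.0 for r in RULES}
rule = choose_rule(strength)
reinforce(strength, rule, reward=1.5)   # reward from schedule quality
```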

Relevance:

30.00%

Publisher:

Abstract:

Coronary heart disease is a major cause of morbidity and mortality worldwide. Percutaneous coronary intervention (PCI) has become the most widely used method of coronary artery revascularisation. The use of stents to hold open atherosclerosis-induced arterial narrowing has significantly reduced elastic recoil and acute vessel occlusion following balloon angioplasty. However, bare metal stents have been associated with in-stent restenosis (ISR), attributed to vascular smooth muscle cell (VSMC) hyperplasia and excessive neointimal formation. The resultant luminal renarrowing may manifest clinically with the return of symptoms such as chest pain or shortness of breath. The development of drug-eluting stents has significantly reduced the incidence of ISR. Unfortunately, the antiproliferative medications used inhibit not only VSMC proliferation but also re-endothelialisation of the stented vessel. In addition, the drug-impregnated polymer coating has been associated with a chronic inflammatory response within the vessel wall, predisposing patients to stent thrombosis. Thus the identification of novel therapies which promote vessel healing without an excessive proliferative or inflammatory response may improve long-term outcomes and reduce the need for repeated revascularisation. MicroRNAs (miRs) are short (18-25 nucleotide) non-coding RNAs that regulate gene expression. By binding to the 3' untranslated region of mRNAs, they fine-tune gene expression through either mRNA degradation or translational repression. Originally identified as coordinators of tissue development, microRNAs have also been shown to play important roles in coordinating the inflammatory response and in numerous cardiovascular diseases. MiR-21 has been identified in human atherosclerotic plaques, arteriosclerosis obliterans and abdominal aortic aneurysms. In addition, its upregulation has been documented in preclinical models of vascular injury. This study sought to identify the role of miR-21 in the development of ISR. Utilising a small-animal model of stenting and in vitro techniques, we sought to investigate its influence upon the VSMC and immune cell responses following stenting. The refinement of a murine stenting model within the Baker laboratory, and the electrochemical dissolution of the metal stent from within harvested vascular tissues, significantly improved the ability to perform detailed histological analysis. In addition, identification of miRNAs using in situ hybridisation was achieved for the first time within stented tissue. Neointimal formation and ISR were significantly reduced in mice in which miR-21 had been genetically deleted. In addition, neointimal composition was altered in miR-21 knockout (KO) mice, with demonstrated reductions in VSMC and elastin content. Importantly, no difference in re-endothelialisation was observed. In vitro analysis demonstrated that VSMCs from miR-21 KO mice had both reduced proliferative and reduced migratory capacity following platelet-derived growth factor stimulation. Molecular analysis revealed that these differences may, at least in part, be due to de-repression of programmed cell death 4 (PDCD4), a known miR-21 target within VSMCs implicated in the suppression of proliferation and the promotion of apoptosis. Unfortunately, initial attempts at antimiR-mediated knockdown of miR-21 in vivo failed to produce a similar suppression of ISR.
Furthermore, a significant alteration in macrophage polarisation state within the neointima of miR-21 wild-type (WT) and KO mice was noted. Immunohistochemical staining revealed a preponderance of anti-inflammatory M2 macrophages in KO mice. Analysis of bone marrow-derived macrophages from miR-21 KO mice demonstrated an increased level of peroxisome proliferator-activated receptor-γ (PPARγ), which facilitates M2 polarisation. Importantly, significant alterations in numerous pro-inflammatory cytokines, which also have mitogenic effects, were found following genetic deletion of miR-21. In summary, this is the first study to examine miRs in the development of ISR. MiR-21 plays an important role in the development of ISR by influencing the proliferative response of VSMCs and modulating the immune response following stent deployment. Further attempts to modulate miR-21 expression following PCI may reduce ISR and the need for repeat revascularisation, while also reducing the risk of stent thrombosis.