990 results for Digital controllers
Abstract:
A digital differentiator computes the derivative of an input signal. This work presents first-degree and second-degree differentiators designed as both infinite-impulse-response (IIR) and finite-impulse-response (FIR) filters. The proposed differentiators have low-pass magnitude-response characteristics, thereby rejecting noise at frequencies above the cut-off frequency. Both steady-state frequency-domain characteristics and time-domain analyses are given for the proposed differentiators, which are shown to perform well compared with previously proposed filters. In the time-domain analysis, the processing of quantized signals proved especially enlightening with regard to the filtering effects of the proposed differentiators. The coefficients of the proposed differentiators are obtained using an optimization algorithm whose objectives include the magnitude and phase responses; the low-pass characteristic is achieved by minimizing the filter variance. The resulting low-pass differentiators exhibit steep roll-off as well as a highly accurate magnitude response in the pass-band. Although fractional calculus has a history of over three hundred years, the design of fractional differentiators has become a 'hot topic' in recent decades. One challenging problem in this area is that there are many different definitions of the fractional model, such as the Riemann-Liouville and Caputo definitions. Through the use of a feedback structure based on the Riemann-Liouville definition, it is shown that the performance of the fractional differentiator can be improved in both the frequency domain and the time domain. Two applications based on the proposed differentiators are described in the thesis.
Specifically, the first involves the application of second-degree differentiators in the estimation of the frequency components of a power system. The second concerns an image-processing edge-detection application.
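The core idea of a low-pass FIR differentiator can be illustrated with a minimal sketch. The kernel and smoothing window below are generic textbook choices (a central difference plus a moving average), not the optimized coefficients designed in the thesis:

```python
import numpy as np

def fir_differentiator(x, dt=1.0):
    """Central-difference FIR differentiator: y[n] = (x[n+1] - x[n-1]) / (2*dt)."""
    kernel = np.array([1.0, 0.0, -1.0]) / (2.0 * dt)
    return np.convolve(x, kernel, mode="same")

def smoothed_derivative(x, dt=1.0, width=5):
    """Moving-average low-pass stage before differentiation, crudely
    mimicking the noise rejection of a low-pass differentiator."""
    window = np.ones(width) / width
    x_smooth = np.convolve(x, window, mode="same")
    return fir_differentiator(x_smooth, dt)

# A ramp x(t) = 2t has derivative 2 everywhere away from the edges.
t = np.arange(0.0, 10.0, 0.1)
d = fir_differentiator(2.0 * t, dt=0.1)
```

An optimized design would replace the fixed kernel with coefficients chosen to match the ideal magnitude and phase response in the pass-band while rolling off above the cut-off.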
Abstract:
Droplet-based digital microfluidics technology has now come of age, and software-controlled biochips for healthcare applications are starting to emerge. However, today's digital microfluidic biochips suffer from the drawback that there is no feedback to the control software from the underlying hardware platform. Due to the lack of precision inherent in biochemical experiments, errors are likely during droplet manipulation; error recovery based on the repetition of experiments leads to wastage of expensive reagents and hard-to-prepare samples. By exploiting recent advances in the integration of optical detectors (sensors) into a digital microfluidics biochip, we present a physical-aware system reconfiguration technique that uses sensor data at intermediate checkpoints to dynamically reconfigure the biochip. A cyberphysical resynthesis technique is used to recompute electrode-actuation sequences, thereby deriving new schedules, module placement, and droplet routing pathways, with minimum impact on the time-to-response. © 2012 IEEE.
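The control-loop idea behind sensor-driven reconfiguration can be sketched abstractly. The following is a hypothetical illustration of checkpoint-based error recovery (all names are invented), not the authors' cyberphysical resynthesis technique, which recomputes actual electrode-actuation sequences, schedules, and droplet routes:

```python
# Hypothetical sketch: each protocol step runs, a sensor reading at the
# checkpoint decides whether to re-execute just that step rather than
# repeating the entire experiment (saving reagents and samples).

def run_protocol(steps, sensor_ok, max_retries=2):
    """steps: list of step names; sensor_ok(step, attempt) -> bool.
    Returns the log of (step, attempt) pairs actually executed."""
    log = []
    for step in steps:
        for attempt in range(max_retries + 1):
            log.append((step, attempt))
            if sensor_ok(step, attempt):
                break  # checkpoint passed; move on to the next step
        else:
            raise RuntimeError(f"step {step!r} failed after retries")
    return log

# Simulated sensor: the 'mix' step fails once, then succeeds.
flaky = {"mix": 1}
def sensor_ok(step, attempt):
    return attempt >= flaky.get(step, 0)

log = run_protocol(["dispense", "mix", "detect"], sensor_ok)
```

Only the failed step is repeated; the real technique goes further by re-placing modules and re-routing droplets so the retry has minimum impact on time-to-response.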
Abstract:
The advent of digital microfluidic lab-on-a-chip (LoC) technology offers a platform for developing diagnostic applications with the advantages of portability, reduction of the volumes of the sample and reagents, faster analysis times, increased automation, low power consumption, compatibility with mass manufacturing, and high throughput. Moreover, digital microfluidics is being applied in other areas such as airborne chemical detection, DNA sequencing by synthesis, and tissue engineering. In most diagnostic and chemical-detection applications, a key challenge is the preparation of the analyte for presentation to the on-chip detection system. Thus, in diagnostics, raw physiological samples must be introduced onto the chip and then further processed by lysing blood cells and extracting DNA. For massively parallel DNA sequencing, sample preparation can be performed off chip, but the synthesis steps must be performed in a sequential on-chip format by automated control of buffers and nucleotides to extend the read lengths of DNA fragments. In airborne particulate-sampling applications, the sample collection from an air stream must be integrated into the LoC analytical component, which requires a collection droplet to scan an exposed impacted surface after its introduction into a closed analytical section. Finally, in tissue-engineering applications, the challenge for LoC technology is to build high-resolution (less than 10 microns) 3D tissue constructs with embedded cells and growth factors by manipulating and maintaining live cells in the chip platform. This article discusses these applications and their implementation in digital-microfluidic LoC platforms. © 2007 IEEE.
Abstract:
The ability to manipulate small fluid droplets, colloidal particles and single cells with the precision and parallelization of modern-day computer hardware has profound applications for biochemical detection, gene sequencing, chemical synthesis and highly parallel analysis of single cells. Drawing inspiration from general circuit theory and magnetic bubble technology, here we demonstrate a class of integrated circuits for executing sequential and parallel, timed operations on an ensemble of single particles and cells. The integrated circuits are constructed from lithographically defined, overlaid patterns of magnetic film and current lines. The magnetic patterns passively control particles similar to electrical conductors, diodes and capacitors. The current lines actively switch particles between different tracks similar to gated electrical transistors. When combined into arrays and driven by a rotating magnetic field clock, these integrated circuits have general multiplexing properties and enable the precise control of magnetizable objects.
Abstract:
CD8+ T cells are associated with long-term control of virus replication to low or undetectable levels in a population of HIV+ therapy-naïve individuals known as virus controllers (VCs; <5000 RNA copies/ml and CD4+ lymphocyte counts >400 cells/µl). These subjects' ability to control viremia in the absence of therapy makes them the gold standard for the type of CD8+ T-cell response that should be induced by a vaccine. Studying the regulation of CD8+ T-cell responses in these VCs provides the opportunity to discover mechanisms of durable control of HIV-1. Previous research has shown that the CD8+ T-cell population in VCs is heterogeneous in its ability to inhibit virus replication and that distinct T cells are responsible for virus inhibition. Further defining both the functional properties and the regulation of the specific features of the select CD8+ T cells responsible for potent control of viremia in the VCs would enable better evaluation of T cell-directed vaccine strategies and may inform the design of new therapies.
Here we discuss the progress made in elucidating the features and regulation of the CD8+ T-cell response in virus controllers. We first detail the development of assays to quantify the ability of CD8+ T cells to inhibit virus replication. This includes the use of a multi-clade HIV-1 panel that can subsequently be used as a tool for the evaluation of T cell-directed vaccines. We used these assays to evaluate the CD8+ response among cohorts of HIV-1 seronegative, HIV-1 acutely infected, and HIV-1 chronically infected (both VC and chronic viremic) patients. Contact and soluble CD8+ T-cell virus inhibition assays (VIAs) are able to distinguish these patient groups based on the presence and magnitude of the responses. When employed in conjunction with peptide stimulation, the soluble assay reveals that peptide stimulation induces CD8+ T-cell responses with a prevalence of Gag p24 and Nef specificity among the virus controllers tested. Given this prevalence, we aimed to determine the gene expression profiles of Gag p24-, Nef-, and unstimulated CD8+ T cells. RNA was isolated from CD8+ T cells from two virus controllers with strong virus inhibition and one seronegative donor after a 5.5-hour stimulation period and then analyzed using the Illumina Human BeadChip platform (Duke Center for Human Genome Variation). Analysis revealed that 565 (242 Nef and 323 Gag) genes were differentially expressed in CD8+ T cells that were able to inhibit virus replication compared to those that could not. We compared the differentially expressed genes to published data sets from other CD8+ T-cell effector function experiments, focusing our analysis on the most recurring genes with immunological, gene-regulatory, apoptotic, or unknown functions. The most commonly identified gene in these studies was TNFRSF9. Using PCR in a larger cohort of virus controllers, we confirmed the up-regulation of TNFRSF9 in Gag p24- and Nef-specific CD8+ T cell-mediated virus inhibition.
We also observed an increase in the mRNAs encoding the antiviral cytokines macrophage inflammatory proteins (MIP-1α, MIP-1αP, MIP-1β), interferon gamma (IFN-γ), granulocyte-macrophage colony-stimulating factor (GM-CSF), and the recently identified lymphotactin (XCL1).
Our previous work suggests that the CD8+ T-cell response to HIV-1 can be controlled at the level of gene regulation. Because RNA abundance is modulated both by transcription of new mRNAs and by decay of new and existing RNA, we aimed to evaluate the net rates of transcription and mRNA decay for the cytokines we identified as differentially regulated. To estimate the rates of mRNA synthesis and decay, we stimulated isolated CD8+ T cells with Gag p24 and Nef peptides, adding 4-thiouridine (4SU) during the final hour of stimulation to allow separation of RNA made during that hour. Subsequent PCR of RNA isolated from these cells allowed us to determine how much mRNA was made for our genes of interest during the final hour, from which we calculated the rate of transcription. To assess whether stimulation changed RNA stability, we calculated the decay rates of these mRNAs over time. In Gag p24- and Nef-stimulated T cells, the abundance of the mRNA of many of the cytokines examined depended on changes in both transcription and mRNA decay, with evidence for potential differences in the regulation of mRNA between Nef- and Gag-specific CD8+ T cells. The results were highly reproducible: for one subject measured in three independent experiments, the results were concordant.
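The arithmetic behind these estimates can be sketched with a standard first-order decay model. The abundances below are hypothetical round numbers for illustration, not values from the study:

```python
import math

def decay_constant(n0, nt, hours):
    """First-order decay N(t) = N0 * exp(-k*t)  =>  k = ln(N0/Nt) / t."""
    return math.log(n0 / nt) / hours

def half_life(k):
    """Half-life from the decay constant: t_1/2 = ln(2) / k."""
    return math.log(2.0) / k

# Hypothetical abundances (arbitrary units): a transcript drops from
# 100 to 25 units over the 2 h after stimulation ends.
k = decay_constant(100.0, 25.0, 2.0)   # per hour
t_half = half_life(k)                  # hours

# Metabolic labeling: 4SU-labeled RNA captured in the final hour
# approximates new synthesis during that hour, giving a transcription rate.
labeled_last_hour = 30.0               # hypothetical units of labeled RNA
synthesis_rate = labeled_last_hour / 1.0  # units per hour
```

Comparing the synthesis rate with the decay constant before and after stimulation indicates whether a change in steady-state abundance reflects new transcription, stabilized mRNA, or both.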
These data suggest that mRNA stability, in addition to transcription, is key in regulating the direct anti-HIV-1 function of antigen-specific memory CD8+ T cells by enabling rapid recall of anti-HIV-1 effector functions, namely the production and increased stability of antiviral cytokines. We have started to uncover the mechanisms employed by CD8+ T-cell subsets with antigen-specific anti-HIV-1 activity, in turn enhancing our ability to inhibit virus replication by informing both cure strategies and HIV-1 vaccine designs that aim to reduce transmission and aid in blocking HIV-1 acquisition.
Abstract:
CT and digital subtraction angiography (DSA) are ubiquitous in the clinic. Their preclinical equivalents are valuable imaging methods for studying disease models and treatment. We have developed a dual source/detector X-ray imaging system that we have used for both micro-CT and DSA studies in rodents. The control of such a complex imaging system requires substantial software development for which we use the graphical language LabVIEW (National Instruments, Austin, TX, USA). This paper focuses on a LabVIEW platform that we have developed to enable anatomical and functional imaging with micro-CT and DSA. Our LabVIEW applications integrate and control all the elements of our system including a dual source/detector X-ray system, a mechanical ventilator, a physiological monitor, and a power microinjector for the vascular delivery of X-ray contrast agents. Various applications allow cardiac- and respiratory-gated acquisitions for both DSA and micro-CT studies. Our results illustrate the application of DSA for cardiopulmonary studies and vascular imaging of the liver and coronary arteries. We also show how DSA can be used for functional imaging of the kidney. Finally, the power of 4D micro-CT imaging using both prospective and retrospective gating is shown for cardiac imaging.
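The essence of retrospective gating is to time-stamp every projection, assign it a cardiac phase from the recorded ECG, and bin projections by phase for 4D reconstruction. The sketch below is a hypothetical illustration of that binning step, not the paper's LabVIEW implementation:

```python
import bisect

def cardiac_phase(t, r_waves):
    """Phase in [0, 1): fraction of the R-R interval elapsed at time t.
    r_waves: sorted list of R-wave timestamps bracketing t."""
    i = bisect.bisect_right(r_waves, t) - 1
    rr = r_waves[i + 1] - r_waves[i]
    return (t - r_waves[i]) / rr

def bin_projections(timestamps, r_waves, n_bins=10):
    """Group projection indices into n_bins cardiac-phase bins."""
    bins = [[] for _ in range(n_bins)]
    for idx, t in enumerate(timestamps):
        phase = cardiac_phase(t, r_waves)
        bins[int(phase * n_bins)].append(idx)
    return bins

# Hypothetical timing: R-waves every 200 ms (rodent-like heart rate),
# one projection every 35 ms.
r_waves = [i * 0.200 for i in range(20)]
timestamps = [i * 0.035 for i in range(100)]
bins = bin_projections(timestamps, r_waves, n_bins=4)
```

Prospective gating instead triggers acquisition at a chosen phase; retrospective binning trades acquisition simplicity for this sorting step after the fact.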
Abstract:
Using scientific methods in the humanities is at the forefront of objective literary analysis. However, processing big data is particularly complex when the subject matter is qualitative rather than numerical. Large volumes of text require specialized tools to produce quantifiable data from ideas and sentiments. Our team researched the extent to which tools such as Weka and MALLET can test hypotheses about qualitative information. We examined the claim that literary commentary exists within political environments, using US periodical articles concerning Russian literature in the early twentieth century as a case study. These tools generated useful quantitative data that allowed us to run stepwise binary logistic regressions. These statistical tests allowed for time-series experiments using sea-change and emergency models of history, as well as classification experiments with regard to author characteristics, social issues, and sentiment expressed. Both types of experiments supported our claim to varying degrees, but more importantly served as a definitive demonstration that digitally enhanced quantitative forms of analysis can apply to qualitative data. Our findings set the foundation for further experiments in the emerging field of digital humanities.
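The classification step can be sketched in miniature. This is a plain gradient-descent logistic regression on invented feature vectors (imagine per-article topic proportions of the kind MALLET produces); it is an illustration of the statistical technique, not the team's Weka pipeline or their stepwise variable selection:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logistic(X, y, lr=0.5, steps=5000):
    """Binary logistic regression by gradient descent (bias in column 0)."""
    Xb = np.hstack([np.ones((X.shape[0], 1)), X])
    w = np.zeros(Xb.shape[1])
    for _ in range(steps):
        grad = Xb.T @ (sigmoid(Xb @ w) - y) / len(y)
        w -= lr * grad
    return w

def predict(w, X):
    Xb = np.hstack([np.ones((X.shape[0], 1)), X])
    return (sigmoid(Xb @ w) >= 0.5).astype(int)

# Hypothetical data: two topic proportions per article;
# label 1 = commentary framed politically, 0 = purely aesthetic.
X = np.array([[0.9, 0.1], [0.8, 0.3], [0.2, 0.9],
              [0.1, 0.8], [0.7, 0.2], [0.3, 0.7]])
y = np.array([1, 1, 0, 0, 1, 0])
w = fit_logistic(X, y)
```

Stepwise regression adds or drops features by significance; the fitting step inside each candidate model is the same as above.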
Abstract:
High-school students face the use and interpretation of parameters in polynomial functions, geometric loci, and algebraic expressions in general. This leads to the need not only to distinguish parameters from other kinds of literals, such as variables or unknowns, but also to give them a sense of use, with the aim of grouping mathematical objects into more general entities such as families of functions. This workshop aims to show the influence that the use of a dynamic technological resource can have on the understanding of this polysemy of literals, as well as on the refinement of ideas such as generalization.
Abstract:
The study is an action-research project in the area of Informatics in Mathematics Education on learning to learn cooperatively, according to Piaget's Sociological Studies, in a digital mathematics learning space. It was carried out at IFRS – Osório in 2011 and 2012 with 60 students of the technical secondary program in informatics. The central question is how to analyze and understand the process of cooperative learning of mathematics concepts in this space; the definitions of this space and of cooperative learning are themselves results of the research. In addition, the study demonstrates the construction of mathematics concepts and the students' engagement in learning as online digital technologies were incorporated into mathematics classes, under the autonomy and responsibility of each student and/or group.
Abstract:
The theory of meaningful mathematical instruction based on the onto-semiotic model of mathematical cognition known as the Teoría de las Funciones Semióticas (TFS) provides a unified framework for studying the various forms of mathematical knowledge and their interactions within didactic systems (Godino, 1998). We present a development of this theory consisting of the decomposition of an object (in our model, Continuity) into units, in order to identify the entities and semiotic functions established in the teaching and learning process in a school institution, implementing a digital-technology environment (TI-92 Plus and/or Voyage 200 graphing calculators).
Abstract:
In this short course we use various dynamic-geometry applications to carry out geometric constructions in the Poincaré model of hyperbolic geometry, with the aim of investigating and determining the nature of some geometry theorems for secondary and higher education. In this way we classify some theorems of plane geometry as neutral, strictly Euclidean, or strictly hyperbolic.
Abstract:
It is now possible to use powerful general-purpose computer architectures to support post-production of both video and multimedia projects. By devising a suitable portable software architecture and using high-speed networking in an appropriate manner, a system has been constructed in which editors are no longer tied to a specific location. New types of production, such as multi-threaded interactive video, are supported. Editors may also work remotely where a very-high-speed network connection is not currently available. An object-oriented database is used for the comprehensive cataloging of material and to support automatic audio/video object migration and replication. Copyright © 1997 by the Society of Motion Picture and Television Engineers, Inc.