1000 results for "digital dictation"
Abstract:
Droplet-based digital microfluidics technology has now come of age, and software-controlled biochips for healthcare applications are starting to emerge. However, today's digital microfluidic biochips suffer from the drawback that there is no feedback to the control software from the underlying hardware platform. Due to the lack of precision inherent in biochemical experiments, errors are likely during droplet manipulation; error recovery based on the repetition of experiments leads to wastage of expensive reagents and hard-to-prepare samples. By exploiting recent advances in the integration of optical detectors (sensors) into a digital microfluidic biochip, we present a physical-aware system reconfiguration technique that uses sensor data at intermediate checkpoints to dynamically reconfigure the biochip. A cyberphysical resynthesis technique is used to recompute electrode-actuation sequences, thereby deriving new schedules, module placements, and droplet routing pathways, with minimum impact on the time-to-response. © 2012 IEEE.
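To make the checkpoint-driven recovery flow concrete, here is a minimal sketch assuming a hypothetical control API; the functions (execute, read_droplet_volume, resynthesize) and the volume check are illustrative placeholders, not the authors' implementation.

```python
# Minimal sketch of checkpoint-based error recovery on a digital microfluidic
# biochip. All functions, names, and thresholds are hypothetical placeholders.
from dataclasses import dataclass

@dataclass
class Checkpoint:
    step: int                  # index into the electrode-actuation schedule
    expected_volume_nl: float  # volume the on-chip optical sensor should report
    tolerance: float           # acceptable relative deviation

def execute(schedule, up_to):
    """Placeholder: actuate electrodes for the schedule steps up to `up_to`."""

def read_droplet_volume(step):
    """Placeholder: optical detector readout at the checkpoint."""
    return 0.0

def resynthesize(schedule, failed_step):
    """Placeholder for cyberphysical re-synthesis: recompute the schedule,
    module placement, and droplet routes from the failed step onward."""
    return schedule

def run_bioassay(schedule, checkpoints):
    for cp in checkpoints:
        execute(schedule, up_to=cp.step)
        measured = read_droplet_volume(cp.step)
        if abs(measured - cp.expected_volume_nl) / cp.expected_volume_nl > cp.tolerance:
            # Re-plan only the remaining operations instead of repeating the
            # whole experiment, saving reagents and hard-to-prepare samples.
            schedule = resynthesize(schedule, failed_step=cp.step)
    execute(schedule, up_to=len(schedule))
```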
Abstract:
The advent of digital microfluidic lab-on-a-chip (LoC) technology offers a platform for developing diagnostic applications with the advantages of portability, reduced sample and reagent volumes, faster analysis times, increased automation, low power consumption, compatibility with mass manufacturing, and high throughput. Moreover, digital microfluidics is being applied in other areas such as airborne chemical detection, DNA sequencing by synthesis, and tissue engineering. In most diagnostic and chemical-detection applications, a key challenge is the preparation of the analyte for presentation to the on-chip detection system. Thus, in diagnostics, raw physiological samples must be introduced onto the chip and then further processed by lysing blood cells and extracting DNA. For massively parallel DNA sequencing, sample preparation can be performed off chip, but the synthesis steps must be performed in a sequential on-chip format by automated control of buffers and nucleotides to extend the read lengths of DNA fragments. In airborne particulate-sampling applications, the sample collection from an air stream must be integrated into the LoC analytical component, which requires a collection droplet to scan an exposed impacted surface after its introduction into a closed analytical section. Finally, in tissue-engineering applications, the challenge for LoC technology is to build high-resolution (less than 10 microns) 3D tissue constructs with embedded cells and growth factors by manipulating and maintaining live cells in the chip platform. This article discusses these applications and their implementation in digital-microfluidic LoC platforms. © 2007 IEEE.
Abstract:
The ability to manipulate small fluid droplets, colloidal particles and single cells with the precision and parallelization of modern-day computer hardware has profound applications for biochemical detection, gene sequencing, chemical synthesis and highly parallel analysis of single cells. Drawing inspiration from general circuit theory and magnetic bubble technology, here we demonstrate a class of integrated circuits for executing sequential and parallel, timed operations on an ensemble of single particles and cells. The integrated circuits are constructed from lithographically defined, overlaid patterns of magnetic film and current lines. The magnetic patterns passively control particles in a manner analogous to electrical conductors, diodes and capacitors. The current lines actively switch particles between different tracks, much as gated electrical transistors switch current. When combined into arrays and driven by a rotating magnetic field clock, these integrated circuits have general multiplexing properties and enable the precise control of magnetizable objects.
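As a purely conceptual toy model (added here for illustration; it does not reproduce the device physics), the sketch below treats the rotating-field clock as discrete cycles that advance each particle one track site per cycle, with a current-line gate choosing a particle's output track at a junction, in the spirit of a gated transistor.

```python
# Conceptual toy model of clocked particle transport (illustrative only):
# each rotation of the magnetic-field clock advances a particle one site,
# and a current-line gate routes particles at a junction between two tracks.
def clock_cycle(particles, gate_on):
    """Advance every (track, site) pair by one site; at the junction on the
    input track (site 5) the gate selects output track 'A' or 'B'."""
    advanced = []
    for track, site in particles:
        site += 1
        if track == "in" and site == 5:
            track = "B" if gate_on else "A"
            site = 0
        advanced.append((track, site))
    return advanced

particles = [("in", 0), ("in", 2)]      # two particles on the input track
for cycle in range(8):
    gate_on = cycle >= 4                # switch the gate halfway through
    particles = clock_cycle(particles, gate_on)
print(particles)  # the two particles exit on different output tracks ('B' and 'A')
```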
Abstract:
CT and digital subtraction angiography (DSA) are ubiquitous in the clinic. Their preclinical equivalents are valuable imaging methods for studying disease models and treatment. We have developed a dual source/detector X-ray imaging system that we have used for both micro-CT and DSA studies in rodents. The control of such a complex imaging system requires substantial software development for which we use the graphical language LabVIEW (National Instruments, Austin, TX, USA). This paper focuses on a LabVIEW platform that we have developed to enable anatomical and functional imaging with micro-CT and DSA. Our LabVIEW applications integrate and control all the elements of our system including a dual source/detector X-ray system, a mechanical ventilator, a physiological monitor, and a power microinjector for the vascular delivery of X-ray contrast agents. Various applications allow cardiac- and respiratory-gated acquisitions for both DSA and micro-CT studies. Our results illustrate the application of DSA for cardiopulmonary studies and vascular imaging of the liver and coronary arteries. We also show how DSA can be used for functional imaging of the kidney. Finally, the power of 4D micro-CT imaging using both prospective and retrospective gating is shown for cardiac imaging.
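As a hedged illustration of the gating idea (written in Python for brevity, whereas the system described above is implemented in LabVIEW, and with invented phase windows), a prospective acquisition fires the source only when both physiological signals fall inside their acceptance windows:

```python
# Illustrative prospective-gating check (invented phase windows; the real
# control software described above is a LabVIEW application).
def in_window(phase, start, end):
    """True if a normalized physiological phase in [0, 1) lies inside [start, end]."""
    return start <= phase <= end

def acquire_if_gated(cardiac_phase, respiratory_phase, fire_xray, grab_frame):
    # Example policy: expose only near end-diastole and end-expiration.
    if in_window(cardiac_phase, 0.60, 0.70) and in_window(respiratory_phase, 0.85, 0.95):
        fire_xray()          # pulse the X-ray source(s)
        return grab_frame()  # read out the detector
    return None              # outside the window: skip this heartbeat/breath
```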
Abstract:
Using scientific methods in the humanities is at the forefront of objective literary analysis. However, processing big data is particularly complex when the subject matter is qualitative rather than numerical. Large volumes of text require specialized tools to produce quantifiable data from ideas and sentiments. Our team researched the extent to which tools such as Weka and MALLET can test hypotheses about qualitative information. We examined the claim that literary commentary exists within political environments and used US periodical articles concerning Russian literature in the early twentieth century as a case study. These tools generated useful quantitative data that allowed us to run stepwise binary logistic regressions. These statistical tests allowed for time series experiments using sea change and emergency models of history, as well as classification experiments with regard to author characteristics, social issues, and sentiment expressed. Both types of experiments supported our claim to varying degrees, but more importantly served as a definitive demonstration that digitally enhanced quantitative forms of analysis can apply to qualitative data. Our findings lay the foundation for further experiments in the emerging field of digital humanities.
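The general pipeline (text turned into count features, then a binary logistic regression) can be sketched as follows; scikit-learn stands in for Weka and MALLET here, and the four sentences and their labels are invented purely for illustration.

```python
# Illustrative sketch of turning text into quantifiable features and fitting
# a binary logistic regression (scikit-learn stands in for Weka/MALLET here;
# the corpus and labels below are invented).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

docs = [
    "The novel is praised for its craft and universal human themes.",
    "The review dwells on revolution, censorship, and the Soviet state.",
    "A meditation on faith and family in provincial Russia.",
    "Commentary framed entirely by Cold War politics and propaganda.",
]
political = [0, 1, 0, 1]   # 1 = commentary framed politically (invented labels)

X = CountVectorizer(lowercase=True, stop_words="english").fit_transform(docs)
model = LogisticRegression().fit(X, political)
print(model.predict(X))    # sanity check on the tiny training set
```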
Abstract:
Secondary school students face the use and interpretation of parameters in polynomial functions, geometric loci, and algebraic expressions in general. This leads to the need not only to distinguish parameters from other kinds of literals, such as variables or unknowns, but also to give them a sense of use, so that mathematical objects can be grouped into more general entities such as families of functions. This workshop aims to show the influence that a dynamic technological resource can have on understanding this polysemy of literals, as well as on refining ideas such as generalization.
Abstract:
This study is action research in the field of Informatics in Mathematics Education on learning how to learn cooperatively, following Piaget's Sociological Studies, within a digital learning space for Mathematics. It was carried out at IFRS – Osório in 2011 and 2012 with 60 students from the technical high school program in informatics. The central question is how to analyze and understand the process of cooperative learning of Mathematics concepts in this space. The definitions of this space and of cooperative learning are themselves results of the research. In addition, the study demonstrates the construction of Mathematics concepts and the students' engagement in learning as online digital technologies are incorporated into Mathematics classes, under the autonomy and responsibility of each student and/or group.
Abstract:
The theory of meaningful mathematics instruction based on the onto-semiotic model of mathematical cognition known as the Theory of Semiotic Functions (TFS) provides a unified framework for studying the various forms of mathematical knowledge and their interactions within didactic systems (Godino, 1998). We present a development of this theory consisting of decomposing an object, in our case Continuity, into units in order to identify the entities and the semiotic functions that are established during the teaching and learning process in a school institution, implemented in a digital technology environment (TI-92 Plus and/or Voyage 200 graphing calculators).
Abstract:
In this short course we use various dynamic geometry applications to carry out geometric constructions in the Poincaré model of hyperbolic geometry, with the aim of investigating and determining the nature of some geometry theorems for secondary and higher education. In this way we classify some theorems of plane geometry as neutral, strictly Euclidean, or strictly hyperbolic.
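One concrete instance of that classification, added here only as an illustration: the triangle angle-sum theorem is strictly Euclidean, since in the Poincaré disk model the angle sum falls short of π by the triangle's area (for curvature -1); the disk metric below is the one against which such constructions are measured.

```latex
% Angle sum of a triangle: equality holds only in Euclidean geometry.
\[
  \alpha + \beta + \gamma = \pi \quad \text{(Euclidean)}, \qquad
  \alpha + \beta + \gamma = \pi - \operatorname{Area}(\triangle) \quad
  \text{(Poincar\'e disk, curvature } -1\text{)}.
\]
% Metric of the Poincar\'e disk model on the unit disk x^2 + y^2 < 1:
\[
  ds^{2} = \frac{4\,(dx^{2} + dy^{2})}{\left(1 - x^{2} - y^{2}\right)^{2}}.
\]
```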
Abstract:
It is now possible to use powerful general purpose computer architectures to support post-production of both video and multimedia projects. By devising a suitable portable software architecture and using high-speed networking in an appropriate manner, a system has been constructed where editors are no longer tied to a specific location. New types of production, such as multi-threaded interactive video, are supported. Editors may also work remotely at locations where a very high-speed network connection is not currently available. An object-oriented database is used for the comprehensive cataloging of material and to support automatic audio/video object migration and replication. Copyright © 1997 by the Society of Motion Picture and Television Engineers, Inc.
Abstract:
Digital Forestry has been proposed as "the science, technology, and art of systematically acquiring, integrating, analyzing, and applying digital information to support sustainable forests." Although rooted in traditional forestry disciplines, Digital Forestry draws from a host of other fields that, in the past few decades, have become important for implementing the concept of forest ecosystem management and the principle of sustainable forestry. Digital Forestry is a framework that links all facets of forestry information at local, national, and global levels through an organized digital network. It is anticipated that a new set of principles will be established as the Digital Forestry concept is put into practice, guiding the evolution of forestry education, research, and practice as the 21st century unfolds.