941 results for Active Flow Control


Relevance:

30.00%

Publisher:

Abstract:

Recent advances in tissue engineering and regenerative medicine have shown that controlling the cell microenvironment during growth is a key element in developing successful therapeutic systems. To achieve such control, researchers first proposed polymeric scaffolds able to support cellular growth and, to a certain extent, favor cell organization and tissue structure. With the large pool of stem cell lines available today, this approach has proved rather limited, since it does not offer fine control of the cell microenvironment in space and time (4D). Researchers are therefore currently focusing their efforts on strategies that include active compound delivery systems, adding a fourth dimension to the design of 3D scaffolds. This review focuses on recent concepts and applications of 2D and 3D techniques used to control the loading and release of active compounds that promote cell differentiation and proliferation in or out of a scaffold. We first present recent advances in the design of 2D polymeric scaffolds and the different techniques used to deposit molecular cues and cells in a controlled fashion. We then present recent advances in the design of 3D scaffolds based on hydrogels as well as polymeric fibers, and we finish by presenting some of the research avenues that remain to be explored.

Relevance:

30.00%

Publisher:

Abstract:

Embedded systems are usually designed for a single task or a specified set of tasks. This specificity means the system design, as well as its hardware/software development, can be highly optimized. Embedded software must meet requirements such as highly reliable operation on resource-constrained platforms, real-time constraints, and rapid development. This necessitates the adoption of static machine-code analysis tools running on a host machine for the validation and optimization of embedded system code, which can help meet all of these goals. Such analysis could significantly improve software quality and is still a challenging field.

This dissertation contributes an architecture-oriented code validation, error localization, and optimization technique that assists the embedded system designer in software debugging, making early detection of otherwise hard-to-detect software bugs more effective through static analysis of machine code. The focus of this work is to develop methods that automatically localize faults as well as optimize the code, and thus improve both the debugging process and the quality of the code. Validation is done with the help of rules of inference formulated for the target processor. The rules govern the occurrence of illegitimate or out-of-place instructions and code sequences for executing the computational and integrated peripheral functions. The stipulated rules are encoded in propositional logic formulae, and their compliance is tested individually in all possible execution paths of the application programs. An incorrect sequence of machine-code patterns is identified using slicing techniques on the control flow graph generated from the machine code. An algorithm is proposed to assist the compiler in eliminating redundant bank-switching code and deciding on the optimum allocation of data to banked memory, resulting in the minimum number of bank-switching instructions in embedded system software.

A relation matrix and a state transition diagram, formed for the active memory bank state transitions corresponding to each bank selection instruction, are used for the detection of redundant code; instances of code redundancy are identified based on the stipulated rules for the target processor. This validation and optimization tool can be integrated into the system development environment. It is a novel approach, independent of compiler and assembler and applicable to a wide range of processors once appropriate rules are formulated. Program states are identified mainly by machine-code pattern, which drastically reduces state-space creation and contributes to improved model checking. Though the technique described is general, the implementation is architecture oriented, and hence the feasibility study is conducted on PIC16F87X microcontrollers. The proposed tool will be very useful in steering novices towards correct use of difficult microcontroller features in developing embedded systems.
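The active-bank state tracking behind redundant bank-switch detection can be illustrated with a small sketch. This is our simplification, not the dissertation's tool: it walks a straight-line instruction sequence (no control flow graph or relation matrix), and the program listing and operands are hypothetical.

```python
# Hypothetical sketch of redundant bank-select detection for
# banked-memory processors. Instructions are (mnemonic, operand)
# pairs; "BANKSEL n" switches the active memory bank, all other
# instructions leave the bank state unchanged.

def find_redundant_bank_switches(program, initial_bank=0):
    """Return indices of BANKSEL instructions that re-select the
    bank that is already active (candidates for elimination)."""
    redundant = []
    bank = initial_bank
    for i, (mnemonic, operand) in enumerate(program):
        if mnemonic == "BANKSEL":
            if operand == bank:        # state unchanged: redundant code
                redundant.append(i)
            bank = operand             # state transition of active bank
    return redundant

program = [
    ("BANKSEL", 1),   # switch to bank 1
    ("MOVWF",   0x85),
    ("BANKSEL", 1),   # redundant: bank 1 already active
    ("MOVWF",   0x86),
    ("BANKSEL", 0),   # necessary switch back to bank 0
]
print(find_redundant_bank_switches(program))  # → [2]
```

A real tool must track the bank state along every path of the control flow graph and merge states at join points; the linear scan above only covers basic blocks.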

Relevance:

30.00%

Publisher:

Abstract:

This work identifies the importance of plenum pressure to the performance of the data centre. The methodology currently followed in industry treats the pressure drop across the tile as a dependent variable, but this work shows that it is the one independent variable responsible for the entire flow dynamics in the data centre, and any design or assessment procedure must treat the pressure difference across the tile as the primary independent variable. This concept is further explained through studies of the effect of dampers on the flow characteristics. Dampers were found to introduce an additional pressure drop, thereby reducing the effective pressure drop across the tile. A damper changes the flow in both quantitative and qualitative terms, but only the quantitative effect is considered when a damper is used as an aid for capacity control. Results from the present study suggest that dampers should be avoided in data centres, and well-designed tiles that give the required flow rates should be used in the appropriate locations. The present study also examines the effect of hot-air recirculation under suitable assumptions. It identifies the pressure drop across the tile as a dominant parameter governing recirculation: the rack suction pressure of the hardware, together with the pressure drop across the tile, determines the point of recirculation in the cold aisle, and the positioning of hardware in the racks plays an important role in controlling that point. The present study is thus helpful in designing data centre air flow based on the theory of jets; the air flow can be modelled both quantitatively and qualitatively from these results.
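The link between tile pressure drop and delivered air flow can be illustrated with the standard orifice-flow relation Q = Cd · A · sqrt(2Δp/ρ). This is a generic textbook model, not the study's methodology, and the discharge coefficient, open area, and pressure values below are made-up numbers:

```python
import math

def tile_flow_rate(delta_p, open_area, discharge_coeff=0.65, rho=1.2):
    """Volumetric flow (m^3/s) through a perforated tile modelled
    as an orifice: Q = Cd * A * sqrt(2 * dp / rho).
    delta_p: pressure drop across the tile (Pa)
    open_area: total open area of the tile (m^2)"""
    return discharge_coeff * open_area * math.sqrt(2.0 * delta_p / rho)

# A damper in series consumes part of the plenum pressure, so the
# effective drop across the tile (and hence the flow) falls.
q_no_damper = tile_flow_rate(delta_p=25.0, open_area=0.09)
q_damper    = tile_flow_rate(delta_p=25.0 - 10.0, open_area=0.09)
print(round(q_no_damper, 4), round(q_damper, 4))  # → 0.3776 0.2925
```

The square-root dependence is why treating Δp across the tile as the primary independent variable fixes the flow rate directly, as the abstract argues.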

Relevance:

30.00%

Publisher:

Abstract:

Quantum technology, exploiting entanglement and the wave nature of matter, relies on the ability to accurately control quantum systems. Quantum control is often compromised by the interaction of the system with its environment, since this causes loss of amplitude and phase. However, when the dynamics of the open quantum system is non-Markovian, amplitude and phase flow not only from the system into the environment but also back. Interaction with the environment is then not necessarily detrimental. We show that the back-flow of amplitude and phase can be exploited to carry out quantum control tasks that could not be realized if the system were isolated. The control is facilitated by a few strongly coupled, sufficiently isolated environmental modes. Our paradigmatic example considers a weakly anharmonic ladder with resonant amplitude control only, restricting realizable operations to SO(N). The coupling to the environment, when harnessed with optimization techniques, allows for full SU(N) controllability.

Relevance:

30.00%

Publisher:

Abstract:

This report presents a design for a new type of robot end-effector with inherent mechanical grasping capabilities. Concentrating on an end-effector that grasps a simple class of objects (cylinders) allowed a design with only one degree of actuation. The key features of this design are high-bandwidth response to forces, passive grasping capabilities, ease of control, and the ability to wrap around objects with simple geometries, providing form closure. A prototype of this mechanism was built to evaluate these features.

Relevance:

30.00%

Publisher:

Abstract:

The control of aerial gymnastic maneuvers is challenging because these maneuvers frequently involve complex rotational motion and because the performer has limited control of the maneuver during flight. A performer can influence a maneuver using a sequence of limb movements during flight. However, the same sequence may not produce reliable performances in the presence of off-nominal conditions. How do people compensate for variations in performance to reliably produce aerial maneuvers? In this report I explore the role that passive dynamic stability may play in making the performance of aerial maneuvers simple and reliable. I present a control strategy composed of active and passive components for performing robot front somersaults in the laboratory. I show that passive dynamics can neutrally stabilize the layout somersault, which involves an "inherently unstable" rotation about the intermediate principal axis. And I show that a strategy that uses open-loop joint torques plus passive dynamics leads to more reliable 1 1/2-twisting front somersaults in simulation than a strategy that uses prescribed limb motion. Results are presented from laboratory experiments on gymnastic robots, from dynamic simulation of humans and robots, and from linear stability analyses of these systems.
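The "inherently unstable" rotation about the intermediate principal axis can be seen in a minimal simulation of the torque-free Euler equations. This is an illustration of the instability itself, not the report's robot or human models; the inertia values and time step are arbitrary:

```python
# Torque-free rigid-body rotation: a tiny perturbation of spin about
# the intermediate principal axis grows into a large wobble, while
# spin about the major axis stays stable (forward-Euler sketch).

def euler_step(w, I, dt):
    w1, w2, w3 = w
    I1, I2, I3 = I
    dw1 = (I2 - I3) / I1 * w2 * w3
    dw2 = (I3 - I1) / I2 * w3 * w1
    dw3 = (I1 - I2) / I3 * w1 * w2
    return (w1 + dt * dw1, w2 + dt * dw2, w3 + dt * dw3)

def max_wobble(w0, I=(1.0, 2.0, 3.0), dt=1e-3, steps=30000):
    """Largest |w1| seen along the trajectory."""
    w, peak = w0, abs(w0[0])
    for _ in range(steps):
        w = euler_step(w, I, dt)
        peak = max(peak, abs(w[0]))
    return peak

# Spin about the intermediate axis (I2): the 1e-3 perturbation
# grows to order 1; spin about the major axis (I3) stays small.
print(max_wobble((1e-3, 1.0, 1e-3)))
print(max_wobble((1e-3, 1e-3, 1.0)))
```

The growth rate of the perturbation is sqrt((I2 - I1)(I3 - I2)/(I1 I3)) times the spin rate, which is why a layout somersault needs some stabilizing mechanism, passive or active.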

Relevance:

30.00%

Publisher:

Abstract:

In this text, we present two stereo-based head tracking techniques along with a fast 3D model acquisition system. The first tracking technique is a robust implementation of stereo-based head tracking designed for interactive environments with uncontrolled lighting. We integrate fast face detection and drift reduction algorithms with a gradient-based stereo rigid motion tracking technique. Our system can automatically segment and track a user's head under large rotation and illumination variations. Precision and usability of this approach are compared with previous tracking methods for cursor control and target selection in both desktop and interactive room environments. The second tracking technique is designed to improve the robustness of head pose tracking for fast movements. Our iterative hybrid tracker combines constraints from the ICP (Iterative Closest Point) algorithm and normal flow constraint. This new technique is more precise for small movements and noisy depth than ICP alone, and more robust for large movements than the normal flow constraint alone. We present experiments which test the accuracy of our approach on sequences of real and synthetic stereo images. The 3D model acquisition system we present quickly aligns intensity and depth images, and reconstructs a textured 3D mesh. 3D views are registered with shape alignment based on our iterative hybrid tracker. We reconstruct the 3D model using a new Cubic Ray Projection merging algorithm which takes advantage of a novel data structure: the linked voxel space. We present experiments to test the accuracy of our approach on 3D face modelling using real-time stereo images.
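As a point of reference for the ICP half of the hybrid tracker, a minimal 2D rigid ICP can be sketched as follows. This is the generic algorithm (nearest-neighbour matching alternated with a closed-form Procrustes alignment), not the paper's 3D implementation or its normal flow constraint:

```python
import math

def best_rigid_transform(src, dst):
    """Closed-form 2D rigid transform (theta, t) minimizing
    sum |R src_i + t - dst_i|^2 over matched pairs."""
    n = len(src)
    cx = sum(p[0] for p in src) / n; cy = sum(p[1] for p in src) / n
    dx = sum(q[0] for q in dst) / n; dy = sum(q[1] for q in dst) / n
    s_dot = s_cross = 0.0
    for (px, py), (qx, qy) in zip(src, dst):
        ax, ay = px - cx, py - cy          # centre both point sets
        bx, by = qx - dx, qy - dy
        s_dot += ax * bx + ay * by
        s_cross += ax * by - ay * bx
    theta = math.atan2(s_cross, s_dot)     # optimal rotation angle
    c, s = math.cos(theta), math.sin(theta)
    return theta, (dx - (c * cx - s * cy), dy - (s * cx + c * cy))

def icp_2d(src, dst, iters=20):
    """Basic ICP: alternate nearest-neighbour matching with the
    closed-form alignment above."""
    cur = list(src)
    for _ in range(iters):
        pairs = [min(dst, key=lambda q: (q[0]-p[0])**2 + (q[1]-p[1])**2)
                 for p in cur]
        theta, (tx, ty) = best_rigid_transform(cur, pairs)
        c, s = math.cos(theta), math.sin(theta)
        cur = [(c*x - s*y + tx, s*x + c*y + ty) for x, y in cur]
    return cur

# Recover a known rotation (0.3 rad) and translation (0.5, -0.2).
c, s = math.cos(0.3), math.sin(0.3)
src = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (2.0, 2.0), (3.0, 1.0)]
dst = [(c * x - s * y + 0.5, s * x + c * y - 0.2) for x, y in src]
out = icp_2d(src, dst)
print(max(abs(ox - qx) + abs(oy - qy)
          for (ox, oy), (qx, qy) in zip(out, dst)) < 1e-6)  # → True
```

ICP of this kind is precise for small motions but loses the correct correspondences under fast, large movements, which is exactly the failure mode the paper's normal-flow constraint is meant to compensate.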

Relevance:

30.00%

Publisher:

Abstract:

Electroosmotic flow is a convenient mechanism for transporting polar fluid in a microfluidic device. The flow is generated through the application of an external electric field that acts on the free charges that exist in a thin Debye layer at the channel walls. The charge on the wall is due to the chemistry of the solid-fluid interface, and it can vary along the channel, e.g. due to modification of the wall. This investigation focuses on the simulation of the electroosmotic flow (EOF) profile in a cylindrical microchannel with a step change in zeta potential. The modified Navier-Stokes equation governing the velocity field and a non-linear two-dimensional Poisson-Boltzmann equation governing the electrical double-layer (EDL) field distribution are solved numerically using a finite control-volume method. Continuity of flow rate and electric current is enforced, resulting in a non-uniform electric field and pressure gradient distribution along the channel. The resulting parabolic velocity distribution at the junction of the step change in zeta potential, which is more typical of a pressure-driven flow profile, is obtained.
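A reduced version of the electrical double-layer problem makes the numerics concrete. The sketch below is not the paper's nonlinear 2D cylindrical solver: it solves the linearized (Debye-Hückel) 1D analogue ψ'' = κ²ψ on a slit channel by finite differences and checks the midplane value against the analytic solution ψ = ζ cosh(κx)/cosh(κh); the parameter values are illustrative:

```python
import math

def edl_potential_fd(zeta, kappa, h, n=201):
    """Finite-difference solution of psi'' = kappa^2 psi on
    -h <= x <= h with psi(+-h) = zeta (Thomas algorithm)."""
    dx = 2 * h / (n - 1)
    # Interior stencil: psi[i-1] - (2 + (kappa*dx)^2) psi[i] + psi[i+1] = 0
    a, b, c = 1.0, -(2.0 + (kappa * dx) ** 2), 1.0
    n_in = n - 2
    rhs = [0.0] * n_in
    rhs[0] -= a * zeta          # Dirichlet BCs folded into the RHS
    rhs[-1] -= c * zeta
    cp, dp = [0.0] * n_in, [0.0] * n_in
    cp[0], dp[0] = c / b, rhs[0] / b
    for i in range(1, n_in):    # forward sweep
        m = b - a * cp[i - 1]
        cp[i] = c / m
        dp[i] = (rhs[i] - a * dp[i - 1]) / m
    psi = [0.0] * n_in
    psi[-1] = dp[-1]
    for i in range(n_in - 2, -1, -1):  # back substitution
        psi[i] = dp[i] - cp[i] * psi[i + 1]
    return [zeta] + psi + [zeta]

zeta, kappa, h = -0.025, 3e7, 1e-7   # 25 mV wall, ~33 nm Debye length
psi = edl_potential_fd(zeta, kappa, h)
mid_exact = zeta / math.cosh(kappa * h)
print(abs(psi[len(psi) // 2] - mid_exact) < 1e-4)  # → True
```

A step change in zeta potential would make ζ a function of the axial position, coupling this wall problem to the axial field and pressure distributions the paper computes.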

Relevance:

30.00%

Publisher:

Abstract:

TCP flows from applications such as the web or FTP are well supported by a Guaranteed Minimum Throughput Service (GMTS), which provides a minimum network throughput to the flow and, if possible, extra throughput. We propose a scheme for a GMTS using Admission Control (AC) that is able to provide different minimum throughputs to different users and that is suitable for "standard" TCP flows. Moreover, we consider a multidomain scenario where the scheme is used in one of the domains, and we propose mechanisms for the interconnection with neighbouring domains. The whole scheme uses a small set of packet classes in a core-stateless network, where each class has a different discarding priority in the queues assigned to it. The AC method involves only edge nodes and uses a special probing packet flow (marked as the highest discarding priority class) that is sent continuously from ingress to egress through a path. The available throughput in the path is obtained at the egress using measurements of flow aggregates and is then sent back to the ingress. At the ingress, each flow is detected implicitly and then admission controlled. If accepted, it receives the GMTS and its packets are marked as the lowest discarding priority classes; otherwise, it receives a best-effort service. The scheme is evaluated through simulation in a simple "bottleneck" topology using different traffic loads consisting of "standard" TCP flows that carry files of varying sizes.
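The ingress-side admission decision can be sketched in a few lines. The class and attribute names below are ours, not the paper's, and the feedback loop (the egress measuring the probe flow and reporting back) is reduced to a single method call:

```python
# Illustrative sketch of measurement-based admission control at the
# ingress: a flow requesting a guaranteed minimum throughput is
# accepted only if the available throughput reported by the egress
# can still cover it; otherwise it falls back to best effort.

class IngressAdmissionController:
    def __init__(self):
        self.available = 0.0          # last egress measurement (Mb/s)
        self.guaranteed = {}          # flow_id -> reserved min throughput

    def update_available(self, measured_available):
        """Egress feedback: throughput measured on the probing flow."""
        self.available = measured_available

    def admit(self, flow_id, min_throughput):
        if min_throughput <= self.available:
            self.guaranteed[flow_id] = min_throughput  # gets GMTS marking
            self.available -= min_throughput
            return True
        return False                                   # best-effort service

ac = IngressAdmissionController()
ac.update_available(10.0)              # Mb/s seen by the probe path
print(ac.admit("flow-1", 4.0))         # True: 10 >= 4
print(ac.admit("flow-2", 7.0))         # False: only 6 left
```

In the actual scheme the available throughput is continuously re-measured from flow aggregates rather than decremented locally, but the accept/reject logic at the edge has this shape.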

Relevance:

30.00%

Publisher:

Abstract:

An unsupervised approach to image segmentation that fuses region and boundary information is presented. The proposed approach takes advantage of the combined use of three different strategies: guided seed placement, control of the decision criterion, and boundary refinement. The new algorithm uses the boundary information to initialize a set of active regions that compete for the pixels in order to segment the whole image. The method is implemented on a multiresolution representation, which ensures noise robustness as well as computational efficiency. The accuracy of the segmentation results has been proven through an objective comparative evaluation of the method.
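The idea of seeded active regions competing for pixels under a decision criterion can be sketched with a classic seeded region-growing loop. This is a generic illustration (single resolution, intensity criterion, no boundary refinement), not the paper's multiresolution algorithm:

```python
import heapq

def region_growing(image, seeds):
    """Seeded region growing: regions compete for unassigned pixels;
    the cheapest claim (closest to a region's running mean) wins."""
    h, w = len(image), len(image[0])
    label = [[-1] * w for _ in range(h)]
    stats = []                                    # per region: [sum, count]
    frontier = []                                 # (cost, row, col, region)

    def push_neighbours(r, c, k):
        mean = stats[k][0] / stats[k][1]
        for nr, nc in ((r-1, c), (r+1, c), (r, c-1), (r, c+1)):
            if 0 <= nr < h and 0 <= nc < w and label[nr][nc] == -1:
                cost = abs(image[nr][nc] - mean)  # decision criterion
                heapq.heappush(frontier, (cost, nr, nc, k))

    for k, (r, c) in enumerate(seeds):
        label[r][c] = k
        stats.append([image[r][c], 1])
    for k, (r, c) in enumerate(seeds):
        push_neighbours(r, c, k)
    while frontier:
        cost, r, c, k = heapq.heappop(frontier)
        if label[r][c] != -1:                     # already claimed
            continue
        label[r][c] = k
        stats[k][0] += image[r][c]
        stats[k][1] += 1
        push_neighbours(r, c, k)
    return label

img = [[10, 11, 50, 52],
       [12, 10, 51, 53],
       [11, 12, 50, 51]]
print(region_growing(img, [(0, 0), (0, 3)]))
# → [[0, 0, 1, 1], [0, 0, 1, 1], [0, 0, 1, 1]]
```

In the paper's scheme the seeds come from the boundary information rather than being given, which is what makes the approach unsupervised.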

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a complete control architecture designed to fulfil predefined missions with an autonomous underwater vehicle (AUV). The control architecture has three levels of control: mission level, task level, and vehicle level. The novelty of the work resides in the mission level, which is built with a Petri net that defines the sequence of tasks to be executed depending on the unpredictable situations that may occur. The task control system is composed of a set of active behaviours and a coordinator that selects the most appropriate vehicle action at each moment. The paper focuses on the design of the mission controller and its interaction with the task controller. Simulations, inspired by an industrial underwater inspection of a dam grate, show the effectiveness of the control architecture.
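The mission-level sequencing can be sketched with a token-firing Petri net. The net below is ours, with hypothetical place and task names loosely inspired by the dam-grate inspection scenario, not the paper's actual mission net:

```python
# Minimal Petri net: a transition fires only when every input place
# holds a token; firing consumes input tokens and produces outputs.

class PetriNet:
    def __init__(self, marking):
        self.marking = dict(marking)            # place -> token count
        self.transitions = {}                   # name -> (inputs, outputs)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= 1 for p in inputs)

    def fire(self, name):
        if not self.enabled(name):
            return False
        inputs, outputs = self.transitions[name]
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1
        return True

# Mission fragment: navigate to the dam grate, then inspect it.
net = PetriNet({"at_surface": 1})
net.add_transition("dive_to_grate", ["at_surface"], ["at_grate"])
net.add_transition("inspect_grate", ["at_grate"], ["inspection_done"])
print(net.fire("inspect_grate"))   # False: not yet at the grate
print(net.fire("dive_to_grate"))   # True
print(net.fire("inspect_grate"))   # True
```

Handling unpredictable situations amounts to adding alternative transitions out of the same place (e.g. an abort path), so the marking, not hard-coded sequencing, decides which task runs next.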

Relevance:

30.00%

Publisher:

Abstract:

As time passes, science, technological development, and the constant search for new, more reliable truths that answer humanity's questions more assertively have redefined the theories and axioms that were once taken as dogma across different disciplines and across society and industry. The traditional conception of the consumer's mind and behaviour had important gaps in the applicability and generality of its theories: the assumption that people make purchase decisions through purely rational analysis, bound to fixed time structures for managing their money, does not describe how most people actually behave. Rather than 100% rational agents with a complete flow of information in a market that is perfect under every economic precept, we are creatures of feelings and senses. We react to situations, moods, and stimuli; it is the brain that receives the cognitive context provided by the environment and then acts (and buys) in different ways. This is where neuromarketing arises as a clear example of that search for a new truth: one in which understanding the consumer does not set aside their most real facet, their reactions, since these are what truly define what they like and what triggers an impulse strong enough to influence a purchase decision. Neuromarketing has therefore set out to study the deepest and most truthful part of the consumer: the brain. It moves away from traditional market research techniques, in which consumers may distort the information they report about a product or an advertising campaign for various social and psychological reasons.

Neuromarketing studies the consumer's brain through biometric techniques, exposing the consumer to marketing and analysing their cerebral reactions in terms of interest, adrenaline, working memory, and feelings, supported by techniques such as eye tracking, in which the consumer's visual interactions identify the hot spots of interest in a given advertisement. However, research seen by some as invasive of the consumer's privacy and freedom of choice must be conducted responsibly and within a scientific framework, where the only aim is to generate new hypotheses and theories that improve our knowledge of consumer behaviour without crossing the limits of the consumer's own control. Neuromarketing thus debates its very existence among the creation of new methodologies for approaching consumer thought, the effectiveness of the knowledge it delivers to industry, and the social burden this science carries because of the potential coercion of consumers that its findings enable.

Relevance:

30.00%

Publisher:

Abstract:

Business endurance is a sectoral, regional, and national concern studied from different perspectives (economic, social, and academic), and has become one of the multilevel objectives that drive a country's development and growth. In turn, one of the methodologies aimed at guaranteeing that planned objectives or behaviours are met is known as "control systems"; its application is interdisciplinary, originating in technical systems and carrying over to organizational systems. This work develops a literature review of control, from technical systems and from Psychology, Sociology, and the social disciplines applied to organizational systems, identifying two currents. The first, called "classical", operates on causal systems; the second is called "intelligent". For each current, the work describes the conceptualization, purposes, methodologies, structure, taxonomy, and criticisms of control systems. Control is approached with different methodologies, depending on the current and on the nature of each system, but it has a transversal, interdisciplinary purpose: generating expected or desirable behaviours in the system, including reaching established objectives. In terms of development and understanding, technical systems are further along, focused basically on system autonomy, whereas organizational systems have raised the question of evolving, devising different controls, or eliminating control altogether; yet organizations continue to apply classical controls and continue to produce outcomes of business non-endurance.

Relevance:

30.00%

Publisher:

Abstract:

The mega-mining operation of CERREJON is carried out under the highest standards of safety and quality, with a commitment to delivering an excellent product to the market while avoiding damage to the environment. One of the company's greatest strengths lies in the integration of the production processes across the coal mine, the railway, and the port, making the operation efficient, sustaining high levels of optimal results, and driving failure rates toward zero in the extraction and transport of Colombian coal to the world. The MATERIALS department is the starting point for guaranteeing the operation, since it has the major task of acquiring and delivering the goods and services the company requires at the lowest total evaluated cost, in the shortest possible time, and within the framework of Colombian legislation for the customs-clearance process, with a strong emphasis on building strong, synergistic relationships among all links of the chain. The department's entire process fits within a cycle that must become ever more effective and efficient; hence the search for improvement options to refine its processes. To make managing a mega-operation like this feasible, the network established in its supply chain must be tuned as finely as possible, seeking a flow of product, information, and funds that guarantees product availability, generates high profitability for the company, and keeps operating costs under control.

Since the supply chain is open to improvement through interaction with suppliers, collaborators, and customers, this work aims to take the best advantage of that by analysing CERREJON's current chain and presenting an improvement option in one link of the production process. This option had been contemplated in previous years, but now, thanks to refinements in the information systems and the active participation of suppliers, a viable option emerges for eliminating a rework step, guaranteeing efficiency and effectiveness in speeding up CERREJON's production cycle. Through the execution of the count-reform project, an improvement opportunity appears in the department's chain: the recount at the mine, performed after the initial count at Puerto Bolívar, of materials arriving as seaborne imports. Refining the documentary receipt at the cargo consolidator and at the largest direct-delivery suppliers (HITACHI and MCA) to the carrier enables the use of portable terminals which, together with the documentary adjustments, will allow cargo to be counted and separated by location and sent by train to LMN, reducing delivery time to the final customer, the re-handling costs of recounting, and the costs associated with late container returns.

Relevance:

30.00%

Publisher:

Abstract:

Objective: The Barriers to Being Active Quiz (BBAQ) probes the barriers to being physically active. The questionnaire was translated into Spanish by the same team that developed the original English version, but the Spanish version lacks validity studies. The objective of this research was to evaluate the psychometric properties of the BBAQ (in the full 21-item version), focusing on reliability and validity. Material and methods: A total of 2,634 university students (1,462 women and 1,172 men; 18-30 years of age) completed the BBAQ-21. Cronbach's alpha was estimated as an indicator of internal consistency. The intraclass correlation coefficient (ICC) and the degree of agreement were computed to evaluate temporal stability, with a 7-day period between the two administrations, as estimators of reproducibility. Exploratory (EFA) and confirmatory (CFA) factor analyses were applied to analyse the validity of the BBAQ-21. Results: The BBAQ-21 showed Cronbach's alpha values between 0.812 and 0.844 and an ICC between 0.46 and 0.87. The percentage of agreement across all individual items ranged from 45% to 80%. The EFA identified four factors that explained 52.90% of the variance, and the CFA showed moderate factor loadings. Conclusions: In terms of reliability and validity, the results support the use of this instrument with this type of sample. The BBAQ-21 is available to assess barriers to physical activity in Latin America.