956 results for embedded, system, entropy, pool, TRNG, random, ADC
Abstract:
This work develops a novel Cross-Entropy (CE) optimization-based fuzzy controller for an Unmanned Aerial Monocular Vision-IMU System (UAMVIS) to solve the see-and-avoid problem using its accurate autonomous localization information. The function of this fuzzy controller is to regulate the heading of the system so as to avoid an obstacle, e.g. a wall. In the Matlab Simulink-based training stages, the Scaling Factor (SF) is first adjusted according to the specified task, and then the Membership Functions (MF) are tuned based on the optimized Scaling Factor to further improve the collision avoidance performance. After obtaining the optimal SF and MF, the rule base was reduced by 64% (from 125 rules to 45), and a large number of real flight tests were carried out with a quadcopter. The experimental results show that this approach precisely navigates the system around the obstacle. To the best of our knowledge, this is the first work to present a fuzzy controller for a UAMVIS optimized with the Cross-Entropy method in both its Scaling Factors and its Membership Functions.
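The controller's internals are not given in the abstract, so the following is a minimal, hypothetical Python sketch of where a Scaling Factor and triangular Membership Functions enter a Mamdani-style heading rule; the set boundaries, gains and rule consequents are illustrative placeholders, not the tuned values from this work.

```python
# Minimal sketch (not the authors' controller): a one-input fuzzy step showing
# where the Scaling Factor (SF) and triangular Membership Functions (MF) that
# the CE method tunes would enter. All names and values are placeholders.

def tri_mf(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def heading_command(distance_error, sf_in=0.5, sf_out=20.0):
    """Map a scaled distance-to-obstacle error to a heading-rate command."""
    e = sf_in * distance_error                    # input scaling factor
    # three illustrative fuzzy sets over the scaled input
    mu = {"near": tri_mf(e, -1.0, 0.0, 1.0),
          "mid":  tri_mf(e,  0.0, 1.0, 2.0),
          "far":  tri_mf(e,  1.0, 2.0, 3.0)}
    # singleton consequents (relative heading change) for each rule
    out = {"near": 1.0, "mid": 0.4, "far": 0.0}
    num = sum(mu[k] * out[k] for k in mu)
    den = sum(mu.values()) or 1.0
    return sf_out * num / den                     # output scaling factor

print(heading_command(1.5))
```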
Abstract:
This paper presents an adaptation of the Cross-Entropy (CE) method to optimize fuzzy logic controllers. CE is a recently developed optimization method based on a general Monte-Carlo approach to combinatorial and continuous multi-extremal optimization and importance sampling. This work shows the application of this optimization method to the input gains, the location and size of each variable's membership function sets, and the weight of each rule in the rule base of a fuzzy logic controller (FLC). The control system approach presented in this work was designed to command the orientation of an unmanned aerial vehicle (UAV) so as to modify its trajectory and avoid collisions. An onboard forward-looking camera was used to sense the environment of the UAV. The information extracted by the image processing algorithm is the only input of the fuzzy control approach used to avoid collision with a predefined object. Real tests with a quadrotor have been carried out to corroborate the improved behavior of the optimized controllers at different stages of the optimization process.
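As a hedged illustration of the optimization loop described above (not the authors' implementation), the sketch below applies the Cross-Entropy method to a generic parameter vector into which the input gains, membership-function locations and sizes, and rule weights could be flattened; the cost function, population size and elite fraction are assumptions.

```python
# Hedged sketch of a Cross-Entropy optimization loop over a generic parameter
# vector. In the FLC setting the vector would hold gains, MF centres/widths
# and rule weights, and cost() would run the controller in simulation.
import numpy as np

def cross_entropy_opt(cost, dim, iters=50, pop=100, elite_frac=0.1, seed=0):
    rng = np.random.default_rng(seed)
    mean, std = np.zeros(dim), np.ones(dim)
    n_elite = max(1, int(pop * elite_frac))
    for _ in range(iters):
        samples = rng.normal(mean, std, size=(pop, dim))      # candidate parameter sets
        costs = np.array([cost(s) for s in samples])          # e.g. collision/tracking cost
        elite = samples[np.argsort(costs)[:n_elite]]          # keep the best candidates
        mean, std = elite.mean(axis=0), elite.std(axis=0) + 1e-6  # refit sampling distribution
    return mean

# toy usage: recover a known parameter vector
target = np.array([0.8, -0.3, 1.5])
best = cross_entropy_opt(lambda p: float(np.sum((p - target) ** 2)), dim=3)
print(best)
```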
Abstract:
This project is divided into two main parts. The first part shows the integration of an Embedded Linux operating system on a hardware development platform named Zedboard. This platform contains a Zynq-7000 System on Chip (SoC), which combines a dual-core ARM Cortex-A9 processor with Artix-7 FPGA fabric. The Embedded Linux is built with Linuxlink, a Timesys tool, while the platform hardware configuration is done with Xilinx Vivado. The system boots from an SD card, which must contain all the files needed for the boot process and for operation; some of these files are generated with the Xilinx SDK software. The second part starts from the system already built and integrates a peripheral into the Zynq-7000 FPGA, using Vivado. The drivers needed to control the peripheral from the operating system are also developed, and a user-space program is created to test both.
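The abstract does not show the test program itself; as a rough, hypothetical illustration of the kind of memory-mapped access involved, the sketch below peeks at a register of an AXI-mapped peripheral through /dev/mem. The base address and register offset are placeholders, and on the real system such a test would normally go through the custom driver rather than raw /dev/mem.

```python
# Hypothetical illustration only: read a memory-mapped register of an FPGA
# peripheral on the Zynq PS-PL boundary via /dev/mem (requires root on Linux).
# BASE_ADDR and REG_OFFSET are placeholders, not values from this project.
import mmap, os, struct

BASE_ADDR  = 0x43C00000   # assumed AXI base address of the peripheral
REG_OFFSET = 0x0          # assumed offset of a status/data register
PAGE_SIZE  = mmap.PAGESIZE

fd = os.open("/dev/mem", os.O_RDWR | os.O_SYNC)
try:
    mem = mmap.mmap(fd, PAGE_SIZE, mmap.MAP_SHARED,
                    mmap.PROT_READ | mmap.PROT_WRITE, offset=BASE_ADDR)
    try:
        value = struct.unpack_from("<I", mem, REG_OFFSET)[0]   # 32-bit read
        print(f"register 0x{BASE_ADDR + REG_OFFSET:08x} = 0x{value:08x}")
    finally:
        mem.close()
finally:
    os.close(fd)
```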
Abstract:
The importance of embedded software is growing, as it is required for a large number of systems. Devising cheap, efficient and reliable development processes for embedded systems is thus a notable challenge nowadays. Computer processing power is continuously increasing, and as a result it is currently possible to integrate complex systems in a single processor, which was not feasible a few years ago. Embedded systems may have safety-critical requirements; their failure may result in loss of life or substantial economic loss. The development of these systems requires stringent development processes that are usually defined by suitable standards, and in some cases their certification is also necessary. This scenario fosters the use of mixed-criticality systems, in which applications of different criticality levels must coexist in a single system. In these cases it is usually necessary to certify the whole system, including non-critical applications, which is costly. Virtualization emerges as an enabling technology for dealing with this problem: the system is structured as a set of partitions, or virtual machines, that execute with temporal and spatial isolation. In this way, applications can be developed and certified independently.
The development of MCPS (Mixed-Criticality Partitioned Systems) requires additional roles and activities that traditional systems do not require. The system integrator has to define the system partitions, and application development has to consider the characteristics of the partition to which each application is allocated. In addition, traditional software process models have to be adapted to this scenario. The V-model is commonly used in embedded systems development; it can be adapted to the development of MCPS by enabling the parallel development of applications or the addition of a new partition to an existing system. The objective of this PhD thesis is to improve the available technology for MCPS development by providing a framework tailored to the development of this type of system and by defining a flexible and efficient algorithm for automatically generating system partitionings. The goal of the framework is to integrate all the activities required for developing MCPS and to support the different roles involved in this process. The framework is based on MDE (Model-Driven Engineering), which emphasizes the use of models in the development process. The framework provides basic means for modeling the system, generating system partitionings, validating the system and generating final artifacts. The framework has been designed to facilitate its extension and the integration of external validation tools. In particular, it can be extended by adding support for additional non-functional requirements and for further final artifacts, such as new programming languages or additional documentation. The framework includes a novel partitioning algorithm. It has been designed to be independent of the types of application requirements and to enable the system integrator to tailor the partitioning to the specific requirements of a system. This independence is achieved by defining partitioning constraints that must be met by the resulting partitioning. They have sufficient expressive capacity to state the most common constraints and can be defined manually by the system integrator or generated automatically from the functional and non-functional requirements of the applications. The partitioning algorithm takes system models and partitioning constraints as its inputs and generates a deployment model composed of a set of partitions. Each partition is in turn composed of a set of allocated applications and assigned resources. The partitioning problem, including applications and constraints, is modeled as a colored graph, and a valid partitioning is a proper vertex coloring. A specially designed algorithm generates this coloring and is able to provide alternative partitionings if required. The framework, including the partitioning algorithm, has been successfully used in the development of two industrial use cases: the UPMSat-2 satellite and the control system of a wind-power turbine. The partitioning algorithm has also been validated by using a large number of synthetic loads, including complex scenarios with more than 500 applications.
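The thesis' partitioning algorithm is not reproduced here; the following is only a hedged sketch of the graph-colouring view it describes, with applications as vertices, separation constraints as edges and colours as partitions, using a plain greedy colouring and made-up application names.

```python
# Hedged sketch of the graph-colouring view of partitioning: applications are
# vertices, a separation constraint (e.g. different criticality levels) is an
# edge, and each colour is a partition. A simple greedy colouring stands in
# for the thesis' own algorithm.
def partition(apps, separated):
    """apps: iterable of names; separated: set of frozensets {a, b} that must not share a partition."""
    colour = {}
    for app in apps:
        used = {colour[other] for other in colour
                if frozenset((app, other)) in separated}
        colour[app] = next(c for c in range(len(apps)) if c not in used)
    partitions = {}
    for app, c in colour.items():
        partitions.setdefault(c, []).append(app)
    return list(partitions.values())

# illustrative constraint set with hypothetical application names
constraints = {frozenset(p) for p in [("attitude_ctrl", "telemetry"),
                                      ("attitude_ctrl", "logging"),
                                      ("payload", "logging")]}
print(partition(["attitude_ctrl", "telemetry", "payload", "logging"], constraints))
```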
Abstract:
Curved structures are characterized by the critical relationship between their geometry and structural behaviour, and selecting an appropriate shape in the conceptual design of such structures is important for achieving material efficiency. However, the set of bending-free geometries is limited and, often, non-structural design criteria (e.g., usability, architectural needs, aesthetics) prohibit the selection of purely funicular or antifunicular shapes. In response to this issue, this thesis studies the possibility of achieving an axial-only behaviour even if the geometry departs from the ideally bending-free shape. This dissertation presents a new design approach, based on graphic statics, which shows how bending moments in a two-dimensional geometry can be eliminated by adding forces through an external post-tensioning system.
This results in bending-free structures that provide innovative answers to combined demands on versatility and material optimization. The graphical procedure has been implemented in a freely downloadable, design-driven software tool (EXOEQUILIBRIUM), in which structural performance evaluations and geometric variation are embedded within an interactive and parametric working environment. This provides greater versatility in finding new efficient structural configurations during the first design stages, bridging the gap between architectural shaping and structural analysis. The thesis includes the application of the developed graphical procedure to shapes with arbitrary curvature and load distributions. Furthermore, the effect of different design criteria on the internal force distribution has been analyzed. Finally, the construction of reduced- and large-scale models provides further physical validation of the method and insights into the structural behaviour of these structures. In summary, this work strongly expands the range of possible forms that exhibit bending-free behaviour and, de facto, opens up new possibilities for designs that combine high-performing solutions with architectural freedom.
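As a hedged restatement of the underlying condition (assuming the standard funicular relation for a planar member under vertical loads with constant horizontal thrust $H$, not a formula quoted from the thesis): the bending moment is $M(x) = M_0(x) - H\,z(x)$, where $M_0(x)$ is the simply supported moment of the applied loads and $z(x)$ the member's rise; the chosen geometry is therefore bending-free exactly when $z(x) = M_0(x)/H$, and the external post-tensioning described above modifies $M_0(x)$ so that this identity can hold for an initially non-funicular shape.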
Abstract:
We designed a host–guest fusion peptide system, which is completely soluble in water and has a high affinity for biological and lipid model membranes. The guest sequences are those of the fusion peptides of influenza hemagglutinin, which are solubilized by a highly charged unstructured C-terminal host sequence. These peptides partition to the surface of negatively charged liposomes or erythrocytes and elicit membrane fusion or hemolysis. They undergo a conformational change from random coil to an obliquely inserted (≈33° from the surface) α-helix on binding to model membranes. Partition coefficients for membrane insertion were measured for influenza fusion peptides of increasing lengths (n = 8, 13, 16, and 20). The hydrophobic contribution to the free energy of binding of the 20-residue fusion peptide at pH 5.0 is −7.6 kcal/mol (1 cal = 4.18 J). This energy is sufficient to stabilize a “stalk” intermediate if a typical number of fusion peptides assemble at the site of membrane fusion. The fusion activity of the fusion peptides increases with each increment in length, and this increase strictly correlates with the hydrophobic binding energy and the angle of insertion.
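For orientation, and assuming the usual mole-fraction convention $\Delta G = -RT\ln K_x$ (whether the authors use exactly this convention is an assumption here): at $T \approx 298\,$K, $RT \approx 0.59$ kcal/mol, so the reported hydrophobic contribution of $-7.6$ kcal/mol corresponds to a partition coefficient of roughly $K_x \approx e^{7.6/0.59} \approx 4\times10^{5}$.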
Abstract:
Cellular processes are mediated by complex networks of molecular interactions. Dissection of their role most commonly is achieved by using genetic mutations that alter, for example, protein–protein interactions. Small molecules that accomplish the same result would provide a powerful complement to the genetic approach, but it generally is believed that such molecules are rare. There are several natural products, however, that illustrate the feasibility of this approach. Split-pool synthesis now provides a simple mechanical means to prepare vast numbers of complex, even natural product-like, molecules individually attached to cell-sized polymer beads. Here, we describe a genetic system compatible with split-pool synthesis that allows the detection of cell-permeable, small molecule inhibitors of protein–protein interactions in 100- to 200-nl cell culture droplets, prepared by a recently described technique that arrays large numbers of such droplets. These “nanodroplets” contain defined media, one or more beads carrying ≈100 pmol of a photoreleasable small molecule, and a controlled number of cells. The engineered Saccharomyces cerevisiae cells used in this study express, after induction with galactose, two interacting proteins whose interaction results in cell death in the presence of 5-fluoroorotic acid (inducible reverse two-hybrid assay). Disruption of the interaction by a small molecule allows growth, and the small molecule can be introduced into the system hours before induction of the toxic interaction. We demonstrate that the interaction between the activin receptor R1 and the immunophilin protein FKBP12 can be disrupted by the small molecule FK506 at nanomolar concentrations in nanodroplets. This system should provide a general method for selecting cell-permeable ligands that can be used to study the relevance of protein–protein interactions in living cells or organisms.
Abstract:
We analyzed antioxidative defenses, photosynthesis, and pigments (especially xanthophyll-cycle components) in two wheat (Triticum durum Desf.) cultivars, Adamello and Ofanto, during dehydration and rehydration to determine the difference in their sensitivities to drought and to elucidate the role of different protective mechanisms against oxidative stress. Drought caused a more pronounced inhibition in growth and photosynthetic rates in the more sensitive cv Adamello compared with the relatively tolerant cv Ofanto. During dehydration the glutathione content decreased in both wheat cultivars, but only cv Adamello showed a significant increase in glutathione reductase and hydrogen peroxide-glutathione peroxidase activities. The activation states of two sulfhydryl-containing chloroplast enzymes, NADP+-dependent glyceraldehyde-3-phosphate dehydrogenase and fructose-1,6-bisphosphatase, were maintained at control levels during dehydration and rehydration in both cultivars. This indicates that the defense systems involved are efficient in the protection of sulfhydryl groups against oxidation. Drought did not cause significant effects on lipid peroxidation. Upon dehydration, a decline in chlorophyll a, lutein, neoxanthin, and β-carotene contents, and an increase in the pool of de-epoxidized xanthophyll-cycle components (i.e. zeaxanthin and antheraxanthin), were evident only in cv Adamello. Accordingly, after exposure to drought, cv Adamello showed a larger reduction in the actual photosystem II photochemical efficiency and a higher increase in nonradiative energy dissipation than cv Ofanto. Although differences in zeaxanthin content were not sufficient to explain the difference in drought tolerance between the two cultivars, zeaxanthin formation may be relevant in avoiding irreversible damage to photosystem II in the more sensitive cultivar.
Abstract:
In C3 plants large amounts of photorespiratory glycine (Gly) are converted to serine by the tetrahydrofolate (THF)-dependent activities of the Gly decarboxylase complex (GDC) and serine hydroxymethyltransferase (SHMT). Using 13C nuclear magnetic resonance, we monitored the flux of carbon through the GDC/SHMT enzyme system in Arabidopsis thaliana (L.) Heynh. Columbia exposed to inhibitors of THF-synthesizing enzymes. Plants exposed for 96 h to sulfanilamide, a dihydropteroate synthase inhibitor, showed little reduction in flux through GDC/SHMT. Two other sulfonamide analogs were tested with similar results, although all three analogs competitively inhibited the partially purified enzyme. However, methotrexate or aminopterin, which are confirmed inhibitors of Arabidopsis dihydrofolate reductase, decreased the flux through the GDC/SHMT system by 60% after 48 h and by 100% in 96 h. The uptake of [α-13C]Gly was not inhibited by either drug class. The specificity of methotrexate action was shown by the ability of 5-formyl-THF to restore flux through the GDC/SHMT pathway in methotrexate-inhibited plants. The experiments with sulfonamides strongly suggest that the mitochondrial THF pool has a long half-life. The studies with methotrexate support the additional, critical role of dihydrofolate reductase in recycling THF oxidized in thymidylate synthesis.
Abstract:
Four new members of the fibroblast growth factor (FGF) family, referred to as fibroblast growth factor homologous factors (FHFs), have been identified by a combination of random cDNA sequencing, data base searches, and degenerate PCR. Pairwise comparisons between the four FHFs show between 58% and 71% amino acid sequence identity, but each FHF shows less than 30% identity when compared with other FGFs. Like FGF-1 (acidic FGF) and FGF-2 (basic FGF), the FHFs lack a classical signal sequence and contain clusters of basic residues that can act as nuclear localization signals. In transiently transfected 293 cells FHF-1 accumulates in the nucleus and is not secreted. Each FHF is expressed in the developing and adult nervous systems, suggesting a role for this branch of the FGF family in nervous system development and function.
Abstract:
The eukaryotic green alga Dunaliella tertiolecta acclimates to decreased growth irradiance by increasing cellular levels of light-harvesting chlorophyll protein complex apoproteins associated with photosystem II (LHCIIs), whereas increased growth irradiance elicits the opposite response. Nuclear run-on transcription assays and measurements of cab mRNA stability established that light intensity-dependent changes in LHCII are controlled at the level of transcription. cab gene transcription in high-intensity light was partially enhanced by reducing plastoquinone with 3-(3,4-dichlorophenyl)-1,1-dimethyl urea (DCMU), whereas it was repressed in low-intensity light by partially inhibiting the oxidation of plastoquinol with 2,5-dibromo-3-methyl-6-isopropyl-p-benzoquinone (DBMIB). Uncouplers of photosynthetic electron transport and inhibition of water splitting had no effect on LHCII levels. These results strongly implicate the redox state of the plastoquinone pool in the chloroplast as a photon-sensing system that is coupled to the light-intensity regulation of nuclear-encoded cab gene transcription. The accumulation of cellular chlorophyll at low-intensity light can be blocked with cytoplasmically directed phosphatase inhibitors, such as okadaic acid, microcystin L-R, and tautomycin. Gel mobility-shift assays revealed that cells grown in high-intensity light contained proteins that bind to the promoter region of a cab gene carrying sequences homologous to higher plant light-responsive elements. On the basis of these experimental results, we propose a model for a light intensity signaling system where cab gene expression is reversibly repressed by a phosphorylated factor coupled to the redox status of plastoquinone through a chloroplast protein kinase.
Abstract:
Controversy still exists over the adaptive nature of variation of enzyme loci. In conifers, random amplified polymorphic DNAs (RAPDs) represent a class of marker loci that is unlikely to fall within or be strongly linked to coding DNA. We have compared the genetic diversity in natural populations of black spruce [Picea mariana (Mill.) B.S.P.] using genotypic data at allozyme loci and RAPD loci as well as phenotypic data from inferred RAPD fingerprints. The genotypic data for both allozymes and RAPDs were obtained from at least six haploid megagametophytes for each of 75 sexually mature individuals distributed in five populations. Heterozygosities and population fixation indices were in complete agreement between allozyme loci and RAPD loci. In black spruce, it is more likely that the similar levels of variation detected at both enzyme and RAPD loci are due to such evolutionary forces as migration and the mating system, rather than to balancing selection and overdominance. Furthermore, we show that biased estimates of expected heterozygosity and among-population differentiation are obtained when using allele frequencies derived from dominant RAPD phenotypes.
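For reference, expected heterozygosity at a locus is $H_e = 1 - \sum_i p_i^2$; with a dominant RAPD band the recessive (band-absent) allele frequency must normally be inferred from phenotype frequencies, e.g. $\hat q = \sqrt{N_{\mathrm{null}}/N}$ under Hardy–Weinberg assumptions, and it is this inference step, avoided here by scoring haploid megagametophytes directly, that commonly introduces the kind of bias referred to above. These standard formulas are given only as background, not as the estimators used in this study.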
Abstract:
We present the first detailed numerical study in three dimensions of a first-order phase transition that remains first order in the presence of quenched disorder (specifically, the ferromagnetic-paramagnetic transition of the site-diluted four-state Potts model). A tricritical point, which lies surprisingly near the pure-system limit and is studied by means of finite-size scaling, separates the first-order and second-order parts of the critical line. This investigation has been made possible by a new definition of the disorder average that avoids the diverging-variance probability distributions that plague the standard approach. Entropy, rather than free energy, is the basic object in this approach, which exploits a recently introduced microcanonical Monte Carlo method.
Abstract:
We present a detailed numerical study of the effects of adding quenched impurities to a three-dimensional system which, in the pure case, undergoes a strong first-order phase transition (specifically, the ferromagnetic/paramagnetic transition of the site-diluted four-state Potts model). The transition remains first order in the presence of a small amount of quenched disorder but becomes second order as more impurities are added. A tricritical point, which is studied by means of finite-size scaling, separates the first-order and second-order parts of the critical line. The results were made possible by a new definition of the disorder average that avoids the diverging-variance probability distributions that arise with the standard methodology. We also made use of a recently proposed microcanonical Monte Carlo method in which entropy, instead of free energy, is the basic quantity.
Abstract:
The objective of this paper is to present a system for communicating hidden information among different users by means of images. The tasks that the system is able to carry out can be divided into two groups of utilities, both implemented in Java. The first group of utilities is related to hiding information in color images, using a steganographic function based on least significant bit (LSB) methods. The second group of utilities allows users to communicate with one another in order to send or receive images in which information has previously been embedded. This is the most significant characteristic of the implementation: we have built an environment that combines email capabilities for sending and receiving text and images as attached files with the main objective of hiding information.
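The system described above is implemented in Java; purely as an illustration of the LSB idea (not the authors' code), here is a minimal Python sketch that embeds and recovers a payload in the least significant bits of a flat list of 8-bit colour-channel values, omitting the image I/O and length headers a real tool needs.

```python
# Minimal sketch of least-significant-bit (LSB) embedding over a flat list of
# 8-bit colour-channel values. Illustration only; not the Java implementation
# described above.
def embed(channels, payload):
    bits = [(byte >> i) & 1 for byte in payload for i in range(7, -1, -1)]
    assert len(bits) <= len(channels), "cover image too small for payload"
    stego = list(channels)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & ~1) | bit          # overwrite the LSB only
    return stego

def extract(channels, n_bytes):
    bits = [c & 1 for c in channels[:8 * n_bytes]]
    return bytes(int("".join(map(str, bits[i:i + 8])), 2)
                 for i in range(0, len(bits), 8))

cover = list(range(64))                            # stand-in for RGB channel values
msg = b"hi"
print(extract(embed(cover, msg), len(msg)))        # -> b'hi'
```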