816 results for Embedded System, Domain Specific Language (DSL), BDI Agents, Arduino, Agentino
Abstract:
Nowadays, the use of technology has become central to the advancement of society, drawing in people without a computing background, so-called "non-expert" users. Researchers have therefore had to produce studies that adapt systems to the existing problems in the computing domain. A recurring need of every system user is information management, which is typically handled through a database and a specific language such as SQL (Structured Query Language); this, however, forces the non-expert user to turn to a specialist for design and construction, which translates into costs and complex procedures. This raises a question: what should be done when projects are small and resources and processes are limited? Building on the research carried out at the University of Washington [39], in which SQL statements are synthesized from input/output examples, this thesis aims to automate the process and apply a different learning technique: an evolutionary approach in which an adapted genetic algorithm produces valid SQL statements that satisfy the conditions established by the user-supplied input/output examples. The result of this approach is a tool named EvoSQL, which was validated in this study. Of the 28 exercises used in [39], 23 yielded perfect results and 5 were unsuccessful, an effectiveness of 82.1%. This effectiveness is 10.7% higher than that of SQLSynthesizer, the tool developed in [39], and 75% higher than that of the next-closest tool, Query by Output (QBO) [31].
The average execution time per exercise was 3 minutes and 11 seconds, longer than that of SQLSynthesizer; however, a genetic algorithm entails phases that widen the range of execution times, so the time obtained is acceptable for applications of this kind. In conclusion, an automatic tool with an evolutionary approach was obtained, with good results and a simple process for the "non-expert" user.
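The evolutionary search described above can be sketched as a minimal genetic loop that evolves a single WHERE-clause predicate until it reproduces a user-supplied input/output example. The table, columns, operators and mutation scheme below are illustrative assumptions, not EvoSQL's actual representation:

```python
import random

# Toy table and the output rows the user supplied as an example.
ROWS = [{"id": 1, "age": 17}, {"id": 2, "age": 25},
        {"id": 3, "age": 40}, {"id": 4, "age": 15}]
TARGET = [r for r in ROWS if r["age"] >= 18]   # expected query result

COLS, OPS = ["age", "id"], [">=", "<"]

def random_pred():
    # A candidate WHERE clause: (column, operator, constant).
    return (random.choice(COLS), random.choice(OPS), random.randint(0, 50))

def apply_pred(pred, row):
    col, op, k = pred
    return row[col] >= k if op == ">=" else row[col] < k

def fitness(pred):
    # Fraction of rows classified the same way as in the user's example.
    got = [r for r in ROWS if apply_pred(pred, r)]
    return sum((r in got) == (r in TARGET) for r in ROWS) / len(ROWS)

def mutate(pred):
    col, op, k = pred
    roll = random.random()
    if roll < 0.4:                    # perturb the constant
        return (col, op, max(0, k + random.randint(-5, 5)))
    if roll < 0.7:                    # swap the operator
        return (col, random.choice(OPS), k)
    return (random.choice(COLS), op, k)  # swap the column

def evolve(gens=200, size=30):
    pop = [random_pred() for _ in range(size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        if fitness(pop[0]) == 1.0:    # predicate reproduces the example
            break
        parents = pop[: size // 2]    # truncation selection
        pop = parents + [mutate(random.choice(parents)) for _ in parents]
    return max(pop, key=fitness)

random.seed(7)
best = evolve()
sql = f"SELECT * FROM people WHERE {best[0]} {best[1]} {best[2]}"
```

A fitness of 1.0 means the evolved predicate selects exactly the rows the user marked as output, which is the acceptance condition the example-driven approach relies on.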
Abstract:
Embedded systems are usually designed for a single task or a specified set of tasks. This specificity means the system design, as well as its hardware/software development, can be highly optimized. Embedded software must meet requirements such as highly reliable operation on resource-constrained platforms, real-time constraints and rapid development. This necessitates the adoption of static machine-code analysis tools running on a host machine for the validation and optimization of embedded system code, which can help meet all of these goals. Such analysis could significantly improve software quality and is still a challenging field.
This dissertation contributes an architecture-oriented code validation, error-localization and optimization technique that assists the embedded system designer in software debugging, making early detection of otherwise hard-to-find bugs more effective through static analysis of machine code. The focus of this work is to develop methods that automatically localize faults and optimize the code, thereby improving both the debugging process and the quality of the code. Validation is done with the help of rules of inference formulated for the target processor. The rules govern the occurrence of illegitimate or out-of-place instructions and code sequences for executing the computational and integrated-peripheral functions. The stipulated rules are encoded as propositional logic formulae and their compliance is tested individually in all possible execution paths of the application program. An incorrect machine-code pattern sequence is identified using slicing techniques on the control flow graph generated from the machine code. An algorithm is proposed to assist the compiler in eliminating redundant bank-switching code and in deciding the optimum data allocation to banked memory, resulting in a minimum number of bank-switching instructions in embedded system software.
A relation matrix and a state transition diagram, formed for the active-memory-bank state transition corresponding to each bank-selection instruction, are used for the detection of redundant code. Instances of code redundancy based on the stipulated rules for the target processor are identified. This validation and optimization tool can be integrated into the system development environment. It is a novel approach, independent of compiler and assembler, applicable to a wide range of processors once appropriate rules are formulated. Program states are identified mainly by machine-code pattern, which drastically reduces state-space creation and contributes to improved model checking. Though the technique described is general, the implementation is architecture-oriented, and hence the feasibility study was conducted on PIC16F87X microcontrollers. The proposed tool will be very useful in steering novices towards correct use of difficult microcontroller features when developing embedded systems.
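The redundant bank-switching detection can be illustrated on a straight-line code sequence: track the active-bank state at each bank-selection instruction and flag any re-selection of the bank that is already active (a self-loop in the state transition diagram). The "BANKSEL n" mnemonic below is a hypothetical stand-in, not an actual PIC16F87X opcode:

```python
# Hypothetical mnemonic: "BANKSEL n" selects RAM bank n; every other
# instruction leaves the active-bank state unchanged.
def redundant_bank_switches(program):
    """Indices of bank-select instructions whose target bank is already
    active, i.e. whose removal leaves the bank state unchanged."""
    active = None                     # bank unknown at program entry
    redundant = []
    for i, ins in enumerate(program):
        if ins.startswith("BANKSEL"):
            bank = int(ins.split()[1])
            if bank == active:
                redundant.append(i)   # state transition is a self-loop
            active = bank
    return redundant

prog = ["BANKSEL 0", "MOVF 0x20", "BANKSEL 0",
        "BANKSEL 1", "MOVWF 0xA0", "BANKSEL 1"]
found = redundant_bank_switches(prog)   # indices 2 and 5
```

A full implementation would run this per execution path of the control flow graph, since a bank state that is unknown at a join point cannot be assumed redundant.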
Abstract:
In this paper, we propose a handwritten character recognition system for the Malayalam language. The feature extraction phase consists of gradient and curvature calculation and dimensionality reduction using Principal Component Analysis (PCA). Directional information from the arc tangent of the gradient is used as the gradient feature. The strength of the gradient in the curvature direction is used as the curvature feature. The proposed system uses a combination of gradient and curvature features in reduced dimension as the feature vector. For classification, the discriminative power of the Support Vector Machine (SVM) is evaluated. The results reveal that an SVM with a Radial Basis Function (RBF) kernel yields the best performance, with accuracies of 96.28% and 97.96% on two different datasets. This is the highest accuracy ever reported on these datasets.
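The gradient feature described above can be sketched as an orientation histogram built from the arc tangent of finite-difference gradients. The toy image and bin count below are illustrative assumptions; the paper's full pipeline adds curvature features, PCA and an SVM classifier:

```python
import math

# Toy 2-D "character" image (0 = background, 1 = ink); a real system
# would use normalized grayscale glyph images.
IMG = [
    [0, 0, 1, 0],
    [0, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 0, 0],
]

def gradient_direction_histogram(img, bins=8):
    """Quantize the arc tangent of the local gradient into `bins`
    directions and count occurrences -- a minimal gradient feature."""
    h, w = len(img), len(img[0])
    hist = [0] * bins
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = img[y][x + 1] - img[y][x - 1]   # horizontal difference
            gy = img[y + 1][x] - img[y - 1][x]   # vertical difference
            if gx == 0 and gy == 0:
                continue                         # flat region: no direction
            angle = math.atan2(gy, gx) % (2 * math.pi)
            hist[int(angle / (2 * math.pi / bins)) % bins] += 1
    return hist

feature = gradient_direction_histogram(IMG)
```

The resulting histogram is the per-cell feature vector that dimensionality reduction and the SVM would then consume.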
Abstract:
This study examines the effectiveness of using a DVD software program to teach specific language structures to children who are deaf or hard of hearing. It includes a literature review of previous studies that evaluated the effectiveness of using technology to teach language to children who exhibited language delays.
Abstract:
Nuclear factor kappa B (NF-kappaB) is an inducible transcription factor present in neurons and glia. Recent genetic models identified a role for NF-kappaB in neuroprotection against various neurotoxins. Furthermore, genetic evidence for a role in learning and memory is now emerging. This review highlights our current understanding of neuronal NF-kappaB in response to synaptic transmission and summarizes potential physiological functions of NF-kappaB in the nervous system. This article contains a listing of NF-kappaB activators and inhibitors in the nervous system; furthermore, specific target genes are discussed. Synaptic NF-kappaB activated by glutamate and Ca2+ is presented in the context of retrograde signaling. The controversial role of NF-kappaB in neurodegenerative diseases is discussed. A model is proposed that explains this paradox as deregulated physiological NF-kappaB activity, integrating novel results which show that p65 can be turned from an activator into a repressor of anti-apoptotic genes.
Abstract:
The TCABR data analysis and acquisition system has been upgraded to support a joint research programme using remote participation technologies. The architecture of the new system uses the Java language as its programming environment. Since application parameters and hardware in a joint experiment are complex, with great variability of components, requirements and specification solutions need to be flexible and modular, independent of operating system and computer architecture. To describe and organize the information on all the components and the connections among them, the systems are developed using eXtensible Markup Language (XML) technology. Communication between clients and servers uses remote procedure calls based on XML (XML-RPC). The integration of Java, XML and XML-RPC makes it easy to develop a standard data and communication access layer between users and laboratories using common software libraries and a Web application. The libraries allow data retrieval using the same methods for all user laboratories in the joint collaboration, and the Web application provides a simple graphical user interface (GUI). The TCABR tokamak team, in collaboration with the IPFN (Instituto de Plasmas e Fusão Nuclear, Instituto Superior Técnico, Universidade Técnica de Lisboa), is implementing these remote participation technologies. The first version was tested at the Joint Experiment on TCABR (TCABRJE), a Host Laboratory Experiment organized in cooperation with the IAEA (International Atomic Energy Agency) in the framework of the IAEA Coordinated Research Project (CRP) on "Joint Research Using Small Tokamaks". (C) 2010 Elsevier B.V. All rights reserved.
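The client-server exchange can be sketched with the XML-RPC modules of the Python standard library (shown in Python rather than Java for brevity). The method name `get_signal` and its arguments are hypothetical, not the actual TCABR service interface:

```python
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy
import threading

# Server side: registers a hypothetical "get_signal" method returning
# sampled data for a given shot number and signal name.
server = SimpleXMLRPCServer(("127.0.0.1", 0), logRequests=False)
port = server.server_address[1]              # OS-assigned free port
server.register_function(lambda shot, name: [1.0, 2.5, 3.1], "get_signal")
threading.Thread(target=server.serve_forever, daemon=True).start()

# Client side: the same call works from any collaborating laboratory,
# since XML-RPC is independent of operating system and architecture.
client = ServerProxy(f"http://127.0.0.1:{port}")
data = client.get_signal(4242, "plasma_current")
server.shutdown()
```

The platform independence claimed in the abstract comes precisely from this layering: parameters travel as XML over HTTP, so client and server need not share a language or operating system.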
Abstract:
From their early days, Electrical Submersible Pumping (ESP) units have excelled at lifting much greater liquid rates than most other types of artificial lift, performing well in wells with high BSW, in both onshore and offshore environments. For any artificial lift system, the lifetime and frequency of interventions are of paramount importance, given the high costs of rigs and equipment plus the losses caused by a halt in production. The search for a longer system life brings the need to operate efficiently and safely within the limits of the equipment, which implies periodic adjustments, monitoring and control. As the prospect of minimizing direct human action grows, these adjustments should increasingly be made via automation. An automated system provides not only a longer life but also greater control over the production of the well. The controller is the brain of most automation systems: it holds the logic and strategies that make the work process operate efficiently. Such is the importance of control for any automation system that, with a better understanding of the ESP system and the development of research, many controllers are expected to be proposed for this artificial lift method. Once a controller is proposed, it must be tested and validated before it can be considered efficient and functional. Using a producing well or a test well could facilitate such testing, but with the serious risk that flaws in the controller design would damage oil well equipment, much of it expensive. Given this reality, the main objective of the present work is to present an environment for the evaluation of fuzzy controllers for wells equipped with the ESP system, using a computer simulator representing a virtual oil well, fuzzy controller design software and a PLC.
The proposed environment will reduce the time required for testing and adjusting the controller and enable a rapid diagnosis of its efficiency and effectiveness. The control algorithms are implemented both in a high-level language, through the controller design software, and in a language specific to PLC programming, the Ladder Diagram language.
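A minimal fuzzy control step of the kind such an environment would evaluate might look as follows. The linguistic variables, membership functions and rule consequents are illustrative assumptions, not the controllers designed in this work:

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b, zero outside [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_step(error):
    """Map a pressure error (bar) to a pump-frequency change (Hz) using
    three rules and weighted-average defuzzification."""
    mu_neg = tri(error, -10.0, -5.0, 0.0)    # error is negative
    mu_zero = tri(error, -5.0, 0.0, 5.0)     # error is near zero
    mu_pos = tri(error, 0.0, 5.0, 10.0)      # error is positive
    # Rule consequents: slow down / hold / speed up (Hz).
    num = mu_neg * -2.0 + mu_zero * 0.0 + mu_pos * 2.0
    den = mu_neg + mu_zero + mu_pos
    return num / den if den else 0.0
```

In the proposed environment such a rule base would be authored in the design software, downloaded to the PLC, and exercised against the virtual well instead of real equipment.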
Abstract:
This paper proposes an experimental setup, different from the traditional ones, for determining the acceleration of gravity using a fluid under constant rotation. A computerized rotational system, comprising a data acquisition system with specific software, a power amplifier and a rotary motion sensor, is employed to evaluate the angular velocity and g. An equation for determining g is derived from fluid mechanics. For this purpose, the parabolic shape the fluid assumes inside a cylindrical receptacle under rotational motion is considered.
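The derived equation can be sketched numerically: in rigid-body rotation the free surface is the paraboloid z(r) = z0 + omega^2 r^2 / (2 g), so measuring the rise of the surface at the receptacle wall gives g. The numbers below are illustrative, not the paper's measured values:

```python
import math

def gravity_from_rotation(omega, radius, rise):
    """g inferred from the paraboloid z(r) = z0 + omega^2 r^2 / (2 g):
    rise = z(radius) - z(0)  =>  g = omega^2 radius^2 / (2 rise)."""
    return omega**2 * radius**2 / (2 * rise)

# Illustrative run: 2 rev/s, receptacle radius 5 cm, measured rise 2.012 cm.
omega = 2 * 2 * math.pi                 # angular velocity in rad/s
g = gravity_from_rotation(omega, 0.05, 0.02012)   # ~9.81 m/s^2
```

The sensitivity of g to the squared angular velocity is what makes the rotary motion sensor the critical instrument in this setup.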
Abstract:
The academic community and the software industry have shown, in recent years, substantial interest in approaches and technologies related to model-driven development (MDD). At the same time, industry continues its relentless pursuit of technologies that raise productivity and quality in the development of software products. This work explores those two statements through an experiment carried out using MDD technology and an evaluation of its use in solving an actual problem in the context of enterprise-system security. By building and using a tool, a visual DSL named CALV3, inspired by the software factory approach (a synergy between software product lines, domain-specific languages and MDD), we evaluate the gains in abstraction and productivity through a systematic case study conducted in a development team. The results and lessons learned from the evaluation of this tool within industry are the main contributions of this work.
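The model-to-code idea behind such a DSL can be sketched in a few lines: a declarative model of an entity and its permissions, from which executable guard code is generated. The model schema and generated function are hypothetical illustrations of the MDD principle, not CALV3 itself (which is a visual, not textual, DSL):

```python
# Hypothetical declarative security model: the "what", with no code.
MODEL = {
    "entity": "Invoice",
    "permissions": {"admin": ["read", "write"], "clerk": ["read"]},
}

def generate_guard(model):
    """Generate a permission-check function from the declarative model,
    playing the role of the MDD transformation step."""
    perms = model["permissions"]

    def can(role, action):
        return action in perms.get(role, [])

    can.__name__ = f"can_access_{model['entity'].lower()}"
    return can

can_access = generate_guard(MODEL)
```

The abstraction gain measured in the case study comes from this separation: the team edits the model, and the repetitive enforcement code is produced mechanically.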
Abstract:
Some of Della Volpe's criticisms of Lukács's aesthetics are presented. According to the Italian philosopher, a materialist philosophy of art could not be founded on the concept of intuition; its basic categories would instead be cognitive plenitude and specific languages. It is thus proposed that the Della Volpean system be taken into consideration as a true starting point for reflections on art, on the relations between the work and society, and on the degree of knowledge that the artistic universe produces.
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
Cuttings return analysis is an important tool for detecting and preventing problems during the petroleum well drilling process. Several measurements and tools have been developed for drilling-problem detection, including mud logging, PWD and downhole torque information. Cuttings flow meters were developed in the past to provide information on cuttings return at the shale shakers. Their use, however, significantly impacts the operation, including rig space issues and interference with geological analysis, besides requiring additional personnel. This article proposes a non-intrusive system to analyze the cuttings concentration at the shale shakers, which can indicate problems during the drilling process, such as landslide, the collapse of the well borehole walls. Cuttings images are acquired by a high-definition camera installed above the shakers and sent to a computer coupled with a data analysis system, which aims at the quantification and closure of a cuttings material balance in the well surface system domain. No additional personnel are required at the rig site to operate the system. Modern artificial intelligence techniques are used for pattern recognition and data analysis, including the Optimum-Path Forest (OPF), an Artificial Neural Network using Multilayer Perceptrons (ANN-MLP), Support Vector Machines (SVM) and a Bayesian Classifier (BC). Field test results conducted on offshore floating vessels are presented. The results show the robustness of the proposed system, which can also be integrated with other data to improve the efficiency of drilling-problem detection. Copyright 2010, IADC/SPE Drilling Conference and Exhibition.
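The pattern-recognition step can be illustrated with a deliberately simple stand-in: a nearest-centroid classifier over two toy image features. The actual system uses OPF, ANN-MLP, SVM and Bayesian classifiers on camera frames; the feature names and data below are invented for illustration:

```python
# Toy features per shaker image: (mean intensity, cuttings-pixel fraction).
def centroid(samples):
    n = len(samples)
    return tuple(sum(s[i] for s in samples) / n
                 for i in range(len(samples[0])))

def train(dataset):
    """dataset: {label: [feature tuples]} -> {label: class centroid}."""
    return {label: centroid(feats) for label, feats in dataset.items()}

def classify(model, feat):
    # Assign the label whose centroid is nearest (squared Euclidean).
    return min(model, key=lambda lb: sum((a - b) ** 2
                                         for a, b in zip(model[lb], feat)))

data = {"normal":   [(0.20, 0.10), (0.25, 0.12), (0.22, 0.09)],
        "overload": [(0.60, 0.45), (0.55, 0.40), (0.65, 0.50)]}
model = train(data)
label = classify(model, (0.58, 0.42))
```

Any of the classifiers named in the abstract slots into the same train/classify interface, which is why the system can compare them on the same field data.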
Abstract:
This paper presents an NCAP embedded on a DE2 kit with a Nios II processor and uClinux, used to develop a network gateway with two interfaces, wireless (ZigBee) and wired (RS232), based on IEEE 1451. Both communications, wireless and wired, were developed to be point-to-point and to work with the same protocols, based on IEEE 1451.0-2007. The tests were made using a microcomputer: through a browser it was possible to access the web page stored on the DE2 kit and send control and monitoring commands to both TIMs (WTIM and STIM). The system demonstrates a different way of developing the NCAP node, allowing it to be applied in different environments, wired or wireless, with the same node. © 2011 IEEE.
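A point-to-point command exchange of the kind described can be sketched as a framed command sent from the NCAP to a TIM. The frame layout below (destination TIM id, channel, command class, payload length, payload) is a hypothetical illustration; the real octet structure of IEEE 1451.0-2007 commands must be taken from the standard:

```python
import struct

# Hypothetical NCAP -> TIM command frame; works identically over the
# ZigBee or RS232 link, which is the point of sharing one protocol.
def build_command(tim_id, channel, cmd_class, payload=b""):
    # Big-endian: id (1 B), channel (1 B), class (1 B), length (2 B).
    header = struct.pack(">BBBH", tim_id, channel, cmd_class, len(payload))
    return header + payload

def parse_command(frame):
    tim_id, channel, cmd_class, n = struct.unpack(">BBBH", frame[:5])
    return tim_id, channel, cmd_class, frame[5:5 + n]

frame = build_command(1, 3, 0x01, b"\x00\x01")
```

Because both interfaces carry the same frames, the web page on the DE2 kit can address the WTIM and the STIM with identical command-building code.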
Abstract:
Objective: To evaluate the influence of different air abrasion protocols on the surface roughness of an yttria-stabilized polycrystalline tetragonal zirconia (Y-TZP) ceramic, as well as the surface topography of the ceramic after the treatment. Method: Fifty-four specimens (7.5×4×7.5 mm) obtained from two ceramic blocks (LAVA, 3M ESPE) were flattened with fine-grit sandpaper and sintered in the ceramic system's specific firing oven. Next, the specimens were embedded in acrylic resin and the surfaces to be treated were polished in a polishing machine using sandpapers of decreasing abrasiveness (600- to 1,200-grit), followed by felt discs with 10 μm and 3 μm polishing pastes and colloidal silica. The specimens were then randomly assigned to 9 groups according to the factors particle and pressure (n=6): Gr1- control; Gr2- Al2O3 (50 μm)/2.5 bar; Gr3- Al2O3 (110 μm)/2.5 bar; Gr4- SiO2 (30 μm)/2.5 bar; Gr5- SiO2 (30 μm)/2.5 bar; Gr6- Al2O3 (50 μm)/3.5 bar; Gr7- Al2O3 (110 μm)/3.5 bar; Gr8- SiO2 (30 μm)/3.5 bar; Gr9- SiO2 (30 μm)/3.5 bar. After the treatments, surface roughness was analyzed with a digital optical profilometer and the morphology was examined by scanning electron microscopy (SEM). Data (μm) were subjected to statistical analysis by Dunnett's test (5%), two-way ANOVA and Tukey's test (5%). Results: The type of particle (p=0.0001) and the pressure (p=0.0001) used in the air abrasion protocols influenced the surface roughness values among the experimental groups (ANOVA). The mean surface roughness values (μm) obtained for the experimental groups (Gr2 to Gr9) were, respectively: 0.37 D; 0.56 BC; 0.46 BC; 0.48 CD; 0.59 BC; 0.82 A; 0.53 BCD; 0.67 AB. The SEM analysis revealed that Al2O3, regardless of the particle size and pressure used, damaged the surface of the specimens, producing superficial damage in the form of grooves and cracks.
Conclusion: Al2O3 (110 μm/3.5 bar) air abrasion promoted the highest surface roughness on the ceramics, but this does not mean that this protocol promotes a better ceramic-cement bond compared with the other air abrasion protocols.
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)