947 results for Factory of software
Abstract:
Software deployment can be seen as the process comprising all the activities required to make a piece of software available to users without a manual installation on the user's computer or other machine. A number of software deployment tools that handle automated installations are available to companies on the market today. The HVDC department at ABB in Ludvika needs to start using a tool for automated software installations, since installations are currently performed manually and are time-consuming. As a Microsoft partner, ABB wants to see how Microsoft's software deployment tool could meet this need. Our study aimed to examine how software installation work is carried out today and to find opportunities for improving installations that cannot currently be automated. The study also included developing a general framework for how organizations can proceed when they want to start using software deployment tools. The framework includes a requirements specification to be evaluated against Microsoft's tool. To form a picture of how the organization works today, we carried out a questionnaire survey and interviews with staff at HVDC. To develop the framework, we used the data collected from the interviews, the questionnaire survey and a group interview in order to identify the staff's requirements and wishes for a software deployment tool. Literature studies were carried out to establish a theoretical frame of reference to draw on when developing the framework and the requirements specification. The study has resulted in a description of software deployment, opportunities for improvement in software installation work, and a general framework that describes how organizations can proceed when they are about to adopt a software deployment tool. The framework also contains a requirements specification that was used to evaluate Microsoft's software deployment tool. In our study we have not seen that anyone has previously produced a general framework and requirements specification that organizations can use as a basis when adopting a software deployment tool. The results of our study can fill this knowledge gap.
Abstract:
Statisticians should be involved at all stages of sample surveys and courses on surveys need to reflect this by covering both theoretical and practical aspects. Teaching methods could include some hands-on experience, directed reading, and use of software designed for teaching or professional use, as well as more traditional lecturing. Suggestions are given for a course of about fifty hours.
Abstract:
Recent advances in the massively parallel computational abilities of graphical processing units (GPUs) have increased their use for general purpose computation, as companies look to take advantage of big data processing techniques. This has given rise to the potential for malicious software targeting GPUs, which is of interest to forensic investigators examining the operation of software. The ability to carry out reverse-engineering of software is of great importance within the security and forensics fields, particularly when investigating malicious software or carrying out forensic analysis following a successful security breach. Due to the complexity of the Nvidia CUDA (Compute Unified Device Architecture) framework, it is not clear how best to approach the reverse engineering of a piece of CUDA software. We carry out a review of the different binary output formats which may be encountered from the CUDA compiler, and their implications on reverse engineering. We then demonstrate the process of carrying out disassembly of an example CUDA application, to establish the various techniques available to forensic investigators carrying out black-box disassembly and reverse engineering of CUDA binaries. We show that the Nvidia compiler, using default settings, leaks useful information. Finally, we demonstrate techniques to better protect intellectual property in CUDA algorithm implementations from reverse engineering.
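As a hedged illustration of the black-box inspection workflow this abstract describes, the sketch below shells out to NVIDIA's cuobjdump utility (part of the CUDA toolkit) to dump any embedded PTX and SASS from a binary; the binary path is a placeholder, and error handling is minimal.

```python
# Minimal sketch of black-box inspection of a CUDA binary, assuming the
# CUDA toolkit's cuobjdump utility is installed and on PATH. The binary
# path "./app" is a placeholder for the executable under analysis.
import subprocess

def dump_cuda_sections(binary_path: str) -> dict:
    """Extract embedded PTX and SASS listings from a CUDA fat binary."""
    sections = {}
    for label, flag in (("ptx", "--dump-ptx"), ("sass", "--dump-sass")):
        result = subprocess.run(
            ["cuobjdump", flag, binary_path],
            capture_output=True, text=True,
        )
        # A non-zero exit code usually means no section of that kind is embedded.
        sections[label] = result.stdout if result.returncode == 0 else None
    return sections

if __name__ == "__main__":
    listings = dump_cuda_sections("./app")
    for kind, text in listings.items():
        print(kind, "present" if text else "absent")
```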
Abstract:
With the advent of connected devices, the required bandwidth exceeds the capacity of electrical interconnects and wireless interfaces in access networks as well as in core networks. High-capacity photonic systems located in access networks and using radio-over-fiber technology have been proposed as a solution for fifth-generation wireless networks. To maximize the use of server and network resources, cloud computing and storage services are being deployed; in this way, centralized resources can be distributed dynamically according to the end user's needs. Since every exchange requires synchronization between the server and its infrastructure, an optical physical layer allows the cloud to support network virtualization and software-defined networking. Reflective semiconductor optical amplifiers (RSOAs) are a key technology at the ONU (optical network unit) in passive optical access networks (PONs). Here we examine the possibility of using an RSOA and radio-over-fiber technology to carry wireless signals together with a digital signal over a PON. Radio over fiber can easily be implemented thanks to the wavelength insensitivity of the RSOA. The choice of wavelength for the physical layer is, however, made in layers 2/3 of the OSI model. Interactions between the physical layer and network switching can be handled by adding an SDN controller that includes optical-layer managers. Network virtualization could thus benefit from a flexible optical layer through dynamic, adapted network resources. In this thesis, we study a system with an RSOA-based optical physical layer that allows us to simultaneously send wireless signals and carry digital signals in on-off keying (OOK) modulation format in a WDM (wavelength-division multiplexing) PON system. The RSOA was characterized to show its ability to handle the high dynamic range of the analog wireless signal. The RF and IF versions of the fiber system are then compared, with their respective advantages and drawbacks. Finally, we experimentally demonstrate a point-to-point WDM link using full-duplex transmission of an analog WiFi signal together with a downstream OOK signal. By introducing two RF mixers in the uplink, we solved the incompatibility problem with the TDD (time-division duplexing) based wireless system.
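For readers unfamiliar with the OOK format mentioned above, here is a minimal NumPy sketch of on-off keying; the bit pattern, sample counts and carrier frequency are arbitrary illustration values, not the parameters of the experiment.

```python
# Illustrative NumPy sketch of on-off keying (OOK): a carrier is switched
# on or off by the data bits. Parameters are arbitrary, not those of the
# experiment described above.
import numpy as np

def ook_modulate(bits, samples_per_bit=100, carrier_cycles_per_bit=10):
    """Return an OOK waveform: carrier present for 1-bits, absent for 0-bits."""
    t = np.arange(samples_per_bit) / samples_per_bit
    carrier = np.sin(2 * np.pi * carrier_cycles_per_bit * t)
    return np.concatenate([bit * carrier for bit in bits])

bits = np.array([1, 0, 1, 1, 0])
waveform = ook_modulate(bits)
print(waveform.shape)  # (500,) -> five bit periods of 100 samples each
```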
Abstract:
The speed with which technology develops and the widespread competition that has taken hold in our society in recent years lead to a growing demand for more capable, creative and innovative citizens. The seduction and fascination that this same technology exerts on young students frequently drive them away from the classroom whenever it persists in resisting changes from the outside. What matters is that students' preferences and abilities regarding technology are engaged through pedagogical practices that allow a more appealing "knowing how to teach" and that awaken intellectual interest and the desire to learn. This project focused on research into the reflective e-portfolio and on the contribution of the capabilities of the Mahara software to active and reflective learning. We chose this theme because it best suits the needs and specific characteristics of our students, and because Mahara lets each author control the information (artifacts) they wish to include in their e-portfolio and share with other users, which sets it apart from other e-portfolio software.
Abstract:
Electronic Publishing -- Origination, Dissemination and Design (EP-odd) is an academic journal which publishes refereed papers in the subject area of electronic publishing. The authors of the present paper are, respectively, editor-in-chief, system software consultant and senior production manager for the journal. EP-odd's policy is that editors, authors, referees and production staff will work closely together using electronic mail. Authors are also encouraged to originate their papers using one of the approved text-processing packages together with the appropriate set of macros which enforce the layout style for the journal. This same software will then be used by the publisher in the production phase. Our experiences with these strategies are presented, and two recently developed suites of software are described: one of these makes the macro sets available over electronic mail and the other automates the flow of papers through the refereeing process. The decision to produce EP-odd in this way means that the publisher has to adopt production procedures which differ markedly from those employed for a conventional journal.
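As a loose illustration of the kind of refereeing-flow automation mentioned above, the sketch below models a paper's progress as a small state machine; the states and transitions are invented for illustration and are not EP-odd's actual workflow.

```python
# Minimal sketch of a refereeing-flow state machine. States and transitions
# are invented for illustration and are not EP-odd's actual system.
ALLOWED = {
    "submitted":          {"with_referees"},
    "with_referees":      {"accepted", "rejected", "revision_requested"},
    "revision_requested": {"with_referees"},
    "accepted":           {"in_production"},
    "in_production":      {"published"},
}

class Paper:
    def __init__(self, title: str):
        self.title = title
        self.state = "submitted"

    def advance(self, new_state: str) -> None:
        if new_state not in ALLOWED.get(self.state, set()):
            raise ValueError(f"cannot go from {self.state} to {new_state}")
        self.state = new_state

paper = Paper("Experiences with electronic refereeing")
paper.advance("with_referees")
paper.advance("revision_requested")
paper.advance("with_referees")
paper.advance("accepted")
print(paper.state)  # accepted
```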
Abstract:
The Graphical User Interface (GUI) is an integral component of contemporary computer software. A stable and reliable GUI is necessary for correct functioning of software applications. Comprehensive verification of the GUI is a routine part of most software development life-cycles. The input space of a GUI is typically large, making exhaustive verification difficult. GUI defects are often revealed by exercising parts of the GUI that interact with each other. It is challenging for a verification method to drive the GUI into states that might contain defects. In recent years, model-based methods that target specific GUI interactions have been developed. These methods create a formal model of the GUI's input space from the specification of the GUI, visible GUI behaviors and static analysis of the GUI's program-code. GUIs are typically dynamic in nature; their user-visible state is guided by the underlying program-code and dynamic program-state. This research extends existing model-based GUI testing techniques by modelling interactions between the visible GUI of a GUI-based software and its underlying program-code. The new model is able to test the GUI, efficiently and effectively, in ways that were not possible using existing methods. The thesis is this: long, useful GUI testcases can be created by examining the interactions between the GUI of a GUI-based application and its program-code. To explore this thesis, a model-based GUI testing approach is formulated and evaluated. In this approach, program-code-level interactions between GUI event handlers are examined, modelled and deployed to construct long GUI testcases. These testcases are able to drive the GUI into states that were not reachable using existing models. Implementation and evaluation have been conducted using GUITAR, a fully-automated, open-source GUI testing framework.
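To make the idea concrete, here is a minimal sketch of generating event sequences from an interaction model, assuming the model is a directed graph whose edges mark event handlers that interact; the event names are invented, and this is an illustration of the approach, not the GUITAR implementation.

```python
# Minimal sketch of model-based GUI testcase generation: events are nodes,
# and a directed edge e1 -> e2 records that e1's handler interacts with
# e2's handler (e.g., writes program-state that e2 reads). Walking paths in
# this graph yields long event sequences. Illustration only, not GUITAR.
INTERACTS = {
    "open_file": ["edit_text", "save_file"],
    "edit_text": ["save_file", "undo"],
    "undo":      ["edit_text"],
    "save_file": [],
}

def testcases(start: str, length: int):
    """Enumerate event sequences of the given length along interaction edges."""
    paths = [[start]]
    for _ in range(length - 1):
        paths = [p + [nxt] for p in paths for nxt in INTERACTS[p[-1]]]
    return paths

for case in testcases("open_file", 4):
    print(" -> ".join(case))
```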
Abstract:
The life cycle of software applications is generally very short, with extremely volatile requirements. Under these conditions programmers need development tools and techniques that offer an extreme level of productivity. We consider code reuse the most promising approach to this problem. Our proposal uses the advantages provided by Aspect-Oriented Programming to build a reusable framework capable of making both programmer and application oblivious to data persistence, thus avoiding the need to write any line of code about that concern. Besides the benefits to productivity, software quality increases. This paper describes the current state of the art, identifying the main challenge in building a complete and reusable framework for Orthogonal Persistence in concurrent environments with support for transactions. The present work also includes a successfully developed prototype of that framework, capable of freeing the programmer from implementing any read or write data operations. This prototype is backed by an object-oriented database and, in the future, will also use a relational database and support transactions.
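The framework itself relies on Aspect-Oriented Programming; as a language-neutral sketch of the underlying interception idea, the Python fragment below records attribute writes transparently so the domain class contains no persistence code. The in-memory store is a stub standing in for the object-oriented database.

```python
# Language-neutral illustration of the interception idea behind orthogonal
# persistence: attribute writes on domain objects are captured transparently,
# so the class itself contains no persistence code. The "store" is a stub;
# the abstract's framework uses AOP and a real object database instead.
class Persistent:
    _store = {}  # stand-in for an object database

    def __setattr__(self, name, value):
        object.__setattr__(self, name, value)
        # Side effect invisible to the domain code: record the new state.
        Persistent._store[(id(self), name)] = value

class Account(Persistent):      # no persistence logic in the domain class
    def __init__(self, owner, balance):
        self.owner = owner
        self.balance = balance

acct = Account("alice", 100)
acct.balance = 250
print(Persistent._store[(id(acct), "balance")])  # 250
```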
Abstract:
Spatial-temporal dynamics of zooplankton in the Caravelas river estuary (Bahia, Brazil). The survey was conducted in order to describe the zooplankton community of the Caravelas estuary (Bahia, Brazil) and to quantify and relate the patterns of horizontal and vertical transport to the type of tide (neap and spring) and tidal phase (flood and ebb). Zooplankton samples were collected with the aid of a suction pump (300 L), filtered through plankton nets (300 μm) and fixed in 4% saline formalin. Samples were taken at a fixed point (A1) near the mouth of the estuary, at neap and spring tides during the dry and rainy seasons. Sampling lasted 13 hours, at 1-hour intervals, at 3 depths: surface, middle and bottom. Simultaneously with the biological collection, current velocity, temperature and salinity of the water were measured with a CTD. In the laboratory, samples were selected for analysis under a stereomicroscope, and 25 groups were identified, with Copepoda contributing the highest number of species. The 168 temporal samples were subsampled and processed on ZooScan equipment with the aid of ZooProcess software, generating 458,997 vignettes in total. Eight taxa were identified automatically and 16 semi-automatically. The group Copepoda, despite the limited taxonomic resolution of ZooScan, had 2 genera and 1 species identified automatically. Across the dry and wet seasons, the groups Brachyura (zoea), Chaetognatha, the calanoid copepods (others), Temora spp., Oithona spp. and Euterpina acutifrons had the highest frequency of occurrence, appearing in more than 70% of the samples. The Copepoda group showed the largest relative abundance in both seasons. There was no seasonal variation of total zooplankton, with an average density of 7826±4219 org.m-3 in the dry season and 7959±3675 org.m-3 in the rainy season, nor variation between the types and phases of the tides, but significant seasonal differences were recorded for the main zooplankton groups. Vertical stratification was seen for the major zooplankton groups (Brachyura, Chaetognatha, Calanoida (others), Oithona spp., Temora spp. and Euterpina acutifrons). The extent of this stratification varied with the type (neap or spring) and phase (flood or ebb) of the tide. The instantaneous transport was more influenced by current velocity, with higher values observed at spring tides for total zooplankton; however, this pattern varied depending on the zooplankton group. According to the import and export data for total zooplankton, the outflow of organisms from the estuary was higher than the inflow. The results suggest that the Caravelas estuary may influence the dynamics of organic matter on the adjacent coast, with possible consequences for the Abrolhos National Marine Park.
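As a hedged aside on the transport estimate used above, instantaneous transport is commonly computed as organism density times current velocity (times the section area when integrating over a channel); the sketch below uses invented velocities with the dry-season mean density.

```python
# Minimal sketch of the instantaneous-transport estimate implied above:
# organism flux = density x current velocity (x section area, if integrating
# over a channel section). Velocities are invented for illustration.
def instantaneous_transport(density_org_m3: float,
                            velocity_m_s: float,
                            section_area_m2: float = 1.0) -> float:
    """Organisms passing the section per second (org/s)."""
    return density_org_m3 * velocity_m_s * section_area_m2

flood = instantaneous_transport(7826, velocity_m_s=0.6)    # inflow (+)
ebb = instantaneous_transport(7826, velocity_m_s=-0.8)     # outflow (-)
net = flood + ebb
print(f"net transport: {net:.0f} org/s per m^2 of section")
# A negative net means export from the estuary, as reported for total zooplankton.
```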
Abstract:
Ethernet connections, which are widely used in many computer networks, can suffer from electromagnetic interference. Typically, a degradation of the data transmission rate can be perceived, as electromagnetic disturbances lead to corruption of data frames on the network media. In this paper a software-based measuring method is presented which allows a direct assessment of the effects on the link layer. The results can be linked directly to the physical interaction, without the influence of software-related effects on higher protocol layers. This provides a simple tool for a quantitative analysis of the disturbance of an Ethernet connection based on time-domain data. An example shows how the data can be used for further investigation of mechanisms and for detection of intentional electromagnetic attacks.
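The paper's tool is not reproduced here, but as a sketch of one way to observe link-layer errors in the time domain, the fragment below polls the per-interface error counters that the Linux kernel exposes under /sys/class/net; the interface name and sampling interval are assumptions.

```python
# Sketch of one way to observe link-layer disturbances in the time domain on
# Linux: poll the kernel's per-interface error counters. This illustrates the
# kind of measurement described; it is not the paper's actual tool.
import time
from pathlib import Path

def read_counter(iface: str, counter: str) -> int:
    return int(Path(f"/sys/class/net/{iface}/statistics/{counter}").read_text())

def sample_error_rate(iface: str = "eth0", interval_s: float = 1.0, samples: int = 5):
    last = read_counter(iface, "rx_errors")
    for _ in range(samples):
        time.sleep(interval_s)
        now = read_counter(iface, "rx_errors")
        yield (now - last) / interval_s   # errors per second in this window
        last = now

if __name__ == "__main__":
    for rate in sample_error_rate():
        print(f"{rate:.1f} rx errors/s")
```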
Abstract:
This work proposes a translation methodology to reproduce a PLC control program in the Matlab/Simulink environment. The PLC instruction list is automatically translated into the Matlab/Simulink language: the PLC program becomes a Matlab function block inside the Matlab/Simulink environment, which controls the model of the industrial process while the simulation runs. The inputs and outputs of the translated PLC list depend on the type of automaton chosen. The translated list is compatible with a Matlab/Simulink file corresponding to the translated PLC control program.
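As a toy illustration of the translation idea, the sketch below maps a few IEC 61131-3 Instruction List operations onto MATLAB statements of the kind that could populate a MATLAB function block; the instruction coverage and the mapping are invented, not the thesis's actual translator.

```python
# Toy illustration of the translation idea: map a few IEC 61131-3
# Instruction List (IL) operations onto MATLAB statements. The mapping is
# invented for illustration; the thesis's translator is not reproduced here.
IL_PROGRAM = [
    ("LD",  "start"),      # load operand into the accumulator
    ("AND", "not_stop"),   # accumulator := accumulator AND operand
    ("OR",  "motor_on"),   # accumulator := accumulator OR operand
    ("ST",  "motor_on"),   # store accumulator into operand
]

def il_to_matlab(program) -> str:
    lines = []
    for op, operand in program:
        if op == "LD":
            lines.append(f"acc = {operand};")
        elif op in ("AND", "OR"):
            lines.append(f"acc = {op.lower()}(acc, {operand});")
        elif op == "ST":
            lines.append(f"{operand} = acc;")
    return "\n".join(lines)

print(il_to_matlab(IL_PROGRAM))
```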
Abstract:
This work consisted of the design, implementation, validation and verification of innovative strategies for the teaching-learning process of the mathematics content of block 1 of the 2nd year of baccalaureate at U. E. Zoila Esperanza Palacios, through the use of ICT. Two parallel classes were used so that the achievements could later be compared. In the lessons, previously designed strategies were implemented using computing resources such as websites with animations in "swf" format, the online platform THATQUIZ, Algebrator, GeoGebra, videos and Microsoft PowerPoint. This was accomplished through the proposal of a "Didactic Guide for the Teacher" that includes the designed strategies; in addition, the micro-curricular planning for block 1 was prepared. Surveys were also designed and administered, first to characterize the group being taught, and then to determine the level of satisfaction with the software used. The impact of applying the proposed strategies was verified as follows: 1. Obtaining and analyzing the grades earned by the two parallel classes in the first block of the 2013-2014 school year. 2. Obtaining and analyzing the grades earned by the two parallel classes in the first block of the 2014-2015 school year. 3. Comparing the two analyses. 4. Administering and analyzing the characterization and software-satisfaction surveys in the class that received the intervention.
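As a hedged sketch of the comparison in step 3, an independent-samples t-test is one conventional way to compare the two parallel classes; the grade lists below are invented, not the study's data.

```python
# Minimal sketch of the comparison in step 3: test whether the intervention
# class and the control class differ in mean grade. The grade lists are
# invented; the study's real data are not reproduced here.
from scipy import stats

control      = [6.1, 5.8, 7.0, 6.5, 5.5, 6.8, 6.0, 7.2]
intervention = [7.4, 6.9, 8.1, 7.8, 6.6, 8.0, 7.1, 7.9]

t_stat, p_value = stats.ttest_ind(intervention, control, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
# A small p-value would indicate a significant difference between the groups.
```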
Abstract:
Ensuring the security of computers is a non-trivial task, with many techniques used by malicious users to compromise these systems. In recent years a new threat has emerged in the form of networks of hijacked zombie machines used to perform complex distributed attacks such as denial of service and to obtain sensitive data such as password information. These zombie machines are said to be infected with a 'bot' - a malicious piece of software which is installed on a host machine and is controlled by a remote attacker, termed the 'botmaster' of a botnet. In this work, we use the biologically inspired dendritic cell algorithm (DCA) to detect the existence of a single bot on a compromised host machine. The DCA is an immune-inspired algorithm based on an abstract model of the behaviour of the dendritic cells of the human body. The anomaly detection performed by the DCA is facilitated through the correlation of behavioural attributes such as keylogging and packet-flooding behaviour. The results of applying the DCA to the detection of a single bot show that the algorithm is a successful technique for detecting such malicious software without responding to normally running programs.
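The following is a much-simplified sketch of the DCA's signal-correlation idea: behavioural observations contribute "danger" and "safe" signals to a population of cells, and a process is flagged when most cells mature in a danger context. The weights, signal values and cell model are invented simplifications of the published algorithm.

```python
# Much-simplified sketch of the dendritic cell algorithm's core idea:
# correlate behavioural signals (e.g., keylogging, packet flooding) observed
# alongside a process, and flag the process when "danger" outweighs "safe".
# Weights and thresholds are invented; the real DCA differs in detail.
import random

def dca_score(observations, cells=100):
    """Fraction of cells that mature in a 'danger' context."""
    mature_danger = 0
    for _ in range(cells):
        migration = random.uniform(0.5, 1.5)    # per-cell migration threshold
        danger = safe = 0.0
        for keylog_rate, flood_rate, normal_io in observations:
            danger += 0.6 * keylog_rate + 0.4 * flood_rate
            safe += 1.0 * normal_io
            if danger + safe >= migration:       # cell migrates, presents context
                break
        mature_danger += danger > safe
    return mature_danger / cells

bot_like = [(0.8, 0.9, 0.1)] * 5    # heavy keylogging + packet flooding
benign   = [(0.0, 0.1, 0.9)] * 5    # mostly normal I/O
print("bot-like:", dca_score(bot_like))   # close to 1.0 -> anomalous
print("benign:  ", dca_score(benign))     # close to 0.0 -> normal
```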
Abstract:
With the aim of improving the performance of software development teams, many development methodologies have been defined, among them prescriptive and agile ones. Even so, teams often use none of them, finding them ill-suited to their particular context, and go straight to the task of programming. This article proposes a lightweight software development methodology adapted to a precise, well-defined context, which aligns Microsoft Solutions Framework for Agile Software Development (MSF4ASD) with the project management guidelines presented in the PMBOK Guide. In addition, it provides elements for implementing a Microsoft team software development platform to support it.
Abstract:
The general objective of this thesis has been the seasonal monitoring (on a quarterly time scale) of coastal and estuarine areas of a section of the Northern Coast of Rio Grande do Norte, Brazil, which is environmentally sensitive, subject to intense sediment erosion and home to oil activities, in order to underpin the implementation of projects for containing erosion and mitigating the impacts of coastal dynamics. To achieve the general objective, the work was carried out systematically in three stages, which constituted the specific objectives. The first stage was the implementation of the geodetic reference infrastructure for the geodetic survey of the study area. This included the implementation of the RGLS (Northern Coast of the RN GPS Network), consisting of stations with precise geodetic coordinates and orthometric heights; the positioning of benchmarks and the evaluation of the available gravimetric geoid, for use in precise GPS altimetry; and the development of software for precise GPS altimetry. The second stage was the development and improvement of methodologies for the collection, processing, representation, integration and analysis of CoastLines (CL) and Digital Elevation Models (DEM) obtained by geodetic positioning techniques. This stage included the choice of the equipment and positioning methods to be used, depending on the required precision and the implanted structure, and the definition of the CL indicator and of the geodetic references best suited to precise coastal monitoring. The third stage was the seasonal geodetic monitoring of the study area. It comprised the definition of the survey epochs, based on analysis of the sediment dynamics of the study area; the execution of surveys to calculate and locate areas and volumes of erosion and accretion (areal and volumetric sedimentary balance) occurring on the CL and on beach and island surfaces throughout the year; and the study of correlations between the variations (in area and volume) measured between surveys and the action of coastal dynamic agents. The results allowed an integrated study of the spatial and temporal interrelationships between the causes and consequences of the intense coastal processes operating in the area, especially the measurement of the variability of erosion, transport, sedimentary balance and sediment supply over the annual cycle of construction and destruction of beaches. In analysing the results, it was possible to identify the causes and consequences of the severe coastal erosion occurring on exposed beaches and to analyse the recovery of beaches and the accretion occurring in tidal inlets and estuaries. From the perspective of the seasonal variations in the CL, human interventions for erosion containment were proposed with the aim of restoring the previous state of the beaches undergoing erosion.
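As a minimal sketch of the volumetric sedimentary balance computed in the third stage, differencing two gridded DEMs and summing height change times cell area gives erosion, accretion and net volumes; the grids and cell size below are invented.

```python
# Minimal sketch of the volumetric sedimentary balance between two survey
# epochs: difference two gridded DEMs and sum height changes times cell area.
# The grids and cell size are invented for illustration.
import numpy as np

def volume_change(dem_before, dem_after, cell_area_m2):
    """Return (erosion, accretion, net) volumes in cubic metres."""
    dz = dem_after - dem_before                 # height change per cell (m)
    accretion = dz[dz > 0].sum() * cell_area_m2
    erosion = -dz[dz < 0].sum() * cell_area_m2
    return erosion, accretion, accretion - erosion

before = np.array([[2.0, 2.1], [1.8, 1.9]])    # elevations at epoch 1 (m)
after  = np.array([[1.7, 2.2], [1.6, 2.0]])    # elevations at epoch 2 (m)
ero, acc, net = volume_change(before, after, cell_area_m2=25.0)
print(f"erosion {ero:.1f} m^3, accretion {acc:.1f} m^3, net {net:.1f} m^3")
```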