876 results for Computer software - Development
Abstract:
[ES] The purpose of this final degree project is to offer a solution that helps people manage both their personal and work tasks more productively. Applications of this kind currently enjoy great success. It was decided to build the application around the Getting Things Done (GTD) methodology, since it increases productivity and reduces work-related stress. To date, few applications use this methodology, and those that do apply it only in a very basic way. Guided by the tutor's experience, we combined GTD with time-tracking controls to further improve the productivity of the people using the software. The result of this final degree project is the foundation of a web application for task management. The software is fully functional, easy to use, and intuitive, and it follows the Getting Things Done philosophy. The main objectives achieved were: user management; task and project management; application of the GTD methodology; and tracking of productive and unproductive time, interruptions, and timers. The application was developed as a final degree project in Computer Engineering, covering all phases of software development to obtain a functional product approved by the tutor, who played the role of a potential client. The project followed the RUP methodology: use-case driven, iterative, and incremental. To complete the process, a feature list was drawn up, the use cases were specified, and analysis, design, implementation, and testing phases were carried out. The main technologies used were Ruby on Rails, HTML5, CSS, AJAX, and JavaScript. The long-term goal is for this solution to serve as an implementation baseline which, with the necessary improvements, could be brought to market as a strong GTD-based task management product.
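As a purely illustrative sketch of the workflow this abstract describes (tasks moving through GTD buckets, with timers for productive time), the model below is hypothetical; it is not the project's actual Ruby on Rails code, and all names are invented for illustration:

```python
from dataclasses import dataclass, field
from datetime import datetime
from enum import Enum

class GtdStage(Enum):
    """The GTD buckets a captured item moves through."""
    INBOX = "inbox"          # captured, not yet clarified
    NEXT_ACTION = "next"     # actionable now
    WAITING_FOR = "waiting"  # delegated, awaiting someone else
    SOMEDAY = "someday"      # not actionable now, review later
    DONE = "done"

@dataclass
class Task:
    title: str
    stage: GtdStage = GtdStage.INBOX
    project: str | None = None
    work_log: list = field(default_factory=list)  # (start, end) pairs
    _running_since: datetime | None = None

    def start_timer(self) -> None:
        self._running_since = datetime.now()

    def stop_timer(self) -> None:
        """Close the open interval so productive time can be summed later."""
        if self._running_since is not None:
            self.work_log.append((self._running_since, datetime.now()))
            self._running_since = None

    def productive_seconds(self) -> float:
        return sum((end - start).total_seconds() for start, end in self.work_log)
```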
Abstract:
[ES] This final-year project addresses the updating and refactoring of the Hecaton application, which allows industrial installations to be monitored and actuated remotely through a web interface. To do so, it uses sensors and actuators which, connected through a data acquisition device to a server system, make it possible to obtain, manipulate, and store the data and events received. Hecaton has been developed entirely with free software. The system can also be customized, which makes it usable in all kinds of scenarios, with the user defining the rules of operation. This work constitutes the fourth development cycle, as the application was created and extended in three previous projects. In this latest cycle, the technologies and tools that make up the application have been updated. Special emphasis was placed on redesigning the web interface, adopting the latest web technologies to give it dynamic behaviour. Some design errors were also corrected, and new tools for managing the software project were introduced. It is therefore a software refactoring exercise focused on delivering an up-to-date project that uses current development methodologies and can continue to be updated in the future.
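The abstract does not detail how user-defined rules map sensor readings to actuators; the sketch below is a hypothetical illustration of that general idea, not Hecaton's actual design:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    sensor: str                          # sensor identifier
    predicate: Callable[[float], bool]   # condition on the latest reading
    actuator: str                        # actuator to trigger
    command: str                         # command to send

def evaluate(rules: list[Rule], readings: dict[str, float]) -> list[tuple[str, str]]:
    """Return the (actuator, command) pairs fired by the current readings."""
    return [(r.actuator, r.command)
            for r in rules
            if r.sensor in readings and r.predicate(readings[r.sensor])]

# Example: switch a fan on when a temperature sensor exceeds 40 degrees C.
rules = [Rule("temp-1", lambda v: v > 40.0, "fan-1", "on")]
print(evaluate(rules, {"temp-1": 42.5}))  # -> [('fan-1', 'on')]
```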
Abstract:
This PhD thesis presents the results, achieved at the Aerospace Engineering Department Laboratories of the University of Bologna, concerning the development of small-scale rotary-wing UAVs (RUAVs). In the first part of the work, a mission simulation environment for rotary-wing UAVs was developed as the main outcome of the University of Bologna's partnership in the CAPECON program (an EU-funded research program aimed at studying civil applications of UAVs and the economic effectiveness of potential configuration solutions). The results achieved in cooperation with DLR (German Aerospace Centre) and with a helicopter industry partner are described. In the second part of the work, a real small-scale rotary-wing platform was set up. The work was carried out in a series of logical steps, from hardware selection and set-up to final autonomous flight tests. This thesis focuses mainly on the RUAV avionics package set-up, the onboard software development, and the final experimental tests. The electronics package allowed helicopter responses to pilot commands to be recorded and provided deep insight into small-scale rotorcraft dynamics, facilitating the development of helicopter models and control systems in a Hardware-In-the-Loop (HIL) simulator. A nested PI velocity controller was implemented on the onboard computer and autonomous flight tests were performed. Comparison between HIL simulation and experimental results showed good agreement.
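To illustrate the nested (cascaded) PI velocity control scheme mentioned above, here is a minimal sketch under assumed gains and a simplified pitch-for-forward-velocity channel; it is not the thesis's actual onboard implementation:

```python
class PI:
    """Basic proportional-integral controller with a clamped integrator."""
    def __init__(self, kp: float, ki: float, limit: float):
        self.kp, self.ki, self.limit = kp, ki, limit
        self.integral = 0.0

    def update(self, error: float, dt: float) -> float:
        self.integral = max(-self.limit, min(self.limit, self.integral + error * dt))
        return self.kp * error + self.ki * self.integral

# Nested structure: the outer loop turns a forward-velocity error into a pitch
# setpoint; the inner loop turns the pitch error into a cyclic command.
# All gains and limits are illustrative, not the thesis's identified values.
outer = PI(kp=0.08, ki=0.02, limit=5.0)   # (m/s error) -> pitch setpoint, rad
inner = PI(kp=2.0, ki=0.5, limit=1.0)     # (rad error) -> cyclic command

def velocity_control_step(v_ref: float, v_meas: float,
                          pitch_meas: float, dt: float) -> float:
    """One control step: returns the longitudinal cyclic command."""
    pitch_ref = outer.update(v_ref - v_meas, dt)
    return inner.update(pitch_ref - pitch_meas, dt)
```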
Abstract:
The secretion of drugs from intestinal cells back into the intestinal lumen, mediated by intestinal transporters such as P-glycoprotein (P-GP), is a well-known source of incomplete and variable bioavailability and of interactions with other drugs and food components. Nevertheless, no publications have so far addressed the resulting consequences for the development of new peroral dosage forms. The aim of the present work was to make clear that intestinal secretion phenomena must be taken into account in the development of sustained-release drug products. To this end, effective permeabilities for the model drug talinolol were determined in different intestinal segments using a rat intestinal perfusion model. Furthermore, a sustained-release formulation of talinolol was developed; in the process it was shown that using different buffers as drug release media leads to the formation of different talinolol crystal structures. The newly developed sustained-release matrix tablets were evaluated with the pharmacokinetic software GastroPlus®. The interplay of slowed drug release from the dosage form and intestinal secretion led to a markedly reduced bioavailability of talinolol from the sustained-release formulation compared with immediate-release dosage forms. The influence of intestinal secretory transporters such as P-GP should therefore always be considered when developing sustained-release dosage forms.
Abstract:
The task underlying this dissertation can be briefly described as the investigation of component-based concepts for use in software development by end users. Over the last 20 to 30 years, the technical environment in which a large share of employees carry out their daily work has changed fundamentally. The computer, formerly the exclusive domain of specialists in the form of a mainframe, is now a natural part of everyday work. Working with application programs that allow users, within certain limits, to define new functionality of their own has become so commonplace in many areas that many of these activities are no longer consciously perceived as programming. Since these users are not necessarily trained in software development, they need appropriate support for such activities, which underlines the practical relevance of research in this area. To build a programming system for end users, a flexible application framework is first developed as a basis for creating such systems. Changing requirements, and the needs that follow from them, are an important aspect of software projects. The framework design accounts for this with concepts for providing reusable functionality and with mechanisms for adapting and extending that functionality: on the one hand a service-oriented architecture within the application, and on the other a component-oriented variant of the command pattern. In addition, a concept for encapsulating end-user programming models in components is developed, which makes it possible to use different models as the foundation of the resulting development environment. In the further course of the work, a programming model is designed and implemented on top of the aforementioned framework. For this model to be usable by end users, the level of abstraction at which a software system is described must be raised; this is achieved through the use of components and a message-based composition mechanism. The implementation is not yet tied to concrete application families; those adaptations are carried out in a further step for two different application domains.
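As a generic illustration of the command pattern referenced above (in its classic object-oriented form; the dissertation's component-oriented variant is not reproduced here), each user action becomes an object the environment can execute and undo:

```python
from abc import ABC, abstractmethod

class Command(ABC):
    """Encapsulates one user-level action so the environment can run or undo it."""
    @abstractmethod
    def execute(self) -> None: ...
    @abstractmethod
    def undo(self) -> None: ...

class ConnectComponents(Command):
    """Hypothetical command wiring two components via a message channel."""
    def __init__(self, registry: dict, source: str, target: str):
        self.registry, self.source, self.target = registry, source, target

    def execute(self) -> None:
        self.registry.setdefault(self.source, set()).add(self.target)

    def undo(self) -> None:
        self.registry.get(self.source, set()).discard(self.target)

class CommandStack:
    """History kept by the environment so end users can undo their edits."""
    def __init__(self):
        self._done: list[Command] = []

    def run(self, cmd: Command) -> None:
        cmd.execute()
        self._done.append(cmd)

    def undo_last(self) -> None:
        if self._done:
            self._done.pop().undo()
```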
Abstract:
Cost, performance, and availability considerations are forcing even the most conservative high-integrity embedded real-time systems industry to migrate from simple hardware processors to ones equipped with caches and other acceleration features. This migration disrupts the practices and solutions that industry had developed and consolidated over the years to perform timing analysis. Companies that are confident in the efficiency and effectiveness of their verification and validation processes for old-generation processors do not have sufficient insight into the effects of migrating to cache-equipped processors. Caches are perceived as an additional source of complexity with the potential to shatter the guarantees of cost- and schedule-constrained qualification of their systems. The current industrial approach to timing analysis is ill-equipped to cope with the variability introduced by caches. Conversely, applying advanced WCET analysis techniques to real-world industrial software, developed without analysability in mind, is hardly feasible. We propose a development approach aimed at minimising cache jitter and at enabling the application of advanced WCET analysis techniques to industrial systems. Our approach builds on: (i) identification of the software constructs that may impede or complicate timing analysis in industrial-scale systems; (ii) elaboration of practical means, under the model-driven engineering (MDE) paradigm, to enforce the automated generation of software that is analysable by construction; (iii) implementation of a layout optimisation method that removes the cache jitter stemming from the software's layout in memory, with the intent of facilitating incremental software development, which is of high strategic interest to industry. The integration of these constituents in a structured approach to timing analysis achieves two interesting properties: the resulting software is analysable from the earliest releases onwards, as opposed to becoming so only when the system is final, and it is more easily amenable to advanced timing analysis by construction, regardless of the system's scale and complexity.
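As a toy illustration of what a memory-layout optimisation against cache jitter can look like, the sketch below does greedy first-fit placement for an assumed direct-mapped instruction cache; it is deliberately simplified and is not the thesis's method:

```python
LINE = 32     # cache line size in bytes (assumed)
N_SETS = 256  # sets in a direct-mapped instruction cache (assumed)

def cache_sets(addr: int, size: int) -> set[int]:
    """Cache sets touched by a function of `size` bytes placed at `addr`."""
    first, last = addr // LINE, (addr + size - 1) // LINE
    return {line % N_SETS for line in range(first, last + 1)}

def place(functions: dict[str, int], conflicts: set[frozenset[str]]) -> dict[str, int]:
    """Greedy first-fit: shift each function by whole cache lines until it
    shares no cache set with any already-placed conflict partner.
    Assumes every function fits within the cache (size <= LINE * N_SETS)."""
    layout: dict[str, int] = {}
    cursor = 0
    for name, size in functions.items():
        partners = [p for pair in conflicts if name in pair
                    for p in pair if p != name and p in layout]
        addr = cursor
        while any(cache_sets(addr, size) & cache_sets(layout[p], functions[p])
                  for p in partners):
            addr += LINE
        layout[name] = addr
        cursor = addr + size
    return layout

# Example: keep an interrupt handler and the routine it preempts from
# evicting each other.
print(place({"isr": 512, "filter": 4096, "log": 1024},
            {frozenset({"isr", "filter"})}))
```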
Abstract:
Computer-assisted translation (or computer-aided translation, CAT) is a form of language translation in which a human translator uses computer software to facilitate the translation process. Machine translation (MT) is the automated process by which a computerized system produces translated text or speech from one natural language to another. Both are leading and promising technologies in the translation industry; it therefore seems important that translation students and professional translators become familiar with these relatively new types of technology. When used together, these two types of systems might not only reduce translation time but also drive further improvement in the field of translation technologies. The dissertation consists of four chapters. The first surveys the chronological development of MT and CAT tools, the emergence of pre-editing, post-editing, and controlled language, and the latest frontiers in this sector. The second provides a general overview of the four main CAT tools in use today, which are tested here. The third chapter is dedicated to the experiments conducted to analyze and evaluate the performance of the four integrated systems that are the core subject of this dissertation. Finally, the fourth chapter deals with the issue of terminological equivalence in interlinguistic translation. The purpose of this dissertation is not to provide an objective and definitive solution to the complex issues that arise in the field of translation technologies, an aim that is still far from being achieved, but to supply information about the limits and potential of instruments that are now essential to any professional translator.
Abstract:
The single-electron transistor (SET) is one of the best candidates for future nanoelectronic circuits because of its ultra-low power consumption, small size, and unique functionality. SET devices operate on the principle of Coulomb blockade, which is more prominent at dimensions of a few nanometers. Typically, a SET device consists of two capacitively coupled ultra-small tunnel junctions with a nano-island between them. To observe Coulomb blockade effects in a SET device, the charging energy of the device has to be greater than the thermal energy; this condition limits the operation of most existing SET devices to cryogenic temperatures. Room-temperature operation of SET devices requires sub-10 nm nano-islands, owing to the inverse dependence of the charging energy on the radius of the conducting nano-island. Fabricating sub-10 nm structures using lithography processes is still a technological challenge. In the present investigation, Focused Ion Beam (FIB) based etch and deposition technology is used to fabricate single-electron transistor devices operating at room temperature. The SET device incorporates an array of tungsten nano-islands with an average diameter of 8 nm. The fabricated devices are characterized at room temperature, and clear Coulomb blockade and Coulomb oscillations are observed. An improvement in the resolution limit of the FIB etching process is demonstrated by optimizing the thickness of the active layer. SET devices with structural and topological variations are developed to explore their impact on device behavior. The threshold voltage of the device was reduced to ~500 mV by narrowing the source-drain gap to 17 nm. Vertical source and drain terminals are fabricated to realize a single-dot SET device. A unique process flow is developed to fabricate Si-dot-based SET devices for better gate controllability in the device characteristics. The device parameters of the fabricated devices are extracted using a conductance model. Finally, the characteristics of these devices are validated against simulated data from theoretical modeling.
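The room-temperature condition can be checked with a back-of-the-envelope estimate. Treating a nano-island as an isolated sphere of radius r with self-capacitance C = 4*pi*eps0*r (an idealization; real junction capacitances differ), the charging energy E_C = e^2/2C for an 8 nm diameter island comfortably exceeds the thermal energy k_B*T at 300 K:

```python
import math

e = 1.602e-19      # elementary charge, C
eps0 = 8.854e-12   # vacuum permittivity, F/m
kB = 1.381e-23     # Boltzmann constant, J/K

def charging_energy(radius_m: float) -> float:
    """E_C = e^2 / 2C for a sphere with self-capacitance C = 4*pi*eps0*r."""
    C = 4 * math.pi * eps0 * radius_m
    return e**2 / (2 * C)

r = 4e-9                    # 8 nm diameter island -> 4 nm radius
Ec = charging_energy(r)
kT = kB * 300               # thermal energy at room temperature
print(f"E_C   = {Ec / e * 1000:.0f} meV")  # ~180 meV
print(f"k_B*T = {kT / e * 1000:.0f} meV")  # ~26 meV
print(f"ratio = {Ec / kT:.1f}")            # ~7, so blockade survives at 300 K
```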
Abstract:
Information systems (IS) outsourcing projects often fail to achieve their initial goals. To avoid project failure, managers need to design formal controls that meet the specific contextual demands of the project. However, the dynamic and uncertain nature of IS outsourcing projects makes it difficult to design such specific formal controls at the outset of a project. It is hence crucial to translate high-level project goals into specific formal controls during the course of a project. This study seeks to understand the underlying patterns of such translation processes. Based on a comparative case study of four outsourced software development projects, we inductively develop a process model that consists of three unique patterns. The process model shows that the performance implications of emergent controls with higher specificity depend on differences in the translation process. Specific formal controls have positive implications for goal achievement if only the stakeholder context is adapted, but negative implications if tasks are unintentionally adapted during the translation process; in the latter case, projects incrementally drift away from their initial direction. Our findings help to better understand control dynamics in IS outsourcing projects. We contribute to a process-theoretic understanding of IS outsourcing governance and derive implications for control theory and the IS project escalation literature.
Abstract:
The private-collective innovation model proposes incentives for individuals and firms to privately invest resources in creating public-goods innovations. Such innovations are characterized by non-rivalry and non-exclusivity in consumption. Examples include open source software, user-generated media products, drug formulas, and sports equipment designs. There is still limited empirical research on private-collective innovation. We present a case study to (1) provide empirical evidence of a case of private-collective innovation, showing specific benefits, and (2) extend the private-collective innovation model by analyzing the hidden costs for the company involved. We examine the development of the Nokia Internet Tablet, which builds on both proprietary and open source software development and involves both Nokia developers and volunteers not employed by the company. Seven benefits for Nokia are identified, as are five hidden costs: difficulty in differentiating, guarding business secrets, reducing community entry barriers, giving up control, and organizational inertia. We examine the actions taken by management to mitigate these costs throughout the development period.
Abstract:
Software Product Line Engineering (SPLE) is becoming widely used because of the improvements it brings when developing software products of the same family. However, SPLE demands a long-term investment in a product-line platform that might not be profitable in rapidly changing business settings. Since Agile Software Development (ASD) approaches are being successfully applied in volatile markets, several companies have suggested integrating SPLE and ASD when a product family has to be developed. Agile Product Line Engineering (APLE) advocates the integration of SPLE and ASD to address their respective shortcomings when applied individually to software development. A previous literature review of experiences and practices in APLE revealed important challenges in fully putting APLE into practice. Our contribution addresses several of these challenges by tailoring the agile method Scrum by means of three concepts that we have defined: plastic partial components, working PL-architectures, and reactive reuse.
Abstract:
[ES] This project studies and analyzes digital signal processing techniques applied to accelerometers, using a DSP-based prototyping board to run the various tests. The project chiefly consists in digitally filtering signals from a specific accelerometer, the 1201F, whose field of application is primarily automotive. After studying processing theory and filter characteristics, we designed an application oriented above all toward the environment in which an application of this kind would be deployed. The design is explained in its successive phases: computer design (Matlab), filter design on the DSP (C), tests on the DSP without the accelerometer, accelerometer calibration, final tests with the accelerometer... The tools used were the Analog Devices 21-161N evaluation kit (equipped with the VisualDSP++ 4.5 development environment), the 1201F accelerometer, the Spektra CS-18-LF accelerometer calibration system, and the software packages MATLAB 7.5 and CoolEdit Pro 2.0. Only second-order IIR filters were implemented, of all types (Butterworth, Chebyshev I and II, and elliptic): narrow-band band-pass and band-stop filters within the full scale allowed by the accelerometer. Once all the tests, both simulated and physical, were complete, the best-performing filters were selected and analyzed to draw conclusions. Since a suitable environment was available, the filters were also combined with one another in several ways to obtain higher-order filters (parallel structure); starting from band-pass filters, other configurations can thus be obtained that give greater flexibility. The purpose of this project is not only to obtain good filtering results, but also to exploit the facilities of the environment and the available tools to produce the most efficient design possible.
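As an illustration of the kind of filters described, second-order IIR band-pass sections combined in parallel to build a higher-order response, here is a sketch using SciPy rather than the project's MATLAB/VisualDSP++ toolchain; the sampling rate and band edges are assumed, not taken from the thesis:

```python
import numpy as np
from scipy import signal

fs = 8000.0  # sampling rate in Hz (assumed for illustration)

# Two second-order (one biquad each) Butterworth band-pass filters with
# narrow, adjacent passbands, expressed as second-order sections.
sos1 = signal.butter(1, [900, 1100], btype="bandpass", fs=fs, output="sos")
sos2 = signal.butter(1, [1100, 1300], btype="bandpass", fs=fs, output="sos")

def parallel_filter(x: np.ndarray) -> np.ndarray:
    """Parallel structure: run both sections on the input and sum the outputs,
    giving a wider composite band-pass (up to phase interaction at the
    crossover) than either section alone."""
    return signal.sosfilt(sos1, x) + signal.sosfilt(sos2, x)

# Quick check on a chirp sweeping through and beyond the composite band.
t = np.arange(0, 1.0, 1 / fs)
x = signal.chirp(t, f0=200, f1=3000, t1=1.0)
y = parallel_filter(x)
```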
Abstract:
Software testing is a key aspect of software reliability and quality assurance in a context where software development constantly has to overcome mammoth challenges in a continuously changing environment. One characteristic of software testing is that it has a large intellectual-capital component and can thus benefit from the experience gained in past projects; it can therefore potentially benefit from solutions provided by the knowledge management discipline. There are in fact a number of proposals concerning effective knowledge management for several software engineering processes. Objective: We defend the use of a lessons learned system for software testing. Such a system is an effective knowledge management resource that enables testers and managers to take advantage of the experience locked away in the brains of the testers. To do this, the experience has to be gathered, disseminated, and reused. Method: After analyzing existing proposals for managing software testing experience, significant weaknesses were detected in current systems of this type. The architectural model proposed here for lessons learned systems is designed to avoid these weaknesses. This model (i) defines the structure of the software testing lessons learned; (ii) sets up procedures for lessons learned management; and (iii) supports the design of software tools to manage the lessons learned. Results: A different approach, based on managing the lessons learned that software testing engineers gather from everyday experience, with two basic goals: usefulness and applicability. Conclusion: The architectural model proposed here lays the groundwork for overcoming the obstacles to sharing and reusing experience gained in software testing and test management. As such, it provides guidance for developing software testing lessons learned systems.
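Purely as a hypothetical illustration of point (i), the structure of a lesson could be a record along these lines (invented field names; not the authors' schema):

```python
from dataclasses import dataclass, field

@dataclass
class LessonLearned:
    """Hypothetical record structure for one software-testing lesson."""
    lesson_id: str
    title: str
    context: str                 # project/test situation where the lesson arose
    problem: str                 # what went wrong or was observed
    recommendation: str          # what to do (or avoid) next time
    test_activity: str           # e.g. "test design", "regression", "test management"
    keywords: list[str] = field(default_factory=list)
    reuse_count: int = 0         # incremented on each reuse; a simple usefulness signal

def search(repo: list[LessonLearned], term: str) -> list[LessonLearned]:
    """Naive keyword lookup; a real system would rank by context similarity."""
    t = term.lower()
    return [l for l in repo
            if t in l.title.lower() or t in (k.lower() for k in l.keywords)]
```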
Abstract:
This research is concerned with experimental software engineering, specifically experiment replication. Replication has traditionally been viewed as a complex task in software engineering, possibly because of the present immaturity of the experimental paradigm as applied to software development. Researchers usually use replication packages to replicate an experiment. However, replication packages do not solve all the information management problems that crop up as successive replications of an experiment accumulate. This research borrows ideas from the software configuration management and software product line paradigms to support the replication process. We believe that configuration management can help to manage and administer information from one replication to another: hypotheses, designs, data analysis, and so on. The software product line paradigm can help to organize and manage the changes each replication introduces into the experiment. We expect the union of the two paradigms to improve the planning, design, and execution of further replications and their alignment with existing ones. Additionally, this research will contribute a web-based support environment for archiving information related to the different replications of an experiment. It will provide information management support flexible enough to run replications with different numbers and types of changes, and it will afford massive storage of data from different replications. Experimenters working collaboratively on the same experiment must all have access to the different replications.