155 results for Applications Software
at Instituto Politécnico do Porto, Portugal
Abstract:
The paradigm shift imposed by the Bologna process, in which students become responsible for their own learning, and the arrival of a new generation of students with stronger technological skills represent a huge challenge for higher education institutions. The use of new Social Web concepts in the teaching process, supported by applications commonly called Web 2.0, with which these new students feel at ease, can bring benefits in terms of motivation and of the frequency and quality of students' involvement in academic activities. An e-learning platform complemented by web-based applications can significantly contribute to the development of different skills in higher education students, covering areas that are usually deficient.
Abstract:
Master's in Electrical and Computer Engineering
Abstract:
In this paper, we present some of the fault tolerance management mechanisms being implemented in the Multi-μ architecture, namely its support for replica non-determinism. In this architecture, fault tolerance is achieved by node active replication, with software-based replica management and transparent fault tolerance algorithms. A software layer implemented between the application and the real-time kernel, the Fault Tolerance Manager (FTManager), is responsible for the transparent incorporation of the fault tolerance mechanisms. The active replication model can be implemented either by imposing replica determinism or by keeping replica consistency at critical points, by means of interactive agreement mechanisms. One of the Multi-μ architecture goals is to identify such critical points, relieving the underlying system from performing the interactive agreement at every Ada dispatching point.
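As a rough illustration of the second option (agreement at critical points), the sketch below shows, in Python and with hypothetical names (the Multi-μ work itself targets Ada on a real-time kernel), how replicas could exchange their locally computed values and apply the same deterministic choice, restoring consistency without forcing determinism on every operation.

```python
def agree_at_critical_point(local_value, exchange_with_replicas):
    """Hypothetical sketch of an interactive agreement step.

    `exchange_with_replicas` abstracts the group communication: it sends
    `local_value` to the other replicas and returns the list of values
    produced by all replicas (including this one). Every replica then
    applies the same deterministic rule, so all of them continue with an
    identical value even if their local computations diverged.
    """
    values = exchange_with_replicas(local_value)
    # Deterministic choice: the median of the exchanged values; a real
    # system could instead use majority voting or a designated leader value.
    return sorted(values)[len(values) // 2]
```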
Abstract:
The recent trend in chip architectures towards larger numbers of heterogeneous cores, non-uniform memory and non-coherent caches brings renewed attention to the use of Software Transactional Memory (STM) as a fundamental building block for developing parallel applications. Nevertheless, although STM promises to ease concurrent and parallel software development, it relies on the possibility of aborting conflicting transactions to maintain data consistency, which impacts the responsiveness and timing guarantees required by embedded real-time systems. In these systems, contention delays must be (efficiently) limited so that the response times of tasks executing transactions are upper-bounded and task sets can be feasibly scheduled. In this paper we assess the use of STM in the development of embedded real-time software, arguing that the amount of contention can be reduced if read-only transactions access recent consistent data snapshots and progress in a wait-free manner. We show how the required number of versions of a shared object can be calculated for a set of tasks. We also outline an algorithm to manage conflicts between update transactions that prevents starvation.
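A minimal sketch of the snapshot idea, in Python and with illustrative names rather than the paper's actual interfaces: update transactions publish new versions of a shared object, while a read-only transaction picks the newest version not exceeding its snapshot timestamp and therefore never waits for or aborts a concurrent writer. The number of retained versions k is simply a parameter here, whereas the paper shows how to derive it from the task set.

```python
import threading

class VersionedObject:
    """Keeps the last k committed versions of a shared object (sketch only)."""

    def __init__(self, initial_value, k):
        self._k = k
        self._versions = ((0, initial_value),)   # tuple of (commit_stamp, value)
        self._writer_lock = threading.Lock()     # serialises update transactions

    def commit(self, commit_stamp, value):
        # Update transactions publish a new immutable tuple of versions; the
        # reference swap is atomic, so readers always see a consistent list.
        with self._writer_lock:
            versions = self._versions + ((commit_stamp, value),)
            self._versions = versions[-self._k:]

    def read(self, snapshot_stamp):
        # Read-only transactions take no lock: they scan the current tuple for
        # the newest version not newer than their snapshot, hence they never
        # block or abort because of concurrent updates (wait-free progress).
        for commit_stamp, value in reversed(self._versions):
            if commit_stamp <= snapshot_stamp:
                return value
        raise LookupError("snapshot predates the oldest retained version")
```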
Abstract:
Management and invoicing applications are an indispensable presence nowadays. Starting out as text-mode "MS-DOS" applications, they followed the evolution of operating systems and naturally adopted a graphical environment. If a few years ago only companies with a significant business volume owned invoicing software, it has since been adopted by more and more companies and small businesses. The legislative changes introduced since 2011 led to its widespread adoption by small and micro enterprises. The market for management applications is saturated by the large national software producers: Primavera, Sage, etc. These applications, having been built for SMEs (small and medium-sized enterprises) and even large companies, are excessively complex and costly for very small and micro enterprises. The business model of these software producers is primarily the sale of licences and maintenance contracts, in some cases through networks of agents. The goal of this project was the development of a low-cost, simple, cross-platform invoicing application to be marketed under a rental model to small and micro enterprises.
Abstract:
3rd Workshop on High-performance and Real-time Embedded Systems (HIRES 2015), 21 January 2015, Amsterdam, Netherlands.
Abstract:
Article in Press, Corrected Proof
Abstract:
Recent embedded processor architectures containing multiple heterogeneous cores and non-coherent caches have renewed attention to the use of Software Transactional Memory (STM) as a building block for developing parallel applications. STM promises to ease concurrent and parallel software development, but relies on the possibility of aborting conflicting transactions to maintain data consistency, which in turn affects the execution time of tasks carrying transactions. As a result, the timing behaviour of the task set may not be predictable, so it is crucial to limit the execution time overheads resulting from aborts. In this paper we formalise a FIFO-based algorithm to order the sequence of commits of concurrent transactions. Then, we propose and evaluate two non-preemptive scheduling strategies and one SRP-based fully-preemptive scheduling strategy, in order to avoid transaction starvation.
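The sketch below illustrates the FIFO commit-ordering idea in Python, with names of our own choosing rather than the paper's: each update transaction takes a ticket when it starts, and commits are released strictly in ticket order, so no transaction can be indefinitely overtaken (and hence starved) by transactions that started later.

```python
import threading

class FifoCommitOrder:
    """Ticket-based FIFO ordering of transaction commits (illustrative sketch)."""

    def __init__(self):
        self._next_ticket = 0     # next ticket to hand out
        self._now_serving = 0     # ticket allowed to commit next
        self._cv = threading.Condition()

    def begin(self):
        # A transaction takes its ticket when it starts.
        with self._cv:
            ticket = self._next_ticket
            self._next_ticket += 1
            return ticket

    def commit(self, ticket, publish_write_set):
        # Commits complete strictly in ticket (FIFO) order: a transaction waits
        # until every earlier transaction has committed, publishes its writes,
        # then releases the next one, so later transactions can never starve it.
        with self._cv:
            while ticket != self._now_serving:
                self._cv.wait()
            publish_write_set()
            self._now_serving += 1
            self._cv.notify_all()
```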
Abstract:
The recent technological advancements and market trends are causing an interesting phenomenon towards the convergence of the High-Performance Computing (HPC) and Embedded Computing (EC) domains. On one side, new kinds of HPC applications are being required by markets needing huge amounts of information to be processed within a bounded amount of time. On the other side, EC systems are increasingly concerned with providing higher performance in real time, challenging the performance capabilities of current architectures. The advent of next-generation many-core embedded platforms has the chance of intercepting this converging need for predictable high performance, allowing HPC and EC applications to be executed on efficient and powerful heterogeneous architectures integrating general-purpose processors with many-core computing fabrics. To this end, it is of paramount importance to develop new techniques for exploiting the massively parallel computation capabilities of such platforms in a predictable way. P-SOCRATES will tackle this important challenge by merging leading research groups from the HPC and EC communities. The time-criticality and parallelisation challenges common to both areas will be addressed by proposing an integrated framework for executing workload-intensive applications with real-time requirements on top of next-generation commercial off-the-shelf (COTS) platforms based on many-core accelerated architectures. The project will investigate new HPC techniques that fulfil real-time requirements. The main sources of indeterminism will be identified, and efficient mapping and scheduling algorithms will be proposed, along with the associated timing and schedulability analysis, to guarantee the real-time and performance requirements of the applications.
Abstract:
It is widely accepted that organizations and individuals must be innovative and continually create new knowledge and ideas to deal with rapid change. Innovation plays an important role not only in the development of new businesses, processes and products, but also in the competitiveness and success of any organization. Technology for Creativity and Innovation: Tools, Techniques and Applications provides empirical research findings and best practices on creativity and innovation in business, organizational, and social environments. It is written for educators, academics and professionals who want to improve their understanding of creativity and innovation, as well as of the role technology plays in shaping this discipline.
Abstract:
In this paper we present VERITAS, a tool focused on verification, one of the most important processes carried out during the development of KBS. The verification and validation (V&V) process is part of a wider process denominated knowledge maintenance, in which an enterprise systematically gathers, organizes, shares, and analyzes knowledge to accomplish its goals and mission. The V&V process determines whether the software requirements specifications have been correctly and completely fulfilled. The methodologies proposed in software engineering have proved inadequate for Knowledge Based Systems (KBS) validation and verification, since KBS present some particular characteristics. VERITAS is an automatic tool developed for KBS verification which is able to detect a large number of knowledge anomalies. It addresses many relevant aspects considered in real applications, such as the use of rule-triggering selection mechanisms and temporal reasoning.
Abstract:
Power system planning, control and operation require an adequate use of existing resources so as to increase system efficiency. The use of optimal solutions in power systems allows huge savings, stressing the need for adequate optimization and control methods. These must be able to solve the envisaged optimization problems in time scales compatible with operational requirements. Power systems are complex, uncertain and changing environments that make the use of traditional optimization methodologies impracticable in most real situations. Computational intelligence methods present good characteristics to address this kind of problem and have already proved efficient for very diverse power system optimization problems. Evolutionary computation, fuzzy systems, swarm intelligence, artificial immune systems, neural networks, and hybrid approaches are presently seen as the most adequate methodologies to address several planning, control and operation problems in power systems. Future power systems, with intensive use of distributed generation and electricity market liberalization, increase power system complexity and bring huge challenges to the forefront of the power industry. Decentralized intelligence and decision making require more effective optimization and control techniques so that the involved players can make the most adequate use of existing resources in the new context. The application of computational intelligence methods to several problems of future power systems is presented in this chapter. Four different applications are presented to illustrate the promise of computational intelligence and its potential.
Abstract:
Artificial intelligence techniques are being widely used to face the new reality and to provide solutions that can make power systems undergo all the changes while assuring high quality power. In this way, the agents that act in the power industry are gaining access to a generation of more intelligent applications, making use of a wide set of AI techniques. Knowledge-based systems and decision-support systems have been applied in the power and energy industry. This article is intended to offer an updated overview of the application of artificial intelligence in power systems. It is organized so that readers can easily understand the problems and the adequacy of the proposed solutions. Because of space constraints, this approach can be neither complete nor sufficiently deep to satisfy all readers' needs. As this is a multidisciplinary area, able to attract both software and computer engineering and power system people, this article tries to give an insight into the most important concepts involved in these applications. Complementary material can be found in the reference list, providing deeper and more specific approaches.
Abstract:
Presently, power system operation produces huge volumes of data that are still treated in a very limited way. Knowledge discovery and machine learning can make use of these data, resulting in relevant knowledge with a very positive impact. In the context of competitive electricity markets these data are of even higher value, making clear the trend towards a more relevant application of data mining techniques in power systems. This paper presents two cases based on real data, showing the importance of data mining for supporting demand response and for supporting players' strategic behavior.
Abstract:
A supervisory control and data acquisition (SCADA) system is an integrated platform that incorporates several components and has been applied in the field of power systems and in several engineering applications to monitor, operate and control a wide range of processes. In future electrical networks, SCADA systems are essential for the intelligent management of resources such as distributed generation and demand response, implemented in the smart grid context. This paper presents a SCADA system for a typical residential house. The application is implemented in the MOVICON™ 11 software. The main objective is to manage the residential consumption, reducing or curtailing loads to keep the power consumption at or below a specified setpoint, imposed by the customer and the generation availability.
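A minimal sketch of the curtailment rule described above, in Python with a hypothetical load description (the actual application is built in MOVICON™ 11): loads are shed from the least important upwards until total consumption falls to or below the setpoint.

```python
def enforce_setpoint(loads, setpoint_w):
    """Curtail loads until total consumption is at or below the setpoint.

    `loads` is a list of dicts with 'name', 'power_w', 'priority'
    (higher = more important to keep) and 'on' keys; this structure is
    illustrative, not the format used by the SCADA application.
    Returns the resulting total consumption in watts.
    """
    total = sum(load['power_w'] for load in loads if load['on'])
    # Shed loads starting from the least important one still switched on.
    for load in sorted(loads, key=lambda l: l['priority']):
        if total <= setpoint_w:
            break
        if load['on']:
            load['on'] = False
            total -= load['power_w']
    return total


# Example: with a 2000 W setpoint, the water heater is curtailed first.
house = [
    {'name': 'fridge',       'power_w': 150,  'priority': 3, 'on': True},
    {'name': 'water_heater', 'power_w': 1500, 'priority': 1, 'on': True},
    {'name': 'oven',         'power_w': 1000, 'priority': 2, 'on': True},
]
print(enforce_setpoint(house, 2000))  # -> 1150
```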