34 results for Source code
at Instituto Politécnico do Porto, Portugal
Abstract:
Master's degree in Electrical and Computer Engineering
Abstract:
Real-time systems demand guaranteed and predictable run-time behaviour in order to ensure that no task misses its deadline. Over the years we have witnessed an ever-increasing demand for functionality enhancements in embedded real-time systems. Along with the functionality, the design itself grows more complex. Imposed constraints, such as energy consumption, time, and space bounds, also require attention and proper handling. Additionally, efficient scheduling algorithms, as proven through analyses and simulations, often impose requirements that carry a significant run-time cost, especially in the context of multi-core systems. In order to further investigate the behaviour of such systems and to quantify and compare the overheads involved, we have developed SPARTS, a simulator of a generic embedded real-time device. The tasks in the simulator are described by externally visible parameters (e.g. minimum inter-arrival time, sporadicity, WCET, BCET), rather than by the code of the tasks. While our current implementation is primarily focused on our immediate needs in the area of power-aware scheduling, it is designed to be extensible so as to accommodate different task properties, scheduling algorithms and/or hardware models for application in a wide variety of simulations. The source code of SPARTS is available for download at [1].
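As an illustration of the parameter-driven task model described above, here is a minimal sketch, not taken from SPARTS itself: a task defined only by externally visible parameters (minimum inter-arrival time, sporadicity, WCET, BCET) from which release instants and execution demands are drawn. All names, units and the jitter model are assumptions.

```java
// Minimal sketch (not SPARTS code): a task described only by externally
// visible timing parameters; no task code is ever executed. Times in microseconds.
final class SimulatedTask {
    final long minInterArrival;   // minimum time between two releases
    final double sporadicity;     // extra random delay factor on top of the minimum
    final long wcet;              // worst-case execution time
    final long bcet;              // best-case execution time
    final long relativeDeadline;  // deadline measured from the release instant

    SimulatedTask(long minInterArrival, double sporadicity,
                  long wcet, long bcet, long relativeDeadline) {
        this.minInterArrival = minInterArrival;
        this.sporadicity = sporadicity;
        this.wcet = wcet;
        this.bcet = bcet;
        this.relativeDeadline = relativeDeadline;
    }

    /** Draws the next release instant: minimum separation plus a sporadic delay. */
    long nextRelease(long lastRelease, java.util.Random rng) {
        long jitter = (long) (rng.nextDouble() * sporadicity * minInterArrival);
        return lastRelease + minInterArrival + jitter;
    }

    /** Draws an execution demand uniformly between BCET and WCET. */
    long executionDemand(java.util.Random rng) {
        return bcet + (long) (rng.nextDouble() * (wcet - bcet));
    }
}
```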
Abstract:
Traditional Real-Time Operating Systems (RTOS) are not designed to accommodate application-specific requirements. They address a general case, and the application must co-exist with any limitations imposed by such a design. For modern real-time applications this limits the quality of the services offered to the end-user. Research in this field has shown that it is possible to develop dynamic systems where adaptation is the key to success. However, adaptation requires full knowledge of the system state. To overcome this, we propose a framework to gather data and interact with the operating system, extending the traditional POSIX trace model with a partial reflective model. Such a combination preserves the trace mechanism semantics while creating a powerful platform for developing new dynamic systems, with little impact on the system and without complex changes to the kernel source code.
Abstract:
With the ageing of the population, concerns about ensuring its well-being grow, creating the need to develop tools that allow this sector of the population to be monitored continuously. The use of smartphones by the elderly can be crucial to their well-being and autonomy, contributing to the collection of important information, since these devices are often equipped with sensors that can give the caregiver valuable indications about the patient's current state. The sensors can provide data on the patient's physical activity, as well as detect falls or compute their position, with the help of the accelerometer, the gyroscope, and the magnetic field sensor. However, such functionalities necessarily require a minimum sensor sampling frequency that allows the implementation of algorithms able to determine these parameters as accurately as possible. Since patients do not always carry their smartphone when they are at home, the creation of AAL (Ambient Assisted Living) environments using external devices that can be worn by patients may also be an adequate solution. These devices usually contain the same sensors as smartphones and communicate with them through wireless technologies such as Bluetooth Low Energy. In this work, the possibility of changing the sensor sampling frequency in different operating systems was evaluated, and modifications were made to the default installations of some open-source operating systems. With the aim of enabling the creation of an AAL solution based on an external device, services and profiles were implemented on such a device, the SensorTag.
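As a hedged illustration of the sampling-frequency issue discussed above, the sketch below shows how an Android application can request a specific accelerometer sampling period through the standard SensorManager API. The 50 Hz target and the class name are assumptions, not values from this work; whether the requested rate is honoured depends on the operating system build and the hardware, which is precisely what the work evaluates.

```java
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

// Illustrative sketch: requesting a sampling period from the accelerometer.
// The requested period is only a hint; the platform may deliver events
// faster or slower, depending on OS build and hardware.
public class AccelerometerSampler implements SensorEventListener {
    private static final int PERIOD_US = 20_000; // 50 Hz target (assumed value)

    private final SensorManager sensorManager;

    public AccelerometerSampler(SensorManager sensorManager) {
        this.sensorManager = sensorManager;
    }

    public void start() {
        Sensor accel = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
        sensorManager.registerListener(this, accel, PERIOD_US);
    }

    public void stop() {
        sensorManager.unregisterListener(this);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        float x = event.values[0], y = event.values[1], z = event.values[2];
        // Feed (x, y, z) into a fall-detection or activity algorithm here.
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { /* not used */ }
}
```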
Abstract:
Existing computer programming training environments help users learn programming by solving problems from scratch. Nevertheless, starting the resolution of a program can be frustrating and demotivating if the student does not know where and how to begin. Skeleton programming facilitates a top-down design approach, where a partially functional system with complete high-level structures is available, so the student only needs to progressively complete or update the code to meet the requirements of the problem. This paper presents CodeSkelGen – a program skeleton generator. CodeSkelGen generates skeleton or buggy Java programs from a complete annotated program solution provided by the teacher. The annotations are formally described within an annotation type and processed by an annotation processor. This processor is responsible for a set of actions ranging from the creation of dummy methods to the exchange of operator types included in the source code. The generator tool will be included in a learning environment that aims to assist teachers in the creation of programming exercises and to help students in their resolution.
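The following is a purely illustrative sketch of the kind of source-level annotation an annotation-processor-based generator such as CodeSkelGen could rely on; the annotation name, its element, and the example class are invented here and are not CodeSkelGen's actual API.

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

// Hypothetical marker: the teacher flags a method whose body should be
// replaced by a dummy stub in the generated skeleton; an annotation
// processor reads the marker at compile time.
@Retention(RetentionPolicy.SOURCE)
@Target(ElementType.METHOD)
@interface Skeleton {
    /** If true, the generator keeps only the signature and inserts a TODO body. */
    boolean dummyBody() default true;
}

class BankAccount {
    private double balance;

    @Skeleton(dummyBody = true)
    double withdraw(double amount) {
        // Full teacher solution; a generator would replace this body with,
        // for example, "throw new UnsupportedOperationException("TODO");".
        if (amount <= 0 || amount > balance) {
            throw new IllegalArgumentException("invalid amount");
        }
        balance -= amount;
        return balance;
    }
}
```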
Abstract:
The 6LoWPAN (IPv6 over low-power wireless personal area networks) and RPL (routing protocol for low-power and lossy links) protocols have become de facto standards for the Internet of Things (IoT). In this paper, we show that the two native algorithms that handle changes in network topology – the Trickle and Neighbor Discovery algorithms – behave in a reactive fashion and thus are not prepared for the dynamics inherent to node mobility. Many emerging and upcoming IoT application scenarios are expected to impose real-time and reliable mobile data collection, requirements that are not compatible with the long message latency, high packet loss and high overhead exhibited by the native RPL/6LoWPAN protocols. To solve this problem, we integrate a proactive hand-off mechanism (dubbed smart-HOP) within RPL, which is very simple, effective and backward compatible with the standard protocol. We show that this add-on halves the packet loss and dramatically reduces the hand-off delay to one tenth of a second upon node mobility, with a sub-percent overhead. The smart-HOP algorithm has been implemented and integrated in the Contiki 6LoWPAN/RPL stack (source code available online: MRPL: smart-HOP within RPL, 2014) and validated through extensive simulation and experimentation.
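For illustration only, the sketch below captures the general idea of a threshold-with-hysteresis proactive hand-off decision. It is written in Java for consistency with the other examples (the actual smart-HOP code is C inside the Contiki 6LoWPAN/RPL stack), and the thresholds, method names, and RSSI-based trigger are assumptions rather than the published algorithm.

```java
import java.util.Map;

// Generic proactive hand-off sketch: switch to a better parent only when the
// current link degrades below a threshold and a candidate is better by at
// least a hysteresis margin (to avoid ping-pong between parents).
class HandoffMonitor {
    private static final double LOW_RSSI_DBM = -90.0;  // assumed trigger threshold
    private static final double HYSTERESIS_DB = 4.0;   // assumed margin

    private String currentParent;
    private double currentParentRssi;

    HandoffMonitor(String parent, double rssi) {
        this.currentParent = parent;
        this.currentParentRssi = rssi;
    }

    /** Called whenever a link-quality estimate for the current parent arrives. */
    void onParentRssi(double rssiDbm, Map<String, Double> candidateRssi) {
        currentParentRssi = rssiDbm;
        if (rssiDbm >= LOW_RSSI_DBM) {
            return; // link still good: no probing, no hand-off
        }
        // Proactive phase: pick the best candidate and switch only if it is
        // clearly better than the degraded current link.
        candidateRssi.entrySet().stream()
                .max(Map.Entry.comparingByValue())
                .filter(best -> best.getValue() > currentParentRssi + HYSTERESIS_DB)
                .ifPresent(best -> switchParent(best.getKey(), best.getValue()));
    }

    private void switchParent(String newParent, double rssi) {
        currentParent = newParent;
        currentParentRssi = rssi;
        // In RPL terms this corresponds to selecting a new preferred parent
        // and updating downward routes.
    }
}
```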
Abstract:
The complexity of systems is considered an obstacle to the progress of the IT industry. Autonomic computing is presented as the alternative to cope with this growing complexity. It is a holistic approach in which systems are able to configure, heal, optimize, and protect themselves. Web-based applications are an example of systems where the complexity is high. The number of components, their interoperability, and workload variations are factors that may lead to performance failures or unavailability scenarios. The occurrence of these scenarios affects the revenue and reputation of businesses that rely on these types of applications. In this article, we present a self-healing framework for Web-based applications (SHõWA). SHõWA is composed of several modules, which monitor the application, analyze the data to detect and pinpoint anomalies, and execute recovery actions autonomously. The monitoring is done by a small aspect-oriented programming agent. This agent does not require changes to the application source code and includes adaptive and selective algorithms to regulate the level of monitoring. Anomalies are detected and pinpointed by means of statistical correlation. The data analysis detects changes in the server response time and determines whether those changes are correlated with the workload or are due to a performance anomaly. In the presence of performance anomalies, the data analysis pinpoints the anomaly, and SHõWA then executes a recovery procedure. We also present a study about the detection and localization of anomalies, the accuracy of the data analysis, and the performance impact induced by SHõWA. Two benchmarking applications, exercised through dynamic workloads, and different types of anomaly were considered in the study. The results reveal that (1) SHõWA is able to detect and pinpoint anomalies while the number of end users affected is still low; (2) SHõWA detected the anomalies without raising any false alarm; and (3) SHõWA does not induce a significant performance overhead (throughput was affected by less than 1%, and the response time delay was no more than 2 milliseconds).
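As a minimal, hedged sketch of the correlation-based distinction described above (not the SHõWA implementation), the code below checks whether a response-time series tracks the workload series; a low correlation is taken as a hint of a performance anomaly. The choice of correlation measure (Pearson) and the threshold are assumptions.

```java
// Illustrative only: decide whether a rise in response time follows the
// workload (expected behaviour) or not (possible performance anomaly).
class CorrelationAnomalyCheck {
    private static final double MIN_CORRELATION = 0.7; // assumed threshold

    /** Pearson correlation between two equally sized samples. */
    static double pearson(double[] x, double[] y) {
        int n = x.length;
        double sx = 0, sy = 0, sxx = 0, syy = 0, sxy = 0;
        for (int i = 0; i < n; i++) {
            sx += x[i]; sy += y[i];
            sxx += x[i] * x[i]; syy += y[i] * y[i];
            sxy += x[i] * y[i];
        }
        double cov = sxy - sx * sy / n;
        double vx = sxx - sx * sx / n;
        double vy = syy - sy * sy / n;
        return cov / Math.sqrt(vx * vy);
    }

    /** True when the response-time series does not track the workload series. */
    static boolean isAnomaly(double[] workload, double[] responseTime) {
        return pearson(workload, responseTime) < MIN_CORRELATION;
    }
}
```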
Abstract:
The persistence concern implemented as an aspect has been studied since the appearance of the aspect-oriented paradigm. Persistence is frequently given as an example of a concern that can be aspectized, but to date no real-world solution has applied that paradigm. Such a solution should be able to enhance programmer productivity and make the application less prone to errors. To test the viability of the concept, in a previous study we developed a prototype that implements orthogonal persistence as an aspect. This first version of the prototype was already fully functional with all Java types, including arrays. In this work we present the results of our new research to overcome some limitations we identified in the prototype regarding data type abstraction and transparency. One of our goals was to avoid the standard Java idiom for genericity, based on casts, type tests and subtyping. Moreover, we also found the need to introduce some dynamic data type capabilities. We consider reflection to be the solution to those issues. To achieve this, we have extended our prototype with a new static weaver that preprocesses the application source code in order to change the normal behavior of the Java compiler through newly generated reflective code.
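The snippet below is a small, self-contained illustration of why reflection helps here: it walks an arbitrary object's fields generically, with no casts, type tests, or subtyping requirements. It assumes nothing about the prototype's weaver or its generated code.

```java
import java.lang.reflect.Field;
import java.util.LinkedHashMap;
import java.util.Map;

// Minimal sketch: capture an object's state through reflection, so the same
// persistence code works for any class without per-type casts or tests.
final class ReflectiveSnapshot {
    /** Copies every declared field of an object into a name -> value map. */
    static Map<String, Object> capture(Object target) throws IllegalAccessException {
        Map<String, Object> state = new LinkedHashMap<>();
        for (Field field : target.getClass().getDeclaredFields()) {
            field.setAccessible(true);            // reach private fields too
            state.put(field.getName(), field.get(target));
        }
        return state;
    }
}
```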
Abstract:
Applications are subject to a continuous evolution process with a profound impact on their underlying data model, hence requiring frequent updates to the applications' class structure and to the database structure as well. This twofold problem, schema evolution and instance adaptation, usually known as database evolution, is addressed in this thesis. Additionally, we address concurrency and error recovery problems with a novel meta-model and its aspect-oriented implementation. Modern object-oriented databases provide features that help programmers deal with object persistence, as well as with related problems such as database evolution, concurrency and error handling. In most systems there are transparent mechanisms to address these problems; nonetheless, the database evolution problem still requires some human intervention, which consumes much of the programmers' and database administrators' work effort. Earlier research has demonstrated that aspect-oriented programming (AOP) techniques enable the development of flexible and pluggable systems. In these earlier works, the schema evolution and instance adaptation problems were addressed as database management concerns. However, none of this research focused on orthogonal persistent systems. We argue that AOP techniques are well suited to address these problems in orthogonal persistent systems. Regarding concurrency and error recovery, earlier research showed that only syntactic obliviousness between the base program and aspects is possible. Our meta-model and framework follow an aspect-oriented approach focused on the object-oriented orthogonal persistent context. The proposed meta-model is characterized by its simplicity, in order to achieve efficient and transparent database evolution mechanisms. Our meta-model supports multiple versions of a class structure by applying a class versioning strategy, thus enabling bidirectional application compatibility among versions of each class structure. That is to say, the database structure can be updated while earlier applications continue to work, as well as later applications that only know the updated class structure. The specific characteristics of orthogonal persistent systems, as well as a metadata enrichment strategy within the application's source code, complete the inception of the meta-model and have motivated our research work. To test the feasibility of the approach, a prototype was developed. Our prototype is a framework that mediates the interaction between applications and the database, providing them with orthogonal persistence mechanisms. These mechanisms are introduced into applications as an aspect, in the aspect-oriented sense. Objects do not need to extend any superclass, implement any interface, or carry a particular annotation. Parametric type classes are also correctly handled by our framework. However, classes that belong to the programming environment must not be handled as versionable, due to restrictions imposed by the Java Virtual Machine. Regarding concurrency support, the framework provides applications with a multithreaded environment that supports database transactions and error recovery. The framework keeps applications oblivious to the database evolution problem, as well as to persistence. Programmers can update the applications' class structure because the framework will produce a new version of it at the database metadata layer.
Using our XML-based pointcut/advice constructs, the framework's instance adaptation mechanism is extended, hence keeping the framework oblivious to this problem as well. The development gains provided by the prototype were benchmarked. In our case study, the results confirm that the mechanisms' transparency has positive repercussions on programmer productivity, simplifying the entire evolution process at the application and database levels. The meta-model itself was also benchmarked in terms of complexity and agility. Compared with other meta-models, it requires fewer meta-object modifications in each schema evolution step. Other types of tests were carried out in order to validate the robustness of the prototype and meta-model. To perform these tests, we used a small-size OO7 database, due to its data model complexity. Since the developed prototype offers some features that were not observed in other known systems, performance benchmarks were not possible. However, the developed benchmark is now available for future performance comparisons with equivalent systems. In order to test our approach in a real-world scenario, we developed a proof-of-concept application. This application was developed without any persistence mechanisms; using our framework and minor changes to the application's source code, we added these mechanisms. Furthermore, we tested the application in a schema evolution scenario. This real-world experience with our framework showed that applications remain oblivious to persistence and database evolution. In this case study, our framework proved to be a useful tool for programmers and database administrators. Performance issues and the single Java Virtual Machine concurrency model are the major limitations found in the framework.
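Purely as a sketch of the class-versioning idea (invented names, not the thesis framework's API), the code below keeps every version of a class structure in a metadata record and adapts stored instances to the field set of a requested version.

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Sketch only: metadata for one version of a class structure.
final class ClassVersion {
    final int number;
    final Map<String, String> fields;   // field name -> type name

    ClassVersion(int number, Map<String, String> fields) {
        this.number = number;
        this.fields = Map.copyOf(fields);
    }
}

// Sketch only: a class known to the metadata layer under several versions,
// so that old and new applications can keep working side by side.
final class VersionedClass {
    private final String className;
    private final List<ClassVersion> versions = new ArrayList<>();

    VersionedClass(String className) {
        this.className = className;
    }

    String name() {
        return className;
    }

    /** Registers a new schema version; earlier versions remain available. */
    ClassVersion evolve(Map<String, String> fields) {
        ClassVersion v = new ClassVersion(versions.size() + 1, fields);
        versions.add(v);
        return v;
    }

    /** Instance adaptation: map stored values onto the requested version's fields. */
    Map<String, Object> adapt(Map<String, Object> stored, int toVersion) {
        Map<String, Object> adapted = new LinkedHashMap<>();
        for (String field : versions.get(toVersion - 1).fields.keySet()) {
            adapted.put(field, stored.getOrDefault(field, null)); // new fields default to null
        }
        return adapted;
    }
}
```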
Abstract:
Absolute positioning – the real-time satellite-based positioning technique that relies solely on global navigation satellite systems – lacks accuracy for several real-time application domains. To provide increased positioning quality, ground- or satellite-based augmentation systems can be devised, depending on the extent of the area to cover. The underlying technique – multiple-reference-station differential positioning – can, in the case of ground systems, be further enhanced through the implementation of the virtual reference station concept. Our approach is a ground-based system made of a small network of three stations, in which the virtual reference station concept was implemented. The stations provide code pseudorange corrections, which are combined using a measurement-domain approach with weights inversely proportional to the distance from source station to rover. All data links are established through the Internet.
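A short sketch of the inverse-distance weighting named above: each station's code pseudorange correction is weighted by 1/d_i, where d_i is the distance from that station to the rover. The class and method names are illustrative, not from the project.

```java
// Illustrative combiner: inverse-distance weighting of per-station corrections.
final class PseudorangeCorrectionCombiner {
    /**
     * @param corrections per-station code pseudorange corrections (metres)
     * @param distances   corresponding station-to-rover distances (metres)
     * @return the distance-weighted correction for the rover position
     */
    static double combine(double[] corrections, double[] distances) {
        double weighted = 0.0, weightSum = 0.0;
        for (int i = 0; i < corrections.length; i++) {
            double w = 1.0 / distances[i];   // closer stations weigh more
            weighted += w * corrections[i];
            weightSum += w;
        }
        return weighted / weightSum;
    }
}
```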
Abstract:
Supervisor: Mestre Alberto Couto
Abstract:
Deoxyribonucleic acid, or DNA, is the most fundamental aspect of life, but present-day scientific knowledge has merely scratched the surface of the problem posed by its decoding. While experimental methods provide insightful clues, the adoption of analysis tools supported by the formalism of mathematics will lead to a systematic and solid build-up of knowledge. This paper studies human DNA from the perspective of system dynamics. By associating entropy and the Fourier transform, several global properties of the code are revealed. The fractional-order characteristics emerge as a natural consequence of the information content. These properties constitute a small piece of scientific knowledge that will support further efforts towards the final aim of establishing a comprehensive theory of the phenomena involved in life.
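As a loosely related illustration (not the paper's method), the sketch below computes the Shannon entropy of base frequencies over a window of a DNA string – the kind of per-window statistic that can subsequently be examined in the frequency domain via the Fourier transform.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative only: Shannon entropy (bits) of the symbol distribution in a
// window of a DNA sequence such as "ACGTACGGT...".
final class DnaWindowEntropy {
    static double entropy(String sequence, int start, int len) {
        Map<Character, Integer> counts = new HashMap<>();
        for (int i = start; i < start + len; i++) {
            counts.merge(sequence.charAt(i), 1, Integer::sum);
        }
        double h = 0.0;
        for (int c : counts.values()) {
            double p = (double) c / len;
            h -= p * (Math.log(p) / Math.log(2)); // log base 2 -> bits
        }
        return h;
    }
}
```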
Abstract:
The development of Dual Source Computed Tomography (Definition, Siemens Medical Solutions, Erlangen, Germany) allowed advances in temporal resolution through the addition of a second X-ray source and a second array of detectors to the 64-slice multidetector CT. The ability to run examinations in dual-energy mode allows greater differentiation of tissues, revealing differences between close attenuation coefficients. In terms of renal applications, the distinction between kidney stones and masses becomes one of the main advantages of the use of dual-energy technology. This article aims to demonstrate the operating principles of this equipment, as well as its main renal applications.
Abstract:
Espresso spent coffee grounds were chemically characterized to assess their potential as a source of bioactive compounds, by comparison with the grounds from the soluble coffee industry. Sampling included a total of 50 samples from 14 trademarks, collected in several coffee shops and prepared with distinct coffee machines. A high compositional variability was verified, particularly with regard to water-soluble components such as caffeine, total chlorogenic acids (CGA), and minerals, supported by strong positive correlations with the total soluble solids retained. This is a direct consequence of the reduced extraction efficiency during espresso coffee preparation, which leaves a significant pool of bioactive compounds retained in the extracted grounds. Besides the lipid (12.5%) and nitrogen (2.3%) contents, similar to those of industrial coffee residues, the CGA content (478.9 mg/100 g), owing to its antioxidant capacity, and the caffeine content (452.6 mg/100 g), owing to its extensive use in the food and pharmaceutical industries, justify the selective collection of this residue for subsequent use.
Abstract:
The ICF is a universal tool developed by the WHO that allows the classification of functioning and disability through a global view of what conditions an individual's performance in carrying out activities and participating in occupations. The ideology of the ICF and its components are interrelated with the essence of occupational therapy (OT), in line with the profession's models. The UCCI (Portuguese integrated continued care units) are a current reality in Portugal, and the occupational therapist is one of the mandatory professionals in the multidisciplinary teams of these units. Given the international relevance of the ICF, its connection with OT, and the need to make the ICF operational in daily clinical practice, since it is a complex and extensive tool, the objective of this study is to contribute to the construction of an ICF code set for occupational therapists working in UCCI, specifically in UC, UMDR and ULDM units. To carry out this investigation, the Delphi technique was used, involving two rounds. The first round counted on the participation of 37 occupational therapists experienced in the area, as they work in UCCI, and the second round counted on the participation of 20 of them. In the final Delphi round, a total of 96 categories reached consensus, this list constituting a proposed code set for UCCI. Regarding the unit typologies, 69 categories reached consensus for UC, 91 for UMDR and 41 for ULDM. It was concluded that the creation of code sets may be an asset in the context of the multidisciplinary teams of UCCI, as a way of making the ICF operational in daily clinical practice.