40 results for Distributed environments
Abstract:
Dissertation submitted for the degree of Doctor in Electrical and Computer Engineering
Abstract:
This research studies the personal learning environments that can develop in e-learning. Given the current state of technological and social development, these environments, with their singular characteristics, have come to be designated in the scientific literature by the English expression Personal Learning Environments, from which the acronyms PLE and PLEs derive. The study aims to understand the role of PLEs in the learning of students in the taught component of the Mestrado em Gestão de Sistemas de e-Learning of the Faculdade de Ciências Sociais e Humanas, Universidade Nova de Lisboa, over the two-year cycles from 2010-2011 to 2012-2013. Throughout their learning, these students used various tools and/or services associated with ICT and Web 2.0, which allowed them to create a learning ecosystem of their own. The research methodology was mainly qualitative. Data were collected through a questionnaire survey, subjected to descriptive statistical treatment, and the results of some of the variables were subsequently triangulated. From the results obtained, it is possible to conclude that the Master's students created their own PLEs and that these facilitated their learning; that using them gave the students advantages; and that the PLEs were fundamental for developing collaborative activities, creating an ecosystem of their own, a network for exchanging knowledge.
Abstract:
Nowadays, several sensors and mechanisms are available to estimate a mobile robot's trajectory and location with respect to its surroundings. Absolute positioning mechanisms are usually the most accurate, but they are also the most expensive and require equipment pre-installed in the environment. Therefore, a system capable of measuring its own motion and location within the environment (relative positioning) has been a research goal since the beginning of autonomous vehicles. With the increase in computational performance, computer vision has become faster, making it possible to incorporate it in a mobile robot. In feature-based visual odometry approaches, accurate motion estimation requires the absence of outliers among the feature associations. Outlier rejection is a delicate process, as there is always a trade-off between the speed and the reliability of the system. This dissertation proposes an indoor 2D positioning system using Visual Odometry. The mobile robot has a camera pointed at the ceiling for image analysis. As requirements, the ceiling and the floor (where the robot moves) must be planar. In the literature, RANSAC is a widely used method for outlier rejection; however, it might be slow in critical circumstances. Therefore, a new algorithm is proposed that accelerates RANSAC while maintaining its reliability. The algorithm, called FMBF, consists of comparing image texture patterns between pictures and preserving the most similar ones. There are several types of comparison, with different computational costs and reliabilities. FMBF manages those comparisons in order to optimize the trade-off between speed and reliability.
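The classic RANSAC loop that FMBF accelerates can be sketched for the simplest model relevant here, a 2D translation between matched features. The code below is an illustrative reimplementation of plain RANSAC, not of FMBF; the function name, threshold, and iteration count are assumptions.

```python
import random, math

def ransac_translation(matches, threshold=2.0, iters=100, seed=0):
    """Classic RANSAC for a 2D translation model between feature matches.

    matches: list of ((x1, y1), (x2, y2)) candidate correspondences,
    possibly containing outliers."""
    rng = random.Random(seed)
    best_model, best_inliers = None, []
    for _ in range(iters):
        # Minimal sample for a translation model: a single correspondence.
        (x1, y1), (x2, y2) = rng.choice(matches)
        dx, dy = x2 - x1, y2 - y1
        # Count the matches consistent with the hypothesised translation.
        inliers = [m for m in matches
                   if math.hypot(m[1][0] - m[0][0] - dx,
                                 m[1][1] - m[0][1] - dy) <= threshold]
        if len(inliers) > len(best_inliers):
            best_model, best_inliers = (dx, dy), inliers
    return best_model, best_inliers

# Synthetic example: true motion (5, -3) plus two gross outliers.
good = [((x, y), (x + 5, y - 3)) for x, y in [(0, 0), (10, 2), (4, 7), (8, 8)]]
bad = [((1, 1), (40, 40)), ((2, 2), (-30, 9))]
model, inliers = ransac_translation(good + bad)
```

The speed/reliability trade-off the abstract mentions is visible in `iters` and `threshold`: more iterations raise the chance of sampling an outlier-free hypothesis, at a proportional computational cost.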
Abstract:
Companies are increasingly dependent on distributed web-based software systems to support their businesses. This increases the need to maintain and extend software systems with up-to-date new features. Thus, the development process to introduce new features usually needs to be swift and agile, and the supporting software evolution process needs to be safe, fast, and efficient. However, this is usually a difficult and challenging task for a developer, due to the lack of support offered by programming environments, frameworks, and database management systems. Changes needed at the level of the code, the database model, and the actual data contained in the database must be planned and developed together and executed in a synchronized way. Even under a careful development discipline, the impact of changing an application's data model is hard to predict. The lifetime of an application comprises changes and updates designed and tested using data that is usually far from the real production data. Coding DDL and DML SQL scripts to update the database schema and data is therefore the usual (and hard) approach taken by developers. Such a manual approach is error-prone and disconnected from the real data in production, because developers may not know the exact impact of their changes. This work aims to improve the maintenance process in the context of the Agile Platform by OutSystems. Our goal is to design and implement new data-model evolution features that ensure safe support for change and a sound migration process. Our solution includes impact analysis mechanisms targeting the data model and the data itself. This provides developers with a safe, simple, and guided evolution process.
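The kind of synchronized schema-and-data change described above can be illustrated with a minimal sketch using SQLite through Python's standard library; the table and columns are hypothetical, and this is not the OutSystems mechanism itself, only the DDL+DML-in-one-transaction pattern it aims to replace with guided tooling.

```python
import sqlite3

# Hypothetical migration: split a "name" column into "first_name"/"last_name"
# while converting the existing rows, all inside a single transaction so the
# schema change (DDL) and the data change (DML) succeed or fail together.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO users (name) VALUES (?)",
                 [("Ada Lovelace",), ("Alan Turing",)])
conn.commit()

with conn:  # implicit BEGIN ... COMMIT, ROLLBACK on error
    conn.execute("ALTER TABLE users ADD COLUMN first_name TEXT")
    conn.execute("ALTER TABLE users ADD COLUMN last_name TEXT")
    conn.execute("""UPDATE users SET
                      first_name = substr(name, 1, instr(name, ' ') - 1),
                      last_name  = substr(name, instr(name, ' ') + 1)""")

rows = conn.execute(
    "SELECT first_name, last_name FROM users ORDER BY id").fetchall()
```

Note that the `UPDATE` silently mishandles any row whose `name` lacks a space, which is exactly the class of data-dependent surprise that impact analysis against production data is meant to catch before deployment.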
Abstract:
Most of today's systems, especially those related to the Web or to multi-agent systems, are not standalone or independent, but are part of a greater ecosystem in which they need to interact with other entities, react to complex changes in the environment, and act both over their own knowledge base and on the external environment itself. Moreover, these systems are clearly not static, but are constantly evolving due to the execution of self-updates or external actions. Whenever actions and updates are possible, the need emerges to ensure properties regarding the outcome of performing such actions. Originally proposed in the context of databases, transactions solve this problem by guaranteeing atomicity, consistency, isolation, and durability for a special set of actions. However, current transaction solutions fail to guarantee such properties in dynamic environments, since they cannot combine transaction execution with reactive features, or with the execution of actions over domains that the system does not completely control (thus making rolling back a non-viable proposition). In this thesis, we investigate which transaction properties can be ensured over these dynamic environments, and how. To achieve this goal, we provide logic-based solutions, based on Transaction Logic, to precisely model and execute transactions in such environments, where knowledge bases can be defined by arbitrary logic theories.
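The underlying difficulty can be sketched concretely: classic rollback presumes every action has an inverse, so over external domains a transaction manager can at best run explicit compensating actions, and some effects are simply irreversible. The API below is hypothetical and unrelated to the thesis's Transaction Logic formalisation; it only illustrates the compensation pattern.

```python
def run_transaction(actions):
    """actions: list of (do, undo) pairs, where undo is a compensating
    action, or None when the external effect is genuinely irreversible.
    Returns True on commit, False after compensating a failure."""
    done = []
    try:
        for do, undo in actions:
            do()
            done.append(undo)
    except Exception:
        # Compensate the completed steps in reverse order; irreversible
        # steps (undo is None) simply cannot be taken back.
        for undo in reversed(done):
            if undo is not None:
                undo()
        return False
    return True

def failing_external_action():
    raise RuntimeError("external action failed")

log = []
ok = run_transaction([
    (lambda: log.append("debit"), lambda: log.append("undo-debit")),
    (failing_external_action, None),
])
```

After the run, `ok` is False and the log shows the debit followed by its compensation; only atomicity-by-compensation is achieved, not true isolation or durability, which is the gap the thesis addresses logically.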
Abstract:
The “CMS Safety Closing Sensors System” (SCSS, or CSS for brevity) is a remote monitoring system designed to control safety clearance and tight mechanical movements of parts of the CMS detector, especially during CMS assembly phases. We present the different systems that make up SCSS: its sensor technologies, the readout system, and the data acquisition and control software. We also report on calibration and installation details, which determine the resolution and limits of the system. We present as well our experience from the operation of the system and the analysis of the data collected since 2008. Special emphasis is given to studying positioning reproducibility during detector assembly and to understanding how the magnetic fields influence the detector structure.
Abstract:
As the complexity of markets and the dynamics of systems evolve, the need for interoperable systems capable of strengthening enterprise communication effectiveness increases. This is particularly significant when it comes to collaborative enterprise networks, like manufacturing supply chains, where several companies work, communicate, and depend on each other in order to achieve a specific goal. Once interoperability is achieved, that is, once all network parties are able to communicate with and understand each other, organisations are able to exchange information in a stable environment that follows agreed laws. However, as markets adapt to new requirements and demands, an evolutionary behaviour is triggered, giving space to interoperability problems, thus disrupting the sustainability of interoperability and raising the need to develop monitoring activities capable of detecting and preventing unexpected behaviour. This work seeks to contribute to the development of monitoring techniques for interoperable SOA-based enterprise networks. It focuses on the automatic detection of harmonisation-breaking events during real-time communications, and strives to develop and propose a methodological approach to handle these disruptions with minimal or no human intervention, hence providing existing service-based networks with the ability to detect and promptly react to interoperability issues.
Abstract:
This study assesses the quality of cybersecurity as a service provided by the IT department in a corporate network, and analyses the impact of that service quality on the user, seen as a consumer of the service, as well as on the organization. In order to evaluate the quality of this service, the multi-item SERVQUAL instrument was used to measure consumer perceptions of service quality. To provide insights about the impact of cybersecurity service quality, the DeLone and McLean information systems success model was used. To test this approach, data was collected from over one hundred users from different industries, and partial least squares (PLS) was used to estimate the research model. This study found that SERVQUAL is adequate for assessing cybersecurity service quality, and also that cybersecurity service quality positively influences cybersecurity use and the individual impact of cybersecurity.
Abstract:
Existing wireless networks are characterized by a fixed spectrum assignment policy. However, the scarcity of available spectrum and its inefficient usage demand a new communication paradigm that exploits the existing spectrum opportunistically. Future Cognitive Radio (CR) devices should be able to sense unoccupied spectrum and will allow the deployment of real opportunistic networks. Still, traditional Physical (PHY) and Medium Access Control (MAC) protocols are not suitable for this new type of network, because they are optimized to operate over fixed assigned frequency bands. Therefore, novel PHY-MAC cross-layer protocols should be developed to cope with the specific features of opportunistic networks. This thesis is mainly focused on the design and evaluation of MAC protocols for Decentralized Cognitive Radio Networks (DCRNs). It starts with a characterization of the spectrum sensing framework based on the Energy-Based Sensing (EBS) technique, considering multiple scenarios. Then, guided by the sensing results obtained with this technique, we present two novel decentralized CR MAC schemes: the first designed to operate in single-channel scenarios, and the second to be used in multichannel scenarios. Analytical models for the network goodput, packet service time, and individual transmission probability are derived and used to compute the performance of both protocols. Simulation results assess the accuracy of the analytical models as well as the benefits of the proposed CR MAC schemes.
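Energy-Based Sensing can be illustrated with a minimal sketch: average the squared samples of the sensed band and declare the channel occupied when that energy exceeds a decision threshold. All signal parameters and the threshold below are illustrative assumptions, not values from the thesis.

```python
import random, math

def energy_detect(samples, threshold):
    """Energy-Based Sensing (EBS): declare the channel occupied when the
    average energy of the sensed samples exceeds a decision threshold."""
    energy = sum(s * s for s in samples) / len(samples)
    return energy > threshold

rng = random.Random(1)
# Idle channel: unit-variance Gaussian noise only (average energy near 1.0).
noise_only = [rng.gauss(0.0, 1.0) for _ in range(4000)]
# Busy channel: the same noise plus a sinusoid of amplitude 2
# (adds an average power of about 2.0).
with_signal = [n + 2.0 * math.sin(0.3 * i) for i, n in enumerate(noise_only)]

# Threshold placed between the noise power (~1.0) and signal+noise (~3.0).
busy_idle = (energy_detect(noise_only, 1.7), energy_detect(with_signal, 1.7))
```

The threshold choice embodies the detector's false-alarm/missed-detection trade-off: a lower threshold catches weaker primary users but flags more noise fluctuations as occupancy, which is the quantity the sensing characterization in the thesis studies across scenarios.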
Abstract:
Taking into account that the sun's radiation is estimated to be enough to cover 10,000 times the world's total energy needs (BRAKMANN & ARINGHOFF, 2003), it is difficult to understand how solar photovoltaic (PV) systems are still such a small part of the energy source matrix across the globe. Though there is an ongoing debate as to whether energy consumption leads to economic growth or the other way around, the two variables appear correlated, and it is clear that ensuring the availability of energy to match a country's growth targets is one of the prime concerns for any government. The topic of centralized vs. distributed electricity generation is also approached, especially regarding the latter's fit to developing countries' needs, namely the lack of investment capabilities and infrastructure, scattered populations, and other factors. Finally, Brazil's case is reviewed, showing that the current cost of electricity from the grid versus the cost from PV solutions still places an investment of this nature at 9 to 16 years to reach breakeven (out of a 25-year panel lifespan), which is too high compared to the 4 years required by most Brazilians. Still, recently passed legislation opened the door, even if unknowingly, to the development of co-owned solar farms, which could reduce implementation costs by as much as 20% and hence reduce the time to breakeven by 3 years.
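The breakeven figures above can be sanity-checked with a simple undiscounted payback model; the monetary values below are illustrative placeholders, chosen only to reproduce the 16-year upper bound of the quoted range.

```python
# Simple payback model (no discounting, constant tariffs):
# breakeven = upfront system cost / annual savings on the grid bill.
def breakeven_years(upfront_cost, annual_savings):
    return upfront_cost / annual_savings

base = breakeven_years(16000.0, 1000.0)           # illustrative: 16 years
shared = breakeven_years(16000.0 * 0.80, 1000.0)  # co-owned farm, ~20% cheaper
saved = base - shared                             # years gained by sharing
```

With these placeholder numbers a 20% cost reduction shortens payback from 16 to 12.8 years, i.e. about 3 years, matching the abstract's estimate at the long end of the 9-to-16-year range; at the 9-year end the same 20% cut would save proportionally less (about 1.8 years).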