67 results for Google Store


Relevance:

10.00%

Publisher:

Abstract:

The development and implementation of measures that promote the reduction of the impacts of forest fires on soils is imperative and should be part of any strategy for forest and soil preservation and recovery, especially considering the current scenario of continuous growth in the number of fires and in the burnt area. In response to the dendrocaustologic reality that has characterized mainland Portugal in recent decades, a research project promoted by the Center for the Study of Geography and Spatial Planning (CEGOT) was implemented with the objective of applying several erosion mitigation measures in a burned area of the Peneda-Gerês National Park in NW Portugal. This paper therefore presents the measures applied in the study area within the Soil Protec project, relates them to the channel processes triggered in the area, and reports preliminary observations on the effectiveness of the erosion mitigation measures implemented, as well as on their cost/benefit ratio.

Relevance:

10.00%

Publisher:

Abstract:

Consolidation consists of scheduling multiple virtual machines onto fewer servers in order to improve resource utilization and to reduce the operational costs due to power consumption. However, virtualization technologies do not offer performance isolation, which causes application slowdown. In this work, we propose a performance-enforcing mechanism composed of a slowdown estimator and an interference- and power-aware scheduling algorithm. The slowdown estimator determines, based on noisy slowdown samples obtained from state-of-the-art slowdown meters, whether tasks will complete within their deadlines, invoking the scheduling algorithm if needed. When invoked, the scheduling algorithm builds performance- and power-aware virtual clusters to successfully execute the tasks. We conduct simulations by injecting synthetic jobs whose characteristics follow the latest version of the Google Cloud tracelogs. The results indicate that our strategy can be efficiently integrated with state-of-the-art slowdown meters to fulfil contracted SLAs in real-world environments, while reducing operational costs by about 12%.
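As a rough illustration of the deadline check described above, the Python sketch below estimates a task's projected completion time from noisy slowdown samples and decides whether the scheduler should be invoked. The sample values, the plain averaging of the samples and the function name are assumptions made for illustration, not the estimator used in the paper.

```python
from statistics import mean

def needs_rescheduling(remaining_work, elapsed, deadline, slowdown_samples):
    """Decide whether a task is at risk of missing its deadline.

    remaining_work: estimated remaining execution time in isolation (seconds)
    elapsed: time already spent (seconds)
    deadline: total time budget for the task (seconds)
    slowdown_samples: noisy slowdown measurements (>= 1.0 means slower than an isolated run)
    """
    # Smooth the noisy samples with a simple average (a placeholder for a
    # more sophisticated estimator).
    estimated_slowdown = mean(slowdown_samples)
    projected_finish = elapsed + remaining_work * estimated_slowdown
    return projected_finish > deadline

# Example: 100 s of work left, 50 s elapsed, 200 s deadline, ~1.8x observed slowdown
if needs_rescheduling(100, 50, 200, [1.7, 1.9, 1.8, 2.0]):
    print("invoke the interference- and power-aware scheduler")
```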

Relevance:

10.00%

Publisher:

Abstract:

Empowered by virtualisation technology, cloud infrastructures enable the construction of flexible and elastic computing environments, providing an opportunity for energy and resource cost optimisation while enhancing system availability and achieving high performance. A crucial requirement for effective consolidation is the ability to efficiently utilise system resources for high-availability computing and energy-efficiency optimisation, so as to reduce operational costs and the carbon footprint of the environment. Additionally, failures in highly networked computing systems can substantially degrade system performance, preventing the system from achieving its initial objectives. In this paper, we propose algorithms to dynamically construct and readjust virtual clusters to enable the execution of users' jobs. Allied with an energy-optimising mechanism to detect and mitigate energy inefficiencies, our decision-making algorithms leverage virtualisation tools to provide proactive fault-tolerance and energy-efficiency to virtual clusters. We conducted simulations by injecting random synthetic jobs and jobs using the latest version of the Google Cloud tracelogs. The results indicate that our strategy improves the work-per-Joule ratio by approximately 12.9% and the working efficiency by almost 15.9% compared with other state-of-the-art algorithms.
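The work-per-Joule ratio reported above can be illustrated with a minimal sketch; the job records, field names and figures below are arbitrary placeholders, intended only to show how such a metric might be computed from completed work and consumed energy, not to reproduce the paper's results.

```python
def work_per_joule(jobs):
    """Aggregate work-per-Joule ratio for a set of completed jobs.

    Each job is a dict with hypothetical fields:
      'work'   - useful work done (e.g. normalized task units)
      'energy' - energy consumed to run it (Joules)
    """
    total_work = sum(job["work"] for job in jobs)
    total_energy = sum(job["energy"] for job in jobs)
    return total_work / total_energy if total_energy > 0 else 0.0

# Example: comparing two schedules of the same (made-up) workload
baseline = [{"work": 10, "energy": 500}, {"work": 8, "energy": 420}]
readjusted = [{"work": 10, "energy": 450}, {"work": 8, "energy": 380}]
improvement = work_per_joule(readjusted) / work_per_joule(baseline) - 1
print(f"work per Joule improved by {improvement:.1%}")
```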

Relevance:

10.00%

Publisher:

Abstract:

Recent studies of mobile Web trends show the continued explosion of mobile-friendly content. However, the large number and heterogeneity of mobile devices pose several challenges for Web programmers, who want automatic, context-aware delivery and adaptation of content to mobile devices. Hence, the device detection phase assumes an important role in this process. In this chapter, the authors compare the most widely used approaches for mobile device detection. Based on this study, they present an architecture for detecting and delivering uniform m-Learning content to students in a higher education school. The authors focus mainly on the XML device capabilities repository and on the REST API Web Service for dealing with device data. In the former, the authors detail the respective capabilities schema and present a new caching approach. In the latter, they present an extension of the current API for handling device data. Finally, the authors validate their approach by presenting the overall data and statistics collected through the Google Analytics service, in order to better understand the adherence to the mobile Web interface, its evolution over time, and its main weaknesses.
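A hedged sketch of the caching idea for device capability lookups is shown below; the XML layout, the capability fields and the function name are invented for illustration and do not reproduce the chapter's actual schema or API.

```python
import xml.etree.ElementTree as ET
from functools import lru_cache

# Hypothetical, tiny stand-in for the XML device capabilities repository.
CAPABILITIES_XML = """
<devices>
  <device ua_pattern="Nexus 5">
    <capability name="screen_width" value="1080"/>
    <capability name="touch" value="true"/>
  </device>
</devices>
"""

@lru_cache(maxsize=1024)
def lookup_capabilities(user_agent: str) -> tuple:
    """Return the capabilities of the first device whose pattern matches the User-Agent.

    Results are memoized so repeated requests from the same device family do not
    re-parse the repository (a simple stand-in for the caching approach described).
    """
    root = ET.fromstring(CAPABILITIES_XML)
    for device in root.iter("device"):
        if device.get("ua_pattern", "") in user_agent:
            return tuple((cap.get("name"), cap.get("value"))
                         for cap in device.iter("capability"))
    return ()  # unknown device: fall back to a generic profile

# A Web front controller would call this once per incoming request.
caps = dict(lookup_capabilities("Mozilla/5.0 (Linux; Android 4.4; Nexus 5)"))
print(caps.get("screen_width", "unknown"))  # -> 1080
```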

Relevance:

10.00%

Publisher:

Abstract:

This paper reports on the design and development of an Android-based context-aware system to support Erasmus students during their mobility in Porto. It enables: (i) guest users to create, rate and store personal points of interest (POI) in a private, local on-board database; and (ii) authenticated users to upload and share POI, as well as to get and rate recommended POI from the shared central database. The system is a distributed client/server application. The server interacts with a central database that maintains the user profiles and the shared POI, organized by category and rating. The Android GUI application works both as a standalone application and as a client module. In standalone mode, guest users have access to generic information, a map-based interface and a local database to store and retrieve personal POI. Upon successful authentication, users can, additionally, share POI as well as get and rate recommendations sorted by category, rating and distance to the user.
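As a rough companion to the description above, the sketch below models a POI record and the sorting of recommendations by category, rating and distance to the user. The field names and the haversine-based distance are illustrative assumptions, not the application's actual data model.

```python
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

@dataclass
class POI:
    name: str
    category: str
    rating: float   # average user rating, e.g. 0-5
    lat: float
    lon: float

def distance_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two coordinates (haversine formula)."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def recommend(pois, user_lat, user_lon, category):
    """Filter shared POI by category and sort by rating, then by proximity."""
    candidates = [p for p in pois if p.category == category]
    return sorted(candidates,
                  key=lambda p: (-p.rating,
                                 distance_km(user_lat, user_lon, p.lat, p.lon)))

pois = [POI("Livraria Lello", "culture", 4.8, 41.1466, -8.6149),
        POI("Serralves", "culture", 4.6, 41.1596, -8.6596)]
for p in recommend(pois, 41.1495, -8.6108, "culture"):
    print(p.name, round(p.rating, 1))
```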

Relevance:

10.00%

Publisher:

Abstract:

This paper presents a system developed to promote the rational use of electric energy among consumers and, thus, increase energy efficiency. The goal is to provide energy consumers with an application that displays energy consumption/production profiles, sets consumption ceilings, defines automatic alerts and alarms, anonymously compares consumers with identical energy usage profiles by region and predicts, in the case of non-residential installations, the expected consumption/production values. The resulting distributed system is organized in two main blocks: front-end and back-end. The front-end includes user interface applications for Android mobile devices and Web browsers. The back-end provides data storage and processing functionalities and is deployed on a cloud computing platform, the Google App Engine, which provides a standard Web service interface. This option ensures interoperability, scalability and robustness to the system.
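The consumption-ceiling alert logic can be illustrated with a minimal sketch; the thresholds, readings and function name below are hypothetical and only mirror the kind of check the back-end might perform before notifying the mobile or Web front-end.

```python
def check_ceiling(readings_kwh, ceiling_kwh, warn_fraction=0.9):
    """Classify the accumulated consumption of a billing period.

    readings_kwh: list of consumption readings for the period (kWh)
    ceiling_kwh: consumption ceiling configured by the user (kWh)
    warn_fraction: fraction of the ceiling at which an alert is raised
    """
    total = sum(readings_kwh)
    if total >= ceiling_kwh:
        return "alarm", total         # ceiling exceeded
    if total >= warn_fraction * ceiling_kwh:
        return "alert", total         # approaching the ceiling
    return "ok", total

status, total = check_ceiling([3.2, 4.1, 3.8, 4.5], ceiling_kwh=16.0)
print(status, f"{total:.1f} kWh")     # -> alert 15.6 kWh
```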

Relevance:

10.00%

Publisher:

Abstract:

Master's dissertation presented to the Instituto de Contabilidade e Administração do Porto for the degree of Master in Digital Marketing, under the supervision of Professor Doutor José de Freitas Santos.

Relevance:

10.00%

Publisher:

Abstract:

Dynamic and distributed environments are hard to model since they suffer from unexpected changes, incomplete knowledge, and conflicting perspectives and, thus, call for appropriate knowledge representation and reasoning (KRR) systems. Such KRR systems must handle sets of dynamic beliefs, be sensitive to communicated and perceived changes in the environment and, consequently, may have to drop current beliefs in face of new findings or disregard any new data that conflicts with stronger convictions held by the system. Not only do they need to represent and reason with beliefs, but also they must perform belief revision to maintain the overall consistency of the knowledge base. One way of developing such systems is to use reason maintenance systems (RMS). In this paper we provide an overview of the most representative types of RMS, which are also known as truth maintenance systems (TMS), which are computational instances of the foundations-based theory of belief revision. An RMS module works together with a problem solver. The latter feeds the RMS with assumptions (core beliefs) and conclusions (derived beliefs), which are accompanied by their respective foundations. The role of the RMS module is to store the beliefs, associate with each belief (core or derived belief) the corresponding set of supporting foundations and maintain the consistency of the overall reasoning by keeping, for each represented belief, the current supporting justifications. Two major approaches are used to reason maintenance: single-and multiple-context reasoning systems. Although in the single-context systems, each belief is associated to the beliefs that directly generated it—the justification-based TMS (JTMS) or the logic-based TMS (LTMS), in the multiple context counterparts, each belief is associated with the minimal set of assumptions from which it can be inferred—the assumption-based TMS (ATMS) or the multiple belief reasoner (MBR).
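To make the justification-based bookkeeping more concrete, the sketch below implements a toy JTMS-style node structure in which a belief holds while at least one of its justifications has all antecedents held. The class and method names are assumptions made for illustration; this is not an implementation of any specific RMS surveyed here.

```python
class Node:
    """A belief tracked by a toy justification-based reason maintenance module."""
    def __init__(self, label, is_assumption=False):
        self.label = label
        self.is_assumption = is_assumption   # core belief fed in by the problem solver
        self.justifications = []             # each is a list of antecedent Nodes
        self.believed = is_assumption

    def justify(self, antecedents):
        """Record a justification (a derived belief supported by its foundations)."""
        self.justifications.append(list(antecedents))

    def update(self):
        """A belief holds if it is an assumption or some justification is fully supported."""
        self.believed = self.is_assumption or any(
            all(a.believed for a in j) for j in self.justifications)

# Example: retracting an assumption forces the derived belief to be dropped.
a = Node("battery_ok", is_assumption=True)
b = Node("engine_starts")
b.justify([a])
b.update()
print(b.believed)        # True
a.is_assumption = False  # the problem solver retracts the core belief
a.believed = False
b.update()
print(b.believed)        # False: no remaining support
```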

Relevance:

10.00%

Publisher:

Abstract:

This paper discusses the changes brought by the communication revolution to teaching and learning in the scope of LSP. Its aim is to provide insight into how teaching, which used to be two-dimensional, turned into a multidimensional system, gathering complementary resources that have transformed, in an incredibly short time, the ways we receive, share and store information, for instance as professionals, and keep in touch with our peers. The rise of electronic publications and the boom of social and professional networks, search engines, blogs, listservs, forums, e-mail blasts, Facebook pages, YouTube content, tweets and apps have changed the way information is conveyed. Classes have ceased to be predictable and have been empowered by digital platforms and by numerous data repositories (TILDE, IATE, LINGUEE, and many other terminological data banks) that have definitively transformed the academic world in general and tertiary education in particular. There is a bulk of information to be digested by students, who are no longer passive but instead active and responsible for their academic outcomes. Given that overflow, the question is whether they possess the tools to select only what is accurate and relevant for a given subject or assignment. With the reduction in the number of course years in most degrees after the implementation of the Bologna Process, and the shrinking of curricular contents, do students have the opportunity to develop critical thinking? Both teaching and learning rely on digital resources to improve the speed at which knowledge spreads, but have those changes been effective in actually promoting communication? Furthermore, with the growing number of apps that have been, and will continue to be, developed for learning foreign languages and for translation, among other purposes, will students still feel the need to learn these skills once they have those apps? These are some of the questions we would like to discuss in our paper.

Relevance:

10.00%

Publisher:

Abstract:

Auditory event-related potentials (AERPs) are widely used in diverse fields of today's neuroscience, concerning auditory processing, speech perception, language acquisition, neurodevelopment, attention and cognition in normal aging, gender differences, and developmental, neurologic and psychiatric disorders. However, their transposition to clinical practice has remained minimal, mainly due to the scarce literature on normative data across age, the wide spectrum of results, the variety of auditory stimuli used and the different neuropsychological meanings attributed to AERP components by different authors. One of the most prominent AERP components studied in recent decades is N1, which reflects auditory detection and discrimination. Subsequently, N2 indicates attention allocation and phonological analysis. The simultaneous analysis of N1 and N2 elicited by feasible novelty experimental paradigms, such as the auditory oddball, seems to be an objective method to assess central auditory processing. The aim of this systematic review was to bring forward normative values for auditory oddball N1 and N2 components across age. EBSCO, PubMed, Web of Knowledge and Google Scholar were systematically searched for studies that elicited N1 and/or N2 by an auditory oddball paradigm. A total of 2,764 papers published between 1988 and 2013 (the last 25 years) were initially identified in the databases, of which 19 resulted from hand searching and additional references. A final total of 68 studies met the eligibility criteria, with a total of 2,406 participants from control groups for N1 (age range 6.6–85 years; mean 34.42) and 1,507 for N2 (age range 9–85 years; mean 36.13). Polynomial regression analysis revealed that N1 latency decreases with aging at Fz and Cz; N1 amplitude at Cz decreases from childhood to adolescence and stabilizes after 30–40 years, while at Fz the decrement ends by 60 years and increases markedly after this age. Regarding N2, latency did not covary with age, but amplitude showed a significant decrement for both Cz and Fz. Results suggested reliable normative values for the Cz and Fz electrode locations; however, changes in brain development and component topography over age should be considered in clinical practice.
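As a purely illustrative sketch of the kind of polynomial regression used in the review, the code below fits a second-order polynomial of N1 latency on age with NumPy. The data points are invented placeholders, not values taken from the reviewed studies.

```python
import numpy as np

# Hypothetical (age in years, N1 latency in ms) pairs; NOT data from the review.
age = np.array([7, 12, 18, 25, 35, 45, 55, 65, 75, 85], dtype=float)
latency = np.array([130, 122, 112, 105, 102, 100, 99, 98, 97, 97], dtype=float)

# Fit a 2nd-order polynomial, a common choice when modelling AERP measures across age.
coeffs = np.polyfit(age, latency, deg=2)
model = np.poly1d(coeffs)

print("fitted coefficients:", np.round(coeffs, 4))
print("predicted N1 latency at 30 years: %.1f ms" % model(30))
```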

Relevance:

10.00%

Publisher:

Abstract:

The use of technology has grown over the last decades in the most diverse areas, whether in industry or in everyday life, and its benefits are increasingly evident. Sport is no different. New developments emerge every day aimed at improving the performance of physical activity practitioners, making it possible to reach results never thought of before. Moreover, the use of technology in sport allows biomechanical data to be obtained that can be used both in training and in improving athletes' quality of life, helping, for example, to prevent injuries. This project is therefore applied to sport, namely to surfing, where the lack of scientific work is still considerable, combining electronic technology with the sport in order to quantify previously unknown information. Three basic performance factors were identified: balance, foot positioning and surfboard movement. These factors led to the development of a system capable of measuring them dynamically through the measurement of plantar forces and of the surfboard's rotation. Besides measuring these factors, the system is able to store the acquired data locally on a memory card for later analysis, and also to send them over a wireless link, allowing real-time visualization of the plantar centre of pressure, the surfboard rotation angles and the sensor activation. The device consists of an embedded electronic system composed of an ATMEGA1280 microcontroller, an analogue signal acquisition and conditioning circuit, an inertial measurement unit, an RN131 wireless communication module and a set of Flexiforce force sensors. The embedded firmware was developed in the C language. Matlab was used for data reception and real-time visualization. The tests performed showed that the system meets the proposed requirements, providing information about balance, through the centre of pressure; foot positioning, through the distribution of plantar pressures; and board movement in the pitch and roll axes, through the inertial unit. The mean force measurement error observed was -0.0012 ± 0.0064 N, while the minimum distance achieved in wireless transmission was 100 m. The measured power consumption of the system was 330 mW.
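A minimal sketch of the centre-of-pressure computation mentioned above is given below; the sensor coordinates and readings are hypothetical, and the plain force-weighted average may differ from the calibration actually used in the project.

```python
def centre_of_pressure(sensor_positions, forces):
    """Force-weighted average of sensor positions on the board deck.

    sensor_positions: list of (x, y) coordinates in metres (board frame)
    forces: corresponding force readings in newtons (e.g. from Flexiforce sensors)
    """
    total = sum(forces)
    if total == 0:
        return None  # no load detected
    x = sum(f * px for f, (px, py) in zip(forces, sensor_positions)) / total
    y = sum(f * py for f, (px, py) in zip(forces, sensor_positions)) / total
    return x, y

# Example: four sensors under the back foot, heavier load on the toe side.
positions = [(0.00, 0.00), (0.10, 0.00), (0.00, 0.15), (0.10, 0.15)]
readings = [120.0, 180.0, 90.0, 150.0]
print(centre_of_pressure(positions, readings))
```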

Relevance:

10.00%

Publisher:

Abstract:

Nowadays, the amount of data created daily far exceeds the most optimistic expectations set in the previous decade. These data have very diverse origins and come in several forms. This new concept, which goes by the name of Big Data, is posing new and intricate challenges to data storage, processing and manipulation. Traditional storage systems are not the right solution to this problem. These challenges are among the most analysed and discussed computing topics of the moment. Several technologies have emerged with this new era, among which a new storage paradigm stands out: the NoSQL movement. This new storage philosophy aims to meet the storage and processing needs of these voluminous and heterogeneous data. Data warehouses are one of the most important components of Business Intelligence and are mostly used as a tool to support the decision-making processes carried out in the day-to-day operation of an organization. Their historical component implies that large volumes of data are stored, processed and analysed on top of their repositories. Some organizations are beginning to have problems managing and storing these large volumes of information, largely because of the storage structure on which they are based. Relational database management systems have, for some decades, been considered the primary method of storing information in a data warehouse. In fact, these systems are starting to prove unable to store and manage organizations' operational data, and their use in data warehouses is consequently less and less recommended. It is intrinsically interesting to think that relational databases are beginning to lose the battle against data volume at a time when a new storage paradigm is emerging precisely with the aim of taming the large volume inherent to Big Data. It is even more interesting to think that these new NoSQL systems may bring advantages to the data warehousing world. Thus, in this master's work, the feasibility and implications of adopting NoSQL databases in the context of data warehouses are studied, in comparison with the traditional approach implemented on relational systems. To accomplish this task, several studies were carried out based on the relational system SQL Server 2014 and on the NoSQL systems MongoDB and Cassandra. Several stages of the data warehouse design and implementation process were compared across the three systems, and three distinct data warehouses were created, one on each system. All the research carried out in this work culminates in a comparison of the performance of queries executed on the three systems.
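The kind of query comparison carried out in the dissertation can be illustrated with a small sketch: the same aggregation expressed as SQL (run here on an in-memory SQLite table as a stand-in for SQL Server 2014) and as a MongoDB-style aggregation pipeline. The table, collection and field names are invented for illustration.

```python
import sqlite3

# Relational side: total sales per region (SQLite stands in for SQL Server 2014).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("North", 120.0), ("North", 80.0), ("South", 200.0)])
rows = conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region").fetchall()
print(rows)  # [('North', 200.0), ('South', 200.0)]

# Document side: the equivalent MongoDB aggregation pipeline (it would be passed to
# collection.aggregate(...) via pymongo against a real MongoDB instance).
pipeline = [
    {"$group": {"_id": "$region", "total": {"$sum": "$amount"}}},
]
print(pipeline)
```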

Relevance:

10.00%

Publisher:

Abstract:

Master's dissertation presented to the Instituto de Contabilidade e Administração do Porto for the degree of Master in Digital Marketing, under the supervision of Mestre Inês Veiga Pereira. "This version contains the criticisms and suggestions of the jury members."

Relevance:

10.00%

Publisher:

Abstract:

Dissertation presented to the Instituto Politécnico do Porto for the degree of Master in Logistics.

Relevance:

10.00%

Publisher:

Abstract:

In this paper we address an order processing optimization problem known as the Minimization of Open Stacks Problem (MOSP). This problem consists in finding the best sequence for manufacturing the different products required by customers, in a setting where only one product can be made at a time. The objective is to minimize the maximum number of incomplete customer orders that are being processed simultaneously. We present an integer programming model, based on the existence of a perfect elimination order in interval graphs, which finds an optimal sequence for the customers' orders. Among other economic advantages, manufacturing the products in this optimal sequence reduces the amount of space needed to store incomplete orders.
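As a small illustration of the objective described above, the sketch below computes the maximum number of simultaneously open customer orders for a given manufacturing sequence and brute-forces the best sequence for a tiny, invented instance. This enumeration only illustrates the objective being minimized; it is not the integer programming model proposed in the paper.

```python
from itertools import permutations

def max_open_stacks(sequence, orders):
    """Maximum number of customer orders open at once for a product sequence.

    orders: mapping customer -> set of products that customer requires.
    An order opens when its first product is made and closes after its last one.
    """
    position = {product: i for i, product in enumerate(sequence)}
    first = {c: min(position[p] for p in prods) for c, prods in orders.items()}
    last = {c: max(position[p] for p in prods) for c, prods in orders.items()}
    return max(sum(1 for c in orders if first[c] <= t <= last[c])
               for t in range(len(sequence)))

# Tiny made-up instance: four customers, four products.
orders = {"c1": {"A", "B"}, "c2": {"B", "C"}, "c3": {"A", "D"}, "c4": {"C", "D"}}
products = sorted({p for prods in orders.values() for p in prods})
best = min(permutations(products), key=lambda seq: max_open_stacks(seq, orders))
print(best, max_open_stacks(best, orders))
```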