Abstract:
Microblogging in the workplace, as a functionality of Enterprise Social Networking (ESN) platforms, is a relatively new phenomenon whose use in knowledge work has received little research attention. In this cross-sectional study, I attempt to shed light on the role of microblogging in knowledge work: I identify the microblogging practices of knowledge workers on ESN platforms and the role these practices play in supporting knowledge work performance. A questionnaire was administered to a non-representative sample of knowledge workers. The results shed light on the purposes of the microblogging messages that knowledge workers write, and on whether microblogging supports them in performing their work. The survey is grounded in existing theory, which supplied possible microblog purposes as well as a characterization of the actions of knowledge workers. The results reveal that “knowledge & news sharing”, “crowdsourcing”, “socializing & networking” and “discussion & opinion” are frequent microblog purposes. The study furthermore shows that microblogging benefits knowledge workers’ work. Microblogging appears to be a worthy addition to the existing means of workplace communication, and is especially useful for letting knowledge, news and social contact reach a broader audience than they would without this social networking service.
Abstract:
The processes of mobilizing land for public- and private-domain infrastructures follow specific legal frameworks and are systematically confronted with the country's poor record of cadastral identification and regularization, which leads to major inefficiencies, sometimes with a very negative impact on overall effectiveness. This project report describes the Ferbritas Cadastre Information System (FBSIC) project and tools, which, in conjunction with other applications, manage the entire life-cycle of land acquisition and cadastre, including support for field activities with the integration of information collected in the field, the development of multi-criteria analysis information, the monitoring of all information in the exploration stage, and the automated generation of outputs. The benefits are evident at the level of operational efficiency: the tools enable process integration and standardization of procedures, facilitate analysis and quality control, and maximize performance in the acquisition, maintenance and management of cadastral and expropriation information (expropriation projects). The implemented system therefore achieves levels of robustness, comprehensiveness, openness, scalability and reliability suitable for a structural platform. The resulting solution, FBSIC, is a fit-for-purpose cadastre information system rooted in the field of railway infrastructures. FBSIC's integrating nature allows it to meet present needs and scale to future services; to collect, maintain, manage and share all information in one common platform and transform it into knowledge; to interoperate with other platforms; and to increase the accuracy and productivity of business processes related to land property management.
Abstract:
Over the last decade, human embryonic stem cells (hESCs) have garnered a lot of attention owing to their inherent self-renewal ability and pluripotency. These characteristics have opened opportunities for potential stem cell-based regenerative medicines, for development of drug discovery platforms and as unique in vitro models for the study of early human development.(...)
Abstract:
Stratigraphic Columns (SC) are the most useful and common way to represent field descriptions (e.g., grain size, thickness of rock packages, and fossil and lithological components) of rock sequences and well logs. In these representations the width of the SC varies with grain size (i.e., the wider the stratum, the coarser the rock (Miall 1990; Tucker 2011)), and the thickness of each layer is represented on the vertical axis of the diagram. Typically these representations are drawn 'manually' using vector graphic editors (e.g., Adobe Illustrator®, CorelDRAW®, Inkscape). Various software packages now plot SCs automatically, but no versatile open-source tools exist, and it is very difficult to both store and analyse stratigraphic information. This document presents Stratigraphic Data Analysis in R (SDAR), an analytical package designed both to plot and to facilitate the analysis of stratigraphic data in R (R Core Team 2014). SDAR uses simple stratigraphic data and takes advantage of the flexible plotting tools available in R to produce detailed SCs. The main benefits of SDAR are that it: (i) generates accurate and complete SC plots including multiple features (e.g., sedimentary structures, samples, fossil content, color, structural data, contacts between beds); (ii) is developed in a free software environment for statistical computing and graphics; (iii) runs on a wide variety of platforms (i.e., UNIX, Windows, and macOS); (iv) exposes both its plotting and analysis functions directly on R's command-line interface (CLI), which enables users to integrate SDAR's functions with the many other add-on packages available for R from the Comprehensive R Archive Network (CRAN).
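To make the plotting convention concrete, here is a minimal sketch of an SDAR-style column in Python/matplotlib rather than R (the bed data and grain-size ranking are invented for illustration): each bed is a bar whose height is its thickness and whose width encodes grain size, so coarser rocks plot wider, exactly as the SC convention prescribes.

```python
# Illustrative sketch of the stratigraphic-column convention described
# above, using matplotlib instead of SDAR's R plotting tools.
# Bed data (thickness in m, grain-size rank, lithology) are hypothetical.
import matplotlib.pyplot as plt

beds = [
    (2.0, 1, "mudstone"),          # rank 1 = finest
    (1.5, 4, "fine sandstone"),
    (0.5, 6, "conglomerate"),      # rank 6 = coarsest
    (3.0, 3, "siltstone"),
]

fig, ax = plt.subplots(figsize=(3, 6))
base = 0.0
for thickness, grain, lithology in beds:
    # Width encodes grain size: the wider the bar, the coarser the rock.
    ax.bar(x=0, height=thickness, width=grain, bottom=base,
           align="edge", edgecolor="black")
    ax.text(grain + 0.2, base + thickness / 2, lithology, va="center")
    base += thickness              # stack beds along the vertical axis

ax.set_xlabel("grain size (rank)")
ax.set_ylabel("stratigraphic thickness (m)")
ax.set_xlim(0, 8)
plt.tight_layout()
plt.show()
```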
Abstract:
With the recent advances in technology and the miniaturization of devices such as GPS or IMU units, Unmanned Aerial Vehicles (UAVs) have become a feasible platform for remote sensing applications. Compared to conventional aerial platforms, UAVs provide a set of advantages, such as the higher spatial resolution of the derived products. UAV-based imagery obtained with consumer-grade cameras introduces a set of problems that have to be solved, e.g., rotational or angular differences and unknown or insufficiently precise interior and exterior orientation (IO and EO) camera parameters. In this work, UAV-based imagery of RGB and CIR type was processed using two workflows based on the PhotoScan and VisualSfM software solutions, resulting in DSM and orthophoto products. The influence of feature detection and matching parameters on result quality and processing time was examined, and an optimal parameter setup is presented. The products of both workflows were compared in terms of quality, spatial accuracy, and processing time. Finally, the obtained products were used to demonstrate vegetation classification, and the contribution of IHS transformations to classification accuracy was examined.
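As a rough sketch of the colour-space step mentioned above (not the thesis' actual code: the HSV transform stands in for IHS, to which it is closely related, and the input tile is synthetic), the idea is to derive illumination-robust channels from RGB before classification.

```python
# Sketch of deriving intensity/hue/saturation-style channels from an RGB
# orthophoto tile before classification; HSV here approximates IHS.
import numpy as np
from matplotlib.colors import rgb_to_hsv

rgb = np.random.rand(256, 256, 3)          # placeholder for a real tile in [0, 1]
hsv = rgb_to_hsv(rgb)
hue, saturation, intensity = hsv[..., 0], hsv[..., 1], hsv[..., 2]

# Hue/saturation are less illumination-sensitive than raw RGB bands,
# which is why such transforms can help vegetation classification.
features = np.stack([hue, saturation], axis=-1).reshape(-1, 2)
print(features.shape)                      # (65536, 2) feature vectors
```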
Abstract:
Generating personalized movie recommendations most commonly relies on user-movie ratings. These ratings are generally used either to understand user preferences or to recommend movies that users with similar rating patterns have rated highly. However, movie recommenders are often subject to the Cold-Start problem: new movies have not been rated by anyone, so they will not be recommended to anyone; likewise, the preferences of new users who have not rated any movie cannot be learned. In parallel, social media platforms such as Twitter collect great amounts of user feedback on movies, as these are very popular nowadays. This thesis proposes to explore feedback shared on Twitter to predict the popularity of new movies and shows how it can be used to tackle the Cold-Start problem. At a finer grain, it also proposes to explore the reputation of directors and actors on IMDb to the same end. To assess these aspects, a Reputation-enhanced Recommendation Algorithm is implemented and evaluated on a crawled IMDb dataset with previous user ratings of old movies, together with Twitter data crawled from January 2014 to March 2014, to recommend 60 movies affected by the Cold-Start problem. Twitter proved to be a strong reputation predictor, and the Reputation-enhanced Recommendation Algorithm improved over several baseline methods. The algorithm also proved useful when recommending movies in an extreme Cold-Start scenario, where both new movies and users are affected by the Cold-Start problem.
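A minimal sketch of what "reputation-enhanced" scoring can look like (the blending rule, weights and names below are illustrative assumptions, not the thesis' exact algorithm): when a movie has no ratings, the score falls back entirely on the external reputation signal, and as ratings accumulate the collaborative-filtering prediction takes over.

```python
# Hedged sketch of blending a collaborative-filtering (CF) prediction
# with a reputation signal (e.g., Twitter buzz or IMDb cast reputation)
# to cope with the Cold-Start problem. Weights are illustrative.
def recommend_score(cf_prediction, reputation, n_ratings, alpha=0.8):
    """cf_prediction: CF-predicted rating on a 0-5 scale, or None.
    reputation: external reputation signal normalised to [0, 1].
    n_ratings: number of ratings the movie already has."""
    if cf_prediction is None or n_ratings == 0:
        return 5.0 * reputation                  # pure cold start
    w = alpha * n_ratings / (n_ratings + 10)     # trust CF as evidence grows
    return w * cf_prediction + (1 - w) * 5.0 * reputation

print(recommend_score(None, 0.9, 0))     # brand-new movie -> 4.5
print(recommend_score(4.2, 0.6, 50))     # established movie -> ~3.8
```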
Abstract:
The OutSystems Platform is used to develop, deploy, and maintain enterprise web and mobile web applications. Applications are developed through a visual domain-specific language in an integrated development environment and compiled to a standard stack of web technologies. At the platform's core, a compiler and a deployment service transform the visual model into a running web application. As applications grow, compilation and deployment times increase as well, impacting developer productivity. In the previous model, the full application was the only compilation and deployment unit: when the developer published an application, even after changing only a very small aspect of it, the application would be fully compiled and deployed. Our goal is to reduce compilation and deployment times for the most common use case, in which the developer performs small changes to an application before compiling and deploying it. We modified the OutSystems Platform to support a new incremental compilation and deployment model that reuses previous computations as much as possible in order to improve performance. In our approach, the full application is broken down into smaller compilation and deployment units, increasing what can be cached and reused. We also observed that this finer-grained model would benefit from parallel execution, so we created a task-driven scheduler that executes compilation and deployment tasks in parallel. Our benchmarks show a substantial improvement in compilation and deployment times for the aforementioned development scenario.
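The two ideas combine naturally, as this minimal sketch suggests (it is not OutSystems' implementation; unit granularity, content hashing and the thread pool are simplifying assumptions): unchanged units are served from a cache keyed on their content hash, and the remaining units are compiled in parallel.

```python
# Sketch of incremental compilation with a content-addressed cache plus
# a parallel, task-driven scheduler. Real dependency tracking is omitted.
import hashlib
from concurrent.futures import ThreadPoolExecutor

cache = {}  # content hash -> compiled artifact

def compile_unit(source: str) -> str:
    key = hashlib.sha256(source.encode()).hexdigest()
    if key in cache:
        return cache[key]                 # unchanged unit: reuse
    artifact = f"compiled({source})"      # stand-in for real compilation
    cache[key] = artifact
    return artifact

def publish(units):
    with ThreadPoolExecutor() as pool:    # compile units in parallel
        return list(pool.map(compile_unit, units))

publish(["screen A", "screen B", "logic C"])     # cold publish: all compiled
publish(["screen A", "screen B v2", "logic C"])  # warm: only B recompiled
```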
Abstract:
The development of human cell models that recapitulate hepatic functionality allows the study of metabolic pathways involved in toxicity and disease. The increased biological relevance, cost-effectiveness and high throughput of cell models can contribute to increasing the efficiency of drug development in the pharmaceutical industry. Recapitulation of liver functionality in vitro requires advanced culture strategies to mimic in vivo complexity, such as 3D culture, co-cultures or biomaterials. However, complex 3D models are typically associated with poor robustness, limited scalability and limited compatibility with screening methods. In this work, several strategies were used to develop highly functional and reproducible spheroid-based in vitro models of human hepatocytes and HepaRG cells using stirred culture systems. In chapter 2, the isolation of human hepatocytes from resected liver tissue was implemented and a liver tissue perfusion method was optimized to improve hepatocyte isolation and aggregation efficiency, resulting in an isolation protocol compatible with 3D culture. In chapter 3, human hepatocytes were co-cultivated with mesenchymal stem cells (MSC) and the phenotype of both cell types was characterized, showing that MSC acquire a supportive stromal function while hepatocytes retain differentiated hepatic functions, stability of drug metabolism enzymes and higher viability in co-cultures. In chapter 4, a 3D alginate microencapsulation strategy for the differentiation of HepaRG cells was evaluated and compared with the standard 2D DMSO-dependent differentiation, yielding higher differentiation efficiency, comparable levels of drug metabolism activity and significantly improved biosynthetic activity. The work developed in this thesis provides novel strategies for 3D culture of human hepatic cell models that are reproducible, scalable and compatible with screening platforms. The phenotypic and functional characterization of these in vitro systems contributes to the state of the art of human hepatic cell models and can be applied to improve the efficiency of pre-clinical drug development, to model disease and, ultimately, to develop cell-based therapeutic strategies for liver failure.
Abstract:
The need for more efficient illumination systems has led to the proliferation of Solid-State Lighting (SSL) systems, which offer optimized power consumption. SSL systems are built from LED devices, which are intrinsically fast and permit very fast light modulation. This, along with the congestion of the radio frequency spectrum, has paved the way for the emergence of Visible Light Communication (VLC) systems, which use free space to convey information through light modulation. Notwithstanding, as VLC systems proliferate and cost competitiveness ensues, two important aspects must be considered. State-of-the-art VLC implementations use power-demanding power amplifiers (PAs), so it is important to investigate whether regular, existing Switched-Mode Power Supply (SMPS) circuits can be adapted for VLC use. A 28 W buck regulator was implemented using an off-the-shelf LED driver integrated circuit, using both series and parallel dimming techniques. Results show that optical clock frequencies up to 500 kHz are achievable without any major modification besides adequate component sizing. The use of an LED as a sensor was also investigated, in a short-range, low-data-rate perspective. Results show successful communication in an LED-to-LED configuration, with enhanced range when using LED strings as sensors. Moreover, LEDs present spectrally selective sensitivity, which makes them good contenders for a multi-colour LED-to-LED system, such as in RGB displays and lamps. Ultimately, the present work shows evidence that LEDs can be used as dual-purpose devices, enabling not only illumination but also bi-directional data communication.
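To illustrate the modulation principle (a deliberately simplified sketch; the real system modulates LED current through the driver circuit, and only the 500 kHz clock figure is taken from the abstract): data bits map to on/off light states at a rate far above the flicker-fusion threshold, so the lamp still appears steadily lit.

```python
# Sketch of on-off keying (OOK), the simplest light-modulation scheme a
# VLC link can use; timing only, no driver electronics modelled.
def ook_waveform(bits, clock_hz=500_000):
    """Return (time_s, led_on) pairs, one bit per optical clock period."""
    period = 1.0 / clock_hz
    return [(i * period, bool(bit)) for i, bit in enumerate(bits)]

for t, led_on in ook_waveform([1, 0, 1, 1, 0]):
    print(f"t = {t * 1e6:4.1f} us  LED {'on' if led_on else 'off'}")
```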
Abstract:
Following the Introduction, which surveys the existing literature on technology advances and regulation in telecommunications and on two-sided markets, we address specific issues in the industries of the New Economy, characterized by the existence of network effects. We seek to explore how each of these industries works, identify potential market failures and find new solutions at the level of economic regulation that promote social welfare. In Chapter 1 we analyze a regulatory issue on access prices and investments in the telecommunications market. The existing literature on access prices and investment has pointed out that networks underinvest under a regime of mandatory access provision with a fixed access price per end-user. We propose a new access pricing rule, the indexation approach: the access price, per end-user, that network i pays to network j is a function of the investment levels set by both networks. We show that indexation can enhance economic efficiency beyond what is achieved with a fixed access price. In particular, access price indexation can simultaneously induce lower retail prices and higher investment and social welfare as compared to fixed access pricing or a regulatory holidays regime. Furthermore, we provide sufficient conditions under which indexation can implement the socially optimal investment or the Ramsey solution, which would be impossible to obtain under fixed access pricing. Our results contradict the notion that investment efficiency must be sacrificed for gains in pricing efficiency. In Chapter 2 we investigate the effect of regulations that limit advertising airtime on advertising quality and on social welfare. We show, first, that advertising time regulation may reduce the average quality of advertising broadcast on TV networks. Second, an advertising cap may reduce media platforms' and firms' profits, while the net effect on viewers' (subscribers') welfare is ambiguous, because the ad quality reduction resulting from a regulatory cap offsets the subscribers' direct gain from watching fewer ads. We find that if subscribers are sufficiently sensitive to ad quality, i.e., the ad quality reduction outweighs the direct effect of the cap, a cap may reduce social welfare. The welfare results suggest that a regulatory authority trying to increase welfare by regulating the volume of advertising on TV may also need to regulate advertising quality or, if regulating quality proves impractical, take the effect on advertising quality into consideration. In Chapter 3 we investigate the rules that govern Electronic Payment Networks (EPNs), in which the No-Surcharge Rule (NSR) requires that merchants charge at most the same amount for a payment card transaction as for cash. We analyze a three-party model (consumers, merchants, and a proprietary EPN) with endogenous transaction volumes and heterogeneous merchants' transactional benefits of accepting cards to assess the welfare impacts of the NSR. We show that, if merchants are local monopolists and the network externalities from merchants to cardholders are sufficiently strong, all agents except the EPN are worse off with the NSR, and therefore the NSR is socially undesirable. The positive role of the NSR in improving retail price efficiency for cardholders is also highlighted.
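As a purely illustrative rendering of the indexation idea in Chapter 1 (the functional form and sign condition below are our assumptions, not the conditions derived in the thesis), the per-end-user access price that network i pays to network j can be made to depend on both investment levels,

$$a_{ij} = f(I_i, I_j), \qquad \frac{\partial a_{ij}}{\partial I_i} < 0,$$

so that a network lowers the access charge it pays by investing more, an incentive that no fixed access price can replicate.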
Abstract:
The Social Media @ Galp project had a very specific purpose: to analyze the feasibility of Galp entering new social media platforms and, if appropriate, to develop a short-term entrance strategy with guidelines that remain valid for the medium to long term. As expected, the majority of the project focused on the second part, which consists of an analysis of aspects concerning the organization as well as its relationship with customers and the public in general.
Abstract:
Home automation and the Internet of Things (IoT) are two concepts that are having a considerable impact on society and changing everyday life, and users have become more demanding, expecting their equipment to be simpler and more effective. IoT is also widely used to improve quality of life (monitoring air quality, temperature, etc.), which requires a sensor network so that all the collected information can be made available to the user. The sensors are attached to network nodes that collect information. This network can use different topologies, the most common being the mesh: if one node stops working, the message is forwarded through other nodes until it reaches the main node. The next step is to send all this information to the cloud; in existing solutions, the user needs a fixed point (gateway) with Internet access. When wired Internet is not available, the usual solution is a wireless network or, for example, a 3G/4G card, which means paying fees to mobile operators. Moreover, most of these solutions require an electrical installation to power the sensor nodes. The real problem arises when the user has no Internet access at all at the site where the sensors are installed, or when there is no electrical installation to power the sensor nodes. The solution proposed in this dissertation is to use the mobile phone of a random passer-by as the gateway, so that an ordinary user can deploy sensors wherever necessary. The sensors are configured over the air through the application layer of the communication device, for example Synapse. Once configured, the sensors are ready to collect information and send it through the network to the main node. The main node consists of the aforementioned communication device connected to a microcontroller (an Arduino, for example) fitted with an SD card reader and a Bluetooth Low Energy (BLE) device. All the information collected by the sensors is stored on the SD card until a user with Bluetooth enabled on their mobile phone (an Android smartphone running the developed application) approaches the main node. Once the connection is accepted and established, the application sends the most recent timestamp for a specific sensor held in its cloud database to the Arduino; this allows the Arduino to delete the older data on the card and to reply to the phone with the oldest information collected by the sensor, thereby bringing the cloud up to date. This concept can be useful when there is no Internet connection at all, for example for an environmental agency that needs to deploy sensors in a forest for fire prevention: the nodes forward all the collected information through their network, and it is then up to the user to choose a strategic spot where individuals are known to pass with some frequency, so that they receive the information on their phones.
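A minimal sketch of the store-and-forward exchange described above (in Python for readability; record layout, names and timestamps are illustrative): the phone sends the newest timestamp its cloud database already holds for a sensor, and the node deletes everything up to that point from the SD card and replies with the oldest record still pending upload.

```python
# Hedged sketch of the main node's side of the BLE synchronisation step.
from collections import deque

sd_log = deque([            # (unix timestamp, sensor id, reading)
    (1700000000, "temp-01", 21.4),
    (1700000600, "temp-01", 21.9),
    (1700001200, "temp-01", 22.3),
])

def sync(latest_in_cloud: int):
    """Purge records the cloud already has; return the next one to send."""
    while sd_log and sd_log[0][0] <= latest_in_cloud:
        sd_log.popleft()                    # acknowledged: free SD space
    return sd_log[0] if sd_log else None    # oldest not-yet-uploaded record

print(sync(1700000000))   # -> (1700000600, 'temp-01', 21.9)
```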
Abstract:
Software Product Line (SPL) engineering aims at the efficient development of software products in a specific domain. New products are obtained by creating a new configuration specifying the desired product's features. This configuration must conform to a variability model that describes the scope of the SPL, or else it is not viable. To ensure this, configuration tools are used that do not allow invalid configurations to be expressed. A different concern, however, is making sure that a product addresses the stakeholders' needs as well as possible. The stakeholders may not be experts on the domain, so they may have unrealistic expectations. Also, the scope of the SPL is determined not only by the domain but also by limitations of the development platforms, so the desired set of features may go beyond what the SPL can currently create. This means that configuration tools should support not only creating valid products, but also improving the satisfaction of user concerns. We address this goal with a user-centric configuration process that offers suggestions during configuration, based on the use of soft constraints, and that identifies and explains potential conflicts as they arise. Suggestions help mitigate stakeholder uncertainty and poor domain knowledge by steering stakeholders towards well-known and desirable domain-related concerns. Automated conflict identification and explanation, in turn, helps stakeholders understand the trade-offs required to realize their vision, allowing informed conflict resolution. Additionally, we propose a prototype-based approach to configuration that addresses order-dependency issues by allowing the complete (or partial) specification of the features in a single step; a subsequent resolution process then identifies the repairs, or trade-offs, that may be required to make the configuration viable.
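A toy sketch of the distinction the abstract relies on (the features, rules and weights below are hypothetical, not drawn from any real SPL): hard constraints decide whether a configuration is a valid product at all, while soft constraints only score it, which is what makes them usable for suggestions and trade-off explanations.

```python
# Hard constraints gate validity; soft constraints rank valid products.
def valid(features: set) -> bool:
    if "offline_mode" in features and "cloud_sync" in features:
        return False                       # mutually exclusive features
    if "payments" in features and "encryption" not in features:
        return False                       # payments requires encryption
    return True

soft_constraints = [                       # (description, predicate, weight)
    ("prefer encryption", lambda f: "encryption" in f, 2),
    ("prefer small footprint", lambda f: len(f) <= 3, 1),
]

def score(features: set) -> int:
    return sum(w for _, pred, w in soft_constraints if pred(features))

cfg = {"payments", "encryption", "cloud_sync"}
print(valid(cfg), score(cfg))              # True 3: valid and well-scored
```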
Abstract:
This thesis proposes a methodology for modelling business interoperability in a context of cooperative industrial networks. The purpose is to develop a methodology that enables both the design of cooperative industrial network platforms that are able to deliver business interoperability and the analysis of the impact of business interoperability on the performance of these platforms. To achieve this objective, two modelling tools are employed: the Axiomatic Design Theory for the design of interoperable platforms, and Agent-Based Simulation for the analysis of the impact of business interoperability. The sequence in which the two modelling tools are applied depends on the scenario under analysis, i.e. whether the cooperative industrial network platform exists or not. If the platform does not exist, the methodology suggests first applying the Axiomatic Design Theory to design different configurations of interoperable cooperative industrial network platforms, and then using Agent-Based Simulation to analyse or predict the business interoperability and operational performance of the designed configurations. Otherwise, one should start by analysing the performance of the existing platform and, based on the results, decide whether it needs to be redesigned; if so, simulation is once again used to predict the performance of the redesigned platform. To explain how the two modelling tools can be applied in practice, a theoretical modelling framework, a theoretical Axiomatic Design model and a theoretical Agent-Based Simulation model are proposed. To demonstrate the applicability of the proposed methodology and to validate the proposed theoretical models, two case studies are presented: a Portuguese reverse logistics cooperative network (the Valorpneu network) and a Portuguese construction project (the Dam Baixo Sabor network). The findings from applying the proposed methodology to these two case studies suggest that the Axiomatic Design Theory can indeed contribute effectively to the design of interoperable cooperative industrial network platforms, and that Agent-Based Simulation provides an effective set of tools for analysing the impact of business interoperability on the performance of those platforms. However, these conclusions cannot be generalised, as only two case studies have been carried out. In terms of relevance to theory, this is the first time that the network effect is addressed in the analysis of the impact of business interoperability on the performance of networked companies, and the first time that a holistic approach is proposed for the design of interoperable cooperative industrial network platforms. Regarding the practical implications, the proposed methodology is intended to provide industrial managers with a management tool that can guide them easily, and in a practical and systematic way, through the design of configurations of interoperable cooperative industrial network platforms and/or the analysis of the impact of business interoperability on the performance of their companies and the networks in which they operate.
Abstract:
This research project was initially structured around four major chapters (four broad thematic lines), all of them developed extensively so that we might map some of the main territories and symptoms of contemporary art. Each of them rests on the principles of a malleable structure that is, for all intents and purposes, a work in progress, in this case thanks to the plasticity of the body, of space, of the image, and of the creative use of digital technologies, within which, indeed, everything around us seems to be produced, transformed and disseminated nowadays (almost as if it were a genuine interactive journey). From here onwards, therefore, the effort that follows seeks to test a working hypothesis (to develop an investigation) that may allow us to open paths towards the endless tunnels of the future, always in the expectation of giving form, function and meaning to an irrepressible desire for creative freedom, since contemporary art has the extraordinary capacity to transport us to many other places in the world, as real and imaginary as our own lives. It is therefore worth summarizing the main stages of this investigation. In a first moment, we begin by reflecting on the broad concept of "crisis" (the crisis of modernity), so that we can then address the crisis of the old aesthetic categories, questioning the concepts of the "beautiful" (Plato) and of "taste" (Kant), as well as the concept of "form" (Focillon), not only in order to understand some of the main reasons behind the so-called "end of art" (Hegel), but also some of those that led to the generalized aestheticization of contemporary experience and its dissemination across the most varied digital platforms. In a second moment, we reflect on some of the main problems of the unsettling history of images, namely in order to understand how the technical transformations linked to the appearance of photography, cinema, video, the computer and the Internet contributed to the establishment and expansion of what we have all come to know as the new "age of the image", or the image in "the age of its technical reproducibility" (Benjamin), for only then can we interrogate this unstoppable process of movement, fragmentation, dissemination, simulation and interaction of the most varied "forms of life" (Nietzsche, Agamben).
In the third moment, we seek to perceive contemporary art as a kind of interactive platform which, in turn, leads us to question some of the main metaphorical and experimental devices of the journey, in this case the journey as a line facilitating access to art, culture and contemporary life in general; that is, a process of reflection that prompts us to map some of the most attractive symptoms stemming from the aesthetics of the flâneur (in the perspective of Rimbaud, Baudelaire, Long and Benjamin) and, consequently, to summon some of the main sensations arising from the highly seductive experience of those who live immersed in the interactive orbit of cyberspace (as cyberflâneurs), almost as if the whole world were now merely a poetic space that is "fully navigable" (Manovich). Finally, in the fourth and last moment, we undertake a deep reflection on the unsettling history of the body, mainly in order to reinforce the idea that, despite its countless biological fragilities (a being that sickens and dies), the body remains one of the "most persistent categories of all Western culture" (Ieda Tucherman), not only because it has resisted all the transformations historically imposed upon it, but also because it has patiently reinvented and readapted itself in the face of those historical transformations. This is clear evidence that its plasticity would grant it, especially from the twentieth century onwards ("the century of the body"), a truly special theoretical and performative status. So special, in fact, that even a brief notion of its unsettling history is enough for us to grasp immediately the extraordinary importance of some of its most varied transformations, attractions, connections and exhibitions over recent decades, namely under the creative effect of digital technologies (within which some of the most interesting operations of cultural and artistic energization of our time take place). In short, we sincerely hope that this research may contribute to widening the increasingly uncertain, dynamic and interactive frontiers of knowledge of what seems to constitute, today, the fundamental game of our contemporaneity.