812 results for Virtual Reality, Cloud Gaming, Cloud Computing, Client-Server, Android, Unity, Multiutenza
Abstract:
At a recent conference on games in education, we made a radical decision to transform our standard presentation of PowerPoint slides and computer game demonstrations into a unified whole, inserting the PowerPoint presentation into the computer game. This opened up various questions relating to learning and teaching theories, which were debated by the conference delegates. In this paper, we reflect on these discussions, present our initial experiment, and relate it to various theories of learning and teaching. In particular, we consider the applicability of “concept maps” to inform the construction of educational materials, especially their topological, geometrical and pedagogical significance. We supplement this “spatial” dimension with a theory of the dynamic, temporal dimension, grounded in a context of learning processes such as Kolb’s learning cycle. Finally, we address the multi-player aspects of computer games and relate them to theories of social and collaborative learning. This paper attempts to explore these various theoretical bases and so support the development of a new virtual reality approach to learning and teaching.
Abstract:
2016 was the breakout year of the virtual reality industry, a field in which 3D surveying plays an important role and has received increasing attention. This project establishes and optimizes a WebGL three-dimensional broadcast platform combined with streaming media technology, taking a streaming media server and in-browser panoramic video playback as the application background. It also discusses the architecture from the streaming media server to the panoramic media player and analyses the relevant theoretical problems. The paper focuses on the debugging of the streaming media platform, the construction of the WebGL player environment, the analysis of different sphere models, and 3D mapping technology. The main work comprises the following points. First, a streaming service platform was built on the Easy Darwin open-source streaming media server; it receives an RTSP stream and forwards HLS-sliced video to clients. Then, a WebGL panoramic video player was written based on the Three.js library, with jQuery playback controls in the browser, to set up an HTML5 panoramic video player. Next, the latitude-longitude sphere model from the Three.js library was analysed with respect to the WebGL rendering method, and its drawbacks and the starting points for improvement were identified. After that, based on the Schneider transform principle, a Schneider sphere projection model was established and the output OBJ file was converted to a JS file for the media player to read, finally achieving plugin-free, real-time, high-precision panoramic video playback. Finally, the whole project is summarized and directions for future optimization and market extension are put forward.
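The player architecture described above, an equirectangular video mapped onto the inside of a latitude-longitude sphere and rendered with Three.js, can be illustrated with a minimal sketch. The code below is not the project's implementation: the element id is a placeholder and the HLS feed is assumed to be attached to the video element separately (e.g. via hls.js).

import * as THREE from 'three';

// Video element assumed to be fed by an HLS stream elsewhere (e.g. hls.js);
// the id is a placeholder for this sketch.
const video = document.getElementById('panorama-source') as HTMLVideoElement;

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(75, window.innerWidth / window.innerHeight, 1, 1100);

// Latitude-longitude sphere viewed from the inside: invert it on the x-axis so
// the equirectangular video texture faces the camera placed at the centre.
const geometry = new THREE.SphereGeometry(500, 60, 40);
geometry.scale(-1, 1, 1);

const texture = new THREE.VideoTexture(video);
const material = new THREE.MeshBasicMaterial({ map: texture });
scene.add(new THREE.Mesh(geometry, material));

const renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

function animate(): void {
  requestAnimationFrame(animate);
  renderer.render(scene, camera); // the video texture refreshes every frame
}
animate();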
Abstract:
This article presents the implementation process of an API (Application Programming Interface) that allows the Essential Reality P5 glove to interact with a virtual environment developed in the Java programming language and its Java 3D library. It also describes an example implemented using this API. Based on this example, the results of tests of physical resource requirements, such as CPU and physical memory, are presented. Finally, the conclusions and results obtained are set out.
Abstract:
The air-sea flux of greenhouse gases (e.g. carbon dioxide, CO2) is a critical part of the climate system and a major factor in the biogeochemical development of the oceans. More accurate and higher resolution calculations of these gas fluxes are required if we are to fully understand and predict our future climate. Satellite Earth observation is able to provide large spatial scale datasets that can be used to study gas fluxes. However, the large storage requirements needed to host such data can restrict its use by the scientific community. Fortunately, the development of cloud computing can provide a solution. Here we describe an open source air-sea CO2 flux processing toolbox called the ‘FluxEngine’, designed for use on a cloud-computing infrastructure. The toolbox allows users to easily generate global and regional air-sea CO2 flux data from model, in situ and Earth observation data, and its air-sea gas flux calculation is user configurable. Its current installation on the Nephalae cloud allows users to easily exploit more than 8 terabytes of climate-quality Earth observation data for the derivation of gas fluxes. The resultant NetCDF output files contain more than 20 data layers covering the various stages of the flux calculation, along with process indicator layers to aid interpretation of the data. This paper describes the toolbox design and the verification of the air-sea CO2 flux calculations, demonstrates the use of the tools for studying global and shelf-sea air-sea fluxes, and outlines future developments.
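The calculation such a toolbox performs is, at its core, the standard bulk flux formulation F = k * K0 * (pCO2_water - pCO2_air), where k is the gas transfer velocity and K0 the gas solubility. The sketch below illustrates that formulation only; the wind-speed parameterization, unit handling and sample values are assumptions for this example, not FluxEngine's actual configuration.

// Gas transfer velocity from 10 m wind speed (a Wanninkhof 1992-style quadratic
// relation), normalised by the Schmidt number, returned in cm/hr.
function gasTransferVelocity(windSpeed10m: number, schmidtNumber: number): number {
  return 0.31 * windSpeed10m ** 2 * Math.sqrt(660 / schmidtNumber);
}

// Bulk flux F = k * K0 * (pCO2_water - pCO2_air).
//   k          : gas transfer velocity  [cm/hr]
//   solubility : K0                     [mol m^-3 uatm^-1]
//   pCO2       : partial pressures      [uatm]
// Returned in mol m^-2 day^-1; positive means outgassing to the atmosphere.
function airSeaCO2Flux(k: number, solubility: number, pco2Water: number, pco2Air: number): number {
  const kMetresPerDay = (k / 100) * 24; // cm/hr -> m/day
  return kMetresPerDay * solubility * (pco2Water - pco2Air);
}

// Example: moderate wind over slightly supersaturated surface water.
const k = gasTransferVelocity(7.0, 660);         // ~15 cm/hr
console.log(airSeaCO2Flux(k, 3.0e-5, 400, 380)); // positive => this point acts as a CO2 source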
Abstract:
The big data era has dramatically transformed our lives; however, security incidents such as data breaches can put sensitive data (e.g. photos, identities, genomes) at risk. To protect users' data privacy, there is a growing interest in building secure cloud computing systems, which keep sensitive data inputs hidden, even from computation providers. Conceptually, secure cloud computing systems leverage cryptographic techniques (e.g. secure multiparty computation) and trusted hardware (e.g. secure processors) to instantiate a “secure” abstract machine consisting of a CPU and encrypted memory, so that an adversary cannot learn information through either the computation within the CPU or the data in the memory. Unfortunately, evidence has shown that side channels (e.g. memory accesses, timing, and termination) in such a “secure” abstract machine may leak highly sensitive information, including the cryptographic keys that form the root of trust for these secure systems. This thesis broadly expands the investigation of a research direction called trace-oblivious computation, in which programming language techniques are employed to prevent side-channel information leakage. We demonstrate the feasibility of trace-oblivious computation by formalizing and building several systems: GhostRider, a hardware-software co-design that provides a hardware-based trace-oblivious computing solution; SCVM, an automatic RAM-model secure computation system; and ObliVM, a programming framework that helps programmers develop applications. All of these systems enjoy formal security guarantees while outperforming prior systems by one to several orders of magnitude.
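The central idea behind trace-oblivious computation, making control flow and memory accesses independent of secret data, can be shown with a toy example. The sketch below is illustrative only and is not taken from GhostRider, SCVM or ObliVM.

// Leaky version: the branch taken (and the cache behaviour that follows)
// depends directly on the secret condition.
function selectLeaky(secretCond: boolean, a: number, b: number): number {
  return secretCond ? a : b;
}

// Oblivious version: both inputs are always read and combined arithmetically,
// so the instruction trace is identical for either value of the secret.
function selectOblivious(secretCond: boolean, a: number, b: number): number {
  const mask = Number(secretCond); // 1 or 0, no data-dependent branch
  return mask * a + (1 - mask) * b;
}

// Oblivious array read: touch every element so the memory access pattern is
// independent of the secret index (the linear-scan baseline that ORAM-style
// schemes, such as those used in the systems above, improve upon).
function readOblivious(arr: number[], secretIndex: number): number {
  let result = 0;
  for (let i = 0; i < arr.length; i++) {
    result = selectOblivious(i === secretIndex, arr[i], result);
  }
  return result;
}

console.log(readOblivious([10, 20, 30, 40], 2)); // 30, with a data-independent access pattern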
Abstract:
We are pleased to present a new issue of Revista Bibliotecas. It includes research of interest on the following topics: librarian archetypes, information literacy, and cloud computing. In the article devoted to archetypes, the author argues that the library profession, like other disciplines, is exposed to a series of stereotypes. Such impressions and beliefs cause people, driven by prejudice, not to venture into a fascinating discipline and, as a consequence, affect first-year student enrolment. The piece describes the situation of the Escuela de Bibliotecología, Documentación e Información at the Universidad Nacional and puts forward a didactic proposal to improve how secondary-school students perceive library science.
Abstract:
Part 13: Virtual Reality and Simulation
Abstract:
Part 13: Virtual Reality and Simulation
Abstract:
Recent years have seen growing adoption of cloud computing technologies, initially by individuals and small companies and more recently by large organizations. This technology has underpinned the emergence of a set of new trends, such as the Internet of Things connecting our personal devices and wearables to social networks, big data processes that make it possible to profile customer behaviour, and integrated service desks that make life easier for citizens. However, as with every disruptive trend, the new opportunities come with a set of new risks that must be weighed. Although this path is practically inevitable for a large share of companies and government bodies, its adoption must be subject to permanent evaluation and monitoring of the associated advantages and risks. To that end, it is essential that organizations equip themselves with efficient risk management, so that they can characterize risks (identify, analyse and quantify them) and move towards this new paradigm in a safe and methodical way. If they do not, the risks are clear, ranging from a possible loss of competitiveness against their peers and a loss of trust from customers and business partners to a complete standstill of the business. This master's thesis develops a generic risk analysis based on the ISO 31000:2009 standard and proposes a risk register that can support decision-making in the contracting and maintenance of cloud computing services by those responsible for private or public organizations.
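The kind of risk register the thesis proposes can be sketched as a simple data structure following the identify/analyse/quantify cycle of ISO 31000. The fields, scales and sample entries below are assumptions for this illustration, not the template actually proposed in the thesis.

// Illustrative risk-register entry for cloud-service adoption; scales and
// field names are assumptions, not the thesis's template.
type Likelihood = 1 | 2 | 3 | 4 | 5; // 1 = rare ... 5 = almost certain
type Impact = 1 | 2 | 3 | 4 | 5;     // 1 = negligible ... 5 = critical

interface RiskEntry {
  id: string;
  description: string;               // identification
  likelihood: Likelihood;            // analysis
  impact: Impact;                    // analysis
  treatment: 'avoid' | 'mitigate' | 'transfer' | 'accept';
  owner: string;
}

// Quantification: a simple likelihood x impact score used to rank risks.
const riskScore = (r: RiskEntry): number => r.likelihood * r.impact;

const register: RiskEntry[] = [
  { id: 'R-01', description: 'Vendor lock-in with a single cloud provider',
    likelihood: 4, impact: 3, treatment: 'mitigate', owner: 'CIO' },
  { id: 'R-02', description: 'Data breach of customer records stored in the cloud',
    likelihood: 2, impact: 5, treatment: 'transfer', owner: 'CISO' },
];

register.sort((a, b) => riskScore(b) - riskScore(a)); // highest-priority risks first
console.log(register.map(r => `${r.id}: score ${riskScore(r)}`));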
Abstract:
This document describes the efforts taken to create a generic computing solution for the most recurrent problems found in the production of two-dimensional, sprite-based videogames running on mobile platforms. The developed system is a web application that fits within the cloud-computing paradigm and therefore enjoys all of its advantages in terms of data safety, accessibility and maintainability. In addition to the functional issues, the system is also studied in terms of its internal software architecture, since it was planned and implemented with the aim of obtaining an easy-to-maintain application that is both scalable and adaptable. Furthermore, an algorithm is proposed that finds an optimized spatial arrangement of several rectangular areas, with no overlaps and no dimensional restrictions on either the final arrangement or the arranged areas.
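The space-distribution problem mentioned at the end of the abstract, arranging rectangular sprite frames on a sheet with no overlaps, can be illustrated with a plain shelf-packing heuristic. The sketch below is for illustration only and is not the algorithm developed in the thesis.

interface Rect { width: number; height: number; }
interface Placed extends Rect { x: number; y: number; }

// Shelf packing: fill the sheet row by row, starting a new "shelf" whenever the
// current row runs out of horizontal space. No placed rectangles overlap.
function shelfPack(rects: Rect[], sheetWidth: number): Placed[] {
  // Sort tallest-first so each shelf wastes less vertical space.
  const sorted = [...rects].sort((a, b) => b.height - a.height);
  const placed: Placed[] = [];
  let shelfY = 0;      // top of the current shelf
  let shelfHeight = 0; // height of the tallest rect on the current shelf
  let cursorX = 0;     // next free x position on the current shelf

  for (const r of sorted) {
    if (cursorX + r.width > sheetWidth) {
      // Current shelf is full: start a new one below it.
      shelfY += shelfHeight;
      cursorX = 0;
      shelfHeight = 0;
    }
    placed.push({ ...r, x: cursorX, y: shelfY });
    cursorX += r.width;
    shelfHeight = Math.max(shelfHeight, r.height);
  }
  return placed;
}

// Example: pack a few sprite frames into a 256-pixel-wide sheet.
console.log(shelfPack([{ width: 120, height: 80 }, { width: 90, height: 60 }, { width: 200, height: 40 }], 256));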
Abstract:
The diversity in the way cloud providers offer their services, state their SLAs, present their QoS, or support different technologies makes the portability and interoperability of cloud applications very difficult and favours the well-known vendor lock-in problem. We propose a model to describe cloud applications and the resources they require in an agnostic, provider- and resource-independent way, in which individual application modules, and entire applications, may be re-deployed using different services without modification. To support this model, and following the proposal of a variety of cross-cloud application management tools by different authors, we propose going one step further in the unification of cloud services with a management approach in which IaaS and PaaS services are integrated behind a unified interface. We provide support for deploying applications whose components are distributed across different cloud providers, using IaaS and PaaS services interchangeably.
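A provider- and resource-agnostic application description of the kind the abstract argues for might look roughly like the sketch below. The field names and structure are assumptions for this illustration, not the model actually defined in the paper.

// Provider-agnostic description: modules declare what they need, and the
// binding to a concrete provider and service kind (IaaS or PaaS) is kept
// separate, so re-deployment only changes the targets list.
type ServiceKind = 'IaaS' | 'PaaS';

interface ModuleSpec {
  name: string;            // e.g. 'frontend', 'orders-db'
  artifact: string;        // deployable artifact (war, container image, ...)
  requirements: {
    runtime?: string;      // e.g. 'java11', 'nodejs18'
    memoryMb?: number;
    dependsOn?: string[];  // names of other modules this one needs
  };
}

interface DeploymentTarget {
  module: string;          // which module this target binds
  provider: string;        // e.g. 'aws', 'azure', 'gcp' -- interchangeable
  kind: ServiceKind;       // the same module can go to IaaS or PaaS unchanged
}

interface ApplicationSpec {
  name: string;
  modules: ModuleSpec[];
  targets: DeploymentTarget[];
}

const app: ApplicationSpec = {
  name: 'shop',
  modules: [{ name: 'frontend', artifact: 'frontend:1.0', requirements: { runtime: 'nodejs18' } }],
  targets: [{ module: 'frontend', provider: 'aws', kind: 'PaaS' }],
};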
Abstract:
117 p.
Abstract:
This essay is presented as a master's thesis in the information technology law programme. It deals with various business models whose common characteristic is the commercialization of data in the context of information technology. The commercial practices observed are little known, and one of its objectives is to inform the reader about how these practices work. In order to frame the issues properly, the essay first discusses the theoretical concepts of privacy and the protection of personal information. With this overview in place, the practices of data brokerage, cloud computing and analytics solutions are then examined in detail. Throughout this description, the legal issues raised by each aspect of the practice in question are studied. Finally, the last chapter of the essay is devoted to two issues, the role of consent and data security, which are not tied to a specific commercial practice but are above all direct consequences of the evolution of information technology.