962 results for Video Game Industry
Abstract:
The aim of this paper is to evaluate the influence of the crushing process used to obtain recycled concrete aggregates on the performance of concrete made with those aggregates. Two crushing methods were considered: primary crushing, using a jaw crusher, and primary plus secondary crushing (PSC), using a jaw crusher followed by a hammer mill. Besides natural aggregates (NA), these two processes were also used to crush three types of concrete made in the laboratory (L20, L45 and L65) and three others from the precast industry (P20, P45 and P65). The coarse natural aggregates were fully replaced by coarse recycled concrete aggregates. The recycled aggregate concrete mixes were compared with reference mixes made using only NA, and the following mechanical and durability-related properties were tested: compressive strength; splitting tensile strength; modulus of elasticity; carbonation resistance; chloride penetration resistance; water absorption by capillarity; water absorption by immersion; and shrinkage. The results show that the PSC process leads to better performance, especially for the durability properties. © 2014 RILEM
Abstract:
The growing heterogeneity of networks, devices and consumption conditions calls for flexible and adaptive video coding solutions. The compression power of the HEVC standard and the benefits of the distributed video coding paradigm make it possible to design novel scalable coding solutions with improved error robustness and low encoding complexity while still achieving competitive compression efficiency. In this context, this paper proposes a novel scalable video coding scheme using an HEVC Intra compliant base layer and a distributed coding approach in the enhancement layers (EL). This design inherits the HEVC compression efficiency while providing low encoding complexity at the enhancement layers. The temporal correlation is exploited at the decoder to create the EL side information (SI) residue, an estimation of the original residue. The EL encoder sends only the data that cannot be inferred at the decoder, thus exploiting the correlation between the original and SI residues; however, this correlation must be characterized with an accurate correlation model to obtain coding efficiency improvements. Therefore, this paper proposes a correlation modeling solution to be used at both encoder and decoder, without requiring a feedback channel. Experimental results confirm that the proposed scalable coding scheme has lower encoding complexity and provides BD-Rate savings of up to 3.43% in comparison with the HEVC Intra scalable extension under development. © 2014 IEEE.
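The BD-Rate figure quoted above measures the average bit-rate difference between two rate-distortion curves. Below is a minimal sketch of the standard Bjøntegaard delta-rate computation for readers unfamiliar with the metric; the rate/PSNR points are made-up illustrative values, not results from this paper.

```python
# Sketch of the Bjontegaard delta rate (BD-Rate) between two RD curves.
# The RD points below are illustrative assumptions, not results from the paper.
import numpy as np

def bd_rate(rate_ref, psnr_ref, rate_test, psnr_test):
    # fit log(rate) as a cubic function of PSNR for each curve
    lr_ref, lr_test = np.log(rate_ref), np.log(rate_test)
    p_ref = np.polyfit(psnr_ref, lr_ref, 3)
    p_test = np.polyfit(psnr_test, lr_test, 3)
    # integrate over the overlapping quality range
    lo = max(min(psnr_ref), min(psnr_test))
    hi = min(max(psnr_ref), max(psnr_test))
    int_ref, int_test = np.polyint(p_ref), np.polyint(p_test)
    avg_diff = (np.polyval(int_test, hi) - np.polyval(int_test, lo)
                - np.polyval(int_ref, hi) + np.polyval(int_ref, lo)) / (hi - lo)
    return (np.exp(avg_diff) - 1) * 100   # percent rate change of test vs. reference

rates_a = [1000, 2000, 4000, 8000]; psnr_a = [32.0, 35.1, 38.0, 40.6]
rates_b = [950, 1900, 3800, 7700];  psnr_b = [32.1, 35.2, 38.1, 40.7]
print(f"BD-Rate: {bd_rate(rates_a, psnr_a, rates_b, psnr_b):.2f}%")
```

A negative value indicates that the test codec needs less rate than the reference for the same quality.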
Abstract:
The IEEE 802.15.4 Medium Access Control (MAC) protocol is an enabling technology for time-sensitive wireless sensor networks thanks to its Guaranteed Time Slot (GTS) mechanism in the beacon-enabled mode. However, the protocol only supports explicit GTS allocation, i.e. a node allocates a number of time slots in each superframe for exclusive use. The limitation of this explicit GTS allocation is that GTS resources may quickly be exhausted, since a maximum of seven GTSs can be allocated in each superframe, preventing other nodes from benefiting from guaranteed service. Moreover, the GTSs may be only partially used, resulting in wasted bandwidth. To overcome these limitations, this paper proposes i-GAME, an implicit GTS Allocation Mechanism for beacon-enabled IEEE 802.15.4 networks. The allocation is based on implicit GTS allocation requests, taking into account the traffic specifications and the delay requirements of the flows. The i-GAME approach enables the use of a GTS by multiple nodes while all their (delay, bandwidth) requirements are still satisfied. For that purpose, we propose an admission control algorithm that decides whether to accept a new GTS allocation request, based not only on the remaining time slots but also on the traffic specifications of the flows, their delay requirements and the available bandwidth resources. We show that our proposal improves bandwidth utilization compared to the explicit allocation used in the IEEE 802.15.4 protocol standard. We also present some practical considerations for the implementation of i-GAME, ensuring backward compatibility with the IEEE 802.15.4 standard with only minor add-ons.
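As a rough illustration of this kind of admission control, the sketch below accepts a new flow only if the aggregate demand fits the shared GTS capacity and every flow's worst-case delay bound stays within its requirement. The flow model, capacity figures and delay bound are illustrative assumptions, not the i-GAME algorithm itself.

```python
# Simplified admission-control sketch in the spirit of i-GAME (not the authors'
# implementation). All parameter values are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Flow:
    rate_bps: float      # sustained arrival rate of the flow
    burst_bits: float    # maximum burst size
    delay_req_s: float   # delay requirement

def admit(flows, new_flow, gts_capacity_bps, superframe_s):
    """Accept new_flow only if aggregate demand fits the shared GTS capacity
    and every flow's worst-case delay stays within its requirement."""
    candidate = flows + [new_flow]
    total_rate = sum(f.rate_bps for f in candidate)
    if total_rate > gts_capacity_bps:
        return False  # not enough guaranteed bandwidth for all flows
    for f in candidate:
        # crude worst-case bound: burst drained at the residual service rate,
        # plus one superframe of access latency
        residual = gts_capacity_bps - (total_rate - f.rate_bps)
        if f.burst_bits / residual + superframe_s > f.delay_req_s:
            return False
    return True

# Example: two existing flows, one new request sharing a single GTS allocation
existing = [Flow(2000, 512, 0.5), Flow(1500, 256, 0.8)]
print(admit(existing, Flow(1000, 128, 1.0), gts_capacity_bps=6000, superframe_s=0.12))
```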
Abstract:
This technical report describes the implementation details of the Implicit GTS Allocation Mechanism (i-GAME), for the IEEE 802.15.4 protocol. The i-GAME was implemented in nesC/TinyOS for the CrossBow MICAz mote, over our own implementation of the IEEE 802.15.4 protocol stack. This document provides the implementation details, including a description of the i-GAME software interfaces.
Abstract:
As high dynamic range video gains popularity, video coding solutions able to efficiently provide both low and high dynamic range video, notably within a single bitstream, are increasingly important. While simulcasting can provide both dynamic range versions at the cost of some compression efficiency penalty, bit-depth scalable video coding can offer a better trade-off between compression efficiency, adaptation flexibility and computational complexity. Considering the widespread use of H.264/AVC video, this paper proposes an H.264/AVC backward-compatible bit-depth scalable video coding solution offering a low dynamic range base layer and two high dynamic range enhancement layers with different qualities, at low complexity. Experimental results show that the proposed solution has an acceptable rate-distortion performance penalty relative to the HDR H.264/AVC single-layer coding solution.
Abstract:
In video communication systems, the video signals are typically compressed and sent to the decoder through an error-prone transmission channel that may corrupt the compressed signal, degrading the final decoded video quality. In this context, it is possible to enhance the error resilience of typical predictive video coding schemes by drawing on principles and tools from an alternative video coding approach, the so-called Distributed Video Coding (DVC), based on Distributed Source Coding (DSC) theory. Further improvements in the decoded video quality after error-prone transmission may also be obtained by considering the perceptual relevance of the video content, as distortions occurring in different regions of a picture have a different impact on the user's final experience. In this context, this paper proposes a Perceptually Driven Error Protection (PDEP) video coding solution that enhances the error resilience of a state-of-the-art H.264/AVC predictive video codec using DSC principles and perceptual considerations. To increase the H.264/AVC error resilience performance, the main technical novelties brought by the proposed video coding solution are: (i) an improved compressed-domain perceptual classification mechanism; (ii) an improved transcoding tool for the DSC-based protection mechanism; and (iii) the integration of a perceptual classification mechanism into an H.264/AVC compliant codec with a DSC-based error protection mechanism. The performance results obtained show that the proposed PDEP video codec provides a better-performing alternative to traditional error protection video coding schemes, notably Forward Error Correction (FEC)-based schemes. © 2013 Elsevier B.V. All rights reserved.
Abstract:
This paper presents a methodology to establish the investment and trading strategies of a power generation company. These strategies are integrated into the ITEM-Game simulator in order to test their results when played against defined strategies used by other players. The developed strategies focus on investment decisions, although trading strategies are also implemented to obtain base-case results. Two cases are studied, considering three players with the same trading strategy. In case 1, all players also have the same investment strategy, driven by a target market share. In case 2, player 1 has an improved investment strategy with a target share twice that of players 2 and 3. The results highlight the influence of CO2 and fuel prices on the company's investment decisions. The influence of the budget constraint, which may prevent a player from taking the desired investment decision, is also observed.
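As a toy illustration of a market-share-driven investment rule subject to a budget constraint, of the kind exercised in the two cases above, consider the following sketch; the parameters and threshold values are assumed for illustration and are not ITEM-Game inputs.

```python
# Toy sketch (assumed values, not ITEM-Game parameters): invest in a new plant
# only while the player's share of installed capacity is below its target and
# the remaining budget covers the plant cost.
def decide_investment(own_capacity_mw, market_capacity_mw, target_share,
                      plant_cost, budget):
    current_share = own_capacity_mw / market_capacity_mw
    return current_share < target_share and budget >= plant_cost

# Example: a player holds 9 GW of a 30 GW market and targets a 40 % share
print(decide_investment(9000, 30000, 0.40, plant_cost=800, budget=1000))  # True
```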
Abstract:
We show that a self-generated set of combinatorial games, S, may not be hereditarily closed, but strong self-generation and hereditary closure are equivalent in the universe of short games. In [13], the question "Is there a set which will give a non-distributive but modular lattice?" appears. A useful necessary condition for the existence of a finite non-distributive modular L(S) is proved. We show the existence of an S such that L(S) is modular and not distributive, exhibiting the first known example. Moreover, we prove a Representation Theorem with Games that allows the generation of all finite lattices in a game context. Finally, a computational tool for drawing lattices of games is presented. © 2014 Elsevier B.V. All rights reserved.
Abstract:
In this paper, a new PCA-based positioning sensor and localization system for mobile robots operating in unstructured environments (e.g. industrial, service and domestic settings) is proposed and experimentally validated. The inexpensive positioning system resorts to principal component analysis (PCA) of images acquired by a video camera installed onboard, looking upwards at the ceiling. This solution has the advantage of avoiding the need to select and extract features. The principal components of the acquired images are compared with previously registered images, stored in a reduced onboard image database, and the measured position is fused with odometry data. The optimal estimates of position and slippage are provided by Kalman filters with globally stable error dynamics. The experimental validation reported in this work focuses on the results of a set of experiments carried out in a real environment, where the robot travels along a lawn-mower trajectory. A small position error estimate with bounded covariance was always observed, for arbitrarily long experiments, and slippage was estimated accurately in real time.
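To make the matching step concrete, the sketch below projects a ceiling image onto principal components learned from a reference image set and returns the position of the closest stored image; the array shapes, database layout and synthetic data are illustrative assumptions, not the paper's implementation, and the odometry fusion is only hinted at in a comment.

```python
# Illustrative PCA image-matching sketch (assumed layout, not the paper's code).
import numpy as np

def build_pca_basis(reference_images, n_components=10):
    """reference_images: (N, H*W) flattened grayscale images from mapped positions."""
    mean = reference_images.mean(axis=0)
    centered = reference_images - mean
    _, _, vt = np.linalg.svd(centered, full_matrices=False)  # principal directions
    return mean, vt[:n_components]                            # basis: (n_components, H*W)

def project(image, mean, basis):
    return basis @ (image.ravel() - mean)     # low-dimensional descriptor

def measure_position(image, mean, basis, db_coeffs, db_positions):
    """Return the (x, y) of the stored image whose PCA coefficients are closest."""
    q = project(image, mean, basis)
    idx = np.argmin(np.linalg.norm(db_coeffs - q, axis=1))
    return db_positions[idx]      # this measurement would then be fused with
                                  # odometry in a Kalman filter

# Tiny synthetic usage: 20 random "ceiling images" of 8x8 pixels at known positions
rng = np.random.default_rng(0)
refs = rng.random((20, 64))
positions = rng.random((20, 2)) * 10.0
mean, basis = build_pca_basis(refs, n_components=5)
db = np.stack([project(img, mean, basis) for img in refs])
print(measure_position(refs[3] + 0.01 * rng.random(64), mean, basis, db, positions))
```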
Abstract:
The Janssen-Cilag proposal for a risk-sharing agreement regarding bortezomib received a welcome signal from NICE. The Office of Fair Trading report included risk-sharing agreements as an available tool for the National Health Service. Nonetheless, recent discussions have somewhat neglected the economic fundamentals underlying risk-sharing agreements. We argue here that risk-sharing agreements, although attractive due to the principle of paying by results, also entail risks. Too many patients may be put under treatment even with a low probability of success. Prices are likely to be adjusted upward in anticipation of future risk-sharing agreements between the pharmaceutical company and the third-party payer. One available instrument is a verification cost per patient treated, which makes it possible to obtain the first-best allocation of patients to the new treatment under the risk-sharing agreement. Overall, the welfare effects of risk-sharing agreements are ambiguous, and care must be taken with their use.
Abstract:
This paper presents a case study of heat exchanger network (HEN) retrofit with the objective of reducing utility consumption in a biodiesel production process. Pinch analysis studies allow determining the minimum utility duties as well as the maximum heat recovery. The heat exchangers already running in the process for heat recovery impose a serious restriction on the implementation of a grassroots HEN design based on pinch studies. Keeping the existing HEN, a set of alternatives with additional heat exchangers was created and analysed using industrial advice and selection criteria. The final proposed solution increases the heat recovered from 18 % to 23 % of the total heating needs of the process, with an estimated annual hot utility saving of 35 k€/y.
Abstract:
Electronics Letters, Vol. 38, No. 19
Abstract:
With the growth of the information available on the Web and in personal and professional archives, driven both by the increase in data storage capacity and by the exponential growth in computer processing power, together with easy access to that information, an enormous flow of production and distribution of audiovisual content has been generated. However, although mechanisms exist for indexing this content so that it can be searched and accessed, they usually involve high algorithmic complexity or require hiring highly qualified staff to verify and categorize the content. This dissertation studies collaborative content annotation solutions and develops a tool that facilitates the annotation of an archive of audiovisual content. The implemented approach is based on the concept of Games With a Purpose (GWAP) and allows users to create tags (metadata in the form of keywords) in order to assign meaning to an object being categorized. Thus, as a first objective, a game was developed with the purpose not only of entertainment but also of enabling the creation of audiovisual annotations for the videos presented to the player, thereby improving their indexing and categorization. The developed application also allows viewing the categorized content and metadata and, to create one more informative element, allows inserting a "like" at a specific time instant of the video. The major advantage of the developed application is that it attaches annotations to specific points of the video, namely to its time instants. This is a new feature, not available in other collaborative annotation applications for audiovisual content. As a result, access to the content becomes considerably more effective, since it becomes possible to reach, through search, specific points inside a video.
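To illustrate the idea of time-coded annotations, the sketch below shows a minimal data structure that attaches tags and "likes" to instants inside a video so that a search can land on a seek point rather than on the video as a whole; it is an illustrative assumption, not the dissertation's implementation.

```python
# Minimal structure for time-coded tags (illustrative assumption only).
from collections import defaultdict

class TaggedVideo:
    def __init__(self, video_id):
        self.video_id = video_id
        self.tags = defaultdict(list)   # tag -> list of timestamps (seconds)
        self.likes = []                 # timestamps where users pressed "like"

    def add_tag(self, tag, t_seconds):
        self.tags[tag.lower()].append(t_seconds)

    def add_like(self, t_seconds):
        self.likes.append(t_seconds)

    def find(self, tag):
        """Return the instants inside the video annotated with this tag."""
        return sorted(self.tags.get(tag.lower(), []))

# Example: a player tags second 42 of a clip with "goal"; searching "goal"
# now returns a seek point inside the video rather than just the video itself.
v = TaggedVideo("clip-001")
v.add_tag("goal", 42.0)
print(v.find("goal"))   # [42.0]
```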
Abstract:
Lean Thinking is based on the Toyota Production System, also known by the acronym TPS. It was developed in a manufacturing environment, in particular in the automotive industry, by Taiichi Ohno (1988), with the main objective of eliminating waste. Lean Thinking has since grown and is today much broader in scope. To improve the learning of lean concepts and practices, several games have been developed that simulate the use of the different lean tools. These games have a commercial orientation and are mainly aimed at industry; however, none could be found that can be used to simulate the Lean tools individually. In the scope of this dissertation, a didactic game was developed to support classes in which the Lean tools are studied. The Lean tools addressed in this work are: 5S, layout organization and Total Productive Maintenance. The developed game allows each tool to be introduced individually, and the simulations carried out make it possible to analyse the improvements obtained by eliminating waste through the application of the different tools.
Abstract:
The Casa da Música Foundation, responsible for the management of the Casa da Música do Porto building, needs to obtain statistical data on the number of the building's visitors. This information is a valuable tool for preparing periodic reports on the success of this cultural institution. For this reason it was necessary to develop a system capable of returning the number of visitors for a requested period of time. This represents a complex task due to the building's unique architectural design, characterized by very large doors and halls, and the sudden large number of people that pass through them in the moments preceding and following the different activities taking place in the building. To reach a technical solution for this challenge, several image processing methods for people detection with still cameras were first studied. The next step was the development of a real-time algorithm, using the OpenCV libraries and computer vision concepts, to count individuals with the desired accuracy. This algorithm incorporates the scientific and technical knowledge acquired in the study of the previous methods. The themes developed in this thesis comprise the fields of background maintenance, shadow and highlight detection, and blob detection and tracking. A graphical interface was also built to help in the development, testing and tuning of the proposed system, as a complement to the work. Furthermore, tests of the system were performed to validate the proposed techniques under a set of limited circumstances. The results obtained revealed that the algorithm was successfully applied to count the number of people in complex environments with reliable accuracy.
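For readers unfamiliar with the building blocks mentioned above (background maintenance, shadow detection, blob detection), the following is a minimal OpenCV sketch that subtracts the background, discards shadow pixels and counts foreground blobs per frame; it is not the thesis algorithm, the video source and threshold values are assumptions, and a real counter would also need tracking across frames.

```python
# Minimal background-subtraction and blob-counting sketch (OpenCV 4.x assumed;
# not the thesis algorithm, parameter values are illustrative).
import cv2

cap = cv2.VideoCapture("entrance.avi")          # hypothetical input clip
backsub = cv2.createBackgroundSubtractorMOG2(detectShadows=True)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = backsub.apply(frame)
    # drop shadow pixels (marked as 127 by MOG2) and clean up noise
    _, mask = cv2.threshold(mask, 200, 255, cv2.THRESH_BINARY)
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    blobs = [c for c in contours if cv2.contourArea(c) > 500]   # assumed area filter
    print(f"candidate people in frame: {len(blobs)}")

cap.release()
```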