960 results for Procedural content


Relevance:

70.00%

Publisher:

Abstract:

With the growing number of mobile platforms available on the market and the steady increase in their computational power, the possibility of running applications, and in particular games with high performance requirements, has grown considerably. The video game market therefore has an ever larger number of potential customers. In particular, the market for massively multiplayer online (MMO) games has become very attractive to game development companies. These games support a large number of simultaneous players, who may be running the game on different platforms and be spread across an extensive game world. To encourage exploration of that world, points of interest that the player can explore are distributed across it in an intelligent way. This approach demands substantial effort in planning and building those worlds, consuming time and resources during development. This is a problem for game development companies, and in some cases it is impractical for indie teams to bear such costs. This thesis presents an approach for creating worlds for MMO games. Several successful MMO games are studied in order to identify common properties of their worlds. The goal is to create a flexible framework capable of generating worlds whose structures respect sets of rules defined by game designers. So that the approach presented here can be used in several different applications, two main modules were developed. The first, called rule-based-map-generator, contains the logic and operations needed to create worlds. The second, called blocker, is a wrapper around the rule-based-map-generator module that manages the communication between server and clients. In short, the overall goal is to provide a framework that eases the generation of worlds for MMO games, normally a very time-consuming process that significantly increases production costs, through a semi-automatic approach combining the benefits of procedural content generation (PCG) with manually created graphical content.
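
As a rough illustration of the kind of rule-driven world generation the thesis describes, the Python sketch below scatters points of interest that satisfy designer-supplied rules. The class names and the spacing rule are illustrative assumptions and do not reproduce the actual API of the rule-based-map-generator module.

```python
import random

# Illustrative sketch of a rule-based world generator: designers supply
# placement rules and the generator scatters points of interest that satisfy
# them. Class names and the spacing rule are hypothetical, not the thesis's API.

class Rule:
    def __init__(self, name, predicate):
        self.name = name
        self.predicate = predicate  # callable: (world, x, y) -> bool

    def allows(self, world, x, y):
        return self.predicate(world, x, y)

class RuleBasedMapGenerator:
    def __init__(self, width, height, rules):
        self.width, self.height = width, height
        self.rules = rules
        self.points_of_interest = []

    def place_points_of_interest(self, count, attempts=1000):
        # Rejection sampling: keep a candidate position only if every rule allows it.
        for _ in range(attempts):
            if len(self.points_of_interest) >= count:
                break
            x, y = random.randrange(self.width), random.randrange(self.height)
            if all(rule.allows(self, x, y) for rule in self.rules):
                self.points_of_interest.append((x, y))
        return self.points_of_interest

def min_spacing(world, x, y, d=10):
    # Designer rule: points of interest stay at least d cells apart (Manhattan distance).
    return all(abs(x - px) + abs(y - py) >= d for px, py in world.points_of_interest)

generator = RuleBasedMapGenerator(256, 256, [Rule("spacing", min_spacing)])
print(generator.place_points_of_interest(count=20))
```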

Relevance:

70.00%

Publisher:

Abstract:

The game industry has been experiencing a consistent increase in the production costs of games. Part of this increase stems from the current trend towards bigger, more interactive and more replayable environments. This trend translates into larger teams and longer development times, which makes game development an even riskier investment and may reduce innovation in the area. As a possible solution to this problem, the scientific community has been focusing on procedural content generation and, more specifically, on procedurally generated levels. Given the great diversity and complexity of games, most works choose to deal with a specific genre, platform games being one of the most studied. This work proposes a procedural level generation method for platform/adventure games, a considerably more complex genre than most classic platformers and one that so far has not been the subject of study in other works. The level generation process was divided into two steps, planning and visual generation, respectively responsible for generating a compact representation of the level and determining its appearance. The planning stage was divided into game design and level design, and uses a goal-oriented process to output a set of rooms. The visual generation step receives a set of rooms and fills their interiors with the appropriate parts of previously authored geometry.
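
A minimal sketch of the planning/visual-generation split described above follows; the room representation, the goal-oriented planner and the tile library are hypothetical simplifications, not the method's actual implementation.

```python
from dataclasses import dataclass, field
from typing import Dict, List

# Hypothetical sketch of the planning / visual-generation split: planning
# produces a compact set of rooms, and visual generation dresses each room
# with previously authored geometry. All names are illustrative.

@dataclass
class Room:
    purpose: str                                   # e.g. "key", "lock", "boss"
    exits: List[str] = field(default_factory=list)

def plan_level(goals: List[str]) -> List[Room]:
    """Goal-oriented planning (greatly simplified): each design goal becomes a
    room connected to the previously planned room."""
    rooms = [Room(purpose="start")]
    for goal in goals:
        rooms.append(Room(purpose=goal, exits=[rooms[-1].purpose]))
    return rooms

def generate_visuals(rooms: List[Room], tile_library: Dict[str, str]) -> Dict[str, str]:
    """Visual generation: pick authored geometry that matches each room's purpose."""
    return {room.purpose: tile_library.get(room.purpose, "generic_room_prefab")
            for room in rooms}

rooms = plan_level(["key", "lock", "boss"])
print(generate_visuals(rooms, {"key": "treasure_room_prefab", "boss": "boss_arena_prefab"}))
```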

Relevance:

60.00%

Publisher:

Abstract:

So far in this book, we have seen a large number of methods for generating content for existing games. So, if you have a game already, you could now generate many things for it: maps, levels, terrain, vegetation, weapons, dungeons, racing tracks. But what if you don’t already have a game, and want to generate the game itself? What would you generate, and how? At the heart of any game are its rules. This chapter will discuss representations for game rules of different kinds, along with methods to generate them, and evaluation functions and constraints that help us judge complete games rather than just isolated content artefacts. Our main focus here will be on methods for generating interesting, fun, and/or balanced game rules. However, an important perspective that will permeate the chapter is that game rule encodings and evaluation functions can encode game design expertise and style, and thus help us understand game design. By formalising aspects of the game rules, we define a space of possible rules more precisely than could be done by writing about rules in qualitative terms; and by choosing which aspects of the rules to formalise, we define which aspects of the game are interesting to explore and introduce variation in. In this way, each game generator can be thought of as an executable micro-theory of game design, though often a simplified, and sometimes even a caricatured, one.
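
As a toy illustration of encoding rules as data and judging them with an evaluation function, the sketch below scores a trivial parameterised dice game by how balanced it is under random play. The encoding, the game and the balance measure are illustrative assumptions, not the book's own system.

```python
import random

# Toy illustration: a game's rules are encoded as data, and an evaluation
# function scores a candidate ruleset over simulated playouts. Here "balance"
# is how close random-play win rates are to 50/50.

def simulate(ruleset):
    """Play one random match of a trivial dice game defined by the ruleset."""
    scores = [0, 0]
    for _ in range(ruleset["rounds"]):
        for player in (0, 1):
            roll = random.randint(1, ruleset["die_faces"])
            bonus = ruleset["first_player_bonus"] if player == 0 else 0
            scores[player] += roll + bonus
    if scores[0] == scores[1]:
        return None                       # draw
    return 0 if scores[0] > scores[1] else 1

def balance_score(ruleset, playouts=2000):
    """Evaluation function: 1.0 means perfectly balanced, 0.0 means one-sided."""
    wins = [0, 0]
    for _ in range(playouts):
        winner = simulate(ruleset)
        if winner is not None:
            wins[winner] += 1
    total = sum(wins) or 1
    return 1.0 - 2 * abs(wins[0] / total - 0.5)

print(balance_score({"rounds": 5, "die_faces": 6, "first_player_bonus": 1}))
```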

Relevance:

60.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

60.00%

Publisher:

Abstract:

This study aimed, first, to produce educational material in audiovisual format for Physical Education, addressing the Transversal Themes from the perspective of basketball. The themes chosen were Cultural Plurality, approached through wheelchair basketball, and Labor and Consumer Affairs, approached through the marketing and the exploitation of the labour force by the company Nike. There is no doubt about the importance of media production in contemporary society, as one of its main languages, yet it has little presence in the school environment, especially in the area of Physical Education. In a second phase, the study aimed to evaluate the feasibility of this material both as a teaching strategy and as learning content for lessons. The evaluation was carried out through a discussion in the form of focus groups with 10 students from the 9th grade of elementary school. The results showed that, despite an initial rejection of the material as a teaching strategy because of the way Physical Education has been taught for several years, students were open to new ways of learning. The results concerning the material as learning content showed that the videos were able both to address the Transversal Themes and to meet the need for procedural content through the proposed activities. With this, we realise that students are willing to discuss matters that are not always addressed but that, with the creation of the national curriculum standards, should now be reflected in the classroom.

Relevance:

60.00%

Publisher:

Abstract:

This paper reports on research carried out with High School students, in which mathematical investigation activities were designed based on questions from entrance examinations of federal and state public universities. The activities were then applied to these students at a state school in the interior of São Paulo and, using the data obtained from their productions, we sought to analyse which procedures they used and which procedural changes occurred. The intention is to contribute to the mathematical education of High School students, aiming at a more meaningful learning of the mathematical concepts and procedures involved when working with this type of activity.

Relevance:

60.00%

Publisher:

Abstract:

This project involves the development of a complete procedural quest generation system for videogames. We seek to build, by chaining a series of algorithms and a model of the game and its components, sequences of game actions and events linked together in a logical way. Carrying out these sequences in order leads progressively towards a final objective. Such sequences are known as quests in the videogame world. The two main phases of the process are the generation of a quest from an initial game state and the search for an optimal quest using criteria that can be tied to the player's properties, thus giving rise to adaptive quests. The project covers the full development of the system, including both the generation and search components and a videogame in which the rest of the system is embedded to complete it. The final result is fully functional and playable. The theoretical basis of the project comes from the symbiosis of two arts: procedural content generation and interactive storytelling.
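
As an illustration of the two phases described (quest generation from an initial game state, followed by a search for an optimal quest under player-dependent criteria), here is a minimal Python sketch. The actions, preconditions and scoring criterion are hypothetical and greatly simplified compared with the project's actual system.

```python
import itertools

# Illustrative sketch: (1) generate candidate quests as logically chained
# actions from an initial game state, (2) search for the quest that best fits
# a criterion tied to the player. Actions and the criterion are hypothetical.

ACTIONS = {
    "talk_to_elder":  {"requires": set(),                     "grants": {"quest_known"}},
    "find_key":       {"requires": {"quest_known"},           "grants": {"has_key"}},
    "fight_guardian": {"requires": {"quest_known"},           "grants": {"path_clear"}},
    "open_vault":     {"requires": {"has_key", "path_clear"}, "grants": {"goal"}},
}

def generate_quests(initial_state, max_len=4):
    """Enumerate action orderings whose preconditions hold step by step and reach the goal."""
    quests = []
    for order in itertools.permutations(ACTIONS, max_len):
        state = set(initial_state)
        for action in order:
            if not ACTIONS[action]["requires"] <= state:
                break
            state |= ACTIONS[action]["grants"]
        else:
            if "goal" in state:
                quests.append(order)
    return quests

def score(quest, player):
    """Adaptive criterion: combat-loving players prefer quests that introduce fights early."""
    combat_bonus = sum(1 for i, action in enumerate(quest) if "fight" in action and i < 2)
    return combat_bonus if player["likes_combat"] else -combat_bonus

player = {"likes_combat": True}
best = max(generate_quests({"game_started"}), key=lambda quest: score(quest, player))
print(best)
```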

Relevance:

60.00%

Publisher:

Abstract:

Ephemeral Computation (Eph-C) is a newly created computation paradigm whose purpose is to take advantage of the ephemeral nature (limited lifetime) of computational resources. We first discuss this new paradigm in general terms and then more specifically in terms of videogame development, presenting possible applications and benefits for the main research fields associated with it. This is a preliminary work that aims to investigate the possibilities of applying ephemeral computation to the products of the videogame industry and, as such, it attempts to serve as inspiration for other researchers and videogame developers.

Relevance:

30.00%

Publisher:

Abstract:

Impaired respiratory function (IRF) during procedural sedation and analgesia (PSA) poses considerable risk to patient safety as it can lead to inadequate oxygenation and ventilation. Risk factors that can be screened prior to the procedure have not been identified for the cardiac catheterization laboratory (CCL).

Relevance:

30.00%

Publisher:

Abstract:

In recent years, there has been a significant increase in the popularity of ontological analysis of conceptual modelling techniques. To date, related research explores the ontological deficiencies of classical techniques such as ER or UML modelling, as well as business process modelling techniques such as ARIS or even Web Services standards such as BPEL4WS, BPML, ebXML, BPSS and WSCI. While the ontologies that form the basis of these analyses are reasonably mature, it is the actual process of an ontological analysis that still lacks rigour. The current procedure is prone to individual interpretations and is one reason for criticism of the entire ontological analysis. This paper presents a procedural model for ontological analysis based on the use of meta models, multiple coders and metrics. The model is supported by examples from various ontological analyses.
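
The paper's procedural model relies on multiple coders and metrics. As a hedged illustration of the kind of metric that can quantify agreement between independent coders, the sketch below computes Cohen's kappa for two hypothetical codings of modelling constructs; both the metric choice and the example data are assumptions, not taken from the paper.

```python
from collections import Counter

# Illustrative only: one way to quantify agreement between two coders who
# independently map modelling-language constructs to ontological concepts.
# Cohen's kappa is a standard chance-corrected agreement metric; the paper's
# actual metrics may differ.

def cohens_kappa(coder_a, coder_b):
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    expected = sum(freq_a[c] * freq_b[c] for c in set(coder_a) | set(coder_b)) / (n * n)
    return (observed - expected) / (1 - expected)

# Two coders classify five constructs against an upper ontology (hypothetical data).
coder_a = ["thing", "transformation", "thing", "state", "transformation"]
coder_b = ["thing", "transformation", "state", "state", "transformation"]
print(round(cohens_kappa(coder_a, coder_b), 2))
```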

Relevance:

30.00%

Publisher:

Abstract:

Background: Procedural sedation and analgesia (PSA) is used to attenuate the pain and distress that may otherwise be experienced during diagnostic and interventional medical or dental procedures. As the risk of adverse events increases with the depth of sedation induced, frequent monitoring of level of consciousness is recommended. Level of consciousness is usually monitored during PSA with clinical observation. Processed electroencephalogram-based depth of anaesthesia (DoA) monitoring devices provide an alternative method to monitor level of consciousness that can be used in addition to clinical observation. However, there is uncertainty as to whether their routine use in PSA would be justified. Rigorous evaluation of the clinical benefits of DoA monitors during PSA, including comprehensive syntheses of the available evidence, is therefore required. One potential clinical benefit of using DoA monitoring during PSA is that the technology could improve patient safety by reducing sedation-related adverse events, such as death or permanent neurological disability. We hypothesise that earlier identification of lapses into deeper than intended levels of sedation using DoA monitoring leads to more effective titration of sedative and analgesic medications, and results in a reduction in the risk of adverse events caused by the consequences of over-sedation, such as hypoxaemia. The primary objective of this review is to determine whether using DoA monitoring during PSA in the hospital setting improves patient safety by reducing the risk of hypoxaemia (defined as an arterial partial pressure of oxygen below 60 mmHg or a percentage of haemoglobin that is saturated with oxygen [SpO2] of less than 90%). Other potential clinical benefits of using DoA monitoring devices during sedation will be assessed as secondary outcomes.

Methods/design: Electronic databases will be systematically searched for randomized controlled trials comparing the use of depth of anaesthesia monitoring devices with clinical observation of level of consciousness during PSA. Language restrictions will not be imposed. Screening, study selection and data extraction will be performed by two independent reviewers. Disagreements will be resolved by discussion. Meta-analyses will be performed if suitable.

Discussion: This review will synthesise the evidence on an important potential clinical benefit of DoA monitoring during PSA within hospital settings.

Relevance:

30.00%

Publisher:

Abstract:

The main objective of this work was to monitor a set of physical-chemical properties of heavy oil procedural streams through nuclear magnetic resonance spectroscopy, in order to propose an analysis procedure and online data processing for process control. Different statistical methods that relate the results obtained by nuclear magnetic resonance spectroscopy to the results obtained by the conventional standard methods during the characterization of the different streams were implemented in order to develop models for predicting these same properties. Real-time knowledge of these physical-chemical properties of petroleum fractions is very important for enhancing refinery operations, ensuring technically, economically and environmentally proper refinery operation.

The first part of this work involved the determination of many physical-chemical properties at the Matosinhos refinery, following standard methods used to evaluate and characterize light vacuum gas oil, heavy vacuum gas oil and fuel oil fractions. Kinematic viscosity, density, sulfur content, flash point, carbon residue, P-value and atmospheric and vacuum distillations were the properties analysed. Besides the analysis using the standard methods, the same samples were analysed by nuclear magnetic resonance spectroscopy.

The second part of this work concerned the application of multivariate statistical methods that correlate the physical-chemical properties with the quantitative information acquired by nuclear magnetic resonance spectroscopy. Several methods were applied, including principal component analysis, principal component regression, partial least squares and artificial neural networks. Principal component analysis was used to reduce the number of predictive variables and to transform them into new variables, the principal components. These principal components were used as inputs to the principal component regression and artificial neural networks models. For the partial least squares model, the original data were used as input. Taking into account the performance of the developed models, assessed through selected statistical performance indexes, it was possible to conclude that principal component regression led to the worst performance. Better results were achieved with the partial least squares and artificial neural networks models; however, it was the artificial neural networks model that gave the best predictions for almost all of the properties analysed. From the results obtained, it was possible to conclude that nuclear magnetic resonance spectroscopy combined with multivariate statistical methods can be used to predict physical-chemical properties of petroleum fractions, and that this technique can be considered a potential alternative to the conventional standard methods, having produced very promising results.
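
As a rough illustration of the principal component regression and partial least squares workflow described above, the following scikit-learn sketch uses synthetic data in place of real NMR spectra and refinery measurements; it compresses spectra with PCA and regresses the scores against a reference property, and is not the thesis's actual calibration code.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.cross_decomposition import PLSRegression

# Sketch of the chemometric workflow on synthetic data: NMR spectra (rows) are
# compressed with PCA and the scores feed a regression model predicting a
# physical-chemical property such as kinematic viscosity. Real models would be
# calibrated against the standard-method measurements.

rng = np.random.default_rng(0)
spectra = rng.normal(size=(60, 500))          # 60 samples x 500 spectral variables
viscosity = spectra[:, :5].sum(axis=1) + rng.normal(scale=0.1, size=60)

# Principal component regression: PCA scores as inputs to a linear model.
pcr = make_pipeline(PCA(n_components=10), LinearRegression())
pcr.fit(spectra[:40], viscosity[:40])
print("PCR R^2:", pcr.score(spectra[40:], viscosity[40:]))

# Partial least squares regression uses the original spectral variables directly.
pls = PLSRegression(n_components=10)
pls.fit(spectra[:40], viscosity[:40])
print("PLS R^2:", pls.score(spectra[40:], viscosity[40:]))
```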

Relevance:

30.00%

Publisher:

Abstract:

The experiment discussed in this paper is a direct replication of Finkbeiner (2005) and an indirect replication of Jiang and Forster (2001) and Witzel and Forster (2012). The paper explores the use of episodic memory in L2 vocabulary processing. By administering an L1 episodic recognition task with L2 masked translation primes, reduced reaction times would suggest L2 vocabulary storage in episodic memory. The methodology follows Finkbeiner (2005), who argued that a blank screen introduced after the prime in Jiang and Forster (2001) led to a ghosting effect, compromising the imperceptibility of the prime. The results here mostly corroborate Finkbeiner (2005), with no significant priming effects. While Finkbeiner discusses his findings in terms of the dissociability of episodic and semantic memory, and discounts Jiang and Forster's (2001) results as participants' strategic responding, I add a layer of analysis based on declarative and procedural constituents. From this perspective, Jiang and Forster's (2001) and Witzel and Forster's (2012) results can be seen as possible episodic memory activation, while Finkbeiner's (2005) and my own lack of priming effects might be due to the sole activation of procedural neural networks. Priming effects are found in concrete and abstract words but require verification through further experimentation.