919 results for Computer Game Testing
Abstract:
Who was the cowboy in Washington? What is the land of sushi? Most people would have answers to these questions readily available; yet modern search engines, arguably the epitome of technology in finding answers to most questions, are completely unable to produce them. It seems that people need to capture only a few information items to converge rapidly on a seemingly 'obvious' solution. We will study approaches to this problem, with two additional hard demands that constrain the space of possible theories: the sought model must be both psychologically and neuroscientifically plausible. Building on top of the mathematical model of memory called Sparse Distributed Memory, we will see how some well-known methods in cryptography can point toward a promising, comprehensive solution that preserves four crucial properties of human psychology.
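Since the abstract builds on Sparse Distributed Memory, a minimal sketch of a Kanerva-style SDM may help readers unfamiliar with the model; the dimensions, activation radius, and method names below are illustrative assumptions, not details taken from this work.

```python
import numpy as np

class SparseDistributedMemory:
    """Minimal Kanerva-style SDM sketch (parameters are illustrative)."""

    def __init__(self, n_locations=1000, dim=256, radius=112, seed=0):
        rng = np.random.default_rng(seed)
        # Fixed random binary addresses of the hard locations.
        self.addresses = rng.integers(0, 2, size=(n_locations, dim))
        # One signed counter per bit per hard location.
        self.counters = np.zeros((n_locations, dim), dtype=np.int32)
        self.radius = radius

    def _active(self, address):
        # Hard locations within Hamming distance `radius` of the address.
        distances = np.count_nonzero(self.addresses != address, axis=1)
        return distances <= self.radius

    def write(self, address, data):
        # Add +1/-1 per bit of `data` to every activated location.
        self.counters[self._active(address)] += 2 * data - 1

    def read(self, address):
        # Sum counters over activated locations and threshold at zero.
        return (self.counters[self._active(address)].sum(axis=0) > 0).astype(int)
```

Writing a pattern at its own address and then reading from a noisy copy of that address recovers the stored pattern, which is the content-addressable, converge-from-few-cues behaviour the abstract alludes to.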
Abstract:
As digital systems move away from traditional desktop setups, new interaction paradigms are emerging that better integrate with users' real-world surroundings and better support users' individual needs. While promising, these modern interaction paradigms also present new challenges, such as a lack of paradigm-specific tools to systematically evaluate and fully understand their use. This dissertation tackles this issue by framing empirical studies of three novel digital systems in embodied cognition, an exciting new perspective in cognitive science in which the body and its interactions with the physical world take a central role in human cognition. This is achieved, first, by focusing the design of all these systems on tangible interaction, a contemporary interaction paradigm that emphasizes physical interaction; and second, by comprehensively studying user performance in these systems through a set of novel performance metrics grounded in epistemic actions, a well-established and studied construct in the literature on embodied cognition. The first system presented in this dissertation is an augmented Four-in-a-row board game. Three versions of the game were developed, based on three different interaction paradigms (tangible, touch, and mouse), and a repeated-measures study involving 36 participants measured the occurrence of three simple epistemic actions across these three interfaces. The results highlight the relevance of epistemic actions in such a task and suggest that the different interaction paradigms afford instantiation of these actions in different ways. Additionally, the tangible version of the system supports the most rapid execution of these actions, providing novel quantitative insights into the real benefits of tangible systems. The second system presented in this dissertation is a tangible tabletop scheduling application. Two studies with single and paired users provide several insights into the impact of epistemic actions on the user experience when these are performed outside of a system's sensing boundaries. These insights are clustered by the form, size, and location of ideal interface areas for such offline epistemic actions to occur, as well as by how physical tokens can be designed to better support them. Finally, building on the results obtained to this point, the last study presented in this dissertation directly addresses the lack of empirical tools to formally evaluate tangible interaction. It presents a video-coding framework grounded in a systematic literature review of 78 papers and evaluates its value as a metric through a 60-participant study performed across three different research laboratories. The results highlight the usefulness and power of epistemic actions as a performance metric for tangible systems. In sum, through the use of such novel metrics in each of the three studies presented, this dissertation provides a better understanding of the real impact and benefits of designing and developing systems that feature tangible interaction.
Abstract:
Objectives: The objective of the present study was to evaluate a prefabricated intraradicular threaded pure titanium post, designed and developed at the Sao Jose dos Campos School of Dentistry - UNESP, Brazil. This new post was designed to minimize the stresses observed with prefabricated post systems and to improve cost-effectiveness. Materials and methods: Fracture-resistance testing of the post/core/root complex, fracture analysis by microscopy, and stress analysis by the finite element method were used to evaluate the post. The following four prefabricated metal post systems were analyzed: group 1, experimental post; group 2, modification of the experimental post; group 3, Flexi Post; and group 4, Para Post. For the analysis of fracture resistance, 40 bovine teeth were randomly assigned to the four groups (n=10) and used for the fabrication of test specimens simulating the situation in the mouth. The test specimens were subjected to compressive strength testing until fracture in an EMIC universal testing machine. After fracture of the test specimens, their roots were sectioned and analyzed by microscopy. For the finite element method, the specimens of the fracture-resistance test were simulated by computer modeling to determine the stress distribution pattern in the post systems studied. Results: The fracture test presented the following means and standard deviations: G1 (45.63 ± 8.77), G2 (49.98 ± 7.08), G3 (43.84 ± 5.52), G4 (47.61 ± 7.23). Stress was homogeneously distributed along the body of the intraradicular post in group 1, whereas high stress concentrations in certain regions were observed in the other groups. These stress concentrations in the body of the post induced the same stress concentration in the root dentin. Conclusions: The experimental post (original and modified versions) presented similar fracture resistance and better results in the stress analysis when compared with the commercial post systems tested (08/2008PA/CEP).
Abstract:
Research on multiple classifier systems includes the creation of an ensemble of classifiers and the proper combination of their decisions. To combine the decisions given by classifiers, methods based on fixed rules and decision templates are often used; as a result, the influence of, and relationships between, classifier decisions are often not considered in the combination scheme. In this paper we propose a framework that combines classifiers using a decision graph under a random-field model and a game-strategy approach to obtain the final decision. We report the results of combining Optimum-Path Forest (OPF) classifiers using the proposed model, obtaining good performance in experiments on simulated and real data sets. The results encourage the combination of OPF ensembles and the use of the framework to design multiple classifier systems. © 2011 Springer-Verlag.
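For context, here is a minimal sketch of the fixed-rule combiners the abstract contrasts with (majority vote and the sum rule), which treat each classifier's decision independently; the array shapes and function names are assumptions for illustration, not part of the proposed framework.

```python
import numpy as np

def majority_vote(labels, n_classes):
    """Fixed rule: per-sample plurality over classifier label outputs.

    labels: int array of shape (n_classifiers, n_samples).
    """
    counts = np.zeros((n_classes, labels.shape[1]), dtype=int)
    for clf_labels in labels:                       # one classifier at a time
        counts[clf_labels, np.arange(labels.shape[1])] += 1
    return counts.argmax(axis=0)

def sum_rule(probabilities):
    """Fixed rule: average the posterior estimates, then take the arg max.

    probabilities: array of shape (n_classifiers, n_samples, n_classes).
    """
    return probabilities.mean(axis=0).argmax(axis=1)
```

Both rules ignore dependencies between classifier decisions, which is precisely the gap the proposed decision-graph framework targets.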
Abstract:
This paper presents the results obtained with a business game whose model represents the decision-making process at two moments in the life of an industrial company: first, the design of the industrial plant, and second, its management. The game model was conceived so that the player's first decisions establish capacity and other parameters, such as the quantity of each product to produce, marketing expenses, research and development, quality, advertising, salaries, whether purchases will be made in installments or in cash, whether there will be credit sales and how many installments will be allowed, and the number of workers in the assembly area. An experiment was conducted with employees of a Brazilian company. The data obtained indicate that the players lack knowledge of the relevant content, especially in finance. Although these results cannot be generalized, they confirm prior results with undergraduate and graduate students, and they indicate the need for reinforcement of this content at the undergraduate level. © 2012 Springer-Verlag.
Abstract:
Gesture-based applications have particularities of their own, since users interact naturally, much as they do in the non-digital world. Hence, new requirements arise for the software design process. This paper presents a software development process model for such applications, covering requirement specification, design, implementation, and testing procedures. The steps and activities of the proposed model were exercised through a case study: a puzzle game in which the puzzle is completed when all pieces of a painting are correctly positioned through drag-and-drop actions performed with the user's hand gestures. The paper also reports the results of applying a heuristic evaluation to this game. © 2012 IEEE.
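As an illustration of the drag-and-drop mechanic described above, here is a minimal sketch of the piece-placement check such a puzzle might perform; the snap tolerance, class names, and fields are assumptions, not details of the authors' implementation.

```python
from dataclasses import dataclass

SNAP_TOLERANCE = 20.0  # pixels; illustrative value

@dataclass
class Piece:
    x: float          # current position (e.g., driven by the tracked hand)
    y: float
    target_x: float   # correct position within the painting
    target_y: float
    placed: bool = False

def on_drop(piece: Piece) -> None:
    """Called when the gesture recognizer reports a 'release' over a piece."""
    if (abs(piece.x - piece.target_x) <= SNAP_TOLERANCE
            and abs(piece.y - piece.target_y) <= SNAP_TOLERANCE):
        piece.x, piece.y = piece.target_x, piece.target_y  # snap into place
        piece.placed = True

def puzzle_completed(pieces: list[Piece]) -> bool:
    return all(p.placed for p in pieces)
```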
Abstract:
Studies show the positive effects that video games can have on student performance and attitude towards learning. In the past few years, strategies have been generated to optimize the use of technological resources with the aim of facilitating widespread adoption of technology in the classroom. Given its low acquisition and maintenance costs, the interpersonal computer allows individual interaction and simultaneous learning with large groups of students. The purpose of this work was to compare arithmetical knowledge acquired by third-grade students through the use of game-based activities and non-game-based activities using an interpersonal computer, with knowledge acquired through the use of traditional paper-and-pencil activities, and to analyze their impact in various socio-cultural contexts. To do this, a quasi-experimental study was conducted with 271 students in three different countries (Brazil, Chile, and Costa Rica), in both rural and urban schools. A set of educational games for practising arithmetic was developed and tested in six schools within these three countries. Results show that there were no significant differences (ANCOVA) in the learning acquired from game-based vs. non-game-based activities. However, both showed a significant difference when compared with the traditional method. Additionally, both groups using the interpersonal computer showed higher levels of student interest than the traditional method group, and these technological methods were seen to be especially effective in increasing learning among weaker students.
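The ANCOVA comparison mentioned above can be reproduced in outline as follows; the file name, column names, and condition labels are assumptions for illustration, not the study's actual data.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical layout: one row per student, with a pre-test covariate, a
# post-test score, and the condition the student was assigned to
# ('game', 'nongame', or 'traditional').
df = pd.read_csv("arithmetic_scores.csv")  # assumed file name

# ANCOVA: post-test score by condition, controlling for pre-test score.
model = smf.ols("posttest ~ pretest + C(condition)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))  # F-test for the condition effect
```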
Abstract:
Where the creation, understanding, and assessment of software testing and regression testing techniques are concerned, controlled experimentation is an indispensable research methodology. Obtaining the infrastructure necessary to support such experimentation, however, is difficult and expensive. As a result, progress in experimentation with testing techniques has been slow, and empirical data on the costs and effectiveness of techniques remains relatively scarce. To help address this problem, we have been designing and constructing infrastructure to support controlled experimentation with testing and regression testing techniques. This paper reports on the challenges faced by researchers experimenting with testing techniques, including those that inform the design of our infrastructure. The paper then describes the infrastructure that we are creating in response to these challenges, and that we are now making available to other researchers, and discusses the impact that this infrastructure has and can be expected to have.
Abstract:
Spreadsheets are widely used but often contain faults. Thus, in prior work we presented a data-flow testing methodology for use with spreadsheets, which studies have shown can be used cost-effectively by end-user programmers. To date, however, the methodology has been investigated across a limited set of spreadsheet language features. Commercial spreadsheet environments are multiparadigm languages, utilizing features not accommodated by our prior approaches. In addition, most spreadsheets contain large numbers of replicated formulas that severely limit the efficiency of data-flow testing approaches. We show how to handle these two issues with a new data-flow adequacy criterion and automated detection of areas of replicated formulas, and report results of a controlled experiment investigating the feasibility of our approach.
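To make the data-flow idea concrete, here is a minimal sketch of extracting definition-use pairs from spreadsheet formulas and computing a du-adequacy score; the simple A1-style reference syntax and the function names are simplifying assumptions, not the authors' actual criterion.

```python
import re

CELL_REF = re.compile(r"\b([A-Z]+[0-9]+)\b")

def du_pairs(formulas):
    """Each formula cell defines its own cell and uses every cell it references.

    formulas: dict mapping cell name -> formula string, e.g. {"C1": "=A1+B1"}.
    Returns the set of (definition, use-site) pairs.
    """
    pairs = set()
    for cell, formula in formulas.items():
        for used in CELL_REF.findall(formula):
            pairs.add((used, cell))  # value defined in `used` flows into `cell`
    return pairs

def du_adequacy(all_pairs, exercised_pairs):
    """Fraction of du-pairs exercised by the test inputs applied so far."""
    return len(all_pairs & exercised_pairs) / len(all_pairs) if all_pairs else 1.0

# Example: a tiny sheet with two inputs and two formulas.
sheet = {"C1": "=A1+B1", "D1": "=C1*2"}
print(du_pairs(sheet))  # {('A1', 'C1'), ('B1', 'C1'), ('C1', 'D1')}
```

Replicated formula regions, which the abstract identifies as an efficiency problem, would be collapsed so that one representative formula stands in for the whole region rather than contributing one du-pair per copy.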
Abstract:
This paper discusses the power allocation problem with a fixed rate constraint in multi-carrier code division multiple access (MC-CDMA) networks, which has been approached from a game-theoretic perspective through an iterative water-filling algorithm (IWFA). The problem is analyzed under various interference density configurations, and its reliability is studied in terms of the existence and uniqueness of a solution. Moreover, numerical results reveal a shortcoming of the approach, so a new method combining swarm intelligence and IWFA is proposed to make game-theoretic approaches practicable in realistic MC-CDMA system scenarios. The contribution of this paper is twofold: (i) a complete analysis of the existence and uniqueness of the game solution, from simple to more realistic and complex interference scenarios; (ii) a hybrid power allocation optimization method combining swarm intelligence, game theory, and IWFA. To corroborate the effectiveness of the proposed method, an outage probability analysis in realistic interference scenarios and a complexity comparison with the classical IWFA are presented. (C) 2011 Elsevier B.V. All rights reserved.
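For reference, here is a minimal sketch of the single-user water-filling step that IWFA iterates, assuming a total power budget and per-subcarrier noise-plus-interference levels; the variable names and bisection tolerance are illustrative, and the rate constraint and multi-user coupling of the actual problem are omitted.

```python
import numpy as np

def water_fill(noise_plus_interference, total_power, tol=1e-9):
    """Allocate power p_i = max(0, mu - n_i) so that sum(p_i) == total_power."""
    n = np.asarray(noise_plus_interference, dtype=float)
    lo, hi = n.min(), n.max() + total_power   # the water level mu lies in here
    while hi - lo > tol:
        mu = 0.5 * (lo + hi)                  # candidate water level
        if np.maximum(mu - n, 0.0).sum() > total_power:
            hi = mu
        else:
            lo = mu
    return np.maximum(0.5 * (lo + hi) - n, 0.0)

# In IWFA, each user repeats this step against the interference produced by
# the other users' current allocations until the allocations stop changing.
p = water_fill([0.1, 0.4, 0.8, 0.2], total_power=1.0)
print(p, p.sum())  # power concentrates on the cleanest subcarriers
```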
Abstract:
This study explores educational technology and management education by analyzing fidelity in game-based management education interventions. A sample of 31 MBA students was selected to help answer the research question: To what extent do MBA students tend to recognize specific game-based academic experiences, in terms of fidelity, as relevant to their managerial performance? Two distinct game-based interventions (BG1 and BG2) with key differences in fidelity levels were explored: BG1 presented higher physical and functional fidelity levels and lower psychological fidelity levels. Hypotheses were tested with data from the participants, collected shortly after their experiences, on the overall perceived quality of the game-based interventions. The findings reveal a higher overall perception of quality for BG1: (a) better for testing strategies, (b) offering better business and market models, (c) based on a pace that better stimulates learning, and (d) presenting a fidelity level that better supports real-world performance. This study supports the conclusion that MBA students tend to recognize, to a large extent, that specific game-based academic experiences are relevant and meaningful to their managerial development, mostly when the adopted artifacts have heightened fidelity levels. Learners must be ready and motivated to explore the new, to try and err, and to learn collaboratively in order to perform.
Abstract:
The main problem with cone-beam computed tomography (CT) systems for industrial applications employing 450 kV X-ray tubes is the high amount of scattered radiation added to the primary radiation (signal). This stray radiation leads to a significant degradation of image quality. A better understanding of the scattering, and methods to reduce its effects, are therefore necessary to improve image quality. Several studies have been carried out in the medical field at lower energies, whereas studies in industrial CT, especially for energies up to 450 kV, are lacking. Moreover, the studies reported in the literature do not consider the scattered radiation generated by the CT system structure and the walls of the X-ray room (environmental scatter). To investigate the scattering in CT projections, a GEANT4-based Monte Carlo (MC) model was developed. The model, which has been validated against experimental data, has enabled the calculation of the scattering including the environmental scatter, the optimization of an anti-scatter grid suitable for the CT system, and the optimization of the hardware components of the CT system. The investigation of multiple scattering in the CT projections showed that its contribution is 2.3 times that of the primary radiation for certain objects. The results for the environmental scatter showed that it is the major component of the scattering for aluminum box objects with a frontal size of 70 × 70 mm², and that it strongly depends on the thickness of the object and therefore on the projection. For that reason, its correction is one of the key factors for achieving high-quality images. The anti-scatter grid optimized by means of the developed MC model was found to reduce the scatter-to-primary ratio in the reconstructed images by 20%. The object and environmental scatter calculated by means of the simulation were used to improve the scatter correction algorithm, which Empa was able to patent. The results showed that the cupping effect in the corrected image is strongly reduced. The developed CT simulation is a powerful tool to optimize the design of the CT system and to evaluate the contribution of the scattered radiation to the image. Besides, it has offered a basis for a new scatter correction approach by which it has been possible to achieve images with the same spatial resolution as state-of-the-art well-collimated fan-beam CT, with a gain of a factor of 10 in reconstruction time. This result has a high economic impact in non-destructive testing and evaluation, and in reverse engineering.
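As a sketch of the kind of projection-level correction described above (not Empa's patented algorithm), the MC-estimated scatter can simply be subtracted from each measured projection before reconstruction; the array names and the clipping choice are assumptions.

```python
import numpy as np

def scatter_to_primary_ratio(scatter, primary):
    """SPR of a projection: total scattered signal over total primary signal."""
    return scatter.sum() / primary.sum()

def correct_projection(measured, scatter_estimate):
    """Subtract the Monte Carlo scatter estimate (object plus environmental)
    from the measured projection; clip so intensities stay non-negative."""
    return np.clip(measured - scatter_estimate, 0.0, None)

# Illustrative two-pixel example: an SPR of 2.3 means scatter dominates primary.
primary = np.array([1.0, 1.0])
scatter = 2.3 * primary
measured = primary + scatter
print(scatter_to_primary_ratio(scatter, primary))  # 2.3
print(correct_projection(measured, scatter))       # recovers the primary signal
```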