799 results for "Culture of sport games"
Abstract:
Purpose. To determine the usability of two video games designed to prevent type 2 diabetes and obesity among youth, through analysis of data collected during alpha-testing. Subjects. Ten children aged 9 to 12 were selected for three 2-hour alpha-testing sessions. Methods. "Escape from Diab" and "Nanoswarm" were designed to change dietary and physical inactivity behaviors, based on a theoretical framework of mediating variables drawn from social cognitive theory, self-determination theory, the elaboration likelihood model, and behavioral inoculation theory. Thirteen mini-games developed by the software company were divided into 3 groups based on completion date, and children tested 4-5 mini-games in each of three sessions. Observed game play was followed by a scripted interview. Results from observation forms and interview transcripts were tabulated and coded to determine usability. Suggestions for game modifications were delivered to the software design firm, and a follow-up table reports the rationale for inclusion or exclusion of each modification. Results. Half of the participants (50%) were frequent video game players and 20% were non-players; most (60%) were female. The mean grade given by children across all games (indicating likeability as a subset of usability) was significantly greater than the neutral grade of 80% (89%, p < 0.01), indicating a positive likeability score. On average, the games also received positive ratings for fun, helpfulness of instructions, and length compared with neutral values (the midpoints of the Likert scales) (all p < 0.01). Observation notes indicated that participants paid attention to the instructions, did not appear to have much difficulty with the games, and were "not frustrated", "not bored", "very engaged", "not fidgety" and "very calm" (all p < 0.01). The primary issues noted in observations and interviews were unclear instructions and an unclear purpose for some games. Player suggestions primarily involved ways to make on-screen cues more visible or noticeable, instructions clearer, and games more elaborate or difficult. Conclusions. The present study highlights the importance of alpha-testing video game components for usability prior to completion in order to enhance usability and likeability. Results indicate that creating clear instructions, making peripheral screen cues more eye-catching or noticeable, and explicitly stating the purpose of each game to improve understandability are important elements. However, future interventions will each present unique materials and user interfaces and should therefore also be thoroughly alpha-tested.
Abstract:
Researchers have long believed the concept of "excitement" in games to be subjective and difficult to measure. This paper presents the development of a mathematically computable index that measures this concept from the viewpoint of an audience. One of the key aspects of the index is the differential of the probability of "winning" before and after one specific "play" in a given game: if that play changes the probability of winning sharply, whether for better or worse, the audience will feel the game to be "exciting." The index makes a large contribution to the study of games and enables researchers to compare and analyze the "excitement" of various games. It may be applied to many fields, especially the area of welfare economics, ranging from allocative efficiency to axioms of justice and equity.
A Mathematical Representation of "Excitement" in Games: A Contribution to the Theory of Game Systems
Abstract:
Researchers have long believed the concept of "excitement" in games to be subjective and difficult to measure. This paper presents the development of a mathematically computable index that measures the concept from the viewpoint of an audience and from that of a player. One of the key aspects of the index is the differential of the probability of "winning" before and after one specific "play" in a given game. The index makes a large contribution to the study of games and enables researchers to compare and analyze the “excitement” of various games. It may be applied in many fields, especially the area of welfare economics, and applications may range from those related to allocative efficiency to axioms of justice and equity.
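The abstracts above describe the index only verbally; the exact definition is not given. As a rough illustration of the "differential of the probability of winning" idea, and not the authors' actual formulation, one plausible sketch is:

```latex
% Illustrative sketch only (assumed form, not the paper's definition):
% let p_t denote the probability, as estimated by the audience, that a
% given side wins, evaluated just before and just after play t.
\[
  \Delta p_t \;=\; p_t^{\text{after}} - p_t^{\text{before}},
  \qquad
  E \;=\; \sum_{t=1}^{n} \left| \Delta p_t \right|,
\]
% so, over an n-play game, plays that sharply raise or lower the winning
% probability contribute most to the aggregate "excitement" measure E.
```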
Abstract:
Sport started to gain relevance in Spain around the end of the nineteenth century and the beginning of the twentieth century, imported from Britain as a leisure and health option for the upper classes. Its early development was intertwined with the spread of other kinds of physical activity with a much longer tradition on the continent: gymnastics and physical education. First played by the ruling classes – the aristocracy and high bourgeoisie – sport spread to the petty bourgeoisie and middle classes in urban areas such as Madrid, Barcelona, San Sebastián and Santander. This pattern meant that the expansion of sport was unavoidably tied to the degree of industrialisation and cultural modernisation of the country. From 1910, and mainly during the 1920s, sport grew in popularity as a spectacle and, to a much lesser degree, as a practice among the Spanish population.
Abstract:
This paper presents a model that enables the integration of SCORM packages into web games. It builds on the fact that SCORM packages are designed to be integrated into Learning Management Systems and to communicate with them; in a similar way, they can also be integrated into web games. Applying this model links the Learning Objects inside the package to specific actions or conditions in the game, and the educational content is shown to players when they perform these actions or when the conditions are met: for example, when they need a special weapon, they must consume the Learning Object to obtain it. Based on this model, we have developed an open-source web platform whose main aim is to make it easier for teachers to create educational games. Teachers can select existing SCORM packages or upload their own, and then choose a game template into which the Learning Objects will be integrated. The resulting educational game is made available online. Details about the model and the developed platform are explained in this paper, and links to the platform and an example of a generated game are provided.
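The abstract does not include code, but the linkage it describes (Learning Objects gated behind in-game actions) can be sketched roughly as below. The names (GameEvent, ScormGameBridge, linkObjectToEvent) are hypothetical illustrations, not the platform's actual API; only the idea of a SCORM package exposing launchable Learning Objects is taken from the text.

```typescript
// Minimal sketch (hypothetical names, not the platform's real API) of the
// linkage the paper describes: a Learning Object from a SCORM package is
// bound to a game condition, and the player must complete it to obtain an
// in-game reward such as a special weapon.

type GameEvent = "needSpecialWeapon" | "enterBossRoom" | "levelComplete";

interface LearningObject {
  id: string;        // identifier of the SCO inside the SCORM package
  launchUrl: string; // entry point declared in imsmanifest.xml
}

interface Reward {
  grant(): void;
}

class ScormGameBridge {
  private links = new Map<GameEvent, { lo: LearningObject; reward: Reward }>();

  // Bind a Learning Object (and its reward) to a game event or condition.
  linkObjectToEvent(event: GameEvent, lo: LearningObject, reward: Reward): void {
    this.links.set(event, { lo, reward });
  }

  // Called by the game loop when an event fires. The Learning Object is
  // shown to the player; the reward is granted only after it is completed.
  async onGameEvent(event: GameEvent): Promise<void> {
    const link = this.links.get(event);
    if (!link) return;
    const completed = await this.presentLearningObject(link.lo);
    if (completed) link.reward.grant();
  }

  // Placeholder: a real platform would open the SCO in an iframe and listen
  // for its SCORM runtime completion status before resolving.
  private async presentLearningObject(lo: LearningObject): Promise<boolean> {
    console.log(`Launching Learning Object ${lo.id} from ${lo.launchUrl}`);
    return true; // assume the player completes the content
  }
}

// Usage: the "special weapon" example from the abstract.
const bridge = new ScormGameBridge();
bridge.linkObjectToEvent(
  "needSpecialWeapon",
  { id: "nutrition-basics", launchUrl: "content/sco1/index.html" },
  { grant: () => console.log("Special weapon unlocked") },
);
void bridge.onGameEvent("needSpecialWeapon");
```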
Abstract:
Comparison of kinematic and heart rate variables in two possessions in football
Abstract:
The aim of the present study was to assess the effects of game timeouts on basketball teams' offensive and defensive performances according to momentary differences in score and game period. The sample consisted of 144 timeouts registered during 18 basketball games randomly selected from the 2007 European Basketball Championship (Spain). For each timeout, five ball possessions were registered before (n = 493) and after the timeout (n = 475). The offensive and defensive efficiencies were registered across the first 35 min and last 5 min of games. A k-means cluster analysis classified the timeouts according to momentary score status as follows: losing (−10 to −3 points), balanced (−2 to 3 points), and winning (4 to 10 points). Repeated-measures analysis of variance identified statistically significant main effects between pre- and post-timeout offensive and defensive values. Chi-square analysis of game period identified a higher percentage of timeouts called during the last 5 min of a game compared with the first 35 min (64.9 ± 9.1% vs. 35.1 ± 10.3%; χ² = 5.4, P < 0.05). Results showed higher post-timeout offensive and defensive performances; no other effect or interaction was found for defensive performances. Offensive performances were better in the last 5 min of games, with the smallest differences in balanced situations and the greatest differences in winning situations. Results also showed an interaction between timeouts and momentary differences in score, with increased values in losing and balanced situations but decreased values in winning situations. Overall, the results suggest that coaches should examine offensive and defensive performances according to game period and differences in score when considering whether to call a timeout.
Abstract:
This study was designed to identify the injuries of professional women windsurfers in terms of their anatomical location, type of sport, the context in which they occurred, the type of injury, the time of inactivity resulting from the injury, the type of health care received, and the relationship between the number of injuries and the position in the final classification of professional windsurfing competitions. A retrospective questionnaire was given to 18 elite women windsurfers who took part in the World Cup competition held in Fuerteventura (2008). Women are injured more frequently during training than during competition (77.8% vs. 20.5%; p<0.05). In freestyle, women suffer leg injuries more often than men (83.3% vs. 14.3%; p<0.05). Serious injuries were more frequent for women (66.7% vs. 28.2%; p<0.05), and the time of inactivity due to injury was shorter for women (50% vs. 20.5%; p<0.05). These results indicate that female windsurfers are more liable to suffer injuries, generally serious ones, during training sessions, and that freestyle involves a greater risk of leg injuries for women. The knee is the area where most injuries occur for both men and women, followed by the legs.
Abstract:
Business Intelligence (BI) applications have gradually been ported to the Web in search of a global platform for the consumption and publication of data and services. On the Internet, apart from techniques for data/knowledge management, BI Web applications need interfaces with a high level of interoperability (similar to traditional desktop interfaces) for the visualisation of data/knowledge; in some cases, this has been provided by Rich Internet Applications (RIA). The development of these BI RIAs has traditionally been performed manually and, given the complexity of the final application, is a process that might be prone to errors. The application of model-driven engineering techniques can reduce the cost of development and maintenance (in terms of time and resources) of these applications, as has been demonstrated for other types of Web applications. In the light of these issues, this paper introduces the Sm4RIA-B methodology, a model-driven methodology for the development of RIAs as BI Web applications. In order to overcome the limitations of RIAs regarding knowledge management on the Web, the paper also presents a new RIA platform for BI, called RI@BI, which extends the functionalities of traditional RIAs by means of Semantic Web technologies and B2B techniques. Finally, we evaluate the whole approach on a case study: the development of a social network site for an enterprise project manager.
Abstract:
The thermal degradation of flexible polyurethane foam (FPUF) was studied under different conditions by thermogravimetric analysis (TG), thermogravimetric analysis coupled with infrared spectrometry (TG-IR) and thermogravimetric analysis coupled with mass spectrometry (TG-MS). For the kinetic study, dynamic and dynamic+isothermal runs were performed at different heating rates (5, 10 and 20 °C min−1) in three different atmospheres (N2, N2:O2 4:1 and N2:O2 9:1). Two reaction models were obtained, one for pyrolysis and another for combustion (N2:O2 4:1 and N2:O2 9:1), by simultaneously correlating the experimental data from the dynamic and dynamic+isothermal runs at the different heating rates. The pyrolytic model consisted of two consecutive reactions with activation energies of 142 and 217.5 kJ mol−1 and reaction orders of 0.805 and 1.246. To simulate the experimental data from the combustion runs, however, three consecutive reactions were required, with activation energies of 237.9, 103.5 and 120.1 kJ mol−1 and reaction orders of 2.003, 0.778 and 1.025. The TG-IR and TG-MS characterization showed that, under an inert atmosphere, the FPUF began decomposing by breaking the urethane bond, producing long ether chains that degraded immediately in the next step. Under an oxidative atmosphere, in contrast, the first step involved not only the breaking of urethane bonds but also the onset of degradation of some ether polyols, which finished in the second step and produced a char that degraded in the last stage.
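The abstract quotes activation energies and reaction orders but not the model equations. A common form for this kind of TG kinetic scheme, given here only as an illustrative sketch rather than the paper's exact formulation, is a set of consecutive nth-order Arrhenius reactions:

```latex
% Illustrative sketch only (assumed standard form; the paper's exact
% equations are not given in the abstract): each consecutive step i is
% modelled as an nth-order reaction with an Arrhenius rate constant,
\[
  \frac{d\alpha_i}{dt} \;=\; A_i \,
  \exp\!\left(-\frac{E_i}{RT}\right)\left(1-\alpha_i\right)^{\,n_i},
\]
% where \alpha_i is the conversion of step i, E_i the activation energy
% (e.g. 142 and 217.5 kJ mol^{-1} for the two pyrolysis steps), n_i the
% reaction order, A_i the pre-exponential factor, and T the temperature
% imposed by the dynamic or dynamic+isothermal heating programme.
```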
Abstract:
A twenty-year period of severe land subsidence in the Alto Guadalentín Basin (southeast Spain) is monitored using multi-sensor SAR images processed by advanced differential interferometric synthetic aperture radar (DInSAR) techniques. The SAR images used in this study consist of four datasets acquired by the ERS-1/2, ENVISAT, ALOS and COSMO-SkyMed satellites between 1992 and 2012. The integration of the ground surface displacement maps retrieved for the different time periods allows us to quantify up to 2.50 m of cumulative displacement between 1992 and 2012 in the Alto Guadalentín Basin. The DInSAR results were compared locally with global positioning system (GPS) data available for two continuous stations located in the study area, demonstrating the high consistency of the vertical motion measurements between the two surveying techniques. An average absolute error of 4.6 ± 4 mm for the ALOS data and of 4.8 ± 3.5 mm for the COSMO-SkyMed data confirmed the reliability of the analysis. The spatial analysis of the DInSAR ground surface displacement reveals a direct correlation with the thickness of the compressible alluvial deposits. The ground subsidence detected over the past 20 years is most likely a consequence of the 100–200 m groundwater level drop that has persisted since the 1970s owing to the overexploitation of the Alto Guadalentín aquifer system. The negative gradient of the pore pressure is responsible for the extremely slow consolidation of a very thick (>100 m) sequence of fine-grained silt and clay layers with low vertical hydraulic permeability (approximately 50 mm/h), in which the maximum settlement has still not been reached.
Abstract:
Multi-sensor advanced DInSAR analyses have been performed and compared with measurements from two GPS stations in order to evaluate the evolution of land subsidence over a 20-year period in the Alto Guadalentín Basin, where the highest rate of man-induced subsidence in Europe (>10 cm yr−1) had been detected. The controlling mechanisms have been examined by comparing the advanced DInSAR data with conditioning and triggering factors (i.e. the isobaths of the Plio-Quaternary deposits, soft soil thickness and piezometric level).
Abstract:
The Tertiary detritic aquifer of Madrid (TDAM), with an average thickness of 1500 m and a heterogeneous, anisotropic structure, supplies water to Madrid, the most populated city in Spain (3.2 million inhabitants in the metropolitan area). Despite this complex structure, a previous study focused on the north-northwest of Madrid city showed that the aquifer behaves quasi-elastically through extraction/recovery cycles, and that ground uplift during recovery periods compensates for most of the ground subsidence measured during the preceding extraction periods (Ezquerro et al., 2014). Therefore, the relationship between ground deformation and groundwater level through time can be simulated using simple elastic models. In this work, we model the temporal evolution of the piezometric level in 19 wells of the TDAM over the period 1997–2010. Using InSAR and piezometric time series spanning the study period, we first estimate the elastic storage coefficient (Ske) for every well. Both the Ske of each well and the average Ske of all wells are then used to predict hydraulic heads at the different well locations during the study period and are compared against the measured hydraulic heads, leading to very similar errors for the individual and the average Ske: 14% and 16% on average, respectively. This result suggests that an average Ske can be used to estimate piezometric level variations at all points where ground deformation has been measured by InSAR, thus allowing the production of piezometric level maps for the different extraction/recovery cycles in the TDAM.
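The abstract does not give the elastic model explicitly; the standard relation it alludes to treats vertical deformation as proportional to head change through the elastic storage coefficient. The sketch below is an illustration under that assumption, not the paper's exact formulation.

```latex
% Illustrative sketch (standard elastic aquifer-compaction relation, assumed
% here): for a given interval, the InSAR-measured vertical displacement
% \Delta d and the piezometric head change \Delta h are linked by the
% elastic storage coefficient S_{ke},
\[
  \Delta d \;=\; S_{ke}\,\Delta h
  \quad\Longrightarrow\quad
  \hat{S}_{ke} \;=\; \frac{\Delta d_{\mathrm{InSAR}}}{\Delta h_{\mathrm{piezometer}}},
  \qquad
  \widehat{\Delta h} \;=\; \frac{\Delta d_{\mathrm{InSAR}}}{\hat{S}_{ke}},
\]
% so a per-well or basin-average \hat{S}_{ke} fitted from paired time series
% can be inverted to estimate piezometric-level changes wherever InSAR
% deformation has been measured.
```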
Abstract:
Thermal decomposition of flexible polyurethane foam (FPUF) was studied under nitrogen and air atmospheres at 550 °C and 850 °C using a laboratory-scale reactor to analyse the evolved products. Ammonia, hydrogen cyanide and nitrile compounds were obtained in high yields in pyrolysis at the lower temperature, whereas at 850 °C polycyclic aromatic hydrocarbons (PAHs) and other semivolatile compounds, especially compounds containing nitrogen (benzonitrile, aniline, quinoline and indene), were the most abundant products. Different behaviour was observed in the evolution of polychlorodibenzo-p-dioxins and furans (PCDD/Fs) at 550 °C and 850 °C: at 550 °C the less chlorinated congeners, mainly PCDFs, were more abundant, whereas at 850 °C the most chlorinated PCDDs were dominant. In addition, the total yields of PCDD/Fs in the pyrolysis and combustion runs at 850 °C were low and quite similar.