942 results for Debugging in computer science
Abstract:
Computer games are significant since they embody our youngsters’ engagement with contemporary culture, including both play and education. These games rely heavily on visuals: systems of sign and expression based on concepts and principles of Art and Architecture. We are researching a new genre of computer games, ‘Educational Immersive Environments’ (EIEs), to provide educational materials suitable for the school classroom. Close collaboration with subject teachers is necessary, but we feel a specific need to engage with the practicing artist, the art theoretician and the historian. Our EIEs are loaded with multimedia (but especially visual) signs which act to direct the learner and provide the ‘game-play’ experience, forming semiotic systems. We propose the hypothesis that computer games are a space of deconstruction and reconstruction (DeRe): when players enter the game, their physical world and their culture are torn apart; they move in a semiotic system which serves to reconstruct an alternate reality where disbelief is suspended. The semiotic system draws heavily on visuals which direct the players’ interactions and produce motivating gameplay. These can establish a reconstructed culture and an emerging game narrative. We have recently tested our hypothesis and have used it in developing design principles for computer game designers. Yet there are outstanding issues concerning the nature of the visuals used in computer games, and so questions for contemporary artists. Currently, the computer game industry employs artists in a ‘classical’ role in the production of concept sketches, storyboards and 3D content. But this is based on a specification from the client, which restricts the artist’s intellectual freedom. Our DeRe hypothesis places the artist at the generative centre, to inform the game designer how art may inform our DeRe semiotic spaces. This must of course begin with the artist’s understanding of DeRe at a time when our ‘identities are becoming increasingly fractured, networked, virtualized and distributed’. We hope to persuade artists to engage with the medium of computer game technology to explore these issues. In particular, we pose several questions to the artist: (i) How can particular ‘periods’ in art history be used to inform the design of computer games? (ii) How can specific artistic elements or devices be used to design ‘signs’ that guide the player through the game? (iii) How can visual material be integrated with other semiotic strata such as text and audio?
Abstract:
We present an Integrated Environment suitable for learning and teaching computer programming, designed both for students on specialised Computer Science courses and for non-specialist students, such as those following Liberal Arts. The environment is rich enough to allow exploration of concepts from robotics, artificial intelligence, social science, and philosophy, as well as the specialist areas of operating systems and the various computer programming paradigms.
Abstract:
This longitudinal study focuses on the success of CEGEP science students at one college who were accepted into the science program although their secondary school grades in chemistry and/or physics did not meet the admission requirements. These less prepared students were admitted into the science program because they were placed in remedial classes that offered support through extra class time in their introductory college science courses. The main research question addressed in this study was whether accepting less prepared students is beneficial to the student in terms of academic success.
Abstract:
A dissertation submitted in fulfillment of the requirements for the degree of Master in Computer Science and Computer Engineering
The role of musical aptitude in the pronunciation of English vowels among Polish learners of English
Abstract:
It has long been held that people who have musical training or talent acquire L2 pronunciation more successfully than those who do not. Indeed, there have been empirical studies to support this hypothesis (Pastuszek-Lipińska 2003, Fonseca-Mora et al. 2011, Zatorre and Baum 2012). However, in many such studies, musical abilities in subjects were mostly verified through questionnaires rather than tested in a reliable, empirical manner. Therefore, we ran three different musical hearing tests, i.e. a pitch perception test, a musical memory test, and a rhythm perception test (Mandell 2009), to measure the actual musical aptitude of our subjects. The main research question is whether a better musical ear correlates with a higher rate of acquisition of English vowels in Polish EFL learners. Our group consists of 40 Polish university students studying English as their major who learn the British pronunciation model during an intensive pronunciation course. 10 male and 30 female subjects with a mean age of 20.1 were recorded in a recording studio. The procedure comprised spontaneous conversations, reading passages and reading words in isolation. Vowel measurements were conducted in Praat in all three speech styles and several consonantal contexts. The assumption was that participants who performed better in the musical tests would produce vowels closer to the Southern British English model. We plotted the measurements onto vowel charts and calculated the Euclidean distances. Preliminary results show a potential correlation between specific aspects of musical hearing and different elements of pronunciation. The study is a longitudinal project and will encompass two more years, during which we will repeat the recording procedure twice to measure the participants’ progress in mastering English pronunciation and compare it with their musical aptitude.
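As an illustration of the distance computation mentioned above, the following is a minimal sketch, assuming each vowel token is summarised by its F1/F2 formant values measured in Praat and compared against hypothetical Southern British English reference formants (placeholder numbers, not the study's data).

```python
# Illustrative sketch (not the authors' exact procedure): Euclidean distance
# between a learner's measured vowel formants and a Southern British English
# reference vowel in the F1/F2 plane. Reference values are placeholders.
import math

SSBE_REFERENCE = {            # hypothetical reference formants in Hz
    "FLEECE": (280, 2250),    # (F1, F2)
    "TRAP":   (800, 1650),
    "LOT":    (560, 920),
}

def vowel_distance(f1: float, f2: float, target: str) -> float:
    """Euclidean distance from a measured (F1, F2) point to the reference vowel."""
    ref_f1, ref_f2 = SSBE_REFERENCE[target]
    return math.hypot(f1 - ref_f1, f2 - ref_f2)

# Example: a learner's TRAP token measured at F1 = 700 Hz, F2 = 1500 Hz.
print(vowel_distance(700, 1500, "TRAP"))
```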
Abstract:
"TID-26500/R1. Distribution category: UC-2."
Abstract:
The SimProgramming teaching approach aims to help students overcome their learning difficulties in the transition from entry-level to advanced computer programming and to prepare them for real-world labour environments by adopting learning strategies. It immerses learners in a businesslike learning environment, where students develop a problem-based learning activity with a specific set of tasks, one of which is filling in weekly individual forms. We conducted a thematic analysis of 401 weekly forms to identify the students’ strategies for self-regulation of learning during the assignment. The students adopt different strategies in each phase of the approach: the early phases are devoted to organization and planning, while later phases focus on applying theoretical knowledge and hands-on programming. Based on the results, we recommend the development of educational practices that help students reflect on their performance during tasks.
Abstract:
Growing models have been widely used for clustering or topology learning. Traditionally, these models work in stationary environments, grow incrementally and adapt their nodes to a given distribution based on global parameters. In this paper, we present an enhanced unsupervised self-organising network for the modelling of visual objects. We first develop a framework for building non-rigid shapes using the growth mechanism of self-organising maps, and then we define an optimal number of nodes, avoiding overfitting or underfitting of the network, based on information-theoretic considerations. We present experimental results for hands and quantitatively evaluate the matching capabilities of the proposed method with the topographic product.
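To make the growth mechanism concrete, here is a minimal sketch of the generic adapt-and-grow step used by growing self-organising networks (in the spirit of growing neural gas); it is not the paper's shape-modelling network, and its information-theoretic criterion for choosing the optimal number of nodes is not reproduced.

```python
# Minimal sketch of a growing self-organising network's adaptation and
# growth steps on a toy 2-D distribution; node insertion here is random
# rather than driven by the paper's information-theoretic criterion.
import numpy as np

rng = np.random.default_rng(0)
nodes = rng.random((2, 2))            # start with two random 2-D nodes

def adapt(nodes, x, eps_winner=0.2, eps_neighbour=0.02):
    """Move the best-matching node (and, crudely, all other nodes as
    'neighbours') towards the input sample x."""
    dists = np.linalg.norm(nodes - x, axis=1)
    winner = np.argmin(dists)
    nodes[winner] += eps_winner * (x - nodes[winner])
    others = np.arange(len(nodes)) != winner
    nodes[others] += eps_neighbour * (x - nodes[others])
    return nodes

def grow(nodes, x):
    """Insert a new node halfway between the sample and the current winner."""
    winner = np.argmin(np.linalg.norm(nodes - x, axis=1))
    return np.vstack([nodes, (nodes[winner] + x) / 2.0])

for x in rng.random((200, 2)):        # toy input distribution
    nodes = adapt(nodes, x)
    if len(nodes) < 20 and rng.random() < 0.05:
        nodes = grow(nodes, x)
print(len(nodes), "nodes after training")
```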
Abstract:
Significant advances in science should be directed towards addressing the needs of society and the historical context of the territories. Although the technological developments that began with modernity and the industrial revolution allowed human beings to control the resources of nature and put them at their service without limits, it is clear that the crisis of the prevailing development models manifests itself in many ways, with three common denominators: environmental degradation, social injustice and extreme poverty. Consequently, today it should not be possible to conceive of a breakthrough in the development of science without addressing global environmental problems and the deep social injustices that grow at all scales under the often impassive gaze of formal science.
MINING AND VERIFICATION OF TEMPORAL EVENTS WITH APPLICATIONS IN COMPUTER MICRO-ARCHITECTURE RESEARCH
Abstract:
Computer simulation programs are essential tools for scientists and engineers to understand a particular system of interest. As expected, the complexity of the software increases with the depth of the model used. In addition to the exigent demands of software engineering, verification of simulation programs is especially challenging because the models represented are complex and riddled with unknowns that are discovered by developers in an iterative process. To manage such complexity, advanced verification techniques for continually matching the intended model to the implemented model are necessary. Therefore, the main goal of this research work is to design a useful verification and validation framework that is able to identify model representation errors and is applicable to generic simulators. The framework that was developed and implemented consists of two parts. The first part is the First-Order Logic Constraint Specification Language (FOLCSL), which enables users to specify the invariants of a model under consideration. From the first-order logic specification, the FOLCSL translator automatically synthesizes a verification program that reads the event trace generated by a simulator and signals whether all invariants are respected. The second part consists of mining the temporal flow of events using a newly developed representation called the State Flow Temporal Analysis Graph (SFTAG). While the first part seeks an assurance of implementation correctness by checking that the model invariants hold, the second part derives an extended model of the implementation and hence enables a deeper understanding of what was implemented. The main application studied in this work is the validation of the timing behavior of micro-architecture simulators. The study includes SFTAGs generated for a wide set of benchmark programs and their analysis using several artificial intelligence algorithms. This work improves the computer architecture research and verification processes, as shown by the case studies and experiments that have been conducted.
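Since the abstract does not show FOLCSL's concrete syntax or its generated verifiers, the following is only an illustrative sketch of the underlying idea: checking a hand-written invariant over a hypothetical simulator event trace ("every issued instruction is eventually committed or squashed").

```python
# Illustrative sketch only: FOLCSL's syntax and translator are not shown in
# the abstract, so this checks a hand-written invariant over a hypothetical
# event trace rather than a synthesized one.
from typing import Iterable, NamedTuple

class Event(NamedTuple):
    cycle: int
    kind: str      # e.g. "issue", "commit", "squash"
    insn: int      # instruction id

def invariant_holds(trace: Iterable[Event]) -> bool:
    """Every issued instruction must later be committed or squashed."""
    pending = set()
    for ev in trace:
        if ev.kind == "issue":
            pending.add(ev.insn)
        elif ev.kind in ("commit", "squash"):
            pending.discard(ev.insn)
    return not pending          # nothing issued remains unresolved

trace = [Event(1, "issue", 7), Event(5, "commit", 7), Event(6, "issue", 8)]
print(invariant_holds(trace))   # False: instruction 8 never resolves
```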
Abstract:
The ontology engineering research community has focused for many years on supporting the creation, development and evolution of ontologies. Ontology forecasting, which aims at predicting semantic changes in an ontology, instead represents a new challenge. In this paper, we contribute to this novel endeavour by focusing on the task of forecasting semantic concepts in the research domain. Indeed, ontologies representing scientific disciplines contain only research topics that are already popular enough to be selected by human experts or automatic algorithms. They are thus unfit to support tasks which require the ability to describe and explore the forefront of research, such as trend detection and horizon scanning. We address this issue by introducing the Semantic Innovation Forecast (SIF) model, which predicts new concepts of an ontology at time t + 1, using only data available at time t. Our approach relies on lexical innovation and adoption information extracted from historical data. We evaluated the SIF model on a very large dataset consisting of over one million scientific papers belonging to the Computer Science domain: the outcomes show that the proposed approach offers a competitive boost in mean average precision-at-ten compared to the baselines when forecasting over 5 years.
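For reference, the reported metric, mean average precision-at-ten (MAP@10), is typically computed as sketched below; the forecasted and gold concepts here are toy placeholders, not data from the paper.

```python
# Sketch of mean average precision-at-ten (MAP@10) over ranked concept
# forecasts; the forecasts and gold sets below are toy placeholders.
def average_precision_at_k(ranked, relevant, k=10):
    hits, score = 0, 0.0
    for i, concept in enumerate(ranked[:k], start=1):
        if concept in relevant:
            hits += 1
            score += hits / i          # precision at this rank
    return score / min(len(relevant), k) if relevant else 0.0

forecasts = [["deep learning", "cloud robotics", "edge ai"],
             ["quantum ml", "federated learning"]]
gold = [{"deep learning", "edge ai"}, {"federated learning"}]

map_at_10 = sum(average_precision_at_k(r, g)
                for r, g in zip(forecasts, gold)) / len(gold)
print(round(map_at_10, 3))
```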
Abstract:
Prolonged high-intensity training seems to result in increased systemic inflammation, which might explain muscle injury, delayed onset muscle soreness, and overtraining syndrome in athletes. Furthermore, an impaired immune function caused by strenuous exercise leads to the development of upper respiratory tract infections in athletes. Nutraceuticals might help counteract these performance-lowering effects. The use of nanotechnology is an interesting alternative for supplying athletes with nutraceuticals, as many of these substances are insoluble in water and are poorly absorbed in the digestive tract. The present chapter starts with a brief review of the effects of exercise on immunity, followed by an analysis of how nutraceuticals such as omega-3 fatty acids, glutamine, BCAAs, or phytochemicals can counteract the negative effects of strenuous exercise in athletes. Finally, how nanostructured delivery systems can constitute a new trend in enhancing bioavailability and optimizing the action of nutraceuticals will be discussed, using the example of food beverages.
Abstract:
Resuscitation and stabilization are key issues in Intensive Care Burn Units, and early survival predictions help to decide the best clinical action during these phases. Current burn survival scores focus on clinical variables such as age or the body surface area. However, the evolution of other parameters (e.g. diuresis or fluid balance) during the first days is also valuable knowledge. In this work we propose a methodology and a Temporal Data Mining algorithm to estimate the survival condition from the patient’s evolution. Experiments conducted on 480 patients show improved survival prediction.
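The abstract does not detail the algorithm, so the following is a hypothetical sketch of one way to turn the early evolution of a parameter (e.g. daily fluid balance) into temporal features for a survival classifier; it is not the authors' Temporal Data Mining method.

```python
# Hypothetical sketch, not the paper's algorithm: summarise each patient's
# first-days evolution as a slope and a mean, then fit a simple classifier
# on those temporal features. Data below are toy values.
import numpy as np
from sklearn.linear_model import LogisticRegression

def temporal_features(daily_values):
    """Slope and mean of a parameter over the first days of the ICU stay."""
    days = np.arange(len(daily_values))
    slope = np.polyfit(days, daily_values, 1)[0]
    return [slope, float(np.mean(daily_values))]

# Toy data: three patients' fluid balance over the first 4 days, and survival.
series = [[2.1, 1.8, 1.2, 0.9], [3.5, 3.9, 4.2, 4.8], [1.0, 1.1, 0.8, 0.7]]
survived = [1, 0, 1]

X = np.array([temporal_features(s) for s in series])
model = LogisticRegression().fit(X, survived)
print(model.predict(X))
```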