832 results for Computer- and videogames


Relevance:

90.00%

Publisher:

Abstract:

Most approaches to stereo visual odometry reconstruct the motion based on the tracking of point features along a sequence of images. However, in low-textured scenes it is often difficult to find a large set of point features, or they may be poorly distributed over the image, so that the behavior of these algorithms deteriorates. This paper proposes a probabilistic approach to stereo visual odometry based on the combination of both point and line segment features that works robustly in a wide variety of scenarios. The camera motion is recovered through non-linear minimization of the projection errors of both point and line segment features. In order to combine both types of features effectively, their associated errors are weighted according to their covariance matrices, computed by propagating Gaussian error distributions from the sensor measurements. The method is, of course, computationally more expensive than using only one type of feature, but it can still run in real time on a standard computer and offers notable advantages, including straightforward integration into any probabilistic framework commonly employed in mobile robotics.
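As a rough illustration of the covariance-weighted combination of point and line-segment residuals described above, the sketch below refines a planar pose by whitening each residual with the Cholesky factor of its covariance before non-linear least-squares minimization. The simplified 2D projection model and all names are illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch (not the paper's method): covariance-weighted combination of
# point and line-segment reprojection residuals in a non-linear least-squares
# pose refinement. The planar (SE(2)) model is an illustrative simplification.
import numpy as np
from scipy.optimize import least_squares
from scipy.linalg import cholesky

def se2_project(pose, pts):
    """Transform 2D world points into the frame of a planar pose (x, y, theta)."""
    x, y, th = pose
    c, s = np.cos(th), np.sin(th)
    R = np.array([[c, -s], [s, c]])
    return (pts - np.array([x, y])) @ R  # world -> camera frame

def whiten(residual, cov):
    """Scale a residual by the inverse Cholesky factor of its covariance."""
    L = cholesky(cov, lower=True)
    return np.linalg.solve(L, residual)

def residuals(pose, pt_obs, pt_world, pt_covs, ln_obs, ln_world, ln_covs):
    res = []
    # Point features: observed vs. predicted positions, covariance-weighted.
    for obs, wld, cov in zip(pt_obs, pt_world, pt_covs):
        res.append(whiten(se2_project(pose, wld[None])[0] - obs, cov))
    # Line-segment features: residual on both endpoints, covariance-weighted.
    for obs, wld, cov in zip(ln_obs, ln_world, ln_covs):
        res.append(whiten(se2_project(pose, wld).ravel() - obs.ravel(), cov))
    return np.concatenate(res)

# Toy usage: refine a pose from two point features and one line segment.
pt_world = np.array([[2.0, 1.0], [3.0, -1.0]])
ln_world = [np.array([[1.0, 0.0], [1.0, 2.0]])]
true_pose = np.array([0.5, -0.2, 0.1])
pt_obs = se2_project(true_pose, pt_world)
ln_obs = [se2_project(true_pose, ln_world[0])]
pt_covs = [0.01 * np.eye(2)] * 2     # point measurement noise
ln_covs = [0.04 * np.eye(4)]         # line endpoints assumed noisier here
sol = least_squares(residuals, x0=np.zeros(3),
                    args=(pt_obs, pt_world, pt_covs, ln_obs, ln_world, ln_covs))
print("estimated pose:", sol.x)
```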

Relevance:

90.00%

Publisher:

Abstract:

Automation means replacing human labor in given job functions with a combination of people and automatic machines; that is, documentation specialists trained in computing, together with the computers themselves, are the cornerstone of any modern documentation and information system. From this point of view, the problem immediately arises of deciding which resources should be applied to solve the specific problem in each particular case. We do not intend to offer quick fixes or recipes for deciding what to do in every situation; the solution must be worked out afresh for each particular problem. What we do want is to set out some points that can serve as a basis for reflection and help find the best possible solution once the problem has been correctly defined. The first thing to do before starting any automation project is to define exactly the domain it is meant to cover and to assess its importance as precisely as possible.

Relevance:

90.00%

Publisher:

Abstract:

My dissertation defends a positive answer to the question: “Can a videogame be a work of art?” To achieve this goal I develop definitions of several concepts, primarily ‘art’, ‘games’, and ‘videogames’, and offer arguments about the compatibility of these notions. In Part One, I defend a definition of art from amongst several contemporary and historical accounts. This definition, the Intentional-Historical account, requires, among other things, that an artwork have the right kind of creative intentions behind it, in short, that the work be intended to be regarded in a particular manner. This is a leading account that has faced several recent objections that I address, particularly the buck-passing theory, the objection against non-failure theories of art, and the simultaneous creation response to the ur-art problem, while arguing that it is superior to other theories in its ability to answer the question of videogames’ art status. Part Two examines whether games can exhibit the art-making kind of creative intention. Recent literature has suggested that they can. To verify this, a definition of games is needed. I review and develop the most promising account of games in the literature, the overlooked account from Bernard Suits. I propose and defend a modified version of this definition against other accounts. Interestingly, this account entails that games cannot be successfully intended to be works of art, because games are goal-directed activities that require a voluntary selection of inefficient means, and this is incompatible with the proper manner of regarding that is necessary for something to be an artwork. While the conclusions of Part One and Part Two may appear to suggest that videogames cannot be works of art, Part Three proposes and defends a new account of videogames that, contrary to first appearances, implies that not all videogames are games. This Intentional-Historical Formalist account allows for non-game videogames to be created with an art-making intention, though not every non-ludic videogame will have an art-making intention behind it. I then discuss examples of videogames that are good candidates for being works of art. I conclude that a videogame can be a work of art, but that not all videogames are works of art. The thesis is significant in several respects. It is a continuation of academic work that has focused on the definition and art status of videogames. It clarifies the current debate and provides a positive account of the central issues that has so far been lacking. It also defines videogames in a way that corresponds better with the actual practice of videogame making and playing than other definitions in the literature. It offers further evidence in defense of certain theories of art over others, providing a close examination of videogames as a new case study for potential art objects and for aesthetic and artistic theory in general. Finally, it provides a compelling answer to the question of whether videogames can be art. This project also provides the groundwork for new evaluative, critical, and appreciative tools for engagement with videogames as they develop as a medium. As videogames mature, more people, both inside and outside academia, have an increasing interest in what they are and how to understand them. One place many have looked is to the practice of art appreciation. My project helps make sense of which appreciative and art-critical tools and methods are applicable to videogames.

Relevance:

90.00%

Publisher:

Abstract:

Background: The prevalence of obesity is increasing among Iranian youngsters, as in other developing countries. Objectives: This study was conducted to assess regional disparities in sedentary behaviors and meal frequency in Iranian adolescents. Patients and Methods: In this national survey, 5682 students aged 10 - 18 years from urban and rural districts of 27 provinces of Iran were selected via a stratified multi-stage sampling method. The country was classified into four sub-national regions based on a combination of geography and socioeconomic status (SES). Mean meal frequency and physical activity levels, as well as the prevalence of meal skipping and sedentary behavior, were compared across regions with different SES after stratifying by sex and age group. Results: Meal frequency in lower socioeconomic regions was significantly higher than in two other regions in the 10 - 13 and 10 - 18 year old groups (P trend < 0.001). However, the mean number of hours spent working with a computer increased linearly with regional SES (P trend < 0.001), whereas the corresponding trend for mean TV watching was not significant (P trend > 0.05). The frequency of adolescents skipping meals was higher in higher-SES regions, especially in West Iran (P < 0.001), in the 10 - 13 year old age group. Owning a personal computer and using it for more than two hours per day was observed mainly in central Iran, which ranked as the highest SES group. Conclusions: Efforts are required to ensure that Iranian youth adopt healthy eating habits and meet screen-time guidelines, including limiting access to screen technologies and encouraging parents to monitor their own screen time.

Relevance:

90.00%

Publisher:

Abstract:

Thanks to advanced technologies and social networks that allow data to be widely shared across the Internet, there is an explosion of pervasive multimedia data, generating high demand for multimedia services and applications in various areas that let people easily access and manage multimedia data. To meet these demands, multimedia big data analysis has become an emerging hot topic in both industry and academia, ranging from basic infrastructure, management, search, and mining to security, privacy, and applications. Within the scope of this dissertation, a multimedia big data analysis framework is proposed for semantic information management and retrieval, with a focus on rare event detection in videos. The proposed framework is able to explore hidden semantic feature groups in multimedia data and to incorporate temporal semantics, especially for video event detection. First, a hierarchical semantic data representation is presented to alleviate the semantic gap issue, and the Hidden Coherent Feature Group (HCFG) analysis method is proposed to capture the correlation between features and separate the original feature set into semantic groups, seamlessly integrating multimedia data in multiple modalities. Next, an Importance Factor based Temporal Multiple Correspondence Analysis (IF-TMCA) approach is presented for effective event detection. Specifically, the HCFG algorithm is integrated with the Hierarchical Information Gain Analysis (HIGA) method to generate the Importance Factor (IF) used to produce the initial detection results. Then, the TMCA algorithm is proposed to efficiently incorporate temporal semantics for re-ranking and improving the final performance. Finally, a sampling-based ensemble learning mechanism is applied to further accommodate imbalanced datasets. In addition to the multimedia semantic representation and class imbalance problems, lack of organization is another critical issue for multimedia big data analysis. In this framework, an affinity propagation-based summarization method is also proposed to transform unorganized data into a better structure with clean and well-organized information. The whole framework has been thoroughly evaluated across multiple domains, such as soccer goal event detection and disaster information management.
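As a hedged, minimal sketch of how an affinity propagation-based summarization step might look (the dissertation's own feature extraction and similarity choices are not reproduced here), the example below uses scikit-learn to select exemplar items from an unorganized collection of feature vectors; the exemplars then serve as the summary.

```python
# Illustrative sketch only: affinity propagation picks exemplar items that
# summarize an unorganized collection of feature vectors. The synthetic
# features stand in for per-item multimedia embeddings.
import numpy as np
from sklearn.cluster import AffinityPropagation

rng = np.random.default_rng(0)
features = np.vstack([
    rng.normal(loc=0.0, scale=0.5, size=(40, 16)),
    rng.normal(loc=3.0, scale=0.5, size=(40, 16)),
    rng.normal(loc=-3.0, scale=0.5, size=(40, 16)),
])

ap = AffinityPropagation(damping=0.9, random_state=0).fit(features)
exemplar_idx = ap.cluster_centers_indices_   # indices of representative items
labels = ap.labels_                          # cluster assignment per item

# The exemplars form the "summary": one representative item per cluster.
print(f"{len(exemplar_idx)} exemplars selected out of {len(features)} items")
for c, idx in enumerate(exemplar_idx):
    print(f"cluster {c}: exemplar item {idx}, {np.sum(labels == c)} members")
```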

Relevance:

90.00%

Publisher:

Abstract:

The purpose of this study was to identify the quality of life profile, overweight-obesity and sedentary behavior in a group of elementary and high school children in Guanacaste. 635 students participated in the study. The participants completed a protocol in which they were evaluated anthropometrically and filled out a questionnaire on sedentary behavior and quality of life. Overall, the findings showed a prevalence of overweight and obesity of 13.9%. The most important sedentary activities were, in descending order, small-screen time (watching TV, video games, computer) and certain social and cultural activities. The self-reported quality of life index was within acceptable limits but did not exceed 80 points on a scale of 1-100. There was no significant relationship between the overall quality of life index, overweight, obesity and some sedentary behaviors, although some anthropometric parameters, such as percentage of body fat and body weight, showed significant correlations with sedentary behavior and specific aspects of quality of life. The study provides valuable information to health authorities, directors of educational institutions and parents about key issues related to child development.

Relevance:

80.00%

Publisher:

Abstract:

John Frazer's architectural work is inspired by living and generative processes. Both evolutionary and revolutionary, it explores information ecologies and the dynamics of the spaces between objects. Fuelled by an interest in the cybernetic work of Gordon Pask and Norbert Wiener, and the possibilities of the computer and the "new science" it has facilitated, Frazer and his team of collaborators have conducted a series of experiments that utilize genetic algorithms, cellular automata, emergent behaviour, complexity and feedback loops to create a truly dynamic architecture. Frazer studied at the Architectural Association (AA) in London from 1963 to 1969, and later became unit master of Diploma Unit 11 there. He was subsequently Director of Computer-Aided Design at the University of Ulster - a post he held while writing An Evolutionary Architecture in 1995 - and a lecturer at the University of Cambridge. In 1983 he co-founded Autographics Software Ltd, which pioneered microprocessor graphics. Frazer was awarded a personal chair at the University of Ulster in 1984. In Frazer's hands, architecture becomes machine-readable, formally open-ended and responsive. His work as computer consultant to Cedric Price's Generator Project of 1976 (see p84) led to the development of a series of tools and processes; these have resulted in projects such as the Calbuild Kit (1985) and the Universal Constructor (1990). These subsequent computer-orientated architectural machines are makers of architectural form beyond the full control of the architect-programmer. Frazer makes much reference to the multi-celled relationships found in nature, and their ongoing morphosis in response to continually changing contextual criteria. He defines the elements that describe his evolutionary architectural model thus: "A genetic code script, rules for the development of the code, mapping of the code to a virtual model, the nature of the environment for the development of the model and, most importantly, the criteria for selection." In setting out these parameters for designing evolutionary architectures, Frazer goes beyond the usual notions of architectural beauty and aesthetics. Nevertheless his work is not without an aesthetic: some pieces are a frenzy of mad wire, while others have a modularity that is reminiscent of biological form. Algorithms form the basis of Frazer's designs. These algorithms determine a variety of formal results dependent on the nature of the information they are given. His work, therefore, is always dynamic, always evolving and always different. Designing with algorithms is also critical to other architects featured in this book, such as Marcos Novak (see p150). Frazer has made an unparalleled contribution to defining architectural possibilities for the twenty-first century, and remains an inspiration to architects seeking to create responsive environments. Architects were initially slow to pick up on the opportunities that the computer provides. These opportunities are both representational and spatial: computers can help architects draw buildings and, more importantly, they can help architects create varied spaces, both virtual and actual. Frazer's work was groundbreaking in this respect, and well before its time.

Relevance:

80.00%

Publisher:

Abstract:

We treat the security of group key exchange (GKE) in the universal composability (UC) framework. Analyzing GKE protocols in the UC framework naturally addresses attacks by malicious insiders. We define an ideal functionality for GKE that captures contributiveness in addition to other desired security goals. We show that an efficient two-round protocol securely realizes the proposed functionality in the random oracle model. As a result, we obtain the most efficient UC-secure contributory GKE protocol known.

Relevance:

80.00%

Publisher:

Abstract:

This paper proposes adolescence as a useful concept rather than a definitive one. It explores the notion of adolescence and its relevance to contemporary society and schooling. We reflect on the purposes behind the emergence of research into adolescence during the early 20th century, particularly the scientific and societal pressures that brought this field to prominence. Recent debate has started to problematise many of the early parameters used to define and bound understandings of adolescents and adolescent experience, as well as the rationale for some notionally tailored educational contexts. This paper provides an overview of this debate and argues for a reconsideration of some of the basic tenets of the definition. In particular, we discuss the cultural construction of adolescence in the light of our new globalised society. One possibility for thinking about contemporary adolescents is to consider them in terms of generational characteristics. What makes a new generation? Typically, members of a generation share an age, a set of experiences during their formative years, and a set of social and economic conditions. The adolescents of today fall into the group known collectively as the ‘Y Generation’, the ‘D (digital) Generation’, ‘Generation C’ (consumer) and the ‘Millennials’. Born after the mid-1980s, they are characterised as computer and internet competent multi-taskers with a global perspective. They respond best to visual language and are heavily influenced by the media. We consider these generational traits and how they impact teaching and learning.

Relevance:

80.00%

Publisher:

Abstract:

Monitoring Internet traffic is critical for acquiring a good understanding of threats to computer and network security and for designing efficient computer security systems. Researchers and network administrators have applied several approaches to monitoring traffic for malicious content. These techniques include monitoring network components, aggregating IDS alerts, and monitoring unused IP address spaces. Another method for monitoring and analyzing malicious traffic, which has been widely tried and accepted, is the use of honeypots. Honeypots are very valuable security resources for gathering artefacts associated with a variety of Internet attack activities. As honeypots run no production services, any contact with them is considered potentially malicious or suspicious by definition. This unique characteristic of the honeypot reduces the amount of collected traffic and makes it a more valuable source of information than other existing techniques. Currently, there is insufficient research in the field of honeypot data analysis. To date, most of the work on honeypots has been devoted to designing new honeypots or optimizing current ones. Approaches for analyzing data collected from honeypots, especially low-interaction honeypots, are presently immature, while analysis techniques are manual and focus mainly on identifying existing attacks. This research addresses the need for more advanced techniques for analyzing Internet traffic data collected from low-interaction honeypots. We believe that characterizing honeypot traffic will improve the security of networks and, if honeypot data is handled in time, give early signs of new vulnerabilities or outbreaks of new automated malicious code, such as worms. The outcomes of this research include:
• Identification of repeated use of attack tools and attack processes through grouping activities that exhibit similar packet inter-arrival time distributions using the cliquing algorithm;
• Application of principal component analysis to detect the structure of attackers’ activities present in low-interaction honeypots and to visualize attackers’ behaviors;
• Detection of new attacks in low-interaction honeypot traffic through the use of the principal components’ residual space and the square prediction error statistic;
• Real-time detection of new attacks using recursive principal component analysis;
• A proof-of-concept implementation for honeypot traffic analysis and real-time monitoring.
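The residual-space detection idea can be sketched as follows: project traffic features onto the retained principal subspace, measure the squared prediction error (SPE) of the leftover residual, and flag observations whose SPE exceeds a control limit. The feature construction, component count and limit below are illustrative assumptions rather than the thesis' actual settings.

```python
# Minimal sketch of PCA-based detection using the residual space and the
# square prediction error (SPE / Q statistic). Synthetic features stand in
# for per-flow measurements extracted from honeypot traffic.
import numpy as np

def fit_pca(X, n_components):
    """Fit PCA on baseline traffic features; return the mean and principal axes."""
    mu = X.mean(axis=0)
    cov = np.cov(X - mu, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)          # ascending eigenvalues
    order = np.argsort(eigvals)[::-1]
    P = eigvecs[:, order[:n_components]]            # retained principal subspace
    return mu, P

def spe(X, mu, P):
    """Squared prediction error: squared norm of the residual-space projection."""
    Xc = X - mu
    residual = Xc - (Xc @ P) @ P.T
    return np.sum(residual ** 2, axis=1)

rng = np.random.default_rng(1)
baseline = rng.normal(size=(500, 8))                # baseline window of features
mu, P = fit_pca(baseline, n_components=3)

# Simple empirical control limit; observations above it are flagged as
# candidate new attacks for further inspection.
limit = np.percentile(spe(baseline, mu, P), 99)
new_obs = rng.normal(size=(20, 8))
new_obs[5] += 6.0                                   # inject an anomalous observation
alarms = np.where(spe(new_obs, mu, P) > limit)[0]
print("flagged observations:", alarms)
```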

Relevance:

80.00%

Publisher:

Abstract:

A teaching and learning development project is currently under way at Queensland University of Technology to develop advanced-technology videotapes for use in the delivery of structural engineering courses. These tapes consist of integrated computer and laboratory simulations of important concepts and of the behaviour of structures and their components for a number of structural engineering subjects. They will be used as part of the regular lectures, and thus will not only improve the quality of lectures and the learning environment but will also be able to replace the ever-dwindling laboratory teaching in these subjects. The use of these videotapes, developed using advanced computer graphics, data visualization and video technologies, will enrich the learning process for the current, diverse engineering student body. This paper presents the details of this new method, the methodology used, and the results and evaluation in relation to one of the structural engineering subjects, steel structures.

Relevance:

80.00%

Publisher:

Abstract:

Defibrillator is a 16’41” musical work for solo performer, laptop computer and electric guitar. The electric guitar is processed in real time by a digital signal processing network in software, with gestural control provided by a foot-operated pedal board.

The work is informed by a range of ideas from the genres of electroacoustic music, western art music, popular music and cinematic sound. It seeks to fluidly cross and hybridise musical practices from these diverse sonic traditions and to develop a compositional language that draws upon multiple genres but at the same time resists being located within a singular genre. Musical structures and sonic markers which form genre are ruptured at strategic levels of the musical structure in order to allow a cross-flow of concepts between genres. The process of rupture is facilitated by the practical implementation of music and sound reception theories in the compositional process.

The piece exhibits the by-products of a composer born into a media-saturated environment, drawing on a range of musical and sonic traditions and actively seeking to explore the liminal space in between these traditions. The project stems from the author's research interests in locating points of connection between traditions of experimentation in diverse musical and sonic traditions arising from the broad uptake of media technologies in the early 20th century.

Relevance:

80.00%

Publisher:

Abstract:

A teaching and learning development project is currently under way at Queensland University of Technology to develop advanced-technology videotapes for use in the delivery of structural engineering courses. These tapes consist of integrated computer and laboratory simulations of important concepts and of the behaviour of structures and their components for a number of structural engineering subjects. They will be used as part of the regular lectures, and thus will not only improve the quality of lectures and the learning environment but will also be able to replace the ever-dwindling laboratory teaching in these subjects. The use of these videotapes, developed using advanced computer graphics, data visualization and video technologies, will enrich the learning process for the current, diverse engineering student body. This paper presents the details of this new method, the methodology used, and the results and evaluation in relation to one of the structural engineering subjects, steel structures.

Relevance:

80.00%

Publisher:

Abstract:

Streaming SIMD Extensions (SSE) is a unique feature embedded in the Pentium III and P4 classes of microprocessors. By fully exploiting SSE, parallel algorithms can be implemented on a standard personal computer and a theoretical speedup of four can be achieved. In this paper, we demonstrate the implementation of a parallel LU matrix decomposition algorithm for solving power system network equations with SSE and discuss the advantages and disadvantages of this approach.
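For orientation, the numpy sketch below shows the LU factorization and substitution steps applied to a small network system; the rank-1 update of the trailing submatrix is the data-parallel kernel that an SSE implementation would process four single-precision floats at a time. This is a plain illustration of the underlying algorithm, not the paper's SSE code.

```python
# Minimal sketch (not the paper's SSE implementation) of LU decomposition used
# to solve network equations A x = b. The inner rank-1 update is the
# SIMD-friendly loop that SSE would vectorize in single precision.
import numpy as np

def lu_decompose(A):
    """In-place Doolittle LU factorization without pivoting (A assumed suitable)."""
    A = A.astype(np.float32)
    n = A.shape[0]
    for k in range(n - 1):
        A[k + 1:, k] /= A[k, k]                       # column of multipliers (L)
        # Rank-1 update of the trailing submatrix: the data-parallel kernel.
        A[k + 1:, k + 1:] -= np.outer(A[k + 1:, k], A[k, k + 1:])
    return A                                          # L (strict lower) and U packed together

def lu_solve(LU, b):
    """Forward and back substitution using the packed LU factors."""
    n = LU.shape[0]
    y = b.astype(np.float32)
    for i in range(n):                                # forward: L y = b (unit diagonal)
        y[i] -= LU[i, :i] @ y[:i]
    x = y.copy()
    for i in range(n - 1, -1, -1):                    # backward: U x = y
        x[i] = (x[i] - LU[i, i + 1:] @ x[i + 1:]) / LU[i, i]
    return x

# Toy admittance-style system as a usage example.
A = np.array([[4.0, -1.0, 0.0], [-1.0, 4.0, -1.0], [0.0, -1.0, 4.0]])
b = np.array([1.0, 2.0, 3.0])
LU = lu_decompose(A)
print("x =", lu_solve(LU, b), "check:", np.linalg.solve(A, b))
```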