910 results for Digital Elevation Model


Relevance:

30.00%

Publisher:

Abstract:

Digital human modelling (DHM) has today matured from research into industrial application. In the automotive domain, DHM has become a commonly used tool in virtual prototyping and human-centred product design. While the current generation of DHM supports the ergonomic evaluation of new vehicle designs during early design stages by modelling anthropometry, posture and motion or predicting discomfort, the future of DHM will be dominated by CAE methods, realistic 3D design, and musculoskeletal and soft tissue modelling down to the micro-scale of molecular activity within single muscle fibres. As a driving force for DHM development, the automotive industry has traditionally used human models in the manufacturing sector (production ergonomics, e.g. assembly) and the engineering sector (product ergonomics, e.g. safety, packaging). In product ergonomics applications, DHMs share many common characteristics, creating a unique subset of DHM. These models are optimised for a seated posture, interface with a vehicle seat through standardised methods and provide linkages to vehicle controls. As tools, they need to interface with other analytic instruments and integrate into complex CAD/CAE environments. Important aspects of current DHM research are functional analysis, model integration and task simulation. Digital (virtual, analytic) prototypes or digital mock-ups (DMU) provide expanded support for testing and verification, and consider task-dependent performance and motion. Beyond rigid body mechanics, soft tissue modelling is evolving to become standard in future DHM. When addressing advanced issues beyond the physical domain of anthropometry and biomechanics, modelling of human behaviours and skills is also integrated into DHM. The latest developments take a more comprehensive approach by implementing perceptual, cognitive and performance models, representing human behaviour on a non-physiological level. Through the integration of algorithms from the artificial intelligence domain, a vision of the virtual human is emerging.

Language use has proven to be the most complex and complicating of all Internet features, yet people and institutions invest enormously in language and cross-language features because they are fundamental to the success of the Internet's past, present and future. The thesis focuses on the development of the latter – features that facilitate and signify linking between or across languages – in both their historical and current contexts. In the theoretical analysis, the conceptual platform of inter-language linking is developed both to accommodate efforts towards a new social complexity model for the co-evolution of languages and language content, and to create an open analytical space for language and cross-language related features of the Internet and beyond. The practised uses of inter-language linking have changed over the last decades. Before and during the first years of the WWW, mechanisms of inter-language linking were at best important elements used to create new institutional or content arrangements, but on a large scale they were insignificant. This has changed with the emergence of the WWW and its development into a web in which content in different languages co-evolves. The thesis traces the inter-language linking mechanisms that facilitated these dynamic changes by analysing what these linking mechanisms are, how their historical as well as current contexts can be understood, and what kinds of cultural-economic innovation they enable and impede. The study discusses this alongside four empirical cases of bilingual or multilingual media use, ranging from television and web services for languages of smaller populations, to large-scale web ventures involving multiple languages by the British Broadcasting Corporation, the Special Broadcasting Service Australia, Wikipedia and Google.
To sum up, the thesis introduces the concepts of 'inter-language linking' and the 'lateral web' to model the social complexity and co-evolution of languages online. The resulting model reconsiders existing social complexity models in that it is the first that can explain the emergence of large-scale, networked co-evolution of languages and language content facilitated by the Internet and the WWW. Finally, the thesis argues that the Internet enables an open space for language and cross-language related features, and investigates how far this process is facilitated by (1) amateurs and (2) human-algorithmic interaction cultures.

In a digital age, the skills required to undertake an oral history project have changed dramatically. For community groups, this shift can be new and exciting, but it can also invoke feelings of anxiety when there is a gap in the skill set. Addressing this gap is one of the main activities of the Oral History Association of Australia, Queensland (OHAA Qld). This paper reports on the OHAA Qld chapter's oral history workshop program, which was radically altered in 2011.

The authors present a Cause-Effect fault diagnosis model, which utilises the Root Cause Analysis approach and takes into account the technical features of a digital substation. Dempster-Shafer evidence theory is used to integrate different types of fault information in the diagnosis model so as to implement a hierarchical, systematic and comprehensive diagnosis based on the logical relationships between parent and child nodes (such as transformer, circuit-breaker and transmission-line) and between root and child causes. A real fault scenario is investigated in the case study to demonstrate the developed approach in diagnosing malfunctions of protective relays and/or circuit breakers, missed or false alarms, and other commonly encountered faults at a modern digital substation.
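The evidence-fusion step described above can be illustrated with Dempster's rule of combination. This is a minimal sketch, not the paper's implementation: the component names and the mass values assigned to each evidence source are hypothetical.

```python
from itertools import product

def combine(m1, m2):
    """Combine two mass functions (dicts mapping frozenset -> mass)
    using Dempster's rule of combination."""
    combined = {}
    conflict = 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y  # mass assigned to contradictory hypotheses
    if conflict >= 1.0:
        raise ValueError("Total conflict: sources cannot be combined")
    k = 1.0 - conflict
    return {s: m / k for s, m in combined.items()}  # renormalise

# Frame of discernment: candidate faulted components (hypothetical)
T, B, L = "transformer", "breaker", "line"

# Evidence from protective-relay signals (illustrative masses)
m_relay = {frozenset({T}): 0.6, frozenset({T, B}): 0.3, frozenset({T, B, L}): 0.1}
# Evidence from circuit-breaker status information (illustrative masses)
m_breaker = {frozenset({T}): 0.5, frozenset({B}): 0.2, frozenset({T, B, L}): 0.3}

fused = combine(m_relay, m_breaker)
# the fused belief concentrates on the transformer hypothesis
```

Combining the two sources sharpens the diagnosis: mass that either source spread over composite hypotheses is redistributed onto the hypotheses they jointly support.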

Building Web 2.0 sites does not necessarily ensure their success. We aim to better understand what improves the success of a site by drawing insight from biologically inspired design patterns. Web 2.0 sites provide a mechanism for human interaction, enabling powerful intercommunication between massive volumes of users. Early Web 2.0 site providers that were previously dominant are being succeeded by newer sites providing innovative social interaction mechanisms. Understanding which site traits contribute to this success drives research into Web site mechanics, using models to describe the associated social networking behaviour. Some of these models attempt to show how the volume of users enables self-organisation and self-contextualisation of content. One model describing coordinated environments is stigmergy, a term originally describing coordinated insect behaviour. This paper explores how exploiting stigmergy can provide a valuable mechanism for identifying and analysing online user behaviour, specifically when considering that user freedom of choice is restricted by the provided Web site functionality. This will aid in building better collaborative Web sites and improving collaborative processes.

The recognition that Web 2.0 applications and social media sites will strengthen and improve interaction between governments and citizens has resulted in a global push into new e-democracy or Government 2.0 spaces. These typically follow government-to-citizen (g2c) or citizen-to-citizen (c2c) models, but both these approaches are problematic: g2c is often concerned more with service delivery to citizens as clients, or exists to make a show of ‘listening to the public’ rather than to genuinely source citizen ideas for government policy, while c2c often takes place without direct government participation and therefore cannot ensure that the outcomes of citizen deliberations are accepted into the government policy-making process. Building on recent examples of Australian Government 2.0 initiatives, we suggest a new approach based on government support for citizen-to-citizen engagement, or g4c2c, as a workable compromise, and suggest that public service broadcasters should play a key role in facilitating this model of citizen engagement.

The digital humanities are growing rapidly in response to a rise in Internet use. What humanists mostly work on, and what forms much of the content of our growing repositories, are digital surrogates of originally analog artefacts. But is the data model upon which many of those surrogates are based – embedded markup – adequate for the task? Or does it in fact inhibit reusability and flexibility? To enhance the interoperability of resources and tools, some changes to the standard markup model are needed. Markup could be removed from the text and stored in standoff form. The versions of which many cultural heritage texts are composed could also be represented externally and computed automatically. These changes would not disrupt existing data representations, which could be imported without significant data loss. They would also enhance automation and ease the increasing burden on the modern digital humanist.
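The standoff idea mentioned above — keeping the text clean and recording markup externally as character ranges — can be sketched as follows. The tag names and example text are hypothetical; the re-embedding function assumes well-nested, non-crossing ranges.

```python
from dataclasses import dataclass

@dataclass
class Annotation:
    start: int  # character offset into the plain text (inclusive)
    end: int    # character offset (exclusive)
    tag: str

# The plain text is kept free of embedded markup
text = "It was the best of times, it was the worst of times."

# Markup lives externally as offset ranges (standoff form)
annotations = [
    Annotation(0, 52, "s"),    # sentence span (hypothetical tag names)
    Annotation(11, 24, "hi"),
    Annotation(37, 51, "hi"),
]

def render(text, annotations):
    """Re-embed standoff annotations as inline tags for display.
    Inserts from the highest offset down so earlier offsets stay valid."""
    events = []
    for a in annotations:
        events.append((a.start, 1, f"<{a.tag}>"))
        events.append((a.end, 0, f"</{a.tag}>"))
    out = text
    for pos, _, tag in sorted(events, key=lambda e: (e[0], e[1]), reverse=True):
        out = out[:pos] + tag + out[pos:]
    return out
```

Because the annotations are separate from the text, several independent markup layers can point at the same immutable source, which is the interoperability gain the abstract argues for.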

This paper develops a framework for classifying term dependencies in query expansion with respect to the role terms play in structural linguistic associations. The framework is used to classify and compare the query expansion terms produced by the unigram and positional relevance models. As the unigram relevance model does not explicitly model term dependencies in its estimation process, it is often thought to ignore dependencies that exist between words in natural language. The framework presented in this paper is underpinned by two types of linguistic association, namely syntagmatic and paradigmatic associations. It was found that syntagmatic associations were the more prevalent form of linguistic association used in query expansion. Paradoxically, it was the unigram model that exhibited this association more strongly than the positional relevance model. This surprising finding has two potential implications for information retrieval models: (1) if linguistic associations underpin query expansion, then a probabilistic term dependence assumption based on position is inadequate for capturing them; (2) the unigram relevance model captures more term dependency information than its underlying theoretical model suggests, so its normative position as a baseline that ignores term dependencies should perhaps be reviewed.
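For readers unfamiliar with the unigram relevance model, here is a minimal RM1-style sketch of how expansion terms are scored: each term is weighted by its probability in a document, accumulated over documents in proportion to how well each document explains the query. The toy corpus and the smoothing weight are invented for illustration; this is not the paper's exact formulation.

```python
from collections import Counter

def unigram_relevance_model(query, docs, lam=0.5):
    """RM1-style scoring of candidate expansion terms:
    P(w|R) ~ sum over documents D of P(w|D) * P(Q|D),
    with linear collection smoothing inside P(.|D)."""
    # collection language model, used for smoothing
    coll = Counter()
    for d in docs:
        coll.update(d)
    coll_total = sum(coll.values())

    def p_w_d(w, d_counts, d_len):
        # mixture of document and collection language models
        return (1 - lam) * d_counts[w] / d_len + lam * coll[w] / coll_total

    scores = Counter()
    for d in docs:
        d_counts, d_len = Counter(d), len(d)
        p_q_d = 1.0
        for q in query:
            p_q_d *= p_w_d(q, d_counts, d_len)  # query likelihood of this doc
        for w in d_counts:
            scores[w] += p_w_d(w, d_counts, d_len) * p_q_d
    return scores

docs = [
    "digital library search ranking".split(),
    "query expansion improves search ranking".split(),
    "cooking recipes and ingredients".split(),
]
expansion = unigram_relevance_model(["search", "ranking"], docs)
```

Note that nothing in the estimate refers to term positions — which is why the finding that this model nonetheless surfaces syntagmatic associations is surprising.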

This paper develops and evaluates an enhanced corpus based approach for semantic processing. Corpus based models that build representations of words directly from text do not require pre-existing linguistic knowledge, and have demonstrated psychologically relevant performance on a number of cognitive tasks. However, they have been criticised in the past for not incorporating sufficient structural information. Using ideas underpinning recent attempts to overcome this weakness, we develop an enhanced tensor encoding model to build representations of word meaning for semantic processing. Our enhanced model demonstrates superior performance when compared to a robust baseline model on a number of semantic processing tasks.
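As a toy illustration of why tensor encodings can carry structural information that simple vector addition cannot (a sketch of the general idea, not the paper's model), consider binding two hypothetical three-dimensional word vectors with an outer product:

```python
def outer(u, v):
    """Rank-2 tensor (outer product) binding of two vectors."""
    return [[a * b for b in v] for a in u]

# toy word vectors (hypothetical, 3-dimensional)
john = [1.0, 0.0, 1.0]
mary = [0.0, 1.0, 1.0]

# binding subject and object by outer product preserves role/order:
john_loves_mary = outer(john, mary)
mary_loves_john = outer(mary, john)
assert john_loves_mary != mary_loves_john  # order-sensitive encoding

# whereas bag-of-words addition is order-blind:
assert [a + b for a, b in zip(john, mary)] == [a + b for a, b in zip(mary, john)]
```

The outer product distinguishes who fills which role because the two operands index different tensor axes; summed representations collapse that distinction, which is the "insufficient structural information" criticism the abstract mentions.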

A strongly progressive surveying and mapping industry depends on a shared understanding of the industry as it exists, some shared vision or imagination of what the industry might become, and some shared action plan capable of bringing about a realisation of that vision. The emphasis on sharing implies a need for consensus reached through widespread discussion and mutual understanding. Unless this occurs, concerted action is unlikely. A more likely outcome is that industry representatives will negate each other's efforts in their separate bids for progress. The process of bringing about consensual viewpoints is essentially one of establishing an industry identity. Establishing the industry's identity and purpose is a prerequisite for rational development of the industry's education and training, its promotion and marketing, and operational research that can deal with industry potential and efficiency. This paper interprets evolutionary developments occurring within Queensland's surveying and mapping industry within a framework that sets out logical requirements for a viable industry.

This presentation will deal with the transformations that have occurred in news journalism worldwide in the early 21st century. I will argue that they have been the most significant changes to the profession in 100 years, and that the challenges facing the news media industry in responding to them are substantial, as are those facing journalism education. I will develop this argument in relation to the crisis of the newspaper business model, and why social media, blogging and citizen journalism have not filled the gap left by the withdrawal of resources from traditional journalism. I will also draw upon Wikileaks as a case study in debates about computational and data-driven journalism, and whether large-scale "leaks" of electronic documents may be the future of investigative journalism.

This work-in-progress paper presents an ensemble-based model for detecting and mitigating Distributed Denial-of-Service (DDoS) attacks, and its partial implementation. The model utilises network traffic analysis and MIB (Management Information Base) server load analysis features for detecting a wide range of network and application layer DDoS attacks and distinguishing them from Flash Events. The proposed model will be evaluated against realistic synthetic network traffic generated using a software-based traffic generator that we have developed as part of this research. In this paper, we summarise our previous work, highlight the current work being undertaken along with preliminary results obtained and outline the future directions of our work.
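The detection idea — combining network-traffic and MIB server-load evidence to separate DDoS attacks from Flash Events — can be sketched as a small voting ensemble. The thresholds, the new-source-ratio heuristic, and the voting rule are all assumptions of this sketch, not the paper's model.

```python
def traffic_detector(pkts_per_sec, threshold=10_000):
    """Flags abnormal traffic volume (illustrative threshold)."""
    return pkts_per_sec > threshold

def mib_detector(cpu_load, threshold=0.9):
    """Flags abnormal server load derived from MIB counters (illustrative)."""
    return cpu_load > threshold

def source_detector(new_source_ratio, threshold=0.6):
    """Heuristic assumption for this sketch: Flash Events bring many
    previously unseen legitimate sources, so high volume with a LOW
    ratio of new sources suggests a DDoS botnet instead."""
    return new_source_ratio < threshold

def ensemble_classify(pkts_per_sec, cpu_load, new_source_ratio):
    """Majority vote over the three detectors; a lone traffic alarm
    with diverse new sources is treated as a Flash Event."""
    votes = [
        traffic_detector(pkts_per_sec),
        mib_detector(cpu_load),
        source_detector(new_source_ratio),
    ]
    if sum(votes) >= 2:
        return "ddos"
    if traffic_detector(pkts_per_sec):
        return "flash-event"
    return "normal"
```

The point of the ensemble is that no single feature separates the two phenomena: volume alone looks identical for a Flash Event and an attack, so the load and source-diversity detectors break the tie.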

The promotion of resilience (the capacity of an individual or community to bounce back and recover from adversity) has become an important area of public health. In recent years it has expanded into the digital domain, and many online applications have been developed to promote children's resilience. In this study, it is argued that the majority of existing applications are limited because they take a didactic approach and conceive of interaction as providing navigational choices. Because they simply provide information about resilience or replicate offline scenario-based strategies, the understanding of resilience they provide is confined to a few predetermined factors. In this study I propose a new, experiential approach to promoting resilience digitally. I define resilience as an emergent, situated and context-specific phenomenon. Using a Participatory Design model in combination with a salutogenic (strength-based) health methodology, this project has involved approximately 50 children as co-designers and co-researchers over two years. The children have contributed to the design of a new set of interactive resilience tools, which facilitate resilience promotion through dialogic and experiential learning. The major outcomes of this study include a new methodology for developing digital resilience tools, a new set of tools that have been developed and evaluated in collaboration with children, and a set of design principles to guide future development. Beyond these initial and tangible outcomes, this study has also established that the benefits of introducing Participatory Design into a health promotion model rest primarily in changing the role of children from "users" of technology and education to co-designers, where they assume a leadership role both in designing the tools and in directing their resilience learning.

The rapid increase in the deployment of CCTV systems has led to a greater demand for algorithms that are able to process incoming video feeds. These algorithms are designed to extract information of interest for human operators. During the past several years, there has been a large effort to detect abnormal activities through computer vision techniques. Typically, the problem is formulated as a novelty detection task, where the system is trained on normal data and is required to detect events which do not fit the learned 'normal' model. Many researchers have tried various sets of features to train different learning models to detect abnormal behaviour in video footage. In this work we propose using a Semi-2D Hidden Markov Model (HMM) to model the normal activities of people. Observations with insufficient likelihood under the model are identified as abnormal activities. Our Semi-2D HMM is designed to model both the temporal and spatial causalities of crowd behaviour by assuming that the current state of the Hidden Markov Model depends not only on the previous state in the temporal direction, but also on the previous states of the adjacent spatial locations. Two different HMMs are trained to model the vertical and horizontal spatial causal information. Location features, flow features and optical flow textures are used as the features for the model. The proposed approach is evaluated using the publicly available UCSD datasets, and we demonstrate improved performance compared to other state-of-the-art methods.
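The underlying novelty-detection step — scoring a sequence under a model of normal behaviour and flagging low-likelihood sequences — can be illustrated with a plain 1-D discrete HMM and the forward algorithm (not the Semi-2D variant proposed in the paper; all parameters and the threshold are invented for illustration).

```python
import math

def forward_log_likelihood(obs, start, trans, emit):
    """Log-likelihood of an observation sequence under a discrete HMM,
    computed with the forward algorithm. States/observations are indices."""
    n_states = len(start)
    # initialisation: P(state) * P(first observation | state)
    alpha = [start[s] * emit[s][obs[0]] for s in range(n_states)]
    for o in obs[1:]:
        # recursion: sum over predecessor states, then emit
        alpha = [
            sum(alpha[sp] * trans[sp][s] for sp in range(n_states)) * emit[s][o]
            for s in range(n_states)
        ]
    return math.log(sum(alpha))

# HMM representing 'normal' activity (parameters illustrative)
start = [0.8, 0.2]
trans = [[0.9, 0.1], [0.2, 0.8]]
emit  = [[0.7, 0.25, 0.05], [0.5, 0.45, 0.05]]

normal_seq   = [0, 0, 1, 0, 0]   # observations typical of training data
abnormal_seq = [2, 2, 2, 2, 2]   # observations rarely emitted by the model

threshold = -8.0  # hypothetical; set from a validation set in practice
is_abnormal = forward_log_likelihood(abnormal_seq, start, trans, emit) < threshold
```

The Semi-2D extension in the paper changes the transition structure (conditioning on adjacent spatial states as well as the temporal predecessor), but the flagging rule is the same likelihood-threshold test shown here.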

Flexible information exchange is critical to successful design-analysis integration, but current top-down, standards-based and model-oriented strategies impose restrictions that contradict this flexibility. In this article we present a bottom-up, user-controlled and process-oriented approach to linking design and analysis applications that is more responsive to the varied needs of designers and design teams. Drawing on research into scientific workflows, we present a framework for integration that capitalises on advances in cloud computing to connect discrete tools via flexible and distributed process networks. We then discuss how a shared mapping process that is flexible and user-friendly supports non-programmers in creating these custom connections. Adopting a service-oriented system architecture, we propose a web-based platform that enables data, semantics and models to be shared on the fly. We then discuss potential challenges and opportunities for its development as a flexible, visual, collaborative, scalable and open system.
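The user-controlled mapping idea can be sketched as a small declarative translation between one tool's output record and another's expected input. The field names, dotted-path convention, and unit conversion below are hypothetical, invented only to illustrate the kind of mapping a non-programmer might define.

```python
# Hypothetical declarative mapping from a design tool's output record
# to an analysis tool's input: source path -> (target key, converter)
mapping = {
    "geometry.height_mm": ("height_m", lambda v: v / 1000.0),  # mm -> m
    "material.name": ("material", None),                        # copied as-is
}

def apply_mapping(source, mapping):
    """Translate one tool's nested output record into another tool's
    flat input format using the user-defined mapping above."""
    result = {}
    for src_path, (dst_key, convert) in mapping.items():
        value = source
        for part in src_path.split("."):  # walk the dotted source path
            value = value[part]
        result[dst_key] = convert(value) if convert else value
    return result

design_output = {"geometry": {"height_mm": 3200}, "material": {"name": "concrete"}}
analysis_input = apply_mapping(design_output, mapping)
```

Because the connection logic lives in data rather than code, each such mapping can be shared, edited and recombined without touching either tool, which is the bottom-up flexibility the article argues for.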