961 results for design rules
Abstract:
This paper discusses the effectiveness of blogs for reflective learning in design education. Students in two animation units were asked to keep an online journal via a blog as a form of reflective learning. Students were encouraged to post their weekly outcomes and project development process to their blogs and to share them with other students. A survey was undertaken to evaluate their learning experience, and one of the key outcomes indicates that interaction design for social networks is significantly important to blog-based learning design.
Abstract:
Early in the practice-led research debate, Steven Scrivener (2000, 2002) identified some general differences in the approach of artists and designers undertaking postgraduate research. His distinctions centered on the role of the artefact in problem-based research (associated with design) and creative-production research (associated with artistic practice). Nonetheless, in broader discussions on practice-led research, 'art and design' often continues to be conflated within a single term. In particular, marked differences between art and design methodologies, theoretical framing, research goals and research claims have been underestimated. This paper revisits Scrivener's work and establishes further distinctions between art and design research. It is informed by our own experiences of postgraduate supervision and research methods training, and an empirical study of over sixty postgraduate, practice-led projects completed at the Creative Industries Faculty of QUT between 2002 and 2008. Our reflections have led us to propose that artists and designers work with differing research goals (the evocative and the effective, respectively), which are played out in the questions asked, the creative process, the role of the artefact and the way new knowledge is evidenced. Of course, research projects will have their own idiosyncrasies but, we argue, marking out the poles at each end of the spectrum of art and design provides useful insights for postgraduate candidates, supervisors and methodologists alike.
Abstract:
Service-orientation has gained widespread acceptance and is increasingly being employed as a paradigm for structuring both business and IT architectures. An earlier study of extant service analysis and design methodologies discovered a need for holistic approaches that equally account for both business and software services, which motivated the design of a new, consolidated service analysis and design methodology. A challenge in design-oriented research is to evaluate the utility of the newly created artefacts (here: the methodology), as they are often intended to become part of complex socio-technical systems. Therefore, after presenting a brief overview of the consolidated methodology, the paper discusses possible approaches for the “evaluate” phase of this design-science research process and presents the results of an empirical evaluation conducted in an Action Research study at one of Australia’s largest financial services providers.
Abstract:
Association rule mining is one technique that is widely used when querying databases, especially transactional databases, in order to obtain useful associations or correlations among sets of items. Much work has been done focusing on efficiency, effectiveness and redundancy. There has also been a focus on the quality of rules from single-level datasets, with many interestingness measures proposed. However, with multi-level datasets now being common, there is a lack of interestingness measures developed for multi-level and cross-level rules. Single-level measures do not take into account the hierarchy found in a multi-level dataset. This leaves the Support-Confidence approach, which does not consider the hierarchy anyway and has other drawbacks, as one of the few measures available. In this paper we propose two approaches which measure multi-level association rules to help evaluate their interestingness. These measures of diversity and peculiarity can be used to help identify those rules from multi-level datasets that are potentially useful.
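As a rough illustration of the gap such measures address (this sketch does not reproduce the proposed diversity or peculiarity measures), the following Python example uses an invented two-level taxonomy to show how the same transactions give very different support to a rule depending on the level of the hierarchy at which it is expressed, which is exactly what single-level measures ignore:

# Hypothetical two-level taxonomy: leaf item -> parent category
taxonomy = {
    "skim milk": "milk", "full cream milk": "milk",
    "white bread": "bread", "rye bread": "bread",
}

transactions = [
    {"skim milk", "white bread"},
    {"full cream milk", "rye bread"},
    {"skim milk", "rye bread"},
    {"full cream milk"},
]

def support(itemset, txns):
    """Fraction of transactions containing every item in itemset."""
    return sum(itemset <= t for t in txns) / len(txns)

def lift_to_parent_level(txn):
    """Replace each leaf item with its parent category."""
    return {taxonomy.get(item, item) for item in txn}

# Leaf-level rule {skim milk} -> {white bread}
leaf_sup = support({"skim milk", "white bread"}, transactions)

# Category-level rule {milk} -> {bread}, after lifting transactions one level up
lifted = [lift_to_parent_level(t) for t in transactions]
cat_sup = support({"milk", "bread"}, lifted)

print(f"support(skim milk, white bread) = {leaf_sup:.2f}")   # 0.25
print(f"support(milk, bread)            = {cat_sup:.2f}")    # 0.75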
Abstract:
Association rule mining has made many advances in the area of knowledge discovery. However, the quality of the discovered association rules is a big concern and has drawn increasing attention recently. One problem with the quality of the discovered association rules is the huge size of the extracted rule set. Often, for a dataset, a huge number of rules can be extracted, but many of them can be redundant with respect to other rules and thus useless in practice. Mining non-redundant rules is a promising approach to solving this problem. In this paper, we first propose a definition of redundancy; we then propose a concise representation, called the Reliable basis, for representing non-redundant association rules, covering both exact rules and approximate rules. An important contribution of this paper is that we propose to use the certainty factor as the criterion for measuring the strength of the discovered association rules. With this criterion, we can determine the boundary between redundancy and non-redundancy, ensuring that as many redundant rules as possible are eliminated without reducing the inference capacity of, or the belief in, the remaining extracted non-redundant rules. We prove that the redundancy elimination based on the proposed Reliable basis does not reduce the belief in the extracted rules. We also prove that all association rules can be deduced from the Reliable basis. Therefore the Reliable basis is a lossless representation of association rules. Experimental results show that the proposed Reliable basis can significantly reduce the number of extracted rules.
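As a hedged illustration of the certainty-factor idea (using a common formulation from the association-rule literature, not necessarily the exact definition or Reliable-basis construction used in the paper), the following Python sketch computes the certainty factor of a rule A -> B from its confidence and the baseline support of B:

def certainty_factor(conf_ab: float, supp_b: float) -> float:
    """Certainty factor of a rule A -> B from its confidence and the support of B.

    Positive: A makes B more likely than its baseline support.
    Negative: A makes B less likely. Zero: A adds nothing over the baseline.
    """
    if conf_ab > supp_b:
        return (conf_ab - supp_b) / (1.0 - supp_b)
    if conf_ab < supp_b:
        return (conf_ab - supp_b) / supp_b
    return 0.0

# B appears in 40% of transactions; given A, B appears 80% of the time.
print(certainty_factor(0.80, 0.40))   # 0.666..., A substantially strengthens B

# Confidence merely equals B's baseline support -> the rule adds no strength.
print(certainty_factor(0.40, 0.40))   # 0.0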
Abstract:
Recommender systems are widely used online to help users find products, items, etc. that they may be interested in, based on what is known about that user from their profile. Often, however, user profiles may be short on information, and when there is not sufficient knowledge about a user it is difficult for a recommender system to make quality recommendations. This problem is often referred to as the cold-start problem. Here we investigate whether association rules can be used as a source of information to expand a user profile and thus avoid this problem, leading to improved recommendations to users. Our pilot study shows that it is indeed possible to use association rules to improve the performance of a recommender system. We believe this can lead to further work on utilising appropriate association rules to lessen the impact of the cold-start problem.
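A minimal Python sketch of the profile-expansion idea (with invented items, rules and thresholds; not the paper's pilot-study setup) adds the consequents of sufficiently confident rules whose antecedents the sparse profile already satisfies, before the profile is handed to the recommender:

# Each rule: (antecedent, consequent, confidence); invented for illustration
rules = [
    (frozenset({"sci-fi"}), frozenset({"space opera"}), 0.82),
    (frozenset({"sci-fi", "space opera"}), frozenset({"cyberpunk"}), 0.64),
    (frozenset({"romance"}), frozenset({"drama"}), 0.71),
]

def expand_profile(profile: set, rules, min_conf: float = 0.6) -> set:
    """Single pass: add the consequents of confident rules whose antecedents
    are already contained in the original (sparse) profile."""
    expanded = set(profile)
    for antecedent, consequent, conf in rules:
        if conf >= min_conf and antecedent <= profile:
            expanded |= consequent
    return expanded

cold_start_profile = {"sci-fi"}   # very little is known about this user
print(expand_profile(cold_start_profile, rules))
# -> {'sci-fi', 'space opera'}; the recommender then works with the enriched
#    profile instead of the original one-item profile.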
Abstract:
This paper presents the design of self-tuning controllers for a two-terminal HVDC link. The controllers are designed using a novel discrete-time converter model based on multirate sampling. The nature of the converter firing system necessitates the development of a two-step-ahead self-tuning control strategy. A two-terminal HVDC system study has been carried out to show the effectiveness of the proposed control strategies, which include the design of a minimum variance controller, a pole-assignment controller and a PLQG controller. Coordinated control of the two-terminal HVDC system has been established by deriving the control signal from the inverter-end current and voltage, which are estimated from measurements of rectifier-end quantities only, via a robust reduced-order observer. Well-known scaled-down sample system data have been selected for the studies, and the controllers designed have been tested under worst-case conditions. The performance of the self-tuning controllers has been evaluated through digital simulation.
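For readers unfamiliar with self-tuning control, the following generic Python sketch combines recursive least-squares identification of a simple first-order plant with a one-step-ahead minimum variance law; it only illustrates the self-tuning principle and does not reproduce the paper's multirate converter model or two-step-ahead strategy:

import numpy as np

rng = np.random.default_rng(0)

# "True" plant, unknown to the controller: y[t] = a*y[t-1] + b*u[t-1] + noise
a_true, b_true, noise_std = 0.8, 0.5, 0.01
setpoint = 1.0

theta = np.zeros(2)        # estimated [a, b]
P = np.eye(2) * 100.0      # RLS covariance (large = little prior knowledge)

y_prev, u_prev = 0.0, 0.0
for t in range(200):
    # Plant responds to the previous control input
    y = a_true * y_prev + b_true * u_prev + rng.normal(0.0, noise_std)

    # Recursive least-squares update of the parameter estimates
    phi = np.array([y_prev, u_prev])
    k = P @ phi / (1.0 + phi @ P @ phi)
    theta = theta + k * (y - phi @ theta)
    P = P - np.outer(k, phi) @ P

    # One-step-ahead minimum variance law: drive the predicted output to the setpoint
    a_hat, b_hat = theta
    b_hat = b_hat if abs(b_hat) > 1e-3 else 1e-3    # guard against division by ~0
    u = (setpoint - a_hat * y) / b_hat
    u = float(np.clip(u, -10.0, 10.0))              # crude actuator limit

    y_prev, u_prev = y, u

print(f"estimated a={theta[0]:.3f}, b={theta[1]:.3f} (plant: a=0.8, b=0.5)")
print(f"final output y={y_prev:.3f} (setpoint {setpoint})")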
Abstract:
One of the new challenges in aeronautics is combining and accounting for multiple disciplines while considering uncertainties or variability in the design parameters or operating conditions. This paper describes a methodology for robust multidisciplinary design optimisation when there is uncertainty in the operating conditions. The methodology, which is based on canonical evolutionary algorithms, is enhanced by coupling it with an uncertainty analysis technique. The paper illustrates the use of this methodology on two practical test cases related to Unmanned Aerial Systems (UAS). These are ideal candidates due to the multi-physics involved and the variability of the missions to be performed. Results obtained from the optimisation show that the method is effective at finding useful Pareto non-dominated solutions and demonstrate the use of robust design techniques.
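A toy Python sketch of this coupling (placeholder objective, placeholder uncertainty model and a minimal (1+1) evolution strategy, not the paper's UAS test cases) evaluates each candidate design against sampled operating conditions, so the search is driven by a robust rather than a nominal objective:

import random

def performance(design: float, speed: float) -> float:
    """Toy objective to minimise; larger |design| is more sensitive to speed variation."""
    return (design - 2.0) ** 2 + 0.1 * (speed - 30.0) ** 2 * abs(design)

def robust_fitness(design: float, n_samples: int = 50) -> float:
    """Mean performance over sampled uncertain operating conditions."""
    samples = [performance(design, random.gauss(30.0, 5.0)) for _ in range(n_samples)]
    return sum(samples) / n_samples

# Minimal (1+1) evolution strategy over a single design variable
random.seed(1)
best = random.uniform(0.0, 5.0)
best_fit = robust_fitness(best)
for _ in range(200):
    child = best + random.gauss(0.0, 0.3)
    child_fit = robust_fitness(child)
    if child_fit <= best_fit:
        best, best_fit = child, child_fit

# The deterministic optimum is design = 2.0, but speed uncertainty penalises
# larger |design|, so the robust search settles nearer design ~ 0.75.
print(f"robust design ~ {best:.2f}, mean-case objective ~ {best_fit:.2f}")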
Abstract:
This paper considers some of the implications of the rise of design as a master-metaphor of the information age. It compares the terms 'interaction design' and 'mass communication', suggesting that both can be seen as a contradiction in terms, inappropriately preserving an industrial-age division between producers and consumers. With the shift from mass media to interactive media, semiotic and political power seems to be shifting too - from media producers to designers. This paper argues that it is important for the new discipline of 'interaction design' not to fall into habits of thought inherited from the 'mass' industrial era. Instead it argues for the significance, for designers and producers alike, of what I call 'distributed expertise' - including social network markets, DIY culture, user-led innovation, consumer co-created content, and the use of Web 2.0 affordances for social, scientific and creative purposes as well as for entertainment. It considers the importance of the growth of 'distributed expertise' as part of a new paradigm in the growth of knowledge, which has 'evolved' through a number of phases, from 'abstraction' to 'representation' to 'productivity'. In the context of technologically mediated popular participation in the growth of knowledge and social relationships, the paper argues that design and media-production professions need to cross, rather than maintain, the gap between experts and everyone else, enabling all the agents in the system to navigate the shift into the paradigm of mass productivity.
Abstract:
This paper explores a method of comparative analysis and classification of data through perceived design affordances. Included is a discussion of the musical potential of data forms that are derived through eco-structural analysis of musical features inherent in audio recordings of natural sounds. A system of classification of these forms is proposed based on their structural contours. The classification includes four primitive types: steady, iterative, unstable and impulse. It extends previous taxonomies used to describe the gestural morphology of sound. The methods presented are used to provide compositional support for eco-structuralism.
Abstract:
Network Jamming systems provide real-time collaborative performance experiences for novice or inexperienced users. In this paper we outline the interaction design considerations that have emerged through the evolutionary development cycles of the jam2jam Network Jamming software, which employs generative techniques that require particular attention to the human-computer relationship. In particular, we describe the co-evolution of features and uses, explore the role of agile development methods in supporting this evolution, and show how the provision of a clear core capability can be matched with options for enhanced features to support multi-levelled user experience and skill development.
Abstract:
The importance of student engagement to higher education quality, to making deep learning outcomes possible for students, and to achieving student retention is increasingly being understood. The issue of student engagement in the first year of tertiary study is of particular significance. This paper takes the position that the first year curriculum, and the pedagogical principles that inform its design, are critical influencers of student engagement in the first year learning environment. We use an analysis of case studies prepared for Kift’s ALTC Senior Fellowship to demonstrate ways in which student engagement in the first year of tertiary study can be successfully supported through intentional curriculum design that motivates students to learn, provides a positive learning climate, and encourages students to be active in their learning.
Abstract:
Artificial neural networks (ANNs) have demonstrated good predictive performance in a wide range of applications. They are, however, not considered sufficient for knowledge representation because of their inability to represent the reasoning process succinctly. This paper proposes a novel methodology, Gyan, that represents the knowledge of a trained network in the form of restricted first-order predicate rules. The empirical results demonstrate that an equivalent symbolic interpretation, in the form of rules with predicates, terms and variables, can be derived that describes the overall behaviour of the trained ANN with improved comprehensibility while maintaining the accuracy and fidelity of the propositional rules.
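As a hedged illustration of the general rule-extraction idea only (a pedagogical decision-tree surrogate, not the Gyan methodology, which produces restricted first-order predicate rules), the following scikit-learn sketch fits a shallow tree to a trained network's own predictions, reads propositional rules off the tree, and reports fidelity to the network:

from sklearn.datasets import load_iris
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier, export_text

data = load_iris()
X, y = data.data, data.target

# 1. Train the "opaque" network
net = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X, y)

# 2. Re-label the inputs with the network's own predictions
y_net = net.predict(X)

# 3. Fit a shallow surrogate tree to mimic the network and print its rules
surrogate = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y_net)
print(export_text(surrogate, feature_names=list(data.feature_names)))

# Fidelity: how often the extracted rules agree with the network they describe
print("fidelity:", (surrogate.predict(X) == y_net).mean())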
Abstract:
Surveillance for invasive non-indigenous species (NIS) is an integral part of a quarantine system. Estimating the efficiency of a surveillance strategy relies on many uncertain parameters estimated by experts, such as the efficiency of its components in the face of the specific NIS, the ability of the NIS to inhabit different environments, and so on. Due to the importance of detecting an invasive NIS within a critical period of time, it is crucial that these uncertainties be accounted for in the design of the surveillance system. We formulate a detection model that takes into account, in addition to structured sampling for incursive NIS, incidental detection by untrained workers. We use info-gap theory to satisfice (rather than maximize) the probability of detection, while at the same time maximizing the robustness to uncertainty. We demonstrate the trade-off between robustness to uncertainty and an increase in the required probability of detection. An empirical example based on the detection of Pheidole megacephala on Barrow Island demonstrates the use of info-gap analysis to select a surveillance strategy.
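The following Python sketch illustrates the info-gap trade-off in a generic way, using an invented fractional-error uncertainty model around nominal per-component detection probabilities (not the paper's Barrow Island analysis): as the required probability of detection rises, the largest uncertainty horizon at which it can still be guaranteed shrinks.

nominal = [0.30, 0.20, 0.15]   # nominal detection probability of each component

def system_detection(probs):
    """Probability that at least one independent component detects the incursion."""
    miss = 1.0
    for p in probs:
        miss *= (1.0 - p)
    return 1.0 - miss

def worst_case_detection(alpha):
    """Worst case when each probability may be up to a fraction alpha lower than nominal."""
    return system_detection([max(0.0, p * (1.0 - alpha)) for p in nominal])

def robustness(required_prob, step=0.001):
    """Largest uncertainty horizon alpha at which the requirement is still met."""
    alpha = 0.0
    while alpha < 1.0 and worst_case_detection(alpha + step) >= required_prob:
        alpha += step
    return alpha

# Trade-off: demanding a higher probability of detection leaves less robustness
for pc in (0.30, 0.40, 0.50):
    print(f"required P(detect) = {pc:.2f} -> robustness alpha = {robustness(pc):.3f}")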
Abstract:
We consider the problem of designing a surveillance system to detect a broad range of invasive species across a heterogeneous sampling frame. We present a model to detect a range of invertebrate invasives whilst addressing the challenges of multiple data sources, stratifying for differential risk, managing labour costs and providing sufficient power of detection. We determine the number of detection devices required and their allocation across the landscape within limiting resource constraints. The resulting plan will lead to reduced financial and ecological costs and an optimal surveillance system.
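A back-of-the-envelope Python sketch of the device-number calculation, assuming independent devices with a common per-device detection probability within each stratum (the probabilities, strata and unit cost are invented for illustration; the paper's model additionally handles multiple data sources and differential risk):

import math

def devices_needed(per_device_p: float, target_power: float) -> int:
    """Smallest n with 1 - (1 - p)^n >= target_power, assuming independent devices."""
    return math.ceil(math.log(1.0 - target_power) / math.log(1.0 - per_device_p))

# Hypothetical strata with differing per-device detection probabilities
strata = {"port surrounds": 0.10, "industrial area": 0.05, "bushland": 0.02}
target_power = 0.95
unit_cost = 120.0   # assumed cost per device, for a rough budget figure

for name, p in strata.items():
    n = devices_needed(p, target_power)
    print(f"{name:>15}: {n:4d} devices  (~${n * unit_cost:,.0f})")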