854 results for user-centered approach
Abstract:
In this paper, we present a control strategy design technique for an autonomous underwater vehicle based on solutions to the motion planning problem derived from differential geometric methods. The motion planning problem is motivated by the practical application of surveying the hull of a ship for harbor and port security. In recent years, engineers and researchers have been collaborating on automating ship hull inspections by employing autonomous vehicles. Despite the progress made, human intervention is still necessary at this stage. To increase the functionality of these autonomous systems, we focus on developing model-based control strategies for survey missions around challenging regions, such as the bulbous bow region of a ship. Recent advances in differential geometry have given rise to the field of geometric control theory. This has proven to be an effective framework for control strategy design for mechanical systems, and has recently been extended to applications for underwater vehicles. Advantages of geometric control theory include the exploitation of symmetries and nonlinearities inherent to the system. Here, we examine the posed inspection problem from a path planning viewpoint, applying recently developed techniques from the field of differential geometric control theory to design the control strategies that steer the vehicle along the prescribed path. Three potential scenarios for surveying a ship's bulbous bow region are motivated for path planning applications. For each scenario, we compute the control strategy and implement it on a test-bed vehicle. Experimental results are analyzed and compared with theoretical predictions.
Abstract:
The main focus of this paper is the motion planning problem for a deeply submerged rigid body. The equations of motion are formulated and presented in the framework of differential geometry, and they incorporate external dissipative and restoring forces. We consider a kinematic reduction of the affine connection control system for the rigid body submerged in an ideal fluid, and present an extension of this reduction to the forced affine connection control system for the rigid body submerged in a viscous fluid. The motion planning strategy is based on kinematic motions: the integral curves of rank one kinematic reductions. This method is of particular interest to autonomous underwater vehicles which cannot directly control all six degrees of freedom (such as torpedo-shaped AUVs) or in the case of actuator failure (i.e., an under-actuated scenario). A practical example is included to illustrate our technique.
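For context, an affine connection control system of the kind this abstract refers to is conventionally written in the following standard form from the geometric control literature (this is background notation, not an equation quoted from the paper):

$$\nabla_{\gamma'(t)}\gamma'(t) \;=\; Y(\gamma(t)) \;+\; \sum_{a=1}^{m} u^a(t)\, Y_a(\gamma(t)),$$

where $\gamma$ is the vehicle trajectory, $\nabla$ is the affine connection encoding the kinetic energy metric, $Y$ collects the external dissipative and restoring forces, and the $Y_a$ are the input vector fields weighted by controls $u^a$. A rank one kinematic reduction replaces this second-order system with the first-order system $\gamma'(t) = v(t)\, X(\gamma(t))$ along a decoupling vector field $X$; the integral curves of $X$ are the kinematic motions used for planning.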
Abstract:
One of the major challenges in the design of social technologies is the evaluation of their qualities of use and how they are appropriated over time. While the field of HCI abounds in short-term exploratory design and studies of use, relatively little attention has focused on the continuous development of prototypes longitudinally and studies of their emergent use. We ground the exploration and analysis of use in the everyday world, embracing contingency and open-ended use, through the use of a continuously-available exploratory prototype. Through examining use longitudinally, clearer insight can be gained of realistic, non-novelty usage and appropriation into everyday use. This paper sketches out a framework for design that puts a premium on immediate use and evolving the design in response to use and user feedback. While such design practices with continuously developing systems are common in the design of social technologies, they are little documented. We describe our approach and reflect upon its key characteristics, based on our experiences from two case studies. We also present five major patterns of long-term usage which we found useful for design.
Abstract:
Increasingly, large amounts of public and private money are being invested in education and as a result, schools are becoming more accountable to stakeholders for this financial input. In terms of the curriculum, governments worldwide are frequently tying school funding to students' and schools' academic performances, which are monitored through high-stakes testing programs. To accommodate the resultant pressures from these testing initiatives, many principals are re-focussing their school's curriculum on the testing requirements. Such a re-focussing, which was examined critically in this thesis, constituted an externally facilitated rapid approach to curriculum change. In line with previously enacted change theories and recommendations from these, curriculum change in schools has tended to be a fairly slow, considered, collaborative process that is facilitated internally by a deputy-principal (curriculum). However, theoretically based research has shown that such a process has often proved to be difficult and very rarely successful. The present study reports and theorises the experiences of an externally facilitated process that emerged from a practitioner model of change. This case study of the development of the controlled rapid approach to curriculum change began by establishing the reasons three principals initiated curriculum change and why they then engaged an outsider to facilitate the process. It also examined this particular change process from the perspectives of the research participants. The investigation led to the revision of the practitioner model as used in the three schools and challenged the current thinking about the process of school curriculum change. The thesis aims to offer principals and the wider education community an alternative model for consideration when undertaking curriculum change.
Finally, the thesis warns that, in the longer term, the application of the study's revised model (the Controlled Rapid Approach to Curriculum Change [CRACC] Model) may have less than desirable educational consequences.
Abstract:
Process models are used by information professionals to convey semantics about the business operations in a real world domain intended to be supported by an information system. The understandability of these models is vital to them being used for information systems development. In this paper, we examine two factors that we predict will influence the understanding of a business process that novice developers obtain from a corresponding process model: the content presentation form chosen to articulate the business domain, and the user characteristics of the novice developers working with the model. Our experimental study provides evidence that novice developers obtain similar levels of understanding when confronted with an unfamiliar or a familiar process model. However, previous modeling experience, the use of English as a second language, and previous work experience in BPM are important influencing factors of model understanding. Our findings suggest that education and research in process modeling should increase the focus on human factors and how they relate to content and content presentation formats for different modeling tasks. We discuss implications for practice and research.
Abstract:
An approach aimed at enhancing learning by matching individual students' preferred cognitive styles to computer-based instructional (CBI) material is presented. This approach was used in teaching some components of a third-year unit in an electrical engineering course at the Queensland University of Technology. Cognitive style characteristics of perceiving and processing information were considered. The bimodal nature of cognitive styles (analytic/imager, analytic/verbalizer, wholist/imager and wholist/verbalizer) was examined in order to assess the full ramifications of cognitive styles on learning. In a quasi-experimental format, students' cognitive styles were analysed by cognitive style analysis (CSA) software. On the basis of the CSA results the system defaulted students to either matched or mismatched CBI material. The consistently better performance by the matched group suggests potential for further investigations where the limitations cited in this paper are eliminated. Analysing the differences between cognitive styles on individual test tasks also suggests that certain test tasks may better suit certain cognitive styles.
Abstract:
The paper provides an assessment of the performance of commercial Real Time Kinematic (RTK) systems over longer than recommended inter-station distances. The experiments were set up to test and analyse solutions from the i-MAX, MAX and VRS systems being operated with three triangle-shaped network cells, each having an average inter-station distance of 69km, 118km and 166km. The performance characteristics appraised included initialisation success rate, initialisation time, RTK position accuracy and availability, ambiguity resolution risk and RTK integrity risk, in order to provide a wider perspective of the performance of the systems under test.

The results showed that the performances of all network RTK solutions assessed were affected by the increase in the inter-station distances to similar degrees. The MAX solution achieved the highest initialisation success rate of 96.6% on average, albeit with a longer initialisation time. The two VRS approaches achieved a lower initialisation success rate of 80% over the large triangle. In terms of RTK positioning accuracy after successful initialisation, the results indicated a good agreement between the actual error growth in both horizontal and vertical components and the accuracy specified in the RMS and part per million (ppm) values by the manufacturers.

Additionally, the VRS approaches performed better than the MAX and i-MAX when being tested under the standard triangle network with a mean inter-station distance of 69km. However, as the inter-station distance increases, the network RTK software may fail to generate VRS corrections and may then turn to operate in the nearest single-base RTK (or RAW) mode. The position uncertainty occasionally exceeded 2 metres, showing that the RTK rover software was using an incorrect ambiguity-fixed solution to estimate the rover position rather than automatically dropping back to an ambiguity-float solution.

Results identified that the risk of incorrectly resolving ambiguities reached 18%, 20%, 13% and 25% for i-MAX, MAX, Leica VRS and Trimble VRS respectively when operating over the large triangle network. Additionally, the Coordinate Quality indicator values given by the Leica GX1230 GG rover receiver tended to be over-optimistic and failed to flag incorrectly fixed integer ambiguity solutions. In summary, this independent assessment has identified some problems and failures that can occur in all of the systems tested, especially when being pushed beyond the recommended limits. While such failures are expected, they can offer useful insights into where users should be wary and how manufacturers might improve their products. The results also demonstrate that integrity monitoring of RTK solutions is indeed necessary for precision applications, thus deserving serious attention from researchers and system providers.
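The RMS-plus-ppm accuracy specification mentioned above is simple to evaluate: the ppm term adds one millimetre of expected error per kilometre of baseline. The sketch below illustrates this arithmetic; the spec values (8 mm + 1 ppm) are hypothetical, not figures taken from the study.

```python
def expected_rtk_error_mm(rms_mm: float, ppm: float, baseline_km: float) -> float:
    """Expected RTK position error: a fixed RMS term plus a
    distance-dependent term. 1 ppm of 1 km is 1 mm, so the ppm value
    converts directly to mm of error per km of baseline."""
    return rms_mm + ppm * baseline_km

# Hypothetical horizontal spec of 8 mm + 1 ppm, evaluated at the three
# mean inter-station distances used in the study (69, 118 and 166 km):
for d in (69, 118, 166):
    print(f"{d} km -> {expected_rtk_error_mm(8, 1, d):.0f} mm")
```

At 166 km even this linear model predicts error growth approaching two decimetres, which is why the paper checks whether the observed error growth actually tracks the manufacturers' RMS and ppm figures.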
Abstract:
This paper presents a robust place recognition algorithm for mobile robots. The proposed framework combines nonlinear dimensionality reduction, nonlinear regression under noise, and variational Bayesian learning to create consistent probabilistic representations of places from images. These generative models are learnt from a few images and used for multi-class place recognition, where classification is computed from a set of feature vectors. Recognition can be performed in near real-time and accounts for complexities such as changes in illumination, occlusions and blurring. The algorithm was tested with a mobile robot in indoor and outdoor environments with sequences of 1579 and 3820 images respectively. This framework has several potential applications such as map building, autonomous navigation, search-and-rescue tasks and context recognition.
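The core idea of classifying a set of feature vectors against per-place generative models can be sketched in a few lines. This toy version fits a diagonal Gaussian per place by maximum likelihood and picks the place with the highest summed log-likelihood; the paper's actual method uses variational Bayesian learning, which this sketch does not reproduce.

```python
import math

def fit_place(vectors):
    """Fit a diagonal-Gaussian model (per-dimension mean and variance)
    to a few feature vectors observed at one place."""
    n, d = len(vectors), len(vectors[0])
    mean = [sum(v[i] for v in vectors) / n for i in range(d)]
    var = [sum((v[i] - mean[i]) ** 2 for v in vectors) / n + 1e-6
           for i in range(d)]
    return mean, var

def log_likelihood(model, x):
    """Log-density of one feature vector under a diagonal Gaussian."""
    mean, var = model
    return sum(-0.5 * (math.log(2 * math.pi * v) + (xi - m) ** 2 / v)
               for xi, m, v in zip(x, mean, var))

def recognise(models, feature_vectors):
    """Multi-class recognition: sum log-likelihoods over the whole set
    of feature vectors and return the best-scoring place label."""
    scores = {name: sum(log_likelihood(m, x) for x in feature_vectors)
              for name, m in models.items()}
    return max(scores, key=scores.get)
```

Classifying over a *set* of vectors rather than a single one is what gives robustness to per-image nuisances such as blur or occlusion: one corrupted vector rarely outweighs the rest of the set.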
Abstract:
In the late 20th century, a value-shift began to influence political thinking, recognising the need for environmentally, socially and culturally sustainable resource development. This shift entailed moves away from thinking of nature and culture as separate entities, the former existing merely to serve the latter. Cultural landscape theory recognises 'nature' as at once both 'natural' and a 'cultural' construct. As such it may offer a framework through which to progress in the quest for 'sustainable development'. This 2005 Masters thesis makes a contribution to that quest by asking whether contemporary developments in cultural landscape theory can contribute to rehabilitation strategies for Australian open-cut coal mining landscapes, an exemplar resource development landscape. A thematic historical overview of landscape values and resource development in Australia post-1788, and a review of cultural landscape theory literature, contribute to the formation of the theoretical framework: "reconnecting the interrupted landscape". The author then explores a possible application of this framework within the Australian open-cut coal mining landscape.
Abstract:
Software transactional memory has the potential to greatly simplify development of concurrent software, by supporting safe composition of concurrent shared-state abstractions. However, STM semantics are defined in terms of low-level reads and writes on individual memory locations, so implementations are unable to take advantage of the properties of user-defined abstractions. Consequently, the performance of transactions over some structures can be disappointing.

We present Modular Transactional Memory, our framework which allows programmers to extend STM with concurrency control algorithms tailored to the data structures they use in concurrent programs. We describe our implementation in Concurrent Haskell, and two example structures: a finite map which allows concurrent transactions to operate on disjoint sets of keys, and a non-deterministic channel which supports concurrent sources and sinks.

Our approach is based on previous work by others on boosted and open-nested transactions, with one significant development: transactions are given types which denote the concurrency control algorithms they employ. Typed transactions offer a higher level of assurance for programmers reusing transactional code, and allow more flexible abstract concurrency control.
Abstract:
Personalised social matching systems can be seen as recommender systems that recommend people to others in social networks. However, with the rapid growth of users in social networks and the amount of information that a social matching system requires about its users, recommender system techniques have become insufficiently adept at matching users in social networks. This paper presents a hybrid social matching system that takes advantage of both collaborative and content-based concepts of recommendation. A clustering technique is used to reduce the number of users that the matching system needs to consider, and to overcome other problems from which social matching systems suffer, such as the cold-start problem caused by the absence of implicit information about a new user. The proposed system has been evaluated on a dataset obtained from an online dating website. Empirical analysis shows that the accuracy of the matching process increases when both user information (explicit data) and user behavior (implicit data) are used.
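The clustering step described above narrows the candidate pool before any pairwise matching is done. The sketch below illustrates that shape with pure-Python cosine similarity: assign each user to the nearest cluster centroid, then rank only same-cluster candidates. Function names, the fixed centroids, and the single similarity score standing in for combined explicit/implicit data are all illustrative assumptions, not the paper's algorithm.

```python
def cosine(a, b):
    """Cosine similarity between two profile vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

def assign_cluster(user, centroids):
    """Place a user in the most similar cluster, so the matcher only
    ever compares candidates within that cluster."""
    return max(range(len(centroids)), key=lambda i: cosine(user, centroids[i]))

def match(user, candidates, centroids, top_k=3):
    """Rank only the candidates sharing the user's cluster by profile
    similarity (a stand-in for blending explicit and implicit data)."""
    c = assign_cluster(user, centroids)
    pool = [(name, vec) for name, vec in candidates
            if assign_cluster(vec, centroids) == c]
    pool.sort(key=lambda nv: cosine(user, nv[1]), reverse=True)
    return [name for name, _ in pool[:top_k]]
```

Because a new user can be assigned to a cluster from explicit profile data alone, this structure also suggests how clustering mitigates the cold-start problem the abstract mentions.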
Abstract:
This paper attempts to develop a theoretical acceptance model for measuring Web personalization success. Key factors impacting Web personalization acceptance are identified from a detailed literature review. The final model is then cast in a structural equation modeling (SEM) framework comprising nineteen manifest variables, which are grouped into three focal behaviors of Web users. These variables could provide a framework for better understanding of the numerous factors that contribute to the success measures of Web personalization technology, especially those concerning the quality of personalized features and how personalized information can be delivered to the user through a personalized Website. The interrelationship between success constructs is also explained. Empirical validation of this theoretical model is expected in future research.
Abstract:
Evidence based practice (EBP) has been accepted as a process to assist health professionals in clinical decision making to improve patient outcomes. It requires applying skills in a prescribed sequence to critique existing practices. Many countries, including Australia, require nurses to demonstrate competencies in EBP skills to be registered. In the last ten years, this has led to universities incorporating EBP in undergraduate nursing degree courses. The literature reports many challenges, including students' difficulties in critically appraising research evidence, and their need for both simplification of the process and extensive support. The purpose of our study was to investigate the effectiveness of a standalone introductory EBP subject for a diverse group of third-year undergraduates, based on a novel but challenging approach to assessment. Despite many changes made in the second iteration of the subject, most students' perceptions of the subject's difficulty remained unchanged. This research aligns with the issues identified in the literature and has wider applicability to the teaching of rapidly changing disciplines, where evidence-driven consumers have easy access to information and expect up-to-date practices.
Abstract:
The INEX 2010 Focused Relevance Feedback track offered a refined approach to the evaluation of focused relevance feedback algorithms through simulated exhaustive user feedback. As in traditional approaches, we simulated a user-in-the-loop by re-using the assessments of ad-hoc retrieval obtained from real users who assess focused ad-hoc retrieval submissions. The evaluation was extended in several ways: the use of exhaustive relevance feedback over entire runs; the evaluation of focused retrieval where both the retrieval results and the feedback are focused; the evaluation was performed over a closed set of documents and complete focused assessments; the evaluation was performed over executable implementations of relevance feedback algorithms; and finally, the entire evaluation platform is reusable. We present the evaluation methodology, its implementation, and experimental results obtained for nine submissions from three participating organisations.