930 results for topological complexity
Abstract:
In topological mapping, perceptual aliasing can cause different places to appear indistinguishable to the robot. When odometry information is severely corrupted or unavailable, topological mapping is difficult because the robot faces the loop-closing problem: determining whether it has visited a particular place before. In this article we propose to use neighbourhood information to disambiguate otherwise indistinguishable places. This approach neither depends on a specific choice of sensors nor requires geometric information such as odometry. Local neighbourhood information is extracted from a sequence of observations of visited places. In experiments using either sonar or visual observations from an indoor environment, the benefits of using neighbourhood clues to disambiguate otherwise identical vertices are demonstrated. Over 90% of the maps we obtain are isomorphic to the ground truth. The choice of the robot’s sensors has little impact on the experimental results.
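The neighbourhood idea in the abstract above can be sketched in a few lines: re-label each observed place by the places seen around it along the robot's path, so aliased labels with different surroundings become distinct vertices. This is an illustrative sketch of the general idea, not the authors' algorithm; the function name, labels, and context radius are hypothetical.

```python
def disambiguate(observations, radius=1):
    """Split aliased place labels using local neighbourhood context.

    observations: raw place labels seen along the robot's path; two
    physically different places may share a label (perceptual aliasing).
    Each occurrence is re-labelled as (label, neighbourhood signature),
    so occurrences with different surroundings become distinct vertices.
    """
    refined = []
    for i, label in enumerate(observations):
        lo, hi = max(0, i - radius), min(len(observations), i + radius + 1)
        context = tuple(sorted(observations[j] for j in range(lo, hi) if j != i))
        refined.append((label, context))
    return refined

# The two "A" sightings at indices 0 and 2 have different neighbours
# and so become different vertices; indices 2 and 4 stay merged.
path = ["A", "B", "A", "C", "A", "B"]
print(disambiguate(path))
```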
Abstract:
This paper presents a general, global approach to the problem of robot exploration, using a topological data structure to guide an underlying Simultaneous Localization and Mapping (SLAM) process. A Gap Navigation Tree (GNT) is used to drive global target selection, and occluded regions of the environment (called “gaps”) are tracked probabilistically. The process of map construction and the motion of the vehicle alter both the shape and location of these regions. The use of online mapping is shown to reduce the difficulties in implementing the GNT.
Abstract:
Purpose---The aim of this study is to identify complexity measures for building projects in the People’s Republic of China (PRC). Design/Methodology/Approach---A three-round Delphi questionnaire survey was conducted to identify the key parameters that measure the degree of project complexity. A complexity index (CI) was developed based on the identified measures and their relative importance. Findings---Six key measures of project complexity were identified, namely: (1) building structure and function; (2) construction method; (3) urgency of the project schedule; (4) project size/scale; (5) geological conditions; and (6) neighboring environment. Practical implications---These complexity measures help stakeholders assess the degree of project complexity and better manage the risks that different levels of project complexity might induce. Originality/Value---The findings provide insightful perspectives on defining and understanding project complexity. For stakeholders, understanding and addressing complexity helps to improve project planning and implementation.
Abstract:
Increasingly, societies and their governments are facing important social issues that have science and technology as key features. A number of these socio-scientific issues have two features that distinguish them from the restricted contexts in which school science has traditionally been presented: some of their science is uncertain, and scientific knowledge is not the only knowledge involved. As a result, the concepts of uncertainty, risk and complexity become essential aspects of the science underlying these issues. In this chapter we discuss the nature and role of these concepts in the public understanding of science and consider their links with school science. We argue that these same concepts and their role in contemporary scientific knowledge need to be addressed in school science curricula. The implications of this urgent challenge for the content, pedagogy and assessment of school science are outlined. Addressing them will be essential if the goal of science education for citizenship is to be achieved with our students, who will increasingly be required to make personal and collective decisions on issues involving science and technology.
Abstract:
Topographic structural complexity of a reef is highly correlated with coral growth rates, coral cover and overall levels of biodiversity, and is therefore integral to determining ecological processes. Modeling these processes commonly relies on measures of rugosity obtained from a wide range of survey techniques that often fail to capture rugosity at different spatial scales. Here we show that accurate estimates of rugosity can be obtained from video footage captured with underwater video cameras (i.e., monocular video). To demonstrate the accuracy of our method, we compared the results to in situ measurements of a 2 m × 20 m area of forereef on Glovers Reef atoll in Belize. Sequential pairs of images were used to compute fine-scale bathymetric reconstructions of the reef substrate, from which precise measurements of rugosity and reef topographic structural complexity can be derived across multiple spatial scales. To achieve accurate bathymetric reconstructions from uncalibrated monocular video, the position of the camera for each image in the video sequence and the intrinsic parameters (e.g., focal length) must be computed simultaneously. We show that these parameters can often be determined when the data exhibit parallax-type motion, and that rugosity and reef complexity can be accurately computed from existing video sequences taken with any type of underwater camera in any reef habitat or location. This technique opens a wide array of possibilities for future coral reef research by providing a cost-effective and automated method of determining structural complexity and rugosity in both new and historical video surveys of coral reefs.
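Once a bathymetric profile has been reconstructed, linear rugosity follows the standard definition: contoured surface length divided by planar length. A minimal sketch of that final step, assuming the abstract's reconstruction has already produced evenly sampled depths (the function name and sampling interval are illustrative, not from the paper):

```python
import math

def rugosity(depths, dx):
    """Linear rugosity of a bathymetric profile: contoured length over
    planar length. depths are reconstructed heights (metres) sampled
    every dx metres, e.g. from a structure-from-motion reconstruction.
    A perfectly flat profile gives 1.0; rougher substrate gives more."""
    contour = sum(math.hypot(dx, depths[i + 1] - depths[i])
                  for i in range(len(depths) - 1))
    planar = dx * (len(depths) - 1)
    return contour / planar

flat = [0.0] * 5
bumpy = [0.0, 0.3, 0.0, 0.3, 0.0]
print(rugosity(flat, 0.1), rugosity(bumpy, 0.1))
```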
Abstract:
Accepting that culture and language are interrelated in second language learning (SLL), web sites should be designed to integrate cultural aspects. Yet many SLL web sites fail to integrate cultural aspects and/or focus on language acquisition only. This study identified three issues: (1) the anthropologists’ cultural models commonly adopted in cross-cultural web user interfaces have been used only superficially; (2) web designers treat culture as something fixed that needs to be modeled into interface design elements; so (3) there is a need for a communication framework between educators and design practitioners that can be utilized in web design processes. This paper discusses what anthropology can contribute to language learning, mediated through web design processes, and suggests a cultural user experience framework for web-based SLL by presenting an exemplary matrix. To evaluate the effectiveness of the framework, the key stakeholders (learners, teachers, and designers) participated in a case scenario-based evaluation. The results suggest that the framework can enhance effective communication and collaboration for cultural integration.
Abstract:
In this research we used inductive reasoning through design to understand how stakeholders in the Waterfall Way (New South Wales, Australia) perceive the relationships between themselves and the place they live in. This paper describes a collaborative design methodology used to draw out information about local identities, which guided the regional branding exercise. The methodology is explicit about the uncertainties and complexities of the design process and of its reception system. As such, it aims to engage local stakeholders and experts in order to help elicit tacit knowledge and identify system patterns and trends that would likely not be visible if a top-down, expert-based process were used. Through collective design, local people were drawn together in search of a symbol to represent the meaning attached to their places/region in relation to sustainable tourism activity.
Abstract:
Underlying all assessments are human judgements regarding the quality of students’ understandings. Despite their ubiquity, those judgements are conceptually elusive. The articles selected for inclusion in this issue explore the complexity of judgement practice, raising critical questions that challenge existing views and accepted policy and practice.
Abstract:
Student performance on examinations is influenced by the level of difficulty of the questions. It therefore seems reasonable to propose that assessment of the difficulty of exam questions could be used to gauge the level of skills and knowledge expected at the end of a course. This paper reports the results of a study investigating the difficulty of exam questions using a subjective assessment of difficulty and a purpose-built exam question complexity classification scheme. The scheme, devised for exams in introductory programming courses, assesses the complexity of each question using six measures: external domain references, explicitness, linguistic complexity, conceptual complexity, length of code involved in the question and/or answer, and intellectual complexity (Bloom level). We apply the scheme to 20 introductory programming exam papers from five countries and find substantial variation across the exams on all measures. Most exams include a mix of questions of low, medium, and high difficulty, although seven of the 20 have no questions of high difficulty. All of the complexity measures correlate with assessed difficulty, indicating that the difficulty of an exam question relates to each of these more specific measures. We discuss the implications of these findings for the development of measures to assess learning standards in programming courses.
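A multi-measure scheme like the one above lends itself to a simple aggregate score. The sketch below rates a question on the six named measures and averages them; the rating scale, the unweighted mean, and all identifiers are hypothetical illustrations, not the paper's actual aggregation.

```python
# Hypothetical aggregation sketch: rate each question 1-3 on the six
# measures from the classification scheme and take the unweighted mean.
MEASURES = ("external_domain_refs", "explicitness", "linguistic",
            "conceptual", "code_length", "bloom_level")

def complexity_score(ratings):
    """Mean rating across all six measures; refuses incomplete ratings."""
    missing = set(MEASURES) - set(ratings)
    if missing:
        raise ValueError(f"unrated measures: {sorted(missing)}")
    return sum(ratings[m] for m in MEASURES) / len(MEASURES)

question = {"external_domain_refs": 1, "explicitness": 2, "linguistic": 1,
            "conceptual": 3, "code_length": 2, "bloom_level": 3}
print(complexity_score(question))  # 2.0
```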
Abstract:
Automated process discovery techniques aim to extract models from information system logs in order to shed light on the business processes supported by these systems. Existing techniques in this space are effective when applied to relatively small or regular logs, but otherwise generate large, spaghetti-like models. In previous work, trace clustering has been applied in an attempt to reduce the size and complexity of automatically discovered process models. The idea is to split the log into clusters and to discover one model per cluster. The result is a collection of process models -- each representing a variant of the business process -- as opposed to an all-encompassing model. Still, models produced in this way may exhibit unacceptably high complexity. In this setting, this paper presents a two-way divide-and-conquer process discovery technique, wherein the discovered process models are split on the one hand by variants and on the other hand hierarchically by means of subprocess extraction. The proposed technique allows users to set a desired bound on the complexity of the produced models. Experiments on real-life logs show that the technique produces collections of models that are up to 64% smaller than those extracted under the same complexity bounds by existing trace clustering techniques.
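The divide-and-conquer idea with a user-set complexity bound can be illustrated in miniature: keep splitting a log of traces until each cluster's would-be model fits under the bound. This is a deliberately crude sketch, not the paper's technique; model complexity is proxied by the number of distinct activities, and sorting stands in for real trace clustering.

```python
# Illustrative divide-and-conquer sketch (not the paper's algorithm):
# split a log in two whenever the model that would be discovered from
# it exceeds a complexity bound, here proxied by distinct activities.
def split_until_bound(traces, bound):
    complexity = len({a for t in traces for a in t})
    if complexity <= bound or len(traces) <= 1:
        return [traces]
    traces = sorted(traces)          # crude stand-in for trace clustering
    mid = len(traces) // 2
    return (split_until_bound(traces[:mid], bound)
            + split_until_bound(traces[mid:], bound))

log = [("a", "b", "c"), ("a", "b", "d"), ("x", "y"), ("x", "z")]
clusters = split_until_bound(log, bound=3)
print(clusters)  # three clusters, each within the bound
```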
Abstract:
We consider the problem of maximizing secure connectivity in wireless ad hoc networks, and analyze the complexity of the post-deployment key establishment process constrained by physical-layer properties such as connectivity, energy consumption and interference. Two approaches, based on graph augmentation problems with nonlinear edge costs, are formulated. The first establishes a secret key using only the links that are already secured by shared keys. This problem is NP-hard and does not admit a polynomial-time approximation scheme (PTAS), since the minimum cutsets to be augmented do not admit constant costs. The second extends the first by increasing the power level between a pair of nodes that share a secret key, enabling them to connect physically. This problem can be formulated as the optimal key establishment problem with interference constraints and two objectives: (i) maximizing the concurrent key establishment flow, and (ii) minimizing the cost. We prove that both problems are NP-hard and MAX-SNP-hard via a reduction from the MAX3SAT problem.
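The feasibility question underlying both augmentation formulations can be sketched directly: build the subgraph of key-secured links and find its connected components; key establishment must then bridge the components. This sketch shows only that preliminary check (all names are hypothetical); the hard part the abstract addresses is choosing the augmenting links under nonlinear edge costs and interference constraints.

```python
# Connected components of the key-secured subgraph, via iterative DFS.
def secure_components(nodes, secure_links):
    adj = {n: [] for n in nodes}
    for u, v in secure_links:
        adj[u].append(v)
        adj[v].append(u)
    seen, comps = set(), []
    for n in nodes:
        if n in seen:
            continue
        stack, comp = [n], set()
        while stack:
            u = stack.pop()
            if u in comp:
                continue
            comp.add(u)
            stack.extend(adj[u])
        seen |= comp
        comps.append(comp)
    return comps

comps = secure_components([1, 2, 3, 4], [(1, 2), (3, 4)])
print(len(comps))  # 2: key establishment must bridge the two components
```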