162 results for Information processing
Abstract:
This paper describes a method for measuring the creative potential of computer games. The research approach applies a behavioral and verbal protocol to analyze the factors that influence the creative processes used by people as they play computer games from the puzzle genre. Creative potential is measured by examining task motivation and domain-relevant and creativity-relevant skills. This paper focuses on the reliability of the factors used for measurement, determining those factors that are more strongly related to creativity. The findings show that creative potential may be determined by examining the relationship between skills required and the effect of intrinsic motivation within game play activities.
Abstract:
The mining environment, being complex, irregular and time varying, presents a challenging prospect for stereo vision. The objective is to produce a stereo vision sensor suited to close-range scenes consisting primarily of rocks. This sensor should be able to produce a dense depth map within real-time constraints. Speed and robustness are of foremost importance for this investigation. A number of area-based matching metrics have been implemented, including the SAD, SSD, NCC, and their zero-meaned versions. The NCC and the zero-meaned SAD and SSD were found to produce the disparity maps with the highest proportion of valid matches. The plain SAD and SSD were the least computationally expensive, since all their operations take place in integer arithmetic; however, they were extremely sensitive to radiometric distortion. Non-parametric matching techniques, in particular the rank and census transforms, have also been investigated. The rank and census transforms were found to be robust with respect to radiometric distortion, as well as being able to produce disparity maps with a high proportion of valid matches. An additional advantage of both transforms is their amenability to fast hardware implementation.
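To make the area-based metrics concrete, the following minimal sketch shows the SAD and zero-meaned SAD costs described above and a brute-force disparity pick. It is illustrative only: the function names and patch-based interface are assumptions, not the sensor's actual implementation.

```python
import numpy as np

def sad(left, right):
    """Sum of Absolute Differences between two image patches (integer arithmetic)."""
    return np.sum(np.abs(left.astype(int) - right.astype(int)))

def zsad(left, right):
    """Zero-meaned SAD: subtracting each patch's mean makes the cost
    insensitive to a constant brightness offset between the cameras."""
    l = left.astype(float) - left.mean()
    r = right.astype(float) - right.mean()
    return np.sum(np.abs(l - r))

def best_disparity(left_patch, right_row_patches, metric=sad):
    """Pick the candidate disparity whose right-image patch minimises the cost."""
    scores = [metric(left_patch, p) for p in right_row_patches]
    return int(np.argmin(scores))
```

Note that `zsad` scores two patches differing only by a brightness offset as a perfect match, while plain `sad` does not, which is the radiometric sensitivity the abstract reports.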
Abstract:
The mining environment, being complex, irregular and time varying, presents a challenging prospect for stereo vision. For this application, speed, reliability, and the ability to produce a dense depth map are of foremost importance. This paper assesses the suitability of a number of matching techniques for use in a stereo vision sensor for close-range scenes consisting primarily of rocks. These include traditional area-based matching metrics and non-parametric transforms, in particular the rank and census transforms. Experimental results show that the rank and census transforms exhibit a number of clear advantages over area-based matching metrics, including their low computational complexity, and robustness to certain types of distortion.
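The census transform's robustness comes from using only ordinal comparisons between pixels. A small sketch, with a wrap-around border handling chosen purely for brevity (the real sensor would handle borders differently):

```python
import numpy as np

def census_transform(img, radius=1):
    """Census transform: each pixel becomes a bit string encoding whether
    each neighbour is darker than the centre pixel. The comparisons are
    purely ordinal, so any monotone brightness change (gain/offset) leaves
    the transform unchanged. Borders wrap around via np.roll for brevity."""
    h, w = img.shape
    out = np.zeros((h, w), dtype=np.uint32)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            if dy == 0 and dx == 0:
                continue
            shifted = np.roll(np.roll(img, dy, axis=0), dx, axis=1)
            out = (out << 1) | (shifted < img)
    return out

def hamming_cost(a, b):
    """Matching cost between two census codes: number of differing bits."""
    x = np.bitwise_xor(a, b)
    return sum(((x >> i) & 1).sum() for i in range(32))
```

Because the transform outputs bit strings compared by Hamming distance, it maps naturally onto XOR/pop-count logic, which is the hardware-friendliness the paper highlights.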
Abstract:
As business process management technology matures, organisations acquire more and more business process models. The management of the resulting collections of process models poses real challenges. One of these challenges concerns model retrieval where support should be provided for the formulation and efficient execution of business process model queries. As queries based on only structural information cannot deal with all querying requirements in practice, there should be support for queries that require knowledge of process model semantics. In this paper we formally define a process model query language that is based on semantic relationships between tasks in process models and is independent of any particular process modelling notation.
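A toy illustration of the idea of querying by semantic task relationships rather than model structure. The relation encoding and model names here are assumptions for the sketch, not the paper's formal query language.

```python
# Each model is summarised by its semantic task relationships, e.g. which
# task pairs stand in an (assumed) "always eventually followed by" relation.
models = {
    "order_process_v1": {("receive", "ship"), ("receive", "bill")},
    "order_process_v2": {("receive", "bill")},
}

def query(models, relation):
    """Return the names of all models containing the given task relationship."""
    return [name for name, rels in models.items() if relation in rels]

matches = query(models, ("receive", "ship"))  # ["order_process_v1"]
```

Because the query inspects task semantics rather than graph structure, it would return the same answer however each model happens to be drawn in its notation, which is the notation-independence the paper aims for.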
Abstract:
Internet chatrooms are common means of interaction and communications, and they carry valuable information about formal or ad-hoc formation of groups with diverse objectives. This work presents a fully automated surveillance system for data collection and analysis in Internet chatrooms. The system has two components. First, it has an eavesdropping tool which collects statistics on individual (chatter) and chatroom behavior. This data can be used to profile a chatroom and its chatters. Second, it has a computational discovery algorithm based on Singular Value Decomposition (SVD) to locate hidden communities and communication patterns within a chatroom. The eavesdropping tool is used for fine-tuning the SVD-based discovery algorithm, which can be deployed in real-time and requires no semantic information processing. The evaluation of the system on real data shows that (i) statistical properties of different chatrooms vary significantly, so profiling is possible, and (ii) the SVD-based algorithm discovers groups of chatters with up to 70-80% accuracy.
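A minimal sketch of the SVD idea on synthetic data: chatters with similar activity patterns end up with similar coordinates in the leading singular vectors, so hidden groups can be read off without any semantic processing. The activity matrix and the sign-based grouping are illustrative assumptions, not the paper's actual algorithm.

```python
import numpy as np

# Rows: chatters; columns: time windows; entry: message count. Two latent
# groups are planted: chatters 0-2 are active early, chatters 3-5 late.
activity = np.array([
    [5, 6, 5, 0, 0, 1],
    [4, 5, 6, 1, 0, 0],
    [6, 4, 5, 0, 1, 0],
    [0, 1, 0, 5, 6, 4],
    [1, 0, 0, 6, 5, 5],
    [0, 0, 1, 4, 6, 5],
], dtype=float)

U, s, Vt = np.linalg.svd(activity, full_matrices=False)

# The first left singular vector mainly captures overall activity level;
# the second contrasts the two communities, so its sign partitions chatters.
groups = U[:, 1] > 0
```

On this toy matrix the sign of the second singular vector cleanly separates chatters 0-2 from 3-5; real chatroom data is noisier, which is where the eavesdropping statistics help tune the algorithm.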
Abstract:
Communication processes are vital in the lifecycle of BPM projects. With this in mind, much research has been performed into facilitating this key component between stakeholders. Amongst the methods used to support this process are personalised process visualisations. In this paper, we review the development of this visualisation trend, and then propose a theoretical analysis framework based upon communication theory. We use this framework to provide theoretical support for the conjecture that 3D virtual worlds are powerful tools for communicating personalised visualisations of processes within a workplace. Meta-requirements are then derived and applied, via 3D virtual world functionalities, to generate example visualisations containing personalised aspects, which we believe enhance the process of communication between analysts and stakeholders in BPM process (re)design activities.
Abstract:
This paper describes the implementation of the first portable, embedded data acquisition unit (BabelFuse) that is able to acquire and timestamp generic sensor data and trigger General Purpose I/O (GPIO) events against a microsecond-accurate wirelessly-distributed ‘global’ clock. A significant issue encountered when fusing data received from multiple sensors is the accuracy of the timestamp associated with each piece of data. This is particularly important in applications such as Simultaneous Localisation and Mapping (SLAM) where vehicle velocity forms an important part of the mapping algorithms; on fast-moving vehicles, even millisecond inconsistencies in data timestamping can produce errors which need to be compensated for. The timestamping problem is compounded in a robot swarm environment especially if non-deterministic communication hardware (such as IEEE-802.11-based wireless) and inaccurate clock synchronisation protocols are used. The issue of differing timebases makes correlation of data difficult and prevents the units from reliably performing synchronised operations or manoeuvres. By utilising hardware-assisted timestamping, clock synchronisation protocols based on industry standards and firmware designed to minimise indeterminism, an embedded data acquisition unit capable of microsecond-level clock synchronisation is presented.
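Illustrative arithmetic only: the sketch below shows how a small timestamping error corrupts a velocity estimate, and how mapping local timestamps onto a shared global clock recovers it. The numbers and the `to_global` helper are assumptions for the sketch, not part of the BabelFuse design.

```python
def velocity(p0, p1, t0, t1):
    """Straight-line velocity estimate from two timestamped positions (m/s)."""
    return (p1 - p0) / (t1 - t0)

# Vehicle moving at 10 m/s, positions sampled 10 ms apart.
true_v = velocity(0.0, 0.10, 0.000, 0.010)

# A 2 ms error in the second timestamp skews the estimate to ~8.3 m/s.
skewed_v = velocity(0.0, 0.10, 0.000, 0.012)

def to_global(local_t, offset):
    """Map a unit's local timestamp onto the shared timebase (drift omitted)."""
    return local_t + offset

# Second position reported by a unit whose clock lags the global clock by
# 2 ms: correcting for the known offset restores the true velocity.
corrected_v = velocity(0.0, 0.10, 0.000, to_global(0.008, 0.002))
```

Even this 2 ms discrepancy changes the estimate by roughly 17%, which is why microsecond-level synchronisation matters on fast-moving vehicles.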
Abstract:
Norms regulate the behaviour of their subjects and define what is legal and what is illegal. Norms typically describe the conditions under which they are applicable and the normative effects that arise as a result of their application. On the other hand, process models specify how a business operation or service is to be carried out to achieve a desired outcome. Norms can have significant impact on how business operations are conducted and they can apply to the whole or part of a business process. For example, they may impose conditions on the different aspects of a process (e.g., perform tasks in a specific sequence (control-flow), at a specific time or within a certain time frame (temporal aspect), by specific people (resources)). We propose a framework that provides the formal semantics of the normative requirements for determining whether a business process complies with a normative document (where a normative document can be understood in a very broad sense, ranging from internal policies to best practice policies, to statutory acts). We also present a classification of normative requirements based on the notion of different types of obligations and the effects of violating these obligations.
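As a toy illustration of checking one obligation type over a process execution: the function below tests an "achievement"-style obligation, i.e. that a required task occurs before a deadline task. The task names, trace encoding, and obligation type chosen are assumptions for the sketch, not the paper's formal semantics.

```python
def complies_achievement(trace, required, deadline):
    """True if `required` occurs before `deadline` in the executed task trace
    (or occurs at all, when the deadline task never happens)."""
    if deadline not in trace:
        return required in trace
    return required in trace[:trace.index(deadline)]

trace = ["receive_order", "check_credit", "approve", "ship"]
ok = complies_achievement(trace, required="check_credit", deadline="ship")
```

The same trace-checking style extends to the other aspects the paper lists: sequence conditions constrain control-flow order, temporal conditions compare timestamps, and resource conditions check who performed each task.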
Abstract:
Urban agriculture plays an important role in many facets of food security, health and sustainability. The city farm is one such manifestation of urban agriculture: it functions as a location centric social hub that supplies food, education, and opportunities for strengthening the diverse sociocultural fabrics of the local community. This paper presents the case of Northey Street City Farm in Brisbane, Australia as an opportunity space for design. The paper identifies four areas that present key challenges and opportunities for HCI design that support social sustainability of the city farm: a preference for face-to-face contact leads to inconsistencies in shared knowledge; a dependence on volunteers and very limited resources necessitates easily accessible interventions; other local urban agricultural activity needs greater visibility; and the vulnerability of the physical location to natural phenomena, in this instance flooding, presents a design challenge and a need to consider disaster management.
Abstract:
Educational reforms currently being enacted in Kuwaiti Family and Consumer Sciences (FCS) in response to contemporary demands for increased student-centred teaching and learning are challenging for FCS teachers due to their limited experience with student-centred learning tools such as Graphic Organisers (GOs). To adopt these reforms, Kuwaiti teachers require a better understanding of and competency in promoting cognitive learning processes that will maximise student-centred learning approaches. This study followed the experiences of four Grade 6 FCS Kuwaiti teachers as they undertook a Professional Development (PD) program specifically designed to advance their understanding of the use of GOs and then as they implemented what they had learned in their Grade 6 FCS classrooms. The PD program developed for this study was informed by Nasseh's competency PD model as well as Piaget and Ausubel's cognitive theories. This model enabled an assessment and evaluation of the development of the teachers' competencies as an outcome of the PD program in terms of the adoption of GOs, in particular, and their capacity to use GOs to engage students in personalised, in-depth learning through critical thinking and understanding. The research revealed that the PD program was influential in reforming the teachers' learning of, understanding of, and competency in cognitive and visual theories of learning, so that they facilitated student-centred teaching and learning processes that enabled students to adopt and adapt GOs in constructivist learning. The implementation of five GOs - Flow Chart, Concept Maps, K-W-L Chart, Fishbone Diagram and Venn Diagram - as learning tools in classrooms was investigated to find whether changes in pedagogical approach for supporting conceptual learning through cognitive information processing would reduce the cognitive workload of students and produce better learning approaches.
The study, as evidenced by the participant teachers' responses and classroom observations, showed a marked increase in student interest, participation, critical thought and problem-solving skills as a result of using GOs, compared with traditional teaching and learning methods. A theoretical model was developed from the study based on the premise that teachers' knowledge of the subject, pedagogy and student learning precedes the implementation of student-centred learning reform, that it plays an important role in the implementation of student-centred learning and that it brings about a change in teaching practice. The model affirmed that observed change in teaching practice included aspects of teachers' beliefs, as well as confidence and effects on the workplace and on student learning, including engagement, understanding, critical thinking and problem solving. The model assumed that change in teaching practice is inseparable from teachers' lifelong PD needs related to knowledge, understanding, skills and competency. These findings produced a set of preliminary guidelines for establishing student-centred constructivist strategies in Kuwaiti education while retaining Kuwait's cultural uniqueness.
Abstract:
For any discipline to be regarded as a professional undertaking by which its members may be treated as true "professionals" in a specific area, practitioners must clearly understand that discipline's history, the place and significance of that history in current practice, and its relevance to the technologies and artefacts available at the time. This is common for many professional disciplines such as medicine, pharmacy, engineering, law and so on, but not yet, this paper submits, in information technology. Based on twenty-five years of experience in developing and delivering Cybersecurity courses at undergraduate and postgraduate levels, this paper proposes a rationale and set of differing perspectives for the planning and development of curricula relevant to the delivery of appropriate courses in the history of cybersecurity or information assurance to information and communications technology (ICT) students and thus to potential information technology professionals.
Abstract:
Privacy is an important component of freedom and plays a key role in protecting fundamental human rights. It is becoming increasingly difficult to ignore the fact that without appropriate levels of privacy, a person's rights are diminished. Users want to protect their privacy - particularly in "privacy invasive" areas such as social networks. However, Social Network users seldom know how to protect their own privacy through online mechanisms. What is required is an emerging concept that provides users with legitimate control over their own personal information, whilst preserving and maintaining the advantages of engaging with online services such as Social Networks. This paper reviews "Privacy by Design (PbD)" and shows how it applies to diverse privacy areas. Such an approach will move towards mitigating many of the privacy issues in online information systems and can be a potential pathway for protecting users' personal information. The research has posed many questions in need of further investigation for different open source distributed Social Networks. Findings from this research will lead to a novel distributed architecture that provides more transparent and accountable privacy for the users of online information systems.
Abstract:
Evolutionary theory predicts that herbivorous insects should lay eggs on plants in a way that reflects the suitability of each plant species for larval development. Empirical studies, however, often fail to find any relationship between an adult insect’s choice of host–plant and offspring fitness, and in such cases, it is generally assumed that other ‘missing’ factors (e.g. predation, host–plant abundance, learning and adult feeding sites) must be contributing to overall host suitability. Here, I consider an alternative theory – that a fitness cost inherent in the olfactory mechanism could constrain the evolution of insect host selection. I begin by reviewing current knowledge of odour processing in the insect antennal lobe with the aid of a simple schematic: the aim being to explain the workings of this mechanism to scientists who do not have prior knowledge in this field. I then use the schematic to explore how an insect’s perception of host and non-host odours is governed by a set of processing rules, or algorithm. Under the assumptions of this mechanistic view, the perception of every plant odour is interrelated, and seemingly bad host choices can still arise as part of an overall adaptive behavioural strategy. I discuss how an understanding of mechanism can improve the interpretation of theoretical and empirical studies in insect behaviour and evolution.
Abstract:
Process-aware information systems (PAISs) can be configured using a reference process model, which is typically obtained via expert interviews. Over time, however, contextual factors and system requirements may cause the operational process to start deviating from this reference model. While a reference model should ideally be updated to remain aligned with such changes, this is a costly and often neglected activity. We present a new process mining technique that automatically improves the reference model on the basis of the observed behavior as recorded in the event logs of a PAIS. We discuss how to balance the four basic quality dimensions for process mining (fitness, precision, simplicity and generalization) and a new dimension, namely the structural similarity between the reference model and the discovered model. We demonstrate the applicability of this technique using a real-life scenario from a Dutch municipality.
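One way to picture balancing the quality dimensions is as a weighted score over candidate repaired models. This is a hedged sketch: the linear combination, the weights, and the example scores are illustrative assumptions, not the paper's actual balancing method.

```python
def model_score(fitness, precision, simplicity, generalization,
                structural_similarity, weights=(0.3, 0.2, 0.1, 0.1, 0.3)):
    """Weighted combination of the four classic process-mining quality
    dimensions plus structural similarity to the reference model.
    All inputs are assumed normalised to [0, 1]."""
    dims = (fitness, precision, simplicity, generalization,
            structural_similarity)
    return sum(w * d for w, d in zip(weights, dims))

# A candidate that fits the event log slightly better but diverges from the
# reference model can lose to one that stays structurally closer to it.
a = model_score(0.95, 0.8, 0.7, 0.7, 0.4)
b = model_score(0.90, 0.8, 0.7, 0.7, 0.8)
```

Weighting structural similarity alongside the classic four dimensions captures the paper's point: an improved reference model should reflect observed behaviour without drifting unrecognisably far from the model stakeholders already know.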
Abstract:
Cross-Lingual Link Discovery (CLLD) is a new problem in Information Retrieval. The aim is to automatically identify meaningful and relevant hypertext links between documents in different languages. This is particularly helpful in knowledge discovery if a multi-lingual knowledge base is sparse in one language or another, or the topical coverage in each language is different; such is the case with Wikipedia. Techniques for identifying new and topically relevant cross-lingual links are a current topic of interest at NTCIR, where the CrossLink task has been running since NTCIR-9 in 2011. This paper presents the evaluation framework for benchmarking algorithms for cross-lingual link discovery evaluated in the context of NTCIR-9. This framework includes topics, document collections, assessments, metrics, and a toolkit for pooling, assessment, and evaluation. The assessments are further divided into two separate sets: manual assessments performed by human assessors, and automatic assessments based on links extracted from Wikipedia itself. Using this framework, we show that manual assessment is more robust than automatic assessment in the context of cross-lingual link discovery.
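A minimal sketch of scoring discovered links against an assessment set, in the style of pooled IR evaluation. The link format, the example anchors, and the choice of precision@k as the metric are assumptions for illustration, not the CrossLink toolkit's actual metrics.

```python
def precision_at_k(ranked_links, relevant, k):
    """Fraction of the top-k proposed cross-lingual links judged relevant."""
    top = ranked_links[:k]
    return sum(1 for link in top if link in relevant) / k

# Hypothetical system output (ranked) and a manual assessment set.
proposed = ["en:Cat->zh:Mao", "en:Cat->zh:Dog", "en:Cat->zh:Tiger"]
manual_qrels = {"en:Cat->zh:Mao", "en:Cat->zh:Tiger"}

p3 = precision_at_k(proposed, manual_qrels, 3)  # 2/3
```

Running the same scoring against manual and Wikipedia-derived assessment sets makes their disagreement measurable, which underpins the paper's finding that manual assessment is the more robust of the two.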