164 results for Keywords: Gallai graphs, anti-Gallai graphs,
Abstract:
In recent years, considerable research effort has been directed to microarray technologies and their role in providing simultaneous information on expression profiles for thousands of genes. These data, when subjected to clustering and classification procedures, can assist in identifying patterns and providing insight into biological processes. To understand the properties of complex gene expression datasets, graphical representations can be used. Intuitively, the data can be represented as a bipartite graph, with weighted edges corresponding to gene-sample node couples in the dataset. Biologically meaningful subgraphs can be sought, but performance is influenced both by the search algorithm and by the graph-weighting scheme, and both merit rigorous investigation. In this paper, we focus on edge-weighting schemes for the bipartite graphical representation of gene expression. Two novel methods are presented: the first is based on empirical evidence; the second on a geometric distribution. The schemes are compared for several real datasets, assessing efficiency of performance on four essential properties: robustness to noise and missing values, discrimination, parameter influence on scheme efficiency, and reusability. Recommendations and limitations are briefly discussed. Keywords: Edge-weighting; weighted graphs; gene expression; bi-clustering
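The weighted bipartite representation described in this abstract can be sketched in a few lines. The edge-weighting used below (per-gene min-max normalisation) is a hypothetical placeholder for illustration only, not the paper's empirical or geometric scheme; `bipartite_edges` and the toy data are likewise invented for the sketch.

```python
# Sketch of a weighted bipartite gene-sample graph; the per-gene min-max
# weighting is a hypothetical placeholder, not the paper's empirical or
# geometric scheme.

def bipartite_edges(expr):
    """expr: {gene: {sample: expression value}} -> list of (gene, sample, w)."""
    edges = []
    for gene, samples in expr.items():
        lo, hi = min(samples.values()), max(samples.values())
        span = (hi - lo) or 1.0        # flat genes get zero-weight edges
        for sample, value in samples.items():
            edges.append((gene, sample, (value - lo) / span))
    return edges

expr = {"gene1": {"s1": 2.0, "s2": 8.0},   # toy expression matrix
        "gene2": {"s1": 5.0, "s2": 5.0}}
edges = bipartite_edges(expr)
```

Each edge links one gene node to one sample node, so any bi-clustering search can then look for dense, heavy subgraphs in this structure.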
Abstract:
We present a method for topological SLAM that specifically targets loop closing for edge-ordered graphs. Instead of using a heuristic approach to accept or reject loop closing, we propose a probabilistically grounded multi-hypothesis technique that relies on the incremental construction of a map/state hypothesis tree. Loop closing is introduced automatically within the tree expansion, and likely hypotheses are chosen based on their posterior probability after a sequence of sensor measurements. Careful pruning of the hypothesis tree keeps the growing number of hypotheses under control and a recursive formulation reduces storage and computational costs. Experiments are used to validate the approach.
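A minimal sketch of the hypothesis-tree idea from this abstract, assuming a binary close/no-close decision at each expansion step and a made-up measurement likelihood; `expand_and_prune`, `likelihood` and the numbers are illustrative inventions, not the paper's probabilistic model.

```python
# Toy sketch of multi-hypothesis tree expansion with pruning: each map
# hypothesis branches on a loop-closure decision, is reweighted by a
# (hypothetical) measurement likelihood, renormalised, then pruned.
import heapq

def expand_and_prune(hypotheses, likelihood, max_keep=3):
    """hypotheses: list of (prob, decisions) pairs."""
    children = []
    for prob, decisions in hypotheses:
        for choice in ("close", "no_close"):
            branch = decisions + (choice,)
            children.append((prob * likelihood(branch), branch))
    total = sum(p for p, _ in children) or 1.0
    children = [(p / total, d) for p, d in children]   # posterior over branches
    return heapq.nlargest(max_keep, children)          # keep likely hypotheses

# hypothetical likelihood: the latest measurement slightly favours closing
like = lambda decisions: 0.7 if decisions[-1] == "close" else 0.3
hyps = expand_and_prune([(1.0, ())], like)
```

Repeated calls grow the tree one sensor measurement at a time, while `max_keep` plays the role of the careful pruning that keeps the number of hypotheses under control.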
Abstract:
This paper describes the use of property graphs for mapping data between AEC software tools, which are not linked by common data formats and/or other interoperability measures. The intention of introducing this in practice, education and research is to facilitate the use of diverse, non-integrated design and analysis applications by a variety of users who need to create customised digital workflows, including those who are not expert programmers. Data model types are examined by way of supporting the choice of directed, attributed, multi-relational graphs for such data transformation tasks. A brief exemplar design scenario is also presented to illustrate the concepts and methods proposed, and conclusions are drawn regarding the feasibility of this approach and directions for further research.
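A directed, attributed, multi-relational graph of the kind this abstract selects can be modelled very simply. The `PropertyGraph` class and the wall/room example below are illustrative inventions, not an API from any AEC tool or data format.

```python
# Minimal sketch of a directed, attributed, multi-relational property graph;
# node ids, edge labels and attributes are illustrative only.

class PropertyGraph:
    def __init__(self):
        self.nodes = {}     # node id -> attribute dict
        self.edges = []     # (source id, relation label, target id, attributes)

    def add_node(self, nid, **attrs):
        self.nodes[nid] = attrs

    def add_edge(self, src, label, dst, **attrs):
        self.edges.append((src, label, dst, attrs))

    def out_edges(self, nid, label=None):
        return [e for e in self.edges
                if e[0] == nid and (label is None or e[1] == label)]

g = PropertyGraph()
g.add_node("wall_01", type="Wall", height_mm=2700)
g.add_node("space_A", type="Room")
g.add_edge("wall_01", "bounds", "space_A", side="north")
```

Because both nodes and edges carry arbitrary key-value attributes, data from one design tool can be annotated and re-queried when mapping it into another.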
Abstract:
Detecting anomalies in online social networks is a significant task, as it helps reveal useful and interesting information about user behavior on the network. This paper proposes a rule-based hybrid method using graph theory, fuzzy clustering and fuzzy rules for modeling the user relationships inherent in online social networks and for identifying anomalies. Fuzzy C-Means clustering is used to cluster the data, and a fuzzy inference engine is used to generate rules based on cluster behavior. The proposed method achieves improved accuracy in identifying anomalies compared with existing methods.
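The Fuzzy C-Means step itself can be sketched compactly. The toy implementation below works on one-dimensional data with a deterministic initialisation and fuzzifier m = 2; the paper's features, distance measure and rule generation are not reproduced.

```python
# Toy sketch of Fuzzy C-Means on 1-D data; the paper's feature extraction
# and fuzzy inference rules are not shown here.

def fcm(points, c=2, m=2.0, iters=50):
    lo, hi = min(points), max(points)
    # deterministic initial centers spread across the data range
    centers = [lo + (hi - lo) * i / (c - 1) for i in range(c)]
    u = []
    for _ in range(iters):
        u = []
        for x in points:
            d = [abs(x - ck) or 1e-12 for ck in centers]   # avoid zero distance
            u.append([1.0 / sum((d[i] / d[j]) ** (2.0 / (m - 1.0))
                                for j in range(c))
                      for i in range(c)])
        # centers become membership-weighted means
        centers = [sum(u[k][i] ** m * points[k] for k in range(len(points)))
                   / sum(u[k][i] ** m for k in range(len(points)))
                   for i in range(c)]
    return centers, u

points = [1.0, 1.1, 0.9, 10.0, 10.2, 9.8]
centers, memberships = fcm(points, c=2)
```

Unlike hard clustering, every point keeps a membership degree in every cluster (each row of `memberships` sums to 1), which is what the downstream fuzzy inference rules operate on.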
Abstract:
The literacy demands of tables and graphs are different from those of prose texts such as narrative. This paper draws on part of a qualitative case study which sought to investigate strategies that scaffold and enhance the teaching and learning of varied representations in text. As indicated in the paper, the method focused on the teaching and learning of tables and graphs using Freebody and Luke's (1990) four resources model from literacy education.
Abstract:
This paper reports on the early stages of a design experiment in educational assessment that challenges the dichotomous legacy evident in many assessment activities. Combining social networking technologies with the sociology of education, the paper proposes that assessment activities are best understood as a negotiable field of exchange. In this design experiment, students, peers and experts engage in explicit, "front-end" assessment (Wyatt-Smith, 2008) to translate holistic judgments into institutional, and potentially economic, capital without adhering to long lists of pre-set criteria. This approach invites participants to use social networking technologies to judge creative works using scatter graphs, keywords and tag clouds. In doing so, assessors will refine their evaluative expertise and negotiate the characteristics of creative works from which criteria will emerge (Sadler, 2008). The real-time advantages of web-based technologies will aggregate, externalise and democratise this transparent method of assessment for most, if not all, creative works that can be represented in a digital format.
Abstract:
Twitter is the focus of much research attention, both in traditional academic circles and in commercial market and media research, as analytics give increasing insight into the performance of the platform in areas as diverse as political communication, crisis management, television audiencing and other industries. While methods for tracking Twitter keywords and hashtags have developed apace and are well documented, the make-up of the Twitter user base and its evolution over time have been less understood to date. Recent research efforts have taken advantage of functionality provided by Twitter's Application Programming Interface to develop methodologies to extract information that allows us to understand the growth of Twitter, its geographic spread and the processes by which particular Twitter users have attracted followers. From politicians to sporting teams, and from YouTube personalities to reality television stars, this technique enables us to gain an understanding of what prompts users to follow others on Twitter. This article outlines how we came upon this approach, describes the method we adopted to produce accession graphs and discusses their use in Twitter research. It also addresses the wider ethical implications of social network analytics, particularly in the context of a detailed study of the Twitter user base.
Abstract:
Based on protein molecular dynamics, we investigate the fractal properties of energy, pressure and volume time series using multifractal detrended fluctuation analysis (MF-DFA) and the topological and fractal properties of their converted horizontal visibility graphs (HVGs). The energy parameters of protein dynamics we considered are bonded potential, angle potential, dihedral potential, improper potential, kinetic energy, Van der Waals potential, electrostatic potential, total energy and potential energy. The shape of the h(q) curves from MF-DFA indicates that these time series are multifractal. The numerical values of the exponent h(2) of MF-DFA show that the series of total energy and potential energy are non-stationary and anti-persistent; the other time series are stationary and persistent, apart from the series of pressure (with H ≈ 0.5 indicating the absence of long-range correlation). The degree distributions of their converted HVGs show that these networks are exponential. The results of fractal analysis show that fractality exists in these converted HVGs. For each energy, pressure or volume parameter, it is found that the values of h(2) from MF-DFA on the time series, the exponent λ of the exponential degree distribution, and the fractal dimension d_B of the converted HVGs do not change much for different proteins (indicating some universality). We also found that, after taking the average over all proteins, there is a linear relationship between 〈h(2)〉 (from MF-DFA on the time series) and 〈d_B〉 of the converted HVGs for the different energy, pressure and volume parameters.
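The horizontal visibility graph mapping used in this abstract has a standard definition: two time points are linked when every value strictly between them lies below both. A brute-force sketch (the function name and toy series are illustrative, and no effort is made to match the paper's datasets):

```python
# Sketch of the horizontal visibility graph (HVG) mapping: points i and j are
# linked iff every value strictly between them is below both x[i] and x[j].

def hvg_edges(x):
    edges = set()
    for i in range(len(x)):
        for j in range(i + 1, len(x)):
            if all(x[k] < min(x[i], x[j]) for k in range(i + 1, j)):
                edges.add((i, j))
    return edges

series = [3.0, 1.0, 2.0, 4.0]
edges = hvg_edges(series)    # consecutive points are always linked
```

The degree sequence of the resulting graph is what the abstract's exponential degree distributions and fractal dimensions are computed from.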
Abstract:
Teachers' failure to utilise MBL activities more widely may be due to not recognising their capacity to transform the nature of laboratory activities to be more consistent with contemporary constructivist theories of learning. This research aimed to increase understanding of how MBL activities specifically designed to be consistent with a constructivist theory of learning support or constrain students' construction of understanding. The first author conducted the research with his Year 11 physics class of 29 students. Dyads completed nine tasks relating to kinematics using a Predict-Observe-Explain format. Data sources included video and audio recordings of students and teacher during four 70-minute sessions, students' display graphs and written notes, semi-structured student interviews, and the teacher's journal. The study identifies the actors and describes the patterns of interactions in the MBL. Analysis of students' discourse and actions identified many instances where students' initial understandings of kinematics were mediated in multiple ways. Students invented numerous techniques for manipulating data in the service of their emerging understanding. The findings are presented as eight assertions. Recommendations are made for developing pedagogical strategies incorporating MBL activities which will likely catalyse student construction of understanding.
Abstract:
Information graphics have become increasingly important in representing, organising and analysing information in a technological age. In classroom contexts, information graphics are typically associated with graphs, maps and number lines. However, all students need to become competent with the broad range of graphics that they will encounter in mathematical situations. This paper provides a rationale for creating a test to measure students’ knowledge of graphics. This instrument can be used in mass testing and individual (in-depth) situations. Our analysis of the utility of this instrument informs policy and practice. The results provide an appreciation of the relative difficulty of different information graphics; and provide the capacity to benchmark information about students’ knowledge of graphics. The implications for practice include the need to support the development of students’ knowledge of graphics, the existence of gender differences, the role of cross-curriculum applications in learning about graphics, and the need to explicate the links among graphics.
Abstract:
Mandatory numeracy tests have become commonplace in many countries, heralding a new era in school assessment. New forms of accountability and an increased emphasis on national and international standards (and benchmarks) have the potential to reshape mathematics curricula. It is noteworthy that the mathematics items used in these tests are rich in graphics. Many of the items, for example, require students to have an understanding of information graphics (e.g., maps, charts and graphs) in order to solve the tasks. This investigation classifies mathematics items in Australia’s inaugural national numeracy tests and considers the effect such standardised testing will have on practice. It is argued that mathematics items are more likely to provide a reliable indication of student performance if their graphical, linguistic and contextual components are considered both in isolation and in integrated ways as essential elements of task design.
Abstract:
This study investigated the longitudinal performance of 378 students who completed mathematics items rich in graphics. Specifically, this study explored student performance across axis (e.g., number lines), opposed-position (e.g., line and column graphs) and circular (e.g., pie charts) items over a three-year period (ages 9-11 years). The results of the study revealed significant performance differences in favour of boys on graphics items that were represented in horizontal and vertical displays. There were no gender differences on items that were represented in a circular manner.