999 results for time graphs
Abstract:
The Modeling method of teaching has demonstrated well-documented success in the improvement of student learning. The teacher/researcher in this study was introduced to Modeling through the use of a technique called White Boarding. Without formal training, the researcher began using the White Boarding technique for a limited number of laboratory experiences with his high school physics classes. The question that arose and was investigated in this study is “What specific aspects of the White Boarding process support student understanding?” For the purposes of this study, the White Boarding process was broken down into three aspects: the Analysis of data through the use of Logger Pro software, the Preparation of White Boards, and the Presentations each group gave about their specific lab data. The lab used in this study, an Acceleration of Gravity Lab, was chosen because of the documented difficulties students experience in the graphing of motion. In the lab, students filmed a given motion, utilized Logger Pro software to analyze the motion, prepared a White Board that described the motion with position-time and velocity-time graphs, and then presented their findings to the rest of the class. The Presentation included a class discussion with minimal contribution from the teacher. The three aspects of the White Boarding experience (Analysis, Preparation, and Presentation) were compared through the use of student learning logs, video analysis of the Presentations, and follow-up interviews with participants. The information and observations gathered were used to determine the level of understanding of each participant during each phase of the lab. The researcher then looked for improvement in the level of student understanding, the number of “aha” moments students had, and the students’ perceptions about which phase was most important to their learning. The results suggest that while all three phases of the White Boarding experience play a part in the learning process for students, the Presentations provided the most significant changes. The implications for instruction are discussed.
Abstract:
The introduction of time-series graphs into British economics in the 19th century depended on the “timing” of history. This involved reconceptualizing history into events which were comparable, measurable, and standardized by time unit. Yet classical economists in Britain in the early 19th century viewed history as a set of heterogeneous and complex events, and statistical tables as giving unrelated facts. Both these attitudes had to be broken down before time-series graphs could be brought into use for revealing regularities in economic events by the century's end.
Abstract:
This work analyses the waveshapes of continuing currents and parameters of M-components in positive cloud-to-ground (CG) flashes through high-speed GPS-synchronized videos. The dataset is composed only of long continuing currents (with duration longer than 40 ms) and was selected from more than 800 flashes recorded in Sao Jose dos Campos (45.864 degrees W, 23.215 degrees S) and Uruguaiana (57.005 degrees W, 29.806 degrees S), in Southeast and South Brazil respectively, during the summers of 2003 to 2007. The videos are compared with data obtained by the Brazilian Lightning Location System (BrasilDAT) in order to determine the polarity of each flash and select only positive cases. There are only two studies of waveshapes of continuing currents in the literature. One is based on direct current measurements of triggered lightning, in which four different types of waveshapes were observed; the other is based on measurements of luminosity variations in high-speed videos of negative CG lightning, in which, besides the four types mentioned above, two additional types were observed. The present work is an extension of the latter, using the same method but now applied to obtain the waveshapes of positive CG lightning. As far as the authors know, this is the first report on M-components in positive continuing currents. We have also used the luminosity-versus-time graphs to observe their occurrence and measure some parameters (duration, elapsed time, and time between two successive M-components), whose statistics are presented and compared in detail with the data for negative flashes. We have plotted a histogram of the M-components’ elapsed time over the total duration of the continuing current for positive flashes, which presented an exponential decay (correlation coefficient: 0.83), similar to what has been observed for negative flashes.
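The exponential-decay fit reported above can be sketched roughly as follows; this is only an illustrative reconstruction with synthetic placeholder data and an assumed binning, not the study's measurements or analysis code.

```python
# Illustrative sketch: fitting an exponential decay to a histogram of
# M-component elapsed times normalized by the continuing-current duration.
# The data below are synthetic placeholders, not the study's measurements.
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
elapsed_fraction = rng.exponential(scale=0.25, size=500)      # elapsed time / CC duration
elapsed_fraction = elapsed_fraction[elapsed_fraction <= 1.0]  # keep values within the CC

counts, edges = np.histogram(elapsed_fraction, bins=10)
centers = 0.5 * (edges[:-1] + edges[1:])

def exp_decay(t, a, k):
    return a * np.exp(-k * t)

params, _ = curve_fit(exp_decay, centers, counts, p0=(counts[0], 3.0))
r, _ = pearsonr(counts, exp_decay(centers, *params))
print(f"decay constant k = {params[1]:.2f}, correlation r = {r:.2f}")
```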
Abstract:
We investigate a conjecture on the cover times of planar graphs by means of large Monte Carlo simulations. The conjecture states that the cover time tau(G_N) of a planar graph G_N of N vertices and maximal degree d is lower bounded by tau(G_N) >= C(d) N (ln N)^2, with C(d) = (d/4π) tan(π/d), with equality holding for some geometries. We tested this conjecture on the regular honeycomb (d = 3), regular square (d = 4), regular elongated triangular (d = 5), and regular triangular (d = 6) lattices, as well as on the nonregular Union Jack lattice (d_min = 4, d_max = 8). Indeed, the Monte Carlo data suggest that the rigorous lower bound may hold as an equality for most of these lattices, with an interesting issue in the case of the Union Jack lattice. The data for the honeycomb lattice, however, violate the bound with the conjectured constant. The empirical probability distribution function of the cover time for the square lattice is also briefly presented, since very little is known about cover time probability distribution functions in general.
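As a rough illustration of the kind of Monte Carlo estimate involved, the sketch below measures the cover time of a simple random walk on a square lattice (d = 4) and compares it with the conjectured C(d) N (ln N)^2 scaling. It uses periodic boundaries purely to keep the code short, whereas the conjecture as stated concerns finite planar lattices, so this is a toy version, not the authors' simulation code.

```python
# Toy Monte Carlo sketch: mean cover time of a random walk on an L x L square
# lattice with periodic boundaries (d = 4), compared with the conjectured
# value C(d) * N * (ln N)^2, where C(d) = (d / 4*pi) * tan(pi / d).
import math
import random

def cover_time_square_torus(L: int) -> int:
    """Steps needed for a random walk to visit every site of an L x L torus."""
    n = L * L
    visited = [False] * n
    x = y = 0
    visited[0] = True
    seen, steps = 1, 0
    while seen < n:
        dx, dy = random.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
        x, y = (x + dx) % L, (y + dy) % L
        steps += 1
        idx = x * L + y
        if not visited[idx]:
            visited[idx] = True
            seen += 1
    return steps

L, runs = 32, 50
N = L * L
mean_tau = sum(cover_time_square_torus(L) for _ in range(runs)) / runs
bound = (4 / (4 * math.pi)) * math.tan(math.pi / 4) * N * math.log(N) ** 2
print(f"mean cover time ~ {mean_tau:.0f}, conjectured bound ~ {bound:.0f}")
```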
Abstract:
23rd Euromicro International Conference on Parallel, Distributed, and Network-Based Processing (PDP 2015), 4-6 March 2015, Turku, Finland.
Abstract:
The impact of CO2 leakage on the solubility and distribution of trace metals in seawater and sediment has been studied in lab-scale chambers. Seven metals (Al, Cr, Ni, Pb, Cd, Cu, and Zn) were investigated in membrane-filtered seawater samples, and DGT samplers were deployed in water and sediment during the experiment. During the first phase (16 days), "dissolved" (<0.2 µm) concentrations of all elements increased substantially in the water. The dissolved fractions of Al, Cr, Ni, Cu, Zn, Cd and Pb in the CO2 seepage chamber were respectively 5.1, 3.8, 4.5, 3.2, 1.4, 2.3 and 1.3 times higher than the dissolved concentrations of these metals in the control. During the second phase of the experiment (10 days), with the same sediment but replenished seawater, the dissolved fractions of Al, Cr, Cd, and Zn were partly removed from the water column in the CO2 chamber. DNi and DCu still increased but at reduced rates, while DPb increased faster than during the first phase. DGT-labile fractions (MeDGT) of all metals increased substantially during the first phase of CO2 seepage. DGT-labile fractions of Al, Cr, Ni, Cu, Zn, Cd and Pb were respectively 7.9, 2.0, 3.6, 1.7, 2.1, 1.9 and 2.3 times higher in the CO2 chamber than in the control chamber. AlDGT, CrDGT, NiDGT, and PbDGT continued to increase during the second phase of the experiment. There was no change in CdDGT during the second phase, while CuDGT and ZnDGT decreased by 30% and 25%, respectively, in the CO2 chamber. In the sediment pore water, DGT-labile fractions of all seven elements increased substantially in the CO2 chamber. Our results show that CO2 leakage affected the solubility, particle reactivity and transformation rates of the studied metals in sediment and at the sediment-water interface. The metal species released due to CO2 acidification may have sufficiently long residence times in the seawater to affect the bioavailability and toxicity of the metals to biota.
Abstract:
DBpedia has become one of the major sources of structured knowledge extracted from Wikipedia. Such structures gradually reshape the representation of topics as new events relevant to those topics emerge. Such changes make evident the continuous evolution of topic representations and introduce new challenges to supervised topic classification tasks, since labelled data can rapidly become outdated. Here we analyse topic changes in DBpedia and propose the use of semantic features as a more stable representation of a topic. Our experiments show promising results in understanding how the relevance of features to a topic changes over time.
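As an illustration of what DBpedia-derived semantic features might look like in practice, the sketch below retrieves rdf:type values for an entity from the public DBpedia SPARQL endpoint. The SPARQLWrapper dependency, the query, and the example entity are assumptions for demonstration only, not the paper's feature-extraction pipeline.

```python
# Illustrative sketch (not the paper's pipeline): pull rdf:type values for an
# entity from the public DBpedia SPARQL endpoint and treat them as semantic
# features.  Requires the SPARQLWrapper package and network access.
from SPARQLWrapper import SPARQLWrapper, JSON

def dbpedia_types(resource: str) -> set[str]:
    """Return the set of rdf:type URIs for a DBpedia resource name."""
    sparql = SPARQLWrapper("https://dbpedia.org/sparql")
    sparql.setReturnFormat(JSON)
    sparql.setQuery(f"""
        PREFIX rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#>
        SELECT ?type WHERE {{
            <http://dbpedia.org/resource/{resource}> rdf:type ?type .
        }}
    """)
    results = sparql.query().convert()
    return {b["type"]["value"] for b in results["results"]["bindings"]}

# Hypothetical example: semantic features for a topic-relevant entity.
print(sorted(dbpedia_types("Crimea"))[:5])
```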
Abstract:
Social media has become an effective channel for communicating both trends and public opinion on current events. However, the automatic topic classification of social media content poses various challenges. Topic classification is a common technique used for automatically capturing themes that emerge from social media streams. However, such techniques are sensitive to the evolution of topics when new event-dependent vocabularies start to emerge (e.g., Crimea becoming relevant to War Conflict during the Ukraine crisis in 2014). Therefore, traditional supervised classification methods which rely on labelled data could rapidly become outdated. In this paper we propose a novel transfer learning approach to address the classification of new data when the only available labelled data belong to a previous epoch. This approach relies on the incorporation of knowledge from DBpedia graphs. Our findings show promising results in understanding how features age, and how semantic features can support the evolution of topic classifiers.
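A minimal sketch of the general idea, assuming toy DBpedia-type features and scikit-learn (neither taken from the paper): a classifier trained on an earlier epoch's labelled data is applied to a later epoch whose surface vocabulary has shifted but whose semantic features overlap.

```python
# Sketch of the general idea only (not the authors' method): train a topic
# classifier on labelled data from an earlier epoch using semantic
# (DBpedia-derived) features, then apply it to data from a later epoch.
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import MultiLabelBinarizer

# Placeholder documents represented by sets of DBpedia ontology types.
old_epoch_feats = [{"dbo:MilitaryConflict"}, {"dbo:Election"}, {"dbo:MilitaryConflict", "dbo:Place"}]
old_epoch_labels = ["war_conflict", "politics", "war_conflict"]
new_epoch_feats = [{"dbo:MilitaryConflict", "dbo:Place"}]   # e.g. posts mentioning Crimea in 2014

mlb = MultiLabelBinarizer()
X_old = mlb.fit_transform(old_epoch_feats)
X_new = mlb.transform(new_epoch_feats)      # types unseen in the old epoch are dropped

clf = LogisticRegression(max_iter=1000).fit(X_old, old_epoch_labels)
print(clf.predict(X_new))                   # should predict ['war_conflict']
```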
Abstract:
ACM Computing Classification System (1998): G.2.2.
Abstract:
The popularity of online social media platforms provides an unprecedented opportunity to study real-world complex networks of interactions. However, releasing this data to researchers and the public comes at the cost of potentially exposing private and sensitive user information. It has been shown that a naive anonymization of a network by removing the identity of the nodes is not sufficient to preserve users’ privacy. In order to deal with malicious attacks, k-anonymity solutions have been proposed to partially obfuscate topological information that can be used to infer nodes’ identity. In this paper, we study the problem of ensuring k-anonymity in time-varying graphs, i.e., graphs with a structure that changes over time, and multi-layer graphs, i.e., graphs with multiple types of links. More specifically, we examine the case in which the attacker has access to the degree of the nodes. The goal is to generate a new graph where, given the degree of a node in each (temporal) layer of the graph, such a node remains indistinguishable from at least k-1 other nodes in the graph. In order to achieve this, we find the optimal partitioning of the graph nodes such that the cost of anonymizing the degree information within each group is minimum. We show that this reduces to a special case of a Generalized Assignment Problem, and we propose a simple yet effective algorithm to solve it. Finally, we introduce an iterated linear programming approach to enforce the realizability of the anonymized degree sequences. The efficacy of the method is assessed through an extensive set of experiments on synthetic and real-world graphs.
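A simplified, single-layer illustration of degree anonymization (not the paper's GAP-based formulation or its iterated linear programming step): partition a sorted degree sequence into groups of at least k and raise each degree to its group maximum at minimal total cost.

```python
# Simplified single-layer sketch of k-anonymizing a degree sequence: choose
# groups of size >= k (no group needs more than 2k-1 members) so that the
# total cost of raising every degree in a group to the group maximum is minimal.
def k_anonymize_degrees(degrees: list[int], k: int) -> list[int]:
    d = sorted(degrees, reverse=True)
    n = len(d)
    INF = float("inf")

    def group_cost(i: int, j: int) -> int:
        # Cost of one group d[i..j]: raise all degrees to d[i] (the group max).
        return sum(d[i] - d[t] for t in range(i, j + 1))

    best = [INF] * n          # best[j] = minimal cost to anonymize the prefix d[0..j]
    start = [0] * n           # start of the last group in the optimal prefix solution
    for j in range(n):
        for i in range(max(0, j - 2 * k + 2), j - k + 2):
            if i == 0 or (i >= k and best[i - 1] < INF):
                cost = (0 if i == 0 else best[i - 1]) + group_cost(i, j)
                if cost < best[j]:
                    best[j], start[j] = cost, i

    # Reconstruct the anonymized sequence from the chosen groups.
    anon, j = [0] * n, n - 1
    while j >= 0:
        i = start[j]
        anon[i:j + 1] = [d[i]] * (j - i + 1)
        j = i - 1
    return anon

print(k_anonymize_degrees([5, 5, 4, 3, 2, 2, 1], k=2))   # -> [5, 5, 4, 4, 2, 2, 2]
```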
Abstract:
Kernel methods provide a way to apply a wide range of learning techniques to complex and structured data by shifting the representational problem from one of finding an embedding of the data to that of defining a positive semidefinite kernel. In this paper, we propose a novel kernel on unattributed graphs where the structure is characterized through the evolution of a continuous-time quantum walk. More precisely, given a pair of graphs, we create a derived structure whose degree of symmetry is maximum when the original graphs are isomorphic. With this new graph in hand, we compute the density operators of the quantum systems representing the evolutions of two suitably defined quantum walks. Finally, we define the kernel between the two original graphs as the quantum Jensen-Shannon divergence between these two density operators. The experimental evaluation shows the effectiveness of the proposed approach.
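The divergence at the heart of the kernel can be sketched as follows; the toy density matrices below merely stand in for those obtained from the continuous-time quantum walks, and the construction of the derived graph structure is not reproduced here.

```python
# Illustrative sketch: quantum Jensen-Shannon divergence between two density
# operators, QJSD(rho, sigma) = S((rho + sigma)/2) - (S(rho) + S(sigma))/2,
# where S is the von Neumann entropy.
import numpy as np

def von_neumann_entropy(rho: np.ndarray) -> float:
    """S(rho) = -sum_i lambda_i log2 lambda_i over the eigenvalues of rho."""
    eigvals = np.linalg.eigvalsh(rho)
    eigvals = eigvals[eigvals > 1e-12]          # drop numerical zeros
    return float(-np.sum(eigvals * np.log2(eigvals)))

def qjsd(rho: np.ndarray, sigma: np.ndarray) -> float:
    mix = 0.5 * (rho + sigma)
    return von_neumann_entropy(mix) - 0.5 * (von_neumann_entropy(rho) + von_neumann_entropy(sigma))

# Two toy density matrices (Hermitian, unit trace), standing in for the
# quantum-walk evolutions of the two graphs being compared.
rho = np.array([[0.7, 0.2], [0.2, 0.3]])
sigma = np.array([[0.5, 0.0], [0.0, 0.5]])
print(f"QJSD = {qjsd(rho, sigma):.4f}")        # 0 only when the two states coincide
```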
Abstract:
In today's fast-paced and interconnected digital world, the data generated by an increasing number of applications is being modeled as dynamic graphs. The graph structure encodes relationships among data items, while the structural changes to the graphs as well as the continuous stream of information produced by the entities in these graphs make them dynamic in nature. Examples include social networks where users post status updates, images, videos, etc.; phone call networks where nodes may send text messages or place phone calls; road traffic networks where the traffic behavior of the road segments changes constantly, and so on. There is tremendous value in storing, managing, and analyzing such dynamic graphs and deriving meaningful insights in real time. However, a majority of the work in graph analytics assumes a static setting, and there is a lack of systematic study of the various dynamic scenarios, the complexity they impose on the analysis tasks, and the challenges in building efficient systems that can support such tasks at a large scale. In this dissertation, I design a unified streaming graph data management framework, and develop prototype systems to support increasingly complex tasks on dynamic graphs. In the first part, I focus on the management and querying of distributed graph data. I develop a hybrid replication policy that monitors the read-write frequencies of the nodes to decide dynamically what data to replicate, and whether to do eager or lazy replication in order to minimize network communication and support low-latency querying. In the second part, I study parallel execution of continuous neighborhood-driven aggregates, where each node aggregates the information generated in its neighborhoods. I build my system around the notion of an aggregation overlay graph, a pre-compiled data structure that enables sharing of partial aggregates across different queries, and also allows partial pre-computation of the aggregates to minimize the query latencies and increase throughput. Finally, I extend the framework to support continuous detection and analysis of activity-based subgraphs, where subgraphs could be specified using both graph structure as well as activity conditions on the nodes. The query specification tasks in my system are expressed using a set of active structural primitives, which allows the query evaluator to use a set of novel optimization techniques, thereby achieving high throughput. Overall, in this dissertation, I define and investigate a set of novel tasks on dynamic graphs, design scalable optimization techniques, build prototype systems, and show the effectiveness of the proposed techniques through extensive evaluation using large-scale real and synthetic datasets.
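As a toy illustration of a continuous neighborhood-driven aggregate (not the dissertation's overlay-based system), the sketch below keeps a running per-node count of events produced in each node's 1-hop neighborhood, updated incrementally as the stream arrives.

```python
# Toy streaming neighborhood aggregate: each node maintains a running count of
# events produced in its 1-hop neighborhood, updated as events arrive.  This
# only illustrates the idea of incremental neighborhood aggregation; it does
# not implement the aggregation overlay or partial-aggregate sharing above.
from collections import defaultdict

class StreamingNeighborhoodCount:
    def __init__(self):
        self.neighbors = defaultdict(set)      # adjacency lists
        self.agg = defaultdict(int)            # node -> events seen in its neighborhood

    def add_edge(self, u, v):
        self.neighbors[u].add(v)
        self.neighbors[v].add(u)

    def on_event(self, node):
        """A node produced an event (e.g. a status update): push it to its neighbors."""
        self.agg[node] += 1                    # a node also counts its own events
        for nbr in self.neighbors[node]:
            self.agg[nbr] += 1

    def query(self, node) -> int:
        return self.agg[node]

g = StreamingNeighborhoodCount()
g.add_edge("alice", "bob"); g.add_edge("bob", "carol")
g.on_event("alice"); g.on_event("carol"); g.on_event("carol")
print(g.query("bob"))   # 3: bob's neighborhood (alice, carol) produced 3 events
```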
Abstract:
The premise of automated alert correlation is to accept that false alerts from a low-level intrusion detection system are inevitable and to use attack models to explain the output in an understandable way. Several algorithms exist for this purpose that use attack graphs to model the ways in which attacks can be combined. These algorithms can be classified into two broad categories: scenario-graph approaches, which create an attack model starting from a vulnerability assessment, and type-graph approaches, which rely on an abstract model of the relations between attack types. Some research into improving the efficiency of type-graph correlation has been carried out, but this research has ignored the hypothesizing of missing alerts. Our work presents a novel type-graph algorithm which unifies correlation and hypothesizing into a single operation. Our experimental results indicate that the approach is extremely efficient in the face of intensive alerts and produces compact output graphs comparable to other techniques.
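The sketch below conveys the general flavour of type-graph correlation with hypothesized missing alerts; the attack types, the type graph, and the one-step hypothesis rule are invented for demonstration and do not reproduce the paper's algorithm.

```python
# Toy illustration (not the paper's algorithm): a type graph maps attack types
# to plausible successor types.  Consecutive alerts are correlated directly
# when an edge exists; otherwise a single missing intermediate alert is
# hypothesized when a two-step path connects them.
TYPE_GRAPH = {
    "scan":       {"bruteforce", "exploit"},
    "bruteforce": {"shell"},
    "exploit":    {"shell"},
    "shell":      {"exfiltration"},
}

def correlate(alerts):
    """Return ((prev, cur), hypothesized_type_or_None) links between consecutive alerts."""
    links = []
    for prev, cur in zip(alerts, alerts[1:]):
        if cur in TYPE_GRAPH.get(prev, set()):
            links.append(((prev, cur), None))          # direct correlation
            continue
        for mid in sorted(TYPE_GRAPH.get(prev, ())):   # hypothesize one missing alert
            if cur in TYPE_GRAPH.get(mid, set()):
                links.append(((prev, cur), mid))
                break
        else:
            links.append(((prev, cur), None))          # no explanation found
    return links

print(correlate(["scan", "shell", "exfiltration"]))
# [(('scan', 'shell'), 'bruteforce'), (('shell', 'exfiltration'), None)]
```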