949 results for Traffic sampling
Abstract:
In the construction of a large-area neutron detector (neutron wall) used to detect neutrons at GeV energies, the performance of all the sampling paddle modules prepared for the neutron wall is investigated with a specially designed test bench. Tested with cosmic rays, an average intrinsic time resolution of 222.5 ps is achieved at the center of the modules. The light attenuation length and the effective speed of light in the module are also investigated.
Abstract:
The density and distribution of spatial samples heavily affect the precision and reliability of estimated population attributes. An optimization method based on Mean of Surface with Nonhomogeneity (MSN) theory has been developed into a computer package aimed at improving the accuracy of global estimates of spatial properties from a spatial sample distributed over a heterogeneous surface; conversely, for a given estimation variance, the program can output both the optimal number of sample units needed and their appropriate distribution within a specified research area. (C) 2010 Elsevier Ltd. All rights reserved.
Evaluation and application of micro-sampling system for inductively coupled plasma mass spectrometry
Abstract:
Two Meinhard microconcentric nebulizers, models AR30-07-FM02 and AR30-07-FM005, were employed as a self-installed micro-sampling system for inductively coupled plasma mass spectrometry (ICP-MS). The FM02 nebulizer, at a solution uptake rate of 22 μL/min, gave relative standard deviations of 7.6%, 3.0%, 2.7% and 1.8% for determinations (n = 10) of 20 μg/L Be, Co, In and Bi, respectively, and detection limits (3s) of 0.14, 0.10, 0.02 and 0.01 μg/L for Be, Co, In and Bi, respectively. The mass intensity of In-115 obtained with this micro-sampling system was 60% of that obtained with a conventional pneumatic nebulizer system at 1.3 mL/min. The analytical results for La, Ce, Pr and Nd in 20 μL of Wistar rat amniotic fluid obtained with the present micro-sampling system were in good agreement with those obtained using the conventional pneumatic nebulization system.
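The figures of merit quoted above (percent RSD over replicates, 3s detection limits) follow from standard formulas. A minimal sketch with made-up replicate intensities (illustrative numbers only, not the paper's data):

```python
import statistics

# Hypothetical replicate ICP-MS intensities (counts) for a 20 µg/L standard
# and a reagent blank -- illustrative values, not measured data
standard = [1020, 1005, 998, 1012, 990, 1008, 995, 1015, 1002, 1000]
blank = [12, 10, 11, 13, 9, 12, 10, 11, 12, 10]

# Precision: percent relative standard deviation over n = 10 replicates
rsd = 100 * statistics.stdev(standard) / statistics.mean(standard)

# 3s detection limit: the concentration whose net signal equals
# three standard deviations of the blank
sensitivity = (statistics.mean(standard) - statistics.mean(blank)) / 20.0
detection_limit = 3 * statistics.stdev(blank) / sensitivity
```

With these toy numbers the RSD comes out below 1% and the detection limit below 0.1 μg/L, the same order as the values reported for In and Bi.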
Abstract:
With the development of industry and the acceleration of urbanization, air quality problems and their influence on human health have recently drawn great attention from international communities and governments. Industrialization exhausts large amounts of industrial gases and dust, while urbanization increases the number of motor vehicles. Compared with traditional chemical methods, the magnetic method is simple, rapid, accurate, low-cost and non-destructive for monitoring air pollution, and it has been widely applied in domestic and international studies. In this thesis, with the aim of better monitoring air pollution, we selected plants for magnetic study: highroad-side perennial pine trees (Pinus pumila Regel) along a highroad linking Beijing City and the Capital International Airport, and tree bark and tree ring core samples (willow, Salix matsudana) near a smelting plant in northeast Beijing. Through systematic magnetic measurements on these samples, the magnetic response mechanism of contamination carriers (e.g. tree leaves, tree rings) to both short- and long-term environmental pollution has been established, from which the range, degree and history of pollution from human activities on different time scales can be assessed. A series of rock magnetic experiments on tree leaves shows that the primary magnetic mineral in the leaf samples is magnetite, in the pseudo-single-domain (PSD) grain size range of 0.2-5.0 μm. Magnetite concentration and grain size in leaves decrease with increasing sampling distance from the highroad asphalt surface, suggesting that the strong magnetic response to traffic pollution is localized within about 2 m of the highroad asphalt surface. On the other hand, highroad-side trees and rainwater can effectively reduce the concentration of traffic-induced particulate matter (PM) in the atmosphere.
This study is the first to investigate the relationship between smelting activities and environmental change using the magnetic properties of tree rings. Results indicate that magnetic particles are omnipresent in tree bark and trunk wood. Magnetic techniques, including low-temperature experiments, successive acquisition of IRM, hysteresis loops and SIRM measurements, suggest that the magnetic particles are dominated by magnetite in the pseudo-single-domain state. Comparison of the magnetic properties of tree trunk and branch cores collected from different directions and heights implies that the collection of magnetic particles depends on both sampling direction and height. Trunk wood facing the pollution source contains significantly more magnetic particles than the other sides. This indicates that magnetic particles are most likely first intercepted and collected by the bark, then enter the xylem tissues by translocation during the growing season, and are finally enclosed in a tree ring by lignification. The correlation between magnetic properties, such as the time-dependent SIRM values of tree ring cores, and the annual steel output of the smelting factory is significant. Given the dependence of the magnetic properties on sampling direction, height and ring core, we propose that magnetic particles in the xylem cannot move between tree rings. Accordingly, the SIRM and other magnetic parameters of tree ring cores from the source-facing side can contribute to the historical study of atmospheric pollution produced by heavy-metal smelting, and isoline diagrams of the SIRM values of all the tree rings indicate that air pollution has grown increasingly severe. We believe that a synthetic rock magnetic study is an effective method for determining the concentration and grain size of ferromagnets in atmospheric PM, and thus should be a rapid and feasible technique for monitoring atmospheric pollution.
Abstract:
Accurate knowledge of traffic demands in a communication network enables or enhances a variety of traffic engineering and network management tasks of paramount importance for operational networks. Directly measuring a complete set of these demands is prohibitively expensive because of the huge amounts of data that must be collected and the performance impact such measurements would impose on the regular behavior of the network. As a consequence, we must rely on statistical techniques to produce estimates of actual traffic demands from partial information. The performance of such techniques is, however, limited by their reliance on partial information and by the heavy computation they incur, which constrains their convergence behavior. In this paper we study strategies to improve the convergence of a powerful statistical technique based on an Expectation-Maximization (EM) iterative algorithm. First, we analyze modeling approaches to generating starting points. We call these starting points informed priors, since they are obtained using actual network information such as packet traces and SNMP link counts. Second, we provide a very fast variant of the EM algorithm which extends its computation range, increasing its accuracy and decreasing its dependence on the quality of the starting point. Finally, we study the convergence characteristics of our EM algorithm and compare it against a recently proposed Weighted Least Squares approach.
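The general shape of such an EM iteration can be sketched as follows, assuming a Poisson model for link counts y = A x; the toy routing matrix and the all-ones starting point stand in for the paper's informed priors and are not its actual setup:

```python
import numpy as np

def em_traffic_estimate(A, y, x0, n_iter=200):
    """EM (Richardson-Lucy-style) iteration for a Poisson model y ~ Poisson(A @ x).

    A  : (links x OD-pairs) 0/1 routing matrix
    y  : observed per-link counts (e.g. from SNMP)
    x0 : starting point -- the role played by an "informed prior"
    """
    x = np.asarray(x0, dtype=float).copy()
    col = A.sum(axis=0)                        # per-OD-pair normalizer
    for _ in range(n_iter):
        yhat = A @ x                           # expected link counts under x
        ratio = np.where(yhat > 0, y / yhat, 0.0)
        x *= (A.T @ ratio) / np.where(col > 0, col, 1.0)  # multiplicative update
    return x

# Toy network: 2 links carrying 3 OD flows
A = np.array([[1, 1, 0],
              [0, 1, 1]])
true_x = np.array([10.0, 5.0, 20.0])
y = A @ true_x                                 # noiseless link counts
est = em_traffic_estimate(A, y, x0=np.ones(3))
# est reproduces the observed link counts: A @ est is close to y
```

A better starting point x0 moves the iterate into the right region immediately, which is exactly the motivation for informed priors: the problem is underdetermined, so the fixed point reached depends on where the iteration starts.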
Abstract:
Network traffic arises from the superposition of Origin-Destination (OD) flows. Hence, a thorough understanding of OD flows is essential for modeling network traffic, and for addressing a wide variety of problems including traffic engineering, traffic matrix estimation, capacity planning, forecasting and anomaly detection. However, to date, OD flows have not been closely studied, and there is very little known about their properties. We present the first analysis of complete sets of OD flow timeseries, taken from two different backbone networks (Abilene and Sprint-Europe). Using Principal Component Analysis (PCA), we find that the set of OD flows has small intrinsic dimension. In fact, even in a network with over a hundred OD flows, these flows can be accurately modeled in time using a small number (10 or less) of independent components or dimensions. We also show how to use PCA to systematically decompose the structure of OD flow timeseries into three main constituents: common periodic trends, short-lived bursts, and noise. We provide insight into how the various constituents contribute to the overall structure of OD flows and explore the extent to which this decomposition varies over time.
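The low-intrinsic-dimension finding can be illustrated on synthetic flows built from a few shared temporal patterns; this is a toy stand-in for the Abilene/Sprint timeseries, with sinusoids playing the role of the common diurnal trends:

```python
import numpy as np

rng = np.random.default_rng(0)
T, F, K = 288, 50, 3            # time bins, OD flows, latent dimensions
t = np.arange(T)

# K shared temporal patterns (stand-ins for periodic trends) mixed into F flows
latent = np.stack([np.sin(2 * np.pi * (k + 1) * t / T) for k in range(K)], axis=1)
mixing = rng.normal(size=(K, F))
X = latent @ mixing + 0.05 * rng.normal(size=(T, F))   # (T, F) flow timeseries

# PCA via SVD of the mean-centered timeseries matrix
Xc = X - X.mean(axis=0)
_, s, _ = np.linalg.svd(Xc, full_matrices=False)
var_explained = np.cumsum(s ** 2) / np.sum(s ** 2)
# var_explained[K-1] is close to 1: a handful of components models all 50 flows
```

On real OD flows the same cumulative-variance curve saturates after roughly ten components, which is what "small intrinsic dimension" means operationally.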
Abstract:
Internet Traffic Managers (ITMs) are special machines placed at strategic points in the Internet. itmBench is an interface that allows users (e.g. network managers, service providers, or experimental researchers) to register different traffic control functionalities to run on one ITM or an overlay of ITMs. Thus itmBench offers a tool that is extensible and powerful yet easy to maintain. ITM traffic control applications can be developed either with a kernel API, so they run in kernel space, or with a user-space API, so they run in user space. We demonstrate the flexibility of itmBench by showing the implementation of both a kernel module that provides a differentiated network service and a user-space module that provides an overlay routing service. Our Linux-based itmBench prototype is free software and can be obtained from http://www.cs.bu.edu/groups/itm/.
Abstract:
Anomalies are unusual and significant changes in a network's traffic levels, which can often involve multiple links. Diagnosing anomalies is critical for both network operators and end users. It is a difficult problem because one must extract and interpret anomalous patterns from large amounts of high-dimensional, noisy data. In this paper we propose a general method to diagnose anomalies. This method is based on a separation of the high-dimensional space occupied by a set of network traffic measurements into disjoint subspaces corresponding to normal and anomalous network conditions. We show that this separation can be performed effectively using Principal Component Analysis. Using only simple traffic measurements from links, we study volume anomalies and show that the method can: (1) accurately detect when a volume anomaly is occurring; (2) correctly identify the underlying origin-destination (OD) flow which is the source of the anomaly; and (3) accurately estimate the amount of traffic involved in the anomalous OD flow. We evaluate the method's ability to diagnose (i.e., detect, identify, and quantify) both existing and synthetically injected volume anomalies in real traffic from two backbone networks. Our method consistently diagnoses the largest volume anomalies, and does so with a very low false alarm rate.
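A minimal sketch of this subspace separation, under toy assumptions (two shared trends as the normal traffic, one injected volume spike; not the paper's backbone data): project the link measurements onto the top principal components, and flag time bins whose residual energy is large.

```python
import numpy as np

rng = np.random.default_rng(1)
T, L, K = 500, 20, 2            # time bins, links, normal-subspace dimension
t = np.arange(T)

# Normal traffic: two shared trends mixed onto L links, plus measurement noise
trends = np.stack([np.sin(2 * np.pi * t / 100), np.cos(2 * np.pi * t / 100)], axis=1)
X = trends @ rng.normal(size=(2, L)) + 0.01 * rng.normal(size=(T, L))
X[300] += 5.0 * rng.normal(size=L)           # inject a volume anomaly at t = 300

# Split link space into a normal subspace (top-K PCs) and its complement
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
P = Vt[:K].T                                  # basis of the normal subspace
residual = Xc - Xc @ P @ P.T                  # projection onto the anomalous subspace
spe = (residual ** 2).sum(axis=1)             # squared prediction error per time bin

detected = int(np.argmax(spe))                # the anomalous time bin stands out
```

Normal variation lives almost entirely in the top-K subspace, so the residual energy is near zero except at the injected anomaly; thresholding spe (rather than taking the argmax) is what a deployed detector would do.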
Abstract:
Accurate knowledge of traffic demands in a communication network enables or enhances a variety of traffic engineering and network management tasks of paramount importance for operational networks. Directly measuring a complete set of these demands is prohibitively expensive because of the huge amounts of data that must be collected and the performance impact such measurements would impose on the regular behavior of the network. As a consequence, we must rely on statistical techniques to produce estimates of actual traffic demands from partial information. The performance of such techniques is, however, limited by their reliance on partial information and by the heavy computation they incur, which constrains their convergence behavior. In this paper we study a two-step approach for inferring network traffic demands. First, we elaborate and evaluate a modeling approach for generating good starting points to be fed to iterative statistical inference techniques. We call these starting points informed priors, since they are obtained using actual network information such as packet traces and SNMP link counts. Second, we provide a very fast variant of the EM algorithm which extends its computation range, increasing its accuracy and decreasing its dependence on the quality of the starting point. Finally, we evaluate and compare alternative mechanisms for generating starting points, and the convergence characteristics of our EM algorithm against a recently proposed Weighted Least Squares approach.
Abstract:
Detecting and understanding anomalies in IP networks is an open and ill-defined problem. Toward this end, we have recently proposed the subspace method for anomaly diagnosis. In this paper we present the first large-scale exploration of the power of the subspace method when applied to flow traffic. An important aspect of this approach is that it fuses information from flow measurements taken throughout a network. We apply the subspace method to three different types of sampled flow traffic in a large academic network: multivariate timeseries of byte counts, packet counts, and IP-flow counts. We show that each traffic type brings into focus a different set of anomalies via the subspace method. We illustrate and classify the set of anomalies detected. We find that almost all of the anomalies detected represent events of interest to network operators. Furthermore, the anomalies span a remarkably wide spectrum of event types, including denial of service attacks (single-source and distributed), flash crowds, port scanning, downstream traffic engineering, high-rate flows, worm propagation, and network outage.
Abstract:
Recent work in sensor databases has focused extensively on distributed query problems, notably distributed computation of aggregates. Existing methods for computing aggregates broadcast queries to all sensors and use in-network aggregation of responses to minimize messaging costs. In this work, we focus on uniform random sampling across nodes, which can serve both as an alternative building block for aggregation and as an integral component of many other useful randomized algorithms. Prior to our work, the best existing proposals for uniform random sampling of sensors involve contacting all nodes in the network. We propose a practical method which is only approximately uniform, but contacts a number of sensors proportional to the diameter of the network instead of its size. The approximation achieved is tunably close to exact uniform sampling, and only relies on well-known existing primitives, namely geographic routing, distributed computation of Voronoi regions and von Neumann's rejection method. Ultimately, our sampling algorithm has the same worst-case asymptotic cost as routing a point-to-point message, and thus it is asymptotically optimal among request/reply-based sampling methods. We provide experimental results demonstrating the effectiveness of our algorithm on both synthetic and real sensor topologies.
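The rejection step can be sketched in isolation. Picking a uniformly random location and routing to the nearest sensor selects each sensor with probability proportional to its Voronoi cell area; von Neumann rejection undoes that bias by accepting sensor s with probability a_min / area(s). The three sensors and their cell areas below are hypothetical, and the geographic routing step is simulated rather than implemented:

```python
import collections
import random

# Hypothetical Voronoi cell areas for three sensors: a uniform random location
# lands in sensor i's cell -- and is routed to i -- with probability ~ area_i.
areas = {"a": 4.0, "b": 1.0, "c": 2.0}
total = sum(areas.values())
a_min = min(areas.values())
rng = random.Random(42)

def biased_pick():
    """Simulate geographic routing of a query to a uniformly random location."""
    r = rng.uniform(0.0, total)
    for sensor, area in areas.items():
        if r < area:
            return sensor
        r -= area
    return sensor                    # guard against float round-off

def rejection_sample():
    """Von Neumann rejection: accept sensor s with probability a_min / area(s)."""
    while True:
        s = biased_pick()
        if rng.random() < a_min / areas[s]:
            return s

counts = collections.Counter(rejection_sample() for _ in range(30000))
# counts is roughly uniform: about 10000 draws per sensor
```

Rejection costs extra routed messages for large cells, which is the price of approximating exact uniformity; in the actual protocol the cell areas come from the distributed Voronoi computation the abstract mentions.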
Abstract:
A novel technique to detect and localize periodic movements in video is presented. The distinctive feature of the technique is that it requires neither feature tracking nor object segmentation. Intensity patterns along linear sample paths in space-time are used in estimation of period of object motion in a given sequence of frames. Sample paths are obtained by connecting (in space-time) sample points from regions of high motion magnitude in the first and last frames. Oscillations in intensity values are induced at time instants when an object intersects the sample path. The locations of peaks in intensity are determined by parameters of both cyclic object motion and orientation of the sample path with respect to object motion. The information about peaks is used in a least squares framework to obtain an initial estimate of these parameters. The estimate is further refined using the full intensity profile. The best estimate for the period of cyclic object motion is obtained by looking for consensus among estimates from many sample paths. The proposed technique is evaluated with synthetic videos where ground-truth is known, and with American Sign Language videos where the goal is to detect periodic hand motions.
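The consensus step can be sketched with synthetic intensity profiles. The pulse trains below are hypothetical stand-ins for the intensity oscillations along sample paths (one path is deliberately corrupted), and the per-path estimate is reduced to mean inter-peak spacing rather than the paper's full least-squares refinement:

```python
import statistics

def peak_times(signal, thresh):
    """Indices where intensity crosses above the threshold (object on the path)."""
    return [i for i in range(1, len(signal))
            if signal[i] >= thresh > signal[i - 1]]

def pulses(period, n, phase):
    """Hypothetical intensity profile: a 2-frame pulse every `period` frames."""
    return [1.0 if (i - phase) % period < 2 else 0.0 for i in range(n)]

# Three sample paths; the last one is an outlier that consensus should reject
paths = [pulses(12, 120, 0), pulses(12, 120, 5), pulses(7, 120, 3)]

estimates = []
for p in paths:
    t = peak_times(p, 0.5)
    if len(t) > 1:
        estimates.append((t[-1] - t[0]) / (len(t) - 1))  # mean inter-peak spacing

period = statistics.median(estimates)   # consensus across sample paths
```

The median across paths plays the role of the consensus search: a single path crossed at an oblique angle, or corrupted by noise, yields a wrong period, but agreement among many paths recovers the true cycle length.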