938 results for Data streams


Relevance: 30.00%

Abstract:

Frequent episode discovery is a popular framework for mining data available as a long sequence of events. An episode is essentially a short ordered sequence of event types, and the frequency of an episode is some suitable measure of how often the episode occurs in the data sequence. Recently, we proposed a new frequency measure for episodes based on the notion of non-overlapped occurrences of episodes in the event sequence, and showed that such a definition, in addition to yielding computationally efficient algorithms, has some important theoretical properties connecting frequent episode discovery with HMM learning. This paper presents some new algorithms for frequent episode discovery under this non-overlapped occurrences-based frequency definition. The algorithms presented here are better (by a factor of N, where N denotes the size of episodes being discovered) in terms of both time and space complexities when compared to existing methods for frequent episode discovery. We show through simulation experiments that our algorithms are very efficient. The new algorithms presented here have arguably the least possible orders of space and time complexities for the task of frequent episode discovery.
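
As an illustration of the frequency definition used above, here is a minimal Python sketch (not the paper's multi-episode algorithm) that counts non-overlapped occurrences of a single serial episode with one greedy left-to-right scan; for serial episodes this greedy scan finds the maximum possible number of non-overlapped occurrences.

```python
def count_nonoverlapped(events, episode):
    """Count non-overlapped occurrences of a serial episode
    (an ordered tuple of event types) in an event stream."""
    i = 0        # index of the next episode node we are waiting for
    count = 0
    for e in events:
        if e == episode[i]:
            i += 1
            if i == len(episode):  # one full occurrence completed
                count += 1
                i = 0              # reset: occurrences may not overlap
    return count

# The serial episode (A, B, C) occurs twice without overlap here:
print(count_nonoverlapped("AXBYCABZC", ("A", "B", "C")))  # -> 2
```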

Relevance: 30.00%

Abstract:

In contrast to cost modeling activities, the pricing of services must be simple and transparent. Calculating, and thus knowing, price structures would not only help identify the level of detail required for cost modeling of individual institutions, but also help develop a "public" market for services, as well as clarify the division of tasks and the modeling of funding and revenue streams for data preservation by public institutions. This workshop built on the results of the workshop "The Costs and Benefits of Keeping Knowledge", which took place on 11 June 2012 in Copenhagen. This expert workshop aimed at:

• Identifying ways for data repositories to abstract from their complicated cost structures and arrive at one transparent pricing structure which can be aligned with available and plausible funding schemes. Those repositories will probably need a stable institutional funding stream for data management and preservation. Are there any estimates for this, absolute or as a percentage of overall cost? Part of the revenue will probably have to come through data management fees upon ingest. How could that be priced? Per dataset, per GB, or as a percentage of research cost? Will it be necessary to charge access prices, even though they contradict the open science paradigm?
• What are the price components for pricing individual services, and which prices are currently being paid, e.g. to commercial providers? What are the description and conditions of the service(s) delivered and guaranteed?
• What types of risks are inherent in these pricing schemes?
• How can services and prices be defined in an all-inclusive and simple manner, so as to enable researchers to apply for a specific amount when asking for funding of data-intensive projects?

Relevance: 30.00%

Abstract:

Few detailed studies have been made on the ecology of the chalk streams. A complex community of plants and animals is present, and much more information is required to achieve an understanding of the requirements and interactions of all the species. It is important that the rivers affected by this scheme should be studied and kept under continued observation so that any effects produced by the scheme can be detected. The report gives a synopsis of work carried out between 1971 and 1979, focusing on the present phase, 1978-1979. It assumes some familiarity with the investigations carried out on the River Lambourn during the preceding years. The aims of the present phase of the project may be divided into two broad aspects. The first involves collecting further information in the field and includes three objectives: a continuation of studies on the Lambourn sites at Bagnor; comparative studies on other chalk streams; and a comparative study on a limestone stream. The second involves detailed analyses of data previously collected, to document the recovery of the Lambourn from operational pumping and to attempt to develop simple conceptual and predictive models applicable over a wide range of physical and geographical variables. (PDF contains 43 pages)

Relevance: 30.00%

Abstract:

Eight streams from the North West of England were stocked with Atlantic salmon (Salmo salar L.) fed fry at densities ranging from 1 to 4/m² over a period of up to three years, to evaluate survival to the end of the first and second growing periods and hence assess the value of stocking as a management practice. Survival to the end of the first growing period (mean duration of 108 days) was found to vary between 7.8 and 41.3%, with a mean of 22% and CV of 0.44. Survival from the end of the first growing period to the end of the second growing period (mean duration of 384 days) ranged from 19.9 to 34.1%, with a mean of 26.3% and CV of 0.21. Survival was found to be positively related to 0+ trout density (P < 0.05) and negatively related to altitude (P < 0.05). A comparison of the raw survival data (not standardised with respect to the duration of the experiments) with that from other studies in relation to stocking densities revealed a negative relationship between fry survival and stocking density (P < 0.05); densities in excess of 5/m² tended to result in lower levels of survival. Post-stocking fry dispersal patterns were examined for the 1991 data. On average, 86.7% of the surviving fry remained within the stocked zone by the end of the first growing period. With the exception of one stream, there was little dispersal beyond the stocked zone, and the dispersal pattern approximated the normal distribution (P < 0.05). It was estimated that stocking can result in a net gain of fish to a river system compared with natural productivity; however, the numerical significance of this gain and its cost effectiveness need to be determined on a river-specific basis.
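
A hedged sketch of the kind of analysis reported above: an ordinary least-squares fit of survival against 0+ trout density and altitude. All the numbers below are invented placeholders, not the study's data; only the coefficient signs are meant to echo the reported relationships.

```python
import numpy as np

# Invented placeholder data: survival (%), 0+ trout density (n/m^2),
# and site altitude (m) for six hypothetical stocked streams.
survival = np.array([10.9, 30.1, 17.8, 24.3, 20.4, 22.7])
trout_density = np.array([0.2, 0.9, 0.5, 0.8, 0.3, 0.6])
altitude = np.array([250.0, 60.0, 200.0, 150.0, 90.0, 120.0])

# Ordinary least squares: survival ~ intercept + trout + altitude.
X = np.column_stack([np.ones_like(trout_density), trout_density, altitude])
coef, *_ = np.linalg.lstsq(X, survival, rcond=None)
print(coef)  # positive trout coefficient, negative altitude coefficient
```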

Relevance: 30.00%

Abstract:

Depth data from archival tags on northern rock sole (Lepidopsetta polyxystra) were examined to assess whether fish used tidal currents to aid horizontal migration. Two northern rock sole, out of 115 released with archival tags in the eastern Bering Sea, were recovered 314 and 667 days after release. Both fish made periodic excursions away from the bottom, mostly during night-time hours but also during particular phases of the tide cycle. One fish that was captured and released in an area of rotary currents made vertical excursions that were correlated with tidal current direction. To test the hypothesis that the fish made vertical excursions to use tidal currents to aid migration, a hypothetical migratory path was calculated using a tide model to predict the current direction and speed during periods when the fish was off the bottom. This migration included limited movements from July through December, followed by a 200-km southward migration from January through February, then a return northward in March and April. The successful application of tidal current information to predict a horizontal migratory path not only provides evidence of selective tidal stream transport but indicates that vertical excursions were conducted primarily to assist horizontal migration.
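
A minimal sketch of the path-reconstruction idea, assuming a toy rotary tide model in place of the regional tide model used in the study: horizontal displacement is integrated only over the intervals the fish spends off the bottom, when it is assumed to drift passively with the tidal current.

```python
import math

def tidal_current(t_hours):
    """Stand-in tide model returning (east, north) current in m/s:
    a single semidiurnal (12.42 h) rotary constituent. A real analysis
    would query a regional tidal model instead."""
    phase = 2 * math.pi * t_hours / 12.42
    return 0.4 * math.cos(phase), 0.4 * math.sin(phase)

def migration_path(off_bottom_intervals, dt_hours=0.25):
    """Integrate drift over the periods the fish is off the bottom."""
    x = y = 0.0
    path = [(x, y)]
    for t0, t1 in off_bottom_intervals:
        t = t0
        while t < t1:
            u, v = tidal_current(t)
            x += u * dt_hours * 3600   # m/s * s -> metres
            y += v * dt_hours * 3600
            t += dt_hours
        path.append((x, y))
    return path

# Two hypothetical off-bottom excursions (hours since release):
print(migration_path([(0.0, 3.0), (12.5, 15.5)])[-1])
```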

Relevance: 30.00%

Abstract:

This is the Evaluation of the impact of cypermethrin use in forestry on Welsh streams from the University of Plymouth, published in September 2010 by the Environment Agency South West. The report focuses attention on cypermethrin, a highly active synthetic pyrethroid insecticide effective against a wide range of pests in agriculture, public health, and animal husbandry. It is also used in forestry to control the pine weevil, Hylobius abietis. Cypermethrin is very toxic to aquatic invertebrates and fish at nanogram-per-litre concentrations. This project checked the effectiveness of current best-practice measures in minimising the risk of pollution associated with the use of cypermethrin in forestry in Wales. Chemical results from the intensive studies show that cypermethrin entered minor watercourses draining treated areas at two of the eight sites. In one of these cases the level was well in excess of the short-term Predicted No Effect Concentration. The absence of a buffer area at the other site resulted in the cypermethrin reaching a main drain; however, dilution appeared to be sufficient to prevent any impact on water quality or on the invertebrate community in the main stream. Invertebrate and chemical data from the extensive survey showed little evidence of pollution due to wider use of cypermethrin in Welsh forestry. Finally, a number of recommendations are made for further tightening controls on forestry practice to minimise the risk of cypermethrin entering the aquatic environment.
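
The exceedance check mentioned above is, in effect, a risk-quotient calculation as commonly used in ecotoxicology; a minimal sketch with placeholder concentrations (not the report's measurements) is:

```python
def risk_quotient(mec_ng_per_l, pnec_ng_per_l):
    """Measured environmental concentration over the Predicted No
    Effect Concentration; RQ > 1 flags a potential risk."""
    return mec_ng_per_l / pnec_ng_per_l

# Placeholder values only; cypermethrin PNECs are in the ng/l range.
rq = risk_quotient(mec_ng_per_l=5.0, pnec_ng_per_l=0.1)
print(f"RQ = {rq:.0f} ({'exceeds' if rq > 1 else 'below'} PNEC)")
```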

Relevance: 30.00%

Abstract:

This is the Acid rain project biosurveys of streams in the Wastwater catchment, produced by the North West Water Authority in 1985. This report forms part of a series on component biological investigations, identified by location or topic, within the acid rain project. Reporting of the Wastwater catchment data would not ordinarily have been given priority, but it has been brought forward to coincide with J. Robinson's reporting of his investigations of land use and liming in the catchment. The report presents water chemistry results from violent rainstorms, including pH, alkalinity, and Mg, Ca and Al concentrations, together with invertebrate, fish and chemical data for Wastwater catchment sites.

Relevance: 30.00%

Abstract:

A roofing contractor typically needs to acquire as-built dimensions of a roof structure several times over the course of its construction to be able to digitally fabricate sheet metal roof panels. Obtaining these measurements using existing roof surveying methods can be costly in terms of equipment, labor, and/or worker exposure to safety hazards. This paper presents a video-based surveying technology as an alternative method which is simple to use, automated, less expensive, and safe. When using this method, the contractor collects video streams with a calibrated stereo camera set. Unique visual characteristics of scenes from a roof structure are then used in the processing step to automatically extract as-built dimensions of roof planes. These dimensions are finally represented in an XML format to be loaded into sheet metal folding and cutting machines. The proposed method has been tested on a roofing project, and the preliminary results indicate its capabilities.
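
A small sketch of the final serialisation step, assuming an invented schema (the paper does not publish the XML format expected by the folding and cutting machines, and real machines each have their own):

```python
import xml.etree.ElementTree as ET

def planes_to_xml(planes):
    """Serialise extracted roof planes, each given as a list of (x, y)
    corner coordinates in metres, to a hypothetical XML layout."""
    root = ET.Element("roof")
    for i, corners in enumerate(planes):
        plane = ET.SubElement(root, "plane", id=str(i))
        for x, y in corners:
            ET.SubElement(plane, "corner", x=f"{x:.3f}", y=f"{y:.3f}")
    return ET.tostring(root, encoding="unicode")

# One rectangular 6.0 m x 4.2 m roof plane (in-plane coordinates):
print(planes_to_xml([[(0, 0), (6.0, 0), (6.0, 4.2), (0, 4.2)]]))
```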

Relevance: 30.00%

Abstract:

This paper presents a model for the general flow of information in the neocortex. The basic process, called "sequence-seeking," is a search for a sequence of mappings or transformations linking source and target representations. The search is bi-directional, "bottom-up" as well as "top-down," and it explores in parallel a large number of alternative sequences. This operation is implemented in a structure termed "counter streams," in which multiple sequences are explored along two separate, complementary pathways which seek to meet. The first part of the paper discusses the general sequence-seeking scheme and a number of related processes, such as the learning of successful sequences, context effects, and the use of "express lines" and partial matches. The second part discusses biological implications of the model in terms of connections within and between cortical areas. The model is compared with existing data, and a number of new predictions are proposed.
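
As a loose computational analogy (not the paper's neural implementation), sequence-seeking can be caricatured as a bi-directional search over mapping sequences, with the two streams meeting at a shared intermediate representation:

```python
from collections import deque

def sequence_seek(source, target, mappings, inverses, max_depth=6):
    """Bi-directional search for a sequence of mappings linking source
    and target. `mappings` expands representations bottom-up from the
    source, `inverses` top-down from the target; both are dicts mapping
    rep -> list of (label, next_rep). The top-down sequence is recorded
    target-outwards, so it reads in reverse."""
    fwd, bwd = {source: []}, {target: []}
    frontier_f, frontier_b = deque([source]), deque([target])
    for _ in range(max_depth):
        for frontier, seen, other, table in (
                (frontier_f, fwd, bwd, mappings),
                (frontier_b, bwd, fwd, inverses)):
            for _ in range(len(frontier)):
                rep = frontier.popleft()
                for label, nxt in table.get(rep, []):
                    if nxt in seen:
                        continue
                    seen[nxt] = seen[rep] + [label]
                    if nxt in other:      # the two streams have met
                        return seen[nxt], other[nxt]
                    frontier.append(nxt)
    return None

maps = {"image": [("edge", "edges")], "edges": [("group", "shape")]}
invs = {"chair": [("decompose", "shape")]}
print(sequence_seek("image", "chair", maps, invs))
# -> (['edge', 'group'], ['decompose'])
```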

Relevance: 30.00%

Abstract:

Passive monitoring of large sites typically requires coordination between multiple cameras, which in turn requires methods for automatically relating events between distributed cameras. This paper tackles the problem of self-calibration of multiple cameras which are very far apart, using feature correspondences to determine the camera geometry. The key problem is finding such correspondences. Since the camera geometry and photometric characteristics vary considerably between images, one cannot use brightness and/or proximity constraints. Instead we apply planar geometric constraints to moving objects in the scene in order to align the scene's ground plane across multiple views. We do not assume synchronized cameras, and we show that enforcing geometric constraints enables us to align the tracking data in time. Once we have recovered the homography which aligns the planar structure in the scene, we can compute from the homography matrix the 3D position of the plane and the relative camera positions. This in turn enables us to recover a homography matrix which maps the images to an overhead view. We demonstrate this technique in two settings: a controlled lab setting where we test the effects of errors in internal camera calibration, and an uncontrolled, outdoor setting in which the full procedure is applied to external camera calibration and ground plane recovery. In spite of noise in the internal camera parameters and image data, the system successfully recovers both planar structure and relative camera positions in both settings.
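
A minimal sketch of the ground-plane alignment step using OpenCV, with synthetic correspondences standing in for the temporally aligned object tracks that the full system produces:

```python
import numpy as np
import cv2

# Synthetic stand-in for tracked ground-plane points: view-B points are
# generated from a known homography plus pixel noise, so the recovery
# below can be checked against H_true.
H_true = np.array([[0.8, 0.1, 30.0], [-0.05, 0.9, 40.0], [1e-4, 2e-4, 1.0]])
pts_a = np.array([[100, 400], [220, 380], [340, 300], [460, 350],
                  [150, 300], [300, 280], [450, 260], [520, 330]], float)
proj = (H_true @ np.hstack([pts_a, np.ones((8, 1))]).T).T
pts_b = proj[:, :2] / proj[:, 2:] + np.random.normal(0.0, 0.5, (8, 2))

# RANSAC tolerates the outliers that real tracking data would contain.
H, inlier_mask = cv2.findHomography(pts_a, pts_b, cv2.RANSAC, 3.0)
print(H / H[2, 2])  # close to H_true

# Decomposing H against the internal calibration (e.g. with
# cv2.decomposeHomographyMat) then yields the relative camera pose and
# the ground-plane orientation, as the abstract describes.
```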

Relevance: 30.00%

Abstract:

Current research on Internet-based distributed systems emphasizes the scalability of overlay topologies for efficient search and retrieval of data items, as well as routing amongst peers. However, most existing approaches fail to address the transport of data across these logical networks in accordance with quality of service (QoS) constraints. Consequently, this paper investigates the use of scalable overlay topologies for routing real-time media streams between publishers and potentially many thousands of subscribers. Specifically, we analyze the costs of using k-ary n-cubes for QoS-constrained routing. Given a number of nodes in a distributed system, we calculate the optimal k-ary n-cube structure for minimizing the average distance between any pair of nodes. Using this structure, we describe a greedy algorithm that selects paths between nodes in accordance with the real-time delays along physical links. We show this method improves the routing latencies by as much as 67%, compared to approaches that do not consider physical link costs. We are in the process of developing a method for adaptive node placement in the overlay topology, based upon the locations of publishers, subscribers, physical link costs and per-subscriber QoS constraints. One such method for repositioning nodes in logical space is discussed, to improve the likelihood of meeting service requirements on data routed between publishers and subscribers. Future work will evaluate the benefits of such techniques more thoroughly.
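
A minimal sketch of the first step, under the assumption that "average distance" means the expected hop count between two uniformly chosen nodes of the torus (the paper may normalise differently):

```python
def avg_distance(k, n):
    """Expected routing distance between two uniformly random nodes of
    a k-ary n-cube (an n-dimensional torus, k nodes per dimension):
    distances add across dimensions, and per dimension the expected
    circular distance is sum(min(d, k - d) for d in range(k)) / k."""
    per_dim = sum(min(d, k - d) for d in range(k)) / k
    return n * per_dim

def best_cube(num_nodes, k_max=32, n_max=16):
    """(average distance, k, n) of the smallest-average-distance cube
    among those with k**n >= num_nodes."""
    return min((avg_distance(k, n), k, n)
               for k in range(2, k_max + 1)
               for n in range(1, n_max + 1)
               if k ** n >= num_nodes)

print(best_cube(4096))  # under this measure, the 3-ary 8-cube wins
```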

Relevance: 30.00%

Abstract:

We introduce a view-point invariant representation of moving object trajectories that can be used in video database applications. It is assumed that trajectories lie on a surface that can be locally approximated with a plane. Raw trajectory data is first locally approximated with a cubic spline via least squares fitting. For each sampled point of the obtained curve, a projective invariant feature is computed using a small number of points in its neighborhood. The resulting sequence of invariant features computed along the entire trajectory forms the view invariant descriptor of the trajectory itself. Time parametrization has been exploited to compute cross ratios without ambiguity due to point ordering. Similarity between descriptors of different trajectories is measured with a distance that takes into account the statistical properties of the cross ratio, and its symmetry with respect to the point at infinity. In experiments, an overall correct classification rate of about 95% has been obtained on a dataset of 58 trajectories of players in soccer video, and an overall correct classification rate of about 80% has been obtained on matching partial segments of trajectories collected from two overlapping views of outdoor scenes with moving people and cars.
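
To make the invariant concrete, here is a sketch of one classical five-point projective invariant of coplanar points, built from ratios of 3×3 determinants; the paper's descriptor is of this cross-ratio family, computed along the spline-smoothed trajectory with time order fixing the point labels.

```python
import numpy as np

def det3(p, i, j, k):
    return np.linalg.det(np.stack([p[i], p[j], p[k]]))

def five_point_invariant(pts):
    """One projective invariant of five coplanar points: a ratio of
    3x3 determinants of their homogeneous coordinates."""
    p = np.hstack([np.asarray(pts, float), np.ones((5, 1))])
    return (det3(p, 0, 1, 3) * det3(p, 0, 2, 4)) / \
           (det3(p, 0, 1, 4) * det3(p, 0, 2, 3))

# The value survives an arbitrary projective transformation H:
pts = np.array([[0, 0], [1, 0], [2, 1], [1, 3], [0, 2]], float)
H = np.array([[1.1, 0.2, 3.0], [-0.1, 0.9, 1.0], [0.001, 0.002, 1.0]])
w = (H @ np.hstack([pts, np.ones((5, 1))]).T).T
warped = w[:, :2] / w[:, 2:]
print(five_point_invariant(pts), five_point_invariant(warped))  # equal
```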

Relevance: 30.00%

Abstract:

The cerebral cortex contains circuitry for continuously computing properties of the environment and one's body, as well as relations among those properties. The success of complex perceptuomotor performances requires integrated, simultaneous use of such relational information. Ball catching is a good example as it involves reaching and grasping of visually pursued objects that move relative to the catcher. Although integrated neural control of catching has received sparse attention in the neuroscience literature, behavioral observations have led to the identification of control principles that may be embodied in the involved neural circuits. Here, we report a catching experiment that refines those principles via a novel manipulation. Visual field motion was used to perturb velocity information about balls traveling on various trajectories relative to a seated catcher, with various initial hand positions. The experiment produced evidence for a continuous, prospective catching strategy, in which hand movements are planned based on gaze-centered ball velocity and ball position information. Such a strategy was implemented in a new neural model, which suggests how position, velocity, and temporal information streams combine to shape catching movements. The model accurately reproduces the main and interaction effects found in the behavioral experiment and provides an interpretation of recently observed target motion-related activity in the motor cortex during interceptive reaching by monkeys. It functionally interprets a broad range of neurobiological and behavioral data, and thus contributes to a unified theory of the neural control of reaching to stationary and moving targets.
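
A toy sketch of the control principle described above, in which the hand is continuously driven toward an interception point predicted from gaze-centred ball position and velocity; the gains and first-order hand dynamics are invented assumptions, and the actual neural model is far richer.

```python
import numpy as np

def prospective_hand_plan(ball_pos, ball_vel, hand_pos, t_remaining,
                          gain=4.0, dt=0.01, steps=100):
    """At every time step, drive the hand toward the interception point
    predicted from the current ball state, so the movement plan is
    revised continuously rather than computed once."""
    ball = np.array(ball_pos, float)
    vel = np.array(ball_vel, float)
    hand = np.array(hand_pos, float)
    for _ in range(steps):
        predicted = ball + vel * t_remaining    # prospective target
        hand += gain * (predicted - hand) * dt  # first-order dynamics
        ball += vel * dt
        t_remaining = max(t_remaining - dt, 0.0)
    return hand

# Hand starting at (0.3, 0.4) m chasing a ball 1 s from arrival:
print(prospective_hand_plan([0.0, 2.0], [0.1, -1.8], [0.3, 0.4], 1.0))
```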

Relevance: 30.00%

Abstract:

In this paper we propose a graph stream clustering algorithm with a unified similarity measure on both structural and attribute properties of vertices, with each attribute being treated as a vertex. Unlike others, our approach does not require an input parameter for the number of clusters; instead, it dynamically creates new sketch-based clusters and periodically merges existing similar clusters. Experiments on two publicly available datasets reveal the advantages of our approach in detecting vertex clusters in the graph stream. We provide a detailed investigation into how parameters affect the algorithm's performance. We also provide a quantitative evaluation and comparison with a well-known offline community detection algorithm, which shows that our streaming algorithm can achieve comparable or better average cluster purity.
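
A hedged sketch of the scheme, with plain vertex sets standing in for the fixed-size sketches of the actual algorithm; the thresholds and merge period are invented parameters.

```python
def jaccard(a, b):
    return len(a & b) / len(a | b)

class StreamClusterer:
    """Toy graph stream clusterer: attribute values are treated as
    extra vertices so one similarity covers structure and attributes,
    new clusters are opened on demand, similar ones merged periodically."""
    def __init__(self, new_thresh=0.1, merge_thresh=0.5, merge_every=100):
        self.clusters, self.seen = [], 0
        self.new_thresh, self.merge_thresh = new_thresh, merge_thresh
        self.merge_every = merge_every

    def add_edge(self, u, v, attrs=()):
        item = {u, v, *attrs}
        best, score = None, 0.0
        for c in self.clusters:
            s = jaccard(item, c)
            if s > score:
                best, score = c, s
        if best is None or score < self.new_thresh:
            self.clusters.append(set(item))   # open a new cluster
        else:
            best |= item                      # absorb into nearest
        self.seen += 1
        if self.seen % self.merge_every == 0:
            self._merge()

    def _merge(self):
        i = 0
        while i < len(self.clusters):
            j = i + 1
            while j < len(self.clusters):
                if jaccard(self.clusters[i], self.clusters[j]) >= self.merge_thresh:
                    self.clusters[i] |= self.clusters.pop(j)
                else:
                    j += 1
            i += 1

sc = StreamClusterer()
for u, v, attrs in [(1, 2, ("red",)), (2, 3, ("red",)), (7, 8, ("blue",))]:
    sc.add_edge(u, v, attrs)
print(sc.clusters)  # two clusters: {1, 2, 3, 'red'} and {7, 8, 'blue'}
```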

Relevance: 30.00%

Abstract:

Biogas from anaerobic digestion of sewage sludge is a renewable resource with high energy content, formed mainly of CH4 (40-75 vol.%) and CO2 (15-60 vol.%). Other components such as water (H2O, 5-10 vol.%) and trace amounts of hydrogen sulfide and siloxanes can also be present. A CH4-rich stream can be produced by removing the CO2 and other impurities, so that the upgraded bio-methane can be injected into the natural gas grid or used as a vehicle fuel. The main objective of this paper is to develop a new modeling methodology to assess the technical and economic performance of biogas upgrading processes using ionic liquids which physically absorb CO2. Three ionic liquids, namely 1-ethyl-3-methylimidazolium bis[(trifluoromethyl)sulfonyl]imide, 1-hexyl-3-methylimidazolium bis[(trifluoromethyl)sulfonyl]imide and trihexyl(tetradecyl)phosphonium bis[(trifluoromethyl)sulfonyl]imide, are considered for CO2 capture in a pressure-swing regenerative absorption process. The simulation software Aspen Plus and Aspen Process Economic Analyzer are used to account for mass and energy balances as well as equipment cost. In all cases, the biogas upgrading plant consists of a multistage compressor for biogas compression, a packed absorption column for CO2 absorption, a flash evaporator for solvent regeneration, a centrifugal pump for solvent recirculation, a pre-absorber solvent cooler and a gas turbine for electricity recovery. The evaluated processes are compared in terms of energy efficiency, capital investment and bio-methane production costs. The overall plant efficiency ranges from 71 to 86%, whereas the bio-methane production cost ranges from £6.26 to £7.76 per GJ (LHV). A sensitivity analysis is also performed to determine how several technical and economic parameters affect the bio-methane production costs. The results of this study show that the simulation methodology developed can predict plant efficiencies and production costs of large-scale CO2 capture processes using ionic liquids without having to rely on gas solubility experimental data.
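
As a back-of-envelope companion to the flowsheet results, here is a Henry's-law estimate of the minimum solvent circulation for a pressure-swing physical absorber; the Henry constant below is an assumed order of magnitude, not a measured value for any of the three liquids studied.

```python
def min_solvent_flow(co2_molar_flow, p_abs_bar, p_des_bar, y_co2,
                     henry_bar=40.0):
    """kmol/h of ionic liquid needed to absorb co2_molar_flow (kmol/h)
    of CO2 when swinging between absorber and desorber pressure.
    Physical absorption: equilibrium loading x = y * P / H (Henry's law,
    loading in mol CO2 per mol liquid, a simplification)."""
    x_rich = y_co2 * p_abs_bar / henry_bar   # loading leaving absorber
    x_lean = y_co2 * p_des_bar / henry_bar   # residual loading after flash
    return co2_molar_flow / (x_rich - x_lean)

# 40 vol% CO2 biogas, absorbing at 8 bar and regenerating at 1 bar:
print(min_solvent_flow(co2_molar_flow=10.0, p_abs_bar=8.0,
                       p_des_bar=1.0, y_co2=0.40))  # ~143 kmol/h
```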