971 results for Data filtering
Abstract:
A new technique for the harmonic analysis of current observations is described. It consists of applying a linear band-pass filter which separates the various species and removes the contribution of non-tidal effects at intertidal frequencies. The tidal constituents are then evaluated through the method of least squares. In spite of the narrowness of the filter, only three days of data are lost through the filtering procedure, and the only requirement on the data is that the time interval between samples be an integer fraction of one day. This technique is illustrated through the analysis of a few French current observations from the English Channel within the framework of INOUT. The characteristics of the main tidal constituents are given.
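The least-squares evaluation of tidal constituents described above can be sketched as follows. This is a minimal illustration only: the band-pass filtering stage is omitted, and the frequency, amplitude and variable names are assumptions for the example, not the paper's implementation.

```python
import numpy as np

def fit_constituents(t_hours, series, freqs_cpd):
    """Least-squares fit of cosine/sine pairs at given tidal frequencies.

    t_hours: sample times in hours; freqs_cpd: frequencies in cycles per day.
    Returns {frequency: (amplitude, phase_degrees)}.
    """
    cols = [np.ones_like(t_hours)]            # mean level
    for f in freqs_cpd:
        w = 2.0 * np.pi * f / 24.0            # cycles/day -> rad/hour
        cols.append(np.cos(w * t_hours))
        cols.append(np.sin(w * t_hours))
    A = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(A, series, rcond=None)
    out = {}
    for i, f in enumerate(freqs_cpd):
        a, b = coef[1 + 2 * i], coef[2 + 2 * i]
        out[f] = (float(np.hypot(a, b)), float(np.degrees(np.arctan2(b, a))))
    return out

# Synthetic M2-like constituent: 1.9323 cycles/day (period ~12.42 h),
# sampled hourly for 30 days -- an integer fraction of one day, as required.
t = np.arange(0.0, 24.0 * 30.0, 1.0)
m2 = 1.9323
series = 0.8 * np.cos(2.0 * np.pi * m2 / 24.0 * t - 0.5)
amp, phase = fit_constituents(t, series, [m2])[m2]
```

On this noise-free synthetic record the fit recovers the amplitude (0.8) and phase (0.5 rad ≈ 28.65°) essentially exactly; on real data the filtered series would be fitted instead.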
Abstract:
Sampling was conducted from March 24 to August 5, 2010, in the fjord branch Kapisigdlit, located in the inner part of the Godthåbsfjord system, West Greenland. The vessel "Lille Masik" was used during all cruises except on June 17-18, when sampling was done from RV Dana (National Institute for Aquatic Resources, Denmark). A total of 15 cruises (of 1-2 days duration), 7-10 days apart, were carried out along a transect composed of 6 stations (St.), spanning the length of the 26 km long fjord branch. St. 1 was located at the mouth of the fjord branch and St. 6 at the end of the fjord branch, in the middle of a shallower inner creek. St. 1-4 covered the deeper parts of the fjord, and St. 5 was located on the slope leading up to the shallow inner creek. Mesozooplankton was sampled by vertical net tows using a Hydrobios Multinet (type Mini) equipped with a flow meter and 50 µm mesh nets, or a WP-2 net (50 µm mesh size) equipped with a non-filtering cod-end. Sampling was conducted at various times of day at the different stations. The nets were hauled at a speed of 0.2-0.3 m s**-1 from 100, 75 and 50 m depth to the surface at St. 2 + 4, 5 and 6, respectively. The content was immediately preserved in buffered formalin (4% final concentration). All samples were analyzed at the Plankton Sorting and Identification Center in Szczecin (www.nmfri.gdynia.pl). Samples containing high numbers of zooplankton were split into subsamples. All copepods and other zooplankton were identified down to the lowest possible taxonomic level (approx. 400 per sample), length-measured and counted. Copepods were sorted into developmental stages (nauplii stage 1 - copepodite stage 6) using morphological features and sizes, and up to 10 individuals of each stage were length-measured.
Abstract:
A unique macroseismic data set for the strongest earthquakes that have occurred since 1940 in the Vrancea region is constructed by a thorough review of all available sources. Inconsistencies and errors in the reported data and in their use are analyzed as well. The final data set, free from inconsistencies, including those at the political borders, contains 9822 observations for the strong intermediate-depth earthquakes: 1940, Mw=7.7; 1977, Mw=7.4; 1986, Mw=7.1; 1990, May 30, Mw=6.9; 1990, May 31, Mw=6.4; and 2004, Mw=6.0. This data set is available electronically as supplementary data for the present paper. From the discrete macroseismic data the continuous macroseismic field is generated using the methodology developed by Molchan et al. (2002) that, along with the unconventional smoothing method Modified Polynomial Filtering (MPF), uses the Diffused Boundary (DB) method, which visualizes the uncertainty in the isoseismals' boundaries. The comparison of DBs with previously published isoseismal maps represents a good criterion for evaluating the reliability of those earlier maps. The produced isoseismals can be used not only for the formal comparison between observed and theoretical isoseismals, but also for the retrieval of source properties and the assessment of local responses (Molchan et al., 2011).
Abstract:
Climate change mediates marine chemical and physical environments and therefore influences marine organisms. While increasing atmospheric CO2 levels and the associated ocean acidification have been predicted to stimulate marine primary productivity and may affect community structure, the processes that impact the food chain and the biological CO2 pump are less documented. We hypothesized that copepods, as secondary marine producers, may respond to future changes in seawater carbonate chemistry associated with ocean acidification due to increasing atmospheric CO2 concentration. Here, we show that the copepod Centropages tenuiremis was able to perceive the chemical changes in seawater induced under elevated CO2 concentration (>1700 µatm, pH < 7.60) with an avoidance strategy. The copepod's respiration increased at elevated CO2 (1000 µatm) and the associated acidity (pH 7.83), and its feeding rates also increased correspondingly, except for the initial acclimation period, when it fed less. Our results imply that marine secondary producers increase their respiration and feeding rates in response to ocean acidification to balance the energy cost against increased acidity and CO2 concentration.
Abstract:
In this paper, we propose a particle filtering (PF) method for indoor tracking using radio frequency identification (RFID) based on aggregated binary measurements. We use an Ultra High Frequency (UHF) RFID system that is composed of a standard RFID reader, a large set of standard passive tags whose locations are known, and a newly designed, special semi-passive tag attached to an object that is tracked. This semi-passive tag has the dual ability to sense the backscatter communication between the reader and other passive tags in its proximity, and to communicate this sensed information to the reader using backscatter modulation. We refer to this tag as a sense-a-tag (ST). Thus, the ST can provide the reader with information that can be used to determine the kinematic parameters of the object to which the ST is attached. We demonstrate the performance of the method with data obtained in a laboratory environment.
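As a rough illustration of how binary proximity detections can drive a particle filter, the following is a minimal bootstrap-filter sketch. The tag layout, soft detection model, random-walk motion model and all parameter values are assumptions for illustration, not the system described above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed setup: four passive tags at known positions; the ST's binary
# observations follow a soft distance-based detection model.
TAGS = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 4.0], [4.0, 4.0]])

def detect_prob(dist):
    """Assumed probability that a backscatter link at this distance is sensed."""
    return 1.0 / (1.0 + np.exp(4.0 * (dist - 2.5)))

def pf_step(particles, weights, z):
    """One bootstrap cycle: predict, weight by the binary vector z, resample."""
    particles = particles + rng.normal(0.0, 0.2, particles.shape)  # random walk
    d = np.linalg.norm(particles[:, None, :] - TAGS[None, :, :], axis=2)
    p = detect_prob(d)
    lik = np.prod(np.where(z[None, :] == 1, p, 1.0 - p), axis=1)
    weights = weights * lik
    weights = weights / weights.sum()
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))

# Track a stationary object at (1, 1) from simulated binary detections.
true_pos = np.array([1.0, 1.0])
particles = rng.uniform(0.0, 4.0, (500, 2))
weights = np.full(500, 1.0 / 500)
for _ in range(40):
    z = (rng.random(4) <
         detect_prob(np.linalg.norm(TAGS - true_pos, axis=1))).astype(int)
    particles, weights = pf_step(particles, weights, z)
estimate = particles.mean(axis=0)
```

The point of the sketch is that even one-bit observations per tag carry enough information for the posterior particle cloud to concentrate near the true position.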
Abstract:
This thesis proposes how to apply Semantic Web technologies to Idea Management Systems to deliver a solution to knowledge management and information overflow problems. Firstly, the aim is to present a model that introduces rich metadata annotations and their usage in the domain of Idea Management Systems. Furthermore, the thesis investigates how to link innovation data with information from other systems and use it to categorize and filter out the most valuable elements. In addition, the thesis presents a Generic Idea and Innovation Management Ontology (Gi2MO) and aims to back its creation with a set of case studies followed by evaluations that prove how the Semantic Web can work as a tool to create new opportunities and lift contemporary Idea Management legacy systems to the next level.
Abstract:
In ubiquitous data stream mining applications, different devices often aim to learn concepts that are similar to some extent. In these applications, such as spam filtering or news recommendation, the concept underlying the data stream (e.g., interesting mail/news) is likely to change over time. Therefore, the resulting model must be continuously adapted to such changes. This paper presents a novel Collaborative Data Stream Mining (Coll-Stream) approach that explores the similarities in the knowledge available from other devices to improve local classification accuracy. Coll-Stream integrates the community knowledge using an ensemble method in which the classifiers are selected and weighted based on their local accuracy for different partitions of the feature space. We evaluate Coll-Stream classification accuracy in situations with concept drift, noise, varying partition granularity and varying concept similarity relative to the local underlying concept. The experimental results, on both synthetic and real-world datasets, show that the resulting Coll-Stream model achieves stability and accuracy in a variety of situations.
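The core ensemble idea, weighting community classifiers by their running accuracy inside each partition of the feature space, can be sketched as follows. The class and method names, the bucketing rule and the toy classifiers are assumptions for illustration, not Coll-Stream's actual implementation.

```python
import numpy as np

class PartitionedEnsemble:
    """Weight community classifiers by running accuracy per feature-space region."""

    def __init__(self, classifiers, n_partitions):
        self.classifiers = classifiers
        # Laplace-smoothed correct/seen counts per (partition, classifier).
        self.correct = np.ones((n_partitions, len(classifiers)))
        self.seen = np.full((n_partitions, len(classifiers)), 2.0)

    def _partition(self, x):
        # Assumed partitioning rule: bucket on the first feature in [0, 1).
        n = self.correct.shape[0]
        return min(int(x[0] * n), n - 1)

    def predict(self, x):
        p = self._partition(x)
        weights = self.correct[p] / self.seen[p]    # local accuracy estimates
        votes = {}
        for clf, w in zip(self.classifiers, weights):
            label = clf(x)
            votes[label] = votes.get(label, 0.0) + w
        return max(votes, key=votes.get)

    def update(self, x, y_true):
        p = self._partition(x)
        for j, clf in enumerate(self.classifiers):
            self.seen[p, j] += 1
            self.correct[p, j] += (clf(x) == y_true)

# Two toy "community" classifiers on a stream whose true label is always 1.
clf_a = lambda x: 1 if x[0] < 0.5 else 0    # accurate only where x[0] < 0.5
clf_b = lambda x: 1                         # accurate everywhere
ens = PartitionedEnsemble([clf_a, clf_b], n_partitions=2)
rng = np.random.default_rng(1)
for _ in range(400):
    ens.update(rng.random(2), 1)
```

After the updates, clf_a's weight collapses only in the partition where it is wrong, so the ensemble still answers correctly everywhere, which is the intended benefit of per-partition weighting over a single global weight.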
Abstract:
In this work we address a scenario where 3D content is transmitted to a mobile terminal with 3D display capabilities. We consider the use of the 2D plus depth format to represent the 3D content and focus on the generation of synthetic views in the terminal. We evaluate different types of smoothing filters that are applied to depth maps with the aim of reducing the disoccluded regions. The evaluation takes into account the reduction of holes in the synthetic view as well as the presence of geometrical distortion caused by the smoothing operation. The selected filter has been included within an implemented module for the VideoLAN Client (VLC) software in order to render 3D content from the 2D plus depth data format.
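A toy experiment illustrates why smoothing the depth map reduces disoccluded regions in the synthesized view: warping each row by a disparity proportional to depth leaves a wide hole at a sharp depth edge, while a blurred edge spreads the same area into isolated one-pixel gaps that are easy to fill. The box filter, disparity scale and hole metric below are assumptions for illustration, not the filters evaluated in the paper.

```python
import numpy as np

def box_blur_rows(depth, k):
    """Horizontal box blur of width k applied to each row of a depth map."""
    pad = np.pad(depth, ((0, 0), (k // 2, k // 2)), mode="edge")
    kernel = np.ones(k) / k
    return np.apply_along_axis(
        lambda r: np.convolve(r, kernel, mode="valid"), 1, pad)

def max_hole_run(depth, max_disp=8.0):
    """Warp each row by a depth-proportional disparity; return the widest hole."""
    w = depth.shape[1]
    worst = 0
    for row in depth:
        hit = np.zeros(w, dtype=bool)
        target = np.arange(w) + (row * max_disp).astype(int)
        valid = (target >= 0) & (target < w)
        hit[target[valid]] = True        # pixels covered in the synthetic view
        run = best = 0
        for covered in hit:
            run = 0 if covered else run + 1
            best = max(best, run)
        worst = max(worst, best)
    return worst

# Depth map with a sharp foreground object against the background.
depth = np.zeros((16, 64))
depth[:, 20:40] = 1.0
sharp_run = max_hole_run(depth)                      # one wide disocclusion
smooth_run = max_hole_run(box_blur_rows(depth, 9))   # scattered 1-px gaps
```

Note the trade-off the abstract mentions: the blur does not remove the disoccluded area, it redistributes it, at the cost of geometric distortion along the smoothed depth edge.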