992 results for VARIABLE SAMPLING INTERVAL
Abstract:
This preliminary report does not present the distribution of selected key planktonic species in each Leg 133 hole; instead, it extracts the best chronodatum levels in two sets of holes, which make up the Queensland Trough and Townsville Trough transects. In general, the sampling interval was 1.5 m, but it was sometimes larger. To convert the datum levels into time, the absolute ages of Berggren et al. (1985, doi:10.1144/GSL.MEM.1985.010.01.18) were used. Extinction levels were used for the most part because they are the most easily recognized, the order of events appears to be consistent from hole to hole, and they correlate reasonably well with chronodatum levels obtained from nannofossil biostratigraphy (see Gartner et al., 1993, doi:10.2973/odp.proc.sr.133.213.1993).
Abstract:
We analyse time series from 100 patients with bipolar disorder for correlates of depression symptoms. As the sampling interval is non-uniform, we quantify the extent of missing and irregular data using new measures of compliance and continuity. We find that uniformity of response is negatively correlated with the standard deviation of sleep ratings (ρ = -0.26, p = 0.01). To investigate the correlation structure of the time series themselves, we apply the Edelson-Krolik method for correlation estimation. We examine the correlation between depression symptoms for a subset of patients and find that self-reported measures of sleep and appetite/weight show a lower average correlation than other symptoms. Using surrogate time series as a reference dataset, we find no evidence that depression is correlated between patients, though we note a possible loss of information from sparse sampling. © 2013 The Author(s).
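For readers unfamiliar with the Edelson-Krolik approach mentioned above, the following is a minimal sketch of their discrete correlation function for irregularly sampled series. The function and data names are illustrative, and the measurement-error terms of the original normalization are omitted; this is not the authors' implementation.

```python
import numpy as np

def discrete_correlation(t_a, a, t_b, b, lag_edges):
    """Edelson & Krolik (1988) discrete correlation function (DCF) for two
    irregularly sampled series (t_a, a) and (t_b, b).
    lag_edges defines bins for the pairwise lags dt = t_b[j] - t_a[i].
    The measurement-error terms of the original normalization are omitted."""
    a_res, b_res = a - a.mean(), b - b.mean()
    norm = a.std() * b.std()
    dt = t_b[None, :] - t_a[:, None]                 # all pairwise lags
    udcf = (a_res[:, None] * b_res[None, :]) / norm  # unbinned DCF values
    dcf = np.full(len(lag_edges) - 1, np.nan)
    for k in range(len(lag_edges) - 1):
        m = (dt >= lag_edges[k]) & (dt < lag_edges[k + 1])
        if m.any():
            dcf[k] = udcf[m].mean()                  # average within the lag bin
    return dcf

# Toy usage: two symptom series rated on different, irregular days.
rng = np.random.default_rng(0)
t1, t2 = np.sort(rng.uniform(0, 100, 60)), np.sort(rng.uniform(0, 100, 60))
x, y = rng.normal(size=60), rng.normal(size=60)
print(discrete_correlation(t1, x, t2, y, np.arange(-20, 21, 5)))
```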
Abstract:
We provide a compilation of downward fluxes (total mass, POC, PON, BSiO2, CaCO3, PIC and lithogenic/terrigenous fluxes) from over 6000 sediment trap measurements distributed in the Atlantic Ocean, from 30°N to 49°S, covering the period 1982-2011. Data from the Mediterranean Sea are also included. Data were compiled from different sources: data repositories (BCO-DMO, PANGAEA), time-series sites (BATS, CARIACO), published scientific papers and/or personal communications from PIs. All sources are specified in the data set. Data from the World Ocean Atlas 2009 were extracted to provide each flux observation with contextual environmental data, such as temperature, salinity, oxygen (concentration, AOU and percentage saturation), nitrate, phosphate and silicate.
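As a rough illustration of how each trap observation could be given contextual climatology values, the sketch below colocates observations with the nearest World Ocean Atlas grid cell using xarray. The file names, column names and the WOA variable name are assumptions for illustration, not the compilers' actual workflow.

```python
import pandas as pd
import xarray as xr

# Hypothetical inputs: a table of trap flux observations and a WOA09 NetCDF file.
fluxes = pd.read_csv("atlantic_trap_fluxes.csv")    # assumed columns: lat, lon, depth, ...
temperature = xr.open_dataset("woa09_temperature_annual.nc")["t_an"].squeeze(drop=True)

# Nearest-neighbour colocation of every observation with the climatology grid.
points = {dim: xr.DataArray(fluxes[dim].values, dims="obs") for dim in ("lat", "lon", "depth")}
fluxes["woa_temperature"] = temperature.sel(**points, method="nearest").values
```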
Abstract:
This data set was obtained during the R/V POLARSTERN cruise ANT-XXVIII/3. Current velocities were measured nearly continuously, when outside territorial waters, along the ship's track with a vessel-mounted Teledyne RD Instruments 153.6-kHz Ocean Surveyor ADCP. The transducers were located 11 m below the water line and were protected against ice floes by an acoustically transparent plastic window. The current measurements were made with a sampling interval of 2 s and a vertical bin length of 4 m. The ship's velocity was calculated from position fixes obtained by the Global Positioning System (GPS). Heading, roll and pitch data from the ship's gyro platforms and the navigation data were used to convert the ADCP velocities into earth coordinates. The accuracy of the ADCP velocities depends mainly on the quality of the position fixes and the ship's heading data. Further errors stem from a misalignment of the transducer with the ship's centerline. The ADCP data were processed using the Ocean Surveyor Sputnik Interpreter (OSSI) software developed by GEOMAR Helmholtz-Zentrum für Ozeanforschung Kiel. The averaging interval was set to 120 s. The reference layer was set to bins 5 to 16, avoiding near-surface effects and biases near bin 1. Sampling interval setting: 2 s; number of bins: 80; bin length: 4 m; pulse length: 4 m; blank beyond transmit length: 4 m. Data processing settings: top reference bin: 5; bottom reference bin: 16; average: 120 s; misalignment amplitude: 1.0276 +/- 0.1611, phase: 0.8100 +/- 0.7190. The precision for a single ping and 4 m cell size reported by TRDI is 0.30 m/s. Given the single-ping precision and the number of pings (most of the time 36) during 120 s, the velocity accuracy is nearly 0.05 m/s (velocity accuracy = single-ping precision divided by the square root of the number of pings).
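The velocity accuracy quoted at the end follows directly from the single-ping precision and the ping count; a quick check:

```python
import math

single_ping_precision = 0.30          # m/s, TRDI figure for 4 m cells
n_pings = 36                          # typical number of pings per 120 s ensemble

velocity_accuracy = single_ping_precision / math.sqrt(n_pings)
print(f"{velocity_accuracy:.3f} m/s") # 0.050 m/s, matching the ~0.05 m/s quoted above
```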
Abstract:
Lake sturgeon (Acipenser fulvescens) were historically abundant in the Huron-Erie Corridor (HEC), a 160 km river/channel network composed of the St. Clair River, Lake St. Clair, and the Detroit River that connects Lake Huron to Lake Erie. In the HEC, most natural lake sturgeon spawning substrates have been eliminated or degraded as a result of channelization and dredging. To address significant habitat loss in the HEC, multi-agency restoration efforts are underway to restore spawning substrate by constructing artificial spawning reefs. The main objective of this study was to conduct post-construction monitoring of lake sturgeon egg deposition and larval emergence near two of these artificial reef projects: Fighting Island Reef in the Detroit River and Middle Channel Spawning Reef in the lower St. Clair River. We also investigated seasonal and nightly timing of larval emergence, growth, and vertical distribution in the water column at these sites, and at an additional site in the St. Clair River where lake sturgeon are known to spawn on a bed of ~100-year-old coal clinkers. From 2010 to 2012, we collected viable eggs and larvae at all three sites, indicating that these artificial reefs are creating conditions suitable for egg deposition, fertilization, incubation, and larval emergence. The construction methods and materials, and the physical site conditions present in HEC artificial reef projects, can be used to inform future spawning habitat restoration or enhancement efforts. The results from this study have also identified the likelihood of additional uncharacterized natural spawning sites in the St. Clair River. In addition to the field study, we conducted a laboratory experiment involving actual substrate materials that have been used in artificial reef construction in this system. Although coal clinkers are chemically inert, some trace elements can be reincorporated into the clinker material during the combustion process. Since lake sturgeon eggs and larvae develop in close proximity to this material, it is important to measure the concentration of potentially toxic trace elements. This study focused on arsenic, which occurs naturally in coal and can be toxic to fishes. Total arsenic concentration was measured in samples taken from four substrate treatments submerged in distilled water: limestone cobble, rinsed limestone cobble, coal clinker, and rinsed coal clinker. Samples were taken at three time intervals: 24 hours, 11 days, and 21 days. ICP-MS analysis showed that concentrations of total arsenic were below the EPA drinking water standard (10 ppb) for all samples. However, at the 24 hour sampling interval, a two-way repeated measures ANOVA with a Holm-Sidak post hoc analysis (α = 0.05) showed that the mean arsenic concentration was significantly higher in the coal clinker substrate treatment than in the rinsed coal clinker treatment (p = 0.006), the limestone cobble treatment (p
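The comparison at the 24 hour interval was made within a two-way repeated measures ANOVA framework; as a loose, hypothetical illustration only (synthetic concentrations, plain Welch t-tests rather than the ANOVA post hoc procedure, Holm-Sidak adjustment via statsmodels), pairwise treatment comparisons could look like this:

```python
import numpy as np
from itertools import combinations
from scipy.stats import ttest_ind
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(5)
# Synthetic total-arsenic concentrations (ppb) at the 24 h interval, 5 replicates each.
conc = {
    "coal clinker":        rng.normal(4.0, 0.5, 5),
    "rinsed coal clinker": rng.normal(2.5, 0.5, 5),
    "limestone cobble":    rng.normal(1.0, 0.3, 5),
    "rinsed limestone":    rng.normal(0.9, 0.3, 5),
}

pairs = list(combinations(conc, 2))
raw_p = [ttest_ind(conc[a], conc[b], equal_var=False).pvalue for a, b in pairs]
reject, adj_p, _, _ = multipletests(raw_p, alpha=0.05, method="holm-sidak")
for (a, b), p, r in zip(pairs, adj_p, reject):
    print(f"{a} vs {b}: adjusted p = {p:.3f}, significant = {r}")
```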
Abstract:
The general assumption under which the X̄ chart is designed is that the process mean has a constant in-control value. However, there are situations in which the process mean wanders. When it wanders according to a first-order autoregressive (AR(1)) model, a complex approach involving Markov chains and integral equation methods is used to evaluate the properties of the X̄ chart. In this paper, we propose the use of a pure Markov chain approach to study the performance of the X̄ chart. The performance of the X̄ chart with variable parameters and the X̄ chart with double sampling are compared. (C) 2011 Elsevier B.V. All rights reserved.
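As a rough sketch of what a pure Markov chain evaluation can look like when the mean wanders as an AR(1) process, the code below discretizes the mean into states and computes the in-control ARL of a two-sided X̄ chart in the Brook-Evans style. The discretization, parameter values and chart limits are illustrative assumptions, not the formulation of the paper.

```python
import numpy as np
from scipy.stats import norm

def arl_xbar_ar1(phi, sigma_mu, sigma, n, L, m=101, span=4.0):
    """Approximate in-control ARL of an X-bar chart when the process mean follows
    mu_t = phi * mu_{t-1} + eps_t (stationary sd sigma_mu), by discretizing the
    mean into m states and treating the no-signal states as a Markov chain."""
    sigma_eps = sigma_mu * np.sqrt(1.0 - phi**2)       # innovation sd
    sigma_xbar = sigma / np.sqrt(n)                    # sd of the sample mean
    edges = np.linspace(-span * sigma_mu, span * sigma_mu, m + 1)
    mid = 0.5 * (edges[:-1] + edges[1:])               # state midpoints

    # Transition probabilities of the discretized AR(1) mean.
    P = (norm.cdf((edges[None, 1:] - phi * mid[:, None]) / sigma_eps)
         - norm.cdf((edges[None, :-1] - phi * mid[:, None]) / sigma_eps))

    # Probability of NO signal when the mean sits at each state midpoint.
    ucl, lcl = L * sigma_xbar, -L * sigma_xbar
    p_in = norm.cdf((ucl - mid) / sigma_xbar) - norm.cdf((lcl - mid) / sigma_xbar)

    Q = P * p_in[None, :]                              # transient (no-signal) kernel
    arl = np.linalg.solve(np.eye(m) - Q, np.ones(m))   # ARL started from each state

    # Start from the stationary distribution of the discretized mean.
    pi0 = norm.cdf(edges[1:] / sigma_mu) - norm.cdf(edges[:-1] / sigma_mu)
    return float(pi0 @ arl / pi0.sum())

print(arl_xbar_ar1(phi=0.5, sigma_mu=0.5, sigma=1.0, n=5, L=3.0))
```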
Abstract:
We show how to construct a topological Markov map of the interval whose invariant probability measure is the stationary law of a given stochastic chain of infinite order. In particular we characterize the maps corresponding to stochastic chains with memory of variable length. The problem treated here is the converse of the classical construction of the Gibbs formalism for Markov expanding maps of the interval.
Abstract:
In medical follow-up studies, ordered bivariate survival data are frequently encountered when bivariate failure events are used as the outcomes to identify the progression of a disease. In cancer studies, interest could be focused on bivariate failure times, for example, time from birth to cancer onset and time from cancer onset to death. This paper considers a sampling scheme where the first failure event (cancer onset) is identified within a calendar time interval, the time of the initiating event (birth) can be retrospectively confirmed, and the occurrence of the second event (death) is observed subject to right censoring. To analyze this type of bivariate failure time data, it is important to recognize the presence of bias arising due to interval sampling. In this paper, nonparametric and semiparametric methods are developed to analyze the bivariate survival data with interval sampling under stationary and semi-stationary conditions. Numerical studies demonstrate that the proposed estimation approaches perform well with practical sample sizes in different simulated models. We apply the proposed methods to SEER ovarian cancer registry data to illustrate the methods and theory.
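To make the sampling scheme concrete, here is a toy simulation (illustrative distributions and calendar window, not the paper's simulated models) of subjects recruited because their first event falls in a calendar interval, with the second event right-censored:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 200_000                                   # candidate subjects

birth = rng.uniform(1900.0, 1990.0, N)        # calendar time of the initiating event
t1 = 60.0 * rng.weibull(2.0, N)               # time from birth to cancer onset
t2 = rng.exponential(8.0, N)                  # time from onset to death
onset = birth + t1

# Interval sampling: recruit only subjects whose FIRST event occurs in the window.
window = (1995.0, 2005.0)
sampled = (onset >= window[0]) & (onset < window[1])

# The second event is observed subject to right censoring at the end of follow-up.
end_of_followup = 2010.0
obs_t2 = np.minimum(t2, end_of_followup - onset)[sampled]
uncensored = (onset + t2 <= end_of_followup)[sampled]

# Averaging only the uncensored survivors understates t2 (true mean 8.0 here),
# illustrating the kind of bias the proposed estimators are designed to handle.
print("sampled fraction:", sampled.mean())
print("naive mean of t2 among uncensored subjects:", obs_t2[uncensored].mean())
```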
Abstract:
We propose a nonparametric variance estimator when ranked set sampling (RSS) and judgment post-stratification (JPS) are applied by measuring a concomitant variable. Our proposed estimator is obtained by conditioning on observed concomitant values and using nonparametric kernel regression.
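For context, the sketch below shows plain ranked set sampling in which the ranking step uses a cheap concomitant variable; the data-generating model and set size are illustrative assumptions, and the kernel-regression variance estimator itself is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(2)

def draw_set(k):
    x = rng.normal(0.0, 1.0, k)            # concomitant: cheap to observe on every unit
    y = 2.0 * x + rng.normal(0.0, 1.0, k)  # response: expensive, measured on one unit per set
    return x, y

def ranked_set_sample(k, cycles):
    """RSS with set size k: in each set of k units, rank by the concomitant X
    and fully measure only the unit holding the designated rank."""
    xs, ys, ranks = [], [], []
    for _ in range(cycles):
        for r in range(k):                 # one set per rank in every cycle
            x, y = draw_set(k)
            order = np.argsort(x)          # judgment ranking via the concomitant
            xs.append(x[order[r]])
            ys.append(y[order[r]])
            ranks.append(r + 1)
    return np.array(xs), np.array(ys), np.array(ranks)

x_rss, y_rss, ranks = ranked_set_sample(k=3, cycles=100)
# A naive estimate that ignores the design; the paper's estimator instead
# conditions on the observed concomitant values via kernel regression.
print(np.var(y_rss, ddof=1))
```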
Abstract:
Monte Carlo techniques, which require the generation of samples from some target density, are often the only alternative for performing Bayesian inference. Two classic sampling techniques for drawing independent samples are the ratio of uniforms (RoU) and rejection sampling (RS). An efficient sampling algorithm is proposed that combines the RoU and polar RS (i.e. RS inside a sector of a circle using polar coordinates). Its efficiency is shown by drawing samples from truncated Cauchy and Gaussian random variables, which have many important applications in signal processing and communications. SUMMARY: An efficient method for generating some random variables commonly used in signal processing and communications (for example, truncated Gaussian or Cauchy variables) by combining two techniques: ratio of uniforms and rejection sampling.
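The ratio-of-uniforms ingredient is easy to illustrate on its own; below is a minimal plain RoU sampler for a standard Gaussian (the combined RoU/polar rejection construction of the paper is not reproduced):

```python
import numpy as np

def rou_standard_normal(n, rng=None):
    """Plain ratio-of-uniforms sampler for N(0, 1): draw (u, v) uniformly on the
    box 0 < u <= 1, |v| <= sqrt(2/e) and accept when v**2 <= -4 * u**2 * log(u);
    then x = v / u is standard normal."""
    rng = rng or np.random.default_rng()
    vmax = np.sqrt(2.0 / np.e)
    out = np.empty(0)
    while out.size < n:
        u = rng.random(2 * n)
        v = rng.uniform(-vmax, vmax, 2 * n)
        with np.errstate(divide="ignore", invalid="ignore"):
            accept = v**2 <= -4.0 * u**2 * np.log(u)
        out = np.concatenate([out, v[accept] / u[accept]])
    return out[:n]

x = rou_standard_normal(10_000)
print(x.mean(), x.std())   # close to 0 and 1
```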
Abstract:
Negative-ion mode electrospray ionization, ESI(-), with Fourier transform ion cyclotron resonance mass spectrometry (FT-ICR MS) was coupled to Partial Least Squares (PLS) regression and variable selection methods to estimate the total acid number (TAN) of Brazilian crude oil samples. Generally, ESI(-)-FT-ICR mass spectra present a resolving power of ca. 500,000 and a mass accuracy of less than 1 ppm, producing a data matrix containing over 5700 variables per sample. These variables correspond to heteroatom-containing species detected as deprotonated molecules, [M − H]⁻ ions, which are identified primarily as naphthenic acids, phenols and carbazole analog species. The TAN values for all samples ranged from 0.06 to 3.61 mg KOH g⁻¹. To facilitate the spectral interpretation, three methods of variable selection were studied: variable importance in the projection (VIP), interval partial least squares (iPLS) and elimination of uninformative variables (UVE). The UVE method seems to be the most appropriate for selecting important variables, reducing the number of variables to 183 and producing a root mean square error of prediction of 0.32 mg KOH g⁻¹. By reducing the size of the data, it was possible to relate the selected variables to their corresponding molecular formulas, thus identifying the main chemical species responsible for the TAN values.
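Of the three selection criteria, VIP is the simplest to illustrate; the sketch below fits a PLS model with scikit-learn on placeholder data standing in for the peak-intensity matrix and computes VIP scores. Data, component count and the VIP > 1 threshold are illustrative assumptions; iPLS and UVE are not reproduced.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

def vip_scores(pls):
    """Variable Importance in Projection (VIP) for a fitted PLSRegression model."""
    T = pls.x_scores_                    # (n_samples, n_components)
    W = pls.x_weights_                   # (n_features, n_components)
    Q = pls.y_loadings_                  # (n_targets, n_components)
    p = W.shape[0]
    ssy = np.sum(T**2, axis=0) * np.sum(Q**2, axis=0)       # y-variance per component
    wnorm2 = (W / np.linalg.norm(W, axis=0))**2
    return np.sqrt(p * (wnorm2 @ ssy) / ssy.sum())

# Placeholder data standing in for the FT-ICR MS peak matrix (samples x assigned peaks).
rng = np.random.default_rng(3)
X = rng.lognormal(size=(40, 500))                            # hypothetical peak intensities
tan = X[:, :10].sum(axis=1) * 0.01 + rng.normal(0, 0.05, 40)  # hypothetical TAN values

pls = PLSRegression(n_components=5).fit(X, tan)
vip = vip_scores(pls)
selected = np.where(vip > 1.0)[0]        # the usual VIP > 1 rule of thumb
print(len(selected), "variables retained")
```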
Abstract:
Some factors complicate comparisons between linkage maps from different studies. This problem can be resolved if measures of precision, such as confidence intervals and frequency distributions, are associated with markers. We examined the precision of distances and ordering of microsatellite markers in the consensus linkage maps of chromosomes 1, 3 and 4 from two F2 reciprocal Brazilian chicken populations, using bootstrap sampling. Single and consensus maps were constructed. The consensus map was compared with the International Consensus Linkage Map and with the whole genome sequence. Some loci showed segregation distortion and missing data, but this did not affect the analyses negatively. Several inversions and position shifts were detected, based on 95% confidence intervals and frequency distributions of loci. Some discrepancies in distances between loci and in ordering were due to chance, whereas others could be attributed to other effects, including reciprocal crosses, sampling error of the founder animals from the two populations, F2 population structure, number of and distance between microsatellite markers, number of informative meioses, loci segregation patterns, and sex. In the Brazilian consensus GGA1, locus LEI1038 was in a position closer to the true genome sequence than in the International Consensus Map, whereas for GGA3 and GGA4, no such differences were found. Extending these analyses to the remaining chromosomes should facilitate comparisons and the integration of several available genetic maps, allowing meta-analyses for map construction and quantitative trait loci (QTL) mapping. The precision of the estimates of QTL positions and their effects would be increased with such information.
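As a generic illustration of attaching bootstrap confidence intervals to a map statistic, the sketch below computes a percentile bootstrap interval for a recombination fraction; the statistic and data are placeholders, not the study's linkage analysis.

```python
import numpy as np

def bootstrap_ci(data, statistic, n_boot=2000, alpha=0.05, rng=None):
    """Percentile bootstrap confidence interval for an arbitrary statistic,
    resampling individuals with replacement."""
    rng = rng or np.random.default_rng()
    n = len(data)
    stats = np.array([
        statistic(data[rng.integers(0, n, n)]) for _ in range(n_boot)
    ])
    return np.quantile(stats, [alpha / 2, 1 - alpha / 2])

# Placeholder: recombination indicators between two markers scored in F2 offspring.
rng = np.random.default_rng(4)
recomb = rng.binomial(1, 0.12, size=350).astype(float)   # 1 = recombinant gamete
low, high = bootstrap_ci(recomb, np.mean, rng=rng)
print(f"95% CI for the recombination fraction: [{low:.3f}, {high:.3f}]")
```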
Abstract:
In this study, we used fluorescence spectroscopy to determine the post-mortem interval. Conventional methods in forensic medicine involve tissue or body fluid sampling and laboratory tests, which are often time-demanding and may depend on expensive analyses. The presented method uses time-dependent variations in the fluorescence spectrum and their correlation with the time elapsed after the cessation of regular metabolic activity. This new approach addresses unmet needs for post-mortem interval determination in forensic medicine by providing rapid, in situ measurements with improved time resolution relative to existing methods. (C) 2009 Optical Society of America