976 results for Sampling time
Abstract:
A disruption predictor based on support vector machines (SVM) has been developed for use in JET. The training process uses thousands of discharges, and high-performance computing has therefore been necessary to obtain the models. In this respect, several models have been generated with data from different JET campaigns. In addition, various kernels (mainly linear and RBF) and parameters have been tested. The main objective of this work has been the implementation of the predictor model under real-time constraints. A “C-code” software application has been developed to simulate the real-time behavior of the predictor. The application reads the signals from the JET database and simulates the real-time data processing, in particular the specific data-hold method to be applied when reading data from the JET ATM real-time network. The simulator is fully configurable by means of text files to select models, signal thresholds, sampling rates, etc. Results with data from campaigns C23 to C28 are shown.
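A minimal sketch of how models of this kind could be trained with scikit-learn, assuming one feature vector of diagnostic signals per discharge window; the feature count, labels, and parameter grid are placeholders rather than the actual JET configuration:

```python
# Hypothetical sketch: training SVM disruption predictors with linear and RBF
# kernels. Features, labels, and the grid are illustrative placeholders.
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 13))       # one row per discharge window, 13 diagnostic features
y = rng.integers(0, 2, size=1000)     # 1 = disruptive, 0 = non-disruptive (placeholder labels)

# Try both kernels mentioned in the abstract; the parameter grid is a guess.
search = GridSearchCV(
    make_pipeline(StandardScaler(), SVC()),
    param_grid={"svc__kernel": ["linear", "rbf"],
                "svc__C": [0.1, 1.0, 10.0],
                "svc__gamma": ["scale", 0.01]},
    cv=5, n_jobs=-1,                  # parallel folds stand in for the HPC training
)
search.fit(X, y)
print(search.best_params_, search.best_score_)
```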
Abstract:
In this study, a method for vehicle tracking through video analysis based on Markov chain Monte Carlo (MCMC) particle filtering with Metropolis sampling is proposed. The method handles multiple targets with low computational requirements and is therefore ideally suited for advanced driver-assistance systems that involve real-time operation. The method exploits the removed-perspective domain given by inverse perspective mapping (IPM) to define a fast and efficient likelihood model. Additionally, the method encompasses an interaction model using Markov random fields (MRF) that allows treatment of dependencies between the motions of targets. The proposed method is tested on highway sequences and compared to state-of-the-art methods for vehicle tracking, i.e., independent target tracking with Kalman filtering (KF) and joint tracking with particle filtering. The results show fewer tracking failures with the proposed method.
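The following is a hedged sketch of a single MCMC particle-filter update with Metropolis sampling for multiple 1-D targets; the Gaussian motion and measurement models and the MRF repulsion constants are invented for illustration and are not the paper's:

```python
# Minimal sketch of one MCMC (Metropolis) particle-filter update for multiple
# 1-D targets: the posterior combines a Gaussian motion prior, a Gaussian
# measurement likelihood, and an MRF pairwise term penalizing overlapping
# targets. All models and constants are illustrative.
import numpy as np

rng = np.random.default_rng(1)

def log_posterior(state, prev_state, meas, q=1.0, sigma=1.0, repulsion=5.0, radius=2.0):
    lp = -0.5 * np.sum((state - prev_state) ** 2) / q**2   # motion prior
    lp += -0.5 * np.sum((state - meas) ** 2) / sigma**2    # measurement likelihood
    d = np.abs(state[:, None] - state[None, :])            # MRF interaction term
    close = (d < radius) & ~np.eye(len(state), dtype=bool)
    return lp - repulsion * close.sum() / 2

def mcmc_pf_step(prev_state, meas, n_iter=2000, step=0.3):
    state = prev_state.copy()
    lp = log_posterior(state, prev_state, meas)
    samples = []
    for _ in range(n_iter):
        cand = state.copy()
        cand[rng.integers(len(state))] += rng.normal(scale=step)  # move one target
        lp_cand = log_posterior(cand, prev_state, meas)
        if np.log(rng.random()) < lp_cand - lp:                   # Metropolis rule
            state, lp = cand, lp_cand
        samples.append(state.copy())
    return np.mean(samples[n_iter // 2:], axis=0)                 # burn-in discarded

print(mcmc_pf_step(np.array([0.0, 3.0, 8.0]), np.array([0.5, 3.5, 7.5])))
```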
Abstract:
Protein folding occurs on a time scale ranging from milliseconds to minutes for a majority of proteins. Computer simulation of protein folding, from a random configuration to the native structure, is nontrivial owing to the large disparity between the simulation and folding time scales. As an effort to overcome this limitation, simple models with idealized protein subdomains, e.g., the diffusion–collision model of Karplus and Weaver, have gained some popularity. We present here new results for the folding of a four-helix bundle within the framework of the diffusion–collision model. Even with such simplifying assumptions, a direct application of standard Brownian dynamics methods would consume 10,000 processor-years on current supercomputers. We circumvent this difficulty by invoking a special Brownian dynamics simulation. The method features the calculation of the mean passage time of an event from the flux overpopulation method and the sampling of events that lead to productive collisions even if their probability is extremely small (because of large free-energy barriers that separate them from the higher probability events). Using these developments, we demonstrate that a coarse-grained model of the four-helix bundle can be simulated in several days on current supercomputers. Furthermore, such simulations yield folding times that are in the range of time scales observed in experiments.
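For scale, the brute-force baseline the abstract argues against can be written in a few lines: plain Brownian dynamics estimation of a mean first passage time for a 1-D diffuser, checked against the analytic result L**2/(2D). The parameters are arbitrary, and the flux overpopulation and rare-event machinery of the paper are not reproduced here:

```python
# Brute-force Brownian dynamics estimate of a mean first passage time (MFPT)
# between a reflecting wall at 0 and an absorbing boundary at L, compared with
# the analytic value L**2 / (2*D). Parameters are arbitrary.
import numpy as np

rng = np.random.default_rng(2)
D, dt, L = 1.0, 1e-4, 1.0
sigma = np.sqrt(2 * D * dt)        # step size of the Brownian increments

def first_passage_time():
    x, t = 0.0, 0.0
    while x < L:
        x += sigma * rng.normal()
        x = abs(x)                 # reflecting boundary at 0
        t += dt
    return t

times = [first_passage_time() for _ in range(200)]
print(f"BD estimate: {np.mean(times):.3f}, analytic L^2/2D: {L**2 / (2 * D):.3f}")
```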
Abstract:
The generation time of HIV Type 1 (HIV-1) in vivo has previously been estimated using a mathematical model of viral dynamics and was found to be on the order of one to two days per generation. Here, we describe a new method based on coalescence theory that allows generation times to be estimated from nucleotide sequence data and a reconstructed genealogy of sequences obtained over time. The method is applied to sequences obtained from a long-term nonprogressing individual on five sampling occasions. The estimate of viral generation time using the coalescent method is 1.2 days per generation, close to that obtained by mathematical modeling (1.8 days per generation), thus strengthening confidence in estimates of a short viral generation time. Apart from the estimation of relevant parameters relating to viral dynamics, coalescent modeling also allows us to simulate the evolutionary behavior of samples of sequences obtained over time.
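A toy illustration of the logic behind serial-sample generation-time estimates: given an assumed per-generation mutation rate and a per-time substitution rate inferred from the dated genealogy, the generation time is their ratio. The numbers below are invented so that the ratio lands near the reported scale:

```python
# Toy arithmetic behind serial-sample coalescent estimates of generation time:
# tau = (mutations per site per generation) / (substitutions per site per day).
# Both rates below are invented for illustration, not the study's values.
mu = 3e-5    # mutations per site per generation (assumed)
r = 2.5e-5   # substitutions per site per day, inferred from dated sequences (assumed)
tau = mu / r
print(f"generation time ~ {tau:.1f} days")   # -> 1.2 days on these toy inputs
```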
Abstract:
Global diversity curves reflect more than just the number of taxa that have existed through time: they also mirror variation in the nature of the fossil record and the way the record is reported. These sampling effects are best quantified by assembling and analyzing large numbers of locality-specific biotic inventories. Here, we introduce a new database of this kind for the Phanerozoic fossil record of marine invertebrates. We apply four substantially distinct analytical methods that estimate taxonomic diversity by quantifying and correcting for variation through time in the number and nature of inventories. Variation introduced by the use of two dramatically different counting protocols is also explored. We present sampling-standardized diversity estimates for two long intervals that sum to 300 Myr (Middle Ordovician-Carboniferous; Late Jurassic-Paleogene). Our new curves differ considerably from traditional, synoptic curves. For example, some of them imply unexpectedly low Late Cretaceous and early Tertiary diversity levels. However, such factors as the database's current emphasis on North America and Europe still obscure our view of the global history of marine biodiversity. These limitations will be addressed as the database and methods are refined.
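One common sampling-standardization idea can be sketched as follows: draw the same number of locality inventories from each interval and count the distinct taxa recovered, averaging over repeated draws. The data structures and quota below are invented for illustration and do not reproduce the four methods of the paper:

```python
# Sketch of list-based sampling standardization: equal numbers of locality
# inventories per interval, averaged over repeated random draws.
import random

def standardized_richness(inventories, quota=50, trials=100, seed=3):
    rng = random.Random(seed)
    totals = []
    for _ in range(trials):
        drawn = rng.sample(inventories, quota)    # equal sampling effort per interval
        totals.append(len(set().union(*drawn)))   # distinct taxa recovered
    return sum(totals) / trials

def toy_inventory(rng, n=10, pool=200):
    # One locality inventory: a set of taxon names occurring there.
    return {f"taxon_{rng.randrange(pool)}" for _ in range(n)}

rng = random.Random(3)
interval_a = [toy_inventory(rng) for _ in range(300)]            # well-sampled interval
interval_b = [toy_inventory(rng, pool=150) for _ in range(80)]   # poorly sampled interval
print(standardized_richness(interval_a), standardized_richness(interval_b))
```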
Abstract:
As additivity is a very useful property for a distance measure, a general additive distance is proposed under the stationary time-reversible (SR) model of nucleotide substitution or, more generally, under the stationary, time-reversible, and rate-variable (SRV) model, which allows rate variation among nucleotide sites. A method for estimating the mean distance and the sampling variance is developed. In addition, a method is developed for estimating the variance-covariance matrix of distances, which is useful for statistical tests of phylogenies and molecular clocks. Computer simulation shows that (i) if the sequences are longer than, say, 1000 bp, the SR method is preferable to simpler methods; (ii) the SR method is robust against deviations from time-reversibility; (iii) when the rate varies among sites, the SRV method is much better than the SR method because the distance is seriously underestimated by the SR method; and (iv) our method for estimating the sampling variance is accurate for sequences longer than 500 bp. Finally, a test is constructed for examining whether DNA evolution follows a general Markovian model.
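As a hedged sketch in the spirit of the SR distance, a general time-reversible distance of the form d = -tr(Pi logm(Pi^-1 F)) can be computed from a divergence matrix; this follows the standard construction, not necessarily the paper's exact estimator or its variance formulas:

```python
# Sketch of an additive distance under a general time-reversible model,
# computed from the divergence matrix of two aligned sequences.
import numpy as np
from scipy.linalg import logm

def sr_distance(F):
    """F[i, j]: fraction of sites with base i in seq 1 and base j in seq 2."""
    F = (F + F.T) / 2                 # time-reversibility: symmetrize
    pi = F.sum(axis=1)                # equilibrium base frequencies
    P = F / pi[:, None]               # substitution probability matrix Pi^-1 F
    return -np.trace(np.diag(pi) @ logm(P)).real

# Toy divergence matrix (rows/columns: A, C, G, T), entries sum to 1.
F = np.array([[0.22, 0.02, 0.03, 0.01],
              [0.02, 0.21, 0.01, 0.02],
              [0.03, 0.01, 0.20, 0.02],
              [0.01, 0.02, 0.02, 0.15]])
print(f"estimated substitutions per site: {sr_distance(F):.3f}")
```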
Abstract:
Correlations in low-frequency atomic displacements predicted by molecular dynamics simulations on the order of 1 ns are undersampled for the time scales currently accessible by the technique. This is shown with three different representations of the fluctuations in a macromolecule: the reciprocal space of crystallography using diffuse x-ray scattering data, real three-dimensional Cartesian space using covariance matrices of the atomic displacements, and the 3N-dimensional configuration space of the protein using dimensionally reduced projections to visualize the extent to which phase space is sampled.
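The third representation can be sketched directly: build the 3N x 3N covariance matrix of atomic displacements from a trajectory and project onto its leading eigenvectors to visualize how much of configuration space is visited. The synthetic random-walk trajectory below merely stands in for real MD output:

```python
# Covariance analysis of atomic fluctuations and projection onto the two
# largest modes; the toy diffusive trajectory stands in for MD output.
import numpy as np

rng = np.random.default_rng(4)
n_frames, n_atoms = 2000, 50
traj = rng.normal(size=(n_frames, 3 * n_atoms)).cumsum(axis=0) * 0.01

X = traj - traj.mean(axis=0)          # displacements about the mean structure
C = X.T @ X / (n_frames - 1)          # 3N x 3N covariance matrix
evals, evecs = np.linalg.eigh(C)      # ascending eigenvalues
proj = X @ evecs[:, -2:]              # projection onto the two largest modes
print("variance captured by top 2 modes:", evals[-2:].sum() / evals.sum())
print("projection shape:", proj.shape)  # (n_frames, 2) points to plot
```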
Abstract:
This paper deals with the estimation of a time-invariant channel spectrum from its own nonuniform samples, assuming there is a bound on the channel’s delay spread. Except for this last assumption, this is the basic estimation problem in systems providing channel spectral samples. However, as shown in the paper, the delay-spread bound leads us to view the spectrum as a band-limited signal, rather than as the Fourier transform of a tapped delay line (TDL). Using this alternative model, a linear estimator is presented that approximately minimizes the expected root-mean-square (RMS) error for a deterministic channel. Its main advantage over the TDL-based approach is that it takes into account the spectrum’s smoothness (time width), thus providing a performance improvement. The proposed estimator is compared numerically with the maximum likelihood (ML) estimator based on a TDL model in pilot-assisted channel estimation (PACE) for OFDM.
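A generic linear estimator in this spirit can be sketched as follows: under a flat delay profile confined to [0, tau_max], the spectrum's correlation over a frequency lag df is exp(-j*pi*df*tau_max)*sinc(df*tau_max), and a Wiener-type estimate follows from the pilot correlations. This is a standard construction under those assumptions, not the paper's estimator:

```python
# Wiener-type spectrum estimate from nonuniform pilot samples, exploiting a
# delay-spread bound tau_max. The channel, pilot layout, and noise level are
# toy values.
import numpy as np

rng = np.random.default_rng(5)
tau_max, sigma2 = 1e-6, 0.01                 # delay-spread bound (s), noise variance

def corr(f1, f2):
    df = f1[:, None] - f2[None, :]
    return np.exp(-1j * np.pi * df * tau_max) * np.sinc(df * tau_max)

f_pilot = np.sort(rng.uniform(0, 5e6, 40))   # nonuniform pilot frequencies (Hz)
f_grid = np.linspace(0, 5e6, 200)            # frequencies where we want the spectrum

taus = np.array([0.1e-6, 0.4e-6, 0.9e-6])    # toy channel: three paths within the bound
gains = np.array([1.0, 0.5, 0.3])

def H(f):
    return (gains * np.exp(-2j * np.pi * f[:, None] * taus)).sum(axis=1)

y = H(f_pilot) + np.sqrt(sigma2 / 2) * (rng.normal(size=40) + 1j * rng.normal(size=40))
R = corr(f_pilot, f_pilot) + sigma2 * np.eye(40)
H_hat = corr(f_grid, f_pilot) @ np.linalg.solve(R, y)
print("RMS error:", np.sqrt(np.mean(np.abs(H_hat - H(f_grid)) ** 2)))
```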
Abstract:
Since the early days of 3D computer vision, it has been necessary to use techniques that reduce the data to a tractable size while preserving the important aspects of the scene. Currently, with the new low-cost RGB-D sensors, which provide a stream of color and 3D data at approximately 30 frames per second, this is becoming even more relevant. Many applications make use of these sensors and need a preprocessing step to downsample the data, in order to either reduce the processing time or improve the data (e.g., reducing noise or enhancing the important features). In this paper, we present a comparison of downsampling techniques based on different principles. Concretely, five downsampling methods are included: a bilinear-based method, a normal-based method, a color-based method, a combination of the normal- and color-based samplings, and a growing neural gas (GNG)-based approach. For the comparison, two different models acquired with the Blensor software have been used. Moreover, to evaluate the effect of the downsampling in a real application, a 3D non-rigid registration is performed with the sampled data. From the experimentation we can conclude that, depending on the purpose of the application, some kernels of the sampling methods can drastically improve the results. Bilinear- and GNG-based methods provide homogeneous point clouds, whereas color-based and normal-based methods provide datasets with a higher density of points in areas with specific features. In the non-rigid application, if a color-based sampled point cloud is used, it is possible to properly register two datasets in cases where intensity data are relevant in the model, outperforming the results obtained when only a homogeneous sampling is used.
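A hedged sketch contrasting the two families of behavior reported (homogeneous versus feature-driven sampling); the voxel-grid method and the edge-score weighting below are generic stand-ins for the paper's kernels, applied to toy data:

```python
# Homogeneous voxel-grid downsampling versus a feature-weighted sampling that
# keeps more points where a local feature (a toy color edge) changes.
import numpy as np

rng = np.random.default_rng(6)
pts = rng.uniform(0, 1, size=(5000, 3))        # XYZ point cloud
col = (pts[:, 0] > 0.5).astype(float)          # toy color: sharp edge at x = 0.5

def voxel_downsample(points, voxel=0.1):
    keys = np.floor(points / voxel).astype(int)
    _, idx = np.unique(keys, axis=0, return_index=True)
    return points[idx]                         # one representative per occupied voxel

def edge_score(points, feature, voxel=0.1):
    # Contrast of each point's feature against its voxel's mean feature.
    keys = np.floor(points / voxel).astype(int)
    _, inv = np.unique(keys, axis=0, return_inverse=True)
    sums = np.zeros(inv.max() + 1)
    counts = np.zeros(inv.max() + 1)
    np.add.at(sums, inv, feature)
    np.add.at(counts, inv, 1)
    return np.abs(feature - sums[inv] / counts[inv])

def feature_weighted_sample(points, feature, n=500):
    w = edge_score(points, feature) + 1e-6     # denser sampling near feature edges
    idx = rng.choice(len(points), size=n, replace=False, p=w / w.sum())
    return points[idx]

print("voxel:", voxel_downsample(pts).shape)
print("feature-weighted:", feature_weighted_sample(pts, col).shape)
```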
Abstract:
Remineralization of organic matter in reactive marine sediments releases nutrients and dissolved organic matter (DOM) into the ocean. Here we focused on the molecular-level characterization of DOM by high-resolution Fourier transform ion cyclotron resonance mass spectrometry (FT-ICR-MS) in sediment pore waters and bottom waters from contrasting redox regimes in the northern Black Sea, with particular emphasis on nitrogen-bearing compounds, to derive an improved understanding of the molecular transformations involved in nitrogen release. The number of nitrogen-bearing molecules is generally higher in pore waters than in bottom waters, suggesting intensified degradation of nitrogen-bearing precursor molecules such as proteins in anoxic sediments. No significant difference was observed between sediments deposited under oxic vs. anoxic conditions (average O/C ratios of 0.55), suggesting that the different organic matter quality induced by contrasting redox conditions does not impact protein diagenesis in the subseafloor. Compounds in the pore waters were on average larger, less oxygenated, and had a higher number of unsaturations than those in the bottom waters. Applying a mathematical model, we could show that the assemblages of nitrogen-bearing molecular formulas are potential products of proteinaceous material transformed by the following reactions: (a) hydrolysis and deamination, both reducing the molecular size and nitrogen content of the products and intermediates; (b) oxidation and hydration of the intermediates; and (c) methylation and dehydration.
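The formula-transformation logic of such models can be sketched by treating each reaction as a fixed change in (C, H, N, O) counts and testing whether it maps one detected molecular formula onto another; the elemental deltas below are plausible textbook stoichiometries, not the paper's exact model:

```python
# Reaction steps as (dC, dH, dN, dO) formula deltas, applied to a precursor
# formula and matched against a set of detected formulas. Toy data throughout.
REACTIONS = {
    "hydrolysis (peptide bond)": (0, 2, 0, 1),    # +H2O
    "deamination":               (0, -1, -1, 1),  # -NH2 +OH net
    "oxidation":                 (0, 0, 0, 1),    # +O
    "hydration":                 (0, 2, 0, 1),    # +H2O (same delta as hydrolysis)
    "methylation":               (1, 2, 0, 0),    # +CH2
    "dehydration":               (0, -2, 0, -1),  # -H2O
}

# Detected formulas as (C, H, N, O) tuples (invented).
detected = {(10, 15, 1, 4), (10, 17, 1, 5), (10, 14, 0, 5), (11, 17, 1, 4)}

def products(formula):
    c, h, n, o = formula
    for name, (dc, dh, dn, do) in REACTIONS.items():
        cand = (c + dc, h + dh, n + dn, o + do)
        if cand in detected:
            yield name, cand

for name, cand in products((10, 15, 1, 4)):
    print(f"C{cand[0]}H{cand[1]}N{cand[2]}O{cand[3]} reachable via {name}")
```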
Abstract:
We combined the analysis of sediment trap data and satellite-derived sea surface chlorophyll to quantify the amount of organic carbon exported to the deep sea in the upwelling-induced high-production area off northwest Africa. In contrast to the generally global or basin-wide application of export models, we used a regionally fitted empirical model. Furthermore, the application of our model was restricted to a dynamically defined region of high chlorophyll concentration, in order to restrict the model to an environment of more homogeneous export processes. We developed a correlation-based approximation to estimate the surface source area for a sediment trap deployed from 11 June 1998 to 7 November 1999 at 21.25°N, 20.64°W off Cape Blanc. We also developed a regression model relating chlorophyll to the export of organic carbon to the 1000 m depth level. Carbon export was calculated on a daily basis for an area of high chlorophyll concentration (>1 mg/m**3) adjacent to the coast. The resulting zone of high chlorophyll concentration covered 20,000-800,000 km**2 and yielded a yearly export of 1.123 to 2.620 Tg organic carbon. The average organic carbon export within the area of high chlorophyll concentration was 20.6 mg/m**2/d, comparable to the 13.3 mg/m**2/d found in the sediment trap results when normalized to the 1000 m level. We found strong interannual variability in export: the period autumn 1998 to summer 1999 exceeded the mean of the other three comparable periods by a factor of 2.25. We believe that this approach of using regionally fitted models can be successfully transferred to other oceanographic regions by selecting appropriate criteria, such as chlorophyll concentration, for the definition of the area to which the model is applicable.
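A back-of-envelope consistency check of the reported numbers (the effective mean area is an assumption chosen inside the reported 20,000-800,000 km**2 range):

```python
# Consistency check: average flux x assumed effective area x 365 days.
flux = 20.6e-3          # g C / m**2 / day, reported average within the area
area_km2 = 266_000      # assumed effective mean area (within the reported range)
yearly_g = flux * 365 * area_km2 * 1e6      # 1e6 m**2 per km**2
print(f"{yearly_g / 1e12:.2f} Tg C / yr")   # ~2 Tg, inside the reported 1.123-2.620 Tg
```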
Abstract:
In 1990, a benthic component to the DYFAMED (dynamics of fluxes in the Mediterranean) program, the DYFAMED-BENTHOS survey, was established to investigate the possible coupling of benthic to pelagic processes at a permanent station in >2700 m water depth, 52 km off Nice, France. Surface sediment was first sampled at different periods of the year to assess the importance of the biological compartment (particularly the metazoan meiofauna) and its relation to the seasonally varying particulate matter input to the sea floor (estimated by measuring surface sediment particle size and porosity, as well as chloroplastic pigment, organic carbon, nitrogen, and calcium carbonate contents). Beginning in 1993, surface sediment was sampled at an average interval of 1.4 months for over five consecutive years using multicorers. Biogeochemical techniques, such as deployments of a free-vehicle benthic respirometer and a near-bottom sediment trap, along with analyses of sediment vertical profiles of dissolved oxygen, nutrients, and dissolved metals in the porewater, were developed in conjunction with more extensive biological analyses to characterize the recycling of organic matter and, ultimately, increase our understanding of the oceanic carbon cycle. This article provides the scientific background and motivation for the development of the ongoing DYFAMED-BENTHOS survey, the general characteristics of the benthic site, and a detailed description of the sampling design applied from late 1990 to 2000.
Abstract:
Data were collected during various groundfish surveys carried out by IFREMER from October to December between 1997 and 2011 on the eastern continental shelf of the Bay of Biscay and in the Celtic Sea (EVHOE series). The sampling design was stratified according to latitude and depth. A 36/47 GOV trawl was used with a 20 mm mesh codend liner. Haul duration was 30 minutes at a towing speed of 4 knots. Fishing was restricted to daylight hours. Catch weights and catch numbers were recorded for all species, and body sizes were measured. The weights and numbers per haul were transformed into abundances per km**2 using the swept area of a standard haul (0.069 km**2).
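The swept-area standardization reduces to a one-line calculation; the example catch number below is invented:

```python
# Swept-area standardization: counts per haul divided by the standard swept
# area give abundance per km**2.
SWEPT_AREA_KM2 = 0.069   # standard 30-min haul at 4 knots with the 36/47 GOV trawl

def abundance_per_km2(catch_count):
    return catch_count / SWEPT_AREA_KM2

print(abundance_per_km2(42))   # e.g. 42 fish in one haul -> ~609 fish per km**2
```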