965 results for Time-dependent data
Abstract:
Basalts from Hole 534A are among the oldest recovered from the ocean bottom, dating from the opening of the Atlantic 155 Ma. Upon exposure to a 1-Oe field for one week, these basalts acquire a viscous remanent magnetization (VRM), which ranges from 4 to 223% of their natural remanent magnetization (NRM). A magnetic field of similar magnitude is observed in the paleomagnetic lab of the Glomar Challenger, and it is therefore doubtful whether accurate measurements of magnetic moment in such rocks can be made on board unless the paleomagnetic area is magnetically shielded. No correlation is observed between the Koenigsberger ratio (beta), which is usually less than 3, and the ability to acquire a VRM. The VRM shows both a log t dependence and a Richter aftereffect. Both of these, but especially the log t dependence, will cause susceptibility measurements (made by applying a magnetic field for a very short time) to be minimum values. The susceptibility and the derived Q should therefore be used cautiously in magnetic anomaly interpretation, because they can cause the importance of the induced magnetization to be underestimated.
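A minimal sketch of the log t dependence mentioned above: hypothetical VRM acquisition values are fitted to VRM(t) = V0 + S·log10(t) by least squares, and the fitted curve is evaluated at a short (susceptibility-style) exposure and at one week to illustrate why short-exposure measurements give minimum values. All data values and coefficients are illustrative, not measurements from Hole 534A.

```python
# Fit the log-t law of viscous remanent magnetization to hypothetical data.
import numpy as np

t = np.array([60.0, 600.0, 3600.0, 86400.0, 604800.0])   # exposure times (s) in a 1-Oe field
vrm = np.array([0.8, 1.5, 2.1, 3.2, 3.9])                 # hypothetical VRM (arbitrary units)

# Least-squares fit of VRM against log10(t): VRM(t) = V0 + S*log10(t).
A = np.vstack([np.log10(t), np.ones_like(t)]).T
S, V0 = np.linalg.lstsq(A, vrm, rcond=None)[0]
print(f"viscosity coefficient S = {S:.3f}, intercept V0 = {V0:.3f}")

# A susceptibility-style measurement applies the field for only a second or so,
# so it samples the low end of the log-t curve and yields a minimum estimate
# of the field-induced moment compared with a week-long exposure.
print("1-s estimate :", V0 + S * np.log10(1.0))
print("1-week value :", V0 + S * np.log10(604800.0))
```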
Abstract:
A nested ice flow model was developed for eastern Dronning Maud Land to assist with the dating and interpretation of the EDML deep ice core. The model consists of a high-resolution, higher-order ice dynamic flow model nested into a comprehensive 3-D thermomechanical model of the whole Antarctic ice sheet. As the drill site is in a flank position, the calculations specifically take into account the effects of horizontal advection, since deeper ice in the core originated from higher inland. First, the regional velocity field and ice sheet geometry are obtained from a forward experiment over the last 8 glacial cycles. The result is subsequently employed in a Lagrangian backtracing algorithm to provide particle paths back to their time and place of deposition. The procedure directly yields the depth-age distribution, surface conditions at particle origin, and a suite of relevant parameters such as initial annual layer thickness. This paper discusses the method and the main results of the experiment, including the ice core chronology, the non-climatic corrections needed to extract the climatic part of the signal, and the thinning function. The focus is on the upper 89% of the ice core (approximately 170 kyr), as the dating below that is increasingly less robust owing to the unknown value of the geothermal heat flux. It is found that the temperature biases resulting from variations of surface elevation are up to half of the magnitude of the climatic changes themselves.
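As a rough illustration of the Lagrangian backtracing step, the sketch below traces a particle backward in time through a toy 2-D velocity field, from a given depth at a hypothetical flank drill site until it reaches the surface, returning its age and place of deposition. The analytic velocity field, the 2500 m ice thickness, and all rates are invented placeholders standing in for the output of the nested flow model.

```python
# Backtrace particles from depth to their time and place of deposition.
H = 2500.0  # ice thickness (m), illustrative value

def velocity(z):
    """Toy velocities (m/yr) at depth z (m): horizontal advection u and
    downward submergence s, both decreasing toward the bed."""
    u = 1.0 + 2.0 * (1.0 - z / H)
    s = 0.05 * (1.0 - z / H)
    return u, s

def backtrace(depth, dt=10.0, max_steps=1_000_000):
    """Return (age in years, horizontal origin in m upstream of the site)."""
    x, z, age = 0.0, depth, 0.0
    for _ in range(max_steps):
        if z <= 0.0:                 # particle has reached the paleo-surface
            return age, -x           # negative x means it came from inland
        u, s = velocity(z)
        x -= u * dt                  # step backward along the horizontal flow
        z -= s * dt                  # and back up toward the surface
        age += dt
    raise RuntimeError("particle did not reach the surface")

for d in (500.0, 1500.0, 2200.0):
    age, upstream = backtrace(d)
    print(f"depth {d:6.0f} m -> age {age:9.0f} yr, deposited {upstream/1000:.1f} km upstream")
```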
Abstract:
The analysis of time-dependent data is an important problem in many application domains, and interactive visualization can help in understanding patterns in large time series data. Many effective approaches already exist for the visual analysis of univariate time series, supporting tasks such as assessment of data quality, detection of outliers, or identification of periodically or frequently occurring patterns. However, far fewer approaches support multivariate time series. The existence of multiple values per time stamp makes the analysis task per se harder, and existing visualization techniques often do not scale well. We introduce an approach for the visual analysis of large multivariate time-dependent data based on the idea of projecting multivariate measurements to a 2D display and visualizing the time dimension by trajectories. To scale with multivariate time series, we use visual data aggregation metaphors based on grouping of similar data elements. Aggregation procedures can be based either on statistical properties of the data or on data clustering routines. Appropriately defined user controls allow the user to navigate and explore the data and to interactively steer the parameters of the data aggregation to enhance the analysis. We present an implementation of our approach and apply it to a comprehensive data set from the field of Earth observation, demonstrating its applicability and usefulness.
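A minimal sketch of the projection-and-trajectory idea, assuming a PCA projection (the paper does not prescribe a specific projection method) and a simple windowed-mean aggregation as a stand-in for the statistical aggregation described above; all data are synthetic.

```python
# Project a synthetic multivariate time series to 2-D and draw its trajectory,
# then aggregate consecutive samples by windowed means to coarsen the view.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
t = np.linspace(0, 6 * np.pi, 600)
# Six correlated channels built from two latent oscillations plus noise.
latent = np.stack([np.sin(t), np.cos(0.5 * t)])
mixing = rng.normal(size=(6, 2))
X = latent.T @ mixing.T + 0.1 * rng.normal(size=(600, 6))

# PCA projection to 2-D via SVD of the centered data.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
P = Xc @ Vt[:2].T                      # (600, 2) projected coordinates

# Aggregate the trajectory: mean position of every 20-sample window.
agg = P.reshape(-1, 20, 2).mean(axis=1)

plt.plot(P[:, 0], P[:, 1], lw=0.5, alpha=0.4, label="raw trajectory")
plt.plot(agg[:, 0], agg[:, 1], "o-", label="aggregated (window means)")
plt.legend(); plt.xlabel("PC 1"); plt.ylabel("PC 2")
plt.show()
```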
Abstract:
Related to a research line of the GDS at ISOM; see http://www.isom.upm.es/dsemiconductores.php
Abstract:
The interaction of high-intensity X-ray lasers with matter is modeled. A collisional-radiative time-dependent module is implemented to study radiation transport in matter from ultrashort and ultraintense X-ray bursts. Inverse bremsstrahlung absorption by free electrons, electron conduction, and hydrodynamic effects are not considered. The collisional-radiative system is coupled with the evolution of the electron distribution, treated with a Fokker-Planck approach with additional inelastic terms. The model includes spontaneous emission, resonant photoabsorption, collisional excitation and de-excitation, radiative recombination, photoionization, collisional ionization, three-body recombination, autoionization, and dielectronic capture. It is found that for high densities, but still below solid density, collisions play an important role and thermalization times are not short enough to ensure a thermal electron distribution. At these densities, Maxwellian and non-Maxwellian electron distribution models yield substantial differences in collisional rates, modifying the atomic population dynamics.
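A heavily reduced sketch of a time-dependent collisional-radiative rate system, here cut down to two bound levels driven by a Gaussian X-ray pulse. The rate coefficients and pulse parameters are invented placeholders, and the Fokker-Planck treatment of the free-electron distribution is omitted.

```python
# Two-level time-dependent rate equations driven by an ultrashort X-ray burst.
import numpy as np
from scipy.integrate import solve_ivp

A21   = 1.0e12   # spontaneous emission rate (1/s), illustrative
C12   = 5.0e10   # collisional excitation rate (1/s), illustrative
C21   = 2.0e11   # collisional de-excitation rate (1/s), illustrative
B_abs = 4.0e12   # peak resonant photoabsorption rate at pulse maximum (1/s)

def pulse(t, t0=50e-15, fwhm=20e-15):
    """Normalized Gaussian X-ray burst."""
    sigma = fwhm / 2.355
    return np.exp(-0.5 * ((t - t0) / sigma) ** 2)

def rhs(t, n):
    n1, n2 = n
    up   = (B_abs * pulse(t) + C12) * n1      # photoabsorption + collisional excitation
    down = (A21 + C21) * n2                   # spontaneous emission + de-excitation
    return [down - up, up - down]

sol = solve_ivp(rhs, (0.0, 200e-15), [1.0, 0.0], max_step=1e-16)
print("final level populations:", sol.y[:, -1])
```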
Abstract:
Lagrangian descriptors are a recent technique that reveals geometrical structures in phase space and is valid for aperiodically time-dependent dynamical systems. We discuss a general methodology for constructing them and a "heuristic argument" that explains why the method is successful, supporting this argument with explicit calculations on a benchmark problem. Several other benchmark examples are considered that allow us to assess the performance of Lagrangian descriptors against both finite-time Lyapunov exponents (FTLEs) and finite-time averages of certain components of the vector field ("time averages"). In all cases Lagrangian descriptors are shown to be both more accurate and computationally efficient than these methods.
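A minimal sketch of the arc-length Lagrangian descriptor M, computed here for a periodically forced Duffing oscillator used as a stand-in benchmark: the trajectory through each grid point is integrated forward and backward over a finite window and its arc length accumulated, and ridges of the resulting field highlight the geometrical structures mentioned above. Parameters and grid resolution are illustrative.

```python
# Arc-length Lagrangian descriptor on a grid of initial conditions.
import numpy as np

def f(t, state, eps=0.1, omega=1.0):
    """Forced Duffing vector field: x' = y, y' = x - x^3 + eps*sin(omega*t)."""
    x, y = state
    return np.array([y, x - x**3 + eps * np.sin(omega * t)])

def arc_length(x0, y0, tau=5.0, dt=0.02):
    """Trajectory arc length over [-tau, tau] through (x0, y0) at t = 0."""
    M = 0.0
    for direction in (+1.0, -1.0):
        state, t = np.array([x0, y0]), 0.0
        for _ in range(int(tau / dt)):
            h = direction * dt                 # one RK4 step forward or backward
            k1 = f(t, state)
            k2 = f(t + h / 2, state + h / 2 * k1)
            k3 = f(t + h / 2, state + h / 2 * k2)
            k4 = f(t + h, state + h * k3)
            step = h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
            M += np.linalg.norm(step)          # accumulate arc length |dx|
            state, t = state + step, t + h
    return M

xs = np.linspace(-1.5, 1.5, 30)
ys = np.linspace(-1.0, 1.0, 20)
M = np.array([[arc_length(x, y) for x in xs] for y in ys])
print("descriptor field shape:", M.shape)   # sharp ridges of M mark invariant manifolds
```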
Abstract:
This paper deals with the dynamics of liquid bridges subjected to an oscillatory microgravity field. The analysis has been performed with a one-dimensional slice model, already used in liquid bridge problems, which makes it possible to calculate not only the resonance frequencies of a wide range of such fluid configurations but also the dependence of the dynamic response of the liquid bridge on the frequency of the imposed perturbations. Theoretical results are compared with experimental ones obtained aboard Spacelab-D1, the agreement between theory and experiment being satisfactory.
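As a generic illustration of the kind of frequency-response curve such a calculation provides, the sketch below plots the amplitude of a damped oscillator driven by an oscillatory acceleration as a function of forcing frequency; the natural frequency and damping are toy values, not results of the liquid-bridge slice model.

```python
# Amplitude of a damped driven oscillator versus forcing frequency.
import numpy as np
import matplotlib.pyplot as plt

f0, zeta = 0.8, 0.05                     # natural frequency (Hz) and damping ratio, toy values
f = np.linspace(0.1, 2.0, 400)           # forcing frequencies (Hz)
ratio = f / f0
amplitude = 1.0 / np.sqrt((1 - ratio**2) ** 2 + (2 * zeta * ratio) ** 2)

plt.plot(f, amplitude)
plt.xlabel("forcing frequency (Hz)")
plt.ylabel("normalized response amplitude")
plt.title("Toy frequency response with a resonance near f0")
plt.show()
```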
Abstract:
Traffic flow time series data are usually high dimensional and very complex. They are also sometimes imprecise and distorted because of sensor malfunction during data collection. Additionally, events such as congestion caused by traffic accidents add more uncertainty to real-time traffic conditions, making traffic flow forecasting a complicated task. This article presents a new data preprocessing method targeting multidimensional time series with a very high number of dimensions and shows its application to real traffic flow time series from the California Department of Transportation (PEMS web site). The proposed method consists of three main steps. First, based on mTESL, a language for defining events in multidimensional time series, we identify a number of event types in the time series that correspond to either incorrect data or data with interference. Second, each event type is restored using an original method that combines real observations, locally forecasted values, and historical data. Third, an exponential smoothing procedure is applied globally to eliminate noise interference and other random errors, so as to provide good-quality source data for future work.
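A minimal sketch of the last two steps, with a naive moving-average forecast and fixed blending weights standing in for the paper's restoration method (the mTESL-based event detection of the first step is not reproduced); the traffic data below are synthetic.

```python
# Restore flagged samples, then apply global exponential smoothing.
import numpy as np

rng = np.random.default_rng(1)
flow = 500 + 100 * np.sin(np.linspace(0, 4 * np.pi, 288)) + rng.normal(0, 20, 288)
historical = 500 + 100 * np.sin(np.linspace(0, 4 * np.pi, 288))   # "typical day" profile
bad = [70, 71, 150]          # indices flagged as sensor-malfunction events
flow[bad] = 0.0              # corrupted readings

def restore(x, hist, idx, w=(0.4, 0.4, 0.2)):
    """Replace x[idx] by a blend of last observation, local forecast, and history."""
    x = x.copy()
    for i in idx:
        local_forecast = x[max(0, i - 6):i].mean()     # naive short-term forecast
        x[i] = w[0] * x[i - 1] + w[1] * local_forecast + w[2] * hist[i]
    return x

def exp_smooth(x, alpha=0.3):
    """Simple exponential smoothing to suppress noise and random errors."""
    s = np.empty_like(x)
    s[0] = x[0]
    for i in range(1, len(x)):
        s[i] = alpha * x[i] + (1 - alpha) * s[i - 1]
    return s

clean = exp_smooth(restore(flow, historical, bad))
print("restored values at flagged indices:", clean[bad].round(1))
```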
Abstract:
Temporal patterning of biological variables, in the form of oscillations and rhythms on many time scales, is ubiquitous. Altering the temporal pattern of an input variable greatly affects the output of many biological processes. We develop here a conceptual framework for a quantitative understanding of such pattern dependence, focusing particularly on nonlinear, saturable, time-dependent processes that abound in biophysics, biochemistry, and physiology. We show theoretically that pattern dependence is governed by the nonlinearity of the input–output transformation as well as its time constant. As a result, only patterns on certain time scales permit the expression of pattern dependence, and processes with different time constants can respond preferentially to different patterns. This has implications for temporal coding and decoding, and allows differential control of processes through pattern. We show how pattern dependence can be quantitatively predicted using only information from steady, unpatterned input. To apply our ideas, we analyze, in an experimental example, how muscle contraction depends on the pattern of motor neuron firing.
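A minimal sketch of pattern dependence in a saturable, time-dependent process: first-order binding kinetics dy/dt = k_on·u·(1−y) − k_off·y are driven by a steady input and by bursty inputs with the same mean but different periods. The kinetics and all parameter values are illustrative choices, not the muscle-contraction system analyzed in the paper.

```python
# Same mean input, different temporal patterns, different mean output.
import numpy as np

dt, T = 0.001, 40.0
t = np.arange(0.0, T, dt)

def run(u, k_on=2.0, k_off=1.0):
    """Integrate saturable first-order kinetics dy/dt = k_on*u*(1-y) - k_off*y."""
    y = np.zeros(len(u))
    for i in range(1, len(u)):
        y[i] = y[i - 1] + dt * (k_on * u[i - 1] * (1.0 - y[i - 1]) - k_off * y[i - 1])
    return y

def bursts(period, duty=0.2, amp=2.5):
    """Rectangular bursts with time-averaged input amp*duty = 0.5."""
    return np.where((t % period) < duty * period, amp, 0.0)

steady = np.full_like(t, 0.5)                      # constant input with the same mean
print("steady input        :", run(steady)[len(t)//2:].mean().round(3))
print("fast bursts (0.1 s) :", run(bursts(0.1))[len(t)//2:].mean().round(3))
print("slow bursts (5 s)   :", run(bursts(5.0))[len(t)//2:].mean().round(3))
# Fast bursts are averaged out by the process time constant and behave like the
# steady input; slow bursts engage the saturation and give a lower mean output,
# i.e. pattern dependence is expressed only on slow enough time scales.
```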
Abstract:
The hyperpermeability of tumor vessels to macromolecules, compared with normal vessels, is presumably due to vascular endothelial growth factor/vascular permeability factor (VEGF/VPF) released by neoplastic and/or host cells. In addition, VEGF/VPF is a potent angiogenic factor. Removal of this growth factor may reduce the permeability and inhibit tumor angiogenesis. To test these hypotheses, we transplanted a human glioblastoma (U87), a human colon adenocarcinoma (LS174T), and a human melanoma (P-MEL) into two locations in immunodeficient mice: the cranial window and the dorsal skinfold chamber. The mice bearing vascularized tumors were treated with a bolus (0.2 ml) of either a neutralizing antibody (A4.6.1) (492 μg/ml) against VEGF/VPF or PBS (control). We found that tumor vascular permeability to albumin in antibody-treated groups was lower than in the matched controls and that the effect of the antibody was time-dependent and influenced by the mode of injection. Tumor vascular permeability did not respond to i.p. injection of the antibody until 4 days posttreatment. However, the permeability was reduced within 6 h after i.v. injection of the same amount of antibody. In addition to the reduction in vascular permeability, the tumor vessels became smaller in diameter and less tortuous after antibody injections and eventually disappeared from the surface after four consecutive treatments in U87 tumors. These results demonstrate that tumor vascular permeability can be reduced by neutralization of endogenous VEGF/VPF and suggest that angiogenesis and the maintenance of integrity of tumor vessels require the presence of VEGF/VPF in the tissue microenvironment. The latter finding reveals a new mechanism of tumor vessel regression—i.e., blocking the interactions between VEGF/VPF and endothelial cells or inhibiting VEGF/VPF synthesis in solid tumors causes dramatic reduction in vessel diameter, which may block the passage of blood elements and thus lead to vascular regression.
Abstract:
The concepts of temperature and equilibrium are not well defined in systems of particles with time-varying external forces. An example is a radio frequency ion trap, with the ions laser cooled into an ordered solid, characteristic of sub-mK temperatures, whereas the kinetic energies associated with the fast coherent motion in the trap are up to 7 orders of magnitude higher. Simulations with 1,000 ions reach equilibrium between the degrees of freedom when only aperiodic displacements (secular motion) are considered. The coupling of the periodic driven motion associated with the confinement to the nonperiodic random motion of the ions is very small at low temperatures and increases quadratically with temperature.
Abstract:
It has become clear that many organisms possess the ability to regulate their mutation rate in response to environmental conditions. So the question of finding an optimal mutation rate must be replaced by that of finding an optimal mutation schedule. We show that this task cannot be accomplished with standard population-dynamic models. We then develop a "hybrid" model for populations experiencing time-dependent mutation that treats population growth as deterministic but the time of first appearance of new variants as stochastic. We show that the hybrid model agrees well with a Monte Carlo simulation. From this model, we derive a deterministic approximation, a "threshold" model, that is similar to standard population dynamic models but differs in the initial rate of generation of new mutants. We use these techniques to model antibody affinity maturation by somatic hypermutation. We had previously shown that the optimal mutation schedule for the deterministic threshold model is phasic, with periods of mutation between intervals of mutation-free growth. To establish the validity of this schedule, we now show that the phasic schedule that optimizes the deterministic threshold model significantly improves upon the best constant-rate schedule for the hybrid and Monte Carlo models.
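A minimal sketch of the hybrid idea: wild-type growth is treated deterministically, the time of first appearance of a mutant is drawn stochastically with hazard rate μ(t)·N(t), and the mutant lineage then grows deterministically. The growth rates, the phasic mutation window, and the exponential-growth assumption are illustrative placeholders, not the paper's affinity-maturation model.

```python
# Hybrid model: deterministic growth, stochastic first appearance of mutants.
import numpy as np

rng = np.random.default_rng(42)
r, N0, T, dt = 1.0, 1.0, 10.0, 0.01

def N_wild(t):
    return N0 * np.exp(r * t)                # deterministic wild-type growth

def mu(t):
    """A simple phasic mutation schedule: mutation only during 1 < t < 2."""
    return 0.1 if 1.0 < t < 2.0 else 0.0

def first_appearance():
    """Draw the (stochastic) time at which the first mutant arises."""
    t = 0.0
    while t < T:
        hazard = mu(t) * N_wild(t)           # expected mutants produced per unit time
        if rng.random() < 1.0 - np.exp(-hazard * dt):
            return t
        t += dt
    return np.inf                            # no mutant appeared by time T

def mutants_at_T():
    t_star = first_appearance()
    return 0.0 if t_star == np.inf else np.exp(r * (T - t_star))  # deterministic mutant growth

samples = [mutants_at_T() for _ in range(200)]
print("mean mutant population at T   :", round(float(np.mean(samples)), 1))
print("fraction of runs with no mutant:", np.mean([s == 0.0 for s in samples]))
```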