142 results for Mean-Reverting Jump-Diffusion
Abstract:
The present study focuses on single-case data analysis and specifically on two procedures for quantifying differences between baseline and treatment measurements. The first technique tested is based on generalized least squares regression analysis and is compared to a proposed non-regression technique that yields similar information. The comparison is carried out in the context of generated data representing a variety of patterns (i.e., independent measurements, different serial dependence underlying processes, constant or phase-specific autocorrelation and data variability, different types of trend, and slope and level change). The results suggest that the two techniques perform adequately for a wide range of conditions and that researchers can use either of them with certain guarantees. The regression-based procedure offers more efficient estimates, whereas the proposed non-regression procedure is more sensitive to intervention effects. Considering current and previous findings, some tentative recommendations are offered to applied researchers to help them choose among the plurality of single-case data analysis techniques.
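As an illustration of the regression-based procedure only, here is a minimal sketch of generalized least squares applied to a baseline/treatment level-change design with AR(1) errors; the phase lengths, autocorrelation, and effect size are invented, and this is not the authors' actual implementation.

```python
# Minimal GLS sketch for a baseline/treatment level-change comparison.
# Illustrative only: phase lengths, AR(1) coefficient, and effect size are assumed.
import numpy as np

rng = np.random.default_rng(0)
n_base, n_treat, phi, effect = 10, 10, 0.3, 2.0
n = n_base + n_treat

# Generate AR(1) errors and add a level change in the treatment phase
e = np.zeros(n)
for t in range(1, n):
    e[t] = phi * e[t - 1] + rng.normal()
phase = np.r_[np.zeros(n_base), np.ones(n_treat)]
y = effect * phase + e

# Design matrix: intercept + phase indicator (level change)
X = np.column_stack([np.ones(n), phase])

# AR(1) correlation matrix and the GLS estimator (X' S^-1 X)^-1 X' S^-1 y
S = phi ** np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
S_inv = np.linalg.inv(S)
beta = np.linalg.solve(X.T @ S_inv @ X, X.T @ S_inv @ y)
print("estimated level change:", beta[1])
```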
Abstract:
We study hydrogen stability and its evolution during thermal annealing in nanostructured amorphous silicon thin films. From the simultaneous measurement of heat and hydrogen desorption, we obtain experimental evidence of molecular diffusion in these materials. In addition, we introduce a simple diffusion model that shows good agreement with the experimental data.
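The "simple diffusion model" is not specified in the abstract; as a hedged illustration of the kind of model involved, the sketch below solves Fickian out-diffusion from a film with fully absorbing surfaces and tracks the retained hydrogen over time. Thickness, diffusivity, and grid are arbitrary, not fitted values.

```python
# Toy 1-D out-diffusion from a thin film with absorbing surfaces.
import numpy as np

L, D = 1.0, 1.0               # film thickness and diffusivity (arbitrary units)
nx = 101
dx = L / (nx - 1)
dt = 0.4 * dx**2 / D          # explicit-scheme stability limit
c = np.ones(nx)               # initial hydrogen concentration profile
c[0] = c[-1] = 0.0            # hydrogen desorbs at both surfaces

retained = []
for _ in range(2000):
    c[1:-1] += D * dt / dx**2 * (c[2:] - 2 * c[1:-1] + c[:-2])
    c[0] = c[-1] = 0.0
    retained.append(c.sum() * dx)   # hydrogen still in the film

print("fraction retained after run:", retained[-1])
```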
Abstract:
The effect of drift on the segregation pattern in diffusion-limited aggregation (DLA) with two components (A and B species) is investigated. The sticking probability PAB (=PBA) between different species is introduced into the DLA model with drift, while the sticking probability PAA (=PBB) between the same species equals 1. Computer simulations show that the drift has an important effect not only on the morphology but also on the segregation pattern. Under drift and a small sticking probability, a characteristic pattern appears in which elongated clusters of A species and of B species are dispersed periodically. The period decreases with increasing drift. The periodic structure of the deposits is characterized by an autocorrelation function. The shape of a cluster consisting of only A species (or B species) is a vertically elongated, filament-like structure. Each cluster becomes vertically longer with decreasing sticking probability PAB. The segregation pattern is distinctly different from that with no drift and a small sticking probability PAA. The effect of the concentration on the segregation pattern is also shown.
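A toy on-lattice rendering of the model just described (not the authors' code): walkers of species A or B drift toward the deposit, stick with probability 1 next to the same species and with an assumed probability p_ab next to the other species; lattice size and drift strength are made up.

```python
# Two-species DLA-with-drift sketch: drifted random walkers with
# species-dependent sticking probabilities.
import random

random.seed(0)
W, H = 60, 120            # lattice width (periodic in x) and height
drift = 0.6               # probability of stepping down instead of sideways
p_ab = 0.1                # sticking probability between unlike species
grid = {(x, 0): random.choice("AB") for x in range(W)}   # seed substrate row

def neighbours(x, y):
    return [((x + 1) % W, y), ((x - 1) % W, y), (x, y + 1), (x, y - 1)]

for _ in range(3000):
    species = random.choice("AB")
    x, y = random.randrange(W), H - 1
    while y > 0:
        if random.random() < drift:
            y -= 1                                   # drift toward the deposit
        else:
            x = (x + random.choice((-1, 1))) % W     # lateral diffusion
        contacts = [grid[n] for n in neighbours(x, y) if n in grid]
        if contacts and (x, y) not in grid:
            if species in contacts or random.random() < p_ab:
                grid[(x, y)] = species               # particle sticks and freezes
                break

print("deposited particles:", len(grid) - W)
```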
Abstract:
An analytical model based on Bowen and Holman [1989] is used to prove the existence of instabilities due to the presence of a second extremum of the background vorticity at the front side of the longshore current. The growth rate of the so-called frontshear waves depends primarily upon the frontshear but also upon the backshear and the maximum and the width of the current. Depending on the values of these parameters, either the frontshear or the backshear instabilities may dominate. Both types of waves have a cross-shore extension of the order of the width of the current, but the frontshear modes are localized closer to the coast than are the backshear modes. Moreover, under certain conditions both unstable waves have similar growth rates with close wave numbers and angular frequencies, leading to the possibility of modulated shear waves in the alongshore direction. Numerical analysis performed on realistic current profiles confirms the behavior anticipated by the analytical model. The theory has been applied to a current profile fitted to data measured during the 1980 Nearshore Sediment Transport Studies experiment at Leadbetter Beach, which has an extremum of background vorticity at the front side of the current. In this case, and in agreement with field observations, the model predicts instability, whereas the theory based only on backshear instability failed to do so.
Abstract:
We analyze the diffusion of a Brownian particle in a fluid under stationary flow. By using the scheme of nonequilibrium thermodynamics in phase space, we obtain the Fokker-Planck equation, which is compared with others derived from kinetic theory and projection operator techniques. This equation exhibits a violation of the fluctuation-dissipation theorem. By implementing the hydrodynamic regime described by the first moments of the nonequilibrium distribution, we find relaxation equations for the diffusion current and pressure tensor, allowing us to arrive at a complete description of the system in the inertial and diffusion regimes. The simplicity and generality of the method we propose make it applicable to more complex situations, often encountered in problems of soft condensed matter, in which not just one but several degrees of freedom are coupled to a nonequilibrium bath.
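As a generic illustration only (not the authors' phase-space derivation), the overdamped Langevin sketch below shows Brownian particles advected by a stationary linear shear flow, where diffusion along the flow direction is enhanced by the coupling to the flow; all parameter values are arbitrary.

```python
# Overdamped Langevin dynamics of Brownian particles in a linear shear flow
# v_x = gamma * y.  Parameters are arbitrary illustration values.
import numpy as np

rng = np.random.default_rng(1)
gamma_flow, friction, kT = 0.5, 1.0, 1.0
dt, steps, n_part = 1e-3, 5000, 2000

x = np.zeros(n_part)
y = np.zeros(n_part)
sigma = np.sqrt(2 * kT / friction * dt)        # noise amplitude per step
for _ in range(steps):
    x += gamma_flow * y * dt + sigma * rng.normal(size=n_part)
    y += sigma * rng.normal(size=n_part)

t = steps * dt
print("var(x)/2t:", x.var() / (2 * t))   # enhanced by the shear coupling
print("var(y)/2t:", y.var() / (2 * t))   # ordinary diffusion, about kT/friction
```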
Abstract:
We study biased, diffusive transport of Brownian particles through narrow, spatially periodic structures in which the motion is constrained in the lateral directions. The problem is analyzed from the perspective of the Fick-Jacobs equation, which accounts for the effect of the lateral confinement by introducing an entropic barrier into a one-dimensional diffusion problem. The validity of this approximation, based on the assumption of an instantaneous equilibration of the particle distribution in the cross section of the structure, is analyzed by comparing the different time scales that characterize the problem. A validity criterion is established in terms of the shape of the structure and of the applied force. It is corroborated analytically and verified by numerical simulations that the critical value of the force up to which this description holds true scales as the square of the periodicity of the structure. The criterion can be visualized by means of a diagram representing the regions where the Fick-Jacobs description becomes inaccurate, in terms of the scaled force versus the periodicity of the structure.
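For reference, a commonly quoted form of the Fick-Jacobs description referred to above is reproduced below; the exact conventions (for instance, whether the diffusivity D(x) is taken to be position dependent) may differ from those used in the paper.

```latex
\[
\frac{\partial P(x,t)}{\partial t}
  = \frac{\partial}{\partial x}\left\{ D(x)\, e^{-\beta A(x)}\,
    \frac{\partial}{\partial x}\left[ e^{\beta A(x)} P(x,t) \right] \right\},
\qquad
A(x) = -F x - \frac{1}{\beta}\,\ln h(x),
\]
```

where h(x) is the local cross-section of the channel (the entropic contribution), F is the applied force, and beta = 1/k_B T; the abstract's criterion states that this one-dimensional reduction remains accurate only up to a critical force scaling as the square of the structure's periodicity.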
Abstract:
We study the time scales associated with diffusion processes that take place on multiplex networks, i.e., on a set of networks linked through interconnected layers. To this end, we propose the construction of a supra-Laplacian matrix, which consists of a dimensional lifting of the Laplacian matrix of each layer of the multiplex network. We use perturbative analysis to reveal analytically the structure of eigenvectors and eigenvalues of the complete network in terms of the spectral properties of the individual layers. The spectrum of the supra-Laplacian allows us to understand the physics of diffusionlike processes on top of multiplex networks.
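A minimal sketch of the supra-Laplacian construction described above, assuming two layers whose copies of each node are coupled pairwise with diffusion constant Dx; the small example adjacency matrices are made up.

```python
# Supra-Laplacian of a two-layer multiplex: block-diagonal layer Laplacians
# plus an interlayer coupling term Dx * (layer-graph Laplacian kron identity).
import numpy as np

def laplacian(adj):
    return np.diag(adj.sum(axis=1)) - adj

# Two illustrative layers defined on the same three nodes
A1 = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
A2 = np.array([[0, 1, 1], [1, 0, 0], [1, 0, 0]], dtype=float)
L1, L2 = laplacian(A1), laplacian(A2)
N, Dx = A1.shape[0], 0.5                      # nodes per layer, interlayer coupling

intra = np.block([[L1, np.zeros((N, N))], [np.zeros((N, N)), L2]])
L_layers = laplacian(np.array([[0, 1], [1, 0]], dtype=float))   # the 2-node layer graph
supra = intra + Dx * np.kron(L_layers, np.eye(N))

evals = np.sort(np.linalg.eigvalsh(supra))
print("smallest nonzero eigenvalue:", evals[1])
```

The second-smallest eigenvalue of the supra-Laplacian controls the slowest relaxation mode of diffusion on the multiplex, which is the kind of time scale the perturbative analysis in the abstract targets.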
Abstract:
Transport in small-scale biological and soft-matter systems typically occurs under confinement conditions in which particles proceed through obstacles and irregularities of the boundaries that may significantly alter their trajectories. A transport model that assimilates the confinement to the presence of entropic barriers provides an efficient approach to quantify its effect on the particle current and the diffusion coefficient. We review the main peculiarities of entropic transport and treat two cases in which confinement effects play a crucial role, with the appearance of emergent properties. The presence of entropic barriers modifies the mean first-passage time distribution and therefore plays a very important role in ion transport through micro- and nano-channels. The functionality of molecular motors, modeled as Brownian ratchets, is strongly affected when the motor proceeds in a confined medium, which may constitute another source of rectification. The interplay between ratchet and entropic rectification gives rise to a wide variety of dynamical behaviors not observed when the Brownian motor proceeds in an unbounded medium. Entropic transport offers new avenues for transport control and particle manipulation, and new ways to engineer more efficient devices for transport at the nanoscale.
Abstract:
High-energy charged particles in the Van Allen radiation belts and in solar energetic particle events can damage satellites on orbit, leading to malfunctions and loss of satellite service. Here we describe some recent results from the SPACECAST project on modelling and forecasting the radiation belts and on modelling solar energetic particle events. We describe the SPACECAST forecasting system, which uses physical models that include wave-particle interactions to forecast the electron radiation belts up to 3 h ahead. We show that the forecasts were able to reproduce the >2 MeV electron flux at GOES 13 during the moderate storm of 7-8 October 2012, and during the period following a fast solar wind stream on 25-26 October 2012, to within a factor of 5 or so. At lower energies, from 10 keV to a few hundred keV, we show that the electron flux at geostationary orbit depends sensitively on the high-energy tail of the source distribution near 10 RE on the nightside of the Earth, and that the source is best represented by a kappa distribution. We present a new model of whistler mode chorus determined from multiple satellite measurements, which shows that the effects of wave-particle interactions beyond geostationary orbit are likely to be very significant. We also present radial diffusion coefficients calculated from satellite data at geostationary orbit, which vary with Kp by over four orders of magnitude. Finally, for modelling solar energetic particle events, we describe a new automated method, which takes entropy into account, to determine the position at the shock that is magnetically connected to the Earth, and we predict from analytical theory the form of the mean free path in the foreshock and the particle injection efficiency at the shock, both of which can be tested in simulations.
Abstract:
From a business standpoint, this paper describes the point of view on the question of warranties of a FOSS editor doing business in a risk-averse market segment. It is based on AdaCore's 15 years of experience in the safety-critical embedded industry. However, it is not only the point of view of a provider; it also aims to demonstrate that the interests of providers and users are aligned in this area. From a legal point of view, the enforceability of these warranties is partly covered, as well as the articulation between the license and the warranties on the one hand, and between the license and the other contracts that can be created in a business relationship on the other.
Abstract:
What determines which inputs are initially considered and eventually adopted in the production of new or improved goods? Why are some inputs much more prominent than others? We model the evolution of input linkages as a process where new producers first search for potentially useful inputs and then decide which ones to adopt. A new product initially draws a set of 'essential suppliers'. The search stage is then confined to the network neighborhood of the latter, i.e., to the inputs used by the essential suppliers. The adoption decision is driven by a tradeoff between the benefits accruing from input variety and the costs of input adoption. This has important implications for the number of forward linkages that a product (input variety) develops over time. Input diffusion is fostered by network centrality: an input that is initially represented in many network neighborhoods is subsequently more likely to be adopted. This mechanism also delivers a power-law distribution of forward linkages. Our predictions continue to hold when varieties are aggregated into sectors. We can thus test them, using detailed sectoral US input-output tables. We show that initial network proximity of a sector in 1967 significantly increases the likelihood of adoption throughout the subsequent four decades. The same is true for rapid productivity growth in an input-producing sector. Our empirical results highlight two conditions for new products to become central nodes: initial network proximity to prospective adopters, and technological progress that reduces their relative price. Semiconductors met both conditions.
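A toy rendering of the search-and-adoption process sketched above (not the authors' model or its calibration): a new product draws essential suppliers, searches only their input sets, and adopts each candidate with a fixed probability, so inputs that already appear in many neighborhoods accumulate forward linkages. All counts and probabilities are invented.

```python
# Toy search-and-adopt dynamics: network proximity breeds adoption.
import random
from collections import Counter

random.seed(0)
n_inputs, n_initial, p_adopt, n_new = 50, 20, 0.3, 500

# Initial products, each using a random bundle of inputs
products = [set(random.sample(range(n_inputs), 5)) for _ in range(n_initial)]

for _ in range(n_new):
    essential = random.sample(products, 2)        # draw the essential suppliers
    candidates = set().union(*essential)          # search stage: their inputs only
    adopted = {i for i in candidates if random.random() < p_adopt}
    if not adopted:                               # arbitrary fallback: one random input
        adopted = {random.randrange(n_inputs)}
    products.append(adopted)

links = Counter(i for bundle in products for i in bundle)
print("inputs with most forward linkages:", links.most_common(5))
```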
Abstract:
Priorities for museums are changing. The mission of the new museology is to turn museums into places for enjoyment and learning, which requires them to manage their finances much like a social enterprise competing in the leisure sector. Over time, museums must establish and apply the criteria needed for survival, paving the way for other public institutions to be more open in their efforts to communicate and disseminate their heritage. We can already speak of some commonly accepted conclusions about visitor behaviour, which are needed to plan future exhibitions that treat learning as a constructive process, collections as objects with meaning, and the exhibitions themselves as communication media that should transform the way visitors think and that serve the message itself. The internet appears to be an effective medium for achieving these goals, since it can (a) adapt to the interests and intellectual characteristics of a diverse public; (b) rediscover the meanings of objects and gain sociocultural recognition of their value through its interactive potential; and (c) use attractive and stimulating elements that everyone can enjoy. To this end, the following questions are essential: What criteria should a virtual museum follow to optimize the dissemination of its heritage? What elements encourage users to stay on a website and make virtual visits they find satisfying? And what role does the usability of the application play in all this?
Abstract:
Background: Average energies of nuclear collective modes may be efficiently and accurately computed using a nonrelativistic constrained approach without reliance on a random phase approximation (RPA). Purpose: To extend the constrained approach to the relativistic domain and to establish its impact on the calibration of energy density functionals. Methods: Relativistic RPA calculations of the giant monopole resonance (GMR) are compared against the predictions of the corresponding constrained approach using two accurately calibrated energy density functionals. Results: We find excellent agreement, at the 2% level or better, between the predictions of the relativistic RPA and the corresponding constrained approach for magic (or semimagic) nuclei ranging from 16O to 208Pb. Conclusions: An efficient and accurate method is proposed for incorporating nuclear collective excitations into the calibration of future energy density functionals.
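For context, the standard sum-rule relations behind such a constrained calculation of the GMR centroid are reproduced below for the isoscalar monopole operator F = sum_i r_i^2; the conventions may differ slightly from the paper's.

```latex
\[
E_{\mathrm{GMR}} \approx \sqrt{\frac{m_1}{m_{-1}}}, \qquad
m_1 = \frac{2\hbar^2 A}{m}\,\langle r^2 \rangle, \qquad
m_{-1} = -\frac{1}{2}\left.\frac{\partial^2 E(\lambda)}{\partial \lambda^2}\right|_{\lambda=0},
\]
```

where E(lambda) is the ground-state energy obtained with the constrained Hamiltonian H + lambda F; the inverse energy-weighted moment m_{-1} thus follows from constrained ground-state calculations alone, which is why no RPA is needed to estimate the average excitation energy.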