93 results for Time correlation function
Abstract:
This dissertation deals with aspects of sequential data assimilation (in particular ensemble Kalman filtering) and numerical weather forecasting. In the first part, the recently formulated Ensemble Kalman-Bucy filter (EnKBF) is revisited. It is shown that the previously used numerical integration scheme fails when the magnitude of the background error covariance grows beyond that of the observational error covariance in the forecast window. We therefore present a suitable integration scheme that handles the stiffening of the differential equations involved and incurs no additional computational expense. Moreover, a transform-based alternative to the EnKBF is developed: under this scheme, the operations are performed in ensemble space instead of in state space. The advantages of this formulation are explained. For the first time, the EnKBF is implemented in an atmospheric model. The second part of this work deals with ensemble clustering, a phenomenon that arises when performing data assimilation with deterministic ensemble square root filters (EnSRFs) in highly nonlinear forecast models: an M-member ensemble detaches into an outlier and a cluster of M-1 members. Previous works may suggest that this issue represents a failure of EnSRFs; this work dispels that notion. It is shown that ensemble clustering can also be reverted by nonlinear processes, in particular the alternation between nonlinear expansion and compression of the ensemble in different regions of the attractor. Some EnSRFs that use random rotations have been developed to overcome this issue; these formulations are analyzed and their advantages and disadvantages with respect to common EnSRFs are discussed. The third and last part contains the implementation of the Robert-Asselin-Williams (RAW) filter in an atmospheric model. The RAW filter is an improvement to the widely popular Robert-Asselin filter that successfully suppresses spurious computational waves while avoiding any distortion in the mean value of the function. Using statistical significance tests at both the local and the field level, it is shown that the climatology of the SPEEDY model is not modified by the changed time-stepping scheme; hence, no retuning of the parameterizations is required. It is also found that the accuracy of medium-term forecasts is increased by using the RAW filter.
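As an illustration of the time-stepping change discussed in the last part, here is a minimal Python sketch of leapfrog integration with the RAW filter (the test equation and parameter values are illustrative assumptions; alpha = 1 recovers the classical Robert-Asselin filter):

import numpy as np

def leapfrog_raw(f, x0, dt, nsteps, nu=0.2, alpha=0.53):
    """Leapfrog time stepping with the Robert-Asselin-Williams filter."""
    xs = np.empty(nsteps + 1)
    xs[0] = x0
    x_old = x0                     # (filtered) state at level n-1
    x_now = x0 + dt * f(x0)        # forward-Euler start for level n
    for n in range(1, nsteps + 1):
        xs[n] = x_now
        x_new = x_old + 2.0 * dt * f(x_now)           # leapfrog step
        d = 0.5 * nu * (x_old - 2.0 * x_now + x_new)  # RA displacement
        x_old = x_now + alpha * d          # filter the current level
        x_now = x_new + (alpha - 1.0) * d  # RAW: also nudge the new level
    return xs

# Example: dx/dt = -x, whose exact solution is exp(-t)
trajectory = leapfrog_raw(lambda x: -x, 1.0, 0.01, 1000)

Applying part of the displacement to the newest time level as well is what lets the filter damp the computational mode without distorting the mean of the three time levels.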
Abstract:
We discuss the time evolution of the wave function which is the solution of a stochastic Schrödinger equation describing the dynamics of a free quantum particle subject to spontaneous localizations in space. We prove global existence and uniqueness of solutions. We observe that there exist three time regimes: the collapse regime, the classical regime and the diffusive regime. Concerning the latter, we show that the general solution converges almost surely to a diffusing Gaussian wave function with finite spread in both position and momentum. This paper corrects and completes earlier works on this issue.
Abstract:
Data assimilation refers to the problem of finding trajectories of a prescribed dynamical model in such a way that the output of the model (usually some function of the model states) follows a given time series of observations. Typically, though, these two requirements cannot both be met at the same time: tracking the observations is not possible without the trajectory deviating from the proposed model equations, while adherence to the model requires deviations from the observations. Thus, data assimilation faces a trade-off. In this contribution, the sensitivity of the data assimilation with respect to perturbations in the observations is identified as the parameter which controls the trade-off. A relation between the sensitivity and the out-of-sample error is established, which allows the latter to be calculated under operational conditions. A minimum out-of-sample error is proposed as a criterion to set an appropriate sensitivity and to settle the discussed trade-off. Two approaches to data assimilation are considered, namely variational data assimilation and Newtonian nudging, also known as synchronization. Numerical examples demonstrate the feasibility of the approach.
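As an illustration of the nudging approach: a relaxation term proportional to the observation-minus-model mismatch is added to the model equations, and its coefficient controls how strongly the trajectory tracks the observations, i.e. the sensitivity in the trade-off above. A minimal Python sketch for the Lorenz-63 system, where the coupling constant k and the choice of observing only the x component are illustrative assumptions:

import numpy as np

def lorenz63(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def nudge(obs_x, dt=0.01, k=5.0):
    """Newtonian nudging: dx/dt = f(x) + k * (observation - model)."""
    state = np.array([1.0, 1.0, 1.0])   # arbitrary first guess
    traj = []
    for y_obs in obs_x:
        mismatch = np.array([y_obs - state[0], 0.0, 0.0])  # x observed only
        state = state + dt * (lorenz63(state) + k * mismatch)  # Euler step
        traj.append(state.copy())
    return np.array(traj)

# Synthetic truth run providing observations of its x component
truth, xs = np.array([5.0, 5.0, 25.0]), []
for _ in range(2000):
    truth = truth + 0.01 * lorenz63(truth)
    xs.append(truth[0])
assimilated = nudge(np.array(xs))  # trajectory synchronizes with the truth

A larger k tracks the observations more closely but pushes the trajectory further from a free-running solution of the model equations, which is exactly the trade-off discussed above.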
Abstract:
We wish to characterize when a Lévy process $X_t$ crosses boundaries $b(t)$, in a two-sided sense, for small times $t$, where $b(t)$ satisfies very mild conditions. An integral test is furnished for computing the value of $c = \limsup_{t \to 0} |X_t|/b(t)$. In some cases, we also specify a function $b(t)$ in terms of the Lévy triplet, such that $\limsup_{t \to 0} |X_t|/b(t) = 1$.
Abstract:
The detection of long-range dependence in time series analysis is an important task to which this paper contributes by showing that whilst the theoretical definition of a long-memory (or long-range dependent) process is based on the autocorrelation function, it is not possible for long memory to be identified using the sum of the sample autocorrelations, as usually defined. The reason for this is that the sample sum is a predetermined constant for any stationary time series, a result that is independent of the sample size. Diagnostic or estimation procedures, such as those in the frequency domain, that embed this sum are equally open to this criticism. We develop this result in the context of long memory, extending it to the implications for the spectral density function and the variance of partial sums of a stationary stochastic process. The results are further extended to higher-order sample autocorrelations and the bispectral density. The corresponding result is that the sum of the third-order sample (auto)bicorrelations at lags h, k ≥ 1 is also a predetermined constant, different from that in the second-order case, for any stationary time series of arbitrary length.
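As an illustration of the second-order result: for the usual sample autocorrelation taken about the sample mean, the sum over all lags equals -1/2 for any series whatsoever, because the deviations from the sample mean sum to zero. A quick numerical check in Python:

import numpy as np

def sample_acf_sum(x):
    """Sum of the sample autocorrelations rho_hat(h), h = 1..n-1."""
    n = len(x)
    d = x - x.mean()
    gamma0 = np.sum(d * d) / n
    gammas = [np.sum(d[: n - h] * d[h:]) / n for h in range(1, n)]
    return np.sum(gammas) / gamma0

rng = np.random.default_rng(0)
print(sample_acf_sum(rng.normal(size=500)))             # white noise: -0.5
print(sample_acf_sum(np.cumsum(rng.normal(size=500))))  # random walk: -0.5
print(sample_acf_sum(np.sin(np.arange(500) / 7.0)))     # deterministic: -0.5

The identity follows from expanding 0 = (sum of deviations)^2 = n*gamma_hat(0) + 2n*sum_h gamma_hat(h), so the sample sum cannot carry any information about long memory.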
Abstract:
We present the first climate prediction of the coming decade made with multiple models, initialized with prior observations. This prediction accrues from an international activity to exchange decadal predictions in near real-time, in order to assess differences and similarities, provide a consensus view to prevent over-confidence in forecasts from any single model, and establish current collective capability. We stress that the forecast is experimental, since the skill of the multi-model system is as yet unknown. Nevertheless, the forecast systems used here are based on models that have undergone rigorous evaluation and individually have been evaluated for forecast skill. Moreover, it is important to publish forecasts to enable open evaluation, and to provide a focus on climate change in the coming decade. Initialized forecasts of the year 2011 agree well with observations, with a pattern correlation of 0.62 compared to 0.31 for uninitialized projections. In particular, the forecast correctly predicted La Niña in the Pacific, and warm conditions in the north Atlantic and USA. A similar pattern is predicted for 2012 but with a weaker La Niña. Indices of Atlantic multi-decadal variability and Pacific decadal variability show no signal beyond climatology after 2015, while temperature in the Niño3 region is predicted to warm slightly by about 0.5 °C over the coming decade. However, uncertainties are large for individual years and initialization has little impact beyond the first 4 years in most regions. Relative to uninitialized forecasts, initialized forecasts are significantly warmer in the north Atlantic sub-polar gyre and cooler in the north Pacific throughout the decade. They are also significantly cooler in the global average and over most land and ocean regions out to several years ahead. However, in the absence of volcanic eruptions, global temperature is predicted to continue to rise, with each year from 2013 onwards having a 50 % chance of exceeding the current observed record. Verification of these forecasts will provide an important opportunity to test the performance of models and our understanding and knowledge of the drivers of climate change.
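As an illustration of the verification metric quoted above: a pattern correlation is a Pearson correlation computed across the grid points of the forecast and observed anomaly maps; area weighting by the cosine of latitude is the usual convention and is an assumption here, since the abstract does not specify the details. A minimal Python sketch with synthetic fields:

import numpy as np

def pattern_correlation(fcst, obs, lats):
    """Centered, area-weighted pattern correlation of two (nlat, nlon)
    anomaly fields, with weights proportional to cos(latitude)."""
    w = np.cos(np.deg2rad(lats))[:, None] * np.ones_like(fcst)
    w = w / w.sum()
    fa = fcst - (w * fcst).sum()   # remove area-weighted means
    oa = obs - (w * obs).sum()
    cov = (w * fa * oa).sum()
    return cov / np.sqrt((w * fa ** 2).sum() * (w * oa ** 2).sum())

lats = np.linspace(-87.5, 87.5, 36)
rng = np.random.default_rng(1)
obs = rng.normal(size=(36, 72))
fcst = 0.6 * obs + 0.8 * rng.normal(size=(36, 72))  # a partially skilful map
print(pattern_correlation(fcst, obs, lats))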
Abstract:
The Fourier series can be used to describe periodic phenomena such as the one-dimensional crystal wave function. By the trigonometric treatments in Hückel theory it is shown that Hückel theory is a special case of Fourier series theory. Thus, the conjugated π system is in fact a periodic system. This explains why such a simple theory as Hückel theory can be so powerful in organic chemistry: although it considers only the interactions between immediate neighbors, it implicitly takes account of the periodicity of the complete picture in which all the interactions are considered. Furthermore, the success of the trigonometric methods in Hückel theory is not accidental, as it is based on the fact that Hückel theory is a specific example of the more general method of Fourier series expansion. It is also important for educational purposes to expand a specific approach such as Hückel theory into a more general method such as Fourier series expansion.
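As an illustration of the trigonometric structure: for a linear chain of N conjugated carbons, the Hückel secular matrix is tridiagonal and its eigenvalues take the closed form E_k = alpha + 2*beta*cos(k*pi/(N+1)), k = 1..N, which is the Fourier-type result alluded to above. A quick check in Python (the values of alpha and beta are arbitrary illustrative choices):

import numpy as np

N, alpha, beta = 6, 0.0, -1.0   # e.g. hexatriene, energies in units of |beta|

# Hückel matrix: alpha on the diagonal, beta between bonded neighbours
H = alpha * np.eye(N) + beta * (np.eye(N, k=1) + np.eye(N, k=-1))

numeric = np.sort(np.linalg.eigvalsh(H))
trig = np.sort([alpha + 2 * beta * np.cos(k * np.pi / (N + 1))
                for k in range(1, N + 1)])
print(np.allclose(numeric, trig))   # True

For a cyclic system the matrix becomes circulant and the eigenvalues are alpha + 2*beta*cos(2*pi*k/N), i.e. exactly a discrete Fourier basis.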
Abstract:
Simulations of 15 coupled chemistry climate models, for the period 1960–2100, are presented. The models include a detailed stratosphere as well as a realistic representation of the tropospheric climate. The simulations assume a consistent set of changing greenhouse gas concentrations, as well as temporally varying chlorofluorocarbon concentrations in accordance with observations for the past and expectations for the future. The ozone results are analyzed using a nonparametric additive statistical model. Comparisons are made with observations for the recent past, and the recovery of ozone, indicated by a return to 1960 and 1980 values, is investigated as a function of latitude. Although chlorine amounts are simulated to return to 1980 values by about 2050, with only weak latitudinal variations, column ozone amounts recover at different rates due to the influence of greenhouse gas changes. In the tropics, simulated peak ozone amounts occur by about 2050 and the total ozone column declines thereafter. Consequently, simulated ozone does not recover to the values which existed prior to the early 1980s. The results also show a distinct hemispheric asymmetry, with recovery to 1980 values in the Northern Hemisphere extratropics ahead of the chlorine return by about 20 years. In the Southern Hemisphere midlatitudes, ozone is simulated to return to 1980 levels only 10 years ahead of chlorine. In the Antarctic, annually averaged ozone recovers at about the same rate as chlorine in high latitudes and hence does not return to 1960s values until the last decade of the simulations.
Abstract:
We develop a new sparse kernel density estimator using a forward constrained regression framework, within which the nonnegative and summing-to-unity constraints on the mixing weights can easily be satisfied. Our main contribution is to derive a recursive algorithm to select significant kernels one at a time, based on the minimum integrated square error (MISE) criterion, for both the selection of kernels and the estimation of mixing weights. The proposed approach is simple to implement and the associated computational cost is very low. Specifically, the complexity of our algorithm is on the order of the number of training data N, which is much lower than the order N² of the best existing sparse kernel density estimators. Numerical examples are employed to demonstrate that the proposed approach is effective in constructing sparse kernel density estimators with comparable accuracy to those of the classical Parzen window estimate and other existing sparse kernel density estimators.
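As an illustration of greedy kernel selection under an integrated-square-error criterion: because the overlap integral of two Gaussians has a closed form, the ISE between a sparse mixture and the full Parzen estimate can be evaluated cheaply, and kernels can be added one at a time. The Python sketch below keeps the mixing weights uniform for brevity, whereas the paper estimates them jointly by forward constrained regression, so this is a simplified stand-in rather than the authors' algorithm:

import numpy as np

def gauss(x, mu, var):
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

def greedy_sparse_kde(x, h, m):
    """Select m of the n candidate kernels N(.; x_i, h^2), one at a time,
    minimizing the ISE to the full Parzen estimate (uniform weights)."""
    n = len(x)
    # Overlap integrals: int N(.; x_i, h^2) N(.; x_j, h^2) dx = N(x_i; x_j, 2h^2)
    A = gauss(x[:, None], x[None, :], 2.0 * h * h)
    b = A.mean(axis=1)          # cross terms against the Parzen estimate
    chosen = []
    for _ in range(m):
        best, best_ise = None, np.inf
        for j in range(n):
            if j in chosen:
                continue
            s = chosen + [j]
            k = len(s)
            # ISE of the uniform mixture on s, up to an additive constant
            ise = A[np.ix_(s, s)].sum() / k ** 2 - 2.0 * b[s].sum() / k
            if ise < best_ise:
                best, best_ise = j, ise
        chosen.append(best)
    return x[chosen]

data = np.random.default_rng(2).normal(size=200)
centres = greedy_sparse_kde(data, h=0.3, m=8)   # 8 kernels instead of 200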
Abstract:
Retributivists believe that punishment can be deserved, and that deserved punishment is intrinsically good. They also believe that certain crimes deserve certain quantities of punishment. On the plausible assumption that the overall amount of any given punishment is a function of its severity and duration, we might think that retributivists (qua retributivists) would be indifferent as to whether a punishment were long and light or short and sharp, provided the offender gets the overall amount of punishment he deserves. In this paper I argue against this, showing that retributivists should actually prefer shorter and more severe punishments to longer, gentler options. I show this by developing a series of interpretations of the retributivist claim that not punishing the guilty is bad, focusing on the relationship between that badness and time. I then show that each interpretation leads to a preference for shorter over longer punishment.
Abstract:
The time-mean quasi-geostrophic potential vorticity equation of the atmospheric flow on isobaric surfaces can explicitly include an atmospheric (internal) forcing term of the stationary-eddy flow. In fact, neglecting some non-linear terms in this equation, this forcing can be mathematically expressed as a single function, called the Empirical Forcing Function (EFF), which is equal to the material derivative of the time-mean potential vorticity. Furthermore, the EFF can be decomposed into a sum of seven components, each one representing a forcing mechanism of a different nature. These mechanisms include diabatic components associated with the radiative forcing, latent heat release and frictional dissipation, and components related to transient eddy transports of heat and momentum. All these factors quantify the role of the transient eddies in forcing the atmospheric circulation. In order to assess the relevance of the EFF in diagnosing large-scale anomalies in the atmospheric circulation, the relationship between the EFF and the occurrence of strong ridges over the Eastern North Atlantic, which are often precursors of severe droughts over Western Iberia, is analyzed. For such events, the EFF pattern depicts a clear dipolar structure over the North Atlantic: cyclonic (anticyclonic) forcing of potential vorticity is found upstream (downstream) of the anomalously strong ridges. Results also show that the most significant components are related to the diabatic processes. Lastly, these results highlight the relevance of the EFF in diagnosing large-scale anomalies, also providing some insight into their interaction with different physical mechanisms.
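As a schematic of the balance referred to above (a generic sketch of standard time-mean quasi-geostrophic theory, not the paper's exact seven-term decomposition): writing the potential vorticity equation as $\partial_t q + \mathbf{v}\cdot\nabla q = S$, splitting each field into a time mean (overbar) and a transient deviation (prime), and averaging in time gives

$$\bar{\mathbf{v}} \cdot \nabla \bar{q} \;=\; -\,\nabla \cdot \overline{\mathbf{v}' q'} \;+\; \bar{S},$$

so the material derivative of the time-mean potential vorticity following the time-mean flow (the EFF, in the paper's terminology) is balanced by transient-eddy flux convergences together with the time-mean sources and sinks $S$ (radiation, latent heating, friction).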
Abstract:
We present case studies of the evolution of magnetic wave amplitudes and auroral intensity through the late growth phase and the expansion phase of the substorm cycle. We present strong evidence that substorm-related auroral enhancements are clearly and demonstrably linked to ULF wave amplitudes observed at the same location. In most cases, we find that the highest correlations are observed when the magnetometer time series is advanced in time, indicating that the ULF wave amplitudes start to grow before the measured auroral intensities, though interestingly this is not always the case. Further, we discuss four possible reasons that may explain both the timing and the high correlations between these two phenomena: a simple coincidence, an artifact of instrumental effects, the response of the ionosphere to magnetic waves and auroral particle precipitation, and finally that ULF waves and auroral particle precipitation are physically linked. We discount coincidence and instrumental effects since, in the studies presented here, they are unlikely or will in general contribute only negligible effects, and we find that the ionospheric response to waves and precipitation can explain some, but not all, of the results contained within this paper. Specifically, the ionospheric response to substorm waves and auroral precipitation cannot explain the result, shown in previous studies, that the onset of ULF wave activity and the onset of auroral particle precipitation occur at the same time and in the same location. This leaves the possibility that ULF waves and auroral particles are physically linked.
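As an illustration of the lead-lag analysis described above: the delay at which one series leads another is typically estimated by correlating the two series over a range of time shifts and locating the lag of maximum correlation. A minimal Python sketch with synthetic series, where the 30-sample lead is an arbitrary illustrative choice:

import numpy as np

def lagged_correlation(lead, follow, max_lag):
    """Correlation between lead[t] and follow[t + lag] for each lag; the
    maximizing positive lag estimates how far `lead` runs ahead."""
    n = min(len(lead), len(follow))
    lead, follow = np.asarray(lead)[:n], np.asarray(follow)[:n]
    lags = np.arange(-max_lag, max_lag + 1)
    corrs = []
    for lag in lags:
        if lag >= 0:
            x, y = lead[: n - lag], follow[lag:]
        else:
            x, y = lead[-lag:], follow[: n + lag]
        corrs.append(np.corrcoef(x, y)[0, 1])
    return lags, np.array(corrs)

rng = np.random.default_rng(3)
base = rng.normal(size=1230)
ulf = base[30:] + 0.3 * rng.normal(size=1200)      # ULF leads by 30 samples
aurora = base[:1200] + 0.3 * rng.normal(size=1200)
lags, r = lagged_correlation(ulf, aurora, 60)
print(lags[np.argmax(r)])                           # prints 30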
Abstract:
A new sparse kernel density estimator is introduced. Our main contribution is to develop a recursive algorithm for the selection of significant kernels one at a time, using the minimum integrated square error (MISE) criterion for both the selection of kernels and the estimation of the mixing weights. The proposed approach is simple to implement and the associated computational cost is very low. Numerical examples are employed to demonstrate that the proposed approach is effective in constructing sparse kernel density estimators with accuracy competitive with existing kernel density estimators.
Abstract:
Geomagnetic activity has long been known to exhibit approximately 27-day periodicity, resulting from solar wind structures repeating each solar rotation. Thus a very simple near-Earth solar wind forecast is 27-day persistence, wherein the near-Earth solar wind conditions today are assumed to be identical to those 27 days previously. Effective use of such a persistence model as a forecast tool, however, requires its performance and uncertainty to be fully characterized. The first half of this study determines which solar wind parameters can be reliably forecast by persistence and how the forecast skill varies with the solar cycle. The second half of the study shows how persistence can provide a useful benchmark for more sophisticated forecast schemes, namely physics-based numerical models. Point-by-point assessment methods, such as correlation and mean-square error, find persistence skill comparable to that of numerical models during solar minimum, despite the 27-day lead time of persistence forecasts, versus 2–5 days for numerical schemes. At solar maximum, however, the dynamic nature of the corona means 27-day persistence is no longer a good approximation, and skill scores suggest persistence is out-performed by numerical models for almost all solar wind parameters. But point-by-point assessment techniques are not always a reliable indicator of usefulness as a forecast tool. An event-based assessment method, which focuses on key solar wind structures, finds persistence to be the most valuable forecast throughout the solar cycle. This reiterates the fact that the means of assessing the “best” forecast model must be specifically tailored to its intended use.
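As an illustration of the benchmark: a 27-day persistence forecast simply replays the observations from one solar rotation earlier, and its point-by-point skill can be summarized by correlation and mean-square error. A minimal Python sketch with a synthetic daily series (the recurring signal and noise levels are illustrative assumptions):

import numpy as np

def persistence_forecast(series, period=27):
    """Pair each day's 27-day persistence forecast with the verifying value."""
    return series[:-period], series[period:]   # (forecast, verification)

def skill(forecast, truth):
    corr = np.corrcoef(forecast, truth)[0, 1]
    mse = np.mean((forecast - truth) ** 2)
    return corr, mse

# Synthetic daily solar wind speed with a recurrent 27-day structure
days = np.arange(1000)
rng = np.random.default_rng(4)
speed = 400 + 80 * np.sin(2 * np.pi * days / 27) + 30 * rng.normal(size=1000)

fcst, truth = persistence_forecast(speed)
print(skill(fcst, truth))   # good correlation despite the 27-day lead time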
Abstract:
The complex relationship between flavonoid-based nutrition and cardiovascular disease may be dissected by understanding the activities of these compounds in biological systems. The aim of the present study was to explore a hierarchy of importance of dietary flavonoids for cardiovascular health by examining the structural basis for the inhibitory effects of common dietary flavonoids (quercetin, apigenin, and naringenin) and the plasma metabolite tamarixetin. Understanding flavonoid effects on platelets in vivo can be informed by investigations of the ability of these compounds to attenuate the function of these cells. Inhibition of platelet function in whole blood and plasma was structure-dependent. The order of potency was apigenin > tamarixetin > quercetin = naringenin, indicating that the functional groups important in vivo are potentially a methylated B ring and a non-hydroxylated, planar C ring. Apigenin and tamarixetin, the methylated metabolite of quercetin, significantly reduced thrombus volume at concentrations (5 μM) which suggested that their reported physiological levels (0.1-1 μM) may exert low levels of inhibition. Flavonoid interactions with erythrocytes, leukocytes and human serum albumin in whole blood reduce their inhibitory activities against platelet function. The diminished inhibitory activity of flavonoids that we observed in whole blood and plasma indicated that these interactions do not abolish the attenuating effects of these compounds. Furthermore, inhibition of platelet aggregation by flavonoids was enhanced with increases in exposure time, indicating the potential for measurable inhibitory effects over resident plasma times. We conclude that flavonoid structure may be a major influence on their activities in vivo, with methylated metabolites and flavones being more potent than flavonols and flavanones.