852 results for H-Infinity Time-Varying Adaptive Algorithm
Abstract:
We present observations of a transient event in the dayside auroral ionosphere at magnetic noon. F-region plasma convection measurements were made by the EISCAT radar, operating in the beamswinging “Polar” experiment mode, and simultaneous observations of the dayside auroral emissions were made by optical meridian-scanning photometers and all-sky TV cameras at Ny Ålesund, Spitzbergen. The data were recorded on 9 January 1989, and a sequence of bursts of flow, with associated transient aurorae, was observed between 08:45 and 11:00 U.T. In this paper we concentrate on an event around 09:05 U.T. because it is very close to local magnetic noon. The optical data show a transient intensification and widening (in latitude) of the cusp/cleft region, as seen in red line auroral emissions. Over an interval of about 10 min, the band of 630 nm aurora widened from about 1.5° of invariant latitude to over 5° and then returned to its original width. Embedded within the widening band of 630 nm emissions were two intense, active 557.7 nm arc fragments with rays, each of which persisted for about 2 min. The flow data before and after the optical transient show eastward flows, with speeds increasing markedly with latitude across the band of 630 nm aurora. Strong, apparently westward, flows appeared inside the band while it was widening, but these rotated round to eastward, through northward, as the band shrank to its original width. The observed ion temperatures verify that the flow speeds during the transient were, to a large extent, as derived using the beamswinging technique; but they also show that the flow increase initially occurred in the western azimuth only. This spatial gradient in the flow introduces ambiguity in the direction of these initial flows, which could have been north-eastward rather than westward.
However, the westward direction derived by the beamswinging is consistent with the motion of the colocated and coincident active 557.7 nm arc fragment. A more stable transient 557.7 nm aurora was found close to the shear between the inferred westward flows and the persisting eastward flows to the north. Throughout the transient, northward flow was observed across the equatorward boundary of the 630 nm aurora. Interpretation of the data is made difficult by the lack of IMF data, problems in distinguishing the cusp and cleft aurora, and uncertainty over which field lines are open and which are closed. However, at magnetic noon there is a 50% probability that we were observing the cusp, in which case from its southerly location we infer that the IMF was southward, and many features are suggestive of time-varying reconnection at a single X-line on the dayside magnetopause. This IMF orientation is also consistent with the polar rain precipitation observed simultaneously by the DMSP-F9 satellite in the southern polar cap. There is also a 25% chance that we were observing the cleft (or the mantle poleward of the cleft). In this case we infer that the IMF was northward and the transient is well explained by reconnection which is not only transient in time but occurs at various sites located randomly on the dayside magnetopause (i.e. patchy in space). Lastly, there is a 25% chance that we were observing the cusp poleward of the cleft, in which case we infer that IMF Bz was near zero and the transient is explained by a mixture of the previous two interpretations.
Abstract:
Implicit dynamic-algebraic equations, known in control theory as descriptor systems, arise naturally in many applications. Such systems may not be regular (often referred to as singular). In that case the equations may not have unique solutions for consistent initial conditions and arbitrary inputs, and the system may not be controllable or observable. Many control systems can be regularized by proportional and/or derivative feedback. We present an overview of mathematical theory and numerical techniques for regularizing descriptor systems using feedback controls. The aim is to provide stable numerical techniques for analyzing and constructing regular control and state estimation systems and for ensuring that these systems are robust. State and output feedback designs for regularizing linear time-invariant systems are described, including methods for disturbance decoupling and mixed output problems. Extensions of these techniques to time-varying linear and nonlinear systems are discussed in the final section.
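A minimal numerical sketch of the regularization idea (not the survey's algorithms; the matrices E, A, B and the gain G below are invented for illustration): a descriptor system E x′ = A x + B u with a singular pencil can be made regular by derivative feedback u = −G x′, which replaces E with E + BG.

```python
import numpy as np

# Toy descriptor system E x' = A x + B u with a non-regular pencil;
# E, A, B and the feedback gain G are invented for this example.
E = np.array([[1.0, 0.0], [0.0, 0.0]])
A = np.array([[1.0, 0.0], [0.0, 0.0]])
B = np.array([[0.0], [1.0]])

def is_regular(E, A, trials=5, tol=1e-9):
    # The pencil (lam*E - A) is regular iff det(lam*E - A) != 0 for some lam;
    # numerically, testing a few random lam values suffices.
    rng = np.random.default_rng(0)
    return any(abs(np.linalg.det(lam * E - A)) > tol
               for lam in rng.normal(size=trials))

# Derivative feedback u = -G x' turns E x' = A x + B u into (E + B G) x' = A x.
G = np.array([[0.0, 1.0]])
E_closed = E + B @ G
```

Here det(λE − A) vanishes identically before feedback, while E + BG is nonsingular, so the closed-loop system is regular with a well-defined solution for any consistent initial condition.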
Abstract:
The Clouds, Aerosol, and Precipitation in the Marine Boundary Layer (CAP-MBL) deployment at Graciosa Island in the Azores generated a 21-month (April 2009–December 2010) comprehensive dataset documenting clouds, aerosols, and precipitation using the Atmospheric Radiation Measurement Program (ARM) Mobile Facility (AMF). The scientific aim of the deployment is to gain improved understanding of the interactions of clouds, aerosols, and precipitation in the marine boundary layer. Graciosa Island straddles the boundary between the subtropics and midlatitudes in the northeast Atlantic Ocean and consequently experiences a great diversity of meteorological and cloudiness conditions. Low clouds are the dominant cloud type, with stratocumulus and cumulus occurring regularly. Approximately half of all clouds contained precipitation detectable as radar echoes below the cloud base. Radar and satellite observations show that clouds with tops from 1 to 11 km contribute more or less equally to surface-measured precipitation at Graciosa. A wide range of aerosol conditions was sampled during the deployment consistent with the diversity of sources as indicated by back-trajectory analysis. Preliminary findings suggest important two-way interactions between aerosols and clouds at Graciosa, with aerosols affecting light precipitation and cloud radiative properties while being controlled in part by precipitation scavenging. The data from Graciosa are being compared with short-range forecasts made with a variety of models. A pilot analysis with two climate and two weather forecast models shows that they reproduce the observed time-varying vertical structure of lower-tropospheric cloud fairly well but the cloud-nucleating aerosol concentrations less well. The Graciosa site has been chosen to be a permanent fixed ARM site that became operational in October 2013.
Abstract:
Simultaneous all angle collocations (SAACs) of microwave humidity sounders (AMSU-B and MHS) on-board polar orbiting satellites are used to estimate scan-dependent biases. This method has distinct advantages over previous methods: the estimated scan-dependent biases are not influenced by diurnal differences between the edges of the scan, and the biases can be estimated for both sides of the scan. We find the results are robust in the sense that biases estimated for one satellite pair can be reproduced by double differencing the biases of these satellites with a third satellite. Channel 1 of these instruments shows the least bias for all satellites. Channel 2 has biases greater than 5 K, which thus need to be corrected. Channel 3 has biases of about 2 K and more, and they are time varying for some of the satellites. Channel 4 has the largest bias, which is about 15 K when the data are averaged over 5 years, but biases of individual months can be as large as 30 K. Channel 5 also has large and time varying biases for two of the AMSU-Bs. NOAA-15 (N15) channels are found to be affected the most, mainly due to radio frequency interference (RFI) from onboard data transmitters. Channel 4 of N15 shows the largest and time varying biases, so data of this channel should only be used with caution for climate applications. The two MHS instruments show the best agreement for all channels. Our estimates may be used to correct for scan-dependent biases of these instruments, or at least used as a guideline for excluding channels with large scan asymmetries from scientific analyses.
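The double-differencing consistency check described above can be sketched numerically. In this toy simulation the satellite labels and bias values (in K) are invented, not the estimates from the study; the point is only that a bias estimated directly for one pair matches the same bias recovered via a third satellite.

```python
import numpy as np

# Invented inter-satellite biases (K) for three MHS-class instruments.
rng = np.random.default_rng(7)
bias = {"N18": 0.3, "N19": -0.2, "MetOpA": 0.1}

def pair_diff(k1, k2, n=20000):
    # One independent sample of collocated scene brightness temperatures
    # per satellite pair, each observed with its own bias and noise.
    scenes = rng.normal(250.0, 5.0, size=n)
    o1 = scenes + bias[k1] + rng.normal(0.0, 0.5, size=n)
    o2 = scenes + bias[k2] + rng.normal(0.0, 0.5, size=n)
    return float(np.mean(o1 - o2))

direct = pair_diff("N18", "N19")  # direct estimate of the N18-N19 bias
# Same bias via double differencing through a third satellite:
double_diff = pair_diff("N18", "MetOpA") - pair_diff("N19", "MetOpA")
```

Because the common scene temperature cancels in each difference, both estimates converge to the true inter-satellite bias (0.5 K here) as the collocation sample grows.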
Abstract:
Initializing the ocean for decadal predictability studies is a challenge, as it requires reconstructing the little observed subsurface trajectory of ocean variability. In this study we explore to what extent surface nudging using well-observed sea surface temperature (SST) can reconstruct the deeper ocean variations for the 1949–2005 period. An ensemble is made with a nudged version of the IPSLCM5A model and compared to ocean reanalyses and reconstructed datasets. The SST is restored to observations using a physically-based relaxation coefficient, in contrast to earlier studies, which use a much larger value. The assessment is restricted to the regions where the ocean reanalyses agree, i.e. in the upper 500 m of the ocean, although this can be latitude and basin dependent. Significant reconstruction of the subsurface is achieved in specific regions, namely the subduction region of the subtropical Atlantic, below the thermocline in the equatorial Pacific and, in some cases, in the North Atlantic deep convection regions. Beyond the mean correlations, ocean integrals are used to explore the time evolution of the correlation over 20-year windows. Classical fixed-depth heat content diagnostics do not exhibit any significant reconstruction between the different existing observation-based references and can therefore not be used to assess global average time-varying correlations in the nudged simulations. Using the physically based average temperature above an isotherm (14 °C) alleviates this issue in the tropics and subtropics and shows significant reconstruction of these quantities in the nudged simulations for several decades. This skill is attributed to the wind stress reconstruction in the tropics, as already demonstrated in a perfect model study using the same model. Thus, we also show here the robustness of this result in a historical and observational context.
Abstract:
The destructive environmental and socio-economic impacts of the El Niño/Southern Oscillation (ENSO) [1,2] demand an improved understanding of how ENSO will change under future greenhouse warming. Robust projected changes in certain aspects of ENSO have been recently established [3-5]. However, there is as yet no consensus on the change in the magnitude of the associated sea surface temperature (SST) variability [6-8], commonly used to represent ENSO amplitude [1,6], despite its strong effects on marine ecosystems and rainfall worldwide [1-4,9]. Here we show that the response of ENSO SST amplitude is time-varying, with an increasing trend in ENSO amplitude before 2040, followed by a decreasing trend thereafter. We attribute the previous lack of consensus to an expectation that the trend in ENSO amplitude over the entire twenty-first century is unidirectional, and to unrealistic model dynamics of tropical Pacific SST variability. We examine these complex processes across 22 models in the Coupled Model Intercomparison Project phase 5 (CMIP5) database [10], forced under historical and greenhouse warming conditions. The nine most realistic models identified show a strong consensus on the time-varying response and reveal that the non-unidirectional behaviour is linked to a longitudinal difference in the surface warming rate across the Indo-Pacific basin. Our results carry important implications for climate projections and climate adaptation pathways.
Abstract:
In this work, we prove a weak Noether-type Theorem for a class of variational problems that admit broken extremals. We use this result to prove discrete Noether-type conservation laws for a conforming finite element discretisation of a model elliptic problem. In addition, we study how well the finite element scheme satisfies the continuous conservation laws arising from the application of Noether’s first theorem (1918). We summarise extensive numerical tests, illustrating the conservation of the discrete Noether law using the p-Laplacian as an example and derive a geometric-based adaptive algorithm where an appropriate Noether quantity is the goal functional.
Abstract:
In order to accelerate computing the convex hull on a set of n points, a heuristic procedure is often applied to reduce the number of points to a set of s points, s ≤ n, which also contains the same hull. We present an algorithm to precondition 2D data with integer coordinates bounded by a box of size p × q before building a 2D convex hull, with three distinct advantages. First, we prove that under the condition min(p, q) ≤ n the algorithm executes in time within O(n); second, no explicit sorting of data is required; and third, the reduced set of s points forms a simple polygonal chain and thus can be directly pipelined into an O(n) time convex hull algorithm. This paper empirically evaluates and quantifies the speedup gained by preconditioning a set of points by a method based on the proposed algorithm before using common convex hull algorithms to build the final hull. A speedup factor of at least four is consistently found from experiments on various datasets when the condition min(p, q) ≤ n holds; the smaller the ratio min(p, q)/n is in the dataset, the greater the speedup factor achieved.
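A minimal sketch of the column-filter idea (our own simplified variant, not the paper's exact algorithm): for integer points in a p × q box, only the minimum-y and maximum-y point of each x-column can be a hull vertex, so a single O(n) bucket pass, with no explicit sorting, reduces the input to a simple polygonal chain that can be fed to a standard hull routine.

```python
def precondition(points, p):
    # One O(n) pass over the input; buckets indexed by the integer x
    # coordinate replace explicit sorting.
    lo = [None] * p
    hi = [None] * p
    for x, y in points:
        if lo[x] is None or y < lo[x]:
            lo[x] = y
        if hi[x] is None or y > hi[x]:
            hi[x] = y
    # Walk the min-y points left-to-right, then the max-y points
    # right-to-left: a simple polygonal chain containing every hull vertex.
    chain = [(x, lo[x]) for x in range(p) if lo[x] is not None]
    chain += [(x, hi[x]) for x in reversed(range(p))
              if hi[x] is not None and hi[x] != lo[x]]
    return chain

def cross(o, a, b):
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def convex_hull(pts):
    """Andrew's monotone chain on the (small) reduced set."""
    pts = sorted(set(pts))
    if len(pts) <= 2:
        return pts
    lower, upper = [], []
    for seq, out in ((pts, lower), (reversed(pts), upper)):
        for pt in seq:
            while len(out) >= 2 and cross(out[-2], out[-1], pt) <= 0:
                out.pop()
            out.append(pt)
    return lower[:-1] + upper[:-1]

points = [(0, 0), (4, 0), (4, 4), (0, 4), (2, 2), (1, 3), (3, 1)]
hull = convex_hull(precondition(points, p=5))
```

The filter keeps at most 2p points, so when min(p, q) ≤ n the reduction pays for itself before the final hull construction.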
Abstract:
This work is an assessment of the frequency of extreme values (EVs) of daily rainfall in the city of São Paulo, Brazil, over the period 1933-2005, based on the peaks-over-threshold (POT) and Generalized Pareto Distribution (GPD) approach. Usually, a GPD model is fitted to a sample of POT values selected with a constant threshold. However, in this work we use time-dependent thresholds, composed of relatively large p-quantiles (for example p = 0.97) of daily rainfall amounts computed from all available data. Samples of POT values were extracted with several values of p. Four different GPD models (GPD-1, GPD-2, GPD-3, and GPD-4) were fitted to each one of these samples by the maximum likelihood (ML) method. The shape parameter was assumed constant for the four models, but time-varying covariates were incorporated into the scale parameter of GPD-2, GPD-3, and GPD-4, describing an annual cycle in GPD-2, a linear trend in GPD-3, and both annual cycle and linear trend in GPD-4. The GPD-1 with constant scale and shape parameters is the simplest model. For identification of the best model among the four we used the rescaled Akaike Information Criterion (AIC) with second-order bias correction. This criterion isolates GPD-3 as the best model, i.e. the one with a positive linear trend in the scale parameter. The slope of this trend is significant against the null hypothesis of no trend, at about the 98% confidence level. The non-parametric Mann-Kendall test also showed the presence of a positive trend in the annual frequency of excesses over high thresholds, with the p-value being virtually zero. Therefore, there is strong evidence that high quantiles of daily rainfall in the city of São Paulo have been increasing in magnitude and frequency over time. For example, the 0.99 quantile of daily rainfall amount increased by about 40 mm between 1933 and 2005. Copyright (C) 2008 Royal Meteorological Society
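The model-selection step above can be sketched with synthetic data. This is not the study's code: the sample size, shape parameter and trend magnitude are invented, and the plain AIC is used rather than the rescaled, bias-corrected variant; the sketch only shows how a log-linear trend in the GPD scale is fitted by maximum likelihood and compared against a constant-scale model.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import genpareto

# Synthetic POT excesses whose scale grows over time (illustrative values).
rng = np.random.default_rng(42)
n = 2000
t = np.arange(n) / n                       # time covariate scaled to [0, 1)
y = genpareto.rvs(c=0.1, scale=np.exp(0.5 * t), random_state=rng)

def nll(params, y, t, trend):
    """Negative log-likelihood of a GPD with log-linear scale sigma_t."""
    xi, a = params[0], params[1]
    b = params[2] if trend else 0.0
    sigma = np.exp(a + b * t)
    z = 1.0 + xi * y / sigma
    if np.any(z <= 0) or abs(xi) < 1e-8:
        return np.inf                      # outside the support / degenerate
    return np.sum(np.log(sigma) + (1.0 + 1.0 / xi) * np.log(z))

# Constant-scale model (2 params) vs linear-trend-in-scale model (3 params).
res_c = minimize(nll, x0=[0.1, 0.0], args=(y, t, False), method="Nelder-Mead")
res_t = minimize(nll, x0=[0.1, 0.0, 0.0], args=(y, t, True), method="Nelder-Mead")
aic_c = 2 * 2 + 2 * res_c.fun
aic_t = 2 * 3 + 2 * res_t.fun              # lower AIC wins
```

On this trending sample the three-parameter model earns back its extra parameter, mirroring the paper's selection of GPD-3 over GPD-1.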
Abstract:
We study opinion dynamics in a population of interacting adaptive agents voting on a set of issues represented by vectors. We consider agents who can classify issues into one of two categories and can arrive at their opinions using an adaptive algorithm. Adaptation comes from learning and the information for the learning process comes from interacting with other neighboring agents and trying to change the internal state in order to concur with their opinions. The change in the internal state is driven by the information contained in the issue and in the opinion of the other agent. We present results in a simple yet rich context where each agent uses a Boolean perceptron to state their opinion. If the update occurs with information asynchronously exchanged among pairs of agents, then the typical case, if the number of issues is kept small, is the evolution into a society torn by the emergence of factions with extreme opposite beliefs. This occurs even when seeking consensus with agents with opposite opinions. If the number of issues is large, the dynamics becomes trapped, the society does not evolve into factions and a distribution of moderate opinions is observed. The synchronous case is technically simpler and is studied by formulating the problem in terms of differential equations that describe the evolution of order parameters that measure the consensus between pairs of agents. We show that for a large number of issues and unidirectional information flow, global consensus is a fixed point; however, the approach to this consensus is glassy for large societies.
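The asynchronous dynamics described above can be sketched as follows. The agent count, issue dimension, learning rate and number of issues are invented for illustration; the update is the standard perceptron rule applied toward a neighbour's opinion, as the abstract describes.

```python
import numpy as np

rng = np.random.default_rng(0)
n_agents, dim, n_issues = 20, 10, 3
issues = rng.normal(size=(n_issues, dim))   # issues represented as vectors
W = rng.normal(size=(n_agents, dim))        # one perceptron per agent

def opinion(w, x):
    return np.sign(w @ x)                   # Boolean (+/-1) classification

eta = 0.05                                  # learning rate (invented)
for _ in range(5000):
    i, j = rng.choice(n_agents, size=2, replace=False)
    x = issues[rng.integers(n_issues)]
    if opinion(W[i], x) != opinion(W[j], x):
        # Agent i adjusts its internal state toward j's opinion; the move
        # depends on both the issue vector and the other agent's opinion.
        W[i] += eta * opinion(W[j], x) * x

# Fraction of agent pairs that agree across the issue set.
agree = np.mean([opinion(W[a], x) == opinion(W[b], x)
                 for a in range(n_agents) for b in range(a + 1, n_agents)
                 for x in issues])
```

With few issues, runs of this kind tend to split the population into clusters of aligned weight vectors, which is the faction-formation regime the abstract reports; with many issues the dynamics gets trapped at moderate opinions.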
Abstract:
The problem of scheduling a parallel program, represented by a weighted directed acyclic graph (DAG), onto a set of homogeneous processors so as to minimize the completion time of the program has been extensively studied as an academic optimization problem arising when optimizing the execution time of parallel algorithms on parallel computers. In this paper, we propose an application of Ant Colony Optimization (ACO) to a multiprocessor scheduling problem (MPSP). In the MPSP, no preemption is allowed and each operation demands a setup time on the machines. The problem seeks a schedule that minimizes the total completion time. Since exact solution methods are not feasible for most problems of this kind, we rely on heuristics to find solutions. In this novel heuristic approach to multiprocessor scheduling, based on the ACO algorithm, a collection of agents cooperates to effectively explore the search space. A computational experiment is conducted on a suite of benchmark applications. Comparing our results to those of previous heuristic algorithms shows that the ACO algorithm exhibits competitive performance with a small error ratio.
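A minimal sketch of an ACO-style DAG scheduler, not the paper's exact formulation: the instance (8 tasks, 2 processors, a random DAG, a unit setup time) is invented, and a single per-task pheromone trail replaces the richer trail structures typically used. Each ant builds a precedence-feasible task order biased by pheromone, tasks go to the earliest-free processor, and the best schedule found reinforces the trail.

```python
import numpy as np

rng = np.random.default_rng(1)
n_tasks, n_procs, setup = 8, 2, 1
duration = rng.integers(1, 5, size=n_tasks)
# Random DAG: edges only from lower to higher task indices.
prec = {j: [i for i in range(j) if rng.random() < 0.3] for j in range(n_tasks)}
tau = np.ones(n_tasks)                             # pheromone per task

def build_schedule(tau):
    done, finish, order = set(), {}, []
    proc_free = [0] * n_procs
    while len(done) < n_tasks:
        ready = [t for t in range(n_tasks)
                 if t not in done and all(p in done for p in prec[t])]
        w = tau[ready]
        t = int(rng.choice(ready, p=w / w.sum()))  # pheromone-biased pick
        est = max((finish[p] for p in prec[t]), default=0)
        k = int(np.argmin(proc_free))              # earliest-free processor
        start = max(est, proc_free[k]) + setup     # setup before each operation
        finish[t] = start + int(duration[t])
        proc_free[k] = finish[t]
        done.add(t)
        order.append(t)
    return order, max(finish.values())

best = None
for _ in range(30):                                # iterations
    for _ in range(10):                            # ants per iteration
        order, makespan = build_schedule(tau)
        if best is None or makespan < best[1]:
            best = (order, makespan)
    tau *= 0.9                                     # pheromone evaporation
    for rank, t in enumerate(best[0]):
        tau[t] += 1.0 / (1 + rank)                 # reinforce the best ordering
```

The evaporation/reinforcement loop is what distinguishes this from plain randomized list scheduling: over iterations the colony concentrates on task orderings that produced short makespans.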
Abstract:
In this paper, we test a version of the conditional CAPM with respect to a local market portfolio, proxied by the Brazilian stock index, during the period 1976-1992. We also test a conditional APT model by using the difference between the 3-day rate (Cdb) and the overnight rate as a second factor in addition to the market portfolio in order to capture the large inflation risk present during this period. The conditional CAPM and APT models are estimated by the Generalized Method of Moments (GMM) and tested on a set of size portfolios created from individual securities traded on the Brazilian markets. The inclusion of this second factor proves to be important for the appropriate pricing of the portfolios.
Abstract:
Parametric term structure models have been successfully applied to numerous problems in fixed income markets, including pricing, hedging, managing risk, as well as studying monetary policy implications. In turn, dynamic term structure models, equipped with stronger economic structure, have been mainly adopted to price derivatives and explain empirical stylized facts. In this paper, we combine flavors of those two classes of models to test whether no-arbitrage affects forecasting. We construct cross-section (allowing arbitrages) and arbitrage-free versions of a parametric polynomial model to analyze how well they predict out-of-sample interest rates. Based on U.S. Treasury yield data, we find that no-arbitrage restrictions significantly improve forecasts. Arbitrage-free versions achieve overall smaller biases and Root Mean Square Errors for most maturities and forecasting horizons. Furthermore, a decomposition of forecasts into forward rates and holding return premia indicates that the superior performance of the no-arbitrage versions is due to a better identification of the bond risk premium.
Abstract:
In this paper, we show that widely used stationarity tests such as the KPSS test have power close to size in the presence of time-varying unconditional variance. We propose a new test as a complement to the existing tests. Monte Carlo experiments show that the proposed test possesses the following characteristics: (i) in the presence of a unit root or a structural change in the mean, the proposed test is as powerful as the KPSS and other tests; (ii) in the presence of a changing variance, the traditional tests perform badly whereas the proposed test has high power compared to the existing tests; (iii) the proposed test has the same size as traditional stationarity tests under the null hypothesis of stationarity. An application to daily observations of the return on the US Dollar/Euro exchange rate reveals the existence of instability in the unconditional variance when the entire sample is considered, but stability is found in subsamples.
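The setting can be illustrated with a minimal numpy implementation of the KPSS level statistic (a sketch of the classical test, not the authors' proposed complement; sample sizes and variance levels are invented). A mean-stationary series with a mid-sample variance break is generated alongside i.i.d. noise and a random walk: the KPSS statistic separates the random walk from the others, while a pure variance change is the case the paper identifies as problematic for such tests.

```python
import numpy as np

def kpss_stat(x, lags=None):
    """KPSS level-stationarity statistic (minimal sketch)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    e = x - x.mean()                      # residuals from a constant mean
    s = np.cumsum(e)                      # partial sums
    if lags is None:
        lags = int(4 * (n / 100.0) ** 0.25)
    lrv = e @ e / n                       # Newey-West long-run variance
    for k in range(1, lags + 1):
        w = 1.0 - k / (lags + 1.0)        # Bartlett weights
        lrv += 2.0 * w * (e[k:] @ e[:-k]) / n
    return (s @ s) / (n ** 2 * lrv)

rng = np.random.default_rng(0)
n = 1000
stationary = rng.normal(size=n)
var_break = np.concatenate([rng.normal(scale=1.0, size=n // 2),
                            rng.normal(scale=4.0, size=n // 2)])
unit_root = np.cumsum(rng.normal(size=n))

print(kpss_stat(stationary), kpss_stat(var_break), kpss_stat(unit_root))
```

The statistic for the unit-root series far exceeds the 5% critical value (about 0.463 for the level case), whereas the variance-break series, despite being non-stationary in variance, does not stand out in the same way.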
Abstract:
This article develops an econometric model in order to study country risk behavior for six emerging economies (Argentina, Mexico, Russia, Thailand, Korea and Indonesia), by expanding the Country Beta Risk Model of Harvey and Zhou (1993), Erb et al. (1996a, 1996b) and Gangemi et al. (2000). Toward this end, we have analyzed the impact of macroeconomic variables, especially monetary policy, upon country risk, by way of a time-varying parameter approach. The results indicate an inefficient and unstable effect of monetary policy upon country risk in periods of crisis. However, this effect is stable in other periods, and the Favero-Giavazzi effect is not verified for all economies, with an opposite effect being observed in many cases.
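A time-varying parameter regression of the country-beta kind is commonly estimated with a Kalman filter. The scalar sketch below is illustrative only (the factor series, noise levels and variances are invented, and the real model has several macroeconomic regressors): the country return loads on a factor through a beta that follows a random walk, and the filter tracks that beta through time.

```python
import numpy as np

# Model: y_t = beta_t * x_t + e_t,  beta_t = beta_{t-1} + w_t
rng = np.random.default_rng(3)
T = 300
x = rng.normal(size=T)                             # observed risk factor
beta_true = 1.0 + np.cumsum(rng.normal(scale=0.05, size=T))
y = beta_true * x + rng.normal(scale=0.5, size=T)  # country returns

q, r = 0.05 ** 2, 0.5 ** 2    # state and observation noise variances
b, P = 0.0, 10.0              # diffuse-ish prior on beta
est = []
for t in range(T):
    P += q                    # predict: beta drifts as a random walk
    S = x[t] ** 2 * P + r     # innovation variance
    K = P * x[t] / S          # Kalman gain
    b += K * (y[t] - x[t] * b)  # update with the observed return
    P *= 1.0 - K * x[t]
    est.append(b)
est = np.array(est)
```

Instability of the kind reported in crisis periods would show up here as the filtered beta path drifting sharply away from its tranquil-period level.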