55 results for Time-invariant Wavelet Analysis
Abstract:
Some necessary and sufficient conditions for closed-loop eigenstructure assignment by output feedback in time-invariant linear multivariable control systems are presented. A simple condition on a square matrix necessary and sufficient for it to be the closed-loop plant matrix of a given system with some output feedback is the basis of the paper. Some known results on entire eigenstructure assignment are deduced from this. The concept of an inner inverse of a matrix is employed to obtain a condition concerning the assignment of an eigenstructure consisting of the eigenvalues and a mixture of left and right eigenvectors.
Abstract:
The problem of robust pole assignment by feedback in a linear, multivariable, time-invariant system which is subject to structured perturbations is investigated. A measure of robustness, or sensitivity, of the poles to a given class of perturbations is derived, and a reliable and efficient computational algorithm is presented for constructing a feedback which assigns the prescribed poles and optimizes the robustness measure.
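The abstract does not spell out the robustness measure; one widely used proxy for pole sensitivity is the conditioning of the closed-loop eigenvector matrix (well-conditioned eigenvectors mean the assigned poles move little under perturbations). A minimal sketch with made-up matrices, not drawn from the paper:

```python
import numpy as np

def pole_sensitivity(a_closed):
    """Condition number of the eigenvector matrix of a closed-loop plant
    matrix: the larger it is, the more the assigned poles move under
    perturbations of the matrix entries."""
    _, v = np.linalg.eig(a_closed)
    return np.linalg.cond(v)

# Same poles {1, 2}, different eigenvector geometry (illustrative matrices).
well_conditioned = np.diag([1.0, 2.0])      # orthogonal eigenvectors
ill_conditioned = np.array([[1.0, 100.0],
                            [0.0, 2.0]])    # nearly parallel eigenvectors
```

A robust pole-assignment algorithm of the kind described searches over the feedbacks that place the prescribed poles for one that keeps a measure like this small.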
Abstract:
With the introduction of new observing systems based on asynoptic observations, the analysis problem has changed in character. In the near future we may expect that a considerable part of meteorological observations will be unevenly distributed in four dimensions, i.e. three dimensions in space and one in time. The term analysis, or objective analysis in meteorology, means the process of interpolating meteorological observations from unevenly distributed locations to a network of regularly spaced grid points. Because numerical weather prediction models must solve the governing finite-difference equations on such a grid lattice, objective analysis is a three-dimensional (or, more often, two-dimensional) interpolation technique. As a consequence of the structure of the conventional synoptic network, with separated data-sparse and data-dense areas, four-dimensional analysis has in fact been used intensively for many years. Weather services have thus based their analyses not only on synoptic data at the time of the analysis and on climatology, but also on fields predicted from the previous observation hour and valid at the analysis time. The inclusion of the time dimension in objective analysis will be called four-dimensional data assimilation. From one point of view it seems possible to apply the conventional technique to the new data sources by simply reducing the time interval in the analysis-forecasting cycle. This could in fact be justified for the conventional observations as well: we have fairly good coverage of surface observations 8 times a day, and several upper-air stations make radiosonde and radiowind observations 4 times a day. With a 3-hour step in the analysis-forecasting cycle, instead of the more commonly applied 12 hours, we may without any difficulty treat all observations as synoptic. No observation would then be more than 90 minutes off time, and even observations during strongly transient motion would fall within a horizontal mesh of 500 km × 500 km.
Abstract:
Using previously published data from the whisker barrel cortex of anesthetized rodents (Berwick et al 2008 J. Neurophysiol. 99 787–98) we investigated whether highly spatially localized stimulus-evoked cortical hemodynamic responses displayed a linear time-invariant (LTI) relationship with neural activity. Presentation of stimuli to individual whiskers for durations of 2 s and 16 s produced hemodynamic and neural responses spatially localized to individual cortical columns. Two-dimensional optical imaging spectroscopy (2D-OIS) measured hemoglobin responses, while multi-laminar electrophysiology recorded neural activity. Hemoglobin responses to 2 s stimuli were deconvolved with the underlying evoked neural activity to estimate impulse response functions, which were then convolved with the neural activity evoked by 16 s stimuli to generate predictions of the hemodynamic responses. An LTI system described the temporal neuro-hemodynamic coupling relationship more adequately for these spatially localized sensory stimuli than in previous studies that activated the entire whisker cortex. An inability to predict the magnitude of an initial 'peak' in the total and oxy-hemoglobin responses was alleviated when excluding responses influenced by overlying arterial components. However, this did not improve estimation of the hemodynamic responses' return to baseline after stimulus cessation.
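The deconvolve-then-convolve pipeline the abstract describes can be sketched generically: estimate an impulse response by least-squares deconvolution of one response against its evoked neural trace, then convolve that estimate with a different neural trace to predict the longer response. The kernel and signals below are synthetic placeholders, not the study's data:

```python
import numpy as np

def estimate_irf(neural, hemo, irf_len):
    """Least-squares deconvolution: find h with (neural convolved with h)
    ~= hemo, by building the convolution (Toeplitz) matrix column by column."""
    n = len(hemo)
    X = np.zeros((n, irf_len))
    for i in range(irf_len):
        X[i:, i] = neural[:n - i]
    h, *_ = np.linalg.lstsq(X, hemo, rcond=None)
    return h

# Synthetic stand-ins: a made-up impulse response, a short ('2 s')
# neural trace and a longer ('16 s') one.
true_irf = np.array([0.0, 0.5, 1.0, 0.5, 0.1])
neural_short = np.random.default_rng(0).random(50)
hemo_short = np.convolve(neural_short, true_irf)[:50]

irf = estimate_irf(neural_short, hemo_short, len(true_irf))  # recovers true_irf

neural_long = np.random.default_rng(1).random(200)
predicted_hemo = np.convolve(neural_long, irf)[:200]         # LTI prediction
```

Under the LTI hypothesis, the prediction matches the response the long stimulus would evoke; departures from it are what the study uses to judge where the LTI description breaks down.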
Abstract:
We examine mid- to late Holocene centennial-scale climate variability in Ireland using proxy data from peatlands, lakes and a speleothem. A high degree of between-record variability is apparent in the proxy data and significant chronological uncertainties are present. However, tephra layers provide a robust tool for correlation and improve the chronological precision of the records. Although we can find no statistically significant coherence in the dataset as a whole, a selection of high-quality peatland water table reconstructions co-vary more than would be expected by chance alone. A locally weighted regression model with bootstrapping can be used to construct a ‘best-estimate’ palaeoclimatic reconstruction from these datasets. Visual comparison and cross-wavelet analysis of peatland water table compilations from Ireland and Northern Britain show that there are some periods of coherence between these records. Some terrestrial palaeoclimatic changes in Ireland appear to coincide with changes in the North Atlantic thermohaline circulation and solar activity. However, these relationships are inconsistent and may be obscured by chronological uncertainties. We conclude by suggesting an agenda for future Holocene climate research in Ireland.
Abstract:
The Advanced Along-Track Scanning Radiometer (AATSR) was launched on Envisat in March 2002. The AATSR instrument is designed to retrieve precise and accurate global sea surface temperature (SST) which, combined with the large data set collected from its predecessors, ATSR and ATSR-2, will provide a long-term SST record spanning more than 15 years. This record can be used for independent monitoring and detection of climate change. The AATSR validation programme has successfully completed its initial phase. The programme involves validation of the AATSR-derived SST values using in situ radiometers, in situ buoys and global SST fields from other data sets. The results of the initial programme presented here demonstrate that the AATSR instrument is currently close to meeting its scientific objective of determining global SST to an accuracy of 0.3 K (one sigma). For night-time data, the analysis gives a warm bias of between +0.04 K (0.28 K) for buoys and +0.06 K (0.20 K) for radiometers, with slightly higher errors observed for daytime data, which shows warm biases of between +0.02 K (0.39 K) for buoys and +0.11 K (0.33 K) for radiometers. They show that the ATSR series of instruments continues to be the world leader in delivering accurate space-based observations of SST, which is a key climate parameter.
Abstract:
This article examines the ability of several models to generate optimal hedge ratios. Statistical models employed include univariate and multivariate generalized autoregressive conditionally heteroscedastic (GARCH) models, and exponentially weighted and simple moving averages. The variances of the hedged portfolios derived using these hedge ratios are compared with those based on market expectations implied by the prices of traded options. One-month and three-month hedging horizons are considered for four currency pairs. Overall, it has been found that an exponentially weighted moving-average model leads to lower portfolio variances than any of the GARCH-based, implied or time-invariant approaches.
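The minimum-variance hedge ratio underlying such comparisons is Cov(s, f)/Var(f) for spot and futures returns; the exponentially weighted variant replaces the sample moments with EWMA recursions. A minimal sketch (the decay factor, the zero-mean-returns assumption and the data are illustrative, not taken from the article):

```python
def ewma_hedge_ratio(spot_returns, futures_returns, lam=0.94):
    """Minimum-variance hedge ratio Cov(s, f) / Var(f), with both moments
    estimated by EWMA recursions on (assumed zero-mean) returns. lam=0.94
    is the RiskMetrics convention, used only as an illustrative default."""
    cov = var = 0.0
    for s, f in zip(spot_returns, futures_returns):
        cov = lam * cov + (1 - lam) * s * f   # EWMA covariance update
        var = lam * var + (1 - lam) * f * f   # EWMA variance update
    return cov / var

# Sanity check: futures that track the spot one-for-one give a ratio of 1.
returns = [0.01, -0.02, 0.015, 0.003, -0.007]
ratio = ewma_hedge_ratio(returns, returns)
```

The GARCH-based and implied-volatility ratios compared in the article replace these moment estimates with model-based or option-implied ones; the portfolio-variance comparison is then run on the hedged positions each ratio produces.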
Abstract:
Body Sensor Networks (BSNs) have been recently introduced for the remote monitoring of human activities in a broad range of application domains, such as health care, emergency management, fitness and behaviour surveillance. BSNs can be deployed in a community of people and can generate large amounts of contextual data that require a scalable approach for storage, processing and analysis. Cloud computing can provide a flexible storage and processing infrastructure to perform both online and offline analysis of data streams generated in BSNs. This paper proposes BodyCloud, a SaaS approach for community BSNs that supports the development and deployment of Cloud-assisted BSN applications. BodyCloud is a multi-tier application-level architecture that integrates a Cloud computing platform and BSN data streams middleware. BodyCloud provides programming abstractions that allow the rapid development of community BSN applications. This work describes the general architecture of the proposed approach and presents a case study for the real-time monitoring and analysis of cardiac data streams of many individuals.
Abstract:
In the 1960s and early 1970s sea surface temperatures in the North Atlantic Ocean cooled rapidly. There is still considerable uncertainty about the causes of this event, although various mechanisms have been proposed. In this observational study it is demonstrated that the cooling proceeded in several distinct stages. Cool anomalies initially appeared in the mid-1960s in the Nordic Seas and Gulf Stream Extension, before spreading to cover most of the Subpolar Gyre. Subsequently, cool anomalies spread into the tropical North Atlantic before retreating, in the late 1970s, back to the Subpolar Gyre. There is strong evidence that changes in atmospheric circulation, linked to a southward shift of the Atlantic ITCZ, played an important role in the event, particularly in the period 1972-76. Theories for the cooling event must account for its distinctive space-time evolution. Our analysis suggests that the most likely drivers were: 1) The “Great Salinity Anomaly” of the late 1960s; 2) An earlier warming of the subpolar North Atlantic, which may have led to a slow-down in the Atlantic Meridional Overturning Circulation; 3) An increase in anthropogenic sulphur dioxide emissions. Determining the relative importance of these factors is a key area for future work.
Abstract:
We study the scaling properties and Kraichnan–Leith–Batchelor (KLB) theory of forced inverse cascades in generalized two-dimensional (2D) fluids (α-turbulence models) simulated at a resolution of 8192 × 8192. We consider α = 1 (surface quasigeostrophic flow), α = 2 (2D Euler flow) and α = 3. The forcing scale is well resolved, a direct cascade is present and there is no large-scale dissipation. Coherent vortices spanning a range of sizes, most larger than the forcing scale, are present for both α = 1 and α = 2. The active scalar field for α = 3 contains comparatively few and small vortices. The energy spectral slopes in the inverse cascade are steeper than the KLB prediction −(7−α)/3 in all three systems. Since we stop the simulations well before the cascades have reached the domain scale, vortex formation and spectral steepening are not due to condensation effects; nor are they caused by large-scale dissipation, which is absent. One- and two-point p.d.f.s, hyperflatness factors and structure functions indicate that the inverse cascades are intermittent and non-Gaussian over much of the inertial range for α = 1 and α = 2, while the α = 3 inverse cascade is much closer to Gaussian and non-intermittent. For α = 3 the steep spectrum is close to that associated with enstrophy equipartition. Continuous wavelet analysis shows approximate KLB scaling ℰ(k) ∝ k^(−2) (α = 1) and ℰ(k) ∝ k^(−5/3) (α = 2) in the interstitial regions between the coherent vortices. Our results demonstrate that coherent vortex formation (α = 1 and α = 2) and non-realizability (α = 3) cause 2D inverse cascades to deviate from the KLB predictions, but that the flow between the vortices exhibits KLB scaling and non-intermittent statistics for α = 1 and α = 2.
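Spectral slopes of the kind compared against the KLB prediction −(7−α)/3 are commonly estimated by a least-squares fit in log-log space over an inertial range. A toy sketch on a synthetic power-law spectrum (not the simulation output):

```python
import numpy as np

def spectral_slope(k, energy):
    """Least-squares slope of log E(k) against log k: the exponent n
    in E(k) ~ k^n over the fitted wavenumber range."""
    return np.polyfit(np.log(k), np.log(energy), 1)[0]

# Synthetic spectrum matching the KLB prediction for alpha = 2:
# E(k) ~ k^(-(7 - 2)/3) = k^(-5/3). Prefactor and range are arbitrary.
k = np.arange(10.0, 100.0)
energy = 3.0 * k ** (-5.0 / 3.0)
slope = spectral_slope(k, energy)
```

On real cascade spectra the fitted slope depends on the chosen wavenumber range, which is why the study's wavelet analysis restricted to interstitial regions gives a different answer from the global fit.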
Abstract:
The induction of classification rules from previously unseen examples is one of the most important data mining tasks in science as well as in commercial applications. In order to reduce the influence of noise in the data, ensemble learners are often applied. However, most ensemble learners are based on decision-tree classifiers, which are themselves affected by noise. The Random Prism classifier has recently been proposed as an alternative to the popular decision-tree-based Random Forests classifier. Random Prism is based on the Prism family of algorithms, which is more robust to noise. However, like most ensemble classification approaches, Random Prism does not scale well on large training data. This paper presents a thorough discussion of Random Prism and a recently proposed parallel version of it called Parallel Random Prism, which is based on the MapReduce programming paradigm. The paper provides, for the first time, a novel theoretical analysis of the proposed technique and an in-depth experimental study showing that Parallel Random Prism scales well with the number of training examples, the number of data features and the number of processors. The expressiveness of the decision rules our technique produces makes it a natural choice for Big Data applications, where informed decision making increases the user's trust in the system.
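The MapReduce structure described here can be caricatured in a few lines: each mapper induces a base model from a bootstrap sample of the training data, and a reducer combines the mappers' predictions by majority vote. The majority-class "rule" below is a deliberately trivial stand-in for Prism's rule induction, and the data are invented:

```python
import random
from collections import Counter

def map_train(examples, rng):
    """Map phase: induce a base model from a bootstrap sample. A
    majority-class 'rule' stands in for a Prism rule set here."""
    sample = [rng.choice(examples) for _ in examples]
    majority = Counter(label for _, label in sample).most_common(1)[0][0]
    return lambda features: majority

def reduce_vote(models, features):
    """Reduce phase: combine the mappers' predictions by majority vote."""
    return Counter(m(features) for m in models).most_common(1)[0][0]

# Invented toy data: 30 labelled examples, almost all 'pos'.
rng = random.Random(0)
data = [({"f": i}, "pos" if i else "neg") for i in range(30)]
models = [map_train(data, rng) for _ in range(9)]
prediction = reduce_vote(models, {"f": 7})
```

In the real system each map task runs on a separate node over its own bootstrap sample, so training time scales with the number of processors rather than with the ensemble size.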
Abstract:
The theory of wave–mean flow interaction requires a partition of the atmospheric flow into a notional background state and perturbations to it. Here, a background state, known as the Modified Lagrangian Mean (MLM), is defined as the zonally symmetric state obtained by requiring that every potential vorticity (PV) contour lying within an isentropic layer encloses the same mass and circulation as in the full flow. For adiabatic and frictionless flow, these two integral properties are time-invariant and the MLM state is a steady solution of the primitive equations. The time dependence in the adiabatic flow is put into the perturbations, which can be described by a wave-activity conservation law that is exact even at large amplitude. Furthermore, the effects of non-conservative processes on wave activity can be calculated from the conservation law. A new method to calculate the MLM state is introduced, where the position of the lower boundary is obtained as part of the solution. The results are illustrated using Northern Hemisphere ERA-Interim data. The MLM state evolves slowly, implying that the net non-conservative effects are weak. Although ‘adiabatic eddy fluxes’ cannot affect the MLM state, the effects of Rossby-wave breaking, PV filamentation and subsequent dissipation result in sharpening of the polar vortex edge and meridional shifts in the MLM zonal flow, both at tropopause level and on the winter stratospheric vortex. The rate of downward migration of wave activity during stratospheric sudden warmings is shown to be given by the vertical scale associated with polar vortex tilt divided by the time-scale for wave dissipation estimated from the wave-activity conservation law. Aspects of troposphere–stratosphere interaction are discussed. The new framework is suitable to examine the climate and its interactions with disturbances, such as midlatitude storm tracks, and makes a clean partition between adiabatic and non-conservative processes.
Abstract:
This paper presents preliminary results from an ethnoarchaeological study of animal husbandry in the modern village of Bestansur, situated in the lower Zagros Mountains of Iraqi Kurdistan. This research explores how modern families use and manage their livestock within the local landscape and identifies traces of this use. The aim is to provide the groundwork for future archaeological investigations focusing on the nearby Neolithic site of Bestansur. This is based on the premise that modern behaviours can suggest testable patterns for past practices within the same functional and ecological domains. Semi-structured interviews conducted with villagers from several households provided large amounts of information on modern behaviours that helped direct data collection, and which also illustrate notable shifts in practices and use of the local landscape over time. Strontium isotope analysis of modern plant material demonstrates that a measurable variation exists between the alluvial floodplain and the lower foothills, while analysis of modern dung samples shows clear variation between sheep/goat and cow dung, in terms of numbers of faecal spherulites. These results are specific to the local environment of Bestansur and can be used for evaluating and contextualising archaeological evidence as well as providing modern reference material for comparative purposes.
Abstract:
Implicit dynamic-algebraic equations, known in control theory as descriptor systems, arise naturally in many applications. Such systems may not be regular (often referred to as singular). In that case the equations may not have unique solutions for consistent initial conditions and arbitrary inputs and the system may not be controllable or observable. Many control systems can be regularized by proportional and/or derivative feedback. We present an overview of mathematical theory and numerical techniques for regularizing descriptor systems using feedback controls. The aim is to provide stable numerical techniques for analyzing and constructing regular control and state estimation systems and for ensuring that these systems are robust. State and output feedback designs for regularizing linear time-invariant systems are described, including methods for disturbance decoupling and mixed output problems. Extensions of these techniques to time-varying linear and nonlinear systems are discussed in the final section.
Abstract:
We show that an analysis of the mean and variance of discrete wavelet coefficients of coaveraged time-domain interferograms can be used as a specification for determining when to stop coaveraging. We also show that, if a prediction model built in the wavelet domain is used to determine the composition of unknown samples, a stopping criterion for the coaveraging process can be developed with respect to the uncertainty tolerated in the prediction.
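A stopping rule of this general kind can be sketched as follows: track a statistic of the discrete wavelet detail coefficients of the running coaverage and stop once it falls below a tolerance. Everything below (the one-level Haar transform, the linear-ramp "interferogram", the noise levels and the tolerance) is an illustrative assumption, not the authors' specification:

```python
import math
import random

def haar_details(signal):
    """One-level Haar DWT detail coefficients: paired differences / sqrt(2)."""
    return [(signal[i] - signal[i + 1]) / math.sqrt(2.0)
            for i in range(0, len(signal) - 1, 2)]

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def scans_needed(scan_source, tol, max_scans=4096):
    """Coaverage scans from scan_source until the variance of the Haar
    detail coefficients of the running average drops below tol."""
    avg = None
    for n in range(1, max_scans + 1):
        scan = scan_source()
        if avg is None:
            avg = list(scan)
        else:
            avg = [a + (s - a) / n for a, s in zip(avg, scan)]  # running mean
        if variance(haar_details(avg)) < tol:
            return n
    return max_scans

# Synthetic 'interferogram': a linear ramp (whose Haar details are constant,
# so the detail variance isolates the noise) plus white noise. The noise
# variance of the coaverage decays like 1/n, so noisier scans need more scans.
rng = random.Random(42)
def noisy_scan(sigma):
    return [0.01 * i + rng.gauss(0.0, sigma) for i in range(256)]

few = scans_needed(lambda: noisy_scan(0.05), tol=1e-3)
many = scans_needed(lambda: noisy_scan(0.5), tol=1e-3)
```

The paper's second criterion works the same way but monitors the uncertainty a wavelet-domain prediction model propagates to the estimated composition, rather than the coefficient variance itself.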