57 results for State Space Analysis


Relevance:

80.00%

Publisher:

Abstract:

Conditions are given under which a descriptor, or generalized state-space, system can be regularized by output feedback. It is shown that under these conditions, proportional and derivative output feedback controls can be constructed such that the closed-loop system is regular and has index at most one. This property ensures the solvability of the resulting system of dynamic-algebraic equations. A reduced form is given that allows the system properties as well as the feedback to be determined. The construction procedures used to establish the theory are based only on orthogonal matrix decompositions and can therefore be implemented in a numerically stable way.
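
As a loose illustration of the numerical-linear-algebra flavour of such results, the sketch below checks regularity of a matrix pencil sE - A by evaluating its determinant at random points. This is a generic heuristic for small examples, not the construction procedure of the paper.

```python
import numpy as np

def is_regular_pencil(E, A, trials=5, tol=1e-10, rng=None):
    """Heuristic regularity test for the pencil (sE - A): the pencil is
    regular iff det(sE - A) is not identically zero, so a nonzero
    determinant at a random s certifies regularity (with probability one)."""
    rng = np.random.default_rng(rng)
    for _ in range(trials):
        s = rng.standard_normal() + 1j * rng.standard_normal()
        if abs(np.linalg.det(s * E - A)) > tol:
            return True
    return False

# Example: a singular E whose pencil is nevertheless regular
# (a descriptor system of index one)
E = np.diag([1.0, 1.0, 0.0])
A = np.eye(3)
print(is_regular_pencil(E, A))  # True: det(sE - A) = -(s-1)^2
```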

Relevance:

80.00%

Publisher:

Abstract:

A robust pole assignment by linear state feedback is achieved in state-space representation by selecting a feedback which minimises the conditioning of the assigned eigenvalues of the closed-loop system. It is shown here that when this conditioning is minimised, a lower bound on the stability margin in the frequency domain is maximised.
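
SciPy's `place_poles` implements robust pole assignment in this spirit: the Kautsky-Nichols-Van Dooren and Tits-Yang algorithms it offers select a feedback that improves the conditioning of the closed-loop eigenvector matrix. The system matrices below are illustrative values, not from the paper.

```python
import numpy as np
from scipy.signal import place_poles

# Two-state, two-input system x' = A x + B u (illustrative values)
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.eye(2)
poles = np.array([-1.0, -2.0])

# place_poles chooses, among all gains assigning these poles, one that
# improves the conditioning of the closed-loop eigenvectors -- the
# robustness criterion discussed in the abstract.
res = place_poles(A, B, poles)
K = res.gain_matrix
closed = np.linalg.eigvals(A - B @ K)
print(np.sort(closed.real))  # ≈ [-2. -1.]
```

With more inputs than one, the assigning gain is non-unique, which is exactly the freedom the robust selection exploits.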

Relevance:

80.00%

Publisher:

Relevance:

80.00%

Publisher:

Abstract:

This dissertation deals with aspects of sequential data assimilation (in particular ensemble Kalman filtering) and numerical weather forecasting. In the first part, the recently formulated Ensemble Kalman-Bucy filter (EnKBF) is revisited. It is shown that the previously used numerical integration scheme fails when the magnitude of the background error covariance grows beyond that of the observational error covariance within the forecast window. We therefore present a suitable integration scheme that handles the stiffening of the differential equations involved without incurring further computational expense. Moreover, a transform-based alternative to the EnKBF is developed: under this scheme, the operations are performed in ensemble space instead of in state space. Advantages of this formulation are explained. For the first time, the EnKBF is implemented in an atmospheric model. The second part of this work deals with ensemble clustering, a phenomenon that arises when performing data assimilation with deterministic ensemble square root filters (EnSRFs) in highly nonlinear forecast models: an M-member ensemble separates into an outlier and a cluster of M-1 members. Previous works may suggest that this issue represents a failure of EnSRFs; this work dispels that notion. It is shown that ensemble clustering can also be reversed by nonlinear processes, in particular the alternation between nonlinear expansion and compression of the ensemble in different regions of the attractor. Some EnSRFs that use random rotations have been developed to overcome this issue; these formulations are analyzed and their advantages and disadvantages with respect to common EnSRFs are discussed. The third and last part contains the implementation of the Robert-Asselin-Williams (RAW) filter in an atmospheric model. The RAW filter is an improvement to the widely used Robert-Asselin filter that suppresses spurious computational waves while avoiding distortion of the mean value of the filtered quantity. Using statistical significance tests at both the local and the field level, it is shown that the climatology of the SPEEDY model is not modified by the changed time-stepping scheme; hence, no retuning of the parameterizations is required. It is found that the accuracy of medium-term forecasts is increased by using the RAW filter.
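
The RAW filter itself is simple to state. The sketch below applies it to a leapfrog integration of a harmonic oscillator; this is an illustrative toy with assumed parameter values, not the SPEEDY implementation.

```python
import numpy as np

def leapfrog_raw(f, x0, dt, nsteps, nu=0.05, alpha=0.53):
    """Leapfrog time stepping with the Robert-Asselin-Williams (RAW) filter.
    alpha = 1 recovers the classical Robert-Asselin filter; alpha ~ 0.53
    largely preserves the mean while damping the computational mode."""
    xm = np.array(x0, dtype=float)           # x_{n-1}
    xc = xm + dt * f(xm)                     # x_n (Euler start)
    for _ in range(nsteps):
        xp = xm + 2.0 * dt * f(xc)           # provisional x_{n+1}
        d = 0.5 * nu * (xm - 2.0 * xc + xp)  # filter displacement
        xm = xc + alpha * d                  # filtered x_n  -> new x_{n-1}
        xc = xp + (alpha - 1.0) * d          # filtered x_{n+1} -> new x_n
    return xc

# Harmonic oscillator (u, v): du/dt = -v, dv/dt = u
f = lambda x: np.array([-x[1], x[0]])
x = leapfrog_raw(f, [1.0, 0.0], dt=0.05, nsteps=400)
print(np.hypot(*x))  # amplitude stays close to 1
```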

Relevance:

80.00%

Publisher:

Abstract:

Biological models of an apoptotic process are studied using systems of differential equations derived from reaction-kinetics information. The mathematical model is reformulated in a state-space robust control theory framework in which parametric and dynamic uncertainty can be modelled to account for the variations naturally occurring in biological processes. We propose to handle the nonlinearities using neural networks.

Relevance:

80.00%

Publisher:

Abstract:

In general, particle filters need large numbers of model runs to avoid filter degeneracy in high-dimensional systems. The recently proposed, fully nonlinear equivalent-weights particle filter overcomes this requirement by replacing the standard model transition density with two different proposal transition densities. The first proposal density is used to relax all particles towards the high-probability regions of state space as defined by the observations. The crucial second proposal density is then used to ensure that the majority of particles have equivalent weights at observation time. Here, the performance of the scheme is explored in a high-dimensional (65,500-variable) simplified ocean model. The success of the equivalent-weights particle filter in matching the true model state is shown using the mean of just 32 particles in twin experiments. It is of particular significance that this remains true even as the number and spatial variability of the observations are changed. The results from rank histograms are less easy to interpret and can be influenced considerably by the parameter values used. This article also explores the sensitivity of the scheme's performance to the chosen parameter values, and the effect of using different model-error parameters in the truth run compared with the ensemble model runs.
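
The degeneracy problem that motivates the equivalent-weights construction is easy to demonstrate: with standard importance weights, the effective sample size collapses as the dimension grows. A minimal sketch, with assumed Gaussian priors and likelihoods (not the ocean model of the paper):

```python
import numpy as np

def effective_sample_size(logw):
    """ESS = 1 / sum(w_i^2) for normalised weights, computed stably in log space."""
    logw = logw - logw.max()
    w = np.exp(logw)
    w /= w.sum()
    return 1.0 / np.sum(w ** 2)

rng = np.random.default_rng(0)
n_particles = 100
for d in (1, 10, 100):
    particles = rng.standard_normal((n_particles, d))  # prior draws
    obs = np.zeros(d)                                  # observe every variable
    # Gaussian log-likelihood with unit observation error
    logw = -0.5 * np.sum((particles - obs) ** 2, axis=1)
    print(d, effective_sample_size(logw))  # ESS shrinks as d grows
```

With 100 particles in 100 dimensions, a handful of particles carry essentially all the weight, which is precisely the situation the equivalent-weights proposal density is designed to avoid.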

Relevance:

50.00%

Publisher:

Abstract:

The analysis step of the (ensemble) Kalman filter is optimal when (1) the distribution of the background is Gaussian, (2) state variables and observations are related via a linear operator, and (3) the observational error is of additive nature and has Gaussian distribution. When these conditions are largely violated, a pre-processing step known as Gaussian anamorphosis (GA) can be applied. The objective of this procedure is to obtain state variables and observations that better fulfil the Gaussianity conditions in some sense. In this work we analyse GA from a joint perspective, paying attention to the effects of transformations in the joint state-variable/observation space. First, we study transformations for state variables and observations that are independent of each other. Then, we introduce a targeted joint transformation with the objective of obtaining joint Gaussianity in the transformed space. We focus primarily on the univariate case, and comment briefly on the multivariate one. A key point of this paper is that, when (1)-(3) are violated, the analysis step of the EnKF will not recover the exact posterior density, regardless of any transformations one may perform. These transformations do, however, provide approximations of different quality to the Bayesian solution of the problem. Using an example in which the Bayesian posterior can be computed analytically, we assess the quality of the analysis distributions generated after applying the EnKF analysis step in conjunction with different GA options. The value of the targeted joint transformation is particularly clear when the prior is Gaussian, the marginal density for the observations is close to Gaussian, and the likelihood is a Gaussian mixture.
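
A common univariate form of Gaussian anamorphosis is the rank-based (empirical-CDF) transform: push each value through the empirical CDF and then through the standard-normal quantile function. The sketch below is that generic recipe, not necessarily the specific transformations studied in the paper.

```python
import numpy as np
from scipy.stats import norm, rankdata

def anamorphosis(x):
    """Rank-based Gaussian anamorphosis: map each value through the
    empirical CDF, then through the standard-normal quantile function."""
    u = rankdata(x) / (len(x) + 1.0)  # empirical CDF, kept off 0 and 1
    return norm.ppf(u)

# A strongly skewed sample becomes (marginally) Gaussian after the transform
rng = np.random.default_rng(0)
x = rng.exponential(size=1000)
y = anamorphosis(x)
print(round(float(y.mean()), 6), round(float(y.std()), 2))
```

Note that this makes each marginal Gaussian; as the abstract stresses, it does not by itself guarantee joint Gaussianity, which is what the targeted joint transformation addresses.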

Relevance:

40.00%

Publisher:

Abstract:

Most building services products are installed while a building is constructed but are not operated until the building is commissioned. The warranty of the products may cover the period from their installation to the end of the warranty period. Prior to the commissioning of the building, the products are in a dormant mode (i.e., not operated) but are protected by the warranty. For such products, both the usage intensity and the failure patterns differ from those of products in continuous use. This paper develops warranty cost models for repairable products with a dormant mode from both the manufacturer's and the buyer's perspectives. Relationships between the failure patterns in the dormant mode and in the operational mode are also discussed. Numerical examples and sensitivity analysis demonstrate the applicability of the methodology derived in the paper.
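
Under a minimal-repair assumption (failures as a Poisson process with different constant intensities in the dormant and operational modes), the expected warranty cost reduces to the repair cost times the integrated failure intensity. A sketch with purely illustrative rates and costs, not values from the paper:

```python
# Expected warranty cost for a repairable product with a dormant phase,
# assuming minimal repairs: failures follow a Poisson process, so
# E[cost] = cost per repair x integrated failure intensity over the warranty.
# All numeric values below are illustrative assumptions.

def expected_warranty_cost(rate_dormant, rate_operational,
                           t_dormant, t_warranty, cost_per_repair):
    """Warranty runs from installation (t = 0) to t_warranty; the product
    is dormant until t_dormant, then operational."""
    t_op = max(0.0, t_warranty - t_dormant)
    expected_failures = (rate_dormant * min(t_dormant, t_warranty)
                         + rate_operational * t_op)
    return cost_per_repair * expected_failures

# Dormant for 6 months of a 24-month warranty
print(expected_warranty_cost(0.01, 0.05, 6, 24, 200.0))  # 200*(0.06+0.90) = 192.0
```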

Relevance:

40.00%

Publisher:

Abstract:

The Earth-directed coronal mass ejection (CME) of 8 April 2010 provided an opportunity for space weather predictions from both established and developmental techniques to be made from near–real time data received from the SOHO and STEREO spacecraft; the STEREO spacecraft provide a unique view of Earth-directed events from outside the Sun-Earth line. Although the near–real time data transmitted by the STEREO Space Weather Beacon are significantly poorer in quality than the subsequently downlinked science data, the use of these data has the advantage that near–real time analysis is possible, allowing actual forecasts to be made. The fact that such forecasts cannot be biased by any prior knowledge of the actual arrival time at Earth provides an opportunity for an unbiased comparison between several established and developmental forecasting techniques. We conclude that for forecasts based on the STEREO coronagraph data, it is important to take account of the subsequent acceleration/deceleration of each CME through interaction with the solar wind, while predictions based on measurements of CMEs made by the STEREO Heliospheric Imagers would benefit from higher temporal and spatial resolution. Space weather forecasting tools must work with near–real time data; such data, when provided by science missions, is usually highly compressed and/or reduced in temporal/spatial resolution and may also have significant gaps in coverage, making such forecasts more challenging.
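
One standard way to account for the acceleration or deceleration of a CME through interaction with the solar wind is a drag-based model, dv/dt = -γ(v - w)|v - w|, where w is the ambient wind speed. The sketch below uses assumed parameter values and is not a reconstruction of the forecasts discussed here.

```python
# Drag-based model of CME propagation: dv/dt = -gamma * (v - w) * |v - w|,
# where w is the ambient solar-wind speed (km/s). Parameter values are
# illustrative assumptions, not fits to the 8 April 2010 event.

def arrival_time(v0, w=400.0, gamma=1e-7, r_target=1.5e8, dt=600.0):
    """Integrate from the Sun to r_target (km); return arrival time in days."""
    r, v, t = 0.0, v0, 0.0
    while r < r_target:
        v += -gamma * (v - w) * abs(v - w) * dt  # drag toward wind speed
        r += v * dt
        t += dt
    return t / 86400.0

print(arrival_time(800.0))  # fast CME decelerates toward the wind speed
print(arrival_time(300.0))  # slow CME is accelerated by the wind
```

The asymmetry between the two cases, with both speeds converging toward the ambient wind, is the qualitative behaviour a coronagraph-based forecast must correct for.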

Relevance:

40.00%

Publisher:

Abstract:

To achieve CO2 emissions reductions, the UK Building Regulations require developers of new residential buildings to calculate the expected CO2 emissions arising from energy consumption using a methodology such as the Standard Assessment Procedure (SAP 2005) or, more recently, SAP 2009. SAP encompasses all domestic heat consumption and a limited proportion of the electricity consumption. However, these calculations are rarely verified against real energy consumption and the related CO2 emissions. This paper presents the results of an analysis based on weekly heat demand data for more than 200 individual flats. The data were collected from a recently built residential development connected to a district heating network. A methodology for separating out the domestic hot water (DHW) use and space heating (SH) demand has been developed, and the measured values are compared to the demand calculated using the SAP 2005 and 2009 methodologies. The analysis also shows the variation in DHW and SH consumption with both the size of the flats and their tenure (privately owned or housing association). The evaluation of space heating consumption also includes an estimation of the heating degree day (HDD) base temperature for each block of flats and its comparison to the average base temperature calculated using the SAP 2005 methodology.
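
HDD base-temperature estimation can be sketched as a one-parameter search: regress weekly heat demand on HDD computed for each candidate base temperature and keep the base with the smallest residual. The data below are synthetic and all values are assumptions, not measurements from the development studied.

```python
import numpy as np

def hdd(temps, base):
    """Weekly heating degree days from mean daily temperatures (deg C)."""
    return np.maximum(base - temps, 0.0).sum(axis=1)

# Synthetic weekly data: 80 weeks of 7 mean daily temperatures each
rng = np.random.default_rng(42)
weeks = rng.uniform(-2, 20, size=(80, 7))
true_base = 15.5
demand = 2.0 * hdd(weeks, true_base) + 30.0 + rng.normal(0, 5, 80)

def fit_base(temps, demand, candidates):
    """Pick the base temperature whose HDD best explain demand (least squares)."""
    best, best_sse = None, np.inf
    for base in candidates:
        X = np.column_stack([hdd(temps, base), np.ones(len(demand))])
        coef = np.linalg.lstsq(X, demand, rcond=None)[0]
        sse = float(((X @ coef - demand) ** 2).sum())
        if sse < best_sse:
            best, best_sse = base, sse
    return best

print(fit_base(weeks, demand, np.arange(10.0, 20.0, 0.25)))  # typically near 15.5
```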

Relevance:

40.00%

Publisher:

Abstract:

The technique of relaxing the tropical atmosphere towards an analysis in a monthly-to-seasonal forecast model has previously been exploited successfully in a number of contexts. Here it is shown that, when tropical relaxation is used to investigate the possible origin of the observed anomalies in June-July 2007, a simple dynamical model is able to reproduce the observed component of the pattern of anomalies given by an ensemble of ECMWF forecast runs. Following this result, the simple model is used for a range of experiments on the relaxation time-scales and on the variables and regions relaxed, based on a control model run with equatorial heating in a zonal flow. A theory based on scale analysis for the large-scale tropics is used to interpret the results. Typical relationships between scales are determined from the basic equations, and for a specified diabatic heating a chain of deductions for determining the dependent variables is derived. Different critical time-scales are found for tropical relaxation of the different dependent variables to be effective. Vorticity has the longest critical time-scale, typically 1.2 days; for temperature and divergence, the time-scales are 10 hours and 3 hours, respectively. However, not all the tropical fields, in particular the vertical motion, are reproduced correctly by the model unless divergence is heavily damped. To obtain the correct extra-tropical fields, it is crucial to have the correct rotational flow in the subtropics to initiate the Rossby wave propagation from there; it is sufficient to relax vorticity or temperature on a time-scale comparable to or less than their critical time-scales to obtain this. However, if the divergent advection of vorticity is important in the Rossby wave source, then strong relaxation of divergence is required to represent the tropical forcing of Rossby waves accurately.
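
Relaxation (nudging) of a model variable towards an analysis adds a term (x_an - x)/τ to its tendency. The toy sketch below shows how the time-scale τ controls how tightly the state is drawn to the analysis; the scalar dynamics are chosen purely for illustration.

```python
# Nudging (relaxation) toward an analysis value: dx/dt = f(x) + (x_an - x)/tau.
# Shorter tau means a tighter constraint toward the analysis.

def relax(f, x0, x_an, tau, dt=0.01, nsteps=2000):
    """Forward-Euler integration of the nudged scalar equation."""
    x = float(x0)
    for _ in range(nsteps):
        x += dt * (f(x) + (x_an - x) / tau)
    return x

f = lambda x: -0.1 * x  # simple damped "model dynamics"
x_an = 1.0              # analysis value we relax toward
for tau in (0.1, 1.0, 10.0):
    print(tau, relax(f, 0.0, x_an, tau))  # closer to x_an for smaller tau
```

The same trade-off underlies the critical time-scales in the abstract: relaxation is only effective once τ is short compared with the time-scale on which the model dynamics move that variable.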

Relevance:

40.00%

Publisher:

Abstract:

The fascinating idea that tools become extensions of our body appears in artistic, literary, philosophical, and scientific works alike. In the last fifteen years, this idea has been reframed into several related hypotheses, one of which states that tool use extends the neural representation of the multisensory space immediately surrounding the hands (variously termed peripersonal space, peri-hand space, peri-cutaneous space, action space, or near space). This and related hypotheses have been tested extensively in the cognitive neurosciences, with evidence from molecular, neurophysiological, neuroimaging, neuropsychological, and behavioural studies. Here, I briefly review the evidence for and against the hypothesis that tool use extends a neural representation of the space surrounding the hand, concentrating on neurophysiological, neuropsychological, and behavioural evidence. I then provide a re-analysis of data from six published and one unpublished experiment using the crossmodal congruency task to test this hypothesis. While the re-analysis broadly confirms the previously reported finding that tool use does not literally extend peripersonal space, the overall effect sizes are small and statistical power is low. I conclude by questioning whether the crossmodal congruency task can indeed be used to test the hypothesis that tool use modifies peripersonal space.