14 results for Inverse Approach

in CentAUR: Central Archive University of Reading - UK


Relevance:

60.00%

Publisher:

Abstract:

Recent coordinated observations of interplanetary scintillation (IPS) from EISCAT, MERLIN, and STELab, together with stereoscopic white-light imaging from the two heliospheric imagers (HIs) onboard the twin STEREO spacecraft, make it possible to continuously track the propagation and evolution of solar eruptions throughout interplanetary space. To better understand the observational signatures of these two remote-sensing techniques, the magnetohydrodynamics of the macro-scale interplanetary disturbance and the radio-wave scattering of the micro-scale electron-density fluctuation are coupled and investigated using a newly constructed multi-scale numerical model. This model is then applied to a case of interplanetary shock propagation within the ecliptic plane. The shock can be nearly invisible to an HI once it enters the Thomson-scattering sphere of that HI. The asymmetry in the optical images between the western and eastern HIs suggests that the shock propagates off the Sun–Earth line. Meanwhile, an IPS signal, strongly dependent on the local electron density, is insensitive to the density cavity far downstream of the shock front. When this cavity (or the shock nose) is cut through by an IPS ray-path, a single speed component at the flank (or the nose) of the shock can be recorded; when an IPS ray-path penetrates the sheath between the shock nose and this cavity, two speed components, at the sheath and at the flank, can be detected. Moreover, once a shock front touches an IPS ray-path, the derived position and speed at the irregularity source of the IPS signal, together with the assumption of radial propagation at constant speed, can be used to estimate the later appearance of the shock front in the elongation of the HI field of view. The results of synthetic measurements from forward modelling are helpful in inferring the in-situ properties of coronal mass ejections from real observational data via an inverse approach.

Relevance:

30.00%

Publisher:

Abstract:

The decadal predictability of three-dimensional Atlantic Ocean anomalies is examined in a coupled global climate model (HadCM3) using a Linear Inverse Modelling (LIM) approach. It is found that the evolution of temperature and salinity in the Atlantic, and the strength of the meridional overturning circulation (MOC), can be effectively described by a linear dynamical system forced by white noise. The forecasts produced using this linear model are more skillful than other reference forecasts for several decades. Furthermore, significant non-normal amplification is found under several different norms. The regions from which this growth occurs are found to be fairly shallow and located in the far North Atlantic. Initially, anomalies in the Nordic Seas impact the MOC, and the anomalies then grow to fill the entire Atlantic basin, especially at depth, over one to three decades. It is found that the structure of the optimal initial condition for amplification is sensitive to the norm employed, but the initial growth seems to be dominated by MOC-related basin scale changes, irrespective of the choice of norm. The consistent identification of the far North Atlantic as the most sensitive region for small perturbations suggests that additional observations in this region would be optimal for constraining decadal climate predictions.
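The core LIM step can be sketched in a few lines (a toy example, not the HadCM3 analysis): the propagator of a linear system forced by white noise is estimated from the lag-0 and lag-τ covariances of the data, G = C(τ) C(0)⁻¹, and then used to forecast. The dynamics matrix, noise level and dimensions below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "climate" system: x_{t+1} = A x_t + noise (stable linear dynamics),
# standing in for a low-dimensional set of Atlantic anomaly indices.
A_true = np.array([[0.9, 0.2], [0.0, 0.8]])
n_steps = 5000
x = np.zeros((n_steps, 2))
for t in range(n_steps - 1):
    x[t + 1] = A_true @ x[t] + rng.normal(scale=0.1, size=2)

# Linear Inverse Modelling: estimate the propagator G(tau) from the
# lag-tau and lag-0 covariance matrices, G = C_tau @ inv(C_0).
tau = 1
X0, Xtau = x[:-tau].T, x[tau:].T
C0 = X0 @ X0.T / X0.shape[1]
Ctau = Xtau @ X0.T / X0.shape[1]
G = Ctau @ np.linalg.inv(C0)

print(np.round(G, 2))  # should be close to A_true
```

A forecast at lead tau is then simply `G @ x_now`, and longer leads use matrix powers of `G`.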

Relevance:

30.00%

Publisher:

Abstract:

The goal of this review is to provide a state-of-the-art survey of sampling and probe methods for the solution of inverse problems; in addition, a configuration approach to some of these problems is presented. We study the concepts and analytical results for several recent sampling and probe methods. We introduce the basic idea behind each method using a simple model problem and then provide a general formulation in terms of particular configurations to study the range of arguments used to set up the method. This provides a novel way to present the algorithms and the analytic arguments for their investigation in a variety of different settings. In detail, we investigate the probe method (Ikehata), the linear sampling method (Colton-Kirsch), the factorization method (Kirsch), the singular sources method (Potthast), the no-response test (Luke-Potthast), the range test (Kusiak, Potthast and Sylvester) and the enclosure method (Ikehata) for the solution of inverse acoustic and electromagnetic scattering problems. The main ideas, approaches and convergence results of these methods are presented, and for each method we provide a historical survey of its applications to different situations.

Relevance:

30.00%

Publisher:

Abstract:

In survival analysis, frailty is often used to model heterogeneity between individuals or correlation within clusters. Typically, frailty is taken to be a continuous random effect, yielding a continuous mixture distribution for survival times. A Bayesian analysis of a correlated frailty model is discussed in the context of inverse Gaussian frailty. An MCMC approach is adopted, and the deviance information criterion is used to compare models. As an illustration of the approach, a bivariate data set of corneal graft survival times is analysed.
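The shared-frailty mechanism can be sketched with a toy simulation (not the corneal-graft analysis; the distribution parameters and cluster sizes below are invented): an inverse Gaussian frailty drawn once per cluster multiplies the hazard for all members, so high-frailty clusters fail early and times within a cluster are positively correlated.

```python
import numpy as np

rng = np.random.default_rng(4)

# Each cluster gets an inverse Gaussian (Wald) frailty z with E[z] = 1;
# survival times within the cluster are exponential with hazard z * rate.
n_clusters, base_rate = 2000, 1.0
z = rng.wald(mean=1.0, scale=2.0, size=n_clusters)
times = rng.exponential(1.0 / (z[:, None] * base_rate),
                        size=(n_clusters, 2))

# The shared frailty induces positive correlation between cluster members.
r = np.corrcoef(times[:, 0], times[:, 1])[0, 1]
print(round(float(np.mean(z)), 2), round(float(r), 2))
```

The Bayesian analysis in the paper places priors on the frailty and hazard parameters and samples them by MCMC; this sketch only shows the data-generating side of the model.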

Relevance:

30.00%

Publisher:

Abstract:

Inverse problems for dynamical-system models of cognitive processes comprise the determination of synaptic weight matrices or kernel functions for neural networks or neural/dynamic field models, respectively. We introduce dynamic cognitive modeling as a three-tier top-down approach in which cognitive processes are first described as algorithms that operate on complex symbolic data structures. Second, symbolic expressions and operations are represented by states and transformations in abstract vector spaces. Third, prescribed trajectories through representation space are implemented in neurodynamical systems. We discuss the Amari equation for a neural/dynamic field theory as a special case and show that the kernel construction problem is particularly ill-posed. We suggest Tikhonov-Hebbian learning as a regularization technique and demonstrate its validity and robustness for basic examples of cognitive computations.
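A minimal sketch of the Tikhonov-Hebbian idea for the weight-construction problem, with randomly generated patterns standing in for prescribed trajectory states (dimensions and the regularization level are invented for illustration): when there are fewer patterns than dimensions the plain least-squares problem is singular, and the Tikhonov term makes it well-posed.

```python
import numpy as np

rng = np.random.default_rng(1)

# Prescribed transitions in representation space: we want a weight
# matrix W with W @ x_i ≈ y_i.  With 5 patterns in 10 dimensions,
# X @ X.T is rank-deficient, so the unregularized solve is ill-posed.
X = rng.normal(size=(10, 5))   # columns are states x_i
Y = rng.normal(size=(10, 5))   # columns are desired successors y_i

def tikhonov_hebbian(X, Y, eps):
    """Regularized least squares: W = Y X^T (X X^T + eps I)^{-1}.
    For large eps this tends (up to scaling) to the pure Hebbian
    outer-product rule Y X^T; eps > 0 keeps the inversion well-posed."""
    n = X.shape[0]
    return Y @ X.T @ np.linalg.inv(X @ X.T + eps * np.eye(n))

W = tikhonov_hebbian(X, Y, eps=1e-6)
print(np.max(np.abs(W @ X - Y)))  # small residual on the training pairs
```

The same formula applies to kernel construction for the Amari equation after discretizing the field, which is where the ill-posedness discussed in the abstract arises.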

Relevance:

30.00%

Publisher:

Abstract:

We explore the potential for making statistical decadal predictions of sea surface temperatures (SSTs) in a perfect model analysis, with a focus on the Atlantic basin. Various statistical methods (lagged correlations, Linear Inverse Modelling and Constructed Analogue) are found to have significant skill in predicting the internal variability of Atlantic SSTs for up to a decade ahead in control integrations of two different global climate models (GCMs), namely HadCM3 and HadGEM1. Statistical methods which consider non-local information tend to perform best, but the most successful method depends on the region considered, the GCM data used and the prediction lead time. However, the Constructed Analogue method tends to have the highest skill at longer lead times. Importantly, the regions of greatest prediction skill can be very different to regions identified as potentially predictable from variance-explained arguments. This finding suggests that significant local decadal variability is not necessarily a prerequisite for skillful decadal predictions, and that the statistical methods are capturing some of the dynamics of low-frequency SST evolution. In particular, using data from HadGEM1, significant skill at lead times of 6–10 years is found in the tropical North Atlantic, a region with relatively little decadal variability compared to interannual variability. This skill appears to come from reconstructing the SSTs in the far North Atlantic, suggesting that the more northern latitudes are optimal locations for SST observations to improve predictions. We additionally explore whether adding sub-surface temperature data improves these decadal statistical predictions, and find that, again, it depends on the region, prediction lead time and GCM data used. Overall, we argue that the estimated prediction skill motivates the further development of statistical decadal predictions of SSTs as a benchmark for current and future GCM-based decadal climate predictions.
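The Constructed Analogue step can be sketched compactly: express the current anomaly field as a (regularized) linear combination of past states, then apply the same weights to those states' successors. The toy library below and its deliberately linear "evolution" are invented so the forecast can be checked exactly.

```python
import numpy as np

rng = np.random.default_rng(2)

# Library of historical "SST anomaly" states and their states tau later.
library = rng.normal(size=(50, 8))      # 50 past states, 8 grid points
successors = library @ np.diag(np.linspace(0.5, 0.9, 8))  # toy evolution

def constructed_analogue(library, successors, current, ridge=1e-3):
    """Fit weights a with library.T @ a ≈ current (ridge-regularized
    least squares), then forecast by applying the same weights to the
    successor states."""
    L = library.T                        # (grid, n_states)
    a = np.linalg.solve(L.T @ L + ridge * np.eye(L.shape[1]), L.T @ current)
    return successors.T @ a

current = library[3]                     # a state the library contains
forecast = constructed_analogue(library, successors, current)
print(np.round(forecast - successors[3], 3))
```

Because the toy evolution is linear, the forecast recovers the known successor almost exactly; with real GCM output the weights instead pick out a blend of dynamically similar past states.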

Relevance:

30.00%

Publisher:

Abstract:

In this article a simple and effective controller design is introduced for Hammerstein systems that are identified from observational input/output data. The nonlinear static function in the Hammerstein system is modelled using a B-spline neural network. The controller is composed of the computed inverse of the B-spline-approximated nonlinear static function and a linear pole-assignment controller. The main contribution of this article is the inverse De Boor algorithm, which computes this inverse efficiently. Mathematical analysis is provided to prove the convergence of the proposed algorithm, and numerical examples are utilised to demonstrate the efficacy of the proposed approach.
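The role of the inversion step, solving y = f(u) for a monotonic static nonlinearity f, can be sketched with a generic Newton iteration. The function f below is an assumed saturating stand-in, not the paper's B-spline; in the paper, f and its derivative would be evaluated with the De Boor recursion.

```python
import numpy as np

# Stand-in for the identified static nonlinearity: any smooth, strictly
# increasing function with a known derivative (an assumption for the
# sketch, not the article's model).
f = lambda u: u + 0.3 * np.tanh(u)
df = lambda u: 1.0 + 0.3 / np.cosh(u) ** 2

def invert(y, f, df, u0=0.0, tol=1e-10, max_iter=50):
    """Newton iteration for f(u) = y; this plays the role of the
    inverse-of-the-nonlinearity block in front of the linear controller."""
    u = u0
    for _ in range(max_iter):
        step = (f(u) - y) / df(u)
        u -= step
        if abs(step) < tol:
            break
    return u

y_target = 1.7
u = invert(y_target, f, df)
print(u, f(u))  # f(u) matches y_target
```

With the nonlinearity cancelled this way, the remaining loop is linear and a standard pole-assignment design applies.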

Relevance:

30.00%

Publisher:

Abstract:

In this paper a new nonlinear digital baseband predistorter design is introduced based on direct learning, together with a new Wiener-system modelling approach for high power amplifiers (HPAs) based on the B-spline neural network. The contribution is twofold. Firstly, by assuming that the nonlinearity in the HPA depends mainly on the input signal amplitude, the complex-valued nonlinear static function is represented by two real-valued B-spline neural networks, one for the amplitude distortion and one for the phase shift. The Gauss-Newton algorithm is applied for the parameter estimation, in which the De Boor recursion is employed to calculate both the B-spline curve and its first-order derivatives. Secondly, we derive the predistorter algorithm by calculating the inverse of the complex-valued nonlinear static function of the B-spline neural network based Wiener models. The inverses of the amplitude and phase-shift distortions are then computed, and the distortion is compensated using the identified phase-shift model. Numerical examples are employed to demonstrate the efficacy of the proposed approaches.
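The amplitude/phase decomposition can be sketched as follows; the AM/AM and AM/PM curves below are invented stand-ins for the two real-valued B-spline networks, chosen only to show how a complex static nonlinearity is assembled from two real functions of the input amplitude.

```python
import numpy as np

# Assumed AM/AM (amplitude gain) and AM/PM (phase shift) curves; both
# depend only on the input amplitude, per the abstract's assumption.
am_am = lambda r: np.tanh(r)          # amplitude distortion
am_pm = lambda r: 0.2 * r ** 2        # phase shift in radians

def hpa(x):
    """Complex-valued static nonlinearity built from the two real curves:
    distort the amplitude, add a phase shift, keep the input phase."""
    r, phase = np.abs(x), np.angle(x)
    return am_am(r) * np.exp(1j * (phase + am_pm(r)))

x = 0.5 * np.exp(1j * 0.3)
y = hpa(x)
print(abs(y), np.angle(y) - 0.3)  # tanh(0.5) and the added phase shift
```

Identifying the two real curves separately is what lets the paper invert the amplitude part and subtract the phase part independently in the predistorter.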

Relevance:

30.00%

Publisher:

Abstract:

Optimal state estimation from given observations of a dynamical system by data assimilation is generally an ill-posed inverse problem. In order to solve the problem, a standard Tikhonov, or L2, regularization is used, based on certain statistical assumptions on the errors in the data. The regularization term constrains the estimate of the state to remain close to a prior estimate. In the presence of model error, this approach does not capture the initial state of the system accurately, as the initial state estimate is derived by minimizing the average error between the model predictions and the observations over a time window. Here we examine an alternative L1 regularization technique that has proved valuable in image processing. We show that for examples of flow with sharp fronts and shocks, the L1 regularization technique performs more accurately than standard L2 regularization.
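The qualitative difference between the two penalties can be sketched on a toy piecewise-constant signal (all sizes and noise levels are invented): penalizing the jumps of the estimate with L2 shrinks every jump uniformly, while the L1 penalty, applied here through its proximal operator (soft thresholding), zeroes the spurious jumps and keeps the sharp front.

```python
import numpy as np

rng = np.random.default_rng(3)

# Piecewise-constant "flow with a sharp front", observed with noise.
x_true = np.concatenate([np.zeros(50), np.ones(50)])
y = x_true + rng.normal(scale=0.1, size=100)

# Penalize the first differences (jumps) of the estimate.
d = np.diff(y)
lam = 0.3
d_l2 = d / (1.0 + lam)                                # ridge shrinkage
d_l1 = np.sign(d) * np.maximum(np.abs(d) - lam, 0.0)  # soft threshold

# Rebuild the signals from the regularized jumps.
x_l2 = np.concatenate([[y[0]], y[0] + np.cumsum(d_l2)])
x_l1 = np.concatenate([[y[0]], y[0] + np.cumsum(d_l1)])

# L1 keeps only a handful of jumps (the front among them); L2 keeps all 99.
print(np.count_nonzero(d_l1), np.count_nonzero(d_l2))
```

A full assimilation scheme minimizes a model-observation misfit plus such a penalty over a time window, but the edge-preserving behaviour of the L1 term is already visible in this one-dimensional sketch.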

Relevance:

30.00%

Publisher:

Abstract:

There is evidence that consumption of fish, especially oily fish, has substantial beneficial effects on health. In particular, an inverse relationship between oily fish intake and coronary heart disease incidence has been established. These beneficial effects are ascribed to fish-oil components, including long-chain ω-3 polyunsaturated fatty acids. On the other hand, oily fish also contains hazardous substances such as dioxins, PCBs and methylmercury. Soy consumption has been associated with both potential beneficial and adverse effects. The claimed benefits include reduced risk of cardiovascular disease, osteoporosis, and breast and prostate cancer, whereas potential adverse effects include impaired thyroid function, disruption of sex hormone levels, changes in reproductive function and increased breast cancer risk. These two natural foods highlight the need to consider both risks and benefits in order to establish the net health impact associated with the consumption of specific food products. Within the Sixth Framework Programme of the European Commission, the BRAFO project was funded to develop a framework that allows the quantitative comparison of human health risks and benefits in relation to foods and food compounds. This paper describes the application of the developed framework to two natural foods, farmed salmon and soy protein. We conclude that the BRAFO methodology is highly applicable to natural foods and will help benefit-risk managers in selecting appropriate dietary recommendations for the population.

Relevance:

30.00%

Publisher:

Abstract:

In this paper a support vector machine (SVM) approach for characterizing the feasible parameter set (FPS) in nonlinear set-membership estimation problems is presented. It iteratively solves a regression problem from which an approximation of the boundary of the FPS can be determined. To guarantee convergence to the boundary, the procedure includes a no-derivative line search, and for appropriate coverage of points on the FPS boundary it is suggested to start with a sequential box pavement procedure. The SVM approach is illustrated on a simple sine-and-exponential model with two parameters and on an agro-forestry simulation model.
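The feasible parameter set that the SVM procedure approximates can be sketched by brute force on a two-parameter sine-and-exponential model; the model form, error bound and grid below are assumptions for illustration, and the grid pavement here plays the role that the learned SVM boundary plays in the paper.

```python
import numpy as np

# Toy two-parameter model y(t) = a*sin(t) + exp(-b*t).  The FPS contains
# every (a, b) whose predictions stay within +/- eps of the observations.
t = np.linspace(0.0, 3.0, 20)
a_true, b_true, eps = 1.0, 0.5, 0.1
model = lambda a, b: a * np.sin(t) + np.exp(-b * t)
y_obs = model(a_true, b_true)   # noise-free "data" for the sketch

def in_fps(a, b):
    """Set-membership test: all residuals within the error bound."""
    return bool(np.all(np.abs(model(a, b) - y_obs) <= eps))

# Brute-force pavement of a parameter box; the SVM machinery replaces
# this grid with a learned approximation of the FPS boundary.
aa, bb = np.meshgrid(np.linspace(0.5, 1.5, 41), np.linspace(0.0, 1.0, 41))
mask = np.array([[in_fps(a, b) for a, b in zip(ra, rb)]
                 for ra, rb in zip(aa, bb)])
print(mask.sum(), in_fps(a_true, b_true))
```

The boundary of `mask` is exactly the surface the SVM regression is trained to represent, which scales far better than grid pavement as the number of parameters grows.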

Relevance:

30.00%

Publisher:

Abstract:

This contribution introduces a new digital predistorter to compensate for the serious distortions caused by memory high power amplifiers (HPAs) which exhibit output saturation characteristics. The proposed design is based on direct learning using a data-driven B-spline Wiener system modelling approach. The nonlinear HPA with memory is first identified as a B-spline neural network model using the Gauss-Newton algorithm, which incorporates the efficient De Boor algorithm with both the B-spline curve and first-derivative recursions. The estimated Wiener HPA model is then used to design a Hammerstein predistorter. In particular, the inverse of the amplitude distortion of the HPA's static nonlinearity can be calculated effectively using the Newton-Raphson formula based on the inverse De Boor algorithm. A major advantage of this approach is that both the Wiener HPA identification and the Hammerstein predistorter inversion can be carried out efficiently and accurately. Simulation results are presented to demonstrate the effectiveness of this novel digital predistorter design.
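The predistortion principle, placing the inverse of the static nonlinearity in front of the amplifier so the cascade is (nearly) linear, can be sketched with a table-based inverse. The saturating curve is an assumed stand-in for the identified B-spline model, and `np.interp` replaces the paper's Newton-Raphson / inverse De Boor step.

```python
import numpy as np

# Saturating static nonlinearity standing in for the memoryless part of
# the HPA (an assumed curve, not the identified B-spline model).
hpa = lambda u: np.tanh(u)

# Table-based predistorter: sample the nonlinearity on a grid and invert
# it by interpolation on the (output, input) pairs.
grid = np.linspace(0.0, 3.0, 1000)
predistort = lambda y: np.interp(y, hpa(grid), grid)

# The cascade predistorter -> HPA should be close to the identity on the
# reachable output range (below saturation).
targets = np.linspace(0.05, 0.9, 5)
out = hpa(predistort(targets))
print(np.max(np.abs(out - targets)))
```

Saturation is visible in the sketch too: outputs above `tanh(3)` are unreachable, which is why the paper's design must respect the HPA's output saturation characteristics.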

Relevance:

30.00%

Publisher:

Abstract:

Forecasting wind power is an important part of a successful integration of wind power into the power grid. Forecasts with lead times longer than 6 h are generally made by using statistical methods to post-process forecasts from numerical weather prediction systems. Two major problems that complicate this approach are the non-linear relationship between wind speed and power production and the limited range of power production between zero and nominal power of the turbine. In practice, these problems are often tackled by using non-linear non-parametric regression models. However, such an approach ignores valuable and readily available information: the power curve of the turbine's manufacturer. Much of the non-linearity can be directly accounted for by transforming the observed power production into wind speed via the inverse power curve so that simpler linear regression models can be used. Furthermore, the fact that the transformed power production has a limited range can be taken care of by employing censored regression models. In this study, we evaluate quantile forecasts from a range of methods: (i) using parametric and non-parametric models, (ii) with and without the proposed inverse power curve transformation and (iii) with and without censoring. The results show that with our inverse (power-to-wind) transformation, simpler linear regression models with censoring perform equally or better than non-linear models with or without the frequently used wind-to-power transformation.
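A minimal sketch of the power-to-wind transformation using an invented manufacturer power curve (the speed/power values below are illustrative, not any real turbine's): the inverse is taken by interpolating on the monotone part of the curve, and the flat regions at zero and nominal power, where the inverse is not unique, are exactly where the censored treatment of the transformed variable becomes necessary.

```python
import numpy as np

# Illustrative power curve: wind speed (m/s) vs normalized power,
# monotone between cut-in and rated speed.
wind = np.array([3.0, 5.0, 7.0, 9.0, 11.0, 13.0])
power = np.array([0.0, 0.08, 0.30, 0.65, 0.95, 1.00])

def power_to_wind(p, wind, power):
    """Inverse power curve: interpolate on the (power, wind) pairs.
    Only valid on the strictly increasing part of the curve; at 0 and
    at nominal power the inverse is set-valued, hence censoring."""
    return np.interp(p, power, wind)

obs_power = np.array([0.15, 0.5, 0.9])
print(power_to_wind(obs_power, wind, power))
```

After this transformation the forecasting problem is posed in wind-speed space, where a censored linear regression on the NWP forecasts replaces the non-linear non-parametric models.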