61 results for Inverse Problem in Optics


Relevance: 100.00%

Publisher:

Abstract:

We consider the numerical treatment of second kind integral equations on the real line of the form ϕ(s) = ψ(s) + ∫_{-∞}^{+∞} κ(s − t) z(t) ϕ(t) dt, s ∈ ℝ (abbreviated ϕ = ψ + K_z ϕ), in which κ ∈ L_1(ℝ), z ∈ L_∞(ℝ) and ψ ∈ BC(ℝ), the space of bounded continuous functions on ℝ, are assumed known and ϕ ∈ BC(ℝ) is to be determined. We first derive sharp error estimates for the finite section approximation (reducing the range of integration to [-A, A]) via bounds on (1 − K_z)^{-1} as an operator on spaces of weighted continuous functions. Numerical solution by a simple discrete collocation method on a uniform grid on ℝ is then analysed: in the case when z is compactly supported this leads to a coefficient matrix which allows a rapid matrix-vector multiply via the FFT. To utilise this possibility we propose a modified two-grid iteration, a feature of which is that the coarse grid matrix is approximated by a banded matrix, and analyse convergence and computational cost. In cases where z is not compactly supported a combined finite section and two-grid algorithm can be applied and we extend the analysis to this case. As an application we consider acoustic scattering in the half-plane with a Robin or impedance boundary condition, which we formulate as a boundary integral equation of the class studied. Our final result is that if z (related to the boundary impedance in the application) takes values in an appropriate compact subset Q of the complex plane, then the difference between ϕ(s) and its finite section approximation computed numerically using the iterative scheme proposed is ≤ C_1 [kh log(1/(kh)) + (1 − Θ)^{-1/2} (kA)^{-1/2}] in the interval [-ΘA, ΘA] (Θ < 1) for kh sufficiently small, where k is the wavenumber and h the grid spacing. Moreover this numerical approximation can be computed in ≤ C_2 N log N operations, where N = 2A/h is the number of degrees of freedom. The values of the constants C_1 and C_2 depend only on the set Q and not on the wavenumber k or the support of z.
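
To illustrate the kind of discretization described, here is a minimal sketch (not the paper's modified two-grid scheme) of a collocation approximation on a uniform grid in which the convolution kernel κ(s − t) gives a Toeplitz coefficient matrix, so the matrix-vector product needed by an iterative solver costs O(N log N) via circulant embedding and the FFT. The kernel, the compactly supported z and the right-hand side ψ below are placeholder choices, not taken from the paper, and a generic Krylov solver stands in for the banded coarse-grid two-grid iteration.

```python
import numpy as np
from numpy.fft import fft, ifft
from scipy.sparse.linalg import LinearOperator, gmres

# --- placeholder problem data (illustrative only, not from the paper) ---
A, h = 50.0, 0.05                        # finite section [-A, A] and grid spacing h
s = np.arange(-A, A + h / 2, h)          # uniform collocation grid
n = s.size
kappa = lambda t: 0.5 * np.exp(-np.abs(t))   # placeholder kernel in L_1(R)
z = np.where(np.abs(s) < 5.0, 0.5, 0.0)      # compactly supported z
psi = np.exp(-s**2)                          # given right-hand side psi

# Collocation matrix K[j, k] = h * kappa(s_j - s_k) is Toeplitz: store one column/row
col = h * kappa(s - s[0])                # kappa(j * h) for j >= 0
row = h * kappa(s[0] - s)                # kappa(-k * h) for k >= 0
emb_hat = fft(np.concatenate([col, [0.0], row[:0:-1]]))   # circulant embedding, length 2n

def apply_K_z(phi):
    """phi -> K_z phi = h * sum_k kappa(s_j - s_k) z(s_k) phi_k in O(N log N) via FFT."""
    padded = np.concatenate([z * phi, np.zeros(n)])
    return np.real(ifft(emb_hat * fft(padded))[:n])

# Solve (I - K_z) phi = psi with a generic Krylov iteration; each matvec uses the FFT
op = LinearOperator((n, n), matvec=lambda v: v - apply_K_z(v), dtype=float)
phi, info = gmres(op, psi)
```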

Relevance: 100.00%

Publisher:

Abstract:

The question of what explains variation in expenditures on Active Labour Market Programs (ALMPs) has attracted significant scholarship in recent years. Important insights have been gained with respect to the role of employers, unions and dual labour markets, openness, and partisanship. However, there remain significant disagreements with respect to key explanatory variables such as the role of unions or the impact of partisanship. Qualitative studies have shown that there are both good conceptual reasons and historical evidence that different ALMPs are driven by different dynamics. There is little reason to believe that vastly different programs such as training and employment subsidies are driven by similar structural, interest group or indeed partisan dynamics. The question is therefore whether different ALMPs have the same correlation with the key explanatory variables identified in the literature. Using regression analysis, this paper shows that the explanatory variables identified in the literature relate differently to distinct ALMPs. This refinement adds significant analytical value and shows that disagreements are at least partly due to a dependent variable problem of ‘over-aggregation’.

Relevance: 100.00%

Publisher:

Abstract:

Non-dispersive infrared (NDIR) sensing is proposed for monitoring air pollutants emitted by ship engines. Careful optical filtering overcomes the challenge of optically detecting NO2 in humid exhaust gas, despite its spectroscopic overlap with the water vapour band.

Relevance: 100.00%

Publisher:

Abstract:

We establish a general framework for a class of multidimensional stochastic processes over [0,1] under which, with probability one, the signature (the collection of iterated path integrals in the sense of rough paths) is well-defined and determines the sample paths of the process up to reparametrization. In particular, by using the Malliavin calculus we show that our method applies to a class of Gaussian processes including fractional Brownian motion with Hurst parameter H > 1/4, the Ornstein–Uhlenbeck process and the Brownian bridge.
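
As a concrete illustration of what the first two levels of the signature are, the sketch below (function names and the example path are illustrative, not from the paper) computes the level-1 and level-2 iterated integrals of a piecewise-linear sample path using Chen's identity; the antisymmetric part of the level-2 term is the Lévy area.

```python
import numpy as np

def truncated_signature(path):
    """Level-1 and level-2 iterated integrals of a piecewise-linear path,
    given as an (m, d) array of sample points, accumulated via Chen's identity."""
    S1 = np.zeros(path.shape[1])                 # int dX^i
    S2 = np.zeros((path.shape[1],) * 2)          # int_{u<v} dX^i_u dX^j_v
    for delta in np.diff(path, axis=0):          # one linear segment at a time
        S2 += np.outer(S1, delta) + 0.5 * np.outer(delta, delta)
        S1 += delta
    return S1, S2

# Example: a sampled planar loop; the antisymmetric part of S2 is the Levy area
t = np.linspace(0.0, 1.0, 200)
path = np.column_stack([np.cos(2 * np.pi * t), np.sin(2 * np.pi * t)])
S1, S2 = truncated_signature(path)
levy_area = 0.5 * (S2[0, 1] - S2[1, 0])          # ~ pi, the area enclosed by the loop
```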

Relevance: 100.00%

Publisher:

Abstract:

Satellite-based top-of-atmosphere (TOA) and surface radiation budget observations are combined with mass-corrected vertically integrated atmospheric energy divergence and tendency from reanalysis to infer the regional distribution of the TOA, atmospheric and surface energy budget terms over the globe. Hemispheric contrasts in the energy budget terms are used to determine the radiative and combined sensible and latent heat contributions to the cross-equatorial heat transports in the atmosphere (AHT_EQ) and ocean (OHT_EQ). The contrast in net atmospheric radiation implies an AHT_EQ from the northern hemisphere (NH) to the southern hemisphere (SH) (0.75 PW), while the hemispheric difference in sensible and latent heat implies an AHT_EQ in the opposite direction (0.51 PW), resulting in a net NH to SH AHT_EQ (0.24 PW). At the surface, the hemispheric contrast in the radiative component (0.95 PW) dominates, implying a 0.44 PW SH to NH OHT_EQ. Coupled Model Intercomparison Project Phase 5 (CMIP5) models with excessive net downward surface radiation and surface-to-atmosphere sensible and latent heat transport in the SH relative to the NH exhibit anomalous northward AHT_EQ and overestimate SH tropical precipitation. The hemispheric bias in net surface radiative flux is due to too much longwave surface radiative cooling in the NH tropics in both clear and all-sky conditions and excessive shortwave surface radiation in the SH subtropics and extratropics due to an underestimation of reflection by clouds.
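
The quoted transports combine by simple addition; the following back-of-the-envelope bookkeeping is one reading of the abstract's sign conventions (northward taken as positive, with the same 0.51 PW turbulent-flux contrast acting at the surface) and is only meant to show how the stated numbers fit together.

```python
# Bookkeeping sketch of the quoted cross-equatorial transports, in PW
# (positive = northward; this decomposition is my reading of the abstract).
aht_radiative = -0.75        # net atmospheric radiation contribution: NH -> SH
aht_turbulent = +0.51        # sensible + latent heat contribution: SH -> NH
aht_eq = aht_radiative + aht_turbulent   # -0.24 PW net, i.e. NH -> SH

surf_radiative = +0.95       # hemispheric contrast in the surface radiative component
surf_turbulent = -0.51       # surface-to-atmosphere sensible + latent heat contrast
oht_eq = surf_radiative + surf_turbulent  # +0.44 PW, i.e. SH -> NH
```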

Relevance: 100.00%

Publisher:

Abstract:

The weak-constraint inverse for nonlinear dynamical models is discussed and derived in terms of a probabilistic formulation. The well-known result that for Gaussian error statistics the minimum of the weak-constraint inverse is equal to the maximum-likelihood estimate is rederived. Then several methods based on ensemble statistics that can be used to find the smoother (as opposed to the filter) solution are introduced and compared to traditional methods. A strong point of the new methods is that they avoid the integration of adjoint equations, which is a complex task for real oceanographic or atmospheric applications. They also avoid iterative searches in a Hilbert space, and error estimates can be obtained without much additional computational effort. The feasibility of the new methods is illustrated in a two-layer quasigeostrophic model.
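
A minimal sketch of the kind of ensemble-statistics update alluded to here, written as a stochastic (perturbed-observation) ensemble-smoother analysis acting on whole trajectories; the linear observation operator H and this specific update form are assumptions for illustration, not the exact schemes compared in the paper. Because the gain is built from ensemble covariances alone, no adjoint model integration is required.

```python
import numpy as np

rng = np.random.default_rng(0)

def ensemble_smoother_update(X, y, H, R):
    """One stochastic ensemble-smoother analysis step.

    X : (n, N) ensemble of whole state trajectories, one member per column
    y : (p,)   observation vector,  H : (p, n) linear observation operator
    R : (p, p) observation error covariance
    """
    n, N = X.shape
    Xp = X - X.mean(axis=1, keepdims=True)          # ensemble perturbations
    Yp = H @ Xp                                     # perturbations in observation space
    # Kalman-type gain built purely from ensemble statistics (no adjoint needed)
    K = (Xp @ Yp.T) @ np.linalg.inv(Yp @ Yp.T + (N - 1) * R)
    # perturbed observations, one realisation per ensemble member
    Y = y[:, None] + rng.multivariate_normal(np.zeros(y.size), R, size=N).T
    return X + K @ (Y - H @ X)
```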

Relevance: 100.00%

Publisher:

Abstract:

New ways of combining observations with numerical models are discussed in which the size of the state space can be very large and the model can be highly nonlinear. The observations of the system can also be related to the model variables in highly nonlinear ways, making this data-assimilation (or inverse) problem highly nonlinear. First we discuss the connection between data assimilation and inverse problems, including regularization. We explore the choice of proposal density in a Particle Filter and show how the ‘curse of dimensionality’ might be beaten. In the standard Particle Filter, ensembles of model runs are propagated forward in time until observations are encountered, rendering it a pure Monte Carlo method. In large-dimensional systems this is very inefficient and very large numbers of model runs are needed to solve the data-assimilation problem realistically. In our approach we steer all model runs towards the observations, resulting in a much more efficient method. By further ‘ensuring almost equal weight’ we avoid performing model runs that are useless in the end. Results are shown for the 40- and 1000-dimensional Lorenz 1995 model.
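
A minimal sketch of one assimilation step of a particle filter whose proposal density steers each particle towards the observation, with importance weights correcting for the modified proposal. The simple relaxation proposal, the parameter tau and the function names are illustrative assumptions, not the authors' equal-weight scheme.

```python
import numpy as np
from scipy.stats import multivariate_normal as mvn

rng = np.random.default_rng(1)

def steered_pf_step(particles, weights, model, y, H, Q, R, tau=0.5):
    """One particle-filter step with a proposal that steers particles towards
    the observation y; weights correct for drawing from the modified proposal."""
    new_particles = np.empty_like(particles)
    new_weights = np.empty_like(weights)
    for i, x in enumerate(particles):
        xf = model(x)                                        # free model forecast
        # proposal mean: forecast relaxed towards the observation (illustrative choice)
        mean_q = xf + tau * np.linalg.pinv(H) @ (y - H @ xf)
        xi = rng.multivariate_normal(mean_q, Q)              # draw from the proposal
        new_particles[i] = xi
        # weight = prior transition * likelihood / proposal, evaluated at the draw
        w = (mvn.pdf(xi, mean=xf, cov=Q) * mvn.pdf(y, mean=H @ xi, cov=R)
             / mvn.pdf(xi, mean=mean_q, cov=Q))
        new_weights[i] = weights[i] * w
    new_weights /= new_weights.sum()                         # normalise
    return new_particles, new_weights
```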

Relevance: 100.00%

Publisher:

Abstract:

We show that the four-dimensional variational data assimilation method (4DVar) can be interpreted as a form of Tikhonov regularization, a very familiar method for solving ill-posed inverse problems. It is known from image restoration problems that L1-norm penalty regularization recovers sharp edges in the image more accurately than Tikhonov, or L2-norm, penalty regularization. We apply this idea from stationary inverse problems to 4DVar, a dynamical inverse problem, and give examples for an L1-norm penalty approach and a mixed total variation (TV) L1–L2-norm penalty approach. For problems with model error where sharp fronts are present and the background and observation error covariances are known, the mixed TV L1–L2-norm penalty performs better than either the L1-norm method or the strong constraint 4DVar (L2-norm) method. A strength of the mixed TV L1–L2-norm regularization is that, in the case where a simplified form of the background error covariance matrix is used, it produces a much more accurate analysis than 4DVar. The method thus has the potential in numerical weather prediction to overcome operational problems with poorly tuned background error covariance matrices.
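
The contrast between the penalties can be written down directly. Below is a minimal sketch of a 4DVar-type cost with the usual quadratic (Tikhonov/L2) terms, next to a mixed TV L1–L2 variant of the kind described; here H stands for the generalized observation operator over the assimilation window, and the weighting parameters and the placement of the L1 and TV terms on the background increment are illustrative assumptions.

```python
import numpy as np

def j_4dvar_l2(x, xb, Binv, y, H, Rinv):
    """Strong-constraint 4DVar cost: quadratic (Tikhonov / L2-norm) penalties."""
    db, dy = x - xb, y - H @ x
    return 0.5 * db @ Binv @ db + 0.5 * dy @ Rinv @ dy

def j_4dvar_tv_l1(x, xb, Binv, y, H, Rinv, lam_l1=1.0, lam_tv=1.0):
    """Mixed TV L1-L2 variant: additional L1 and discrete total-variation penalties
    on the background increment, which preserve sharp fronts better than L2 alone."""
    db, dy = x - xb, y - H @ x
    return (0.5 * db @ Binv @ db + 0.5 * dy @ Rinv @ dy
            + lam_l1 * np.sum(np.abs(db))               # L1-norm penalty
            + lam_tv * np.sum(np.abs(np.diff(db))))     # total variation of the increment
```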

Relevance: 100.00%

Publisher:

Abstract:

Zinc deficiency is the most ubiquitous micronutrient deficiency problem in world crops. Zinc is essential for both plants and animals because it is a structural constituent and regulatory co-factor in enzymes and proteins involved in many biochemical pathways. Millions of hectares of cropland are affected by Zn deficiency and approximately one-third of the human population suffers from an inadequate intake of Zn. The main soil factors affecting the availability of Zn to plants are low total Zn contents, high pH, high calcite and organic matter contents and high concentrations of Na, Ca, Mg, bicarbonate and phosphate in the soil solution or in labile forms. Maize is the most susceptible cereal crop, but wheat grown on calcareous soils and lowland rice on flooded soils are also highly prone to Zn deficiency. Zinc fertilizers are used in the prevention of Zn deficiency and in the biofortification of cereal grains.

Relevance: 100.00%

Publisher:

Abstract:

The ultimate criterion of success for interactive expert systems is that they will be used, and used to effect, by individuals other than the system developers. A key ingredient of success in most systems is involving users in the specification and development of systems as they are being built. However, until recently, system designers have paid little attention to ascertaining user needs and to developing systems with corresponding functionality and appropriate interfaces to match those requirements. Although the situation is beginning to change, many developers do not know how to go about involving users, or else tackle the problem in an inadequate way. This paper discusses the need for user involvement and considers why many developers are still not involving users in an optimal way. It looks at the different ways in which users can be involved in the development process and describes how to select appropriate techniques and methods for studying users. Finally, it discusses some of the problems inherent in involving users in expert system development, and recommends an approach which incorporates both ethnographic analysis and formal user testing.

Relevance: 100.00%

Publisher:

Abstract:

The “butterfly effect” is a popularly known paradigm: it is commonly said that when a butterfly flaps its wings in Brazil, it may cause a tornado in Texas. This essentially describes how weather forecasts can be extremely sensitive to small changes in the given atmospheric data, or initial conditions, used in computer model simulations. In 1961 Edward Lorenz found, when running a weather model, that small changes in the initial conditions given to the model can, over time, lead to entirely different forecasts (Lorenz, 1963). This discovery highlights one of the major challenges in modern weather forecasting: providing the computer model with the most accurately specified initial conditions possible. A process known as data assimilation seeks to minimize the errors in the given initial conditions and was described by Bjerknes in 1911 as “the ultimate problem in meteorology” (Bjerknes, 1911).
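
A minimal sketch reproducing the sensitivity Lorenz observed: integrating the Lorenz (1963) equations from two initial conditions that differ by 10^-8 in one component and watching the forecasts diverge. The parameter values are the standard ones and are not taken from this text.

```python
import numpy as np
from scipy.integrate import solve_ivp

def lorenz63(t, state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """The Lorenz (1963) equations with the standard parameter values."""
    x, y, z = state
    return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

# Two forecasts from initial conditions differing by 1e-8 in a single component
t_eval = np.linspace(0.0, 30.0, 3000)
run_a = solve_ivp(lorenz63, (0.0, 30.0), [1.0, 1.0, 1.0], t_eval=t_eval)
run_b = solve_ivp(lorenz63, (0.0, 30.0), [1.0 + 1e-8, 1.0, 1.0], t_eval=t_eval)

# The trajectories track each other at first, then diverge to O(1) differences
separation = np.linalg.norm(run_a.y - run_b.y, axis=0)
```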

Relevance: 100.00%

Publisher:

Abstract:

A new heuristic for the Steiner Minimal Tree problem is presented here. The method described is based on the detection of particular sets of nodes in networks, the “Hot Spot” sets, which are used to obtain better approximations of the optimal solutions. An algorithm is also proposed which is capable of improving the solutions obtained by classical heuristics, by means of a stirring process of the nodes in solution trees. Classical heuristics and an enumerative method are used as comparison terms in the experimental analysis, which demonstrates the effectiveness of the heuristic discussed in this paper.
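
For context, one widely used classical heuristic of the kind such papers compare against is the distance-network (Kou–Markowsky–Berman style) construction. The sketch below is a simplified version of that classical method, using networkx for shortest paths and spanning trees; it is not the “Hot Spot” heuristic proposed in the paper, and the function name is illustrative.

```python
import itertools
import networkx as nx

def distance_network_heuristic(G, terminals):
    """Classical distance-network Steiner heuristic: metric closure on the
    terminals, MST of the closure, expansion into shortest paths, MST of the
    resulting subgraph, then pruning of non-terminal leaves."""
    closure = nx.Graph()
    for u, v in itertools.combinations(terminals, 2):
        closure.add_edge(u, v, weight=nx.shortest_path_length(G, u, v, weight="weight"))
    T = nx.Graph()
    for u, v in nx.minimum_spanning_tree(closure).edges():
        path = nx.shortest_path(G, u, v, weight="weight")
        for a, b in zip(path, path[1:]):
            T.add_edge(a, b, weight=G[a][b]["weight"])
    T = nx.minimum_spanning_tree(T)
    leaves = [v for v in T if T.degree(v) == 1 and v not in terminals]
    while leaves:                                     # prune non-terminal leaves
        T.remove_nodes_from(leaves)
        leaves = [v for v in T if T.degree(v) == 1 and v not in terminals]
    return T
```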

Relevance: 100.00%

Publisher:

Abstract:

We consider boundary value problems for the N-wave interaction equations in one and two space dimensions, posed for x ≥ 0 and x, y ≥ 0, respectively. Following the recent work of Fokas, we develop an inverse scattering formalism to solve these problems by considering the simultaneous spectral analysis of the two ordinary differential equations in the associated Lax pair. The solution of the boundary value problems is obtained through the solution of a local Riemann–Hilbert problem in the one-dimensional case, and a nonlocal Riemann–Hilbert problem in the two-dimensional case.