939 results for "Variational inequality"
Abstract:
This paper presents a dynamic model to study how different levels of information about the root determinants of wealth (luck versus effort) can affect inequality and intergenerational mobility through societal beliefs, individual choices and redistributive policies. To my knowledge, this is the first dynamic model in which skills are stochastic and both beliefs and voted redistribution are determined endogenously. The model is able to explain a number of empirical facts. A large body of empirical evidence shows that differences in political support for redistribution appear to reflect differences in social perceptions regarding the determinants of individual wealth and the underlying sources of income inequality. Moreover, beliefs about the determinants of wealth affect individual choices of effort, and therefore shape inequality and mobility both through effort choices and through redistributive policies. The model generates multiple equilibria (US- versus Europe-type) which may account for the observed features not only in terms of societal beliefs and redistribution but also in terms of perceived versus real mobility and inequality.
Abstract:
This paper describes the implementation of a 3D variational (3D-Var) data assimilation scheme for a morphodynamic model applied to Morecambe Bay, UK. A simple decoupled hydrodynamic and sediment transport model is combined with a data assimilation scheme to investigate the ability of such methods to improve the accuracy of the predicted bathymetry. The inverse forecast error covariance matrix is modelled using a Laplacian approximation, which is calibrated for the required length-scale parameter. Calibration is also performed for the Soulsby–van Rijn sediment transport equations. The data used for assimilation comprise waterlines derived from SAR imagery covering the entire period of the model run, and swath bathymetry data collected by a ship-borne survey for one date towards the end of the model run. A LiDAR survey of the entire bay carried out in November 2005 is used for validation. The comparison of the predictive ability of the model alone with the model-forecast-assimilation system demonstrates that using data assimilation significantly improves the forecast skill. An investigation of the assimilation of the swath bathymetry as well as the waterlines demonstrates that the overall improvement is initially large, but decreases over time as the bathymetry evolves away from that observed by the survey. Combining the calibration runs into a pseudo-ensemble provides a higher skill score than a single optimized model run. A brief comparison of the Optimal Interpolation assimilation method with the 3D-Var method shows that the two schemes give similar results.
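As an illustrative sketch only (the toy covariances, observation operator and values below are assumptions for the example, not the paper's Laplacian-based formulation), a linear 3D-Var analysis can be computed in closed form:

```python
import numpy as np

# Toy 3D-Var analysis: minimize
#   J(x) = (x - xb)^T B^-1 (x - xb) + (y - H x)^T R^-1 (y - H x)
# For a linear observation operator H the minimizer has the closed form
#   xa = xb + B H^T (H B H^T + R)^-1 (y - H xb)

def analysis(xb, B, H, R, y):
    """Return the 3D-Var analysis state for linear H."""
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)  # gain matrix
    return xb + K @ (y - H @ xb)

xb = np.array([1.0, 2.0, 3.0])    # background state (toy bathymetry values)
B = 0.5 * np.eye(3)               # background error covariance (illustrative)
H = np.array([[1.0, 0.0, 0.0]])   # observe only the first grid point
R = np.array([[0.1]])             # observation error variance (illustrative)
y = np.array([1.6])               # a single observed value

xa = analysis(xb, B, H, R, y)     # analysis lies between background and obs
```

Because R is smaller than the background variance, the analysis is drawn most of the way from the background value towards the observation, while the unobserved grid points are unchanged (B is diagonal in this toy setup).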
Abstract:
Liquid clouds play a profound role in the global radiation budget but it is difficult to remotely retrieve their vertical profile. Ordinary narrow field-of-view (FOV) lidars receive a strong return from such clouds but the information is limited to the first few optical depths. Wide-angle multiple-FOV lidars can isolate radiation scattered multiple times before returning to the instrument, often penetrating much deeper into the cloud than the singly-scattered signal. These returns potentially contain information on the vertical profile of the extinction coefficient, but are challenging to interpret due to the lack of a fast radiative transfer model for simulating them. This paper describes a variational algorithm that incorporates a fast forward model based on the time-dependent two-stream approximation, and its adjoint. Application of the algorithm to simulated data from a hypothetical airborne three-FOV lidar with a maximum footprint width of 600 m suggests that this approach should be able to retrieve the extinction structure down to an optical depth of around 6, and total optical depth up to at least 35, depending on the maximum lidar FOV. The convergence behavior of Gauss-Newton and quasi-Newton optimization schemes is compared. We then present results from an application of the algorithm to observations of stratocumulus by the 8-FOV airborne “THOR” lidar. It is demonstrated how the averaging kernel can be used to diagnose the effective vertical resolution of the retrieved profile, and therefore the depth to which information on the vertical structure can be recovered. This work enables returns from spaceborne lidar and radar subject to multiple scattering to be exploited more rigorously than previously possible.
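As a hedged illustration of the Gauss-Newton scheme whose convergence is compared above (the toy forward model F, its Jacobian and the data are invented for the example, not the two-stream lidar model):

```python
import numpy as np

# Minimal Gauss-Newton iteration for a nonlinear least-squares retrieval
#   min_x 0.5 * || y - F(x) ||^2
# Each step solves the normal equations J^T J dx = J^T r, where r is the
# residual and J the Jacobian of F at the current iterate.

def gauss_newton(F, jac, x0, y, n_iter=20):
    x = x0.copy()
    for _ in range(n_iter):
        r = y - F(x)                            # residual
        J = jac(x)                              # Jacobian of F at x
        dx = np.linalg.solve(J.T @ J, J.T @ r)  # Gauss-Newton step
        x = x + dx
    return x

# Toy forward model F(x) = (x0^2, x0*x1) with analytic Jacobian
F = lambda x: np.array([x[0] ** 2, x[0] * x[1]])
jac = lambda x: np.array([[2 * x[0], 0.0], [x[1], x[0]]])

y = np.array([4.0, 6.0])          # data consistent with x = (2, 3)
x = gauss_newton(F, jac, np.array([1.0, 1.0]), y)
```

On this zero-residual toy problem the iteration converges rapidly; in the retrieval setting each step additionally carries the background and observation error weightings.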
Abstract:
This paper investigates the often neglected payoff to investments in the health of girls and women in terms of next-generation outcomes. It examines the intergenerational persistence of health across time and region, as well as across the distribution of maternal health. It uses comparable microdata on as many as 2.24 million children born to about 0.6 million mothers in 38 developing countries over the 31-year period 1970–2000. Mother's health is indicated by her height, BMI and anemia status; child health is indicated by mortality risk and anthropometric failure. We find a positive relationship between maternal and child health across indicators and highlight non-linearities in these relationships. The results suggest that both the contemporary and the childhood health of the mother matter, and that the benefits to the next generation are likely to be persistent. Averaging across the sample, persistence shows a considerable decline over time. Disaggregation shows that the decline is significant only in Latin America; persistence has remained largely constant in Asia and has risen in Africa. The paper provides the first cross-country estimates of the intergenerational persistence in health and the first estimates of trends.
Abstract:
This paper documents the extent of inequality of educational opportunity in India over the period 1983–2004 using National Sample Surveys. We build on recent developments in the literature that have operationalized concepts from the theory of inequality of opportunity and construct several indices of inequality of educational opportunity for an adult sample. Kerala stands out as the least opportunity-unequal state. Rajasthan, Gujarat, and Uttar Pradesh experienced large falls in the inequality-of-opportunity ranking. By contrast, West Bengal and Orissa made significant progress in reducing inequality of opportunity. We also examine the links between progress toward equality of opportunity and a selection of pro-poor policies.
Abstract:
Variational data assimilation in continuous time is revisited. The central techniques applied in this paper are in part adopted from the theory of optimal nonlinear control. Alternatively, the investigated approach can be considered a continuous-time generalization of what is known as weakly constrained four-dimensional variational assimilation (4D-Var) in the geosciences. The technique makes it possible to assimilate trajectories in the case of partial observations and in the presence of model error. Several mathematical aspects of the approach are studied. Computationally, it amounts to solving a two-point boundary value problem. For imperfect models, the trade-off between small dynamical error (i.e. the trajectory obeys the model dynamics) and small observational error (i.e. the trajectory closely follows the observations) is investigated. This trade-off turns out to be trivial if the model is perfect. However, even in this situation, allowing for minute deviations from the perfect model is shown to have positive effects, namely to regularize the problem. The presented formalism is dynamical in character: no statistical assumptions on dynamical or observational noise are imposed.
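A hedged sketch of the trade-off described above, in generic notation (the symbols are illustrative, not necessarily the paper's): the weak-constraint cost functional penalizes dynamical and observational error jointly,

```latex
J[x] \;=\; \int_0^T \big\lVert \dot{x}(t) - f\big(x(t)\big) \big\rVert_{Q^{-1}}^2 \, dt
\;+\; \int_0^T \big\lVert y(t) - h\big(x(t)\big) \big\rVert_{R^{-1}}^2 \, dt ,
```

where $f$ is the model dynamics, $h$ the observation operator, and $Q^{-1}$, $R^{-1}$ weight deviations from the dynamics and misfit to the observations respectively. Shrinking $Q$ recovers the strong-constraint (perfect-model) limit, while the stationarity conditions of such a functional lead to the two-point boundary value problem mentioned in the abstract.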
Abstract:
We show that the four-dimensional variational data assimilation method (4DVar) can be interpreted as a form of Tikhonov regularization, a very familiar method for solving ill-posed inverse problems. It is known from image restoration problems that L1-norm penalty regularization recovers sharp edges in the image more accurately than Tikhonov, or L2-norm, penalty regularization. We apply this idea from stationary inverse problems to 4DVar, a dynamical inverse problem, and give examples for an L1-norm penalty approach and a mixed total variation (TV) L1–L2-norm penalty approach. For problems with model error where sharp fronts are present and the background and observation error covariances are known, the mixed TV L1–L2-norm penalty performs better than either the L1-norm method or the strong constraint 4DVar (L2-norm) method. A strength of the mixed TV L1–L2-norm regularization is that, when a simplified form of the background error covariance matrix is used, it produces a much more accurate analysis than 4DVar. The method thus has the potential in numerical weather prediction to overcome operational problems with poorly tuned background error covariance matrices.
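In generic notation (an illustrative sketch; the symbols $\lambda$, $D$ and $\delta x$ are assumptions, not the paper's exact formulation), the three regularization terms being compared penalize the background departure $\delta x = x - x_b$ as follows:

```latex
\begin{aligned}
\text{Tikhonov / strong-constraint 4DVar:} \quad & \lambda \,\lVert \delta x \rVert_2^2 ,\\
\text{L1-norm penalty:} \quad & \lambda \,\lVert \delta x \rVert_1 ,\\
\text{mixed TV L1--L2 penalty:} \quad & \lambda_1 \,\lVert D\,\delta x \rVert_1 + \lambda_2 \,\lVert \delta x \rVert_2^2 ,
\end{aligned}
```

where $D$ is a discrete gradient operator, so the total-variation term $\lVert D\,\delta x \rVert_1$ tolerates sharp fronts in the analysis while the accompanying L2 term keeps the problem well conditioned.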
Abstract:
The formulation and performance of the Met Office visibility analysis and prediction system are described. The visibility diagnostic within the limited-area Unified Model is a function of humidity and a prognostic aerosol content. The aerosol model includes advection, industrial and general urban sources, plus boundary-layer mixing and removal by rain. The assimilation is a three-dimensional variational scheme in which the visibility observation operator is a highly nonlinear function of humidity, aerosol and temperature. A quality control scheme for visibility data is included. Visibility observations can give rise to humidity increments of significant magnitude compared with the direct impact of humidity observations. We present the results of sensitivity studies which show the contribution of different components of the system to improved skill in visibility forecasts. Visibility assimilation is most important within the first 6–12 hours of the forecast and for visibilities below 1 km, while modelling of aerosol sources and advection is important for slightly higher visibilities (1–5 km) and is still significant at longer forecast times.
Abstract:
European labour markets are increasingly divided between insiders in full-time permanent employment and outsiders in precarious work or unemployment. Using quantitative as well as qualitative methods, this thesis investigates the determinants and consequences of labour market policies that target these outsiders in three separate papers. The first paper looks at Active Labour Market Policies (ALMPs) that target the unemployed. It shows that left- and right-wing parties choose different types of ALMPs depending on the policy and the welfare regime in which the party is located. These findings reconcile the conflicting theoretical expectations from the Power Resource approach and the insider-outsider theory. The second paper considers the regulation and protection of the temporary work sector. It solves the puzzle of temporary re-regulation in France, which contrasts with most other European countries that have deregulated temporary work. Permanent workers are adversely affected by the expansion of temporary work in France because of general skills and low wage coordination. The interests of temporary and permanent workers in re-regulation therefore overlap in France, and left governments have an incentive to re-regulate the sector. The third paper then investigates what determines inequality between median and bottom income workers. It shows that non-inclusive economic coordination increases inequality in the absence of compensating institutions such as minimum wage regulation. The deregulation of temporary work as well as spending on employment incentives and rehabilitation also has adverse effects on inequality. Thus, policies that target outsiders have important economic effects on the rest of the workforce. Three broader contributions can be identified. First, welfare state policies may not always be in the interests of labour, so left parties may not always promote them. Second, the interests of insiders and outsiders are not necessarily at odds. Third, economic coordination may not be conducive to egalitarianism where it is not inclusive.
Abstract:
The Ultra Weak Variational Formulation (UWVF) is a powerful numerical method for the approximation of acoustic, elastic and electromagnetic waves in the time-harmonic regime. The use of Trefftz-type basis functions incorporates the known wave-like behaviour of the solution into the discrete space, allowing large reductions in the required number of degrees of freedom for a given accuracy, when compared to standard finite element methods. However, the UWVF is not well suited to the accurate approximation of singular sources in the interior of the computational domain. We propose an adjustment to the UWVF for seismic imaging applications, which we call the Source Extraction UWVF. Different fields are solved for in subdomains around the source, and matched on the inter-domain boundaries. Numerical results are presented for a domain of constant wavenumber and for a domain of varying sound speed in a model used for seismic imaging.
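A minimal sketch of the Trefftz idea mentioned above (the wavenumber, propagation directions and evaluation points are illustrative assumptions, not taken from the paper): plane waves $e^{ik\,d\cdot x}$ satisfy the homogeneous Helmholtz equation exactly, so they encode the wave-like behaviour of the solution in the basis itself.

```python
import numpy as np

# Plane-wave (Trefftz-type) basis: each function exp(i k d_j . x) is an
# exact solution of the Helmholtz equation  Delta u + k^2 u = 0,
# for a unit direction d_j and wavenumber k.

def plane_wave_basis(points, k, n_dirs):
    """Evaluate n_dirs equispaced plane waves at 2D points.

    Returns a complex array of shape (n_points, n_dirs)."""
    angles = 2 * np.pi * np.arange(n_dirs) / n_dirs
    dirs = np.stack([np.cos(angles), np.sin(angles)], axis=1)  # unit vectors
    return np.exp(1j * k * points @ dirs.T)

pts = np.array([[0.0, 0.0], [0.25, 0.1]])   # illustrative evaluation points
phi = plane_wave_basis(pts, k=10.0, n_dirs=8)
```

Every basis function has unit modulus everywhere, and all equal 1 at the origin; a discrete UWVF solution is then sought as a combination of such waves on each element.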
Abstract:
Variational data assimilation is commonly used in environmental forecasting to estimate the current state of the system from a model forecast and observational data. The assimilation problem can be written simply in the form of a nonlinear least squares optimization problem. However, the practical solution of the problem in large systems requires many careful choices to be made in the implementation. In this article we present the theory of variational data assimilation and then discuss in detail how it is implemented in practice. Current solutions and open questions are discussed.
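In generic notation (a sketch under standard conventions, not necessarily the article's exact formulation), the nonlinear least squares problem referred to above is the minimization over the state $x$ of

```latex
J(x) \;=\; \tfrac{1}{2}\,(x - x_b)^{\mathrm{T}} B^{-1} (x - x_b)
\;+\; \tfrac{1}{2}\,\big(y - \mathcal{H}(x)\big)^{\mathrm{T}} R^{-1} \big(y - \mathcal{H}(x)\big),
```

where $x_b$ is the model forecast (background), $y$ the observations, $\mathcal{H}$ the (possibly nonlinear) observation operator, and $B$ and $R$ the background and observation error covariance matrices; the careful implementation choices mentioned above largely concern how $B$, $R$ and the minimization of $J$ are represented in large systems.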