969 results for Godunov-VanLeer schemes
Abstract:
For a wide range of environmental, hydrological, and engineering applications there is a fast-growing need for high-resolution imaging. In this context, waveform tomographic imaging of crosshole georadar data is a powerful method able to provide images of pertinent electrical properties in near-surface environments with unprecedented spatial resolution. In contrast, conventional ray-based tomographic methods, which consider only a very limited part of the recorded signal (first-arrival traveltimes and maximum first-cycle amplitudes), suffer from inherent limitations in resolution and may prove inadequate in complex environments. For a typical crosshole georadar survey, the potential improvement in resolution when using waveform-based rather than ray-based approaches is roughly one order of magnitude. Moreover, the spatial resolution of waveform-based inversions is comparable to that of common logging methods. While waveform tomographic imaging has become well established in exploration seismology over the past two decades, it remains comparatively underdeveloped in the georadar domain despite corresponding needs. Recently, different groups have presented finite-difference time-domain waveform inversion schemes for crosshole georadar data, which are adaptations and extensions of Tarantola's seminal nonlinear generalized least-squares approach developed for the seismic case. First applications of these new crosshole georadar waveform inversion schemes to synthetic and field data have shown promising results. However, little is known about the limits and performance of such schemes in complex environments.
To this end, the general motivation of my thesis is to evaluate the robustness and limitations of waveform inversion algorithms for crosshole georadar data, in order to apply such schemes to a wide range of real-world problems. One crucial issue in making any waveform scheme applicable and effective for real-world crosshole georadar problems is the accurate estimation of the source wavelet, which is unknown in reality. Waveform inversion schemes for crosshole georadar data require forward simulations of the wavefield in order to iteratively solve the inverse problem. Therefore, accurate knowledge of the source wavelet is critically important for successful application of such schemes. Relatively small differences in the estimated source wavelet shape can lead to large differences in the resulting tomograms. In the first part of my thesis, I explore the viability and robustness of a relatively simple iterative deconvolution technique that incorporates the estimation of the source wavelet into the waveform inversion procedure rather than adding additional model parameters to the inversion problem. Extensive tests indicate that this source wavelet estimation technique is simple yet effective, providing remarkably accurate and robust estimates of the source wavelet in the presence of strong heterogeneity in both the dielectric permittivity and electrical conductivity, as well as significant ambient noise in the recorded data. Furthermore, our tests also indicate that the approach is insensitive to the phase characteristics of the starting wavelet, which is not the case when the wavelet estimation is directly incorporated into the inverse problem. Another critical issue with crosshole georadar waveform inversion schemes that clearly needs to be investigated is the consequence of the common assumption of frequency-independent electromagnetic constitutive parameters.
This is crucial since, in reality, these parameters are known to be frequency-dependent and complex, and thus recorded georadar data may show significant dispersive behaviour. In particular, in the presence of water, there is a wide body of evidence showing that the dielectric permittivity can be significantly frequency-dependent over the GPR frequency range, due to a variety of relaxation processes. The second part of my thesis is therefore dedicated to evaluating the reconstruction limits of a non-dispersive crosshole georadar waveform inversion scheme in the presence of varying degrees of dielectric dispersion. I show that the inversion algorithm, combined with the iterative deconvolution-based source wavelet estimation procedure, which is partially able to account for frequency-dependent effects through an "effective" wavelet, performs remarkably well in weakly to moderately dispersive environments and has the ability to provide adequate tomographic reconstructions.
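The iterative deconvolution idea described above can be illustrated with a single stabilized-deconvolution step (a minimal NumPy sketch, not the thesis's actual algorithm; the Ricker-style wavelet, the sparse "Green's function", and the water-level parameter `eps` are my own illustrative choices):

```python
import numpy as np

def estimate_wavelet(observed, greens, eps=1e-6):
    """One stabilized frequency-domain deconvolution step: recover the
    source wavelet w from observed = greens (*) w, where (*) denotes
    circular convolution.  `eps` sets a water level relative to the
    peak spectral power of the Green's function."""
    D = np.fft.rfft(observed)
    G = np.fft.rfft(greens)
    denom = np.abs(G) ** 2
    denom = np.maximum(denom, eps * denom.max())  # avoid division blow-up
    W = D * np.conj(G) / denom
    return np.fft.irfft(W, n=len(observed))

# Synthetic check: convolve a known Ricker-style wavelet with a sparse
# "Green's function" and recover the wavelet from the resulting trace.
n = 256
t = np.arange(n) - 32.0
w_true = (1 - 2 * (0.02 * t) ** 2) * np.exp(-(0.02 * t) ** 2)
g = np.zeros(n)
g[[10, 40, 90]] = [1.0, -0.6, 0.3]          # sparse reflectivity series
d = np.fft.irfft(np.fft.rfft(g) * np.fft.rfft(w_true), n=n)
w_est = estimate_wavelet(d, g)
```

In the noise-free toy case the water level never activates and the wavelet is recovered essentially exactly; the regularization matters only once the data contain noise or the Green's function spectrum has near-zeros.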
Abstract:
The aim of this thesis is to present a solution to the quantum phase problem of the single-mode optical field. The solution is based on the use of phase shift covariant normalized positive operator measures. These measures describe realistic direct coherent state phase measurements, such as the phase measurement schemes based on eight-port homodyne detection or heterodyne detection. The structure of covariant operator measures and, more generally, covariant sesquilinear form measures is analyzed in this work. Four different characterizations of phase shift covariant normalized positive operator measures are presented. The canonical covariant operator measure is defined and its properties are studied. Finally, some other suggested phase theories are introduced to investigate their connections to the covariant sesquilinear form measures.
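For context, the canonical covariant operator measure mentioned above is usually written in the photon-number basis as follows (a standard expression from the quantum phase literature, not quoted from this thesis; $N$ is the number operator and $\dotplus$ denotes addition modulo $2\pi$):

```latex
E_{\mathrm{can}}(X) = \frac{1}{2\pi} \int_X \sum_{n,m=0}^{\infty}
  e^{i(n-m)\theta}\, |n\rangle\langle m| \, d\theta,
\qquad
e^{i\varphi N}\, E_{\mathrm{can}}(X)\, e^{-i\varphi N}
  = E_{\mathrm{can}}(X \dotplus \varphi).
```

The second identity is the phase-shift covariance condition that all the measures studied in the thesis satisfy.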
Abstract:
This paper breaks new ground toward contractual and institutional innovation in models of homeownership, equity building, and mortgage enforcement. Inspired by recent developments in the affordable housing sector and in other types of public financing schemes, this paper suggests extending institutional and financial strategies such as time- and place-based division of property rights, conditional subsidies, and credit mediation to alleviate the systemic risks of mortgage foreclosure. Alongside a for-profit shared equity scheme that would be led by local governments, we also outline a private market shared equity model, one of bootstrapping home buying with purchase options.
Abstract:
PURPOSE: Most existing methods for accelerated parallel imaging in MRI require additional data, which are used to derive information about the sensitivity profile of each radiofrequency (RF) channel. In this work, a method is presented to avoid the acquisition of separate coil calibration data for accelerated Cartesian trajectories. METHODS: Quadratic phase is imparted to the image to spread the signals in k-space (aka phase scrambling). By rewriting the Fourier transform as a convolution operation, a window can be introduced to the convolved chirp function, allowing a low-resolution image to be reconstructed from phase-scrambled data without prominent aliasing. This image (for each RF channel) can be used to derive coil sensitivities to drive existing parallel imaging techniques. As a proof of concept, the quadratic phase was applied by introducing an offset to the x² − y² shim and the data were reconstructed using adapted versions of the image space-based sensitivity encoding and GeneRalized Autocalibrating Partially Parallel Acquisitions algorithms. RESULTS: The method is demonstrated in a phantom (1 × 2, 1 × 3, and 2 × 2 acceleration) and in vivo (2 × 2 acceleration) using a 3D gradient echo acquisition. CONCLUSION: Phase scrambling can be used to perform parallel imaging acceleration without acquisition of separate coil calibration data, demonstrated here for a 3D-Cartesian trajectory. Further research is required to prove the applicability to other 2D and 3D sampling schemes. Magn Reson Med, 2014. © 2014 Wiley Periodicals, Inc.
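The core phase-scrambling observation, that a quadratic image-space phase spreads signal energy across k-space, can be sketched in one dimension (a toy illustration, not the authors' reconstruction pipeline; the chirp rate `alpha`, the boxcar object, and the energy metric are arbitrary choices of mine):

```python
import numpy as np

def phase_scramble(image, alpha):
    """Impart a quadratic ("chirp") phase to a 1-D image, i.e. the
    phase-scrambling step.  `alpha` is an illustrative chirp rate."""
    x = np.arange(len(image)) - len(image) // 2
    return image * np.exp(1j * alpha * x ** 2)

def energy_concentration(k, frac=0.1):
    """Fraction of total signal energy inside the central `frac`
    portion of (fftshifted) k-space."""
    c = len(k) // 2
    h = int(len(k) * frac / 2)
    return np.sum(np.abs(k[c - h:c + h]) ** 2) / np.sum(np.abs(k) ** 2)

n = 256
obj = np.zeros(n)
obj[96:160] = 1.0                            # simple boxcar object
k_plain = np.fft.fftshift(np.fft.fft(obj))
k_chirp = np.fft.fftshift(np.fft.fft(phase_scramble(obj, alpha=0.02)))
# The boxcar's energy sits near the k-space center; the chirped version
# is spread out, which is the property that lets a windowed chirp
# convolution yield a low-resolution image with reduced aliasing.
```

On this toy object, most of the un-scrambled energy lies in the central 10% of k-space, while the scrambled spectrum is spread far more evenly.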
Abstract:
We develop a stylized model of economic growth with bubbles. In this model, changes in investor sentiment lead to the appearance and collapse of macroeconomic bubbles or pyramid schemes. We show how these bubbles mitigate the effects of financial frictions. During bubbly episodes, unproductive investors demand bubbles while productive investors supply them. These transfers of resources improve the efficiency at which the economy operates, expanding consumption, the capital stock and output. When bubbly episodes end, these transfers stop and consumption, the capital stock and output contract. We characterize the stochastic equilibria of the model and argue that they provide a natural way of introducing bubble shocks into business cycle models.
Abstract:
We explore a view of the crisis as a shock to investor sentiment that led to the collapse of a bubble or pyramid scheme in financial markets. We embed this view in a standard model of the financial accelerator and explore its empirical and policy implications. In particular, we show how the model can account for: (i) a gradual and protracted expansionary phase followed by a sudden and sharp recession; (ii) the connection (or lack of connection!) between financial and real economic activity; and (iii) a fast and strong transmission of shocks across countries. We also use the model to explore the role of fiscal policy.
Abstract:
We investigate the theoretical conditions for effectiveness of government consumption expenditure expansions using US, Euro area and UK data. Fiscal expansions taking place when monetary policy is accommodative lead to large output multipliers in normal times. The 2009-2010 packages need not produce significant output multipliers, may have moderate debt effects, and only generate temporary inflation. Expenditure expansions accompanied by deficit/debt consolidation schemes may lead to short-run output gains but their success depends on how monetary policy and expectations behave. Trade openness and the cyclicality of the labor wedge explain cross-country differences in the magnitude of the multipliers.
Abstract:
We consider an entrepreneur who is the sole producer of a cost-reducing skill, but who, when hiring a team to use the skill, cannot prevent collusive trade in the innovation-related knowledge between employees and competitors. We show that there are two types of diffusion-avoiding strategies by which the entrepreneur can preempt collusive communication: i) setting up a large productive capacity (the traditional firm), and ii) keeping a small team (the lean firm). The traditional firm is characterized by its many "marginal" employees who work short days, receive flat wages, and are incompletely informed about the innovation. The lean firm is small in number of employees and engages in complete information sharing among members, who are paid with stock option schemes. We find that the lean firm is superior to the traditional firm when technological entry costs are low and when the sector is immature.
Abstract:
This paper develops a model of the bubbly economy and uses it to study the effects of bailout policies. In the bubbly economy, weak enforcement institutions do not allow firms to pledge future revenues to their creditors. As a result, "fundamental" collateral is scarce and this impairs the intermediation process that transforms savings into capital. To overcome this shortage of "fundamental" collateral, the bubbly economy creates "bubbly" collateral. This additional collateral supports an intricate array of intra- and inter-generational transfers that allow savings to be transformed into capital and bubbles. Swings in investor sentiment lead to fluctuations in the amount of bubbly collateral, giving rise to bubbly business cycles with very rich and complex dynamics. Bailout policies can affect these dynamics in a variety of ways. Expected bailouts provide additional collateral and expand investment and the capital stock. Realized bailouts reduce the supply of funds and contract investment and the capital stock. Thus, bailout policies tend to foster investment and growth in normal times, but to depress investment and growth during crisis periods. We show how to design bailout policies that maximize various policy objectives.
Abstract:
In this paper we study the welfare impact of alternative tax schemes on labor and capital. We evaluate the effect of lowering capital income taxes on the distribution of wealth in a model with heterogeneous agents, restricting our attention to policies with constant tax rates. We calibrate and simulate the economy; we find that lowering capital taxes has two effects: i) it increases efficiency in terms of aggregate production, and ii) it redistributes wealth in favor of those agents with a low wage/wealth ratio. We find that the redistributive effect dominates, and that agents with a low wage/wealth ratio would experience a large loss in utility if capital income taxes were eliminated.
Abstract:
The Attorney General's Consumer Protection Division receives hundreds of calls and consumer complaints every year. Follow these tips to avoid unexpected expenses and disappointments. This record is about: International Lottery Schemes: You're the Loser!
Abstract:
This Article breaks new ground toward contractual and institutional innovation in models of homeownership, equity building, and mortgage enforcement. Inspired by recent developments in the affordable housing sector and other types of public financing schemes, we suggest extending institutional and financial strategies such as time- and place-based division of property rights, conditional subsidies, and credit mediation to alleviate the systemic risks of mortgage foreclosure. Two new solutions offer a broad theoretical basis for such developments in the economic and legal institution of homeownership: a for-profit shared equity scheme led by local governments alongside a private market shared equity model, one of "bootstrapping home buying with purchase options".
Abstract:
Forecasting real-world quantities based on information from textual descriptions has recently attracted significant interest as a research problem, although previous studies have focused on applications involving only the English language. This document presents an experimental study on making predictions from textual contents written in Portuguese, using documents from three distinct domains. I specifically report on experiments using different types of regression models, state-of-the-art feature weighting schemes, and features derived from cluster-based word representations. Through controlled experiments, I show that prediction models using the textual information achieve better results than simple baselines such as taking the average value over the training data, and that richer document representations (i.e., using Brown clusters and the Delta-TF-IDF feature weighting scheme) result in slight performance improvements.
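The baseline comparison described above, regression over weighted text features versus predicting the training mean, can be sketched with plain TF-IDF in NumPy (a toy illustration only: plain TF-IDF rather than Delta-TF-IDF, an invented four-document corpus, and least squares in place of the study's actual regression models):

```python
import numpy as np
from collections import Counter

def tfidf_matrix(docs):
    """Plain TF-IDF document-term features (the Delta-TF-IDF variant
    mentioned in the text additionally weights terms by class skew)."""
    vocab = sorted({w for d in docs for w in d.split()})
    idx = {w: j for j, w in enumerate(vocab)}
    tf = np.zeros((len(docs), len(vocab)))
    for i, d in enumerate(docs):
        for w, c in Counter(d.split()).items():
            tf[i, idx[w]] = c
    idf = np.log(len(docs) / (tf > 0).sum(axis=0))   # inverse document freq.
    return tf * idf, vocab

# Invented toy corpus: a numeric target loosely tied to word choice.
docs = ["bom filme bom", "filme mau", "bom servico", "servico mau mau"]
y = np.array([2.0, -1.0, 1.0, -2.0])
X, vocab = tfidf_matrix(docs)
w, *_ = np.linalg.lstsq(X, y, rcond=None)    # least-squares regression
pred = X @ w
baseline = np.full_like(y, y.mean())         # "average over training data"
mse = lambda p: float(np.mean((p - y) ** 2))
# The fitted model should beat the mean-value baseline on this toy data.
```

On this tiny corpus the regression fits the training targets almost exactly, so its training MSE is far below the mean-prediction baseline; real experiments would of course compare on held-out data.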