931 results for Generalized inverse


Relevance: 20.00%

Publisher:

Abstract:

Aquifers are a vital water resource whose quality must be safeguarded or, if damaged, restored. The extent and complexity of aquifer contamination depend on the characteristics of the porous medium, the influence of boundary conditions, and the biological, chemical and physical processes involved. Since the 1990s, scientific effort has grown considerably towards finding efficient ways of estimating aquifer hydraulic parameters and, from them, recovering the contaminant source position and its release history. To simplify and understand the influence of these various factors on aquifer phenomena, researchers commonly rely on numerical and controlled experiments. This work presents some of these methods, applying and comparing them on data collected during laboratory, field and numerical tests. The work is structured in four parts, which present the results and conclusions for each specific objective.

Relevance: 20.00%

Publisher:

Abstract:

Financial Administration emerged at the beginning of the nineteenth century, together with the consolidation of large corporations and the formation of national markets in the United States, whereas in Brazil the first studies appeared only in the second half of the twentieth century. Since then the country has consolidated some centres of research excellence, formed a significant group of senior researchers and expanded the research areas of the field; nevertheless, few studies have sought to portray the characteristics of scientific productivity in Finance. Seeking to contribute to a better understanding of the productive behaviour of this area, the present study examines its scientific output, materialized as digital articles published in 24 well-regarded national journals classified in the Qualis/CAPES strata A2, B1 and B2 of the Administration, Accounting and Tourism area. To this end, Bradford's Law, Price's Law of Elitism and Lotka's Law are applied. Bradford's Law identifies three productivity zones, with the core formed by three journals, one of which is classified in the Qualis/CAPES B2 stratum, highlighting the limitation of a sample defined solely by the Qualis/CAPES classification. For Price's Law of Elitism, under either straight or complete counting, we do not identify elite behaviour of the kind predicted by the theory, and we find a large number of authors with only one publication. Applying the Generalized Inverse Power model, estimated by Ordinary Least Squares (OLS), we find that researcher productivity under straight counting conforms to Lotka's Law at the α = 0.01 significance level; under complete counting, however, we cannot confirm the hypothesis of homogeneity of the distributions. Moreover, under both counting schemes the productivity parameter n is greater than 2, so the productivity of finance researchers is lower than that asserted by the theory.
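
As a rough illustration of the model-fitting step described above, the sketch below (Python; the publication counts are invented for the example and are not the study's data) estimates the exponent n of the generalized inverse power (Lotka) model by ordinary least squares on the log-transformed author-productivity distribution.

```python
import numpy as np

# Hypothetical author-productivity data: number of publications per author
# (invented counts standing in for straight counting over the sampled journals).
pubs_per_author = np.array([1] * 700 + [2] * 120 + [3] * 40 + [4] * 15 + [5] * 6 + [7] * 2 + [10] * 1)

# Observed distribution: x = number of publications, y = number of authors with x publications.
x, y = np.unique(pubs_per_author, return_counts=True)

# Generalized inverse power (Lotka) model: y = C * x**(-n).
# Taking logs gives log y = log C - n log x, a linear model estimated by OLS.
X = np.column_stack([np.ones_like(x, dtype=float), np.log(x)])
beta, *_ = np.linalg.lstsq(X, np.log(y), rcond=None)
C, n = np.exp(beta[0]), -beta[1]

print(f"estimated C = {C:.1f}, Lotka exponent n = {n:.2f}")
# A value of n above 2 would indicate lower researcher productivity than the
# classical Lotka value of n = 2, as discussed in the abstract.
```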

Relevance: 20.00%

Publisher:

Abstract:

In the Bayesian framework, predictions for a regression problem are expressed in terms of a distribution of output values. The mode of this distribution corresponds to the most probable output, while the uncertainty associated with the predictions can conveniently be expressed in terms of error bars. In this paper we consider the evaluation of error bars in the context of the class of generalized linear regression models. We provide insights into the dependence of the error bars on the location of the data points and we derive an upper bound on the true error bars in terms of the contributions from individual data points which are themselves easily evaluated.
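
For reference, a minimal sketch of the quantity being bounded, under the standard assumptions of a generalized linear regression model y(x; w) = wᵀφ(x) with Gaussian noise and a Gaussian posterior over the weights (generic notation, which may differ from the paper's):

```latex
\sigma_y^2(\mathbf{x}) \;=\; \sigma_\nu^2 \;+\; \boldsymbol{\phi}(\mathbf{x})^{\mathsf T}\mathbf{A}^{-1}\boldsymbol{\phi}(\mathbf{x}),
\qquad
\mathbf{A} \;=\; \alpha\mathbf{I} \;+\; \frac{1}{\sigma_\nu^2}\sum_{n=1}^{N}\boldsymbol{\phi}(\mathbf{x}_n)\,\boldsymbol{\phi}(\mathbf{x}_n)^{\mathsf T}.
```

Here σ_ν² is the noise variance, α the prior precision of the weights, and A the posterior precision accumulated from the N training inputs; the second term is the weight-uncertainty contribution whose dependence on the data locations is analysed.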

Relevance: 20.00%

Publisher:

Abstract:

We investigate the dependence of Bayesian error bars on the distribution of data in input space. For generalized linear regression models we derive an upper bound on the error bars which shows that, in the neighbourhood of the data points, the error bars are substantially reduced from their prior values. For regions of high data density we also show that the contribution to the output variance due to the uncertainty in the weights can exhibit an approximate inverse proportionality to the probability density. Empirical results support these conclusions.
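
A minimal numerical sketch of this behaviour is given below (Python with NumPy; the Gaussian basis functions, hyperparameter values and data layout are assumptions for illustration, not the paper's setup). It evaluates the error bars of a generalized linear regression model at an input inside the cluster of training data and at one far outside it, where the weight-uncertainty term reverts towards its prior value.

```python
import numpy as np

rng = np.random.default_rng(0)

# Training inputs clustered in [0, 1]; targets are not needed to evaluate error bars.
X_train = rng.uniform(0.0, 1.0, size=60)
sigma_nu2 = 0.05 ** 2                     # assumed noise variance
alpha = 1.0                               # assumed prior precision of the weights

centres = np.linspace(-1.0, 2.0, 20)      # Gaussian basis function centres (assumed)
width = 0.2

def phi(x):
    """Vector of Gaussian basis functions evaluated at a scalar input x."""
    return np.exp(-0.5 * ((x - centres) / width) ** 2)

# Posterior precision A = alpha*I + (1/sigma_nu^2) * sum_n phi(x_n) phi(x_n)^T.
Phi = np.array([phi(x) for x in X_train])
A = alpha * np.eye(len(centres)) + Phi.T @ Phi / sigma_nu2
A_inv = np.linalg.inv(A)

def error_bar(x):
    """Predictive standard deviation: noise variance plus weight-uncertainty term."""
    p = phi(x)
    weight_term = p @ A_inv @ p
    return np.sqrt(sigma_nu2 + weight_term), weight_term

for x in (0.5, 1.9):   # near the data, then far from it
    sd, w = error_bar(x)
    print(f"x = {x:.1f}: error bar = {sd:.3f}, weight-uncertainty term = {w:.4f}")
```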

Relevance: 20.00%

Publisher:

Abstract:

In the last two decades there have been substantial developments in the mathematical theory of inverse optimization problems, and their applications have expanded greatly. In parallel, time series analysis and forecasting have become increasingly important in various fields of research such as data mining, economics, business, engineering, medicine, and politics, among many others. Despite the widespread use of linear programming in forecasting models, not a single application of inverse optimization has been reported in the forecasting literature for the case where time series data are available. The goal of this paper is therefore to introduce inverse optimization into the forecasting field and to provide a streamlined approach to time series analysis and forecasting using inverse linear programming. An application is used to demonstrate the inverse forecasting approach developed in this study. © 2007 Elsevier Ltd. All rights reserved.
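
The inverse-optimization formulation itself is not reproduced here, but the forward building block it works with, namely fitting a forecasting model by linear programming, can be sketched as below (Python with SciPy; the series and the least-absolute-deviations trend model are illustrative assumptions, not the paper's method). An inverse LP would instead take the observed decisions or fits as given and recover the cost coefficients that make them optimal.

```python
import numpy as np
from scipy.optimize import linprog

# Illustrative monthly series (not from the paper).
y = np.array([112., 118., 132., 129., 121., 135., 148., 148., 136., 119., 104., 118.])
t = np.arange(len(y), dtype=float)

# Least-absolute-deviations trend fit: minimise sum_i e_i subject to
#   -e_i <= y_i - (a + b*t_i) <= e_i,
# with decision variables (a, b, e_1..e_n). This is itself a linear program.
n = len(y)
c = np.concatenate([[0.0, 0.0], np.ones(n)])              # objective: sum of e_i

#  a + b*t_i - e_i <= y_i   and   -a - b*t_i - e_i <= -y_i
A_ub = np.vstack([
    np.column_stack([np.ones(n), t, -np.eye(n)]),
    np.column_stack([-np.ones(n), -t, -np.eye(n)]),
])
b_ub = np.concatenate([y, -y])

bounds = [(None, None), (None, None)] + [(0, None)] * n   # a, b free; e_i >= 0
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")

a, b = res.x[:2]
print(f"LAD trend: y_hat(t) = {a:.2f} + {b:.2f} t; forecast for t = {n}: {a + b * n:.1f}")
```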

Relevance: 20.00%

Publisher:

Abstract:

Magnetoencephalography (MEG) is a non-invasive brain imaging technique with the potential for very high temporal and spatial resolution of neuronal activity. The main stumbling block for the technique has been that the estimation of a neuronal current distribution, based on sensor data outside the head, is an inverse problem with an infinity of possible solutions. Many inversion techniques exist, all using different a priori assumptions in order to reduce the number of possible solutions. Although all techniques can be thoroughly tested in simulation, implicit in the simulations are the experimenter's own assumptions about realistic brain function. To date, the only way to test the validity of inversions based on real MEG data has been through direct surgical validation, or through comparison with invasive primate data. In this work, we constructed a null hypothesis that the reconstruction of neuronal activity contains no information on the distribution of the cortical grey matter. To test this, we repeatedly compared rotated sections of grey matter with a beamformer estimate of neuronal activity to generate a distribution of mutual information values. The significance of the comparison between the unrotated anatomical information and the electrical estimate was subsequently assessed against this distribution. We found that there was significant (P < 0.05) anatomical information contained in the beamformer images across a number of frequency bands. Based on the limited data presented here, we can say that the assumptions behind the beamformer algorithm are not unreasonable for the visual-motor task investigated.
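
The logic of the rotation test can be sketched in a simplified two-dimensional form (Python with NumPy, SciPy and scikit-learn; the toy images, in-plane rotations and bin count are stand-in assumptions for the three-dimensional grey-matter and beamformer volumes used in the study).

```python
import numpy as np
from scipy.ndimage import rotate
from sklearn.metrics import mutual_info_score

rng = np.random.default_rng(1)

def mi(a, b, bins=16):
    """Mutual information between two images after histogram binning."""
    a_d = np.digitize(a.ravel(), np.histogram_bin_edges(a, bins))
    b_d = np.digitize(b.ravel(), np.histogram_bin_edges(b, bins))
    return mutual_info_score(a_d, b_d)

# Toy 'anatomical' map and a 'functional' estimate that partly follows it.
anatomy = rng.random((64, 64))
functional = 0.6 * anatomy + 0.4 * rng.random((64, 64))

observed = mi(anatomy, functional)

# Null distribution: mutual information after rotating the anatomical map.
null = np.array([
    mi(rotate(anatomy, angle, reshape=False, mode="nearest"), functional)
    for angle in rng.uniform(10.0, 350.0, size=200)
])

p_value = np.mean(null >= observed)
print(f"observed MI = {observed:.3f}, p = {p_value:.3f}")
```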

Relevance: 20.00%

Publisher:

Abstract:

Several indices of plant capacity utilization based on the concept of a best-practice frontier have been proposed in the literature (Färe et al., 1992; De Borger and Kerstens, 1998). This paper suggests an alternative measure of capacity utilization change based on the generalized Malmquist index proposed by Grifell-Tatjé and Lovell in 1998. The advantage of this specification is that it allows productivity growth to be measured irrespective of the nature of scale economies. The index is then used to measure capacity change for a panel of Italian firms over the period 1989-94 using Data Envelopment Analysis, and its ability to explain short-run movements of output is assessed.
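
The Data Envelopment Analysis step can be outlined as follows (Python with SciPy; the input-oriented CCR envelopment form and the toy firm data are assumptions for illustration, not the paper's capacity-utilization specification). Each decision-making unit's efficiency score is obtained from a small linear program.

```python
import numpy as np
from scipy.optimize import linprog

# Toy data: 5 firms (DMUs), 2 inputs, 1 output (illustrative only).
X = np.array([[20., 300.], [15., 200.], [40., 560.], [30., 350.], [25., 410.]])  # inputs
Y = np.array([[100.], [80.], [190.], [160.], [120.]])                            # outputs

def ccr_efficiency(o, X, Y):
    """Input-oriented CCR efficiency of DMU o (envelopment form), solved as an LP."""
    n, m = X.shape          # number of DMUs, number of inputs
    s = Y.shape[1]          # number of outputs
    c = np.zeros(1 + n)
    c[0] = 1.0                                        # minimise theta

    # Input constraints: sum_j lambda_j * x_ij - theta * x_io <= 0
    A_in = np.column_stack([-X[o], X.T])
    b_in = np.zeros(m)
    # Output constraints: -sum_j lambda_j * y_rj <= -y_ro
    A_out = np.column_stack([np.zeros(s), -Y.T])
    b_out = -Y[o]

    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([b_in, b_out]),
                  bounds=[(None, None)] + [(0, None)] * n,
                  method="highs")
    return res.x[0]

for o in range(len(X)):
    print(f"DMU {o}: efficiency = {ccr_efficiency(o, X, Y):.3f}")
```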

Relevance: 20.00%

Publisher:

Abstract:

The retrieval of wind vectors from satellite scatterometer observations is a non-linear inverse problem. A common approach to solving inverse problems is to adopt a Bayesian framework and to infer the posterior distribution of the parameters of interest given the observations by using a likelihood model relating the observations to the parameters, and a prior distribution over the parameters. We show how Gaussian process priors can be used efficiently with a variety of likelihood models, using local forward (observation) models and direct inverse models for the scatterometer. We present an enhanced Markov chain Monte Carlo method to sample from the resulting multimodal posterior distribution. We go on to show how the computational complexity of the inference can be controlled by using a sparse, sequential Bayes algorithm for estimation with Gaussian processes. This helps to overcome the most serious barrier to the use of probabilistic, Gaussian process methods in remote sensing inverse problems, which is the prohibitively large size of the data sets. We contrast the sampling results with the approximations that are found by using the sparse, sequential Bayes algorithm.
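
As a miniature of the sampling stage, the sketch below (Python with NumPy; the one-dimensional bimodal target is a stand-in assumption rather than the scatterometer posterior, and no Gaussian process prior is involved) runs a random-walk Metropolis sampler, the basic move that enhanced MCMC schemes for multimodal posteriors build upon.

```python
import numpy as np

rng = np.random.default_rng(2)

def log_post(u):
    """Toy bimodal log-posterior: a two-component Gaussian mixture (a stand-in
    for an ambiguous wind-direction posterior, not the paper's likelihood)."""
    return np.logaddexp(-0.5 * ((u - 2.0) / 0.5) ** 2,
                        -0.5 * ((u + 2.0) / 0.5) ** 2)

def metropolis(n_samples, step, u0=0.0):
    """Random-walk Metropolis: Gaussian proposals, standard acceptance ratio."""
    u, lp = u0, log_post(u0)
    samples, accepted = [], 0
    for _ in range(n_samples):
        u_new = u + step * rng.normal()
        lp_new = log_post(u_new)
        if np.log(rng.random()) < lp_new - lp:        # Metropolis acceptance test
            u, lp = u_new, lp_new
            accepted += 1
        samples.append(u)
    return np.array(samples), accepted / n_samples

samples, acc = metropolis(20_000, step=2.5)
print(f"acceptance rate = {acc:.2f}")
print(f"fraction of samples in each mode: {np.mean(samples > 0):.2f} / {np.mean(samples < 0):.2f}")
```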

Relevance: 20.00%

Publisher:

Abstract:

Removing noise from piecewise constant (PWC) signals is a challenging signal processing problem arising in many practical contexts. For example, in exploration geosciences, noisy drill hole records need to be separated into stratigraphic zones, and in biophysics, jumps between molecular dwell states have to be extracted from noisy fluorescence microscopy signals. Many PWC denoising methods exist, including total variation regularization, mean shift clustering, stepwise jump placement, running medians, convex clustering shrinkage and bilateral filtering; conventional linear signal processing methods are fundamentally unsuited. This paper (part I, the first of two) shows that most of these methods are associated with a special case of a generalized functional that, when minimized, achieves PWC denoising. The minimizer can be obtained by diverse solver algorithms, including stepwise jump placement, convex programming, finite differences, iterated running medians, least angle regression, regularization path following and coordinate descent. In the second paper, part II, we introduce novel PWC denoising methods and present comparisons between these methods on synthetic and real signals, showing that the new understanding of the problem gained in part I leads to new methods that have a useful role to play.
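
One of the simplest solvers listed above, the iterated running median, can be sketched directly (Python with SciPy; the window length, iteration count and toy signal are illustrative assumptions, not settings recommended in the paper).

```python
import numpy as np
from scipy.signal import medfilt

rng = np.random.default_rng(3)

# Toy piecewise constant signal with additive Gaussian noise.
clean = np.concatenate([np.full(100, 0.0), np.full(80, 2.0), np.full(120, -1.0)])
noisy = clean + 0.3 * rng.normal(size=clean.size)

def iterated_running_median(x, window=15, max_iter=50):
    """Apply a running median repeatedly until the signal stops changing
    (a 'root' signal): noise is flattened while large jumps are preserved."""
    y = x.copy()
    for _ in range(max_iter):
        y_next = medfilt(y, kernel_size=window)
        if np.allclose(y_next, y):
            break
        y = y_next
    return y

denoised = iterated_running_median(noisy)
print(f"RMS error before: {np.sqrt(np.mean((noisy - clean) ** 2)):.3f}, "
      f"after: {np.sqrt(np.mean((denoised - clean) ** 2)):.3f}")
```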

Relevance: 20.00%

Publisher:

Abstract:

Removing noise from signals which are piecewise constant (PWC) is a challenging signal processing problem that arises in many practical scientific and engineering contexts. In the first paper (part I) of this series of two, we presented background theory building on results from the image processing community to show that the majority of these algorithms, and others proposed in the wider literature, are each associated with a special case of a generalized functional that, when minimized, solves the PWC denoising problem, and we showed how the minimizer can be obtained by a range of computational solver algorithms. In this second paper (part II), using the understanding developed in part I, we introduce several novel PWC denoising methods, which, for example, combine the global behaviour of mean shift clustering with the local smoothing of total variation diffusion, and show example solver algorithms for these new methods. Comparisons between these methods are performed on synthetic and real signals, revealing that our new methods have a useful role to play. Finally, overlaps between the generalized methods of these two papers and others such as wavelet shrinkage, hidden Markov models, and piecewise smooth filtering are touched on.
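
One ingredient of the combined methods, mean shift applied to the sample values of a one-dimensional signal, can be sketched as follows (Python with NumPy; the Gaussian kernel, bandwidth and toy signal are illustrative assumptions, and the total variation component is omitted).

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy piecewise constant signal with noise (levels at 0, 2 and -1).
clean = np.concatenate([np.full(80, 0.0), np.full(60, 2.0), np.full(100, -1.0)])
noisy = clean + 0.25 * rng.normal(size=clean.size)

def mean_shift_values(x, bandwidth=0.4, n_iter=30):
    """Non-blurring mean shift on the sample *values*: each estimate moves to
    the kernel-weighted mean of the original values, converging on density modes."""
    y = x.copy()
    for _ in range(n_iter):
        w = np.exp(-0.5 * ((y[:, None] - x[None, :]) / bandwidth) ** 2)
        y = (w @ x) / w.sum(axis=1)
    return y

denoised = mean_shift_values(noisy)
print(f"recovered levels (rounded): {np.unique(np.round(denoised, 1))}")
```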

Relevance: 20.00%

Publisher:

Abstract:

We investigate an application of the method of fundamental solutions (MFS) to the one-dimensional inverse Stefan problem for the heat equation by extending the MFS proposed in [5] for the one-dimensional direct Stefan problem. The sources are placed outside the space domain of interest and in the time interval (-T, T). Theoretical properties of the method, as well as numerical investigations, are included, showing that accurate and stable results can be obtained efficiently with small computational cost.
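
A bare-bones sketch of the MFS building block is given below (Python with NumPy; the source and collocation placement, the manufactured exact solution and the regularization are assumptions for a direct-problem illustration, not the paper's inverse formulation). The solution is approximated by a linear combination of heat-kernel fundamental solutions with sources placed outside the space domain and at times in (-T, T), and the coefficients are fitted to the prescribed data by regularized least squares.

```python
import numpy as np

def F(dx, dt):
    """Fundamental solution of the 1-D heat equation (unit diffusivity),
    zero for non-positive time lags (causality)."""
    dx, dt = np.asarray(dx, float), np.asarray(dt, float)
    val = np.zeros_like(dx)
    pos = dt > 0
    val[pos] = np.exp(-dx[pos] ** 2 / (4.0 * dt[pos])) / np.sqrt(4.0 * np.pi * dt[pos])
    return val

T, h = 1.0, 0.5                                        # final time; source offset outside [0, 1]
tau = np.linspace(-T, T, 20, endpoint=False) + 0.05    # source times in (-T, T)
xi = np.concatenate([np.full(tau.size, -h), np.full(tau.size, 1.0 + h)])
tau = np.concatenate([tau, tau])

u_exact = lambda x, t: np.exp(np.asarray(x) + np.asarray(t))   # manufactured solution of u_t = u_xx

# Collocation points on the boundaries x = 0 and x = 1 and on the initial line t = 0.
tb = np.linspace(0.0, T, 25)
xc = np.concatenate([np.zeros(25), np.ones(25), np.linspace(0.0, 1.0, 25)])
tc = np.concatenate([tb, tb, np.zeros(25)])

# MFS collocation matrix A[k, j] = F(x_k - xi_j, t_k - tau_j); fit the source
# coefficients to the prescribed data by regularized least squares.
A = F(xc[:, None] - xi[None, :], tc[:, None] - tau[None, :])
b = u_exact(xc, tc)
coef, *_ = np.linalg.lstsq(A, b, rcond=1e-10)

# Evaluate the MFS approximation at an interior point and compare with the exact value.
x0, t0 = 0.4, 0.6
u_mfs = F(x0 - xi, t0 - tau) @ coef
print(f"u_MFS({x0}, {t0}) = {u_mfs:.4f}   exact = {u_exact(x0, t0):.4f}")
```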

Relevance: 20.00%

Publisher:

Abstract:

We investigate an application of the method of fundamental solutions (MFS) to the one-dimensional parabolic inverse Cauchy–Stefan problem, where boundary data and the initial condition are to be determined from the Cauchy data prescribed on a given moving interface. In [B.T. Johansson, D. Lesnic, and T. Reeve, A method of fundamental solutions for the one-dimensional inverse Stefan Problem, Appl. Math Model. 35 (2011), pp. 4367–4378], the inverse Stefan problem was considered, where only the boundary data is to be reconstructed on the fixed boundary. We extend the MFS proposed in Johansson et al. (2011) and show that the initial condition can also be simultaneously recovered, i.e. the MFS is appropriate for the inverse Cauchy-Stefan problem. Theoretical properties of the method, as well as numerical investigations, are included, showing that accurate results can be efficiently obtained with small computational cost.