52 results for Minimum Entropy Deconvolution


Relevance:

20.00%

Publisher:

Abstract:

Charcoal particles in pollen slides are often abundant, so analysts face the problem of setting the minimum counting sum as small as possible in order to save time. We analysed the reliability of charcoal-concentration estimates based on different counting sums, using simulated low- to high-count samples. Bootstrap simulations indicate that the variability of inferred charcoal concentrations increases progressively with decreasing sums. Below 200 items (i.e., the sum of charcoal particles and exotic marker grains), reconstructed fire incidence is either too high or too low. Statistical comparisons show that the means of the bootstrap simulations stabilize after 200 counts. Moreover, a count of 200-300 items is sufficient to produce a charcoal-concentration estimate with less than ±5% error compared with high-count samples of 1000 items, for charcoal/marker-grain ratios of 0.1-0.91. If, however, this ratio is extremely high or low (> 0.91 or < 0.1) and such samples are frequent, we suggest reducing or adding marker grains before processing new samples.
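A minimal sketch of the kind of bootstrap experiment described above, in Python; the true charcoal/marker ratio, the number of added marker grains, and the resampling scheme are assumptions chosen for illustration and are not taken from the study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed "true" slide composition: a charcoal/marker-grain ratio inside
# the 0.1-0.91 band for which the abstract reports stable estimates.
true_ratio = 0.5                     # hypothetical charcoal particles per marker grain
p_charcoal = true_ratio / (1.0 + true_ratio)
markers_added = 10_000               # hypothetical number of exotic marker grains added

def bootstrap_estimates(count_sum, n_boot=1000):
    """Simulate charcoal-concentration estimates for one counting sum.

    Each replicate draws `count_sum` items (charcoal + markers) at random and
    converts the charcoal/marker ratio into a concentration using the known
    number of added marker grains.
    """
    charcoal = rng.binomial(count_sum, p_charcoal, size=n_boot)
    markers = np.maximum(count_sum - charcoal, 1)   # avoid division by zero
    return charcoal / markers * markers_added

# The spread of the estimates shrinks as the counting sum grows.
for count_sum in (50, 100, 200, 300, 1000):
    est = bootstrap_estimates(count_sum)
    print(f"sum={count_sum:5d}  mean={est.mean():9.1f}  CV={est.std() / est.mean():.3f}")
```

Comparing the coefficient of variation across counting sums reproduces, qualitatively, the stabilization around 200-300 items that the abstract reports.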

Relevance:

20.00%

Publisher:

Abstract:

Throughout the last millennium, mankind was affected by prolonged deviations from the mean climate state. While periods like the Maunder Minimum in the 17th century have been assessed in greater detail, earlier cold periods such as the 15th century have received much less attention because of the sparse information available. Based on new evidence from different sources, ranging from proxy archives to model simulations, it is now possible to provide an end-to-end assessment of the climate state during an exceptionally cold period in the 15th century, the role of internal, unforced climate variability and external forcing in shaping these extreme climatic conditions, and the impacts on and responses of medieval society in Central Europe. Climate reconstructions from a multitude of natural and human archives indicate that, during winter, the period of the early Spörer Minimum (1431–1440 CE) was the coldest decade in Central Europe in the 15th century. The particularly cold winters and normal but wet summers resulted in a strong seasonal cycle that challenged food production and led to rising food prices, a subsistence crisis, and famine in parts of Europe. As a consequence, authorities implemented adaptation measures, such as the installation of grain storage capacities, in order to be prepared for future events. The 15th century is characterised by a grand solar minimum and enhanced volcanic activity, both of which imply a reduction of seasonality. Climate model simulations show that periods with cold winters and strong seasonality are associated with internal climate variability rather than external forcing. Accordingly, it is hypothesised that the reconstructed extreme climatic conditions during this decade occurred by chance, as a result of the partly chaotic internal variability within the climate system.

Relevance:

20.00%

Publisher:

Abstract:

Blind deconvolution is the estimation of a sharp image and a blur kernel from an observed blurry image. Because the blur model admits several solutions, it is necessary to devise an image prior that favors the true blur kernel and sharp image. Many successful image priors enforce the sparsity of the sharp image gradients. Ideally, the L0 “norm” is the best choice for promoting sparsity, but because it is computationally intractable, some methods have used a logarithmic approximation. In this work we also study a logarithmic image prior. We show empirically how well this prior suits the blind deconvolution problem. Our analysis confirms experimentally the hypothesis that a prior need not model natural image statistics in order to estimate the blur kernel correctly. Furthermore, we show that a simple maximum a posteriori formulation is enough to achieve state-of-the-art results. To minimize this formulation we devise two iterative minimization algorithms that cope with the non-convexity of the logarithmic prior: one obtained via the primal-dual approach and one via majorization-minimization.
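For illustration, a MAP energy with a logarithmic sparsity prior on the image gradients can be written as below; the notation, the data term, and the simplex constraint on the kernel are assumptions for this sketch and need not match the authors' exact formulation:

```latex
% Hedged sketch of a MAP blind-deconvolution energy with a logarithmic prior.
\[
\begin{aligned}
(\hat{u},\hat{k}) \;=\; \arg\min_{u,\,k}\;
  & \tfrac{1}{2}\,\lVert k \ast u - f \rVert_2^2
    \;+\; \lambda \sum_{i} \log\!\bigl( \lvert \nabla u_i \rvert^2 + \varepsilon \bigr) \\
  \text{subject to}\;\;
  & k \ge 0, \qquad \textstyle\sum_{j} k_j = 1,
\end{aligned}
\]
```

where f is the blurry observation, u the sharp image, k the blur kernel, and ε > 0 a small constant that keeps the logarithm bounded. The log term is non-convex, which is why it is paired with an iterative scheme such as majorization-minimization (replacing the log with a quadratic upper bound at each iterate) or a primal-dual splitting.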

Relevance:

20.00%

Publisher:

Abstract:

In this paper we propose a solution to blind deconvolution of a scene with two layers (foreground/background). We show that reconstructing the supports of these two layers from a single image of a conventional camera is not possible. As a solution we propose to use a light field camera. We demonstrate that a single light field image captured with a Lytro camera can be successfully deblurred. More specifically, we consider the case of space-varying motion blur, where the blur magnitude depends on the depth changes in the scene. Our method employs a layered model that handles occlusions and partial transparencies due to both motion blur and the out-of-focus blur of the plenoptic camera. We reconstruct the support of each layer, the corresponding sharp textures, and the motion blurs via an optimization scheme. The performance of our algorithm is demonstrated on synthetic as well as real light field images.
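As a rough illustration of a layered image-formation model of this kind (all symbols below are chosen for exposition; the paper's light field formulation additionally accounts for the plenoptic sampling and out-of-focus blur):

```latex
% Illustrative two-layer (foreground/background) motion-blurred image model.
\[
f \;=\; \bigl(k_{\mathrm{fg}} \ast M\bigr) \odot \bigl(k_{\mathrm{fg}} \ast u_{\mathrm{fg}}\bigr)
  \;+\; \bigl(1 - k_{\mathrm{fg}} \ast M\bigr) \odot \bigl(k_{\mathrm{bg}} \ast u_{\mathrm{bg}}\bigr)
\]
```

Here M is the binary foreground support, u_fg and u_bg are the sharp layer textures, k_fg and k_bg are depth-dependent motion-blur kernels, and ⊙ denotes the pixelwise product; blurring the support M is what produces the partial transparencies at occlusion boundaries that such a method has to handle.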

Relevance:

20.00%

Publisher:

Abstract:

Blind deconvolution is the problem of recovering a sharp image and a blur kernel from a noisy blurry image. Recently, there has been a significant effort to understand the basic mechanisms behind solving blind deconvolution. While this effort resulted in the deployment of effective algorithms, the theoretical findings generated contrasting views on why these approaches work. On the one hand, one can observe experimentally that alternating energy-minimization algorithms converge to the desired solution. On the other hand, it has been shown that such alternating minimization algorithms should fail to converge and that one should instead use a so-called Variational Bayes approach. To clarify this conundrum, recent work showed that it is instead a good image and blur prior that makes a blind deconvolution algorithm work. Unfortunately, this analysis did not apply to algorithms based on total variation regularization. In this manuscript, we provide both analysis and experiments to get a clearer picture of blind deconvolution. Our analysis reveals why an algorithm based on total variation works. We also introduce an implementation of this algorithm and show that, in spite of its extreme simplicity, it is very robust and achieves performance comparable to the top-performing algorithms.
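A minimal sketch of an alternating scheme for total-variation blind deconvolution, written with notation chosen for illustration; it is not claimed to be the manuscript's exact implementation:

```latex
% Illustrative alternating minimization of 1/2*||k * u - f||^2 + lambda*TV(u).
\[
\begin{aligned}
u^{t+1} &\in \arg\min_{u}\; \tfrac{1}{2}\,\lVert k^{t} \ast u - f \rVert_2^2
            \;+\; \lambda\,\mathrm{TV}(u), \\
k^{t+1} &\in \arg\min_{k \in \Delta}\; \tfrac{1}{2}\,\lVert k \ast u^{t+1} - f \rVert_2^2,
\qquad
\Delta = \bigl\{ k : k_j \ge 0,\ \textstyle\sum_{j} k_j = 1 \bigr\}.
\end{aligned}
\]
```

Each sub-problem is convex on its own (a TV-regularized deconvolution in u and a simplex-constrained least-squares problem in k); the difficulty analysed in the abstract lies in the non-convexity of the joint problem.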

Relevance:

20.00%

Publisher:

Abstract:

Tricyclo-DNA (tcDNA) is a sugar-modified analogue of DNA currently being tested for the treatment of Duchenne muscular dystrophy in an antisense approach. Tandem mass spectrometry plays a key role in modern medical diagnostics and has become a widespread technique for the structure elucidation and quantification of antisense oligonucleotides. Herein, mechanistic aspects of the fragmentation of tcDNA are discussed, which lay the basis for reliable sequencing and quantification of the antisense oligonucleotide. Excellent selectivity of tcDNA for complementary RNA is demonstrated in direct competition experiments. Moreover, the kinetic stability and fragmentation patterns of matched and mismatched tcDNA heteroduplexes were investigated and compared with non-modified DNA and RNA duplexes. Although the separation of the constituent strands is the entropy-favored fragmentation pathway of all nucleic acid duplexes, it was found to be only a minor pathway for tcDNA duplexes. The modified hybrid duplexes preferentially undergo neutral base loss and backbone cleavage. This difference is due to the low activation entropy for strand dissociation of the modified duplexes, which arises from the conformational constraint of the tc-sugar moiety. The low activation entropy results in a relatively high free activation enthalpy for the dissociation, comparable to the free activation enthalpy of the alternative reaction pathway, the release of a nucleobase. The gas-phase behavior of tcDNA duplexes illustrates the impact of the activation entropy on the fragmentation kinetics and suggests that tandem mass spectrometric experiments are not suited to determining the relative stability of different types of nucleic acid duplexes.
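The entropy-enthalpy argument can be made explicit with standard transition-state theory; the relations below are textbook expressions given for illustration, not quantities fitted in the study:

```latex
% Free energy of activation and the Eyring rate law (standard transition-state theory).
\[
\Delta G^{\ddagger} \;=\; \Delta H^{\ddagger} - T\,\Delta S^{\ddagger},
\qquad
k(T) \;=\; \frac{k_{\mathrm{B}}\,T}{h}\,
           \exp\!\Bigl(-\frac{\Delta G^{\ddagger}}{R\,T}\Bigr).
\]
```

For strand separation, a small activation entropy (as imposed by the conformationally constrained tc-sugar) contributes little through the -TΔS‡ term, so ΔG‡ stays high and becomes comparable to that of neutral base loss, which is why strand dissociation ceases to dominate the fragmentation kinetics.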