906 results for Escalonamento multidimensional (multidimensional scaling)
Abstract:
We propose an iterative data reconstruction technique specifically designed for multi-dimensional multi-color fluorescence imaging. A Markov random field is employed to model the multi-color image field, in conjunction with the classical maximum likelihood method. The ill-posed nature of the inverse problem associated with multi-color fluorescence imaging necessitates iterative data reconstruction. Reconstruction of three-dimensional (3D) two-color images (obtained from nanobeads and cultured cell samples) shows a significant reduction in background noise (improved signal-to-noise ratio) together with an impressive overall improvement in the spatial resolution (approximately 250 nm) of the imaging system. The proposed data reconstruction technique may find immediate application in 3D in vivo and in vitro multi-color fluorescence imaging of biological specimens. (C) 2012 American Institute of Physics. [http://dx.doi.org/10.1063/1.4769058]
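A minimal sketch of the kind of iterative maximum-likelihood reconstruction with a Markov-random-field-style smoothness penalty described above, applied channel by channel to a 3D two-color stack. The Richardson-Lucy update, the one-step-late Laplacian penalty, and all parameter values are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch: maximum-likelihood (Richardson-Lucy) deconvolution with a
# simple quadratic MRF smoothness penalty, run independently per color channel.
import numpy as np
from scipy.ndimage import convolve, laplace

def rl_mrf_channel(data, psf, iters=50, beta=0.01, eps=1e-12):
    """Richardson-Lucy iterations with a one-step-late MRF (Laplacian) penalty."""
    data = np.asarray(data, dtype=float)
    psf_flipped = psf[::-1, ::-1, ::-1]          # adjoint (mirrored) PSF
    estimate = np.full(data.shape, data.mean())
    for _ in range(iters):
        blurred = convolve(estimate, psf, mode="reflect") + eps
        correction = convolve(data / blurred, psf_flipped, mode="reflect")
        # One-step-late MRF regularization: damp non-smooth estimates.
        penalty = 1.0 + beta * np.abs(laplace(estimate))
        estimate = estimate * correction / penalty
    return estimate

def reconstruct_multicolor(stack, psfs, **kw):
    """stack: (channels, z, y, x); psfs: one 3D PSF per channel."""
    return np.stack([rl_mrf_channel(stack[c], psfs[c], **kw)
                     for c in range(stack.shape[0])])
```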
Abstract:
We propose and experimentally demonstrate a three-dimensional (3D) image reconstruction methodology based on Taylor series approximation (TSA) in a Bayesian image reconstruction formulation. TSA incorporates the requirement of analyticity in the image domain and acts as a finite impulse response (FIR) filter. The technique is validated on images obtained from widefield, confocal laser scanning fluorescence microscopy, and two-photon excited 4pi (2PE-4pi) fluorescence microscopy. Studies on simulated 3D objects, mitochondria-tagged yeast cells (labeled with Mitotracker Orange), and mitochondrial networks (tagged with green fluorescent protein) show a signal-to-background improvement of 40% and a resolution enhancement from 360 to 240 nm. The technique can easily be extended to other imaging modalities (single plane illumination microscopy (SPIM), individual molecule localization SPIM, stimulated emission depletion microscopy and its variants).
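The abstract describes the TSA step as enforcing local analyticity and acting as an FIR filter. A minimal stand-in, assuming a local polynomial (truncated Taylor-series) fit realized as a Savitzky-Golay FIR filter applied to the estimate after each iteration of a Bayesian loop such as the one sketched above; window length and polynomial order are illustrative assumptions.

```python
# Hedged sketch: axis-by-axis Savitzky-Golay smoothing as a stand-in for a
# truncated-Taylor-series FIR filtering step in an iterative reconstruction.
import numpy as np
from scipy.signal import savgol_filter

def tsa_like_filter(volume, window=5, order=2):
    """Smooth a 3D estimate axis-by-axis with a Savitzky-Golay FIR filter."""
    out = np.asarray(volume, dtype=float)
    for axis in range(out.ndim):
        out = savgol_filter(out, window, order, axis=axis)
    return np.clip(out, 0, None)   # keep fluorescence intensities non-negative
```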
Abstract:
The protein folding funnel paradigm suggests that folding and unfolding proceed as directed diffusion on a multidimensional free energy surface where a multitude of pathways can be traversed during the protein's sojourn from initial to final state. However, finding even a single pathway, with a detailed chronicling of intermediates, is an arduous task. In this work we explore the free energy surface of the unfolding pathway through umbrella sampling for a small globular α-helical protein, the chicken villin headpiece (HP-36), when the melting of secondary structures is induced by adding DMSO to aqueous solution. We find that unfolding proceeds through the initial separation, or melting, of the aggregated hydrophobic core that comprises three phenylalanine residues (Phe7, Phe11, and Phe18). This separation is accompanied by simultaneous melting of the second helix. Unfolding is found to be a multistage process involving the crossing of three consecutive minima and two barriers at the initial stage. At a molecular level, Phe18 is observed to reorient itself towards other hydrophobic grooves to stabilize the intermediate states. We identify the configurations of the intermediates and correlate them with those obtained in our previous works. We also give an estimate of the barriers for different transition states and observe a softening of the barriers with increasing DMSO concentration. We show that a higher concentration of DMSO tunes the unfolding pathway by destabilizing the third minimum and stabilizing the second one, indicating the development of a solvent-modified, less rugged pathway. The prime outcome of this work is the demonstration that mixed solvents can profoundly transform the nature of the energy landscape and induce unfolding via a modified route. A successful application of Kramers' rate equation to the free energy simulation results shows a faster rate of unfolding with increasing DMSO concentration. This work perhaps presents the first systematic theoretical study of the effect of a chemical denaturant on the microscopic free energy surface and rates of unfolding of HP-36. (C) 2014 AIP Publishing LLC.
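For reference, the Kramers rate expression invoked above, in its commonly used high-friction (overdamped) form; the barrier softening reported with increasing DMSO (a smaller ΔG‡) translates directly into an exponentially faster unfolding rate.

```latex
% Kramers' escape rate in the high-friction (overdamped) limit:
% \omega_{\min} and \omega_{b} are the angular frequencies (curvatures) at the
% well minimum and the barrier top, \gamma is the friction coefficient, and
% \Delta G^{\ddagger} is the barrier height.
k = \frac{\omega_{\min}\,\omega_{b}}{2\pi\gamma}
    \exp\!\left(-\frac{\Delta G^{\ddagger}}{k_{B}T}\right)
```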
Abstract:
The nonlinear optical response of a current-carrying single molecule coupled to two metal leads and driven by a sequence of impulsive optical pulses with controllable phases and time delays is calculated. Coherent (stimulated, heterodyne) detection of photons and incoherent detection of the optically induced current are compared. Using a diagrammatic Liouville space superoperator formalism, the signals are recast in terms of molecular correlation functions, which are then expanded in the many-body molecular states. Two-dimensional signals in the benzene-1,4-dithiol molecule show cross peaks involving charged states. A correlation between the optical and charge-current signals is also observed. (C) 2015 AIP Publishing LLC.
Abstract:
Electromigration, mostly known for its damaging effects in microelectronic devices, is basically a material transport phenomenon driven by the electric field and kinetically controlled by diffusion. In this work, we show how controlled electromigration can be used to create scientifically interesting and technologically useful micro-/nano-scale patterns, which are otherwise extremely difficult to fabricate using conventional cleanroom practices, and present a few examples of such patterns. In a solid thin-film structure, electromigration is used to generate pores at preset locations for enhancing the sensitivity of a MEMS sensor. In addition to electromigration in solids, the flow instability associated with the electromigration-induced long-range flow of liquid metals is shown to form numerous structures with a high surface-area-to-volume ratio. In very thin solid films on non-conductive substrates, solidification of the flow-affected region results in the formation of several features, such as nano-/micro-sized discrete metallic beads and 3D structures consisting of nano-stepped stairs.
Abstract:
This paper proposes a new algorithm for wavelet-based multidimensional image deconvolution which employs subband-dependent minimization and the dual-tree complex wavelet transform in an iterative Bayesian framework. In addition, the algorithm employs a new prior instead of the popular ℓ1 norm and is thus able to embed a learning scheme during the iteration, which helps it achieve better deconvolution results and faster convergence. © 2008 IEEE.
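A minimal sketch of an iterative wavelet-regularized deconvolution of the sort outlined above. It swaps the dual-tree complex wavelet transform, subband-dependent minimization, and learned prior for a standard orthogonal DWT (PyWavelets) with uniform soft-thresholding, so the wavelet, decomposition level, step size, and threshold are illustrative assumptions.

```python
# Hedged sketch: ISTA-style deconvolution with wavelet-domain soft-thresholding.
import numpy as np
import pywt
from scipy.ndimage import convolve

def wavelet_deconv(data, psf, iters=50, step=1.0, thresh=0.01,
                   wavelet="db4", level=3):
    data = np.asarray(data, dtype=float)
    psf_flipped = np.flip(psf)                   # adjoint of the blur operator
    x = data.copy()
    for _ in range(iters):
        # Gradient step on the data-fidelity term ||h * x - y||^2.
        residual = convolve(x, psf, mode="reflect") - data
        x = x - step * convolve(residual, psf_flipped, mode="reflect")
        # Proximal step: soft-threshold detail coefficients, subband by subband.
        coeffs = pywt.wavedecn(x, wavelet, level=level)
        coeffs = [coeffs[0]] + [
            {key: pywt.threshold(band, thresh, mode="soft")
             for key, band in detail.items()}
            for detail in coeffs[1:]
        ]
        x = pywt.waverecn(coeffs, wavelet)[tuple(slice(s) for s in data.shape)]
    return x
```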
Abstract:
Although the recovery period following the crisis at the end of convertibility showed improvements in monetary poverty and inequality measurements, the analysis of multidimensional measures reveals a stagnation in these improvements starting as early as 2007. This paper seeks to examine the components of this change through a temporal and group decomposition exercise of the Alkire-Foster (2007) measure applied to data from the Encuesta de la Deuda Social Argentina.
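A minimal sketch of the Alkire-Foster adjusted headcount ratio M0 and its population-subgroup decomposition, the measure decomposed over time and groups in the abstract above. Dimension weights, the poverty cutoff k, and the group coding are illustrative assumptions.

```python
# Hedged sketch: Alkire-Foster M0 (adjusted headcount ratio) and its
# decomposition into population-share weighted subgroup contributions.
import numpy as np

def af_m0(deprivations, weights, k):
    """deprivations: (n, d) 0/1 matrix; weights sum to 1; k: poverty cutoff."""
    scores = deprivations @ weights              # weighted deprivation score c_i
    poor = scores >= k                           # identification step
    censored = np.where(poor, scores, 0.0)       # censor the non-poor
    return censored.mean()                       # M0 = H * A

def decompose_by_group(deprivations, weights, k, groups):
    """Overall M0 equals sum over groups of (population share * subgroup M0)."""
    n = len(deprivations)
    return {g: (np.sum(groups == g) / n,
                af_m0(deprivations[groups == g], weights, k))
            for g in np.unique(groups)}
```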
Abstract:
A parallel strategy for solving multidimensional tridiagonal equations is investigated in this paper. We present in detail an improved version of the single parallel partition (SPP) algorithm in conjunction with message vectorization, which aggregates several communication messages into one to reduce the communication cost. We show that the resulting block SPP algorithm can achieve good speedup for a wide range of message vector lengths (MVL), especially when the number of grid points in the divided direction is large. Instead of simply using the largest possible MVL, we use numerical tests and modeling analysis to determine an optimal MVL so that a significant improvement in speedup can be obtained.
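The underlying building block, solving many independent tridiagonal systems along one direction of a multidimensional grid, can be sketched with the serial Thomas algorithm below; the SPP partitioning of the divided direction and the MPI message vectorization described above are not shown, and the array shapes are illustrative assumptions.

```python
# Hedged sketch: serial Thomas algorithm swept line-by-line over a 2D grid.
import numpy as np

def thomas(a, b, c, d):
    """Solve one tridiagonal system: a = sub-, b = main, c = super-diagonal, d = RHS."""
    n = len(d)
    cp, dp = np.empty(n), np.empty(n)
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):                        # forward elimination
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = np.empty(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):               # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def solve_lines(a, b, c, rhs):
    """Sweep every grid line of a 2D field along axis 0 independently."""
    return np.stack([thomas(a, b, c, rhs[:, j])
                     for j in range(rhs.shape[1])], axis=1)
```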
Abstract:
In recent decades, the machine-shop production scheduling problem, referred to in the literature as the JSSP (Job Shop Scheduling Problem), has received great attention from researchers around the world. One of the reasons for such interest lies in its high complexity: the JSSP is a combinatorial problem classified as NP-hard and, although a wide variety of methods and heuristics can solve it, no method or heuristic is yet able to find optimal solutions for all the benchmark problems presented in the literature. The other reason is that this problem is present in the day-to-day operations of manufacturing industries in several segments; since schedule optimization can yield a significant reduction in production time and, consequently, better use of production resources, it can have a strong impact on the profit of these industries, especially when the production sector accounts for a large share of their total costs. Among the heuristics that can be applied to this problem, Tabu Search and Particle Swarm perform well on most benchmark problems found in the literature. Tabu Search generally shows good, fast convergence to optimal or sub-optimal points, but this convergence is frequently interrupted by cycling, and the method's performance depends strongly on the initial solution and on parameter tuning. Particle Swarm tends to converge to optimal points at the cost of considerable computational effort, and its performance is also highly sensitive to parameter tuning. Since the different heuristics applied to the problem have strengths and weaknesses, some researchers are now focusing their efforts on hybridizing existing heuristics to produce new hybrid heuristics that combine the qualities of their base heuristics, thereby reducing or even eliminating their negative aspects. In this work, three hybridization models based on the general scheme of local search heuristics are first presented and tested with the Tabu Search and Particle Swarm heuristics. An adaptation of the Particle Collision method, originally developed for continuous problems, is then presented, in which Tabu Search is used as the local exploration operator and mutation operators are used to perturb the solution. As a result, this work shows that, for the hybrid models, the complementary and distinct natures of the Tabu Search and Particle Swarm methods, as presented here, give rise to robust algorithms capable of producing optimal or very good solutions that are much less sensitive to the tuning of each original method's parameters. In the case of the Particle Collision method, the new algorithm attenuates the sensitivity to parameter tuning and avoids the cycling of Tabu Search, thus producing better results.
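A minimal sketch of two JSSP building blocks that Tabu Search / Particle Swarm hybrids of this kind operate on, assuming the common operation-based encoding: a makespan decoder and a simple swap move for generating neighbors. The toy instance is illustrative; the thesis' hybrid models themselves are not reproduced here.

```python
# Hedged sketch: makespan decoding for the operation-based JSSP encoding
# (each job id appears once per operation; its k-th occurrence schedules that
# job's k-th operation) plus a swap move usable as a tabu-search neighborhood.
import random

def makespan(sequence, jobs):
    """jobs[j] = list of (machine, processing_time) in technological order."""
    next_op = [0] * len(jobs)            # next operation index per job
    job_ready = [0] * len(jobs)          # completion time of each job's last op
    mach_ready = {}                      # completion time per machine
    for j in sequence:
        machine, duration = jobs[j][next_op[j]]
        start = max(job_ready[j], mach_ready.get(machine, 0))
        job_ready[j] = mach_ready[machine] = start + duration
        next_op[j] += 1
    return max(job_ready)

def swap_neighbor(sequence):
    """Perturb a schedule by swapping two positions (still a valid encoding)."""
    i, k = random.sample(range(len(sequence)), 2)
    neighbor = list(sequence)
    neighbor[i], neighbor[k] = neighbor[k], neighbor[i]
    return neighbor

# Toy instance: 2 jobs x 2 machines; the sequence lists each job id twice.
jobs = [[(0, 3), (1, 2)], [(1, 2), (0, 4)]]
print(makespan([0, 1, 0, 1], jobs))      # -> 7
```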