946 results for Log-Gabor Filter
Abstract:
This document covers: conceptual aspects of the law (purpose, importance, and hierarchy); general notions of the Brazilian public procurement law, Lei nº 8.666/93; award criteria: lowest price, best technique, technique and price, and highest bid or offer; procurement modalities: concorrência, tomada de preços, convite, concurso, and leilão (open tender, restricted tender, invitation, competition, and auction); exceptions to mandatory bidding: waiver (dispensa) and non-enforceability (inexigibilidade); the indirect execution regime; the bidding committee; stages of the bidding process: the notice (edital), tender procedures and documents, registration records, qualification of bidders, judgment, and closing; pregão (reverse auction); and price registration.
Abstract:
The rules governing the Performance and Productivity bonus set limits on the number of civil servants who may receive scores above 8: 20% for scores between 8 and 9 and 10% for scores above 9. Because the Secretaria Federal de Controle has many units, it became unfeasible to select the servants who, owing to their better performance, deserved the highest scores. The solution to this and other related problems was the implementation of the Sistema de Avaliação Logística Variada (SALVA), which operationalizes the distribution of the non-integer values of the performance and productivity bonus accurately and equitably.
Abstract:
Reconstitutes the Steering Committee of ENAP's Sustainable Logistics Management Plan, created by Portaria nº 259 of 20 December 2012, with the purpose of drafting, monitoring, evaluating, and revising ENAP's Sustainable Logistics Management Plan (PLS), as required by § 2 of art. 6 of Instrução Normativa SLTI/MPOG nº 10, of 12 November 2012. This Committee ratifies and monitors the fulfilment of the goals agreed between ENAP and the Executive Secretariat of the Ministry of Planning, Budget and Management, aiming at the rational use of resources by fighting waste, reducing consumption, and improving spending.
Abstract:
A brief presentation of multivariate analysis of categorical data, using a log-linear model for the case of a 2 x 2 x 2 contingency table.
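As a minimal illustration of the kind of analysis summarized above (not taken from the paper), a log-linear model for a 2 x 2 x 2 table can be fitted as a Poisson regression on the cell counts; the counts below are made up and statsmodels is assumed to be available.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical 2 x 2 x 2 contingency table, one row per cell.
data = pd.DataFrame({
    "A": ["a1", "a1", "a1", "a1", "a2", "a2", "a2", "a2"],
    "B": ["b1", "b1", "b2", "b2", "b1", "b1", "b2", "b2"],
    "C": ["c1", "c2", "c1", "c2", "c1", "c2", "c1", "c2"],
    "count": [20, 15, 10, 5, 12, 18, 7, 13],
})

# Log-linear model = Poisson GLM on the cell counts with a log link.
# "A * B * C" gives the saturated model; "(A + B + C)**2" drops the
# three-way interaction, so their deviance difference tests that term.
saturated = smf.glm("count ~ A * B * C", data=data,
                    family=sm.families.Poisson()).fit()
reduced = smf.glm("count ~ (A + B + C)**2", data=data,
                  family=sm.families.Poisson()).fit()

lr_stat = reduced.deviance - saturated.deviance
print("LR statistic for the A:B:C term:", lr_stat)
```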
Abstract:
Fluorescent protein microscopy imaging is nowadays one of the most important tools in biomedical research. However, the resulting images present a low signal-to-noise ratio and a time intensity decay due to the photobleaching effect. This phenomenon is a consequence of the decrease in the radiation emission efficiency of the tagging protein, which occurs because the fluorophore permanently loses its ability to fluoresce due to photochemical reactions induced by the incident light. The Poisson multiplicative noise that corrupts these images, together with the quality degradation caused by photobleaching, makes long-term biological observation very difficult. In this paper a denoising algorithm for Poisson data, in which the photobleaching effect is explicitly taken into account, is described. The algorithm is designed in a Bayesian framework where the data fidelity term models the Poisson noise generation process as well as the exponential intensity decay caused by the photobleaching. The prior term is conceived with Gibbs priors and log-Euclidean potential functions, suited to the positivity-constrained nature of the parameters to be estimated. Monte Carlo tests with synthetic data are presented to characterize the performance of the algorithm. One example with real data is included to illustrate its application.
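A crude sketch of the estimation problem described above, assuming a known photobleaching rate and replacing the paper's log-Euclidean Gibbs prior with a plain quadratic smoothness penalty on the log-intensity; it is not the authors' algorithm, only a MAP gradient-ascent toy on synthetic data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data (assumption: known decay rate lam, constant true image x).
T, lam = 30, 0.05                                   # frames, bleaching rate
x_true = 50.0 + 30.0 * rng.random((64, 64))
decay = np.exp(-lam * np.arange(T))                 # exponential intensity decay
y = rng.poisson(x_true[None] * decay[:, None, None])  # Poisson-corrupted frames

# MAP estimate of z = log(x).
# Data term: sum_t [ y_t * z - exp(z) * exp(-lam * t) ]   (Poisson log-likelihood)
# Prior term: -beta * ||grad z||^2   (simple quadratic potential on log-intensity)
S_y, S_d = y.sum(axis=0), decay.sum()
z = np.log(np.maximum(y.mean(axis=0), 1.0))         # initial guess
beta, step = 0.5, 1e-3

def laplacian(u):
    # 4-neighbour discrete Laplacian (periodic borders, for brevity).
    return (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
            np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4 * u)

for _ in range(400):
    grad = S_y - np.exp(z) * S_d + 2 * beta * laplacian(z)
    z += step * grad                                 # gradient ascent on log-posterior

x_hat = np.exp(z)
print("RMSE:", np.sqrt(np.mean((x_hat - x_true) ** 2)))
```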
Abstract:
Wyner-Ziv (WZ) video coding is a particular case of distributed video coding, the recent video coding paradigm based on the Slepian-Wolf and Wyner-Ziv theorems that exploits the source correlation at the decoder, and not at the encoder as in predictive video coding. Although many improvements have been made in recent years, the performance of state-of-the-art WZ video codecs has still not reached that of state-of-the-art predictive video codecs, especially for high and complex motion video content. This is also true in terms of subjective image quality, mainly because of the considerable amount of blocking artefacts present in the decoded WZ video frames. This paper proposes an adaptive deblocking filter to improve both the subjective and objective quality of the WZ frames in a transform-domain WZ video codec. The proposed filter is an adaptation of the advanced deblocking filter defined in the H.264/AVC (advanced video coding) standard to a WZ video codec. The results obtained confirm the subjective quality improvement and objective quality gains, which can reach 0.63 dB overall for sequences with high motion content when large groups of pictures are used.
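To make the idea of boundary-adaptive deblocking concrete, here is a toy smoother that attenuates small steps across 8-pixel block edges (likely coding artefacts) while leaving large steps (likely real edges) untouched; it is a deliberately simplified stand-in, not the H.264/AVC filter or the adaptation proposed in the paper.

```python
import numpy as np

def deblock_vertical_edges(frame, block=8, alpha=12.0):
    """Toy boundary-adaptive deblocking along vertical block edges.

    The step across each block boundary is smoothed only when it is small
    enough to look like a coding artefact rather than a real edge
    (threshold alpha). Horizontal edges would be handled analogously.
    """
    out = frame.astype(np.float64).copy()
    h, w = out.shape
    for x in range(block, w, block):
        step = out[:, x] - out[:, x - 1]        # discontinuity across the edge
        weak = np.abs(step) < alpha             # artefact-like discontinuities
        delta = 0.25 * step                     # move both sides a quarter step
        out[weak, x - 1] += delta[weak]
        out[weak, x] -= delta[weak]
    return out

# Usage on a synthetic blocky frame (4 x 4 blocks of constant intensity).
frame = np.kron(np.arange(16).reshape(4, 4) * 10.0, np.ones((8, 8)))
smoothed = deblock_vertical_edges(frame)
```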
Abstract:
Master's in Radiation Applied to Health Technologies (Mestrado em Radiações Aplicadas às Tecnologias da Saúde).
Abstract:
In real optimization problems, the analytical expression of the objective function and of its derivatives is often unknown, or too complex to be of use. In these cases it becomes essential to use optimization methods in which the calculation of the derivatives, or the verification of their existence, is not necessary: direct search methods, also called derivative-free methods, are one solution. When the problem has constraints, penalty functions are often used. Unfortunately, choosing the penalty parameters is frequently very difficult, because most strategies for choosing them are heuristic. Filter methods appeared as an alternative to penalty functions. A filter algorithm introduces a function that aggregates the constraint violations and constructs a bi-objective problem; a step is accepted if it reduces either the objective function or the constraint violation. This makes filter methods less parameter-dependent than penalty functions. In this work, we present a new direct search method for general constrained optimization, based on simplex methods, that combines the features of the simplex method and filter methods. This method does not compute or approximate any derivatives, penalty constants or Lagrange multipliers. The basic idea of the simplex filter algorithm is to construct an initial simplex and use it to drive the search. We illustrate the behavior of our algorithm through some examples. The proposed methods were implemented in Java.
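The sketch below, written in Python for compactness, shows how a simplex step (a Nelder-Mead-style reflection) can be coupled with a filter acceptance test on the pair (constraint violation, objective value); it is a hypothetical simplification of the idea, not the authors' Java implementation, and the problem at the end is an arbitrary example.

```python
import numpy as np

def h(x, constraints):
    """Aggregate constraint violation for constraints g_i(x) <= 0."""
    return sum(max(0.0, g(x)) for g in constraints)

def acceptable(pair, flt):
    """Filter test: accept if no stored entry dominates the (h, f) pair."""
    hp, fp = pair
    return all(hp < hq or fp < fq for hq, fq in flt)

def simplex_filter(f, constraints, x0, iters=200, size=0.5):
    n = len(x0)
    simplex = [np.asarray(x0, float)] + \
              [np.asarray(x0, float) + size * np.eye(n)[i] for i in range(n)]
    flt = []                                          # the filter: list of (h, f)
    for _ in range(iters):
        simplex.sort(key=lambda v: (h(v, constraints), f(v)))
        centroid = np.mean(simplex[:-1], axis=0)
        trial = centroid + (centroid - simplex[-1])   # reflect the worst vertex
        pair = (h(trial, constraints), f(trial))
        if acceptable(pair, flt):
            flt.append(pair)
            simplex[-1] = trial
        else:                                         # rejected: shrink the simplex
            simplex = [simplex[0] + 0.5 * (v - simplex[0]) for v in simplex]
    return simplex[0]

# Usage: minimize (x0 - 1)^2 + (x1 - 2)^2 subject to x0 + x1 <= 2.
best = simplex_filter(lambda x: (x[0] - 1) ** 2 + (x[1] - 2) ** 2,
                      [lambda x: x[0] + x[1] - 2.0], x0=[0.0, 0.0])
print(best)
```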
Abstract:
The filter method is a technique for solving nonlinear programming problems. The filter algorithm has two phases in each iteration. The first one reduces a measure of infeasibility, while in the second the objective function value is reduced. In real optimization problems, usually the objective function is not differentiable or its derivatives are unknown. In these cases it becomes essential to use optimization methods where the calculation of the derivatives or the verification of their existence is not necessary: direct search methods or derivative-free methods are examples of such techniques. In this work we present a new direct search method, based on simplex methods, for general constrained optimization that combines the features of simplex and filter methods. This method neither computes nor approximates derivatives, penalty constants or Lagrange multipliers.
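Since the abstract stresses the two phases of each filter iteration, the short sketch below illustrates them with a naive derivative-free poll: phase one reduces an infeasibility measure, phase two reduces the objective without worsening that measure. The polling scheme and the example problem are assumptions for illustration only, not the method of the paper.

```python
import numpy as np

def poll(x, score, radius, rng, tries=20):
    """Return the best of a few random polls around x according to `score`."""
    best_x, best_s = x, score(x)
    for _ in range(tries):
        cand = x + radius * rng.standard_normal(x.size)
        s = score(cand)
        if s < best_s:
            best_x, best_s = cand, s
    return best_x

def two_phase_step(x, f, h, radius, rng):
    x = poll(x, h, radius, rng)              # phase 1: reduce infeasibility h
    h_ref = h(x)
    return poll(x, lambda z: f(z) if h(z) <= h_ref + 1e-8 else np.inf,
                radius, rng)                 # phase 2: reduce f, keep h in check

rng = np.random.default_rng(1)
f = lambda x: (x[0] - 1) ** 2 + (x[1] - 2) ** 2
h = lambda x: max(0.0, x[0] + x[1] - 2.0)    # violation of x0 + x1 <= 2
x = np.array([3.0, 3.0])
for _ in range(100):
    x = two_phase_step(x, f, h, 0.2, rng)
print(x, f(x), h(x))
```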
Abstract:
Amorphous SiC tandem heterostructures are used to filter a specific band in the visible range. Experimental and simulated results are compared to validate the use of SiC multilayered structures in applications where gain compensation is needed or where unwanted wavelengths must be attenuated. Spectral response data acquired under different frequencies, optical wavelength control and side irradiations are analyzed, and the transfer function characteristics are discussed. Color pulsed communication channels are transmitted together and the output signal is analyzed under different background conditions. Results show that under controlled wavelength backgrounds the device sensitivity is enhanced in a precise wavelength range and quenched in the others, tuning or suppressing a specific band. Depending on the background wavelength and irradiation side, the device acts either as a long-pass, a short-pass, or a band-rejection filter. An optoelectronic model supports the experimental results and gives insight into the physics of the device.
Abstract:
Discrete data representations are necessary, or at least convenient, in many machine learning problems. While feature selection (FS) techniques aim at finding relevant subsets of features, the goal of feature discretization (FD) is to find concise (quantized) data representations adequate for the learning task at hand. In this paper, we propose two incremental methods for FD. The first belongs to the filter family, in which the quality of the discretization is assessed by a (supervised or unsupervised) relevance criterion. The second is a wrapper, in which the discretized features are assessed using a classifier. Both methods can be coupled with any static (supervised or unsupervised) discretization procedure and can be used to perform FS as a pre-processing or post-processing stage. The proposed methods attain efficient representations suitable for binary and multi-class problems with different types of data, and are competitive with existing methods. Moreover, using well-known FS methods with the features discretized by our techniques leads to better accuracy than with the features discretized by other methods or with the original features.
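A generic sketch of a filter-style incremental discretizer, assuming quantile binning and mutual information with the class labels as the relevance criterion; the binning scheme, stopping rule and data are illustrative assumptions, not the methods actually proposed in the paper.

```python
import numpy as np

def mutual_information(codes, y):
    """MI (in nats) between a discretized feature and class labels."""
    joint = np.zeros((codes.max() + 1, y.max() + 1))
    np.add.at(joint, (codes, y), 1)
    joint /= joint.sum()
    px, py = joint.sum(1, keepdims=True), joint.sum(0, keepdims=True)
    nz = joint > 0
    return float((joint[nz] * np.log(joint[nz] / (px @ py)[nz])).sum())

def discretize(x, n_bins):
    """Quantile-based quantization of a 1-D feature into n_bins codes."""
    edges = np.quantile(x, np.linspace(0, 1, n_bins + 1)[1:-1])
    return np.searchsorted(edges, x)

def incremental_fd(x, y, max_bins=16, tol=0.01):
    best_bins, best_mi, codes = 1, 0.0, np.zeros_like(y)
    for b in range(2, max_bins + 1):
        cand = discretize(x, b)
        mi = mutual_information(cand, y)
        if mi <= best_mi + tol:          # stop when relevance stops improving
            break
        best_bins, best_mi, codes = b, mi, cand
    return codes, best_bins, best_mi

# Usage on synthetic data: a feature informative about a binary label.
rng = np.random.default_rng(0)
y = rng.integers(0, 2, 1000)
x = y + 0.8 * rng.standard_normal(1000)
codes, n_bins, mi = incremental_fd(x, y)
print(n_bins, round(mi, 3))
```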
Abstract:
Characteristics of tunable wavelength pi'n/pin filters based on a-SiC:H multilayered stacked cells are studied both experimentally and theoretically. Results show that the device combines the demultiplexing operation with simultaneous photodetection and self-amplification of the signal. An algorithm to decode the multiplexed signal is established. A capacitive active band-pass filter model is presented and supported by an electrical simulation of the state variable filter circuit. Experimental and simulated results show that the device acts as a state variable filter: it combines the properties of active high-pass and low-pass filter sections into a capacitive active band-pass filter, using a changing capacitance to control the power delivered to the load.
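For reference, the band-pass section of a generic second-order state-variable filter has transfer function H(s) = (w0/Q)s / (s^2 + (w0/Q)s + w0^2); the snippet below evaluates it with scipy for an arbitrary centre frequency and quality factor, which are illustrative assumptions rather than parameters of the a-SiC:H device.

```python
import numpy as np
from scipy import signal

f0, Q = 5e3, 2.0                       # assumed centre frequency and quality factor
w0 = 2 * np.pi * f0

# H_bp(s) = (w0/Q) s / (s^2 + (w0/Q) s + w0^2)
b = [w0 / Q, 0.0]                      # numerator coefficients
a = [1.0, w0 / Q, w0 ** 2]             # denominator coefficients

w, h = signal.freqs(b, a, worN=np.logspace(3, 6, 400))  # rad/s grid
gain_db = 20 * np.log10(np.abs(h))
print("peak gain (dB) near w0:", gain_db.max())
```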
Abstract:
This paper extends the by now classic two-sensor complementary filter (CF) design for sensor fusion to the case where three sensors providing measurements in different bands are available. The paper shows that using classical CF techniques to tackle a generic three-sensor fusion problem, based solely on the sensors' frequency-domain characteristics, leads to a minimal-realization, stable, sub-optimal solution, denoted Complementary Filters3 (CF3). A new approach to the estimation problem is then used, based on optimal linear Kalman filtering techniques. Moreover, the solution is shown to preserve the complementary property, i.e. the three transfer functions of the respective sensors add up to one, both in the continuous and discrete time domains. This new class of filters is denoted Complementary Kalman Filters3 (CKF3). The attitude estimation of a mobile robot is addressed, based on data from a rate gyroscope, a digital compass, and odometry, and the experimental results obtained are reported.
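As background, the classic two-sensor complementary filter that the paper extends can be written in a few lines; the sketch below fuses a biased rate gyroscope with a noisy compass for heading, on made-up data, and does not reproduce the CF3/CKF3 designs of the paper.

```python
import numpy as np

def complementary_filter(gyro_rate, compass, dt, tau=1.0):
    """Classic two-sensor CF: gyro covers the high band, compass the low band."""
    alpha = tau / (tau + dt)            # cutoff set by the time constant tau
    theta = np.zeros_like(compass)
    theta[0] = compass[0]
    for k in range(1, len(compass)):
        theta[k] = alpha * (theta[k - 1] + gyro_rate[k] * dt) \
                   + (1.0 - alpha) * compass[k]   # the two weights add up to one
    return theta

# Usage on synthetic data: biased gyro plus noisy but unbiased compass.
dt, n = 0.01, 2000
t = np.arange(n) * dt
true = 0.5 * np.sin(0.5 * t)
rng = np.random.default_rng(0)
gyro = np.gradient(true, dt) + 0.05 + 0.02 * rng.standard_normal(n)
compass = true + 0.1 * rng.standard_normal(n)
est = complementary_filter(gyro, compass, dt)
print("RMS heading error:", np.sqrt(np.mean((est - true) ** 2)))
```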
Abstract:
The authors studied 70 leprosy patients and 20 normal individuals, comparing the traditional serum collection method with finger-prick blood preserved on filter paper for the detection of specific antibodies against native phenolic glycolipid-I (PGL-I) from Mycobacterium leprae. The finger-prick blood dried on filter paper was eluted in phosphate-buffered saline (PBS) containing 0.5% gelatin. The classical assay for native PGL-I was performed on these eluates and compared with the antibody determination in sera. A close correlation was observed between the two methods, although the titers found for the eluates were lower than those obtained by serology. This blood collection method could be useful for the investigation of new leprosy cases in the field, especially among contacts.