826 results for "2D barcode based authentication scheme"
Abstract:
Worldwide, tuberculosis (TB) is the leading cause of death among curable infectious diseases. Multidrug-resistant Mycobacterium tuberculosis is an emerging problem of great importance to public health, and there is an urgent need for new anti-TB drugs. In the present work, classical two-dimensional quantitative structure-activity relationship (2D QSAR) and hologram QSAR (HQSAR) studies were performed on a training set of 91 isoniazid derivatives. Significant statistical models were obtained (classical QSAR, q² = 0.68 and r² = 0.72; HQSAR, q² = 0.63 and r² = 0.86), indicating their consistency for untested compounds. The models were then used to evaluate an external test set of 24 compounds not included in the training set, and the predicted values were in good agreement with the experimental results (HQSAR, r²pred = 0.87; classical QSAR, r²pred = 0.75).
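For reference, the validation statistics quoted above follow conventions that are standard in QSAR work; the abstract does not spell them out, so the exact definitions below are an assumption: q² as the leave-one-out cross-validated coefficient over the training set and r²pred as the predictive coefficient over the external test set.

```latex
% Assumed standard definitions (not taken from the paper):
% q^2       -- leave-one-out cross-validated coefficient over the training set
% r^2_pred  -- predictive coefficient over the external test set
\[
  q^{2} = 1 - \frac{\sum_{i \in \mathrm{train}} \bigl(y_i - \hat{y}_i^{\,\mathrm{LOO}}\bigr)^{2}}
                   {\sum_{i \in \mathrm{train}} \bigl(y_i - \bar{y}_{\mathrm{train}}\bigr)^{2}},
  \qquad
  r^{2}_{\mathrm{pred}} = 1 - \frac{\sum_{j \in \mathrm{test}} \bigl(y_j - \hat{y}_j\bigr)^{2}}
                                   {\sum_{j \in \mathrm{test}} \bigl(y_j - \bar{y}_{\mathrm{train}}\bigr)^{2}}.
\]
```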
Abstract:
Cyclic imides have been widely employed in drug design research due to their multiple pharmacological and biological properties. In the present study, two-dimensional quantitative structure-activity relationship (2D QSAR) studies were conducted on a series of potent analgesic cyclic imides using both classical and hologram QSAR (HQSAR) methods, yielding significant statistical models (classical QSAR, q² = 0.80; HQSAR, q² = 0.84). The models were then used to evaluate an external test set, and the predicted values were in good agreement with the experimental results, indicating their consistency for untested compounds.
Abstract:
2D electrophoresis is a well-known method for protein separation which is extremely useful in the field of proteomics. Each spot in the image represents a protein accumulation, and the goal is to perform a differential analysis between pairs of images to study changes in protein content. It is thus necessary to register the two images by finding spot correspondences. Although it may seem a simple task, the manual processing of these images is generally very cumbersome, especially when strong variations between corresponding sets of spots are expected (e.g. strong non-linear deformations and outliers). In order to solve this problem, this paper proposes a new quadratic assignment formulation together with a correspondence estimation algorithm based on graph matching which takes into account the structural information between the detected spots. Each image is represented by a graph and the task is to find a maximum common subgraph. Successful experimental results using real data are presented, including an extensive comparative performance evaluation with ground-truth data. (C) 2010 Elsevier B.V. All rights reserved.
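To make the idea of structure-aware spot matching concrete, here is a minimal sketch, not the authors' algorithm: each detected spot is a graph node, pairwise distances play the role of edge weights, and a candidate correspondence is scored by how well it preserves those distances (a quadratic-assignment-style objective). The brute-force search and all names are illustrative only; the paper uses a far more scalable graph-matching formulation.

```python
import numpy as np
from itertools import permutations

def edge_matrix(spots):
    """Pairwise distances between detected spots (the graph's edge weights)."""
    d = spots[:, None, :] - spots[None, :, :]
    return np.sqrt((d ** 2).sum(-1))

def qap_score(A, B, mapping):
    """Quadratic-assignment-style score (higher is better): how well mapping the
    spots of image 1 onto spots of image 2 preserves pairwise distances."""
    idx = np.array(mapping)
    return -np.abs(A - B[np.ix_(idx, idx)]).sum()

def match_spots_bruteforce(spots1, spots2):
    """Exhaustive search over mappings; only feasible for a handful of spots.
    Real methods (like the paper's graph matching) use smarter optimization."""
    A, B = edge_matrix(spots1), edge_matrix(spots2)
    best, best_score = None, -np.inf
    for perm in permutations(range(len(spots2)), len(spots1)):
        s = qap_score(A, B, perm)
        if s > best_score:
            best, best_score = perm, s
    return best

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    spots1 = rng.uniform(0, 100, size=(5, 2))               # spots detected in gel image 1
    shuffle = rng.permutation(5)
    spots2 = spots1[shuffle] + rng.normal(0, 0.5, (5, 2))   # deformed, reordered copy
    mapping = match_spots_bruteforce(spots1, spots2)
    print("recovered correspondence:", mapping)
    print("ground truth:           ", tuple(np.argsort(shuffle)))
```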
Abstract:
This work describes a novel methodology for automatic contour extraction from 2D images of 3D neurons (e.g. camera lucida images and other types of 2D microscopy). Most contour-based shape analysis methods cannot be used to characterize such cells because of overlaps between neuronal processes. The proposed framework is specifically aimed at the problem of contour following even in the presence of multiple overlaps. First, the input image is preprocessed in order to obtain an 8-connected skeleton with one-pixel-wide branches, as well as a set of critical regions (i.e., bifurcations and crossings). Next, for each subtree, the tracking stage iteratively labels all valid branch pixels up to a critical region, where it determines the suitable direction in which to proceed. Finally, the labeled skeleton segments are followed in order to yield the parametric contour of the neuronal shape under analysis. The reported system was successfully tested on several images, and results for a set of three neuron images are presented here, each pertaining to a different class (alpha, delta and epsilon ganglion cells) and containing a total of 34 crossings. The algorithm successfully resolved all of these overlaps. The method has also been found to be robust even for images with close parallel segments, and it may be implemented in an efficient manner. The introduction of this approach should pave the way for more systematic application of contour-based shape analysis methods in neuronal morphology. (C) 2008 Elsevier B.V. All rights reserved.
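As a minimal illustration of the tracking stage (not the authors' implementation), the sketch below follows a one-pixel-wide, 8-connected skeleton branch from a tip pixel and stops as soon as the walk becomes ambiguous, i.e. when it reaches a bifurcation or crossing; the helper names are illustrative.

```python
import numpy as np

# 8-connected neighbourhood offsets
OFFSETS = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]

def neighbours(skel, p):
    """Skeleton pixels 8-connected to pixel p."""
    r, c = p
    out = []
    for dr, dc in OFFSETS:
        q = (r + dr, c + dc)
        if 0 <= q[0] < skel.shape[0] and 0 <= q[1] < skel.shape[1] and skel[q]:
            out.append(q)
    return out

def follow_branch(skel, tip):
    """Walk along a one-pixel-wide branch starting at a tip pixel; stop when the
    next step is ambiguous (a bifurcation/crossing neighbourhood) or the branch ends."""
    path, prev, cur = [tip], None, tip
    while True:
        nbrs = [q for q in neighbours(skel, cur) if q != prev]
        if len(nbrs) != 1:          # dead end (0) or critical region ahead (>1): stop
            return path
        prev, cur = cur, nbrs[0]
        path.append(cur)

if __name__ == "__main__":
    skel = np.zeros((7, 7), dtype=bool)
    for r in range(6):
        skel[r, 3] = True           # a vertical branch
    skel[5, 2] = skel[5, 4] = True  # a bifurcation at the bottom
    print(follow_branch(skel, (0, 3)))
```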
Abstract:
The visualization of volumetric data sets is common in several application areas, and the various aspects involved in these techniques have been researched for some years. However, despite the advances in volume visualization techniques, interaction with large data volumes still presents challenges due to issues of perception (or isolation) of internal structures and computational performance. Graphics hardware support for texture-based visualization allows the development of efficient rendering techniques that can be combined with interactive clipping tools to enable the inspection of three-dimensional data sets. Many studies address the performance optimization of clipping tools, but very few deal with the interaction metaphors these tools use. The goal of this work is to develop interactive, intuitive and easy-to-use tools for clipping volumetric images. Initially, a study of the main direct volume visualization techniques is presented, together with how these volumes are explored using volume clipping; in this study, the solution that best fits the present work is identified, in order to guarantee the necessary interactivity. Next, several existing interaction techniques, their metaphors and taxonomies are presented, in order to determine which interaction techniques are easiest to use with clipping tools. Building on this foundation, this work presents the development of three generic clipping tools implemented using two distinct interaction metaphors frequently employed by users of 3D applications: the virtual pointer and the virtual hand. The interactive rate of these tools is achieved through special fragment programs executed directly on the graphics hardware. These programs specify regions within the volume to be discarded during rendering, based on geometric predicates. First, the performance, precision and user preference of the volume clipping tools are evaluated in order to compare the two interaction metaphors. Then, interaction using different input devices for manipulating the volume and the tools is evaluated, and the simultaneous use of both hands for this manipulation is also tested. The results of these evaluation experiments are presented and discussed.
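The fragment programs mentioned above discard parts of the volume according to geometric predicates. The sketch below is a CPU stand-in for that idea in NumPy, with an illustrative spherical clipping region; it is not the GPU implementation described in the work, and all names and parameters are assumptions.

```python
import numpy as np

def clip_sphere(volume, center, radius):
    """Discard (zero out) all voxels inside a sphere, mimicking a fragment program
    that rejects fragments for which the geometric predicate holds."""
    z, y, x = np.indices(volume.shape)
    dist2 = (x - center[0]) ** 2 + (y - center[1]) ** 2 + (z - center[2]) ** 2
    inside = dist2 <= radius ** 2          # the geometric predicate
    clipped = volume.copy()
    clipped[inside] = 0                    # a GPU fragment program would call discard here
    return clipped

if __name__ == "__main__":
    vol = np.random.rand(64, 64, 64).astype(np.float32)
    out = clip_sphere(vol, center=(32, 32, 32), radius=10)
    print("voxels removed:", int((out == 0).sum()))
```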
Abstract:
In hydrocarbon exploration activities, the great enigma is the location of the deposits. Great efforts are undertaken in an attempt to better identify and locate them and, at the same time, to improve the cost-effectiveness of oil extraction. Seismic methods are the most widely used because they are indirect, i.e., they probe the subsurface layers without invading them. A seismogram is a representation of the Earth's interior and its structures through a conveniently arranged display of the data obtained by seismic reflection. A major problem in this representation is the intensity and variety of noise present in the seismogram, such as surface-related noise that contaminates the relevant signals and may mask the desired information carried by waves scattered in deeper regions of the geological layers. A tool was developed to suppress these noises, based on 1D and 2D wavelet transforms. The program, written in Java, separates seismic images according to the directions (horizontal, vertical, mixed or local) and the wavelength bands that compose them, using Daubechies wavelets, auto-resolution and tensor products of wavelet bases. In addition, an option was developed to work on a single image using either the tensor product of two one-dimensional wavelets or the tensor product of a wavelet with the identity; in the latter case, the two-dimensional signal is decomposed along a single direction. This decomposition makes it possible to elongate the two-dimensional wavelets along a given direction, correcting scale effects through the application of auto-resolutions. In other words, the treatment of a seismic image was improved by using 1D and 2D wavelets at different auto-resolution stages. Improvements were also implemented in the display of the images associated with the decompositions at each auto-resolution stage, making it easier to choose the images containing the signals of interest for noise-free image reconstruction. The program was tested with real data and the results were good.
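The sketch below is not the Java tool described above, but a minimal Python/PyWavelets illustration of the underlying idea: a separable (tensor-product) 2D Daubechies decomposition splits an image into an approximation band plus horizontal, vertical and diagonal detail bands, and selected directional bands can be zeroed before reconstruction to suppress directional noise. Wavelet choice, levels and band names are illustrative.

```python
import numpy as np
import pywt

def suppress_direction(image, wavelet="db4", level=2, kill=("vertical",)):
    """Separable 2D wavelet decomposition of a seismic section; zero the detail
    bands of the chosen direction(s) at every level, then reconstruct."""
    coeffs = pywt.wavedec2(image, wavelet, level=level)
    bands = {"horizontal": 0, "vertical": 1, "diagonal": 2}
    new_coeffs = [coeffs[0]]                      # keep the coarse approximation
    for cH, cV, cD in coeffs[1:]:
        detail = [cH, cV, cD]
        for name in kill:
            detail[bands[name]] = np.zeros_like(detail[bands[name]])
        new_coeffs.append(tuple(detail))
    return pywt.waverec2(new_coeffs, wavelet)

if __name__ == "__main__":
    section = np.random.rand(128, 128)            # stand-in for a seismogram panel
    filtered = suppress_direction(section, kill=("vertical", "diagonal"))
    print(filtered.shape)
```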
Abstract:
The aim of this work was to develop a quality index method (QIM) scheme for whole ice-boxed refrigerated blackspot seabream and to perform shelf-life evaluations, using sensory analysis, GR Torrymeter measurements and bacterial counts of specific spoilage organisms (SSO) during chilled storage. A QIM scheme based on a total of 30 demerit points was developed. Sensory, physical and microbiological data were integrated and used to determine the rejection point. Results indicated that the shelf-life of blackspot seabream is around 12-13 days. (C) 2011 Elsevier Ltd. All rights reserved.
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
The scheme is based on Ami Harten's ideas (Harten, 1994), with the main tools coming from wavelet theory, in the framework of multiresolution analysis for cell averages. But instead of evolving cell averages on the finest uniform level, we propose to evolve just the cell averages on the grid determined by the significant wavelet coefficients. Typically, there are only a few cells in each time step: big cells in smooth regions and smaller ones close to irregularities of the solution. For the numerical flux, we use a simple uniform central finite difference scheme, adapted to the size of each cell. If any of the required neighboring cell averages is not present, it is interpolated from coarser scales, but we switch to an ENO scheme in the finest part of the grids. To show the feasibility and efficiency of the method, it is applied to a system arising in polymer flooding of an oil reservoir. In terms of CPU time and memory requirements, it outperforms Harten's multiresolution algorithm.

The proposed method applies to systems of conservation laws in 1D,

\[
\partial_t u(x,t) + \partial_x f(u(x,t)) = 0, \qquad u(x,t) \in \mathbb{R}^m. \tag{1}
\]

In the spirit of finite volume methods, we shall consider the explicit scheme

\[
v_\mu^{n+1} = v_\mu^{n} - \frac{\Delta t}{h_\mu}\left(\bar{f}_\mu - \bar{f}_{\mu^-}\right) = [D v^n]_\mu, \tag{2}
\]

where $\mu$ is a point of an irregular grid $\Gamma$, $\mu^-$ is the left neighbor of $\mu$ in $\Gamma$, $v_\mu^n \approx \frac{1}{\mu - \mu^-}\int_{\mu^-}^{\mu} u(x, t_n)\,dx$ are approximate cell averages of the solution, $\bar{f}_\mu = \bar{f}_\mu(v^n)$ are the numerical fluxes, and $D$ is the numerical evolution operator of the scheme.

According to the definition of $\bar{f}_\mu$, several schemes of this type have been proposed and successfully applied (LeVeque, 1990); Godunov, Lax-Wendroff, and ENO are some of the popular names. The Godunov scheme resolves shocks well, but its first-order accuracy is poor in smooth regions. Lax-Wendroff is second order, but produces dangerous oscillations close to shocks. ENO schemes are good alternatives, with high order and without serious oscillations, but the price is a high computational cost.

Ami Harten proposed in (Harten, 1994) a simple strategy to save expensive ENO flux calculations. The basic tools come from multiresolution analysis for cell averages on uniform grids, and the principle is that wavelet coefficients can be used to characterize local smoothness. Typically, only a few wavelet coefficients are significant. At the finest level, they indicate discontinuity points, where ENO numerical fluxes are computed exactly; elsewhere, cheaper fluxes can be safely used, or just interpolated from coarser scales. Different applications of this principle have been explored by several authors, see for example (G-Muller and Muller, 1998).

Our scheme also uses Ami Harten's ideas, but instead of evolving the cell averages on the finest uniform level, we propose to evolve the cell averages on sparse grids associated with the significant wavelet coefficients. This means that the total number of cells is small, with big cells in smooth regions and smaller ones close to irregularities. This task requires improved new tools, which are described next.
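As a minimal illustration of the multiresolution analysis of cell averages that the scheme builds on (not the adaptive scheme itself), the sketch below forms coarse cell averages by averaging pairs of fine cells, predicts the fine averages back from the coarse level with the simplest (Haar-type, piecewise-constant) prediction, and uses the prediction errors as detail coefficients that flag the cells where the solution is not smooth. Harten's schemes use higher-order prediction; thresholds and names here are illustrative.

```python
import numpy as np

def encode_level(fine):
    """One level of cell-average multiresolution: coarse averages plus details.
    `fine` holds cell averages on an even number of cells; each coarse cell is
    the mean of its two children, and the details are the prediction errors."""
    coarse = 0.5 * (fine[0::2] + fine[1::2])
    predicted = np.repeat(coarse, 2)          # Haar-type prediction: copy the parent
    details = fine - predicted                # "wavelet" coefficients
    return coarse, details

def significant_cells(u, levels=3, eps=1e-2):
    """Flag fine-grid cells whose detail coefficient is significant at any level;
    an adaptive scheme would keep these at full resolution and coarsen the rest."""
    flags = np.zeros(u.size, dtype=bool)
    current, width = u.copy(), 1              # width: fine cells covered per detail
    for _ in range(levels):
        current, details = encode_level(current)
        flags |= np.repeat(np.abs(details) > eps, width)
        width *= 2
    return flags

if __name__ == "__main__":
    x = (np.arange(256) + 0.5) / 256                           # cell centers
    u = np.where(x < 0.3, 0.05 * np.sin(2 * np.pi * x), 0.3)   # smooth part plus a jump at x = 0.3
    mask = significant_cells(u, levels=3, eps=1e-2)
    # only the few cells around the jump should be flagged as non-smooth
    print("cells flagged as non-smooth:", int(mask.sum()), "of", u.size)
```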
Abstract:
In the context of the normalized variable formulation (NVF) of Leonard and the total variation diminishing (TVD) constraints of Harten, this paper presents an extension of a previous work by the authors for solving unsteady incompressible flow problems. The main contributions of the paper are threefold. First, it presents the results of the development and implementation of a bounded high-order upwind adaptive QUICKEST scheme in the robust 3D code Freeflow, for the numerical solution of the full incompressible Navier-Stokes equations. Second, it reports numerical simulation results for the 1D shock tube problem, a 2D impinging jet and 2D/3D broken-dam flows, and compares these results with existing analytical and experimental data. Third, it presents the application of the numerical method to 3D free-surface flow problems. (C) 2007 IMACS. Published by Elsevier B.V. All rights reserved.
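For orientation, here is a minimal first-order upwind sketch for the 1D linear advection equation u_t + a u_x = 0, illustrating the flavour of upwind differencing behind schemes like QUICKEST; it is not the bounded adaptive QUICKEST scheme of the paper, and all parameters are illustrative.

```python
import numpy as np

def upwind_advect(u0, a=1.0, dx=0.01, dt=0.005, steps=100):
    """First-order upwind scheme for u_t + a u_x = 0 on a periodic grid.
    Stable under the CFL condition |a| * dt / dx <= 1."""
    u = u0.copy()
    c = a * dt / dx                            # Courant number
    for _ in range(steps):
        if a >= 0:
            u = u - c * (u - np.roll(u, 1))    # backward difference (upwind for a > 0)
        else:
            u = u - c * (np.roll(u, -1) - u)   # forward difference (upwind for a < 0)
    return u

if __name__ == "__main__":
    x = np.arange(0.0, 1.0, 0.01)
    u0 = np.where((x > 0.1) & (x < 0.3), 1.0, 0.0)   # a square pulse
    u = upwind_advect(u0, a=1.0, dx=0.01, dt=0.005, steps=100)
    # after 100 steps the pulse has moved a*dt*steps = 0.5 to the right (with some smearing)
    print(round(x[np.argmax(u)], 2))
```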
Abstract:
In the Fazenda Belém oil field (Potiguar Basin, Ceará State, Brazil), sinkholes and sudden terrain collapses frequently occur, associated with an unconsolidated sedimentary cap covering the Jandaíra karst. This research was carried out in order to understand the mechanisms that generate these collapses. The main tool used was Ground Penetrating Radar (GPR). The work has two aspects: one concerns methodological improvements in GPR data processing, while the other concerns the geological study of the Jandaíra karst. The second aspect was strongly supported both by the analysis of outcropping karst structures (in other regions of the Potiguar Basin) and by the interpretation of radargrams from the subsurface karst in Fazenda Belém. An adequate GPR processing flow was designed and tested, adapted from a usual flow for processing seismic data. The changes were introduced to take into account important differences between GPR and reflection seismic methods, in particular: poor coupling between source and ground, mixed phase of the wavelet, low signal-to-noise ratio, single-channel acquisition, and the strong influence of wave propagation effects, notably dispersion. High-frequency components of the GPR pulse suffer more pronounced attenuation than low-frequency components, resulting in resolution losses in the radargrams. In Fazenda Belém, a suitable GPR processing flow is even more necessary because of both the very high level of aerial events and the complexity of the imaged subsurface karst structures. The key point of the processing flow was an improvement in the correction of the attenuation effects on the GPR pulse, based on their influence on the amplitude and phase spectra of the GPR signals. In low- and moderate-loss dielectric media, the propagated signal suffers significant changes only in its amplitude spectrum; that is, the phase spectrum of the propagated signal remains practically unaltered for the usual travel-time ranges. Based on this fact, it is shown using real data that the judicious application of the well-known tools of time gain and spectral balancing can efficiently correct the attenuation effects. The proposed approach can be applied in heterogeneous media and does not require precise knowledge of the attenuation parameters of the media. As an additional benefit, the judicious application of spectral balancing promotes a partial deconvolution of the data without changing its phase; in other words, spectral balancing acts in a way similar to a zero-phase deconvolution (a sketch of this idea is given after this abstract). In GPR data, the resolution increase obtained with spectral balancing is greater than that obtained with spike and predictive deconvolutions. The evolution of the Jandaíra karst in the Potiguar Basin is associated with at least three events of subaerial exposure of the carbonate platform, during the Turonian, Santonian, and Campanian. In the Fazenda Belém region, during the mid-Miocene, the Jandaíra karst was covered by continental siliciclastic sediments. These sediments partially filled the void space associated with the dissolution structures and fractures. Therefore, the development of the karst in this region was attenuated in comparison with other places in the Potiguar Basin where the karst is exposed.
In Fazenda Belém, the generation of sinkholes and terrain collapses is controlled mainly by: (i) the presence of an unconsolidated sedimentary cap which is thick enough to cover the karst completely, but whose sediment volume is lower than the available space associated with the dissolution structures in the karst; (ii) the existence of important SW-NE and NW-SE structural alignments, which promote a localized increase in hydraulic connectivity, allowing the channeling of groundwater and thus facilitating carbonate dissolution; and (iii) the existence of a hydraulic barrier to the groundwater flow, associated with the Açu-4 Unit. The terrain collapse mechanisms in Fazenda Belém follow this temporal evolution: meteoric water infiltrates through the unconsolidated sedimentary cap and promotes its remobilization into the void space associated with the dissolution structures in the Jandaíra Formation. This remobilization starts at the base of the sedimentary cap, where the flow becomes more abrasive due to a change from laminar to turbulent regime as the groundwater flow reaches the open karst structures. The remobilized sediments progressively fill the void karst space from bottom to top, so the void space continuously migrates upwards, ultimately reaching the surface and causing the sudden terrain collapses observed. This phenomenon is particularly active during the rainy season, when the water table, which normally lies in the karst, may temporarily rise into the unconsolidated sedimentary cap.
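The sketch below illustrates the spectral balancing step described in the abstract above (whitening the amplitude spectrum of each trace while leaving the phase spectrum untouched, so it acts like a zero-phase deconvolution); it is not the thesis' processing flow, and the smoothing window and stabilization constant are illustrative assumptions.

```python
import numpy as np

def spectral_balance(trace, smooth=11, eps=1e-3):
    """Flatten the amplitude spectrum of a GPR trace while preserving its phase.
    The phase spectrum is untouched, so the operation is zero-phase."""
    spec = np.fft.rfft(trace)
    amp = np.abs(spec)
    # smooth the amplitude spectrum to estimate its slowly varying envelope
    kernel = np.ones(smooth) / smooth
    envelope = np.convolve(amp, kernel, mode="same")
    balanced = spec / (envelope + eps * envelope.max())   # amplitude whitened, phase kept
    return np.fft.irfft(balanced, n=trace.size)

if __name__ == "__main__":
    t = np.linspace(0, 1, 512, endpoint=False)
    # a toy trace: attenuated (band-limited looking) reflections plus noise
    trace = np.exp(-3 * t) * np.sin(2 * np.pi * 25 * t) + 0.05 * np.random.randn(t.size)
    out = spectral_balance(trace)
    print(out.shape)
```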
Abstract:
Diffusive processes are extremely common in Nature. Many complex systems, such as microbial colonies, colloidal aggregates, diffusion of fluids, and migration of populations, involve a large number of similar units that form fractal structures. A new model of diffusive aggregation was proposed recently by Filoche and Sapoval [68]. Based on their work, we develop a model called Diffusion with Aggregation and Spontaneous Reorganization. This model consists of a set of particles with excluded-volume interactions, which perform random walks on a square lattice. Initially, the lattice is occupied with a density p = N/L² of particles occupying distinct, randomly chosen positions. One of the particles is selected at random as the active particle. This particle executes a random walk until it visits a site occupied by another particle, j. When this happens, the active particle is rejected back to its previous position (neighboring particle j), and a new active particle is selected at random from the set of N particles. Following an initial transient, the system attains a stationary regime. In this work we study the stationary regime, focusing on scaling properties of the particle distribution, as characterized by the pair correlation function φ(r). The latter is calculated by averaging over a long sequence of configurations generated in the stationary regime, using systems of size 50, 75, 100, 150, . . . , 700. The pair correlation function exhibits distinct behaviors in three different density ranges, which we term subcritical, critical, and supercritical. We show that in the subcritical regime the particle distribution is characterized by a fractal dimension. We also analyze the decay of temporal correlations.
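Here is a minimal sketch of the lattice dynamics described in the abstract (excluded-volume random walkers on a square lattice, with the active particle rejected back and a new active particle drawn whenever it lands on an occupied site). The boundary conditions, parameters and names are illustrative assumptions, and no attempt is made to reproduce the paper's measurements of φ(r).

```python
import numpy as np

rng = np.random.default_rng(1)
MOVES = [(-1, 0), (1, 0), (0, -1), (0, 1)]

def simulate(L=50, density=0.2, steps=100_000):
    """Diffusion with aggregation and spontaneous reorganization (sketch).
    One active particle random-walks until it visits an occupied site; it is then
    put back on its previous site and a new active particle is chosen at random."""
    n = int(density * L * L)
    # place n particles on distinct, randomly chosen lattice sites
    flat = rng.choice(L * L, size=n, replace=False)
    pos = np.column_stack(np.unravel_index(flat, (L, L)))
    occupied = set(map(tuple, pos))
    active = rng.integers(n)
    for _ in range(steps):
        r, c = pos[active]
        dr, dc = MOVES[rng.integers(4)]
        new = ((r + dr) % L, (c + dc) % L)        # periodic boundaries (an assumption)
        if new in occupied:
            active = rng.integers(n)              # rejected: pick a new active particle
        else:
            occupied.remove((r, c))
            occupied.add(new)
            pos[active] = new
    return pos

if __name__ == "__main__":
    final = simulate(L=50, density=0.2, steps=50_000)
    print(final.shape)   # (500, 2): particle coordinates after the run
```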
Abstract:
We discuss two-dimensional Bose-Einstein condensates (BECs) under time-periodic variation of the scattering length. In particular, we argue that for high-frequency variation there exist stable self-confined condensates without an external trap, when the dc component of the scattering length is negative. Our results are based on a variational approximation, on direct averaging of the Gross-Pitaevskii equation, and on numerical simulations.
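For reference, a standard form of the trapless 2D Gross-Pitaevskii equation with a time-periodically modulated nonlinearity of the kind the abstract refers to (in dimensionless units; the specific notation is assumed, not taken from the paper).

```latex
% Dimensionless 2D Gross--Pitaevskii equation with no external trap and a
% periodically modulated nonlinearity g(t) proportional to the scattering length.
\[
  i\,\partial_t \psi(\mathbf{r},t)
    = -\tfrac{1}{2}\nabla^{2}\psi + g(t)\,|\psi|^{2}\psi,
  \qquad
  g(t) = g_{0} + g_{1}\cos(\omega t).
\]
% Self-confinement without a trap is argued for large omega when the dc part g_0
% corresponds to a negative (attractive) scattering length.
```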
Abstract:
We show that an anomaly-free description of matter in (1+1) dimensions requires a deformation of the 2D relativity principle, which introduces a non-trivial centre in the 2D Poincaré algebra. Then we work out the reduced phase space of the anomaly-free 2D relativistic particle, in order to show that it lives in a noncommutative 2D Minkowski space. Moreover, we build a Gaussian wave packet to show that a Planck length is well defined in two dimensions. In order to provide a gravitational interpretation for this noncommutativity, we propose to extend the usual 2D generalized dilaton gravity models by a specific Maxwell component, which gauges the extra symmetry associated with the centre of the 2D Poincaré algebra. In addition, we show that this extension is a high-energy correction to the unextended dilaton theories that can affect the topology of spacetime. Further, we couple a test particle to the general extended dilaton models with the purpose of showing that they predict a noncommutativity in curved spacetime, which is locally described by a Moyal star product in the low-energy limit. We also conjecture a probable generalization of this result, which provides strong evidence that the noncommutativity is described by a certain star product which is not of the Moyal type at high energies. Finally, we prove that the extended dilaton theories can be formulated as Poisson sigma models based on a nonlinear deformation of the extended Poincaré algebra.
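For context, the standard Moyal star product the abstract refers to is given below in its general form (the specific noncommutativity parameter θ^{μν} derived in the thesis is not reproduced here).

```latex
% The Moyal star product on functions of noncommutative coordinates:
\[
  (f \star g)(x)
    = f(x)\,
      \exp\!\left(\frac{i}{2}\,\theta^{\mu\nu}\,
        \overleftarrow{\partial}_{\mu}\,\overrightarrow{\partial}_{\nu}\right) g(x),
  \qquad
  [x^{\mu}, x^{\nu}]_{\star}
    = x^{\mu}\star x^{\nu} - x^{\nu}\star x^{\mu}
    = i\,\theta^{\mu\nu}.
\]
```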