922 results for 2D Convolutional Codes
Abstract:
We report on integer and fractional microwave-induced resistance oscillations in a 2D electron system with high density and moderate mobility, and present results of measurements at high microwave intensity and temperature. Fractional microwave-induced resistance oscillations occur up to fractional denominator 8 and are quenched independently of their fractional order. We discuss our results and compare them with existing theoretical models. (C) 2009 Elsevier B.V. All rights reserved.
Abstract:
Structured meaning-signal mappings, i.e., mappings that preserve neighborhood relationships by associating similar signals with similar meanings, are advantageous in an environment where signals are corrupted by noise and sub-optimal meaning inferences are rewarded as well. The evolution of these mappings, however, cannot be explained within a traditional language evolutionary game scenario in which individuals meet randomly, because the evolutionary dynamics becomes trapped in local maxima that do not reflect the structure of the meaning and signal spaces. Here we use a simple game-theoretical model to show analytically that when individuals adopting the same communication code meet more frequently than individuals using different codes, as a result of the spatial organization of the population, advantageous linguistic innovations can spread and take over the population. In addition, we report results of simulations in which an individual can communicate only with its K nearest neighbors, and show that the probability that the lineage of a mutant using a more efficient communication code becomes fixed decreases exponentially with increasing K. These findings support the mother-tongue hypothesis that human language evolved as a communication system used among kin, especially between mothers and offspring.
Abstract:
2D electrophoresis is a well-known method for protein separation which is extremely useful in the field of proteomics. Each spot in the image represents a protein accumulation, and the goal is to perform a differential analysis between pairs of images to study changes in protein content. It is thus necessary to register the two images by finding spot correspondences. Although it may seem a simple task, manual processing of this kind of image is generally very cumbersome, especially when strong variations between corresponding sets of spots are expected (e.g. strong non-linear deformations and outliers). To solve this problem, this paper proposes a new quadratic assignment formulation, together with a correspondence estimation algorithm based on graph matching, which takes into account the structural information between the detected spots. Each image is represented by a graph and the task is to find a maximum common subgraph. Successful experimental results using real data are presented, including an extensive comparative performance evaluation with ground-truth data. (C) 2010 Elsevier B.V. All rights reserved.
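The structural-matching idea behind a quadratic assignment formulation can be illustrated with a toy sketch (not the authors' algorithm): represent each spot set by its pairwise-distance matrix and search for the node assignment that best preserves those distances. The brute-force search below is only feasible for a handful of spots, and `spots_a`, `spots_b`, and the scoring function are illustrative assumptions.

```python
import itertools
import numpy as np

def distance_matrix(points):
    """Pairwise Euclidean distances between 2D points (the 'graph edges')."""
    diff = points[:, None, :] - points[None, :, :]
    return np.linalg.norm(diff, axis=-1)

def match_spots(spots_a, spots_b):
    """Brute-force quadratic assignment: find the permutation p minimizing
    sum_ij |D_a[i,j] - D_b[p(i),p(j)]|, i.e. the structure-preserving match."""
    d_a, d_b = distance_matrix(spots_a), distance_matrix(spots_b)
    n = len(spots_a)
    best_cost, best_perm = np.inf, None
    for perm in itertools.permutations(range(n)):
        p = list(perm)
        cost = np.abs(d_a - d_b[np.ix_(p, p)]).sum()
        if cost < best_cost:
            best_cost, best_perm = cost, p
    return best_perm

# Toy data: spots_b is a shuffled, rotated, translated copy of spots_a,
# so pairwise distances are preserved and the true correspondence is recoverable.
rng = np.random.default_rng(42)
spots_a = rng.uniform(0, 10, size=(6, 2))
shuffle = np.array([2, 0, 3, 5, 1, 4])
theta = 0.7
rot = np.array([[np.cos(theta), -np.sin(theta)], [np.sin(theta), np.cos(theta)]])
spots_b = spots_a[shuffle] @ rot.T + np.array([3.0, -1.0])
recovered = match_spots(spots_a, spots_b)
true_match = np.argsort(shuffle).tolist()  # where each spot of image A sits in B
```

For realistic spot counts the factorial search would be replaced by an approximate QAP solver (e.g. `scipy.optimize.quadratic_assignment`), and a robust cost would be needed to handle outlier spots.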
Abstract:
This work describes a novel methodology for automatic contour extraction from 2D images of 3D neurons (e.g. camera lucida images and other types of 2D microscopy). Most contour-based shape analysis methods cannot be used to characterize such cells because of overlaps between neuronal processes. The proposed framework is specifically aimed at the problem of contour following even in the presence of multiple overlaps. First, the input image is preprocessed in order to obtain an 8-connected skeleton with one-pixel-wide branches, as well as a set of critical regions (i.e., bifurcations and crossings). Next, for each subtree, the tracking stage iteratively labels all valid pixels of a branch, from its tip up to a critical region, where it determines the suitable direction to proceed. Finally, the labeled skeleton segments are followed in order to yield the parametric contour of the neuronal shape under analysis. The reported system was successfully tested on several images; results from a set of three neuron images are presented here, each pertaining to a different class (alpha, delta, and epsilon ganglion cells) and containing a total of 34 crossings, all of which were resolved by the algorithms. The method has also been found to be robust even for images with close parallel segments, and it may be implemented in an efficient manner. The introduction of this approach should pave the way for more systematic application of contour-based shape analysis methods in neuronal morphology. (C) 2008 Elsevier B.V. All rights reserved.
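The core tracking step, walking along a one-pixel-wide 8-connected skeleton branch from a tip until a critical region is reached, can be sketched as follows; the toy T-shaped skeleton and the stopping rule (more than one unvisited neighbour) are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

NBRS = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]

def follow_branch(skel, tip):
    """Follow a one-pixel-wide 8-connected skeleton branch from a tip pixel,
    collecting pixels until a critical region (bifurcation/crossing) or a
    dead end is reached."""
    path, current, visited = [tip], tip, {tip}
    while True:
        y, x = current
        nxt = [(y + dy, x + dx) for dy, dx in NBRS
               if 0 <= y + dy < skel.shape[0] and 0 <= x + dx < skel.shape[1]
               and skel[y + dy, x + dx] and (y + dy, x + dx) not in visited]
        if len(nxt) != 1:      # 0: dead end; >1: critical region reached
            return path
        current = nxt[0]
        visited.add(current)
        path.append(current)

# Toy skeleton: a T shape -- a vertical branch meeting a horizontal one,
# so the walk from the top tip must stop just before the junction row.
skel = np.zeros((5, 5), bool)
skel[0:5, 2] = True    # vertical branch
skel[4, 0:5] = True    # horizontal branch
branch = follow_branch(skel, (0, 2))
```

In the full method this walk is repeated per subtree, and the direction chosen inside each critical region disambiguates the overlapping processes.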
Abstract:
A novel Schiff base-copper(II) complex [Cu(2)L(2)(N(3))(2)](ClO(4))(2) 1, where L = (4-imidazolyl)ethylene-2-amino-1-ethylpyridine (apyhist), containing azide bridges between adjacent copper ions in a dinuclear arrangement, was isolated and characterized both in the solid state and in solution by X-ray crystallography and different spectroscopic techniques. Azide binding constants were estimated from titrations of the precursor [CuL(H(2)O)(2)](2+) solutions with sodium azide, giving rise to the azido-bridged species [Cu(2)L(2)(N(3))(2)](2+). Raman spectra showed an asymmetric stretching band at 2060 cm(-1), indicating the presence of azido ligands with a symmetric mu(1,1) binding geometry. EPR spectra, in frozen methanol/water solutions at 77 K, exhibited features characteristic of copper centers in tetragonal pyramidal coordination geometry, with magnetic interactions between them. Further, in the solid state, two different values for the magnetic coupling in this species were obtained: J/k = -(5.14 +/- 0.02) cm(-1), attributed to the mu(1,1) azide-bridge mode, and zJ'/k = -(2.94 +/- 0.11) cm(-1) for the interaction between dinuclear moieties via water/perchlorate bridges. Finally, an attempt was made to correlate the structural and magnetic data for this dinuclear asymmetric end-on azido-bridged copper(II) complex 1 with those of a related dinuclear system, complex [Cu(2)L(2)Cl(2)](ClO(4))(2) 2, containing the same tridentate diimine ligand but with chloro-bridged groups between the copper centres.
2D QSAR and similarity studies on cruzain inhibitors aimed at improving selectivity over cathepsin L
Abstract:
Hologram quantitative structure-activity relationships (HQSAR) were applied to a data set of 41 cruzain inhibitors. The best HQSAR model (Q(2) = 0.77; R(2) = 0.90), employing Surflex-Sim as the training and test set generator, was obtained using atoms, bonds, and connections as fragment distinctions and 4-7 as the fragment size. This model was then used to predict the potencies of 12 test set compounds, giving a satisfactory predictive R(2) value of 0.88. The contribution maps obtained from the best HQSAR model are in agreement with the biological activities of the studied compounds. Trypanosoma cruzi cruzain shares high similarity with the mammalian homolog cathepsin L. Selectivity toward cruzain was checked with a database of 123 compounds, corresponding to the 41 cruzain inhibitors used in the HQSAR model development plus 82 cathepsin L inhibitors. We screened these compounds with ROCS (Rapid Overlay of Chemical Structures), a Gaussian-shape volume overlap filter that can rapidly identify shapes that match a query molecule. Remarkably, ROCS ranked the first 37 hits as being only cruzain inhibitors. In addition, the area under the curve (AUC) obtained with ROCS was 0.96, indicating that the method was very efficient in distinguishing between cruzain and cathepsin L inhibitors. (c) 2007 Elsevier Ltd. All rights reserved.
Abstract:
Cytochrome P450 (CYP450) is a class of enzymes for which substrate identification is particularly important: it would help medicinal chemists design drugs with fewer side effects due to drug-drug interactions and to extensive genetic polymorphism. Herein, we discuss the application of 2D and 3D similarity searches to identify reference structures with a higher capacity to retrieve substrates of three important CYP enzymes (CYP2C9, CYP2D6, and CYP3A4). On the basis of the complementarity of multiple reference structures selected by different similarity search methods, we propose the fusion of their individual Tanimoto scores into a consensus Tanimoto score (T(consensus)). Using this new score, true positive rates of 63% (CYP2C9) and 81% (CYP2D6) were achieved with false positive rates of 4% for the CYP2C9-CYP2D6 data set. Extended similarity searches were carried out on a validation data set, and the results showed that with the T(consensus) score not only did the area under the ROC curve increase, but more substrates were also recovered at the beginning of a ranked list.
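The score fusion can be sketched as follows. The exact fusion rule is not given in this abstract, so the arithmetic mean over references is an assumption, and the short bit-string fingerprints are toy data rather than real molecular fingerprints.

```python
import numpy as np

def tanimoto(fp_a, fp_b):
    """Tanimoto coefficient between two binary fingerprints:
    shared on-bits divided by total distinct on-bits."""
    fp_a, fp_b = np.asarray(fp_a, bool), np.asarray(fp_b, bool)
    union = np.logical_or(fp_a, fp_b).sum()
    return np.logical_and(fp_a, fp_b).sum() / union if union else 0.0

def consensus_tanimoto(candidate, references):
    """Fuse the candidate's Tanimoto scores against several reference
    structures into one consensus score (mean fusion, an assumption)."""
    return float(np.mean([tanimoto(candidate, r) for r in references]))

# Toy 12-bit fingerprints: two reference structures and two candidates.
ref1 = [1, 1, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0]
ref2 = [1, 0, 1, 0, 1, 0, 0, 1, 0, 1, 0, 0]
close = [1, 1, 0, 0, 1, 0, 1, 0, 0, 1, 0, 1]   # shares bits with both references
far   = [0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 1, 1]   # mostly disjoint bits
score_close = consensus_tanimoto(close, [ref1, ref2])
score_far = consensus_tanimoto(far, [ref1, ref2])
```

Ranking a screening library by the consensus score then pushes compounds resembling any of the complementary references toward the top of the list.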
Abstract:
An individual's prenatal testosterone levels can be measured by computing a ratio between the lengths of the index and ring fingers: a 2D:4D digit ratio index. Studies have shown an association between higher testosterone levels and good mathematical ability. In the present study, 40 Swedish upper-secondary students participated. Two hypotheses were examined: that there is an association between 2D:4D and mathematics results, and that there is an association between sex, 2D:4D, and mathematics results. Data collection consisted of finger measurements and test results in mathematics. The correlation results showed no significant association between finger length and mathematical results. Nor did the study show any interaction effect between sex, finger length, and mathematics results. The results are discussed from sociocultural, biological, and environmental perspectives.
Abstract:
Wood dust remains a problem in the woodworking industry, and young workers are especially vulnerable to safety risks. To reduce risks, it is important to change attitudes and increase knowledge about safety; safety training has been shown to establish positive attitudes towards safety among employees. The aim of the current study is to analyze the effect of QR codes that link to Picture Mix EXposure (PIMEX) videos, by analyzing student responses regarding this safety training method and safety. Safety training videos were used in upper-secondary-school handicraft programs to demonstrate wood dust risks and methods to decrease exposure to wood dust. A preliminary study was conducted in two schools to investigate improvements to safety training, in preparation for the main study, which investigated the safety training method in three schools. In the preliminary study, the PIMEX method was first used: students were filmed while their wood dust exposure was measured and displayed on a computer screen in real time. Before and after the filming, teachers, students, and researchers together analyzed wood dust risks and effective measures to reduce exposure. For the main study, QR codes linking to PIMEX videos were attached to wood-processing machines. Subsequent interviews showed that this safety training method enables students, at an early stage of their lives, to learn about risks and about safety measures to control wood dust exposure. The new combination of methods can create awareness and change attitudes and motivation among students to work more consistently to reduce wood dust.
Abstract:
While the simulation of flood risks originating from the overtopping of river banks is well covered by continuously evaluated programs to improve flood protection measures, flash flooding is not. Flash floods are triggered by short, local thunderstorm cells with high precipitation intensities. Small catchments have short response times and flow paths, and convective thunder cells may cause flooding of endangered settlements. Assessing local flooding and flood pathways requires a detailed hydraulic simulation of the surface runoff. Hydrological models usually do not incorporate surface runoff at this level of detail; instead, empirical equations are applied for runoff detention. Conversely, 2D hydrodynamic models usually neither accept distributed rainfall as input nor implement any type of soil/surface interaction, as hydrological models do. Several cases of local flash flooding in recent years have raised this issue both for practical reasons and as a research topic: closing the model gap between distributed rainfall and distributed runoff formation. Therefore, a 2D hydrodynamic model solving the depth-averaged flow equations with a finite-volume discretization was extended to accept direct rainfall, enabling it to simulate the associated runoff formation. The model itself is used as the numerical engine; rainfall is introduced by modifying water levels at fixed time intervals. The paper not only deals with the general application of the software but also tests the numerical stability and reliability of the simulation results. The tests are performed using different artificial as well as measured rainfall series as input. Key parameters of the simulation, such as losses, roughness, and the time interval for water-level manipulation, are tested regarding their impact on stability.
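The way rainfall enters such a model, raising the computed water levels at fixed intervals, can be sketched in a few lines; the grid, rain and loss rates, and the simple uniform-loss model below are illustrative assumptions, not the paper's code.

```python
import numpy as np

def apply_rainfall(depth, rain_rate, loss_rate, dt):
    """Add one rainfall interval to a 2D water-depth grid [m].
    rain_rate / loss_rate are in m/s; losses cannot drive depth below zero."""
    return np.maximum(depth + (rain_rate - loss_rate) * dt, 0.0)

# 5 x 5 catchment cells, initially dry; 20 mm/h rain, 5 mm/h losses,
# water levels modified at fixed dt = 60 s intervals.
depth = np.zeros((5, 5))
rain, loss, dt = 20e-3 / 3600.0, 5e-3 / 3600.0, 60.0
for _ in range(10):                 # ten intervals = 10 minutes
    depth = apply_rainfall(depth, rain, loss, dt)
total_depth_mm = depth.mean() * 1000.0   # effective rainfall accumulated so far
```

In the actual model, the depth-averaged flow solver runs between these water-level manipulations and redistributes the added water along the terrain's flow paths, which is where the stability questions studied in the paper arise.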
Abstract:
This work presents a new approach to the treatment of spiral structures in oscillatory continuous media near supercritical Hopf bifurcations. Such structures are usually described by the Complex Ginzburg-Landau Equation, which uses a complex field associated with these oscillations. The proposed approach reduces the dynamics of spirals to the interaction between their centers. We first compare the two descriptions numerically and, with the computational gains afforded by the reduced approach, characterize in detail the spatio-temporal structures formed in these systems: instead of the frozen states previously reported in the literature, an intermittent spatio-temporal dynamics was found. This regime occurs in two distinct phases, a Vortex Liquid and a Vortex Glass; the latter evolves on ultra-slow time scales, with phenomena similar to those found in Statistical Mechanics, despite its purely deterministic origin.
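For reference, the full-field description that the reduced spiral-center approach replaces can be integrated directly. The sketch below advances the Complex Ginzburg-Landau equation dA/dt = A + (1 + i*b) lap(A) - (1 + i*c) |A|^2 A with an explicit Euler step on a periodic grid; the grid size, time step, and parameters b and c are illustrative assumptions.

```python
import numpy as np

def laplacian(field, dx):
    """Five-point periodic Laplacian of a complex 2D field."""
    return (np.roll(field, 1, 0) + np.roll(field, -1, 0) +
            np.roll(field, 1, 1) + np.roll(field, -1, 1) - 4.0 * field) / dx**2

def cgle_step(A, b, c, dt, dx):
    """One explicit Euler step of dA/dt = A + (1+ib) lap(A) - (1+ic)|A|^2 A."""
    return A + dt * (A + (1 + 1j * b) * laplacian(A, dx)
                     - (1 + 1j * c) * np.abs(A)**2 * A)

# Small random initial field; the cubic term saturates the amplitude near 1,
# after which defects (spiral centers) organize the dynamics.
rng = np.random.default_rng(1)
A = 0.1 * (rng.standard_normal((64, 64)) + 1j * rng.standard_normal((64, 64)))
b, c, dt, dx = 0.0, 1.5, 0.05, 1.0
for _ in range(400):
    A = cgle_step(A, b, c, dt, dx)
```

The computational cost of resolving the whole field this way is what motivates a reduced description in terms of the spiral centers alone.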
Abstract:
The visualization of volumetric data sets is common in several application areas, and the various aspects involved in these techniques have been under investigation for some years. However, despite the advances in volume visualization techniques, interaction with large data sets still poses challenges due to issues of perception (or isolation) of internal structures and of computational performance. Graphics-hardware support for texture-based visualization allows the development of efficient rendering techniques that can be combined with interactive clipping tools to enable the inspection of three-dimensional data sets. Many studies address the performance optimization of clipping tools, but very few deal with the interaction metaphors these tools use. The goal of this work is to develop interactive, intuitive, and easy-to-use tools for clipping volumetric images. First, a study of the main direct volume visualization techniques is presented, covering how such volumes are explored through volumetric clipping; from this study, the solution best suited to the present work is identified to guarantee the required interactivity. Next, several existing interaction techniques, their metaphors, and their taxonomies are reviewed to determine which interaction techniques are easiest to use with clipping tools. On this basis, this work presents the development of three generic clipping tools implemented with two distinct interaction metaphors frequently used by users of 3D applications: the virtual pointer and the virtual hand. The interactive rate of these tools is achieved through special fragment programs executed directly on the graphics hardware.
These programs specify regions inside the volume to be discarded during rendering, based on geometric predicates. First, the performance, precision, and user preference of the volumetric clipping tools are evaluated to compare the interaction metaphors employed. Then, interaction is evaluated using different input devices to manipulate the volume and the tools; the use of both hands at the same time for this manipulation is also tested. The results of these evaluation experiments are presented and discussed.
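The geometric predicate that such a fragment program evaluates per fragment can be illustrated with a CPU stand-in; the spherical clipping region and NumPy formulation below are illustrative assumptions, not the GPU code of the tools described.

```python
import numpy as np

def clip_sphere_mask(shape, center, radius):
    """Geometric predicate mirroring what a fragment program would evaluate
    per fragment: True for voxels to KEEP (outside the clipping sphere),
    False for voxels to discard during rendering."""
    zz, yy, xx = np.indices(shape)
    dist2 = ((zz - center[0])**2 + (yy - center[1])**2 + (xx - center[2])**2)
    return dist2 > radius**2

volume = np.ones((16, 16, 16))           # toy scalar volume
keep = clip_sphere_mask(volume.shape, center=(8, 8, 8), radius=4)
clipped = np.where(keep, volume, 0.0)    # discarded voxels contribute nothing
```

On the GPU the same predicate runs once per fragment during texture-based rendering, so moving the clipping region with the virtual pointer or virtual hand stays interactive.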
Abstract:
There have been 47 recessions in the United States of America (US) since 1790. US recessions have increasingly affected the economies of other countries as nations become more and more interdependent. The worst economic recession so far was the "Great Depression", caused by the 1929 crash of the US stock market. The 2008 economic recession in the US resulted from the burst of the "housing bubble" created by predatory lending. That recession led to increased unemployment (according to the NBER, 8.7 million jobs were lost from Feb. 2008 to Feb. 2010), a 5.1% decrease in GDP, and an increase in the poverty level from 12.1% (2007) to 16.0% (2008) (NBER). This dissertation researches the impact of the 2008 economic recession on different types of residential investments through a case study of five (5) diverse neighborhoods/zip codes in Washington, DC, USA. The main finding was that the effect of the 2008 recession on the different types of residential properties depended on the location of the property and on the demographic/socio-economic factors associated with that location.
Abstract:
In hydrocarbon exploration, the great enigma is the location of the deposits. Great efforts are undertaken to better identify and locate them while improving the cost-effectiveness of oil extraction. Seismic methods are the most widely used because they are indirect, i.e., they probe the subsurface layers without invading them. A seismogram is a representation of the Earth's interior and its structures through a conveniently arranged display of the data obtained by seismic reflection. A major problem in this representation is the intensity and variety of noise present in the seismogram, such as surface-related noise that contaminates the relevant signals and may mask the desired information carried by waves scattered in deeper regions of the geological layers. A tool was developed to suppress these noises, based on 1D and 2D wavelet transforms. The Java-language program separates seismic images according to direction (horizontal, vertical, mixed, or local) and to the wavelength bands that form these images, using Daubechies wavelets, auto-resolution, and tensor products of wavelet bases. In addition, an option was developed to process a single image using either the tensor product of two one-dimensional wavelets or the tensor product of a one-dimensional wavelet with the identity; in the latter case, the two-dimensional signal is decomposed along a single direction. This decomposition makes it possible to stretch the two-dimensional wavelets along a given direction, correcting scale effects by applying auto-resolutions. In other words, the treatment of a seismic image is improved by using 1D and 2D wavelets at different auto-resolution stages. Improvements were also implemented in the display of the images associated with the decompositions at each auto-resolution level, facilitating the choice of the images containing the signals of interest for noise-free image reconstruction.
The program was tested with real data and the results were satisfactory.
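The tensor-product construction mentioned above, applying a 1D Daubechies filter bank along rows and then along columns, can be sketched in plain NumPy. This is a generic one-level db2 decomposition under periodic boundary conditions, shown for illustration; it is not the Java tool described in the abstract.

```python
import numpy as np

SQ3 = np.sqrt(3.0)
LO = np.array([1 + SQ3, 3 + SQ3, 3 - SQ3, 1 - SQ3]) / (4 * np.sqrt(2))  # db2 low-pass
HI = np.array([LO[3], -LO[2], LO[1], -LO[0]])                            # quadrature mirror

def dwt1(x):
    """One-level periodic 1D DWT along the last axis (even length required):
    returns (approximation, detail) coefficients, each half the length."""
    n = x.shape[-1]
    idx = (2 * np.arange(n // 2)[:, None] + np.arange(4)[None, :]) % n
    windows = x[..., idx]                 # shape (..., n/2, 4)
    return windows @ LO, windows @ HI

def dwt2(image):
    """Tensor-product 2D DWT: filter rows, then columns, giving the four
    direction-selective subbands LL, LH, HL, HH."""
    lo_r, hi_r = dwt1(image)              # along rows
    ll, lh = dwt1(lo_r.T)
    hl, hh = dwt1(hi_r.T)
    return ll.T, lh.T, hl.T, hh.T

rng = np.random.default_rng(7)
img = rng.standard_normal((16, 16))
subbands = dwt2(img)
energy_in = (img ** 2).sum()
energy_out = sum((s ** 2).sum() for s in subbands)  # equal: the filters are orthonormal
```

Replacing one of the two 1D stages with the identity yields the single-direction decomposition described in the abstract, which stretches the analysis along one axis; zeroing selected subbands before inverting implements the directional noise suppression.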
Abstract:
The processing of materials through plasma has grown considerably in recent years in several technological applications, more specifically in surface treatment. That growth is mainly due to the great applicability of plasmas as an energy source, in which thermal, chemical, and/or physical behavior can be exploited. On the other hand, the multiplicity of simultaneous physical effects (thermal, chemical, and physical interactions) present in plasmas increases the complexity of understanding their interaction with solids. As an initial step in developing this subject, the present work addresses the computational simulation of the heating and cooling of steel and copper samples immersed in a plasma atmosphere, considering two experimental geometric configurations: hollow and plane cathode. To reach this goal, three computational models were developed in Fortran 90: a one-dimensional transient model (1D, t), a two-dimensional transient model (2D, t), and a two-dimensional transient model (2D, t) that takes into account the presence of a sample holder in the experimental assembly. The models were developed based on the finite volume method and, for the two-dimensional configurations, the effect of the hollow cathode on the sample was treated as a lateral external heat source. The main results obtained with the three computational models, such as temperature distributions and thermal gradients in the samples and in the holder, were compared with those obtained by the Laboratory of Plasma, LabPlasma/UFRN, and with experiments available in the literature. The observed behavior indicates the validity of the developed codes and illustrates the usefulness of such computational tools in this type of process, given the ease with which thermal information of interest can be obtained.
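A minimal analogue of the (1D, t) model can be sketched as an explicit finite-volume scheme: a slab heated on one face by a constant plasma heat flux and insulated on the other. The material properties, flux value, and geometry below are illustrative assumptions (roughly steel-like), not LabPlasma/UFRN parameters, and the Fortran 90 originals are here replaced by Python for the sketch.

```python
import numpy as np

def simulate_heating(q_flux=5e4, n=50, length=0.01, alpha=1.2e-5, k=50.0,
                     t_end=5.0, t_init=300.0):
    """Explicit finite-volume solution of 1D transient heat conduction.
    Left face receives a constant heat flux q_flux [W/m^2] (the plasma);
    right face is insulated. Returns the temperature profile [K]."""
    dx = length / n
    dt = 0.4 * dx**2 / alpha        # below the explicit stability limit of 0.5
    rho_cp = k / alpha              # volumetric heat capacity [J/(m^3 K)]
    T = np.full(n, t_init)
    for _ in range(int(t_end / dt)):
        # face fluxes in the +x direction: plasma in, conduction inside, insulated out
        interior = -k * np.diff(T) / dx
        flux = np.concatenate(([q_flux], interior, [0.0]))
        # energy balance per finite volume: dT = (flux_in - flux_out) * dt / (rho_cp dx)
        T = T - dt / (rho_cp * dx) * np.diff(flux)
    return T

T = simulate_heating()   # temperature decreases monotonically from the heated face
```

The two-dimensional models follow the same balance cell by cell, with the hollow-cathode effect entering as an additional lateral flux term of the same form.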