958 results for Dunkl Transform
Abstract:
We develop a new model of Absorptive Capacity that takes into account two variables, namely learning and knowledge, to explain how companies transform information into knowledge.
Abstract:
Novel alternating copolymers comprising biscalix[4]arene-p-phenylene ethynylene and m-phenylene ethynylene units (CALIX-m-PPE) were synthesized by Sonogashira-Hagihara cross-coupling polymerization. Good isolated yields (60-80%) were achieved for the polymers, which show Mn ranging from 1.4 x 10^4 to 5.1 x 10^4 g mol^-1 (gel permeation chromatography analysis), depending on the specific polymerization conditions. The structural analysis of CALIX-m-PPE was performed by 1H, 13C, 13C-1H heteronuclear single quantum correlation (HSQC), 13C-1H heteronuclear multiple bond correlation (HMBC), correlation spectroscopy (COSY) and nuclear Overhauser effect spectroscopy (NOESY), in addition to Fourier transform infrared spectroscopy and microanalysis, allowing its full characterization. Depending on the reaction setup, variable amounts (16-45%) of diyne units were found in the polymers, although their photophysical properties are essentially the same. It is demonstrated that CALIX-m-PPE does not form ground- or excited-state interchain interactions, owing to the highly crowded environment of the main chain imparted by the calix[4]arene side units, which behave as insulators inhibiting main-chain pi-pi stacking. It was also found that the luminescent properties of CALIX-m-PPE are markedly different from those of an all-p-linked phenylene ethynylene copolymer (CALIX-p-PPE) previously reported. The unexpected appearance of a low-energy emission band at 426 nm, in addition to the locally excited-state emission (365 nm), together with a quite low fluorescence quantum yield (Phi = 0.02) and double-exponential decay dynamics, led to the formulation of an intramolecular exciplex as the new emissive species.
Abstract:
The motivation for this work comes from the author's need to record the notes played on the guitar during improvisation. When improvising on the guitar, the musician often does not remember the notes just played. This work concerns the development of an application for guitarists that records the notes played on an electric or classical guitar. The signal is acquired from the guitar and processed under real-time requirements for signal capture. The notes produced by the electric guitar connected to the computer are represented as tablature and/or a musical score. To this end, the application captures the signal coming from the electric guitar through the computer's sound card and uses frequency-detection algorithms and note-duration estimation algorithms to build the record of the notes played. The application is developed from a multi-platform perspective and can run on different operating systems, Windows and Linux, using public-domain tools and libraries. The results obtained show that it is possible to tune the guitar with errors on the order of 2 Hz relative to the standard tuning frequencies. The tablature output gives satisfactory results, which can nevertheless be improved; this will require better implementations of the signal processing techniques, as well as of the inter-process communication, to solve the problems found in the tests performed.
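As a concrete illustration of the frequency-detection step described in this abstract, the sketch below estimates the fundamental frequency of a mono guitar frame by autocorrelation. It is a minimal example, not the application's actual algorithm; the function name, frame handling and default sample rate are illustrative.

```python
import numpy as np

def detect_pitch(frame: np.ndarray, sample_rate: int = 44100) -> float:
    """Estimate the fundamental frequency of a mono guitar frame (Hz)."""
    frame = frame - np.mean(frame)                  # remove DC offset
    corr = np.correlate(frame, frame, mode="full")  # autocorrelation
    corr = corr[len(corr) // 2:]                    # keep non-negative lags
    # Skip the lag-0 peak: find the first rising slope, then the strongest
    # peak after it; that lag is the period in samples.
    rising = np.diff(corr) > 0
    start = int(np.argmax(rising))
    lag = start + int(np.argmax(corr[start:]))
    return sample_rate / lag if lag > 0 else 0.0
```

On a clean synthetic tone this recovers the pitch to within roughly 1-2 Hz (the lag is quantized to whole samples), consistent with the tuning error on the order of 2 Hz reported above; parabolic interpolation around the peak would refine it further.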
Abstract:
This work consists of the development of a Criminology Support System (SAC, from the Portuguese Sistema de Apoio à Criminologia), intended to help detectives/analysts in the proactive prevention of crime and in the management of their material and human resources, as well as to foster studies on the high incidence of certain types of crime in a given region. Historically, solving crimes has been a prerogative of criminal justice and its experts, and with the increasing use of computer systems in the judicial system to record all data concerning crime occurrences, data on suspects and victims, individuals' criminal records and other data flowing within the organization, there is a growing need to turn these data into useful information for fighting crime. SAC takes advantage of knowledge extraction techniques and applies them to a set of crime-occurrence data for a given region and time span, as well as to a set of variables that influence crime, which were studied and identified in this work. This work comprises a knowledge extraction model and an application that lets the user supply a suitable data set, ensuring the maximum effectiveness of the model.
Abstract:
We present a study of the magnetic properties of a group of basalt samples from the Saldanha Massif (Mid-Atlantic Ridge, MAR; 36° 33' 54" N, 33° 26' W), and we set out to interpret these properties within the tectono-magmatic framework of this sector of the MAR. Most samples have low magnetic anisotropy and magnetic minerals of single-domain grain size, typical of rapid cooling. The thermomagnetic study mostly shows two different susceptibility peaks. The high-temperature peak is related to mineralogical alteration due to heating. The low-temperature peak distinguishes three different stages of low-temperature oxidation: the presence of titanomagnetite, of titanomagnetite and titanomaghemite, and exclusively of titanomaghemite. Based on established empirical relationships between Curie temperature and degree of oxidation, the latter is tentatively deduced for all samples. Finally, swath bathymetry and sidescan sonar data combined with dive observations show that the Saldanha Massif is located over an exposed section of upper mantle rocks interpreted to be the result of detachment tectonics. Basalt samples inside the detachment zone often have higher-than-expected oxidation rates; this effect can be explained by the higher permeability caused by the detachment fault activity.
Abstract:
Recently, several distributed video coding (DVC) solutions based on the distributed source coding (DSC) paradigm have appeared in the literature. Wyner-Ziv (WZ) video coding, a particular case of DVC where side information is made available at the decoder, enables a flexible distribution of the computational complexity between the encoder and decoder, promising to fulfill novel requirements from applications such as video surveillance, sensor networks and mobile camera phones. The quality of the side information at the decoder plays a critical role in determining the WZ video coding rate-distortion (RD) performance, notably in raising it to a level as close as possible to the RD performance of standard predictive video coding schemes. Towards this target, efficient motion search algorithms for powerful frame interpolation are much needed at the decoder. In this paper, the RD performance of a Wyner-Ziv video codec is improved by using novel, advanced motion-compensated frame interpolation techniques to generate the side information. The development of this type of side information estimator is a difficult problem in WZ video coding, especially because the decoder only has some decoded reference frames available. Based on the regularization of the motion field, novel side information creation techniques are proposed in this paper, along with a new frame interpolation framework able to generate higher quality side information at the decoder. To illustrate the RD performance improvements, this novel side information creation framework has been integrated into a transform-domain turbo-coding-based Wyner-Ziv video codec. Experimental results show that the novel side information creation solution leads to better RD performance than available state-of-the-art side information estimators, with improvements up to 2 dB; moreover, it outperforms H.264/AVC Intra by up to 3 dB with a lower encoding complexity.
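The sketch below illustrates the general idea of motion-compensated frame interpolation for side information creation: block matching between two decoded key frames, followed by halving of the motion vectors and bidirectional averaging. It is a simplified illustration assuming grayscale NumPy frames, not the regularized motion-field estimator proposed in the paper; block size and search range are illustrative.

```python
import numpy as np

def interpolate_frame(prev: np.ndarray, nxt: np.ndarray,
                      block: int = 8, search: int = 4) -> np.ndarray:
    """Create a side-information frame between two decoded key frames by
    block matching; frame dimensions are assumed multiples of `block`."""
    h, w = prev.shape
    out = np.zeros_like(prev, dtype=np.float64)
    for y in range(0, h, block):
        for x in range(0, w, block):
            ref = nxt[y:y + block, x:x + block].astype(np.float64)
            best_dy = best_dx = 0
            best_cost = np.inf
            # Full search in the previous frame, sum-of-absolute-differences cost.
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy <= h - block and 0 <= xx <= w - block:
                        cand = prev[yy:yy + block, xx:xx + block].astype(np.float64)
                        cost = np.abs(cand - ref).sum()
                        if cost < best_cost:
                            best_cost, best_dy, best_dx = cost, dy, dx
            # Halve the motion vector (the interpolated frame sits midway)
            # and average the two references bidirectionally.
            hy, hx = y + best_dy // 2, x + best_dx // 2
            mid = prev[hy:hy + block, hx:hx + block].astype(np.float64)
            out[y:y + block, x:x + block] = 0.5 * (mid + ref)
    return out.astype(prev.dtype)
```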
Abstract:
The interaction of a variety of substrates with Pseudomonas aeruginosa native amidase (E.C. 3.5.1.4), overproduced in an Escherichia coli strain, was investigated using difference FTIR spectroscopy. The amides used as substrates showed an increase in hydrogen bonding upon association in multimers, which was not seen with esters. Evidence for an overall reduction or weakening of hydrogen bonding while amide and ester substrates are interacting with the enzyme is presented. The results describe a spectroscopic approach for analysis of substrate-amidase interaction and in situ monitoring of the hydrolysis and transferase reaction when amides or esters are used as substrates.
Abstract:
The surface morphology, structure and composition of human dentin treated with a femtosecond infrared laser (pulse duration 500 fs, wavelength 1030 nm, fluences ranging from 1 to 3 J cm^-2) were studied by scanning electron microscopy, x-ray diffraction, x-ray photoelectron spectroscopy and Fourier transform infrared spectroscopy. The average dentin ablation threshold under these conditions was 0.6 +/- 0.2 J cm^-2, and the ablation rate reached 1 to 2 μm/pulse for an average fluence of 3 J cm^-2. The ablated surfaces present an irregular and rugged appearance, with no significant traces of melting, deformation, cracking or carbonization. The smear layer was entirely removed by the laser treatment. For fluences only slightly higher than the ablation threshold, the morphology of the laser-treated surfaces was very similar to that of dentin fracture surfaces, and the dentinal tubules remained open. For higher fluences, the surface was more porous and the dentin structure was partially concealed by ablation debris and a few resolidified droplets. Independently of the laser processing parameters and laser processing method used, no sub-superficial cracking was observed. The dentin constitution and chemical composition were not significantly modified by the laser treatment in the processing parameter range used. In particular, the organic matter is not preferentially removed from the surface, and no traces of high-temperature phosphates, such as beta-tricalcium phosphate, were observed. These results are compatible with an electrostatic ablation mechanism. In conclusion, the high beam quality and short pulse duration of the ultrafast laser used should allow the accurate preparation of cavities with negligible damage to the underlying material.
Abstract:
This work consists of the hardware implementation of dedicated, optimized functional units for the encoding and decoding operations defined in the lossy coding standard Joint Photographic Experts Group (JPEG), ITU-T T.81 / ISO/IEC 10918-1. The standard is studied in order to characterize its main functional blocks. The purpose of this study is to investigate and propose optimizations that minimize the hardware required for each block, so that the resulting system achieves high compression ratios while minimizing distortion. The hardware reduction of each system, encoder and decoder, is obtained by manipulating the equations of the Forward Discrete Cosine Transform (FDCT) and Quantization (Q) blocks and of the Inverse Discrete Cosine Transform (IDCT) and Inverse Quantization (IQ) blocks. Based on the conclusions of this study and on the analysis of known structures, each block was described in Very-High-Speed Integrated Circuits (VHSIC) Hardware Description Language (VHDL) and synthesized on a Field Programmable Gate Array (FPGA). Each implemented system executes its blocks in parallel to optimize encoding/decoding: in the encoder, the FDCT and Quantization operate on two different matrices simultaneously, and the same holds for the decoder, composed of the Inverse Quantization and IDCT blocks. Each synthesized block is validated using test vectors obtained from the study performed. After integrating the blocks, it was found that, for greyscale reference images with a resolution of 256 by 256 pixels, 820.5 μs are needed to encode an image and 830.5 μs to decode it. At a working frequency of 100 MHz, approximately 1200 images are processed per second.
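The merging of transform and quantization equations described above can be illustrated in software: the constant scale factors of the DCT are absorbed into the quantization divisors, so the transform itself needs fewer multipliers. Below is a minimal NumPy sketch of the forward path, with a plain matrix DCT standing in for the optimized hardware factorization; names and the 8x8 block size follow the JPEG baseline, but the code is illustrative only.

```python
import numpy as np

def dct_matrix(n: int = 8) -> np.ndarray:
    """Orthonormal DCT-II matrix C, so that C @ B @ C.T is the 2-D FDCT of B."""
    i = np.arange(n)[:, None]   # frequency index (rows)
    j = np.arange(n)[None, :]   # sample index (columns)
    c = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * j + 1) * i / (2 * n))
    c[0, :] = np.sqrt(1.0 / n)
    return c

def forward_dct_quant(block: np.ndarray, qtable: np.ndarray) -> np.ndarray:
    """FDCT plus quantization of one 8x8 block of level-shifted pixels.

    The single element-wise division mirrors the hardware optimization:
    the transform's constant scale factors are pre-merged into the
    quantization divisors instead of being multiplied out separately."""
    c = dct_matrix(8)
    return np.round((c @ block @ c.T) / qtable)
```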
Abstract:
A new high-throughput and scalable architecture for unified transform coding in H.264/AVC is proposed in this paper. This flexible structure is capable of computing all the 4x4 and 2x2 transforms for Ultra High Definition Video (UHDV) applications (4320x7680 @ 30 fps) in real time and with low hardware cost. These significantly high performance levels were proven with the implementation of several different configurations of the proposed structure using both FPGA and ASIC 90 nm technologies. In addition, this experimental evaluation also demonstrated the high area efficiency of the proposed architecture, which in terms of Data Throughput per Unit of Area (DTUA) is at least 1.5 times more efficient than its most prominent related designs.
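For reference, the 4x4 and 2x2 transforms referred to above include the H.264/AVC forward 4x4 integer core transform and the 2x2 Hadamard transform applied to chroma DC coefficients, sketched below in NumPy. The matrix entries are small integers, which is what makes a multiplier-free unified hardware datapath possible; the norm-correcting scale factors are folded into the quantization stage.

```python
import numpy as np

# H.264/AVC forward 4x4 integer core transform matrix and the 2x2
# Hadamard matrix for chroma DC; in hardware, both reduce to
# additions and shifts.
CF4 = np.array([[1,  1,  1,  1],
                [2,  1, -1, -2],
                [1, -1, -1,  1],
                [1, -2,  2, -1]])
H2 = np.array([[1,  1],
               [1, -1]])

def forward_4x4(residual: np.ndarray) -> np.ndarray:
    return CF4 @ residual @ CF4.T

def hadamard_2x2(dc: np.ndarray) -> np.ndarray:
    return H2 @ dc @ H2.T
```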
Abstract:
Deoxyribonucleic acid, or DNA, is the most fundamental aspect of life, but present-day scientific knowledge has merely scratched the surface of the problem posed by its decoding. While experimental methods provide insightful clues, the adoption of analysis tools supported by the formalism of mathematics will lead to a systematic and solid build-up of knowledge. This paper studies human DNA from the perspective of system dynamics. By associating entropy and the Fourier transform, several global properties of the code are revealed. The fractional-order characteristics emerge as a natural consequence of the information content. These properties constitute a small piece of scientific knowledge that will support further efforts towards the final aim of establishing a comprehensive theory of the phenomena involved in life.
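A toy version of the entropy-plus-Fourier analysis sketched in this abstract is shown below; the numeric base encoding is an illustrative assumption, not the paper's mapping.

```python
import numpy as np
from collections import Counter

def dna_entropy_spectrum(seq: str):
    """Shannon entropy of the base distribution and the power spectrum of a
    numeric projection of the sequence (bases other than A, C, G, T are
    not handled in this sketch)."""
    counts = Counter(seq)
    probs = np.array([c / len(seq) for c in counts.values()])
    entropy = -np.sum(probs * np.log2(probs))             # bits per symbol
    mapping = {"A": -2.0, "C": -1.0, "G": 1.0, "T": 2.0}  # illustrative
    signal = np.array([mapping[b] for b in seq])
    spectrum = np.abs(np.fft.rfft(signal - signal.mean())) ** 2
    return entropy, spectrum
```

A power spectrum decaying with a non-integer exponent is the kind of fractional-order signature the abstract refers to.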
Abstract:
A novel hybrid approach, combining the wavelet transform, particle swarm optimization and an adaptive-network-based fuzzy inference system, is proposed in this paper for short-term electricity price forecasting in a competitive market. Results from a case study based on the electricity market of mainland Spain are presented. A thorough comparison is carried out, taking into account the results of previous publications. Finally, conclusions are duly drawn.
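The wavelet stage of such a hybrid can be sketched with PyWavelets: the price series is split into approximation and detail subseries, each forecast separately (here by the PSO-tuned fuzzy inference system) and then recombined. The wavelet family and decomposition level below are illustrative assumptions, not the paper's settings.

```python
import numpy as np
import pywt  # PyWavelets

def decompose_prices(prices: np.ndarray, wavelet: str = "db4", level: int = 3):
    """Split a price series into one approximation and `level` detail
    subseries; each would be forecast separately and the forecasts
    recombined with pywt.waverec."""
    return pywt.wavedec(prices, wavelet, level=level)  # [cA3, cD3, cD2, cD1]
```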
Abstract:
The effect of cultivation parameters such as incubation temperature, IPTG induction and ethanol shock on the production of Pseudomonas aeruginosa amidase (E.C. 3.5.1.4) in a recombinant Escherichia coli strain in LB-ampicillin culture medium was investigated. The highest yield of soluble amidase, relative to other proteins, was obtained at 37 °C using 0.40 mM IPTG to induce growth, with ethanol. Our results demonstrate the formation of insoluble aggregates containing amidase, which was biologically active, in all tested growth conditions. Addition of ethanol to the culture medium at 25 °C improved the amidase yield; the enzyme aggregated quantitatively in a biologically active form and exhibited, in all conditions, an increased specific activity relative to the soluble form of the enzyme. Non-denaturing solubilization of the aggregated amidase was successfully achieved using L-arginine. Fourier transform infrared spectroscopy (FTIR) analysis of the aggregates obtained at 37 °C demonstrated a lower content of intermolecular interactions, which facilitated the solubilization step under non-denaturing conditions. The stronger interactions exhibited by aggregates obtained under suboptimal conditions compromised the solubilization yield. This work provides an approach for the characterization and solubilization of the novel biologically active aggregates of this amidase reported here.
Abstract:
In this paper, a hybrid intelligent approach is proposed for short-term electricity price forecasting in a competitive market. The proposed approach is based on the wavelet transform and a hybrid of neural networks and fuzzy logic. Results from a case study based on the electricity market of mainland Spain are presented. A thorough comparison is carried out, taking into account the results of previous publications. Conclusions are duly drawn.
Abstract:
This paper addresses DNA code analysis from the perspective of dynamics and fractional calculus. Several mathematical tools are selected to establish a quantitative method without distorting the alphabet represented by the sequence of DNA bases. The association of Gray code, the Fourier transform and fractional calculus leads to a categorical representation of species and chromosomes.
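A minimal sketch of the Gray-code projection mentioned above: the four bases are assigned 2-bit Gray codes (one illustrative assignment; the paper's may differ) so that neighbouring codes differ in a single bit, and the resulting numeric signal is analysed with the Fourier transform.

```python
import numpy as np

# One illustrative 2-bit Gray-code assignment of the four bases:
# consecutive codes differ in a single bit, so the numeric projection
# does not impose artificial jumps on the alphabet.
GRAY = {"A": 0b00, "C": 0b01, "G": 0b11, "T": 0b10}

def gray_spectrum(seq: str) -> np.ndarray:
    """Project a DNA sequence onto Gray-coded integers and return the
    magnitude spectrum used for the dynamics analysis."""
    x = np.array([GRAY[b] for b in seq], dtype=float)
    return np.abs(np.fft.rfft(x - x.mean()))
```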