35 results for Unified Transform Kernel
Abstract:
This paper presents the design and implementation of direct power controllers for three-phase matrix converters (MC) operating as Unified Power Flow Controllers (UPFC). Theoretical principles of decoupled linear power controllers for the MC-UPFC, designed to minimize the cross-coupling between active and reactive power control, are established. From the matrix converter based UPFC model, with a modified Venturini high-frequency PWM modulator, decoupled controllers for the direct control of the transmission line active (P) and reactive (Q) power are synthesized. Simulation results obtained in Matlab/Simulink confirm the proposed approach, showing decoupled power control, zero-error tracking, and fast responses with no overshoot and no steady-state error.
Abstract:
The main purpose of this research is to identify the hidden knowledge and learning mechanisms in the organization in order to disclose tacit knowledge and transform it into explicit knowledge. Most firms tend to duplicate their efforts, acquiring extra knowledge and new learning skills while forgetting to exploit the ones they already have, thus wasting resources that could be applied to increase added value within the firm's overall competitive advantage. This unique value, in the shape of the creation, acquisition, transformation and application of learning and knowledge, is not disseminated throughout the individual, the group and, ultimately, the company itself. This work is based on three variables that explain the behaviour of learning as a process of construction and acquisition of knowledge, namely internal social capital, technology and external social capital, which include the main attributes of learning and knowledge and help us capture the essence of this symbiosis. Absorptive Capacity provides the right tool to explore this uncertainty within the firm: with it, it is possible to achieve the right match between the learning skills and the knowledge needed to support the firm's overall strategy. The study draws on a sample of the Portuguese textile industry and is based on a multisectorial analysis, which makes a cross-functional check on the validity of the results possible, in order to better understand and capture the dynamics of organizational behaviour.
Abstract:
This work develops a new model of Absorptive Capacity based on two variables, learning and knowledge, to explain how companies transform information into knowledge.
Abstract:
Novel alternating copolymers comprising biscalix[4]arene-p-phenylene ethynylene and m-phenylene ethynylene units (CALIX-m-PPE) were synthesized using the Sonogashira-Hagihara cross-coupling polymerization. Good isolated yields (60-80%) were achieved for the polymers, which show M_n ranging from 1.4 x 10^4 to 5.1 x 10^4 g mol^-1 (gel permeation chromatography analysis), depending on the specific polymerization conditions. The structural analysis of CALIX-m-PPE was performed by 1H, 13C, 13C-1H heteronuclear single quantum correlation (HSQC), 13C-1H heteronuclear multiple bond correlation (HMBC), correlation spectroscopy (COSY), and nuclear Overhauser effect spectroscopy (NOESY), in addition to Fourier transform infrared spectroscopy and microanalysis, allowing its full characterization. Depending on the reaction setup, variable amounts (16-45%) of diyne units were found in the polymers, although their photophysical properties are essentially the same. It is demonstrated that CALIX-m-PPE does not form ground- or excited-state interchain interactions, owing to the highly crowded environment of the main chain imparted by both calix[4]arene side units, which behave as insulators inhibiting main-chain pi-pi stacking. It was also found that the luminescent properties of CALIX-m-PPE are markedly different from those of an all-p-linked phenylene ethynylene copolymer (CALIX-p-PPE) previously reported. The unexpected appearance of a low-energy emission band at 426 nm, in addition to the locally excited-state emission (365 nm), together with a quite low fluorescence quantum yield (Phi = 0.02) and double-exponential decay dynamics, led to the formulation of an intramolecular exciplex as the new emissive species.
Abstract:
The motivation for this work comes from the author's need to record the notes played on the guitar while improvising. When improvising, guitarists often do not remember the notes they have just played. This work describes the development of an application for guitarists that records the notes played on an electric or classical guitar. The signal is acquired from the guitar and captured under real-time processing requirements. The notes produced by the electric guitar, connected to the computer, are represented as tablature and/or standard notation. To this end, the application captures the guitar signal through the computer's sound card and uses frequency detection and note-duration estimation algorithms to build the record of the notes played. The application is designed to be multi-platform, running on different operating systems (Windows and Linux) and using public-domain tools and libraries. The results obtained show that the guitar can be tuned with errors on the order of 2 Hz relative to the standard tuning frequencies. The tablature output is satisfactory but can be improved; this will require better signal processing techniques, as well as better inter-process communication, to solve the problems found during testing.
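The frequency-detection step this abstract describes can be sketched minimally as follows. This is an illustrative assumption, not the application's actual code: pitch is estimated from the FFT magnitude peak of a Hann-windowed frame, refined by log-parabolic interpolation; the sample rate and test note are invented for the demo.

```python
import numpy as np

def estimate_pitch(signal, sample_rate):
    """Estimate the dominant frequency (Hz) of a mono signal frame:
    Hann window -> FFT magnitude -> peak bin -> log-parabolic refinement."""
    windowed = signal * np.hanning(len(signal))
    spectrum = np.abs(np.fft.rfft(windowed))
    peak = int(np.argmax(spectrum[1:-1])) + 1  # skip the DC and last bins
    # Refine the peak position with a parabola through the log magnitudes.
    a, b, c = np.log(spectrum[peak - 1:peak + 2])
    offset = 0.5 * (a - c) / (a - 2 * b + c)
    return (peak + offset) * sample_rate / len(signal)

# Demo: a synthetic open A string (110 Hz); the estimate lands well within
# the 2 Hz tolerance quoted in the abstract.
fs = 44100
t = np.arange(4096) / fs
print(estimate_pitch(np.sin(2 * np.pi * 110.0 * t), fs))
```

A real tuner would run this per captured frame from the sound card and pair it with an onset/duration estimator before writing tablature.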
Abstract:
This work consists of the development of a Criminology Support System (SAC), intended to help detectives/analysts with the proactive prevention of crime and with the management of their material and human resources, as well as to foster studies on the high incidence of certain types of crime in a given region. Historically, solving crimes has been a prerogative of criminal justice and its specialists. With the growing use of computer systems in the judicial system to record all data concerning crime occurrences, suspects and victims, individuals' criminal records, and other data flowing within the organization, there is an increasing need to turn these data into information useful in fighting crime. SAC takes advantage of knowledge extraction techniques and applies them to a set of crime occurrence data for a given region and time frame, as well as to a set of variables that influence crime, which were studied and identified in this work. The work comprises a knowledge extraction model and an application that lets the user supply a suitable data set, ensuring the maximum effectiveness of the model.
Abstract:
The goal of this work is the development of a dynamic simulation tool for LTE downlink radio resources, built on the OMNeT++ framework. The tool supports base station planning, simulation, and result analysis. The main aspects of the radio access technology are described, namely the network architecture, coding, the definition of the radio resources, the transmission rates supported at the channel level, and the admission control mechanism. A radio resource usage scenario was defined, including packet- and circuit-oriented traffic and service models, and a reference scenario was used to verify and validate the simulation model. The simulation operates at the system level, supported by a dynamic, stochastic, discrete-event model, so as to cover the different mechanisms characteristic of OFDMA technology. The results support performance analysis of services, base stations, and the overall system in terms of average network throughput, average throughput per eNodeB, and average throughput per mobile, and also allow the contribution of other parameters to be analyzed, namely bandwidth, coverage radius, service profile, and modulation scheme, among others. The results show that, for base stations with a coverage radius of 100 m, the end-user throughput was 4.69494 Mbps, i.e., 7 times higher than for base stations with a coverage radius of 200 m.
Abstract:
We present a study of the magnetic properties of a group of basalt samples from the Saldanha Massif (Mid-Atlantic Ridge - MAR - 36° 33' 54" N, 33° 26' W), and we set out to interpret these properties within the tectono-magmatic framework of this sector of the MAR. Most samples have low magnetic anisotropy and magnetic minerals of single-domain grain size, typical of rapid cooling. The thermomagnetic study mostly shows two different susceptibility peaks. The high-temperature peak is related to mineralogical alteration due to heating. The low-temperature peak distinguishes three different stages of low-temperature oxidation: the presence of titanomagnetite; of titanomagnetite and titanomaghemite; and exclusively of titanomaghemite. Based on established empirical relationships between Curie temperature and degree of oxidation, the latter is tentatively deduced for all samples. Finally, swath bathymetry and sidescan sonar data combined with dive observations show that the Saldanha Massif is located over an exposed section of upper mantle rocks, interpreted to be the result of detachment tectonics. Basalt samples inside the detachment zone often have higher than expected oxidation rates; this effect can be explained by the higher permeability caused by the detachment fault activity.
Abstract:
Recently, several distributed video coding (DVC) solutions based on the distributed source coding (DSC) paradigm have appeared in the literature. Wyner-Ziv (WZ) video coding, a particular case of DVC in which side information is made available at the decoder, enables a flexible distribution of the computational complexity between the encoder and the decoder, promising to fulfill novel requirements of applications such as video surveillance, sensor networks and mobile camera phones. The quality of the side information at the decoder plays a critical role in determining the WZ video coding rate-distortion (RD) performance, notably in raising it as close as possible to the RD performance of standard predictive video coding schemes. Towards this target, efficient motion search algorithms for powerful frame interpolation are much needed at the decoder. In this paper, the RD performance of a Wyner-Ziv video codec is improved by using novel, advanced motion-compensated frame interpolation techniques to generate the side information. The development of this type of side information estimator is a difficult problem in WZ video coding, especially because the decoder only has some decoded reference frames available. Based on the regularization of the motion field, novel side information creation techniques are proposed in this paper, along with a new frame interpolation framework able to generate higher-quality side information at the decoder. To illustrate the RD performance improvements, this novel side information creation framework has been integrated into a transform-domain, turbo-coding-based Wyner-Ziv video codec. Experimental results show that the novel side information creation solution leads to better RD performance than available state-of-the-art side information estimators, with improvements of up to 2 dB; moreover, it outperforms H.264/AVC Intra by up to 3 dB, with lower encoding complexity.
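As a rough illustration of decoder-side frame interpolation (a plain bilateral full-search block-matching sketch, not the regularized motion-field estimator the paper proposes), side information for a missing frame can be built by matching blocks symmetrically between the two decoded key frames, assuming linear motion, and averaging the matched pair:

```python
import numpy as np

def mc_interpolate(prev, nxt, block=8, search=4):
    """Build the middle frame between two grayscale key frames whose
    dimensions are multiples of `block`: for each block position, find the
    symmetric displacement (+d into `prev`, -d into `nxt`) minimizing the
    SAD between the two windows, then average them."""
    h, w = prev.shape
    mid = np.zeros_like(prev)
    for y0 in range(0, h, block):
        for x0 in range(0, w, block):
            best_sad, best = np.inf, None
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    py, px = y0 + dy, x0 + dx    # window in the past frame
                    ny, nx_ = y0 - dy, x0 - dx   # mirrored window in the future frame
                    if min(py, px, ny, nx_) < 0 or max(py, ny) + block > h \
                            or max(px, nx_) + block > w:
                        continue  # candidate falls outside the frame
                    a = prev[py:py + block, px:px + block].astype(np.int32)
                    b = nxt[ny:ny + block, nx_:nx_ + block].astype(np.int32)
                    sad = np.abs(a - b).sum()
                    if sad < best_sad:
                        best_sad, best = sad, (a + b) // 2
            mid[y0:y0 + block, x0:x0 + block] = best
    return mid
```

For a pure translation between the key frames, interior blocks of the interpolated frame recover the true mid-position content exactly; real sequences need the motion-field regularization the paper develops to avoid blocky, inconsistent vectors.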
Abstract:
The interaction of a variety of substrates with Pseudomonas aeruginosa native amidase (E.C. 3.5.1.4), overproduced in an Escherichia coli strain, was investigated using difference FTIR spectroscopy. The amides used as substrates showed an increase in hydrogen bonding upon association in multimers, which was not seen with esters. Evidence for an overall reduction or weakening of hydrogen bonding while amide and ester substrates are interacting with the enzyme is presented. The results describe a spectroscopic approach for analysis of substrate-amidase interaction and in situ monitoring of the hydrolysis and transferase reaction when amides or esters are used as substrates.
Abstract:
Following the theoretical model of Merton (1987), we provide a new perspective on the role of idiosyncratic risk in the asset pricing process. More precisely, we analyze whether the idiosyncratic risk premium depends on the idiosyncratic risk level of an asset as well as on the variation in the market-wide measure of idiosyncratic risk. As expected, we obtain a net positive risk premium for the Spanish stock market over the period 1987-2007. Our results show a positive relation between returns and individual idiosyncratic risk levels, and a negative but weaker relation with the aggregate measure of idiosyncratic risk. These findings have important implications for portfolio and risk management, and contribute a unified and coherent answer to the main, still unsolved question of the idiosyncratic risk puzzle: whether there exists a premium associated with this kind of risk, and what its sign is.
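As a hypothetical illustration (synthetic data, not the Spanish-market sample used in the paper), the two-factor relation can be estimated by ordinary least squares; the simulated loadings below are chosen to mirror the signs the abstract reports, so the regression merely recovers them.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500
ivol = rng.uniform(0.10, 0.50, n)  # asset-level idiosyncratic risk (made up)
agg = rng.uniform(0.20, 0.50, n)   # market-wide idiosyncratic risk (made up)
# Simulated returns: positive loading on individual risk, smaller negative
# loading on the aggregate measure, plus noise.
ret = 0.08 * ivol - 0.03 * agg + rng.normal(0.0, 0.005, n)

# OLS with an intercept: beta = [intercept, individual premium, aggregate premium]
X = np.column_stack([np.ones(n), ivol, agg])
beta, *_ = np.linalg.lstsq(X, ret, rcond=None)
print(beta[1], beta[2])  # positive individual premium, negative aggregate one
```

With real data the same design matrix would hold estimated idiosyncratic volatilities rather than simulated ones.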
Abstract:
For several years, efforts have been under way to draw up unified technical regulations applicable at the European level, the Structural Eurocodes, which aim to harmonize the rules used in civil construction and to eliminate previous technical barriers. At the national level, the design of reinforced concrete buildings is currently in a period of transition from the technical regulations in force, the Regulamento de Segurança e Acções para Estruturas de Betão Armado (RSA) and the Regulamento de Estruturas de Betão Armado e Pré-Esforçado (REBAP), to the Structural Eurocodes; during this period the two coexist, since the new regulations to be adopted have not yet been published in the Diário da República. The present work verifies the structural safety of a flat-slab building located in Faro, designed according to the national regulations in force (RSA/REBAP), by applying the Structural Eurocodes, namely EC0, EC1, EC2 and EC8. Buildings of this type, in which the flat slab acts as a primary seismic element, are not fully covered by EC8, so the safety verification is carried out using two distinct analysis approaches. In the first, the structure is treated as Ductility Class Low (DCL), with all elements considered primary seismic elements; in the second, greater ductility (DCM) is assumed, with some elements classified as secondary seismic elements.
Abstract:
The surface morphology, structure and composition of human dentin treated with a femtosecond infrared laser (pulse duration 500 fs, wavelength 1030 nm, fluences ranging from 1 to 3 J cm^-2) were studied by scanning electron microscopy, x-ray diffraction, x-ray photoelectron spectroscopy and Fourier transform infrared spectroscopy. The average dentin ablation threshold under these conditions was 0.6 ± 0.2 J cm^-2, and the ablation rate reached 1 to 2 μm/pulse at an average fluence of 3 J cm^-2. The ablated surfaces present an irregular and rugged appearance, with no significant traces of melting, deformation, cracking or carbonization. The smear layer was entirely removed by the laser treatment. For fluences only slightly above the ablation threshold, the morphology of the laser-treated surfaces was very similar to that of dentin fracture surfaces, and the dentinal tubules remained open. For higher fluences, the surface was more porous and the dentin structure was partially concealed by ablation debris and a few resolidified droplets. Independently of the laser processing parameters and processing method used, no sub-superficial cracking was observed. The dentin constitution and chemical composition were not significantly modified by the laser treatment in the range of processing parameters used. In particular, the organic matter is not preferentially removed from the surface, and no traces of high-temperature phosphates, such as β-tricalcium phosphate, were observed. These results are compatible with an electrostatic ablation mechanism. In conclusion, the high beam quality and short pulse duration of the ultrafast laser used should allow the accurate preparation of cavities with negligible damage to the underlying material.
Abstract:
The idea of grand unification in a minimal supersymmetric SU(5) x SU(5) framework is revisited. It is shown that the unification of the gauge couplings into a unique coupling constant can be achieved at a high-energy scale compatible with proton decay constraints. This requires the addition of minimal particle content at intermediate energy scales. In particular, the introduction of the SU(2)_L triplets belonging to the (15, 1) + (15-bar, 1) representations, as well as of the scalar triplet Sigma_3 and octet Sigma_8 in the (24, 1) representation, turns out to be crucial for unification. The masses of these intermediate particles can vary over a wide range, and can even lie in the TeV region. In contrast, the exotic vector-like fermions must be heavy enough, with masses above 10^10 GeV. We also show that, if the SU(5) x SU(5) theory is embedded in a heterotic string scenario, it is not possible to achieve gauge coupling unification with gravity at the perturbative string scale.