974 results for Software-reconfigurable array processing architectures


Relevance:

20.00%

Publisher:

Abstract:

Final Master's project submitted to obtain the Master's degree in Electronic and Telecommunications Engineering.

Relevance:

20.00%

Publisher:

Abstract:

Dissertation presented at the Faculdade de Ciências e Tecnologia of the Universidade Nova de Lisboa to obtain the Master's degree in Electrical and Computer Engineering.

Relevance:

20.00%

Publisher:

Abstract:

Dissertation presented at the Faculdade de Ciências e Tecnologia of the Universidade Nova de Lisboa to obtain the Master's degree in Electrical and Computer Engineering.

Relevance:

20.00%

Publisher:

Abstract:

Final Master's project submitted to obtain the Master's degree in Mechanical Engineering.

Relevance:

20.00%

Publisher:

Abstract:

Master's degree in Music Teaching.

Relevance:

20.00%

Publisher:

Abstract:

STRIPPING is a software application developed for the automatic design of randomly packed columns in which volatile organic compounds (VOCs) are transferred from water to air, and for the simulation of their steady-state behaviour. The software completely eliminates the need for experimental work in selecting the column diameter, and allows the most convenient hydraulic regime for this type of operation to be chosen a priori. It also lets the operator choose the model used to calculate some parameters, namely between the Eckert/Robbins and Billet models for estimating the pressure drop of the gaseous phase, and between the Billet and Onda/Djebbar models for mass transfer. Illustrations of the graphical interface are presented.
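
The following is a minimal sketch of the kind of sizing relation such a tool automates: the stripping factor and the transfer-unit method used to obtain the packed height for a given VOC removal. It is not reproduced from the STRIPPING application; the function names, the fixed HTU and the example figures are illustrative assumptions.

```python
from math import log

def stripping_factor(henry_dimensionless, gas_flow, liquid_flow):
    """Stripping factor R = H * (G/L) for a dilute, counter-current column."""
    return henry_dimensionless * gas_flow / liquid_flow

def ntu_stripping(c_in, c_out, R):
    """Number of transfer units for VOC stripping with clean inlet air."""
    if R <= 1.0:
        raise ValueError("R must exceed 1 for the target removal to be feasible")
    return R / (R - 1.0) * log((c_in / c_out * (R - 1.0) + 1.0) / R)

def packed_height(c_in, c_out, R, htu):
    """Packed height Z = HTU * NTU; in practice HTU would come from a
    mass-transfer correlation such as Onda's (not reproduced here)."""
    return htu * ntu_stripping(c_in, c_out, R)

# Hypothetical case: 99% removal, H = 0.25, air-to-water ratio 30, HTU = 0.6 m
R = stripping_factor(0.25, gas_flow=30.0, liquid_flow=1.0)
print(f"Z = {packed_height(100.0, 1.0, R, htu=0.6):.2f} m")
```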

Relevance:

20.00%

Publisher:

Abstract:

In this study, an attempt was made to measure and evaluate the eco-efficiency performance of a pultruded composite processing company. For this purpose, the recommendations of the World Business Council for Sustainable Development (WBCSD) and the directives of the ISO 14031 standard were followed and applied. The main general eco-efficiency indicators, as well as the specific indicators, were defined and determined. Based on the indicators' figures, the value profile, the environmental profile and the pertinent eco-efficiency ratios were established and analysed. To evaluate potential improvements in the company's eco-performance, new indicator values and eco-efficiency ratios were estimated taking into account the implementation of new procedures, both upstream and downstream of the production process, namely: the adoption of a new, more effective heating system for the pultrusion die in the manufacturing process, with lower heat losses; and a recycling approach with partial reuse of scrap material derived from the manufacturing, cutting and assembly processes of GFRP profiles. These measures lead to significant improvements in the subsequently assessed eco-efficiency ratios of the present case study, yielding a more sustainable product and manufacturing process for pultruded GFRP profiles.
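
As an illustration of the ratios referred to above, the WBCSD defines eco-efficiency as product or service value divided by environmental influence. The sketch below, with hypothetical indicator names and figures (not the company's data), shows how such ratios could be computed for a baseline and an improved scenario.

```python
from dataclasses import dataclass

@dataclass
class EcoIndicators:
    """WBCSD-style indicators for one reporting period (illustrative fields)."""
    net_sales: float           # value of product/service, e.g. EUR
    energy_consumption: float  # environmental influence, e.g. GJ
    water_consumption: float   # m3
    ghg_emissions: float       # t CO2-eq

def eco_efficiency_ratios(ind: EcoIndicators) -> dict:
    """Eco-efficiency = product or service value / environmental influence."""
    return {
        "sales_per_GJ": ind.net_sales / ind.energy_consumption,
        "sales_per_m3_water": ind.net_sales / ind.water_consumption,
        "sales_per_tCO2e": ind.net_sales / ind.ghg_emissions,
    }

# Hypothetical baseline vs. improved scenario (new die heating + partial recycling)
baseline = EcoIndicators(2.5e6, 4.0e3, 1.2e4, 900.0)
improved = EcoIndicators(2.5e6, 3.2e3, 1.1e4, 780.0)
base, new = eco_efficiency_ratios(baseline), eco_efficiency_ratios(improved)
for key in base:
    print(f"{key}: {base[key]:.1f} -> {new[key]:.1f}")
```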

Relevance:

20.00%

Publisher:

Abstract:

This document constitutes a professional work submitted for assessment as part of an application to the Instituto Politécnico do Porto for the title of Specialist in the scientific area of Informatics, more specifically in project and team management in software development SMEs. To that end, it begins with a brief review of the relevant state of practice, followed by a description of the roles I have held throughout my professional career and of the main projects in which I had the opportunity to participate. Finally, the document presents a critical analysis of some successes and failures in applying the state of practice, illustrated with real examples I experienced during my career.

Relevance:

20.00%

Publisher:

Abstract:

Post-processing a finite element solution is a well-known technique that consists in recalculating the originally obtained quantities so that the rate of convergence increases without the need for expensive remeshing techniques. Post-processing is especially effective in problems where better accuracy is required for derivatives of nodal variables in regions where a Dirichlet essential boundary condition is imposed strongly. Consequently, such an approach can be exceptionally well suited to modelling resin infiltration under the quasi-steady-state assumption with remeshing techniques and explicit time integration, because only the free-front normal velocities are needed to advance the resin front to its next position. The new contribution is the post-processing analysis and implementation of the free-boundary velocities in mesolevel infiltration analysis. Such an implementation ensures better accuracy even on coarser meshes, which in turn also reduces the computational time by allowing larger time steps.
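
For context, the explicit free-front advancement that the post-processed velocities feed into amounts to moving each front node along its normal by the normal velocity times the time step. A minimal NumPy sketch of that step follows; the data layout and numbers are illustrative, not the paper's implementation.

```python
import numpy as np

def advance_front(front_nodes, normals, normal_velocities, dt):
    """Explicitly advance resin free-front nodes along their outward normals.

    front_nodes       : (N, 2) current front coordinates
    normals           : (N, 2) unit outward normals at those nodes
    normal_velocities : (N,) post-processed normal velocities
    dt                : explicit time step
    """
    return front_nodes + normal_velocities[:, None] * normals * dt

# Illustrative data: a short straight front moving in +x at 1 mm/s
nodes = np.array([[0.0, 0.0], [0.0, 1.0], [0.0, 2.0]])
normals = np.tile([1.0, 0.0], (3, 1))
v_n = np.array([1.0, 1.0, 1.0])   # would come from the post-processing step
print(advance_front(nodes, normals, v_n, dt=0.01))
```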

Relevance:

20.00%

Publisher:

Abstract:

Introduction – Myocardial perfusion scintigraphy (MPS) plays an important role in the diagnosis, evaluation and follow-up of patients with coronary artery disease, and its processing is mostly performed semi-automatically. Since the performance of nuclear medicine technologists (NMTs) can be affected by individual and environmental factors, different professionals processing the same data may obtain different estimates of the quantitative parameters (QPs). Objective – To evaluate the influence of professional experience and visual function on the semi-automatic processing of MPS, and to analyse intra- and inter-operator variability in the determination of functional and perfusion QPs. Methodology – A sample of 20 NMTs was selected and divided into two groups according to their experience with the Quantitative Gated SPECT(TM) software: Group A (GA) – NMTs with ≥600 h of experience, and Group B (GB) – NMTs with no experience. The NMTs underwent an orthoptic assessment and processed 21 MPS studies five times each, on non-consecutive occasions. Vision was considered altered when at least one visual function parameter was abnormal. Repeatability and reproducibility were assessed by determining the coefficients of variation (%). The Friedman and Wilcoxon tests were applied, respectively, to compare the QPs between operators and to analyse the performance of GA versus GB, considering the processing of the same MPS studies. The Mann-Whitney test was used to compare NMTs with normal and altered vision in the determination of the QPs, and the ETA association coefficient was used to evaluate the influence of vision on each QP. Statistically significant differences were assumed at the 5% significance level. Results and Discussion – Low intra-operator (<6.59%) and inter-operator (<5.07%) variability was observed. GB was the most discrepant group in determining the QPs, with the septal wall (SW) being the only QP showing statistically significant differences (zw = -2.051, p = 0.040) when compared with GA. Regarding the influence of visual function, statistically significant differences were detected only for the left ventricular ejection fraction (LVEF) (U = 11.5, p = 0.012) between NMTs with normal and altered vision, with vision accounting for 33.99% of its variation. More differences in the QPs were observed for NMTs with a higher incidence of ocular symptoms and reduced binocular vision. LVEF proved to be the most consistent parameter between operators (1.86%). Conclusion – MPS is a repeatable and reproducible, operator-independent technique. Professional experience and visual function were found to influence the semi-automatic processing of MPS for the SW and LVEF QPs, respectively.
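
As a reference for the variability measures used above, the sketch below shows how intra- and inter-operator coefficients of variation (sample standard deviation divided by the mean, in %) can be computed; the operator labels and LVEF values are illustrative, not the study data.

```python
import numpy as np

def cv_percent(values):
    """Coefficient of variation (%) = sample standard deviation / mean * 100."""
    values = np.asarray(values, dtype=float)
    return values.std(ddof=1) / values.mean() * 100.0

# Illustrative LVEF estimates (%): five repeated processings per operator
repeats_by_operator = {
    "operator_1": [62.0, 61.5, 62.3, 61.8, 62.1],
    "operator_2": [60.9, 61.2, 61.0, 61.4, 61.1],
}

intra = {op: cv_percent(v) for op, v in repeats_by_operator.items()}
inter = cv_percent([np.mean(v) for v in repeats_by_operator.values()])
print("intra-operator CV%:", intra)
print("inter-operator CV%:", round(inter, 2))
```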

Relevance:

20.00%

Publisher:

Abstract:

Multi-agent architectures are well suited for complex inherently distributed problem solving domains. From the many challenging aspects that arise within this framework, a crucial one emerges: how to incorporate dynamic and conflicting agent beliefs? While the belief revision activity in a single agent scenario is concentrated on incorporating new information while preserving consistency, in a multi-agent system it also has to deal with possible conflicts between the agents perspectives. To provide an adequate framework, each agent, built as a combination of an assumption based belief revision system and a cooperation layer, was enriched with additional features: a distributed search control mechanism allowing dynamic context management, and a set of different distributed consistency methodologies. As a result, a Distributed Belief Revision Testbed (DiBeRT) was developed. This paper is a preliminary report presenting some of DiBeRT contributions: a concise representation of external beliefs; a simple and innovative methodology to achieve distributed context management; and a reduced inter-agent data exchange format.
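
A rough sketch of the idea of a concise representation of external beliefs is given below: the local agent keeps only the received conclusion and its origin, deferring conflicts to a separate consistency step. The data structures are hypothetical and do not reproduce DiBeRT's actual exchange format.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class ExternalBelief:
    """Concise proxy for a belief received from another agent: only the
    conclusion and its origin are kept, not the foreign justification chain."""
    proposition: str
    source_agent: str

@dataclass
class BeliefStore:
    local: dict = field(default_factory=dict)   # proposition -> supporting assumptions
    external: set = field(default_factory=set)  # ExternalBelief instances

    def receive(self, proposition: str, source: str) -> None:
        belief = ExternalBelief(proposition, source)
        negation = "not " + proposition
        if negation in self.local or any(b.proposition == negation for b in self.external):
            self.resolve_conflict(belief)
        else:
            self.external.add(belief)

    def resolve_conflict(self, belief: ExternalBelief) -> None:
        # Placeholder: a real system would apply one of several distributed
        # consistency methodologies (e.g. preferring local or majority views).
        print(f"conflict on {belief.proposition!r} reported by {belief.source_agent}")

store = BeliefStore(local={"not flooded(area1)": {"sensor_reading_17"}})
store.receive("flooded(area1)", source="agent_B")  # triggers conflict handling
store.receive("dry(area2)", source="agent_C")      # stored as an external belief
```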

Relevance:

20.00%

Publisher:

Abstract:

Environmental management is a complex task. The amount and heterogeneity of the data needed for an environmental decision-making tool are overwhelming without adequate database systems and innovative methodologies. As far as data management, data interaction and data processing are concerned, we propose the use of a Geographical Information System (GIS), whilst for the decision making we suggest a Multi-Agent System (MAS) architecture. With the adoption of a GIS we aim to provide a complementary coexistence between heterogeneous data sets, a correct data structure, good storage capacity and a friendly user interface. By choosing a distributed architecture such as a Multi-Agent System, in which each agent is a semi-autonomous Expert System with the necessary skills to cooperate with the others in order to solve a given task, we hope to ensure a dynamic problem decomposition and to achieve better performance than standard monolithic architectures. Finally, in view of the partial, imprecise and ever-changing character of the information available for decision making, Belief Revision capabilities are added to the system. Our aim is to present and discuss an intelligent environmental management system capable of suggesting the most appropriate land-use actions based on the existing spatial and non-spatial constraints.
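
As a toy illustration of the kind of suggestion such a system would produce, the sketch below filters candidate land-use actions against GIS-derived constraints; the attribute names and rules are invented for illustration and are not the system's actual knowledge base.

```python
def suggest_land_use(parcel, constraints):
    """Return the candidate uses whose constraints the parcel satisfies.

    parcel      : dict of GIS-derived attributes for one land parcel
    constraints : dict mapping a land-use label to a predicate over attributes
    """
    return [use for use, ok in constraints.items() if ok(parcel)]

# Invented constraint rules and attributes, purely for illustration
constraints = {
    "agriculture": lambda p: p["slope"] < 10 and p["flood_risk"] != "high",
    "urban":       lambda p: p["slope"] < 15 and not p["protected_area"],
    "forestry":    lambda p: True,
}
parcel = {"slope": 7, "flood_risk": "low", "protected_area": False}
print(suggest_land_use(parcel, constraints))   # -> ['agriculture', 'urban', 'forestry']
```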

Relevance:

20.00%

Publisher:

Abstract:

Recent integrated circuit technologies have opened the possibility of designing parallel architectures with hundreds of cores on a single chip. The design space of these parallel architectures is huge, with many architectural options. Exploring the design space becomes even more difficult if, beyond performance and area, we also consider extra metrics such as performance and area efficiency, where the designer seeks the architecture with the best performance per chip area and the best sustainable performance. In this paper we present an algorithm-oriented approach to designing a many-core architecture. Instead of exploring the design space of the many-core architecture based on the experimental execution results of a particular benchmark of algorithms, our approach is to perform a formal analysis of the algorithms considering the main architectural aspects and to determine how each particular architectural aspect relates to the performance of the architecture when running an algorithm or set of algorithms. The architectural aspects considered include the number of cores, the local memory available in each core, the communication bandwidth between the many-core architecture and the external memory, and the memory hierarchy. To exemplify the approach, we carried out a theoretical analysis of a dense matrix multiplication algorithm and derived an equation that relates the number of execution cycles to the architectural parameters. Based on this equation, a many-core architecture has been designed. The results obtained indicate that a 100 mm² integrated circuit design of the proposed architecture, using a 65 nm technology, is able to achieve 464 GFLOPs (double-precision floating point) for a memory bandwidth of 16 GB/s, corresponding to a performance efficiency of 71%. Considering a 45 nm technology, a 100 mm² chip attains 833 GFLOPs, which corresponds to 84% of peak performance. These figures are better than those obtained by previous many-core architectures, except for the area efficiency, which is limited by the lower memory bandwidth considered. The results achieved are also better than those of previous state-of-the-art many-core architectures designed specifically to achieve high performance for matrix multiplication.
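
The cycle equation itself is not reproduced in the abstract; the sketch below is a generic roofline-style estimate under standard blocking assumptions (three b×b blocks per core's local memory), relating execution cycles to the number of cores, local memory and external memory bandwidth. It is not the paper's derived equation, and the parameter values are illustrative.

```python
from math import sqrt

def matmul_cycles(n, cores, local_mem_words, mem_bw_words_per_cycle,
                  flops_per_core_per_cycle=2.0):
    """Roofline-style estimate of execution cycles for a blocked dense N x N
    matrix multiplication on a many-core chip (generic model, not the paper's).

    Blocks of size b x b are chosen so that three blocks fit in local memory.
    """
    b = int(sqrt(local_mem_words / 3))              # block edge per core
    flops = 2.0 * n ** 3                            # multiply-add counted as 2 flops
    compute_cycles = flops / (cores * flops_per_core_per_cycle)
    words_moved = 2.0 * n ** 3 / b + 2.0 * n ** 2   # stream A, B blocks + read/write C
    memory_cycles = words_moved / mem_bw_words_per_cycle
    return max(compute_cycles, memory_cycles)       # whichever resource saturates first

# Illustrative: 2048 x 2048, 256 cores, 32 Kwords of local memory, 4 words/cycle off-chip
print(matmul_cycles(2048, cores=256, local_mem_words=32 * 1024,
                    mem_bw_words_per_cycle=4.0))
```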

Relevance:

20.00%

Publisher:

Abstract:

An adaptive antenna array combines the signals of its elements, using constraints to shape the antenna's radiation pattern while maximizing the performance of the system. Direction-of-arrival (DOA) algorithms are applied to determine the directions of impinging signals, whereas beamforming techniques are employed to determine the appropriate weights for the array elements so as to create the desired pattern. In this paper, a detailed analysis of both categories of algorithms is made for the case of a planar antenna array. Several simulation results show that it is possible to point an antenna array in a desired direction based on the DOA estimation and on the beamforming algorithms. A comparison of the performance of the algorithms used is made in terms of runtime and accuracy; these characteristics depend on the SNR of the incoming signal.
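
A minimal sketch of the planar-array building blocks discussed above (the steering vector and a conventional delay-and-sum beamformer) is given below; the 8 x 8 geometry, half-wavelength spacing and steering direction are illustrative assumptions, not the paper's simulation setup.

```python
import numpy as np

def upa_steering(m, n, dx, dy, wavelength, theta, phi):
    """Steering vector of an M x N uniform planar array in the xy-plane.
    theta: elevation from broadside (z-axis), phi: azimuth, both in radians."""
    k = 2.0 * np.pi / wavelength
    mx, ny = np.meshgrid(np.arange(m), np.arange(n), indexing="ij")
    phase = k * (mx * dx * np.sin(theta) * np.cos(phi) +
                 ny * dy * np.sin(theta) * np.sin(phi))
    return np.exp(1j * phase).ravel()

# Conventional (delay-and-sum) beamformer steered to (theta0, phi0)
wavelength, dx, dy = 1.0, 0.5, 0.5      # half-wavelength element spacing
theta0, phi0 = np.deg2rad(30), np.deg2rad(45)
w = upa_steering(8, 8, dx, dy, wavelength, theta0, phi0) / 64.0

# Array response toward a trial direction: |w^H a(theta, phi)|
a_trial = upa_steering(8, 8, dx, dy, wavelength, np.deg2rad(30), np.deg2rad(45))
print(abs(np.vdot(w, a_trial)))         # ~1.0 at the steered direction
```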

Relevance:

20.00%

Publisher:

Abstract:

Wireless networks have made their way into sports venues, offering the public a set of facilities such as access to e-mail and news, as well as the use of social networks for uploading photos. New challenges have emerged in providing Wi-Fi in these densely populated stadiums, such as increasing capacity and coverage. In this article, an access-point antenna array designed to cover one sector of a stadium is presented. Its structure, built from a low-cost material, reduces the total manufacturing cost, an important factor given the large number of antennas required in these venues. The material characteristics and the broad operating bandwidth (300 MHz), together with the low side-lobe levels that are important for reducing interference between sectors, make this antenna well suited for wireless communications in these particular venues. (c) 2015 Wiley Periodicals, Inc. Microwave Opt Technol Lett 57:2037-2041, 2015.
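
As a rough illustration of the side-lobe trade-off mentioned above, the sketch below compares the side-lobe level of a uniformly excited linear array with that of an amplitude-tapered one; the 8-element geometry and Hamming-like taper are illustrative assumptions, not the antenna described in the article.

```python
import numpy as np

def array_factor_db(weights, d_over_lambda, theta):
    """Normalised array factor (dB) of an N-element uniform linear array;
    theta is measured from broadside."""
    n = np.arange(len(weights))
    psi = 2.0 * np.pi * d_over_lambda * np.sin(theta)
    af = np.abs(np.exp(1j * np.outer(psi, n)) @ weights)
    return 20.0 * np.log10(af / af.max() + 1e-12)

def side_lobe_level(af_db):
    """Highest local maximum below the main-beam peak."""
    interior = af_db[1:-1]
    peaks = (interior > af_db[:-2]) & (interior > af_db[2:])
    return np.sort(interior[peaks])[-2]

theta = np.linspace(-np.pi / 2, np.pi / 2, 1801)
uniform = np.ones(8)
tapered = 0.54 - 0.46 * np.cos(2.0 * np.pi * np.arange(8) / 7)  # Hamming-like taper

for name, w in (("uniform", uniform), ("tapered", tapered)):
    print(name, "SLL = %.1f dB" % side_lobe_level(array_factor_db(w, 0.5, theta)))
```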