73 results for "Mixture performance"


Relevance: 20.00%

Publisher:

Abstract:

Research on the use of Construction and Demolition Waste (CDW) as recycled aggregate (in particular crushed concrete) for the production of new concrete has by now established the feasibility of this environmentally friendly use of otherwise harmful waste. However, unlike conventional concrete (CC), concrete made with recycled concrete has seen no large-scale applications, and there is still a lack of knowledge in some areas of the production and performance of recycled aggregate concrete (RAC). One such issue is curing conditions: these greatly affect the performance of concrete made on site, and some potential users of RAC wonder how it is affected by far-from-ideal curing conditions. This paper presents the main results of experiments to determine the influence of different curing conditions on the mechanical performance of concrete made with coarse recycled aggregate from crushed concrete. The properties analyzed include compressive strength, splitting tensile strength, modulus of elasticity, and abrasion resistance. The general conclusion is that, in terms of mechanical performance, RAC is affected by curing conditions in roughly the same way as CC.


One of the phases of an aeronautical engine development programme is the design of a model that simulates the engine's performance, using information gathered in large-scale experimental tests. These tests are carried out in facilities of a high technological level, built exclusively for this purpose, to which only the major manufacturers have access. The same applies to all the results obtained, the details of the simulations, and the design parameters, which are kept confidential. Moreover, the setting in which this dissertation was developed, namely TAP's engine test bench, is not intended for designing this type of equipment, which precludes access to information fundamental to an in-depth study of jet engines. In addition, because of the risk of causing irreversible damage to the engines, it is also impossible to instrument them with probes improvised by the engineering team. Accordingly, in order to obtain the desired thermodynamic parameters, the goal of this work culminated in the development of analytical calculation methodologies to extrapolate several variables that cannot be measured during the engine test, specifically for the CFM56-3. Through this characterization of the engine's thermodynamic cycle, TAP intends, in the long term, to develop a tool that simulates the engine's performance to assist the engineering team of its aircraft engine repair and test centre. The development of the thermodynamic model of the CFM56-3, using the GasTurb software, will therefore also be addressed.


Paper presented at the conference "The Reflective Conservatoire – 2nd International Conference: Building Connections", Guildhall School of Music and Drama and Barbican Conference Centre, London, 28 February – 3 March 2009.


Master's degree in Business Control and Management


This paper investigates the effectiveness of sea-defense structures in preventing or reducing tsunami overtopping, and evaluates the resulting tsunami impact at El Jadida, Morocco. Different tsunami wave conditions are generated by considering various earthquake scenarios with magnitudes ranging from M-w = 8.0 to M-w = 8.6. These scenarios represent the main active earthquake faults in the SW Iberia margin and are consistent with two past events that generated tsunamis along the Atlantic coast of Morocco. The behavior of incident tsunami waves interacting with coastal infrastructure is analyzed on the basis of numerical simulations of near-shore tsunami wave propagation. Tsunami impact at the affected site is assessed by computing inundation and current velocity using a high-resolution digital terrain model that incorporates bathymetric, topographic, and coastal-structure data. Results, in terms of near-shore tsunami propagation snapshots, the waves' interaction with coastal barriers, and spatial distributions of flow depths and speeds, are presented and discussed in light of what was observed during the 2011 Tohoku-oki tsunami. The predicted results show the different levels of impact that different tsunami wave conditions could generate in the region. Existing coastal barriers around the El Jadida harbour succeeded in reflecting the relatively small waves generated by some scenarios, but failed to prevent the overtopping caused by waves from others. For the scenario with the greatest impact on the El Jadida coast, significant inundation is computed at the sandy beach and in unprotected areas. The dramatic modeled tsunami impact in the region shows the need for additional tsunami standards, not only for sea-defense structures but also for coastal dwellings and houses, to provide the potential for in-place evacuation.


In general, modern networks, and heterogeneous cellular wireless networks in particular, are analysed by taking several Key Performance Indicators (KPIs) into account, and a proper balance among them is required in order to guarantee a desired Quality of Service (QoS). A model that integrates a set of KPIs into a single one is presented, using a Cost Function that combines these KPIs and provides, for each network node, a single evaluation parameter as output, reflecting both network conditions and the performance of common radio resource management strategies. The proposed model enables the implementation of different network management policies by manipulating the KPIs according to the users' or the operator's perspective, allowing for better QoS. Results show that different policies can in fact be established, each with a different impact on the network, e.g., with median values differing by a factor of more than two.
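The cost-function idea described above can be sketched as a weighted combination of normalized KPIs. The KPI names, the weights, and the weighted-sum form below are illustrative assumptions for the sketch, not the paper's actual Cost Function:

```python
# Hypothetical sketch: fold several normalized KPIs (0 = best, 1 = worst)
# into a single per-node cost value via a weighted sum. Changing the
# weights corresponds to applying a different management policy.

def kpi_cost(kpis, weights):
    """Combine normalized KPI values into one evaluation parameter."""
    assert set(kpis) == set(weights), "each KPI needs a weight"
    total_weight = sum(weights.values())
    return sum(weights[k] * kpis[k] for k in kpis) / total_weight

# Example: one node evaluated under a policy that stresses delay.
node_kpis = {"blocking": 0.10, "delay": 0.40, "load": 0.25}
delay_policy = {"blocking": 1.0, "delay": 3.0, "load": 1.0}
print(kpi_cost(node_kpis, delay_policy))  # -> 0.31
```

Under a uniform policy (all weights equal) the same node would score 0.25, so the two policies already rank network states differently, which is the effect the abstract describes.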


Low-density parity-check (LDPC) codes are nowadays one of the hottest topics in coding theory, notably due to their advantages in terms of bit-error-rate performance and low complexity. In order to exploit the potential of the Wyner-Ziv coding paradigm, practical distributed video coding (DVC) schemes should use powerful error-correcting codes with near-capacity performance. In this paper, new ways to design LDPC codes for the DVC paradigm are proposed and studied. The new LDPC solutions rely on merging parity-check nodes, which corresponds to reducing the number of rows in the parity-check matrix. This makes it possible to change the compression ratio of the source (a DCT coefficient bitplane) gracefully, according to the correlation between the original and the side information. The proposed LDPC codes perform well over a wide range of source correlations and achieve better rate-distortion (RD) performance than the popular turbo codes.
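The row-merging operation mentioned above can be illustrated in a few lines: over GF(2), merging two parity-check rows is their modulo-2 (XOR) sum, which leaves the parity-check matrix with one row fewer. The small matrix below is an illustrative toy example, not a code construction from the paper:

```python
# Illustrative sketch: merging two rows of a binary parity-check matrix H.
# Fewer rows means fewer syndrome bits sent, i.e., a higher compression ratio.

def merge_rows(H, i, j):
    """Return H with rows i and j replaced by their XOR (GF(2) sum)."""
    merged = [a ^ b for a, b in zip(H[i], H[j])]
    return [row for k, row in enumerate(H) if k not in (i, j)] + [merged]

H = [
    [1, 1, 0, 1, 0, 0],
    [0, 1, 1, 0, 1, 0],
    [1, 0, 1, 0, 0, 1],
]
H2 = merge_rows(H, 0, 1)   # 2 parity checks instead of 3
print(H2[-1])              # merged check: [1, 0, 1, 1, 1, 0]
```

Any codeword satisfying both original checks also satisfies their XOR, so the merged matrix defines a (weaker) code containing the original one, which is what allows the rate to be adapted gracefully.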


This project was developed to fully assess the indoor air quality in archives and libraries from a fungal flora point of view. It uses classical methodologies such as traditional culture media – for the viable fungi – and modern molecular biology protocols, especially relevant to assess the non-viable fraction of the biological contaminants. Denaturing high-performance liquid chromatography (DHPLC) has emerged as an alternative to denaturing gradient gel electrophoresis (DGGE) and has already been applied to the study of a few bacterial communities. We propose the application of DHPLC to the study of fungal colonization on paper-based archive materials. This technology allows for the identification of each component of a mixture of fungi based on their genetic variation. In a highly complex mixture of microbial DNA this method can be used simply to study the population dynamics, and it also allows for sample fraction collection, which can, in many cases, be immediately sequenced, circumventing the need for cloning. Some examples of the methodological application are shown. Also applied is fragment length analysis for the study of mixed Candida samples. Both of these methods can later be applied in various fields, such as clinical and sand sample analysis. So far, the environmental analyses have been extremely useful to determine potentially pathogenic/toxinogenic fungi such as Stachybotrys sp., Aspergillus niger, Aspergillus fumigatus, and Fusarium sp. This work will hopefully lead to more accurate evaluation of environmental conditions for both human health and the preservation of documents.


Master's degree in Business Control and Management


Master's degree in Accounting and Management of Financial Institutions


A new high-performance architecture for the computation of all the DCT operations adopted in the H.264/AVC and HEVC standards is proposed in this paper. In contrast to other dedicated transform cores, the presented multi-standard transform architecture is based on a completely configurable, scalable, and unified structure that is able to compute not only the forward and inverse 8×8 and 4×4 integer DCTs and the 4×4 and 2×2 Hadamard transforms defined in the H.264/AVC standard, but also the 4×4, 8×8, 16×16, and 32×32 integer transforms adopted in HEVC. Experimental results obtained using a Xilinx Virtex-7 FPGA demonstrate the superior performance and hardware-efficiency levels provided by the proposed structure, which outperforms its most prominent related designs by at least 1.8 times. When integrated in a multi-core embedded system, this architecture allows the real-time computation of all the transforms mentioned above for resolutions as high as 8K Ultra High Definition Television (UHDTV, 7680×4320 @ 30 fps).
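As a point of reference for the kind of operation such a transform core computes, the H.264/AVC 4×4 forward integer core transform is Y = Cf · X · Cf^T (post-scaling and quantization omitted here). A minimal software sketch, purely for illustration of the arithmetic rather than of the hardware architecture:

```python
# H.264/AVC 4x4 forward integer core transform (scaling step omitted).
# All arithmetic is integer-only, which is what makes hardware mapping cheap.

CF = [
    [1,  1,  1,  1],
    [2,  1, -1, -2],
    [1, -1, -1,  1],
    [1, -2,  2, -1],
]

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def forward_4x4(X):
    """Compute Y = Cf * X * Cf^T for a 4x4 residual block X."""
    CF_T = [list(col) for col in zip(*CF)]
    return matmul(matmul(CF, X), CF_T)

X = [[5] * 4 for _ in range(4)]     # a flat (constant) block
Y = forward_4x4(X)
print(Y[0][0])                      # DC coefficient: 16 * 5 = 80
```

For a constant block all the energy lands in the DC coefficient (row 0 of Cf is all ones; the other rows sum to zero), which is a quick sanity check on any implementation of this transform.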


Signal subspace identification is a crucial first step in many hyperspectral processing algorithms, such as target detection, change detection, classification, and unmixing. Identifying this subspace enables a correct dimensionality reduction, yielding gains in algorithm performance, complexity, and data storage. This paper introduces a new minimum-mean-square-error-based approach to infer the signal subspace in hyperspectral imagery. The method, termed hyperspectral signal identification by minimum error, is eigendecomposition based, unsupervised, and fully automatic (i.e., it does not depend on any tuning parameters). It first estimates the signal and noise correlation matrices and then selects the subset of eigenvalues that best represents the signal subspace in the least-squared-error sense. State-of-the-art performance of the proposed method is illustrated using simulated and real hyperspectral images.
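The eigenvalue-selection step can be sketched in simplified form. This is an illustrative approximation, not the paper's full method: keep the eigendirections of the observed correlation matrix whose power exceeds twice the noise power projected onto them, since including such a direction lowers the mean squared representation error.

```python
# Simplified, illustrative subspace-size estimator (assumes the noise
# correlation matrix Rn is already known; the paper estimates it from data).

import numpy as np

def subspace_dimension(Y, Rn):
    """Estimate signal-subspace size from data Y (pixels x bands)."""
    Ry = Y.T @ Y / Y.shape[0]                # observed correlation matrix
    eigvals, eigvecs = np.linalg.eigh(Ry)    # ascending eigenvalues
    noise_power = np.array([v @ Rn @ v for v in eigvecs.T])
    return int(np.sum(eigvals > 2 * noise_power))

rng = np.random.default_rng(0)
S = rng.normal(size=(1000, 3))               # 3 latent signals
A = rng.normal(size=(3, 10))                 # mixed into 10 "bands"
Y = S @ A + 0.01 * rng.normal(size=(1000, 10))
print(subspace_dimension(Y, 0.0001 * np.eye(10)))   # recovers 3
```

The factor of two comes from the bias-variance trade-off in the minimum-MSE criterion: a direction is worth keeping only when its signal contribution outweighs the noise it lets through.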