36 results for Statistical damage identification
in the Repositório Científico do Instituto Politécnico de Lisboa - Portugal
Abstract:
Dissertation submitted to obtain the Master's degree in Mechanical Engineering, in the area of Maintenance and Production.
Abstract:
Storm and tsunami deposits are generated by similar depositional mechanisms, making them hard to discriminate using classic sedimentologic methods. Here we propose an original approach to identify tsunami-induced deposits by combining numerical simulation and rock magnetism. To test our method, we investigate the tsunami deposit of the Boca do Rio estuary generated by the 1755 Lisbon earthquake, which is well described in the literature. We first test the 1755 tsunami scenario using a numerical inundation model to provide physical parameters for the tsunami wave. Then we use concentration-sensitive (MS, SIRM) and grain-size-sensitive (chi(ARM), ARM, B1/2, ARM/SIRM) magnetic proxies coupled with SEM microscopy to unravel the magnetic mineralogy of the tsunami-induced deposit and its associated depositional mechanisms. To study the connection between the tsunami deposit and the different sedimentologic units present in the estuary, the magnetic data were processed by multivariate statistical analyses. Our numerical simulation shows a large inundation of the estuary, with flow depths varying from 0.5 to 6 m and a run-up of about 7 m. Magnetic data show a dominance of paramagnetic minerals (quartz) mixed with a lesser amount of ferromagnetic minerals, namely titanomagnetite and titanohematite, both of detrital origin and reworked from the underlying units. Multivariate statistical analyses indicate a closer connection between the tsunami-induced deposit and a mixture of Units C and D. All these results point to a scenario in which the energy released by the tsunami wave was strong enough to overtop the littoral dune, erode a significant amount of sand from it, and mix it with material reworked from underlying layers at least 1 m deep. The method tested here represents an original and promising tool to identify tsunami-induced deposits in similar embayed beach environments.
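As an illustration of the multivariate step described above, the sketch below standardizes samples described by the listed magnetic proxies and projects them onto principal components, so the tsunami deposit can be compared with candidate source units. The abstract does not specify which multivariate technique was used, so PCA is shown here as one common choice; all proxy values and sample labels are invented placeholders.

```python
# Illustrative sketch: relating a tsunami deposit to underlying units by
# multivariate analysis of magnetic proxies (values are placeholders).
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

# Hypothetical proxy table: rows = samples, columns = magnetic proxies
# (MS, SIRM, chi_ARM, ARM, B1/2, ARM/SIRM).
proxies = np.array([
    [12.1, 3.4, 0.8, 1.2, 35.0, 0.35],   # tsunami deposit sample
    [11.8, 3.1, 0.7, 1.1, 34.0, 0.36],   # Unit C sample
    [13.0, 3.9, 0.9, 1.4, 36.5, 0.33],   # Unit D sample
    [ 4.2, 1.0, 0.2, 0.3, 20.0, 0.30],   # littoral dune sand sample
])

# Standardize so proxies with different units contribute equally, then
# project onto the first two principal components.
scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(proxies))

# Samples that plot close together in PC space share a similar magnetic
# signature; a tsunami deposit falling between Units C and D would be
# consistent with a mixed provenance.
for label, pc in zip(["tsunami", "unit C", "unit D", "dune"], scores):
    print(f"{label:8s} PC1={pc[0]:+.2f} PC2={pc[1]:+.2f}")
```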
Abstract:
Terrestrial remote sensing imagery involves the acquisition of information from the Earth's surface without physical contact with the area under study. Among remote sensing modalities, hyperspectral imaging has recently emerged as a powerful passive technology. It has been widely used in urban and regional planning, water resource management, environmental monitoring, food safety, counterfeit drug detection, detection of oil spills and other types of chemical contamination, biological hazard prevention, and target detection for military and security purposes [2-9]. Hyperspectral sensors sample the reflected solar radiation from the Earth's surface in the portion of the spectrum extending from the visible region through the near-infrared and mid-infrared (wavelengths between 0.3 and 2.5 µm) in hundreds of narrow (on the order of 10 nm) contiguous bands [10]. This high spectral resolution can be used for object detection and for discriminating between different objects based on their spectral characteristics [6]. However, such spectral resolution yields large amounts of data to be processed. For example, the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) [11] collects a 512 (along track) × 614 (across track) × 224 (bands) × 12 (bits) data cube in 5 s, corresponding to about 140 MB. Similar data collection rates are achieved by other spectrometers [12]. Such huge data volumes place stringent requirements on communications, storage, and processing. The problem of signal subspace identification of hyperspectral data represents a crucial first step in many hyperspectral processing algorithms, such as target detection, change detection, classification, and unmixing. The identification of this subspace enables a correct dimensionality reduction (DR), yielding gains in data storage and retrieval and in computational time and complexity. Additionally, DR may also improve algorithm performance, since it reduces data dimensionality without losses in the useful signal components. The computation of statistical estimates is a relevant example of the advantages of DR, since the number of samples required to obtain accurate estimates increases drastically with the dimensionality of the data (the Hughes phenomenon) [13].
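The following is a minimal sketch of signal subspace identification on a toy hyperspectral cube: eigendecomposition of the sample correlation matrix, followed by a crude count of eigenvalues standing above the noise floor. It is a generic PCA-style estimate on simulated low-rank-plus-noise data, not the specific algorithm behind this work, and the threshold is a hand-tuned illustration.

```python
# Minimal sketch of signal-subspace identification for a hyperspectral
# cube via eigendecomposition of the sample correlation matrix.
import numpy as np

rng = np.random.default_rng(0)
rows, cols, bands = 64, 64, 224           # toy cube, AVIRIS-like band count
k_true = 5                                # number of simulated endmembers

# Simulate data as a low-rank mixture plus noise: X = M @ A + N.
M = rng.random((bands, k_true))                      # endmember signatures
A = rng.dirichlet(np.ones(k_true), rows * cols).T    # abundance fractions
X = M @ A + 0.01 * rng.standard_normal((bands, rows * cols))

# Sample correlation matrix R = X X^T / n and its spectrum.
R = X @ X.T / X.shape[1]
eigvals = np.linalg.eigvalsh(R)[::-1]     # descending order

# Crude subspace-dimension estimate: count eigenvalues well above the
# noise floor (the 100x factor is an illustration, not a published
# criterion).
noise_floor = np.median(eigvals)
k_est = int(np.sum(eigvals > 100 * noise_floor))
print("estimated subspace dimension:", k_est)   # ideally close to k_true
```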
Abstract:
Low-noise surfaces have been increasingly considered a viable and cost-effective alternative to acoustical barriers. However, road planners and administrators frequently lack information on the correlation between the type of road surface and the resulting noise emission profile. To address this problem, a method to identify and classify different types of road pavements was developed, whereby near-field road noise is analyzed using statistical learning methods. The vehicle rolling sound signal near the tires and close to the road surface was acquired by two microphones in a special arrangement which implements the Close-Proximity method. A set of features characterizing the properties of the road pavement was extracted from the corresponding sound profiles. A feature selection method was used to automatically select the features most relevant in predicting the type of pavement, while reducing the computational cost. A set of road pavement segments of different types was tested and the performance of the classifier was evaluated. Results of pavement classification performed during a road journey are presented on a map, together with geographical data. This procedure leads to a considerable improvement in the quality of road pavement noise data, thereby increasing the accuracy of road traffic noise prediction models.
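A hedged sketch of the statistical-learning pipeline the abstract describes: univariate feature selection followed by a classifier, evaluated by cross-validation. The feature values, class labels, and the choice of a random forest are placeholders, since the abstract does not name the specific learner or feature set.

```python
# Sketch: feature selection + pavement-type classification on toy data.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)
n_segments, n_features = 200, 40          # toy: road segments x sound features
X = rng.standard_normal((n_segments, n_features))
y = rng.integers(0, 3, n_segments)        # 3 hypothetical pavement classes

# Pipeline: univariate feature selection keeps the k features most
# associated with pavement type (cutting computational cost), feeding a
# classifier; accuracy is estimated by 5-fold cross-validation.
clf = make_pipeline(SelectKBest(f_classif, k=10),
                    RandomForestClassifier(n_estimators=100, random_state=0))
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```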
Abstract:
Wyner-Ziv (WZ) video coding is a particular case of distributed video coding (DVC), the recent video coding paradigm based on the Slepian-Wolf and Wyner-Ziv theorems, which exploits the source temporal correlation at the decoder rather than at the encoder as in predictive video coding. Although some progress has been made in recent years, WZ video coding is still far from the compression performance of predictive video coding, especially for high and complex motion content. The WZ video codec adopted in this study is based on a transform-domain WZ video coding architecture with feedback channel-driven rate control, whose modules have been improved with some recent coding tools. This study proposes a novel motion learning approach that successively improves the rate-distortion (RD) performance of the WZ video codec as the decoding proceeds, making use of the already decoded transform bands to improve the decoding process for the remaining transform bands. The results obtained reveal gains of up to 2.3 dB in the RD curves over the same codec without the proposed motion learning approach, for high-motion sequences and long group of pictures (GOP) sizes.
Abstract:
Preliminary version
Abstract:
The present work aimed to evaluate the influence of several quantities and test parameters on the melt flow index of thermoplastics and to calculate the uncertainty associated with the determinations. In a first phase, the main parameters influencing the determination of the melt flow index were identified, and the following were selected: plastometer temperature, load weight, die diameter, measurement length, type of cut, and number of specimens. To evaluate the influence of these parameters on the measurement of the melt flow index, a design of experiments was carried out in three stages, and the results were treated using analysis of variance. The complete analysis of the factorial designs showed that the effects of the plastometer temperature, load weight, and die diameter factors are statistically significant for the measurement of the melt flow index. In the second phase, the uncertainty associated with the measurements was calculated. For this purpose, one of the most usual methods was selected, described in the Guide to the Expression of Uncertainty in Measurement and known as the GUM method, following the "step by step" approach. Initially, it was necessary to build a mathematical model for the measurement of the melt flow index relating the different parameters involved. The behaviour of each parameter was studied by means of two functions, again using analysis of variance. Through the law of propagation of uncertainties it was possible to determine the combined standard uncertainty and, after estimating the number of degrees of freedom, the value of the coverage factor. Finally, the expanded uncertainty of the measurement was determined for the melt volume-flow index.
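To make the GUM "step by step" approach concrete, the sketch below propagates input uncertainties through a deliberately simplified melt flow rate model, MFR = 600·m/t (extruded mass m over cut time t, giving g/10 min). The model and all numerical values are assumptions for illustration; the dissertation's actual model relates many more parameters.

```python
# Illustrative GUM-style propagation for a melt flow index measurement,
# assuming the simplified model MFR = 600 * m / t.
import numpy as np

m, u_m = 0.85, 0.005      # extruded mass [g] and its standard uncertainty
t, u_t = 30.0, 0.2        # cut interval [s] and its standard uncertainty

mfr = 600.0 * m / t

# Law of propagation of uncertainties for uncorrelated inputs:
# u_c^2 = (dMFR/dm * u_m)^2 + (dMFR/dt * u_t)^2
c_m = 600.0 / t           # sensitivity coefficient w.r.t. mass
c_t = -600.0 * m / t**2   # sensitivity coefficient w.r.t. time
u_c = np.hypot(c_m * u_m, c_t * u_t)

# Expanded uncertainty with coverage factor k = 2 (~95 % coverage,
# assuming large effective degrees of freedom).
k = 2.0
print(f"MFR = {mfr:.2f} g/10min, U = {k * u_c:.2f} g/10min (k = 2)")
```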
Abstract:
Master's degree in Radiation Applied to Health Technologies. Specialization: Magnetic Resonance.
Abstract:
Structures experience various types of loads along their lifetime, which can be either static or dynamic and may be associated with phenomena such as corrosion and chemical attack, among others. As a consequence, different types of structural damage can be produced; the deteriorated structure may have its capacity affected, leading to excessive vibration problems or even possible failure. It is therefore very important to develop methods that are able simultaneously to detect the existence of damage and to quantify its extent. In this paper the authors propose a method to detect and quantify structural damage using response transmissibilities measured along the structure. Some numerical simulations are presented and a comparison is made with results using frequency response functions. Experimental tests are also undertaken to validate the proposed technique.
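A minimal sketch in the spirit of the method above: estimate the transmissibility between two response points as the ratio of cross- to auto-spectra, then measure its deviation from an undamaged reference. The signals, the "damage" mechanism, and the scalar indicator are synthetic placeholders; the paper's actual indicator and quantification scheme are not reproduced here.

```python
# Sketch: transmissibility-based damage indicator on synthetic signals.
import numpy as np
from scipy.signal import csd, welch

fs = 1024.0
t = np.arange(0, 8, 1 / fs)
rng = np.random.default_rng(2)

def responses(stiffness_scale):
    """Toy two-point responses whose ratio shifts when 'damage' (a
    stiffness reduction) changes the transfer path between them."""
    base = rng.standard_normal(t.size)
    x_j = base + 0.1 * rng.standard_normal(t.size)
    x_i = stiffness_scale * base + 0.1 * rng.standard_normal(t.size)
    return x_i, x_j

def transmissibility(x_i, x_j):
    # T_ij(w) = S_ij(w) / S_jj(w): cross-spectrum over auto-spectrum.
    f, s_ij = csd(x_i, x_j, fs=fs, nperseg=512)
    _, s_jj = welch(x_j, fs=fs, nperseg=512)
    return f, s_ij / s_jj

f, T_ref = transmissibility(*responses(1.00))   # undamaged reference
_, T_dam = transmissibility(*responses(0.85))   # simulated damage

# Scalar damage indicator: mean deviation from the reference curve.
indicator = np.mean(np.abs(T_dam - T_ref))
print(f"damage indicator: {indicator:.3f} (near zero when undamaged)")
```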
Abstract:
A replicate evaluation of increased micronucleus (MN) frequencies in peripheral lymphocytes of workers occupationally exposed to formaldehyde (FA) was undertaken to verify the observed effect and to determine scoring variability. May–Grünwald–Giemsa-stained slides were obtained from a previously performed cytokinesis-block micronucleus test (CBMNT) with 56 workers in anatomy and pathology laboratories and 85 controls. The first evaluation by one scorer (scorer 1) had led to a highly significant difference between workers and controls (3.96 vs 0.81 MN per 1000 cells). The slides were coded before re-evaluation and the code was broken only after the complete re-evaluation of the study. A total of 1000 binucleated cells (BNC) were analysed per subject and the frequency of MN (in ‰) was determined. Slides were distributed equally and randomly between two scorers, so that the scorers had no knowledge of the exposure status. Scorer 2 (32 exposed, 36 controls) measured increased MN frequencies in exposed workers (9.88 vs 6.81). Statistical analysis with the two-sample Wilcoxon test indicated that this difference was not significant (p = 0.17). Scorer 3 (20 exposed, 46 controls) obtained a similar result, but slightly higher values for the comparison of exposed and controls (19.0 vs 12.89; p = 0.089). Combining the results of the two scorers (13.38 vs 10.22), a significant difference between exposed and controls (p = 0.028) was obtained when the stratified Wilcoxon test with the scorers as strata was applied. Interestingly, the re-evaluation of the slides led to clearly higher MN frequencies for both exposed and controls compared with the first evaluation. Bland–Altman plots indicated that the agreement between the measurements of the different scorers was very poor, as shown by mean differences of 5.9 between scorer 1 and scorer 2 and 13.0 between scorer 1 and scorer 3. Calculation of the intra-class correlation coefficient (ICC) revealed that all scorer comparisons in this study fell far short of an acceptable level of reliability for this assay. Possible implications for the use of the CBMNT in human biomonitoring studies are discussed.
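The agreement statistics mentioned above can be sketched as follows: Bland-Altman bias and 95% limits of agreement between two scorers' MN frequencies. The data are invented placeholders seeded to mimic a systematic offset, not the study's measurements.

```python
# Sketch: Bland-Altman agreement statistics for two scorers' MN counts.
import numpy as np

rng = np.random.default_rng(3)
scorer_1 = rng.poisson(8, size=40).astype(float)      # MN per 1000 cells
scorer_2 = scorer_1 + rng.normal(5.9, 4.0, size=40)   # systematic offset

diff = scorer_2 - scorer_1
mean_diff = diff.mean()                    # bias between scorers
loa = 1.96 * diff.std(ddof=1)              # half-width of 95 % limits

print(f"mean difference: {mean_diff:.1f} MN/1000 cells")
print(f"limits of agreement: {mean_diff - loa:.1f} to {mean_diff + loa:.1f}")
# A mean difference far from zero with wide limits, as reported in the
# study, signals poor inter-scorer agreement.
```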
Abstract:
Formaldehyde (FA) ranks 25th in overall U.S. chemical production, with more than 5 million tons produced each year. Given its economic importance and widespread use, many people are exposed to FA occupationally. Recently, based on the correlation with nasopharyngeal cancer in humans, the International Agency for Research on Cancer (IARC) confirmed the classification of FA as a Group 1 substance. Considering the epidemiological evidence of a potential association with leukemia, the IARC has concluded that FA can cause this lymphoproliferative disorder. Our group has developed a method to assess the exposure to and genotoxic effects of FA in two different occupational settings, namely FA-based resin production and pathology and anatomy laboratories. For exposure assessment we applied two different air monitoring techniques simultaneously: NIOSH Method 2541 and photoionization detection equipment with simultaneous video recording. Genotoxic effects were measured by the cytokinesis-blocked micronucleus assay in peripheral blood lymphocytes and by the micronucleus test in exfoliated oral cavity epithelial cells, both considered target cells. The two exposure assessment techniques show that peak exposures still occur in both occupational settings. There was a statistically significant increase in the mean micronucleus frequency in epithelial cells and peripheral lymphocytes of exposed individuals compared with controls. In conclusion, the exposure and genotoxicity assessment methodologies we developed allowed us to determine that these two occupational settings promote exposure to high peak FA concentrations and an increase in the mean micronucleus frequency of exposed workers. Moreover, the developed techniques showed promising results and could be used to confirm and extend the results obtained by the analytical techniques currently available.
Abstract:
This project was developed to fully assess the indoor air quality in archives and libraries from the point of view of fungal flora. It uses classical methodologies, such as traditional culture media for the viable fungi, and modern molecular biology protocols, especially relevant to assess the non-viable fraction of the biological contaminants. Denaturing high-performance liquid chromatography (DHPLC) has emerged as an alternative to denaturing gradient gel electrophoresis (DGGE) and has already been applied to the study of a few bacterial communities. We propose the application of DHPLC to the study of fungal colonization of paper-based archive materials. This technology allows for the identification of each component of a mixture of fungi based on their genetic variation. In a highly complex mixture of microbial DNA this method can be used simply to study the population dynamics, and it also allows for sample fraction collection, which can, in many cases, be immediately sequenced, circumventing the need for cloning. Some examples of the methodological application are shown. Fragment length analysis is also applied to the study of mixed Candida samples. Both of these methods can later be applied in various fields, such as clinical analysis and sand sample analysis. So far, the environmental analyses have been extremely useful to detect potentially pathogenic/toxinogenic fungi such as Stachybotrys sp., Aspergillus niger, Aspergillus fumigatus, and Fusarium sp. This work will hopefully lead to a more accurate evaluation of environmental conditions for both human health and the preservation of documents.
Abstract:
We have identified a common region of allelic deletion in the q26 region of chromosome 10 in endometrial carcinomas, which has previously been reported as a potential target of genetic alterations related to this neoplasia. An allelotyping analysis of 19 pairs of tumoral and non-tumoral samples was performed using seven polymorphic microsatellite markers mapping to the 10q26 chromosomal region. Loss of heterozygosity for one or more loci was detected in 29% of the endometrial carcinoma samples. The observed pattern of loss enabled the identification of a 3.5 Mb common deleted region located between the D10S587 and D10S186 markers. An additional result from an endometrial sample with evidence of an RER phenotype may suggest a more centromeric region of loss within the above-mentioned interval. This 401.84 kb interval, flanked by the D10S587 and D10S216 markers, may be a plausible location for a putative suppressor gene involved in early-stage endometrial carcinogenesis.
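For readers unfamiliar with how loss of heterozygosity is called from microsatellite data, the sketch below applies a commonly used allele peak ratio convention. The thresholds and peak heights are illustrative conventions from the broader literature, not values taken from this paper.

```python
# Sketch: a standard loss-of-heterozygosity (LOH) calculation from
# microsatellite allele peak intensities.
# LOH ratio = (T1 / T2) / (N1 / N2), where T and N are the two allele
# peak heights in tumour and matched normal tissue.
def loh_ratio(t1, t2, n1, n2):
    return (t1 / t2) / (n1 / n2)

# A common convention calls LOH when the ratio falls below 0.5 or above
# 2.0 (one tumour allele reduced to half or less of its normal dosage).
def has_loh(ratio, low=0.5, high=2.0):
    return ratio < low or ratio > high

ratio = loh_ratio(t1=1200, t2=2600, n1=1500, n2=1400)  # placeholder peaks
print(f"LOH ratio = {ratio:.2f}, LOH call: {has_loh(ratio)}")
```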