26 results for Module average case analysis
in Repositório Científico do Instituto Politécnico de Lisboa - Portugal
Abstract:
This paper addresses the role that decision analysis plays in helping engineers gain a greater understanding of the problems they face. The need for structured decision analysis is highlighted, as well as the use of multiple criteria decision analysis to tackle sustainability issues, with emphasis on the MACBETH approach. Some insights from a Portuguese summer course on engineering for sustainable development are presented, namely the students' and teachers' perceptions of the decision analysis for sustainability module.
Abstract:
Hyperspectral remote sensing exploits the electromagnetic scattering patterns of different materials at specific wavelengths [2, 3]. Hyperspectral sensors have been developed to sample the scattered portion of the electromagnetic spectrum extending from the visible region through the near-infrared and mid-infrared, in hundreds of narrow contiguous bands [4, 5]. The number and variety of potential civilian and military applications of hyperspectral remote sensing is enormous [6, 7]. Very often, the resolution cell corresponding to a single pixel in an image contains several substances (endmembers) [4]. In this situation, the scattered energy is a mixture of the endmember spectra. A challenging task underlying many hyperspectral imagery applications is then decomposing a mixed pixel into a collection of reflectance spectra, called endmember signatures, and the corresponding abundance fractions [8–10]. Depending on the mixing scales at each pixel, the observed mixture is either linear or nonlinear [11, 12]. The linear mixing model holds approximately when the mixing scale is macroscopic [13] and there is negligible interaction among distinct endmembers [3, 14]. If, however, the mixing scale is microscopic (intimate mixtures) [15, 16] and the incident solar radiation is scattered by the scene through multiple bounces involving several endmembers [17], the linear model is no longer accurate. Linear spectral unmixing has been intensively researched in recent years [9, 10, 12, 18–21]. It considers that a mixed pixel is a linear combination of endmember signatures weighted by the corresponding abundance fractions.
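The linear mixing model just described can be sketched numerically. The snippet below is a minimal illustration with made-up dimensions and random signatures (not data from the chapter): observed spectra are convex combinations of endmember signatures, with abundance fractions that are nonnegative and sum to one.

```python
import numpy as np

# Minimal sketch of the linear mixing model (illustrative dimensions
# and random endmember signatures, not data from the chapter).
rng = np.random.default_rng(0)
n_bands, n_endmembers, n_pixels = 50, 3, 100

# Endmember signature matrix M (bands x endmembers): nonnegative reflectances.
M = rng.uniform(0.0, 1.0, size=(n_bands, n_endmembers))

# Abundance fractions: nonnegative and summing to one for each pixel,
# drawn from a flat Dirichlet distribution (endmembers x pixels).
A = rng.dirichlet(np.ones(n_endmembers), size=n_pixels).T

# Observed spectra: linear combinations of the signatures plus sensor noise.
Y = M @ A + 0.001 * rng.standard_normal((n_bands, n_pixels))

# The physical constraints on the abundances hold by construction.
assert np.all(A >= 0) and np.allclose(A.sum(axis=0), 1.0)
```

Geometrically, the noise-free pixels `Y` lie in the simplex whose vertices are the columns of `M`, which is the feature the simplex-based algorithms below exploit.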
Under this model, and assuming that the number of substances and their reflectance spectra are known, hyperspectral unmixing is a linear problem for which many solutions have been proposed (e.g., maximum likelihood estimation [8], spectral signature matching [22], spectral angle mapper [23], subspace projection methods [24, 25], and constrained least squares [26]). In most cases, the number of substances and their reflectances are not known, and hyperspectral unmixing then falls into the class of blind source separation problems [27]. Independent component analysis (ICA) has recently been proposed as a tool to blindly unmix hyperspectral data [28–31]. ICA is based on the assumption of mutually independent sources (abundance fractions), which is not the case for hyperspectral data, since the sum of the abundance fractions is constant, implying statistical dependence among them. This dependence compromises the applicability of ICA to hyperspectral images, as shown in Refs. [21, 32]. In fact, ICA finds the endmember signatures by multiplying the spectral vectors by an unmixing matrix, which minimizes the mutual information among sources. If the sources are independent, ICA provides the correct unmixing, since the minimum of the mutual information is obtained only when sources are independent. This is no longer true for dependent abundance fractions. Nevertheless, some endmembers may be approximately unmixed. These aspects are addressed in Ref. [33]. Under the linear mixing model, the observations from a scene lie in a simplex whose vertices correspond to the endmembers. Several approaches [34–36] have exploited this geometric feature of hyperspectral mixtures [35]. The minimum volume transform (MVT) algorithm [36] determines the simplex of minimum volume containing the data. The method presented in Ref. [37] is also of MVT type but, by introducing the notion of bundles, it takes into account the endmember variability usually present in hyperspectral mixtures.
The MVT-type approaches are computationally complex. Usually, these algorithms first find the convex hull defined by the observed data and then fit a minimum volume simplex to it. For example, the gift wrapping algorithm [38] computes the convex hull of n data points in a d-dimensional space with a computational complexity of O(n^(⌊d/2⌋+1)), where ⌊x⌋ is the largest integer less than or equal to x and n is the number of samples. The complexity of the method presented in Ref. [37] is even higher, since the temperature of the simulated annealing algorithm used must follow a log(·) law [39] to assure convergence (in probability) to the desired solution. Aiming at a lower computational complexity, some algorithms such as the pixel purity index (PPI) [35] and N-FINDR [40] still find the minimum volume simplex containing the data cloud, but they assume the presence in the data of at least one pure pixel of each endmember. This is a strong requirement that may not hold in some data sets. In any case, these algorithms find the set of purest pixels in the data. The PPI algorithm uses the minimum noise fraction (MNF) [41] as a preprocessing step to reduce dimensionality and to improve the signal-to-noise ratio (SNR). The algorithm then projects every spectral vector onto skewers (a large number of random vectors) [35, 42, 43]. The points corresponding to the extremes, for each skewer direction, are stored. A cumulative account records the number of times each pixel (i.e., a given spectral vector) is found to be an extreme. The pixels with the highest scores are the purest ones. The N-FINDR algorithm [40] is based on the fact that in p spectral dimensions, the p-volume defined by a simplex formed by the purest pixels is larger than any other volume defined by any other combination of pixels. This algorithm finds the set of pixels defining the largest volume by inflating a simplex inside the data.
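The skewer-projection step of PPI can be sketched as follows. This is a toy illustration only: the MNF preprocessing is omitted, and the function name `ppi_scores` and the skewer count are assumptions, not the published implementation.

```python
import numpy as np

def ppi_scores(Y, n_skewers=500, rng=None):
    """Toy sketch of the pixel purity index: project every spectral
    vector onto random 'skewers' and count how often each pixel is an
    extreme of the projection (MNF preprocessing omitted)."""
    rng = np.random.default_rng(rng)
    n_bands, n_pixels = Y.shape
    scores = np.zeros(n_pixels, dtype=int)
    for _ in range(n_skewers):
        skewer = rng.standard_normal(n_bands)  # random direction
        proj = skewer @ Y                      # projection of every pixel
        scores[np.argmin(proj)] += 1           # both extremes are recorded
        scores[np.argmax(proj)] += 1
    return scores                              # high score = likely pure pixel
```

On noise-free simplex data the extremes of any linear projection are vertices, so pure pixels accumulate all the counts while strictly mixed pixels score zero.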
ORASIS [44, 45] is a hyperspectral framework developed by the U.S. Naval Research Laboratory consisting of several algorithms organized in modules: exemplar selector, adaptive learner, demixer, knowledge base or spectral library, and spatial postprocessor. The first step consists of flat-fielding the spectra. Next, the exemplar selection module is used to select the spectral vectors that best represent the smaller convex cone containing the data. The other pixels are rejected when the spectral angle distance (SAD) is less than a given threshold. The procedure finds the basis for a subspace of lower dimension using a modified Gram–Schmidt orthogonalization. The selected vectors are then projected onto this subspace, and a simplex is found by an MVT process. ORASIS is oriented toward real-time target detection from uncrewed air vehicles using hyperspectral data [46]. In this chapter we develop a new algorithm, vertex component analysis (VCA), to unmix linear mixtures of endmember spectra. First, the algorithm determines the number of endmembers and the signal subspace using a newly developed concept [47, 48]. Second, the algorithm extracts the purest pixels present in the data. Unlike other methods, this algorithm is completely automatic and unsupervised. To estimate the number of endmembers and the signal subspace in hyperspectral linear mixtures, the proposed scheme begins by estimating the signal and noise correlation matrices; the latter estimate is based on multiple regression theory. The signal subspace is then identified by selecting the set of signal eigenvalues that best represents the data, in the least-squares sense [48, 49]. We note, however, that VCA works both with projected and with unprojected data. The extraction of the endmembers exploits two facts: (1) the endmembers are the vertices of a simplex and (2) the affine transformation of a simplex is also a simplex. Like the PPI and N-FINDR algorithms, VCA also assumes the presence of pure pixels in the data.
The algorithm iteratively projects the data onto a direction orthogonal to the subspace spanned by the endmembers already determined. The new endmember signature corresponds to the extreme of the projection. The algorithm iterates until all endmembers are exhausted. VCA performs much better than PPI and better than or comparably to N-FINDR, yet its computational complexity is between one and two orders of magnitude lower than that of N-FINDR. The chapter is structured as follows. Section 19.2 describes the fundamentals of the proposed method. Section 19.3 and Section 19.4 evaluate the proposed algorithm using simulated and real data, respectively. Section 19.5 presents some concluding remarks.
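The projection loop just described can be sketched in a few lines. This is a simplified sketch under stated assumptions: pure pixels are present, the data are noise-free, and the signal-subspace identification step is omitted; the name `vca_sketch` is illustrative, not the authors' code.

```python
import numpy as np

def vca_sketch(Y, p, rng=None):
    """Sketch of the vertex-extraction loop: repeatedly project the data
    onto a direction orthogonal to the span of the endmembers found so
    far and take the extreme of the projection as the next endmember."""
    rng = np.random.default_rng(rng)
    n_bands, n_pixels = Y.shape
    indices = []
    E = np.zeros((n_bands, 0))       # endmember signatures found so far
    for _ in range(p):
        f = rng.standard_normal(n_bands)
        if E.shape[1] > 0:
            # make f orthogonal to the subspace spanned by found endmembers
            Q, _ = np.linalg.qr(E)
            f -= Q @ (Q.T @ f)
        proj = np.abs(f @ Y)
        k = int(np.argmax(proj))     # extreme of the projection
        indices.append(k)
        E = np.hstack([E, Y[:, [k]]])
    return indices
```

Because already-found endmembers project to zero after the orthogonalization, each iteration picks a new vertex of the data simplex.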
Abstract:
The Oporto Airport, located in Porto, is crucial because it is the only airport in the northern region of Portugal. The airport has seen an increase in the number of passengers, sales revenue and accumulated investment over the last two decades, principally after low-cost carriers (LCCs) began operating there in 2004. In order to determine whether these changes had an impact on the competitiveness of the airport, the main aim is to analyse the evolution of its technical efficiency and to compare the results before and after the introduction of the LCCs. The methodology uses Data Envelopment Analysis (DEA). Results show that the efficiency of Oporto Airport increased markedly after the introduction of LCCs in 2004. The main conclusions suggest the importance of the LCCs in the increasing efficiency of Oporto Airport and a potential relation with tourism development in the region, although more robust studies are needed.
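The efficiency scores behind such a DEA study can be sketched with a small linear program. The sketch below implements the standard input-oriented CCR model in multiplier form; the choice of inputs and outputs (e.g. staff and runways as inputs, passengers and revenue as outputs for airport DMUs) is illustrative and not the study's actual variable set.

```python
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of decision-making unit `o`.
    X: inputs (m x n), Y: outputs (s x n), n = number of DMUs.
    Maximise u.y_o subject to v.x_o = 1 and u.Y_j <= v.X_j for all j."""
    m, n = X.shape
    s = Y.shape[0]
    # decision variables: output weights u (s) then input weights v (m)
    c = np.concatenate([-Y[:, o], np.zeros(m)])      # linprog minimises
    A_ub = np.hstack([Y.T, -X.T])                    # u.Y_j - v.X_j <= 0
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.zeros(s), X[:, o]])[None, :]  # v.x_o = 1
    b_eq = [1.0]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * (s + m))
    return -res.fun                                  # efficiency in (0, 1]
```

With one input and one output, the score reduces to the DMU's output/input ratio divided by the best ratio in the sample, which is a quick sanity check on the formulation.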
Abstract:
Formaldehyde, classified by the IARC as carcinogenic in humans and experimental animals, is a chemical agent that is widely used in histopathology laboratories. Exposure to this substance is epidemiologically linked to cancer and to nuclear changes detected by the cytokinesis-block micronucleus test (CBMN). This method is extensively used in molecular epidemiology, since it provides information on several biomarkers of genotoxicity, such as micronuclei (MN), which are biomarkers of chromosome breakage or loss; nucleoplasmic bridges (NPB), common biomarkers of chromosome rearrangement, poor repair and/or telomere fusion; and nuclear buds (NBUD), biomarkers of the elimination of amplified DNA.
The aim of this study is to compare the frequency of genotoxicity biomarkers, provided by the CBMN assay in peripheral lymphocytes and the MN test in buccal cells, between individuals occupationally exposed and non-exposed to formaldehyde and other environmental factors, namely tobacco and alcohol consumption.
The sample comprised two groups: 56 individuals occupationally exposed to formaldehyde (cases) and 85 unexposed individuals (controls), from whom both peripheral blood and exfoliated epithelial cells of the oral mucosa were collected in order to measure the genetic endpoints proposed in this study.
The mean level of TWA8h was 0.16 ± 0.11 ppm.
Abstract:
As wind power generation undergoes rapid growth, lightning and overvoltage incidents involving wind power plants have come to be regarded as a serious problem. First, lightning location systems are discussed, as well as important parameters regarding lightning protection. The paper then presents a case study, based on a wind turbine with an interconnecting transformer, for the study of adequate lightning and overvoltage protection measures. The electromagnetic transients circuit under study is described, and computational results are presented.
Abstract:
The weapons systems of the Portuguese Air Force (Força Aérea Portuguesa, FAP) have as their mission the military defence of Portugal, through air operations and the defence of national airspace, with the F-16 being the main attack aircraft in use in this organization. Against this background, and given the current world economic context, organizations must make the most of all available resources, control associated costs and optimize work processes. Based on these premises, the present study analyses the implementation of lean in the FAP, since this philosophy rests on the elimination of waste in order to improve quality and reduce times and costs. The analysis focuses on the F-16 maintenance area, specifically the Phase Inspection (Inspeção de Fase, IF), a type of maintenance that this aircraft undergoes every three hundred flight hours. The case study examines two moments of the IF. The first concerns the processing of the data collected for the preliminary meeting at which the maintenance actions to be carried out while the aircraft is grounded are defined for the executing work areas; the aim is to determine the causes of the delays observed in holding this meeting. The second point under observation comprises the information obtained from the SIAGFA software application, in use in the FAP, for processing the maintenance data of the four aircraft that inaugurated the IF under the lean philosophy. This analysis made it possible to determine the number of work hours spent (on average across the four aircraft) on each of the work cards, showing that the additional cards consume more hours; to identify which work areas are considered critical; and to identify the days on which work was carried out and the downtime periods without any intervention.
The number of work hours performed in the IF was also assessed per aircraft, together with the constraints observed in the aircraft that did not complete the IF within the time defined for it.
Abstract:
Reclaimed water from small wastewater treatment facilities in the rural areas of the Beira Interior region (Portugal) may constitute an alternative water source for aquifer recharge. A 21-month monitoring period in a constructed wetland treatment system has shown that 21,500 m³ year⁻¹ of treated wastewater (reclaimed water) could be used for aquifer recharge. A GIS-based multi-criteria analysis was performed, combining ten thematic maps and economic, environmental and technical criteria, in order to produce a suitability map for the location of sites for reclaimed water infiltration. The areas chosen for aquifer recharge with infiltration basins are mainly composed of anthrosol, more than 1 m deep and of fine sand texture, which allows an average infiltration velocity of up to 1 m d⁻¹. These characteristics will provide a final polishing treatment of the reclaimed water after infiltration (soil aquifer treatment (SAT)), suitable for the removal of the residual load (trace organics, nutrients, heavy metals and pathogens). The risk of groundwater contamination is low, since the water table in the anthrosol areas ranges from 10 m to 50 m. On the other hand, these depths guarantee an unsaturated zone suitable for SAT. An area of 13,944 ha was selected for study, but only 1607 ha are suitable for reclaimed water infiltration. Approximately 1280 m² were considered enough to set up four infiltration basins working in flooding and drying cycles.
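A GIS multi-criteria suitability analysis of this kind is commonly implemented as a weighted overlay of normalised criterion rasters plus hard constraint masks. The sketch below is a toy illustration on a tiny random raster; the three criteria, their weights and the 10 m water-table constraint are assumptions echoing the abstract, not the study's actual criteria set.

```python
import numpy as np

rng = np.random.default_rng(1)
shape = (4, 4)  # toy raster in place of the real thematic maps

soil_depth = rng.uniform(0.0, 2.0, shape)    # anthrosol depth (m)
infiltration = rng.uniform(0.0, 1.5, shape)  # infiltration velocity (m/d)
water_table = rng.uniform(5.0, 60.0, shape)  # water table depth (m)

def rescale(a):
    """Normalise a criterion raster to [0, 1] (higher = more suitable)."""
    return (a - a.min()) / (a.max() - a.min())

# Assumed weights for the weighted overlay (must sum to 1).
weights = {"soil": 0.4, "infiltration": 0.35, "water": 0.25}
suitability = (weights["soil"] * rescale(soil_depth)
               + weights["infiltration"] * rescale(infiltration)
               + weights["water"] * rescale(water_table))

# Hard constraint: exclude cells with a water table shallower than 10 m.
suitability[water_table < 10.0] = 0.0
```

Thresholding the resulting raster then yields the suitable area, analogous to the 1607 ha selected out of 13,944 ha in the study.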
Abstract:
A descriptive study was developed in order to assess air contamination caused by fungi and particles in seven poultry units. Twenty-seven air samples of 25 litres each were collected by the impaction method. Air sampling and particle concentration measurements were performed inside the pavilions and also outside the premises, the latter being regarded as the reference. Simultaneously, temperature and relative humidity were also registered. Regarding the fungal load in the air of the seven poultry farms, the highest value obtained was 24,040 CFU/m³ and the lowest was 320 CFU/m³. Twenty-eight species/genera of fungi were identified, with Scopulariopsis brevicaulis (39.0%) being the most commonly isolated species and Rhizopus sp. (30.0%) the most commonly isolated genus. Within the Aspergillus genus, Aspergillus flavus (74.5%) was the most frequently detected species. There was a significant correlation (r = 0.487; p = 0.014) between temperature and the level of fungal contamination (CFU/m³). Considering contamination caused by particles, in this study the particles with larger dimensions (PM5.0 and PM10) showed the higher concentrations. There was also a significant correlation between relative humidity and the concentration of smaller particles, namely PM0.5 (r = 0.438; p = 0.025) and PM1.0 (r = 0.537; p = 0.005). Characterizing typical exposure levels to these contaminants in this specific occupational setting is required to allow a more detailed risk assessment and to set exposure limits to protect workers' health.
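The reported r and p values are Pearson correlation tests. A minimal sketch of such a test follows; the temperature and CFU values below are made up for illustration, not the study's measurements.

```python
import numpy as np
from scipy import stats

# Illustrative (made-up) paired observations: air temperature (°C)
# versus fungal load (CFU/m3) for a handful of samples.
temperature = np.array([18.0, 20.5, 22.0, 24.0, 25.5, 27.0])
cfu = np.array([500.0, 900.0, 2500.0, 6000.0, 15000.0, 24000.0])

# Pearson's r and its two-sided p-value, as in "r = 0.487; p = 0.014".
r, p = stats.pearsonr(temperature, cfu)
```

With monotonically increasing toy data like this, r comes out strongly positive and the p-value small, mirroring the form of the result quoted in the abstract.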
Abstract:
This dissertation aims to carry out a comparative analysis of the rehabilitation solutions adopted for flexible pavements in the national road network. Within the scope of this analysis, the state of the art of flexible pavement rehabilitation is presented, namely: degradation mechanisms, families of degradations, assessment of pavement load-bearing capacity, and the methodology used in the design of pavement overlays; a comparative analysis of pavement strengthening techniques and anti-cracking treatments is also carried out. In this context, a case study is presented in which three possible solutions for the structural rehabilitation of the pavement of the IC 20 between Almada and Costa de Caparica are analysed. The solution designed by EP, SA, put forward in a public tender launched in 2007, is described; it is somewhat innovative with regard to the crack-reflection-retarding treatment. That technical solution consists of the application of glass-fibre grids and carbon-fibre grids, followed by the placement of a wearing course of rough bituminous mixture with bitumen modified with a low percentage of recycled rubber from used tyres (BBr - BBB). In addition, the rehabilitation design for the IC 20 put forward by the Subconcessionária do Baixo Tejo is analysed, which envisaged the application of rough bituminous mixtures with bitumen modified with a medium percentage of recycled rubber from used tyres (BBr - BBM). Besides the solution put forward by the subconcessionaire, the design-change (variant) solution presented by the consortium of construction companies, which was adopted in the works carried out on the IC 20, is also analysed.
The structural rehabilitation intervention included the use of a binder course in AC 16 10/20 (MBAM) and a wearing course of rough bituminous mixture with bitumen modified with a medium percentage of recycled rubber from used tyres (BBr - BBM). In addition to the characterization of the different flexible pavement rehabilitation solutions adopted in Portugal, a comparative analysis of the life-cycle costs (construction, maintenance and conservation) of each type of rehabilitation solution is carried out.
Abstract:
Master's degree in Radiation Applied to Health Technologies - Specialization area: Radiation Therapy.
Abstract:
Introduction - The increase in TB burden is usually related to inadequate case detection, diagnosis and cure. The global targets for TB control adopted by the World Health Organization (WHO) are to detect 70% of the estimated incidence of sputum smear-positive TB and to cure 85% of newly detected cases of sputum smear-positive TB. Factors associated with unsuccessful treatment outcomes are closely related to TB risk factors. Objectives - To describe treatment success rates in pulmonary TB cases and to identify factors associated with unsuccessful treatment outcomes, according to ad hoc studies.
Abstract:
Objectives - To evaluate the nutritional status of patients with inactive or mildly active Crohn's disease (CD) and to identify possible causes of potential deficiencies. Methods - A total of 78 CD patients and 80 healthy controls were evaluated with respect to nutritional status, dietary intake, and lifestyle factors. Results - Of the 78 CD patients, 73 were on immunomodulating therapies. Mean body mass index (BMI) was lower in patients than in controls (P = 0.006), but 32% of CD patients and 33.8% of controls had a BMI > 25, whereas 8% and 23.8% in each group, respectively, were obese (BMI > 30 kg/m²). Fat-free mass was significantly decreased in both genders (P < 0.05), whereas fat mass was decreased only in males (P = 0.01). Energy intake was significantly lower in CD patients (P < 0.0001), and we observed significantly lower adjusted mean daily intakes of carbohydrates, monounsaturated fat, fiber, calcium, and vitamins C, D, E, and K (P < 0.05). Of the patients, 29% had excluded grains from their usual diet, 28% milk, 18% vegetables, and 11% fruits. Milk exclusion resulted in a significantly lower consumption of calcium and vitamin K (P < 0.001), and the exclusion of vegetables was associated with a lower consumption of vitamins C and E (P < 0.05). Physical activity was significantly lower in CD patients (P = 0.01), and this lack of physical activity was inversely correlated with increased fat mass percentage (r = -0.315, P = 0.001). Conclusions - Results showed that the most prevalent form of malnutrition in CD patients was excess body weight, concomitant with an inadequate dietary intake, namely of micronutrients, clearly related to the dietary exclusion of certain foods.
Abstract:
Final project submitted for the degree of Master in Electronics and Telecommunications Engineering.
Abstract:
Independent component analysis (ICA) has recently been proposed as a tool to unmix hyperspectral data. ICA is founded on two assumptions: 1) the observed spectrum vector is a linear mixture of the constituent spectra (endmember spectra) weighted by the corresponding abundance fractions (sources); 2) the sources are statistically independent. Independent factor analysis (IFA) extends ICA to linear mixtures of independent sources immersed in noise. Concerning hyperspectral data, the first assumption is valid whenever the multiple scattering among the distinct constituent substances (endmembers) is negligible and the surface is partitioned according to the fractional abundances. The second assumption, however, is violated, since the sum of the abundance fractions associated with each pixel is constant due to physical constraints in the data acquisition process. Thus, the sources cannot be statistically independent, which compromises the performance of ICA/IFA algorithms in hyperspectral unmixing. This paper studies the impact of hyperspectral source statistical dependence on ICA and IFA performance. We conclude that the accuracy of these methods tends to improve with increases in the signature variability, the number of endmembers, and the signal-to-noise ratio. In any case, there are always endmembers that are incorrectly unmixed. We arrive at this conclusion by minimizing the mutual information of simulated and real hyperspectral mixtures. The computation of the mutual information is based on fitting mixtures of Gaussians to the observed data. A method to sort the ICA and IFA estimates in terms of the likelihood of being correctly unmixed is proposed.
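The dependence induced by the sum-to-one constraint is easy to demonstrate numerically. The check below (a minimal sketch, assuming Dirichlet-distributed abundance fractions as a stand-in for real sources) shows that the fractions are negatively correlated, so the independence assumption behind ICA cannot hold.

```python
import numpy as np

# Abundance fractions constrained to sum to one: model them with a
# flat Dirichlet distribution (3 endmembers, many pixels).
rng = np.random.default_rng(0)
A = rng.dirichlet(np.ones(3), size=20000)   # each row sums to 1

# Correlation matrix of the sources: off-diagonal entries are clearly
# negative (about -0.5 for a symmetric 3-component Dirichlet), i.e.
# the sources are statistically dependent.
corr = np.corrcoef(A, rowvar=False)
```

Any pair of independent sources would have correlation near zero, so the strongly negative off-diagonal entries illustrate why ICA/IFA cannot recover all endmembers correctly here.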
Abstract:
Purpose - The study evaluates the pre- and post-training lesion localisation ability of a group of novice observers. Parallels are drawn with the performance of inexperienced radiographers taking part in preliminary clinical evaluation (PCE) and 'red-dot' systems operating within radiography practice. Materials and methods - Thirty-four novice observers searched 92 images for simulated lesions. Pre-training and post-training evaluations were completed following the free-response receiver operating characteristic (FROC) method. Training consisted of observer performance methodology, the characteristics of the simulated lesions and information on lesion frequency. Jackknife alternative FROC (JAFROC) and highest-rating inferred ROC analyses were performed to evaluate the performance difference on lesion-based and case-based decisions. The significance level of the test was set at 0.05 to control the probability of Type I error. Results - JAFROC analysis (F(3,33) = 26.34, p < 0.0001) and highest-rating inferred ROC analysis (F(3,33) = 10.65, p = 0.0026) revealed a statistically significant difference in lesion detection performance. The JAFROC figure-of-merit was 0.563 (95% CI 0.512, 0.614) pre-training and 0.677 (95% CI 0.639, 0.715) post-training. The highest-rating inferred ROC figure-of-merit was 0.728 (95% CI 0.701, 0.755) pre-training and 0.772 (95% CI 0.750, 0.793) post-training. Conclusions - This study has demonstrated that novice observer performance can improve significantly. This study design may have relevance in the assessment of inexperienced radiographers taking part in PCE or commenting schemes for trauma.
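An ROC figure-of-merit of the case-based kind reported above can be computed empirically in its Wilcoxon form: the probability that a randomly chosen lesion-present case is rated higher than a lesion-absent one, with ties counting one half. The sketch below is a generic illustration (function name and ratings are made up), not the JAFROC software's implementation.

```python
import numpy as np

def auc_from_ratings(ratings_signal, ratings_noise):
    """Empirical ROC area in Wilcoxon form: the fraction of
    (lesion-present, lesion-absent) case pairs in which the
    lesion-present case gets the higher rating (ties count 1/2)."""
    s = np.asarray(ratings_signal, dtype=float)[:, None]
    n = np.asarray(ratings_noise, dtype=float)[None, :]
    # broadcast to compare every signal rating with every noise rating
    return float(np.mean((s > n) + 0.5 * (s == n)))
```

Perfect separation of the two case sets gives 1.0 and pure guessing gives 0.5, which brackets the 0.728 and 0.772 figures quoted in the results.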