986 results for Evaluate image retention


Relevance:

20.00%

Publisher:

Abstract:

Conventional film-based X-ray imaging systems are being replaced by their digital equivalents. Different approaches are being followed, considering direct or indirect conversion, with the latter technique dominating. The typical indirect-conversion X-ray panel detector uses a phosphor for X-ray conversion, coupled to a large-area array of amorphous-silicon-based optical sensors and switching thin-film transistors (TFTs). The pixel information can then be read out by switching the corresponding line and column transistors, routing the signal to an external amplifier. In this work we follow an alternative approach, where the electrical switching performed by the TFTs is replaced by optical scanning using a low-power laser beam and a sensing/switching PINPIN structure, thus resulting in a simpler device. The optically active device is a PINPIN array, sharing both front and back electrical contacts, deposited over a glass substrate. During X-ray exposure, each sensing-side photodiode collects photons generated by the scintillator screen (560 nm), charging its internal capacitance. Subsequently, a laser beam (445 nm) scans the switching diodes (back side), retrieving the stored charge in a sequential way and reconstructing the image. In this paper we present recent work on the optoelectronic characterization of the PINPIN structure to be incorporated in the X-ray image sensor. The results of the optoelectronic characterization of the device and its dependence on the scanning-beam parameters are presented and discussed. Preliminary results of line scans are also presented. (C) 2014 Elsevier B.V. All rights reserved.
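As a toy illustration of this optically scanned readout (the names and numbers below are our own illustrative assumptions, not the authors' device model), each sensing diode integrates charge during exposure, and a scanning step then retrieves the stored charge sequentially and destructively:

```python
def expose(flux, capacitance=1.0):
    """Charge stored on each pixel's internal capacitance (arbitrary units)."""
    return [[capacitance * f for f in row] for row in flux]

def laser_scan(stored):
    """Retrieve the stored charge pixel by pixel; the read is destructive."""
    image = []
    for i, row in enumerate(stored):
        line = []
        for j, _ in enumerate(row):
            line.append(stored[i][j])   # signal routed to the single external amplifier
            stored[i][j] = 0.0          # charge removed as the laser spot passes
        image.append(line)
    return image

panel = expose([[0.0, 1.0], [2.0, 3.0]])
first = laser_scan(panel)    # reconstructs the exposure image
second = laser_scan(panel)   # a second scan reads back nothing
```

The destructive second read mimics the fact that the stored charge, once retrieved, must be re-accumulated by a new exposure before another image can be scanned out.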

Relevance:

20.00%

Publisher:

Abstract:

Project work presented to the Escola Superior de Comunicação Social as part of the requirements for the degree of Master in Advertising and Marketing.

Relevance:

20.00%

Publisher:

Abstract:

The basic objective of this work is to evaluate the durability of self-compacting concrete (SCC) produced in binary and ternary mixes using fly ash (FA) and limestone filler (LF) as partial replacements of cement. The main characteristics that set SCC apart from conventional concrete (fundamentally its fresh-state behaviour) essentially depend on the greater or lesser content of various constituents, namely: greater mortar volume (more ultrafine material in the form of cement and mineral additions); proper control of the maximum size of the coarse aggregate; and use of admixtures such as superplasticizers. Significant amounts of mineral additions are thus incorporated to partially replace cement, in order to improve the workability of the concrete. These mineral additions necessarily affect the concrete's microstructure and its durability. Therefore, notwithstanding the many well-documented and acknowledged advantages of SCC, a better understanding of its behaviour is still required, in particular when its composition includes significant amounts of mineral additions. An ambitious working plan was devised: first, the SCC's microstructure was studied and characterized; afterwards, the main transport and degradation mechanisms of the SCC produced were characterized by means of SEM image analysis, chloride migration, electrical resistivity, and carbonation tests. It was then possible to draw conclusions about the SCC's durability. The properties studied are strongly affected by the type and content of the additions. The use of ternary mixes also proved to be extremely favourable, confirming the expected beneficial effect of the synergy between LF and FA. © 2015 RILEM.

Relevance:

20.00%

Publisher:

Abstract:

Master's Degree in Mechanical Engineering - Industrial Management Branch

Relevance:

20.00%

Publisher:

Abstract:

Dissertation presented to obtain the degree of Doctor of Philosophy in Electrical Engineering, speciality in Perceptional Systems, by the Universidade Nova de Lisboa, Faculty of Sciences and Technology.

Relevance:

20.00%

Publisher:

Abstract:

This paper introduces a new toolbox for hyperspectral imagery, developed in the MATLAB environment. The toolbox provides easy access to different supervised and unsupervised classification methods. The new application is also versatile and fully dynamic, since users can incorporate their own methods, which can be reused and shared. While extending the potential of the MATLAB environment, the toolbox also provides a user-friendly platform to assess the results of different methodologies. A study of several supervised and unsupervised classification methods on real hyperspectral data, conducted with the new application, is also presented.
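A hypothetical miniature of the two method families such a toolbox wraps (this is not the toolbox itself, and the synthetic "spectra" below are invented for illustration): a supervised nearest-class-mean classifier, which uses training labels, and an unsupervised k-means clustering, which does not.

```python
import numpy as np

rng = np.random.default_rng(0)
bands = 20
class_means = np.stack([np.linspace(0, 1, bands), np.linspace(1, 0, bands)])
X = np.vstack([m + 0.05 * rng.standard_normal((50, bands)) for m in class_means])
y = np.repeat([0, 1], 50)                      # ground-truth labels

# Supervised: assign each pixel to the nearest class mean learned from labels.
centroids = np.stack([X[y == k].mean(axis=0) for k in (0, 1)])
pred = np.argmin(((X[:, None] - centroids[None]) ** 2).sum(-1), axis=1)

# Unsupervised: plain k-means, initialised from one pixel of each group.
c = X[[0, 50]].copy()
for _ in range(10):
    lab = np.argmin(((X[:, None] - c[None]) ** 2).sum(-1), axis=1)
    c = np.stack([X[lab == k].mean(axis=0) for k in (0, 1)])

print("supervised accuracy:", (pred == y).mean())
print("clusters found:", sorted(set(lab.tolist())))
```

With well-separated synthetic classes both routes recover the two groups; on real hyperspectral data the comparison between such methods is exactly what the toolbox is meant to facilitate.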

Relevance:

20.00%

Publisher:

Abstract:

Hyperspectral remote sensing exploits the electromagnetic scattering patterns of different materials at specific wavelengths [2, 3]. Hyperspectral sensors have been developed to sample the scattered portion of the electromagnetic spectrum extending from the visible region through the near-infrared and mid-infrared, in hundreds of narrow contiguous bands [4, 5]. The number and variety of potential civilian and military applications of hyperspectral remote sensing is enormous [6, 7]. Very often, the resolution cell corresponding to a single pixel in an image contains several substances (endmembers) [4]. In this situation, the scattered energy is a mixture of the endmember spectra. A challenging task underlying many hyperspectral imagery applications is then decomposing a mixed pixel into a collection of reflectance spectra, called endmember signatures, and the corresponding abundance fractions [8–10]. Depending on the mixing scales at each pixel, the observed mixture is either linear or nonlinear [11, 12]. The linear mixing model holds approximately when the mixing scale is macroscopic [13] and there is negligible interaction among distinct endmembers [3, 14]. If, however, the mixing scale is microscopic (intimate mixtures) [15, 16] and the incident solar radiation is scattered by the scene through multiple bounces involving several endmembers [17], the linear model is no longer accurate. Linear spectral unmixing has been intensively researched in recent years [9, 10, 12, 18–21]. It considers that a mixed pixel is a linear combination of endmember signatures weighted by the corresponding abundance fractions.
Under this model, and assuming that the number of substances and their reflectance spectra are known, hyperspectral unmixing is a linear problem for which many solutions have been proposed (e.g., maximum likelihood estimation [8], spectral signature matching [22], spectral angle mapper [23], subspace projection methods [24, 25], and constrained least squares [26]). In most cases, the number of substances and their reflectances are not known, and hyperspectral unmixing then falls into the class of blind source separation problems [27]. Independent component analysis (ICA) has recently been proposed as a tool to blindly unmix hyperspectral data [28–31]. ICA is based on the assumption of mutually independent sources (abundance fractions), which is not the case for hyperspectral data, since the sum of the abundance fractions is constant, implying statistical dependence among them. This dependence compromises the applicability of ICA to hyperspectral images, as shown in Refs. [21, 32]. In fact, ICA finds the endmember signatures by multiplying the spectral vectors by an unmixing matrix that minimizes the mutual information among sources. If the sources are independent, ICA provides the correct unmixing, since the minimum of the mutual information is obtained only when the sources are independent. This is no longer true for dependent abundance fractions. Nevertheless, some endmembers may be approximately unmixed. These aspects are addressed in Ref. [33]. Under the linear mixing model, the observations from a scene lie in a simplex whose vertices correspond to the endmembers. Several approaches [34–36] have exploited this geometric feature of hyperspectral mixtures [35]. The minimum volume transform (MVT) algorithm [36] determines the simplex of minimum volume containing the data. The method presented in Ref. [37] is also of MVT type but, by introducing the notion of bundles, it takes into account the endmember variability usually present in hyperspectral mixtures.
MVT-type approaches are computationally complex. Usually, these algorithms first find the convex hull defined by the observed data and then fit a minimum-volume simplex to it. For example, the gift wrapping algorithm [38] computes the convex hull of n data points in a d-dimensional space with a computational complexity of O(n^(⌊d/2⌋+1)), where ⌊x⌋ is the largest integer less than or equal to x and n is the number of samples. The complexity of the method presented in Ref. [37] is even higher, since the temperature of the simulated annealing algorithm used must follow a logarithmic law [39] to assure convergence (in probability) to the desired solution. Aiming at a lower computational complexity, some algorithms such as the pixel purity index (PPI) [35] and N-FINDR [40] still find the minimum-volume simplex containing the data cloud, but they assume the presence of at least one pure pixel of each endmember in the data. This is a strong requirement that may not hold in some data sets. In any case, these algorithms find the set of most pure pixels in the data. The PPI algorithm uses the minimum noise fraction (MNF) [41] as a preprocessing step to reduce dimensionality and to improve the signal-to-noise ratio (SNR). The algorithm then projects every spectral vector onto skewers (a large number of random vectors) [35, 42, 43]. The points corresponding to the extremes, for each skewer direction, are stored. A cumulative account records the number of times each pixel (i.e., a given spectral vector) is found to be an extreme. The pixels with the highest scores are the purest ones. The N-FINDR algorithm [40] is based on the fact that in p spectral dimensions, the p-volume defined by a simplex formed by the purest pixels is larger than any other volume defined by any other combination of pixels. This algorithm finds the set of pixels defining the largest volume by inflating a simplex inside the data.
ORASIS [44, 45] is a hyperspectral framework developed by the U.S. Naval Research Laboratory consisting of several algorithms organized in six modules: exemplar selector, adaptive learner, demixer, knowledge base or spectral library, and spatial postprocessor. The first step consists in flat-fielding the spectra. Next, the exemplar selection module is used to select spectral vectors that best represent the smaller convex cone containing the data. The other pixels are rejected when the spectral angle distance (SAD) is less than a given threshold. The procedure finds the basis for a subspace of lower dimension using a modified Gram–Schmidt orthogonalization. The selected vectors are then projected onto this subspace, and a simplex is found by an MVT process. ORASIS is oriented to real-time target detection from uncrewed air vehicles using hyperspectral data [46]. In this chapter we develop a new algorithm to unmix linear mixtures of endmember spectra. First, the algorithm determines the number of endmembers and the signal subspace using a newly developed concept [47, 48]. Second, the algorithm extracts the most pure pixels present in the data. Unlike other methods, this algorithm is completely automatic and unsupervised. To estimate the number of endmembers and the signal subspace in hyperspectral linear mixtures, the proposed scheme begins by estimating the signal and noise correlation matrices. The latter is based on multiple regression theory. The signal subspace is then identified by selecting the set of signal eigenvalues that best represents the data, in the least-squares sense [48, 49]; we note, however, that VCA works with both projected and unprojected data. The extraction of the endmembers exploits two facts: (1) the endmembers are the vertices of a simplex and (2) the affine transformation of a simplex is also a simplex. Like the PPI and N-FINDR algorithms, VCA also assumes the presence of pure pixels in the data.
The algorithm iteratively projects the data onto a direction orthogonal to the subspace spanned by the endmembers already determined. The new endmember signature corresponds to the extreme of the projection. The algorithm iterates until all endmembers are exhausted. VCA performs much better than PPI and better than or comparably to N-FINDR; yet it has a computational complexity between one and two orders of magnitude lower than N-FINDR. The chapter is structured as follows. Section 19.2 describes the fundamentals of the proposed method. Section 19.3 and Section 19.4 evaluate the proposed algorithm using simulated and real data, respectively. Section 19.5 presents some concluding remarks.
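The projection idea can be sketched as follows (an illustrative reimplementation of the concept, not the authors' VCA code): under the linear mixing model the endmembers are the vertices of the data simplex, so the extreme of a projection onto a direction orthogonal to the endmembers already found is a new endmember. As the text states, pure pixels are assumed present, so we plant one per endmember in the synthetic data.

```python
import numpy as np

rng = np.random.default_rng(1)
L, p, n = 10, 3, 500                       # bands, endmembers, mixed pixels
M = rng.random((L, p))                     # endmember signatures (columns)
A = np.hstack([np.eye(p),                  # one pure pixel per endmember
               rng.dirichlet(np.ones(p), size=n).T])   # abundances sum to 1
Y = M @ A                                  # noise-free linear mixing model

E = np.zeros((L, 0))                       # extracted endmember signatures
for _ in range(p):
    f = rng.standard_normal(L)             # random projection direction
    if E.shape[1]:
        # remove the component of f lying in the span of found endmembers,
        # so already-extracted vertices project to (numerically) zero
        f -= E @ np.linalg.lstsq(E, f, rcond=None)[0]
    idx = int(np.argmax(np.abs(f @ Y)))    # extreme of the projection
    E = np.column_stack([E, Y[:, idx]])

recovered = {tuple(np.round(E[:, j], 8)) for j in range(p)}
truth = {tuple(np.round(M[:, j], 8)) for j in range(p)}
print("exact recovery:", recovered == truth)
```

Because the maximum of |f·y| over a simplex is attained at a vertex, each iteration picks a pure pixel not yet extracted; in this noise-free toy setting the three true signatures are recovered exactly.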

Relevance:

20.00%

Publisher:

Abstract:

Alongside oncological pathologies, heart diseases, in particular coronary artery disease, are among the main causes of death in industrialized countries, mainly due to the high incidence of myocardial infarction. One form of diagnosis and evaluation of this condition is radionuclide myocardial perfusion imaging, performed by Positron Emission Tomography (PET). Injectable solutions of [15O]-H2O, [82Rb] and [13N]-NH3 are the most widely used in this type of clinical examination. At the Instituto de Ciências Nucleares Aplicadas à Saúde (ICNAS), the existence of a cyclotron has allowed the production of a variety of radiopharmaceuticals, with applications in neurology, oncology and cardiology. Recently, the opportunity arose to begin clinical examinations with [13N]-NH3 for myocardial perfusion evaluation. This is the context of the present work: before clinical use, it is necessary to optimize the production and validate the whole process according to Good Radiopharmaceutical Practice standards. After a process-optimization phase, the physico-chemical and biological parameters of the [13N]-NH3 preparation were evaluated, in accordance with the indications of the European Pharmacopoeia (Ph. Eur.) 8.2. Following the pharmaceutical standards, three consecutive production batches were made to validate the production of [13N]-NH3. The results showed a clear, colourless final product, with pH values within the specified limit, i.e., between 4.5 and 8.5. The chemical purity of the samples was verified, since in the colorimetric test the colour of the [13N]-NH3 solution was not more intense than that of the reference solution. The preparations were identified as [13N]-NH3 through the results obtained by ion chromatography, gamma-radiation spectrometry and half-life measurement.
Examination of the chromatogram obtained with the test solution showed that the main peak had a retention time approximately equal to that of the peak in the chromatogram obtained for the reference solution. In addition, the gamma-radiation spectrum showed a photon peak at 0.511 MeV and an additional one at 1.022 MeV, characteristic of positron-emitting radionuclides. The half-life remained within the indicated interval, between 9 and 11 minutes. The radiochemical purity of the samples was also verified, with a minimum of 99% of the total radioactivity corresponding to [13N], as was the radionuclidic purity, with a percentage of impurities below 1% two hours after the end of synthesis. The tests performed to verify sterility and to determine the presence of bacterial endotoxins in the [13N]-NH3 preparations were negative. The results obtained thus contribute to the validation of the method for the production of [13N]-NH3, since they fulfil the requirements specified in the European standards, indicating a safe product with the quality required for administration to patients for cardiac perfusion evaluation by PET.
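The half-life acceptance test above can be illustrated with a short calculation (the dose-calibrator readings below are hypothetical, not data from the validation batches): two activity measurements taken dt minutes apart give t1/2 = dt · ln(2) / ln(A1/A2), which must fall within the 9-11 minute acceptance window for [13N].

```python
import math

def half_life(a1, a2, dt_min):
    """Half-life from two decaying-activity measurements taken dt_min apart."""
    return dt_min * math.log(2) / math.log(a1 / a2)

a1, a2 = 100.0, 50.0                 # hypothetical activities (MBq), 10 min apart
t_half = half_life(a1, a2, 10.0)
print(t_half, 9.0 <= t_half <= 11.0)   # → 10.0 True
```

An activity that halves in exactly 10 minutes sits comfortably inside the window, consistent with the ~9.97 min half-life of [13N].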

Relevance:

20.00%

Publisher:

Abstract:

The visual image is a fundamental component of epiphany, stressing its immediacy and vividness, corresponding to the enargeia of traditional ekphrasis and also playing with cultural and social meanings. Morris Beja, in his seminal book Epiphany in the Modern Novel, draws our attention to the distinction made by Joyce between the epiphany originating in a common object, in a discourse or gesture, and the one arising in "a memorable phase of the mind itself". The latter type materializes in the "dream-epiphany" and in the epiphany based on memory. Robert Langbaum, on the other hand, in his study of the epiphanic mode, suggests that the category of "visionary epiphany" could account for the modern effect of an internally glowing vision like Blake's "The Tyger", which projects the vitality of a real tiger. The short story, whose length renders it a fitting genre for the use of different types of epiphany, has dealt with the impact of the visual image in this technique to convey different effects and different aesthetic aims. This paper presents some examples of this occurrence in short stories by authors in whose work epiphany is a fundamental concept and literary technique: Walter Pater, Joseph Conrad, K. Mansfield and Clarice Lispector. Pater's "imaginary portraits" concentrate on "privileged moments" of the lives of the characters, depicting their impressions through pictorial language; Conrad tries to show "moments of awakening" that can be remembered by the eye; Mansfield suggests that epiphany, the "glimpse", should replace plot as an internal ordering principle of her impressionist short stories; in C. Lispector, the visualization of some situations is so aggressive that it causes nausea and a radical revelation on the protagonist's part.

Relevance:

20.00%

Publisher:

Abstract:

Work presented within the scope of the Master's in Computer Engineering, as a partial requirement for obtaining the degree of Master in Computer Engineering.

Relevance:

20.00%

Publisher:

Abstract:

This paper examines the relationship between repatriates' level of satisfaction with Human Resources Management practices and the decision to remain in the home company after expatriation. Data were collected through semi-structured interviews with 28 Portuguese repatriates who remained and 16 organisational representatives from eight companies located in Portugal. The results show that (1) the compensation system during the international assignment, (2) permanent support during the international assignment, and (3) recognition, upon return, of the work and effort of expatriates during the international assignment are the most important HRM practices for promoting satisfaction among repatriates. Moreover, it is at the repatriation phase that repatriates show the highest dissatisfaction with HRM support. These findings are discussed in detail, and implications and suggestions for future research are proposed.

Relevance:

20.00%

Publisher:

Abstract:

This investigation reviews the literature on human resource management practices that influence the retention of repatriates. The processes of selection and training/preparation before departure, the roles of the mentor and of communication during the international assignment, a readjustment programme upon repatriation, and a career development plan after the return to the home firm are the practices identified in the literature as the main promoters of repatriates' retention. Evidence suggests that greater responsibility on the part of firms before, during and after the international assignment allows for more efficient management of their repatriates.

Relevance:

20.00%

Publisher:

Abstract:

This article arose from a workshop, "Une langue étrangère, un ordinateur, une image: c'est simple comme bonjour!", held within the scope of the XXI Congress of the Associação Portuguesa dos Professores de Francês, Images et imaginaires pour agir. Its purpose was to disseminate, experiment with and reflect on digital resources that can make a good contribution to the teaching and learning of French as a Foreign Language (FLE). It highlights the power of the image in the construction of knowledge, challenging creativity and new ways of teaching and learning. The teachers showed interest in the digital tools and stressed their importance and applicability in educational contexts. Accordingly, the article presents software tools focused on developing speaking, reading and writing skills in French as a foreign language and describes good practices for their use in the classroom, thus contributing to the renewal of the school.

Relevance:

20.00%

Publisher:

Abstract:

Astringency is an organoleptic property of beverages and food products resulting mainly from the interaction of salivary proteins with dietary polyphenols. It is of great importance to consumers, but the only effective way of measuring it involves trained sensory panellists, providing subjective and expensive responses. Concurrent chemical evaluations try to screen food astringency by means of polyphenol and protein precipitation procedures, but these are far from the real human astringency sensation, where not all polyphenol–protein interactions lead to the occurrence of precipitate. Here, a novel chemical approach that tries to mimic protein–polyphenol interactions in the mouth is presented to evaluate astringency. A protein, acting as a salivary protein, is attached to a solid support to which the polyphenol binds (just as happens when drinking wine), with a subsequent colour alteration that is fully independent of the occurrence of precipitate. Employing this simple concept, Bovine Serum Albumin (BSA) was selected as the model salivary protein and used to cover the surface of silica beads. Tannic Acid (TA), employed as the model polyphenol, was allowed to interact with the BSA on the silica support, and its adsorption to the protein was detected by reaction with Fe(III) and subsequent colour development. Quantitative data on TA in the samples were extracted by colorimetric or reflectance studies of the solid materials. The former analysis was done by taking a regular picture with a digital camera, opening the image file in common software and extracting the colour coordinates from the HSL (Hue, Saturation, Lightness) and RGB (Red, Green, Blue) colour-model systems; linear ranges were observed from 10.6 to 106.0 μmol L−1. The latter was based on the Kubelka–Munk response, showing a linear gain with concentrations from 0.3 to 10.5 μmol L−1. In either of these two approaches, semi-quantitative estimation of TA was enabled by direct eye comparison.
The correlation between the levels of adsorbed TA and the astringency of beverages was tested by using the assay to check the astringency of wines and comparing the results to the responses of sensory panellists. The results of the two methods correlated well. The proposed sensor has significant potential as a robust tool for the quantitative/semi-quantitative evaluation of astringency in wine.
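The colour-readout step can be sketched in a few lines (file handling omitted; the calibration points below are invented for illustration, not the paper's data): an averaged RGB reading from the photograph is converted to HSL with the standard library, and a linear fit of lightness against known TA standards is inverted to quantify an unknown sample.

```python
import colorsys

def rgb_to_hsl(r, g, b):
    """8-bit RGB -> (hue, saturation, lightness), each in [0, 1]."""
    h, l, s = colorsys.rgb_to_hls(r / 255, g / 255, b / 255)   # stdlib uses HLS order
    return h, s, l

# Hypothetical standards: lightness falling linearly as [TA] rises (umol/L).
standards = [(10.6, 0.80), (42.4, 0.61), (74.2, 0.42), (106.0, 0.23)]
xs = [c for c, _ in standards]
ys = [l for _, l in standards]

# Ordinary least-squares fit of lightness vs concentration.
n = len(xs)
sx, sy = sum(xs), sum(ys)
sxx = sum(x * x for x in xs)
sxy = sum(x * y for x, y in zip(xs, ys))
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
intercept = (sy - slope * sx) / n

def ta_from_lightness(lightness):
    """Invert the linear calibration to estimate TA concentration (umol/L)."""
    return (lightness - intercept) / slope
```

For instance, `rgb_to_hsl(255, 0, 0)` returns `(0.0, 1.0, 0.5)` for pure red, and a measured lightness is mapped back to a concentration through `ta_from_lightness`.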

Relevance:

20.00%

Publisher:

Abstract:

Although the Giemsa-stained thick blood smear (GTS) remains the gold standard for the diagnosis of malaria, molecular methods are more sensitive and specific in detecting parasites and can be used at reference centers to evaluate the performance of microscopy. The description of the Plasmodium falciparum, P. vivax, P. malariae and P. ovale ssrRNA gene sequences allowed the development of a polymerase chain reaction (PCR) that has been used to differentiate the four species. The objective of this study was to determine Plasmodium species through PCR in 190 positive smears from patients, in order to verify the quality of diagnosis at SUCEN's Malaria Laboratory. Considering only the 131 results positive in both techniques, GTS detected 4.6% mixed and 3.1% P. malariae infections, whereas PCR identified 19.1% and 13.8%, respectively.