916 results for IS-enabled Innovation Framework


Relevance:

100.00%

Publisher:

Abstract:

Hyperspectral remote sensing exploits the electromagnetic scattering patterns of different materials at specific wavelengths [2, 3]. Hyperspectral sensors have been developed to sample the scattered portion of the electromagnetic spectrum extending from the visible region through the near-infrared and mid-infrared, in hundreds of narrow contiguous bands [4, 5]. The number and variety of potential civilian and military applications of hyperspectral remote sensing is enormous [6, 7]. Very often, the resolution cell corresponding to a single pixel in an image contains several substances (endmembers) [4]. In this situation, the scattered energy is a mixing of the endmember spectra. A challenging task underlying many hyperspectral imagery applications is then decomposing a mixed pixel into a collection of reflectance spectra, called endmember signatures, and the corresponding abundance fractions [8–10]. Depending on the mixing scale at each pixel, the observed mixture is either linear or nonlinear [11, 12]. The linear mixing model holds approximately when the mixing scale is macroscopic [13] and there is negligible interaction among distinct endmembers [3, 14]. If, however, the mixing scale is microscopic (intimate mixtures) [15, 16] and the incident solar radiation is scattered by the scene through multiple bounces involving several endmembers [17], the linear model is no longer accurate.

Linear spectral unmixing has been intensively researched in recent years [9, 10, 12, 18–21]. It considers that a mixed pixel is a linear combination of endmember signatures weighted by the corresponding abundance fractions. Under this model, and assuming that the number of substances and their reflectance spectra are known, hyperspectral unmixing is a linear problem for which many solutions have been proposed (e.g., maximum likelihood estimation [8], spectral signature matching [22], spectral angle mapper [23], subspace projection methods [24, 25], and constrained least squares [26]). In most cases, however, the number of substances and their reflectances are not known, and hyperspectral unmixing then falls into the class of blind source separation problems [27]. Independent component analysis (ICA) has recently been proposed as a tool to blindly unmix hyperspectral data [28–31]. ICA is based on the assumption of mutually independent sources (abundance fractions), which is not the case for hyperspectral data, since the sum of the abundance fractions is constant, implying statistical dependence among them. This dependence compromises the applicability of ICA to hyperspectral images, as shown in Refs. [21, 32]. In fact, ICA finds the endmember signatures by multiplying the spectral vectors by an unmixing matrix that minimizes the mutual information among sources. If the sources are independent, ICA provides the correct unmixing, since the minimum of the mutual information is attained only when the sources are independent. This is no longer true for dependent abundance fractions. Nevertheless, some endmembers may be approximately unmixed. These aspects are addressed in Ref. [33].

Under the linear mixing model, the observations from a scene lie in a simplex whose vertices correspond to the endmembers. Several approaches [34–36] have exploited this geometric feature of hyperspectral mixtures [35]. The minimum volume transform (MVT) algorithm [36] determines the simplex of minimum volume containing the data. The method presented in Ref. [37] is also of the MVT type but, by introducing the notion of bundles, it takes into account the endmember variability usually present in hyperspectral mixtures. MVT-type approaches are computationally complex. Usually, these algorithms first find the convex hull defined by the observed data and then fit a minimum volume simplex to it. For example, the gift wrapping algorithm [38] computes the convex hull of n data points in a d-dimensional space with a computational complexity of O(n^(⌊d/2⌋+1)), where ⌊x⌋ denotes the largest integer less than or equal to x and n is the number of samples. The complexity of the method presented in Ref. [37] is even higher, since the temperature of the simulated annealing algorithm used must follow a log(·) law [39] to assure convergence (in probability) to the desired solution.

Aiming at a lower computational complexity, some algorithms, such as the pixel purity index (PPI) [35] and N-FINDR [40], still find the minimum volume simplex containing the data cloud, but they assume the presence of at least one pure pixel of each endmember in the data. This is a strong requirement that may not hold in some data sets. In any case, these algorithms find the set of most pure pixels in the data. The PPI algorithm uses the minimum noise fraction (MNF) [41] as a preprocessing step to reduce dimensionality and to improve the signal-to-noise ratio (SNR). The algorithm then projects every spectral vector onto skewers (a large number of random vectors) [35, 42, 43]. The points corresponding to extremes, for each skewer direction, are stored. A cumulative account records the number of times each pixel (i.e., a given spectral vector) is found to be an extreme. The pixels with the highest scores are the purest ones. The N-FINDR algorithm [40] is based on the fact that in p spectral dimensions, the p-volume defined by a simplex formed by the purest pixels is larger than the volume defined by any other combination of pixels. This algorithm finds the set of pixels defining the largest volume by inflating a simplex inside the data. ORASIS [44, 45] is a hyperspectral framework developed by the U.S. Naval Research Laboratory consisting of several algorithms organized in six modules, among them an exemplar selector, an adaptive learner, a demixer, a knowledge base or spectral library, and a spatial postprocessor. The first step consists in flat-fielding the spectra. Next, the exemplar selection module is used to select spectral vectors that best represent the smallest convex cone containing the data. The other pixels are rejected when the spectral angle distance (SAD) is less than a given threshold. The procedure finds the basis for a subspace of lower dimension using a modified Gram–Schmidt orthogonalization. The selected vectors are then projected onto this subspace, and a simplex is found by an MVT process. ORASIS is oriented to real-time target detection from uncrewed air vehicles using hyperspectral data [46].

In this chapter we develop a new algorithm to unmix linear mixtures of endmember spectra. First, the algorithm determines the number of endmembers and the signal subspace using a newly developed concept [47, 48]. Second, the algorithm extracts the most pure pixels present in the data. Unlike other methods, this algorithm is completely automatic and unsupervised. To estimate the number of endmembers and the signal subspace in hyperspectral linear mixtures, the proposed scheme begins by estimating the signal and noise correlation matrices; the latter estimate is based on multiple regression theory. The signal subspace is then identified by selecting the set of signal eigenvalues that best represents the data, in the least-squares sense [48, 49]. We note, however, that VCA works with both projected and unprojected data. The extraction of the endmembers exploits two facts: (1) the endmembers are the vertices of a simplex, and (2) the affine transformation of a simplex is also a simplex. Like the PPI and N-FINDR algorithms, VCA also assumes the presence of pure pixels in the data. The algorithm iteratively projects the data onto a direction orthogonal to the subspace spanned by the endmembers already determined. The new endmember signature corresponds to the extreme of the projection. The algorithm iterates until all endmembers are found. VCA performs much better than PPI and better than or comparably to N-FINDR, yet its computational complexity is between one and two orders of magnitude lower than that of N-FINDR. The chapter is structured as follows. Section 19.2 describes the fundamentals of the proposed method. Section 19.3 and Section 19.4 evaluate the proposed algorithm using simulated and real data, respectively. Section 19.5 presents some concluding remarks.
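To make the geometric extraction step concrete, the following is a minimal Python sketch of the pure-pixel idea: the data are iteratively projected onto a direction orthogonal to the subspace spanned by the endmembers found so far, and the extreme of each projection is taken as the next endmember. This is an illustration of the principle under the pure-pixel assumption, not the published VCA (which adds SNR-dependent subspace projection, among other details); the toy data and all names are hypothetical.

```python
import numpy as np

def vca_sketch(X, p, seed=0):
    """Toy endmember extraction in the spirit of VCA (illustrative only).

    X : (L, n) array of n spectral vectors with L bands.
    p : number of endmembers, assumed known.
    Returns the indices of the p pixels selected as endmembers.
    """
    rng = np.random.default_rng(seed)
    L, n = X.shape
    E = np.zeros((L, p))              # endmember signatures found so far
    idx = []
    for k in range(p):
        # Random direction, made orthogonal to the span of the endmembers
        # already determined (least-squares residual = orthogonal component).
        w = rng.standard_normal(L)
        if k > 0:
            A = E[:, :k]
            w -= A @ np.linalg.lstsq(A, w, rcond=None)[0]
        f = w / np.linalg.norm(w)
        # The extreme of the projection lies at a vertex of the data simplex,
        # i.e., at a pure pixel under the pure-pixel assumption.
        j = int(np.argmax(np.abs(f @ X)))
        idx.append(j)
        E[:, k] = X[:, j]
    return idx

# Toy usage: linear mixtures X = M A + noise, with one pure pixel forced
# per endmember so the pure-pixel assumption holds.
L_bands, p_true, n_pix = 50, 3, 500
rng = np.random.default_rng(1)
M = np.abs(rng.standard_normal((L_bands, p_true)))        # endmember spectra
A = rng.dirichlet(np.ones(p_true), size=n_pix).T          # abundances sum to 1
A[:, :p_true] = np.eye(p_true)                            # force pure pixels
X = M @ A + 1e-3 * rng.standard_normal((L_bands, n_pix))  # small noise
print(sorted(vca_sketch(X, p_true)))                      # expect [0, 1, 2]
```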

Relevance:

100.00%

Publisher:

Abstract:

Master's dissertation in Integrated Quality, Environment and Safety Management

Relevance:

100.00%

Publisher:

Abstract:

Any structure today must be strong, robust and lightweight, which has increased industrial and research interest in adhesive bonding, namely through the improvement of the strength and fracture properties of materials. With this joining technique, structural design can be oriented towards lighter structures, not only because of the direct weight savings relative to bolted or welded joints, but also because of the flexibility to join dissimilar materials. In any industrial field, the large-scale application of a given joining technique presupposes the availability of reliable tools for design and failure prediction. In this context, Cohesive Damage Models (CDM) are an essential tool, although the adhesive's CDM laws in tension and shear must be estimated as input to the numerical models. This work evaluates the shear fracture toughness (GIIC) of bonded joints for three adhesives of distinct ductility. The experimental work consists of the shear fracture characterization of the adhesive bond by conventional methods and by the J-integral. Moreover, the J-integral makes it possible to define the exact shape of the cohesive law. For the J-integral, a previously developed digital image correlation method is used to evaluate the adhesive shear displacement at the crack tip (δs) during the test, coupled with a Matlab® subroutine for the automatic extraction of δs. A numerical study is also presented to evaluate the suitability of approximate triangular cohesive laws in reproducing the experimental load-displacement (P-δ) curves of the ENF tests. A sensitivity analysis is also presented to understand the influence of the cohesive parameters on the numerical predictions. As a result of this work, the cohesive laws of each adhesive were estimated experimentally by the direct method and validated numerically, for the subsequent strength prediction of adhesive joints. Together with the tensile characterization of these adhesives, mixed-mode failure prediction becomes possible.
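As a rough illustration of the direct method mentioned above, the sketch below (in Python rather than the Matlab® subroutine used in the work) differentiates an assumed smooth fit of the J-integral GII with respect to the crack-tip shear displacement δs to recover the shear cohesive law ts(δs). The fitted curve and all numerical values are hypothetical stand-ins for smoothed ENF measurements.

```python
import numpy as np

# Direct method (illustrative): the shear cohesive law t_s(delta_s) is the
# derivative of the J-integral G_II with respect to the crack-tip shear
# displacement delta_s. The closed-form fit below stands in for smoothed
# experimental ENF data and is entirely hypothetical.
delta_s = np.linspace(0.0, 0.5, 200)                    # mm
G_IIc = 4.0                                             # N/mm, assumed plateau
G_II = G_IIc * (1.0 - np.exp(-(delta_s / 0.08) ** 2))   # assumed smooth fit

t_s = np.gradient(G_II, delta_s)                        # N/mm^2 = MPa
k = int(np.argmax(t_s))
print(f"peak traction ~ {t_s[k]:.1f} MPa at delta_s = {delta_s[k]:.3f} mm")
```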

Relevance:

100.00%

Publisher:

Abstract:

Dissertation presented as a partial requirement for the degree of Master in Statistics and Information Management

Relevance:

100.00%

Publisher:

Abstract:

Master's dissertation in Informatics Engineering

Relevance:

100.00%

Publisher:

Abstract:

This study reports the ability of one hyperthermophilic and two thermophilic microorganisms to grow anaerobically by the reduction of chlorate and perchlorate. Physiological, genomic and proteome analyses suggest that the crenarchaeon Aeropyrum pernix reduces perchlorate with a periplasmic enzyme related to nitrate reductases, but that it lacks a functional chlorite-disproportionating enzyme (Cld) to complete the pathway. A. pernix, previously described as a strictly aerobic microorganism, seems to rely on the chemical reactivity of reduced sulfur compounds with chlorite, a mechanism previously reported for the perchlorate-reducing Archaeoglobus fulgidus. The chemical oxidation of thiosulfate (present in excess in the medium) and the reduction of chlorite result in the release of sulfate and chloride, the products of a combined biotic-abiotic perchlorate reduction pathway in A. pernix. The apparent absence of Cld in two other perchlorate-reducing microorganisms, Carboxydothermus hydrogenoformans and Moorella glycerini strain NMP, and their dependence on sulfide for perchlorate reduction are consistent with the observations made on A. fulgidus. Our findings suggest that microbial perchlorate reduction at high temperature differs notably from the physiology of perchlorate- and chlorate-reducing mesophiles: it is characterized by the lack of a chlorite dismutase and is enabled by a combination of biotic and abiotic reactions.

Relevance:

100.00%

Publisher:

Abstract:

Master's dissertation in Industrial Engineering

Relevance:

100.00%

Publisher:

Abstract:

Cooperative learning is conceived as a pedagogical strategy that favours personalized learning and promotes educational success, not only individual but also collective. It is achieved through the cooperation of all members of the group, in which each member's performance influences and is influenced by the performance of the Other. The group is conceived as a social organization whose effectiveness implies the capacity to build and maintain the group as a whole and to promote the educational success of all its members. This theoretical framework informs the design, implementation and evaluation of a pedagogical intervention in Human Biology in the 10th year of the Technological Course in Sport. The pedagogical strategy is characterized by the articulated operationalization of cooperation structures and roles in addressing the didactic unit "Transformation and Use of Energy", with the aim of promoting the integrated learning of cooperation skills and of substantive knowledge in the field of Biology. The evaluation of the pedagogical intervention focused on two research objectives: (1) to identify the impact of the pedagogical intervention strategy on the development of cooperation skills, and (2) to identify the impact of the pedagogical intervention strategy on the development of disciplinary skills. It was carried out through the analysis of learning tasks and of a final overall evaluation questionnaire applied to the students. The students perceive the pedagogical intervention strategy as having contributed to the development not only of cooperation skills but also of substantive knowledge and critical thinking.

Relevance:

100.00%

Publisher:

Abstract:

Master's dissertation in Human Resource Management

Relevance:

100.00%

Publisher:

Abstract:

Master's internship report in Translation and Multilingual Communication

Relevance:

100.00%

Publisher:

Abstract:

According to official statistics, disabled people in Spain number 3.5 million and make up 8.8% of the Spanish population. This group of people are increasingly being recognised as members of society with equal rights, and many of their demands are gradually being transformed into solutions that benefit society as a whole. One example is improved accessibility. Accessible built environments are more human and inclusive places, as well as being easier to get around. Improved accessibility is now recognised as a requirement shared by all members of society, although it is achieved thanks to the demands of disabled people and their representatives. The 1st National Accessibility Plan is a strategic framework for action aimed at ensuring that new products, services and built environments are designed to be accessible for as many people as possible (Design for All) and that existing ones are gradually duly adapted.

Relevance:

100.00%

Publisher:

Abstract:

Domestic action on climate change is increasingly important in the light of the difficulties with international agreements, and it requires a combination of solutions in terms of institutions and policy instruments. One way of achieving government carbon policy goals may be the creation of an independent body to advise on, set or monitor policy. This paper critically assesses the Committee on Climate Change (CCC), which was created in 2008 as an independent body to help move the UK towards a low carbon economy. We look at the motivation for its creation in terms of information provision, advice, monitoring and policy delegation. In particular, we consider its ability to overcome a time-inconsistency problem by comparing and contrasting it with another independent body, the Monetary Policy Committee of the Bank of England. In practice the Committee on Climate Change appears to be the 'inverse' of the Monetary Policy Committee, in that it advises on what the policy goal should be rather than being responsible for achieving it. The CCC incorporates both advisory and monitoring functions to inform government and achieve a credible carbon policy over a long time frame. This is a similar framework to that adopted by Stern (2006), but the CCC operates on a continuing basis; we therefore believe the CCC is best viewed as a 'Rolling Stern plus' body. There are also concerns as to how binding the carbon budgets actually are and how they interact with other energy policy goals and instruments, such as Renewable Obligation Contracts and the EU Emissions Trading Scheme. The CCC could potentially be reformed to include an explicit information-provision role, consumption-based accounting of emissions, and control of a policy instrument such as a balanced-budget carbon tax.

Relevance:

100.00%

Publisher:

Abstract:

Every time another corporate scandal captures media headlines, the 'bad apple vs. bad barrel' discussion starts anew. Yet this debate overlooks the influence of the broader societal context on organizational behavior. In this article, we argue that misbehaviors of organizations (the 'barrels') and their members (the 'apples') cannot be addressed properly without a clear understanding of their broader context (the 'larder'). Whereas previously a strong societal framework dampened the practical application of the Homo economicus concept (business actors as perfectly rational, egocentric, utility-maximizing agents without any moral concern), specialization, individualization and globalization have led to a business world disembedded from broader societal norms. This emancipated business world promotes a literal interpretation of Homo economicus among business organizations and their members. Consequently, we argue that the first step toward 'healthier' apples and barrels is to sanitize the larder, that is, to adapt the framework in which organizations and their members evolve.

Relevance:

100.00%

Publisher:

Abstract:

This paper reviews three different approaches to modelling the cost-effectiveness of schistosomiasis control. Although these approaches vary in their assessment of costs, the major focus of the paper is on the evaluation of effectiveness. The first model presented is a static economic model that assesses effectiveness in terms of the proportion of cases cured. This model is important in highlighting that the optimal choice of chemotherapy regime depends critically on the level of the budget constraint, the unit costs of screening and treatment, the rates of compliance with screening and chemotherapy, and the prevalence of infection. The limitation of this approach is that it models the cost-effectiveness of only one cycle of treatment, so effectiveness reflects only the immediate impact of treatment. The second model presented is a prevalence-based dynamic model that links prevalence rates from one year to the next and assesses effectiveness as the proportion of cases prevented. This model was important because it introduced the concept of measuring the long-term impact of control, using a transmission model that can assess the reduction in infection through time, but it is limited to assessing the impact only on the prevalence of infection. The third approach presented is a theoretical framework that describes the dynamic relationships between infection and morbidity and assesses effectiveness in terms of case-years of infection and morbidity prevented. The use of this model in assessing the cost-effectiveness of age-targeted treatment in controlling Schistosoma mansoni is explored in detail, with respect to varying frequencies of treatment and the interaction between drug price and drug efficacy.
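For intuition about the second, prevalence-based approach, here is a minimal Python sketch of a year-to-year prevalence recursion in which effectiveness is measured as the proportion of case-years prevented relative to no control. The functional form, parameter names and values are assumptions for illustration, not the paper's model.

```python
# Hypothetical prevalence-based dynamic model: prevalence next year equals
# prevalence this year, minus cases cured by chemotherapy, plus new
# infections from an assumed saturating transmission term.
def simulate(prev0, beta, coverage, cure_rate, years):
    prev, history = prev0, [prev0]
    for _ in range(years):
        cured = prev * coverage * cure_rate        # screened, treated and cured
        new_infections = beta * prev * (1 - prev)  # assumed transmission term
        prev = min(1.0, max(0.0, prev - cured + new_infections))
        history.append(prev)
    return history

baseline = simulate(0.40, beta=0.30, coverage=0.0, cure_rate=0.9, years=10)
control  = simulate(0.40, beta=0.30, coverage=0.6, cure_rate=0.9, years=10)

# Effectiveness: proportion of case-years prevented over the horizon.
prevented = 1.0 - sum(control) / sum(baseline)
print(f"case-years prevented: {prevented:.1%}")
```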

Relevance:

100.00%

Publisher:

Abstract:

The gradual incorporation of nurses into out-of-hospital emergency teams gives their care a holistic dimension, which is not possible without addressing continuity of care and communication with other levels of care. The nursing interventions that embody these efforts, "Referral" (NIC 8100) and "Health care information exchange" (NIC 7960), constitute the conceptual framework of this work, whose objectives are to quantify and describe the actions taken along these lines by the nurses of the emergency team.