988 results for higher spectral component


Relevance:

20.00%

Publisher:

Abstract:

8th International Conference of Education, Research and Innovation. 18-20 November, 2015, Seville, Spain.

Relevance:

20.00%

Publisher:

Abstract:

Independent component analysis (ICA) has recently been proposed as a tool to unmix hyperspectral data. ICA is founded on two assumptions: 1) the observed spectrum vector is a linear mixture of the constituent spectra (endmember spectra) weighted by the corresponding abundance fractions (sources); 2) the sources are statistically independent. Independent factor analysis (IFA) extends ICA to linear mixtures of independent sources immersed in noise. Concerning hyperspectral data, the first assumption is valid whenever the multiple scattering among the distinct constituent substances (endmembers) is negligible and the surface is partitioned according to the fractional abundances. The second assumption, however, is violated, since the sum of the abundance fractions associated with each pixel is constant due to physical constraints in the data acquisition process. Thus, the sources cannot be statistically independent, which compromises the performance of ICA/IFA algorithms in hyperspectral unmixing. This paper studies the impact of hyperspectral source statistical dependence on ICA and IFA performance. We conclude that the accuracy of these methods tends to improve with increasing signature variability, number of endmembers, and signal-to-noise ratio. In any case, there are always endmembers that are incorrectly unmixed. We arrive at this conclusion by minimizing the mutual information of simulated and real hyperspectral mixtures. The computation of mutual information is based on fitting mixtures of Gaussians to the observed data. A method to sort ICA and IFA estimates in terms of the likelihood of being correctly unmixed is proposed.
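
As a rough illustration of the linear mixing model and of the dependence induced by the sum-to-one constraint (not the authors' code), the following sketch simulates a noisy hyperspectral mixture with Dirichlet-distributed abundances and attempts to unmix it with scikit-learn's FastICA; all sizes and parameters are assumed.

```python
# Minimal sketch: simulate a linear hyperspectral mixture and attempt to
# unmix it with ICA, illustrating why the sum-to-one constraint on the
# abundances conflicts with the independence assumption.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
n_bands, n_endmembers, n_pixels = 50, 3, 2000

# Endmember signatures (columns) and Dirichlet-distributed abundances,
# which sum to one per pixel and are therefore statistically dependent.
M = rng.uniform(0.1, 1.0, size=(n_bands, n_endmembers))
S = rng.dirichlet(alpha=np.ones(n_endmembers), size=n_pixels)  # (pixels, endmembers)

X = S @ M.T + 0.01 * rng.standard_normal((n_pixels, n_bands))  # observed spectra + noise

ica = FastICA(n_components=n_endmembers, random_state=0)
S_hat = ica.fit_transform(X)   # estimated sources (abundances, up to scale and order)
M_hat = ica.mixing_            # estimated mixing matrix (endmember spectra, up to scale)

# Because the true sources are dependent (constant sum), some endmembers are
# typically recovered poorly, which is the effect studied in the paper.
```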

Relevance:

20.00%

Publisher:

Abstract:

Linear unmixing decomposes a hyperspectral image into a collection of reflectance spectra of the materials present in the scene, called endmember signatures, and the corresponding abundance fractions at each pixel in a spatial area of interest. This paper introduces a new unmixing method, called Dependent Component Analysis (DECA), which overcomes the limitations of unmixing methods based on Independent Component Analysis (ICA) and on geometrical properties of hyperspectral data. DECA models the abundance fractions as mixtures of Dirichlet densities, thus enforcing the constraints on abundance fractions imposed by the acquisition process, namely non-negativity and constant sum. The mixing matrix is inferred by a generalized expectation-maximization (GEM) type algorithm. The performance of the method is illustrated using simulated and real data.
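
A generative sketch of the model DECA assumes, with hypothetical mixture weights and Dirichlet concentration parameters (this is not the authors' implementation, which infers these quantities from the observed data with a GEM algorithm):

```python
# Generative sketch of the DECA model assumptions: abundances drawn from a
# mixture of Dirichlet densities (non-negative, constant sum), then linearly
# mixed by the endmember signature matrix. All parameter values are invented.
import numpy as np

rng = np.random.default_rng(1)
n_bands, n_endmembers, n_pixels = 50, 3, 1000

M = rng.uniform(0.1, 1.0, size=(n_bands, n_endmembers))  # endmember signatures

# Mixture of two Dirichlet components over the abundance simplex
# (hypothetical weights and concentration parameters).
weights = np.array([0.6, 0.4])
alphas = np.array([[8.0, 1.0, 1.0],
                   [1.0, 4.0, 4.0]])

comp = rng.choice(len(weights), size=n_pixels, p=weights)
S = np.vstack([rng.dirichlet(alphas[k]) for k in comp])   # abundances per pixel

X = S @ M.T + 0.005 * rng.standard_normal((n_pixels, n_bands))  # observed spectra

# DECA would now infer M and the Dirichlet mixture parameters from X alone
# with a generalized expectation-maximization (GEM) type algorithm.
```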

Relevance:

20.00%

Publisher:

Abstract:

Chapter in Book Proceedings with Peer Review. First Iberian Conference on Pattern Recognition and Image Analysis (IbPRIA 2003), Puerto de Andratx, Mallorca, Spain, June 4-6, 2003. Proceedings.

Relevance:

20.00%

Publisher:

Abstract:

International Conference with Peer Review. 2012 IEEE International Geoscience and Remote Sensing Symposium (IGARSS), 22-27 July 2012, Munich, Germany.

Relevance:

20.00%

Publisher:

Abstract:

The constant evolution of the Internet and its increasing use in private and public activities, with a strong impact on their survival, has given rise to an emerging technology. Through cloud computing it is possible to abstract users from the layers below the business, so that they can focus only on what matters most to manage, with the advantage of being able to grow (or shrink) resources as needed. The cloud paradigm arises from the need to optimize IT resources within an emergent and rapidly expanding technology. In this regard, after a study of the most common cloud platforms and of the current implementation of these technologies at the Institute of Biomedical Sciences of Abel Salazar and the Faculty of Pharmacy of Oporto University, an evolution is proposed in order to meet certain requirements in the context of cloud computing.

Relevance:

20.00%

Publisher:

Abstract:

Funding agencies: FCT - PEstOE/FIS/UI0618/2011; PTDC/FIS/098254/2008; ERC-PATCHYCOLLOIDS and MIUR-PRIN.

Relevance:

20.00%

Publisher:

Abstract:

Critical review of the book "AMJAD, Muhammad; FRAZ, Muhammad Moazam - Developing corporate image in higher education sector: a case study of University of East Anglia Norwich, United Kingdom. Lisboa: LAP LAMBERT Academic Publishing, 2012".

Relevance:

20.00%

Publisher:

Abstract:

SCADA communication plays a vital role in Supervisory Control and Data Acquisition (SCADA) monitoring systems. Devices designed to operate in safety-critical environments are usually designed to fail safe, but security vulnerabilities could be exploited by an attacker to disable the fail-safe mechanisms. These devices must therefore be designed not only for safety but also for security. This paper presents a comparative study of different encryption schemes for securing SCADA component communication. The schemes analysed and evaluated are symmetric-key encryption in a wireless SCADA environment, asymmetric-key encryption for Internet SCADA, and the Cross Crypto Scheme Cipher for securing SCADA communication.
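
As a rough illustration of the kind of comparison involved (not the paper's test bed), the sketch below times symmetric AES-GCM against asymmetric RSA-OAEP encryption of a short SCADA-style telemetry message using Python's cryptography package; the message content, key sizes and timing method are our own assumptions.

```python
# Illustrative timing of symmetric versus asymmetric encryption of a short
# SCADA-style message (example setup only, not the paper's benchmark).
import os
import time
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

message = b"RTU-07;valve=OPEN;pressure=4.2bar"

# Symmetric: AES-256-GCM
key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)
t0 = time.perf_counter()
ct_sym = AESGCM(key).encrypt(nonce, message, None)
t_sym = time.perf_counter() - t0

# Asymmetric: RSA-2048 with OAEP padding
priv = rsa.generate_private_key(public_exponent=65537, key_size=2048)
t0 = time.perf_counter()
ct_asym = priv.public_key().encrypt(
    message,
    padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                 algorithm=hashes.SHA256(), label=None))
t_asym = time.perf_counter() - t0

print(f"AES-GCM: {t_sym * 1e6:.1f} us, RSA-OAEP: {t_asym * 1e6:.1f} us")
```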

Relevance:

20.00%

Publisher:

Abstract:

Critical infrastructures became more vulnerable to attacks from adversaries as SCADA systems became connected to the Internet. The open standards for SCADA communications make it very easy for attackers to gain in-depth knowledge about the working and operations of SCADA networks. A number of Internet SCADA security issues have been raised that compromise the authenticity, confidentiality, integrity and non-repudiation of information transferred between SCADA components. This paper presents an integration of the Cross Crypto Scheme Cipher to secure communications for SCADA components. The proposed scheme integrates the best features of both symmetric and asymmetric encryption techniques. It also utilizes the MD5 hashing algorithm to ensure the integrity of the information being transmitted.
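
A minimal sketch of a hybrid ("cross crypto") exchange of the kind described, under assumptions of our own: the payload is encrypted with a symmetric session key, the session key is wrapped with the receiver's RSA public key, and an MD5 digest is attached for integrity. The message layout and names are illustrative, not the paper's cipher.

```python
# Hybrid encryption sketch: AES-GCM for the payload, RSA-OAEP to wrap the
# session key, MD5 as the integrity check value (all choices illustrative).
import os
import hashlib
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

receiver_priv = rsa.generate_private_key(public_exponent=65537, key_size=2048)
receiver_pub = receiver_priv.public_key()

payload = b"SCADA:sensor-12:temperature=71.5C"
digest = hashlib.md5(payload).digest()              # integrity check value (16 bytes)

session_key = AESGCM.generate_key(bit_length=128)   # symmetric part
nonce = os.urandom(12)
ciphertext = AESGCM(session_key).encrypt(nonce, payload + digest, None)

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
wrapped_key = receiver_pub.encrypt(session_key, oaep)   # asymmetric part

# Receiver side: unwrap the session key, decrypt, and verify the MD5 digest.
key = receiver_priv.decrypt(wrapped_key, oaep)
plain = AESGCM(key).decrypt(nonce, ciphertext, None)
assert hashlib.md5(plain[:-16]).digest() == plain[-16:]
```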

Relevance:

20.00%

Publisher:

Abstract:

Final Master's project report submitted for the degree of Master in Electronics and Telecommunications Engineering.

Relevance:

20.00%

Publisher:

Abstract:

This study aimed to carry out experimental work to determine, for Newtonian and non-Newtonian fluids, the friction factor (fc) with simultaneous heat transfer, at constant wall temperature as boundary condition, in fully developed laminar flow inside a vertical helical coil. The Newtonian fluids studied were aqueous solutions of glycerol at 25%, 36%, 43%, 59% and 78% (w/w). The non-Newtonian fluids were aqueous solutions of carboxymethylcellulose (CMC), a polymer, with concentrations of 0.2%, 0.3%, 0.4% and 0.6% (w/w), and aqueous solutions of xanthan gum (XG), another polymer, with concentrations of 0.1% and 0.2% (w/w). According to the rheological study performed, the polymer solutions showed shear-thinning behavior and different degrees of viscoelasticity. The helical coil used has an internal diameter, curvature ratio, length and pitch of 0.00483 m, 0.0263, 5.0 m and 11.34 mm, respectively. It was concluded that the friction factors with simultaneous heat transfer for Newtonian fluids can be calculated using expressions from the literature for isothermal flows. The friction factors for the CMC and XG solutions are similar to those for Newtonian fluids when the Dean number, based on a generalized Reynolds number, is less than 80. For Dean numbers higher than 80, the friction factors of the CMC solutions are lower than those of the XG solutions and of the Newtonian fluids; in this range the friction factors decrease as the viscometric component of the solution increases and increase with increasing elastic component. The change of behavior at a Dean number of 80, for both Newtonian and non-Newtonian fluids, is in accordance with the study of Ali [4] and with previous studies. The data also showed that using the bulk temperature or the film temperature to calculate the physical properties of the fluid has a residual effect on the friction factor values.
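
For reference, a minimal sketch of the dimensionless groups mentioned above, using the usual Metzner-Reed definition of the generalized Reynolds number and De = Re * sqrt(d/D); the fluid properties in the example are hypothetical, not the measured ones.

```python
# Generalized (Metzner-Reed) Reynolds number for a power-law fluid and the
# corresponding generalized Dean number used to locate the De = 80 threshold.
import math

def generalized_reynolds(rho, u, d, K, n):
    """Metzner-Reed Reynolds number for a power-law fluid (tau = K * gamma_dot**n)."""
    return rho * u**(2.0 - n) * d**n / (K * ((3.0 * n + 1.0) / (4.0 * n))**n * 8.0**(n - 1.0))

def dean_number(re_gen, curvature_ratio):
    """Generalized Dean number, De = Re_gen * sqrt(d/D)."""
    return re_gen * math.sqrt(curvature_ratio)

# Hypothetical example for a CMC solution in the coil described
# (d = 0.00483 m, d/D = 0.0263); K and n would come from the rheological fits.
re_g = generalized_reynolds(rho=1000.0, u=0.4, d=0.00483, K=0.05, n=0.8)
print(dean_number(re_g, curvature_ratio=0.0263))
```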

Relevance:

20.00%

Publisher:

Abstract:

This study aimed to carry out experimental work to obtain, for Newtonian and non-Newtonian fluids, heat transfer coefficients, at constant wall temperature as boundary condition, in fully developed laminar flow inside a helical coil. The Newtonian fluids studied were aqueous solutions of glycerol at 25%, 36%, 43%, 59% and 78% (w/w), and the non-Newtonian fluids were aqueous solutions of carboxymethylcellulose (CMC), a polymer, with concentrations of 0.1%, 0.2%, 0.3%, 0.4% and 0.6% (w/w), and aqueous solutions of xanthan gum (XG), another polymer, with concentrations of 0.1% and 0.2% (w/w). According to the rheological study performed, the polymer solutions showed shear-thinning behavior and different degrees of elasticity. The helical coil used has an internal diameter, curvature ratio, length and pitch of 0.004575 m, 0.0263, 5.0 m and 11.34 mm, respectively. The Nusselt numbers for the CMC solutions are, on average, slightly higher than those for Newtonian fluids, for identical Prandtl and generalized Dean numbers; as an outcome, the viscous component of the shear-thinning polymer tends to enhance the mixing effect of the Dean cells. The Nusselt numbers of the XG solutions are significantly lower than those of the Newtonian solutions, for identical Prandtl and generalized Dean numbers; therefore, the elastic component of the polymer tends to diminish the mixing effect of the Dean cells. A global correlation for the Nusselt number as a function of the Péclet, generalized Dean and Weissenberg numbers, covering all the Newtonian and non-Newtonian solutions studied, is presented.
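
The remaining dimensionless groups entering the reported correlation can be sketched in the same spirit; the definitions below are the standard ones, while the example values are hypothetical and the fitted correlation coefficients are not reproduced here.

```python
# Standard definitions of the Péclet and Weissenberg numbers used in the
# reported Nu = f(Pe, De_gen, Wi) correlation (example values invented).
def peclet(reynolds: float, prandtl: float) -> float:
    """Péclet number, Pe = Re * Pr (advective versus diffusive heat transport)."""
    return reynolds * prandtl

def weissenberg(relaxation_time: float, shear_rate: float) -> float:
    """Weissenberg number, Wi = lambda * gamma_dot (elastic effects)."""
    return relaxation_time * shear_rate

print(peclet(reynolds=60.0, prandtl=400.0))                   # Pe
print(weissenberg(relaxation_time=0.02, shear_rate=150.0))    # Wi
```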

Relevance:

20.00%

Publisher:

Abstract:

The higher education system in Europe is currently under stress and the debates over its reform and future are gaining momentum. Now that, for most countries, we are in a time of change in society as a whole and in the whole education system, the legal and political dimensions have gained prominence, but this has not been matched by a more integrative approach to the problem of order, its reform and the issue of regulation, beyond the typical static and classical cost-benefit analyses. The two classical approaches to studying (and to designing the policy measures of) the problem of the reform of the higher education system - cost-benefit analysis and legal scholarship description - have to be integrated. The argument of our paper is that the very integration of economic and legal approaches, what Warren Samuels called the legal-economic nexus, is meaningful and necessary, especially if we want to address the problem of order (as formulated by Joseph Spengler) and the overall regulation of the system. On the one hand, and without neglecting the interest and insights gained from cost-benefit analysis or other value-for-money assessments, we focus our study on the legal, social and political aspects of the regulation of the higher education system and its reform in Portugal. On the other hand, the economic and financial problems have to be taken into account, but in a more inclusive way with regard to the indirect and other socio-economic costs not contemplated in traditional or standard assessments of policies for the tertiary education sector.

In the first section of the paper, we discuss the theoretical and conceptual underpinning of our analysis, focusing on the evolutionary approach, the role of critical institutions, the legal-economic nexus and the problem of order. All these elements are related to the institutional tradition, from Veblen and Commons to Spengler and Samuels. The second section states the problem of regulation in the higher education system and the issue of policy formulation for tackling it. The current situation is clearly one of crisis, with the expansion of the cohorts of young students coming to an end and recurrent scandals in private institutions. In the last decade, after a protracted period of extension or expansion of the system, i.e. the continuous growth in the number of students, universities and other institutions have been competing harder to attract students and have seen their financial situation put at risk. It seems that we are entering a period of radical uncertainty and higher competition, and the new configuration that is slowly building up is growth in intensity, which means upgrading the quality of higher learning and greater involvement in vocational training and lifelong learning. With this change, and along with other deep changes in Portuguese society and the economy, the current regulation has shown signs of maladjustment.

The third section consists of our conclusions on the current issue of regulation and the policy challenge. First, we underline the importance of an evolutionary approach to a process of change that is essentially dynamic. Special attention is given to the issues related to an evolutionary construal of policy analysis and formulation. Second, the integration of law and economics, through the notion of the legal-economic nexus, allows us to better define the issues of regulation and the concrete problems that universities are facing. One aspect is the instability of the political measures regarding the public administration, on which the higher education system depends financially, legally and institutionally, to say the least. A corollary is the lack of a clear strategy in the policy reforms. Third, our research criticizes several studies, such as the one made by the OECD in late 2006 for the Ministry of Science, Technology and Higher Education, for being too static and for neglecting fundamental aspects of regulation such as the logic of the actors, groups and organizations that are major players in the system. Finally, simply changing the legal rules will not necessarily, per se, change the behaviors that the authorities want to change. By this we mean that it is remiss of the policy maker to ignore some of the critical issues of regulation, namely the continuous non-respect, by the academic management and administrative bodies of universities, of the legal rules that were once promulgated. Changing the rules does not change the problem, especially without the necessary debates from the different relevant quarters that make up the higher education system; the issues of social interaction remain intact.

Our treatment of the matter is organized in the following way. In the first section, the theoretical principles are developed in order to study the transformation of higher education more adequately, with a modest evolutionary theory and a legal-economic nexus of the interactions of the system and the policy challenges. After describing, in the second section, the recent evolution and current working of higher education in Portugal, we analyze the legal framework and the current regulatory practices and problems in light of the theoretical framework adopted. We end with some conclusions on the current problems of regulation and the policy measures that have been discussed in recent years.

Relevance:

20.00%

Publisher:

Abstract:

This paper describes how MPEG-4 object-based video (OBV) can be used to allow selected objects to be inserted into the play-out stream to a specific user, based on a profile derived for that user. The application scenario described here is personalized product placement, and the paper considers the value of this application in the current and evolving commercial media distribution market, given the huge emphasis media distributors are currently placing on targeted advertising. This level of application of video content requires a sophisticated content description and metadata system (e.g., MPEG-7). The scenario considers the requirement for global libraries to provide the objects to be inserted into the streams. The paper then considers the commercial trading of objects between the libraries, video service providers, advertising agencies and other parties involved in the service. Consequently, a brokerage of video objects is proposed, based on negotiation and trading using intelligent agents representing the various parties. The proposed Media Brokerage Platform is a multi-agent system structured in two layers. In the top layer, there is a collection of coarse-grained agents representing the real-world players – the providers and deliverers of media content and the market regulator profiler – and, in the bottom layer, there is a set of finer-grained agents constituting the marketplace – the delegate agents and the market agent. For knowledge representation (domain, strategic and negotiation protocols) we propose a Semantic Web approach based on ontologies. The media component content should be represented in MPEG-7, and the metadata describing the objects to be traded should follow a specific ontology. The top-layer content providers and deliverers are modelled by intelligent autonomous agents that express their will to transact – buy or sell – media components by registering at a service registry. The market regulator profiler creates, according to the selected profile, a market agent, which, in turn, checks the service registry for potential trading partners for a given component and invites them to the marketplace. The subsequent negotiation and actual transaction are performed by delegate agents in accordance with their profiles and the predefined rules of the market.
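
A toy structural sketch of the two-layer arrangement described above may help fix ideas; the class and method names are ours, not the platform's API, and the negotiation step performed by the delegate agents is omitted.

```python
# Toy sketch of the brokerage structure: providers register offers in a
# service registry, and the market regulator profiler spawns a market agent
# that looks up potential trading partners for one media component.
from dataclasses import dataclass, field

@dataclass
class ServiceRegistry:
    offers: dict = field(default_factory=dict)   # component id -> provider names

    def register(self, provider: str, component: str) -> None:
        self.offers.setdefault(component, []).append(provider)

    def lookup(self, component: str) -> list:
        return self.offers.get(component, [])

@dataclass
class MarketAgent:
    component: str
    profile: str

    def invite_partners(self, registry: ServiceRegistry) -> list:
        # Invite potential trading partners; negotiation by delegate agents
        # would follow according to the profile and the market rules.
        return registry.lookup(self.component)

registry = ServiceRegistry()
registry.register("ProviderA", "object:soda-can")
registry.register("ProviderB", "object:soda-can")

market = MarketAgent(component="object:soda-can", profile="sports-fan-18-25")
print(market.invite_partners(registry))   # -> ['ProviderA', 'ProviderB']
```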