984 results for PROGRAMMING APPROACH


Relevance:

20.00%

Publisher:

Abstract:

Measuring the quality of a b-learning environment is critical to determining the success of a b-learning course. Several initiatives have recently been conducted on benchmarking and quality in e-learning. Despite these efforts to define and examine quality issues concerning online courses, a definitive instrument to evaluate quality remains one of the key challenges for blended learning, since it incorporates both traditional and online instruction methods. For this paper, six frameworks for the quality assessment of technology-enhanced learning were examined and compared with regard to their similarities and differences. These frameworks share the same global objective: the quality of e-learning environments/products. They present different perspectives but also many common issues. Some are more specific and related to the course, while others are more global and related to institutional aspects. In this work we collected and arranged all the quality criteria identified in order to obtain a more complete framework and to determine whether it fits our b-learning environment. We also included elements drawn from our own b-learning research and experience, acquired over more than 10 years. As a result, we have created a new quality reference with a set of dimensions and criteria that should be taken into account when analyzing, designing, developing, implementing and evaluating a b-learning environment. Besides these perspectives on what to do when developing a b-learning environment, we have also included pedagogical issues in order to give directions on how to do it so that the learning succeeds. The information, concepts and procedures presented here support teachers and instructors who intend to validate the quality of their blended learning courses.

Relevance:

20.00%

Publisher:

Abstract:

Dissertation presented at the Faculdade de Ciências e Tecnologia, Universidade Nova de Lisboa, to obtain the Master's degree in Informatics Engineering

Relevance:

20.00%

Publisher:

Abstract:

Hyperspectral remote sensing exploits the electromagnetic scattering patterns of different materials at specific wavelengths [2, 3]. Hyperspectral sensors have been developed to sample the scattered portion of the electromagnetic spectrum extending from the visible region through the near-infrared and mid-infrared, in hundreds of narrow contiguous bands [4, 5]. The number and variety of potential civilian and military applications of hyperspectral remote sensing is enormous [6, 7]. Very often, the resolution cell corresponding to a single pixel in an image contains several substances (endmembers) [4]. In this situation, the scattered energy is a mixture of the endmember spectra. A challenging task underlying many hyperspectral imagery applications is then decomposing a mixed pixel into a collection of reflectance spectra, called endmember signatures, and the corresponding abundance fractions [8–10]. Depending on the mixing scales at each pixel, the observed mixture is either linear or nonlinear [11, 12]. The linear mixing model holds approximately when the mixing scale is macroscopic [13] and there is negligible interaction among distinct endmembers [3, 14]. If, however, the mixing scale is microscopic (intimate mixtures) [15, 16] and the incident solar radiation is scattered by the scene through multiple bounces involving several endmembers [17], the linear model is no longer accurate. Linear spectral unmixing has been intensively researched in recent years [9, 10, 12, 18–21]. It considers that a mixed pixel is a linear combination of endmember signatures weighted by the corresponding abundance fractions. Under this model, and assuming that the number of substances and their reflectance spectra are known, hyperspectral unmixing is a linear problem for which many solutions have been proposed (e.g., maximum likelihood estimation [8], spectral signature matching [22], spectral angle mapper [23], subspace projection methods [24, 25], and constrained least squares [26]). In most cases, the number of substances and their reflectances are not known, and hyperspectral unmixing then falls into the class of blind source separation problems [27]. Independent component analysis (ICA) has recently been proposed as a tool to blindly unmix hyperspectral data [28–31]. ICA is based on the assumption of mutually independent sources (abundance fractions), which is not the case for hyperspectral data, since the sum of the abundance fractions is constant, implying statistical dependence among them. This dependence compromises the applicability of ICA to hyperspectral images, as shown in Refs. [21, 32]. In fact, ICA finds the endmember signatures by multiplying the spectral vectors with an unmixing matrix which minimizes the mutual information among the sources. If the sources are independent, ICA provides the correct unmixing, since the minimum of the mutual information is attained only when the sources are independent. This is no longer true for dependent abundance fractions. Nevertheless, some endmembers may be approximately unmixed. These aspects are addressed in Ref. [33]. Under the linear mixing model, the observations from a scene lie in a simplex whose vertices correspond to the endmembers. Several approaches [34–36] have exploited this geometric feature of hyperspectral mixtures [35]. The minimum volume transform (MVT) algorithm [36] determines the simplex of minimum volume containing the data. The method presented in Ref.
[37] is also of MVT type but, by introducing the notion of bundles, it takes into account the endmember variability usually present in hyperspectral mixtures. The MVT-type approaches are complex from the computational point of view. Usually, these algorithms first find the convex hull defined by the observed data and then fit a minimum volume simplex to it. For example, the gift wrapping algorithm [38] computes the convex hull of n data points in a d-dimensional space with a computational complexity of O(n^(⌊d/2⌋+1)), where ⌊x⌋ is the largest integer less than or equal to x and n is the number of samples. The complexity of the method presented in Ref. [37] is even higher, since the temperature of the simulated annealing algorithm used must follow a log(·) law [39] to assure convergence (in probability) to the desired solution. Aiming at a lower computational complexity, some algorithms such as the pixel purity index (PPI) [35] and N-FINDR [40] still find the minimum volume simplex containing the data cloud, but they assume the presence of at least one pure pixel of each endmember in the data. This is a strong requisite that may not hold in some data sets. In any case, these algorithms find the set of most pure pixels in the data. The PPI algorithm uses the minimum noise fraction (MNF) [41] as a preprocessing step to reduce dimensionality and to improve the signal-to-noise ratio (SNR). The algorithm then projects every spectral vector onto skewers (a large number of random vectors) [35, 42, 43]. The points corresponding to extremes, for each skewer direction, are stored. A cumulative account records the number of times each pixel (i.e., a given spectral vector) is found to be an extreme. The pixels with the highest scores are the purest ones. The N-FINDR algorithm [40] is based on the fact that in p spectral dimensions, the p-volume defined by a simplex formed by the purest pixels is larger than any other volume defined by any other combination of pixels. This algorithm finds the set of pixels defining the largest volume by inflating a simplex inside the data. ORASIS [44, 45] is a hyperspectral framework developed by the U.S. Naval Research Laboratory consisting of several algorithms organized in six modules: exemplar selector, adaptive learner, demixer, knowledge base or spectral library, and spatial postprocessor. The first step consists in flat-fielding the spectra. Next, the exemplar selection module is used to select spectral vectors that best represent the smaller convex cone containing the data. The other pixels are rejected when the spectral angle distance (SAD) is less than a given threshold. The procedure finds the basis for a subspace of lower dimension using a modified Gram–Schmidt orthogonalization. The selected vectors are then projected onto this subspace and a simplex is found by an MVT process. ORASIS is oriented to real-time target detection from uncrewed air vehicles using hyperspectral data [46]. In this chapter we develop a new algorithm, termed vertex component analysis (VCA), to unmix linear mixtures of endmember spectra. First, the algorithm determines the number of endmembers and the signal subspace using a newly developed concept [47, 48]. Second, the algorithm extracts the most pure pixels present in the data. Unlike other methods, this algorithm is completely automatic and unsupervised. To estimate the number of endmembers and the signal subspace in hyperspectral linear mixtures, the proposed scheme begins by estimating the signal and noise correlation matrices.
The latter is based on multiple regression theory. The signal subspace is then identified by selecting the set of signal eigenvalues that best represents the data in the least-squares sense [48, 49]; we note, however, that VCA works both with projected and with unprojected data. The extraction of the endmembers exploits two facts: (1) the endmembers are the vertices of a simplex and (2) the affine transformation of a simplex is also a simplex. Like the PPI and N-FINDR algorithms, VCA also assumes the presence of pure pixels in the data. The algorithm iteratively projects the data onto a direction orthogonal to the subspace spanned by the endmembers already determined. The new endmember signature corresponds to the extreme of the projection. The algorithm iterates until all endmembers are exhausted. VCA performs much better than PPI and better than or comparably to N-FINDR; yet it has a computational complexity between one and two orders of magnitude lower than N-FINDR. The chapter is structured as follows. Section 19.2 describes the fundamentals of the proposed method. Section 19.3 and Section 19.4 evaluate the proposed algorithm using simulated and real data, respectively. Section 19.5 presents some concluding remarks.
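As a concrete illustration of the geometry described above, the following minimal sketch generates synthetic linear mixtures under the pure-pixel assumption and extracts endmember candidates by repeatedly projecting the data onto a direction orthogonal to the span of the endmembers already found, taking the extreme of each projection. It is only a sketch of the projection idea, not the authors' VCA implementation (it omits the subspace identification and noise-estimation steps); all sizes and names are illustrative, and the final abundance step uses nonnegative least squares as one instance of the constrained least-squares options mentioned earlier.

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)

# Synthetic scene: p endmembers over L spectral bands, n mixed pixels.
L, p, n = 50, 3, 1000
M = rng.uniform(0.1, 1.0, size=(L, p))           # endmember signature matrix

# Abundance fractions on the simplex (nonnegative, summing to one), with
# a few pure pixels injected so that the pure-pixel assumption holds.
A = rng.dirichlet(np.ones(p), size=n).T           # p x n abundances
A[:, :p] = np.eye(p)                              # pure pixels
X = M @ A + 0.001 * rng.standard_normal((L, n))   # observed spectra (linear model)

# Iteratively take the extreme of the data projected onto a direction
# orthogonal to the subspace spanned by the endmembers found so far.
E = np.zeros((L, p))                              # estimated endmember signatures
f = rng.standard_normal(L)                        # initial random direction
for i in range(p):
    E[:, i] = X[:, np.argmax(np.abs(f @ X))]      # extreme of the projection
    Q, _ = np.linalg.qr(E[:, : i + 1])            # orthonormal basis of the span
    f = rng.standard_normal(L)
    f -= Q @ (Q.T @ f)                            # project out that span

# Abundances for one pixel via nonnegative least squares, one of the
# constrained least-squares options mentioned above.
a_hat, _ = nnls(E, X[:, 10])
print("estimated abundances for pixel 10:", np.round(a_hat / a_hat.sum(), 3))
```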

Relevance:

20.00%

Publisher:

Abstract:

Dissertation presented to obtain a Ph.D. degree in Engineering and Technology Sciences, Systems Biology at the Instituto de Tecnologia Química e Biológica, Universidade Nova de Lisboa

Relevance:

20.00%

Publisher:

Abstract:

The importance of wind power for energy and environmental policies has been growing in recent years. However, because of its random nature over time, wind generation cannot be reliably dispatched or perfectly forecast, which becomes a challenge when integrating this production into power systems. In addition, wind energy has to cope with the diversity of production resulting from alternative wind power profiles located in different regions. In 2012, Portugal's cumulative installed capacity was distributed over 223 wind farms [1]. In this work, circular data statistical methods are used to analyze and compare alternative spatial wind generation profiles. Variables indicating extreme situations are analyzed. The hour(s) of the day at which a farm's production attains its daily maximum is considered. This variable was converted into a circular variable, and the use of circular statistics makes it possible to identify the daily hour distribution for different wind production profiles. The methodology was applied to a real case, considering data from the Portuguese power system for the year 2012 at 15-minute intervals. Six geographical locations were considered, representing different wind generation profiles in the Portuguese system.
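A minimal sketch of the circular-statistics step described above, on hypothetical peak-production hours: the hour of the day is mapped to an angle, and the circular mean and mean resultant length are computed. This is what makes hours close to midnight (e.g., 23.5 and 0.5) average near midnight rather than midday.

```python
import numpy as np

hours = np.array([23.0, 0.5, 1.25, 22.75, 0.0])   # hypothetical daily peak hours
theta = 2 * np.pi * hours / 24                     # hour of day -> angle in radians

# Circular mean: the angle of the mean resultant vector.
C, S = np.cos(theta).mean(), np.sin(theta).mean()
mean_angle = np.arctan2(S, C) % (2 * np.pi)
mean_hour = 24 * mean_angle / (2 * np.pi)
R = np.hypot(C, S)                                 # resultant length in [0, 1]

print(f"circular mean hour: {mean_hour:.2f}, concentration R: {R:.3f}")
```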

Relevance:

20.00%

Publisher:

Abstract:

Lunacloud is a cloud service provider with offices in Portugal, Spain, France and the UK that focuses on delivering reliable, elastic and low-cost cloud Infrastructure as a Service (IaaS) solutions. The company currently relies on a proprietary IaaS platform - the Parallels Automation for Cloud Infrastructure (PACI) - and wishes to expand and integrate other IaaS solutions seamlessly, namely open source solutions. This is the challenge addressed in this thesis. This proposal, which was fostered by the Eurocloud Portugal Association, contributes to the promotion of interoperability and standardisation in Cloud Computing. The goal is to investigate, propose and develop an interoperable open source solution with standard interfaces for the integrated management of IaaS Cloud Computing resources, based on new as well as existing abstraction libraries or frameworks. The solution should provide both Web and application programming interfaces. The research conducted consisted of two surveys covering existing open source IaaS platforms and PACI (features and API) and open source IaaS abstraction solutions. The first study focused on the characteristics of the most popular open source IaaS platforms, namely OpenNebula, OpenStack, CloudStack and Eucalyptus, as well as PACI, and included a thorough inventory of the provided Application Programming Interfaces (API), i.e., offered operations, followed by a comparison of these platforms in order to establish their similarities and dissimilarities. The second study, on existing open source interoperability solutions, included the analysis and comparison of existing abstraction libraries and frameworks. The approach proposed and adopted, which was supported by the conclusions of the surveys carried out, reuses an existing open source abstraction solution - the Apache Deltacloud framework. Deltacloud relies on the development of software driver modules to interface with different IaaS platforms, officially provides and supports drivers for sixteen IaaS platforms, including OpenNebula and OpenStack, and allows the development of new provider drivers. The latter functionality was used to develop a new Deltacloud driver for PACI. Furthermore, Deltacloud provides a Web dashboard and REpresentational State Transfer (REST) API interfaces. To evaluate the adopted solution, a test bed integrating OpenNebula, OpenStack and PACI nodes was assembled and deployed. The tests conducted involved elapsed time and data payload measurements via the Deltacloud framework as well as via the pre-existing IaaS platform APIs. The Deltacloud framework behaved as expected, i.e., it introduced additional delays, but no substantial overheads. Both the Web and the REST interfaces were tested and showed identical measurements. The developed interoperable solution for the seamless integration and provision of IaaS resources from the PACI, OpenNebula and OpenStack IaaS platforms fulfils the specified requirements, i.e., it provides Lunacloud with the ability to expand the range of adopted IaaS platforms and offers a Web dashboard and REST API for integrated management. The contributions of this work include the surveys and comparisons made, the selection of the abstraction framework and, last but not least, the PACI driver developed.
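For illustration, a hedged sketch of how a client might query a Deltacloud server's REST interface to list instances from one of the integrated back-ends. The endpoint layout and the X-Deltacloud-Driver header follow Apache Deltacloud conventions, but the host, port, credentials and driver name below are placeholders, not values from the thesis.

```python
import requests

DELTACLOUD = "http://localhost:3001/api"           # assumed deltacloudd address

resp = requests.get(
    f"{DELTACLOUD}/instances",
    headers={"Accept": "application/json",
             "X-Deltacloud-Driver": "openstack"},  # select the back-end driver
    auth=("user", "secret"),                       # back-end credentials
    timeout=10,
)
resp.raise_for_status()
for inst in resp.json().get("instances", []):
    print(inst.get("id"), inst.get("state"))
```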

Relevance:

20.00%

Publisher:

Abstract:

Dissertation presented at the Faculdade de Ciências e Tecnologia, Universidade Nova de Lisboa, to obtain the Master's degree in Informatics Engineering

Relevance:

20.00%

Publisher:

Abstract:

Dissertation presented to obtain the Ph.D. degree in Biochemistry from the Universidade Nova de Lisboa, Faculdade de Ciências e Tecnologia

Relevance:

20.00%

Publisher:

Abstract:

According to the hedonic price method, the price of a good is related to the characteristics or the services it provides. Within this framework, the aim of this study is to examine the effect on room rates of different characteristics of hotels in and around the city of Porto, such as star category, size, room and service quality, hotel facilities and location. A hedonic price function was estimated using data for 51 hotels. The results make it possible to identify the attributes that are important to consumers and hoteliers, and to what extent. This information can be used by hotel managers to define a price strategy and can be helpful in new investment decisions.
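A minimal sketch of a log-linear hedonic price regression of the kind described, fitted on fabricated data; the attribute names (stars, rooms, seafront) and coefficients are illustrative assumptions, not the study's 51-hotel dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 51
df = pd.DataFrame({
    "stars": rng.integers(2, 6, n),                # star category
    "rooms": rng.integers(20, 300, n),             # hotel size
    "seafront": rng.integers(0, 2, n),             # location dummy
})
# Hypothetical data-generating process for the nightly room rate.
df["rate"] = np.exp(3.5 + 0.25 * df.stars + 0.001 * df.rooms
                    + 0.15 * df.seafront + 0.1 * rng.standard_normal(n))

# ln(rate) regressed on hotel attributes; each coefficient reads as the
# approximate percentage effect of that attribute on the room rate.
model = smf.ols("np.log(rate) ~ stars + rooms + seafront", data=df).fit()
print(model.params)
```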

Relevance:

20.00%

Publisher:

Abstract:

Journal of Proteome Research (2006) 5: 2720–2726

Relevance:

20.00%

Publisher:

Abstract:

We derived an integer programming framework, based on the properties of a linear ordering of the vertices in interval graphs, that acts as an edge completion model for obtaining interval graphs. This model can be applied to problems of sequencing cutting patterns, namely the minimization of open stacks problem (MOSP). By making small modifications to the objective function and using only some of the inequalities, the MOSP model is applied to another pattern sequencing problem that aims to minimize not only the number of stacks but also the order spread (the minimization of the stack occupation problem), and the model is tested.
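To make the MOSP objective concrete, the following sketch evaluates, for a toy instance, the maximum number of simultaneously open stacks of a pattern sequence (a stack for a piece type is open from the first to the last pattern that produces it) and finds the best sequence by brute force; the integer programming model replaces this enumeration for instances of realistic size. The patterns themselves are hypothetical.

```python
from itertools import permutations

patterns = {           # pattern -> piece types it produces (hypothetical)
    "p1": {"a", "b"},
    "p2": {"b", "c"},
    "p3": {"c", "d"},
    "p4": {"a", "d"},
}

def max_open_stacks(seq):
    """Maximum number of stacks open at once for a given pattern sequence."""
    pieces = set().union(*patterns.values())
    first = {t: next(i for i, p in enumerate(seq) if t in patterns[p]) for t in pieces}
    last = {t: max(i for i, p in enumerate(seq) if t in patterns[p]) for t in pieces}
    return max(sum(first[t] <= i <= last[t] for t in pieces) for i in range(len(seq)))

# Brute force over all sequences, feasible only for tiny instances.
best = min(permutations(patterns), key=max_open_stacks)
print(best, max_open_stacks(best))
```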

Relevance:

20.00%

Publisher:

Abstract:

The minimum interval graph completion problem consists of, given a graph G = (V, E), finding a supergraph H = (V, E ∪ F) that is an interval graph while adding the fewest edges |F|. We present an integer programming formulation for solving the minimum interval graph completion problem based on a characterization of interval graphs that produces a linear ordering of the maximal cliques of the solution graph.
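The characterization used here can be illustrated with a brute-force checker: a graph is an interval graph exactly when its maximal cliques admit a linear ordering in which the cliques containing any given vertex appear consecutively. The sketch below only tests this property on small graphs; it is not the integer programming formulation itself.

```python
from itertools import permutations
import networkx as nx

def is_interval(G):
    """Check the consecutive-cliques characterization by enumeration."""
    cliques = list(nx.find_cliques(G))             # maximal cliques
    for order in permutations(cliques):
        ok = True
        for v in G:
            idx = [i for i, c in enumerate(order) if v in c]
            if idx and idx[-1] - idx[0] + 1 != len(idx):   # not consecutive
                ok = False
                break
        if ok:
            return True
    return False

# A chordless 4-cycle is not an interval graph; a path is.
print(is_interval(nx.cycle_graph(4)), is_interval(nx.path_graph(4)))
```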

Relevance:

20.00%

Publisher:

Abstract:

The smart grid concept is a key issue in future power systems, namely at the distribution level, with deep implications for the operation and planning of these systems. Several advantages and benefits for both the technical and the economic operation of the power system and of the electricity markets are recognized. The increasing integration of demand response and distributed generation resources, mostly with small-scale distributed characteristics, leads to the need for aggregating entities such as Virtual Power Players. Operation business models become more complex in the context of smart grid operation. Computational intelligence methods can be used to provide a suitable solution for the resource scheduling problem while respecting its time constraints. This paper proposes a methodology for the joint dispatch of demand response and distributed generation to provide energy and reserve by a virtual power player that operates a distribution network. The optimal schedule minimizes the operation costs and is obtained using a particle swarm optimization approach, which is compared with a deterministic approach used as the reference methodology. The proposed method is applied to a 33-bus distribution network with 32 medium-voltage consumers and 66 distributed generation units.
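A minimal particle swarm optimization sketch in the spirit of the scheduling approach described above, applied to a toy dispatch problem: choose the outputs of a few generation units to cover a load at minimum quadratic cost, with the balance constraint handled by a penalty. All unit data are illustrative, not the paper's 33-bus case study.

```python
import numpy as np

rng = np.random.default_rng(2)
n_units, load = 4, 10.0
a = np.array([0.10, 0.12, 0.08, 0.15])            # quadratic cost coefficients
b = np.array([2.0, 1.8, 2.5, 1.6])                # linear cost coefficients
pmax = np.array([4.0, 5.0, 3.0, 6.0])             # unit capacity limits

def cost(P):
    """Generation cost plus a penalty for violating the energy balance."""
    return (a * P**2 + b * P).sum() + 1e3 * abs(P.sum() - load)

n_part, iters = 30, 200
X = rng.uniform(0, pmax, (n_part, n_units))        # particle positions
V = np.zeros_like(X)
pbest, pbest_f = X.copy(), np.array([cost(x) for x in X])
gbest = pbest[pbest_f.argmin()].copy()

for _ in range(iters):
    r1, r2 = rng.random(X.shape), rng.random(X.shape)
    V = 0.7 * V + 1.5 * r1 * (pbest - X) + 1.5 * r2 * (gbest - X)
    X = np.clip(X + V, 0, pmax)                    # respect capacity limits
    f = np.array([cost(x) for x in X])
    better = f < pbest_f
    pbest[better], pbest_f[better] = X[better], f[better]
    gbest = pbest[pbest_f.argmin()].copy()

print("dispatch:", np.round(gbest, 2), "cost:", round(cost(gbest), 2))
```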

Relevance:

20.00%

Publisher:

Abstract:

Recent changes in electricity markets (EMs) have been fostering the globalization of distributed generation. With distributed generation, the number of players acting in the EMs and connected to the main grid has grown, increasing market complexity. Multi-agent simulation arises as an interesting way of analysing players' behaviour and interactions, namely coalitions of players, as well as their effects on the market. MASCEM was developed to allow studying the market operation of several different players, and MASGriP is being developed to allow the simulation of the micro grid and smart grid concepts in very different scenarios. This paper presents a methodology based on artificial intelligence (AI) techniques for the management of a micro grid. The use of fuzzy logic is proposed for the analysis of agents' consumption elasticity, while case-based reasoning, used to predict agents' reactions to price changes, is an interesting tool for the micro grid operator.
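As a rough illustration of the fuzzy-logic idea mentioned above (not the MASGriP implementation), the sketch below grades a consumer's price elasticity with triangular membership functions and combines the rule consequents into a single expected demand-reduction figure; all breakpoints and consequents are invented for the example.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

elasticity = 0.45                                  # hypothetical measured value
low = tri(elasticity, -0.5, 0.0, 0.5)              # memberships in three classes
med = tri(elasticity, 0.2, 0.5, 0.8)
high = tri(elasticity, 0.5, 1.0, 1.5)

# Weighted (centroid-style) defuzzification of the expected demand
# reduction for a given price increase, one consequent per class.
consequents = np.array([0.02, 0.10, 0.25])         # fractional demand drop
weights = np.array([low, med, high])
result = (weights * consequents).sum() / weights.sum()
print("expected demand reduction:", round(float(result), 3))
```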

Relevance:

20.00%

Publisher:

Abstract:

This paper presents a decision support methodology to help virtual power players (VPPs) in the Smart Grid (SG) context solve the day-ahead energy resource scheduling problem considering the intensive use of Distributed Generation (DG) and Vehicle-to-Grid (V2G). The main focus is the application of a new hybrid method combining a particle swarm approach and a deterministic technique based on mixed-integer linear programming (MILP) to solve the day-ahead scheduling problem, minimizing total operation costs from the aggregator's point of view. A realistic mathematical formulation, considering the electric network constraints and the V2G charging and discharging efficiencies, is presented. A full AC power flow calculation is included in the hybrid method so that the network constraints can be taken into account. A case study with a 33-bus distribution network and 1800 V2G resources is used to illustrate the performance of the proposed method.
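A compact sketch of the deterministic (MILP) side of such a hybrid method on a toy instance: one DG unit and one V2G resource scheduled over four periods, with charge/discharge efficiencies and no simultaneous charging and discharging. It assumes the PuLP modelling library, and all costs and limits are illustrative; it is not the paper's formulation and omits the AC power flow and the 33-bus network.

```python
import pulp

T = range(4)                                       # four scheduling periods
load = [3.0, 5.0, 4.0, 6.0]                        # demand per period (kWh)

prob = pulp.LpProblem("day_ahead", pulp.LpMinimize)
dg = pulp.LpVariable.dicts("dg", T, 0, 6)          # DG output
ch = pulp.LpVariable.dicts("ch", T, 0, 2)          # V2G charging
dis = pulp.LpVariable.dicts("dis", T, 0, 2)        # V2G discharging
u = pulp.LpVariable.dicts("u", T, cat="Binary")    # 1 = discharging allowed

eta_c, eta_d = 0.9, 0.9                            # charge/discharge efficiencies
soc0, cap = 2.0, 4.0                               # initial state of charge, capacity

# Operation cost: DG energy plus V2G charging, minus a discharge credit.
prob += pulp.lpSum(0.08 * dg[t] + 0.02 * ch[t] - 0.01 * dis[t] for t in T)

soc = soc0
for t in T:
    prob += dg[t] + dis[t] == load[t] + ch[t]      # energy balance per period
    prob += dis[t] <= 2 * u[t]                     # no simultaneous
    prob += ch[t] <= 2 * (1 - u[t])                # charge and discharge
    soc = soc + eta_c * ch[t] - dis[t] / eta_d     # battery dynamics
    prob += soc <= cap
    prob += soc >= 0

prob.solve(pulp.PULP_CBC_CMD(msg=0))
print("status:", pulp.LpStatus[prob.status])
for t in T:
    print(f"t{t}: dg={dg[t].value():.1f} ch={ch[t].value():.1f} dis={dis[t].value():.1f}")
```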