917 results for Approach to CSR development
Abstract:
The Evidence Accumulation Clustering (EAC) paradigm is a clustering ensemble method that derives a consensus partition from a collection of base clusterings obtained using different algorithms. It collects from the partitions in the ensemble a set of pairwise observations about the co-occurrence of objects in the same cluster, and uses these co-occurrence statistics to derive a similarity matrix referred to as the co-association matrix. The Probabilistic Evidence Accumulation for Clustering Ensembles (PEACE) algorithm is a principled approach for extracting a consensus clustering from the observations encoded in the co-association matrix, based on a probabilistic model of the co-association matrix parameterized by the unknown assignments of objects to clusters. In this paper we extend the PEACE algorithm by deriving a consensus solution through a MAP approach with Dirichlet priors defined over the unknown probabilistic cluster assignments. In particular, we study the positive regularization effect of Dirichlet priors on the final consensus solution with both synthetic and real benchmark data.
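As an illustration of the evidence accumulation step, the sketch below builds a co-association matrix from an ensemble of base partitions. It is a minimal, hypothetical helper in Python, not the authors' implementation: each entry records the fraction of base clusterings in which two objects share a cluster.

```python
import numpy as np

def co_association(partitions):
    """Accumulate pairwise co-occurrence evidence from an ensemble.

    partitions: list of 1-D integer arrays, each assigning the n
    objects to clusters in one base clustering.
    Returns the n x n co-association matrix with entries in [0, 1].
    """
    n = len(partitions[0])
    C = np.zeros((n, n))
    for labels in partitions:
        labels = np.asarray(labels)
        # objects i and j co-occur when they share a cluster label
        C += (labels[:, None] == labels[None, :]).astype(float)
    return C / len(partitions)

# Example: three base partitions of five objects
ensemble = [np.array([0, 0, 1, 1, 1]),
            np.array([0, 0, 0, 1, 1]),
            np.array([1, 1, 0, 0, 0])]
print(co_association(ensemble))
```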
Abstract:
Master's in Informatics Engineering - Specialization Area in Knowledge and Decision Technologies
Abstract:
Hyperspectral remote sensing exploits the electromagnetic scattering patterns of different materials at specific wavelengths [2, 3]. Hyperspectral sensors have been developed to sample the scattered portion of the electromagnetic spectrum extending from the visible region through the near-infrared and mid-infrared, in hundreds of narrow contiguous bands [4, 5]. The number and variety of potential civilian and military applications of hyperspectral remote sensing is enormous [6, 7]. Very often, the resolution cell corresponding to a single pixel in an image contains several substances (endmembers) [4]. In this situation, the scattered energy is a mixture of the endmember spectra. A challenging task underlying many hyperspectral imagery applications is then decomposing a mixed pixel into a collection of reflectance spectra, called endmember signatures, and the corresponding abundance fractions [8-10]. Depending on the mixing scale at each pixel, the observed mixture is either linear or nonlinear [11, 12]. The linear mixing model holds approximately when the mixing scale is macroscopic [13] and the interaction among distinct endmembers is negligible [3, 14]. If, however, the mixing scale is microscopic (intimate mixtures) [15, 16] and the incident solar radiation is scattered by the scene through multiple bounces involving several endmembers [17], the linear model is no longer accurate.

Linear spectral unmixing has been intensively researched in recent years [9, 10, 12, 18-21]. It considers a mixed pixel to be a linear combination of endmember signatures weighted by the corresponding abundance fractions. Under this model, and assuming that the number of substances and their reflectance spectra are known, hyperspectral unmixing is a linear problem for which many solutions have been proposed (e.g., maximum likelihood estimation [8], spectral signature matching [22], spectral angle mapper [23], subspace projection methods [24, 25], and constrained least squares [26]). In most cases, the number of substances and their reflectances are not known, and hyperspectral unmixing then falls into the class of blind source separation problems [27]. Independent component analysis (ICA) has recently been proposed as a tool to blindly unmix hyperspectral data [28-31]. ICA is based on the assumption of mutually independent sources (abundance fractions), which is not the case for hyperspectral data: since the sum of the abundance fractions is constant, they are statistically dependent. This dependence compromises the applicability of ICA to hyperspectral images, as shown in Refs. [21, 32]. In fact, ICA finds the endmember signatures by multiplying the spectral vectors by an unmixing matrix that minimizes the mutual information among the sources. If the sources are independent, ICA provides the correct unmixing, since the minimum of the mutual information is attained only when the sources are independent. This is no longer true for dependent abundance fractions. Nevertheless, some endmembers may be approximately unmixed. These aspects are addressed in Ref. [33].

Under the linear mixing model, the observations from a scene lie in a simplex whose vertices correspond to the endmembers. Several approaches [34-36] have exploited this geometric feature of hyperspectral mixtures [35]. The minimum volume transform (MVT) algorithm [36] determines the simplex of minimum volume containing the data. The method presented in Ref. [37] is also of MVT type but, by introducing the notion of bundles, it takes into account the endmember variability usually present in hyperspectral mixtures. MVT-type approaches are computationally complex. Usually, these algorithms first find the convex hull defined by the observed data and then fit a minimum volume simplex to it. For example, the gift wrapping algorithm [38] computes the convex hull of n data points in a d-dimensional space with a computational complexity of O(n^{⌊d/2⌋+1}), where ⌊x⌋ is the largest integer less than or equal to x and n is the number of samples. The complexity of the method presented in Ref. [37] is even higher, since the temperature of the simulated annealing algorithm it uses must follow a log(·) law [39] to assure convergence (in probability) to the desired solution.

Aiming at a lower computational complexity, algorithms such as the pixel purity index (PPI) [35] and N-FINDR [40] still find the minimum volume simplex containing the data cloud, but they assume the presence in the data of at least one pure pixel of each endmember. This is a strong requirement that may not hold in some data sets. In any case, these algorithms find the set of most pure pixels in the data. The PPI algorithm uses the minimum noise fraction (MNF) [41] as a preprocessing step to reduce dimensionality and improve the signal-to-noise ratio (SNR). The algorithm then projects every spectral vector onto skewers (a large number of random vectors) [35, 42, 43]. The points corresponding to the extremes, for each skewer direction, are stored. A cumulative account records the number of times each pixel (i.e., a given spectral vector) is found to be an extreme. The pixels with the highest scores are the purest ones. The N-FINDR algorithm [40] is based on the fact that, in p spectral dimensions, the p-volume defined by a simplex formed by the purest pixels is larger than that defined by any other combination of pixels. This algorithm finds the set of pixels defining the largest volume by inflating a simplex inside the data. ORASIS [44, 45] is a hyperspectral framework developed by the U.S. Naval Research Laboratory consisting of several algorithms organized in six modules: exemplar selector, adaptive learner, demixer, knowledge base or spectral library, and spatial postprocessor. The first step consists of flat-fielding the spectra. Next, the exemplar selection module is used to select the spectral vectors that best represent the smaller convex cone containing the data. The other pixels are rejected when their spectral angle distance (SAD) is less than a given threshold. The procedure finds the basis for a lower-dimensional subspace using a modified Gram–Schmidt orthogonalization. The selected vectors are then projected onto this subspace, and a simplex is found by an MVT process. ORASIS is oriented to real-time target detection from uncrewed air vehicles using hyperspectral data [46].

In this chapter we develop a new algorithm, vertex component analysis (VCA), to unmix linear mixtures of endmember spectra. First, the algorithm determines the number of endmembers and the signal subspace using a newly developed concept [47, 48]. Second, it extracts the most pure pixels present in the data. Unlike other methods, this algorithm is completely automatic and unsupervised. To estimate the number of endmembers and the signal subspace in hyperspectral linear mixtures, the proposed scheme begins by estimating the signal and noise correlation matrices, the latter based on multiple regression theory. The signal subspace is then identified by selecting the set of signal eigenvalues that best represents the data in the least-squares sense [48, 49]. We note, however, that VCA works with both projected and unprojected data. The extraction of the endmembers exploits two facts: (1) the endmembers are the vertices of a simplex, and (2) the affine transformation of a simplex is also a simplex. Like the PPI and N-FINDR algorithms, VCA assumes the presence of pure pixels in the data. The algorithm iteratively projects the data onto a direction orthogonal to the subspace spanned by the endmembers already determined; the new endmember signature corresponds to the extreme of this projection. The algorithm iterates until all endmembers are exhausted. VCA performs much better than PPI and better than or comparably to N-FINDR, yet its computational complexity is between one and two orders of magnitude lower than that of N-FINDR. The chapter is structured as follows. Section 19.2 describes the fundamentals of the proposed method. Sections 19.3 and 19.4 evaluate the proposed algorithm using simulated and real data, respectively. Section 19.5 presents some concluding remarks.
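The extraction step just described lends itself to a compact sketch. The Python fragment below is a simplified illustration of pure-pixel extraction by iterated orthogonal projection, assuming pure pixels are present and omitting the signal-subspace estimation and noise handling the chapter describes; it is not the full VCA algorithm, and all names are hypothetical.

```python
import numpy as np

def extract_endmembers(R, p, seed=0):
    """Simplified pure-pixel extraction in the spirit of VCA.

    R: (bands x pixels) matrix of spectral vectors.
    p: number of endmembers to extract.
    Repeatedly projects the data onto a direction orthogonal to the
    subspace spanned by the endmembers found so far and keeps the
    pixel with the extreme projection.
    """
    bands, n = R.shape
    E = np.zeros((bands, p))        # extracted endmember signatures
    indices = []
    rng = np.random.default_rng(seed)
    for k in range(p):
        if k == 0:
            d = rng.standard_normal(bands)       # random initial direction
        else:
            # projector onto the orthogonal complement of span(E[:, :k])
            A = E[:, :k]
            P = np.eye(bands) - A @ np.linalg.pinv(A)
            d = P @ rng.standard_normal(bands)
        proj = d @ R
        j = int(np.argmax(np.abs(proj)))         # extreme of the projection
        E[:, k] = R[:, j]
        indices.append(j)
    return E, indices
```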
Abstract:
This paper proposes a methodology to increase the probability of delivering power to any load point through the identification of new investments. The methodology uses a fuzzy set approach to model the uncertainty of outage parameters, load and generation. A DC fuzzy multicriteria optimization model, considering the Pareto front and based on mixed-integer non-linear programming, is developed to identify adequate investments in distribution network components that increase the probability of delivering power to all customers in the distribution network at the minimum possible cost for the system operator, while minimizing the cost of non-supplied energy. To illustrate the application of the proposed methodology, the paper includes a case study based on a 33-bus distribution network.
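The abstract does not detail the optimization model, but the Pareto-front idea it relies on, trading investment cost against non-supplied energy cost, can be illustrated with a minimal dominance filter in Python. All names and values below are hypothetical:

```python
def pareto_front(solutions):
    """Keep the non-dominated solutions for two minimization objectives.

    solutions: list of (investment_cost, non_supplied_energy_cost) pairs.
    A solution is dominated if another solution is no worse in both
    objectives and strictly better in at least one.
    """
    front = []
    for s in solutions:
        dominated = any(o[0] <= s[0] and o[1] <= s[1] and o != s
                        for o in solutions)
        if not dominated:
            front.append(s)
    return front

# Example: candidate plans as (investment cost, non-supplied energy cost)
plans = [(100, 9.0), (120, 7.5), (110, 9.5), (150, 7.0)]
print(pareto_front(plans))   # [(100, 9.0), (120, 7.5), (150, 7.0)]
```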
Abstract:
The integration of the Smart Grid concept into the electric grid creates the need for active participation by small and medium players. This active participation can be achieved using decentralized decisions, in which the end consumer manages loads according to Smart Grid needs. Load management must take into account the users' preferences, wishes and needs; however, these can change when exceptional events occur. This paper proposes the integration of exceptional events into the SCADA House Intelligent Management (SHIM) system developed by the authors, to handle machine learning issues in the domestic consumption context. An illustrative application and a learning case study are provided.
Abstract:
This paper presents the Realistic Scenarios Generator (RealScen), a tool that processes data from real electricity markets to generate realistic scenarios enabling the modeling of electricity market players' characteristics and strategic behavior. The proposed tool provides significant advantages to the decision-making process in an electricity market environment, especially when coupled with a multi-agent electricity markets simulator. The generation of realistic scenarios is performed using mechanisms for intelligent data analysis, based on artificial intelligence and data mining algorithms. These techniques allow the study of realistic scenarios, adapted to the existing markets, and improve the representation of market entities as software agents, enabling detailed modeling of their profiles and strategies. This work contributes significantly to the understanding of the interactions between the entities acting in electricity markets by increasing the capability and realism of market simulations.
Abstract:
An intensive use of dispersed energy resources is expected in future power systems, including distributed generation, especially from renewable sources, and electric vehicles. System operation methods and tools must be adapted to this increased complexity, particularly for the optimal resource scheduling problem; metaheuristics are therefore required to obtain good solutions in a reasonable amount of time. This paper proposes two new heuristics, called naive electric vehicles charge and discharge allocation and generation tournament based on cost, developed to provide an initial solution for the simulated annealing-based energy resource scheduling methodology previously developed by the authors. The case study considers two scenarios, with 1000 and 2000 electric vehicles connected to a distribution network. The proposed heuristics are compared with a deterministic approach, showing a very small error in the objective function and a low execution time for the scenario with 2000 vehicles.
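For context, the paper's heuristics only seed the search; a generic simulated annealing loop of the kind such constructive heuristics typically feed is sketched below in Python. The function names and cooling schedule are assumptions for illustration, not the authors' implementation:

```python
import math
import random

def simulated_annealing(initial, cost, neighbour,
                        t0=1000.0, alpha=0.95, iters=10000):
    """Generic simulated annealing skeleton.

    initial  : starting solution, e.g. a schedule produced by a
               constructive heuristic.
    cost     : objective function to minimize.
    neighbour: function returning a perturbed copy of a solution.
    """
    current, best = initial, initial
    t = t0
    for _ in range(iters):
        candidate = neighbour(current)
        delta = cost(candidate) - cost(current)
        # accept improvements always, worsenings with Boltzmann probability
        if delta < 0 or random.random() < math.exp(-delta / t):
            current = candidate
            if cost(current) < cost(best):
                best = current
        t *= alpha   # geometric cooling
    return best
```

A good initial solution matters here because it lets the loop start at a lower temperature or run fewer iterations while still reaching a small objective-function error, which is the trade-off the paper evaluates.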
Abstract:
The determination of aminotransferase levels is very useful in the diagnosis of hepatopathies. In recent years, an elevated serum ALT level in blood donors has been associated with an increased risk of post-transfusion hepatitis (PTH). The purpose of this study was to investigate the factors associated with elevated ALT levels in a cohort of voluntary blood donors and to evaluate the relationship between increased ALT levels and the development of hepatitis C (HCV) infection. 166 volunteer blood donors with elevated ALT at the time of their first donation were studied. All of the donors were questioned about previous hepatopathies, exposure to hepatitis, exposure to chemicals, use of medication or drugs, sexual behaviour, contact with blood or secretions, and alcohol intake. Every three months over a two-year follow-up, the serum levels of AST, ALT, alkaline phosphatase, gamma-glutamyl transpeptidase, cholesterol, triglycerides and glycemia were assessed. Serum thyroid hormone levels and the presence of auto-antibodies were also measured. Abdominal ultrasound was performed in all patients with persistently elevated ALT or AST levels. A needle biopsy of the liver was performed in 9 donors without a definite diagnosis after medical investigation. The presence of anti-HCV antibodies in 116 donors was assayed again at the first clinical evaluation. At the end of the follow-up period (2 years later), 71 donors were tested again for the presence of anti-HCV antibodies. None of the donors tested positive for hepatitis B or hepatitis C markers during the follow-up. Of the 116 donors, 101 (87%) had persistently elevated ALT serum levels during the follow-up. Obesity and alcoholism were the principal conditions related to elevated ALT serum levels, in 91/101 (90.1%) donors. Hypertriglyceridemia, hypercholesterolemia, hypothyroidism and diabetes mellitus were also associated with increased ALT levels. Only 1/101 (0.9%) had mild chronic active non-A-G viral hepatitis, and 3/101 (2.9%) had liver biopsies showing non-specific reactive hepatitis. The determination of ALT levels was not useful for detecting donors infected with HCV at donation in Brazil, including during the initial seronegative anti-HCV phase.
Abstract:
Thesis submitted to Faculdade de Ciências e Tecnologia of the Universidade Nova de Lisboa, in partial fulfilment of the requirements for the degree of Master in Computer Science
Abstract:
Currently, the teaching-learning process in domains such as computer programming is characterized by extensive curricula and high student enrolment. This poses a great workload for the faculty and teaching assistants responsible for the creation, delivery, and assessment of student exercises. The main goal of this chapter is to foster practice-based learning in complex domains. This objective is attained with an e-learning framework, called Ensemble, as a conceptual tool to organize and facilitate technical interoperability among services. The Ensemble framework is applied to a specific domain: computer programming. Content issues are tackled with a standard format for describing programming exercises as learning objects. Communication is achieved by extending existing specifications for interoperation with several systems typically found in an e-learning environment. In order to evaluate the acceptability of the proposed solution, an Ensemble instance was validated in a classroom experiment, with encouraging results.
Multi-criteria optimisation approach to increase the delivered power in radial distribution networks
Abstract:
This study proposes a new methodology to increase the power delivered to any load point in a radial distribution network, through the identification of new investments that improve the repair time. This research work is innovative in proposing a full optimisation model based on mixed-integer non-linear programming, considering the Pareto front technique. The goal is to reduce the repair times of distribution network components while minimising both the cost of that reduction and the cost of non-supplied energy. The optimisation model takes into account the technical constraints of the distribution network and the substation transformer taps, and it is able to choose the capacitor bank sizes. A case study based on a 33-bus distribution network is presented to illustrate in detail the application of the proposed methodology.
Abstract:
This work explores the use of fluorescent probes to evaluate the responses of the green alga Pseudokirchneriella subcapitata to three nominal concentrations of Cd(II), Cr(VI), Cu(II) and Zn(II) over a short exposure time (6 h). The toxic effect of the metals on algal cells was monitored using the fluorochromes SYTOX Green (SG, membrane integrity), fluorescein diacetate (FDA, esterase activity) and rhodamine 123 (Rh123, mitochondrial membrane potential). The impact of the metals on chlorophyll a (Chl a) autofluorescence was also evaluated. Esterase activity was the most sensitive parameter: at the concentrations studied, all metals induced a loss of esterase activity. SG effectively detected the loss of membrane integrity in algal cells exposed to 0.32 or 1.3 μmol L−1 Cu(II). Rh123 revealed a decrease in the mitochondrial membrane potential of algal cells exposed to 0.32 and 1.3 μmol L−1 Cu(II), indicating that mitochondrial activity was compromised. Chl a autofluorescence was also affected by the presence of Cr(VI) and Cu(II), suggesting perturbation of photosynthesis. In conclusion, the fluorescence-based approach was useful for detecting disturbances of specific cellular characteristics, and fluorescent probes are a useful diagnostic tool for assessing the impact of toxicants on specific targets of P. subcapitata cells.
Abstract:
For efficient planning of waste collection routing, large municipalities may be partitioned into convenient sectors. The real case under consideration is the municipality of Monção, in Portugal. Waste collection involves more than 1600 containers over an area of 220 km² and a population of around 20,000 inhabitants. This is mostly a rural area where the population is distributed in small villages around the 33 borough centres (freguesias) that constitute the municipality. In most freguesias, waste collection is usually conducted three times a week; in some situations, however, collection is done every day. The case reveals some general and specific characteristics which are not rare, but are not widely addressed in the literature. Furthermore, new methods and models to deal with sectorization and routing are introduced, which can be extended to other applications. Sectorization and routing are tackled following a three-phase approach. The first phase, which is the main concern of this work, introduces a new method for sectorization inspired by electromagnetism and Coulomb's law. The matter is not only territorial division, but also the frequency of waste collection, which is a critical issue in these types of applications. Special characteristics related to the number and type of deposition points were an additional motivation for this work. The second phase addresses the routing problems in each sector, introducing new Mixed Capacitated Arc Routing with Limited Multi-Landfills models. The last phase integrates sectorization and routing. Computational results confirm the effectiveness of the entire novel approach.
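The abstract only names the physical inspiration behind the sectorization phase. Purely as a toy illustration of that idea, the Python sketch below assigns containers to the sector centre exerting the strongest Coulomb-like attraction; the names, the charge weighting, and the assignment rule are all assumptions, not the paper's method.

```python
import numpy as np

def assign_sectors(containers, centres, charges):
    """Toy Coulomb-style assignment of containers to sector centres.

    containers: (n, 2) array of container coordinates.
    centres:    (m, 2) array of sector centre coordinates.
    charges:    length-m array weighting each centre's attraction,
                e.g. reflecting required collection frequency.
    Each container goes to the centre with the largest q / d^2 pull.
    """
    diff = containers[:, None, :] - centres[None, :, :]
    d2 = (diff ** 2).sum(axis=2) + 1e-9   # squared distances
    force = charges[None, :] / d2         # Coulomb-like attraction
    return force.argmax(axis=1)           # sector index per container

# Example: three containers, two sector centres with different weights
bins = np.array([[0.0, 0.0], [5.0, 5.0], [9.0, 1.0]])
depots = np.array([[1.0, 1.0], [8.0, 2.0]])
q = np.array([1.0, 2.0])
print(assign_sectors(bins, depots, q))    # -> [0 1 1]
```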
Abstract:
This study aims to analyze and compare the organizational culture of micro-firms in relation to organizational performance. A case study methodology was used, based on four firms, competitors among themselves in the Information Technology business, focusing on the years 2008-2013. The findings point out many similarities to larger firms, but some specificities of micro-firms were found and propositions were defined: clan culture predominance is related to the best-performing micro-firms; a configuration of several culture types seemed the most suitable for obtaining good organizational results, provided that firms do not focus only on the hierarchy and market types of culture; the predominance of market culture as perceived by employees is associated with low job satisfaction; and, after a certain time in business, micro-firms, like larger companies, seek to standardize and control processes. Recognizing that organizational culture is considered important to firms' results, this study sheds some light on that important factor for micro-firms.
Abstract:
Dissertation submitted to obtain the Master's Degree in Electrical and Computer Engineering