890 results for Utility maximization


Relevance: 10.00%

Abstract:

Report on the Supervised Professional Practice, Master's degree in Preschool Education

Relevance: 10.00%

Abstract:

Research on the problem of feature selection for clustering continues to develop. This is a challenging task, mainly due to the absence of class labels to guide the search for relevant features. Categorical feature selection for clustering has rarely been addressed in the literature, with most of the proposed approaches focusing on numerical data. In this work, we propose an approach that simultaneously clusters categorical data and selects a subset of relevant features. Our approach is based on a modification of a finite mixture model (of multinomial distributions), in which a set of latent variables indicates the relevance of each feature. To estimate the model parameters, we implement a variant of the expectation-maximization algorithm that selects the subset of relevant features using a minimum message length criterion. The proposed approach compares favourably with two baseline methods: a filter based on an entropy measure and a wrapper based on mutual information. The results obtained on synthetic data illustrate the ability of the proposed expectation-maximization method to recover the ground truth. An application to real data relating to official statistics shows its usefulness.
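The base model in this line of work, a finite mixture of multinomial distributions fitted by EM, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the feature-relevance latent variables and the minimum message length penalty are omitted, and the smoothing constant is an arbitrary choice.

```python
import numpy as np

def em_multinomial_mixture(X, k, n_iter=100, seed=0):
    """Fit a finite mixture of k multinomial distributions with EM.
    X: (n, d) array of category counts per object."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    weights = np.full(k, 1.0 / k)                 # mixing proportions
    theta = rng.dirichlet(np.ones(d), size=k)     # per-component category probabilities
    for _ in range(n_iter):
        # E-step: responsibilities, i.e. posterior component memberships
        log_p = X @ np.log(theta).T + np.log(weights)   # log-likelihood up to a constant
        log_p -= log_p.max(axis=1, keepdims=True)       # stabilise before exponentiating
        resp = np.exp(log_p)
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: re-estimate proportions and category probabilities
        weights = resp.mean(axis=0)
        theta = resp.T @ X + 1e-9                       # smoothed weighted counts
        theta /= theta.sum(axis=1, keepdims=True)
    return weights, theta, resp
```

On well-separated synthetic counts, the responsibilities recover the generating partition; the feature-selection extension described above adds a per-feature relevance indicator on top of this scheme.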

Relevance: 10.00%

Abstract:

Research on cluster analysis for categorical data continues to develop, with new clustering algorithms being proposed. However, in this context, the determination of the number of clusters is rarely addressed. We propose a new approach in which clustering and the estimation of the number of clusters are performed simultaneously for categorical data. We assume that the data originate from a finite mixture of multinomial distributions and use a minimum message length (MML) criterion to select the number of clusters (Wallace and Boulton, 1968). For this purpose, we implement an EM-type algorithm (Silvestre et al., 2008) based on the approach of Figueiredo and Jain (2002). The novelty of the approach rests on the integration of model estimation and selection of the number of clusters in a single algorithm, rather than selecting this number from a set of pre-estimated candidate models. The performance of our approach is compared with the Bayesian Information Criterion (BIC) (Schwarz, 1978) and the Integrated Completed Likelihood (ICL) (Biernacki et al., 2000) on synthetic data. The results illustrate the capacity of the proposed algorithm to recover the true number of clusters while outperforming BIC and ICL in speed, which is especially relevant when dealing with large data sets.
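The integrated estimation-and-selection idea can be sketched with a penalised weight update in the spirit of Figueiredo and Jain (2002): start with more components than needed and let under-supported components be annihilated during EM. This is a toy sketch under simplifying assumptions (multinomial components, a crude half-parameter-count penalty instead of the full MML criterion), not the paper's algorithm:

```python
import numpy as np

def em_mml_multinomial(X, k_max=8, n_iter=200, seed=0):
    """EM for a multinomial mixture that prunes components while fitting:
    the M-step subtracts half the per-component parameter count from each
    component's support, so weakly supported components die out."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    half_params = (d - 1) / 2.0                    # free parameters per component / 2
    theta = rng.dirichlet(np.ones(d), size=k_max)
    weights = np.full(k_max, 1.0 / k_max)
    for _ in range(n_iter):
        # E-step on the currently surviving components
        log_p = X @ np.log(theta).T + np.log(weights)
        log_p -= log_p.max(axis=1, keepdims=True)
        resp = np.exp(log_p)
        resp /= resp.sum(axis=1, keepdims=True)
        support = resp.sum(axis=0)
        # Penalised weight update: annihilate under-supported components
        new_w = np.maximum(0.0, support - half_params)
        if new_w.sum() == 0:                       # degenerate: nothing survives
            break
        new_w /= new_w.sum()
        theta = resp.T @ X + 1e-9
        theta /= theta.sum(axis=1, keepdims=True)
        alive = new_w > 0
        weights, theta = new_w[alive], theta[alive]
    return weights, theta
```

The number of surviving components plays the role of the estimated number of clusters; in the paper this pruning is driven by the full MML criterion rather than the crude penalty used here.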

Relevance: 10.00%

Abstract:

In data clustering, the problem of selecting the subset of most relevant features from the data has been an active research topic. Feature selection for clustering is a challenging task due to the absence of class labels to guide the search for relevant features. Most methods proposed for this goal focus on numerical data. In this work, we propose an approach for simultaneously clustering categorical data and selecting relevant features. We assume that the data originate from a finite mixture of multinomial distributions and implement an integrated expectation-maximization (EM) algorithm that estimates all the parameters of the model and selects the subset of relevant features simultaneously. The results obtained on synthetic data illustrate the performance of the proposed approach. An application to real data relating to official statistics shows its usefulness.
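For contrast, a filter baseline of the kind such integrated approaches are compared against (an entropy measure over categorical features) is only a few lines. The ranking direction shown here (higher empirical entropy first, so near-constant features are dropped) is an illustrative choice:

```python
import numpy as np

def entropy_filter(X, top_k):
    """Rank integer-coded categorical features by empirical entropy.
    Near-constant features score ~0 and are discarded first."""
    n, d = X.shape
    scores = np.empty(d)
    for j in range(d):
        _, counts = np.unique(X[:, j], return_counts=True)
        p = counts / n
        scores[j] = -(p * np.log(p)).sum()          # Shannon entropy of feature j
    return np.argsort(scores)[::-1][:top_k]         # most variable features first
```

Unlike the integrated EM approach, such a filter scores each feature in isolation, before and independently of any clustering.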

Relevance: 10.00%

Abstract:

Master's degree in Management and Business Control

Relevance: 10.00%

Abstract:

The development of new products or processes involves the creation, re-creation and integration of conceptual models from the related scientific and technical domains. In particular, in the context of collaborative networks of organisations (CNO) (e.g. a multi-partner, international project), such developments can be seriously hindered by conceptual misunderstandings and misalignments resulting, for example, from participants with different backgrounds or organisational cultures. The research described in this article addresses this problem by proposing a method and the tools to support the collaborative development of shared conceptualisations in the context of a collaborative network of organisations. The theoretical model is based on a socio-semantic perspective, while the method is inspired by conceptual integration theory from the field of cognitive semantics. The modelling environment is built upon a semantic wiki platform. The majority of the article is devoted to the development of an informal ontology in the context of a European R&D project, studied using action research. The case study results validated the logical structure of the method and showed its utility.

Relevance: 10.00%

Abstract:

The overall goal of the REMPLI project is to design and implement a communication infrastructure for distributed data acquisition and remote control operations using the power grid as the communication medium. The primary target application is remote meter reading with high time resolution, where the meters can be energy, heat, gas, or water meters. The users of the system (e.g. utility companies) will benefit from the REMPLI system by gaining more detailed information about how energy is consumed by the end-users. In this context, power-line communication (PLC) is deployed to cover the distance between the utility company's Private Network and the end user. This document specifies a protocol for real-time PLC in the framework of the REMPLI project, mainly comprising the Network Layer and the Data Link Layer. The protocol was designed taking into consideration the specific aspects of the network: different network topologies (star, tree, ring, multiple paths), dynamic changes in network topology (due to network maintenance, hazards, etc.), and communication lines strongly affected by noise.

Relevance: 10.00%

Abstract:

Dissertation submitted for the Master's degree in Electrical Engineering, branch of Energy/Automation and Industrial Electronics

Relevance: 10.00%

Abstract:

Determining end-to-end delays of applications while maximizing the aggregate system utility subject to timing constraints is generally challenging. Many practical approaches suggest the use of intermediate deadlines for tasks in order to control and upper-bound their end-to-end delays. This paper proposes a unified framework for different time-sensitive, global optimization problems, and solves them in a distributed manner using Lagrangian duality. The framework assigns intermediate deadlines from a global viewpoint, taking resource contention among tasks into consideration. For soft real-time tasks, the proposed framework effectively addresses the deadline assignment problem while maximizing the aggregate quality of service. For hard real-time tasks, we show that existing heuristic solutions to the deadline assignment problem can be incorporated into the proposed framework, enriching their mathematical interpretation.
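The dual-decomposition idea can be illustrated on a toy instance (hypothetical logarithmic utilities U_i(d_i) = w_i log d_i and a single end-to-end budget D, not the paper's formulation): once the coupling constraint sum_i d_i <= D is priced by a Lagrange multiplier, each task solves its own one-dimensional problem, and the multiplier is adjusted by subgradient steps.

```python
import numpy as np

def assign_deadlines(weights, total_deadline, steps=500, lr=0.01):
    """Toy dual decomposition for deadline assignment: maximise
    sum_i w_i*log(d_i) subject to sum_i d_i <= D, by subgradient
    updates on the Lagrange multiplier of the coupling constraint."""
    w = np.asarray(weights, dtype=float)
    lam = 1.0                                   # price of the end-to-end budget
    for _ in range(steps):
        d = w / lam                             # per-task maximiser of w_i*log(d_i) - lam*d_i
        lam = max(1e-9, lam + lr * (d.sum() - total_deadline))  # dual update
    return w / lam
```

The per-task closed form d_i = w_i / lam is specific to the logarithmic utility assumed here; the distributed structure, where tasks react to a shared price, is the kind exploited by the Lagrangian-duality framework.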

Relevance: 10.00%

Abstract:

The main objective of this work was to investigate the application of experimental design techniques to the identification of Michaelis-Menten kinetic parameters. More specifically, this study attempts to elucidate the relative advantages and disadvantages of employing complex experimental design techniques, in relation to equidistant sampling, when applied to different reactor operation modes. All studies were supported by simulation data of a generic enzymatic process that obeys the Michaelis-Menten kinetic equation. Different aspects were investigated, such as the influence of the reactor operation mode (batch, fed-batch with pulse-wise feeding and fed-batch with continuous feeding) and of the experimental design optimality criterion on the effectiveness of kinetic parameter identification. The following experimental design optimality criteria were investigated: 1) minimization of the sum of the diagonal of the inverse of the Fisher information matrix (FIM) (A-criterion), 2) maximization of the determinant of the FIM (D-criterion), 3) maximization of the smallest eigenvalue of the FIM (E-criterion) and 4) minimization of the quotient between the largest and the smallest eigenvalue (modified E-criterion). The comparison and assessment of the different methodologies was made on the basis of the Cramér-Rao lower bound (CRLB) errors with respect to the parameters vmax and Km of the Michaelis-Menten kinetic equation. Concerning the reactor operation mode, it was concluded that fed-batch (pulses) operation is better than batch operation for parameter identification: when the former is adopted, the vmax CRLB error is lowered by 18.6% and the Km CRLB error by 26.4% compared to batch operation.
Regarding the optimality criteria, the best method was the A-criterion, with an average vmax CRLB of 6.34% and 5.27% for batch and fed-batch (pulses), respectively, and a Km CRLB of 25.1% and 18.1%, respectively. As a general conclusion, experimental design is justified if the starting parameter CRLB errors are below 19.5% (vmax) and 45% (Km) for batch processes, and below 42% and 50%, respectively, for fed-batch (pulses) processes; otherwise, equidistant sampling is a more rational decision. This conclusion clearly supports that, for fed-batch operation, the use of experimental design is likely to largely improve the identification of Michaelis-Menten kinetic parameters.
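Assuming the FIM has already been computed for a candidate design, the four optimality criteria listed above are simple scalar functions of it (a generic sketch, independent of the enzymatic model):

```python
import numpy as np

def design_criteria(fim):
    """Scalar design-optimality criteria on a Fisher information matrix.
    A and modified E are to be minimised; D and E are to be maximised."""
    eig = np.linalg.eigvalsh(fim)                # FIM is symmetric
    return {
        "A": np.trace(np.linalg.inv(fim)),       # sum of the diagonal of the FIM inverse
        "D": np.linalg.det(fim),                 # determinant of the FIM
        "E": eig.min(),                          # smallest eigenvalue of the FIM
        "modE": eig.max() / eig.min(),           # largest/smallest eigenvalue ratio
    }
```

In an experimental design loop, candidate sampling schedules would be scored by one of these criteria and the best schedule selected before running the experiment.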

Relevance: 10.00%

Abstract:

This paper addresses the maximization of total profit in a day-ahead market for a price-taker producer requiring short-term scheduling of wind power plants in coordination with concentrated solar power plants having thermal energy storage systems. The optimization approach proposed for profit maximization is a mixed-integer linear programming problem. The approach considers not only transmission grid constraints but also technical operating constraints on both the wind and the concentrated solar power plants, so the more accurate modelling presented in this paper provides an improved short-term scheduling coordination. Computer simulation results based on data for Iberian wind and concentrated solar power plants illustrate the benefits of coordination and show the effectiveness of the approach.
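A single-period toy version of such a mixed-integer linear program can be written with SciPy's `milp`. All plant data below are hypothetical, and the real model spans many periods with thermal storage dynamics; the sketch only shows the typical constraint shapes (a transmission limit and a commitment binary forcing the CSP output between its technical limits):

```python
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

# Hypothetical single-period instance (all numbers illustrative)
price = 50.0                     # day-ahead price, EUR/MWh
grid_cap = 120.0                 # transmission limit, MW
wind_avail = 80.0                # forecast wind availability, MW
csp_min, csp_max = 20.0, 60.0    # CSP technical limits when committed, MW
csp_commit_cost = 500.0          # fixed cost if the CSP plant runs, EUR

# Decision vector x = [wind output, CSP output, CSP on/off]
c = np.array([-price, -price, csp_commit_cost])  # milp minimises, so negate revenue
A = np.array([
    [1.0, 1.0, 0.0],         # wind + CSP <= grid capacity
    [0.0, 1.0, -csp_max],    # CSP <= csp_max when committed, 0 otherwise
    [0.0, -1.0, csp_min],    # CSP >= csp_min when committed
])
cons = LinearConstraint(A, -np.inf, [grid_cap, 0.0, 0.0])
res = milp(c, constraints=cons, integrality=[0, 0, 1],
           bounds=Bounds([0.0, 0.0, 0.0], [wind_avail, csp_max, 1.0]))
wind, csp, committed = res.x
profit = -res.fun
```

With these numbers, committing the CSP plant is profitable: the two plants jointly fill the transmission limit of 120 MW, for a profit of 50*120 - 500 = 5500 EUR.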

Relevance: 10.00%

Abstract:

Master's degree in Accounting and Management of Financial Institutions

Relevance: 10.00%

Abstract:

Introduction: Cutaneous malignant melanoma (CMM) is considered one of the most lethal neoplasms. Besides clinical examination and the analysis of tumour markers, its follow-up relies on several imaging methods, such as Positron Emission Tomography/Computed Tomography (PET/CT) with 18F-fluorodeoxyglucose (18F-FDG). The present study aims to evaluate the utility of PET/CT in assessing the extent and suspected recurrence of CMM, comparing the imaging findings with those described in CT studies. Methodology: Retrospective study of 62 PET/CT examinations performed on 50 patients diagnosed with CMM. One study with an equivocal result (pulmonary nodule) was excluded. Information on the anatomopathological results and the imaging examinations was obtained from the clinical history and the medical reports of the CT and PET/CT studies. A database of the collected data was built in Excel and a descriptive statistical analysis was performed. Results: Of the PET/CT studies analysed, 31 were considered true positives (TP), 28 true negatives (TN), one a false positive (FP) and one a false negative (FN). The sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV) and accuracy of PET/CT for staging and for the evaluation of suspected recurrence of CMM are, respectively, 96.9%, 96.6%, 96.9%, 96.6% and 96.7%. Of the CT results considered in the statistical analysis, 14 were TP, 12 TN, three FP and five FN. The sensitivity, specificity, PPV, NPV and accuracy of CT for staging and for the evaluation of suspected recurrence of CMM are, respectively, 73.7%, 80.0%, 82.4%, 70.6% and 76.5%. Compared with the CT results, PET/CT led to a change in therapeutic management in 23% of the studies.
Conclusion: PET/CT is a useful examination in the evaluation of CMM, showing higher diagnostic accuracy than CT alone for staging and for the assessment of suspected recurrence of CMM.
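The reported sensitivity, specificity, predictive values and accuracy follow from the standard 2x2 confusion-table formulas and can be reproduced from the TP/TN/FP/FN counts given above:

```python
def diagnostic_metrics(tp, tn, fp, fn):
    """Standard diagnostic accuracy measures from a 2x2 confusion table."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),                    # positive predictive value
        "npv": tn / (tn + fn),                    # negative predictive value
        "accuracy": (tp + tn) / (tp + tn + fp + fn),
    }

petct = diagnostic_metrics(31, 28, 1, 1)   # PET/CT counts from the study
ct = diagnostic_metrics(14, 12, 3, 5)      # CT counts from the study
```

Rounded to one decimal place, these reproduce the quoted 96.9/96.6/96.9/96.6/96.7% for PET/CT and 73.7/80.0/82.4/70.6/76.5% for CT.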

Relevance: 10.00%

Abstract:

An approach for the analysis of uncertainty propagation in reliability-based design optimization of composite laminate structures is presented. Using the Uniform Design Method (UDM), a set of design points is generated over a domain centered on the mean reference values of the random variables. A methodology based on inverse optimal design of composite structures to achieve a specified reliability level is proposed, and the corresponding maximum load is obtained as a function of the ply angle. Using the generated UDM design points as input/output patterns, an Artificial Neural Network (ANN) is developed based on an evolutionary learning process. A Monte Carlo simulation using the developed ANN is then performed to simulate the behavior of the critical Tsai number, the structural reliability index, and their relative sensitivities as functions of the ply angle of the laminates. The results are generated for uniformly distributed random variables on a domain centered on the mean values. The statistical analysis of the results enables the study of the variability of the reliability index and of its sensitivity relative to the ply angle. Numerical examples showing the utility of the approach for the robust design of angle-ply laminates are presented.
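The Monte Carlo stage can be sketched generically. Here a plain Python limit-state function stands in for the trained ANN surrogate, and the inputs are independent uniforms as in the study; everything else (sample size, the first-order reliability index as mean over standard deviation) is an illustrative simplification:

```python
import numpy as np

def mc_reliability(g, lows, highs, n=50_000, seed=0):
    """Monte Carlo propagation of uniform input uncertainty through a
    limit-state function g; returns a first-order reliability index
    and the estimated failure probability P(g < 0)."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(lows, highs, size=(n, len(lows)))   # sample the input domain
    G = np.apply_along_axis(g, 1, X)                    # evaluate the (surrogate) model
    return G.mean() / G.std(), (G < 0).mean()
```

In the paper's setting, g would be the ANN approximation of the critical Tsai number minus its allowable value, evaluated for each sampled realisation of the random variables.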

Relevance: 10.00%

Abstract:

In this thesis we implement estimation procedures for the threshold parameters of continuous-time threshold models driven by stochastic differential equations. The first procedure is based on the EM (expectation-maximization) algorithm applied to the threshold model built from the Brownian motion with drift process. The second procedure mimics one of the fundamental ideas in the estimation of thresholds in the time series context, namely conditional least squares estimation. We implement this procedure not only for the threshold model built from the Brownian motion with drift process but also for more generic models such as the ones built from the geometric Brownian motion or the Ornstein-Uhlenbeck process. Both procedures are implemented for simulated data, and the least squares estimation procedure is also applied to real data of daily prices from a set of international funds. The first fund is the PF-European Sustainable Equities-R fund from the Pictet Funds company and the second is the Parvest Europe Dynamic Growth fund from the BNP Paribas company; the data for both funds are daily prices from the year 2004. The last fund considered is the Converging Europe Bond fund from the Schroder company, with daily prices from the year 2005.
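For the simplest of these models (a two-regime Brownian motion with drift, observed discretely), the conditional least squares idea can be sketched as a grid search over candidate thresholds: for each candidate, split the observed increments by the regime of the current level and sum the within-regime residual squares. The candidate grid and the minimum regime size below are illustrative choices, not the thesis's implementation:

```python
import numpy as np

def cls_threshold(x):
    """Conditional least squares threshold estimate for a discretely
    observed two-regime process: dX = mu1*dt + sigma*dW below the
    threshold r and dX = mu2*dt + sigma*dW above it."""
    dx = np.diff(x)
    levels = x[:-1]
    best_r, best_sse = None, np.inf
    for r in np.quantile(levels, np.linspace(0.1, 0.9, 81)):  # candidate thresholds
        lo, hi = levels <= r, levels > r
        if lo.sum() < 5 or hi.sum() < 5:                      # require both regimes populated
            continue
        # residual sum of squares around each regime's mean increment
        sse = ((dx[lo] - dx[lo].mean()) ** 2).sum() + ((dx[hi] - dx[hi].mean()) ** 2).sum()
        if sse < best_sse:
            best_r, best_sse = r, sse
    return best_r
```

For richer models such as the geometric Brownian motion or Ornstein-Uhlenbeck cases, the within-regime fit would regress increments on the current level rather than use a constant mean, but the threshold search has the same shape.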