466 results for Maximization
Abstract:
We present a novel maximum-likelihood-based algorithm for estimating the distribution of alignment scores from the scores of unrelated sequences in a database search. Using a new method for measuring the accuracy of p-values, we show that our maximum-likelihood-based algorithm is more accurate than existing regression-based and lookup table methods. We explore a more sophisticated way of modeling and estimating the score distributions (using a two-component mixture model and expectation maximization), but conclude that this does not improve significantly over simply ignoring scores with small E-values during estimation. Finally, we measure the classification accuracy of p-values estimated in different ways and observe that inaccurate p-values can, somewhat paradoxically, lead to higher classification accuracy. We explain this paradox and argue that statistical accuracy, not classification accuracy, should be the primary criterion in comparisons of similarity search methods whose p-values adjust for target sequence length.
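The core idea can be sketched in a few lines: fit an extreme-value (Gumbel) distribution to the scores of unrelated sequences by maximum likelihood, then convert scores to p-values via the survival function. The location and scale used to simulate "unrelated" scores below (20, 4) are made-up stand-ins, not values from the paper.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical scores of unrelated sequences from a database search;
# local alignment scores are commonly modeled with a Gumbel distribution.
scores = stats.gumbel_r.rvs(loc=20.0, scale=4.0, size=5000, random_state=rng)

# Maximum-likelihood estimates of the Gumbel location and scale.
loc, scale = stats.gumbel_r.fit(scores)

def p_value(score):
    # Probability of an unrelated-sequence score at least this large.
    return stats.gumbel_r.sf(score, loc=loc, scale=scale)
```

The paper's contribution lies in how this estimation behaves on real search scores (e.g. ignoring small-E-value scores during fitting); this sketch only shows the ML fit itself.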
Abstract:
We consider a mixture model approach to the regression analysis of competing-risks data. Attention is focused on inference concerning the effects of factors on both the probability of occurrence and the hazard rate conditional on each of the failure types. These two quantities are specified in the mixture model using the logistic model and the proportional hazards model, respectively. We propose a semi-parametric mixture method to estimate the logistic and regression coefficients jointly, in which the component-baseline hazard functions are completely unspecified. Estimation is by maximum likelihood on the basis of the full likelihood, implemented via an expectation-conditional maximization (ECM) algorithm. Simulation studies are performed to compare the performance of the proposed semi-parametric method with a fully parametric mixture approach. The results show that when the component-baseline hazard is monotonic increasing, the semi-parametric and fully parametric mixture approaches are comparable for mildly and moderately censored samples. When the component-baseline hazard is not monotonic increasing, the semi-parametric method consistently provides less biased estimates than a fully parametric approach and is comparable in efficiency in the estimation of the parameters for all levels of censoring. The methods are illustrated using a real data set of prostate cancer patients treated with different dosages of the drug diethylstilbestrol. Copyright (C) 2003 John Wiley & Sons, Ltd.
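To make the E/M alternation concrete, here is a deliberately stripped-down relative of the model: a two-component exponential mixture fitted by plain EM, with no censoring and no covariates. The mixing proportion plays the role of the probability of each failure type and the rates play the role of the component hazards; the paper's semi-parametric ECM is substantially richer than this toy.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated failure times with the cause labels unobserved:
# type 1 (30% of cases) has hazard 5, type 2 has hazard 0.5.
n = 4000
z = rng.random(n) < 0.3
t = np.where(z, rng.exponential(1 / 5.0, n),
                rng.exponential(1 / 0.5, n))

pi, lam1, lam2 = 0.5, 2.0, 0.2          # initial guesses
for _ in range(200):
    f1 = pi * lam1 * np.exp(-lam1 * t)
    f2 = (1 - pi) * lam2 * np.exp(-lam2 * t)
    g = f1 / (f1 + f2)                  # E-step: responsibilities
    pi = g.mean()                       # M-step: closed-form updates
    lam1 = g.sum() / (g * t).sum()
    lam2 = (1 - g).sum() / ((1 - g) * t).sum()
```

After convergence, `pi` estimates the type-1 occurrence probability and `lam1`, `lam2` the component hazards.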
Abstract:
The purpose of this article is to present an alternative model for analyzing the efficiency of academic graduate programs in Business Administration, Accounting and Tourism linked to public and private higher education institutions. The study is theoretically grounded in efficiency and resource optimization, taking as its reference the maximization of return subject to resource constraints. Data Envelopment Analysis (DEA), a non-parametric technique for relative efficiency analysis, was used as the analytical model. The results indicate that the graduate programs were most efficient in 2006, followed by 2004 and 2005, respectively. It was also noted that, on average, programs linked to private institutions were more efficient than those in the public system over the 2004/2006 triennium. Managers of these programs can use relative efficiency analysis as a benchmarking strategy, adopting the best practices observed in the efficient programs, with a view to maximizing the efficiency of their management.
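A minimal input-oriented CCR DEA model can be written as a small linear program: for each program k, minimize θ such that a convex combination of the observed programs uses at most θ times k's inputs while producing at least k's outputs. The single input/output figures below are invented for illustration; the article uses its own indicator set.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical data: one input (e.g. faculty) and one output (e.g. theses)
# per graduate program.
X = np.array([[5.0], [8.0], [10.0], [6.0]])    # inputs, one row per program
Y = np.array([[20.0], [25.0], [40.0], [18.0]]) # outputs

def ccr_efficiency(k):
    """Input-oriented CCR efficiency of program k (1.0 = efficient)."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.zeros(1 + n)
    c[0] = 1.0                                  # minimize theta
    A_ub, b_ub = [], []
    for i in range(m):                          # sum_j lam_j x_ji <= theta x_ki
        A_ub.append([-X[k, i]] + list(X[:, i])); b_ub.append(0.0)
    for r in range(s):                          # sum_j lam_j y_rj >= y_rk
        A_ub.append([0.0] + list(-Y[:, r])); b_ub.append(-Y[k, r])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun

effs = [round(ccr_efficiency(k), 3) for k in range(4)]
```

With a single input and output, the CCR score reduces to each program's output/input ratio divided by the best ratio, which makes the toy result easy to verify by hand.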
Abstract:
The purpose of this dissertation is to analyze the relationship between executive compensation and performance in Brazilian publicly traded companies listed on the BM&FBOVESPA. The theoretical argument assumes that the incentive contract supports the alignment of interests between shareholders and executives and acts as a corporate governance mechanism to direct executives' efforts toward maximizing company value. The sample comprised the 100 most liquid companies by number of stock trades on the BM&FBOVESPA over the period 2010-2012, totaling 296 observations. Data were extracted from the Reference Forms made available by the CVM and from the Economática® and Thomson Reuters® software. Eight research hypotheses were established, and multiple linear regression models were estimated on unbalanced panel data, using total compensation and average individual compensation as dependent variables, with regressors covering operating performance, market value, size, ownership structure and corporate governance, plus control variables. Logit regression models were estimated to identify the factors that explain the use of stock options, bonus programs, and a higher share of variable compensation. The results show that, in the selected sample, there is a positive relationship between executive compensation and market value. The mining, chemicals, and oil and gas sectors were also found to exert a positive influence on executive compensation. Conversely, ownership concentration, state ownership of control, and membership of Level 2 or the Novo Mercado under the BM&FBOVESPA classification are inversely related to total compensation. Higher market value drives the use of stock options as well as of bonuses, the latter also being influenced by stronger accounting performance.
Robustness tests were also performed with random-effects estimations, regressions with clustered robust standard errors, and dynamic models, and the results were similar. The conclusion is that executive compensation is related to corporate value, generating wealth for shareholders, but the absence of a relationship with operating performance suggests flaws in the compensation system, which still depends on greater transparency and other governance mechanisms to align the interests of executives and shareholders.
Abstract:
Background: Regulating mechanisms of branching morphogenesis of fetal rat lung explants have been an essential tool for molecular research. This work presents a new methodology to accurately quantify the epithelium, outer contour and peripheral airway buds of lung explants during cellular development from microscopic images. Methods: The outer contour was defined using an adaptive, multi-scale threshold algorithm whose level was automatically calculated based on an entropy maximization criterion. The inner lung epithelium was defined by a clustering procedure that groups small image regions according to the minimum description length principle and local statistical properties. Finally, the number of peripheral buds was counted as the skeleton branch ends of a skeletonized image of the inner lung epithelium. Results: The time for lung branching morphometric analysis was reduced by 98% compared with the manual method. Best results were obtained in the first two days of cellular development, with smaller standard deviations. Non-significant differences were found between the automatic and manual results on all culture days. Conclusions: The proposed method introduces a series of advantages related to its intuitive use and accuracy, making the technique suitable for images with different lighting characteristics and allowing a reliable comparison between different researchers.
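The entropy-maximization step can be illustrated with a simplified, single-scale Kapur-style threshold: choose the gray level that maximizes the summed entropies of the background and foreground histograms. The synthetic "explant" image below (a bright blob on a dark background) is an assumption for demonstration; the paper's algorithm is adaptive and multi-scale.

```python
import numpy as np

def max_entropy_threshold(image, bins=256):
    """Pick the level maximizing the sum of background and foreground
    histogram entropies (a simplified Kapur threshold)."""
    hist, edges = np.histogram(image, bins=bins, range=(0.0, 1.0))
    p = hist / hist.sum()
    best_t, best_h = 0, -np.inf
    for t in range(1, bins):
        p0, p1 = p[:t].sum(), p[t:].sum()
        if p0 <= 0 or p1 <= 0:
            continue
        q0, q1 = p[:t] / p0, p[t:] / p1
        h = -np.sum(q0[q0 > 0] * np.log(q0[q0 > 0])) \
            - np.sum(q1[q1 > 0] * np.log(q1[q1 > 0]))
        if h > best_h:
            best_h, best_t = h, t
    return edges[best_t]

# Synthetic stand-in image: dark background with a brighter central blob.
rng = np.random.default_rng(2)
img = rng.normal(0.2, 0.05, (64, 64)).clip(0, 1)
img[16:48, 16:48] = rng.normal(0.8, 0.05, (32, 32)).clip(0, 1)
thr = max_entropy_threshold(img)
```

On a clearly bimodal image such as this one, the selected level falls between the two intensity modes and the resulting mask recovers the blob.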
Abstract:
This paper is an elaboration of the DECA algorithm [1] for blindly unmixing hyperspectral data. The underlying mixing model is linear, meaning that each pixel is a linear mixture of the endmember signatures weighted by the corresponding abundance fractions. The proposed method, like DECA, is tailored to highly mixed data in which geometry-based approaches fail to identify the simplex of minimum volume enclosing the observed spectral vectors. We resort instead to a statistical framework, where the abundance fractions are modeled as mixtures of Dirichlet densities, thus enforcing the constraints on abundance fractions imposed by the acquisition process, namely non-negativity and constant sum. With respect to DECA, we introduce two improvements: 1) the number of Dirichlet modes is inferred based on the minimum description length (MDL) principle; 2) the generalized expectation maximization (GEM) algorithm we adopt to infer the model parameters is improved by using alternating minimization and augmented Lagrangian methods to compute the mixing matrix. The effectiveness of the proposed algorithm is illustrated with simulated and real data.
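The linear mixing model and its abundance constraints can be shown with a fully constrained least-squares inversion of a single pixel: recover abundances that are non-negative and sum to one. The endmember matrix and true abundances below are invented; the paper's contribution (Dirichlet mixture priors fitted by GEM) is not attempted here.

```python
import numpy as np
from scipy.optimize import minimize

# Toy endmember signatures: 4 spectral bands x 3 endmembers.
M = np.array([[0.9, 0.1, 0.3],
              [0.2, 0.8, 0.3],
              [0.1, 0.1, 0.9],
              [0.5, 0.4, 0.2]])
a_true = np.array([0.6, 0.3, 0.1])
y = M @ a_true                              # noiseless mixed pixel

def unmix(y, M):
    """Least-squares abundances under non-negativity and sum-to-one."""
    k = M.shape[1]
    res = minimize(lambda a: np.sum((M @ a - y) ** 2),
                   x0=np.full(k, 1.0 / k),
                   bounds=[(0.0, 1.0)] * k,
                   constraints=[{"type": "eq",
                                 "fun": lambda a: a.sum() - 1.0}],
                   method="SLSQP")
    return res.x

a_hat = unmix(y, M)
```

Because the problem is convex and noiseless, the constrained solver recovers the true abundances; with real data the statistical modeling of the abundances is what does the heavy lifting.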
Abstract:
In a competitive electricity market under uncertainty, generation companies adopt strategies aimed at maximizing profit and minimizing risk. In this context, to develop a sound risk-management strategy it is extremely important to take into account the different energy trading options available in a liberalized market, so as to support risk-management decision making. This work presents a model that evaluates the best strategy for an electricity producer trading in a competitive market with two possible venues for energy transactions: the organized market (pool) and the bilateral contracts market. The producer tries to maximize profit and minimize the corresponding risk by selecting the best balance between the two possible markets (pool and bilateral). The bilateral contracts market serves to properly manage the risks inherent in short-term market operation (the organized market) and to give the seller/buyer a real ability to choose the counterparty with which to trade. The model presented here provides an explicit characterization of risk with respect to the market agent's attitude toward it, measured by Value at Risk (VaR), referred to in this work as Profit-at-Risk (PaR). The price and volume risk factors are characterized by a mean value and a standard deviation, and are modeled by normal distributions. Numerical results are obtained by Monte Carlo simulation implemented in Matlab, applied to a producer holding a diversified portfolio of generation technologies over a one-year horizon. This dissertation is organized as follows: Chapters 1, 2 and 3 describe the state of the art of risk management in electricity trading.
Chapter 4 describes the model developed and implemented, together with a case study applying the model to assess a producer's trading risk. Chapter 5 presents the main conclusions.
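The Monte Carlo PaR computation can be sketched in a few lines: simulate normally distributed price and volume factors, compute profit for a given split between bilateral and pool sales, and read off a low quantile of the profit distribution. All figures below are invented for illustration, not taken from the dissertation's case study.

```python
import numpy as np

rng = np.random.default_rng(3)

# Normally distributed risk factors, as in the abstract.
n_sims = 100_000
spot_price = rng.normal(50.0, 15.0, n_sims)   # EUR/MWh, pool price
volume = rng.normal(1000.0, 100.0, n_sims)    # MWh available
contract_price, cost = 48.0, 30.0             # EUR/MWh, both assumed

def profit_at_risk(share_bilateral, alpha=0.05):
    """alpha-quantile of simulated profit: the Profit-at-Risk (PaR)."""
    q_bil = share_bilateral * volume
    q_spot = volume - q_bil
    profit = q_bil * (contract_price - cost) + q_spot * (spot_price - cost)
    return np.quantile(profit, alpha)

par_all_spot = profit_at_risk(0.0)   # everything sold in the pool
par_mixed = profit_at_risk(0.8)     # 80% hedged bilaterally
```

Comparing PaR across values of `share_bilateral` is exactly the kind of trade-off the model explores: the hedged portfolio sacrifices upside for a much better worst-case profit.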
Abstract:
Topology optimization consists in finding the spatial distribution of a given total volume of material such that the resulting structure has some optimal property, for instance, maximum structural stiffness or maximum fundamental eigenfrequency. In this paper a Genetic Algorithm (GA) employing a tree-based representation is developed to generate initial feasible individuals that remain feasible under crossover and mutation, and as such require no repair operator to ensure feasibility. Several application examples are studied involving the topology optimization of structures, where the objective function is the maximization of stiffness or the maximization of the first and second eigenfrequencies of a plate, all cases having a prescribed material volume constraint.
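The feasibility-preserving idea can be shown on a toy problem: pick exactly k of n material cells to maximize a "stiffness" score, with crossover and mutation constructed so every offspring still has exactly k cells and no repair step is ever needed. The set-based representation and the per-cell scores below are assumptions for illustration; the paper uses trees and finite-element objectives.

```python
import numpy as np

rng = np.random.default_rng(4)

n, k = 30, 10                # n candidate cells, material budget of k cells
w = rng.random(n)            # toy per-cell "stiffness" contribution

def crossover(p1, p2):
    # Sample k cells from the parents' union: always feasible.
    pool = np.union1d(p1, p2)
    return rng.choice(pool, size=k, replace=False)

def mutate(ind):
    # Swap one selected cell for an unselected one: stays feasible.
    out = ind.copy()
    off = np.setdiff1d(np.arange(n), out)
    out[rng.integers(k)] = rng.choice(off)
    return out

pop = [rng.choice(n, size=k, replace=False) for _ in range(40)]
for _ in range(150):
    pop.sort(key=lambda ind: -w[ind].sum())   # fittest first
    elite = pop[:10]
    pop = elite + [mutate(crossover(elite[rng.integers(10)],
                                    elite[rng.integers(10)]))
                   for _ in range(30)]

best = max(pop, key=lambda ind: w[ind].sum())
```

Because every operator maps feasible individuals to feasible individuals, the volume constraint holds by construction throughout the run.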
Abstract:
The increasing importance given by environmental policies to the dissemination and use of wind power has led to its fast, large-scale integration in power systems. In most cases, this integration has been done in an intensive way, causing several impacts and challenges in the operation and planning of current and future power systems. One of these challenges is dealing with system conditions in which the available wind power is higher than the system demand. This is one of the possible applications of demand response, a very promising resource in the context of competitive environments that integrate ever larger amounts of distributed energy resources as well as new players. The proposed methodology aims at maximizing the social welfare in a smart grid operated by a virtual power player that manages the available energy resources. When facing excessive wind power generation availability, real-time pricing is applied in order to induce an increase in consumption so that wind curtailment is minimized. The proposed method is especially useful when actual and day-ahead wind forecasts differ significantly. The method has been computationally implemented in the GAMS optimization tool, and its application is illustrated in this paper using a real 937-bus distribution network with 20310 consumers and 548 distributed generators, some of them with must-take contracts.
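A drastically reduced version of the welfare-maximizing dispatch is a small linear program: when available wind exceeds baseline demand, allocate extra consumption to the groups that value it most, up to their limits, so that curtailment shrinks. The benefit values and limits below are invented and ignore the network model; the paper solves the full problem in GAMS.

```python
import numpy as np
from scipy.optimize import linprog

benefit = np.array([60.0, 45.0, 30.0, 10.0])  # EUR/MWh per consumer group
d_max = np.array([40.0, 40.0, 40.0, 40.0])    # MWh consumption limits
wind = 100.0                                   # MWh of near-zero-cost wind

# Maximize total consumer benefit  <=>  minimize its negative.
res = linprog(-benefit,
              A_ub=[np.ones(4)], b_ub=[wind],  # cannot exceed available wind
              bounds=list(zip(np.zeros(4), d_max)))
dispatch = res.x
curtailed = wind - dispatch.sum()
```

The optimum serves consumers in decreasing order of benefit until the wind is used up, leaving zero curtailment here: dispatch (40, 40, 20, 0).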
Abstract:
This paper proposes a computationally efficient methodology for the optimal location and sizing of static and switched shunt capacitors in large distribution systems. The problem is formulated as the maximization of the savings produced by the reduction in energy losses and the avoided costs due to investment deferral in the expansion of the network. The proposed method selects the nodes to be compensated, as well as the optimal capacitor ratings and their operational characteristics, i.e. fixed or switched. After an appropriate linearization, the optimization problem was formulated as a large-scale mixed-integer linear problem, suitable for solution by a widely used commercial package. Results of the proposed optimization method are compared with another recent methodology reported in the literature using two test cases: a 15-bus and a 33-bus distribution network. For both test cases, the proposed methodology delivers better solutions, indicated by higher loss savings achieved with lower amounts of capacitive compensation. The proposed method has also been applied to compensate an actual large distribution network served by AES-Venezuela in the metropolitan area of Caracas. A convergence time of about 4 seconds after 22298 iterations demonstrates the ability of the proposed methodology to efficiently handle large-scale compensation problems.
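The decision structure, discrete capacitor sizes at candidate nodes, chosen to maximize loss savings minus investment cost, can be illustrated with a toy instance small enough to enumerate. All coefficients below are invented, the diminishing-returns term is a stand-in for the linearized loss model, and the paper of course uses a MILP solver rather than brute force.

```python
from itertools import product

sizes = [0, 300, 600]             # kvar options per candidate node (0 = none)
cost_per_kvar = 0.05              # assumed investment cost coefficient
sensitivity = [0.12, 0.09, 0.07]  # assumed loss-savings sensitivity per node

def net_savings(plan):
    # Savings per node with diminishing returns, minus capacitor cost.
    savings = sum(s * q - 0.00005 * q * q
                  for s, q in zip(sensitivity, plan))
    return savings - cost_per_kvar * sum(plan)

# Exhaustive search over all size combinations (3**3 = 27 plans).
best = max(product(sizes, repeat=3), key=net_savings)
```

Here the best plan installs 600 kvar at the most sensitive node and 300 kvar at the other two, which is exactly the kind of node-by-node sizing decision the MILP formulation encodes with binary variables.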
Abstract:
Dissertation presented to the Escola Superior de Educação de Lisboa for the degree of Master in Educational Sciences, specialization in School Administration
Abstract:
Master's dissertation presented to the Instituto de Contabilidade e Administração do Porto for the degree of Master in Auditing, supervised by Doutora Alcina Portugal Dias
Abstract:
This paper addresses the problem of optimal positioning of surface-bonded piezoelectric patches in sandwich plates with a viscoelastic core and laminated face layers. The objective is to maximize a set of modal loss factors for a given frequency range using multiobjective topology optimization. Active damping is introduced through co-located negative velocity feedback control. The multiobjective topology optimization problem is solved using the Direct MultiSearch method. An application to a simply supported sandwich plate is presented, with results for the maximization of the first six modal loss factors. The influence of the finite element mesh is analyzed and the results are, to some extent, compared with those obtained using alternative single-objective optimization. (C) 2013 Elsevier Ltd. All rights reserved.
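Conceptually, the multiobjective step amounts to keeping the non-dominated designs: a patch layout is Pareto-optimal if no other layout is at least as good in every modal loss factor and strictly better in one. The candidate values below are invented pairs of (mode-1, mode-2) loss factors; the Direct MultiSearch method itself is far more than this filter.

```python
import numpy as np

# Hypothetical designs scored on two modal loss factors (to be maximized).
designs = np.array([[0.10, 0.30],
                    [0.20, 0.25],
                    [0.15, 0.15],
                    [0.25, 0.10],
                    [0.18, 0.28]])

def pareto_front(points):
    """Indices of non-dominated points under maximization."""
    keep = []
    for i, p in enumerate(points):
        dominated = any(np.all(q >= p) and np.any(q > p)
                        for j, q in enumerate(points) if j != i)
        if not dominated:
            keep.append(i)
    return keep

front = pareto_front(designs)
```

Only the design (0.15, 0.15) is dominated here; single-objective optimization would instead return one point of this front per weighting of the loss factors.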
Abstract:
Master's degree in Electrical Engineering – Electrical Power Systems