867 results for least square-support vector machine


Relevance:

100.00%

Publisher:

Abstract:

This paper proposes a novel adaptive multiple-modelling algorithm for non-linear and non-stationary systems. This simple modelling paradigm comprises K candidate sub-models, all of which are linear. With data arriving online, the performance of every candidate sub-model is monitored over the most recent data window, and the M best sub-models are selected from the K candidates. The weight coefficients of the selected sub-models are adapted via the recursive least squares (RLS) algorithm, while the coefficients of the remaining sub-models are left unchanged. The M model predictions are then optimally combined to produce the multi-model output. We propose to minimise the mean square error over a recent data window and to apply a sum-to-one constraint to the combination parameters, leading to a closed-form solution and hence maximal computational efficiency. In addition, at each time step the model prediction is chosen as either the resultant multiple-model output or the best sub-model, whichever performs better. Simulation results are given in comparison with typical alternatives, including the linear RLS algorithm and a number of online non-linear approaches, in terms of modelling performance and time consumption.
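The RLS adaptation at the heart of each sub-model can be sketched as follows. This is a minimal single-model illustration, not the authors' full multi-model framework; the forgetting factor `lam` and the synthetic system are assumptions:

```python
import numpy as np

def rls_update(w, P, x, y, lam=0.99):
    """One recursive least squares step with forgetting factor lam:
    update the weight vector w and inverse-correlation matrix P
    given the new regressor x and target y."""
    x = x.reshape(-1, 1)
    k = P @ x / (lam + x.T @ P @ x)     # gain vector
    e = y - (w.T @ x).item()            # a priori prediction error
    w = w + k * e                       # weight update
    P = (P - k @ x.T @ P) / lam         # inverse-correlation update
    return w, P

# Track a linear system y = 1.0*x1 - 2.0*x2 from noisy samples.
rng = np.random.default_rng(0)
w, P = np.zeros((2, 1)), np.eye(2) * 100.0
true_w = np.array([1.0, -2.0])
for _ in range(500):
    x = rng.normal(size=2)
    y = true_w @ x + 0.01 * rng.normal()
    w, P = rls_update(w, P, x, y)
print(np.round(w.ravel(), 2))  # approaches [1.0, -2.0]
```

A forgetting factor below 1 discounts old samples, which is what lets the selected sub-models track a non-stationary system.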

Relevance:

100.00%

Publisher:

Abstract:

An accurate estimate of the contribution of surface longwave fluxes is important for calculating the surface radiation budget, which in turn controls all components of the surface energy budget, such as evaporation and the sensible heat fluxes. This study evaluates the performance of various downward longwave radiation parameterizations for clear and all-sky days in the Sertãozinho region of São Paulo, Brazil. Equations were adjusted to the longwave radiation observations. The adjusted equations were evaluated for every hour of the day, and the results showed good fits for most of the day, except near dawn and sunset and during the night. Seasonal variation was studied by comparing the dry and rainy periods in the dataset. In both the dry and the rainy periods, the least-squares linear regressions yielded coefficients equal to those found for the complete period. It is expected that the equation best fitting the observed data at this site can be used to produce estimates in other regions of the State of São Paulo where such information is not available.
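As an illustration of such an adjustment, the sketch below fits the coefficient of a Brutsaert-type clear-sky emissivity parameterization by least squares. The functional form, coefficient value, and synthetic data are assumptions for illustration, not the equations used in the study:

```python
import numpy as np

SIGMA = 5.67e-8  # Stefan-Boltzmann constant (W m-2 K-4)

def downward_lw(coef, e_hpa, T_k):
    """Brutsaert-type clear-sky parameterization:
    L = coef * (e/T)**(1/7) * sigma * T**4."""
    return coef * (e_hpa / T_k) ** (1 / 7) * SIGMA * T_k ** 4

# Synthetic "observations" generated with coef = 1.24 plus noise,
# from which the coefficient is recovered by linear least squares.
rng = np.random.default_rng(1)
e = rng.uniform(10, 35, 200)       # vapor pressure (hPa)
T = rng.uniform(285, 305, 200)     # air temperature (K)
L_obs = downward_lw(1.24, e, T) + rng.normal(0, 5, 200)

# The model is linear in coef: L = coef * g(e, T), so the
# least-squares solution is sum(g*L) / sum(g*g).
g = (e / T) ** (1 / 7) * SIGMA * T ** 4
coef_hat = float(np.sum(g * L_obs) / np.sum(g * g))
print(round(coef_hat, 3))  # near 1.24
```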

Relevance:

100.00%

Publisher:

Abstract:

We developed a general method for determining water production rates from ground-based visual observations and applied it to Comet Hale-Bopp. Our main objective is to extend the method to include total visual magnitude observations obtained with a CCD detector and V filter in the analysis of total visual magnitudes. We compare the careful CCD V-broadband observations of Liller [Liller, W. Pre-perihelion CCD photometry of Comet 1995 O1 (Hale-Bopp). Planet. Space Sci. 45, 1505-1513, 1997; Liller, W. CCD photometry of Comet C/1995 O1 (Hale-Bopp): 1995-2000. Int. Comet Quart. 23(3), 93-97, 2001] with the total visual magnitude observations from experienced international observers found in the International Comet Quarterly (ICQ) archive. A data set of ~400 CCD observations covering about the same 6-year time span as the ~12,000 selected ICQ total visual magnitude observations was used in the analysis. A least-squares method applied to the water production rates yields power laws as a function of heliocentric distance for the pre- and post-perihelion phases. The average dimension of the nucleus and its effective active area are determined and compared with values published in the literature.
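A power-law fit of production rate against heliocentric distance reduces to a linear least-squares problem in log-log space; the sketch below uses synthetic data, and the exponent and scaling are illustrative, not the paper's results:

```python
import numpy as np

def fit_power_law(r, Q):
    """Least-squares fit of Q = A * r**(-n) in log-log space;
    returns (A, n)."""
    slope, intercept = np.polyfit(np.log(r), np.log(Q), 1)
    return float(np.exp(intercept)), float(-slope)

# Synthetic production rates following Q = 1e30 * r**(-2.5)
# with log-normal scatter.
rng = np.random.default_rng(2)
r = np.linspace(0.9, 4.0, 60)    # heliocentric distance (AU)
Q = 1e30 * r ** -2.5 * np.exp(rng.normal(0, 0.05, 60))
A, n = fit_power_law(r, Q)
print(round(n, 2))  # near 2.5
```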

Relevance:

100.00%

Publisher:

Abstract:

Visualization of high-dimensional data requires a mapping to a visual space. Whenever the goal is to preserve similarity relations, a frequent strategy is to use 2D projections, which afford intuitive interactive exploration, e.g., by users locating and selecting groups and gradually drilling down to individual objects. In this paper, we propose a framework for projecting high-dimensional data to 3D visual spaces, based on a generalization of the Least-Square Projection (LSP). We compare projections to 2D and 3D visual spaces both quantitatively and through a user study considering certain exploration tasks. The quantitative analysis confirms that 3D projections outperform 2D projections in terms of precision. The user study indicates that certain tasks can be answered more reliably and confidently with 3D projections. Nonetheless, because 3D projections are displayed on 2D screens, interaction is more difficult. Therefore, we incorporate suitable interaction functionalities into a framework that supports 3D transformations, predefined optimal 2D views, coordinated 2D and 3D views, and hierarchical 3D cluster definition and exploration. For visually encoding data clusters in a 3D setup, we employ color coding of projected data points as well as four types of surface renderings. A second user study evaluates the suitability of these visual encodings. Several examples illustrate the framework's applicability to visual exploration of both multidimensional abstract (non-spatial) data and the feature space of multi-variate spatial data.
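The core of LSP can be illustrated with a toy least-squares system: each point is asked to coincide with the average of its graph neighbors while a few control points are pinned to known coordinates. This is a drastically simplified sketch; the chain graph, the pinning strategy, and the use of soft constraints are assumptions, not the published formulation:

```python
import numpy as np

def lsp_project(neighbors, ctrl_idx, ctrl_pos, n, dim=3):
    """Minimal LSP-style projection: stack Laplacian rows
    (point i minus the mean of its neighbors = 0) with soft
    control-point constraints, then solve by least squares."""
    rows, rhs = [], []
    for i in range(n):
        row = np.zeros(n)
        row[i] = 1.0
        for j in neighbors[i]:
            row[j] = -1.0 / len(neighbors[i])
        rows.append(row)
        rhs.append(np.zeros(dim))
    for k, i in enumerate(ctrl_idx):     # pin control points
        row = np.zeros(n)
        row[i] = 1.0
        rows.append(row)
        rhs.append(np.asarray(ctrl_pos[k], float))
    A, b = np.vstack(rows), np.vstack(rhs)
    X, *_ = np.linalg.lstsq(A, b, rcond=None)
    return X  # (n, dim) projected coordinates

# Toy chain graph 0-1-2-3 with the endpoints pinned in 3D.
nbrs = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
X = lsp_project(nbrs, [0, 3], [[0, 0, 0], [3, 3, 3]], 4)
print(np.round(X, 2))  # interior points interpolate between the pins
```

The published method solves the same kind of sparse Laplacian system, but places the control points with a separate projection step.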

Relevance:

100.00%

Publisher:

Abstract:

This paper proposes a filter-based algorithm for feature selection. The filter is based on partitioning the set of features into clusters. The number of clusters, and consequently the cardinality of the subset of selected features, is automatically estimated from the data. The computational complexity of the proposed algorithm is also investigated. A variant of the filter that considers feature-class correlations is also proposed for classification problems. Empirical results on ten datasets illustrate the performance of the developed algorithm, which in general obtains competitive classification accuracy compared with state-of-the-art algorithms that find clusters of features. We show that, when computational efficiency is an important issue, the proposed filter may be preferred over its counterparts, making it eligible to join a pool of feature selection algorithms used in practice. As an additional contribution, a theoretical framework is used to formally analyze some properties of feature selection methods that rely on finding clusters of features.
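The idea of keeping one representative per cluster of correlated features can be sketched as follows. This is a naive greedy grouping on absolute Pearson correlation; the threshold and the "keep the first member" rule are assumptions, much simpler than the proposed filter:

```python
import numpy as np

def cluster_features(X, threshold=0.8):
    """Greedy filter sketch: group features whose absolute Pearson
    correlation exceeds `threshold` and keep one representative
    (the first seen) per cluster."""
    n_feat = X.shape[1]
    corr = np.abs(np.corrcoef(X, rowvar=False))
    selected = []
    assigned = np.zeros(n_feat, bool)
    for j in range(n_feat):
        if assigned[j]:
            continue
        members = np.where(corr[j] >= threshold)[0]
        assigned[members] = True
        selected.append(j)       # representative of this cluster
    return selected

# Toy data: features 0 and 1 nearly identical, feature 2 independent.
rng = np.random.default_rng(3)
f0 = rng.normal(size=300)
X = np.column_stack([f0, f0 + 0.01 * rng.normal(size=300),
                     rng.normal(size=300)])
print(cluster_features(X))  # keeps one of {0, 1} plus feature 2
```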

Relevance:

100.00%

Publisher:

Abstract:

In this article, we are interested in evaluating different parameter-estimation strategies for a multiple linear regression model. The model parameters were estimated with data from a clinical trial whose aim was to verify whether the mechanical property of maximum force (EM-FM) is associated with femoral mass, femoral diameter, and the experimental group of ovariectomized rats (Rattus norvegicus albinus, Wistar strain). Three methodologies are compared for estimating the model parameters: the classical methodology, based on the least-squares method; the Bayesian methodology, based on Bayes' theorem; and the bootstrap method, based on resampling.
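Two of the three strategies being compared, classical least squares and the bootstrap, can be sketched in a few lines on synthetic data (the Bayesian alternative is omitted for brevity; the toy model is an assumption):

```python
import numpy as np

def ols(X, y):
    """Classical least-squares estimate: argmin ||X b - y||^2."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

def bootstrap_se(X, y, n_boot=1000, seed=0):
    """Bootstrap standard errors: refit OLS on resampled rows."""
    rng = np.random.default_rng(seed)
    n = len(y)
    betas = [ols(X[idx], y[idx]) for idx in
             (rng.integers(0, n, n) for _ in range(n_boot))]
    return np.std(betas, axis=0)

# Toy model: y = 2 + 3*x + noise.
rng = np.random.default_rng(4)
x = rng.uniform(0, 1, 100)
X = np.column_stack([np.ones(100), x])
y = 2 + 3 * x + rng.normal(0, 0.3, 100)
beta = ols(X, y)
se = bootstrap_se(X, y)
print(np.round(beta, 2), np.round(se, 2))
```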

Relevance:

100.00%

Publisher:

Abstract:

We investigated the temporal dynamics and changes in connectivity in the mental rotation network through the application of spatio-temporal support vector machines (SVMs). The spatio-temporal SVM [Mourao-Miranda, J., Friston, K. J., et al. (2007). Dynamic discrimination analysis: A spatial-temporal SVM. Neuroimage, 36, 88-99] is a pattern recognition approach suitable for investigating dynamic changes in the brain network during a complex mental task. It requires neither a model describing each component of the task nor the precise shape of the BOLD impulse response. By defining a time window including a cognitive event, one can use spatio-temporal fMRI observations from two cognitive states to train the SVM. During training, the SVM finds the pattern that discriminates the two states and produces a discriminating weight vector spanning both voxels and time (i.e., spatio-temporal maps). We showed that by applying the spatio-temporal SVM to an event-related mental rotation experiment, it is possible to discriminate between different degrees of angular disparity (0 degrees vs. 20 degrees, 0 degrees vs. 60 degrees, and 0 degrees vs. 100 degrees), and that the discrimination accuracy correlates with the difference in angular disparity between the conditions. For the comparison with the highest accuracy (0 degrees vs. 100 degrees), we evaluated how the most discriminating areas (visual regions, parietal regions, and supplementary motor and premotor areas) change their behavior over time. The frontal premotor regions became highly discriminating earlier than the superior parietal cortex. The parietal regions appear to parcellate, with the inferior parietal lobe discriminating earlier during mental rotation than the superior parietal lobe. The SVM also identified a network of regions whose BOLD responses decreased during the 100 degrees condition relative to the 0 degrees condition (posterior cingulate, frontal, and superior temporal gyrus). This network was also highly discriminating between the two conditions. In addition, we investigated changes in functional connectivity between the most discriminating areas identified by the spatio-temporal SVM. We observed an increase in functional connectivity between almost all areas activated during the 100 degrees condition (bilateral inferior and superior parietal lobe, bilateral premotor area, and SMA), but not between the areas showing a decreased BOLD response during the 100 degrees condition.
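The spatio-temporal trick of concatenating a window of time points per voxel into a single training vector can be sketched with a plain linear SVM trained by sub-gradient descent (a stand-in for a full SVM solver; the toy "voxel" data and all dimensions are assumptions):

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, epochs=200, lr=0.1):
    """Linear SVM via sub-gradient descent on the regularized
    hinge loss; labels y must be in {-1, +1}."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        viol = y * (X @ w) < 1               # margin violations
        grad = lam * w
        if viol.any():
            grad -= (y[viol, None] * X[viol]).sum(axis=0) / n
        w -= lr * grad
    return w

# Each trial: 4 time points x 10 voxels flattened into one
# 40-dimensional spatio-temporal vector; condition B differs
# from A only at the two later time points.
rng = np.random.default_rng(5)
def trial(cond):
    v = rng.normal(0.0, 1.0, (4, 10))
    if cond:
        v[2:] += 1.0                          # late "response"
    return v.ravel()

X = np.array([trial(c) for c in [0] * 40 + [1] * 40])
y = np.array([-1] * 40 + [1] * 40)
w = train_linear_svm(X, y)
acc = (np.sign(X @ w) == y).mean()
late = np.abs(w.reshape(4, 10)[2:]).mean()    # discriminating weight
early = np.abs(w.reshape(4, 10)[:2]).mean()   # concentrates late
print(round(acc, 2), late > early)
```

Reshaping `w` back to (time, voxel) is what gives the spatio-temporal map: here the weight mass falls on the late time points, mirroring how the method localizes when each area becomes discriminating.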

Relevance:

100.00%

Publisher:

Abstract:

To identify chemical descriptors that distinguish Cuban from non-Cuban rums, analyses of 44 rum samples from 15 different countries are described. To provide the chemical descriptors, the mineral fraction, phenolic compounds, caramel, alcohols, acetic acid, ethyl acetate, ketones, and aldehydes were analyzed. The analytical data were treated with the following chemometric methods: principal component analysis (PCA), partial least squares discriminant analysis (PLS-DA), and linear discriminant analysis (LDA). These analyses indicated 23 analytes as relevant chemical descriptors for separating the rums into two distinct groups. Clustering the rum samples through PCA led to an accumulated percentage of 70.4% in the first three principal components, with isoamyl alcohol, n-propyl alcohol, copper, iron, 2-furfuraldehyde (furfural), phenylmethanal (benzaldehyde), epicatechin, and vanillin used as chemical descriptors. By applying the PLS-DA technique to the whole set of analytical data, the following analytes were selected as descriptors: acetone, sec-butyl alcohol, isobutyl alcohol, ethyl acetate, methanol, isoamyl alcohol, magnesium, sodium, lead, iron, manganese, copper, zinc, 4-hydroxy-3,5-dimethoxybenzaldehyde (syringaldehyde), methanal (formaldehyde), 5-hydroxymethyl-2-furfuraldehyde (5-HMF), acetaldehyde, 2-furfuraldehyde, 2-butenal (crotonaldehyde), n-pentanal (valeraldehyde), iso-pentanal (isovaleraldehyde), benzaldehyde, 2,3-butanedione monoxime, acetylacetone, epicatechin, and vanillin. By applying the LDA technique, a model was developed in which the following analytes were selected as descriptors: ethyl acetate, sec-butyl alcohol, n-propyl alcohol, n-butyl alcohol, isoamyl alcohol, isobutyl alcohol, caramel, catechin, vanillin, epicatechin, manganese, acetaldehyde, 4-hydroxy-3-methoxybenzoic acid, 2-butenal, 4-hydroxy-3,5-dimethoxybenzoic acid, cyclopentanone, acetone, lead, zinc, calcium, barium, strontium, and sodium. This model discriminated Cuban rums from the others with 88.2% accuracy.
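The PCA step that yields an accumulated explained-variance percentage can be sketched as an SVD on column-standardized data; the toy two-factor "descriptor table" below is an assumption, not the rum data set:

```python
import numpy as np

def pca(X, n_comp=3):
    """PCA via SVD of the column-standardized data matrix; returns
    scores and the fraction of variance explained per component."""
    Z = (X - X.mean(0)) / X.std(0)
    U, S, Vt = np.linalg.svd(Z, full_matrices=False)
    var = S ** 2 / np.sum(S ** 2)
    return Z @ Vt[:n_comp].T, var[:n_comp]

# Toy descriptor table: 30 samples x 5 analytes in which two
# latent factors generate most of the variance.
rng = np.random.default_rng(6)
factors = rng.normal(size=(30, 2))
loadings = rng.normal(size=(2, 5))
X = factors @ loadings + 0.1 * rng.normal(size=(30, 5))
scores, explained = pca(X)
print(np.round(explained.cumsum(), 2))  # first two PCs dominate
```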

Relevance:

100.00%

Publisher:

Abstract:

Dissertation presented to the Graduate Program in Administration (Programa de Pós-graduação em Administração) of the Universidade Municipal de São Caetano do Sul.

Relevance:

100.00%

Publisher:

Abstract:

This dissertation addresses the problem of inference under weak identification in instrumental variables regression models. More specifically, we are interested in one-sided hypothesis testing for the coefficient of the endogenous variable when the instruments are weak. The focus is on conditional tests based on likelihood ratio, score, and Wald statistics. Theoretical and numerical work shows that the conditional t-test based on the two-stage least squares (2SLS) estimator performs well even when the instruments are only weakly correlated with the endogenous variable. The conditional approach corrects its size uniformly, and when the population F-statistic is as small as two, its power is near the power envelopes for similar and non-similar tests. This finding is surprising given the poor performance of the two-sided conditional t-tests found in Andrews, Moreira and Stock (2007). Motivated by this counterintuitive result, we propose novel two-sided t-tests that are approximately unbiased and can perform as well as the conditional likelihood ratio (CLR) test of Moreira (2003).
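In the just-identified case, the 2SLS estimator underlying the conditional t-test reduces to the classic instrumental-variables ratio; the sketch below contrasts it with OLS on simulated endogenous data (all parameter values are assumptions for illustration):

```python
import numpy as np

def iv_2sls(y, x, z):
    """Two-stage least squares with one instrument and no intercept:
    first-stage fitted values x_hat = z * pi_hat, then regress y on
    x_hat; algebraically equal to (z'y)/(z'x)."""
    x_hat = z * (z @ x / (z @ z))
    return float(x_hat @ y / (x_hat @ x))

# Structural model y = 0.5*x + u with x endogenous (u correlated
# with the first-stage error v); z is a strong instrument.
rng = np.random.default_rng(7)
n = 5000
z = rng.normal(size=n)
v = rng.normal(size=n)
u = 0.8 * v + rng.normal(size=n)    # endogeneity: cov(u, v) > 0
x = z + v                           # first stage
y = 0.5 * x + u                     # structural equation

beta_ols = float(x @ y / (x @ x))   # biased upward by cov(x, u)
beta_iv = iv_2sls(y, x, z)          # consistent
print(round(beta_ols, 2), round(beta_iv, 2))
```

With a weak instrument (first-stage coefficient near zero), the same ratio becomes erratic, which is the setting the conditional tests are designed for.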

Relevance:

100.00%

Publisher:

Abstract:

This paper studies the effects of the entry of generic drugs on the bidding behavior of drug suppliers in procurement auctions for pharmaceuticals, and the consequences for the prices procurers pay for drugs. Using a unique data set on procurement auctions for off-patent drugs organized by Brazilian public bodies, we surprisingly find no statistically significant difference between the bids and prices paid for generic and branded drugs. On the other hand, some branded-drug suppliers leave auctions in which a supplier of generics is present, while the remaining ones lower their bids. These findings explain why the presence of any supplier of generic drugs in a procurement auction reduces the price paid for pharmaceuticals by 7 percent. To overcome potential estimation bias due to the endogeneity of generic entry, we exploit variation in the number of days between the drug's patent expiration date and the tendering session. The two-stage estimations document the same pattern as the generalized least squares estimations. This evidence indicates that generic competition affects branded suppliers' behavior in public procurement auctions differently from other markets.
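A generalized least squares estimate of a generic-entry effect can be sketched with a diagonal weight matrix; the price process, effect size, and known variance weights below are assumptions for illustration, not the paper's specification:

```python
import numpy as np

def gls(X, y, omega_inv):
    """Generalized least squares: beta = (X'WX)^-1 X'Wy, with W the
    inverse error covariance (here a diagonal weight vector)."""
    XtW = X.T * omega_inv          # scale each column of X' by weight
    return np.linalg.solve(XtW @ X, XtW @ y)

# Price regression with a generic-entry dummy and heteroskedastic
# noise whose variance is assumed known up to the weights below.
rng = np.random.default_rng(8)
n = 2000
generic = rng.integers(0, 2, n).astype(float)
sigma = 0.5 + 1.5 * generic        # noisier prices with generic entry
price = 10.0 - 0.7 * generic + sigma * rng.normal(size=n)
X = np.column_stack([np.ones(n), generic])
beta = gls(X, price, 1.0 / sigma ** 2)
print(np.round(beta, 2))  # intercept near 10, generic effect near -0.7
```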

Relevance:

100.00%

Publisher:

Abstract:

This work examines how macroeconomic variables (inflation expectations, the real interest rate, the output gap, and exchange-rate variation) influence the dynamics of the term structure of interest rates (Estrutura a Termo da Taxa de Juros, ETTJ). This dynamic was verified by introducing principal component analysis (PCA) to capture the effect of the most relevant components of the term structure (level, slope, and curvature). Using ordinary least squares and generalized method of moments estimates, a statistically significant relationship was found between the macroeconomic variables and the principal components of the term structure.

Relevance:

100.00%

Publisher:

Abstract:

Many studies of the Brazilian aggregate demand curve, the IS curve, have appeared since the implementation of the Plano Real and especially after the adoption of the floating exchange-rate regime. This work estimates several specifications of the Brazilian IS curve for the period after the adoption of the floating exchange rate, the inflation-targeting regime, and the Fiscal Responsibility Law, i.e. after the year 2000. The estimated specifications are based on the New Keynesian model, with explanatory variables added to capture the effect on aggregate demand of greater financial intermediation on the potency of monetary policy, and the effect of the Brazilian government's fiscal effort. The forward-looking specification of the IS curve is estimated by the generalized method of moments (GMM), and the backward-looking version by ordinary least squares (OLS). The results show strong significance of the output gap in all specifications. The forward-looking specifications yield significant coefficients, but with signs opposite to those expected for the interest rate and the primary surplus. In the backward-looking regressions the estimated coefficients have the expected signs but are not significant.

Relevance:

100.00%

Publisher:

Abstract:

Distance education is a teaching methodology that has developed considerably over the last decade. With today's diversity of technologies and government policies authorizing online courses, hundreds of institutions now offer programs over the internet. This growth in supply, driven mainly by high market demand, has also created many problems, especially regarding the lack of quality of some programs and high dropout rates. The goal of this study is to evaluate the influence of synchronous interactive technologies on the intention to continue using distance education, proposing and testing a new structural model. In its first phase, the experiment involved 2,376 participants from Brazil's five regions. The data were analyzed with PLS-PM (Partial Least Squares Path Modeling) on a sample of 243 individuals who answered the final questionnaire. The results indicate that the student's adaptation to the methodology, a proposed construct, is an important predictor of satisfaction, perceived usefulness, and the intention to study over the internet again in the future. However, it was not possible to confirm the influence of synchronous interactive technologies on the intention to continue using distance education, revealing that information technology plays a supporting role in educational processes and that the student's decision is guided by the methodological aspects applied to the various available media. It was also found that older people are more predisposed than younger ones to study over the internet. Understanding the factors that drive continued enrollment in distance-education programs can help reduce dropout through actions customized to the target audience, improving revenue and profitability, which can represent a competitive advantage for the institution.

Relevance:

100.00%

Publisher:

Abstract:

This dissertation develops a non-inflationary rate of capacity utilization (NIRCU) for the Brazilian economy, using microdata from the Manufacturing Industry Survey and the Producer Price Index IPA-M, both produced by FGV/IBRE. Three rates were created: the Survey NIRCU, the Coincident NIRCU, and the Antecedent NIRCU. The first uses survey data only; the idea is that there is no inflationary pressure when firms report to the survey that they will not change their prices over the next three months and that the number of shifts worked equals the sector average. The other two cross-reference the firms that answer both the Manufacturing Industry Survey and the IPA-M, checking whether firms that report no planned price change over the next three months actually keep their prices unchanged in the index. The difference between these two approaches is the timing: the Coincident NIRCU checks the same period, while the Antecedent NIRCU checks the following quarter. To verify how well the indicators measure inflationary pressure, the output gaps implied by the different NIRCUs were inserted into a Phillips curve model estimated by ordinary least squares (OLS). According to the estimates, the Antecedent NIRCU was the only one of the three that performed poorly; the Survey and Coincident NIRCUs performed very well, especially the latter. That is, these two indicators did as well as the more traditional output-gap measures.