891 results for Log ESEO, GPS, orbite, pseudorange, least square
Abstract:
The bitter taste elicited by dairy protein hydrolysates (DPH) is a well-known obstacle to their acceptability by consumers and therefore to their incorporation into foods. The traditional method of assessing taste in foods is sensory analysis, but this can be problematic due to the overall unpleasantness of the samples. There is therefore growing interest in the use of electronic tongues (e-tongues) as an alternative method to quantify bitterness in such samples. In the present study, the response of the e-tongue to the standard bitter agent caffeine and to a range of casein- and whey-based hydrolysates was compared with that of a trained sensory panel. Partial least squares (PLS) regression was employed to compare the responses of the e-tongue and the sensory panel. Strong correlation was shown between the two methods in the analysis of caffeine (R2 of 0.98) and of DPH samples, with R2 values ranging from 0.94 to 0.99. This study demonstrates the potential of the e-tongue for bitterness screening of DPHs, reducing reliance on expensive and time-consuming sensory panels.
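The PLS comparison described above can be sketched with a minimal single-response NIPALS implementation in numpy. The data below are synthetic stand-ins for the e-tongue sensor channels and panel bitterness scores; all names and values are illustrative, not taken from the study.

```python
import numpy as np

def pls1_fit(X, y, n_components=2):
    """Minimal PLS1 (NIPALS) regression for a single response vector."""
    Xm, ym = X.mean(0), y.mean()
    Xr, yr = X - Xm, y - ym
    W, P, Q = [], [], []
    for _ in range(n_components):
        w = Xr.T @ yr
        w /= np.linalg.norm(w)            # weight vector
        t = Xr @ w                        # scores
        tt = t @ t
        p = Xr.T @ t / tt                 # X loadings
        q = (yr @ t) / tt                 # y loading
        Xr = Xr - np.outer(t, p)          # deflate X
        yr = yr - q * t                   # deflate y
        W.append(w); P.append(p); Q.append(q)
    W, P, Q = np.array(W).T, np.array(P).T, np.array(Q)
    B = W @ np.linalg.pinv(P.T @ W) @ Q   # regression coefficients
    return B, Xm, ym

# Synthetic stand-in: 40 "samples", 6 "sensor channels", linear bitterness score.
rng = np.random.default_rng(1)
X = rng.normal(size=(40, 6))
y = X @ np.array([1.5, -0.7, 0.0, 0.3, 0.0, 0.9]) + 0.05 * rng.normal(size=40)
B, Xm, ym = pls1_fit(X, y, n_components=4)
y_hat = (X - Xm) @ B + ym
r2 = 1 - ((y - y_hat) ** 2).sum() / ((y - y.mean()) ** 2).sum()
```

The R2 computed this way is the same quantity the study reports when correlating e-tongue predictions against panel scores.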
Abstract:
This paper describes a novel on-line learning approach for radial basis function (RBF) neural networks. Based on an RBF network with individually tunable nodes and a fixed small model size, the weight vector is adjusted on-line using the multi-innovation recursive least squares algorithm. When the residual error of the RBF network becomes large despite the weight adaptation, an insignificant node with little contribution to the overall system is replaced by a new node. The structural parameters of the new node are optimized by the proposed fast algorithms in order to significantly improve the modeling performance. The proposed scheme offers a novel, flexible, and fast approach to on-line system identification problems. Simulation results show that it can significantly outperform existing methods, for nonstationary systems in particular.
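The weight-adaptation step can be illustrated with the plain recursive least squares recursion applied to the output weights of a fixed set of Gaussian basis functions. This is a simplified single-innovation sketch with illustrative centers and widths; the paper's multi-innovation variant stacks several recent regressors per update, and its node-replacement logic is not reproduced here.

```python
import numpy as np

def rbf_features(x, centers, width=1.0):
    """Gaussian hidden-layer outputs for a scalar input x."""
    return np.exp(-((x - centers) ** 2) / (2 * width ** 2))

def rls_step(w, P, phi, y, lam=0.99):
    """One RLS update of the output weights w with forgetting factor lam."""
    k = P @ phi / (lam + phi @ P @ phi)   # gain vector
    e = y - w @ phi                        # a-priori residual
    w = w + k * e
    P = (P - np.outer(k, phi @ P)) / lam
    return w, P, e

centers = np.linspace(-2, 2, 9)
w = np.zeros(len(centers))
P = np.eye(len(centers)) * 1e3
rng = np.random.default_rng(0)
for _ in range(500):
    x = rng.uniform(-2, 2)
    y = np.sin(2 * x)                      # unknown system to identify
    w, P, e = rls_step(w, P, rbf_features(x, centers, 0.5), y)

final_err = abs(np.sin(2 * 0.7) - w @ rbf_features(0.7, centers, 0.5))
```

The forgetting factor lam < 1 discounts old data, which is what allows the recursion to track the nonstationary systems targeted by the paper.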
Abstract:
This paper proposes a novel adaptive multiple-modelling algorithm for non-linear and non-stationary systems. This simple modelling paradigm comprises K candidate sub-models, all of which are linear. With data arriving in an online fashion, the performance of all candidate sub-models is monitored over the most recent data window, and the M best sub-models are selected from the K candidates. The weight coefficients of the selected sub-models are adapted via the recursive least squares (RLS) algorithm, while the coefficients of the remaining sub-models are left unchanged. These M model predictions are then optimally combined to produce the multi-model output. We propose to minimise the mean square error over a recent data window and apply a sum-to-one constraint to the combination parameters, leading to a closed-form solution so that maximal computational efficiency can be achieved. In addition, at each time step, the model prediction is chosen from either the resultant multiple model or the best sub-model, whichever performs better. Simulation results are given in comparison with some typical alternatives, including the linear RLS algorithm and a number of online non-linear approaches, in terms of modelling performance and time consumption.
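The sum-to-one constrained combination admits a closed form via the KKT conditions of the window least-squares problem. A small numpy sketch, with a toy window and illustrative names (the sub-model selection and switching logic of the paper are not reproduced):

```python
import numpy as np

def combine_submodels(F, y):
    """Weights a minimising ||y - F a||^2 subject to sum(a) = 1.

    F: (n, M) predictions of the M selected sub-models over the recent window.
    y: (n,) observed outputs over the same window.
    Solves the KKT system of the equality-constrained least-squares problem.
    """
    n, M = F.shape
    K = np.zeros((M + 1, M + 1))
    K[:M, :M] = 2 * F.T @ F
    K[:M, M] = 1.0                        # constraint gradient
    K[M, :M] = 1.0
    rhs = np.concatenate([2 * F.T @ y, [1.0]])
    a = np.linalg.solve(K, rhs)[:M]
    return a

# Toy window: two biased sub-models bracketing the truth.
rng = np.random.default_rng(2)
y = rng.normal(size=30)
F = np.column_stack([y + 0.5, y - 0.3])   # sub-model predictions
a = combine_submodels(F, y)
combined_sse = ((y - F @ a) ** 2).sum()
```

Because each single sub-model is itself a feasible point of the constrained problem (weight 1 on that model, 0 elsewhere), the combined window error can never exceed that of the best individual sub-model.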
Abstract:
An accurate estimate of the surface longwave flux contribution is important for calculating the surface radiation budget, which in turn controls all components of the surface energy budget, such as evaporation and the sensible heat flux. This study evaluates the performance of various downward longwave radiation parameterizations for clear and all-sky days applied to the Sertãozinho region in São Paulo, Brazil. Equations were adjusted to the observations of longwave radiation. The adjusted equations were evaluated for every hour throughout the day, and the results showed good fits for most of the day, except near dawn and sunset, followed by nighttime. Seasonal variation was studied by comparing the dry period against the rainy period in the dataset. The least-squares linear regressions resulted in coefficients equal to those found for the complete period, in both the dry and the rainy period. It is expected that the equation best fitting the observed data for this site can be used to produce estimates in other regions of the State of São Paulo where such information is not available.
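Adjusting such a parameterization to observations is a linear least-squares fit once the functional form is fixed. As an illustration, assume a Brunt-type clear-sky form Ld = sigma * T^4 * (a + b * sqrt(e)); the data, coefficients, and ranges below are synthetic stand-ins, not the study's observations.

```python
import numpy as np

SIGMA = 5.67e-8                      # Stefan-Boltzmann constant (W m^-2 K^-4)
rng = np.random.default_rng(3)
T = rng.uniform(285, 305, size=200)  # air temperature (K), synthetic
e = rng.uniform(10, 30, size=200)    # vapour pressure (hPa), synthetic
a_true, b_true = 0.52, 0.065         # illustrative "true" coefficients
Ld = SIGMA * T**4 * (a_true + b_true * np.sqrt(e)) + rng.normal(0, 2, 200)

# The form is linear in (a, b): divide out sigma*T^4 and regress on [1, sqrt(e)].
target = Ld / (SIGMA * T**4)
A = np.column_stack([np.ones_like(e), np.sqrt(e)])
(a_fit, b_fit), *_ = np.linalg.lstsq(A, target, rcond=None)
```

The same two-column design-matrix pattern applies to the other standard parameterizations once they are rewritten as linear in their adjustable coefficients.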
Abstract:
We developed a general method for determining water production rates from ground-based visual observations and applied it to Comet Hale-Bopp. Our main objective is to extend the method to include total visual magnitude observations obtained with a CCD detector and V filter in the analysis of total visual magnitudes. We compare the careful CCD V-broadband observations of Liller [Liller, W. Pre-perihelion CCD photometry of Comet 1995 O1 (Hale-Bopp). Planet. Space Sci. 45, 1505-1513, 1997; Liller, W. CCD photometry of Comet C/1995 O1 (Hale-Bopp): 1995-2000. Int. Comet Quart. 23(3), 93-97, 2001] with the total visual magnitude observations from experienced international observers found in the International Comet Quarterly (ICQ) archive. A data set of ~400 CCD observations covering about the same six-year time span as the ~12,000 selected ICQ total visual magnitude observations was used in the analysis. A least-squares method applied to the water production rates yields power laws as a function of heliocentric distance for the pre- and post-perihelion phases. The average dimension of the nucleus, as well as its effective active area, is determined and compared with values published in the literature. (C) 2009 COSPAR. Published by Elsevier Ltd. All rights reserved.
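Fitting a power law Q(r) = Q0 * r^alpha to production rates by least squares is linear in log-log space. A small sketch with synthetic rates; the exponent and normalisation used here are illustrative placeholders, not the paper's fitted values.

```python
import numpy as np

rng = np.random.default_rng(4)
r = np.linspace(0.9, 4.0, 60)                 # heliocentric distance (AU)
alpha_true, logQ0_true = -1.9, 30.5           # illustrative power law
logQ = logQ0_true + alpha_true * np.log10(r) + rng.normal(0, 0.05, r.size)

# Least-squares line in log space: log10 Q = log10 Q0 + alpha * log10 r.
A = np.column_stack([np.ones_like(r), np.log10(r)])
(logQ0_fit, alpha_fit), *_ = np.linalg.lstsq(A, logQ, rcond=None)
```

Pre- and post-perihelion phases would simply be fitted as two separate calls on the corresponding subsets of the data.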
Abstract:
Visualization of high-dimensional data requires a mapping to a visual space. Whenever the goal is to preserve similarity relations, a frequent strategy is to use 2D projections, which afford intuitive interactive exploration, e.g., by users locating and selecting groups and gradually drilling down to individual objects. In this paper, we propose a framework for projecting high-dimensional data to 3D visual spaces, based on a generalization of the Least-Square Projection (LSP). We compare projections to 2D and 3D visual spaces both quantitatively and through a user study considering certain exploration tasks. The quantitative analysis confirms that 3D projections outperform 2D projections in terms of precision. The user study indicates that certain tasks can be answered more reliably and confidently with 3D projections. Nonetheless, as 3D projections are displayed on 2D screens, interaction is more difficult. Therefore, we incorporate suitable interaction functionalities into a framework that supports 3D transformations, predefined optimal 2D views, coordinated 2D and 3D views, and hierarchical 3D cluster definition and exploration. For visually encoding data clusters in a 3D setup, we employ color coding of projected data points as well as four types of surface renderings. A second user study evaluates the suitability of these visual encodings. Several examples illustrate the framework's applicability to visual exploration of both multidimensional abstract (non-spatial) data and the feature space of multi-variate spatial data.
Abstract:
This article evaluates different parameter estimation strategies for a multiple linear regression model. To estimate the model parameters, data were used from a clinical trial whose aim was to verify whether the mechanical testing of the maximum force property (EM-FM) is associated with femoral mass, femoral diameter, and the experimental group of ovariectomized rats of the species Rattus norvegicus albinus, Wistar variety. Three methodologies are compared for estimating the model parameters: the classical methodology, based on the least squares method; the Bayesian methodology, based on Bayes' theorem; and the Bootstrap method, based on resampling procedures.
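The classical and Bootstrap strategies compared in the article can be sketched in a few lines of numpy. The design matrix and coefficients below are synthetic placeholders; the clinical-trial variables themselves are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 120
X = np.column_stack([np.ones(n), rng.normal(size=n), rng.normal(size=n)])
beta_true = np.array([2.0, 1.2, -0.8])        # illustrative coefficients
y = X @ beta_true + rng.normal(0, 0.5, n)

# Classical estimate: ordinary least squares.
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

# Bootstrap: refit on resampled (X, y) pairs to approximate the sampling
# distribution of the estimator, giving standard errors without normality
# assumptions.
boot = np.empty((1000, 3))
for b in range(1000):
    idx = rng.integers(0, n, n)
    boot[b], *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
beta_boot_mean = boot.mean(axis=0)
boot_se = boot.std(axis=0)
```

A Bayesian fit with a conjugate normal-inverse-gamma prior would complete the three-way comparison, but is omitted here for brevity.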
Abstract:
To identify chemical descriptors that distinguish Cuban from non-Cuban rums, analyses of 44 samples of rum from 15 different countries are described. To provide the chemical descriptors, analyses of the mineral fraction, phenolic compounds, caramel, alcohols, acetic acid, ethyl acetate, ketones, and aldehydes were carried out. The analytical data were treated with the following chemometric methods: principal component analysis (PCA), partial least squares discriminant analysis (PLS-DA), and linear discriminant analysis (LDA). These analyses indicated 23 analytes as relevant chemical descriptors for the separation of the rums into two distinct groups. Clustering the rum samples through PCA led to an accumulated percentage of 70.4% in the first three principal components, with isoamyl alcohol, n-propyl alcohol, copper, iron, 2-furfuraldehyde (furfuraldehyde), phenylmethanal (benzaldehyde), epicatechin, and vanillin used as chemical descriptors. By applying the PLS-DA technique to the whole set of analytical data, the following analytes were selected as descriptors: acetone, sec-butyl alcohol, isobutyl alcohol, ethyl acetate, methanol, isoamyl alcohol, magnesium, sodium, lead, iron, manganese, copper, zinc, 4-hydroxy-3,5-dimethoxybenzaldehyde (syringaldehyde), methanal (formaldehyde), 5-hydroxymethyl-2-furfuraldehyde (5-HMF), acetaldehyde, 2-furfuraldehyde, 2-butenal (crotonaldehyde), n-pentanal (valeraldehyde), iso-pentanal (isovaleraldehyde), benzaldehyde, 2,3-butanedione monoxime, acetylacetone, epicatechin, and vanillin.
By applying the LDA technique, a model was developed in which the following analytes were selected as descriptors: ethyl acetate, sec-butyl alcohol, n-propyl alcohol, n-butyl alcohol, isoamyl alcohol, isobutyl alcohol, caramel, catechin, vanillin, epicatechin, manganese, acetaldehyde, 4-hydroxy-3-methoxybenzoic acid, 2-butenal, 4-hydroxy-3,5-dimethoxybenzoic acid, cyclopentanone, acetone, lead, zinc, calcium, barium, strontium, and sodium. This model allowed the discrimination of Cuban rums from the others with 88.2% accuracy.
Abstract:
Dissertation presented to the Graduate Program in Administration of the Universidade Municipal de São Caetano do Sul.
Abstract:
This dissertation deals with the problem of inference under weak identification in instrumental variables regression models. More specifically, we are interested in one-sided hypothesis testing for the coefficient of the endogenous variable when the instruments are weak. The focus is on conditional tests based on the likelihood ratio, score, and Wald statistics. Theoretical and numerical work shows that the conditional t-test based on the two-stage least squares (2SLS) estimator performs well even when the instruments are weakly correlated with the endogenous variable. The conditional approach corrects its size uniformly, and when the population F-statistic is as small as two, its power is near the power envelopes for similar and non-similar tests. This finding is surprising considering the poor performance of the two-sided conditional t-tests found in Andrews, Moreira and Stock (2007). Given this counterintuitive result, we propose novel two-sided t-tests which are approximately unbiased and can perform as well as the conditional likelihood ratio (CLR) test of Moreira (2003).
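The 2SLS estimator underlying the conditional t-test can be sketched as two stacked least-squares fits. This is a textbook just-identified example with one deliberately strong instrument and synthetic data, illustrating only the estimator itself, not the weak-instrument conditional testing analysed in the dissertation.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 4000
z = rng.normal(size=n)                        # instrument
u = rng.normal(size=n)                        # unobserved confounder
x = 0.8 * z + u + rng.normal(0, 0.5, n)       # endogenous regressor
y = 1.0 * x + u + rng.normal(0, 0.5, n)       # outcome, true beta = 1

# OLS is inconsistent here: x is correlated with the error through u.
beta_ols = (x @ y) / (x @ x)

# 2SLS: first stage projects x on z; second stage regresses y on the fit.
x_hat = z * (z @ x) / (z @ z)                 # first-stage fitted values
beta_2sls = (x_hat @ y) / (x_hat @ x)         # reduces to (z@y)/(z@x) here
```

As the instrument weakens (the 0.8 first-stage coefficient shrinks toward zero), the sampling distribution of beta_2sls deteriorates, which is exactly the regime in which the conditional tests discussed above become necessary.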
Abstract:
This paper studies the effects of generic drugs' entry on the bidding behavior of drug suppliers in procurement auctions for pharmaceuticals, and the consequences for the prices procurers pay for drugs. Using a unique data set on procurement auctions for off-patent drugs organized by Brazilian public bodies, we surprisingly find no statistically significant difference between bids and prices paid for generic and branded drugs. On the other hand, some branded drug suppliers leave auctions in which there is a supplier of generics, whereas the remaining ones lower their bidding prices. These findings explain why we find that the presence of any supplier of generic drugs in a procurement auction reduces the price paid for pharmaceuticals by 7 percent. To overcome potential estimation bias due to the endogeneity of generics' entry, we exploit variation in the number of days between the drug's patent expiration date and the tendering session. The two-stage estimations document the same pattern as the generalized least squares estimations. This evidence indicates that generic competition affects branded suppliers' behavior in public procurement auctions differently from other markets.
Abstract:
This study examines how macroeconomic variables (inflation expectations, the real interest rate, the output gap, and exchange-rate variation) influence the dynamics of the term structure of interest rates (ETTJ). These dynamics were verified by introducing principal component analysis (PCA) to capture the effect of the most relevant components of the ETTJ (level, slope, and curvature). Using ordinary least squares and generalized method of moments estimates, a statistically significant relationship was found between the macroeconomic variables and the principal components of the ETTJ.
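Extracting level, slope, and curvature via PCA amounts to an eigendecomposition of the covariance of yields across maturities. A compact sketch with synthetic curves; the factor loadings and volatilities below are illustrative, not estimated from Brazilian data.

```python
import numpy as np

rng = np.random.default_rng(7)
maturities = np.linspace(0.25, 10, 12)            # years
n_days = 500

# Synthetic factors: level, slope, curvature driving the whole curve.
level = rng.normal(0, 1.0, n_days)
slope = rng.normal(0, 0.4, n_days)
curv = rng.normal(0, 0.15, n_days)
load_level = np.ones_like(maturities)
load_slope = (maturities - maturities.mean()) / maturities.std()
load_curv = load_slope**2 - (load_slope**2).mean()
yields = (np.outer(level, load_level) + np.outer(slope, load_slope)
          + np.outer(curv, load_curv) + rng.normal(0, 0.02, (n_days, 12)))

# PCA: centre, take the SVD, read off explained variance per component.
Yc = yields - yields.mean(0)
_, s, _ = np.linalg.svd(Yc, full_matrices=False)
explained = s**2 / (s**2).sum()
```

The first three principal components recovered this way are the "level, slope, curvature" regressors that the study relates to the macroeconomic variables via OLS and GMM.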
Abstract:
Many studies have addressed the Brazilian aggregate demand curve, the IS curve, since the implementation of the Plano Real and, especially, after the adoption of the floating exchange rate regime. This study aims to estimate several specifications of the Brazilian IS curve for the period after the implementation of the floating exchange rate, the inflation targeting regime, and the Fiscal Responsibility Law, i.e. after the year 2000. The specifications of the estimated curves were based on the New Keynesian model, with some explanatory variables included to capture the effect on aggregate demand of greater financial intermediation on the potency of monetary policy, and the effect of the fiscal effort made by the Brazilian government. The study uses the generalized method of moments (GMM) to estimate the IS curve in its forward-looking specification and ordinary least squares (OLS) to estimate its backward-looking version. The results show strong significance for the output gap in all specifications. The forward-looking specifications show significant coefficients, but with signs opposite to those expected for the interest rate and the primary surplus. In the backward-looking regressions the signs of the estimated coefficients are as expected, but they are not significant.
Abstract:
Distance education is a teaching methodology that has developed greatly over the last decade. With technological diversity and government policies authorizing the offering of online courses, hundreds of institutions have made course programs available over the internet. This growth in supply, driven mainly by high market demand, has also caused many problems, especially regarding the lack of quality of the programs and the high dropout rate. The objective of this study is to evaluate the influence of synchronous interactive technologies on the intention to continue using distance education, proposing and testing a new structural model. In its first phase, this experiment involved 2,376 participants from the five regions of Brazil. For data analysis, the PLS-PM (Partial Least Squares Path Modeling) technique was applied to a sample of 243 individuals who answered the final questionnaire. The results indicate that the student's adaptation to the methodology, a proposed construct, is an important predictor of satisfaction, perceived usefulness, and intention to study over the internet again in the future. However, it was not possible to confirm the influence of synchronous interactive technologies on the intention to continue using distance education, revealing that information technology plays a supporting role in educational processes, and that the student's decision is guided by the methodological aspects applied to the various available media. It was also found that older people are more predisposed to study over the internet than younger ones. Understanding the factors that lead students to continue their studies in distance education programs can help reduce dropout through actions customized to the target audience, improving revenue and profitability, which can represent a competitive advantage for the institution.
Abstract:
This dissertation is based on the construction of a non-inflationary rate of capacity utilization (NIRCU) for the Brazilian economy, using microdata from the Manufacturing Industry Survey and the Producer Price Index – M (IPA-M), surveys conducted by FGV/IBRE. Three rates were created: the Survey NIRCU, the Coincident NIRCU, and the Leading NIRCU. The first uses only survey data; the idea is that there is no inflationary pressure when firms report to the survey that they will not change their prices over the next three months and that the number of shifts worked equals the sector average. The other two cross-reference information from firms that respond to both the Manufacturing Industry Survey and the IPA-M, and verify whether firms' reports that they will not change their prices over the next three months are borne out when compared with the variations of the index. The difference between the last two approaches is that the first, the Coincident NIRCU, checks the same period, while the other, the Leading NIRCU, checks the following quarter. To verify the effectiveness of the indicators in measuring inflationary pressure, the different NIRCU output gaps were inserted into a Phillips curve model estimated by ordinary least squares (OLS). According to the estimates, the Leading NIRCU was the only one of the three that did not perform well; the Survey and Coincident NIRCUs performed very well, especially the latter. That is, these two indicators performed as well as the more traditional measures of the output gap.
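The evaluation step, a Phillips curve with a NIRCU-based output gap entered as a regressor and fitted by OLS, can be sketched generically. All coefficients and series below are synthetic placeholders, not the FGV/IBRE data or the dissertation's estimates.

```python
import numpy as np

rng = np.random.default_rng(8)
T = 160
gap = rng.normal(0, 1, T)                 # NIRCU-style output gap (synthetic)
infl = np.empty(T)
infl[0] = 4.0
for t in range(1, T):                     # backward-looking Phillips curve
    infl[t] = 2.0 + 0.5 * infl[t - 1] + 0.3 * gap[t] + rng.normal(0, 0.2)

# OLS regression of inflation on its own lag and the output gap.
X = np.column_stack([np.ones(T - 1), infl[:-1], gap[1:]])
b, *_ = np.linalg.lstsq(X, infl[1:], rcond=None)
const, rho, gap_coef = b
```

A significant, positive gap_coef is the criterion by which a candidate output-gap measure "performs well" in this kind of exercise: it indicates the gap carries information about inflationary pressure.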