887 results for Sums of squares
Abstract:
Statistical techniques are fundamental in science, and linear regression analysis is perhaps one of the most widely used methodologies. It is well known from the literature that, under certain conditions, linear regression is an extremely powerful statistical tool. Unfortunately, in practice some of those conditions are rarely satisfied, and the regression models become ill-posed, making the application of traditional estimation methods unfeasible. This work presents some contributions to maximum entropy theory in the estimation of ill-posed models, in particular linear regression models with small samples affected by collinearity and outliers. The research is developed along three lines: the estimation of technical efficiency with state-contingent production frontiers, the estimation of the ridge parameter in ridge regression and, finally, new developments in maximum entropy estimation. In the estimation of technical efficiency with state-contingent production frontiers, the work shows that maximum entropy estimators outperform the maximum likelihood estimator. This good performance is most notable in models with few observations per state and in models with a large number of states, which are commonly affected by collinearity. The use of maximum entropy estimators is expected to contribute to the much-desired increase in empirical work with these production frontiers. In ridge regression, the greatest challenge is the estimation of the ridge parameter. Although countless procedures are available in the literature, none of them outperforms all the others. This work proposes a new estimator of the ridge parameter that combines ridge trace analysis with maximum entropy estimation.
The results of the simulation studies suggest that this new estimator is among the best procedures available in the literature for estimating the ridge parameter. The Leuven maximum entropy estimator is based on the least squares method, on Shannon's entropy and on concepts from quantum electrodynamics. It overcomes the main criticism levelled at the generalized maximum entropy estimator, since it dispenses with the supports for the parameters and errors of the regression model. This work presents new contributions to maximum entropy theory in the estimation of ill-posed models, building on the Leuven maximum entropy estimator, information theory and robust regression. The estimators developed perform well in linear regression models with small samples affected by collinearity and outliers. Finally, some computational code for maximum entropy estimation is presented, thereby adding to the scarce computational resources currently available.
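The ridge trace mentioned in this abstract can be illustrated with a minimal numpy sketch. This is a generic ridge trace, not the estimator proposed in the thesis: coefficients are computed over a grid of ridge parameters k, and the trace shows how collinearity-driven instability is damped as k grows. The toy data and grid values are illustrative choices.

```python
import numpy as np

def ridge_path(X, y, ks):
    """Ridge estimates beta(k) = (X'X + k I)^{-1} X'y over a grid of ridge parameters."""
    XtX, Xty = X.T @ X, X.T @ y
    p = X.shape[1]
    return np.array([np.linalg.solve(XtX + k * np.eye(p), Xty) for k in ks])

# Collinear toy data: the second column is almost a copy of the first.
gen = np.random.default_rng(0)
x1 = gen.normal(size=50)
X = np.column_stack([x1, x1 + 1e-3 * gen.normal(size=50)])
y = X @ np.array([1.0, 2.0]) + 0.1 * gen.normal(size=50)

path = ridge_path(X, y, ks=[0.0, 0.1, 1.0, 10.0])
# The coefficient norm shrinks as k grows; the trace stabilises once the
# collinearity-driven variance has been damped.
```

Plotting each coefficient in `path` against k gives the classical ridge trace; the thesis's contribution is a data-driven rule for choosing k on that trace, which is not reproduced here.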
Abstract:
Online travel shopping has attracted researchers due to its significant growth, and there is a growing body of literature in this field. However, research on what drives consumers to purchase travel online has typically been fragmented. In fact, existing studies have largely concentrated on examining consumers’ online travel purchases grounded either on Davis’s Technology Acceptance Model, on the Theory of Reasoned Action and its extension, the Theory of Planned Behaviour, or on Rogers’s model of perceived innovation attributes, the Innovation Diffusion Theory. A thorough literature review revealed a lack of studies that integrate all of these theories to better understand online travel shopping. Therefore, based on relevant literature in tourism and consumer behaviour, this study proposes and tests an integrated model to explore which factors affect intentions to purchase travel online. Furthermore, it proposes a new construct, termed social media involvement, defined as a person’s level of interest or emotional attachment with social media, and examines its relationship with intentions to purchase travel online. To test the 18 hypotheses, a quantitative approach was followed, first collecting data through an online survey. With a sample of 1,532 worldwide Internet users, Partial Least Squares analysis was then conducted to assess the validity and reliability of the data and to empirically test the hypothesized relationships between the constructs. The results indicate that intention to purchase travel online is mostly determined by attitude towards online shopping, which is in turn influenced by the perceived relative advantages of online travel shopping and trust in online travel shopping. In addition, the findings indicate that the second most important predictor of intentions to purchase travel online is compatibility, an attribute from the Innovation Diffusion Theory.
Furthermore, even though online shopping is nowadays a common practice, perceived risk continues to negatively affect intentions to purchase travel online. The most surprising finding of this study was that Internet users more involved with social media for travel purposes did not have higher intentions to purchase travel online. The theoretical contributions of this study and the practical implications are discussed and future research directions are detailed.
Abstract:
The main objective of this work was to monitor a set of physical-chemical properties of heavy oil process streams through nuclear magnetic resonance spectroscopy, in order to propose an analysis procedure and online data processing for process control. Different statistical methods that relate the results obtained by nuclear magnetic resonance spectroscopy to those obtained by the conventional standard methods during the characterization of the different streams were implemented in order to develop models for predicting these same properties. Real-time knowledge of these physical-chemical properties of petroleum fractions is very important for enhancing refinery operations, ensuring technically, economically and environmentally sound operation. The first part of this work involved the determination of several physical-chemical properties at the Matosinhos refinery, following standard methods used to evaluate and characterize light vacuum gas oil, heavy vacuum gas oil and fuel oil fractions. Kinematic viscosity, density, sulfur content, flash point, carbon residue, P-value and atmospheric and vacuum distillations were the properties analysed. Besides the analysis using the standard methods, the same samples were analysed by nuclear magnetic resonance spectroscopy. The second part of this work concerned the application of multivariate statistical methods that correlate the physical-chemical properties with the quantitative information acquired by nuclear magnetic resonance spectroscopy. Several methods were applied, including principal component analysis, principal component regression, partial least squares and artificial neural networks. Principal component analysis was used to reduce the number of predictive variables and to transform them into new variables, the principal components. These principal components were used as inputs to the principal component regression and artificial neural networks models.
For the partial least squares model, the original data were used as input. Taking into account the performance of the developed models, assessed through selected statistical performance indexes, it was possible to conclude that principal component regression led to the worst performance. Better results were achieved with the partial least squares and artificial neural networks models; however, it was the artificial neural networks model that gave the best predictions for almost all of the properties analysed. Based on the results obtained, it was possible to conclude that nuclear magnetic resonance spectroscopy combined with multivariate statistical methods can be used to predict physical-chemical properties of petroleum fractions. This technique can therefore be considered a promising alternative to the conventional standard methods.
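The principal component regression step described above can be sketched in numpy. This is a generic PCR illustration, not the models fitted to the refinery data: PCA is performed on the centred predictors via SVD, and ordinary least squares is then applied to the retained scores. The synthetic data below are assumptions for the example.

```python
import numpy as np

def pcr_fit(X, y, n_components):
    """Principal component regression: PCA on the centred predictors,
    then ordinary least squares on the retained scores."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    V = Vt[:n_components].T                      # loadings (p x r)
    T = Xc @ V                                   # principal component scores (n x r)
    gamma, *_ = np.linalg.lstsq(T, y - y.mean(), rcond=None)
    beta = V @ gamma                             # map back to the original predictors
    intercept = y.mean() - X.mean(axis=0) @ beta
    return beta, intercept

# Synthetic spectra-like data: two high-variance informative columns,
# eight low-variance noise columns.
gen = np.random.default_rng(1)
n = 200
X = np.column_stack([3 * gen.normal(size=(n, 2)), 0.3 * gen.normal(size=(n, 8))])
y = X[:, 0] - 2 * X[:, 1] + 0.05 * gen.normal(size=n)

beta, b0 = pcr_fit(X, y, n_components=2)
pred = X @ beta + b0
```

Because the informative directions dominate the variance here, two components suffice; with real NMR spectra, the number of components would be chosen by cross-validation.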
Abstract:
Clustering and Disjoint Principal Component Analysis (CDPCA) is a constrained principal component analysis recently proposed for simultaneously clustering objects and partitioning variables, which we have implemented in the R language. In this paper, we discuss in detail the alternating least-squares algorithm for CDPCA and highlight its algebraic features for constructing both interpretable principal components and clusters of objects. Two applications are given to illustrate the capabilities of this new methodology.
Abstract:
Attention is usually modelled by sequential fixation of peaks in saliency maps. Those maps code local conspicuity: complexity, colour and texture. Such features bear no relation to entire objects unless disparity and optical flow are also considered, which often segregate entire objects from their background. Recently, we developed a model of local gist vision: which types of objects are roughly where in a scene. This model addresses man-made objects, which are dominated by a small shape repertoire: squares, rectangles, trapeziums, triangles, circles and ellipses. Exploiting only local colour contrast, the model can detect these shapes with a small hierarchy of cell layers devoted to low- and mid-level geometry. The model has been tested successfully on video sequences containing traffic signs and other scenes, and partial occlusions were not problematic.
Abstract:
In this paper a parallel implementation of an Adaptive Generalized Predictive Control (AGPC) algorithm is presented. Since the AGPC algorithm needs to be fed with knowledge of the plant transfer function, the parallelization of a standard Recursive Least Squares (RLS) estimator and a GPC predictor is discussed here.
Abstract:
The Adaptive Generalized Predictive Control (AGPC) algorithm can be speeded up using parallel processing. Since the AGPC algorithm needs to be fed with the knowledge of the plant transfer function, the parallelization of a standard Recursive Least Squares (RLS) estimator and a GPC predictor is discussed here.
Abstract:
The Adaptive Generalized Predictive Control (AGPC) algorithm can be speeded up using parallel processing. Since the AGPC algorithm needs to be fed with knowledge of the plant transfer function, the parallelization of a standard Recursive Least Squares (RLS) estimator and a GPC predictor is discussed here.
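The standard RLS estimator referred to in these abstracts can be sketched in a few lines; this is a sequential numpy version under assumed settings (first-order plant, forgetting factor 0.99), and the parallelization the papers address is not shown.

```python
import numpy as np

class RLS:
    """Standard recursive least-squares estimator with forgetting factor lam."""
    def __init__(self, n_params, lam=0.99, delta=100.0):
        self.theta = np.zeros(n_params)    # parameter estimate
        self.P = delta * np.eye(n_params)  # inverse correlation matrix
        self.lam = lam

    def update(self, phi, y):
        """One step: phi is the regressor vector, y the measured output."""
        Pphi = self.P @ phi
        k = Pphi / (self.lam + phi @ Pphi)   # gain vector
        err = y - phi @ self.theta           # a priori prediction error
        self.theta = self.theta + k * err
        self.P = (self.P - np.outer(k, Pphi)) / self.lam
        return self.theta

# Identify a first-order plant y[t+1] = a*y[t] + b*u[t] from input/output data.
gen = np.random.default_rng(2)
a_true, b_true = 0.8, 0.5
u = gen.normal(size=500)
y = np.zeros(501)
for t in range(500):
    y[t + 1] = a_true * y[t] + b_true * u[t] + 0.01 * gen.normal()

est = RLS(n_params=2)
for t in range(500):
    est.update(np.array([y[t], u[t]]), y[t + 1])
# est.theta converges towards [a_true, b_true].
```

In AGPC the estimated transfer-function parameters are then fed to the GPC predictor at every sampling instant, which is why speeding up this loop matters.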
Abstract:
In this paper the parallelization of a new learning algorithm for multilayer perceptrons, specifically targeted at nonlinear function approximation, is discussed. Each major step of the algorithm is parallelized, with special emphasis on the most computationally intensive task: a least-squares solution of linear systems of equations.
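The least-squares subproblem highlighted above — solving for linear output weights given fixed hidden activations — can be sketched as follows. This is a generic illustration with random (fixed) hidden weights; the actual learning algorithm and its parallel decomposition are not reproduced here.

```python
import numpy as np

gen = np.random.default_rng(3)
X = gen.uniform(-1, 1, size=(200, 1))
y = np.sin(3 * X[:, 0])                      # nonlinear target function

# Fixed (here random) hidden layer; only the output weights are solved for.
W_h = gen.normal(size=(1, 50))
b_h = gen.normal(size=50)
H = np.tanh(X @ W_h + b_h)                   # hidden activations, 200 x 50
H = np.column_stack([H, np.ones(len(H))])    # bias column for the output layer

# The computationally intensive step: a least-squares solution of H w ≈ y.
w, *_ = np.linalg.lstsq(H, y, rcond=None)
pred = H @ w
```

In a full training algorithm this solve is repeated as the hidden weights are updated, which is why it dominates the cost and is the natural target for parallelization.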
Abstract:
Master's dissertation, Qualidade em Análises, Faculdade de Ciências e Tecnologia, Univ. do Algarve, 2013
Abstract:
Freshness and safety of muscle foods are generally considered the most important parameters for the food industry. To address the rapid determination of meat spoilage, Fourier transform infrared (FTIR) spectroscopy, supported by advanced learning-based methods, was employed in this work. FTIR spectra were obtained from the surface of beef samples during aerobic storage at various temperatures, while a microbiological analysis determined the population of total viable counts. A fuzzy principal component algorithm was also developed to reduce the dimensionality of the spectral data. The results confirmed the superiority of the adopted scheme over the partial least squares technique currently used in food microbiology.
Abstract:
In this paper, we present two Partial Least Squares Regression (PLSR) models for the compressive and flexural strength responses of a concrete composite material reinforced with pultrusion wastes. The main objective is to characterize this cost-effective waste management solution for glass fiber reinforced polymer (GFRP) pultrusion wastes and end-of-life products, leading thereby to a more sustainable composite materials industry. The experiments considered formulations incorporating three different weight contents of GFRP waste materials into polyester-based mortars, as sand aggregate and filler replacements, two waste particle size grades, and the incorporation of a silane adhesion promoter into the polyester resin matrix in order to improve binder-aggregate interfaces. The regression models were fitted to these data, and two latent variables were identified as suitable at a 95% confidence level. This technological option for improving the quality of GFRP-filled polymer mortars is viable, thus opening a door to the selective recycling of GFRP waste and its use in the production of concrete-polymer based products. However, further and complementary studies will be necessary to confirm the technical and economic viability of the process.
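A PLSR model with a univariate response, as used for each strength measure here, can be sketched with the NIPALS algorithm in numpy. This is a generic PLS1 illustration on synthetic data, not the models fitted to the mortar formulations; the coefficient back-mapping beta = W (P'W)^{-1} q is the standard PLS1 formula.

```python
import numpy as np

def pls1(X, y, n_components):
    """PLS1 regression via the NIPALS algorithm (univariate response)."""
    Xr = X - X.mean(axis=0)
    yr = y - y.mean()
    W, P, q = [], [], []
    for _ in range(n_components):
        w = Xr.T @ yr
        w = w / np.linalg.norm(w)      # weight vector
        t = Xr @ w                     # latent-variable scores
        tt = t @ t
        p = Xr.T @ t / tt              # X loadings
        c = (yr @ t) / tt              # regression of y on the score
        Xr = Xr - np.outer(t, p)       # deflation
        yr = yr - c * t
        W.append(w); P.append(p); q.append(c)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    beta = W @ np.linalg.solve(P.T @ W, q)   # coefficients in the original X
    return beta, y.mean() - X.mean(axis=0) @ beta

# Synthetic formulation-like data with two informative predictors.
gen = np.random.default_rng(5)
X = gen.normal(size=(60, 8))
y = 2 * X[:, 0] - X[:, 3] + 0.1 * gen.normal(size=60)
beta, b0 = pls1(X, y, n_components=2)
pred = X @ beta + b0
```

Two latent variables are used here to mirror the paper's finding; in practice the number of components would be validated against held-out data.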
Abstract:
In this work, kriging with covariates is used to model and map the spatial distribution of salinity measurements gathered by an autonomous underwater vehicle in a sea outfall monitoring campaign, with the aim of distinguishing the effluent plume from the receiving waters and characterizing its spatial variability in the vicinity of the discharge. Four different geostatistical linear models for salinity were assumed, with the distance to the diffuser, the west-east positioning, and the south-north positioning used as covariates. Sample variograms were fitted with Matérn models using both weighted least squares and maximum likelihood estimation, as a way to detect eventual discrepancies. The maximum likelihood method typically estimated very low ranges, which limited the kriging process; so, at least for these data sets, weighted least squares proved to be the more appropriate estimation method for variogram fitting. The kriged maps clearly show the spatial variation of salinity, and it is possible to identify the effluent plume in the area studied. The results provide some guidelines for sewage monitoring when a geostatistical analysis of the data is intended: it is important to treat anomalous values properly and to adopt a sampling strategy that includes transects parallel and perpendicular to the effluent dispersion.
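The weighted-least-squares variogram fitting mentioned above can be sketched in numpy. Several simplifications are assumed: an exponential model (the Matérn case with smoothness 1/2) instead of the general Matérn family, pair counts as the WLS weights (one common choice), a coarse grid search instead of a proper optimizer, and synthetic data in place of the AUV salinity measurements.

```python
import numpy as np

def empirical_variogram(coords, values, bins):
    """Matheron's classical estimator: half the mean squared increment per lag bin."""
    dist = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sq = 0.5 * (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(len(values), k=1)   # each pair once, no self-pairs
    dist, sq = dist[iu], sq[iu]
    idx = np.digitize(dist, bins)
    lags, gammas, counts = [], [], []
    for b in range(1, len(bins)):
        m = idx == b
        if m.any():
            lags.append(dist[m].mean())
            gammas.append(sq[m].mean())
            counts.append(m.sum())
    return np.array(lags), np.array(gammas), np.array(counts)

def fit_exponential_wls(lags, gammas, counts):
    """Grid-search WLS fit of gamma(h) = sill * (1 - exp(-h / rho)),
    weighting each lag bin by its pair count."""
    best = (np.inf, None, None)
    for sill in np.linspace(0.1, 2.0, 40) * gammas.max():
        for rho in np.linspace(0.05, 2.0, 40) * lags.max():
            model = sill * (1.0 - np.exp(-lags / rho))
            sse = np.sum(counts * (gammas - model) ** 2)
            if sse < best[0]:
                best = (sse, sill, rho)
    return best[1], best[2]

# Synthetic salinity-like field with an exponential covariance (sill 1, range 2).
gen = np.random.default_rng(4)
coords = gen.uniform(0, 10, size=(120, 2))
d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
C = np.exp(-d / 2.0)
z = np.linalg.cholesky(C + 1e-8 * np.eye(120)) @ gen.normal(size=120)

lags, gammas, counts = empirical_variogram(coords, z, bins=np.linspace(0.0, 6.0, 13))
sill, rho = fit_exponential_wls(lags, gammas, counts)
```

The fitted sill and range would then parameterize the kriging system; the covariate terms (distance to diffuser, positioning) would enter as a trend model before this residual variogram is computed.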
Abstract:
A PhD Dissertation, presented as part of the requirements for the Degree of Doctor of Philosophy from the NOVA - School of Business and Economics
Abstract:
Geographic information systems give us the possibility to analyze, produce, and edit geographic information. However, these systems fall short in the analysis and support of complex spatial problems. Therefore, when a spatial problem, such as land use management, requires a multi-criteria perspective, multi-criteria decision analysis is incorporated into spatial decision support systems. The analytic hierarchy process is one of many multi-criteria decision analysis methods that can be used to support such complex problems. Using its capabilities, we develop a spatial decision support system to help land use management. Land use management can involve a broad spectrum of spatial decision problems. The developed decision support system had to accept as input various formats and types of data, in raster or vector format, the vector data being of polygon, line or point type. The system was designed to perform its analysis for the Zambezi River Valley in Mozambique, the study area. The possible solutions for the emerging problems had to cover the entire region, which required the system to process large data sets and constantly adjust to the needs of new problems. The developed decision support system is able to process thousands of alternatives using the analytic hierarchy process and to produce an output suitability map for the problems faced.
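The core analytic hierarchy process computation — deriving criterion priorities from a pairwise-comparison matrix — can be sketched as follows. The comparison values are hypothetical, chosen only to illustrate the method; Saaty's principal-eigenvector method and consistency-ratio check are used.

```python
import numpy as np

def ahp_priorities(A):
    """Priority vector and consistency ratio of a pairwise-comparison matrix
    (principal-eigenvector method; Saaty's random index for small n)."""
    vals, vecs = np.linalg.eig(A)
    i = np.argmax(vals.real)
    w = np.abs(vecs[:, i].real)
    w = w / w.sum()                         # normalised priorities
    n = A.shape[0]
    ci = (vals[i].real - n) / (n - 1)       # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]    # Saaty's random index
    return w, ci / ri

# Hypothetical comparison of three land-use criteria (illustrative values only):
# criterion 1 is moderately preferred to 2, strongly preferred to 3, etc.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 3.0],
              [1/5, 1/3, 1.0]])
w, cr = ahp_priorities(A)
# A consistency ratio below 0.1 is conventionally taken as acceptable.
```

In a spatial decision support system, the priority vector `w` would weight the criterion layers when overlaying them into the output suitability map.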