918 results for least squares method


Relevance:

90.00%

Publisher:

Abstract:

Quantitative Structure-Activity Relationship (QSAR) modelling has been applied extensively in predicting the toxicity of Disinfection By-Products (DBPs) in drinking water. Among many toxicological properties, the acute and chronic toxicities of DBPs have been widely used in their health risk assessment. These toxicities are correlated with molecular properties, which in turn are usually correlated with molecular descriptors. The primary goals of this thesis are: 1) to investigate the effects of molecular descriptors (e.g., chlorine number) on molecular properties such as the energy of the lowest unoccupied molecular orbital (ELUMO) via QSAR modelling and analysis; 2) to validate the models by using internal and external cross-validation techniques; and 3) to quantify the model uncertainties through Taylor series and Monte Carlo simulation methods. QSAR analysis is one of the most important ways to predict molecular properties such as ELUMO. In this study, the number of chlorine atoms (NCl), the number of carbon atoms (NC) and the energy of the highest occupied molecular orbital (EHOMO) are used as molecular descriptors. Three approaches are typically used in QSAR model development: 1) Linear or Multi-Linear Regression (MLR); 2) Partial Least Squares (PLS); and 3) Principal Component Regression (PCR). In QSAR analysis, a very critical step is model validation, after QSAR models are established and before they are applied to toxicity prediction. The DBPs studied include five chemical classes, among them chlorinated alkanes, alkenes, and aromatics. In addition, validated QSARs are developed to describe the toxicity of selected groups of DBP chemicals (i.e., chloro-alkanes and aromatic compounds with a nitro or cyano group) to three types of organisms (fish, T. pyriformis, and P. pyosphoreum), based on experimental toxicity data from the literature.
The results show that: 1) QSAR models to predict molecular properties built by MLR, PLS or PCR can be used either to select valid data points or to eliminate outliers; 2) the Leave-One-Out cross-validation procedure by itself is not enough to give a reliable representation of the predictive ability of the QSAR models; however, Leave-Many-Out/K-fold cross-validation and external validation can be applied together to achieve more reliable results; 3) ELUMO is shown to correlate strongly with NCl for several classes of DBPs; and 4) according to the uncertainty analysis using the Taylor series method, the uncertainty of the QSAR models is contributed mostly by NCl for all DBP classes.
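A minimal sketch of the MLR-plus-LOO workflow described above, on invented descriptor data (the coefficients, descriptor values and noise level are assumptions for illustration, not values from the thesis):

```python
import numpy as np

# Hypothetical illustration: fit a multi-linear regression (MLR) of ELUMO on
# the descriptors NCl, NC and EHOMO, then check it with leave-one-out (LOO)
# cross-validation. All numbers are invented for the sketch.
rng = np.random.default_rng(42)
n = 30
ncl = rng.integers(1, 6, n).astype(float)      # number of chlorine atoms
nc = rng.integers(1, 8, n).astype(float)       # number of carbon atoms
ehomo = -9.0 + 0.2 * rng.standard_normal(n)    # eV, invented
elumo = 0.5 - 0.4 * ncl + 0.05 * nc + 0.1 * ehomo + 0.05 * rng.standard_normal(n)

X = np.column_stack([np.ones(n), ncl, nc, ehomo])

# Ordinary least squares fit on the full data set
beta, *_ = np.linalg.lstsq(X, elumo, rcond=None)

# Leave-one-out cross-validation: refit n times, predicting the held-out point
press = 0.0
for i in range(n):
    keep = np.arange(n) != i
    b, *_ = np.linalg.lstsq(X[keep], elumo[keep], rcond=None)
    press += (elumo[i] - X[i] @ b) ** 2

q2 = 1.0 - press / np.sum((elumo - elumo.mean()) ** 2)  # predictive R^2 (Q2)
print(f"LOO Q2 = {q2:.3f}")
```

A Q2 close to the calibration R2 suggests the model is not overfitted, although, as the thesis notes, LOO alone is not a sufficient validation.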

Relevance:

90.00%

Publisher:

Abstract:

Quantile regression (QR) was first introduced by Roger Koenker and Gilbert Bassett in 1978. Unlike the least squares estimator in linear regression, which can be affected by outliers on a large scale, QR is robust to them. Instead of modeling the mean of the response, QR provides an alternative way to model the relationship between the quantiles of the response and the covariates. QR can therefore be widely used to solve problems in econometrics, environmental sciences and health sciences. Sample size is an important factor in the planning stage of experimental designs and observational studies. In ordinary linear regression, the sample size may be determined with closed-form formulas based on either precision analysis or power analysis. There are also methods that calculate sample size for QR based on precision analysis, such as that of C. Jennen-Steinmetz and S. Wellek (2005), and a method based on power analysis was proposed by Shao and Wang (2009). In this paper, a new method is proposed to calculate sample size based on power analysis under a hypothesis test of covariate effects. Even though an error distribution assumption is not necessary for QR analysis itself, researchers have to make assumptions about the error distribution and covariate structure in the planning stage of a study to obtain a reasonable estimate of the sample size. In this project, both parametric and nonparametric methods are provided to estimate the error distribution. Since the proposed method is implemented in R, the user is able to choose either a parametric distribution or nonparametric kernel density estimation for the error distribution. The user also needs to specify the covariate structure and effect size to carry out the sample size and power calculation. The performance of the proposed method is further evaluated using numerical simulation. The results suggest that the sample sizes obtained from our method provide empirical powers that are close to the nominal power level, for example, 80%.
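The robustness claim in the opening sentences can be illustrated with a small simulation; the pinball-loss fit below is a generic median-regression sketch on invented data, not the sample size method proposed in the thesis:

```python
import numpy as np
from scipy.optimize import minimize

# Toy sketch: median (tau = 0.5) regression fitted by minimizing the pinball
# (check) loss, compared with OLS on outlier-contaminated data. True model:
# y = 1 + 2x; 10% of observations receive a gross positive shift.
rng = np.random.default_rng(0)
n = 200
x = rng.uniform(0, 10, n)
y = 1.0 + 2.0 * x + rng.standard_normal(n)
y[rng.choice(n, 20, replace=False)] += 50.0   # 10% gross outliers

X = np.column_stack([np.ones(n), x])

def pinball_loss(beta, X, y, tau):
    r = y - X @ beta
    return np.sum(np.maximum(tau * r, (tau - 1.0) * r))

ols = np.linalg.lstsq(X, y, rcond=None)[0]
qr = minimize(pinball_loss, ols, args=(X, y, 0.5), method="Nelder-Mead").x

print("OLS intercept, slope:", ols)   # intercept pulled far above 1 by outliers
print("QR  intercept, slope:", qr)    # close to the true (1, 2)
```

The OLS intercept is dragged upward by roughly the contamination fraction times the outlier magnitude, while the median regression estimate stays near the truth.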

Relevance:

90.00%

Publisher:

Abstract:

Syria has been a major producer and exporter of fresh fruit and vegetables (FFV) in the Arab region. Prior to 2011, Syrian FFV were mainly exported to the neighbouring countries, the Gulf States and Northern Africa, as well as to Eastern European countries. Although the EU is potentially one of the most profitable markets for high-quality FFV (such as organic ones) in the world, Syrian exports of FFV to Western European countries like Germany have been small. It could be a lucrative opportunity for Syrian growers and exporters of FFV to export organic products to markets such as Germany, where national production is limited to a few months due to climatic conditions. Yet the organic sector in Syria is comparatively young, and only a very small area of FFV is certified according to EU organic regulations. To the author's knowledge, little was known about Syrian farmers' attitudes towards organic FFV production, and no study had so far explored and analysed the determining factors for organic FFV adoption among Syrian farmers or the exports of these products to EU markets. The overarching aim of the present dissertation was to explore and identify the market potential of Syrian exports of organic FFV to Germany. The dissertation was therefore concerned with three main objectives: (i) to explore whether German importers and wholesalers of organic FFV see market opportunities for Syrian organic products and what requirements in terms of quality and quantity they have, (ii) to determine the obstacles Syrian producers and exporters face when exporting agricultural products to Germany, and (iii) to investigate whether Syrian farmers of FFV can imagine converting their farms to organic production, as well as the underlying reasons why they would or would not do so. A twofold methodological approach with expert interviews and a farmer survey was used in this dissertation to address the abovementioned objectives.
While the expert interviews were conducted with German and Syrian wholesalers of (organic) FFV in 2011 (9 interviews each), the farmer survey was administered to 266 Syrian farmers of FFV in the main region for the production of FFV (i.e. the coastal region) from November 2012 till May 2013. For modelling farmers' decisions to adopt organic farming, the Theory of Planned Behaviour was used as the theoretical framework and Partial Least Squares Structural Equation Modelling as the main method for data analysis. The findings of this dissertation yield implications for the different stakeholders (governmental institutions and NGOs, farmers, exporters, wholesalers, etc.) who are interested in promoting Syrian exports of organic products. Based on the empirical results and a literature review, an action plan to promote Syrian production and export of organic products was developed, which can help improve the organic sector in Syria in the post-war period.

Relevance:

90.00%

Publisher:

Abstract:

1. Genome-wide association studies (GWAS) enable detailed dissections of the genetic basis for organisms' ability to adapt to a changing environment. In long-term studies of natural populations, individuals are often marked at one point in their life and then repeatedly recaptured. It is therefore essential that a method for GWAS includes the process of repeated sampling. In a GWAS, the effects of thousands of single-nucleotide polymorphisms (SNPs) need to be fitted and any model development is constrained by the computational requirements. A method is therefore required that can fit a highly hierarchical model and at the same time is computationally fast enough to be useful. 2. Our method fits fixed SNP effects in a linear mixed model that can include both random polygenic effects and permanent environmental effects. In this way, the model can correct for population structure and model repeated measures. The covariance structure of the linear mixed model is first estimated and subsequently used in a generalized least squares setting to fit the SNP effects. The method was evaluated in a simulation study based on observed genotypes from a long-term study of collared flycatchers in Sweden. 3. The method we present here was successful in estimating permanent environmental effects from simulated repeated-measures data. Additionally, we found that, especially for phenotypes with large variation between years, the repeated-measurements model has a substantial increase in power compared to a model using average phenotypes as the response. 4. The method is available in the R package RepeatABEL. It increases the power of GWAS with repeated measures, especially for long-term studies of natural populations, and the R implementation is expected to facilitate modelling of longitudinal data for studies of both animal and human populations.
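The two-stage idea in point 2 (estimate the covariance structure, then fit the SNP effect by generalized least squares) can be sketched as follows; the polygenic/GRM term is dropped and the variance components are taken as known, both simplifications relative to what RepeatABEL does:

```python
import numpy as np

# Minimal sketch on simulated data: repeated measures on individuals, a
# permanent environmental effect per individual, and a GLS fit of one SNP.
rng = np.random.default_rng(1)
n_ind, n_rep = 200, 3
N = n_ind * n_rep

Z = np.kron(np.eye(n_ind), np.ones((n_rep, 1)))  # record -> individual map
snp = rng.binomial(2, 0.3, n_ind).astype(float)  # genotypes coded 0/1/2
beta_true = 0.5

u = rng.standard_normal(n_ind)                   # permanent environment, var 1
e = rng.standard_normal(N)                       # residual, var 1
y = 2.0 + beta_true * (Z @ snp) + Z @ u + e

# Covariance implied by the variance components (here assumed known; in
# RepeatABEL they are estimated from the data first)
V = Z @ Z.T + np.eye(N)
Vinv = np.linalg.inv(V)

X = np.column_stack([np.ones(N), Z @ snp])
beta_hat = np.linalg.solve(X.T @ Vinv @ X, X.T @ Vinv @ y)
print(f"estimated SNP effect: {beta_hat[1]:.3f} (true {beta_true})")
```

Because V is fixed across SNPs, only the cheap GLS solve is repeated per marker, which is what makes the two-stage scheme fast enough for genome-wide scans.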

Relevance:

90.00%

Publisher:

Abstract:

When a company decides to invest in a project, it must obtain the resources needed to make the investment. The alternatives are using the firm's internal resources or obtaining external resources through debt contracts and the issuance of shares. Decisions involving the composition of internal resources, debt and shares in the total resources used to finance the activities of a company relate to the choice of its capital structure. Although there are studies in the area of finance on the debt determinants of firms, the issue of capital structure is still controversial. This work sought to identify the predominant factors that determine the capital structure of Brazilian joint-stock, non-financial firms. A quantitative approach was used, applying the statistical technique of multiple linear regression on panel data. Estimates were made by the method of ordinary least squares with a fixed-effects model. A total of 116 companies were selected for this research, covering the period from 2003 to 2007. The variables and hypotheses tested in this study were built based on capital structure theories and empirical research. Results indicate that variables such as risk, size, composition of assets and firm growth influence firms' indebtedness. The profitability variable was not relevant to the composition of indebtedness of the companies analyzed. However, analyzing only long-term debt leads to the conclusion that the relevant variables are firm size and, especially, the composition of assets (tangibility). In this sense, the smaller the firm or the greater the share of fixed assets in total assets, the greater its propensity to long-term debt. Furthermore, this research could not identify a predominant theory to explain the capital structure of Brazilian firms.
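The fixed-effects panel estimation mentioned above can be sketched with the within (demeaning) transformation; all data and variable names below are simulated placeholders, not the study's sample:

```python
import numpy as np

# Sketch of the fixed-effects (within) estimator for panel data: demean each
# firm's series so firm-specific intercepts drop out, then apply OLS.
rng = np.random.default_rng(7)
n_firms, n_years = 116, 5

alpha = rng.standard_normal(n_firms)              # unobserved firm effects
# regressor correlated with the firm effect (what breaks pooled OLS)
size = rng.standard_normal((n_firms, n_years)) + alpha[:, None]
beta_true = 0.3
leverage = alpha[:, None] + beta_true * size + 0.1 * rng.standard_normal((n_firms, n_years))

# Within transformation: subtract each firm's time average
size_w = size - size.mean(axis=1, keepdims=True)
lev_w = leverage - leverage.mean(axis=1, keepdims=True)

beta_fe = (size_w * lev_w).sum() / (size_w ** 2).sum()
print(f"fixed-effects estimate: {beta_fe:.3f} (true {beta_true})")
```

Demeaning removes the firm intercepts entirely, so the slope is identified from within-firm variation over time only.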

Relevance:

90.00%

Publisher:

Abstract:

One of the most disputed matters in the theory of finance has been the theory of capital structure. The seminal contributions of Modigliani and Miller (1958, 1963) gave rise to a multitude of studies and debates. Since the initial spark, the financial literature has offered two competing theories of the financing decision: the trade-off theory and the pecking order theory. The trade-off theory suggests that firms have an optimal capital structure balancing the benefits and costs of debt. The pecking order theory approaches the firm's capital structure from an information asymmetry perspective and assumes a hierarchy of financing, with firms using internal funds first, followed by debt and, as a last resort, equity. This thesis analyses the trade-off and pecking order theories and their predictions on a panel data set consisting of 78 Finnish firms listed on the OMX Helsinki stock exchange. Estimations are performed for the period 2003–2012. The data are collected from the Datastream system and consist of financial statement data. A number of capital structure determinants are identified: firm size, profitability, firm growth opportunities, risk, asset tangibility and taxes, speed of adjustment and financial deficit. A regression analysis is used to examine the effects of the firm characteristics on capital structure. The regression models were formed based on the relevant theories. The general capital structure model is estimated with the fixed effects estimator. Dynamic models also play an important role in several areas of corporate finance, but with the combination of fixed effects and lagged dependent variables the model estimation is more complicated. A dynamic partial adjustment model is therefore estimated using the Arellano and Bond (1991) first-differencing generalized method of moments, as well as the ordinary least squares and fixed effects estimators. The results for Finnish listed firms show support for the predictions concerning profitability, firm size and non-debt tax shields.
No conclusive support for the pecking order theory is found. However, the effect of pecking order cannot be fully ignored, and it is concluded that, instead of being substitutes, the trade-off and pecking order theories appear to complement each other. For the partial adjustment model, the results show that Finnish listed firms adjust towards their target capital structure at a speed of 29% a year using the book debt ratio.
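A toy version of the partial adjustment model behind the 29% figure, simulated without firm fixed effects (with fixed effects and a lagged dependent variable plain OLS is biased, which is why the thesis also uses the Arellano-Bond GMM estimator; OLS is shown here only for the idea):

```python
import numpy as np

# Partial adjustment: D_t = (1 - lam) * D_{t-1} + lam * D_target + e_t,
# where lam is the speed of adjustment towards the target debt ratio.
rng = np.random.default_rng(3)
lam_true, d_target, T = 0.29, 0.5, 2000

d = np.empty(T)
d[0] = d_target
for t in range(1, T):
    d[t] = (1 - lam_true) * d[t - 1] + lam_true * d_target + 0.02 * rng.standard_normal()

# Regress D_t on D_{t-1}; the speed of adjustment is 1 minus the slope
X = np.column_stack([np.ones(T - 1), d[:-1]])
coef = np.linalg.lstsq(X, d[1:], rcond=None)[0]
lam_hat = 1.0 - coef[1]
print(f"estimated speed of adjustment: {lam_hat:.3f} (true {lam_true})")
```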

Relevance:

90.00%

Publisher:

Abstract:

Considering the social and economic importance of milk, the objective of this study was to evaluate the incidence of antimicrobial residues in this food and to quantify them. The samples were collected from the dairy industry of southwestern Paraná state so as to cover all ten municipalities in the region of Pato Branco. The work focused on the development of appropriate models for the identification and quantification of the analytes tetracycline, sulfamethazine, sulfadimethoxine, chloramphenicol and ampicillin, all antimicrobials of health interest. For the calibration and validation of the models, Fourier Transform Infrared Spectroscopy was used in association with a chemometric method based on Partial Least Squares (PLS) regression. To prepare the antimicrobial working solutions, the five analytes of interest were used in increasing doses: tetracycline from 0 to 0.60 ppm, sulfamethazine from 0 to 0.12 ppm, sulfadimethoxine from 0 to 2.40 ppm, chloramphenicol from 0 to 1.20 ppm and ampicillin from 0 to 1.80 ppm, so as to address multiresidue analysis. The performance of the constructed models was evaluated through figures of merit: the mean square errors of calibration and cross-validation, correlation coefficients and the ratio of performance to deviation. For the purposes of this work, the models generated for tetracycline, sulfadimethoxine and chloramphenicol were considered viable, with the greatest predictive power and efficiency, and were then employed to evaluate the quality of raw milk from the region of Pato Branco. Among the samples analyzed by NIR, 70% were in conformity with sanitary legislation, and 5% of these samples had concentrations below the permitted Maximum Residue Limit, which is also satisfactory.
However, 30% of the sample set showed unsatisfactory results when evaluated for contamination with antimicrobial residues, a non-conformity related to the presence of antimicrobials of unauthorized use or at concentrations above the permitted limits. This work shows that laboratory testing in the food area using infrared spectroscopy with multivariate calibration performs well, with fast analyses, reduced costs and minimal generation of laboratory waste. Thus, the proposed alternative method meets the quality concerns and the efficiency desired by industrial sectors and society in general.
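The PLS calibration at the core of this work can be sketched with a self-contained NIPALS implementation on simulated "spectra" (a noisy linear mixture); the data, peak shape and component count are assumptions for illustration, not FTIR milk measurements:

```python
import numpy as np

# PLS1 regression by NIPALS: predict an analyte concentration from spectra.
rng = np.random.default_rng(5)
n_samples, n_wavenumbers = 60, 200

conc = rng.uniform(0, 0.6, n_samples)                 # e.g. tetracycline, ppm
pure = np.exp(-0.5 * ((np.arange(n_wavenumbers) - 80) / 10.0) ** 2)  # analyte band
background = rng.standard_normal((3, n_wavenumbers))  # interfering components
spectra = (np.outer(conc, pure)
           + rng.uniform(0, 1, (n_samples, 3)) @ background * 0.05
           + 0.01 * rng.standard_normal((n_samples, n_wavenumbers)))

def pls1_fit(X, y, n_comp):
    """PLS1 by NIPALS; returns the regression vector b and the means."""
    Xm, ym = X.mean(0), y.mean()
    Xc, yc = X - Xm, y - ym
    W, P, q = [], [], []
    for _ in range(n_comp):
        w = Xc.T @ yc; w /= np.linalg.norm(w)       # weight vector
        t = Xc @ w                                   # scores
        p = Xc.T @ t / (t @ t)                       # loadings
        qk = yc @ t / (t @ t)
        Xc = Xc - np.outer(t, p); yc = yc - qk * t   # deflation
        W.append(w); P.append(p); q.append(qk)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    b = W @ np.linalg.solve(P.T @ W, q)
    return b, Xm, ym

b, Xm, ym = pls1_fit(spectra, conc, n_comp=4)
pred = (spectra - Xm) @ b + ym
r2 = 1 - np.sum((conc - pred) ** 2) / np.sum((conc - conc.mean()) ** 2)
print(f"calibration R2 = {r2:.3f}")
```

In practice the number of latent variables would be chosen by cross-validation, exactly the RMSECV figure of merit the abstract mentions.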

Relevance:

90.00%

Publisher:

Abstract:

Routine analyses for the quantitation of organic acids and sugars are generally slow methods that involve the use and preparation of several reagents, require trained professionals and the availability of special equipment, and are expensive. In this context, there has been increasing investment in research aimed at developing alternatives to the reference methods that are faster, cheaper and simpler, and infrared spectroscopy has stood out in this regard. The present study developed multivariate calibration models for the simultaneous, quantitative determination of ascorbic, citric, malic and tartaric acids, the sugars sucrose, glucose and fructose, and soluble solids in fruit juices and nectars, as well as classification models based on principal component analysis. Near-infrared (NIR) spectroscopy was used in association with partial least squares (PLS) regression. Forty-two samples of fruit juices and nectars commercially available in local shops were used. For the construction of the models, reference analyses were performed using high-performance liquid chromatography (HPLC), with refractometry for the soluble solids. Subsequently, the spectra were acquired in triplicate over the spectral range 12500 to 4000 cm-1. The best models were applied to the quantification of the analytes under study in natural juices and in juice samples produced in the Paraná Southwest Region. The juices used in the application of the models also underwent physicochemical analyses. Validation of the chromatographic methodology showed satisfactory results, since the external calibration curves obtained R-squared values (R2) above 0.98 and coefficients of variation (%CV) for intermediate precision and repeatability below 8.83%.
Through Principal Component Analysis (PCA) it was possible to separate the juice samples into two major groups, grape and apple versus tangerine and orange, while for the nectars the groups separated were guava and grape versus pineapple and apple. Using different validation methods and pre-processing techniques, separately and in combination, multivariate calibration models were obtained with root mean square errors of prediction (RMSEP) and cross-validation (RMSECV) below 1.33 and 1.53 g/100 mL, respectively, and R2 above 0.771, except for malic acid. The physicochemical analyses enabled the characterization of the drinks, including the pH working range (2.83 to 5.79) and acidity within the regulatory parameters for each flavor. The regression models demonstrated that ascorbic, citric, malic and tartaric acids, as well as sucrose, glucose and fructose, can be successfully determined from a single spectrum, suggesting that the models are economically viable for quality control and product standardization in the fruit juice and nectar processing industry.
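The PCA grouping step can be sketched by computing principal components via the SVD on two synthetic groups; the data below are simulated stand-ins, not NIR spectra of juices:

```python
import numpy as np

# PCA by SVD: two groups with different mean "spectra" should separate along
# the first principal component, as the juice groups did in this study.
rng = np.random.default_rng(11)
n_per, p = 20, 50

group_a = rng.standard_normal((n_per, p)) + np.linspace(0, 2, p)  # e.g. grape/apple
group_b = rng.standard_normal((n_per, p)) - np.linspace(0, 2, p)  # e.g. tangerine/orange
X = np.vstack([group_a, group_b])

Xc = X - X.mean(axis=0)                 # mean-center the columns
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T                  # projections onto PC1 and PC2

pc1_a, pc1_b = scores[:n_per, 0], scores[n_per:, 0]
print("group means on PC1:", pc1_a.mean(), pc1_b.mean())
```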

Relevance:

90.00%

Publisher:

Abstract:

Three sediment records of sea surface temperature (SST) are analyzed that originate from distant locations in the North Atlantic, have centennial-to-multicentennial resolution, are based on the same reconstruction method and chronological assumptions, and span the past 15 000 yr. Using recursive least squares techniques, an estimate of the time-dependent North Atlantic SST field over the last 15 kyr is sought that is consistent with both the SST records and a surface ocean circulation model, given estimates of their respective error (co)variances. Under the authors' assumptions about data and model errors, it is found that the 10 degrees C mixed layer isotherm, which approximately traces the modern Subpolar Front, would have moved by ~15 degrees of latitude southward (northward) in the eastern North Atlantic at the onset (termination) of the Younger Dryas cold interval (YD), a result significant at the level of two standard deviations in the isotherm position. In contrast, meridional movements of the isotherm in the Newfoundland basin are estimated to be small and not significant. Thus, the isotherm would have pivoted twice around a region southeast of the Grand Banks, with a southwest-northeast orientation during the warm intervals of the Bølling-Allerød and the Holocene and a more zonal orientation and southerly position during the cold interval of the YD. This study provides an assessment of the significance of similar previous inferences and illustrates the potential of recursive least squares in paleoceanography.
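A generic recursive least squares update of the kind named here, shown on a toy linear model rather than the SST field estimation problem:

```python
import numpy as np

# Recursive least squares (RLS): the parameter estimate and its covariance
# are updated one observation at a time, which is what makes the method
# sequential and suitable for time-dependent estimation problems.
rng = np.random.default_rng(2)
theta_true = np.array([1.5, -0.7])

theta = np.zeros(2)              # initial estimate
P = 1e3 * np.eye(2)              # initial (deliberately large) covariance
r = 0.25                         # observation error variance

for _ in range(500):
    h = rng.standard_normal(2)               # observation operator (row)
    y = h @ theta_true + np.sqrt(r) * rng.standard_normal()
    k = P @ h / (h @ P @ h + r)              # gain
    theta = theta + k * (y - h @ theta)      # estimate update
    P = P - np.outer(k, h @ P)               # covariance update

print("RLS estimate:", theta, "true:", theta_true)
```

Each update weighs the new observation against the current uncertainty P, so the estimate and its error covariance evolve together as data arrive.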

Relevance:

90.00%

Publisher:

Abstract:

This work concerns the analysis of the major sugars in foods (glucose, fructose and sucrose) with a potentiometric electronic tongue through multivariate calibration with sensor selection. The analysis of these compounds contributes to assessing the impact of sugars on health and their physiological effects, in addition to relating sensory attributes and supporting quality control and authenticity assessment of foods. Although several analytical methodologies are routinely used in the identification and quantification of sugars in foods, these methods generally present a number of disadvantages, such as slow analyses, high consumption of chemical reagents and the need for destructive sample pre-treatments. It was therefore decided to apply a potentiometric electronic tongue, built with polymeric sensors selected according to the sensitivities to sugars obtained in previous work, to the analysis of sugars in foods, aiming to establish an analytical methodology and mathematical procedures for the quantification of these compounds. For this purpose, analyses were carried out on standard solutions of ternary mixtures of the sugars at different concentration levels and on solutions of dissolved honey samples, which had previously been analysed by HPLC to determine the reference concentrations of the sugars. An exploratory analysis of the data was then carried out, aiming to remove discordant sensors or observations through principal component analysis. Next, multiple linear regression models with variable selection using the stepwise algorithm were built, and it was found that, although a good relationship between the sensor responses and the sugar concentrations could be established, the models did not show satisfactory predictive performance on test-set data.
Thus, to overcome this problem, new approaches were tested through the construction and optimization of the parameters of a genetic algorithm for variable selection that could be applied to several regression tools, among them partial least squares regression. Good prediction results were obtained for the models built with the partial least squares method combined with the genetic algorithm, both for the standard solutions and for the honey solutions, with adjusted R² above 0.99 and RMSE below 0.5, obtained from the linear relationship between predicted and experimental values using test-set data. The multi-sensor system built proved to be a suitable tool for the analysis of sugars, when present in major concentrations, and an alternative to instrumental reference methods such as HPLC, by reducing the time and monetary cost of the analysis, requiring minimal sample preparation and eliminating polluting end products.
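The genetic-algorithm variable selection can be sketched compactly; the fitness below is a BIC-style penalized least squares score rather than the PLS-based criterion used in the work, and the data are simulated:

```python
import numpy as np

# GA variable selection: binary masks over candidate variables (sensors),
# truncation selection with elitism, one-point crossover and bit-flip mutation.
rng = np.random.default_rng(4)
n, p = 120, 20
informative = [0, 3, 7, 12]                       # truly relevant variables

X = rng.standard_normal((n, p))
y = X[:, informative] @ np.array([5.0, -4.0, 3.0, 4.0]) + 0.5 * rng.standard_normal(n)

def fitness(mask):
    if mask.sum() == 0:
        return -np.inf
    Xs = X[:, mask.astype(bool)]
    beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
    rss = np.sum((y - Xs @ beta) ** 2)
    return -(n * np.log(rss / n) + np.log(n) * mask.sum())  # BIC-style score

pop = rng.integers(0, 2, (40, p))
for _ in range(30):
    scores = np.array([fitness(m) for m in pop])
    parents = pop[np.argsort(scores)[::-1][:20]]            # keep the best half
    cut = rng.integers(1, p, 20)
    children = np.array([np.concatenate([parents[i][:c], parents[(i + 1) % 20][c:]])
                         for i, c in enumerate(cut)])        # one-point crossover
    flip = rng.random(children.shape) < 0.05
    children = np.where(flip, 1 - children, children)        # mutation
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(m) for m in pop])]
print("selected variables:", np.flatnonzero(best))
```

Keeping the parents each generation (elitism) guarantees the best score never degrades, and the size penalty discourages masks that carry uninformative sensors.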

Relevance:

90.00%

Publisher:

Abstract:

In this work, the relationship between diameter at breast height (d) and total height (h) of individual trees was modeled with the aim of establishing provisional height-diameter (h-d) equations for maritime pine (Pinus pinaster Ait.) stands in the Lomba ZIF, Northeast Portugal. Using data collected locally, several local and generalized h-d equations from the literature were tested and adaptations were also considered. Model fitting was conducted using the usual nonlinear least squares (nls) methods. The best local and generalized models selected were also tested as mixed models, applying a first-order conditional expectation (FOCE) approximation procedure and maximum likelihood methods to estimate fixed and random effects. For the calibration of the mixed models, and in order to be consistent with the fitting procedure, the FOCE method was also used to test different sampling designs. The results showed that the local h-d equations with two parameters performed better than the analogous models with three parameters. However, a unique set of parameter values for the local model cannot be applied to all maritime pine stands in the Lomba ZIF, and thus a generalized model including covariates from the stand, in addition to d, was necessary to obtain an adequate predictive performance. No evident superiority of the generalized mixed model over the generalized model with nonlinear least squares parameter estimates was observed. On the other hand, in the case of the local model, the predictive performance greatly improved when random effects were included. The results showed that the mixed model based on the selected local h-d equation is a viable alternative for estimating h if variables from the stand are not available. Moreover, it was observed that an adequate calibrated response can be obtained using only 2 to 5 additional h-d measurements in quantile (or random) trees from the distribution of d in the plot (stand).
Balancing sampling effort, accuracy and straightforwardness in practical applications, the generalized model from the nls fit is recommended. Examples of applications of the selected generalized equation to forest management are presented, namely how to use it to complete missing information from forest inventories, and how such an equation can be incorporated in a stand-level decision support system that aims to optimize forest management for the maximization of wood volume production in Lomba ZIF maritime pine stands.
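Fitting a two-parameter local h-d equation by nonlinear least squares can be sketched as follows; the saturating model form and all data are illustrative assumptions, not the equation selected in the study:

```python
import numpy as np
from scipy.optimize import curve_fit

# Two-parameter height-diameter model fitted by nonlinear least squares.
rng = np.random.default_rng(9)

def h_model(d, a, b):
    # total height as a saturating function of diameter at breast height;
    # the 1.3 m offset is breast height, so h -> 1.3 as d -> 0
    return 1.3 + a * d / (b + d)

d_obs = rng.uniform(5, 45, 80)                       # dbh in cm, simulated
h_obs = h_model(d_obs, 22.0, 12.0) + rng.standard_normal(80)  # heights in m

params, cov = curve_fit(h_model, d_obs, h_obs, p0=[20.0, 10.0])
print("fitted (a, b):", params)
```

A generalized version would let a and b depend on stand covariates (e.g. dominant height), which is the extension the study found necessary across stands.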


Relevance:

90.00%

Publisher:

Abstract:

Inter-subject parcellation of functional Magnetic Resonance Imaging (fMRI) data based on a standard General Linear Model (GLM) and spectral clustering was recently proposed as a means to alleviate the issues associated with spatial normalization in fMRI. However, for all its appeal, a GLM-based parcellation approach introduces its own biases, in the form of a priori knowledge about the shape of the Hemodynamic Response Function (HRF) and task-related signal changes, or about the subjects' behaviour during the task. In this paper, we introduce a data-driven version of the spectral clustering parcellation, based on Independent Component Analysis (ICA) and Partial Least Squares (PLS) instead of the GLM. First, a number of independent components are automatically selected. Seed voxels are then obtained from the associated ICA maps, and we compute the PLS latent variables between the fMRI signal of the seed voxels (which covers regional variations of the HRF) and the principal components of the signal across all voxels. Finally, we parcellate all subjects' data with a spectral clustering of the PLS latent variables. We present results of the application of the proposed method on both single-subject and multi-subject fMRI datasets. Preliminary experimental results, evaluated with the intra-parcel variance of GLM t-values and PLS-derived t-values, indicate that this data-driven approach offers an improvement in parcellation accuracy over GLM-based techniques.
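The spectral clustering step of the pipeline can be sketched on toy 2-D points standing in for the PLS latent variables; this is a bare-bones normalized-Laplacian version, not the paper's full ICA+PLS chain:

```python
import numpy as np
from scipy.cluster.vq import kmeans2

# Bare-bones spectral clustering: Gaussian affinity, normalized graph
# Laplacian, embedding on the leading eigenvectors, then k-means.
rng = np.random.default_rng(6)
pts = np.vstack([rng.standard_normal((30, 2)) + [5, 0],
                 rng.standard_normal((30, 2)) - [5, 0]])

d2 = np.sum((pts[:, None, :] - pts[None, :, :]) ** 2, axis=-1)
W = np.exp(-d2 / 2.0)                                # affinity matrix
deg = W.sum(axis=1)
L = np.eye(len(pts)) - W / np.sqrt(np.outer(deg, deg))  # normalized Laplacian

vals, vecs = np.linalg.eigh(L)       # eigenvalues in ascending order
emb = vecs[:, :2]                    # embed on the two smallest eigenvectors
_, labels = kmeans2(emb, 2, minit="++", seed=0)

print("cluster sizes:", np.bincount(labels))
```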

Relevance:

90.00%

Publisher:

Abstract:

Purpose: To develop an effective method for evaluating the quality of Cortex berberidis from different geographical origins. Methods: A simple, precise and accurate high performance liquid chromatography (HPLC) method was first developed for the simultaneous quantification of four active alkaloids (magnoflorine, jatrorrhizine, palmatine, and berberine) in Cortex berberidis obtained from the Qinghai, Tibet and Sichuan Provinces of China. Method validation was performed in terms of precision, repeatability, stability, accuracy, and linearity. In addition, partial least squares discriminant analysis (PLS-DA) and one-way analysis of variance (ANOVA) were applied to study the quality variations of Cortex berberidis from the various geographical origins. Results: The proposed HPLC method showed good linearity, precision, repeatability, and accuracy. The four alkaloids were detected in all samples of Cortex berberidis. Among them, magnoflorine (36.46 - 87.30 mg/g) consistently showed the highest amounts in all the samples, followed by berberine (16.00 - 37.50 mg/g). The content varied in the range of 0.66 - 4.57 mg/g for palmatine and 1.53 - 16.26 mg/g for jatrorrhizine, respectively. The total content of the four alkaloids ranged from 67.62 to 114.79 mg/g. Moreover, the results obtained by PLS-DA and ANOVA showed that the magnoflorine level and the total content of these four alkaloids in the Qinghai and Tibet samples were significantly higher (p < 0.01) than those in the Sichuan samples. Conclusion: Quantification of multiple ingredients by HPLC combined with statistical methods provides an effective approach for achieving origin discrimination and quality evaluation of Cortex berberidis. The quality of Cortex berberidis correlates closely with the geographical origin of the samples, with samples from Qinghai and Tibet exhibiting superior quality to those from Sichuan.
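The one-way ANOVA comparison across origins can be sketched as follows; the numbers are synthetic, loosely inspired by the reported magnoflorine ranges (mg/g), not the study's measurements:

```python
import numpy as np
from scipy import stats

# One-way ANOVA across three origins on synthetic alkaloid contents.
rng = np.random.default_rng(8)
qinghai = rng.normal(75, 8, 10)   # mg/g, invented
tibet = rng.normal(70, 8, 10)
sichuan = rng.normal(45, 8, 10)

f_stat, p_value = stats.f_oneway(qinghai, tibet, sichuan)
print(f"F = {f_stat:.1f}, p = {p_value:.2g}")
```

A small p-value here only says the group means differ somewhere; pairwise post-hoc comparisons would be needed to attribute the difference to a specific origin.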

Relevance:

90.00%

Publisher:

Abstract:

This study investigates the optimization of the in-plane shear strength of co-cured lap joints of a self-reinforced unidirectional thermoplastic composite, made of recycled low-density polyethylene reinforced with ultra-high-molecular-weight polyethylene fibers, by relating this strength to the hot-pressing process parameters used to form the joint (pressure, temperature, time and length). The chemical structure of the matrix was analyzed to check for potential degradation due to its recycled origin. The matrix and the reinforcement were thermally characterized to define the joint processing temperature window to be studied. The curing conditions of the specimens were defined according to the Response Surface Design of Experiments methodology, and the relationship between the shear strength of the joints and the respective curing parameters was obtained through a regression equation generated by the Ordinary Least Squares method. The tensile behavior of the material was analyzed both micro- and macromechanically. The chemical analysis of the matrix did not reveal the presence of carboxylic groups that would indicate degradation by chain branching and crosslinking resulting from the recycling of the material. The proposed test methodologies proved effective and may serve as a basis for the development of technical standards. It was shown that joints with an optimum shear strength of 6.88 MPa can be obtained when processed at 1 bar, 115 °C, 5 min and 12 mm. Fracture analysis revealed that the shear failure of the joints was preceded by multiple longitudinal cracks induced by successive debondings, both inside and outside the joint, due to the transverse stress accumulated in the joint, which is proportional to its length.
Temperature proved to be the most relevant processing parameter for joint performance, which is little affected by variations in pressure and curing time.
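The response-surface regression step can be sketched by fitting a full quadratic model by ordinary least squares and solving for the stationary point; the factors, coefficients and optimum location below are invented for illustration, not the study's values:

```python
import numpy as np

# Quadratic response surface in two coded process factors (e.g. temperature
# and time), fitted by OLS; the stationary point estimates the optimum.
rng = np.random.default_rng(10)
n = 40
t1, t2 = rng.uniform(-1, 1, n), rng.uniform(-1, 1, n)       # coded factors
strength = (6.9 - 2.0 * (t1 - 0.5) ** 2 - 1.5 * (t2 + 0.3) ** 2
            + 0.1 * rng.standard_normal(n))                  # MPa, invented

# Full quadratic model: 1, t1, t2, t1*t2, t1^2, t2^2
X = np.column_stack([np.ones(n), t1, t2, t1 * t2, t1 ** 2, t2 ** 2])
b, *_ = np.linalg.lstsq(X, strength, rcond=None)

# Stationary point of the fitted surface: solve grad = 0
A = np.array([[2 * b[4], b[3]], [b[3], 2 * b[5]]])
opt = np.linalg.solve(A, -b[1:3])
print("estimated optimum (coded units):", opt)
```

Checking the sign of the quadratic form (here negative definite) confirms the stationary point is a maximum rather than a saddle before decoding it back to physical units.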