908 results for partial least-squares regression


Relevance: 100.00%

Abstract:

The composition and distribution of diatom algae inhabiting estuaries and coasts of the subtropical Americas are poorly documented, especially relative to the central role diatoms play in coastal food webs and to their potential utility as sentinels of environmental change in these threatened ecosystems. Here, we document the distribution of diatoms among the diverse habitat types and long environmental gradients represented by the shallow topographic relief of the South Florida, USA, coastline. A total of 592 species were encountered from 38 freshwater, mangrove, and marine locations in the Everglades wetland and Florida Bay during two seasonal collections, with the highest diversity occurring at sites of high salinity and low water total organic carbon (WTOC) concentration. Freshwater, mangrove, and estuarine assemblages were compositionally distinct, but seasonal differences were detected only in mangrove and estuarine sites, where solute concentrations differed greatly between wet and dry seasons. Epiphytic, planktonic, and sediment assemblages were compositionally similar, implying a high degree of mixing along the shallow, tidal, and storm-prone coast. The relationships between diatom taxa and salinity, water total phosphorus (WTP), water total nitrogen (WTN), and WTOC concentrations were determined and incorporated into weighted averaging partial least squares regression models. Salinity was the most influential variable, resulting in a highly predictive model (apparent r² = 0.97, jackknife r² = 0.95) that can be used in the future to infer changes in coastal freshwater delivery or sea-level rise in South Florida and compositionally similar environments. Models predicting WTN (apparent r² = 0.75, jackknife r² = 0.46), WTP (apparent r² = 0.75, jackknife r² = 0.49), and WTOC (apparent r² = 0.79, jackknife r² = 0.57) were also strong, suggesting that diatoms can provide reliable inferences of changes in solute delivery to the coastal ecosystem.
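
For readers unfamiliar with how such transfer-function figures of merit are produced, the following sketch shows the general shape of the workflow. It substitutes scikit-learn's ordinary `PLSRegression` for the weighted averaging PLS used in the study, and the species-abundance matrix and salinity values are random placeholders; only the calculation of apparent and leave-one-out (jackknife) r² mirrors the abstract.

```python
# Simplified sketch: calibrating a diatom-salinity transfer function with PLS
# regression and reporting apparent and jackknife (leave-one-out) r^2.
# NOTE: ordinary PLS is used as a stand-in for weighted averaging PLS (WA-PLS),
# and the species matrix and salinity values are hypothetical placeholders.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(0)
n_sites, n_taxa = 38, 592
X = rng.random((n_sites, n_taxa))            # relative abundances per site (placeholder)
X = X / X.sum(axis=1, keepdims=True)
salinity = rng.uniform(0, 40, n_sites)       # observed salinity per site (placeholder)

model = PLSRegression(n_components=2)
model.fit(X, salinity)

apparent = model.score(X, salinity)          # apparent r^2 (fit on all sites)
loo_pred = cross_val_predict(model, X, salinity, cv=LeaveOneOut()).ravel()
jackknife = 1 - np.sum((salinity - loo_pred) ** 2) / np.sum((salinity - salinity.mean()) ** 2)
print(f"apparent r2 = {apparent:.2f}, jackknife r2 = {jackknife:.2f}")
```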

Relevance: 100.00%

Abstract:

Ecosystem engineers that increase habitat complexity are keystone species in marine systems, increasing shelter and niche availability, and therefore biodiversity. For example, kelp holdfasts form intricate structures and host the largest number of organisms in kelp ecosystems. However, methods that quantify 3D habitat complexity have seldom been used in marine habitats, and never in kelp holdfast communities. This study investigated the role of kelp holdfasts (Laminaria hyperborea) in supporting benthic faunal biodiversity. Computed tomography (CT) scanning was used to quantify the three-dimensional geometrical complexity of holdfasts, including volume, surface area and surface fractal dimension (FD). Additionally, the number of haptera, number of haptera per unit of volume, and age of kelps were estimated. These measurements were compared to faunal biodiversity and community structure using partial least-squares regression and multivariate ordination. Holdfast volume explained most of the variance observed in biodiversity indices; however, all other complexity measures also contributed strongly to the variance observed. Multivariate ordinations further revealed that surface area and haptera per unit of volume accounted for the patterns observed in faunal community structure. Using 3D image analysis, this study makes a strong contribution to elucidating the quantitative mechanisms underlying the observed relationship between biodiversity and habitat complexity. Furthermore, the potential of CT-scanning as an ecological tool is demonstrated, and a methodology for its use in future similar studies is established. Such spatially resolved image analysis could help identify structurally complex areas as biodiversity hotspots, and may support the prioritization of areas for conservation.
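
As an illustration of the statistical step described above, the sketch below relates CT-derived complexity metrics to a diversity index with partial least-squares regression. The variable names, data values, and the choice of two PLS components are hypothetical; it is not the study's analysis.

```python
# Sketch: relating CT-derived holdfast complexity metrics to a biodiversity
# index with partial least-squares regression. All data are placeholders.
import numpy as np
import pandas as pd
from sklearn.cross_decomposition import PLSRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
n_holdfasts = 60
predictors = pd.DataFrame({
    "volume_cm3": rng.uniform(50, 800, n_holdfasts),
    "surface_area_cm2": rng.uniform(100, 2000, n_holdfasts),
    "fractal_dimension": rng.uniform(2.0, 2.8, n_holdfasts),
    "n_haptera": rng.integers(5, 60, n_holdfasts),
    "haptera_per_cm3": rng.uniform(0.01, 0.5, n_holdfasts),
    "age_years": rng.integers(2, 12, n_holdfasts),
})
shannon_diversity = 0.002 * predictors["volume_cm3"] + rng.normal(0, 0.2, n_holdfasts)

X = StandardScaler().fit_transform(predictors)        # put metrics on a common scale
pls = PLSRegression(n_components=2).fit(X, shannon_diversity)

print("variance in diversity explained (r^2):", round(pls.score(X, shannon_diversity), 2))
for name, coef in zip(predictors.columns, pls.coef_.ravel()):
    print(f"{name:>18}: {coef: .3f}")                 # per-metric PLS coefficient
```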


Relevance: 100.00%

Abstract:

Considering the social and economic importance of milk, the objective of this study was to evaluate the incidence of antimicrobial residues in this food and to quantify them. Samples were collected from dairy plants in southwestern Paraná state, covering all ten municipalities in the Pato Branco region. The work focused on developing suitable models for identifying and quantifying the analytes tetracycline, sulfamethazine, sulfadimethoxine, chloramphenicol and ampicillin, all antimicrobials of health interest. Fourier-transform infrared spectroscopy, combined with a chemometric method based on partial least squares (PLS) regression, was used to calibrate and validate the models. To prepare the antimicrobial working solutions for multiresidue analysis, the five analytes of interest were used in increasing doses: tetracycline from 0 to 0.60 ppm, sulfamethazine from 0 to 0.12 ppm, sulfadimethoxine from 0 to 2.40 ppm, chloramphenicol from 0 to 1.20 ppm, and ampicillin from 0 to 1.80 ppm. The performance of the constructed models was evaluated through the figures of merit: root mean square errors of calibration and cross-validation, correlation coefficients, and the ratio of performance to deviation. For the purposes of this work, the models generated for tetracycline, sulfadimethoxine and chloramphenicol were considered viable, showing the greatest predictive power and efficiency, and were then employed to evaluate the quality of raw milk from the Pato Branco region. Among the samples analyzed by NIR, 70% were in conformity with sanitary legislation; 5% of these samples had concentrations below the maximum residue limit, which is also satisfactory. However, 30% of the sample set showed unsatisfactory results when evaluated for contamination with antimicrobial residues, a non-conformity related to the presence of antimicrobials of unauthorized use or at concentrations above the permitted limits. This work shows that laboratory testing in the food area using infrared spectroscopy with multivariate calibration is effective, fast in analysis, low in cost, and generates minimal laboratory waste. Thus, the proposed alternative method meets the quality and efficiency demands of industrial sectors and society in general.
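
A minimal sketch of the calibration-and-validation step described above, assuming synthetic spectra and tetracycline concentrations rather than the study's FTIR data: it fits a PLS model and reports RMSEC and RMSECV, two of the figures of merit mentioned.

```python
# Sketch: PLS calibration of an antimicrobial concentration from spectra,
# reporting RMSEC (calibration) and RMSECV (cross-validation).
# Spectra and tetracycline concentrations below are synthetic placeholders.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import KFold, cross_val_predict

rng = np.random.default_rng(2)
n_samples, n_wavenumbers = 40, 900
concentration = rng.uniform(0.0, 0.60, n_samples)     # tetracycline, ppm (placeholder)
spectra = rng.normal(0, 0.01, (n_samples, n_wavenumbers)) \
    + np.outer(concentration, rng.random(n_wavenumbers))

pls = PLSRegression(n_components=4).fit(spectra, concentration)

calibrated = pls.predict(spectra).ravel()
cv_pred = cross_val_predict(
    pls, spectra, concentration,
    cv=KFold(n_splits=10, shuffle=True, random_state=0),
).ravel()

rmsec = np.sqrt(np.mean((concentration - calibrated) ** 2))
rmsecv = np.sqrt(np.mean((concentration - cv_pred) ** 2))
print(f"RMSEC = {rmsec:.3f} ppm, RMSECV = {rmsecv:.3f} ppm")
```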

Relevance: 100.00%

Abstract:

Yield loss in crops is often associated with plant disease or with external factors such as environment, water supply and nutrient availability. Improper agricultural practices can also introduce risks into the equation. Herbicide drift can be a combination of improper practices and environmental conditions, which can create a potential yield loss. As traditional assessment of plant damage is often imprecise and time consuming, the ability of remote and proximal sensing techniques to monitor various bio-chemical alterations in the plant may offer a faster, non-destructive and reliable approach to predict yield loss caused by herbicide drift. This paper examines the prediction capabilities of partial least squares regression (PLS-R) models for estimating yield. Models were constructed with hyperspectral data of a cotton crop sprayed with three simulated doses of the phenoxy herbicide 2,4-D at three different growth stages. Fibre quality, photosynthesis, conductance, and two main hormones, indole acetic acid (IAA) and abscisic acid (ABA), were also analysed. Except for fibre quality and ABA, Spearman correlations showed that these variables were strongly affected by the chemical. Four PLS-R models for predicting yield were developed according to four timings of data collection: 2, 7, 14 and 28 days after exposure (DAE). Model performance indicated that 7 DAE was the best time for data collection (RMSEP = 2.6, R² = 0.88), followed by 28 DAE (RMSEP = 3.2, R² = 0.84). In summary, the results of this study show that it is possible to accurately predict yield after a simulated herbicide drift of 2,4-D on a cotton crop through the analysis of hyperspectral data, thereby providing a reliable, effective and non-destructive alternative based on the internal response of the cotton leaves.
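
The prediction-and-validation step can be sketched as follows, using simulated reflectance spectra and yields in place of the study's measurements; only the RMSEP and R² scoring of a held-out set mirrors the abstract.

```python
# Sketch: predicting cotton yield from leaf hyperspectral reflectance with
# PLS regression, scoring a held-out set by RMSEP and R^2. Band count, yield
# units and all values are hypothetical placeholders.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n_plants, n_bands = 120, 2150                         # e.g. 350-2500 nm reflectance
reflectance = rng.random((n_plants, n_bands))
yield_g = 20 + reflectance[:, 500] * 15 + rng.normal(0, 1.5, n_plants)

X_cal, X_val, y_cal, y_val = train_test_split(reflectance, yield_g,
                                              test_size=0.3, random_state=0)
pls = PLSRegression(n_components=6).fit(X_cal, y_cal)

pred = pls.predict(X_val).ravel()
rmsep = np.sqrt(mean_squared_error(y_val, pred))      # root mean square error of prediction
print(f"RMSEP = {rmsep:.2f}, R2 = {r2_score(y_val, pred):.2f}")
```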

Relevance: 100.00%

Abstract:

This paper proposes a novel approach to solve the ordinal regression problem using Gaussian processes. The proposed approach, probabilistic least squares ordinal regression (PLSOR), obtains the probability distribution over ordinal labels using a particular likelihood function. It performs model selection (hyperparameter optimization) using the leave-one-out cross-validation (LOO-CV) technique. PLSOR has the conceptual simplicity and ease of implementation of the least squares approach. Unlike the existing Gaussian process ordinal regression (GPOR) approaches, PLSOR does not use any approximation techniques for inference. We compare the proposed approach with the state-of-the-art GPOR approaches on some synthetic and benchmark data sets. Experimental results show the competitiveness of the proposed approach.
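
The sketch below is only a loose illustration of the LOO-CV model-selection idea and is not the PLSOR method: a Gaussian process regressor is fit to integer-coded ordinal labels, and a kernel length-scale (from a hypothetical candidate grid) is chosen by leave-one-out accuracy after rounding predictions. All data are synthetic.

```python
# Crude illustration of LOO-CV hyperparameter selection with a GP on ordinal
# labels; NOT the PLSOR algorithm from the abstract.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(4)
X = rng.uniform(-3, 3, (80, 2))
latent = X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.3, 80)
y = np.digitize(latent, bins=[-1.5, 0.0, 1.5])        # ordinal labels 0..3

best = None
for length_scale in (0.3, 1.0, 3.0):                  # candidate hyperparameters
    gp = GaussianProcessRegressor(kernel=RBF(length_scale), alpha=0.1)
    pred = cross_val_predict(gp, X, y, cv=LeaveOneOut())
    loo_acc = np.mean(np.clip(np.rint(pred), 0, 3) == y)
    if best is None or loo_acc > best[0]:
        best = (loo_acc, length_scale)
print(f"best LOO accuracy {best[0]:.2f} at length_scale = {best[1]}")
```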

Relevance: 100.00%

Abstract:

This paper provides a root-n consistent, asymptotically normal weighted least squares estimator of the coefficients in a truncated regression model. The distribution of the errors is unknown and permits general forms of unknown heteroskedasticity. Also provided is an instrumental variables based two-stage least squares estimator for this model, which can be used when some regressors are endogenous, mismeasured, or otherwise correlated with the errors. A simulation study indicates that the new estimators perform well in finite samples. Our limiting distribution theory includes a new asymptotic trimming result addressing the boundary bias in first-stage density estimation without knowledge of the support boundary. © 2007 Cambridge University Press.
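
For orientation, a textbook two-stage least squares estimator on simulated data looks as follows; it is the standard 2SLS, not the paper's weighted estimator for truncated regression, and all data-generating values are invented.

```python
# Generic two-stage least squares (2SLS) sketch for a single endogenous
# regressor with one instrument; data are simulated.
import numpy as np

rng = np.random.default_rng(5)
n = 1000
z = rng.normal(size=n)                      # instrument
u = rng.normal(size=n)                      # error correlated with x
x = 0.8 * z + 0.5 * u + rng.normal(size=n)  # endogenous regressor
y = 1.0 + 2.0 * x + u                       # true slope = 2

Z = np.column_stack([np.ones(n), z])        # first stage: regress x on the instruments
x_hat = Z @ np.linalg.lstsq(Z, x, rcond=None)[0]

X_hat = np.column_stack([np.ones(n), x_hat])  # second stage: regress y on fitted x
beta_2sls = np.linalg.lstsq(X_hat, y, rcond=None)[0]
print("2SLS intercept and slope:", np.round(beta_2sls, 2))
```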

Relevance: 100.00%

Abstract:

The note proposes an efficient nonlinear identification algorithm by combining a locally regularized orthogonal least squares (LROLS) model selection with a D-optimality experimental design. The proposed algorithm aims to achieve maximized model robustness and sparsity via two effective and complementary approaches. The LROLS method alone is capable of producing a very parsimonious model with excellent generalization performance. The D-optimality design criterion further enhances the model efficiency and robustness. An added advantage is that the user only needs to specify a weighting for the D-optimality cost in the combined model selection criterion, and the entire model construction procedure becomes automatic. The value of this weighting does not critically influence the model selection procedure, and it can be chosen with ease from a wide range of values.
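
A simplified sketch of the selection loop is given below, assuming synthetic candidate regressors: it performs forward orthogonal least squares selection using the error-reduction ratio plus a weighted D-optimality term, and omits the local regularization, so it should be read as an illustration of the idea rather than the note's exact algorithm.

```python
# Simplified forward orthogonal least squares term selection with a
# D-optimality term; local regularization omitted, data synthetic.
import numpy as np

rng = np.random.default_rng(6)
n, n_candidates, n_select, beta = 200, 20, 5, 1e-3
P = rng.normal(size=(n, n_candidates))                # candidate regressor matrix
y = 2 * P[:, 3] - 1.5 * P[:, 7] + 0.5 * P[:, 11] + rng.normal(0, 0.1, n)

selected, W = [], []                                  # chosen indices, orthogonalized columns
for _ in range(n_select):
    best_idx, best_score = None, -np.inf
    for i in range(n_candidates):
        if i in selected:
            continue
        w = P[:, i].copy()
        for wk in W:                                  # Gram-Schmidt against chosen terms
            w -= (wk @ P[:, i]) / (wk @ wk) * wk
        err = (w @ y) ** 2 / ((w @ w) * (y @ y))      # error-reduction ratio
        score = err + beta * np.log(w @ w)            # ERR plus weighted D-optimality term
        if score > best_score:
            best_idx, best_score, best_w = i, score, w
    selected.append(best_idx)
    W.append(best_w)
print("selected regressors:", selected)
```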

Relevance: 100.00%

Abstract:

Our study revisits and challenges two core conventional meta-regression estimators: the prevalent use of 'mixed-effects' or random-effects meta-regression analysis, and the correction of standard errors that defines fixed-effects meta-regression analysis (FE-MRA). We show how, and explain why, an unrestricted weighted least squares MRA (WLS-MRA) estimator is superior to conventional random-effects (or mixed-effects) meta-regression when there is publication (or small-sample) bias, is as good as FE-MRA in all cases, and is better than fixed effects in most practical applications. Simulations and statistical theory show that WLS-MRA provides satisfactory estimates of meta-regression coefficients that are practically equivalent to mixed effects or random effects when there is no publication bias. When there is publication selection bias, WLS-MRA always has smaller bias than mixed effects or random effects. In practical applications, an unrestricted WLS meta-regression is likely to give practically equivalent or superior estimates to fixed-effects, random-effects, and mixed-effects meta-regression approaches. However, random-effects meta-regression remains viable and perhaps somewhat preferable if selection for statistical significance (publication bias) can be ruled out and when random, additive normal heterogeneity is known to directly affect the 'true' regression coefficient.
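
A minimal sketch of an unrestricted WLS meta-regression, assuming made-up effect sizes, standard errors, and a single moderator: the estimator is an ordinary weighted least squares fit with inverse-variance weights, with the multiplicative variance constant estimated from the data rather than fixed at one.

```python
# Sketch: unrestricted WLS meta-regression of effect sizes on a moderator
# with inverse-variance weights. All numbers are made-up illustrations.
import numpy as np
import statsmodels.api as sm

effect = np.array([0.12, 0.35, 0.20, 0.50, 0.28, 0.41, 0.15, 0.33])   # study effect sizes
se = np.array([0.05, 0.15, 0.08, 0.20, 0.10, 0.18, 0.06, 0.12])       # their standard errors
moderator = np.array([1.0, 3.0, 1.5, 4.0, 2.0, 3.5, 1.2, 2.8])        # e.g. dose or quality score

X = sm.add_constant(moderator)
wls = sm.WLS(effect, X, weights=1.0 / se**2).fit()
print(wls.params)   # meta-regression intercept and moderator coefficient
print(wls.bse)      # standard errors with the multiplicative scale estimated from the data
```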

Relevance: 100.00%

Abstract:

The method of least squares is due to Carl Friedrich Gauss. The Gram-Schmidt orthogonalization method is of much more recent date. A method for solving least squares problems is developed which automatically results in the appearance of the Gram-Schmidt orthogonalizers. Given these orthogonalizers, an inductive proof is available for solving least squares problems.
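
A small sketch of the idea, using an arbitrary example matrix: the least squares problem is solved by building the Gram-Schmidt orthogonalizers (a thin QR factorization) and back-substituting, and the result is checked against numpy's `lstsq`.

```python
# Sketch: least squares via classical Gram-Schmidt orthogonalization (thin QR).
import numpy as np

def gram_schmidt_qr(A):
    """Classical Gram-Schmidt: A = Q @ R with orthonormal columns in Q."""
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for j in range(n):
        v = A[:, j].copy()
        for i in range(j):
            R[i, j] = Q[:, i] @ A[:, j]
            v -= R[i, j] * Q[:, i]
        R[j, j] = np.linalg.norm(v)
        Q[:, j] = v / R[j, j]
    return Q, R

A = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0], [1.0, 4.0]])  # example design matrix
b = np.array([1.1, 1.9, 3.2, 3.9])

Q, R = gram_schmidt_qr(A)
x = np.linalg.solve(R, Q.T @ b)           # minimizes ||A x - b||_2
print(x, np.linalg.lstsq(A, b, rcond=None)[0])
```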

Relevance: 100.00%

Abstract:

A novel partitioned least squares (PLS) algorithm is presented, in which estimates from several simple system models are combined by means of a Bayesian methodology of pooling partial knowledge. The method has the added advantage that, when the simple models are of a similar structure, it lends itself directly to parallel processing procedures, thereby speeding up the entire parameter estimation process by several factors.
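
One way to picture the pooling step, assuming a single shared parameter and simulated data partitions: each simple sub-model produces its own least squares estimate and variance, and the estimates are combined by precision (inverse-variance) weighting. This is an illustration of the pooling idea, not the paper's exact algorithm; note that the sub-model fits are independent and could run in parallel, in line with the parallel-processing advantage mentioned above.

```python
# Sketch: combining least squares estimates from several simple sub-models by
# precision (inverse-variance) weighting. Data are simulated; this is an
# illustration of the pooling idea, not the paper's algorithm.
import numpy as np

rng = np.random.default_rng(7)
true_theta = 2.5

estimates, precisions = [], []
for _ in range(4):                              # four simple sub-models, each on its own data slice
    x = rng.normal(size=50)
    y = true_theta * x + rng.normal(0, 1.0, 50)
    theta_hat = (x @ y) / (x @ x)               # least squares estimate from this partition
    var_hat = np.sum((y - theta_hat * x) ** 2) / (len(x) - 1) / (x @ x)
    estimates.append(theta_hat)
    precisions.append(1.0 / var_hat)

pooled = np.dot(precisions, estimates) / np.sum(precisions)
print("sub-model estimates:", np.round(estimates, 3))
print("pooled estimate:", round(pooled, 3))
```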