919 results for moving least squares approximation
Abstract:
2000 Mathematics Subject Classification: 65C05
Abstract:
The evolution of reproductive strategies involves a complex calculus of costs and benefits to both parents and offspring. Many marine animals produce embryos packaged in tough egg capsules or gelatinous egg masses attached to benthic surfaces. While these egg structures can protect against environmental stresses, the packaging is energetically costly for parents to produce. In this series of studies, I examined a variety of ecological factors affecting the evolution of benthic development as a life history strategy. I used marine gastropods as my model system because they are incredibly diverse and abundant worldwide, and they exhibit a variety of reproductive and developmental strategies.
The first study examines predation on benthic egg masses. I investigated: 1) behavioral mechanisms of predation when embryos are targeted (rather than the whole egg mass); 2) the specific role of gelatinous matrix in predation. I hypothesized that gelatinous matrix does not facilitate predation. One study system was the sea slug Olea hansineensis, an obligate egg mass predator, feeding on the sea slug Haminoea vesicula. Olea fed intensely and efficiently on individual Haminoea embryos inside egg masses but showed no response to live embryos removed from gel, suggesting that gelatinous matrix enables predation. This may be due to mechanical support of the feeding predator by the matrix. However, Haminoea egg masses outnumber Olea by two orders of magnitude in the field, and each egg mass can contain many tens of thousands of embryos, so predation pressure on individuals is likely not strong. The second system involved the snail Nassarius vibex, a non-obligate egg mass predator, feeding on the polychaete worm Clymenella mucosa. Gel neither inhibits nor promotes embryo predation for Nassarius, but because it cannot target individual embryos inside an egg mass, its feeding is slow and inefficient, and feeding rates in the field are quite low. However, snails that compete with Nassarius for scavenged food have not been seen to eat egg masses in the field, leaving Nassarius free to exploit the resource. Overall, egg mass predation in these two systems likely benefits the predators much more than it negatively affects the prey. Thus, selection for environmentally protective aspects of egg mass production may be much stronger than selection for defense against predation.
In the second study, I examined desiccation resistance in intertidal egg masses made by Haminoea vesicula, which preferentially attaches its flat, ribbon-shaped egg masses to submerged substrata. Egg masses occasionally detach and become stranded on exposed sand at low tide. Unlike adults, the encased embryos cannot avoid desiccation by selectively moving about the habitat, and the egg mass shape has a high surface-area-to-volume ratio that should make it prone to drying out. Thus, I hypothesized that the embryos would not survive stranding. I tested this by deploying individual egg masses of two age classes on exposed sand bars for the duration of low tide. After rehydration, embryos midway through development showed higher rates of survival than newly laid embryos, though for both stages survival rates over 25% were frequently observed. Laboratory desiccation trials showed that >75% survival is possible in an egg mass that has lost 65% of its water weight, and some survival (<25%) was observed even after 83% of water weight was lost. Although many surviving embryos in both experiments showed damage, these data demonstrate that egg mass stranding is not necessarily fatal to embryos. They may be able to survive a far greater range of conditions than they normally encounter, compensating for their lack of ability to move. Also, desiccation tolerance of embryos may reduce pressure on parents to find optimal laying substrata.
The third study takes a big-picture approach to investigating the evolution of different developmental strategies in cone snails (Conus), the largest genus of marine invertebrates. Cone snail species hatch out of their capsules as either swimming larvae or non-dispersing forms, and their developmental mode has direct consequences for biogeographic patterns. Variability in life history strategies among taxa may be influenced by biological, environmental, or phylogenetic factors, or a combination of these. While most prior research has examined these factors singly, my aim was to investigate the effects of a host of intrinsic, extrinsic, and historical factors on two fundamental aspects of life history: egg size and egg number. I used phylogenetic generalized least-squares regression models to examine relationships between these two egg traits and a variety of hypothesized intrinsic and extrinsic variables. Adult shell morphology and spatial variability in productivity and salinity across a species' geographic range had the strongest effects on egg diameter and number of eggs per capsule. Phylogeny had no significant influence. Developmental mode in Conus appears to be influenced mostly by species-level adaptations and niche specificity rather than phylogenetic conservatism. Patterns of egg size and egg number appear to reflect energetic tradeoffs with body size and specific morphologies as well as adaptations to variable environments. Overall, this series of studies highlights the importance of organism-scale biotic and abiotic interactions in evolutionary patterns.
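For readers unfamiliar with phylogenetic generalized least squares, the core computation is an ordinary GLS fit in which the error covariance is derived from the phylogeny. The sketch below is purely illustrative: the covariance matrix, predictors, and data are hypothetical placeholders, not the variables or tree used in the study.

```python
# Illustrative sketch only: generalized least squares with a phylogenetic
# covariance matrix, the core of a PGLS regression (hypothetical data).
import numpy as np

rng = np.random.default_rng(0)
n = 30                                   # number of species (made up)
X = np.column_stack([np.ones(n), rng.normal(size=n)])   # intercept + one trait predictor
# Stand-in "phylogenetic" covariance: correlation decays with distance between taxa
V = np.exp(-0.5 * np.abs(np.subtract.outer(np.arange(n), np.arange(n))))
y = X @ np.array([1.0, 0.4]) + rng.multivariate_normal(np.zeros(n), 0.1 * V)

# GLS estimator: beta_hat = (X' V^-1 X)^-1 X' V^-1 y
Vinv = np.linalg.inv(V)
beta_hat = np.linalg.solve(X.T @ Vinv @ X, X.T @ Vinv @ y)
print("PGLS coefficients:", beta_hat)
```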
Abstract:
In geotechnical engineering, the stability of rock excavations and walls is estimated by using tools that include a map of the orientations of exposed rock faces. However, measuring these orientations by using conventional methods can be time consuming, sometimes dangerous, and is limited to regions of the exposed rock that are reachable by a human. This thesis introduces a 2D, simulated, quadcopter-based rock wall mapping algorithm for GPS-denied environments such as underground mines or near high walls on surface. The proposed algorithm employs techniques from the field of robotics known as simultaneous localization and mapping (SLAM) and is a step towards 3D rock wall mapping. Not only are quadcopters agile, but they can hover, which is very useful in confined spaces such as underground or near rock walls. The quadcopter requires sensors to enable self-localization and mapping in dark, confined, GPS-denied environments. However, these sensors are limited by the quadcopter's payload and power restrictions. Because of these restrictions, a lightweight 2D laser scanner is proposed. As a first step towards a 3D mapping algorithm, this thesis proposes a simplified scenario in which a simulated 1D laser range finder and a 2D IMU are mounted on a quadcopter that is moving on a plane. Because the 1D laser does not provide enough information to map the 2D world from a single measurement, many measurements are combined over the trajectory of the quadcopter. Least Squares Optimization (LSO) is used to optimize the estimated trajectory and rock face for all data collected over the length of a flight. Simulation results show that the mapping algorithm developed is a good first step: by combining measurements over a trajectory, the scanned rock face can be estimated using a lower-dimensional range sensor. A swathing manoeuvre is introduced as a way to promote loop closures within a short time period, thus reducing accumulated error. Some suggestions on how to improve the algorithm are also provided.
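The idea of fusing many low-dimensional measurements over a trajectory in one least-squares problem can be sketched in a few lines. The toy problem below (not the thesis code) jointly estimates a quadcopter's 1D positions and the location of a flat wall from noisy odometry increments and 1D laser ranges, which is the same batch least-squares structure the abstract describes.

```python
# Minimal sketch: batch least-squares estimation of a 1D trajectory and a
# wall position from simulated odometry and 1D range measurements.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(1)
K = 50
true_pos = np.cumsum(rng.normal(0.2, 0.05, K))          # true x-positions over the flight
wall_x = 10.0                                           # true wall location
odom = np.diff(true_pos) + rng.normal(0, 0.02, K - 1)   # noisy IMU/odometry increments
ranges = (wall_x - true_pos) + rng.normal(0, 0.05, K)   # noisy 1D laser ranges

def residuals(params):
    pos, wall = params[:K], params[K]
    r_odom = (pos[1:] - pos[:-1]) - odom                # motion-model residuals
    r_range = (wall - pos) - ranges                     # measurement residuals
    r_anchor = [pos[0] - true_pos[0]]                   # anchor first pose (assumed known)
    return np.concatenate([r_odom, r_range, r_anchor])

dead_reckoned = np.cumsum(np.r_[true_pos[0], odom])     # initial guess from odometry alone
x0 = np.concatenate([dead_reckoned, [np.mean(dead_reckoned + ranges)]])
sol = least_squares(residuals, x0)
print("estimated wall position:", sol.x[K])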
Abstract:
This thesis develops bootstrap methods for factor models, which have been widely used to generate forecasts since the pioneering article of Stock and Watson (2002) on diffusion indices. These models accommodate a large number of macroeconomic and financial variables as predictors, a useful feature for incorporating the diverse information available to economic agents. My thesis therefore proposes econometric tools that improve inference in factor models that use latent factors extracted from a large panel of observed predictors. It is divided into three complementary chapters, the first two in collaboration with Sílvia Gonçalves and Benoit Perron. In the first article, we study how bootstrap methods can be used for inference in models that forecast h periods into the future. To do so, it examines bootstrap inference in a factor-augmented regression framework in which the errors may be autocorrelated. It generalizes the results of Gonçalves and Perron (2014) and proposes and justifies two residual-based approaches: the block wild bootstrap and the dependent wild bootstrap. Our simulations show improved coverage rates for confidence intervals of the estimated coefficients using these approaches, relative to asymptotic theory and to the wild bootstrap, in the presence of serial correlation in the regression errors. The second chapter proposes bootstrap methods for constructing forecast intervals that relax the assumption of normally distributed innovations. We propose bootstrap prediction intervals for an observation h periods into the future and for its conditional mean. We assume these forecasts are made using a set of factors extracted from a large panel of variables. Because we treat these factors as latent, our forecasts depend on both the estimated factors and the estimated regression coefficients. Under regularity conditions, Bai and Ng (2006) proposed constructing asymptotic intervals under the assumption of Gaussian innovations. The bootstrap allows us to relax this assumption and to construct valid prediction intervals under more general assumptions. Moreover, even under Gaussianity, the bootstrap yields more accurate intervals when the cross-sectional dimension is relatively small, because it accounts for the bias of the ordinary least squares estimator, as shown in a recent study by Gonçalves and Perron (2014). In the third chapter, we propose consistent selection procedures for factor-augmented regressions in finite samples. We first show that the usual cross-validation method is inconsistent, but that its generalization, leave-d-out cross-validation, selects the smallest set of estimated factors spanning the space generated by the true factors. The second criterion, whose validity we also establish, generalizes the bootstrap approximation of Shao (1996) to factor-augmented regressions. Simulations show an improvement in the probability of parsimoniously selecting the estimated factors compared with available selection methods.
The empirical application revisits the relationship between macroeconomic and financial factors and excess returns on the US stock market. Among the factors estimated from a large panel of US macroeconomic and financial data, those strongly correlated with interest rate spreads and the Fama-French factors have good predictive power for excess returns.
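The residual-based wild bootstrap at the heart of these chapters can be illustrated on a plain regression. The sketch below uses Rademacher multipliers on the OLS residuals; the block and dependent wild bootstrap variants studied in the thesis additionally preserve serial dependence, which this toy version does not attempt. All data are simulated.

```python
# Toy residual-based wild bootstrap for an OLS slope (Rademacher weights).
import numpy as np

rng = np.random.default_rng(2)
n = 200
x = rng.normal(size=n)
y = 0.5 * x + rng.normal(size=n) * (1 + 0.5 * np.abs(x))   # heteroskedastic errors

X = np.column_stack([np.ones(n), x])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ beta

B, slopes = 999, []
for _ in range(B):
    w = rng.choice([-1.0, 1.0], size=n)          # Rademacher multipliers
    y_star = X @ beta + resid * w                # wild-bootstrap sample
    slopes.append(np.linalg.lstsq(X, y_star, rcond=None)[0][1])

lo, hi = np.percentile(slopes, [2.5, 97.5])
print(f"95% bootstrap interval for the slope: [{lo:.3f}, {hi:.3f}]")
```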
Abstract:
In this work, the relationship between diameter at breast height (d) and total height (h) of individual trees was modeled with the aim of establishing provisional height-diameter (h-d) equations for maritime pine (Pinus pinaster Ait.) stands in the Lomba ZIF, Northeast Portugal. Using data collected locally, several local and generalized h-d equations from the literature were tested, and adaptations were also considered. Model fitting was conducted using standard nonlinear least squares (nls) methods. The best local and generalized models selected were also tested as mixed models, applying a first-order conditional expectation (FOCE) approximation procedure and maximum likelihood methods to estimate fixed and random effects. For the calibration of the mixed models, and in order to be consistent with the fitting procedure, the FOCE method was also used to test different sampling designs. The results showed that the local h-d equations with two parameters performed better than the analogous models with three parameters. However, a single set of parameter values for the local model cannot be applied to all maritime pine stands in the Lomba ZIF, and thus a generalized model including stand-level covariates, in addition to d, was necessary to obtain adequate predictive performance. No evident superiority of the generalized mixed model over the generalized model with nonlinear least squares parameter estimates was observed. On the other hand, in the case of the local model, the predictive performance greatly improved when random effects were included. The results showed that the mixed model based on the selected local h-d equation is a viable alternative for estimating h if stand variables are not available. Moreover, it was observed that an adequate calibrated response can be obtained using only 2 to 5 additional h-d measurements in quantile (or random) trees from the distribution of d in the plot (stand). Balancing sampling effort, accuracy, and straightforwardness in practical applications, the generalized model from the nls fit is recommended. Examples of applications of the selected generalized equation to forest management are presented, namely how to use it to complete missing information from forest inventories and how such an equation can be incorporated into a stand-level decision support system that aims to optimize forest management for the maximization of wood volume production in Lomba ZIF maritime pine stands.
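The nls fitting step described here is routine to reproduce. The example below fits a generic two-parameter height-diameter curve, h = 1.3 + a(1 - e^(-bd)), by nonlinear least squares; both the model form and the data are placeholders for illustration, not the equations or measurements selected in the thesis.

```python
# Hedged example: nonlinear least squares fit of a two-parameter h-d curve.
import numpy as np
from scipy.optimize import curve_fit

def hd_model(d, a, b):
    # 1.3 m is breast height, a common additive constant in h-d equations
    return 1.3 + a * (1.0 - np.exp(-b * d))

# Hypothetical diameter (cm) / height (m) pairs
d = np.array([8, 12, 15, 18, 22, 25, 30, 35], dtype=float)
h = np.array([6.1, 8.4, 9.8, 11.0, 12.5, 13.2, 14.5, 15.3])

params, cov = curve_fit(hd_model, d, h, p0=(20.0, 0.05))
print("a, b =", params)
print("predicted h at d = 20 cm:", hd_model(20.0, *params))
```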
Abstract:
Understanding the fluctuations in population abundance is a central question in fisheries. The sardine fishery is of great importance to Portugal; it is data-rich and of primary concern to fisheries managers. In Portugal, sub-stocks of Sardina pilchardus (sardine) are found in different regions: the Northwest (IXaCN), Southwest (IXaCS) and the South coast (IXaS-Algarve). Each of these sardine sub-stocks is affected differently by a unique set of climate and ocean conditions, mainly during larval development and recruitment, which consequently affect sardine fisheries in the short term. Taking this hypothesis into consideration, we examined the effects of hydrographic (river discharge), sea surface temperature, wind-driven phenomena, upwelling, climatic (North Atlantic Oscillation) and fisheries variables (fishing effort) on S. pilchardus catch rates (landings per unit effort, LPUE, as a proxy for sardine biomass). A 20-year time series (1989-2009) was used for the different subdivisions of the Portuguese coast (sardine sub-stocks). For this analysis a multi-model approach was used, applying different time series models for data fitting (Dynamic Factor Analysis, Generalised Least Squares), forecasting (Autoregressive Integrated Moving Average), as well as Surplus Production stock assessment models. The different models were evaluated and compared, and the most important variables explaining changes in LPUE were identified. The type of relationship between sardine catch rates and environmental variables varied across regional scales due to region-specific recruitment responses. Seasonality plays an important role in sardine variability within the three study regions. In IXaCN, autumn (the season with minimum spawning activity and minimum larva and egg concentrations) SST, northerly wind and wind magnitude were negatively related to LPUE. In IXaCS, none of the explanatory variables tested was clearly related to LPUE. In IXaS-Algarve (South Portugal), both spring (the period when large abundances of larvae are found) northerly wind and wind magnitude were negatively related to LPUE, revealing that environmental effects coincide with the regional peak in spawning time. Overall, the results suggest that management of small, short-lived pelagic species, such as sardine quotas/sustainable yields, should be adapted to a regional scale because of regional environmental variability.
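The ARIMA forecasting step in such a multi-model workflow looks roughly like the sketch below, which fits a seasonal ARIMA to a synthetic monthly LPUE-like series and produces a one-year forecast. The series, model order, and seasonal structure are assumptions for illustration, not the models fitted in the study.

```python
# Illustrative seasonal ARIMA fit and 12-month forecast on simulated LPUE data.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(3)
idx = pd.date_range("1989-01-01", periods=240, freq="MS")   # 20 years of months
lpue = pd.Series(50 + 10 * np.sin(2 * np.pi * np.arange(240) / 12)
                 + rng.normal(0, 3, 240), index=idx)

model = ARIMA(lpue, order=(1, 0, 1),
              seasonal_order=(1, 0, 0, 12))   # assumed orders, chosen for the example
fit = model.fit()
print(fit.forecast(steps=12))                 # 12-month-ahead forecast
```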
Abstract:
A split-plot design (SPD) and near-infrared chemical imaging were used to study the homogeneity of the drug paracetamol loaded in films prepared from mixtures of the biocompatible polymers hydroxypropyl methylcellulose, polyvinylpyrrolidone, and polyethylene glycol. The study comprised two parts: first, a partial least-squares (PLS) model was developed for pixel-to-pixel quantification of the drug loaded into the films. Afterwards, an SPD was developed to study the influence of the polymeric composition of the films and of two process conditions related to their preparation (percentage of drug in the formulation and curing temperature) on the homogeneity of the drug dispersed in the polymeric matrix. Chemical images of each formulation of the SPD were obtained by pixel-to-pixel prediction of the drug using the PLS model from the first part, and macropixel analyses were performed on each image to obtain the y-responses (homogeneity parameter). The design was modeled using PLS regression, allowing only the most relevant factors to remain in the final model. The interpretation of the SPD was enhanced by using the orthogonal PLS algorithm, in which the y-orthogonal variation in the design is separated from the y-correlated variation.
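The pixel-to-pixel quantification idea is easy to sketch: calibrate a PLS model on reference spectra, then apply it to every pixel spectrum of a hyperspectral image to build a drug-content map. The shapes, spectra, and reference values below are invented for illustration.

```python
# Rough sketch of pixel-wise PLS quantification on a hyperspectral image.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(4)
n_cal, n_wl = 40, 120
X_cal = rng.normal(size=(n_cal, n_wl))            # calibration NIR spectra (simulated)
y_cal = rng.uniform(5, 30, n_cal)                 # known drug content (%), simulated

pls = PLSRegression(n_components=5)
pls.fit(X_cal, y_cal)

# A 64 x 64 pixel image, one spectrum per pixel
image = rng.normal(size=(64, 64, n_wl))
drug_map = pls.predict(image.reshape(-1, n_wl)).reshape(64, 64)
print("per-pixel mean / std of predicted content:", drug_map.mean(), drug_map.std())
```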
Abstract:
Dulce de leche samples available on the Brazilian market were submitted to sensory profiling by quantitative descriptive analysis and an acceptance test, as well as sensory evaluation using the just-about-right scale and purchase intent. External preference mapping and the ideal sensory characteristics of dulce de leche were determined. The results were also evaluated by principal component analysis, hierarchical cluster analysis, partial least squares regression, artificial neural networks, and logistic regression. Overall, significant product acceptance was related to intermediate scores of the sensory attributes in the descriptive test, and this trend was observed even after consumer segmentation. The results obtained by sensometric techniques showed that optimizing an ideal dulce de leche from the sensory standpoint is a multidimensional process, with adjustments needed to the appearance, aroma, taste, and texture attributes of the product for better consumer acceptance and purchase intent. The optimum dulce de leche was characterized by high scores for the attributes sweet taste, caramel taste, brightness, color, and caramel aroma, in accordance with the preference mapping findings. In industrial terms, this means changing the parameters used in the thermal treatment and making quantitative changes to the ingredients used in the formulations.
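Two of the sensometric steps named here, summarizing descriptive attributes with PCA and segmenting consumers by hierarchical clustering, can be sketched as follows. All numbers are simulated placeholders, not the study's sensory data.

```python
# Toy illustration: PCA on descriptive attributes, hierarchical clustering of consumers.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import AgglomerativeClustering

rng = np.random.default_rng(5)
n_samples, n_attributes, n_consumers = 10, 8, 100
sensory = rng.normal(size=(n_samples, n_attributes))              # QDA attribute means
acceptance = rng.integers(1, 10, size=(n_consumers, n_samples))   # 9-point hedonic scores

pca = PCA(n_components=2).fit(sensory)
print("variance explained by PC1/PC2:", pca.explained_variance_ratio_)

clusters = AgglomerativeClustering(n_clusters=3).fit_predict(acceptance)
print("consumers per segment:", np.bincount(clusters))
```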
Abstract:
The aim of this study was to develop a methodology using Raman hyperspectral imaging and chemometric methods for the identification of pre- and post-blast explosive residues on banknote surfaces. The explosives studied were of military, commercial, and propellant types. After acquisition of the hyperspectral images, independent component analysis (ICA) was applied to extract the pure spectra and the distributions of the corresponding image constituents. The performance of the methodology was evaluated by the explained variance and the lack of fit of the models, by comparing the ICA-recovered spectra with reference spectra using correlation coefficients, and by the presence of rotational ambiguity in the ICA solutions. The methodology was applied to forensic samples to solve an automated teller machine explosion case. Independent component analysis proved to be a suitable curve resolution method, achieving performance equivalent to that of multivariate curve resolution with alternating least squares (MCR-ALS). At low concentrations, MCR-ALS showed some limitations, as it did not provide the correct solution. The detection limit of the methodology presented in this study was 50 μg cm⁻².
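The ICA unmixing step can be illustrated with simulated data: a matrix of pixel spectra is decomposed into component "spectra" and per-pixel contributions. The source spectra, abundances, and noise level below are invented; this is not the Raman data from the study.

```python
# Minimal ICA unmixing sketch on synthetic hyperspectral-style data.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(6)
n_pixels, n_shifts, n_sources = 900, 300, 3
pure = np.abs(rng.normal(size=(n_sources, n_shifts)))    # stand-in pure spectra
conc = np.abs(rng.normal(size=(n_pixels, n_sources)))    # per-pixel abundances
data = conc @ pure + rng.normal(0, 0.01, (n_pixels, n_shifts))

ica = FastICA(n_components=n_sources, max_iter=1000, random_state=0)
scores = ica.fit_transform(data)        # per-pixel contribution of each component
recovered = ica.mixing_.T               # recovered component "spectra" (up to scale/sign)
print(scores.shape, recovered.shape)
```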
Abstract:
X-ray fluorescence (XRF) is a fast, low-cost, nondestructive, and truly multielement analytical technique. The objectives of this study were to quantify the amounts of Na⁺ and K⁺ in samples of table salt (refined, marine, and light) and to compare three different quantification methodologies using XRF. A fundamental-parameters method revealed difficulties in accurately quantifying lighter elements (Z < 22). A univariate methodology based on peak-area calibration is an attractive alternative, even though the additional data-manipulation steps may take some time. Quantifications were performed with good correlations for both Na (r = 0.974) and K (r = 0.992). A partial least-squares (PLS) regression method with five latent variables was very fast. Na⁺ quantifications provided calibration errors lower than 16% and a correlation of 0.995. Of great concern was the observation of high Na⁺ levels in low-sodium salts. The presented application can be performed in a fast, multielement fashion, in accordance with Green Chemistry specifications.
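The univariate peak-area approach amounts to a straight-line calibration of integrated peak area against known concentration. The sketch below shows that calibration and the back-prediction of an unknown; the peak areas and concentrations are invented, not the paper's measurements.

```python
# Hedged sketch of a univariate peak-area calibration (least-squares line).
import numpy as np

conc = np.array([1.0, 2.0, 4.0, 8.0, 16.0])         # Na standards (hypothetical units)
area = np.array([120., 250., 480., 990., 1950.])    # integrated Na peak areas (invented)

slope, intercept = np.polyfit(conc, area, 1)         # least-squares calibration line
r = np.corrcoef(conc, area)[0, 1]
print(f"calibration: area = {slope:.1f}*c + {intercept:.1f}, r = {r:.3f}")

unknown_area = 700.0
print("predicted Na content:", (unknown_area - intercept) / slope)
```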
Abstract:
In this work, artificial neural networks (ANN) and partial least squares (PLS) regression were applied to UV spectral data for the quantitative determination of thiamin hydrochloride (VB1), riboflavin phosphate (VB2), pyridoxine hydrochloride (VB6) and nicotinamide (VPP) in pharmaceutical samples. For calibration purposes, commercial samples in 0.2 mol L⁻¹ acetate buffer (pH 4.0) were employed as standards. The concentration ranges used in the calibration step were: 0.1 - 7.5 mg L⁻¹ for VB1, 0.1 - 3.0 mg L⁻¹ for VB2, 0.1 - 3.0 mg L⁻¹ for VB6 and 0.4 - 30.0 mg L⁻¹ for VPP. The results show that both methods can be successfully applied to these determinations, with similar error values obtained using the neural network and PLS methods. The proposed methodology is simple, rapid, and can easily be used in quality control laboratories.
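The ANN branch of such a multicomponent calibration can be sketched as a small multi-output regression from spectra to the four concentrations. The spectra, mixing model, and network size below are assumptions for illustration only.

```python
# Sketch of an ANN multicomponent calibration on simulated UV spectra.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(7)
n_std, n_wl = 60, 150
C = rng.uniform(0.1, 30.0, size=(n_std, 4))          # VB1, VB2, VB6, VPP concentrations
S = np.abs(rng.normal(size=(4, n_wl)))                # stand-in pure-component spectra
A = C @ S + rng.normal(0, 0.01, (n_std, n_wl))        # Beer-Lambert-style mixture spectra

ann = MLPRegressor(hidden_layer_sizes=(20,), max_iter=5000, random_state=0)
ann.fit(A, C)                                         # multi-output regression
print("predicted concentrations of first standard:", ann.predict(A[:1]))
```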
Abstract:
Universidade Estadual de Campinas. Faculdade de Educação Física
Abstract:
The aim of this study was to test the hypothesis of differences in performance, including differences in ST-T wave changes, between healthy men and women submitted to an exercise stress test. Two hundred (45.4%) men and 241 (54.6%) women (mean age: 38.7 ± 11.0 years) were submitted to an exercise stress test. Physiologic and electrocardiographic variables were compared by the Student t-test and the chi-square test. To test the hypothesis of differences in ST-segment changes, data were ranked with functional models based on weighted least squares. To evaluate the influence of gender and age on the diagnosis of ST-segment abnormality, a logistic model was fitted; P < 0.05 was considered significant. Rate-pressure product, duration of exercise and estimated functional capacity were higher in men (P < 0.05). Sixteen (6.7%) women and 9 (4.5%) men demonstrated ST-segment upslope ≥0.15 mV or downslope ≥0.10 mV; the difference was not statistically significant. Each one-year increase in age added 4% to the chance of ST-segment upslope ≥0.15 mV or downslope ≥0.1 mV (P = 0.03; risk ratio = 1.040, 95% confidence interval (CI) = 1.002-1.080). Heart rate recovery was higher in women (P < 0.05). The chance of women showing an increase in systolic blood pressure ≤30 mmHg was 85% higher (P = 0.01; risk ratio = 1.85, 95%CI = 1.1-3.05). No significant difference in the frequency of ST-T wave changes was observed between men and women. The other differences may be related to differences in physical conditioning.
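The "4% per year of age" figure comes from exponentiating a logistic-regression coefficient (exp(0.04) ≈ 1.040). The sketch below reproduces that kind of calculation on simulated data; the variables, effect sizes, and sample are assumptions, not the study's dataset.

```python
# Illustrative logistic model for ST-segment abnormality vs. age and sex.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(8)
n = 441
age = rng.normal(39, 11, n)
female = rng.integers(0, 2, n)
logit_p = -5.0 + 0.04 * age + 0.3 * female          # assumed true effects
y = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

X = sm.add_constant(np.column_stack([age, female]))  # intercept, age, sex
fit = sm.Logit(y, X).fit(disp=0)
print("odds ratio per year of age:", np.exp(fit.params[1]))
print("95% CI:", np.exp(fit.conf_int()[1]))
```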