918 results for Heuristic constrained linear least squares


Relevance: 100.00%

Abstract:

The torsional stiffness of the chassis is one of the most important properties of a vehicle's structure, and therefore its measurement is important. For the first time, torsional stiffness was considered in the design of a Baja SAE prototype of the team from UNESP - FEG, Equipe Piratas do Vale Bardahl. In the team's opinion, the increased stiffness of this prototype, called MB1114, made possible a great improvement in its performance during competitions. In this work, the experimental evaluation of the torsional stiffness of this prototype is performed, detailing the analysis of the results as well as the hysteresis effect, least-squares regression and uncertainty analysis. It also shows that it is possible to measure the torsional stiffness of a chassis with low experimental uncertainty without expending many resources. The test rig cost R$ 32.50 thanks to the reuse of materials and the use of instrumentation already available on campus. Furthermore, it is simple to build and can be easily stored. These features are important for Baja and Formula SAE teams. Lastly, the measured value is used to validate the finite element analysis performed by the team during this prototype's design, because similar studies will be performed for the new cars. After investigating the finite element analysis, a result 13.5% higher than the measured value was reached. This difference is believed to occur due to imperfections of the finite element model, that is, because it is not possible to simulate every phenomenon present in the real structure.
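The least-squares regression step described above can be sketched as follows. The data and the `ls_slope` helper are purely illustrative (not the team's measurements): the torsional stiffness is the fitted slope of applied torque versus twist angle.

```python
# Minimal sketch (hypothetical data): chassis torsional stiffness as the
# least-squares slope of applied torque vs. twist angle.

def ls_slope(x, y):
    """Ordinary least-squares slope and intercept for y = a*x + b."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return a, my - a * mx

# Hypothetical loading data: twist angle (deg) and applied torque (N*m).
angles = [0.0, 0.5, 1.0, 1.5, 2.0]
torques = [0.0, 400.0, 800.0, 1200.0, 1600.0]
k, b = ls_slope(angles, torques)  # k is the stiffness estimate in N*m/deg
```

In practice one fit would be made per loading direction so the hysteresis between loading and unloading curves can be quantified.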


Relevance: 100.00%

Abstract:

A fast method was optimized and validated to quantify amphetamine-type stimulants (amphetamine, AMP; methamphetamine, MAMP; fenproporex, FPX; 3,4-methylenedioxymethamphetamine, MDMA; and 3,4-methylenedioxyamphetamine, MDA) in human hair samples. The method was based on an initial decontamination of the hair samples (50 mg) with dichloromethane, followed by alkaline hydrolysis and extraction of the amphetamines using hollow-fiber liquid-phase microextraction (HF-LPME) in the three-phase mode. Gas chromatography-mass spectrometry (GC-MS) was used for identification and quantification of the analytes. The LOQs obtained for all amphetamines (around 0.05 ng/mg) were below the cut-off value (0.2 ng/mg) established by the Society of Hair Testing (SoHT). The method proved to be simple and precise. The intra-day and inter-day precisions were within 10.6% and 11.4%, respectively, with the use of only two deuterated internal standards (AMP-d5 and MDMA-d5). By using weighted least squares linear regression (1/x^2), the accuracy of the method was satisfactory at the lower concentration levels (accuracy values better than 87%). Hair samples collected from six volunteers who reported regular use of amphetamines were submitted to the developed method, and drugs were detected in all samples. (c) 2012 Elsevier B.V. All rights reserved.
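The 1/x^2 weighting mentioned above down-weights high-concentration calibrators so the fit stays accurate near the LOQ. A minimal sketch, with a hypothetical helper and hypothetical concentration/response data (not the study's calibration curve):

```python
# Weighted least-squares line with 1/x^2 weights, solved from the
# weighted normal equations. All numbers are hypothetical.

def wls_line(x, y, w):
    sw = sum(w)
    swx = sum(wi * xi for wi, xi in zip(w, x))
    swy = sum(wi * yi for wi, yi in zip(w, y))
    swxx = sum(wi * xi * xi for wi, xi in zip(w, x))
    swxy = sum(wi * xi * yi for wi, xi, yi in zip(w, x, y))
    a = (sw * swxy - swx * swy) / (sw * swxx - swx ** 2)
    return a, (swy - a * swx) / sw

conc = [0.05, 0.1, 0.5, 1.0, 2.0]         # concentration (ng/mg)
resp = [0.9 + 10.0 * c for c in conc]     # detector response (linear here)
weights = [1.0 / c ** 2 for c in conc]    # 1/x^2 weighting
slope, intercept = wls_line(conc, resp, weights)
```

With heteroscedastic real data, the weighted fit trades a slightly worse fit at high concentrations for much smaller relative errors at the low end.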

Relevance: 100.00%

Abstract:

Background: Ayahuasca is a psychoactive plant beverage originally used by indigenous people throughout the Amazon Basin, long before its modern use by syncretic religious groups established in Brazil, the USA and European countries. The objective of this study was to develop a method for the quantification of dimethyltryptamine and beta-carbolines in human plasma samples. Results: The analytes were extracted by means of C18 cartridges and injected into an LC-MS/MS system operated in positive ion mode with multiple reaction monitoring. The LOQs obtained for all analytes were below 0.5 ng/ml. By using weighted least squares linear regression, the accuracy of the analytical method was improved at the lower end of the calibration curve (from 0.5 to 100 ng/ml; r^2 > 0.98). Conclusion: The method proved to be simple, rapid and useful to estimate administered doses for further pharmacological and toxicological investigations of ayahuasca exposure.

Relevance: 100.00%

Abstract:

This paper presents a performance analysis of a baseband multiple-input single-output ultra-wideband system over scenarios CM1 and CM3 of the IEEE 802.15.3a channel model, incorporating four different schemes of pre-distortion: time reversal, zero-forcing pre-equaliser, constrained least squares pre-equaliser, and minimum mean square error pre-equaliser. For the third case, a simple solution based on the steepest-descent (gradient) algorithm is adopted and compared with theoretical results. The channel estimations at the transmitter are assumed to be truncated and noisy. Results show that the constrained least squares algorithm has a good trade-off between intersymbol interference reduction and signal-to-noise ratio preservation, providing a performance comparable to the minimum mean square error method but with lower computational complexity. Copyright (C) 2011 John Wiley & Sons, Ltd.
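The steepest-descent solution mentioned above can be sketched as a gradient iteration on the least-squares pre-equaliser design: iterate g <- g - mu * H^T (H g - d), where H is the channel convolution matrix and d the desired ISI-free overall response. The channel, equaliser length and step size below are hypothetical, and the explicit constraint handling of the paper's constrained scheme is omitted:

```python
# Steepest-descent (gradient) iteration for a least-squares pre-equaliser.
# h, L and mu are assumed values for illustration only.
import numpy as np

h = np.array([1.0, 0.5, 0.2])   # hypothetical channel impulse response
L = 8                            # pre-equaliser length
n = len(h) + L - 1
H = np.zeros((n, L))
for i in range(L):               # build the convolution (Toeplitz) matrix
    H[i:i + len(h), i] = h
d = np.zeros(n)
d[0] = 1.0                       # desired overall response: one impulse (no ISI)

g = np.zeros(L)
mu = 0.1                         # step size; must satisfy mu < 2/lambda_max(H^T H)
for _ in range(5000):
    g -= mu * H.T @ (H @ g - d)  # gradient of 0.5*||H g - d||^2

residual = np.linalg.norm(H @ g - d)  # remaining intersymbol interference
```

The iteration converges to the ordinary least-squares equaliser; the trade-off noted in the abstract comes from additionally constraining the transmit power, which this sketch does not model.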

Relevance: 100.00%

Abstract:

Within the nutritional context, the supplementation of microminerals in bird feed is often made in quantities exceeding the requirements, in an attempt to ensure proper animal performance. Dosage-response experiments are very common for determining optimal nutrient levels in feed balancing and make use of regression models for this purpose. Nevertheless, routine regression analysis generally does not use a priori information about a possible monotonic relationship between the response variable and the dose. Isotonic regression is a least-squares estimation method that generates estimates preserving the ordering of the data; in its theory this ordering information is essential and is expected to increase fitting efficiency. The objective of this work was to use isotonic regression methodology as an alternative way of analyzing data on Zn deposition in the tibia of male birds of the Hubbard lineage. We considered plateau-response models of quadratic polynomial and linear exponential forms. In addition to these models, we also proposed fitting a logarithmic model to the data, and the efficiency of the methodology was evaluated by Monte Carlo simulations considering different scenarios for the parametric values. Isotonization of the data yielded an improvement in all the fitting-quality measures evaluated. Among the models used, the logarithmic one gave parameter estimates most consistent with the values reported in the literature.
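The order-preserving least-squares fit at the heart of isotonic regression is usually computed with the pool-adjacent-violators algorithm (PAVA). A minimal sketch with hypothetical data:

```python
# Pool-adjacent-violators algorithm: nondecreasing least-squares fit.
# The input sequence is hypothetical, standing in for dose-ordered responses.

def pava(y, w=None):
    """Return the nondecreasing least-squares fit to y (optional weights w)."""
    w = w or [1.0] * len(y)
    blocks = []  # each block: [mean, total weight, count]
    for yi, wi in zip(y, w):
        blocks.append([yi, wi, 1])
        # merge adjacent blocks while the ordering constraint is violated
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            m2, w2, c2 = blocks.pop()
            m1, w1, c1 = blocks.pop()
            wt = w1 + w2
            blocks.append([(m1 * w1 + m2 * w2) / wt, wt, c1 + c2])
    fit = []
    for m, _, c in blocks:
        fit.extend([m] * c)
    return fit

smoothed = pava([1.0, 3.0, 2.0, 4.0])  # the 3.0/2.0 violation gets pooled
```

Violating neighbours are replaced by their (weighted) mean, which is exactly the projection onto the cone of nondecreasing sequences.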

Relevance: 100.00%

Abstract:

A new method for analysis of scattering data from lamellar bilayer systems is presented. The method employs a form-free description of the cross-section structure of the bilayer and the fit is performed directly to the scattering data, introducing also a structure factor when required. The cross-section structure (electron density profile in the case of X-ray scattering) is described by a set of Gaussian functions and the technique is termed Gaussian deconvolution. The coefficients of the Gaussians are optimized using a constrained least-squares routine that induces smoothness of the electron density profile. The optimization is coupled with the point-of-inflection method for determining the optimal weight of the smoothness. With the new approach, it is possible to optimize simultaneously the form factor, structure factor and several other parameters in the model. The applicability of this method is demonstrated by using it in a study of a multilamellar system composed of lecithin bilayers, where the form factor and structure factor are obtained simultaneously, and the obtained results provided new insight into this very well known system.
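The smoothness-inducing constrained least-squares step can be illustrated with a ridge-type sketch: minimise ||Ax - b||^2 + lam*||Dx||^2, where D is a second-difference operator that penalises rough coefficient profiles. The model matrix A, data b and weight lam below are hypothetical stand-ins for the Gaussian-basis model and the scattering data:

```python
# Smoothness-regularised least squares via the augmented normal equations
# (A^T A + lam D^T D) x = A^T b. All quantities are synthetic.
import numpy as np

rng = np.random.default_rng(0)
m, n = 40, 10
A = rng.normal(size=(m, n))            # stand-in for the Gaussian-basis model
x_true = np.linspace(0.0, 1.0, n)      # smooth "true" coefficient profile
b = A @ x_true + 0.01 * rng.normal(size=m)

D = np.diff(np.eye(n), n=2, axis=0)    # second-difference operator, (n-2, n)
lam = 0.1                              # smoothness weight (assumed)
x = np.linalg.solve(A.T @ A + lam * D.T @ D, A.T @ b)
```

In the paper the weight of the smoothness term is not fixed by hand but chosen with the point-of-inflection method; the sketch just shows the penalised solve itself.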

Relevance: 100.00%

Abstract:

In this work, two physical flow experiments on nonwoven fabrics are investigated, which serve to identify unknown hydraulic parameters of the material, such as the diffusivity or conductivity function, from measured data. The physical and mathematical modeling of these experiments leads to a Cauchy-Dirichlet problem with free boundary for the degenerate parabolic Richards equation in the saturation formulation, the so-called direct problem. From knowledge of the free boundary of this problem, the nonlinear diffusivity coefficient of the differential equation is to be reconstructed. For this inverse problem we set up an output least-squares functional and minimize it with iterative regularization methods such as the Levenberg-Marquardt method and the IRGN method, based on a parametrization of the coefficient space by quadratic B-splines. For the direct problem we prove, among other things, existence and uniqueness of the solution of the Cauchy-Dirichlet problem as well as the existence of the free boundary. We then formally reduce the derivative of the free boundary with respect to the coefficient, which is needed for the numerical reconstruction method, to a linear degenerate parabolic boundary value problem. We describe the numerical realization and implementation of our reconstruction method and finally present reconstruction results for synthetic data.
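The output-least-squares idea can be sketched on a toy scalar problem: a damped Gauss-Newton (Levenberg-Marquardt-style) iteration fitting one parameter of a forward model to synthetic data. The exponential model here is a hypothetical stand-in for the actual free-boundary map, and the fixed damping replaces the adaptive strategy a full Levenberg-Marquardt method would use:

```python
# Damped Gauss-Newton iteration for a scalar output-least-squares fit.
# Forward model, data and damping are all assumed for illustration.
import numpy as np

t = np.linspace(0.0, 2.0, 20)
k_true = 1.5
data = np.exp(-k_true * t)            # synthetic "measurements"

def resid(k):                         # output-least-squares residual
    return np.exp(-k * t) - data

def jac(k):                           # derivative of the residual w.r.t. k
    return -t * np.exp(-k * t)

k, lam = 0.5, 1e-2                    # initial guess and (fixed) damping
for _ in range(50):
    r, J = resid(k), jac(k)
    k -= (J @ r) / (J @ J + lam)      # damped Gauss-Newton update
```

In the thesis the unknown is a whole coefficient function (parametrized by quadratic B-splines) rather than one scalar, but the structure of the update is the same.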

Relevance: 100.00%

Abstract:

This thesis describes studies on the development of physical analysis methods coupled with multivariate statistical techniques to evaluate the quality and authenticity of vegetable oils and dairy products. The application of physical instruments reduces the cost and time required by classical analyses and, at the same time, can provide a different set of information concerning both the quality and the authenticity of products. For these methods to work well, robust statistical models must be built from data sets that are correctly collected and representative of the field of application. In this thesis, vegetable oils and some types of cheese were analyzed (in particular Pecorino cheeses in two research studies and Parmigiano-Reggiano in another). Several analytical instruments (physical methods) were used, notably spectroscopy, differential thermal analysis and the electronic nose, in addition to traditional separative techniques. The data obtained from the analyses were processed with several statistical techniques, above all partial least squares, multiple linear regression and linear discriminant analysis.

Relevance: 100.00%

Abstract:

The early detection of subjects with probable Alzheimer's disease (AD) is crucial for the effective application of treatment strategies. Here we explored the ability of a multitude of linear and non-linear classification algorithms to discriminate between the electroencephalograms (EEGs) of patients with varying degrees of AD and their age-matched control subjects. Absolute and relative spectral power, distribution of spectral power, and measures of spatial synchronization were calculated from recordings of resting eyes-closed continuous EEGs of 45 healthy controls, 116 patients with mild AD and 81 patients with moderate AD, recruited in two different centers (Stockholm, New York). The applied classification algorithms were: principal component linear discriminant analysis (PC LDA), partial least squares LDA (PLS LDA), principal component logistic regression (PC LR), partial least squares logistic regression (PLS LR), bagging, random forest, support vector machines (SVM) and feed-forward neural networks. Based on 10-fold cross-validation runs it could be demonstrated that, even though modern computer-intensive classification algorithms such as random forests, SVM and neural networks show a slight superiority, the more classical classification algorithms performed nearly equally well. Using random forest classification, a considerable sensitivity of up to 85% and a specificity of 78% were reached even for the test involving only mild AD patients, whereas for the comparison of moderate AD vs. controls, using SVM and neural networks, values of 89% and 88% for sensitivity and specificity were achieved. Such a remarkable performance proves the value of these classification algorithms for clinical diagnostics.
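The sensitivity and specificity figures quoted above are simple functions of the confusion-matrix counts; a minimal sketch, with hypothetical counts chosen only to mirror the reported percentages:

```python
# Sensitivity and specificity from confusion-matrix counts.
# The counts below are hypothetical (a nominal 100-patient / 100-control split).

def sens_spec(tp, fn, tn, fp):
    sensitivity = tp / (tp + fn)  # fraction of patients correctly flagged
    specificity = tn / (tn + fp)  # fraction of controls correctly cleared
    return sensitivity, specificity

se, sp = sens_spec(tp=85, fn=15, tn=78, fp=22)
```

In a k-fold cross-validation setting, these counts are accumulated over the held-out folds before the two rates are computed.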

Relevance: 100.00%

Abstract:

A great increase in private car ownership took place in China from 1980 to 2009 with the development of the economy. To explain the relationship between car ownership and economic and social changes, an ordinary least squares linear regression model is developed using car ownership per capita as the dependent variable, with GDP, savings deposits and highway mileage per capita as the independent variables. The model is tested and corrected for econometric problems such as spurious correlation and cointegration. Finally, the regression model is used to project oil consumption by the Chinese transportation sector through 2015. The result shows that about 2.0 million barrels of oil per day will be consumed by private cars in the conservative scenario, and about 2.6 million barrels per day in the high-case scenario in 2015. Both are much higher than the 2009 consumption level of 1.9 million barrels per day. The results also show that the annual growth rate of oil demand by transportation is 2.7% - 3.1% per year in the conservative scenario, and 6.9% - 7.3% per year in the high-case scenario from 2010 to 2015. As a result, actions such as increasing oil efficiency need to be taken to meet the challenges of the increasing demand for oil.
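A multi-predictor ordinary least squares model of this shape can be sketched in a few lines; all data below are synthetic stand-ins for the GDP, savings and highway series, not the study's data:

```python
# OLS with an intercept and three predictors via numpy's lstsq.
# The "true" coefficients and noise are assumed purely for illustration.
import numpy as np

rng = np.random.default_rng(1)
n = 30
X = rng.uniform(1.0, 10.0, size=(n, 3))   # stand-ins: GDP, savings, highway p.c.
beta_true = np.array([0.4, 0.2, 0.1])
y = 0.5 + X @ beta_true + 0.01 * rng.normal(size=n)  # car ownership p.c.

A = np.column_stack([np.ones(n), X])      # prepend an intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
```

With time-series inputs like these, the fitted coefficients are only meaningful after the spurious-correlation and cointegration checks the abstract mentions.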

Relevance: 100.00%

Abstract:

Indoor localization systems have become increasingly interesting to researchers because of attractive business cases in various application fields. A WiFi-based passive localization system can provide user location information to third-party providers of positioning services. However, indoor localization techniques are prone to multipath and Non-Line-Of-Sight (NLOS) propagation, which lead to significant performance degradation. To overcome these problems, we provide a passive localization system for WiFi targets with several improved localization algorithms. Through Software Defined Radio (SDR) techniques, we extract Channel Impulse Response (CIR) information at the physical layer. The CIR is then used to mitigate the multipath fading problem. We propose a Nonlinear Regression (NLR) method to relate the filtered power information to propagation distance, which significantly improves the ranging accuracy compared with the commonly used log-distance path loss model. To mitigate the influence of ranging errors, a new trilateration algorithm is designed as well by combining the Weighted Centroid and Constrained Weighted Least Squares (WC-CWLS) algorithms. Experiment results show that our algorithm is robust against ranging errors and outperforms the linear least squares and weighted centroid algorithms.
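The linear least squares baseline the abstract compares against can be sketched as follows: subtracting the first range equation from the others linearises the problem in the target coordinates. Anchor positions and (noise-free) ranges below are hypothetical:

```python
# Linearised least-squares trilateration: for each anchor i >= 2,
#   2(x_i-x_1)x + 2(y_i-y_1)y = d_1^2 - d_i^2 + x_i^2+y_i^2 - x_1^2-y_1^2.
# Anchors and target are assumed values for illustration.
import numpy as np

anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
target = np.array([3.0, 4.0])
d = np.linalg.norm(anchors - target, axis=1)  # "measured" ranges (noise-free)

x1, y1 = anchors[0]
A = 2.0 * (anchors[1:] - anchors[0])
b = (d[0] ** 2 - d[1:] ** 2
     + np.sum(anchors[1:] ** 2, axis=1) - x1 ** 2 - y1 ** 2)
pos, *_ = np.linalg.lstsq(A, b, rcond=None)   # estimated target position
```

The WC-CWLS scheme in the paper improves on this baseline by weighting the equations and constraining the solution, which matters once the ranges are noisy.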

Relevance: 100.00%

Abstract:

Linear- and unimodal-based inference models for mean summer temperatures (partial least squares, weighted averaging, and weighted averaging partial least squares models) were applied to a high-resolution pollen and cladoceran stratigraphy from Gerzensee, Switzerland. The time-window of investigation included the Allerød, the Younger Dryas, and the Preboreal. Characteristic major and minor oscillations in the oxygen-isotope stratigraphy, such as the Gerzensee oscillation, the onset and end of the Younger Dryas stadial, and the Preboreal oscillation, were identified by isotope analysis of bulk-sediment carbonates of the same core and were used as independent indicators for hemispheric or global scale climatic change. In general, the pollen-inferred mean summer temperature reconstruction using all three inference models follows the oxygen-isotope curve more closely than the cladoceran curve. The cladoceran-inferred reconstruction suggests generally warmer summers than the pollen-based reconstructions, which may be an effect of terrestrial vegetation not being in equilibrium with climate due to migrational lags during the Late Glacial and early Holocene. Allerød summer temperatures range between 11 and 12°C based on pollen, whereas the cladoceran-inferred temperatures lie between 11 and 13°C. Pollen and cladocera-inferred reconstructions both suggest a drop to 9–10°C at the beginning of the Younger Dryas. Although the Allerød–Younger Dryas transition lasted 150–160 years in the oxygen-isotope stratigraphy, the pollen-inferred cooling took 180–190 years and the cladoceran-inferred cooling lasted 250–260 years. The pollen-inferred summer temperature rise to 11.5–12°C at the transition from the Younger Dryas to the Preboreal preceded the oxygen-isotope signal by several decades, whereas the cladoceran-inferred warming lagged. 
Major discrepancies between the pollen- and cladoceran-inference models are observed for the Preboreal, where the cladoceran-inference model suggests mean summer temperatures of up to 14–15°C. Both pollen- and cladoceran-inferred reconstructions suggest a cooling that may be related to the Gerzensee oscillation, but there is no evidence for a cooling synchronous with the Preboreal oscillation as recorded in the oxygen-isotope record. For the Gerzensee oscillation the inferred cooling was ca. 1 and 0.5°C based on pollen and cladocera, respectively, which lies well within the inherent prediction errors of the inference models.

Relevance: 100.00%

Abstract:

Kriging is a widely employed method for interpolating and estimating elevations from digital elevation data. Its place of prominence is due to its elegant theoretical foundation and its convenient practical implementation. From an interpolation point of view, kriging is equivalent to a thin-plate spline and is one species among the many in the genus of weighted inverse distance methods, albeit with attractive properties. However, from a statistical point of view, kriging is a best linear unbiased estimator and, consequently, has a place of distinction among all spatial estimators because any other linear estimator that performs as well as kriging (in the least squares sense) must be equivalent to kriging, assuming that the parameters of the semivariogram are known. Therefore, kriging is often held to be the gold standard of digital terrain model elevation estimation. However, I prove that, when used with local support, kriging creates discontinuous digital terrain models, which is to say, surfaces with “rips” and “tears” throughout them. This result is general; it is true for ordinary kriging, kriging with a trend, and other forms. A U.S. Geological Survey (USGS) digital elevation model was analyzed to characterize the distribution of the discontinuities. I show that the magnitude of the discontinuity does not depend on surface gradient but is strongly dependent on the size of the kriging neighborhood.
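The estimator discussed above can be illustrated with a minimal ordinary-kriging sketch in one dimension: the weights come from solving the kriging system built from the semivariogram, with a Lagrange multiplier enforcing that the weights sum to one (the unbiasedness condition behind "best linear unbiased estimator"). The sample data and the exponential semivariogram parameters are assumed:

```python
# Ordinary kriging at one point, 1-D, with an assumed exponential
# semivariogram. Data locations and values are hypothetical.
import numpy as np

def gamma(h, sill=1.0, rng_=5.0):
    """Assumed exponential semivariogram."""
    return sill * (1.0 - np.exp(-np.abs(h) / rng_))

xs = np.array([0.0, 1.0, 3.0, 6.0])   # sample locations
zs = np.array([1.0, 1.2, 0.8, 1.5])   # sample elevations
x0 = 2.0                              # prediction location

n = len(xs)
K = np.ones((n + 1, n + 1))           # kriging system matrix (+ Lagrange row/col)
K[:n, :n] = gamma(xs[:, None] - xs[None, :])
K[n, n] = 0.0
rhs = np.append(gamma(xs - x0), 1.0)
w = np.linalg.solve(K, rhs)[:n]       # kriging weights (sum to one)
z0 = w @ zs                           # kriged estimate at x0
```

The discontinuities the abstract describes arise when such a system is re-solved with a different local neighborhood of samples at each prediction point, so that the weights, and hence the surface, jump as samples enter or leave the neighborhood.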

Relevance: 100.00%

Abstract:

Current statistical methods for estimation of parametric effect sizes from a series of experiments are generally restricted to univariate comparisons of standardized mean differences between two treatments. Multivariate methods are presented for the case in which effect size is a vector of standardized multivariate mean differences and the number of treatment groups is two or more. The proposed methods employ a vector of independent sample means for each response variable that leads to a covariance structure which depends only on correlations among the p responses on each subject. Using weighted least squares theory and the assumption that the observations are from normally distributed populations, multivariate hypotheses analogous to common hypotheses used for testing effect sizes were formulated and tested for treatment effects which are correlated through a common control group, through multiple response variables observed on each subject, or both conditions. The asymptotic multivariate distribution for correlated effect sizes is obtained by extending univariate methods for estimating effect sizes which are correlated through common control groups. The joint distribution of vectors of effect sizes (from p responses on each subject) from one treatment and one control group and from several treatment groups sharing a common control group are derived. Methods are given for estimation of linear combinations of effect sizes when certain homogeneity conditions are met, and for estimation of vectors of effect sizes and confidence intervals from p responses on each subject. Computational illustrations are provided using data from studies of effects of electric field exposure on small laboratory animals.
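The basic building block of these methods, a standardized mean difference between a treatment and a control sample, can be sketched as follows with hypothetical data:

```python
# Standardized mean difference (pooled-SD effect size) between two samples.
# The two small samples below are hypothetical.
import math

def effect_size(treat, ctrl):
    nt, nc = len(treat), len(ctrl)
    mt = sum(treat) / nt
    mc = sum(ctrl) / nc
    vt = sum((x - mt) ** 2 for x in treat) / (nt - 1)   # sample variances
    vc = sum((x - mc) ** 2 for x in ctrl) / (nc - 1)
    pooled = math.sqrt(((nt - 1) * vt + (nc - 1) * vc) / (nt + nc - 2))
    return (mt - mc) / pooled

d = effect_size([5.0, 6.0, 7.0], [4.0, 5.0, 6.0])
```

The multivariate methods in the abstract stack p such quantities per subject into a vector and model the correlations among them, including those induced by a shared control group.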