62 results for Least squares methods


Relevance:

90.00%

Publisher:

Abstract:

A study combining high resolution mass spectrometry (liquid chromatography-quadrupole time-of-flight-mass spectrometry, UPLC-QTof-MS) and chemometrics for the analysis of post-mortem brain tissue from subjects with Alzheimer’s disease (AD) (n = 15) and healthy age-matched controls (n = 15) was undertaken. The huge potential of this metabolomics approach for distinguishing AD cases is underlined by the correct prediction of disease status in 94–97% of cases. Predictive power was confirmed in a blind test set of 60 samples, reaching 100% diagnostic accuracy. The approach also indicated compounds significantly altered in concentration following the onset of human AD. Using orthogonal partial least-squares discriminant analysis (OPLS-DA), a multivariate model was created for both modes of acquisition explaining the maximum amount of variation between sample groups (positive mode: R² = 97%, Q² = 93%, root mean squared error of validation (RMSEV) = 13%; negative mode: R² = 99%, Q² = 92%, RMSEV = 15%). In brain extracts, 1264 and 1457 ions of interest were detected for the different modes of acquisition (positive and negative, respectively). Incorporation of gender into the model increased predictive accuracy and decreased RMSEV values. High resolution UPLC-QTof-MS has not previously been employed to biochemically profile post-mortem brain tissue, and the novel methods described and validated herein prove its potential for making new discoveries related to the etiology, pathophysiology, and treatment of degenerative brain disorders.
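
As a rough illustration of the chemometric step, the sketch below fits a plain PLS regression against a binary disease label, a common stand-in for OPLS-DA; the study's actual model, samples and ion matrix are not reproduced, and all values are synthetic.

```python
# Minimal PLS-DA sketch on synthetic data: standard PLS regression against a
# binary class label stands in for the OPLS-DA model described in the abstract.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 1264))        # 30 brain extracts x 1264 ions (positive mode)
y = np.repeat([0, 1], 15)              # 0 = control, 1 = AD

pls = PLSRegression(n_components=2)
y_cv = cross_val_predict(pls, X, y, cv=5).ravel()
accuracy = np.mean((y_cv > 0.5) == y)  # classify by thresholding the PLS prediction
print(f"cross-validated accuracy: {accuracy:.2f}")
```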

Relevance:

90.00%

Publisher:

Abstract:

Objective
To investigate the effect of fast food consumption on mean population body mass index (BMI) and explore the possible influence of market deregulation on fast food consumption and BMI.

Methods
The within-country association between fast food consumption and BMI in 25 high-income member countries of the Organisation for Economic Co-operation and Development between 1999 and 2008 was explored through multivariate panel regression models, after adjustment for per capita gross domestic product, urbanization, trade openness, lifestyle indicators and other covariates. The possible mediating effect of annual per capita intake of soft drinks, animal fats and total calories on the association between fast food consumption and BMI was also analysed. Two-stage least squares regression models were estimated, using economic freedom as an instrumental variable, to study the causal effect of fast food consumption on BMI.

Findings
After adjustment for covariates, each 1-unit increase in annual fast food transactions per capita was associated with an increase of 0.033 kg/m2 in age-standardized BMI (95% confidence interval, CI: 0.013–0.052). Only the intake of soft drinks – not animal fat or total calories – mediated the observed association (β: 0.030; 95% CI: 0.010–0.050). Economic freedom was an independent predictor of fast food consumption (β: 0.27; 95% CI: 0.16–0.37). When economic freedom was used as an instrumental variable, the association between fast food and BMI weakened but remained significant (β: 0.023; 95% CI: 0.001–0.045).

Conclusion
Fast food consumption is an independent predictor of mean BMI in high-income countries. Market deregulation policies may contribute to the obesity epidemic by facilitating the spread of fast food.
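
The instrumental-variable step described in the Methods can be illustrated with a minimal two-stage least squares sketch on synthetic data; the variable names are hypothetical, and a manual second stage gives consistent point estimates but its reported standard errors are not corrected.

```python
# Hedged 2SLS sketch: stage 1 regresses the endogenous regressor (fast food
# transactions) on the instrument (economic freedom); stage 2 regresses BMI on
# the stage-1 fitted values. All data below are simulated.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 250
econ_freedom = rng.normal(size=n)                      # instrument
fast_food = 0.3 * econ_freedom + rng.normal(size=n)    # endogenous regressor
bmi = 25 + 0.03 * fast_food + rng.normal(scale=0.5, size=n)

stage1 = sm.OLS(fast_food, sm.add_constant(econ_freedom)).fit()
stage2 = sm.OLS(bmi, sm.add_constant(stage1.fittedvalues)).fit()
print(stage2.params)   # second element approximates the causal effect of fast food on BMI
```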

Relevance:

90.00%

Publisher:

Abstract:

Thermocouples are one of the most popular devices for temperature measurement due to their robustness, ease of manufacture and installation, and low cost. However, when used in certain harsh environments, for example, in combustion systems and engine exhausts, large wire diameters are required, and consequently the measurement bandwidth is reduced. This article discusses a software compensation technique to address the loss of high frequency fluctuations based on measurements from two thermocouples. In particular, a difference equation (DE) approach is proposed and compared with existing methods both in simulation and on experimental test rig data with constant flow velocity. It is found that the DE algorithm, combined with the use of generalized total least squares for parameter identification, provides better performance in terms of time constant estimation without any a priori assumption on the time constant ratios of the thermocouples.
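
The generalized total least squares identification referred to above allows for noise on all measured signals. Below is a minimal sketch of the plain total least squares idea, an SVD line fit with errors in both variables on synthetic data; it is not the paper's full difference-equation scheme.

```python
# Total least squares sketch: fit y = a*x allowing noise in both x and y via the
# SVD, by analogy with two noisy thermocouple signals (illustration only).
import numpy as np

rng = np.random.default_rng(2)
x_true = np.linspace(0, 1, 200)
a_true = 0.7
x = x_true + rng.normal(scale=0.01, size=x_true.size)            # noisy regressor
y = a_true * x_true + rng.normal(scale=0.01, size=x_true.size)   # noisy response

A = np.column_stack([x, y])
_, _, Vt = np.linalg.svd(A - A.mean(axis=0), full_matrices=False)
v = Vt[-1]                      # right singular vector of the smallest singular value
a_tls = -v[0] / v[1]            # slope of the TLS line through the centred data
print(a_tls)
```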

Relevance:

90.00%

Publisher:

Abstract:

The in-line measurement of COD and NH4-N in the WWTP inflow is crucial for the timely monitoring of biological wastewater treatment processes and for the development of advanced control strategies for optimized WWTP operation. As a direct measurement of COD and NH4-N requires expensive and high maintenance in-line probes or analyzers, an approach estimating COD and NH4-N based on standard and spectroscopic in-line inflow measurement systems using Machine Learning Techniques is presented in this paper. The results show that COD can be estimated using Random Forest Regression with a normalized MSE of 0.3, which is sufficiently accurate for practical applications, using only standard in-line measurements. In the case of NH4-N, a good estimation using Partial Least Squares Regression with a normalized MSE of 0.16 is only possible based on a combination of standard and spectroscopic in-line measurements. Furthermore, the comparison of regression and classification methods shows that both methods perform equally well in most cases.
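
A toy comparison of the two estimators named above, on synthetic data with hypothetical feature columns, might look like this:

```python
# Illustrative comparison: Random Forest regression for COD and PLS regression
# for NH4-N, both scored with a normalized MSE (synthetic data only).
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(3)
X = rng.normal(size=(500, 12))                 # standard + spectroscopic in-line signals
cod = 2 * X[:, 0] + X[:, 1] + rng.normal(scale=0.5, size=500)
nh4 = X[:, 2] - 0.5 * X[:, 3] + rng.normal(scale=0.3, size=500)

def normalized_mse(y, y_hat):
    return np.mean((y - y_hat) ** 2) / np.var(y)

cod_hat = cross_val_predict(RandomForestRegressor(n_estimators=200, random_state=0), X, cod, cv=5)
nh4_hat = cross_val_predict(PLSRegression(n_components=4), X, nh4, cv=5).ravel()
print(normalized_mse(cod, cod_hat), normalized_mse(nh4, nh4_hat))
```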

Relevance:

90.00%

Publisher:

Abstract:

BACKGROUND: It is now common for individuals to require dialysis following the failure of a kidney transplant. Management of complications and preparation for dialysis are suboptimal in this group. To aid planning, it is desirable to estimate the time to dialysis requirement. The rate of decline in the estimated glomerular filtration rate (eGFR) may be used to this end.

METHODS: This study compared the rate of eGFR decline prior to dialysis commencement between individuals with failing transplants and transplant-naïve patients. The rate of eGFR decline was also compared between transplant recipients with and without graft failure. eGFR was calculated using the four-variable MDRD equation with rate of decline calculated by least squares linear regression.

RESULTS: The annual rate of eGFR decline in incident dialysis patients with graft failure exceeded that of the transplant-naïve incident dialysis patients. In the transplant cohort, the mean annual rate of eGFR decline prior to graft failure was 7.3 ml/min/1.73 m² compared to 4.8 ml/min/1.73 m² in the transplant-naïve group (p < 0.001) and 0.35 ml/min/1.73 m² in recipients without graft failure (p < 0.001). Factors associated with eGFR decline were recipient age, decade of transplantation, HLA mismatch and histological evidence of chronic immunological injury.

CONCLUSIONS: Individuals with graft failure have a rapid decline in eGFR prior to dialysis commencement. To improve outcomes, dialysis planning and management of chronic kidney disease complications should be initiated earlier than in the transplant-naïve population.
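
The rate-of-decline calculation described in the Methods reduces to fitting an ordinary least squares line through serial eGFR values; a minimal sketch with made-up numbers:

```python
# Annual rate of eGFR decline as the slope of an ordinary least squares line
# fitted to serial eGFR measurements (values are invented for illustration).
import numpy as np

years = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5])
egfr = np.array([38.0, 34.5, 31.2, 27.8, 24.1, 20.9])   # ml/min/1.73 m²

slope, intercept = np.polyfit(years, egfr, deg=1)
print(f"annual eGFR decline: {-slope:.1f} ml/min/1.73 m² per year")
```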

Relevance:

90.00%

Publisher:

Abstract:

Tropical peatlands represent globally important carbon sinks with a unique biodiversity and are currently threatened by climate change and human activities. It is now imperative that proxy methods are developed to understand the ecohydrological dynamics of these systems and for testing peatland development models. Testate amoebae have been used as environmental indicators in ecological and palaeoecological studies of peatlands, primarily in ombrotrophic Sphagnum-dominated peatlands in the mid- and high-latitudes. We present the first ecological analysis of testate amoebae in a tropical peatland, a nutrient-poor domed bog in western (Peruvian) Amazonia. Litter samples were collected from different hydrological microforms (hummock to pool) along a transect from the edge to the interior of the peatland. We recorded 47 taxa from 21 genera. The most common taxa are Cryptodifflugia oviformis, Euglypha rotunda type, Phryganella acropodia, Pseudodifflugia fulva type and Trinema lineare. One species found only in the southern hemisphere, Argynnia spicata, is present. Arcella spp., Centropyxis aculeata and Lesqueresia spiralis are indicators of pools containing standing water. Canonical correspondence analysis and non-metric multidimensional scaling illustrate that water table depth is a significant control on the distribution of testate amoebae, similar to the results from mid- and high-latitude peatlands. A transfer function model for water table based on weighted averaging partial least-squares (WA-PLS) regression is presented and performs well under cross-validation (r²apparent = 0.76, RMSE = 4.29; r²jack = 0.68, RMSEP = 5.18). The transfer function was applied to a 1-m peat core, and sample-specific reconstruction errors were generated using bootstrapping. The reconstruction generally suggests near-surface water tables over the last 3,000 years, with a shift to drier conditions at c. cal. AD 1218–1273.
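
The transfer-function idea can be sketched with plain weighted averaging on synthetic data; the study itself uses WA-PLS with bootstrapped sample-specific errors, which is not reproduced here.

```python
# Weighted-averaging sketch: a taxon's water-table optimum is the abundance-weighted
# mean of observed depths, and a sample's inferred depth is the abundance-weighted
# mean of taxon optima (plain WA, no deshrinking; toy data).
import numpy as np

rng = np.random.default_rng(4)
n_samples, n_taxa = 40, 12
abundances = rng.random((n_samples, n_taxa))
abundances /= abundances.sum(axis=1, keepdims=True)      # relative abundances
water_table = rng.uniform(0, 40, n_samples)               # observed depths (cm)

optima = (abundances * water_table[:, None]).sum(axis=0) / abundances.sum(axis=0)
inferred = (abundances * optima).sum(axis=1) / abundances.sum(axis=1)
rmse = np.sqrt(np.mean((inferred - water_table) ** 2))
print(f"apparent RMSE: {rmse:.2f} cm")
```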

Relevance:

90.00%

Publisher:

Abstract:

Background: Lumacaftor/ivacaftor combination therapy demonstrated clinical benefits in patients with cystic fibrosis homozygous for the Phe508del CFTR mutation. Pretreatment lung function is a confounding factor that potentially impacts the efficacy and safety of lumacaftor/ivacaftor therapy.

Methods: Two multinational, randomised, double-blind, placebo-controlled, parallel-group Phase 3 studies randomised patients to receive placebo or lumacaftor (600 mg once daily [qd] or 400 mg every 12 hours [q12h]) in combination with ivacaftor (250 mg q12h) for 24 weeks. Prespecified analyses of pooled efficacy and safety data by lung function, as measured by percent predicted forced expiratory volume in 1 second (ppFEV1), were performed for patients with baseline ppFEV1 <40 (n=81) and ≥40 (n=1016) and screening ppFEV1 <70 (n=730) and ≥70 (n=342). These studies were registered with ClinicalTrials.gov (NCT01807923 and NCT01807949).

Findings: The studies were conducted from April 2013 through April 2014. Improvements in the primary endpoint, absolute change from baseline at week 24 in ppFEV1, were observed with both lumacaftor/ivacaftor doses in the subgroup with baseline ppFEV1 <40 (least-squares mean difference versus placebo was 3.7 and 3.3 percentage points for lumacaftor 600 mg qd/ivacaftor 250 mg q12h and lumacaftor 400 mg q12h/ivacaftor 250 mg q12h, respectively [p<0.05]) and in the subgroup with baseline ppFEV1 ≥40 (3.3 and 2.8 percentage points, respectively [p<0.001]). Similar absolute improvements versus placebo in ppFEV1 were observed in subgroups with screening ppFEV1 <70 (3.3 and 3.3 percentage points for lumacaftor 600 mg qd/ivacaftor 250 mg q12h and lumacaftor 400 mg q12h/ivacaftor 250 mg q12h, respectively [p<0.001]) and ≥70 (3.3 and 1.9 percentage points, respectively [p=0.002 and p=0.079]). Increases in BMI and reductions in the number of pulmonary exacerbation events were observed in both LUM/IVA dose groups versus placebo across all lung function subgroups. Treatment was generally well tolerated, although the incidence of some respiratory adverse events was higher with active treatment than with placebo.

Interpretation: Lumacaftor/ivacaftor combination therapy benefits patients homozygous for Phe508del CFTR who have varying degrees of lung function impairment.

Funding: Vertex Pharmaceuticals Incorporated.
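
The least-squares mean differences quoted above are covariate-adjusted treatment contrasts; the ANCOVA sketch below, on synthetic data with hypothetical column names, shows the general idea and is not the trial's analysis model.

```python
# In an ANCOVA of change in ppFEV1 on treatment plus baseline ppFEV1, the
# treatment coefficient is the least-squares mean difference versus placebo
# (synthetic data, illustration only).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 300
df = pd.DataFrame({
    "treated": rng.integers(0, 2, n),                 # 0 = placebo, 1 = active
    "baseline_ppfev1": rng.normal(60, 15, n),
})
df["change_ppfev1"] = 3.0 * df["treated"] - 0.02 * df["baseline_ppfev1"] + rng.normal(0, 6, n)

fit = smf.ols("change_ppfev1 ~ treated + baseline_ppfev1", data=df).fit()
print(fit.params["treated"], fit.conf_int().loc["treated"].values)
```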

Relevance:

90.00%

Publisher:

Abstract:

This paper is part of a special issue of Applied Geochemistry focusing on reliable applications of compositional multivariate statistical methods. This study outlines the application of compositional data analysis (CoDa) to calibration of geochemical data and multivariate statistical modelling of geochemistry and grain-size data from a set of Holocene sedimentary cores from the Ganges-Brahmaputra (G-B) delta. Over the last two decades, understanding near-continuous records of sedimentary sequences has required the use of core-scanning X-ray fluorescence (XRF) spectrometry, for both terrestrial and marine sedimentary sequences. Initial XRF data are generally unusable in ‘raw-format’, requiring data processing in order to remove instrument bias, as well as informed sequence interpretation. The applicability of conventional calibration equations to core-scanning XRF data is further limited by the constraints posed by unknown measurement geometry and specimen homogeneity, as well as matrix effects. Log-ratio based calibration schemes have been developed and applied to clastic sedimentary sequences focusing mainly on energy dispersive-XRF (ED-XRF) core-scanning. This study has applied high resolution core-scanning XRF to Holocene sedimentary sequences from the tide-dominated Indian Sundarbans (Ganges-Brahmaputra delta plain). The Log-Ratio Calibration Equation (LRCE) was applied to a sub-set of core-scan and conventional ED-XRF data to quantify elemental composition. This provides a robust calibration scheme using reduced major axis regression of log-ratio transformed geochemical data. Through partial least squares (PLS) modelling of geochemical and grain-size data, it is possible to derive robust proxy information for the Sundarbans depositional environment. The application of these techniques to Holocene sedimentary data offers an improved methodological framework for unravelling Holocene sedimentation patterns.
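
A rough sketch of the log-ratio calibration idea, with synthetic scanner counts and a reduced major axis fit; the element names and values are invented, and this is not the published LRCE implementation.

```python
# Centred log-ratio transform of scanner counts, then a reduced major axis (RMA)
# regression against a "conventional" concentration series (all data simulated).
import numpy as np

rng = np.random.default_rng(6)
counts = rng.lognormal(mean=2.0, sigma=0.3, size=(50, 4))           # scanner counts, 4 elements
clr = np.log(counts) - np.log(counts).mean(axis=1, keepdims=True)    # centred log-ratio

conc_clr = 1.2 * clr[:, 0] + rng.normal(scale=0.05, size=50)         # reference clr concentration

x, y = clr[:, 0], conc_clr
slope_rma = np.sign(np.corrcoef(x, y)[0, 1]) * y.std(ddof=1) / x.std(ddof=1)
intercept_rma = y.mean() - slope_rma * x.mean()
print(slope_rma, intercept_rma)
```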

Relevance:

80.00%

Publisher:

Abstract:

The characterization of thermocouple sensors for temperature measurement in varying-flow environments is a challenging problem. Recently, the authors introduced novel difference-equation-based algorithms that allow in situ characterization of temperature measurement probes consisting of two-thermocouple sensors with differing time constants. In particular, a linear least squares (LS) lambda formulation of the characterization problem, which yields unbiased estimates when identified using generalized total LS, was introduced. These algorithms assume that time constants do not change during operation and are, therefore, appropriate for temperature measurement in homogeneous constant-velocity liquid or gas flows. This paper develops an alternative β-formulation of the characterization problem that has the major advantage of allowing exploitation of a priori knowledge of the ratio of the sensor time constants, thereby facilitating the implementation of computationally efficient algorithms that are less sensitive to measurement noise. A number of variants of the β-formulation are developed, and appropriate unbiased estimators are identified. Monte Carlo simulation results are used to support the analysis.
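
The benefit of knowing the time-constant ratio a priori can be illustrated, loosely, by a constrained least squares toy problem in which one parameter is a known multiple of another; this is an illustration only, not the paper's β-formulation.

```python
# When one parameter is known a priori to be a fixed multiple k of another, the
# two-parameter least-squares problem collapses to a cheaper single-parameter fit
# (synthetic regressors and noise).
import numpy as np

rng = np.random.default_rng(7)
n, k = 500, 2.0                                  # assume theta2 = k * theta1 is known
u1, u2 = rng.normal(size=(2, n))
theta1_true = 0.8
y = theta1_true * u1 + k * theta1_true * u2 + rng.normal(scale=0.1, size=n)

z = u1 + k * u2                                  # combined regressor under the constraint
theta1_hat = (z @ y) / (z @ z)                   # one-dimensional least squares
print(theta1_hat)
```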

Relevance:

80.00%

Publisher:

Abstract:

An HPLC method has been developed and validated for the determination of spironolactone, 7α-thiomethylspirolactone and canrenone in paediatric plasma samples. The method utilises 200 µl of plasma and sample preparation involves protein precipitation followed by Solid Phase Extraction (SPE). Determination of standard curves of peak height ratio (PHR) against concentration was performed by weighted least squares linear regression using a weighting factor of 1/concentration². The developed method was found to be linear over concentration ranges of 30–1000 ng/ml for spironolactone and 25–1000 ng/ml for 7α-thiomethylspirolactone and canrenone. The lower limits of quantification for spironolactone, 7α-thiomethylspirolactone and canrenone were calculated as 28, 20 and 25 ng/ml, respectively. The method was shown to be applicable to the determination of spironolactone, 7α-thiomethylspirolactone and canrenone in paediatric plasma samples and also plasma from healthy human volunteers.
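
The calibration-curve fit described above can be sketched as a weighted least squares line with 1/concentration² weights; the numbers below are made up.

```python
# Weighted least squares of peak height ratio against concentration with
# weights of 1/concentration² (illustrative calibration data).
import numpy as np

conc = np.array([30, 50, 100, 250, 500, 1000], dtype=float)   # ng/ml
phr = np.array([0.031, 0.052, 0.101, 0.248, 0.502, 0.995])    # peak height ratio

w = 1.0 / conc**2
# Solve the weighted normal equations for phr = slope*conc + intercept.
X = np.column_stack([conc, np.ones_like(conc)])
W = np.diag(w)
slope, intercept = np.linalg.solve(X.T @ W @ X, X.T @ W @ phr)
print(slope, intercept)
```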

Relevance:

80.00%

Publisher:

Abstract:

Objectives: To identify demographic and socioeconomic determinants of need for acute hospital treatment at small area level. To establish whether there is a relation between poverty and use of inpatient services. To devise a risk adjustment formula for distributing public funds for hospital services using, as far as possible, variables that can be updated between censuses. Design: Cross sectional analysis. Spatial interactive modelling was used to quantify the proximity of the population to health service facilities. Two stage weighted least squares regression was used to model use against supply of hospital and community services and a wide range of potential needs drivers including health, socioeconomic census variables, uptake of income support and family credit, and religious denomination. Setting: Northern Ireland. Main outcome measure: Intensity of use of inpatient services. Results: After endogeneity of supply and use was taken into account, a statistical model was produced that predicted use based on five variables: income support, family credit, elderly people living alone, all ages standardised mortality ratio, and low birth weight. The main effect of the formula produced is to move resources from urban to rural areas. Conclusions: This work has produced a population risk adjustment formula for acute hospital treatment in which four of the five variables can be updated annually rather than relying on census derived data. Inclusion of the social security data makes a substantial difference to the model and to the results produced by the formula.
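
A rough sketch of the two-stage weighted least squares idea on synthetic data; all columns and coefficients are hypothetical and do not reproduce the published formula.

```python
# Stage 1 predicts supply from exogenous variables; stage 2 fits a weighted least
# squares model of use on predicted supply and needs drivers, weighted by population.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(8)
n = 120
income_support, elderly_alone, smr = rng.normal(size=(3, n))
instrument = rng.normal(size=n)
supply = 0.5 * instrument + 0.2 * smr + rng.normal(scale=0.3, size=n)
use = 0.4 * supply + 0.6 * income_support + 0.3 * smr + rng.normal(scale=0.2, size=n)
population = rng.uniform(1_000, 50_000, n)

stage1 = sm.OLS(supply, sm.add_constant(np.column_stack([instrument, smr]))).fit()
X2 = sm.add_constant(np.column_stack([stage1.fittedvalues, income_support, elderly_alone, smr]))
stage2 = sm.WLS(use, X2, weights=population).fit()
print(stage2.params)
```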

Relevance:

80.00%

Publisher:

Abstract:

The identification of nonlinear dynamic systems using linear-in-the-parameters models is studied. A fast recursive algorithm (FRA) is proposed both to select the model structure and to estimate the model parameters. Unlike the orthogonal least squares (OLS) method, FRA solves the least-squares problem recursively over the model order without requiring matrix decomposition. The computational complexity of both algorithms is analyzed, along with their numerical stability. The new method is shown to require much less computational effort and is also numerically more stable than OLS.
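
To fix the idea of term-by-term structure selection, a toy forward-selection loop using plain least squares is shown below; the FRA described above performs this kind of selection far more efficiently, recursively over the model order and without matrix decomposition.

```python
# Naive forward selection for a linear-in-the-parameters model: at each step add
# the candidate term that most reduces the residual sum of squares (toy data).
import numpy as np

rng = np.random.default_rng(9)
n, p = 400, 10
X = rng.normal(size=(n, p))
y = 1.5 * X[:, 2] - 0.8 * X[:, 6] + rng.normal(scale=0.1, size=n)

selected, remaining = [], list(range(p))
for _ in range(3):                                   # grow the model term by term
    errs = []
    for j in remaining:
        cols = selected + [j]
        beta, *_ = np.linalg.lstsq(X[:, cols], y, rcond=None)
        errs.append(np.sum((y - X[:, cols] @ beta) ** 2))
    best = remaining[int(np.argmin(errs))]
    selected.append(best)
    remaining.remove(best)
print(selected)                                       # indices of the chosen terms
```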

Relevance:

80.00%

Publisher:

Abstract:

We present a fast and efficient hybrid algorithm for selecting exoplanetary candidates from wide-field transit surveys. Our method is based on the widely used SysRem and Box Least-Squares (BLS) algorithms. Patterns of systematic error that are common to all stars on the frame are mapped and eliminated using the SysRem algorithm. The remaining systematic errors caused by spatially localized flat-fielding and other errors are quantified using a boxcar-smoothing method. We show that the dimensions of the search-parameter space can be reduced greatly by carrying out an initial BLS search on a coarse grid of reduced dimensions, followed by Newton-Raphson refinement of the transit parameters in the vicinity of the most significant solutions. We illustrate the method's operation by applying it to data from one field of the SuperWASP survey, comprising 2300 observations of 7840 stars brighter than V = 13.0. We identify 11 likely transit candidates. We reject stars that exhibit significant ellipsoidal variations indicative of a stellar-mass companion. We use colours and proper motions from the Two Micron All Sky Survey and USNO-B1.0 surveys to estimate the stellar parameters and the companion radius. We find that two stars showing unambiguous transit signals pass all these tests, and so qualify for detailed high-resolution spectroscopic follow-up.
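
A minimal BLS search on a synthetic light curve, using the astropy implementation; the survey pipeline additionally applies SysRem detrending and Newton-Raphson refinement, which are not shown here.

```python
# Inject a box-shaped transit into white noise and recover its period with a
# Box Least-Squares periodogram (illustration only).
import numpy as np
from astropy.timeseries import BoxLeastSquares

rng = np.random.default_rng(10)
t = np.sort(rng.uniform(0, 30, 2300))                  # observation times (days)
period, duration, depth = 3.2, 0.12, 0.01
flux = 1.0 + rng.normal(scale=0.003, size=t.size)
in_transit = ((t % period) / period) < (duration / period)
flux[in_transit] -= depth                              # injected transit signal

bls = BoxLeastSquares(t, flux)
result = bls.autopower(duration)
print(result.period[np.argmax(result.power)])          # recovered period (~3.2 d)
```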

Relevance:

80.00%

Publisher:

Abstract:

The analysis of chironomid taxa and environmental datasets from 46 New Zealand lakes identified temperature (February mean air temperature) and lake production (chlorophyll a (Chl a)) as the main drivers of chironomid distribution. Temperature was the strongest driver of chironomid distribution and consequently produced the most robust inference models. We present two possible temperature transfer functions from this dataset. The most robust model (weighted averaging-partial least squares (WA-PLS), n = 36) was based on a dataset with the most productive (Chl a > 10 µg l−1) lakes removed. This model produced a coefficient of determination (r²jack) of 0.77, and a root mean squared error of prediction (RMSEPjack) of 1.31 °C. The Chl a transfer function (partial least squares (PLS), n = 37) was far less reliable, with an r²jack of 0.49 and an RMSEPjack of 0.46 log10 µg l−1. Both of these transfer functions could be improved by a revision of the taxonomy for the New Zealand chironomid taxa, particularly the genus Chironomus. The Chironomus morphotype was common in high altitude, cool, oligotrophic lakes and lowland, warm, eutrophic lakes. This could reflect the widespread distribution of one eurythermic species, or the collective distribution of a number of different Chironomus species with more limited tolerances. The Chl a transfer function could also be improved by inputting mean Chl a values into the inference model rather than the spot measurements that were available for this study.
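
How a jackknifed (leave-one-out) RMSEP of this kind can be computed is sketched below, with an ordinary PLS regression standing in for WA-PLS and entirely synthetic training data.

```python
# Leave-one-out cross-validated RMSEP and r² for a toy transfer function.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(11)
taxa = rng.random((36, 20))                      # relative abundances in 36 training lakes
temperature = rng.uniform(8, 18, 36)             # February mean air temperature (°C)

pred = cross_val_predict(PLSRegression(n_components=2), taxa, temperature,
                         cv=LeaveOneOut()).ravel()
rmsep = np.sqrt(np.mean((pred - temperature) ** 2))
r2 = np.corrcoef(pred, temperature)[0, 1] ** 2
print(f"RMSEP = {rmsep:.2f} °C, r²_jack = {r2:.2f}")
```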

Relevance:

80.00%

Publisher:

Abstract:

Raman spectroscopy has been used to predict the abundance of the FA in clarified butterfat that was obtained from dairy cows fed a range of levels of rapeseed oil in their diet. Partial least squares regression of the Raman spectra against FA compositions obtained by GC showed good prediction for the five major (abundance >5%) FA with R² = 0.74–0.92 and a root mean SE of prediction (RMSEP) that was 5–7% of the mean. In general, the prediction accuracy fell with decreasing abundance in the sample, but the RMSEP was 1.25%. The Raman method has the best prediction ability for unsaturated FA (R² = 0.85–0.92), and in particular trans unsaturated FA (the best-predicted FA was 18:1 tΔ9). This enhancement was attributed to the isolation of the unsaturated modes from the saturated modes and the significantly higher spectral response of unsaturated bonds compared with saturated bonds. Raman spectra of the melted butter samples could also be used to predict bulk parameters calculated from standard analyses, such as iodine value (R² = 0.80) and solid fat content at low temperature (R² = 0.87). For solid fat contents determined at higher temperatures, the prediction ability was significantly reduced (R² = 0.42), and this decrease in performance was attributed to the smaller range of values in solid fat content at the higher temperatures. Finally, although the prediction errors for the abundances of each of the FA in a given sample are much larger with Raman than with full GC analysis, the accuracy is acceptably high for quality control applications. This, combined with the fact that Raman spectra can be obtained with no sample preparation and with 60-s data collection times, means that high-throughput, on-line Raman analysis of butter samples should be possible.
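
A toy version of such a calibration, with synthetic spectra and the RMSEP expressed as a percentage of the mean, might look like this:

```python
# PLS calibration of simulated spectra against a fatty acid abundance, reporting
# the root mean squared error of prediction as a percentage of the mean.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(12)
spectra = rng.normal(size=(60, 800))                              # 60 butterfat "Raman spectra"
fa = 20 + 2 * spectra[:, 100] + rng.normal(scale=0.5, size=60)    # FA abundance (% of total)

pred = cross_val_predict(PLSRegression(n_components=3), spectra, fa, cv=5).ravel()
rmsep = np.sqrt(np.mean((pred - fa) ** 2))
print(f"RMSEP = {rmsep:.2f} ({100 * rmsep / fa.mean():.1f}% of mean)")
```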