989 results for Squares


Relevance:

10.00%

Publisher:

Abstract:

A study combining high-resolution mass spectrometry (liquid chromatography-quadrupole time-of-flight mass spectrometry, UPLC-QTof-MS) and chemometrics for the analysis of post-mortem brain tissue from subjects with Alzheimer’s disease (AD) (n = 15) and healthy age-matched controls (n = 15) was undertaken. The potential of this metabolomics approach for distinguishing AD cases is underlined by the correct prediction of disease status in 94–97% of cases. Predictive power was confirmed in a blind test set of 60 samples, reaching 100% diagnostic accuracy. The approach also indicated compounds significantly altered in concentration following the onset of human AD. Using orthogonal partial least-squares discriminant analysis (OPLS-DA), a multivariate model was created for both modes of acquisition, explaining the maximum amount of variation between sample groups (positive mode: R² = 97%, Q² = 93%, root mean squared error of validation [RMSEV] = 13%; negative mode: R² = 99%, Q² = 92%, RMSEV = 15%). In brain extracts, 1264 and 1457 ions of interest were detected in the positive and negative modes of acquisition, respectively. Incorporation of gender into the model increased predictive accuracy and decreased RMSEV values. High-resolution UPLC-QTof-MS has not previously been employed to biochemically profile post-mortem brain tissue, and the novel methods described and validated herein prove its potential for making new discoveries related to the etiology, pathophysiology, and treatment of degenerative brain disorders.


Global amphibian declines are a major element of the current biodiversity crisis. Monitoring changes in the distribution and abundance of target species is a basic component of conservation decision making and requires robust, repeatable sampling. For EU member states, surveillance of designated species, including the common frog Rana temporaria, is a formal requirement of the 'EC Habitats & Species Directive'. We deployed established methods for estimating frog population density at local water bodies and extrapolated these to the national and ecoregion scale. Spawn occurred at 49.4% of water bodies and in 70.1% of independent 500-m survey squares. Using spawn mat area, we estimated the number of adult breeding females and, assuming a 1:1 sex ratio, the total population. A negative binomial model suggested a mean frog density of 23.5 frogs/ha [95% confidence interval (CI) 14.9-44.0], equating to 196M frogs (95% CI 124M-367M) throughout Ireland. A total of 86% of frogs bred in drainage ditches, which were a notably common feature of the landscape. The recorded distribution of the species did not change significantly between the last Article 17 reporting period (1993-2006) and the current period (2007-2011) throughout the Republic of Ireland. Recording effort was markedly lower in Northern Ireland, which led to an apparent decline in the recorded distribution. We highlight the need to coordinate biological surveys between adjacent political jurisdictions that share a common ecoregion to avoid apparent disparities in the quality of distributional information. Power analysis suggested that a reduced sample of 40-50 survey squares is sufficient to detect a 30% decline (consistent with the International Union for Conservation of Nature category 'Vulnerable') at 80% power, providing guidance for minimizing future survey effort. Our results provide a baseline for such assessments for R. temporaria and other clump-spawning amphibians. © 2013 The Zoological Society of London.


The techniques of principal component analysis (PCA) and partial least squares (PLS) are introduced from the point of view of providing a multivariate statistical method for modelling process plants. The advantages and limitations of PCA and PLS are discussed from the perspective of the type of data and problems that might be encountered in this application area. These concepts are exemplified by two case studies: the first uses data from a continuous stirred tank reactor (CSTR) simulation, and the second a low-density polyethylene (LDPE) reactor simulation described in the literature.


Objective

To examine whether early inflammation is related to cortisol levels at 18 months corrected age (CA) in children born very preterm.

Study Design

Infants born ≤32 weeks gestational age were recruited in the NICU, and placental histopathology, MRI, and chart review were obtained. At 18 months CA, developmental assessment and collection of three salivary cortisol samples were carried out. Generalized least squares was used to analyze data from 85 infants providing 222 cortisol samples.
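The generalized least squares step can be sketched directly from its closed form, beta = (X' S^-1 X)^-1 X' S^-1 y, where S is the error covariance. The within-child correlation structure below is an illustrative assumption standing in for repeated cortisol samples per child, not the study's fitted model.

```python
import numpy as np

rng = np.random.default_rng(2)

# GLS closed form: beta = (X' S^-1 X)^-1 X' S^-1 y, with S the error
# covariance. The within-child correlation below is an illustrative
# assumption standing in for repeated cortisol samples per child.
n = 9                                            # 3 children x 3 samples
X = np.column_stack([np.ones(n), np.tile([0.0, 1.0, 2.0], 3)])
block = 0.5 * np.ones((3, 3)) + 0.5 * np.eye(3)  # correlated within child
S = np.kron(np.eye(3), block)
beta_true = np.array([10.0, -1.5])
y = X @ beta_true + 0.1 * (np.linalg.cholesky(S) @ rng.normal(size=n))

Sinv = np.linalg.inv(S)
beta = np.linalg.solve(X.T @ Sinv @ X, X.T @ Sinv @ y)
print("GLS estimates:", beta.round(2))
```

Unlike ordinary least squares, the S^-1 weighting accounts for the fact that samples from the same child are not independent.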

Results

Infants exposed to chorioamnionitis with funisitis had a significantly different pattern of cortisol across the samples compared to infants with chorioamnionitis alone or no prenatal inflammation (F[4,139] = 7.3996, P < .0001). Postnatal infections, necrotizing enterocolitis and chronic lung disease were not significantly associated with the cortisol pattern at 18 months CA.

Conclusion

In children born very preterm, prenatal inflammatory stress may contribute to altered programming of the HPA axis.

Keywords: preterm, chorioamnionitis, funisitis, premature infants, hypothalamic-pituitary-adrenal axis, infection, cortisol, stress


Objective
To investigate the effect of fast food consumption on mean population body mass index (BMI) and explore the possible influence of market deregulation on fast food consumption and BMI.

Methods
The within-country association between fast food consumption and BMI in 25 high-income member countries of the Organisation for Economic Co-operation and Development between 1999 and 2008 was explored through multivariate panel regression models, after adjustment for per capita gross domestic product, urbanization, trade openness, lifestyle indicators and other covariates. The possible mediating effect of annual per capita intake of soft drinks, animal fats and total calories on the association between fast food consumption and BMI was also analysed. Two-stage least squares regression models were estimated, using economic freedom as an instrumental variable, to study the causal effect of fast food consumption on BMI.
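The two-stage least squares idea can be sketched with a single synthetic instrument; all coefficients below are illustrative assumptions, not the study's estimates.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 500

# Two-stage least squares with one instrument, mirroring the design:
# economic freedom (z) instruments fast food consumption (x); y is BMI.
# All coefficients are synthetic illustrations, not the paper's data.
z = rng.normal(size=n)                      # instrument: economic freedom
u = rng.normal(size=n)                      # unobserved confounder
x = 0.8 * z + 0.5 * u + rng.normal(size=n)  # endogenous regressor
y = 0.03 * x + 0.5 * u + 0.1 * rng.normal(size=n)

# Stage 1: regress x on z and form fitted values x-hat.
Z = np.column_stack([np.ones(n), z])
xhat = Z @ np.linalg.lstsq(Z, x, rcond=None)[0]

# Stage 2: regress y on x-hat; the slope is the causal-effect estimate.
Xh = np.column_stack([np.ones(n), xhat])
beta = np.linalg.lstsq(Xh, y, rcond=None)[0]
print(f"2SLS estimate of the fast-food effect: {beta[1]:.3f}")
```

Because the instrument z is unrelated to the confounder u, the second-stage slope recovers the causal coefficient even though a naive regression of y on x would be biased by u.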

Findings
After adjustment for covariates, each 1-unit increase in annual fast food transactions per capita was associated with an increase of 0.033 kg/m² in age-standardized BMI (95% confidence interval, CI: 0.013–0.052). Only the intake of soft drinks – not animal fat or total calories – mediated the observed association (β: 0.030; 95% CI: 0.010–0.050). Economic freedom was an independent predictor of fast food consumption (β: 0.27; 95% CI: 0.16–0.37). When economic freedom was used as an instrumental variable, the association between fast food and BMI weakened but remained significant (β: 0.023; 95% CI: 0.001–0.045).

Conclusion
Fast food consumption is an independent predictor of mean BMI in high-income countries. Market deregulation policies may contribute to the obesity epidemic by facilitating the spread of fast food.


Thermocouples are one of the most popular devices for temperature measurement due to their robustness, ease of manufacture and installation, and low cost. However, when used in certain harsh environments, for example, in combustion systems and engine exhausts, large wire diameters are required, and consequently the measurement bandwidth is reduced. This article discusses a software compensation technique to address the loss of high-frequency fluctuations based on measurements from two thermocouples. In particular, a difference equation (DE) approach is proposed and compared with existing methods both in simulation and on experimental test-rig data with constant flow velocity. It is found that the DE algorithm, combined with the use of generalized total least squares for parameter identification, provides better performance in terms of time constant estimation without any a priori assumption on the time constant ratios of the thermocouples.
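The flavour of the identification step can be illustrated with a plain total least squares line fit via the SVD, which accounts for noise on both signals. This is a simplified stand-in for the generalized total least squares used in the article, applied to synthetic two-sensor data.

```python
import numpy as np

rng = np.random.default_rng(4)

# Total least squares fits a line when both variables carry measurement
# noise -- the situation for two thermocouple signals. This is a plain
# TLS sketch, not the generalized TLS identification of the DE model.
n = 200
t = np.linspace(0, 1, n)
x = np.sin(2 * np.pi * t) + 0.02 * rng.normal(size=n)               # sensor 1
yv = 2.0 * np.sin(2 * np.pi * t) + 0.5 + 0.02 * rng.normal(size=n)  # sensor 2

# Centre the data, then take the right singular vector with the smallest
# singular value: it is normal to the best-fit line.
A = np.column_stack([x - x.mean(), yv - yv.mean()])
_, _, Vt = np.linalg.svd(A, full_matrices=False)
a, b = Vt[-1]                  # line: a*(x - xm) + b*(y - ym) = 0
slope = -a / b
intercept = yv.mean() - slope * x.mean()
print(f"TLS slope: {slope:.2f}, intercept: {intercept:.2f}")
```

Ordinary least squares would attribute all noise to the second signal and bias the slope toward zero; the SVD formulation treats both signals symmetrically.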


The in-line measurement of COD and NH4-N in the WWTP inflow is crucial for the timely monitoring of biological wastewater treatment processes and for the development of advanced control strategies for optimized WWTP operation. As direct measurement of COD and NH4-N requires expensive, high-maintenance in-line probes or analyzers, this paper presents an approach that estimates COD and NH4-N from standard and spectroscopic in-line inflow measurement systems using machine learning techniques. The results show that COD can be estimated from standard in-line measurements alone using Random Forest regression, with a normalized MSE of 0.3, which is sufficiently accurate for practical applications. In the case of NH4-N, a good estimation using partial least squares regression, with a normalized MSE of 0.16, is only possible based on a combination of standard and spectroscopic in-line measurements. Furthermore, a comparison of regression and classification methods shows that both perform equally well in most cases.


BACKGROUND: It is now common for individuals to require dialysis following the failure of a kidney transplant. Management of complications and preparation for dialysis are suboptimal in this group. To aid planning, it is desirable to estimate the time to dialysis requirement. The rate of decline in the estimated glomerular filtration rate (eGFR) may be used to this end.

METHODS: This study compared the rate of eGFR decline prior to dialysis commencement between individuals with failing transplants and transplant-naïve patients. The rate of eGFR decline was also compared between transplant recipients with and without graft failure. eGFR was calculated using the four-variable MDRD equation with rate of decline calculated by least squares linear regression.
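The slope calculation is ordinary least squares of eGFR against time; a minimal sketch with illustrative (not patient) values:

```python
import numpy as np

# Rate of eGFR decline as the least squares slope of eGFR against time,
# mirroring the study's method. The values below are illustrative, not
# patient data.
years = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5])
egfr = np.array([32.0, 28.5, 25.1, 21.4, 17.9, 14.6])  # ml/min/1.73 m^2

slope, intercept = np.polyfit(years, egfr, 1)
print(f"annual eGFR decline: {-slope:.1f} ml/min/1.73 m^2 per year")
```

The negated slope is the annual decline rate compared across the transplant and transplant-naïve cohorts in the Results.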

RESULTS: The annual rate of eGFR decline in incident dialysis patients with graft failure exceeded that of the transplant-naïve incident dialysis patients. In the transplant cohort, the mean annual rate of eGFR decline prior to graft failure was 7.3 ml/min/1.73 m² compared to 4.8 ml/min/1.73 m² in the transplant-naïve group (p < 0.001) and 0.35 ml/min/1.73 m² in recipients without graft failure (p < 0.001). Factors associated with eGFR decline were recipient age, decade of transplantation, HLA mismatch and histological evidence of chronic immunological injury.

CONCLUSIONS: Individuals with graft failure have a rapid decline in eGFR prior to dialysis commencement. To improve outcomes, dialysis planning and management of chronic kidney disease complications should be initiated earlier than in the transplant-naïve population.


Tropical peatlands represent globally important carbon sinks with a unique biodiversity and are currently threatened by climate change and human activities. It is now imperative that proxy methods are developed to understand the ecohydrological dynamics of these systems and to test peatland development models. Testate amoebae have been used as environmental indicators in ecological and palaeoecological studies of peatlands, primarily in ombrotrophic Sphagnum-dominated peatlands in the mid- and high-latitudes. We present the first ecological analysis of testate amoebae in a tropical peatland, a nutrient-poor domed bog in western (Peruvian) Amazonia. Litter samples were collected from different hydrological microforms (hummock to pool) along a transect from the edge to the interior of the peatland. We recorded 47 taxa from 21 genera. The most common taxa are Cryptodifflugia oviformis, Euglypha rotunda type, Phryganella acropodia, Pseudodifflugia fulva type and Trinema lineare. One species found only in the southern hemisphere, Argynnia spicata, is present. Arcella spp., Centropyxis aculeata and Lesquereusia spiralis are indicators of pools containing standing water. Canonical correspondence analysis and non-metric multidimensional scaling illustrate that water table depth is a significant control on the distribution of testate amoebae, similar to results from mid- and high-latitude peatlands. A transfer function model for water table depth based on weighted averaging partial least-squares (WAPLS) regression is presented and performs well under cross-validation (r²(apparent) = 0.76, RMSE = 4.29; r²(jack) = 0.68, RMSEP = 5.18). The transfer function was applied to a 1-m peat core, and sample-specific reconstruction errors were generated using bootstrapping. The reconstruction generally suggests near-surface water tables over the last 3,000 years, with a shift to drier conditions at c. cal. AD 1218-1273.


Brain tissue from so-called Alzheimer's disease (AD) mouse models has previously been examined using ¹H NMR metabolomics, but comparable information concerning human AD is negligible. Since no animal model recapitulates all the features of human AD, we undertook the first ¹H NMR metabolomics investigation of human AD brain tissue. Human post-mortem tissue from 15 AD subjects and 15 age-matched controls was prepared for analysis through a series of lyophilisation, milling, extraction and randomisation steps, and samples were analysed using ¹H NMR. Using partial least squares discriminant analysis, a model was built from the brain-extract data. Analysis of brain extracts led to the identification of 24 metabolites. Significant elevations in brain alanine (15.4%) and taurine (18.9%) were observed in AD patients (p ≤ 0.05). Pathway topology analysis implicated dysregulation of either taurine and hypotaurine metabolism or alanine, aspartate and glutamate metabolism. Furthermore, screening of metabolites for AD biomarkers demonstrated that individual metabolites weakly discriminated cases of AD [receiver operating characteristic (ROC) AUC < 0.67; p < 0.05]. However, paired metabolite ratios (e.g. alanine/carnitine) were more powerful discriminating tools (ROC AUC = 0.76; p < 0.01). This study further demonstrates the potential of metabolomics for elucidating the underlying biochemistry of AD and for helping to identify AD in patients attending the memory clinic.


This paper proposes an efficient learning mechanism to build fuzzy rule-based systems through the construction of sparse least-squares support vector machines (LS-SVMs). In addition to significantly reduced computational complexity in model training, the resultant LS-SVM-based fuzzy system is sparser while offering satisfactory generalization capability over unseen data. LS-SVMs are well known to hold a computational advantage over conventional SVMs in the model training process; however, model sparseness is lost, which is their main drawback and remains an open problem. To tackle the nonsparseness issue, a new regression alternative to the Lagrangian solution for the LS-SVM is first presented. A novel efficient learning mechanism is then proposed to extract a sparse set of support vectors for generating fuzzy IF-THEN rules. This mechanism works in a stepwise subset selection manner, including a forward expansion phase and a backward exclusion phase in each selection step. The implementation of the algorithm is computationally very efficient due to the introduction of a few key techniques that avoid matrix inverse operations and accelerate the training process. The computational efficiency is also confirmed by detailed computational complexity analysis. As a result, the proposed approach not only achieves sparseness of the resultant LS-SVM-based fuzzy systems but also significantly reduces the amount of computational effort in model training. Three experimental examples are presented to demonstrate the effectiveness and efficiency of the proposed learning mechanism and the sparseness of the obtained LS-SVM-based fuzzy systems, in comparison with other SVM-based learning techniques.
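The starting point, the plain (non-sparse) LS-SVM, has a simple closed form: solve the linear system [[0, 1'], [1, K + I/gamma]][b; alpha] = [0; y]. A minimal regression sketch on synthetic data follows; the kernel and hyperparameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)

# Least-squares SVM regression in its classic closed form: solve
# [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y].
# This is the plain (non-sparse) LS-SVM the paper starts from.
def rbf_kernel(A, B, sigma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

X = np.linspace(-3, 3, 50)[:, None]
y = np.sin(X).ravel() + 0.05 * rng.normal(size=50)

gamma = 100.0
K = rbf_kernel(X, X)
n = len(y)
A = np.zeros((n + 1, n + 1))
A[0, 1:] = 1.0
A[1:, 0] = 1.0
A[1:, 1:] = K + np.eye(n) / gamma
rhs = np.concatenate([[0.0], y])
sol = np.linalg.solve(A, rhs)
b, alpha = sol[0], sol[1:]

pred = K @ alpha + b             # f(x) = sum_i alpha_i K(x, x_i) + b
rmse = np.sqrt(((pred - y) ** 2).mean())
print(f"training RMSE: {rmse:.3f}")
```

Every alpha_i in this solution is typically nonzero, which is precisely the nonsparseness that the stepwise forward-expansion/backward-exclusion selection proposed in the paper is designed to remove.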


Many AMS systems can measure 14C, 13C and 12C simultaneously, thus providing δ13C values that can be used for fractionation normalization without the need for offline 13C/12C measurements on isotope ratio mass spectrometers (IRMS). However, AMS δ13C values on our 0.5 MV NEC Compact Accelerator often differ from IRMS values on the same material by 4-5‰ or more. It has been postulated that the AMS δ13C values account for potential graphitization- and machine-induced fractionation in addition to natural fractionation, but how much does this affect the 14C ages or F14C? We present an analysis of F14C as a linear least squares fit against AMS δ13C for several of our secondary standards. While there are samples for which there is an obvious correlation between AMS δ13C and F14C, as quantified by the calculated probability of no correlation, we find that the trend lies within one standard deviation of the variance on our F14C measurements. Our laboratory produces both zinc- and hydrogen-reduced graphite, and we present our results for each type. Additionally, we show the variance on our AMS δ13C measurements of our secondary standards.
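The fit and its probability of no correlation can be computed with a standard linear regression; the values below are synthetic illustrations, not measurements.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)

# Linear least squares fit of F14C against AMS delta-13C for a secondary
# standard, with the probability of no correlation from the Pearson test.
# All values are synthetic illustrations, not measurements.
d13c = rng.normal(-25.0, 2.0, size=40)                        # per mil
f14c = 0.95 + 0.001 * (d13c + 25.0) + rng.normal(0, 0.003, size=40)

slope, intercept, r, p_no_corr, stderr = stats.linregress(d13c, f14c)
print(f"slope: {slope:.5f} per mil, r: {r:.2f}, p(no corr): {p_no_corr:.3g}")
```

The p-value returned by `linregress` is the probability of observing such a correlation under the null hypothesis of no relationship, the quantity the abstract uses to screen for real trends.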


In this paper, a multiloop robust control strategy is proposed based on H∞ control and a partial least squares (PLS) model (H∞_PLS) for multivariable chemical processes. It is developed especially for multivariable systems in ill-conditioned plants and non-square systems. The advantage of PLS is to extract the strongest relationship between the input and the output variables in the reduced space of the latent variable model rather than in the original space of the high-dimensional variables. Without conventional decouplers, the dynamic PLS framework automatically decomposes the MIMO process into multiple single-loop systems in the PLS subspace so that the controller design can be simplified. Since plant/model mismatch is almost inevitable in practical applications, to enhance the robustness of this control system, the controllers based on the H∞ mixed sensitivity problem are designed in the PLS latent subspace. The feasibility and the effectiveness of the proposed approach are illustrated by the simulation results of a distillation column and a mixing tank process. Comparisons between H∞_PLS control and conventional individual control (either H∞ control or PLS control only) are also made.


Cascade control is one of the most routinely used control strategies in industrial processes because it can dramatically improve the performance of single-loop control, reducing both the maximum deviation and the integral error of the disturbance response. Many existing performance assessment methods for cascade control loops assume that all disturbances follow a Gaussian distribution. In practice, however, disturbance sources entering through the manipulated variable or the upstream loop can exhibit nonlinear, non-Gaussian behavior. In this paper, a general and effective index for performance assessment of cascade control systems subject to disturbances of unknown distribution is proposed. As in minimum variance control (MVC) design, the output variances of the primary and secondary loops are decomposed into a cascade-invariant and a cascade-dependent term, but the ARMA model for the cascade control loop is estimated based on minimum entropy rather than minimum mean squared error, to accommodate non-Gaussian disturbances. Unlike the MVC index, the proposed control performance index is based on information theory and the minimum entropy criterion. The index is informative and in agreement with expected control knowledge. To demonstrate the wide applicability and effectiveness of the minimum entropy cascade control index, a simulation problem and a cascade control case from an oil refinery are presented. A comparison with MVC-based cascade control assessment is also included.


Periodic monitoring of pavement condition facilitates a cost-effective distribution of the resources available for maintenance of the road infrastructure network. The task can be carried out accurately using profilometers, but such an approach is generally expensive. This paper presents a method to collect information on the road profile via accelerometers mounted in a fleet of non-specialist vehicles, such as police cars, that are in use for other purposes. It proposes an optimisation algorithm, based on Cross Entropy theory, to predict road irregularities. The Cross Entropy algorithm estimates the height of the road irregularities from vehicle accelerations at each point in time. To test the algorithm, the crossing of a half-car roll model is simulated over a range of road profiles to obtain accelerations of the vehicle's sprung and unsprung masses. The simulated vehicle accelerations are then used as input to an iterative procedure that searches for the best solution to the inverse problem of finding the road irregularities. In each iteration, a sample of road profiles is generated, and an objective function, defined as the sum of squares of the differences between the 'measured' and predicted accelerations, is minimized until convergence is reached. The reconstructed profile is classified according to ISO and IRI recommendations and compared to its original class. Results demonstrate that the approach is feasible and that a good estimate of the short-wavelength features of the road profile can be obtained, despite the variability between the vehicles used to collect the data.
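The Cross Entropy loop itself is compact: sample candidate profiles, score them by the sum-of-squares misfit, and refit the sampling distribution to the elite samples. The sketch below omits the vehicle model and scores candidates against a known target directly, an assumption made purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(8)

# Cross Entropy method sketch: iteratively sample candidate "profiles",
# keep the elite samples with the smallest sum-of-squares misfit, and
# refit the sampling distribution to them. The 5-point target profile
# stands in for road irregularity heights; no vehicle model is simulated.
target = np.array([0.01, -0.02, 0.03, 0.0, -0.01])   # "true" heights (m)

def misfit(x):
    # Stand-in for the sum of squared acceleration differences.
    return ((x - target) ** 2).sum()

mu = np.zeros(5)
sigma = np.full(5, 0.05)
for _ in range(40):
    samples = rng.normal(mu, sigma, size=(200, 5))
    scores = np.array([misfit(s) for s in samples])
    elite = samples[np.argsort(scores)[:20]]          # best 10%
    mu, sigma = elite.mean(axis=0), elite.std(axis=0) + 1e-6
print("recovered heights:", mu.round(3))
```

In the paper's setting the misfit would come from simulating the half-car model over each candidate profile and comparing predicted to measured accelerations; the elite-refit loop is unchanged.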