932 results for squares
Abstract:
The in-line measurement of COD and NH4-N in the WWTP inflow is crucial for the timely monitoring of biological wastewater treatment processes and for the development of advanced control strategies for optimized WWTP operation. As a direct measurement of COD and NH4-N requires expensive, high-maintenance in-line probes or analyzers, this paper presents an approach that estimates COD and NH4-N from standard and spectroscopic in-line inflow measurement systems using machine learning techniques. The results show that COD estimation using Random Forest Regression, with a normalized MSE of 0.3 that is sufficiently accurate for practical applications, can be achieved using only standard in-line measurements. In the case of NH4-N, a good estimation using Partial Least Squares Regression, with a normalized MSE of 0.16, is only possible based on a combination of standard and spectroscopic in-line measurements. Furthermore, a comparison of regression and classification methods shows that both perform equally well in most cases.
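As a rough illustration of the Random Forest approach described above, the sketch below fits a regressor to synthetic stand-ins for standard in-line inflow measurements and reports the normalized MSE (MSE divided by the variance of the observed target). The feature set, data, and target relationship are all invented for illustration, not taken from the paper.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Synthetic stand-ins for standard in-line measurements (hypothetical:
# e.g. flow, conductivity, turbidity, temperature)
n = 500
X = rng.normal(size=(n, 4))
# Invented COD response: linear terms plus one interaction, plus noise
cod = 3.0 * X[:, 0] - 2.0 * X[:, 1] + 0.5 * X[:, 2] * X[:, 3] \
      + rng.normal(scale=0.5, size=n)

X_tr, X_te, y_tr, y_te = train_test_split(X, cod, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
pred = model.predict(X_te)

# Normalized MSE: test MSE divided by the variance of the observed target
nmse = np.mean((pred - y_te) ** 2) / np.var(y_te)
```

A normalized MSE well below 1 indicates that the model explains most of the target variance; the paper's value of 0.3 for COD is judged sufficient for practical use.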
Abstract:
BACKGROUND: It is now common for individuals to require dialysis following the failure of a kidney transplant. Management of complications and preparation for dialysis are suboptimal in this group. To aid planning, it is desirable to estimate the time to dialysis requirement. The rate of decline in the estimated glomerular filtration rate (eGFR) may be used to this end.
METHODS: This study compared the rate of eGFR decline prior to dialysis commencement between individuals with failing transplants and transplant-naïve patients. The rate of eGFR decline was also compared between transplant recipients with and without graft failure. eGFR was calculated using the four-variable MDRD equation with rate of decline calculated by least squares linear regression.
RESULTS: The annual rate of eGFR decline in incident dialysis patients with graft failure exceeded that of the transplant-naïve incident dialysis patients. In the transplant cohort, the mean annual rate of eGFR decline prior to graft failure was 7.3 ml/min/1.73 m² compared to 4.8 ml/min/1.73 m² in the transplant-naïve group (p < 0.001) and 0.35 ml/min/1.73 m² in recipients without graft failure (p < 0.001). Factors associated with eGFR decline were recipient age, decade of transplantation, HLA mismatch and histological evidence of chronic immunological injury.
CONCLUSIONS: Individuals with graft failure have a rapid decline in eGFR prior to dialysis commencement. To improve outcomes, dialysis planning and management of chronic kidney disease complications should be initiated earlier than in the transplant-naïve population.
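The rate-of-decline calculation used in this study reduces to an ordinary least-squares line through serial eGFR values; the sketch below shows the idea with invented numbers (not patient data from the study).

```python
import numpy as np

# Hypothetical eGFR series (ml/min/1.73 m^2) at yearly visits before dialysis
years = np.array([0.0, 1.0, 2.0, 3.0])
egfr = np.array([28.0, 21.0, 13.5, 6.2])

# Least-squares linear fit; the (negated) slope is the annual rate of decline
slope, intercept = np.polyfit(years, egfr, 1)
annual_decline = -slope   # ml/min/1.73 m^2 per year
```

For these illustrative values the fitted decline is about 7.3 ml/min/1.73 m² per year, the order of magnitude reported for the graft-failure cohort.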
Abstract:
Tropical peatlands represent globally important carbon sinks with a unique biodiversity and are currently threatened by climate change and human activities. It is now imperative that proxy methods are developed to understand the ecohydrological dynamics of these systems and for testing peatland development models. Testate amoebae have been used as environmental indicators in ecological and palaeoecological studies of peatlands, primarily in ombrotrophic Sphagnum-dominated peatlands in the mid- and high-latitudes. We present the first ecological analysis of testate amoebae in a tropical peatland, a nutrient-poor domed bog in western (Peruvian) Amazonia. Litter samples were collected from different hydrological microforms (hummock to pool) along a transect from the edge to the interior of the peatland. We recorded 47 taxa from 21 genera. The most common taxa are Cryptodifflugia oviformis, Euglypha rotunda type, Phryganella acropodia, Pseudodifflugia fulva type and Trinema lineare. One species found only in the southern hemisphere, Argynnia spicata, is present. Arcella spp., Centropyxis aculeata and Lesquereusia spiralis are indicators of pools containing standing water. Canonical correspondence analysis and non-metric multidimensional scaling illustrate that water table depth is a significant control on the distribution of testate amoebae, similar to the results from mid- and high-latitude peatlands. A transfer function model for water table depth based on weighted averaging partial least-squares (WAPLS) regression is presented and performs well under cross-validation (r² apparent = 0.76, RMSE = 4.29; r² jack = 0.68, RMSEP = 5.18). The transfer function was applied to a 1-m peat core, and sample-specific reconstruction errors were generated using bootstrapping. The reconstruction generally suggests near-surface water tables over the last 3,000 years, with a shift to drier conditions at c. cal. AD 1218-1273.
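The transfer-function idea can be sketched with plain weighted averaging (WA), a simpler relative of the WAPLS regression used in the paper: each taxon gets a water-table "optimum" from the training set, and a fossil sample's water table is reconstructed as the abundance-weighted mean of those optima. The counts and depths below are invented.

```python
import numpy as np

# Training set: taxon counts per sample (rows) and observed water-table
# depth (cm). All values are illustrative, not the paper's data.
counts = np.array([
    [10, 0, 2],
    [4, 5, 1],
    [0, 9, 6],
    [1, 2, 12],
], dtype=float)
wtd = np.array([2.0, 10.0, 25.0, 38.0])

rel = counts / counts.sum(axis=1, keepdims=True)        # relative abundances
# WA optimum per taxon: abundance-weighted mean water-table depth
optima = (rel * wtd[:, None]).sum(axis=0) / rel.sum(axis=0)

def reconstruct(sample_counts):
    """WA estimate of water-table depth for a new (e.g. fossil) sample."""
    p = sample_counts / sample_counts.sum()
    return float(p @ optima)
```

WAPLS refines this by adding partial least-squares components that correct the biases of simple WA; the weighted-averaging core is the same.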
Abstract:
Brain tissue from so-called Alzheimer's disease (AD) mouse models has previously been examined using H-1 NMR-metabolomics, but comparable information concerning human AD is negligible. Since no animal model recapitulates all the features of human AD, we undertook the first H-1 NMR-metabolomics investigation of human AD brain tissue. Human post-mortem tissue from 15 AD subjects and 15 age-matched controls was prepared for analysis through a series of lyophilisation, milling, extraction and randomisation steps, and samples were analysed using H-1 NMR. Using partial least squares discriminant analysis, a model was built using data obtained from brain extracts. Analysis of brain extracts led to the elucidation of 24 metabolites. Significant elevations in brain alanine (15.4 %) and taurine (18.9 %) were observed in AD patients (p ≤ 0.05). Pathway topology analysis implicated either dysregulation of taurine and hypotaurine metabolism or alanine, aspartate and glutamate metabolism. Furthermore, screening of metabolites for AD biomarkers demonstrated that individual metabolites weakly discriminated cases of AD [receiver operating characteristic (ROC) AUC < 0.67; p < 0.05]. However, paired metabolite ratios (e.g. alanine/carnitine) were more powerful discriminating tools (ROC AUC = 0.76; p < 0.01). This study further demonstrates the potential of metabolomics for elucidating the underlying biochemistry and for helping to identify AD in patients attending the memory clinic.
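The ROC AUC used to rank biomarkers above has a simple rank-based interpretation: it is the probability that a randomly chosen case has a higher value than a randomly chosen control (the Mann-Whitney statistic). The sketch below computes it directly; the alanine/carnitine ratios are invented, not the study's measurements.

```python
# Mann-Whitney-style ROC AUC: P(case value > control value), ties count 0.5
def roc_auc(cases, controls):
    wins = 0.0
    for c in cases:
        for k in controls:
            if c > k:
                wins += 1.0
            elif c == k:
                wins += 0.5
    return wins / (len(cases) * len(controls))

# Hypothetical alanine/carnitine ratios for AD cases vs controls
cases = [1.9, 2.3, 2.1, 1.7, 2.5]
controls = [1.5, 1.8, 1.4, 2.0, 1.6]
auc = roc_auc(cases, controls)
```

An AUC of 0.5 means no discrimination and 1.0 means perfect separation, so the study's paired-ratio AUC of 0.76 marks a usable but imperfect discriminator.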
Abstract:
This paper proposes an efficient learning mechanism to build fuzzy rule-based systems through the construction of sparse least-squares support vector machines (LS-SVMs). In addition to the significantly reduced computational complexity in model training, the resultant LS-SVM-based fuzzy system is sparser while offering satisfactory generalization capability over unseen data. It is well known that LS-SVMs have a computational advantage over conventional SVMs in the model training process; however, model sparseness is lost, which is the main drawback of LS-SVMs and remains an open problem. To tackle the nonsparseness issue, a new regression alternative to the Lagrangian solution for the LS-SVM is first presented. A novel efficient learning mechanism is then proposed to extract a sparse set of support vectors for generating fuzzy IF-THEN rules. This mechanism works in a stepwise subset selection manner, including a forward expansion phase and a backward exclusion phase in each selection step. The implementation of the algorithm is computationally very efficient due to the introduction of a few key techniques that avoid matrix inverse operations and accelerate the training process. The computational efficiency is also confirmed by detailed computational complexity analysis. As a result, the proposed approach not only achieves sparseness of the resultant LS-SVM-based fuzzy systems but also significantly reduces the amount of computational effort in model training. Three experimental examples are presented to demonstrate the effectiveness and efficiency of the proposed learning mechanism and the sparseness of the obtained LS-SVM-based fuzzy systems, in comparison with other SVM-based learning techniques.
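For context, the dense Lagrangian solution that the paper improves upon reduces LS-SVM training to a single linear system rather than a quadratic program. The sketch below solves that system for RBF-kernel regression on invented data; it is the standard (non-sparse) baseline, not the paper's sparse stepwise selection.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.05, size=40)

def rbf(A, B, sigma=1.0):
    """RBF (Gaussian) kernel matrix between row sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

gamma = 100.0                       # regularization constant
K = rbf(X, X)
n = len(y)
# LS-SVM dual: [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]
A = np.zeros((n + 1, n + 1))
A[0, 1:] = 1.0
A[1:, 0] = 1.0
A[1:, 1:] = K + np.eye(n) / gamma
sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
b, alpha = sol[0], sol[1:]

def predict(Xnew):
    return rbf(Xnew, X) @ alpha + b

train_rmse = np.sqrt(np.mean((predict(X) - y) ** 2))
```

Note that every alpha is typically nonzero here, which is exactly the nonsparseness the paper's forward-expansion/backward-exclusion selection is designed to remove.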
Abstract:
Many AMS systems can measure 14C, 13C and 12C simultaneously, thus providing δ13C values which can be used for fractionation normalization without the need for offline 13C/12C measurements on isotope ratio mass spectrometers (IRMS). However, AMS δ13C values on our 0.5 MV NEC Compact Accelerator often differ from IRMS values on the same material by 4-5‰ or more. It has been postulated that the AMS δ13C values account for potential graphitization- and machine-induced fractionation, in addition to natural fractionation, but how much does this affect the 14C ages or F14C? We present an analysis of F14C as a linear least squares fit with AMS δ13C results for several of our secondary standards. While there are samples for which there is an obvious correlation between AMS δ13C and F14C, as quantified with the calculated probability of no correlation, we find that the trend lies within one standard deviation of the variance on our F14C measurements. Our laboratory produces both zinc- and hydrogen-reduced graphite, and we present our results for each type. Additionally, we show the variance on our AMS δ13C measurements of our secondary standards.
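The trend analysis described above amounts to a linear least-squares fit of F14C against AMS δ13C plus a correlation check. The sketch below shows that computation on invented numbers (not the laboratory's secondary-standard data).

```python
import numpy as np

# Illustrative delta13C (per mil) and F14C pairs for one secondary standard
d13c = np.array([-25.1, -24.3, -26.0, -25.5, -24.8, -25.9])
f14c = np.array([1.341, 1.344, 1.339, 1.340, 1.343, 1.338])

# Linear least-squares fit: F14C as a function of AMS delta13C
slope, intercept = np.polyfit(d13c, f14c, 1)
# Pearson correlation between delta13C and F14C
r = np.corrcoef(d13c, f14c)[0, 1]
```

A significance test on r (the "probability of no correlation") would then decide whether the fitted slope reflects a real fractionation effect or lies within measurement scatter.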
Abstract:
In this paper, a multiloop robust control strategy is proposed based on H∞ control and a partial least squares (PLS) model (H∞_PLS) for multivariable chemical processes. It is developed especially for multivariable systems in ill-conditioned plants and for non-square systems. The advantage of PLS is that it extracts the strongest relationship between the input and the output variables in the reduced space of the latent variable model rather than in the original space of the highly dimensional variables. Without conventional decouplers, the dynamic PLS framework automatically decomposes the MIMO process into multiple single-loop systems in the PLS subspace, so that the controller design can be simplified. Since plant/model mismatch is almost inevitable in practical applications, to enhance the robustness of this control system, the controllers based on the H∞ mixed sensitivity problem are designed in the PLS latent subspace. The feasibility and the effectiveness of the proposed approach are illustrated by simulation results for a distillation column and a mixing tank process. Comparisons between H∞_PLS control and conventional individual control (either H∞ control or PLS control only) are also made.
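The PLS decomposition that underlies this strategy can be sketched with one NIPALS-style latent variable extracted from illustrative MIMO plant data: the multivariable relationship collapses to a scalar "inner" model between latent input and output scores, which is what allows single-loop design in the latent subspace. The data and the 3-input/2-output plant below are synthetic stand-ins.

```python
import numpy as np

rng = np.random.default_rng(6)
X = rng.normal(size=(100, 3))             # manipulated variables (synthetic)
B_true = np.array([[1.0, 0.2], [0.5, 1.0], [0.1, 0.3]])
Y = X @ B_true + 0.1 * rng.normal(size=(100, 2))   # controlled variables

# NIPALS iteration for the first PLS component
u = Y[:, 0].copy()                         # initialise with a Y column
for _ in range(100):
    w = X.T @ u
    w /= np.linalg.norm(w)                 # X weights
    t = X @ w                              # X scores: the latent 'input'
    q = Y.T @ t
    q /= np.linalg.norm(q)                 # Y loadings
    u = Y @ q                              # Y scores: the latent 'output'

# Inner relation: a single scalar model between latent scores,
# i.e. one 'loop' of the decomposed MIMO process
b = (t @ u) / (t @ t)
```

In the paper's dynamic PLS framework each such latent loop then receives its own H∞ mixed-sensitivity controller.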
Abstract:
Cascade control is one of the routinely used control strategies in industrial processes because it can dramatically improve the performance of single-loop control, reducing both the maximum deviation and the integral error of the disturbance response. Currently, many control performance assessment methods for cascade control loops are developed under the assumption that all disturbances follow a Gaussian distribution. In practice, however, several disturbance sources act on the manipulated variable, or the upstream loop exhibits nonlinear behaviors. In this paper, a general and effective index for the performance assessment of cascade control systems subject to disturbances of unknown distribution is proposed. As in minimum variance control (MVC) design, the output variances of the primary and the secondary loops are decomposed into a cascade-invariant and a cascade-dependent term, but the ARMA model for the cascade control loop is estimated based on minimum entropy, instead of minimum mean squared error, to handle non-Gaussian disturbances. Unlike the MVC index, an innovative control performance index is given based on information theory and the minimum entropy criterion. The index is informative and in agreement with the expected control knowledge. To illustrate the wide applicability and effectiveness of the minimum entropy cascade control index, a simulation problem and a cascade control case from an oil refinery are presented. A comparison with MVC-based cascade control is also included.
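The shift from variance to entropy can be illustrated with a histogram-based Shannon entropy estimate of a control-error distribution: two disturbance sequences with the same variance can carry different entropy, which a variance-based index cannot distinguish. This is only a toy estimate on synthetic noise, not the paper's ARMA-model-based index.

```python
import numpy as np

def error_entropy(e, edges=np.linspace(-6, 6, 61)):
    """Histogram (plug-in) estimate of the Shannon entropy of error samples."""
    counts, _ = np.histogram(e, bins=edges)
    p = counts[counts > 0] / len(e)
    return float(-(p * np.log(p)).sum())

rng = np.random.default_rng(2)
gauss = rng.normal(size=20000)                  # Gaussian disturbance
heavy = rng.laplace(size=20000) / np.sqrt(2)    # Laplace, same unit variance

h_gauss = error_entropy(gauss)
h_heavy = error_entropy(heavy)
```

At equal variance the Gaussian maximizes entropy, so the heavy-tailed (Laplace) errors score lower; a minimum entropy criterion therefore carries information about the disturbance shape that the variance discards.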
Abstract:
A periodic monitoring of the pavement condition facilitates a cost-effective distribution of the resources available for maintenance of the road infrastructure network. The task can be accurately carried out using profilometers, but such an approach is generally expensive. This paper presents a method to collect information on the road profile via accelerometers mounted in a fleet of non-specialist vehicles, such as police cars, that are in use for other purposes. It proposes an optimisation algorithm, based on Cross Entropy theory, to predict road irregularities. The Cross Entropy algorithm estimates the height of the road irregularities from vehicle accelerations at each point in time. To test the algorithm, the crossing of a half-car roll model is simulated over a range of road profiles to obtain accelerations of the vehicle sprung and unsprung masses. Then, the simulated vehicle accelerations are used as input in an iterative procedure that searches for the best solution to the inverse problem of finding road irregularities. In each iteration, a sample of road profiles is generated and an objective function defined as the sum of squares of differences between the ‘measured’ and predicted accelerations is minimized until convergence is reached. The reconstructed profile is classified according to ISO and IRI recommendations and compared to its original class. Results demonstrate that the approach is feasible and that a good estimate of the short-wavelength features of the road profile can be detected, despite the variability between the vehicles used to collect the data.
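The Cross Entropy search loop described above can be sketched in a few lines: sample candidate road-height vectors from a Gaussian, score them by the sum-of-squares objective, and refit the sampling distribution to the elite candidates. Here the forward model is the identity, a toy stand-in for the half-car simulation, and the "true" irregularities are invented.

```python
import numpy as np

rng = np.random.default_rng(3)
target = np.array([0.01, -0.02, 0.015, 0.0])    # 'true' irregularities (m)

mu = np.zeros(4)                                 # sampling mean
sigma = np.full(4, 0.05)                         # sampling std dev
for _ in range(60):
    # generate a sample of candidate road profiles
    samples = rng.normal(mu, sigma, size=(200, 4))
    # objective: sum of squared differences to the 'measured' values
    costs = ((samples - target) ** 2).sum(axis=1)
    elite = samples[np.argsort(costs)[:20]]      # keep the best 10%
    # refit the sampling distribution to the elite set
    mu, sigma = elite.mean(axis=0), elite.std(axis=0) + 1e-6
```

The sampling distribution contracts around the minimizer, so `mu` converges to the road-height vector that best explains the measurements.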
Abstract:
The aim of the study was to investigate the potential of a metabolomics platform to distinguish between pigs treated with ronidazole, dimetridazole and metronidazole and non-medicated animals (controls), at two withdrawal periods (day 0 and 5). Livers from each animal were biochemically profiled using UHPLC–QTof-MS in ESI+ mode of acquisition. Several Orthogonal Partial Least Squares-Discriminant Analysis models were generated from the acquired mass spectrometry data. The models discriminated between the two groups of control and treated animals. A total of 42 ions of interest explained the variation in ESI+. It was possible to identify 3 of the ions and to positively classify 4 of the ionic features, which can be used as potential biomarkers of illicit 5-nitroimidazole abuse. Further evidence of the toxic mechanisms of 5-nitroimidazole drugs has been revealed, which may be of substantial importance as metronidazole is widely used in human medicine.
Abstract:
A geostatistical version of the classical Fisher rule (linear discriminant analysis) is presented. This method is applicable when a large dataset of multivariate observations is available within a domain split into several known subdomains, and it assumes that the variograms (or covariance functions) are comparable between subdomains, which differ only in the mean values of the available variables. The method consists of finding the eigen-decomposition of the matrix W^(-1)B, where W is the matrix of sills of all direct- and cross-variograms, and B is the covariance matrix of the vectors of weighted means within each subdomain, obtained by generalized least squares. The method is used to map peat blanket occurrence in Northern Ireland, with data from the Tellus survey, which requires a minimal change to the general recipe: using compositionally-compliant variogram tools and models, and working with log-ratio transformed data.
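The core computation, the eigen-decomposition of W^(-1)B, is a few lines of linear algebra. The matrices below are small illustrative positive-definite examples, not the sill and between-subdomain covariance matrices from the Tellus data.

```python
import numpy as np

# W: matrix of sills of the direct- and cross-variograms (illustrative)
W = np.array([[2.0, 0.3, 0.1],
              [0.3, 1.5, 0.2],
              [0.1, 0.2, 1.0]])
# B: covariance of the subdomain mean vectors (illustrative)
B = np.array([[1.0, 0.4, 0.2],
              [0.4, 0.8, 0.3],
              [0.2, 0.3, 0.5]])

# Eigen-decomposition of W^-1 B (solve instead of an explicit inverse)
eigvals, eigvecs = np.linalg.eig(np.linalg.solve(W, B))
order = np.argsort(eigvals.real)[::-1]     # strongest discriminant first
directions = eigvecs.real[:, order]        # Fisher discriminant directions
```

As in ordinary linear discriminant analysis, the leading eigenvectors give the linear combinations of variables that best separate the subdomains relative to the within-subdomain spatial structure.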
Abstract:
Camera traps are used to estimate densities or abundances using capture-recapture and, more recently, random encounter models (REMs). We deploy REMs to describe an invasive-native species replacement process, and to demonstrate their wider application beyond abundance estimation. The Irish hare Lepus timidus hibernicus is a high priority endemic of conservation concern. It is threatened by an expanding population of non-native, European hares L. europaeus, an invasive species of global importance. Camera traps were deployed in thirteen 1 km squares, wherein the ratio of invader to native densities was corroborated by night-driven line transect distance sampling throughout the study area of 1652 km². Spatial patterns of invasive and native densities between the invader's core and peripheral ranges, and native allopatry, were comparable between methods. Native densities in the peripheral range were comparable to those in native allopatry using REM, or marginally depressed using Distance Sampling. Numbers of the invader were substantially higher than the native in the core range, irrespective of method, with a 5:1 invader-to-native ratio indicating species replacement. We also describe a post hoc optimization protocol for REM which will inform subsequent (re-)surveys, allowing survey effort (camera hours) to be reduced by up to 57% without compromising the width of confidence intervals associated with density estimates. This approach will form the basis of a more cost-effective means of surveillance and monitoring for both the endemic and invasive species. The European hare undoubtedly represents a significant threat to the endemic Irish hare.
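For readers unfamiliar with REMs, the commonly used gas-model formula estimates density from the camera encounter rate and the animals' movement and detection-zone parameters: D = (y/t) · π / (v·r·(2 + θ)). The sketch below applies it with invented inputs; the abstract does not give the study's parameter values.

```python
import math

def rem_density(y, t, v, r, theta):
    """Random encounter model (gas-model) density estimate.
    y: number of independent detections, t: camera-hours,
    v: animal travel speed (km/h), r: detection radius (km),
    theta: detection arc angle (radians)."""
    return (y / t) * math.pi / (v * r * (2.0 + theta))

# Illustrative values: 120 detections over 5000 camera-hours,
# speed 0.1 km/h, radius 10 m, detection arc 0.6 rad
d = rem_density(y=120, t=5000.0, v=0.1, r=0.01, theta=0.6)   # animals per km^2
```

Because the estimate scales directly with the encounter rate y/t, reducing camera-hours widens the confidence interval, which is the trade-off the paper's post hoc optimization protocol manages.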
Abstract:
The UK’s transportation network is supported by critical geotechnical assets (cuttings/embankments/dams) that require sustainable, cost-effective management, while maintaining an appropriate service level to meet social, economic, and environmental needs. Recent effects of extreme weather on these geotechnical assets have highlighted their vulnerability to climate variations. We have assessed the potential of surface wave data to portray the climate-related variations in mechanical properties of a clay-filled railway embankment. Seismic data were acquired bimonthly from July 2013 to November 2014 along the crest of a heritage railway embankment in southwest England. For each acquisition, the collected data were first processed to obtain a set of Rayleigh-wave dispersion and attenuation curves, referenced to the same spatial locations. These data were then analyzed to identify a coherent trend in their spatial and temporal variability. The relevance of the observed temporal variations was also verified with respect to the experimental data uncertainties. Finally, the surface wave dispersion data sets were inverted to reconstruct a time-lapse model of S-wave velocity for the embankment structure, using a least-squares laterally constrained inversion scheme. A key point of the inversion process was the estimation of a suitable initial model and the selection of adequate levels of spatial regularization. The initial model and the strength of spatial smoothing were then kept constant throughout the processing of all available data sets, to ensure homogeneity of the procedure and comparability among the obtained VS sections. A continuous and coherent temporal pattern of surface wave data, and consequently of the reconstructed VS models, was identified. This pattern is related to the seasonal distribution of precipitation and soil water content measured on site.
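A laterally constrained least-squares inversion can be sketched in Tikhonov form: minimize |Gm - d|² + λ|Lm|², where L penalizes differences between laterally adjacent model cells. The toy sensitivity matrix, single-layer/5-position geometry, and velocity values below are invented stand-ins, not the paper's surface wave kernels.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 5                                               # lateral positions
m_true = np.array([200.0, 205.0, 210.0, 240.0, 245.0])   # S-wave velocity (m/s)
G = np.eye(n) + 0.1 * rng.normal(size=(n, n))       # toy sensitivity matrix
d = G @ m_true + rng.normal(scale=1.0, size=n)      # noisy 'dispersion' data

# First-difference operator between laterally adjacent cells
L = np.diff(np.eye(n), axis=0)
lam = 0.5                                           # lateral smoothing strength
# Normal equations of the regularized least-squares problem
m_est = np.linalg.solve(G.T @ G + lam * L.T @ L, G.T @ d)
```

Keeping λ and the initial model fixed across all time-lapse data sets, as the paper does, ensures that differences between the recovered VS sections reflect the data rather than changes in the regularization.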
Abstract:
Sidewalks are integral features of city centres. They provide the channels through which activities and interactions evolve, and in turn these interactions cause the sidewalks themselves to evolve. They help to tie together and articulate the built form and open spaces. Historically, however, sidewalks have received less attention than urban squares and civic spaces. Owing to the concept of walkable cities, sidewalks are gaining importance. This paper provides a critical overview of the apparent ‘amnesia’ in urban design and planning theories and visits a popular sidewalk in Belfast city centre to examine the paradox and perspectives.
Abstract:
The main objective of this thesis is the development of robust variogram estimators with good efficiency properties. The variogram is a fundamental tool in Geostatistics, since it models the dependence structure of the process under study and decisively influences the prediction of new observations. Traditional variogram estimation methods are not robust, that is, they are sensitive to small deviations from the model assumptions. This issue matters because the properties that motivate the application of such methods may not hold in neighbourhoods of the assumed model. The thesis begins with a review of the main concepts in Geostatistics and of traditional variogram estimation, followed by a summary of fundamental notions of statistical robustness. A new variogram estimation method, termed the multiple-variogram estimator, is then presented. The method consists of four steps, in which robustness and efficiency criteria alternately prevail: from the initial sample, some pointwise variogram estimates are computed robustly; based on these pointwise estimates, the model parameters are estimated by least squares; the two previous steps are repeated, creating a set of multiple estimates of the variogram function; finally, the final variogram estimate is defined as the median of the previously obtained estimates. In this way, an estimator with good robustness properties and good efficiency under Gaussian processes is obtained. The research showed that, when discrete estimates are used in the first stage of variogram estimation, there are situations in which the identifiability of the parameters is not assured. For the most common variogram models, it was possible to establish mildly restrictive conditions that guarantee uniqueness of the solution in variogram estimation.
Variogram estimation always assumes stationarity of the process mean. Since objective procedures for assessing this condition are important, this thesis proposes a test to validate that hypothesis. The test statistic is an MM-estimator whose distribution is unknown under the assumed dependence conditions. To approximate it, a version of the bootstrap method suited to the study of dependent observations from spatial processes is presented. Finally, the multiple-variogram estimator is evaluated in terms of its practical application. The thesis includes a simulation study that confirms the established properties. In all cases analysed, the multiple-variogram estimator produced better results than the usual alternatives, both for the assumed distribution and for contaminated distributions.
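The multiple-estimates-then-median idea can be sketched as follows: compute a robust pointwise variogram on several subsamples, then take the pointwise median across subsamples as the final estimate. The 1-D process, the lag set, and the use of subsamples (in place of the thesis's full four-step robust/least-squares alternation) are illustrative simplifications.

```python
import numpy as np

rng = np.random.default_rng(5)
z = np.cumsum(rng.normal(size=500)) * 0.05   # illustrative 1-D spatial process

lags = np.arange(1, 9)

def robust_gamma(data):
    """Robust pointwise variogram: median of squared increments per lag.
    Dividing the median by 0.455 (the median of a chi-squared(1) variable)
    makes the estimator consistent for Gaussian increments."""
    return np.array([np.median((data[:-h] - data[h:]) ** 2) / (2 * 0.455)
                     for h in lags])

# Several robust estimates from overlapping subsamples,
# then the pointwise median as the final variogram estimate
estimates = [robust_gamma(z[i:i + 300]) for i in range(0, 200, 50)]
gamma_final = np.median(np.vstack(estimates), axis=0)
```

The median step keeps the final curve stable even if a minority of the individual estimates are distorted by contaminated observations, which is the robustness property the thesis establishes for its estimator.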