153 results for Statistical methodologies


Relevance:

30.00%

Publisher:

Abstract:

Detection of growth-promoter use in animal production systems still proves to be an analytical challenge despite years of activity in the field. This study reports on the capability of NMR metabolomic profiling techniques to discriminate between plasma samples obtained from cattle treated with different groups of growth-promoting hormones (dexamethasone, prednisolone, oestradiol) based on recorded metabolite profiles. Two methods of NMR analysis were investigated: a Carr–Purcell–Meiboom–Gill (CPMG) pulse-sequence technique and a conventional 1H NMR method using pre-extracted plasma. Using the CPMG method, 17 distinct metabolites could be identified from the spectra. 1H NMR analysis of extracted plasma facilitated identification of 23 metabolites, six more than the alternative method, all of the additional ones within the aromatic region. Multivariate statistical analysis of the data acquired from both forms of NMR analysis separated the plasma metabolite profiles into distinct sample clusters representative of the different animal study groups. Samples from both sets of corticosteroid-treated animals (dexamethasone and prednisolone) clustered relatively closely and showed similar alterations to the identified metabolite panels. Oestradiol-treated animals showed distinctive metabolite profiles, different from those of the corticosteroid-treated animals, and their samples formed a cluster spatially isolated from the control plasma samples. These findings suggest the potential of NMR-based plasma metabolite analysis as a high-throughput screening technique to aid detection of growth-promoter use.
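The multivariate separation described above is typically obtained by projecting the metabolite intensities onto a few principal components and inspecting the resulting clusters. A minimal sketch of that idea, using synthetic intensities (the sample sizes, 17-signal panel, and effect size are illustrative, not the study's data):

```python
import numpy as np

def pca_scores(X, n_components=2):
    """Project samples onto the leading principal components (via SVD)."""
    Xc = X - X.mean(axis=0)          # mean-centre each metabolite signal
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T  # scores in the reduced space

# Hypothetical intensities: rows = plasma samples, columns = metabolite signals.
rng = np.random.default_rng(0)
control = rng.normal(0.0, 0.1, size=(10, 17))
treated = rng.normal(1.0, 0.1, size=(10, 17))   # shifted metabolite panel
scores = pca_scores(np.vstack([control, treated]))

# Treated and control samples separate along the first component
# (their mean scores fall on opposite sides of the origin).
print(scores[:10, 0].mean() * scores[10:, 0].mean() < 0)
```

In practice such scores plots are the basis for the cluster sets reported in the abstract, with supervised variants (e.g. PLS-DA) used when class labels are exploited directly.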

Relevance:

30.00%

Publisher:

Abstract:

Recently there has been increasing interest in the development of new methods that use Pareto optimality to deal with multiple objective criteria (for example, accuracy and architectural complexity). Once a model has been learned with such a method, the problem is then how to compare it with the state of the art. In machine learning, algorithms are typically evaluated by comparing their performance on different data sets by means of statistical tests. Unfortunately, the standard tests used for this purpose are not able to consider several performance measures jointly. The aim of this paper is to resolve this issue by developing statistical procedures that can account for multiple competing measures at the same time. In particular, we develop two tests: a frequentist procedure based on the generalized likelihood-ratio test and a Bayesian procedure based on a multinomial-Dirichlet conjugate model. We further extend both by discovering conditional independences among the measures, reducing the number of parameters of the models, since the number of cases studied in such comparisons is usually small. Real data from a comparison among general-purpose classifiers is used to show a practical application of our tests.

Relevance:

20.00%

Publisher:

Abstract:

This paper reports a study carried out to develop a self-compacting fibre-reinforced concrete with a high fibre content: slurry-infiltrated fibre concrete (SIFCON). The SIFCON was developed with 10% steel fibres infiltrated by a self-compacting cement slurry without any vibration; traditionally, infiltration of the slurry into the layer of fibres is carried out under intensive vibration. A two-level fractional factorial design was used to optimise the properties of the cement-based slurries with four independent variables: dosage of silica fume, dosage of superplasticiser (SP), sand content, and water/cement ratio (W/C). A rheometer, the mini-slump test, the Lombardi plate cohesion meter, the J-fibre penetration test, and induced bleeding were used to assess the behaviour of the fresh cement slurries. Compressive strengths at 7 and 28 days were also measured. The statistical models are valid for slurries made with a W/C of 0.40 to 0.50, 50 to 100% sand by mass of cement, 5 to 10% silica fume by mass of cement, and an SP dosage of 0.6 to 1.2% by mass of cement. The models make it possible to evaluate the effect of individual variables on the measured parameters of fresh cement slurries, and they offered useful information for understanding trade-offs between mix variables and for comparing the responses obtained from the various test methods in order to optimise self-compacting SIFCON.
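A two-level fractional factorial design with four factors is typically a half-fraction: three factors form a full 2^3 design and the fourth is aliased with their interaction. A minimal sketch of generating such a design in coded units (the factor names match the abstract, but the generator choice D = ABC is an assumption):

```python
from itertools import product

# Half-fraction 2^(4-1) design for four slurry variables, coded -1/+1.
factors = ["silica_fume", "superplasticiser", "sand", "w_c_ratio"]

runs = []
for a, b, c in product([-1, 1], repeat=3):
    d = a * b * c                  # generator D = ABC defines the half-fraction
    runs.append((a, b, c, d))

for run in runs:
    print(dict(zip(factors, run)))

print(len(runs))                   # 8 runs instead of the full 16
```

The coded levels would then be mapped back to the physical ranges quoted in the abstract (e.g. -1 → W/C of 0.40, +1 → 0.50) before mixing the slurries.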

Relevance:

20.00%

Publisher:

Abstract:

Self-compacting concrete (SCC) is generally designed with a relatively higher content of fines, including cement, and a higher dosage of superplasticizer than conventional concrete. Current SCC designs lead to high compressive strength, and SCC is already used in special applications where the high cost of materials can be tolerated. Using SCC, which eliminates the need for vibration, increases the speed of casting and thus reduces labour requirements, energy consumption, construction time, and the cost of equipment. To gain maximum benefit, SCC has to be used in wider applications; the cost of materials can be decreased by reducing the cement content and using a minimum amount of admixtures. This paper reviews statistical models obtained from a factorial design carried out to determine the influence of four key parameters on filling ability, passing ability, segregation, and compressive strength. These parameters are important for the successful development of medium-strength self-compacting concrete (MS-SCC). The parameters considered in the study were the contents of cement and pulverised fuel ash (PFA), the water-to-powder ratio (W/P), and the dosage of superplasticizer (SP). The responses of the derived statistical models are slump flow, fluidity loss, rheological parameters, Orimet time, V-funnel time, L-box, J-Ring combined with the Orimet, J-Ring combined with the cone, fresh segregation, and compressive strength at 7, 28, and 90 days. The models are valid for mixes made with a W/P ratio of 0.38 to 0.72, 60 to 216 kg/m3 of cement, 183 to 317 kg/m3 of PFA, and 0 to 1% SP by mass of powder. The utility of such models to optimise concrete mixes to achieve a good balance between filling ability, passing ability, segregation, compressive strength, and cost is discussed.
Examples highlighting the usefulness of the models are presented using isoresponse surfaces to demonstrate single and coupled effects of mix parameters on slump flow, loss of fluidity, flow resistance, segregation, J-Ring combined with the Orimet, and compressive strength at 7 and 28 days. A cost analysis is carried out to show trade-offs between the cost of materials and specified consistency levels and compressive strengths at 7 and 28 days, which can be used to identify economic mixes. The paper establishes the usefulness of the mathematical models as a tool to facilitate the test protocol required to optimise medium-strength SCC.
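The statistical models behind such isoresponse surfaces are usually linear regressions fitted to the coded factorial runs. A minimal sketch, fitting a main-effects model by least squares; the design is an assumed orthogonal 2^(4-1) fraction and the slump-flow values are invented for illustration:

```python
import numpy as np

# Hypothetical coded factorial data: columns = W/P, cement, PFA, SP (all -1/+1);
# response = slump flow (mm). Values are illustrative only.
X = np.array([
    [-1, -1, -1, -1],
    [ 1, -1, -1,  1],
    [-1,  1, -1,  1],
    [ 1,  1, -1, -1],
    [-1, -1,  1,  1],
    [ 1, -1,  1, -1],
    [-1,  1,  1, -1],
    [ 1,  1,  1,  1],
])
y = np.array([620, 700, 640, 680, 650, 710, 655, 730])

A = np.column_stack([np.ones(len(X)), X])       # intercept + main effects
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

for name, c in zip(["intercept", "W/P", "cement", "PFA", "SP"], coef):
    print(f"{name}: {c:+.1f}")
```

Evaluating such a fitted model over a grid of two factors, with the others held fixed, yields exactly the kind of isoresponse surface the abstract describes; the trade-off and cost analyses then compare these surfaces across responses.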

Relevance:

20.00%

Publisher:

Abstract:

This paper points out a serious flaw in dynamic multivariate statistical process control (MSPC). The principal component analysis of a linear time-series model employed to capture auto- and cross-correlation in recorded data may produce a considerable number of variables to be analysed. To give a dynamic representation of the data (based on variable correlation) and avoid producing such a large time-series structure, a linear state-space model is used here instead. The paper demonstrates that, by incorporating a state-space model, the number of variables to be analysed dynamically can be considerably reduced compared to conventional dynamic MSPC techniques.
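The variable inflation in conventional dynamic MSPC comes from stacking time-shifted copies of the data before applying PCA. A minimal sketch of that lagged-matrix construction, with illustrative dimensions (10 process variables, 5 lags):

```python
import numpy as np

def lagged_matrix(X, lags):
    """Stack time-shifted copies of X, as in conventional dynamic PCA."""
    T, m = X.shape
    cols = [X[lags - k : T - k] for k in range(lags + 1)]
    return np.hstack(cols)

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 10))     # 10 recorded process variables

Z = lagged_matrix(X, lags=5)
print(Z.shape)                     # (195, 60): 10 * (5 + 1) lagged variables

# A state-space model summarises the same dynamics with n state variables,
# where n is typically far smaller than m * (lags + 1).
```

The 60-column matrix here is what PCA would have to digest; a state-space description with a handful of states carries the same dynamic information, which is the reduction the paper exploits.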

Relevance:

20.00%

Publisher:

Abstract:

The work in this paper is of particular significance since it considers the problem of modelling cross- and auto-correlation in statistical process monitoring. The presence of both types of correlation can lead to fault insensitivity or false alarms, although in published literature to date, only autocorrelation has been broadly considered. The proposed method, which uses a Kalman innovation model, effectively removes both correlations. The paper (and Part 2 [2]) has emerged from work supported by EPSRC grant GR/S84354/01 and is of direct relevance to problems in several application areas including chemical, electrical, and mechanical process monitoring.
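The decorrelating effect of a Kalman innovation model can be demonstrated on a toy system. This is a sketch, not the paper's method: the model matrices, noise levels, and the assumption that the true model is known are all illustrative. The innovations, unlike the raw measurements, come out essentially white:

```python
import numpy as np

# Assumed known linear state-space model:
#   x[t+1] = A x[t] + w[t],   y[t] = C x[t] + v[t]
# The Kalman innovations e[t] = y[t] - C x_pred[t] are ideally white,
# removing the auto- and cross-correlation present in y itself.
rng = np.random.default_rng(3)
A = np.array([[0.9, 0.2], [0.0, 0.8]])   # couples the two states (cross-corr.)
C = np.eye(2)
Q, R = 0.1 * np.eye(2), 0.1 * np.eye(2)

# Simulate correlated measurements.
T, x = 2000, np.zeros(2)
Y = np.empty((T, 2))
for t in range(T):
    Y[t] = C @ x + rng.multivariate_normal(np.zeros(2), R)
    x = A @ x + rng.multivariate_normal(np.zeros(2), Q)

# Kalman filter; collect innovations.
xh, P = np.zeros(2), np.eye(2)
E = np.empty((T, 2))
for t in range(T):
    e = Y[t] - C @ xh                    # innovation at time t
    S = C @ P @ C.T + R
    K = P @ C.T @ np.linalg.inv(S)
    xh, P = xh + K @ e, (np.eye(2) - K @ C) @ P   # measurement update
    xh, P = A @ xh, A @ P @ A.T + Q               # time update
    E[t] = e

def lag1_autocorr(z):
    z = z - z.mean()
    return float(np.dot(z[:-1], z[1:]) / np.dot(z, z))

# Raw measurements are strongly autocorrelated; innovations are nearly white.
print(round(lag1_autocorr(Y[:, 0]), 2), round(lag1_autocorr(E[:, 0]), 2))
```

Monitoring statistics built on the innovation sequence therefore satisfy the independence assumptions that raw, correlated process data violate, which is the fault-sensitivity argument made above.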

Relevance:

20.00%

Publisher:

Abstract:

This paper builds on the work presented in the first paper, Part 1 [1], and is of equal significance. It proposes a novel compensation method to preserve the integrity of the step-fault signatures prevalent in various processes, which can be masked during the removal of auto- and cross-correlation. Using industrial data, the paper demonstrates the benefit of the proposed method, which is applicable to chemical, electrical, and mechanical process monitoring. This paper (and Part 1 [1]) has led to further work, supported by EPSRC grant GR/S84354/01, involving kernel PCA methods.