9 results for reference range

in Aston University Research Archive


Relevance:

60.00%

Publisher:

Abstract:

The involvement of oxidatively modified low-density lipoprotein (LDL) in the development of coronary heart disease (CHD) is widely described. We have produced two antibodies recognizing the lipid oxidation product malondialdehyde (MDA) on whole LDL or ApoB-100. The antibodies were utilized in the development of an ELISA for quantitation of MDA-LDL in human plasma. Intra- and inter-assay coefficients of variation (%CV) were measured as 4.8 and 7.7%, respectively, and the sensitivity of the assay as 0.04 μg/ml MDA-LDL. Recovery of standard MDA-LDL from native LDL was 102%, indicating the ELISA to be specific, with no interference from other biomolecules. Further validation of the ELISA was carried out against two established methods for measurement of lipid peroxidation products: MDA by HPLC and F2-isoprostanes by GC-MS. Results indicated that MDA-LDL is formed at a later stage of oxidation than either MDA or F2-isoprostanes. In vivo analysis demonstrated that the ELISA was able to determine steady-state concentrations of plasma MDA-LDL (an end marker of lipid peroxidation). A reference range of 34.3 ± 8.8 μg/ml MDA-LDL was established for healthy individuals. Further, the ELISA was used to show significantly increased plasma MDA-LDL levels in subjects with confirmed ischemic heart disease, and could therefore possibly be of benefit as a diagnostic tool for assessing CHD risk. © 2003 Elsevier Inc.
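The precision figures quoted above are coefficients of variation; as a minimal sketch, %CV is simply the standard deviation of replicate measurements expressed as a percentage of their mean. The replicate values below are hypothetical, not data from the study:

```python
import statistics

def percent_cv(replicates):
    """Coefficient of variation: SD as a percentage of the mean."""
    return statistics.stdev(replicates) / statistics.mean(replicates) * 100

# Hypothetical intra-assay replicates of one plasma sample (ug/ml MDA-LDL)
replicates = [33.1, 35.0, 34.2, 36.4, 33.8]
print(round(percent_cv(replicates), 1))  # 3.7
```

Intra-assay %CV uses replicates within one run; inter-assay %CV uses the same sample measured across separate runs.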

Relevance:

60.00%

Publisher:

Abstract:

Context: Subclinical hypothyroidism (SCH) and cognitive dysfunction are both common in the elderly and have been linked. It is important to determine whether T4 replacement therapy in SCH confers cognitive benefit. Objective: Our objective was to determine whether administration of T4 replacement to achieve biochemical euthyroidism in subjects with SCH improves cognitive function. Design and Setting: We conducted a double-blind, placebo-controlled randomized controlled trial in the context of United Kingdom primary care. Patients: Ninety-four subjects aged 65 yr and over (57 females, 37 males) with SCH were recruited from a population of 147 identified by screening. Intervention: T4 or placebo was given at an initial dosage of one tablet per day (25 µg T4 or matching placebo) for 12 months. Thyroid function tests were performed at 8-weekly intervals, with dosage adjusted in one-tablet increments to achieve TSH within the reference range for subjects in the treatment arm. Fifty-two subjects received T4 (31 females, 21 males; mean age 73.5 yr, range 65–94 yr); 42 subjects received placebo (26 females, 16 males; mean age 74.2 yr, range 66–84 yr). Main Outcome Measures: Mini-Mental State Examination, Middlesex Elderly Assessment of Mental State (covering orientation, learning, memory, numeracy, perception, attention, and language skills), and Trail-Making A and B were administered. Results: Eighty-two percent and 84% of the T4 group achieved euthyroidism at 6 and 12 months, respectively.
Cognitive function scores at baseline, 6, and 12 months were as follows: Mini-Mental State Examination T4 group, 28.26, 28.9, and 28.28, and placebo group, 28.17, 27.82, and 28.25 [not significant (NS)]; Middlesex Elderly Assessment of Mental State T4 group, 11.72, 11.67, and 11.78, and placebo group, 11.21, 11.47, and 11.44 (NS); Trail-Making A T4 group, 45.72, 47.65, and 44.52, and placebo group, 50.29, 49.00, and 46.97 (NS); and Trail-Making B T4 group, 110.57, 106.61, and 96.67, and placebo group, 131.46, 119.13, and 108.38 (NS). Linear mixed-model analysis demonstrated no significant changes in any of the measures of cognitive function over time and no between-group difference in cognitive scores at 6 and 12 months. Conclusions: This RCT provides no evidence that T4 replacement therapy improves cognitive function in elderly subjects with SCH.

Relevance:

30.00%

Publisher:

Abstract:

Discrete, microscopic lesions develop in the brain in a number of neurodegenerative diseases. These lesions may not be randomly distributed in the tissue but exhibit a spatial pattern, i.e., a departure from randomness towards regularity or clustering. The spatial pattern of a lesion may reflect its development in relation to other brain lesions or to neuroanatomical structures. Hence, a study of spatial pattern may help to elucidate the pathogenesis of a lesion. A number of statistical methods can be used to study the spatial patterns of brain lesions. They range from simple tests of whether the distribution of a lesion departs from randomness to more complex methods which can detect clustering and the size, distribution, and spacing of clusters. This paper reviews the uses and limitations of these methods as applied to neurodegenerative disorders, and in particular to senile plaque formation in Alzheimer's disease.
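The simplest of the tests described is the variance-to-mean ratio (index of dispersion) of lesion counts in equal-sized quadrats: a ratio near 1 is consistent with a random (Poisson) pattern, a ratio above 1 suggests clustering, and a ratio below 1 suggests regularity. A minimal sketch, in which the counts and the ±0.25 tolerance are illustrative rather than taken from the review:

```python
import statistics

def variance_mean_ratio(counts):
    """Index of dispersion for lesion counts in equal-sized quadrats."""
    return statistics.variance(counts) / statistics.mean(counts)

def pattern(vmr, tol=0.25):
    """Crude interpretation; a formal test compares against chi-square."""
    if vmr > 1 + tol:
        return "clustered"
    if vmr < 1 - tol:
        return "regular"
    return "random"

clustered_counts = [0, 0, 9, 0, 1, 8, 0, 0]  # lesions piled into few quadrats
regular_counts = [2, 3, 2, 3, 2, 3, 2, 3]    # lesions evenly spread
print(pattern(variance_mean_ratio(clustered_counts)),
      pattern(variance_mean_ratio(regular_counts)))
```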

Relevance:

30.00%

Publisher:

Abstract:

This thesis provides the first detailed study of maximal oxygen consumption of turbot on a fish farm over a range of fish sizes and temperatures. Also provided is a study of the diets used in turbot farming and the development of a diet that contains no fresh fish. A detailed study of previous research on flatfish nutrition identified fresh fish, sprat in particular, as the optimum diet for turbot farming. A series of experiments was undertaken that confirmed this and also identified one possible explanation for the optimum performance of sprat: its high non-protein energy ratio. This factor was exploited in the production of a diet containing no fresh fish, which produced superior results to diets containing fresh fish; the optimum level of lipid in the diet was determined as 18%. The study of oxygen consumption was on fully-fed fish so that maximum demand could be quantified. Continuous monitoring of tank water oxygen levels enabled the calculation of the Specific Dynamic Action (SDA) effect in turbot and its relation to dietary energy. Variation of SDA with the dietary energy profile was identified as a contributing factor to differential fish growth on various diets. Finally, the implications of this work for fish farming were considered. Economic appraisal and comparison of the diets routinely used in turbot farming identified that the diet developed as a result of this work, i.e. the diet containing no fresh fish protein, was more cost-effective on the basis of the production of one tonne of turbot. The study of oxygen consumption enables the required water supply to be calculated for any fish size between 1 g and 1000 g at temperatures between 7°C and 16°C. The quantification of SDA enables correct adjustment of oxygen flows according to the feeding status of the fish.
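The kind of size-and-temperature calculation the thesis enables can be sketched with a standard allometric model with Q10 temperature scaling. The coefficients below are purely illustrative placeholders, not the values fitted in this work:

```python
def oxygen_demand_mg_per_h(weight_g, temp_c, a=0.3, b=0.8, q10=2.0, t_ref=10.0):
    """Maximal oxygen demand of a fully-fed fish: a * W^b, scaled by Q10.
    a, b, q10 and t_ref are hypothetical; the thesis covers 1-1000 g, 7-16 C."""
    return a * weight_g ** b * q10 ** ((temp_c - t_ref) / 10.0)

# Demand rises with both fish size and water temperature
print(oxygen_demand_mg_per_h(100.0, 7.0), oxygen_demand_mg_per_h(100.0, 16.0))
```

Summing this demand over all fish in a tank, and adding the SDA surcharge after feeding, gives the oxygen flow (and hence water supply) the farm must deliver.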

Relevance:

30.00%

Publisher:

Abstract:

This thesis investigates the role of accounting in planning and control in the Egyptian Iron and Steel Company "Hadisolb". The hypothesis is that there should be planning and control at appropriate levels, with a significant accounting involvement, in an organisation such as the Egyptian Iron and Steel Company "Hadisolb". Part One of the thesis explains the role of accounting in planning and control, with special emphasis on its role in long-range corporate planning and control. Parts Two and Three review the history of the Egyptian Iron and Steel Company "Hadisolb", its organisation and structure, and the role of accounting in its planning and control arrangements, together with comments and criticisms concerning these. Part Four consists mainly of recommendations for alterations or improvements in planning and control in Hadisolb, including a suggested planning and organisation structure and physical and cost control reporting structures.

Relevance:

30.00%

Publisher:

Abstract:

Whether used to assess the functionality of equipment or as a determinant of the accuracy of assays, reference standards are essential for the purposes of standardisation and validation. The ELISPOT assay, developed over thirty years ago, has emerged as a leading immunological assay for the assessment of efficacy in the development of novel vaccines. However, with its widespread use, there is a growing demand for a greater level of standardisation across different laboratories. One of the major difficulties in achieving this goal has been the lack of definitive reference standards. This is partly due to the ex vivo nature of the assay, which relies on cells being placed directly into the wells. Thus, the aim of this thesis was to produce an artificial reference standard, based on liposomes, for use within the assay. Liposomes are spherical bilayer vesicles with an enclosed aqueous compartment and are therefore models for biological membranes. Initial work examined pre-design considerations in order to produce an optimal formulation that would closely mimic the action of the cells ordinarily placed in the assay. Recognition of the structural differences between liposomes and cells led to the formulation of liposomes with increased density, achieved by using a synthesised cholesterol analogue. By incorporating this cholesterol analogue in liposomes, increased sedimentation rates were observed within the first few hours. The optimal liposome formulation from these studies was composed of 1,2-dipalmitoyl-sn-glycero-3-phosphocholine (DPPC), cholesterol (Chol) and brominated cholesterol (Brchol) at a 16:4:12 µmol ratio, based on a significantly higher (p < 0.01) sedimentation (as determined by a percentage transmission of 59 ± 5.9% compared to the control formulation at 29 ± 12% after four hours). By considering a range of liposome formulations, 'proof of principle' for using liposomes as ELISPOT reference standards was shown: recombinant IFN-γ cytokine was successfully entrapped within vesicles of different lipid compositions, which were able to promote spot formation within the ELISPOT assay. Using optimised liposome formulations composed of phosphatidylcholine with or without cholesterol (16 µmol total lipid), further development was undertaken to produce an optimised, scalable protocol for the production of liposomes as reference standards. A linear increase in spot number through the manipulation of cytokine and/or lipid concentrations was not possible, potentially due to saturation occurring within the base of the wells. Investigations into storage of the formulations demonstrated the feasibility of freezing and lyophilisation with disaccharide cryoprotectants, but also highlighted the need for further protocol optimisation to achieve a robust reference standard upon storage. Finally, the transfer of small-scale production to a medium lab-scale batch (40 mL) demonstrated that this was feasible within the laboratory using the optimised protocol.
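The non-linear dose-response attributed above to saturation of the well base can be illustrated with a simple saturating (Michaelis-Menten-style) curve; `max_spots` and `half_max_dose` are hypothetical values for illustration, not measurements from this thesis:

```python
def expected_spots(dose, max_spots=400.0, half_max_dose=1.0):
    """Spot count plateaus as spots begin to merge on the finite well floor."""
    return max_spots * dose / (half_max_dose + dose)

# Doubling the entrapped cytokine less than doubles the spot count
print(expected_spots(1.0), expected_spots(2.0))  # 200.0, then ~266.7
```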

Relevance:

30.00%

Publisher:

Abstract:

Video streaming via Transmission Control Protocol (TCP) networks has become a popular and highly demanded service, but its quality assessment in both objective and subjective terms has not been properly addressed. In this paper, a full analytic model of a no-reference objective metric for video quality assessment, namely pause intensity (PI), is presented on the basis of statistical analysis. The model characterizes the video playout buffer behavior in connection with the network performance (throughput) and the video playout rate. This allows for instant quality measurement and control without requiring a reference video. PI specifically addresses the need to assess quality in terms of the continuity of playout of TCP streaming videos, which cannot be properly measured by other objective metrics such as peak signal-to-noise ratio, structural similarity, and buffer underrun or pause frequency. The performance of the analytical model is rigorously verified by simulation results and subjective tests using a range of video clips. It is demonstrated that PI is closely correlated with viewers' opinion scores regardless of the vastly different composition of the individual elements, such as pause duration and pause frequency, which jointly constitute this new quality metric. It is also shown that the correlation performance of PI is consistent and content independent. © 2013 IEEE.
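The playout-buffer behavior the model characterizes can be sketched with a one-second-step simulation. The paper derives PI analytically, so the fraction-of-time-paused measure below is only a simplified proxy, and the rebuffering threshold and rates are assumed values:

```python
def paused_fraction(throughput, playout_rate, n_seconds,
                    start_buffer=0.0, rebuffer_threshold=2.0):
    """Fraction of wall-clock time the player is paused (a crude PI proxy).
    Units are seconds of video; throughput is video-seconds received per second."""
    buf = start_buffer
    playing = buf >= rebuffer_threshold
    paused = 0
    for _ in range(n_seconds):
        buf += throughput                      # network fills the buffer
        if not playing and buf >= rebuffer_threshold:
            playing = True                     # rebuffering complete, resume
        if playing and buf >= playout_rate:
            buf -= playout_rate                # one second of video played
        else:
            playing = False                    # underrun: playout pauses
            paused += 1
    return paused / n_seconds

# Throughput above the playout rate: continuous playout, no pauses
print(paused_fraction(1.2, 1.0, 600, start_buffer=5.0))
```

When throughput falls below the playout rate, the buffer repeatedly drains and refills, producing the pause duration/frequency mix that PI aggregates into one number.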

Relevance:

30.00%

Publisher:

Abstract:

In this paper a full analytic model for pause intensity (PI), a no-reference metric for video quality assessment, is presented. The model is built upon the video playout buffer behavior at the client side and also encompasses the characteristics of a TCP network. Video streaming via TCP produces impairments in play continuity, which are not typically reflected in current objective metrics such as PSNR and SSIM. Recently, the buffer underrun frequency/probability has been used to characterize the buffer behavior and as a measurement for performance optimization. But we show, using subjective testing, that underrun frequency cannot reflect viewers' quality of experience for TCP-based streaming. We also demonstrate that PI is a comprehensive metric made up of a combination of phenomena observed in the playout buffer. The analytical model in this work is verified with simulations carried out on ns-2, showing that the two results are closely matched. The effectiveness of the PI metric has also been proved by subjective testing on a range of video clips, where PI values exhibit a good correlation with the viewers' opinion scores. © 2012 IEEE.

Relevance:

30.00%

Publisher:

Abstract:

The accuracy of a map is dependent on the reference dataset used in its construction. Classification analyses used in thematic mapping can, for example, be sensitive to a range of sampling and data quality concerns. With particular focus on the latter, the effects of reference data quality on land cover classifications from airborne thematic mapper data are explored. Variations in sampling intensity and effort are highlighted in a dataset that is widely used in mapping and modelling studies; these may need accounting for in analyses. The quality of the labelling in the reference dataset was also a key variable influencing mapping accuracy. Accuracy varied with the amount and nature of mislabelled training cases, and the effects differed between classifiers. The largest impacts on accuracy occurred when mislabelling involved confusion between similar classes. Accuracy was also typically negatively related to the proportion of mislabelled cases, and the support vector machine (SVM), which has been claimed to be relatively insensitive to training data error, was the most sensitive of the set of classifiers investigated, with overall classification accuracy declining by 8% (significant at the 95% level of confidence) with the use of a training set containing 20% mislabelled cases.
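The mechanism by which mislabelled training cases degrade a classifier can be shown with a toy one-dimensional nearest-centroid example (not the SVM or data of the study; the points are invented for illustration): a sample from one class mislabelled into the other shifts the learned centroid, and with it the decision boundary.

```python
def centroid(xs):
    return sum(xs) / len(xs)

# Two 1-D classes, e.g. spectral responses of two land-cover types
class_a = [0.0, 1.0, 2.0]
class_b = [9.0, 10.0, 11.0]

# Nearest-centroid boundary sits midway between the class centroids
clean_boundary = (centroid(class_a) + centroid(class_b)) / 2   # 5.5

# One class-B sample mislabelled into the class-A training set
noisy_a = class_a + [10.0]
noisy_boundary = (centroid(noisy_a) + centroid(class_b)) / 2

print(clean_boundary, noisy_boundary)  # boundary shifts toward class B
```

The shift is largest when the mislabelled sample comes from a nearby, similar class, matching the finding above that confusion between similar classes caused the biggest accuracy losses.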