11 results for Standard reference
in Aston University Research Archive
Abstract:
The standard reference clinical score quantifying average Parkinson's disease (PD) symptom severity is the Unified Parkinson's Disease Rating Scale (UPDRS). At present, UPDRS is determined by the subjective clinical evaluation of the patient's ability to adequately cope with a range of tasks. In this study, we extend recent findings that UPDRS can be objectively assessed to clinically useful accuracy using simple, self-administered speech tests, without requiring the patient's physical presence in the clinic. We apply a wide range of known speech signal processing algorithms to a large database (approx. 6000 recordings from 42 PD patients, recruited to a six-month, multi-centre trial) and propose a number of novel, nonlinear signal processing algorithms which reveal pathological characteristics in PD more accurately than existing approaches. Robust feature selection algorithms select the optimal subset of these algorithms, which is fed into non-parametric regression and classification algorithms, mapping the signal processing algorithm outputs to UPDRS. We demonstrate rapid, accurate replication of the UPDRS assessment with clinically useful accuracy (about 2 UPDRS points difference from the clinicians' estimates, p < 0.001). This study supports the viability of frequent, remote, cost-effective, objective, accurate UPDRS telemonitoring based on self-administered speech tests. This technology could facilitate large-scale clinical trials into novel PD treatments.
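The pipeline this abstract describes (speech signal features, robust feature selection, then regression onto UPDRS) can be sketched in miniature. This is a hedged illustration, not the authors' code: the feature names and toy values are invented, and a single correlation-ranked feature with univariate least squares stands in for the study's large feature sets and non-parametric regression.

```python
# Toy sketch of the described pipeline: rank candidate speech features by
# correlation with UPDRS, keep the best one, and fit a least-squares line.
# "jitter" and "shimmer" are hypothetical dysphonia measures for illustration.

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

# Toy data: two candidate features per recording, plus UPDRS labels.
jitter  = [0.5, 1.0, 1.5, 2.0, 2.5]
shimmer = [3.0, 1.0, 4.0, 1.5, 2.0]
updrs   = [10.0, 15.0, 20.0, 25.0, 30.0]

# Feature selection: keep the feature most correlated with UPDRS.
features = {"jitter": jitter, "shimmer": shimmer}
best = max(features, key=lambda k: abs(pearson_r(features[k], updrs)))

# Regression: map the selected feature to UPDRS and measure the error.
a, b = fit_line(features[best], updrs)
pred = [a * x + b for x in features[best]]
mae = sum(abs(p - y) for p, y in zip(pred, updrs)) / len(updrs)
print(best, round(mae, 3))  # -> jitter 0.0
```

In the study itself the mean absolute difference from the clinicians' estimates was about 2 UPDRS points; the toy data here are exactly linear, so the fit is perfect.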
Abstract:
The present work describes the development of a proton induced X-ray emission (PIXE) analysis system, especially designed and built for routine quantitative multi-elemental analysis of a large number of samples. The historical and general development of the analytical technique and the physical processes involved are discussed. The philosophy, design, constructional details and evaluation of a versatile vacuum chamber, an automatic multi-sample changer, an on-demand beam pulsing system and an ion beam current monitoring facility are described. The system calibration using thin standard foils of Si, P, S, Cl, K, Ca, Ti, V, Fe, Cu, Ga, Ge, Rb, Y and Mo was undertaken at proton beam energies of 1 to 3 MeV in steps of 0.5 MeV and compared with theoretical calculations. An independent calibration check using bovine liver Standard Reference Material was performed. The minimum detectable limits have been experimentally determined at detector positions of 90° and 135° with respect to the incident beam for the above range of proton energies as a function of atomic number Z. The system has detection limits of typically 10⁻⁷ to 10⁻⁹ g for elements 14
Abstract:
Background: The importance of appropriate normalization controls in quantitative real-time polymerase chain reaction (qPCR) experiments has become more apparent as the number of biological studies using this methodology has increased. In developing a system to study gene expression from transiently transfected plasmids, it became clear that normalization using chromosomally encoded genes is not ideal, as it does not take into account the transfection efficiency and the significantly lower expression levels of the plasmids. We have developed and validated a normalization method for qPCR using a co-transfected plasmid. Results: The chromosomal gene best suited to normalization in the presence of the transcriptional activators used in this study (cadmium, dexamethasone, forskolin and phorbol-12-myristate 13-acetate) was first identified. qPCR data were analyzed using geNorm, NormFinder and BestKeeper. Each software application was found to rank the normalization controls differently, with no clear correlation. Including a co-transfected plasmid encoding the Renilla luciferase gene (Rluc) in this analysis showed that its calculated stability was not as good as that of the optimised chromosomal genes, most likely as a result of the lower expression levels and transfection variability. Finally, we validated these analyses by testing two chromosomal genes (B2M and ActB) and a co-transfected gene (Rluc) under biological conditions. When analyzing co-transfected plasmids, Rluc normalization gave the smallest errors compared to the chromosomal reference genes. Conclusions: Our data demonstrate that transfected Rluc is the most appropriate normalization reference gene for transient transfection qPCR analysis; it significantly reduces the standard deviation within biological experiments, as it takes into account the transfection efficiencies and has easily controllable expression levels.
This improves reproducibility, data validity and, most importantly, enables accurate interpretation of qPCR data. © 2010 Jiwaji et al; licensee BioMed Central Ltd.
Abstract:
The replacement of diesel fuel by ultracarbofluids was perceived to offer the potential to decrease emissions of environmental pollutants such as carbon dioxide, carbon monoxide, hydrocarbons (HCs) and smoke. Such ultracarbofluids consist of a suspension of coal in fuel oil and water, generally in the ratio 5:3:2, plus a small amount of stabilising additive. The literature relating to the economics of coal and fuel oil production, and to the production and properties of charcoal and vegetable oils, has been critically reviewed. The potential use of charcoal and vegetable oils as replacements for coal and fuel oil is discussed. An experimental investigation was undertaken using novel bio-ultracarbofluid formulations. These differed from an ultracarbofluid by having bio-renewable charcoal and vegetable oil in place of coal and fuel oil. Tests were made with a Lister-Petter 600 cc two-cylinder, four-stroke diesel engine fitted with a Heenan-Froude DPX 1 water brake dynamometer to measure brake power output, and Mexa-321E and Mexa-211E analysers to measure exhaust pollutants. Measurements were made of engine brake power output and of carbon dioxide, carbon monoxide, hydrocarbon and smoke emissions over the speed range 1000 to 3000 rpm at 200 rpm intervals. The results were compared with those obtained with a standard diesel reference fuel. All the bio-ultracarbofluid formulations produced lower brake power outputs (5.6% to 20.7% less brake power) but substantially improved exhaust emissions of CO2, CO, HCs and smoke. The major factor in the formulation was found to be the type and amount of charcoal; charcoal with a high volatile content (27.2%) present at 30% by mass yielded the best results, i.e. only slightly lower brake power output and significantly lower exhaust pollutants.
Abstract:
Pilot scale studies of high rate filtration were initiated to assess its potential either as a primary 'roughing' filter to alleviate the seasonal overloading of low rate filters at Hereford sewage treatment works - caused by wastes from cider production - or as a two-stage high rate process to provide complete sewage treatment. Four mineral and four plastic primary filter media and two plastic secondary filter media were studied. The hydraulic loading applied to the primary plastic media (11.2 m³/m³·d) was twice that applied to the mineral media. The plastic media removed on average around 66 per cent, and the mineral media around 73 per cent, of the BOD applied when the 90-percentile BOD concentration was 563 mg/l. At a hydraulic loading of 4 m³/m³·d the secondary filters removed most of the BOD from partially settled primary filter effluents, with one secondary effluent satisfying a 25 mg/l BOD and 30 mg/l SS standard. No significant degree of nitrification was achieved. Fungi dominated the biological film of the primary filters, with invertebrate grazers having little influence on film levels. Ponding did not arise, and modular media supported lower film levels than random-fill types. Secondary filter film levels were low, being dominated by bacteria. The biological loading applied to the filters was related to sludge dewaterability, with the most readily conditionable sludges produced by filters supporting heavy film. Sludges produced by random-fill media could be dewatered as readily as those produced by low rate filters treating the same sewage. Laboratory scale studies showed a relationship between log effluent BOD and the nitrification achieved by biological filters. This relationship, and the relationship between BOD load applied and BOD load removed observed in all filter media, could be used to optimise the operating conditions required in biological filters to achieve given effluent BOD and ammoniacal nitrogen standards.
Abstract:
The IRDS standard is an international standard produced by the International Organisation for Standardisation (ISO). In this work the process for producing standards in formal standards organisations, for example the ISO, and in more informal bodies, for example the Object Management Group (OMG), is examined. This thesis examines previous models and classifications of standards, which are then combined to produce a new classification. The IRDS standard is then placed in a class in the new model as a reference anticipatory standard. Anticipatory standards are standards developed ahead of the technology in an attempt to guide the market. The diffusion of the IRDS is traced over a period of eleven years. The economic conditions which affect the diffusion of standards are examined, particularly those which prevail in compatibility markets such as the IT and ICT markets. Additionally, the consequences of introducing gateway or converter devices into a market where a standard has not yet been established are examined. The IRDS standard did not have an installed base, and this hindered its diffusion. The thesis concludes that the IRDS standard was overtaken by new developments such as object oriented technologies and middleware. This was partly because of the slow process of developing standards in traditional organisations, which operate on a consensus basis, and partly because the IRDS standard did not have an installed base. Also, the rise and proliferation of middleware products resulted in exchange mechanisms becoming dominant rather than repository solutions. The research method used in this work is a longitudinal study of the development and diffusion of the ISO/IEC IRDS standard. The research is regarded as a single case study and follows an interpretative epistemological point of view.
Abstract:
‘New Approach’ Directives now govern the health and safety of most products, whether destined for workplace or domestic use. These Directives have been enacted into UK law by various specific legislation relating principally to work equipment, machinery and consumer products. This research investigates whether the risk assessment approach used to ensure the safety of machinery may be applied to consumer products. Crucially, consumer products are subject to the Consumer Protection Act (CPA) 1987, which makes no direct reference to “assessing risk”. This contrasts with the law governing the safety of products used in the workplace, where risk assessment underpins the approach. New Approach Directives are supported by European harmonised standards and, in the case of machinery, further supported by the risk assessment standard EN 1050. The system regulating consumer product safety is discussed, its key elements identified and a graphical model produced. This model incorporates such matters as conformity assessment, the system of regulation, and near miss and accident reporting. A key finding of the research is that New Approach Directives share a common feature of specifying essential performance requirements that provide a hazard prompt-list which can form the basis of a risk assessment (the hazard identification stage). Drawing upon 272 prosecution cases, with thirty examples examined in detail, this research provides evidence that despite the high degree of regulation, unsafe consumer products still find their way onto the market. The research presents a number of risk assessment tools to help Trading Standards Officers (TSOs) prioritise their work, both at the initial inspection stage and when dealing with subsequent enforcement action.
Abstract:
At present there is no standard assessment method for rating and comparing the quality of synthesized speech. This study assesses the suitability of Time Frequency Warping (TFW) modulation for use as a reference device for assessing synthesized speech. Time Frequency Warping modulation introduces timing errors into natural speech that produce perceptual errors similar to those found in synthetic speech. It is proposed that TFW modulation, used in conjunction with a listening effort test, would provide a standard assessment method for rating the quality of synthesized speech. This study identifies the most suitable TFW modulation variable parameter for assessing synthetic speech and assesses the results of several assessment tests that rate examples of synthesized speech in terms of the TFW variable parameter and listening effort. The study also attempts to identify the attributes of speech that differentiate synthetic, TFW modulated and natural speech.
Abstract:
Whether used to assess the functionality of equipment or to determine the accuracy of assays, reference standards are essential for the purposes of standardisation and validation. The ELISPOT assay, developed over thirty years ago, has emerged as a leading immunological assay in the development of novel vaccines for the assessment of efficacy. However, with its widespread use there is a growing demand for a greater level of standardisation across different laboratories. One of the major difficulties in achieving this goal has been the lack of definitive reference standards. This is partly due to the ex vivo nature of the assay, which relies on cells being placed directly into the wells. Thus, the aim of this thesis was to produce an artificial reference standard for use within the assay, based on liposomes. Liposomes are spherical bilayer vesicles with an enclosed aqueous compartment and are therefore models for biological membranes. Initial work examined pre-design considerations in order to produce an optimal formulation that would closely mimic the action of the cells ordinarily placed in the assay. Recognition of the structural differences between liposomes and cells led to the formulation of liposomes with increased density, achieved by using a synthesised cholesterol analogue. By incorporating this cholesterol analogue in liposomes, increased sedimentation rates were observed within the first few hours. The optimal liposome formulation from these studies was composed of 1,2-dipalmitoyl-sn-glycero-3-phosphocholine (DPPC), cholesterol (Chol) and brominated cholesterol (Brchol) at a 16:4:12 µmol ratio, based on a significantly higher (p < 0.01) sedimentation (as determined by a percentage transmission of 59 ± 5.9% compared to the control formulation at 29 ± 12% after four hours). By considering a range of liposome formulations, ‘proof of principle’ for using liposomes as ELISPOT reference standards was shown: recombinant IFN-γ cytokine was successfully entrapped within vesicles of different lipid compositions, which were able to promote spot formation within the ELISPOT assay. Using optimised liposome formulations composed of phosphatidylcholine with or without cholesterol (16 µmol total lipid), further development was undertaken to produce an optimised, scalable protocol for the production of liposomes as reference standards. A linear increase in spot number by the manipulation of cytokine concentration and/or lipid concentration was not possible, potentially due to saturation occurring within the base of the wells. Investigations into storage of the formulations demonstrated the feasibility of freezing and lyophilisation with disaccharide cryoprotectants, but also highlighted the need for further protocol optimisation to achieve a robust reference standard upon storage. Finally, the transfer of small-scale production to a medium lab-scale batch (40 mL) demonstrated that this was feasible within the laboratory using the optimised protocol.
Abstract:
This paper details work carried out to verify the dimensional measurement performance of the Indoor GPS (iGPS) system, a network of Rotary-Laser Automatic Theodolites (R-LATs). Initially, tests were carried out to determine the angular uncertainties of an individual R-LAT transmitter-receiver pair. A method is presented for determining the uncertainty of dimensional measurement for a three-dimensional coordinate measuring machine. An experimental procedure was developed to compare three-dimensional coordinate measurements with calibrated reference points. The reference standard used to calibrate these reference points was a fringe counting interferometer, with the multilateration technique employed to establish three-dimensional coordinates. This is an extension of the established technique of comparing measured lengths with calibrated lengths. The method was found to be practical and able to establish that the expanded uncertainty of the basic iGPS system was approximately 1 mm at a 95% confidence level. Further tests carried out on a highly optimized version of the iGPS system have shown that the coordinate uncertainty can be reduced to 0.25 mm at a 95% confidence level.
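The multilateration step mentioned above, establishing a three-dimensional coordinate from measured lengths to known stations, can be illustrated with a small noise-free sketch. This is a hedged example, not the paper's implementation: the station positions and test point are invented, and the sphere equations are linearized by subtracting the first one, leaving a 3x3 linear system solved here by Cramer's rule.

```python
# Toy multilateration: recover a 3-D point from its distances to four known
# reference stations. Subtracting the first sphere equation |x - s_i| = d_i
# from the others cancels the quadratic term, giving a linear system in x.

def det3(m):
    """Determinant of a 3x3 matrix given as nested lists."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def multilaterate(stations, dists):
    """Solve the linearized system 2(s_i - s_0).x = d_0^2 - d_i^2 + |s_i|^2 - |s_0|^2."""
    s0, d0 = stations[0], dists[0]
    A, b = [], []
    for s, d in zip(stations[1:], dists[1:]):
        A.append([2 * (s[k] - s0[k]) for k in range(3)])
        b.append(d0 * d0 - d * d + sum(s[k] ** 2 - s0[k] ** 2 for k in range(3)))
    D = det3(A)
    x = []
    for col in range(3):  # Cramer's rule: replace one column with b at a time
        M = [row[:] for row in A]
        for r in range(3):
            M[r][col] = b[r]
        x.append(det3(M) / D)
    return x

stations = [(0, 0, 0), (10, 0, 0), (0, 10, 0), (0, 0, 10)]
point = (3.0, 4.0, 5.0)
dists = [sum((p - s) ** 2 for p, s in zip(point, st)) ** 0.5 for st in stations]
print(multilaterate(stations, dists))  # approximately [3.0, 4.0, 5.0]
```

Real systems use more than the minimum number of lengths and a least-squares fit, which is what allows an uncertainty to be attached to the recovered coordinates.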
Abstract:
This paper determines the capability of two photogrammetric systems in terms of their measurement uncertainty in an industrial context. The first system – V-STARS inca3 from Geodetic Systems Inc. – is a commercially available measurement solution. The second system comprises an off-the-shelf Nikon D700 digital camera fitted with a 28 mm Nikkor lens and the research-based Vision Measurement Software (VMS). The uncertainty estimate of these two systems is determined with reference to a calibrated constellation of points determined by a Leica AT401 laser tracker. The calibrated points have an average associated standard uncertainty of 12.4 μm, spanning a maximum distance of approximately 14.5 m. Subsequently, the two systems' uncertainty was determined. V-STARS inca3 had an estimated standard uncertainty of 43.1 μm, thus outperforming its manufacturer's specification; the D700/VMS combination achieved a standard uncertainty of 187 μm.