19 results for validation study

in Aston University Research Archive


Relevance:

70.00%

Publisher:

Abstract:

Lipid peroxidation products such as malondialdehyde, 4-hydroxynonenal and F(2)-isoprostanes are widely used as markers of oxidative stress in vitro and in vivo. This study reports the results of a multi-laboratory validation study by COST Action B35 to assess inter-laboratory and intra-laboratory variation in the measurement of lipid peroxidation. Human plasma samples were exposed to UVA irradiation at different doses (0, 15 and 20 J), encoded and shipped to 15 laboratories, where analyses of malondialdehyde, 4-hydroxynonenal and isoprostanes were conducted. The results demonstrate low within-day variation and good correlation between results obtained on two different days. However, high coefficients of variation were observed between the laboratories. Malondialdehyde determined by HPLC was found to be the most sensitive and reproducible lipid peroxidation product in plasma upon UVA treatment. It is concluded that measurement of malondialdehyde by HPLC has good analytical validity for inter-laboratory studies on lipid peroxidation in human EDTA-plasma samples, although it is acknowledged that this may not translate to biological validity.
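The headline finding rests on coefficients of variation computed within and between laboratories. A minimal sketch of those two statistics, using hypothetical malondialdehyde readings rather than the study's data:

```python
import statistics

def cv_percent(values):
    """Coefficient of variation: SD as a percentage of the mean."""
    return 100 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical malondialdehyde readings (µM) for one UVA dose,
# two replicate days per laboratory -- illustrative values only.
lab_results = {
    "lab_A": {"day1": [1.10, 1.12, 1.08], "day2": [1.11, 1.09, 1.13]},
    "lab_B": {"day1": [1.45, 1.50, 1.47], "day2": [1.44, 1.49, 1.46]},
    "lab_C": {"day1": [0.92, 0.95, 0.90], "day2": [0.93, 0.91, 0.94]},
}

# Intra-laboratory (within-day) CV: variation among replicates on one day.
within_day_cv = {
    lab: cv_percent(days["day1"]) for lab, days in lab_results.items()
}

# Inter-laboratory CV: variation of the laboratory means for the same sample.
lab_means = [
    statistics.mean(days["day1"] + days["day2"])
    for days in lab_results.values()
]
between_lab_cv = cv_percent(lab_means)

print(within_day_cv)   # small values -> good within-day reproducibility
print(between_lab_cv)  # large value  -> high between-laboratory variation
```

With these illustrative numbers the within-day CVs stay below a few percent while the between-laboratory CV exceeds 20%, mirroring the pattern the study reports.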

Relevance:

60.00%

Publisher:

Abstract:

A sizeable amount of the testing in eye care requires either the identification of targets such as letters to assess functional vision, or the subjective evaluation of imagery by an examiner. Computers can render a variety of different targets on their monitors and can be used to store and analyse ophthalmic images. However, existing computing hardware tends to be large, screen resolutions are often too low, and objective assessments of ophthalmic images unreliable. Recent advances in mobile computing hardware and computer-vision systems can be used to enhance clinical testing in optometry. High-resolution touch screens embedded in mobile devices can render targets at a wide variety of distances and can be used to record and respond to patient responses, automating testing methods. This has opened up new opportunities in computerised near vision testing. Equally, new image processing techniques can be used to increase the validity and reliability of objective computer-vision systems. Three novel apps for assessing reading speed, contrast sensitivity and amplitude of accommodation were created by the author to demonstrate the potential of mobile computing to enhance clinical measurement. The reading speed app could present sentences effectively, control illumination and automate the testing procedure for reading speed assessment. Meanwhile, the contrast sensitivity app made use of a bit-stealing technique and a swept-frequency target to rapidly assess a patient's full contrast sensitivity function at both near and far distances. Finally, customised electronic hardware was created and interfaced to an app on a smartphone device to allow free-space amplitude of accommodation measurement. A new geometrical model of the tear film and a ray-tracing simulation of a Placido disc topographer were produced to provide insights into the effect of tear film breakdown on ophthalmic images.
Furthermore, a new computer-vision system, which used a novel eyelash segmentation technique, was created to demonstrate the potential of computer-vision systems for the clinical assessment of tear stability. Studies undertaken by the author to assess the validity and repeatability of the novel apps found that their repeatability was comparable to, or better than, existing clinical methods for reading speed and contrast sensitivity assessment. Furthermore, the apps offered reduced examination times in comparison to their paper-based equivalents. The reading speed and amplitude of accommodation apps correlated highly with existing methods of assessment, supporting their validity. There remain questions over the validity of using a swept-frequency sine-wave target to assess patients' contrast sensitivity functions, as no clinical test provides the same range of spatial frequencies and contrasts, nor equivalent assessment at distance and near. A validation study of the new computer-vision system found that the author's tear metric correlated better with existing subjective measures of tear film stability than those of a competing computer-vision system. However, repeatability was poor in comparison to the subjective measures due to eyelash interference. The new mobile apps, computer-vision system, and studies outlined in this thesis provide further insight into the potential of applying mobile and image processing technology to enhance clinical testing by eye care professionals.
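The bit-stealing technique mentioned in the thesis abstract exploits the unequal luminance contributions of the R, G and B subpixels to obtain gray levels finer than a single 8-bit step. A minimal sketch of the idea, assuming standard Rec. 709 luminance weights (not the author's implementation):

```python
# Relative luminance weights for sRGB primaries (Rec. 709).
WEIGHTS = (0.2126, 0.7152, 0.0722)

def luminance(rgb):
    """Relative luminance of an 8-bit RGB triplet, in 0..255 units."""
    return sum(w * c for w, c in zip(WEIGHTS, rgb))

def bit_stolen_levels(base, spread=1):
    """Enumerate near-gray triplets around `base` and return them
    sorted by luminance, giving sub-single-step gray levels."""
    offsets = range(-spread, spread + 1)
    triplets = [
        (base + r, base + g, base + b)
        for r in offsets for g in offsets for b in offsets
        if all(0 <= base + c <= 255 for c in (r, g, b))
    ]
    return sorted(set(triplets), key=luminance)

levels = bit_stolen_levels(128)
lums = [luminance(t) for t in levels]
steps = [b - a for a, b in zip(lums, lums[1:])]
print(min(s for s in steps if s > 0))  # finest step is well below one 8-bit step
```

Because the perturbed triplets remain within one count of gray, the chromatic error is imperceptible while the effective luminance resolution increases, which is what makes low-contrast targets renderable on an 8-bit display.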

Relevance:

60.00%

Publisher:

Abstract:

Purpose - The paper develops a model of employee innovative behavior, conceptualizing it as distinct from innovation outputs and as a multi-faceted behavior rather than a simple count of 'innovative acts' by employees. It understands individual employee innovative behaviors as a micro-foundation of firm intrapreneurship that is embedded in and influenced by contextual factors such as managerial, organizational and cultural support for innovation. Building from a review of existing employee innovative behavior scales and theoretical considerations, we develop and validate the Innovative Behavior Inventory (IBI) and the Innovation Support Inventory (ISI). Design/methodology/approach - Two pilot studies, a third validation study in the Czech Republic and a fourth cross-cultural validation study using population-representative samples from Switzerland, Germany, Italy and the Czech Republic (N = 2,812 employees and 450 entrepreneurs) were conducted. Findings - Both inventories were reliable and showed factorial, criterion, convergent and discriminant validity as well as cross-cultural equivalence. Employee innovative behavior was supported as comprising idea generation, idea search, idea communication, implementation starting activities, involving others and overcoming obstacles. Managerial support was the most proximal contextual influence on innovative behavior and mediated the effect of organizational support and national culture. Originality/value - The paper advances our understanding of employee innovative behavior as a multi-faceted phenomenon and of the contextual factors influencing it. Where past research typically focuses on convenience samples within a particular country, we offer first robust evidence that our model of employee innovative behavior generalizes across cultures and types of samples.
Our model and the IBI and ISI inventories enable researchers to build a deeper understanding of the important micro-foundation underpinning intrapreneurial behavior in organizations and allow practitioners to identify their organizations’ strengths and weaknesses related to intrapreneurship.

Relevance:

40.00%

Publisher:

Abstract:

Protein carbonyls are widely analysed as a measure of protein oxidation. Several different methods exist for their determination. A previous study had described the orders-of-magnitude variance that arose when protein carbonyls were analysed in a single laboratory by ELISA using different commercial kits. We have further explored the potential causes of variance in carbonyl analysis in a ring study. A soluble protein fraction was prepared from rat liver and exposed to 0, 5 and 15 min of UV irradiation. Lyophilised preparations were distributed to six different laboratories across Europe that routinely undertook protein carbonyl analysis. ELISA and Western blotting techniques detected an increase in protein carbonyl formation between 0 and 5 min of UV irradiation irrespective of the method used. After irradiation for 15 min, less oxidation was detected by half of the laboratories than after 5 min of irradiation. Three of the four ELISA carbonyl results fell within 95% confidence intervals. Likely errors in calculating absolute carbonyl values may be attributed to differences in standardisation. Of up to 88 proteins identified as containing carbonyl groups after tryptic cleavage of irradiated and control liver proteins, only seven were common to all three liver preparations. Lysine and arginine residues modified by carbonyls are likely to be resistant to tryptic proteolysis, so use of a cocktail of proteases may increase the recovery of oxidised peptides. In conclusion, standardisation is critical for carbonyl analysis, and heavily oxidised proteins may not be effectively analysed by any existing technique.

Relevance:

30.00%

Publisher:

Abstract:

Dementia is one of the greatest contemporary health and social care challenges, and novel approaches to the care of its sufferers are needed. New information and communication technologies (ICT) have the potential to assist those caring for people with dementia, through access to networked information and support, tracking and surveillance. This article reports the views about such new technologies of 34 carers of people with dementia. We also held a group discussion with nine carers for respondent validation. The carers' actual use of new ICT was limited, although they thought a gradual increase in the use of networked technology in dementia care was inevitable but would bypass some carers who saw themselves as too old. Carers expressed a general enthusiasm for the benefits of ICT, but usually not for themselves, and they identified several key challenges, including: establishing an appropriate balance between, on the one hand, privacy and autonomy and, on the other, maximising safety; establishing responsibility for and ownership of the equipment, and who bears the costs; the possibility that technological help would mean a loss of valued personal contact; and the possibility that technology would substitute for existing services rather than be complementary. For carers and dementia sufferers to be supported, the expanding use of these technologies should be accompanied by intensive debate of the associated issues.

Relevance:

30.00%

Publisher:

Abstract:

Purpose – Qualitative theory-building approaches, such as the grounded theory method (GTM), are still not widespread or rigorously applied in operations management (OM) research. Yet it is agreed that more systematic observation of current industrial phenomena is necessary to help managers deal with their problems. The purpose of this paper is to provide an example to help guide other researchers on using GTM for theory building in OM research. Design/methodology/approach – A GTM study in the German automotive industry consisting of 31 interviews is followed by a validation stage comprising a survey (110 responses) and a focus group. Findings – The result is an example of conducting GTM research in OM, illustrated by the development of the novel collaborative enterprise governance framework for inter-firm relationship governance in the German automotive industry. Research limitations/implications – GTM is appropriate for qualitative theory-building research, but the resultant theories need further testing. Research is necessary to identify the transferability of the collaborative enterprise governance concept to industries other than automotive, to organisational areas other than R&D, and to product and service settings that are less complex and innovative. Practical implications – The paper helps researchers make more informed use of GTM when engaging in qualitative theory-building research in OM. Originality/value – There is a lack of explicit and well-informed use of GTM in OM research because of poor understanding. This paper addresses this deficiency. The collaborative enterprise governance framework is a significant contribution in an area of growing importance within OM.

Relevance:

30.00%

Publisher:

Abstract:

The thesis presents an experimentally validated modelling study of the flow of combustion air in an industrial radiant tube burner (RTB). The RTB is typically used in industrial heat-treating furnaces. The work was initiated by the need for improvements in burner lifetime and performance, which are related to the fluid mechanics of the combusting flow, and a fundamental understanding of this is therefore necessary. To achieve this, a detailed three-dimensional Computational Fluid Dynamics (CFD) model has been used, validated with experimental air flow, temperature and flue gas measurements. Initially, the work programme is presented, together with the theory behind RTB design and operation and the theory behind swirling flows and methane combustion. NOx reduction techniques are discussed and numerical modelling of combusting flows is detailed in this section. The importance of turbulence, radiation and combustion modelling is highlighted, as well as the numerical schemes that incorporate discretization, finite volume theory and convergence. The study first focuses on the combustion air flow and its delivery to the combustion zone. An isothermal computational model was developed to allow examination of the flow characteristics as the air enters the burner and progresses through the various sections prior to the discharge face in the combustion area. Important features identified include the air recuperator swirler coil, the step ring, the primary/secondary air-splitting flame tube and the fuel nozzle. It was revealed that the effectiveness of the air recuperator swirler is significantly compromised by the need for a generous assembly tolerance. Also, there is a substantial circumferential flow maldistribution introduced by the swirler, but this is effectively removed by the positioning of a ring constriction in the downstream passage.
Computations using the k-ε turbulence model show good agreement with experimentally measured velocity profiles in the combustion zone and confirmed the suitability of the modelling strategy prior to the combustion study. Reasonable mesh independence was obtained with 200,000 nodes. Agreement was poorer with the RNG k-ε and Reynolds Stress models. The study continues to address the combustion process itself and the heat transfer process internal to the RTB. A series of combustion and radiation model configurations were developed, and the optimum combination of the Eddy Dissipation (ED) combustion model and the Discrete Transfer (DT) radiation model was used successfully to validate a burner experimental test. The previously cold-flow-validated k-ε turbulence model was used and reasonable mesh independence was obtained with 300,000 nodes. The combination showed good agreement with temperature measurements in the inner and outer walls of the burner, as well as with flue gas composition measured at the exhaust. The inner tube wall temperature predictions validated the experimental measurements at most of the thermocouple locations, highlighting a small flame bias to one side, although the model slightly over-predicts the temperatures towards the downstream end of the inner tube. NOx emissions were initially over-predicted; however, the use of a combustion flame temperature limiting subroutine allowed convergence to the experimental value of 451 ppmv. With the validated model, the effectiveness of certain RTB features identified previously is analysed, and an analysis of the energy transfers throughout the burner is presented to identify the dominant mechanisms in each region. The optimum turbulence-combustion-radiation model selection was then the baseline for further model development. One of these models, an eccentrically positioned flame tube model, highlights the failure mode of the RTB during long-term operation.
Other models were developed to address NOx reduction and improvement of the flame profile in the burner combustion zone. These included a modified fuel nozzle design, with 12 circular section fuel ports, which demonstrates a longer and more symmetric flame, although with limited success in NOx reduction. In addition, a zero bypass swirler coil model was developed that highlights the effect of the stronger swirling combustion flow. A reduced diameter and a 20 mm forward displaced flame tube model shows limited success in NOx reduction; although the latter demonstrated improvements in the discharge face heat distribution and improvements in the flame symmetry. Finally, Flue Gas Recirculation (FGR) modelling attempts indicate the difficulty of the application of this NOx reduction technique in the Wellman RTB. Recommendations for further work are made that include design mitigations for the fuel nozzle and further burner modelling is suggested to improve computational validation. The introduction of fuel staging is proposed, as well as a modification in the inner tube to enhance the effect of FGR.

Relevance:

30.00%

Publisher:

Abstract:

This thesis considers management decision making at the ward level in hospitals, especially by ward sisters, and the effectiveness of the intervention of a decision support system. Nursing practice theories were related to organisation and management theories in order to conceptualise a decision-making framework for nurse manpower planning and deployment at the ward level. Decision and systems theories were explored to understand the concepts of decision making and the realities of power in an organisation. In essence, the hypothesis was concerned with the changes in patterns of decision making that could occur with the intervention of a decision support system, and proposed that the degree of change would be governed by a set of 'difficulty' factors within wards in a hospital. During the course of the study, a classification of ward management decision making was created, together with the development and validation of measuring instruments to test the research hypothesis. The decision support system used was rigorously evaluated to test whether benefits did accrue from its implementation. Quantitative results from sample wards, together with the qualitative information collected, were used to test this hypothesis, and the outcomes postulated were supported by these findings. The main conclusion from this research is that a more rational approach to management decision making is feasible, using information from a decision support system. However, the wards and ward sisters that need the most assistance, where the 'difficulty' factors in the organisation are highest, benefit the least from this type of system. Organisational reviews are needed on these identified wards, involving managers and doctors, to reduce the levels of un-coordinated activities and disruption.

Relevance:

30.00%

Publisher:

Abstract:

This research project focused upon the design strategies adopted by expert and novice designers. It was based upon a desire to compare the design problem-solving strategies of novices, in this case key stage three pupils studying technology within the United Kingdom National Curriculum, with designers who could be considered to have developed expertise. The findings helped to provide insights into potential teaching strategies to suit novice designers. Verbal protocols were made as samples of expert and novice designers solved a design problem and talked aloud as they worked. The verbalisations were recorded on video tape. The protocols were transcribed and segmented, with each segment being assigned to a predetermined coding system which represented a model of design problem solving. The results of the encoding were analysed, and consideration was also given to the general design strategy and heuristics used by the expert and novice designers. The drawings and models produced during the generation of the protocols were also analysed and considered. A number of significant differences between the problem-solving strategies adopted by the expert and novice designers were identified. First of all, differences were observed in the way expert and novice designers used the problem statement and solution validation during the process. Differences were also identified in the way holistic solutions were generated near the start of the process, and also in the cycles of exploration and the processes of integration. The way design and technological knowledge was used provided further insights into the differences between experts and novices, as did the role of drawing and modelling during the process. In more general terms, differences were identified in the heuristics and overall design strategies adopted by the expert and novice designers. The above findings provided a basis for discussing teaching strategies appropriate for novice designers.
Finally, opportunities for future research were discussed.

Relevance:

30.00%

Publisher:

Abstract:

Damage to insulation materials located near a primary circuit coolant leak may compromise the operation of the emergency core cooling system (ECCS). Insulation material in the form of mineral wool fiber agglomerates (MWFA) may be transported to the containment sump strainers, where they may block or penetrate the strainers. Though the impact of MWFA on the pressure drop across the strainers is minimal, corrosion products formed over time may also accumulate in the fiber cakes on the strainers, which can lead to a significant increase in the strainer pressure drop and result in cavitation in the ECCS. An experimental and theoretical study performed by the Helmholtz-Zentrum Dresden-Rossendorf and the Hochschule Zittau/Görlitz is investigating the phenomena that may be observed in the containment vessel during a primary circuit coolant leak. The study entails the generation of fiber agglomerates, the determination of their transport properties in single- and multi-effect experiments and the long-term effect that corrosion and erosion of the containment internals by the coolant has on the strainer pressure drop. The focus of this paper is on the verification and validation of numerical models that can predict the transport of MWFA. A number of pseudo-continuous dispersed phases of spherical wetted agglomerates represent the MWFA. The size, density, relative viscosity of the fluid-fiber agglomerate mixture and turbulent dispersion all affect how the fiber agglomerates are transported. In the cases described here, the size is kept constant while the density is modified. This definition affects both the terminal velocity and the volume fraction of the dispersed phases. Note that the relative viscosity is only significant at high concentrations.
Three single effect experiments were used to provide validation data on the transport of the fiber agglomerates under conditions of sedimentation in quiescent fluid, sedimentation in a horizontal flow and suspension in a horizontal flow. The experiments were performed in a rectangular column for the quiescent fluid and a racetrack type channel that provided a near uniform horizontal flow. The numerical models of sedimentation in the column and the racetrack channel found that the sedimentation characteristics are consistent with the experiments. For channel suspension, the heavier fibers tend to accumulate at the channel base even at high velocities, while lighter phases are more likely to be transported around the channel.
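With the agglomerate size fixed, the assumed wetted density sets the terminal velocity of each dispersed phase. A minimal sketch of that dependence, assuming the Stokes (creeping-flow) drag regime and illustrative values rather than the study's parameters:

```python
def stokes_terminal_velocity(d, rho_p, rho_f=998.0, mu=1.0e-3, g=9.81):
    """Terminal settling velocity (m/s) of a sphere in the Stokes
    (creeping-flow) regime: v = (rho_p - rho_f) * g * d**2 / (18 * mu).
    Defaults approximate water at room temperature."""
    return (rho_p - rho_f) * g * d**2 / (18.0 * mu)

# Fixed agglomerate size, two candidate wetted densities (illustrative).
d = 1.0e-4  # m (0.1 mm), small enough for the Stokes regime to be plausible
for rho_p in (1010.0, 1050.0):
    v = stokes_terminal_velocity(d, rho_p)
    print(f"rho_p = {rho_p:6.1f} kg/m3 -> v_t = {v:.2e} m/s")
```

The heavier phase settles several times faster at the same size, which is consistent with the observation that the heavier fibers accumulate at the channel base while lighter phases stay suspended.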

Relevance:

30.00%

Publisher:

Abstract:

Objective - The aim of the current study was to validate child (PFA-QL) and parent–proxy (PFA-QL-PF) versions of the scale in a specialist allergy clinic and in parents of children with food allergy. Methods - For the clinic sample, a generic QoL scale (PedsQL) and the PFA-QL were completed by 103 children (aged 6–16 years) with peanut or tree nut allergy; test–retest reliability of the PFA-QL was tested in 50 stable patients. For the non-clinical sample, 756 parents of food-allergic children completed the PFA-QL-PF, the Child Health Questionnaire (CHQ-PF50), the Food Allergy Quality of Life Parental Burden Scale (FAQL-PB) and a Food Allergy Impact Measure. Results - The PFA-QL and PFA-QL-PF had good internal consistency (α's of 0.77–0.82), and there was moderate-to-good agreement between the generic and disease-specific questionnaires. The PFA-QL was stable over time in the clinic sample, and in both samples girls were reported to have poorer QoL than boys. Conclusions - The PFA-QL and PFA-QL-PF are reliable and valid scales for use in both clinical and non-clinical populations. Unlike other available tools, they were developed and validated in the UK and thus provide a culture-specific choice for research, clinical trials and clinical practice in the UK. Validation in other countries is now needed.
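The internal consistency figures quoted are Cronbach's α. A minimal sketch of the statistic, using hypothetical item scores rather than the study's questionnaire data:

```python
import statistics

def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score columns
    (each column = one questionnaire item across respondents):
    alpha = k/(k-1) * (1 - sum(item variances) / variance of totals)."""
    k = len(items)
    item_vars = [statistics.variance(col) for col in items]
    totals = [sum(scores) for scores in zip(*items)]
    total_var = statistics.variance(totals)
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Hypothetical 4-item QoL responses from 6 respondents (illustrative only).
items = [
    [3, 4, 2, 5, 4, 3],
    [3, 5, 2, 4, 4, 3],
    [2, 4, 1, 5, 3, 3],
    [3, 4, 2, 4, 5, 2],
]
print(round(cronbach_alpha(items), 2))
```

Values of α around 0.7 or above are conventionally read as acceptable internal consistency, which is the benchmark the quoted 0.77–0.82 range clears.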

Relevance:

30.00%

Publisher:

Abstract:

Purpose - Food allergy can have a profound effect on the quality of life (QoL) of the family. The Food Allergy Quality of Life—Parental Burden Questionnaire (FAQL-PB) was developed on a US sample to assess the QoL of parents with food-allergic children. The aim of this study was to examine the reliability and validity of the FAQL-PB in a UK sample and to assess the effect of asking about parental burden in the last week compared with parental burden in general, with no time limit for recall given. Methods - A total of 1,200 parents who had at least one child with food allergy were sent the FAQL-PB and the Child Health Questionnaire (CHQ-PF50); only 63% responded. Results - Factor analysis of the FAQL-PB revealed two factors: limitations on life and emotional distress. The total scale and the two sub-scales had high internal reliability (all α > 0.85). There were small-to-moderate but significant correlations between total FAQL-PB scores and the health and parental impact measures on the CHQ-PF50 (p < 0.01). Significantly greater parental burden was reported for the version with no time limit compared with the time-limited version (p < 0.01). Conclusions - The FAQL-PB is a reliable and valid measure for use in the UK. The scale could be used in the clinic to assess physical and emotional quality of life in addition to the impact on total quality of life.

Relevance:

30.00%

Publisher:

Abstract:

Purpose: To determine whether curve-fitting analysis of the ranked segment distributions of topographic optic nerve head (ONH) parameters, derived using the Heidelberg Retina Tomograph (HRT), provides a more effective statistical descriptor to differentiate the normal from the glaucomatous ONH. Methods: The sample comprised 22 normal control subjects (mean age 66.9 years; S.D. 7.8) and 22 glaucoma patients (mean age 72.1 years; S.D. 6.9) confirmed by reproducible visual field defects on the Humphrey Field Analyser. Three 10° images of the ONH were obtained using the HRT. The mean topography image was determined, and the HRT software was used to calculate the rim volume, rim area to disc area ratio, normalised rim area to disc area ratio and retinal nerve fibre cross-sectional area for each patient at 10° sectoral intervals. The values were ranked in descending order, and each ranked-segment curve of ordered values was fitted using the least squares method. Results: There was no difference in disc area between the groups. The group mean cup-disc area ratio was significantly lower in the normal group (0.204 ± 0.16) compared with the glaucoma group (0.533 ± 0.083) (p < 0.001). The visual field indices, mean deviation and corrected pattern S.D., were significantly greater (p < 0.001) in the glaucoma group (-9.09 dB ± 3.3 and 7.91 ± 3.4, respectively) compared with the normal group (-0.15 dB ± 0.9 and 0.95 dB ± 0.8, respectively). Univariate linear regression provided the best overall fit to the ranked segment data. The equation parameters of the regression line manually applied to the normalised rim area-disc area and the rim area-disc area ratio data correctly classified 100% of normal subjects and glaucoma patients. In this study sample, the regression analysis of ranked segment parameters was more effective than conventional ranked segment analysis, in which glaucoma patients were misclassified in approximately 50% of cases.
Further investigation in larger samples will enable the calculation of confidence intervals for normality. These reference standards will then need to be investigated in an independent sample to fully validate the technique. Conclusions: Using a curve-fitting approach to fit ranked segment curves retains information relating to the topographic nature of neural loss. Such methodology appears to overcome some of the deficiencies of conventional ranked segment analysis and, subject to validation in larger-scale studies, may potentially be of clinical utility for detecting and monitoring glaucomatous damage. © 2007 The College of Optometrists.
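The ranked-segment regression can be sketched as follows: sort the 10° sectoral values in descending order and fit a least-squares line to the resulting curve, whose parameters then serve as the descriptor. The sector profiles and the slope comparison below are synthetic illustrations, not the HRT data or the authors' classification rule:

```python
def ranked_segment_fit(sector_values):
    """Rank sectoral values in descending order and fit a straight
    line (ordinary least squares) to the ranked-segment curve.
    Returns (slope, intercept)."""
    ranked = sorted(sector_values, reverse=True)
    n = len(ranked)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(ranked) / n
    sxy = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ranked))
    sxx = sum((x - x_mean) ** 2 for x in xs)
    slope = sxy / sxx
    return slope, y_mean - slope * x_mean

# Hypothetical rim-area/disc-area profiles over 36 x 10-degree sectors:
# a glaucomatous ONH tends to fall off more steeply when ranked.
normal = [0.80 - 0.004 * i for i in range(36)]
glaucoma = [0.55 - 0.012 * i for i in range(36)]

s_normal, _ = ranked_segment_fit(normal)
s_glaucoma, _ = ranked_segment_fit(glaucoma)
print(s_normal, s_glaucoma)  # steeper negative slope for the glaucomatous profile
```

Fitting the ranked curve, rather than comparing individual ranked segments, summarises the whole sectoral profile in two parameters while preserving the shape information that reflects topographic neural loss.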

Relevance:

30.00%

Publisher:

Abstract:

As optical coherence tomography (OCT) becomes widespread, validation and characterization of systems become important. Reference standards are required to qualitatively and quantitatively measure the performance of different systems, and would also allow the performance degradation of a system over time to be monitored. In this report, the properties of femtosecond-inscribed structures from three different systems for making suitable OCT characterization artefacts (phantoms) are analyzed. The parameter test samples are directly inscribed inside transparent materials. The structures are characterized using an optical microscope and a swept-source OCT. The high reproducibility of the inscribed structures shows high potential for producing multi-modality OCT calibration and characterization phantoms, such that a single artefact can be used to characterize multiple performance parameters such as resolution, linearity, distortion, and imaging depth. © 2012 SPIE.