47 results for Measurement and calculation of GFR
Abstract:
This thesis begins with a review of the literature on team-based working in organisations, highlighting the variations in research findings and the need for greater precision in our measurement of teams. It continues with an illustration of the nature and prevalence of real and pseudo team-based working, by presenting results from a large sample of secondary data from the UK National Health Service. Results demonstrate that ‘real teams’ have an important and significant impact on reducing many adverse work-related safety outcomes. Based on both theoretical and methodological limitations of existing approaches, the thesis moves on to provide a clarification and extension of the ‘real team’ construct, demarcating this from other (pseudo-like) team typologies on a sliding scale rather than as a simple dichotomy. A conceptual model for defining real teams is presented, providing a theoretical basis for the development of a scale on which teams can be measured for varying extents of ‘realness’. A new twelve-item scale is developed and tested with three samples of data comprising 53 undergraduate teams, 52 postgraduate teams, and 63 public sector teams from a large UK organisation. Evidence for the content, construct and criterion-related validity of the real team scale is examined over seven separate validation studies. Theoretical, methodological and practical implications of the real team scale are then discussed.
Abstract:
We develop a theoretical method to calculate jitter statistics of interacting solitons. Applying this approach, we have derived the non-Gaussian probability density function and calculated the bit-error rate as a function of noise level, initial separation and phase difference between solitons.
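For orientation, a hedged reference point rather than a result of this work: in the customary Gaussian-noise treatment of on-off-keyed soliton transmission, the bit-error rate is tied to the decision Q-factor by the textbook relation below; the non-Gaussian probability density function derived here replaces that assumption for interacting solitons, which is why the bit-error rate must be recomputed as a function of noise level, separation and phase difference.

```latex
% Gaussian baseline superseded by the non-Gaussian PDF derived in this work
\mathrm{BER}_{\text{Gauss}} = \tfrac{1}{2}\,\operatorname{erfc}\!\left(\frac{Q}{\sqrt{2}}\right)
```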
Abstract:
In dimensional metrology, the largest source of measurement uncertainty is often thermal variation. Dimensional measurements are currently scaled linearly, using ambient temperature measurements and coefficients of thermal expansion, to ideal metrology conditions at 20 °C. This scaling is particularly difficult to implement with confidence in large volumes, as the temperature is unlikely to be uniform, resulting in thermal gradients. A number of well-established computational methods are used in the design phase of product development for the prediction of thermal and gravitational effects, and these could be used to a greater extent in metrology. This paper outlines the theory of how physical measurements of dimension and temperature can be combined more comprehensively throughout the product lifecycle, from design through to the manufacturing phase. The Hybrid Metrology concept is also introduced: an approach to metrology that promises to improve product and equipment integrity in future manufacturing environments. The Hybrid Metrology System combines various state-of-the-art physical dimensional and temperature measurement techniques with established computational methods to better predict thermal and gravitational effects.
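A minimal sketch of the linear scaling step described above, assuming a uniform part temperature and a constant coefficient of thermal expansion (the very assumptions that break down in large volumes with thermal gradients); the function and example values are illustrative, not drawn from the paper.

```python
def scale_to_20c(measured_length_mm: float,
                 part_temperature_c: float,
                 expansion_coeff_per_c: float) -> float:
    """Rescale a length measured at part_temperature_c to the 20 degC
    reference condition using linear thermal expansion."""
    return measured_length_mm / (
        1.0 + expansion_coeff_per_c * (part_temperature_c - 20.0))

# Example: a 5 m aluminium artefact (alpha ~ 23e-6 /degC) measured at 23 degC
print(scale_to_20c(5000.0, 23.0, 23e-6))  # ~4999.66 mm at 20 degC
```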
Abstract:
The uncertainty of measurements must be quantified and considered in order to prove conformance with specifications and make other meaningful comparisons based on measurements. While there is a consistent methodology for the evaluation and expression of uncertainty within the metrology community, industry frequently uses the alternative Measurement Systems Analysis (MSA) methodology. This paper sets out to clarify the differences between uncertainty evaluation and MSA, and presents a novel hybrid methodology for industrial measurement which enables a correct evaluation of measurement uncertainty while utilising the practical tools of MSA. In particular, the use of Gage R&R ANOVA and Attribute Gage studies within a wider uncertainty evaluation framework is described. This enables in-line measurement data to be used to establish repeatability and reproducibility without time-consuming repeatability studies being carried out, while maintaining a complete consideration of all sources of uncertainty and therefore enabling conformance to be proven with a stated level of confidence. Such a rigorous approach to product verification will become increasingly important in the era of the Light Controlled Factory, with metrology acting as the driving force to achieve right-first-time, highly automated manufacture of high-value, large-scale products such as aircraft, spacecraft and renewable power generation structures.
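A hedged sketch of the hybrid idea, under the usual GUM convention of combining standard uncertainties in quadrature and applying a coverage factor: the repeatability and reproducibility terms would come from an ANOVA Gage R&R study, while the remaining components come from the wider uncertainty budget. The function name and example values are illustrative, not the paper's notation.

```python
import math

def expanded_uncertainty(repeatability: float,
                         reproducibility: float,
                         other_components: list[float],
                         coverage_factor: float = 2.0) -> float:
    """Combine standard uncertainties in quadrature (root sum of
    squares) and apply the coverage factor k."""
    components = [repeatability, reproducibility, *other_components]
    combined = math.sqrt(sum(u * u for u in components))
    return coverage_factor * combined

# Example: Gage R&R yields repeatability 0.010 mm and reproducibility
# 0.004 mm; calibration and thermal terms are taken from the budget.
U = expanded_uncertainty(0.010, 0.004, [0.005, 0.003])
print(f"U (k=2) = {U:.4f} mm")  # ~0.0245 mm
```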
Abstract:
Five-axis machine tools are becoming increasingly popular as customers demand more complex machined parts. In high-value manufacturing, machine tools are essential for producing high-accuracy products. High-accuracy manufacturing requires producing parts repeatably and precisely, in compliance with the defined design specifications. The performance of machine tools is often affected by geometric errors from a variety of causes, including incorrect tool offsets, errors in the centres of rotation and thermal growth. As a consequence, it can be difficult to produce highly accurate parts consistently. It is therefore essential to ensure that machine tools are verified in terms of their geometric and positioning accuracy. Once machine tools are verified, the resulting numerical values of positional accuracy and process capability can be used to define design-for-verification rules and algorithms, so that machined parts can be produced without scrap and with little or no after-process measurement. In this paper the benefits of machine tool verification are listed, and a case study is used to demonstrate the implementation of robust machine tool performance measurement and diagnostics using a ballbar system.
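As an illustrative sketch only (the case study uses a commercial ballbar system and its own diagnostics): a ballbar records the radial deviation of the machine from a nominally circular test path, from which simple figures such as circularity can be computed. The data and helper names below are assumptions for the example.

```python
import math

def radial_deviations(points_xy, nominal_radius_mm):
    """Deviation of each measured point from the nominal ballbar radius."""
    return [math.hypot(x, y) - nominal_radius_mm for x, y in points_xy]

def circularity(deviations):
    """Peak-to-valley spread of the radial deviations."""
    return max(deviations) - min(deviations)

# Example: four samples around a nominal 100 mm ballbar circle
pts = [(100.003, 0.0), (0.0, 99.998), (-100.001, 0.0), (0.0, -100.004)]
devs = radial_deviations(pts, 100.0)
print(f"circularity = {circularity(devs) * 1000:.1f} um")  # 6.0 um
```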
Abstract:
The thesis is concerned with the development and testing of a mathematical model of a distillation process in which the components react chemically. The formaldehyde-methanol-water system was selected, and only the reversible reactions between formaldehyde and water giving methylene glycol, and between formaldehyde and methanol producing hemiformal, were assumed to occur under the distillation conditions. Accordingly, the system has been treated as a five-component system. The vapour-liquid equilibrium calculations were performed by iteratively solving the thermodynamic relationships expressing the phase equilibria together with the stoichiometric equations expressing the chemical equilibria. Using optimisation techniques, the Wilson single parameters and Henry's constants were calculated for binary systems containing formaldehyde, which was assumed to be a supercritical component, whilst Wilson binary parameters were calculated for the remaining binary systems. Thus the phase equilibria for the formaldehyde system could be calculated using these parameters, and good accuracy was obtained when calculated values were compared with experimental values. The distillation process was modelled using the mass and energy balance equations together with the phase equilibria calculations. The plate efficiencies were obtained from a modified A.I.Ch.E. Bubble Tray method. The resulting equations were solved by an iterative plate-to-plate calculation based on the Newton-Raphson method. Experiments were carried out in a 76 mm I.D., eight sieve plate distillation column, and the results were compared with the mathematical model calculations. Overall, good agreement was obtained, but some discrepancies were observed in the concentration profiles; these may have been caused by limited physical property data and a limited understanding of the reaction mechanisms. The model equations were solved in the form of modular computer programs. Although they were written to describe the steady-state distillation with simultaneous chemical reaction of the formaldehyde system, the approach used may be of wider application.
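A hedged sketch of the numerical core described above: a Newton-Raphson iteration driving an equilibrium residual to zero. A single reversible reaction (formaldehyde + water to methylene glycol) with an assumed mole-fraction equilibrium constant stands in for the thesis's full five-component, plate-to-plate model; all numbers are illustrative.

```python
def newton(f, x0, tol=1e-10, h=1e-8, max_iter=50):
    """One-dimensional Newton-Raphson with a numerical derivative."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x
        dfdx = (f(x + h) - fx) / h  # forward-difference derivative
        x -= fx / dfdx
    raise RuntimeError("Newton-Raphson did not converge")

def residual(extent, n_hcho=1.0, n_h2o=10.0, keq=30.0):
    """Chemical-equilibrium residual in the reaction extent (mol):
    x_glycol - K * x_hcho * x_h2o, with total moles shrinking by the
    extent as two molecules combine into one."""
    total = n_hcho + n_h2o - extent
    x_hcho = (n_hcho - extent) / total
    x_h2o = (n_h2o - extent) / total
    x_gly = extent / total
    return x_gly - keq * x_hcho * x_h2o

print(f"equilibrium extent = {newton(residual, x0=0.5):.4f} mol")
```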
Abstract:
Purpose – The purpose of this paper is to explore the role and relevance of external standards in demonstrating the value and impact of academic library services to their stakeholders.
Design/methodology/approach – Two UK standards, Charter Mark and Customer Service Excellence, are evaluated via an exploratory case study, employing multiple data collection techniques. Methods and results of phases 1-2 of a three-phase research project are outlined.
Findings – Despite some limitations, standards may assist the manager in demonstrating the value, impact and quality of academic libraries in a recessional environment. Active engagement and partnership with customers is imperative if academic libraries are to be viewed as vital to their parent organisations and thus survive.
Originality/value – This paper provides a systematic evaluation of the role of external accreditation standards in measuring academic library service value and impact.
Abstract:
The book aims to introduce the reader to DEA in the most accessible manner possible. It is specifically aimed at those who have had no prior exposure to DEA and wish to learn its essentials, how it works, its key uses, and the mechanics of using it; the latter includes using DEA software. Students on degree or training courses will find the book especially helpful. The same is true of practitioners engaging in comparative efficiency assessments and performance management within their organisation. Examples are used throughout the book to help the reader consolidate the concepts covered. Table of contents: List of Tables. List of Figures. Preface. Abbreviations. 1. Introduction to Performance Measurement. 2. Definitions of Efficiency and Related Measures. 3. Data Envelopment Analysis under Constant Returns to Scale: Basic Principles. 4. Data Envelopment Analysis under Constant Returns to Scale: General Models. 5. Using Data Envelopment Analysis in Practice. 6. Data Envelopment Analysis under Variable Returns to Scale. 7. Assessing Policy Effectiveness and Productivity Change Using DEA. 8. Incorporating Value Judgements in DEA Assessments. 9. Extensions to Basic DEA Models. 10. A Limited User Guide for Warwick DEA Software. Author Index. Topic Index. References.
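For readers who want the mechanics in code, a hedged sketch of the model the book builds up to: a basic input-oriented CCR (constant returns to scale) DEA efficiency score solved as a linear programme. This is the textbook envelopment formulation, not the Warwick DEA Software named in the contents, and the data are invented.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, unit):
    """Input-oriented CCR efficiency of one decision-making unit.
    X: (m inputs x n units), Y: (s outputs x n units)."""
    m, n = X.shape
    s = Y.shape[0]
    # Decision variables: [theta, lambda_1, ..., lambda_n]
    c = np.zeros(n + 1)
    c[0] = 1.0  # minimise theta
    # Inputs:  sum_j lambda_j x_ij - theta x_i,unit <= 0
    A_in = np.hstack([-X[:, [unit]], X])
    # Outputs: -sum_j lambda_j y_rj <= -y_r,unit
    A_out = np.hstack([np.zeros((s, 1)), -Y])
    b = np.concatenate([np.zeros(m), -Y[:, unit]])
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=b,
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun

# Example: three units with one input and one output
X = np.array([[2.0, 4.0, 8.0]])
Y = np.array([[1.0, 2.0, 3.0]])
print([round(ccr_efficiency(X, Y, j), 3) for j in range(3)])  # [1.0, 1.0, 0.75]
```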
Abstract:
Many dietary factors have been associated with a decreased risk of developing cancer. One potential mechanism by which these factors, chemopreventors, protect against cancer may be via alteration of carcinogen metabolism. The broccoli constituent sulforaphane (1-isothiocyanato-4-(methylsulfinyl)butane) (CH3-SO-(CH2)4-NCS) has been isolated as a potential inducer of phase II detoxification enzymes and also protects rodents against 9,10-dimethyl-1,2-benz[a]anthracene-induced mammary tumours. The ability of sulforaphane to also modulate phase I activation enzymes (cytochrome P450, CYP450) was studied here. Sulforaphane was synthesised with an overall yield of 15%, essentially via 1-methylsulfinylphthalimidobutane, which was oxidised to the sulfoxide moiety. Deprotective removal of phthalimide yielded the amine, which was converted into sulforaphane by reaction with N,N'-thionocarbonyldiimidazole. Purity (95%) was checked by 1H-NMR, 13C-NMR, and infrared and mass spectrometry. Sulforaphane was a competitive inhibitor of CYP2E1 in acetone-induced Sprague-Dawley rat microsomes (Ki = 37.9 ± 4.5 μM), as measured by the p-nitrophenol hydroxylase assay. Ethoxyresorufin deethylase (EROD) activity, a measure of CYP1A activity, was also inhibited by sulforaphane (100 μM), but the inhibition was not competitive, and a preincubation time-dependence was observed. In view of these results, the capacity of sulforaphane to inhibit N-nitrosodimethylamine (NDMA)-induced genotoxicity (CYP2E1-mediated) was studied using mouse liver activation systems. Sulforaphane (>0.8 μM) inhibited the mutagenicity of NDMA (4.4 mg/plate) in Salmonella typhimurium strain TA100 after pre-incubation for 45 min with acetone-induced liver 9000 g supernatants from Balb/c mice. Unscheduled DNA synthesis induced by NDMA (33.5 μM) in mouse hepatocytes was also reduced by sulforaphane in a concentration-dependent manner (0.064-20 μM). Sulforaphane was not genotoxic itself in any of these systems and was cytotoxic only at high concentrations (>0.5 mM and >40 μM, respectively). The ability of sulforaphane to modulate the orthologous human enzymes was studied using a human epithelial liver cell line (THLE) expressing individual human CYP450 isoenzymes. Using the Comet assay (a measure of DNA strand breakage under alkaline conditions), NDMA (0.01-1 μg/ml) and IQ (0.1-10 μg/ml) were used to produce strand breaks in T5-2E1 cells (expressing human CYP2E1) and T5-1A2 cells (expressing human CYP1A2), respectively; however, no response was observed in T5-neo cells (without CYP450 cDNA transfection). Sulforaphane inhibited both NDMA- and IQ-induced DNA strand breakage in a concentration-dependent manner (0.1-10 μM). The inhibition of metabolic activation as a basis for the antigenotoxic action of sulforaphane in these systems (bacteria, rodent hepatocytes and human cells) is further supported by the inability of this chemopreventor to influence NaN3 mutagenicity in S. typhimurium and H2O2-induced DNA strand breakage in T5-neo cells. These findings suggest that inhibition of CYP2E1 and CYP1A by sulforaphane may contribute to its chemoprotective potential.
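For context, the standard competitive-inhibition form of the Michaelis-Menten equation (textbook kinetics, not a derivation from this thesis) shows what the reported Ki means: the inhibitor raises the apparent Km without changing Vmax.

```latex
v \;=\; \frac{V_{\max}\,[S]}{K_m\!\left(1 + \dfrac{[I]}{K_i}\right) + [S]},
\qquad K_i = 37.9 \pm 4.5\ \mu\mathrm{M} \text{ reported here for sulforaphane on CYP2E1}
```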
Abstract:
A long period grating is interrogated with a fibre Bragg grating using a derivative spectroscopy technique. A quasi-linear relationship between the output of the sensing scheme and the curvature experienced by the long period grating is demonstrated, with a sensitivity of 5.05 m and with an average curvature resolution of 2.9 × 10⁻² m⁻¹. In addition, the feasibility of multiplexing an in-line series of long period gratings with this interrogation scheme is demonstrated with two pairs of fibre Bragg gratings and long period gratings. With this arrangement the cross-talk error between channels was less than ±2.4 × 10⁻³ m⁻¹.
Abstract:
This thesis describes research into business user involvement in the information systems application building process. The main interest of this research is in establishing and testing techniques to quantify the relationships between identified success factors and the outcome effectiveness of 'business user development' (BUD). The availability of a mechanism to measure the levels of the success factors, and quantitatively relate them to outcome effectiveness, is important in that it provides an organisation with the capability to predict and monitor effects on BUD outcome effectiveness. This is particularly important in an era where BUD levels have risen dramatically, the benefits of user-centred information systems development are recognised as significant, and awareness of the risks of uncontrolled BUD activity is becoming more widespread. This research targets the measurement and prediction of BUD success factors and implementation effectiveness for particular business users. A questionnaire instrument and analysis technique have been tested and developed, constituting a tool for predicting and monitoring BUD outcome effectiveness, based on the BUDES (Business User Development Effectiveness and Scope) research model, which is introduced and described in this thesis. The questionnaire instrument is designed for completion by 'business users', the target community being more explicitly defined as 'people who primarily have a business role within an organisation'. The instrument, named BUD ESP (Business User Development Effectiveness and Scope Predictor), can readily be used with survey participants, and has been shown to give meaningful and representative results.
Abstract:
Respiration is a complex activity. If the relationship between all neurological and skeletomuscular interactions were perfectly understood, an accurate dynamic model of the respiratory system could be developed and the interaction between different inputs and outputs could be investigated in a straightforward fashion. Unfortunately, this is not the case and does not appear to be viable at this time. In addition, the provision of appropriate sensor signals for such a model would be a considerably invasive task. Useful quantitative information with respect to respiratory performance can be gained from non-invasive monitoring of chest and abdomen motion. Currently available devices are not well suited to spirometric measurement for ambulatory monitoring. A sensor matrix measurement technique is investigated to identify suitable sensing elements on which to base an upper body surface measurement device that monitors respiration. This thesis is divided into two main areas of investigation: model-based and geometry-based surface plethysmography. In the first instance, chapter 2 deals with an array of tactile sensors used as a progression of existing, previously investigated volumetric measurement schemes based on models of respiration. Chapter 3 details a non-model-based geometrical approach to surface (and hence volumetric) profile measurement. Later sections of the thesis concentrate upon the development of a functioning prototype sensor array. To broaden the application area, the study has been conducted as it would be for a generically configured sensor array. In experimental form, the system's performance on group estimation compares favourably with existing systems on volumetric performance. In addition, it provides continuous transient measurement of respiratory motion within an acceptable accuracy using approximately 20 sensing elements. Because of the potentially small size and complexity of the system, it is possible to deploy it as a fully mobile ambulatory monitoring device, which may be used outside the laboratory. It provides a means by which to isolate coupled physiological functions and thus allows individual contributions to be analysed separately, facilitating greater understanding of respiratory physiology and improved diagnostic capabilities. The outcome of the study is the basis for a three-dimensional surface contour sensing system that is suitable for respiratory function monitoring and has the prospect, with future development, of being incorporated into a garment-based clinical tool.
Abstract:
A new surface analysis technique has been developed which has a number of benefits compared to conventional Low Energy Ion Scattering Spectrometry (LEISS). A major potential advantage, arising from the absence of charge-exchange complications, is the possibility of quantification. The instrumentation that has been developed also offers the possibility of unique studies concerning the interaction between low energy ions and atoms and solid surfaces. From these studies it may also be possible, in principle, to generate sensitivity factors to quantify LEISS data. The instrumentation, which is referred to as a Time-of-Flight Fast Atom Scattering Spectrometer (ToFFASS), has been developed to investigate these conjectures in practice. The development involved a number of modifications to an existing instrument, allowing samples to be bombarded with a monoenergetic pulsed beam of either atoms or ions, and providing the capability to analyse the spectra of scattered atoms and ions separately. Further to this, a system was designed and constructed to allow the incident, exit and azimuthal angles of the particle beam to be varied independently. The key development was that of a pulsed, mass-filtered atom source, which was developed by a cyclic process of design, modelling and experimentation. Although it was possible to demonstrate the unique capabilities of the instrument, problems relating to surface contamination prevented the measurement of the neutralisation probabilities. However, these problems appear to be technical rather than scientific in nature, and could be readily resolved given the appropriate resources. Experimental spectra obtained from a number of samples demonstrate some fundamental differences between the scattered ion and neutral spectra. For practical non-ordered surfaces the ToF spectra are more complex than their LEISS counterparts. This is particularly true for helium scattering, where it appears, in the absence of detailed computer simulation, that quantitative analysis is limited to ordered surfaces. Despite this limitation, the ToFFASS instrument opens the way for quantitative analysis of the 'true' surface region for a wider range of surface materials.
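For reference, the binary-collision kinematics common to LEISS and ToF fast atom scattering (a textbook relation, not specific to this instrument): a projectile of mass m1 and energy E0 elastically scattered through angle theta by a surface atom of mass m2 retains an energy fraction that identifies m2, which is what makes the scattered spectra mass-sensitive.

```latex
\frac{E_1}{E_0} \;=\; \left( \frac{\cos\theta + \sqrt{(m_2/m_1)^2 - \sin^2\theta}}{1 + m_2/m_1} \right)^{\!2}
\qquad (m_2 \ge m_1)
```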