47 results for Measurement and calculation of GFR
in Aston University Research Archive
Abstract:
Coke oven liquor is a toxic wastewater produced in large quantities by the iron and steel and coking industries, and gives rise to major effluent treatment problems in those industries. Conscious of the potentially serious environmental impact of the discharge of such wastes, pollution control agencies in many countries have imposed progressively more stringent quality requirements for the discharge of the treated waste. The most common means of treating the waste is the activated sludge process. Problems with achieving consistently satisfactory treatment by this process have been experienced in the past. The need to improve the quality of the discharge of the treated waste prompted attempts by TOMLINS to model the process using adenosine triphosphate (ATP) as a measure of biomass, but these were unsuccessful. This thesis describes work that was carried out to determine the significance of ATP in the activated sludge treatment of the waste. The use of ATP measurements in wastewater treatment was reviewed. Investigations were conducted into the ATP behaviour of the batch activated sludge treatment of two major components of the waste, phenol and thiocyanate, and of the continuous activated sludge treatment of the liquor itself, using laboratory-scale apparatus. On the basis of these results, equations were formulated to describe the significance of ATP as a measure of activity and biomass in the treatment system. These were used as the basis for proposals to use ATP as a control parameter in the activated sludge treatment of coke oven liquor, and of wastewaters in general. These proposals had relevance both to the treatment of the waste in the reactor and to the settlement of the sludge produced in the secondary settlement stage of the treatment process.
Abstract:
Productivity at the macro level is a complex concept but also arguably the most appropriate measure of economic welfare. Currently, there is limited research available on the various approaches that can be used to measure it, and especially on the relative accuracy of those approaches. This thesis has two main objectives: firstly, to detail some of the most common productivity measurement approaches and assess their accuracy under a number of conditions; and secondly, to present an up-to-date application of productivity measurement and provide some guidance on selecting between sometimes conflicting productivity estimates. With regard to the first objective, the thesis provides a discussion of the issues specific to macro-level productivity measurement and of the strengths and weaknesses of the three main types of approach available, namely index-number approaches (represented by Growth Accounting), non-parametric distance functions (DEA-based Malmquist indices) and parametric production functions (COLS- and SFA-based Malmquist indices). The accuracy of these approaches is assessed through simulation analysis, which yielded some interesting findings. Probably the most important were that deterministic approaches are quite accurate even when the data are moderately noisy; that no approach was accurate when noise was more extensive; that functional form misspecification has a severe negative effect on the accuracy of the parametric approaches; and finally that increased volatility in inputs and prices from one period to the next adversely affects all the approaches examined. The application was based on the EU KLEMS (2008) dataset and revealed that the different approaches do in fact result in different productivity change estimates, at least for some of the countries assessed. To assist researchers in selecting between conflicting estimates, a new three-step selection framework is proposed, based on the findings of the simulation analyses and on established diagnostics/indicators. An application of this framework is also provided, based on the EU KLEMS dataset.
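As an illustration of the index-number approach mentioned above, the sketch below computes a growth-accounting estimate of TFP growth under a Cobb-Douglas aggregate with a constant capital share. It is a minimal sketch under stated assumptions: the function name and example figures are hypothetical and are not drawn from the thesis or from EU KLEMS.

```python
import numpy as np

def tfp_growth(y, k, l, alpha=0.3):
    """Growth-accounting TFP growth between two periods.

    y, k, l: two-element sequences of output, capital and labour levels.
    alpha:   capital's share of income (assumed constant here).

    Returns dln(TFP) = dln(Y) - alpha*dln(K) - (1 - alpha)*dln(L).
    """
    dy, dk, dl = (np.log(v[1] / v[0]) for v in (y, k, l))
    return dy - alpha * dk - (1 - alpha) * dl

# Hypothetical example: 4% output growth, 2% capital, 1% labour
print(tfp_growth([100, 104], [50, 51], [30, 30.3]))  # ~0.026
```

Because TFP is obtained as a residual, any measurement noise in output or inputs flows straight into the estimate, which is consistent with the simulation finding above that deterministic approaches degrade as noise grows.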
Abstract:
Measuring and compensating the pivot points of five-axis machine tools is challenging and very time-consuming. This paper presents a newly developed approach for the automatic measurement and compensation of pivot point positional errors on five-axis machine tools. Machine rotary axis errors are measured using a circular test. The method has been tested on five-axis machine tools with a swivel-table configuration. Results show that up to 99% of the positional error of the rotary axis can be compensated using this approach.
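By way of illustration only (the paper's own algorithm is not reproduced here), data from a circular test can be reduced to a least-squares circle fit, with the offset of the fitted centre from the nominal pivot position read as a pivot point positional error. The sketch below uses synthetic points and the classical Kasa fit; all names and values are hypothetical.

```python
import numpy as np

def fit_circle(x, y):
    """Least-squares (Kasa) circle fit.

    Rewrites (x-a)^2 + (y-b)^2 = r^2 as x^2 + y^2 = 2ax + 2by + c,
    which is linear in (a, b, c); then r = sqrt(c + a^2 + b^2).
    """
    M = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    rhs = x**2 + y**2
    (a, b, c), *_ = np.linalg.lstsq(M, rhs, rcond=None)
    return a, b, np.sqrt(c + a**2 + b**2)

# Synthetic circular test with a true pivot offset of (0.012, -0.008) mm
theta = np.linspace(0, 2 * np.pi, 36, endpoint=False)
x = 0.012 + 100.0 * np.cos(theta)
y = -0.008 + 100.0 * np.sin(theta)
a, b, r = fit_circle(x, y)
print(f"estimated pivot error: ({a:.4f}, {b:.4f}) mm, radius {r:.3f} mm")
```

The Kasa fit is linear and therefore fast and robust for small offsets; an orthogonal-distance fit could replace it where radial errors are large.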
Abstract:
The solubility of telmisartan (form A) in nine organic solvents (chloroform, dichloromethane, ethanol, toluene, benzene, 2-propanol, ethyl acetate, methanol and acetone) was determined by a laser monitoring technique at temperatures from 277.85 to 338.35 K. The solubility of telmisartan (form A) in all nine solvents increased with temperature, as did the rate of increase, except in chloroform and dichloromethane. The mole fraction solubility in chloroform is higher than that in dichloromethane, and both are one order of magnitude higher than those in the other seven solvents at the experimental temperatures. The solubility data were correlated with the modified Apelblat and λh equations. The results show that the λh equation is in better agreement with the experimental data than the Apelblat equation. The relative root mean square deviations (σ) of the λh equation are in the range from 0.004 to 0.45 %. The dissolution enthalpies, entropies and Gibbs energies of telmisartan in these solvents were estimated by the van't Hoff equation and the Gibbs equation. The melting point and the fusion enthalpy of telmisartan were determined by differential scanning calorimetry.
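For illustration, two of the correlations named above can be fitted with ordinary least squares, since the modified Apelblat equation, ln x = A + B/T + C ln T, and the van't Hoff relation, ln x = -ΔH/(RT) + ΔS/R, are linear in their parameters (the λh equation requires a nonlinear solver). The sketch below uses hypothetical solubility data, not the paper's measurements.

```python
import numpy as np

# Hypothetical solubility data (T in K, x in mole fraction) for illustration
T = np.array([278.0, 293.0, 308.0, 323.0, 338.0])
x = np.array([1.2e-4, 2.5e-4, 5.1e-4, 9.8e-4, 1.8e-3])

# Modified Apelblat: ln x = A + B/T + C*ln(T), linear in (A, B, C)
M = np.column_stack([np.ones_like(T), 1 / T, np.log(T)])
(A, B, C), *_ = np.linalg.lstsq(M, np.log(x), rcond=None)

# van't Hoff: ln x = -dH/(R*T) + dS/R, so a line in 1/T gives dH and dS
Rgas = 8.314  # J/(mol K)
slope, intercept = np.polyfit(1 / T, np.log(x), 1)
dH, dS = -slope * Rgas, intercept * Rgas

print(f"Apelblat: A={A:.2f}, B={B:.0f}, C={C:.2f}")
print(f"van't Hoff: dH={dH / 1000:.1f} kJ/mol, dS={dS:.1f} J/(mol K)")
```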
Abstract:
Background/aims Macular pigment is thought to protect the macula against exposure to light and oxidative stress, both of which may play a role in the development of age-related macular degeneration. The aim was to clinically evaluate a novel cathode-ray-tube-based method for measurement of macular pigment optical density (MPOD) known as apparent motion photometry (AMP). Methods The authors took repeat readings of MPOD centrally (0°) and at 3° eccentricity for 76 healthy subjects (mean (±SD) age 26.5±13.2 years, range 18–74 years). Results The overall mean MPOD for the cohort was 0.50±0.24 at 0° and 0.28±0.20 at 3° eccentricity; these values were significantly different (t=-8.905, p<0.001). The coefficients of repeatability were 0.60 and 0.48 for the 0° and 3° measurements respectively. Conclusions The data suggest that when the same operator takes repeated 0° AMP MPOD readings over time, only changes of more than 0.60 units can be classed as clinically significant. In other words, AMP is not suitable for monitoring changes in MPOD over time, as increases of this magnitude would not be expected even in response to dietary modification or nutritional supplementation.
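For context, a common (Bland-Altman) definition of the coefficient of repeatability is 1.96 times the standard deviation of the test-retest differences; changes smaller than this cannot be distinguished from measurement error. A minimal sketch, with hypothetical repeat readings rather than the study's data:

```python
import numpy as np

def coefficient_of_repeatability(first, second):
    """Bland-Altman coefficient of repeatability:
    1.96 * SD of the test-retest differences."""
    d = np.asarray(first) - np.asarray(second)
    return 1.96 * np.std(d, ddof=1)

# Hypothetical repeat 0-degree MPOD readings for five subjects
r1 = [0.55, 0.40, 0.72, 0.31, 0.50]
r2 = [0.48, 0.52, 0.60, 0.45, 0.41]
print(coefficient_of_repeatability(r1, r2))  # ~0.24
```

Set against a cohort mean MPOD of 0.50 with SD 0.24, a repeatability coefficient of 0.60 is large, which is the basis of the authors' conclusion.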
Abstract:
8-Oxo-7,8-dihydro-2'-deoxyguanosine (8-oxo-dG) is an increasingly popular marker of in vivo oxidative damage to DNA. A random-sequence 21-mer oligonucleotide, 5'-TCA GXC GTA CGT GAT CTC AGT-3', in which X was 8-oxo-guanine (8-oxo-G), was purified, and accurate determination of the oxidised base was confirmed by a ³²P-end labelling strategy. The lyophilised material was analysed for its absolute content of 8-oxo-dG by several major laboratories in Europe and one in Japan. Most laboratories using HPLC-ECD underestimated, while GC-MS-SIM overestimated, the level of the lesion. HPLC-ECD measured the target value with the greatest accuracy. The results also suggest that none of the procedures can accurately quantitate levels of 1 in 10⁶ 8-oxo-(d)G in DNA.
Abstract:
This paper begins by suggesting that when considering Corporate Social Responsibility (CSR), even CSR as justified in terms of the business case, stakeholders are of great importance to corporations. In the UK, the Company Law Review (DTI, 2002) has suggested that it is appropriate for UK companies to be managed on the basis of an enlightened shareholder approach. Within this approach, the importance of stakeholders other than shareholders is recognised as being instrumental in providing shareholder value. Given the importance of these other stakeholders, it is then important that corporate management measure and manage stakeholder performance. Two general approaches could be adopted for this: the use of monetary values to reflect stakeholder value or cost, and the use of non-monetary values. To consider these approaches further, this paper examines their possible use for two stakeholder groups: namely, employees and the environment. It concludes that there are ethical and practical difficulties with calculating economic values for stakeholder resources, and so prefers a multi-dimensional approach to stakeholder performance measurement that does not use economic valuation.
Abstract:
OBJECTIVE: To assess the effect of using different risk calculation tools on how general practitioners and practice nurses evaluate the risk of coronary heart disease with the clinical data routinely available in patients' records. DESIGN: Subjective estimates of the risk of coronary heart disease and the results of four different methods of risk calculation were compared with each other and with a reference standard calculated with the Framingham equation; calculations were based on a sample of patients' records, randomly selected from groups at risk of coronary heart disease. SETTING: General practices in central England. PARTICIPANTS: 18 general practitioners and 18 practice nurses. MAIN OUTCOME MEASURES: Agreement of the results of risk estimation and risk calculation with the reference calculation; agreement of general practitioners with practice nurses; sensitivity and specificity of the different methods of risk calculation in detecting patients at high or low risk of coronary heart disease. RESULTS: Only a minority of patients' records contained all of the risk factors required for the formal calculation of the risk of coronary heart disease (concentrations of high density lipoprotein (HDL) cholesterol were present in only 21%). Agreement of risk calculations with the reference standard was moderate (kappa=0.33 to 0.65 for both practice nurses and general practitioners, depending on the calculation tool), with a trend towards underestimation of risk. Moderate agreement was seen between the risks calculated by general practitioners and practice nurses for the same patients (kappa=0.47 to 0.58). The British charts gave the most sensitive results for risk of coronary heart disease (practice nurses 79%, general practitioners 80%) and also the most specific results for practice nurses (100%), whereas the Sheffield table was the most specific method for general practitioners (89%). CONCLUSIONS: Routine calculation of the risk of coronary heart disease in primary care is hampered by the poor availability of data on risk factors. General practitioners and practice nurses are able to evaluate the risk of coronary heart disease with only moderate accuracy. Data on risk factors need to be collected systematically to allow the use of the most appropriate calculation tools.
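As a hedged sketch of the agreement statistics reported above, the following computes Cohen's kappa for two raters and the sensitivity/specificity of high-risk classifications against a reference standard. The data are invented for illustration and do not reproduce the study's records.

```python
import numpy as np

def cohens_kappa(a, b):
    """Cohen's kappa for two raters' categorical judgements."""
    a, b = np.asarray(a), np.asarray(b)
    po = np.mean(a == b)  # observed agreement
    pe = sum(np.mean(a == c) * np.mean(b == c) for c in np.union1d(a, b))
    return (po - pe) / (1 - pe)

def sens_spec(predicted_high, reference_high):
    """Sensitivity and specificity of high-risk calls vs a reference."""
    p, r = np.asarray(predicted_high), np.asarray(reference_high)
    sens = np.mean(p[r]) if r.any() else float("nan")
    spec = np.mean(~p[~r]) if (~r).any() else float("nan")
    return sens, spec

# Invented example: risk bands from one rater vs the reference standard
gp = ["high", "low", "high", "low", "low"]
ref = ["high", "high", "high", "low", "low"]
print(cohens_kappa(gp, ref))  # ~0.62

tool = np.array([True, False, True, False, False])  # tool says high risk
gold = np.array([True, True, True, False, False])   # reference: high risk
print(sens_spec(tool, gold))  # (0.667, 1.0)
```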
Abstract:
We demonstrate a new approach to in situ measurement of femtosecond laser pulse induced changes in glass, enabling the reconstruction in 3D of the induced complex permittivity modification. The technique can be used to provide single-shot and time-resolved quantitative measurements with micron-scale spatial resolution.
Abstract:
The topic of my research is consumer brand equity (CBE). My thesis is that the success or otherwise of a brand is better viewed from the consumers' perspective. I specifically focus on consumers as a unique group of stakeholders whose involvement with brands is crucial to the overall success of branding strategy. To this end, this research examines the constellation of ideas on brand equity that have hitherto been offered by various scholars. Through a systematic integration of the concepts and practices identified by these scholars (concepts and practices such as competitiveness, consumer searching, consumer behaviour, brand image, brand relevance, consumer perceived value, etc.), this research identifies CBE as a construct that is shaped, directed and made valuable by the beliefs, attitudes and subjective preferences of consumers. This is done by examining the criteria on the basis of which consumers evaluate brands and make brand purchase decisions. Understanding the criteria by which consumers evaluate brands is crucial for several reasons. First, as the basis upon which consumers select brands changes with consumption norms and technology, understanding the consumer choice process will help in formulating branding strategy. Secondly, an understanding of these criteria will help in formulating a creative and innovative agenda for 'new brand' propositions. Thirdly, it will also influence firms' ability to stimulate and mould the plasticity of demand for existing brands. In examining these three issues, this thesis presents a comprehensive account of CBE: the first issue concerns the content of CBE; the second addresses the problem of how to develop a reliable and valid measuring instrument for CBE; and the third examines the structural and statistical relationships between the factors of CBE and the consequences of CBE for consumer perceived value (CPV). Using LISREL-SIMPLIS 8.30, the study finds direct and significant influential links between consumer brand equity and consumer value perception.
Abstract:
The objective of this thesis is to investigate, through an empirical study, the different functions of highways maintenance departments and to suggest methods by which road maintenance work could be carried out more efficiently by utilising resources of men, materials and plant to the utmost advantage. This is particularly important under the present circumstances of national financial difficulty, which have resulted in continuous cuts in public expenditure. To achieve this objective, the researcher carried out a survey among several highways authorities by means of questionnaire and interview. The information so collected was analysed in order to understand the actual, practical situation within highways maintenance departments, to highlight any existing problems, and to try to answer the question of how they could become more efficient. According to the results obtained from the questionnaire and the interviews, and the analysis of these results, the researcher concludes that it is in the management system where least has been done, and where problems exist and are most complex. The management of highways maintenance departments argue that the reasons for their problems include both financial and organisational difficulties, apart from the political aspect and the nature of the activities undertaken. The researcher believes that this ought to necessitate improving management's analytical tools and techniques in order to achieve the most effective way of performing each activity. To this end, the researcher recommends several related procedures to be adopted by the management of highways maintenance departments. These recommendations, arising from the study, involve technical, practical and human aspects. These are essential factors of which management should be aware - and certainly should not neglect - in order to achieve its objective of improved productivity in highways maintenance departments.
Abstract:
We demonstrate a novel time-resolved Q-factor measurement technique and its application to the analysis of optical packet switching systems with high information spectral density. For the first time, we report time-resolved Q-factor measurements of 42.6 Gbit/s AM-PSK and DQPSK modulated packets generated by an SGDBR laser under wavelength switching. The time-dependent degradation of Q-factor performance during the switching transient was analysed and found to correlate with the different laser switching characteristics in each case.
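For background, the Q-factor summarises the received mark and space level statistics, Q = (μ1 - μ0)/(σ1 + σ0), and maps to an expected bit error rate under a Gaussian-noise assumption, BER = 0.5 erfc(Q/√2); a time-resolved measurement evaluates Q as a function of position within the packet. The sketch below is a minimal illustration with synthetic samples, not the paper's measurement procedure.

```python
import numpy as np
from math import erfc, sqrt

def q_factor(ones, zeros):
    """Q = (mu1 - mu0) / (sigma1 + sigma0) from sampled mark/space levels."""
    m1, m0 = np.mean(ones), np.mean(zeros)
    s1, s0 = np.std(ones, ddof=1), np.std(zeros, ddof=1)
    return (m1 - m0) / (s1 + s0)

def ber_from_q(q):
    """Gaussian-noise estimate of the bit error rate."""
    return 0.5 * erfc(q / sqrt(2))

# Synthetic mark/space samples shortly after a wavelength switch
rng = np.random.default_rng(0)
ones = rng.normal(1.0, 0.08, 1000)   # received '1' levels
zeros = rng.normal(0.1, 0.05, 1000)  # received '0' levels
q = q_factor(ones, zeros)
print(q, ber_from_q(q))  # Q ~ 6.9, BER ~ 3e-12
```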