13 results for Quality control and inspection

in Aston University Research Archive


Relevance:

100.00%

Publisher:

Abstract:

The measurement of 8-oxo-7,8-dihydro-2'-deoxyguanosine (8-oxo-dG) is an increasingly popular marker of in vivo oxidative damage to DNA. A random-sequence 21-mer oligonucleotide, 5'-TCA GXC GTA CGT GAT CTC AGT-3', in which X was 8-oxo-guanine (8-oxo-G), was purified, and the content of the oxidised base was accurately determined by a 32P-end-labelling strategy. The lyophilised material was analysed for its absolute content of 8-oxo-dG by several major laboratories in Europe and one in Japan. Most laboratories using HPLC-ECD underestimated the level of the lesion, while those using GC-MS-SIM overestimated it; HPLC-ECD measured the target value with the greatest accuracy. The results also suggest that none of the procedures can accurately quantitate levels of 1 in 10^6 8-oxo-(d)G in DNA.

Relevance:

100.00%

Publisher:

Abstract:

The relationships between locus of control, the quality of exchanges between subordinates and leaders (leader-member exchange, LMX), and a variety of work-related reactions (intrinsic/extrinsic job satisfaction, work-related well-being, and organizational commitment) are examined. It was predicted that people with an internal locus of control develop better-quality relations with their manager and that this, in turn, results in more favourable work-related reactions. Results from two different samples (N = 404 and N = 51) supported this prediction, and also showed that LMX either fully or partially mediated the relationship between locus of control and all the work-related reactions.

Relevance:

100.00%

Publisher:

Abstract:

Background: Atrial fibrillation (AF) patients with a high risk of stroke are recommended anticoagulation with warfarin. However, the benefit of warfarin depends upon the time spent within the target therapeutic range (TTR) of the international normalised ratio (INR), 2.0 to 3.0. AF patients possess limited knowledge of their disease and of warfarin treatment, and this can impact on INR control. Education can improve patients' understanding of warfarin therapy and of the factors which affect INR control. Methods/Design: A randomised controlled trial of an intensive educational intervention. The intervention will consist of group sessions (2 to 8 patients) containing standardised information about the risks and benefits associated with oral anticoagulation (OAC) therapy, lifestyle interactions and the importance of monitoring and control of the INR. Information will be presented within an 'expert-patient' focussed DVD, a revised educational booklet and patient worksheets. 200 warfarin-naïve patients who are eligible for warfarin will be randomised to either the intervention or usual care group. All patients must have ECG-documented AF and be eligible for warfarin (according to the NICE AF guidelines). Exclusion criteria are: age < 18 years, contraindication(s) to warfarin, history of warfarin use, valvular heart disease, cognitive impairment, inability to speak or read English, and disease likely to cause death within 12 months. The primary endpoint is time spent within the TTR. Secondary endpoints include measures of quality of life (AF-QoL-18), anxiety and depression (HADS), knowledge of AF and anticoagulation, beliefs about medication (BMQ) and illness representations (IPQ-R). Clinical outcomes, including bleeding, stroke and interruption to anticoagulation, will be recorded. All outcome measures will be assessed at baseline and at 1, 2, 6 and 12 months post-intervention. Discussion: More data are needed on the clinical benefit of educational intervention with AF patients receiving warfarin. Trial registration: ISRCTN93952605
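
The primary endpoint, time in therapeutic range, is often computed by linearly interpolating between successive INR readings (the Rosendaal method), although the protocol abstract does not state which calculation this trial uses. The sketch below is a minimal illustration of that approach under that assumption; the dates, INR values and function name are hypothetical.

```python
from datetime import date

# Hypothetical INR measurements for one patient: (date, INR value).
readings = [
    (date(2024, 1, 1), 1.8),
    (date(2024, 1, 15), 2.4),
    (date(2024, 2, 5), 3.3),
    (date(2024, 2, 26), 2.6),
]

LOW, HIGH = 2.0, 3.0  # target therapeutic range for INR


def ttr_rosendaal(readings, low=LOW, high=HIGH):
    """Percentage of time in range, interpolating linearly between
    consecutive INR readings (Rosendaal-style calculation, illustrative)."""
    in_range_days = 0.0
    total_days = 0.0
    for (d0, inr0), (d1, inr1) in zip(readings, readings[1:]):
        days = (d1 - d0).days
        if days <= 0:
            continue
        total_days += days
        for step in range(days):
            # Estimate the INR on each day within the interval.
            inr = inr0 + (inr1 - inr0) * step / days
            if low <= inr <= high:
                in_range_days += 1
    return 100.0 * in_range_days / total_days if total_days else 0.0


print(f"TTR (interpolated): {ttr_rosendaal(readings):.1f}%")
```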

Relevance:

100.00%

Publisher:

Abstract:

In recent years, freshwater fish farmers have come under increasing pressure from the Water Authorities to control the quality of their farm effluents. This project aimed to investigate methods of treating aquacultural effluent in an efficient and cost-effective manner, and to incorporate the knowledge gained into an Expert System which could then be used in an advice service to farmers. From the results of this research it was established that sedimentation and the use of low pollution diets are the only cost-effective methods of controlling the quality of fish farm effluents. Settlement has been extensively investigated and it was found that the removal of suspended solids in a settlement pond is only likely to be effective if the inlet solids concentration is in excess of 8 mg/litre. The probability of good settlement can be enhanced by keeping the ratio of length/retention time (a form of mean fluid velocity) below 4.0 metres/minute. The removal of BOD requires inlet solids concentrations in excess of 20 mg/litre to be effective, and this is seldom attained on commercial fish farms. Settlement, generally, does not remove appreciable quantities of ammonia from effluents, but algae can absorb ammonia by nutrient uptake under certain conditions. The use of low pollution, high performance diets gives pollutant yields which are low when compared with published figures obtained by many previous workers. Two Expert Systems were constructed, both of which diagnose possible causes of poor effluent quality on fish farms and suggest solutions. The first system uses knowledge gained from a literature review and the second employs the knowledge obtained from this project's experimental work. Consent details for over 100 fish farms were obtained from the public registers kept by the Water Authorities. Large variations in policy from one Authority to the next were found. These data have been compiled in a computer file for ease of comparison.
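
As a simple illustration of how the empirical thresholds reported above might be applied when screening a settlement pond design, the sketch below checks the inlet solids concentration against the 8 mg/litre (solids removal) and 20 mg/litre (BOD removal) figures, and the length/retention-time ratio against the 4.0 metres/minute guideline. The function and parameter names are hypothetical; only the thresholds come from the abstract.

```python
def assess_settlement_pond(inlet_solids_mg_l, pond_length_m, retention_time_min):
    """Screen a settlement pond design against the empirical thresholds
    reported in the abstract (illustrative only)."""
    # "Mean fluid velocity" expressed as pond length divided by retention time.
    mean_velocity = pond_length_m / retention_time_min  # metres/minute
    return {
        "solids_removal_likely_effective": inlet_solids_mg_l > 8.0,   # > 8 mg/litre inlet solids
        "bod_removal_likely_effective": inlet_solids_mg_l > 20.0,     # > 20 mg/litre inlet solids
        "velocity_within_guideline": mean_velocity < 4.0,             # < 4.0 m/min
        "mean_velocity_m_per_min": round(mean_velocity, 2),
    }


# Example: a 40 m pond with a 15 minute retention time and 12 mg/litre inlet solids.
print(assess_settlement_pond(inlet_solids_mg_l=12.0, pond_length_m=40.0, retention_time_min=15.0))
```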

Relevance:

100.00%

Publisher:

Abstract:

Objectives - Powdered and granulated particulate materials make up most of the ingredients of pharmaceuticals and are often at risk of undergoing unwanted agglomeration, or caking, during transport or storage. This is particularly acute when bulk powders are exposed to extreme swings in temperature and relative humidity, which is now common as drugs are produced and administered in increasingly hostile climates and are stored for longer periods of time prior to use. This study explores the possibility of using a uniaxial unconfined compression test to compare the strength of caked agglomerates exposed to different temperatures and relative humidities. It is part of a longer-term study to construct a protocol to predict the caking tendency of a new bulk material from individual particle properties. The main challenge is to develop techniques that provide repeatable results yet are presented simply enough to be useful to a wide range of industries. Methods - Powdered sucrose, a major pharmaceutical ingredient, was poured into a split die and exposed to high and low relative humidity cycles at room temperature. The typical ranges were 20-30% for the lower value and 70-80% for the higher value. The outer die casing was then removed and the resultant agglomerate was subjected to an unconfined compression test using a plunger fitted to a Zwick compression tester. The force against displacement was logged so that the dynamics of failure as well as the failure load of the sample could be recorded. The experimental matrix included varying the number of cycles, the difference between the maximum and minimum relative humidity, the height and diameter of the samples, and the particle size. Results - Trends showed that the tensile strength of the agglomerates increased with the number of cycles and also with the more extreme swings in relative humidity. This agrees with previous work on alternative methods of measuring the tensile strength of sugar agglomerates formed from humidity cycling (Leaper et al. 2003). Conclusions - The results show that at the very least the uniaxial tester is a good comparative tester to examine the caking tendency of powdered materials, with a simple arrangement and operation that are compatible with the requirements of industry. However, further work is required to continue to optimize the height/diameter ratio during tests.
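
The abstract describes logging force against displacement and recording the failure load; a common way to reduce such a trace is to take the peak force and normalise it by the sample cross-section to obtain a nominal unconfined strength. The study's exact strength calculation is not stated, so the sketch below, with hypothetical data and function names, is purely illustrative.

```python
import math


def unconfined_strength(force_n, displacement_mm, sample_diameter_mm):
    """Estimate the failure load and a nominal unconfined strength
    from a logged force-displacement trace (illustrative reduction only)."""
    failure_load = max(force_n)                        # peak force before collapse, N
    failure_displacement = displacement_mm[force_n.index(failure_load)]
    area_mm2 = math.pi * (sample_diameter_mm / 2.0) ** 2
    strength_mpa = failure_load / area_mm2             # N/mm^2 == MPa
    return failure_load, failure_displacement, strength_mpa


# Hypothetical trace for a 25 mm diameter sucrose agglomerate.
force = [0.0, 4.1, 9.8, 14.6, 17.2, 12.3]       # N
displacement = [0.0, 0.1, 0.2, 0.3, 0.4, 0.5]   # mm

load, disp, strength = unconfined_strength(force, displacement, sample_diameter_mm=25.0)
print(f"Failure load {load:.1f} N at {disp:.2f} mm -> nominal strength {strength * 1000:.1f} kPa")
```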

Relevance:

100.00%

Publisher:

Abstract:

The open content creation process has proven itself to be a powerful and influential way of developing text-based content, as demonstrated by the success of Wikipedia and related sites. Distributed individuals independently edit, revise, or refine content, thereby creating knowledge artifacts of considerable breadth and quality. Our study explores the mechanisms that control and guide the content creation process and develops an understanding of open content governance. The repertory grid method is employed to systematically capture the experiences of individuals involved in the open content creation process and to determine the relative importance of the diverse control and guiding mechanisms. Our findings illustrate the important control and guiding mechanisms and highlight the multifaceted nature of open content governance. A range of governance mechanisms is discussed with regard to the varied levels of formality, the different loci of authority, and the diverse interaction environments involved. Limitations and opportunities for future research are provided.

Relevance:

100.00%

Publisher:

Abstract:

Quality, production and technological innovation management rank among the most important matters of concern to modern manufacturing organisations. They can provide companies with the decisive means of gaining a competitive advantage, especially within industries where there is an increasing similarity in product design and manufacturing processes. The papers in this special issue of the International Journal of Technology Management have all been selected as examples of how aspects of quality, production and technological innovation can help to improve competitive performance. Most are based on presentations made at the UK Operations Management Association's Sixth International Conference, held at Aston University, at which the theme was 'Getting Ahead Through Technology and People'. At the conference itself over 80 papers were presented by authors from 15 countries around the world. Among the many topics addressed within the conference theme, technological innovation, quality and production management emerged as attracting the greatest concern and interest of delegates, particularly those from industry.

For any new initiative to be implemented successfully, it should be led from the top of the organisation. Achieving the desired level of commitment from top management can, however, be a difficulty. In the first paper of this issue, Mackness investigates this question by explaining how systems thinking can help. In the systems approach, properties such as 'emergence', 'hierarchy', 'communication' and 'control' are used to assist top managers in preparing for change. Mackness's paper is then complemented by Iijima and Hasegawa's contribution, in which they investigate the development of Quality Information Management (QIM) in Japan. They present the idea of a Design Review and demonstrate how it can be used to trace and reduce quality-related losses. The next paper on the subject of quality is by Whittle and colleagues. It relates to total quality and the process of culture change within organisations. Using the findings of investigations carried out in a number of case study companies, they describe four generic models which have been identified as characterising methods of implementing total quality within existing organisation cultures. Boaden and Dale's paper also relates to the management of quality, but looks specifically at the construction industry, where it has been found there is still some confusion over the role of Quality Assurance (QA) and Total Quality Management (TQM). They describe the results of a questionnaire survey of forty companies in the industry and compare them to similar work carried out in other industries. Szakonyi's contribution then completes this group of papers, which all relate specifically to the question of quality. His concern is with the two ways in which R&D or engineering managers can work on improving quality: the first is by improving it in the laboratory, while the second is by working with other functions to improve quality in the company.

The next group of papers in this issue all address aspects of production management. Umeda's paper proposes a new manufacturing-oriented simulation package for production management which provides important information for both the design and operation of manufacturing systems. A simulation for production strategy in a Computer Integrated Manufacturing (CIM) environment is also discussed.
This paper is then followed by a contribution by Tanaka and colleagues, in which they consider loading schedules for manufacturing orders in a Material Requirements Planning (MRP) environment. They compare mathematical programming with a knowledge-based approach, and comment on their relative effectiveness for different practical situations. Engstrom and Medbo's paper then looks at a particular aspect of production system design, namely the question of devising group working arrangements for assembly with new product structures. Using the case of a Swedish vehicle assembly plant where long cycle assembly work has been adopted, they advocate the use of a generally applicable product structure which can be adapted to suit individual local conditions. In the last paper of this particular group, Tay considers how automation has affected production efficiency in Singapore. Using data from ten major industries he identifies several factors which are positively correlated with efficiency, with capital intensity being of greatest interest to policy makers.

The two following papers examine the case of electronic data interchange (EDI) as a means of improving the efficiency and quality of trading relationships. Banerjee and Banerjee consider a particular approach to material provisioning for production systems using orderless inventory replenishment. Using the example of a single supplier and multiple buyers, they develop an analytical model which is applicable for the exchange of information between trading partners using EDI. They conclude that EDI-based inventory control can be attractive from economic as well as other standpoints, and that the approach is consistent with, and can be instrumental in moving towards, just-in-time (JIT) inventory management. Slacker's complementary viewpoint on EDI is from the perspective of the quality relationship between the customer and supplier. Based on the experience of Lucas, a supplier within the automotive industry, he concludes that both banks and trading companies must take responsibility for the development of payment mechanisms which satisfy the requirements of quality trading.

The three final papers of this issue relate to technological innovation and are all country based. Berman and Khalil report on a survey of US technological effectiveness in the global economy. The importance of education is supported in their conclusions, although it remains unclear to what extent the US government can play a wider role in promoting technological innovation and new industries. The role of technology in national development is taken up by Martinsons and Valdemars, who examine the case of the former Soviet Union. The failure to successfully infuse technology into Soviet enterprises is seen as a factor in that country's demise, and it is anticipated that the newly liberalised economies will be able to encourage greater technological creativity. This point is then taken up in Perminov's concluding paper, which looks in detail at Russia. Here a similar analysis is made of the Soviet Union's technological decline, but a development strategy is also presented within the context of the change from a centralised to a free market economy.

The papers included in this special issue of the International Journal of Technology Management each represent a unique and particular contribution to their own specific area of concern.
Together, however, they also argue for, or demonstrate, the general improvements in competitive performance that can be achieved through the application of modern principles and practice to the management of quality, production and technological innovation.

Relevance:

100.00%

Publisher:

Abstract:

Purpose: The aim of this article is to detail the correlation between quality management, specifically its tools and critical success factors, and performance in terms of primary operational and secondary organisational performance. Design/methodology/approach: Survey data from the UK and Turkey were analysed using exploratory factor analysis, structural equation modelling and regression analysis. Findings: The results show that quality management has a significant and positive impact on both primary and secondary performance; that Turkish and UK attitudes to quality management are similar; and that quality management is widely practised in manufacturing and service industries but has more statistical emphasis in the manufacturing sector. The main challenge for making quality management practice more effective lies in an appropriately balanced use of the different tools and critical success factors. Originality/value: This study takes a novel approach by: (i) exploring the relationship between primary operational and secondary organisational performance, (ii) using service and manufacturing data and (iii) making a cross-country comparison between the UK (a developed economy) and Turkey (a developing economy). Limitations: A detailed contrast is provided between only two countries. © 2013 Copyright Taylor and Francis Group, LLC.

Relevance:

100.00%

Publisher:

Abstract:

Due to copyright restrictions, this item is only available for consultation at Aston University Library and Information Services with prior arrangement.

Relevance:

100.00%

Publisher:

Abstract:

Due to copyright restrictions, this item is only available for consultation at Aston University Library and Information Services with prior arrangement.

Relevance:

100.00%

Publisher:

Abstract:

Due to copyright restrictions, this item is only available for consultation at Aston University Library and Information Services with prior arrangement.

Relevance:

100.00%

Publisher:

Abstract:

Purpose – The purpose of this paper is to consider hierarchical control as a mode of governance and to analyse the extent of control exhibited by central government over local government through the best value (BV) and comprehensive performance assessment (CPA) performance regimes. Design/methodology/approach – This paper utilises Ouchi's framework and, specifically, his articulation of bureaucratic or hierarchical control in the move towards achievement of organisational objectives. Hierarchical control may be inferred from the extent of “command and control” by central government, the use of rewards and sanctions, alignment to government priorities, and discrimination of performance. Findings – CPA represents a more sophisticated performance regime than BV in the governance of local authorities by central government. In comparison to BV, CPA involved less scope for dialogue with local government prior to introduction, closer inspection of and direction of support toward poorer-performing authorities, and more alignment to government priorities in the weightings attached to service blocks. Originality/value – The paper focuses upon the hierarchic/bureaucratic mode of governance as articulated by Ouchi and expands on this mode in order to analyse shifts in performance regimes in the public sector.

Relevance:

100.00%

Publisher:

Abstract:

This paper presents a framework for considering quality control of volunteered geographic information (VGI). Different issues need to be considered during the conception, acquisition and post-acquisition phases of VGI creation. These include collecting metadata on the volunteer, providing suitable training, giving corrective feedback during the mapping process and making use of control data, among others. Two examples of VGI data collection are then considered with respect to this quality control framework: collection by National Mapping Agencies and collection via the most recent Geo-Wiki tool, a game called Cropland Capture. Although good practices are beginning to emerge, there is still a need for the development and sharing of best practice, especially if VGI is to be integrated with authoritative map products or used for calibration and/or validation of land cover in the future.