36 results for Method development and research
in Aston University Research Archive
Abstract:
The object of this work was to further develop the idea introduced by Muaddi et al (1981), which overcomes some of the disadvantages of earlier destructive adhesion test methods. The test is non-destructive, but it does need to be calibrated against a destructive method. Adhesion is determined by measuring the effect of plating on internal friction: the damping of vibrations of a resonating specimen is determined before and after plating, the level of adhesion being considered by the above authors to influence the degree of damping. In the major portion of the research work the electrodeposited metal was Watts nickel, which is ductile and therefore suitable for peel adhesion testing. The base metals chosen were the aluminium alloys S1C and HE9, as it is relatively easy to produce varying levels of adhesion between these substrates and an electrodeposited coating by choosing the appropriate process sequence. S1C is commercially pure aluminium and was used to produce good adhesion; HE9 is a more difficult alloy to plate and was chosen to produce poorer adhesion. The "Modal Testing" method used for studying vibrations was investigated as a possible means of evaluating adhesion but was not successful, so research was concentrated on the "Q" meter. The "Q" meter method involves exciting vibrations in a sample, interrupting the driving signal and counting the number of oscillations of the freely decaying vibrations between two known, preselected amplitudes. It was not possible to reconstruct a working instrument from Muaddi's thesis (1982), which either contained a serious error or was incomplete, so a modified "Q" meter had to be designed and constructed. With this instrument, however, it was difficult to resonate non-magnetic materials such as aluminium, so a comparison before and after plating could not be made. 
A new "Q" meter was then developed based on an impulse technique. A regulated miniature hammer was used to excite the test piece at the fundamental mode instead of an electronic hammer, and test pieces were supported at the two predetermined nodal points on nylon threads. The instrument developed was not very successful at detecting changes due to good and poor pretreatments given before plating; it was, however, more sensitive to changes at the surface such as room-temperature oxidation. Statistical analysis of test results from untreated aluminium alloys showed that the instrument is not always consistent, and the variation was even larger when readings were taken on different days. Although aluminium is said to form protective oxides at room temperature, there was evidence that the aluminium surface changes continuously due to film formation, growth and breakdown. Nickel-plated and zinc alloy immersion-coated samples also showed variation in Q with time. In order to prove that the variations in Q were mainly due to surface oxidation, aluminium samples were lacquered and anodised. Such treatments enveloped the active surfaces reacting with the environment, and the variation of Q with time was almost eliminated, especially after hard anodising. The instrument detected major differences between different untreated aluminium substrates, and Q values decreased progressively as coating thickness was increased. It was also able to detect changes in Q due to heat treatment of aluminium alloys.
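The counting principle of the "Q" meter maps directly onto the standard light-damping relation between the logarithmic decrement and Q. A minimal sketch in Python; the amplitudes and cycle count below are illustrative, not values from the thesis:

```python
import math

def q_factor(n_cycles: int, a_start: float, a_end: float) -> float:
    """Estimate Q from the number of free-decay oscillations counted
    between two preselected amplitudes (a_start > a_end).

    Uses the logarithmic decrement: delta = ln(a_start/a_end) / n_cycles,
    with Q ~ pi / delta for light damping.
    """
    delta = math.log(a_start / a_end) / n_cycles
    return math.pi / delta

# Example: 200 cycles counted while the amplitude halves.
# More cycles between the same two amplitudes means lighter damping, i.e. higher Q.
print(q_factor(200, 1.0, 0.5))
```

A plated specimen with poorer adhesion would, on the hypothesis above, dissipate more energy per cycle and hence return a lower Q than the bare substrate.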
Abstract:
Book review
Abstract:
This study concerns the application of a model of effective interpersonal relationships to problems arising from staff assessment at I.C.I. Ltd Corporate Laboratory between 1972 and 1974. In collaboration with academic and industrial supervisors, the study commenced with a survey of management and supervisor opinions about the effectiveness of current staff (work) relationships, with particular reference to the problem of recognising and developing creative potential. This survey emphasised a need to improve the relationships between staff in the staff assessment context. A survey of research into creativity emphasised the importance of the interpersonal environment for obtaining creative behaviour in an organisational context. A further survey of theories of how interpersonal behaviour relates to personal creativity (therapeutic psychology) provided a model of effective interpersonal behaviour (Carkhuff, 1969) that could be applied to the organisational context of staff assessment. The objective of the project was redefined as a need to improve the conditions of interpersonal behaviour in relation to certain (career development) problems arising from staff assessment practices. In order to demonstrate the application of the model of effective interpersonal behaviour, the research student recorded interviews between himself and members of staff designed to develop and operate the dimensions of the model. Different samples of staff were used to develop the 'facilitative' and the 'action-oriented' dimensions of behaviour, and then for the operation of a helping programme (based on vocational guidance tests). These interactions were analysed according to the scales of measurement in the model, and the results are presented in case study form in this thesis. At each stage of the project, results and conclusions were presented to the sponsoring organisation (e.g. the industrial supervisor) in order to assess their (subjective) opinion of relevance to the organisation. 
Finally, recommendations on further actions towards general improvement of the work relationships in the laboratory were presented in a brief report to the sponsor.
Abstract:
The need for improvement in the development of research careers and researchers' training in transferable skills was highlighted in two particular recommendations (numbers 4.2 and 5.3) in the 2002 report 'SET for success: the report of Sir Gareth Roberts' Review - the supply of people with science, technology, engineering and mathematics skills' (Roberts, 2002). As a consequence of that review, Research Councils UK (RCUK) have invested about £120 million, usually referred to as 'Roberts' Money', in research organisations to address this concern across all research disciplines. The last 'Roberts' Money' payment will be for the period up to March 2011; it was therefore proposed to assess the progress made with taking forward these specific recommendations. An independent panel was formed by RCUK to undertake this review in 2010. The terms of reference for the panel are in Annex A. In summary, the panel was asked to review progress made and to advise RCUK and the higher education (HE) sector about future requirements for the development and training of researchers. In the course of their review, the panel considered a wide range of existing reports and interviewed key stakeholders in the HE sector and elsewhere, as well as drawing on their own knowledge and expertise. This report presents the findings of the panel's review.
Abstract:
An HPLC method has been developed and validated for the rapid determination of mercaptopurine and four of its metabolites: thioguanine, thiouric acid, thioxanthine and methylmercaptopurine, in plasma and red blood cells. The method involves a simple treatment procedure based on deproteinisation by perchloric acid followed by acid hydrolysis and heating for 45 min at 100 °C. The developed method was linear over the concentration range studied, with a correlation coefficient >0.994 for all compounds in both plasma and erythrocytes. The lower limits of quantification were 13, 14, 3, 2 and 95 pmol/8 × 10^8 RBCs, and 2, 5, 2, 3 and 20 ng/ml plasma, for thioguanine, thiouric acid, mercaptopurine, thioxanthine and methylmercaptopurine, respectively. The method described is selective and sensitive enough to analyse the different metabolites in a single run under isocratic conditions. Furthermore, it has been shown to be applicable for monitoring these metabolites in paediatric patients, due to the low volume requirement (200 µl of plasma or erythrocytes), and has been successfully applied to investigating population pharmacokinetics, pharmacogenetics and non-adherence to therapy in these patients.
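The linearity and quantification-limit criteria reported above can be sketched with an ordinary least-squares calibration. The data and the blank standard deviation below are hypothetical, not the method's actual validation figures:

```python
import numpy as np

# Hypothetical calibration data (concentration in ng/ml vs peak area);
# the real validation data are not given in the abstract.
conc = np.array([5.0, 10.0, 25.0, 50.0, 100.0])
area = np.array([12.1, 24.3, 60.2, 121.0, 239.8])

slope, intercept = np.polyfit(conc, area, 1)   # linear calibration fit
r = np.corrcoef(conc, area)[0, 1]              # correlation coefficient

# One common LOQ estimate: 10 * (sd of blank response) / slope
sd_blank = 0.5                                 # assumed blank standard deviation
loq = 10 * sd_blank / slope
print(f"slope={slope:.3f}, r={r:.4f}, LOQ={loq:.2f} ng/ml")
```

With data this linear, r comfortably exceeds the >0.994 acceptance criterion quoted in the abstract.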
Abstract:
Oral liquid formulations are ideal dosage forms for paediatric and geriatric patients and for patients with dysphagia. Dysphagia is prominent among patients suffering from stroke, motor neurone disease, and advanced Alzheimer's and Parkinson's disease. However, oral liquid preparations are particularly difficult to formulate for hydrophobic and unstable drugs, and current approaches to the problem rely on 'specials' or extemporaneous preparations. To challenge this, the government has encouraged research into the field of oral liquid formulations, with the EMEA and MHRA publishing lists of drugs of interest. The current work investigates strategic formulation development and characterisation of selected APIs (captopril, gliclazide, melatonin, L-arginine and lansoprazole), each with unique obstacles to overcome during solubilisation and stabilisation and when developing a palatable dosage form. After preparing a validated calibration protocol for each of the drug candidates, the oral liquid formulations were assessed for stability according to the ICH guidelines, along with thorough physicochemical characterisation. The results showed that pH and polarity of the solvent had the greatest influence on the extent of drug solubilisation, with the inclusion of antioxidants and molecular steric hindrance influencing the extent of drug stability. Captopril, a hydrophilic ACE inhibitor (160 mg.mL-1), undergoes dimerisation with another captopril molecule. It was found that with the addition of EDTA and HP-β-CD, the drug molecule was stabilised and prevented from initiating a thiol-induced first-order free radical oxidation. The cyclodextrin provided further steric hindrance (1:1 molar ratio), resulting in complete reduction of the intensity of the sulphur-like smell associated with captopril. Palatability is a crucial factor in patient compliance, particularly when developing a dosage form targeted towards paediatrics. 
L-arginine is extremely bitter in solution (148.7 g.L-1). The addition of tartaric acid to the 100 mg.mL-1 formulation was sufficient to mask the bitterness associated with its guanidinium ions. The hydrophobicity of gliclazide (55 mg.L-1) was strategically challenged using a binary system of a co-solvent and a surfactant to reduce the polarity of the medium and ultimately increase the solubility of the drug. A second, simpler method was developed using pH modification with L-arginine. Melatonin presents two major obstacles in formulation, solubility (100 μg.mL-1) and photosensitivity, which were both overcome by lowering the dielectric constant of the medium and by reversibly binding the drug within the cyclodextrin cup (1:1 ratio). The cyclodextrin acts by preventing UV rays from reaching the drug molecule and initiating the degradation pathway. Lansoprazole is an acid-labile drug that could only be delivered orally via a delivery vehicle; in oral liquid preparations this involved nanoparticulate vesicles. The extent of drug loading was found to be influenced by the type of polymer, the concentration of polymer and its molecular weight. All of the formulations achieved relatively long shelf-lives with good preservative efficacy.
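The shelf-life claims above rest on degradation kinetics; for a first-order process such as the thiol oxidation mentioned for captopril, shelf life (time to 90% of label claim) follows directly from the rate constant. The rate constants below are assumed purely for illustration, not measured values from the thesis:

```python
import math

def t90_first_order(k_per_month: float) -> float:
    """Shelf life (time to 90% of label claim) for first-order loss:
    C(t) = C0 * exp(-k t)  =>  t90 = ln(C0 / 0.9 C0) / k = ln(10/9) / k."""
    return math.log(1 / 0.9) / k_per_month

# Illustrative rate constants (assumed): stabilisation with EDTA/HP-beta-CD
# would show up as a smaller k and hence a proportionally longer t90.
print(t90_first_order(0.05))    # unstabilised formulation
print(t90_first_order(0.005))   # stabilised: tenfold smaller k, tenfold longer t90
```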
Abstract:
Bio-impedance analysis (BIA) provides a rapid, non-invasive technique for body composition estimation. BIA offers a convenient alternative to standard techniques such as MRI, CT or DEXA scans for selected types of body composition analysis. The accuracy of BIA is limited because it is an indirect method of composition analysis: it relies on linear relationships between measured impedance and morphological parameters such as height and weight to derive estimates. To overcome these underlying limitations of BIA, a multi-frequency segmental bio-impedance device was constructed through a series of iterative enhancements and improvements of existing BIA instrumentation. Key features of the design included an easy-to-construct current source and a compact PCB design. The final device was trialled with 22 human volunteers, and measured impedance was compared against body composition estimates obtained by DEXA scan. This enabled the development of newer techniques for making BIA predictions. To add a 'visual aspect' to BIA, volunteers were scanned in 3D using an inexpensive scattered-light gadget (Xbox Kinect controller), and 3D volumes of their limbs were compared with BIA measurements to further improve BIA predictions. A three-stage digital filtering scheme was also implemented to enable extraction of heart-rate data from recorded bio-electrical signals. Additionally, modifications were introduced to measure changes in bio-impedance with motion; these could be adapted to further improve the accuracy and veracity of limb composition analysis. The findings in this thesis aim to give new direction to the prediction of body composition using BIA. The design development and refinement applied to BIA in this research programme suggest new opportunities to enhance the accuracy and clinical utility of BIA for the prediction of body composition. 
In particular, bio-impedance could be used to predict limb volumes, which would provide an additional metric for body composition measurement and help distinguish between fat and muscle content.
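The heart-rate extraction stage can be illustrated with a single band-pass filter. This is a minimal sketch rather than the thesis's actual three-stage scheme; the 0.8-3 Hz passband (roughly 48-180 bpm) and the synthetic signal are assumptions:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def heart_band(signal: np.ndarray, fs: float) -> np.ndarray:
    """Band-pass a bio-electrical recording to an assumed cardiac band
    (0.8-3 Hz), suppressing baseline drift and higher-frequency content."""
    b, a = butter(3, [0.8, 3.0], btype="band", fs=fs)
    return filtfilt(b, a, signal)

# Synthetic example: a small 1.2 Hz cardiac component under large slow drift
fs = 100.0
t = np.arange(0, 30, 1 / fs)
raw = 0.1 * np.sin(2 * np.pi * 1.2 * t) + 2.0 * np.sin(2 * np.pi * 0.05 * t)
clean = heart_band(raw, fs)

freqs = np.fft.rfftfreq(len(clean), 1 / fs)
peak_hz = freqs[np.argmax(np.abs(np.fft.rfft(clean)))]
print(peak_hz)   # dominant frequency of the filtered signal, in Hz
```

After filtering, the dominant spectral peak sits at the cardiac component (1.2 Hz, i.e. 72 bpm) rather than at the drift.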
Abstract:
Representational difference analysis (RDA) has great potential for the preferential amplification of unique but uncharacterised DNA sequences present in one source, such as a whole genome, but absent from a related genome or other complex population of sequences. While a few examples of its successful exploitation have been published, the method has not been well dissected, and robust, detailed published protocols are lacking. Here we examine the method in detail, suggest improvements and provide a protocol that has yielded key unique sequences from a pathogenic bacterial genome. © 2003 Elsevier Science B.V. All rights reserved.
Abstract:
Objectives: To develop a tool for the accurate reporting and aggregation of findings from each of the multiple methods used in a complex evaluation, in an unbiased way. Study Design and Setting: We developed a Method for Aggregating The Reporting of Interventions in Complex Studies (MATRICS) within a gastroenterology study [Evaluating New Innovations in (the delivery and organisation of) Gastrointestinal (GI) endoscopy services by the NHS Modernisation Agency (ENIGMA)]. We subsequently tested it on a different gastroenterology trial [Multi-Institutional Nurse Endoscopy Trial (MINuET)]. We created three layers to define the effects, methods and findings from ENIGMA. We assigned numbers to each effect in layer 1 and letters to each method in layer 2, and assigned an alphanumeric code based on layers 1 and 2 to every finding in layer 3, linking the aims, methods and findings. We illustrated analogous findings by assigning more than one alphanumeric code to a finding, and showed that more than one effect or method could report the same finding. We presented contradictory findings by listing them in adjacent rows of the MATRICS. Results: MATRICS was useful for the effective synthesis and presentation of findings of the multiple methods from ENIGMA. We subsequently tested it successfully by applying it to the MINuET trial. Conclusion: MATRICS is effective for synthesizing the findings of complex, multiple-method studies.
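The layered alphanumeric coding can be sketched as a small data structure; the effects, methods and findings below are invented placeholders, not entries from ENIGMA:

```python
# Layer 1: numbered effects; layer 2: lettered methods; layer 3: findings,
# each carrying one or more alphanumeric codes that link back to both layers.
effects = {1: "Waiting times", 2: "Patient experience"}       # layer 1
methods = {"A": "Routine data analysis", "B": "Patient survey"}  # layer 2

findings = [                                                   # layer 3
    {"codes": ["1A"], "text": "Median wait fell after redesign."},
    # one finding can carry several codes (analogous findings)
    {"codes": ["1B", "2B"], "text": "Patients reported shorter perceived waits."},
]

for f in findings:
    for code in f["codes"]:
        effect = effects[int(code[:-1])]   # leading digits -> effect
        method = methods[code[-1]]         # trailing letter -> method
        print(f"{code}: [{effect} / {method}] {f['text']}")
```

Listing findings with shared codes in adjacent rows is what makes analogous and contradictory results easy to spot side by side.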
Abstract:
Researchers often use 3-way interactions in moderated multiple regression analysis to test the joint effect of 3 independent variables on a dependent variable. However, further probing of significant interaction terms varies considerably and is sometimes error prone. The authors developed a significance test for slope differences in 3-way interactions and illustrated its importance for testing psychological hypotheses. Monte Carlo simulations revealed that sample size, magnitude of the slope difference, and data reliability affected test power. Application of the test to published data yielded detection of some slope differences that were undetected by alternative probing techniques and led to changes of results and conclusions. The authors conclude by discussing the test's applicability for psychological research. Copyright 2006 by the American Psychological Association.
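Because a simple slope in a 3-way interaction is a linear combination of regression coefficients, the difference between two simple slopes is a linear contrast that can be tested with an ordinary t test. A minimal simulation sketch (all data and coefficients are illustrative, not from the paper):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 500
x, z, w = rng.standard_normal((3, n))
# Simulated data with a genuine 3-way interaction
y = 0.5 * x + 0.3 * x * z + 0.4 * x * z * w + rng.standard_normal(n)

# Design columns: const, x, z, w, xz, xw, zw, xzw
X = np.column_stack([np.ones(n), x, z, w, x * z, x * w, z * w, x * z * w])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
s2 = resid @ resid / (n - X.shape[1])
cov = s2 * np.linalg.inv(X.T @ X)

# Simple slope of x at (z, w) is b1 + b4*z + b5*w + b7*z*w, so the slope
# difference between (z=1, w=1) and (z=1, w=-1) is the contrast 2*b5 + 2*b7.
c = np.zeros(8)
c[5] = 2.0   # x*w coefficient
c[7] = 2.0   # x*z*w coefficient
diff = c @ beta
t_stat = diff / np.sqrt(c @ cov @ c)
p = 2 * stats.t.sf(abs(t_stat), df=n - X.shape[1])
print(f"slope difference = {diff:.3f}, t = {t_stat:.2f}, p = {p:.2g}")
```

With the simulated b7 = 0.4 the true slope difference is 0.8, which this contrast test detects easily at n = 500.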
Abstract:
Recent discussion of the knowledge-based economy draws increasing attention to the role that the creation and management of knowledge play in economic development. Development of human capital, the principal mechanism for knowledge creation and management, becomes a central issue for policy-makers and practitioners at the regional as well as the national level. Facing competition both within and across nations, regional policy-makers view human capital development as a key to strengthening the position of their economies in the global market. Against this background, the aim of this study is to go some way towards answering the question of whether, and how, investment in education and vocational training at the regional level provides these territorial units with comparative advantages. The study reviews the literature in economics and economic geography on economic growth (Chapter 2). In the growth model literature, human capital has gained increasing recognition as a key production factor, along with physical capital and labour. Although they leave technical progress as an exogenous factor, neoclassical Solow-Swan models have improved their estimates through the inclusion of human capital. In contrast, endogenous growth models place investment in research at centre stage in accounting for technical progress; as a result, they often focus on research workers, who embody high-order human capital, as a key variable in their framework. One issue of discussion is how human capital facilitates economic growth: is it the level of its stock or its accumulation that influences the rate of growth? In addition, these economic models are criticised in the economic geography literature for their failure to consider spatial aspects of economic development, and particularly for their lack of attention to tacit knowledge and the urban environments that facilitate the exchange of such knowledge. 
Our empirical analysis of European regions (Chapter 3) shows that investment by individuals in human capital formation has distinct patterns. Regions with a higher level of investment in tertiary education tend to have a larger concentration of information and communication technology (ICT) sectors (including the provision of ICT services and the manufacture of ICT devices and equipment) and research functions. Not surprisingly, regions with major metropolitan areas, where higher education institutions are located, show a high enrolment rate for tertiary education, suggesting a possible link to demand from the high-order corporate functions located there. Furthermore, the rate of human capital development (at the level of vocational upper secondary education) appears to have a significant association with the level of entrepreneurship in emerging industries such as ICT-related services and ICT manufacturing, whereas no such association is found with traditional manufacturing industries. In general, a high level of investment by individuals in tertiary education is found in those regions that accommodate high-tech industries and high-order corporate functions such as research and development (R&D). These functions are supported by the urban infrastructure and public science base, which facilitate the exchange of tacit knowledge; such regions also enjoy a low unemployment rate. However, the existing stock of human and physical capital in regions with a high level of urban infrastructure does not lead to a high rate of economic growth. Our empirical analysis demonstrates that the rate of economic growth is determined by the accumulation of human and physical capital, not by the level of their existing stocks. We found no significant effects of scale that would favour those regions with a larger stock of human capital. 
The primary policy implication of our study is that, in order to facilitate economic growth, education and training need to supply human capital at a faster pace than simply replenishing it as it disappears from the labour market. Given the significant impact of high-order human capital (such as business R&D staff in our case study) as well as the increasingly fast pace of technological change that makes human capital obsolete, a concerted effort needs to be made to facilitate its continuous development.
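The stock-versus-accumulation distinction drawn above can be illustrated with a toy cross-regional regression; all data and coefficients below are simulated for illustration and bear no relation to the study's actual estimates:

```python
import numpy as np

# Simulate regions whose growth depends on capital *accumulation* but not on
# the existing *stock*, then regress growth on both: only the accumulation
# coefficient should come out non-zero.
rng = np.random.default_rng(2)
n = 200
stock = rng.lognormal(mean=1.0, size=n)            # existing capital stock
accum = rng.normal(loc=0.05, scale=0.02, size=n)   # accumulation rate
growth = 1.5 * accum + 0.01 * rng.normal(size=n)   # growth driven by accumulation

X = np.column_stack([np.ones(n), stock, accum])
beta, *_ = np.linalg.lstsq(X, growth, rcond=None)
print(f"stock coefficient ~ {beta[1]:.4f}, accumulation coefficient ~ {beta[2]:.3f}")
```

The regression recovers a near-zero coefficient on the stock and a large one on accumulation, mirroring the qualitative finding reported in the study.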
Abstract:
Brand extensions are increasingly used by multinational corporations in emerging markets such as China. However, understanding of how consumers in emerging markets evaluate brand extensions is hampered by a lack of research in emerging-market contexts. To address this knowledge void, we built on an established Western brand extension evaluation framework, that of Aaker and Keller (1990, Journal of Marketing, 54(1), 27-41), and extended the model by incorporating two new factors: perceived fit based on brand image consistency, and competition intensity in the brand extension category. These two factors were added in recognition of the unique considerations of consumers in emerging markets when evaluating brand extensions. The extended model was tested in an empirical experiment with consumers in China. The results partly validated the Aaker and Keller model, and both newly added factors were found to significantly influence consumers' evaluation of brand extensions. More importantly, one of the new factors, consumer-perceived fit based on brand image consistency, was found to be more significant than all the factors in Aaker and Keller's original model, suggesting that the Aaker and Keller model may be limited in explaining how consumers in emerging markets evaluate brand extensions. Further research implications and limitations are discussed in the paper.
Abstract:
A two-tier study is presented in this thesis. The first tier involved the commissioning of an extant but, at the time, unproven bubbling fluidised bed fast pyrolysis unit. The unit was designed for an intended nominal throughput of 300 g/h of biomass and came complete with solids separation, pyrolysis vapour quenching and oil collection systems. Modifications were carried out on various sections of the system, including the reactor heating, quenching and liquid collection systems, allowing fast pyrolysis experiments to be carried out at the appropriate temperatures. Bio-oil was generated using conventional biomass feedstocks including willow, beechwood, pine and Miscanthus. Results from this phase of the research showed, however, that although the rig was capable of processing biomass to bio-oil, it was characterised by low mass balance closures and recurrent operational problems, including blockages, poor reactor hydrodynamics and reduced organic liquid yields. The less than optimal performance of individual sections, particularly the feed and reactor systems, culminated in a poor overall performance of the system. The second phase of this research involved the redesign of two key components of the unit. An alternative feeding system was commissioned, including an off-the-shelf gravimetric system for accurate metering and efficient delivery of biomass. Similarly, a new bubbling fluidised bed reactor with an intended nominal throughput of 500 g/h of biomass was designed and constructed; the design drew on experience from the initial commissioning phase together with proven kinetic and hydrodynamic studies. These units were commissioned as part of the optimisation phase of the study. 
Also as part of this study, two varieties each of two previously unreported feedstocks, Jatropha curcas and Moringa oleifera oil seed press cakes, were characterised to determine their suitability as feedstocks for liquid fuel production via fast pyrolysis. The feedstocks were then used for the production of pyrolysis liquids, and the quality of these liquids was investigated via a number of analytical techniques. The oils from the press cakes showed high levels of stability and reduced pH values. The improvements to the design of the fast pyrolysis unit led to higher mass balance closures and increased organic liquid yields. The maximum liquid yield obtained from the press cakes was from African Jatropha press cake, at 66 wt% on a dry basis.
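Mass balance closure, the figure of merit used above to judge the rig, is simply the mass of recovered products expressed as a percentage of dry feed. A minimal sketch; all figures are invented except the 66 wt% organic liquid yield reported for African Jatropha press cake:

```python
# Mass balance closure for a fast pyrolysis run: total products recovered
# (organic liquid, reaction water, char, gas) as a percentage of dry feed.
def closure(feed_g, organics_g, water_g, char_g, gas_g):
    products = organics_g + water_g + char_g + gas_g
    return 100.0 * products / feed_g

feed = 500.0              # g dry biomass (the redesigned 500 g/h reactor scale)
organics = 0.66 * feed    # 66 wt% organic liquid on a dry basis (from the thesis)
# Water, char and gas masses below are assumed for illustration only.
print(closure(feed, organics, water_g=60.0, char_g=45.0, gas_g=55.0))  # closure in %
```

A closure well below 100% signals unaccounted losses (blockages, escaped vapours), which is exactly what motivated the redesign of the feed and reactor systems.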
Abstract:
This thesis describes research into business user involvement in the information systems application building process. The main interest of this research is in establishing and testing techniques to quantify the relationships between identified success factors and the outcome effectiveness of 'business user development' (BUD). The availability of a mechanism to measure the levels of the success factors, and to relate them quantifiably to outcome effectiveness, is important in that it provides an organisation with the capability to predict and monitor effects on BUD outcome effectiveness. This is particularly important in an era where BUD levels have risen dramatically, the benefits of user-centred information systems development are recognised as significant, and awareness of the risks of uncontrolled BUD activity is becoming more widespread. This research targets the measurement and prediction of BUD success factors and implementation effectiveness for particular business users. A questionnaire instrument and analysis technique have been developed and tested which constitute a tool for predicting and monitoring BUD outcome effectiveness, based on the BUDES (Business User Development Effectiveness and Scope) research model, which is introduced and described in this thesis. The questionnaire instrument is designed for completion by 'business users', the target community being more explicitly defined as 'people who primarily have a business role within an organisation'. The instrument, named BUD ESP (Business User Development Effectiveness and Scope Predictor), can readily be used with survey participants and has been shown to give meaningful and representative results.
Abstract:
CONTEXT: The homeless are a significant and growing group within society. They have demonstrably greater physical and mental health needs than the housed, yet often have difficulty accessing primary health care. Medical 'reluctance' to look after homeless people is increasingly suggested as part of the problem, and medical education may have a role in ameliorating this. OBJECTIVES: This paper reports on the development and validation of a questionnaire specifically designed to measure medical students' attitudes towards the homeless. METHOD AND RESULTS: The Attitudes Towards the Homeless Questionnaire, developed using the views of over 370 medical students, was shown to have a Pearson test-retest reliability correlation coefficient of 0.8 and a Cronbach's alpha coefficient of 0.74. CONCLUSIONS: The Attitudes Towards the Homeless Questionnaire appears to be a valid and reliable instrument for measuring students' attitudes towards the homeless. It could be a useful tool in assessing the effectiveness of educational interventions.
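Cronbach's alpha, the internal-consistency statistic quoted above, is straightforward to compute from an item-score matrix. A minimal sketch with simulated responses (the questionnaire's real items and data are not given in the abstract):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

# Simulated responses: 8 items that all load on one underlying attitude,
# plus item-level noise, for 100 hypothetical respondents.
rng = np.random.default_rng(1)
latent = rng.normal(size=(100, 1))
scores = latent + 0.8 * rng.normal(size=(100, 8))
print(round(cronbach_alpha(scores), 2))
```

Items that share a common underlying construct yield a high alpha; an alpha of 0.74, as reported for the questionnaire, is conventionally taken as acceptable internal consistency.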