277 results for Standardisation
Abstract:
Orthodox contingency theory links effective organisational performance to compatible relationships between the environment and organisation strategy and structure, and assumes that organisations have the capacity to adapt as the environment changes. Recent contributions to the literature on organisation theory claim that the key to effective performance is effective adaptation, which in turn requires the simultaneous reconciliation of efficiency and innovation, afforded by a unique environment-organisation configuration. The literature on organisation theory recognises the continuing confusion caused by the fragmented and often conflicting results from cross-sectional studies. Although the case is made for longitudinal studies which comprehensively describe the evolving relationship between the environment and the organisation, there is little to suggest how such studies should be executed in practice. Typically the choice is between the approaches of the historicised case study and statistical analysis of large populations, which examine the relationship between environment and organisation strategy and/or structure but ignore the product-process relationship. This study combines the historicised case study with the multi-variable and ordinal-scale approach of statistical analysis to construct an analytical framework which tracks and exposes the environment-organisation-performance relationship over time. The framework examines changes in the environment, strategy and structure and, uniquely, includes an assessment of the organisation's product-process relationship and its contribution to organisational efficiency and innovation. The analytical framework is applied to examine the evolving environment-organisation relationship of two organisations in the same industry over the same twenty-five-year period, providing a sector perspective of organisational adaptation. The findings demonstrate the significance of the environment-organisation configuration to the scope and frequency of adaptation and suggest that the level of sector homogeneity may be linked to the level of product-process standardisation.
Abstract:
Healthcare providers and policy makers are faced with an ever-increasing number of medical publications. Searching for relevant information and keeping up to date with new research findings remains a constant challenge. It has been widely acknowledged that narrative reviews of the literature are susceptible to several types of bias, and a systematic approach may protect against these biases. The aim of this thesis was to apply quantitative methods in the assessment of outcomes of topical therapies for psoriasis; in particular, to systematically examine the comparative efficacy, tolerability and cost-effectiveness of topical calcipotriol in the treatment of mild-to-moderate psoriasis. Over the years, a wide range of techniques have been used to evaluate the severity of psoriasis and the outcomes from treatment. This lack of standardisation complicates the direct comparison of results and ultimately the pooling of outcomes from different clinical trials. There is a clear requirement for more comprehensive tools for measuring drug efficacy and disease severity in psoriasis. Ideally, the outcome measures need to be simple, relevant, practical and widely applicable, and the instruments should be reliable, valid and responsive. The results of the meta-analysis reported herein show that calcipotriol is an effective antipsoriatic agent. In the short term, the pooled data found calcipotriol to be more effective than calcitriol, tacalcitol, coal tar and short-contact dithranol. Only potent corticosteroids appeared to have comparable efficacy, with fewer short-term side-effects. Potent corticosteroids also added to the antipsoriatic effect of calcipotriol, and appeared to suppress the occurrence of calcipotriol-induced irritation. There was insufficient evidence to support any large effects in favour of improvements in efficacy when calcipotriol is used in combination with systemic therapies in patients with severe psoriasis. However, there was a total absence of long-term morbidity data on the effectiveness of any of the interventions studied. Decision analysis showed that, from the perspective of the NHS as payer, the relatively small differences in efficacy between calcipotriol and short-contact dithranol lead to large differences in the direct cost of treating patients with mild-to-moderate plaque psoriasis. Further research is needed to examine the clinical and economic issues affecting patients under treatment for psoriasis in the UK. In particular, the maintenance value and cost/benefit ratio of the various treatment strategies, and the assessment of patients' preferences, have not yet been adequately addressed for this chronic recurring disease.
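The pooling of trial outcomes referred to above is conventionally performed by inverse-variance weighting of per-study effect estimates. Below is a minimal fixed-effect sketch of that calculation; the function is generic and the trial numbers are invented for illustration, not results from this thesis.

```python
import math

def fixed_effect_pool(estimates, std_errors):
    """Inverse-variance (fixed-effect) pooling of per-trial effect estimates.

    Each trial is weighted by 1/SE^2; returns the pooled estimate
    and its standard error.
    """
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * e for w, e in zip(weights, estimates)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se

# Hypothetical log-odds-ratio estimates from three trials.
est, se = fixed_effect_pool([0.42, 0.35, 0.51], [0.12, 0.20, 0.15])
print(f"pooled effect {est:.2f} (SE {se:.2f})")
```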
Abstract:
Conventional methods of form-roll design and manufacture for cold roll-forming of thin-walled metal sections have been entirely manual, time-consuming and prone to error, resulting in inefficiency and high production costs. With the use of computers, lead time can be significantly improved, particularly for those aspects involving routine but tedious human decisions and actions. This thesis describes the development of computer-aided tools for producing form-roll designs for NC manufacture in the CAD/CAM environment. The work was undertaken to modernise the existing activity of a company manufacturing thin-walled sections. The investigated areas of the activity, including the design and drafting of the finished section, the flower patterns, the 10-to-1 templates, and the rolls complete with pinch-difference surfaces, side-rolls and extension-contours, have been successfully computerised through software development. Data generated by the developed software can be further processed for roll manufacturing using NC lathes. The software has been specially designed for portability to facilitate its implementation on different computers. The Opening-Radii method of forming was introduced as a substitute for the conventional method to achieve better forming. Most of the essential aspects of roll design have been successfully incorporated in the software. With computerisation, extensive standardisation of existing roll design practices and the use of more reliable and scientifically based methods have been achieved. Satisfactory and beneficial results have also been obtained by the company in using the software through a terminal linked to the University by a GPO line. Both lead time and productivity in roll design and manufacture have been significantly improved. It is therefore concluded that computerising the design of form-rolls through software development is viable. The work also demonstrated the promising nature of the CAD/CAM approach.
Abstract:
This thesis investigates the abrasion resistance of fibre-reinforced concrete floors at both the macro and micro levels. A review of the literature allied to the current project is included. This highlights themes relevant to wear mechanisms and the factors influencing them; the factors that affect the abrasion resistance of concrete and several test methods for assessing it; and the historical development of fibres, the properties of different fibre types and their influence on concrete. Three accelerated abrasion testers were compared and critically discussed for their suitability for assessing the abrasion resistance of concrete floors. Based on the experimental findings, one accelerated abrasion apparatus was selected as the most appropriate for carrying out the main investigations. The laboratory programme that followed was undertaken to investigate the influence of various material and construction factors on abrasion resistance. These included mix variations (w/c ratio), fibre reinforcement (geometry, type and volume), curing method and superplasticizing agents. The results clearly show that these factors significantly affect abrasion resistance, and several mechanisms were proposed to explain and better understand these observations. To verify these mechanisms, which are accountable for the breakdown of concrete slabs, the same concrete specimens used for the macro-study were also subjected to microstructural investigation using techniques such as microhardness examination, mercury intrusion porosimetry and petrographic examination. It was found that the abrasion resistance of concrete is primarily dependent on the microstructure and porosity of the concrete nearest to the surface. The feasibility of predicting the abrasion resistance of fibre-reinforced concrete floors by indirect and non-destructive methods was investigated using five methods that have frequently been used for assessing the quality of concrete: the initial surface absorption test, the impact test, ball cratering, the scratch test and the base hardness test. The impact resistance (BRE screed tester) and scratch resistance (base hardness tester) were found to be the most sensitive to factors affecting abrasion resistance and hence are considered the most appropriate testing techniques. In an attempt to develop an appropriate method for assessing the abrasion resistance of heavy-duty industrial concrete floors, it was found that the presence of curing/sealing compound on the concrete surface at the time of accelerated abrasion testing produces misleading results. A preliminary investigation towards modifying the Aston accelerated abrasion tester has been carried out: a more aggressive head has been developed, pending future research towards standardisation.
Abstract:
Mental-health risk assessment practice in the UK is mainly paper-based, with little standardisation in the tools that are used across the Services. The tools that are available tend to rely on minimal sets of items and unsophisticated scoring methods to identify at-risk individuals. This means the reasoning by which an outcome has been determined remains uncertain. Consequently, there is little provision for: including the patient as an active party in the assessment process, identifying underlying causes of risk, and effecting shared decision-making. This thesis develops a tool-chain for the formulation and deployment of a computerised clinical decision support system for mental-health risk assessment. The resultant tool, GRiST, will be based on consensual domain expert knowledge that will be validated as part of the research, and will incorporate a proven psychological model of classification for risk computation. GRiST will have an ambitious remit of being a platform that can be used over the Internet, by both the clinician and the layperson, in multiple settings, and in the assessment of patients with varying demographics. Flexibility will therefore be a guiding principle in the development of the platform, to the extent that GRiST will present an assessment environment that is tailored to the circumstances in which it finds itself. XML and XSLT will be the key technologies that help deliver this flexibility.
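As a sketch of how the XML/XSLT combination can deliver this kind of tailoring, the example below applies a stylesheet that filters a risk-assessment document by audience. The element names and document structure are hypothetical illustrations, not GRiST's actual schema; the example uses the third-party lxml package.

```python
from lxml import etree

# A hypothetical risk-item document; GRiST's real schema may differ.
xml = etree.XML(b"""
<risk-item code="suicide">
  <question audience="clinician">Assess current suicidal ideation.</question>
  <question audience="layperson">Have you had thoughts of ending your life?</question>
</risk-item>""")

# A stylesheet that keeps only the questions for the requested audience.
xslt = etree.XML(b"""
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:param name="audience" select="'clinician'"/>
  <xsl:template match="/risk-item">
    <assessment>
      <xsl:copy-of select="question[@audience=$audience]"/>
    </assessment>
  </xsl:template>
</xsl:stylesheet>""")

transform = etree.XSLT(xslt)
print(transform(xml, audience=etree.XSLT.strparam("layperson")))
```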
Abstract:
Diabetic retinopathy (DR) remains the leading cause of blindness among working-age individuals in developed countries. Current treatments for DR are indicated in advanced stages of the disease and are associated with significant adverse effects. Therefore, new pharmacological treatments for the early stages of DR are needed. DR has classically been considered a microcirculatory disease of the retina. However, there is growing evidence that retinal neurodegeneration is an early event in the pathogenesis of DR and contributes to the microcirculatory abnormalities that occur in DR. Therefore, the study of the underlying mechanisms that lead to neurodegeneration will be essential for identifying new therapeutic targets. From the clinical point of view, the identification of those patients in whom retinal neurodegeneration appears will be crucial for implementing early treatment based on neuroprotective drugs. When the early stages of DR are the therapeutic target, it would be inconceivable to recommend a treatment as aggressive as intravitreous injections. By contrast, topical administration of neuroprotective drugs using eye drops is a possible option. However, clinical trials to determine the safety and effectiveness of this non-invasive route, as well as standardisation of the methods for monitoring neurodegeneration, are needed.
Abstract:
Optical coherence tomography (OCT) is a non-invasive three-dimensional imaging system that is capable of producing high-resolution in-vivo images. OCT is approved for use in clinical trials in Japan, the USA and Europe. For OCT to be used effectively in clinical diagnosis, a method of standardisation is required to assess performance across different systems. This standardisation can be implemented using highly accurate and reproducible artefacts for calibration, both at installation and throughout the lifetime of a system. Femtosecond lasers can write highly reproducible and highly localised micro-structured calibration artefacts within a transparent medium. We report on the fabrication of high-quality OCT calibration artefacts in fused silica using a femtosecond laser. The calibration artefacts were written in fused silica because of its high purity and ability to withstand high-energy femtosecond pulses. An Amplitude Systemes s-Pulse Yb:YAG femtosecond laser with an operating wavelength of 1026 nm was used to inscribe three-dimensional patterns within the highly optically transmissive substrate. Four unique artefacts have been designed to measure a wide variety of parameters, including the point spread function (PSF), modulation transfer function (MTF), sensitivity, distortion and resolution - key parameters which define the performance of an OCT system. The calibration artefacts have been characterised using an optical microscope and tested on a swept-source OCT. The results demonstrate that the femtosecond-laser-inscribed artefacts have the potential to quantitatively and qualitatively validate the performance of any OCT system.
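By definition, the MTF is the DC-normalised magnitude of the Fourier transform of the PSF, so one can be derived from a measurement of the other. A minimal sketch, assuming a 1-D intensity profile has been sampled through a point-like calibration feature; the Gaussian test profile is illustrative.

```python
import numpy as np

def mtf_from_psf(psf, pixel_size_um):
    """Compute a 1-D MTF as the DC-normalised |FFT| of a sampled PSF.

    psf: intensity profile through a point-like calibration feature.
    Returns spatial frequencies (cycles/um) and the corresponding MTF.
    """
    spectrum = np.abs(np.fft.rfft(psf))
    mtf = spectrum / spectrum[0]          # MTF(0) = 1 by definition
    freqs = np.fft.rfftfreq(len(psf), d=pixel_size_um)
    return freqs, mtf

# Example: a Gaussian PSF with ~10 um FWHM sampled at 1 um/pixel.
x = np.arange(-32, 32)
psf = np.exp(-x**2 / (2 * (10.0 / 2.355) ** 2))
freqs, mtf = mtf_from_psf(psf, pixel_size_um=1.0)
cutoff = freqs[np.argmax(mtf < 0.5)]
print(f"MTF falls below 0.5 at ~{cutoff:.3f} cycles/um")
```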
Abstract:
Semantic Web Services, one of the most significant research areas within the Semantic Web vision, have attracted increasing attention from both the research community and industry. The Web Service Modelling Ontology (WSMO) has been proposed as an enabling framework for the total/partial automation of the tasks (e.g., discovery, selection, composition, mediation, execution and monitoring) involved in both intra- and inter-enterprise integration of Web services. To support the standardisation of WSMO and its tool support, a formal model of the language is highly desirable. As several variants of WSMO have been proposed by the WSMO community and are still under development, the syntax and semantics of WSMO should be formally defined to facilitate easy reuse and future development. In this paper, we present a formal Object-Z model of WSMO, in which different aspects of the language are precisely defined within one unified framework. This model not only provides a formal, unambiguous specification which can be used to develop tools and facilitate future development but, as demonstrated in this paper, can also be used to identify and eliminate errors present in existing documentation.
Abstract:
Accommodating intraocular lenses (IOLs), multifocal IOLs (MIOLs) and toric IOLs are designed to provide a greater level of spectacle independence after cataract surgery. All of these IOLs rely on the accurate calculation of intraocular lens power, determined through reliable ocular biometry. A standardised defocus area metric and a reading performance index metric were devised for evaluating the range of focus and the reading ability of subjects implanted with presbyopia-correcting IOLs. The range of clear vision after implantation of an MIOL is extended by a second focal point; however, this results in the prevalence of dysphotopsia. A bespoke halometer was designed and validated to assess this photopic phenomenon. There is a lack of standardisation in the methods used for determining IOL orientation and thus rotation. A repeatable, objective method was developed to allow the accurate assessment of IOL rotation, which was used to determine the rotational and positional stability of a closed-loop haptic IOL. A new commercially available biometry device was validated for use with subjects prior to cataract surgery. The optical low coherence reflectometry instrument proved to be a valid method for assessing ocular biometry and covered a wider range of ocular parameters than previous instruments. The advantages of MIOLs were shown to include an extended range of clear vision, translating into greater reading ability. However, an increased prevalence of dysphotopsia was shown with the bespoke halometer, which was dependent on the MIOL optic design. Implantation of a single-optic accommodating IOL did not improve reading ability but achieved high subjective ratings of near vision. The closed-loop haptic IOL displayed excellent rotational stability in the late period but relatively poor rotational stability in the early period post implantation. The orientation error was compounded by the high frequency of positional misalignment, leading to an extensive overall misalignment of the IOL. This thesis demonstrates the functionality of new IOL designs and the importance of standardised testing methods, thus providing a greater understanding of the consequences of implanting these IOLs. Consequently, the findings of the thesis will influence future designs of IOLs and testing methods.
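The abstract does not specify the defocus area metric; a plausible reading is the area between the visual-acuity-versus-defocus curve and a fixed acuity ceiling, integrated over the tested defocus range. A minimal sketch under that assumption; the curve values and the 0.3 logMAR ceiling are invented for illustration.

```python
import numpy as np

def defocus_area(defocus_d, va_logmar, ceiling=0.3):
    """Area between a logMAR defocus curve and an acuity ceiling.

    Integrates (ceiling - VA) wherever VA is better (lower logMAR)
    than the ceiling, using the trapezium rule. A larger area
    indicates a wider range of useful vision.
    """
    gain = np.clip(ceiling - np.asarray(va_logmar, dtype=float), 0.0, None)
    return abs(np.trapz(gain, defocus_d))

# Hypothetical defocus curve from +1.0 D to -4.0 D in 0.5 D steps.
defocus = np.arange(1.0, -4.5, -0.5)
va = [0.40, 0.20, 0.02, 0.10, 0.25, 0.28, 0.22, 0.18, 0.30, 0.45, 0.60]
print(f"defocus area: {defocus_area(defocus, va):.2f} logMAR*D")
```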
Abstract:
Measurement of glycated haemoglobin A1c (HbA1c) provides an indication of longer-term glycaemic control. Standardisation of this test between laboratories is difficult to achieve, and most assays are currently calibrated to the values used in the Diabetes Control and Complications Trial (DCCT-aligned). With the availability of more specific reference standards it is now proposed that HbA1c is expressed as mmol HbA1c per mol of non-glycated haemoglobin. An HbA1c of 7% is approximately equal to 53 mmol/mol.
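The two reporting conventions are linked by the published IFCC-NGSP "master equation", IFCC (mmol/mol) = 10.929 × (NGSP % − 2.15). A minimal sketch of the conversion; the equation is the published standard, while the function names are merely illustrative.

```python
def ngsp_to_ifcc(hba1c_percent):
    """Convert a DCCT/NGSP-aligned HbA1c (%) to IFCC units (mmol/mol)."""
    return (hba1c_percent - 2.15) * 10.929

def ifcc_to_ngsp(hba1c_mmol_mol):
    """Convert an IFCC HbA1c (mmol/mol) back to a DCCT/NGSP-aligned %."""
    return hba1c_mmol_mol / 10.929 + 2.15

# The abstract's example: 7% is approximately 53 mmol/mol.
print(round(ngsp_to_ifcc(7.0)))  # -> 53
```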
Abstract:
Whether to assess the functionality of equipment or to determine the accuracy of assays, reference standards are essential for the purposes of standardisation and validation. The ELISPOT assay, developed over thirty years ago, has emerged as a leading immunological assay for assessing the efficacy of novel vaccines. However, with its widespread use, there is a growing demand for a greater level of standardisation across different laboratories. One of the major difficulties in achieving this goal has been the lack of definitive reference standards. This is partly due to the ex vivo nature of the assay, which relies on cells being placed directly into the wells. Thus, the aim of this thesis was to produce an artificial reference standard, using liposomes, for use within the assay. Liposomes are spherical bilayer vesicles with an enclosed aqueous compartment and are therefore models for biological membranes. Initial work examined pre-design considerations in order to produce an optimal formulation that would closely mimic the action of the cells ordinarily placed in the assay. Recognition of the structural differences between liposomes and cells led to the formulation of liposomes with increased density. This was achieved by using a synthesised cholesterol analogue. By incorporating this cholesterol analogue in liposomes, increased sedimentation rates were observed within the first few hours. The optimal liposome formulation from these studies was composed of 1,2-dipalmitoyl-sn-glycero-3-phosphocholine (DPPC), cholesterol (Chol) and brominated cholesterol (Brchol) at a 16:4:12 µmol ratio, based on a significantly higher (p<0.01) sedimentation (as determined by a percentage transmission of 59 ± 5.9% compared with the control formulation at 29 ± 12% after four hours). By considering a range of liposome formulations, 'proof of principle' for using liposomes as ELISPOT reference standards was shown; recombinant IFN-γ cytokine was successfully entrapped within vesicles of different lipid compositions, which were able to promote spot formation within the ELISPOT assay. Using optimised liposome formulations composed of phosphatidylcholine with or without cholesterol (16 µmol total lipid), further development was undertaken to produce an optimised, scalable protocol for the production of liposomes as reference standards. A linear increase in spot number by the manipulation of cytokine concentration and/or lipid concentration was not possible, potentially due to saturation occurring within the base of the wells. Investigations into storage of the formulations demonstrated the feasibility of freezing and lyophilisation with disaccharide cryoprotectants, but also highlighted the need for further protocol optimisation to achieve a robust reference standard upon storage. Finally, the transfer of small-scale production to a medium lab-scale batch (40 mL) was demonstrated to be feasible within the laboratory using the optimised protocol.
Abstract:
The Retinal Vessel Analyser (RVA) is a commercially available ophthalmoscopic instrument capable of acquiring vessel diameter fluctuations in real time and at high temporal resolution. Visual stimulation by means of flickering light is a unique exploration tool of neurovascular coupling in the human retina. Vessel reactivity, as mediated by local vascular endothelial vasodilators and vasoconstrictors, can be assessed non-invasively, in vivo. In brief, the work in this thesis:
• deals with interobserver and intraobserver reproducibility of the flicker responses in healthy volunteers;
• explains the superiority of individually analysed reactivity parameters over vendor-generated output;
• links static retinal measures with dynamic ones;
• highlights practical limitations in the use of the RVA that may undermine its clinical usefulness;
• provides recommendations for standardising measurements in terms of vessel location and vessel segment length; and
• presents three case reports of essential hypertensives in a -year follow-up.
Strict standardisation of measurement procedures is a necessity when utilising the RVA system. Agreement between research groups on implemented protocols needs to be reached before it can be considered a clinically useful tool for detecting or predicting microvascular dysfunction.
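A typical individually analysed reactivity parameter is the percentage dilation of the vessel during flicker relative to its pre-flicker baseline. A minimal sketch under that assumption; the window lengths, sampling rate and synthetic time series are illustrative and are not RVA output.

```python
import numpy as np

def flicker_dilation_percent(diameter, fs_hz, baseline_s=30, flicker_s=20):
    """Percent change of vessel diameter during flicker vs. baseline.

    diameter: vessel-diameter time series (arbitrary measurement units).
    Baseline is the mean diameter before flicker onset; the response is
    the peak diameter within the flicker window.
    """
    d = np.asarray(diameter, dtype=float)
    n_base = int(baseline_s * fs_hz)
    n_flick = int(flicker_s * fs_hz)
    baseline = d[:n_base].mean()
    peak = d[n_base:n_base + n_flick].max()
    return 100.0 * (peak - baseline) / baseline

# Synthetic example: 25 Hz sampling, ~4% dilation during 30-50 s flicker.
fs = 25
t = np.arange(0, 80, 1 / fs)
diam = 120 + 5 * ((t > 30) & (t < 50)) * np.sin(np.pi * (t - 30) / 20)
print(f"dilation: {flicker_dilation_percent(diam, fs):.1f} %")
```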
Abstract:
In the field of mental health risk assessment, there is no standardisation between the data used in different systems. As a first step towards the possible interchange of data between assessment tools, an ontology has been constructed for a particular one, GRiST (Galatean Risk Screening Tool). We briefly introduce GRiST and its data structures, then describe the ontology and the benefits that have already been realised from the construction process. For example, the ontology has been used to check the consistency of the various trees used in the model. We then consider potential uses in integration of data from other sources. © 2009 IEEE.
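As an illustration of the kind of tree-consistency check described above, the sketch below flags concepts that appear under different parent concepts in different trees. The nested-dict representation and the sample trees are assumptions for illustration, not GRiST's actual data structures.

```python
def tree_concepts(tree, parent=None):
    """Yield (concept, parent) pairs from a nested-dict tree."""
    for concept, subtree in tree.items():
        yield concept, parent
        yield from tree_concepts(subtree, concept)

def inconsistent_parents(trees):
    """Return concepts attached to different parents across the trees."""
    seen, conflicts = {}, set()
    for tree in trees:
        for concept, parent in tree_concepts(tree):
            if concept in seen and seen[concept] != parent:
                conflicts.add(concept)
            seen.setdefault(concept, parent)
    return conflicts

# Two hypothetical assessment trees that disagree about 'impulsivity'.
clinician_tree = {"risk": {"suicide": {"impulsivity": {}}, "self-neglect": {}}}
layperson_tree = {"risk": {"suicide": {}, "impulsivity": {}}}
print(inconsistent_parents([clinician_tree, layperson_tree]))  # {'impulsivity'}
```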
Abstract:
This article explores the settings and practices of translation at three types of political institutions: national, supranational, and non-governmental organisations. The three institutions are the translation service of the German Foreign Office, the translation department of the European Central Bank, and translation provision by the non-governmental organisation Amnesty International. The three case studies describe the specific translation practices in place at these institutions and illustrate some characteristic translation strategies. In this way, we reflect on how different translation practices can impact on translation agency and how these practices in turn are influenced by the type of institution and its organisational structure. The article also explores to what extent the characteristics of collectivity, anonymity and standardisation, and of institutional translation as self-translation, are applicable to the institutions under discussion.
Abstract:
Protein carbonyls are widely analysed as a measure of protein oxidation, and several different methods exist for their determination. A previous study described the orders-of-magnitude variance that arose when protein carbonyls were analysed in a single laboratory by ELISA using different commercial kits. We have further explored the potential causes of variance in carbonyl analysis in a ring study. A soluble protein fraction was prepared from rat liver and exposed to 0, 5 and 15 min of UV irradiation. Lyophilised preparations were distributed to six different laboratories across Europe that routinely undertake protein carbonyl analysis. ELISA and Western blotting techniques detected an increase in protein carbonyl formation between 0 and 5 min of UV irradiation irrespective of the method used. After irradiation for 15 min, half of the laboratories detected less oxidation than after 5 min irradiation. Three of the four ELISA carbonyl results fell within 95% confidence intervals. Likely errors in calculating absolute carbonyl values may be attributed to differences in standardisation. Of the up to 88 proteins identified as containing carbonyl groups after tryptic cleavage of irradiated and control liver proteins, only seven were common to all three liver preparations. Lysine and arginine residues modified by carbonyls are likely to be resistant to tryptic proteolysis, so the use of a cocktail of proteases may increase the recovery of oxidised peptides. In conclusion, standardisation is critical for carbonyl analysis, and heavily oxidised proteins may not be effectively analysed by any existing technique.
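Absolute carbonyl values in such assays are ultimately anchored to the classic DNPH spectrophotometric calculation, so differences in how that calibration is standardised propagate directly into the reported numbers. A minimal sketch, assuming the commonly used molar absorptivity of ~22,000 M^-1 cm^-1 for DNP-hydrazones; the function and the example values are illustrative.

```python
def carbonyl_nmol_per_mg(a370, protein_mg_per_ml, path_cm=1.0, eps=22_000.0):
    """Estimate protein carbonyl content from DNPH-derivatised absorbance.

    a370: absorbance of the DNP-hydrazone at ~370 nm.
    eps:  molar absorptivity in M^-1 cm^-1 (commonly taken as 22,000).
    Returns nmol carbonyl per mg protein via the Beer-Lambert law.
    """
    hydrazone_molar = a370 / (eps * path_cm)   # mol/L of hydrazone
    nmol_per_ml = hydrazone_molar * 1e6        # umol/L == nmol/mL
    return nmol_per_ml / protein_mg_per_ml

# Example: A370 = 0.044 at 1 mg/mL protein -> 2.0 nmol carbonyl per mg.
print(carbonyl_nmol_per_mg(0.044, protein_mg_per_ml=1.0))
```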