25 results for Standardisation
in Aston University Research Archive
Abstract:
The worldwide growth of the translation industry requires qualified professional translators. During the last decade, we have seen an enormous increase in translator training programmes offered by universities, mainly at postgraduate level. A challenge for such university programmes is to ensure that graduates are qualified to meet the needs of a diverse profession in a rapidly changing market. This means that programmes need to be developed with market needs in mind and that they need to ensure a good match between graduates' competences and employers' requirements. This paper addresses the following questions: How can universities adapt translator training programmes to the rapidly changing industry and the accompanying changes in professional profiles? How can we reconcile the industry's demand for graduates with practical and professional skills with the universities' expectation of graduates with in-depth academic knowledge and intellectual skills? What standards and benchmarks are in place to assure the quality of translator training programmes? Such developments in benchmarking are illustrated first for the United Kingdom, followed by information on the European Master's in Translation (EMT) project, an initiative at the European level. Finally, the paper reflects on the challenges which the EMT translator competence profile poses for university programmes.
Abstract:
This paper reports the initial results of a joint research project carried out by Aston University and Lloyd's Register to develop a practical method of assessing neural network applications. A set of assessment guidelines for neural network applications was developed and tested on two applications. These case studies showed that it is practical to assess neural networks in a statistical pattern recognition framework. However, there is a need for more standardisation in neural network technology and a wider take-up of good development practice amongst the neural network community.
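The statistical pattern recognition framing above treats a trained network as a classifier whose measured test-set error carries sampling uncertainty. As a minimal illustration (not drawn from the guidelines themselves), the Wilson score interval below attaches a confidence interval to an observed misclassification rate:

```python
import math

def wilson_interval(errors, n, z=1.96):
    """Wilson score interval for a misclassification rate: one common
    way to attach statistical confidence to a neural network's
    measured error on n held-out test patterns (z=1.96 gives ~95%)."""
    p = errors / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return centre - half, centre + half

# e.g. 12 errors observed on 200 held-out patterns
lo, hi = wilson_interval(12, 200)
```

The width of the interval makes explicit how much (or little) a small test set says about true generalisation error, which is the kind of evidence an assessor would weigh.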
Abstract:
Purpose - To provide a framework of accounting policy choice associated with the timing of adoption of the UK Statement of Standard Accounting Practice (SSAP) No. 20, "Foreign Currency Translation". The conceptual framework describes the accounting policy choices that firms face in a setting that is influenced by: their financial characteristics; the flexible foreign exchange rates; and the stock market response to accounting decisions. Design/methodology/approach - Following the positive accounting theory context, this paper puts into a framework the motives and choices of UK firms with regard to the adoption or deferment of the adoption of SSAP 20. The paper utilises the theoretical and empirical findings of previous studies to form and substantiate the conceptual framework. Given the UK foreign exchange setting, the framework identifies the initial stage: lack of regulation and flexibility in financial reporting; the intermediate stage: accounting policy choice; and the final stage: accounting choice and policy review. Findings - There are situations where accounting regulation contrasts with the needs and business objectives of firms and vice versa. Thus, firms may delay the adoption up to the point where the increase in political costs can just be tolerated. Overall, the study infers that firms might have chosen to defer the adoption of SSAP 20 until they reached a certain corporate goal, or until the adverse impact (if any) of the accounting change on firms' financial numbers was minimal. Thus, the determination of the timing of the adoption is a matter which is subject to the objectives of the managers in association with the market and economic conditions. The paper suggests that the flexibility in financial reporting, which may enhance the scope for income-smoothing, can be mitigated by the appropriate standardisation of accounting practice.
Research limitations/implications - First, the study encompassed a period when firms and investors were less sophisticated users of financial information. Second, it is difficult to ascertain the decisions that firms would have taken, had the pound appreciated over the period of adoption and had the firms incurred translation losses rather than translation gains. Originality/value - This paper is useful to accounting standards setters, professional accountants, academics and investors. The study can give the accounting standard-setting bodies useful information when they prepare a change in the accounting regulation or set an appropriate date for the implementation of an accounting standard. The paper provides significant insight about the behaviour of firms and the associated impacts of financial markets and regulation on the decision-making process of firms. The framework aims to assist the market and other authorities to reduce information asymmetry and to reinforce the efficiency of the market. © Emerald Group Publishing Limited.
Abstract:
This study investigates the degree of global standardisation of a corporate visual identity system (CVIS) in multinational operations. Special emphasis is accorded to UK companies operating in Malaysia. In particular, the study seeks to reveal the reasons for developing a standardised CVIS; the behavioural issues associated with CVIS; and the considerations in selecting a graphic design agency. The findings revealed that multinational corporations in an increasingly corporate environment adopted a standardised CVIS for several reasons, including aiding the sale of products and services, creating an attractive environment for hiring employees, and increasing the company's stature and presence. Further findings show that the interest in global identity was stimulated by global restructuring, merger or acquisition. The above trends help explain why increased focus has been accorded to CVIS over the past five years by many UK companies operating in Malaysia. Additional findings reveal that both UK design agencies and in-house design departments are used in the development of the firms' CVIS.
Abstract:
The advent of the Integrated Services Digital Network (ISDN) led to the standardisation of the first video codecs for interpersonal video communications, followed closely by the development of standards for the compression, storage and distribution of digital video in the PC environment, mainly targeted at CD-ROM storage. At the same time the second-generation digital wireless networks, and the third-generation networks being developed, have enough bandwidth to support digital video services. The radio propagation medium is a difficult environment in which to deploy low bit error rate, real-time services such as video. The video coding standards designed for ISDN and storage applications were targeted at bit error rate levels orders of magnitude lower than those typically experienced on wireless networks. This thesis is concerned with the transmission of digital, compressed video over wireless networks. It investigates the behaviour of motion compensated, hybrid interframe DPCM/DCT video coding algorithms, which form the basis of current coding algorithms, in the presence of the high bit error rates commonly found on digital wireless networks. A group of video codecs, based on the ITU-T H.261 standard, are developed which are robust to the burst errors experienced on radio channels. The radio link is simulated at a low level, to generate typical error files that closely model real-world situations, in a Rayleigh fading environment perturbed by co-channel interference, and on frequency-selective channels which introduce intersymbol interference. Typical anti-multipath techniques, such as antenna diversity, are deployed to mitigate the effects of the channel. Link layer error control techniques are also investigated.
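Burst errors of the kind described are commonly simulated with a two-state Gilbert-Elliott channel model, which alternates between a low-error "good" state and a high-error "bad" (fading) state. The sketch below is illustrative only; its transition probabilities and error rates are invented, not taken from the thesis:

```python
import random

def gilbert_elliott(n_bits, p_gb=0.01, p_bg=0.2,
                    e_good=1e-5, e_bad=0.2, seed=42):
    """Generate a burst-error pattern (1 = bit error) from a two-state
    Markov channel: errors are rare in the good state and cluster
    while the channel dwells in the bad (deep-fade) state."""
    rng = random.Random(seed)
    bad = False
    errors = []
    for _ in range(n_bits):
        if bad:
            bad = rng.random() >= p_bg   # leave the fade with prob p_bg
        else:
            bad = rng.random() < p_gb    # enter a fade with prob p_gb
        rate = e_bad if bad else e_good
        errors.append(1 if rng.random() < rate else 0)
    return errors

pattern = gilbert_elliott(10_000)
ber = sum(pattern) / len(pattern)   # average bit error rate
```

Unlike an independent-error model at the same average BER, the errors here arrive in runs, which is precisely what defeats codecs designed for the near-error-free ISDN environment.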
Abstract:
This text is concerned with the intellectual and social alienation experienced by a twentieth-century German writer (1906- ). The alienation begins in the context of German society, but this context is later globalised. The thesis first discusses the social and intellectual origins and the salient features of this alienated stance, before proceeding to a detailed analysis of its recurring symptoms and later intensification in each of the author's main works, chronologically surveyed, supported by reference to minor writings. From the novels of the thirties, showing the burgher-artist conflict and its symbolic dichotomies, the renunciation of traditional German values, and the ambiguous confrontation with new disruptive socio-political forces, we move to the post-war trilogy (1951-54), with its roots in the German social and political experience of the thirties onwards. The latter, however, is merely a background for the presentation of a much more comprehensive view of the human condition: a pessimistic vision of the repetitiveness and incorrigibility of this condition, the possibility of the apocalypse, the bankruptcy and ineffectiveness of European religion and culture, the 'absurd' meaninglessness of history, the intellectual artist's position and role(s) in mass-culture and an abstract, technologised mass-society, and the central theme of fragmentation - of the structure of reality, society and personality - and the artist's relation to this fragmentation, intensified in the twentieth century. Style and language are consonant with this world-picture. Many of these features recur in the travel-books (1958-61); diachronic as well as synchronic approaches characterise the presentation of various modes of contemporary society in America, Russia, France and other European countries. Important features of intellectual alienation are: the changelessness of historical motifs (e.g. tyranny, aggression), the conventions of burgher society in both old and new forms, the qualitative depreciation and standardisation of living, industrialisation and technology in complex, vulnerable and concentrated urban societies, and the ambiguities of fragmented pluralism. Reference is made to other travel-writers.
Abstract:
The IRDS standard is an international standard produced by the International Organisation for Standardisation (ISO). In this work the process for producing standards in formal standards organisations, for example the ISO, and in more informal bodies, for example the Object Management Group (OMG), is examined. This thesis examines previous models and classifications of standards, which are then combined to produce a new classification. The IRDS standard is then placed in a class in the new model as a reference anticipatory standard. Anticipatory standards are standards which are developed ahead of the technology in order to attempt to guide the market. The diffusion of the IRDS is traced over a period of eleven years. The economic conditions which affect the diffusion of standards are examined, particularly those which prevail in compatibility markets such as the IT and ICT markets. Additionally, the consequences of introducing gateway or converter devices into a market where a standard has not yet been established are examined. The IRDS standard did not have an installed base and this hindered its diffusion. The thesis concludes that the IRDS standard was overtaken by new developments such as object-oriented technologies and middleware. This was partly because of the slow process of developing standards in traditional organisations which operate on a consensus basis and partly because the IRDS standard did not have an installed base. Also, the rise and proliferation of middleware products resulted in exchange mechanisms becoming dominant rather than repository solutions. The research method used in this work is a longitudinal study of the development and diffusion of the ISO/IEC IRDS standard. The research is regarded as a single case study and follows the interpretative epistemological point of view.
Abstract:
Orthodox contingency theory links effective organisational performance to compatible relationships between the environment and organisation strategy and structure, and assumes that organisations have the capacity to adapt as the environment changes. Recent contributions to the literature on organisation theory claim that the key to effective performance is effective adaptation, which in turn requires the simultaneous reconciliation of efficiency and innovation afforded by a unique environment-organisation configuration. The literature on organisation theory recognises the continuing confusion caused by the fragmented and often conflicting results from cross-sectional studies. Although the case is made for longitudinal studies which comprehensively describe the evolving relationship between the environment and the organisation, there is little to suggest how such studies should be executed in practice. Typically the choice is between the approaches of the historicised case study and the statistical analysis of large populations; the latter examine the relationship between environment and organisation strategy and/or structure and ignore the product-process relationship. This study combines the historicised case study and the multi-variable and ordinal scale approach of statistical analysis to construct an analytical framework which tracks and exposes the environment-organisation-performance relationship over time. The framework examines changes in the environment, strategy and structure and uniquely includes an assessment of the organisation's product-process relationship and its contribution to organisational efficiency and innovation. The analytical framework is applied to examine the evolving environment-organisation relationship of two organisations in the same industry over the same twenty-five year period to provide a sector perspective of organisational adaptation.
The findings demonstrate the significance of the environment-organisation configuration to the scope and frequency of adaptation and suggest that the level of sector homogeneity may be linked to the level of product-process standardisation.
Abstract:
Healthcare providers and policy makers are faced with an ever-increasing number of medical publications. Searching for relevant information and keeping up to date with new research findings remains a constant challenge. It has been widely acknowledged that narrative reviews of the literature are susceptible to several types of bias and that a systematic approach may protect against these biases. The aim of this thesis was to apply quantitative methods to the assessment of outcomes of topical therapies for psoriasis; in particular, to systematically examine the comparative efficacy, tolerability and cost-effectiveness of topical calcipotriol in the treatment of mild-to-moderate psoriasis. Over the years, a wide range of techniques have been used to evaluate the severity of psoriasis and the outcomes from treatment. This lack of standardisation complicates the direct comparison of results and ultimately the pooling of outcomes from different clinical trials. There is a clear requirement for more comprehensive tools for measuring drug efficacy and disease severity in psoriasis. Ideally, the outcome measures need to be simple, relevant, practical and widely applicable, and the instruments should be reliable, valid and responsive. The results of the meta-analysis reported herein show that calcipotriol is an effective antipsoriatic agent. In the short term, the pooled data found calcipotriol to be more effective than calcitriol, tacalcitol, coal tar and short-contact dithranol. Only potent corticosteroids appeared to have comparable efficacy, with fewer short-term side-effects. Potent corticosteroids also added to the antipsoriatic effect of calcipotriol, and appeared to suppress the occurrence of calcipotriol-induced irritation. There was insufficient evidence to support any large effects in favour of improvements in efficacy when calcipotriol is used in combination with systemic therapies in patients with severe psoriasis.
However, there was a total absence of long-term morbidity data on the effectiveness of any of the interventions studied. Decision analysis showed that, from the perspective of the NHS as payer, the relatively small differences in efficacy between calcipotriol and short-contact dithranol lead to large differences in the direct cost of treating patients with mild-to-moderate plaque psoriasis. Further research is needed to examine the clinical and economic issues affecting patients under treatment for psoriasis in the UK. In particular, the maintenance value and cost/benefit ratio of the various treatment strategies, and the assessment of patients' preferences, have not yet been adequately addressed for this chronic recurring disease.
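The pooling of trial outcomes referred to above can be illustrated with a generic fixed-effect inverse-variance sketch; the trial figures below are hypothetical and do not reproduce the calcipotriol data:

```python
import math

def pool_fixed_effect(estimates):
    """Fixed-effect inverse-variance meta-analysis: each study
    contributes (effect, standard_error); studies with smaller
    standard errors receive proportionally greater weight."""
    weights = [1.0 / se ** 2 for _, se in estimates]
    pooled = sum(w * eff for (eff, _), w in zip(estimates, weights)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se

# hypothetical trials: (mean difference in severity score vs comparator, SE)
trials = [(-1.2, 0.4), (-0.8, 0.3), (-1.5, 0.6)]
effect, se = pool_fixed_effect(trials)
```

The pooled standard error is smaller than any single trial's, which is the statistical case for systematic pooling over narrative review; it also shows why non-standardised severity scales block the step of forming `trials` in the first place.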
Abstract:
Conventional methods of form-roll design and manufacture for Cold Roll-Forming of thin-walled metal sections have been entirely manual, time consuming and prone to errors, resulting in inefficiency and high production costs. With the use of computers, lead time can be significantly improved, particularly for those aspects involving routine but tedious human decisions and actions. This thesis describes the development of computer aided tools for producing form-roll designs for NC manufacture in the CAD/CAM environment. The work was undertaken to modernise the existing activity of a company manufacturing thin-walled sections. The investigated areas of the activity, including the design and drafting of the finished section, the flower patterns, the 10 to 1 templates, and the rolls complete with pinch-difference surfaces, side-rolls and extension-contours, have been successfully computerised by software development. Data generated by the developed software can be further processed for roll manufacturing using NC lathes. The software has been specially designed for portability to facilitate its implementation on different computers. The Opening-Radii method of forming was introduced as a substitute for the conventional method for better forming. Most of the essential aspects in roll design have been successfully incorporated in the software. With computerisation, extensive standardisation in existing roll design practices and the use of more reliable and scientifically-based methods have been achieved. Satisfactory and beneficial results have also been obtained by the company in using the software through a terminal linked to the University by a GPO line. Both lead time and productivity in roll design and manufacture have been significantly improved. It is therefore concluded that computerisation in the design of form-rolls for automation by software development is viable. The work also demonstrated the promising nature of the CAD/CAM approach.
Abstract:
This thesis focuses on the investigation of the abrasion resistance of fibre reinforced concrete floors at both the macro and micro levels. A review of the available literature concerning subjects allied to the current project is included. This highlights themes relevant to wear mechanisms and the factors influencing them; the factors that affect the abrasion resistance of concrete and several test methods for assessing it; and the historical development of fibres and the properties of different fibre types and their influence on concrete. Three accelerated abrasion testers were compared and critically discussed for their suitability for assessing the abrasion resistance of concrete floors. Based on the experimental findings one accelerated abrasion apparatus was selected as the most appropriate for carrying out the main investigations. The laboratory programme that followed was undertaken to investigate the influence of various material and construction factors on abrasion resistance. These included mix variations (w/c ratio), fibre reinforcement geometry, type and volume, curing method and superplasticizing agents. The results clearly show that these factors significantly affected abrasion resistance, and several mechanisms were proposed to explain and better understand these observations. To verify the mechanisms accountable for the breakdown of concrete slabs, the same concrete specimens that were used for the macro-study were also subjected to microstructural investigations using techniques such as microhardness examination, mercury intrusion porosimetry and petrographic examination. It has been found that the abrasion resistance of concrete is primarily dependent on the microstructure and porosity of the concrete nearest to the surface.
The feasibility of predicting the abrasion resistance of fibre reinforced concrete floors by indirect and non-destructive methods was investigated using five methods that have frequently been used for assessing the quality of concrete. They included the initial surface absorption test, the impact test, ball cratering, the scratch test and the base hardness test. The impact resistance (BRE screed tester) and scratch resistance (Base hardness tester) were found to be the most sensitive to factors affecting abrasion resistance and hence are considered to be the most appropriate testing techniques. In an attempt to develop an appropriate method for assessing the abrasion resistance of heavy-duty industrial concrete floors, it was found that the presence of curing/sealing compound on the concrete surface at the time of accelerated abrasion testing produces inappropriate results. A preliminary investigation in the direction of modifying the Aston accelerated abrasion tester has been carried out and a more aggressive head has been developed and is pending future research towards standardisation.
Abstract:
Mental-health risk assessment practice in the UK is mainly paper-based, with little standardisation in the tools that are used across the Services. The tools that are available tend to rely on minimal sets of items and unsophisticated scoring methods to identify at-risk individuals. This means the reasoning by which an outcome has been determined remains uncertain. Consequently, there is little provision for: including the patient as an active party in the assessment process, identifying underlying causes of risk, and effecting shared decision-making. This thesis develops a tool-chain for the formulation and deployment of a computerised clinical decision support system for mental-health risk assessment. The resultant tool, GRiST, will be based on consensual domain expert knowledge that will be validated as part of the research, and will incorporate a proven psychological model of classification for risk computation. GRiST will have an ambitious remit of being a platform that can be used over the Internet, by both the clinician and the layperson, in multiple settings, and in the assessment of patients with varying demographics. Flexibility will therefore be a guiding principle in the development of the platform, to the extent that GRiST will present an assessment environment that is tailored to the circumstances in which it finds itself. XML and XSLT will be the key technologies that help deliver this flexibility.
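A minimal sketch of the kind of XML-driven tailoring described above, filtering a question tree by audience; the element names and questions are invented for illustration, not GRiST's actual schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical assessment fragment: each question declares which
# audiences may see it, mimicking an environment tailored to either
# the clinician or the layperson.
XML = """
<assessment>
  <question audience="clinician layperson">Have you had thoughts of harming yourself?</question>
  <question audience="clinician">Is there evidence of command hallucinations?</question>
  <question audience="layperson">Do you feel safe at home?</question>
</assessment>
"""

def questions_for(audience):
    """Return the question texts visible to the given audience."""
    root = ET.fromstring(XML)
    return [q.text for q in root.iter("question")
            if audience in q.get("audience", "").split()]

lay = questions_for("layperson")
```

In a deployed system the same selection would more naturally live in an XSLT stylesheet applied to the knowledge-tree XML, so that presentation logic stays outside the application code.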
Abstract:
Diabetic retinopathy (DR) remains the leading cause of blindness among working-age individuals in developed countries. Current treatments for DR are indicated in advanced stages of the disease and are associated with significant adverse effects. Therefore, new pharmacological treatments for the early stages of DR are needed. DR has been classically considered to be a microcirculatory disease of the retina. However, there is growing evidence to suggest that retinal neurodegeneration is an early event in the pathogenesis of DR, which participates in the microcirculatory abnormalities that occur in DR. Therefore, the study of the underlying mechanisms that lead to neurodegeneration will be essential for identifying new therapeutic targets. From the clinical point of view, the identification of those patients in whom retinal neurodegeneration appears will be crucial for implementing early treatment based on neuroprotective drugs. When the early stages of DR are the therapeutic target, it would be inconceivable to recommend an aggressive treatment such as intravitreous injections. By contrast, topical administration of neuroprotective drugs by using eye drops is a possible option. However, clinical trials to determine the safety and effectiveness of this non-invasive route, as well as a standardisation of the methods for monitoring neurodegeneration, are needed.
Abstract:
Optical coherence tomography (OCT) is a non-invasive three-dimensional imaging system that is capable of producing high resolution in-vivo images. OCT is approved for use in clinical trials in Japan, the USA and Europe. For OCT to be used effectively in clinical diagnosis, a method of standardisation is required to assess performance across different systems. This standardisation can be implemented using highly accurate and reproducible artefacts for calibration, both at installation and throughout the lifetime of a system. Femtosecond lasers can write highly reproducible and highly localised micro-structured calibration artefacts within a transparent medium. We report on the fabrication of high quality OCT calibration artefacts in fused silica using a femtosecond laser. The calibration artefacts were written in fused silica due to its high purity and ability to withstand high energy femtosecond pulses. An Amplitude Systemes s-Pulse Yb:YAG femtosecond laser with an operating wavelength of 1026 nm was used to inscribe three-dimensional patterns within the highly optically transmissive substrate. Four unique artefacts have been designed to measure a wide variety of parameters, including the point spread function (PSF), modulation transfer function (MTF), sensitivity, distortion and resolution - key parameters which define the performance of an OCT system. The calibration artefacts have been characterised using an optical microscope and tested on a swept-source OCT. The results demonstrate that the femtosecond laser inscribed artefacts have the potential to quantitatively and qualitatively validate the performance of any OCT system.
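Characterising a point-like artefact's PSF typically reduces to measuring its full width at half maximum (FWHM). The sketch below, not taken from the paper, does this on a synthetic Gaussian profile standing in for a real OCT line scan through an inscribed point feature:

```python
import math

def fwhm(profile, spacing=1.0):
    """Full width at half maximum of a sampled 1-D PSF profile,
    with linear interpolation at the two half-maximum crossings.
    `spacing` is the physical distance between samples."""
    peak = max(profile)
    half = peak / 2.0
    above = [i for i, v in enumerate(profile) if v >= half]
    left, right = above[0], above[-1]

    def cross(i_in, i_out):
        # interpolate between a sample at/above half-maximum (i_in)
        # and its neighbour below half-maximum (i_out)
        t = (profile[i_in] - half) / (profile[i_in] - profile[i_out])
        return i_in + t * (i_out - i_in)

    x_left = cross(left, left - 1) if left > 0 else float(left)
    x_right = cross(right, right + 1) if right < len(profile) - 1 else float(right)
    return (x_right - x_left) * spacing

# synthetic Gaussian PSF: true FWHM = 2*sqrt(2*ln 2)*sigma ~ 2.355*sigma
sigma = 4.0
profile = [math.exp(-(x - 50) ** 2 / (2 * sigma ** 2)) for x in range(101)]
width = fwhm(profile)
```

Applied to scans of an artefact of known geometry, the same measurement taken at installation and at later service intervals gives a direct check on resolution drift in a given OCT system.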
Abstract:
Semantic Web Services, one of the most significant research areas within the Semantic Web vision, have attracted increasing attention from both the research community and industry. The Web Service Modelling Ontology (WSMO) has been proposed as an enabling framework for the total or partial automation of the tasks (e.g., discovery, selection, composition, mediation, execution and monitoring) involved in both intra- and inter-enterprise integration of Web services. To support the standardisation of WSMO and the development of tools for it, a formal model of the language is highly desirable. As several variants of WSMO have been proposed by the WSMO community, and these are still under development, the syntax and semantics of WSMO should be formally defined to facilitate easy reuse and future development. In this paper, we present a formal Object-Z model of WSMO, in which different aspects of the language are precisely defined within one unified framework. This not only provides a formal, unambiguous model which can be used to develop tools and facilitate future development, but, as demonstrated in this paper, can also be used to identify and eliminate errors present in existing documentation.