229 results for Computerised videotape


Abstract:

This research began with an attempt to solve a practical problem, namely, the prediction of the rate at which an operator will learn a task. From a review of the literature, communications with researchers in this area and the study of psychomotor learning in factories, it was concluded that a more fundamental approach was required, one which included the development of a task taxonomy. This latter objective had been researched for over twenty years by E. A. Fleishman, and his approach was adopted. Three studies were carried out to develop and extend Fleishman's approach to the industrial area. However, the results of these studies were not in accord with Fleishman's conclusions and suggested that a critical re-assessment was required of the arguments, methods and procedures used by Fleishman and his co-workers. It was concluded that Fleishman's findings were to some extent an artifact of the approximate methods and procedures which he used in the original factor analyses, and that using more modern computerised factor analytic methods a reliable ability taxonomy could be developed to describe the abilities involved in the learning of psychomotor tasks. The implications for a changing-task or changing-subject model were drawn, and it was concluded that a combined changing-task and changing-subject model needs to be developed.
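
A minimal sketch of the kind of computerised factor analysis referred to, using scikit-learn; the score matrix, factor count and rotation are hypothetical stand-ins, not data or settings from the studies.

```python
# Illustrative sketch: computerised factor analysis of psychomotor task
# scores, of the kind contrasted with Fleishman's approximate hand methods.
# The data below are random stand-ins; real input would be subject-by-task
# performance scores.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
scores = rng.normal(size=(200, 12))   # scores[i, j]: subject i, task j

fa = FactorAnalysis(n_components=3, rotation="varimax")
fa.fit(scores)

# loadings[k, j]: weight of latent ability k on task j; inspecting which
# tasks load on which factor is what yields an ability taxonomy.
loadings = fa.components_
for k, row in enumerate(loadings):
    top = np.argsort(-np.abs(row))[:3]
    print(f"ability factor {k}: strongest tasks {top.tolist()}")
```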

Abstract:

A history of government drug regulation and the relationship between the pharmaceutical companies in the U.K. and the licensing authority is outlined. Phases of regulatory stringency are identified, with the formation of the Committees on Safety of Drugs and Medicines viewed as watersheds. A study of the impact of government regulation on industrial R&D activities focuses on the effects on the rate and direction of new product innovation. A literature review examines the decline in new chemical entity innovation. Regulations are cited as a major, but not the sole, cause of the decline. Previous research attempting to determine the causes of such a decline on an empirical basis is reviewed, and the methodological problems associated with such research are identified. The U.K.-owned sector of the British pharmaceutical industry is selected for a study employing a bottom-up approach allowing disaggregation of data. A historical background to the industry is provided, with each company analysed on a case-study basis. Variations between companies regarding the policies adopted for R&D are emphasised. The process of drug innovation is described in order to determine possible indicators of the rate and direction of inventive and innovative activity. All possible indicators are considered and their suitability assessed. R&D expenditure data for the period 1960-1983 are subsequently presented as an input indicator. Intermediate output indicators are treated in a similar way, and patent data are identified as a readily available and useful source. The advantages and disadvantages of using such data are considered. Using interview material, patenting policies for most of the U.K. companies are described, providing a background for a patent-based study. Sources of patent data are examined with an emphasis on computerised systems. A number of searches using a variety of sources are presented. Patent family size is examined as a possible indicator of an invention's relative importance. The patenting activity of the companies over the period 1960-1983 is given and the variation between companies is noted. The relationship between patent data and the other indicators used is analysed using statistical methods, resulting in an apparent lack of correlation. An alternative approach taking into account variations in company policy and phases in research activity indicates a stronger relationship between patenting activity, R&D expenditure and NCE output over the period. The relationship is not apparent at an aggregated company level. Some evidence is presented for a relationship between phases of regulatory stringency and inventive and innovative activity, but the importance of other factors is emphasised.
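
A hedged sketch of the per-company indicator comparison described, correlating annual patent counts with R&D expenditure; the company names and yearly figures are invented placeholders, not data from the thesis.

```python
# Per-company Pearson correlation of annual patent counts against R&D
# expenditure, the disaggregated comparison the abstract describes.
from statistics import correlation  # Python 3.10+

patents_by_company = {          # hypothetical annual patent counts
    "CompanyA": [3, 5, 4, 8, 9],
    "CompanyB": [1, 1, 2, 2, 3],
}
rnd_spend_by_company = {        # hypothetical R&D spend, deflated £m
    "CompanyA": [0.4, 0.6, 0.5, 0.9, 1.1],
    "CompanyB": [0.1, 0.1, 0.2, 0.25, 0.3],
}

for name in patents_by_company:
    r = correlation(patents_by_company[name], rnd_spend_by_company[name])
    print(f"{name}: patent/R&D correlation r = {r:.2f}")
```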

Abstract:

The state of the art in productivity measurement and analysis shows a gap between simple methods having little relevance in practice and sophisticated mathematical theory which is unwieldy for strategic and tactical planning purposes, particularly at company level. An extension is made in this thesis to the method of productivity measurement and analysis based on the concept of added value, appropriate to those companies in which the materials, bought-in parts and services change substantially and in which a number of plants and inter-related units are involved in providing components for final assembly. Reviews and comparisons of productivity measurement dealing with alternative indices and their problems have been made, and appropriate solutions put forward for productivity analysis in general and the added value method in particular. Based on this concept and method, three kinds of computerised model have been developed: two deterministic, called sensitivity analysis and deterministic appraisal, and a third, stochastic, called risk simulation. These cope with the planning of productivity and productivity growth with reference to changes in their component variables, ranging from a single value to a class interval of values of a productivity distribution. The models are designed to be flexible and can be adjusted according to the available computer capacity, the expected accuracy and the presentation of the output. The stochastic model is based on the assumption of statistical independence between individual variables and the existence of normality in their probability distributions. The component variables have been forecast using polynomials of degree four. This model was tested by comparing its behaviour with that of a mathematical model using real historical data from British Leyland, and the results were satisfactory within acceptable levels of accuracy. Modifications to the model and its statistical treatment have been made as required. The results of applying these measurement and planning models to the British motor vehicle manufacturing companies are presented and discussed.
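
A minimal sketch, under the stated assumptions of independent normal component variables and degree-four polynomial forecasts, of how such a risk simulation can be run; all figures are illustrative, not British Leyland data.

```python
# Monte Carlo "risk simulation" sketch: fit a degree-4 polynomial trend to
# a component variable, then sample independent normal components and form
# the added-value productivity ratio.
import numpy as np

rng = np.random.default_rng(1)

years = np.arange(10)
added_value_hist = 100 + 3.0 * years + rng.normal(0, 2, len(years))

coeffs = np.polyfit(years, added_value_hist, 4)   # degree-4 trend
next_year_mean = np.polyval(coeffs, len(years))

av = rng.normal(next_year_mean, 4.0, 10_000)      # added value samples
costs = rng.normal(70.0, 3.0, 10_000)             # employment cost samples
productivity = av / costs                         # added-value productivity

lo, hi = np.percentile(productivity, [5, 95])
print(f"forecast productivity: 90% interval [{lo:.2f}, {hi:.2f}]")
```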

Abstract:

The work presented in this thesis is concerned with the dynamic behaviour of structural joints which are both loaded, and excited, normal to the joint interface. Since the forces on joints are transmitted through their interface, the surface texture of joints was carefully examined. A computerised surface measuring system was developed and computer programs were written. Surface flatness was functionally defined, measured and quantised into a form suitable for the theoretical calculation of the joint stiffness. Dynamic stiffness and damping were measured at various preloads for a range of joints with different surface textures. Dry, clean and lubricated joints were tested, and the results indicated an increase in damping for the lubricated joints of between 30 and 100 times. A theoretical model for the computation of the stiffness of dry, clean joints was built. The model is based on the theory that the elastic recovery of joints is due to the recovery of the material behind the loaded asperities. It takes into account, in a quantitative manner, the flatness deviations present on the surfaces of the joint. The theoretical results were found to be in good agreement with those measured experimentally. It was also found that theoretical assessment of the joint stiffness could be carried out using a different model based on the recovery of loaded asperities into a spherical form. Stepwise procedures are given for designing a joint having a particular stiffness. A theoretical model for the loss factor of dry, clean joints was also built, and its results are in reasonable agreement with those experimentally measured. The theoretical models for the stiffness and loss factor were employed to evaluate the second natural frequency of the test rig, and the results are in good agreement with the experimentally measured natural frequencies.
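
For context only: a model based on spherical asperity recovery is naturally related to Hertzian contact, so the sketch below shows the textbook single sphere-on-flat stiffness calculation. This is a generic illustration under that assumption, not the thesis's own flatness-based model; all values are hypothetical.

```python
# Textbook Hertzian sketch: normal stiffness of one sphere-on-flat asperity
# contact, k = 2 * E* * a, with contact radius a = (3FR / 4E*)^(1/3).
def hertz_contact_stiffness(force_n, radius_m, e_star_pa):
    a = (3.0 * force_n * radius_m / (4.0 * e_star_pa)) ** (1.0 / 3.0)
    return 2.0 * e_star_pa * a     # N/m

E_STAR = 115e9   # combined modulus E* for steel on steel, Pa (approx.)
k = hertz_contact_stiffness(0.5, 10e-6, E_STAR)   # 0.5 N on a 10 um tip
print(f"single-asperity stiffness: {k / 1e6:.2f} N/um")
```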

Abstract:

Conventional methods of form-roll design and manufacture for Cold Roll-Forming of thin-walled metal sections have been entirely manual, time-consuming and prone to errors, resulting in inefficiency and high production costs. With the use of computers, lead time can be significantly improved, particularly for those aspects involving routine but tedious human decisions and actions. This thesis describes the development of computer-aided tools for producing form-roll designs for NC manufacture in the CAD/CAM environment. The work was undertaken to modernise the existing activity of a company manufacturing thin-walled sections. The investigated areas of the activity, including the design and drafting of the finished section, the flower patterns, the 10 to 1 templates, and the rolls complete with pinch-difference surfaces, side-rolls and extension-contours, have been successfully computerised by software development. Data generated by the developed software can be further processed for roll manufacturing using NC lathes. The software has been specially designed for portability, to facilitate its implementation on different computers. The Opening-Radii method of forming was introduced as a substitute for the conventional method, giving better forming. Most of the essential aspects of roll design have been successfully incorporated in the software. With computerisation, extensive standardisation of existing roll design practices and the use of more reliable and scientifically based methods have been achieved. Satisfactory and beneficial results have also been obtained by the company in using the software through a terminal linked to the University by a GPO line. Both lead time and productivity in roll design and manufacture have been significantly improved. It is therefore concluded that computerisation of form-roll design by software development is a viable route to automation. The work also demonstrated the promising nature of the CAD/CAM approach.

Abstract:

The aim of the research project was to gain a complete and accurate accounting of the needs and deficiencies of materials selection and design data, with particular attention given to the feasibility of a computerised materials selection system that would include application analysis, property data and screening techniques. The project also investigates and integrates the three major aspects of materials resources, materials selection and materials recycling. Consideration of the materials resource base suggests that, though our discovery potential has increased, geologic availability is the ultimate determinant, and several metals may well become scarce at the same time, thus compounding the problem of substitution. With around 2 to 20 million units of engineering materials data, the use of a computer is the only logical answer for scientific selection of materials. The system developed at Aston is used for data storage, mathematical computation and output. The system enables programs to be run in batch and interactive (on-line) mode. The program, with modification, can also handle such variables as quantity of mineral resources, energy cost of materials, and depletion and utilisation rates of strategic materials. The work also carries out an in-depth study of copper recycling in the U.K. and concludes that somewhere in the region of 2 million tonnes of copper is missing from the recycling cycle. It also sets out guidelines on product design and conservation policies from the recyclability point of view.
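
A minimal sketch of the screening technique such a system rests on: filtering a property database against design requirements. The records and thresholds below are hypothetical, not entries from the Aston system.

```python
# Materials screening sketch: keep only materials whose properties fall
# inside the required (min, max) windows; None means "no bound".
materials = [
    {"name": "Al 6061",    "yield_mpa": 275, "density": 2.70},
    {"name": "Ti-6Al-4V",  "yield_mpa": 880, "density": 4.43},
    {"name": "Mild steel", "yield_mpa": 250, "density": 7.85},
]

requirements = {"yield_mpa": (300, None), "density": (None, 5.0)}

def passes(mat, reqs):
    for prop, (lo, hi) in reqs.items():
        v = mat[prop]
        if (lo is not None and v < lo) or (hi is not None and v > hi):
            return False
    return True

shortlist = [m["name"] for m in materials if passes(m, requirements)]
print(shortlist)   # -> ['Ti-6Al-4V']
```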

Abstract:

The aim of this study was to determine whether an ophthalmophakometric technique could offer a feasible means of investigating ocular component contributions to residual astigmatism in human eyes. Current opinion was gathered on the prevalence, magnitude and source of residual astigmatism. It emerged that a comprehensive evaluation of the astigmatic contributions of the eye's internal ocular surfaces and their respective axial separations (effectivity) had not been carried out to date. An ophthalmophakometric technique was developed to measure astigmatism arising from the internal ocular components. Procedures included the measurement of refractive error (infra-red autorefractometry), anterior corneal surface power (computerised video keratography), axial distances (A-scan ultrasonography) and the powers of the posterior corneal surface and of both surfaces of the crystalline lens (multi-meridional still-flash ophthalmophakometry). Computing schemes were developed to yield the required biometric data. These included (1) calculation of crystalline lens surface powers in the absence of Purkinje images arising from its anterior surface, (2) application of meridional analysis to derive spherocylindrical surface powers from notional powers calculated along four pre-selected meridians, (3) application of astigmatic decomposition and vergence analysis to calculate the contributions to residual astigmatism of ocular components with obliquely related cylinder axes, and (4) calculation of the effect of random experimental errors on the calculated ocular component data. A complete set of biometric measurements was taken from both eyes of 66 undergraduate students. Effectivity due to corneal thickness made the smallest cylinder power contribution (up to 0.25DC) to residual astigmatism, followed by the contributions of the anterior chamber depth (up to 0.50DC) and crystalline lens thickness (up to 1.00DC). In each case astigmatic contributions were predominantly direct. More astigmatism arose from the posterior corneal surface (up to 1.00DC) and both crystalline lens surfaces (up to 2.50DC). The astigmatic contributions of the posterior corneal and lens surfaces were found to be predominantly inverse, whilst direct astigmatism arose from the anterior lens surface. Very similar results were found for right versus left eyes and for males versus females. Repeatability was assessed on 20 individuals. The ophthalmophakometric method was found to be prone to considerable accumulated experimental error; however, these errors are random in nature, so that group-averaged data were found to be reasonably repeatable. A further confirmatory study carried out on 10 individuals demonstrated that biometric measurements made with and without cycloplegia did not differ significantly.
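
The oblique-cylinder summation in step (3) is conventionally handled with power vectors. The sketch below shows the standard Thibos decomposition as one plausible reading of that computing scheme; the surface powers used are invented, and this is not the study's actual code.

```python
# Standard power-vector method: convert each sphero-cylinder to (M, J0, J45),
# add components, convert back to sphere/cylinder/axis.
import math

def to_vector(sph, cyl, axis_deg):
    ax = math.radians(axis_deg)
    return (sph + cyl / 2.0,
            -cyl / 2.0 * math.cos(2 * ax),
            -cyl / 2.0 * math.sin(2 * ax))

def to_sphcyl(m, j0, j45):
    c = -2.0 * math.hypot(j0, j45)
    s = m - c / 2.0
    axis = (math.degrees(math.atan2(j45, j0)) / 2.0) % 180.0
    return s, c, axis

# Two surface contributions with obliquely related cylinder axes
a = to_vector(0.00, -1.00, 10)
b = to_vector(0.00, -0.50, 60)
m, j0, j45 = (x + y for x, y in zip(a, b))
print("resultant sph/cyl/axis:", to_sphcyl(m, j0, j45))
```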

Abstract:

The aim of this Interdisciplinary Higher Degrees project was the development of a high-speed method of photometrically testing vehicle headlamps, based on the use of image processing techniques, for Lucas Electrical Limited. Photometric testing involves measuring the illuminance produced by a lamp at certain points in its beam distribution. Headlamp performance is best represented by an iso-lux diagram, showing illuminance contours, produced from a two-dimensional array of data. Conventionally, the tens of thousands of measurements required are made using a single stationary photodetector and a two-dimensional mechanical scanning system which enables a lamp's horizontal and vertical orientation relative to the photodetector to be changed. Even using motorised scanning and computerised data-logging, the data acquisition time for a typical iso-lux test is about twenty minutes. A detailed study was made of the concept of using a video camera and a digital image processing system to scan and measure a lamp's beam without the need for the time-consuming mechanical movement. Although the concept was shown to be theoretically feasible, and a prototype system designed, it could not be implemented because of the technical limitations of commercially available equipment. An alternative high-speed approach was developed, however, and a second prototype system designed. The proposed arrangement again uses an image processing system, but in conjunction with a one-dimensional array of photodetectors and a one-dimensional mechanical scanning system in place of a video camera. This system can be implemented using commercially available equipment and, although not entirely eliminating the need for mechanical movement, greatly reduces the amount required, resulting in a predicted data acquisition time of about twenty seconds for a typical iso-lux test. As a consequence of the work undertaken, the company initiated an £80,000 programme to implement the system proposed by the author.
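
A sketch of how the proposed one-dimensional arrangement could assemble an iso-lux diagram: each detector-row reading taken at successive scan positions is stacked into the two-dimensional illuminance array, from which contours are drawn. The `read_detector_row` function and beam profile are hypothetical stand-ins for the hardware interface.

```python
# Stack 1-D photodetector rows into a 2-D illuminance array, then plot
# iso-lux contours from it.
import numpy as np
import matplotlib.pyplot as plt

N_COLS, N_ROWS = 64, 48          # detectors per row, vertical scan steps

def read_detector_row(step):     # stand-in: synthetic Gaussian beam
    x = np.linspace(-2, 2, N_COLS)
    y = (step - N_ROWS / 2) / 10.0
    return np.exp(-(x**2 + y**2))   # normalised lux

frame = np.stack([read_detector_row(s) for s in range(N_ROWS)])

plt.contour(frame, levels=[0.1, 0.3, 0.5, 0.8])   # iso-lux diagram
plt.title("Iso-lux contours (synthetic)")
plt.savefig("isolux.png")
```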

Abstract:

This research investigates knowledge acquisition and concept formation in the domain of economics and business studies through a foreign language, English, from the very initial to the very final stage of development, in the context of Higher Education in Turkey. It traces the processes as well as the product of acquisition in order to provide a detailed picture of how knowledge acquisition occurs. It aims to explore ways in which the acquisition process can be facilitated and promoted while prospective students of the Department of Economics and Business Administration receive a language training programme, following the completion of which they join an academic community which offers part of its courses through the English language. The study draws upon theories of the mental representation of knowledge, such as schema, frame and script. The concept of a discourse community and its characteristics is investigated, the enculturation of prospective students to acquire knowledge of their domain through L2 is explored, and the crucial role of constructivist theory in relation to knowledge acquisition is highlighted. The study was conducted through a process of enculturation taking place partly at the language centre of Çukurova University and partly in the target discourse community. The data used for initiating knowledge acquisition were obtained by establishing a corpus of economics and business texts, which the learners are expected to read during their academic courses using computerised technology. Think-aloud protocols were used to analyse the processes taking place in knowledge acquisition, while the product of what was acquired was investigated by means of written recall protocols. It was found that knowledge acquisition operates on the basis of analogical, and to a certain extent metaphorical, reasoning. The evidence obtained from the think-aloud protocols showed that neophytes were able to acquire fundamental concepts of their future domain by reaching a level of shared understanding with the members of their target community in the faculty. Diary and questionnaire analyses demonstrated that enculturation facilitated learners' transition from the language centre into the target community. Analyses of the written recall protocols and of examinations from the post-enculturation stage of the research showed that the academic performance of enculturated neophytes in their target community was much higher than that of their non-enculturated counterparts. The processes learners go through, and the strategies they spontaneously make use of while acquiring knowledge of a specific domain through L2, have so far remained unexplored research areas. The present research contributes to language and knowledge acquisition theories by examining closely and systematically the language learners use and the strategies they employ in acquiring such knowledge. The findings offer useful implications for English language teaching at language schools: language teachers are provided with guidelines as to how they can give prospective students of a particular academic community experience of acquiring fundamental concepts of their discipline before they become members of their target community.

Abstract:

The thesis deals with the background, development and description of a mathematical stock control methodology for use within an oil and chemical blending company, where demand and replenishment lead-times are generally non-stationary. The stock control model proper relies, as input, on adaptive forecasts of demand determined for an economical forecast/replenishment period, precalculated on an individual stock-item basis. The control procedure is principally of the continuous review, reorder level type, where the reorder level and reorder quantity 'float', that is, each changes in accordance with changes in demand. Two versions of the Methodology are presented: a cost minimisation version and a service level version. Recognising the importance of demand forecasts, four recognised variations of the Trigg and Leach adaptive forecasting routine are examined. A fifth variation, developed here, is proposed as part of the stock control methodology. The results of testing the cost minimisation version of the Methodology with historical data, by means of a computerised simulation, are presented together with a description of the simulation used. The performance of the Methodology also compares favourably with a rule-of-thumb approach considered by the Company as an interim solution for reducing stock levels. The contribution of the work to the field of scientific stock control is felt to be significant for the following reasons: (1) the Methodology is designed specifically for use with non-stationary demand and for this reason alone appears to be unique; (2) the Methodology is unique in its approach, and the cost minimisation version is shown to work successfully with the demand data presented; (3) the Methodology, and the thesis as a whole, fill an important gap between complex mathematical stock control theory and practical application. A brief description of a computerised order processing/stock monitoring system, designed and implemented as a prerequisite for the Methodology's practical operation, is presented as an appendix.
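
A minimal sketch of the basic Trigg and Leach routine on which the four examined variations build, in its standard tracking-signal formulation: the smoothing constant is set each period to the absolute ratio of smoothed error to smoothed absolute error, so the forecast responds faster when demand shifts. The demand figures are illustrative.

```python
# Trigg & Leach adaptive exponential smoothing (standard formulation).
def trigg_leach(demand, beta=0.2):
    forecast = demand[0]
    sm_err = sm_abs_err = 0.0
    out = []
    for d in demand:
        err = d - forecast
        sm_err = beta * err + (1 - beta) * sm_err
        sm_abs_err = beta * abs(err) + (1 - beta) * sm_abs_err
        # adaptive alpha = |tracking signal|, falling back to beta initially
        alpha = abs(sm_err / sm_abs_err) if sm_abs_err else beta
        forecast += alpha * err          # next-period forecast
        out.append(forecast)
    return out

# A step change in demand: the forecast adapts quickly after period 3
print(trigg_leach([20, 22, 21, 40, 42, 41, 43]))
```

A floating reorder level of the kind described could then be recomputed each period from the latest forecast plus a safety allowance.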

Abstract:

This research investigated expertise in hazardous substance risk assessment (HSRA). Competent, proactive risk assessment is needed to prevent future occupational ill-health caused by hazardous substance exposure. In recent years there has been a strong demand for HSRA expertise and a shortage of expert practitioners. The discipline of occupational hygiene was identified as the key repository of knowledge and skills for HSRA, and one objective of this research was to develop a method to elicit this expertise from experienced occupational hygienists. In the study of generic expertise, many methods of knowledge elicitation (KE) have been investigated, since this has been relevant to the development of 'expert systems' (thinking computers). Here, knowledge needed to be elicited from human experts, and this stage was often a bottleneck in system development, since experts could not explain the basis of their expertise. At an intermediate stage, the information collected was used to structure a basic model of hazardous substance risk assessment activity (HSRA Model B), and this formed the basis of tape transcript analysis in the main study, with derivation of a 'classification' and a 'performance matrix'. The study aimed to elicit the expertise of occupational hygienists and compare their performance with that of other health and safety professionals (occupational health physicians, occupational health nurses, health and safety practitioners and trainee health and safety inspectors), as evaluated using the matrix. As a group, the hygienists performed best in the exercise, and were particularly good at process elicitation and at recommending specific control measures, although the other groups also performed well in selected aspects of the matrix and the work provided useful findings and insights. From the research, two models of HSRA and an HSRA aid have been derived, together with a novel videotape KE technique and other research findings. The implications are discussed with respect to the future training of health and safety professionals and the wider application of the videotape KE method.

Abstract:

Mental-health risk assessment practice in the UK is mainly paper-based, with little standardisation in the tools that are used across the Services. The tools that are available tend to rely on minimal sets of items and unsophisticated scoring methods to identify at-risk individuals. This means the reasoning by which an outcome has been determined remains uncertain. Consequently, there is little provision for: including the patient as an active party in the assessment process, identifying underlying causes of risk, and effecting shared decision-making. This thesis develops a tool-chain for the formulation and deployment of a computerised clinical decision support system for mental-health risk assessment. The resultant tool, GRiST, will be based on consensual domain expert knowledge that will be validated as part of the research, and will incorporate a proven psychological model of classification for risk computation. GRiST will have an ambitious remit of being a platform that can be used over the Internet, by both the clinician and the layperson, in multiple settings, and in the assessment of patients with varying demographics. Flexibility will therefore be a guiding principle in the development of the platform, to the extent that GRiST will present an assessment environment that is tailored to the circumstances in which it finds itself. XML and XSLT will be the key technologies that help deliver this flexibility.
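
A small sketch of the XML + XSLT tailoring idea, driven from Python via lxml: the same assessment knowledge (XML) is transformed into different presentations per audience or setting. The XML fragment and stylesheet are hypothetical, not the actual GRiST knowledge tree.

```python
# Transform one XML knowledge fragment into an audience-specific view.
from lxml import etree

knowledge = etree.XML(
    "<assessment><item audience='clinician'>suicidal intent</item>"
    "<item audience='layperson'>thoughts of ending your life</item>"
    "</assessment>")

stylesheet = etree.XML("""
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:param name="audience"/>
  <xsl:template match="/">
    <ul><xsl:for-each select="//item[@audience=$audience]">
      <li><xsl:value-of select="."/></li>
    </xsl:for-each></ul>
  </xsl:template>
</xsl:stylesheet>""")

transform = etree.XSLT(stylesheet)
# Same knowledge, tailored view: pass the audience as an XSLT parameter
print(transform(knowledge, audience=etree.XSLT.strparam("layperson")))
```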

Abstract:

The present study examines facilitative effects of trait emotional intelligence on decision making in a socially moderated, financial context. One hundred participants completed the Trait Emotional Intelligence Questionnaire and a computerised gambling card game designed to simulate financial decision making. The results show that participants scoring high on the sociability factor made significantly better decisions in certain card game conditions than their lower-scoring counterparts. Results are discussed in light of dual-process theories.

Abstract:

Purpose: To evaluate the impact of splitting toric power on patient tolerance of misorientation, such as occurs with intraocular lens rotation. Setting: University vision clinic. Methods: Healthy non-astigmats had +1.50D of astigmatism induced with spectacle lenses at 90°, 135° and 180°, and +3.00D at 90°. Two correcting cylindrical lenses of the opposite sign and half the power each were subsequently added to the trial frame, misaligned by 0°, 5° or 10° in a random order, and misorientated from the initial axis in a clockwise direction by up to 15° in 5° steps. A second group of adapted astigmats with between 1.00 and 3.00DC had their astigmatism corrected with two toric spectacle lenses of half the power, separated by 0°, 5° or 10° and misorientated from the initial axis in both directions by up to 15° in 5° steps. Distance high-contrast visual acuity was measured using a computerised test chart at each lens misalignment and misorientation. Results: Misorientation of the split toric lenses caused a statistically significant drop in visual acuity (F = 70.341, p < 0.001). Comparatively better acuities were observed around 180°, as anticipated (F = 3.775, p = 0.035). Misaligning the split toric power produced no benefit in visual acuity retention with axis misorientation, whether subjects had astigmatism induced with a low (F = 2.190, p = 0.129) or high cylinder (F = 0.491, p = 0.617) or were adapted astigmats (F = 0.120, p = 0.887). Conclusion: Misalignment of toric lens power split across the front and back lens surfaces had no beneficial effect on distance visual acuity, but also no negative effect. © 2013 British Contact Lens Association.
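
For context, the tolerance being probed follows the standard crossed-cylinder result (quoted here for orientation, not taken from the paper): a single correcting cylinder of power C rotated by an angle θ from its intended axis leaves a residual cylinder of magnitude 2C sin θ.

```latex
% Residual cylinder from rotating a correcting cylinder of power C by theta:
\[
  C_{\mathrm{res}} = 2C\sin\theta
\]
% Worked example: C = 1.50\,\mathrm{DC}, \theta = 10^\circ gives
% C_{\mathrm{res}} = 2(1.50)\sin 10^\circ \approx 0.52\,\mathrm{DC}.
```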

Abstract:

Background: The Aston Medication Adherence Study was designed to examine non-adherence to prescribed medicines within an inner-city population using general practice (GP) prescribing data. Objective: To examine non-adherence patterns to prescribed oral medications within three chronic disease states and to compare differences in adherence levels between various patient groups, to assist the routine identification of low adherence amongst patients within the Heart of Birmingham teaching Primary Care Trust (HoBtPCT). Setting: Patients within the area covered by HoBtPCT (England) prescribed medication for dyslipidaemia, type-2 diabetes and hypothyroidism, between 2000 and 2010 inclusive. HoBtPCT's population was disproportionately young, with seventy per cent of residents from Black and Minority Ethnic groups. Method: Systematic computational analysis of all medication issue data from 76 GP surgeries dichotomised patients into two groups (adherent and non-adherent) for each pharmacotherapeutic agent within the treatment groups. Dichotomised groupings were further analysed by recorded patient demographics to identify predictors of lower adherence levels. Results were compared with an analysis of a self-report measure of adherence [using the Modified Morisky Scale© (MMAS-8)] and clinical value data (cholesterol values) from GP surgery records. Main outcome: Adherence levels for different patient demographics, for patients within specific long-term treatment groups. Results: Analysis within all three groups showed that adherence levels were statistically lower than for others among patients: younger than 60 years of age; whose religion is coded as "Islam"; whose ethnicity is coded as one of the Asian groupings or as "Caribbean", "Other Black" and "African"; whose primary language is coded as "Urdu" or "Bengali"; and whose postcodes indicate that they live within the most socioeconomically deprived areas of HoBtPCT. Statistically significant correlations were found between adherence status and both the self-report measure of adherence and the clinical value data. Conclusion: Using data from GP prescribing systems, a computerised tool to calculate individual adherence levels for oral pharmacotherapy for the treatment of diabetes, dyslipidaemia and hypothyroidism has been developed. The tool has been used to establish non-adherence levels within the three treatment groups and the demographic characteristics indicative of lower adherence levels, which in turn will enable the targeting of interventional support within HoBtPCT. © Koninklijke Nederlandse Maatschappij ter bevordering der Pharmacie 2013.
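
A hedged sketch of one conventional way to dichotomise adherence from issue data, a medication possession ratio (MPR) with the common 80% cut-off; the abstract does not give the study's precise algorithm, and the dates and supplies below are invented.

```python
# Medication possession ratio from GP issue records: days' supply issued
# divided by the days in the observation window, classed at 80%.
from datetime import date

issues = [  # (issue date, days' supply) for one patient and one drug
    (date(2009, 1, 5), 28), (date(2009, 2, 9), 28),
    (date(2009, 3, 20), 28), (date(2009, 5, 1), 28),
]

covered = sum(days for _, days in issues)
period = (issues[-1][0] - issues[0][0]).days + issues[-1][1]
mpr = covered / period
print(f"MPR = {mpr:.2f} -> {'adherent' if mpr >= 0.8 else 'non-adherent'}")
```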