32 results for Man-Machine Perceptual Performance
in Aston University Research Archive
Abstract:
Five-axis machine tools are becoming increasingly popular as customers demand more complex machined parts. In high-value manufacturing, machine tools are essential to producing high-accuracy products. High-accuracy manufacturing requires producing parts repeatably and in precise compliance with the defined design specifications. The performance of machine tools is often affected by geometric errors arising from a variety of causes, including incorrect tool offsets, errors in the centres of rotation and thermal growth. As a consequence, it can be difficult to produce highly accurate parts consistently. It is therefore essential to ensure that machine tools are verified in terms of their geometric and positioning accuracy. Once a machine tool has been verified in this way, the resulting numerical values of positional accuracy and process capability can be used to define design-for-verification rules and algorithms, so that machined parts can be produced without scrap and with little or no post-process measurement. In this paper the benefits of machine tool verification are listed, and a case study is used to demonstrate the implementation of robust machine tool performance measurement and diagnostics using a ballbar system.
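To make the verification metric concrete, the sketch below reduces a set of ballbar radial readings to a peak-to-peak circularity figure of the kind used to judge positional accuracy. It is a minimal illustration, not the diagnostic procedure of the paper; the function name, nominal radius and simulated readings are assumed.

```python
# Minimal sketch (not the paper's method): reducing raw ballbar radial
# readings to a peak-to-peak circularity figure. Names, nominal radius
# and simulated data are illustrative assumptions.
import numpy as np

def circularity_error(radial_readings_mm, nominal_radius_mm=100.0):
    """Return peak-to-peak radial deviation (circularity) in micrometres."""
    deviations_um = (np.asarray(radial_readings_mm) - nominal_radius_mm) * 1000.0
    return deviations_um.max() - deviations_um.min()

# Example: simulated readings for one circular test at 100 mm radius,
# with a small centre-offset error superimposed on measurement noise.
theta = np.linspace(0, 2 * np.pi, 360, endpoint=False)
readings = 100.0 + 0.004 * np.cos(theta) + np.random.normal(0, 0.0005, theta.size)
print(f"Circularity: {circularity_error(readings):.1f} um")
```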
Abstract:
Part I presents a re-examination of fundamental concepts and a formal structuring of the waveform analysis problem: for example, the nature of frequency is examined, and a novel alternative to the classical methods of detection is proposed and implemented which has the advantages of speed and independence from amplitude. Waveform analysis provides the link between Parts I and II. Part II is devoted to Human Factors and the Adaptive Task Technique. The historical, technical and intellectual development of the technique is traced in a review which examines the evidence of its advantages relative to non-adaptive, fixed-task methods of training, skill assessment and man-machine optimisation. A second review examines research evidence on the effect of vibration on manual control ability. Findings are presented in terms of the percentage increment or decrement in performance relative to performance without vibration in the range 0-0.6 RMS g. Primary task performance was found to vary by as much as 90% between tasks at the same RMS g; differences in task difficulty accounted for this variation. Within tasks, vibration-added difficulty accounted for the effects of vibration intensity. Secondary tasks were found to be largely insensitive to vibration, except those involving fine manual adjustment of minor controls. Three experiments are then reported in which an adaptive technique was used to measure the percentage task difficulty added by vertical random and sinusoidal vibration to a critical compensatory tracking task. At vibration intensities between 0 and 0.09 RMS g it was found that random vibration added (24.5 × RMS g)/7.4 × 100% to the difficulty of the control task. An equivalence relationship between random and sinusoidal vibration effects was established based upon added task difficulty. Waveform analyses applied to the experimental data served to validate phase-plane analysis and uncovered the development of a control strategy and possibly a vibration isolation strategy. The submission ends with an appraisal of the subjects mentioned in the thesis title.
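The quoted relationship between random vibration intensity and added task difficulty is simple enough to evaluate directly. The snippet below does just that, treating the formula exactly as stated in the abstract; it is an illustrative check, not code from the thesis.

```python
# Illustrative check of the reported relationship:
# added difficulty (%) = (24.5 * rms_g) / 7.4 * 100.
# This merely evaluates the abstract's formula; it is not the thesis code.

def added_task_difficulty_percent(rms_g: float) -> float:
    """Percentage difficulty added to the tracking task by random vibration."""
    return (24.5 * rms_g) / 7.4 * 100.0

for rms_g in (0.03, 0.06, 0.09):
    print(f"{rms_g:.2f} RMS g -> +{added_task_difficulty_percent(rms_g):.0f}% task difficulty")
```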
Abstract:
Owing to the rise in the volume of literature, problems arise in the retrieval of required information. Various retrieval strategies have been proposed, but most of them are not flexible enough for their users. Specifically, most of these systems assume that users know exactly what they are looking for before approaching the system, and that users are able to express their information needs precisely according to laid-down specifications. A retrieval program, THOMAS, has however been described which aims at satisfying incompletely defined user needs through a man-machine dialogue that does not require any rigid queries. Unlike most systems, Thomas attempts to satisfy the user's needs from a model which it builds of the user's area of interest. This model is a subset of the program's "world model", a database in the form of a network whose nodes represent concepts. Since different concepts have different degrees of similarity and association, this thesis contends that instead of models which assume equal levels of similarity between concepts, the links between the concepts should have values assigned to them to indicate their degree of similarity. Furthermore, the world model of the system should be structured such that concepts which are related to one another are clustered together, so that a user interaction involves only the relevant clusters rather than the entire database, such clusters being determined by the system, not the user. This thesis also attempts to link the design work with the current notion in psychology centred on the use of the computer to simulate human cognitive processes. In this case, an attempt has been made to model a dialogue between two people: the information seeker and the information expert. The system, called Thomas-II, has been implemented and found to require less effort from the user than Thomas.
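The data structure the thesis argues for can be sketched briefly: a concept network whose links carry similarity weights and whose nodes are grouped into clusters, so that a dialogue touches only the relevant cluster. The example below is a minimal, hypothetical illustration of that structure; the class, concept names and weights are invented and are not taken from Thomas-II.

```python
# Hypothetical sketch of a weighted, clustered concept network
# of the kind the thesis proposes. Not Thomas-II itself.
from collections import defaultdict

class ConceptNetwork:
    def __init__(self):
        self.links = defaultdict(dict)   # concept -> {neighbour: similarity 0..1}
        self.cluster_of = {}             # concept -> cluster id

    def add_link(self, a, b, similarity, cluster):
        self.links[a][b] = similarity
        self.links[b][a] = similarity
        self.cluster_of[a] = cluster
        self.cluster_of[b] = cluster

    def related(self, concept, threshold=0.5):
        """Concepts in the same cluster whose link weight exceeds the threshold."""
        cluster = self.cluster_of.get(concept)
        return [c for c, w in self.links[concept].items()
                if w >= threshold and self.cluster_of.get(c) == cluster]

net = ConceptNetwork()
net.add_link("information retrieval", "indexing", 0.8, cluster="IR")
net.add_link("information retrieval", "user modelling", 0.6, cluster="IR")
print(net.related("information retrieval", threshold=0.7))   # -> ['indexing']
```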
Abstract:
This work attempts to create a systemic design framework for man-machine interfaces which is self-consistent, compatible with other concepts, and applicable to real situations. This is tackled by examining the current architecture of computer applications packages. The treatment is in the main philosophical and theoretical, and analyses the origins, assumptions and current practice of the design of applications packages. It proposes that the present form of packages is fundamentally contradictory to the notion of packaging itself, because, as an indivisible ready-to-implement solution, current package architecture displays the following major disadvantages. First, it creates problems as a result of user-package interactions, in which the designer tries to mould all potential individual users, no matter how diverse they are, into one model; this is worsened by the minimal provision, if any, of important properties such as flexibility, independence and impartiality. Second, it displays a rigid structure that reduces the variety and/or multiple use of the component parts of such a package. Third, it dictates specific hardware and software configurations, which is likely to reduce the degrees of freedom available to its user. Fourth, it increases the dependence of the user upon the supplier through inadequate documentation and understanding of the package. Fifth, it tends to cause a degeneration of the design expertise of data processing practitioners. In view of this understanding, an alternative methodological design framework is proposed which is consistent both with the systems approach and with the role of a package in its likely context. The proposition is based upon an extension of the identified concept of the hierarchy of holons, which facilitates the examination of the complex relationships of a package with its two principal environments: first, the user's characteristics and decision-making practices and procedures, implying an examination of the user's M.I.S. network; second, the software environment and its influence upon a package regarding its support, control and operation. The framework is built gradually as the discussion advances around the central theme of compatible M.I.S., software and model design. This leads to the formation of an alternative package architecture based upon the design of a number of independent, self-contained small parts. This is believed to constitute the nucleus around which not only can packages be more effectively designed, but which is also applicable to the design of many other man-machine systems.
River basin surveillance using remotely sensed data: a water resources information management system
Resumo:
This thesis describes the development of an operational river basin water resources information management system. The river or drainage basin is the fundamental unit of the system, both in the modelling and prediction of hydrological processes and in the monitoring of the effect of catchment management policies. A primary concern of the study is the collection of sufficient, and sufficiently accurate, information to model hydrological processes. Remote sensing, in combination with conventional point-source measurement, can be a valuable source of information, but is often overlooked by hydrologists due to the cost of acquisition and processing. This thesis describes a number of cost-effective methods of acquiring remotely sensed imagery, from airborne video survey to real-time ingestion of meteorological satellite data. Inexpensive micro-computer systems and peripherals are used throughout to process and manipulate the data. Spatial information systems provide a means of integrating these data with topographic and thematic cartographic data, and with historical records. For the system to have any real potential the data must be stored in a readily accessible format and be easily manipulated within the database. The design of efficient man-machine interfaces and the use of software engineering methodologies are therefore included in this thesis as a major part of the design of the system. The use of low-cost technologies, from micro-computers to video cameras, enables the introduction of water resources information management systems into developing countries, where the potential benefits are greatest.
Abstract:
The work presented in this thesis is concerned with the dynamic behaviour of structural joints which are both loaded, and excited, normal to the joint interface. Since the forces on joints are transmitted through their interface, the surface texture of joints was carefully examined. A computerised surface measuring system was developed and computer programs were written. Surface flatness was functionally defined, measured and quantised into a form suitable for the theoretical calculation of the joint stiffness. Dynamic stiffness and damping were measured at various preloads for a range of joints with different surface textures. Dry clean and lubricated joints were tested, and the results indicated an increase in damping for the lubricated joints of between 30 and 100 times. A theoretical model for the computation of the stiffness of dry clean joints was built. The model is based on the theory that the elastic recovery of joints is due to the recovery of the material behind the loaded asperities. It takes into account, in a quantitative manner, the flatness deviations present on the surfaces of the joint. The theoretical results were found to be in good agreement with those measured experimentally. It was also found that a theoretical assessment of the joint stiffness could be carried out using a different model based on the recovery of loaded asperities into a spherical form. Stepwise procedures are given for designing a joint having a particular stiffness. A theoretical model for the loss factor of dry clean joints was also built; its results are in reasonable agreement with those measured experimentally. The theoretical models for the stiffness and loss factor were employed to evaluate the second natural frequency of the test rig, and the results are in good agreement with the experimentally measured natural frequencies.
Abstract:
High-precision manufacturers continuously seek out disruptive technologies to improve the quality, cost, and delivery of their products. With the advancement of machine tool and measurement technology, many companies are ready to capitalise on the opportunity of on-machine measurement (OMM). Coupled with the business case, manufacturing engineers are now questioning whether OMM can soon eliminate the need for post-process inspection systems. Metrologists will, however, argue that the machining environment is too hostile and that there are numerous process variables which need consideration before traceable measurement on the machine can be achieved. In this paper we test the measurement capability of five new multi-axis machine tools enabled as OMM systems via on-machine probing. All systems are tested under various operating conditions in order to better understand the effects of potentially significant variables. This investigation has found that key process variables such as machine tool warm-up and tool-change cycles can affect machine tool measurement repeatability. The new data presented here are important to many manufacturers who are considering utilising their high-precision multi-axis machine tools for both the creation and verification of their products.
Abstract:
Owing to high-speed rotation, problems of rotor mechanics and dynamics are more serious for outer-rotor high-speed machines than for conventional ones. In view of these problems, a mechanical and dynamic analysis of an outer-rotor high-speed permanent magnet claw pole motor is carried out. An analytical model for calculating the rotor stress was derived, and the stress distribution was also calculated by the finite element method; the two sets of results are in close agreement. In addition, the stress distribution of the outer rotor yoke and permanent magnets was calculated taking centrifugal force and temperature effects into account, and the influence of factors such as the pole-arc coefficient and speed on the rotor stress distribution was analysed. The rotor natural frequencies and critical speed were calculated by vibration mode analysis, and the dynamic characteristics influenced by the gyroscopic effect were analysed using a Campbell diagram. Based on these analysis results, an outer-rotor permanent magnet high-speed claw pole motor is designed and verified.
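As a rough illustration of why centrifugal loading dominates the rotor stress at high speed, the sketch below evaluates the classical thin-ring hoop stress, sigma = rho * omega^2 * r^2. It is a back-of-envelope check under assumed material and geometry values, not the analytical model derived in the paper.

```python
# Back-of-envelope check in the spirit of an analytical rotor stress model
# (not the paper's derivation): hoop stress in a thin rotating ring.
# All numbers below are assumed for illustration.
import math

def thin_ring_hoop_stress_mpa(rho_kg_m3: float, speed_rpm: float, radius_m: float) -> float:
    """sigma = rho * (omega * r)^2, returned in MPa."""
    omega = 2.0 * math.pi * speed_rpm / 60.0          # rad/s
    return rho_kg_m3 * (omega * radius_m) ** 2 / 1e6  # Pa -> MPa

# e.g. a steel rotor yoke (7850 kg/m^3), 40 mm outer radius, 30 000 r/min
print(f"{thin_ring_hoop_stress_mpa(7850, 30000, 0.04):.0f} MPa")   # ~124 MPa
```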
Abstract:
The collect-and-place machine is one of the most widely used placement machines for assembling electronic components on printed circuit boards (PCBs). Nevertheless, very little research has addressed the optimisation of this machine's performance. This motivates us to study the component scheduling problem for this type of machine with the objective of minimising the total assembly time. The component scheduling problem is an integration of the component sequencing problem, that is, the sequencing of component placements, and the feeder arrangement problem, that is, the assignment of component types to feeders. To solve the component scheduling problem efficiently, a hybrid genetic algorithm is developed in this paper. A numerical example is used to compare the performance of the algorithm with different component grouping approaches and different population sizes.
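A candidate solution to this integrated problem can be encoded in two parts, one for the placement sequence and one for the feeder assignment, as the hypothetical sketch below shows. The cost function is a crude travel proxy and the search loop is a random stand-in for the genetic operators; none of it is taken from the paper's hybrid GA.

```python
# Hypothetical two-part encoding for the component scheduling problem:
# a placement sequence plus a component-type-to-feeder assignment.
# The cost and the random search are illustrative stand-ins only.
import random

COMPONENT_TYPES = ["R1", "C2", "IC3", "D4"]
PLACEMENTS = ["R1", "C2", "R1", "IC3", "D4", "C2"]   # parts the board calls for
FEEDER_SLOTS = 4

def random_chromosome():
    sequence = random.sample(range(len(PLACEMENTS)), len(PLACEMENTS))
    feeders = random.sample(range(FEEDER_SLOTS), len(COMPONENT_TYPES))
    return sequence, feeders

def assembly_time(chromosome):
    """Toy cost: feeder-slot distance travelled between consecutive picks."""
    sequence, feeders = chromosome
    slot_of = dict(zip(COMPONENT_TYPES, feeders))
    slots = [slot_of[PLACEMENTS[i]] for i in sequence]
    return sum(abs(a - b) for a, b in zip(slots, slots[1:]))

best = min((random_chromosome() for _ in range(1000)), key=assembly_time)
print("best cost:", assembly_time(best))
```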
Abstract:
The value of technology and the appropriate form of transfer arrangement are important questions to be resolved when transferring technology between Western manufacturing firms and partners in industrialising and developing countries. This article reports on surveys carried out in the machine tool industries in the UK and China to establish the differences and similarities between owners and acquirers of technology regarding the relative importance of the factors they evaluate, and the assessments they make, when considering a technology transfer. It also outlines the development of a framework for technology valuation. The survey results indicate that the value of product technology is related to superior technical performance, especially reliability and functionality, and to the prospects of premium prices and increased sales of the technology-transfer-based machine tools. Access to markets is the main objective of UK companies, while Chinese companies are concerned with improving their technological capability. There are significant risks, especially related to performance in the market, and while owners and acquirers have benefited in the short term, the long-term collaboration required for strategic benefits has been difficult to achieve because of the different priorities of the owners and the acquirers.
Abstract:
Traditional high-speed machinery actuators are powered and coordinated by mechanical linkages driven from a central drive, but these linkages may be replaced by independently synchronised electric drives. Problems associated with utilising such electric drives for this form of machinery were investigated. The research concentrated on a high-speed rod-making machine, which required control of high inertias (0.01-0.5 kg m²) at continuous high speed (2500 r/min) with low relative phase errors between two drives (0.0025 radians). Traditional minimum-energy drive selection techniques for incremental motions were not applicable to continuous applications, which require negligible energy dissipation, so new selection techniques were developed. A brushless configuration constant enabled the comparison of seven different servo systems; the rare-earth brushless drives had the best power rates, power rate being used as the performance measure. Simulation was used to review control strategies, and a microprocessor controller was designed with a proportional velocity loop nested within a proportional position loop with velocity feedforward. Local control schemes were investigated as a means of reducing relative errors between drives: in the master/slave scheme the slave compensates for the master's errors; the matched scheme uses drives with similar absolute errors so that the relative error is minimised; and the feedforward scheme minimises error by adding compensation based on previous knowledge. Simulation gave the approximate velocity loop bandwidth and position loop gain required to meet the specification. Theoretical limits for these parameters were defined in terms of digital sampling delays, quantisation, and system phase shifts, and the performance degradation due to mechanical backlash was evaluated. Thus any drive can be checked to ensure that the performance specification can be realised. A two-drive demonstrator was commissioned with 0.01 kg m² loads. By use of simulation the performance of one drive was improved by increasing the velocity loop bandwidth fourfold. With the master/slave scheme, relative errors were within 0.0024 radians at a constant 2500 r/min for two 0.01 kg m² loads.
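The controller structure described, a proportional velocity loop nested inside a proportional position loop with velocity feedforward, can be summarised in a few lines, as in the sketch below. The gains, sample values and units are illustrative assumptions, not the thesis design.

```python
# Minimal sketch of a cascaded P position / P velocity controller with
# velocity feedforward. Gains and sample values are assumed for illustration.
def control_step(pos_demand, vel_demand, pos_actual, vel_actual,
                 kp_pos=50.0, kp_vel=2.0):
    """Return a torque demand for one sample of the cascaded loop."""
    vel_ref = kp_pos * (pos_demand - pos_actual) + vel_demand   # outer loop + feedforward
    torque_demand = kp_vel * (vel_ref - vel_actual)             # inner velocity loop
    return torque_demand

# One sample: small position lag while running at 2500 r/min (261.8 rad/s)
print(control_step(pos_demand=10.0, vel_demand=261.8,
                   pos_actual=9.998, vel_actual=261.5))
```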
Abstract:
This thesis presents an examination of the factors which influence the performance of eddy-current machines and the way in which they affect the optimality of those machines. After a brief introduction to the types of eddy-current machine considered, the applications to which these machines are put are examined. A list of parameters by which to assess their performance is obtained by considering the machine as part of a system; in this way an idea of what constitutes an optimal machine is obtained. The third chapter then identifies the factors which affect the performance and makes a quantitative evaluation of their effect. Here the various alternative configurations and components are compared with regard to their influence on the mechanical, electromagnetic, and thermal performance criteria of the machine. Chapter four contains a brief review of the methods of controlling eddy-current machines electronically, using thyristors or transistors as the final control element. Where necessary, the results of previous workers in the field of electrical machines have been extended or adapted to increase the usefulness of this thesis.
Abstract:
Keyword identification in one of two simultaneous sentences is improved when the sentences differ in F0, particularly when they are almost continuously voiced. Sentences of this kind were recorded, monotonised using PSOLA, and re-synthesised to give a range of harmonic ΔF0s (0, 1, 3, and 10 semitones). They were additionally re-synthesised by LPC with the LPC residual frequency shifted by 25% of F0, to give excitation with inharmonic but regularly spaced components. Perceptual identification of frequency-shifted sentences showed a similar large improvement with nominal ΔF0 as seen for harmonic sentences, although overall performance was about 10% poorer. We compared performance with that of two autocorrelation-based computational models comprising four stages: (i) peripheral frequency selectivity and half-wave rectification; (ii) within-channel periodicity extraction; (iii) identification of the two major peaks in the summary autocorrelation function (SACF); (iv) a template-based approach to speech recognition using dynamic time warping. One model sampled the correlogram at the target-F0 period and performed spectral matching; the other deselected channels dominated by the interferer and performed matching on the short-lag portion of the residual SACF. Both models reproduced the monotonic increase observed in human performance with increasing ΔF0 for the harmonic stimuli, but not for the frequency-shifted stimuli. A revised version of the spectral-matching model, which groups patterns of periodicity that lie on a curve in the frequency-delay plane, showed a closer match to the perceptual data for frequency-shifted sentences. The results extend the range of phenomena originally attributed to harmonic processing to grouping by common spectral pattern.
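Stages (ii) and (iii) of the models described, within-channel periodicity extraction pooled into a summary autocorrelation function whose two major peaks are identified, can be sketched briefly. The code below is an assumed, simplified illustration: the peripheral filterbank of stage (i) is omitted, and the channel signals, sampling rate and lag range are invented for the example.

```python
# Simplified sketch of SACF computation and peak picking (stages ii-iii).
# Channel signals, sampling rate and lag range are assumed for illustration.
import numpy as np

def summary_autocorrelation(channels, max_lag):
    """channels: array (n_channels, n_samples) of half-wave-rectified outputs."""
    sacf = np.zeros(max_lag)
    for x in channels:
        for lag in range(1, max_lag + 1):
            sacf[lag - 1] += np.dot(x[:-lag], x[lag:])
    return sacf

def two_major_peaks(sacf):
    """Indices of the two largest local maxima of the SACF."""
    peaks = [i for i in range(1, len(sacf) - 1)
             if sacf[i] > sacf[i - 1] and sacf[i] > sacf[i + 1]]
    return sorted(sorted(peaks, key=lambda i: sacf[i], reverse=True)[:2])

# Two channels dominated by 100 Hz and 125 Hz periodicities at 8 kHz sampling
t = np.arange(0, 0.1, 1 / 8000)
channels = np.maximum(0, np.vstack([np.sin(2 * np.pi * 100 * t),
                                    np.sin(2 * np.pi * 125 * t)]))
sacf = summary_autocorrelation(channels, max_lag=120)
print([p + 1 for p in two_major_peaks(sacf)])   # expect lags near 64 (125 Hz) and 80 (100 Hz)
```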
Abstract:
Currently over 50 million people worldwide wear contact lenses, of whom over 75% wear hydrogel lenses. Significant deposition occurs in approximately 80% of hydrogel lenses, and many contact lens wearers cease wearing lenses because of problems associated with deposition. The contact lens field is not alone in encountering complications associated with interactions between the body and artificial devices: the widespread use of man-made materials to replace structures in the body has emphasised the importance of studies that examine the interactions between implantation materials and body tissues. This project used carefully controlled, randomised clinical studies to examine the interactive effects of contact lens materials, care systems, replacement periods and patient differences. Of principal interest was the influence of these factors on material deposition and their subsequent impact on subjective performance. A range of novel and established analytical techniques was used to examine hydrogel lenses following carefully controlled clinical studies in which clinical performance was meticulously monitored. These studies enabled the inter-relationship between clinical performance and deposition to be evaluated. This project showed that significant differences exist between individuals in their propensity to deposit hydrogel lenses, with approximately 20% of subjects displaying significant deposition irrespective of the lens material. Additionally, materials traditionally categorised together show markedly different spoilation characteristics, which are wholly attributable to their detailed chemical structure. For the first time the in vivo deposition kinetics of both protein and lipid in charged and uncharged polymers were demonstrated. In addition, the importance of care systems in the deposition process was shown, clearly demonstrating that it is the quality rather than the quantity of deposition that influences subjective performance.
Abstract:
We examined the relations between selection for perception and selection for action in a patient, FK, with bilateral damage to his temporal and medial frontal cortices. The task required a simple grasp response to a common object (a cup) in the presence of a distractor (another cup). The target was cued by colour or location, and FK made manual responses. We examined the effects on performance of the cued and uncued dimensions of both the target and the distractor. FK was impaired at perceptually selecting the target when it was cued by colour and the target colour, but not its location, changed on successive trials. The effect was sensitive to the relative orientations of targets and distractors, indicating an effect of action selection on perceptual selection when perceptual selection was weakly instantiated. The dimension-specific carry-over effect on reaching was enhanced when there was a temporal delay between the cue and the response, and it disappeared when there was a between-trial delay. The results indicate that perceptual and action selection systems interact to determine the efficiency with which actions are selected to particular objects.