27 results for Tool Use
in Aston University Research Archive
Abstract:
The article explores the possibilities of formalizing and explaining the mechanisms that support spatial and social perspective alignment sustained over the duration of a social interaction. The basic proposed principle is that in social contexts the mechanisms for sensorimotor transformations and multisensory integration (learn to) incorporate information relative to the other actor(s), similar to the "re-calibration" of visual receptive fields in response to repeated tool use. This process aligns or merges the co-actors' spatial representations and creates a "Shared Action Space" (SAS) supporting key computations of social interactions and joint actions; for example, the remapping between the coordinate systems and frames of reference of the co-actors, including perspective taking, the sensorimotor transformations required for jointly lifting an object, and the prediction of the sensory effects of such joint action. The social re-calibration is proposed to be based on common basis function maps (BFMs) and could constitute an optimal solution to sensorimotor transformation and multisensory integration in joint action or, more generally, in social interaction contexts. However, certain situations, such as discrepant postural and viewpoint alignment and the associated differences in perspective between the co-actors, could constrain the process quite differently. We discuss how alignment is achieved in the first place and how it is maintained over time, providing a taxonomy of various forms and mechanisms of space alignment and overlap based, for instance, on automaticity vs. control of the transformations between the two agents. Finally, we discuss the link between low-level mechanisms for the sharing of space and high-level mechanisms for the sharing of cognitive representations. © 2013 Pezzulo, Iodice, Ferraina and Kessler.
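As a rough illustration of the kind of cross-actor remapping the SAS is meant to support (a geometric toy example only, not the authors' BFM model), re-expressing a target location from one actor's egocentric frame in a co-actor's frame is a rigid coordinate transform. The helper name and the 2D setup below are assumptions made for the sketch.

```python
import numpy as np

def remap_to_partner_frame(point_ego, partner_pos, partner_heading):
    """Re-express a 2D point from actor A's egocentric frame in co-actor B's frame.

    point_ego       : (x, y) of the target in A's frame
    partner_pos     : B's origin, expressed in A's frame
    partner_heading : B's heading in A's frame, radians (0 = same heading as A)
    """
    c, s = np.cos(-partner_heading), np.sin(-partner_heading)
    rotate_into_b = np.array([[c, -s], [s, c]])
    return rotate_into_b @ (np.asarray(point_ego) - np.asarray(partner_pos))

# Target 2 m ahead of A; B stands 1 m ahead of A with the same heading,
# so for B the target is only 1 m ahead.
print(remap_to_partner_frame((2.0, 0.0), (1.0, 0.0), 0.0))    # -> [1. 0.]
# B stands at the target and faces back toward A: the target sits at B's origin.
print(remap_to_partner_frame((2.0, 0.0), (2.0, 0.0), np.pi))  # -> ~[0. 0.]
```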
Abstract:
Knowledge elicitation is a well-known bottleneck in the production of knowledge-based systems (KBS). Past research has shown that visual interactive simulation (VIS) could effectively be used to elicit episodic knowledge that is appropriate for machine learning purposes, with a view to building a KBS. Nonetheless, the VIS-based elicitation process still has much room for improvement. A research project based at the Ford Dagenham Engine Assembly Plant is being undertaken to investigate the individual and joint effects of visual display level and mode of problem case generation on the elicitation process. This paper looks at the methodology employed and some issues that have been encountered to date. Copyright © 2007 Inderscience Enterprises Ltd.
Abstract:
Background: Previous work has shown that medical problems can be diagnosed by practitioners using Google. The aim of this study was to determine whether optometry students would benefit from using Google when diagnosing eye diseases. Methods: Participants were given symptoms and signs and instructed to list three key words and use them to search the Aston University e-Library and Google UK. Results: Searching the Aston University e-Library alone resulted in a correct diagnosis in 16 of 60 simulated cases; searching the e-Library plus Google resulted in a correct diagnosis in 31 of 60 simulated cases. Conclusion: Google is a useful aid to help optometry students improve their success rate when diagnosing eye conditions.
Abstract:
The point of departure for this study was a recognition of the differences in suppliers' and acquirers' judgements of the value of technology when transferred between the two, and the significant impacts of technology valuation on the establishment of technology partnerships and the effectiveness of technology collaborations. The perceptions, transfer strategies and objectives, perceived benefits and assessed technology contributions, as well as the associated costs and risks, of both suppliers and acquirers were seen to be at the core of these differences. This study hypothesised that the capability embodied in technology to yield future returns makes technology valuation distinct from the process of valuing manufactured products. The study has hence gone beyond the dimensions of cost calculation and price determination discussed in the existing literature, by taking a broader view of how to achieve and share future added value from transferred technology. The core of technology valuation was argued to be the evaluation of the 'quality' of the capability (technology) in generating future value and the effectiveness of the transfer arrangement for the best use of such a capability. A dynamic approach comprising future value generation and realisation within the context of specific forms of collaboration was therefore adopted. The research investigations focused on the UK and Chinese machine tool industries, where there are many technology transfer activities and the value issue has already been recognised in practice. Data were gathered from three groups: machine tool manufacturing technology suppliers in the UK, technology acquirers in China, and machine tool users in China. Data collection methods included questionnaire surveys and case studies within all three groups. The study focused on identifying and examining the major factors affecting value, as well as their interactive effects on technology valuation, from both the supplier's and the acquirer's points of view. The survey results showed the perceptions and assessments of the owner's value and the transfer value from the supplier's and acquirer's points of view respectively. Benefits, costs and risks related to the technology transfer were the major factors affecting the value of technology. The impact of transfer payment on the value of technology, through the sharing of financial benefits, costs and risks between partners, was assessed. A close relationship between technology valuation and transfer arrangements was established, in which technical requirements and strategic implications were considered. The case studies reflected the research propositions and revealed that benefits, costs and risks in the financial, technical and strategic dimensions interacted in the process of technology valuation within the context of technology collaboration. Further to the assessment of factors affecting value, a technology valuation framework was developed which suggests that technology attributes for the enhancement of contributory factors, and their contributions to the realisation of transfer objectives, need to be measured and compared with the associated costs and risks. The study concluded that technology valuation is a dynamic process encompassing the generation and sharing of future value and the interactions between financial, technical and strategic achievements.
Abstract:
Conventional differential scanning calorimetry (DSC) techniques are commonly used to quantify the solubility of drugs within polymeric-controlled delivery systems. However, the nature of the DSC experiment, and in particular the relatively slow heating rates employed, limit its use to the measurement of drug solubility at the drug's melting temperature. Here, we describe the application of hyper-DSC (HDSC), a variant of DSC involving extremely rapid heating rates, to the calculation of the solubility of a model drug, metronidazole, in silicone elastomer, and demonstrate that the faster heating rates permit the solubility to be calculated under non-equilibrium conditions such that the solubility better approximates that at the temperature of use. At a heating rate of 400°C/min (HDSC), metronidazole solubility was calculated to be 2.16 mg/g compared with 6.16 mg/g at 20°C/min. © 2005 Elsevier B.V. All rights reserved.
Abstract:
This paper presents a case study of the use of a visual interactive modelling system to investigate issues involved in the management of a hospital ward. Visual interactive modelling systems are seen to offer the learner the opportunity to explore operational management issues from a variety of perspectives and to provide an interactive system in which the learner receives feedback on the consequences of their actions. However, to maximise the potential learning experience, it must be recognised that students require a task structure which helps them to understand the concepts involved. These factors can be incorporated into the visual interactive model by providing an interface customised to guide the student through the experimentation. Recent developments of VIM systems in terms of their connectivity with the programming language Visual Basic facilitate this customisation.
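For flavour, a minimal sketch of the kind of guided ward experiment such a customised interface might wrap, written here with Python and SimPy rather than a commercial VIM package; the bed numbers, arrival rate and length-of-stay figures are illustrative assumptions, not data from the case study.

```python
import random
import simpy  # general-purpose discrete-event simulation library

def patient(env, beds, stay_mean, waits):
    arrived = env.now
    with beds.request() as bed:                  # queue for a free bed
        yield bed
        waits.append(env.now - arrived)          # time spent waiting for admission
        yield env.timeout(random.expovariate(1.0 / stay_mean))   # length of stay

def run_ward(n_beds, arrival_mean=4.0, stay_mean=24.0, horizon=24 * 30, seed=1):
    """One guided 'experiment': vary n_beds and observe the mean admission wait (hours)."""
    random.seed(seed)
    env, waits = simpy.Environment(), []
    beds = simpy.Resource(env, capacity=n_beds)

    def arrivals():
        while True:
            yield env.timeout(random.expovariate(1.0 / arrival_mean))
            env.process(patient(env, beds, stay_mean, waits))

    env.process(arrivals())
    env.run(until=horizon)
    return sum(waits) / len(waits) if waits else 0.0

# The structured comparison a customised interface could lead a student through:
for n in (6, 8, 10):
    print(f"{n} beds -> mean wait {run_ward(n):.1f} h")
```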
Abstract:
The nature of Discrete-Event Simulation (DES) and the use of DES in organisations are changing. Two important developments are the use of Visual Interactive Modelling systems and the use of DES in Business Process Management (BPM) projects. Survey research is presented that shows that, despite these developments, usage of DES remains relatively low due to a lack of knowledge of the benefits of the technique. This paper considers two factors that could lead to greater achievement and appreciation of the full benefit of DES and thus to greater usage. Firstly, in relation to using DES to investigate social systems, a 'soft' approach, both in the process of undertaking a simulation project and in the interpretation of the findings, may generate more knowledge from the DES intervention and thus increase its benefit to businesses. Secondly, in order to assess the full range of outcomes of DES, the technique could be considered from the perspective of an information-processing tool within the organisation. This will allow outcomes to be considered under the three modes of organisational information use, namely sense making, knowledge creating and decision making, which relate to the theoretical areas of knowledge management, organisational learning and decision making respectively. The association of DES with these popular techniques could further increase its usage in business.
Abstract:
Considerable attention has been given in the literature to identifying and describing the effective elements which positively affect the improvement of product reliability. These have been perceived by many as the 'state of the art' in the manufacturing industry. Evidence for the applicability, diffusion and effectiveness of such methods and philosophies, as a means of systematically improving the reliability of a product, comes in the main from case studies and single- and intra-industry empirical studies. These studies have been carried out both within the wider context of quality assurance and management and with reliability treated as a discipline in its own right. However, it is somewhat of a surprise that there are no recently published findings or research studies on the adoption of these methods by the machine tool industry. This may lead one to construct several hypothesised paradigms: (a) that machine tool manufacturers, compared with other industries, are slow to respond to propositions given in the literature by theorists; or (b) that a large proportion of manufacturers make little use of the reliability improvement techniques described in the literature, with the overall perception that they will not lead to any significant improvements. On the other hand, it is evident that verification of these hypotheses against the operational and engineering methods of reliability achievement and improvement adopted in the machine tool industry is less widely researched. Therefore, research into this area is needed in order to explore 'state of the art' practice in the machine tool industry, in terms of the status, structure and activities of the operation of the reliability function. This paper outlines a research programme being conducted with the co-operation of a leading machine tool manufacturer, whose UK manufacturing plant produces mainly Vertical Machining Centres (VMCs) and is continuously undergoing incremental transitions in product reliability improvement.
Abstract:
A current EPSRC project, Product Introduction Process: a Simulation in the Extended Enterprise (PIPSEE), is discussed. PIPSEE attempts to improve the execution of the product introduction process (PIP) within an extended enterprise in the aerospace sector. The modus operandi for accomplishing this has been to develop process understanding amongst a core team, spanning four different companies, through process modelling, review and improvement recommendation. In parallel, a web-based simulation capability is being used to conduct simulation experiments and to disseminate findings by training others in the lessons that have been learned. It is intended that the use of the PIPSEE simulator should encourage radical thinking about the 'fuzzy front end' of the PIP. This presents a topical, exciting and challenging research problem.
Abstract:
Queueing theory is an effective tool in the analysis of computer communication systems. Many results in queueing analysis have been derived in the form of Laplace and z-transform expressions. Accurate inversion of these transforms is very important in the study of computer systems, but the inversion is very often difficult. In this thesis, methods for solving some of these queueing problems by use of digital signal processing techniques are presented. The z-transform of the queue length distribution for the M/G^Y/1 system is derived. Two numerical methods for the inversion of the transform, together with the standard numerical technique for solving transforms with multiple queue-state dependence, are presented. Bilinear and Poisson transform sequences are presented as useful ways of representing continuous-time functions in numerical computations.
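A minimal sketch of the DSP flavour of such an inversion (not the thesis's two specific methods, and using a textbook M/M/1 generating function as a stand-in, since the M/G^Y/1 transform is not reproduced in the abstract): sample the z-transform on a circle inside its region of convergence and recover the probabilities with a DFT.

```python
import numpy as np

def invert_pgf(pgf, n_terms=16, n_samples=64, radius=0.9):
    """Recover p_0..p_{n_terms-1} from a probability generating function
    P(z) = sum_n p_n z**n by sampling it on a circle and applying the DFT."""
    k = np.arange(n_samples)
    z = radius * np.exp(2j * np.pi * k / n_samples)        # sample points on a circle
    samples = pgf(z)
    # p_n ~= (1/N) * sum_k P(z_k) * exp(-2*pi*i*k*n/N) / radius**n
    coeffs = np.fft.fft(samples).real / n_samples
    n = np.arange(n_terms)
    return coeffs[:n_terms] / radius ** n

# Check against the M/M/1 queue, whose queue-length PGF and distribution are
# known in closed form: P(z) = (1 - rho) / (1 - rho*z), p_n = (1 - rho)*rho**n.
rho = 0.7
numeric = invert_pgf(lambda z: (1 - rho) / (1 - rho * z))
exact = (1 - rho) * rho ** np.arange(16)
print(np.max(np.abs(numeric - exact)))   # max error, essentially machine precision
```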
Abstract:
Certain species of crustose lichens have concentrically zoned margins which probably represent yearly growth rings. These marginal growth rings offer an alternative method of studying annual growth fluctuations, establishing growth rate-size curves, and determining the age of thalli for certain crustose species. Hence, marginal growth rings represent a potentially valuable, unexploited tool in lichenometry. In a preliminary study, we measured the widths of the successive marginal rings in 25 thalli of Ochrolechia parella (L.) Massal. growing at a maritime site in north Wales. Mean ring widths of all thalli varied from a minimum of 1.02 mm (the outermost ring) to a maximum of 2.06 mm (the third ring from the margin). There is some suggestion that marginal ring width and thallus size are positively correlated, and hence that growth rates increase in larger thalli in this small population. In a further study, on recently exposed bedrock adjacent to Breidalon, SE Iceland, we examined the potential for using marginal growth rings to estimate the thallus age of a lichen tentatively identified as a Rhizocarpon (possibly R. concentricum (Davies) Beltram.) and thus confirm the timing of surface exposure (c. 50 years). Collectively, these results suggest that: 1) the measurement of marginal rings is a possible alternative method of studying the growth of crustose lichens; 2) O. parella may grow differently from other crustose species, exhibiting a rapidly increasing radial growth rate in thalli >40 mm; 3) where lichens with marginal rings grow on recently exposed surfaces (<60 yrs), minimum age estimates can be made using growth rings as an in situ indication of lichen growth rate; and 4) this phenomenon could provide a valuable, previously unexploited, in situ lichenometric-dating tool in areas lacking calibration control.
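To make point 3 concrete, a back-of-the-envelope sketch of a minimum-age estimate (an illustration only, not the study's procedure): the ring widths and thallus radius below are hypothetical values, assuming one marginal ring per year and the mean ring width as a stand-in growth rate for the unringed interior.

```python
# Hypothetical values; assumes one ring per year and the mean marginal
# ring width as the growth rate for the interior of the thallus.
ring_widths_mm = [1.0, 1.4, 2.1, 1.8, 1.6]     # outermost ring first
thallus_radius_mm = 38.0

mean_ring_width = sum(ring_widths_mm) / len(ring_widths_mm)   # mm per year
unringed_radius = thallus_radius_mm - sum(ring_widths_mm)     # interior without rings
min_age_years = len(ring_widths_mm) + unringed_radius / mean_ring_width
print(round(min_age_years, 1))   # a minimum estimate of thallus age, in years
```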
Abstract:
Since the advent of High Level Programming Languages (HLPLs) in the early 1950s, researchers have sought ways to automate the construction of HLPL compilers. To this end a variety of Translator Writing Tools (TWTs) have been developed in the last three decades. However, only a very few of these tools have gained significant commercial acceptance. This thesis re-examines traditional compiler construction techniques, along with a number of previous TWTs, and proposes a new, improved tool for automated compiler construction called the Aston Compiler Constructor (ACC). This new tool allows the specification of complete compilation systems using a high-level, compiler-oriented specification notation called the Compiler Construction Language (CCL). This specification notation is based on a modern variant of Backus-Naur Form (BNF) and an extended variant of Attribute Grammars (AGs). The implementation and processing of the CCL are discussed along with an extensive CCL example. The CCL is shown to have extensive expressive power, to be convenient to use and highly readable, and thus to be a superior alternative to earlier TWTs and to traditional compiler construction techniques. The execution performance of CCL specifications is evaluated and shown to be acceptable. A number of related areas are also addressed, including tools for the rapid construction of individual compiler components, and tools for the construction of compilation systems for multiprocessor operating systems and hardware. This latter area is expected to become of particular interest in future years due to the anticipated increased use of multiprocessor architectures.
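As a flavour of the BNF-plus-attributes style such notations build on (a generic sketch; the abstract does not reproduce CCL syntax, so none is imitated here), consider a two-rule grammar with a synthesised value attribute, realised as a minimal recursive-descent evaluator.

```python
# A generic attribute-grammar sketch:
#   expr ::= term { '+' term }     expr.value = sum of term.value
#   term ::= digit                 term.value = int(digit)

def parse_expr(tokens, pos=0):
    value, pos = parse_term(tokens, pos)
    while pos < len(tokens) and tokens[pos] == "+":
        rhs, pos = parse_term(tokens, pos + 1)
        value += rhs                      # attribute rule: expr.value = sum of terms
    return value, pos

def parse_term(tokens, pos):
    return int(tokens[pos]), pos + 1      # attribute rule: term.value = int(digit)

print(parse_expr(list("1+2+9")))          # -> (12, 5)
```

A tool like ACC would generate this kind of evaluator from a declarative specification rather than require it to be hand-written.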
Abstract:
The main objective of the work presented in this thesis is to investigate the two sides of the flute, the face and the heel, of a twist drill. The flute face was designed to yield straight diametral lips which could be extended to eliminate the chisel edge, so that a single cutting edge is obtained. Since drill rigidity and space for chip conveyance have to be a compromise, a theoretical expression is deduced which enables optimum chip disposal capacity to be described in terms of drill parameters; this expression is used to describe the flute heel side. Another main objective is to study the effect on drill performance of changing the conventional drill flute. Drills were manufactured according to the new flute design, and tests were run in order to compare the performance of a conventional-flute drill and the non-conventional design put forward. The results showed that a 50% reduction in thrust force and approximately an 18% reduction in torque were attained for the new design. The flank wear was measured at the outer corner and found to be less for the new design than for the conventional drill in the majority of cases. Hole quality (roundness, size and roughness) was also considered as a further aspect of drill performance. Improvement in hole quality is shown to arise under certain cutting conditions; accordingly, it might be possible to produce in one pass of the new drill a hole that would previously have required drilling and reaming. A subsidiary objective is to design the form milling cutter that should be employed for milling the foregoing special flute from the drill blank, allowing for the interference effect; a mathematical analysis in conjunction with computing techniques is used. To control the grinding parameters, a prototype drill grinder was designed and built upon the framework of an existing Cincinnati cutter grinder. The design and build of the new grinder are based on a computer-aided drill point geometry analysis. In addition to the conical grinding concept, the new grinder is also used to produce a spherical point, utilising a computer-aided drill point geometry analysis.
Abstract:
This work is undertaken in an attempt to understand the processes at work at the cutting edge of the twist drill. Extensive drill life testing performed by the University has reinforced a survey of previously published information. This work demonstrated that there are two specific aspects of drilling which have not previously been explained comprehensively. The first concerns the interrelating of process data between differing drilling situations: there is no method currently available which allows the cutting geometry of drilling to be defined numerically, so such comparisons, where made, are purely subjective. Section one examines this problem by taking as an example a 4.5 mm drill suitable for use with aluminium. This drill is examined using a prototype solid modelling program to explore how the required numerical information may be generated. The second aspect is the analysis of drill stiffness. What aspects of drill stiffness produce the very great difference in performance between short, medium and long flute length drills? These differences exist between drills of identical point geometry, and the practical superiority of short drills has been known to shop-floor drilling operatives since drilling was first introduced. This problem has been dismissed repeatedly as over-complicated, but section two provides a first approximation and shows that, at least for smaller drills of 4.5 mm, the effects are highly significant. Once the cutting action of the twist drill is defined geometrically, a huge body of machinability data becomes applicable to the drilling process. Work remains to interpret the very high inclination angles of the drill cutting process in terms of cutting forces and tool wear, but aspects of drill design may already be looked at in new ways, with the prospect of a more analytical approach rather than the present mix of experience and trial and error. Other problems are specific to the twist drill, such as the behaviour of the chips in the flute. It is now possible to predict the initial direction of chip flow leaving the drill cutting edge. In the future the parameters of further chip behaviour may also be explored within this geometric model.
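On the stiffness point, a first-approximation sketch of why flute length matters so much (a textbook cantilever estimate, not the thesis's analysis; the dimensions and the flute-section correction factor below are assumed for illustration): lateral tip stiffness of a cantilevered section falls as 1/L^3, so halving the unsupported flute length makes the drill roughly eight times stiffer.

```python
import math

# Treat the fluted length as a uniform cantilever; all figures are hypothetical.
E = 210e9                      # Young's modulus of an HSS-like steel, Pa
d = 4.5e-3                     # drill diameter, m
I_round = math.pi * d**4 / 64  # second moment of area of a plain round bar
I_flute = 0.4 * I_round        # crude allowance for material removed by the flutes

for name, L in [("short", 0.025), ("medium", 0.047), ("long", 0.080)]:  # flute length, m
    k = 3 * E * I_flute / L**3          # cantilever lateral stiffness at the tip, N/m
    print(f"{name:6s} L={L*1e3:4.0f} mm  k={k/1e3:8.1f} kN/m")
```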
Abstract:
SPOT simulation imagery was acquired for a test site in the Forest of Dean in Gloucestershire, U.K. These data were qualitatively and quantitatively evaluated for their potential application in forest resource mapping and management. A variety of techniques are described for enhancing the image with the aim of providing species-level discrimination within the forest. Visual interpretation of the imagery was more successful than automated classification. The heterogeneity within the forest classes, and in particular between the forest and urban classes, resulted in poor discrimination using traditional 'per-pixel' automated methods of classification. Different means of assessing classification accuracy are proposed. Two techniques for measuring textural variation were investigated in an attempt to improve classification accuracy. The first of these, a sequential segmentation method, was found to be beneficial. The second, a parallel segmentation method, resulted in little improvement, though this may be related to a combination of the resolution and the size of the texture extraction area. The effect on classification accuracy of combining the SPOT simulation imagery with other data types is investigated. A grid cell encoding technique was selected as most appropriate for storing digitised topographic (elevation, slope) and ground truth data. Topographic data were shown to improve species-level classification, though with sixteen classes overall accuracies were consistently below 50%. Neither sub-division into age groups nor the incorporation of principal components and a band ratio significantly improved classification accuracy. It is concluded that SPOT imagery will not permit species-level classification within forested areas as diverse as the Forest of Dean. The imagery will be most useful as part of a multi-stage sampling scheme. The use of texture analysis is highly recommended for extracting maximum information content from the data. Incorporation of the imagery into a GIS will both aid discrimination and provide a useful management tool.
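A generic sketch of the kind of per-pixel texture feature such work layers onto the spectral bands (an illustration only, not the study's sequential or parallel segmentation methods; the window size and the synthetic 'smooth' and 'rough' patches are assumptions): local standard deviation over a moving window, which can be stacked with the original bands before classification.

```python
import numpy as np

def local_std(band, window=5):
    """Per-pixel standard deviation of `band` over a window x window patch."""
    pad = window // 2
    padded = np.pad(band.astype(float), pad, mode="reflect")
    patches = np.lib.stride_tricks.sliding_window_view(padded, (window, window))
    return patches.std(axis=(-2, -1))

rng = np.random.default_rng(0)
smooth = rng.normal(100, 1, (64, 64))     # e.g. an even-aged plantation block
rough = rng.normal(100, 15, (64, 64))     # e.g. mixed forest / urban fringe
print(local_std(smooth).mean(), local_std(rough).mean())   # rough >> smooth
```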