977 results for Automation.
Abstract:
Aim: The assessment of treatment plans is an important component in the education of radiation therapists. The establishment of a grade for a plan is currently based on subjective assessment of a range of criteria. The automation of assessment could provide a number of advantages, including faster feedback, reduced chance of human error, and simpler aggregation of past results. Method: A collection of treatments planned by a cohort of 27 second-year radiation therapy students was selected for quantitative evaluation. Treatment sites included the bladder, cervix, larynx, parotid and prostate, although only the larynx plans had been assessed in detail. The plans were designed with the Pinnacle system and exported using the DICOM framework. Assessment criteria included beam arrangement optimisation, volume contouring, target dose coverage and homogeneity, and organ-at-risk sparing. The in-house Treatment and Dose Assessor (TADA) software was evaluated for suitability in assisting with the quantitative assessment of these plans. Dose volume data were exported in per-student and per-structure data tables, along with beam complexity metrics, dose volume histograms, and reports on naming conventions. Results: The treatment plans were exported and processed using TADA, with the processing of all 27 plans for each treatment site taking less than two minutes. Naming conventions were successfully checked against a reference protocol. Significant variations between student plans were found. Correlation with assessment feedback was established for the larynx plans. Conclusion: The data generated could be used to inform the selection of future assessment criteria, monitor student development, and provide useful feedback to the students. The provision of objective, quantitative evaluations of plan quality would be a valuable addition not only to radiotherapy education programmes but also to staff development and potentially credentialing methods. New functionality within TADA developed for this work could be applied clinically to, for example, evaluate protocol compliance.
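As a minimal illustrative sketch (not the TADA implementation, which is not described in detail in the abstract), the naming-convention check mentioned above could be approached by comparing structure names in an exported DICOM RT Structure Set against a reference protocol with pydicom; the file name and protocol entries here are hypothetical.

```python
# Illustrative sketch only: checks RT Structure Set names against a reference
# naming protocol, in the spirit of the naming-convention report described above.
# Requires pydicom; the file name and protocol entries are hypothetical.
import pydicom

REFERENCE_NAMES = {"PTV_Larynx", "SpinalCord", "Parotid_L", "Parotid_R", "External"}

def check_naming(rtstruct_path):
    ds = pydicom.dcmread(rtstruct_path)
    found = {roi.ROIName for roi in ds.StructureSetROISequence}
    return {
        "missing": sorted(REFERENCE_NAMES - found),      # required names not present
        "unexpected": sorted(found - REFERENCE_NAMES),   # names outside the protocol
    }

if __name__ == "__main__":
    print(check_naming("student_plan_rtstruct.dcm"))
```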
Abstract:
Draglines are used extensively for overburden stripping in Australian open cut coal mines. This paper outlines the design of a computer control system to implement an automated swing cycle on a production dragline. Subsystems and sensors have been developed to satisfy the constraints imposed by the task, the harsh operating environment and the mine's production requirements.
Abstract:
This paper describes a software architecture for real-world robotic applications. We discuss issues of software reliability, testing and realistic off-line simulation that allows the majority of the automation system to be tested off-line in the laboratory before deployment in the field. A recent project, the automation of a very large mining machine, is used to illustrate the discussion.
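A minimal sketch of the kind of hardware-abstraction layering that makes off-line testing of this sort possible: the controller is written against an abstract machine interface, and a simulator implementing the same interface stands in for the real machine in the laboratory. The class and method names are illustrative, not those of the project described.

```python
# Illustrative sketch: the controller depends only on an abstract machine
# interface, so a simulator can replace the real hardware during off-line tests.
from abc import ABC, abstractmethod

class Machine(ABC):
    @abstractmethod
    def read_position(self) -> float: ...
    @abstractmethod
    def command_velocity(self, v: float) -> None: ...

class SimulatedMachine(Machine):
    """Simple kinematic stand-in used for laboratory testing."""
    def __init__(self):
        self.position = 0.0
    def read_position(self) -> float:
        return self.position
    def command_velocity(self, v: float) -> None:
        self.position += v * 0.1  # integrate velocity over a 0.1 s control step

def control_step(machine: Machine, target: float, gain: float = 0.5) -> None:
    error = target - machine.read_position()
    machine.command_velocity(gain * error)   # simple proportional law

sim = SimulatedMachine()
for _ in range(50):
    control_step(sim, target=10.0)
print(round(sim.position, 2))
```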
Abstract:
This paper reviews the state-of-the-art in the automation of underground mining vehicles and reports on an autonomous navigation system under development through the CMTE with sponsorship arranged by AMIRA. Past attempts at automating LHDs and haul trucks are described and their particular strengths and weaknesses are discussed. The auto-guidance system being developed overcomes some of the limitations of state-of-the-art prototype 'commercial' systems. It can be retrofitted to existing remote-controlled vehicles, uses minimal installed infrastructure and is flexible enough for rapid relocation to alternate routes. The navigation techniques use data fusion of two separate sets of sensors, combining natural feature recognition, nodal maps and inertial navigation techniques. Collision detection is incorporated, and people and other traffic are excluded from the tramming area. This paper describes the work being done by the group with regard to auto-tramming and also outlines the future goals.
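As a minimal sketch of one ingredient of such sensor fusion, the snippet below blends a drifting inertial heading with an absolute but noisier feature-based heading using a complementary filter; the filter choice, weight and signal names are assumptions for illustration, not the CMTE system's actual fusion scheme.

```python
# Illustrative complementary-filter sketch: blends a drifting inertial heading
# with an absolute but noisy feature-based heading. The weight alpha is assumed.
import math

def fuse_heading(inertial_heading, feature_heading, alpha=0.98):
    """Return a blended heading in radians; alpha weights the inertial estimate."""
    # work on the angular difference so wrap-around at +/- pi is handled
    diff = math.atan2(math.sin(feature_heading - inertial_heading),
                      math.cos(feature_heading - inertial_heading))
    return inertial_heading + (1.0 - alpha) * diff

print(fuse_heading(math.radians(90.0), math.radians(95.0)))
```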
Abstract:
This paper reviews the state-of-the-art in the automation of underground truck haulage. Past attempts at automating LHDs and haul trucks are described and their particular strengths and weaknesses are listed. We argue that the simple auto-tram...
Abstract:
Draglines are very large machines that are used to remove overburden in open-cut coal mines. This paper outlines the design of a computer control system to implement an automated swing cycle on a production dragline. Subsystems and sensors have been developed to satisfy the constraints imposed by the task, the harsh operating environment and the mine's production requirements.
Abstract:
Dragline Swing to Dump Automation. By Peter Corke, CSIRO Manufacturing Technology/CRC for Mining Technology and Equipment (CMTE). Peter Corke presented a case study of a project to automate the dragline swing-to-dump operation. The project is funded by ACARP, BHP Coal, Pacific Coal and the CMTE and is being carried out on a dragline at Pacific Coal's Meandu mine near Brisbane. Corke began by highlighting that the minerals industry makes extensive use of large, mechanised machines. However, unlike other industries, mining has not adopted automation and most machines are controlled by human operators on board the machine itself. Choosing an automation target: the dragline was chosen because (1) draglines are one of the biggest capital assets in a mine; (2) performance varies significantly between operators, so improved capital utilisation is possible; (3) the dragline is often the bottleneck in production; (4) a large part of the operation cycle is spent swinging from dig to dump; and (5) it is technically feasible. There has been a history of dragline automation projects, none with great success.
Abstract:
The mining industry is highly suitable for the application of robotics and automation technology, since the work is both arduous and dangerous. Visual servoing is a means of integrating non-contact visual sensing with machine control to augment or replace operator-based control. This article describes two of our current mining automation projects in order to demonstrate some, perhaps unusual, applications of visual servoing, and also to illustrate some very real problems with robust computer vision.
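A minimal sketch of the classical image-based visual servoing law the abstract refers to: command a camera velocity proportional to the image-feature error through the pseudo-inverse of an interaction (image Jacobian) matrix. The interaction matrix, feature values and gain below are placeholders, not data from the projects described.

```python
# Illustrative image-based visual servoing step: v = -lambda * pinv(L) @ (s - s*)
# The interaction matrix L and the feature values are toy placeholders.
import numpy as np

def ibvs_step(L, s, s_star, gain=0.5):
    """Return a camera velocity command that drives features s toward s_star."""
    error = s - s_star
    return -gain * np.linalg.pinv(L) @ error

L = np.array([[-1.0, 0.0, 0.2],
              [0.0, -1.0, 0.1]])        # toy 2x3 interaction matrix
s = np.array([0.15, -0.05])             # current image features
s_star = np.array([0.0, 0.0])           # desired image features
print(ibvs_step(L, s, s_star))
```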
Abstract:
Substation Automation Systems have undergone many transformational changes triggered by improvements in technology. Prior to the digital era, it made sense to confirm that the physical wiring matched the schematic design by meticulous and laborious point-to-point testing. In this way, human errors in either the design or the construction could be identified and fixed prior to entry into service. However, even though modern secondary systems are now largely computerised, we are still undertaking commissioning testing using the same philosophy as if each signal were hard wired. This is slow and tedious and does not do justice to modern computer systems and software automation. One of the major architectural advantages of the IEC 61850 standard is that it "abstracts" the definition of data and services independently of any protocol, allowing them to be mapped to any protocol that can meet the modelling and performance requirements. On this basis, any substation element can be defined using these common building blocks and made available at the design, configuration and operational stages of the system. The primary advantage of accessing data using this methodology, rather than the traditional position-based method (such as DNP 3.0), is that generic tools can be created to manipulate data. Self-describing data contains the information that these tools need to manipulate different data types correctly. More importantly, self-describing data makes the interface between programs robust and flexible. This paper proposes that the improved data definitions and the methods for handling this data within a tightly bound and compliant IEC 61850 Substation Automation System could fundamentally change how such systems need to be tested when compared to traditional point-to-point methods. Using the outcomes of an undergraduate thesis project, we can demonstrate with some certainty that it is possible to automatically test the configuration of a protection relay by comparing the IEC 61850 configuration extracted from the relay against its SCL file, for multiple relay vendors. The software tool provides a quick and automatic check that the data sets on a particular relay are correct according to its CID file, thus ensuring that no unexpected modifications are made at any stage of the commissioning process. The tool was implemented in a Java programming environment using an open-source IEC 61850 library to facilitate the server-client association with the relay.
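The thesis tool described above was written in Java with an open-source IEC 61850 library; as a language-neutral illustration of the comparison idea rather than that tool, the sketch below compares the data-set members declared in two SCL-style files (for example, the engineering SCL versus a configuration read back from the relay) using Python's standard library. The element and attribute names follow the published SCL schema, but the file names are assumptions.

```python
# Illustrative sketch: compare the data sets declared in two SCL/CID files.
# File names are assumptions; element and attribute names follow the SCL schema.
import xml.etree.ElementTree as ET

SCL_NS = "http://www.iec.ch/61850/2003/SCL"
NS = {"scl": SCL_NS}

def dataset_members(scl_path):
    """Map each DataSet name to the set of FCDA member references it declares."""
    root = ET.parse(scl_path).getroot()
    members = {}
    for ds in root.iter(f"{{{SCL_NS}}}DataSet"):
        refs = set()
        for fcda in ds.findall("scl:FCDA", NS):
            refs.add(tuple(fcda.get(a, "") for a in
                           ("ldInst", "prefix", "lnClass", "lnInst",
                            "doName", "daName", "fc")))
        members[ds.get("name")] = refs
    return members

design = dataset_members("substation_design.cid")   # hypothetical design file
relay = dataset_members("relay_readback.cid")       # hypothetical relay read-back
for name in sorted(set(design) | set(relay)):
    if design.get(name) != relay.get(name):
        print("Mismatch in data set:", name)
```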
Abstract:
This paper reviews the state-of-the-art in the automation of underground truck haulage. Past attempts at automating LHDs and haul trucks are described and their particular strengths and weaknesses are listed. We argue that the simple auto-tramming systems currently being commercialised, that follow rail-type guides placed along the back, cannot match the performance, flexibility and reliability of systems based on modern mobile robotic principles. In addition, the lack of collision detection research in the underground environment is highlighted.
Abstract:
Draglines are used extensively for overburden stripping in Australian open cut coal mines. This paper outlines the design of a computer control system to implement an automated swing cycle on a production dragline. Subsystems and sensors have been developed to satisfy the constraints imposed by the task, the harsh operating environment and the mine's production requirements.
Abstract:
Developing accurate and reliable crop detection algorithms is an important step for harvesting automation in horticulture. This paper presents a novel approach to visual detection of highly occluded fruits. We use a conditional random field (CRF) on multi-spectral image data (colour and Near-Infrared Reflectance, NIR) to model two classes: crop and background. To describe these two classes, we explore a range of visual-texture features including local binary patterns, histograms of oriented gradients, and learned auto-encoder features. The proposed methods are evaluated using hand-labelled images from a dataset captured on a commercial capsicum farm. Experimental results are presented, and performance is evaluated in terms of the Area Under the Curve (AUC) of the precision-recall curves. Our current results achieve a maximum performance of 0.81 AUC when combining all of the texture features in conjunction with colour information.
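A minimal sketch of the evaluation metric reported above: per-pixel crop/background scores are summarised as the area under the precision-recall curve with scikit-learn. The labels and scores below are random placeholders standing in for the CRF output, not the paper's data.

```python
# Illustrative sketch of the reported metric: area under the precision-recall
# curve for per-pixel crop/background scores. Data are random placeholders.
import numpy as np
from sklearn.metrics import precision_recall_curve, auc

rng = np.random.default_rng(0)
labels = rng.integers(0, 2, size=10_000)             # 1 = crop, 0 = background
scores = labels * 0.3 + rng.random(10_000) * 0.7     # stand-in classifier scores

precision, recall, _ = precision_recall_curve(labels, scores)
print("AUC of precision-recall curve:", round(auc(recall, precision), 3))
```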
Location of concentrators in a computer communication network: a stochastic automaton search method
Abstract:
The following problem is considered: given the locations of the Central Processing Unit (CPU) and the terminals which have to communicate with it, determine the number and locations of the concentrators and assign the terminals to the concentrators in such a way that the total cost is minimised. There is also a fixed cost associated with each concentrator, and there is an upper limit to the number of terminals which can be connected to a concentrator. The terminals can also be connected directly to the CPU. In this paper it is assumed that the concentrators can be located anywhere in the area A containing the CPU and the terminals; this makes it a multimodal optimisation problem. In the proposed algorithm a stochastic automaton is used as a search device to locate the minimum of the multimodal cost function. The proposed algorithm involves the following. The area A containing the CPU and the terminals is divided into an arbitrary number of regions (say K). An approximate value for the number of concentrators is assumed (say m); the optimum number is determined later by iteration. The m concentrators can be assigned to the K regions in m^K ways (m > K) or K^m ways (K > m). (All possible assignments are feasible, i.e. a region can contain 0, 1, ..., m concentrators.) Each possible assignment is taken to represent a state of the stochastic variable-structure automaton. To start with, all the states are assigned equal probabilities. At each stage of the search the automaton visits a state according to the current probability distribution. At each visit the automaton selects a 'point' inside that state with uniform probability. The cost associated with that point is calculated and the average cost of that state is updated. Then the probabilities of all the states are updated; the probabilities are taken to be inversely proportional to the average costs of the states. After a certain number of searches the search probabilities become stationary and the automaton visits a particular state again and again; the automaton is then said to have converged to that state. By conducting a local gradient search within that state, the exact locations of the concentrators are determined. This algorithm was applied to a set of test problems and the results were compared with those given by Cooper's (1964, 1967) EAC algorithm; on average, the proposed algorithm was found to perform better.
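A minimal sketch of the search scheme described above, applied to a toy one-dimensional multimodal cost function rather than the concentrator-location cost: states are regions, the automaton visits a state according to the current probabilities, samples a uniform point inside it, updates that state's running average cost, and resets the probabilities inversely proportional to the average costs. The region count, iteration count and cost function are illustrative assumptions, each region is primed with one sample so every average cost is defined, and the final local gradient refinement is omitted.

```python
# Illustrative sketch of a stochastic-automaton search over K regions of a toy
# one-dimensional multimodal cost function (not the paper's network cost model).
import numpy as np

rng = np.random.default_rng(1)
K = 10                                        # number of regions covering [0, 10)
edges = np.linspace(0.0, 10.0, K + 1)

def cost(x):
    """Toy multimodal cost; the real problem sums link and concentrator costs."""
    return np.sin(3.0 * x) + 0.1 * (x - 7.0) ** 2

probs = np.full(K, 1.0 / K)                   # equal initial state probabilities
visits = np.ones(K)                           # one priming sample per region
avg_cost = np.array([cost(rng.uniform(edges[k], edges[k + 1])) for k in range(K)])

for _ in range(2000):
    k = rng.choice(K, p=probs)                # automaton visits a state
    x = rng.uniform(edges[k], edges[k + 1])   # uniform 'point' inside that state
    visits[k] += 1
    avg_cost[k] += (cost(x) - avg_cost[k]) / visits[k]     # running average cost
    inv = 1.0 / (avg_cost - avg_cost.min() + 1e-3)         # inverse of average cost
    probs = inv / inv.sum()                                # normalised probabilities

best = int(np.argmax(probs))                  # state the automaton converges to
print("converged region:", (float(edges[best]), float(edges[best + 1])))
```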