14 results for Machine to Machine
in University of Queensland eSpace - Australia
Abstract:
We present the results of applying automated machine learning techniques to the problem of matching different object catalogues in astrophysics. In this study, we take two partially matched catalogues, one of which has a large positional uncertainty. The two catalogues used here were taken from the H I Parkes All Sky Survey (HIPASS) and the SuperCOSMOS optical survey. Previous work had matched 44 per cent (1887 objects) of HIPASS to the SuperCOSMOS catalogue. A supervised learning algorithm was then applied to construct a model of the matched portion of our catalogue. Validation of the model shows good classification performance (99.12 per cent correct). Applying this model to the unmatched portion of the catalogue found 1209 new matches, increasing the catalogue from 1887 matched objects to 3096. The combination of these procedures yields a catalogue that is 72 per cent matched.
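The abstract does not name the learning algorithm or the features used, so the following is only a minimal sketch of the general procedure it describes: train a classifier on labelled (source, candidate counterpart) pairs from the matched portion, validate it, then apply it to the unmatched portion. The synthetic features (positional offsets), the random-forest classifier and all numbers are assumptions for illustration.

```python
# Sketch: supervised cross-matching of two catalogues. True matches are
# assumed to have small positional offsets; everything here is synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic (HIPASS source, SuperCOSMOS candidate) pair features.
offsets_match = rng.exponential(0.5, size=(500, 2))     # true matches
offsets_nonmatch = rng.uniform(0, 5, size=(500, 2))     # spurious pairs
X = np.vstack([offsets_match, offsets_nonmatch])
y = np.array([1] * 500 + [0] * 500)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())

# Fit on all labelled pairs, then classify candidate pairs drawn from the
# unmatched portion of the catalogue (synthetic stand-ins here).
clf.fit(X, y)
X_unmatched = rng.uniform(0, 5, size=(200, 2))
print("new matches found:", int((clf.predict(X_unmatched) == 1).sum()))
```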
Abstract:
Invasive vertebrate pests, together with overabundant native species, cause significant economic and environmental damage in the Australian rangelands. Access to artificial watering points, created for the pastoral industry, has been a major factor in the spread and survival of these pests. Existing methods of controlling watering points are mechanical and cannot discriminate between target species. This paper describes an intelligent system for controlling watering points based on machine vision technology. Initial test results clearly demonstrate proof of concept for machine vision in this application. These initial experiments were carried out as part of a 3-year project using machine vision software to manage all large vertebrates in the Australian rangelands. Concurrent work is testing the use of automated gates and innovative laneway and enclosure design. The system will have application in any habitat throughout the world where a resource is limited and can be enclosed for the management of livestock or wildlife.
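The abstract implies a simple control loop: classify the animal at the watering point and actuate the gate accordingly. The stub below is a rough invented illustration of that loop only; the classifier, species list and gate interface are not from the paper.

```python
# Very rough sketch of the implied control loop: a machine-vision
# classifier decides whether the gate opens. All names are invented stubs.
class Gate:
    def open(self):  print("gate: open")
    def close(self): print("gate: closed")

TARGET_PESTS = {"feral_pig", "feral_goat"}      # illustrative species only

def classify_species(frame) -> str:
    """Stub for the machine-vision classifier; a real system would run a
    trained model on the camera frame."""
    return "feral_pig"

def control_gate(frame, gate: Gate) -> None:
    if classify_species(frame) in TARGET_PESTS:
        gate.close()     # deny target species access to water
    else:
        gate.open()      # allow livestock and non-target wildlife through

control_gate(frame=None, gate=Gate())
```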
Abstract:
Foreign exchange trading has emerged in recent times as a significant activity in many countries. As with most forms of trading, the activity is influenced by many random parameters, so a system that effectively emulates the trading process would be very helpful. In this paper we create such a system, using a machine learning approach to emulate trader behaviour on the foreign exchange market and to find the most profitable trading strategy.
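The abstract does not say which learning algorithm or features were used. One common way to frame trader emulation is as supervised classification from recent market states to the trader's recorded actions; everything below, from the synthetic price series to the feature window and classifier, is an invented illustration of that framing only.

```python
# Sketch: emulate trading decisions as classification over trailing returns.
# Labels stand in for a trader's recorded actions (1 = buy, -1 = sell).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
prices = np.cumsum(rng.normal(0, 1, 500)) + 100.0   # synthetic FX series
returns = np.diff(prices)

window = 5
X = np.array([returns[i:i + window] for i in range(len(returns) - window)])
# Toy stand-in for recorded actions: buy after net rises, sell after falls.
y = np.sign(X.sum(axis=1)).astype(int)

model = LogisticRegression(max_iter=1000).fit(X, y)
print("in-sample agreement with the emulated trader:", model.score(X, y))
```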
Abstract:
A significant problem with currently suggested approaches for transforming between models in different languages is that the transformation is often described imprecisely, with the result that the overall transformation task may be imprecise, incomplete and inconsistent. This paper presents a formal metamodeling approach for transforming between UML and Object-Z. In the paper, the two languages are defined in terms of their formal metamodels, and a systematic transformation between the models is provided at the meta-level in terms of formal mapping functions. As a consequence, we can provide a precise, consistent and complete transformation between them.
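The mapping functions operate at the meta-level: each takes an element of the UML metamodel to an element of the Object-Z metamodel. As a loose, purely illustrative analogue (the paper defines these mappings formally, not in code, and the simplified metamodel classes below are invented):

```python
# Loose illustration of a meta-level mapping function: a (much simplified)
# UML metamodel element is mapped to an Object-Z construct.
from dataclasses import dataclass, field

@dataclass
class UMLClass:                      # simplified UML metamodel element
    name: str
    attributes: dict[str, str] = field(default_factory=dict)  # name -> type

@dataclass
class OZClass:                       # simplified Object-Z metamodel element
    name: str
    state_schema: dict[str, str] = field(default_factory=dict)

def map_class(u: UMLClass) -> OZClass:
    """Meta-level mapping: UML class -> Object-Z class with state schema."""
    return OZClass(name=u.name, state_schema=dict(u.attributes))

print(map_class(UMLClass("Account", {"balance": "Integer"})))
```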
Abstract:
'Free will' and its corollary, the concept of individual responsibility, are keystones of the justice system. This paper shows that if we accept a physics that disallows time reversal, the concept of 'free will' is undermined by an integrated understanding of the influence of genetics and environment on human behavioural responses. Analysis is undertaken by modelling life as a novel statistico-deterministic version of a Turing machine, i.e. as a series of transitions between states at successive instants of time. Using this model, it is proven by induction that the entire course of life is independent of the action of free will. Although determined by the prior state, the probability of transition between states in response to a standard environmental stimulus is not equal to 1, and the transitions may differ quantitatively at the molecular level and qualitatively at the level of the whole organism. Transitions between states correspond to behaviours. It is shown that the behaviour of identical twins (or clones), although determined, would be incompletely predictable and non-identical, creating an illusion of the operation of 'free will'. 'Free will' is a convenient construct for current judicial systems and social control because it allows rationalization of punishment for those whose behaviour falls outside socially defined norms. Indeed, it is conceivable that maintenance of ideas of free will has co-evolved with community morality to reinforce its operation. If the concept of free will is to be maintained, it would require revision of our current physical theories.
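The argument turns on transitions whose probability is not 1: two systems with identical states and identical transition probabilities can still follow different trajectories. A toy simulation along those lines (the state space and probabilities are invented; different random seeds stand in for the irreducibly chancy outcomes):

```python
# Toy version of the paper's point: two 'identical twins' share the same
# states and transition probabilities, yet their determined-but-probabilistic
# trajectories diverge, creating the appearance of free choice.
import random

STATES = ["A", "B", "C"]
# P(next state | current state, standard stimulus); rows sum to 1, but no
# individual transition has probability 1.
P = {"A": [0.6, 0.3, 0.1],
     "B": [0.2, 0.5, 0.3],
     "C": [0.1, 0.4, 0.5]}

def life(seed: int, steps: int = 10, start: str = "A") -> list[str]:
    rng = random.Random(seed)
    s, path = start, [start]
    for _ in range(steps):
        s = rng.choices(STATES, weights=P[s])[0]
        path.append(s)
    return path

twin1, twin2 = life(seed=1), life(seed=2)
print(twin1)
print(twin2)
print("identical trajectories:", twin1 == twin2)  # almost surely False
```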
Abstract:
Promiscuous human leukocyte antigen (HLA) binding peptides are ideal targets for vaccine development. Existing computational models for prediction of promiscuous peptides used hidden Markov models and artificial neural networks as prediction algorithms. We report a system based on support vector machines that outperforms previously published methods. Preliminary testing showed that it can predict peptides binding to HLA-A2 and -A3 super-type molecules with excellent accuracy, even for molecules where no binding data are currently available.
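The abstract names support vector machines as the prediction algorithm but not the peptide encoding; the sketch below assumes a simple one-hot amino-acid encoding, which is one common choice rather than necessarily the paper's, and the training peptides and labels are illustrative placeholders.

```python
# Sketch: SVM classifier for peptide/HLA binding with a one-hot encoding.
# Only the use of an SVM comes from the abstract; the rest is assumed.
import numpy as np
from sklearn.svm import SVC

AA = "ACDEFGHIKLMNPQRSTVWY"
IDX = {a: i for i, a in enumerate(AA)}

def one_hot(peptide: str) -> np.ndarray:
    """Encode a 9-mer peptide as a flat 9x20 one-hot vector."""
    v = np.zeros((len(peptide), len(AA)))
    for pos, aa in enumerate(peptide):
        v[pos, IDX[aa]] = 1.0
    return v.ravel()

# Illustrative training data: peptides with binder (1) / non-binder (0) labels.
peptides = ["LLFGYPVYV", "GILGFVFTL", "AAAWYLWEV", "KTWGQYWQV"]
labels   = [1, 1, 0, 0]

X = np.array([one_hot(p) for p in peptides])
clf = SVC(kernel="rbf").fit(X, labels)
print(clf.predict([one_hot("LLFGYPVYV")]))
```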
Abstract:
Machine learning techniques have been recognized as powerful tools for learning from data. One of the most popular learning techniques, Back-Propagation (BP) artificial neural networks, can be used as a computer model to predict peptides binding to Human Leukocyte Antigens (HLA). The major advantage of computational screening is that it reduces the number of wet-lab experiments that need to be performed, significantly reducing cost and time. A recently developed method, the Extreme Learning Machine (ELM), which has properties superior to those of BP, has been investigated to accomplish such tasks. In our work, we found that the ELM is as good as, if not better than, BP in terms of time complexity, variation in accuracy across experiments and, most importantly, prevention of over-fitting when predicting peptide binding to HLA.
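For readers unfamiliar with the ELM, its core idea is simple: hidden-layer weights are random and fixed, and only the output weights are solved in closed form by least squares, with no back-propagation. A minimal NumPy sketch under that definition (the network size and toy data are placeholders, not the paper's setup):

```python
# Minimal Extreme Learning Machine: random fixed hidden layer, output
# weights obtained in one least-squares step via the pseudo-inverse.
import numpy as np

rng = np.random.default_rng(0)

def elm_train(X, y, n_hidden=50):
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights
    b = rng.normal(size=n_hidden)                 # random biases
    H = np.tanh(X @ W + b)                        # hidden activations
    beta = np.linalg.pinv(H) @ y                  # closed-form output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy regression on random features (stand-in for binding-score prediction).
X = rng.normal(size=(200, 10))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)
W, b, beta = elm_train(X, y)
print("training MSE:", float(np.mean((elm_predict(X, W, b, beta) - y) ** 2)))
```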
Abstract:
An emerging issue in the field of astronomy is the integration, management and utilization of databases from around the world to facilitate scientific discovery. In this paper, we investigate the application of the machine learning techniques of support vector machines and neural networks to the problem of amalgamating catalogues of galaxies from two disparate data sources: radio and optical. Formulating this as a classification problem presents several challenges, including dealing with a highly unbalanced data set. Unlike the conventional approach to the problem (which is based on a likelihood ratio), machine learning does not require density estimation and is shown here to provide a significant improvement in performance. We also report some experiments that explore the importance of the radio and optical data features for the matching problem.
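One standard way to handle the severe class imbalance the abstract mentions is to reweight the minority class in the SVM's loss. The sketch below uses scikit-learn's class_weight="balanced" for that purpose; the data are synthetic placeholders, and the abstract does not say which reweighting, if any, the paper used.

```python
# Sketch: SVM on a highly unbalanced two-class problem, as in radio/optical
# catalogue matching, with per-class reweighting. Data are synthetic.
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)
# 1000 non-matches vs 30 true matches (illustrative imbalance).
X = np.vstack([rng.normal(loc=0.0, size=(1000, 4)),
               rng.normal(loc=1.5, size=(30, 4))])
y = np.array([0] * 1000 + [1] * 30)

# 'balanced' scales each class's penalty inversely to its frequency, so the
# rare true-match class is not simply ignored.
clf = SVC(kernel="rbf", class_weight="balanced").fit(X, y)
print(classification_report(y, clf.predict(X), digits=3))
```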
Abstract:
The high-intensity zone within the Jameson Cell is the downcomer. It is largely external to, and separated from, the flotation tank. This, together with operation of the downcomer under vacuum rather than at elevated pressure, and the absence of moving parts, allows ready access to the high-intensity zone for measurement and analysis. Experiments were conducted to measure recovery for residence times of between 20 milliseconds and 10 seconds within the downcomer of a Jameson Cell. The effect of aeration rate on the recovery of different particle sizes was also studied.
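Recovery-versus-residence-time data of this kind are conventionally summarized with a first-order rate model, R(t) = R_inf (1 - e^(-kt)), though the abstract itself does not name a model; the sketch below and its parameters are illustrative only.

```python
# Illustrative first-order flotation kinetics; the abstract does not state
# which model (if any) was fitted, and the parameters here are invented.
import numpy as np

def recovery(t, R_inf=0.9, k=8.0):
    """Fraction recovered after residence time t (seconds)."""
    return R_inf * (1.0 - np.exp(-k * t))

for t in [0.02, 0.1, 1.0, 10.0]:      # 20 ms to 10 s, the range studied
    print(f"t = {t:6.2f} s  ->  recovery = {recovery(t):.3f}")
```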
Abstract:
The real-time refinement calculus is an extension of the standard refinement calculus in which programs are developed from a precondition plus postcondition style of specification. In addition to adapting standard refinement rules to be valid in the real-time context, specific rules are required for timing constructs such as delays and deadlines. Because many real-time programs may be nonterminating, a further extension is to allow nonterminating repetitions. A real-time specification constrains not only what values should be output, but when they should be output. Hence, for a program to implement such a specification, it must guarantee to output values by the specified times. With standard programming languages such guarantees cannot be made without taking into account the timing characteristics of the implementation of the program on a particular machine. To avoid having to consider such details during the refinement process, we have extended our real-time programming language with a deadline command. The deadline command takes no time to execute and always guarantees to meet the specified time; if the deadline has already passed, the deadline command is infeasible (miraculous in Dijkstra's terminology). When such a real-time program is compiled for a particular machine, one needs to ensure that all execution paths leading to a deadline are guaranteed to reach it by the specified time. We consider this checking to be part of an extended compilation phase. The addition of the deadline command restores for the real-time language the advantage of machine independence enjoyed by non-real-time programming languages.
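The compile-time obligation described above, that every execution path reaching a deadline does so in time, can be pictured as a worst-case execution time (WCET) check over paths. The sketch below is an invented illustration of that check only, not the paper's calculus; the program representation and timing numbers are assumptions.

```python
# Illustration of the extended compilation check: for every execution path
# ending at a `deadline D` command, the accumulated worst-case execution
# time must not exceed D. Commands and WCET figures are invented.

def check_deadline_paths(paths):
    """paths: list of (path, deadline), where path is [(command, wcet_s)]."""
    for path, deadline in paths:
        wcet = sum(cost for _, cost in path)
        status = "OK" if wcet <= deadline else "FAILS"
        print(f"path {[c for c, _ in path]}: WCET {wcet:.3f}s "
              f"vs deadline {deadline:.3f}s -> {status}")

check_deadline_paths([
    ([("read", 0.002), ("compute", 0.010), ("write", 0.003)], 0.020),
    ([("read", 0.002), ("slow_branch", 0.030)], 0.020),
])
```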
Abstract:
The software implementation of the emergency shutdown feature in a major radiotherapy system was analyzed using a directed form of code review based on module dependences. Dependences between modules are labelled by particular assumptions; this allows one to trace through the code and identify those fragments responsible for critical features. An 'assumption tree' is constructed in parallel, showing the assumptions each module makes about others. The root of the assumption tree is the critical feature of interest, and its leaves represent assumptions which, if not valid, might cause the critical feature to fail. The analysis revealed some unexpected assumptions that motivated improvements to the code.
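As an illustration of the assumption-tree structure (the tree shape is from the abstract; the representation, module names and example assumptions below are invented), the root is the critical feature and the leaves are the assumptions a reviewer must validate:

```python
# Sketch of an 'assumption tree': the root is the critical feature, each
# child records an assumption one module makes about another, and the
# leaves are the assumptions that, if invalid, could defeat the feature.
from dataclasses import dataclass, field

@dataclass
class Node:
    assumption: str
    children: list["Node"] = field(default_factory=list)

def leaves(node: Node) -> list[str]:
    """Collect the leaf assumptions under a node, for review."""
    if not node.children:
        return [node.assumption]
    return [a for c in node.children for a in leaves(c)]

tree = Node("emergency shutdown halts the beam", [
    Node("controller: shutdown() is called on any interlock fault", [
        Node("sensor module reports faults within 10 ms"),
    ]),
    Node("hardware driver: the beam-off command is never reordered"),
])

print(leaves(tree))   # the assumptions to validate during review
```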