48 results for Hardware and software
in CentAUR: Central Archive University of Reading - UK
Abstract:
The CAFS search engine is a real machine in a virtual machine world; it is the hardware component of ICL's CAFS system. The paper is an introduction and prelude to the set of papers in this volume on CAFS applications. It defines the CAFS system and its context, together with the function of its hardware and software components. It examines CAFS' role in the broader context of application development and information systems, and highlights some techniques and applications which exploit the CAFS system. Finally, it concludes with some suggestions for possible further developments. 'Search out thy wit for secret policies / And we will make thee famous through the world' (Henry VI, 1:3)
Abstract:
This paper presents the Gentle/G integrated system for reach-and-grasp therapy retraining following brain injury. The design, control and integration of an experimental grasp assistance unit are described for use in robot-assisted stroke rehabilitation. The grasp assist unit is intended to work with the hardware and software of the Gentle/S robot, although the hardware could be adapted to other rehabilitation applications. When used with the Gentle/S robot, a total of 6 active and 3 passive degrees of freedom are available to provide active, active-assist or passive grasp retraining in combination with reaching movements in a reach-grasp-transfer-release sequence.
Abstract:
Advances in hardware and software technology enable us to collect, store and distribute large quantities of data on a very large scale. Automatically discovering and extracting hidden knowledge in the form of patterns from these large data volumes is known as data mining. Data mining technology is not only a part of business intelligence, but is also used in many other application areas such as research, marketing and financial analytics. For example, medical scientists can use patterns extracted from historic patient data to determine whether a new patient is likely to respond positively to a particular treatment; marketing analysts can use patterns extracted from customer data for future advertisement campaigns; finance experts have an interest in patterns that forecast the development of certain stock market shares for investment recommendations. However, extracting knowledge in the form of patterns from massive data volumes imposes a number of computational challenges in terms of processing time, memory, bandwidth and power consumption. These challenges have led to the development of parallel and distributed data analysis approaches and the utilisation of Grid and Cloud computing. This chapter gives an overview of parallel and distributed computing approaches and how they can be used to scale up data mining to large datasets.
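To make the data-parallel idea concrete, here is a minimal sketch (our own illustration, not code from the chapter) of the map-reduce pattern underpinning many parallel data mining approaches: each worker counts item occurrences in its own partition of the data, and the partial counts are then merged and filtered by a support threshold.

```python
# Minimal sketch of data-parallel frequent-item counting: each worker
# counts patterns in its partition (map); partial counts are merged (reduce).
from collections import Counter
from multiprocessing import Pool

def count_items(partition):
    """Count item occurrences in one data partition (map step)."""
    counts = Counter()
    for transaction in partition:
        counts.update(transaction)
    return counts

def parallel_frequent_items(transactions, n_workers=4, min_support=2):
    # Split the data into roughly equal partitions, one per worker.
    chunk = max(1, len(transactions) // n_workers)
    partitions = [transactions[i:i + chunk]
                  for i in range(0, len(transactions), chunk)]
    with Pool(n_workers) as pool:
        partials = pool.map(count_items, partitions)
    # Reduce step: merge partial counts and filter by support threshold.
    total = sum(partials, Counter())
    return {item: n for item, n in total.items() if n >= min_support}

if __name__ == "__main__":
    data = [["bread", "milk"], ["bread", "beer"], ["milk", "beer"],
            ["bread", "milk", "beer"]]
    print(parallel_frequent_items(data, n_workers=2))
```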
Abstract:
Mathematics in Defence 2011. We review transreal arithmetic and present transcomplex arithmetic. These arithmetics have no exceptions. This leads to incremental improvements in computer hardware and software. For example, the range of real numbers, encoded by floating-point bits, is doubled when all of the Not-a-Number (NaN) states in IEEE 754 arithmetic are replaced with real numbers. The task of programming such systems is simplified, and made safer, by discarding the unordered relational operator, leaving only the operators less-than, equal-to and greater-than. The advantages of using a transarithmetic in a computation, or transcomputation as we prefer to call it, may be had by making small changes to compilers and processor designs. However, radical change is possible by exploiting the reliability of transcomputations to make pipelined dataflow machines with a large number of cores. Our initial designs are for a machine with of order one million cores. Such a machine can complete the execution of multiple in-line programs each clock tick.
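As a rough illustration of the floating-point claim (our own sketch, not from the paper): in IEEE 754 binary32 the all-ones exponent is reserved for infinities and NaNs, so freeing those encodings for ordinary numbers adds one extra binade and doubles the largest representable magnitude.

```python
# Our own illustration: count the NaN encodings in IEEE 754 binary32 and
# show that reclaiming the reserved all-ones exponent doubles the range.
EXP_BITS, FRAC_BITS = 8, 23

# Bit patterns with exponent all-ones and a non-zero fraction are NaNs.
nan_patterns = 2 * (2**FRAC_BITS - 1)               # both sign values
print(f"NaN encodings in binary32: {nan_patterns}")  # 16,777,214

# Largest finite binary32 value (biased exponent 254) versus the value the
# freed exponent 255 could encode: one extra power of two, double the range.
max_std = (2 - 2**-FRAC_BITS) * 2.0**127
max_ext = (2 - 2**-FRAC_BITS) * 2.0**128             # hypothetical extension
print(max_ext / max_std)                             # 2.0
```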
Abstract:
TESSA is a toolkit for experimenting with sensory augmentation. It includes hardware and software to facilitate rapid prototyping of interfaces that can enhance one sense using information gathered from another sense. The toolkit contains a range of sensors (e.g. ultrasonic and temperature sensors) and actuators (e.g. tactors or stereo sound), designed modularly so that inputs and outputs can be easily swapped in and out and customized using TESSA's graphical user interface (GUI), with real-time feedback. The system runs on a Raspberry Pi with a built-in touchscreen, providing a compact and portable form that is well suited to field trials. At CHI Interactivity, the audience will have the opportunity to experience sensory augmentation effects using this system and to design their own sensory augmentation interfaces.
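The kind of input-to-output mapping such a toolkit prototypes can be sketched as follows; `read_distance_cm` and `set_tactor_intensity` are hypothetical stand-ins for the real device drivers, not TESSA's actual API.

```python
# Hypothetical sensory-augmentation mapping: an ultrasonic distance reading
# drives the intensity of a vibrotactile actuator. The two callables are
# assumed driver hooks, not part of TESSA's published interface.
import time

MAX_RANGE_CM = 200.0  # assumed usable sensor range

def distance_to_intensity(distance_cm):
    """Closer obstacles produce stronger vibration (0.0 .. 1.0)."""
    clipped = min(max(distance_cm, 0.0), MAX_RANGE_CM)
    return 1.0 - clipped / MAX_RANGE_CM

def augmentation_loop(read_distance_cm, set_tactor_intensity, hz=20):
    """Poll the sensor and update the actuator at a fixed rate."""
    period = 1.0 / hz
    while True:
        set_tactor_intensity(distance_to_intensity(read_distance_cm()))
        time.sleep(period)
```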
Abstract:
The availability of a network strongly depends on the frequency of service outages and the recovery time for each outage. Causes of loss of network resources include complete or partial failure of hardware and software components, power outages, scheduled maintenance such as software and hardware upgrades, operational errors such as configuration errors, and acts of nature such as floods, tornadoes and earthquakes. This paper proposes a practical approach to the enhancement of QoS routing by means of providing alternative or repair paths in the event of a breakage of a working path. The proposed scheme guarantees that every Protected Node (PN) is connected to a multi-repair path such that no further failure or breakage of single or double repair paths can cause any simultaneous loss of connectivity between an ingress node and an egress node. Links to be protected in an MPLS network are predefined, and an LSP request involves the establishment of a working path. The use of multi-protection paths permits the formation of numerous protection paths, allowing greater flexibility. Our analysis examines several methods, including single, double and multi-repair routes, and the prioritization of signals along the protected paths, in order to improve Quality of Service (QoS) and throughput and to reduce protection path placement cost, delay, congestion and collisions.
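As a simplified illustration of repair-path computation (our own sketch, not the paper's scheme): a breadth-first search over the topology with the working path's links removed yields a link-disjoint repair path, so no single link failure can sever both paths at once.

```python
# Find a repair path between ingress and egress that avoids every link of
# the working path. A toy illustration of link-disjoint path protection.
from collections import deque

def repair_path(graph, working_path, src, dst):
    """graph: {node: set(neighbours)}; working_path: list of nodes."""
    # Links used by the working path, stored direction-independently.
    used = {frozenset(pair) for pair in zip(working_path, working_path[1:])}
    parent, queue = {src: None}, deque([src])
    while queue:
        node = queue.popleft()
        if node == dst:
            path = []
            while node is not None:
                path.append(node)
                node = parent[node]
            return path[::-1]
        for nxt in graph[node]:
            if nxt not in parent and frozenset((node, nxt)) not in used:
                parent[nxt] = node
                queue.append(nxt)
    return None  # no link-disjoint repair path exists

g = {"A": {"B", "C"}, "B": {"A", "D"}, "C": {"A", "D"}, "D": {"B", "C"}}
print(repair_path(g, ["A", "B", "D"], "A", "D"))  # ['A', 'C', 'D']
```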
Abstract:
A two-dimensional X-ray scattering system developed around a CCD-based area detector is presented, in terms of both the hardware employed and the software designed and developed. An essential feature is the integration of hardware and software, and of detection and sample environment control, which enables time-resolved in-situ wide-angle X-ray scattering measurements of global structural and orientational parameters of polymeric systems subjected to a variety of controlled external fields. The development and operation of a number of rheometers purpose-built for the application of such fields are described. Examples of the use of this system in monitoring degrees of shear-induced orientation in liquid-crystalline systems and crystallization of linear polymers subsequent to shear flow are presented.
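A hypothetical sketch of the hardware-software integration idea: each detector frame is tagged with the external field applied at the moment of exposure, which is what makes the measurements time-resolved. `set_shear_rate` and `trigger_exposure` are assumed stand-ins for the real instrument drivers, not the system's actual API.

```python
# Assumed acquisition loop: apply a shear field, then record CCD frames,
# tagging each frame with its timestamp and the field in force.
import time

def timed_waxs_run(set_shear_rate, trigger_exposure, shear_rate,
                   n_frames=100, exposure_s=0.5):
    set_shear_rate(shear_rate)                 # apply the external field
    frames = []
    for _ in range(n_frames):
        t0 = time.time()
        image = trigger_exposure(exposure_s)   # one detector frame
        frames.append({"t": t0, "shear_rate": shear_rate, "image": image})
    set_shear_rate(0.0)                        # stop the flow
    return frames
```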
Abstract:
Advances in hardware and software in the past decade have made it possible to capture, record and process fast data streams at a large scale. The research area of data stream mining has emerged as a consequence of these advances, in order to cope with the real-time analysis of potentially large and changing data streams. Examples of data streams include Google searches, credit card transactions, telemetric data and data from continuous chemical production processes. In some cases the data can be processed in batches by traditional data mining approaches. However, some applications require the data to be analysed in real time as soon as it is captured, for example if the data stream is infinite, fast changing, or simply too large to be stored. One of the most important data mining techniques on data streams is classification. This involves training the classifier on the data stream in real time and adapting it to concept drift. Most data stream classifiers are based on decision trees. However, it is well known in the data mining community that there is no single optimal algorithm: an algorithm may work well on one or several datasets but badly on others. This paper introduces eRules, a new rule-based adaptive classifier for data streams, based on an evolving set of rules. eRules induces a set of rules that is constantly evaluated and adapted to changes in the data stream by adding new rules and removing old ones. It differs from the more popular decision tree based classifiers in that it tends to leave data instances unclassified rather than forcing a classification that could be wrong. The ongoing development of eRules aims to improve its accuracy further through dynamic parameter setting, which will also address the problem of changing feature domain values.
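A schematic sketch of the adaptive rule-set idea (our reading of the abstract, not the published eRules implementation): classify only when a rule covers the instance, track each rule's accuracy on the stream, and drop rules whose accuracy decays as the concept drifts.

```python
# Schematic adaptive rule-based stream classifier: abstain when no rule
# covers an instance; monitor rule accuracy; remove rules that decay.
# Induction of new rules from buffered unclassified instances is omitted.
class Rule:
    def __init__(self, conditions, label):
        self.conditions = conditions          # {attribute: required value}
        self.label = label
        self.hits = 0
        self.correct = 0

    def covers(self, x):
        return all(x.get(a) == v for a, v in self.conditions.items())

class AdaptiveRuleClassifier:
    def __init__(self, min_accuracy=0.7, min_hits=10):
        self.rules = []
        self.min_accuracy = min_accuracy
        self.min_hits = min_hits

    def predict(self, x):
        for rule in self.rules:
            if rule.covers(x):
                return rule.label
        return None                           # abstain rather than guess

    def update(self, x, y):
        """Called once per labelled stream instance."""
        for rule in self.rules:
            if rule.covers(x):
                rule.hits += 1
                rule.correct += int(y == rule.label)
        # Remove rules whose accuracy has decayed below the threshold.
        self.rules = [r for r in self.rules
                      if r.hits < self.min_hits
                      or r.correct / r.hits >= self.min_accuracy]
```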
Abstract:
There is a renewed interest in immersive visualization to navigate digital datasets associated with large building and infrastructure projects. Following work with a fully immersive visualization facility at the University, this paper details the development of a complementary mobile visualization environment. It articulates progress on the requirements for this facility; the overall design of hardware and software; and the laboratory testing and planning for user pilots in construction applications. Like our fixed facility, this new lightweight mobile solution enables a group of users to navigate a 3D model at 1:1 scale and to work collaboratively with structured asset information. However, it offers greater flexibility, as two users can assemble and start using it at a new location within an hour. The solution has been developed and tested in a laboratory and will be piloted in engineering design review and stakeholder engagement applications on a major construction project.
Abstract:
Advances in hardware and software technologies make it possible to capture streaming data. The area of Data Stream Mining (DSM) is concerned with the analysis of these vast amounts of data as they are generated in real time. Data stream classification is one of the most important DSM techniques, allowing previously unseen data instances to be classified. Unlike traditional classifiers for static data, data stream classifiers need to adapt to concept changes (concept drift) in the stream in real time in order to reflect the most recent concept in the data as accurately as possible. A recent addition to the data stream classifier toolbox is eRules, which induces and updates a set of expressive rules that can easily be interpreted by humans. However, like most rule-based data stream classifiers, eRules exhibits poor computational performance when confronted with continuous attributes. In this work, we propose an approach to deal with continuous data effectively and accurately in rule-based classifiers by using the Gaussian distribution as a heuristic for building rule terms on continuous attributes. We show, using eRules as an example, that incorporating our method for continuous attributes indeed speeds up the real-time rule induction process while maintaining a similar level of accuracy compared with the original eRules classifier. We term this new version of eRules, extended with our approach, G-eRules.
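A minimal sketch of the Gaussian heuristic as described (not the published G-eRules code): fit a normal distribution to the values a continuous attribute takes for one class and turn the fit into an interval rule term, avoiding an exhaustive search over candidate split points.

```python
# Gaussian heuristic for continuous attributes: derive an interval rule
# term [mu - k*sigma, mu + k*sigma] from one class's observed values.
from statistics import mean, stdev

def gaussian_rule_term(values, width=1.0):
    """Interval term centred on the class mean of a continuous attribute."""
    mu, sigma = mean(values), stdev(values)
    return (mu - width * sigma, mu + width * sigma)

def term_covers(term, x):
    lo, hi = term
    return lo <= x <= hi

# Example: values of one continuous attribute buffered for one class.
buffered = [1.4, 1.3, 1.5, 1.4, 1.6, 1.2, 1.5]
term = gaussian_rule_term(buffered)
print(term, term_covers(term, 1.45))
```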
Abstract:
Network diagnosis in Wireless Sensor Networks (WSNs) is a difficult task due to their ad hoc nature, the invisibility of their internal running status, and particularly the fact that the network structure can change frequently due to link failure. To solve this problem, we propose a Mobile Sink (MS) based distributed fault diagnosis algorithm for WSNs. An MS, or mobile fault detector, is usually a mobile robot or vehicle equipped with a wireless transceiver that performs the task of a mobile base station while also diagnosing the hardware and software status of deployed network sensors. Our MS mobile fault detector moves through the network area, polling each static sensor node to diagnose the hardware and software status of nearby sensor nodes using only single-hop communication. The fault detection accuracy and functionality of the network are thereby significantly increased. In order to maintain an excellent Quality of Service (QoS), we employ an optimal fault diagnosis tour planning algorithm. In addition to saving energy and time, the tour planning algorithm excludes faulty sensor nodes from the next diagnosis tour. We demonstrate the effectiveness of the proposed algorithms through simulation and real-life experimental results.
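As a simplified stand-in for the tour planning step (a greedy nearest-neighbour heuristic, not the paper's optimal algorithm): order the non-faulty sensor positions by proximity so the mobile sink's next diagnosis tour skips nodes already marked faulty.

```python
# Greedy diagnosis-tour sketch: visit the nearest unvisited healthy node
# next, excluding nodes already diagnosed as faulty from the tour.
import math

def plan_tour(nodes, faulty, start=(0.0, 0.0)):
    """nodes: {node_id: (x, y)}; faulty: set of node_ids to exclude."""
    remaining = {n: p for n, p in nodes.items() if n not in faulty}
    tour, pos = [], start
    while remaining:
        nxt = min(remaining, key=lambda n: math.dist(pos, remaining[n]))
        tour.append(nxt)
        pos = remaining.pop(nxt)
    return tour

sensors = {"s1": (1, 1), "s2": (4, 0), "s3": (2, 3), "s4": (0, 4)}
print(plan_tour(sensors, faulty={"s2"}))  # ['s1', 's3', 's4']
```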