79 results for Embedded computer systems
Abstract:
Purpose – The purpose of this paper is to investigate the concept of intelligent buildings (IBs) and the opportunities offered by the application of computer-aided facilities management (CAFM) systems. Design/methodology/approach – Definitions of IBs are investigated using a questionnaire survey, particularly definitions that embrace open standards for effective operational change. The survey further investigated the extension of CAFM to IB concepts and the opportunities that such integrated systems would provide to facilities management (FM) professionals. Findings – The results showed variation in the understanding of the concept of IBs and the application of CAFM. The survey showed that 46 per cent of respondents use a CAFM system, with a majority agreeing on the potential of CAFM in the delivery of effective facilities. Research limitations/implications – The questionnaire survey results are limited to the views of the respondents within the context of FM in the UK. Practical implications – Following any one of the many definitions of an IB does not necessarily lead to technologies or equipment that conform to an open standard. Such an open standard, together with documentation of the systems produced by vendors, is the key to integrating CAFM with other building management systems (BMS) and to further harnessing the application of CAFM for IBs. Originality/value – The paper offers experience-based suggestions for both the demand and supply sides of service procurement, aimed at realising the feasible benefits and avoiding the obstacles that currently hinder adoption, and it provides insight into current and future tools for the mobile aspects of FM. The findings are relevant for service providers and operators as well.
Abstract:
This paper describes the development of an interface to a hospital portal system for information, communication and entertainment such that it can be used easily and effectively by all patients regardless of their age, disability, computer experience or native language. Specifically, this paper reports on the work conducted to ensure that the interface design took into account the needs of visually impaired users.
Abstract:
It is argued that the truth status of emergent properties of complex adaptive systems models should be based on an epistemology of proof by constructive verification and therefore on the ontological axioms of a non-realist logical system such as constructivism or intuitionism. ‘Emergent’ properties of complex adaptive systems (CAS) models create particular epistemological and ontological challenges. These challenges bear directly on current debates in the philosophy of mathematics and in theoretical computer science. CAS research, with its emphasis on computer simulation, is heavily reliant on models which explore the entailments of Formal Axiomatic Systems (FAS). The incompleteness results of Gödel, the incomputability results of Turing, and the Algorithmic Information Theory results of Chaitin, undermine a realist (platonic) truth model of emergent properties. These same findings support the hegemony of epistemology over ontology and point to alternative truth models such as intuitionism, constructivism and quasi-empiricism.
Abstract:
The evolvability of a software artifact is its capacity for producing heritable or reusable variants; the inverse quality is the artifact's inertia or resistance to evolutionary change. Evolvability in software systems may arise from engineering and/or self-organising processes. We describe our 'Conditional Growth' simulation model of software evolution and show how it can be used to investigate evolvability from a self-organisation perspective. The model is derived from the Bak-Sneppen family of 'self-organised criticality' simulations. It shows good qualitative agreement with Lehman's 'laws of software evolution' and reproduces phenomena that have been observed empirically. The model suggests interesting predictions about the dynamics of evolvability and implies that much of the observed variability in software evolution can be accounted for by comparatively simple self-organising processes.
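The Bak-Sneppen family of simulations this abstract refers to can be illustrated with a short sketch. This is the classic Bak-Sneppen model, not the authors' 'Conditional Growth' variant; all names and parameters here are illustrative:

```python
import random

def bak_sneppen(n_sites=64, n_steps=10_000, seed=0):
    """Classic Bak-Sneppen self-organised-criticality model.

    Each site on a ring holds a random 'fitness'.  At every step the
    least-fit site and its two neighbours are replaced with fresh random
    values.  This minimal update rule drives the system toward a critical
    state in which most fitness values sit above a threshold.
    """
    rng = random.Random(seed)
    fitness = [rng.random() for _ in range(n_sites)]
    for _ in range(n_steps):
        worst = min(range(n_sites), key=fitness.__getitem__)
        # replace the weakest site and its neighbours (index -1 wraps
        # around, and (worst + 1) % n_sites closes the ring)
        for i in (worst - 1, worst, (worst + 1) % n_sites):
            fitness[i] = rng.random()
    return fitness
```

The appeal of such models in this context is that complex, punctuated dynamics emerge from a very simple local rule, which is the kind of explanation the 'Conditional Growth' model offers for variability in software evolution.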
Abstract:
This paper looks at how implant technology can be used either to increase the range of human abilities or to diminish the effects of a neural illness, such as Parkinson's Disease. The key element is the need for a clear interface linking the human brain directly with a computer. The area of interest here is the use of implant technology, particularly where a connection is made between technology and the human brain and/or nervous system. Pilot tests and experimentation are invariably carried out a priori to investigate the eventual possibilities before human subjects are themselves involved. Some of the more pertinent animal studies are discussed here. The paper goes on to describe human experimentation, in particular that carried out by the author himself, which led to him receiving a neural implant that linked his nervous system bi-directionally with the internet. With this in place, neural signals were transmitted to various technological devices to directly control them. In particular, feedback to the brain was obtained from the fingertips of a robot hand and from ultrasonic (extra) sensory input. A view is taken as to the prospects for the future, both in the near term as a therapeutic device and in the long term as a form of enhancement.
Abstract:
This paper presents ongoing research aimed at integrating process automation and process management support in the context of media production. This has been addressed through a holistic approach to software engineering applied to media production modelling, to ensure design correctness, completeness and effectiveness. The focus of the research and development has been to enhance metadata management throughout the process, in a similar fashion to that achieved in Decision Support Systems (DSS), to facilitate well-grounded business decisions. The paper sets out the aims, objectives and methodology deployed, describes the solution in some detail, and presents some preliminary conclusions and the planned future work.
Abstract:
BCI systems require correct classification of signals interpreted from the brain for useful operation. To this end, this paper investigates a method proposed in [1] to correctly classify a series of images presented to a group of subjects in [2]. We show that it is possible to use the proposed methods to correctly recognise the original stimuli presented to a subject from analysis of their EEG. Additionally, we use a verification set to show that the trained classification method can be applied to a different set of data. We go on to investigate the issue of invariance in EEG signals, that is, whether the brain's representation of similar stimuli is recognisable across different subjects. Finally, we consider the usefulness of the investigated methods for an improved BCI system and discuss how they could potentially lead to great improvements in ease of use for the end user by offering an alternative, more intuitive, control-based mode of operation.
Abstract:
Different types of mental activity are utilised as input in Brain-Computer Interface (BCI) systems. One such activity type is based on Event-Related Potentials (ERPs). The characteristics of ERPs are not visible in single trials, so averaging over a number of trials is necessary before the signals become usable. An improvement in ERP-based BCI operation and system usability could be obtained if the use of single-trial ERP data were possible. The method of Independent Component Analysis (ICA) can be utilised to separate single-trial recordings of ERP data into components that correspond to ERP characteristics, background electroencephalogram (EEG) activity, and other components of non-cerebral origin. Choosing specific components and using them to reconstruct “denoised” single-trial data could improve signal quality, allowing the successful use of single-trial data without the need for averaging. This paper assesses single-trial ERP signals reconstructed using a selection of estimated components from the application of ICA to the raw ERP data. Signal improvement is measured using Contrast-To-Noise measures. It was found that such analysis improves the signal quality in all single trials.
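The reconstruction step this abstract describes — keeping a subset of estimated components and projecting them back into channel space — can be sketched as follows. In practice the unmixing matrix `W` would be estimated by an ICA algorithm such as FastICA; here it is assumed to be given, and all function and variable names are illustrative rather than taken from the paper:

```python
import numpy as np

def reconstruct_denoised(X, W, keep):
    """Rebuild single-trial data from a chosen subset of ICA components.

    X    : channels x samples data matrix for one trial
    W    : unmixing matrix, so S = W @ X are the estimated components
    keep : indices of the components to retain (e.g. ERP-like ones);
           all others (background EEG, non-cerebral artefacts) are
           zeroed before projecting back with the mixing matrix
           A = inv(W).
    """
    S = W @ X                              # estimated independent components
    S_sel = np.zeros_like(S)
    S_sel[list(keep)] = S[list(keep)]      # retain only the chosen components
    A = np.linalg.inv(W)                   # mixing matrix
    return A @ S_sel                       # "denoised" single-trial data
```

Because unretained components are simply zeroed, the output contains only the scalp projection of the kept sources, which is what a contrast-to-noise measure would then assess.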
Abstract:
Semiotics is the study of signs. The application of semiotics to information systems design is based on the notion that information systems are organizations within which agents deploy signs in the form of actions according to a set of norms. An analysis of the relationships among the agents, their actions and the norms gives a better specification of the system. Distributed multimedia systems (DMMS) can be viewed as systems consisting of many dynamic, self-controlled normative agents engaging in complex interaction and processing of multimedia information. This paper reports the work of applying the semiotic approach to the design and modeling of DMMS, with emphasis on using semantic analysis under the semiotic framework. A semantic model of DMMS describing its various components and their ontological dependencies is presented, which then serves as a design model and is implemented in a semantic database. Benefits of using the semantic database are discussed with reference to various design scenarios.
Abstract:
This paper describes a multi-agent architecture to support the modelling of CSCW systems. Since CSCW involves different organizations, it can be seen as a social model. From this point of view, we investigate the possibility of modelling CSCW with agent technology; based on the organizational semiotics method, a multi-agent architecture is then proposed using the EDA agent model. We explain the components of this multi-agent architecture and the design process. It is argued that this approach provides a new perspective for modelling CSCW systems.
Abstract:
We describe a high-level design method to synthesize multi-phase regular arrays. The method is based on deriving component designs using classical regular (or systolic) array synthesis techniques and composing these separately evolved component designs into a unified global design. Similarity transformations are applied to component designs in the composition stage in order to align data flow between the phases of the computations. Three transformations are considered: rotation, reflection and translation. The technique is aimed at the design of hardware components for high-throughput embedded systems applications, and we demonstrate this by deriving a multi-phase regular array for the 2-D DCT algorithm, which is widely used in many video communications applications.
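The 2-D DCT used as the demonstration case is separable, which is what makes a multi-phase (row pass, then column pass) array implementation natural. A minimal software sketch of that two-phase structure, assuming the orthonormal DCT-II and not reflecting the paper's hardware derivation:

```python
import numpy as np

def dct_1d(x):
    """Orthonormal 1-D DCT-II of a vector x of length N."""
    N = len(x)
    n = np.arange(N)
    # C[k, n] = cos(pi * (n + 1/2) * k / N)
    C = np.cos(np.pi * (n[None, :] + 0.5) * n[:, None] / N)
    X = C @ x
    X *= np.sqrt(2.0 / N)      # orthonormal scaling
    X[0] /= np.sqrt(2.0)       # DC term gets an extra 1/sqrt(2)
    return X

def dct_2d(block):
    """Separable 2-D DCT: 1-D DCT along rows (phase 1),
    then along columns (phase 2)."""
    rows = np.apply_along_axis(dct_1d, 1, block)
    return np.apply_along_axis(dct_1d, 0, rows)
```

The two phases compute identical 1-D transforms on transposed data, which is why aligning data flow between them (the role of the similarity transformations in the abstract) is the central composition problem.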
Abstract:
Time correlation functions yield profound information about the dynamics of a physical system and hence are frequently calculated in computer simulations. For systems whose dynamics span a wide range of time, currently used methods require significant computer time and memory. In this paper, we discuss the multiple-tau correlator method for the efficient calculation of accurate time correlation functions on the fly during computer simulations. The multiple-tau correlator is efficacious in terms of computational requirements and can be tuned to the desired level of accuracy. Further, we derive estimates for the error arising from the use of the multiple-tau correlator and extend it for use in the calculation of mean-square particle displacements and dynamic structure factors. The method described here, in hardware implementation, is routinely used in light scattering experiments but has not yet found widespread use in computer simulations.
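A minimal software sketch of a multiple-tau correlator of the kind this abstract describes, with `p` lags per level and block-averaging by a factor `m` between levels, might look like this. Parameter names are illustrative, and real implementations also correct for the bias that block-averaging introduces at coarse levels:

```python
class MultipleTauCorrelator:
    """On-the-fly multiple-tau autocorrelator (sketch).

    Level 0 correlates raw samples at lags 0..p-1.  Every m samples are
    block-averaged and pushed to the next level, which covers lags
    spaced m times further apart, so memory and cost grow only
    logarithmically with the longest lag.
    """

    def __init__(self, p=16, m=2, levels=8):
        self.p, self.m, self.L = p, m, levels
        self.hist = [[] for _ in range(levels)]          # newest-first recent samples
        self.corr = [[0.0] * p for _ in range(levels)]   # running sums of x(t)*x(t-tau)
        self.count = [[0] * p for _ in range(levels)]    # products accumulated per lag
        self.acc = [0.0] * levels                        # block-average accumulators
        self.n = [0] * levels                            # samples seen per level

    def add(self, x):
        self._push(x, 0)

    def _push(self, x, lvl):
        if lvl >= self.L:
            return
        h = self.hist[lvl]
        h.insert(0, x)
        if len(h) > self.p:
            h.pop()
        for tau, y in enumerate(h):        # tau = 0 is x against itself
            self.corr[lvl][tau] += x * y
            self.count[lvl][tau] += 1
        self.acc[lvl] += x
        self.n[lvl] += 1
        if self.n[lvl] % self.m == 0:      # every m samples: coarsen and recurse
            self._push(self.acc[lvl] / self.m, lvl + 1)
            self.acc[lvl] = 0.0

    def result(self):
        """Return (lags, values) in original time units, normalized by counts."""
        lags, vals = [], []
        for lvl in range(self.L):
            start = 0 if lvl == 0 else self.p // self.m   # skip lags the finer level covers
            for tau in range(start, self.p):
                if self.count[lvl][tau]:
                    lags.append(tau * self.m ** lvl)
                    vals.append(self.corr[lvl][tau] / self.count[lvl][tau])
        return lags, vals
```

Feeding N samples costs O(p) work per sample regardless of the longest lag reached, which is the efficiency property the abstract contrasts with direct methods.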