749 results for Computer Forensics, Profiling
Abstract:
Purpose – The purpose of this paper is to investigate the concept of intelligent buildings (IBs) and the opportunities offered by the application of computer-aided facilities management (CAFM) systems. Design/methodology/approach – In this paper definitions of IBs are investigated using a questionnaire survey, particularly definitions that embrace open standards for effective operational change. The survey further investigated the extension of CAFM to IB concepts and the opportunities that such integrated systems would provide to facilities management (FM) professionals. Findings – The results showed variation in the understanding of the concept of IBs and the application of CAFM. The survey showed that 46 per cent of respondents use a CAFM system, with a majority agreeing on the potential of CAFM in the delivery of effective facilities. Research limitations/implications – The questionnaire survey results are limited to the views of the respondents within the context of FM in the UK. Practical implications – Following the many definitions of an IB does not necessarily lead to technologies or equipment that conform to an open standard. Such open standards, and the documentation of systems produced by vendors, are the key to integrating CAFM with other building management systems (BMS) and to further harnessing the application of CAFM for IBs. Originality/value – The paper gives experience-based suggestions for both the demand and supply sides of service procurement to realise the feasible benefits and avoid the obstacles that currently hinder adoption, and it provides insight into current and future tools for the mobile aspects of FM. The findings are relevant for service providers and operators alike.
Abstract:
It is evident that quantitative information on different microbial groups and their contribution in terms of activity in the gastrointestinal (GI) tract of humans and animals is required in order to formulate functional diets targeting improved gut function and host health. In this work, quantitative information on the levels and spatial distributions of Bacteroides spp., Eubacterium spp., Clostridium spp., Escherichia coli, Bifidobacterium spp. and Lactobacillus/Enterococcus spp. along the porcine large intestine was obtained using 16S rRNA-targeted probes and fluorescent in situ hybridisation (FISH). Caecum, ascending colon (AC) and rectum luminal digesta from three groups of individually housed growing pigs fed either a corn-soybean basal diet (CON diet) or a prebiotic diet containing 10 g/kg oligofructose (FOS diet) or trans-galactooligosaccharides (TOS diet) at the expense of cornstarch were analysed. DAPI staining was used to enumerate the total number of cells in the samples. Populations of total cells, Bacteroides, Eubacterium, Clostridium and Bifidobacterium declined significantly (P < 0.05) from caecum to rectum and were not affected by dietary treatments. Populations of Lactobacillus/Enterococcus and E. coli did not differ throughout the large intestine. The relative percent contribution of each bacterial group to the total cell count did not differ between caecum and rectum, with the exception of Eubacterium, which was higher in the AC digesta. FISH analysis showed that the sum of all bacterial groups made up only a small percentage of the total cells: 12.4%, 21.8% and 10.3% in the caecum, AC and rectum, respectively. This supports the view that the diversity of the GI microflora in swine may be higher than in other species. Given the substantially higher numerical trends seen in the FOS and TOS treatments for total volatile fatty acid and acetate concentrations and for glycolytic activities, it could be postulated that FOS and TOS promoted saccharolytic activities in the porcine colon. © 2006 Elsevier Ltd. All rights reserved.
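As a rough illustration of the enumeration arithmetic described above, the following Python sketch computes each probed group's percent contribution to a DAPI total count and the combined probe coverage; all counts are hypothetical placeholders, not values from the study.

```python
# All counts below are hypothetical placeholders, not values from the study.
fish_counts = {                      # probe-specific counts, cells/g digesta
    "Bacteroides": 2.1e9,
    "Eubacterium": 1.4e9,
    "Clostridium": 0.9e9,
    "E. coli": 0.2e9,
    "Bifidobacterium": 0.5e9,
    "Lactobacillus/Enterococcus": 0.6e9,
}
dapi_total = 4.6e10                  # DAPI total cell count, same sample

# Percent contribution of each probed group to the total cell count.
for group, count in fish_counts.items():
    print(f"{group}: {100 * count / dapi_total:.1f}%")

# Fraction of total cells covered by all probes combined; a low value, as
# reported above, points to substantial unprobed diversity.
print(f"probe coverage: {100 * sum(fish_counts.values()) / dapi_total:.1f}%")
```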
Abstract:
Media content distribution on-demand becomes more complex when performed on a mass scale, involving various channels with distinct and dynamic network characteristics and deploying a variety of terminal devices offering a wide range of capabilities. It is practically impossible to create and prepackage static versions of the same content to match all the varying demand parameters of clients across contexts. In this paper we present a profiling management approach for dynamically personalised media content delivery on-demand, integrated with the AXMEDIS Framework. The client profiles comprise representations of the User, Device, Network and Context of content delivery based on MPEG-21 DIA. Although the most challenging proving ground for this personalised content delivery has been the mobile testbed, i.e. distribution to mobile handsets, the framework described here can be deployed for distribution, via the AXMEDIS PnP module, through other channels (e.g. satellite, Internet) to a range of client terminals (e.g. desktops, kiosks, IPTV and other terminals) whose baseline terminal capabilities can be made available by the manufacturers, as is the norm.
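To make the profiling idea concrete, here is a minimal Python sketch, not the AXMEDIS API, of a client profile spanning the four MPEG-21 DIA dimensions named above and a naive selection among pre-encoded variants; all field names, variants and thresholds are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class ClientProfile:
    user_language: str    # User dimension
    screen_width: int     # Device (terminal capability) dimension
    bandwidth_kbps: int   # Network dimension
    context: str          # Context dimension, e.g. "mobile", "kiosk"

# Hypothetical variants: (min bandwidth in kbps, native width in px, name).
VARIANTS = [
    (2000, 1920, "hd"),
    (500, 1280, "sd"),
    (0, 320, "mobile-low"),
]

def select_variant(profile: ClientProfile) -> str:
    """Pick the richest variant the client's network and screen can handle."""
    for min_bw, native_width, name in VARIANTS:
        if profile.bandwidth_kbps >= min_bw and profile.screen_width >= native_width:
            return name
    return VARIANTS[-1][2]  # fall back to the lightest variant

print(select_variant(ClientProfile("en", 320, 200, "mobile")))  # -> mobile-low
```

In a real deployment the decision would of course be driven by the full DIA descriptors rather than two scalar fields, but the shape of the lookup is the same: profile in, matched content variant out.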
Abstract:
In this article, an overview of some of the latest developments in the field of cerebral cortex to computer interfacing (CCCI) is given. This is posed in the more general context of Brain-Computer Interfaces in order to assess advantages and disadvantages. The emphasis is placed firmly on practical studies that have been undertaken and reported on, as opposed to those speculated about, simulated, or proposed as future projects. Related areas are discussed briefly, only in the context of their contribution to the studies being undertaken. The area of focus is the use of invasive implant technology, where a connection is made directly with the cerebral cortex and/or nervous system. Tests and experimentation that do not involve human subjects are invariably carried out a priori to indicate the eventual possibilities before human subjects are themselves involved. Some of the more pertinent animal studies from this area are discussed. The paper goes on to describe human experimentation, in which neural implants have linked the human nervous system bidirectionally with technology and the Internet. A view is taken on the prospects for CCCI in the future, in terms of its broad therapeutic role.
Abstract:
BCI systems require correct classification of signals interpreted from the brain for useful operation. To this end, this paper investigates a method proposed in [1] to correctly classify a series of images presented to a group of subjects in [2]. We show that it is possible to use the proposed methods to correctly recognise the original stimuli presented to a subject from analysis of their EEG. Additionally, we use a verification set to show that the trained classification method can be applied to a different set of data. We go on to investigate the issue of invariance in EEG signals, that is, whether the brain's representation of similar stimuli is recognisable across different subjects. Finally, we consider the usefulness of the investigated methods for an improved BCI system and discuss how they could lead to substantial improvements in ease of use for the end user by offering an alternative, more intuitive control-based mode of operation.
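The train-then-verify workflow the abstract describes can be sketched as follows; this uses scikit-learn with synthetic stand-in features, since the paper's actual EEG features and classifier are not specified here.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in features: 200 single-trial vectors, 2 stimulus classes.
X = rng.normal(size=(200, 64))
y = rng.integers(0, 2, size=200)
X[y == 1] += 0.5                    # inject a weak class-dependent shift

# Hold out a verification set, mirroring the paper's use of unseen data to
# confirm the trained classifier generalises beyond its training trials.
X_train, X_verify, y_train, y_verify = train_test_split(
    X, y, test_size=0.3, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("verification accuracy:", clf.score(X_verify, y_verify))
```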
Abstract:
This paper describes a study conducted to learn more about how older adults use the tools in a GUI to undertake tasks in Windows applications. The objective was to gain insight into what people did and what they found most difficult. File and folder manipulation and some aspects of formatting presented difficulties, and these were thought to be related to a lack of understanding of the task model, to difficulty in correctly interpreting the visual cues presented by the interface, and to problems with recalling and translating the task model into a suitable sequence of actions.
Abstract:
The paper describes the implementation of an offline, low-cost Brain-Computer Interface (BCI) as an alternative to more expensive commercial models. Using inexpensive general-purpose clinical EEG acquisition hardware (Truscan32, Deymed Diagnostic) as the base unit, a synchronisation module was constructed to allow the EEG hardware to be operated precisely in time, enabling the recording of automatically time-stamped EEG signals. The synchronisation module allows the EEG recordings to be aligned in a stimulus time-locked fashion for further processing by the classifier, which establishes the class of the stimulus sample by sample. This allows signals to be acquired from the subject's brain for a goal-oriented BCI application based on the oddball paradigm. An appropriate graphical user interface (GUI) was constructed and implemented as the method of eliciting the required responses (in this case Event-Related Potentials, or ERPs) from the subject.
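A minimal sketch of the stimulus time-locked alignment step, assuming a continuous multi-channel recording and known stimulus onset times; the channel count, sampling rate and epoch window below are illustrative assumptions, not Truscan32 specifics.

```python
import numpy as np

fs = 250                                  # sampling rate in Hz (assumed)
eeg = np.random.randn(32, 60 * fs)        # 32 channels, 60 s of placeholder signal
stim_onsets_s = (5, 12, 20, 33, 47)       # stimulus onset times in seconds
stim_samples = [int(t * fs) for t in stim_onsets_s]

pre, post = int(0.1 * fs), int(0.6 * fs)  # 100 ms before, 600 ms after stimulus

# Cut one fixed-length, stimulus time-locked epoch per onset; epochs can then
# be averaged or passed to a classifier sample by sample.
epochs = np.stack([eeg[:, s - pre:s + post] for s in stim_samples])
print(epochs.shape)  # (n_stimuli, n_channels, n_samples) = (5, 32, 175)
```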
Abstract:
Different types of mental activity are utilised as input in Brain-Computer Interface (BCI) systems. One such activity type is based on Event-Related Potentials (ERPs). The characteristics of ERPs are not visible in single trials, so averaging over a number of trials is necessary before the signals become usable. ERP-based BCI operation and system usability could be improved if single-trial ERP data could be used. Independent Component Analysis (ICA) can be applied to separate single-trial recordings of ERP data into components that correspond to ERP characteristics, background electroencephalogram (EEG) activity, and components of non-cerebral origin. Selecting specific components and using them to reconstruct "denoised" single-trial data could improve signal quality, allowing single-trial data to be used successfully without the need for averaging. This paper assesses single-trial ERP signals reconstructed from a selection of the components estimated by applying ICA to the raw ERP data. Signal improvement is measured using contrast-to-noise measures. It was found that this analysis improves the signal quality in all single trials.
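The ICA-reconstruction idea can be sketched as below with scikit-learn's FastICA on a synthetic "ERP plus noise" mixture; the component-selection heuristic and the contrast-to-noise proxy are simplified assumptions, not the paper's actual criteria.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(1)
n_channels, n_samples = 8, 500
t = np.linspace(0, 1, n_samples)

# Simulated single trial: one sharp "ERP-like" source mixed with noise sources.
sources = rng.normal(size=(n_channels, n_samples))
sources[0] = np.exp(-((t - 0.3) ** 2) / 0.001)    # idealised ERP peak at 300 ms
mixing = rng.normal(size=(n_channels, n_channels))
trial = mixing @ sources

ica = FastICA(n_components=n_channels, random_state=0)
components = ica.fit_transform(trial.T)            # (n_samples, n_components)

# Heuristic component selection: keep components peaking in the expected ERP
# window; the paper's actual selection criterion is not reproduced here.
window = (t > 0.2) & (t < 0.4)
keep = [i for i in range(n_channels)
        if np.abs(components[window, i]).max() > 4 * components[:, i].std()]

cleaned = np.zeros_like(components)
cleaned[:, keep] = components[:, keep]
denoised = ica.inverse_transform(cleaned).T        # reconstructed "denoised" trial

# Crude contrast-to-noise proxy: in-window peak versus out-of-window variability.
cnr = np.abs(denoised[:, window]).max() / denoised[:, ~window].std()
print("kept components:", keep, "CNR proxy:", round(float(cnr), 2))
```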
Abstract:
Increasingly, distributed systems are being used to host all manner of applications. While these platforms provide a relatively cheap and effective means of executing applications, so far there has been little work on developing tools and utilities that help application developers understand problems with the supporting software or the executing applications. To fully understand why an application executing on a distributed system is not behaving as expected, it is important to analyse not only the application but also the underlying middleware and the operating system; otherwise issues could be missed, and overall performance profiling and fault diagnosis would certainly be harder. We believe that one approach to profiling and analysing distributed systems and their applications is via the plethora of log files generated at runtime. In this paper we report on a system, Slogger, that utilises various emerging Semantic Web technologies to gather the heterogeneous log files generated by the various layers in a distributed system and unify them in a common data store. Once unified, the log data can be queried and visualised in order to highlight potential problems or issues occurring in the supporting software or the application itself.
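As an illustration of the log-unification idea, and not of Slogger's actual schema, the following Python sketch normalises heterogeneous log records into RDF with rdflib and runs a cross-layer SPARQL query; the vocabulary, records and field names are hypothetical.

```python
from rdflib import Graph, Literal, Namespace, RDF

LOG = Namespace("http://example.org/log#")   # hypothetical log vocabulary
g = Graph()

# Placeholder records standing in for logs from different system layers.
records = [
    ("app",        "2024-01-01T10:00:01", "ERROR", "request timed out"),
    ("middleware", "2024-01-01T10:00:01", "WARN",  "queue depth high"),
    ("os",         "2024-01-01T10:00:02", "INFO",  "swap usage 80%"),
]

# Normalise every record, whatever its source layer, into one RDF graph.
for i, (layer, ts, level, msg) in enumerate(records):
    event = LOG[f"event{i}"]
    g.add((event, RDF.type, LOG.Event))
    g.add((event, LOG.layer, Literal(layer)))
    g.add((event, LOG.timestamp, Literal(ts)))
    g.add((event, LOG.level, Literal(level)))
    g.add((event, LOG.message, Literal(msg)))

# Cross-layer query over the unified store: all non-INFO events, in time order.
query = """
SELECT ?layer ?ts ?msg WHERE {
    ?e log:level ?lvl ; log:layer ?layer ; log:timestamp ?ts ; log:message ?msg .
    FILTER(?lvl != "INFO")
} ORDER BY ?ts
"""
for row in g.query(query, initNs={"log": LOG}):
    print(row.layer, row.ts, row.msg)
```

Once the layers share one store and vocabulary like this, correlating an application error with a middleware warning logged at the same instant becomes a single query rather than a manual trawl through separate files.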