29 results for Man-Machine Perceptual Performance.
Abstract:
We introduce transreal analysis as a generalisation of real analysis. We find that the generalisation of the real exponential and logarithmic functions is well defined for all transreal numbers. Hence, we derive well defined values of all transreal powers of all non-negative transreal numbers. In particular, we find a well defined value for zero to the power of zero. We also note that the computation of products via the transreal logarithm is identical to the transreal product, as expected. We then generalise all of the common, real, trigonometric functions to transreal functions and show that transreal (sin x)/x is well defined everywhere. This raises the possibility that transreal analysis is total, in other words, that every function and every limit is everywhere well defined. If so, transreal analysis should be an adequate mathematical basis for analysing the perspex machine - a theoretical, super-Turing machine that operates on a total geometry. We go on to dispel all of the standard counter "proofs" that purport to show that division by zero is impossible. This is done simply by carrying the proof through in transreal arithmetic or transreal analysis. We find that either the supposed counter proof has no content or else that it supports the contention that division by zero is possible. The supposed counter proofs rely on extending the standard systems in arbitrary and inconsistent ways and then showing, tautologously, that the chosen extensions are not consistent. This shows only that the chosen extensions are inconsistent and does not bear on the question of whether division by zero is logically possible. By contrast, transreal arithmetic is total and consistent so it defeats any possible "straw man" argument. Finally, we show how to arrange that a function has finite or else unmeasurable (nullity) values, but no infinite values. This arithmetical arrangement might prove useful in mathematical physics because it outlaws naked singularities in all equations.
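As a rough illustration of the totality claimed above, the quotient rules of transreal arithmetic can be sketched in a few lines. The string sentinel for nullity and the function name are illustrative choices, not notation from the paper.

```python
# A minimal sketch of transreal division, in which every quotient is
# defined: a/0 is +infinity for a > 0, -infinity for a < 0, and 0/0 is
# the unmeasurable value "nullity". The sentinel "Phi" for nullity is
# an illustrative choice, not part of the original paper.
import math

NULLITY = "Phi"  # hypothetical sentinel for the transreal nullity value

def transreal_div(a, b):
    """Total division over the reals extended with +/-inf and nullity."""
    if a == NULLITY or b == NULLITY:
        return NULLITY                  # nullity absorbs all operations
    if math.isinf(a) and math.isinf(b):
        return NULLITY                  # inf/inf is unmeasurable
    if b == 0:
        if a > 0:
            return math.inf
        if a < 0:
            return -math.inf
        return NULLITY                  # 0/0 is nullity, not an error
    return a / b

print(transreal_div(1, 0))   # inf
print(transreal_div(-3, 0))  # -inf
print(transreal_div(0, 0))   # Phi
```

Because every case returns a value, no input can raise a division-by-zero error, which is the sense in which the arithmetic is total.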
Abstract:
Many evolutionary algorithm applications involve either fitness functions with high time complexity or large dimensionality (hence very many fitness evaluations will typically be needed) or both. In such circumstances, there is a dire need to tune various features of the algorithm well so that performance and time savings are optimized. However, these are precisely the circumstances in which prior tuning is very costly in time and resources. There is hence a need for methods which enable fast prior tuning in such cases. We describe a candidate technique for this purpose, in which we model a landscape as a finite state machine, inferred from preliminary sampling runs. In prior algorithm-tuning trials, we can replace the 'real' landscape with the model, enabling extremely fast tuning, saving far more time than was required to infer the model. Preliminary results indicate much promise, though much work needs to be done to establish various aspects of the conditions under which it can be most beneficially used. A main limitation of the method as described here is a restriction to mutation-only algorithms, but there are various ways to address this and other limitations.
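The tuning idea can be caricatured in a few lines: sample the expensive landscape once, then tune a mutation-only (1+1) EA entirely against the resulting lookup model, whose states are genotypes and whose transitions are mutations. The OneMax-style fitness, genotype length and parameter values below are invented for illustration and are not from the paper.

```python
# Toy sketch: tune a mutation-only (1+1) EA against a cheap surrogate
# of an "expensive" fitness landscape, built from one sampling run.
import random

N = 8  # genotype length (kept tiny so the sampled model covers most states)

def expensive_fitness(bits):
    # stands in for a costly evaluation, e.g. a long simulation
    return sum(bits)

def sample_landscape(n_samples=1000, rng=None):
    rng = rng or random.Random(0)
    model = {}
    for _ in range(n_samples):
        g = tuple(rng.randint(0, 1) for _ in range(N))
        model[g] = expensive_fitness(g)   # one "real" evaluation per sample
    return model

def surrogate_fitness(model, g):
    return model.get(tuple(g), -1)        # unseen states: pessimistic default

def one_plus_one_ea(fit, p_mut, steps=2000, rng=None):
    rng = rng or random.Random(1)
    x = [rng.randint(0, 1) for _ in range(N)]
    fx = fit(x)
    for _ in range(steps):
        y = [b ^ (rng.random() < p_mut) for b in x]  # bit-flip mutation
        fy = fit(y)
        if fy >= fx:
            x, fx = y, fy
    return fx

model = sample_landscape()
# tune the mutation rate against the cheap surrogate only
rates = [0.01, 1.0 / N, 0.3]
best_rate = max(rates, key=lambda p: one_plus_one_ea(
    lambda g: surrogate_fitness(model, g), p))
```

Every tuning trial here touches only the lookup model, so arbitrarily many parameter settings can be compared for the one-off cost of the sampling run.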
Abstract:
This article discusses approaches to the interpretation and analysis of an event that is poised between reality and performance. It focuses upon a real event witnessed by the author while driving out of Los Angeles, USA. A body hanging on a rope from a bridge some 25 to 30 feet above the freeway held up the traffic. The status of the body was unclear. Was it the corpse of a dead human being or a stuffed dummy, a simulation of a death? Was it a tragic accident or suicide, or was it a stunt, a protest or a performance? Whether a real body or not, it was an event: it drew an audience, it took place in a defined public space bound by time and it disrupted everyday normality and the familiar. The article debates how approaches to performance can engage with a shocking event, such as the Hanging Man, and the frameworks of interpretation that can be brought to bear on it. The analysis takes account of the function of memory in reconstructing the event, and the paradigms of cultural knowledge that offered themselves as parallels, comparators or distinctions against which the experience could be measured, such as the incidents of self-immolation related to demonstrations against the Vietnam War, the protest by the Irish Hunger Strikers and the visual impact of Antony Gormley’s 2007 work, 'Event Horizon'. Theoretical frameworks deriving from analytical approaches to performance, media representation and ethical dilemmas are evaluated as means to assimilate an indeterminate and challenging event, and the notion of what an ‘event’ may be is itself addressed.
Abstract:
Perceptual multimedia quality is of paramount importance to the continued take-up and proliferation of multimedia applications: users will not use and pay for applications if they are perceived to be of low quality. Whilst traditionally distributed multimedia quality has been characterised by Quality of Service (QoS) parameters, these neglect the user perspective of the issue of quality. In order to redress this shortcoming, we characterise the user multimedia perspective using the Quality of Perception (QoP) metric, which encompasses not only a user’s satisfaction with the quality of a multimedia presentation, but also his/her ability to analyse, synthesise and assimilate the informational content of multimedia. In recognition of the fact that monitoring eye movements offers insights into visual perception, as well as the associated attention mechanisms and cognitive processes, this paper reports on the results of a study investigating the impact of differing multimedia presentation frame rates on user QoP and eye path data. Our results show that provision of higher frame rates, usually assumed to provide better multimedia presentation quality, does not significantly impact upon the median coordinate value of eye path data. Moreover, higher frame rates do not significantly increase the level of participant information assimilation, although they do significantly improve overall user enjoyment and quality perception of the multimedia content being shown.
Abstract:
Aims: Quinolone antibiotics are the agents of choice for treating systemic Salmonella infections. Resistance to quinolones is usually mediated by mutations in the DNA gyrase gene gyrA. Here we report the evaluation of standard HPLC equipment for the detection of mutations (single nucleotide polymorphisms; SNPs) in gyrA, gyrB, parC and parE by denaturing high performance liquid chromatography (DHPLC). Methods: A panel of Salmonella strains was assembled which comprised those with known different mutations in gyrA (n = 8) and fluoroquinolone-susceptible and -resistant strains (n = 50) that had not been tested for mutations in gyrA. Additionally, antibiotic-susceptible strains of serotypes other than Salmonella enterica serovar Typhimurium strains were examined for serotype-specific mutations in gyrB (n = 4), parC (n = 6) and parE (n = 1). Wild-type (WT) control DNA was prepared from Salmonella Typhimurium NCTC 74. The DNA of respective strains was amplified by PCR using Optimase® proofreading DNA polymerase. Duplex DNA samples were analysed using an Agilent A1100 HPLC system with a Varian Helix™ DNA column. Sequencing was used to validate mutations detected by DHPLC in the strains with unknown mutations. Results: Using this HPLC system, mutations in gyrA, gyrB, parC and parE were readily detected by comparison with control chromatograms. Sequencing confirmed the gyrA predicted mutations as detected by DHPLC in the unknown strains and also confirmed serotype-associated sequence changes in non-Typhimurium serotypes. Conclusions: The results demonstrated that a non-specialist standard HPLC machine fitted with a generally available column can be used to detect SNPs in gyrA, gyrB, parC and parE genes by DHPLC. Wider applications should be possible.
Abstract:
Past studies have revealed that encountering negative events interferes with cognitive processing of subsequent stimuli. The present study investigates whether negative events affect semantic and perceptual processing differently. Presentation of negative pictures produced slower reaction times than neutral or positive pictures in tasks that require semantic processing, such as natural or man-made judgments about drawings of objects, commonness judgments about objects, and categorical judgments about pairs of words. In contrast, negative picture presentation did not slow down judgments in subsequent perceptual processing (e.g., color judgments about words, size judgments about objects). The subjective arousal level of negative pictures did not modulate the interference effects on semantic or perceptual processing. These findings indicate that encountering negative emotional events interferes with semantic processing of subsequent stimuli more strongly than perceptual processing, and that not all types of subsequent cognitive processing are impaired by negative events.
Abstract:
Purpose – There is a wealth of studies which suggest that managers' positive perceptions/expectations can considerably influence organisational performance; unfortunately, little empirical evidence has been obtained from development studies. This research aims to focus on the perceptual and behavioural trait differences of successful and unsuccessful aid workers, and their relationship with organisational performance. Design/methodology/approach – Through a web-based survey, 244 valid responses were obtained from Japan International Cooperation Agency (JICA) aid managers worldwide. Five perception-related factors were extracted and used for cluster analysis to group the respondents. Each cluster's perception/behaviour-related factors and organisational performance variables were compared by ANOVA. Findings – It was discovered that the Japanese managers' positive perception/expectation about work and their local colleagues was related to higher organisational performance and, conversely, that negative perception on their part was generally associated with negative behaviour and lower organisational performance. Moreover, in a development context, lower work-related stress and feelings of resignation toward work were strongly associated with the acceptability of the cross-cultural work environment. Practical implications – The differences in perceptual tendencies suggest caution, since these findings may mainly apply to Japanese aid managers. However, as human nature is universal, positive perception and behaviour would bring out positive output in most organisations. Originality/value – This study extends the contextualised “Pygmalion effect” and clarifies the influence of perception/expectation on counterpart behaviour and organisational performance in a development aid context, where people-related issues have often been ignored. This first-time research provides empirical data on the significant role of positive perception on the incumbent role holder.
Abstract:
Performance modelling is a useful tool in the lifecycle of high performance scientific software, such as weather and climate models, especially as a means of ensuring efficient use of available computing resources. In particular, sufficiently accurate performance prediction could reduce the effort and experimental computer time required when porting and optimising a climate model to a new machine. In this paper, traditional techniques are used to predict the computation time of a simple shallow water model which is illustrative of the computation (and communication) involved in climate models. These predictions are compared with real execution data gathered on AMD Opteron-based systems, including several phases of the U.K. academic community HPC resource, HECToR. Some success is achieved in relating source code to achieved performance for the K10 series of Opterons, but the method is found to be inadequate for the next-generation Interlagos processor. This experience leads to the investigation of a data-driven application benchmarking approach to performance modelling. Results for an early version of the approach are presented using the shallow water model as an example.
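The data-driven flavour of performance modelling mentioned above can be sketched as a simple least-squares fit of a cost model to measured runtimes, which is then extrapolated to an untried problem size. The grid sizes and timings below are synthetic placeholders, not HECToR measurements.

```python
# A minimal data-driven performance model: rather than deriving
# computation time from source-code analysis, fit t(points) ~ a*points + b
# to measured benchmark runtimes and extrapolate.
import numpy as np

# (grid edge length, measured seconds) from hypothetical benchmark runs
sizes = np.array([64, 128, 256, 512], dtype=float)
grid_pts = sizes**2                      # a 2-D shallow water grid
times = np.array([0.011, 0.042, 0.17, 0.66])

# least-squares fit of t = a * points + b
A = np.column_stack([grid_pts, np.ones_like(grid_pts)])
(a, b), *_ = np.linalg.lstsq(A, times, rcond=None)

predict = lambda n: a * n**2 + b         # n is the grid edge length
print(f"predicted time for a 1024x1024 grid: {predict(1024):.2f} s")
```

A real benchmarking approach would use richer terms (communication volume, cache effects) and measurements from the target machine, but the fit-then-extrapolate structure is the same.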
Abstract:
This paper presents a novel approach to the automatic classification of very large data sets composed of terahertz pulse transient signals, highlighting their potential use in biochemical, biomedical, pharmaceutical and security applications. Two different types of THz spectra are considered in the classification process. Firstly, a binary classification study of poly-A and poly-C ribonucleic acid samples is performed. This is then contrasted with a difficult multi-class classification problem of spectra from six different powder samples which, although fairly indistinguishable in the optical spectrum, possess a few discernible spectral features in the terahertz part of the spectrum. Classification is performed using a complex-valued extreme learning machine algorithm that takes into account features in both the amplitude as well as the phase of the recorded spectra. Classification speed and accuracy are contrasted with those achieved using a support vector machine classifier. The study systematically compares the classifier performance achieved after adopting different Gaussian kernels when separating amplitude and phase signatures. The two signatures are presented as feature vectors for both training and testing purposes. The study confirms the utility of complex-valued extreme learning machine algorithms for classification of the very large data sets generated with current terahertz imaging spectrometers. The classifier can take into consideration heterogeneous layers within an object as would be required within a tomographic setting and is sufficiently robust to detect patterns hidden inside noisy terahertz data sets. The proposed study opens up the opportunity for the establishment of complex-valued extreme learning machine algorithms as new chemometric tools that will assist the wider proliferation of terahertz sensing technology for chemical sensing, quality control, security screening and clinical diagnosis. 
Furthermore, the proposed algorithm should also be very useful in other applications requiring the classification of very large datasets.
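The training recipe behind an extreme learning machine is short enough to sketch: a random, untrained hidden layer followed by output weights solved in closed form with a pseudoinverse. This real-valued version only illustrates the recipe; the paper's classifier uses complex-valued weights and terahertz spectra, not the toy two-cluster data below.

```python
# Real-valued extreme learning machine sketch: random hidden layer,
# output layer solved in one shot via the Moore-Penrose pseudoinverse.
import numpy as np

rng = np.random.default_rng(0)

def elm_train(X, y, n_hidden=50):
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights
    b = rng.normal(size=n_hidden)                 # random biases
    H = np.tanh(X @ W + b)                        # hidden activations
    beta = np.linalg.pinv(H) @ y                  # closed-form output weights
    return W, b, beta

def elm_predict(model, X):
    W, b, beta = model
    return np.tanh(X @ W + b) @ beta

# two noisy 2-D clusters stand in for the two RNA spectra classes
X0 = rng.normal(loc=-1.0, size=(100, 2))
X1 = rng.normal(loc=+1.0, size=(100, 2))
X = np.vstack([X0, X1])
y = np.array([-1.0] * 100 + [1.0] * 100)

model = elm_train(X, y)
acc = np.mean(np.sign(elm_predict(model, X)) == np.sign(y))
```

Because only the output weights are learned, and in closed form, training is very fast, which is the main practical appeal over iteratively trained classifiers for large data sets.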
Abstract:
Research evaluating perceptual responses to music has identified many structural features as correlates that might be incorporated in computer music systems for affectively charged algorithmic composition and/or expressive music performance. In order to investigate the possible integration of isolated musical features into such a system, a discrete feature known to correlate with emotional responses – rhythmic density – was selected from a literature review and incorporated into a prototype system. This system produces variation in rhythmic density via a transformative process. A stimulus set created using this system was then subjected to a perceptual evaluation. Pairwise comparisons were used to scale differences between 48 stimuli. Listener responses were analysed with multidimensional scaling (MDS). The two-dimensional solution was then rotated to place the stimuli with the largest range of variation across the horizontal plane. Stimuli with variation in rhythmic density were placed further from the source material than stimuli that were generated by random permutation. This, combined with the striking similarity between the MDS scaling and that of the two-dimensional emotional model used by some affective algorithmic composition systems, suggests that isolated musical feature manipulation can now be used to parametrically control affectively charged automated composition in a larger system.
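The scaling step described above can be illustrated with classical MDS, which embeds items in a low-dimensional space from a matrix of pairwise dissimilarities. The four-point example is illustrative only, and the study's exact MDS variant is not specified here.

```python
# Classical MDS: double-centre the squared-distance matrix and take the
# top eigenvectors of the resulting Gram matrix.
import numpy as np

def classical_mds(D, dims=2):
    """Embed points from a symmetric pairwise-distance matrix D."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n       # centring matrix
    B = -0.5 * J @ (D**2) @ J                 # double-centred Gram matrix
    vals, vecs = np.linalg.eigh(B)
    idx = np.argsort(vals)[::-1][:dims]       # largest eigenvalues first
    return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0))

# distances among the four corners of a unit square recover a 2-D layout
pts = np.array([[0, 0], [1, 0], [0, 1], [1, 1]], dtype=float)
D = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
emb = classical_mds(D)
```

With exact Euclidean distances the embedding reproduces the original configuration up to rotation and reflection; with perceptual dissimilarity judgements it yields the kind of two-dimensional stimulus map the abstract describes.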
Abstract:
Subdermal magnetic implants originated as an art form in the world of body modification. To date, an in-depth scientific analysis of the benefits of this implant has yet to be established. This research explores the concept of sensory extension of the tactile sense utilising this form of implantation. This relatively simple procedure enables the tactile sense to respond to static and alternating magnetic fields. This is not to say that the underlying biology of the system has changed; i.e. the concept does not increase our tactile frequency response range or sensitivity to pressure, but it does invoke a perceptual response to a stimulus that is not innately available to humans. Within this research two social surveys have been conducted in order to ascertain, first, the social acceptance of the general notion of human enhancement and, second, the perceptual experiences of individuals with the magnetic implants themselves. In terms of acceptance of the notion of sensory improvement (via implantation), ~39% of the general population questioned responded positively, with a further ~25% of the respondents giving the indecisive response. Thus, with careful dissemination, a large proportion of individuals may adopt technology such as this if it were to become available to consumers. Interestingly, ~60% of the respondents to the magnetic implant survey underwent the implantation for magnetic vision purposes. The main contribution of this research, however, comes from a series of psychophysical tests in which 7 subjects with subdermal magnetic implants were compared with 7 subjects who had similar magnets superficially attached to their dermis. The experimentation examined multiple psychometric thresholds of the subjects, including intensity, frequency and temporal thresholds. 
Whilst relatively simple, the experimental setup for the perceptual experimentation conducted was novel in that custom hardware and protocols were created in order to determine the subjective thresholds of the individuals. The overall purpose of this research is to utilise this concept in high stress scenarios, such as driving or piloting, whereby alerts and warnings could be relayed to an operator without intruding upon their other (typically overloaded) exterior senses (i.e. the auditory and visual senses). Hence each of the thresholding experiments was designed with the intention of utilising the results in the design of signals for information transfer. The findings from the study show that the implanted group of subjects significantly outperformed the superficial group in the absolute intensity threshold experiment, i.e. the implanted group required significantly less force than the superficial group in order to perceive the stimulus. The results for the frequency difference threshold showed no significant difference between the two groups tested. Interestingly, however, at low frequencies (i.e. 20 and 50 Hz) the subjects' ability to discriminate frequencies significantly increased with more complex waveforms (i.e. square and sawtooth) when compared against the typically used sine wave. Furthermore, a novel protocol for establishing the temporal gap detection threshold during a temporal numerosity study is presented in this thesis. This experiment measured the subjects’ capability to correctly determine the number of concatenated signals presented to them whilst the time between the signals, referred to as pulses, tended to zero. A significant finding was that altering the length, frequency and number of cycles of the pulses changed the time between pulses required for correct recognition. This finding will ultimately aid in the design of the tactile alerts for this method of information transfer. 
Preliminary development work for the use of this method of input to the body, in an automotive scenario, is also presented within this thesis in the form of a driving simulation. The overall goal of this is to present warning alerts to a driver, such as imminent rear-end collisions or excessive speed, in order to prevent incidents and penalties from occurring. Discussion on the broader utility of this implant has been presented, reflecting on its potential use as a basis for vibrotactile, and sensory substitution, devices. This discussion extends to postulations on its use as a human-machine interface, as well as how a similar implant could be used within the ear as a hearing aid device.
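Intensity-threshold measurements like those compared between the implanted and superficial groups are typically obtained with an adaptive staircase. The one-up, two-down sketch below, with a simulated noisy observer, is a generic illustration rather than the thesis's custom hardware protocol; the starting level, step size and noise figures are invented.

```python
# One-up, two-down adaptive staircase: two consecutive detections lower
# the stimulus level, one miss raises it, and the threshold estimate is
# the mean level at the last few reversals (converges near the 70.7%
# detection point).
import random

def run_staircase(true_threshold, start=10.0, step=0.5, trials=200, rng=None):
    rng = rng or random.Random(0)
    level, correct_streak, reversals = start, 0, []
    last_direction = 0
    for _ in range(trials):
        # simulated observer: detects when the level exceeds a noisy threshold
        detected = level > true_threshold + rng.gauss(0, 0.2)
        if detected:
            correct_streak += 1
            if correct_streak == 2:        # two down: make it harder
                correct_streak = 0
                direction = -1
                level -= step
            else:
                continue                   # need a second detection first
        else:
            correct_streak = 0
            direction = +1                 # one up: make it easier
            level += step
        if last_direction and direction != last_direction:
            reversals.append(level)        # record each change of direction
        last_direction = direction
    return sum(reversals[-6:]) / len(reversals[-6:])

est = run_staircase(true_threshold=4.0)
```

Running the same procedure on implanted and superficial groups and comparing the estimates mirrors the absolute intensity threshold comparison reported in the thesis.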