67 results for Informatics Engineering - Human Computer Interaction
Examining the relationships between Holocene climate change, hydrology, and human society in Ireland
Abstract:
This thesis explores human-environment interactions during the Mid-Late Holocene in raised bogs in central Ireland. The raised bogs of central Ireland are widely recognised for their considerable palaeoenvironmental and archaeological resources: research over the past few decades has established the potential for such sites to preserve sensitive records of Holocene climatic variability expressed as changes in bog surface wetness (BSW), while archaeological investigations over the past century have uncovered hundreds of peatland archaeological features dating from the Neolithic through to the Post-Medieval period, including wooden trackways, platforms, and deposits of high-status metalwork. Previous studies have attempted to explore the relationship between records of past environmental change and the occurrence of peatland archaeological sites, reaching varying conclusions. More recently, environmentally-deterministic models of human-environment interaction in Irish raised bogs at the regional scale have been explicitly tested, leading to the conclusion that there is no relationship between BSW and past human activity. These relationships are examined in more detail on a site-by-site basis in this thesis. To that end, testate amoebae-derived BSW records were produced from nine milled former raised bogs in central Ireland with known and dated archaeological records. Relationships between BSW records and environmental conditions within the study area were explored both through the development of a new central Ireland testate amoebae transfer function and through comparisons between recent BSW records and instrumental weather data. Compilation of BSW records from the nine fossil study sites shows evidence both for climate forcing, particularly during 3200-2400 cal BP, and for considerable inter-site variability. Considerable inter-site variability was also evident in the archaeological records of the same sites. Whilst comparisons between BSW and archaeological records do not show a consistent linear relationship, examination of records on a site-by-site basis was shown to reveal interpretatively important contingent relationships. It is concluded, therefore, that future research on human-environment interactions should focus on individual sites and should utilise theoretical approaches from the humanities in order to avoid the twin pitfalls of masking important local patterns of change and of environmental determinism.
Abstract:
The challenge of moving past the classic Window Icons Menus Pointer (WIMP) interface, i.e. by turning it ‘3D’, has resulted in much research and development. To evaluate the impact of 3D on the ‘finding a target picture in a folder’ task, we built a 3D WIMP interface that allowed the systematic manipulation of visual depth, visual aids, and the semantic category distribution of targets versus non-targets, as well as the detailed measurement of lower-level stimulus features. Across two separate experiments, a large-sample web-based experiment to understand associations and a controlled lab experiment using eye tracking to understand user focus, we investigated how visual depth, the use of visual aids, the use of semantic categories, and lower-level stimulus features (i.e. contrast, colour and luminance) impact how successfully participants are able to search for, and detect, the target image. Moreover, in the lab-based experiment, we captured pupillometry measurements to allow consideration of the influence of increasing cognitive load as a result of either an increasing number of items on the screen or the inclusion of visual depth. Our findings showed that increasing the visible layers of depth, and the inclusion of converging lines, did not impact target detection times, errors, or failure rates. Low-level features, including colour, luminance, and number of edges, did correlate with differences in target detection times, errors, and failure rates. Our results also revealed that semantic sorting algorithms significantly decreased target detection times. Increased semantic contrast between a target and its neighbours correlated with an increase in detection errors. Finally, pupillometric data did not provide evidence of any correlation between the number of visible layers of depth and pupil size; however, using structural equation modelling, we demonstrated that cognitive load does influence detection failure rates when there is luminance contrast between the target and its surrounding neighbours. Results suggest that WIMP interaction designers should consider stimulus-driven factors, which were shown to influence the efficiency with which a target icon can be found in a 3D WIMP interface.
Abstract:
On August 29-31, 2004, 84 academic and industry scientists from 16 countries gathered in Copper Mountain, Colorado, USA to discuss certain issues at the forefront of the science of probiotics and prebiotics. The format for this invitation-only meeting included six featured lectures: engineering human vaginal lactobacilli to express HIV inhibitory molecules (Peter Lee, Stanford University), programming the gut for health (Thaddeus Stappenbeck, Washington University School of Medicine), immune modulation by intestinal helminths (Joel Weinstock, University of Iowa Hospitals and Clinics), hygiene as a cause of autoimmune disorders (G. A. Rook, University College London), prebiotics and bone health (Connie Weaver, Purdue University), and prebiotics and colorectal cancer risk (Ian Rowland, Northern Ireland Centre for Food and Health). In addition, all participants were included in one of eight discussion groups on the topics of engineered probiotics, host-commensal bacteria communication, 'omics' technologies, hygiene and immune regulation, biomarkers for healthy people, prebiotic and probiotic applications to companion animals, development of a probiotic dossier, and the physiological relevance of prebiotic activity. Brief conclusions from these discussion groups are summarized in this paper.
Abstract:
The aim of this study was to empirically evaluate an embodied conversational agent called GRETA in an effort to answer two main questions: (1) What are the benefits (and costs) of presenting information via an animated agent, with certain characteristics, in a 'persuasion' task, compared to other forms of display? (2) How important is it that emotional expressions are added in a way that is consistent with the content of the message, in animated agents? To address these questions, a positively framed healthy eating message was created which was variously presented via GRETA, a matched human actor, GRETA's voice only (no face) or as text only. Furthermore, versions of GRETA were created which displayed additional emotional facial expressions in a way that was either consistent or inconsistent with the content of the message. Overall, it was found that although GRETA received significantly higher ratings for helpfulness and likability, presenting the message via GRETA led to the poorest memory performance among users. Importantly, however, when GRETA's additional emotional expressions were consistent with the content of the verbal message, the negative effect on memory performance disappeared. Overall, the findings point to the importance of achieving consistency in animated agents.
Abstract:
The convergence speed of the standard Least Mean Square (LMS) adaptive array may be degraded in mobile communication environments. Various conventional variable step-size LMS algorithms have been proposed to enhance the convergence speed while maintaining a low steady-state error. In this paper, a new variable step-size LMS algorithm using the accumulated instantaneous error concept is proposed. In the proposed algorithm, the accumulated instantaneous error is used to vary the step-size parameter of the standard LMS. Simulation results show that the proposed algorithm is simpler and yields better performance than conventional variable step-size LMS algorithms.
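To illustrate the general idea, the following minimal sketch implements a generic variable step-size LMS filter in which a leaky accumulation of the instantaneous error power drives the step size. The update rule, the constants, and the name vss_lms are illustrative assumptions for exposition only, not the algorithm proposed in the paper.

```python
import numpy as np

def vss_lms(x, d, n_taps=8, mu0=0.01, mu_min=1e-4, mu_max=0.1,
            alpha=0.9, gamma=1e-3):
    """Variable step-size LMS sketch: a leaky accumulation of the
    instantaneous error power drives the step size (illustrative only)."""
    w = np.zeros(n_taps)               # adaptive filter weights
    acc = 0.0                          # accumulated instantaneous error power
    y = np.zeros(len(x))               # filter output
    e = np.zeros(len(x))               # instantaneous error
    for n in range(n_taps, len(x)):
        u = x[n - n_taps:n][::-1]      # most recent n_taps input samples
        y[n] = w @ u
        e[n] = d[n] - y[n]
        acc = alpha * acc + (1 - alpha) * e[n] ** 2      # leaky accumulation
        mu = np.clip(mu0 + gamma * acc, mu_min, mu_max)  # error-driven step size
        w += mu * e[n] * u             # standard LMS weight update
    return w, y, e
```

In a scheme of this kind the step size grows while the accumulated error is large, which speeds up initial convergence, and shrinks as the filter approaches steady state, which keeps the steady-state misadjustment low.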
Abstract:
The 'Uncanny Valley' was conceived in 1970 by Prof Masahiro Mori and describes a possible relationship between an object's appearance or motion and how people perceive the object. Initially, the hypothesis was applied without empirical validation. Modern technology has enabled initial investigations, summarised here, which conclude that further work is required. A good design guideline for humanoid robots is desirable if humanoid robots are to assist an increasingly elderly population, but is not yet possible due to technological constraints. Prosthetics is considered a good resource, as the user interaction is comparable to the anticipated level of human-robot interaction and there is a wide range of existing devices.
Abstract:
For people with motion impairments, access to and independent control of a computer can be essential. Symptoms such as tremor and spasm, however, can make the typical keyboard and mouse arrangement for computer interaction difficult or even impossible to use. This paper describes three approaches to improving computer input effectiveness for people with motion impairments. The three approaches are: (1) to increase the number of interaction channels, (2) to enhance commonly existing interaction channels, and (3) to make more effective use of all the available information in an existing input channel. Experiments in multimodal input, haptic feedback, user modelling, and cursor control are discussed in the context of the three approaches. A haptically enhanced keyboard emulator with perceptive capability is proposed, combining the approaches in a way that improves computer access for motion-impaired users.
Abstract:
In recent years, the potential role of planned, internal resettlement as a climate change adaptation measure has been highlighted by national governments and the international policy community. However, in many developing countries, resettlement is a deeply political process that often results in an unequal distribution of costs and benefits amongst relocated persons. This paper examines these tensions in Mozambique, drawing on a case study of flood-affected communities in the Lower Zambezi River valley. It takes a political ecology approach – focusing on discourses of human-environment interaction, as well as the power relationships that are supported by such discourses – to show how a dominant narrative of climate change-induced hazards for small-scale farmers is contributing to their involuntary resettlement to higher-altitude, less fertile areas of land. These forced relocations are buttressed by a series of wider economic and political interests in the Lower Zambezi River region, such as dam construction for hydroelectric power generation and the extension of control over rural populations, from which resettled people derive little direct benefit. Rather than engaging with these challenging issues, most international donors present in the country accept the ‘inevitability’ of extreme weather impacts and view resettlement as an unfortunate and, in some cases, necessary step to increase people’s ‘resilience’, thus rationalising the top-down imposition of unpopular social policies. The findings add weight to the argument that a depoliticised interpretation of climate change can deflect attention away from underlying drivers of vulnerability and poverty, as well as obscure the interests of governments that are intent on reordering poor and vulnerable populations.
Abstract:
We extend all elementary functions from the real to the transreal domain so that they are defined on division by zero. Our method applies to a much wider class of functions and so may be of general interest.
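As background for readers unfamiliar with transreal arithmetic, the sketch below shows how total division behaves on the transreal line. Nullity is represented by IEEE NaN purely for illustration, and the helper name transreal_div is an assumption for exposition; neither the encoding nor the code is taken from the paper.

```python
import math

# Transreal constants; NaN stands in for nullity here, although true
# transreal nullity has its own equality semantics.
INF, NINF, NULLITY = math.inf, -math.inf, math.nan

def transreal_div(a, b):
    """Total division on the transreal line: never raises on division by zero
    (a minimal sketch of the standard transreal rules, not the paper's text)."""
    if math.isnan(a) or math.isnan(b):
        return NULLITY                 # nullity absorbs every operation
    if b == 0:
        if a > 0:
            return INF                 # positive / 0 = +infinity
        if a < 0:
            return NINF                # negative / 0 = -infinity
        return NULLITY                 # 0 / 0 = nullity
    if math.isinf(a) and math.isinf(b):
        return NULLITY                 # infinity / infinity = nullity
    return a / b                       # ordinary real division otherwise

print(transreal_div(1.0, 0.0))   # inf
print(transreal_div(-2.0, 0.0))  # -inf
print(transreal_div(0.0, 0.0))   # nan (standing in for nullity)
```

On this totalised number line, composing elementary functions with such operations never produces an undefined result, which is the property the extension described in the abstract relies on.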
Abstract:
Transreal numbers provide a total semantics containing classical truth values, dialetheic, fuzzy and gap values. A paraconsistent Sheffer stroke generalises all classical logics to a paraconsistent form. We introduce logical spaces of all possible worlds and all propositions. We operate on a proposition, in all possible worlds, at the same time. We define logical transformations, and possibility and necessity relations, in proposition space, and give a criterion to determine whether a proposition is classical. We show that proofs based on the conditional infer gaps only from gaps, and that negative and positive infinity operate as bottom and top values.
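As a concrete illustration of how a Sheffer stroke can be totalised over a richer set of truth values, the sketch below encodes truth values as transreal numbers, with negative and positive infinity as the bottom and top values and NaN standing in for the gap. The choice of minimum for conjunction and arithmetic negation for negation is a plausible encoding assumed only for exposition; it is not necessarily the definition used in the paper.

```python
import math

# Truth values as transreal numbers: -inf = classical false (bottom),
# +inf = classical true (top), finite values = fuzzy/dialetheic degrees,
# NaN stands in for the gap value.  Illustrative encoding only.
BOTTOM, TOP, GAP = -math.inf, math.inf, math.nan

def t_not(a):
    return GAP if math.isnan(a) else -a    # negation as arithmetic negation

def t_and(a, b):
    if math.isnan(a) or math.isnan(b):
        return GAP                          # a gap propagates
    return min(a, b)                        # conjunction as minimum

def sheffer(a, b):
    """NAND over these truth values; restricted to {BOTTOM, TOP} it reduces
    to the classical Sheffer stroke, from which every classical connective
    can be rebuilt (e.g. NOT p = p | p)."""
    return t_not(t_and(a, b))

assert sheffer(TOP, TOP) == BOTTOM          # classical corner cases hold
assert sheffer(TOP, BOTTOM) == TOP
assert math.isnan(sheffer(TOP, GAP))        # a gap arises only from a gap
```

Under this encoding the classical two-valued behaviour is preserved at the extremes, while intermediate and gap values can carry the fuzzy, dialetheic and gap semantics mentioned in the abstract.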
Abstract:
Virtual learning environments (VLEs) would appear to be particularly effective in computer-supported collaborative work (CSCW) for active learning. Most research studies looking at computer-supported collaborative design have focused on either synchronous or asynchronous modes of communication, but near-synchronous working has received relatively little attention. Yet it could be argued that near-synchronous communication encourages creative, rhetorical and critical exchanges of ideas, building on each other’s contributions. Furthermore, although many researchers have carried out studies on collaborative design protocol, argumentation and constructive interaction, little is known about the interaction between drawing and dialogue in near-synchronous collaborative design. The paper reports the first stage of an investigation into the requirements for the design and development of interactive systems to support the learning of collaborative design activities. The aim of the study is to understand collaborative design processes while sketching on a shared whiteboard with audio conferencing. Empirical data on design processes have been obtained from observation of seven sessions with groups of design students solving an interior space-planning problem of a lounge-diner in a virtual learning environment, Lyceum, an in-house application developed by the Open University to support its students in collaborative learning.
Abstract:
The nicotinic Acetylcholine Receptor (nAChR) is the major class of neurotransmitter receptors that is involved in many neurodegenerative conditions such as schizophrenia, Alzheimer's and Parkinson's diseases. The N-terminal region or Ligand Binding Domain (LBD) of nAChR is located at pre- and post-synaptic sites in the nervous system, where it mediates synaptic transmission. nAChR acts as the drug target for agonist and competitive antagonist molecules that modulate signal transmission at the nerve terminals. Using the Acetylcholine Binding Protein (AChBP) from Lymnaea stagnalis as the structural template, a homology modeling approach was carried out to build a three-dimensional model of the N-terminal region of human alpha(7)nAChR. This theoretical model is an assembly of five alpha(7) subunits with five-fold axis symmetry, constituting a channel, with the binding pocket present at the interface region of the subunits. alpha-neurotoxin is a potent nAChR competitive antagonist that readily blocks the channel, resulting in paralysis. The molecular interaction of alpha-Bungarotoxin, a long-chain alpha-neurotoxin from Bungarus multicinctus, with human alpha(7)nAChR was studied. Agonists such as acetylcholine and nicotine, which are involved in a diverse array of biological activities, such as enhancement of cognitive performance, were also docked with the theoretical model of human alpha(7)nAChR. These docked complexes were analyzed further to identify the crucial residues involved in interaction. These results provide details of the interaction of agonists and competitive antagonists with the three-dimensional model of the N-terminal region of human alpha(7)nAChR and thereby point towards the design of novel lead compounds.
Abstract:
Conventional supported metal catalysts are metal nanoparticles deposited on high surface area oxide supports with a poorly defined metal-support interface. Typically, the traditionally prepared Pt/ceria catalyzes both methanation (H2/CO to CH4) and water-gas shift (CO/H2O to CO2/H2) reactions. By using simple nanochemistry techniques, we show for the first time that Pt or PtAu metal can be created inside each CeO2 particle with tailored dimensions. The encapsulated metal is shown to interact with the thin CeO2 overlayer in each single particle in an optimum geometry to create a unique interface, giving high activity and excellent selectivity for the water-gas shift reaction, but is totally inert for methanation. Thus, this work clearly demonstrates the significance of nanoengineering of a single catalyst particle by a bottom-up construction approach in modern catalyst design which could enable exploitation of catalyst site differentiation, leading to new catalytic properties.