803 results for Embodied
Abstract:
Although many examples exist for shared neural representations of self and other, it is unknown how such shared representations interact with the rest of the brain. Furthermore, do high-level inference-based shared mentalizing representations interact with lower level embodied/simulation-based shared representations? We used functional neuroimaging (fMRI) and a functional connectivity approach to assess these questions during high-level inference-based mentalizing. Shared mentalizing representations in ventromedial prefrontal cortex, posterior cingulate/precuneus, and temporo-parietal junction (TPJ) all exhibited identical functional connectivity patterns during mentalizing of both self and other. Connectivity patterns were distributed across low-level embodied neural systems such as the frontal operculum/ventral premotor cortex, the anterior insula, the primary sensorimotor cortex, and the presupplementary motor area. These results demonstrate that identical neural circuits are implementing processes involved in mentalizing of both self and other and that the nature of such processes may be the integration of low-level embodied processes within higher level inference-based mentalizing.
Abstract:
Planning is a vital element of project management, but it is still not recognized as a process variable. Its objective should be to outperform the initially defined processes, and to foresee and overcome possible undesirable events. Detailed task-level master planning is unrealistic since one cannot accurately predict all the requirements and obstacles before work has even started. The process planning methodology (PPM) has thus been developed in order to overcome common problems of overwhelming project complexity. The essential elements of the PPM are the process planning group (PPG), including a control team that dynamically links the production/site and management, and the planning algorithm embodied within two continuous-improvement loops. The methodology was tested on a factory project in Slovenia and in four successive projects of a similar nature. In addition to a number of improvement ideas and enhanced communication, the applied PPM resulted in 32% higher total productivity and 6% total savings, and created a synergistic project environment.
Abstract:
The aim of this study was to empirically evaluate an embodied conversational agent called GRETA in an effort to answer two main questions: (1) What are the benefits (and costs) of presenting information via an animated agent, with certain characteristics, in a 'persuasion' task, compared to other forms of display? (2) How important is it that emotional expressions are added in a way that is consistent with the content of the message, in animated agents? To address these questions, a positively framed healthy eating message was created which was variously presented via GRETA, a matched human actor, GRETA's voice only (no face) or as text only. Furthermore, versions of GRETA were created which displayed additional emotional facial expressions in a way that was either consistent or inconsistent with the content of the message. Overall, it was found that although GRETA received significantly higher ratings for helpfulness and likability, presenting the message via GRETA led to the poorest memory performance among users. Importantly, however, when GRETA's additional emotional expressions were consistent with the content of the verbal message, the negative effect on memory performance disappeared. Overall, the findings point to the importance of achieving consistency in animated agents.
Abstract:
In the forecasting of binary events, verification measures that are “equitable” were defined by Gandin and Murphy to satisfy two requirements: 1) they award all random forecasting systems, including those that always issue the same forecast, the same expected score (typically zero), and 2) they are expressible as the linear weighted sum of the elements of the contingency table, where the weights are independent of the entries in the table, apart from the base rate. The authors demonstrate that the widely used “equitable threat score” (ETS), as well as numerous others, satisfies neither of these requirements and only satisfies the first requirement in the limit of an infinite sample size. Such measures are referred to as “asymptotically equitable.” In the case of ETS, the expected score of a random forecasting system is always positive and only falls below 0.01 when the number of samples is greater than around 30. Two other asymptotically equitable measures are the odds ratio skill score and the symmetric extreme dependency score, which are more strongly inequitable than ETS, particularly for rare events; for example, when the base rate is 2% and the sample size is 1000, random but unbiased forecasting systems yield an expected score of around −0.5, reducing in magnitude to −0.01 or smaller only for sample sizes exceeding 25 000. This presents a problem since these nonlinear measures have other desirable properties, in particular being reliable indicators of skill for rare events (provided that the sample size is large enough). A potential way to reconcile these properties with equitability is to recognize that Gandin and Murphy’s two requirements are independent, and the second can be safely discarded without losing the key advantages of equitability that are embodied in the first. This enables inequitable and asymptotically equitable measures to be scaled to make them equitable, while retaining their nonlinearity and other properties such as being reliable indicators of skill for rare events. It also opens up the possibility of designing new equitable verification measures.
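To make the asymptotic-equitability point concrete, the sketch below computes the equitable threat score (Gilbert skill score) from a 2x2 contingency table and estimates, by Monte Carlo simulation, the expected score of an unbiased random forecasting system; the function names, base rate and sample sizes are illustrative assumptions, not the authors' code or data.

```python
import numpy as np

def ets(hits, false_alarms, misses, correct_negatives):
    """Equitable threat score (Gilbert skill score) from a 2x2 contingency table."""
    n = hits + false_alarms + misses + correct_negatives
    hits_random = (hits + false_alarms) * (hits + misses) / n  # hits expected by chance
    return (hits - hits_random) / (hits + false_alarms + misses - hits_random)

def expected_random_ets(base_rate, n_samples, n_trials=20_000, seed=0):
    """Monte Carlo estimate of the expected ETS of an unbiased random forecaster."""
    rng = np.random.default_rng(seed)
    scores = []
    for _ in range(n_trials):
        obs = rng.random(n_samples) < base_rate
        fcst = rng.random(n_samples) < base_rate  # random forecasts, independent of obs
        a = np.sum(fcst & obs)      # hits
        b = np.sum(fcst & ~obs)     # false alarms
        c = np.sum(~fcst & obs)     # misses
        a_random = (a + b) * (a + c) / n_samples
        denom = a + b + c - a_random
        if denom != 0:  # skip degenerate tables (e.g. no events and no forecasts)
            scores.append((a - a_random) / denom)
    return float(np.mean(scores))

# The expectation is positive for small samples and only approaches zero as the
# sample grows, which is why ETS is "asymptotically equitable" rather than equitable.
print(expected_random_ets(base_rate=0.1, n_samples=30))
print(expected_random_ets(base_rate=0.1, n_samples=1000))
```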
Abstract:
This paper assesses the potential for using building integrated photovoltaic (BIPV) roof shingles made from triple-junction amorphous silicon (3a-Si) for electrification and as a roofing material in tropical locations such as Accra, Ghana. A model roof was constructed using triple-junction amorphous (3a-Si) PV on one section and conventional roofing tiles on the other. The performance of the PV module and the tiles was measured over a range of ambient temperatures and solar irradiance levels. PVSyst (a PV design software package) was used to determine the most appropriate angle of tilt. It was observed that 3a-Si performs well in conditions such as those in Accra because it is insensitive to high temperatures. Building integration gives security benefits and reduces construction costs and embodied energy compared to freestanding PV systems. It also provides protection from ocean salt spray and works well even when shaded. However, compared to conventional roofing materials, 3a-Si would increase the indoor temperature by 1-2 °C depending on the surface area of the roof covered with the PV modules. The results presented in this research enhance the understanding of the varying factors involved in selecting an appropriate method of PV installation to offset the shortfalls of conventional roofing materials in Ghana.
Abstract:
A generic model of Exergy Assessment is proposed for the Environmental Impact of the Building Lifecycle, with a special focus on the natural environment. Three environmental impacts (energy consumption, resource consumption and pollutant discharge) have been analyzed with reference to energy-embodied exergy, resource chemical exergy and abatement exergy, respectively. The generic model of Exergy Assessment of the Environmental Impact of the Building Lifecycle thus formulated contains two sub-models, one from the aspect of building energy utilization and the other from building materials use. Combined with theories by ecologists such as Odum, the paper evaluates a building's environmental sustainability through its exergy footprint and environmental impacts. A case study from Chongqing, China, illustrates the application of this method. From the case study, it was found that energy consumption constitutes 70–80% of the total environmental impact over a 50-year building lifecycle, with the operation phase accounting for 80% of the total environmental impact, the building material production phase for 15% and the remaining phases for 5%.
Abstract:
Over the last decade, there has been an increasing body of work that explores whether sensory and motor information is a necessary part of semantic representation and processing. This is the embodiment hypothesis. This paper presents a theoretical review of this work that is intended to be useful for researchers in the neurosciences and neuropsychology. Beginning with a historical perspective, relevant theories are placed on a continuum from strongly embodied to completely unembodied representations. Predictions are derived and neuroscientific and neuropsychological evidence that could support different theories is reviewed; finally, criticisms of embodiment are discussed. We conclude that strongly embodied and completely disembodied theories are not supported, and that the remaining theories agree that semantic representation involves some form of Convergence Zones (Damasio, 1989) and the activation of modal content. For the future, research must carefully define the boundaries of semantic processing and tackle the representation of abstract entities.
Abstract:
Embodied theories of cognition propose that neural substrates used in experiencing the referent of a word, for example perceiving upward motion, should be engaged in weaker form when that word, for example ‘rise’, is comprehended. Motivated by the finding that the perception of irrelevant background motion at near-threshold, but not supra-threshold, levels interferes with task execution, we assessed whether interference from near-threshold background motion was modulated by its congruence with the meaning of words (semantic content) when participants completed a lexical decision task (deciding if a string of letters is a real word or not). Reaction times for motion words, such as ‘rise’ or ‘fall’, were slower when the direction of visual motion and the ‘motion’ of the word were incongruent, but only when the visual motion was at near-threshold levels. When motion was supra-threshold, the distribution of error rates, not reaction times, implicated low-level motion processing in the semantic processing of motion words. As the perception of near-threshold signals is not likely to be influenced by strategies, our results support a close contact between semantic information and perceptual systems.
Abstract:
Recent theories propose that semantic representation and sensorimotor processing have a common substrate via simulation. We tested the prediction that comprehension interacts with perception, using a standard psychophysics methodology. While passively listening to verbs that referred to upward or downward motion, and to control verbs that did not refer to motion, 20 subjects performed a motion-detection task, indicating whether or not they saw motion in visual stimuli containing threshold levels of coherent vertical motion. A signal detection analysis revealed that when verbs were directionally incongruent with the motion signal, perceptual sensitivity was impaired. Word comprehension also affected decision criteria and reaction times, but in different ways. The results are discussed with reference to existing explanations of embodied processing and the potential of psychophysical methods for assessing interactions between language and perception.
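For readers unfamiliar with the method, a signal detection analysis of this kind conventionally yields a sensitivity index (d') and a decision criterion (c) from hit and false-alarm counts. The sketch below shows that standard computation; the counts are made up purely to illustrate the measures the study compares across congruent and incongruent verb conditions, not the study's data.

```python
from scipy.stats import norm

def sdt_measures(hits, misses, false_alarms, correct_rejections):
    """Sensitivity (d') and decision criterion (c) from detection counts."""
    # Log-linear correction keeps z-scores finite when a rate would be 0 or 1.
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    d_prime = norm.ppf(hit_rate) - norm.ppf(fa_rate)
    criterion = -0.5 * (norm.ppf(hit_rate) + norm.ppf(fa_rate))
    return d_prime, criterion

# Hypothetical counts for one subject in one congruence condition:
print(sdt_measures(hits=40, misses=10, false_alarms=12, correct_rejections=38))
```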
Abstract:
In 1917 D.H. Lawrence's whole outlook on the social and cultural environment of his country was embodied in his attitude towards the literary marketplace. The suppression of The Rainbow in 1915 and his opposition to the war contributed to his feeling of detachment from what he called ‘the bourgeois world, the world which controls press, publication and all’. Presenting new archival evidence, this article examines the publishing history of the poetry volume Look! We Have Come Through, issued by Chatto & Windus in 1917. Closer examination of the motives of the individual editors involved in the production of the volume reveals why Lawrence was required to make changes to his text but also why the firm were eager to publish a volume that was to have little commercial impact. Issued at a critical moment in Lawrence's relationship with the marketplace, and in the history of literary modernism, the episode shows how, in spite of general hostility to his work, there were forces in the mainstream publishing market that were keen to embrace modern literary forms and take risks with the work of authors whose subject-matter was challenging and potentially dangerous.
Abstract:
Dualism has long distinguished between mental and bodily experiences. Probing the structure and organisation of the self traditionally calls for a distinction between these two sides of the self coin. It is far beyond the scope of this chapter to address these philosophical issues, and our starting point will be the simple distinction between the reflective processes involved in the elaboration of body image, self-awareness and self-recognition (i.e. ‘the self’) and the sensori-motor dialogues involved in action control, reactions and automatisms (i.e. the ‘body schema’). This oversimplification does not take into account the complex interactions taking place between these two levels of description, but our initial aim will be to distinguish between them before addressing the question of their interactions. Cognitive and sensori-motor processes have frequently been distinguished (review: Rossetti and Revonsuo 2000), and it may be proposed that a similar dissociation can be put forward, a priori, between a central representation of the self and a bodily representation corresponding to the body schema (Figure 1).
Abstract:
Global temperatures are expected to rise by between 1.1 and 6.4 °C this century, depending, to a large extent, on the amount of carbon we emit to the atmosphere from now onwards. This warming is expected to have very negative effects on many peoples and ecosystems and, therefore, minimising our carbon emissions is a priority. Buildings are estimated to be responsible for around 50% of carbon emissions in the UK. Potential reductions involve both operational emissions, produced during use, and embodied emissions, produced during the manufacture of materials and components, and during construction, refurbishments and demolition. To date the major effort has focused on reducing the apparently larger operational element, which is more readily quantifiable and for which reduction measures are relatively straightforward to identify and implement. Various studies have compared the magnitude of embodied and operational emissions, but have shown considerable variation in the relative values. This illustrates the difficulty in quantifying embodied emissions, as it requires a detailed knowledge of the processes involved in the different life cycle phases and the use of consistent system boundaries. However, other studies have established the interaction between operational and embodied emissions, which demonstrates the importance of considering both elements together in order to maximise potential reductions. This is borne out in statements from both the Intergovernmental Panel on Climate Change and the Low Carbon Construction Innovation and Growth Team of the UK Government. In terms of meeting the 2020 and 2050 timeframes for carbon reductions, it appears to be equally, if not more, important to consider early embodied carbon reductions rather than just future operational reductions. Future decarbonisation of the energy supply and more efficient lighting and M&E equipment installed in future refits are likely to reduce operational emissions significantly, lending further weight to this argument. A method of discounting to evaluate the present value of future carbon emissions would allow more realistic comparisons to be made of the relative importance of the embodied and operational elements. This paper describes the results of case studies on carbon emissions over the whole lifecycle of three buildings in the UK, compares four available software packages for determining embodied carbon and suggests a method of carbon discounting to obtain present values for future emissions. These form the initial stages of a research project aimed at producing information on embodied carbon for different types of building, components and forms of construction, in a simplified form that can be readily used by building designers in optimising building design to minimise overall carbon emissions. Keywords: embodied carbon; carbon emissions; building; operational carbon.
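The abstract proposes, but does not spell out, its discounting method; a minimal sketch of the generic present-value formula it alludes to, with a discount rate chosen purely for illustration, might look like this:

```python
def present_value(emission_tco2e, years_from_now, discount_rate=0.03):
    """Present-value equivalent of a future carbon emission (rate is an assumption)."""
    return emission_tco2e / (1 + discount_rate) ** years_from_now

# Embodied carbon released at construction is not discounted (t = 0), whereas the
# same quantity emitted operationally decades later has a smaller present value,
# which is one way to formalise the case for early embodied reductions.
print(present_value(500, years_from_now=0))   # e.g. 500 tCO2e embodied today
print(present_value(500, years_from_now=30))  # 500 tCO2e emitted in operation later
```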
Abstract:
This essay traces the development of Otto Neurath’s ideas that led to the publication of one of the first series of children’s books produced by the Isotype Institute in the late 1940s, the Visual History of Mankind. Described in its publicity material as ‘new in content’ and ‘new in method’, it embodied much of Otto Neurath’s thinking about visual education, and also coincided with other educational ideas in the UK in the 1930s and 1940s. It exemplified the Isotype Institute’s approach: teamwork, thinking about the needs of younger readers, clear explanation, and accessible content. Further, drawing on correspondence, notes and drawings from the Otto and Marie Neurath Isotype Collection at the University of Reading, the essay presents insights into the making of the books and the people involved, the costs of production and their influence on design decisions, and how the books were received by teachers and children.
Abstract:
Nowadays, utilising the proper HVAC system is essential both in extreme weather conditions and in dense building design. Hydraulic loops are among the most common components in all air conditioning systems. This article aims to investigate the performance of different hydraulic loop arrangements in variable-flow systems. Technical, economic and environmental assessments have been considered in this process. A dynamic system simulation is generated to evaluate system performance, and an economic evaluation is conducted by whole-life cost assessment. Moreover, environmental impacts have been studied by considering the whole-life energy consumption, CO2 emissions, and the embodied energy and embodied CO2 of the system components. Finally, a decision-making approach for choosing the most suitable hydraulic system among five well-known alternatives is proposed.
Abstract:
The plethora, and mass take-up, of digital communication technologies has resulted in a wealth of interest in social network data collection and analysis in recent years. Within many such networks the interactions are transient: thus those networks evolve over time. In this paper we introduce a class of models for such networks using evolving graphs with memory-dependent edges, which may appear and disappear according to their recent history. We consider time-discrete and time-continuous variants of the model. We consider the long-term asymptotic behaviour as a function of parameters controlling the memory dependence. In particular we show that such networks may continue evolving forever, or else may quench and become static (containing immortal and/or extinct edges). This depends on the existence or otherwise of certain infinite products and series involving age-dependent model parameters. To test these ideas we show how model parameters may be calibrated based on limited samples of time-dependent data, and we apply these concepts to three real networks: summary data on mobile phone use from a developing region; online social-business network data from China; and disaggregated mobile phone communications data from a reality mining experiment in the US. In each case we show that there is evidence for memory-dependent dynamics, such as that embodied within the class of models proposed here.
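The abstract does not reproduce the model's equations, but the memory-dependent mechanism it describes can be illustrated with a toy discrete-time simulation of a single edge whose per-step probabilities of disappearing or reappearing depend on how long it has been present or absent. The functional forms below are assumptions chosen only to show the quench-versus-keep-evolving distinction, not the authors' model.

```python
import numpy as np

def simulate_edge(death_prob, birth_prob, steps, seed=None):
    """Discrete-time trajectory of one memory-dependent edge.

    death_prob(age) is the chance an edge of that age disappears this step;
    birth_prob(gap) is the chance an absent edge reappears after that gap.
    """
    rng = np.random.default_rng(seed)
    present, age, history = True, 0, []
    for _ in range(steps):
        history.append(present)
        if rng.random() < (death_prob(age) if present else birth_prob(age)):
            present, age = not present, 0  # state flips, memory resets
        else:
            age += 1
    return history

# If death_prob(age) shrinks fast enough that its series converges, the edge's
# survival probability (an infinite product of (1 - death_prob(age)) terms) stays
# positive and the edge can become effectively immortal; otherwise it keeps churning.
# This mirrors the kind of infinite product/series condition the abstract refers to.
traj = simulate_edge(death_prob=lambda a: 0.5 / (a + 1) ** 2,
                     birth_prob=lambda g: 0.1,
                     steps=10_000, seed=1)
print(sum(traj[-1000:]), "of the last 1000 steps had the edge present")
```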