96 results for Unity, Mixed Reality, Extended Reality, Augmented Reality, Virtual Reality, Design pattern


Relevance:

60.00%

Publisher:

Abstract:

The paper seeks to explore in depth the ways in which rhetorical strategies are employed in the international accounting standard setting process. The study proposes that rather than simply detailing new accounting requirements, the texts and drafts of accounting standards are artefacts, i.e. deliberately and carefully crafted products, that construct, persuade and encourage certain beliefs and behaviours. The persuasive and constructive strategies are also employed by the constituents submitting comment letters on the regulatory proposals. Consequently, the international accounting standard setting process is an ‘interactive process of meaning making’ (Fairclough, 1989). The study regards accounting as a social construct based on intersubjectivity (Searle, 1995; Davidson, 1990, 1994) and posits language as a constitutive factor in the process (Saussure, 1916; Peirce, 1931-58). This approach to the use of language and the role of rhetoric as a persuasive tool to convince others of our perception of ‘accounting reality’ is supported by the sociological work of Bourdieu (1990, 1991). Bourdieu has drawn our attention to how language becomes used, controlled, reformed and reconstituted by social agents for the purpose of establishing their dominance. In our study we explore in particular the joint IASB and FASB proposals and subsequent regulations on the scope of consolidation and relevant disclosures that address issues of off-balance sheet financing, a timely subject of great topical importance. The analysis has revealed sophisticated rhetorical devices used both by the Boards and by the lobbyists. These reflect Aristotelian ethos, pathos and logos. The research demonstrates that those using accounting standards, as well as those reading comment letters on the proposals for new standards, should be aware of the normative nature of these documents and the subjectivity inherent in the text.

Relevance:

60.00%

Publisher:

Abstract:

All new homes in the UK will be required to be zero carbon from 2016. Housing sector bodies and individual housing developers are championing a transition from traditional marketing to green marketing approaches to raise consumer awareness of the benefits of low and zero carbon homes. On-site sales teams on housing developments form a central interface between the developer and potential buyers. These teams, then, have a critical role in the success or otherwise of the developers’ green marketing strategies. However, there is a dearth of empirical research that explores the actual attitudes and practices of these teams. An exploratory case study approach was adopted. The data collection consisted of reviewing relevant company documentation and semi-structured interviews with the on-site sales teams from six housing developments. The findings from two case studies suggest that the sales teams do have the potential to forge a bridge between the design / production and consumption spheres in ways that consumers understand and appreciate, but further work is required. The sales teams’ practices were constrained by the incumbent, traditional marketing logic that revolves around issues such as location and selling price. The sales teams appeared to adopt a strategy of restricting information about the benefits of low and zero carbon homes so as not to disturb the prevailing logic. Further, the sales teams justify this insulating mechanism with the argument that consumers are not interested in those benefits. This rhetoric may be driving a real wedge between the design / production and consumption spheres to the detriment of the consumer and, in the longer term, the house builder itself.

Relevance:

60.00%

Publisher:

Abstract:

In their contribution to PNAS, Penner et al. (1) used a climate model to estimate the radiative forcing by the aerosol first indirect effect (cloud albedo effect) in two different ways: first, by deriving a statistical relationship between the logarithm of cloud droplet number concentration, ln Nc, and the logarithm of aerosol optical depth, ln AOD (or the logarithm of the aerosol index, ln AI) for present-day and preindustrial aerosol fields, a method that was applied earlier to satellite data (2), and, second, by computing the radiative flux perturbation between two simulations with and without anthropogenic aerosol sources. They find a radiative forcing that is a factor of 3 lower in the former approach than in the latter [as Penner et al. (1) correctly noted, only their “inline” results are useful for the comparison]. This study is a very interesting contribution, but we believe it deserves several clarifications.
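
As a rough illustration of the first, statistical approach described above, the sketch below fits a linear relationship between ln AOD and ln Nc from present-day fields and applies the fitted slope to the preindustrial-to-present-day change in ln AOD. It is a minimal sketch using synthetic data and assumed variable names, not the inline diagnostic of Penner et al. (1).

```python
# Hypothetical sketch of the satellite-style statistical method:
# regress ln(Nc) on ln(AOD) for the present day, then apply the fitted
# slope to the preindustrial-to-present-day change in ln(AOD).
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "model output": aerosol optical depth and droplet number
aod_pd = rng.uniform(0.05, 0.6, 1000)                          # present-day AOD
ln_nc_pd = 4.0 + 0.45 * np.log(aod_pd) + rng.normal(0, 0.2, 1000)
aod_pi = aod_pd * 0.6                                          # assumed weaker preindustrial aerosol

# Fit ln(Nc) = a + b * ln(AOD) from present-day fields
b, a = np.polyfit(np.log(aod_pd), ln_nc_pd, 1)

# Infer the PI-to-PD change in ln(Nc) from the change in ln(AOD)
delta_ln_nc = b * (np.log(aod_pd) - np.log(aod_pi))
print(f"fitted slope d lnNc / d lnAOD = {b:.2f}")
print(f"mean inferred increase in ln Nc: {delta_ln_nc.mean():.2f}")
```

The radiative-flux-perturbation approach, by contrast, diagnoses the forcing directly from paired simulations with and without anthropogenic aerosol sources, which is the comparison in which the factor-of-three difference appears.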

Relevance:

60.00%

Publisher:

Abstract:

The chapter focuses on the relationships between 'Reality TV' and other ‘realist’ forms and genres of television. This issue is connected to larger debates about ‘televisuality’, and the understanding of the distinctiveness of the medium. Television processes, and worries over, reality in all of its genres, so that realism becomes a particularly ambiguous term. One meaning focuses on actual scenes, places and people being represented rather than imagined. A second meaning refers to television’s representation of recognisable and often contemporary experience. Another meaning of realism refers to the development of new and different forms to give access to the real. Furthermore, the establishment of category distinctions in television, such as between factual and fictional forms, or between drama and documentary, could be seen as increasingly problematic in contemporary television. Reality TV can be thought of as the trying-out of forms and modes of address in one genre or form that are adopted from apparently different genres and forms, thus creating connection and distinction simultaneously. This chapter addresses these distinctions and ambiguities within Reality TV, using examples including One Born Every Minute and The Only Way is Essex.

Relevance:

60.00%

Publisher:

Abstract:

This chapter considers aspects of urban design and the associated identity of place, which shift over time and must reconcile economic pressures to develop with cultural concerns about change.

Relevance:

50.00%

Publisher:

Abstract:

Using the Met Office large-eddy model (LEM) we simulate a mixed-phase altocumulus cloud that was observed from Chilbolton in southern England by a 94 GHz Doppler radar, a 905 nm lidar, a dual-wavelength microwave radiometer and also by four radiosondes. It is important to test and evaluate such simulations with observations, since there are significant differences between results from different cloud-resolving models for ice clouds. Simulating the Doppler radar and lidar data within the LEM allows us to compare observed and modelled quantities directly, and allows us to explore the relationships between observed and unobserved variables. For general-circulation models, which currently tend to give poor representations of mixed-phase clouds, the case shows the importance of using: (i) separate prognostic ice and liquid water, (ii) a vertical resolution that captures the thin layers of liquid water, and (iii) an accurate representation of the subgrid vertical velocities that allow liquid water to form. It is shown that large-scale ascents and descents are significant for this case, and so the horizontally averaged LEM profiles are relaxed towards observed profiles to account for these. The LEM simulation then gives a reasonable cloud, with an ice-water path (IWP) approximately two thirds of that observed, and with liquid water at the cloud top, as observed. However, the liquid-water cells that form in the updraughts at cloud top in the LEM have liquid-water paths (LWPs) up to half those observed, and there are too few cells, giving a mean LWP five to ten times smaller than observed. In reality, ice nucleation and fallout may deplete ice-nuclei concentrations at the cloud top, allowing more liquid water to form there, but this process is not represented in the model. Decreasing the heterogeneous nucleation rate in the LEM increased the LWP, which supports this hypothesis. The LEM captures the increase in the standard deviation in Doppler velocities (and so vertical winds) with height, but values are 1.5 to 4 times smaller than observed (although values are larger in an unforced model run, this only increases the modelled LWP by a factor of approximately two). The LEM data show that, for values larger than approximately 12 cm s⁻¹, the standard deviation in Doppler velocities provides an almost unbiased estimate of the standard deviation in vertical winds, but provides an overestimate for smaller values. Time-smoothing the observed Doppler velocities and modelled mass-squared-weighted fallspeeds shows that observed fallspeeds are approximately two-thirds of the modelled values. Decreasing the modelled fallspeeds to those observed increases the modelled ice water content (IWC), giving an IWP 1.6 times that observed.
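
The relaxation of horizontally averaged profiles towards observations mentioned above is, in general form, a Newtonian nudging step. The sketch below is a generic illustration of such a relaxation, not the LEM's actual forcing code; the profile arrays, time step and relaxation time scale are assumptions.

```python
# Generic Newtonian relaxation (nudging) of a horizontally averaged profile
# towards an observed profile, as commonly used to represent large-scale
# ascent/descent that a limited-area simulation cannot generate itself.
import numpy as np

def nudge_profile(field, observed_profile, dt, tau):
    """Relax the horizontal mean of `field` (nz, ny, nx) towards
    `observed_profile` (nz,) with time step dt and relaxation time tau."""
    horizontal_mean = field.mean(axis=(1, 2))              # shape (nz,)
    increment = (observed_profile - horizontal_mean) * dt / tau
    return field + increment[:, None, None]                # broadcast over y, x

# Example with synthetic data: 50 levels, 32 x 32 columns
theta = 290.0 + np.random.randn(50, 32, 32)
theta_obs = np.linspace(288.0, 295.0, 50)
theta = nudge_profile(theta, theta_obs, dt=10.0, tau=3600.0)
```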

Relevance:

50.00%

Publisher:

Abstract:

This paper investigates the use of really simple syndication (RSS) to dynamically change virtual environments. The case study presented here uses meteorological data downloaded from the Internet in the form of an RSS feed; this data is used to simulate current weather patterns in a virtual environment. The downloaded data is aggregated and interpreted in conjunction with a configuration file, which associates the relevant weather information with the rendering engine. The engine is able to animate a wide range of basic weather patterns. Virtual reality is a way of immersing a user in a different environment, and the amount of immersion the user experiences is important. Collaborative virtual reality will benefit from this work by gaining a simple way to incorporate up-to-date RSS feed data into any environment scenario. Instead of simulating weather conditions in training scenarios, actual weather conditions can be incorporated, improving the scenario and the sense of immersion.
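
As a rough illustration of the aggregation step, the sketch below downloads an RSS feed, extracts simple weather keywords from each item, and looks them up in a configuration table that a rendering engine could consume. The feed URL, keyword list and configuration mapping are hypothetical, not those used in the paper.

```python
# Hypothetical sketch: read a weather RSS feed and map its text onto
# rendering-engine weather presets via a small configuration table.
import urllib.request
import xml.etree.ElementTree as ET

FEED_URL = "https://example.com/weather/rss"        # placeholder feed

# Configuration file contents, here inlined: keyword -> engine preset
WEATHER_CONFIG = {
    "rain":   {"effect": "rain_particles", "cloud_cover": 0.9},
    "snow":   {"effect": "snow_particles", "cloud_cover": 0.8},
    "sunny":  {"effect": "clear_sky",      "cloud_cover": 0.1},
    "cloudy": {"effect": "overcast",       "cloud_cover": 0.7},
}

def current_weather_presets(url=FEED_URL):
    """Return the engine presets matching keywords found in the feed."""
    with urllib.request.urlopen(url, timeout=10) as response:
        root = ET.fromstring(response.read())
    presets = []
    for item in root.iter("item"):
        text = " ".join(filter(None, [item.findtext("title"),
                                      item.findtext("description")])).lower()
        for keyword, preset in WEATHER_CONFIG.items():
            if keyword in text:
                presets.append(preset)
    return presets
```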

Relevance:

50.00%

Publisher:

Abstract:

The interface between humans and technology is a rapidly changing field. In particular, as technological methods have improved dramatically, interaction has become possible that could only be speculated about even a decade earlier. This interaction can, though, take on a wide range of forms. Indeed, standard buttons and dials with televisual feedback are perhaps a common example. But now virtual reality systems, wearable computers and, most of all, implant technology are throwing up a completely new concept, namely a symbiosis of human and machine. No longer is it sensible simply to consider how a human interacts with a machine, but rather how the human-machine symbiotic combination interacts with the outside world. In this paper we take a look at some of the recent approaches, putting implant technology in context. We also consider some specific practical examples which may well alter the way we look at this symbiosis in the future. The main area of interest as far as symbiotic studies are concerned is clearly the use of implant technology, particularly where a connection is made between technology and the human brain and/or nervous system. Often pilot tests and experimentation have been carried out a priori to investigate the eventual possibilities before human subjects are themselves involved. Some of the more pertinent animal studies are discussed briefly here. The paper, however, concentrates on human experimentation, in particular that carried out by the authors themselves, firstly to indicate what possibilities exist now with available technology, but perhaps more importantly to show what might be possible with such technology in the future and how this may well have extensive social effects. The driving force behind the integration of technology with humans on a neural level has historically been to restore lost functionality in individuals who have suffered neurological trauma such as spinal cord damage, or who suffer from a debilitating disease such as amyotrophic lateral sclerosis. Very few would argue against the development of implants to enable such people to control their environment, or some aspect of their own body functions. Indeed, in the short term this technology has applications for the amelioration of symptoms for the physically impaired, such as alternative senses being bestowed on a blind or deaf individual. However, the issue becomes distinctly more complex when it is proposed that such technology be used on those with no medical need, but who instead wish to enhance and augment their own bodies, particularly in terms of their mental attributes. These issues are discussed here in the light of practical experimental test results and their ethical consequences.

Relevance:

50.00%

Publisher:

Abstract:

The perspex machine arose from the unification of projective geometry with the Turing machine. It uses a total arithmetic, called transreal arithmetic, that contains real arithmetic and allows division by zero. Transreal arithmetic is redefined here. The new arithmetic has both a positive and a negative infinity, which lie at the extremes of the number line, and a number, nullity, that lies off the number line. We prove that nullity, 0/0, is a number. Hence a number may have one of four signs: negative, zero, positive, or nullity. It is, therefore, impossible to encode the sign of a number in one bit, as floating-point arithmetic attempts to do, resulting in the difficulty of having both positive and negative zeros and NaNs. Transrational arithmetic is consistent with Cantor arithmetic. In an extension to real arithmetic, the product of zero, an infinity, or nullity with its reciprocal is nullity, not unity. This avoids the usual contradictions that follow from allowing division by zero. Transreal arithmetic has a fixed algebraic structure and does not admit options as IEEE floating-point arithmetic does. Most significantly, nullity has a simple semantics that is related to zero. Zero means "no value" and nullity means "no information." We argue that nullity is as useful to a manufactured computer as zero is to a human computer. The perspex machine is intended to offer one solution to the mind-body problem by showing how the computable aspects of mind and, perhaps, the whole of mind relate to the geometrical aspects of body and, perhaps, the whole of body. We review some of Turing's writings and show that he held the view that his machine has spatial properties, in particular that it has the property of being a 7D lattice of compact spaces. Thus, we read Turing as believing that his machine relates computation to geometrical bodies. We simplify the perspex machine by substituting an augmented Euclidean geometry for projective geometry. This leads to a general-linear perspex machine which is very much easier to program than the original perspex machine. We then show how to map the whole of perspex space into a unit cube. This allows us to construct a fractal of perspex machines with the cardinality of a real-numbered line or space. This fractal is the universal perspex machine. It can solve, in unit time, the halting problem for itself and for all perspex machines instantiated in real-numbered space, including all Turing machines. We cite an experiment that has been proposed to test the physical reality of the perspex machine's model of time, but we make no claim that the physical universe works this way or that it has the cardinality of the perspex machine. We leave it that the perspex machine provides an upper bound on the computational properties of physical things, including manufactured computers and biological organisms, that have a cardinality no greater than the real-number line.
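
The arithmetic rules quoted above can be made concrete in a few lines. The sketch below is an illustrative encoding, not the authors' axiomatisation: it represents nullity as a distinct sentinel rather than an IEEE NaN, implements division so that a/0 yields +∞, −∞ or nullity according to the sign of a, and returns one of the four signs.

```python
# Illustrative sketch of transreal division and the four-valued sign,
# following the rules in the abstract: a/0 is +inf for a > 0, -inf for
# a < 0, and nullity for a = 0; nullity propagates through operations.
POS_INF = float("inf")
NEG_INF = float("-inf")
NULLITY = object()          # a value off the number line, distinct from NaN

def trans_div(a, b):
    if a is NULLITY or b is NULLITY:
        return NULLITY
    if b == 0:
        if a == 0:
            return NULLITY              # 0/0 = nullity
        return POS_INF if a > 0 else NEG_INF
    if b in (POS_INF, NEG_INF):
        if a in (POS_INF, NEG_INF):
            return NULLITY              # an infinity times its reciprocal is nullity
        return 0.0                      # finite / infinite = 0
    return a / b

def trans_sign(x):
    """One of four signs: 'negative', 'zero', 'positive' or 'nullity'."""
    if x is NULLITY:
        return "nullity"
    if x == 0:
        return "zero"
    return "positive" if x > 0 else "negative"

assert trans_div(1.0, 0.0) == POS_INF
assert trans_div(0.0, 0.0) is NULLITY
assert trans_sign(trans_div(-2.0, 0.0)) == "negative"
```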

Relevance:

50.00%

Publisher:

Abstract:

Efficient navigation is essential for the user acceptance of Virtual Environments (VEs), but it is also an inherently difficult task to perform. Research in the area provides users with a great variety of navigation assistance in VEs; however, this assistance is still known to be inadequate, complex and subject to many limitations. In this paper we discuss the task of navigation in virtual environments and survey the wayfinding assistance currently available for VEs. The paper introduces a taxonomy of navigation and categorizes the aids on the basis of the functions performed. The paper also reviews current work in the area of non-speech auditory aids. We conclude with views on the important areas that require further investigation and research.
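
To make the idea of a function-based taxonomy concrete, the sketch below shows one hypothetical way of grouping wayfinding aids by the function they perform. The category names and assignments are illustrative assumptions, not the taxonomy proposed in the paper.

```python
# Hypothetical illustration of a function-based grouping of wayfinding aids.
# The categories and their member aids are examples only.
WAYFINDING_AIDS = {
    "orientation": ["maps", "compasses", "landmarks"],
    "route_guidance": ["trails", "guide agents", "signposts"],
    "non_visual_cues": ["acoustic landmarks", "non-speech audio beacons"],
}

def aids_for_function(function_name):
    """Return the aids registered under a given navigation function."""
    return WAYFINDING_AIDS.get(function_name, [])

print(aids_for_function("non_visual_cues"))
```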

Relevance:

50.00%

Publisher:

Abstract:

This paper addresses the crucial problem of wayfinding assistance in Virtual Environments (VEs). A number of navigation aids such as maps, agents, trails and acoustic landmarks are available to support the user in navigating VEs; however, it is evident that most of these aids are visually dominated. This work-in-progress describes a sound-based approach that intends to assist the task of 'route decision' during navigation in a VE using music. Furthermore, by using musical sounds it aims to reduce the cognitive load associated with other visually as well as physically dominated tasks. To achieve these goals, the approach exploits the benefits provided by music to ease and enhance the task of wayfinding, whilst making the user experience in the VE smooth and enjoyable.
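
One simple way to picture a musical cue for route decisions is to map the deviation between the user's heading and the correct route onto a musical parameter. The sketch below is a hypothetical illustration of that kind of mapping; it is not the mechanism described in the paper, and the parameter ranges are assumptions.

```python
# Hypothetical sketch: map heading error at a route-decision point onto a
# MIDI-style note number, so the cue sounds higher the closer the user's
# heading is to the correct route.
def route_decision_pitch(heading_deg, correct_heading_deg,
                         low_note=48, high_note=72):
    """Return a note number: high when aligned with the route, low when not."""
    error = abs((heading_deg - correct_heading_deg + 180) % 360 - 180)  # 0..180
    alignment = 1.0 - error / 180.0                                     # 1 = on route
    return round(low_note + alignment * (high_note - low_note))

print(route_decision_pitch(90, 95))    # nearly aligned -> close to high_note
print(route_decision_pitch(90, 270))   # opposite direction -> low_note
```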

Relevance:

50.00%

Publisher:

Abstract:

As Virtual Reality pushes the boundaries of the human-computer interface, new ways of interaction are emerging. One such technology is the integration of haptic interfaces (force-feedback devices) into virtual environments. This modality offers an improved sense of immersion over that achieved when relying only on audio and visual modalities. The paper introduces some of the technical obstacles, such as latency and network traffic, that need to be overcome to maintain a high degree of immersion during haptic tasks. The paper describes the advantages of integrating haptic feedback into systems, and presents some of the technical issues inherent in a networked haptic virtual environment. A generic control interface has been developed to seamlessly mesh with existing networked VR development libraries.
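
Haptic rendering typically has to run at a much higher rate than network updates arrive, which is one reason latency and traffic matter for immersion. The sketch below is a hypothetical outline of a generic control interface: a device-independent base class plus a local step that keeps rendering forces from the most recently received networked state. The class and method names are assumptions, not the interface described in the paper.

```python
# Hypothetical outline of a device-independent haptic control interface.
# The local loop keeps computing forces from the latest networked object
# state, so force rendering is not stalled by network latency.
from abc import ABC, abstractmethod

class HapticDevice(ABC):
    @abstractmethod
    def read_position(self):
        """Return the stylus position as (x, y, z) in metres."""

    @abstractmethod
    def apply_force(self, force):
        """Send an (fx, fy, fz) force, in newtons, to the device."""

class NetworkedHapticObject:
    """Latest object state received over the network (updated asynchronously)."""
    def __init__(self, surface_height=0.0, stiffness=400.0):
        self.surface_height = surface_height
        self.stiffness = stiffness

def haptic_step(device, shared_object):
    """One iteration of the local haptic loop (would run at a high rate)."""
    x, y, z = device.read_position()
    penetration = shared_object.surface_height - z
    fz = shared_object.stiffness * penetration if penetration > 0 else 0.0
    device.apply_force((0.0, 0.0, fz))     # simple spring contact model
```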

Relevance:

50.00%

Publisher:

Abstract:

Virtual Reality (VR) has been used in a variety of forms to assist in the treatment of a wide range of psychological illnesses. VR can also fulfil the need that psychologists have for safe environments in which to conduct experiments. Currently the main barrier to using this technology is the complexity of developing applications. This paper presents two different co-operative psychological applications which have been developed using a single framework. These applications require different levels of co-operation between the users and clients, ranging from full psychologist involvement to minimal intervention. This paper also discusses our approach to developing these different environments and our experiences to date in utilising them.
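
As a rough illustration of how a single framework might expose different levels of psychologist involvement, the sketch below parameterises a session by an involvement level that gates which controls the therapist can exercise. The level names and permitted commands are hypothetical assumptions, not the framework's actual design.

```python
# Hypothetical sketch: one session class configured for different levels of
# psychologist involvement, from full control to minimal intervention.
from enum import Enum

class Involvement(Enum):
    FULL = "full"          # psychologist drives the scenario throughout
    MINIMAL = "minimal"    # scenario runs itself; psychologist can only pause

class TherapySession:
    def __init__(self, involvement):
        self.involvement = involvement
        self.paused = False

    def psychologist_command(self, command):
        if self.involvement is Involvement.MINIMAL:
            allowed = {"pause"}
        else:
            allowed = {"pause", "change_scene", "trigger_event", "adjust_difficulty"}
        if command not in allowed:
            raise PermissionError(f"'{command}' not allowed at {self.involvement}")
        if command == "pause":
            self.paused = True
        return command

session = TherapySession(Involvement.MINIMAL)
session.psychologist_command("pause")          # permitted
# session.psychologist_command("change_scene") # would raise PermissionError
```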