54 results for Realism.


Relevance: 10.00%

Abstract:

Dynamic multi-user interactions in a single networked virtual environment suffer from abrupt state transition problems due to communication delays arising from network latency--an action by one user only becoming apparent to another user after the communication delay. This results in a temporal suspension of the environment for the duration of the delay--the virtual world 'hangs'--followed by an abrupt jump to make up for the time lost due to the delay so that the current state of the virtual world is displayed. These discontinuities appear unnatural and disconcerting to the users. This paper proposes a novel method of warping times associated with users to ensure that each user views a continuous version of the virtual world, such that no hangs or jumps occur despite other user interactions. Objects passed between users within the environment are parameterized, not by real time, but by a virtual local time, generated by continuously warping real time. This virtual time periodically realigns itself with real time as the virtual environment evolves. The concept of a local user dynamically warping the local time is also introduced. As a result, the users are shielded from viewing discontinuities within their virtual worlds, consequently enhancing the realism of the virtual environment.
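The time-warping idea can be sketched as follows (a minimal illustration under stated assumptions, not the paper's actual algorithm; the function name and the piecewise-linear warp are assumptions): a remote object's state only becomes known after a delay d, so its virtual local time starts d behind real time and runs at a constant rate above 1 over a catch-up window T until it realigns with real time, giving a continuous, monotone mapping with neither a hang nor a jump.

```python
def virtual_time(t, d, T):
    """Map real time t to an object's virtual local time.

    Illustrative piecewise-linear warp (hypothetical, not the paper's
    exact scheme): the object's state becomes known at real time d, so
    its virtual clock starts at 0 then, runs at constant rate
    (d + T) / T > 1 during the catch-up window of length T, and
    thereafter coincides with real time.
    """
    if t <= d:
        return 0.0                 # state not yet known
    if t >= d + T:
        return float(t)            # fully realigned with real time
    return (t - d) * (d + T) / T   # constant catch-up rate > 1
```

At t = d the virtual clock reads 0; at t = d + T it reads d + T, matching real time exactly, with no discontinuity anywhere in between.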

Relevance: 10.00%

Abstract:

Recent studies using comprehensive middle atmosphere models predict a strengthening of the Brewer-Dobson circulation in response to climate change. To gain confidence in the realism of this result it is important to quantify and understand the contributions from the different components of stratospheric wave drag that cause this increase. Such an analysis is performed here using three 150-yr transient simulations from the Canadian Middle Atmosphere Model (CMAM), a Chemistry-Climate Model that simulates climate change and ozone depletion and recovery. Resolved wave drag and parameterized orographic gravity wave drag account for 60% and 40%, respectively, of the long-term trend in annual mean net upward mass flux at 70 hPa, with planetary waves accounting for 60% of the resolved wave drag trend. Synoptic wave drag has the strongest impact in northern winter, where it accounts for nearly as much of the upward mass flux trend as planetary wave drag. Owing to differences in the latitudinal structure of the wave drag changes, the relative contribution of resolved and parameterized wave drag to the tropical upward mass flux trend over any particular latitude range is highly sensitive to the range of latitudes considered. An examination of the spatial structure of the climate change response reveals no straightforward connection between the low-latitude and high-latitude changes: while the model results show an increase in Arctic downwelling in winter, they also show a decrease in Antarctic downwelling in spring. Both changes are attributed to changes in the flux of stationary planetary wave activity into the stratosphere.

Relevance: 10.00%

Abstract:

The divide between agency and structural explanations of the causes of social phenomena has dominated research on housing, as it has in other social fields. However, there has been some research that has sought to transcend this schism by combining agency and structural dimensions in the understanding of housing processes and outcomes. The article reviews the two most common approaches to doing this in housing research – structuration (following the work of Giddens) and critical realism. The example of research on homelessness is used to show how the approaches have been applied to housing issues.

Relevance: 10.00%

Abstract:

The development of NWP models with grid spacing down to 1 km should produce more realistic forecasts of convective storms. However, greater realism does not necessarily mean more accurate precipitation forecasts. The rapid growth of errors on small scales in conjunction with preexisting errors on larger scales may limit the usefulness of such models. The purpose of this paper is to examine whether improved model resolution alone is able to produce more skillful precipitation forecasts on useful scales, and how the skill varies with spatial scale. A verification method will be described in which skill is determined from a comparison of rainfall forecasts with radar using fractional coverage over different sized areas. The Met Office Unified Model was run with grid spacings of 12, 4, and 1 km for 10 days in which convection occurred during the summers of 2003 and 2004. All forecasts were run from 12-km initial states for a clean comparison. The results show that the 1-km model was the most skillful over all but the smallest scales (below approximately 10–15 km). A measure of acceptable skill was defined; this was attained by the 1-km model at scales around 40–70 km, some 10–20 km less than that of the 12-km model. The biggest improvement occurred for heavier, more localized rain, despite it being more difficult to predict. The 4-km model did not improve much on the 12-km model because of the difficulties of representing convection at that resolution, which was accentuated by the spinup from 12-km fields.
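The fractional-coverage verification described above can be sketched as a neighbourhood skill computation (a hedged illustration following the common fractions-skill-score formulation; the function names and score definition are assumptions, not quoted from the paper):

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def neighbourhood_fractions(exceed, n):
    """Fraction of exceedance pixels in every n x n window (valid region)."""
    windows = sliding_window_view(exceed.astype(float), (n, n))
    return windows.mean(axis=(-2, -1))

def fractions_skill_score(forecast, observed, threshold, n):
    """Compare forecast and radar rainfall via fractional coverage
    over n x n neighbourhoods: 1 means the forecast reproduces the
    observed coverage exactly at this scale, 0 means no skill."""
    pf = neighbourhood_fractions(forecast >= threshold, n)
    po = neighbourhood_fractions(observed >= threshold, n)
    mse = np.mean((pf - po) ** 2)
    mse_ref = np.mean(pf ** 2) + np.mean(po ** 2)
    return 1.0 - mse / mse_ref if mse_ref > 0 else np.nan
```

Evaluating the score over increasing n shows how skill varies with spatial scale: a displaced but otherwise correct storm scores poorly at the grid scale yet well once the neighbourhood exceeds the displacement error.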

Relevance: 10.00%

Abstract:

This article explores the representations and tonal qualities of British “structured reality” programming. Focusing on The Only Way Is Essex and Made in Chelsea, it investigates their glocalizing of the model established by MTV’s Laguna Beach and The Hills. It argues that while they blur boundaries between docusoap, drama, and soap opera, the British programs also recognize and foreground issues of construction for their reality TV-literate youth audience. It suggests the programs play a key role in their respective channel identities and the ideologies of British youth television, connecting to larger issues of class, gender, and taste. This is articulated through their regional and classed femininities, with the article exploring how the programs draw on classed ideologies surrounding “natural” and “excessive” femininities and the role of this in their engagement with construction and camp play. This play contributes to the tonal shift offered by the British programs, mixing the melodrama of the MTV programs with a knowing, at times comic edge that can tip into mockery. In doing so, the programs offer their audience a combination of performative self-awareness and emotional realism that situates them clearly within British youth television.

Relevance: 10.00%

Abstract:

This paper investigates a puzzling feature of social conventions: the fact that they are both arbitrary and normative. We examine how this tension is addressed in sociological accounts of conventional phenomena. Traditional approaches tend to generate either synchronic accounts that fail to consider the arbitrariness of conventions, or diachronic accounts that miss central aspects of their normativity. As a remedy, we propose a processual conception that considers conventions as both the outcome and material cause of much human activity. This conceptualization, which borrows from the économie des conventions as well as critical realism, provides a novel perspective on how conventions are nested and defined, and on how they are established, maintained and challenged.

Relevance: 10.00%

Abstract:

This chapter looks into the gap between presentational realism and the representation of physical experience in Werner Herzog's work so as to retrieve the indexical trace – or the absolute materiality of death. To that end, it draws links between Herzog and other directors akin to realism in its various forms, including surrealism. In particular, it focuses on François Truffaut and Glauber Rocha, representing respectively the Nouvelle Vague and the Cinema Novo, whose works had a decisive weight on Herzog’s aesthetic choices to the point of originating distinct phases of his outputs. The analyses, though restricted to a small number of films, intend to re-evaluate Herzog’s position within, and contribution to, film history.

Relevance: 10.00%

Abstract:

This article examines utopian gestures and inaugural desires in two films which became symbolic of the Brazilian Film Revival in the late 1990s: Central Station (1998) and Midnight (1999). Both revolve around the idea of an overcrowded or empty centre in a country trapped between past and future, in which the motif of the zero stands for both the announcement and the negation of utopia. The analysis draws parallels between them and new wave films which also elaborate on the idea of the zero, with examples picked from Italian neo-realism, the Brazilian Cinema Novo and the New German Cinema. In Central Station, the ‘point zero’, or the core of the homeland, is retrieved in the archaic backlands, where political issues are resolved in the private sphere and the social drama turns into family melodrama. Midnight, in its turn, recycles Glauber Rocha’s utopian prophecies in the new millennium’s hour zero, when the earthly paradise represented by the sea is re-encountered by the middle-class character, but not by the poor migrant. In both cases, public injustice is compensated by the heroes’ personal achievements, but those do not refer to the real nation, its history or society. Their utopian breadth, based on nostalgia, citation and genre techniques, is of a virtual kind, attuned to cinema only.

Relevance: 10.00%

Abstract:

Brain activity can be measured non-invasively with functional imaging techniques. Each pixel in such an image represents a neural mass of about 10^5 to 10^7 neurons. Mean field models (MFMs) approximate their activity by averaging out neural variability while retaining salient underlying features, like neurotransmitter kinetics. However, MFMs incorporating the regional variability, realistic geometry and connectivity of cortex have so far appeared intractable. This lack of biological realism has led to a focus on gross temporal features of the EEG. We address these impediments and showcase a "proof of principle" forward prediction of co-registered EEG/fMRI for a full-size human cortex in a realistic head model with anatomical connectivity, see figure 1. MFMs usually assume homogeneous neural masses, isotropic long-range connectivity and simplistic signal expression to allow rapid computation with partial differential equations. But these approximations are insufficient in particular for the high spatial resolution obtained with fMRI, since different cortical areas vary in their architectonic and dynamical properties, have complex connectivity, and can contribute non-trivially to the measured signal. Our code instead supports the local variation of model parameters and freely chosen connectivity for many thousand triangulation nodes spanning a cortical surface extracted from structural MRI. This allows the introduction of realistic anatomical and physiological parameters for cortical areas and their connectivity, including both intra- and inter-area connections. Proper cortical folding and conduction through a realistic head model is then added to obtain accurate signal expression for a comparison to experimental data. To showcase the synergy of these computational developments, we predict simultaneously EEG and fMRI BOLD responses by adding an established model for neurovascular coupling and convolving "Balloon-Windkessel" hemodynamics.
We also incorporate regional connectivity extracted from the CoCoMac database [1]. Importantly, these extensions can be easily adapted according to future insights and data. Furthermore, while our own simulation is based on one specific MFM [2], the computational framework is general and can be applied to models favored by the user. Finally, we provide a brief outlook on improving the integration of multi-modal imaging data through iterative fits of a single underlying MFM in this realistic simulation framework.
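The kind of computation involved can be sketched with a generic firing-rate mean field model with per-node parameters and freely chosen connectivity (the equation, names, and parameter values below are illustrative assumptions, not the specific MFM of [2]):

```python
import numpy as np

def simulate_mfm(W, I, tau, steps=2000, dt=1e-3, seed=0):
    """Euler integration of a generic firing-rate mean field model:

        tau_i * dx_i/dt = -x_i + tanh( sum_j W_ij x_j + I_i )

    W is an arbitrary connectivity matrix (intra- and inter-area),
    while the drive I and time constant tau may vary per node,
    standing in for the regional heterogeneity of cortical areas.
    """
    rng = np.random.default_rng(seed)
    n = W.shape[0]
    x = 0.1 * rng.standard_normal(n)     # small random initial state
    trace = np.empty((steps, n))
    for t in range(steps):
        x = x + (dt / tau) * (-x + np.tanh(W @ x + I))
        trace[t] = x
    return trace
```

In the full framework the simulated node activity would then be projected through a realistic head model for EEG and convolved with Balloon-Windkessel hemodynamics for the BOLD signal; both steps are omitted in this sketch.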

Relevance: 10.00%

Abstract:

The extent to which past climate change has dictated the pattern and timing of the out-of-Africa expansion by anatomically modern humans is currently unclear [Stewart JR, Stringer CB (2012) Science 335:1317–1321]. In particular, the incompleteness of the fossil record makes it difficult to quantify the effect of climate. Here, we take a different approach to this problem; rather than relying on the appearance of fossils or archaeological evidence to determine arrival times in different parts of the world, we use patterns of genetic variation in modern human populations to determine the plausibility of past demographic parameters. We develop a spatially explicit model of the expansion of anatomically modern humans and use climate reconstructions over the past 120 ky based on the Hadley Centre global climate model HadCM3 to quantify the possible effects of climate on human demography. The combinations of demographic parameters compatible with the current genetic makeup of worldwide populations indicate a clear effect of climate on past population densities. Our estimates of this effect, based on population genetics, capture the observed relationship between current climate and population density in modern hunter–gatherers worldwide, providing supporting evidence for the realism of our approach. Furthermore, although we did not use any archaeological and anthropological data to inform the model, the arrival times in different continents predicted by our model are also broadly consistent with the fossil and archaeological records. Our framework provides the most accurate spatiotemporal reconstruction of human demographic history available at present and will allow for a greater integration of genetic and archaeological evidence.
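The flavour of such a spatially explicit model can be sketched as logistic growth plus nearest-neighbour migration on a grid, with a carrying-capacity field K standing in for the climate-driven population densities (all names and parameter values here are illustrative assumptions, not those of the actual model):

```python
import numpy as np

def arrival_times(K, origin, r=0.3, m=0.05, thresh=0.1, max_steps=2000):
    """First time step at which each cell's population exceeds
    thresh * K, starting from a single fully occupied origin cell.

    K : 2D array of climate-driven carrying capacities (> 0)
    r : logistic growth rate;  m : migration fraction per step.
    The grid wraps at the edges; this is a toy stand-in for a
    climate-reconstruction-driven demographic model.
    """
    N = np.zeros_like(K, dtype=float)
    N[origin] = K[origin]
    arrival = np.full(K.shape, -1, dtype=int)
    arrival[origin] = 0
    for t in range(1, max_steps + 1):
        N = N + r * N * (1.0 - N / K)   # logistic growth toward K
        flux = m * N                     # migrants leaving each cell
        N = N - flux
        for shift, axis in ((1, 0), (-1, 0), (1, 1), (-1, 1)):
            N = N + np.roll(flux, shift, axis=axis) / 4.0
        newly = (arrival < 0) & (N > thresh * K)
        arrival[newly] = t
        if (arrival >= 0).all():
            break
    return arrival
```

The paper's approach is to constrain parameters like r and m against genetic variation rather than fossils; in the toy version one can at least verify that predicted arrival times increase with distance from the origin.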

Relevance: 10.00%

Abstract:

• In current models, the ecophysiological effects of CO2 create both woody thickening and terrestrial carbon uptake, as observed now, and forest cover and terrestrial carbon storage increases that took place after the last glacial maximum (LGM). Here, we aimed to assess the realism of modelled vegetation and carbon storage changes between LGM and the pre-industrial Holocene (PIH).
• We applied Land Processes and eXchanges (LPX), a dynamic global vegetation model (DGVM), with lowered CO2 and LGM climate anomalies from the Palaeoclimate Modelling Intercomparison Project (PMIP II), and compared the model results with palaeodata.
• Modelled global gross primary production was reduced by 27–36% and carbon storage by 550–694 Pg C compared with PIH. Comparable reductions have been estimated from stable isotopes. The modelled areal reduction of forests is broadly consistent with pollen records. Despite reduced productivity and biomass, tropical forests accounted for a greater proportion of modelled land carbon storage at LGM (28–32%) than at PIH (25%).
• The agreement between palaeodata and model results for LGM is consistent with the hypothesis that the ecophysiological effects of CO2 influence tree–grass competition and vegetation productivity, and suggests that these effects are also at work today.

Relevance: 10.00%

Abstract:

One of the most problematic aspects of the ‘Harvard School’ of liberal international theory is its failure to fulfil its own methodological ideals. Although Harvard School liberals subscribe to a nomothetic model of explanation, in practice they employ their theories as heuristic resources. Given this practice, we should expect them neither to develop candidate causal generalizations nor to be value-neutral: their explanatory insights are underpinned by value-laden choices about which questions to address and what concepts to employ. A key question for liberal theorists, therefore, is how a theory may be simultaneously explanatory and value-oriented. The difficulties inherent in resolving this problem are manifested in Ikenberry’s writing: whilst his work on constitutionalism in international politics partially fulfils the requirements of a more satisfactory liberal explanatory theory, his recent attempts to develop prescriptions for US foreign policy reproduce, in a new form, key failings of Harvard School realism.

Relevance: 10.00%

Abstract:

The recent recovery of an empirically and ethically richer realist tradition involves an explicit contrast with neorealism's more scientistic explanatory aspirations. This contrast is, however, incomplete. Although Waltz's theoretical work is shaped by his understanding of the requirements of scientific adequacy, his empirical essays are normatively quite rich: he defends bipolarity, and criticizes US adventurism overseas, because he believes bipolarity to be conducive to effective great power management of the international system, and hence to the avoidance of nuclear war. He is, in this sense, a theorist divided against himself: much of his oeuvre exhibits precisely the kind of pragmatic sensibility that is typically identified as distinguishing realism from neorealism. His legacy for a reoriented realism is therefore more complex than is usually realized. Indeed, the nature of Waltz's own analytical endeavour points towards a kind of international political theory in which explanatory and normative questions are intertwined.

Relevance: 10.00%

Abstract:

Forgetting immediate physical reality and being aware of one's location in the simulated world are critical to enjoyment and performance in virtual environments, be it an interactive 3D game such as Quake or an online virtual 3D community space such as Second Life. Answers to the question "where am I?" at two levels (whether the locus is in the immediate real world as opposed to the virtual world, and whether one is aware of the spatial co-ordinates of that locus) hold the key to any virtual 3D experience. While 3D environments, especially virtual environments, and their impact on spatial comprehension have been studied in disciplines such as architecture, it is difficult to determine the relative contributions of specific attributes such as screen size or stereoscopy towards spatial comprehension, since most of these studies treat the technology as a monolith (box-centered). Using as its theoretical basis the variable-centered approach put forth by Nass and Mason (1990), which breaks down the technology into its component variables and their corresponding values, this paper looks at the contributions of five variables common to most virtual environments (stereoscopy, screen size, field of view, level of realism and level of detail) to spatial comprehension and presence. The variable-centered approach can be daunting, as an increase in the number of variables can exponentially increase the number of conditions and resources required. We overcome this drawback by using a fractional factorial design for the experiment. The study has completed the first wave of data collection; the next phase starts in January 2007 and is expected to be complete by February 2007. Theoretical and practical implications of the study are discussed.
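A fractional factorial design of the kind mentioned can be sketched for the five variables as a standard 2^(5-1) half-fraction with generator E = ABCD (the factor coding below is an illustrative assumption, not the study's actual design matrix):

```python
from itertools import product

def half_fraction(k=5):
    """2^(k-1) fractional factorial design: a full factorial in the
    first k - 1 two-level factors (-1 = low, +1 = high), with the last
    factor set by the generator (E = ABCD for k = 5).  For five factors
    (e.g. stereoscopy, screen size, field of view, level of realism,
    level of detail) this gives 16 conditions instead of 32."""
    runs = []
    for levels in product((-1, 1), repeat=k - 1):
        last = 1
        for v in levels:
            last *= v          # generator: last factor = product of the rest
        runs.append(levels + (last,))
    return runs
```

The cost of halving the number of conditions is that high-order interactions are aliased with lower-order effects; that trade-off is what makes a variable-centered experiment with this many factors affordable.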