29 results for automata intersection
in CentAUR: Central Archive, University of Reading - UK
Abstract:
We present results from fast-response wind measurements within and above a busy intersection between two street canyons (Marylebone Road and Gloucester Place) in Westminster, London, taken as part of the DAPPLE (Dispersion of Air Pollution and Penetration into the Local Environment; www.dapple.org.uk) 2007 field campaign. The data reported here were collected using ultrasonic anemometers on the roof-top of a building adjacent to the intersection and at two heights on a pair of lamp-posts on opposite sides of the intersection. Site characteristics, data analysis and the variation of intersection flow with the above-roof wind direction (θref) are discussed. Evidence of both flow channelling and recirculation was identified within the canyon, only a few metres from the intersection, for along-street and across-street roof-top winds respectively. Results also indicate that for oblique roof-top flows, the intersection flow is a complex combination of bifurcated channelled flows, recirculation and corner vortices. Asymmetries in local building geometry around the intersection and small changes in the background wind direction (changes in 15-min mean θref of 5–10 degrees) were also observed to have profound influences on the behaviour of intersection flow patterns. Consequently, short time-scale variability in the background flow direction can lead to highly scattered in-street mean flow angles, masking the true multi-modal features of the flow and thus further complicating modelling challenges.
Abstract:
Researchers at the University of Reading have, over many years, developed simple mobile robots that explore an environment perceived through simple ultrasonic sensors. Information from these sensors has allowed the robots to learn the simple task of moving around while avoiding dynamic obstacles using a static set of fuzzy automata, a choice that has been criticised for its arbitrary nature. This paper considers how a dynamic set of automata can overcome this criticism. In addition, a new reinforcement learning function is outlined that scales to different numbers and types of sensors. The innovations compare favourably with earlier work.
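The abstract gives no implementation detail; as a minimal sketch of the kind of variable-structure learning automaton this line of work builds on (the class name and the a/b learning rates below are illustrative assumptions, and the paper's fuzzy automata are not reproduced here):

```python
import random

class LearningAutomaton:
    """Minimal variable-structure learning automaton (linear reward-penalty).

    Illustrative only: the paper's fuzzy automata and its sensor-scalable
    reinforcement function are not given in the abstract, so this is the
    textbook L_R-P update with assumed learning rates a and b.
    """

    def __init__(self, actions, a=0.1, b=0.05):
        self.actions = list(actions)      # e.g. ["forward", "left", "right"]
        n = len(self.actions)
        self.p = [1.0 / n] * n            # uniform initial action probabilities
        self.a, self.b = a, b             # reward and penalty learning rates

    def choose(self):
        return random.choices(range(len(self.p)), weights=self.p)[0]

    def update(self, chosen, rewarded):
        n = len(self.p)
        for i in range(n):
            if rewarded:                  # shift mass towards the rewarded action
                self.p[i] = (self.p[i] + self.a * (1 - self.p[i])
                             if i == chosen else self.p[i] * (1 - self.a))
            else:                         # redistribute mass away from it
                self.p[i] = (self.p[i] * (1 - self.b) if i == chosen
                             else self.b / (n - 1) + self.p[i] * (1 - self.b))
```

In use, an automaton like this would be rewarded or penalised from the sonar-derived obstacle signal; the dynamic-set idea in the paper adds and removes automata rather than fixing the set in advance.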
Abstract:
The recent celebrations of the centenary of the publication of the Futurist manifesto led to a renewed discussion of the ideas and artworks of the Italian artists’ group. Jacques Rancière related the Futurist ethos to the modernist project of liberating art from representation. Franco ‘Bifo’ Berardi, in his post-Futurist manifesto, also identified a historical irony at play in the emptying out of Futurism’s promise: a liberated mechanical humanity did indeed materialize, in a global economic system premised on financial servitude to the future via debt. However, these models continue to assess Futurism against an unchallenged humanism, finding it either supporting ideals of freedom and human rights despite itself, or else lacking in these areas. But Futurism is potentially more relevant than ever not in spite of its anti-humanist agenda but precisely because of it. Tom McCarthy annexes not Futurist art but Futurist writing to an emerging object-oriented ontology that seeks to challenge the primacy of the human. If Futurism is to be repurposed as a critical concept, it can do so only by countering the humanist myth of the liberal subject that underlies the current cultural and political hegemony of neo-liberalism.
Abstract:
The problem of a manipulator operating in a noisy workspace and required to move from an initial fixed position P0 to a final position Pf is considered. However, Pf is corrupted by noise, giving rise to an estimate P̂f, which may be obtained by sensors. The use of learning automata is proposed to tackle this problem. An automaton is placed at each joint of the manipulator, which moves according to the action chosen by the automaton (forward, backward, stationary) at each instant. The simultaneous reward or penalty of the automata avoids the inverse kinematics computations that would be necessary if the distance of each joint from the final position had to be calculated. Three variable-structure learning algorithms are used: the discretized linear reward-penalty (DLR-P), the linear reward-penalty (LR-P) and a nonlinear scheme. Each algorithm is separately tested with two (forward, backward) and three (forward, backward, stationary) actions.
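As a rough sketch of the scheme the abstract describes, under stated assumptions: one automaton per joint picks a forward/backward/stationary increment, and every automaton receives the same reward signal based on whether the end-effector moved closer to the noisy target, so no inverse kinematics is needed. The two-link planar arm, the step size and the reward-inaction style update below are illustrative simplifications, not the paper's DLR-P or LR-P algorithms.

```python
import math, random

ACTIONS = [+1, -1, 0]   # forward, backward, stationary joint increments

def end_effector(angles, link=1.0):
    """Planar forward kinematics for a two-revolute-joint arm (illustration)."""
    x = link * math.cos(angles[0]) + link * math.cos(angles[0] + angles[1])
    y = link * math.sin(angles[0]) + link * math.sin(angles[0] + angles[1])
    return x, y

target = (1.2, 0.9)      # stands in for the noisy estimate of the goal position

def dist(q):
    x, y = end_effector(q)
    return math.hypot(x - target[0], y - target[1])

# one action-probability vector per joint: the "automaton" at each joint
p = [[1 / 3] * 3 for _ in range(2)]
angles, step, a = [0.0, 0.0], 0.05, 0.1

for _ in range(2000):
    choices = [random.choices(range(3), weights=pj)[0] for pj in p]
    trial = [q + step * ACTIONS[c] for q, c in zip(angles, choices)]
    rewarded = dist(trial) < dist(angles)   # one common signal for all automata
    if rewarded:
        angles = trial
        for pj, c in zip(p, choices):       # reinforce the rewarded actions
            for i in range(3):
                pj[i] = pj[i] + a * (1 - pj[i]) if i == c else pj[i] * (1 - a)
```

The common reward signal is the key design point: each joint's automaton learns from the global outcome alone, with no per-joint distance computation.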
Abstract:
Contemporary US sitcom is at an interesting crossroads: it has received an increasing amount of scholarly attention (e.g. Mills 2009; Butler 2010; Newman and Levine 2012; Vermeulen and Whitfield 2013), which largely understands it as shifting towards the aesthetically and narratively complex. At the same time, in the post-broadcasting era, US networks are struggling for their audience share. With the days of blockbuster successes like Must See TV’s Friends (NBC 1994-2004) a distant dream, recent US sitcoms are instead turning towards smaller, engaged audiences. Here, a cult sensibility of intertextual in-jokes, temporal and narrational experimentation (e.g. flashbacks and alternate realities) and self-reflexive performance styles has marked shows including Community (NBC 2009-2015), How I Met Your Mother (CBS 2005-2014), New Girl (Fox 2011-present) and 30 Rock (NBC 2006-2013). However, little critical attention has so far been paid to how these developments in textual sensibility in contemporary US sitcom may be influenced by, and in turn influence, the use of transmedia storytelling practices, an increasingly significant industrial concern and rising scholarly field of enquiry (e.g. Jenkins 2006; Mittell 2015; Richards 2010; Scott 2010; Jenkins, Ford and Green 2013). This chapter investigates this mutual influence between sitcom and transmedia by taking as its case studies two network shows that encourage invested viewership through their use of transtexts, namely How I Met Your Mother (hereafter HIMYM) and New Girl (hereafter NG). It pays particular attention to the most transtextually visible character/actor from each show: HIMYM’s Barney Stinson, played by Neil Patrick Harris, and NG’s Schmidt, played by Max Greenfield. The chapter argues that these sitcoms do not simply have their particular textual sensibility and also (happen to) engage with transmedia practices, but that the two are mutually informing and defining. It explores the relationships and interplay between sitcom aesthetics, narratives and transmedia storytelling (or industrial transtexts), focusing on the use of multiple delivery channels to disperse “integral elements of a fiction” (Jenkins 2006: 95-6) by official entities such as the broadcasting channels, and pays due attention to the specific production contexts of both shows and how these inform their approaches to transtexts. The chapter’s conceptual framework is particularly concerned with how issues of texture, the reality envelope and accepted imaginative realism, as well as performance and the actor’s input, inform and illuminate contemporary sitcoms and transtexts; it will be the first scholarly research to do so. It seeks out points of connection between two (thus far) separate strands of scholarship, moves discussions of transtexts beyond the usual genres studied (i.e. science fiction and fantasy), and contributes to the growing scholarship on contemporary sitcom by approaching it from a new critical angle. On the basis that transmedia scholarship stands to benefit from widening its customary genre choice (i.e. telefantasy) for its case studies and from making more use of in-depth close analysis in its engagement with transtexts, the chapter argues that notions of texture, accepted imaginative realism and the reality envelope, as well as performance and the actor’s input, deserve more attention within transtext-related scholarship.
Abstract:
Turbulence statistics obtained by direct numerical simulations are analysed to investigate spatial heterogeneity within regular arrays of building-like cubical obstacles. Two different array layouts are studied, staggered and square, both at a packing density of $\lambda_p = 0.25$. The flow statistics analysed are the mean streamwise velocity ($\overline{u}$), shear stress ($\overline{u'w'}$), turbulent kinetic energy ($k$) and dispersive stress fraction ($\tilde{u}\tilde{w}$). The spatial flow patterns and spatial distribution of these statistics in the two arrays are found to be very different. Local regions of high spatial variability are identified. The overall spatial variances of the statistics are shown to be generally very significant in comparison with their spatial averages within the arrays. Above the arrays the spatial variances as well as dispersive stresses decay rapidly to zero. The heterogeneity is explored further by separately considering six different flow regimes identified within the arrays, described here as: channelling region, constricted region, intersection region, building wake region, canyon region and front-recirculation region. It is found that the flow in the first three regions is relatively homogeneous, but that spatial variances in the latter three regions are large, especially in the building wake and canyon regions. The implication is that, in general, the flow immediately behind (and, to a lesser extent, in front of) a building is much more heterogeneous than elsewhere, even in the relatively dense arrays considered here. Most of the dispersive stress is concentrated in these regions. Considering the experimental difficulties of obtaining enough point measurements to form a representative spatial average, the error incurred by degrading the sampling resolution is investigated. It is found that a good estimate for both area and line averages can be obtained using a relatively small number of strategically located sampling points.
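For readers unfamiliar with the double-averaging notation: the dispersive stress measures the contribution of spatial variations of the time-averaged flow about its spatial mean. A minimal sketch of the computation, assuming time-averaged velocity fields on a horizontal grid (the function name and the subsampling check are illustrative, not the authors' code):

```python
import numpy as np

def dispersive_stress(u_bar, w_bar):
    """Estimate the dispersive stress <u~ w~> on one horizontal plane.

    u_bar, w_bar: 2-D arrays of time-averaged velocities sampled over the
    repeating unit of the obstacle array (illustrative sketch only).
    """
    u_tilde = u_bar - u_bar.mean()   # deviation from the spatial average
    w_tilde = w_bar - w_bar.mean()
    return (u_tilde * w_tilde).mean()

# crude version of the paper's sampling-resolution question: compare the
# full-resolution estimate with one from a degraded set of sampling points
u, w = np.random.rand(64, 64), np.random.rand(64, 64)
full = dispersive_stress(u, w)
coarse = dispersive_stress(u[::8, ::8], w[::8, ::8])
```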
Abstract:
The paper describes a field study focused on the dispersion of a traffic-related pollutant within an area close to a busy intersection between two street canyons in Central London. Simultaneous measurements of airflow, traffic flow and carbon monoxide concentrations ([CO]) are used to explore the causes of spatial variability in [CO] over a full range of background wind directions. Depending on the roof-top wind direction, evidence of both flow channelling and recirculation regimes was identified from data collected within the main canyon and the intersection. However, at the intersection, the merging of channelled flows from the canyons increased the flow complexity and turbulence intensity. These features, coupled with the close proximity of nearby queuing traffic in several directions, led to the highest overall time-averaged measured [CO] occurring at the intersection. Within the main street canyon, the data supported the presence of a helical flow regime for oblique roof-top flows, leading to increased [CO] on the canyon leeward side. Predominant wind directions led to some locations having significantly higher diurnal average [CO] because they were mostly on the canyon leeward side during the study period. For all locations, small changes in the background wind direction could cause large changes in the in-street mean wind angle and local turbulence intensity, implying that dispersion mechanisms would be highly sensitive to small changes in above-roof flows. During peak traffic flow periods, concentrations within parallel side streets were approximately four times lower than within the main canyon and intersection, which has implications for controlling personal exposure. Overall, the results illustrate that pollutant concentrations can be highly spatially variable over even short distances within complex urban geometries, and that synoptic wind patterns, traffic queue location and building topologies all play a role in determining where pollutant hot spots occur.
Abstract:
Negative correlations between task performance in dynamic control tasks and verbalizable knowledge, as assessed by a post-task questionnaire, have been interpreted as dissociations that indicate two antagonistic modes of learning, one being “explicit”, the other “implicit”. This paper views the control tasks as finite-state automata and offers an alternative interpretation of these negative correlations. It is argued that “good controllers” observe fewer different state transitions and, consequently, can answer fewer post-task questions about system transitions than can “bad controllers”. Two experiments demonstrate the validity of the argument by showing the predicted negative relationship between control performance and the number of explored state transitions, and the predicted positive relationship between the number of explored state transitions and questionnaire scores. However, the experiments also elucidate important boundary conditions for the critical effects. We discuss the implications of these findings, and of other problems arising from the process control paradigm, for conclusions about implicit versus explicit learning processes.
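A toy simulation makes the paper's core argument concrete (the four-state system and the policies here are hypothetical, not the experiments' tasks): a controller that keeps the system near its goal visits few distinct state transitions, and so has observed fewer of the transitions a post-task questionnaire could ask about than an explorer has.

```python
import random

def run_controller(transitions, start, policy, steps=100):
    """Walk a finite-state automaton and record which transitions were seen.

    transitions: dict mapping (state, action) -> next state.
    policy: function state -> action. Returns the set of explored transitions.
    """
    seen, state = set(), start
    for _ in range(steps):
        action = policy(state)
        nxt = transitions[(state, action)]
        seen.add((state, action, nxt))
        state = nxt
    return seen

# toy 4-state, 2-action system (hypothetical, for illustration only)
T = {(s, a): (s + a) % 4 for s in range(4) for a in (0, 1)}

good = run_controller(T, 0, lambda s: 0)                    # holds the system steady
explorer = run_controller(T, 0, lambda s: random.choice((0, 1)))

# questionnaire score ~ fraction of all transitions the subject has observed
print(len(good) / len(T), len(explorer) / len(T))
```

The "good" controller here sees a single transition over and over, while the explorer samples most of the transition table, reproducing the predicted negative relationship between control performance and explored transitions.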
Abstract:
The different types of surface intersection which may occur in linear configurations of triatomic molecules are reviewed, particularly with regard to the way in which the degeneracy is split as the molecule bends. The Renner-Teller effect in states of symmetry Π, Δ, Φ, etc., and intersections between Σ and Π, Σ and Δ, and Π and Δ states are discussed. A general method of modelling such intersecting potential surfaces is proposed, as a development of the model previously used by Murrell and Carter and co-workers for single-valued surfaces. Some of the lower energy surfaces of H2O, NH2, O3, C3, and HNO are discussed as examples.
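As a generic illustration of modelling intersecting surfaces, not the Murrell and Carter model itself: the two adiabatic surfaces can be written as eigenvalues of a small diabatic matrix whose coupling is switched on by the bending coordinate, so that a pair degenerate at linearity splits as the molecule bends. All functional forms and parameter values below are assumptions for illustration.

```python
import numpy as np

def adiabatic_surfaces(rho, e1=0.0, e2=0.0, c=0.8):
    """Eigenvalues of a 2x2 diabatic model versus bending coordinate rho.

    e1, e2: diabatic energies at linearity (equal for a Renner-Teller-like
    degenerate pair; unequal to mimic an intersection between distinct states).
    c * rho**2 is a generic bending-induced coupling, chosen for illustration.
    """
    h12 = c * rho**2                         # coupling vanishes at linearity
    mean, half = 0.5 * (e1 + e2), 0.5 * (e1 - e2)
    split = np.sqrt(half**2 + h12**2)
    return mean - split, mean + split        # lower and upper adiabatic surfaces

rho = np.linspace(0.0, 1.0, 50)              # bending coordinate
lower, upper = adiabatic_surfaces(rho)       # degeneracy splits as rho grows
```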
Abstract:
A technique is presented for locating and tracking objects in cluttered environments. Agents are randomly distributed across the image and subsequently grouped around targets. Each agent uses a weightless neural network and a histogram intersection technique to score its location. The system has been used to locate and track a head in 320×240 resolution video at up to 15 fps.
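Histogram intersection itself is a standard matching measure (Swain and Ballard); a minimal sketch of how an agent might score its local patch against a target model follows, with the bin counts purely illustrative.

```python
import numpy as np

def histogram_intersection(candidate, model):
    """Histogram intersection: fraction of the model histogram accounted for
    by the candidate region's histogram (score in [0, 1])."""
    return np.minimum(candidate, model).sum() / model.sum()

# hypothetical usage: each agent scores the colour histogram of its local
# image patch against the target (e.g. head) model histogram
model = np.array([12, 40, 30, 8, 10], dtype=float)
patch = np.array([10, 35, 20, 15, 20], dtype=float)
score = histogram_intersection(patch, model)   # agents regroup around high scores
```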
Abstract:
This article discusses the treatment of sentimentality in the fiction of J.M. Barrie, focusing in particular on Tommy and Grizel (1900). It places the discussion in the context of wider debates over sentimentality in Victorian culture and explores the intersection between these and discourses of gender and sexuality in the late nineteenth century.
Abstract:
This paper describes the design and implementation of an agent-based network for the support of collaborative switching tasks within the control room environment of the National Grid Company plc. This work draws on several research disciplines, including operational analysis, human-computer interaction, finite-state modelling techniques, intelligent agents and computer-supported co-operative work. Techniques from these disciplines have been used in the analysis of collaborative tasks to produce distributed local models for all involved users. These models have been used as the basis for the production of local finite-state automata, which have then been embedded within an agent network together with behavioural information extracted from the task and user analysis phase. The resulting support system is capable of task and communication management within the transmission despatch environment.
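The abstract gives no implementation detail; as a generic sketch of the structure it describes, each agent can embed a local finite-state automaton that tracks its user's part of a switching task. The states, events and class name below are hypothetical, not the paper's model.

```python
from dataclasses import dataclass, field

@dataclass
class SwitchingAgent:
    """Toy agent embedding a local finite-state automaton (illustrative only)."""
    state: str = "idle"
    log: list = field(default_factory=list)
    # local FSA: (state, event) -> next state
    fsa: dict = field(default_factory=lambda: {
        ("idle", "request_switch"): "awaiting_confirmation",
        ("awaiting_confirmation", "confirm"): "executing",
        ("executing", "complete"): "idle",
    })

    def handle(self, event):
        nxt = self.fsa.get((self.state, event))
        if nxt is None:                       # event not valid in this state
            self.log.append(f"rejected {event} in state {self.state}")
        else:
            self.log.append(f"{self.state} --{event}--> {nxt}")
            self.state = nxt

agent = SwitchingAgent()
for ev in ("request_switch", "confirm", "complete"):
    agent.handle(ev)   # the agent tracks where the user is in the shared task
```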