807 results for dilemmatic spaces
Abstract:
The contribution described in this paper is an algorithm for learning nonlinear, reference-tracking control policies given no prior knowledge of the dynamical system and limited interaction with the system throughout the learning process. Concepts from reinforcement learning, Bayesian statistics, and classical control are brought together in the formulation of this algorithm, which can be viewed as a form of indirect self-tuning regulator. On the task of reference tracking using a simulated inverted pendulum, the algorithm was shown to yield generally improved performance over the best controller derived from the standard linear-quadratic method, using only 30 s of total interaction with the system. Finally, the algorithm was shown to work on a simulated double pendulum, demonstrating its ability to solve nontrivial control tasks. © 2011 IEEE.
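The indirect self-tuning structure the abstract describes (identify a model from interaction data, then design a controller for the identified model) can be sketched as follows. This is an illustrative sketch only, not the paper's algorithm: the true system, noise level, and LQR weights below are invented for the example.

```python
# Sketch of an indirect self-tuning loop: (1) interact, (2) identify a linear
# model by least squares, (3) design an LQR controller for the identified model.
import numpy as np

rng = np.random.default_rng(0)

# Unknown true discrete-time system (the learner never sees A_true, B_true).
A_true = np.array([[1.0, 0.1], [0.0, 1.0]])
B_true = np.array([[0.0], [0.1]])

# 1) Interact: excite the system with random inputs and record transitions.
X, U, Xn = [], [], []
x = np.zeros((2, 1))
for _ in range(200):
    u = rng.normal(size=(1, 1))
    x_next = A_true @ x + B_true @ u + 1e-3 * rng.normal(size=(2, 1))
    X.append(x.ravel()); U.append(u.ravel()); Xn.append(x_next.ravel())
    x = x_next

# 2) Identify: least-squares fit of [A B] from the recorded transitions.
Z = np.hstack([np.array(X), np.array(U)])            # (N, 3)
Theta, *_ = np.linalg.lstsq(Z, np.array(Xn), rcond=None)
A_hat, B_hat = Theta.T[:, :2], Theta.T[:, 2:]

# 3) Design: discrete-time LQR by Riccati value iteration on the fitted model.
Q, R = np.eye(2), np.eye(1)
P = Q.copy()
for _ in range(500):
    K = np.linalg.solve(R + B_hat.T @ P @ B_hat, B_hat.T @ P @ A_hat)
    P = Q + A_hat.T @ P @ (A_hat - B_hat @ K)

# Spectral radius of the closed loop on the *true* system; < 1 means stable.
rho = max(abs(np.linalg.eigvals(A_true - B_true @ K)))
```

The paper's method additionally handles nonlinear dynamics and reference tracking; the sketch only shows the identify-then-design loop that makes the regulator "indirect".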
Abstract:
The airflow and thermal stratification produced by a localised heat source located at floor level in a closed room is of considerable practical interest and is commonly referred to as a 'filling box'. In rooms with low aspect ratios H/R ≲ 1 (room height H to characteristic horizontal dimension R) the thermal plume spreads laterally on reaching the ceiling and a descending horizontal 'front' forms, separating a stably stratified, warm upper region from cooler air below. The stratification is well predicted for H/R ≲ 1 by the original filling box model of Baines and Turner (J. Fluid Mech. 37 (1968) 51). This model represents a somewhat idealised situation of a plume rising from a point source of buoyancy alone; in particular, the momentum flux at the source is zero. In practical situations, real sources of heating and cooling in a ventilation system often include initial fluxes of both buoyancy and momentum, e.g. where a heating system vents warm air into a space. This paper describes laboratory experiments to determine the dependence of the 'front' formation and stratification on the source momentum and buoyancy fluxes of a single source, and on the location and relative strengths of two sources from which momentum and buoyancy fluxes were supplied separately. For a single source with a non-zero input of momentum, the rate of descent of the front is more rapid than for the case of zero source momentum flux and increases with increasing momentum input. Increasing the source momentum flux effectively increases the height of the enclosure, and leads to enhanced overturning motions and finally to complete mixing for highly momentum-driven flows. Stratified flows may be maintained by reducing the aspect ratio of the enclosure. At these low aspect ratios different long-time behaviour is observed depending on the nature of the heat input. A constant heat flux always produces a stratified interior at large times.
On the other hand, a constant temperature supply ultimately produces a well-mixed space at the supply temperature. For separate sources of momentum and buoyancy, the developing stratification is shown to be strongly dependent on the separation of the sources and their relative strengths. Even at small separation distances the stratification initially exhibits horizontal inhomogeneity with localised regions of warm fluid (from the buoyancy source) and cool fluid. This inhomogeneity is less pronounced as the strength of one source is increased relative to the other. Regardless of the strengths of the sources, a constant buoyancy flux source dominates after sufficiently large times, although the strength of the momentum source determines whether the enclosure is initially well mixed (strong momentum source) or stably stratified (weak momentum source). © 2001 Elsevier Science Ltd. All rights reserved.
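The descent of the first front in the classical Baines and Turner model can be sketched numerically. A minimal sketch, assuming the standard pure-plume volume-flux law Q(z) = C F^(1/3) z^(5/3), where C depends on the entrainment coefficient; the value of C and all dimensions below are invented for illustration and are not the paper's experimental parameters.

```python
# Front descent in an idealised filling box: the front at height h above the
# floor descends at the rate the plume carries fluid past it, dh/dt = -Q(h)/S.
C = 0.1     # plume volume-flux constant (assumed value)
F = 1.0e-3  # source buoyancy flux, m^4 s^-3 (invented)
S = 1.0     # plan area of the enclosure, m^2 (invented)
H = 2.0     # enclosure height, m (invented)

def Q(z):
    # Volume flux of a pure plume after rising a distance z from a point source.
    return C * F ** (1 / 3) * z ** (5 / 3)

# Forward-Euler integration of dh/dt = -Q(h)/S, front starting at the ceiling.
dt, h = 0.01, H
for _ in range(10000):  # 100 s of model time
    h -= dt * Q(h) / S

# The ODE also has a closed-form solution for comparison:
#   h(t) = (H^(-2/3) + (2 C F^(1/3) / (3 S)) t)^(-3/2)
t = 100.0
h_exact = (H ** (-2 / 3) + (2 * C * F ** (1 / 3) / (3 * S)) * t) ** (-1.5)
```

A non-zero source momentum flux, as studied in the paper, makes Q(z) larger near the source and hence speeds the front's descent; the sketch covers only the zero-momentum baseline.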
Abstract:
Convergence analysis of consensus algorithms is revisited in the light of the Hilbert distance. The Lyapunov function used in the early analysis by Tsitsiklis is shown to be the Hilbert distance to consensus in log coordinates. Birkhoff's theorem, which proves contraction of the Hilbert metric for any positive homogeneous monotone map, provides an early yet general convergence result for consensus algorithms. Because Birkhoff's theorem holds in arbitrary cones, we extend consensus algorithms to the cone of positive-definite matrices. The proposed generalization finds applications in the convergence analysis of quantum stochastic maps, which are a generalization of stochastic maps to non-commutative probability spaces. © 2010 IEEE.
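The contraction the abstract invokes is easy to observe numerically. A minimal sketch, with an invented positive row-stochastic matrix: a linear consensus iteration x ← Ax is a positive homogeneous monotone map, so by Birkhoff's theorem it contracts the Hilbert projective metric, and for a positive vector the Hilbert distance to the consensus ray reduces to log(max_i x_i / min_i x_i).

```python
import numpy as np

def hilbert_to_consensus(x):
    # Hilbert projective distance from a positive vector to the consensus ray
    # (the ray spanned by the all-ones vector).
    return float(np.log(x.max() / x.min()))

# Positive, row-stochastic matrix and positive initial state (both invented).
A = np.array([[0.6, 0.3, 0.1],
              [0.2, 0.5, 0.3],
              [0.1, 0.4, 0.5]])
x = np.array([1.0, 4.0, 9.0])

# Iterate x <- A x and record the Hilbert distance to consensus at each step.
dists = [hilbert_to_consensus(x)]
for _ in range(20):
    x = A @ x
    dists.append(hilbert_to_consensus(x))
```

The recorded distances decrease geometrically toward zero, which is exactly the Lyapunov-function behaviour the paper identifies in Tsitsiklis's analysis.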
Abstract:
State-of-the-art speech recognisers are usually based on hidden Markov models (HMMs). They model a hidden symbol sequence with a Markov process, with the observations independent given that sequence. These assumptions yield efficient algorithms, but limit the power of the model. An alternative model that allows a wide range of features, including word- and phone-level features, is a log-linear model. To handle, for example, word-level variable-length features, the original feature vectors must be segmented into words. Thus, decoding must find the optimal combination of segmentation of the utterance into words and word sequence. Features must therefore be extracted for each possible segment of audio. For many types of features, this becomes slow. In this paper, long-span features are derived from the likelihoods of word HMMs. Derivatives of the log-likelihoods, which break the Markov assumption, are appended. Previously, decoding with this model took cubic time in the length of the sequence, and longer for higher-order derivatives. This paper shows how to decode in quadratic time. © 2013 IEEE.
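The joint search over segmentations and word sequences that the abstract describes can be sketched with a standard segmental dynamic programme. This is an illustrative sketch, not the paper's decoder (which achieves quadratic time for the HMM-derived long-span features); the toy score function is invented.

```python
import math

def decode(T, score):
    # best[t] = best total score of any segmentation of frames [0, t);
    # the double loop enumerates O(T^2) candidate segments (s, t).
    best = [-math.inf] * (T + 1)
    back = [0] * (T + 1)
    best[0] = 0.0
    for t in range(1, T + 1):
        for s in range(t):
            cand = best[s] + score(s, t)
            if cand > best[t]:
                best[t], back[t] = cand, s
    # Trace back the optimal segment boundaries.
    cuts, t = [], T
    while t > 0:
        cuts.append((back[t], t))
        t = back[t]
    return best[T], cuts[::-1]

# Toy segment score: favour segments of exactly 3 frames (invented).
best_score, segs = decode(9, lambda s, t: 1.0 if t - s == 3 else -0.5)
```

Note that if evaluating `score(s, t)` itself costs time proportional to the segment length, the total work becomes cubic; the paper's contribution is organising the feature computation so the overall decode stays quadratic.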
Abstract:
Cubic boron nitride (c-BN) films were deposited on Si(001) substrates in an ion-beam-assisted deposition (IBAD) system under various conditions, and the growth parameter spaces and optical properties of the c-BN films were investigated systematically. The results indicate that suitable ion bombardment is necessary for the growth of c-BN films, and a well-defined parameter space can be established using the P/a parameter. The refractive index of BN films remains constant at 1.8 for c-BN contents below 50%, while for films with a higher cubic-phase fraction it increases with c-BN content, from 1.8 at χc = 50% to 2.1 at χc = 90%. Furthermore, the relationship between n and ρ for BN films can be described by the Anderson-Schreiber equation, and the overlap field parameter γ is determined to be 2.05.
Abstract:
The storage of photoexcited electron-hole pairs is demonstrated experimentally and realized theoretically by transferring electrons in both real and k space through resonant Γ-X transfer in an AlAs/GaAs heterostructure. This is evidenced by the peculiar capacitance jump and hysteresis in the measured capacitance-voltage curves. Our structure may be used as a photonic memory cell with a long storage time and fast retrieval of photons.
Abstract:
In a recent seminal paper, Gibson and Wexler (1993) take important steps toward formalizing the notion of language learning in a (finite) space whose grammars are characterized by a finite number of parameters. They introduce the Triggering Learning Algorithm (TLA) and show that even in a finite space, convergence may be a problem due to local maxima. In this paper we explicitly formalize learning in a finite parameter space as a Markov structure whose states are parameter settings. We show that this captures the dynamics of the TLA completely and allows us to explicitly compute the rates of convergence for the TLA and other variants of the TLA, e.g. random walk. Also included in the paper are a corrected version of GW's central convergence proof, a list of "problem states" in addition to local maxima, and batch and PAC-style learning bounds for the model.
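Once learning is formalized as a Markov chain over parameter settings, convergence rates follow from standard absorbing-chain calculations. A minimal sketch with invented transition probabilities (not GW's actual three-parameter space): the target grammar is an absorbing state, and the expected number of examples to convergence from each start state is obtained from the transient block of the transition matrix.

```python
import numpy as np

# States 0..2 are non-target parameter settings; state 3 is the target
# grammar, made absorbing. All probabilities are invented for illustration.
P = np.array([[0.5, 0.3, 0.0, 0.2],
              [0.2, 0.5, 0.1, 0.2],
              [0.0, 0.2, 0.7, 0.1],
              [0.0, 0.0, 0.0, 1.0]])

# Expected absorption times: t = (I - Q)^{-1} 1, with Q the transient block.
Q = P[:3, :3]
t = np.linalg.solve(np.eye(3) - Q, np.ones(3))
```

In the same framework, a local maximum in the GW sense shows up as a transient state (or set of states) with no path to the target, for which the system above would have no finite solution.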
Abstract:
Eckerdal, A., McCartney, R., Moström, J. E., Sanders, K., Thomas, L., and Zander, C. 2007. From Limen to Lumen: computing students in liminal spaces. In Proceedings of the Third International Workshop on Computing Education Research (Atlanta, Georgia, USA, September 15-16, 2007). ICER '07. ACM, New York, NY, 123-132.
Abstract:
We consider the problem of variable selection in regression modeling in high-dimensional spaces where there is known structure among the covariates. This is an unconventional variable selection problem for two reasons: (1) the dimension of the covariate space is comparable to, and often much larger than, the number of subjects in the study, and (2) the covariate space is highly structured, and in some cases it is desirable to incorporate this structural information into the model-building process. We approach this problem through the Bayesian variable selection framework, where we assume that the covariates lie on an undirected graph and formulate an Ising prior on the model space for incorporating structural information. Certain computational and statistical problems arise that are unique to such high-dimensional, structured settings, the most interesting being the phenomenon of phase transitions. We propose theoretical and computational schemes to mitigate these problems. We illustrate our methods on two different graph structures: the linear chain and the regular graph of degree k. Finally, we use our methods to study a specific application in genomics: the modeling of transcription factor binding sites in DNA sequences. © 2010 American Statistical Association.
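The role of an Ising prior on the model space can be sketched on the linear-chain graph the abstract mentions. This is an illustrative sketch with invented coefficients, not the paper's parameterisation: inclusion indicators of neighbouring covariates are encouraged to agree, so contiguous blocks of selected variables are favoured over scattered selections of the same size.

```python
def ising_log_prior(gamma, a=-1.0, b=0.5):
    # Unnormalised log-density of an Ising prior on 0/1 inclusion indicators
    # along a linear chain. a < 0 penalises model size; b > 0 rewards
    # agreement between neighbouring indicators. (a, b values are invented.)
    sparsity = a * sum(gamma)
    smoothness = b * sum(1 if gamma[i] == gamma[i + 1] else -1
                         for i in range(len(gamma) - 1))
    return sparsity + smoothness

# Two models of the same size: a contiguous block vs. a scattered selection.
block = (0, 1, 1, 1, 0, 0)
scattered = (1, 0, 1, 0, 1, 0)
```

Under this prior the block scores higher than the scattered set, which is precisely how the known covariate structure enters the Bayesian variable selection.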
Abstract:
We present iterative algorithms for solving linear inverse problems with discrete data and compare their performances with the method of singular function expansion, in view of applications in optical imaging and particle sizing.
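The comparison the abstract draws can be sketched on a small, well-posed problem. A minimal sketch with an invented matrix and noiseless data, using Landweber's classical iterative scheme (one standard iterative method for linear inverse problems; the abstract does not say this is the authors' algorithm) against a singular-value (singular function) expansion.

```python
import numpy as np

# Invented discrete forward operator and exact data b = A x_true.
A = np.array([[1.0, 0.5, 0.2],
              [0.5, 1.0, 0.5],
              [0.2, 0.5, 1.0]])
x_true = np.array([1.0, -2.0, 0.5])
b = A @ x_true

# Landweber iteration: x_{k+1} = x_k + tau * A^T (b - A x_k),
# convergent for 0 < tau < 2 / ||A||_2^2.
tau = 1.0 / np.linalg.norm(A, 2) ** 2
x = np.zeros(3)
for _ in range(2000):
    x = x + tau * A.T @ (b - A @ x)

# Singular-function (SVD) expansion of the solution, keeping all modes.
U, s, Vt = np.linalg.svd(A)
x_svd = Vt.T @ ((U.T @ b) / s)
```

With noisy data the two methods regularise differently: the SVD expansion is truncated at small singular values, while Landweber is stopped early, and comparing such choices is the kind of performance study the abstract describes.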
Abstract:
In The Eye of Power, Foucault delineated the key concerns surrounding hospital architecture in the latter half of the eighteenth century as being the ‘visibility of bodies, individuals and things'. As such, the ‘new form of hospital' that came to be developed ‘was at once the effect and support of a new type of gaze'. This was a gaze that was not simply concerned with ways of minimising overcrowding or cross-contamination. Rather, this was a surveillance intended to produce knowledge about the pathological bodies contained within the hospital walls. This would then allow for their appropriate classification. Foucault went on to describe how these principles came to be applied to the architecture of prisons. This was exemplified for him in the distinct shape of Bentham's panopticon. This circular design, which has subsequently become an often misused synonym for a contemporary culture of surveillance, was premised on a binary of the seen and the not-seen. An individual observer could stand at the central point of the circle and observe the cells (and their occupants) on the perimeter whilst themselves remaining unseen. The panopticon in its purest form was never constructed, yet it conveys the significance of the production of knowledge through observation that became central to institutional design at this time and to modern thought more broadly. What is curious, though, is that whilst the aim of those late eighteenth-century buildings was to produce well-ventilated spaces suffused with light, this provoked an interest in its opposite. The gothic movement in literature that was developing in parallel conversely took a ‘fantasy world of stone walls, darkness, hideouts and dungeons…' as its landscape (Vidler, 1992: 162). Curiously, despite these modern developments in prison design, the façade took on these characteristics. The gothic imagination came to describe that unseen world that lay behind the outer wall. This is what Evans refers to as an architectural ‘hoax'.
The façade was taken to represent the world within the prison walls, and it was the façade that came to inform the popular imagination about what occurred behind it. The rational, modern principles ordering the prison became conflated with the meanings projected by and onto the façade. This confusion of meanings has since been repeated and reinforced in subsequent representations of the prison. This is of paramount importance since it is the cinematic and televisual representation of the prison, as I argue here and elsewhere, that maintains this erroneous set of meanings, this ‘hoax'.