709 results for Adaptive game technology


Relevance:

30.00%

Publisher:

Abstract:

Authorised users (insiders) are behind the majority of security incidents with high financial impacts. Because authorisation is the process of controlling users’ access to resources, improving authorisation techniques may mitigate the insider threat. Current approaches to authorisation suffer from the assumption that users will not (and cannot) depart from the expected behaviour implicit in the authorisation policy. In reality, however, users can and do depart from the canonical behaviour. This paper argues that the conflict of interest between insiders and authorisation mechanisms is analogous to a class of problems formally studied in the field of game theory. It proposes a game-theoretic authorisation model that ensures users’ potential misuse of a resource is explicitly considered when making an authorisation decision. The resulting authorisation model is dynamic in the sense that its access decisions vary according to changes in the explicit factors that influence the cost of misuse for both the authorisation mechanism and the insider.
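
To make the costed trade-off concrete, here is a minimal illustrative sketch in Python of an access decision that weighs expected misuse cost against the benefit of legitimate use; the payoff structure, field names and decision rule are our own assumptions for exposition, not the paper's actual model.

```python
from dataclasses import dataclass

@dataclass
class AccessContext:
    benefit_of_use: float       # organisational value of a legitimate use
    cost_of_misuse: float       # organisational damage if the resource is misused
    misuse_probability: float   # current estimate that this insider defects
    expected_sanction: float    # penalty the insider expects if misuse is detected

def grant(ctx: AccessContext) -> bool:
    """Grant only if (a) the mechanism's expected payoff from granting is
    positive and (b) a rational insider is deterred, i.e. the expected
    sanction outweighs the gain from misuse (proxied here by the damage)."""
    mechanism_payoff = ((1 - ctx.misuse_probability) * ctx.benefit_of_use
                        - ctx.misuse_probability * ctx.cost_of_misuse)
    insider_deterred = ctx.cost_of_misuse < ctx.expected_sanction
    return mechanism_payoff > 0 and insider_deterred

# the decision is dynamic: the same request flips to "deny" as the
# estimated misuse probability rises
print(grant(AccessContext(10.0, 100.0, 0.05, 150.0)))  # True
print(grant(AccessContext(10.0, 100.0, 0.20, 150.0)))  # False
```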

Relevance:

30.00%

Publisher:

Abstract:

The performance of an adaptive filter may be studied through the behaviour of the optimal and adaptive coefficients in a given environment. This thesis investigates the performance of finite impulse response adaptive lattice filters for two classes of input signals: (a) frequency modulated signals with polynomial phases of order p in complex Gaussian white noise (as nonstationary signals), and (b) impulsive autoregressive processes with alpha-stable distributions (as non-Gaussian signals). Initially, an overview is given of linear prediction and adaptive filtering. The convergence and tracking properties of the stochastic gradient algorithms are discussed for stationary and nonstationary input signals. It is explained that the stochastic gradient lattice algorithm has many advantages over the least-mean square algorithm, including a modular structure, easily guaranteed stability, lower sensitivity to the eigenvalue spread of the input autocorrelation matrix, and easy quantization of the filter coefficients (normally called reflection coefficients). We then characterize the performance of the stochastic gradient lattice algorithm for frequency modulated signals through the optimal and adaptive lattice reflection coefficients. This is a difficult task due to the nonlinear dependence of the adaptive reflection coefficients on the preceding stages and the input signal. To ease the derivations, we assume that the reflection coefficients of each stage are independent of the inputs to that stage. The optimal lattice filter is then derived for frequency modulated signals by computing the optimal values of the residual errors, reflection coefficients, and recovery errors. Next, we show the tracking behaviour of the adaptive reflection coefficients for frequency modulated signals by computing the average tracking model of these coefficients for the stochastic gradient lattice algorithm. The second-order convergence of the adaptive coefficients is investigated by modeling the theoretical asymptotic variance of the gradient noise at each stage. The accuracy of the analytical results is verified by computer simulations. Using these analytical results, we show a new property of adaptive lattice filters, the polynomial-order-reducing property, which may be used to reduce the order of the polynomial phase of input frequency modulated signals. Two examples show how this property may be used in processing frequency modulated signals. In the first example, a detection procedure is carried out on a frequency modulated signal with a second-order polynomial phase in complex Gaussian white noise. We show that with this technique a better probability of detection is obtained for the reduced-order phase signals than with the traditional energy detector. It is also shown empirically that the distribution of the gradient noise in the first adaptive reflection coefficient approximates the Gaussian law. In the second example, the instantaneous frequency of the same observed signal is estimated. We show that this technique achieves a lower mean square error for the estimated frequencies at high signal-to-noise ratios than the adaptive line enhancer. The performance of adaptive lattice filters is then investigated for the second type of input signals, i.e., impulsive autoregressive processes with alpha-stable distributions.
The concept of alpha-stable distributions is first introduced. We discuss why the stochastic gradient algorithm, which gives desirable results for finite-variance input signals (like frequency modulated signals in noise), does not converge quickly for infinite-variance stable processes (because it uses the minimum mean-square error criterion). To deal with such problems, the minimum dispersion criterion, fractional lower order moments, and recently developed algorithms for stable processes are introduced. We then study the possibility of using the lattice structure for impulsive stable processes. Accordingly, two new algorithms, the least-mean P-norm lattice algorithm and its normalized version, are proposed for lattice filters based on the fractional lower order moments. Simulation results show that the proposed algorithms achieve faster convergence for parameter estimation of autoregressive stable processes with low to moderate degrees of impulsiveness than many other algorithms. We also discuss how the impulsiveness of stable processes generates some misalignment between the estimated parameters and the true values. Due to the infinite variance of stable processes, the performance of the proposed algorithms is investigated using extensive computer simulations only.
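
As a concrete reference point, here is a minimal real-valued sketch of a stochastic gradient (gradient adaptive) lattice update in Python; the step size, the coefficient clipping and all names are our illustrative choices and do not reproduce the thesis's derivations or its proposed P-norm algorithms.

```python
import numpy as np

def gal_filter(x, order=4, mu=0.01):
    """Gradient adaptive lattice sketch: propagate forward/backward
    prediction errors through the stages and adapt each reflection
    coefficient by a stochastic gradient step on f_m^2 + b_m^2."""
    k = np.zeros(order)              # reflection coefficients, one per stage
    b_prev = np.zeros(order + 1)     # backward errors from the previous sample
    k_track = np.zeros((len(x), order))
    for n, sample in enumerate(x):
        f = np.zeros(order + 1)
        b = np.zeros(order + 1)
        f[0] = b[0] = sample         # stage 0 carries the input itself
        for m in range(1, order + 1):
            f[m] = f[m - 1] + k[m - 1] * b_prev[m - 1]
            b[m] = b_prev[m - 1] + k[m - 1] * f[m - 1]
            grad = 2 * (f[m] * b_prev[m - 1] + b[m] * f[m - 1])
            # clip to keep every stage stable
            k[m - 1] = np.clip(k[m - 1] - mu * grad, -0.99, 0.99)
        b_prev = b
        k_track[n] = k
    return k_track

# usage: adapt to a stationary AR(2) process and inspect the final coefficients
rng = np.random.default_rng(0)
x = np.zeros(5000)
for n in range(2, x.size):
    x[n] = 1.2 * x[n - 1] - 0.6 * x[n - 2] + rng.standard_normal()
print(gal_filter(x, order=2, mu=0.005)[-1])
```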

Relevance:

30.00%

Publisher:

Abstract:

This thesis deals with the problem of instantaneous frequency (IF) estimation of sinusoidal signals. This topic plays a significant role in signal processing and communications. Depending on the type of signal, two major approaches are considered. For IF estimation of single-tone or digitally-modulated sinusoidal signals (like frequency shift keying signals), the approach of digital phase-locked loops (DPLLs) is considered in Part-I of this thesis. For FM signals, the approach of time-frequency analysis is considered in Part-II. In Part-I we utilize sinusoidal DPLLs with a non-uniform sampling scheme, as this type is widely used in communication systems. The digital tanlock loop (DTL) has introduced significant advantages over other existing DPLLs, and in the last 10 years many efforts have been made to improve its performance. However, this loop and all of its modifications utilize a Hilbert transformer (HT) to produce a signal-independent 90-degree phase-shifted version of the input signal. The Hilbert transformer can be realized approximately using a finite impulse response (FIR) digital filter. This realization introduces further complexity in the loop, in addition to approximations and frequency limitations on the input signal. We have tried to avoid the practical difficulties associated with the conventional tanlock scheme while keeping its advantages. A time-delay is utilized in the tanlock scheme of the DTL to produce a signal-dependent phase shift, giving rise to the time-delay digital tanlock loop (TDTL). Fixed point theorems are used to analyze the behavior of the new loop. As such, TDTL combines the two major approaches in DPLLs: the non-linear approach of the sinusoidal DPLL based on fixed point analysis, and the linear tanlock approach based on arctan phase detection. TDTL preserves the main advantages of the DTL despite its reduced structure. An application of TDTL to FSK demodulation is also considered. The idea of replacing the HT by a time-delay may be of interest in other signal processing systems; hence we have analyzed and compared the behaviors of the HT and the time-delay in the presence of additive Gaussian noise. Based on this analysis, the behavior of the first- and second-order TDTLs has been analyzed in additive Gaussian noise. Since DPLLs need time for locking, they are normally not efficient in tracking the continuously changing frequencies of non-stationary signals, i.e. signals with time-varying spectra. Nonstationary signals are of importance in synthetic and real-life applications; an example is the frequency-modulated (FM) signals widely used in communication systems. Part-II of this thesis is dedicated to the IF estimation of non-stationary signals. For such signals the classical spectral techniques break down, due to the time-varying nature of their spectra, and more advanced techniques must be utilized. For instantaneous frequency estimation of non-stationary signals there are two major approaches: parametric and non-parametric. We chose the non-parametric approach, which is based on time-frequency analysis; it is computationally less expensive and more effective in dealing with multicomponent signals, which are the main focus of this part of the thesis. A time-frequency distribution (TFD) of a signal is a two-dimensional transformation of the signal to the time-frequency domain. Multicomponent signals can be identified by multiple energy peaks in the time-frequency domain.
Many real-life and synthetic signals are of a multicomponent nature, and there is little in the literature concerning IF estimation of such signals. This is why we have concentrated on multicomponent signals in Part-II. An adaptive algorithm for IF estimation using quadratic time-frequency distributions has been analyzed, and a class of time-frequency distributions that are more suitable for this purpose has been proposed. The kernels of this class are time-only, or one-dimensional, rather than the time-lag (two-dimensional) kernels; hence this class has been named the T-class. If the parameters of these TFDs are properly chosen, they are more efficient than the existing fixed-kernel TFDs in terms of resolution (energy concentration around the IF) and artifact reduction. The T-distributions have been used in the adaptive IF algorithm and proved to be efficient in tracking rapidly changing frequencies. They also enable direct amplitude estimation for the components of a multicomponent signal.
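
For orientation, the non-parametric peak-tracking idea can be sketched in a few lines of Python. Note that the spectrogram below is a fixed-kernel TFD used only as a stand-in; the adaptive T-class distributions proposed in the thesis are not reproduced here.

```python
import numpy as np
from scipy.signal import spectrogram

def estimate_if(x, fs):
    """IF estimate from a TFD: take the frequency of the energy peak
    in each time slice (single-component case for simplicity)."""
    f, t, Sxx = spectrogram(x, fs=fs, nperseg=256, noverlap=192)
    return t, f[np.argmax(Sxx, axis=0)]

# example: linear FM signal in white Gaussian noise; true IF is 500 + 2000*t Hz
fs = 8000.0
t = np.arange(0, 1.0, 1 / fs)
x = np.cos(2 * np.pi * (500 * t + 1000 * t ** 2)) + 0.1 * np.random.randn(t.size)
times, if_hat = estimate_if(x, fs)
```

For multicomponent signals one would track several energy peaks per time slice instead of the single argmax.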

Relevance:

30.00%

Publisher:

Abstract:

This paper argues for a model of adaptive design for sustainable architecture within a framework of entropy evolution. The spectrum of sustainable architecture consists of efficient use of energy and material resources over the life-cycle of buildings, active involvement of the occupants in micro-climate control within the building, and the natural environment as the physical context. The interactions amongst all these parameters compose a complex system of sustainable architecture design, for which conventional linear and fragmented design technologies are insufficient to indicate holistic and ongoing environmental performance. The latest interpretation of the Second Law of Thermodynamics provides a microscopic formulation of the entropy evolution of complex open systems, and it supplies a design framework in which an adaptive system evolves to optimize building environmental performance. The paper concludes that adaptive modelling in entropy evolution is a design alternative for sustainable architecture.

Relevance:

30.00%

Publisher:

Abstract:

This full-day workshop invites participants to consider the nexus where the interests of game design, the expectations of play and HCI meet: the game interface. Game interfaces seem different from the interfaces of other software, and a number of observations have been made about this difference. Shneiderman famously noticed that while most software designers are intent on following the tenets of the “invisible computer” and making access easy for the user, game interfaces are made for players: they embed challenge. Schell discusses a “strange” relationship between the player and the game enabled by the interface, and user interface designers frequently opine that much can be learned from the design of game interfaces. So where does the game interface actually sit? Even more interesting is the question of whether the history of the relationship and the subsequent expectations are now limiting the potential of game design as an expressive form. Recent innovations in I/O design such as Nintendo’s Wii, Sony’s Move and Microsoft's Kinect seem to usher in an age of physical player-enabled interaction, experience and embodied, engaged design. This workshop intends to cast light on this often mentioned and sporadically examined area and to establish a platform for new and innovative design in the field.

Relevance:

30.00%

Publisher:

Abstract:

Becoming a teacher in technology-rich classrooms is a complex and challenging transition for career-change entrants. Those with generic or specialist Information and Communication Technology (ICT) expertise bring a mindset about purposeful uses of ICT that enriches student learning and school communities. The transition process from a non-education environment is both enhanced and constrained by shifting the technology context of generic or specialist ICT expertise developed through a former career as well as general life experience. In developing an understanding of the complexity of classrooms and creating a learner-centred way of working, perceptions about learners and learning evolve and shift. Shifts in thinking about how ICT expertise supports learners and enhances learning precede shifts in perceptions about being a teacher, working with colleagues, and functioning in schools, and these shifts have varying degrees of intensity and impact on evolving professional identities. Current teacher education and school induction programs are seen to be falling short of meeting the needs of career-change entrants and, as a flow-on, the students they nurture. Research (see, for example, Tigchelaar, Brouwer, & Korthagen, 2008; Williams & Forgasz, 2009) highlights the value of the generic and specialist expertise career-change teachers bring to the profession and draws attention to the challenges such expertise begets (Anthony & Ord, 2008; Priyadharshini & Robinson-Pant, 2003). As such, the study described in this thesis investigated the perceptions of career-change entrants who have generic (Mishra & Koehler, 2006) or specialist expertise, that is, ICT qualifications and work experience in the use of ICT. The career-change entrants' perceptions were sought as they shifted the technology context and transitioned into teaching in technology-rich classrooms. The research involved an interpretive analysis of qualitative and quantitative data. The study used the explanatory case study (Yin, 1994) methodology, enriched through grounded theory processes (Strauss & Corbin, 1998), to develop a theory about professional identity transition from the perceptions of the participants in the study. The study provided insights into the expertise and experiences of career-change entrants, particularly in relation to how professional identities that include generic and specialist ICT knowledge and expertise were reconfigured while transitioning into the teaching profession. This thesis presents the Professional Identity Transition Theory, which encapsulates perceptions about teaching in technology-rich classrooms amongst a selection of the increasing number of career-change entrants. The theory, grounded in the data (Strauss & Corbin, 1998), proposes that career-change entrants experience transition phases of varying intensity that impact on professional identity, retention and development as a teacher. These phases are linked to a shift in perceptions rather than time as a teacher. Generic and specialist expertise in the use of ICT is both a weight from the past and an asset, and it makes the transition process more challenging for career-change entrants. The study showed that career-change entrants used their experiences and perceptions to develop a way of working in a school community. Their way of working initially had an adaptive orientation focussed on immediate needs as their teaching practice developed.
Following a shift in thinking, more generative ways of working focussed on the future emerged, enabling continual enhancement and development of practice. Sustaining such learning is a personal, school and systemic challenge for the teaching profession.

Relevance:

30.00%

Publisher:

Abstract:

The aim of this project was to implement a just-in-time hints help system into a real-time strategy (RTS) computer game that would deliver information to the user at the time it would be of most benefit. The goal of this help system is to improve the user’s learning in terms of rate of learning, retention and avoidance of stagnation. The first stage of the project was implementing a computer game that incorporates four different types of skill the user must acquire, namely motor, perceptual, declarative-knowledge and strategic skills. Subsequently, the just-in-time hints help system was incorporated into the game to assess the user’s knowledge and deliver hints accordingly. The final stage of the project was to test the effectiveness of this help system through two phases of testing, with the goal of demonstrating an increase in the users’ assessment of the helpfulness of the system from phase one to phase two. The results showed no significant difference in the users’ responses between the two phases. However, when the results were analysed with respect to several identified categories of hints, patterns in the data began to emerge. The conclusions of the project were that further testing with a larger sample size would be required to provide more reliable results, and that factors such as the user’s skill level and different types of goals should be taken into account.
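
As an illustration of how such a trigger might work, here is a hypothetical Python sketch that releases a queued hint for a skill once the player's recent success rate stagnates below a threshold; the four skill names come from the abstract, but the stagnation rule, window size and hint texts are invented for exposition and are not the project's implementation.

```python
from collections import deque

class JustInTimeHints:
    """Release the next hint for a skill whose rolling success rate
    has stagnated below a threshold, so help arrives when useful."""

    def __init__(self, hints, window=10, threshold=0.4):
        self.hints = {skill: list(hs) for skill, hs in hints.items()}
        self.history = {skill: deque(maxlen=window) for skill in hints}
        self.window = window
        self.threshold = threshold

    def record(self, skill, success):
        self.history[skill].append(bool(success))

    def pending_hint(self, skill):
        recent = self.history[skill]
        stagnating = (len(recent) == self.window
                      and sum(recent) / self.window < self.threshold)
        if stagnating and self.hints[skill]:
            return self.hints[skill].pop(0)   # deliver the next unseen hint
        return None

# the four skill types the game exercises (hint texts are placeholders)
system = JustInTimeHints({
    "motor": ["Hold the key down to queue move orders."],
    "perceptual": ["Watch for units flashing before they attack."],
    "declarative": ["Advanced buildings unlock after a supply depot."],
    "strategic": ["Scout early and adapt your build to what you see."],
})
```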

Relevance:

30.00%

Publisher:

Abstract:

We consider the problem of how to efficiently and safely design dose finding studies. Both current and novel utility functions are explored using Bayesian adaptive design methodology for the estimation of a maximum tolerated dose (MTD). In particular, we explore widely adopted approaches such as the continual reassessment method and minimizing the variance of the estimate of an MTD. New utility functions are constructed in the Bayesian framework and are evaluated against current approaches. To reduce computing time, importance sampling is implemented to re-weight posterior samples, thus avoiding the need to draw samples using Markov chain Monte Carlo techniques. Further, as such studies are generally first-in-man, the safety of patients is paramount. We therefore explore methods for the incorporation of safety considerations into utility functions to ensure that only safe and well-predicted doses are administered. The amalgamation of Bayesian methodology, adaptive design and compound utility functions is termed adaptive Bayesian compound design (ABCD). The performance of this amalgamation of methodology is investigated via the simulation of dose finding studies. The paper concludes with a discussion of results and extensions that could be included into our approach.
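
To show the computational trick the abstract mentions, here is a minimal Python sketch of importance re-weighting posterior draws to evaluate one candidate utility (the variance of the MTD estimate); the one-parameter logistic dose-toxicity model, the target toxicity level and every number below are illustrative assumptions, not the paper's ABCD specification.

```python
import numpy as np

def expit(z):
    return 1 / (1 + np.exp(-z))

rng = np.random.default_rng(0)
a = rng.normal(0.5, 0.2, size=5000)   # stand-in posterior draws of the slope

def reweight(a, doses, tox):
    """Importance weights: likelihood of newly observed (dose, toxicity)
    outcomes under each draw, normalised, so no fresh MCMC run is needed."""
    p = expit(np.outer(a, doses))
    lik = np.prod(np.where(tox, p, 1 - p), axis=1)
    return lik / lik.sum()

def mtd_variance(a, w, dose_grid, target=0.3):
    """Utility: weighted variance of the MTD (the grid dose whose
    predicted toxicity is closest to the target) across the draws."""
    tox_prob = expit(np.outer(a, dose_grid))
    mtd = dose_grid[np.abs(tox_prob - target).argmin(axis=1)]
    mean = np.sum(w * mtd)
    return np.sum(w * (mtd - mean) ** 2)

# fold in two hypothetical observations, then score a candidate dose grid
w = reweight(a, doses=np.array([1.0, 2.0]), tox=np.array([False, True]))
print(mtd_variance(a, w, dose_grid=np.linspace(0.5, 4.0, 8)))
```

A safety constraint of the kind the paper discusses could then be imposed by excluding from the dose grid any dose whose weighted probability of exceeding the target toxicity is too high.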

Relevance:

30.00%

Publisher:

Abstract:

The indecision surrounding the definition of Technology extends to the classroom, as not knowing what a subject “is” affects how it is taught. Similarly, its relative newness – and consequent lack of habitus in school settings – means that it is still struggling to find its own place in the curriculum as well as to resolve its relationship with more established subject domains, particularly Science and Mathematics. The guidance from syllabus documents points to open-ended, student-directed projects, whereas extant studies indicate a more common experience of teacher-directed activities and an emphasis on product over process. There are issues too for researchers in documenting classroom observations and in analysing teacher practice in new learning environments. This paper presents a framework for defining and mapping classroom practice and for attempting to describe the social practice in the Technology classroom. The framework is a bricolage which draws on contemporary research. More formally, the development of the framework is consonant with the aim of design-based research to develop a flexible, adaptive and generalisable theory to better understand a teaching domain where promise is not seen to match current reality. The framework may also inform emergent approaches to STEM (Science, Technology, Engineering and Mathematics) in education.

Relevance:

30.00%

Publisher:

Abstract:

This paper presents an explanation of why the reuse of building components after demolition or deconstruction is critical to the future of the construction industry. An examination of the historical causes of, and responses to, climate change sets the scene as to why governance is becoming increasingly focused on the built environment as a mechanism for controlling the waste generation associated with demolition, construction and operation. Through an annotated description of the evolving design and construction methodology of a range of timber dwellings (typically 'Queenslanders' of the eras 1880-1900, 1900-1920 and 1920-1940), the paper offers an evaluation of the variety of materials which can be used advantageously by those wishing to 'regenerate' a Queenslander. This analysis of 'regeneration' details the constraints on relocation and/or reuse by adaptation, including deconstruction of building components, against the legislative framework requirements of the Queensland Building Act 1975 and the Queensland Sustainable Planning Act 2009, with specific examination of the Building Codes of Australia. The paper concludes with a discussion of these constraints and their impacts on 'regeneration', and identifies the need for further research to better understand the practicalities and drivers of relocation, adaptive reuse, and the suitability of building components for reuse after deconstruction.

Relevance:

30.00%

Publisher:

Abstract:

Statement: Jams, Jelly Beans and the Fruits of Passion. Let us search, instead, for an epistemology of practice implicit in the artistic, intuitive processes which some practitioners do bring to situations of uncertainty, instability, uniqueness, and value conflict. (Schön 1983, p40) Game On was born out of the idea of creative community: finding, networking, supporting and inspiring the people behind the face of an industry, those in the midst of the machine and those intending to join. We understood this moment to be a pivotal opportunity to nurture a new emerging form of game making, in an era of change where the old industry models were proving to be unsustainable. As soon as we started putting people into a room under pressure, to make something in 48hrs, a whole pile of evolutionary creative responses emerged. People refashioned their craft in a moment of intense creativity that demanded different ways of working, an adaptive approach to the craft of making games – small – fast – indie. An event like the 48hrs forces participants’ attention on the process as much as the outcome. As one game industry professional taking part in a challenge for the first time observed: there are three paths in the genesis from idea to finished work: the path that focuses on mechanics; the path that focuses on team structure and roles; and the path that focuses on the idea, the spirit – and the more successful teams need to put the spirit of the work first and foremost. The spirit drives the adaptation; it becomes improvisation. As Schön says: “Improvisation consists in varying, combining and recombining a set of figures within the schema which bounds and gives coherence to the performance.” (1983, p55). This improvisational approach is all about those making the games: the people and the principles of their creative process. This documentation evidences the intensity of their passion, their determination and the shit they are prepared to put themselves through to achieve their goal – to win a cup full of jellybeans and make a working game in 48hrs. 48hr is a project where, on all levels, analogue meets digital. This concept was further explored through the documentation process. This set of four videos was created by Cameron Owen on the fly during the challenge, using both the iPhone video camera and editing software, in order to be available with immediacy and to allow the event audience to share the experience – and perhaps to give some insights into the creative process exposed by the 48 hour challenge. ____________________________ Schön, D. A. 1983, The Reflective Practitioner: How Professionals Think in Action, Basic Books, New York.