947 results for Hybrid-game Strategies
Abstract:
Modern power networks incorporate communications and information technology infrastructure into the electrical power system to create a smart grid in terms of control and operation. The smart grid enables real-time communication and control between consumers and utility companies, allowing suppliers to optimize energy usage based on price preferences and technical system constraints. The smart grid design aims to provide overall power system monitoring and to create protection and control strategies that maintain system performance, stability and security. This dissertation contributed to the development of a unique and novel smart grid test-bed laboratory with integrated monitoring, protection and control systems. This test-bed was used as a platform to test the smart grid operational ideas developed here. The implementation of this system in real-time software creates an environment for studying, implementing and verifying the novel control and protection schemes developed in this dissertation. Phasor measurement techniques were developed using the available Data Acquisition (DAQ) devices in order to monitor all points in the power system in real time. This provides a practical view of system parameter changes and abnormal conditions, as well as stability and security information. These developments provide valuable measurements for power system operators in energy control centers. Phasor measurement technology is an excellent solution for improving system planning, operation and energy trading, in addition to enabling advanced applications in Wide Area Monitoring, Protection and Control (WAMPAC). Moreover, a virtual protection system was developed and implemented in the smart grid laboratory with integrated functionality for wide area applications. Experiments and procedures were developed to detect abnormal system conditions and apply proper remedies to heal the system. A DC microgrid was designed and integrated into the AC system with appropriate control capability. This system represents realistic hybrid AC/DC microgrid connectivity to the AC side, enabling study of how such an architecture can be used in system operation to help remedy abnormal conditions. In addition, this dissertation explored the challenges and feasibility of implementing real-time system analysis features in order to monitor system security and stability measures. These indices are measured experimentally during the operation of the developed hybrid AC/DC microgrids. Furthermore, a real-time optimal power flow system was implemented to optimally manage power sharing between AC generators and DC-side resources. A study of a real-time energy management algorithm in hybrid microgrids was performed to evaluate the effects of using energy storage resources and their use in mitigating heavy-load impacts on system stability and operational security.
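As a rough illustration of the phasor-measurement idea mentioned above (not the dissertation's DAQ implementation), the sketch below estimates a fundamental-frequency phasor from one cycle of sampled waveform data using a DFT correlation; the sampling rate, signal values and function name are assumptions chosen only for the example.

```python
import numpy as np

def estimate_phasor(samples, samples_per_cycle):
    """Estimate the fundamental-frequency phasor (RMS magnitude and
    phase angle in degrees) from one cycle of evenly spaced samples."""
    n = np.arange(samples_per_cycle)
    window = samples[-samples_per_cycle:]          # most recent cycle
    # Correlate with the fundamental; sqrt(2)/N scaling gives an RMS phasor.
    phasor = (np.sqrt(2) / samples_per_cycle) * np.sum(
        window * np.exp(-1j * 2 * np.pi * n / samples_per_cycle))
    return np.abs(phasor), np.angle(phasor, deg=True)

# Example: a sine of 100 V RMS, sampled 16 times per cycle, 30 degrees ahead.
samples_per_cycle = 16
t = np.arange(2 * samples_per_cycle)
v = 100 * np.sqrt(2) * np.cos(2 * np.pi * t / samples_per_cycle + np.radians(30))
print(estimate_phasor(v, samples_per_cycle))       # approximately (100.0, 30.0)
```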
Abstract:
This paper compares the performance of two different optimisation techniques for solving inverse problems: the first uses the Hierarchical Asynchronous Parallel Evolutionary Algorithms software (HAPEA) and the second is implemented with a game strategy named Nash-EA. The HAPEA software is based on a hierarchical topology and asynchronous parallel computation. The Nash-EA methodology is introduced as a distributed virtual game and consists of splitting the wing design variables (aerofoil sections) among players, each optimising its own strategy. The HAPEA and Nash-EA methodologies are applied to a single-objective aerodynamic ONERA M6 wing reconstruction. Numerical results from the two approaches are compared in terms of model quality and computational expense, and demonstrate the superiority of the distributed Nash-EA methodology in a parallel environment for a similar design quality.
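The following sketch illustrates the general idea of a Nash-game decomposition of an evolutionary search, in which each player evolves only its own block of design variables while the other block is held fixed and the updated elite is shared; the toy objective, population size and variable split are assumptions and do not reproduce the HAPEA or Nash-EA software.

```python
import numpy as np

rng = np.random.default_rng(0)

def objective(x):
    # Stand-in reconstruction error; the real problem is an ONERA M6
    # inverse design parameterised by aerofoil-section variables.
    return np.sum((x - 0.3) ** 2)

def evolve_block(x_elite, block, pop=20, sigma=0.1):
    """One generation of mutation-selection on the player's own block,
    with the other player's variables held fixed."""
    best = x_elite.copy()
    for _ in range(pop):
        cand = x_elite.copy()
        cand[block] += sigma * rng.standard_normal(len(block))
        if objective(cand) < objective(best):
            best = cand
    return best

# Two players split six design variables (e.g. two aerofoil sections).
x = rng.random(6)
blocks = [np.arange(0, 3), np.arange(3, 6)]
for generation in range(50):
    for block in blocks:             # each player optimises its own territory
        x = evolve_block(x, block)   # and publishes the updated elite
print(objective(x), x)
```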
Abstract:
The present paper focuses on some interesting classes of process-control games, where winning essentially means successfully controlling the process. A master for one of these games is an agent who plays a winning strategy. In this paper we investigate situations in which even a complete model (given by a program) of a particular game does not provide enough information to synthesize—even incrementally—a winning strategy. However, if in addition to getting a program, a machine may also watch masters play winning strategies, then the machine is able to incrementally learn a winning strategy for the given game. Studied are successful learning from arbitrary masters and from pedagogically useful selected masters. It is shown that selected masters are strictly more helpful for learning than are arbitrary masters. Both for learning from arbitrary masters and for learning from selected masters, though, there are cases where one can learn programs for winning strategies from masters but not if one is required to learn a program for the master's strategy itself. Both for learning from arbitrary masters and for learning from selected masters, one can learn strictly more by watching m+1 masters than one can learn by watching only m. Last, a simulation result is presented where the presence of a selected master reduces the complexity from infinitely many semantic mind changes to finitely many syntactic ones.
Abstract:
INTRODUCTION: Since the introduction of its QUT ePrints institutional repository of published research outputs, together with the world’s first mandate for author contributions to an institutional repository, Queensland University of Technology (QUT) has been a leader in support of green road open access. With QUT ePrints providing its mechanism for supporting the green road to open access, QUT has since also continued to expand its secondary open access strategy supporting gold road open access, which is likewise designed to assist QUT researchers to maximise the accessibility, and so the impact, of their research. METHODS: QUT Library has adopted the position of selectively supporting true gold road open access publishing by using the Library Resource Allocation budget to pay the author publication fees for QUT authors wishing to publish in the open access journals of a range of publishers, including BioMed Central, Public Library of Science and Hindawi. QUT Library has been careful to support only true open access publishers and not those open access publishers with hybrid models which “double dip” by charging authors publication fees and libraries subscription fees for the same journal content. QUT Library has maintained a watch on the growing number of open access journals available from gold road open access publishers and their increasing success as measured by publication impact. RESULTS: This paper reports on the successes and challenges of QUT’s efforts to support true gold road open access publishers and to promote these publishing strategy options to researchers at QUT. The number and spread of QUT papers submitted and published in the journals of each publisher are provided. Citation counts for papers and authors are also presented and analysed, with the intention of identifying the benefits to accessibility and research impact for early career and established researchers. CONCLUSIONS: QUT Library is eager to continue and further develop support for this publishing strategy, and makes a number of recommendations to other research institutions on how they can best achieve success with this strategy.
Abstract:
A trend in the design and implementation of modern industrial automation systems is to integrate computing, communication and control into a unified framework at different levels of machine/factory operations and information processing. These distributed control systems are referred to as networked control systems (NCSs). They are composed of sensors, actuators, and controllers interconnected over communication networks. As most communication networks are not designed for NCS applications, the communication requirements of NCSs may not be satisfied. For example, traditional control systems require data to be accurate, timely and lossless. However, because of random transmission delays and packet losses, the control performance of a control system may deteriorate badly, and the control system may be rendered unstable. The main challenge of NCS design is to maintain and improve stable control performance. To achieve this, communication and control methodologies have to be designed. In recent decades, Ethernet and 802.11 networks have been introduced into control networks and have even replaced traditional fieldbus products in some real-time control applications, because of their high bandwidth and good interoperability. As Ethernet and 802.11 networks are not designed for distributed control applications, two aspects of NCS research need to be addressed to make these communication networks suitable for control systems in industrial environments. From the perspective of networking, communication protocols need to be designed to satisfy NCS communication requirements such as real-time communication and high-precision clock consistency. From the perspective of control, methods to compensate for network-induced delays and packet losses are important for NCS design. To make Ethernet-based and 802.11 networks suitable for distributed control applications, this thesis develops a high-precision relative clock synchronisation protocol and an analytical model for analysing the real-time performance of 802.11 networks, and designs a new predictive compensation method. Firstly, a hybrid NCS simulation environment based on the NS-2 simulator is designed and implemented. Secondly, a high-precision relative clock synchronisation protocol is designed and implemented. Thirdly, transmission delays in 802.11 networks for soft real-time control applications are modelled using a Markov chain in which real-time Quality-of-Service parameters are analysed under a periodic traffic pattern. Using this Markov chain model, the trade-off between real-time performance and throughput can be modelled accurately. Furthermore, a cross-layer optimisation scheme, featuring application-layer flow rate adaptation, is designed to achieve a trade-off between certain real-time and throughput performance characteristics in a typical NCS scenario with a wireless local area network. Fourthly, as a co-design approach for both the network and the controller, a new predictive compensation method for variable delay and packet loss in NCSs is designed, in which simultaneous end-to-end delays and packet losses during packet transmissions from sensors to actuators are tackled. The effectiveness of the proposed predictive compensation approach is demonstrated using our hybrid NCS simulation environment.
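To give a flavour of model-based predictive compensation for delayed or lost sensor packets, the sketch below rolls a discrete-time plant model forward over the missing samples using the control inputs already sent to the actuator; the plant matrices, feedback gain, delay distribution and measurement stub are assumptions for illustration, not the thesis's method.

```python
import numpy as np

# Illustrative discrete-time plant x[k+1] = A x[k] + B u[k] (placeholder values).
A = np.array([[1.0, 0.1], [0.0, 0.9]])
B = np.array([[0.0], [0.1]])
K = np.array([[0.5, 1.0]])            # assumed state-feedback gain

def predict_forward(x_est, u_history, steps):
    """Roll the model forward over the packets that were delayed or lost,
    reusing the control inputs that were already applied."""
    for u in u_history[-steps:]:
        x_est = A @ x_est + B @ u
    return x_est

x_est = np.zeros((2, 1))
u_history = []
for k in range(100):
    delay_steps = np.random.choice([0, 1, 2], p=[0.7, 0.2, 0.1])
    if delay_steps == 0:
        x_est = np.array([[np.sin(0.1 * k)], [0.0]])   # fresh measurement (stub)
    else:
        x_est = predict_forward(x_est, u_history, delay_steps)
    u = -K @ x_est                    # control computed on the compensated state
    u_history.append(u)
```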
Abstract:
In this work a novel hybrid approach is presented that uses a combination of both time-domain and frequency-domain solution strategies to predict the power distribution within a lossy medium loaded within a waveguide. The problem of determining the electromagnetic fields evolving within the waveguide and the lossy medium is decoupled into two components: one for computing the fields in the waveguide including a coarse representation of the medium (the exterior problem) and one for a detailed resolution of the lossy medium (the interior problem). A previously documented cell-centred Maxwell’s equations numerical solver can be used to resolve the exterior problem accurately in the time domain. Thereafter the discrete Fourier transform can be applied to the computed field data around the interface of the medium to estimate the frequency-domain boundary condition information that is needed for closure of the interior problem. Since only the electric fields are required to compute the power distribution generated within the lossy medium, the interior problem can be resolved efficiently using the Helmholtz equation. A consistent cell-centred finite-volume method is then used to discretise this equation on a fine mesh, and the underlying large, sparse, complex matrix system is solved for the required electric field using the Krylov-subspace-based GMRES iterative solver. It is shown that the hybrid solution methodology works well when a single frequency is considered in the evaluation of the Helmholtz equation in a single-mode waveguide. A restriction of the scheme is that the material needs to be sufficiently lossy, so that any waves penetrating the material are absorbed.
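A minimal sketch of the interior-problem solve is given below, using a 1-D finite-difference stand-in for the cell-centred finite-volume discretisation, an assumed complex wavenumber for the lossy material, and a boundary value standing in for the DFT of the exterior-problem data; it simply shows a sparse complex Helmholtz system being solved with SciPy's GMRES.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import gmres

# 1-D Helmholtz stand-in: d2E/dx2 + k^2 E = 0, k complex for a lossy medium.
n = 400
h = 0.01
k = 40.0 - 8.0j                          # assumed complex wavenumber (loss)

main = (-2.0 / h**2 + k**2) * np.ones(n, dtype=complex)
off = (1.0 / h**2) * np.ones(n - 1, dtype=complex)
Amat = sp.diags([off, main, off], [-1, 0, 1], format="csr")

rhs = np.zeros(n, dtype=complex)
rhs[0] = -(1.0 / h**2) * 1.0             # Dirichlet value from the exterior solution
rhs[-1] = -(1.0 / h**2) * 0.0            # far boundary: field assumed absorbed

E, info = gmres(Amat, rhs)               # Krylov-subspace iterative solve
power_density = 0.5 * np.abs(E) ** 2     # proportional to dissipated power
print(info, power_density[:5])
```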
Abstract:
A microgrid may be supplied from inertial (rotating type) and non-inertial (converter-interfaced) distributed generators (DGs). However, the dynamic responses of these two types of DGs are different: inertial DGs respond more slowly because of their governor characteristics, while non-inertial DGs can respond very quickly. The focus of this paper is to propose better controls using droop characteristics to improve the dynamic interaction between different DG types in an autonomous microgrid. The transient behavior of DGs in the microgrid is investigated during DG synchronization and load changes. Power sharing strategies based on frequency and voltage droop are considered for the DGs. Droop control strategies are proposed to achieve smooth synchronization and dynamic power sharing while minimizing transient oscillations in the microgrid. Simulation studies are carried out in PSCAD for validation.
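A minimal frequency-droop sketch, with assumed droop gains and load values, illustrates the power-sharing principle the paper builds on: each DG imposes f = f0 - m*P, so at the common steady-state frequency the load splits in inverse proportion to the droop gains.

```python
# Illustrative values only, not the paper's controller settings.
f0 = 50.0                                                  # nominal frequency (Hz)
droop = {"inertial_DG": 0.0005, "converter_DG": 0.00025}   # m, in Hz per kW
P_load = 300.0                                             # total load (kW)

# The common frequency f satisfies sum_i (f0 - f) / m_i = P_load.
inv_sum = sum(1.0 / m for m in droop.values())
f = f0 - P_load / inv_sum
shares = {name: (f0 - f) / m for name, m in droop.items()}
print(f, shares)   # the smaller-droop (converter) DG picks up the larger share
```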
Abstract:
A number of learning problems can be cast as an Online Convex Game: on each round, a learner makes a prediction x from a convex set, the environment plays a loss function f, and the learner’s long-term goal is to minimize regret. Algorithms with provably low regret have been proposed by Zinkevich, when f is assumed to be convex, and by Hazan et al., when f is assumed to be strongly convex. We consider these two settings and analyze such games from a minimax perspective, proving minimax strategies and lower bounds in each case. These results prove that the existing algorithms are essentially optimal.
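For the convex (not strongly convex) setting, a short sketch of Zinkevich-style online gradient descent with linear losses is shown below; the loss distribution, step sizes and feasible set are assumptions chosen only to make the regret computation concrete, not the paper's minimax construction.

```python
import numpy as np

rng = np.random.default_rng(1)
T, d, radius = 200, 5, 1.0
x = np.zeros(d)
plays, losses = [], []

for t in range(1, T + 1):
    theta = rng.normal(size=d)                 # environment plays f_t(x) = theta . x
    plays.append(x.copy()); losses.append(theta)
    x = x - (radius / np.sqrt(t)) * theta      # gradient step with eta_t ~ 1/sqrt(t)
    norm = np.linalg.norm(x)
    if norm > radius:                          # project back onto the ball
        x *= radius / norm

cum_loss = sum(th @ p for th, p in zip(losses, plays))
best_fixed = -radius * np.linalg.norm(np.sum(losses, axis=0))   # best x in hindsight
print("regret:", cum_loss - best_fixed)        # grows like O(sqrt(T)) for convex f
```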
Abstract:
This paper defines and discusses two contrasting approaches to designing game environments. The first, referred to as scripting, requires developers to anticipate, hand-craft and script specific game objects, events and player interactions. The second, known as emergence, involves defining general, global rules that interact to give rise to emergent gameplay. Each of these approaches is defined, discussed and analyzed with respect to its considerations and effects for game developers and game players. Subsequently, various techniques for implementing these design approaches are identified and discussed. It is concluded that scripting and emergence are two extremes of the same continuum, neither of which is ideal for game development. Rather, there needs to be a compromise in which the boundaries of action (such as story and game objectives) can be hardcoded while non-scripted behavior (such as interactions and strategies) is able to emerge within these boundaries.
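A toy contrast of the two approaches, using a hypothetical guard NPC, is sketched below; the positions, thresholds and behaviors are invented purely for illustration and are not drawn from the paper.

```python
# Scripted: the developer hand-crafts a specific trigger and response.
def scripted_guard(player_pos, guard_pos):
    if player_pos == (10, 4):          # designer anticipated this exact spot
        return "play_ambush_cutscene"
    return "patrol"

# Emergent: a general, global rule; behavior arises from the interaction of
# simple rules rather than from pre-authored events.
def emergent_guard(player_pos, guard_pos, noise_level):
    distance = abs(player_pos[0] - guard_pos[0]) + abs(player_pos[1] - guard_pos[1])
    suspicion = noise_level / (1 + distance)
    if suspicion > 0.5:
        return "investigate", player_pos
    return "patrol", guard_pos
```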
Abstract:
This paper presents a hybrid framework of Swedish cultural practices and Australian grounded theory for organizational development and suggests practical strategies for 'working smarter' in 21st Century libraries. Toward that end, reflective evidence-based practices are offered to incrementally build organizational capacity for asking good questions, selecting authoritative sources, evaluating multiple perspectives, organizing emerging insights, and communicating them to inform, educate, and influence. In addition, to ensure the robust information exchange necessary to collective workplace learning, leadership traits are proposed for ensuring inclusive communication, decision making, and planning processes. These findings emerge from action research projects conducted from 2003 to 2008 in two North American libraries.
Abstract:
This paper presents research findings and design strategies that illustrate how digital technology can be applied as a tool for hybrid placemaking in ways that would not be possible in purely digital or physical space. Digital technology has revolutionised the way people learn and gather new information. This trend has challenged the role of the library as a physical place, as well as the interplay of the library's digital and physical aspects. The paper provides an overview of how the penetration of digital technology into everyday life has affected the library as a place, both as designed by placemakers and as perceived by library users. It then identifies a gap in current library research about the use of digital technology as a tool for placemaking, and reports results from a study of Gelatine, a custom-built user check-in system that displays real-time user information on a set of public screens. Gelatine and its evaluation at The Edge, at the State Library of Queensland, illustrate how combining the affordances of social, spatial and digital space can improve the connected learning experience among on-site visitors. Future design strategies involving gamifying the user experience in libraries are described, based on Gelatine’s infrastructure. The presented design ideas and concepts are relevant for managers and designers of libraries as well as other informal, social learning environments.
Abstract:
This thesis developed and evaluated strategies for social and ubiquitous computing designs that can enhance connected learning and networking opportunities for users in coworking spaces. Based on a social and a technical design intervention deployed at the State Library of Queensland, the research findings illustrate the potential of combining social, spatial and digital affordances in order to nourish peer-to-peer learning, creativity, inspiration, and innovation. The study proposes a hybrid notion of placemaking as a new way of thinking about the design of coworking and interactive learning spaces.
Abstract:
This study investigated how and to what degree “hybrid photography” (the simultaneous use of indexical and fictional properties and strategies) innovates the representation of animals within animal-centric, ecocentric frameworks. Design theory structured this project’s practice-led, visual research methodology framework. Grounded theory processes articulated emerging categories of hybrid photography through systematically and comparatively treating animal photography works for reflexive analysis. Design theory then applied and clarified these categories, developing practice that re-visualised shark perspectives as new ecological discourse. Shadows, a creative practice installation, realised a full-scale photographic investigation into the shark and marine animal realities of a specific environment, Heron Island and Gladstone on the Great Barrier Reef, facing ecological crisis from dredging and development at Gladstone Harbour. The works rendered and explored hybrid photography’s capacity for illuminating nonhuman animals, in particular sharks, and comprise 65% of this project’s weighting. This exegetical paper offers a definition, strategies and an evaluation of hybrid photography in unsettling animal perspectives as effective ecological discourse, and comprises the remaining 35%.
Abstract:
Although urbanization can promote social and economic development, it can also cause various problems. As the key decision-makers in urbanization, local governments should be able to evaluate urbanization performance, summarize experiences, and identify the problems urbanization causes. This paper introduces a hybrid Entropy–McKinsey Matrix method for evaluating sustainable urbanization. The McKinsey Matrix is commonly referred to as the GE Matrix. The values of a development index (DI) and a coordination index (CI) are calculated using the Entropy method and are used as the basis for constructing a GE Matrix. The matrix can assist in assessing sustainable urbanization performance by locating the urbanization state point. A case study of the city of Jinan in China demonstrates the process of using the evaluation method. The case study reveals that the method is an effective tool in helping policy makers understand the performance of urban sustainability and therefore formulate suitable strategies for guiding urbanization toward better sustainability.
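A small sketch of the Entropy weighting step is shown below with invented indicator data (not the Jinan case-study figures): entropy weights are computed per indicator and then combined into a development index.

```python
import numpy as np

# Rows are years, columns are (hypothetical) urbanization indicators.
X = np.array([[0.62, 0.40, 0.75],
              [0.70, 0.45, 0.71],
              [0.78, 0.52, 0.66],
              [0.85, 0.60, 0.64]])

P = X / X.sum(axis=0)                           # share of each year in each indicator
n = X.shape[0]
entropy = -(P * np.log(P)).sum(axis=0) / np.log(n)
weights = (1 - entropy) / (1 - entropy).sum()   # less-entropic indicators weigh more

DI = X @ weights                                # development index per year
print(weights, DI)
```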