425 results for DISTRIBUTED GENERATION
Abstract:
This paper looks at the challenges presented for the Australian Library and Information Association by its role as the professional association responsible for ensuring the quality of Australian library technician graduates. There is a particular focus on the issue of course recognition, where the Association's role is complicated by the need to work alongside the national quality assurance processes that have been established by the relevant technical education authorities. The paper describes the history of course recognition in Australia; examines the relationship between course recognition and other quality measures; and describes the process the Association has undertaken recently to ensure appropriate professional scrutiny in a changing environment of accountability.
Abstract:
Modern enterprise knowledge management systems typically require distributed approaches and the integration of numerous heterogeneous sources of information. A powerful foundation for these tasks can be Topic Maps, which not only provide a semantic net-like knowledge representation means and the possibility to use ontologies for modelling knowledge structures, but also offer concepts to link these knowledge structures with unstructured data stored in files, external documents etc. In this paper, we present the architecture and prototypical implementation of a Topic Map application infrastructure, the ‘Topic Grid’, which enables transparent, node-spanning access to different Topic Maps distributed in a network.
Abstract:
To understand the diffusion of high-technology products such as PCs, digital cameras and DVD players, it is necessary to consider the dynamics of successive generations of technology. From the consumer's perspective, these technology changes may manifest themselves either as a new generation product substituting for the old (for instance, digital cameras) or as multiple generations of a single product (for example, PCs). To date, research has been confined to aggregate-level sales models. These models consider the demand relationship between one generation of a product and its successor, but they do not give insights into the disaggregate-level decisions of individual households: whether to adopt the newer generation and, if so, when. This paper makes two contributions. First, it is the first large-scale empirical study to collect household data for successive generations of technologies in an effort to understand the drivers of adoption. Second, in contrast to traditional analysis in diffusion research, which conceptualizes technology substitution as an "adoption of innovation" type process, we propose that from a consumer's perspective technology substitution combines elements of both adoption (adopting the new generation technology) and replacement (replacing the generation I product with generation II).
Key Propositions
In some cases, successive generations are clear "substitutes" for the earlier generation (e.g. PCs: Pentium I to II to III). More commonly, the new generation II technology is a "partial substitute" for the existing generation I technology (e.g. DVD players and VCRs). Some consumers will purchase generation II products as substitutes for their generation I product, while other consumers will purchase generation II products as additional products to be used alongside their generation I product. We propose that substitute generation II purchases combine elements of both adoption and replacement, whereas additional generation II purchases are a solely adoption-driven process. Moreover, drawing on adoption theory, consumer innovativeness is the most important consumer characteristic for the adoption timing of new products. Hence, we hypothesize that consumer innovativeness influences the timing of both additional and substitute generation II purchases, but has a stronger impact on additional generation II purchases. We further propose that substitute generation II purchases act partially as a replacement purchase for the generation I product. Thus, we hypothesize that households with older generation I products will make substitute generation II purchases earlier.
Methods
We employ Cox hazard modeling to study the factors influencing the timing of a household's adoption of generation II products. Separate hazard models are estimated for additional and substitute purchases. The age of the generation I product is calculated from the most recent household purchase of that product. Control variables include household size and income, and the age and education of the decision-maker.
Results and Implications
Our preliminary results confirm both our hypotheses. Consumer innovativeness has a strong influence on both additional purchases and substitute purchases. Also consistent with our hypotheses, the age of the generation I product has a dramatic influence on substitute purchases of VCR/DVD players and a strong influence for PCs/notebooks. Yet, also as hypothesized, there was no influence on additional purchases. This implies a clear distinction between additional and substitute purchases of generation II products, each with different drivers. For substitute purchases, product age is a key driver. Marketers of high-technology products can therefore use data on generation I product age (e.g. from warranty or loyalty programs) to target customers who are more likely to make a purchase.
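The Cox model described in the Methods section can be illustrated in a few lines. Below is a minimal, pure-Python sketch of the partial likelihood such a model maximizes, using entirely hypothetical household data; the study's actual covariates, coding and estimates are not reproduced here.

```python
import math

# Hypothetical household data: time to generation-II purchase (months),
# event flag (1 = purchased, 0 = censored), and two covariates:
# an innovativeness score and the age of the generation-I product (years).
households = [
    # (time, event, innovativeness, gen1_age)
    (6, 1, 0.9, 5.0),
    (12, 1, 0.7, 3.0),
    (18, 1, 0.4, 4.0),
    (24, 0, 0.2, 1.0),   # censored: no purchase observed
    (30, 1, 0.1, 2.0),
]

def cox_partial_loglik(beta, data):
    """Cox partial log-likelihood: at each event time, the log-probability
    that the household that purchased was the one to do so among all
    households still at risk (observed time >= that event time)."""
    ll = 0.0
    for t_i, event, *x_i in data:
        if not event:
            continue  # censored observations enter only through risk sets
        eta_i = sum(b * x for b, x in zip(beta, x_i))
        risk = [sum(b * x for b, x in zip(beta, x))
                for t, _, *x in data if t >= t_i]
        ll += eta_i - math.log(sum(math.exp(e) for e in risk))
    return ll

# A positive innovativeness coefficient fits this toy data better than a
# negative one, mirroring the hypothesis that innovative households adopt earlier.
print(cox_partial_loglik([1.0, 0.5], households) >
      cox_partial_loglik([-1.0, 0.5], households))
```

In practice one would maximize this likelihood numerically over both coefficients; the comparison above only shows the direction of the innovativeness effect on the toy data.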
Abstract:
This paper considers some of the implications of the rise of design as a master-metaphor of the information age. It compares the terms 'interaction design' and 'mass communication', suggesting that both can be seen as a contradiction in terms, inappropriately preserving an industrial-age division between producers and consumers. With the shift from mass media to interactive media, semiotic and political power seems to be shifting too, from media producers to designers. This paper argues that it is important for the new discipline of 'interaction design' not to fall into habits of thought inherited from the 'mass' industrial era. Instead it argues for the significance, for designers and producers alike, of what I call 'distributed expertise', including social network markets, DIY culture, user-led innovation, consumer co-created content, and the use of Web 2.0 affordances for social, scientific and creative purposes as well as for entertainment. It considers the importance of the growth of 'distributed expertise' as part of a new paradigm in the growth of knowledge, which has 'evolved' through a number of phases, from 'abstraction' to 'representation' to 'productivity'. In the context of technologically mediated popular participation in the growth of knowledge and social relationships, the paper argues that design and media-production professions need to cross rather than maintain the gap between experts and everyone else, enabling all the agents in the system to navigate the shift into the paradigm of mass productivity.
Abstract:
This paper discusses a method, Generation in Context, for interrogating theories of music analysis and music perception. Given an analytic theory, the method consists of creating a generative process that implements the theory in reverse. Instead of using the theory to create analyses from scores, the theory is used to generate scores from analyses. Subjective evaluation of the quality of the musical output provides a mechanism for testing the theory in a contextually robust fashion. The method is exploratory, meaning that in addition to testing extant theories it provides a general mechanism for generating new theoretical insights. We outline our initial explorations in the use of generative processes for music research, and we discuss how generative processes provide evidence as to the veracity of theories about how music is experienced, with insights into how these theories may be improved and, concurrently, provide new techniques for music creation. We conclude that Generation in Context will help reveal new perspectives on our understanding of music.
Abstract:
Multicarrier code division multiple access (MC-CDMA) is a very promising candidate for the multiple access scheme in fourth generation wireless communication systems. During asynchronous transmission, multiple access interference (MAI) is a major challenge for MC-CDMA systems and significantly affects their performance. The main objectives of this thesis are to analyze the MAI in asynchronous MC-CDMA and to develop robust techniques to reduce the MAI effect. The focus is first on the statistical analysis of MAI in asynchronous MC-CDMA. A new statistical model of MAI is developed, in which the derivation of MAI can be applied to different distributions of timing offset, and the MAI power is modelled as a Gamma distributed random variable. By applying the new statistical model of MAI, a new computer simulation model is proposed. This model is based on modelling a multiuser system as a single user system followed by an additive noise component representing the MAI, which enables the new simulation model to significantly reduce the computation load of computer simulations. MAI reduction using the slow frequency hopping (SFH) technique is the topic of the second part of the thesis. Two subsystems are considered. The first subsystem involves subcarrier frequency hopping as a group, which is referred to as GSFH/MC-CDMA. In the second subsystem, the condition of group hopping is dropped, resulting in a more general system, namely individual subcarrier frequency hopping MC-CDMA (ISFH/MC-CDMA). This research found that with the introduction of SFH, both GSFH/MC-CDMA and ISFH/MC-CDMA systems generate less MAI power than the basic MC-CDMA system during asynchronous transmission. Because of this, both SFH systems are shown to outperform MC-CDMA in terms of BER. This improvement, however, comes at the expense of spectral widening.
In the third part of this thesis, base station polarization diversity is introduced to asynchronous MC-CDMA as another MAI reduction technique. The combined system is referred to as Pol/MC-CDMA. In this part, a new optimum combining technique, namely maximal signal-to-MAI ratio combining (MSMAIRC), is proposed to combine the signals of the two base station antennas. With the application of MSMAIRC, and in the absence of additive white Gaussian noise (AWGN), the resulting signal-to-MAI ratio (SMAIR) is not only maximized but also independent of the cross polarization discrimination (XPD) and the antenna angle. When AWGN is present, the performance of MSMAIRC is still affected by the XPD and antenna angle, but to a much lesser degree than traditional maximal ratio combining (MRC). Furthermore, this research found that the BER performance of Pol/MC-CDMA can be further improved by changing the angle between the two receiving antennas. Hence the optimum antenna angles for both MSMAIRC and MRC are derived and their effects on the BER performance are compared. With the derived optimum antenna angle, the Pol/MC-CDMA system is able to achieve the lowest BER for a given XPD.
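The simplified simulation model from the first part of the thesis (a single-user link plus an additive term whose power is Gamma distributed, standing in for an explicit multiuser simulation) can be sketched as follows. The Gamma shape and scale values and the BPSK signalling are illustrative assumptions, not parameters taken from the thesis.

```python
import math
import random

random.seed(0)

# Illustrative Gamma parameters for the MAI power (not from the thesis):
# mean MAI power = shape * scale = 0.1 relative to unit signal power.
MAI_SHAPE, MAI_SCALE = 2.0, 0.05

def simulate_ber(n_bits=20000):
    """Single-user system followed by an additive MAI component: for each bit,
    draw a Gamma-distributed MAI power, then a Gaussian MAI sample with that
    power, and count sign flips of the BPSK decision statistic."""
    errors = 0
    for _ in range(n_bits):
        bit = random.choice([-1.0, 1.0])               # BPSK symbol
        mai_power = random.gammavariate(MAI_SHAPE, MAI_SCALE)
        mai = random.gauss(0.0, math.sqrt(mai_power))  # MAI sample at that power
        if (bit + mai) * bit < 0:                      # sign flip => bit error
            errors += 1
    return errors / n_bits

print(simulate_ber())
```

The computational saving claimed in the abstract comes from this structure: one noise draw per bit replaces the per-user spreading, offset and summation of a full multiuser simulation.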
Abstract:
This paper proposes a method for enhancing the stability of an autonomous microgrid with a distribution static compensator (DSTATCOM) and power sharing among multiple distributed generators (DGs). It is assumed that all the DGs are connected through voltage source converters (VSCs) and that all connected loads are passive, making the microgrid totally inertia-less. The VSCs are controlled in either state feedback or current feedback mode to achieve the desired voltage-current or power outputs, respectively. A modified angle droop is used for DG voltage reference generation. The power sharing ratio of the proposed droop control is established through derivation and verified by simulation results. A DSTATCOM is connected in the microgrid to provide ride-through capability during power imbalance, thereby enhancing system stability. This is established through extensive simulation studies using PSCAD.
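Angle droop, in its generic form, sets each DG's voltage-reference angle from its real-power output as delta_ref = delta_rated - m * (P - P_rated). The paper's modified droop and its actual gains are not reproduced in the abstract, so the numbers below are purely illustrative of how the droop gains fix the power sharing ratio.

```python
# Generic angle droop: each DG's VSC derives its voltage-reference angle
# from its measured real-power output (gains below are illustrative only).
def angle_ref(delta_rated, m, p, p_rated):
    return delta_rated - m * (p - p_rated)

# In steady state the DGs settle at a common angle deviation d_delta, so each
# DG picks up d_delta / m_i of extra power: droop gains in a 2:1 ratio
# therefore produce power sharing in a 1:2 ratio.
m1, m2 = 0.02, 0.01        # droop gains, rad per kW (illustrative)
d_delta = 0.001            # common steady-state angle deviation, rad (illustrative)
dp1, dp2 = d_delta / m1, d_delta / m2
print(dp2 / dp1)           # the DG with the smaller gain takes the larger share
```

This inverse-gain sharing rule is the property the paper establishes by derivation and then verifies in PSCAD simulation.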
Abstract:
The load–frequency control (LFC) problem has long been one of the major subjects in power systems. In practice, LFC systems use proportional–integral (PI) controllers. However, since these controllers are designed using a linear model, the non-linearities of the system are not accounted for, and they are incapable of achieving good dynamic performance over a wide range of operating conditions in a multi-area power system. A strategy for solving this problem, motivated by the distributed nature of a multi-area power system, is presented using a multi-agent reinforcement learning (MARL) approach. It consists of two agents in each power area: the estimator agent provides the area control error (ACE) signal based on the frequency bias estimation, and the controller agent uses reinforcement learning to control the power system, with genetic algorithm optimisation used to tune its parameters. This method does not depend on any knowledge of the system and admits considerable flexibility in defining the control objective. Also, by finding the ACE signal based on the frequency bias estimation, the LFC performance is improved, and by using MARL, parallel computation is realised, leading to a high degree of scalability. Here, to illustrate the accuracy of the proposed approach, a three-area power system example is given with two scenarios.
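The two-agent split can be illustrated with the standard ACE definition, ACE_i = dP_tie_i + B_i * df_i, and a minimal tabular Q-learning update for the controller agent. The learning rate, discount factor, action set and reward below are illustrative guesses, not values from the paper, which tunes such parameters with a genetic algorithm.

```python
# Estimator agent: standard area control error, ACE_i = dP_tie_i + B_i * df_i,
# where B_i is the (here, estimated) frequency-bias coefficient of area i.
def ace(dp_tie, bias, df):
    return dp_tie + bias * df

# Controller agent: a minimal tabular Q-learning update (parameters are
# illustrative; the paper tunes the controller with a genetic algorithm).
ALPHA, GAMMA = 0.1, 0.9
q = {}  # (state, action) -> estimated value

def q_update(state, action, reward, next_state, actions):
    best_next = max(q.get((next_state, a), 0.0) for a in actions)
    old = q.get((state, action), 0.0)
    q[(state, action)] = old + ALPHA * (reward + GAMMA * best_next - old)

# One illustrative control step: the state is the discretized ACE and the
# reward penalizes its magnitude, driving the area error toward zero.
actions = (-0.1, 0.0, 0.1)                    # candidate setpoint adjustments (p.u.)
error = ace(dp_tie=0.05, bias=20.0, df=-0.01)
state = round(error, 2)
q_update(state, 0.1, reward=-abs(error), next_state=0.0, actions=actions)
```

Because each area runs its own estimator/controller pair against local measurements, the updates in different areas can run in parallel, which is the source of the scalability noted in the abstract.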
Abstract:
Since its launch in 2001, the Creative Commons open content licensing initiative has received both praise and censure. While some have touted it as a major step towards removing the burdens copyright law imposes on creativity and innovation in the digital age, others have argued that it robs artists of their rightful income. This paper aims to provide a brief overview and analysis of the practical application of the Creative Commons licences five years after their launch. It looks at how the Creative Commons licences are being used and who is using them, and attempts to identify likely motivations for doing so. By identifying trends in how this licence use has changed over time, it also attempts to rebut arguments that Creative Commons is a movement of academics and hobbyists, and has no value for traditional organisations or working artists.
Abstract:
DMAPS (Distributed Multi-Agent Planning System) is a planning system developed for distributed multi-robot teams, based on MAPS (Multi-Agent Planning System). MAPS assumes that each agent has the same global view of the environment in order to determine the most suitable actions. This assumption fails when perception is local to the agents: each agent has only a partial and unique view of the environment. DMAPS addresses this problem by creating a probabilistic global view on each agent by fusing the perceptual information from each robot. The experimental results on consuming tasks show that while the probabilistic global view is not identical on each robot, the shared view is still effective in increasing the performance of the team.
Abstract:
This paper describes an application of decoupled probabilistic world modeling to achieve team planning. The research is based on the principle that the action selection mechanism of a member of a robot team can select an effective action if a global world model is available to all team members. In the real world, sensors are imprecise and individual to each robot, providing each robot with only a partial and unique view of the environment. We address this problem by creating a probabilistic global view on each agent by combining the perceptual information from each robot. This probabilistic view forms the basis for selecting actions to achieve the team goal in a dynamic environment. Experiments have been carried out to investigate the effectiveness of this principle using custom-built robots for real-world performance, in addition to extensive simulation results. The results show an improvement in team effectiveness when using probabilistic world modeling based on perception sharing for team planning.
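The fusion step, combining each robot's perceptual estimate into one probabilistic global view, is not specified in detail in the abstract. One standard scheme consistent with the description is independent-opinion pooling in log-odds form, sketched here for a single world-model cell; the papers' exact fusion rule may differ.

```python
import math

def logit(p):
    """Log-odds of a probability."""
    return math.log(p / (1.0 - p))

def fuse(probabilities, prior=0.5):
    """Fuse independent per-robot occupancy estimates for one cell by summing
    their log-odds relative to a common prior (a standard independent-opinion
    pooling scheme, assumed here for illustration)."""
    l = sum(logit(p) - logit(prior) for p in probabilities) + logit(prior)
    return 1.0 / (1.0 + math.exp(-l))  # back to a probability

# Robot A barely sees the object (0.6) while robot B sees it clearly (0.9):
# the fused belief is stronger than either single view.
print(fuse([0.6, 0.9]) > 0.9)
```

Under this rule, agreement between robots sharpens the shared view while a non-committal estimate (0.5) leaves it unchanged, which matches the observation that the fused views need not be identical across robots to improve team performance.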
Abstract:
Generative music systems can be performed by manipulating the values of their algorithmic parameters, and their semi-autonomous nature provides an opportunity for coordinated interaction amongst a network of systems, a practice we call Network Jamming. This paper outlines the characteristics of this networked performance practice and discusses the types of mediated musical relationships and ensemble configurations that can arise. We have developed and tested the jam2jam network jamming software over recent years. We describe this system, draw from our experiences with it, and use it to illustrate some characteristics of Network Jamming.