Abstract:
Current governance challenges facing the global games industry are heavily dominated by online games. Whilst much academic and industry attention has been afforded to Virtual Worlds, the more pressing contemporary challenges may arise in casual games, especially when found on social networks. As authorities are faced with an increasing volume of disputes between participants and platform operators, the likelihood of external regulation increases, and the role that such regulation would have on the industry – both internationally and within specific regions – is unclear. Kelly (2010) argues that “when you strip away the graphics of these [social] games, what you are left with is simply a button [...] You push it and then the game returns a value of either Win or Lose”. He notes that while “every game developer wants their game to be played, preferably addictively, because it’s so awesome”, these mechanics lead not to “addiction of engagement through awesomeness” but “the addiction of compulsiveness”, surmising that “the reality is that they’ve actually sort-of kind-of half-intentionally built a virtual slot machine industry”. If such core elements of social game design are questioned, this gives cause to question the real-money options to circumvent them. With players able to purchase virtual currency and speed the completion of tasks, the money invested by the 20% purchasing in-game benefits (Zainwinger, 2012) may well be the result of compulsion. The decision by the Japanese Consumer Affairs agency to investigate the ‘Kompu Gacha’ mechanic (in which players are rewarded for completing a set of items obtained through purchasing virtual goods such as mystery boxes), and the resultant verdict that such mechanics should be regulated through gambling legislation, demonstrates that politicians are beginning to look at the mechanics deployed in these environments. 
Purewal (2012) states that “there’s a reasonable argument that complete gacha would be regulated under gambling law under at least some (if not most) Western jurisdictions”. This paper explores the governance challenges within these games and platforms, their role in the global industry, and current practice amongst developers in Australia and the United States to address such challenges.
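Purewal’s point can be made concrete with a little arithmetic: completing a set of items obtained through uniform random draws is the classic coupon-collector problem, which is one reason complete-gacha (‘kompu gacha’) costs escalate so sharply with set size. A minimal sketch, assuming each purchase yields one of the set’s items uniformly at random (an assumption for illustration; real drop tables are typically weighted against rare items, making completion even more expensive):

```python
from fractions import Fraction

def expected_draws_to_complete(n_items):
    """Expected number of uniform random draws needed to obtain all
    n_items distinct items at least once (coupon-collector problem)."""
    # After k distinct items are held, a draw is new with probability
    # (n - k) / n, so the expected wait for the next new item is n / (n - k).
    return sum(Fraction(n_items, n_items - k) for k in range(n_items))

# A 5-item set takes about 11.4 paid draws on average to complete;
# a 10-item set takes about 29.3, illustrating the escalating cost.
print(float(expected_draws_to_complete(5)))
```

The expected cost grows as n·H(n), faster than linearly in set size, which is the structural feature regulators examined in the Kompu Gacha case.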
Abstract:
Big Data presents many challenges related to volume, whether one is interested in studying past datasets or, even more problematically, attempting to work with live streams of data. The most obvious challenge, in a ‘noisy’ environment such as contemporary social media, is to collect the pertinent information, be it information for a specific study, tweets which can inform emergency services or other responders to an ongoing crisis, or intelligence which can give an advantage to those involved in prediction markets. Often, such a process is iterative, with keywords and hashtags changing with the passage of time, and both collection and analytic methodologies need to be continually adapted to respond to this changing information. While many of the data sets collected and analyzed are pre-formed, that is, built around a particular keyword, hashtag, or set of authors, they still contain a large volume of information, much of which is unnecessary for the current purpose and/or potentially useful for future projects. Accordingly, this panel considers methods for separating and combining data to optimize big data research and report findings to stakeholders. The first paper considers possible coding mechanisms for incoming tweets during a crisis, taking a large stream of incoming tweets and selecting which of those need to be immediately placed in front of responders for manual filtering and possible action. The paper suggests two solutions: content analysis and user profiling. In the former case, aspects of the tweet are assigned a score to assess its likely relationship to the topic at hand and the urgency of the information, whilst the latter attempts to identify those users who are either serving as amplifiers of information or are known as an authoritative source.
Through these techniques, the information contained in a large dataset could be filtered down to match the expected capacity of emergency responders, and knowledge of the core keywords or hashtags relating to the current event is constantly refined for future data collection. The second paper is also concerned with identifying significant tweets, but in this case tweets relevant to a particular prediction market: tennis betting. As increasing numbers of professional sportspeople create Twitter accounts to communicate with their fans, information is being shared regarding injuries, form and emotions which has the potential to impact on future results. As has already been demonstrated with leading US sports, such information is extremely valuable. Tennis, like American Football (NFL) and Baseball (MLB), has paid subscription services which manually filter incoming news sources, including tweets, for information valuable to gamblers, gambling operators, and fantasy sports players. However, whilst such services remain niche operations, much of the value of the information is lost by the time it reaches one of these services. The paper thus considers how information could be filtered from Twitter user lists and hashtag or keyword monitoring, assessing the value of the source, the information, and the prediction markets to which it may relate. The third paper examines methods for collecting Twitter data and following changes in an ongoing, dynamic social movement, such as the Occupy Wall Street movement. It involves the development of technical infrastructure to collect the tweets and make them available for exploration and analysis. A strategy to respond to changes in the social movement is also required, or the resulting tweets will only reflect the discussions and strategies the movement used at the time the keyword list was created; in this sense, keyword creation is part strategy and part art.
In this paper we describe strategies for the creation of a social media archive, specifically tweets related to the Occupy Wall Street movement, and methods for continuing to adapt data collection strategies as the movement’s presence on Twitter changes over time. We also discuss the opportunities and methods to extract smaller slices of data from an archive of social media data to support a multitude of research projects in multiple fields of study. The common theme amongst these papers is that of constructing a data set, filtering it for a specific purpose, and then using the resulting information to aid in future data collection. The intention is that through the papers presented, and subsequent discussion, the panel will inform the wider research community not only on the objectives and limitations of data collection, live analytics, and filtering, but also on current and in-development methodologies that could be adopted by those working with such datasets, and how such approaches could be customized depending on the project stakeholders.
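The content-analysis and user-profiling filtering described in the first paper might be sketched as follows; the term lists, weights and author bonus below are hypothetical illustrations, not the panel’s actual coding scheme:

```python
def score_tweet(tweet, topic_terms, urgency_terms, trusted_authors):
    """Crude relevance score for a tweet dict with 'text' and 'author' keys.
    Topic matches count once each, urgency matches are weighted double,
    and a known authoritative author adds a fixed bonus (all weights are
    illustrative assumptions)."""
    words = set(tweet["text"].lower().split())
    score = len(words & topic_terms)           # content analysis: topicality
    score += 2 * len(words & urgency_terms)    # content analysis: urgency
    if tweet["author"] in trusted_authors:     # user profiling component
        score += 3
    return score

def triage(tweets, capacity, **kw):
    """Keep only the top-scoring tweets, matched to responder capacity."""
    ranked = sorted(tweets, key=lambda t: score_tweet(t, **kw), reverse=True)
    return ranked[:capacity]
```

In practice the term lists would themselves be refined iteratively as the event’s vocabulary shifts, mirroring the feedback loop the panel describes.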
Abstract:
Many applications can benefit from the accurate surface temperature estimates that can be made using a passive thermal-infrared camera. However, the process of radiometric calibration which enables this can be both expensive and time consuming. An ad hoc approach for performing radiometric calibration is proposed which does not require specialized equipment and can be completed in a fraction of the time of the conventional method. The proposed approach utilizes the mechanical properties of the camera to estimate scene temperatures automatically, and uses these target temperatures to model the effect of sensor temperature on the digital output. A comparison with a conventional approach using a blackbody radiation source shows that the accuracy of the method is sufficient for many tasks requiring temperature estimation. Furthermore, a novel visualization method is proposed for displaying the radiometrically calibrated images to human operators. The representation employs an intuitive coloring scheme and allows the viewer to perceive a large variety of temperatures accurately.
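The abstract does not give the calibration model itself; a minimal sketch, assuming digital counts vary linearly with both scene temperature and sensor temperature (the sample data below are fabricated for illustration, and real sensor responses are generally more complex):

```python
import numpy as np

# Hypothetical calibration samples: digital counts observed for known
# scene temperatures (deg C) at two different camera sensor temperatures.
scene_t  = np.array([20.0, 30.0, 40.0, 20.0, 30.0, 40.0])
sensor_t = np.array([25.0, 25.0, 25.0, 35.0, 35.0, 35.0])
counts   = np.array([7100., 7400., 7700., 7040., 7340., 7640.])

# Least-squares fit of counts ~ a*scene_t + b*sensor_t + c, capturing the
# effect of sensor temperature on the digital output.
A = np.column_stack([scene_t, sensor_t, np.ones_like(scene_t)])
(a, b, c), *_ = np.linalg.lstsq(A, counts, rcond=None)

def estimate_scene_temp(count, sensor_temp):
    """Invert the fitted model to recover scene temperature."""
    return (count - b * sensor_temp - c) / a
```

Once fitted, the same inversion can be applied per pixel to produce the radiometrically calibrated images the paper visualizes.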
Abstract:
Performance of urban transit systems may be quantified and assessed using transit capacity and productive capacity in planning, design and operational management activities. Bunker (4) defines important productive performance measures of an individual transit service and transit line, which are extended in this paper to quantify efficiency and operating fashion of transit services and lines. Comparison of a hypothetical bus line’s operation during a morning peak hour and a daytime hour demonstrates the usefulness to the operator of productiveness efficiency, passenger transmission efficiency, passenger churn, and average proportion of line length traveled in understanding their services’ and lines’ productive performance, operating characteristics, and quality of service. Productiveness efficiency can flag potential pass-up activity under high load conditions, as well as ineffective resource deployment. Proportion of line length traveled can directly measure operating fashion. These measures can be used to compare between lines/routes and, within a given line, various operating scenarios and time horizons to target improvements. The next research stage is investigating within-line variation using smart card passenger data and field observation of pass-ups. Insights will be used to further develop practical guidance to operators.
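As a toy illustration of one of these measures, average proportion of line length traveled might be computed as below; the precise definition used by Bunker is not reproduced in this abstract, so treat the formula as an assumption:

```python
def avg_proportion_line_length_traveled(trip_lengths_km, line_length_km):
    """Average fraction of the line each boarding passenger rides.
    Values near 1.0 suggest line-haul operation; low values suggest a
    high-churn service made up of short overlapping trips."""
    n = len(trip_lengths_km)
    return sum(trip_lengths_km) / (n * line_length_km)
```

On an 8 km line, three passengers riding 2, 2 and 4 km average one third of the line each, a pattern consistent with high passenger churn.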
Abstract:
Aim Facilities in retirement villages form a supportive environment for older residents. The purpose of this paper is to investigate the provision of these facilities in retirement villages, which are regarded as a viable accommodation option for the ever-increasing ageing population in Australia. Method A content analysis of 124 retirement villages operated by 22 developers in Queensland and South Australia was conducted for the research purpose. Results The most widely provided facilities are community centres, libraries, barbeque facilities, hairdressers/salons and billiards/snooker/pool tables. Commercial operators provide more facilities than not-for-profit organisations, and larger retirement villages normally have more facilities due to the economies of scale involved. Conclusions The results of the study provide a useful reference for providing facilities within retirement villages that may support quality lifestyles for older residents.
Abstract:
This thesis considers and evaluates different approaches to regulating online gaming communities, including traditional top-down regulation, as well as bottom-up and hybrid forms led by participants. I examine the regulatory environment in both the video game and gambling industries through case studies of the science fiction, massively multiplayer game Eve Online and offshore gambling platforms and their community sites. I identify that the participant driven approach to regulation sometimes used in the offshore gambling industry was dependent on a number of factors, notably the strength of the community and the risks to platform operators of negative publicity. By subsequently comparing this to the video gaming industry, I suggest that participant driven processes may be an appropriate way to resolve disputes in the games industry, and show how these are – to a limited extent – already being applied.
Abstract:
The safety of passengers is a major concern to airports. In the event of crises, having an effective and efficient evacuation process in place can significantly aid in enhancing passenger safety. Hence, it is necessary for airport operators to have an in-depth understanding of the evacuation process of their airport terminal. Although evacuation models have been used in studying pedestrian behaviour for decades, little research has been done in considering the evacuees’ group dynamics and the complexity of the environment. In this paper, an agent-based model is presented to simulate the passenger evacuation process. Different exits were allocated to passengers based on their location and security level. The simulation results show that the evacuation time can be influenced by passenger group dynamics. This model also provides a convenient way to design airport evacuation strategy and examine its efficiency. The model was created using AnyLogic software and its parameters were initialised using recent research data published in the literature.
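The exit-allocation rule is not specified in the abstract; a minimal sketch, assuming each passenger is simply sent to the nearest exit their security level permits (the coordinate and level scheme is invented for illustration, and the AnyLogic model is surely richer):

```python
import math

def assign_exits(passengers, exits):
    """Assign each passenger to the nearest permitted exit.
    passengers: list of (x, y, security_level) tuples.
    exits: list of (x, y, min_level) tuples; a passenger may use an exit
    only if their security level is at least the exit's minimum level."""
    assignments = []
    for px, py, level in passengers:
        allowed = [(ex, ey) for ex, ey, min_level in exits if level >= min_level]
        # Euclidean distance as a stand-in for walking distance.
        nearest = min(allowed, key=lambda e: math.hypot(e[0] - px, e[1] - py))
        assignments.append(nearest)
    return assignments
```

In an agent-based setting this rule would run per agent each time step, with group dynamics then modifying the chosen paths.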
Abstract:
In recent years, the imperative to communicate organisational impacts to a variety of stakeholders has gained increasing importance within all sectors. Despite growing external demands for evaluation and social impact measurement, there has been limited critically informed analysis about the presumed importance of these activities to organisational success and the practical challenges faced by organisations in undertaking such assessment. In this paper, we present the findings from an action research study of five Australian small to medium social enterprises’ practices and use of evaluation and social impact analysis. Our findings have implications for social enterprise operators, policy makers and social investors regarding when, why and at what level these activities contribute to organisational performance and the fulfilment of mission.
Abstract:
The central document governing the global organization of Air Navigation Services (ANS) is the Convention on International Civil Aviation, commonly referred to as the “Chicago Convention,” whose original version was signed in that city in 1944. In the Convention, Contracting States agreed to ensure the minimum standards of ANS established by ICAO, a specialized United Nations agency created by the Convention. Emanating from obligations under the Chicago Convention, ANS has traditionally been provided by departments of national governments. However, there is a widespread trend toward transferring delivery of ANS services outside of line departments of national governments to independent agencies or corporations. The Civil Air Navigation Services Organisation (CANSO), which is the trade association for independent ANS providers, currently counts approximately 60 members, and is steadily growing. However, whatever delivery mechanisms are chosen, national governments remain ultimately responsible for ensuring that adequate ANS services are available. The provision by governments of ANS reflects the responsibility of the state for safety, international relations, and indirectly, the macroeconomic benefits of ensuring a sound infrastructure for aviation. ANS is a “public good” and an “essential good” provided to all aircraft using a country’s airfields and airspace. However, ANS also represents a service that directly benefits only a limited number of users, notably aircraft owners and operators. The idea that the users of the system, rather than the taxpaying public, should incur the costs associated with ANS provision is inherent in the commercialization process. However, ICAO sets out broad principles for the establishment of user charges, with which member states are expected to comply. ICAO states that only distance flown and aircraft weights are acceptable parameters for use in a charging system.
These two factors are considered to be easy to measure, bear a reasonable relationship to the value of service received, and do not discriminate due to factors such as where the flight originated or the nation of aircraft registration.
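A charging scheme built from only these two ICAO-sanctioned parameters can be sketched as below; the square-root weight factor follows the Eurocontrol-style route-charge formula, used here as one concrete example rather than anything mandated by ICAO:

```python
import math

def route_charge(unit_rate, distance_km, mtow_tonnes):
    """En-route charge from the two ICAO-accepted parameters only.
    Eurocontrol-style form: charge = unit_rate * (distance/100) * sqrt(MTOW/50).
    The charge scales linearly with distance flown and sub-linearly with
    maximum take-off weight, reflecting value of service received."""
    distance_factor = distance_km / 100.0          # distance in 100 km units
    weight_factor = math.sqrt(mtow_tonnes / 50.0)  # normalised to a 50 t aircraft
    return unit_rate * distance_factor * weight_factor
```

Note that neither the flight’s origin nor the state of registration enters the formula, consistent with the non-discrimination principle described above.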
Abstract:
E-mail spam remains a scourge and a menacing nuisance for users, internet and network service operators, and providers, in spite of the anti-spam techniques available; spammers relentlessly circumvent the anti-spam software installed on both the client and server sides of fixed and mobile devices. This continuous evasion degrades the capabilities of these anti-spam techniques, as none of them provides a comprehensive, reliable solution to the problem posed by spam and spammers. A major problem arises, for instance, when these techniques misclassify legitimate email as spam (a false positive), or fail to block spam on the originating SMTP server (a false negative). The spam then passes on to the receiver, yet the server from which it originates has no auto-alert mechanism to indicate that the spam it was designed to prevent has slipped through; the receiver’s SMTP server may in turn fail to stop the spam from reaching the user’s device, again with no auto-alert mechanism to flag this failure, causing a staggering cost in lost time, effort and money. This paper takes a comparative overview of the literature on these anti-spam techniques, especially the filtering technologies designed to prevent spam, examining their merits and demerits with a view to enhancing their capabilities, and offers analytical recommendations that will be subject to further research.
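Among the filtering techniques such a survey would cover is Bayesian content filtering; a minimal naive Bayes sketch with add-one smoothing (the training messages below are invented for illustration, and production filters combine many more signals):

```python
import math
from collections import Counter

def train(spam_docs, ham_docs):
    """Build word-frequency tables for a naive Bayes spam filter."""
    spam_counts = Counter(w for d in spam_docs for w in d.lower().split())
    ham_counts = Counter(w for d in ham_docs for w in d.lower().split())
    vocab = set(spam_counts) | set(ham_counts)
    return spam_counts, ham_counts, vocab

def is_spam(text, spam_counts, ham_counts, vocab, prior_spam=0.5):
    """Classify by comparing log-posteriors under each class, with
    add-one (Laplace) smoothing to handle unseen words."""
    log_spam, log_ham = math.log(prior_spam), math.log(1 - prior_spam)
    n_spam, n_ham = sum(spam_counts.values()), sum(ham_counts.values())
    for w in text.lower().split():
        log_spam += math.log((spam_counts[w] + 1) / (n_spam + len(vocab)))
        log_ham += math.log((ham_counts[w] + 1) / (n_ham + len(vocab)))
    return log_spam > log_ham

# Invented toy corpus for demonstration only.
spam = ["win free prize now", "free money claim prize"]
ham = ["meeting agenda attached", "lunch plans for tomorrow"]
sc, hc, v = train(spam, ham)
```

The false-positive/false-negative trade-off discussed above corresponds to where the decision threshold (here, equal log-posteriors) is placed.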
Abstract:
In coastal areas, extreme weather events, such as floods and cyclones, can have debilitating effects on the social and economic viability of marine-based industries. In March 2011, the Great Barrier Reef Marine Park Authority implemented an Extreme Weather Response Program, following a period of intense flooding and cyclonic activity between December 2010 and February 2011. In this paper, we discuss the results of a project within the Program, which aimed to: (1) assess the impacts of extreme weather events on regional tourism and commercial fishing industries; and (2) develop and road-test an impact assessment matrix to improve government and industry responses to extreme weather events. Results revealed that extreme weather events both directly and indirectly affected all five of the measured categories, i.e. ecological, personal, social, infrastructure and economic components. The severity of these impacts, combined with their location and the nature of their business, influenced how tourism operators and fishers assessed the impact of the events (low, medium, high or extreme). The impact assessment tool was revised following feedback obtained during stakeholder workshops and may prove useful for managers in responding to potential direct and indirect impacts of future extreme weather events on affected marine industries. © 2013 Planning Institute Australia.
Abstract:
Reverse osmosis (RO) is used by coal seam gas (CSG) operators to treat produced water as it is a well-established and proven technology worldwide. Despite the suitability of RO, there are problems associated with RO technology such as membrane fouling which although not preventing use of RO does decrease effectiveness and increase operating costs. Hence, effective pre-treatment of water samples is essential. Electrocoagulation (EC) potentially can provide improved water purification compared to conventional coagulation prior to an RO unit. This paper provides the first reported study of EC for CSG water pre-treatment and compares the performance to a range of aluminium and iron based coagulants. It was found that EC was superior in terms of removal of silica, calcium, magnesium, barium and strontium in the produced water.
Abstract:
The interest in utilising multiple heterogeneous Unmanned Aerial Vehicles (UAVs) in close proximity is growing rapidly. As such, many challenges are presented in the effective coordination and management of these UAVs; converting the current n-to-1 paradigm (n operators operating a single UAV) to the 1-to-n paradigm (one operator managing n UAVs). This paper introduces an Information Abstraction methodology used to produce the functional capability framework initially proposed by Chen et al. and its Level Of Detail (LOD) indexing scale. This framework was validated by comparing the operator workload and Situation Awareness (SA) across three experiment scenarios involving multiple heterogeneous autonomous UAVs. The first scenario was set in a high LOD configuration with highly abstracted UAV functional information; the second scenario was set in a mixed LOD configuration; and the final scenario was set in a low LOD configuration with maximal UAV functional information. Results show a statistically significant decrease in operator workload when a UAV’s functional information is displayed in its physical form (low LOD, maximal information) compared with the mixed LOD configuration.
Abstract:
Market operators in New Zealand and Australia, such as the New Zealand Exchange (NZX) and the Australian Securities Exchange (ASX), have the regulatory power in their listing rules to issue queries to their market participants to explain unusual fluctuations in trading price and/or volume in the market. The operator will issue a price query where it believes that the market has not been fully informed as to price relevant information. Responsive regulation theory has informed much of the regulatory debate in securities laws in the region. Price queries map onto the lower level of the enforcement pyramid envisaged by responsive regulation and are one strategy that a market operator can use in communicating its compliance expectations to its stakeholders. The issue of a price query may be a precursor to more severe enforcement activities. The aim of this study is to investigate whether increased use of price queries by the securities market operator in New Zealand corresponded with an increase in disclosure frequency by all participating companies. The study finds that an increased use of price queries did correspond with an increase in disclosure frequency. A possible explanation for this finding is that price queries are an effective means of appealing to the factors that motivate corporations, and the individuals who control them, to comply with the law and regulatory requirements. This finding will have implications for both the NZX and the ASX as well as for regulators and policy makers generally.
Abstract:
Semantic Space models, which provide a numerical representation of words’ meaning extracted from a corpus of documents, have been formalized in terms of Hermitian operators over real-valued Hilbert spaces by Bruza et al. [1]. The collapse of a word into a particular meaning has been investigated by applying the notion of quantum collapse of superpositional states [2]. While the semantic association between words in a Semantic Space can be computed by means of the Minkowski distance [3] or the cosine of the angle between the vector representations of each pair of words, a new procedure is needed in order to establish relations between two or more Semantic Spaces. We address the question: how can the distance between different Semantic Spaces be computed? By representing each Semantic Space as a subspace of a more general Hilbert space, the relationship between Semantic Spaces can be computed by means of the subspace distance. Such a distance needs to take into account the difference in dimension between subspaces. The availability of a distance for comparing different Semantic Subspaces would enable a deeper understanding of the geometry of Semantic Spaces, which could possibly translate into better effectiveness in Information Retrieval tasks.
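One standard choice of subspace distance is the Frobenius norm of the difference of the orthogonal projectors onto the two subspaces, which is well defined even when the subspaces differ in dimension; this is offered as a concrete illustration of the idea, not necessarily the procedure the authors propose:

```python
import numpy as np

def subspace_distance(A, B):
    """Distance between the column spans of matrices A and B, computed as
    the Frobenius norm of the difference of their orthogonal projectors.
    Equivalent to a sum-of-squared-sines of principal angles (up to a
    factor of 2 for equal dimensions), and well defined for subspaces of
    unequal dimension."""
    Qa, _ = np.linalg.qr(A)   # orthonormal basis for span(A)
    Qb, _ = np.linalg.qr(B)   # orthonormal basis for span(B)
    Pa = Qa @ Qa.T            # projector onto span(A)
    Pb = Qb @ Qb.T            # projector onto span(B)
    return np.linalg.norm(Pa - Pb)
```

Because the projectors are basis-independent, the distance depends only on the subspaces themselves, which is exactly the invariance a comparison of Semantic Spaces requires.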