218 results for Problem Gambling
Abstract:
Contemporary online environments suffer from a regulatory gap; that is, there are few options for participants between customer service departments and potentially expensive court cases in foreign jurisdictions. Whatever form of regulation ultimately fills that gap will be charged with determining whether specific behavior, within a specific environment, is fair or foul; whether it is cheating or not. However, cheating is a term that, despite substantial academic study, remains problematic. Is anything the developer doesn't want you to do cheating? Is it cheating only if your actions breach the formal terms of service? What about community norms: do they matter at all? All of these remain largely unresolved questions, owing to the lack of public determination of cases in such environments, which have mostly been settled prior to legal action. In this paper, I propose a re-branding of participant activity in such environments into three categories: developer-sanctioned activity, advantage play, and cheating. Advantage play, ultimately, is activity in which the player turns the mechanics of the environment to their advantage without breaching its rules. Such a definition, and the term itself, is based on usage within the gambling industry, in which advantage play is considered betting with the advantage in the player's favor rather than the house's. Through examples from both the gambling industry and the Massively Multiplayer Online Role-Playing Game Eve Online, I consider the problems in defining cheating, suggest how the term 'advantage play' may be useful in understanding participants' behavior in contemporary environments, and ultimately consider the use of such terminology in dispute resolution models which may overcome this regulatory gap.
Abstract:
Big Data presents many challenges related to volume, whether one is interested in studying past datasets or, even more problematically, attempting to work with live streams of data. The most obvious challenge, in a 'noisy' environment such as contemporary social media, is to collect the pertinent information, be that information for a specific study, tweets which can inform emergency services or other responders to an ongoing crisis, or signals which give an advantage to those involved in prediction markets. Often, such a process is iterative, with keywords and hashtags changing over time, so both collection and analytic methodologies need to be continually adapted to respond to this changing information. While many of the data sets collected and analyzed are preformed, that is, built around a particular keyword, hashtag, or set of authors, they still contain a large volume of information, much of which is unnecessary for the current purpose and/or potentially useful for future projects. Accordingly, this panel considers methods for separating and combining data to optimize big data research and report findings to stakeholders.

The first paper considers possible coding mechanisms for incoming tweets during a crisis, taking a large stream of incoming tweets and selecting which of those need to be immediately placed in front of responders for manual filtering and possible action. The paper suggests two solutions: content analysis and user profiling. In the former, aspects of the tweet are assigned a score to assess its likely relationship to the topic at hand and the urgency of the information, whilst the latter attempts to identify those users who are either serving as amplifiers of information or are known as an authoritative source (a minimal sketch of this triage appears after this abstract). Through these techniques, the information contained in a large dataset can be filtered down to match the expected capacity of emergency responders, and knowledge of the core keywords or hashtags relating to the current event is constantly refined for future data collection.

The second paper is also concerned with identifying significant tweets, but in this case tweets relevant to a particular prediction market: tennis betting. As increasing numbers of professional sportsmen and sportswomen create Twitter accounts to communicate with their fans, information is being shared regarding injuries, form, and emotions which has the potential to impact on future results. As has already been demonstrated with leading US sports, such information is extremely valuable. Tennis, like American Football (NFL) and Baseball (MLB), has paid subscription services which manually filter incoming news sources, including tweets, for information valuable to gamblers, gambling operators, and fantasy sports players. However, whilst such services remain niche operations, much of the value of the information is lost by the time it reaches one of them. The paper thus considers how information could be filtered from Twitter user lists and hashtag or keyword monitoring, assessing the value of the source, the information, and the prediction markets to which it may relate.

The third paper examines methods for collecting Twitter data and following changes in an ongoing, dynamic social movement, such as the Occupy Wall Street movement. It involves the development of technical infrastructure to collect the tweets and make them available for exploration and analysis. A strategy to respond to changes in the social movement is also required, or the resulting tweets will only reflect the discussions and strategies the movement used at the time the keyword list was created; in a way, keyword creation is part strategy and part art. In this paper we describe strategies for the creation of a social media archive, specifically of tweets related to the Occupy Wall Street movement, and methods for continuing to adapt data collection strategies as the movement's presence on Twitter changes over time. We also discuss the opportunities and methods for extracting smaller slices of data from an archive of social media data to support a multitude of research projects in multiple fields of study.

The common theme amongst these papers is that of constructing a data set, filtering it for a specific purpose, and then using the resulting information to aid future data collection. The intention is that, through the papers presented and the subsequent discussion, the panel will inform the wider research community not only on the objectives and limitations of data collection, live analytics, and filtering, but also on current and in-development methodologies that could be adopted by those working with such datasets, and how such approaches could be customized depending on the project stakeholders.
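As an illustration of the first paper's two filters, the sketch below scores incoming tweets by topical content, urgency terms, and author profile, then keeps only as many as responders can manually review. The keyword lists, weights, and user sets are hypothetical; the paper's actual coding scheme is not given in the abstract.

```python
# Minimal tweet-triage sketch: content analysis + user profiling.
# All term weights, user sets, and thresholds below are illustrative.

TOPIC_TERMS = {"flood": 2.0, "evacuate": 3.0, "#qldfloods": 2.5}
URGENCY_TERMS = {"trapped": 5.0, "help": 4.0, "injured": 5.0}
AUTHORITATIVE = {"QPSmedia", "BOM_au"}   # hypothetical official sources
AMPLIFIERS = {"local_news_fan"}          # hypothetical high-reach relayers

def score_tweet(text, author, retweets):
    words = text.lower().split()
    content = sum(w for term, w in TOPIC_TERMS.items() if term in words)
    urgency = sum(w for term, w in URGENCY_TERMS.items() if term in words)
    profile = 3.0 if author in AUTHORITATIVE else (1.5 if author in AMPLIFIERS else 0.0)
    return content + urgency + profile + 0.1 * retweets

def triage(tweets, capacity):
    # Keep only as many tweets as responders can manually review.
    return sorted(tweets, key=lambda t: -score_tweet(*t))[:capacity]

stream = [
    ("family trapped by flood near bridge, please help", "some_user", 12),
    ("nice weather today", "another_user", 0),
    ("evacuate low-lying areas now #qldfloods", "QPSmedia", 340),
]
print(triage(stream, capacity=2))
```

The top-scored tweets would then be surfaced to responders, while frequent terms in accepted tweets could feed back into the collection keyword list.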
Abstract:
Scholars in Context: Prospects and Transitions is an edited collection of papers from Face to Face, the 1996 University of Queensland Graduate School of Education Postgraduate Conference. It presents current research undertaken in one of Australia's largest and leading centres for postgraduate research in education. The book is divided into three sections: classrooms through different lenses, in which a variety of classroom related issues are addressed through a range of frameworks; the big picture: global issues, which provides national and international perspectives on policy and cultural issues in a range of education sectors; and framing the individual: perspectives and insights, which includes different strands of research into individuals' development in the context of families and schools. Scholars in Context: Prospects and Transitions demonstrates how current researchers maintain a commitment to innovation and rigour, despite the current uncertainties that bedevil higher education. The work presented here makes a significant contribution to many fields of education research. The range of issues this collection addresses, the variety of theoretical and analytical perspectives adopted, and the scholarship evidenced in each contribution, make this text a valuable compendium of very recent work in education research.
Abstract:
Server consolidation using virtualization technology has become an important approach to improving the energy efficiency of data centers, and virtual machine placement is the key problem in server consolidation. In the past few years, many approaches to virtual machine placement have been proposed. However, existing approaches consider only the energy consumed by the physical machines in a data center, not the energy consumed by its communication network. The energy consumption of the communication network is not trivial, however, and should therefore be considered in virtual machine placement. In our preliminary research, we proposed a genetic algorithm for a new virtual machine placement problem that considers the energy consumption of both the physical machines and the communication network in a data center. Aiming to improve the performance and efficiency of that genetic algorithm, this paper presents a hybrid genetic algorithm for the energy-efficient virtual machine placement problem. Experimental results show that the hybrid genetic algorithm significantly outperforms the original genetic algorithm, and that it is scalable.
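The abstract does not specify the hybridization, so the following is only a minimal sketch of one common design: a genetic algorithm over placement vectors whose offspring are refined by hill-climbing, with a toy energy model charging a fixed cost per active server plus a cost for traffic between VMs on different servers. The parameters and traffic matrix are hypothetical, and capacity constraints are omitted.

```python
import random

random.seed(1)
SERVERS, VMS = 4, 8
SERVER_POWER, LINK_POWER = 10.0, 1.0
# Hypothetical pairwise traffic between VMs.
traffic = {(i, j): random.randint(0, 5) for i in range(VMS) for j in range(i + 1, VMS)}

def energy(placement):  # placement[v] = server hosting VM v
    servers_on = len(set(placement))
    net = sum(t for (i, j), t in traffic.items() if placement[i] != placement[j])
    return SERVER_POWER * servers_on + LINK_POWER * net

def hill_climb(p):
    # Local search: move single VMs while any move lowers energy.
    improved = True
    while improved:
        improved = False
        for v in range(VMS):
            for s in range(SERVERS):
                q = p[:]; q[v] = s
                if energy(q) < energy(p):
                    p, improved = q, True
    return p

def hybrid_ga(pop_size=20, gens=30):
    pop = [[random.randrange(SERVERS) for _ in range(VMS)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=energy)
        elite = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(elite):
            a, b = random.sample(elite, 2)
            cut = random.randrange(1, VMS)
            child = a[:cut] + b[cut:]                 # one-point crossover
            if random.random() < 0.2:                 # mutation
                child[random.randrange(VMS)] = random.randrange(SERVERS)
            children.append(hill_climb(child))        # the 'hybrid' step
        pop = elite + children
    return min(pop, key=energy)

best = hybrid_ga()
print(best, energy(best))
```

The local-search step is what distinguishes the hybrid from a plain GA: crossover explores globally while hill-climbing repairs each offspring into a local optimum.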
Abstract:
Governments are challenged by the need to ensure that ageing populations stay active and engaged as they age. Therefore, it is critical to investigate the role of mobility in older people's engagement in out-of-home activities, and to identify the experiences they have within their communities. This research investigates the use of transportation by older people and its implications for their out-of-home activities within suburban environments. The qualitative, mixed-method approach employs data collection methods which include a daily travel diary (including a questionnaire), Global Positioning System (GPS) tracking, and semi-structured interviews with older people living in suburban environments in Brisbane, Australia. Results show that older people are mobile throughout the city, and the car provides them with the opportunity to access desired destinations. The ability to drive allows older people to live independently and to assist others who do not drive, particularly where transport alternatives are less accessible. The ability to transport goods and other people is a significant advantage of the private car over other transport options. People without access to private transportation who live in low-density environments are disadvantaged when it comes to participation within the community. Further research is needed to better understand the relationship between transportation and participation within the community environment, to assist policy makers and city and transportation planners in developing strategies for age-friendly environments within the community.
Abstract:
Purpose – The purpose of this paper is to explore the role of leadership in problem-oriented policing (POP). Design/methodology/approach – This paper uses interrupted time series models to isolate the impact on crime trends of a transformational leader's efforts to spearhead the implementation of a program of POP, called the problem solving model (PSM), in a southern state in Australia. Findings – This paper finds that the PSM led directly to an impact on overall crime, with a significant reduction in crimes per 100,000 persons per year after the introduction of the PSM. The majority of the overall crime drop attributable to the implementation of POP was driven by reductions in property crime. The leadership influence of the PSM was not, however, effective in reducing all types of crime: crimes against the person were not affected by the introduction of the PSM, and public nuisance crimes largely followed the forecasted, upward trajectory. Practical implications – The driver behind the PSM was Commissioner Hyde, and the success of the PSM is largely attributable to his strong commitment to transformational leadership and a top-down approach to implementation. These qualities encapsulate the original ideas that Goldstein (1979, 2003) highlighted as critical for the success of future POP programs. Social implications – Reducing crime is an important part of creating safe communities and improving quality of life for all citizens. This research shows that the successful implementation of the PSM within South Australia under the strong leadership of Commissioner Hyde was a major factor in reducing property crime and overall crime rates. Originality/value – This paper is valuable because it demonstrates the link between strong leadership in policing, the commissioner's vision for POP, and how that vision translated into widespread adoption of POP. The study empirically shows that the statewide adoption of POP led to significant reductions in crime, particularly property crime.
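Interrupted time series analysis of this kind is often implemented as a segmented regression with a level-change and a slope-change term at the intervention point. The sketch below, on simulated monthly crime rates, shows that form; the paper's actual specification (for example, ARIMA-based forecasting of the counterfactual trend) may differ, and all numbers here are illustrative.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated monthly crime rates per 100,000 persons; values are illustrative.
rng = np.random.default_rng(0)
n, break_point = 120, 60                      # 10 years; intervention at month 60
t = np.arange(n)
post = (t >= break_point).astype(int)          # 1 after the PSM is introduced
time_since = np.where(post, t - break_point, 0)
rate = 500 + 0.8 * t - 40 * post - 1.5 * time_since + rng.normal(0, 10, n)

df = pd.DataFrame({"rate": rate, "t": t, "post": post, "time_since": time_since})
model = smf.ols("rate ~ t + post + time_since", data=df).fit()
# 'post' estimates the immediate level change, 'time_since' the slope change.
print(model.summary())
```

A significant negative coefficient on `post` (or `time_since`) is the kind of evidence used to attribute a crime drop to the intervention rather than to the pre-existing trend.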
Abstract:
The sum of k mins protocol was proposed by Hopper and Blum as a protocol for secure human identification. The goal of the protocol is to let an unaided human securely authenticate to a remote server. The main ingredient of the protocol is the sum of k mins problem, and the difficulty of solving this problem determines the security of the protocol. In this paper, we show that the sum of k mins problem is NP-Complete and W[1]-Hard. The latter notion relates to fixed-parameter intractability. We also discuss the use of the sum of k mins protocol in resource-constrained devices.
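On one reading of the Hopper-Blum formulation, the secret is a set of k index pairs and a response is the sum, modulo 10, of min(c_i, c_j) over those pairs for a challenge vector c; the details here (the modulus, the pair structure) should be treated as assumptions. The brute-force sketch below makes the source of hardness concrete: the search space grows as C(C(n,2), k).

```python
from itertools import combinations

def consistent(secret_pairs, samples, mod=10):
    # A candidate secret must reproduce every observed (challenge, response).
    return all(
        sum(min(c[i], c[j]) for i, j in secret_pairs) % mod == r
        for c, r in samples
    )

def brute_force(n, k, samples, mod=10):
    # Exhaustive search over all k-subsets of index pairs.
    all_pairs = list(combinations(range(n), 2))
    for candidate in combinations(all_pairs, k):
        if consistent(candidate, samples, mod):
            yield candidate

# Tiny demo: n = 5 positions, a secret of k = 2 pairs, three observed rounds.
secret = ((0, 3), (1, 4))
challenges = [(3, 1, 4, 1, 5), (2, 7, 1, 8, 2), (9, 2, 6, 5, 3)]
samples = [(c, sum(min(c[i], c[j]) for i, j in secret) % 10) for c in challenges]
print(list(brute_force(5, 2, samples)))  # includes the true secret
```

The paper's NP-completeness and W[1]-hardness results say, roughly, that no algorithm is expected to do fundamentally better than this kind of search as n and k grow.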
Abstract:
The policies and regulations governing the practice of state asset management have emerged as an urgent question among many countries worldwide, for there is heightened awareness of the complex and crucial role that state assets play in public service provision. Indonesia is an example of such a country, introducing a 'big-bang' reform of state asset management laws, policies, regulations, and technical guidelines. Indonesia demonstrated its enthusiasm for reforming state asset management policies and practices through the establishment of the Directorate General of State Assets in 2006. The Directorate General of State Assets has stressed the new direction it is taking state asset management laws and policies through the introduction of Republic of Indonesia Law Number 38 Year 2008, an amended regulation overruling Republic of Indonesia Law Number 6 Year 2006 on Central/Regional Government State Asset Management. Law Number 38/2008 aims to further exemplify good governance principles and puts forward a 'highest and best use of assets' principle in state asset management. The purpose of this study is to explore and analyze specific contributing influences on state asset management practices, answering the question of why innovative state asset management policy implementation is stagnant. The methodology is a qualitative case study approach, utilizing an empirical sample of four Indonesian regional governments. Through a thematic analytical approach, this study provides an in-depth analysis of each factor influencing state asset management reform. The analysis suggests the potential of an 'excuse rhetoric', whereby the influencing factors identified are a smoke-screen, or are myths that public policy makers and implementers believe in, as a means to explain the stagnant implementation of innovative state asset management practice. This study thus offers state asset management policy makers deeper insights into the intricate web of influences on innovative state asset management policies, to be taken into consideration in future policy writing.
Abstract:
We study the multicast stream authentication problem when an opponent can drop, reorder, and introduce data packets into the communication channel. In such a model, packet overhead and computing efficiency are two parameters to be taken into account when designing a multicast stream protocol. In this paper, we propose to use two families of erasure codes to deal with this problem, namely rateless codes and maximum distance separable codes. Our constructions have the following advantages. First, the packet overhead is small. Second, the number of signature verifications to be performed at the receiver is O(1). Third, every receiver is able to recover all the original data packets emitted by the sender despite the losses and injections that occurred during the transmission of information.
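As a toy illustration of how an MDS erasure code lets receivers tolerate loss, the sketch below disperses a signature across n packets using an (n, n-1) single-parity code, the simplest MDS construction: any n-1 of the n pieces suffice to recover the original bytes. The paper's constructions use rateless and general MDS codes and also handle injected packets, which this sketch does not.

```python
from functools import reduce

def xor_bytes(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def disperse(blob, n):
    # Split into n-1 data chunks plus one XOR parity chunk.
    k = n - 1
    size = -(-len(blob) // k)                      # ceiling division
    padded = blob.ljust(size * k, b"\x00")
    chunks = [padded[i * size:(i + 1) * size] for i in range(k)]
    return chunks + [reduce(xor_bytes, chunks)]

def recover(pieces, n, length):
    # pieces: {index: chunk}, holding at least n-1 of the n pieces.
    k = n - 1
    missing = [i for i in range(k) if i not in pieces]
    if missing:
        # XOR of all received pieces (including parity) rebuilds the lost chunk.
        pieces[missing[0]] = reduce(xor_bytes, [pieces[i] for i in pieces])
    return b"".join(pieces[i] for i in range(k))[:length]

sig = b"example-signature-bytes"
pieces = dict(enumerate(disperse(sig, 4)))
del pieces[1]                                      # simulate one lost packet
assert recover(pieces, 4, len(sig)) == sig
```

Because one dispersed signature covers a whole block of packets, the receiver performs a constant number of signature verifications regardless of how many packets the block contains.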
Abstract:
The Crew Scheduling Problem (CSP) in transportation systems can be too complex to capture in full detail, so models usually ignore or simplify features which are difficult to formulate. This paper proposes an alternative formulation of the problem using a Mixed Integer Programming (MIP) approach. The optimisation model integrates the two phases of pairing generation and pairing optimisation by simultaneously sequencing trips into feasible duties and minimising the total elapsed time of the duties. Crew scheduling constraints requiring the crew to return to their home depot at the end of the shift are included in the model. The flexibility of this model comes from the inclusion of time intervals of relief opportunities, allowing the crew to be relieved during a finite time interval. This enhances the robustness of the schedule and provides a better representation of real-world conditions.
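The integrated model itself is not given in the abstract. For contrast, the sketch below shows the classical set-partitioning core of crew scheduling in PuLP: select pre-enumerated feasible duties (the generation phase that the paper's formulation absorbs into the MIP) so that each trip is covered exactly once and total elapsed time is minimised. The duty data are hypothetical.

```python
# pip install pulp
from pulp import LpBinary, LpMinimize, LpProblem, LpVariable, lpSum

# Hypothetical pre-enumerated feasible duties (trip sets already checked for
# sequencing and return-to-depot rules) with elapsed times in minutes.
duties = {
    "d1": {"trips": {"t1", "t2"}, "elapsed": 300},
    "d2": {"trips": {"t3"}, "elapsed": 180},
    "d3": {"trips": {"t2", "t3"}, "elapsed": 320},
    "d4": {"trips": {"t1"}, "elapsed": 160},
}
trips = {"t1", "t2", "t3"}

prob = LpProblem("crew_scheduling_toy", LpMinimize)
x = {d: LpVariable(f"x_{d}", cat=LpBinary) for d in duties}
prob += lpSum(duties[d]["elapsed"] * x[d] for d in duties)   # total elapsed time
for t in trips:                                              # cover each trip once
    prob += lpSum(x[d] for d in duties if t in duties[d]["trips"]) == 1

prob.solve()
print({d: v.value() for d, v in x.items()})
```

The paper's contribution is precisely to avoid the explicit `duties` enumeration, sequencing trips into duties inside the MIP itself, with relief-opportunity time windows as additional constraints.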
Abstract:
Trivium is a stream cipher candidate in the eSTREAM project and has successfully moved into phase three of the selection process under the hardware category. No attacks faster than exhaustive search have so far been reported on Trivium. Bivium-A and Bivium-B are simplified versions of Trivium, built on the same design principles but with two registers. The simplified design is useful for investigating Trivium-type ciphers with reduced complexity and provides insight into effective attacks which could be extended to Trivium. This paper focuses on an algebraic analysis which uses the Boolean satisfiability (SAT) problem in propositional logic. For reduced variants of the cipher, this analysis recovers the internal state with a minimal amount of observed keystream.
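An algebraic attack of this kind rewrites the cipher's state-update and keystream equations as CNF clauses and hands them to a SAT solver, whose satisfying assignment reveals the internal state. The toy sketch below uses the python-sat package (an assumption; any SAT solver would do) to encode two equations over three unknown state bits.

```python
# pip install python-sat
from pysat.solvers import Glucose3

# Toy system over state bits x1, x2, x3 (SAT variables 1..3):
#   x1 XOR x2 = 1  ->  (x1 OR x2) AND (NOT x1 OR NOT x2)
#   x2 AND x3 = 1  ->  (x2) AND (x3)
solver = Glucose3()
solver.add_clause([1, 2])
solver.add_clause([-1, -2])
solver.add_clause([2])
solver.add_clause([3])
if solver.solve():
    print(solver.get_model())  # [-1, 2, 3] -> x1=0, x2=1, x3=1
solver.delete()
```

A real attack on Bivium generates thousands of such clauses, one set per observed keystream bit, and the solver's running time is the measure of the attack's complexity.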
Abstract:
The placement of the mappers and reducers on the machines directly affects the performance and cost of a MapReduce computation in cloud computing. From a computational point of view, the mappers/reducers placement problem is a generalization of the classical bin packing problem, which is NP-complete. Thus, in this paper we propose a new heuristic algorithm for the mappers/reducers placement problem in cloud computing and evaluate it against several other heuristics on solution quality and computation time, using a set of test problems with various characteristics. The computational results show that our heuristic algorithm is much more efficient than the other heuristics. We also verify the effectiveness of our heuristic algorithm by comparing the mapper/reducer placement it generates for a benchmark problem with a conventional mapper/reducer placement. The comparison shows that the computation using our mapper/reducer placement is much cheaper while still satisfying the computation deadline.
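The heuristic itself is not described in the abstract. To make the bin-packing connection concrete, here is a first-fit-decreasing sketch that assigns task demands to machines of fixed capacity; the real placement problem additionally involves machine costs and a computation deadline, which this classical baseline ignores.

```python
def first_fit_decreasing(tasks, capacity):
    """Pack task demands into as few machines as possible (FFD baseline).

    tasks: {task_id: resource_demand}; capacity: per-machine capacity.
    Returns a list of machines, each a list of task ids.
    """
    bins = []  # each bin: [remaining_capacity, [task_ids]]
    for tid, demand in sorted(tasks.items(), key=lambda kv: -kv[1]):
        for b in bins:
            if b[0] >= demand:          # first open machine that fits
                b[0] -= demand
                b[1].append(tid)
                break
        else:                           # no machine fits: open a new one
            bins.append([capacity - demand, [tid]])
    return [b[1] for b in bins]

# Hypothetical mapper/reducer demands on machines of capacity 10.
print(first_fit_decreasing({"m1": 7, "m2": 5, "r1": 4, "r2": 3, "r3": 2}, 10))
```

Sorting by decreasing demand before packing is what gives FFD its good worst-case behaviour over plain first fit.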
Abstract:
MapReduce is a computation model for processing large data sets in parallel on large clusters of machines in a reliable, fault-tolerant manner. A MapReduce computation is broken down into a number of map tasks and reduce tasks, which are performed by so-called mappers and reducers, respectively. The placement of the mappers and reducers on the machines directly affects the performance and cost of the MapReduce computation. From a computational point of view, the mappers/reducers placement problem is a generalization of the classical bin packing problem, which is NP-complete. Thus, in this paper we propose a new grouping genetic algorithm for the mappers/reducers placement problem in cloud computing. Compared with the original grouping genetic algorithm, ours uses an innovative coding scheme and eliminates the inversion operator, which is an essential operator in the original algorithm. The new grouping genetic algorithm is evaluated by experiments, and the results show that it is much more efficient than four popular algorithms for the problem, including the original grouping genetic algorithm.
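In a grouping genetic algorithm the gene is a whole group (here, a machine together with the tasks placed on it) rather than an individual task, and crossover transplants groups between parents followed by a repair step. The paper's innovative coding scheme is not described in the abstract, so the sketch below follows the classical grouping-GA crossover with a simple first-fit repair; all data are hypothetical.

```python
import random

# Chromosome: a list of sets of task ids forming a partition of all tasks,
# one set per machine.

def repair_first_fit(child, orphans, demand, capacity):
    # Reinsert orphaned tasks first-fit; open a new machine if none fits.
    for t in sorted(orphans, key=lambda t: -demand[t]):
        for g in child:
            if sum(demand[x] for x in g) + demand[t] <= capacity:
                g.add(t)
                break
        else:
            child.append({t})
    return child

def group_crossover(parent_a, parent_b, all_tasks, demand, capacity):
    # Transplant a random slice of groups from parent_b into parent_a,
    # delete parent_a groups that now overlap, then repair the partition.
    i, j = sorted(random.sample(range(len(parent_b) + 1), 2))
    injected = [set(g) for g in parent_b[i:j]]
    covered = set().union(*injected) if injected else set()
    child = [set(g) for g in parent_a if not (g & covered)] + injected
    orphans = all_tasks - set().union(*child)
    return repair_first_fit(child, orphans, demand, capacity)

# Toy demo with hypothetical task demands and machine capacity.
demand = dict(enumerate([4, 3, 3, 2, 2, 1]))
pa = [{0, 3}, {1, 4}, {2, 5}]
pb = [{0, 1}, {2, 3}, {4, 5}]
print(group_crossover(pa, pb, set(demand), demand, capacity=7))
```

Operating on groups keeps well-packed machines intact across generations, which is why grouping encodings tend to outperform per-task encodings on partitioning problems like this one.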
Abstract:
Suppose two parties, holding vectors A = (a_1, a_2, ..., a_n) and B = (b_1, b_2, ..., b_n) respectively, wish to know whether a_i > b_i for all i, without disclosing any private input. This problem is called the vector dominance problem, and is closely related to the well-studied problem of securely comparing two numbers (Yao's millionaires problem). In this paper, we propose several protocols for this problem, which improve upon existing protocols in round complexity or communication/computation complexity.
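The protocols themselves are not detailed in the abstract. As one simplified, semi-honest sketch of the functionality, the code below uses the python-paillier (phe) library: Bob sends encryptions of his b_i, Alice returns masked, shuffled encryptions of r_i(a_i - b_i) for random positive r_i, and Bob checks that all decryptions are positive. Note that this toy leaks the number of non-dominant coordinates (though not which), which proper vector dominance protocols avoid; it is an illustration, not the paper's construction.

```python
# pip install phe
import random
from phe import paillier

A = [5, 9, 4]          # Alice's private vector
B = [3, 7, 2]          # Bob's private vector

# Bob: generate keys and send encrypted b_i to Alice.
pub, priv = paillier.generate_paillier_keypair(n_length=1024)
enc_B = [pub.encrypt(b) for b in B]

# Alice: compute Enc(r_i * (a_i - b_i)) for random r_i > 0 and shuffle.
# The sign of each difference is preserved; its magnitude is masked.
masked = [(pub.encrypt(a) - eb) * random.randint(1, 1000)
          for a, eb in zip(A, enc_B)]
random.shuffle(masked)

# Bob: decrypt and check that every masked difference is positive.
dominates = all(priv.decrypt(c) > 0 for c in masked)
print(dominates)        # True iff a_i > b_i for all i
```

The additive homomorphism of Paillier is what lets Alice compute on Bob's ciphertexts without the secret key; the round and communication costs of sketches like this are exactly what the paper's protocols aim to improve.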