215 results for applied game


Relevance: 20.00%

Abstract:

The material presented in this thesis may be viewed as comprising two key parts: the first concerns batch cryptography itself, whilst the second deals with how this form of cryptography may be applied to security-related applications, such as electronic cash, to improve the efficiency of their protocols. The objective of batch cryptography is to devise more efficient cryptographic primitives. In general, these primitives exploit a property such as homomorphism to perform a computationally expensive operation on a collective input set; the idea is to amortise an expensive operation, such as modular exponentiation, over the whole input. Most research in this field has concentrated on batch verification of digital signatures. It is shown here that several new attacks may be launched against the published schemes, exposing weaknesses in them. Another common use of batch cryptography is the simultaneous generation of digital signatures. There is significantly less previous work in this area, and the existing schemes are of limited use in practical applications. Several new batch signature schemes are introduced that improve upon the existing techniques, and some practical uses are illustrated. Electronic cash is a technology that demands complex protocols in order to furnish several security properties, typically including anonymity, traceability of a double spender, and off-line payment. Presently, the most efficient schemes rely on coin divisibility to withdraw one large amount that may be progressively spent with one or more merchants. Several new cash schemes are introduced here that use batch cryptography to improve the withdrawal, payment, and deposit of electronic coins. The devised schemes apply both the batch signature and the batch verification techniques introduced, demonstrating improved performance over contemporary divisibility-based constructions. The solutions also provide an alternative paradigm for the construction of electronic cash systems. Whilst electronic cash is used as the vehicle for demonstrating the relevance of batch cryptography to security-related applications, the applicability of the techniques introduced extends well beyond this.
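The batch-verification idea described above can be illustrated with a minimal sketch. The code below assumes textbook RSA signatures under a single shared public key, which is an illustrative assumption and not the thesis's actual construction; it shows how the multiplicative homomorphism lets one modular exponentiation replace many, and why naive batching invites the kind of attacks mentioned above.

```python
# Illustrative sketch of batch verification using textbook RSA with a shared
# public key (n, e). Not the thesis's scheme: real batch verifiers add
# randomisation (e.g., small random exponents) so that invalid signatures
# cannot cancel each other out inside the product.
from hashlib import sha256

def h(msg: bytes, n: int) -> int:
    """Toy full-domain-style hash into Z_n (illustration only)."""
    return int.from_bytes(sha256(msg).digest(), "big") % n

def verify_individually(pk, msgs, sigs):
    n, e = pk
    return all(pow(s, e, n) == h(m, n) for m, s in zip(msgs, sigs))

def verify_batch(pk, msgs, sigs):
    """One modular exponentiation instead of len(sigs):
       (prod s_i)^e ?= prod H(m_i)  (mod n)."""
    n, e = pk
    lhs, rhs = 1, 1
    for m, s in zip(msgs, sigs):
        lhs = (lhs * s) % n
        rhs = (rhs * h(m, n)) % n
    return pow(lhs, e, n) == rhs
```

The naive product test amortises the cost of exponentiation over the batch, but an adversary can submit signatures whose individual errors cancel in the product; randomised small-exponent tests are the usual defence.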

Relevance: 20.00%

Abstract:

The present paper focuses on some interesting classes of process-control games, where winning essentially means successfully controlling the process. A master for one of these games is an agent who plays a winning strategy. In this paper we investigate situations in which even a complete model (given by a program) of a particular game does not provide enough information to synthesize—even incrementally—a winning strategy. However, if in addition to getting a program, a machine may also watch masters play winning strategies, then the machine is able to incrementally learn a winning strategy for the given game. Studied are successful learning from arbitrary masters and from pedagogically useful selected masters. It is shown that selected masters are strictly more helpful for learning than are arbitrary masters. Both for learning from arbitrary masters and for learning from selected masters, though, there are cases where one can learn programs for winning strategies from masters but not if one is required to learn a program for the master's strategy itself. Both for learning from arbitrary masters and for learning from selected masters, one can learn strictly more by watching m+1 masters than one can learn by watching only m. Last, a simulation result is presented where the presence of a selected master reduces the complexity from infinitely many semantic mind changes to finitely many syntactic ones.
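As a purely informal illustration of learning by watching a master (not the paper's computability-theoretic model of incremental strategy synthesis), the toy sketch below records the move a master makes in each observed game state and imitates it thereafter.

```python
# Toy illustration only: a learner incrementally builds a strategy by
# remembering the move a master made in each observed state, and falls back
# to a default move for states it has never seen.
from typing import Dict, Hashable

class ImitationLearner:
    def __init__(self, default_move=None):
        self.strategy: Dict[Hashable, object] = {}
        self.default_move = default_move

    def observe(self, state, master_move):
        """Incremental update: remember what the master did in this state."""
        self.strategy[state] = master_move

    def play(self, state):
        """Play the learned move, or the default if the state is unseen."""
        return self.strategy.get(state, self.default_move)

# Watching additional masters simply supplies more (state, move) observations,
# enlarging the set of states the learner can handle.
```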

Relevance: 20.00%

Abstract:

At least two important transportation planning activities rely on planning-level crash prediction models. One is motivated by the Transportation Equity Act for the 21st Century, which requires departments of transportation and metropolitan planning organizations to consider safety explicitly in the transportation planning process. The second could arise from a need for state agencies to establish incentive programs to reduce injuries and save lives. Both applications require a forecast of safety for a future period. Planning-level crash prediction models for the Tucson, Arizona, metropolitan region are presented to demonstrate the feasibility of such models. Data were separated into fatal, injury, and property-damage crashes. To accommodate overdispersion in the data, negative binomial regression models were applied. To accommodate the simultaneity of fatality and injury crash outcomes, the models were estimated simultaneously. All models produce crash forecasts at the traffic analysis zone level. Statistically significant (p-values < 0.05) and theoretically meaningful variables for the fatal crash model included population density, persons 17 years old or younger as a percentage of the total population, and intersection density. Significant variables for the injury and property-damage crash models were population density, number of employees, intersection density, percentage of miles of principal arterials, percentage of miles of minor arterials, and percentage of miles of urban collectors. Among several conclusions, it is suggested that planning-level safety models are feasible and may play a role in future planning activities. However, caution must be exercised with such models.
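A minimal sketch of the modelling step described above is given below, using a negative binomial GLM from statsmodels on hypothetical traffic-analysis-zone data; the variable names and values are placeholders, and the simultaneous estimation of the fatal and injury models is not reproduced here.

```python
# Sketch of a planning-level crash model as a negative binomial regression.
# Data and variable names are placeholders, not the Tucson dataset.
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical traffic-analysis-zone data
taz = pd.DataFrame({
    "crashes": np.random.poisson(3, 200),           # observed crash counts
    "pop_density": np.random.gamma(2.0, 500, 200),  # persons per sq. mile
    "pct_under_17": np.random.uniform(5, 35, 200),  # % of population <= 17
    "intersection_density": np.random.gamma(2.0, 5, 200),
})

X = sm.add_constant(taz[["pop_density", "pct_under_17", "intersection_density"]])
nb_model = sm.GLM(taz["crashes"], X,
                  family=sm.families.NegativeBinomial(alpha=1.0)).fit()
print(nb_model.summary())
```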

Relevance: 20.00%

Abstract:

This paper describes the formalization and application of a methodology to evaluate the safety benefit of countermeasures in the face of uncertainty. To illustrate the methodology, 18 countermeasures for improving the safety of at-grade railroad crossings (AGRXs) in the Republic of Korea are considered. Akin to “stated preference” methods in travel survey research, the methodology applies random selection and the law of large numbers to derive accident modification factor (AMF) densities from expert opinions. In a full Bayesian analysis framework, the collective opinions in the form of AMF densities (the data likelihood) are combined with prior knowledge (AMF density priors) for the 18 countermeasures to obtain ‘best’ estimates of the AMFs (posterior credible intervals). The countermeasures are then compared and recommended on the basis of the largest safety returns with minimum risk (uncertainty). To the author's knowledge, the complete methodology is new and has not previously been applied or reported in the literature. The results demonstrate that the methodology is able to discern anticipated safety benefit differences across candidate countermeasures. For the 18 countermeasures considered in this analysis, the top three for reducing crashes were found to be in-vehicle warning systems, obstacle detection systems, and constant warning time systems.
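The core Bayesian step, combining expert-elicited AMF values with a prior to obtain a posterior credible interval, can be sketched as follows. The lognormal/conjugate-normal assumptions, the parameter values and the function name are illustrative choices, not the paper's specification.

```python
# Hedged sketch: combine expert AMF opinions (data likelihood) with a prior
# via a conjugate normal-normal update on log(AMF). Distributional choices
# and numbers are illustrative assumptions.
import numpy as np
from scipy import stats

def amf_posterior(expert_amfs, prior_mean_log=0.0, prior_var_log=0.25,
                  obs_var_log=0.1, ci=0.95):
    """Return posterior mean AMF and a credible interval under a
    normal-normal model on log(AMF) with known observation variance."""
    y = np.log(np.asarray(expert_amfs, dtype=float))
    n = len(y)
    post_var = 1.0 / (1.0 / prior_var_log + n / obs_var_log)
    post_mean = post_var * (prior_mean_log / prior_var_log + y.sum() / obs_var_log)
    lo, hi = stats.norm.interval(ci, loc=post_mean, scale=np.sqrt(post_var))
    return np.exp(post_mean), (np.exp(lo), np.exp(hi))

# Example: ten expert opinions suggesting roughly a 30% crash reduction
mean_amf, interval = amf_posterior(
    [0.70, 0.65, 0.75, 0.80, 0.70, 0.60, 0.72, 0.68, 0.74, 0.71])
print(mean_amf, interval)
```

Ranking countermeasures by posterior mean while penalising wide credible intervals mirrors the “largest safety returns with minimum risk” criterion described above.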

Relevance: 20.00%

Abstract:

We present a novel modified theory based upon Rayleigh scattering of ultrasound from composite nanoparticles with a liquid core and solid shell. We derive closed-form solutions for the scattering cross-section and have applied this model to an ultrasound contrast agent consisting of a liquid-filled core (perfluorooctyl bromide, PFOB) encapsulated by a polymer shell (polycaprolactone, PCL). A sensitivity analysis was performed to predict the dependence of the scattering cross-section upon material and dimensional parameters. A rapid increase in the scattering cross-section was achieved by increasing the compressibility of the core, validating the incorporation of the highly compressible PFOB; the compressibility of the shell had little impact on the overall scattering cross-section, although a more compressible shell is desirable. Changes in the density of the shell and the core result in predicted local minima in the scattering cross-section, approximately corresponding to the PFOB-PCL contrast agent considered; hence, incorporation of a lower-density shell could potentially improve the scattering cross-section significantly. A 50% reduction in shell thickness relative to the external radius increased the predicted scattering cross-section by 50%. Although it has often been considered that the shell has a negative effect on echogenicity because of its low compressibility, we have shown that it can potentially play an important role in the echogenicity of the contrast agent. The challenge for the future is to identify suitable shell and core materials that meet the predicted characteristics in order to achieve optimal echogenicity.
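For orientation, the sketch below evaluates the classical Rayleigh (long-wavelength) scattering cross-section of a small homogeneous sphere, sigma_s = (4*pi/9) * k^4 * a^6 * [((kappa_p - kappa_0)/kappa_0)^2 + (1/3) * (3*(rho_p - rho_0)/(2*rho_p + rho_0))^2], with crude volume-weighted effective properties standing in for the core-shell structure. It is a baseline illustration only, not the modified core-shell theory derived in the paper, and all material values are approximate placeholders.

```python
# Classical Rayleigh cross-section for a small homogeneous sphere, with
# volume-weighted "effective" core-shell properties as a rough stand-in.
# Baseline illustration only; parameter values are approximate placeholders.
import numpy as np

def rayleigh_cross_section(k, a, kappa_p, rho_p, kappa_0, rho_0):
    """Total scattering cross-section of a small sphere (radius a,
    compressibility kappa_p, density rho_p) in a host medium
    (kappa_0, rho_0), valid for k*a << 1."""
    comp_term = (kappa_p - kappa_0) / kappa_0
    dens_term = 3.0 * (rho_p - rho_0) / (2.0 * rho_p + rho_0)
    return (4.0 * np.pi / 9.0) * k**4 * a**6 * (comp_term**2 + dens_term**2 / 3.0)

# Placeholder numbers: 5 MHz in water, 200 nm outer radius, 20 nm shell
c0, rho_0, kappa_0 = 1480.0, 1000.0, 4.6e-10          # water
a, t_shell = 200e-9, 20e-9                            # outer radius, shell thickness
k = 2 * np.pi * 5e6 / c0                              # wavenumber at 5 MHz
v_core = ((a - t_shell) / a) ** 3                     # core volume fraction
kappa_eff = v_core * 1.4e-9 + (1 - v_core) * 2.0e-10  # ~PFOB vs. ~PCL compressibility
rho_eff = v_core * 1930.0 + (1 - v_core) * 1145.0     # ~PFOB vs. ~PCL density
print(rayleigh_cross_section(k, a, kappa_eff, rho_eff, kappa_0, rho_0))
```

The k^4 a^6 dependence and the separate compressibility and density contrast terms are what drive the sensitivity trends discussed above.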

Relevance: 20.00%

Abstract:

The upper Condamine River in southern Queensland has formed extensive alluvial deposits that have been used for irrigation of cotton crops for over 40 years. Due to excessive use and long-term drought conditions, these groundwater resources are under substantial threat. This is now recognised by all stakeholders, and the Queensland Department of Environment and Resource Management (DERM) is currently undertaking a water planning process for the Central Condamine Alluvium with water users and other stakeholders. DERM aims to demonstrate effectively the character of the groundwater system and its current status, notably the continued long-term drawdown of the watertable, and it was agreed that 3D visualisation was an ideal tool to achieve this. The Groundwater Visualisation System (GVS) developed at QUT was utilised, and the visualisation model was developed in conjunction with DERM to provide a planning and management tool for this particular application.

Relevance: 20.00%

Abstract:

This study explores the effects of use-simulated and peripheral placements in video games on attitude to the brand. Results indicate that placements do not lead to enhanced brand attitude, even when controlling for involvement and skill. It appears this is due to constraints on brand information processing in a game context.

Relevance: 20.00%

Abstract:

Objective: Empowerment is a complex process of psychological, social, organizational and structural change. It allows individuals and groups to achieve positive growth and effectively address the social and psychological impacts of historical oppression, marginalization and disadvantage. The Growth and Empowerment Measure (GEM) was developed to measure change in dimensions of empowerment as defined and described by Aboriginal Australians who participated in the Family Well Being programme.

Method: The GEM has two components: a 14-item Emotional Empowerment Scale (EES14) and 12 Scenarios (12S). It is accompanied by the Kessler 6 Psychological Distress Scale (K6), supplemented by two questions assessing the frequency of happy and angry feelings. For validation, the measure was applied with 184 Indigenous Australian participants involved in personal and/or organizational social health activities.

Results: Psychometric analyses of the new instruments support their validity and reliability and indicate two-component structures for both the EES (Self-capacity; Inner peace) and the 12S (Healing and enabling growth; Connection and purpose). Strong correlations were observed across the scales and subscales. Participants who scored higher on the newly developed scales showed lower distress on the K6, particularly when the two additional questions were included. However, exploratory factor analyses demonstrated that the GEM subscales are separable from the Kessler distress measure.

Conclusion: The GEM shows promise in enabling measurement and enhancing understanding of both the process and outcome of psychological and social empowerment within an Australian Indigenous context.
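As an illustration of the kind of exploratory factor analysis mentioned above, the sketch below fits a two-factor model to simulated item responses; the data, loading pattern and procedure are invented for illustration and do not reproduce the study's analysis.

```python
# Illustrative exploratory factor analysis on simulated 14-item responses;
# not the study's data or exact psychometric procedure.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n_respondents, n_items = 184, 14

# Simulate 14 items driven by two latent factors (7 items loading on each)
latent = rng.normal(size=(n_respondents, 2))
loadings = np.vstack([np.repeat([[1.0, 0.2]], 7, axis=0),
                      np.repeat([[0.2, 1.0]], 7, axis=0)])
items = latent @ loadings.T + rng.normal(scale=0.5, size=(n_respondents, n_items))

fa = FactorAnalysis(n_components=2).fit(items)
print(np.round(fa.components_.T, 2))  # estimated item loadings on the two factors
```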

Relevance: 20.00%

Abstract:

This full-day workshop invites participants to consider the nexus where the interests of game design, the expectations of play and HCI meet: the game interface. Game interfaces seem different from the interfaces of other software, and a number of observations have been made about this. Shneiderman famously noticed that while most software designers are intent on following the tenets of the “invisible computer” and making access easy for the user, game interfaces are made for players: they embed challenge. Schell discusses a “strange” relationship between the player and the game enabled by the interface, and user interface designers frequently opine that much can be learned from the design of game interfaces. So where does the game interface actually sit? Even more interesting is the question of whether the history of the relationship and subsequent expectations are now limiting the potential of game design as an expressive form. Recent innovations in I/O design such as Nintendo’s Wii, Sony’s Move and Microsoft's Kinect seem to usher in an age of physical player-enabled interaction, experience and embodied, engaged design. This workshop intends to cast light on this often mentioned and sporadically examined area and to establish a platform for new and innovative design in the field.

Relevance: 20.00%

Abstract:

Performance / Event Documentation and Curatorial Research Statement

Relevance: 20.00%

Abstract:

The ultimate goal of an authorisation system is to allocate each user the level of access they need to complete their job: no more and no less. This proves to be challenging in an organisational setting because, on one hand, employees need enough access to perform their tasks, while on the other hand more access brings an increasing risk of misuse, whether intentional, where an employee uses the access for personal benefit, or unintentional through carelessness, such as losing the information or being socially engineered into giving access to an adversary. With the goal of developing a more dynamic authorisation model, we have adopted a game-theoretic framework to reason about the factors that may affect a user's likelihood of misusing a permission at the time of an access decision. Game theory provides a useful but previously ignored perspective in authorisation theory: the notion of the user as a self-interested player who selects among a range of possible actions depending on their pay-offs.
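A toy sketch of this game-theoretic view follows: the user is modelled as choosing between legitimate use and misuse according to expected pay-offs. The pay-off values, detection probability and class names are illustrative assumptions, not the paper's model.

```python
# Toy model: a self-interested user chooses the action with the highest
# expected pay-off; an authorisation decision can take this into account.
# Values and structure are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class AccessGame:
    benefit_legit: float   # value the user gets from doing the job properly
    benefit_misuse: float  # personal gain from misusing the permission
    penalty: float         # sanction if misuse is detected
    p_detect: float        # probability the system detects misuse

    def expected_payoff(self, action: str) -> float:
        if action == "legitimate":
            return self.benefit_legit
        # misuse: gain minus the expected sanction
        return self.benefit_misuse - self.p_detect * self.penalty

    def best_response(self) -> str:
        return max(("legitimate", "misuse"), key=self.expected_payoff)

# Grant the permission only when the modelled user's best response is
# legitimate use; here misuse is unattractive once p_detect * penalty
# exceeds the extra gain from misusing.
game = AccessGame(benefit_legit=1.0, benefit_misuse=3.0, penalty=10.0, p_detect=0.4)
print(game.best_response())  # -> 'legitimate'
```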