Abstract:
This article presents a survey of authorisation models and considers their ‘fitness for purpose’ in facilitating information sharing. Network-supported information sharing is an important technical capability that underpins collaboration in support of dynamic and unpredictable activities such as emergency response, national security, infrastructure protection, supply chain integration and emerging business models based on the concept of a ‘virtual organisation’. The article argues that present authorisation models are inflexible and scale poorly in such dynamic environments because they assume that the future needs of the system can be predicted, an assumption used to justify persistent authorisation policies. The article outlines the motivation and requirements for a new flexible authorisation model that addresses the needs of information sharing. It proposes that a flexible and scalable authorisation model must allow explicit specification of the system's objectives, with access decisions made through a late trade-off analysis between those explicit objectives. A research agenda for the proposed Objective-based Access Control concept is presented.
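To make the proposal concrete, here is a minimal sketch of how a late, request-time trade-off between explicitly specified objectives might drive an access decision. The objective names, weights, scoring functions and threshold are illustrative assumptions; the article proposes the concept but does not prescribe an implementation.

```python
# Illustrative sketch only: Objective-based Access Control as a weighted,
# request-time trade-off between explicit system objectives. All names,
# weights and scoring rules below are hypothetical assumptions.
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class Objective:
    name: str
    weight: float                      # relative importance, set by the system owner
    score: Callable[[Dict], float]     # maps a request context to [0, 1]

def decide(request: Dict, objectives: list[Objective], threshold: float = 0.5) -> bool:
    """Make a late (request-time) trade-off between explicit system objectives."""
    total_weight = sum(o.weight for o in objectives)
    utility = sum(o.weight * o.score(request) for o in objectives) / total_weight
    return utility >= threshold        # grant only if the weighted utility clears the bar

# Hypothetical objectives: sharing advances the mission; confidentiality opposes risk.
objectives = [
    Objective("mission_benefit", 0.6, lambda r: 1.0 if r["emergency"] else 0.3),
    Objective("confidentiality", 0.4, lambda r: 0.2 if r["external_party"] else 0.9),
]

print(decide({"emergency": True, "external_party": True}, objectives))   # True
```

The point of the sketch is the late binding: no persistent policy pre-authorises the request; the decision falls out of the objectives in force at decision time.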
Abstract:
The National Party of Australia is under challenge. Will it be able to adapt and survive or will it become increasingly irrelevant in Australian politics? With population growth in some coastal and hinterland areas and decline in inland agricultural areas, the face of rural and regional Australia is changing. As a result, the National Party's traditional support is being eroded. Within the long-standing Coalition, the influence of the Nationals appears to be in decline, yet they continue to resist amalgamation with the Liberal Party.
Abstract:
The purpose of this study was to evaluate the comparative cost of treating alcohol dependence with either cognitive behavioral therapy (CBT) alone or CBT combined with naltrexone (CBT+naltrexone). Two hundred ninety-eight outpatients consecutively treated for alcohol dependence participated in this study. One hundred seven (36%) patients received adjunctive pharmacotherapy (CBT+naltrexone). The Drug Abuse Treatment Cost Analysis Program was used to estimate treatment costs. Adjunctive pharmacotherapy (CBT+naltrexone) introduced an additional treatment cost and was 54% more expensive than CBT alone. When treatment abstinence rates (36.1% CBT; 62.6% CBT+naltrexone) were applied to cost-effectiveness ratios, CBT+naltrexone demonstrated an advantage over CBT alone. There were no differences between groups on a preference-based health measure (SF-6D). In this treatment center, achieving 100 abstainers over a 12-week program requires 280 patients with CBT alone compared with 160 with CBT+naltrexone. CBT+naltrexone was the dominant choice, based on modest economic advantages and significant efficiencies in the numbers needed to treat.
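The reported patient numbers follow from dividing the target number of abstainers by each abstinence rate; a quick arithmetic check:

```python
# Reproducing the numbers-needed arithmetic from the reported abstinence rates.
abstinence_cbt = 0.361            # 36.1% abstinent with CBT alone
abstinence_combo = 0.626          # 62.6% abstinent with CBT+naltrexone
target_abstainers = 100

patients_cbt = target_abstainers / abstinence_cbt        # ~277, reported (rounded) as 280
patients_combo = target_abstainers / abstinence_combo    # ~160, as reported

print(round(patients_cbt), round(patients_combo))        # 277 160
```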
Abstract:
Alexithymia is characterised by deficits in emotional insight and self-reflection that impact the efficacy of psychological treatments. Given the high prevalence of alexithymia in Alcohol Use Disorders, valid assessment tools are critical. The majority of research on the relationship between alexithymia and alcohol dependence has employed the self-administered Toronto Alexithymia Scale (TAS-20). The Observer Alexithymia Scale (OAS) has also been recommended. The aim of the present study was to assess the validity and reliability of the OAS and the TAS-20 in an alcohol-dependent sample. Two hundred and ten alcohol-dependent participants in an outpatient Cognitive Behavioral Treatment program were administered the TAS-20 at assessment and upon treatment completion at 12 weeks. Clinical psychologists provided observer assessment data for a subsample of 159 patients. The findings confirmed acceptable internal consistency, test-retest reliability and scale homogeneity for both the OAS and the TAS-20, with the exception of the low internal consistency of the TAS-20 EOT scale. The TAS-20 was more strongly associated with alcohol problems than the OAS.
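Internal consistency in studies of this kind is conventionally reported as Cronbach's alpha; a minimal sketch of that computation, using randomly generated stand-in data rather than the study's TAS-20 responses:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()     # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)       # variance of the scale totals
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Hypothetical 20-item, 5-point responses standing in for TAS-20 data
# (210 respondents, matching the sample size above; values are synthetic).
rng = np.random.default_rng(0)
base = rng.integers(1, 6, size=(210, 1))                      # shared trait level
items = np.clip(base + rng.integers(-1, 2, size=(210, 20)), 1, 5)
print(round(cronbach_alpha(items), 2))
```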
Abstract:
This article is an abbreviated version of a debate between two economists holding somewhat different perspectives on the nature of non-market production in the space of new digital media. While the ostensible focus here is on the role of markets in the innovation of new technologies to create new economic value, this context also serves to highlight the private and public value of digital literacy.
Abstract:
The field was the curation of cross-cultural new media/digital media practices within large-scale exhibition practices in China. The context was improved understanding of the intertwining of the natural and the artificial with respect to landscape and culture, and their consequent effect on our contemporary globalised society. The research highlighted new languages of media art with respect to landscape and their particular underpinning dialects. The methodology was principally practice-led.

The research brought together over 60 practitioners from both local and diasporic Asian, European and Australian cultures for the first time within a Chinese exhibition context. Through pursuing a strong response to both cultural displacement and re-identification, the research forged and documented an enduring commonality within difference, an agenda further concentrated through sensitivities surrounding that year's Beijing Olympics. In contrast to the severe threats posed to the local dialects of many of the world's spoken and written languages, the 'Vernacular Terrain' project evidenced that many local creative 'dialects' of the environment-media art continuum had indeed survived and flourished.

The project was co-funded by the Beijing Film Academy, QUT Precincts, IDAProjects and Platform China Art Institute. A broad range of peer-reviewed grants was won, including from the Australia China Council and the Australian Embassy in China. Through invitations from external curators, much of the work then travelled to other venues, including the Block Gallery at QUT and the outdoor screens at Federation Square, Melbourne. The Vernacular Terrain catalogue featured a comprehensive history of the IDA project from 2000 to 2008 alongside several major essays. Owing to the reputation IDAProjects had established, the team was invited to curate a major exhibition showcasing fifty new media artists: The Vernacular Terrain, at the prestigious Songzhang Art Museum, Beijing, in December 2007-January 2008. The exhibition was designed for an extensive, newly opened gallery owned by Li Xian Ting, one of China's most important art historians. It was not only the gallery's inaugural non-Chinese-curated show but also its first new media exhibition. It included important works by artists such as Peter Greenway, Michael Roulier, Maleonn and Cui Xuiwen.

Each artist was chosen both for a focus upon their own local environmental concerns and for their specific forms of practice, which included virtual world design, interactive design, video art, real-time and manipulated multiplayer gaming platforms, and web 2.0 practices. The exhibition examined the interconnectivities of cultural dialogue on both a micro and a macro scale, incorporating the local and the global through display methods and design approaches that stitched these diverse practices into a spatial map of meanings and conversations. By examining the contexts of each artist's practice in relation to the specificity of their own local place and prevailing global contexts, the exhibition sought to uncover a global vernacular. Through pursuing this concentrated anthropological direction, the research identified key themes and concerns of a contextual language clearly underpinned by distinctive local 'dialects', thereby contributing to a profound sense of cross-cultural association. Through augmentation of existing discourse, the exhibition confirmed the enduring relevance and influence of both localised and globalised languages of the landscape-technology continuum.
Abstract:
This special issue of Innovation: Management, Policy & Practice (also released as a book: ISBN 978-1-921348-31-0) explores some empirical and analytic connections between creative industries and innovation policy. Seven papers are presented. The first four are empirical, providing analysis of large and/or detailed data sets on creative industries businesses and occupations to discern their contribution to innovation. The next three papers focus on comparative and historical policy analysis, connecting creative industries policy (broadly considered, including media, arts and cultural policy) with innovation policy. To introduce this special issue, I want to review the arguments connecting the statistical, conceptual and policy neologism of ‘creative industries’ to: (1) the elements of a national innovation system; and (2) innovation policy. In approaching this connection, two overarching issues arise.
Abstract:
New air traffic automated separation management concepts are constantly under investigation, yet most of the automated separation management algorithms proposed over the last few decades have assumed either perfect communication or exact knowledge of all aircraft locations. In realistic environments these idealized assumptions are not valid, and any communication failure can potentially lead to disastrous outcomes. This paper examines, through simulation studies, the separation performance of several popular algorithms during periods of information loss. The simulations suggest that communication failure can degrade the performance of these separation management algorithms significantly. The paper also describes some preliminary flight tests.
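As a rough illustration of the kind of degradation such simulation studies probe, the toy model below drops position reports with a given probability and counts the resulting separation violations. The one-dimensional kinematics, thresholds and drop model are my assumptions, not the paper's algorithms.

```python
import random

def simulate(drop_prob: float, steps: int = 400, sep_min: float = 5.0) -> int:
    """Toy head-on encounter: count time steps in which true separation is
    violated when position reports are dropped with probability drop_prob
    (the last received, possibly stale, position is reused)."""
    random.seed(42)
    x_a, x_b = 0.0, 100.0                 # one-dimensional positions
    believed_b = x_b                      # A's last received position of B
    violations = 0
    for _ in range(steps):
        x_a += 0.2                        # A flies toward B
        x_b -= 0.2                        # B flies toward A
        if random.random() > drop_prob:
            believed_b = x_b              # report received: belief refreshed
        if believed_b - x_a < 2 * sep_min:
            x_a -= 0.4                    # believed conflict: A backs off
        if x_b - x_a < sep_min:
            violations += 1               # true separation actually violated
    return violations

for p in (0.0, 0.9, 0.99):
    print(f"report drop probability {p:.2f}: {simulate(p)} violation steps")
```

With perfect communication the avoidance rule holds separation above the minimum; as the drop probability rises, aircraft A acts on increasingly stale beliefs and violations accumulate, mirroring the degradation the paper reports.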
Abstract:
US state-based data breach notification laws have unveiled serious corporate and government failures regarding the security of personal information. These laws require organisations to notify persons who may be affected by an unauthorised acquisition of their personal information. Safe harbours to notification exist if personal information is encrypted. Three types of safe harbour have been identified in the literature: exemptions, rebuttable presumptions and factors. The underlying assumption of exemptions is that encrypted personal information is secure and that unauthorised access therefore does not pose a risk. However, the viability of this assumption is questionable when examined against data breaches involving encrypted information and the demanding practical requirements of effective encryption management. Recent recommendations by the Australian Law Reform Commission (ALRC) would amend the Privacy Act 1988 (Cth) to implement a data breach scheme that includes a different type of safe harbour: factor-based analysis. The authors examine the potential capability of the ALRC's proposed encryption safe harbour in light of the US experience at the state level.
Abstract:
Machine vision represents a particularly attractive solution for sensing and detecting potential collision-course targets due to the relatively low cost, size, weight, and power requirements of the sensors involved (as opposed to radar). This paper describes the development and evaluation of a vision-based collision detection algorithm suitable for fixed-wing aerial robotics. The system was evaluated using highly realistic vision data of the moments leading up to a collision. Based on the collected data, our detection approaches were able to detect targets at distances ranging from 400 m to about 900 m. These distances (with some assumptions about closing speeds and aircraft trajectories) translate to an advance warning of between 8 and 10 seconds ahead of impact, which approaches the 12.5-second response time recommended for human pilots. We make use of the enormous potential of graphics processing units to achieve processing rates of 30 Hz for images of size 1024-by-768. Integration into the final platform is currently under way.
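The advance-warning figures follow from dividing detection distance by closing speed; a quick check with assumed, purely illustrative closing speeds (the abstract does not state the values used):

```python
# Back-of-envelope check: warning time = detection distance / closing speed.
scenarios = [(400, 50), (900, 90)]     # (detection distance m, assumed closing speed m/s)
for distance, speed in scenarios:
    print(f"{distance} m at {speed} m/s closing -> {distance / speed:.0f} s warning")
# 400 m at 50 m/s -> 8 s; 900 m at 90 m/s -> 10 s, consistent with the 8-10 s range.
```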
Abstract:
This paper proposes a novel automated separation management concept in which onboard decision support is integrated within a centralised air traffic separation management system. The onboard decision support involves a decentralised separation manager that can overrule air traffic management instructions under certain circumstances. This approach combines the advantages of both centralised and decentralised concepts while mitigating the disadvantages of each. Simulation studies are used to illustrate the potential benefits of the combined separation management concept.
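A minimal sketch of the hybrid decision rule this concept implies: follow the centralised instruction unless the onboard manager predicts a loss of separation. The predictor interface, threshold and command strings are hypothetical, not the paper's design.

```python
def choose_command(atm_instruction, onboard_resolution, predict_min_sep, sep_min=5.0):
    """Follow the centralised ATM instruction by default; let the onboard
    separation manager overrule it only when a loss of separation is predicted."""
    if predict_min_sep(atm_instruction) >= sep_min:
        return atm_instruction            # centralised efficiency preserved
    return onboard_resolution             # decentralised safety net engages

# Hypothetical usage: the onboard predictor foresees only 3 nm separation
# if the ATM heading is flown, so the onboard resolution takes precedence.
cmd = choose_command("maintain heading 090", "climb 500 ft", lambda ins: 3.0)
print(cmd)                                # -> "climb 500 ft" (instruction overruled)
```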
Abstract:
Machine vision represents a particularly attractive solution for sensing and detecting potential collision-course targets due to the relatively low cost, size, weight, and power requirements of vision sensors (as opposed to radar and TCAS). This paper describes the development and evaluation of a real-time vision-based collision detection system suitable for fixed-wing aerial robotics. Using two fixed-wing UAVs to recreate various collision-course scenarios, we were able to capture highly realistic vision (from an onboard camera perspective) of the moments leading up to a collision. This type of image data is extremely scarce and was invaluable in evaluating the detection performance of two candidate target detection approaches. Based on the collected data, our detection approaches were able to detect targets at distances ranging from 400 m to about 900 m. These distances (with some assumptions about closing speeds and aircraft trajectories) translate to an advance warning of between 8 and 10 seconds ahead of impact, which approaches the 12.5-second response time recommended for human pilots. We overcame the challenge of achieving real-time computational speeds by exploiting the parallel processing architectures of graphics processing units found on commercial off-the-shelf graphics devices. Our chosen GPU device, suitable for integration onto UAV platforms, can be expected to handle real-time processing of 1024-by-768-pixel image frames at a rate of approximately 30 Hz. Flight trials using manned Cessna aircraft, in which all processing is performed onboard, will be conducted in the near future, followed by further experiments with fully autonomous UAV platforms.
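The stated frame size and rate imply a sustained pixel throughput that helps explain the choice of GPU hardware; a quick back-of-envelope check:

```python
# Processing load implied by the abstract: 1024x768-pixel frames at ~30 Hz.
width, height, fps = 1024, 768, 30
pixels_per_second = width * height * fps
print(f"{pixels_per_second / 1e6:.1f} Mpixels/s")   # ~23.6 Mpixels/s sustained
# Even a few operations per pixel yields hundreds of millions of operations
# per second, which is what motivates the parallel GPU architecture.
```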
Abstract:
Objective: The Brief Michigan Alcoholism Screening Test (bMAST) is a 10-item test derived from the 25-item Michigan Alcoholism Screening Test (MAST). It is widely used in the assessment of alcohol dependence. In the absence of previous validation studies, the principal aim of this study was to assess the validity and reliability of the bMAST as a measure of the severity of problem drinking. Method: A total of 6,594 patients (4,854 men, 1,740 women) who had been referred to a hospital alcohol and drug service for alcohol-use disorders voluntarily participated in this study. Results: An exploratory factor analysis defined a two-factor solution consisting of Perception of Current Drinking and Drinking Consequences factors. Structural equation modeling confirmed that the fit of a nine-item, two-factor model was superior to that of the original one-factor model. Concurrent validity was assessed through simultaneous administration of the Alcohol Use Disorders Identification Test (AUDIT) and through associations with alcohol consumption and clinically assessed features of alcohol dependence. The two-factor bMAST model showed moderate correlations with the AUDIT. The two-factor bMAST and the AUDIT were similarly associated with quantity of alcohol consumption and clinically assessed dependence severity features. No differences were observed between the existing weighted scoring system and the proposed simple scoring system. Conclusions: In this study, both the existing bMAST total score and the two-factor model identified were as effective as the AUDIT in assessing problem drinking severity. There are additional advantages to employing the two-factor bMAST in the assessment and treatment planning of patients seeking treatment for alcohol-use disorders. (J. Stud. Alcohol Drugs 68: 771-779, 2007)
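A small sketch of the two scoring schemes being compared: the existing weighted bMAST total versus the proposed simple one-point-per-item sum. The item weights and responses below are placeholders, not the published bMAST weights.

```python
# Contrast of weighted vs simple scoring for a 10-item yes/no screen.
# Responses and weights are hypothetical placeholders for illustration.
responses = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]        # hypothetical yes(1)/no(0) answers
weights = [2, 2, 5, 2, 2, 2, 2, 5, 5, 2]          # placeholder item weights

weighted_score = sum(w * r for w, r in zip(weights, responses))
simple_score = sum(responses)                      # proposed one-point-per-item scheme
print(weighted_score, simple_score)                # 18 6 (hypothetical values)
```

The study's finding is that the two totals rank patients equivalently in practice, so the simpler scheme loses nothing.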
Abstract:
This paper proposes a security architecture for the basic cross-indexing systems emerging as foundational structures in current health information systems. In these systems, unique identifiers are issued to healthcare providers and consumers. In most cases, such numbering schemes are national in scope and must therefore be used via an indexing system to identify records contained in pre-existing local, regional or national health information systems. Most large-scale electronic health record systems envisage that such correlation between national healthcare identifiers and pre-existing identifiers will be performed by some centrally administered cross-referencing, or index, system. This paper is concerned with the security architecture for such indexing servers and the manner in which they interface with pre-existing health systems (including both workstations and servers). The paper proposes two structures required to achieve the goal of a national-scale, secure exchange of electronic health information: (a) the employment of high-trust computer systems to perform the indexing function, and (b) the development and deployment of an appropriate high-trust interface module, a Healthcare Interface Processor (HIP), to be integrated into the connected workstations or servers of healthcare service providers. The proposed architecture is specifically oriented toward requirements identified in the Connectivity Architecture for Australia's e-health scheme as outlined by NEHTA and in the national e-health strategy released by the Australian Health Ministers.
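A minimal sketch of the cross-indexing function such a high-trust server would perform, with the HIP reduced to a single attestation flag. Identifier formats, method names and the trust check are simplified assumptions for illustration, not NEHTA specifications.

```python
# Sketch only: identifier formats and the trust check are simplified
# assumptions; a real HIP would enforce authenticated, audited channels.
class IndexServer:
    """High-trust server mapping national healthcare identifiers to
    pre-existing local, regional or national record identifiers."""
    def __init__(self):
        self._index: dict[str, dict[str, str]] = {}

    def register(self, national_id: str, system: str, local_id: str) -> None:
        self._index.setdefault(national_id, {})[system] = local_id

    def resolve(self, national_id: str, system: str, hip_attested: bool) -> str | None:
        if not hip_attested:              # only requests via a trusted HIP are served
            return None
        return self._index.get(national_id, {}).get(system)

# Hypothetical usage: a hospital's HIP resolves a national ID to its local MRN.
ix = IndexServer()
ix.register("IHI-8003-6011-1234-5678", "hospital_A", "MRN-00042")
print(ix.resolve("IHI-8003-6011-1234-5678", "hospital_A", hip_attested=True))
```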
Abstract:
Confirmatory factor analyses were conducted to evaluate the factorial validity of the Toronto Alexithymia Scale (TAS-20) in an alcohol-dependent sample. Several factor models were examined, but all were rejected given their poor fit. A revision of the TAS-20 for alcohol-dependent populations may be needed.