274 results for classification scheme
in Queensland University of Technology - ePrints Archive
Abstract:
This paper considers issues of methodological innovation in communication, media and cultural studies that arise from the extent to which we now live in a media environment characterised by an abundance of digital media, the convergence of media platforms, content and services, and the globalisation of media content through ubiquitous computing and high-speed broadband networks. These developments have also entailed a shift in the producer-consumer relationships that characterised the 20th-century mass communications paradigm, with the rapid proliferation of user-created content, accelerated innovation, the growing empowerment of media users themselves, and the blurring of distinctions between public and private, as well as age-based distinctions in terms of what media can be accessed by whom and for what purpose. It considers these issues through a case study of the Australian Law Reform Commission's National Classification Scheme Review.
Abstract:
This paper considers the debate about the relationship between globalization and media policy from the perspective provided by a current review of the Australian media classification scheme. Drawing upon the author’s recent experience in being ‘inside’ the policy process, as Lead Commissioner on the Australian National Classification Scheme Review, it is argued that theories of globalization – including theories of neoliberal globalization – fail to adequately capture the complexities of the reform process, particularly around the relationship between regulation and markets. The paper considers the pressure points for media content policies arising from media globalization, and the wider questions surrounding media content policies in an age of media convergence.
Abstract:
More than 750 individual particles have now been selected from collection flags housed in the JSC Cosmic Dust Curatorial Facility and most have been documented in the Cosmic Dust Catalogs [1]. As increasing numbers of particles are placed in Cosmic Dust Collections, and a greater diversity of particles is introduced to the stratosphere through natural and man-made processes (e.g. decaying orbits of space debris [2]), there is an even greater need for a classification scheme that encompasses all stratospheric particles rather than only extraterrestrial particles. The fundamental requirements for a suitable classification scheme have been outlined in earlier communications [3,4]. A quantitative survey of particles on collection flag W7017 indicates that there is some bias in the number of samples selected within a given category for the Cosmic Dust Catalog [5]. However, the sample diversity within this selection is still appropriate for the development of a reliable classification scheme. In this paper, we extend the earlier work on stratospheric particle classification to include particles collected during the period May 1981 to November 1983.
Abstract:
This presentation discusses some of the general issues relating to the classification of UAS for the purposes of defining and promulgating safety regulations. One possible approach for the definition of a classification scheme for UAS Type Certification Categories is reviewed.
Abstract:
Many existing schemes for malware detection are signature-based. Although they can effectively detect known malware, they cannot detect variants of known malware or new malware. Most network servers do not expect executable code in their inbound network traffic, such as on-line shopping malls, Picasa, Youtube, Blogger, etc. Therefore, such network applications can be protected from malware infection by monitoring their ports to see if incoming packets contain any executable content. This paper proposes a content-classification scheme that identifies executable content in incoming packets. The proposed scheme analyzes the packet payload in two steps. It first analyzes the packet payload to see if it contains multimedia-type data. If not, it then classifies the payload as either text-type or executable. Although in our experiments the proposed scheme shows a low rate of false negatives and false positives (4.69% and 2.53%, respectively), the presence of inaccuracies still requires further inspection to efficiently detect the occurrence of malware. In this paper, we also propose simple statistical and combinatorial analyses to deal with false positives and negatives.
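To make the two-step idea above concrete, here is a minimal sketch, not taken from the paper, of how a payload might first be screened for multimedia content and then split into text-type versus executable. The magic-byte signatures, the 0.9 printable-byte threshold and the name classify_payload are illustrative assumptions; the paper's own statistical and combinatorial analyses are not reproduced.

```python
# Hypothetical two-step payload check: step 1 rules out multimedia content via
# magic-byte signatures; step 2 separates text-like payloads from likely
# executable ones. All signatures and thresholds are assumptions for the sketch.

MULTIMEDIA_MAGIC = {
    b"\xff\xd8\xff": "jpeg",
    b"GIF8": "gif",
    b"\x89PNG": "png",
    b"ID3": "mp3",
}

EXECUTABLE_MAGIC = (b"MZ", b"\x7fELF")  # Windows PE and Linux ELF headers


def classify_payload(payload: bytes) -> str:
    """Classify a packet payload as multimedia, text or executable."""
    # Step 1: multimedia check by magic bytes at the start of the payload.
    for magic, kind in MULTIMEDIA_MAGIC.items():
        if payload.startswith(magic):
            return f"multimedia ({kind})"

    # Step 2: text vs. executable. Explicit executable headers win outright;
    # otherwise fall back to a simple printable-byte ratio heuristic.
    if payload.startswith(EXECUTABLE_MAGIC):
        return "executable"
    printable = sum(1 for b in payload if 32 <= b < 127 or b in (9, 10, 13))
    ratio = printable / max(len(payload), 1)
    return "text" if ratio > 0.9 else "executable"


if __name__ == "__main__":
    print(classify_payload(b"<html><body>hello</body></html>"))          # text
    print(classify_payload(b"\x7fELF\x01\x01\x01\x00" + b"\x00" * 8))    # executable
```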
Abstract:
This article outlines the key recommendations of the Australian Law Reform Commission’s review of the National Classification Scheme, as outlined in its report Classification – Content Regulation and Convergent Media (ALRC, 2012). It identifies key contextual factors that underpin the need for reform of media classification laws and policies, including the fragmentation of regulatory responsibilities and the convergence of media platforms, content and services, as well as discussing the ALRC’s approach to law reform.
Abstract:
With the increasing number of stratospheric particles available for study (via the U2 and/or WB57F collections), it is essential that a simple, yet rational, classification scheme be developed for general use. Such a scheme should be applicable to all particles collected from the stratosphere, rather than limited to only extraterrestrial or chemical sub-groups. Criteria for the efficacy of such a scheme would include: (a) objectivity, (b) ease of use, (c) acceptance within the broader scientific community, and (d) how well the classification provides intrinsic categories which are consistent with our knowledge of particle types present in the stratosphere.
Abstract:
In this presentation, I reflect upon the global landscape surrounding the governance and classification of media content, at a time of rapid change in media platforms and services for content production and distribution, and contested cultural and social norms. I discuss the tensions and contradictions arising in the relationship between national, regional and global dimensions of media content distribution, as well as the changing relationships between state and non-state actors. These tensions will be explored through issues such as: recent debates over film censorship; the review of the National Classification Scheme conducted by the Australian Law Reform Commission; online controversies such as the future of the Reddit social media site; and videos posted online by the militant group ISIS.
Abstract:
An application that translates raw thermal melt curve data into more easily assimilated knowledge is described. This program, called ‘Meltdown’, performs a number of data remediation steps before classifying melt curves and estimating melting temperatures. The final output is a report that summarizes the results of a differential scanning fluorimetry experiment. Meltdown uses a Bayesian classification scheme, enabling reproducible identification of various trends commonly found in DSF datasets. The goal of Meltdown is not to replace human analysis of the raw data, but to provide a sensible interpretation of the data to make this useful experimental technique accessible to naïve users, as well as providing a starting point for detailed analyses by more experienced users.
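As an illustration of the kind of quantity Meltdown reports, the sketch below estimates a melting temperature as the temperature at which the first derivative of a fluorescence melt curve peaks. This is a common DSF heuristic assumed for the example, not Meltdown's Bayesian classification or its data remediation steps; the function name estimate_tm and the synthetic sigmoid curve are likewise assumptions.

```python
# Minimal sketch: estimate a melting temperature (Tm) as the temperature of the
# steepest fluorescence increase, i.e. the peak of the first derivative dF/dT.
import numpy as np


def estimate_tm(temperatures: np.ndarray, fluorescence: np.ndarray) -> float:
    """Return the temperature at which the fluorescence rises most steeply."""
    dF_dT = np.gradient(fluorescence, temperatures)
    return float(temperatures[np.argmax(dF_dT)])


if __name__ == "__main__":
    # Synthetic sigmoidal melt curve with an inflection (Tm) near 55 degC.
    T = np.linspace(25.0, 95.0, 141)
    F = 1.0 / (1.0 + np.exp(-(T - 55.0) / 2.0))
    print(f"Estimated Tm: {estimate_tm(T, F):.1f} degC")
```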
Abstract:
In a typical large office block, by far the largest lifetime expense is the salaries of the workers: 84% for salaries compared with office rent (14%), total energy (1%), and maintenance (1%). The key driver for business is therefore maximising the productivity of employees, as this is the largest cost. Reducing total energy use by 50% will not produce the same financial return as a 1% productivity improvement: a 1% gain on the 84% salary share is worth about 0.84% of total cost, whereas halving a 1% energy share saves only about 0.5%. The aim of the project which led to this review of the literature was to understand, as far as possible, the state of knowledge internationally about how the indoor environment of buildings influences occupants and the impact this influence may have on the total cost of ownership of buildings. One of the main focus areas for the literature review has therefore been identifying whether there is a link between the indoor environment and the productivity and health of building occupants. Productivity is easy to define - the ratio of output to input - but very hard to measure in a relatively small environment where individual contributions, in particular social interactions, can influence the results. Health impacts from a building environment are also difficult to measure well, as establishing causal links between the indoor environment and a particular health issue can be very difficult. All of those issues are canvassed in the literature reported here. Humans are surprisingly adaptive to different physical environments, but the workplace should not test the limits of human adaptability. Physiological models of stress, for example, accept that the body has a finite amount of adaptive energy available to cope with stress. The importance of, and this project's focus on, the physical setting within the integrated system of high performance workplaces means this literature survey explores research which has been undertaken on both physical and social aspects of the built environment. The literature has been largely classified in several different ways, according to the classification scheme shown below. There is still some inconsistency in the use of keywords, which is being addressed; greater uniformity will be developed for a CD version of this literature, enabling searching using this classification scheme.
Abstract:
Buffer overflow vulnerabilities continue to prevail and the sophistication of attacks targeting these vulnerabilities is continuously increasing. As a successful attack of this type has the potential to completely compromise the integrity of the targeted host, early detection is vital. This thesis examines generic approaches for detecting executable payload attacks, without prior knowledge of the implementation of the attack, in such a way that new and previously unseen attacks are detectable. Executable payloads are analysed in detail for attacks targeting the Linux and Windows operating systems executing on an Intel IA-32 architecture. The execution flow of attack payloads is analysed and a generic model of execution is examined. A novel classification scheme for executable attack payloads is presented which allows for characterisation of executable payloads and facilitates vulnerability and threat assessments, as well as intrusion detection capability assessments for intrusion detection systems. An intrusion detection capability assessment may be utilised to determine whether or not a deployed system is able to detect a specific attack and to identify requirements for intrusion detection functionality for the development of new detection methods. Two novel detection methods are presented capable of detecting new and previously unseen executable attack payloads. The detection methods are capable of identifying and enumerating the executable payload's interactions with the operating system on the targeted host at the time of compromise. The detection methods are further validated using real world data including executable payload attacks.
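As a hedged illustration of what enumerating a payload's interactions with the operating system can look like at the byte level on Linux/IA-32, the toy scan below locates occurrences of the classic int 0x80 system-call instruction in a buffer. The constant and function names are hypothetical and this is not the thesis's detection method, only a sketch of the kind of evidence such methods examine.

```python
# Hypothetical scan for operating-system interaction sites: on Linux/IA-32 a
# system call is commonly issued with the two-byte instruction "int 0x80"
# (0xCD 0x80), with the syscall number held in EAX.

LINUX_INT80 = b"\xcd\x80"   # int 0x80: classic IA-32 Linux syscall gate


def find_syscall_sites(buf: bytes) -> list[int]:
    """Return the offset of every int 0x80 byte pair found in the buffer."""
    sites, start = [], 0
    while (pos := buf.find(LINUX_INT80, start)) != -1:
        sites.append(pos)
        start = pos + 1
    return sites


if __name__ == "__main__":
    # Fragment resembling a shellcode epilogue: xor eax,eax; inc eax; int 0x80.
    sample = b"\x31\xc0\x40\xcd\x80"
    print(find_syscall_sites(sample))   # [3]
```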
Abstract:
The crystal structures of the proton-transfer compounds of 3,5-dinitrosalicylic acid (DNSA) with a series of aniline-type Lewis bases [aniline, 2-hydroxyaniline, 2-methoxyaniline, 3-methoxyaniline, 4-fluoroaniline, 4-chloroaniline and 2-aminoaniline] have been determined and their hydrogen-bonding systems analysed. All are anhydrous 1:1 salts: [(C6H8N)+(C7H3N2O7)-], (1), [(C6H8NO)+(C7H3N2O7)-], (2), [(C7H10NO)+(C7H3N2O7)-], (3), [(C7H10NO)+(C7H3N2O7)-], (4), [(C6H7FN)+(C7H3N2O7)-], (5), [(C6H7ClN)+(C7H3N2O7)-], (6), and [(C6H9N2)+(C7H3N2O7)-], (7), respectively. Crystals of 1 and 6 are triclinic, space group P-1, while the remainder are monoclinic with space group either P21/n (2, 4, 5 and 7) or P21 (3). Unit cell dimensions and contents are: for 1, a = 7.2027(17), b = 7.5699(17), c = 12.9615(16) Å, α = 84.464(14), β = 86.387(15), γ = 75.580(14)°, Z = 2; for 2, a = 7.407(3), b = 6.987(3), c = 27.653(11) Å, β = 94.906(7)°, Z = 4; for 3, a = 8.2816(18), b = 23.151(6), c = 3.9338(10) Å, β = 95.255(19)°, Z = 2; for 4, a = 11.209(2), b = 8.7858(19), c = 15.171(3) Å, β = 93.717(4)°, Z = 4; for 5, a = 26.377(3), b = 10.1602(12), c = 5.1384(10) Å, β = 91.996(13)°, Z = 4; for 6, a = 11.217(3), b = 14.156(5), c = 4.860(3) Å, α = 99.10(4), β = 96.99(4), γ = 76.35(2)°, Z = 2; for 7, a = 12.830(4), b = 8.145(3), c = 14.302(4) Å, β = 102.631(6)°, Z = 4. In all compounds at least one primary linear intermolecular N+-H…O(carboxyl) hydrogen-bonding interaction is present which, together with secondary hydrogen bonding, results in the formation of mostly two-dimensional network structures, the exceptions being compounds 4 and 5 (one-dimensional) and compound 6 (three-dimensional). In only two cases [compounds 1 and 4] are weak cation-anion or cation-cation π-π interactions found, while weak aromatic C-H…O interactions are insignificant. The study shows that all compounds fit the previously formulated classification scheme for primary and secondary interactive modes for proton-transfer compounds of 3,5-dinitrosalicylic acid, but there are some unusual variants.
Abstract:
This paper will consider some of the wider contextual and policy questions arising out of four major public inquiries that took place in Australia over 2011–2012: the Convergence Review, the National Classification Scheme Review, the Independent Media Inquiry (Finkelstein Review) and the National Cultural Policy. This paper considers whether we are now witnessing a ‘convergent media policy moment’ akin to the ‘cultural policy moment’ theorized by Australian cultural researchers in the early 1990s, and the limitations of various approaches to understanding policy – including critiques of neoliberalism – in understanding such shifts. It notes the rise of ‘soft law’ as a means of addressing the challenges of regulatory design in an era of rapid media change, with consideration of two cases: the approach to media influence taken in the Convergence Review, and the concept of ‘deeming’ developed in the National Classification Scheme Review.
Abstract:
The development of public service broadcasters (PSBs) in the 20th century was framed around debates about their difference from commercial broadcasting. These debates navigated between two poles. One concerned the relationship between non‐commercial sources of funding and the role played by statutory Charters as guarantors of the independence of PSBs. The other concerned the relationship between PSBs being both a complementary and a comprehensive service, although there are tensions inherent in this duality. In the 21st century, as reconfigured public service media organisations (PSMs) operate across multiple platforms in a convergent media environment, how are these debates changing, if at all? Is the case for PSM "exceptionalism" changed by Web‐based services, catch‐up TV, podcasting, ancillary product sales, and the commissioning of programs from external sources in order to operate in highly diversified cross‐media environments? Do the traditional assumptions about non‐commercialism still hold as the basis for different forms of PSM governance and accountability? This paper will consider the question of PSM exceptionalism in the context of three reviews into Australian media that took place over 2011‐2012: the Convergence Review undertaken through the Department of Broadband, Communications and the Digital Economy; the National Classification Scheme Review undertaken by the Australian Law Reform Commission; and the Independent Media Inquiry that considered the future of news and journalism.
Abstract:
On 24 March 2011, Attorney-General Robert McClelland referred the National Classification Scheme to the ALRC and asked it to conduct widespread public consultation across the community and industry. The review considered issues including: existing Commonwealth, State and Territory classification laws; the current classification categories contained in the Classification Act, Code and Guidelines; the rapid pace of technological change; the need to improve classification information available to the community; the effect of media on children; and the desirability of a strong content and distribution industry in Australia. During the inquiry, the ALRC conducted face-to-face consultations with stakeholders, hosted two online discussion forums, and commissioned pilot community and reference group forums into community attitudes to higher level media content. The ALRC published two consultation documents—an Issues Paper and a Discussion Paper—and invited submissions from the public. The Final Report was tabled in Parliament on 1 March 2012. Recommendations: The report makes 57 recommendations for reform. The net effect of the recommendations would be the establishment of a new National Classification Scheme that: applies consistent rules to content that are sufficiently flexible to be adaptive to technological change; places a regulatory focus on restricting access to adult content, helping to promote cyber-safety and protect children from inappropriate content across media platforms; retains the Classification Board as an independent classification decision maker with an essential role in setting benchmarks; promotes industry co-regulation, encouraging greater industry content classification, with government regulation more directly focused on content of higher community concern; provides for pragmatic regulatory oversight, to meet community expectations and safeguard community standards; reduces the overall regulatory burden on media content industries while ensuring that content obligations are focused on what Australians most expect to be classified; and harmonises classification laws across Australia, for the benefit of consumers and content providers.