Abstract:
Queensland University of Technology (QUT) completed an Australian National Data Service (ANDS) funded "Seeding the Commons" project to contribute metadata to Research Data Australia. The project employed two Research Data Librarians from October 2009 to July 2010. Technical support for the project was provided by QUT's High Performance Computing and Research Support Specialists.

The project identified and described QUT's category 1 (ARC/NHMRC) research datasets. Metadata for the research datasets was stored in QUT's Research Data Repository (Arcitecta Mediaflux), and metadata suitable for inclusion in Research Data Australia was made available to the Australian Research Data Commons (ARDC) in RIF-CS format.

Several workflows and processes were developed during the project. 195 data interviews took place in connection with 424 separate research activities, resulting in the identification of 492 datasets.

The project had a high level of technical support from the QUT High Performance Computing and Research Support Specialists, who developed the Research Data Librarian interface to the data repository, enabling manual entry of interview data and dataset metadata and the creation of relationships between repository objects. The Research Data Librarians mapped the QUT metadata repository fields to RIF-CS, and an application was created by the HPC and Research Support Specialists to generate RIF-CS files for harvest by the Australian Research Data Commons (ARDC).

This poster will focus on the workflows and processes established for the project, including:
• Interview processes and instruments
• Data ingest from existing systems (including mapping to RIF-CS)
• Data entry and the Data Librarian interface to Mediaflux
• Verification processes
• Mapping and creation of RIF-CS for the ARDC
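As a rough illustration of the kind of field-to-RIF-CS mapping described above, the sketch below builds a minimal RIF-CS collection record from a dictionary of repository metadata. The field names, registry keys and URL are hypothetical, and the real QUT mapping and harvest application covered many more elements (identifiers, rights, subjects, related objects).

```python
# Minimal sketch of mapping repository metadata fields to a RIF-CS collection record.
# Field names, keys and the originating source URL below are hypothetical.
import xml.etree.ElementTree as ET

RIF_NS = "http://ands.org.au/standards/rif-cs/registryObjects"

def dataset_to_rifcs(record: dict) -> bytes:
    ET.register_namespace("", RIF_NS)
    root = ET.Element(f"{{{RIF_NS}}}registryObjects")
    obj = ET.SubElement(root, f"{{{RIF_NS}}}registryObject",
                        group="Queensland University of Technology")
    ET.SubElement(obj, f"{{{RIF_NS}}}key").text = record["key"]
    ET.SubElement(obj, f"{{{RIF_NS}}}originatingSource").text = record["source"]
    coll = ET.SubElement(obj, f"{{{RIF_NS}}}collection", type="dataset")
    name = ET.SubElement(coll, f"{{{RIF_NS}}}name", type="primary")
    ET.SubElement(name, f"{{{RIF_NS}}}namePart").text = record["title"]
    desc = ET.SubElement(coll, f"{{{RIF_NS}}}description", type="full")
    desc.text = record["description"]
    # Relationship from the dataset to the responsible researcher (a party record).
    rel = ET.SubElement(coll, f"{{{RIF_NS}}}relatedObject")
    ET.SubElement(rel, f"{{{RIF_NS}}}key").text = record["researcher_key"]
    ET.SubElement(rel, f"{{{RIF_NS}}}relation", type="isManagedBy")
    return ET.tostring(root, encoding="utf-8")

example = {
    "key": "qut.edu.au/dataset/0001",                      # hypothetical key
    "source": "https://researchdatafinder.qut.edu.au",     # hypothetical source
    "title": "Example research dataset",
    "description": "Dataset described during a data interview.",
    "researcher_key": "qut.edu.au/party/0001",             # hypothetical party key
}
print(dataset_to_rifcs(example).decode())
```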
Abstract:
Purpose: Flickering stimuli increase the metabolic demand of the retina, making them sensitive perimetric stimuli to the early onset of retinal disease. We determined whether flickering stimuli are a sensitive indicator of vision deficits resulting from acute, mild systemic hypoxia compared with standard static perimetry. Methods: Static and flicker visual perimetry were performed in 14 healthy young participants while breathing 12% oxygen (hypoxia) under photopic illumination. The hypoxia visual field data were compared with the field data measured during normoxia. Absolute sensitivities (in dB) were analysed in seven concentric rings at 1°, 3°, 6°, 10°, 15°, 22° and 30° eccentricity, and the mean defect (MD) and pattern defect (PD) were calculated. Preliminary data are reported for mesopic light levels. Results: Under photopic illumination, flicker and static visual field sensitivities at all eccentricities were not significantly different between the hypoxia and normoxia conditions. The mean defect and pattern defect were not significantly different for either test between the two oxygenation conditions. Conclusion: Although flicker stimulation increases cellular metabolism, flicker photopic visual field impairment is not detected during mild hypoxia. These findings contrast with electrophysiological flicker tests in young participants that show impairment at photopic illumination during the same levels of mild hypoxia. Potential mechanisms contributing to the difference between the visual field and electrophysiological flicker tests, including variability in perimetric data, neuronal adaptation and vascular autoregulation, are considered. The data have implications for the use of visual perimetry in the detection of ischaemic/hypoxic retinal disorders under photopic and mesopic light levels.
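For readers unfamiliar with the summary indices, the sketch below computes simple, unweighted versions of a mean defect and a pattern-defect-style index from measured and normative ring sensitivities. The normative values are hypothetical, and commercial perimeters use age-corrected normals and eccentricity weighting that are omitted here.

```python
# Simplified, unweighted perimetric summary indices.
# Normative sensitivities are hypothetical; real instruments apply
# age-corrected normals and eccentricity-dependent weighting.
import numpy as np

normal = np.array([33, 32, 31, 30, 29, 27, 25], dtype=float)    # dB per ring (hypothetical)
measured = np.array([32, 31, 31, 28, 29, 26, 24], dtype=float)   # dB per ring (hypothetical)

deviation = normal - measured           # local sensitivity loss at each ring
mean_defect = deviation.mean()          # MD: average loss across the field
pattern_defect = deviation.std(ddof=1)  # PD-like index: non-uniformity of the loss

print(f"MD = {mean_defect:.2f} dB, PD = {pattern_defect:.2f} dB")
```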
Abstract:
PURPOSE: To determine whether participants with normal visual acuity, no ophthalmoscopic signs of age-related maculopathy (ARM) in either eye, and who carry the CFH, LOC387715 and HTRA1 high-risk genotypes ("gene-positive") have impaired rod- and cone-mediated mesopic visual function compared with persons who do not carry the risk genotypes ("gene-negative").

METHODS: Fifty-three Caucasian study participants (mean age 55.8 ± 6.1 years) were genotyped for CFH, LOC387715/ARMS2 and HTRA1 polymorphisms. We genotyped single nucleotide polymorphisms (SNPs) in the CFH (rs380390), LOC387715/ARMS2 (rs10490924) and HTRA1 (rs11200638) genes using Applied Biosystems optimised TaqMan assays. We determined the critical fusion frequency (CFF) mediated by cones alone (long-, middle- and short-wavelength-sensitive cones; LMS) and by the combined activities of cones and rods (LMSR). The stimuli were generated using a 4-primary photostimulator that provides independent control of photoreceptor excitation under mesopic light levels. Visual function was further assessed using standard clinical tests, flicker perimetry and microperimetry.

RESULTS: The mesopic CFF mediated by rods and cones (LMSR) was significantly reduced in gene-positive compared with gene-negative participants after correction for age (p = 0.03). Cone-mediated CFF (LMS) was not significantly different between gene-positive and gene-negative participants. There were no significant associations between flicker perimetry or microperimetry and genotype.

CONCLUSIONS: This is the first study to relate ARM risk genotypes to mesopic visual function in clinically normal persons. These preliminary results could become clinically important, as mesopic vision may be used to document sub-clinical retinal changes in persons with risk genotypes and to determine whether those persons progress to manifest disease.
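An age-corrected group comparison of this kind can be sketched as an analysis of covariance; the abstract does not specify the authors' statistical model, and the data generated below are placeholders rather than study values.

```python
# Sketch of an age-corrected comparison of mesopic CFF between genotype groups
# using an ANCOVA-style linear model. All values below are placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 53
df = pd.DataFrame({
    "age": rng.uniform(45, 70, n),
    "gene_positive": rng.integers(0, 2, n),
})
# Placeholder CFF values with an arbitrary genotype effect, for illustration only.
df["cff_lmsr"] = 30 - 0.1 * df["age"] - 1.5 * df["gene_positive"] + rng.normal(0, 1.5, n)

model = smf.ols("cff_lmsr ~ gene_positive + age", data=df).fit()
# The gene_positive coefficient is the age-adjusted group difference in CFF.
print(model.summary().tables[1])
```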
Abstract:
China’s Creative Industries explores the role of new technologies, globalization and higher levels of connectivity in re-defining relationships between ‘producers’ and ‘consumers’ in 21st century China. The evolution of new business models, the impact of state regulation, the rise of entrepreneurial consumers and the role of intellectual property rights are traced through China’s film, music and fashion industries. The book argues that social network markets, consumer entrepreneurship and business model evolution are driving forces in the production and commercialization of cultural commodities. In doing so it raises important questions about copyright’s role in the business of culture, particularly in a digital age.
Abstract:
China has a reputation as an economy based on utility: the large-scale manufacture of low-priced goods. But useful values like functionality, fitness for purpose and efficiency are only part of the story. More important are what Veblen called ‘honorific’ values, arguably the driving force of development, change and value in any economy. To understand the Chinese economy therefore, it is not sufficient to point to its utilitarian aspect. Honorific status-competition is a more fundamental driver than utilitarian cost-competition. We argue that ‘social network markets’ are the expression of these honorific values, relationships and connections that structure and coordinate individual choices. This paper explores how such markets are developing in China in the area of fashion and fashion media. These, we argue, are an expression of ‘risk culture’ for high-end entrepreneurial consumers and producers alike, providing a stimulus to dynamic innovation in the arena of personal taste and comportment, as part of an international cultural system based on constant change. We examine the launch of Vogue China in 2005, and China’s reception as a fashion player among the international editions of Vogue, as an expression of a ‘decisive moment’ in the integration of China into an international social network market based on honorific values.
Abstract:
To evaluate whether luminance contrast discrimination losses in amblyopia on putative magnocellular (MC) and parvocellular (PC) pathway tasks reflect deficits at retinogeniculate or cortical sites. Fifteen amblyopes (six anisometropic, seven strabismic, two mixed) and 12 age-matched controls were investigated. Contrast discrimination was measured using established psychophysical procedures that differentiate MC and PC processing. Data were described with a model of the contrast response of primate retinal ganglion cells. All amblyopes and controls displayed the same contrast signatures on the MC and PC tasks, with three strabismics showing reduced sensitivity. Amblyopic PC contrast gain was similar to electrophysiological estimates from visually normal, non-human primates. Sensitivity losses evident in a subset of the amblyopes reflect cortical summation deficits, with no change in retinogeniculate contrast responses. The data do not support the proposal that amblyopic contrast sensitivity losses on MC and PC tasks reflect retinogeniculate deficits; rather, they are due to anomalous post-retinogeniculate cortical processing of retinal signals.
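The abstract does not state the model's equation; a commonly used form for primate ganglion-cell contrast responses is the saturating hyperbolic function R(C) = Rmax·C / (C + Csat), whose initial slope Rmax/Csat is the contrast gain. The sketch below fits that form to hypothetical data, as an illustration of how contrast gain can be estimated, not as the authors' exact model.

```python
# Fit a saturating contrast-response function R(C) = Rmax * C / (C + Csat);
# the ratio Rmax / Csat gives the contrast gain. Data points are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

def saturating_response(c, r_max, c_sat):
    return r_max * c / (c + c_sat)

contrast = np.array([0.02, 0.05, 0.10, 0.20, 0.40, 0.80])   # Michelson contrast
response = np.array([5.0, 11.0, 18.0, 27.0, 34.0, 38.0])    # hypothetical responses

(r_max, c_sat), _ = curve_fit(saturating_response, contrast, response, p0=(40, 0.2))
print(f"Rmax = {r_max:.1f}, Csat = {c_sat:.3f}, contrast gain = {r_max / c_sat:.1f}")
```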
Abstract:
The Digital Economy Bill has been heavily criticized by consumer organizations, internet service providers and technology experts on the grounds that it will reduce the public’s ability to access politically sensitive information, impinge on citizens’ rights to privacy, threaten freedom of expression and have a chilling effect on digital innovation. Its passage in spite of these criticisms reflects, among other things, the power of the rhetoric that has been employed by its proponents. This paper examines economic arguments surrounding the digital economy debate in light of lessons from one of the world's fastest growing economies: China.
Abstract:
Type unions, pointer variables and function pointers are a long-standing source of subtle security bugs in C program code. Their use can lead to hard-to-diagnose crashes or exploitable vulnerabilities that allow an attacker to attain privileged access to classified data. This paper describes an automatable framework for detecting such weaknesses in C programs statically, where possible, and for generating assertions that will detect them dynamically in other cases. Based exclusively on analysis of the source code, it identifies required assertions using a type inference system supported by a custom-made symbol table. In our preliminary findings, the type system was able to infer the correct type of unions in different scopes without manual code annotations or rewriting. Whenever an evaluation is not possible or is difficult to resolve, appropriate runtime assertions are formed and inserted into the source code. The approach is demonstrated via a prototype C analysis tool.
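A toy sketch of the general idea (not the authors' tool) is given below: a symbol table tracks which union member was last written in each scope, a read of a different member is reported as a definite error statically, and where the active member cannot be resolved a runtime assertion is emitted instead. The `__active_member` helper in the generated assertion is hypothetical.

```python
# Toy illustration of union-member tracking with a symbol table (not the paper's tool):
# definite member mismatches are reported statically; unresolved cases produce a
# runtime assertion to be inserted into the C source. __active_member is hypothetical.
from dataclasses import dataclass, field

@dataclass
class SymbolTable:
    active_member: dict = field(default_factory=dict)  # union variable -> last-written member

    def write(self, var: str, member: str) -> None:
        self.active_member[var] = member

    def read(self, var: str, member: str) -> str:
        known = self.active_member.get(var)
        if known is None:
            # Cannot resolve statically: emit a runtime check for the source code.
            return f'assert(__active_member({var}) == "{member}");'
        if known != member:
            return f"ERROR: {var}.{member} read while {var}.{known} is active"
        return "OK"

table = SymbolTable()
table.write("u", "as_int")
print(table.read("u", "as_float"))  # definite mismatch -> static error
print(table.read("v", "as_ptr"))    # unknown union state -> runtime assertion emitted
```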
Abstract:
China has made great progress in constructing comprehensive legislative and judicial infrastructures to protect intellectual property rights (IPRs). But levels of enforcement remain low. Estimates suggest that 90% of film and music products consumed in China are ‘pirated’, and in 2009, 81% of the infringing goods seized at the US border originated from China. Despite heavy criticism over its failure to enforce IPRs, key areas of China’s creative industries, including film, mobile music, fashion and animation, are developing rapidly. This paper explores how the rapid expansion of China’s creative economy might be reconciled with conceptual approaches that view the creative industries (CIs) in terms of creativity inputs and IP outputs. It argues that an evolutionary understanding of copyright’s role in creative innovation might better explain China’s experiences and provide more general insights into the nature of the creative industries and the policies most likely to promote growth in this sector of the economy.
Abstract:
The processes of digitisation and deregulation have transformed the production, distribution and consumption of information and entertainment media over the past three decades. Today, researchers are confronted with profoundly different landscapes of domestic and personal media than the pioneers of qualitative audience research that came to form much of the conceptual basis of Cultural Studies, first in Britain and North America and subsequently across all global regions. The process of media convergence, as a consequence of the dual forces of digitisation and deregulation, thus constitutes a central concept in the analysis of popular mass media. From the study of the internationalisation and globalisation of media content and changing regimes of media production, via the social shaping of communication technologies and, conversely, the impact of communication technology on social, cultural and political realities, to the emergence of transmedia storytelling, the interplay of intertextuality and genre and the formation of mediated social networks, convergence informs and shapes contemporary conceptual debates in the field of popular communication and beyond. However, media convergence challenges not only the conceptual canon of (popular) communication research but also poses profound methodological challenges. As boundaries between producers and consumers become increasingly fluid, formerly stable fields and categories of research such as industries, texts and audiences intersect and overlap, requiring combined and new research strategies. This preconference aims to offer a forum to present and discuss methodological innovations in the study of contemporary media and the analysis of the social, cultural and political impact and challenges arising through media convergence. The preconference thus aims to focus on the following methodological questions and challenges:
• New strategies of audience research responding to the increasing individualisation of popular media consumption.
• Methods of data triangulation in and through the integrated study of media production, distribution and consumption.
• Bridging the methodological, and often associated conceptual, gap between qualitative and quantitative research in the study of popular media.
• The future of ethnographic audience and production research in light of blurring boundaries between media producers and consumers.
• A critical re-examination of which textual configurations can be meaningfully described and studied as texts.
• Methodological innovations aimed at assessing the macro-level social, cultural and political impact of mediatization (including, but not limited to, "creative methods").
• Methodological responses to the globalisation of popular media and the practicalities of international and transnational comparative research.
• An exploration of new methods required in the study of media flow and intertextuality.
Abstract:
Measurements in the exhaust plume of a petrol-driven motor car showed that molecular cluster ions of both signs were present in approximately equal amounts. The emission rate increased sharply with engine speed while the charge symmetry remained unchanged. Measurements at the kerbside of nine motorways and five city roads showed that the mean total cluster ion concentration near city roads (603 cm⁻³) was about one-half of that near motorways (1211 cm⁻³) and about twice as high as that in the urban background (269 cm⁻³). Both positive and negative ion concentrations near a motorway showed a significant linear increase with traffic density (R² = 0.3 at p < 0.05) and correlated well with each other in real time (R² = 0.87 at p < 0.01). Heavy-duty diesel vehicles comprised the main source of ions near busy roads. Measurements were conducted as a function of downwind distance from two motorways carrying around 120-150 vehicles per minute. Total traffic-related cluster ion concentrations decreased rapidly with distance, falling by one-half between the closest approach of 2 m and 5 m from the kerb. Measured concentrations decreased to background levels at about 15 m from the kerb when the wind speed was 1.3 m s⁻¹, this distance being greater at higher wind speeds. The number and net charge concentrations of aerosol particles were also measured. Unlike particles, which were carried downwind to distances of a few hundred metres, cluster ions emitted by motor vehicles were not present at more than a few tens of metres from the road.
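As a back-of-envelope consistency check, if the traffic-related excess above background is assumed to decay exponentially with distance, the reported halving between 2 m and 5 m implies a decay constant of about 0.23 m⁻¹, which reduces the excess to roughly 5% of its kerbside value by about 15 m, consistent with the reported return to background. The short sketch below works through that arithmetic using the figures quoted above.

```python
# Back-of-envelope check: exponential decay of the traffic-related ion excess
# with distance from the kerb, using figures quoted in the abstract and
# assuming a simple exponential decay model.
import numpy as np

kerbside = 1211.0    # mean total cluster ion concentration near motorways (cm^-3)
background = 269.0   # urban background concentration (cm^-3)
excess0 = kerbside - background

# The excess halves between 2 m and 5 m -> decay constant k (per metre).
k = np.log(2) / (5.0 - 2.0)

for d in (2, 5, 10, 15):
    excess = excess0 * np.exp(-k * (d - 2))
    print(f"{d:>2} m: excess ≈ {excess:4.0f} cm^-3, total ≈ {background + excess:4.0f} cm^-3")
```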
Abstract:
Neo-liberalism has become one of the boom concepts of our time. From its original reference point as a descriptor of the economics of the “Chicago School” of economists such as Milton Friedman, or of authors such as Friedrich von Hayek, neo-liberalism has become an all-purpose descriptor and explanatory device for phenomena as diverse as Bollywood weddings, standardized testing in schools, violence in Australian cinema, and the digitization of content in public libraries. Moreover, it has become an entirely pejorative term: no one refers to their own views as “neo-liberal”; rather, it refers to the erroneous views held by others, whether they acknowledge this or not. Neo-liberalism as it has come to be used, then, bears many of the hallmarks of a dominant ideology theory in the classical Marxist sense, even if it is often not explored in these terms. This presentation will take the opportunity provided by the English-language publication of Michel Foucault’s 1978-79 lectures, under the title The Birth of Biopolitics, to consider how he used the term neo-liberalism, and how this equates with its current uses in critical social and cultural theory. It will be argued that Foucault did not understand neo-liberalism as a dominant ideology in these lectures, but rather as marking a point of inflection in the historical evolution of liberal political philosophies of government. It will also be argued that his interpretation of neo-liberalism was more nuanced and more comparative than the more recent uses of Foucault in the literature on neo-liberalism. Finally, it will look at how Foucault develops comparative historical models of liberal capitalism in The Birth of Biopolitics, arguing that this dimension of his work has been lost in more recent interpretations, which tend to retro-fit Foucault to contemporary critiques of either U.S. neo-conservatism or the “Third Way” of Tony Blair’s New Labour in the UK.
Abstract:
The launch of the Apple iPad in January 2010 was one of the most anticipated and publicised launches of a new technological device in recent history. Positioning the iPad between a smart phone and a PC, but with the attributes of both, Apple have sought to develop a new market niche for tablet devices, and early signs are that market expectations are being met. The iPad’s launch was potentially fortuitous for the newspaper industry worldwide, as it offered the potential to address two of the industry’s recurring problems: the slow but inexorable decline of print media circulation, and the inability to satisfactorily monetise online readerships. As a result, the Apple iPad has benefited from an enormous amount of free publicity in newspapers as they develop their own applications (apps) for the device. This paper reports on findings from work undertaken through the Smart Services CRC into potential take-up and likely uses of the iPad, and their implications for the news media industry. It reports on focus group analysis undertaken in mid-2010 using “customer job mapping” methodologies that draw attention to current gaps in user behaviour in terms of available devices, in order to anticipate possibilities beyond the current “three screens” of the PC, mobile phone and television.
Abstract:
The Guardian’s reportage of the 2009 United Kingdom Member of Parliament (MP) expenses scandal used crowdsourcing and computational journalism techniques. Computational journalism can be broadly defined as the application of computer science techniques to the activities of journalism. Its foundation lies in computer-assisted reporting techniques, and its importance is increasing due to: (a) the increasing availability of large-scale government datasets for scrutiny; (b) the declining cost, increasing power and ease of use of data mining and filtering software, and of Web 2.0; and (c) the explosion of online public engagement and opinion. This paper provides a case study of the Guardian’s MP expenses scandal reportage and reveals some key challenges and opportunities for digital journalism. It finds that journalists may increasingly take an active role in understanding, interpreting, verifying and reporting clues or conclusions that arise from the interrogation of datasets (computational journalism). Secondly, a distinction should be made between information reportage and computational journalism in the digital realm, just as a distinction might be made between citizen reporting and citizen journalism. Thirdly, an opportunity exists for online news providers to take a ‘curatorial’ role, selecting and making easily available the best data sources for readers to use (information reportage). These activities have always been fundamental to journalism; however, the way in which they are undertaken may change. Findings from this paper may suggest opportunities and challenges for the implementation of computational journalism techniques in practice by digital Australian media providers, and further areas of research.
Abstract:
Within a surveillance video, occlusions are commonplace, and accurately resolving these occlusions is key when seeking to accurately track objects. The challenge of accurately segmenting objects is further complicated by the fact that within many real-world surveillance environments, the objects appear very similar. For example, footage of pedestrians in a city environment will consist of many people wearing dark suits. In this paper, we propose a novel technique to segment groups and resolve occlusions using optical flow discontinuities. We demonstrate that the ratio of continuous to discontinuous pixels within a region can be used to locate the overlapping edges, and incorporate this into an object tracking framework. Results on a portion of the ETISEO database show that the proposed algorithm results in improved tracking performance overall, and improved tracking within occlusions.
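A minimal sketch of the flow-discontinuity idea follows, assuming dense Farnebäck optical flow from OpenCV. The gradient threshold and the way the continuous-to-discontinuous ratio is computed per region are illustrative choices, not the authors' exact formulation; regions with a low ratio contain internal flow edges and are candidates for splitting before tracking.

```python
# Sketch of locating occlusion boundaries from optical-flow discontinuities:
# pixels where the dense flow field changes sharply are marked discontinuous,
# and the ratio of continuous to discontinuous pixels is computed for a region.
# Threshold and region handling are illustrative, not the paper's exact method.
import cv2
import numpy as np

def flow_discontinuity_ratio(prev_gray, curr_gray, region_mask, grad_thresh=1.0):
    # Dense optical flow between two consecutive grayscale frames.
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    du, dv = flow[..., 0], flow[..., 1]
    # L1 magnitude of the spatial gradient of both flow components.
    grad_mag = (np.abs(cv2.Sobel(du, cv2.CV_32F, 1, 0)) +
                np.abs(cv2.Sobel(du, cv2.CV_32F, 0, 1)) +
                np.abs(cv2.Sobel(dv, cv2.CV_32F, 1, 0)) +
                np.abs(cv2.Sobel(dv, cv2.CV_32F, 0, 1)))
    discontinuity = grad_mag > grad_thresh

    region = region_mask.astype(bool)
    discontinuous = np.count_nonzero(discontinuity & region)
    continuous = np.count_nonzero(region) - discontinuous
    # High ratio -> few internal flow edges -> likely a single object;
    # low ratio -> internal discontinuities suggest overlapping objects.
    return continuous / max(discontinuous, 1)
```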