991 results for open door
Abstract:
Fierce debates characterised 2013 as the implications of government mandates for open access were worked through and pathways for implementation mapped out. There is no doubt that journals will move to a mix of gold and green open access, and that the relationship between the two will remain unsettled. But what of books? Is it conceivable that in those subjects, such as the humanities and social sciences, where something longer than the journal article is still the preferred form of scholarly communication, books will stay closed? Will it be acceptable for some publicly funded research to be available only in closed book form (whether print or digital) while subjects that favour articles go open access? Frances Pinter is in the middle of these debates, having founded Knowledge Unlatched (see www.knowledgeunlatched.org). KU is a global library consortium enabling open access books. Knowledge Unlatched is helping libraries to work together for a sustainable open future for specialist academic books. Its vision is a healthy market that includes free access for end users. In this session she will review the different models being experimented with around the world, including author-side payments, institutional subsidies and research funding body approaches, and compare and contrast them with those already in place for journal articles. She will also review the policy landscape and report on how open access scholarly books are faring to date. Frances Pinter, Founder, Knowledge Unlatched, UK
Abstract:
This three-hour workshop tackles the crucial question of whether globally coordinated, market-based approaches to funding open access monographs can support the unique needs of Australian research communities. The workshop takes place in the context of the release in August 2013 of the Book Industry Collaborative Council (BICC) report, and especially the recommendations included in the chapter on scholarly book publishing in the humanities and social sciences. This workshop, with expert speakers from the BICC Committee and from across the scholarly publishing industry, will discuss the policy issues most likely to ensure that Australian scholarly communities and audiences are best served in an era of digital technology and globalisation. Australia must think globally and support developments that enhance the accessibility of publicly funded research. Speakers will outline recent developments in scholarly monograph publishing, including new Open Access initiatives. Knowledge Unlatched is one example of an attempt to create an internationally coordinated, market-based route to open access for Humanities, Arts and Social Sciences (HASS) monographs. Knowledge Unlatched, a not-for-profit London-based company, is piloting a global library consortium approach to funding open access monographs and released its pilot program in early October with 28 titles from 13 publishers. The workshop invites discussion and debate from librarians, publishers, researchers and research funders on the role of international coordination and markets in securing a more open future for Australian HASS scholarship.
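The library consortium ("unlatching") approach described above spreads a fixed title fee across all participating libraries, so the per-library cost falls as more institutions pledge. A minimal sketch of that arithmetic follows; the title fee and pledge counts are illustrative assumptions, not published Knowledge Unlatched prices.

```python
# Minimal sketch of the consortium cost-sharing ("unlatching") arithmetic.
# All figures are hypothetical; real Knowledge Unlatched fees differ.

def per_library_cost(title_fee: float, pledging_libraries: int) -> float:
    """Each participating library pays an equal share of the fixed title fee."""
    if pledging_libraries < 1:
        raise ValueError("at least one library must pledge")
    return title_fee / pledging_libraries

if __name__ == "__main__":
    title_fee = 12_000.0  # hypothetical fee to "unlatch" one monograph
    for libraries in (50, 100, 200, 300):
        cost = per_library_cost(title_fee, libraries)
        print(f"{libraries:>3} libraries -> ${cost:,.2f} per library per title")
```

The design point is simply that the more institutions pledge, the cheaper each title becomes for every participant, which is the incentive at the heart of the model.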
Abstract:
On 19 June 2013 Knowledge Unlatched and the Berkman Center for Internet and Society at Harvard Law School jointly convened a one-day workshop titled Open Access and Scholarly Books in Cambridge, MA. The workshop brought together a group of 21 invited publishers, librarians, academics and Open Access innovators to discuss the challenge of making scholarly books Open Access. This report captures discussions that took place on the day.
Abstract:
After nearly fifteen years of the open access (OA) movement and its hard-fought struggle for a more open scholarly communication system, publishers are realizing that business models can be both open and profitable. Making journal articles available under an OA license is becoming an accepted strategy for maximizing the value of content to both research communities and the businesses that serve them. The first post in this two-part series celebrating Data Innovation Day looks at the role that data innovation is playing in the shift to open access for journal articles.
Abstract:
The world is increasingly moving towards more open models of publishing and communication. The UK government has demonstrated a firm commitment to ensuring that academic research outputs are made available to all who might benefit from access to them, and its open access policy attempts to make academic publications freely available to readers, rather than locked behind paywalls or available only to researchers with access to well-funded university libraries. Open access policies have an important role to play in fostering an open innovation ecosystem and ensuring that maximum value is derived from investments in university-based research. But are we ready to embrace this change?
Abstract:
Many emerging economies are holding out the patent system as an incentive to stimulate biotechnological innovation, on the premise that such innovation will improve their economic and social growth. The patent system mandates full disclosure of the patented invention in exchange for a temporary exclusive patent right. Recently, however, patent offices have fallen short of complying with this mandate, especially for genetic inventions. Most patent offices provide only static information about disclosed patent sequences, and some do not even keep track of the sequence listing data in their own databases. The successful partnership of QUT Library and Cambia exemplifies advocacy in Open Access, Open Innovation and User Participation. The library extends its services to various departments within the university, and builds and encourages research networks to complement the skills needed to make a contribution in the real world.
Abstract:
In this research we observe the situated, embodied and playful interaction that participants engage in with open-ended interactive artworks. The larger project from which this work derives [28] contributes a methodological model for the evaluation of open-ended interactive artworks that treats each work individually and recognises the importance of the artist's intent and the traditions from which the work derives. In this paper, we describe this evolving methodology for evaluating and understanding participation via three case studies of open-ended interactive art installations. This analysis builds an understanding of open-ended free-play non-narrative environments and the affordances these environments offer participants.
Abstract:
In this paper we introduce and discuss the nature of free-play in the context of three open-ended interactive art installation works. We observe the situated free-play of participants in these environments and, building on precedent work, devise a set of sensitising terms derived both from the literature and from what we observe of participants interacting there. These sensitising terms act as guides and are designed to be used by those who experience, evaluate or report on open-ended interactive art. That is, we propose these terms as a common-ground language to be used by participants communicating while in the artwork to describe their experience, by researchers in the various stages of the research process (observation, coding, analysis, reporting, and publication), and by interdisciplinary researchers working across the fields of HCI and art. This work builds a foundation for understanding the relationship between free-play, open-ended environments, and interactive installations, and contributes sensitising terms useful to the HCI community for discussion and analysis of open-ended interactive artworks.
Abstract:
This article investigates the discourses of academic legitimacy that surround the production, consumption, and accreditation of online scholarship. Using the web-based media and cultural studies journal M/C Journal (http://journal.media-culture.org.au) as a case study, it examines how online scholarly journals often position themselves as occupying a space between the academic and the popular and as having a functional advantage over print-based media in promoting a spirit of public intellectualism. The current research agenda of both government and academe prioritises academic research that is efficient, self-promoting, and relevant to the public. Yet, although the cost-effectiveness and public-intellectual focus of online scholarship speak to these research priorities, online journals such as M/C Journal have occupied, and continue to occupy, an unstable position in relation to the perceived academic legitimacy of their content. Although some online scholarly journals have achieved a limited form of recognition within a system of accreditation that still privileges print-based scholarship, I argue that this nevertheless points to the fact that traditional textual notions of legitimate academic work continue to pervade the research agenda of an academe that increasingly promotes flexible delivery of teaching and online research initiatives.
Abstract:
Event report on the Open Access and Research 2013 conference which focused on recent developments and the strategic advantages they bring to the research sector.
Abstract:
Enterprises, both public and private, have rapidly begun exploiting the benefits of enterprise resource planning (ERP) combined with business analytics and “open data sets”, which are often outside the control of the enterprise, to gain further efficiencies, build new service operations and increase business activity. In many cases, these business activities are based around relevant software systems hosted in a “cloud computing” environment. “Garbage in, garbage out”, or “GIGO”, is a term dating from the 1960s, long used to describe the problems of unqualified dependency on information systems. A more pertinent variation arose sometime later, namely “garbage in, gospel out”, signifying that with large-scale information systems, such as ERP and the use of open datasets in a cloud environment, the ability to verify the authenticity of the data sets used may be almost impossible, resulting in dependence upon questionable results. Illicit data set “impersonation” becomes a reality. At the same time, the ability to audit such results may be an important requirement, particularly in the public sector. This paper discusses the need for enhancement of identity, reliability, authenticity and audit services, including naming and addressing services, in this emerging environment, and analyses some current technologies that are offered and may be appropriate. However, severe limitations in addressing these requirements have been identified, and the paper proposes further research work in the area.
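One concrete building block for the authenticity and audit services called for above is content fingerprinting: recomputing a cryptographic hash of a downloaded open data set and comparing it with a digest published by the data provider. A minimal sketch, assuming the provider publishes such a digest; the file name and digest below are placeholders, not real published values.

```python
import hashlib

def sha256_of_file(path: str, chunk_size: int = 1 << 20) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Hypothetical values: a locally downloaded open data set and the digest
# its publisher is assumed to have released alongside it.
downloaded_file = "open_dataset.csv"
published_digest = "placeholder-digest-from-the-data-provider"

if sha256_of_file(downloaded_file) == published_digest:
    print("Data set matches the published fingerprint.")
else:
    print("Data set differs from the published fingerprint; results derived from it are suspect.")
```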
Abstract:
Enterprise resource planning (ERP) systems are rapidly being combined with “big data” analytics processes and publicly available “open data sets”, which are usually outside the arena of the enterprise, to expand activity through better service to current clients as well as identifying new opportunities. Moreover, these activities are now largely based around relevant software systems hosted in a “cloud computing” environment. The more than fifty-year-old phrase reflecting mistrust in computer systems, “garbage in, garbage out” or “GIGO”, describes the problems of unqualified and unquestioning dependency on information systems. A more relevant GIGO interpretation arose sometime later, namely “garbage in, gospel out”, signifying that with large-scale information systems based around ERP and open datasets as well as “big data” analytics, particularly in a cloud environment, the ability to verify the authenticity and integrity of the data sets used may be almost impossible. In turn, this may easily result in decision making based upon questionable results which are unverifiable. Illicit “impersonation” of and modifications to legitimate data sets may become a reality, while at the same time the ability to audit any derived results of analysis may be an important requirement, particularly in the public sector. The pressing need for enhancement of identity, reliability, authenticity and audit services, including naming and addressing services, in this emerging environment is discussed in this paper. Some appropriate technologies currently on offer are also examined. However, severe limitations in addressing the problems identified are found, and the paper proposes further necessary research work for the area. (Note: This paper is based on an earlier unpublished paper/presentation “Identity, Addressing, Authenticity and Audit Requirements for Trust in ERP, Analytics and Big/Open Data in a ‘Cloud’ Computing Environment: A Review and Proposal” presented to the Department of Accounting and IT, College of Management, National Chung Chen University, 20 November 2013.)
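Integrity checking alone does not establish who published a data set. A further hedged sketch of an authenticity check follows, here using an HMAC tag verified with a key assumed to be shared out-of-band between the data publisher and the consuming enterprise; the key, file name and tag are illustrative assumptions, and a production service would more likely use public-key signatures.

```python
import hashlib
import hmac

def authenticate_dataset(path: str, shared_key: bytes, published_tag: str) -> bool:
    """Verify an HMAC-SHA256 tag over a data set file.

    Assumes `published_tag` was distributed by the data publisher and that
    `shared_key` was exchanged out-of-band (both hypothetical here).
    """
    mac = hmac.new(shared_key, digestmod=hashlib.sha256)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            mac.update(chunk)
    return hmac.compare_digest(mac.hexdigest(), published_tag)

# Hypothetical usage:
# ok = authenticate_dataset("open_dataset.csv", b"shared-demo-key", "expected-tag-hex")
```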
Abstract:
The Comment by Mayers and Reiter criticizes our work on two counts. Firstly, it is claimed that the quantum decoherence effects that we report in consequence of our experimental analysis of neutron Compton scattering from H in gaseous H2 are not, as we maintain, outside the framework of conventional neutron scattering theory. Secondly, it is claimed that we did not really observe such effects, owing to a faulty analysis of the experimental data, which are claimed to be in agreement with conventional theory. We focus in this response on the critical issue of the reliability of our experimental results and analysis. Using the same standard Vesuvio instrument programs used by Mayers et al., we show that, if the experimental results for H in gaseous H2 are in agreement with conventional theory, then those for D in gaseous D2 obtained in the same way cannot be, and vice versa. We expose a flaw in the calibration methodology used by Mayers et al. that leads to the present disagreement over the behaviour of H, namely the ad hoc adjustment of the measured H peak positions in TOF during the calibration of Vesuvio so that agreement is obtained with the expectation of conventional theory. We briefly address the question of the necessity to apply the theory of open quantum systems.
Abstract:
Decoherence of quantum entangled particles is observed in most systems, and is usually caused by system-environment interactions. Disentangling two subsystems A and B of a quantum system AB is tantamount to erasure of quantum phase relations between A and B. It is widely believed that this erasure is an innocuous process, which e.g. does not affect the energies of A and B. Surprisingly, recent theoretical investigations by different groups showed that disentangling two systems, i.e. their decoherence, can cause an increase of their energies. Applying this result to the context of neutron Compton scattering from H2 molecules, we provide for the first time experimental evidence which supports this prediction. The results reveal that the neutron-proton collision leading to the cleavage of the H-H bond on the sub-femtosecond timescale is accompanied by a larger energy transfer (by about 3%) than conventional theory predicts. We propose to interpret the results by considering the neutron-proton collisional system as an entangled open quantum system subject to decoherence owing to interactions with the “environment” (i.e., the two electrons plus the second proton of H2).
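For orientation, the "conventional theory" referred to in these two abstracts is the impulse approximation, in which the Compton peak for a struck particle of mass M sits at the free-recoil energy transfer; the roughly 3% excess reported above is a shift relative to that value. The relation below is the standard textbook result, not a formula quoted from the papers.

```latex
% Impulse-approximation recoil line for neutron Compton scattering:
% the peak of the scattering response for a struck particle of mass M lies at
\[
  \hbar\omega_{\mathrm{recoil}} = \frac{\hbar^{2} q^{2}}{2M},
\]
% so an observed energy transfer about 3% above this corresponds to
% \hbar\omega_{\mathrm{obs}} \approx 1.03 \,\hbar^{2} q^{2} / (2M).
```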
Abstract:
A growing number of online journals and academic platforms are adopting light peer review or 'publish then filter' models of scholarly communication. These approaches have the advantage of enabling instant exchanges of knowledge between academics and are part of a wider search for alternatives to traditional peer review and certification processes in scholarly publishing. However, establishing credibility and identifying the correct balance between communication and scholarly rigour remains an important challenge for digital communication platforms targeting academic communities. This paper looks at a highly influential, government-backed, open publishing platform in China: Science Paper Online, which is using transparent post-publication peer-review processes to encourage innovation and address systemic problems in China's traditional academic publishing system. There can be little doubt that the Chinese academic publishing landscape differs in important ways from counterparts in the United States and Western Europe. However, this article suggests that developments in China also provide important lessons about the potential of digital technology and government policy to facilitate a large-scale shift towards more open and networked models of scholarly communication.