997 results for ABC Open
Abstract:
A multi-resource multi-stage scheduling methodology is developed that treats short-term open-pit mine production scheduling as a generic multi-resource multi-stage scheduling problem. The problem is modelled using essential characteristics of short-term mining production operations, such as drilling, sampling, blasting and excavating, under the capacity constraints of mining equipment at each processing stage. Based on an extended disjunctive graph model, a shifting-bottleneck-procedure algorithm is enhanced and applied to obtain feasible, near-optimal short-term open-pit mine production schedules. The proposed methodology and its solution quality are verified and validated using a real mining case study.
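The abstract names an extended disjunctive graph model as the basis of the enhanced shifting bottleneck procedure. As a hedged illustration only, the sketch below shows the basic building block such methods rely on: once the disjunctive (resource-ordering) arcs of a small drill-blast-excavate instance have been oriented, the makespan is the longest path through the resulting DAG. The operations, durations and orderings are invented for illustration and are not taken from the paper.

```python
# Minimal, hypothetical sketch: makespan of a tiny drill -> blast -> excavate
# instance once all disjunctive (resource-ordering) arcs have been oriented.
# This is NOT the paper's model; durations and orderings are invented.

from collections import defaultdict

# Operation durations for two mining blocks (jobs) across three stages.
ops = {
    "J1-drill": 3, "J1-blast": 2, "J1-exc": 4,
    "J2-drill": 2, "J2-blast": 3, "J2-exc": 3,
}

# Conjunctive arcs: fixed stage order within each block.
conj = [
    ("J1-drill", "J1-blast"), ("J1-blast", "J1-exc"),
    ("J2-drill", "J2-blast"), ("J2-blast", "J2-exc"),
]

# Oriented disjunctive arcs: both blocks share one drill rig and one excavator,
# and block J1 is assumed to be processed first on each shared resource.
disj = [("J1-drill", "J2-drill"), ("J1-exc", "J2-exc")]

def makespan(durations, arcs):
    """Critical-path length of the oriented disjunctive graph (a DAG)."""
    succ, indeg = defaultdict(list), defaultdict(int)
    for u, v in arcs:
        succ[u].append(v)
        indeg[v] += 1
    finish = {op: 0 for op in durations}       # holds earliest start, then earliest finish
    ready = [op for op in durations if indeg[op] == 0]
    while ready:                               # Kahn-style topological sweep
        u = ready.pop()
        finish[u] += durations[u]              # earliest finish of u
        for v in succ[u]:
            finish[v] = max(finish[v], finish[u])   # propagate earliest start to v
            indeg[v] -= 1
            if indeg[v] == 0:
                ready.append(v)
    return max(finish.values())

print(makespan(ops, conj + disj))              # -> 12 for this toy instance
```

A shifting bottleneck procedure would repeat this kind of evaluation while re-sequencing one bottleneck resource at a time; the paper's multi-resource, multi-stage extensions are not reproduced here.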
Abstract:
This case study examines the way in which Knowledge Unlatched is combining collective action and open access licenses to encourage innovation in markets for specialist academic books. Knowledge Unlatched is a not-for-profit organisation established to help a global community of libraries coordinate their book purchasing activities more effectively and, in so doing, to ensure that the books librarians select for their own collections become available for anyone in the world to read, free of charge. The Knowledge Unlatched model is an attempt to re-coordinate a market in order to facilitate a transition to digitally appropriate publishing models that include open access. It offers librarians an opportunity to facilitate the open access publication of books that their own readers would value access to. It provides publishers with a stable income stream on titles selected by libraries, as well as the ability to continue selling those books to a wider market on their own terms. Knowledge Unlatched provides a rich case study for researchers and practitioners interested in understanding how innovations in procurement practices can be used to stimulate more effective, equitable markets for socially valuable products.
Abstract:
Although digital technology has made it possible for many more people to access content at no extra cost, fewer people than ever before are able to read the books written by university-based researchers. This workshop explores the role that open access licenses and collective action might play in reviving the scholarly monograph: a specialised area of academic publishing that has seen sales decline by more than 90 per cent over the past three decades. It also introduces Knowledge Unlatched, an ambitious attempt to create an internationally coordinated, sustainable route to open access for scholarly books. Knowledge Unlatched is now in its pilot phase.
Abstract:
Fierce debates characterised 2013 as the implications of government mandates for open access were argued over and pathways for implementation were worked out. There is no doubt that journals will move to a mix of gold and green, and that there will be an unsettled relationship between the two. But what of books? Is it conceivable that in those subjects, such as the humanities and social sciences, where something longer than the journal article is still the preferred form of scholarly communication, books will stay closed? Will it be acceptable to have some publicly funded research made available only in closed book form (whether print or digital) while other subjects, where articles are favoured, go open access? Frances Pinter is in the middle of these debates, having founded Knowledge Unlatched (see www.knowledgeunlatched.org). KU is a global library consortium enabling open access books. Knowledge Unlatched is helping libraries to work together for a sustainable open future for specialist academic books; its vision is a healthy market that includes free access for end users. In this session she will review the different models being experimented with around the world, including author-side payments, institutional subsidies and research funding body approaches, and will compare and contrast these models with those already in place for journal articles. She will also review the policy landscape and report on how open access scholarly books are faring to date. Frances Pinter, Founder, Knowledge Unlatched, UK.
Abstract:
This three-hour workshop tackles the crucial question of whether globally coordinated, market-based approaches to funding open access monographs can support the unique needs of Australian research communities. The workshop takes place in the context of the release in August 2013 of the Book Industry Collaborative Council (BICC) report, and especially the recommendations in its chapter on scholarly book publishing in the humanities and social sciences. The workshop, with expert speakers from the BICC Committee and from across the scholarly publishing industry, will discuss the policy issues most likely to ensure that Australian scholarly communities and audiences are best served in an era of digital technology and globalisation. Australia must think globally and support developments that enhance the accessibility of publicly funded research. Speakers will outline recent developments in scholarly monograph publishing, including new Open Access initiatives. Knowledge Unlatched is one example of an attempt to create an internationally coordinated, market-based route to open access for Humanities, Arts and Social Sciences (HASS) monographs. Knowledge Unlatched, a not-for-profit London-based company, is piloting a global library consortium approach to funding open access monographs and launched its pilot program in early October with 28 titles from 13 publishers. The workshop invites discussion and debate from librarians, publishers, researchers and research funders on the role of international coordination and markets in securing a more open future for Australian HASS scholarship.
Abstract:
On 19 June 2013 Knowledge Unlatched and the Berkman Center for Internet and Society at Harvard Law School jointly convened a one-day workshop titled 'Open Access and Scholarly Books' in Cambridge, MA. The workshop brought together a group of 21 invited publishers, librarians, academics and Open Access innovators to discuss the challenge of making scholarly books Open Access. This report captures the discussions that took place on the day.
Abstract:
After nearly fifteen years of the open access (OA) movement and its hard-fought struggle for a more open scholarly communication system, publishers are realizing that business models can be both open and profitable. Making journal articles available under an OA license is becoming an accepted strategy for maximizing the value of content to both research communities and the businesses that serve them. The first blog in this two-part series celebrating Data Innovation Day looks at the role that data innovation is playing in the shift to open access for journal articles.
Abstract:
The world is increasingly moving towards more open models of publishing and communication. The UK government has demonstrated a firm commitment to ensuring that academic research outputs are made available to all who might benefit from access to them, and its open access policy attempts to make academic publications freely available to readers, rather than locked behind paywalls or available only to researchers with access to well-funded university libraries. Open access policies have an important role to play in fostering an open innovation ecosystem and ensuring that maximum value is derived from investments in university-based research. But are we ready to embrace this change?
Abstract:
Many emerging economies are holding out the patent system as a way to stimulate biotechnological innovation, on the premise that such innovation will ultimately improve their economic and social growth. The patent system mandates full disclosure of the patented invention in exchange for a temporary exclusive patent right. Recently, however, patent offices have fallen short of complying with that mandate, especially for genetic inventions. Most patent offices provide only static information about disclosed patent sequences, and some do not even keep track of sequence listing data in their own databases. The successful partnership of QUT Library and Cambia exemplifies advocacy in Open Access, Open Innovation and User Participation. The library extends its services to various departments within the university and builds and encourages research networks that complement the skills needed to make a contribution in the real world.
Abstract:
In this research we observe the situated, embodied and playful interaction that participants engage in with open-ended interactive artworks. The larger project from which this work derives [28] contributes a methodological model for the evaluation of open-ended interactive artworks that treats each work individually and recognises the importance of the artist's intent and the traditions from which the work derives. In this paper, we describe this evolving methodology for evaluating and understanding participation via three case studies of open-ended interactive art installations. This analysis builds an understanding of open-ended, free-play, non-narrative environments and the affordances these environments offer participants.
Abstract:
In this paper we introduce and discuss the nature of free-play in the context of three open-ended interactive art installations. We observe participants' situated free-play in these environments and, building on precedent work, devise a set of sensitising terms derived both from the literature and from what we observe of participants interacting there. These sensitising terms act as guides and are designed to be used by those who experience, evaluate or report on open-ended interactive art. That is, we propose the terms as a common-ground language: for participants describing their experience while in the artwork, for researchers at the various stages of the research process (observation, coding, analysis, reporting and publication), and for interdisciplinary researchers working across the fields of HCI and art. This work builds a foundation for understanding the relationship between free-play, open-ended environments and interactive installations, and contributes sensitising terms useful to the HCI community for the discussion and analysis of open-ended interactive artworks.
Abstract:
This article investigates the discourses of academic legitimacy that surround the production, consumption and accreditation of online scholarship. Using the web-based media and cultural studies journal M/C Journal (http://journal.media-culture.org.au) as a case study, it examines how online scholarly journals often position themselves as occupying a space between the academic and the popular, and as having a functional advantage over print-based media in promoting a spirit of public intellectualism. The current research agenda of both government and academe prioritises academic research that is efficient, self-promoting and relevant to the public. Yet, although the cost-effectiveness and public-intellectual focus of online scholarship speak to these research priorities, online journals such as M/C Journal have occupied, and continue to occupy, an unstable position in relation to the perceived academic legitimacy of their content. Although some online scholarly journals have achieved a limited form of recognition within a system of accreditation that still privileges print-based scholarship, I argue that this nevertheless points to the fact that traditional textual notions of legitimate academic work continue to pervade the research agenda of an academe that increasingly promotes flexible delivery of teaching and online research initiatives.
Abstract:
Event report on the Open Access and Research 2013 conference which focused on recent developments and the strategic advantages they bring to the research sector.
Abstract:
Enterprises, both public and private, have rapidly begun to exploit the benefits of enterprise resource planning (ERP) combined with business analytics and "open data sets", which are often outside the control of the enterprise, to gain further efficiencies, build new service operations and increase business activity. In many cases, these business activities are based around relevant software systems hosted in a "cloud computing" environment. "Garbage in, garbage out", or "GIGO", is a term dating from the 1960s, long used to describe the problems of unqualified dependency on information systems. A more pertinent variation arose sometime later, namely "garbage in, gospel out", signifying that with large-scale information systems, such as ERP and open datasets used in a cloud environment, the ability to verify the authenticity of the data sets used may be almost impossible, resulting in dependence upon questionable results. Illicit data set "impersonation" becomes a reality. At the same time, the ability to audit such results may be an important requirement, particularly in the public sector. This paper discusses the need for enhancement of identity, reliability, authenticity and audit services, including naming and addressing services, in this emerging environment, and analyses some current technologies that are offered and may be appropriate. However, severe limitations in addressing these requirements have been identified, and the paper proposes further research work in the area.
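The abstract argues for authenticity and audit services around externally sourced open data sets. As a hedged, minimal illustration of one such control (not a technique described in the paper), the sketch below checks a downloaded data set against a digest published out-of-band by the data custodian before it is fed into downstream analytics; the file name and expected digest are placeholders.

```python
# Minimal, hypothetical sketch of an integrity check on an "open data set"
# before it enters an ERP/analytics pipeline. The file name and expected
# digest are placeholders, not values from the paper.

import hashlib

EXPECTED_SHA256 = "<digest published out-of-band by the data custodian>"

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream the file so arbitrarily large data sets can be hashed."""
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify(path: str, expected: str) -> None:
    """Refuse to use a data set whose digest does not match the published value."""
    actual = sha256_of(path)
    if actual != expected:
        # Skipping this check is where "garbage in, gospel out" begins.
        raise ValueError(f"integrity check failed for {path}: got {actual}")

# verify("open_dataset.csv", EXPECTED_SHA256)   # hypothetical file name
```

A digest only establishes integrity against the published value; establishing who published it would additionally require the custodian to sign the digest, which is closer to the identity, naming and authenticity services the paper calls for.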
Abstract:
Enterprise resource planning (ERP) systems are rapidly being combined with "big data" analytics processes and publicly available "open data sets", which are usually outside the arena of the enterprise, to expand activity through better service to current clients as well as by identifying new opportunities. Moreover, these activities are now largely based around relevant software systems hosted in a "cloud computing" environment. The over 50-year-old phrase related to mistrust in computer systems, "garbage in, garbage out" or "GIGO", describes the problems of unqualified and unquestioning dependency on information systems. A more relevant GIGO interpretation arose sometime later, namely "garbage in, gospel out", signifying that with large-scale information systems based around ERP and open datasets as well as "big data" analytics, particularly in a cloud environment, the ability to verify the authenticity and integrity of the data sets used may be almost impossible. In turn, this may easily result in decision making based upon questionable results that are unverifiable. Illicit "impersonation" of, and modifications to, legitimate data sets may become a reality, while at the same time the ability to audit any derived results of analysis may be an important requirement, particularly in the public sector. The pressing need for enhancement of identity, reliability, authenticity and audit services, including naming and addressing services, in this emerging environment is discussed in this paper. Some appropriate technologies currently being offered are also examined. However, severe limitations in addressing the problems identified are found, and the paper proposes further necessary research work for the area. (Note: This paper is based on an earlier unpublished paper/presentation, "Identity, Addressing, Authenticity and Audit Requirements for Trust in ERP, Analytics and Big/Open Data in a 'Cloud' Computing Environment: A Review and Proposal", presented to the Department of Accounting and IT, College of Management, National Chung Chen University, 20 November 2013.)