949 results for Localization open issues
Abstract:
Many emerging economies are dangling the patent system as an incentive to stimulate biotechnological innovation, on the premise that such innovation will improve their economic and social growth. The patent system mandates full disclosure of the patented invention in exchange for a temporary exclusive patent right. Recently, however, patent offices have fallen short of complying with this mandate, especially for genetic inventions. Most patent offices provide only static information about disclosed patent sequences, and some do not even keep track of the sequence listing data in their own databases. The successful partnership of QUT Library and Cambia exemplifies advocacy in Open Access, Open Innovation and User Participation. The library extends its services to various departments within the university and builds and encourages research networks to complement the skills needed to make a contribution in the real world.
Abstract:
In this introductory chapter (Schmeinck, D. and Lidstone, J. (2014) “Current trends and issues in geographical education”, in Schmeinck, D. and Lidstone, J. (Eds), Standards and Research in Geographical Education: Current Trends and International Issues, Berlin: Mensch und Buch Verlag, pp. 5-16), the authors review and analyse eleven papers originally presented to the Congress of the International Geographical Union held in Cologne in 2012. Taking the collection of papers as a single corpus representing the “state of the art” of geography education, they applied lexical and bibliometric analyses in an innovative attempt to identify the nature of geographical education as represented by this anthology of peer-reviewed chapters presented at the start of the second decade of the twenty-first century.
Abstract:
In this research we observe the situated, embodied and playful interaction that participants engage in with open-ended interactive artworks. The larger project from which this work derives [28] contributes a methodological model for the evaluation of open-ended interactive artworks that treats each work individually and recognises the importance of the artist's intent and the traditions from which the work derives. In this paper, we describe this evolving methodology for evaluating and understanding participation via three case studies of open-ended interactive art installations. This analysis builds an understanding of open-ended, free-play, non-narrative environments and the affordances these environments enable for participants.
Abstract:
In this paper we introduce and discuss the nature of free-play in the context of three open-ended interactive art installation works. We observe the situated free-play of participants in these environments and, building on precedent work, devise a set of sensitising terms derived both from the literature and from what we observe of participants interacting there. These sensitising terms act as guides and are designed to be used by those who experience, evaluate or report on open-ended interactive art. That is, we propose these terms as a common-ground language to be used by participants communicating while in the artwork to describe their experience, by researchers in the various stages of the research process (observation, coding, analysis, reporting and publication), and by interdisciplinary researchers working across the fields of HCI and art. This work builds a foundation for understanding the relationship between free-play, open-ended environments and interactive installations, and contributes sensitising terms useful to the HCI community for the discussion and analysis of open-ended interactive artworks.
Abstract:
This article investigates the discourses of academic legitimacy that surround the production, consumption, and accreditation of online scholarship. Using the web-based media and cultural studies journal M/C Journal (http://journal.media-culture.org.au) as a case study, it examines how online scholarly journals often position themselves as occupying a space between the academic and the popular, and as having a functional advantage over print-based media in promoting a spirit of public intellectualism. The current research agenda of both government and academe prioritises academic research that is efficient, self-promoting, and relevant to the public. Yet, although the cost-effectiveness and public-intellectual focus of online scholarship speak to these research priorities, online journals such as M/C Journal have occupied, and continue to occupy, an unstable position in relation to the perceived academic legitimacy of their content. Although some online scholarly journals have achieved a limited form of recognition within a system of accreditation that still privileges print-based scholarship, I argue that this nevertheless points to the fact that traditional textual notions of legitimate academic work continue to pervade the research agenda of an academe that increasingly promotes flexible delivery of teaching and online research initiatives.
Abstract:
What is the state of geographical education in the second decade of the 21st century? This volume presents a selection of peer-reviewed papers presented at the 2012 Cologne Congress of the International Geographical Union (IGU) sessions on Geographical Education as representative of current thinking in the area. It then presents (perhaps for the first time) a cross-case analysis of the common factors of all these papers as a current summary of the “state of the art” of geographical education today. The primary aim of the individual authors as well as the editors is not only to record the current state of the art of geographical education but also to promote ongoing discussion of the longer-term health and future prospects of international geographical education. We wish to encourage ongoing debate and discussion amongst local, national, regional and international education journals, conferences and discussion groups as part of the international mission of the Commission on Geographical Education. While the currency of these chapters, in terms of their foci, the breadth and recency of the theoretical literature on which they are based, and the new research findings they present, justifies considerable confidence in the current health of geographical education as an educational and research endeavour, each new publication should only be the start of new scholarly inquiry. Where should we, as a scholarly community, place our energies for the future? If readers are left with a new sense of direction, then the aims of the authors and editors will have been amply met.
Abstract:
Event report on the Open Access and Research 2013 conference which focused on recent developments and the strategic advantages they bring to the research sector.
Abstract:
Enterprises, both public and private, have rapidly begun to exploit the benefits of enterprise resource planning (ERP) combined with business analytics and “open data sets”, which are often outside the control of the enterprise, to gain further efficiencies, build new service operations and increase business activity. In many cases, these business activities are based around relevant software systems hosted in a “cloud computing” environment. “Garbage in, garbage out”, or “GIGO”, is a term long used to describe problems of unqualified dependency on information systems, dating from the 1960s. However, a more pertinent variation arose sometime later, namely “garbage in, gospel out”, signifying that with large-scale information systems, such as ERP and usage of open datasets in a cloud environment, the ability to verify the authenticity of the data sets used may be almost impossible, resulting in dependence upon questionable results. Illicit data set “impersonation” becomes a reality. At the same time, the ability to audit such results may be an important requirement, particularly in the public sector. This paper discusses the need for enhancement of identity, reliability, authenticity and audit services, including naming and addressing services, in this emerging environment, and analyses some current technologies on offer that may be appropriate. However, severe limitations in addressing these requirements have been identified, and the paper proposes further research work in the area.
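To make the data-set authenticity concern concrete, here is a minimal sketch in Python, assuming a hypothetical dataset URL and a publisher-advertised SHA-256 digest (neither taken from the paper), of how a consumer of an open data set might at least confirm that the copy it retrieved matches the checksum its publisher advertises:

```python
import hashlib
import urllib.request

# Hypothetical open data set and publisher-advertised SHA-256 digest
# (both values are placeholders, not taken from the paper).
DATASET_URL = "https://example.org/open-data/transport-2013.csv"
PUBLISHED_SHA256 = "0000000000000000000000000000000000000000000000000000000000000000"

def fetch_and_verify(url: str, expected_sha256: str) -> bytes:
    """Download a data set and compare its SHA-256 digest against the
    value the publisher advertises, rejecting 'impersonated' copies."""
    with urllib.request.urlopen(url) as response:
        payload = response.read()
    digest = hashlib.sha256(payload).hexdigest()
    if digest != expected_sha256:
        raise ValueError(f"integrity check failed: got {digest}")
    return payload

# data = fetch_and_verify(DATASET_URL, PUBLISHED_SHA256)
```

A checksum of this kind only addresses the integrity of the copy in hand; it says nothing about who published the data set or whether it was impersonated at source, which is where the identity, naming and addressing services the paper calls for would be needed.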
Abstract:
Enterprise resource planning (ERP) systems are rapidly being combined with “big data” analytics processes and publicly available “open data sets”, which are usually outside the arena of the enterprise, to expand activity through better service to current clients as well as by identifying new opportunities. Moreover, these activities are now largely based around relevant software systems hosted in a “cloud computing” environment. The over-50-year-old phrase expressing mistrust in computer systems, “garbage in, garbage out” or “GIGO”, is used to describe problems of unqualified and unquestioning dependency on information systems. However, a more relevant GIGO interpretation arose sometime later, namely “garbage in, gospel out”, signifying that with large-scale information systems based around ERP and open datasets as well as “big data” analytics, particularly in a cloud environment, the ability to verify the authenticity and integrity of the data sets used may be almost impossible. In turn, this may easily result in decision making based upon questionable results which are unverifiable. Illicit “impersonation” of, and modifications to, legitimate data sets may become a reality, while at the same time the ability to audit any derived results of analysis may be an important requirement, particularly in the public sector. The pressing need for enhancement of identity, reliability, authenticity and audit services, including naming and addressing services, in this emerging environment is discussed in this paper. Some current and appropriate technologies being offered are also examined. However, severe limitations in addressing the problems identified are found, and the paper proposes further necessary research work for the area. (Note: This paper is based on an earlier unpublished paper/presentation “Identity, Addressing, Authenticity and Audit Requirements for Trust in ERP, Analytics and Big/Open Data in a ‘Cloud’ Computing Environment: A Review and Proposal” presented to the Department of Accounting and IT, College of Management, National Chung Chen University, 20 November 2013.)
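As a complementary, equally hedged illustration of the audit requirement, the sketch below (Python again, with a hypothetical shared audit key; the paper prescribes no specific mechanism) stores a keyed digest alongside an analytics result so that an auditor holding the same key can later confirm the stored result has not been altered:

```python
import hashlib
import hmac
import json

# Hypothetical shared key held by the analytics service and the auditor
# (a placeholder; the paper does not prescribe any particular scheme).
AUDIT_KEY = b"replace-with-a-real-secret"

def sign_result(result: dict, key: bytes = AUDIT_KEY) -> str:
    """Produce a keyed SHA-256 tag over a canonical JSON encoding of a result."""
    canonical = json.dumps(result, sort_keys=True).encode("utf-8")
    return hmac.new(key, canonical, hashlib.sha256).hexdigest()

def verify_result(result: dict, tag: str, key: bytes = AUDIT_KEY) -> bool:
    """Let an auditor check that a stored result still matches its tag."""
    return hmac.compare_digest(sign_result(result, key), tag)

report = {"dataset": "open-budget-2013", "total_spend": 1234567.89}
tag = sign_result(report)
assert verify_result(report, tag)
```

A shared-key tag is only a simplification; a public-sector audit trail would more plausibly rely on public-key signatures and the trusted naming and identity services that the paper argues are still missing.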
Abstract:
Balancing the demands of research and ethics is always challenging and even more so when recruiting vulnerable groups. Within the context of current legislation and international human rights declarations, it is strongly advocated that research can and must be undertaken with all recipients of health care services. Research in the field of intellectual disability presents particular challenges in regard to consenting processes. This paper is a critical reflection and analysis of the complex processes undertaken and events that occurred in gaining informed consent from people with intellectual disability to participate in a study exploring their experiences of being an inpatient in mental health hospitals within Aotearoa/New Zealand. A framework based on capacity, information and voluntariness is presented with excerpts from the field provided to explore consenting processes. The practical implications of the processes utilised are then discussed in order to stimulate debate regarding clearer and enhanced methods of gaining informed consent from people with intellectual disability.
Abstract:
Australia’s building stock includes many older commercial buildings with numerous factors that impact energy performance and indoor environment quality. The built environment industry has generally focused heavily on improving physical building design elements for greater energy efficiency (such as retrofits and environmental upgrades); however, there are noticeable ‘upper limits’ to performance improvements in these areas. To achieve a step-change improvement in building performance, the authors propose that additional components need to be addressed in a whole-of-building approach, including the way building design elements are managed and the level of stakeholder engagement between owners, tenants and building managers. This paper focuses on the opportunities provided by this whole-of-building approach, presenting the findings of a research project undertaken through the Sustainable Built Environment National Research Centre (SBEnrc) in Australia. Researchers worked with a number of industry partners over two years to investigate issues facing stakeholders at base building and tenancy levels, and the barriers to improving building performance. Through a mixed-method, industry-led research approach, five ‘nodes’ were identified in whole-of-building performance evaluation, each with interlinking and overlapping complexities that can influence performance. The nodes cover building management, occupant experience, indoor environment quality, agreements and culture, and design elements. This paper outlines the development and testing of these nodes and their interactions, and the resultant multi-nodal tool, called the ‘Performance Nexus’ tool. The tool is intended to be of most benefit in evaluating opportunities for performance improvement across the vast stock of existing low-performing buildings.
Abstract:
The Comment by Mayers and Reiter criticizes our work on two counts. Firstly, it is claimed that the quantum decoherence effects that we report, in consequence of our experimental analysis of neutron Compton scattering from H in gaseous H2, are not, as we maintain, outside the framework of conventional neutron scattering theory. Secondly, it is claimed that we did not really observe such effects, owing to a faulty analysis of the experimental data, which are claimed to be in agreement with conventional theory. We focus in this response on the critical issue of the reliability of our experimental results and analysis. Using the same standard Vesuvio instrument programs used by Mayers et al., we show that, if the experimental results for H in gaseous H2 are in agreement with conventional theory, then those for D in gaseous D2 obtained in the same way cannot be, and vice versa. We expose a flaw in the calibration methodology used by Mayers et al. that leads to the present disagreement over the behaviour of H, namely the ad hoc adjustment of the measured H peak positions in TOF during the calibration of Vesuvio so that agreement is obtained with the expectation of conventional theory. We briefly address the question of the necessity of applying the theory of open quantum systems.
Abstract:
Decoherence of quantum entangled particles is observed in most systems and is usually caused by system-environment interactions. Disentangling two subsystems A and B of a quantum system AB is tantamount to erasure of the quantum phase relations between A and B. It is widely believed that this erasure is an innocuous process, which, for example, does not affect the energies of A and B. Surprisingly, recent theoretical investigations by different groups have shown that disentangling two systems, i.e. their decoherence, can cause an increase of their energies. Applying this result to the context of neutron Compton scattering from H2 molecules, we provide for the first time experimental evidence which supports this prediction. The results reveal that the neutron-proton collision leading to the cleavage of the H-H bond on the sub-femtosecond timescale is accompanied by a larger energy transfer (by about 3%) than conventional theory predicts. We propose to interpret the results by considering the neutron-proton collisional system as an entangled open quantum system subject to decoherence owing to interactions with the “environment” (i.e., the two electrons plus the second proton of H2).
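For orientation, the “conventional theory” baseline against which the roughly 3% excess is quoted is the impulse approximation of neutron Compton scattering, in which the energy transfer for a neutron striking a quasi-free proton of mass $M$ at momentum transfer $\hbar q$ peaks at the recoil energy (a standard textbook expression, not a formula stated in this abstract):

\[ E_{\mathrm{recoil}} = \frac{\hbar^{2} q^{2}}{2M} \]

The anomaly reported here is an observed mean energy transfer exceeding this conventional value by about 3%.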
Abstract:
A growing number of online journals and academic platforms are adopting light peer review or 'publish then filter' models of scholarly communication. These approaches have the advantage of enabling instant exchanges of knowledge between academics and are part of a wider search for alternatives to traditional peer review and certification processes in scholarly publishing. However, establishing credibility and identifying the correct balance between communication and scholarly rigour remains an important challenge for digital communication platforms targeting academic communities. This paper looks at a highly influential, government-backed, open publishing platform in China: Science Paper Online, which is using transparent post-publication peer-review processes to encourage innovation and address systemic problems in China's traditional academic publishing system. There can be little doubt that the Chinese academic publishing landscape differs in important ways from counterparts in the United States and Western Europe. However, this article suggests that developments in China also provide important lessons about the potential of digital technology and government policy to facilitate a large-scale shift towards more open and networked models of scholarly communication.
Abstract:
In this article, Petia Wohed, Arthur H. M. ter Hofstede, Nick Russell, Birger Andersson, and Wil M. P. van der Aalst present the results of their examination of existing open source BPM systems. Their conclusions are illuminating both for open source developers and for the user community. Read their article for the details of their study.