Abstract:
Objectives: We compared post-operative analgesic requirements between women with early stage endometrial cancer treated by total abdominal hysterectomy (TAH) or total laparoscopic hysterectomy (TLH). Methods: 760 patients with apparent stage I endometrial cancer were treated by TAH (n=353) or TLH (n=407) in the international, multicentre, prospective randomised LACE trial (2005-2010). Epidural, opioid and non-opioid analgesic requirements were recorded until ten months after surgery. Results: Baseline demographics and analgesic use were comparable between treatment arms. TAH patients were more likely than TLH patients to receive epidural analgesia during the early postoperative phase (33% vs 0.5%, p<0.001). Although opioid use was comparable between the TAH and TLH groups on postoperative days 0-2 (99.7% vs 98.5%, p=0.09), a significantly higher proportion of TAH patients required opioids on days 3-5 (70% vs 22%, p<0.0001), days 6-14 (35% vs 15%, p<0.0001), and days 15-60 (15% vs 9%, p=0.02) post-surgery. Mean pain scores were significantly higher in the TAH group than in the TLH group at one week (2.48 vs 1.62, p<0.0001) and four weeks (0.89 vs 0.63, p=0.01) after surgery. Conclusion: Treatment of early stage endometrial cancer with TLH is associated with less frequent use of epidural analgesia, lower post-operative opioid requirements and better pain scores than TAH.
Abstract:
Distal radius fractures stabilized by open reduction internal fixation (ORIF) have become increasingly common. There is currently no consensus on the optimal time to commence range of motion (ROM) exercises post-ORIF. A retrospective cohort review covering a five-year period compared wrist and forearm ROM outcomes and the number of therapy sessions between patients who commenced active ROM exercises within the first seven days following ORIF of a distal radius fracture and those who commenced from day eight onward. One hundred and twenty-one patient cases were identified. Clinical data, active ROM at initial and discharge therapy assessments, fracture type, surgical approach, and number of therapy sessions attended were recorded. One hundred and seven (88.4%) cases had complete datasets. The early active ROM group (n = 37) commenced ROM exercises a mean (SD) of 4.27 (1.8) days post-ORIF; the comparator group (n = 70) commenced a mean (SD) of 24.3 (13.6) days post-ORIF. No significant differences were identified between groups in ROM at the initial or discharge assessments, or in therapy sessions attended. These results indicate that patients who commenced active ROM exercises an average of 24 days after surgery achieved ROM outcomes comparable to those who commenced within the first week, with a similar number of therapy sessions.
Abstract:
Many aspects of China's academic publishing system differ from the systems found in the liberal, market-based economies of the United States, Western Europe and Australia. A high level of government intervention in both the publishing industry and academia, and the challenges of transitioning from a centrally controlled towards a more market-based publishing industry, are two notable differences; however, as in other countries, academic communities and publishers are being transformed by digital technologies. This research explores the complex yet dynamic digital transformation of academic publishing in China, with a specific focus on the open and networked initiatives inspired by Web 2.0 and social media. The thesis draws on two case studies: Science Paper Online, a government-operated online preprint platform and open access mandate; and New Science, a social reference management website operated by a group of young PhD students. Its analysis of the innovations, business models, operating strategies, influences, and difficulties faced by these two initiatives highlights important characteristics and trends in digital publishing experiments in China. The central argument of this thesis is that the open and collaborative possibilities of Web 2.0-inspired initiatives are emerging outside the established journal and monograph publishing system in China, introducing innovative and somewhat disruptive approaches to the certification, communication and commercial exploitation of knowledge. Moreover, emerging publishing models are enabling and encouraging a new system of practising and communicating science in China, putting into practice some elements of the Open Science ethos. There is evidence in Chinese practice of both disruptive change to old publishing structures and the adaptive modification of emergent replacements. As such, the transformation from traditional to digital and interactive modes of publishing involves both competition and convergence between new and old publishers, as well as dynamics of co-evolution involving new technologies, business models, social norms, and government reform agendas. One key concern driving this work is whether there are new opportunities and new models for academic publishing in the Web 2.0 and social media environment that might allow the basic functions of communication and certification to be achieved more effectively. This thesis enriches existing knowledge of open and networked transformations of scholarly publishing by adding a Chinese story. Although the development of open and networked publishing platforms in China remains in its infancy, the lessons provided by this research are relevant to practitioners and stakeholders interested in understanding the transformative dynamics of networked technologies for publishing and in advocating open access in practice, not only in China but also internationally.
Abstract:
This special issue of the Journal of Urban Technology brings together five articles that are based on presentations given at the Street Computing Workshop, held on 24 November 2009 in Melbourne in conjunction with the Australian Computer-Human Interaction Conference (OZCHI 2009). Our own article introduces the Street Computing vision and explores the potential, challenges, and foundations of this research trajectory. In order to do so, we first look at the currently available sources of information and discuss their link to existing research efforts. Section 2 then introduces the notion of Street Computing and our research approach in more detail. Section 3 looks beyond the core concept itself and summarizes related work in this field of interest. We conclude by introducing the papers that have been contributed to this special issue.
Abstract:
As support grows for greater access to information and data held by governments, so does awareness of the need for appropriate policy, technical and legal frameworks to achieve the desired economic and societal outcomes. Since the late 2000s numerous international organizations, inter-governmental bodies and governments have issued open government data policies, which set out key principles underpinning access to, and the release and reuse of data. These policies reiterate the value of government data and establish the default position that it should be openly accessible to the public under transparent and non-discriminatory conditions, which are conducive to innovative reuse of the data. A key principle stated in open government data policies is that legal rights in government information must be exercised in a manner that is consistent with and supports the open accessibility and reusability of the data. In particular, where government information and data is protected by copyright, access should be provided under licensing terms which clearly permit its reuse and dissemination. This principle has been further developed in the policies issued by Australian Governments into a specific requirement that Government agencies are to apply the Creative Commons Attribution licence (CC BY) as the default licensing position when releasing government information and data. A wide-ranging survey of the practices of Australian Government agencies in managing their information and data, commissioned by the Office of the Australian Information Commissioner in 2012, provides valuable insights into progress towards the achievement of open government policy objectives and the adoption of open licensing practices. The survey results indicate that Australian Government agencies are embracing open access and a proactive disclosure culture and that open licensing under Creative Commons licences is increasingly prevalent. However, the finding that ‘[t]he default position of open access licensing is not clearly or robustly stated, nor properly reflected in the practice of Government agencies’ points to the need to further develop the policy framework and the principles governing information access and reuse, and to provide practical guidance tools on open licensing if the broadest range of government information and data is to be made available for innovative reuse.
Abstract:
This study presents the largest known investigation of discomfort glare, with 493 surveys collected from five green buildings in Brisbane, Australia. The study was conducted on full-time employees working under their everyday lighting conditions, none of whom had any affiliation with the research institution. The survey consisted of a specially tailored questionnaire to assess potential factors relating to discomfort glare. Luminance maps extracted from high dynamic range (HDR) images were used to capture the luminous environment of the occupants. Occupants who experienced glare on their monitor and/or glare from electric lighting were excluded from the analysis, leaving 419 available surveys. Occupants were more sensitive to glare than any of the tested indices accounted for. A new index, the unified glare probability (UGP), was developed to take into account the scope of results in the investigation. The index is based on a linear transformation of the unified glare rating (UGR) to calculate a probability of disturbed persons. Nevertheless, all glare indices showed some correlation with discomfort, and statistically there was no difference between the DGI, UGR and CGI. The UGP broadly reflects the demographics of the working population in Australia, and the new index is applicable to open-plan green buildings.
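The abstract's description of the UGP as a linear transformation of the UGR into a probability can be sketched in code. In the minimal sketch below, the UGR formula follows the standard CIE 117 definition, while the coefficients a and b of the UGR-to-UGP mapping are illustrative placeholders, not the fitted values reported in the study:

```python
import numpy as np

def ugr(background_luminance, source_luminances, solid_angles, guth_indices):
    """Unified Glare Rating (CIE 117):
    UGR = 8 * log10( (0.25 / Lb) * sum(Ls^2 * omega / p^2) ),
    where p is the Guth position index of each glare source."""
    s = np.sum(np.asarray(source_luminances) ** 2
               * np.asarray(solid_angles)
               / np.asarray(guth_indices) ** 2)
    return 8.0 * np.log10(0.25 / background_luminance * s)

def ugp(ugr_value, a=0.026, b=-0.4):
    """Hypothetical linear UGR-to-probability mapping, clamped to [0, 1].
    a and b are illustrative assumptions, not the study's coefficients."""
    return float(np.clip(a * ugr_value + b, 0.0, 1.0))

# Example: one glare source seen against a 40 cd/m^2 background.
print(ugp(ugr(40.0, [2000.0], [0.01], [1.5])))
```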
Abstract:
Fierce debates characterised 2013 as the implications of government mandates for open access were thrashed out and pathways for implementation worked out. There is no doubt that journals will move to a mix of gold and green, and there will be an unsettled relationship between the two. But what of books? Is it conceivable that in those subjects, such as the humanities and social sciences, where something longer than the journal article is still the preferred form of scholarly communication, books will stay closed? Will it be acceptable to have some publicly funded research made available only in closed book form (regardless of whether print or digital) while other subjects, where articles are favoured, go open access? Frances Pinter is in the middle of these debates, having founded Knowledge Unlatched (see www.knowledgeunlatched.org). KU is a global library consortium enabling open access books. Knowledge Unlatched is helping libraries to work together for a sustainable open future for specialist academic books. Its vision is a healthy market that includes free access for end users. In this session she will review the different models being experimented with around the world, including author-side payments, institutional subsidies, and research funding body approaches. She will compare and contrast these models with those already in place for journal articles. She will also review the policy landscape and report on how open access scholarly books are faring to date. Frances Pinter, Founder, Knowledge Unlatched, UK.
Abstract:
In this research we observe the situated, embodied and playful interaction that participants engage in with open-ended interactive artworks. The larger project from which this work derives [28] contributes a methodological model for the evaluation of open-ended interactive artworks that treats each work individually and recognises the importance of the artist's intent and the traditions from which the work derives. In this paper, we describe this evolving methodology for evaluating and understanding participation via three case studies of open-ended interactive art installations. This analysis builds an understanding of open-ended, free-play, non-narrative environments and the affordances these environments enable for participants.
Abstract:
In this paper we introduce and discuss the nature of free-play in the context of three open-ended interactive art installations. We observe the situated free-play of participants in these environments and, building on precedent work, devise a set of sensitising terms derived both from the literature and from what we observe of participants interacting there. These sensitising terms act as guides and are designed to be used by those who experience, evaluate or report on open-ended interactive art. That is, we propose these terms as a common-ground language: for participants communicating while in the artwork to describe their experience, for researchers in the various stages of the research process (observation, coding, analysis, reporting, and publication), and for interdisciplinary researchers working across the fields of HCI and art. This work builds a foundation for understanding the relationship between free-play, open-ended environments, and interactive installations, and contributes sensitising terms useful to the HCI community for the discussion and analysis of open-ended interactive artworks.
Abstract:
This article investigates the discourses of academic legitimacy that surround the production, consumption, and accreditation of online scholarship. Using the web-based media and cultural studies journal M/C Journal (http://journal.media-culture.org.au) as a case study, it examines how online scholarly journals often position themselves as occupying a space between the academic and the popular, and as having a functional advantage over print-based media in promoting a spirit of public intellectualism. The current research agenda of both government and academe prioritises academic research that is efficient, self-promoting, and relevant to the public. Yet, although the cost-effectiveness and public-intellectual focus of online scholarship speak to these research priorities, online journals such as M/C Journal have occupied, and continue to occupy, an unstable position in relation to the perceived academic legitimacy of their content. Although some online scholarly journals have achieved a limited form of recognition within a system of accreditation that still privileges print-based scholarship, I argue that this nevertheless points to the fact that traditional textual notions of legitimate academic work continue to pervade the research agenda of an academe that increasingly promotes flexible delivery of teaching and online research initiatives.
Abstract:
Enterprises, both public and private, have rapidly commenced using the benefits of enterprise resource planning (ERP) combined with business analytics and "open data sets", which are often outside the control of the enterprise, to gain further efficiencies, build new service operations and increase business activity. In many cases, these business activities are based around relevant software systems hosted in a "cloud computing" environment. "Garbage in, garbage out", or "GIGO", is a term dating from the 1960s, long used to describe the problems of unqualified dependency on information systems. A more pertinent variation arose sometime later, namely "garbage in, gospel out", signifying that with large-scale information systems, such as ERP and usage of open datasets in a cloud environment, the ability to verify the authenticity of the data sets used may be almost impossible, resulting in dependence upon questionable results. Illicit data set "impersonation" becomes a reality. At the same time, the ability to audit such results may be an important requirement, particularly in the public sector. This paper discusses the need for enhancement of identity, reliability, authenticity and audit services, including naming and addressing services, in this emerging environment, and analyses some current technologies on offer that may be appropriate. However, severe limitations in addressing these requirements have been identified, and the paper proposes further research work in the area.
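One concrete form that the authenticity services discussed here could take is cryptographic verification of a downloaded open data set against a digest published by its originator. The sketch below is a minimal illustration of that idea, assuming a hypothetical data file and a separately obtained SHA-256 digest; it is not a description of any specific service proposed in the paper:

```python
import hashlib

def sha256_digest(path, chunk_size=1 << 20):
    """Compute the SHA-256 digest of a file, streaming in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_dataset(path, published_digest):
    """Compare a local copy against the digest published by the data
    custodian; a mismatch suggests corruption or 'impersonation'."""
    return sha256_digest(path) == published_digest.lower()

# Hypothetical usage: the file name and digest are placeholders.
# print(verify_dataset("open_data.csv", "9f86d081884c7d65..."))
```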
Abstract:
Enterprise resource planning (ERP) systems are rapidly being combined with "big data" analytics processes and publicly available "open data sets", which are usually outside the arena of the enterprise, to expand activity through better service to current clients as well as to identify new opportunities. Moreover, these activities are now largely based around relevant software systems hosted in a "cloud computing" environment. The over 50-year-old phrase related to mistrust in computer systems, "garbage in, garbage out" or "GIGO", describes the problems of unqualified and unquestioning dependency on information systems. However, a more relevant GIGO interpretation arose sometime later, namely "garbage in, gospel out", signifying that with large-scale information systems based around ERP and open datasets as well as "big data" analytics, particularly in a cloud environment, the ability to verify the authenticity and integrity of the data sets used may be almost impossible. In turn, this may easily result in decision making based upon questionable, unverifiable results. Illicit "impersonation" of, and modifications to, legitimate data sets may become a reality, while at the same time the ability to audit any derived results of analysis may be an important requirement, particularly in the public sector. The pressing need for enhancement of identity, reliability, authenticity and audit services, including naming and addressing services, in this emerging environment is discussed in this paper, and some appropriate technologies currently on offer are examined. However, severe limitations in addressing the problems identified are found, and the paper proposes further necessary research work for the area. (Note: This paper is based on an earlier unpublished paper/presentation, "Identity, Addressing, Authenticity and Audit Requirements for Trust in ERP, Analytics and Big/Open Data in a 'Cloud' Computing Environment: A Review and Proposal", presented to the Department of Accounting and IT, College of Management, National Chung Cheng University, 20 November 2013.)
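Where the originator and consumer of a data set share a secret key, integrity and origin authenticity can be checked together with a keyed message authentication code, complementing the plain digest check sketched above. The following is a minimal illustration using Python's standard library; the key and file name are hypothetical placeholders, not part of any system described in the paper:

```python
import hashlib
import hmac

def dataset_tag(path, key):
    """Compute an HMAC-SHA256 tag binding the file's content to a key."""
    h = hmac.new(key, digestmod=hashlib.sha256)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_tag(path, key, published_tag):
    """Constant-time comparison against the custodian's published tag."""
    return hmac.compare_digest(dataset_tag(path, key), published_tag)

# Hypothetical usage with placeholder key, file and tag:
# ok = verify_tag("open_data.csv", b"shared-secret-key", "ab12...")
```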
Abstract:
The Comment by Mayers and Reiter criticizes our work on two counts. Firstly, it is claimed that the quantum decoherence effects that we report in consequence of our experimental analysis of neutron Compton scattering from H in gaseous H2 are not, as we maintain, outside the framework of conventional neutron scattering theory. Secondly, it is claimed that we did not really observe such effects, owing to a faulty analysis of the experimental data, which are claimed to be in agreement with conventional theory. In this response we focus on the critical issue of the reliability of our experimental results and analysis. Using the same standard Vesuvio instrument programs used by Mayers et al., we show that if the experimental results for H in gaseous H2 are in agreement with conventional theory, then those for D in gaseous D2 obtained in the same way cannot be, and vice versa. We expose a flaw in the calibration methodology used by Mayers et al. that leads to the present disagreement over the behaviour of H, namely the ad hoc adjustment of the measured H peak positions in time-of-flight (TOF) during the calibration of Vesuvio so that agreement is obtained with the expectation of conventional theory. We briefly address the question of the necessity of applying the theory of open quantum systems.
Abstract:
The effect of radiation on natural convection of a Newtonian fluid contained in an open cavity is investigated in this study. The governing partial differential equations are solved numerically using the Alternating Direction Implicit (ADI) method together with the Successive Over-Relaxation (SOR) method. The study focuses on the flow pattern and on the convective and radiative heat transfer rates for different values of the radiation parameters, namely the optical thickness of the fluid, the scattering albedo, and the Planck number. It was found that in the optically thin limit, an increase in the optical thickness of the fluid raises the temperature and the radiative heat transfer of the fluid. However, a further increase in the optical thickness decreases the radiative heat transfer rate due to the increase in the energy level of the fluid, which ultimately reduces the total heat transfer rate within the fluid.
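As a minimal illustration of the numerical machinery named above, the sketch below applies point-wise successive over-relaxation (SOR) to a model Poisson equation on a unit square, the kind of elliptic sub-problem (e.g. for the stream function) that arises inside such a solver. The grid size, tolerance and relaxation factor are illustrative choices, not the parameters used in the study:

```python
import numpy as np

def sor_poisson(f, h, omega=1.8, tol=1e-6, max_iter=20_000):
    """Solve  u_xx + u_yy = f  on a square grid with u = 0 on the
    boundary, using point-wise successive over-relaxation (SOR)."""
    u = np.zeros_like(f)
    n, m = f.shape
    for _ in range(max_iter):
        max_change = 0.0
        for i in range(1, n - 1):
            for j in range(1, m - 1):
                gs = 0.25 * (u[i + 1, j] + u[i - 1, j]
                             + u[i, j + 1] + u[i, j - 1]
                             - h * h * f[i, j])   # Gauss-Seidel value
                change = omega * (gs - u[i, j])   # over-relax the update
                u[i, j] += change
                max_change = max(max_change, abs(change))
        if max_change < tol:
            break
    return u

# Example: uniform source term on a 41x41 grid of the unit square.
n = 41
h = 1.0 / (n - 1)
u = sor_poisson(np.full((n, n), -1.0), h)
print(u[n // 2, n // 2])  # centre value of the solution
```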
Abstract:
Background: Sexually-transmitted pathogens often have severe reproductive health implications if treatment is delayed or absent, especially in females. The complex processes of disease progression, namely replication and ascension of the infection through the genital tract, span both extracellular and intracellular physiological scales, and in females can vary over the distinct phases of the menstrual cycle. The complexity of these processes, coupled with the common impossibility of obtaining comprehensive and sequential clinical data from individual human patients, makes mathematical and computational modelling valuable tools in developing our understanding of the infection, with a view to identifying new interventions. While many within-host models of sexually-transmitted infections (STIs) are available in the existing literature, these models are difficult to deploy in clinical/experimental settings, since simulations often require complex computational approaches. Results: We present STI-GMaS (Sexually-Transmitted Infections – Graphical Modelling and Simulation), an environment for the simulation of STI models, with a view to stimulating the uptake of these models within the laboratory or clinic. The software currently focuses upon the representative case study of Chlamydia trachomatis, the most common sexually-transmitted bacterial pathogen of humans. Here, we demonstrate the use of a hybrid PDE–cellular automata model for simulation of a hypothetical Chlamydia vaccination, demonstrating the effect of a vaccine-induced antibody in preventing the infection from ascending to above the cervix. This example illustrates the ease with which existing models can be adapted to describe new studies, and its careful parameterisation within STI-GMaS facilitates future tuning to experimental data as they arise. Conclusions: STI-GMaS represents the first software designed explicitly for in-silico simulation of STI models by non-theoreticians, thus presenting a novel route to bridging the gap between computational and clinical/experimental disciplines. With the propensity for model reuse and extension, there is much scope within STI-GMaS to allow clinical and experimental studies to inform model inputs and drive future model development. Many of the modelling paradigms and software design principles deployed to date transfer readily to other STIs, both bacterial and viral; forthcoming releases of STI-GMaS will extend the software to incorporate a more diverse range of infections.
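To make the hybrid PDE–cellular automata idea concrete, the sketch below couples a one-dimensional diffusing extracellular pathogen field (an explicit finite-difference PDE) to a row of epithelial cells that switch state when the local load crosses a threshold (a cellular automaton). It is a toy illustration of the modelling paradigm only, not the model shipped with STI-GMaS; all parameters are invented for the example:

```python
import numpy as np

# Toy hybrid model: extracellular pathogen concentration c(x, t)
# diffuses (explicit finite differences), while a cellular automaton
# tracks each epithelial cell as healthy (0) or infected (1).
D, dt, dx = 0.1, 0.01, 1.0      # illustrative diffusion/step sizes
threshold, shed_rate = 0.2, 0.5  # invented infection parameters
n_cells, n_steps = 100, 2000

c = np.zeros(n_cells)
c[0] = 5.0                       # initial inoculum at one end
state = np.zeros(n_cells, dtype=int)

for _ in range(n_steps):
    # PDE step: explicit diffusion with zero-flux boundaries.
    lap = np.zeros_like(c)
    lap[1:-1] = (c[2:] - 2 * c[1:-1] + c[:-2]) / dx**2
    lap[0] = (c[1] - c[0]) / dx**2
    lap[-1] = (c[-2] - c[-1]) / dx**2
    c += dt * (D * lap + shed_rate * state)  # infected cells shed pathogen

    # CA step: cells above the local-load threshold become infected.
    state[c > threshold] = 1

print(f"infected cells: {state.sum()} / {n_cells}")
```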