434 results for Theater, Open-air.
Abstract:
Event report on the Open Access and Research 2013 conference, which focused on recent developments and the strategic advantages they bring to the research sector.
Abstract:
Enterprises, both public and private, have rapidly begun exploiting the benefits of enterprise resource planning (ERP) combined with business analytics and “open data sets”, which are often outside the control of the enterprise, to gain further efficiencies, build new service operations and increase business activity. In many cases, these business activities are based around relevant software systems hosted in a “cloud computing” environment. “Garbage in, garbage out”, or “GIGO”, is a term dating from the 1960s and long used to describe problems of unqualified dependence on information systems. However, a more pertinent variation arose sometime later, namely “garbage in, gospel out”, signifying that with large-scale information systems, such as ERP and open data sets used in a cloud environment, verifying the authenticity of the data sets employed may be almost impossible, resulting in dependence upon questionable results. Illicit data set “impersonation” becomes a reality. At the same time, the ability to audit such results may be an important requirement, particularly in the public sector. This paper discusses the need for enhancement of identity, reliability, authenticity and audit services, including naming and addressing services, in this emerging environment, and analyses some current technologies on offer that may be appropriate. However, severe limitations in addressing these requirements have been identified, and the paper proposes further research work in the area.
Abstract:
Enterprise resource planning (ERP) systems are rapidly being combined with “big data” analytics processes and publicly available “open data sets”, which are usually outside the control of the enterprise, to expand activity through better service to current clients as well as by identifying new opportunities. Moreover, these activities are now largely based around relevant software systems hosted in a “cloud computing” environment. The over-50-year-old phrase “garbage in, garbage out”, or “GIGO”, describes the problem of unqualified and unquestioning dependence on information systems. A more relevant GIGO interpretation arose sometime later, namely “garbage in, gospel out”, signifying that with large-scale information systems based around ERP, open data sets and “big data” analytics, particularly in a cloud environment, verifying the authenticity and integrity of the data sets used may be almost impossible. In turn, this may easily result in decision making based upon questionable, unverifiable results. Illicit “impersonation” of, and modifications to, legitimate data sets may become a reality, while at the same time the ability to audit any derived results of analysis may be an important requirement, particularly in the public sector. The pressing need for enhancement of identity, reliability, authenticity and audit services, including naming and addressing services, in this emerging environment is discussed in this paper. Some current and potentially appropriate technologies are also examined. However, severe limitations in addressing the problems identified are found, and the paper proposes further necessary research work in the area.
(Note: This paper is based on an earlier unpublished paper/presentation “Identity, Addressing, Authenticity and Audit Requirements for Trust in ERP, Analytics and Big/Open Data in a ‘Cloud’ Computing Environment: A Review and Proposal” presented to the Department of Accounting and IT, College of Management, National Chung Cheng University, 20 November 2013.)
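In practice, the data-set integrity concern raised in the two abstracts above is commonly addressed with cryptographic digests and published checksums. The following minimal sketch (not taken from the paper; file paths and digests are illustrative assumptions) shows how a consumer of an open data set could verify that a local copy matches a digest published by the data provider:

```python
import hashlib

def sha256_digest(path: str, chunk_size: int = 8192) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks
    so that large open data sets need not fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_dataset(path: str, published_digest: str) -> bool:
    """Return True only if the local copy matches the digest the
    provider published; a mismatch suggests corruption or tampering."""
    return sha256_digest(path) == published_digest.lower()
```

A digest alone only detects accidental or malicious modification; guarding against the “impersonation” scenario discussed above additionally requires that the digest itself be distributed over an authenticated channel (e.g. a digital signature), which is exactly the kind of identity and authenticity service the paper argues is lacking.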
Abstract:
The Comment by Mayers and Reiter criticizes our work on two counts. Firstly, it is claimed that the quantum decoherence effects that we report, in consequence of our experimental analysis of neutron Compton scattering from H in gaseous H2, are not, as we maintain, outside the framework of conventional neutron scattering theory. Secondly, it is claimed that we did not really observe such effects, owing to a faulty analysis of the experimental data, which are claimed to be in agreement with conventional theory. In this response we focus on the critical issue of the reliability of our experimental results and analysis. Using the same standard Vesuvio instrument programs used by Mayers et al., we show that, if the experimental results for H in gaseous H2 are in agreement with conventional theory, then those for D in gaseous D2 obtained in the same way cannot be, and vice versa. We expose a flaw in the calibration methodology used by Mayers et al. that leads to the present disagreement over the behaviour of H, namely the ad hoc adjustment of the measured H peak positions in TOF during the calibration of Vesuvio so that agreement is obtained with the expectation of conventional theory. We briefly address the question of the necessity of applying the theory of open quantum systems.
Abstract:
Decoherence of quantum entangled particles is observed in most systems and is usually caused by system-environment interactions. Disentangling two subsystems A and B of a quantum system AB is tantamount to erasure of the quantum phase relations between A and B. It is widely believed that this erasure is an innocuous process which, e.g., does not affect the energies of A and B. Surprisingly, recent theoretical investigations by different groups have shown that disentangling two systems, i.e. their decoherence, can cause an increase of their energies. Applying this result to the context of neutron Compton scattering from H2 molecules, we provide for the first time experimental evidence that supports this prediction. The results reveal that the neutron-proton collision leading to the cleavage of the H-H bond on the sub-femtosecond timescale is accompanied by a larger energy transfer (by about 3%) than conventional theory predicts. We propose to interpret the results by considering the neutron-proton collisional system as an entangled open quantum system subject to decoherence owing to interactions with the “environment” (i.e., the two electrons plus the second proton of H2).
Abstract:
A growing number of online journals and academic platforms are adopting light peer review or 'publish then filter' models of scholarly communication. These approaches have the advantage of enabling instant exchanges of knowledge between academics and are part of a wider search for alternatives to traditional peer review and certification processes in scholarly publishing. However, establishing credibility and finding the correct balance between communication and scholarly rigour remain important challenges for digital communication platforms targeting academic communities. This paper looks at a highly influential, government-backed open publishing platform in China: Science Paper Online, which is using transparent post-publication peer-review processes to encourage innovation and address systemic problems in China's traditional academic publishing system. There can be little doubt that the Chinese academic publishing landscape differs in important ways from its counterparts in the United States and Western Europe. However, this article suggests that developments in China also provide important lessons about the potential of digital technology and government policy to facilitate a large-scale shift towards more open and networked models of scholarly communication.
Abstract:
In this Article, Petia Wohed, Arthur H. M. ter Hofstede, Nick Russell, Birger Andersson, and Wil M. P. van der Aalst present the results of their examination of existing open source BPM systems. Their conclusions are illuminating for Open Source developers and the user community alike. Read their Article for the details of their study.
Abstract:
This Article is about legal scholarly publication in a time of plenitude. It is an attempt to explain why the most pressing questions in legal scholarly publishing are about how we ensure access to an infinity of content. It explains why standard assumptions about resource scarcity in publication are wrong in general, and how the changes in the modality of publication affect legal scholarship. It talks about the economics of open access to legal material, and how this connects to a future where there is infinite content. And because student-edited law reviews fit this future better than their commercially produced, peer-refereed cousins, this Article is, in part, a defense of the crazy-beautiful institution that is the American law review.
Abstract:
Adversarial multiarmed bandits with expert advice is one of the fundamental problems in the study of the exploration-exploitation trade-off. It is known that if we observe the advice of all experts on every round, we can achieve O(√(KT ln N)) regret, where K is the number of arms, T is the number of game rounds, and N is the number of experts. It is also known that if we observe the advice of just one expert on every round, we can achieve regret of order O(√(NT)). Our open problem is what can be achieved by asking M experts on every round, where 1 < M < N.
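To get a feel for the gap between the two known regimes described in this abstract, the bounds can be compared numerically. This sketch hides the constants in the O(·) notation (taking them as 1) and uses arbitrary example values of K, T and N that are not from the abstract:

```python
import math

def full_advice_bound(K: int, T: int, N: int) -> float:
    """sqrt(K * T * ln N): regret scale when the advice of all N
    experts is observed on every round."""
    return math.sqrt(K * T * math.log(N))

def single_expert_bound(T: int, N: int) -> float:
    """sqrt(N * T): regret scale when only one expert's advice
    is observed on every round."""
    return math.sqrt(N * T)

# Example: K = 10 arms, T = 100,000 rounds, N = 1,000 experts.
K, T, N = 10, 100_000, 1_000
print(full_advice_bound(K, T, N))    # full advice: much smaller scale
print(single_expert_bound(T, N))     # one expert per round: larger scale
```

With these values the full-advice bound is several times smaller than the single-expert bound, which is what makes the intermediate regime 1 < M < N an interesting open question: how quickly does the achievable regret interpolate between the two as M grows?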
Abstract:
Novel nano zero-valent iron/palygorskite composite materials prepared by evaporative and centrifuge methods were tested for the degradation of bisphenol A in an aqueous medium. A systematic study is presented which showed that nano zero-valent iron material alone has little effect on bisphenol A degradation. When hydrogen peroxide was added to initiate the reaction, only partial bisphenol A removal (∼20%) was achieved; however, with the aid of air bubbles, the removal could be significantly increased to ∼99%. Compared with pristine nano zero-valent iron and commercial iron powder, the nano zero-valent iron/palygorskite composite materials have much higher reactivity towards bisphenol A, and they are superior in that they have little impact on the solution pH. For pristine nano zero-valent iron, by contrast, it is difficult to maintain the reaction system at a favourable low pH, which is a key factor in maintaining high bisphenol A removal. All materials were characterized by X-ray diffraction, scanning electron microscopy, elemental analysis, transmission electron microscopy and X-ray photoelectron spectroscopy. The optimum conditions were obtained from a series of batch experiments. This study extends the application of nano zero-valent iron/palygorskite composites as effective materials for the removal of phenolic compounds from the environment.
Abstract:
This project was conducted at Lithgow Correctional Centre (LCC), NSW, Australia. Air quality field measurements were conducted on two occasions (23-27 May 2012 and 3-8 December 2012), just before and six months after the introduction of smoke-free buildings policies (28 May 2012) at the LCC, respectively. The main aims of this project were to: (1) investigate the indoor air quality; (2) quantify the level of exposure to environmental tobacco smoke (ETS); (3) identify the main indoor particle sources; (4) distinguish the PM2.5/particle number contribution of ETS from that of other sources; and (5) provide recommendations for improving indoor air quality and/or minimising exposure at the LCC. The measurements were conducted in Unit 5.2A, Unit 5.2B, Unit 1.1 and Unit 3.1, together with personal exposure measurements, based on the following parameters:
- Indoor and outdoor particle number (PN) concentration in the size range 0.005-3 µm
- Indoor and outdoor PM2.5 particle mass concentration
- Indoor and outdoor VOC concentrations
- Personal particle number exposure levels (in the size range 0.01-0.3 µm)
- Indoor and outdoor CO and CO2 concentrations, temperature and relative humidity
In order to enhance the outcomes of this project, the indoor and outdoor particle number (PN) concentrations were measured by two additional instruments (CPC 3787) which were not listed in the original proposal.
Abstract:
The purpose of this study was to investigate the effect of very small air gaps (less than 1 mm) on the dosimetry of the small photon fields used for stereotactic treatments. Measurements were performed with optically stimulated luminescent dosimeters (OSLDs) for 6 MV photons on a Varian 21iX linear accelerator with a Brainlab µMLC attachment, for square field sizes down to 6 mm × 6 mm. Monte Carlo simulations were performed using the EGSnrc C++ user code cavity. It was found that the Monte Carlo model used in this study accurately simulated the OSLD measurements on the linear accelerator. For the 6 mm field size, a 0.5 mm air gap upstream of the active area of the OSLD caused a 5.3% dose reduction relative to a Monte Carlo simulation with no air gap...
Abstract:
Numerical simulations of thermomagnetic convection of paramagnetic fluids placed in a micro-gravity condition (g ≈ 0) under a uniform vertical-gradient magnetic field, in an open-ended square enclosure with a ramp heating temperature condition applied on a vertical wall, are investigated in this study. In the presence of a strong magnetic gradient field, thermal convection of a paramagnetic fluid can take place even in a zero-gravity environment as a direct consequence of temperature differences occurring within the fluid. The thermal boundary layer develops adjacent to the hot wall as soon as the ramp temperature condition is applied to it. Two scenarios can be observed, depending on the ramp heating time: the thermal boundary layer can reach steady state before the ramp time has finished, or vice versa. If the ramp time is larger than the quasi-steady time, then after the quasi-steady time the thermal boundary layer is in a quasi-steady mode with convection balancing conduction. Further increase of the heat input simply accelerates the flow to maintain the proper thermal balance. Finally, the boundary layer becomes completely steady once the ramp time has finished. Effects of the magnetic Rayleigh number, Prandtl number and paramagnetic fluid parameter on the flow pattern and heat transfer are presented.
Abstract:
In this work, 17 polychlorinated dibenzo-p-dioxin/furan (PCDD/F) isomers were measured in ambient air at four urban sites in Seoul, Korea (from February to June 2009). The concentrations of their summed values (ΣPCDD/Fs) across all four sites ranged from 1,947 (271 WHO05 TEQ) fg/m³ (Jong Ro) to 2,600 (349 WHO05 TEQ) fg/m³ (Yang Jae), with a mean of 2,125 ± 317 fg/m³ (292 WHO05 TEQ fg/m³). The summed values for the two isomer groups, ΣPCDD and ΣPCDF, were 527 (30 WHO05 TEQ) and 1,598 (263 WHO05 TEQ) fg/m³, respectively. The concentration profile of individual species was dominated by the 2,3,4,7,8-PeCDF isomer, which contributed approximately 36% of the ΣPCDD/Fs value. The observed temporal trends in PCDD/F concentrations were characterized by relative enhancement in the winter and spring. The relative contributions of different sources, as assessed by principal component analysis, are explained by the dominance of vehicular emissions, along with coal (or gas) burning, as the key sources of ambient PCDD/Fs in the residential areas studied.