1000 results for open-shell
Abstract:
The Australian masonry standard allows either prism tests or correction factors based on block height and mortar thickness to evaluate masonry compressive strength. The correction factor ensures that taller units laid with conventional 10 mm mortar are not disadvantaged by the size effect. In recent times, 2-4 mm thick, high-adhesive mortars and H blocks containing only the mid-web shell have come into use in masonry construction. H blocks and thinner, higher-adhesive mortars have renewed interest in the compression behaviour of hollow concrete masonry, which is therefore revisited in this paper. This paper presents an experimental study carried out to examine the effects of mortar joint thickness, mortar adhesive type and the presence of web shells on hollow concrete masonry prisms under axial compression. A non-contact digital image correlation technique was used to measure the deformation of the prisms and was found adequate for determining the strain field of the loaded face shells subjected to axial compression. It is found that the absence of end web shells lowers the compressive strength and stiffness of the prisms, while thinner, higher-adhesive mortars increase the compressive strength and stiffness and lower the Poisson's ratio. © Institution of Engineers Australia, 2013.
Abstract:
In this research we observe the situated, embodied and playful interaction that participants engage in with open-ended interactive artworks. The larger project from which this work derives [28] contributes a methodological model for the evaluation of open-ended interactive artworks that treats each work individually and recognises the importance of the artist's intent and the traditions from which the work derives. In this paper, we describe this evolving methodology for evaluating and understanding participation via three case studies of open-ended interactive art installations. This analysis builds an understanding of open-ended, free-play, non-narrative environments and the affordances these environments enable for participants.
Abstract:
In this paper we introduce and discuss the nature of free-play in the context of three open-ended interactive art installations. We observe the situated free-play of participants in these environments and, building on precedent work, devise a set of sensitising terms derived both from the literature and from what we observe of participants interacting there. These sensitising terms act as guides and are designed to be used by those who experience, evaluate or report on open-ended interactive art. That is, we propose these terms as a common-ground language to be used by participants communicating while in the artwork to describe their experience, by researchers in the various stages of the research process (observation, coding, analysis, reporting, and publication), and by interdisciplinary researchers working across the fields of HCI and art. This work builds a foundation for understanding the relationship between free-play, open-ended environments, and interactive installations, and contributes sensitising terms useful to the HCI community for the discussion and analysis of open-ended interactive artworks.
Abstract:
This article investigates the discourses of academic legitimacy that surround the production, consumption, and accreditation of online scholarship. Using the web-based media and cultural studies journal M/C Journal (http://journal.media-culture.org.au) as a case study, it examines how online scholarly journals often position themselves as occupying a space between the academic and the popular, and as having a functional advantage over print-based media in promoting a spirit of public intellectualism. The current research agenda of both government and academe prioritises academic research that is efficient, self-promoting, and relevant to the public. Yet, although the cost-effectiveness and public-intellectual focus of online scholarship speak to these research priorities, online journals such as M/C Journal have occupied, and continue to occupy, an unstable position in relation to the perceived academic legitimacy of their content. Although some online scholarly journals have achieved a limited form of recognition within a system of accreditation that still privileges print-based scholarship, I argue that this nevertheless points to the fact that traditional textual notions of legitimate academic work continue to pervade the research agenda of an academe that increasingly promotes flexible delivery of teaching and online research initiatives.
Abstract:
Dynamic light scattering (DLS) has become a primary nanoparticle characterization technique, with applications from materials characterization to biological and environmental detection. As DLS use expands from homogeneous spheres to more complicated nanostructures, its accuracy decreases. Much research has been devoted to diffusion models that account for these vastly different structures, but little attention has been given to the effect of structure on the light scattering properties relevant to DLS. In this work, small (core size < 5 nm) core-shell nanoparticles were used as a case study to measure, by DLS, the thickness of a dodecanethiol (DDT) capping layer on Au and ZnO nanoparticles. We find that the DDT shell has very little effect on the scattering properties of the inorganic core and hence can be ignored to a first approximation. However, this means that conventional DLS analysis overestimates the hydrodynamic size in the volume- and number-weighted distributions. By introducing a simple correction formula that more accurately yields hydrodynamic size distributions, a more precise determination of the molecular shell thickness is obtained. With this correction, the measured thickness of the DDT shell was found to be 7.3 ± 0.3 Å, much less than the extended chain length of 16 Å. This organic layer thickness suggests that on small nanoparticles the DDT monolayer adopts a compact disordered structure rather than an open ordered structure on both ZnO and Au nanoparticle surfaces. These observations are in agreement with published molecular dynamics results.
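For context, the size measurement behind this abstract rests on the standard Stokes-Einstein relation used in DLS; the sketch below is illustrative only (it is not the paper's correction formula), and the temperature, viscosity and example sizes are assumptions chosen to mirror the quoted numbers.

```python
# Illustrative sketch: Stokes-Einstein conversion of a translational
# diffusion coefficient D into a hydrodynamic diameter d_H, and the
# monolayer thickness inferred by comparing against a known core size.
# NOT the paper's correction formula; numbers are hypothetical.
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K


def hydrodynamic_diameter(D, temperature=298.15, viscosity=8.9e-4):
    """Stokes-Einstein: d_H = k_B * T / (3 * pi * eta * D), SI units (m)."""
    return K_B * temperature / (3 * math.pi * viscosity * D)


def shell_thickness(d_hydro, d_core):
    """Organic layer thickness assuming a concentric spherical shell (m)."""
    return (d_hydro - d_core) / 2.0


# Hypothetical example: a 5 nm core whose corrected DLS size is 6.46 nm
# implies a ~7.3 Angstrom shell, consistent with a compact monolayer.
thickness = shell_thickness(6.46e-9, 5.0e-9)
```

The shell thickness halves the diameter difference because the layer sits on both sides of the particle; the paper's contribution is the correction that makes `d_hydro` accurate for such small core-shell particles in the first place.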
Abstract:
Event report on the Open Access and Research 2013 conference, which focused on recent developments and the strategic advantages they bring to the research sector.
Abstract:
Enterprises, both public and private, have rapidly begun combining enterprise resource planning (ERP) with business analytics and “open data sets”, which are often outside the control of the enterprise, to gain further efficiencies, build new service operations and increase business activity. In many cases, these business activities are based around relevant software systems hosted in a “cloud computing” environment. “Garbage in, garbage out”, or “GIGO”, is a term long used to describe problems of unqualified dependency on information systems, dating from the 1960s. However, a more pertinent variation arose sometime later, namely “garbage in, gospel out”, signifying that with large-scale information systems, such as ERP and open datasets used in a cloud environment, the ability to verify the authenticity of the data sets used may be almost impossible, resulting in dependence upon questionable results. Illicit data set “impersonation” becomes a reality. At the same time, the ability to audit such results may be an important requirement, particularly in the public sector. This paper discusses the need for enhancement of identity, reliability, authenticity and audit services, including naming and addressing services, in this emerging environment, and analyses some current technologies that may be appropriate. However, severe limitations in addressing these requirements have been identified, and the paper proposes further research work in the area.
Abstract:
Enterprise resource planning (ERP) systems are rapidly being combined with “big data” analytics processes and publicly available “open data sets”, which are usually outside the arena of the enterprise, to expand activity through better service to current clients as well as by identifying new opportunities. Moreover, these activities are now largely based around relevant software systems hosted in a “cloud computing” environment. The more than 50-year-old phrase expressing mistrust in computer systems, “garbage in, garbage out” or “GIGO”, describes problems of unqualified and unquestioning dependency on information systems. However, a more relevant GIGO interpretation arose sometime later, namely “garbage in, gospel out”, signifying that with large-scale information systems based around ERP, open datasets and “big data” analytics, particularly in a cloud environment, the ability to verify the authenticity and integrity of the data sets used may be almost impossible. In turn, this may easily result in decision making based upon questionable, unverifiable results. Illicit “impersonation” of, and modifications to, legitimate data sets may become a reality, while at the same time the ability to audit any derived results of analysis may be an important requirement, particularly in the public sector. The pressing need for enhancement of identity, reliability, authenticity and audit services, including naming and addressing services, in this emerging environment is discussed in this paper, and some appropriate technologies currently on offer are examined. However, severe limitations in addressing the problems identified are found, and the paper proposes further necessary research work for the area.
(Note: This paper is based on an earlier unpublished paper/presentation “Identity, Addressing, Authenticity and Audit Requirements for Trust in ERP, Analytics and Big/Open Data in a ‘Cloud’ Computing Environment: A Review and Proposal” presented to the Department of Accounting and IT, College of Management, National Chung Chen University, 20 November 2013.)
Abstract:
The Comment by Mayers and Reiter criticizes our work on two counts. Firstly, it is claimed that the quantum decoherence effects that we report, in consequence of our experimental analysis of neutron Compton scattering from H in gaseous H2, are not, as we maintain, outside the framework of conventional neutron scattering theory. Secondly, it is claimed that we did not really observe such effects, owing to a faulty analysis of the experimental data, which are claimed to be in agreement with conventional theory. We focus in this response on the critical issue of the reliability of our experimental results and analysis. Using the same standard Vesuvio instrument programs used by Mayers et al., we show that, if the experimental results for H in gaseous H2 are in agreement with conventional theory, then those for D in gaseous D2 obtained in the same way cannot be, and vice versa. We expose a flaw in the calibration methodology used by Mayers et al. that leads to the present disagreement over the behaviour of H, namely the ad hoc adjustment of the measured H peak positions in TOF during the calibration of Vesuvio so that agreement is obtained with the expectation of conventional theory. We briefly address the question of the necessity of applying the theory of open quantum systems.
Abstract:
Decoherence of entangled quantum particles is observed in most systems and is usually caused by system-environment interactions. Disentangling two subsystems A and B of a quantum system AB is tantamount to erasure of the quantum phase relations between A and B. It is widely believed that this erasure is an innocuous process which, e.g., does not affect the energies of A and B. Surprisingly, recent theoretical investigations by different groups showed that disentangling two systems, i.e. their decoherence, can cause an increase of their energies. Applying this result to the context of neutron Compton scattering from H2 molecules, we provide for the first time experimental evidence which supports this prediction. The results reveal that the neutron-proton collision leading to the cleavage of the H-H bond on the sub-femtosecond timescale is accompanied by a larger energy transfer (by about 3%) than conventional theory predicts. We propose to interpret the results by considering the neutron-proton collisional system as an entangled open quantum system subject to decoherence owing to interactions with the “environment” (i.e., the two electrons plus the second proton of H2).
Abstract:
A growing number of online journals and academic platforms are adopting light peer review or 'publish then filter' models of scholarly communication. These approaches have the advantage of enabling instant exchanges of knowledge between academics and are part of a wider search for alternatives to traditional peer review and certification processes in scholarly publishing. However, establishing credibility and identifying the correct balance between communication and scholarly rigour remains an important challenge for digital communication platforms targeting academic communities. This paper looks at a highly influential, government-backed, open publishing platform in China: Science Paper Online, which is using transparent post-publication peer-review processes to encourage innovation and address systemic problems in China's traditional academic publishing system. There can be little doubt that the Chinese academic publishing landscape differs in important ways from counterparts in the United States and Western Europe. However, this article suggests that developments in China also provide important lessons about the potential of digital technology and government policy to facilitate a large-scale shift towards more open and networked models of scholarly communication.
Abstract:
In this Article, Petia Wohed, Arthur H. M. ter Hofstede, Nick Russell, Birger Andersson, and Wil M. P. van der Aalst present the results of their examination of existing open source BPM systems. Their conclusions are illuminating for both Open Source developers and the user community. Read their Article for the details of their study.
Abstract:
This Article is about legal scholarly publication in a time of plenitude. It is an attempt to explain why the most pressing questions in legal scholarly publishing are about how we ensure access to an infinity of content. It explains why standard assumptions about resource scarcity in publication are wrong in general, and how the changes in the modality of publication affect legal scholarship. It talks about the economics of open access to legal material, and how this connects to a future where there is infinite content. And because student-edited law reviews fit this future better than their commercially-produced, peer-refereed cousins, this Article is, in part, a defense of the crazy-beautiful institution that is the American law review.
Abstract:
Adversarial multiarmed bandits with expert advice is one of the fundamental problems in studying the exploration-exploitation trade-off. It is known that if we observe the advice of all experts on every round we can achieve O(√(KT ln N)) regret, where K is the number of arms, T is the number of game rounds, and N is the number of experts. It is also known that if we observe the advice of just one expert on every round, we can achieve regret of order O(√(NT)). Our open problem is what can be achieved by asking M experts on every round, where 1 < M < N.
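The two known bounds framing the open problem can be written out as a tiny numerical sketch; constants and the algorithms that achieve the rates are suppressed, and the parameter values below are illustrative assumptions, not from the abstract.

```python
# Sketch of the two known regret rates (up to constants) for adversarial
# bandits with expert advice: observing all N experts per round versus
# observing a single expert per round. The open gap is 1 < M < N.
import math


def regret_all_experts(K, T, N):
    """Rate O(sqrt(K * T * ln N)) when all experts' advice is observed."""
    return math.sqrt(K * T * math.log(N))


def regret_one_expert(T, N):
    """Rate O(sqrt(N * T)) when only one expert is observed per round."""
    return math.sqrt(N * T)


# Illustrative regime: few arms, many experts, moderately long game.
K, T, N = 10, 10_000, 100
```

In this regime, full observation gives a much smaller rate than single-expert observation (roughly √(K ln N) versus √N in the N-dependence), which is what makes the intermediate-M question interesting.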
Abstract:
Numerical simulations of thermomagnetic convection of paramagnetic fluids placed in a micro-gravity condition (g ≈ 0) and under a uniform vertical gradient magnetic field in an open-ended square enclosure, with a ramp heating temperature condition applied on a vertical wall, are presented in this study. In the presence of a strong gradient magnetic field, thermal convection of a paramagnetic fluid can take place even in a zero-gravity environment as a direct consequence of temperature differences occurring within the fluid. The thermal boundary layer develops adjacent to the hot wall as soon as the ramp temperature condition is applied to it. Two scenarios can be observed, depending on the ramp heating time: the thermal boundary layer either reaches steady state before the ramp time is finished, or vice versa. If the ramp time is larger than the quasi-steady time, then after the quasi-steady time the thermal boundary layer is in a quasi-steady mode with convection balancing conduction. Further increase of the heat input simply accelerates the flow to maintain the proper thermal balance. Finally, the boundary layer becomes completely steady once the ramp time is finished. Effects of the magnetic Rayleigh number, Prandtl number and paramagnetic fluid parameter on the flow pattern and heat transfer are presented.
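The two ramp-heating scenarios described above can be sketched schematically. This assumes the standard conduction estimate δ ≈ √(κt) for early thermal boundary-layer growth (an assumption for illustration; the study itself resolves the full magneto-convective problem numerically).

```python
# Schematic sketch of the two ramp-heating scenarios, assuming
# conduction-dominated early growth of the thermal boundary layer,
# delta ~ sqrt(kappa * t). Purely illustrative.
import math


def boundary_layer_thickness(kappa, t):
    """Conduction-dominated thermal boundary-layer thickness (m),
    for thermal diffusivity kappa (m^2/s) and elapsed time t (s)."""
    return math.sqrt(kappa * t)


def ramp_scenario(t_ramp, t_quasi_steady):
    """Which of the two scenarios applies for a given ramp time."""
    if t_ramp > t_quasi_steady:
        return ("quasi-steady during the ramp: convection balances "
                "conduction until the ramp ends")
    return "steady state reached before the ramp ends"
```

The comparison of ramp time against quasi-steady time is the branch point the abstract describes; everything after it (flow acceleration with increasing heat input, final steady state) belongs to the full simulations.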