968 results for micro fusion framework
Abstract:
This chapter analyses the copyright law framework needed to ensure open access to outputs of the Australian academic and research sector, such as journal articles and theses. It provides an overview of the new knowledge landscape, the principles of copyright law, the concept of open access to knowledge, the recently developed open content models of copyright licensing, and the challenges faced in providing greater access to knowledge and research outputs.
Abstract:
As the Internet becomes deeply embedded in consumers’ daily lives, the digital virtual world exerts a significant influence on consumers’ selves and narratives. Prior studies examine the consumer self either within a particular online space or by comparing consumers’ physical and digital virtual selves, but not the integration of the physical and digital worlds. This paper aims to explore the meanings of the digital virtual space for consumers’ narratives as a whole (their interests, dreams, or subjectivity). We utilise the postmodern concept of the cyborg to understand the cultural complexity and subjective meanings of the digital virtual space, and the extent to which it plays a role in consumers’ self-narratives. We conducted in-depth interviews and gathered three consumer narratives. Our findings indicate that consumers’ narratives contain important fragments from both the physical and digital virtual worlds, and that their physical and digital virtual selves form a feedback loop that strengthens their overall narrative.
Abstract:
Methods are presented for the preparation, ligand density analysis and use of an affinity adsorbent for the purification of a glutathione S-transferase (GST) fusion protein in packed and expanded bed chromatographic processes. The protein is composed of GST fused to a zinc finger transcription factor (ZnF). Glutathione, the affinity ligand for GST purification, is covalently immobilized on a solid-phase adsorbent (Streamline™). The GST–ZnF fusion protein displays a dissociation constant of 0.6 × 10^-6 M for glutathione immobilized on Streamline™. Ligand density optimization, fusion protein elution conditions (pH and glutathione concentration) and ligand orientation are briefly discussed.
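As a quick illustration of what a dissociation constant of this magnitude implies for the adsorbent, the sketch below evaluates the single-site binding relation θ = [L]/(Kd + [L]). This assumes a simple Langmuir-type binding model, and the ligand concentrations are hypothetical, chosen only to show the trend; it is not a reproduction of the study's analysis.

```python
# Illustrative single-site binding calculation for the GST-ZnF / immobilised
# glutathione interaction (Kd ~ 0.6 uM); ligand concentrations are hypothetical.

KD = 0.6e-6  # M, dissociation constant reported for the GST-ZnF fusion protein

def bound_fraction(ligand_conc_M: float, kd: float = KD) -> float:
    """Fraction of fusion protein bound at a given effective ligand concentration,
    assuming a simple single-site (Langmuir-type) binding model."""
    return ligand_conc_M / (kd + ligand_conc_M)

if __name__ == "__main__":
    for conc in (0.1e-6, 0.6e-6, 6e-6, 60e-6):  # hypothetical effective ligand concentrations, M
        print(f"[L] = {conc:.1e} M -> bound fraction = {bound_fraction(conc):.2f}")
```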
Abstract:
Every university in Australia has a set of policies that guide the institution in its educational practices; however, these policies are often developed in isolation from each other. Now imagine a space where policies are evidence-based, refined annually, cohesively interrelated, and meet stakeholders’ needs. Is this happenstance or the result of good planning? Culturally, Queensland University of Technology (QUT) is a risk-averse institution that takes pride in its financial solvency and is always keen to know “how are we going?” With a twenty-year history of annual reporting that assures the quality of course performance through multiple lines of evidence, QUT’s Learning and Teaching Unit went one step further and strategically aligned a suite of policies that take stakeholders’ needs into consideration, draw on collaboration with other areas across the institution, and use multiple lines of evidence to inform curriculum decision-making. In QUT’s experience, strategic planning can lead to policy that is designed to meet stakeholders’ needs, not manage them; where decision-making is supported by evidence, not rhetoric; where all feedback is incorporated, not ignored; and where policies are cohesively interrelated, not isolated. While many may call this ‘policy nirvana’, QUT has positioned itself to demonstrate good educational practice through Reframe, its evaluation framework. In this case, best practice was achieved through the application of a theory of change and a design-led logic model that allows for transition to other institutions with different cultural specificities. The evaluation approach follows Seldin’s (2003) notion of offering depth and breadth in the evaluation framework, along with Berk’s (2005) concept of multiple lines of evidence. In summary, this paper offers university executives, academics, and planning and quality staff an opportunity to understand the critical steps that lead to strategic planning and design of evidence-based educational policy that positions a university for best practice in learning and teaching.
Abstract:
Porosity is one of the key parameters of the macroscopic structure of porous media, generally defined as the ratio of the free spaces occupied (by the volume of air) within the material to the total volume of the material. Porosity is determined by measuring skeletal volume and the envelope volume. Solid displacement method is one of the inexpensive and easy methods to determine the envelope volume of a sample with an irregular shape. In this method, generally glass beads are used as a solid due to their uniform size, compactness and fluidity properties. The smaller size of the glass beads means that they enter into the open pores which have a larger diameter than the glass beads. Although extensive research has been carried out on porosity determination using displacement method, no study exists which adequately reports micro-level observation of the sample during measurement. This study set out with the aim of assessing the accuracy of solid displacement method of bulk density measurement of dried foods by micro-level observation. Solid displacement method of porosity determination was conducted using a cylindrical vial (cylindrical plastic container) and 57 µm glass beads in order to measure the bulk density of apple slices at different moisture contents. A scanning electron microscope (SEM), a profilometer and ImageJ software were used to investigate the penetration of glass beads into the surface pores during the determination of the porosity of dried food. A helium pycnometer was used to measure the particle density of the sample. Results show that a significant number of pores were large enough to allow the glass beads to enter into the pores, thereby causing some erroneous results. It was also found that coating the dried sample with appropriate coating material prior to measurement can resolve this problem.
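To make the link between the measured volumes and porosity concrete, the sketch below uses the standard relations ρ_bulk = m / V_envelope and ε = 1 − ρ_bulk / ρ_particle, where the envelope volume comes from bead displacement and the particle density from helium pycnometry. The numerical values are hypothetical, for demonstration only, and are not taken from the study.

```python
# Hypothetical illustration of porosity determination from solid (glass-bead)
# displacement and helium pycnometry; values are made up for demonstration.

def bulk_density(sample_mass_g: float, envelope_volume_cm3: float) -> float:
    """Bulk (apparent) density from the envelope volume measured by bead displacement."""
    return sample_mass_g / envelope_volume_cm3

def porosity(bulk_density_g_cm3: float, particle_density_g_cm3: float) -> float:
    """Porosity as the fraction of the envelope volume not occupied by solid material."""
    return 1.0 - bulk_density_g_cm3 / particle_density_g_cm3

if __name__ == "__main__":
    mass = 1.20              # g, dried apple slice (hypothetical)
    envelope_volume = 2.50   # cm^3, from glass-bead displacement (hypothetical)
    particle_density = 1.45  # g/cm^3, from helium pycnometer (hypothetical)

    rho_bulk = bulk_density(mass, envelope_volume)
    print(f"bulk density: {rho_bulk:.3f} g/cm^3")
    print(f"porosity:     {porosity(rho_bulk, particle_density):.3f}")
```

If the beads penetrate surface pores, the measured envelope volume is underestimated, which inflates the computed bulk density and so underestimates porosity, which is the error the coating step is intended to prevent.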
Abstract:
Multidimensional data have been receiving increasing attention from researchers in recent years as a way to build better recommender systems. Additional metadata provides algorithms with more detail for better understanding the interaction between users and items. While neighbourhood-based Collaborative Filtering (CF) approaches and latent factor models tackle this task effectively in various ways, each utilizes only a partial structure of the data. In this paper, we seek to delve into the different types of relations in the data and to understand the interaction between users and items more holistically. We propose a generic multidimensional CF fusion approach for top-N item recommendations. The proposed approach is capable of incorporating not only the localized user-user and item-item relations but also the latent interactions between all dimensions of the data. Experimental results show significant improvements by the proposed approach in terms of recommendation accuracy.
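The abstract does not spell out the fusion formulation, so the sketch below shows only one simple way such a combination could look: a weighted late fusion of an item-item neighbourhood score and a latent-factor (truncated SVD) score on a toy interaction matrix. It is an illustration of the general idea, not the authors' algorithm, and the weighting scheme, factorisation and data are all assumptions.

```python
# Illustrative (hypothetical) late fusion of an item-item neighbourhood score and a
# latent-factor score for top-N recommendation; NOT the paper's actual method.
import numpy as np

def item_item_scores(R: np.ndarray, user: int) -> np.ndarray:
    """Score items for `user` from cosine item-item similarities on a user-item matrix R."""
    norms = np.linalg.norm(R, axis=0) + 1e-9
    S = (R.T @ R) / np.outer(norms, norms)   # item-item cosine similarity
    np.fill_diagonal(S, 0.0)
    return S @ R[user]                       # neighbourhood score per item

def latent_scores(R: np.ndarray, user: int, k: int = 8) -> np.ndarray:
    """Score items for `user` from a truncated-SVD latent factor reconstruction."""
    U, s, Vt = np.linalg.svd(R, full_matrices=False)
    return ((U[:, :k] * s[:k]) @ Vt[:k])[user]

def fused_top_n(R: np.ndarray, user: int, n: int = 5, alpha: float = 0.5) -> np.ndarray:
    """Weighted fusion of the two score sources; recommends unseen items only."""
    a = item_item_scores(R, user)
    b = latent_scores(R, user)
    # normalise each score vector so the mixing weight alpha is meaningful
    a = (a - a.min()) / (a.max() - a.min() + 1e-9)
    b = (b - b.min()) / (b.max() - b.min() + 1e-9)
    fused = alpha * a + (1.0 - alpha) * b
    fused[R[user] > 0] = -np.inf             # exclude items the user already interacted with
    return np.argsort(-fused)[:n]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    R = (rng.random((20, 30)) > 0.8).astype(float)  # toy implicit-feedback matrix
    print(fused_top_n(R, user=3))
```

A multidimensional variant would add further score sources (for example, from metadata dimensions) into the same weighted combination.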
Abstract:
Focus groups are a popular qualitative research method for information systems researchers. However, compared with the abundance of research articles and handbooks on planning and conducting focus groups, there is surprisingly little research on how to analyse focus group data. Moreover, the few articles that specifically address focus group analysis are all in fields other than information systems and offer little specific guidance for information systems researchers. Further, even the studies that exist in other fields do not provide a systematic and integrated procedure for analysing both focus group ‘content’ and ‘interaction’ data. As the focus group is a valuable method for answering the research questions of many IS studies (in business, government and society contexts), we believe that more attention should be paid to this method in IS research. This paper offers a systematic and integrated procedure for qualitative focus group data analysis in information systems research.
Abstract:
Industrial control systems (ICS) have been moving from dedicated communications to switched and routed corporate networks, making it likely that these devices are being exposed to the Internet. Many ICS have been designed with few or poor security features, making them vulnerable to attack. Recently, several tools have been developed that can scan the Internet, including ZMap, Masscan and Shodan. However, little in-depth analysis has been done to compare these Internet-wide scanning techniques, and few Internet-wide scans have been conducted that target ICS devices and protocols. In this paper we present a taxonomy of Internet-wide scanning, a comparison of three popular network scanning tools, and a framework for conducting Internet-wide scans.
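As a minimal illustration of the kind of probe that tools such as ZMap and Masscan automate at Internet scale, the sketch below checks whether a single host accepts TCP connections on port 502 (Modbus/TCP, a common ICS protocol). It is a single-host example for hosts you are authorised to test, using only the Python standard library; it is not a reproduction of the paper's scanning framework, and the target address is a documentation placeholder.

```python
# Minimal single-host TCP reachability check on port 502 (Modbus/TCP), for hosts
# you are authorised to test. Internet-wide scanners such as ZMap and Masscan
# perform this kind of probe asynchronously across entire address ranges.
import socket

def tcp_port_open(host: str, port: int = 502, timeout: float = 2.0) -> bool:
    """Return True if the host accepts a TCP connection on the given port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    target = "192.0.2.10"  # TEST-NET-1 placeholder; replace with an authorised target
    print(f"{target}:502 open -> {tcp_port_open(target)}")
```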
Abstract:
There have been substantial advances in small field dosimetry techniques and technologies over the last decade, which have dramatically improved the achievable accuracy of small field dose measurements. This educational note aims to help radiation oncology medical physicists apply some of these advances in clinical practice. The evaluation of a set of small field output factors (total scatter factors) is used to exemplify a detailed measurement and simulation procedure and as a basis for discussing the possible effects of simplifying that procedure. Field output factors were measured with an unshielded diode and a micro-ionisation chamber at the centre of a set of square fields defined by a micro-multileaf collimator. Nominal field sizes investigated ranged from 6×6 to 98×98 mm². Diode measurements in fields smaller than 30 mm across were corrected using response factors calculated from Monte Carlo simulations of the full diode geometry and daisy-chained to match micro-chamber measurements at intermediate field sizes. Diode measurements in fields smaller than 15 mm across were repeated twelve times over three separate measurement sessions, to evaluate the reproducibility of the radiation field size and its correspondence with the nominal field size. The five readings that contributed to each measurement on each day varied by up to 0.26% for the “very small” fields smaller than 15 mm and by up to 0.18% for the fields larger than 15 mm. The diode response factors calculated for the unshielded diode agreed with previously published results within 1.6%. The measured dimensions of the very small fields differed by up to 0.3 mm across the different measurement sessions, contributing an uncertainty of up to 1.2% to the very small field output factors. The overall uncertainties in the field output factors were 1.8% for the very small fields and 1.1% for the fields larger than 15 mm across. Recommended steps for acquiring small field output factor measurements for use in radiotherapy treatment planning system beam configuration data are provided.
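To make the daisy-chaining arithmetic concrete, the sketch below shows one common formulation: the corrected diode reading in a small field is renormalised through an intermediate field where both diode and micro-chamber measurements exist, and independent relative uncertainties are combined in quadrature. The readings, correction factors and uncertainty components in the example are hypothetical, and the exact normalisation chain used in the study may differ.

```python
# Hypothetical illustration of daisy-chained small-field output factors: corrected
# diode readings are tied to micro-chamber readings at an intermediate field size.
# All numbers below are made up for demonstration only.
import math

def daisy_chained_output_factor(diode_small, diode_intermediate,
                                chamber_intermediate, chamber_reference,
                                k_small=1.0, k_intermediate=1.0):
    """Field output factor for a small field, renormalised via an intermediate field."""
    diode_ratio = (diode_small * k_small) / (diode_intermediate * k_intermediate)
    chamber_ratio = chamber_intermediate / chamber_reference
    return diode_ratio * chamber_ratio

def combined_uncertainty(*relative_uncertainties):
    """Combine independent relative (fractional) uncertainties in quadrature."""
    return math.sqrt(sum(u * u for u in relative_uncertainties))

if __name__ == "__main__":
    of = daisy_chained_output_factor(
        diode_small=0.412, diode_intermediate=0.655,      # hypothetical detector readings
        chamber_intermediate=0.662, chamber_reference=1.000,
        k_small=1.02, k_intermediate=1.00)                # hypothetical response corrections
    u = combined_uncertainty(0.0026, 0.012, 0.010)        # reading spread, field size, correction
    print(f"output factor = {of:.3f} ± {of * u:.3f} (k=1)")
```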