Abstract:
There has been significant research in the field of database watermarking recently. However, insufficient attention has been given to providing reversibility (the ability to recover the original relation from the watermarked relation) and blindness (not needing the original relation for detection) at the same time. Schemes lacking these properties have several disadvantages compared with reversible and blind watermarking (which requires only the watermarked relation and a secret key to detect the watermark and restore the original relation): the inability to identify the rightful owner after successful secondary watermarking, the inability to revert the relation to the original data set (required in high-precision industries), and the need to store the unmarked relation in secure secondary storage. To overcome these problems, we propose a watermarking scheme that is both reversible and blind. We use difference expansion on integers to achieve reversibility. The major advantages of our scheme are reversibility to a high-quality original data set, rightful-owner identification, resistance against secondary watermarking attacks, and no need to store the original database in secure secondary storage. We have implemented our scheme, and results show that the success rate is limited to 11% even when 48% of the tuples are modified.
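The abstract does not spell out the embedding transform, so the following is a minimal sketch of the classic integer difference-expansion primitive (in the style of Tian's scheme) on which such reversibility is typically built; the function names are invented here, and practical schemes must additionally guard against overflow of the expanded difference.

def de_embed(x, y, bit):
    """Embed one watermark bit into an integer pair via difference expansion."""
    l = (x + y) // 2        # integer average; preserved by the transform
    h = x - y               # difference between the pair
    h2 = 2 * h + bit        # expand the difference and append the bit
    return l + (h2 + 1) // 2, l - h2 // 2   # marked pair (x', y')

def de_extract(x2, y2):
    """Recover the bit and restore the original pair (blind and reversible)."""
    l = (x2 + y2) // 2      # the average survives embedding unchanged
    h2 = x2 - y2
    bit = h2 & 1            # the bit rides in the LSB of the expanded diff
    h = h2 >> 1             # undo the expansion
    return bit, l + (h + 1) // 2, l - h // 2

assert de_extract(*de_embed(107, 100, 1)) == (1, 107, 100)

Detection needs only the marked pair and inverts the transform exactly, which is precisely the combination of blindness and reversibility the scheme targets.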
Abstract:
This paper describes the design and implementation of a wireless neural telemetry system that enables new experimental paradigms, such as neural recordings during rodent navigation in large outdoor environments. RoSco, short for Rodent Scope, is a small, lightweight, user-configurable module suitable for digital wireless recording from freely behaving small animals. Owing to its digital transmission technology, RoSco has advantages over most other wireless modules in noise immunity and online user-configurable settings. RoSco digitally transmits entire neural waveforms for 14 of 16 channels at 20 kHz with 8-bit encoding, streamed to the PC as standard USB audio packets. Up to 31 RoSco wireless modules can coexist in the same environment on non-overlapping independent channels. The design has spatial-diversity reception via two antennas, which makes wireless communication resilient to fading and obstacles. In comparison with most existing wireless systems, this system offers online user-selectable independent gain control of each channel in eight steps from 500 to 32,000 times, two selectable ground references from a subset of channels, selectable channel grounding to disable noisy electrodes, and selectable bandwidth suitable for action potentials (300 Hz–3 kHz) and low-frequency field potentials (4 Hz–3 kHz). Indoor and outdoor recordings taken from freely behaving rodents are shown to be comparable to a commercial wired system in sorting of neural populations. The module has low input-referred noise, a battery life of 1.5 hours, and transmission losses of 0.1% up to a range of 10 m.
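As a quick sanity check of the quoted figures, the raw payload rate implied by 14 channels of 8-bit samples at 20 kHz works out as below (a back-of-envelope sketch; radio framing and USB packet overhead are not included):

# Raw neural payload rate from the figures in the abstract.
channels = 14            # channels transmitted (of 16)
sample_rate = 20_000     # samples per second per channel
bits_per_sample = 8      # 8-bit encoding

raw_mbps = channels * sample_rate * bits_per_sample / 1e6
print(f"raw payload: {raw_mbps:.2f} Mbit/s")   # 2.24 Mbit/s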
Abstract:
Synthetic hydrogels selectively decorated with cell adhesion motifs are rapidly emerging as promising substrates for 3D cell culture. When cells are grown in 3D, they experience potentially more physiologically relevant cell-cell interactions and physical cues compared with traditional 2D cell culture on stiff surfaces. A newly developed polymer based on poly(2-oxazoline)s has been used for the first time to control attachment of fibroblast cells and is discussed here for its potential use in 3D cell culture, with particular focus on cancer cells, towards the ultimate aim of high-throughput screening of anti-cancer therapies. Advantages and limitations of using poly(2-oxazoline) hydrogels are discussed and compared with more established polymers, especially poly(ethylene glycol) (PEG).
Abstract:
1. Background/context
This presentation will report on emerging results from a two-phase project funded by the Australian Learning and Teaching Council (ALTC). The project was designed in partnership with five universities and aimed to embed peer review within the local teaching and learning culture by using a distributive leadership framework.
2. The initiative/practice
The presentation will highlight research outcomes that bring together the fundamentals of peer review of teaching with the broader contextual elements of Integration, Leadership and Development. It will be demonstrated that peer review of teaching can be implemented, and can benefit academic staff, teaching evaluation and an organisation, if attention is given to strategies that influence the contexts and cultures of teaching. Peer review as a strategy for developing excellence in teaching is considered from a holistic perspective that by necessity encompasses all elements of an educational environment. Results demonstrate achievements that can be obtained by working to foster the conditions needed for sustainable leadership and change. The work has implications for policy, research, teaching development and student outcomes, and has potential application worldwide.
3. Method(s) of evaluative data collection and analysis
The two-phase project collected focus-group and questionnaire data to inform research results, which were analysed using a thematic qualitative approach and statistical exploration.
4. Evidence of effectiveness
The presentation will demonstrate the effectiveness of distributive leadership and strategic approaches to working for cultural change through the presentation of project findings.
Abstract:
This presentation addresses issues related to leadership, academic development and the scholarship of teaching and learning, and highlights research funded by the Australian Office of Learning and Teaching (OLT) designed to embed and sustain peer review of teaching within the culture of five Australian universities: Queensland University of Technology; University of Technology, Sydney; University of Adelaide; Curtin University; and Charles Darwin University. Peer review of teaching in higher education is emphasised as a professional process for providing feedback on teaching and learning practice which, if sustained, can become an effective ongoing strategy for academic development (Barnard et al., 2011; Bell, 2005; Bolt & Atkinson, 2010; McGill & Beaty, 2001, 1992; Kemmis & McTaggart, 2000). The research affirms that developmental peer review models (Barnard et al., 2011; D'Andrea, 2002; Hammersley-Fletcher & Orsmond, 2004) can bring about successful implementation, especially within a distributive leadership framework (Spillane & Healey, 2010). The project's aims and objectives were to develop leadership capacity and to integrate peer review as a cultural practice in higher education. The research design was a two-stage inquiry process over two years. The project began in July 2011 and encompassed a development and pilot phase followed by a cascade phase, with questionnaire and focus-group evaluation processes to support ongoing improvement and measures of outcome. Leadership development activities included locally delivered workshops complemented by the identification and support of champions. To optimise long-term sustainability, the project was implemented through existing learning and teaching structures and processes within the respective partner universities. Research outcomes highlight the fundamentals of peer review of teaching and the broader contextual elements of integration, leadership and development, expressed as a conceptual model for embedding peer review of teaching within higher education. The research opens a communicative space about the introduction of peer review that goes further than simply espousing its worth. The conceptual model highlights the importance of developing distributive leadership capacity, integrating policies and processes, and understanding the values, beliefs, assumptions and behaviours embedded in an organisational culture. The presentation overviews empirical findings demonstrating that progress in advancing peer review requires an 'across-the-board' commitment to embed change, and inherently demands a process that co-creates connection across colleagues, discipline groups and the university sector. Progress toward peer review of teaching as a cultural phenomenon can be achieved, and has advantages for academic staff, scholarship, teaching evaluation and an organisation, if attention is given to strategies that influence the contexts and cultures of teaching practice. Peer review as a strategy for developing excellence in teaching is considered from a holistic perspective that by necessity encompasses all elements of an educational environment, with a focus on the scholarship of teaching. The work is ongoing, has implications for policy, research, teaching development and student outcomes, and has potential application worldwide.
Abstract:
Event report on the Open Access and Research 2013 conference, which focused on recent developments and the strategic advantages they bring to the research sector.
Abstract:
Management of the industrial nations' hazardous waste is a current, exponentially growing and globally threatening problem. Improved environmental information must be obtained and managed concerning the current status, temporal dynamics and potential future status of these critical sites. To test the application of spatial environmental techniques to the problem of hazardous waste sites, a Superfund (CERCLA) test site was chosen in an industrial/urban valley experiencing severe TCE, PCE and CTC groundwater contamination. A paradigm is presented for investigating the spatial/environmental tools available for mapping, monitoring and modelling the environment and its toxic contaminant plumes. This model incorporates a range of technical issues concerning the collection of data as augmented by remote sensing tools, the formatting and storage of data using geographic information systems, and the analysis and modelling of the environment through advanced GIS analysis algorithms and geophysical models of hydrologic transport, including statistical surface generation. This spatially based approach is evaluated against current government/industry standards of operation. Advantages of the spatial approach and lessons learned are discussed.
Abstract:
The complex supply-chain relations of the construction industry, coupled with the substantial amount of information to be shared on a regular basis between the parties involved, make traditional paper-based data interchange methods inefficient, error-prone and expensive. Information technology (IT) applications that enable seamless data interchange elsewhere, such as Electronic Data Interchange (EDI) systems, have generally failed to be implemented successfully in the construction industry. An alternative emerging technology, Extensible Markup Language (XML), is analysed for its applicability to streamlining business processes and improving data interchange methods within the construction industry, and is compared with EDI to identify the strategic advantages XML provides in overcoming the barriers to implementation. In addition, the successful implementation of an XML-based automated data interchange platform for a large organisation, and its proposed benefits, are presented as a case study.
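As a loose illustration of the kind of interchange the paper advocates, the sketch below builds and re-parses a small XML message with Python's standard library; the element names (ProgressClaim, Item, Quantity) are hypothetical and not drawn from any real construction-industry schema or from the case study.

import xml.etree.ElementTree as ET

# Build a hypothetical construction progress-claim message.
claim = ET.Element("ProgressClaim", attrib={"project": "P-1042"})
item = ET.SubElement(claim, "Item", attrib={"code": "CONC-32"})
ET.SubElement(item, "Description").text = "32 MPa concrete supply"
ET.SubElement(item, "Quantity", attrib={"unit": "m3"}).text = "120"

xml_bytes = ET.tostring(claim, encoding="utf-8")

# Unlike a fixed-format EDI message, the receiver can parse this with any
# standard XML parser and simply ignore elements it does not understand.
parsed = ET.fromstring(xml_bytes)
print(parsed.find("Item/Quantity").text)   # -> 120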
Abstract:
Using Media Access Control (MAC) addresses for data collection and tracking is a capable and cost-effective approach, as traditional methods such as surveys and video surveillance have numerous drawbacks and limitations. Positioning cell phones via the Global System for Mobile Communications (GSM) has been considered an attack on people's privacy. A MAC address, by contrast, is merely the unique identifier a WiFi- or Bluetooth-enabled device uses to connect to another device, and has no such potential for privacy infringement. This paper presents the use of MAC address data collection for analysing the spatio-temporal dynamics of human movement in terms of shared-space utilisation. The paper first discusses the critical challenges and key benefits of MAC address data as a tracking technology for monitoring human movement. Proximity-based MAC address tracking is postulated as an effective methodology for analysing the complex spatio-temporal dynamics of human movements in shared zones such as lounge and office areas. A case study of a university staff lounge area is described in detail, and the results indicate significant added value of the methodology for human movement tracking. By analysing the MAC address data from the study area, clear statistics are obtained, such as staff utilisation frequency, peak utilisation periods, and time spent by staff. The analyses also reveal staff socialising profiles in terms of group and solo gatherings. The paper concludes with a discussion of why MAC address tracking offers significant advantages for tracking human behaviour in terms of shared-space utilisation relative to other, more prominent technologies, and outlines some of its remaining deficiencies.
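A minimal sketch (not the authors' actual pipeline) of how raw proximity sightings can be turned into the utilisation statistics described above; the input format, the 300-second visit gap, and all names are assumptions for illustration, and MAC addresses would normally be hashed before storage.

from collections import defaultdict

GAP = 300  # seconds of silence taken to end a visit (assumed threshold)

def visit_stats(sightings):
    """Turn (mac, unix_timestamp) sightings into per-device visit counts
    and total dwell time by splitting each device's timeline at gaps."""
    by_dev = defaultdict(list)
    for mac, ts in sightings:
        by_dev[mac].append(ts)
    stats = {}
    for mac, times in by_dev.items():
        times.sort()
        visits, dwell, start, last = 1, 0, times[0], times[0]
        for ts in times[1:]:
            if ts - last > GAP:        # silence long enough: a new visit
                dwell += last - start
                visits += 1
                start = ts
            last = ts
        dwell += last - start
        stats[mac] = {"visits": visits, "dwell_s": dwell}
    return stats

log = [("dev-a", 0), ("dev-a", 120), ("dev-a", 3600), ("dev-b", 60)]
print(visit_stats(log))   # dev-a: 2 visits, 120 s dwell; dev-b: 1 visit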
Abstract:
Quantitative determination of the modification of primary sediment features by the activity of organisms (i.e., bioturbation) is essential in the geosciences. Methods proposed since the 1960s have been based mainly on visual or subjective determinations. The first semiquantitative evaluations of the Bioturbation Index, Ichnofabric Index, or amount of bioturbation were attempted, in the best cases, using series of flashcards designed for different situations. More recently, more effective methods have involved analytical and computational techniques such as X-rays, magnetic resonance imaging or computed tomography; these methods are complex and often expensive. This paper presents a compilation of different methods, using Adobe® Photoshop® CS6, for digital estimation; together they form the IDIAP (Ichnological Digital Analysis Images Package), an inexpensive alternative to recently proposed methods that is easy to use and especially recommended for core samples. The different methods, namely the “Similar Pixel Selection Method (SPSM)”, the “Magic Wand Method (MWM)” and the “Color Range Selection Method (CRSM)”, entail advantages and disadvantages depending on the sediment (e.g., composition, color, texture, porosity) and the ichnological features (size of traces, infilling material, burrow wall, etc.). The IDIAP provides an estimation of the amount of trace fossils produced by a particular ichnotaxon, by a whole ichnocoenosis, or even for a complete ichnofabric. We recommend applying the complete IDIAP to a given case study and then selecting the most appropriate method. The IDIAP was applied to core material recovered during IODP Expedition 339, enabling us, for the first time, to arrive at a quantitative estimation of the discrete trace fossil assemblage in core samples.
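The IDIAP methods are interactive Photoshop CS6 selections, but a rough programmatic analogue of the Color Range Selection Method can convey the underlying idea: count the pixels within a tolerance of a reference trace-fill colour and report them as a percentage of the imaged core surface. The sketch below uses Pillow; the function name, file name and colour values are illustrative assumptions.

from PIL import Image

def bioturbation_percent(path, ref_rgb, tol=30):
    """Percentage of pixels within +/- tol of the reference fill colour,
    a crude stand-in for Photoshop's interactive colour-range selection."""
    img = Image.open(path).convert("RGB")
    pixels = list(img.getdata())
    hits = sum(
        1 for (r, g, b) in pixels
        if abs(r - ref_rgb[0]) <= tol
        and abs(g - ref_rgb[1]) <= tol
        and abs(b - ref_rgb[2]) <= tol
    )
    return 100.0 * hits / len(pixels)

# e.g. dark grey burrow fills against a lighter host sediment:
# print(bioturbation_percent("core_section.png", ref_rgb=(70, 70, 75)))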
Abstract:
We study the multicast stream authentication problem when an opponent can drop, reorder and inject data packets into the communication channel. In such a model, packet overhead and computational efficiency are two parameters to be taken into account when designing a multicast stream authentication protocol. In this paper, we propose to use two families of erasure codes to deal with this problem, namely rateless codes and maximum distance separable (MDS) codes. Our constructions have the following advantages. First, the packet overhead is small. Second, the number of signature verifications to be performed at the receiver is O(1). Third, every receiver is able to recover all the original data packets emitted by the sender despite the losses and injections that occur during transmission.
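A minimal sketch of the amortised-authentication idea under stated assumptions: hash every packet, concatenate the digests (in a real protocol this blob would also carry one signature over the digest list; signing is omitted here), and spread the blob across the packets with an erasure code so receivers can authenticate everything despite losses. A toy (3,2) XOR code stands in for the rateless/MDS codes the paper actually employs: any two of the three shares reconstruct the blob.

import hashlib

def encode_3_2(blob):
    """Toy (3,2) MDS erasure code: halves a, b plus parity a XOR b.
    Any two of the three shares suffice to rebuild the blob."""
    if len(blob) % 2:
        blob += b"\x00"                     # pad to an even length
    half = len(blob) // 2
    a, b = blob[:half], blob[half:]
    parity = bytes(x ^ y for x, y in zip(a, b))
    return [a, b, parity]

def decode_3_2(shares):
    """Rebuild the blob from the shares, at most one of which is None."""
    a, b, p = shares
    if a is None:
        a = bytes(x ^ y for x, y in zip(b, p))
    if b is None:
        b = bytes(x ^ y for x, y in zip(a, p))
    return a + b                            # padding, if any, to be trimmed

packets = [b"pkt-0 payload", b"pkt-1 payload", b"pkt-2 payload"]
digest_blob = b"".join(hashlib.sha256(p).digest() for p in packets)

shares = encode_3_2(digest_blob)            # one share rides on each packet
shares[1] = None                            # packet 1 is lost in transit
assert decode_3_2(shares) == digest_blob    # receiver still verifies all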
Abstract:
A sub-domain smoothed Galerkin method is proposed to integrate the advantages of the mesh-free Galerkin method and the FEM. Arbitrarily shaped sub-domains are predefined in the problem domain with mesh-free nodes. In each sub-domain, based on the mesh-free Galerkin weak formulation, the local discrete equations can be obtained by using moving Kriging interpolation, which is similar to the discretisation of high-order finite elements. A strain smoothing technique is subsequently applied to the nodal integration of each sub-domain by dividing the sub-domain into several smoothing cells. Moreover, condensation of DOFs can be introduced into the local discrete equations to improve computational efficiency. The global governing equations of the present method are obtained, following the scheme of the FEM, by assembling the local discrete equations of all sub-domains. The mesh-free properties of the Galerkin method are retained in each sub-domain. Several 2D elastic problems have been solved with this newly proposed method to validate its computational performance. These numerical examples show that the proposed sub-domain smoothed Galerkin method is a robust technique for solving solid mechanics problems, characterised by high computational efficiency, good accuracy, and good convergence.
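For reference, the strain smoothing operation mentioned above is conventionally written as follows (the standard smoothed-Galerkin form over a smoothing cell $\Omega_c$ of area $A_c$ with boundary $\Gamma_c$; notation assumed here, not reproduced from this paper):

% smoothed strain at cell point x_c; the divergence theorem converts the
% domain integral into a boundary integral over the smoothing cell
\tilde{\varepsilon}_{ij}(\mathbf{x}_c)
  = \frac{1}{A_c} \int_{\Omega_c} \varepsilon_{ij}(\mathbf{x}) \, d\Omega
  = \frac{1}{2 A_c} \oint_{\Gamma_c} \bigl( u_i n_j + u_j n_i \bigr) \, d\Gamma

Only boundary integrals of the shape functions are then needed in each smoothing cell, which is what makes the nodal integration inexpensive.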
Abstract:
With increasing signs of climate change and the influence of national and international carbon-related laws and agreements, governments all over the world are grappling with how to transition rapidly to low-carbon living. This includes adapting to the impacts of climate change that are very likely to be experienced due to current emission levels (including extreme weather and sea-level changes), and mitigating further growth in greenhouse gas emissions that would result in further impacts. Internationally, the concept of 'Biophilic Urbanism', a term coined by Professors Tim Beatley and Peter Newman to refer to the use of natural elements as design features in urban landscapes, is emerging as a key component in addressing such climate change challenges in rapidly growing urban contexts. However, the economics of incorporating such options is not well understood and requires further attention to underpin the mainstreaming of biophilic urbanism. Indeed, there appears to be an ad hoc, reactionary approach to creating economic arguments for or against the design, installation or maintenance of natural elements such as green walls, green roofs, streetscapes and parklands. With this issue in mind, this paper overviews research, undertaken as part of an industry collaborative research project, that considers the potential for using a number of environmental economic valuation techniques, which have evolved over the last several decades in agricultural and resource economics, to systematically estimate the economic value of biophilic elements in the urban context. Drawing on the existing literature on environmental economic valuation techniques, the paper highlights opportunities for creating a standardised language for valuing biophilic elements. The conclusions have implications for expanding the field of environmental economic valuation to support economic evaluations and planning for the greater use of natural elements in cities. Insights are also noted for the more mature fields of agricultural and resource economics.
Abstract:
With a view to minimising spiralling labour costs, the concrete masonry industry is developing thin-layer mortar technology (known as thin-bed technology) collaboratively with Queensland University of Technology. Similar technologies are practised in Europe, mainly for clay brick masonry; in the UK, thin-layer mortared concrete masonry has been researched under commercial contract, with limited information published. This paper presents numerous experimental data generated over the past three years. It is shown that this form of masonry requires a special dry-mixed mortar containing a minimum of 2% polymer for improved workability, and blocks with tighter height tolerance, both of which might increase the cost of these constituent materials. However, through semi-skilled labour, tools that dispense and control the thickness of the mortar, and the associated increase in productivity, the overall cost of this form of construction can be reduced. Further, the polymer mortar provides several advantages: (1) improved sustainability due to dry curing, (2) the potential to construct mortar layers of 2 mm thickness, and (3) the ability to mechanise mortar application and control its thickness without the need for skilled labour.