107 results for Transformed functions
Abstract:
This paper provides a commentary on the contribution by Dr Chow, who questioned whether the functions of learning are general across all categories of tasks or whether there are some task-particular aspects to the functions of learning in relation to task type. Specifically, they queried whether principles and practice for the acquisition of sport skills differ from those for musical, industrial, military and human factors skills. In this commentary we argue that ecological dynamics contains general principles of motor learning that can be instantiated in specific performance contexts to underpin learning design. In this proposal, we highlight the importance of conducting skill acquisition research in sport, rather than relying on empirical outcomes of research from a variety of different performance contexts. Here we discuss how task constraints of different performance contexts (sport, industry, military, music) provide different specific information sources that individuals couple their actions to when performing and acquiring skills. We conclude by suggesting that this relationship between performance task constraints and learning processes might help explain the traditional emphasis on performance curves and performance outcomes to infer motor learning.
Abstract:
The current study examined the structure of the Volunteer Functions Inventory within a sample of older individuals (N = 187). The career items were replaced with items examining the concept of continuity of work, a potentially more useful and relevant concept for this population. Factor analysis supported a four-factor solution, with values, social and continuity emerging as single factors, and enhancement and protective items loading together on a single factor. Understanding items did not load highly on any factor. The values and continuity functions were the only dimensions to emerge as predictors of intention to volunteer. This research has important implications for understanding the motivation of older adults to engage in contemporary volunteering settings.
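A minimal sketch of the kind of exploratory factor analysis reported above, run on synthetic Likert-style responses: the item count, response scale and varimax rotation are assumptions made for illustration, since the study's actual data and software are not specified.

```python
# Illustrative four-factor exploratory factor analysis on synthetic data.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n_respondents, n_items = 187, 20   # 187 matches the sample size; 20 items is assumed
responses = rng.integers(1, 8, size=(n_respondents, n_items)).astype(float)  # 1-7 Likert

fa = FactorAnalysis(n_components=4, rotation="varimax")
fa.fit(responses)

# components_[i, j] is the loading of item j on factor i; items loading
# above roughly 0.4 on a factor are conventionally assigned to it.
print(fa.components_.shape)        # (4, 20)
```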
Abstract:
Robust hashing is an emerging field that can be used to hash certain data types in applications unsuitable for traditional cryptographic hashing methods. Traditional hashing functions have been used extensively for data/message integrity, data/message authentication, efficient file identification and password verification. These applications are possible because the hashing process is compressive, allowing for efficient comparisons in the hash domain, but non-invertible, meaning hashes can be used without revealing the original data. These techniques were developed with deterministic (non-changing) inputs such as files and passwords. For such data types a 1-bit or one-character change can be significant; as a result, the hashing process is sensitive to any change in the input. Unfortunately, there are certain applications where input data are not perfectly deterministic and minor changes cannot be avoided. Digital images and biometric features are two types of data where such changes exist but do not alter the meaning or appearance of the input. For such data types cryptographic hash functions cannot be usefully applied. In light of this, robust hashing has been developed as an alternative to cryptographic hashing and is designed to be robust to minor changes in the input. Although similar in name, robust hashing is fundamentally different from cryptographic hashing. Current robust hashing techniques are not based on cryptographic methods, but instead on pattern recognition techniques. Modern robust hashing algorithms consist of feature extraction, followed by a randomization stage that introduces non-invertibility and compression, followed by quantization and binary encoding to produce a binary hash output. In order to preserve the robustness of the extracted features, most randomization methods are linear, and this is detrimental to the security properties required of hash functions. Furthermore, the quantization and encoding stages used to binarize real-valued features require the learning of appropriate quantization thresholds. How these thresholds are learnt has an important effect on hashing accuracy, and the mere presence of such thresholds is a source of information leakage that can reduce hashing security. This dissertation outlines a systematic investigation of the quantization and encoding stages of robust hash functions. While existing literature has focused on the importance of the quantization scheme, this research is the first to emphasise the importance of quantizer training on both hashing accuracy and hashing security. The quantizer training process is presented in a statistical framework which allows a theoretical analysis of the effects of quantizer training on hashing performance. This is experimentally verified using a number of baseline robust image hashing algorithms over a large database of real-world images. This dissertation also proposes a new randomization method for robust image hashing based on Higher Order Spectra (HOS) and Radon projections. The method is non-linear, and this is an essential requirement for non-invertibility. The method is also designed to produce features more suited to quantization and encoding. The system can operate without the need for quantizer training, is more easily encoded and displays improved hashing performance when compared to existing robust image hashing algorithms. The dissertation also shows how the HOS method can be adapted to work with biometric features obtained from 2D and 3D face images.
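The four-stage pipeline described above (feature extraction, randomization, quantization, binary encoding) can be sketched as follows; the particular choices here (block-mean features, a keyed Gaussian projection, a median threshold) are illustrative assumptions, not the dissertation's algorithms.

```python
# Minimal sketch of a feature -> randomize -> quantize -> encode robust hash.
import numpy as np

def robust_hash(image: np.ndarray, n_bits: int = 64, seed: int = 0) -> np.ndarray:
    # 1. Feature extraction: coarse 8x8 block means are robust to small changes.
    h, w = image.shape
    blocks = image[: h - h % 8, : w - w % 8].reshape(h // 8, 8, w // 8, 8)
    features = blocks.mean(axis=(1, 3)).ravel()

    # 2. Randomization: a keyed linear projection compresses the features
    #    (linear, hence subject to the security concerns noted above).
    rng = np.random.default_rng(seed)            # seed plays the role of a secret key
    projected = rng.normal(size=(n_bits, features.size)) @ features

    # 3-4. Quantization and binary encoding: threshold at the median, so the
    #    only "quantizer training" is an order statistic of the projected values.
    return (projected > np.median(projected)).astype(np.uint8)

img = np.random.default_rng(1).random((64, 64))  # stand-in for a grayscale image
print(robust_hash(img)[:16])
```

The linear projection here reproduces exactly the robustness/security tension discussed above; the dissertation's HOS/Radon proposal replaces this stage with a non-linear one.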
Abstract:
This thesis investigated the viability of using Frequency Response Functions in combination with an Artificial Neural Network technique for damage assessment of building structures. The proposed approach can help overcome some of the limitations associated with previously developed vibration-based methods and assist in delivering more accurate and robust damage identification results. Excellent results were obtained for damage identification in the case studies, demonstrating that the proposed approach was developed successfully.
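A hedged sketch of the Frequency Response Function plus neural network idea: a classifier is trained to map FRF samples to a damage label. Everything below (the single-degree-of-freedom FRF model, the frequency-shift damage model, the network size) is assumed for illustration; the thesis's structures and network are not reproduced.

```python
# Train a small neural network to classify synthetic FRFs as healthy/damaged.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
freqs = np.linspace(1.0, 50.0, 200)               # Hz, assumed measurement band

def frf_magnitude(fn: float, zeta: float = 0.02) -> np.ndarray:
    # Magnitude of a single-DOF receptance FRF with natural frequency fn.
    r = freqs / fn
    return 1.0 / np.sqrt((1 - r**2) ** 2 + (2 * zeta * r) ** 2)

# Damage is modelled (purely for illustration) as a stiffness loss that
# shifts the natural frequency down; label 1 = damaged, 0 = healthy.
X, y = [], []
for _ in range(400):
    damaged = int(rng.integers(0, 2))
    fn = (18.0 if damaged else 20.0) + rng.normal(0, 0.3)
    X.append(frf_magnitude(fn) + rng.normal(0, 0.05, freqs.size))  # add noise
    y.append(damaged)

clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
clf.fit(np.array(X[:300]), y[:300])
print("held-out accuracy:", clf.score(np.array(X[300:]), y[300:]))
```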
Abstract:
Australian TV News: New Forms, Functions, and Futures examines the changing relationships between television, politics and popular culture. Drawing extensively on qualitative audience research and industry interviews, this book demonstrates that while ‘infotainment’ and satirical programmes may not follow the journalism orthodoxy (or, in some cases, reject it outright), they nevertheless play an important role in the way everyday Australians understand what is happening in the world. This therefore throws into question some longstanding assumptions about what form TV news should take, the functions it ought to serve, and the future prospects of the fourth estate.
Abstract:
Whether to keep products segregated (e.g., unbundled) or integrate some or all of them (e.g., bundle) has been a problem of profound interest in areas such as portfolio theory in finance, risk capital allocations in insurance and the marketing of consumer products. Such decisions are inherently complex and depend on factors such as the underlying product values and consumer preferences, the latter being frequently described using value functions, also known as utility functions in economics. In this paper, we develop decision rules for multiple products, which we generally call ‘exposure units’ to naturally cover manifold scenarios spanning well beyond ‘products’. Our findings show, for example, that Thaler's celebrated principles of mental accounting hold as originally postulated when the values of all exposure units are positive (i.e. all are gains) or all negative (i.e. all are losses). In the case of exposure units with mixed-sign values, the decision rules are much more complex and rely on cataloguing a number of cases (the Bell number) that grows very quickly with the number of exposure units. Consequently, in the present paper, we provide detailed rules for the integration and segregation decisions in the case of up to three exposure units, and partial rules for an arbitrary number of units.
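For two exposure units, the same-sign rules can be made concrete with a standard prospect-theory value function (concave over gains, convex and steeper over losses); the parameter values below are Tversky and Kahneman's 1992 estimates, not necessarily the value functions used in the paper.

```python
# Segregate-vs-integrate decision under a prospect-theory value function.
def value(x: float, alpha: float = 0.88, beta: float = 0.88, lam: float = 2.25) -> float:
    # Concave over gains; convex and loss-averse (steeper) over losses.
    return x**alpha if x >= 0 else -lam * (-x) ** beta

def prefer_segregation(x: float, y: float) -> bool:
    # Segregate iff the summed separate values exceed the integrated value.
    return value(x) + value(y) > value(x + y)

print(prefer_segregation(50, 50))     # True: segregate two gains
print(prefer_segregation(-50, -50))   # False: integrate two losses
print(prefer_segregation(100, -20))   # mixed sign: outcome is case-dependent
```

Concavity over gains makes two separate gains feel better than their sum (segregate), while convexity over losses favors one combined loss (integrate); with mixed signs, as the abstract notes, the answer depends on the case.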
Abstract:
We define a pair-correlation function that can be used to characterize spatiotemporal patterning in experimental images and snapshots from discrete simulations. Unlike previous pair-correlation functions, the pair-correlation functions developed here depend on the location and size of objects. The pair-correlation function can be used to indicate complete spatial randomness, aggregation or segregation over a range of length scales, and it quantifies spatial structures such as the shape, size and distribution of clusters. Comparing pair-correlation data for various experimental and simulation images illustrates the function's potential use as a summary statistic for calibrating discrete models of various physical processes.
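A minimal sketch of a conventional distance-based pair-correlation function for planar point data, i.e. the kind of baseline the paper generalizes (its version also depends on object location and size; edge effects are ignored in this sketch).

```python
# Pairwise-distance histogram normalised by the expectation under complete
# spatial randomness (CSR); g(r) ~ 1 indicates CSR, > 1 aggregation, < 1 segregation.
import numpy as np

def pair_correlation(points: np.ndarray, bins: np.ndarray, area: float) -> np.ndarray:
    n = len(points)
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    d = d[np.triu_indices(n, k=1)]                 # unique pairs only
    counts, edges = np.histogram(d, bins=bins)
    density = n / area
    shell = np.pi * (edges[1:] ** 2 - edges[:-1] ** 2)  # annulus areas
    expected = 0.5 * n * density * shell           # CSR expectation (no edge correction)
    return counts / expected

pts = np.random.default_rng(0).random((500, 2))    # CSR points on the unit square
print(pair_correlation(pts, np.linspace(0.01, 0.2, 10), area=1.0))
```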
Abstract:
Currently, mass spectrometry-based metabolomics studies extend beyond conventional chemical categorization and metabolic phenotype analysis to understanding gene function in various biological contexts (e.g., mammalian, plant, and microbial). These novel utilities have led to many innovative discoveries in the following areas: disease pathogenesis, therapeutic pathway or target identification, the biochemistry of animal and plant physiological and pathological activities in response to diverse stimuli, and molecular signatures of host-pathogen interactions during microbial infection. In this review, we critically evaluate representative applications of mass spectrometry-based metabolomics to better understand gene function in diverse biological contexts, with special emphasis on working principles, study protocols, and possible future development of this technique. Collectively, this review raises awareness within the biomedical community of the scientific value and applicability of mass spectrometry-based metabolomics strategies for better understanding gene function, thus advancing this application's utility in a broad range of biological fields.
Abstract:
Sequences with optimal correlation properties are much sought after for applications in communication systems. In 1980, Alltop (IEEE Trans. Inf. Theory 26(3):350-354, 1980) described a set of sequences based on a cubic function and showed that these sequences were optimal with respect to the known bounds on auto- and cross-correlation. Subsequently these sequences were used to construct mutually unbiased bases (MUBs), a structure of importance in quantum information theory. The key feature of this cubic function is that its difference function is a planar function. Functions with planar difference functions have been called Alltop functions. This paper provides a new family of Alltop functions and establishes the use of Alltop functions for the construction of sequence sets and MUBs.
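The classic cubic construction can be reproduced in a few lines: for a prime p ≥ 5, the sequences x_k(t) = p^{-1/2} e^{2πi(t+k)^3/p} have pairwise inner products of modulus exactly 1/√p, because the phase difference (t+k)^3 − (t+l)^3 is a quadratic (planar) function of t, giving a Gauss sum of magnitude √p. A sketch verifying this numerically:

```python
# Verify the flat cross-correlation of shifted cubic (Alltop-type) sequences.
import numpy as np

p = 7                                    # any prime >= 5
t = np.arange(p)
seqs = np.array([np.exp(2j * np.pi * ((t + k) ** 3 % p) / p) / np.sqrt(p)
                 for k in range(p)])

gram = np.abs(seqs @ seqs.conj().T)      # |<x_k, x_l>| for all pairs
print(np.allclose(np.diag(gram), 1.0))                  # unit norm
off_diag = gram[~np.eye(p, dtype=bool)]
print(np.allclose(off_diag, 1 / np.sqrt(p)))            # all exactly 1/sqrt(p)
```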
Abstract:
This paper presents a comprehensive formal security framework for key derivation functions (KDFs). The major security goal for a KDF is to produce cryptographic keys from a private seed value such that the derived cryptographic keys are indistinguishable from random binary strings. We form a framework of five security models for KDFs. This consists of four security models that we propose: Known Public Inputs Attack (KPM, KPS), Adaptive Chosen Context Information Attack (CCM) and Adaptive Chosen Public Inputs Attack (CPM); and another security model, previously defined by Krawczyk [6], which we refer to as Adaptive Chosen Context Information Attack (CCS). These security models are simulated using an indistinguishability game. In addition, we prove the relationships between these five security models and analyse KDFs using the framework (in the random oracle model).
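As background to the security goal stated above, the sketch below shows one round of a KDF indistinguishability game around an HKDF-style extract-then-expand construction built from HMAC-SHA256; this is a generic illustration under assumed parameters, not the paper's framework or the specific KDFs it analyses.

```python
# One round of a KDF indistinguishability game (generic illustration).
import hashlib, hmac, os, secrets

def kdf(seed: bytes, context: bytes, length: int = 32) -> bytes:
    # Extract: compress the seed into a pseudorandom key (fixed salt is illustrative).
    prk = hmac.new(b"salt", seed, hashlib.sha256).digest()
    # Expand: iterate HMAC over the context and a counter until enough output.
    out, block, counter = b"", b"", 1
    while len(out) < length:
        block = hmac.new(prk, block + context + bytes([counter]),
                         hashlib.sha256).digest()
        out += block
        counter += 1
    return out[:length]

# Challenger flips a bit and returns either the derived key or random bytes;
# a KDF is secure in a given model if no efficient adversary, using the public
# inputs that model allows, can guess b with probability much above 1/2.
seed = secrets.token_bytes(32)
b = secrets.randbelow(2)
challenge = kdf(seed, b"context") if b else os.urandom(32)
print(challenge.hex()[:16])
```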
Abstract:
The importance of applying unsaturated soil mechanics to geotechnical engineering design is well understood. However, the time consumed and the need for specific laboratory testing apparatus when measuring unsaturated soil properties have limited the application of unsaturated soil mechanics theories in practice. Although methods for predicting unsaturated soil properties have been developed, verification of these methods for a wide range of soil types is required in order to increase the confidence of practicing engineers in using them. In this study, a new permeameter was developed to measure the hydraulic conductivity of unsaturated soils using the steady-state method and directly measured suction (negative pore-water pressure) values. The apparatus is instrumented with two tensiometers for the direct measurement of suction during the tests, and can be used to obtain the hydraulic conductivity function of sandy soil over a low suction range (0-10 kPa). First, the repeatability of the unsaturated hydraulic conductivity measurement using the new permeameter was verified by conducting tests on two identical sandy soil specimens and obtaining similar results. The hydraulic conductivity functions of the two sandy soils were then measured during the drying and wetting processes of the soils. A significant hysteresis was observed when the hydraulic conductivity was plotted against suction; however, the hysteresis effects were not apparent when the conductivity was plotted against the volumetric water content. Furthermore, the measured unsaturated hydraulic conductivity functions were compared with predictions from three different predictive methods that are widely incorporated into numerical software. The results suggest that these predictive methods are capable of capturing the measured behavior with reasonable agreement.
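As an example of the kind of predictive method compared in the study, the sketch below implements the widely used van Genuchten-Mualem model, which predicts the relative hydraulic conductivity function from water-retention parameters; the abstract does not name the three methods, so this choice, and the parameter values, are assumptions.

```python
# van Genuchten-Mualem prediction of relative hydraulic conductivity k_r(psi).
import numpy as np

def vg_mualem_kr(suction_kpa: np.ndarray, alpha: float = 0.5, n: float = 3.0) -> np.ndarray:
    # alpha [1/kPa] and n [-] are illustrative retention parameters for a sand.
    m = 1.0 - 1.0 / n
    se = (1.0 + (alpha * suction_kpa) ** n) ** (-m)      # effective saturation
    return np.sqrt(se) * (1.0 - (1.0 - se ** (1.0 / m)) ** m) ** 2

suction = np.linspace(0.0, 10.0, 6)    # the paper's low suction range, 0-10 kPa
print(vg_mualem_kr(suction))           # multiply by k_sat to get conductivity
```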
Abstract:
Many aspects of China's academic publishing system differ from the systems found in the liberal market-based economies of the United States, Western Europe and Australia. A high level of government intervention in both the publishing industry and academia, and the challenges associated with attempting a transition from a centrally controlled towards a more market-based publishing industry, are two notable differences; however, as in other countries, academic communities and publishers are being transformed by digital technologies. This research explores the complex yet dynamic digital transformation of academic publishing in China, with a specific focus on the open and networked initiatives inspired by Web 2.0 and social media. The thesis draws on two case studies: Science Paper Online, a government-operated online preprint platform and open access mandate; and New Science, a social reference management website operated by a group of young PhD students. Its analysis of the innovations, business models, operating strategies, influences, and difficulties faced by these two initiatives highlights important characteristics and trends in digital publishing experiments in China. The central argument of this thesis is that the open and collaborative possibilities of Web 2.0-inspired initiatives are emerging outside the established journal and monograph publishing system in China, introducing innovative and somewhat disruptive approaches to the certification, communication and commercial exploitation of knowledge. Moreover, emerging publishing models are enabling and encouraging a new system of practising and communicating science in China, putting into practice some elements of the Open Science ethos. There is evidence of both disruptive change to old publishing structures and the adaptive modification of emergent replacements in Chinese practice. As such, the transformation from traditional to digital and interactive modes of publishing involves both competition and convergence between new and old publishers, as well as dynamics of co-evolution involving new technologies, business models, social norms, and government reform agendas. One key concern driving this work is whether there are new opportunities and new models for academic publishing in the Web 2.0 age and social media environment which might allow the basic functions of communication and certification to be achieved more effectively. This thesis enriches existing knowledge of the open and networked transformations of scholarly publishing by adding a Chinese story. Although the development of open and networked publishing platforms in China remains in its infancy, the lessons provided by this research are relevant to practitioners and stakeholders interested in understanding the transformative dynamics of networked technologies for publishing and in advocating open access in practice, not only in China but also internationally.
Abstract:
Polycrystalline gold electrodes of the kind that are routinely used in analysis and catalysis in aqueous media are often regarded as exhibiting relatively simple double-layer charging/discharging and monolayer oxide formation/removal in the positive potential region. Application of the large amplitude Fourier transformed alternating current (FT-ac) voltammetric technique, which allows the faradaic current contribution of fast electron-transfer processes to be emphasized in the higher harmonic components, has revealed the presence of well-defined faradaic (premonolayer oxidation) processes at positive potentials in the double-layer region in acidic and basic media, which are enhanced by electrochemical activation. These underlying quasi-reversible interfacial electron-transfer processes may mediate the course of electrocatalytic oxidation reactions of hydrazine, ethylene glycol, and glucose on gold electrodes in aqueous media. The observed responses support key assumptions associated with the incipient hydrous oxide adatom mediator (IHOAM) model of electrocatalysis.
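The harmonic-isolation step at the heart of the FT-ac technique can be sketched as follows: Fourier transform the measured current, select the band around the k-th multiple of the modulation frequency, and inverse transform. The 21.46 Hz modulation frequency is borrowed from the related abstract below, and the synthetic signal merely stands in for measured data.

```python
# Isolate the k-th ac harmonic of a voltammetric current by FFT band selection.
import numpy as np

fs, f_mod, t_max = 1000.0, 21.46, 10.0        # sampling rate (Hz), modulation (Hz), duration (s)
t = np.arange(0, t_max, 1 / fs)
# Stand-in "current": a slow dc ramp plus decaying harmonics and noise.
current = (0.1 * t
           + sum(np.exp(-k) * np.sin(2 * np.pi * k * f_mod * t) for k in range(1, 6))
           + 0.01 * np.random.default_rng(0).normal(size=t.size))

def extract_harmonic(i: np.ndarray, k: int, half_width: float = 5.0) -> np.ndarray:
    spectrum = np.fft.rfft(i)
    freqs = np.fft.rfftfreq(i.size, 1 / fs)
    band = np.abs(freqs - k * f_mod) < half_width   # rectangular window around k*f_mod
    return np.fft.irfft(spectrum * band, n=i.size)  # k-th harmonic versus time

fifth = extract_harmonic(current, 5)                # the low-background harmonic
print(np.abs(fifth).max())
```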
Abstract:
An analytical evaluation of the higher ac harmonic components derived from large amplitude Fourier transformed voltammetry is provided for the reversible oxidation of ferrocenemethanol (FcMeOH) and the oxidation of uric acid by an EEC mechanism in a pH 7.4 phosphate buffer at a glassy carbon (GC) electrode. The small background current in the analytically optimal fifth harmonic is predominantly attributed to faradaic current associated with the presence of electroactive functional groups on the GC electrode surface, rather than to the capacitive current which dominates the background in the dc and the initial three ac harmonics. The detection limits for the dc and the first to fifth harmonic ac components for FcMeOH are 1.9, 5.89, 2.1, 2.5, 0.8, and 0.5 µM, respectively, using a sine wave modulation of 100 mV at 21.46 Hz and a dc sweep rate of 111.76 mV s−1. Analytical performance then progressively deteriorates in the sixth and higher harmonics. For the determination of uric acid, the capacitive background current was enhanced and the reproducibility lowered by the presence of surface-active uric acid, but the rapid overall 2e− rather than 1e− electron transfer process gives rise to a significantly enhanced fifth harmonic faradaic current, which enabled a detection limit of 0.3 µM to be achieved, similar to that reported using chemically modified electrodes. Resolution of overlapping voltammetric signals for a mixture of uric acid and dopamine is also achieved using the higher fourth or fifth harmonic components under very low background current conditions. The use of the higher fourth and fifth harmonics, which exhibit highly favorable faradaic-to-background (noise) current ratios, should therefore be considered in analytical applications under circumstances where the electron transfer rate is fast.
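Detection limits like those quoted above are typically estimated as three times the standard deviation of the blank response divided by the calibration slope; the numbers in the sketch below are invented for illustration and are not taken from the paper.

```python
# Standard LOD estimate from a linear calibration: LOD = 3 * sigma_blank / slope.
import numpy as np

conc = np.array([1.0, 2.0, 5.0, 10.0, 20.0])     # uM standards (hypothetical)
peak = np.array([0.21, 0.40, 1.02, 2.05, 4.01])  # harmonic peak current, uA (hypothetical)
slope, _ = np.polyfit(conc, peak, 1)             # linear calibration fit
sigma_blank = 0.012                              # uA, blank repeatability (hypothetical)
print("LOD ~ %.2f uM" % (3 * sigma_blank / slope))
```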