902 results for Boolean Functions, Nonlinearity, Evolutionary Computation, Equivalence Classes
Abstract:
This paper provides a commentary on the contribution by Dr Chow, who questioned whether the functions of learning are general across all categories of tasks or whether there are task-particular aspects to the functions of learning in relation to task type. Specifically, they queried whether principles and practice for the acquisition of sport skills differ from those for musical, industrial, military and human factors skills. In this commentary we argue that ecological dynamics contains general principles of motor learning that can be instantiated in specific performance contexts to underpin learning design. In this proposal, we highlight the importance of conducting skill acquisition research in sport, rather than relying on empirical outcomes of research from a variety of different performance contexts. Here we discuss how the task constraints of different performance contexts (sport, industry, military, music) provide different specific information sources to which individuals couple their actions when performing and acquiring skills. We conclude by suggesting that this relationship between performance task constraints and learning processes might help explain the traditional emphasis on performance curves and performance outcomes to infer motor learning.
Abstract:
For the timber industry, the ability to simulate the drying of wood is invaluable for manufacturing high quality wood products. Mathematically, however, modelling the drying of a wet porous material, such as wood, is a difficult task due to its heterogeneous and anisotropic nature, and the complex geometry of the underlying pore structure. The well-developed macroscopic modelling approach involves writing down classical conservation equations at a length scale where physical quantities (e.g., porosity) can be interpreted as averaged values over a small volume (typically containing hundreds or thousands of pores). This averaging procedure produces balance equations that resemble those of a continuum, with the exception that effective coefficients appear in their definitions. Exponential integrators are numerical schemes for initial value problems involving a system of ordinary differential equations. These methods differ from popular Newton-Krylov implicit methods (i.e., those based on the backward differentiation formulae (BDF)) in that they do not require the solution of a system of nonlinear equations at each time step but rather require computation of matrix-vector products involving the exponential of the Jacobian matrix. Although originally appearing in the 1960s, exponential integrators have recently experienced a resurgence in interest due to advances in Krylov subspace methods for matrix function approximation. One of the simplest examples of an exponential integrator is the exponential Euler method (EEM), which requires, at each time step, approximation of φ(A)b, where φ(z) = (e^z - 1)/z, A ∈ R^(n×n) and b ∈ R^n. For drying in porous media, the most comprehensive macroscopic formulation is TransPore [Perre and Turner, Chem. Eng. J., 86: 117-131, 2002], which features three coupled, nonlinear partial differential equations. The focus of the first part of this thesis is the use of the exponential Euler method (EEM) for performing the time integration of the macroscopic set of equations featured in TransPore. In particular, a new variable-stepsize algorithm for EEM is presented within a Krylov subspace framework, which allows control of the error during the integration process. The performance of the new algorithm highlights the great potential of exponential integrators not only for drying applications but across all disciplines of transport phenomena. For example, when applied to well-known benchmark problems involving single-phase liquid flow in heterogeneous soils, the proposed algorithm requires half the number of function evaluations required by an equivalent (sophisticated) Newton-Krylov BDF implementation. Furthermore, for all drying configurations tested, the new algorithm always produces, in less computational time, a solution of higher accuracy than the existing backward Euler module featured in TransPore. Some new results relating to Krylov subspace approximation of φ(A)b are also developed in this thesis. Most notably, an alternative derivation of the approximation error estimate of Hochbruck, Lubich and Selhofer [SIAM J. Sci. Comput., 19(5): 1552-1574, 1998] is provided, which reveals why it performs well in the error control procedure. Two of the main drawbacks of the macroscopic approach outlined above are that the effective coefficients must be supplied to the model, and that it fails for some drying configurations where typical dual-scale mechanisms occur.
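As a small illustration of the time-stepping scheme described above, the sketch below implements a fixed-step exponential Euler method for a toy stiff system, evaluating φ(hJ) densely via SciPy's matrix exponential. This is only a sketch under assumed conventions: the thesis uses a variable-stepsize Krylov subspace approximation of φ(A)b for the large sparse Jacobians arising from the TransPore equations, whereas the toy problem, matrices and step size here are invented for illustration.

```python
import numpy as np
from scipy.linalg import expm

def phi(M):
    # Dense evaluation of phi(M) = M^{-1}(e^M - I); assumes M is small and nonsingular.
    # (A Krylov subspace method would replace this for large sparse Jacobians.)
    return np.linalg.solve(M, expm(M) - np.eye(M.shape[0]))

def exponential_euler(f, jac, y0, t0, t1, h):
    """Fixed-step exponential Euler:  y_{n+1} = y_n + h * phi(h*J_n) * f(y_n)."""
    y, t = np.array(y0, dtype=float), t0
    while t < t1 - 1e-12:
        h_step = min(h, t1 - t)
        J = jac(y)
        y = y + h_step * phi(h_step * J) @ f(y)
        t += h_step
    return y

# Toy stiff test problem (not the TransPore equations): y' = A y + g(y)
A = np.array([[-100.0, 1.0], [0.0, -2.0]])
def f(y):   return A @ y + np.array([0.0, np.sin(y[0])])
def jac(y): return A + np.array([[0.0, 0.0], [np.cos(y[0]), 0.0]])

print(exponential_euler(f, jac, y0=[1.0, 1.0], t0=0.0, t1=1.0, h=0.05))
```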
In the second part of this thesis, a new dual-scale approach for simulating wood drying is proposed that couples the porous medium (macroscale) with the underlying pore structure (microscale). The proposed model is applied to the convective drying of softwood at low temperatures and is valid in the so-called hygroscopic range, where hygroscopically held liquid water is present in the solid phase and water exists only as vapour in the pores. Coupling between scales is achieved by imposing the macroscopic gradient on the microscopic field using suitably defined periodic boundary conditions, which allows the macroscopic flux to be defined as an average of the microscopic flux over the unit cell. This formulation provides a first step for moving from the macroscopic formulation featured in TransPore to a comprehensive dual-scale formulation capable of addressing any drying configuration. Simulation results reported for a sample of spruce highlight the potential and flexibility of the new dual-scale approach. In particular, for a given unit cell configuration it is not necessary to supply the effective coefficients prior to each simulation.
Abstract:
In urban locations in Australia and elsewhere, public space may be said to be under attack from developers and also from attempts by civic authorities to oversee and control it (Davis 1995, Mitchell 2003, Watson 2006, Iveson 2006). The use of public space by young people in particular raises issues in Australia and elsewhere in the world. In a context of monitoring and control procedures, young people’s use of public space is often viewed as a threat to the prevailing social order (Loader 1996, White 1998, Crane and Dee 2001). This paper discusses recent technological developments in the surveillance, governance and control of public space used by young people, children and people of all ages.
Abstract:
The current study examined the structure of the volunteer functions inventory within a sample of older individuals (N = 187). The career items were replaced with items examining the concept of continuity of work, a potentially more useful and relevant concept for this population. Factor analysis supported a four factor solution, with values, social and continuity emerging as single factors and enhancement and protective items loading together on a single factor. Understanding items did not load highly on any factor. The values and continuity functions were the only dimensions to emerge as predictors of intention to volunteer. This research has important implications for understanding the motivation of older adults to engage in contemporary volunteering settings.
Abstract:
It is widely recognised that defining trade-offs between greenhouse gas emissions using ‘emission equivalence’ based on global warming potentials (GWPs) referenced to carbon dioxide produces anomalous results when applied to methane. The short atmospheric lifetime of methane, compared to the timescales of CO2 uptake, leads to the greenhouse warming depending strongly on the temporal pattern of emission substitution. We argue that a more appropriate way to consider the relationship between the warming effects of methane and carbon dioxide is to define a ‘mixed metric’ that compares ongoing methane emissions (or reductions) to one-off emissions (or reductions) of carbon dioxide. Quantifying this approach, we propose that a one-off sequestration of 1 t of carbon would offset an ongoing methane emission in the range 0.90–1.05 kg CH4 per year. We present an example of how our approach would apply to rangeland cattle production, and consider the broader context of mitigation of climate change, noting the reverse trade-off would raise significant challenges in managing the risk of non-compliance. Our analysis is consistent with other approaches to addressing the criticisms of GWP-based emission equivalence, but provides a simpler and more robust approach while still achieving close equivalence of climate mitigation outcomes ranging over decadal to multi-century timescales.
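To make the proposed trade-off concrete, here is a rough arithmetic sketch using the range quoted above (0.90-1.05 kg CH4 per year offset by 1 t of one-off carbon sequestration). The 50 kg CH4/yr scenario and variable names are made up for illustration; the carbon-to-CO2 factor is the standard 44/12 mass ratio.

```python
# Convert a hypothetical ongoing methane reduction into its one-off carbon equivalent
LOW, HIGH = 0.90, 1.05          # kg CH4 per year offset by 1 t of one-off C sequestration
C_TO_CO2 = 44.0 / 12.0          # standard mass conversion from carbon to CO2

ongoing_ch4_reduction = 50.0    # hypothetical ongoing reduction, kg CH4 per year
carbon_low = ongoing_ch4_reduction / HIGH   # tonnes C, lower bound
carbon_high = ongoing_ch4_reduction / LOW   # tonnes C, upper bound
print(f"Equivalent one-off sequestration: {carbon_low:.1f}-{carbon_high:.1f} t C "
      f"({carbon_low * C_TO_CO2:.1f}-{carbon_high * C_TO_CO2:.1f} t CO2)")
```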
Four new avian mitochondrial genomes help get to basic evolutionary questions in the Late Cretaceous
Abstract:
Good phylogenetic trees are required to test hypotheses about evolutionary processes. We report four new avian mitochondrial genomes, which, together with an improved method of phylogenetic analysis for vertebrate mt genomes, give results for three questions in avian evolution. The new mt genomes are: magpie goose (Anseranas semipalmata); an owl (morepork, Ninox novaeseelandiae); a basal passerine (rifleman, or New Zealand wren, Acanthisitta chloris); and a parrot (kakapo or owl-parrot, Strigops habroptilus). The magpie goose provides an important new calibration point for avian evolution because the well-studied Presbyornis fossils are on the lineage to ducks and geese, after the separation of the magpie goose. We find, as with other animal mitochondrial genomes, that RY-coding is helpful in adjusting for biases between pyrimidines and between purines. When RY-coding is used at third positions of the codon, the root occurs between paleognath and neognath birds (as expected from morphological and nuclear data). In addition, passerines form a relatively old group in Neoaves, and many modern avian lineages diverged during the Cretaceous. Although many aspects of the avian tree are stable, additional taxon sampling is required.
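As a side note on the RY-coding step mentioned above, the sketch below shows the recoding (purines A/G to R, pyrimidines C/T to Y) applied to third codon positions; the sequence fragment is invented and does not come from the reported genomes.

```python
# Minimal sketch of RY-coding at third codon positions
RY = {"A": "R", "G": "R", "C": "Y", "T": "Y"}

def ry_code_third_positions(seq):
    bases = list(seq.upper())
    for i in range(2, len(bases), 3):          # third position of each codon
        bases[i] = RY.get(bases[i], bases[i])  # leave ambiguity codes untouched
    return "".join(bases)

print(ry_code_third_positions("ATGACCGTTCTA"))  # -> "ATRACYGTYCTR"
```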
Abstract:
Approximate Bayesian computation has become an essential tool for the analysis of complex stochastic models when the likelihood function is numerically unavailable. However, the well-established statistical method of empirical likelihood provides another route to such settings that bypasses simulations from the model and the choices of the approximate Bayesian computation parameters (summary statistics, distance, tolerance), while being convergent in the number of observations. Furthermore, bypassing model simulations may lead to significant time savings in complex models, for instance those found in population genetics. The Bayesian computation with empirical likelihood algorithm we develop in this paper also provides an evaluation of its own performance through an associated effective sample size. The method is illustrated using several examples, including estimation of standard distributions, time series, and population genetics models.
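As an illustration of the general idea (not the population-genetics applications in the paper), the sketch below weights prior draws by the empirical likelihood for a simple mean parameter and reports an associated effective sample size. The Gaussian prior, toy data and all numerical settings are assumptions made for this example.

```python
import numpy as np
from scipy.optimize import brentq

def log_empirical_likelihood(theta, x):
    """log EL for a mean: max prod(n*w_i) s.t. sum(w_i)=1, sum(w_i*(x_i-theta))=0.
    Returns -inf when theta lies outside the convex hull of the data."""
    d = x - theta
    if d.max() <= 0 or d.min() >= 0:
        return -np.inf
    lo = -1.0 / d.max() + 1e-10
    hi = -1.0 / d.min() - 1e-10
    g = lambda lam: np.sum(d / (1.0 + lam * d))   # Lagrange-multiplier equation
    lam = brentq(g, lo, hi)
    return -np.sum(np.log1p(lam * d))

def bc_el(x, n_draws=20_000, prior_sd=10.0, seed=0):
    """Bayesian computation with empirical likelihood, importance-sampling flavour:
    draw theta from the prior and weight each draw by EL(theta | data)."""
    rng = np.random.default_rng(seed)
    thetas = rng.normal(0.0, prior_sd, size=n_draws)   # hypothetical N(0, 10^2) prior
    logw = np.array([log_empirical_likelihood(t, x) for t in thetas])
    w = np.exp(logw - logw.max())
    w /= w.sum()
    ess = 1.0 / np.sum(w**2)                           # associated effective sample size
    return np.sum(w * thetas), ess

x = np.random.default_rng(1).normal(2.0, 1.0, size=50)  # toy data with true mean 2
print(bc_el(x))
```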
Abstract:
In this paper we present a new simulation methodology in order to obtain exact or approximate Bayesian inference for models for low-valued count time series data that have computationally demanding likelihood functions. The algorithm fits within the framework of particle Markov chain Monte Carlo (PMCMC) methods. The particle filter requires only model simulations and, in this regard, our approach has connections with approximate Bayesian computation (ABC). However, an advantage of using the PMCMC approach in this setting is that simulated data can be matched with observed data one at a time, rather than attempting to match on the full dataset simultaneously or on a low-dimensional non-sufficient summary statistic, which is common practice in ABC. For low-valued count time series data we find that it is often computationally feasible to match simulated data with observed data exactly. Our particle filter maintains N particles by repeating the simulation until N+1 exact matches are obtained. Our algorithm creates an unbiased estimate of the likelihood, resulting in exact posterior inferences when included in an MCMC algorithm. In cases where exact matching is computationally prohibitive, a tolerance is introduced as per ABC. A novel aspect of our approach is that we introduce auxiliary variables into our particle filter so that partially observed and/or non-Markovian models can be accommodated. We demonstrate that Bayesian model choice problems can be easily handled in this framework.
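A minimal sketch of the exact-matching particle filter idea described above is given below for a toy low-count state-space model. The transition and observation distributions, the parameter values and the N/(R-1) likelihood bookkeeping are illustrative assumptions for this sketch, not the models or exact scheme analysed in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy low-count state-space model (hypothetical):
#   x_t = Binomial(x_{t-1}, alpha) + Poisson(lam)   (latent count)
#   y_t = Binomial(x_t, p_obs)                      (observed count)
def propagate(x_prev, alpha, lam):
    return rng.binomial(x_prev, alpha) + rng.poisson(lam)

def observe(x, p_obs):
    return rng.binomial(x, p_obs)

def exact_match_particle_filter(y, theta, N=100, x0=5, max_tries=200_000):
    """Log-likelihood estimate built from exact matches of simulated observations."""
    alpha, lam, p_obs = theta
    particles = np.full(N, x0)
    loglik = 0.0
    for y_t in y:
        matched, tries = [], 0
        # Simulate until N + 1 exact matches are found; the (N+1)-th match is
        # discarded, and the likelihood increment is estimated as N / (R_t - 1),
        # where R_t is the total number of simulations at this time step.
        while len(matched) < N + 1:
            tries += 1
            if tries > max_tries:
                return -np.inf          # matching too rare under this theta
            x_prop = propagate(particles[rng.integers(N)], alpha, lam)
            if observe(x_prop, p_obs) == y_t:
                matched.append(x_prop)
        particles = np.array(matched[:N])
        loglik += np.log(N) - np.log(tries - 1)
    return loglik

# Usage: simulate a short series and estimate its log-likelihood at the true parameters
theta = (0.6, 1.0, 0.8)
x, ys = 5, []
for _ in range(25):
    x = propagate(x, 0.6, 1.0)
    ys.append(observe(x, 0.8))
print(exact_match_particle_filter(ys, theta))
```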
Abstract:
Background: Studies on the relationship between performance and the design of the throwing frame have been limited and therefore require further investigation. Objectives: The specific objectives were to provide benchmark information about the performance and whole-body positioning of male athletes in the F30s classes. Study Design: Descriptive analysis. Methods: A total of 48 attempts performed by 12 stationary discus throwers in the F33 and F34 classes during the seated discus throwing event of the 2002 International Paralympic Committee Athletics World Championships were analysed in this study. Whole-body positioning included overall throwing posture (i.e. number of points of contact between the thrower and the frame, body position, throwing orientation and throwing side) and lower limb placement (i.e. seating arrangements, points of contact of both feet, and type of attachment of both legs and feet). Results: Three (25%), five (42%), one (8%) and three (25%) athletes used three, four, five and six points of contact, respectively. Seven (58%) and five (42%) athletes threw from a standing or a seated position, respectively. A straddle, a stool or a chair was used by six (50%), four (33%) or two (17%) throwers, respectively. Conclusions: This study provides key information for a better understanding of the interaction between the throwing technique of elite seated throwers and their throwing frame.
Abstract:
This thesis investigates patterns of evolution in a group of native Australo-Papuan rodents. Past climatic change, associated sea level fluctuations, and the fragmentation of wet forests in eastern Australia have facilitated rapid radiation, diversification and speciation in this group. This study adds to our understanding of the evolution of Australia’s rainforest fauna and describes the evolutionary relationships of a new genus of Australian rodent.
Abstract:
This PhD study has examined the population genetics of the Russian wheat aphid (RWA, Diuraphis noxia), one of the world’s most invasive agricultural pests, throughout its native and introduced global range. Firstly, this study investigated the geographic distribution of genetic diversity within and among RWA populations in western China. Analysis of mitochondrial data from 18 sites provided evidence for the long-term existence and expansion of RWAs in western China. The results refute the hypothesis that RWA is an exotic species only present in China since 1975. The estimated date of RWA expansion throughout western China coincides with the debut of wheat domestication and cultivation practices in western Asia in the Holocene. It is concluded that western China represents the limit of the far eastern native range of this species. Analysis of microsatellite data indicated high contemporary gene flow among northern populations in western China, while clear geographic isolation between northern and southern populations was identified across the Tianshan mountain range and extensive desert regions. Secondly, this study analyzed the worldwide pathway of invasion using both microsatellite and endosymbiont genetic data. Individual RWAs were obtained from native populations in Central Asia and the Middle East and invasive populations in Africa and the Americas. Results indicated two pathways of RWA invasion from 1) Syria in the Middle East to North Africa and 2) Turkey to South Africa, Mexico and then North and South America. Very little clone diversity was identified among invasive populations suggesting that a limited founder event occurred together with predominantly asexual reproduction and rapid population expansion. The most likely explanation for the rapid spread (within two years) from South Africa to the New World is by human movement, probably as a result of the transfer of wheat breeding material. Furthermore, the mitochondrial data revealed the presence of a universal haplotype and it is proposed that this haplotype is representative of a wheat associated super-clone that has gained dominance worldwide as a result of the widespread planting of domesticated wheat. Finally, this study examined salivary gland gene diversity to determine whether a functional basis for RWA invasiveness could be identified. Peroxidase DNA sequence data were obtained for a selection of worldwide RWA samples. Results demonstrated that most native populations were polymorphic while invasive populations were monomorphic, supporting previous conclusions relating to demographic founder effects in invasive populations. Purifying selection most likely explains the existence of a universal allele present in Middle Eastern populations, while balancing selection was evident in East Asian populations. Selection acting on the peroxidase gene may provide an allele-dependent advantage linked to the successful establishment of RWAs on wheat, and ultimately their invasion potential. In conclusion, this study is the most comprehensive molecular genetic investigation of RWA population genetics undertaken to date and provides significant insights into the source and pathway of global invasion and the potential existence of a wheat-adapted genotype that has colonised major wheat growing countries worldwide except for Australia. This research has major biosecurity implications for Australia’s grain industry.
Abstract:
Robust hashing is an emerging field that can be used to hash certain data types in applications unsuitable for traditional cryptographic hashing methods. Traditional hashing functions have been used extensively for data/message integrity, data/message authentication, efficient file identification and password verification. These applications are possible because the hashing process is compressive, allowing for efficient comparisons in the hash domain, but non-invertible, meaning hashes can be used without revealing the original data. These techniques were developed with deterministic (non-changing) inputs such as files and passwords. For such data types a 1-bit or one-character change can be significant, and as a result the hashing process is sensitive to any change in the input. Unfortunately, there are certain applications where input data are not perfectly deterministic and minor changes cannot be avoided. Digital images and biometric features are two types of data where such changes exist but do not alter the meaning or appearance of the input. For such data types cryptographic hash functions cannot be usefully applied. In light of this, robust hashing has been developed as an alternative to cryptographic hashing and is designed to be robust to minor changes in the input. Although similar in name, robust hashing is fundamentally different from cryptographic hashing. Current robust hashing techniques are not based on cryptographic methods, but instead on pattern recognition techniques. Modern robust hashing algorithms consist of feature extraction followed by a randomization stage that introduces non-invertibility and compression, followed by quantization and binary encoding to produce a binary hash output. In order to preserve the robustness of the extracted features, most randomization methods are linear, and this is detrimental to the security aspects required of hash functions. Furthermore, the quantization and encoding stages used to binarize real-valued features require the learning of appropriate quantization thresholds. How these thresholds are learnt has an important effect on hashing accuracy, and the mere presence of such thresholds is a source of information leakage that can reduce hashing security. This dissertation outlines a systematic investigation of the quantization and encoding stages of robust hash functions. While existing literature has focused on the importance of the quantization scheme, this research is the first to emphasise the importance of quantizer training for both hashing accuracy and hashing security. The quantizer training process is presented in a statistical framework which allows a theoretical analysis of the effects of quantizer training on hashing performance. This is experimentally verified using a number of baseline robust image hashing algorithms over a large database of real-world images. This dissertation also proposes a new randomization method for robust image hashing based on Higher Order Spectra (HOS) and Radon projections. The method is non-linear, which is an essential requirement for non-invertibility. The method is also designed to produce features more suited to quantization and encoding. The system can operate without the need for quantizer training, is more easily encoded and displays improved hashing performance when compared to existing robust image hashing algorithms. The dissertation also shows how the HOS method can be adapted to work with biometric features obtained from 2D and 3D face images.
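To make the generic pipeline above concrete (feature extraction, randomization, quantization, binary encoding), the sketch below shows a toy keyed random-projection image hash with median-threshold quantization. It is a baseline illustration only, not the HOS/Radon scheme proposed in the dissertation, and the block size, number of bits and noise level are arbitrary choices.

```python
import numpy as np

def toy_robust_hash(image, n_bits=64, seed=0):
    """Generic robust-hashing baseline:
    block-mean features -> keyed random projection -> median-threshold bits."""
    img = np.asarray(image, dtype=float)
    h, w = img.shape
    # Feature extraction: coarse 8x8 block means, robust to small pixel-level changes
    blocks = img[: h - h % 8, : w - w % 8].reshape(h // 8, 8, w // 8, 8).mean(axis=(1, 3))
    features = blocks.ravel()
    # Randomization: key-dependent linear random projection ('seed' acts as the secret key)
    rng = np.random.default_rng(seed)
    v = rng.standard_normal((n_bits, features.size)) @ features
    # Quantization and binary encoding: threshold at the median of the projected values
    return (v > np.median(v)).astype(np.uint8)

def hamming(h1, h2):
    return int(np.sum(h1 != h2))

# Usage: a slightly noised copy of an image should produce a nearby hash
rng = np.random.default_rng(1)
img = rng.integers(0, 256, size=(128, 128)).astype(float)
noisy = img + rng.normal(0, 2, size=img.shape)
print(hamming(toy_robust_hash(img), toy_robust_hash(noisy)))  # small distance expected
```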
Abstract:
This thesis investigated the viability of using Frequency Response Functions in combination with an Artificial Neural Network technique for damage assessment of building structures. The proposed approach can help overcome some of the limitations associated with previously developed vibration-based methods and assist in delivering more accurate and robust damage identification results. Excellent results were obtained for damage identification in the case studies, demonstrating that the proposed approach was developed successfully.
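As a generic illustration of combining FRF data with a neural network classifier (not the building case studies examined in the thesis), the sketch below trains a small scikit-learn MLP to separate "damaged" from "undamaged" synthetic single-degree-of-freedom FRFs, where damage is modelled as an assumed stiffness reduction; all parameter values and noise levels are made up.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
omega = np.linspace(1, 100, 120)            # assumed frequency lines (rad/s)

def frf_magnitude(k, m=1.0, c=0.4):
    """Receptance magnitude of a single-DOF oscillator: 1/sqrt((k - m w^2)^2 + (c w)^2)."""
    return 1.0 / np.sqrt((k - m * omega**2) ** 2 + (c * omega) ** 2)

# Synthetic training data: 'damage' = 15% stiffness reduction, plus measurement noise
X, y = [], []
for _ in range(400):
    damaged = rng.integers(0, 2)
    k = 2500.0 * (0.85 if damaged else 1.0) * rng.uniform(0.98, 1.02)
    X.append(frf_magnitude(k) * (1 + 0.02 * rng.standard_normal(omega.size)))
    y.append(damaged)
X, y = np.log(np.array(X)), np.array(y)     # log-magnitude FRFs as network inputs

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0).fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
```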
Abstract:
Australian TV News: New Forms, Functions, and Futures examines the changing relationships between television, politics and popular culture. Drawing extensively on qualitative audience research and industry interviews, this book demonstrates that while ‘infotainment’ and satirical programmes may not follow the journalism orthodoxy (or, in some cases, reject it outright), they nevertheless play an important role in the way everyday Australians understand what is happening in the world. This therefore throws into question some longstanding assumptions about what form TV news should take, the functions it ought to serve, and the future prospects of the fourth estate.