108 results for Turbulent functions
Abstract:
The influence of inflow turbulence on the results of Favre–Reynolds-averaged Navier–Stokes computations of supersonic oblique-shock-wave/turbulent-boundary-layer interactions (shock-wave Mach number M_SW ∼ 2.9), using seven-equation Reynolds-stress model turbulence closures, is studied. The generation of inflow conditions (and the initialization of the flowfield) for mean flow, Reynolds stresses, and turbulence length scale, based on semi-analytic grid-independent boundary-layer profiles, is described in detail. Particular emphasis is given to freestream turbulence intensity and length scale. The influence of external-flow turbulence intensity is studied in detail both for flat-plate boundary-layer flow and for a compression-ramp interaction with large separation. It is concluded that the Reynolds-stress model correctly reproduces the effects of external flow turbulence.
Abstract:
This paper is concerned with recent advances in the development of near wall-normal-free Reynolds-stress models, whose single-point closure formulation, based on the inhomogeneity direction concept, is completely independent of the distance from the wall, and of the normal to the wall direction. In the present approach the direction of the inhomogeneity unit vector is decoupled from the coefficient functions of the inhomogeneous terms. A study of the relative influence of the particular closures used for the rapid redistribution terms and for the turbulent diffusion is undertaken, through comparison with measurements, and with a baseline Reynolds-stress model (RSM) using geometric wall normals. It is shown that wall-normal-free RSMs can be reformulated as a projection on a tensorial basis that includes the inhomogeneity direction unit vector, suggesting that the theory of the redistribution tensor closure should be revised by taking into account inhomogeneity effects in the tensorial integrity basis used for its representation.
Abstract:
The research described in this paper forms part of an in-depth investigation of safety culture in one of Australia’s largest construction companies. The research builds on a previous qualitative study with organisational safety leaders and further investigates how safety culture is perceived and experienced by organisational members, as well as how this relates to their safety behaviour and related outcomes at work. Participants were 2273 employees of the case study organisation, with 689 from the Construction function and 1584 from the Resources function. The results of several analyses revealed some interesting organisational variance on key measures. Specifically, the Construction function scored significantly higher on all key measures: safety climate, safety motivation, safety compliance, and safety participation. The results are discussed in terms of relevance in an applied research context.
Abstract:
In 1980 Alltop produced a family of cubic phase sequences that nearly meet the Welch bound for maximum non-peak correlation magnitude. This family of sequences was shown by Wootters and Fields to be useful for quantum state tomography. Alltop's construction used a function that is not planar, but whose difference function is planar. In this paper we show that Alltop-type functions cannot exist in fields of characteristic 3 and that, for a known class of planar functions, x^3 is the only Alltop-type function.
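The planarity notions used in this abstract can be checked by brute force over a small prime field. The sketch below is illustrative (the function and variable names are not from the paper) and assumes the standard definition: f over GF(p) is planar when, for every nonzero shift a, the difference map x -> f(x+a) - f(x) is a bijection. It shows that x^2 is planar, that x^3 is not, and that the difference function of x^3 is planar, which is the Alltop-type property.

```python
# Sketch: brute-force planarity check over a prime field GF(p).
# f is planar iff every nonzero-shift difference map is a bijection.
# All names here are illustrative, not taken from the paper.

def is_planar(f, p):
    for a in range(1, p):
        images = {(f((x + a) % p) - f(x)) % p for x in range(p)}
        if len(images) != p:          # not a bijection for this shift
            return False
    return True

p = 7
square = lambda x: (x * x) % p
cube = lambda x: (x * x * x) % p

print(is_planar(square, p))   # x^2 is planar in odd characteristic: True
print(is_planar(cube, p))     # x^3 itself is not planar over GF(7): False

# Alltop-type property: the difference function of x^3,
# D_a(x) = (x+a)^3 - x^3, is planar for a fixed nonzero a.
a = 1
diff = lambda x: (cube((x + a) % p) - cube(x)) % p
print(is_planar(diff, p))     # True
```

The check is exponentially cheap only because the field is tiny; it is meant to make the definitions concrete, not to scale.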
Abstract:
This paper provides a commentary on the contribution by Dr Chow, who questioned whether the functions of learning are general across all categories of tasks or whether there are some task-particular aspects to the functions of learning in relation to task type. Specifically, he queried whether principles and practice for the acquisition of sport skills are different from those for musical, industrial, military and human factors skills. In this commentary we argue that ecological dynamics contains general principles of motor learning that can be instantiated in specific performance contexts to underpin learning design. In this proposal, we highlight the importance of conducting skill acquisition research in sport, rather than relying on empirical outcomes of research from a variety of different performance contexts. Here we discuss how task constraints of different performance contexts (sport, industry, military, music) provide different specific information sources that individuals use to couple their actions when performing and acquiring skills. We conclude by suggesting that this relationship between performance task constraints and learning processes might help explain the traditional emphasis on performance curves and performance outcomes to infer motor learning.
Abstract:
The current study examined the structure of the volunteer functions inventory within a sample of older individuals (N = 187). The career items were replaced with items examining the concept of continuity of work, a potentially more useful and relevant concept for this population. Factor analysis supported a four factor solution, with values, social and continuity emerging as single factors and enhancement and protective items loading together on a single factor. Understanding items did not load highly on any factor. The values and continuity functions were the only dimensions to emerge as predictors of intention to volunteer. This research has important implications for understanding the motivation of older adults to engage in contemporary volunteering settings.
Abstract:
Model calculations, which include the effects of turbulence during subsequent solar nebula evolution after the collapse of a cool interstellar cloud, can reconcile some of the apparent differences between physical parameters obtained from theory and the cosmochemical record. Two important aspects of turbulence in a protoplanetary cloud include the growth and transport of solid grains. While the physical effects of the process can be calculated and compared with the probable remains of the nebula formation period, the more subtle effects on primitive grains and their survival in the cosmochemical record cannot be readily evaluated. The environment offered by the Space Station (or Space Shuttle) experimental facility can provide the vacuum and low-gravity conditions for the sufficiently long time periods required for experimental verification of these cosmochemical models.
Abstract:
Robust hashing is an emerging field that can be used to hash certain data types in applications unsuitable for traditional cryptographic hashing methods. Traditional hashing functions have been used extensively for data/message integrity, data/message authentication, efficient file identification and password verification. These applications are possible because the hashing process is compressive, allowing for efficient comparisons in the hash domain, but non-invertible, meaning hashes can be used without revealing the original data. These techniques were developed with deterministic (non-changing) inputs such as files and passwords. For such data types a 1-bit or one-character change can be significant; as a result, the hashing process is sensitive to any change in the input. Unfortunately, there are certain applications where input data are not perfectly deterministic and minor changes cannot be avoided. Digital images and biometric features are two types of data where such changes exist but do not alter the meaning or appearance of the input. For such data types cryptographic hash functions cannot be usefully applied. In light of this, robust hashing has been developed as an alternative to cryptographic hashing and is designed to be robust to minor changes in the input. Although similar in name, robust hashing is fundamentally different from cryptographic hashing. Current robust hashing techniques are not based on cryptographic methods, but instead on pattern recognition techniques. Modern robust hashing algorithms consist of feature extraction followed by a randomization stage that introduces non-invertibility and compression, followed by quantization and binary encoding to produce a binary hash output. In order to preserve the robustness of the extracted features, most randomization methods are linear, and this is detrimental to the security aspects required of hash functions.
Furthermore, the quantization and encoding stages used to binarize real-valued features require the learning of appropriate quantization thresholds. How these thresholds are learnt has an important effect on hashing accuracy, and the mere presence of such thresholds is a source of information leakage that can reduce hashing security. This dissertation outlines a systematic investigation of the quantization and encoding stages of robust hash functions. While existing literature has focused on the importance of the quantization scheme, this research is the first to emphasise the importance of quantizer training on both hashing accuracy and hashing security. The quantizer training process is presented in a statistical framework which allows a theoretical analysis of the effects of quantizer training on hashing performance. This is experimentally verified using a number of baseline robust image hashing algorithms over a large database of real-world images. This dissertation also proposes a new randomization method for robust image hashing based on Higher Order Spectra (HOS) and Radon projections. The method is non-linear, and this is an essential requirement for non-invertibility. The method is also designed to produce features more suited for quantization and encoding. The system can operate without the need for quantizer training, is more easily encoded and displays improved hashing performance when compared to existing robust image hashing algorithms. The dissertation also shows how the HOS method can be adapted to work with biometric features obtained from 2D and 3D face images.
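The generic pipeline this abstract describes (feature extraction, keyed randomization, threshold quantization, binary encoding) can be sketched in a few lines. The toy below is an assumption-laden illustration, not the dissertation's method: it takes a ready-made feature vector instead of extracting image features, uses a keyed Gaussian random projection as the (linear) randomization stage, and a single median threshold standing in for the learned quantizer thresholds the text discusses.

```python
# Toy robust-hash pipeline: features -> keyed random projection
# -> threshold quantization -> binary string. Illustrative only;
# real schemes extract features from images and learn thresholds.
import random

def robust_hash(features, key, n_bits):
    rng = random.Random(key)                  # key-dependent randomization
    projections = []
    for _ in range(n_bits):
        weights = [rng.gauss(0.0, 1.0) for _ in features]
        projections.append(sum(w * f for w, f in zip(weights, features)))
    # One median threshold stands in for learned quantizer thresholds.
    threshold = sorted(projections)[len(projections) // 2]
    return ''.join('1' if p > threshold else '0' for p in projections)

v1 = [0.9, 0.1, 0.4, 0.7]
v2 = [0.9, 0.1, 0.41, 0.7]                    # minor perturbation
print(robust_hash(v1, key=42, n_bits=16))
print(robust_hash(v2, key=42, n_bits=16))     # should stay close in Hamming distance
```

The sketch also makes the abstract's security point tangible: the projection stage is linear, and the threshold is derived from the input itself, both of which leak information a secure design must address.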
Abstract:
This thesis investigated the viability of using Frequency Response Functions in combination with the Artificial Neural Network technique for damage assessment of building structures. The proposed approach can help overcome some of the limitations associated with previously developed vibration-based methods and assist in delivering more accurate and robust damage identification results. Excellent results are obtained for damage identification in the case studies, proving that the proposed approach has been developed successfully.
Abstract:
Australian TV News: New Forms, Functions, and Futures examines the changing relationships between television, politics and popular culture. Drawing extensively on qualitative audience research and industry interviews, this book demonstrates that while ‘infotainment’ and satirical programmes may not follow the journalism orthodoxy (or, in some cases, reject it outright), they nevertheless play an important role in the way everyday Australians understand what is happening in the world. This therefore throws into question some longstanding assumptions about what form TV news should take, the functions it ought to serve, and the future prospects of the fourth estate.
Abstract:
Whether to keep products segregated (e.g., unbundled) or integrate some or all of them (e.g., bundle) has been a problem of profound interest in areas such as portfolio theory in finance, risk capital allocations in insurance and marketing of consumer products. Such decisions are inherently complex and depend on factors such as the underlying product values and consumer preferences, the latter being frequently described using value functions, also known as utility functions in economics. In this paper, we develop decision rules for multiple products, which we generally call ‘exposure units’ to naturally cover manifold scenarios spanning well beyond ‘products’. Our findings show, e.g., that the celebrated Thaler's principles of mental accounting hold as originally postulated when the values of all exposure units are positive (i.e. all are gains) or all negative (i.e. all are losses). In the case of exposure units with mixed-sign values, decision rules are much more complex and rely on cataloguing a number of cases (the Bell number) that grows very fast with the number of exposure units. Consequently, in the present paper, we provide detailed rules for the integration and segregation decisions in the case of up to three exposure units, and partial rules for an arbitrary number of units.
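The combinatorial growth mentioned above is easy to make concrete: partitioning n exposure units into integrated groups can be done in B_n ways, the n-th Bell number. The snippet below (illustrative names, not code from the paper) computes Bell numbers with the Bell triangle, showing that three units already give 5 groupings and that the count grows rapidly.

```python
# Sketch: Bell numbers B_0..B_n via the Bell triangle.
# B_n counts the ways to partition n exposure units into groups.

def bell_numbers(n_max):
    row = [1]
    bells = [1]                      # B_0 = 1
    for _ in range(n_max):
        nxt = [row[-1]]              # each row starts with the previous row's last entry
        for v in row:
            nxt.append(nxt[-1] + v)  # and accumulates pairwise sums
        row = nxt
        bells.append(row[0])
    return bells

print(bell_numbers(5))  # [1, 1, 2, 5, 15, 52]
```

So the paper's "up to three exposure units" already means cataloguing B_3 = 5 partition patterns, each further split by the sign pattern of the unit values.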
Abstract:
An estuary is formed at the mouth of a river where the tides meet a freshwater flow, and it may be classified as a function of the salinity distribution and density stratification. An overview of the broad characteristics of the estuaries of South-East Queensland (Australia) is presented herein, where the small peri-urban estuaries may provide a useful indicator of potential changes which might occur in larger systems with growing urbanisation. Small peri-urban estuaries exhibit many key hydrological features and associated ecosystem types of larger estuaries, albeit at smaller scales, often with a greater extent of urban development as a proportion of catchment area. We explore the potential for some smaller peri-urban estuaries to be used as natural laboratories to gain much-needed information on estuarine processes, although any dynamic similarity is presently limited by the critical absence of in-depth physical investigation in larger estuarine systems. The absence of detailed turbulence and sedimentary data hampers the understanding and modelling of the estuarine zones. The interactions between the various stakeholders are likely to define the vision for the future of South-East Queensland's peri-urban estuaries. This will require a solid understanding of the bio-physical function and capacity of the peri-urban estuaries. Based upon this knowledge gap, it is recommended that an adaptive trial-and-error approach be adopted for future investigation and management strategies.
Abstract:
We define a pair-correlation function that can be used to characterize spatiotemporal patterning in experimental images and snapshots from discrete simulations. Unlike previous pair-correlation functions, the pair-correlation functions developed here depend on the location and size of objects. The pair-correlation function can be used to indicate complete spatial randomness, aggregation or segregation over a range of length scales, and it quantifies spatial structures such as the shape, size and distribution of clusters. Comparing pair-correlation data for various experimental and simulation images illustrates its potential use as a summary statistic for calibrating discrete models of various physical processes.
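For orientation, a basic unweighted pair-correlation estimate (not the location- and size-dependent version this abstract proposes) can be written in a few lines. The sketch below assumes point objects on a periodic 1-D domain and normalises binned pair counts against complete spatial randomness, so g ≈ 1 indicates randomness, g > 1 aggregation and g < 1 segregation at that length scale; all names are illustrative.

```python
# Sketch: unweighted pair-correlation estimate for points on a
# periodic 1-D domain of given length; counts pairs per distance
# bin and normalises by the expectation under complete spatial
# randomness (distance uniform on [0, length/2]).
import itertools

def pair_correlation(points, length, bin_width, n_bins):
    n = len(points)
    counts = [0] * n_bins
    for a, b in itertools.combinations(points, 2):
        d = abs(a - b)
        d = min(d, length - d)                   # periodic distance
        k = int(d // bin_width)
        if k < n_bins:
            counts[k] += 1
    total_pairs = n * (n - 1) / 2
    expected = total_pairs * (2 * bin_width / length)  # per-bin CSR expectation
    return [c / expected for c in counts]

# A regular lattice: depleted below the spacing, peaked at the spacing.
points = [float(i) for i in range(10)]           # spacing 1 on a domain of length 10
g = pair_correlation(points, 10.0, 0.5, 4)
print(g)  # zero away from d = 1; a strong peak in the bin containing d = 1
```

Extending this toward the paper's version would mean weighting each pair by object size and keeping the dependence on absolute location rather than only on separation.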