21 results for Automated Test Data Generation
in Aston University Research Archive
Abstract:
Due to copyright restrictions, only available for consultation at Aston University Library with prior arrangement.
Abstract:
Field material testing provides firsthand information on pavement conditions, which is most helpful in evaluating performance and identifying preventive maintenance or overlay strategies. The high variability of field asphalt concrete due to construction raises the demand for accuracy of the test. Accordingly, the objective of this study is to propose a reliable and repeatable methodology to evaluate the fracture properties of field-aged asphalt concrete using the overlay test (OT). The OT is selected because of its efficiency and feasibility for asphalt field cores with diverse dimensions. The fracture properties refer to the Paris' law parameters based on the pseudo J-integral (A and n), because of the sound physical significance of the pseudo J-integral with respect to characterizing the cracking process. In order to determine A and n, a two-step OT protocol is designed to characterize the undamaged and damaged behaviors of asphalt field cores. To ensure the accuracy of the determined undamaged and fracture properties, a new analysis method is then developed for data processing, which combines finite element simulations with mechanical analysis of viscoelastic force equilibrium and the evolution of pseudo displacement work in the OT specimen. Finally, theoretical equations are derived to calculate A and n directly from the OT test data, and the accuracy of the determined fracture properties is verified. The proposed methodology is applied to a total of 27 asphalt field cores obtained from a field project in Texas, including a control Hot Mix Asphalt (HMA) and two types of Warm Mix Asphalt (WMA). The results demonstrate a high linear correlation between n and −log A for all the tested field cores. Investigations of the effect of field aging on the fracture properties confirm that n is a good indicator for quantifying the cracking resistance of asphalt concrete, and that summer climatic conditions clearly accelerate the rate of aging. The impact of the WMA technologies on the fracture properties of asphalt concrete is visualized by comparing the n-values: the Evotherm WMA technology slightly improves the cracking resistance, while the foaming WMA technology provides fracture properties comparable to those of the HMA. After 15 months of aging in the field, the cracking resistance does not differ significantly between the HMA and the WMAs, which is confirmed by observations of field distresses.
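For orientation, the pseudo J-integral form of Paris' law that defines A and n can be sketched as follows (assumed notation; the paper's exact symbols may differ):

```latex
% Paris' law based on the pseudo J-integral (assumed notation):
\frac{\mathrm{d}c}{\mathrm{d}N} = A \left( J_R \right)^{n}
% c: crack length, N: number of loading cycles, J_R: pseudo J-integral;
% A and n are the fracture properties determined from the two-step OT protocol.
```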
Abstract:
Neural networks can be regarded as statistical models and analysed in a Bayesian framework. Generalisation is measured by the performance on independent test data drawn from the same distribution as the training data. Such performance can be quantified by the posterior average of the information divergence between the true and the model distributions. Averaging over the Bayesian posterior guarantees internal coherence; using information divergence guarantees invariance with respect to representation. The theory generalises the least mean squares theory for linear Gaussian models to general problems of statistical estimation. The main results are: (1) the ideal optimal estimate is always given by the average over the posterior; (2) the optimal estimate within a computational model is given by the projection of the ideal estimate onto the model. This incidentally shows that some currently popular methods dealing with hyperpriors are in general unnecessary and misleading. The extension of information divergence to positive normalisable measures reveals a remarkable relation between the δ dual affine geometry of statistical manifolds and the geometry of the dual pair of Banach spaces L_δ and L_δ′; it therefore offers a conceptual simplification of information geometry. The general conclusion on the issue of evaluating neural network learning rules and other statistical inference methods is that such evaluations are only meaningful under three assumptions: the prior P(p), describing the environment of all the problems; the divergence D_δ, specifying the requirement of the task; and the model Q, specifying the available computing resources.
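As a hedged sketch (the notation below is assumed, not quoted from the thesis), the generalisation measure described above is the posterior average of an information divergence:

```latex
% Posterior-averaged information divergence (assumed notation):
% p is the true distribution, q the model distribution, D the divergence.
E_{\mathrm{gen}}(q) = \int D(p \,\|\, q)\, P(p \mid \text{data})\, \mathrm{d}p,
\qquad
D(p \,\|\, q) = \int p(x) \log \frac{p(x)}{q(x)}\, \mathrm{d}x
% Result (1): the ideal estimate minimises E_gen over all q and equals the
% posterior average of p. Result (2): within a model class Q, the optimum is
% the projection of that ideal estimate onto Q under D.
```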
Abstract:
Levels of lignin and hydroxycinnamic acid wall components in three genera of forage grasses (Lolium, Festuca and Dactylis) have been accurately predicted by Fourier-transform infrared spectroscopy using partial least squares models correlated to analytical measurements. Different models were derived that predicted the concentrations of acid detergent lignin, total hydroxycinnamic acids, total ferulate monomers plus dimers, p-coumarate and ferulate dimers in independent spectral test data from methanol-extracted samples of perennial forage grass, with accuracies of 92.8%, 86.5%, 86.1%, 59.7% and 84.7% respectively; analysis of model projection scores showed that the models relied generally on spectral features that are known absorptions of these compounds. Acid detergent lignin was predicted in samples of two species of energy grass (Phalaris arundinacea and Panicum virgatum) with an accuracy of 84.5%.
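A minimal sketch of the partial least squares workflow described above, assuming scikit-learn and hypothetical spectra/reference arrays (the study's actual preprocessing and component selection are not reproduced):

```python
# Sketch: PLS model predicting a wall component (e.g. acid detergent lignin)
# from FTIR spectra. File names and arrays are hypothetical stand-ins.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

X = np.load("ftir_spectra.npy")    # (n_samples, n_wavenumbers), assumed file
y = np.load("adl_reference.npy")   # analytical reference values, assumed file

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
pls = PLSRegression(n_components=10)  # component count would be cross-validated
pls.fit(X_train, y_train)

y_pred = pls.predict(X_test).ravel()
# one simple accuracy notion: 100% minus the mean relative error
accuracy = 100 * (1 - np.mean(np.abs(y_pred - y_test) / y_test))
print(f"prediction accuracy: {accuracy:.1f}%")
```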
Abstract:
A mathematical model is developed for the general pneumatic tyre. The model permits the investigation of tyre deformations produced by arbitrary external loading, and enables estimates to be made of the distributions of applied and reactive forces. The principle of finite elements is used to idealise the composite tyre structure, each element consisting of a triangle of double curvature with varying thickness. Large deflections of the structure are accommodated by the use of an iterative sequence of small incremental steps, each of which obeys the laws of linear mechanics. The theoretical results are found to compare favourably with the experimental test data obtained from two different types of tyre construction. However, limitations in the discretisation process have prevented accurate assessments of stress distributions in regions of high stress gradients.
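The incremental treatment of large deflections can be illustrated with a toy example (a 1-DOF stiffening spring standing in for the tyre finite element model; this is a schematic, not the thesis's method):

```python
# Schematic: a geometrically nonlinear problem solved as a sequence of small
# linear increments, each step obeying linear mechanics at the current state.
def tangent_stiffness(u, k0=100.0, k3=500.0):
    # tangent of the internal force f_int(u) = k0*u + k3*u**3 (toy law)
    return k0 + 3.0 * k3 * u**2

f_total, n_steps = 50.0, 200
u, df = 0.0, f_total / n_steps        # small load increment per step
for _ in range(n_steps):
    u += df / tangent_stiffness(u)    # linear solve within each increment
print(f"deflection under load {f_total}: u = {u:.4f}")
```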
Abstract:
Methods of dynamic modelling and analysis of structures, for example the finite element method, are well developed. However, it is generally agreed that accurate modelling of complex structures is difficult, and for critical applications it is necessary to validate or update the theoretical models using data measured from actual structures. Techniques for identifying the parameters of linear dynamic models using vibration test data have attracted considerable interest recently. However, no method has gained general acceptance, owing to a number of difficulties, mainly: (i) the incomplete number of vibration modes that can be excited and measured, (ii) the incomplete number of coordinates that can be measured, (iii) inaccuracy in the experimental data, and (iv) inaccuracy in the model structure. This thesis reports on a new approach to updating the parameters of a finite element model, as well as a lumped parameter model with a diagonal mass matrix. The structure and its theoretical model are equally perturbed by adding mass or stiffness, and an incomplete set of eigendata is measured. The parameters are then identified by iterative updating of the initial estimates, by sensitivity analysis, using either the eigenvalues alone or both the eigenvalues and eigenvectors of the structure before and after perturbation. It is shown that, with a suitable choice of the perturbing coordinates, exact parameters can be identified if the data and the model structure are exact. The theoretical basis of the technique is presented. To cope with measurement errors and possible inaccuracies in the model structure, a well-known Bayesian approach is used to minimise the least squares difference between the updated and the initial parameters. The eigendata of the structure with added mass or stiffness is also determined from the frequency response data of the unmodified structure by a structural modification technique, so that mass or stiffness does not have to be added physically. The mass-stiffness addition technique is demonstrated by simulation examples and laboratory experiments on beams and an H-frame.
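A minimal sketch of the iterative sensitivity-based updating idea, on a toy 2-DOF lumped model (the thesis's perturbation scheme and Bayesian weighting are not reproduced; all values here are illustrative):

```python
# Sketch: update stiffness parameters so model eigenvalues match "measured" ones.
import numpy as np

def eigvals(k):
    # 2-DOF chain with unit masses; stiffnesses k = [k1, k2]
    K = np.array([[k[0] + k[1], -k[1]], [-k[1], k[1]]])
    return np.sort(np.linalg.eigvalsh(K))   # mass matrix = identity

k_true = np.array([2.0, 1.0])
lam_meas = eigvals(k_true)                  # stands in for measured eigenvalues

k = np.array([1.5, 1.5])                    # initial estimates
for _ in range(20):
    r = lam_meas - eigvals(k)               # eigenvalue residual
    # finite-difference sensitivity matrix d(lambda)/d(k)
    S = np.column_stack([(eigvals(k + dk) - eigvals(k)) / 1e-6
                         for dk in 1e-6 * np.eye(2)])
    k = k + np.linalg.lstsq(S, r, rcond=None)[0]   # Gauss-Newton step
print("identified stiffnesses:", k)  # should recover k_true with exact data
```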
Abstract:
In this article, it is argued that reflexivity is integral to experiential qualitative research in psychology. Reflexivity has been defined in many ways; Woolgar's continuum of reflexivity, though, provides a useful gauge by which to judge whether a researcher is involved in simple reflection or reflexivity. The article demonstrates the benefits of adopting a reflexive attitude by presenting "challenge-to-competency." The author's encounter with Sarah helps illustrate the role of reflexivity both in data generation and in interpretative analysis. To close, it is proposed that reflexivity as hermeneutic reflection, with its grounding in hermeneutics and phenomenology, is a useful construct for guiding our engagement in reflexivity in experiential qualitative research.
Abstract:
This thesis describes an investigation of the effect of elevated temperatures upon the properties of plain concrete containing a siliceous aggregate. The complete stress-strain relationship and creep behaviour are studied. Transient (non-steady-state) effects are also examined in order to simulate more realistic conditions. A temperature range of 20-700ºC is used, corresponding to the temperatures generally attained during an actual fire. In order to carry out the requisite tests, a stiff compression testing machine has been designed and built. Overall control of the test rig is provided by a logger/computer system with purpose-written software, enabling the load to be held constant for any period of time. Before the development of the testing apparatus, which includes an electric furnace and the associated instrumentation, is outlined, previous work on the properties of both concrete and steel at elevated temperatures is reviewed. The test programme comprises four series of tests: stress-strain tests (with and without pre-load), transient tests (heating to failure under constant stress) and creep tests (constant stress and constant temperature), with three stress levels examined: 0.2, 0.4 and 0.6 fc. The experimental results show that the properties of concrete are significantly affected by temperature and the magnitude of the load. The slope of the descending branch of the stress-strain curves (strain softening) is found to be temperature dependent. After normalising the data, the stress-strain curves for different temperatures are represented by a single curve. The creep results are analysed using an approach involving the activation energy, which is found to be constant. The analysis shows that the time-dependent deformation is sensibly linear with the applied stress. The total strain concept is shown to hold for the test data within limits.
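Activation-energy analyses of creep are typically based on an Arrhenius-type rate equation; a hedged sketch in assumed notation, consistent with the reported linearity in stress and constant activation energy:

```latex
% Arrhenius-type creep rate (assumed form, not quoted from the thesis):
\dot{\varepsilon}_c = C \, \sigma \, \exp\!\left( -\frac{Q}{RT} \right)
% eps_c-dot: creep strain rate, sigma: applied stress (linear dependence),
% Q: activation energy (found to be constant), R: gas constant, T: temperature.
```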
Abstract:
Particulate solids are complex redundant systems which consist of discrete particles. The interactions between the particles are complex and have been the subject of many theoretical and experimental investigations. Investigations of particulate material have been restricted by the lack of quantitative information on the mechanisms occurring within an assembly. Laboratory experimentation is limited because information on the internal behaviour can only be inferred from measurements on the assembly boundary or through the use of intrusive measuring devices; in addition, comparisons between test data are uncertain due to the difficulty of reproducing exact replicas of physical systems. Nevertheless, theoretical and technological advances require more detailed material information. Numerical simulation, however, affords access to information on every particle, and hence to the micro-mechanical behaviour within an assembly, and can replicate desired systems. For a computer program to simulate material behaviour accurately, it is necessary to incorporate realistic interaction laws. This research programme used the finite difference simulation program `BALL', developed by Cundall (1971), which employed linear spring force-displacement laws; it was thus necessary to incorporate more realistic interaction laws. The research programme was therefore primarily concerned with the implementation of the normal force-displacement law of Hertz (1882) and the tangential force-displacement laws of Mindlin and Deresiewicz (1953). Within this thesis, the contact mechanics theories employed in the program are developed and the adaptations necessary to incorporate these laws are detailed. Verification of the new contact force-displacement laws was achieved by simulating a quasi-static oblique contact and a single-particle oblique impact. Application of the program to the simulation of large assemblies of particles is given, and the problems in undertaking quasi-static shear tests, along with the results from two successful shear tests, are described.
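A minimal sketch of the Hertz normal force-displacement law named above (sphere-sphere contact with assumed material parameters; the Mindlin-Deresiewicz tangential law, with its loading-history dependence, is substantially more involved and omitted here):

```python
# Hertz (1882) normal contact between two elastic spheres:
# F_n = (4/3) * E_eff * sqrt(R_eff) * delta**1.5
import numpy as np

def hertz_normal_force(delta, R1, R2, E, nu):
    """Normal force for overlap delta between two spheres of the same material."""
    R_eff = R1 * R2 / (R1 + R2)           # effective radius: 1/R* = 1/R1 + 1/R2
    E_eff = E / (2.0 * (1.0 - nu**2))     # effective modulus, identical materials
    return (4.0 / 3.0) * E_eff * np.sqrt(R_eff) * delta**1.5

# Example: 1 mm glass-like spheres with a 1 micron overlap (assumed values)
print(hertz_normal_force(delta=1e-6, R1=1e-3, R2=1e-3, E=70e9, nu=0.25))
```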
Abstract:
As microblog services such as Twitter become a fast and convenient communication approach, the identification of trendy topics in microblog services has great academic and business value. However, detecting trendy topics is very challenging due to the huge number of users and short-text posts in microblog diffusion networks. In this paper we introduce a trendy topic detection system that operates under computation and communication resource constraints. In stark contrast to retrieving and processing the whole microblog content, we develop the idea of selecting a small set of microblog users and processing only their posts, achieving an overall acceptable trendy topic coverage without exceeding the resource budget for detection. We formulate the selection of this subset of users as mixed-integer optimization problems and develop heuristic algorithms to compute approximate solutions. The proposed system is evaluated with real-time test data retrieved from Sina Weibo, the dominant microblog service provider in China. It is shown that by monitoring 500 out of 1.6 million microblog users and tracking their microposts (about 15,000 daily) with our system, nearly 65% of trendy topics can be detected, on average 5 hours before they appear in the Sina Weibo official trends.
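One common heuristic for budgeted coverage problems of this kind is greedy selection; a minimal sketch under assumed data structures (a stand-in for the paper's mixed-integer formulation, whose actual heuristics may differ):

```python
# Greedy heuristic sketch: pick users whose posts cover the most topics
# without exceeding a post-processing budget.
def select_users(user_topics, user_cost, budget, k=500):
    """user_topics: {user: set of topics}; user_cost: {user: posts per day}."""
    covered, chosen, spent = set(), [], 0
    while len(chosen) < k:
        best = max(
            (u for u in user_topics
             if u not in chosen and spent + user_cost[u] <= budget),
            key=lambda u: len(user_topics[u] - covered),
            default=None,
        )
        if best is None or not (user_topics[best] - covered):
            break                          # budget exhausted or no marginal gain
        chosen.append(best)
        covered |= user_topics[best]
        spent += user_cost[best]
    return chosen, covered

# usage with toy data
users, costs = {"a": {1, 2}, "b": {2, 3}, "c": {4}}, {"a": 10, "b": 5, "c": 1}
print(select_users(users, costs, budget=15, k=2))
```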
Abstract:
In this paper, we present syllable-based duration modelling in the context of a prosody model for Standard Yorùbá (SY) text-to-speech (TTS) synthesis applications. Our prosody model is conceptualised around a modular holistic framework, implemented using Relational Tree (R-Tree) techniques. An important feature of our R-Tree framework is its flexibility: it facilitates the independent implementation of the different dimensions of prosody, i.e. duration, intonation and intensity, using different techniques, and their subsequent integration. We applied the Fuzzy Decision Tree (FDT) technique to model the duration dimension. In order to evaluate the effectiveness of FDT in duration modelling, we also developed a Classification And Regression Tree (CART) based duration model using the same speech data. Each of these models was integrated into our R-Tree based prosody model. We performed both quantitative (i.e. Root Mean Square Error (RMSE) and Correlation (Corr)) and qualitative (i.e. intelligibility and naturalness) evaluations on the two duration models. The results show that CART models the training data more accurately than FDT; the FDT model, however, shows a better ability to extrapolate from the training data, since it achieved better accuracy on the test data set. Our qualitative evaluation results show that our FDT model produces synthesised speech that is perceived to be more natural than that of our CART model. In addition, we observed that the expressiveness of FDT is much better than that of CART, because the representation in FDT is not restricted to a set of piece-wise or discrete constant approximations. We therefore conclude that the FDT approach is a practical approach for duration modelling in SY TTS applications. © 2006 Elsevier Ltd. All rights reserved.
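The quantitative metrics used to compare the two duration models can be sketched as follows (NumPy, with hypothetical duration arrays standing in for the models' outputs):

```python
# RMSE and correlation between predicted and reference syllable durations.
import numpy as np

def rmse(pred, ref):
    return np.sqrt(np.mean((np.asarray(pred) - np.asarray(ref)) ** 2))

def corr(pred, ref):
    return np.corrcoef(pred, ref)[0, 1]    # Pearson correlation coefficient

ref  = np.array([0.21, 0.18, 0.30, 0.12])  # reference durations (s), assumed
pred = np.array([0.20, 0.19, 0.27, 0.14])  # model output, assumed
print(f"RMSE={rmse(pred, ref):.3f}s  Corr={corr(pred, ref):.3f}")
```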
Abstract:
Objective: To test the practicality and effectiveness of cheap, ubiquitous, consumer-grade smartphones to discriminate Parkinson's disease (PD) subjects from healthy controls, using self-administered tests of gait and postural sway. Background: Existing tests for the diagnosis of PD are based on subjective neurological examinations performed in-clinic. Objective movement symptom severity data, collected using widely accessible technologies such as smartphones, would enable the remote characterization of PD symptoms based on self-administered behavioral tests. Smartphones, when backed up by interviews using web-based videoconferencing, could make it feasible for expert neurologists to perform diagnostic testing on large numbers of individuals at low cost. However, to date, the compliance rate of testing using smartphones has not been assessed. Methods: We conducted a one-month controlled study with twenty participants, comprising 10 PD subjects and 10 controls. All participants were provided with identical LG Optimus S smartphones capable of recording tri-axial acceleration. Using these smartphones, patients conducted self-administered, short (less than 5 minutes), controlled gait and postural sway tests. We analyzed a wide range of summary measures of gait and postural sway from the accelerometry data. Using statistical machine learning techniques, we identified discriminating patterns in the summary measures in order to distinguish PD subjects from controls. Results: Compliance was high: all 20 participants performed an average of 3.1 tests per day for the duration of the study. Using this test data, we demonstrated cross-validated sensitivity of 98% and specificity of 98% in discriminating PD subjects from healthy controls. Conclusions: Using consumer-grade smartphone accelerometers, it is possible to distinguish PD from healthy controls with high accuracy. Since these smartphones are inexpensive (around $30 each) and easily available, and the tests are highly non-invasive and objective, we envisage that this kind of smartphone-based testing could radically increase the reach and effectiveness of experts in diagnosing PD.
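A hedged sketch of the discrimination step: summary measures from accelerometry fed to a cross-validated classifier (scikit-learn; the study's actual feature set and learner are not specified in the abstract, so the random forest and file names below are assumptions):

```python
# Sketch: cross-validated PD-vs-control classification from summary measures.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X = np.load("summary_measures.npy")   # (n_tests, n_features), assumed file
y = np.load("labels.npy")             # 1 = PD, 0 = control, assumed file

clf = RandomForestClassifier(n_estimators=500, random_state=0)
scores = cross_val_score(clf, X, y, cv=10)   # 10-fold cross-validation
print(f"cross-validated accuracy: {scores.mean():.2%}")
```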
Abstract:
OBJECTIVES: To understand older adults' experiences of moving into extra care housing which offers enrichment activities alongside social and healthcare support. DESIGN: A longitudinal study was conducted which adopted a phenomenological approach to data generation and analysis. METHODS: Semi-structured interviews were conducted during the first 18 months of living in extra care housing. Interpretative phenomenological analysis was used because its commitment to idiography enabled an in-depth analysis of the subjective lived experience of moving into extra care housing. Themes generated inductively were examined against an existential-phenomenological theory of well-being. RESULTS: 'Learning to live in an extra care community' showed that negotiating new relationships was not straightforward; maintaining friendships outside the community became more difficult as capacity declined. In 'springboard for opportunity/confinement', living in extra care provided new opportunities for social engagement and a restored sense of self, although over time horizons began to shrink as incapacities grew. 'Seeking care' illustrated a reticence to seek care, due to embarrassment and a sense of duty to one's partner. 'Becoming aged' presented an ontological challenge; nevertheless, some showed a readiness for death, a sense of homecoming. CONCLUSIONS: An authentic later life was possible, but residents required emotional and social support to live through the transition and challenges of becoming aged. Enhancement activities boosted residents' quality of life, but the range of activities could be extended to cater better for quieter, smaller-scale events within the community; volunteer activity facilitators could be used here. Peer mentoring may help build new relationships and opportunities for interactive stimulation. Acknowledging the importance of feeling (empathic imagination) in caregiving may help staff and residents relate better to each other, thus helping individuals to become ontologically secure and live well to the end.
Abstract:
The purpose of the present study is to test the case linkage principles of behavioural consistency and behavioural distinctiveness using serial vehicle theft data. Data from 386 solved vehicle thefts committed by 193 offenders were analysed using Jaccard's coefficient, regression and receiver operating characteristic (ROC) analyses to determine whether objectively observable aspects of crime scene behaviour could be used to distinguish crimes committed by the same offender from those committed by different offenders. The findings indicate that spatial behaviour, specifically the distance between theft locations and between dump locations, is a highly consistent and distinctive aspect of vehicle theft behaviour; thus, intercrime and interdump distance represent the most useful aspects of vehicle theft for the purpose of case linkage analysis. The findings have theoretical and practical implications for the understanding of criminal behaviour and for the development of decision-support tools to assist police investigation and the apprehension of serial vehicle theft offenders.
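A minimal sketch of this kind of linkage scoring: Jaccard similarity between the behaviours of a crime pair, evaluated with ROC analysis over linked/unlinked pairs (binary behaviour coding and the toy pairs below are assumptions, not the study's data):

```python
# Jaccard similarity between two crimes' behaviour sets, with ROC AUC over
# crime pairs labelled linked (same offender, 1) vs unlinked (0).
from sklearn.metrics import roc_auc_score

def jaccard(a: set, b: set) -> float:
    return len(a & b) / len(a | b) if (a | b) else 0.0

# toy pairs: (behaviours of crime 1, behaviours of crime 2, same offender?)
pairs = [
    ({"night", "street"}, {"night", "street"}, 1),
    ({"night", "carpark"}, {"day", "street"}, 0),
    ({"day", "street"}, {"day", "street", "forced"}, 1),
    ({"night"}, {"day", "forced"}, 0),
]
scores = [jaccard(a, b) for a, b, _ in pairs]
labels = [y for _, _, y in pairs]
print("ROC AUC:", roc_auc_score(labels, scores))
```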
Abstract:
A novel architecture for microwave/millimeter-wave signal generation and data modulation using a fiber-grating-based distributed feedback laser is proposed in this letter. For demonstration, a 155.52-Mb/s data stream on a 16.9-GHz subcarrier has been transmitted and recovered successfully. The results demonstrate that this technology would benefit future microwave data transmission systems.