933 results for Computer Generated Proofs
Abstract:
A novel multiple regression method (RM) is developed to predict identity-by-descent probabilities at a locus L (IBDL) among individuals without pedigree, given information on surrounding markers and population history. These IBDL probabilities are a function of the increase in linkage disequilibrium (LD) generated by drift in a homogeneous population over generations. Three parameters are sufficient to describe population history: effective population size (Ne), number of generations since foundation (T), and marker allele frequencies among founders (p). The IBDL probabilities are used in a simulation study to map a quantitative trait locus (QTL) via variance component estimation. RM is compared to a coalescent method (CM) in terms of power and robustness of QTL detection. Differences between RM and CM are small but significant. For example, RM is more powerful than CM in dioecious populations, but not in monoecious populations. Moreover, RM is more robust than CM when marker phases are unknown, when there is complete LD among founders, or when Ne is misspecified, and less robust when p is misspecified. CM utilises all marker haplotype information, whereas RM utilises the information contained in each individual marker and in all possible marker pairs, but not in higher-order interactions. RM consists of a family of models encompassing four different population structures and two ways of using marker information, which contrasts with the single model in CM that must cater for all possible evolutionary scenarios.
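As a point of reference for how the three history parameters enter such calculations, the Python sketch below computes two textbook drift quantities: the expected IBD probability after T generations (the standard inbreeding recursion) and Sved's approximation for drift-generated LD. These are classical results, not the paper's RM model; the function names and example values are illustrative only.

```python
# Classical drift formulas only; NOT the paper's regression model (RM).

def expected_ibd(Ne: float, T: int) -> float:
    """Expected IBD (inbreeding) probability after T generations of drift
    in a closed population of effective size Ne: 1 - (1 - 1/(2Ne))^T."""
    return 1.0 - (1.0 - 1.0 / (2.0 * Ne)) ** T

def expected_r2(Ne: float, c: float) -> float:
    """Sved's (1971) approximation to drift-generated LD (r^2) between
    loci separated by recombination fraction c: 1 / (1 + 4*Ne*c)."""
    return 1.0 / (1.0 + 4.0 * Ne * c)

print(expected_ibd(Ne=100, T=50))   # ~0.22
print(expected_r2(Ne=100, c=0.01))  # 0.20
```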
Abstract:
An increased interest in utilising groups of Unmanned Aerial Vehicles (UAVs) with heterogeneous capabilities and autonomy presents the challenge of managing them effectively during missions and operations. This has been the focus of research in recent years, moving from a traditional UAV management paradigm of n-to-1 (n operators for one UAV, with n being at least two) toward 1-to-n (one operator, multiple UAVs). This paper expands on the authors' previous work on a UAV functional capability framework by incorporating the concept of Functional Level of Autonomy (F-LOA) with two configurations: the lower F-LOA configuration presents sufficient information for the operator to generate solutions and make decisions to address perturbation events, while the higher F-LOA configuration presents information reflecting the F-LOA of the UAV, allowing the operator to interpret solutions and decisions generated autonomously and decide whether to veto them.
Abstract:
Oxidative stress caused by the generation of free radicals and related reactive oxygen species (ROS) at the sites of deposition has been proposed as a mechanism for many of the adverse health outcomes associated with exposure to particulate matter (PM). Recently, a new profluorescent nitroxide (PFN) molecular probe (BPEAnit) developed at QUT was applied in an entirely novel, rapid and non-cell-based assay for assessing the oxidative potential of particles (i.e. the potential of particles to induce oxidative stress). The technique was applied to particles produced by several combustion sources, namely cigarette smoke, diesel exhaust and wood smoke. One of the main findings from the initial studies undertaken at QUT was that the oxidative potential per PM mass varies significantly across combustion sources, as well as with the type of fuel used and the combustion conditions. However, possibly the most important finding from our studies was a strong correlation between the organic fraction of particles and the oxidative potential measured by the PFN assay, which clearly highlights the importance of organic species in particle-induced toxicity.
Abstract:
The Australian Business Assessment of Computer User Security (ABACUS) survey is a nationwide assessment of the prevalence and nature of computer security incidents experienced by Australian businesses. This report presents the findings of the survey, which may be used by businesses in Australia to assess the effectiveness of their information technology security measures.
Abstract:
Purpose: The measurement of broadband ultrasonic attenuation (BUA) in cancellous bone for the assessment of osteoporosis follows a parabolic-type dependence with bone volume fraction, having minima corresponding to both entire bone and entire marrow. Langton has recently proposed that the primary BUA mechanism may be significant phase interference due to variations in propagation transit time through the test sample, as detected over the phase-sensitive surface of the receive ultrasound transducer. This fundamentally simple concept assumes that the propagation of ultrasound through a complex solid:liquid composite sample such as cancellous bone may be described by an array of parallel 'sonic rays'. The transit time of each ray is defined by the proportion of bone and marrow propagated through, being a minimum (tmin) solely through bone and a maximum (tmax) solely through marrow. A Transit Time Spectrum (TTS), ranging from tmin to tmax, may be defined describing the proportion of sonic rays having a particular transit time, effectively describing the lateral inhomogeneity of transit time over the surface of the receive ultrasound transducer. Phase interference may result from the interaction of sonic rays of differing transit times. The aim of this study was to test the hypothesis that phase interference depends upon the lateral inhomogeneity of transit time, by comparing experimental measurements and computer simulation predictions of ultrasound propagation through a range of relatively simple solid:liquid models exhibiting a range of lateral inhomogeneities. Methods: A range of test models was manufactured using acrylic and water as surrogates for bone and marrow respectively. The models varied in thickness in one dimension normal to the direction of propagation, hence exhibiting a range of transit time lateral inhomogeneities, from minimal (a single transit time) to maximal (a wedge; ultimately the limiting case where each sonic ray has a unique transit time). For the experimental component of the study, two unfocused 1 MHz, ¾-inch diameter broadband transducers were utilized in transmission mode, and ultrasound signals were recorded for each of the models. The computer simulation was performed in Matlab, where the transit time and relative amplitude of each sonic ray were calculated. The transit time for each sonic ray was defined as the sum of the transit times through the acrylic and water components. The relative amplitude accounted for the reception area of each sonic ray along with absorption in the acrylic. To replicate phase-sensitive detection, all sonic rays were summed and the output signal plotted in comparison with the experimentally derived output signal. Results: From qualitative and quantitative comparison of the experimental and computer simulation results, there is an extremely high degree of agreement, 94.2% to 99.0%, between the two approaches, supporting the concept that the propagation of an ultrasound wave, for the models considered, may be approximated by a parallel sonic ray model in which the transit time of each ray is defined by the proportion of 'bone' and 'marrow' traversed. Conclusions: This combined experimental and computer simulation study has demonstrated that lateral inhomogeneity of transit time creates significant potential for phase interference if a phase-sensitive receive transducer is implemented, as in most commercial ultrasound bone analysis devices.
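The parallel 'sonic ray' model described above lends itself to a very compact simulation. The sketch below (in Python rather than the authors' Matlab) delays a broadband pulse by each ray's transit time through its acrylic and water fractions, then sums the rays to mimic phase-sensitive detection. The sound speeds, wedge geometry and pulse shape are illustrative assumptions, not the study's exact values, and absorption in the acrylic is neglected for brevity.

```python
import numpy as np

C_ACRYLIC = 2730.0   # m/s, nominal speed of sound in acrylic (assumed value)
C_WATER = 1480.0     # m/s, nominal speed of sound in water (assumed value)

def transit_time(total_thickness, acrylic_fraction):
    """Transit time of one ray: time through acrylic plus time through water."""
    d_acrylic = total_thickness * acrylic_fraction
    d_water = total_thickness - d_acrylic
    return d_acrylic / C_ACRYLIC + d_water / C_WATER

def received_signal(t, pulse, total_thickness, acrylic_fractions, weights):
    """Phase-sensitive detection: sum all delayed rays over the receiver face."""
    out = np.zeros_like(t)
    for frac, w in zip(acrylic_fractions, weights):
        out += w * pulse(t - transit_time(total_thickness, frac))
    return out

# Example: a 20 mm wedge model, so each ray sees a unique acrylic fraction.
t = np.linspace(0.0, 40e-6, 4000)
pulse = lambda tt: np.exp(-((tt - 2e-6) / 0.5e-6) ** 2) * np.sin(2 * np.pi * 1e6 * tt)
fracs = np.linspace(0.0, 1.0, 200)
weights = np.full(fracs.shape, 1.0 / fracs.size)   # equal receiver area per ray
signal = received_signal(t, pulse, 20e-3, fracs, weights)
```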
Abstract:
We applied a texture-based flow visualisation technique to a numerical hydrodynamic model of the Pumicestone Passage in southeast Queensland, Australia. The quality of the visualisations produced by our flow visualisation tool is compared with animations generated using more traditional drogue-release plot and velocity contour and vector techniques. The texture-based method is found to be far more effective in visualising advective flow within the model domain. In some instances, it also makes it easier for the researcher to identify specific hydrodynamic features within the complex flow regimes of this shallow tidal barrier estuary than the direct and geometric-based methods do.
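The abstract does not name the specific texture-based algorithm used, so the Python sketch below illustrates one common representative, line integral convolution (LIC), in which white noise is smeared along streamlines of the velocity field. Treat it as a stand-in for, not a reproduction of, the authors' tool.

```python
import numpy as np

def lic(vx, vy, noise, length=10):
    """Smear white noise along streamlines of the (vx, vy) field."""
    h, w = noise.shape
    out = np.zeros_like(noise)
    for i in range(h):
        for j in range(w):
            total, count = noise[i, j], 1
            for sign in (1.0, -1.0):          # trace downstream, then upstream
                y, x = float(i), float(j)
                for _ in range(length):
                    u = vx[int(y), int(x)]
                    v = vy[int(y), int(x)]
                    norm = np.hypot(u, v) + 1e-9
                    x += sign * u / norm      # unit pixel step along the flow
                    y += sign * v / norm
                    if not (0.0 <= x < w and 0.0 <= y < h):
                        break
                    total += noise[int(y), int(x)]
                    count += 1
            out[i, j] = total / count
    return out

# Example: a vortex-like circular field over a 128 x 128 white-noise texture.
yy, xx = np.mgrid[-1:1:128j, -1:1:128j]
texture = lic(-yy, xx, np.random.rand(128, 128), length=15)
```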
Abstract:
Previously, expected satiety (ES) has been measured using software and two-dimensional pictures presented on a computer screen. In this context, ES is an excellent predictor of self-selected portions when quantified using similar images and similar software. In the present study we sought to establish the veracity of ES as a predictor of behaviours associated with real foods. Participants (N = 30) used computer software to assess their ES and ideal portion of three familiar foods. A real bowl of one food (pasta and sauce) was then presented and participants self-selected an ideal portion size. They then consumed the portion ad libitum. Additional measures of appetite, expected and actual liking, novelty and reward were also taken. Importantly, our screen-based measures of expected satiety and ideal portion size were both significantly related to intake (p < .05). By contrast, measures of liking were relatively poor predictors (p > .05). In addition, consistent with previous studies, the majority (90%) of participants engaged in plate cleaning. Of these, 29.6% consumed more when prompted by the experimenter. Together, these findings further validate the use of screen-based measures to explore determinants of portion-size selection and energy intake in humans.
Abstract:
This paper presents a unified view of the relationship between (1) quantity and (2) price generating mechanisms in estimating individual prime construction costs/prices. A brief review of quantity generating techniques is provided, with particular emphasis on experientially based assumptive approaches, and this is compared with the level of pricing data available for the quantities generated in terms of the reliability of the ensuing prime cost estimates. It is argued that there is a tradeoff between the reliability of quantity items and the reliability of rates. Thus it is shown that the level of quantity generation is optimised by maximising the joint reliability function of the quantity items and their associated rates. Some thoughts on how this joint reliability function can be evaluated and quantified follow. The application of these ideas is described within the overall strategy of the estimator's decision - "Which estimating technique shall I use for a given level of contract information?" - and a case is made for the computer generation of estimates by several methods, with an indication of the reliability of each estimate, the ultimate choice of estimate being left to the estimator concerned. Finally, the potential for the development of automatic estimating systems within this framework is examined.
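The paper gives no functional forms for the two reliability curves, so the Python sketch below is purely hypothetical: it assumes quantity reliability rises with the level of detail while rate reliability falls, and picks the level that maximises their product, mirroring the joint-reliability argument above.

```python
# Purely hypothetical curves: the paper specifies no functional forms, so the
# shapes below are invented solely to illustrate the tradeoff.

def quantity_reliability(level):
    """Finer quantity generation -> more reliable quantity items (assumed)."""
    return level / (level + 2.0)

def rate_reliability(level):
    """Finer quantity generation -> sparser pricing data for rates (assumed)."""
    return 1.0 / (1.0 + 0.15 * level)

def joint_reliability(level):
    return quantity_reliability(level) * rate_reliability(level)

levels = range(1, 11)                  # coarsest (1) to finest (10) detail
best = max(levels, key=joint_reliability)
print(best, joint_reliability(best))   # -> 4, an interior optimum
```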
Abstract:
Background: Optimal adherence to antiretroviral therapy (ART) is necessary for people living with HIV/AIDS (PLHIV). There have been relatively few systematic analyses of the factors that promote or inhibit adherence to antiretroviral therapy among PLHIV in Asia. This study assessed ART adherence and examined factors associated with suboptimal adherence in northern Viet Nam. Methods: Data from 615 PLHIV on ART in two urban and three rural outpatient clinics were collected by medical record extraction and from patient interviews using audio computer-assisted self-interview (ACASI). Results: The prevalence of suboptimal adherence was estimated to be 24.9% via a visual analogue scale (VAS) of past-month dose-missing and 29.1% using a modified Adult AIDS Clinical Trials Group scale for on-time dose-taking in the past 4 days. Factors significantly associated with the more conservative VAS score were: depression (p < 0.001), side-effect experiences (p < 0.001), heavy alcohol use (p = 0.001), chance health locus of control (p = 0.003), low perceived quality of information from care providers (p = 0.04) and low social connectedness (p = 0.03). Illicit drug use alone was not significantly associated with suboptimal adherence, but interacted with heavy alcohol use to reduce adherence (p < 0.001). Conclusions: This is the largest survey of ART adherence yet reported from Asia and the first in a developing country to use the ACASI method in this context. The evidence strongly indicates that ART services in Viet Nam should include screening and treatment for depression, linkage with alcohol and/or drug dependence treatment, and counselling to address the belief that chance or luck determines health outcomes.
Abstract:
This paper treats the blast response of a pile foundation in saturated sand using explicit nonlinear finite element analysis, considering the complex material behavior of the soil and soil–pile interaction. Blast wave propagation in the soil is studied, and the horizontal deformation of the pile and the effective stresses in the pile are presented. Results indicate that the upper part of the pile is vulnerable and that the pile response decays with distance from the explosive. The findings of this research provide valuable information on the effects of underground explosions on pile foundations and will guide the future development, validation and application of computer models.
Abstract:
Computer games have become a commonplace but engaging activity among students. They enjoy playing computer games because they can perform larger-than-life activities virtually, such as jumping from great heights, flying planes and racing cars; actions that are otherwise not possible in real life. Computer games also offer user interactivity, which gives them a certain appeal. Considering this appeal, educators should consider integrating computer games into student learning and encouraging students to author computer games of their own. It is thought that students can be engaged in learning by authoring and using computer games, and can also gain essential skills such as collaboration, teamwork, problem solving and deductive reasoning. The research in this study revolves around building student engagement through the task of authoring computer games. The study aims to demonstrate how the creation and sharing of student-authored educational games might facilitate student engagement, and how ICT (information and communication technology) plays a supportive role in student learning. Results from this study may lead to the broader integration of computer games into student learning and contribute to similar studies. In this qualitative case study, based in a state school in a low socio-economic area west of Brisbane, Australia, students were selected from both junior and senior secondary classes who had authored computer games as part of their ICT learning. Senior secondary students (Year 12 ICT) were given the task of programming the games, which were to be based on Mathematics learning topics, while the junior secondary students (Year 8 ICT) were given the task of creating multimedia elements for the games. A Mathematics teacher volunteered to assist in the project and provided guidance on the inclusion of suitable Mathematics curricular content in these computer games. The student-authored computer games were then used to support another group of Year 8 Mathematics students learning the topics of Area, Volume and Time. Data were collected through interviews, classroom observations and artefacts. The teacher researcher, acting in the role of ICT teacher, coordinated with the students and the Mathematics teacher to conduct this study. An instrumental case study was applied as the research methodology, and Third Generation Activity Theory served as the theoretical framework. Data were analysed using qualitative coding procedures. Findings of this study indicate that having students author and play computer games promoted student engagement, and that ICT played a supportive role in learning and allowed students to gain certain essential skills. Although this study suggests integrating computer games to support classroom learning, it cannot be presumed that computer games are an immediate solution for promoting student engagement.
Abstract:
Topic modeling has been widely utilized in the fields of information retrieval, text mining and text classification. Most existing statistical topic modeling methods, such as LDA and pLSA, generate a term-based representation of a topic by selecting single words from the multinomial word distribution over that topic. There are two main shortcomings: firstly, popular or common words occur very often across different topics, which makes topics ambiguous and hard to interpret; secondly, single words lack the coherent semantic meaning needed to accurately represent topics. In order to overcome these problems, in this paper we propose a two-stage model that combines text mining and pattern mining with statistical modeling to generate more discriminative and semantically rich topic representations. Experiments show that the optimized topic representations generated by the proposed methods outperform the typical statistical topic modeling method LDA in terms of accuracy and certainty.
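The abstract does not detail the pattern-mining stage, so the following Python sketch only illustrates the general two-stage shape: stage 1 fits a standard LDA model (via scikit-learn), and stage 2 mines frequent word pairs from each topic's most representative documents as a crude stand-in for the proposed pattern-based representation.

```python
from collections import Counter
from itertools import combinations
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Toy corpus; the pair-mining stage below is a naive stand-in for the
# paper's (unspecified) pattern-mining stage.
docs = ["data mining finds patterns in data",
        "topic models represent documents as topic mixtures",
        "pattern mining finds frequent patterns",
        "lda is a statistical topic model"]

vec = CountVectorizer()
X = vec.fit_transform(docs)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
doc_topic = lda.transform(X)
vocab = vec.get_feature_names_out()

for k in range(lda.n_components):
    # Stage 1: the usual single-word topic representation.
    top_words = [vocab[i] for i in lda.components_[k].argsort()[::-1][:3]]
    # Stage 2: frequent word pairs mined from the topic's top documents.
    pairs = Counter()
    for d in doc_topic[:, k].argsort()[::-1][:2]:
        pairs.update(combinations(sorted(set(docs[d].split())), 2))
    print(f"topic {k}: words={top_words}, pairs={pairs.most_common(2)}")
```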
Abstract:
Deterministic computer simulation of physical experiments is now a common technique in science and engineering, since physical experiments are often too time-consuming, expensive or impossible to conduct. The use of complex computer models or codes in place of physical experiments has led to the study of computer experiments, which are used to investigate many scientific phenomena. A computer experiment consists of a number of runs of the computer code with different input choices. The Design and Analysis of Computer Experiments is a rapidly growing area of statistical experimental design. This paper discusses some practical issues in designing computer simulations and/or experiments for manufacturing systems, and a case study approach is reviewed and presented.
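As one concrete example of the design choices such a paper reviews, the Python sketch below generates a Latin hypercube design, a standard space-filling plan for computer experiments. The run count, input count and seed are arbitrary, and the paper itself does not prescribe this particular design.

```python
import numpy as np

def latin_hypercube(n_runs: int, n_inputs: int, seed=None) -> np.ndarray:
    """n_runs design points in [0, 1]^n_inputs with one point per stratum
    in every dimension (a basic random Latin hypercube)."""
    rng = np.random.default_rng(seed)
    jitter = rng.random((n_runs, n_inputs))      # position within each stratum
    design = np.empty_like(jitter)
    for j in range(n_inputs):
        design[:, j] = (rng.permutation(n_runs) + jitter[:, j]) / n_runs
    return design

# Example: 8 runs of a simulator with 3 input factors, then rescale one
# factor to a physical range (here, an arbitrary 10-50 units).
X = latin_hypercube(8, 3, seed=42)
speeds = 10.0 + 40.0 * X[:, 0]
```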