496 results for multiple measurements
Abstract:
Alignment-free methods, in which shared properties of sub-sequences (e.g. identity or match length) are extracted and used to compute a distance matrix, have recently been explored for phylogenetic inference. However, the scalability and robustness of these methods to key evolutionary processes remain to be investigated. Here, using simulated sequence sets of various sizes in both nucleotides and amino acids, we systematically assess the accuracy of phylogenetic inference using an alignment-free approach, based on D2 statistics, under different evolutionary scenarios. We find that compared to a multiple sequence alignment approach, D2 methods are more robust against among-site rate heterogeneity, compositional biases, genetic rearrangements and insertions/deletions, but are more sensitive to recent sequence divergence and sequence truncation. Across diverse empirical datasets, the alignment-free methods perform well for sequences sharing low divergence, at greater computation speed. Our findings provide strong evidence for the scalability and the potential use of alignment-free methods in large-scale phylogenomics.
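At their core, the D2 statistics compared in this study are inner products of k-mer count vectors. A minimal sketch in Python (the k-mer length and the geometric-mean normalisation are illustrative choices, not taken from the study):

```python
import math
from collections import Counter

def kmer_counts(seq, k):
    """Count all overlapping k-mers (length-k substrings) in a sequence."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def d2(seq_a, seq_b, k=3):
    """D2 statistic: inner product of the two k-mer count vectors."""
    ca, cb = kmer_counts(seq_a, k), kmer_counts(seq_b, k)
    return sum(ca[w] * cb[w] for w in ca.keys() & cb.keys())

def d2_distance(seq_a, seq_b, k=3):
    """Turn D2 into a dissimilarity in [0, 1] by normalising with the
    geometric mean of the self-similarities (one common variant)."""
    num = d2(seq_a, seq_b, k)
    den = math.sqrt(d2(seq_a, seq_a, k) * d2(seq_b, seq_b, k))
    return 1.0 - num / den
```

Pairwise d2_distance values over a sequence set fill the distance matrix from which a tree can then be inferred (e.g. by neighbour-joining), with no alignment step required.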
Abstract:
The autonomous capabilities of collaborative unmanned aircraft systems are growing rapidly. Without appropriate transparency, the effectiveness of the future multiple Unmanned Aerial Vehicle (UAV) management paradigm will be significantly limited by the human agent's cognitive abilities, as the operator's Cognitive Workload (CW) and Situation Awareness (SA) will become disproportionate. This poses a challenge in evaluating the impact of robot autonomous capability feedback, which gives the human agent, in a supervisory role, greater transparency into the robot's autonomous status. This paper presents the motivation, aim, related works, experiment theory, methodology, results and discussion, and the future work succeeding this preliminary study. The results illustrate that, with greater transparency of a UAV's autonomous capability, an overall improvement in the subjects' cognitive abilities was evident: with 95% confidence, the test subjects' mean CW showed a statistically significant reduction, while their mean SA showed a statistically significant increase.
Abstract:
This research introduces a promising technique for monitoring the degradation status of the oil-paper insulation systems of large power transformers in online mode; innovative enhancements are also made to the existing offline measurements, affording a more direct understanding of the insulation degradation process. Further, these techniques benefit from quick measurement owing to the application of chirp waveform signals. The techniques are developed on the basis of measuring the impedance response of insulation systems. Their feasibility and validity were supported by extensive simulation work as well as experimental investigations.
Abstract:
The quality of environmental decisions should be gauged according to managers' objectives. Management objectives generally seek to maximize quantifiable measures of system benefit, for instance population growth rate. Reaching these goals often requires a certain degree of learning about the system. Learning can occur by using management action in combination with a monitoring system. Furthermore, actions can be chosen strategically to obtain specific kinds of information. Formal decision making tools can choose actions to favor such learning in two ways: implicitly, via the optimization algorithm that is used when there is a management objective (for instance, when using adaptive management), or explicitly, by quantifying knowledge and using it as the fundamental project objective, an approach new to conservation. This paper outlines three conservation project objectives - a pure management objective, a pure learning objective, and an objective that is a weighted mixture of these two. We use eight optimization algorithms to choose actions that meet project objectives and illustrate them in a simulated conservation project. The algorithms provide a taxonomy of decision making tools in conservation management when there is uncertainty surrounding competing models of system function. The algorithms build upon each other such that their differences are highlighted and practitioners may see where their decision making tools can be improved. © 2010 Elsevier Ltd.
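The weighted mixture of management and learning objectives described above can be sketched as a scoring rule over candidate actions. A hypothetical illustration in Python (the surrogate for learning value - between-model disagreement in predicted outcomes - and all names are assumptions for illustration, not the paper's eight algorithms):

```python
def expected_growth(action, model_weights, growth):
    """Management objective: expected population growth rate for an
    action, averaged over competing models of system function."""
    return sum(w * growth[m][action] for m, w in enumerate(model_weights))

def model_disagreement(action, model_weights, growth):
    """Crude learning surrogate: actions whose predicted outcomes differ
    most between models are assumed to be the most informative."""
    preds = [growth[m][action] for m in range(len(model_weights))]
    mean = sum(w * p for w, p in zip(model_weights, preds))
    return sum(w * (p - mean) ** 2 for w, p in zip(model_weights, preds))

def choose_action(actions, model_weights, growth, alpha):
    """Pick the action maximising the weighted mixture:
    alpha * management objective + (1 - alpha) * learning objective."""
    def score(a):
        return (alpha * expected_growth(a, model_weights, growth)
                + (1 - alpha) * model_disagreement(a, model_weights, growth))
    return max(actions, key=score)
```

With alpha = 1 this reduces to a pure management objective, with alpha = 0 to a pure learning objective, and intermediate values trade the two off.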
Abstract:
This paper presents a framework for synchronising multiple triggered sensors with respect to a local clock using standard computing hardware. Providing sensor measurements with accurate and meaningful timestamps is important for many sensor fusion, state estimation and control applications. Accurately synchronising sensor timestamps can be performed with specialised hardware; however, performing sensor synchronisation using standard computing hardware and non-real-time operating systems is difficult due to inaccurate and temperature-sensitive clocks, variable communication delays and operating system scheduling delays. Results show the ability of our framework to estimate time offsets to sub-millisecond accuracy. We also demonstrate how synchronising timestamps with our framework results in a tenfold reduction in image stabilisation error for a vehicle driving on rough terrain. The source code will be released as an open source tool for time synchronisation in ROS.
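One ingredient of such a framework can be sketched simply: because transport and scheduling delays are non-negative, the smallest observed difference between host receive time and device timestamp is the tightest estimate of the clock offset. A minimal constant-offset sketch (an assumption for illustration; a full framework would also need to model clock drift):

```python
def estimate_offset(device_stamps, host_stamps):
    """Estimate the offset between a sensor clock and the host clock.
    Model: host = device + offset + delay, with delay >= 0, so the
    minimum observed difference bounds the offset most tightly."""
    assert len(device_stamps) == len(host_stamps)
    return min(h - d for d, h in zip(device_stamps, host_stamps))

def to_host_time(device_stamp, offset):
    """Translate a device timestamp into the host clock frame."""
    return device_stamp + offset
```

Re-running the estimator over a sliding window of recent timestamp pairs is one way to track a slowly drifting offset.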
Abstract:
Water to air methane emissions from freshwater reservoirs can be dominated by sediment bubbling (ebullitive) events. Previous work to quantify methane bubbling from a number of Australian sub-tropical reservoirs has shown that this can contribute as much as 95% of total emissions. These bubbling events are controlled by a variety of factors including water depth, surface and internal waves, wind seiching, atmospheric pressure changes and water level changes. Key to quantifying the magnitude of this emission pathway is estimating both the bubbling rate and the areal extent of bubbling. Neither is constant, and both require persistent monitoring over extended time periods before true estimates can be generated. In this paper we present a novel system for persistent monitoring of both bubbling rate and areal extent using multiple robotic surface chambers and adaptive sampling (grazing) algorithms to automate the quantification process. Individual chambers are self-propelled and guided and communicate with each other without the need for supervised control. They can maintain station at a sampling site for a desired incubation period and continuously monitor, record and report fluxes during the incubation. To exploit the methane sensor detection capabilities, the chamber can be automatically lowered to decrease the head-space and increase concentration. The grazing algorithms assign a hierarchical order to chambers within a preselected zone. Chambers then converge on the individual recording the highest 15 minute bubbling rate. Individuals maintain a specified distance apart from each other during each sampling period before all individuals are required to move to different locations based on a sampling algorithm (systematic or adaptive) exploiting prior measurements. This system has been field tested on a large-scale subtropical reservoir, Little Nerang Dam, over monthly timescales.
Using this technique, localised bubbling zones on the water storage were found to produce over 50,000 mg m-2 d-1 and the areal extent ranged from 1.8 to 7% of the total reservoir area. The drivers behind these changes as well as lessons learnt from the system implementation are presented. This system exploits relatively cheap materials, sensing and computing and can be applied to a wide variety of aquatic and terrestrial systems.
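One step of the convergence behaviour described above can be sketched as follows (a hypothetical simplification in Python: chambers move straight toward the chamber reporting the highest 15-minute bubbling rate, stopping a fixed standoff distance away to stay separated):

```python
import math

def converge_targets(positions, rates, standoff):
    """Compute one movement step of the 'grazing' behaviour: every
    chamber heads toward the current highest-rate chamber, but stops
    'standoff' metres short so chambers remain spaced apart."""
    leader = max(range(len(rates)), key=lambda i: rates[i])
    lx, ly = positions[leader]
    targets = []
    for i, (x, y) in enumerate(positions):
        if i == leader:
            targets.append((x, y))          # leader holds station
            continue
        dx, dy = lx - x, ly - y
        dist = math.hypot(dx, dy)
        if dist <= standoff:
            targets.append((x, y))          # already close enough
        else:
            s = (dist - standoff) / dist    # fraction of the way to move
            targets.append((x + s * dx, y + s * dy))
    return targets
```

Repeating this step as new 15-minute rates arrive, then dispersing the fleet according to the systematic or adaptive sampling rule, reproduces the converge-and-relocate cycle the abstract describes.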
Abstract:
Ultrafine particles are particles that are less than 0.1 micrometres (µm) in diameter. Due to their very small size they can penetrate deep into the lungs and potentially cause more damage than larger particles. The Ultrafine Particles from Traffic Emissions and Children's Health (UPTECH) study is the first Australian epidemiological study to assess the health effects of ultrafine particles on children's health in general and peripheral airways in particular. The study is being conducted in Brisbane, Australia. Continuous indoor and outdoor air pollution monitoring was conducted within each of the twenty-five participating school campuses to measure particulate matter, including in the ultrafine size range, and gases. Respiratory health effects were evaluated by conducting the following tests on participating children at each school: spirometry, forced oscillation technique (FOT) and multiple breath nitrogen washout test (MBNW) (to assess airway function), fraction of exhaled nitric oxide (FeNO, to assess airway inflammation), blood cotinine levels (to assess exposure to second-hand tobacco smoke), and serum C-reactive protein (CRP) levels (to measure systemic inflammation). A pilot study was conducted prior to commencing the main study to assess the feasibility and reliability of measurement of some of the clinical tests proposed for the main study. Air pollutant exposure measurements were not included in the pilot study.
Abstract:
Player experiences and expectations are connected. The presumptions players have about how they control their gameplay interactions may shape the way they play and perceive videogames. A successfully engaging player experience might rest on the way controllers meet players' expectations. We studied player interaction with novel controllers on the Sony PlayStation Wonderbook, an augmented reality (AR) gaming system. Our goal was to understand player expectations regarding game controllers in AR game design. Based on this preliminary study, we propose several interaction guidelines for hybrid input from both augmented reality and physical game controllers.
Abstract:
This thesis in software engineering presents a novel automated framework to identify similar operations utilized by multiple algorithms for solving related computing problems. It provides a new, effective solution for multi-application algorithm analysis, employing fundamentally lightweight static analysis techniques compared to state-of-the-art approaches. Significant performance improvements are achieved across the target algorithms by enhancing the efficiency of the identified similar operations, targeting discrete application domains.
Abstract:
The purpose of this study was to examine the main and interactive effects of four dimensions of professional commitment on strain (i.e., depression, anxiety, perceived health status, and job dissatisfaction) for a sample of 176 law professionals. The study utilized a two-wave design in which professional commitment and strain were measured at Time 1 and strain was measured again at Time 2 (T2), 2 months later. A significant two-way interaction indicated that high affective commitment was related to less T2 job dissatisfaction only for lawyers with low accumulated costs. A significant four-way interaction indicated that high affective professional commitment was only related to fewer symptoms of T2 anxiety for lawyers with high normative professional commitment and both low limited alternatives and accumulated costs. A similar pattern of results emerged in regard to T2 perceived health status. The theoretical and practical implications of these results for career counselors are discussed.
Abstract:
There is an increasing need in biology and clinical medicine to robustly and reliably measure tens-to-hundreds of peptides and proteins in clinical and biological samples with high sensitivity, specificity, reproducibility and repeatability. Previously, we demonstrated that LC-MRM-MS with isotope dilution has suitable performance for quantitative measurements of small numbers of relatively abundant proteins in human plasma, and that the resulting assays can be transferred across laboratories while maintaining high reproducibility and quantitative precision. Here we significantly extend that earlier work, demonstrating that 11 laboratories using 14 LC-MS systems can develop, determine analytical figures of merit, and apply highly multiplexed MRM-MS assays targeting 125 peptides derived from 27 cancer-relevant proteins and 7 control proteins to precisely and reproducibly measure the analytes in human plasma. To ensure consistent generation of high quality data we incorporated a system suitability protocol (SSP) into our experimental design. The SSP enabled real-time monitoring of LC-MRM-MS performance during assay development and implementation, facilitating early detection and correction of chromatographic and instrumental problems. Low to sub-nanogram/mL sensitivity for proteins in plasma was achieved by one-step immunoaffinity depletion of 14 abundant plasma proteins prior to analysis. Median intra- and inter-laboratory reproducibility was <20%, sufficient for most biological studies and candidate protein biomarker verification. Digestion recovery of peptides was assessed and quantitative accuracy improved using heavy isotope labeled versions of the proteins as internal standards. Using the highly multiplexed assay, participating laboratories were able to precisely and reproducibly determine the levels of a series of analytes in blinded samples used to simulate an inter-laboratory clinical study of patient samples. 
Our study further establishes that LC-MRM-MS using stable isotope dilution, with appropriate attention to analytical validation and appropriate quality control measures, enables sensitive, specific, reproducible and quantitative measurements of proteins and peptides in complex biological matrices such as plasma.
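The two core calculations in such a workflow - isotope-dilution quantification against a heavy-labelled internal standard, and the coefficient of variation used to report intra- and inter-laboratory reproducibility - can be sketched as follows (function names are illustrative, not from the study):

```python
def peptide_conc(light_area, heavy_area, heavy_spike_conc):
    """Stable isotope dilution: the endogenous ('light') peptide
    concentration is the light/heavy peak-area ratio multiplied by the
    known concentration of the spiked heavy-labelled internal standard."""
    return (light_area / heavy_area) * heavy_spike_conc

def cv_percent(values):
    """Coefficient of variation (%) across replicate measurements,
    the usual reproducibility metric (here the <20% criterion)."""
    n = len(values)
    mean = sum(values) / n
    sd = (sum((v - mean) ** 2 for v in values) / (n - 1)) ** 0.5
    return 100.0 * sd / mean
```

Because the heavy standard co-elutes and co-fragments with its light counterpart, the area ratio cancels most matrix and instrument effects, which is what makes the assays transferable across the 14 LC-MS systems.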
Abstract:
Study Design: Comparative analysis. Background: Calculations of lower limb kinetics are limited by floor-mounted force-plates. Objectives: Comparison of hip joint moments, power and mechanical work on the prosthetic limb of a transfemoral amputee calculated by inverse dynamics using either the ground reactions (force-plates) or knee reactions (transducer). Methods: Kinematics, ground reactions and knee reactions were collected using a motion analysis system, two force-plates and a multi-axial transducer mounted below the socket, respectively. Results: The inverse dynamics using ground reactions under-estimated the peaks of hip energy generation and absorption occurring at 63% and 76% of the gait cycle (GC) by 28% and 54%, respectively. This method over-estimated a phase of negative work at the hip (from 37%GC to 56%GC) by 24%. It under-estimated the phases of positive (from 57%GC to 72%GC) and negative (from 73%GC to 98%GC) work at the hip by 11% and 58%, respectively. Conclusions: A transducer mounted within the prosthesis has the capacity to provide more realistic kinetics of the prosthetic limb because it enables assessment of multiple consecutive steps and a wide range of activities without issues of foot placement on force-plates. Clinical Relevance: The hip is the only joint that an amputee controls directly to set the prosthesis in motion. Hip joint kinetics are associated with joint degeneration, low back pain, risk of falls, etc. Therefore, realistic assessment of hip kinetics over multiple gait cycles and a wide range of activities is essential.
Abstract:
In the structural health monitoring (SHM) field, long-term continuous vibration-based monitoring is becoming increasingly popular, as it can keep track of the health status of structures during their service lives. However, implementing such a system is not always feasible due to the ongoing conflict between budget constraints and the need for sophisticated systems to monitor real-world structures under their demanding in-service conditions. To address this problem, this paper presents a comprehensive development of a cost-effective and flexible vibration DAQ system for long-term continuous SHM of a newly constructed institutional complex, with a special focus on the main building. First, selections of sensor type and sensor positions are scrutinized to overcome adversities such as low-frequency and low-level vibration measurements. To economically tackle the sparse measurement problem, a cost-optimized Ethernet-based peripheral DAQ model is adopted to form the system skeleton. A combination of a high-resolution timing coordination method based on the TCP/IP command communication medium and a periodic system resynchronization strategy is then proposed to synchronize data from multiple distributed DAQ units. The results of both experimental evaluations and experimental-numerical verifications show that the proposed DAQ system in general, and the data synchronization solution in particular, work well and can provide a promising cost-effective and flexible alternative for use in real-world SHM projects. Finally, the paper demonstrates simple but effective ways to make use of the developed monitoring system for long-term continuous structural health evaluation, as well as to use the instrumented building as a multi-purpose benchmark structure for studying not only practical SHM problems but also synchronization related issues.
Abstract:
Student perceptions of teaching have often been used in tertiary education for evaluation purposes. However, there is a paucity of research on the validity, reliability, and applicability of instruments that cover a wide range of student perceptions of pedagogies and practices in high school settings for descriptive purposes. The study attempts to validate an inventory of pedagogy and practice (IPP) that provides researchers and practitioners with a psychometrically sound instrument that covers the most salient factors related to teaching. Using a sample of students (N = 1515) from 39 schools in Singapore, 14 factors about teaching in English lessons from the students’ perspective were tested with confirmatory factor analysis (classroom task goal, structure and clarity, curiosity and interest, positive class climate, feedback, questioning, quality homework, review of students’ work, conventional teaching, exam preparation, behaviour management, maximizing learning time, student-centred pedagogy, and subject domain teaching). Two external criterion factors were used to further test the IPP factor structure. The inventory will enable teachers to understand more about their teaching and researchers to examine how teaching may be related to learning outcomes.
Abstract:
Background: Dementia is a chronic illness without cure or effective treatment, which results in declining mental and physical function and the need for assistance from others to manage activities of daily living. Many people with dementia live in long term care facilities, yet research into their quality of life (QoL) was rare until the last decade. Previous studies failed to incorporate important variables related to the facility and care provision or to look closely at the daily lives of residents. This paper presents a protocol for a comprehensive, multi-perspective assessment of the QoL of residents with dementia living in long term care in Australia. A secondary aim is to investigate the effectiveness of self-report instruments for measuring QoL. Methods: The study utilizes a descriptive, mixed methods design to examine how facility, care staff, and resident factors impact QoL. Over 500 residents with dementia from a stratified, random sample of 53 facilities are being recruited. A sub-sample of 12 residents is also taking part in qualitative interviews and observations. Conclusions: This national study will provide a broad understanding of the factors underlying QoL for residents with dementia in long term care. The present study uses a similar methodology to the US-based Collaborative Studies of Long Term Care (CS-LTC) Dementia Care Study, applying it to the Australian setting.