234 results for Hartree Fock scheme correlation errors


Relevance: 20.00%

Abstract:

This paper considers the debate about the relationship between globalization and media policy from the perspective provided by a current review of the Australian media classification scheme. Drawing upon the author’s recent experience in being ‘inside’ the policy process, as Lead Commissioner on the Australian National Classification Scheme Review, it is argued that theories of globalization – including theories of neoliberal globalization – fail to adequately capture the complexities of the reform process, particularly around the relationship between regulation and markets. The paper considers the pressure points for media content policies arising from media globalization, and the wider questions surrounding media content policies in an age of media convergence.

Relevance: 20.00%

Abstract:

Navigational collisions are one of the major safety concerns in many seaports. Despite the extent of recent research on port navigational safety, little is known about harbor pilots' perception of collision risks in port fairways. This paper uses a hierarchical ordered probit model to investigate associations between perceived risk and the geometric and traffic characteristics of fairways and the attributes of pilots. Perceived risk data, collected through a risk perception survey of Singapore port pilots, are used to calibrate the model. The intra-class correlation coefficient justifies the use of the hierarchical model over an ordinary ordered probit model. Results show higher perceived risks in fairways attached to anchorages, and in those featuring sharper bends and higher traffic operating speeds. Lower risks are perceived in fairways attached to shorelines and confined waters, and in those with one-way traffic, a traffic separation scheme, cardinal marks and isolated danger marks. Risk is also perceived to be higher at night.
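
Since the intra-class correlation coefficient is the statistic cited for preferring the hierarchical specification, the following is a minimal sketch of a one-way ANOVA estimate of ICC(1) for risk ratings grouped by pilot. The data layout and values are hypothetical, and the study's hierarchical ordered probit model itself is not reproduced here.

```python
# Sketch: one-way ANOVA estimate of the intraclass correlation coefficient, ICC(1),
# for ratings grouped by pilot. Hypothetical data; not the study's model or survey.
import numpy as np

def icc1(groups):
    """groups: list of 1-D arrays, one array of ratings per pilot."""
    k = len(groups)
    n_i = np.array([len(g) for g in groups])
    grand = np.concatenate(groups).mean()
    ms_between = sum(n * (g.mean() - grand) ** 2 for n, g in zip(n_i, groups)) / (k - 1)
    ms_within = sum(((g - g.mean()) ** 2).sum() for g in groups) / (n_i.sum() - k)
    n0 = (n_i.sum() - (n_i ** 2).sum() / n_i.sum()) / (k - 1)  # effective group size
    return (ms_between - ms_within) / (ms_between + (n0 - 1) * ms_within)

# Example: three pilots, each rating several fairways on a 1-5 scale
ratings = [np.array([4, 4, 5, 3]), np.array([2, 3, 2]), np.array([3, 4, 3, 4])]
print(f"ICC(1) = {icc1(ratings):.3f}")  # values near 0 suggest little pilot-level clustering
```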

Relevance: 20.00%

Abstract:

Statistical dependence between classifier decisions is often shown to improve performance over statistically independent decisions. Though the solution for favourable dependence between two classifier decisions has been derived, a theoretical analysis for the general case of fusing 'n' client and impostor decisions has not been presented before. This paper presents expressions for favourable dependence in multi-instance and multi-sample fusion schemes that employ 'AND' and 'OR' rules. The expressions are experimentally evaluated using the proposed architecture for text-dependent speaker verification with HMM-based, digit-dependent speaker models. The improvement in fusion performance is found to be higher when digit combinations with favourable client and impostor decisions are used for speaker verification. The total error rate of 20% for fusion of independent decisions is reduced to 2.1% for fusion of decisions that are favourable for both clients and impostors. The expressions developed here are also applicable to other biometric modalities, such as fingerprints and handwriting samples, for reliable identity verification.
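
As a point of reference for the dependent-case expressions the abstract describes, the sketch below gives the standard error rates of 'AND' and 'OR' fusion of n decisions under the statistical-independence baseline. The per-decision error rates are hypothetical, and the paper's favourable-dependence expressions are not reproduced.

```python
# Sketch: error rates of 'AND' / 'OR' fusion of n verification decisions assuming
# statistical independence. Illustrative numbers only.
import numpy as np

def and_fusion(far, frr):
    """All n decisions must accept. far, frr: per-decision error rates (arrays)."""
    fused_far = np.prod(far)                # impostor must fool every classifier
    fused_frr = 1.0 - np.prod(1.0 - frr)    # client rejected if any classifier rejects
    return fused_far, fused_frr

def or_fusion(far, frr):
    """Any single acceptance is enough."""
    fused_far = 1.0 - np.prod(1.0 - far)
    fused_frr = np.prod(frr)
    return fused_far, fused_frr

far = np.array([0.10, 0.10, 0.10])   # hypothetical per-digit false accept rates
frr = np.array([0.10, 0.10, 0.10])   # hypothetical per-digit false reject rates
for name, rule in (("AND", and_fusion), ("OR", or_fusion)):
    f_far, f_frr = rule(far, frr)
    print(f"{name}: FAR={f_far:.4f}  FRR={f_frr:.4f}  total error={(f_far + f_frr):.4f}")
```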

Relevance: 20.00%

Abstract:

Submission to the Australian Government Attorney-General’s Department consultation paper on Revising the Scope of the Copyright ‘Safe Harbour’ Scheme.

Relevance: 20.00%

Abstract:

The major limitation of current typing methods for Streptococcus pyogenes, such as emm sequence typing and T typing, is that these are based on regions subject to considerable selective pressure. Multilocus sequence typing (MLST) is a better indicator of the genetic backbone of a strain but is not widely used due to high costs. The objective of this study was to develop a robust and cost-effective alternative to S. pyogenes MLST. A 10-member single nucleotide polymorphism (SNP) set that provides a Simpson’s Index of Diversity (D) of 0.99 with respect to the S. pyogenes MLST database was derived. A typing format involving high-resolution melting (HRM) analysis of small fragments nucleated by each of the resolution-optimized SNPs was developed. The fragments were 59–119 bp in size and, based on differences in G+C content, were predicted to generate three to six resolvable HRM curves. The combination of curves across each of the 10 fragments can be used to generate a melt type (MelT) for each sequence type (ST). The 525 STs currently in the S. pyogenes MLST database are predicted to resolve into 298 distinct MelTs, and the method is calculated to provide a D of 0.996 against the MLST database. The MelTs are concordant with the S. pyogenes population structure. To validate the method, we examined clinical isolates of S. pyogenes representing 70 STs. Curves were generated as predicted by G+C content, discriminating the 70 STs into 65 distinct MelTs.
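
The discriminatory power figures quoted (D of 0.99 and 0.996) are Simpson's Index of Diversity values. A minimal sketch of that calculation is given below, using hypothetical isolate counts rather than the MLST database itself.

```python
# Sketch: Simpson's Index of Diversity (discriminatory power) for a typing scheme.
# D = 1 - sum(n_j*(n_j-1)) / (N*(N-1)), where n_j is the number of isolates of type j.
from collections import Counter

def simpsons_D(type_assignments):
    counts = Counter(type_assignments).values()
    N = sum(counts)
    return 1.0 - sum(n * (n - 1) for n in counts) / (N * (N - 1))

# Example: 10 isolates resolved into 4 melt types (hypothetical)
melt_types = ["MelT1", "MelT1", "MelT2", "MelT2", "MelT2",
              "MelT3", "MelT3", "MelT4", "MelT4", "MelT4"]
print(f"D = {simpsons_D(melt_types):.3f}")
```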

Relevance: 20.00%

Abstract:

Background. Vertebral rotation found in structural scoliosis contributes to truncal asymmetry, which is commonly measured with a simple Scoliometer device on the patient's thorax in the forward-flexed position. The new generation of mobile 'smartphones' has an integrated accelerometer, making accurate angle measurement possible and providing a potentially useful clinical tool for assessing rib hump deformity. This study aimed to compare rib hump angle measurements performed using a smartphone and a traditional Scoliometer on a set of plaster torsos representing the range of torsional deformities seen in clinical practice. Methods. Nine observers measured the rib hump found on eight plaster torsos moulded from scoliosis patients with both a Scoliometer and an Apple iPhone on separate occasions. Each observer repeated the measurements at least a week after the original measurements and was blinded to previous results. Intra-observer and inter-observer reliability were analysed using the method of Bland and Altman, and 95% confidence intervals were calculated. Intra-class correlation coefficients (ICC) were calculated for repeated measurements of each of the eight plaster torso moulds by the nine observers. Results. The mean absolute difference between pairs of iPhone/Scoliometer measurements was 2.1 degrees, with a small (1 degree) bias toward higher rib hump angles with the iPhone. 95% confidence intervals for intra-observer variability were +/- 1.8 degrees (Scoliometer) and +/- 3.2 degrees (iPhone). 95% confidence intervals for inter-observer variability were +/- 4.9 degrees (iPhone) and +/- 3.8 degrees (Scoliometer). The measurement errors and confidence intervals found were similar to or better than those in previously published thoracic rib hump measurement studies. Conclusions. The iPhone is a clinically equivalent rib hump measurement tool to the Scoliometer in spinal deformity patients. The novel use of plaster torsos as rib hump models avoids the variables of patient fatigue and discomfort, inconsistent positioning and deformity progression that arise when human subjects are measured in single or multiple sessions.
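
Since agreement between the two instruments is reported using the method of Bland and Altman, the following is a minimal sketch of that calculation for paired iPhone/Scoliometer readings. The angle values are hypothetical and are not the study's data.

```python
# Sketch: Bland-Altman bias and 95% limits of agreement for paired angle readings.
# Hypothetical values, in degrees.
import numpy as np

def bland_altman(a, b):
    """Return mean difference (bias) and 95% limits of agreement for paired readings."""
    diff = np.asarray(a, float) - np.asarray(b, float)
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

iphone      = [7.0, 12.5, 4.0, 15.5, 9.0, 20.0, 6.5, 11.0]
scoliometer = [6.0, 11.0, 3.5, 14.0, 8.5, 18.5, 6.0, 10.5]
bias, (lo, hi) = bland_altman(iphone, scoliometer)
print(f"bias = {bias:.1f} deg, 95% limits of agreement = ({lo:.1f}, {hi:.1f}) deg")
```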

Relevance: 20.00%

Abstract:

The study presents a multi-layer genetic algorithm (GA) approach using correlation-based methods to facilitate damage determination for through-truss bridge structures. To begin, the elements of the structure suspected of damage are divided into several groups. In the first GA layer, the damage is initially optimised for all groups using a correlation-based objective function. In the second layer, the groups are combined into larger groups and the optimisation restarts from the normalised result of the first layer. The identification process then repeats until the final layer is reached, where a single group includes all structural elements and only minor optimisation is required to fine-tune the final result. Several damage scenarios on a complicated through-truss bridge example are used to demonstrate the effectiveness of the proposed approach. Structural modal strain energy is employed as the variable vector in the correlation function for damage determination. Simulations and comparison with traditional single-layer optimisation show that the proposed approach is efficient and feasible for complicated truss bridge structures when measurement noise is taken into account.
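
A minimal sketch of the kind of correlation-based objective described, scoring a candidate damage pattern by the Pearson correlation between measured and predicted changes in modal strain energy. The structural model and the GA driver are omitted, and all names here are illustrative rather than the paper's implementation.

```python
# Sketch: correlation-based objective for damage identification, with modal strain
# energy (MSE) change as the comparison vector. The predicted vector would come from
# a structural model evaluated at the candidate damage pattern (not shown here).
import numpy as np

def correlation_objective(measured, predicted):
    """Negative Pearson correlation between measured and predicted MSE-change vectors.
    Minimising this drives the GA toward damage patterns whose predicted signature
    best matches the measurements."""
    m = measured - measured.mean()
    p = predicted - predicted.mean()
    return -float(np.dot(m, p) / (np.linalg.norm(m) * np.linalg.norm(p)))

# Illustrative fitness evaluation for one candidate (hypothetical element-wise values)
measured_mse_change  = np.array([0.02, 0.15, 0.01, 0.40, 0.03])
candidate_prediction = np.array([0.01, 0.12, 0.02, 0.35, 0.05])
print(f"objective = {correlation_objective(measured_mse_change, candidate_prediction):.4f}")
```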

Relevance: 20.00%

Abstract:

The purpose of this study was to explore barriers and facilitators to using CityCycle, a public bicycle share scheme in Brisbane, Australia. Focus groups were conducted with participants belonging to one of three categories: group one consisted of infrequent and non-cyclists (no bicycle riding over the past month), group two consisted of regular bicycle riders (had ridden a bicycle at least once in the past month), and group three was composed of CityCycle members. A thematic analytic method was used to analyse the data. Three main themes were found: accessibility/spontaneity, safety, and weather/topography. The lengthy sign-up process was seen to stifle the spontaneity typically thought to attract people to public bike share, and mandatory helmet legislation was thought to further reduce spontaneous use. Safety was a major concern for all groups; this included a perceived lack of suitable bicycle infrastructure, and regular riders described a negative attitude among some car drivers. Interestingly, CityCycle riders unanimously perceived car driver attitudes to improve when riding CityCycle bicycles relative to riding personal bicycles. Conclusions: The results of this study suggest that increasing the popularity of the CityCycle scheme will require a more accessible and spontaneous sign-up process, 24/7 opening hours, and greater incentives to sign up new members and casual users, as seeing people using CityCycle appears critical to further uptake.

Relevance: 20.00%

Abstract:

Stormwater quality modelling results are subject to uncertainty. The variability of input parameters is an important source of overall model error. An in-depth understanding of the variability associated with input parameters can provide knowledge of the uncertainty associated with these parameters and consequently assist in the uncertainty analysis of stormwater quality models and in decision making based on modelling outcomes. This paper discusses the outcomes of a research study undertaken to analyse the variability of pollutant build-up parameters in stormwater quality modelling. The study was based on the analysis of pollutant build-up samples collected from 12 road surfaces in residential, commercial and industrial land uses. It was found that build-up characteristics vary appreciably even within the same land use. Therefore, using land use as a lumped parameter would contribute significant uncertainty to stormwater quality modelling. It was also found that the variability in pollutant build-up can be significant depending on the pollutant type. This underlines the importance of taking into account specific land use characteristics and targeted pollutant species when undertaking uncertainty analysis of stormwater quality models or when interpreting modelling outcomes.
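
To illustrate why build-up parameter variability matters for model uncertainty, here is a minimal Monte Carlo sketch that propagates assumed parameter spreads through a commonly used exponential build-up form. The equation, parameter distributions and units are assumptions for illustration only; they are not the study's data or model.

```python
# Sketch: Monte Carlo propagation of build-up parameter variability through a simple
# exponential build-up model, B(t) = Bmax * (1 - exp(-k*t)). Assumed values only.
import numpy as np

rng = np.random.default_rng(42)
n = 10_000
t = 7.0                                         # antecedent dry days (assumed)
b_max = rng.normal(2.5, 0.8, n).clip(min=0.1)   # g/m^2, hypothetical spread across sites
k = rng.normal(0.4, 0.15, n).clip(min=0.05)     # 1/day, hypothetical spread across sites

build_up = b_max * (1.0 - np.exp(-k * t))
print(f"median build-up: {np.median(build_up):.2f} g/m^2")
print(f"90% interval: {np.percentile(build_up, 5):.2f} - {np.percentile(build_up, 95):.2f} g/m^2")
```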

Relevance: 20.00%

Abstract:

In the context of ambiguity resolution (AR) for Global Navigation Satellite Systems (GNSS), decorrelation of the entries of an ambiguity vector, integer ambiguity search and ambiguity validation are the three standard procedures for solving integer least-squares problems. This paper contributes to AR from three aspects. Firstly, the orthogonality defect is introduced as a new measure of the performance of ambiguity decorrelation methods and is compared with the decorrelation number and the condition number, which are currently used as criteria for measuring the correlation of the ambiguity variance-covariance matrix. Numerically, the orthogonality defect captures the relationship between decorrelation impact and computational efficiency slightly better than the condition number. Secondly, the paper examines the relationship of the decorrelation number, the condition number, the orthogonality defect and the size of the ambiguity search space with the number of ambiguity search candidates and search nodes. The size of the ambiguity search space can be properly estimated if the ambiguity matrix is well decorrelated, and this size is shown to be a significant parameter in the ambiguity search process. Thirdly, a new ambiguity resolution scheme is proposed to improve ambiguity search efficiency through control of the size of the ambiguity search space. The new AR scheme combines the LAMBDA search and validation procedures, which results in a much smaller search space and higher computational efficiency while retaining the same AR validation outcomes. In fact, the new scheme can deal with the case where there is only one candidate, whereas the existing search methods require at least two candidates; if there is more than one candidate, the new scheme reverts to the usual ratio-test procedure. Experimental results indicate that this combined method can indeed improve ambiguity search efficiency for both single-constellation and dual-constellation cases, showing its potential for processing high-dimension integer parameters in a multi-GNSS environment.
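
For reference, here is a minimal sketch of the orthogonality defect computed alongside the condition number for an ambiguity variance-covariance matrix. The example matrix and the integer transform Z are hypothetical, and the LAMBDA decorrelation itself is not implemented here.

```python
# Sketch: orthogonality defect as a decorrelation measure, compared with the condition
# number. The defect is evaluated on the Cholesky factor of the vc-matrix; it equals 1
# for a perfectly orthogonal set of columns and grows as the columns become correlated.
import numpy as np

def orthogonality_defect(A):
    """prod of column norms / sqrt(det(A^T A))."""
    col_norms = np.linalg.norm(A, axis=0)
    return np.prod(col_norms) / np.sqrt(np.linalg.det(A.T @ A))

Q = np.array([[6.29, 5.98, 0.54],
              [5.98, 6.29, 2.34],
              [0.54, 2.34, 6.29]])          # hypothetical ambiguity vc-matrix

Z = np.array([[1, -1, 0],
              [0,  1, 0],
              [0,  0, 1]])                  # hypothetical unimodular (integer) transform
Qz = Z @ Q @ Z.T

for name, M in (("original", Q), ("transformed", Qz)):
    L = np.linalg.cholesky(M)
    print(f"{name:11s} defect={orthogonality_defect(L):.3f}  cond={np.linalg.cond(M):.1f}")
```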

Relevance: 20.00%

Abstract:

This chapter reports on eleven interviews with Pro-Am archivists of Australian television which aimed to find out how they decide what materials are important enough to archive. Interviewees mostly choose to collect materials in which they have a personal interest. But they are also aware of the relationship between their own favourites and wider accounts of Australian television history, and negotiate between these two positions. Most interviewees acknowledged Australian television’s links with British and American programming, but also felt that Australian television is distinctive. They argued that Australian television history is ignored in a way that isn’t true for the UK or the US. Several also argued that Australian television has had a ‘naïve’ nature that has allowed it to be more experimental.

Relevance: 20.00%

Abstract:

A significant issue encountered when fusing data received from multiple sensors is the accuracy of the timestamp associated with each piece of data. This is particularly important in applications such as Simultaneous Localisation and Mapping (SLAM), where vehicle velocity forms an important part of the mapping algorithms; on fast-moving vehicles, even millisecond inconsistencies in data timestamping can produce errors which need to be compensated for. The timestamping problem is compounded in a robot swarm environment due to the use of non-deterministic, readily available hardware (such as 802.11-based wireless) and inaccurate clock synchronisation protocols (such as the Network Time Protocol (NTP)). As a result, the synchronisation of the clocks between robots can be out by tens to hundreds of milliseconds, making correlation of data difficult and preventing the units from performing synchronised actions such as triggering cameras or intricate swarm manoeuvres.

In this thesis, a complete data fusion unit is designed, implemented and tested. The unit, named BabelFuse, is able to accept sensor data from a number of low-speed communication buses (such as RS232, RS485 and CAN Bus) and also timestamp events that occur on General Purpose Input/Output (GPIO) pins, referencing a submillisecond-accurate, wirelessly-distributed "global" clock signal. In addition to its timestamping capabilities, it can also be used to trigger an attached camera at a predefined start time and frame rate. This functionality enables the creation of a wirelessly-synchronised, distributed image acquisition system over a large geographic area; a real-world application of this functionality is a platform to facilitate wirelessly-distributed 3D stereoscopic vision.

A 'best-practice' design methodology is adopted within the project to ensure the final system operates according to its requirements. Initially, requirements are generated, from which a high-level architecture is distilled. This architecture is then converted into a hardware specification and low-level design, which is then manufactured. The manufactured hardware is then verified to ensure it operates as designed, and firmware and Linux Operating System (OS) drivers are written to provide the features and connectivity required of the system. Finally, integration testing is performed to ensure the unit functions as per its requirements.

The BabelFuse system comprises a single Grand Master unit, which is responsible for maintaining the absolute value of the "global" clock. Slave nodes then determine their local clock offset from that of the Grand Master via synchronisation events which occur multiple times per second. The mechanism used for synchronising the clocks between the boards wirelessly makes use of specific hardware and a firmware protocol based on elements of the IEEE-1588 Precision Time Protocol (PTP). With the key requirement of the system being submillisecond-accurate clock synchronisation (as a basis for timestamping and camera triggering), automated testing is carried out to monitor the offsets between each Slave and the Grand Master over time. A common strobe pulse is also sent to each unit for timestamping; the correlation between the timestamps of the different units is used to validate the clock offset results.

Analysis of the automated test results shows that the BabelFuse units are almost three orders of magnitude more accurate than their requirement: the clocks of the Slave and Grand Master units do not differ by more than three microseconds over a running time of six hours, and the mean clock offset of the Slaves from the Grand Master is less than one microsecond. The common strobe pulse used to verify the clock offset data yields a positive result, with a maximum variation between units of less than two microseconds and a mean value of less than one microsecond. The camera triggering functionality is verified by connecting the trigger pulse output of each board to a four-channel digital oscilloscope and setting each unit to output a 100 Hz periodic pulse with a common start time. The resulting waveform shows a maximum variation between the rising edges of the pulses of approximately 39 µs, well below the target of 1 ms.
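
The clock synchronisation described is based on elements of IEEE-1588 PTP. As a point of reference, here is a minimal sketch of the textbook two-way offset/delay calculation at the core of that protocol, with hypothetical timestamps; BabelFuse's wireless hardware timestamping is not modelled.

```python
# Sketch: offset/delay calculation from an IEEE-1588 (PTP) style two-way exchange
# between a master (Grand Master) and a slave node. Timestamps are hypothetical.
def ptp_offset_and_delay(t1, t2, t3, t4):
    """t1: Sync sent (master clock), t2: Sync received (slave clock),
    t3: Delay_Req sent (slave clock), t4: Delay_Req received (master clock).
    Returns (slave clock offset from master, mean one-way path delay)."""
    offset = ((t2 - t1) - (t4 - t3)) / 2.0
    delay = ((t2 - t1) + (t4 - t3)) / 2.0
    return offset, delay

# Hypothetical timestamps in microseconds
offset, delay = ptp_offset_and_delay(t1=1000.0, t2=1052.0, t3=1100.0, t4=1148.0)
print(f"offset = {offset:+.1f} us, path delay = {delay:.1f} us")
```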

Relevance: 20.00%

Abstract:

Motorcycle trauma is a serious road safety issue in Queensland and throughout Australia. In 2009, Queensland Transport (later Transport and Main Roads, or TMR) appointed CARRS-Q to provide a three-year program of Road Safety Research Services for Motorcycle Rider Safety. Funding for this research originated from the Motor Accident Insurance Commission. This program of research was undertaken to produce knowledge to assist TMR to improve motorcycle safety by further strengthening the licensing and training system: by developing a pre-learner package to make learner riders safer (Deliverable 1, which is the focus of this report), and by evaluating the Q-Ride CAP program to ensure that it is maximally effective and contributes to the best possible training for new riders (Deliverable 2). Deliverable 3 of the program identified potential new licensing components that would reduce the incidence of risky riding and improve higher-order cognitive skills in new riders. While fatality and injury rates for learner car drivers are typically lower than for those with intermediate licences, this pattern is not found for learner motorcycle riders. Learner riders cannot be supervised as effectively as learner car drivers, and errors are more likely to result in injury for learner riders than for learner drivers. It is therefore imperative to improve safety for learner riders. Deliverable 1 examines the potential for improving the motorcycle learner licensing scheme by introducing a pre-learner motorcycle licensing and training scheme within Queensland. The tasks undertaken for Deliverable 1 were a literature review, analysis of learner motorcyclist crash and licensing data, and the development of a potential pre-learner motorcycle rider program.