942 results for Subpixel precision
Abstract:
In this paper, we present the outcomes of a project exploring the use of Field Programmable Gate Arrays (FPGAs) as co-processors for scientific computation. We designed a custom circuit for the pipelined solving of multiple tri-diagonal linear systems. The design is well suited to applications that require many independent tri-diagonal system solves, such as finite difference methods for solving PDEs or applications utilising cubic spline interpolation. The selected solver algorithm was the Tri-Diagonal Matrix Algorithm (TDMA, or Thomas Algorithm). Our solver supports user-specified precision through the use of a custom floating point VHDL library supporting addition, subtraction, multiplication and division. The variable-precision TDMA solver was tested for correctness in simulation mode. The TDMA pipeline was tested successfully in hardware using a simplified solver model. The details of the implementation, its limitations, and future work are also discussed.
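For reference, a minimal Python sketch of the Thomas Algorithm named above (forward elimination followed by back substitution). This models the arithmetic only; it is not the authors' pipelined VHDL design, and the example system is illustrative.

```python
def thomas_solve(a, b, c, d):
    """Solve a tri-diagonal system Ax = d.
    a: sub-diagonal (a[0] unused), b: main diagonal, c: super-diagonal (c[-1] unused)."""
    n = len(b)
    cp = [0.0] * n          # modified super-diagonal
    dp = [0.0] * n          # modified right-hand side
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    # Forward elimination
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    # Back substitution
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# Example: solves [[4,1,0],[1,4,1],[0,1,4]] x = [6,12,14] -> [1.0, 2.0, 3.0]
print(thomas_solve([0.0, 1.0, 1.0], [4.0, 4.0, 4.0], [1.0, 1.0, 0.0], [6.0, 12.0, 14.0]))
```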
Abstract:
miRDeep and its variants are widely used to quantify known and novel microRNAs (miRNAs) from small RNA sequencing (RNAseq) data. This article describes miRDeep*, our integrated miRNA identification tool, which is modelled on miRDeep but improves the precision of detecting novel miRNAs by introducing new strategies to identify precursor miRNAs. miRDeep* has a user-friendly graphical interface and accepts raw data in FastQ and Sequence Alignment Map (SAM) or the binary equivalent (BAM) format. Known and novel miRNA expression levels, as measured by the number of reads, are displayed in an interface which shows each RNAseq read relative to the pre-miRNA hairpin. The secondary pre-miRNA structure and read locations for each predicted miRNA are shown and kept in a separate figure file. Moreover, the target genes of known and novel miRNAs are predicted using the TargetScan algorithm, and the targets are ranked according to their confidence scores. miRDeep* is an integrated standalone application in which sequence alignment, pre-miRNA secondary structure calculation and graphical display are purely Java coded. This application can be executed on a normal personal computer with 1.5 GB of memory. Further, we show that miRDeep* outperformed existing miRNA prediction tools on our LNCaP and other small RNAseq datasets. miRDeep* is freely available online at http://www.australianprostatecentre.org/research/software/mirdeep-star
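As a simple illustration of the read-counting measure of expression mentioned above, the sketch below counts reads whose alignments fall inside a pre-miRNA hairpin interval. This is a generic Python sketch, not miRDeep*'s Java implementation; the function name and coordinates are hypothetical.

```python
def count_reads_in_hairpin(read_alignments, hairpin_start, hairpin_end):
    """Count reads whose alignment falls entirely inside a pre-miRNA hairpin interval.
    read_alignments: list of (start, end) coordinates on the same chromosome/strand."""
    return sum(1 for start, end in read_alignments
               if start >= hairpin_start and end <= hairpin_end)

# Hypothetical coordinates for illustration only
reads = [(1005, 1026), (1010, 1031), (2000, 2021)]
print(count_reads_in_hairpin(reads, 1000, 1080))  # -> 2
```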
Abstract:
This paper presents a novel evolutionary computation approach to three-dimensional path planning for unmanned aerial vehicles (UAVs) with tactical and kinematic constraints. A genetic algorithm (GA) is modified and extended for path planning. Two GAs are seeded at the initial and final positions with a common objective: to minimise their distance apart under the given UAV constraints. This is accomplished by the synchronous optimisation of subsequent control vectors. The proposed evolutionary computation approach is called the synchronous genetic algorithm (SGA). The sequence of control vectors generated by the SGA constitutes a near-optimal path plan. The resulting path plan exhibits no discontinuity when transitioning from curved to straight trajectories. Experiments show that the paths generated by the SGA are within 2% of the optimal solution. Such a path planner, when implemented on a hardware accelerator such as a field programmable gate array chip, can be used in the UAV as an on-board replanner, as well as in ground station systems to assist in high-precision planning and modelling of mission scenarios.
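A highly simplified sketch of the bidirectional idea described above: two populations of control sequences, one grown from the start pose and one from the goal pose, are evolved to shrink the gap between their path endpoints. The 2D kinematics, fitness, pairing and mutation operators here are illustrative assumptions, not the authors' SGA.

```python
import math
import random

def propagate(start, controls, step=1.0):
    """Roll a 2D path forward: start = (x, y, heading); each control is a turn angle."""
    x, y, heading = start
    for turn in controls:
        heading += turn
        x += step * math.cos(heading)
        y += step * math.sin(heading)
    return x, y

def evolve_pair(start, goal, n_controls=20, pop=60, gens=200, max_turn=0.3):
    """Two populations share one objective: minimise the distance between the endpoint
    of the forward path (from start) and the endpoint of the backward path (from goal)."""
    def rand_ind():
        return [random.uniform(-max_turn, max_turn) for _ in range(n_controls)]
    pop_a = [rand_ind() for _ in range(pop)]
    pop_b = [rand_ind() for _ in range(pop)]
    def gap(ind_a, ind_b):
        ax, ay = propagate(start, ind_a)
        bx, by = propagate(goal, ind_b)
        return math.hypot(ax - bx, ay - by)
    for _ in range(gens):
        # Simplification: pair individuals by index and keep the best half of the pairs.
        scored = sorted(zip(pop_a, pop_b), key=lambda ab: gap(*ab))
        elite = scored[: pop // 2]
        pop_a = [a for a, _ in elite]
        pop_b = [b for _, b in elite]
        # Refill by mutating elites (crossover omitted for brevity).
        while len(pop_a) < pop:
            a = [g + random.gauss(0, 0.05) for g in random.choice(pop_a)]
            b = [g + random.gauss(0, 0.05) for g in random.choice(pop_b)]
            pop_a.append([max(-max_turn, min(max_turn, g)) for g in a])
            pop_b.append([max(-max_turn, min(max_turn, g)) for g in b])
    return min(zip(pop_a, pop_b), key=lambda ab: gap(*ab))

# Start and goal given as (x, y, heading); the two half-paths meet near the middle.
forward, backward = evolve_pair((0.0, 0.0, 0.0), (15.0, 10.0, math.pi))
```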
Abstract:
Retrieving information from Twitter is always challenging due to its large volume, inconsistent writing and noise. Most existing information retrieval (IR) and text mining methods focus on term-based approaches, but suffer from problems of term variation such as polysemy and synonymy. This problem worsens when such methods are applied to Twitter because of the tweet length limit. Over the years, people have held the hypothesis that pattern-based methods should perform better than term-based methods, as they provide more context, but limited studies have been conducted to support this hypothesis, especially on Twitter. This paper presents an innovative framework to address the issue of performing IR in microblogs. The proposed framework discovers patterns in tweets as higher-level features and uses them to assign weights to low-level features (i.e. terms) based on their distributions in the higher-level features. We present experimental results on the TREC11 microblog dataset and show that our proposed approach significantly outperforms the term-based methods Okapi BM25 and TF-IDF, as well as pattern-based methods, as measured by precision, recall and F measures.
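For context, a minimal sketch of the Okapi BM25 baseline named above, in its standard formulation with the usual k1 and b parameters. This is the term-based baseline, not the proposed pattern-based weighting; the toy corpus is made up for illustration.

```python
import math
from collections import Counter

def bm25_score(query_terms, doc_terms, corpus, k1=1.2, b=0.75):
    """Okapi BM25 score of one document (a token list) against a query,
    with the corpus given as a list of token lists."""
    N = len(corpus)
    avgdl = sum(len(d) for d in corpus) / N
    tf = Counter(doc_terms)
    df = Counter(t for d in corpus for t in set(d))  # document frequencies
    score = 0.0
    for q in query_terms:
        if tf[q] == 0 or df[q] == 0:
            continue
        idf = math.log((N - df[q] + 0.5) / (df[q] + 0.5) + 1.0)
        norm = tf[q] * (k1 + 1) / (tf[q] + k1 * (1 - b + b * len(doc_terms) / avgdl))
        score += idf * norm
    return score

# Toy corpus of "tweets" as token lists
corpus = [["gold", "coast", "flood"], ["flood", "warning", "issued"], ["sunny", "day"]]
print(bm25_score(["flood", "warning"], corpus[1], corpus))
```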
Abstract:
This paper is concerned with the unsupervised learning of object representations by fusing visual and motor information. The problem is posed for a mobile robot that develops its representations as it incrementally gathers data. The scenario is problematic because the robot has only limited information at each time step with which to generate and update its representations. Object representations are refined as multiple instances of sensory data are presented; however, it is uncertain whether two data instances correspond to the same object, and this process can easily become unstable. The premise of the presented work is that a robot's motor information enables the successful generation of visual representations. An understanding of self-motion allows a prediction to be made before performing an action, resulting in a stronger belief in the data association. The system is implemented as a data-driven partially observable semi-Markov decision process. Object representations are formed as the process's hidden states and are coordinated with motor commands through state transitions. Experiments show that the prediction process is essential in enabling the unsupervised learning method to converge to a solution, improving precision and recall over using sensory data alone.
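A toy sketch of the predict-then-gate idea described above: ego-motion is used to predict where a previously seen object should reappear in the robot frame, and an observation is associated with it only if it falls inside a gate around that prediction. The frames, gate threshold and function names are assumptions for illustration, not the authors' POSMDP formulation.

```python
import math

def predict_relative(obj_xy, ego_dx, ego_dy, ego_dtheta):
    """Predict where an object last seen at obj_xy (in the old robot frame) should
    appear after the robot translates by (ego_dx, ego_dy) and rotates by ego_dtheta."""
    # Translate into the new origin, then rotate into the new robot frame.
    x, y = obj_xy[0] - ego_dx, obj_xy[1] - ego_dy
    c, s = math.cos(-ego_dtheta), math.sin(-ego_dtheta)
    return (c * x - s * y, s * x + c * y)

def associate(prediction, observation, gate=0.5):
    """Accept the data association only if the observation lands inside the gate."""
    return math.hypot(prediction[0] - observation[0],
                      prediction[1] - observation[1]) < gate

predicted = predict_relative((2.0, 0.5), ego_dx=0.3, ego_dy=0.0, ego_dtheta=0.1)
print(associate(predicted, observation=(1.65, 0.32)))  # -> True
```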
Abstract:
This item provides supplementary materials for the paper mentioned in the title, specifically a range of organisms used in the study. The full abstract for the main paper is as follows: Next Generation Sequencing (NGS) technologies have revolutionised molecular biology, allowing clinical sequencing to become a matter of routine. NGS data sets consist of short sequence reads obtained from the machine, given context and meaning through downstream assembly and annotation. For these techniques to operate successfully, the collected reads must be consistent with the assumed species or species group, and not corrupted in some way. The common bacterium Staphylococcus aureus may cause severe and life-threatening infections in humans, with some strains exhibiting antibiotic resistance. In this paper, we apply an SVM classifier to the important problem of distinguishing S. aureus sequencing projects from alternative pathogens, including closely related Staphylococci. Using a sequence k-mer representation, we achieve precision and recall above 95%, implicating features with important functional associations.
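A minimal sketch of a k-mer representation feeding a linear SVM, in the spirit of the classifier described above. The toy sequences, labels, k value and use of scikit-learn are assumptions for illustration; the paper's actual features and classifier settings are not given here.

```python
from collections import Counter
from itertools import product
from sklearn.svm import LinearSVC  # assumed available; any linear SVM would do

def kmer_vector(seq, k=4, alphabet="ACGT"):
    """Represent a DNA sequence as counts over all k-mers of a fixed alphabet."""
    kmers = ["".join(p) for p in product(alphabet, repeat=k)]
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    return [counts[km] for km in kmers]

# Toy illustration only: real training would use many labelled sequencing projects.
X = [kmer_vector("ACGTACGTGGCCAATT"), kmer_vector("TTTTAAAACCCCGGGG")]
y = [1, 0]  # 1 = S. aureus, 0 = other pathogen (placeholder labels)
clf = LinearSVC().fit(X, y)
print(clf.predict([kmer_vector("ACGTACGTGGCCAATA")]))
```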
Abstract:
Recent advances in the planning and delivery of radiotherapy treatments have resulted in improvements in the accuracy and precision with which therapeutic radiation can be administered. As the complexity of the treatments increases it becomes more difficult to predict the dose distribution in the patient accurately. Monte Carlo methods have the potential to improve the accuracy of the dose calculations and are increasingly being recognised as the “gold standard” for predicting dose deposition in the patient. In this study, software has been developed that enables the transfer of treatment plan information from the treatment planning system to a Monte Carlo dose calculation engine. A database of commissioned linear accelerator models (Elekta Precise and Varian 2100CD at various energies) has been developed using the EGSnrc/BEAMnrc Monte Carlo suite. Planned beam descriptions and CT images can be exported from the treatment planning system using the DICOM framework. The information in these files is combined with an appropriate linear accelerator model to allow the accurate calculation of the radiation field incident on a modelled patient geometry. The Monte Carlo dose calculation results are combined according to the monitor units specified in the exported plan. The result is a 3D dose distribution that could be used to verify treatment planning system calculations. The software, MCDTK (Monte Carlo Dicom ToolKit), has been developed in the Java programming language and produces BEAMnrc and DOSXYZnrc input files, ready for submission on a high-performance computing cluster. The code has been tested with the Eclipse (Varian Medical Systems), Oncentra MasterPlan (Nucletron B.V.) and Pinnacle3 (Philips Medical Systems) planning systems. In this study the software was validated against measurements in homogeneous and heterogeneous phantoms. Monte Carlo models are commissioned through comparison with quality assurance measurements made using a large square field incident on a homogeneous volume of water. This study aims to provide a valuable confirmation that Monte Carlo calculations match experimental measurements for complex fields and heterogeneous media.
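A small sketch of the combination step described above, where per-beam Monte Carlo dose grids (normalised to dose per monitor unit) are weighted by the planned monitor units and summed. Array shapes, normalisation and the use of NumPy are assumptions for illustration; MCDTK itself is written in Java.

```python
import numpy as np

def combine_beam_doses(per_mu_doses, monitor_units):
    """Sum per-beam Monte Carlo dose grids, each normalised to dose per monitor unit,
    weighted by the monitor units specified in the exported plan."""
    total = np.zeros_like(per_mu_doses[0], dtype=float)
    for dose_per_mu, mu in zip(per_mu_doses, monitor_units):
        total += dose_per_mu * mu
    return total

# Two toy 2x2x2 "dose grids" and their planned monitor units
beams = [np.full((2, 2, 2), 0.01), np.full((2, 2, 2), 0.02)]
print(combine_beam_doses(beams, [100, 50]))  # -> 2.0 everywhere
```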
Abstract:
Accurate thin-film energy dispersive spectroscopic (EDS) analyses of clays containing low-atomic-number (low-Z) elements (e.g. Na, Al, Si) present a challenge to the microscopist, not only because of the spatial resolution required, but also because of the clays' susceptibility to electron beam-induced radiation damage and their very low X-ray count rates. Most common clays, such as kaolinite, smectite and illite, occur as submicrometer crystallites with varying degrees of structural disorder in at least two directions and may have dimensions as small as one or two unit cells along the basal direction. Thus, even clays with relatively large a-b dimensions (e.g. 100 x 100 nm) may be <5 nm in the c-axis direction. For typical conditions in an analytical electron microscope (AEM), this sample thickness gives rise to very poor count rates (<200 cps) and necessitates long counting times (>300 s) in order to obtain satisfactory statistical precision. Unfortunately, beam damage rates for the common clays are very rapid (<10 s in imaging mode) between 100 kV and 200 kV. With a focussed probe for elemental analysis, the damage rate is exacerbated by the high current density and may result in the loss of low-Z elements during data collection and a consequent loss of analytical accuracy.
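As a rough illustration of the counting-statistics constraint above, Poisson statistics give a relative uncertainty of about 1/sqrt(N) for N gross counts. The sketch below ignores background subtraction and peak overlap, which matter in real EDS analyses; the numbers are simply the count rate and live time quoted in the abstract.

```python
import math

def relative_counting_precision(count_rate_cps, live_time_s):
    """Relative one-sigma uncertainty from Poisson counting statistics, 1/sqrt(N)."""
    n_counts = count_rate_cps * live_time_s
    return 1.0 / math.sqrt(n_counts)

# e.g. 200 cps for 300 s gives 60,000 counts, i.e. roughly 0.4% relative uncertainty
print(relative_counting_precision(200, 300))
```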
Abstract:
Introduction: Evaluating the effectiveness of interventions designed to increase physical activity in communities is often a difficult and complex task, requiring considerable expertise and investment, and is often constrained by methodological limitations. These limitations, in turn, create additional challenges when such studies are used in systematic reviews, as they hinder the confidence, precision and interpretation of results. The objective of this paper is to summarise the methodological challenges posed in conducting a systematic review of community-wide physical activity interventions, to help inform those conducting future primary research and systematic reviews.
Methods: We conducted a Cochrane systematic review of community-wide interventions to increase physical activity and assessed the methodological quality of the included studies. We will investigate these in greater detail, particularly in relation to their potential impact on measures of effect, confidence in results, generalisability of results and general interpretation.
Results: The systematic review was conducted and has been published in the Cochrane Library. A logic model was helpful in defining and interpreting the studies. Many studies of unsuitable design were excluded; however, several important methodological limitations of the primary studies evaluating community-wide physical activity interventions emerged. These included:
- the failure to use validated tools to measure physical activity;
- issues associated with pre- and post-test designs;
- inadequate sampling of populations;
- poor control groups; and
- intervention and measurement protocols of inadequate duration.
Although it is challenging to undertake rigorous evaluations of complex interventions, these issues result in significant uncertainty over the effectiveness of such interventions and the factors required for a community-wide intervention to be successful. In particular, the consequence of combining several of these limitations (e.g. un-validated tools, inadequate sampling and short duration) is that studies may lack the sensitivity to detect any meaningful change. Multiple publications of findings from the same study also made interpretation difficult; however, interventions with parallel qualitative publications were helpful.
Discussion: Evaluating community-wide interventions to increase physical activity in a rigorous way is incredibly challenging. These findings reflect those challenges but have important ramifications for researchers conducting primary studies to determine the efficacy of such interventions, as well as for researchers conducting systematic reviews. This review shows that the inadequacies of design and evaluation are continuing. It is hoped that the adoption of these suggestions may aid in the development of systematic reviews and, more importantly, in enabling the translation of such findings into policy and practice.
Abstract:
Portable water-filled barriers (PWFBs) are semi-rigid roadside barriers which have the potential to display good crash attenuation characteristics at low and moderate impact speeds. Traditional mesh-based numerical methods alone fail to simulate this type of impact with precision, stability and efficiency. This paper develops an advanced simulation model based on the combination of Smoothed Particle Hydrodynamics (SPH), a meshless method, and the finite element method (FEM) for fluid-structure analysis using the commercially available software package LS-Dyna. The interaction between SPH particles and FEA elements is studied, and two methods of element setup at the element boundary were investigated. The response of the impacted barrier and the fluid inside were analysed and compared, and the system response and lagging were observed and reported. It was demonstrated that the coupled SPH/FEM approach can be used in full-scale PWFB modelling. This will aid future research in determining the best initial setup for coupling FEA and SPH in road safety barriers for impact response and safety analysis.
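For readers unfamiliar with SPH, a minimal sketch of one common smoothing kernel (the cubic spline form with its usual normalisation constants in 1D, 2D and 3D). The kernel and constants actually used by LS-Dyna's SPH solver in this work may differ; this is a generic illustration only.

```python
import math

def cubic_spline_kernel(r, h, dim=3):
    """Cubic spline smoothing kernel W(r, h) commonly used in SPH, with support 2h."""
    sigma = {1: 2.0 / (3.0 * h),
             2: 10.0 / (7.0 * math.pi * h ** 2),
             3: 1.0 / (math.pi * h ** 3)}[dim]
    q = r / h
    if q < 1.0:
        return sigma * (1.0 - 1.5 * q ** 2 + 0.75 * q ** 3)
    if q < 2.0:
        return sigma * 0.25 * (2.0 - q) ** 3
    return 0.0

print(cubic_spline_kernel(0.5, 1.0))  # kernel weight for a neighbour at half a smoothing length
```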
Abstract:
Nowadays people rely heavily on the Internet for information and knowledge. Wikipedia is an online multilingual encyclopaedia that contains a very large number of detailed articles covering most written languages, and it is often considered a treasury of human knowledge. It includes extensive hypertext links between documents of the same language for easy navigation. However, pages in different languages are rarely cross-linked except for direct equivalent pages on the same subject. This can pose serious difficulties to users seeking information or knowledge from different lingual sources, or where there is no equivalent page in one language or another. In this thesis, a new information retrieval task, cross-lingual link discovery (CLLD), is proposed to tackle the problem of the lack of cross-lingual anchored links in a knowledge base such as Wikipedia. In contrast to traditional information retrieval tasks, cross-lingual link discovery algorithms actively recommend a set of meaningful anchors in a source document and establish links to documents in an alternative language. In other words, cross-lingual link discovery is a way of automatically finding hypertext links between documents in different languages, which is particularly helpful for knowledge discovery across language domains. This study is specifically focused on Chinese / English link discovery (C/ELD), a special case of the cross-lingual link discovery task that involves natural language processing (NLP), cross-lingual information retrieval (CLIR) and cross-lingual link discovery. To assess the effectiveness of CLLD, a standard evaluation framework is also proposed; it includes topics, document collections, a gold standard dataset, evaluation metrics, and toolkits for run pooling, link assessment and system evaluation. With this evaluation framework, the performance of CLLD approaches and systems can be quantified. This thesis contributes to research on natural language processing and cross-lingual information retrieval in CLLD as follows: 1) a new, simple but effective Chinese segmentation method, n-gram mutual information, is presented for determining the boundaries of Chinese text; 2) a voting mechanism for named entity translation is demonstrated for achieving high precision in English / Chinese machine translation; 3) a link mining approach that mines the existing link structure for anchor probabilities achieves encouraging results in suggesting cross-lingual Chinese / English links in Wikipedia. This approach was examined in experiments on the automatic generation of cross-lingual links carried out as part of the study. The overall major contribution of this thesis is the provision of a standard evaluation framework for cross-lingual link discovery research. This framework is important for CLLD evaluation, as it helps in benchmarking the performance of various CLLD systems and in identifying good CLLD realisation approaches. The evaluation methods and framework described in this thesis have been utilised to quantify system performance in the NTCIR-9 Crosslink task, the first information retrieval track of its kind.
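A toy sketch of the anchor-probability idea in contribution 3): the probability that a phrase should become an anchor is estimated from how often it already appears as a link anchor versus how often it appears in the text at all. The data layout, function name and toy documents are assumptions for illustration, not the thesis implementation.

```python
from collections import Counter

def anchor_link_probabilities(documents):
    """Estimate, for each candidate phrase, count(used as anchor) / count(appears in text).
    documents: list of (plain_text, anchors); plain_text is assumed to still contain
    the anchor phrases, so the ratio stays between 0 and 1."""
    as_anchor, as_text = Counter(), Counter()
    phrases = {a for _, anchors in documents for a in anchors}
    for text, anchors in documents:
        for phrase in phrases:
            as_text[phrase] += text.count(phrase)
        for a in anchors:
            as_anchor[a] += 1
    return {p: as_anchor[p] / as_text[p] for p in phrases if as_text[p] > 0}

docs = [("the gold coast lies on the coast", ["gold coast"]),
        ("gold coast beaches and the east coast", ["gold coast"])]
print(anchor_link_probabilities(docs))  # -> {'gold coast': 1.0}
```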
Abstract:
Widespread adoption by electricity utilities of Non-Conventional Instrument Transformers, such as optical or capacitive transducers, has been limited due to the lack of a standardised interface and multi-vendor interoperability. Low power analogue interfaces are being replaced by IEC 61850-9-2 and IEC 61869-9 digital interfaces that use Ethernet networks for communication. These ‘process bus’ connections achieve significant cost savings by simplifying connections between the switchyard and control rooms; however, the in-service performance when these standards are employed is largely unknown. The performance of real-time Ethernet networks and time synchronisation was assessed using a scale model of a substation automation system. The test bed was constructed from commercially available timing and protection equipment supplied by a range of vendors. Test protocols were developed to thoroughly evaluate the performance of Ethernet networks and network-based time synchronisation. The suitability of the IEEE Std 1588 Precision Time Protocol (PTP) as a synchronising system for sampled values was tested in the steady state and under transient conditions. Similarly, the performance of hardened Ethernet switches designed for substation use was assessed under a range of network operating conditions. This paper presents test methods that use a precision Ethernet capture card to accurately measure PTP and network performance. These methods can be used for product selection and to assess ongoing system performance as substations age. Key findings on the behaviour of multi-function process bus networks are presented. System-level tests were performed using a Real Time Digital Simulator and a transformer protection relay with sampled value and Generic Object Oriented Substation Events (GOOSE) capability, including the interactions between sampled values, PTP and GOOSE messages. Our research has demonstrated that several protocols can be used on a shared process bus, even with very high network loads. This should provide confidence that this technology is suitable for transmission substations.
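For context, a minimal sketch of the offset and mean path delay calculation behind the PTP delay request-response exchange evaluated above, assuming a symmetric network path and timestamps in consistent units. The example timestamps are illustrative only.

```python
def ptp_offset_and_delay(t1, t2, t3, t4):
    """One PTP delay request-response exchange:
    t1 = Sync sent by master, t2 = Sync received by slave,
    t3 = Delay_Req sent by slave, t4 = Delay_Req received by master.
    Returns (slave offset from master, mean path delay); path assumed symmetric."""
    offset = ((t2 - t1) - (t4 - t3)) / 2.0
    mean_path_delay = ((t2 - t1) + (t4 - t3)) / 2.0
    return offset, mean_path_delay

# Timestamps in nanoseconds (illustrative values only)
print(ptp_offset_and_delay(1_000, 1_650, 2_000, 2_450))  # -> (100.0, 550.0)
```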
Abstract:
This chapter provides researchers with a guide to some of the types of dating techniques that can be used in geomorphological investigations and the issues that need to be addressed when using geochronological data, specifically issues relating to accuracy and precision. The chapter also introduces the 'types' of dating methods that are commonly used in geomorphological studies, including sidereal, isotopic, radiogenic and chemical dating methods.
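As a simple illustration of the isotopic/radiogenic category mentioned above, the basic parent-daughter age equation t = (1/lambda) * ln(1 + D/P) can be sketched as follows; it assumes a closed system with no initial daughter product, and real dating methods apply further corrections. The numbers used are illustrative only.

```python
import math

def radiogenic_age(parent_atoms, daughter_atoms, decay_constant_per_year):
    """Basic radiogenic age equation t = (1/lambda) * ln(1 + D/P), assuming a closed
    system and no initial daughter product (a simplification of real methods)."""
    return math.log(1.0 + daughter_atoms / parent_atoms) / decay_constant_per_year

# Illustrative values only: D/P = 0.1 with an arbitrary decay constant
print(radiogenic_age(parent_atoms=1.0e6, daughter_atoms=1.0e5,
                     decay_constant_per_year=5.5e-10))  # age in years
```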
Abstract:
Using Monte Carlo simulation for radiotherapy dose calculation can provide more accurate results than the analytical methods usually found in modern treatment planning systems, especially in regions with a high degree of inhomogeneity. These more accurate results, however, often require orders of magnitude more calculation time to attain high precision, reducing their utility within the clinical environment. This work aims to improve the utility of Monte Carlo simulation within the clinical environment by developing techniques that enable faster Monte Carlo simulation of radiotherapy geometries. This is achieved principally through the use of new high-performance computing environments and simpler, alternative yet equivalent representations of complex geometries. Firstly, the use of cloud computing technology and its application to radiotherapy dose calculation is demonstrated. As with other supercomputer-like environments, the time to complete a simulation decreases as 1/n as n cloud-based computers perform the calculation in parallel. Unlike traditional supercomputer infrastructure, however, there is no initial outlay of cost, only modest ongoing usage fees; the simulations described in the following are performed using this cloud computing technology. The definition of geometry within the chosen Monte Carlo simulation environment - Geometry & Tracking 4 (GEANT4) in this case - is also addressed in this work. At the simulation implementation level, a new computer aided design interface is presented for use with GEANT4, enabling direct coupling between manufactured parts and their equivalents in the simulation environment, which is of particular importance when defining linear accelerator treatment head geometry. Further, a new technique for navigating tessellated or meshed geometries is described, allowing up to 3 orders of magnitude performance improvement through the use of tetrahedral meshes in place of complex triangular surface meshes. The technique has application in the definition of both mechanical parts in a geometry and patient geometry. Static patient CT datasets like those found in typical radiotherapy treatment plans are often very large and impose a significant performance penalty on a Monte Carlo simulation. By extracting the regions of interest in a radiotherapy treatment plan and representing them in a mesh-based form similar to those used in computer aided design, the above-mentioned optimisation techniques can be used to reduce the time required to navigate the patient geometry in the simulation environment. Results presented in this work show that these equivalent yet much simplified patient geometry representations enable significant performance improvements over simulations that consider raw CT datasets alone. Furthermore, this mesh-based representation allows for direct manipulation of the geometry, enabling motion augmentation for time-dependent dose calculation, for example. Finally, an experimental dosimetry technique is described that allows the validation of time-dependent Monte Carlo simulations like those made possible by the aforementioned patient geometry definition. A bespoke organic plastic scintillator dose rate meter is embedded in a gel dosimeter, thereby enabling simultaneous 3D dose distribution and dose rate measurement.
This work demonstrates the effectiveness of applying alternative and equivalent geometry definitions to complex geometries for the purposes of Monte Carlo simulation performance improvement. Additionally, these alternative geometry definitions allow for manipulations to be performed on otherwise static and rigid geometry.
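A back-of-the-envelope sketch of the 1/n scaling referred to above, splitting a fixed number of independent histories evenly across n cloud nodes. The rates and node counts are made-up illustration values, and job start-up and result-merging overheads are ignored.

```python
def parallel_mc_wall_time(total_histories, histories_per_second_per_node, n_nodes):
    """Embarrassingly parallel Monte Carlo: histories are split evenly across nodes,
    so wall-clock time falls roughly as 1/n (start-up and merge overheads ignored)."""
    histories_per_node = total_histories / n_nodes
    return histories_per_node / histories_per_second_per_node

# e.g. 1e9 histories at 5e4 histories/s per node: 1 node vs 100 cloud nodes
print(parallel_mc_wall_time(1e9, 5e4, 1))    # -> 20000 s
print(parallel_mc_wall_time(1e9, 5e4, 100))  # -> 200 s
```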
Abstract:
Summary: More than ever before, contemporary societies are characterised by the huge amounts of data being transferred. Authorities, companies, academia and other stakeholders refer to Big Data when discussing the importance of large and complex datasets and when developing possible solutions for their use. Big Data promises to be the next frontier of innovation for institutions and individuals, yet it also offers possibilities to predict and influence human behaviour with ever-greater precision.