898 results for Mapping documentary
Abstract:
The challenge of persistent appearance-based navigation and mapping is to develop an autonomous robotic vision system that can simultaneously localize, map and navigate over the lifetime of the robot. However, the computation time and memory requirements of current appearance-based methods typically scale not only with the size of the environment but also with the operation time of the platform; also, repeated revisits to locations will develop multiple competing representations which reduce recall performance. In this paper we present a solution to the persistent localization, mapping and global path planning problem in the context of a delivery robot in an office environment over a one-week period. Using a graphical appearance-based SLAM algorithm, CAT-Graph, we demonstrate constant time and memory loop closure detection with minimal degradation during repeated revisits to locations, along with topological path planning that improves over time without using a global metric representation. We compare the localization performance of CAT-Graph to openFABMAP, an appearance-only SLAM algorithm, and the path planning performance to occupancy-grid based metric SLAM. We discuss the limitations of the algorithm with regard to environment change over time and illustrate how the topological graph representation can be coupled with local movement behaviors for persistent autonomous robot navigation.
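The experience-driven topological planning described above can be illustrated with a minimal sketch (this is not the CAT-Graph algorithm itself): a graph of places whose edge costs are running estimates of traversal time, replanned with Dijkstra's algorithm so routes improve with experience and no global metric map is needed. The class, place names, and update rule below are illustrative assumptions.

```python
import heapq

class TopologicalMap:
    """Sketch of a topological map: places as nodes, traversal-cost edges."""

    def __init__(self):
        self.edges = {}  # node -> {neighbour: estimated traversal cost}

    def add_edge(self, a, b, cost):
        self.edges.setdefault(a, {})[b] = cost
        self.edges.setdefault(b, {})[a] = cost

    def update_edge(self, a, b, observed, alpha=0.3):
        # Refine the cost estimate from an observed traversal time
        # (exponential moving average), so later plans improve.
        for u, v in ((a, b), (b, a)):
            self.edges[u][v] = (1 - alpha) * self.edges[u][v] + alpha * observed

    def plan(self, start, goal):
        # Dijkstra over place nodes; returns the node sequence or None.
        pq = [(0.0, start, [start])]
        seen = set()
        while pq:
            cost, node, path = heapq.heappop(pq)
            if node == goal:
                return path
            if node in seen:
                continue
            seen.add(node)
            for nbr, c in self.edges.get(node, {}).items():
                if nbr not in seen:
                    heapq.heappush(pq, (cost + c, nbr, path + [nbr]))
        return None
```

After a slow traversal is observed on one edge, replanning automatically routes around it, which is the sense in which the topological plan "improves over time".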
Abstract:
Introduction: Undergraduate students studying the Bachelor of Radiation Therapy at Queensland University of Technology (QUT) attend clinical placements in a number of department sites across Queensland. To ensure that the curriculum prepares students for the most common treatments and current techniques in use in these departments, a curriculum matching exercise was performed. Methods: A cross-sectional census was performed on a pre-determined “Snapshot” date in 2012. This was undertaken by the clinical education staff in each department who used a standardized proforma to count the number of patients as well as prescription, equipment, and technique data for a list of tumour site categories. This information was combined into aggregate anonymized data. Results: All 12 Queensland radiation therapy clinical sites participated in the Snapshot data collection exercise to produce a comprehensive overview of clinical practice on the chosen day. A total of 59 different tumour sites were treated on the chosen day and as expected the most common treatment sites were prostate and breast, comprising 46% of patients treated. Data analysis also indicated that intensity-modulated radiotherapy (IMRT) use is relatively high with 19.6% of patients receiving IMRT treatment on the chosen day. Both IMRT and image-guided radiotherapy (IGRT) indications matched recommendations from the evidence. Conclusion: The Snapshot method proved to be a feasible and efficient method of gathering useful
Abstract:
Historical information can be used, in addition to pedigree, traits and genotypes, to map quantitative trait locus (QTL) in general populations via maximum likelihood estimation of variance components. This analysis is known as linkage disequilibrium (LD) and linkage mapping, because it exploits both linkage in families and LD at the population level. The search for QTL in the wild population of Soay sheep on St. Kilda is a proof of principle. We analysed the data from a previous study and confirmed some of the QTLs reported. The most striking result was the confirmation of a QTL affecting birth weight that had been reported using association tests but not when using linkage-based analyses. Copyright © Cambridge University Press 2010.
Abstract:
A novel multiple regression method (RM) is developed to predict identity-by-descent probabilities at a locus L (IBDL) among individuals without pedigree, given information on surrounding markers and population history. These IBDL probabilities are a function of the increase in linkage disequilibrium (LD) generated by drift in a homogeneous population over generations. Three parameters are sufficient to describe population history: effective population size (Ne), number of generations since foundation (T), and marker allele frequencies among founders (p). The IBDL probabilities are used in a simulation study to map a quantitative trait locus (QTL) via variance component estimation. RM is compared to a coalescent method (CM) in terms of power and robustness of QTL detection. Differences between RM and CM are small but significant. For example, RM is more powerful than CM in dioecious populations, but not in monoecious populations. Moreover, RM is more robust than CM when marker phases are unknown, when there is complete LD among founders, or when Ne is wrong, and less robust when p is wrong. CM utilises all marker haplotype information, whereas RM utilises the information contained in each individual marker and in all possible marker pairs, but not in higher-order interactions. RM consists of a family of models encompassing four different population structures and two ways of using marker information, which contrasts with the single model that must cater for all possible evolutionary scenarios in CM.
Abstract:
A whole-genome scan was conducted to map quantitative trait loci (QTL) for BSE resistance or susceptibility. Cows from four half-sib families were included and 173 microsatellite markers were used to construct a 2835-cM (Kosambi) linkage map covering 29 autosomes and the pseudoautosomal region of the sex chromosome. Interval mapping by linear regression was applied and extended to a multiple-QTL analysis approach that used identified QTL on other chromosomes as cofactors to increase mapping power. In the multiple-QTL analysis, two genome-wide significant QTL (BTA17 and X/Y ps) and four genome-wide suggestive QTL (BTA1, 6, 13, and 19) were revealed. The QTL identified here using linkage analysis do not overlap with regions previously identified using TDT analysis. One factor that may explain the disparity between the results is that a more extensive data set was used in the present study. Furthermore, methodological differences between TDT and linkage analyses may affect the power of these approaches.
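The interval-mapping-by-regression idea above can be reduced to its core step: at a test position, regress the phenotype on a genotype code and convert the reduction in residual variance into a LOD score. The following single-marker sketch omits the interval interpolation and the multiple-QTL cofactors used in the study; the function name and genotype coding are illustrative assumptions.

```python
import math

def lod_score(genotypes, phenotypes):
    """Regress phenotype on a 0/1/2 genotype code and return a LOD score.

    LOD = (n/2) * log10(RSS_null / RSS_full), comparing the model with the
    marker effect against the null model with only a mean.
    """
    n = len(genotypes)
    mx = sum(genotypes) / n
    my = sum(phenotypes) / n
    sxx = sum((x - mx) ** 2 for x in genotypes)
    sxy = sum((x - mx) * (y - my) for x, y in zip(genotypes, phenotypes))
    syy = sum((y - my) ** 2 for y in phenotypes)
    rss_full = syy - (sxy ** 2 / sxx if sxx else 0.0)  # marker-effect model
    rss_null = syy                                     # mean-only model
    if rss_full <= 0:
        return float("inf")
    return (n / 2.0) * math.log10(rss_null / rss_full)
```

A strong genotype-phenotype association shrinks the full-model residual sum of squares and drives the LOD score up; a flat relationship gives a LOD near zero.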
Abstract:
Next generation screens of diverse dimensions such as the Pebble e-paper watch, Google’s Project Glass, Microsoft’s Kinect and IllumiRoom, and large-scale multi-touch screen surface areas, increasingly saturate and diversify the urban mediascape. This paper seeks to contribute to media architecture and interaction design theory by starting to critically examine how these different screen formats are creating a ubiquitous screen mediascape across the city. We introduce next generation personal, domestic, and public screens. The paper critically challenges conventional dichotomies such as local / global, online / offline, private / public, large / small, mobile / static, that have been created in the past to describe some of the qualities and characteristics of interfaces and their usage. More and more scholars recognise that the black and white nature of these dichotomies does not adequately represent the fluid and agile capabilities of many new screen interfaces. With this paper, we hope to illustrate the more nuanced ‘trans-scalar’ qualities of these new urban interactions, that is, ways in which they provide a range of functionality without being locked into either end of a scale.
Abstract:
The building sector is the dominant consumer of energy and therefore a major contributor to anthropogenic climate change. The rapid generation of photorealistic, 3D environment models with incorporated surface temperature data has the potential to improve thermographic monitoring of building energy efficiency. In pursuit of this goal, we propose a system which combines a range sensor with a thermal-infrared camera. Our proposed system can generate dense 3D models of environments with both appearance and temperature information, and is the first such system to be developed using a low-cost RGB-D camera. The proposed pipeline processes depth maps successively, forming an ongoing pose estimate of the depth camera and optimizing a voxel occupancy map. Voxels are assigned four channels representing estimates of their true RGB and thermal-infrared intensity values. Poses corresponding to each RGB and thermal-infrared image are estimated through a combination of timestamp-based interpolation and a pre-determined knowledge of the extrinsic calibration of the system. Raycasting is then used to color the voxels to represent both visual appearance using RGB, and an estimate of the surface temperature. The output of the system is a dense 3D model which can simultaneously represent both RGB and thermal-infrared data using one of two alternative representation schemes. Experimental results demonstrate that the system is capable of accurately mapping difficult environments, even in complete darkness.
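The timestamp-based interpolation step mentioned above can be sketched minimally: an RGB or thermal frame rarely coincides exactly with a depth-camera pose estimate, so its pose is blended from the two bracketing timestamped poses. The sketch below handles translation only; a real system would also interpolate rotation (e.g. quaternion SLERP) and then apply the extrinsic calibration. The function name is an illustrative assumption, not the paper's API.

```python
def interpolate_pose(t, t0, pose0, t1, pose1):
    """Linearly blend two timestamped (x, y, z) camera positions to
    estimate the pose at an image's capture time t, with t0 <= t <= t1."""
    if not t0 <= t <= t1:
        raise ValueError("t outside the bracketing depth-frame timestamps")
    w = (t - t0) / (t1 - t0)  # 0 at pose0, 1 at pose1
    return tuple((1 - w) * a + w * b for a, b in zip(pose0, pose1))
```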
Abstract:
Mapping Multiple Literacies brings together the latest theory and research in the fields of literacy study and European philosophy, Multiple Literacies Theory (MLT) and the philosophical work of Gilles Deleuze. It frames the process of becoming literate as a fluid process involving multiple modes of presentation, and explains these processes in terms of making maps of our social lives and ways of doing things together. For Deleuze, language acquisition is a social activity of which we are a part, but only one part amongst many others. Masny and Cole draw on Deleuze's thinking to expand the repertoires of literacy research and understanding. They outline how we can understand literacy as a social activity and map the ways in which becoming literate may take hold and transform communities. The chapters in this book weave together theory, data and practice to open up a creative new area of literacy studies and to provoke vigorous debate about the sociology of literacy.
Abstract:
The application of different EMS current thresholds on muscle activates not only the muscle but also peripheral sensory axons that send proprioceptive and pain signals to the cerebral cortex. A 32-channel time-domain fNIRS instrument was employed to map regional cortical activities under varied EMS current intensities applied on the right wrist extensor muscle. Eight healthy volunteers underwent EMS at four different current thresholds based on their individual maximal tolerated intensity (MTI), i.e., 10% < 50% < 100% < over 100% MTI. Time courses of the absolute oxygenated and deoxygenated hemoglobin concentrations, primarily over the bilateral sensorimotor cortical (SMC) regions, were extracted, and cortical activation maps were determined by a general linear model using the NIRS-SPM software. The stimulation-induced wrist extension paradigm significantly increased activation of the contralateral SMC region according to the EMS intensities, while the ipsilateral SMC region showed no significant changes. This could be due in part to a nociceptive response to the higher EMS current intensities, and may also result from increased sensorimotor integration in these cortical regions.
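The general-linear-model step above amounts to regressing each channel's hemoglobin time course on a stimulation regressor and mapping the fitted weight. The sketch below shows only that core regression for one channel, without the HRF convolution and statistical thresholding that NIRS-SPM performs; the function name is an illustrative assumption.

```python
def glm_beta(signal, regressor):
    """Least-squares slope of one fNIRS channel's hemoglobin time course
    regressed on a stimulation regressor (plus an implicit mean term)."""
    n = len(signal)
    mr = sum(regressor) / n
    ms = sum(signal) / n
    num = sum((r - mr) * (s - ms) for r, s in zip(regressor, signal))
    den = sum((r - mr) ** 2 for r in regressor)
    return num / den  # larger beta -> stronger stimulus-locked activation
```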
Abstract:
Text categorisation is challenging due to the complex structure of documents, with heterogeneous, changing topics. The performance of text categorisation relies on the quality of samples, the effectiveness of document features, and the topic coverage of categories, depending on the strategies employed: supervised or unsupervised, single-labelled or multi-labelled. To address these reliability issues in text categorisation, we propose an unsupervised multi-labelled text categorisation approach that maps the local knowledge in documents to global knowledge in a world ontology to optimise the categorisation result. The conceptual framework of the approach consists of three modules: pattern mining for feature extraction, feature-subject mapping for categorisation, and concept generalisation for optimised categorisation. The approach has been evaluated promisingly by comparison with typical text categorisation methods, based on ground truth encoded by human experts.
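The feature-subject mapping and concept generalisation modules can be sketched as follows: extracted document terms vote for ontology subjects, which may then be generalised to broader parent concepts. The term-to-subject and parent mappings below are invented toy data standing in for a world ontology, and the multi-label voting rule is an illustrative simplification of the paper's method.

```python
TERM_TO_SUBJECT = {          # assumed term -> ontology subject mapping
    "neuron": "Neuroscience",
    "synapse": "Neuroscience",
    "qubit": "Quantum computing",
}
SUBJECT_PARENT = {           # assumed subject -> broader parent concept
    "Neuroscience": "Biology",
    "Quantum computing": "Computer science",
}

def categorise(terms, generalise=False):
    """Map document terms to ontology subjects (multi-labelled),
    optionally generalising each subject to its parent concept."""
    votes = {}
    for term in terms:
        subject = TERM_TO_SUBJECT.get(term)
        if subject is None:
            continue  # term not covered by the ontology
        if generalise:
            subject = SUBJECT_PARENT.get(subject, subject)
        votes[subject] = votes.get(subject, 0) + 1
    # Every supported subject becomes a label, strongest first.
    return sorted(votes, key=votes.get, reverse=True)
```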
Abstract:
Multiple sclerosis (MS) is a common chronic inflammatory disease of the central nervous system. Susceptibility to the disease is affected by both environmental and genetic factors. Genetic factors include haplotypes in the major histocompatibility complex (MHC) and over 50 non-MHC loci reported by genome-wide association studies. Amongst these, we previously reported polymorphisms in chromosome 12q13-14 with a protective effect in individuals of European descent. This locus spans 288 kb and contains 17 genes, including several candidate genes which have potentially significant pathogenic and therapeutic implications. In this study, we aimed to fine-map this locus. We have implemented a two-phase study: a variant discovery phase where we used next-generation sequencing and two target-enrichment strategies [long-range polymerase chain reaction (PCR) and Nimblegen's solution phase hybridization capture] in pools of 25 samples; and a genotyping phase where we genotyped 712 variants in 3577 healthy controls and 3269 MS patients. This study confirmed the association (rs2069502, P = 9.9 × 10−11, OR = 0.787) and narrowed down the locus of association to an 86.5 kb region. Although the study was unable to pinpoint the key associated variant, we have identified a haplotype block of 42 (genotyped and imputed) single-nucleotide polymorphisms likely to harbour the causal variant. No evidence of association at previously reported low-frequency variants in CYP27B1 was observed. As part of the study we compared variant discovery performance using the two target-enrichment strategies. We concluded that our pools enriched with Nimblegen's solution phase hybridization capture had better sensitivity to detect true variants than the pools enriched with long-range PCR, whilst specificity was better in the long-range PCR-enriched pools; this result has important implications for the design of future fine-mapping studies.
Abstract:
To understand the underlying genetic architecture of cardiovascular disease (CVD) risk traits, we undertook a genome-wide linkage scan to identify CVD quantitative trait loci (QTLs) in 377 individuals from the Norfolk Island population. The central aim of this research focused on the utilization of a genetically and geographically isolated population of individuals from Norfolk Island for the purposes of variance component linkage analysis to identify QTLs involved in CVD risk traits. Substantial evidence supports the involvement of traits such as systolic and diastolic blood pressures, high-density lipoprotein-cholesterol, low-density lipoprotein-cholesterol, body mass index and triglycerides as important risk factors for CVD pathogenesis. In addition to the environmental influences of poor diet, reduced physical activity, increasing age, cigarette smoking and alcohol consumption, many studies have illustrated a strong involvement of genetic components in the CVD phenotype through family and twin studies. We undertook a genome scan using 400 markers spaced approximately 10 cM in 600 individuals from Norfolk Island. Genotype data was analyzed using the variance components methods of SOLAR. Our results gave a peak LOD score of 2.01 localizing to chromosome 1p36 for systolic blood pressure and replicated previously implicated loci for other CVD relevant QTLs.
Abstract:
The Chemistry Discipline Network has recently completed two distinct mapping exercises. The first is a snapshot of chemistry taught at 12 institutions around Australia in 2011. There were many similarities but also important differences in the content taught and assessed at different institutions. There were also significant differences in delivery, particularly laboratory contact hours, as well as forms and weightings of assessment. The second mapping exercise mapped the chemistry degrees at three institutions to the Threshold Learning Outcomes (TLOs) for chemistry. Importantly, some of the TLOs were addressed by multiple units at all institutions, while others were not met, or were met at an introductory level only. The exercise also exposed some challenges in using the TLOs as currently written.
Abstract:
Democratic governments raise taxes and charges and spend revenue on delivering peace, order and good government. The delivery process begins with a legislature, as it can provide a framework of legally enforceable rules enacted according to the government’s constitution. These rules confer rights and obligations that allow particular people to carry on particular functions at particular places and times. Metadata standards as applied to public records contain information about the functioning of government as distinct from the non-government sector of society. Metadata standards apply to database construction. Data entry, storage, maintenance, interrogation and retrieval depend on a controlled vocabulary needed to enable accurate retrieval of suitably catalogued records in a global information environment. Queensland’s socioeconomic progress now depends in part on technical efficiency in database construction to address queries about who does what, where and when; under what legally enforceable authority; and how the evidence of those facts is recorded. The Survey and Mapping Infrastructure Act 2003 (Qld) addresses the technical aspects of ‘where’ questions: typically the officially recognised name of a place and a description of its boundaries. The current 10-year review of the Survey and Mapping Regulation 2004 provides a valuable opportunity to consider whether the Regulation makes sense in the context of a number of later laws concerned with management of Public Sector Information (PSI), as well as policies for ICT hardware and software procurement. Removing ambiguities about how official place names are to be regarded on a whole-of-government basis can achieve some short-term goals. Longer-term goals depend on a more holistic approach to information management, and current aspirations for more open government and community engagement are unlikely to be realised without such a longer-term vision.