960 results for Map-based Cloning
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-05
Abstract:
Bacterial artificial chromosome (BAC) libraries have been widely used in different aspects of genome research. In this paper we report the construction of the first mungbean (Vigna radiata L. Wilczek) BAC libraries. These BAC clones were obtained from two ligations and represent an estimated 3.5 genome equivalents. This correlated well with the screening of nine random single-copy restriction fragment length polymorphism probes, which detected on average three BACs each. These mungbean clones were successfully used in the development of two PCR-based markers linked closely with a major locus conditioning bruchid (Callosobruchus chinensis) resistance. These markers will be invaluable in facilitating the introgression of bruchid resistance into breeding programmes as well as the further characterisation of the resistance locus.
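The reported agreement between the estimated 3.5 genome equivalents and the roughly three BACs detected per single-copy probe follows from standard library-coverage arithmetic. The sketch below illustrates the calculation; the clone count, insert size and genome size are assumed placeholder values, not figures from the paper.

```python
import math

# Illustrative only: relating library depth to the expected number of hits per
# single-copy probe.  All values below are assumptions, not the paper's data.
n_clones = 14_400            # hypothetical number of BAC clones in the library
mean_insert_kb = 140         # hypothetical mean insert size (kb)
genome_size_kb = 579_000     # assumed mungbean genome size (~579 Mb)

coverage = n_clones * mean_insert_kb / genome_size_kb   # genome equivalents
p_hit = 1 - math.exp(-coverage)                         # Clarke-Carbon: P(locus represented)

print(f"coverage ~ {coverage:.1f}x")                    # ~3.5 genome equivalents
print(f"P(single-copy locus present) ~ {p_hit:.2%}")
print(f"expected BACs per single-copy probe ~ {coverage:.1f}")
```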
Abstract:
Phytophthora root rot, caused by Phytophthora medicaginis, is a major limitation to lucerne (Medicago sativa L.) production in Australia and North America. Quantitative trait loci (QTLs) involved in resistance to P. medicaginis were identified in a lucerne backcross population of 120 individuals. A genetic linkage map was constructed for tetraploid lucerne using 50 RAPD (randomly amplified polymorphic DNA) markers, 104 AFLP (amplified fragment length polymorphism) markers, and one SSR (simple sequence repeat or microsatellite) marker, which originated from the resistant parent (W116); 13 markers remained unlinked. The linkage map contains 18 linkage groups covering 2136.5 cM, with an average distance of 15.0 cM between markers. Four of the linkage groups contained only 2 or 3 markers. Using duplex markers and repulsion-phase linkages, the map condensed to 7 homology groups and 2 unassigned linkage groups. Three regions, located on linkage groups 2, 14, and 18, were identified as associated with root reaction, and the QTLs explained 6-15% of the phenotypic variation. The research also indicates that different resistance QTLs are involved in conferring resistance in different organs. Two QTLs were identified as associated with disease resistance expressed after inoculation of detached leaves. The marker W11-2 on group 18, identified as associated with root reaction, contributed 7% of the phenotypic variation in leaf response in our population. This marker appears to be linked to a QTL encoding a resistance factor contributing to both root and leaf reaction. One other QTL, not identified as associated with root reaction, was positioned on group 1 and contributed to 6% of the variation. This genetic linkage map provides an entry point for future molecular-based improvement of lucerne in Australia, and markers linked to the QTLs we have reported should be useful for marker-assisted selection for partial resistance to P. medicaginis in lucerne.
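As a rough illustration of how the "percentage of phenotypic variation explained" by a marker is obtained, the sketch below runs a single-marker regression on synthetic data; it is not the authors' QTL analysis, and the marker effect and noise level are invented.

```python
import numpy as np

# Hedged illustration with synthetic data: the share of phenotypic variation
# explained by one marker is the R^2 of a single-marker regression of
# phenotype on marker genotype.
rng = np.random.default_rng(0)
n = 120                                                  # population size, as in the abstract
marker = rng.integers(0, 2, size=n)                      # single-dose (present/absent) marker
phenotype = 0.8 * marker + rng.normal(0, 1.0, size=n)    # simulated root-rot score

X = np.column_stack([np.ones(n), marker])
beta, *_ = np.linalg.lstsq(X, phenotype, rcond=None)
resid = phenotype - X @ beta
r2 = 1 - resid.var() / phenotype.var()
print(f"variance explained by marker: {r2:.1%}")         # roughly the 6-15% range reported
```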
Abstract:
In vitro evolution imitates the natural evolution of genes and has been very successfully applied to the modification of coding sequences, but it has not yet been applied to promoter sequences. We propose an alternative method for functional promoter analysis by applying an in vitro evolution scheme consisting of rounds of error-prone PCR, followed by DNA shuffling and selection of mutant promoter activities. We modified the activity in embryogenic sugarcane cells of the promoter region of the Goldfinger isolate of banana streak virus and obtained mutant promoter sequences that showed an average mutation rate of 2.5% after applying one round of error-prone PCR and DNA shuffling. Selection and sequencing of promoter sequences with decreased or unaltered activity allowed us to rapidly map the position of one cis-acting element that influenced promoter activity in embryogenic sugarcane cells and to discover neutral mutations that did not affect promoter function. Applying this selective-shotgun approach immediately after the promoter boundaries have been defined by 5' deletion analysis dramatically reduces the labor associated with traditional linker-scanning deletion analysis to reveal the position of functional promoter domains. Furthermore, this method allows the entire promoter to be investigated at once, rather than selected domains or nucleotides, increasing the prospect of identifying interacting promoter regions.
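A minimal simulation of the mutagenesis step may help fix ideas: the sketch below introduces random point mutations at the ~2.5% per-base rate reported above. The promoter sequence is a placeholder, and real error-prone PCR has biased substitution spectra that are ignored here.

```python
import random

# Illustrative sketch only: one round of "error-prone PCR" style point
# mutagenesis at ~2.5% per base.  The sequence is a placeholder, not the
# BSV-Goldfinger promoter.
BASES = "ACGT"

def mutate(seq: str, rate: float = 0.025, rng=random.Random(1)) -> str:
    out = []
    for base in seq:
        if rng.random() < rate:
            out.append(rng.choice([b for b in BASES if b != base]))
        else:
            out.append(base)
    return "".join(out)

promoter = "ATGCATGCATGC" * 20          # 240 bp placeholder promoter
variant = mutate(promoter)
observed = sum(a != b for a, b in zip(promoter, variant)) / len(promoter)
print(f"observed mutation rate: {observed:.1%}")
```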
Abstract:
Information about the world is often represented in the brain in the form of topographic maps. A paradigm example is the topographic representation of the visual world in the optic tectum/superior colliculus. This map initially forms during neural development using activity-independent molecular cues, most notably some type of chemospecific matching between molecular gradients in the retina and corresponding gradients in the tectum/superior colliculus. Exactly how this process might work has been studied both experimentally and theoretically for several decades. This review discusses the experimental data briefly, and then in more detail the theoretical models proposed. The principal conclusions are that (1) theoretical models have helped clarify several important ideas in the field, (2) earlier models were often more sophisticated than more recent models, and (3) substantial revisions to current modelling approaches are probably required to account for more than isolated subsets of the experimental data.
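As a toy illustration of the chemospecific-matching idea (not any particular model reviewed above), the sketch below gives the retina and tectum matching exponential label gradients and lets each axon terminate where the label mismatch is smallest, which yields an ordered topographic projection.

```python
import numpy as np

# Toy illustration of chemospecific gradient matching (not a specific published
# model): each retinal position carries a graded label, and an axon terminates
# at the tectal position whose label best matches its own.
retina = np.linspace(0, 1, 50)              # normalized nasal-temporal axis
tectum = np.linspace(0, 1, 50)              # normalized rostral-caudal axis

retinal_label = np.exp(retina)              # exponential gradient across the retina
tectal_label = np.exp(tectum)               # matching gradient across the tectum

# Each axon picks the tectal site minimizing the label mismatch.
targets = [int(np.argmin(np.abs(tectal_label - r))) for r in retinal_label]
print(targets[:10])   # ordered targets, i.e. a topographic map
```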
Abstract:
Deterioration of enhanced biological phosphorus removal (EBPR) has been linked to the proliferation of glycogen-accumulating organisms (GAOs), but few organisms possessing the GAO metabolic phenotype have been identified. An unidentified GAO was highly enriched in a laboratory-scale bioreactor, and attempts to identify this organism using conventional 16S rRNA gene cloning had failed. Therefore, rRNA-based stable isotope probing followed by full-cycle rRNA analysis was used to specifically identify the putative GAOs based on their characteristic metabolic phenotype. The study obtained sequences from a group of Alphaproteobacteria not previously shown to possess the GAO phenotype, but 90% identical by 16S rRNA gene analysis to a phylogenetic clade containing cloned sequences from putative GAOs and the isolate Defluviicoccus vanus. Fluorescence in situ hybridization (FISH) probes (DF988 and DF1020) were designed to target the new group, and post-FISH chemical staining demonstrated anaerobic-aerobic cycling of polyhydroxyalkanoates, as per the GAO phenotype. The successful use of probes DF988 and DF1020 required the use of unlabelled helper probes, which increased probe signal intensity up to 6.6-fold, thus highlighting the utility of helper probes in FISH. The new group constituted 33% of all Bacteria in the lab-scale bioreactor from which it was identified and was also abundant (51% and 55% of Bacteria) in two other similar bioreactors in which phosphorus removal had deteriorated. Unlike the previously identified Defluviicoccus-related organisms, the group identified in this study was also found in two full-scale treatment plants performing EBPR, suggesting that this group may be industrially relevant.
Abstract:
The parameterless self-organizing map (PLSOM) is a new neural network algorithm based on the self-organizing map (SOM). It eliminates the need for a learning rate and for annealing schemes for the learning rate and neighborhood size. We discuss the relative performance of the PLSOM and the SOM and demonstrate some tasks in which the SOM fails but the PLSOM performs satisfactorily. Finally, we discuss some example applications of the PLSOM and present a proof of ordering under certain limited conditions.
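A rough sketch of a PLSOM-style update is given below, assuming the commonly described formulation in which a normalized fitting error replaces the learning-rate and neighborhood annealing schedules; the constants and exact scaling rule are illustrative rather than the published algorithm.

```python
import numpy as np

# Hedged sketch of a PLSOM-style update on a 1-D map: the classic SOM's
# annealing schedules are replaced by a scaling variable derived from the
# current fitting error.  Constants and details are illustrative.
rng = np.random.default_rng(0)
n_nodes, dim, beta = 20, 2, 3.0
weights = rng.random((n_nodes, dim))
grid = np.arange(n_nodes)
r = 1e-9                                     # running maximum of the fitting error

for _ in range(2000):
    x = rng.random(dim)                      # sample from the input distribution
    dists = np.linalg.norm(weights - x, axis=1)
    winner = int(np.argmin(dists))
    err = dists[winner] ** 2
    r = max(r, err)
    eps = err / r                            # normalized fitting error in [0, 1]
    theta = beta * eps + 1e-6                # neighborhood width driven by eps
    h = np.exp(-((grid - winner) ** 2) / theta ** 2)
    weights += eps * h[:, None] * (x - weights)
```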
Abstract:
The international FANTOM consortium aims to produce a comprehensive picture of the mammalian transcriptome, based upon an extensive cDNA collection and functional annotation of full-length enriched cDNAs. The previous dataset, FANTOM2, comprised 60,770 full-length enriched cDNAs. Functional annotation revealed that this cDNA dataset contained only about half of the estimated number of mouse protein-coding genes, indicating that a number of cDNAs still remained to be collected and identified. To pursue the complete gene catalog that covers all predicted mouse genes, cloning and sequencing of full-length enriched cDNAs has been continued since FANTOM2. In FANTOM3, 42,031 newly isolated cDNAs were subjected to functional annotation, and the annotation of 4,347 FANTOM2 cDNAs was updated. To accomplish accurate functional annotation, we improved our automated annotation pipeline by introducing new coding sequence prediction programs and developed a Web-based annotation interface for simplifying the annotation procedures to reduce manual annotation errors. Automated coding sequence and function prediction was followed by manual curation and review by expert curators. A total of 102,801 full-length enriched mouse cDNAs were annotated. Out of 102,801 transcripts, 56,722 were functionally annotated as protein coding (including partial or truncated transcripts), providing to our knowledge the greatest current coverage of the mouse proteome by full-length cDNAs. The total number of distinct non-protein-coding transcripts increased to 34,030. The FANTOM3 annotation system, consisting of automated computational prediction, manual curation, and final expert curation, facilitated the comprehensive characterization of the mouse transcriptome, and could be applied to the transcriptomes of other species.
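As a conceptual stand-in for the coding sequence prediction step (the FANTOM3 pipeline used dedicated programs, not this), the sketch below finds the longest forward-strand open reading frame in a transcript sequence.

```python
import re

# Naive illustration of coding-sequence prediction: report the longest open
# reading frame on the forward strand.  This is only a conceptual stand-in
# for the dedicated CDS prediction programs mentioned in the abstract.
def longest_orf(seq: str) -> str:
    seq = seq.upper()
    best = ""
    for frame in range(3):
        codons = re.findall("...", seq[frame:])
        start = None
        for i, codon in enumerate(codons):
            if codon == "ATG" and start is None:
                start = i
            elif codon in ("TAA", "TAG", "TGA") and start is not None:
                orf = "".join(codons[start:i + 1])
                if len(orf) > len(best):
                    best = orf
                start = None
    return best

print(longest_orf("CCATGAAATTTGGGTAACCATGCCC"))   # -> ATGAAATTTGGGTAA
```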
Abstract:
Government agencies responsible for riparian environments are assessing the combined utility of field survey and remote sensing for mapping and monitoring indicators of riparian zone condition. The objective of this work was to compare the Tropical Rapid Appraisal of Riparian Condition (TRARC) method to a satellite image-based approach. TRARC was developed for rapid assessment of the environmental condition of savanna riparian zones. The comparison assessed mapping accuracy, representativeness of TRARC assessment, cost-effectiveness, and suitability for multi-temporal analysis. Two multi-spectral QuickBird images captured in 2004 and 2005 and coincident field data covering sections of the Daly River in the Northern Territory, Australia were used in this work. Both field and image data were processed to map riparian health indicators (RHIs) including percentage canopy cover, organic litter, canopy continuity, stream bank stability, and extent of tree clearing. Spectral vegetation indices, image segmentation and supervised classification were used to produce RHI maps. QuickBird image data were used to examine whether the spatial distribution of TRARC transects provided a representative sample of ground-based RHI measurements. Results showed that TRARC transects were required to cover at least 3% of the study area to obtain a representative sample. The mapping accuracy and costs of the image-based approach were compared to those of the ground-based TRARC approach. Results showed that TRARC was more cost-effective at smaller scales (1-100 km), while image-based assessment becomes more feasible at regional scales (100-1000 km). Finally, the ability to use both the image-based and field-based approaches for multi-temporal analysis of RHIs was assessed. Change detection analysis demonstrated that image data can provide detailed information on gradual change, while the TRARC method was only able to identify more gross-scale changes. In conclusion, results from both methods were considered to complement each other if used at appropriate spatial scales.
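One of the image-processing inputs mentioned above, a spectral vegetation index, is simple to illustrate. The sketch below computes NDVI from placeholder red and near-infrared reflectances and applies an arbitrary threshold as a stand-in for the canopy-cover classification; the band choice and threshold are assumptions, not the study's calibrated rules.

```python
import numpy as np

# Minimal sketch of one riparian health indicator input: NDVI from red and
# near-infrared reflectance, then a simple threshold as a stand-in for the
# canopy-cover classification step.  Values are placeholders.
red = np.array([[0.08, 0.20], [0.05, 0.18]])
nir = np.array([[0.45, 0.25], [0.50, 0.22]])

ndvi = (nir - red) / (nir + red + 1e-9)
canopy = ndvi > 0.4                              # illustrative threshold only
print(ndvi.round(2))
print(f"canopy cover: {canopy.mean():.0%}")
```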
Abstract:
We propose a novel recursive-algorithm-based maximum a posteriori probability (MAP) detector for spectrally efficient coherent wavelength division multiplexing (CoWDM) systems, and investigate its performance in a 1-bit/s/Hz on-off keyed (OOK) system limited by optical signal-to-noise ratio. The proposed method decodes each sub-channel using the signal levels not only of the particular sub-channel but also of its adjacent sub-channels, and therefore can effectively compensate for deterministic inter-sub-channel crosstalk as well as inter-symbol interference arising from narrow-band filtering and chromatic dispersion (CD). Numerical simulation of a five-channel OOK-based CoWDM system with 10 Gbit/s per channel using either direct or coherent detection shows that the MAP decoder can eliminate the need for phase control of each optical carrier (which is otherwise required in a conventional CoWDM system), and greatly relaxes the spectral design of the demultiplexing filter at the receiver. It also significantly improves the back-to-back sensitivity and CD tolerance of the system.
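A brute-force illustration of the detection idea (the paper's algorithm is recursive and more efficient) is sketched below: the receiver enumerates the bit triplet of a sub-channel and its two neighbours and picks the pattern with the highest likelihood under Gaussian noise. The crosstalk weights and noise level are invented.

```python
import itertools
import numpy as np

# Brute-force illustration of detection over a sub-channel and its two
# neighbours.  With equal priors, MAP reduces to maximum likelihood.
# Crosstalk weights and noise level are made-up; OOK symbols are 0/1.
crosstalk = np.array([0.3, 1.0, 0.3])       # assumed leakage from left, self, right
sigma = 0.2                                 # assumed noise standard deviation

def map_detect(received: float) -> tuple:
    """Return the (left, centre, right) bit triplet maximizing the likelihood."""
    best, best_metric = None, -np.inf
    for bits in itertools.product([0, 1], repeat=3):
        expected = float(crosstalk @ np.array(bits))
        loglik = -((received - expected) ** 2) / (2 * sigma ** 2)
        if loglik > best_metric:
            best, best_metric = bits, loglik
    return best

rng = np.random.default_rng(3)
true_bits = (1, 0, 1)
rx = float(crosstalk @ np.array(true_bits)) + rng.normal(0, sigma)
print(map_detect(rx))        # the centre bit of the triplet is the decoded symbol
```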
Abstract:
Monitoring land-cover changes on sites of conservation importance allows environmental problems to be detected, solutions to be developed and the effectiveness of actions to be assessed. However, the remoteness of many sites or a lack of resources means these data are frequently not available. Remote sensing may provide a solution, but large-scale mapping and change detection may not be appropriate, necessitating site-level assessments. These need to be easy to undertake, rapid and cheap. We present an example of a Web-based solution based on free and open-source software and standards (including PostGIS, OpenLayers, Web Map Services, Web Feature Services and GeoServer) to support assessments of land-cover change (and validation of global land-cover maps). Authorised users are provided with means to assess land-cover visually and may optionally provide uncertainty information at various levels: from a general rating of their confidence in an assessment to a quantification of the proportions of land-cover types within a reference area. Versions of this tool have been developed for the TREES-3 initiative (Simonetti, Beuchle and Eva, 2011), which monitors tropical land-cover change through ground-truthing at latitude/longitude degree confluence points, and for monitoring of change within and around Important Bird Areas (IBAs) by BirdLife International and the Royal Society for the Protection of Birds (RSPB). In this paper we present results from the second of these applications. We also present further details on the potential use of the land-cover change assessment tool on sites of recognised conservation importance, in combination with NDVI and other time series data from the eStation (a system for receiving, processing and disseminating environmental data). We show how the tool can be used to increase the usability of earth observation data by local stakeholders and experts, and assist in evaluating the impact of protection regimes on land-cover change.
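As an illustration of how such open standards are consumed, the sketch below issues a standard WMS GetMap request against a GeoServer endpoint; the URL, layer name and bounding box are hypothetical, not the project's actual services.

```python
import requests

# Hedged sketch: fetch one rendered tile from a GeoServer WMS endpoint using a
# standard GetMap request.  Endpoint, layer and bounding box are placeholders.
WMS_URL = "https://example.org/geoserver/wms"          # hypothetical endpoint
params = {
    "service": "WMS",
    "version": "1.1.1",
    "request": "GetMap",
    "layers": "landcover:iba_change",                  # hypothetical layer name
    "bbox": "34.0,-1.5,34.5,-1.0",                     # minx,miny,maxx,maxy (lon/lat)
    "srs": "EPSG:4326",
    "width": 512,
    "height": 512,
    "format": "image/png",
}
resp = requests.get(WMS_URL, params=params, timeout=30)
resp.raise_for_status()
with open("tile.png", "wb") as fh:
    fh.write(resp.content)
```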
Abstract:
We propose and investigate a method for the stable determination of a harmonic function from knowledge of its value and its normal derivative on a part of the boundary of the (bounded) solution domain (Cauchy problem). We reformulate the Cauchy problem as an operator equation on the boundary using the Dirichlet-to-Neumann map. To discretize the obtained operator, we modify and employ a method denoted as Classic II given in [J. Helsing, Faster convergence and higher accuracy for the Dirichlet–Neumann map, J. Comput. Phys. 228 (2009), pp. 2578–2586, Section 3], which is based on Fredholm integral equations and Nyström discretization schemes. Then, for stability reasons, to solve the discretized integral equation we use the method of smoothing projection introduced in [J. Helsing and B.T. Johansson, Fast reconstruction of harmonic functions from Cauchy data using integral equation techniques, Inverse Probl. Sci. Eng. 18 (2010), pp. 381–399, Section 7], which makes it possible to solve the discretized operator equation in a stable way with minor computational cost and high accuracy. With this approach, for sufficiently smooth Cauchy data, the normal derivative can also be accurately computed on the part of the boundary where no data is initially given.
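For readers unfamiliar with the setting, a compact statement of the Cauchy problem and its reformulation as a boundary operator equation via the Dirichlet-to-Neumann map is given below in generic notation; the boundary splitting and symbols are not taken verbatim from the cited papers.

```latex
% Generic statement: boundary split \partial\Omega = \Gamma_0 \cup \Gamma_1,
% Cauchy data prescribed on \Gamma_0 only.
\begin{aligned}
  &\Delta u = 0 \quad \text{in } \Omega, \qquad
   u = f, \quad \frac{\partial u}{\partial \nu} = g \quad \text{on } \Gamma_0, \\
  % Let \Lambda be the Dirichlet-to-Neumann map taking the full Dirichlet trace
  % to the full Neumann trace, and let \varphi = u|_{\Gamma_1} be the unknown.
  &\bigl(\Lambda[f,\varphi]\bigr)\big|_{\Gamma_0} = g
   \qquad \text{(ill-posed operator equation, solved for } \varphi \text{ with regularization).}
\end{aligned}
```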
Abstract:
In the face of global population growth and the uneven distribution of water supply, a better knowledge of the spatial and temporal distribution of surface water resources is critical. Remote sensing provides a synoptic view of ongoing processes, which addresses the intricate nature of water surfaces and allows an assessment of the pressures placed on aquatic ecosystems. However, the main challenge in identifying water surfaces from remotely sensed data is the high variability of spectral signatures, both in space and time. In the last 10 years only a few operational methods have been proposed to map or monitor surface water at continental or global scale, and each of them shows limitations. The objective of this study is to develop and demonstrate the adequacy of a generic multi-temporal and multi-spectral image analysis method to detect water surfaces automatically, and to monitor them in near-real-time. The proposed approach, based on a transformation of the RGB color space into HSV, provides dynamic information at the continental scale. The validation of the algorithm showed very few omission errors and no commission errors. It demonstrates the ability of the proposed algorithm to perform as effectively as human interpretation of the images. The validation of the permanent water surface product with an independent dataset derived from high resolution imagery showed an accuracy of 91.5% and few commission errors. Potential applications of the proposed method have been identified and discussed. The methodology that has been developed is generic: it can be applied to sensors with similar bands with good reliability, and minimal effort. Moreover, this experiment at continental scale showed that the methodology is efficient for a large range of environmental conditions. Additional preliminary tests over other continents indicate that the proposed methodology could also be applied at the global scale without too many difficulties.
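The colour-space idea can be illustrated in a few lines: convert an RGB composite to HSV and flag pixels whose hue and value fall in a "water-like" range. The pixel values and thresholds below are invented, not the paper's calibrated decision rules.

```python
import colorsys
import numpy as np

# Illustrative sketch of the colour-space idea only: convert RGB to HSV and
# flag "water-like" pixels with simple hue/value thresholds.  Pixel values and
# thresholds are made up.
rgb = np.array([
    [[0.10, 0.20, 0.35], [0.30, 0.28, 0.10]],    # water-ish vs vegetation-ish
    [[0.08, 0.15, 0.30], [0.40, 0.35, 0.25]],
])

def is_water(pixel) -> bool:
    h, s, v = colorsys.rgb_to_hsv(*pixel)
    return 0.5 < h < 0.75 and v < 0.5            # blue hues, fairly dark

mask = np.array([[is_water(px) for px in row] for row in rgb])
print(mask)
```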
Abstract:
The Resource Space Model is a data model that can effectively and flexibly manage digital resources in cyber-physical systems from multidimensional and hierarchical perspectives. This paper focuses on constructing resource spaces automatically. We propose a framework that organizes a set of digital resources according to different semantic dimensions, combining human background knowledge in WordNet and Wikipedia. The construction process includes four steps: extracting candidate keywords, building semantic graphs, detecting semantic communities and generating the resource space. An unsupervised statistical topic model (Latent Dirichlet Allocation, LDA) is applied to extract candidate keywords for the facets. To better interpret the meanings of the facets found by LDA, we map the keywords to Wikipedia concepts, calculate word relatedness using WordNet's noun synsets and construct corresponding semantic graphs. Moreover, semantic communities are identified by the Girvan-Newman (GN) algorithm. After extracting candidate axes based on the Wikipedia concept hierarchy, the final axes of the resource space are sorted and selected through three different ranking strategies. The experimental results demonstrate that the proposed framework can organize resources automatically and effectively.
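A toy-scale sketch of three of the computational steps named above (LDA keyword extraction, a WordNet-weighted semantic graph, and Girvan-Newman community detection) is given below; the documents, thresholds and parameters are placeholders, and the Wikipedia concept-mapping and axis-ranking steps are omitted.

```python
import itertools
import networkx as nx
from networkx.algorithms.community import girvan_newman
from nltk.corpus import wordnet as wn            # requires nltk.download('wordnet')
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

# Toy documents and parameters; not the paper's corpus or settings.
docs = [
    "genome sequencing clone library chromosome",
    "chromosome marker linkage map genome",
    "image classification segmentation satellite",
    "satellite sensor image resolution map",
]

# 1. Candidate keywords via LDA topics.
vec = CountVectorizer()
X = vec.fit_transform(docs)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
vocab = vec.get_feature_names_out()
keywords = {vocab[i] for topic in lda.components_ for i in topic.argsort()[-4:]}

# 2. Semantic graph weighted by WordNet noun-synset relatedness.
def relatedness(w1, w2):
    s1, s2 = wn.synsets(w1, pos=wn.NOUN), wn.synsets(w2, pos=wn.NOUN)
    scores = [a.wup_similarity(b) or 0.0 for a in s1 for b in s2]
    return max(scores, default=0.0)

G = nx.Graph()
for w1, w2 in itertools.combinations(sorted(keywords), 2):
    sim = relatedness(w1, w2)
    if sim > 0.3:                                 # illustrative threshold
        G.add_edge(w1, w2, weight=sim)

# 3. Semantic communities via the Girvan-Newman algorithm.
communities = next(girvan_newman(G))
print([sorted(c) for c in communities])
```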
Abstract:
The UK government aims to achieve an 80% reduction in CO2 emissions by 2050, which requires collective effort across all UK industry sectors. The housing sector in particular has a large potential to contribute, because it alone accounts for 27% of total UK CO2 emissions and, furthermore, 87% of the housing responsible for that 27% will still be standing in 2050. It is therefore essential to improve the energy efficiency of the existing housing stock built to low energy efficiency standards. To do so, the whole house needs to be refurbished in a sustainable way, considering the lifetime financial and environmental impacts of the refurbished house. However, the current refurbishment process struggles to generate financially and environmentally affordable refurbishment solutions because of the highly fragmented nature of refurbishment practice and a lack of knowledge and skills about whole-house refurbishment in the construction industry. To generate an affordable refurbishment solution, diverse information about the costs and environmental impacts of refurbishment measures and materials must be collected and integrated, in the right sequence, throughout the refurbishment project life cycle among the key project stakeholders. Consequently, researchers are increasingly studying ways of utilizing Building Information Modelling (BIM) to tackle current problems in the construction industry, because BIM can support construction professionals in managing construction projects collaboratively by integrating diverse information, and in determining the best refurbishment solution among various alternatives by calculating the life cycle costs and lifetime CO2 performance of a refurbishment solution. Despite this capability, BIM adoption in the housing sector remains low, at 25%, and little research has examined how BIM can be used for housing refurbishment projects. This research therefore aims to develop a BIM framework to formulate a financially and environmentally affordable whole-house refurbishment solution based on the Life Cycle Costing (LCC) and Life Cycle Assessment (LCA) methods simultaneously. To achieve this aim, a BIM feasibility study was first conducted as a pilot study to examine whether BIM is suitable for housing refurbishment, and a BIM framework was then developed based on grounded theory, as there was no precedent research. The framework was subsequently examined through a hypothetical case study using BIM input data collected from a questionnaire survey of homeowners' preferences for housing refurbishment. Finally, the BIM framework was validated by academics and professionals, who were provided with the framework and a refurbishment solution formulated through it on the basis of the LCC and LCA studies. As a result, BIM was identified as suitable for housing refurbishment as a management tool, and the development of the BIM framework was found to be timely. The BIM framework, comprising seven project stages, was developed to formulate an affordable refurbishment solution. Through the case study, the Building Regulations standard was identified as the most affordable energy efficiency standard, rendering the best LCC and LCA results when applied to a whole-house refurbishment solution.
In addition, the Fabric Energy Efficiency Standard (FEES) is recommended when customers are willing to adopt a higher energy standard, and a maximum of 60% of CO2 emissions can be reduced through whole-house fabric refurbishment to the FEES. Furthermore, limitations and challenges to fully utilizing the BIM framework for housing refurbishment were revealed, such as a lack of BIM objects with proper cost and environmental information, limited interoperability between different BIM software packages, and limited LCC and LCA datasets in BIM systems. Finally, the BIM framework was validated as suitable for housing refurbishment projects, and reviewers commented that the framework could be made more practical if a specific BIM library for housing refurbishment with proper LCC and LCA datasets were developed. This research is expected to provide a systematic way of formulating a refurbishment solution using BIM, and to become a basis for further research on BIM for the housing sector to resolve the current limitations and challenges. Future research should enhance the BIM framework by developing a more detailed process map and by developing BIM objects with proper LCC and LCA information.
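The life cycle costing side of the comparison reduces to a discounted cash-flow calculation. The sketch below shows the arithmetic with placeholder figures; the costs, discount rate and study period are not values from the thesis.

```python
# Hedged sketch of the life cycle costing (LCC) arithmetic only: discount the
# up-front refurbishment cost plus annual running costs to a net present value.
# All figures are placeholders, not values from the thesis.
def life_cycle_cost(capital, annual_cost, years=30, discount_rate=0.035):
    """Net present value of capital plus discounted annual costs."""
    return capital + sum(annual_cost / (1 + discount_rate) ** t
                         for t in range(1, years + 1))

baseline = life_cycle_cost(capital=0, annual_cost=1400)        # do-nothing case
refurbished = life_cycle_cost(capital=18000, annual_cost=650)  # whole-house package
print(f"baseline LCC:    {baseline:,.0f}")
print(f"refurbished LCC: {refurbished:,.0f}")
```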