883 results for New applications
Abstract:
Tropical wetlands are estimated to represent about 50% of the natural wetland methane (CH4) emissions and explain a large fraction of the observed CH4 variability on timescales ranging from glacial–interglacial cycles to the currently observed year-to-year variability. Despite their importance, however, tropical wetlands are poorly represented in global models aiming to predict global CH4 emissions. This publication documents a first step in the development of a process-based model of CH4 emissions from tropical floodplains for global applications. For this purpose, the LPX-Bern Dynamic Global Vegetation Model (LPX hereafter) was slightly modified to represent floodplain hydrology, vegetation and associated CH4 emissions. The extent of tropical floodplains was prescribed using output from the spatially explicit hydrology model PCR-GLOBWB. We introduced new plant functional types (PFTs) that explicitly represent floodplain vegetation. The PFT parameterizations were evaluated against available remote-sensing data sets (GLC2000 land cover and MODIS Net Primary Productivity). Simulated CH4 flux densities were evaluated against field observations and regional flux inventories. Simulated CH4 emissions at Amazon Basin scale were compared to model simulations performed in the WETCHIMP intercomparison project. We found that LPX reproduces the average magnitude of observed net CH4 flux densities for the Amazon Basin. However, the model does not reproduce the variability between sites or between years within a site. Unfortunately, site information is too limited to confirm or rule out some model features. At the Amazon Basin scale, our results underline the large uncertainty in the magnitude of wetland CH4 emissions. Sensitivity analyses gave insights into the main drivers of floodplain CH4 emission and their associated uncertainties.
In particular, uncertainties in floodplain extent (i.e., difference between GLC2000 and PCR-GLOBWB output) modulate the simulated emissions by a factor of about 2. Our best estimates, using PCR-GLOBWB in combination with GLC2000, lead to simulated Amazon-integrated emissions of 44.4 ± 4.8 Tg yr−1. Additionally, the LPX emissions are highly sensitive to vegetation distribution. Two simulations with the same mean PFT cover, but different spatial distributions of grasslands within the basin, modulated emissions by about 20%. Correcting the LPX-simulated NPP using MODIS reduces the Amazon emissions by 11.3%. Finally, due to an intrinsic limitation of LPX in accounting for seasonality in floodplain extent, the model failed to reproduce the full dynamics in CH4 emissions, but we propose solutions to this issue. The interannual variability (IAV) of the emissions increases by 90% if the IAV in floodplain extent is accounted for, but still remains lower than in most of the WETCHIMP models. While our model includes more mechanisms specific to tropical floodplains, we were unable to reduce the uncertainty in the magnitude of wetland CH4 emissions of the Amazon Basin. Our results helped identify and prioritize directions towards more accurate estimates of tropical CH4 emissions, and they stress the need for more research to constrain floodplain CH4 emissions and their temporal variability, even before including other fundamental mechanisms such as floating macrophytes or lateral water fluxes.
Abstract:
Dendrogeomorphology uses information sources recorded in the roots, trunks and branches of trees and bushes located in the fluvial system to complement (or sometimes even replace) systematic and palaeohydrological records of past floods. The application of dendrogeomorphic data sources and methods to palaeoflood analysis over nearly 40 years has allowed improvements to be made in frequency and magnitude estimations of past floods. Nevertheless, research carried out so far has shown that the dendrogeomorphic indicators traditionally used (mainly scar evidence), and their use to infer frequency and magnitude, have been restricted to a small, limited set of applications. New possibilities with enormous potential remain unexplored. Future research on palaeoflood frequency and magnitude using dendrogeomorphic data sources should: (1) test the application of isotopic indicators (16O/18O ratio) to discover the meteorological origin of past floods; (2) use different dendrogeomorphic indicators to estimate peak flows with 2D (and 3D) hydraulic models and study how they relate to other palaeostage indicators; (3) investigate improved calibration of 2D hydraulic model parameters (roughness); and (4) apply statistics-based cost–benefit analysis to select optimal mitigation measures. This paper presents an overview of these innovative methodologies, with a focus on their capabilities and limitations in the reconstruction of recent floods and palaeofloods.
Abstract:
Development of transcriptional pulsing approaches using the c-fos and Tet-off promoter systems greatly facilitated studies of mRNA turnover in mammalian cells. However, optimal protocols for these approaches vary for different cell types and/or physiological conditions, limiting their widespread application. In this study, we have further optimized transcriptional pulsing systems for different cell lines and developed new protocols to facilitate investigation of various aspects of mRNA turnover. We apply the Tet-off transcriptional pulsing strategy to investigate ARE-mediated mRNA decay in human erythroleukemic K562 cells arrested at various phases of the cell cycle by pharmacological inhibitors. This application facilitates studies of the role of mRNA stability in control of cell-cycle dependent gene expression. To advance the investigation of factors involved in mRNA turnover and its regulation, we have also incorporated recently developed transfection and siRNA reagents into the transcriptional pulsing approach. Using these protocols, siRNA and DNA plasmids can be effectively cotransfected into mouse NIH3T3 cells to obtain high knockdown efficiency. Moreover, we have established a tTA-harboring stable line using human bronchial epithelial BEAS-2B cells and applied the transcriptional pulsing approach to monitor mRNA deadenylation and decay kinetics in this cell system. This broadens the application of the transcriptional pulsing system to investigate the regulation of mRNA turnover related to allergic inflammation. Critical factors that need to be considered when employing these approaches are characterized and discussed.
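The quantitative output of a transcriptional pulsing experiment is a decay time course from which an mRNA half-life is estimated. As a minimal sketch, assuming simple first-order decay after transcriptional shut-off (not the authors' exact analysis pipeline):

```python
import math

def half_life(t0, a0, t1, a1):
    """Half-life from two abundance measurements under first-order decay:
    k = ln(a0/a1) / (t1 - t0), then t_half = ln(2) / k."""
    k = math.log(a0 / a1) / (t1 - t0)
    return math.log(2) / k
```

For example, an mRNA whose abundance falls from 100 to 50 arbitrary units over 2 h gives a half-life of 2 h; in practice one would fit several timepoints rather than two.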
Abstract:
We propose to build and operate a detector based on the emulsion film technology for the measurement of the gravitational acceleration on antimatter, to be performed by the AEgIS experiment (AD6) at CERN. The goal of AEgIS is to test the weak equivalence principle with a precision of 1% on the gravitational acceleration g by measuring the vertical position of the annihilation vertex of antihydrogen atoms after their free fall while moving horizontally in a vacuum pipe. With the emulsion technology developed at the University of Bern we propose to improve the performance of AEgIS by exploiting the superior position resolution of emulsion films over other particle detectors. The idea is to use a new type of emulsion films, especially developed for applications in vacuum, to yield a spatial resolution of the order of one micron in the measurement of the sag of the antihydrogen atoms in the gravitational field. This is an order of magnitude better than what was planned in the original AEgIS proposal.
Abstract:
Variable number of tandem repeats (VNTR) are genetic loci at which short sequence motifs are found repeated different numbers of times among chromosomes. To explore the potential utility of VNTR loci in evolutionary studies, I have conducted a series of studies to address the following questions: (1) What are the population genetic properties of these loci? (2) What are the mutational mechanisms of repeat number change at these loci? (3) Can DNA profiles be used to measure the relatedness between a pair of individuals? (4) Can DNA fingerprints be used to measure the relatedness between populations in evolutionary studies? (5) Can microsatellite and short tandem repeat (STR) loci, which mutate in a stepwise fashion, be used in evolutionary analyses? A large number of VNTR loci typed in many populations were studied by means of recently developed statistical methods. The results of this work indicate that there is no significant departure from Hardy-Weinberg expectation (HWE) at VNTR loci in most of the human populations examined, and that the departures from HWE at some VNTR loci are not solely caused by the presence of population sub-structure. A statistical procedure is developed to investigate the mutational mechanisms of VNTR loci by studying their allele frequency distributions. Comparisons of frequency distribution data on several hundred VNTR loci with the predictions of two mutation models demonstrated that there are differences among VNTR loci grouped by repeat unit size. By extending the ITO method, I derived the distribution of the number of shared bands between individuals with any kinship relationship.
A maximum likelihood estimation procedure is proposed to estimate the relatedness between individuals from the observed number of bands they share. Classical measures of genetic distance were believed not to be applicable to the analysis of DNA fingerprints, which reveal many minisatellite loci in the genome simultaneously, because information about the underlying alleles and loci is not available. I propose a new measure of genetic distance based on band sharing between individuals that is applicable to DNA fingerprint data. To address the concern that microsatellite and STR loci may not be useful for evolutionary studies because of the convergent nature of their mutation mechanisms, I conducted a theoretical study as well as computer simulations; I conclude that the possible bias caused by convergent mutations can be corrected, and I suggest a novel measure of genetic distance that makes this correction. In summary, I conclude that hypervariable VNTR loci are useful in evolutionary studies of closely related populations or species, especially in the study of human evolution and the history of geographic dispersal of Homo sapiens. (Abstract shortened by UMI.)
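The band-sharing idea underlying such estimators can be illustrated with a commonly used similarity index and a simple distance derived from it (a generic sketch with hypothetical band sets, not the author's exact ML estimator or distance measure):

```python
def band_sharing(bands_a, bands_b):
    """Band-sharing similarity S = 2 * n_ab / (n_a + n_b), where n_ab is the
    number of fingerprint bands scored in both individuals."""
    a, b = set(bands_a), set(bands_b)
    shared = len(a & b)
    return 2.0 * shared / (len(a) + len(b))

def band_distance(bands_a, bands_b):
    """A simple band-sharing-based distance: D = 1 - S."""
    return 1.0 - band_sharing(bands_a, bands_b)
```

Two profiles of four bands each that share two bands give S = 0.5; unrelated individuals are expected to show lower background sharing than close relatives.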
Abstract:
Firn microstructure is accurately characterized using images obtained from scanning electron microscopy (SEM). Visibly etched grain boundaries within images are used to create a skeleton outline of the microstructure. A pixel-counting utility is applied to the outline to determine grain area. Firn grain sizes calculated using the technique described here are compared to those calculated using the techniques of Gow (1969) and Gay and Weiss (1999) on samples of the same material, and are found to be substantially smaller. The differences in grain size between the techniques are attributed to sampling deficiencies (e.g. the inclusion of pore filler in the grain area) in earlier methods. The new technique offers the advantages of greater accuracy and the ability to determine individual components of the microstructure (grain and pore), which have important applications in ice-core analyses. The new method is validated by calculating activation energies of grain boundary diffusion using predicted values based on the ratio of grain-size measurements between the new and existing techniques. The resulting activation energy falls within the range of values previously reported for firn/ice.
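The pixel-counting step amounts to connected-component labeling on a binary image in which boundary pixels separate grains. A toy illustration (pure-Python flood fill on a small synthetic grid, not the authors' SEM processing utility):

```python
from collections import deque

def grain_areas(image):
    """image: 2D list with 1 = grain pixel, 0 = boundary/pore pixel.
    Returns the pixel areas of all 4-connected grain regions, sorted."""
    rows, cols = len(image), len(image[0])
    seen = [[False] * cols for _ in range(rows)]
    areas = []
    for r in range(rows):
        for c in range(cols):
            if image[r][c] == 1 and not seen[r][c]:
                # Flood-fill one grain, counting its pixels.
                area, queue = 0, deque([(r, c)])
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    area += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and image[ny][nx] == 1 and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                areas.append(area)
    return sorted(areas)
```

Because boundary and pore pixels are excluded from the count, such a scheme naturally avoids the pore-filler inclusion problem attributed to the earlier methods.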
Abstract:
Introduction The purpose of this paper is to present the technical specifications of the Forensic Reference Phantom (FRP), to test its behavior relative to organic test materials, and to discuss potential applications of the phantom in forensic radiology. Materials and methods The FRP prototype is made of synthetic materials designed to simulate the computed tomography (CT) attenuation of water. It has six bore holes that accommodate multiuse containers. These containers were filled with test materials and scanned at 80 kVp, 120 kVp, and 140 kVp. X-ray attenuation was measured by two readers. Intra- and inter-reader reliability was assessed using the intra-class correlation coefficient (ICC). Significance levels between mean CT numbers at 80 kVp, 120 kVp, and 140 kVp were assessed with the Friedman test. The t-test was used to assess significance levels between the FRP and water. Results Overall mean CT numbers ranged from −3.0 to 3.7 HU for the FRP; −1000.3 to −993.5 HU for air; −157.7 to −108.1 HU for oil; 35.5 to 42.0 HU for muscle tissue; and 1301.5 to 2354.8 HU for cortical bone. Inter-reader and intra-reader reliability were excellent (ICC > 0.994 and ICC = 0.999, respectively). CT numbers were significantly different at different energy levels. There was no significant difference between the attenuation of the FRP and water. Conclusions The FRP is a new tool for quality assurance and research in forensic radiology. The mean CT attenuation of the FRP is equivalent to that of water. The phantom can be scanned during routine post-mortem CT to assess the composition of unidentified objects. In addition, the FRP may be used to investigate new imaging algorithms and scan protocols in forensic radiology.
Abstract:
(Full text is available at http://www.manu.edu.mk/prilozi). New-generation genomic platforms enable us to decipher the complex genetic basis of complex diseases such as Balkan Endemic Nephropathy (BEN) on a high-throughput basis. They give valuable information about predisposing Single Nucleotide Polymorphisms (SNPs), Copy Number Variations (CNVs) or Loss of Heterozygosity (LOH) (using SNP-array) and about disease-causing mutations along the whole sequence of candidate genes (using Next Generation Sequencing). This information could be used for screening individuals in at-risk families and for shifting mainstream medicine towards prevention. It might also enable more effective treatment. Here we discuss these genomic platforms and report some applications of SNP-array technology in a case of familial nephrotic syndrome. Key words: complex diseases, genome-wide association studies, SNP, genomic arrays, next generation sequencing.
Abstract:
The new Bern cyclotron laboratory aims at industrial radioisotope production for PET diagnostics and multidisciplinary research by means of a specifically conceived beam transfer line, terminated in a separate bunker. In this framework, an innovative beam monitor detector based on doped silica and optical fibres has been designed, constructed, and tested. Scintillation light produced by Ce and Sb doped silica fibres moving across the beam is measured, giving information on beam position, shape, and intensity. The doped fibres are coupled to commercial optical fibres, allowing the read-out of the signal far away from the radiation source. This general-purpose device can be easily adapted for any accelerator used in medical applications and is suitable either for low currents used in hadrontherapy or for currents up to a few μA for radioisotope production, as well as for both pulsed and continuous beams.
Abstract:
We present a new thermodynamic activity-composition model for di-trioctahedral chlorite in the system FeO–MgO–Al2O3–SiO2–H2O that is based on the Holland–Powell internally consistent thermodynamic data set. The model is formulated in terms of four linearly independent end-members: amesite, clinochlore, daphnite and sudoite. These account for the most important crystal-chemical substitutions in chlorite: the Fe–Mg, Tschermak and di-trioctahedral substitutions. The ideal part of the end-member activities is modeled with a mixing-on-site formalism, and non-ideality is described by a macroscopic symmetric (regular) formalism. The symmetric interaction parameters were calibrated using a set of 271 published chlorite analyses for which robust independent temperature estimates are available. In addition, adjustment of the standard-state thermodynamic properties of sudoite was required to accurately reproduce experimental brackets involving sudoite. The new model was tested by calculating representative P–T sections for metasediments at low temperatures (<400 °C), in particular sudoite- and chlorite-bearing metapelites from Crete. Comparison between the calculated mineral assemblages and field data shows that the new model is able to predict the coexistence of chlorite and sudoite at low metamorphic temperatures. The predicted lower limit of the chloritoid stability field is also in better agreement with petrological observations. For practical applications to metamorphic and hydrothermal environments, two new semi-empirical chlorite geothermometers, named Chl(1) and Chl(2), were calibrated based on the chlorite + quartz + water equilibrium (2 clinochlore + 3 sudoite = 4 amesite + 4 H2O + 7 quartz). The Chl(1) thermometer requires knowledge of the (Fe3+/ΣFe) ratio in chlorite and predicts correct temperatures for a range of redox conditions.
The Chl(2) geothermometer, which assumes that all iron in chlorite is ferrous, has been applied to partially recrystallized detrital chlorite from the Zone houillère in the French Western Alps.
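The two ingredients of such activity models can be illustrated generically: an ideal mixing-on-site activity (a product of site mole fractions raised to site multiplicities) and a symmetric regular-solution activity coefficient. The site occupancies, multiplicities and interaction parameter below are hypothetical placeholders, not the calibrated values of the model described here:

```python
import math

R = 8.314  # gas constant, J/(mol K)

def ideal_activity(site_fractions):
    """Ideal mixing-on-site activity: product over sites of
    (mole fraction of the required cation on that site) ** (site multiplicity).
    site_fractions: list of (fraction, multiplicity) pairs."""
    a = 1.0
    for x, m in site_fractions:
        a *= x ** m
    return a

def regular_activity_coeff(w, x_other, temperature):
    """Symmetric (regular) binary solution: RT ln(gamma_A) = W * X_B**2."""
    return math.exp(w * x_other ** 2 / (R * temperature))
```

For instance, an end-member requiring Mg on a 4-fold site (occupied at X = 0.8) and Al on a 1-fold site (X = 0.9) has a_ideal = 0.8**4 * 0.9; with W = 0 the solution reduces to ideal (gamma = 1).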
Abstract:
Crowdsourcing linguistic phenomena with smartphone applications is relatively new. In linguistics, apps have predominantly been developed to create pronunciation dictionaries, to train acoustic models, and to archive endangered languages. This paper presents the first account of how apps can be used to collect data suitable for documenting language change: we created an app, Dialäkt Äpp (DÄ), which predicts users’ dialects. For 16 linguistic variables, users select a dialectal variant from a drop-down menu. DÄ then geographically locates the user’s dialect by suggesting a list of communes where dialect variants most similar to their choices are used. Underlying this prediction are 16 maps from the historical Linguistic Atlas of German-speaking Switzerland, which documents the linguistic situation around 1950. Where users disagree with the prediction, they can indicate what they consider to be their dialect’s location. With this information, the 16 variables can be assessed for language change. Thanks to its playful functionality, DÄ has reached many users; our linguistic analyses are based on data from nearly 60,000 speakers. Results reveal relative stability for phonetic variables, while lexical and morphological variables seem more prone to change. Crowdsourcing large amounts of dialect data with smartphone apps has the potential to complement existing data collection techniques and to provide evidence that traditional methods cannot, with normal resources, hope to gather. Nonetheless, it is important to emphasize a range of methodological caveats, including sparse knowledge of users’ linguistic backgrounds (users indicate only age and sex) and users’ self-declaration of their dialect. These are discussed and evaluated in detail here. Findings remain intriguing nevertheless: as a means of quality control, we report that traditional dialectological methods have revealed trends similar to those found by the app.
This underlines the validity of the crowdsourcing method. We are presently extending the DÄ architecture to other languages.
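The prediction step described above can be sketched as nearest-match scoring of a user's variant choices against per-commune atlas entries (toy data and a simple match count, not the actual DÄ implementation):

```python
def predict_communes(user_choices, atlas):
    """user_choices: {variable: chosen_variant}.
    atlas: {commune: {variable: attested_variant}}, e.g. digitized atlas maps.
    Returns the communes whose attested variants best match the user's choices."""
    scores = {
        commune: sum(variants.get(var) == choice
                     for var, choice in user_choices.items())
        for commune, variants in atlas.items()
    }
    best = max(scores.values())
    return sorted(c for c, s in scores.items() if s == best)
```

With 16 variables this returns the commune(s) maximizing the number of agreeing variants; ties yield a list of candidate communes, matching the app's behavior of suggesting several locations.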
Abstract:
DNA Barcoding (Hebert et al. 2003) has the potential to revolutionize the process of identifying and cataloguing biodiversity; however, significant controversy surrounds some of the proposed applications. In the seven years since DNA barcoding was introduced, the Web of Science records more than 600 studies that have weighed the pros and cons of this procedure. Unfortunately, the scientific community has been unable to come to any consensus on what threshold to use to differentiate species or even whether the barcoding region provides enough information to serve as an accurate species identification tool. The purpose of my thesis is to analyze mitochondrial DNA (mtDNA) barcoding’s potential to identify known species and provide a well-resolved phylogeny for the New Zealand cicada genus Kikihia. In order to do this, I created a phylogenetic tree for species in the genus Kikihia based solely on the barcoding region and compared it to a phylogeny previously created by Marshall et al. (2008) that benefits from information from other mtDNA and nuclear genes as well as species-specific song data. I determined how well the barcoding region delimits species that have been recognized based on morphology and song. In addition, I looked at the effect of sampling on the success of barcoding studies. I analyzed subsets of a larger, more densely sampled dataset for the Kikihia Muta Group to determine which aspects of my sampling strategy led to the most accurate identifications. Since DNA barcoding would by definition have problems in diagnosing hybrid individuals, I studied two species (K. “murihikua” and K. angusta) that are known to hybridize. Individuals that were not obvious hybrids (determined by morphology) were selected for the case study. Phylogenetic analysis of the barcoding region revealed insights into the reasons these two species could not be successfully differentiated using barcoding alone.
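Threshold-based species delimitation from barcode sequences can be sketched as single-linkage clustering of pairwise p-distances. This is an illustration of the general procedure with an arbitrary threshold, not the thesis's exact analysis of the Kikihia data:

```python
def p_distance(seq_a, seq_b):
    """Proportion of differing sites between two aligned sequences."""
    assert len(seq_a) == len(seq_b)
    return sum(a != b for a, b in zip(seq_a, seq_b)) / len(seq_a)

def delimit(seqs, threshold):
    """Single-linkage clusters: sequences within `threshold` p-distance merge.
    Returns clusters as sorted lists of sequence indices."""
    parent = list(range(len(seqs)))

    def find(i):  # union-find with path compression
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    for i in range(len(seqs)):
        for j in range(i + 1, len(seqs)):
            if p_distance(seqs[i], seqs[j]) <= threshold:
                parent[find(i)] = find(j)
    clusters = {}
    for i in range(len(seqs)):
        clusters.setdefault(find(i), []).append(i)
    return sorted(clusters.values())
```

The controversy summarized above is precisely about choosing `threshold`: too low splits species, too high lumps recently diverged or hybridizing taxa into one cluster.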
Abstract:
The PROPELLER (Periodically Rotated Overlapping Parallel Lines with Enhanced Reconstruction) magnetic resonance imaging (MRI) technique has inherent advantages over other fast imaging methods, including robust motion correction, reduced image distortion, and resistance to off-resonance effects. These features make PROPELLER highly desirable for T2*-sensitive imaging, high-resolution diffusion imaging, and many other applications. However, PROPELLER has been predominantly implemented as a fast spin-echo (FSE) technique, which is insensitive to T2* contrast, and requires time-inefficient signal averaging to achieve adequate signal-to-noise ratio (SNR) for many applications. These issues presently constrain the potential clinical utility of FSE-based PROPELLER. In this research, our aim was to extend and enhance the potential applications of PROPELLER MRI by developing a novel multiple gradient echo PROPELLER (MGREP) technique that can overcome the aforementioned limitations. The MGREP pulse sequence was designed to acquire multiple gradient-echo images simultaneously, without any increase in total scan time or RF energy deposition relative to FSE-based PROPELLER. A new parameter was also introduced for direct user control over gradient echo spacing, to allow variable sensitivity to T2* contrast. In parallel with pulse sequence development, an improved algorithm for motion correction was also developed and evaluated against the established method through extensive simulations. The potential advantages of MGREP over FSE-based PROPELLER were illustrated via three specific applications: (1) quantitative T2* measurement, (2) time-efficient signal averaging, and (3) high-resolution diffusion imaging. Relative to the FSE-PROPELLER method, the MGREP sequence was found to yield quantitative T2* values, increase SNR by ∼40% without any increase in acquisition time or RF energy deposition, and noticeably improve image quality in high-resolution diffusion maps.
In addition, the new motion-correction algorithm was found to considerably improve motion-artifact reduction. Overall, this work demonstrated a number of enhancements and extensions to existing PROPELLER techniques. The new technical capabilities of PROPELLER imaging, developed in this thesis research, are expected to serve as the foundation for further expanding the scope of PROPELLER applications.
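Quantitative T2* estimation from multiple gradient echoes reduces to fitting a monoexponential decay S(TE) = S0 · exp(−TE/T2*) across echo times. A minimal log-linear least-squares sketch on synthetic signals (not the MGREP reconstruction code):

```python
import math

def fit_t2star(echo_times, signals):
    """Fit S(TE) = S0 * exp(-TE / T2*) by least squares on ln(S).
    Returns the estimated T2* (same units as echo_times)."""
    logs = [math.log(s) for s in signals]
    n = len(echo_times)
    mt = sum(echo_times) / n
    ml = sum(logs) / n
    slope = (sum((t - mt) * (l - ml) for t, l in zip(echo_times, logs))
             / sum((t - mt) ** 2 for t in echo_times))
    return -1.0 / slope  # slope of ln(S) vs TE is -1/T2*
```

With noiseless synthetic data decaying with T2* = 30 ms, the fit recovers 30 ms exactly; with real magnitude data, noise-floor and multi-compartment effects would require a more careful model.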
Abstract:
Breast cancer is the most common non-skin cancer and the second leading cause of cancer-related death in women in the United States. Studies of ipsilateral breast tumor relapse (IBTR) status and disease-specific survival will help guide clinical treatment and predict patient prognosis. After breast conservation therapy, patients with breast cancer may experience breast tumor relapse. This relapse is classified into two distinct types: true local recurrence (TR) and new ipsilateral primary tumor (NP). However, the methods used to classify the relapse types are imperfect and prone to misclassification. In addition, some observed survival data (e.g., time to relapse and time from relapse to death) are strongly correlated with relapse type. The first part of this dissertation presents a Bayesian approach to (1) modeling the potentially misclassified relapse status and the correlated survival information, (2) estimating the sensitivity and specificity of the diagnostic methods, and (3) quantifying the covariate effects on event probabilities. A shared frailty was used to account for the within-subject correlation between survival times. Inference was conducted in a Bayesian framework via Markov chain Monte Carlo simulation implemented in the software WinBUGS. Simulation was used to validate the Bayesian method and assess its frequentist properties. The new model has two important innovations: (1) it utilizes the additional survival times correlated with relapse status to improve parameter estimation, and (2) it provides tools to address the correlation between the two diagnostic methods conditional on the true relapse types. Prediction of the patients at highest risk for IBTR after local excision of ductal carcinoma in situ (DCIS) remains a clinical concern.
The goals of the second part of this dissertation were to evaluate a published nomogram from Memorial Sloan-Kettering Cancer Center for determining the risk of IBTR in patients with DCIS treated with local excision, and to determine whether there is a subset of patients at low risk of IBTR. Patients who had undergone local excision from 1990 through 2007 at MD Anderson Cancer Center with a final diagnosis of DCIS (n=794) were included in this part. Clinicopathologic factors and the performance of the Memorial Sloan-Kettering Cancer Center nomogram for prediction of IBTR were assessed for 734 patients with complete data. The nomogram's predicted 5- and 10-year IBTR probabilities demonstrated imperfect calibration and discrimination, with an area under the receiver operating characteristic curve of .63 and a concordance index of .63. In conclusion, predictive models for IBTR in DCIS patients treated with local excision are imperfect. Our current ability to accurately predict recurrence based on clinical parameters is limited. The American Joint Committee on Cancer (AJCC) staging of breast cancer is widely used to determine prognosis, yet survival within each AJCC stage shows wide variation and remains unpredictable. For the third part of this dissertation, biologic markers were hypothesized to be responsible for some of this variation, and the addition of biologic markers to current AJCC staging was examined as a possible route to improved prognostication. The initial cohort included patients treated with surgery as first intervention at MDACC from 1997 to 2006. Cox proportional hazards models were used to create prognostic scoring systems. AJCC pathologic staging parameters and biologic tumor markers were investigated to devise the scoring systems. Surveillance, Epidemiology, and End Results (SEER) data were used as the external cohort to validate the scoring systems.
Binary indicators for pathologic stage (PS), estrogen receptor status (E), and tumor grade (G) were summed to create PS+EG scoring systems devised to predict 5-year patient outcomes. These scoring systems facilitated separation of the study population into more refined subgroups than the current AJCC staging system. The ability of the PS+EG score to stratify outcomes was confirmed in both internal and external validation cohorts. The current study proposes and validates a new staging system that incorporates tumor grade and ER status into current AJCC staging. We recommend that biologic markers be incorporated into revised versions of the AJCC staging system for patients receiving surgery as the first intervention. Chapter 1 focuses on developing a Bayesian method to handle misclassified relapse status and its application to breast cancer data. Chapter 2 focuses on evaluating a breast cancer nomogram for predicting the risk of IBTR in patients with DCIS after local excision. Chapter 3 focuses on validating a novel staging system for disease-specific survival in patients with breast cancer treated with surgery as the first intervention.
Abstract:
My dissertation focuses on two aspects of RNA sequencing technology. The first is methodology for modeling the overdispersion inherent in RNA-seq data for differential expression analysis; this aspect is addressed in three sections. The second is the application of RNA-seq data to identify the CpG island methylator phenotype (CIMP) by integrating datasets of mRNA expression level and DNA methylation status. Section 1: The cost of DNA sequencing has dropped dramatically in the past decade. Consequently, genomic research increasingly depends on sequencing technology. However, it remains unclear how sequencing capacity influences the accuracy of mRNA expression measurement. We observe that accuracy improves with increasing sequencing depth. To model the overdispersion, we use the beta-binomial distribution with a new parameter capturing the dependency between overdispersion and sequencing depth. Our modified beta-binomial model performs better than the binomial or the pure beta-binomial model, with a lower false discovery rate. Section 2: Although a number of methods have been proposed to accurately analyze differential RNA expression at the gene level, modeling at the base-pair level is required. Here, we find that the overdispersion rate decreases as the sequencing depth increases at the base-pair level. We also propose four models and compare them with each other. As expected, our beta-binomial model with a dynamic overdispersion rate is shown to be superior. Section 3: We investigate biases in RNA-seq by exploring the measurement of an external control, spike-in RNA. This study is based on two datasets with spike-in controls obtained from a recent study. We observe a previously undescribed bias in the measurement of the spike-in transcripts that arises from the influence of the sample transcripts in RNA-seq. We also find that this influence is related to the local sequence of the random hexamer used in priming.
We suggest a model of this between-sample inequality and a correction for this type of bias. Section 4: The expression of a gene can be turned off when its promoter is highly methylated. Several studies have reported that a clear threshold effect exists in gene silencing mediated by DNA methylation. It is reasonable to assume the thresholds are specific to each gene. It is also intriguing to investigate genes that are largely controlled by DNA methylation; we call these “L-shaped” genes. We develop a method to determine the DNA methylation threshold and identify a new CIMP of BRCA. In conclusion, we provide a detailed understanding of the relationship between the overdispersion rate and sequencing depth, and we reveal a new bias in RNA-seq together with a detailed understanding of its relationship to local sequence. We also develop a powerful method to dichotomize methylation status and, consequently, identify a new CIMP of breast cancer with a distinct classification of molecular characteristics and clinical features.
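The beta-binomial density at the core of Sections 1 and 2 can be sketched as follows; the specific link between overdispersion and depth shown here is a hypothetical illustration, and the dissertation's actual parameterization may differ:

```python
from math import lgamma, exp

def log_betabinom_pmf(k, n, alpha, beta):
    """log P(k | n, alpha, beta) for the beta-binomial distribution:
    C(n, k) * B(k + alpha, n - k + beta) / B(alpha, beta), in log space."""
    return (lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)
            + lgamma(k + alpha) + lgamma(n - k + beta) - lgamma(n + alpha + beta)
            + lgamma(alpha + beta) - lgamma(alpha) - lgamma(beta))

def depth_dependent_params(p, rho0, c, depth):
    """Toy link making overdispersion shrink with sequencing depth:
    rho = rho0 / (1 + c * depth), then alpha = p*(1-rho)/rho and
    beta = (1-p)*(1-rho)/rho, so that rho = 1 / (alpha + beta + 1)."""
    rho = rho0 / (1.0 + c * depth)
    return p * (1 - rho) / rho, (1 - p) * (1 - rho) / rho
```

As depth grows, rho shrinks, alpha and beta grow, and the beta-binomial approaches a plain binomial with success probability p, mirroring the observed decrease in overdispersion at high sequencing depth.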