5 results for bottom-up analysis
in Digital Commons - Michigan Tech
Abstract:
The developmental processes and functions of an organism are controlled by its genes and the proteins derived from them. Identifying key genes and reconstructing gene networks can provide a model to help us understand the regulatory mechanisms behind the initiation and progression of biological processes or functional abnormalities (e.g., diseases) in living organisms. In this dissertation, I developed statistical methods to identify the genes and transcription factors (TFs) involved in biological processes, constructed their regulatory networks, and evaluated some existing association methods to find robust methods for coexpression analyses. Two kinds of data sets were used for this work: genotype data and gene expression microarray data. On the basis of these data sets, this dissertation has two major parts, together forming six chapters. The first part deals with developing association methods for rare variants using genotype data (chapters 4 and 5). The second part deals with developing and/or evaluating statistical methods to identify genes and TFs involved in biological processes, and with constructing their regulatory networks using gene expression data (chapters 2, 3, and 6). For the first part, I developed two methods to find the groupwise association of rare variants with given diseases or traits. The first method is based on kernel machine learning and can be applied to both quantitative and qualitative traits. Simulation results showed that the proposed method has improved power over the existing weighted sum (WS) method in most settings. The second method uses multiple phenotypes to select a few top significant genes. It then finds the association of each gene with each phenotype while controlling for population stratification by adjusting the data for ancestry using principal components. This method was applied to the GAW 17 data and identified several disease risk genes. For the second part, I worked on three problems. The first problem involved evaluating eight gene association methods; a comprehensive comparison with further analysis clearly demonstrates the distinct and common performance characteristics of these eight methods. For the second problem, an algorithm named the bottom-up graphical Gaussian model was developed to identify the TFs that regulate pathway genes and to reconstruct their hierarchical regulatory networks. This algorithm produced highly significant results, and this is the first report of such hierarchical networks for these pathways. The third problem dealt with developing another algorithm, the top-down graphical Gaussian model, which identifies the network governed by a specific TF. The network produced by this algorithm is shown to be of very high accuracy.
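The abstract above does not reproduce the algorithms themselves, but the central quantity in any graphical Gaussian model is the matrix of partial correlations obtained from the inverse of the gene-gene covariance (the precision matrix). The following Python sketch is a minimal illustration of that step only, not the dissertation's implementation; the function name and the pseudo-inverse shortcut are assumptions for the example.

    import numpy as np

    def partial_correlations(expr):
        """Partial-correlation matrix for a genes-x-samples array,
        the core quantity of a graphical Gaussian model."""
        cov = np.cov(expr)            # gene-gene covariance (rows = genes)
        prec = np.linalg.pinv(cov)    # precision matrix; with more genes
                                      # than samples a shrinkage estimator
                                      # would be needed instead
        d = np.sqrt(np.diag(prec))
        pcor = -prec / np.outer(d, d) # pcor_ij = -p_ij / sqrt(p_ii * p_jj)
        np.fill_diagonal(pcor, 1.0)
        return pcor

    # Example: 50 genes measured on 200 arrays (random placeholder data)
    rng = np.random.default_rng(0)
    print(partial_correlations(rng.normal(size=(50, 200))).shape)  # (50, 50)

Network edges are then drawn between gene pairs whose partial correlation is significantly nonzero; per the abstract, the bottom-up and top-down algorithms build hierarchical TF layers around a core of this kind.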
Abstract:
Attempts to strengthen a chromium-modified titanium trialuminide by a combination of grain size refinement and dispersoid strengthening led to a new means of synthesizing such materials. This Reactive Mechanical Alloying/Milling process uses in situ reactions between the metallic powders and elements from a process control agent and/or a gaseous environment to assemble a dispersed phase of small, hard particles within the matrix by a bottom-up approach. In the current research, milled powders of the trialuminide alloy along with titanium carbide were produced. The amount of the carbide can be varied widely with simple processing changes, and in this case the milling process created trialuminide grain sizes and carbide particles that are the smallest known from such a process. Characterization of these materials required the development of x-ray diffraction methods to determine particle sizes, by deconvoluting and synthesizing components of the complex multiphase diffraction patterns, and to carry out whole-pattern analysis of the diffuse scattering that developed from larger-than-usual, highly defective grain boundary regions. These regions provide an important mass transport capability during processing and not only facilitate the alloy development but also add to the understanding of the mechanical alloying process. Consolidation of the milled powder, which consisted of small crystallites of the alloy and dispersed carbide particles two nanometers in size, formed a unique, somewhat coarsened microstructure, producing an ultra-high-strength solid material composed of the chromium-modified titanium trialuminide alloy matrix with small platelets of the complex carbides Ti2AlC and Ti3AlC2. This synthesis process provides the unique ability to nano-engineer a wide variety of composite materials or special alloys, and it has been shown to extend to a wide variety of metallic materials.
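The abstract does not detail its size-determination procedure, but the textbook starting point for crystallite size from diffraction peak broadening is the Scherrer equation, D = Kλ/(β cos θ). A minimal Python sketch follows; the Cu Kα wavelength, shape factor K = 0.9, and example peak are conventional placeholders, not values from the dissertation.

    import numpy as np

    def scherrer_size(fwhm_deg, two_theta_deg, wavelength_nm=0.15406, K=0.9):
        """Crystallite size in nm from XRD peak broadening:
        D = K * lambda / (beta * cos(theta))."""
        beta = np.radians(fwhm_deg)            # peak breadth in radians
        theta = np.radians(two_theta_deg / 2)  # Bragg angle
        return K * wavelength_nm / (beta * np.cos(theta))

    # A peak roughly 4 degrees wide at 2-theta = 39 degrees (Cu K-alpha)
    # implies crystallites only ~2 nm across, the scale reported above.
    print(scherrer_size(4.0, 39.0))            # ~2.1

Multiphase patterns like those described above further require deconvoluting overlapping peaks and separating strain broadening from size broadening, which is what the whole-pattern analysis addresses.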
Abstract:
Approximately 90% of fine aerosol in the Midwestern United States has a regional component, with a sizable fraction attributed to secondary production of organic aerosol (SOA). The Ozark Forest is an important source of biogenic SOA precursors like isoprene (> 150 mg m⁻² d⁻¹), monoterpenes (10-40 mg m⁻² d⁻¹), and sesquiterpenes (10-40 mg m⁻² d⁻¹). Anthropogenic sources include secondary sulfate, nitrate, and biomass burning (51-60%), vehicle emissions (17-26%), and industrial emissions (16-18%). Vehicle emissions are an important source of volatile and vapor-phase, semivolatile aliphatic and aromatic hydrocarbons, which are important anthropogenic SOA precursors. The short lifetime of SOA precursors and the complex mixture of functionalized oxidation products make rapid sampling, quantitative processing methods, and comprehensive organic molecular analysis essential elements of a comprehensive strategy to advance understanding of SOA formation pathways. Uncertainties in forecasting SOA production on regional scales are large and are related to uncertainties in biogenic emission inventories and in the measurement of SOA yields under ambient conditions. This work presents a bottom-up approach to developing a conifer emission inventory based on foliar and cortical oleoresin composition, development of a model to estimate terpene and terpenoid signatures of foliar and bole emissions from conifers, development of processing and analytic techniques for comprehensive organic molecular characterization of SOA precursors and oxidation products, implementation of the high-volume sampling technique to measure OA and vapor-phase organic matter, and results from a 5-day field experiment conducted to evaluate temporal and diurnal trends in SOA precursors and oxidation products. A total of 98, 115, and 87 terpene and terpenoid species were identified and quantified in commercially available essential oils of Pinus sylvestris, Picea mariana, and Thuja occidentalis, respectively, by comprehensive two-dimensional gas chromatography with time-of-flight mass spectrometric detection (GC × GC-ToF-MS). Analysis of the literature showed that cortical oleoresin composition was similar to the foliar composition of the oldest branches. Our proposed conceptual model for estimating signatures of terpene and terpenoid emissions from foliar and cortical oleoresin showed that the emission potentials of the foliar and bole release pathways are dissimilar and should be considered for conifer species that develop resin blisters or are infested with herbivores or pathogens. Average derivatization efficiencies for Methods 1 and 2 were 87.9% and 114%, respectively. Despite the lower average derivatization efficiency of Method 1, its distinct advantages included greater certainty of derivatization yield for the entire suite of multi- and poly-functional species and fewer processing steps for sequential derivatization. Detection limits for Method 1 using GC × GC-ToF-MS were 0.09-1.89 ng μL⁻¹. A theoretical retention index diagram was developed for a hypothetical GC × 2GC analysis of the complex mixture of SOA precursors and derivatized oxidation products. In general, species eluted (relative to the alkyl diester reference compounds) from the primary column (DB-210) in bands according to carbon number (n) and from the secondary columns (BPX90, SolGel-WAX) according to functionality, essentially making the GC × 2GC retention diagram a carbon number-functionality grid. The species clustered into 35 groups by functionality, and species within each group exhibited good separation by n. Average recoveries of n-alkanes and polyaromatic hydrocarbons (PAHs) by Soxhlet extraction of XAD-2 resin with dichloromethane were 80.1 ± 16.1% and 76.1 ± 17.5%, respectively. Vehicle emissions were the common source of HSVOCs [i.e., resolved alkanes, the unresolved complex mixture (UCM), alkylbenzenes, and 2- and 3-ring PAHs]. An absence of monoterpenes at 0600-1000 and high concentrations of monoterpenoids during the same period indicated substantial losses of monoterpenes overnight and in the early morning hours. Post-collection, comprehensive organic molecular characterization of SOA precursors and products by GC × GC-ToF-MS in ambient air collected with ~2 hr resolution is a promising method for determining the biogenic and anthropogenic SOA yields that can be used to evaluate SOA formation models.
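Retention index diagrams like the one described above place each analyte by interpolating between reference compounds. The GC × 2GC diagram here uses alkyl diesters as references; the classical Kovats index computed against n-alkane brackets, sketched below in Python with made-up retention times, illustrates the same interpolation principle.

    import math

    def kovats_ri(t_x, t_n, t_n1, n):
        """Isothermal Kovats retention index of an analyte eluting at
        t_x between the n-alkanes with n and n+1 carbons (t_n, t_n1)."""
        return 100 * (n + (math.log(t_x) - math.log(t_n))
                        / (math.log(t_n1) - math.log(t_n)))

    # Hypothetical times: analyte at 6.2 min between C10 (5.0 min)
    # and C11 (7.5 min) on the same column and temperature program
    print(kovats_ri(6.2, 5.0, 7.5, 10))  # ~1053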
Abstract:
As environmental problems have become more complex, policy and regulatory decisions have become far more difficult to make. The use of science has become an important practice in the decision-making process of many federal agencies. Many different types of scientific information are used to make decisions within the EPA, with computer models becoming especially important. Environmental models are used throughout the EPA in a variety of contexts, and their predictive capacity has become highly valued in decision making. The main focus of this research is to examine the EPA's Council for Regulatory Environmental Modeling (CREM) as a case study in addressing science issues, particularly models, in government agencies. Specifically, the goal was to answer the following questions: What is the history of the CREM, and how can this information shed light on the process of science policy implementation? What were the goals of implementing the CREM? Were these goals reached, and how have they changed? What impediments has the CREM faced, and why did they occur? The three main sources of information for this research were observations during summer employment with the CREM, document review, and supplemental interviews with CREM participants and other members of the modeling community. Examining the history of modeling at the EPA, as well as the history of the CREM, provides insight into the many challenges faced when implementing science policy and science policy programs. After examining the many impediments the CREM has faced in implementing modeling policies, it was clear that they fall into two categories: classic and paradoxical. Classic impediments are the standard impediments to science policy implementation that might be found in any regulatory environment, such as lack of resources and changes in administration. Paradoxical impediments are cyclical in nature, with no clear solution, such as balancing top-down versus bottom-up initiatives and coping with differing perceptions. These impediments, when not properly addressed, severely hinder an organization's ability to successfully implement science policy.
Abstract:
The remarkable advances in nanoscience and nanotechnology over the last two decades allow one to manipulate individual atoms, molecules, and nanostructures; make it possible to build devices only a few nanometers in size; and enhance the nano-bio fusion in tackling biological and medical problems. This complies with the ever-increasing need for device miniaturization, from magnetic storage devices and electronic building blocks for computers to chemical and biological sensors. Despite continuing efforts, conventional methods are likely to reach the fundamental limit of miniaturization in the next decade, when feature lengths shrink below 100 nm. On the one hand, quantum mechanical effects of the underlying material structure dominate device characteristics. On the other hand, one faces the technical difficulty of fabricating uniform devices. This has posed a great challenge for both the scientific and the technical communities. The proposal of using a single or a few organic molecules in electronic devices has not only opened an alternative route to miniaturization in electronics but also brought brand-new concepts and physical working mechanisms to electronic devices. This thesis work stands as one of the efforts in understanding and building electronic functional units at the molecular and atomic levels. We have explored the possibility of having molecules work in a wide spectrum of electronic devices, including molecular wires, spin valves/switches, diodes, transistors, and sensors. More specifically, we have observed a significant magnetoresistive effect in a spin-valve structure in which the non-magnetic spacer sandwiched between two magnetic conducting materials is replaced by a self-assembled monolayer of organic molecules or a single molecule (such as a carbon fullerene). The diode behavior of donor(D)-bridge(B)-acceptor(A) single molecules is then discussed, and a unimolecular transistor is designed. Lastly, we have proposed and preliminarily tested the idea of using functionalized electrodes for rapid nanopore DNA sequencing. In these studies, the fundamental roles of molecules and molecule-electrode interfaces in quantum electron transport have been investigated based on first-principles calculations of the electronic structure. Both the intrinsic properties of the molecules themselves and the detailed interfacial features are found to play critical roles in electron transport at the molecular scale. The flexibility and tailorability of molecular properties have opened great opportunities for the purpose-driven design of electronic devices from the bottom up. The results gained from this work have helped in understanding the underlying physics, developing the fundamental mechanisms, and providing guidance for future experimental efforts.
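The abstract summarizes rather than derives its transport framework, but first-principles molecular conduction calculations of this kind are conventionally cast in the Landauer picture, where conductance and current follow from a transmission function T(E). The Python sketch below assumes a hypothetical flat T(E) purely for illustration; in practice T(E) would come from the electronic-structure calculations described above.

    import numpy as np

    G0 = 7.748091729e-5   # conductance quantum 2e^2/h, in siemens

    def conductance(T_at_fermi):
        """Zero-bias, zero-temperature conductance: G = G0 * T(E_F)."""
        return G0 * T_at_fermi

    def current(T, E_eV, mu_L, mu_R):
        """Zero-temperature Landauer current in amperes:
        I = (2e/h) * integral of T(E) dE over the bias window."""
        e, h = 1.602176634e-19, 6.62607015e-34
        lo, hi = sorted((mu_L, mu_R))
        win = (E_eV >= lo) & (E_eV <= hi)
        return (2 * e / h) * np.trapz(T[win], E_eV[win]) * e  # eV -> J

    # Hypothetical flat transmission of 0.01 across a 0.5 V bias window
    E = np.linspace(-1.0, 1.0, 2001)
    T = np.full_like(E, 0.01)
    print(conductance(0.01))           # ~0.77 microsiemens
    print(current(T, E, -0.25, 0.25))  # ~0.39 microamps

A single fully transmitting molecular channel (T = 1) thus conducts about 77.5 μS, which sets the natural scale against which the molecular junctions discussed above are compared.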