Abstract:
Every x-ray attenuation curve inherently contains all the information necessary to extract the complete energy spectrum of a beam. To date, attempts to obtain accurate spectral information from attenuation data have been inadequate. This investigation presents a mathematical pair model, grounded in physical reality by the Laplace transformation, to describe the attenuation of a photon beam and the corresponding bremsstrahlung spectral distribution. In addition, the Laplace model has been mathematically extended to include characteristic radiation in a physically meaningful way. A method to determine the fraction of characteristic radiation in any diagnostic x-ray beam was introduced for use with the extended model. This work has examined the reconstructive capability of the Laplace pair model for photon beams ranging from 50 kVp to 25 MV, using both theoretical and experimental methods. In the diagnostic region, excellent agreement between a wide variety of experimental spectra and those reconstructed with the Laplace model was obtained when the atomic composition of the attenuators was accurately known. The model successfully reproduced a 2 MV spectrum but demonstrated difficulty in accurately reconstructing orthovoltage and 6 MV spectra. The 25 MV spectrum was successfully reconstructed, although poor agreement with the spectrum obtained by Levy was found. The analysis of errors, performed with diagnostic energy data, demonstrated the relative insensitivity of the model to typical experimental errors and confirmed that the model can be used to derive accurate spectral information from experimental attenuation data.
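A minimal sketch of the relationship such a pair model exploits (the notation here is assumed for illustration, not taken from the abstract): the transmitted intensity of a polyenergetic beam through an attenuator of thickness $x$ is

\[
T(x) = \int_0^{E_{\max}} \phi(E)\, e^{-\mu(E)\,x}\, \mathrm{d}E ,
\]

and changing the integration variable from energy $E$ to attenuation coefficient $\mu$ gives

\[
T(x) = \int_{\mu_{\min}}^{\infty} \psi(\mu)\, e^{-\mu x}\, \mathrm{d}\mu = \mathcal{L}\{\psi(\mu)\}(x) ,
\]

so the measured attenuation curve is the Laplace transform of the spectrum re-expressed in terms of $\mu$, and the spectral distribution $\phi(E)$ follows from inverting the transform.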
Abstract:
Microarray technology is a high-throughput method for genotyping and gene expression profiling. Limited sensitivity and specificity are among the essential problems of this technology. Most existing methods of microarray data analysis have an apparent limitation: they deal only with the numerical part of microarray data and make little use of gene sequence information. Because it is the gene sequences that precisely define the physical objects being measured by a microarray, it is natural to make the gene sequences an essential part of the data analysis. This dissertation focused on the development of free-energy models to integrate sequence information into microarray data analysis. The models were used to characterize the mechanism of hybridization on microarrays and to enhance the sensitivity and specificity of microarray measurements. Cross-hybridization is a major obstacle to the sensitivity and specificity of microarray measurements. In this dissertation, we evaluated the scope of the cross-hybridization problem on short-oligo microarrays. The results showed that cross-hybridization on arrays is mostly caused by oligo fragments with a run of 10 to 16 nucleotides complementary to the probes. Furthermore, a free-energy-based model was proposed to quantify the amount of cross-hybridization signal on each probe. This model treats cross-hybridization as an integral effect of the interactions between a probe and various off-target oligo fragments. Using public spike-in datasets, the model showed high accuracy in predicting the cross-hybridization signals on those probes whose intended targets are absent from the sample. Several prospective models were proposed to improve the Positional-Dependent Nearest-Neighbor (PDNN) model for better quantification of gene expression and cross-hybridization. The problem addressed in this dissertation is fundamental to microarray technology. We expect that this study will help us to understand the detailed mechanism that determines sensitivity and specificity on microarrays. Consequently, this research will have a wide impact on how microarrays are designed and how the data are interpreted.
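To make the modeling idea concrete, here is a minimal sketch of a PDNN-style duplex free-energy calculation in Python; the functional form, the positional weight profile, and all parameter values are assumptions for illustration, not the dissertation's fitted model.

```python
# Hypothetical nearest-neighbor stacking energies (kcal/mol); real
# values would be fitted from microarray data.
NN_ENERGY = {
    "AA": -1.0, "AC": -1.4, "AG": -1.3, "AT": -0.9,
    "CA": -1.5, "CC": -1.8, "CG": -2.2, "CT": -1.3,
    "GA": -1.3, "GC": -2.4, "GG": -1.8, "GT": -1.4,
    "TA": -0.6, "TC": -1.3, "TG": -1.5, "TT": -1.0,
}

def positional_weight(k: int, n_stacks: int) -> float:
    """Assumed weight profile: stacks near the middle of the probe
    contribute more to duplex stability than stacks at the ends."""
    center = (n_stacks - 1) / 2.0
    return 1.0 - abs(k - center) / n_stacks

def duplex_energy(probe: str) -> float:
    """Position-weighted sum of nearest-neighbor stacking energies
    over all dinucleotide stacks in the probe sequence."""
    n_stacks = len(probe) - 1
    return sum(positional_weight(k, n_stacks) * NN_ENERGY[probe[k:k + 2]]
               for k in range(n_stacks))

# Example: binding energy of a 25-mer probe (typical short-oligo length).
print(duplex_energy("ATCGGCTAAGCTCGTACGATCGTAC"))
```

In a model of this family, the predicted probe signal would then be a saturating function of such energies summed over the intended target and all off-target fragments, which is what allows cross-hybridization to be quantified per probe.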
Abstract:
Mixture modeling is commonly used to model categorical latent variables that represent subpopulations in which population membership is unknown but can be inferred from the data. In recent years, finite mixture models have been applied to time-to-event data. However, the commonly used survival mixture model assumes that the effects of the covariates on failure times differ across latent classes while the covariate distribution is homogeneous. The aim of this dissertation is to develop a method to examine time-to-event data in the presence of unobserved heterogeneity within a mixture-modeling framework. A joint model is developed to incorporate the latent survival trajectory along with the observed information for the joint analysis of a time-to-event variable, its discrete and continuous covariates, and a latent class variable. It is assumed that both the effects of covariates on survival times and the distribution of the covariates vary across latent classes. The unobservable survival trajectories are identified by estimating the probability that a subject belongs to a particular class based on the observed information. We applied this method to a Hodgkin lymphoma study with long-term follow-up and observed four distinct latent classes in terms of long-term survival and distributions of prognostic factors. Our results from simulation studies and from the Hodgkin lymphoma study demonstrated the superiority of our joint model over the conventional survival model. This flexible inference method provides more accurate estimation and accommodates unobservable heterogeneity among individuals while taking interactions between covariates into consideration.
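In generic form (the notation here is assumed, not quoted from the dissertation), such a joint model factors the density of the event time $t$ and covariates $\mathbf{x}$ over $K$ latent classes,

\[
f(t, \mathbf{x}) = \sum_{k=1}^{K} \pi_k\, f_k(t \mid \mathbf{x}; \boldsymbol{\beta}_k)\, g_k(\mathbf{x}; \boldsymbol{\theta}_k),
\]

with class-specific survival parameters $\boldsymbol{\beta}_k$ and covariate distributions $g_k$; class membership is then inferred from the posterior probability

\[
P(C = k \mid t, \mathbf{x}) = \frac{\pi_k\, f_k(t \mid \mathbf{x}; \boldsymbol{\beta}_k)\, g_k(\mathbf{x}; \boldsymbol{\theta}_k)}{\sum_{j=1}^{K} \pi_j\, f_j(t \mid \mathbf{x}; \boldsymbol{\beta}_j)\, g_j(\mathbf{x}; \boldsymbol{\theta}_j)} .
\]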
Abstract:
Essential biological processes are governed by organized, dynamic interactions between multiple biomolecular systems. Complexes are thus formed to enable the biological function and are disassembled as the process is completed. Examples of such processes include the translation of messenger RNA into protein by the ribosome, the folding of proteins by chaperonins, or the entry of viruses into host cells. Understanding these fundamental processes by characterizing the molecular mechanisms that enable them would allow the better design of therapies and drugs. Such molecular mechanisms may be revealed through the structural elucidation of the biomolecular assemblies at the core of these processes. Various experimental techniques may be applied to investigate the molecular architecture of biomolecular assemblies. High-resolution techniques, such as X-ray crystallography, may solve the atomic structure of the system, but are typically constrained to biomolecules of reduced flexibility and dimensions. In particular, X-ray crystallography requires the sample to form a three-dimensional (3D) crystal lattice, which is technically difficult, if not impossible, to obtain, especially for large, dynamic systems. Often these techniques solve the structure of the different constituent components within the assembly, but encounter difficulties when investigating the entire system. On the other hand, imaging techniques, such as cryo-electron microscopy (cryo-EM), are able to depict large systems in a near-native environment, without requiring the formation of crystals. The structures solved by cryo-EM cover a wide range of resolutions, from very low levels of detail where only the overall shape of the system is visible, to high resolutions that approach, but do not yet reach, atomic level of detail. In this dissertation, several modeling methods are introduced to either integrate cryo-EM datasets with structural data from X-ray crystallography, or to directly interpret the cryo-EM reconstruction. These computational techniques were developed with the goal of creating an atomic model for the cryo-EM data. Low-resolution reconstructions lack the level of detail to permit a direct atomic interpretation, i.e. one cannot reliably locate the atoms or amino-acid residues within the structure obtained by cryo-EM. One therefore needs to consider additional information, for example, structural data from other sources such as X-ray crystallography, in order to enable such a high-resolution interpretation. Modeling techniques are thus developed to integrate the structural data from the different biophysical sources; examples include the work described in manuscripts I and II of this dissertation. At intermediate and high resolution, cryo-EM reconstructions depict consistent 3D folds, such as tubular features that in general correspond to alpha-helices. Such features can be annotated and later used to build the atomic model of the system, as an alternative approach (see manuscript III). Three manuscripts are presented as part of the PhD dissertation, each introducing a computational technique that facilitates the interpretation of cryo-EM reconstructions. The first manuscript is an application paper that describes a heuristic to generate the atomic model for the protein envelope of the Rift Valley fever virus. The second manuscript introduces evolutionary tabu search strategies to enable the integration of multiple component atomic structures with the cryo-EM map of their assembly.
Finally, the third manuscript develops the latter technique further and applies it to annotate consistent 3D patterns in intermediate-resolution cryo-EM reconstructions. The first manuscript, titled An assembly model for Rift Valley fever virus, was submitted for publication in the Journal of Molecular Biology. The cryo-EM structure of the Rift Valley fever virus was previously solved at 27 Å resolution by Dr. Freiberg and collaborators. This reconstruction shows the overall shape of the virus envelope, yet the reduced level of detail prevents a direct atomic interpretation. High-resolution structures are not yet available for the entire virus nor for the two different component glycoproteins that form its envelope. However, homology models may be generated for these glycoproteins based on similar structures that are available at atomic resolution. The manuscript presents the steps required to identify an atomic model of the entire virus envelope, based on the low-resolution cryo-EM map of the envelope and the homology models of the two glycoproteins. Starting with the results of an exhaustive search to place the two glycoproteins, the model is built iteratively by running multiple multi-body refinements to hierarchically generate models for the different regions of the envelope. The generated atomic model is supported by prior knowledge of virus biology and contains valuable information about the molecular architecture of the system. It provides the basis for further investigations seeking to reveal different processes in which the virus is involved, such as assembly or fusion. The second manuscript was recently published in the Journal of Structural Biology (doi:10.1016/j.jsb.2009.12.028) under the title Evolutionary tabu search strategies for the simultaneous registration of multiple atomic structures in cryo-EM reconstructions. This manuscript introduces the evolutionary tabu search strategies applied to enable a multi-body registration. The technique is a hybrid approach that combines a genetic algorithm with a tabu search strategy to promote the proper exploration of the high-dimensional search space (a generic sketch of this idea appears after this abstract). As with the Rift Valley fever virus, it is common that the structure of a large multi-component assembly is available at low resolution from cryo-EM, while high-resolution structures are solved for the different components but are lacking for the entire system. Evolutionary tabu search strategies enable the building of an atomic model for the entire system by considering the different components simultaneously. Such a registration indirectly introduces spatial constraints, as all components need to be placed within the assembly, enabling them to be properly docked in the low-resolution map of the entire assembly. Along with the method description, the manuscript covers the validation, presenting the benefit of the technique in both synthetic and experimental test cases. The approach successfully docked multiple components at resolutions up to 40 Å. The third manuscript is entitled Evolutionary Bidirectional Expansion for the Annotation of Alpha Helices in Electron Cryo-Microscopy Reconstructions and was submitted for publication in the Journal of Structural Biology. The modeling approach described in this manuscript applies the evolutionary tabu search strategies in combination with a bidirectional expansion to annotate secondary structure elements in intermediate-resolution cryo-EM reconstructions.
In particular, secondary structure elements such as alpha helices show consistent patterns in cryo-EM data and are visible as rod-like regions of high density. The evolutionary tabu search strategy is applied to identify the placement of the different alpha helices, while the bidirectional expansion characterizes their length and curvature. The manuscript presents the validation of the approach at resolutions ranging between 6 and 14 Å, a level of detail where alpha helices are visible. Up to a resolution of 12 Å, the method achieves sensitivities between 70% and 100% as estimated in experimental test cases, i.e. 70-100% of the alpha helices were correctly predicted in an automatic manner in the experimental data. The three manuscripts presented in this PhD dissertation cover different computational methods for the integration and interpretation of cryo-EM reconstructions. The methods were developed in the molecular modeling software Sculptor (http://sculptor.biomachina.org) and are available to the scientific community interested in the multi-resolution modeling of cryo-EM data. The work spans a wide range of resolutions, covering multi-body refinement and registration at low resolution along with the annotation of consistent patterns at high resolution. Such methods are essential for the modeling of cryo-EM data and may be applied in other fields where similar spatial problems are encountered, such as medical imaging.
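As an illustration of the general idea, here is a minimal Python sketch of a genetic algorithm combined with a tabu list for a multi-body placement problem. The encoding, parameters, and placeholder fitness function are assumptions for illustration and do not reproduce the Sculptor implementation, where the score would be the cross-correlation between the placed component densities and the cryo-EM map.

```python
import random

N_COMPONENTS = 2
GENES = 6 * N_COMPONENTS  # x, y, z plus three Euler angles per component

def fitness(ind):
    """Placeholder score; in the real problem this would be the
    cross-correlation of the simulated density of the placed
    components with the experimental cryo-EM map."""
    return -sum((g - 0.5) ** 2 for g in ind)

def mutate(ind, sigma=0.05):
    """Gaussian perturbation of every placement parameter."""
    return [g + random.gauss(0.0, sigma) for g in ind]

def crossover(a, b):
    """One-point crossover between two parent placements."""
    cut = random.randrange(1, GENES)
    return a[:cut] + b[cut:]

def is_tabu(ind, tabu, radius=0.1):
    """Reject candidates too close to recently visited solutions,
    steering the search away from already explored regions."""
    return any(max(abs(g - t) for g, t in zip(ind, seen)) < radius
               for seen in tabu)

def evolve(pop_size=40, generations=100, tabu_len=20):
    pop = [[random.random() for _ in range(GENES)]
           for _ in range(pop_size)]
    tabu = []
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        tabu = (tabu + [pop[0]])[-tabu_len:]   # remember the current best
        parents = pop[:pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            child = mutate(crossover(*random.sample(parents, 2)))
            for _ in range(50):                # bounded tabu retries
                if not is_tabu(child, tabu):
                    break
                child = mutate(crossover(*random.sample(parents, 2)))
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print("best score:", fitness(best))
```

The tabu list is what distinguishes this hybrid from a plain genetic algorithm: by penalizing revisits to recently explored placements, it pushes the population to sample new regions of the high-dimensional search space instead of collapsing prematurely onto one local optimum.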
Abstract:
A comprehensive hydroclimatic data set is presented for the 2011 water year to improve understanding of hydrologic processes in the rain-snow transition zone. This type of dataset is extremely rare in the scientific literature because of the quality and quantity of its soil depth, soil texture, soil moisture, and soil temperature data. Standard meteorological and snow cover data for the entire 2011 water year are included, covering several rain-on-snow events. Surface soil textures and soil depths from 57 points are presented, as well as soil texture profiles from 14 points. Meteorological data include continuous hourly shielded, unshielded, and wind-corrected precipitation, wind speed, air temperature, relative humidity, dew point temperature, and incoming solar and thermal radiation data. Sub-surface data include hourly soil moisture data from multiple depths in 7 soil profiles within the catchment, and soil temperatures from multiple depths in 2 soil profiles. Hydrologic response data include hourly stream discharge from the catchment outlet weir, continuous snow depths from one location, intermittent snow depths from 5 locations, and snow depth and density data from ten weekly snow surveys. Though it represents only a single water year, the presentation of both above- and below-ground hydrologic conditions makes it one of the most detailed and complete hydroclimatic datasets from the climatically sensitive rain-snow transition zone, suitable for a wide range of modeling and descriptive studies.
Abstract:
The formation of calcareous skeletons by marine planktonic organisms and their subsequent sinking to depth generates a continuous rain of calcium carbonate to the deep ocean and underlying sediments. This is important in regulating marine carbon cycling and ocean-atmosphere CO2 exchange. The present rise in atmospheric CO2 levels causes significant changes in surface ocean pH and carbonate chemistry. Such changes have been shown to slow down calcification in corals and coralline macroalgae, but the majority of marine calcification occurs in planktonic organisms. Here we report reduced calcite production at increased CO2 concentrations in monospecific cultures of two dominant marine calcifying phytoplankton species, the coccolithophorids Emiliania huxleyi and Gephyrocapsa oceanica. This was accompanied by an increased proportion of malformed coccoliths and incomplete coccospheres. Diminished calcification led to a reduction in the ratio of calcite precipitation to organic matter production. Similar results were obtained in incubations of natural plankton assemblages from the North Pacific Ocean when exposed to experimentally elevated CO2 levels. We suggest that the progressive increase in atmospheric CO2 concentrations may therefore slow down the production of calcium carbonate in the surface ocean. As the process of calcification releases CO2 to the atmosphere, the response observed here could potentially act as a negative feedback on atmospheric CO2 levels.
Abstract:
We investigate the sensitivity of U/Ca, Mg/Ca, and Sr/Ca to changes in seawater [CO3^2-] and temperature in calcite produced by two planktonic foraminifera species, Orbulina universa and Globigerina bulloides, in laboratory culture experiments. Our results demonstrate that at constant temperature, U/Ca in O. universa decreases by 25 +/- 7% per 100 µmol kg^-1 [CO3^2-], as seawater [CO3^2-] increases from 110 to 470 µmol kg^-1. Results from G. bulloides suggest a similar relationship, but U/Ca is consistently offset by ~+40% at the same environmental [CO3^2-]. In O. universa, U/Ca is insensitive to temperature between 15°C and 25°C. Applying the O. universa relationship to three U/Ca records from a related species, Globigerinoides sacculifer, we estimate that Caribbean and tropical Atlantic [CO3^2-] was 110 +/- 70 µmol kg^-1 and 80 +/- 40 µmol kg^-1 higher, respectively, during the last glacial period relative to the Holocene. This result is consistent with estimates of the glacial-interglacial change in surface water [CO3^2-] based both on modeling and on boron isotope pH estimates. In settings where the addition of U by diagenetic processes is not a factor, down-core records of foraminiferal U/Ca have the potential to provide information about changes in the ocean's carbonate ion concentration.
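One way such a calibration can be applied down-core (the exponential functional form here is an assumption for illustration; the paper's exact regression is not given in the abstract): writing U/Ca $= a\,e^{b[\mathrm{CO_3^{2-}}]}$ with $e^{100b} = 0.75$ (a 25% decrease per 100 µmol kg$^{-1}$) gives $b \approx -2.9 \times 10^{-3}$ kg µmol$^{-1}$, so a measured glacial-to-Holocene ratio change implies

\[
\Delta[\mathrm{CO_3^{2-}}] = \frac{1}{b}\,\ln\!\left(\frac{(\mathrm{U/Ca})_{\mathrm{glacial}}}{(\mathrm{U/Ca})_{\mathrm{Holocene}}}\right).
\]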
Abstract:
Production pathways of the prominent volatile organic halogen compound methyl iodide (CH3I) are not fully understood. Based on observations, production of CH3I via photochemical degradation of organic material or via phytoplankton production has been proposed. Correlations between observed biological and environmental variables, as well as biogeochemical modeling, have so far failed to identify the source of methyl iodide unambiguously. In this study, we address this question of source mechanisms with a three-dimensional global ocean general circulation model including biogeochemistry (MPIOM-HAMOCC; MPIOM - Max Planck Institute Ocean Model, HAMOCC - HAMburg Ocean Carbon Cycle model) by carrying out a series of sensitivity experiments. The simulated fields are compared with a newly available global data set. Simulated distribution patterns and emissions of CH3I differ largely between the two production pathways. The evaluation of our model results against observations shows that, on the global scale, observed surface concentrations of CH3I are best explained by the photochemical production pathway. Our results further emphasize that correlations between CH3I and abiotic or biotic factors do not necessarily provide meaningful insights concerning the source of origin. Overall, we find a net global annual CH3I air-sea flux that ranges between 70 and 260 Gg/yr. On the global scale, the ocean acts as a net source of methyl iodide for the atmosphere, though in some regions in boreal winter, fluxes are in the opposite direction (from the atmosphere to the ocean).
Abstract:
State-of-the-art process-based models have been shown to be applicable to the simulation and prediction of coastal morphodynamics. On annual to decadal temporal scales, however, these models may show limitations in reproducing complex natural morphological evolution patterns, such as the movement of bars and tidal channels, e.g. the observed decadal migration of the Medem Channel in the Elbe Estuary, German Bight. Here a morphodynamic model is shown to simulate the hydrodynamics and sediment budgets of the domain to some extent, but it fails to adequately reproduce the pronounced channel migration, owing to the insufficient implementation of bank erosion processes. In order to allow for long-term simulations of the domain, a nudging method has been introduced to update the model-predicted bathymetries with observations. The model-predicted bathymetry is nudged towards true states in annual time steps. A sensitivity analysis of the user-defined correlation length scale, which defines the background error covariance matrix during the nudging procedure, suggests that the optimal error correlation length is similar to the grid cell size, here 80-90 m. Additionally, spatially heterogeneous correlation lengths produce more realistic channel depths than spatially homogeneous correlation lengths do. Consecutive application of the nudging method compensates for the (stand-alone) model prediction errors and corrects the channel migration pattern, with a Brier skill score of 0.78. The nudging method proposed in this study serves as an analytical approach to update model predictions towards a predefined 'true' state, for the spatiotemporal interpolation of incomplete morphological data in long-term simulations.
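A schematic form of such a nudging update (the notation and the Gaussian correlation model are assumptions for illustration; the study's exact formulation is not given in the abstract): the forecast bathymetry $h^{f}_{i}$ at grid cell $i$ is corrected towards the observations $h^{obs}_{j}$ by

\[
h^{a}_{i} = h^{f}_{i} + \sum_{j} W_{ij}\left(h^{obs}_{j} - h^{f}_{j}\right),
\qquad
W_{ij} \propto \exp\!\left(-\frac{d_{ij}^{2}}{2L^{2}}\right),
\]

where $d_{ij}$ is the distance between cells and $L$ is the correlation length of the background error covariance; the abstract reports that an $L$ comparable to the 80-90 m grid spacing, and varied in space, performs best.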
Abstract:
Bathymetry based on data recorded during cruise MSM34-2 between 27.12.2013 and 18.01.2014 in the Black Sea. The main objective of this cruise was the mapping and imaging of the gas hydrate distribution and gas accumulations, as well as possible gas migration pathways. Objectives of the cruise: Gas hydrates have been the focus of scientific and economic interest for the past 15-20 years, mainly because the amount of carbon stored in gas hydrates is much greater than in other carbon reservoirs. Several countries including Japan, Korea and India have launched vast research programmes dedicated to the exploration for gas hydrate resources and ultimately the exploitation of the gas hydrates for methane. The German SUGAR project, which is financed by the Ministry of Education and Research (BMBF) and the Ministry of Economics (BMWi), aims at developing technology to exploit gas hydrate resources by injecting CO2 and storing it in the hydrates in place of methane. This approach includes techniques to locate and quantify hydrate reservoirs, drill into the reservoir, extract methane from the hydrates by replacing it with CO2, and monitor the CO2-hydrate reservoir thus formed. Numerical modeling has shown that any exploitation of the gas hydrates can only be successful if sufficient hydrate resources are present within permeable reservoirs such as sandy or gravelly deposits. Since the ultimate goal of the SUGAR project is a field test of the technology developed within the project, knowledge of a suitable test site is crucial. Within European waters, only the Norwegian margin and the Danube deep-sea fan show clear geophysical evidence for large gas hydrate accumulations, but only the Danube deep-sea fan most likely contains gas hydrates within sandy deposits. The main objective of cruise MSM34 was therefore to locate and characterise suitable gas hydrate deposits on the Danube deep-sea fan.