951 results for vector quantization based Gaussian modeling


Relevance: 100.00%

Abstract:

Albian turbidites and intercalated shales were cored from ~1145 to 1700 meters below seafloor at Site 1276 in the Newfoundland Basin. Strata at this level dip ~2.5° seaward (toward an azimuth of ~130°) based on seismic profiles. In contrast, beds dip an average of ~10° in the cores. This higher apparent dip is the sum of the ~2.5° seaward dip and a measured hole deviation of 7.43°, which must be essentially in the same seaward direction. Using the maximum dip direction in the cores as a reference direction, paleocurrents were measured from 11 current-ripple foresets and 11 vector means of grain fabric in planar-laminated sandstones. Five of the planar-laminated sandstone samples have a grain imbrication of 8°, permitting specification of a unique flow direction rather than just the line-of-motion of the current. Both ripples and grain fabric point to unconfined flow toward the north-northeast. There is considerable spread in the data, so that some paleoflow indicators point toward the northwest whereas others point toward the southeast. Nevertheless, the overall pattern of paleoflow suggests a source for the turbidity currents on the southeastern Grand Banks, likely from the long-emergent Avalon Uplift in that area. On average, turbidity currents apparently flowed axially in the young Albian rift, toward the north. This is opposite to what might be expected for a northward-propagating rift and a young ocean opening in a zipper-like fashion from south to north.
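
As a small illustration of the vector-mean step mentioned above, the sketch below computes the circular (vector) mean of a set of paleocurrent azimuths; the azimuth values and the function name are hypothetical, not data from the study.

```python
import math

def vector_mean_azimuth(azimuths_deg):
    """Circular (vector) mean of azimuths given in degrees clockwise from north."""
    x = sum(math.sin(math.radians(a)) for a in azimuths_deg)  # east component
    y = sum(math.cos(math.radians(a)) for a in azimuths_deg)  # north component
    mean = math.degrees(math.atan2(x, y)) % 360.0
    # Mean resultant length R in [0, 1]; values near 1 mean tightly clustered flow
    r = math.hypot(x, y) / len(azimuths_deg)
    return mean, r

# Hypothetical azimuths loosely clustered toward the north-northeast
print(vector_mean_azimuth([20, 35, 10, 350, 55, 15]))
```

A mean resultant length near 1 would indicate tightly clustered flow directions, while the "considerable spread" described above corresponds to a lower value.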

Relevance: 100.00%

Abstract:

Human SULT1A1 is primarily responsible for the sulfonation of xenobiotics, including the activation of promutagens, and it has been implicated in several forms of cancer. Human SULT1A3 has been shown to be the major sulfotransferase that sulfonates dopamine. These two enzymes share 93% amino acid sequence identity and have distinct but overlapping substrate preferences. The resolution of the crystal structures of these two enzymes has enabled us to elucidate the mechanisms controlling their substrate preferences and inhibition. The presence of two p-nitrophenol (pNP) molecules in the crystal structure of SULT1A1 was postulated to explain cooperativity at low substrate concentrations and inhibition at high substrate concentrations, respectively. In SULT1A1, substrate inhibition occurs with pNP as the substrate but not with dopamine. For SULT1A3, substrate inhibition is found with dopamine but not with pNP. We investigated how substrate inhibition occurs in these two enzymes using molecular modeling, site-directed mutagenesis, and kinetic analysis. The results show that residue Phe-247 of SULT1A1, which interacts with both p-nitrophenol molecules in the active site, is important for substrate inhibition. Mutation of phenylalanine to leucine at this position in SULT1A1 results in substrate inhibition by dopamine. We also propose, based on modeling and kinetic studies, that substrate inhibition by dopamine in SULT1A3 is caused by the binding of two dopamine molecules in the active site.
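
For readers unfamiliar with substrate inhibition, the sketch below evaluates the standard uncompetitive substrate-inhibition rate law, v = Vmax[S]/(Km + [S] + [S]^2/Ki), in which the [S]^2/Ki term captures a second substrate molecule binding unproductively; the parameter values are illustrative, not the paper's fitted constants.

```python
def rate(s, vmax, km, ki):
    """Michaelis-Menten rate with substrate inhibition:
    v = Vmax*[S] / (Km + [S] + [S]^2/Ki).
    The [S]^2/Ki term models a second, inhibitory substrate
    molecule binding in the active site."""
    return vmax * s / (km + s + s * s / ki)

# Illustrative parameters (not from the paper): the rate rises, peaks
# near sqrt(Km*Ki), then falls as [S] grows.
for s in (0.1, 1.0, 10.0, 100.0, 1000.0):
    print(s, round(rate(s, vmax=1.0, km=5.0, ki=50.0), 3))
```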

Relevance: 100.00%

Abstract:

For determining functional dependencies between two proteins, both represented as 3D structures, a necessary condition is that they share one or more matching structural regions, called patches. As protein 3D structures are large, complex, and constantly evolving, it is computationally expensive and very time-consuming to identify the possible locations and sizes of patches for a given protein against a large protein database. In this paper, we address a vector-space-based representation for protein structures, in which a patch is formed by the vectors within a region. Based on our previous work, a compact representation of a patch, named a patch signature, is applied here. A similarity measure for two patches is then derived from their signatures. To achieve fast patch matching in large protein databases, a match-and-expand strategy is proposed. Given a query patch, a set of small k-sized matching patches, called candidate patches, is generated in the match stage. The candidate patches are further filtered by enlarging k in the expand stage. Our extensive experimental results demonstrate encouraging performance on this biologically critical but previously computationally prohibitive problem.
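
The following is a minimal sketch of the match-and-expand idea under stated assumptions: patches are lists of vectors, and a toy mean-and-spread signature stands in for the paper's patch signature.

```python
import numpy as np

def signature(patch_vectors):
    """Toy patch signature (hypothetical stand-in for the paper's
    signature): mean and spread of the vectors forming the patch."""
    v = np.asarray(patch_vectors, dtype=float)
    return np.concatenate([v.mean(axis=0), v.std(axis=0)])

def match_and_expand(query, db_patches, k0=4, tol=0.5):
    """Match stage: compare signatures of small k0-sized sub-patches to
    cheaply collect candidate patches. Expand stage: re-compare the
    survivors at growing sizes, pruning mismatches as k increases."""
    def close(p, k):
        return np.linalg.norm(signature(p[:k]) - signature(query[:k])) < tol
    cands = [p for p in db_patches if len(p) >= k0 and close(p, k0)]
    k = k0
    while cands and k < len(query):
        k = min(2 * k, len(query))
        cands = [p for p in cands if len(p) >= k and close(p, k)]
    return cands
```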

Relevance: 100.00%

Abstract:

Text classification is essential for narrowing down the number of documents relevant to a particular topic for further perusal, especially when searching through large biomedical databases. Protein-protein interactions are an example of such a topic, with databases devoted specifically to them. This paper proposes a semi-supervised learning algorithm via local learning with class priors (LL-CP) for biomedical text classification, in which unlabeled data points are classified in a vector space based on their proximity to labeled nodes. The algorithm has been evaluated on a corpus of biomedical documents to identify abstracts containing information about protein-protein interactions, with promising results. Experimental results show that LL-CP outperforms traditional semi-supervised learning algorithms such as SVM, and it also performs better than local learning without class priors.
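
A rough sketch of the flavor of such an algorithm is given below: unlabeled points are scored by nearby labeled neighbors in the vector space, with local votes reweighted by global class priors. This is an illustrative simplification, not the exact LL-CP algorithm.

```python
import numpy as np

def ll_cp_predict(X_lab, y_lab, X_unlab, priors, k=5):
    """Toy sketch in the spirit of local learning with class priors:
    each unlabeled point is scored by its k nearest labeled neighbors
    in the vector space, and the local distance-weighted votes are
    reweighted by the global class priors."""
    X_lab, X_unlab = np.asarray(X_lab, float), np.asarray(X_unlab, float)
    classes = sorted(set(y_lab))
    preds = []
    for x in X_unlab:
        d = np.linalg.norm(X_lab - x, axis=1)
        nn = np.argsort(d)[:k]
        scores = {c: priors[c] * sum(1.0 / (1e-9 + d[i])
                                     for i in nn if y_lab[i] == c)
                  for c in classes}
        preds.append(max(scores, key=scores.get))
    return preds
```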

Relevance: 100.00%

Abstract:

MOTIVATION: G protein-coupled receptors (GPCRs) play an important role in many physiological systems by transducing an extracellular signal into an intracellular response. Over 50% of all marketed drugs are targeted towards a GPCR. There is considerable interest in developing an algorithm that could effectively predict the function of a GPCR from its primary sequence. Such an algorithm is useful not only in identifying novel GPCR sequences but in characterizing the interrelationships between known GPCRs. RESULTS: An alignment-free approach to GPCR classification has been developed using techniques drawn from data mining and proteochemometrics. A dataset of over 8000 sequences was constructed to train the algorithm. This represents one of the largest GPCR datasets currently available. A predictive algorithm was developed based upon the simplest reasonable numerical representation of the protein's physicochemical properties. A selective top-down approach was developed, which used a hierarchical classifier to assign sequences to subdivisions within the GPCR hierarchy. The predictive performance of the algorithm was assessed against several standard data mining classifiers and further validated against Support Vector Machine-based GPCR prediction servers. The selective top-down approach achieves significantly higher accuracy than standard data mining methods in almost all cases.
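
A minimal sketch of a selective top-down hierarchical classifier is shown below, assuming feature vectors derived from physicochemical properties; the random-forest base learner and the class-hierarchy labels are placeholders, not the paper's actual choices.

```python
from sklearn.ensemble import RandomForestClassifier

class TopDownClassifier:
    """Minimal top-down hierarchical classifier in the spirit of the
    paper (random forests stand in for whatever base classifier is
    used at each node): a separate classifier per internal node routes
    each sequence's feature vector from the root toward a leaf."""

    def __init__(self, hierarchy):
        self.hierarchy = hierarchy   # {internal node: [child labels]}
        self.models = {}

    def fit(self, X, paths, node="root"):
        # paths: per-sample label paths below `node`, e.g. ["ClassA", "Amine"]
        self.models[node] = RandomForestClassifier().fit(X, [p[0] for p in paths])
        for child in self.hierarchy.get(node, []):
            idx = [i for i, p in enumerate(paths) if p[0] == child and len(p) > 1]
            if idx and child in self.hierarchy:
                self.fit([X[i] for i in idx], [paths[i][1:] for i in idx], child)

    def predict(self, x, node="root"):
        path = []
        while node in self.models:
            node = self.models[node].predict([x])[0]
            path.append(node)
        return path
```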

Relevance: 100.00%

Abstract:

The binding between antigenic peptides (epitopes) and the MHC molecule is a key step in the cellular immune response. Accurate in silico prediction of epitope-MHC binding affinity can greatly expedite epitope screening by reducing costs and experimental effort. Recently, we demonstrated the appealing performance of SVRMHC, an SVR-based quantitative modeling method for peptide-MHC interactions, when applied to three mouse class I MHC molecules. Subsequently, we have greatly extended the construction of SVRMHC models and have established such models for more than 40 class I and class II MHC molecules. Here we present the SVRMHC web server for predicting peptide-MHC binding affinities using these models. Benchmarked percentile scores are provided for all predictions. The larger number of SVRMHC models available allowed for an updated evaluation of the performance of the SVRMHC method compared to other well-known linear modeling methods. SVRMHC is an accurate and easy-to-use prediction server for epitope-MHC binding with significant coverage of MHC molecules. We believe it will prove to be a valuable resource for T cell epitope researchers.
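
The following sketch shows the general shape of SVR-based affinity prediction, assuming a one-hot peptide encoding; the peptides, affinity values, and encoding are illustrative and do not reproduce the SVRMHC models.

```python
import numpy as np
from sklearn.svm import SVR

AA = "ACDEFGHIKLMNPQRSTVWY"

def encode(peptide):
    """Sparse one-hot encoding of a 9-mer peptide (a common choice for
    class I epitopes; the server's actual encoding may differ)."""
    x = np.zeros(len(peptide) * len(AA))
    for i, aa in enumerate(peptide):
        x[i * len(AA) + AA.index(aa)] = 1.0
    return x

# Hypothetical training data: 9-mer peptides with normalized affinities
peptides = ["SIINFEKLV", "GILGFVFTL", "LLFGYPVYV", "NLVPMVATV"]
affinities = [0.2, 0.8, 0.6, 0.9]

model = SVR(kernel="rbf").fit([encode(p) for p in peptides], affinities)
print(model.predict([encode("SIINFEKLV")]))
```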

Relevance: 100.00%

Abstract:

Standards for building diagnostic systems in medicine, based on modeling an expert's course of action in the form of fuzzy decision trees and taking into account criteria of credibility and usefulness, are suggested. Fragments of applied trees for diagnosing infectious and urological diseases are considered as well. The possibilities of using modern decision-making theory in the creation of artificial intelligence systems are discussed.
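
As a loose illustration of a fuzzy diagnostic rule weighted by a credibility factor, consider the sketch below; the membership thresholds, variables, and credibility value are invented for illustration.

```python
def trapezoid(x, a, b, c, d):
    """Trapezoidal fuzzy membership function."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

# Hypothetical rule from a fuzzy diagnostic tree, weighted by an
# expert-assigned credibility factor:
# IF temperature IS high AND leukocytes ARE elevated THEN infection (credibility 0.8)
def rule_fires(temp_c, wbc_10e9_per_l, credibility=0.8):
    high_temp = trapezoid(temp_c, 37.0, 38.0, 42.0, 43.0)
    high_wbc = trapezoid(wbc_10e9_per_l, 9.0, 11.0, 30.0, 40.0)
    return credibility * min(high_temp, high_wbc)  # min = fuzzy AND

print(rule_fires(38.5, 12.0))
```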

Relevance: 100.00%

Abstract:

This work is supported by the Hungarian Scientific Research Fund (OTKA), grant T042706.

Relevance: 100.00%

Abstract:

Assessing the frequency and extent of mass movement at continental margins is crucial to evaluate risks for offshore constructions and coastal areas. A multidisciplinary approach including geophysical, sedimentological, geotechnical, and geochemical methods was applied to investigate multistage mass transport deposits (MTDs) off Uruguay, on top of which no surficial hemipelagic drape was detected based on echosounder data. Non-steady-state pore water conditions are evidenced by a distinct gradient change in the sulfate (SO4^2-) profile at 2.8 m depth. A sharp sedimentological contact at 2.43 m coincides with an abrupt downward increase in shear strength from approx. 10 to >20 kPa. This boundary is interpreted as a paleosurface (and top of an older MTD) that has recently been covered by a sediment package during a younger landslide event. This youngest MTD supposedly originated from an upslope position and carried its initial pore water signature downward. The kink in the SO4^2- profile approx. 35 cm below the sedimentological and geotechnical contact indicates that bioirrigation affected the paleosurface before deposition of the youngest MTD. Based on modeling of the diffusive re-equilibration of SO4^2-, the age of the most recent MTD is estimated to be <30 years. The mass movement was possibly related to an earthquake in 1988 (approx. 70 km southwest of the core location). Probabilistic slope stability back analysis of general landslide structures in the study area reveals that slope failure initiation requires additional ground accelerations. Therefore, we consider the earthquake a reasonable trigger if additional weakening processes (e.g., erosion by previous retrogressive failure events or excess pore pressures) preconditioned the slope for failure. Our study reveals the necessity of multidisciplinary approaches to accurately recognize and date recent slope failures in complex settings such as the investigated area.
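
As an order-of-magnitude illustration of diffusive re-equilibration dating (not the paper's transport model), a diffusive front spreads roughly as L ≈ sqrt(2Dt), so the time for the kink to develop ~35 cm below the contact can be bounded as t ≈ L²/(2D); the diffusivity below is an assumed value.

```python
# Order-of-magnitude check (illustrative, not the paper's model):
# a diffusive front spreads roughly L ~ sqrt(2*D*t), so the time needed
# for the SO4^2- kink to migrate ~0.35 m below the contact is t ~ L^2/(2*D).
SECONDS_PER_YEAR = 3.156e7
D = 5e-10   # m^2/s, assumed effective sulfate diffusivity in the sediment
L = 0.35    # m, offset of the SO4^2- kink below the sedimentological contact
t = L**2 / (2 * D)
print(f"~{t / SECONDS_PER_YEAR:.0f} years")   # a few years, consistent with <30 yr
```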

Relevance: 100.00%

Abstract:

A certain type of bacterial inclusion, known as a bacterial microcompartment, was recently identified and imaged through cryo-electron tomography. A 3D object reconstructed from single-axis, limited-angle tilt-series cryo-electron tomography contains missing regions, a problem known as the missing wedge problem. Because of the missing regions in the reconstructed images, analyzing their 3D structures is challenging. Existing methods overcome this problem by aligning and averaging several similarly shaped objects. These schemes work well if the objects are symmetric and several objects of nearly similar shape and size are available. Since the bacterial inclusions studied here are not symmetric, are deformed, and show a wide range of shapes and sizes, the existing approaches are not appropriate. This research develops new statistical methods for analyzing geometric properties, such as volume, symmetry, aspect ratio, and polyhedral structure, of these bacterial inclusions in the presence of missing data. These methods work with deformed, non-symmetric objects of varied shape and do not require multiple objects to handle the missing wedge problem. The developed methods and contributions include: (a) an improved method for manual image segmentation, (b) a new approach to 'complete' the segmented and reconstructed incomplete 3D images, (c) a polyhedral structural distance model to predict the polyhedral shapes of these microstructures, (d) a new shape descriptor for polyhedral shapes, named the polyhedron profile statistic, and (e) Bayes, linear discriminant analysis, and support vector machine classifiers for supervised classification of incomplete polyhedral shapes. Finally, the predicted 3D shapes of these bacterial microstructures belong to the Johnson solids family, and these shapes, along with their other geometric properties, are important for a better understanding of their chemical and biological characteristics.
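
The supervised classification step (e) might look roughly like the sketch below, with random placeholder vectors standing in for real shape descriptors such as volume, aspect ratio, and polyhedron-profile values.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Hypothetical descriptor vectors (volume, aspect ratio, symmetry score,
# and two polyhedron-profile values) for inclusions of three known shape
# classes; random placeholders stand in for real measurements.
rng = np.random.default_rng(0)
centers = rng.normal(size=(3, 5))
y_train = np.repeat([0, 1, 2], 20)                 # three candidate Johnson solids
X_train = centers[y_train] + 0.3 * rng.normal(size=(60, 5))

svm = SVC(kernel="rbf").fit(X_train, y_train)
lda = LinearDiscriminantAnalysis().fit(X_train, y_train)

x_new = centers[1] + 0.3 * rng.normal(size=5)      # an unlabeled inclusion
print(svm.predict([x_new]), lda.predict([x_new]))
```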

Relevance: 100.00%

Abstract:

Over 150 million cubic meters of sand-sized sediment has disappeared from the central region of the San Francisco Bay Coastal System during the last half century. This enormous loss may reflect numerous anthropogenic influences, such as watershed damming, bay-fill development, aggregate mining, and dredging. The reduction in Bay sediment also appears to be linked to a reduction in sediment supply and recent widespread erosion of adjacent beaches, wetlands, and submarine environments. A unique, multi-faceted provenance study was performed to definitively establish the primary sources, sinks, and transport pathways of beach-sized sand in the region, thereby identifying the activities and processes that directly limit supply to the outer coast. This integrative program is based on comprehensive surficial sediment sampling of the San Francisco Bay Coastal System, including the seabed, Bay floor, area beaches, adjacent rock units, and major drainages. Analyses of sample morphometrics and biological composition (e.g., Foraminifera) were then integrated with a suite of tracers, including 87Sr/86Sr and 143Nd/144Nd isotopes, rare earth elements, semi-quantitative X-ray diffraction mineralogy, and heavy minerals, and with process-based numerical modeling, in situ current measurements, and bedform asymmetry to robustly determine the provenance of beach-sized sand in the region.

Relevance: 100.00%

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevance: 100.00%

Abstract:

Electrical submersible pumping (ESP) is an artificial lift method for oil wells employed in both onshore and offshore fields. The economic return of petroleum production from a well depends on the oil flow rate and on the availability of the lifting equipment: the fewer the failures, the lower the lost revenue and repair costs. The frequency with which failures occur depends on the operating conditions to which the pumps are subjected. In high-productivity offshore wells, monitoring is done by operators with engineering support 24 hours a day, which is not economically viable for onshore fields. In this context, the automation of onshore wells has clear economic advantages. This work proposes a system capable of automatically controlling the operation of electrical submersible pumps installed in oil wells by adjusting the electric motor speed based on signals provided by sensors installed on the surface and downhole, keeping the pump operating within the recommended range, as close as possible to the well's potential. Techniques are developed to estimate unmeasured variables, enabling the automation of wells that do not have all the required sensors. The automatic adjustment, performed by an algorithm running on a programmable logic controller, maintains flow and submergence within acceptable limits, avoiding undesirable operating conditions such as gas interference and high motor temperature without resorting to stopping the motor, which would reduce its useful life. The control strategy, based on modeling of the physical phenomena and on operational experience reported in the literature, is implemented as a rule-based fuzzy controller, and all generated information can be followed through a supervisory system.
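
A toy version of such a rule-based fuzzy speed adjustment is sketched below; the input variables, membership thresholds, and output gains are invented for illustration and are not taken from the thesis.

```python
def esp_speed_adjustment(intake_pressure_kpa, motor_temp_c):
    """Toy rule-based fuzzy adjustment of ESP motor speed (illustrative;
    the thesis's actual rule base and variables are not reproduced here).
    Returns a speed correction in Hz for the variable-speed drive."""
    # Fuzzy memberships for "low submergence" (low intake pressure)
    # and "hot motor", as simple ramps between assumed thresholds.
    low_sub = min(1.0, max(0.0, (800.0 - intake_pressure_kpa) / 400.0))
    hot = min(1.0, max(0.0, (motor_temp_c - 90.0) / 30.0))
    slow_down = max(low_sub, hot)          # fuzzy OR: either condition slows the pump
    speed_up = 1.0 - slow_down             # otherwise move toward the well's potential
    return 2.0 * speed_up - 3.0 * slow_down   # defuzzified correction, Hz

print(esp_speed_adjustment(intake_pressure_kpa=600.0, motor_temp_c=95.0))
```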

Relevance: 100.00%

Abstract:

Global climate change is predicted to affect the frequency and severity of flood events. In this study, output from General Circulation Models (GCMs) for a range of possible future climate scenarios was used to force hydrologic models of four case-study watersheds built using the Soil and Water Assessment Tool (SWAT). GCM output was applied with either the "delta change" method or a bias correction. Potential changes in flood risk are assessed based on the modeling results and possible relationships to watershed characteristics. Differences in model outputs when using the two methods of adjusting GCM output are also compared. Preliminary results indicate that watersheds exhibiting higher proportions of runoff in streamflow are more vulnerable to changes in flood risk. The delta change method appears to be more useful when simulating extreme events, as it preserves daily climate variability better than bias-corrected GCM output does.
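
As a concrete illustration of the delta change method, the sketch below perturbs an observed daily precipitation series with monthly GCM change factors; all inputs are hypothetical.

```python
import numpy as np

def delta_change(observed_daily, gcm_baseline_monthly, gcm_future_monthly, months):
    """Delta change sketch: perturb the observed daily series with
    monthly GCM change factors, preserving observed daily variability.
    A multiplicative factor is typical for precipitation; temperature
    would usually use an additive delta instead."""
    factors = np.asarray(gcm_future_monthly) / np.asarray(gcm_baseline_monthly)
    return np.asarray(observed_daily) * factors[np.asarray(months) - 1]

# Hypothetical inputs: 4 days of observed precipitation (mm), month index per day
obs = [5.0, 0.0, 12.0, 3.0]
months = [1, 1, 2, 2]
print(delta_change(obs, gcm_baseline_monthly=[100.0] * 12,
                   gcm_future_monthly=[110.0] * 12, months=months))
```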

Relevance: 60.00%

Abstract:

We present a new approach for corpus-based speech enhancement that significantly improves over a method published by Xiao and Nickel in 2010. Corpus-based enhancement systems do not merely filter an incoming noisy signal, but resynthesize its speech content via an inventory of pre-recorded clean signals. The goal of the procedure is to perceptually improve the sound of speech signals in background noise. The proposed new method modifies Xiao and Nickel's method in four significant ways. Firstly, it employs a Gaussian mixture model (GMM) instead of a vector quantizer in the phoneme recognition front-end. Secondly, the state decoding of the recognition stage is supported with an uncertainty modeling technique. With the GMM and the uncertainty modeling it is possible to eliminate the need for noise-dependent system training. Thirdly, the post-processing of the original method via sinusoidal modeling is replaced with a powerful cepstral smoothing operation. And lastly, due to the improvements of these modifications, it is possible to extend the operational bandwidth of the procedure from 4 kHz to 8 kHz. The performance of the proposed method was evaluated across different noise types and different signal-to-noise ratios. The new method was able to significantly outperform traditional methods, including the one by Xiao and Nickel, in terms of PESQ scores and other objective quality measures. Results of subjective CMOS tests over a smaller set of test samples support our claims.
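
To illustrate the difference between a vector quantizer and a GMM in a recognition front-end, the sketch below fits a GMM to stand-in feature frames and contrasts hard (VQ-like) assignments with the soft posteriors a GMM provides; the data and model size are placeholders.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Minimal sketch of swapping a vector quantizer for a GMM in a
# recognition front-end: instead of a hard nearest-codeword index, each
# feature frame gets soft posterior probabilities over Gaussian components.
# Real features would be e.g. cepstral vectors; random data stands in here.
rng = np.random.default_rng(0)
train_frames = rng.normal(size=(500, 13))      # stand-in for clean-speech cepstra

gmm = GaussianMixture(n_components=8, covariance_type="diag").fit(train_frames)

test_frames = rng.normal(size=(3, 13))
hard = gmm.predict(test_frames)                 # VQ-like hard assignment
soft = gmm.predict_proba(test_frames)           # soft posteriors used downstream
print(hard, soft.round(2))
```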