816 results for Data-Driven Behavior Modeling


Relevance: 100.00%

Abstract:

Understanding the effects of different types and quality of data on bioclimatic modeling predictions is vital to ascertaining the value of existing models, and to improving future models. Bioclimatic models were constructed using the CLIMEX program, using different data types – seasonal dynamics, geographic (overseas) distribution, and a combination of the two – for two biological control agents for the major weed Lantana camara L. in Australia. The models for one agent, Teleonemia scrupulosa Stål (Hemiptera: Tingidae), were based on a higher quality and quantity of data than the models for the other agent, Octotoma scabripennis Guérin-Méneville (Coleoptera: Chrysomelidae). Predictions of the geographic distribution for Australia showed that the T. scrupulosa models exhibited greater accuracy, with a progressive improvement from the model based on seasonal dynamics data, to the model based on overseas distribution, and finally to the model combining the two data types. In contrast, the O. scabripennis models were of low accuracy and showed no clear trends across the various model types. These case studies demonstrate the importance of high-quality data for developing models, and of supplementing distributional data with species seasonal dynamics data wherever possible. Seasonal dynamics data allow the modeller to focus on the species' response to climatic trends, while distributional data enable easier fitting of stress parameters by restricting the species envelope to the described distribution. It is apparent that CLIMEX models based on low-quality seasonal dynamics data, together with a small quantity of distributional data, are of minimal value in predicting the spatial extent of a species' distribution.

Relevance: 100.00%

Abstract:

The core aim of machine learning is to make a computer program learn from experience. Learning from data is usually defined as the task of learning regularities or patterns in data in order to extract useful information, or to learn the underlying concept. An important sub-field of machine learning is multi-view learning, where the task is to learn from multiple data sets, or views, describing the same underlying concept. A typical example of such a scenario would be to study a biological concept using several biological measurements, such as gene expression, protein expression and metabolic profiles, or to classify web pages based on their content and the contents of their hyperlinks. In this thesis, novel problem formulations and methods for multi-view learning are presented. The contributions include a linear data fusion approach for exploratory data analysis, a new measure to evaluate different kinds of representations for textual data, and an extension of multi-view learning to novel scenarios where the correspondence of samples in the different views or data sets is not known in advance. In order to infer the one-to-one correspondence of samples between two views, a novel concept of multi-view matching is proposed. The matching algorithm is completely data-driven and is demonstrated in several applications, such as matching metabolites between humans and mice, and matching sentences between documents in two languages.
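The matching step lends itself to a compact illustration. The sketch below is not the thesis's algorithm: it assumes a small seed set of samples whose cross-view correspondence is already known, uses it to fit a CCA projection into a shared space, and then recovers a one-to-one matching of the remaining samples with the Hungarian algorithm. All data, sizes, and names are synthetic placeholders.

```python
# Hypothetical sketch of data-driven multi-view matching: project two views
# into a shared space learned from a few known pairs, then solve a one-to-one
# assignment on cross-view distances. Not the thesis's actual method.
import numpy as np
from scipy.optimize import linear_sum_assignment
from scipy.spatial.distance import cdist
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(0)
n, n_seed, d1, d2 = 80, 20, 10, 8
latent = rng.normal(size=(n, 4))                  # shared underlying concept
view_x = latent @ rng.normal(size=(4, d1)) + 0.1 * rng.normal(size=(n, d1))
view_y = latent @ rng.normal(size=(4, d2)) + 0.1 * rng.normal(size=(n, d2))

# Learn a shared low-dimensional space from the seed pairs only.
cca = CCA(n_components=4).fit(view_x[:n_seed], view_y[:n_seed])

# The remaining view-y samples arrive in an unknown order.
perm = rng.permutation(np.arange(n_seed, n))
x_scores, y_scores = cca.transform(view_x[n_seed:], view_y[perm])

# Hungarian algorithm: minimum-cost one-to-one matching in the shared space.
cost = cdist(x_scores, y_scores)
_, col = linear_sum_assignment(cost)
accuracy = np.mean(perm[col] == np.arange(n_seed, n))
print("fraction of samples matched correctly:", round(float(accuracy), 3))
```

With real views, the shared projection and the assignment cost would of course need to be tailored to the data in question.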

Relevance: 100.00%

Abstract:

Deterministic models have been widely used to predict water quality in distribution systems, but their calibration requires extensive and accurate data sets for numerous parameters. In this study, alternative data-driven modeling approaches based on artificial neural networks (ANNs) were used to predict temporal variations of two important characteristics of water quality: chlorine residual and biomass concentrations. The authors considered three types of ANN algorithms. Of these, the Levenberg-Marquardt algorithm provided the best results in predicting residual chlorine and biomass with both error-free and "noisy" data. The ANN models developed here can generate water quality scenarios of piped systems in real time to help utilities determine weak points of low chlorine residual and high biomass concentration and select optimum remedial strategies.
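As a rough illustration of the data-driven alternative, the sketch below trains a small feed-forward network to predict residual chlorine one step ahead from lagged water-quality inputs. scikit-learn does not ship a Levenberg-Marquardt trainer, so L-BFGS stands in here, and all signals are synthetic placeholders rather than real distribution-system data.

```python
# Hedged ANN sketch: one-step-ahead prediction of residual chlorine from
# lagged inputs. L-BFGS is a stand-in for Levenberg-Marquardt; the
# temperature/flow/chlorine series are synthetic.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
t = np.arange(24 * 60)                             # 60 days of hourly records
temperature = 20 + 5 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 0.5, t.size)
flow = 1.0 + 0.3 * np.cos(2 * np.pi * t / 24) + rng.normal(0, 0.05, t.size)
chlorine = 0.8 - 0.02 * temperature + 0.1 * flow + rng.normal(0, 0.02, t.size)

# Features: current temperature and flow plus the previous hour's chlorine.
X = np.column_stack([temperature[1:], flow[1:], chlorine[:-1]])
y = chlorine[1:]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, shuffle=False)

ann = MLPRegressor(hidden_layer_sizes=(10,), solver="lbfgs",
                   max_iter=2000, random_state=0).fit(X_tr, y_tr)
print("held-out R^2:", round(ann.score(X_te, y_te), 3))
```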

Relevance: 100.00%

Abstract:

Bacteriorhodopsin has been the subject of intense study aimed at understanding its photochemical function. The recent atomic model proposed by Henderson and coworkers, based on electron cryo-microscopy, has helped explain many structural and functional aspects of bacteriorhodopsin. However, the accuracy of the side-chain positions is limited because the model is based on low-resolution data. In this study, we have energy-minimized this structure of bacteriorhodopsin and analyzed various types of interactions, such as intrahelical and interhelical hydrogen bonds and the retinal environment. To understand the photochemical action, it is necessary to obtain information on the structures adopted in the intermediate states. To this end, we generated candidate intermediate structures by computer modeling, taking certain experimental data into account. Various isomers of retinal with 13-cis and/or 15-cis conformations, and all possible staggered orientations of the Lys-216 side chain, were generated. The resulting structures were examined for the distance between the Lys-216 Schiff-base nitrogen and the carboxylate oxygen atoms of Asp-96, a residue known to reprotonate the Schiff base at later stages of the photocycle. Some of the structures were selected on the basis of suitable retinal orientation, and their stability was tested by energy minimization. The minimized structures were further analyzed for hydrogen-bond interactions and the retinal environment, and the results were compared with those of the minimized resting-state structure. The importance of functional groups in stabilizing the structure of bacteriorhodopsin and in participating dynamically during the photocycle is discussed.
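The Schiff-base-to-Asp-96 distance check described above can be scripted directly. The sketch below, using Biopython, assumes a locally stored coordinate file (the path is a placeholder) and the standard bacteriorhodopsin residue numbering and PDB atom names (NZ for the Lys-216 side-chain nitrogen, OD1/OD2 for the Asp-96 carboxylate oxygens).

```python
# Illustrative distance check between the Lys-216 Schiff-base nitrogen and the
# Asp-96 carboxylate oxygens; the PDB file path is a placeholder.
from Bio.PDB import PDBParser

structure = PDBParser(QUIET=True).get_structure("bR", "bacteriorhodopsin.pdb")

def find_atom(struct, resseq, atom_name):
    """Return the first atom with this name in a residue with this number."""
    for residue in struct.get_residues():
        if residue.id[1] == resseq and atom_name in residue:
            return residue[atom_name]
    return None

nz = find_atom(structure, 216, "NZ")               # Lys-216 Schiff-base nitrogen
for oxy in ("OD1", "OD2"):
    od = find_atom(structure, 96, oxy)             # Asp-96 carboxylate oxygen
    if nz is not None and od is not None:
        print(f"Lys216 NZ to Asp96 {oxy}: {nz - od:.2f} Å")  # Atom subtraction gives distance
```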

Relevance: 100.00%

Abstract:

The recent focus of flood frequency analysis (FFA) studies has been on developing methods to model joint distributions of variables such as peak flow, volume, and duration that characterize a flood event, as comprehensive knowledge of a flood event is often necessary in hydrological applications. A diffusion-process-based adaptive kernel (D-kernel) is suggested in this paper for this purpose. It is data driven, flexible and, unlike most kernel density estimators, always yields a bona fide probability density function. It overcomes shortcomings associated with the use of conventional kernel density estimators in FFA, such as the boundary leakage problem and the normal reference rule. The potential of the D-kernel is demonstrated by application to synthetic samples of various sizes drawn from known unimodal and bimodal populations, and to five typical peak flow records from different parts of the world. It is shown to be effective when compared to the conventional Gaussian kernel and the best of seven commonly used copulas (Gumbel-Hougaard, Frank, Clayton, Joe, Normal, Plackett, and Student's t) in estimating the joint distribution of peak flow characteristics and extrapolating beyond historical maxima. Selection of the optimum number of bins is found to be critical in modeling with the D-kernel.
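The D-kernel itself is not available in common libraries, so the sketch below falls back on the conventional Gaussian kernel (the baseline the paper compares against) to show what estimating a joint density of peak-flow characteristics looks like in practice; the flood data are synthetic placeholders.

```python
# Joint density of (peak flow, flood volume) with a conventional Gaussian KDE,
# shown only as a baseline stand-in for the paper's D-kernel; data are synthetic.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(2)
peak_flow = rng.lognormal(mean=5.0, sigma=0.4, size=200)                 # m^3/s
volume = 0.8 * peak_flow + rng.lognormal(mean=3.0, sigma=0.3, size=200)  # 10^6 m^3

joint_kde = gaussian_kde(np.vstack([peak_flow, volume]))

# Evaluate the joint density on a grid, e.g. for contouring or for estimating
# exceedance probabilities beyond the historical maxima.
q, v = np.meshgrid(np.linspace(peak_flow.min(), peak_flow.max(), 50),
                   np.linspace(volume.min(), volume.max(), 50))
density = joint_kde(np.vstack([q.ravel(), v.ravel()])).reshape(q.shape)
print("density grid:", density.shape, "peak value:", float(density.max()))
```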

Relevance: 100.00%

Abstract:

In this paper, we consider applying a derived knowledge base regarding the sensitivity and specificity of the damage to be detected by an SHM system as it is designed and qualified. Such efforts are necessary for developing an SHM system's capability to reliably classify probable damage types through a sequence of monitoring steps, i.e., damage-precursor identification, damage detection, and monitoring of damage progression. We consider the particular problem of design requirements for visual and ultrasonic NDE based SHM systems, where damage detection sensitivity and specificity data definitions for a class of structural components are established. Methodologies for creating SHM system specifications are discussed in detail. Examples illustrate how the physics of a damage detection scheme limits the achievable detection sensitivity and specificity, and how this information can be used in algorithms that combine different NDE schemes within an SHM system to enhance efficiency and effectiveness. Statistical and data-driven models to determine the sensitivity and probability of damage detection (POD) are demonstrated for a plate with a varying one-sided line crack, using optical and ultrasonic inspection techniques.
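As a small, hedged example of the kind of statistical POD model mentioned above, the sketch below fits a hit/miss logistic regression of detection outcome against log crack length (a common POD formulation) and reads off the crack length at 90% POD; the inspection outcomes are simulated stand-ins, not experimental optical or ultrasonic data.

```python
# Hit/miss POD sketch: logistic regression of detection outcome versus
# log crack length on simulated inspection trials.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
crack_mm = rng.uniform(0.2, 10.0, 300)
true_pod = 1 / (1 + np.exp(-(np.log(crack_mm) - np.log(2.0)) / 0.3))
hit = (rng.random(300) < true_pod).astype(int)      # simulated hit/miss outcomes

pod_model = LogisticRegression().fit(np.log(crack_mm).reshape(-1, 1), hit)

# Crack length at 90% probability of detection (the usual a_90 figure of merit).
b0, b1 = pod_model.intercept_[0], pod_model.coef_[0, 0]
a90 = np.exp((np.log(0.9 / 0.1) - b0) / b1)
print("estimated a90 (mm):", round(float(a90), 2))
```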

Relevance: 100.00%

Abstract:

Three separate topics, each stimulated by experiments, are treated theoretically in this dissertation: isotopic effects of ozone, electron transfer at interfaces, and intramolecular directional electron transfer in a supramolecular system.

The strange mass-independent isotope effect for the enrichment of ozone, which has been a puzzle in the literature for some 20 years, and the equally puzzling unconventional strong mass-dependent effect of individual reaction rate constants are studied as different aspects of a symmetry-driven behavior. A statistical (RRKM-based) theory with a hindered-rotor transition state is used. The individual rate constant ratios of recombination reactions at low pressures are calculated using the theory involving (1) small deviation from the statistical density of states for symmetric isotopomers, and (2) weak collisions for deactivation of the vibrationally excited ozone molecules. The weak collision and partitioning among exit channels play major roles in producing the large unconventional isotope effect in "unscrambled" systems. The enrichment studies reflect instead the non-statistical effect in "scrambled" systems. The theoretical results of low-pressure ozone enrichments and individual rate constant ratios obtained from these calculations are consistent with the corresponding experimental results. The isotopic exchange rate constant for the reaction ^(16)O + ^(18)O^(18)O → ^(16)O^(18)O + ^(18)O provides information on the nature of a variationally determined hindered-rotor transition state using experimental data at 130 K and 300 K. Pressure effects on the recombination rate constant, on the individual rate constant ratios and on the enrichments are also investigated. The theoretical results are consistent with the experimental data. The temperature dependence of the enrichment and rate constant ratios is also discussed, and experimental tests are suggested. The desirability of a more accurate potential energy surface for ozone in the transition state region is also noted.
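For orientation, the statistical treatment referred to above starts from the standard RRKM expression for the microcanonical rate constant, written here in its textbook form (i.e., before the small non-statistical corrections discussed in the text):

```latex
k(E) = \frac{\sigma \, W^{\ddagger}(E - E_0)}{h \, \rho(E)}
```

where W‡(E − E0) is the sum of states of the (hindered-rotor) transition state above the threshold energy E0, ρ(E) is the density of states of the energized molecule, σ is the reaction-path degeneracy, and h is Planck's constant.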

Electron transfer reactions at semiconductor/liquid interfaces are studied using a tight-binding model for the semiconductors. The slab method and a z-transform method are employed in obtaining the tight-binding electronic structures of semiconductors having surfaces. The maximum electron transfer rate constants at Si/viologen^(2-/+) and InP/Me_(2)Fc^(+/0) interfaces are computed using the tight-binding type calculations for the solid and the extended-Hückel method for the coupling to the redox agent at the interface. These electron transfer reactions are also studied using a free electron model for the semiconductor and the redox molecule, where Bardeen's method is adapted to calculate the coupling matrix element between the molecular and semiconductor electronic states. The calculated results for the maximum rate constant of electron transfer from the semiconductor bulk states are compared with the experimentally measured values of Lewis and coworkers, and are in reasonable agreement, without adjusting parameters. In the case of the InP/liquid interface, the unusual current versus applied potential behavior is additionally interpreted, in part, by the presence of surface states.

Photoinduced electron transfer reactions in small supramolecular systems, such as 4-aminonaphthalimide compounds, are interesting in that there are, in principle, two alternative pathways (directions) for the electron transfer. The electron transfer, however, is unidirectional, as deduced from pH-dependent fluorescence quenching studies on different compounds. The role of the electronic coupling matrix element and of the charges involved in protonation is considered to explain the directionality of the electron transfer and various other results. A related mechanism is proposed to interpret the fluorescence behavior of similar molecules as fluorescent sensors of metal ions.

Relevance: 100.00%

Abstract:

Modeling studies are performed to investigate the plasma and heat transfer characteristics of a low-power argon arcjet thruster. Computed temperature, velocity, static pressure, and Mach number distributions in the arcjet thruster under typical operating conditions are presented in this paper. The performance data obtained from the numerical modeling are broadly consistent with experimentally measured values.

Relevance: 100.00%

Abstract:

Transcriptional regulation has been studied intensively in recent decades. One important aspect of this regulation is the interaction between regulatory proteins, such as transcription factors (TF) and nucleosomes, and the genome. Different high-throughput techniques have been invented to map these interactions genome-wide, including ChIP-based methods (ChIP-chip, ChIP-seq, etc.), nuclease digestion methods (DNase-seq, MNase-seq, etc.), and others. However, a single experimental technique often only provides partial and noisy information about the whole picture of protein-DNA interactions. Therefore, the overarching goal of this dissertation is to provide computational developments for jointly modeling different experimental datasets to achieve a holistic inference on the protein-DNA interaction landscape.

We first present a computational framework that can incorporate the protein binding information in MNase-seq data into a thermodynamic model of protein-DNA interaction. We use a correlation-based objective function to model the MNase-seq data and a Markov chain Monte Carlo method to maximize the function. Our results show that the inferred protein-DNA interaction landscape is concordant with the MNase-seq data and provides a mechanistic explanation for the experimentally collected MNase-seq fragments. Our framework is flexible and can easily incorporate other data sources. To demonstrate this flexibility, we use prior distributions to integrate experimentally measured protein concentrations.
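The optimization loop described above can be sketched schematically. The toy code below maximizes a correlation objective between a made-up "occupancy" model and a simulated MNase-seq profile using a Metropolis-style Markov chain Monte Carlo search; the occupancy model, its two parameters, and the tempering factor are all illustrative placeholders, not the framework's actual thermodynamic model.

```python
# Schematic MCMC maximization of a correlation-based objective; the occupancy
# model and the data are toy placeholders.
import numpy as np

rng = np.random.default_rng(4)
n_pos = 500
observed = np.convolve(rng.random(n_pos), np.ones(20) / 20, mode="same")  # fake MNase-seq profile

def occupancy(params):
    """Toy stand-in for a thermodynamic protein-DNA occupancy model."""
    x = np.linspace(0, 1, n_pos)
    return 1 / (1 + np.exp(-(params[0] * np.sin(6 * x) + params[1] * np.cos(3 * x))))

def objective(params):
    return np.corrcoef(occupancy(params), observed)[0, 1]

params = np.array([1.0, 1.0])
current = objective(params)
best_params, best = params.copy(), current
for _ in range(5000):
    proposal = params + rng.normal(0, 0.1, size=2)
    cand = objective(proposal)
    # Metropolis acceptance on a tempered version of the correlation objective.
    if np.log(rng.random()) < 10.0 * (cand - current):
        params, current = proposal, cand
        if current > best:
            best_params, best = params.copy(), current
print("best correlation reached:", round(float(best), 3))
```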

We also study the ability of DNase-seq data to position nucleosomes. Traditionally, DNase-seq has only been widely used to identify DNase hypersensitive sites, which tend to be open chromatin regulatory regions devoid of nucleosomes. We reveal for the first time that DNase-seq datasets also contain substantial information about nucleosome translational positioning, and that existing DNase-seq data can be used to infer nucleosome positions with high accuracy. We develop a Bayes-factor-based nucleosome scoring method to position nucleosomes using DNase-seq data. Our approach utilizes several effective strategies to extract nucleosome positioning signals from the noisy DNase-seq data, including jointly modeling data points across the nucleosome body and explicitly modeling the quadratic and oscillatory DNase I digestion pattern on nucleosomes. We show that our DNase-seq-based nucleosome map is highly consistent with previous high-resolution maps. We also show that the oscillatory DNase I digestion pattern is useful in revealing the nucleosome rotational context around TF binding sites.
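The nucleosome scoring idea can be illustrated with a deliberately simplified version: compare the likelihood of the DNase I cut counts in a 147-bp window under a "nucleosome" rate profile (a quadratic envelope plus a roughly 10-bp oscillation, echoing the digestion pattern described above) against a flat background rate. With fixed point models the Bayes factor reduces to a likelihood ratio; the profiles and counts below are illustrative placeholders, not the dissertation's fitted model.

```python
# Simplified nucleosome-vs-background score from Poisson likelihoods of
# DNase I cut counts in a 147-bp window; profiles and counts are toys.
import numpy as np
from scipy.stats import poisson

win = 147
x = np.arange(win)
# Toy nucleosome cut-rate profile: quadratic envelope with a ~10-bp oscillation.
nuc_rate = 0.5 * (0.3 + 0.7 * ((x - 73) / 73) ** 2) * (1 + 0.4 * np.cos(2 * np.pi * x / 10.3))
bg_rate = np.full(win, nuc_rate.mean())            # flat background model

rng = np.random.default_rng(5)
cuts = rng.poisson(nuc_rate)                       # observed cut counts in the window

log_score = poisson.logpmf(cuts, nuc_rate).sum() - poisson.logpmf(cuts, bg_rate).sum()
print("log likelihood-ratio score (nucleosome vs background):", round(float(log_score), 2))
```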

Finally, we present a state-space model (SSM) for jointly modeling different kinds of genomic data to provide an accurate view of the protein-DNA interaction landscape. We also provide an efficient expectation-maximization algorithm to learn model parameters from data. We first show in simulation studies that the SSM can effectively recover underlying true protein binding configurations. We then apply the SSM to model real genomic data (both DNase-seq and MNase-seq data). Through incrementally increasing the types of genomic data in the SSM, we show that different data types can contribute complementary information for the inference of protein binding landscape and that the most accurate inference comes from modeling all available datasets.
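As a hedged stand-in for the state-space model, the sketch below treats each genomic position's (DNase-seq, MNase-seq) signal pair as an emission from a hidden occupancy state and fits a Gaussian hidden Markov model by EM with the hmmlearn package (an assumed dependency); the signals are simulated, and the three states are only loosely labelled "free", "TF-bound", and "nucleosomal".

```python
# EM-fitted latent-state stand-in for joint modeling of DNase-seq and
# MNase-seq signals; not the dissertation's SSM, and the data are simulated.
import numpy as np
from hmmlearn import hmm

rng = np.random.default_rng(6)
true_states = np.repeat([0, 1, 2, 0, 2, 1, 0], 300)       # free / TF-bound / nucleosomal
means = np.array([[0.2, 0.2], [1.5, 0.3], [0.4, 1.8]])    # (DNase, MNase) signal levels
signals = means[true_states] + rng.normal(0, 0.25, size=(true_states.size, 2))

model = hmm.GaussianHMM(n_components=3, covariance_type="diag",
                        n_iter=100, random_state=0)
model.fit(signals)                                         # Baum-Welch (EM)
decoded = model.predict(signals)                           # most likely state path
print("positions assigned to each inferred state:", np.bincount(decoded))
```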

This dissertation provides a foundation for future research by taking a step toward the genome-wide inference of protein-DNA interaction landscape through data integration.

Relevance: 100.00%

Abstract:

The detection of dense harmful algal blooms (HABs) by satellite remote sensing is usually based on analysis of chlorophyll-a as a proxy. However, this approach does not provide information about the potential harm of a bloom, nor can it identify the dominant species. The developed HAB risk classification method employs a fully automatic data-driven approach to identify key characteristics of water-leaving radiances and derived quantities, and to classify pixels into “harmful”, “non-harmful” and “no bloom” categories using Linear Discriminant Analysis (LDA). Discrimination accuracy is increased through the use of spectral ratios of water-leaving radiances, absorption and backscattering. To reduce the false alarm rate, data that cannot be reliably classified are automatically labelled as “unknown”. This method can be trained on different HAB species or extended to new sensors and then applied to generate independent HAB risk maps; these can be fused with other sensors to fill gaps or improve spatial or temporal resolution. The HAB discrimination technique has obtained accurate results on MODIS and MERIS data, correctly identifying 89% of Phaeocystis globosa HABs in the southern North Sea and 88% of Karenia mikimotoi blooms in the Western English Channel. A linear transformation of the ocean colour discriminants is used to estimate harmful cell counts, demonstrating greater accuracy than estimates based on chlorophyll-a; this will facilitate its integration into a HAB early warning system operating in the southern North Sea.
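The classification step can be sketched compactly. The example below is a hedged illustration rather than the trained operational model: it fits an LDA classifier on placeholder spectral-ratio features for three classes and labels pixels "unknown" when the maximum posterior probability falls below a threshold.

```python
# LDA sketch for HAB risk classes with an 'unknown' rejection option;
# features, class centres, and the posterior threshold are illustrative.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(7)
n_per_class = 200
# Stand-ins for ratios of water-leaving radiance, absorption and backscattering.
centres = np.array([[1.8, 0.9, 2.2], [1.1, 1.4, 1.0], [0.6, 0.5, 0.4]])
X = np.vstack([c + rng.normal(0, 0.3, size=(n_per_class, 3)) for c in centres])
y = np.repeat(["harmful", "non-harmful", "no bloom"], n_per_class)

lda = LinearDiscriminantAnalysis().fit(X, y)

def classify_pixels(features, min_posterior=0.8):
    """Label pixels; fall back to 'unknown' where no class is confident enough."""
    posterior = lda.predict_proba(features)
    labels = lda.classes_[posterior.argmax(axis=1)].astype(object)
    labels[posterior.max(axis=1) < min_posterior] = "unknown"
    return labels

print(classify_pixels(X[:5]))
```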

Relevance: 100.00%

Abstract:

Affiliation: Département de biochimie, Faculté de médecine, Université de Montréal

Relevance: 100.00%

Abstract:

The attached file was created with Scientific WorkPlace (LaTeX).

Relevance: 100.00%

Abstract:

The inspection of schools and teaching has a long tradition. On the basis of documents from the history of schooling in Hesse, it is shown that observation initially centred on the individual teacher, his moral conduct and, in particular, his handling of classroom discipline. Modern inspection systems, by contrast, use the instruments of the social sciences to observe, describe and evaluate school and classroom development processes in their full complexity, rather than carrying out isolated checks of performance.

Relevance: 100.00%

Abstract:

The theoretical and applied contributions of complexity to economics have taken so many directions and advanced so frenetically in recent decades that, as far as we know, no recent work compiles and analyzes them in an integrated way. The objective of this project is therefore to develop a state-of-the-art review of the different conceptual, theoretical, methodological, and technological applications of the complexity sciences in economics. It also aims to analyze recent trends in the study of the complexity of economic systems and the horizons that the complexity sciences offer for addressing the economic phenomena of the contemporary globalized world.

Relevance: 100.00%

Abstract:

This contribution introduces a new digital predistorter to compensate for the serious distortions caused by high power amplifiers (HPAs) with memory that exhibit output saturation characteristics. The proposed design is based on direct learning using a data-driven B-spline Wiener system modeling approach. The nonlinear HPA with memory is first identified based on the B-spline neural network model using the Gauss-Newton algorithm, which incorporates the efficient De Boor algorithm with both B-spline curve and first-derivative recursions. The estimated Wiener HPA model is then used to design the Hammerstein predistorter. In particular, the inverse of the amplitude distortion of the HPA's static nonlinearity can be calculated effectively using the Newton-Raphson formula based on the inverse of the De Boor algorithm. A major advantage of this approach is that both the Wiener HPA identification and the Hammerstein predistorter inversion can be carried out very efficiently and accurately. Simulation results are presented to demonstrate the effectiveness of this novel digital predistorter design.
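The inversion step at the heart of the predistorter design can be illustrated with a small sketch: fit a cubic B-spline to a toy saturating AM/AM curve and invert it for a desired output amplitude by Newton-Raphson, using the spline's value and first-derivative evaluations (De Boor recursions inside SciPy). The spline here is fit to a synthetic tanh characteristic, not an identified HPA, and all names are placeholders.

```python
# Newton-Raphson inversion of a B-spline static AM/AM nonlinearity; the curve
# below is a synthetic saturating characteristic, not an identified HPA.
import numpy as np
from scipy.interpolate import make_interp_spline

r_in = np.linspace(0.0, 1.0, 50)
r_out = np.tanh(2.5 * r_in) / np.tanh(2.5)           # output saturates towards 1
amam = make_interp_spline(r_in, r_out, k=3)          # cubic B-spline model of AM/AM
amam_deriv = amam.derivative()

def invert_amam(target, x0=0.5, tol=1e-9, max_iter=50):
    """Solve amam(x) = target for the predistorted input amplitude."""
    x = x0
    for _ in range(max_iter):
        step = float((amam(x) - target) / amam_deriv(x))
        x = float(np.clip(x - step, 0.0, 1.0))        # stay inside the spline's support
        if abs(step) < tol:
            break
    return x

x_pd = invert_amam(0.8)                               # desired output amplitude 0.8
print("predistorted amplitude:", round(x_pd, 4),
      "achieved output:", round(float(amam(x_pd)), 4))
```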