906 results for computational models


Relevance: 30.00%

Abstract:

When studying a biological regulatory network, it is usual to use Boolean network models. In these models, Boolean variables represent the behavior of each component of the biological system. Taking into account that the size of these state transition models grows exponentially with the number of components considered, it becomes important to have tools to minimize such models. In this paper, we relate bisimulations, which are relations used in the study of automata (general state transition models), with attractors, which are an important feature of biological Boolean models. Hence, we support the idea that bisimulations can be important tools in the study of the main features of Boolean network models. We also discuss the differences between using this approach and other well-known methodologies for studying this kind of system, and we illustrate it with some examples.
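As a minimal illustration of the state-transition view, under synchronous deterministic update every state has exactly one successor, so the attractors of a Boolean network are the cycles of its transition graph and can be found by following each state until it repeats. The three-component network and its update rules below are hypothetical, not taken from the paper:

```python
from itertools import product

# Toy 3-component Boolean network (hypothetical update rules):
# x0' = x1 AND x2,  x1' = NOT x0,  x2' = x0 OR x1
def step(state):
    x0, x1, x2 = state
    return (x1 and x2, not x0, x0 or x1)

def attractors():
    """Find all attractors by iterating each state until a cycle repeats."""
    found = set()
    for state in product([False, True], repeat=3):
        seen = []
        while state not in seen:
            seen.append(state)
            state = step(state)
        cycle = seen[seen.index(state):]        # the attractor reached
        i = cycle.index(min(cycle))             # canonical rotation
        found.add(tuple(cycle[i:] + cycle[:i]))
    return found

atts = attractors()
print(atts)   # this toy network has a single cyclic attractor
```

A bisimulation-based minimization would merge states that cannot be distinguished by future behavior; the brute-force enumeration above is only feasible because the state space here has 2^3 = 8 states.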

Relevance: 30.00%

Abstract:

Findings on the role that emotion plays in human behavior have transformed Artificial Intelligence computations. Modern research explores how to simulate more intelligent and flexible systems. Several studies focus on the role emotion plays in establishing values for alternative decisions and decision outcomes. For instance, Busemeyer et al. (2007) argued that emotional state affects the subjective value of alternative choices. However, emotional concepts in these theories are generally not defined formally, and it is difficult to describe in systematic detail how their processes work. As a result, structures and processes cannot be explicitly implemented. Some attempts have been made to incorporate emotion into larger computational systems that try to model how it affects human mental processes and behavior (Becker-Asano & Wachsmuth, 2008; Marinier, Laird & Lewis, 2009; Marsella & Gratch, 2009; Parkinson, 2009; Sander, Grandjean & Scherer, 2005). As we will see, some tutoring systems have explored this potential to inform user models. Likewise, dialogue systems, mixed-initiative planning systems, or systems that learn from observation could also benefit from such an approach (Dickinson, Brew & Meurers, 2013; Jurafsky & Martin, 2009). That is, considering emotion as interaction can be relevant in order to explain the dynamic role it plays in action and cognition (see Boehner et al., 2007).

Relevance: 30.00%

Abstract:

In Part 1 of this thesis, we propose that biochemical cooperativity is a fundamentally non-ideal process. We show quantal effects underlying biochemical cooperativity and highlight apparent ergodic breaking at small volumes. The apparent ergodic breaking manifests itself in a divergence of deterministic and stochastic models. We further predict that this divergence of deterministic and stochastic results is a failure of the deterministic methods rather than an issue of stochastic simulations.

Ergodic breaking at small volumes may allow these molecular complexes to function as switches to a greater degree than has previously been shown. We propose that this ergodic breaking is a phenomenon that the synapse might exploit to differentiate Ca2+ signaling that would lead to either the strengthening or weakening of a synapse. Techniques such as lattice-based statistics and rule-based modeling are tools that allow us to directly confront this non-ideality. A natural next step toward understanding the chemical physics that underlies these processes is to consider in silico methods, specifically atomistic simulation, that might augment our modeling efforts.
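The divergence between stochastic and deterministic treatments at small copy numbers can be illustrated with a minimal Gillespie simulation. The reversible dimerisation reaction and rate constants below are illustrative stand-ins, not the Ca2+ signaling systems studied in the thesis; the key point is that the discrete propensity n(n-1)/2 differs from the mass-action term of a deterministic model when n is small:

```python
import random

def gillespie(n_A, n_D, k_on, k_off, t_end, rng):
    """Minimal Gillespie SSA for reversible dimerisation A + A <-> D.
    Propensities use discrete combinatorics (n_A choose 2), which is where
    stochastic results diverge from mass-action ODEs at small copy numbers."""
    t = 0.0
    while t < t_end:
        a_on = k_on * n_A * (n_A - 1) / 2.0   # A + A -> D
        a_off = k_off * n_D                   # D -> A + A
        a_tot = a_on + a_off
        if a_tot == 0:
            break
        t += rng.expovariate(a_tot)           # time to next reaction
        if rng.random() < a_on / a_tot:       # pick a reaction
            n_A -= 2; n_D += 1
        else:
            n_A += 2; n_D -= 1
    return n_A, n_D

rng = random.Random(1)
final = [gillespie(10, 0, 1.0, 1.0, 50.0, rng) for _ in range(200)]
mean_D = sum(d for _, d in final) / len(final)
print(mean_D)   # ensemble mean dimer count at small volume
```

Mass is conserved exactly in every trajectory (n_A + 2 n_D stays constant), while the copy-number fluctuations around the mean are what a deterministic treatment discards.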

In the second part of this thesis, we use evolutionary algorithms to optimize in silico methods that might be used to describe biochemical processes at the subcellular and molecular levels. While we have applied evolutionary algorithms to several methods, this thesis will focus on the optimization of charge equilibration methods. Accurate charges are essential to understanding the electrostatic interactions that are involved in ligand binding, as frequently discussed in the first part of this thesis.
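A sketch of the kind of evolutionary loop used for parameter fitting; the toy quadratic objective below stands in for a real charge-equilibration error function, and all names, population settings, and reference values are illustrative assumptions:

```python
import random

def evolve(objective, dim, pop_size=20, generations=100, sigma=0.3, rng=None):
    """(mu + lambda)-style evolutionary loop: keep the best half of the
    population, refill with Gaussian-mutated copies of surviving parents.
    `objective` is minimised; for charge-equilibration fitting it would be
    the error between model and reference charges (any callable works)."""
    rng = rng or random.Random(0)
    pop = [[rng.uniform(-1, 1) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=objective)
        parents = pop[: pop_size // 2]        # elitist selection
        children = [[x + rng.gauss(0, sigma) for x in rng.choice(parents)]
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return min(pop, key=objective)

# Toy stand-in for a charge-fitting error: distance to "reference" parameters.
ref = [0.4, -0.2, 0.7]
err = lambda p: sum((a - b) ** 2 for a, b in zip(p, ref))
best = evolve(err, dim=3)
print(best, err(best))
```

Because the parents are carried over unchanged, the best solution found never degrades from one generation to the next, which is the property that makes such loops robust for noisy, non-differentiable objectives.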

Relevance: 30.00%

Abstract:

In this dissertation, we apply mathematical programming techniques (i.e., integer programming and polyhedral combinatorics) to develop exact approaches for influence maximization on social networks. We study four combinatorial optimization problems that deal with maximizing influence at minimum cost over a social network. To our knowledge, all previous work to date on influence maximization problems has focused on heuristics and approximation. We start with the following viral marketing problem, which has attracted a significant amount of interest in the computer science literature. Given a social network, find a target set of customers to seed with a product. These initial adopters then cause a cascade, with other people adopting the product due to the influence they receive from earlier adopters. The idea is to find the minimum cost that results in the entire network adopting the product. We first study a problem called the Weighted Target Set Selection (WTSS) problem. In the WTSS problem, the diffusion can take place over as many time periods as needed and a free product is given to the individuals in the target set. Restricting the diffusion to a single time period yields a problem called the Positive Influence Dominating Set (PIDS) problem. Next, incorporating partial incentives, we consider a problem called the Least Cost Influence Problem (LCIP). The fourth problem studied is the One Time Period Least Cost Influence Problem (1TPLCIP), which is identical to the LCIP except that the diffusion is restricted to a single time period. We apply a common research paradigm to each of these four problems. First, we work on special graphs: trees and cycles. Based on the insights we obtain from special graphs, we develop efficient methods for general graphs. On trees, we first propose a polynomial time algorithm.
More importantly, we present a tight and compact extended formulation. We also project the extended formulation onto the space of the natural variables, which gives the polytope on trees. Next, building upon the result for trees, we derive the polytope on cycles for the WTSS problem, as well as a polynomial time algorithm on cycles. This leads to our contribution on general graphs. For the WTSS problem and the LCIP, using the observation that the influence propagation network must be a directed acyclic graph (DAG), the strong formulation for trees can be embedded into a formulation on general graphs. We use this to design and implement a branch-and-cut approach for the WTSS problem and the LCIP. In our computational study, we obtain high-quality solutions for random graph instances with up to 10,000 nodes and 20,000 edges (40,000 arcs) within a reasonable amount of time.
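For intuition, the underlying combinatorial problem can be stated as a brute-force baseline: find the cheapest seed set whose threshold diffusion activates every node. The graph, thresholds, and costs below are hypothetical, and this exponential enumeration is exactly what exact methods such as branch-and-cut are designed to avoid on large instances:

```python
from itertools import combinations

def activates_all(graph, thresholds, seeds):
    """Threshold diffusion from `seeds`: a node activates once the number of
    active neighbours reaches its threshold. True if every node activates."""
    active = set(seeds)
    changed = True
    while changed:
        changed = False
        for v in graph:
            if v not in active and \
               sum(u in active for u in graph[v]) >= thresholds[v]:
                active.add(v)
                changed = True
    return len(active) == len(graph)

def min_cost_target_set(graph, thresholds, costs):
    """Exhaustive baseline: cheapest seed set that activates the whole graph.
    Exponential in the number of nodes; illustration only."""
    best, best_cost = None, float("inf")
    nodes = list(graph)
    for r in range(len(nodes) + 1):
        for seeds in combinations(nodes, r):
            c = sum(costs[v] for v in seeds)
            if c < best_cost and activates_all(graph, thresholds, seeds):
                best, best_cost = set(seeds), c
    return best, best_cost

# Toy 4-node path a-b-c-d with node thresholds and seeding costs
g = {"a": ["b"], "b": ["a", "c"], "c": ["b", "d"], "d": ["c"]}
th = {"a": 1, "b": 1, "c": 2, "d": 1}
cost = {"a": 3, "b": 1, "c": 5, "d": 2}
seeds_sol, total = min_cost_target_set(g, th, cost)
print(seeds_sol, total)
```

In this toy instance, seeding b and d (cost 3) lets c reach its threshold of 2 and a activate via b, so the whole path adopts.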

Relevance: 30.00%

Abstract:

The International Space Station (ISS) requires a substantial amount of potable water for use by the crew. The economic and logistic limitations of transporting the vast amount of water required onboard the ISS necessitate onboard recovery and reuse of the aqueous waste streams. Various treatment technologies are employed within the ISS water processor to render the waste water potable, including filtration, ion exchange, adsorption, and catalytic wet oxidation. The ion exchange resins and adsorption media are combined in multifiltration beds for removal of ionic and organic compounds. A mathematical model (MFBMODEL™) designed to predict the performance of a multifiltration (MF) bed was developed. MFBMODEL consists of ion exchange models for describing the behavior of the different resin types in a MF bed (e.g., mixed bed, strong acid cation, strong base anion, and weak base anion exchange resins) and an adsorption model capable of predicting the performance of the adsorbents in a MF bed. Multicomponent ion exchange equilibrium models that incorporate the water formation reaction, electroneutrality condition, and degree of ionization of weak acids and bases for mixed bed, strong acid cation, strong base anion, and weak base anion exchange resins were developed and verified. The equilibrium models use a tanks-in-series approach that allows for consideration of variable influent concentrations. The adsorption modeling approach was developed in related studies, and its application within the MFBMODEL framework is demonstrated in the Appendix to this study. MFBMODEL consists of a graphical user interface programmed in Visual Basic and Fortran computational routines. This dissertation shows MF bed modeling results in which the model is verified for a surrogate of the ISS waste shower and handwash stream.
In addition, a multicomponent ion exchange model that incorporates mass transfer effects was developed. It is capable of describing the performance of strong acid cation (SAC) and strong base anion (SBA) exchange resins, although it does not include reaction effects. This dissertation presents results showing the mass transfer model's capability to predict the performance of binary and multicomponent column data for SAC and SBA exchange resins. The ion exchange equilibrium and mass transfer models developed in this study are also applicable to terrestrial water treatment systems. They could be applied for removal of cations and anions from groundwater (e.g., hardness, nitrate, perchlorate) and from industrial process waters (e.g., boiler water, ultrapure water in the semiconductor industry).
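The tanks-in-series idea can be sketched as a chain of stirred tanks, with a lumped retardation factor crudely standing in for the full ion exchange equilibrium. All parameters and names below are illustrative assumptions, not values or routines from MFBMODEL:

```python
def tanks_in_series(n_tanks, flow, volume, retardation, c_in, dt, steps):
    """Column as a chain of stirred tanks. Each tank relaxes toward its
    upstream concentration with time constant tau * retardation, where the
    retardation factor lumps a linear ion-exchange equilibrium into the
    liquid-phase balance. Returns the outlet concentration history."""
    c = [0.0] * n_tanks
    tau = volume / n_tanks / flow          # residence time per tank
    out = []
    for _ in range(steps):
        prev = c_in                        # feed enters the first tank
        for i in range(n_tanks):
            c[i] += (prev - c[i]) / (tau * retardation) * dt
            prev = c[i]                    # cascade to the next tank
        out.append(c[-1])
    return out

# Hypothetical column: 10 tanks, unit feed concentration, retardation 5
curve = tanks_in_series(n_tanks=10, flow=1.0, volume=10.0,
                        retardation=5.0, c_in=1.0, dt=0.1, steps=2000)
print(curve[0], curve[-1])   # near zero at start, near feed at the end
```

The resulting S-shaped breakthrough curve sharpens as the number of tanks grows, which is how the tanks-in-series discretization approximates plug flow with dispersion while still allowing a time-varying influent.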

Relevance: 30.00%

Abstract:

Implementation of stable aeroelastic models with the ability to capture the complex features of multi-concept smart blades is a prime step in reducing the uncertainties that accompany blade dynamics. Numerical simulations of fluid-structure interaction can thus be used to test realistic scenarios comprising full-scale blades at a reasonably low computational cost. A code combining two advanced numerical models was designed and run on a parallel HPC supercomputer platform. The first model is based on a variation of the dimensional reduction technique proposed by Hodges and Yu, and records the structural response of heterogeneous composite blades. This technique reduces the geometrical complexities of the heterogeneous blade section into a stiffness matrix for an equivalent beam. The derived equivalent 1-D strain energy matrix is similar to the actual 3-D strain energy matrix in an asymptotic sense. Because this 1-D matrix allows the blade structure to be accurately modeled as a 1-D finite element problem, it substantially reduces the computational effort, and consequently the computational cost, required to model the structural dynamics at each step. The second model implements Blade Element Momentum theory. In this approach we map all the velocities and forces with the help of orthogonal matrices that capture the large deformations and the effects of rotations in calculating the aerodynamic forces. This ultimately allows us to take into account the complex flexo-torsional deformations. In this thesis we have successfully tested these computational tools, developed by MTU's research team, for the aeroelastic analysis of wind-turbine blades. The validation in this thesis is based largely on several experiments done on the NREL-5MW blade, as this is widely accepted as a benchmark blade in the wind industry.
Along with the use of this innovative model, the internal blade structure was also changed to add to the benefits of the already advanced numerical models.
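The core of a Blade Element Momentum calculation is a fixed-point iteration for the induction factors at each blade element. The sketch below uses a linear lift curve, zero drag, and no tip-loss or high-induction corrections, and its geometry and operating parameters are illustrative rather than NREL-5MW values:

```python
import math

def bem_element(r, R, B, chord, twist, pitch, tsr, cl_slope=2 * math.pi):
    """Fixed-point iteration for the axial (a) and tangential (ap) induction
    factors at one blade element: the core loop of Blade Element Momentum
    theory. Linear lift curve, zero drag, no tip loss (sketch only)."""
    sigma = B * chord / (2 * math.pi * r)        # local solidity
    lam_r = tsr * r / R                          # local speed ratio
    a, ap = 0.3, 0.0
    for _ in range(200):
        phi = math.atan2(1 - a, lam_r * (1 + ap))    # inflow angle
        alpha = phi - (twist + pitch)                # angle of attack
        cl = cl_slope * alpha                        # linear lift curve
        cn = cl * math.cos(phi)                      # normal force coeff
        ct = cl * math.sin(phi)                      # tangential force coeff
        a_new = 1.0 / (4 * math.sin(phi) ** 2 / (sigma * cn) + 1)
        ap_new = 1.0 / (4 * math.sin(phi) * math.cos(phi) / (sigma * ct) - 1)
        if abs(a_new - a) < 1e-8 and abs(ap_new - ap) < 1e-8:
            a, ap = a_new, ap_new
            break
        a, ap = 0.5 * (a + a_new), 0.5 * (ap + ap_new)   # under-relaxation
    return a, ap

# Hypothetical mid-span element of a 3-bladed, 60 m rotor at tip speed ratio 7
a, ap = bem_element(r=30.0, R=60.0, B=3, chord=3.0,
                    twist=math.radians(5), pitch=0.0, tsr=7.0)
print(a, ap)
```

In a full aeroelastic coupling, the orthogonal rotation matrices mentioned above would transform the deformed blade velocities into this element frame before each iteration, so the aerodynamic forces follow the flexo-torsional deflections.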

Relevance: 30.00%

Abstract:

Buildings and other infrastructure located in the coastal regions of the US have a higher level of wind vulnerability. Reducing the increasing property losses and casualties associated with severe windstorms has been a central research focus of the wind engineering community. The present wind engineering toolbox consists of building codes and standards, laboratory experiments, and field measurements. The American Society of Civil Engineers (ASCE) 7 standard provides wind loads only for buildings with common shapes; for complex cases it refers to physical modeling. Although this option can be economically viable for large projects, it is not cost-effective for low-rise residential houses. To circumvent these limitations, a numerical approach based on the techniques of Computational Fluid Dynamics (CFD) has been developed. Recent advances in computing technology and significant developments in turbulence modeling are making numerical evaluation of wind effects a more affordable approach. The present study targeted those cases that are not addressed by the standards. These include wind loads on complex roofs for low-rise buildings, the aerodynamics of tall buildings, and the effects of complex surrounding buildings. Among all the turbulence models investigated, the large eddy simulation (LES) model performed best in predicting wind loads. The application of a spatially evolving, time-dependent wind velocity field with the relevant turbulence structures at the inlet boundaries was found to be essential. All the results were compared with and validated against experimental data. The study also revealed CFD's unique flow visualization and aerodynamic data generation capabilities, along with a better understanding of the complex three-dimensional aerodynamics of wind-structure interactions. With proper modeling that realistically represents the actual turbulent atmospheric boundary layer flow, CFD can offer an economical alternative to the existing wind engineering tools.
CFD’s easy accessibility is expected to transform the practice of structural design for wind, resulting in more wind-resilient and sustainable systems by encouraging optimal aerodynamic and sustainable structural/building design. Thus, this method will help ensure public safety and reduce economic losses due to wind perils.

Relevance: 30.00%

Abstract:

There is a growing societal need to address the increasing prevalence of behavioral health issues, such as obesity, alcohol or drug use, and general lack of treatment adherence for a variety of health problems. The statistics, worldwide and in the USA, are daunting. Excessive alcohol use is the third leading preventable cause of death in the United States (with 79,000 deaths annually), and is responsible for a wide range of health and social problems. On the positive side, though, these behavioral health issues (and associated possible diseases) can often be prevented with relatively simple lifestyle changes, such as losing weight with a diet and/or physical exercise, or learning how to reduce alcohol consumption. Medicine has therefore started to move toward finding ways of preventively promoting wellness, rather than solely treating already established illness. Evidence-based, patient-centered Brief Motivational Interviewing (BMI) interventions have been found particularly effective in helping people find intrinsic motivation to change problem behaviors after short counseling sessions, and to maintain healthy lifestyles over the long term. Lack of locally available personnel well-trained in BMI, however, often limits access to successful interventions for people in need. To fill this accessibility gap, Computer-Based Interventions (CBIs) have started to emerge. Success of CBIs, however, critically relies on ensuring engagement and retention of CBI users so that they remain motivated to use these systems and come back to use them over the long term as necessary. Because of their text-only interfaces, current CBIs can only express limited empathy and rapport, which are among the most important factors of health interventions. Fortunately, in the last decade, computer science research has progressed in the design of simulated human characters with anthropomorphic communicative abilities.
Virtual characters interact using humans' innate communication modalities, such as facial expressions, body language, speech, and natural language understanding. By advancing research in Artificial Intelligence (AI), we can improve the ability of artificial agents to help us solve CBI problems. To facilitate successful communication and social interaction between artificial agents and human partners, it is essential that aspects of human social behavior, especially empathy and rapport, be considered when designing human-computer interfaces. Hence, the goal of the present dissertation is to provide a computational model of rapport to enhance an artificial agent's social behavior, and to provide an experimental tool for the psychological theories shaping the model. Parts of this thesis were already published in [LYL+12, AYL12, AL13, ALYR13, LAYR13, YALR13, ALY14].

Relevance: 30.00%

Abstract:

Recently, there has been great interest in studying the flow characteristics of suspensions in different environmental and industrial applications, such as snow avalanches, debris flows, hydrotransport systems, and material casting processes. In rheological terms, the majority of these suspensions, such as fresh concrete, behave mostly as non-Newtonian fluids. Concrete is the most widely used construction material in the world. Due to the limitations of normal concrete in terms of workability and formwork filling ability, a new class of concrete was developed that is able to flow under its own weight, especially through narrow gaps in congested areas of the formwork. Accordingly, self-consolidating concrete (SCC) is a novel construction material that is gaining market acceptance in various applications. The higher fluidity of SCC enables it to be used in a number of special applications, such as densely reinforced sections. However, the higher flowability of SCC makes it more sensitive to segregation of coarse particles during flow (i.e., dynamic segregation) and thereafter at rest (i.e., static segregation). Dynamic segregation can increase when SCC flows over a long distance or in the presence of obstacles. Therefore, there is always a need to establish a trade-off between the flowability, passing ability, and stability properties of SCC suspensions. This should be taken into consideration when designing the casting process and the mixture proportioning of SCC; this is called the "workability design" of SCC. An efficient and inexpensive workability design approach consists of predicting and optimizing the workability of the concrete mixtures for the selected construction processes, such as transportation, pumping, casting, compaction, and finishing.
Indeed, the mixture proportioning of SCC should ensure the construction quality demands, such as the demanded levels of flowability, passing ability, filling ability, and stability (dynamic and static). It is therefore necessary to develop theoretical tools to assess under what conditions the construction quality demands are satisfied. Accordingly, this thesis is dedicated to carrying out analytical and numerical simulations to predict the flow performance of SCC under different casting processes, such as pumping and tremie applications, or casting using buckets. The L-Box and T-Box set-ups can evaluate the flow performance properties of SCC (e.g., flowability, passing ability, filling ability, shear-induced and gravitational dynamic segregation) in the casting of wall and beam elements. The specific objective of the study is to relate the numerical results of flow simulation of SCC in the L-Box and T-Box test set-ups, reported in this thesis, to the flow performance properties of SCC during casting. Accordingly, SCC is modeled as a heterogeneous material. Furthermore, an analytical model is proposed to predict the flow performance of SCC in the L-Box set-up using dam break theory. On the other hand, results of the numerical simulation of SCC casting in a reinforced beam are verified against experimental free surface profiles. The results of numerical simulations of SCC casting (modeled as a single homogeneous fluid) are used to determine the critical zones corresponding to higher risks of segregation and blocking. The effects of rheological parameters, density, particle contents, distribution of reinforcing bars, and particle-bar interactions on the flow performance of SCC are evaluated using CFD simulations of SCC flow in the L-Box and T-Box test set-ups (modeled as a heterogeneous material).
Two new approaches are proposed to classify the SCC mixtures based on filling ability and performability properties, the latter combining the flowability, passing ability, and dynamic stability of SCC.

Relevance: 30.00%

Abstract:

Understanding how virus strains offer protection against closely related emerging strains is vital for creating effective vaccines. For many viruses, including Foot-and-Mouth Disease Virus (FMDV) and the Influenza virus, where multiple serotypes often co-circulate, in vitro testing of large numbers of vaccines can be infeasible. Therefore the development of an in silico predictor of cross-protection between strains is important to help optimise vaccine choice. Vaccines will offer cross-protection against closely related strains, but not against those that are antigenically distinct. To be able to predict cross-protection, we must understand the antigenic variability within a virus serotype and across distinct lineages of a virus, and identify the antigenic residues and evolutionary changes that cause the variability. In this thesis we present a family of sparse hierarchical Bayesian models for detecting relevant antigenic sites in virus evolution (SABRE), as well as an extended version of the method, the extended SABRE (eSABRE) method, which better takes into account the data collection process. The SABRE methods are a family of sparse Bayesian hierarchical models that use spike-and-slab priors to identify sites in the viral protein which are important for the neutralisation of the virus. In this thesis we demonstrate how the SABRE methods can be used to identify antigenic residues within different serotypes and show how the SABRE method outperforms established methods, mixed-effects models based on forward variable selection or L1 regularisation, on both synthetic and viral datasets. In addition, we test a number of different versions of the SABRE method, comparing conjugate and semi-conjugate prior specifications and an alternative to the spike-and-slab prior: the binary mask model. We also propose novel proposal mechanisms for the Markov chain Monte Carlo (MCMC) simulations, which improve mixing and convergence over the established component-wise Gibbs sampler.
The SABRE method is then applied to datasets from FMDV and the Influenza virus in order to identify a number of known antigenic residues and to provide hypotheses about other potentially antigenic residues. We also demonstrate how the SABRE methods can be used to create accurate predictions of the important evolutionary changes of the FMDV serotypes. In this thesis we provide an extended version of the SABRE method, the eSABRE method, based on a latent variable model. The eSABRE method further takes into account the structure of the FMDV and Influenza datasets through the latent variable model and improves the modelling of the error. We show how the eSABRE method outperforms the SABRE methods in simulation studies and propose a new information criterion for selecting the random effects factors that should be included in the eSABRE method: the block integrated Widely Applicable Information Criterion (biWAIC). We demonstrate how biWAIC performs comparably to two other methods for selecting the random effects factors and combine it with the eSABRE method to apply it to two large Influenza datasets. Inference in these large datasets is computationally infeasible with the SABRE methods, but as a result of the improved structure of the likelihood, the eSABRE method offers a computational improvement that allows it to be used on these datasets. The results of the eSABRE method show that we can use the method in a fully automatic manner to identify a large number of antigenic residues on a variety of the antigenic sites of two Influenza serotypes, as well as making predictions of a number of nearby sites that may also be antigenic and are worthy of further experimental investigation.
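A minimal sketch of the spike-and-slab idea behind the SABRE models, not the SABRE implementation itself: a Gibbs sampler for linear regression in which each coefficient is either exactly zero (the spike) or drawn from a Gaussian slab, yielding posterior inclusion probabilities for each site. The synthetic data, hyperparameters, and the assumption of known noise variance are all illustrative:

```python
import numpy as np

def spike_slab_gibbs(X, y, sigma2=1.0, tau2=4.0, pi=0.2, iters=2000, seed=0):
    """Gibbs sampler for y = X w + noise with a spike-and-slab prior:
    w_j = gamma_j * beta_j, gamma_j ~ Bernoulli(pi), beta_j ~ N(0, tau2).
    Each sweep samples (gamma_j, beta_j) jointly from the marginal Bayes
    factor, then beta_j from its Gaussian conditional when included.
    Returns posterior inclusion probabilities (second half as burn-in)."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    w = np.zeros(p)
    incl = np.zeros(p)
    for it in range(iters):
        for j in range(p):
            r = y - X @ w + X[:, j] * w[j]     # residual excluding site j
            s = X[:, j] @ X[:, j]
            b = X[:, j] @ r
            # log Bayes factor for including coordinate j (beta integrated out)
            log_bf = 0.5 * np.log(sigma2 / (sigma2 + tau2 * s)) \
                     + 0.5 * tau2 * b**2 / (sigma2 * (sigma2 + tau2 * s))
            odds = (pi / (1 - pi)) * np.exp(min(log_bf, 700))
            if rng.random() < odds / (1 + odds):
                prec = s / sigma2 + 1 / tau2
                w[j] = rng.normal(b / sigma2 / prec, np.sqrt(1 / prec))
                if it >= iters // 2:
                    incl[j] += 1
            else:
                w[j] = 0.0
    return incl / (iters - iters // 2)

# Synthetic check: only the first 2 of 6 predictors matter.
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 6))
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + 0.1 * rng.normal(size=100)
probs = spike_slab_gibbs(X, y)
print(probs)
```

The inclusion probabilities play the role of the "relevant antigenic site" indicators in SABRE; the full models add the hierarchical random effects and, in eSABRE, the latent variable layer for the data collection process.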

Relevance: 20.00%

Abstract:

Diffusion equations that use time fractional derivatives are attractive because they describe a wealth of problems involving non-Markovian random walks. The time fractional diffusion equation (TFDE) is obtained from the standard diffusion equation by replacing the first-order time derivative with a fractional derivative of order α ∈ (0, 1). Developing numerical methods for solving fractional partial differential equations is a new research field, and the theoretical analysis of the numerical methods associated with them is not fully developed. In this paper an explicit conservative difference approximation (ECDA) for the TFDE is proposed. We give a detailed analysis of this ECDA and generate discrete models of random walk suitable for simulating random variables whose spatial probability density evolves in time according to this fractional diffusion equation. The stability and convergence of the ECDA for the TFDE in a bounded domain are discussed. Finally, some numerical examples are presented to show the application of the present technique.
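For reference, the TFDE described above can be written as follows, using the Caputo form of the fractional time derivative; the diffusion coefficient K and the exact notation are assumptions, since the paper's own notation is not shown here:

```latex
\frac{\partial^{\alpha} u(x,t)}{\partial t^{\alpha}}
  = K\,\frac{\partial^{2} u(x,t)}{\partial x^{2}},
  \qquad 0 < \alpha < 1,
\quad\text{where}\quad
\frac{\partial^{\alpha} u(x,t)}{\partial t^{\alpha}}
  = \frac{1}{\Gamma(1-\alpha)}
    \int_{0}^{t} \frac{\partial u(x,s)}{\partial s}\,
    \frac{ds}{(t-s)^{\alpha}} .
```

Setting α = 1 recovers the standard diffusion equation, while α < 1 gives the sub-diffusive, memory-bearing behavior associated with non-Markovian random walks.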