918 results for Physics Based Modeling
Abstract:
In recent years, bio-conjugated nanostructured materials have emerged as a new class of materials for biosensing and medical diagnostics applications. Despite their wide-ranging applications, interfacing nanomaterials with biomolecules has remained a challenge, owing to limited knowledge of the underlying physics and chemistry of these interactions and to the complexity of biomolecules. The main objective of this dissertation is to provide such detailed knowledge of bioconjugated nanomaterials toward their application in designing the next generation of sensing devices. Specifically, we investigate the changes in the electronic properties of a boron nitride nanotube (BNNT) due to the adsorption of different biomolecules, ranging from neutral (DNA/RNA nucleobases) to polar (amino acid molecules). BNNT is a typical member of the III-V compound semiconductors, with morphology similar to that of carbon nanotubes (CNTs) but with its own distinct properties. In particular, the natural affinity of BNNTs toward living cells, with no apparent toxicity, motivates their application in drug delivery and cell therapy. Our results predict that the adsorption of DNA/RNA nucleobases on BNNTs produces different degrees of modulation in the band gap of BNNTs, which can be exploited for distinguishing these nucleobases from one another. Interestingly, for the polar amino acid molecules, the nature of the interaction varies from Coulombic to van der Waals to covalent depending on the polarity of the individual molecules, each with a different binding strength and amount of charge transfer involved in the interaction. The strong binding of amino acid molecules to BNNTs explains the observed wrapping of proteins onto BNNTs without any linkers, unlike with carbon nanotubes (CNTs).
Additionally, the widely varying binding energies of different amino acid molecules toward BNNTs indicate the suitability of BNNTs for biosensing applications, as compared to metallic CNTs. The calculated I-V characteristics of these bioconjugated nanotubes predict notable changes in the conductivity of BNNTs due to the physisorption of DNA/RNA nucleobases. This is not the case with metallic CNTs, whose transport properties remained unaltered in their conjugated systems with the nucleobases. Collectively, bioconjugated BNNTs are found to be an excellent system for next-generation sensing devices.
Abstract:
This document demonstrates the methodology used to create an energy- and conductance-based model for power electronic converters. The work is intended as a replacement for voltage- and current-based models, which have limited applicability to the network nodal equations. Conductance-based modeling allows direct application of load differential equations to the bus admittance matrix (Y-bus) with a unified approach. When applied directly to the Y-bus, the system becomes much easier to simulate since the state variables do not need to be transformed. The proposed transformation applies to loads, sources, and energy storage systems and is useful for DC microgrids. Transformed state models of a complete microgrid are compared to experimental results and show that the models accurately reflect the system's dynamic behavior.
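The core idea of nodal analysis on the Y-bus, assembling branch conductances into the admittance matrix and solving the nodal equations directly, can be sketched as follows. The two-bus network, its conductance values, and the function names are illustrative inventions, not taken from the document:

```python
# Minimal illustration of conductance-based nodal analysis on a DC network.
# All names and values are hypothetical, not from the modeled microgrid.

def build_ybus(n_nodes, branches):
    """Assemble the bus admittance matrix from (i, j, conductance) branches.
    j = -1 denotes a shunt branch to ground."""
    Y = [[0.0] * n_nodes for _ in range(n_nodes)]
    for i, j, g in branches:
        Y[i][i] += g
        if j >= 0:
            Y[j][j] += g
            Y[i][j] -= g
            Y[j][i] -= g
    return Y

def solve_2x2(Y, I):
    """Solve the nodal equations Y v = I for a 2-node network (Cramer's rule)."""
    det = Y[0][0] * Y[1][1] - Y[0][1] * Y[1][0]
    v0 = (I[0] * Y[1][1] - I[1] * Y[0][1]) / det
    v1 = (Y[0][0] * I[1] - Y[1][0] * I[0]) / det
    return [v0, v1]

# Two buses: a 0.5 S branch between them, each with a 1.0 S shunt to ground.
Y = build_ybus(2, [(0, 1, 0.5), (0, -1, 1.0), (1, -1, 1.0)])
v = solve_2x2(Y, [2.0, 0.0])  # 2 A injected at bus 0
```

With the state expressed on the Y-bus like this, adding a load or source is just another conductance stamp, which is the unification the abstract refers to.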
Abstract:
Image-based modeling of tumor growth combines methods from cancer simulation and medical imaging. In this context, we present a novel approach to adapt a healthy brain atlas to MR images of tumor patients. In order to establish correspondence between a healthy atlas and a pathologic patient image, tumor growth modeling in combination with registration algorithms is employed. In the first step, the tumor is grown in the atlas based on a new multi-scale, multi-physics model, including growth simulation from the cellular level up to the biomechanical level and accounting for cell proliferation and tissue deformations. Large-scale deformations are handled with an Eulerian approach for finite element computations, which can operate directly on the image voxel mesh. Subsequently, dense correspondence between the modified atlas and patient image is established using nonrigid registration. The method offers opportunities in atlas-based segmentation of tumor-bearing brain images as well as for improved patient-specific simulation and prognosis of tumor progression.
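A minimal caricature of the cell-proliferation component of such growth models is a one-dimensional reaction-diffusion (Fisher-KPP type) update, with diffusive spread plus logistic proliferation. The grid, coefficients, and explicit time stepping below are hypothetical choices for illustration only, far simpler than the paper's multi-scale finite element model:

```python
# 1D reaction-diffusion sketch of a normalized tumor cell density u:
#   du/dt = D d2u/dx2 + rho * u * (1 - u)
# Explicit Euler stepping on a uniform grid; all parameters are illustrative.

def step(u, D=0.1, rho=0.5, dx=1.0, dt=0.1):
    n = len(u)
    new = u[:]                      # boundary cells held at zero density
    for i in range(1, n - 1):
        diff = D * (u[i - 1] - 2 * u[i] + u[i + 1]) / dx ** 2   # diffusion
        grow = rho * u[i] * (1.0 - u[i])                        # proliferation
        new[i] = u[i] + dt * (diff + grow)
    return new

u = [0.0] * 41
u[20] = 0.5                         # small tumor seed at the domain center
for _ in range(200):
    u = step(u)
```

The seed both spreads outward and saturates toward the carrying capacity, the qualitative behavior the cellular level of the model captures before biomechanical coupling enters.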
Abstract:
Results of studies of the static and dynamic dielectric properties in rod-like 4-n-octyloxy-4'-cyanobiphenyl (8OCB) with isotropic (I)–nematic (N)–smectic A (SmA)–crystal (Cr) mesomorphism, combined with measurements of the low-frequency nonlinear dielectric effect and heat capacity, are presented. The analysis is supported by the derivative-based and distortion-sensitive transformation of experimental data. Evidence for the I–N and N–SmA pretransitional anomalies, indicating the influence of tricritical behavior, is shown. It has also been found that neither the N phase nor the SmA phase is uniform, and hallmarks of fluid–fluid crossovers can be detected. The dynamics, tested via the evolution of the primary relaxation time, is clearly non-Arrhenius and described via τ(T) = τ_c(T − T_C)^(−φ). In the immediate vicinity of the I–N transition a novel anomaly has been found: Δτ ∝ 1/(T − T*), where T* is the temperature of the virtual continuous transition and Δτ is the excess over the 'background behavior'. Experimental results are confronted with comprehensive Landau–de Gennes theory-based modeling.
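Fitting a critical power law of the form τ(T) = τ_c(T − T_C)^(−φ) reduces, for known T_C, to linear regression in log-log coordinates. This sketch recovers the exponent from synthetic data; the numerical values are made up for illustration, not the measured 8OCB relaxation times:

```python
import math

# Fit tau(T) = tau_c * (T - TC)**(-phi) by least squares on
# log(tau) versus log(T - TC); data are synthetic, TC assumed known.

TC, tau_c, phi = 300.0, 2.0, 1.5
T = [301.0, 302.0, 304.0, 308.0, 316.0]
tau = [tau_c * (t - TC) ** (-phi) for t in T]

x = [math.log(t - TC) for t in T]
y = [math.log(v) for v in tau]
n = len(x)
sx, sy = sum(x), sum(y)
sxx = sum(xi * xi for xi in x)
sxy = sum(xi * yi for xi, yi in zip(x, y))
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
intercept = (sy - slope * sx) / n

phi_fit = -slope                 # recovered critical exponent
tau_c_fit = math.exp(intercept)  # recovered prefactor
```

In practice T_C itself must be fitted as well (it is below the transition temperature for a weakly first-order transition), which is where the derivative-based, distortion-sensitive transformations mentioned above earn their keep.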
Abstract:
It is expected that climate change will have significant impacts on ecosystems. Most model projections agree that the ocean will experience stronger stratification and less nutrient supply from deep waters. These changes will likely affect marine phytoplankton communities and will thus impact the higher trophic levels of the oceanic food web. The potential consequences of future climate change for marine microbial communities can be investigated and predicted only with the help of mathematical models. Here we present the application of a model that describes aggregate properties of marine phytoplankton communities and captures the effects of a changing environment on their composition and adaptive capacity. Specifically, the model describes the phytoplankton community in terms of total biomass, mean cell size, and functional diversity. The model is applied to two contrasting regions of the Atlantic Ocean (tropical and temperate) and is tested under two emission scenarios: SRES A2, or "business as usual," and SRES B1, or "local utopia." We find that all three macroecological properties will decline during the next century in both regions, although this effect will be more pronounced in the temperate region. Consistent with previous model predictions, our results show that a simple trait-based modeling framework represents a valuable tool for investigating how phytoplankton communities may reorganize under a changing climate.
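Aggregate trait-based models of this kind typically track a few community-level moments rather than many species. A common minimal form evolves total biomass B with the growth rate at the mean trait, and moves the mean trait s up the fitness gradient at a speed set by the trait variance V (the "adaptive capacity"). The growth function, parameter values, and the simplification of holding V fixed are all assumptions of this sketch, not the published model:

```python
# Moment-based sketch of a phytoplankton community: total biomass B,
# mean (log) cell size s, trait variance V. The mean trait climbs the
# fitness gradient at rate V * dr/ds; V is held fixed for simplicity.
# Growth function and all parameters are illustrative.

def r(s, nutrient=1.0):
    # growth rate with a size optimum that shifts with nutrient supply
    s_opt = 1.0 * nutrient
    return 1.0 - (s - s_opt) ** 2

def drds(s, nutrient=1.0, h=1e-6):
    # central-difference fitness gradient
    return (r(s + h, nutrient) - r(s - h, nutrient)) / (2 * h)

B, s, V = 1.0, 2.0, 0.5
dt = 0.01
for _ in range(1000):
    B += dt * r(s) * B        # biomass grows at the mean-trait rate
    s += dt * V * drds(s)     # mean trait adapts toward the optimum
```

Under a stratification scenario one would shrink the nutrient supply over time, dragging the size optimum (and hence mean cell size and biomass) downward, which is the qualitative decline the abstract reports.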
Abstract:
The sputtering efficiency of refractory elements by H+ and He++ solar wind ions from Mercury's surface and their contribution to the exosphere are studied for various solar wind conditions. A 3D solar wind-planetary interaction hybrid model is used for the evaluation of precipitation maps of the sputter agents on Mercury's surface. By assuming a global mineralogical surface composition, the related sputter yields are calculated by means of the 2013 SRIM code and are coupled with a 3D exosphere model. Because of Mercury's magnetic field, for quiet and nominal solar wind conditions the plasma can only precipitate around the polar areas, while for extreme solar events (fast solar wind, coronal mass ejections, interplanetary magnetic clouds) the solar wind plasma has access to the entire dayside. In that case the release of particles from the planet's surface can result in an exosphere density increase of more than one order of magnitude. The corresponding escape rates are also about an order of magnitude higher. Moreover, the amount of He++ ions in the precipitating solar plasma flow also enhances the release of sputtered elements from the surface into the exosphere. A comparison of our model results with MESSENGER observations of sputtered Mg and Ca elements in the exosphere shows reasonable quantitative agreement. (C) 2015 Elsevier Ltd. All rights reserved.
Abstract:
By 2050, it is estimated that the number of worldwide Alzheimer's disease (AD) patients will quadruple from the current 36 million people. To date, no single test, prior to postmortem examination, can confirm that a person suffers from AD. Therefore, there is a strong need for accurate and sensitive tools for the early diagnosis of AD. The complex etiology and multiple pathogenic pathways of AD call for a system-level understanding of the currently available biomarkers and the study of new biomarkers via network-based modeling of heterogeneous data types. In this review, we summarize recent research on the study of AD as a connectivity syndrome. We argue that a network-based approach to biomarker discovery will provide key insights for fully understanding the network degeneration hypothesis (disease starts in specific network areas and progressively spreads to connected areas of the initial loci-networks), with a potential impact on early diagnosis and disease-modifying treatments. We introduce a new framework for the quantitative study of biomarkers that can help shorten the transition between academic research and clinical diagnosis in AD.
Abstract:
The design, development, and use of complex systems models raise a unique class of challenges and potential pitfalls, many of which are commonly recurring problems. Over time, researchers gain experience in this form of modeling, choosing algorithms, techniques, and frameworks that improve the quality, confidence level, and speed of development of their models. This increasing collective experience of complex systems modelers is a resource that should be captured. Fields such as software engineering and architecture have benefited from the development of generic solutions to recurring problems, called patterns. Using pattern development techniques from these fields, insights from communities such as learning and information processing, data mining, bioinformatics, and agent-based modeling can be identified and captured. Collections of such 'pattern languages' would allow knowledge gained through experience to be readily accessible to less experienced practitioners and to other domains. This paper proposes a methodology for capturing the wisdom of computational modelers by introducing example visualization patterns and a pattern classification system for analyzing the relationship between micro and macro behavior in complex systems models. We anticipate that a new field of complex systems patterns will provide an invaluable resource for both practicing and future generations of modelers.
Abstract:
This paper presents a formal but practical approach for defining and using design patterns. Initially we formalize the concepts commonly used in defining design patterns using Object-Z. We also formalize consistency constraints that must be satisfied when a pattern is deployed in a design model. Then we implement the pattern modeling language and its consistency constraints using an existing modeling framework, EMF, and incorporate the implementation as plug-ins to the Eclipse modeling environment. While the language is defined formally in terms of Object-Z definitions, the language is implemented in a practical environment. Using the plug-ins, users can develop precise pattern descriptions without knowing the underlying formalism, and can use the tool to check the validity of the pattern descriptions and pattern usage in design models. In this work, formalism brings precision to the pattern language definition and its implementation brings practicability to our pattern-based modeling approach.
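The consistency constraints described above can be pictured as a check that a role-to-class binding preserves the dependencies the pattern demands. The paper does this formally in Object-Z and EMF; the toy check below, including the Observer roles, class names, and constraint encoding, is a hypothetical simplification:

```python
# Toy pattern-deployment consistency check. The pattern is reduced to a set
# of roles plus required dependencies between roles; a deployment binds roles
# to model classes and is valid only if every role dependency is realized
# by a dependency in the design model. All names here are illustrative.

pattern_roles = {"Subject", "Observer"}
pattern_deps = {("Subject", "Observer")}   # Subject must reference Observer

# A candidate design model: class dependencies present in the model.
model_deps = {("WeatherData", "Display")}

def check_deployment(binding):
    """binding maps each pattern role to a model class; every role
    dependency must map onto an existing model dependency."""
    if set(binding) != pattern_roles:
        return False
    return all((binding[a], binding[b]) in model_deps for a, b in pattern_deps)

ok = check_deployment({"Subject": "WeatherData", "Observer": "Display"})
bad = check_deployment({"Subject": "Display", "Observer": "WeatherData"})
```

A real pattern language adds structural constraints (multiplicities, operation signatures, forbidden references), but the validity question stays the same shape: does the binding satisfy every formalized constraint.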
Abstract:
When constructing and using environmental models, it is typical that many of the inputs to the models will not be known perfectly. In some cases, it will be possible to make observations, or occasionally to use physics-based uncertainty propagation, to ascertain the uncertainty on these inputs. However, such observations are often either not available or not even possible, and another approach to characterising the uncertainty on the inputs must be sought. Even when observations are available, if the analysis is being carried out within a Bayesian framework then prior distributions will have to be specified. One option for gathering, or at least estimating, this information is to employ expert elicitation. Expert elicitation is well studied within statistics and psychology and involves the assessment of the beliefs of a group of experts about an uncertain quantity (for example, an input or parameter within a model), typically in terms of obtaining a probability distribution. One of the challenges in expert elicitation is to minimise the biases that might enter into the judgements made by the individual experts, and then to come to a consensus decision within the group of experts. Effort is made in the elicitation exercise to prevent biases clouding the judgements through well-devised questioning schemes. It is also important that, when reaching a consensus, the experts are exposed to the knowledge of the others in the group. Within the FP7 UncertWeb project (http://www.uncertweb.org/), there is a requirement to build a Web-based tool for expert elicitation. In this paper, we discuss some of the issues of building a Web-based elicitation system - both the technological aspects and the statistical and scientific issues. In particular, we demonstrate two tools: a Web-based system for the elicitation of continuous random variables and a system designed to elicit uncertainty about categorical random variables in the setting of landcover classification uncertainty.
The first of these examples is a generic tool developed to elicit uncertainty about univariate continuous random variables. It is designed to be used within an application context and extends the existing SHELF method, adding a web interface and access to metadata. The tool is developed so that it can be readily integrated with environmental models exposed as web services. The second example was developed for the TREES-3 initiative, which monitors tropical landcover change through ground-truthing at confluence points. It allows experts to validate the accuracy of automated landcover classifications using site-specific imagery and local knowledge. Experts may provide uncertainty information at various levels: from a general rating of their confidence in a site validation to a numerical ranking of the possible landcover types within a segment. A key challenge in the web-based setting is the design of the user interface and the method of interaction between the problem owner and the problem experts. We show the workflow of the elicitation tool, and show how we can represent the final elicited distributions and confusion matrices using UncertML, ready for integration into uncertainty-enabled workflows. We also show how the metadata associated with the elicitation exercise is captured and can be referenced from the elicited result, providing crucial lineage information and thus traceability in the decision-making process.
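One concrete step in any such elicitation pipeline is turning an expert's judgements into a probability distribution. The sketch below fits a normal distribution from an elicited median and 90th percentile and pools two experts by averaging the fitted parameters; SHELF supports richer distribution families and pooling rules, so treat the numbers, the normal assumption, and the simple averaging as illustrative choices:

```python
# Sketch: convert elicited quantile judgements about an uncertain model
# input into a fitted normal distribution, then pool two experts.
# Values, the normality assumption, and parameter averaging are illustrative.

Z90 = 1.2815515655446004  # standard normal 90% quantile

def fit_normal(median, p90):
    """Normal with the given median and 90th percentile."""
    mu = median
    sigma = (p90 - median) / Z90
    return mu, sigma

# (median, 90th percentile) elicited from two hypothetical experts
experts = [(10.0, 14.0), (12.0, 15.0)]
fits = [fit_normal(m, q) for m, q in experts]
mu = sum(f[0] for f in fits) / len(fits)
sigma = sum(f[1] for f in fits) / len(fits)
```

The resulting (mu, sigma) pair is what would then be serialized, e.g. as an UncertML distribution, for downstream uncertainty-enabled workflows.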
Abstract:
For the first time, we report full numerical NLSE-based modeling of the generation properties of a random distributed feedback (DFB) fiber laser based on Rayleigh scattering. The model, which takes the random backscattering into account only through its average strength, describes well the power and spectral properties of random DFB fiber lasers. The influence of dispersion and nonlinearity on spectral and statistical properties is investigated, and evidence of non-Gaussian intensity statistics is found. © 2013 Optical Society of America.
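Numerical NLSE propagation is usually done with the split-step Fourier method: the nonlinear term is applied as a phase rotation in the time domain and the dispersion term as a phase factor in the frequency domain. The toy integrator below uses a naive O(N^2) DFT to stay dependency-free; grid size and coefficients are illustrative, not the laser parameters, and the gain/backscattering terms of the full model are omitted:

```python
import cmath, math

# Toy split-step integrator for the scalar NLSE
#   i dA/dz = (beta2/2) d2A/dt2 - gamma |A|^2 A
# on a small periodic grid. Naive DFT keeps the sketch self-contained.

N, dt, dz = 32, 0.25, 0.01
beta2, gamma = -1.0, 1.0

def dft(a):
    return [sum(a[n] * cmath.exp(-2j * math.pi * k * n / N) for n in range(N))
            for k in range(N)]

def idft(A):
    return [sum(A[k] * cmath.exp(2j * math.pi * k * n / N) for k in range(N)) / N
            for n in range(N)]

# angular frequencies with the usual wraparound ordering
omega = [2 * math.pi * (k if k <= N // 2 else k - N) / (N * dt) for k in range(N)]

def split_step(A):
    # nonlinear step: pure phase rotation by gamma |A|^2 dz
    A = [a * cmath.exp(1j * gamma * abs(a) ** 2 * dz) for a in A]
    # linear step: dispersion phase applied in the Fourier domain
    S = dft(A)
    S = [s * cmath.exp(1j * 0.5 * beta2 * w ** 2 * dz) for s, w in zip(S, omega)]
    return idft(S)

A = [cmath.exp(-((n - N / 2) * dt) ** 2) for n in range(N)]  # Gaussian pulse
p0 = sum(abs(a) ** 2 for a in A)
for _ in range(50):
    A = split_step(A)
p1 = sum(abs(a) ** 2 for a in A)
```

Both sub-steps are phase-only, so total power is conserved, a useful sanity check before adding the gain and randomized backscattering terms that make the laser model interesting. Production codes use an FFT for the transforms.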
Abstract:
Motivation: Within bioinformatics, the textual alignment of amino acid sequences has long dominated the determination of similarity between proteins, with all that implies for shared structure, function, and evolutionary descent. Despite the relative success of modern-day sequence alignment algorithms, so-called alignment-free approaches offer a complementary means of determining and expressing similarity, with potential benefits in certain key applications, such as regression analysis of protein structure-function studies, where alignment-based similarity has performed poorly. Results: Here, we offer a fresh, statistical physics-based perspective focusing on the question of alignment-free comparison, in the process adapting results on the "first passage probability distribution" to summarize statistics of ensemble-averaged amino acid propensity values. In this paper, we introduce and elaborate this approach.
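For readers unfamiliar with first-passage statistics, the basic object is the distribution of the time at which a stochastic process first crosses a threshold. The sketch below computes it exactly, by dynamic programming, for a symmetric ±1 random walk; the paper's use of these distributions over sequence propensity values is far richer, so this is only an illustration of the underlying quantity:

```python
# Exact first-passage-time distribution for a symmetric +/-1 random walk
# started at 0, absorbed on first hitting `barrier`. Computed by propagating
# the probability mass over positions (dynamic programming); no sampling.

def first_passage(barrier=2, steps=10):
    """Return P(first hit of `barrier` occurs at step k) for k = 1..steps."""
    dist = {0: 1.0}              # position -> unabsorbed probability mass
    fpt = []
    for _ in range(steps):
        new, hit = {}, 0.0
        for x, p in dist.items():
            for nx in (x - 1, x + 1):
                if nx == barrier:
                    hit += 0.5 * p          # mass absorbed this step
                else:
                    new[nx] = new.get(nx, 0.0) + 0.5 * p
        dist = new
        fpt.append(hit)
    return fpt

fpt = first_passage()
```

For a barrier at +2 the walk cannot arrive in one step, and arrives in exactly two steps with probability 1/4; the tail mass that never arrives within the horizon is what gives first-passage distributions their characteristic heavy tails.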
Abstract:
The ability to predict the properties of magnetic materials in a device is essential to ensuring the correct operation and optimization of the design as well as the device behavior over a wide range of input frequencies. Typically, development and simulation of wide-bandwidth models requires detailed, physics-based simulations that utilize significant computational resources. Balancing the trade-offs between model computational overhead and accuracy can be cumbersome, especially when the nonlinear effects of saturation and hysteresis are included in the model. This study focuses on the development of a system for analyzing magnetic devices in cases where model accuracy and computational intensity must be carefully and easily balanced by the engineer. A method for adjusting model complexity and corresponding level of detail while incorporating the nonlinear effects of hysteresis is presented that builds upon recent work in loss analysis and magnetic equivalent circuit (MEC) modeling. The approach utilizes MEC models in conjunction with linearization and model-order reduction techniques to process magnetic devices based on geometry and core type. The validity of steady-state permeability approximations is also discussed.
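At its simplest, a magnetic equivalent circuit treats flux paths like a resistive network: each path contributes a reluctance, a winding contributes a magnetomotive force, and flux follows from the network solution. The geometry, material values, and the single-loop topology below are illustrative placeholders, and the sketch is linear (no saturation or hysteresis, which is exactly what the study layers on top):

```python
# Magnetic-equivalent-circuit sketch: a core with an air gap modeled as
# two reluctances in series, driven by a winding MMF. Linear materials;
# all geometry and material numbers are hypothetical.

MU0 = 4e-7 * 3.141592653589793   # permeability of free space, H/m

def reluctance(length, area, mu_r=1.0):
    """Reluctance of a uniform flux path, R = l / (mu0 * mu_r * A)."""
    return length / (MU0 * mu_r * area)

N = 100                          # winding turns
i = 2.0                          # winding current, A
A = 1e-4                         # cross-sectional area, m^2

R_core = reluctance(0.2, A, mu_r=2000.0)   # 20 cm core path
R_gap = reluctance(1e-3, A)                # 1 mm air gap
R_total = R_core + R_gap

flux = N * i / R_total           # MMF / reluctance, in Wb
L = N ** 2 / R_total             # winding inductance, H
```

Even in this toy case the air gap dominates the total reluctance, which is why gap modeling accuracy matters so much; nonlinear MEC models replace the constant core permeability with a field-dependent one, and the linearization/model-order-reduction step in the study trades that detail against simulation cost.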
Abstract:
The analysis of steel and composite frames has traditionally been carried out by idealizing beam-to-column connections as either rigid or pinned. Although some advanced analysis methods have been proposed to account for semi-rigid connections, the performance of these methods strongly depends on the proper modeling of connection behavior. The primary challenge of modeling beam-to-column connections is their inelastic response and continuously varying stiffness, strength, and ductility. In this dissertation, two distinct approaches—mathematical models and informational models—are proposed to account for the complex hysteretic behavior of beam-to-column connections. The performance of the two approaches is examined and is then followed by a discussion of their merits and deficiencies. To capitalize on the merits of both mathematical and informational representations, a new approach, a hybrid modeling framework, is developed and demonstrated through modeling beam-to-column connections. Component-based modeling is a compromise spanning two extremes in the field of mathematical modeling: simplified global models and finite element models. In the component-based modeling of angle connections, the five critical components of excessive deformation are identified. Constitutive relationships of angles, column panel zones, and contact between angles and column flanges, are derived by using only material and geometric properties and theoretical mechanics considerations. Those of slip and bolt hole ovalization are simplified by empirically-suggested mathematical representation and expert opinions. A mathematical model is then assembled as a macro-element by combining rigid bars and springs that represent the constitutive relationship of components. Lastly, the moment-rotation curves of the mathematical models are compared with those of experimental tests. 
In the case of a top-and-seat angle connection with double web angles, a pinched hysteretic response is predicted quite well by complete mechanical models, which take advantage of only material and geometric properties. On the other hand, to exhibit the highly pinched behavior of a top-and-seat angle connection without web angles, a mathematical model requires components for slip and bolt hole ovalization, which are more amenable to informational modeling. An alternative method is informational modeling, which constitutes a fundamental shift from mathematical equations to data that contain the required information about the underlying mechanics. The information is extracted from observed data and stored in neural networks. Two different training data sets, analytically generated and experimental data, are tested to examine the performance of informational models. Both informational models show acceptable agreement with the moment-rotation curves of the experiments. Adding a degradation parameter improves the informational models when modeling highly pinched hysteretic behavior. However, informational models cannot represent the contribution of individual components and therefore do not provide insight into the underlying mechanics of components. In this study, a new hybrid modeling framework is proposed in which a conventional mathematical model is complemented by informational methods. The basic premise of the proposed hybrid methodology is that not all features of system response are amenable to mathematical modeling, hence the consideration of informational alternatives. This may be because (i) the underlying theory is not available or not sufficiently developed, or (ii) the existing theory is too complex and therefore not suitable for modeling within building frame analysis. The role of informational methods is to model aspects that the mathematical model leaves out.
The autoprogressive algorithm and self-learning simulation extract the missing aspects from a system response. In the hybrid framework, experimental data are an integral part of modeling, rather than being used strictly for validation. The potential of the hybrid methodology is illustrated through modeling the complex hysteretic behavior of beam-to-column connections. Mechanics-based components of deformation, such as angles, flange-plates, and the column panel zone, are idealized in a mathematical model by using a complete mechanical approach. Although the mathematical model represents the envelope curves in terms of initial stiffness and yield strength, it is not capable of capturing the pinching effects. Pinching is caused mainly by separation between angles and column flanges as well as slip between angles/flange-plates and beam flanges. These components of deformation are suitable for informational modeling. Finally, the moment-rotation curves of the hybrid models are validated against those of the experimental tests. The comparison shows that the hybrid models are capable of representing the highly pinched hysteretic behavior of beam-to-column connections. In addition, the developed hybrid model is successfully used to predict the behavior of a newly designed connection.
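The elastic skeleton of a component-based macro-element can be sketched very simply: each component (panel zone, angle, slip mechanism) contributes a spring, and springs acting in series add flexibilities. The stiffness values and component names below are hypothetical, and real component models are of course nonlinear and hysteretic rather than the linear springs shown here:

```python
# Elastic sketch of component-based assembly for a beam-to-column
# connection: components in series, so flexibilities (1/k) add.
# Stiffness values are invented for illustration, not from the tests.

def series_stiffness(ks):
    """Equivalent stiffness of springs in series."""
    return 1.0 / sum(1.0 / k for k in ks)

def rotation(moment, ks):
    """Elastic rotation of the macro-element under a given moment."""
    return moment / series_stiffness(ks)

components = {"panel_zone": 5.0e4, "top_angle": 2.0e4, "slip": 8.0e4}
k_conn = series_stiffness(components.values())   # kN*m/rad
theta = rotation(100.0, components.values())     # rad at M = 100 kN*m
```

The series assembly makes the connection softer than its softest component, and it is exactly at the component level (here, the "slip" spring) that the hybrid framework substitutes an informational model for a mechanics-based one.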
Abstract:
In Part 1 of this thesis, we propose that biochemical cooperativity is a fundamentally non-ideal process. We show quantal effects underlying biochemical cooperativity and highlight apparent ergodic breaking at small volumes. The apparent ergodic breaking manifests itself in a divergence of deterministic and stochastic models. We further predict that this divergence of deterministic and stochastic results is a failure of the deterministic methods rather than an issue of stochastic simulations.
Ergodic breaking at small volumes may allow these molecular complexes to function as switches to a greater degree than has previously been shown. We propose that this ergodic breaking is a phenomenon that the synapse might exploit to differentiate Ca²⁺ signaling that would lead to either the strengthening or weakening of a synapse. Techniques such as lattice-based statistics and rule-based modeling are tools that allow us to confront this non-ideality directly. A natural next step to understanding the chemical physics that underlies these processes is to consider in silico methods, specifically atomistic simulation methods, that might augment our modeling efforts.
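The workhorse for the stochastic side of such deterministic-versus-stochastic comparisons is Gillespie's stochastic simulation algorithm, which simulates exact reaction trajectories at integer copy numbers. The sketch below applies it to a reversible binding reaction at very small copy numbers, the regime discussed above; the species, rates, and copy numbers are illustrative, not the thesis's Ca²⁺ signaling network:

```python
import random

# Minimal Gillespie (stochastic simulation algorithm) sketch for
# reversible binding X + Y <-> XY at small copy numbers.
# Rates and copy numbers are illustrative placeholders.

def gillespie(x, y, xy, kf, kr, t_end, rng):
    t = 0.0
    while True:
        a1 = kf * x * y           # binding propensity
        a2 = kr * xy              # unbinding propensity
        a0 = a1 + a2
        if a0 == 0.0:
            return x, y, xy
        t += rng.expovariate(a0)  # exponentially distributed waiting time
        if t > t_end:
            return x, y, xy
        if rng.random() * a0 < a1:
            x, y, xy = x - 1, y - 1, xy + 1
        else:
            x, y, xy = x + 1, y + 1, xy - 1

rng = random.Random(0)
samples = [gillespie(5, 5, 0, 1.0, 1.0, 50.0, rng)[2] for _ in range(200)]
mean_xy = sum(samples) / len(samples)
```

With only five copies of each species, the full distribution of XY matters, and its mean need not coincide with the deterministic mass-action fixed point, which is the kind of divergence the thesis attributes to failure of the deterministic description rather than of the stochastic one.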
In the second part of this thesis, we use evolutionary algorithms to optimize in silico methods that might be used to describe biochemical processes at the subcellular and molecular levels. While we have applied evolutionary algorithms to several methods, this thesis focuses on the optimization of charge equilibration methods. Accurate charges are essential to understanding the electrostatic interactions involved in ligand binding, as frequently discussed in the first part of this thesis.
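The flavor of such an optimization can be sketched with a toy charge-equilibration model. For a diatomic with charges +q/-q, minimizing E(q) = (χ1 − χ2)q + ½(J1 + J2)q² gives q* = (χ2 − χ1)/(J1 + J2); a simple (1+λ) evolutionary strategy then tunes the electronegativity and hardness parameters so that the model charges match reference values. The reference charges, scaling factors, mutation scheme, and parameter ranges below are all invented for illustration and are not the thesis's method or data:

```python
import random

# Toy (1+lambda) evolutionary strategy tuning parameters of a minimal
# charge-equilibration model: q* = (chi2 - chi1) / (J1 + J2) for a
# diatomic, scaled per "molecule". All targets and values are made up.

Q_REF = [0.4, -0.3, 0.25]        # reference charges for 3 training cases
SCALE = [1.0, -0.8, 0.6]         # per-case electronegativity scaling

def model_charge(params, scale):
    chi1, chi2, J1, J2 = params
    return scale * (chi2 - chi1) / (J1 + J2)

def fitness(params):
    """Sum of squared errors against the reference charges (lower is better)."""
    return sum((model_charge(params, s) - q) ** 2 for s, q in zip(SCALE, Q_REF))

rng = random.Random(1)
best = [0.0, 1.0, 5.0, 5.0]      # initial (chi1, chi2, J1, J2)
best_f = fitness(best)
for _ in range(500):
    for _ in range(8):                           # lambda = 8 offspring
        child = [p + rng.gauss(0.0, 0.1) for p in best]
        if child[2] + child[3] <= 0.1:           # keep total hardness positive
            continue
        f = fitness(child)
        if f < best_f:                           # greedy (1+lambda) selection
            best, best_f = child, f
```

Real charge-equilibration fitting replaces this scalar toy with full per-atom electronegativity equalization under a total-charge constraint, but the evolutionary loop (mutate, evaluate against reference charges, keep the best) has the same structure.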