930 results for Lattice theory - Computer simulation


Relevance:

100.00%

Publisher:

Abstract:

The physics of self-organization and complexity is manifested on a variety of biological scales, from large ecosystems to the molecular level. Protein molecules exhibit characteristics of complex systems in terms of their structure, dynamics, and function. Proteins have the extraordinary ability to fold to a specific functional three-dimensional shape, starting from a random coil, in a biologically relevant time. How they accomplish this is one of the secrets of life. In this work, theoretical research into understanding this remarkable behavior is discussed. Thermodynamic and statistical mechanical tools are used to investigate protein folding dynamics and stability. Theoretical analyses of the results from computer simulation of the dynamics of a four-helix bundle show that excluded-volume entropic effects are very important in protein dynamics and crucial for protein stability. The dramatic effects of changing the size of sidechains imply that strategic placement of amino acid residues of a particular size may be an important consideration in protein engineering. Another investigation deals with modeling protein structural transitions as a phase transition. Using finite-size scaling theory, the nature of the unfolding transition of a four-helix bundle protein was investigated, and critical exponents for the transition were calculated for various hydrophobic strengths in the core. It is found that the order of the transition changes from first order to higher order as the strength of the hydrophobic interaction in the core region is significantly increased. Finally, a detailed kinetic and thermodynamic analysis was carried out on a model two-helix bundle. The connection between the structural free-energy landscape and folding kinetics was quantified. I show how simple protein engineering, by changing the hydropathy of a small number of amino acids, can enhance protein folding by significantly changing the free-energy landscape so that kinetic traps are removed. The results have general applicability in protein engineering as well as in understanding the underlying physical mechanisms of protein folding.
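For context on the finite-size scaling analysis mentioned above, a standard criterion (stated here as general background, not as the thesis's specific scaling ansatz) distinguishes first-order from continuous transitions by how the specific-heat maximum grows with system size N:

```latex
% Standard finite-size scaling criterion (general background, not the
% thesis's specific analysis): for a system of N residues,
\begin{equation}
  C_{\max}(N) \sim
  \begin{cases}
    N, & \text{first-order transition},\\[2pt]
    N^{\alpha/(\nu d)}, & \text{continuous transition},
  \end{cases}
\end{equation}
% so the exponent extracted from C_max versus N indicates the order of the
% unfolding transition.
```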

Relevance:

100.00%

Publisher:

Abstract:

The goal of modern radiotherapy is to precisely deliver a prescribed radiation dose to delineated target volumes that contain a significant amount of tumor cells while sparing the surrounding healthy tissues and organs. Precise delineation of treatment and avoidance volumes is the key to precision radiation therapy. In recent years, considerable clinical and research effort has been devoted to integrating MRI into the radiotherapy workflow, motivated by its superior soft tissue contrast and the possibility of functional imaging. Dynamic contrast-enhanced MRI (DCE-MRI) is a noninvasive technique that measures properties of tissue microvasculature. Its sensitivity to radiation-induced vascular pharmacokinetic (PK) changes has been preliminarily demonstrated. In spite of its great potential, two major challenges have limited DCE-MRI's clinical application in radiotherapy assessment: the technical limitations of accurate DCE-MRI implementation and the need for novel DCE-MRI data analysis methods that extract richer functional heterogeneity information.

This study aims at improving current DCE-MRI techniques and developing new DCE-MRI analysis methods specifically for radiotherapy assessment. The study is therefore divided into two parts. The first part focuses on DCE-MRI temporal resolution as one of the key DCE-MRI technical factors and proposes improvements to it; the second part explores the potential value of image heterogeneity analysis and of combining multiple PK models for therapeutic response assessment, and develops several novel DCE-MRI data analysis methods.

I. Improvement of DCE-MRI temporal resolution. First, the feasibility of improving DCE-MRI temporal resolution via image undersampling was studied. Specifically, a novel iterative MR image reconstruction algorithm was studied for DCE-MRI reconstruction. This algorithm was built on the recently developed compressed sensing (CS) theory. By using a limited k-space acquisition with a shorter imaging time, images can be reconstructed in an iterative fashion under the regularization of a newly proposed total generalized variation (TGV) penalty term. In a retrospective, IRB-approved study of DCE-MRI scans from brain radiosurgery patients, the clinically obtained image data were selected as reference data, and the simulated accelerated k-space acquisition was generated by undersampling the full k-space of the reference images with designed sampling grids. Two undersampling strategies were proposed: 1) a radial multi-ray grid with a special angular distribution was adopted to sample each slice of the full k-space; 2) a Cartesian random sampling grid series with spatiotemporal constraints from adjacent frames was adopted to sample the dynamic k-space series at a slice location. Two sets of PK parameter maps were generated, one from the undersampled data and one from the fully sampled data. Multiple quantitative measurements and statistical studies were performed to evaluate the accuracy of the PK maps generated from the undersampled data relative to the PK maps generated from the fully sampled data. Results showed that at a simulated acceleration factor of four, PK maps could be faithfully calculated from the DCE images reconstructed using undersampled data, and no statistically significant differences were found between the regional PK mean values from the undersampled and fully sampled data sets. These results suggest that DCE-MRI acceleration using the investigated image reconstruction method is feasible and promising.
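As general background for the reconstruction step described above, a TGV-regularized compressed-sensing reconstruction is commonly written as the following optimization problem; this is a generic formulation, not necessarily the exact objective or notation used in this work:

```latex
% Generic CS reconstruction with a second-order TGV penalty (background only):
\begin{equation}
  \hat{u} = \arg\min_{u}\; \tfrac{1}{2}\bigl\lVert \mathcal{S}\mathcal{F}u - y \bigr\rVert_2^2
            + \mathrm{TGV}^{2}_{\alpha}(u),
  \qquad
  \mathrm{TGV}^{2}_{\alpha}(u) = \min_{w}\; \alpha_1 \lVert \nabla u - w \rVert_1
            + \alpha_0 \lVert \mathcal{E}(w) \rVert_1,
\end{equation}
% where \mathcal{F} is the Fourier transform, \mathcal{S} the undersampling
% mask, y the acquired k-space samples, and \mathcal{E} the symmetrized
% derivative of the auxiliary field w.
```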

Second, for high-temporal-resolution DCE-MRI, a new PK model fitting method was developed to solve for the PK parameters with better accuracy and efficiency. The method is based on a derivative-based reformulation of the commonly used Tofts PK model, which is conventionally written as an integral expression. It also applies a Kolmogorov-Zurbenko (KZ) filter to suppress noise in the data and solves for the PK parameters as a linear problem in matrix form. In a computer simulation study, PK parameters representing typical intracranial values were selected as references to simulate DCE-MRI data at different temporal resolutions and noise levels. Results showed that at both high temporal resolutions (<1 s) and clinically feasible temporal resolution (~5 s), the new method calculated PK parameters more accurately than current calculation methods at clinically relevant noise levels; at high temporal resolutions, its calculation efficiency exceeded that of current methods by roughly two orders of magnitude. In a retrospective study of clinical brain DCE-MRI scans, the PK maps derived from the proposed method were comparable with the results from current methods. Based on these results, it can be concluded that the new method enables accurate and efficient PK model fitting for high-temporal-resolution DCE-MRI.
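For illustration of the linear-algebraic PK fitting idea, the sketch below fits the standard Tofts model with a widely used integral (linear least-squares) formulation; the function name, the toy arterial input function, and the parameter values are hypothetical, and this is not the derivative-based, KZ-filtered method developed in the thesis:

```python
import numpy as np
from scipy.integrate import cumulative_trapezoid

def fit_tofts_linear(t, cp, ct):
    """Fit the standard Tofts model via its integral form
        Ct(t) = Ktrans * int_0^t Cp d tau  -  kep * int_0^t Ct d tau,
    solved as a linear least-squares problem. Returns (Ktrans, kep)."""
    int_cp = cumulative_trapezoid(cp, t, initial=0.0)
    int_ct = cumulative_trapezoid(ct, t, initial=0.0)
    A = np.column_stack([int_cp, -int_ct])
    (ktrans, kep), *_ = np.linalg.lstsq(A, ct, rcond=None)
    return ktrans, kep

# Synthetic check with hypothetical values (not patient data).
t = np.linspace(0.0, 300.0, 601)                 # seconds, 0.5 s sampling
cp = 5.0 * t * np.exp(-t / 60.0) / 60.0          # toy arterial input function
ktrans_true, kep_true = 0.25 / 60.0, 0.8 / 60.0  # per second
ct = np.array([np.trapz(ktrans_true * cp[:i + 1] * np.exp(-kep_true * (t[i] - t[:i + 1])), t[:i + 1])
               for i in range(t.size)])
print(fit_tofts_linear(t, cp, ct))               # should be close to the true values
```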

II. Development of DCE-MRI analysis methods for therapeutic response assessment. This part pursues methodological development along two approaches. The first is to develop model-free analysis methods for evaluating DCE-MRI functional heterogeneity, motivated by the rationale that radiotherapy-induced functional change can be heterogeneous across the treatment area. The first effort was a translational investigation of classic fractal dimension theory for DCE-MRI therapeutic response assessment. In a small-animal anti-angiogenesis drug therapy experiment, randomly assigned treatment and control groups received multiple treatment fractions, with one pre-treatment and multiple post-treatment high-spatiotemporal-resolution DCE-MRI scans. In the post-treatment scan two weeks after the start of treatment, the investigated Rényi dimensions of the classic PK rate constant map showed significant differences between the treatment and control groups; when Rényi dimensions were used for treatment/control group classification, the achieved accuracy was higher than that obtained with conventional PK parameter statistics. Following this pilot work, two novel texture analysis methods were proposed. First, a new technique called the Gray Level Local Power Matrix (GLLPM) was developed. It addresses the lack of temporal information and the poor computational efficiency of the commonly used Gray Level Co-Occurrence Matrix (GLCOM) techniques. In the same small-animal experiment, the dynamic curves of Haralick texture features derived from the GLLPM performed better overall in treatment/control separation and classification than the corresponding curves derived from current GLCOM techniques. The second method developed is dynamic Fractal Signature Dissimilarity (FSD) analysis. Inspired by classic fractal dimension theory, this method quantitatively measures the dynamics of tumor heterogeneity on DCE images during contrast agent uptake. In the same small-animal experiment, selected parameters from dynamic FSD analysis showed significant differences between treatment and control groups as early as after one treatment fraction; in contrast, metrics from conventional PK analysis showed significant differences only after three treatment fractions. Treatment/control group classification after the first treatment fraction was also better with dynamic FSD parameters than with conventional PK statistics. These results suggest that this novel method is promising for capturing early therapeutic response.
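For reference, the Rényi (generalized) dimensions used above are standard quantities, defined for a parameter map partitioned into boxes of size ε as:

```latex
% Standard definition of the Renyi (generalized) dimensions of a map whose
% intensity in box i of size \epsilon carries the fraction p_i(\epsilon) of
% the total:
\begin{equation}
  D_q = \lim_{\epsilon \to 0}\, \frac{1}{q-1}\,
        \frac{\ln \sum_i p_i(\epsilon)^{\,q}}{\ln \epsilon},
\end{equation}
% with D_0, D_1 (the q -> 1 limit), and D_2 corresponding to the box-counting,
% information, and correlation dimensions, respectively.
```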

The second approach to developing novel DCE-MRI methods is to combine PK information from multiple PK models. Currently, the classic Tofts model or its alternative version is widely adopted for DCE-MRI analysis as a gold-standard approach for therapeutic response assessment. Previously, a shutter-speed (SS) model was proposed to incorporate the transcytolemmal water exchange effect into contrast agent concentration quantification. Despite its richer biological assumptions, its application in therapeutic response assessment has been limited. It is therefore intriguing to combine information from the SS model and the classic Tofts model to explore potential new biological information for treatment assessment. The feasibility of this idea was investigated in the same small-animal experiment. The SS model was compared with the Tofts model for therapeutic response assessment using regional mean values of the PK parameters. Based on the modeled transcytolemmal water exchange rate, a biological subvolume was proposed and automatically identified using histogram analysis. Within the biological subvolume, the PK rate constant derived from the SS model proved superior to the one from the Tofts model in treatment/control separation and classification. Furthermore, novel biomarkers were designed to integrate the PK rate constants from the two models. When evaluated within the biological subvolume, these biomarkers reflected significant treatment/control differences in both post-treatment evaluations. These results confirm the potential value of the SS model, as well as its combination with the Tofts model, for therapeutic response assessment.

In summary, this study addressed two problems in the application of DCE-MRI to radiotherapy assessment. In the first part, a method of accelerating DCE-MRI acquisition for better temporal resolution was investigated, and a novel PK model fitting algorithm was proposed for high-temporal-resolution DCE-MRI. In the second part, two model-free texture analysis methods and a multiple-model analysis method were developed for DCE-MRI therapeutic response assessment. The presented work could benefit future routine clinical application of DCE-MRI in radiotherapy assessment.

Relevance:

100.00%

Publisher:

Abstract:

The last two decades have seen a proliferation of research frameworks that emphasise the importance of understanding adaptive processes that happen at different levels. We contribute to this growing body of literature by exploring how cultural (mal)adaptive dynamics relate to multilevel social-ecological processes occurring at different scales, where the lower levels combine into new units with new organizations, functions, and emergent properties or collective behaviors. After a brief review of the concept of “cultural adaptation” from the perspective of cultural evolutionary theory, the core of the paper is constructed around the exploration of multilevel processes occurring at the temporal, spatial, social, and political scales. We do so by using insights from cultural evolutionary theory and by examining small-scale societies as case studies. In each section, we discuss the importance of the selected scale for understanding cultural adaptation and then present an example that illustrates how multilevel processes in the selected scale help explain observed patterns in the cultural adaptive process. The last section of the paper discusses the potential of modeling and computer simulation for studying multilevel processes in cultural adaptation. We conclude by highlighting how elements from cultural evolutionary theory might enrich the multilevel process discussion in resilience theory.

Relevance:

100.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevance:

100.00%

Publisher:

Abstract:

This thesis aims to describe and demonstrate a concept developed to facilitate the use of thermal simulation tools during the building design process. Despite the impact of architectural elements on the performance of buildings, some influential decisions are frequently based solely on qualitative information. Even though such design support is adequate for most decisions, the designer will eventually have doubts concerning the performance of some design decisions. These situations require some kind of additional knowledge to be properly approached. The concept of designerly ways of simulating focuses on the formulation and solution of design dilemmas, which are doubts about the design that cannot be fully understood or solved without quantitative information. The concept intends to combine the analytical power of computer simulation tools with the architect's capacity for synthesis. Three types of simulation tools are considered: solar analysis, thermal/energy simulation, and CFD. Design dilemmas are formulated and framed according to the architect's reflection process on performance aspects. Throughout the thesis, the problem is investigated in three fields: professional, technical, and theoretical. This approach to distinct parts of the problem aimed to i) characterize different professional categories with regard to their design practice and use of tools, ii) review preceding research on the use of simulation tools, and iii) draw analogies between the proposed concept and concepts developed or described in previous works on design theory. The proposed concept was tested on eight design dilemmas extracted from three case studies in the Netherlands, all houses designed by Dutch architectural firms. Relevant information and criteria for each case study were obtained through interviews and conversations with the architects involved. The practical application, despite its success in the research context, revealed some limitations on the applicability of the concept, concerning the architects' need for technical knowledge and the current stage of evolution of simulation tools.

Relevance:

100.00%

Publisher:

Abstract:

We apply Agent-Based Modeling and Simulation (ABMS) to investigate a set of problems in a retail context. Specifically, we are working to understand the relationship between human resource management practices and retail productivity. Despite the fact that we are working within a relatively novel and complex domain, it is clear that intelligent agents offer potential for developing organizational capabilities in the future. Our multi-disciplinary research team has worked with a UK department store to collect data and capture perceptions about operations from actors within departments. Based on this case study work, we have built a simulator, which we present in this paper. We then use the simulator to gather empirical evidence regarding two specific management practices: empowerment and employee development.

Relevance:

100.00%

Publisher:

Abstract:

In our research we investigate the output accuracy of discrete event simulation models and agent based simulation models when studying human-centric complex systems. In this paper we focus on human reactive behaviour, as it can be implemented in both modelling approaches using standard methods. As a case study we have chosen the retail sector, in particular the operation of the fitting room in the womenswear department of a large UK department store. In our case study we looked at ways of determining the efficiency of implementing new management policies for the fitting room operation by modelling the reactive behaviour of the department's staff and customers. First, we carried out a validation experiment in which we compared the results from our models to the performance of the real system. This experiment also allowed us to establish differences in output accuracy between the two modelling methods. In a second step, a multi-scenario experiment was carried out to study the behaviour of the models when they are used for the purpose of operational improvement. Overall we have found that, for our case study example, both discrete event simulation and agent based simulation have the same potential to support the investigation into the efficiency of implementing new management policies.
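As a minimal illustration of the discrete event simulation side of such a study, the sketch below models a fitting room as a queued resource using the SimPy library; the arrival rate, try-on time, and number of rooms are invented placeholders and do not come from the department store case study:

```python
import random
import simpy

def customer(env, fitting_rooms, waits):
    arrived = env.now
    with fitting_rooms.request() as req:      # queue for a free fitting room
        yield req
        waits.append(env.now - arrived)
        yield env.timeout(random.expovariate(1 / 7.0))   # ~7 min try-on time

def arrivals(env, fitting_rooms, waits):
    while True:
        yield env.timeout(random.expovariate(1 / 3.0))   # a customer ~every 3 min
        env.process(customer(env, fitting_rooms, waits))

random.seed(42)
env = simpy.Environment()
rooms = simpy.Resource(env, capacity=4)       # hypothetical: 4 fitting rooms
waits = []
env.process(arrivals(env, rooms, waits))
env.run(until=480)                            # one 8-hour trading day, in minutes
print(f"mean wait: {sum(waits) / len(waits):.2f} min over {len(waits)} customers")
```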


Relevance:

100.00%

Publisher:

Abstract:

Computer simulation programs are essential tools for scientists and engineers seeking to understand a particular system of interest. As expected, the complexity of the software increases with the depth of the model used. In addition to the exigent demands of software engineering, verification of simulation programs is especially challenging because the models represented are complex and ridden with unknowns that developers discover in an iterative process. To manage such complexity, advanced verification techniques that continually match the intended model to the implemented model are necessary. Therefore, the main goal of this research work is to design a useful verification and validation framework that is able to identify model representation errors and is applicable to generic simulators. The framework that was developed and implemented consists of two parts. The first part is the First-Order Logic Constraint Specification Language (FOLCSL), which enables users to specify the invariants of a model under consideration. From the first-order logic specification, the FOLCSL translator automatically synthesizes a verification program that reads the event trace generated by a simulator and signals whether all invariants are respected. The second part consists of mining the temporal flow of events using a newly developed representation called the State Flow Temporal Analysis Graph (SFTAG). While the first part seeks an assurance of implementation correctness by checking that the model invariants hold, the second part derives an extended model of the implementation and hence enables a deeper understanding of what was implemented. The main application studied in this work is the validation of the timing behavior of micro-architecture simulators. The study includes SFTAGs generated for a wide set of benchmark programs and their analysis using several artificial intelligence algorithms. This work improves the computer architecture research and verification processes, as shown by the case studies and experiments that have been conducted.
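As an illustration of the kind of trace check the first part performs, the sketch below hand-codes one invariant ("every request is eventually matched by a completion") over an event trace; the real FOLCSL translator synthesizes such checkers automatically from a first-order logic specification, so this is only a stand-in with hypothetical event names:

```python
from typing import Iterable, Tuple

Event = Tuple[float, str, int]   # (timestamp, event kind, transaction id)

def every_request_completes(trace: Iterable[Event]) -> bool:
    """Check the invariant: each 'request' is eventually followed by a
    'complete' with the same transaction id, and no orphan completions occur."""
    pending = set()
    for _, kind, tid in trace:
        if kind == "request":
            pending.add(tid)
        elif kind == "complete":
            if tid not in pending:
                return False          # completion without a matching request
            pending.discard(tid)
    return not pending                # all requests were eventually completed

trace = [(0.0, "request", 1), (1.5, "request", 2),
         (2.0, "complete", 1), (3.1, "complete", 2)]
print(every_request_completes(trace))  # True
```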

Relevance:

100.00%

Publisher:

Abstract:

Management theories have been based, almost without exception, on the foundations and models of classical science (particularly the models of Newtonian physics). However, organizations today face a globalized world that is saturated with information (though not necessarily with knowledge), hyperconnected, dynamic, and loaded with uncertainty, so many of these theories may show limitations for organizations. And perhaps not because of their structure, logic, or scope, but because of the lack of criteria justifying their application. In many cases, organizations still rely on intuition, assumptions, and half-truths when making decisions. This picture highlights two facts: on the one hand, the need to find a method that makes it possible to understand each organization's situation in order to support decision making; on the other, the need to strengthen intuition with non-traditional models and techniques (usually originating in, or inspired by, engineering). This work seeks to anticipate the pillars of a possible method to support decision making through the simulation of computational models, drawing on the possible interactions between model-based management, computational organization science, and emergent engineering.

Relevance:

100.00%

Publisher:

Abstract:

This study evaluated the biomechanics of the mandible in the posterior dentition using experimental and computational analyses. The analyses were performed on a model of a human mandible, built in epoxy resin for the photoelastic analysis and by computer-aided design for the finite element analysis. To standardize the evaluation, specific areas were defined on the lateral surface of the mandibular body. The photoelastic analysis was configured with a vertical load on the first upper molar and a fixed support at the ramus of the mandible. The same configuration was used in the computer simulation. Force magnitudes of 50, 100, 150, and 200 N were applied to evaluate the bone stress. The stress results showed a similar distribution in both analyses, with the most intense stress at the retromolar area and along the oblique line and alveolar process at the molar level. This study demonstrated the similarity of the results of the experimental and computational analyses and thus highlighted the importance of biomechanical characterization of the morphology of the posterior dentition.

Relevance:

100.00%

Publisher:

Abstract:

PURPOSE: To develop a computer simulation of ablation to produce customized contact lenses for correcting high-order aberrations. METHODS: Using real data from a patient with keratoconus, measured with a wavefront aberrometer using a Hartmann-Shack sensor, the contact lens thicknesses that compensate for these aberrations were determined, as well as the number of pulses needed to ablate the lenses specifically for this patient. RESULTS: The correction maps are presented, and the numbers of pulses were calculated using beams 0.5 mm wide with an ablation depth of 0.3 µm. CONCLUSIONS: The simulated results were promising, but they still need to be refined so that the "real" ablation system can achieve the desired precision.
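A minimal sketch of the pulse-count calculation described above, assuming the stated 0.3 µm ablation depth per pulse; the thickness map values are invented placeholders, and the conversion from the Hartmann-Shack wavefront to lens thickness is not shown:

```python
import numpy as np

ABLATION_DEPTH_UM = 0.3   # material removed per pulse (from the abstract), in micrometres

# Hypothetical lens-thickness correction map in micrometres; the real map
# would be derived from the patient's wavefront measurement.
thickness_um = np.array([[0.9, 1.5, 0.0],
                         [2.4, 3.3, 0.6],
                         [0.0, 1.2, 0.3]])

# Number of laser pulses required at each lens location.
pulses = np.ceil(thickness_um / ABLATION_DEPTH_UM).astype(int)
print(pulses)
print("total pulses:", int(pulses.sum()))
```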

Relevance:

100.00%

Publisher:

Abstract:

This article describes the design, implementation, and experiences with AcMus, an open and integrated software platform for room acoustics research, which comprises tools for measurement, analysis, and simulation of rooms for music listening and production. Through the use of affordable hardware, such as laptops, consumer audio interfaces, and microphones, the software allows evaluation of relevant acoustical parameters with stable and consistent results, thus providing valuable information for the diagnosis of acoustical problems, as well as the possibility of simulating modifications to the room through analytical models. The system is open-source and based on a flexible and extensible Java plug-in framework, allowing for cross-platform portability, accessibility, and experimentation, thus fostering collaboration among users, developers, and researchers in the field of room acoustics.
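As an illustration of one of the standard acoustical parameters such a platform evaluates, the sketch below estimates the reverberation time from an impulse response via Schroeder backward integration; this is generic example code, not part of AcMus itself, and the synthetic impulse response stands in for a real measurement:

```python
import numpy as np

def reverberation_time(ir, fs, db_range=(-5.0, -35.0)):
    """Estimate T30 reverberation time from an impulse response using
    Schroeder backward integration and a linear fit of the decay curve."""
    energy = np.asarray(ir, dtype=float) ** 2
    edc = np.cumsum(energy[::-1])[::-1]          # Schroeder energy decay curve
    edc_db = 10.0 * np.log10(edc / edc[0])
    t = np.arange(energy.size) / fs
    hi, lo = db_range
    mask = (edc_db <= hi) & (edc_db >= lo)       # fit the -5 dB to -35 dB range
    slope, _ = np.polyfit(t[mask], edc_db[mask], 1)
    return -60.0 / slope                          # time to decay by 60 dB

# Synthetic exponentially decaying "impulse response" (not a measured room).
fs = 8000
t = np.arange(0.0, 1.5, 1.0 / fs)
ir = np.random.default_rng(0).normal(size=t.size) * np.exp(-t / 0.2)
print(f"T30 estimate: {reverberation_time(ir, fs):.2f} s")
```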

Relevance:

100.00%

Publisher:

Abstract:

Enzymes are extremely efficient catalysts. Here, some of the mechanisms proposed to explain this catalytic power are compared with quantitative experimental results and computer simulations. The influence of the enzymatic environment on species along the reaction coordinate is analysed. The concepts of transition state stabilisation and reactant destabilisation are confronted. The divided site model and near-attack conformation hypotheses are also discussed. Molecular interactions and effects such as covalent catalysis, general acid-base catalysis, electrostatics, entropic effects, steric hindrance, and quantum and dynamical effects are also analysed as sources of catalysis. Reaction mechanisms, in particular the one catalysed by protein tyrosine phosphatases, illustrate these concepts.
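As standard background for the transition-state-stabilisation argument discussed above (not a result of the article), the transition-state-theory rate expression makes the link between barrier lowering and rate enhancement explicit:

```latex
% Transition-state-theory (Eyring) rate expression, standard background:
\begin{equation}
  k = \kappa\,\frac{k_{B}T}{h}\; e^{-\Delta G^{\ddagger}/RT},
\end{equation}
% so any interaction that lowers \Delta G^{\ddagger} -- by stabilising the
% transition state or destabilising the reactant -- speeds the reaction
% exponentially; lowering the barrier by about 5.7 kJ/mol at 298 K yields
% roughly a tenfold rate increase.
```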

Relevance:

100.00%

Publisher:

Abstract:

Shallow subsurface layers of gold nanoclusters were formed in polymethylmethacrylate (PMMA) polymer by very low energy (49 eV) gold ion implantation. The ion implantation process was modeled by computer simulation and accurately predicted the layer depth and width. Transmission electron microscopy (TEM) was used to image the buried layer and individual nanoclusters; the layer width was approximately 6-8 nm and the cluster diameter approximately 5-6 nm. Surface plasmon resonance (SPR) absorption effects were observed by UV-visible spectroscopy. The TEM and SPR results were related to prior measurements of electrical conductivity of Au-doped PMMA, and excellent consistency was found with a model of electrical conductivity in which either at low implantation dose the individual nanoclusters are separated and do not physically touch each other, or at higher implantation dose the nanoclusters touch each other to form a random resistor network (percolation model). (C) 2009 American Vacuum Society. [DOI: 10.1116/1.3231449]
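For context on the random resistor network (percolation) picture invoked above, the standard conductivity scaling near the threshold is, as general background rather than a result of this paper:

```latex
% Standard percolation scaling of composite conductivity above threshold:
\begin{equation}
  \sigma(p) \propto (p - p_{c})^{t}, \qquad p > p_{c},
\end{equation}
% where p is the metal filling fraction, p_c the percolation threshold, and
% t a conductivity exponent close to 2 in three dimensions.
```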