869 results for swd: Computational geometry
Abstract:
The dynamic character of proteins strongly influences biomolecular recognition mechanisms. With the development of the main models of ligand recognition (the lock-and-key, induced-fit and conformational-selection theories), the role of protein plasticity has become increasingly relevant. In particular, both major structural changes involving large deviations of the protein backbone and slight movements such as side-chain rotations are now carefully considered in drug discovery and development. Identifying multiple protein conformations as a preliminary step in a screening campaign is therefore of great interest. Protein flexibility has been widely investigated, in terms of both local and global motions, in two different biological systems. On the one hand, Replica Exchange Molecular Dynamics has been exploited as an enhanced sampling method to collect multiple conformations of Lactate Dehydrogenase A (LDHA), an emerging anticancer target; the aim of this project was the development of an Ensemble-based Virtual Screening protocol to find novel potent inhibitors. On the other hand, a preliminary study of the local flexibility of Opioid Receptors has been carried out with the ALiBERO approach, an iterative method based on Elastic Network-Normal Mode Analysis and Monte Carlo sampling. Comparison of the Virtual Screening performance obtained with single or multiple conformations confirmed that including protein flexibility in screening protocols increases the probability of recognising novel or known active compounds early.
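In ensemble-based virtual screening, each candidate ligand is typically docked against every member of the conformational ensemble and ranked by its best (lowest) docking score across conformations. The following Python sketch illustrates only that ranking step; the `dock_score` function and the file names are placeholders for whatever docking backend is actually used, not part of the thesis.

```python
# Illustrative sketch of ensemble-based virtual screening ranking.
# dock_score() is a hypothetical wrapper around an external docking tool;
# the receptor conformations would come from e.g. a REMD trajectory.

def dock_score(receptor_conformation: str, ligand: str) -> float:
    """Placeholder: run docking and return a score (lower = better)."""
    raise NotImplementedError("plug in a real docking backend here")

def ensemble_screen(conformations: list[str], ligands: list[str]) -> list[tuple[str, float]]:
    """Rank ligands by their best score over all receptor conformations."""
    results = []
    for ligand in ligands:
        best = min(dock_score(conf, ligand) for conf in conformations)
        results.append((ligand, best))
    # Best (most negative) scores first.
    return sorted(results, key=lambda pair: pair[1])

# Example usage with hypothetical inputs:
# ranking = ensemble_screen(["ldha_conf1.pdb", "ldha_conf2.pdb"], ["lig_001", "lig_002"])
```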
Abstract:
Flow features inside centrifugal compressor stages are very complicated to simulate with numerical tools due to the highly complex geometry and the varying gas conditions across the machine. For this reason, a considerable effort is currently being made to increase the fidelity of the numerical models during the design and validation phases. Computational Fluid Dynamics (CFD) plays an increasing role in the performance prediction of centrifugal compressor stages. Historically, CFD was considered reliable for performance prediction at a qualitative level, whereas tests were necessary to predict compressor performance on a quantitative basis. In fact, "standard" CFD, with only the flow path and blades included in the computational domain, is known to be weak in capturing efficiency levels and operating range accurately, owing to the under-estimation of losses and the lack of secondary-flow modelling. This research project aims to close the gap in accuracy between "standard" CFD and test data by including a high-fidelity reproduction of the gas domain and by using advanced numerical models and tools introduced in the author's OEM in-house CFD code. In other words, this thesis describes a methodology by which virtual tests can be conducted on single-stage and multistage centrifugal compressors in a fashion similar to a typical rig test, allowing end users to operate machines with a confidence level not achievable before. Furthermore, the new "high fidelity" approach made it possible to understand flow phenomena not fully captured before, increasing aerodynamicists' capability and confidence in designing highly efficient and highly reliable centrifugal compressor stages.
Abstract:
The Curry-Howard isomorphism is the idea that proofs in natural deduction can be put in correspondence with lambda terms in such a way that this correspondence is preserved by normalization. The concept can be extended from Intuitionistic Logic to other systems, such as Linear Logic. One of the nice consequences of this isomorphism is that we can reason about functional programs with formal tools that are typical of proof systems; such analysis can also cover quantitative properties of programs, such as the number of steps they take to terminate. Another is the possibility to describe the execution of these programs in terms of abstract machines. In 1990 Griffin proved that the correspondence can be extended to Classical Logic and control operators; that is, Classical Logic adds the possibility to manipulate continuations. In this thesis we examine how the ideas described above work in this larger context.
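As a standard illustration of the correspondence (a textbook example, not taken from the thesis itself): an intuitionistic proof of A → (B → A) corresponds to the K combinator, while Griffin-style control operators such as call/cc receive the type of Peirce's law, a formula provable classically but not intuitionistically.

```latex
% Curry-Howard, formulas-as-types: two illustrative inhabitants.
% Intuitionistic: the K combinator proves A -> (B -> A).
% Classical (Griffin, 1990): a control operator proves Peirce's law.
\[
  \lambda x.\,\lambda y.\,x \;:\; A \to (B \to A)
  \qquad\qquad
  \mathsf{call/cc} \;:\; ((A \to B) \to A) \to A
\]
```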
Abstract:
Thanks to the increasing slenderness and lightness allowed by new construction techniques and materials, the effects of wind on structures have become, in recent decades, a research field of great importance in Civil Engineering. Thanks to advances in computing power, the numerical simulation of wind tunnel tests has become a valid complementary activity and an attractive alternative for the future. Owing to its flexibility, the computational approach has gained importance in recent years with respect to traditional experimental investigation. However, even today the computational approach to fluid-structure interaction problems is not as widely adopted as might be expected. The main reason for this lies in the difficulties encountered in the numerical simulation of the turbulent, unsteady flow conditions generally found around bluff bodies. This thesis aims at providing a guide to the numerical simulation of bridge deck aerodynamic and aeroelastic behaviour, describing the simulation strategies and settings in detail and giving guidelines useful for the interpretation of the results.
Abstract:
In this thesis we provide a characterization of probabilistic computation in itself, from a recursion-theoretical perspective, without reducing it to deterministic computation. More specifically, we show that probabilistic computable functions, i.e., those functions which are computed by Probabilistic Turing Machines (PTMs), can be characterized by a natural generalization of Kleene's partial recursive functions which includes, among the initial functions, one that returns the identity or the successor with probability 1/2. We then prove the equi-expressivity of the obtained algebra and the class of functions computed by PTMs. In the second part of the thesis we investigate the relations existing between our recursion-theoretical framework and sub-recursive classes, in the spirit of Implicit Computational Complexity. More precisely, endowing predicative recurrence with a random base function is proved to lead to a characterization of polynomial-time computable probabilistic functions.
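To make the random base function concrete, the following Python sketch (an illustration only, not the thesis' formalism) shows a function that, on input n, returns n with probability 1/2 and n+1 with probability 1/2, and how composing it with itself yields a distribution over outputs rather than a single value.

```python
import random
from collections import Counter

def rand_base(n: int) -> int:
    """Random base function: identity or successor, each with probability 1/2."""
    return n if random.random() < 0.5 else n + 1

def iterate(f, n: int, times: int) -> int:
    """Compose f with itself `times` times, starting from n."""
    for _ in range(times):
        n = f(n)
    return n

# A probabilistic function computes a distribution over outputs:
# iterating rand_base 4 times from 0 gives a Binomial(4, 1/2) distribution.
samples = Counter(iterate(rand_base, 0, 4) for _ in range(10_000))
print(sorted(samples.items()))
```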
Abstract:
In this thesis the evolution of methods for analysing techno-social systems is reported, through the various research experiences directly undertaken. The first case presented is a study based on data mining of a word-association dataset named Human Brain Cloud: its validation is addressed and, also through non-trivial modelling, a better understanding of language properties is presented. A real complex-systems experiment is then introduced: the WideNoise experiment, in the context of the EveryAware European project. The project and the course of the experiment are illustrated and the data analysis is presented. The Experimental Tribe platform for social computation is then introduced. It was conceived to help researchers implement web experiments, and it also aims to catalyse the cumulative growth of experimental methodologies and the standardization of the tools cited above. In the last part, three other research experiences which already took place on the Experimental Tribe platform are discussed in detail, from the design of the experiment to the analysis of the results and, eventually, to the modelling of the systems involved. The experiments are: CityRace, about the measurement of human strategies for coping with traffic; laPENSOcosì, aiming to unveil the structure of political opinion; and AirProbe, implemented again within the EveryAware project framework, which consisted in monitoring the shift in air-quality opinions of a community informed about local air pollution. In the end, the evolution of the methods for investigating techno-social systems emerges, together with the opportunities and the threats offered by this new scientific path.
Abstract:
The aim of this work was to explore the practical applicability of molecular dynamics at different length and time scales. From nanoparticle systems through colloids and polymers to biological systems such as membranes and, finally, living cells, a broad range of materials was considered from a theoretical standpoint. In this dissertation five chemistry-related problems are addressed by means of theoretical and computational methods. The main results can be outlined as follows. (1) A systematic study of the effect of the concentration, chain length, and charge of surfactants on fullerene aggregation is presented. The long-discussed problem of the location of C60 in micelles was addressed, and fullerenes were found in the hydrophobic region of the micelles. (2) The interactions between graphene sheets of increasing size and a phospholipid membrane are quantitatively investigated. (3) A model was proposed to study the structure, stability, and dynamics of MoS2, a material well known for its tribological properties. The telescopic movement of nested nanotubes and the sliding of MoS2 layers are simulated. (4) A mathematical model was proposed to gain understanding of the coupled diffusion-swelling process in poly(lactic-co-glycolic acid), PLGA. (5) A soft-matter cell model is developed to explore the interaction of living cells with artificial surfaces. The effects of the surface properties on the adhesion dynamics of cells are discussed.
Abstract:
If the generic fibre f^{-1}(c) of a Lagrangian fibration f : X → B on a complex Poisson variety X is smooth, compact, and connected, it is isomorphic to the compactification of a complex abelian Lie group. For affine Lagrangian fibres it is not clear what the structure of the fibre is. Adler and van Moerbeke developed a strategy to prove that the generic fibre of a Lagrangian fibration is isomorphic to the affine part of an abelian variety. We extend their strategy to verify that the generic fibre of a given Lagrangian fibration is the affine part of a (C∗)^r-extension of an abelian variety. This strategy turned out to be successful for all examples we studied. Additionally, we studied examples of Lagrangian fibrations that have the affine part of a ramified cyclic cover of an abelian variety as generic fibre. We obtained an embedding in a Lagrangian fibration that has the affine part of a C∗-extension of an abelian variety as generic fibre. This embedding is not an embedding in the category of Lagrangian fibrations. The C∗-quotient of the new Lagrangian fibration defines in a natural way a deformation of the cyclic quotient of the original Lagrangian fibration.
Abstract:
Self-organising pervasive ecosystems of devices are set to become a major vehicle for delivering infrastructure and end-user services. The inherent complexity of such systems poses new challenges to those who want to master it by applying the principles of engineering. The recent growth in the number and distribution of devices with decent computational and communication capabilities, which suddenly accelerated with the massive diffusion of smartphones and tablets, is delivering a world with a much higher density of devices in space. Also, communication technologies seem to be focussing on short-range device-to-device (P2P) interactions, with technologies such as Bluetooth and Near-Field Communication gaining greater adoption. Locality and situatedness become key to providing the best possible experience to users, and the classic model of a centralised, enormously powerful server gathering and processing data becomes less and less efficient as device density grows. Accomplishing complex global tasks without a centralised controller responsible for aggregating data, however, is challenging. In particular, there is a local-to-global issue that makes the application of engineering principles difficult, to say the least: designing device-local programs that, through interaction, guarantee a certain global service level. In this thesis, we first analyse the state of the art in coordination systems, then motivate the work by describing the main issues of pre-existing tools and practices and by identifying the improvements that would benefit the design of such complex software ecosystems. The contribution can be divided into three main branches. First, we introduce a novel simulation toolchain for pervasive ecosystems, designed to allow good expressiveness while retaining high performance. Second, we leverage existing coordination models and patterns in order to create new spatial structures. Third, we introduce a novel language, based on the existing "Field Calculus" and integrated with the aforementioned toolchain, designed to be usable for practical aggregate programming.
Abstract:
Heart diseases are the leading cause of death worldwide, for both men and women. However, the ionic mechanisms underlying many cardiac arrhythmias and genetic disorders are not completely understood, which leads to limited efficacy of the currently available therapies and leaves many open questions for cardiac electrophysiologists. On the other hand, the availability of experimental data is still a major issue in this field: most experiments are performed in vitro and/or using animal models (e.g. rabbit, dog and mouse), even when the final aim is to better understand the electrical behaviour of the in vivo human heart in either physiological or pathological conditions. Computational modelling constitutes a primary tool in cardiac electrophysiology: in silico simulations, based on the available experimental data, may help to understand the electrical properties of the heart and the ionic mechanisms underlying a specific phenomenon. Once validated, mathematical models can be used for making predictions and testing hypotheses, thus suggesting potential therapeutic targets. This PhD thesis aims to apply computational cardiac modelling of the human single-cell action potential (AP) to three clinical scenarios, in order to gain new insights into the ionic mechanisms involved in the electrophysiological changes observed in vitro and/or in vivo. The first context is blood electrolyte variations, which may occur in patients due to different pathologies and/or therapies; in particular, we focused on extracellular Ca2+ and its effect on the AP duration (APD). The second context is haemodialysis (HD) therapy: in addition to blood electrolyte variations, patients undergo many other changes during HD, e.g. in heart rate, cell volume, pH, and sympatho-vagal balance. The third context is human hypertrophic cardiomyopathy (HCM), a genetic disorder characterised by increased arrhythmic risk and still lacking a specific pharmacological treatment.
Abstract:
Stratospheric particles are typically not visible to the naked eye. Nevertheless, they have a significant influence on the Earth's radiation budget and on heterogeneous chemistry in the stratosphere. Continuous, vertically resolved, global data sets are therefore essential for understanding the physical and chemical processes in this part of the atmosphere. Starting with the measurements of the second Stratospheric Aerosol Measurement (SAM II) instrument in 1978, a continuous time series of stratospheric aerosol extinction profiles exists, which is continued to the present day by instruments such as the second Stratospheric Aerosol and Gas Experiment (SAGE II), SCIAMACHY, OSIRIS and OMPS.

In this work, a newly developed algorithm is presented that uses the so-called "onion-peeling" principle to retrieve extinction profiles between 12 and 33 km. To this end, the algorithm is applied to radiance profiles at individual wavelengths measured by SCIAMACHY in limb geometry. SCIAMACHY's unique method of alternating limb and nadir measurements offers the advantage of providing highly resolved vertical and horizontal measurements with temporal and spatial coincidence. The additional information obtained in this way can be used to correct for the effects of horizontal gradients along the instrument's line of sight, which are observed in particular shortly after volcanic eruptions and for polar stratospheric clouds. If these gradients are not taken into account when retrieving extinction profiles, both the optical depth and the altitude of volcanic plumes or polar stratospheric clouds may be underestimated. This work presents a procedure that corrects the retrieved extinction profiles with the help of three-dimensional radiative transfer simulations and horizontally resolved data sets.

Comparison studies with the results of satellite (SAGE II) and balloon measurements show that extinction profiles of stratospheric particles can be retrieved with the newly developed algorithm and agree well with existing data sets. Investigations of the 2011 Nabro volcanic eruption and of the occurrence of polar stratospheric clouds in the southern hemisphere show that the correction procedure for horizontal gradients clearly improves the retrieved extinction profiles.
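To illustrate the geometric idea behind the onion-peeling principle (a deliberately simplified sketch: the actual retrieval in the thesis inverts SCIAMACHY limb radiances and adds a correction for horizontal gradients), the atmosphere is divided into concentric shells; the ray with the highest tangent altitude crosses only the topmost shell, so extinction can be solved for shell by shell from the top down using the slant path lengths through each shell.

```python
import math

EARTH_RADIUS_KM = 6371.0

def path_length(r_tangent, r_lower, r_upper):
    """Slant path length (km) of a limb ray with tangent radius r_tangent
    through the shell bounded by radii r_lower and r_upper."""
    def half_chord(r):
        return math.sqrt(max(r * r - r_tangent * r_tangent, 0.0))
    return 2.0 * (half_chord(r_upper) - half_chord(r_lower))

def onion_peel(tangent_heights_km, slant_optical_depths):
    """Retrieve per-shell extinction (1/km) from slant optical depths,
    peeling from the topmost tangent height downwards.
    tangent_heights_km must be sorted in ascending order."""
    radii = [EARTH_RADIUS_KM + h for h in tangent_heights_km]
    top = radii[-1] + (radii[-1] - radii[-2])          # outer boundary of top shell
    bounds = radii + [top]
    k = [0.0] * len(radii)
    for i in reversed(range(len(radii))):              # top shell first
        above = sum(k[j] * path_length(radii[i], bounds[j], bounds[j + 1])
                    for j in range(i + 1, len(radii)))
        own_path = path_length(radii[i], bounds[i], bounds[i + 1])
        k[i] = (slant_optical_depths[i] - above) / own_path
    return k
```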
Abstract:
The mechanical action of the heart is made possible by electrical events that involve the cardiac cells, a property that places heart tissue among the excitable tissues. At the cellular level, the electrical event is the signal that triggers the mechanical contraction, inducing a transient increase in intracellular calcium which, in turn, carries the message of contraction to the contractile proteins of the cell. The primary goal of my project was to implement in CUDA (Compute Unified Device Architecture, a hardware architecture for parallel processing created by NVIDIA) a tissue model of the rabbit sinoatrial node, in order to evaluate the heterogeneity of its structure and how that variability influences the behaviour of the cells. In particular, each cell has its own intrinsic discharge frequency, different from that of every other cell in the tissue, and it is interesting to study the process by which the cells synchronize and to look at the final common discharge frequency if they do synchronize.
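The thesis implements the tissue model in CUDA; as a language-agnostic illustration of the synchronization question only (not the actual sinoatrial node model), the following Python sketch couples a few phase oscillators with different intrinsic frequencies and reports the common rate they settle on.

```python
import numpy as np

# Minimal coupled-oscillator sketch (Kuramoto-style), purely illustrative:
# cells with heterogeneous intrinsic discharge frequencies are coupled through
# a mean field; with sufficiently strong coupling they lock to a common rate.
rng = np.random.default_rng(0)
n_cells = 50
intrinsic_hz = rng.normal(3.0, 0.3, n_cells)   # heterogeneous pacemaker rates (Hz)
coupling = 10.0                                # coupling strength (rad/s)
dt, steps = 1e-3, 20_000                       # 20 s of simulated time
half = steps // 2

phase = rng.uniform(0.0, 2.0 * np.pi, n_cells)
for step in range(steps):
    if step == half:
        snapshot = phase.copy()                # reference for frequency estimate
    mean_field = np.angle(np.mean(np.exp(1j * phase)))
    phase = phase + dt * (2.0 * np.pi * intrinsic_hz + coupling * np.sin(mean_field - phase))

# Per-cell frequency over the second half of the run; after locking, all equal.
freq_hz = (phase - snapshot) / (2.0 * np.pi * dt * (steps - half))
print(f"intrinsic rates: {intrinsic_hz.min():.2f}-{intrinsic_hz.max():.2f} Hz")
print(f"locked rates:    {freq_hz.min():.2f}-{freq_hz.max():.2f} Hz")
```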
Abstract:
Aggregate programming is a paradigm that supports the programming of systems of devices, adaptive and possibly large-scale, as a whole -- as aggregates. The prevailing approach in this context is based on the field calculus, a formal calculus that allows aggregate programs to be defined through the functional composition of computational fields, laying the groundwork for the specification of robust self-organisation patterns. Aggregate programming is currently supported, more or less partially and mainly for simulation, by dedicated DSLs (cf. Protelis), but there are no frameworks for mainstream languages aimed at application development. Yet such support would be desirable to reduce the time and effort of adoption, to simplify access to the paradigm when building real systems, and to foster research in the field itself. The present work consists in the development, starting from a prototype of the operational semantics of the field calculus, of a framework for aggregate programming in Scala. The choice of Scala as the host language stems from technical and practical reasons. Scala is a modern language, interoperable with Java, that integrates the object-oriented and functional paradigms well, has an expressive type system, and provides advanced features for the development of libraries and DSLs. Moreover, the possibility of relying, in Scala, on a solid actor framework such as Akka is another driving factor, given the need to bridge the abstraction gap inherent in developing a distributed middleware. This thesis presents a framework that achieves a threefold objective: the construction of a Scala library that implements the semantics of the field calculus correctly and completely, the realisation of an Akka-based distributed platform on which to develop applications, and the exposure of a general and flexible API able to support different scenarios.
Abstract:
OBJECTIVES: Metacarpal juxta-articular bone is altered in Rheumatoid Arthritis (RA). However, a detailed analysis of disease-related geometrical adaptations of the metacarpal shaft is missing. The aim of the present study was to assess the role of RA disease, forearm muscle cross-sectional area (CSA), age and sex on bone geometry at the metacarpal shaft. METHODS: In 64 RA patients and 128 control subjects, geometric properties of the third metacarpal bone mid-shaft and forearm muscle CSA were measured by peripheral quantitative computed tomography (pQCT). Linear models were fitted for cortical CSA, total bone CSA, polar stress-strain index (polar SSI, a surrogate for bone's resistance to bending and torsion), cortical thickness and Metacarpal Index (MI = cortical CSA/total CSA), with muscle CSA, age, RA status and sex as explanatory variables. RESULTS: Forearm muscle CSA was associated with cortical and total metacarpal CSA, and with polar SSI. RA group status was associated with all bone parameters except cortical CSA. There was a significant interaction between RA status and age, indicating that the RA group had a greater age-related decrease in cortical CSA, cortical thickness and MI. CONCLUSIONS: Bone geometry of the metacarpal shaft is altered in RA patients compared to healthy controls. While bone mass of the metacarpal shaft is adapted to forearm muscle mass, cortical thickness and MI are reduced, whereas outer bone shaft circumference and polar SSI are increased, in RA patients. These adaptations correspond to an enhanced ageing pattern in RA patients.
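A linear model with the kind of RA-by-age interaction described above can be expressed with a formula interface; the sketch below uses hypothetical column names (`cortical_csa`, `muscle_csa`, `ra`, `age`, `sex`), a placeholder file name, and the statsmodels library purely to illustrate the model structure, not the study's actual analysis code.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical dataset: one row per subject, with a pQCT-derived bone outcome,
# forearm muscle CSA, RA status (0/1), age (years) and sex.
data = pd.read_csv("metacarpal_pqct.csv")  # placeholder file name

# Linear model for cortical CSA with an RA-status-by-age interaction term
# (age * ra expands to age + ra + age:ra); the same structure would be
# repeated for total CSA, polar SSI, cortical thickness and the MI.
model = smf.ols("cortical_csa ~ muscle_csa + age * ra + sex", data=data).fit()
print(model.summary())
```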
Abstract:
Recent studies have suggested that areal BMD (aBMD) measured by DXA is elevated in patients with DISH. We used peripheral QCT (pQCT) to assess volumetric BMD (vBMD) and bone geometry of the radius, tibia and the third metacarpal bone.