931 results for Computational Aeroacoustics (CAA)
Abstract:
The interaction between proteins and inorganic surfaces is fascinating from both an applied and a theoretical point of view. It is an important aspect of many applications, among them surgical implants and biosensors. It is also an example of theoretical questions concerning the interface between hard and soft matter. What is certain is that knowledge of the mechanisms involved is required in order to understand, predict and optimise the interaction between proteins and surfaces. Recent advances in experimental research make it possible to probe direct peptide-metal binding, which has brought the exploration of the theoretical foundations further into the focus of current research. One way to study the interaction between proteins and inorganic surfaces is through computer simulations. Although simulations of metal surfaces or of proteins as individual systems have long been established, simulating a combination of the two systems raises new difficulties. Overcoming them requires a multiscale approach: while proteins, as biological systems, can be described adequately with classical molecular dynamics, describing the delocalised electrons of metallic systems requires a quantum-mechanical formulation. The most important prerequisite of a multiscale approach is consistency between the simulations on the different scales. In this work this is achieved by linking simulations at alternating scales. The thesis begins with an investigation of the thermodynamics of benzene hydration using classical molecular dynamics. The interaction between water and the [111] metal surfaces of gold and nickel is then modelled with a multiscale approach. In a further step, the adsorption of benzene on metal surfaces in an aqueous environment is studied. Finally, the modelling is extended to include the amino acids alanine and phenylalanine. This opens up the possibility of treating realistic protein-metal systems in computer simulations and of predicting, on a theoretical basis, the interaction between peptides and surfaces for any kind of peptide and surface.
Abstract:
The purpose of this thesis is to further the understanding of the structural, electronic and magnetic properties of ternary intermetallic compounds using density functional theory (DFT). Four main problems are addressed. First, a detailed analysis of the ternary Heusler compounds is made. It has long been known that many Heusler compounds ($X_2YZ$; $X$ and $Y$ transition elements, $Z$ main group element) exhibit interesting half-metallic and ferromagnetic properties. In order to understand these, the dependence of the magnetic and electronic properties on the structural parameters, the type of exchange-correlation functional and electron-electron correlation was examined. It was found that almost all Co$_2YZ$ Heusler compounds exhibit half-metallic ferromagnetism. It is also observed that the $X$ and $Y$ atoms contribute most of the total magnetic moment. The magnitude of the total magnetic moment is determined only indirectly by the nature of the $Z$ atoms, and shows a trend consistent with Slater-Pauling behaviour in several classes of these compounds. In contrast to experiments, calculations give a non-integer value of the magnetic moment in certain Co$_2$-based Heusler compounds. To explain the deviations of the calculated magnetic moment, the LDA+$U$ scheme was applied, and it was found that the inclusion of electron-electron correlation beyond the LSDA and GGA is necessary to obtain a theoretical description of some Heusler compounds as half-metallic ferromagnets. The electronic structure and magnetic properties of the substitutional series of the quaternary Heusler compound Co$_2$Mn$_{1-x}$Fe$_x$Si were investigated within LDA+$U$. The calculated band structure suggests that the compound with the most stable half-metallic state occurs at an intermediate Fe concentration. These calculated findings are qualitatively confirmed by experimental studies. Second, the effect of antisite disorder in the Co$_2$TiSn system was investigated both theoretically and experimentally. Half-metallicity was found to be preserved in Co$_2$TiSn under moderate antisite disorder, and the experimental findings suggest that Co and Ti antisite disorder amounts to approximately 10% in the compound. Third, a systematic examination was carried out of the band gaps and the nature (covalent or ionic) of bonding in semiconducting 8- and 18-electron or half-metallic ferromagnetic half-Heusler compounds. It was found that the most appropriate description of these compounds from the viewpoint of the electronic structure is that of a $YZ$ zinc blende lattice stuffed by the $X$ ion. Simple valence rules are obeyed for bonding in the 8- and 18-electron compounds. Fourth, hexagonal analogues of half-Heusler compounds were sought. Three series of compounds were investigated: GdPdSb, GdAu$X$ ($X$ = Mn, Cd and In) and EuNiP. GdPdSb is suggested as a possible half-metallic weak ferromagnet at low temperature. GdAu$X$ ($X$ = Mn, Cd and In) and EuNiP were investigated because they exhibit interesting bonding, structural and magnetic properties. The results qualitatively confirm experimental studies of the magnetic and structural behaviour of the GdPdSb, GdAu$X$ ($X$ = Mn, Cd and In) and EuNiP compounds.
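For reference, the Slater-Pauling trend mentioned above is usually stated, for half-metallic full-Heusler compounds $X_2YZ$, as a simple electron-counting rule (a generic textbook formulation, not a result specific to this thesis):

$$ m_t = Z_t - 24 \; \mu_B, $$

where $m_t$ is the total spin moment per formula unit and $Z_t$ the total number of valence electrons; for example, Co$_2$MnSi with $Z_t = 29$ gives $m_t = 5\,\mu_B$.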
Abstract:
This thesis investigates two distinct research topics. The main topic (Part I) is the computational modelling of cardiomyocytes derived from human stem cells, both embryonic (hESC-CM) and induced-pluripotent (hiPSC-CM). The aim of this research line is to develop models of the electrophysiology of hESC-CMs and hiPSC-CMs that integrate the available experimental data and provide in-silico tools for studying, formulating new hypotheses about, and planning experiments on aspects that are not yet fully understood, such as the maturation process, the functionality of Ca2+ handling, or why hESC-CM/hiPSC-CM action potentials (APs) differ in some respects from the APs of adult cardiomyocytes. Chapter I.1 introduces the main concepts about hESC-CMs/hiPSC-CMs, the cardiac AP, and computational modelling. Chapter I.2 presents the hESC-CM AP model, able to simulate the maturation process through two developmental stages, Early and Late, based on experimental and literature data. Chapter I.3 describes the hiPSC-CM AP model, able to simulate the ventricular-like and atrial-like phenotypes. This model was used to assess which currents are responsible for the differences between the ventricular-like AP and the adult ventricular AP. The secondary topic (Part II) is the study of texture descriptors for biological image processing. Chapter II.1 provides an overview of important texture descriptors such as Local Binary Patterns and Local Phase Quantization, and introduces the non-binary coding and the multi-threshold approach. Chapter II.2 shows that the non-binary coding and the multi-threshold approach improve the classification performance on images of cellular/sub-cellular parts taken from six datasets. Chapter II.3 describes the case study of the classification of indirect immunofluorescence images of HEp-2 cells, used for the antinuclear antibody clinical test. Finally, the general conclusions are reported.
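As background for the AP models of Part I above, models of this kind belong to the Hodgkin-Huxley family of cardiac cell models, whose core equation balances the membrane capacitance against the sum of the transmembrane currents (a generic formulation, not the specific current set used in the thesis):

$$ C_m \frac{dV_m}{dt} = -\left( I_{Na} + I_{CaL} + I_{Kr} + I_{Ks} + I_{K1} + I_{NaCa} + I_{NaK} + \dots + I_{stim} \right). $$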
Abstract:
The cardiomyocyte is a complex biological system in which many mechanisms interact non-linearly to regulate the coupling between electrical excitation and mechanical contraction. For this reason, the development of mathematical models is fundamental in the field of cardiac electrophysiology, where the use of computational tools has become complementary to classical experimentation. My doctoral research has focused on developing such models for investigating the regulation of ventricular excitation-contraction coupling at the single-cell level. In particular, the following studies are presented in this thesis: 1) Study of the unexpected deleterious effect of a Na channel blocker on a long QT syndrome type 3 patient. Experimental results were used to tune a Na current model that recapitulates the effect of the mutation and of the treatment, in order to investigate how these influence the human action potential. Our research suggested that the analysis of the clinical phenotype is not sufficient for recommending drugs to patients carrying mutations with undefined electrophysiological properties. 2) Development of a model of L-type Ca channel inactivation in rabbit myocytes that faithfully reproduces the relative roles of voltage- and Ca-dependent inactivation. The model was applied to the analysis of Ca current inactivation kinetics during normal and abnormal repolarization, and predicts arrhythmogenic activity when Ca-dependent inactivation, the predominant mechanism under physiological conditions, is inhibited. 3) Analysis of the arrhythmogenic consequences of the crosstalk between the β-adrenergic and Ca-calmodulin-dependent protein kinase signaling pathways. The descriptions of the two regulatory mechanisms, both enhanced in heart failure, were integrated into a novel murine action potential model to investigate how they jointly contribute to the development of cardiac arrhythmias. These studies show how mathematical modeling can provide new insights into the mechanisms underlying cardiac excitation-contraction coupling and arrhythmogenesis.
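A minimal sketch of the kind of gating formulation referred to in point 2, assuming a generic Hodgkin-Huxley-type L-type Ca current with separate voltage- and Ca-dependent inactivation gates; the function names, parameter values and functional forms are illustrative assumptions, not the rabbit model developed in the thesis:

    def i_cal(v, d, f_v, f_ca, g_cal=0.3, e_ca=60.0):
        # L-type Ca current: activation gate d, voltage-dependent inactivation f_v,
        # Ca-dependent inactivation f_ca (illustrative conductance and reversal potential)
        return g_cal * d * f_v * f_ca * (v - e_ca)

    def f_ca_inf(ca_i, k_ca=0.0006):
        # Steady-state Ca-dependent inactivation: stronger inactivation at higher [Ca]_i (mM)
        return 1.0 / (1.0 + ca_i / k_ca)

    def gate_ode(x, x_inf, tau_x):
        # First-order relaxation of any gating variable towards its steady state
        return (x_inf - x) / tau_x

Inhibiting Ca-dependent inactivation in such a scheme amounts to forcing f_ca towards 1, which leaves the current governed by the slower voltage-dependent gate alone.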
Abstract:
Biodiesel is a possible substitute for fossil fuels; for this reason a good understanding of the kinetics involved is important. Because of the complexity of the biodiesel mixture, a common practice is to use surrogate molecules to study its reactivity. This work presents the experimental and computational results obtained for the oxidation and pyrolysis of methane and methyl formate in a plug flow reactor. The work was divided into two parts: the first was the assembly of the setup, while in the second the experimental results were compared with model predictions obtained using mechanisms available in the literature. Methane was studied first because a validated model was available, which made it possible to verify the reliability of the experimental results. Attention was then focused on methyl formate. All the analyses were conducted at different temperatures, pressures and, for the oxidation, different equivalence ratios. The results show that a good understanding of the kinetics has been reached, but further effort is needed to better evaluate kinetic parameters such as the activation energy. The results also indicate that the setup built here is suitable for studying oxidation and pyrolysis, and it will therefore be employed to study longer-chain esters, with the aim of better understanding the kinetics of the molecules that make up the biodiesel mixture.
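For context, the kinetic parameters mentioned above enter the rate constants of a detailed mechanism through the modified Arrhenius expression commonly used in combustion modelling (a standard formula, not specific to this work):

$$ k(T) = A\,T^{b}\exp\!\left(-\frac{E_a}{RT}\right), $$

where $A$ is the pre-exponential factor, $b$ the temperature exponent and $E_a$ the activation energy.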
Abstract:
The thesis applies ICC techniques to the probabilistic polynomial complexity classes in order to obtain an implicit characterization of them. The main contribution lies in the implicit characterization of the class PP (Probabilistic Polynomial Time), giving a syntactical characterisation of PP and a static complexity analyser able to recognise whether an imperative program computes in probabilistic polynomial time. The thesis is divided into two parts. The first part addresses the problem by creating a prototype functional language (a probabilistic variation of lambda calculus with bounded recursion) that is sound and complete with respect to Probabilistic Polynomial Time. The second part reverses the problem and develops a feasible way to verify whether a program, written in a prototype imperative programming language, runs in probabilistic polynomial time. This thesis can be regarded as one of the first steps towards Implicit Computational Complexity for probabilistic classes. Hard open problems remain to be investigated, and many theoretical aspects are strongly connected with these topics; I expect that in the future ICC and probabilistic classes will receive wide attention.
Abstract:
A study of the pyrolysis and oxidation (phi = 0.5, 1, 2) of methane and of methyl formate (phi = 0.5) in a laboratory flow reactor (length = 50 cm, inner diameter = 2.5 cm) has been carried out at 1-4 atm over the 300-1300 K temperature range. Exhaust gas species were analysed with a gas chromatographic system, a Varian CP-4900 PRO Micro-GC with a TCD detector, using helium as carrier gas for a Molecular Sieve 5Å column and nitrogen for a COX column, operated at 65°C and 150 kPa respectively. Model simulations using the NTUA [1], Fisher et al. [12], Grana [13] and Dooley [14] kinetic mechanisms have been performed with CHEMKIN. The work provides a basis for further development and optimization of existing detailed chemical kinetic schemes.
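The simulations described above were run with CHEMKIN and the cited mechanisms; purely as an illustration of an equivalent homogeneous-reactor calculation, the following sketch uses the open-source Cantera library with GRI-Mech 3.0 for a lean methane/air mixture (phi = 0.5). The library, mechanism, conditions and residence time are assumptions for illustration, not the setup of this work.

    import cantera as ct

    # Lean methane/air mixture (phi = 0.5) at conditions comparable to the study
    gas = ct.Solution('gri30.yaml')          # GRI-Mech 3.0, a standard methane mechanism
    gas.TPX = 1100.0, ct.one_atm, 'CH4:1.0, O2:4.0, N2:15.04'

    # 0-D constant-pressure reactor marched in time as a stand-in for the flow reactor
    reactor = ct.IdealGasConstPressureReactor(gas)
    sim = ct.ReactorNet([reactor])
    t = 0.0
    while t < 0.5:                           # illustrative residence time in seconds
        t = sim.step()
    print(reactor.T, reactor.thermo['CH4', 'CO', 'CO2'].X)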
Abstract:
The goal of the present research is to define a Semantic Web framework for precedent modelling, using knowledge extracted from text, metadata, and rules, while maintaining a strong text-to-knowledge morphism between legal text and legal concepts, in order to fill the gap between a legal document and its semantics. The framework is composed of four different models that make use of standard languages from the Semantic Web stack of technologies: a document metadata structure, modelling the main parts of a judgement and creating a bridge between a text and its semantic annotations of legal concepts; a legal core ontology, modelling abstract legal concepts and institutions contained in a rule of law; a legal domain ontology, modelling the main legal concepts in a specific domain concerned by case-law; and an argumentation system, modelling the structure of argumentation. The input to the framework includes metadata associated with judicial concepts and an ontology library representing the structure of case-law. The research relies on the previous efforts of the community in the field of legal knowledge representation and rule interchange for applications in the legal domain, in order to apply the theory to a set of real legal documents, pushing OWL axiom definitions as far as possible so that they provide a semantically powerful representation of the legal document and a solid ground for an argumentation system using a defeasible subset of predicate logic. It appears that some new features of OWL2 unlock useful reasoning capabilities for legal knowledge, especially if combined with defeasible rules and argumentation schemes. The main task is thus to formalize the legal concepts and argumentation patterns contained in a judgement, with the following requirement: to check, validate and reuse the discourse of a judge - and the argumentation he produces - as expressed by the judicial text.
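As a toy illustration of the bridge between document metadata and ontology concepts described above, the following sketch builds a tiny RDF graph with rdflib; the namespaces, class and property names are hypothetical placeholders, not the vocabularies defined in the thesis:

    from rdflib import Graph, Literal, Namespace
    from rdflib.namespace import RDF

    META = Namespace("http://example.org/judgement-metadata#")   # hypothetical vocabulary
    CORE = Namespace("http://example.org/legal-core#")           # hypothetical core ontology

    g = Graph()
    # Link a judgement document to a legal-core class and to domain concepts it refers to
    g.add((META.judgement42, RDF.type, CORE.Judgement))
    g.add((META.judgement42, META.hasRatioDecidendi, Literal("...")))
    g.add((META.judgement42, META.refersToConcept, CORE.DutyOfCare))

    print(g.serialize(format="turtle"))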
Abstract:
From the late 1980s, the automation of sequencing techniques and the spread of computers gave rise to a flourishing number of new molecular structures and sequences and to a proliferation of new databases in which to store them. Three computational approaches are presented here, able to analyse the massive amount of publicly available data in order to answer important biological questions. The first strategy studies the incorrect assignment of the first AUG codon in a messenger RNA (mRNA), due to the incomplete determination of its 5' end sequence. An extension of the mRNA 5' coding region was identified in 477 human loci, out of all known human mRNAs analysed, using an automated expressed sequence tag (EST)-based approach. Proof-of-concept confirmation was obtained by in vitro cloning and sequencing for GNB2L1, QARS and TDP2, and the consequences for functional studies are discussed. The second approach analyses codon bias, the phenomenon in which distinct synonymous codons are used with different frequencies, and, following integration with a gene expression profile, estimates the total number of codons present across all the expressed mRNAs (named here the "codonome value") in a given biological condition. Systematic analyses across different pathological and normal human tissues and multiple species show a surprisingly tight correlation between the codon bias and the codonome bias. The third approach studies the expression of genes implicated in human autism spectrum disorder (ASD). ASD-implicated genes sharing microRNA response elements (MREs) for the same microRNA are co-expressed in brain samples from healthy and ASD-affected individuals. The differential expression of a recently identified long non-coding RNA, which has four MREs for the same microRNA, could disrupt the equilibrium in this network, but further analyses and experiments are needed.
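A minimal sketch of the kind of computation behind the "codonome value" described above, assuming it is obtained by summing codon counts over transcripts weighted by their expression level; the precise definition and data used in the thesis may differ:

    from collections import Counter

    def codon_counts(cds):
        # Count codons in an in-frame coding sequence (length truncated to a multiple of 3)
        usable = len(cds) - len(cds) % 3
        return Counter(cds[i:i + 3] for i in range(0, usable, 3))

    def codonome(transcripts, expression):
        # Sum codon counts over all transcripts, weighted by each transcript's expression
        total = Counter()
        for tid, seq in transcripts.items():
            for codon, n in codon_counts(seq).items():
                total[codon] += n * expression.get(tid, 0.0)
        return total

Comparing such an expression-weighted total with the unweighted codon usage of the same transcripts is one simple way to relate codon bias to a gene expression profile.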
Abstract:
In this thesis I described the theory and application of several computational methods for solving medicinal chemistry and biophysical tasks. I pointed out the valuable information that can be obtained by means of computer simulations and the possibility of predicting the outcome of traditional experiments. Nowadays, the computer represents an invaluable tool for chemists. In particular, the main topics of my research were the development of an automated docking protocol for blockers of the voltage-gated hERG potassium channel, and the investigation of the catalytic mechanism of the human peptidyl-prolyl cis-trans isomerase Pin1.
Abstract:
The dynamic character of proteins strongly influences biomolecular recognition mechanisms. With the development of the main models of ligand recognition (the lock-and-key, induced fit, and conformational selection theories), the role of protein plasticity has become increasingly relevant. In particular, both major structural changes involving large deviations of the protein backbone and slight movements such as side-chain rotations are now carefully considered in drug discovery and development. Identifying multiple protein conformations as a preliminary step in a screening campaign is therefore of great interest. Protein flexibility has been widely investigated, in terms of both local and global motions, in two diverse biological systems. On one side, Replica Exchange Molecular Dynamics was exploited as an enhanced sampling method to collect multiple conformations of Lactate Dehydrogenase A (LDHA), an emerging anticancer target. The aim of this project was the development of an Ensemble-based Virtual Screening protocol, in order to find novel potent inhibitors. On the other side, a preliminary study of the local flexibility of Opioid Receptors was carried out with the ALiBERO approach, an iterative method based on Elastic Network-Normal Mode Analysis and Monte Carlo sampling. Comparison of the Virtual Screening performance obtained with single or multiple conformations confirmed that the inclusion of protein flexibility in screening protocols has a positive effect on the probability of recognising novel or known active compounds early.
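For reference, the enhanced sampling method mentioned above (temperature Replica Exchange Molecular Dynamics) accepts a swap between replicas $i$ and $j$, at inverse temperatures $\beta_i$ and $\beta_j$ with potential energies $E_i$ and $E_j$, with the standard Metropolis probability (a textbook expression, not a detail specific to this thesis):

$$ p_{\mathrm{acc}} = \min\left\{1,\ \exp\!\left[(\beta_i - \beta_j)(E_i - E_j)\right]\right\}. $$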
Abstract:
The Curry-Howard isomorphism is the idea that proofs in natural deduction can be put in correspondence with lambda terms in such a way that this correspondence is preserved by normalization. The concept can be extended from Intuitionistic Logic to other systems, such as Linear Logic. One of the nice consequences of this isomorphism is that we can reason about functional programs with formal tools typical of proof systems: such analysis can also cover quantitative properties of programs, such as the number of steps they take to terminate. Another is the possibility of describing the execution of these programs in terms of abstract machines. In 1990 Griffin proved that the correspondence can be extended to Classical Logic and control operators. That is, Classical Logic adds the possibility of manipulating continuations. In this thesis we examine how the notions described above work in this larger context.
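As a small illustration of the correspondence described above (standard examples, not specific to this thesis), the intuitionistic theorems $A \to A$ and $A \to (B \to A)$ are inhabited by the lambda terms

$$ \lambda x.\,x \;:\; A \to A, \qquad\quad \lambda x.\,\lambda y.\,x \;:\; A \to (B \to A), $$

and beta-reduction of such terms mirrors the normalization of the corresponding natural-deduction proofs.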
Abstract:
Thanks to the increasing slenderness and lightness allowed by new construction techniques and materials, the effects of wind on structures have become, over the last decades, a research field of great importance in Civil Engineering. Thanks to advances in computing power, the numerical simulation of wind tunnel tests has become a valid complementary activity and an attractive alternative for the future. Owing to its flexibility, the computational approach has gained importance in recent years with respect to traditional experimental investigation. However, even today, the computational approach to fluid-structure interaction problems is not as widely adopted as might be expected. The main reason for this lies in the difficulties encountered in the numerical simulation of the turbulent, unsteady flow conditions generally found around bluff bodies. This thesis aims to provide a guide to the numerical simulation of bridge deck aerodynamic and aeroelastic behaviour, describing the simulation strategies in detail and setting out guidelines useful for the interpretation of the results.
Abstract:
In this thesis we provide a characterization of probabilistic computation in itself, from a recursion-theoretical perspective, without reducing it to deterministic computation. More specifically, we show that probabilistic computable functions, i.e., those functions which are computed by Probabilistic Turing Machines (PTMs), can be characterized by a natural generalization of Kleene's partial recursive functions which includes, among its initial functions, one that returns either the identity or the successor, each with probability 1/2. We then prove the equi-expressivity of the obtained algebra and the class of functions computed by PTMs. In the second part of the thesis we investigate the relations between our recursion-theoretical framework and sub-recursive classes, in the spirit of Implicit Computational Complexity. More precisely, endowing predicative recurrence with a random base function is proved to lead to a characterization of polynomial-time computable probabilistic functions.
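A minimal sketch of the probabilistic initial function mentioned above, i.e. a base function that returns the identity or the successor of its argument, each with probability 1/2; this is an illustrative rendering, not the formal definition given in the thesis:

    import random

    def rand_base(n):
        # Return n (identity) or n + 1 (successor), each with probability 1/2
        return n if random.random() < 0.5 else n + 1

Closing such a function under the usual composition, primitive recursion and minimization schemes is what yields the probabilistic analogue of Kleene's algebra discussed above.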
Abstract:
In this thesis the evolution of methods for analysing techno-social systems will be reported, through an account of the various research experiences undertaken directly. The first case presented is a study based on data mining of a word-association dataset named Human Brain Cloud: its validation will be addressed and, also through non-trivial modelling, a better understanding of language properties will be presented. Then a real complex-systems experiment will be introduced: the WideNoise experiment, in the context of the EveryAware European project. The project and the course of the experiment will be illustrated and the data analysis will be presented. Next, the Experimental Tribe platform for social computation will be introduced. It was conceived to help researchers implement web experiments, and it also aims to catalyse the cumulative growth of experimental methodologies and the standardization of the tools cited above. In the last part, three other research experiences that have already taken place on the Experimental Tribe platform will be discussed in detail, from the design of the experiment to the analysis of the results and, eventually, the modelling of the systems involved. The experiments are: CityRace, about the measurement of human traffic-facing strategies; laPENSOcosì, aiming to unveil the structure of political opinion; and AirProbe, implemented again within the EveryAware project framework, which consisted in monitoring the shift in opinion about air quality of a community informed about local air pollution. In the end, the evolution of the methods for investigating techno-social systems will emerge, together with the opportunities and the threats offered by this new scientific path.