917 results for Computational lambda-calculus
Abstract:
Thanks to the increasing slenderness and lightness allowed by new construction techniques and materials, the effects of wind on structures have become, over the last decades, a research field of great importance in Civil Engineering. Owing to advances in computing power, the numerical simulation of wind tunnel tests has become a valid complementary activity and an attractive alternative for the future. Because of its flexibility, the computational approach has gained importance in recent years with respect to traditional experimental investigation. Even so, the computational approach to fluid-structure interaction problems is still not as widely adopted as might be expected, mainly because of the difficulties encountered in simulating the turbulent, unsteady flow conditions that typically develop around bluff bodies. This thesis aims to provide a guide to the numerical simulation of bridge deck aerodynamic and aeroelastic behaviour, describing the simulation strategies in detail and setting out guidelines for the interpretation of the results.
Abstract:
In this thesis we provide a characterization of probabilistic computation in its own right, from a recursion-theoretical perspective, without reducing it to deterministic computation. More specifically, we show that probabilistic computable functions, i.e., the functions computed by Probabilistic Turing Machines (PTMs), can be characterized by a natural generalization of Kleene's partial recursive functions which includes, among the initial functions, one that returns either the identity or the successor, each with probability 1/2. We then prove the equi-expressivity of the resulting algebra and the class of functions computed by PTMs. In the second part of the thesis we investigate the relations between our recursion-theoretical framework and sub-recursive classes, in the spirit of Implicit Computational Complexity. More precisely, endowing predicative recurrence with a random base function is proved to lead to a characterization of the polynomial-time computable probabilistic functions.
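As an illustration of the probabilistic initial function mentioned above (a minimal sketch, not code from the thesis; the function names are invented for this example), the random base function and a fair coin derived from it can be written as:

```python
import random
from collections import Counter

def rand_base(x: int) -> int:
    """The probabilistic initial function described above: returns the
    identity or the successor of x, each with probability 1/2."""
    return x if random.random() < 0.5 else x + 1

def coin() -> int:
    """A fair coin obtained by composing rand_base with the constant 0."""
    return rand_base(0)

# A probabilistic recursive function computes a distribution over outputs
# rather than a single value; sampling makes this visible.
print(Counter(coin() for _ in range(10_000)))
```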
Abstract:
The new stage of the Mainz Microtron, MAMI, at the Institute for Nuclear Physics of the Johannes Gutenberg University, operational since 2007, allows open-strangeness experiments to be performed. To address the lack of electroproduction data at very low Q^2, the p(e,K+)Λ and p(e,K+)Σ0 reactions have been studied at Q^2 = 0.036 (GeV/c)^2 and Q^2 = 0.05 (GeV/c)^2 over a large angular range. Cross sections at W = 1.75 GeV are given in angular bins and compared with the predictions of the Saclay-Lyon and Kaon-MAID isobaric models. We conclude that the original Kaon-MAID model, which has large longitudinal couplings of the photon to nucleon resonances, is unphysical. Extensive studies of the suitability of silicon photomultipliers as readout devices for a scintillating-fibre tracking detector, with potential applications in both the positive and negative arms of the spectrometer, are presented as well.
Abstract:
In this thesis, the evolution of methods for the analysis of techno-social systems is reported through the research experiences directly undertaken. The first case presented is a study based on data mining of a word-association dataset named Human Brain Cloud: its validation is addressed and, also through non-trivial modelling, a better understanding of language properties is presented. A real complex-system experiment is then introduced: the WideNoise experiment, in the context of the EveryAware European project. The project and the course of the experiment are illustrated and the data analysis is presented. The Experimental Tribe platform for social computation is then introduced. It has been conceived to help researchers implement web experiments, and it also aims to catalyse the cumulative growth of experimental methodologies and the standardization of the tools cited above. In the last part, three other research experiences that already took place on the Experimental Tribe platform are discussed in detail, from the design of the experiment to the analysis of the results and, eventually, to the modelling of the systems involved. The experiments are: CityRace, about the measurement of human traffic-facing strategies; laPENSOcosì, aiming to unveil the structure of political opinion; and AirProbe, implemented again within the EveryAware project framework, which consisted in monitoring the shift in air-quality opinion of a community informed about local air pollution. In the end, the evolution of methods for investigating techno-social systems emerges, together with the opportunities and threats offered by this new scientific path.
Measurement of CP asymmetries in $\Lambda^0_b \to p K^-$ and $\Lambda^0_b \to p \pi^-$ decays at LHCb
Abstract:
The LHCb experiment has been designed to perform precision measurements in the flavour-physics sector at the Large Hadron Collider (LHC) located at CERN. After the recent observation at LHCb of CP violation in the decay of the Bs0 meson to a charged pion-kaon pair, it is interesting to see whether the same quark-level transition in Λ0b baryon decays gives rise to large CP-violating effects. Such decay processes involve both tree and penguin Feynman diagrams and could be sensitive probes of physics beyond the Standard Model. The measurement of the CP-violating observable defined as ∆ACP = ACP(Λ0b → pK−) − ACP(Λ0b → pπ−), where ACP(Λ0b → pK−) and ACP(Λ0b → pπ−) are the direct CP asymmetries in Λ0b → pK− and Λ0b → pπ− decays, is presented for the first time using LHCb data. The procedure followed to optimize the event selection, to calibrate particle identification, to parametrise the various components of the invariant-mass spectra, and to compute corrections due to the production asymmetry of the initial state and the detection asymmetries of the final states is discussed in detail. Using the full 2011 and 2012 data sets of pp collisions collected with the LHCb detector, corresponding to an integrated luminosity of about 3 fb−1, the value ∆ACP = (0.8 ± 2.1 ± 0.2)% is obtained. The first uncertainty is statistical and the second corresponds to one of the dominant systematic effects. As the result is compatible with zero, no evidence of CP violation is found. This is the most precise measurement of CP violation in the decays of baryons containing the b quark to date. Once the analysis has been completed with an exhaustive study of systematic uncertainties, the results will be published by the LHCb Collaboration.
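For illustration only, the headline observable can be assembled from the two individual asymmetries exactly as in the definition above. The sketch below assumes the two measurements are statistically independent, so their statistical uncertainties combine in quadrature, and the input numbers are placeholders rather than values from the analysis:

```python
import math

def delta_acp(acp_pk, err_pk, acp_ppi, err_ppi):
    """Delta A_CP = A_CP(Lambda0b -> pK-) - A_CP(Lambda0b -> ppi-).

    Assumes the two asymmetry measurements are statistically independent,
    so their statistical uncertainties combine in quadrature (an assumption
    of this sketch, not a statement taken from the thesis)."""
    delta = acp_pk - acp_ppi
    err = math.hypot(err_pk, err_ppi)
    return delta, err

# Purely hypothetical input values, for illustration only.
d, e = delta_acp(0.02, 0.015, 0.01, 0.015)
print(f"Delta A_CP = {d:.3f} +/- {e:.3f} (stat)")
```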
Abstract:
The aim of this work was to explore the practical applicability of molecular dynamics at different length and time scales. From nanoparticle systems through colloids and polymers to biological systems such as membranes and, finally, living cells, a broad range of materials was considered from a theoretical standpoint. In this dissertation, five chemistry-related problems are addressed by means of theoretical and computational methods. The main results can be outlined as follows. (1) A systematic study of the effect of the concentration, chain length, and charge of surfactants on fullerene aggregation is presented. The long-discussed problem of the location of C60 in micelles is addressed, and the fullerenes are found in the hydrophobic region of the micelles. (2) The interactions between graphene sheets of increasing size and a phospholipid membrane are quantitatively investigated. (3) A model is proposed to study the structure, stability, and dynamics of MoS2, a material well known for its tribological properties. The telescopic movement of nested nanotubes and the sliding of MoS2 layers are simulated. (4) A mathematical model is proposed to gain understanding of the coupled diffusion-swelling process in poly(lactic-co-glycolic acid), PLGA. (5) A soft-matter cell model is developed to explore the interaction of living cells with artificial surfaces. The effect of surface properties on the adhesion dynamics of cells is discussed.
Abstract:
The assessment of historical structures is a significant need for future generations, as historical monuments represent a community's identity and have an important cultural value to society. Most historical structures were built using masonry, one of the oldest and most common construction materials used in the building sector since ancient times. Masonry is also considered a complex material: being a composite of brick units and mortar, it affects the structural performance of the building through a mechanical behaviour that varies with the geometry and quality of its components.
Abstract:
The production of hypernuclei was studied in peripheral heavy-ion reactions, in which a carbon foil was irradiated with $^6$Li projectiles at a beam energy of 2 A GeV. Clear signals for $\Lambda$, $^3_{\Lambda}$H and $^4_{\Lambda}$H were observed in their respective invariant-mass distributions from mesonic decay. This work presents an independent data analysis aimed at verifying earlier results of the HypHI collaboration. For this purpose, a new track reconstruction based on a Kalman-filter approach and two different algorithms for the reconstruction of secondary vertices were developed. The invariant masses of the $\Lambda$ hyperon and of the $^3_{\Lambda}$H and $^4_{\Lambda}$H hypernuclei were determined to be 1109.6 ± 0.4, 2981.0 ± 0.3 and 3898.1 ± 0.7 MeV/c², with statistical significances of 9.8σ, 12.8σ and 7.3σ, respectively. The results obtained in this work agree with the earlier analysis. The yield ratio of the two hypernuclei was determined to be N($^3_{\Lambda}$H)/N($^4_{\Lambda}$H) ≈ 3. This indicates that the production mechanism for hypernuclei in heavy-ion-induced reactions in the projectile rapidity region cannot be described by a coalescence mechanism alone; secondary pion- and kaon-induced reactions and Fermi break-up are also involved.
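The signals mentioned above are extracted from invariant-mass distributions of the decay products. As a reminder of the underlying kinematics only (not code from this work), the invariant mass of a two-body decay candidate can be computed as follows; the units are assumed consistent (e.g. MeV with c = 1), and the example four-momenta are hypothetical:

```python
import math

def invariant_mass(p1, p2):
    """Invariant mass of a two-particle candidate.

    Each particle is a tuple (E, px, py, pz) in consistent units with c = 1;
    M^2 = (E1 + E2)^2 - |p1 + p2|^2.
    """
    e = p1[0] + p2[0]
    px, py, pz = (p1[i] + p2[i] for i in range(1, 4))
    m2 = e * e - (px * px + py * py + pz * pz)
    return math.sqrt(max(m2, 0.0))

# Hypothetical daughter four-momenta (MeV), for illustration only.
print(invariant_mass((966.0, 100.0, 50.0, 200.0), (180.0, -60.0, -20.0, 100.0)))
```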
Abstract:
Heart diseases are the leading cause of death worldwide, for both men and women. However, the ionic mechanisms underlying many cardiac arrhythmias and genetic disorders are not completely understood, leading to limited efficacy of the currently available therapies and leaving many open questions for cardiac electrophysiologists. On the other hand, the availability of experimental data is still a major issue in this field: most experiments are performed in vitro and/or using animal models (e.g. rabbit, dog and mouse), even when the final aim is to better understand the electrical behaviour of the in vivo human heart in either physiological or pathological conditions. Computational modelling constitutes a primary tool in cardiac electrophysiology: in silico simulations, based on the available experimental data, can help to understand the electrical properties of the heart and the ionic mechanisms underlying a specific phenomenon. Once validated, mathematical models can be used to make predictions and test hypotheses, thus suggesting potential therapeutic targets. This PhD thesis applies computational modelling of the human single-cell action potential (AP) to three clinical scenarios, in order to gain new insights into the ionic mechanisms involved in the electrophysiological changes observed in vitro and/or in vivo. The first context is blood electrolyte variations, which may occur in patients due to different pathologies and/or therapies; in particular, we focused on extracellular Ca2+ and its effect on the AP duration (APD). The second context is haemodialysis (HD) therapy: in addition to blood electrolyte variations, patients undergo many other changes during HD, e.g. in heart rate, cell volume, pH, and sympatho-vagal balance. The third context is human hypertrophic cardiomyopathy (HCM), a genetic disorder characterised by an increased arrhythmic risk and still lacking a specific pharmacological treatment.
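As a purely illustrative aside (the thesis uses detailed human AP models; the stand-in below is the generic FitzHugh-Nagumo excitable-cell model, with invented, dimensionless parameters), the basic in silico workflow of stimulating a single cell and quantifying an AP-duration-like metric can be sketched as:

```python
import numpy as np

def simulate_ap(i_stim=0.8, t_stim=(5.0, 6.0), t_end=100.0, dt=0.01):
    """Forward-Euler integration of the FitzHugh-Nagumo model, used here only
    as a generic stand-in for the detailed human AP models referred to above.
    All quantities are dimensionless."""
    a, b, eps = 0.7, 0.8, 0.08
    n = int(t_end / dt)
    t = np.arange(n) * dt
    v = np.empty(n)
    w = np.empty(n)
    v[0], w[0] = -1.2, -0.625          # approximate resting state
    for k in range(n - 1):
        i_ext = i_stim if t_stim[0] <= t[k] < t_stim[1] else 0.0
        v[k + 1] = v[k] + dt * (v[k] - v[k] ** 3 / 3.0 - w[k] + i_ext)
        w[k + 1] = w[k] + dt * (eps * (v[k] + a - b * w[k]))
    return t, v

def ap_duration_like(t, v, threshold=0.0):
    """Crude duration metric: total time v spends above a threshold
    (illustrative only, not the APD90 used in cardiac electrophysiology)."""
    return float(np.sum(v > threshold) * (t[1] - t[0]))

t, v = simulate_ap()
print(f"time above threshold: {ap_duration_like(t, v):.1f} (dimensionless)")
```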
Abstract:
The mechanical action of the heart is made possible by electrical events involving the cardiac cells, a property that places heart tissue among the excitable tissues. At the cellular level, the electrical event is the signal that triggers mechanical contraction, inducing a transient increase in intracellular calcium which, in turn, carries the message of contraction to the contractile proteins of the cell. The primary goal of my project was to implement in CUDA (Compute Unified Device Architecture, the parallel computing platform created by NVIDIA) a tissue model of the rabbit sinoatrial node, in order to evaluate the heterogeneity of its structure and how that variability influences the behaviour of the cells. In particular, each cell has its own intrinsic discharge frequency, different from that of every other cell in the tissue, and it is interesting to study how the cells synchronize and which discharge frequency they settle on once synchronized.
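The synchronization question raised above can be illustrated, far more simply than with the CUDA tissue model of the thesis, by a toy population of coupled phase oscillators with heterogeneous intrinsic frequencies (a Kuramoto-style sketch; all parameters below are invented for this example):

```python
import numpy as np

rng = np.random.default_rng(0)

n = 128                                  # number of "cells"
omega = rng.normal(2.0, 0.2, n)          # heterogeneous intrinsic frequencies
theta = rng.uniform(0.0, 2 * np.pi, n)   # initial phases
coupling, dt, steps = 1.0, 0.01, 5000

for _ in range(steps):
    # each oscillator is pulled toward the phases of all the others
    pull = np.mean(np.sin(theta[None, :] - theta[:, None]), axis=1)
    theta = theta + dt * (omega + coupling * pull)

# after synchronization the instantaneous frequencies cluster near a common value
pull = np.mean(np.sin(theta[None, :] - theta[:, None]), axis=1)
inst_freq = omega + coupling * pull
print("spread of intrinsic frequencies:    ", np.std(omega))
print("spread of instantaneous frequencies:", np.std(inst_freq))
```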
Abstract:
Big Data have forged new technologies that improve the quality of life by combining heterogeneous representations of data from various disciplines. A real-time system is therefore needed, able to process data as it arrives. Such a system is called a speed layer: as the name suggests, it is designed to guarantee that new data are reflected by the query functions as quickly as they arrive. This thesis concerns the implementation of an architecture modelled on the Speed Layer of the Lambda Architecture, able to receive meteorological data published on an MQTT queue, process it in real time, and store it in a database so that it is available to data scientists. The programming environment used is Java; the project was deployed on the Hortonworks platform, which is based on the Hadoop framework and on the Storm computation system, which makes it possible to work with unbounded data streams, performing the processing in real time. Unlike traditional stream-processing approaches based on networks of queues and workers, Storm is fault-tolerant and scalable. The effort devoted to its development by the Apache Software Foundation, its growing use in production by major companies, and the support from cloud-hosting providers are all signs that this technology will become an increasingly common solution for managing event-oriented distributed computation. To store and analyse these volumes of data, which have always posed a problem that traditional databases could not handle, a non-relational database was used: HBase.
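As a simplified illustration of the speed-layer idea (not the thesis implementation, which uses Java, Storm on Hortonworks, and HBase), the sketch below subscribes to an MQTT topic, applies a trivial real-time transformation, and appends the result to a local store; the broker address, topic name, and payload format are assumptions, and the paho-mqtt 1.x client API is assumed:

```python
import json
import paho.mqtt.client as mqtt

BROKER = "localhost"          # hypothetical broker
TOPIC = "weather/readings"    # hypothetical topic

def on_message(client, userdata, msg):
    # Assumed payload shape, e.g. {"station": "S1", "temp_c": 21.5}
    reading = json.loads(msg.payload)
    reading["temp_f"] = reading["temp_c"] * 9 / 5 + 32   # toy real-time enrichment
    store(reading)

def store(reading):
    # Stand-in for the HBase write; here the record is appended to a local file.
    with open("speed_layer_out.jsonl", "a") as f:
        f.write(json.dumps(reading) + "\n")

client = mqtt.Client()
client.on_message = on_message
client.connect(BROKER, 1883, 60)
client.subscribe(TOPIC)
client.loop_forever()
```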
Abstract:
To assess whether finite element (FE) models can be used to predict the deformation of the femoropopliteal segment during knee flexion.
Abstract:
Breast cancer is the most common cancer among women, and tamoxifen is the preferred drug for estrogen receptor-positive breast cancer treatment. Many of these cancers are intrinsically resistant to tamoxifen or acquire resistance during treatment. Consequently, there is an ongoing need for breast cancer drugs with different molecular targets. Previous work has shown that 8-mer and cyclic 9-mer peptides inhibit breast cancer in mouse and rat models, interacting with an as yet unsolved receptor, whereas peptides smaller than eight amino acids did not. We show that replica exchange molecular dynamics predicts the structure and dynamics of the active peptides, leading to the discovery of smaller peptides with full biological activity. The simulations identified smaller peptide analogues with the same conserved reverse turn demonstrated in the larger peptides. These analogues were synthesized and shown to inhibit estrogen-dependent cell growth in a mouse uterine growth assay, a test showing reliable correlation with human breast cancer inhibition.
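For context (a minimal sketch, not code from the study): replica exchange molecular dynamics runs copies of the system at several temperatures and periodically proposes swaps between neighbouring replicas, accepting a swap between replicas i and j with probability min(1, exp[(1/kT_i - 1/kT_j)(E_i - E_j)]). That acceptance step can be written as:

```python
import math
import random

def swap_accepted(energy_i, energy_j, temp_i, temp_j, k_b=1.0):
    """Metropolis acceptance test for exchanging two replicas.

    Accepts the swap with probability min(1, exp[(1/kT_i - 1/kT_j)(E_i - E_j)]).
    Energies and temperatures are in consistent, arbitrary units."""
    beta_i = 1.0 / (k_b * temp_i)
    beta_j = 1.0 / (k_b * temp_j)
    delta = (beta_i - beta_j) * (energy_i - energy_j)
    return delta >= 0 or random.random() < math.exp(delta)

# Illustrative use with invented numbers: a higher-energy configuration held at
# the lower temperature is readily handed to the hotter replica, which helps
# the simulation cross conformational barriers.
print(swap_accepted(energy_i=-95.0, energy_j=-100.0, temp_i=300.0, temp_j=330.0))
```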
Abstract:
Background: Breast cancer is the most common cancer among women. Tamoxifen is the preferred drug for estrogen receptor-positive breast cancer treatment, yet many of these cancers are intrinsically resistant to tamoxifen or acquire resistance during treatment. Therefore, scientists are searching for breast cancer drugs that have different molecular targets. Methodology: Recently, a computational approach was used to successfully design peptides that are new lead compounds against breast cancer. We used replica exchange molecular dynamics to predict the structure and dynamics of active peptides, leading to the discovery of smaller bioactive peptides. Conclusions: These analogs inhibit estrogen-dependent cell growth in a mouse uterine growth assay, a test showing reliable correlation with human breast cancer inhibition. We outline the computational methods that were tried and used along with the experimental information that led to the successful completion of this research.