963 results for API (Application Programming Interface)
Abstract:
The objective of this thesis was to improve the commercial CFD software Ansys Fluent to obtain a tool able to perform accurate simulations of flow boiling in the slug flow regime. A reliable numerical framework allows a better understanding of the bubble and flow dynamics induced by evaporation and makes it possible to predict wall heat transfer trends. In order to save computational time, the flow is modeled with an axisymmetric formulation. Vapor and liquid phases are treated as incompressible and in laminar flow. By means of a single-fluid approach, the flow equations are written as for a single-phase flow, but discontinuities at the interface and interfacial effects need to be accounted for and discretized properly. Ansys Fluent provides a Volume Of Fluid technique to advect the interface and to map the discontinuous fluid properties throughout the flow domain. The interfacial effects are dominant in boiling slug flow, and the accuracy of their estimation is fundamental for the reliability of the solver. Self-implemented functions, developed ad hoc, are introduced into the numerical code to compute the surface tension force and the rates of mass and energy exchange at the interface related to evaporation. Several validation benchmarks demonstrate the improved performance of the modified software. Various adiabatic configurations are simulated in order to test the capability of the numerical framework in modeling actual flows, and the comparison with experimental results is very positive. The simulation of a single evaporating bubble underlines the dominant contribution to the global heat transfer rate of the local transient heat convection in the liquid after the bubble's transit. The simulation of multiple evaporating bubbles flowing in sequence shows that their mutual influence can strongly enhance the heat transfer coefficient, up to twice the single-phase flow value.
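For orientation, a standard single-fluid VOF formulation with evaporation (consistent with, but not necessarily identical to, the implementation the abstract describes, which is not spelled out) advects the liquid volume fraction with an interfacial mass source:

$$ \frac{\partial \alpha}{\partial t} + \nabla\cdot(\alpha\mathbf{u}) = \frac{\dot m'''}{\rho_l}, \qquad \rho = \alpha\rho_l + (1-\alpha)\rho_v, \qquad \mathbf{F}_\sigma = \sigma\,\kappa\,\nabla\alpha, $$

where $\alpha$ is the liquid volume fraction, $\dot m'''$ the evaporative mass source per unit volume, and $\mathbf{F}_\sigma$ the surface tension entered into the momentum equation as a volumetric force (here in the widely used continuum-surface-force form, with $\sigma$ the surface tension coefficient and $\kappa$ the interface curvature); the same volume-fraction weighting maps viscosity and conductivity across the interface.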
Abstract:
This thesis is devoted to software reuse. Beyond a general introduction to the topic, the analysis is carried out at the level of the mechanisms offered by programming languages and of development techniques, with special attention to concurrency. The first chapter provides a general framework in which software reuse is described, together with the reasons for its importance and the crucial points of its implementation. Different levels of reuse are identified on the basis of the abstraction and the artifacts involved, and it is emphasized how languages contribute to reusability and to the realization of reuse. Next, the support for reuse provided by the object-oriented paradigm is explored, with code examples, in terms of encapsulation, inheritance, polymorphism, and composition. The discussion continues by analyzing different features – typing, interfaces, mixins, generics – offered by various programming languages, showing how they affect the reusability of software components. The chapter closes with a few contextual remarks on inversion of control, aspect-oriented programming, and the delegation mechanism. The second chapter embraces the topic of concurrency. After introducing the subject, some significant concurrency models are examined in depth: multi-threaded programming, tasks in the Ada language, SCOOP, and the Actor model. Their fundamental elements are described and their crucial aspects in terms of contribution to reuse are highlighted, with code examples. With regard to the Actor model, its implementation in Scala/Akka is presented as a case study. Finally, the inheritance anomaly problem is examined, on the basis of examples and of the three main classes of anomaly, and the susceptibility of the Scala/Akka concurrency support to such problems is analyzed. Moreover, this chapter notes how some aspects of the reuse/concurrency pairing, including its deeper meaning, have not yet been adequately addressed by the computer science community. The third and final chapter opens with an overview of agent-oriented programming, taking the simpAL language as a reference. It then attempts to extend the notion of reuse developed in the previous chapters to the case of agents.
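The thesis illustrates the Actor model with Scala/Akka; as a language-neutral sketch of the paradigm's core contribution to reuse (state encapsulated behind asynchronous message passing, with no shared-memory locking), here is a minimal, hypothetical actor in Python. Names and structure are illustrative only, not the thesis's code.

```python
# Minimal actor sketch: each actor owns its state and processes messages
# one at a time from a private mailbox, so its state needs no locks.
import queue
import threading
import time

class Actor:
    def __init__(self):
        self._mailbox = queue.Queue()
        threading.Thread(target=self._run, daemon=True).start()

    def send(self, message):
        """Asynchronous, non-blocking send (the role of Akka's `!`)."""
        self._mailbox.put(message)

    def _run(self):
        while True:
            self.receive(self._mailbox.get())  # strictly sequential processing

    def receive(self, message):
        raise NotImplementedError  # subclasses define the behaviour

class Counter(Actor):
    """Example: mutable state confined entirely to the actor."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def receive(self, message):
        if message == "inc":
            self.count += 1
        elif message == "print":
            print("count =", self.count)

counter = Counter()
for _ in range(3):
    counter.send("inc")
counter.send("print")  # eventually prints: count = 3
time.sleep(0.2)        # toy synchronization: let the worker drain the mailbox
```

In Akka, a counter like this would extend `Actor` and pattern-match messages in its `receive` method, with sends written `counter ! Inc`; the reuse-related point is the same: behaviour is specialized by overriding `receive`, without exposing any synchronization machinery.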
Abstract:
The objective of this thesis is the analysis of the static stress components of a tank, made of API 5L X52 steel, subjected to bending loads and internal pressure, using the finite element program PLCd4, developed at the International Center for Numerical Methods in Engineering (CIMNE - Barcelona). This type of analysis is part of the European ULCF project, whose goal is the study of ultra-low-cycle fatigue in steel structures. Before examining the complete structure of the tank, the behaviour of the material was studied in order to implement in the program a new type of curve that best represents the evolution of the internal stresses. During the preparatory work for the thesis, an algorithm for the distribution of surface pressures on 3D bodies was added to the program, and it was subsequently used for the analysis of the internal pressure in the tank. FEM analyses of the tank were carried out in different load configurations, seeking to model as accurately as possible the supporting structure of the real "full scale test" case. From an analytical point of view, the results obtained are satisfactory, as they reflect the correct behaviour of the tank under very high pressures and confirm the soundness of the program for computational analysis.
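The abstract does not detail the surface-pressure algorithm; in standard displacement-based FEM, distributing a pressure $p$ over the boundary of a 3D body amounts to computing consistent nodal forces on each loaded face, a formulation the implemented algorithm presumably follows in some form:

$$ \mathbf{f}^{e} = \int_{\Gamma^{e}} \mathbf{N}^{T}\, p\, \mathbf{n}\; d\Gamma, $$

where $\mathbf{N}$ collects the shape functions of the loaded element face $\Gamma^{e}$, $\mathbf{n}$ is the outward unit normal, and the integral is typically evaluated by Gauss quadrature face by face.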
Abstract:
The subject of this work is the diffusion of turbulence into a non-turbulent flow. This phenomenon is found in almost every practical case of turbulent flow: all types of shear flows (wakes, jets, boundary layers) present some boundary between the turbulence and the non-turbulent surroundings; all transients from laminar flow to turbulence must account for turbulent diffusion; and the mixing of flows often involves the injection of a turbulent fluid into a non-turbulent one. The mechanism of what Phillips defined as "the erosion by turbulence of the underlying non-turbulent flow" is called entrainment. It is usually considered to operate on two scales with different mechanics: small-scale nibbling, the entrainment of fluid by viscous diffusion of turbulence, and large-scale engulfment, which entraps large volumes of flow to be "digested" subsequently by viscous diffusion. The exact role of each in the overall entrainment rate is still not well understood, as is the interplay between these two mechanisms of diffusion. It is nevertheless accepted that the entrainment rate scales with the large-scale properties of the flow, while it is not understood how the large-scale inertial behavior can affect an intrinsically viscous phenomenon such as the diffusion of vorticity. In the present work we therefore address the problem of turbulent diffusion through pseudo-spectral DNS simulations of the interface between a volume of decaying turbulence and quiescent flow. These simulations give us first-hand measurements of the velocity, vorticity and strain fields at the interface; moreover, the framework of unforced decaying turbulence permits the study of both the spatial and the temporal evolution of these fields. The analysis shows that for this kind of flow the overall production of enstrophy, i.e. the square of the vorticity $\omega^2$, is dominated near the interface by the local inertial transport of "fresh vorticity" coming from the turbulent flow. Viscous diffusion instead plays the major role in enstrophy production on the outer side of the interface, where the nibbling process is dominant. The data from our simulations seem to confirm the theory of an inertially stirred viscous phenomenon proposed by other authors and provide new data about the inertial diffusion of turbulence across the interface.
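For reference, the enstrophy budget underlying this analysis is the standard one for incompressible flow; the balance of $\omega^2/2$ reads

$$ \frac{\partial}{\partial t}\!\left(\frac{\omega^2}{2}\right) + u_j\,\frac{\partial}{\partial x_j}\!\left(\frac{\omega^2}{2}\right) = \omega_i\, s_{ij}\, \omega_j + \nu\,\nabla^2\!\left(\frac{\omega^2}{2}\right) - \nu\,\frac{\partial \omega_i}{\partial x_j}\frac{\partial \omega_i}{\partial x_j}, $$

where the left-hand side contains the inertial transport discussed above, and the right-hand side contains production by vortex stretching ($s_{ij}$ is the strain-rate tensor), viscous diffusion, and viscous dissipation of enstrophy.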
Abstract:
Liquid crystals (LCs) are an interesting class of soft condensed matter systems characterized by an unusual combination of fluidity and long-range order, mainly known for their applications in displays (LCDs). However, the interest in LCs continues to grow, pushed by their application in new technologies in medicine, optical imaging, micro- and nanotechnologies, etc. In LCDs, uniaxial alignment of LCs is mainly achieved by a rubbing process. During this treatment, the surfaces of polymer-coated display substrates are rubbed in one direction by a rotating cylinder covered with a rubbing cloth. Basically, LC alignment involves two possible aligning directions: uniaxial planar (homogeneous) and vertical (homeotropic) with respect to the display substrate. An interesting unresolved question concerning LCs regards the origin of their alignment on rubbed surfaces, and in particular on the polymeric ones used in the display industry. Most studies have shown that LCs on the surface of the rubbed polymer film lie parallel to the rubbing direction. In these systems, micrometric grooves are generated on the film surface along the rubbing direction, and the polymer chains are also stretched in this direction. Both the parallel-aligned microgrooves and the polymer chains at the film surface may play a role in the LC alignment, and it is not easy to quantify the effect of each contribution. The work described in this thesis is an attempt to find new microscopic evidence on the origin of LC alignment on polymeric surfaces through molecular dynamics (MD) simulations, which allow the investigation of the phenomenon in atomic detail. The importance of the arrangement of the polymer chains in LC alignment was studied by performing MD simulations of a thin film of a typical nematic LC, 4-cyano-4'-pentylbiphenyl (5CB), in contact with two different polymers: poly(methyl methacrylate) (PMMA) and polystyrene (PS). At least four factors are believed to influence the LC alignment: 1. the interactions of LCs with the backbone vinyl chains; 2. the interactions of LCs with the oriented side groups; 3. the anisotropic interactions of LCs with nanometric grooves; 4. the presence of static surface charges. Here we exclude the effects of microgrooves and of static surface charges from our virtual experiment by using flat and neutral polymer surfaces, with the aim of isolating the chemical driving factors influencing the alignment of LC phases on polymeric surfaces.
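As a note on how alignment is usually quantified in such MD studies (a standard observable, not one named in the abstract): the uniaxial order of the nematic phase is measured by the order parameter

$$ S = \langle P_2(\cos\theta)\rangle = \left\langle \frac{3\cos^2\theta - 1}{2} \right\rangle, $$

where $\theta$ is the angle between a molecular axis (e.g., the long axis of 5CB) and the director; $S$ approaches 1 for perfect uniaxial alignment and 0 for an isotropic phase.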
Abstract:
Biological membranes are one of the vital key elements of life but are also highly complex architectures. Therefore, various model membrane systems have been developed to enable systematic investigations of different membrane-related processes. A biomimetic model architecture should provide a simplified system which allows for systematic investigation of the membrane while maintaining essential membrane characteristics such as membrane fluidity or electrical sealing properties. This work focused on two complementary parts. In the first part, the behaviour of the whey protein β-lactoglobulin (βlg) at a membrane interface was investigated. Protein-lipid interactions were studied using Langmuir monolayers at the air-water interface and tethered bilayer lipid membranes. A combination of different surface-analytical techniques, such as surface plasmon spectroscopy, neutron reflectivity and electrochemical techniques, allowed for a detailed analysis of the underlying processes. These experiments showed that the protein adsorbed to hydrophobic monolayers in its native conformation, slightly flattened. If hydrophilic bilayers with defects were present, βlg penetrated the upper layer. Interactions with phospholipids were only observed if the protein had been denatured beforehand. Experiments at the air-water interface showed a more rigid conformation of the protein at acidic pH compared to alkaline pH. In the second part of this work, the structure of different model membrane systems was investigated. Solid-supported membrane systems have been established as powerful biomimetic architectures which allow for the systematic investigation of various membrane-related processes; additionally, these systems have been proposed for biosensing applications. Tethered bilayer lipid membranes (tBLMs) are one type of solid-supported membrane. The structure of the anchor lipid that tethers the membrane to the solid support has a significant impact on the membrane properties. Especially the sub-membrane part, which is defined by the spacer group, is important for the biological activity of incorporated membrane proteins. Various anchor lipids were synthesised with different spacer and anchor groups. An increase of the spacer length led to a direct increase of the water reservoir beneath the membrane. However, this elongation also resulted in an increased roughness of the monolayer and consequently in diminished mechanical and electrical bilayer qualities. Additionally, a cholesterol spacer was designed to modulate the membrane fluidity. Model membrane systems with an additional cholesterol spacer, or with additional cholesterol in the upper bilayer leaflet, also exhibited an increased water reservoir with only slightly diminished mechanical and electrical properties. Both parts show that tBLMs are very effective model systems that can be applied as biomimetic platforms to study, for example, lipid-protein interactions. They also enable the incorporation of ion channels and allow for potential biosensing applications.
Abstract:
Mainstream hardware is becoming parallel, heterogeneous, and distributed, on every desk, in every home, and in every pocket. As a consequence, in recent years software has been undergoing an epochal turn toward concurrency, distribution, and interaction, pushed by the evolution of hardware architectures and the growth of network availability. This calls for introducing further abstraction layers on top of those provided by classical mainstream programming paradigms, to tackle more effectively the new complexities that developers have to face in everyday programming. A convergence is recognizable in the mainstream toward the adoption of the actor paradigm as a means to unite object-oriented programming and concurrency. Nevertheless, we argue that the actor paradigm can only be considered a good starting point for a more comprehensive response to such a fundamental and radical change in software development. Accordingly, the main objective of this thesis is to propose Agent-Oriented Programming (AOP) as a high-level general-purpose programming paradigm, a natural evolution of actors and objects, introducing a further level of human-inspired concepts for programming software systems, meant to simplify the design and programming of concurrent, distributed, reactive/interactive programs. To this end, in the dissertation we first construct the required background by studying the state of the art of both actor-oriented and agent-oriented programming, and then focus on the engineering of integrated programming technologies for developing agent-based systems in their classical application domains: artificial intelligence and distributed artificial intelligence. Then we shift the perspective, moving from the development of intelligent software systems toward general-purpose software development. Using the expertise accumulated during the background phase, we introduce a general-purpose programming language named simpAL, which is rooted in general principles and practices of software development and at the same time provides an agent-oriented level of abstraction for the engineering of general-purpose software systems.
Abstract:
A description of Natural User Interfaces and of the OpenNI 2.0 framework, including an application case study.
Abstract:
The central topic of this thesis is the study of algorithms for type checking, both from the programming-language and from the proof-theoretic point of view. A type checking algorithm takes a program or a proof, represented as a syntactical object, and checks its validity with respect to a specification or a statement. It is a central component of compilers and proof assistants. We postulate that, since type checkers sit at the interface between proof theory and program theory, their study can let these two fields mutually enrich each other. We argue this through two main instances: first, starting from the problem of proof reuse, we develop an incremental type checker; second, starting from a type checking program, we exhibit a novel correspondence between natural deduction and the sequent calculus.
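To make the notion of a type checking algorithm concrete, here is a minimal sketch, in Python, of a checker for the simply typed lambda calculus; this is a generic textbook illustration, not the calculus or algorithm studied in the thesis.

```python
# Minimal type checker for the simply typed lambda calculus.
# Types: "bool" or ("->", t1, t2). Terms: ("var", x), ("lam", x, t, body),
# ("app", f, a), ("true",), ("false",).

def check(env, term, expected):
    """Check that `term` has type `expected` in environment `env`."""
    actual = infer(env, term)
    if actual != expected:
        raise TypeError(f"expected {expected}, got {actual}")

def infer(env, term):
    """Synthesize the type of `term` in environment `env` (a dict)."""
    tag = term[0]
    if tag in ("true", "false"):
        return "bool"
    if tag == "var":
        return env[term[1]]
    if tag == "lam":                      # ("lam", x, arg_type, body)
        _, x, arg_type, body = term
        body_type = infer({**env, x: arg_type}, body)
        return ("->", arg_type, body_type)
    if tag == "app":                      # ("app", f, a)
        f_type = infer(env, term[1])
        if not (isinstance(f_type, tuple) and f_type[0] == "->"):
            raise TypeError(f"not a function: {f_type}")
        check(env, term[2], f_type[1])    # argument must match the domain
        return f_type[2]
    raise ValueError(f"unknown term: {term}")

# Example: the identity on booleans applied to true has type bool.
ident = ("lam", "x", "bool", ("var", "x"))
print(infer({}, ("app", ident, ("true",))))   # -> bool
```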
Abstract:
The development of a multibody model of a motorbike engine cranktrain is presented in this work, with an emphasis on flexible component model reduction. A modelling methodology based upon the adoption of non-ideal joints at interface locations and the inclusion of component flexibility is developed: both are necessary if one wants to capture the dynamic effects which arise in lightweight, high-speed applications. With regard to the first topic, both a ball bearing model and a journal bearing model are implemented in order to properly capture the dynamic effects of the main connections in the system: angular-contact ball bearings are modelled according to a five-DOF nonlinear scheme in order to capture the behaviour of the crankshaft main bearings, while an impedance-based hydrodynamic bearing model is implemented, providing an enhanced prediction of operation at the conrod big-end locations. Concerning the second matter, flexible models of the crankshaft and the connecting rod are produced. The well-established Craig-Bampton reduction technique is adopted as a general framework to obtain reduced model representations which are suitable for the subsequent multibody analyses. A particular component mode selection procedure is implemented, based on the concept of Effective Interface Mass, allowing an assessment of the accuracy of the reduced models prior to the nonlinear simulation phase. In addition, a procedure to alleviate the effects of modal truncation, based on the Modal Truncation Augmentation approach, is developed. In order to assess the performance of the proposed modal reduction schemes, numerical tests are performed on the crankshaft and conrod models in both the frequency and the modal domain. A multibody model of the cranktrain is eventually assembled and simulated using commercial software. Numerical results are presented, demonstrating the effectiveness of the implemented flexible model reduction techniques, and the advantages over the conventional frequency-based truncation approach are discussed.
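In its standard form, the Craig-Bampton reduction mentioned above partitions the component degrees of freedom into boundary DOFs $u_b$ and interior DOFs $u_i$ and approximates

$$ \begin{Bmatrix} u_b \\ u_i \end{Bmatrix} \approx \begin{bmatrix} \mathbf{I} & \mathbf{0} \\ \boldsymbol{\Psi}_{ib} & \boldsymbol{\Phi}_{iq} \end{bmatrix} \begin{Bmatrix} u_b \\ q \end{Bmatrix}, \qquad \boldsymbol{\Psi}_{ib} = -\mathbf{K}_{ii}^{-1}\mathbf{K}_{ib}, $$

where $\boldsymbol{\Psi}_{ib}$ are the static constraint modes, $\boldsymbol{\Phi}_{iq}$ is a truncated set of fixed-interface normal modes, and $q$ are the corresponding modal coordinates; the Effective Interface Mass criterion cited in the abstract ranks which fixed-interface modes to retain in $\boldsymbol{\Phi}_{iq}$.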
A farm-level programming model to compare the atmospheric impact of conventional and organic farming
Abstract:
A model is developed to represent the activity of a farm using linear programming. The model has two main components: the balance of soil fertility and livestock nutrition. According to the first, the farm has a total nitrogen requirement, which can be met either from internal sources (manure) or from external sources (fertilisers). The second component describes the animal husbandry as having a nutritional requirement which must be satisfied through the internal production of arable crops or the purchase of feed from the market. The farmer is assumed to maximise the total net income from the agricultural and zootechnical activities by choosing one rotation among those feasible for the local climate and slope. The analysis takes a short-run perspective: the structure of the farm is assumed to be fixed, with no possibility of changing the allocation of permanent crops or the size of the herd. The model is integrated with an environmental module that describes the role of the farm within the carbon-nitrogen cycle. On the one hand, the farm stores carbon through the photosynthesis of the plants and the accumulation of carbon in the soil; on the other, some farm activities emit greenhouse gases into the atmosphere. The model is tested on some representative farms of the Emilia-Romagna region, proving capable of producing different results for conventional and organic farming and providing first results concerning their different atmospheric impacts. The relevant data about the representative farms and the feasible rotations are extracted from the FADN database, integrated with coefficients from the literature.
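As an illustration of the linear programming structure described, here is a toy sketch with hypothetical activities and coefficients; it mirrors the land, nitrogen-balance and feed constraints of the abstract but is not the thesis's model or data.

```python
# Toy farm LP in the spirit of the abstract: choose crop areas and purchased
# feed to maximise net income subject to land, nitrogen and feed constraints.
# All coefficients are hypothetical placeholders.
from scipy.optimize import linprog

# Decision variables: x = [area_wheat_ha, area_maize_ha, purchased_feed_t]
profit = [400.0, 550.0, -200.0]        # net income per unit (feed is a cost)
c = [-p for p in profit]               # linprog minimises, so negate

A_ub = [
    [1.0, 1.0, 0.0],                   # land: total cropped area <= 50 ha
    [120.0, 180.0, 0.0],               # N demand <= available N (manure + cap)
    [-2.0, -6.0, -1.0],                # feed: crop output + purchases >= demand
]
b_ub = [50.0, 9000.0, -120.0]          # the feed row is negated into <= form

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 3)
print(res.x, -res.fun)                 # optimal plan and maximised net income
```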
Abstract:
A thesis concerning the creation of all the graphical assets needed for a first-person 3D video game, using Blender and Unity3D. The topics covered are: design, 3D modeling, texturing, and shading.
Abstract:
This thesis covers the entire process of designing a video game and implementing it. The topics covered are: Unity, design & gameplay, and the implementation of the project.
Abstract:
In this thesis, I present the realization of a fiber-optical interface using optically trapped cesium atoms, which is an efficient tool for coupling light and atoms. The basic principle of the presented scheme relies on the trapping of neutral cesium atoms in a two-color evanescent field surrounding a nanofiber. The strong confinement of the fiber-guided light, which also protrudes outside the nanofiber, provides strong confinement of the atoms as well as efficient coupling to near-resonant light propagating through the fiber. In chapter 1, the necessary physical and mathematical background describing the propagation of light in an optical fiber is presented. The exact solution of Maxwell's equations allows us to model the fiber-guided light fields which give rise to the trapping potentials and the atom-light coupling in the close vicinity of a nanofiber. Chapter 2 gives the theoretical background of light-atom interaction. A quantum mechanical model of the light-induced shifts of the relevant atomic levels is reviewed, which allows us to quantify the perturbation of the atomic states due to the presence of the trapping light fields. The experimental realization of the fiber-based atom trap is the focus of chapter 3. Here, I analyze the properties of the fiber-based trap in terms of the confinement of the atoms and the impact of several heating mechanisms. Furthermore, I demonstrate the transportation of the trapped atoms as a first step towards a deterministic delivery of individual atoms. In chapter 4, I present the successful interfacing of the trapped atomic ensemble and fiber-guided light. Three different approaches are discussed, i.e., those involving the measurement of either near-resonant scattering in absorption or the emission into the guided mode of the nanofiber. In the analysis of the spectroscopic properties of the trapped ensemble we find good agreement with the predictions of the theoretical model discussed in chapter 2. In addition, I introduce a non-destructive scheme for the interrogation of the atomic state, which is sensitive to phase shifts of far-detuned fiber-guided light interacting with the trapped atoms. The inherent birefringence in our system, induced by the atoms, changes the state of polarization of the probe light and can thus be detected via a Stokes vector measurement.
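For context on the light-induced shifts underpinning the two-color trap, the standard far-off-resonance dipole potential (a simplification of the multi-level model of chapter 2) is

$$ U(\mathbf{r}) = -\frac{1}{2\epsilon_0 c}\,\operatorname{Re}\{\alpha(\omega)\}\; I(\mathbf{r}), $$

where $\alpha(\omega)$ is the atomic dynamic polarizability and $I$ the light intensity; a red-detuned evanescent field ($\operatorname{Re}\,\alpha > 0$) attracts atoms toward the fiber while a blue-detuned one repels them, and their different evanescent decay lengths allow a trapping minimum to form at a finite distance from the fiber surface.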