930 results for OpenFOAM, diesel, banco di flussaggio, open source
Abstract:
This thesis presents a survey of the wide range of modern dense matching algorithms, which are spreading across different application and research fields, with particular attention to the innovative "Semi-Global" matching techniques. The choice to develop a semi-global numerical code was motivated by the need to gain insight into the variables and strategies that affect the algorithm's performance, with the primary objective of maximizing the method's accuracy and efficiency and the completeness of the results. The dissertation consists of the metrological characterization of our proprietary implementation of the semi-global matching algorithm, evaluating the influence of several matching variables and functions implemented in the process, and comparing the accuracy and completeness of different results (digital surface models, disparity maps and 2D displacement fields) obtained with our code and with other commercial and open-source matching programs in a wide variety of application fields.
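For context, the core of the standard semi-global matching scheme (Hirschmüller's formulation) aggregates matching costs along several image paths as recalled below; P1 and P2 are the usual small- and large-jump disparity penalties. This is the textbook form of the recursion, not the proprietary implementation characterized in the thesis.

```latex
% Textbook semi-global matching cost aggregation along one path direction r,
% followed by the sum over all paths (not the thesis' proprietary variant).
L_{\mathbf{r}}(\mathbf{p}, d) = C(\mathbf{p}, d)
  + \min\Bigl( L_{\mathbf{r}}(\mathbf{p}-\mathbf{r}, d),\;
               L_{\mathbf{r}}(\mathbf{p}-\mathbf{r}, d-1) + P_1,\;
               L_{\mathbf{r}}(\mathbf{p}-\mathbf{r}, d+1) + P_1,\;
               \min_{k} L_{\mathbf{r}}(\mathbf{p}-\mathbf{r}, k) + P_2 \Bigr)
  - \min_{k} L_{\mathbf{r}}(\mathbf{p}-\mathbf{r}, k),
\qquad
S(\mathbf{p}, d) = \sum_{\mathbf{r}} L_{\mathbf{r}}(\mathbf{p}, d)
```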
Abstract:
The potential for sharing environmental data and models is huge, but doing so can be challenging for experts without specific programming expertise. We built an open-source, cross-platform framework (‘Tzar’) to run models across distributed machines. Tzar is simple to set up and use, allows dynamic parameter generation, and enhances reproducibility by accessing versioned data and code. Combining Tzar with Docker lowers the entry barrier further by versioning and bundling all required modules and dependencies, together with the database needed to schedule work.
Abstract:
Starting from the Schumpeterian, producer-driven understanding of innovation, followed by user-generated solutions and collaborative forms of co-creation, scholars have investigated the drivers and the nature of the interactions underpinning success in various ways. The innovation literature has come a long way, and open innovation has attracted researchers to problems such as the compatibility of external resources, networks of innovation, and open source collaboration. Openness itself has gained various shades of meaning in the different strands of the literature. In this paper the author provides an overview and a preliminary evaluation of the different models of open innovation, illustrated with empirical findings from various fields drawn from the literature. She points to the relevance of transaction costs in shaping viable (open) innovation strategies of firms, and to the importance of defining the locus of innovation for further analyses of different firm- and interaction-level formations.
Abstract:
Over the last decade we have witnessed the transition of a large share of businesses from offline to online. Thanks to the new relationship between company and customer enabled by technology, many marketing methods could almost instantly be revolutionized. The web has enabled broad-spectrum analysis of users and their opinions. Measuring with precision the users' purchase conversion rate reported by advertising platforms, and following user behavior at scale across the web, are operations that have always been extremely difficult to perform in the real world. Several commercial applications are available for these tasks, but their cost can be considerable for companies. This thesis aims to provide an analysis of an open source platform for collecting data from the web into a structured database.
Abstract:
Cloud computing enables independent end users and applications to share data and pooled resources, possibly located in geographically distributed data centers, in a fully transparent way. This need is particularly felt by scientific applications that must exploit distributed resources in an efficient and scalable way to process large amounts of data. This paper proposes an open solution to deploy a Platform as a Service (PaaS) over a set of multi-site data centers by applying open source virtualization tools to facilitate operation among virtual machines while optimizing the usage of distributed resources. An experimental testbed is set up in an OpenStack environment and evaluated with different types of TCP sample connections, to demonstrate the functionality of the proposed solution and to obtain throughput measurements in relation to the relevant design parameters.
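Purely as an illustration of the kind of measurement described above, the sketch below streams a fixed amount of data over a single TCP connection and reports the achieved throughput; the host, port and transfer size are hypothetical, and this is not the paper's testbed code.

```python
# Hypothetical TCP throughput probe (not the paper's testbed code): stream a
# fixed amount of data over one sample connection and report the throughput.
import socket
import time


def measure_tcp_throughput(host: str, port: int,
                           total_bytes: int = 64 * 1024 * 1024,
                           chunk_size: int = 64 * 1024) -> float:
    """Send total_bytes to (host, port) and return throughput in Mbit/s."""
    payload = b"\x00" * chunk_size
    sent = 0
    start = time.monotonic()
    with socket.create_connection((host, port)) as sock:
        while sent < total_bytes:
            sock.sendall(payload)
            sent += chunk_size
    elapsed = time.monotonic() - start
    return (sent * 8) / (elapsed * 1e6)


if __name__ == "__main__":
    # Assumes a sink (e.g. a VM in the testbed) is listening on this address.
    print(f"{measure_tcp_throughput('10.0.0.2', 5001):.1f} Mbit/s")
```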
Abstract:
Decision support systems have been widely used for years in companies to gain insights from internal data and thus make successful decisions. Lately, thanks to the increasing availability of open data, these systems are also integrating open data to enrich the decision-making process with external data. On the other hand, within an open-data scenario, decision support systems can also be useful for deciding which data should be opened, considering not only technical or legal constraints but also other requirements, such as the "reusing potential" of the data. In this talk, we focus on both issues: (i) open data for decision making, and (ii) decision making for opening data. We will first briefly comment on some research problems regarding the use of open data for decision making. Then, we will give an outline of a novel decision-making approach (based on how open data is actually being used in open-source projects hosted on GitHub) for supporting open data publication. Bio of the speaker: Jose-Norberto Mazón holds a PhD from the University of Alicante (Spain). He is head of the "Cátedra Telefónica" on Big Data and coordinator of the Computing degree at the University of Alicante. He is also a member of the WaKe research group at the University of Alicante. His research work focuses on open data management, data integration and business intelligence within "big data" scenarios, and their application to the tourism domain (smart tourism destinations). He has published his research in international journals such as Decision Support Systems, Information Sciences, Data & Knowledge Engineering and ACM Transactions on the Web. Finally, he is involved in the open data project at the University of Alicante, including its open data portal at http://datos.ua.es
Abstract:
Exhaust emissions from diesel engines are a substantial source of air pollution in this country. In recognition of this fact, the Environmental Protection Agency has issued strict new regulations, due to take effect in 1991 and 1994, that will drastically reduce the amount of some pollutants these engines will be allowed to emit. The technology is not currently available to produce diesel engines that can meet these regulations without large penalties in engine performance and efficiency. One technique that offers promise of reducing emissions from both existing engines and new engines is alcohol fumigation.
Abstract:
This thesis was written during a training experience at Parker Hannifin Manufacturing – Calzoni Division in Anzola dell'Emilia (BO). The thesis work concerned the safety analysis and the failure analysis of an assembly and testing bench for radial-piston hydraulic motors. The work was organized in several phases: a first phase devoted to the functional analysis (mechanical, hydraulic, electrical and electronic) of the equipment, followed by a phase dedicated to the study of the various standards needed to draw up the safety report. All the safety requirements of the equipment were then assessed and analyzed (risk assessment) in order to compile the technical file. Finally, the failure analysis (FMEA) was carried out according to the method adopted by the company. The first and second chapters introduce the industrial purpose and a general description of the equipment. The third chapter contains a detailed description of the hydraulic system and its possible configurations, while the fourth introduces the safety standards referred to. Finally, the fifth and sixth chapters contain, respectively, the technical file and the failure analysis of the industrial equipment studied.
Abstract:
The goal of this work is to combine in a single plant two technologies used for different purposes (an integrated plant): an open-loop, low-enthalpy geothermal air-conditioning system and a Pump & Treat groundwater remediation system. The site selected for the study is located in via Lombardia, in the industrial area of Ozzano dell'Emilia (BO), and is known as the "Ex stabilimento Ot-Gal": a former electroplating plant for metal treatment, decommissioned at the end of the 1990s. During a previous site characterization carried out by Geo-Net Srl, exceedances of the contamination threshold concentrations (CSC) set by Legislative Decree (D.lgs.) 152/2006 were detected in the groundwater for some contaminants, in particular trichloroethylene (TCE) and 1,1-dichloroethylene (1,1-DCE). Subsequently, in 2010-2011, Geo-Net Srl performed a partial remediation of the groundwater using a Pump and Treat system. Thanks to all the previous monitoring, testing and drilling data made available for this study by Geo-Net Srl, it was possible to carry out a theoretical experiment, in the form of numerical flow and transport modelling, on the integrated plant under study. The modelling was performed using numerical models based on the MODFLOW code and on codes commonly associated with it, such as MODPATH and MT3DMS. The analysis of the results made it possible to accurately evaluate the integration of these two technologies in a single plant. In particular, remediation within the site is achieved 15 years after start-up. The costs of building and operating the integrated plant were also compared with those of a traditional plant. This comparison showed that the integrated plant pays for itself in 13 years and that the remaining 7 years of operation yield an economic saving.
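For reference, the numerical flow models mentioned here are built on MODFLOW, whose documented governing equation for three-dimensional groundwater flow has the standard form below (this is the code's general formulation, not the site-specific parametrization adopted for the Ozzano model):

```latex
% Governing equation of three-dimensional groundwater flow solved by MODFLOW
% (standard documented form; not the site-specific parametrization).
\frac{\partial}{\partial x}\!\left(K_{xx}\,\frac{\partial h}{\partial x}\right)
+ \frac{\partial}{\partial y}\!\left(K_{yy}\,\frac{\partial h}{\partial y}\right)
+ \frac{\partial}{\partial z}\!\left(K_{zz}\,\frac{\partial h}{\partial z}\right)
+ W = S_s\,\frac{\partial h}{\partial t}
```

Here h is the hydraulic head, Kxx, Kyy, Kzz are the principal hydraulic conductivities, W is the volumetric flux per unit volume representing sources and sinks, and Ss is the specific storage.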
Abstract:
FEA simulation of thermal metal cutting is central to interactive design and manufacturing. It is therefore relevant to assess the applicability of open FEA software to simulate 2D heat transfer in metal sheet laser cuts. The use of open source codes (e.g. FreeFem++, FEniCS, MOOSE) makes additional scenarios possible (e.g. parallel execution, CUDA, etc.) at lower cost. However, a precise assessment is required of the scenarios in which open software can be a sound alternative to a commercial one. This article contributes in this regard by presenting a comparison of the aforementioned free FEM software for the simulation of heat transfer in thin (i.e. 2D) sheets subject to a gliding laser point source. We use the commercial ABAQUS software as the reference against which the open software is compared. A convective linear thin-sheet heat transfer model, with and without material removal, is used. This article does not intend a full design of computer experiments. Our partial assessment shows that the thin-sheet approximation turns out to be adequate in terms of the relative error for linear alumina sheets. For mesh resolutions finer than 10e−5 m, the open and reference software temperatures differ by at most 1% of the temperature prediction. Ongoing work includes adaptive re-meshing, nonlinearities, sheet stress analysis and Mach (also called 'relativistic') effects.
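As an illustration of what such a convective thin-sheet model looks like, one common linear form is written below for a sheet of thickness δ losing heat by convection from both faces and heated by a gliding Gaussian laser spot; the symbols (absorbed power ηP, spot radius r0, traverse speed v, film coefficient h) are illustrative assumptions rather than the article's exact formulation.

```latex
% One common linear convective thin-sheet model with a gliding Gaussian laser
% source (illustrative form; symbols are assumptions, not the article's own).
\rho c_p \frac{\partial T}{\partial t}
  = k \left( \frac{\partial^2 T}{\partial x^2}
           + \frac{\partial^2 T}{\partial y^2} \right)
  - \frac{2h}{\delta}\,\bigl(T - T_\infty\bigr)
  + \frac{2\,\eta P}{\pi r_0^2\,\delta}\,
    \exp\!\left( -\,\frac{2\bigl[(x - v t)^2 + y^2\bigr]}{r_0^2} \right)
```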
Abstract:
The analysis of the wind flow around buildings is of great interest for wind energy assessment, pollutant dispersion control, natural ventilation, and pedestrian wind comfort and safety. Since LES turbulence models are computationally expensive when applied to real geometries, RANS models are still widely used. However, RANS models are very sensitive to the chosen turbulence parametrisation, and the results can vary according to the application. In this investigation, the wind flow around an isolated building is simulated using various RANS turbulence models in the open source code OpenFOAM, and the results are compared with benchmark experimental data. In order to confirm the numerical accuracy of the simulations, a grid dependency analysis is performed and the convergence index and rate are calculated. Hit rates are calculated for all the cases, the models that pass a validation criterion are analysed at different regions of the building roof, and the most accurate RANS models for the flow at each region are identified. The characteristics of the wind flow at each region are also analysed from the point of view of wind energy generation, and the most adequate wind turbine model for wind energy exploitation at each region of the building roof is chosen.
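The grid convergence index referred to here is commonly evaluated with Roache's formulation; the standard three-grid form is recalled below for reference (it is the generic definition, not a result taken from this investigation).

```latex
% Roache's grid convergence index for the fine grid of a three-grid study
% (generic definition, shown for reference only).
\varepsilon = \frac{f_2 - f_1}{f_1}, \qquad
p = \frac{\ln\!\bigl[(f_3 - f_2)/(f_2 - f_1)\bigr]}{\ln r}, \qquad
\mathrm{GCI}_{\mathrm{fine}} = \frac{F_s\,\lvert \varepsilon \rvert}{r^{p} - 1}
```

Here f1, f2 and f3 are the solutions on the fine, medium and coarse grids, r is the grid refinement ratio, p the observed order of convergence, and Fs a safety factor (typically 1.25).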
Abstract:
Master's dissertation, Language Sciences, Faculty of Human and Social Sciences, Universidade do Algarve, 2016
Abstract:
Computational models for the investigation of flows in deformable tubes are developed and implemented in the open source computing environment OpenFOAM. Various simulations for Newtonian and non-Newtonian fluids under various flow conditions are carried out and analyzed. First, simulations are performed to investigate the flow of a shear-thinning, non-Newtonian fluid in a collapsed elastic tube, and comparisons are made with experimental data. The fluid is modeled by means of the Bird-Carreau viscosity law. The computational domain of the deformed tube is constructed from data obtained via computed tomography imaging. Comparison of the computed velocity fields with ultrasound Doppler velocity profile measurements shows good agreement, as does the adjusted pressure drop along the tube's axis. Analysis of the shear rates shows that the shear-thinning effect of the fluid becomes relevant in the cross-sections with the largest deformation. Peristaltic motion is then simulated by means of upper and lower rollers squeezing the fluid along a tube. Two frames of reference are considered: in the moving frame the computational domain is fixed and the coordinate system moves with the roller speed, while in the fixed frame the roller is represented by a deforming mesh. Several two-dimensional simulations are carried out for Newtonian and non-Newtonian fluids, and the effect of the shear-thinning behavior of the fluid on the transport efficiency is examined. In addition, the influence of the roller speed and of the gap width between the rollers on the transport efficiency is discussed. Comparison with experimental data is also presented, and different types of moving waves are implemented.
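For reference, the Bird-Carreau viscosity law used to model the shear-thinning fluid has the standard form below; the parameters shown (zero-shear and infinite-shear viscosities, relaxation time, power-law index) are the usual generic symbols, not the values fitted in this work.

```latex
% Standard Bird-Carreau viscosity law for a shear-thinning fluid
% (generic form; parameter values are not those of the simulations above).
\eta(\dot{\gamma}) = \eta_\infty
  + \bigl(\eta_0 - \eta_\infty\bigr)
    \Bigl[ 1 + \bigl(\lambda \dot{\gamma}\bigr)^{2} \Bigr]^{\frac{n-1}{2}}
```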
Abstract:
Current IEEE 802.11 wireless networks are vulnerable to session hijacking attacks, as the existing standards fail to address the lack of authentication of management frames and network card addresses and rely on loosely coupled state machines. Even the new WLAN security standard, IEEE 802.11i, does not address these issues. In our previous work, we proposed two new techniques for improving the detection of session hijacking attacks that are passive, computationally inexpensive, reliable, and have minimal impact on network performance. These techniques utilise unspoofable characteristics of the MAC protocol and the physical layer to enhance confidence in the intrusion detection process. This paper extends our earlier work and explores the usability, robustness and accuracy of these intrusion detection techniques by applying them to eight distinct test scenarios. A correlation engine has also been introduced to keep the false positives and false negatives at a manageable level. We also explore the process of selecting optimum thresholds for both detection techniques. For the purposes of our experiments, the Snort-Wireless open source wireless intrusion detection system was extended to implement these new techniques and the correlation engine. The absence of any false negatives and the low number of false positives in all eight test scenarios demonstrated the effectiveness of the correlation engine and the accuracy of the detection techniques.
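To illustrate the role of such a correlation engine, the sketch below combines two per-frame anomaly scores and only raises an alert when both detectors fire within a short window; the detector names, thresholds and window length are hypothetical assumptions and are not taken from the Snort-Wireless extension described above.

```python
# Hypothetical correlation-engine sketch: suppress an alert unless both the
# MAC-protocol and physical-layer detectors fire within the same time window.
# Detector names, thresholds and window length are illustrative assumptions.
from collections import deque
from dataclasses import dataclass


@dataclass
class Observation:
    timestamp: float        # seconds
    mac_layer_score: float  # anomaly score from the MAC-protocol detector
    phy_layer_score: float  # anomaly score from the physical-layer detector


class CorrelationEngine:
    def __init__(self, mac_threshold=0.8, phy_threshold=0.8, window_s=2.0):
        self.mac_threshold = mac_threshold
        self.phy_threshold = phy_threshold
        self.window_s = window_s
        self._recent = deque()  # observations inside the correlation window

    def update(self, obs: Observation) -> bool:
        """Return True only when both detectors fire within the same window."""
        self._recent.append(obs)
        # Drop observations that have fallen outside the correlation window.
        while self._recent and obs.timestamp - self._recent[0].timestamp > self.window_s:
            self._recent.popleft()
        mac_fired = any(o.mac_layer_score >= self.mac_threshold for o in self._recent)
        phy_fired = any(o.phy_layer_score >= self.phy_threshold for o in self._recent)
        return mac_fired and phy_fired


if __name__ == "__main__":
    engine = CorrelationEngine()
    # A lone MAC-layer anomaly is suppressed (reducing false positives) ...
    print(engine.update(Observation(0.0, 0.9, 0.1)))   # False
    # ... but corroboration from the physical layer raises an alert.
    print(engine.update(Observation(0.5, 0.2, 0.95)))  # True
```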
Abstract:
Access All was a performance produced following a three-month mentorship in web-based performance that I was commissioned to conduct for the performance company Igneous. This live, triple-site performance event for three performers in three remote venues was specifically designed for presentation at Access Grid Nodes: conference rooms located around the globe equipped with high-end, open source computer teleconferencing technology that allowed multiple nodes to cross-connect with each other. Whilst each room was set up somewhat differently, they all deployed the same basic infrastructure of multiple projectors, cameras and sound, as well as a reconfigurable floor space. At that time these relatively formal setups imposed a clear series of limitations in terms of software capabilities and basic infrastructure, so there was much interest in understanding how far their capabilities might be pushed.

Numerous performance experiments were undertaken between three Access Grid nodes at QUT Brisbane, VISLAB Sydney and the Manchester Supercomputing Centre, England, culminating in a public performance staged simultaneously between the sites, with local audiences at each venue and others online. Access All was devised in collaboration with the interdisciplinary performance company Bonemap, Kelli Dipple (Interarts curator, Tate Modern, London) and Mike Stubbs, British curator and Director of FACT (Liverpool).

This period of research and development was instigated and shaped by a public lecture I had earlier delivered in Sydney for the 'Global Access Grid Network, Super Computing Global Conference', entitled 'Performance Practice across Electronic Networks'. The findings of this work went on to inform numerous networked and performative works produced from 2002 onwards.