12 results for Production Process

in AMS Tesi di Laurea - Alm@DL - Università di Bologna


Relevance:

100.00%

Abstract:

The rate at which petroleum-based plastics are produced, used and thrown away increases every year with the growth of the global population. Polyhydroxyalkanoates (PHAs) can represent a valid alternative to petroleum-based plastics: they are biodegradable polymers that some microorganisms produce as intracellular reserves. The main obstacle is the production cost of these bioplastics, which is still not competitive with that of petroleum-based plastics. Mixed microbial cultures can be fed with substrates obtained from the acidogenic fermentation of carbon-rich wastes, such as cheese whey, municipal effluents and various kinds of food waste, which have a low or even negligible cost; in this way wastes are valorized instead of being discharged. The process consists of three phases: acidogenic fermentation, in which the substrate is obtained; culture selection, in which a PHA-storing culture is enriched by eliminating organisms that lack this property; and accumulation, in which the culture is fed until it reaches its maximum storage capacity. In this work the possibility of making the process cheaper by coupling the selection and accumulation steps was explored. A halotolerant culture collected from seawater was fed with an artificially salted synthetic substrate, an aqueous solution containing a mixture of volatile fatty acids, in order to assess whether its performance would allow it to treat substrates derived from saline effluents; such streams cannot be treated properly by the bacteria found in activated sludge plants, because high salt concentrations inhibit them. By generating and selling the PHAs produced by these bacteria, it could be possible to lower, offset or even exceed the costs associated with a new section of a treatment plant dedicated to saline effluents.
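
The economics discussed above hinge on how efficiently the culture converts volatile fatty acids into polymer. As a minimal illustrative sketch only (the abstract gives no formulas; the COD basis and the function names are assumptions), the storage yield and PHA content commonly reported for accumulation experiments can be computed as:

```python
def pha_storage_yield(pha_produced_cod: float, vfa_consumed_cod: float) -> float:
    """Storage yield Y_PHA/S: PHA produced (as COD) per unit of
    VFA substrate (as COD) consumed during the accumulation step."""
    if vfa_consumed_cod <= 0:
        raise ValueError("substrate consumption must be positive")
    return pha_produced_cod / vfa_consumed_cod

def pha_content(pha_mass_g: float, dry_biomass_g: float) -> float:
    """PHA content as a weight fraction of total dry biomass, the
    usual measure of the culture's maximum storage capacity."""
    return pha_mass_g / dry_biomass_g

# Illustrative numbers: 4.2 gCOD of PHA stored while consuming 9.5 gCOD of VFAs
print(f"Y_PHA/S = {pha_storage_yield(4.2, 9.5):.2f} gCOD/gCOD")
```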

Relevance:

70.00%

Abstract:

Top quark studies play an important role in the physics program of the Large Hadron Collider (LHC). The energy and luminosity reached allow the acquisition of a large amount of data, especially in kinematic regions never studied before. This thesis presents the measurement of the ttbar production differential cross section using data collected by ATLAS in 2012 in proton-proton collisions at √s = 8 TeV, corresponding to an integrated luminosity of 20.3 fb⁻¹. The measurement is performed for ttbar events in the semileptonic channel in which the hadronically decaying top quark has a transverse momentum above 300 GeV. The hadronic top quark decay is reconstructed as a single large-radius jet and identified using jet substructure properties. The final differential cross section has been compared with several theoretical distributions, showing a discrepancy of about 25% between data and predictions, depending on the MC generator. Furthermore, the kinematic distributions of the ttbar production process are very sensitive to the choice of the parton distribution function (PDF) set used in the simulations and could provide constraints on the gluon PDF. In particular, this thesis presents a systematic study of the proton PDFs, varying several PDF sets and checking which one best describes the experimental distributions. The boosted techniques applied in this measurement will be fundamental in the next data taking at √s = 13 TeV, when a large number of heavy particles with high momentum will be produced.
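
For reference, a binned differential cross section of this kind is conventionally extracted bin by bin from the unfolded event counts; a sketch of the standard relation (symbols assumed, not quoted from the thesis):

```latex
% d\sigma/dp_T in bin i of the hadronic top transverse momentum:
% N_i^unf = unfolded signal count in bin i, \mathcal{L} = integrated
% luminosity, B = branching ratio, \Delta p_T^i = width of bin i
\left.\frac{d\sigma_{t\bar{t}}}{dp_T^t}\right|_i
  = \frac{N_i^{\mathrm{unf}}}{\mathcal{L}\, B\, \Delta p_T^i}
```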

Relevance:

60.00%

Abstract:

Ontology design and population, core aspects of semantic technologies, have recently become fields of great interest due to the increasing need for domain-specific knowledge bases that can boost the use of the Semantic Web. For building such knowledge resources, the state-of-the-art tools for ontology design require a lot of human work: producing meaningful schemas and populating them with domain-specific data is in fact a very difficult and time-consuming task, even more so if the task consists of modelling knowledge at web scale. The primary aim of this work is to investigate a novel and flexible methodology for automatically learning ontologies from textual data, lightening the human workload required for conceptualizing domain-specific knowledge and populating an extracted schema with real data, thus speeding up the whole ontology production process. Here computational linguistics plays a fundamental role, from automatically identifying facts in natural language and extracting frames of relations among recognized entities, to producing linked data with which to extend existing knowledge bases or create new ones. In the state of the art, automatic ontology learning systems are mainly based on plain-pipelined linguistic classifiers performing tasks such as named entity recognition, entity resolution, and taxonomy and relation extraction [11]. These approaches have some weaknesses, especially in capturing the structures through which the meaning of complex concepts is expressed [24]. Humans, in fact, tend to organize knowledge in well-defined patterns, which include participant entities and meaningful relations linking entities with each other. In the literature, these structures have been called Semantic Frames by Fillmore [20] or, more recently, Knowledge Patterns [23]. Some NLP studies have recently shown the possibility of performing more accurate deep parsing, with the ability to logically understand the structure of discourse [7]. In this work, some of these technologies have been investigated and employed to produce accurate ontology schemas. The long-term goal is to collect large amounts of semantically structured information from the web of crowds, through an automated process, in order to identify and investigate the cognitive patterns used by humans to organize their knowledge.
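
As a toy illustration of the plain-pipelined extraction the thesis contrasts itself with (the thesis' own system is not reproduced here; the library choices, namespace and property names are assumptions), a minimal NER-plus-triples step could look like this:

```python
# Minimal sketch of a pipelined ontology-population step: named-entity
# recognition followed by naive triple generation into an RDF graph.
# Requires: pip install spacy rdflib && python -m spacy download en_core_web_sm
import spacy
from rdflib import Graph, Literal, Namespace, RDF

nlp = spacy.load("en_core_web_sm")
EX = Namespace("http://example.org/")   # hypothetical namespace

def populate(text: str) -> Graph:
    g = Graph()
    for ent in nlp(text).ents:
        subject = EX[ent.text.replace(" ", "_")]
        g.add((subject, RDF.type, EX[ent.label_]))        # e.g. ex:PERSON
        g.add((subject, EX.mentionedIn, Literal(ent.sent.text)))
    return g

g = populate("Guglielmo Marconi was born in Bologna in 1874.")
print(g.serialize(format="turtle"))
```

Pipelines of this shape treat each entity in isolation, which is exactly the weakness the frame-based approach above addresses.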

Relevance:

60.00%

Abstract:

This thesis discusses the design and production techniques of composite material components. In particular, it covers the design process and the production process of a carbon-fiber composite component for a high-performance car, specifically the Dallara T12 race car. The work is split in two parts. After a brief introduction to existing composite materials, their origins and applications, the first part covers the main theoretical concepts behind the design of composite material components, with particular focus on carbon-fiber composites. The second part covers the whole design and production process that the candidate carried out to create the new front mainplane of the Dallara T12 race car. The thesis is the result of a six-month internship that the candidate completed as a Design Office Trainee at Dallara Automobili S.p.A.
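
The "main theoretical concepts" of composite design mentioned above usually begin with estimating ply properties from fibre and matrix data; a hedged sketch of the classical rule of mixtures (not taken from the thesis; the material values below are illustrative):

```python
def rule_of_mixtures(E_fibre: float, E_matrix: float, v_fibre: float) -> tuple[float, float]:
    """First estimates for a unidirectional ply: longitudinal modulus E1
    (Voigt, loads in parallel) and transverse modulus E2 (Reuss, loads
    in series). Moduli in GPa, fibre volume fraction v_fibre in [0, 1]."""
    v_matrix = 1.0 - v_fibre
    E1 = E_fibre * v_fibre + E_matrix * v_matrix
    E2 = 1.0 / (v_fibre / E_fibre + v_matrix / E_matrix)
    return E1, E2

# Illustrative carbon/epoxy ply: E_f = 230 GPa, E_m = 3.5 GPa, 60% fibre volume
E1, E2 = rule_of_mixtures(230.0, 3.5, 0.60)
print(f"E1 = {E1:.0f} GPa, E2 = {E2:.1f} GPa")
```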

Relevance:

60.00%

Abstract:

The need to effectively manage the documentation covering the entire production process, from the concept phase right through to market release, is a key issue in the creation of a successful and highly competitive product. For almost forty years the most commonly used strategies to achieve this have followed Product Lifecycle Management (PLM) guidelines. Translated into information management systems at the end of the '90s, this methodology is now widely used by companies operating all over the world in many different sectors. PLM systems and editor programs are the two principal types of software applications used by companies for their process automation. Editor programs allow users to store information about the production chain in documents, while the PLM system stores and shares this information so that it can be used within the company and made available to partners. Various software tools that capture documents and information and store them automatically in the PLM system have been developed in recent years. One of them is the "DirectPLM" application, developed by the Italian company "Focus PLM"; it is designed to ensure interoperability between many editors and the Aras Innovator PLM system. In this dissertation we present "DirectPLM2", a new version of DirectPLM, designed and developed as a prototype during an internship at Focus PLM. Its new implementation separates the abstract business logic from the concrete implementation of the commands, which was previously strongly dependent on Aras Innovator. Thanks to this new design, Focus PLM can easily develop different versions of DirectPLM2, each one devised for a specific PLM system: the company can focus its development effort on the specific set of software components providing the specialized functions that interact with that particular PLM system. This allows a shorter time to market and gives the company a significant competitive advantage.
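
The separation described above is essentially an adapter layer between vendor-neutral business logic and a vendor-specific backend. A minimal sketch of that structure (class and method names are hypothetical, not taken from DirectPLM2; the Aras call is stubbed):

```python
from abc import ABC, abstractmethod

class PlmConnector(ABC):
    """Vendor-neutral interface: the business logic talks only to this."""
    @abstractmethod
    def store_document(self, name: str, payload: bytes) -> str: ...

class ArasInnovatorConnector(PlmConnector):
    """One concrete backend; writing a sibling class per PLM system is
    enough to retarget the application without touching business logic."""
    def store_document(self, name: str, payload: bytes) -> str:
        # A real implementation would call the Aras Innovator API here.
        print(f"[aras] storing {name} ({len(payload)} bytes)")
        return f"aras://{name}"

def publish(connector: PlmConnector, name: str, payload: bytes) -> str:
    """Business logic: depends only on the abstract interface."""
    return connector.store_document(name, payload)

print(publish(ArasInnovatorConnector(), "drawing-001.pdf", b"%PDF-1.7..."))
```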

Relevance:

60.00%

Abstract:

Globalization has increased the pressure on organizations and companies to operate in the most efficient and economic way. This tendency pushes companies to concentrate more and more on their core business and to outsource less profitable departments and services in order to reduce costs. In contrast to earlier times, companies are highly specialized and have a low real net output ratio. To provide consumers with the right products, these companies have to collaborate with other suppliers and form large supply chains. A side effect of large supply chains is high stocks and stockholding costs, which has led to the rapid spread of just-in-time logistics concepts aimed at minimizing stock while maintaining high product availability. These competing goals demand high availability of the production systems, so that an incoming order can be processed immediately. Besides design aspects and the quality of the production system, maintenance has a strong impact on production system availability. In recent decades there have been many attempts to create maintenance models for availability optimization. Most of them concentrated on the availability aspect only, without incorporating further aspects such as logistics and the profitability of the overall system. However, a production system operator's main intention is to optimize the profitability of the production system, not its availability. Thus, classic models, limited to representing and optimizing maintenance strategies in the light of availability alone, fall short. A novel approach, incorporating all the financially relevant processes in and around a production system, is needed. The proposed model is subdivided into three parts: a maintenance module, a production module and a connection module. This subdivision provides easy maintainability and simple extensibility, and within these modules all aspects of the production process are modeled. The main part of the work lies in the extended maintenance-and-failure module, which represents different maintenance strategies and also incorporates the effects of over-maintaining and failed maintenance (maintenance-induced failures). Order release and seizing of the production system are modeled in the production part. Due to limited computational power, it was not possible to run the simulation and the optimization with the fully developed production model; the production model was therefore reduced to a black box with a lower degree of detail.
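
To make the profit-versus-availability trade-off concrete, here is a deliberately simplified Monte Carlo sketch of how a preventive-maintenance interval could be scored by expected profit rather than availability alone. All parameters, the cost structure and the Weibull failure model are invented for illustration, not taken from the thesis:

```python
import random

def expected_profit(pm_interval_h: float, horizon_h: float = 10_000.0,
                    runs: int = 2_000) -> float:
    """Score a preventive-maintenance (PM) interval by expected profit.
    Weibull wear-out failures (shape 2) are assumed, so PM can pay off."""
    REVENUE_PER_H, PM_COST, REPAIR_COST = 100.0, 500.0, 5_000.0
    PM_DOWNTIME_H, REPAIR_DOWNTIME_H = 4.0, 48.0
    total = 0.0
    for _ in range(runs):
        t = uptime = cost = 0.0
        while t < horizon_h:
            time_to_failure = random.weibullvariate(400.0, 2.0)
            if time_to_failure < pm_interval_h:   # breakdown before next PM
                uptime += time_to_failure
                cost += REPAIR_COST
                t += time_to_failure + REPAIR_DOWNTIME_H
            else:                                 # PM reached first, failure averted
                uptime += pm_interval_h
                cost += PM_COST
                t += pm_interval_h + PM_DOWNTIME_H
        total += uptime * REVENUE_PER_H - cost
    return total / runs

for interval in (100.0, 300.0, 600.0):
    print(f"PM every {interval:>5.0f} h -> expected profit {expected_profit(interval):12,.0f}")
```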

Relevance:

60.00%

Abstract:

A relevant problem in polyolefin processing is the presence of volatile and semi-volatile compounds (VOCs and SVOCs), such as linear-chain alkanes, found in the final products. These VOCs can be noticed by customers through their unpleasant smell and can be an environmental issue; at the same time, they can cause negative side effects during processing. Since no standardized analytical techniques for polymeric matrices are available in the literature, we implemented different VOC extraction methods and gas chromatographic analyses for the qualitative and quantitative study of such compounds. Different procedures can be found in the literature, including microwave-assisted extraction (MAE) and thermal desorption (TDS), used for different purposes. TDS coupled with GC-MS is necessary for the identification of the different compounds in the polymer matrix. Although quantitative determination is complex, the results obtained from TDS/GC-MS show that the by-products are mainly linear-chain oligomers with an even number of carbon atoms in the C8-C22 range (for HDPE). In order to quantify these linear alkane by-products, a more accurate GC-FID determination with an internal standard was run on the MAE extracts. Regardless of the type of extruder used, it is difficult to distinguish the effect of the individual processes; in any case, processing leaves a content of low-boiling substances lower than that of the corresponding virgin polymer. The two HDPEs studied can be distinguished on the basis of the quantity of analytes found; the production process is therefore mainly responsible for the amount of VOCs and SVOCs observed. The extruder technology used by Sacmi SC achieves a significant reduction in VOCs compared to the conventional screw system. This result is significant, as a lower quantity of volatile substances leads to lower migration of such materials, especially in food packaging applications.
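
Internal-standard GC-FID quantification, as used above, rests on a simple ratio calculation; a minimal sketch of it (the response factor, peak areas and sample mass below are illustrative, not the thesis' calibration data):

```python
def conc_by_internal_standard(area_analyte: float, area_is: float,
                              mass_is_mg: float, rrf: float,
                              sample_mass_g: float) -> float:
    """Quantify an analyte against an internal standard (IS):
    m_analyte = (A_analyte / A_IS) * m_IS / RRF, normalized by the
    extracted sample mass to give mg of analyte per g of polymer."""
    mass_analyte_mg = (area_analyte / area_is) * mass_is_mg / rrf
    return mass_analyte_mg / sample_mass_g

# Illustrative numbers: a C16 peak vs. the IS peak, RRF ~ 1.0, 2 g of HDPE
print(f"{conc_by_internal_standard(15400, 22000, 0.10, 1.0, 2.0):.4f} mg/g")
```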

Relevance:

60.00%

Abstract:

Nanofibrous membranes are a promising material for tailoring the properties of laminated CFRP composites by embedding them into the structure. This project aimed to understand the effect of the number, position and thickness of nanofibrous modifications on the damping behaviour of the resulting nano-modified CFRP composite with an epoxy matrix. An improvement in damping capacity is expected to improve a composite's lifetime and fatigue resistance by hindering the formation of microcracks and consequently delamination; it also promises greater comfort in a range of final products by interrupting vibration propagation and therefore reducing noise. Electrospinning was the technique employed to produce nanofibrous membranes from a blend of polymeric solutions. SEM, WAXS and DSC were used to evaluate the quality of the obtained membranes before they were introduced, following a specific stacking sequence, into the production process of the laminate. A suitable curing cycle in an autoclave was applied to bond the modifications to the matrix material, ensuring full crosslinking of the matrix and thereby finalising the production process. DMA was performed in order to understand the effects of the different modifications on the properties of the composite. This investigation showed that a high number of modifications of laminated CFRP composites with an epoxy matrix, using thick rubbery nanofibrous membranes, has a positive effect on the damping capacity and on the temperature range over which the effect applies. A suggestion for subsequent studies as well as a recommendation for the production of nano-modified CFRP structures are included at the end of this document.
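
The damping capacity assessed by DMA is conventionally expressed through the loss factor; for reference (a standard definition, not quoted from the thesis):

```latex
% Loss factor from DMA: ratio of the loss modulus E'' (dissipated
% energy) to the storage modulus E' (elastically stored energy);
% a higher tan(delta) means better damping.
\tan\delta = \frac{E''}{E'}
```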

Relevance:

60.00%

Abstract:

The purpose of this thesis is to present the concept of simulation for automatic machines and how it might be used to test and debug the software implemented for an automatic machine. Simulation is used to detect errors and allows the code to be corrected before the machine has been built; it permits testing different solutions and improving the software until an optimized version is obtained. Additionally, simulation can be used to keep track of a machine after installation, in order to improve the production process during the machine's life cycle. The central argument of this project is the advantage of using virtual commissioning to test the implemented software in a virtual environment. Such an environment helps avoid potential damage and reduces the time needed to get the machine ready for operation. Virtual commissioning also allows different solutions to be tested without large losses of time and money, so that an optimized solution can be chosen from among the proposals. The implemented software is based on the object-oriented programming paradigm, with features such as encapsulation, modularity and reusability of the code. This way of programming yields simplified code that is easier to understand and debug, as well as highly efficient. Finally, different communication protocols are implemented in order to allow communication between the real plant and the simulation model. Through this communication it is possible to gather all the data necessary for the simulation and for the real-time analysis of the production process, so as to improve it during the machine's life cycle.
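
As an illustration of the encapsulation and reusability argument above (class names and behaviour are hypothetical, not taken from the thesis), a minimal sketch of a machine station modelled as a reusable object, so the same test code can drive either the simulation or the real plant:

```python
from abc import ABC, abstractmethod

class Station(ABC):
    """One machine module; the test code talks only to this interface,
    so it runs unchanged against the simulation or the real plant."""
    @abstractmethod
    def actuate(self, command: str) -> None: ...
    @abstractmethod
    def read_sensor(self, name: str) -> float: ...

class SimulatedStation(Station):
    def __init__(self) -> None:
        self._sensors = {"temperature": 20.0}
    def actuate(self, command: str) -> None:
        if command == "heat_on":           # toy physics model
            self._sensors["temperature"] += 5.0
    def read_sensor(self, name: str) -> float:
        return self._sensors[name]

def commissioning_test(station: Station) -> bool:
    """The same test runs in virtual commissioning and on-site."""
    station.actuate("heat_on")
    return station.read_sensor("temperature") > 20.0

print("test passed:", commissioning_test(SimulatedStation()))
```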

Relevance:

60.00%

Abstract:

Industry 4.0 refers to the 4th industrial revolution, at whose base lie the digitalization and automation of the assembly line. The whole production process has improved and evolved thanks to advances in networking and in AI, including machine learning, cloud computing, IoT, and other technologies that are finally being implemented in the industrial scenario. All these technologies share a need for faster, more secure, robust and reliable communication. One of the many answers to these demands is the use of mobile communication technologies in the industrial environment, but which technology is better suited to them? The answer is not as simple as it seems. The 4th industrial revolution has unprecedented potential compared with the previous ones; every factory, enterprise or company has different network demands, and even within each of these infrastructures the demands may vary by sector or by application. For example, in the health care industry there may be a need for increased bandwidth for the analysis of high-definition videos, or for faster speeds so that analytics can occur in real time, while yet another application might require higher security and reliability to protect patients' data. As seen above, choosing the right technology for the right environment and application involves many factors, and the ones just stated are but a fraction of the overall picture. In this thesis we compare two of the technologies available for the industrial environment, Wi-Fi 6 and 5G private networks, in the specific case of a steel factory.

Relevance:

40.00%

Abstract:

The future hydrogen demand is expected to increase, both in existing industries (including the upgrading of fossil fuels and ammonia production) and in new technologies like fuel cells. Nowadays hydrogen is obtained predominantly by steam reforming of methane, but it is well known that hydrocarbon-based routes cause environmental problems, and the market depends on the availability of a finite resource that is being rapidly depleted. Therefore, alternative processes using renewable sources like wind, solar energy and biomass are now being considered for the production of hydrogen. One of these alternative methods is the so-called "steam-iron process", which consists of reducing a metal oxide with a hydrogen-containing feedstock, such as ethanol, and then reoxidizing the reduced material with water to produce "clean" hydrogen (water splitting). Thermochemical cycles of this kind have been studied before, but some important recent factors, the development of more active catalysts, the flexibility of the feedstock (including renewable bio-alcohols) and the fact that hydrogen purification could be avoided, have significantly increased interest in this research topic. To better understand the reactions that govern the steam-iron route to hydrogen, it is necessary to go down to the molecular level, and spectroscopic methods are an important tool for extracting information that could help in the development of more efficient materials and processes. In this research, ethanol was chosen as the reducing fuel, and the main goal was to study its interaction with different catalysts of similar structure (spinels), in order to correlate their composition with the mechanism of the anaerobic oxidation of ethanol, which is the first step of the steam-iron cycle. To accomplish this, diffuse reflectance infrared Fourier transform spectroscopy (DRIFTS) was used to study the surface composition of the catalysts during the adsorption of ethanol and its transformation during the temperature program, while mass spectrometry was used to monitor the desorbed products. The set of studied materials includes Cu, Co and Ni ferrites, which were also characterized by means of X-ray diffraction, surface area measurements, Raman spectroscopy and temperature-programmed reduction.
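
For orientation, the two half-cycles of the steam-iron route can be written schematically for an iron oxide carrier (a textbook illustration, simplified with respect to the ferrites actually studied; the fuel is idealized here as the hydrogen equivalents delivered by ethanol):

```latex
% Step 1 - reduction: the fuel reduces the oxide carrier.
\mathrm{Fe_3O_4 + 4\,H_2 \;\longrightarrow\; 3\,Fe + 4\,H_2O}
% Step 2 - water splitting: steam reoxidizes the carrier,
% releasing clean hydrogen.
\mathrm{3\,Fe + 4\,H_2O \;\longrightarrow\; Fe_3O_4 + 4\,H_2}
```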

Relevance:

40.00%

Abstract:

In the steelmaking industry, galvanizing is a treatment applied to protect steel from corrosion. The air knife effect (AKE) occurs when nozzles emit a stream of air onto the surfaces of a steel strip to remove excess zinc from it. In our work we formalized the problem of controlling the AKE and, with the R&D department of Marcegaglia S.p.A., we implemented a deep learning model able to drive the AKE, which we call the controller. It takes as input a tuple of the physical conditions of the process line (t, h, s) together with the target value of the zinc coating (c), and generates the expected tuple (pres, dist) that drives the mechanical nozzles towards the target coating c. We designed the structure of the network according to the requirements, and we collected and explored the data set of historical data from the smart factory. Finally, we designed the loss function as the sum of three components: a term minimizing the gap between the coating produced by the network's outputs and the target value we want to reach, and two weighted minimization components for pressure and distance. In our solution we construct a second module, named the coating net, to predict the zinc coating resulting from the AKE when the given conditions are applied to the production line. Its structure consists of a linear component and a deep nonlinear "residual" component, both learned from empirical observations. The predictions made by the coating net are used as ground truth in the loss function of the controller. By tuning the weights of the different components of the loss function, it is possible to train models with slightly different optimization purposes. In the tests we compared different regularization strategies with the standard one, under conditions of optimal estimation for both; the overall accuracy is ±3 g/m² from the target for all of them. Lastly, we analyzed how the controller remodels the current operating points with the new logic: the sub-optimal values of pres and dist can be improved by 50% and 20%, respectively.
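
A hedged PyTorch-style sketch of a three-component loss of the kind described above (the weights, tensor shapes, squared penalties and the frozen coating net are assumptions; the thesis' actual architecture is not reproduced):

```python
import torch
import torch.nn as nn

class ControllerLoss(nn.Module):
    """Sum of three terms, as sketched in the abstract: a coating-target
    term plus weighted penalties on pressure and distance. The coating
    net is frozen and acts as a differentiable ground-truth model."""
    def __init__(self, coating_net: nn.Module, w_pres: float = 0.1, w_dist: float = 0.1):
        super().__init__()
        self.coating_net = coating_net.eval()
        for p in self.coating_net.parameters():
            p.requires_grad_(False)          # ground-truth model stays fixed
        self.w_pres, self.w_dist = w_pres, w_dist

    def forward(self, line_conditions, pres, dist, target_coating):
        # Coating predicted if (pres, dist) were applied on the line.
        coating = self.coating_net(torch.cat([line_conditions, pres, dist], dim=-1))
        loss_coating = nn.functional.mse_loss(coating, target_coating)
        # Weighted penalties keeping pressure low and distance small.
        return (loss_coating
                + self.w_pres * pres.pow(2).mean()
                + self.w_dist * dist.pow(2).mean())
```

Retuning w_pres and w_dist reproduces the "slightly different optimization purposes" mentioned above: heavier penalties trade coating accuracy for cheaper nozzle settings.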