32 results for 090402 Catalytic Process Engineering
at Universidad de Alicante
Abstract:
Gasoline from the refinery fluid catalytic cracking (FCC) unit is a major contributor to the total commercial-grade gasoline pool. FCC gasoline consists primarily of paraffins, naphthenes, olefins, and aromatics, together with small quantities of undesirable sulfur and sulfur-containing compounds. The proportions of these components invariably determine the gasoline's quality as well as the performance of the associated downstream units. The increasing demand for cleaner and lighter fuels drives the need not only for novel processing technologies but also for alternative refinery and petrochemical feedstocks. Current and future clean-gasoline requirements include increased isoparaffin content and reduced olefin, aromatics, benzene, and sulfur contents. The present study investigates the effect of processing an unconventional refinery feedstock, composed of a blend of vacuum gas oil (VGO) and low-density polyethylene (LDPE), on FCC full-range gasoline yields and compositional spectrum, including the distribution of its paraffin, isoparaffin, olefin, naphthene, and aromatics contents, over a range of operating temperatures (500–700 °C) and catalyst-to-feed oil ratios (CFR 5–10), using a spent equilibrium FCC Y-zeolite-based catalyst in an FCC pilot plant operated at the University of Alicante's Research Institute of Chemical Process Engineering (RICPE). Coprocessing the oil–polymer blend produced gasoline with yields and compositions very similar to those obtained from the base oil, although in some cases the contributions of the feed polymer content and of the processing variables to the gasoline compositional spectrum were apparent. Carbon-content analysis showed a higher fraction of C9–C12 compounds at all catalyst rates employed and for both feedstocks.
The gasoline's paraffinicity, olefinicity, and the degrees of branching of its paraffins and olefins were also affected to varying degrees by the severity of operation. In most cases, the gasoline aromatics tended to decrease as the reactor temperature was increased. While the paraffin and isoparaffin contents of the gasoline remained relatively stable at around 5 wt %, the olefin content generally increased with increasing FCC reactor temperature.
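The catalyst-to-feed ratio and the compositional (PIONA) bookkeeping used throughout the abstract can be illustrated with a minimal mass-balance sketch; all function names and numbers below are hypothetical, not data from the study:

```python
# Illustrative FCC bookkeeping: catalyst-to-feed oil ratio (CFR) and
# normalisation of PIONA lump masses into wt% of the gasoline cut.
# Numbers are invented for illustration only.

def catalyst_to_feed_ratio(catalyst_mass_g: float, feed_mass_g: float) -> float:
    """Catalyst-to-feed oil ratio (CFR) on a mass basis."""
    return catalyst_mass_g / feed_mass_g

def piona_wt_percent(lumps: dict[str, float]) -> dict[str, float]:
    """Convert PIONA lump masses (g) to wt% of the gasoline cut."""
    total = sum(lumps.values())
    return {name: 100.0 * mass / total for name, mass in lumps.items()}

if __name__ == "__main__":
    # CFR = 7.0, within the 5-10 range studied
    print(catalyst_to_feed_ratio(70.0, 10.0))
    sample = {"paraffins": 1.0, "isoparaffins": 4.0, "olefins": 8.0,
              "naphthenes": 2.0, "aromatics": 5.0}
    print(piona_wt_percent(sample))  # olefins normalise to 40 wt%
```

The normalised fractions always sum to 100 wt%, which is the sanity check usually applied before comparing compositional spectra across operating conditions.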
Abstract:
Hydrophobic Ti-MCM-41 samples prepared by a post-synthesis silylation treatment prove to be highly active and selective catalysts for olefin epoxidation using organic hydroperoxides as oxidizing agents in liquid-phase reaction systems. Epoxide yields improve markedly with increasing silylation degree of the Ti-mesoporous samples. Catalytic studies are combined and correlated with spectroscopic techniques (e.g., XRD, XANES, UV-Visible, 29Si MAS-NMR) and calorimetric measurements to better understand the changes in the surface chemistry of the Ti-MCM-41 samples caused by the post-synthesis silylation treatment and to ascertain the role of the incorporated trimethylsilyl groups in olefin epoxidation. In this manner, the effects of the organic moieties on the solids, and of the water and glycol contents, on catalytic activity and selectivity are analyzed in detail. The results show that the hydrophobicity of the samples is responsible for the decrease in water adsorption and, consequently, for the negligible formation of the undesired glycol during the catalytic process. Catalyst deactivation by glycol poisoning of the Ti active sites is thus greatly diminished, increasing catalyst stability and leading to practically quantitative production of the corresponding epoxide. The extension of these hydrophobic Ti-MCM-41 catalysts, together with organic hydroperoxides, to the highly efficient and selective epoxidation of natural terpenes is also exemplified.
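The yield and selectivity figures discussed in the abstract follow the usual catalytic definitions (yield = conversion × selectivity); a minimal sketch, with illustrative numbers that are not from the paper:

```python
# Standard conversion / selectivity / yield bookkeeping for an olefin
# epoxidation run. All numbers are hypothetical.

def conversion(olefin_in_mol: float, olefin_out_mol: float) -> float:
    """Fractional olefin conversion."""
    return (olefin_in_mol - olefin_out_mol) / olefin_in_mol

def selectivity(epoxide_mol: float, olefin_converted_mol: float) -> float:
    """Fraction of the converted olefin that ends up as epoxide,
    the remainder going to by-products such as the glycol."""
    return epoxide_mol / olefin_converted_mol

def epoxide_yield(olefin_in_mol: float, olefin_out_mol: float,
                  epoxide_mol: float) -> float:
    """Epoxide yield = conversion x selectivity."""
    x = conversion(olefin_in_mol, olefin_out_mol)
    s = selectivity(epoxide_mol, olefin_in_mol - olefin_out_mol)
    return x * s

if __name__ == "__main__":
    # 80% conversion at 95% selectivity -> 76% epoxide yield
    print(epoxide_yield(1.0, 0.2, 0.76))
```

"Practically quantitative production of the epoxide", in these terms, means selectivity approaching 1 at high conversion, i.e. negligible glycol in the product slate.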
Abstract:
Poster presented at the 19th International Congress of Chemical and Process Engineering, Prague, Czech Republic, August 28–September 1, 2010.
Abstract:
Poster presented at the 19th International Congress of Chemical and Process Engineering, Prague, Czech Republic, August 28–September 1, 2010.
Abstract:
Poster presented at ESCAPE 22, European Symposium on Computer Aided Process Engineering, University College London, UK, 17–20 June 2012.
Abstract:
Presentation at the 11th European Symposium of the Working Party on Computer Aided Process Engineering, Kolding, Denmark, May 27–30, 2001.
Abstract:
Poster presented at the 24th European Symposium on Computer Aided Process Engineering (ESCAPE 24), Budapest, Hungary, June 15–18, 2014.
Abstract:
Poster communication presented at the 12th Mediterranean Congress of Chemical Engineering, Barcelona, Spain, November 15–18, 2011.
Abstract:
Presentation submitted to the PSE Seminar, Chemical Engineering Department, Center for Advanced Process Decision-making (CAPD), Carnegie Mellon University, Pittsburgh, USA, October 2012.
Abstract:
Nowadays, data mining is based on low-level specifications of the employed techniques, typically bound to a specific analysis platform. Data mining therefore lacks a modelling architecture that allows analysts to treat it as a true software-engineering process. Here, we propose a model-driven approach based on (i) a conceptual modelling framework for data mining, and (ii) a set of model transformations to automatically generate both the data under analysis (via data-warehousing technology) and the analysis models for data mining (tailored to a specific platform). Analysts can thus concentrate on the analysis problem via conceptual data-mining models instead of low-level programming tasks related to the technical details of the underlying platform. These tasks are now entrusted to the model-transformation scaffolding.
Abstract:
Data mining is one of the most important analysis techniques for automatically extracting knowledge from large amounts of data. Nowadays, data mining is based on low-level specifications of the employed techniques, typically bound to a specific analysis platform. Data mining therefore lacks a modelling architecture that allows analysts to treat it as a true software-engineering process. Bearing this situation in mind, we propose a model-driven approach based on (i) a conceptual modelling framework for data mining, and (ii) a set of model transformations to automatically generate both the data under analysis (deployed via data-warehousing technology) and the analysis models for data mining (tailored to a specific platform). Analysts can thus concentrate on understanding the analysis problem via conceptual data-mining models instead of wasting effort on low-level programming tasks related to the technical details of the underlying platform. These time-consuming tasks are now entrusted to the model-transformation scaffolding. The feasibility of our approach is shown by means of a hypothetical data-mining scenario in which a time-series analysis is required.
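The model-to-code transformation idea can be sketched as a function that maps a conceptual data-mining model onto a platform-specific artifact; the model attributes and the target syntax below are invented for illustration and are not the paper's actual metamodel:

```python
# Toy model-driven transformation: a conceptual data-mining model
# (a plain dict here, standing in for a real metamodel instance) is
# turned into a platform-specific analysis specification string, so
# the analyst never writes the low-level spec by hand.

def to_platform_spec(model: dict) -> str:
    """Generate a pseudo platform-specific analysis spec from a
    conceptual model, hiding low-level details from the analyst."""
    cols = ", ".join(model["attributes"])
    return (f"ANALYSIS {model['technique']} "
            f"ON {model['source']}({cols}) "
            f"WINDOW {model.get('window', 1)}")

if __name__ == "__main__":
    # Conceptual model for a time-series analysis, as in the
    # hypothetical scenario mentioned above (names invented).
    conceptual = {"technique": "time_series",
                  "source": "sales_dw",
                  "attributes": ["month", "revenue"],
                  "window": 12}
    print(to_platform_spec(conceptual))
    # -> ANALYSIS time_series ON sales_dw(month, revenue) WINDOW 12
```

Swapping the generator function is what retargets the same conceptual model to a different analysis platform, which is the point of keeping the transformation outside the model itself.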
Abstract:
Poster presented at EDULEARN12, International Conference on Education and New Learning Technologies, Barcelona, 2–4 July 2012.
Abstract:
Reverse engineering is the process of discovering the technological principles of a device, object, or system through analysis of its structure, function, and operation. Starting from a device used in clinical practice, the corneal topographer, reverse engineering is used to infer physical principles and laws. In our case, reverse engineering involves taking this mechanical device apart and analyzing its workings in detail. Prior knowledge of the application and usefulness of the device provides motivation that, together with the combination of theory and practice, helps students understand and learn concepts studied in different subjects of the Optics and Optometry degree. These subjects belong to both the core and compulsory subjects of the first- and second-year syllabus of the degree. Furthermore, the experimental practice serves as a transverse axis relating theoretical concepts, technology transfer, and research.
Abstract:
Communication presented at the XVI Jornadas de Ingeniería del Software y Bases de Datos, JISBD 2011, A Coruña, 5–7 September 2011.
Empirical study on the maintainability of Web applications: Model-driven Engineering vs Code-centric
Abstract:
Model-driven Engineering (MDE) approaches are often acknowledged to improve the maintainability of the resulting applications. However, there is a scarcity of empirical evidence that backs their claimed benefits and limitations with respect to code-centric approaches. The purpose of this paper is to compare the performance and satisfaction of junior software maintainers while executing maintainability tasks on Web applications with two different development approaches, one being OOH4RIA, a model-driven approach, and the other being a code-centric approach based on Visual Studio .NET and the Agile Unified Process. We have conducted a quasi-experiment with 27 graduated students from the University of Alicante. They were randomly divided into two groups, and each group was assigned to a different Web application on which they performed a set of maintainability tasks. The results show that maintaining Web applications with OOH4RIA clearly improves the performance of subjects. It also tips the satisfaction balance in favor of OOH4RIA, although not significantly. Model-driven development methods seem to improve both the developers’ objective performance and subjective opinions on ease of use of the method. This notwithstanding, further experimentation is needed to be able to generalize the results to different populations, methods, languages and tools, different domains and different application sizes.