17 results for Best available techniques
at Instituto Politécnico do Porto, Portugal
Abstract:
This dissertation was carried out in collaboration with the Monteiro, Ribas business group and had two main objectives: to assess the best available techniques (BAT) for industrial cooling and for emissions from storage. The first objective covered all Monteiro, Ribas facilities, while the second focused on Monteiro, Ribas, Embalagens Flexíveis, S.A. To meet these objectives, a survey of the best available techniques presented in the respective reference documents was first carried out. The techniques suited to the conditions and facilities under study were then selected, and an assessment was performed to verify the degree of implementation of the measures suggested in the BREF (Best Available Techniques Reference Document). Regarding industrial cooling systems, almost all the measures referenced in the corresponding reference document were found to be implemented. This is because the cooling systems at the Monteiro, Ribas industrial complex are relatively recent: they were installed in 2012 and feature a modern, highly efficient design. Regarding the storage of hazardous chemicals, the facility under study shows some non-conformities, since most of the techniques mentioned in the BREF are not implemented. It was therefore necessary to carry out an environmental risk assessment using the methodology proposed by the Spanish standard UNE 150008:2008 – Environmental Risk Analysis and Assessment. Several risk scenarios were formulated and the risks quantified for Monteiro, Ribas Embalagens Flexíveis S.A.; the risks were rated as moderate to high. Finally, some risk prevention and minimisation measures that the facility should adopt were suggested: for example, the hazardous waste yard should be equipped with spill containment kits (absorbent material), emergency procedures, safety data sheets and a fire extinguisher placed in a clearly visible location. For the transport of hazardous waste to that yard, portable spill containment basins and spill containment kits are advisable. As for the hazardous chemicals warehouse, it is recommended that it be redesigned taking into account the BAT presented in subsection 5.2.3 of this dissertation.
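A minimal sketch of the kind of risk scoring the UNE 150008:2008 methodology supports: a probability rating is combined with a consequence-severity rating and the product is mapped to a qualitative level. The 1-5 scales, the band thresholds and the scenario names below are illustrative assumptions, not the standard's exact tables or the dissertation's scenarios.

```python
# Illustrative sketch of a UNE 150008-style environmental risk scoring.
# The 1-5 rating scales and the low/moderate/high bands are assumptions
# for demonstration, not the standard's exact tables.

def risk_level(probability: int, severity: int) -> tuple[int, str]:
    """Combine a 1-5 probability rating with a 1-5 consequence-severity
    rating into a risk score and a qualitative level."""
    if not (1 <= probability <= 5 and 1 <= severity <= 5):
        raise ValueError("ratings must be integers from 1 to 5")
    score = probability * severity          # risk = probability x severity
    if score <= 5:
        level = "low"
    elif score <= 15:
        level = "moderate"
    else:
        level = "high"
    return score, level

# Hypothetical scenarios for the hazardous waste yard (names invented):
scenarios = {
    "solvent drum spill during handling": (4, 4),
    "fire in the chemicals warehouse":    (2, 5),
}
for name, (p, s) in scenarios.items():
    score, level = risk_level(p, s)
    print(f"{name}: score={score} -> {level} risk")
```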
Abstract:
This paper introduces an approach for generating a sequence of jobs that minimizes the total weighted tardiness for a set of jobs to be processed on a single machine. An Ant Colony System based algorithm is validated on benchmark problems available in the OR-Library. The results obtained were compared with the best available results and found to be close to the optimal, allowing conclusions to be drawn on the algorithm's efficiency and effectiveness.
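A minimal sketch of an Ant Colony System for the single-machine total weighted tardiness problem, in the spirit of the approach described above. The parameter values, the modified-due-date visibility heuristic and the toy instance are assumptions for illustration, not the paper's exact settings.

```python
import random

# Minimal Ant Colony System sketch for the single-machine total weighted
# tardiness problem (1||sum w_j T_j). Parameters and the visibility
# heuristic are illustrative assumptions.

def total_weighted_tardiness(seq, p, w, d):
    t, twt = 0, 0
    for j in seq:
        t += p[j]
        twt += w[j] * max(0, t - d[j])
    return twt

def acs(p, w, d, n_ants=10, n_iter=200, alpha=1.0, beta=2.0,
        rho=0.1, q0=0.9, seed=0):
    rng = random.Random(seed)
    n = len(p)
    tau0 = 1.0 / (n * sum(p))             # initial pheromone level
    tau = [[tau0] * n for _ in range(n)]  # tau[position][job]
    best_seq, best_cost = list(range(n)), float("inf")

    def visibility(job, t):
        # Modified-due-date dispatching rule as greedy information.
        return w[job] / max(d[job] - t, p[job])

    for _ in range(n_iter):
        for _ in range(n_ants):
            unscheduled, seq, t = set(range(n)), [], 0
            for pos in range(n):
                scores = {j: (tau[pos][j] ** alpha) *
                             (visibility(j, t) ** beta) for j in unscheduled}
                if rng.random() < q0:            # exploitation
                    j = max(scores, key=scores.get)
                else:                            # biased exploration
                    total = sum(scores.values())
                    r, acc = rng.random() * total, 0.0
                    for j, s in scores.items():
                        acc += s
                        if acc >= r:
                            break
                # local pheromone update
                tau[pos][j] = (1 - rho) * tau[pos][j] + rho * tau0
                seq.append(j)
                unscheduled.remove(j)
                t += p[j]
            cost = total_weighted_tardiness(seq, p, w, d)
            if cost < best_cost:
                best_seq, best_cost = seq, cost
        # global pheromone update on the best-so-far sequence
        for pos, j in enumerate(best_seq):
            tau[pos][j] = (1 - rho) * tau[pos][j] + rho / (1 + best_cost)
    return best_seq, best_cost

# Tiny example instance (processing times, weights, due dates):
p, w, d = [3, 2, 4, 1], [2, 1, 3, 1], [4, 6, 5, 3]
print(acs(p, w, d))
```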
Abstract:
Master's degree in Chemical Engineering - Energy Optimisation in the Chemical Industry branch
Abstract:
By 2020, Europe must cut its greenhouse gas emissions by 20%, obtain 20% of its energy from renewable sources, and improve energy efficiency by 20%. These are the targets set by the European Union, known as the 20/20/20 targets [1]. The Matosinhos Refinery is an industrial complex operating in the refining sector, with concerns regarding energy efficiency and the associated environmental aspects. Within the scope of refinery energy rationalisation, Galp Energia has been implementing a set of measures, adopting the best available technologies with the aim of reducing energy consumption, promoting energy efficiency and cutting carbon dioxide emissions. To support these measures, a comparative study was carried out that allowed the company to define the measures considered priorities. One solution identified is the execution of projects that require no investment and have immediate effect, such as increasing the energy efficiency of the furnaces [1]. This work, carried out at Galp Energia S.A., had as its main objective the energy optimisation of the Propane Deasphalting Unit of the Base Oils Plant. The optimisation was based on recovering energy from the bottom stream of rectification column T2003C, which carries a heat duty of 2.79 Gcal/h. After surveying all the process variables of this unit, in particular the heat duties of the streams involved, it was concluded that furnace H2101 could be replaced by two heat exchangers, thereby reducing energy consumption: the bottom stream of column T2003, with a heat duty of 2.79 Gcal/h, can exchange heat with the asphalt-propane mixture stream, bringing it to a temperature higher than that achieved with the furnace in operation. An economic analysis of the furnace's fuel-oil consumption and cost over one year was carried out, giving an annual fuel cost of €611,396.00. The purchase cost of the heat exchangers is €86,355.97, which makes the change proposed in this project profitable.
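Using only the figures quoted above, a simple payback estimate for the proposed substitution can be computed; assuming the full annual fuel-oil cost is avoided (a simplification that ignores maintenance and any residual firing), the investment pays back in well under a year.

```python
# Simple payback estimate for replacing furnace H2101 with two heat
# exchangers, using the figures quoted in the abstract. Assumes the full
# annual fuel-oil cost is avoided (ignores maintenance and residual firing).
annual_fuel_cost = 611_396.00      # EUR/year, furnace fuel oil
exchanger_cost = 86_355.97         # EUR, purchase of the two exchangers

payback_years = exchanger_cost / annual_fuel_cost
print(f"Simple payback: {payback_years:.2f} years "
      f"(~{payback_years * 12:.1f} months)")
```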
Abstract:
Buildings and the whole built environment play a key role as societies mitigate climate change and adapt to its consequences. More than 50% of the existing residential buildings in the EU-25 were built before 1970, so these buildings are of significant importance in reducing energy consumption and CO2 emissions. A larger stock of nearly zero energy buildings (nZEB) is a possible solution to this problem. This study analyses the application of the nZEB methodology to the retrofitting of a typical Portuguese dwelling built in 1950. It is shown that the primary energy used can be reduced to a very low value (11.95 kWhep/m².y) compared with the reference consumption (69.15 kWhep/m².y) by applying the best construction techniques together with the use of energy from on-site renewable sources.
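A quick check of the reduction reported above, from the reference consumption to the retrofitted value:

```python
# Worked check of the primary-energy reduction reported in the abstract.
reference = 69.15   # kWh_ep/m2.y, reference consumption
retrofit = 11.95    # kWh_ep/m2.y, after the nZEB retrofit
reduction = (reference - retrofit) / reference * 100
print(f"Primary energy reduction: {reduction:.1f}%")   # about 83%
```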
Abstract:
Adhesive bonding is an excellent alternative to traditional joining techniques such as welding, mechanical fastening or riveting. However, many factors have to be accounted for during joint design to accurately predict the joint strength. One of these is the adhesive layer thickness (tA). Most published results concern epoxy structural adhesives, tailored to perform best with small values of tA, and these show that the lap joint strength decreases as tA increases (the optimum joint strength is usually obtained with tA values between 0.1 and 0.2 mm). Recently, polyurethane adhesives designed to perform with larger tA values became available on the market, and their fracture behaviour has not yet been studied. In this work, the effect of tA on the tensile fracture toughness of a bonded joint is studied, considering a novel high strength and ductile polyurethane adhesive for the automotive industry. The work consists of the fracture characterization of the bond by a conventional technique and by the J-integral technique, which accurately accounts for root rotation effects. An optical measurement method is used to evaluate the crack tip opening (δn) and the adherends' rotation at the crack tip (θo) during the test, supported by a Matlab® sub-routine for the automated extraction of these parameters. As an output of this work, fracture data in traction is provided for the selected adhesive, enabling the subsequent strength prediction of bonded joints.
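A hedged sketch of the direct-method evaluation outlined above: J is computed during a DCB-type test from load, crack length and the adherend rotation at the crack tip, and then differentiated with respect to the crack-tip opening δn to recover the tensile cohesive law. The closed-form J expression (beam-theory term plus root-rotation term), the specimen dimensions and the synthetic test record are assumptions, not necessarily the formulation used in this work.

```python
import numpy as np

# Sketch of a direct-method J-integral evaluation for a DCB-type specimen:
# a common closed form (simple beam theory plus a root-rotation term) is
# evaluated from test data, then differentiated with respect to the
# crack-tip opening delta_n to estimate the tensile cohesive law
# t_n(delta_n). Formula and values are illustrative assumptions.

def j_integral(P, a, theta0, b, h, E):
    """P: load [N], a: crack length [m], theta0: adherend rotation at the
    crack tip [rad], b: width [m], h: adherend thickness [m],
    E: adherend Young's modulus [Pa]."""
    return 12.0 * (P * a) ** 2 / (E * b ** 2 * h ** 3) + P * theta0 / b

# Synthetic monotonic test record (illustrative values only):
P = np.linspace(100.0, 900.0, 50)            # N
a = np.full_like(P, 0.05)                    # m, initial crack length
theta0 = np.linspace(0.0, 0.02, 50)          # rad
delta_n = np.linspace(0.0, 1.0e-3, 50)       # m, crack-tip opening

J = j_integral(P, a, theta0, b=0.025, h=0.003, E=210e9)

# Cohesive law in traction: t_n = dJ/d(delta_n), via a finite-difference gradient.
t_n = np.gradient(J, delta_n)
print(f"J at the end of the record ~ {J[-1]:.1f} N/m, "
      f"peak traction ~ {t_n.max() / 1e6:.2f} MPa")
```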
Abstract:
Recruitment is based on a set of techniques and procedures put in place to attract qualified candidates. The recruitment process has undergone changes and become ever more sophisticated, involving a whole organisation and a whole community. A new source of recruitment has emerged with the use of online social networks, whose facilitating tools make the search for candidates fast, cheap and "global". In Portugal, the information available and the studies conducted on this phenomenon are still very limited, with little reported on the importance of online social recruitment. The purpose of this article is to contribute to the understanding of the professional recruitment process conducted through online social media by recruitment companies in the Northern Region of Portugal, analysing the use of online media by recruitment professionals, the facilitating support tools and the associated best practices.
Abstract:
It is widely accepted that organizations and individuals must be innovative and continually create new knowledge and ideas to deal with rapid change. Innovation plays an important role in not only the development of new business, process and products, but also in competitiveness and success of any organization. Technology for Creativity and Innovation: Tools, Techniques and Applications provides empirical research findings and best practices on creativity and innovation in business, organizational, and social environments. It is written for educators, academics and professionals who want to improve their understanding of creativity and innovation as well as the role technology has in shaping this discipline.
Abstract:
The introduction of Electric Vehicles (EVs) together with the implementation of smart grids will raise new challenges for power system operators. This paper proposes a demand response program for electric vehicle users which provides the network operator with another useful resource, consisting in a reduction of the vehicles' charging needs. The demand response program enables vehicle users to obtain some profit by agreeing to reduce their travel needs and minimum battery level requirements in a given period. To support network operator actions, the amount of demand response usage can be estimated using data mining techniques applied to a database containing a large set of operation scenarios. The paper includes a case study based on simulated operation scenarios that consider different operating conditions, e.g. available renewable generation, and a diversity of distributed resources and electric vehicles with vehicle-to-grid and demand response capacity in a 33-bus distribution network.
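A hedged sketch of the data-mining step described above: a regression model trained on a database of operation scenarios to estimate how much EV demand response the operator can count on. The feature set, the synthetic scenario data and the choice of a Random Forest regressor are illustrative assumptions, not the paper's setup.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Illustrative regression on a synthetic scenario database to estimate
# available EV demand response. Features, data and model are assumptions.
rng = np.random.default_rng(0)
n = 5_000
X = np.column_stack([
    rng.uniform(0, 1, n),     # available renewable generation (p.u.)
    rng.uniform(0.3, 1, n),   # total load level (p.u.)
    rng.integers(0, 24, n),   # hour of day
    rng.uniform(0, 1, n),     # aggregate EV state of charge (p.u.)
])
# Synthetic target: DR usage grows with load and with low renewables (toy rule).
y = np.clip(0.6 * X[:, 1] - 0.4 * X[:, 0] + 0.1 * (1 - X[:, 3]), 0, None)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_tr, y_tr)
print(f"R^2 on held-out scenarios: {model.score(X_te, y_te):.3f}")
```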
Abstract:
To cope with natural gas demand growth patterns and Europe's import dependency, the gas industry needs to organize an efficient upstream infrastructure. The best location of Gas Supply Units (GSUs) and the choice of transportation mode – physical or virtual pipelines – are key to a successful industry. In this work we study the optimal location of GSUs, as well as the most efficient allocation of gas loads to sources, selecting the best transportation mode, observing specific technical restrictions and minimizing total system costs. For the location of GSUs on the system we use the P-median problem; for assigning gas demand nodes to source facilities we use the classical transportation problem. The developed model is an optimisation-based approach built on a Lagrangean heuristic, using Lagrangean relaxation for P-median problems – the Simple Lagrangean Heuristic. The solution of this heuristic can be improved by adding a local search procedure – the Lagrangean Reallocation Heuristic. These two heuristics, Simple Lagrangean and Lagrangean Reallocation, were tested on a realistic network – the primary Iberian natural gas network, organized with 65 nodes connected by physical and virtual pipelines. Computational results are presented for both approaches, showing the gas source locations and load allocation arrangement, total system costs and gas transportation modes.
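A compact sketch of a Simple Lagrangean Heuristic for the P-median problem as described above: the assignment constraints are relaxed with multipliers, the relaxed subproblem is solved by opening the p facilities with the most negative reduced cost, a feasible assignment provides an upper bound, and the multipliers are updated by subgradient steps. The toy cost matrix and the parameters are illustrative, and the Lagrangean Reallocation local search is omitted.

```python
import numpy as np

# Simple Lagrangean Heuristic for the P-median problem: relax the
# assignment constraints (each demand node served by exactly one source)
# with multipliers lambda_i and update them by subgradient optimisation.

def lagrangean_pmedian(c, p, n_iter=200, step=2.0):
    n_dem, n_fac = c.shape
    lam = c.min(axis=1).copy()             # one multiplier per demand node
    best_ub, best_open = np.inf, None
    for _ in range(n_iter):
        reduced = c - lam[:, None]         # reduced assignment costs
        contrib = np.minimum(reduced, 0.0).sum(axis=0)   # value of opening j
        open_fac = np.argsort(contrib)[:p]               # p most attractive
        lb = contrib[open_fac].sum() + lam.sum()          # Lagrangean lower bound

        # Feasible solution: assign each demand to its nearest open facility.
        assign = open_fac[np.argmin(c[:, open_fac], axis=1)]
        ub = c[np.arange(n_dem), assign].sum()
        if ub < best_ub:
            best_ub, best_open = ub, open_fac

        # Subgradient of the relaxed constraints: 1 - (assignments in subproblem).
        g = 1.0 - (reduced[:, open_fac] < 0).sum(axis=1)
        if not g.any():
            break                           # relaxed solution already feasible
        t = step * (best_ub - lb) / (g @ g)
        lam = lam + t * g
    return best_open, best_ub

# Toy cost matrix for 6 demand nodes / 6 candidate sources (Euclidean costs):
rng = np.random.default_rng(1)
pts = rng.uniform(0, 100, (6, 2))
c = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2)
print(lagrangean_pmedian(c, p=2))
```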
Abstract:
Introduction: A major focus of the data mining process, especially machine learning research, is to automatically learn to recognize complex patterns and help make adequate decisions based strictly on the acquired data. Since imaging techniques such as MPI (Myocardial Perfusion Imaging) in Nuclear Cardiology can take up a large part of the daily workflow and generate gigabytes of data, computerized analysis may offer advantages over human analysis: shorter time, homogeneity and consistency, automatic recording of analysis results, relatively low cost, etc.

Objectives: The aim of this study is to evaluate the efficacy of this methodology in the evaluation of MPI stress studies and in the decision process concerning the continuation, or not, of the evaluation of each patient. The objective pursued was to automatically classify a patient test into one of three groups: "Positive", "Negative" and "Indeterminate". "Positive" tests would proceed directly to the rest part of the exam, "Negative" tests would be directly exempted from continuation, and only the "Indeterminate" group would require the clinician's analysis, thus saving clinicians' effort, increasing workflow fluidity at the technologists' level and probably sparing patients' time.

Methods: The WEKA v3.6.2 open source software was used to make a comparative analysis of three WEKA algorithms ("OneR", "J48" and "Naïve Bayes") in a retrospective study, using as reference the corresponding clinical results signed off by expert nuclear cardiologists, on the "SPECT Heart Dataset" available at the University of California, Irvine, Machine Learning Repository. For evaluation purposes, criteria such as "Precision", "Incorrectly Classified Instances" and "Receiver Operating Characteristic (ROC) Areas" were considered.

Results: The interpretation of the data suggests that the Naïve Bayes algorithm has the best performance among the three selected algorithms.

Conclusions: It is believed, and apparently supported by the findings, that machine learning algorithms could significantly assist, at an intermediary level, in the analysis of scintigraphic data obtained in MPI, namely after stress acquisition, eventually increasing the efficiency of the entire system and potentially easing the roles of both technologists and nuclear cardiologists. In the continuation of this study, it is planned to use more patient information and to significantly increase the population under study, in order to improve system accuracy.
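A hedged sketch of the comparison described above, using scikit-learn counterparts of two of the WEKA classifiers (Naïve Bayes and a J48-like decision tree). Synthetic data in the SPECT Heart format (22 binary features, binary label, 267 instances) is generated so the snippet runs stand-alone; to reproduce the study, the SPECT.train/SPECT.test files from the UCI Machine Learning Repository would be loaded instead.

```python
import numpy as np
from sklearn.naive_bayes import BernoulliNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for the SPECT Heart data set: 22 binary features and a
# binary label; the toy labelling rule below is illustrative only.
rng = np.random.default_rng(0)
n, n_feat = 267, 22
X = rng.integers(0, 2, size=(n, n_feat))
y = (X[:, :5].sum(axis=1) + rng.integers(0, 2, n) > 3).astype(int)

# Compare a Naive Bayes classifier with a J48-like decision tree using
# 10-fold cross-validated accuracy and ROC area, as in the WEKA evaluation.
for name, clf in [("Naive Bayes", BernoulliNB()),
                  ("J48-like tree", DecisionTreeClassifier(random_state=0))]:
    acc = cross_val_score(clf, X, y, cv=10, scoring="accuracy").mean()
    auc = cross_val_score(clf, X, y, cv=10, scoring="roc_auc").mean()
    print(f"{name}: accuracy={acc:.3f}, ROC AUC={auc:.3f}")
```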
Abstract:
In this study, efforts were made to put forward an integrated recycling approach for the thermoset-based glass fibre reinforced polymer (GFRP) rejects derived from the pultrusion manufacturing industry. Both the recycling process and the development of a new cost-effective end-use application for the recyclates were considered. For this purpose, i) among the several available recycling techniques for thermoset-based composite materials, the most suitable one for the envisaged application was selected (mechanical recycling); and ii) an experimental programme was carried out to assess the added value of the obtained recyclates as aggregate and reinforcement replacements in concrete-polymer composite materials. The potential recycling solution was assessed through the mechanical behaviour of the resulting GFRP-waste-modified concrete-polymer composites compared with the unmodified materials. In the mix design process of the new GFRP-waste-based composite material, the recyclate content and size grade, and the effect of incorporating an adhesion promoter, were considered as material factors and systematically tested within reasonable ranges. The optimization of the modified formulations was supported by the Fuzzy Boolean Nets methodology, which allowed finding the best balance between material parameters that maximizes both the flexural and compressive strengths of the final composite. Compared with related end-use applications of GFRP wastes in cementitious concrete materials, the proposed solution overcomes some of the problems found, namely the possible incompatibilities arising from the alkali-silica reaction and the decrease in mechanical properties due to the high water-cement ratio required to achieve the desired workability. The results obtained are very promising towards a global cost-effective waste management solution for GFRP industrial wastes and end-of-life products that will lead to a more sustainable composite materials industry.
Abstract:
Mathematical models and statistical analysis are key instruments in soil science research, as they can describe and/or predict the current state of a soil system. These tools allow us to explore the behaviour of soil-related processes and properties, as well as to generate new hypotheses for future experimentation. A good model and analysis of soil property variations, allowing us to draw suitable conclusions and to estimate spatially correlated variables at unsampled locations, clearly depends on the amount and quality of data and on the robustness of the techniques and estimators. The quality of data, in turn, obviously depends on a competent data collection procedure and on capable laboratory analytical work. Following the available standard soil sampling protocols, soil samples should be collected according to key points such as a convenient spatial scale, landscape homogeneity (or non-homogeneity), land colour, soil texture, land slope and solar exposure. Obtaining good-quality data from forest soils is predictably expensive, as it is labour intensive and demands considerable manpower and equipment, both in fieldwork and in laboratory analysis. Moreover, the sampling scheme to be used in a forest-field data collection campaign is not simple to design, since the sampling strategies chosen depend strongly on soil taxonomy. In fact, a sampling grid cannot be followed if rocks are found at the intended collection depth, if no soil at all is found, or if large trees prevent collection. Considering this, a proficient design of a soil sampling campaign in a forest field is not always a simple process and sometimes represents a truly huge challenge. In this work, we present some difficulties encountered during two experiments on forest soil conducted to study the spatial variation of some physical-chemical soil properties. Two different sampling protocols were considered for monitoring two types of forest soils located in NW Portugal: umbric regosol and lithosol. Two different sampling tools were also used: a manual auger and a shovel. Both scenarios were analysed, and the results allowed us to conclude that monitoring forest soil for mathematical and statistical investigation requires a data collection procedure compatible with established protocols, but a pre-defined grid assumption often fails when the variability of the soil property is not uniform in space. In this case, the sampling grid should be conveniently adapted from one part of the landscape to another, and this fact should be taken into account in the mathematical procedure.
Abstract:
The massification of electric vehicles (EVs) can have a significant impact on the power system, requiring a new approach to energy resource management. Energy resource management aims to obtain the optimal scheduling of the available resources, considering distributed generators, storage units, demand response and EVs. The large number of resources increases the complexity of the energy resource management problem, which can take several hours to solve to optimality, whereas a fast solution is required for the next day. It is therefore necessary to use adequate optimization techniques to determine the best solution in a reasonable amount of time. This paper presents a hybrid artificial intelligence technique to solve a complex energy resource management problem with a large number of resources, including EVs, connected to the electric network. The hybrid approach combines simulated annealing (SA) and ant colony optimization (ACO) techniques. The case study concerns different EV penetration levels. Comparisons with a previous SA approach and with a deterministic technique are also presented. For the 2000 EVs scenario, the proposed hybrid approach found a better solution than the previous SA version, resulting in a cost reduction of 1.94%. For this scenario, the proposed approach is approximately 94 times faster than the deterministic approach.
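A minimal sketch of the simulated annealing side of the hybrid approach described above, applied to a toy day-ahead scheduling vector, with the point where an ACO-constructed solution could seed the search marked in a comment. The cost model, resource data and SA parameters are illustrative assumptions, not the paper's formulation.

```python
import math
import random

# Toy day-ahead scheduling problem: 24 periods, 5 dispatchable resources.
rng = random.Random(0)
n_periods, n_resources = 24, 5
capacity = [[rng.uniform(0.5, 2.0) for _ in range(n_resources)]
            for _ in range(n_periods)]                    # MW available
cost = [rng.uniform(30.0, 90.0) for _ in range(n_resources)]  # EUR/MWh
demand = [rng.uniform(2.0, 5.0) for _ in range(n_periods)]    # MW

def total_cost(x):
    """Generation cost plus a penalty for unmet demand in each period."""
    c = 0.0
    for t in range(n_periods):
        supplied = sum(x[t])
        c += sum(x[t][r] * cost[r] for r in range(n_resources))
        c += 1_000.0 * max(0.0, demand[t] - supplied)      # penalty term
    return c

def neighbour(x):
    """Perturb one resource in one period within its capacity."""
    y = [row[:] for row in x]
    t, r = rng.randrange(n_periods), rng.randrange(n_resources)
    y[t][r] = rng.uniform(0.0, capacity[t][r])
    return y

# Initial solution: random here; in the hybrid approach this is the point
# where an ACO-constructed schedule would be injected to seed the SA search.
x = [[rng.uniform(0.0, capacity[t][r]) for r in range(n_resources)]
     for t in range(n_periods)]
best_cost = total_cost(x)
T = 1_000.0
while T > 1e-2:
    cand = neighbour(x)
    delta = total_cost(cand) - total_cost(x)
    if delta < 0 or rng.random() < math.exp(-delta / T):
        x = cand
        best_cost = min(best_cost, total_cost(x))
    T *= 0.995                                             # geometric cooling
print(f"Best scheduling cost found: {best_cost:,.1f} EUR")
```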
Abstract:
More than ever, there is an increase in the number of decision support methods and computer-aided diagnostic systems applied to various areas of medicine. In breast cancer research, much work has been done to reduce false positives when these systems are used as a double reading method. In this study, we present a set of data mining techniques applied to build a decision support system for breast cancer diagnosis. The method is geared to assist clinical practice in identifying mammographic findings such as microcalcifications, masses and even normal tissue, in order to avoid misdiagnosis. A reliable database was used, with 410 images from about 115 patients, containing previous reviews performed by radiologists covering microcalcifications, masses and normal tissue findings. Two feature extraction techniques were used: the grey level co-occurrence matrix and the grey level run length matrix. For classification purposes, we considered various scenarios according to distinct patterns of lesions and several classifiers, in order to determine the best performance in each case. The classifiers used were Naïve Bayes, Support Vector Machines, k-Nearest Neighbours and Decision Trees (J48 and Random Forests). The results in distinguishing mammographic findings revealed high positive predictive values (PPV) and very good accuracy. Related results for the classification of breast density and the BI-RADS® scale are also presented. The best predictive method for all tested groups was the Random Forest classifier, and the best performance was achieved in the distinction of microcalcifications. The conclusions drawn from the several tested scenarios represent a new perspective in breast cancer diagnosis using data mining techniques.
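A hedged sketch of the texture-feature pipeline described above: grey level co-occurrence matrix descriptors feeding a Random Forest classifier. The synthetic patches, GLCM settings and labels are illustrative only, and the grey level run length features used in the study are omitted for brevity.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

def glcm_features(patch):
    """Grey level co-occurrence matrix descriptors for one image patch."""
    glcm = graycomatrix(patch, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    return np.hstack([graycoprops(glcm, prop).ravel()
                      for prop in ("contrast", "homogeneity",
                                   "energy", "correlation")])

# Synthetic 64x64 patches: class 1 gets bright speckle to mimic
# microcalcifications, class 0 is smoother background texture.
rng = np.random.default_rng(0)
X, y = [], []
for label in (0, 1):
    for _ in range(40):
        patch = rng.normal(120, 10, (64, 64))
        if label == 1:
            idx = rng.integers(0, 64, (30, 2))
            patch[idx[:, 0], idx[:, 1]] += 100
        patch = np.clip(patch, 0, 255).astype(np.uint8)
        X.append(glcm_features(patch))
        y.append(label)
X, y = np.array(X), np.array(y)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                          stratify=y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_tr, y_tr)
print(f"Held-out accuracy: {clf.score(X_te, y_te):.2f}")
```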