835 results for Software engineering - Experimental methods


Relevance:

30.00%

Publisher:

Abstract:

The success achieved by thermal recovery methods in heavy-oil fields prompted studies on the use of electromagnetic waves as heat-generating sources in oil reservoirs. Electromagnetic heating is achieved by three different processes, depending on the frequency range used: inductive, resistive, and dielectric (also known as radiation) heating. This study was based on computer simulations of oil reservoirs with characteristics similar to those found in the sedimentary basins of the Brazilian Northeast. All cases were simulated with the STARS software from CMG (Computer Modelling Group, version 2012.10). Some simulations included electrically sensitive particles placed in certain sectors of the reservoir model by hydraulic fracturing. The purpose of this work is to use electromagnetic induction heating as a heavy-oil recovery method and to assess the influence of these particles on the reservoir model. Comparative analyses were carried out involving electromagnetic induction heating, hydraulic fracturing, and water injection under different configurations of the reservoir model. It was found that when the injection well was fractured so that electromagnetic heating occurred in the same well as the water injection, there was a considerable increase in the recovery factor and in cumulative oil production compared with the models in which hydraulic fracturing occurred in the production well and water was injected in the injection well. This is due to in situ steam generation in the reservoir.
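
As context for the comparison above, a minimal sketch of the metric being compared: the recovery factor is conventionally the cumulative oil produced (Np) divided by the original oil in place (OOIP). The volumes below are illustrative placeholders, not results from these simulations.

```python
# Recovery factor = cumulative production / original oil in place (OOIP).
# The volumes below are illustrative, not simulation outputs.

def recovery_factor(np_cum_m3: float, ooip_m3: float) -> float:
    """Fraction of the original oil in place recovered so far."""
    return np_cum_m3 / ooip_m3

base = recovery_factor(np_cum_m3=95_000.0, ooip_m3=1_000_000.0)     # 9.5%
heated = recovery_factor(np_cum_m3=158_000.0, ooip_m3=1_000_000.0)  # 15.8%
print(f"base={base:.1%}, heated + fractured injector={heated:.1%}")
```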

Relevance:

30.00%

Publisher:

Abstract:

The generation of waste and the volume of produced water resulting from petroleum production and extraction activities have become a major challenge for oil companies with respect to environmental compliance, owing to their toxicity. Discharging or reusing this effluent, which contains organic compounds such as BTEX (benzene, toluene, ethylbenzene, and xylenes), can cause serious environmental and human health problems. The objective of this work was therefore to study the performance of two processes (applied separately and in sequence) on a synthetic effluent for the removal of benzene, toluene, and xylene (volatile hydrocarbons present in produced water): electrochemical treatment using a Ti/Pt electrode, and adsorption on an ion exchange resin. The synthetic BTX solution was prepared at concentrations of 22.8 mg L-1, 9.7 mg L-1, and 9.0 mg L-1, respectively, in 0.1 mol L-1 Na2SO4. The experiments were carried out in batch mode with 0.3 L of solution at 25 °C. The electrochemical oxidation process used a Ti/Pt electrode at different current densities (j = 10, 20, and 30 mA cm-2). In the adsorption process, an ion exchange resin (Purolite MB 478) was used in different amounts (2.5, 5, and 10 g). To evaluate the two techniques in sequential treatment, the current density was fixed at 10 mA cm-2 and the resin mass at 2.5 g. UV-VIS spectrophotometry, chemical oxygen demand (COD) measurements, and gas chromatography with photoionization (PID) and flame ionization (FID) detectors confirmed the high efficiency of organic compound removal after treatment. The electrochemical process (separate and sequential) proved more efficient than adsorption, reaching COD removal above 70%, as confirmed by cyclic voltammetry and polarization curves, whereas adsorption alone did not exceed 25.8% COD removal, owing to limited interactions with the resin. The sequential process (electrochemical oxidation followed by adsorption) nevertheless proved to be a suitable, efficient, and cost-effective alternative for the treatment of petrochemical effluents.
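
A minimal sketch of how the removal efficiencies quoted above are conventionally computed: COD removal (%) = (COD_initial - COD_final) / COD_initial x 100. The COD values below are illustrative placeholders, chosen only to be consistent with the reported ranges.

```python
# COD removal efficiency; inputs in mg O2/L (values are illustrative).

def cod_removal(cod_initial: float, cod_final: float) -> float:
    """Percentage of chemical oxygen demand removed by the treatment."""
    return 100.0 * (cod_initial - cod_final) / cod_initial

print(f"electrochemical: {cod_removal(120.0, 34.0):.1f}%")  # above 70%
print(f"adsorption only: {cod_removal(120.0, 90.0):.1f}%")  # about 25%
```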

Relevance:

30.00%

Publisher:

Abstract:

NAVSTAR/GPS (NAVigation System with Timing And Ranging / Global Positioning System), better known as GPS, is a satellite-based navigation system developed by the United States Department of Defense in the mid-1970s. Initially created for military purposes, GPS was later adapted for civilian use. To compute a position fix, the receiver must first acquire the signals of the visible satellites. This acquisition stage is extremely important, since it is responsible for detecting the visible satellites and estimating their respective Doppler frequencies and initial code phases. The process can demand considerable processing time and must be implemented efficiently. Several techniques are currently in use, but most of them trade off design concerns such as computational complexity, acquisition time, and computational resources. To balance these concerns, a method was developed that reduces the complexity of the acquisition process through a few strategies, namely reducing the Doppler search space, the number of samples, and the signal length used, in addition to exploiting parallelism. The strategy is divided into two steps: a coarse search over the entire search space, followed by a fine search restricted to the region identified in the first step. Because of the coarse search, the threshold of the conventional algorithm was no longer acceptable, so a new threshold was established based on the variance of the correlation peaks. First, a low-precision search is performed, comparing the variance of the five largest correlation peaks found. If the variance exceeds a certain threshold, the region around the largest peak becomes a detection candidate. Finally, this region is refined to confirm the detection. The results show a significant reduction in complexity and execution time, without resorting to highly complex algorithms.
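
A minimal sketch of the coarse step described above, assuming a baseband snapshot `signal`, a local spreading-code replica `code`, and a coarse Doppler grid; the variance-of-top-five-peaks decision rule is from the abstract, while the grid and the threshold value are hypothetical placeholders.

```python
import numpy as np

def coarse_acquisition(signal, code, fs, dopplers, var_threshold=1e3):
    """FFT-based parallel code-phase search over a coarse Doppler grid."""
    n = len(signal)
    t = np.arange(n) / fs
    code_fft = np.conj(np.fft.fft(code, n))
    best = None
    for fd in dopplers:                       # coarse Doppler search
        wiped = signal * np.exp(-2j * np.pi * fd * t)
        corr = np.abs(np.fft.ifft(np.fft.fft(wiped) * code_fft)) ** 2
        peak = int(corr.argmax())
        if best is None or corr[peak] > best[2]:
            best = (fd, peak, corr[peak], corr)
    # Decision rule from the abstract: variance of the five largest
    # correlation peaks against a threshold (value here is illustrative).
    top5 = np.sort(best[3])[-5:]
    detected = np.var(top5) > var_threshold
    return detected, best[0], best[1]         # flag, Doppler (Hz), code phase
```

The candidate region returned here would then be handed to the fine search, which re-runs the correlation on a denser Doppler/phase grid around it.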

Relevance:

30.00%

Publisher:

Abstract:

In the oil industry, natural gas is a vital component of the world energy supply and an important source of hydrocarbons. It is one of the cleanest, safest, and most relevant energy sources, and it helps to meet the world's growing demand for clean energy. With the growing share of natural gas in the Brazilian energy matrix, its main use has been the supply of electricity through thermal power generation. In the current production process, as in a Natural Gas Processing Unit (NGPU), natural gas passes through various separation units aimed at producing liquefied natural gas and fuel gas. The latter must be specified to meet the requirements of thermal machines. In the case of remote wells, absorbing the heavy components brings the gas to fuel-gas specification and is therefore an alternative for expanding the energy matrix. Owing to the high demand for this raw gas, research and development techniques for conditioning natural gas are being studied. Conventional methods employed today, such as physical absorption, show good results. The objective of this dissertation is to evaluate the removal of heavy components from natural gas by absorption, using octyl alcohol (1-octanol) as the absorbent. The influence of temperature (5 and 40 °C) and flow rate (25 and 50 mL/min) on the absorption process was studied. Absorption capacity, expressed as the amount absorbed, and kinetic behavior, expressed by the mass transfer coefficient, were evaluated. As expected from the literature, absorption of the heavy hydrocarbon fraction is favored by lowering the temperature, while both temperature and flow rate favor mass transfer (a kinetic effect). The absorption kinetics for the removal of heavy components was monitored by chromatographic analysis, and the experimental results demonstrated a high percentage of recovery of heavy components. Furthermore, octyl alcohol proved feasible as an absorbent for the required separation.
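
The abstract reports kinetics through a mass transfer coefficient without stating the model; a common first-order approach for batch absorption is dC/dt = kLa (C_sat - C), whose linearized form can be fitted by least squares. The sketch below assumes that model; all numbers are placeholders, not data from the dissertation.

```python
import numpy as np

def estimate_kla(t, c, c_sat):
    """Fit ln((C_sat - C)/(C_sat - C0)) = -kLa * t by least squares."""
    y = np.log((c_sat - c) / (c_sat - c[0]))
    kla = -np.polyfit(t, y, 1)[0]   # negative slope of the linearized model
    return kla                       # units of 1/min if t is in minutes

t = np.array([0.0, 2, 4, 8, 16])          # min (illustrative)
c = np.array([0.0, 0.9, 1.6, 2.5, 3.3])   # amount absorbed (illustrative)
print(f"kLa = {estimate_kla(t, c, c_sat=4.0):.3f} 1/min")
```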

Relevance:

30.00%

Publisher:

Abstract:

Background: Zinc is an essential nutrient that is required for numerous metabolic functions, and zinc deficiency results in growth retardation, cell-mediated immune dysfunction, and cognitive impairment. Objective: This study evaluated nutritional assessment methods for zinc supplementation in prepubertal non-zinc-deficient children. Design: We performed a randomised, controlled, triple-blind study. The children were divided into a control group (10% sorbitol, n = 31) and an experimental group (10 mg Zn/day, n = 31) for 3 months. Anthropometric and dietary assessments as well as bioelectrical measurements were performed in all children. Results: Our study showed (1) an increased body mass index for age and an increased phase angle in the experimental group; (2) a positive correlation between nutritional assessment parameters in both groups; (3) increased soft tissue, mainly fat-free mass, in the body composition of the experimental group, as determined using bioelectrical impedance vector analysis; (4) increased consumption of all nutrients, including zinc, in the experimental group; and (5) an increased serum zinc concentration in both groups (p < 0.0001). Conclusions: Given that no reference for body composition analysis exists for intervention studies, longitudinal studies are needed to investigate vector migration during zinc supplementation. These results reinforce the importance of employing multiple techniques to assess the nutritional status of populations.
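
For context on the phase angle reported in the Results: in bioelectrical impedance analysis it is conventionally derived from whole-body resistance (R) and reactance (Xc), typically measured at 50 kHz, as arctan(Xc/R) converted to degrees. The R and Xc values below are illustrative, not study data.

```python
import math

def phase_angle(r_ohm: float, xc_ohm: float) -> float:
    """Bioimpedance phase angle in degrees: arctan(Xc / R) * 180 / pi."""
    return math.degrees(math.atan2(xc_ohm, r_ohm))

print(f"{phase_angle(r_ohm=600.0, xc_ohm=65.0):.1f} degrees")  # about 6.2
```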

Relevance:

30.00%

Publisher:

Abstract:

Software maintenance and evolution have become highly critical tasks in recent years due to the diversity and high demand of features, devices, and users. Understanding and analyzing how new changes impact the quality attributes of the architecture of such systems is an essential prerequisite to avoid quality deterioration during their evolution. This thesis proposes an automated approach for analyzing variation of the performance quality attribute in terms of execution time (response time). It is implemented by a framework that adopts dynamic analysis and software repository mining techniques to provide an automated way of revealing potential sources of performance variation in scenarios, namely commits and issues, during software system evolution. The approach defines four phases: (i) preparation, choosing the scenarios and preparing the target releases; (ii) dynamic analysis, determining the performance of scenarios and methods by computing their execution times; (iii) variation analysis, processing and comparing the dynamic analysis results across releases; and (iv) repository mining, identifying issues and commits associated with the detected performance variation. Empirical studies were performed to evaluate the approach from different perspectives. An exploratory study analyzed the feasibility of applying the approach to systems from different domains to automatically identify source code elements with performance variation and the changes that affected those elements during an evolution. This study analyzed three systems: (i) SIGAA, a web system for academic management; (ii) ArgoUML, a UML modeling tool; and (iii) Netty, a framework for network applications. Another study performed an evolutionary analysis by applying the approach to multiple releases of Netty and of the web frameworks Wicket and Jetty. In that study, 21 releases were analyzed (seven from each system), totaling 57 scenarios. In summary, 14 scenarios with significant performance variation were found for Netty, 13 for Wicket, and 9 for Jetty. Additionally, feedback was obtained from eight developers of these systems through an online questionnaire. Finally, in the last study, a performance regression model was developed to indicate which commit properties are most likely to cause performance degradation. In total, 997 commits were mined, of which 103 were retrieved from degraded source code elements and 19 from optimized ones, while 875 had no impact on execution time. The number of days before the release and the day of the week proved to be the most relevant variables of performance-degrading commits in our model. The area under the ROC (Receiver Operating Characteristic) curve of the regression model is 60%, which means that using the model to decide whether a commit will cause degradation is 10% better than a random decision.
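
A minimal sketch of phase (iii), variation analysis: comparing repeated execution-time measurements of one scenario across two releases. The abstract does not name a statistical test; a Mann-Whitney U test is one common choice for this kind of comparison, and the timings below are illustrative.

```python
import numpy as np
from scipy.stats import mannwhitneyu

def performance_variation(times_old, times_new, alpha=0.05):
    """Return (significant?, relative change in median execution time)."""
    _, p = mannwhitneyu(times_old, times_new, alternative="two-sided")
    change = (np.median(times_new) - np.median(times_old)) / np.median(times_old)
    return p < alpha, change

old = np.array([102, 98, 105, 101, 99], dtype=float)    # ms, release N
new = np.array([131, 127, 135, 129, 133], dtype=float)  # ms, release N+1
sig, delta = performance_variation(old, new)
print(f"significant={sig}, median change={delta:+.0%}")  # > 0 = degradation
```

Scenarios flagged here would then feed phase (iv), where the commits and issues landing between the two releases are mined for likely causes.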

Relevance:

30.00%

Publisher:

Abstract:

This thesis presents a hybrid technique for the design of frequency selective surfaces (FSS) on an isotropic dielectric layer, considering various geometries for the elements of the unit cell. Specifically, the hybrid technique combines the equivalent circuit method with a genetic algorithm, aiming at the synthesis of structures with single-band and dual-band responses. The equivalent circuit method makes it possible to model the structure with an equivalent circuit and to obtain circuits for different geometries; from the parameters of these circuits, the transmission and reflection characteristics of the patterned structures can be derived. For the optimization of the patterned structures according to the desired frequency response, the Matlab™ optimization tool optimtool proved easy to use and allowed important optimization results to be explored. Numerical and experimental results are presented for the different characteristics of the analyzed geometries. To this end, a technique based on genetic algorithms and differential geometry was developed to obtain the parameter N, yielding rational algebraic models that determine more accurate values of N and facilitate new FSS designs with these geometries. The optimal values of N are grouped according to the occupancy factor of the cell and the thickness of the dielectric, for modeling the structures by means of rational algebraic equations. Furthermore, a fitness function was developed for the proposed hybrid model to compute the error in the FSS bandwidth specifications for single-band and dual-band transmission responses. The thesis also covers the construction of FSS prototypes with frequency settings and bandwidths obtained using this function. The FSS were initially analyzed through simulations performed with the commercial software Ansoft Designer™, followed by simulation with the equivalent circuit method to obtain a value of N for which the resonance frequency and bandwidth of the analyzed FSS converge; the results were then compared. The methodology is validated through the construction and measurement of prototypes with different unit cell geometries of the FSS arrays.
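
A minimal sketch of the equivalent circuit method named above: a patch-type (band-stop) FSS is commonly modeled as a series LC branch shunted across a transmission line of free-space impedance Z0, giving a transmission coefficient S21 = 2Z / (2Z + Z0). The L and C values below are illustrative placeholders, not parameters extracted in the thesis.

```python
import numpy as np

Z0 = 377.0                               # free-space wave impedance (ohms)
L, C = 4.0e-9, 0.06e-12                  # equivalent inductance/capacitance
f = np.linspace(1e9, 20e9, 2000)         # frequency sweep (Hz)
w = 2 * np.pi * f

Z = 1j * (w * L - 1.0 / (w * C))         # series LC shunt branch impedance
S21 = 2 * Z / (2 * Z + Z0)               # transmission past the shunt branch
f0 = 1.0 / (2 * np.pi * np.sqrt(L * C))  # resonance: total reflection

print(f"f0 = {f0 / 1e9:.2f} GHz, min |S21| = {np.abs(S21).min():.3f}")
```

In the hybrid technique, a genetic algorithm would search circuit parameters like these so that the computed resonance and bandwidth match the desired specification.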

Relevance:

30.00%

Publisher:

Abstract:

The use of access technologies for communication based on scanning methods enables new communication opportunities for individuals with severe motor dysfunction. One of the most common examples of this type of technology is single-switch scanning. Single-switch scanning keyboards are often used as augmentative and alternative communication devices for individuals with severe mobility restrictions and compromised speech and writing. They consist of a matrix of keys and simulate the operation of a physical keyboard to write messages. One of the limitations of these systems is their low performance: low communication rates and considerable error occurrence are among the problems that users of these devices face in daily use. The development and evaluation of new strategies in augmentative and alternative communication are essential to improve the communication opportunities of users of such technology. This work therefore explores different strategies to increase the communication rate and reduce user errors. Computational and practical analyses were performed to evaluate the proposed strategies.
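
A minimal sketch of one standard computational analysis for such keyboards: the expected number of scan steps per selection under row-column scanning, weighted by symbol probability. The 3x3 layout and the probabilities below are illustrative, not the layouts evaluated in this work.

```python
# Row-column scanning: per standard models, reaching row r costs r+1
# highlight steps, then reaching column c within it costs c+1 more.

def expected_scan_steps(layout, prob):
    """Probability-weighted expected scan steps per selection."""
    total = 0.0
    for r, row in enumerate(layout):
        for c, key in enumerate(row):
            total += prob.get(key, 0.0) * ((r + 1) + (c + 1))
    return total

layout = [["E", "A", "O"],
          ["S", "R", "N"],
          ["I", "D", "M"]]
prob = {"E": 0.30, "A": 0.22, "O": 0.15, "S": 0.10, "R": 0.08,
        "N": 0.06, "I": 0.04, "D": 0.03, "M": 0.02}
print(f"expected steps/selection: {expected_scan_steps(layout, prob):.2f}")
```

Placing frequent symbols near the top-left corner lowers this expectation, which is the usual lever for raising the communication rate.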

Relevance:

30.00%

Publisher:

Abstract:

The past few decades have brought many changes to dental practice, and the technology has become readily available. The outcome of a satisfactory rehabilitation treatment depends essentially on the balance between biological and mechanical factors. The marginal adaptation of crowns and prosthetic structures is a vital factor for long-term success. The development of CAD/CAM technology for manufacturing dental prostheses revolutionized dentistry: this technology can generate a virtual model from direct digital scanning of the mouth or from casts or impressions, and it allows the structure to be planned and designed in software. The virtual designs are obtained with high precision and a significant reduction in clinical and laboratory time. Thus, in the present study (Chapters 1, 2, and 3) computed microtomography was used to evaluate different materials, different CAD/CAM systems, and different ways of obtaining the virtual model (direct or indirect scanning), as well as the influence of the cementing agent on the final adaptation of crowns and copings obtained by CAD/CAM. Furthermore, this study (Chapters 4, 5, and 6) also evaluates differences in vertical and horizontal misfit of abutment-free frameworks on external hexagon (HE) implants, comparing fully castable UCLAs, castable UCLAs with cobalt-chromium pre-machined bases, and frameworks obtained by CAD/CAM in CoCr or zirconia using different scanning and milling systems. For this, scanning electron microscopy and interferometry were used. It was concluded that CAD/CAM technology can produce restorations, copings, and screw-retained implant-supported frameworks in different materials and systems with satisfactory marginal accuracy and a significant reduction in clinical and laboratory time.

Relevance:

30.00%

Publisher:

Abstract:

Over the years, credit risk analysis has come to play a decisive role in the analysis of corporate financing, and it is a fundamental element for management bodies. Financing is a very important element in supporting business activity: when companies do not hold the capital to carry out investments or current activities, they resort to credit. To reduce the risk of losses, companies must follow very rigorous credit analysis and collection policies. This control is more effective and efficient if the organization maintains close relationships with its customers. One increasingly common way to maintain stable, lasting relationships is to adopt CRM (Customer Relationship Management) strategies. This dissertation aims to develop a credit risk analysis model for the customers of the company inCentea. The model makes it possible to determine whether a customer meets the conditions required for granting credit, thereby reducing the risk for inCentea. It is concluded that using a larger number of variables in the risk assessment minimizes risk. By integrating the credit analysis model into the CRM software, inCentea can base its decision on whether or not to grant credit on economic and financial indicators.
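
A minimal sketch of a weighted-indicator credit score of the general kind described above. The indicators, weights, and cutoff are illustrative assumptions; the actual variables and model for inCentea's customers are defined in the dissertation itself.

```python
# Score = weighted sum of normalized financial indicators, each in [0, 1].
# All names, weights, and the 0.60 cutoff are hypothetical placeholders.

def credit_score(indicators, weights):
    """Weighted sum of normalized indicators; higher means lower risk."""
    return sum(weights[k] * indicators[k] for k in weights)

weights = {"liquidity": 0.30, "solvency": 0.25,
           "profitability": 0.20, "payment_history": 0.25}
customer = {"liquidity": 0.70, "solvency": 0.55,
            "profitability": 0.40, "payment_history": 0.90}

score = credit_score(customer, weights)
print(f"score={score:.2f} -> {'grant' if score >= 0.60 else 'refuse'} credit")
```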

Relevance:

30.00%

Publisher:

Abstract:

As society evolves, professions evolve with it. One profession that has grown remarkably is that of consultant in the field of information technology. A few years ago, this profession did not have the prominence it has today; with the evolution of information technology, the tools available to companies have diversified, and the workforce has specialized in these tools. Companies, unable to hire staff to install, maintain, and evolve all of their tools, increasingly rely on the experience of external consultants, who are hired to perform specific tasks that companies cannot carry out with their own resources; their relationship with these organizations ends once the tasks are completed. Companies increasingly seek to make the most of the software tools at their disposal, and Enterprise Resource Planning systems are no exception. As users work with these systems, new needs arise which, once met, allow the systems to be optimized. Moreover, as legislation changes, the systems must be adapted so that they evolve and reflect the new legal requirements. With the increase in banking regulation, banking institutions are required to report information about their finances to the Banco de Portugal, so that the bank can analyze the financial health of these institutions and issue recommendations on how they operate. This new form of reporting is carried out through reports in eXtensible Business Reporting Language (XBRL). This report aims to characterize the consultant profession and to present some projects carried out as a consultant at a large consulting firm.