20 results for Process capability analysis

in Repositório Científico do Instituto Politécnico de Lisboa - Portugal


Relevance:

90.00%

Abstract:

Most of the wastewater treatment systems in small rural communities of the Cova da Beira region (Portugal) consist of constructed wetlands (CW) with horizontal subsurface flow (HSSF). Such systems are believed to allow compliance with discharge standards as well as the production of final effluents suitable for reuse. Results obtained in a nine-month campaign on an HSSF bed showed that COD and TSS removal was lower than expected. Discrete sampling also showed that the removal of TC, FC and HE was not sufficient to fulfill international irrigation goals. However, the bed responded very well to variations in incoming nitrogen loads, presenting high removal of nitrogen forms. A good correlation between mass load and mass removal rate was observed for BOD5, COD, TN, NH4-N, TP and TSS, which shows a satisfactory response of the bed to the variable incoming loads. The entrance of excessive loads of organic matter and solids contributed to a decrease of the effective volume available for pollutant uptake and may therefore have negatively influenced the treatment capability. Primary treatment should be improved in order to decrease the variation of incoming organic and solid loads and to improve the removal of COD, solids and pathogens. The final effluent presented good physical-chemical quality for reuse in irrigation, which is the most likely application in the area.
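As a minimal illustration of the load-removal relationship reported above, the sketch below computes a Pearson correlation between mass load and mass removal rate. The arrays are hypothetical monitoring values, not the campaign data from the study.

```python
# Minimal sketch: Pearson correlation between mass load and mass removal
# rate, as reported for BOD5, COD, TN, NH4-N, TP and TSS.
# Hypothetical values only, not the study's monitoring data.
import numpy as np

mass_load = np.array([12.1, 18.4, 25.0, 31.2, 40.5, 47.8])    # g/m2/d
mass_removal = np.array([9.8, 14.9, 20.1, 24.6, 31.0, 36.2])  # g/m2/d

r = np.corrcoef(mass_load, mass_removal)[0, 1]
print(f"Pearson r = {r:.3f}")  # values near 1 indicate a strong load-removal link
```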

Relevance:

80.00%

Abstract:

Competition between companies and the search for ever more efficient management and organisation models dominate the present day. The Lean management philosophy answers this need for higher levels of competitiveness and efficiency through a change in organisational culture, based on reducing or eliminating waste and on continuously improving the processes by which goods are manufactured or services are provided. Lean management is supported and implemented through a set of tools correctly selected and adapted to the organisational context of the company or organisation. This dissertation characterises the most common tools of the Lean philosophy, considering their applicability in industry and in the services sector. It also addresses how Lean tools should be applied so that they do not become an isolated act that would surely lead the Lean implementation in the organisation to failure. For that reason, some rules and criteria are discussed, based on a proposed method for applying Lean tools that avoids mistakes made in the past, which led to unsuccessful Lean deployments in some organisations. A case study from the services sector was carried out, and its results confirmed the applicability of the proposed method for applying Lean tools in that sector. The case study revealed a high percentage of waste in the process under analysis and made it possible to improve the operation of those processes. The improvements achieved were based on eliminating waste, solving problems and consequently standardising processes, which improved the quality and efficiency of the service provided, showing that the organisation studied is on the right track to successfully change its organisational culture towards the Lean philosophy.

Relevance:

80.00%

Abstract:

The reason for choosing railways as the topic of this final Master's project (T.F.M.) is that a railway is a very specific type of transport infrastructure. Unlike what happens with motorways, on the railway the infrastructure manager is responsible for construction, operation (signalling and traffic control), supplying power to lines equipped with catenary (electric traction), and track maintenance and conservation. Track geometry analysis and inspection is the process used to preserve the infrastructure. This process began on the Portuguese railways many years after the opening of the first stretch of railway line: the first journey took place in October 1856, while systematic inspection started in 1968 with the "Matisa PV-6" draisine. In 1991, C.P. acquired another track inspection vehicle, the VIV02 EM 120 from Plasser & Theurer, to replace the "Matisa PV-6". The topic, Analysis of the Measurement Methods for Track Geometry Parameters and Correlation between the Data Obtained, is directly related to track maintenance and conservation. In the Southern Operational Unit (now ROS - Região Operacional Sul), where I carried out the T.F.M., there are no railway construction works that I could follow and use as the subject of my work. In fact, given the lack of investment expected in the near future, infrastructure maintenance becomes the main activity carried out by REFER in order to ensure comfort, safety and speed in the movement of freight and passengers. Track geometry analysis is currently one of the main maintenance activities and is carried out by diagnosis, unlike in the past, when methodical conservation was performed in a given year on a selected stretch regardless of whether it was actually needed. A valuable aid in deciding whether a given conservation job should be carried out is the VIV02 EM 120 vehicle, which inspects the entire railway network, collects data, and classifies 200-metre sections through the standard deviation, providing the relevant information on the need for intervention. Besides this vehicle, there are also light devices for inspecting track geometry parameters. One of them is the trolley, which is not motorised and is moved manually by an operator. Obviously, this device does not inspect the whole network, as the measurement operation is slow; it is nevertheless used to analyse geometric defects on short stretches, thus becoming an asset by avoiding the deployment of "heavy" equipment such as the VIV02 EM 120. To achieve the objectives of this work, measurement tests were performed with both (vehicle and light device) over the same period and under the same physical conditions, such as temperature, humidity, etc. The results, in line with the objectives, consist of a comparison between the measurements of both, in order to demonstrate their usefulness and necessity for the various types of superstructure that make up the national railway network.
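As an illustration of the diagnosis step described above, the following sketch classifies 200-metre sections by the standard deviation of a track geometry signal. The sampling step, the alert threshold, and the signal itself are assumed values for the sketch, not REFER limits or VIV02 EM 120 recordings.

```python
# Illustrative sketch: classify 200 m track sections by the standard
# deviation of a measured geometry parameter (e.g. longitudinal level).
# All numbers below are assumptions, not official thresholds.
import numpy as np

SAMPLE_STEP_M = 0.25          # assumed sampling interval along the track
SECTION_M = 200               # classification window used in the text
ALERT_STD_MM = 1.5            # hypothetical intervention threshold

signal_mm = np.random.default_rng(0).normal(0.0, 1.2, 8000)  # stand-in data
per_section = int(SECTION_M / SAMPLE_STEP_M)                 # samples per section

usable = signal_mm[: len(signal_mm) // per_section * per_section]
stds = usable.reshape(-1, per_section).std(axis=1)

for i, s in enumerate(stds):
    status = "intervene" if s > ALERT_STD_MM else "ok"
    print(f"section {i}: km {i * SECTION_M / 1000:.1f}  std = {s:.2f} mm  {status}")
```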

Relevance:

80.00%

Abstract:

Master's degree in Socio-Organizational Intervention in Health - Specialization area: Health Services Administration and Management Policies.

Relevance:

80.00%

Abstract:

Project work submitted to obtain the degree of Master in Civil Engineering, specialization area of Structures.

Relevance:

80.00%

Abstract:

This paper presents a case study of heat exchanger network (HEN) retrofit aimed at reducing utility consumption in a biodiesel production process. Pinch analysis studies allow the minimum utility duties as well as the maximum heat recovery to be determined. The heat exchangers already running in the process for heat recovery impose a serious restriction on implementing a grassroots HEN design based on the pinch studies. Keeping the existing HEN, a set of alternatives with additional heat exchangers was created and analysed using industrial advice and selection criteria. The final proposed solution increases heat recovery from the current 18 % to 23 % of the total heating needs of the process, with an estimated annual hot utility saving of 35 k€/y.
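The utility targets mentioned above are typically obtained with the problem table algorithm, a standard pinch analysis procedure. The sketch below implements it under assumed stream data and dT_min; these streams are invented for illustration and are not the biodiesel process streams of the study.

```python
# Hedged sketch of the problem table algorithm for utility targeting.
# Stream data and dT_min are illustrative assumptions.
def problem_table(streams, dt_min):
    # streams: list of (kind, T_supply, T_target, CP) with CP in kW/K
    shift = dt_min / 2.0
    shifted = [(k, ts - shift, tt - shift, cp) if k == "hot"
               else (k, ts + shift, tt + shift, cp)
               for k, ts, tt, cp in streams]

    bounds = sorted({t for _, ts, tt, _ in shifted for t in (ts, tt)},
                    reverse=True)

    cascade, heat = 0.0, [0.0]
    for hi, lo in zip(bounds, bounds[1:]):
        net_cp = 0.0
        for kind, ts, tt, cp in shifted:
            if kind == "hot" and ts >= hi and tt <= lo:
                net_cp += cp          # hot stream releases heat in this interval
            if kind == "cold" and ts <= lo and tt >= hi:
                net_cp -= cp          # cold stream absorbs heat in this interval
        cascade += net_cp * (hi - lo)
        heat.append(cascade)

    q_hot_min = max(0.0, -min(heat))   # minimum hot utility (kW)
    q_cold_min = heat[-1] + q_hot_min  # minimum cold utility (kW)
    return q_hot_min, q_cold_min

streams = [("hot", 180, 60, 2.0), ("hot", 150, 30, 4.0),
           ("cold", 20, 135, 3.0), ("cold", 80, 140, 1.5)]
print(problem_table(streams, dt_min=10))
```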

Relevance:

80.00%

Abstract:

Internship report presented to the Escola Superior de Educação de Lisboa to obtain the degree of Master in Teaching of the 1st and 2nd Cycles of Basic Education.

Relevance:

40.00%

Abstract:

The development of high spatial resolution airborne and spaceborne sensors has improved the capability of ground-based data collection in the fields of agriculture, geography, geology, mineral identification, detection [2, 3], and classification [4–8]. The signal read by the sensor from a given spatial element of resolution and at a given spectral band is a mixture of components originating from the constituent substances, termed endmembers, located at that element of resolution. This chapter addresses hyperspectral unmixing, which is the decomposition of the pixel spectra into a collection of constituent spectra, or spectral signatures, and their corresponding fractional abundances indicating the proportion of each endmember present in the pixel [9, 10]. Depending on the mixing scales at each pixel, the observed mixture is either linear or nonlinear [11, 12]. The linear mixing model holds when the mixing scale is macroscopic [13]; the nonlinear model holds when the mixing scale is microscopic (i.e., intimate mixtures) [14, 15]. The linear model assumes negligible interaction among distinct endmembers [16, 17], whereas the nonlinear model assumes that incident solar radiation is scattered by the scene through multiple bounces involving several endmembers [18]. Under the linear mixing model, and assuming that the number of endmembers and their spectral signatures are known, hyperspectral unmixing is a linear problem, which can be addressed, for example, under the maximum likelihood setup [19], the constrained least-squares approach [20], spectral signature matching [21], the spectral angle mapper [22], and subspace projection methods [20, 23, 24]. Orthogonal subspace projection [23] reduces the data dimensionality, suppresses undesired spectral signatures, and detects the presence of a spectral signature of interest. The basic concept is to project each pixel onto a subspace that is orthogonal to the undesired signatures. As shown in Settle [19], the orthogonal subspace projection technique is equivalent to the maximum likelihood estimator. This projection technique was extended by three unconstrained least-squares approaches [24] (signature space orthogonal projection, oblique subspace projection, and target signature space orthogonal projection). Other works using the maximum a posteriori probability (MAP) framework [25] and projection pursuit [26, 27] have also been applied to hyperspectral data. In most cases, however, the number of endmembers and their signatures are not known. Independent component analysis (ICA) is an unsupervised source separation process that has been applied with success to blind source separation, feature extraction, and unsupervised recognition [28, 29]. ICA consists of finding a linear decomposition of observed data into statistically independent components. Given that hyperspectral data are, in given circumstances, linear mixtures, ICA comes to mind as a possible tool to unmix this class of data. In fact, the application of ICA to hyperspectral data has been proposed in reference 30, where endmember signatures are treated as sources and the mixing matrix is composed of the abundance fractions, and in references 9, 25, and 31–38, where the sources are the abundance fractions of each endmember. The first approach faces two problems: (1) the number of samples is limited to the number of channels, and (2) the process of pixel selection, playing the role of mixed sources, is not straightforward.
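A minimal sketch of the orthogonal subspace projection idea summarized above, assuming synthetic random signatures: the projector annihilates the span of the undesired signatures, and the target is then detected by correlating the projected pixel with the target signature.

```python
# Sketch of orthogonal subspace projection (OSP) with synthetic data.
import numpy as np

rng = np.random.default_rng(1)
bands, n_undesired = 50, 3
U = rng.random((bands, n_undesired))   # undesired endmember signatures
d = rng.random(bands)                  # target signature of interest

# P = I - U (U^T U)^{-1} U^T annihilates everything in span(U)
P = np.eye(bands) - U @ np.linalg.pinv(U)

pixel = 0.6 * d + U @ np.array([0.1, 0.2, 0.1])  # mixed pixel, no noise
score = d @ P @ pixel                            # OSP detector output
print(f"detector score: {score:.3f}")
print(f"residual undesired energy: {np.linalg.norm(P @ U):.2e}")  # ~0
```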
In the second approach, ICA is based on the assumption of mutually independent sources, which is not the case for hyperspectral data, since the sum of the abundance fractions is constant, implying dependence among the abundances. This dependence compromises the applicability of ICA to hyperspectral images. In addition, hyperspectral data are immersed in noise, which degrades ICA performance. IFA [39] was introduced as a method for recovering independent hidden sources from their observed noisy mixtures. IFA implements two steps. First, source densities and noise covariance are estimated from the observed data by maximum likelihood. Second, sources are reconstructed by an optimal nonlinear estimator. Although IFA is a well-suited technique to unmix independent sources under noisy observations, the dependence among abundance fractions in hyperspectral imagery compromises, as in the ICA case, IFA performance. Considering the linear mixing model, hyperspectral observations lie in a simplex whose vertices correspond to the endmembers. Several approaches [40–43] have exploited this geometric feature of hyperspectral mixtures [42]. The minimum volume transform (MVT) algorithm [43] determines the simplex of minimum volume containing the data. MVT-type approaches are complex from the computational point of view: usually, these algorithms first find the convex hull defined by the observed data and then fit a minimum volume simplex to it. Aiming at lower computational complexity, algorithms such as vertex component analysis (VCA) [44], the pixel purity index (PPI) [42], and N-FINDR [45] still find the minimum volume simplex containing the data cloud, but they assume the presence in the data of at least one pure pixel of each endmember. This is a strong requisite that may not hold in some data sets; in any case, these algorithms find the set of purest pixels in the data. Hyperspectral sensors collect spatial images over many narrow contiguous bands, yielding large amounts of data. For this reason, the processing of hyperspectral data, including unmixing, is very often preceded by a dimensionality reduction step to reduce computational complexity and to improve the signal-to-noise ratio (SNR). Principal component analysis (PCA) [46], maximum noise fraction (MNF) [47], and singular value decomposition (SVD) [48] are three well-known projection techniques widely used in remote sensing in general and in unmixing in particular. The newly introduced method [49] exploits the structure of hyperspectral mixtures, namely the fact that spectral vectors are nonnegative. The computational complexity associated with these techniques is an obstacle to real-time implementations; to overcome this problem, band selection [50] and non-statistical [51] algorithms have been introduced. This chapter addresses hyperspectral data source dependence and its impact on ICA and IFA performance. The study considers simulated and real data and is based on mutual information minimization. Hyperspectral observations are described by a generative model that takes into account the degradation mechanisms normally found in hyperspectral applications, namely, signature variability [52–54], abundance constraints, topography modulation, and system noise. The computation of mutual information is based on fitting mixtures of Gaussians (MOG) to the data. The MOG parameters (number of components, means, covariances, and weights) are inferred using the minimum description length (MDL) based algorithm [55].
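The dimensionality reduction step mentioned above can be illustrated with a plain SVD/PCA projection. In the sketch below, the cube size, the number of endmembers, the noise level, and the 99.9% energy criterion are arbitrary choices for illustration.

```python
# Sketch of an SVD/PCA-style dimensionality reduction of a synthetic cube.
import numpy as np

rng = np.random.default_rng(2)
rows, cols, bands, p = 32, 32, 100, 4
M = rng.random((bands, p))                          # p endmember signatures
A = rng.dirichlet(np.ones(p), size=rows * cols).T   # sum-to-one abundances
X = M @ A + 0.01 * rng.standard_normal((bands, rows * cols))

Xc = X - X.mean(axis=1, keepdims=True)              # centre the spectra
U, s, _ = np.linalg.svd(Xc, full_matrices=False)
energy = np.cumsum(s**2) / np.sum(s**2)
k = int(np.searchsorted(energy, 0.999)) + 1         # retained components
X_low = U[:, :k].T @ Xc                             # reduced-dimension data
print(f"signal subspace estimate: {k} of {bands} bands kept")
```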
We study the behavior of the mutual information as a function of the unmixing matrix. The conclusion is that the unmixing matrix minimizing the mutual information may be very far from the true one. Nevertheless, some abundance fractions may be well separated, mainly in the presence of strong signature variability, a large number of endmembers, and high SNR. We end this chapter by sketching a new methodology to blindly unmix hyperspectral data, where abundance fractions are modeled as a mixture of Dirichlet sources. This model enforces the positivity and constant sum (full additivity) constraints on the sources. The mixing matrix is inferred by an expectation-maximization (EM)-type algorithm. This approach is in the vein of references 39 and 56, replacing the independent sources represented by MOG with a mixture of Dirichlet sources. Compared with the geometric-based approaches, the advantage of this model is that there is no need for pure pixels in the observations. The chapter is organized as follows. Section 6.2 presents a spectral radiance model and formulates spectral unmixing as a linear problem accounting for abundance constraints, signature variability, topography modulation, and system noise. Section 6.3 presents a brief summary of the ICA and IFA algorithms. Section 6.4 illustrates the performance of IFA and of some well-known ICA algorithms with experimental data. Section 6.5 studies the limitations of ICA and IFA in unmixing hyperspectral data. Section 6.6 presents results of ICA based on real data. Section 6.7 describes the new blind unmixing scheme and some illustrative examples. Section 6.8 concludes with some remarks.
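A rough sketch of a generative model in the spirit of the one described above: sum-to-one Dirichlet abundances, a multiplicative topography factor, and additive noise set by a target SNR. All sizes and parameter values are assumptions for the sketch, not those used in the chapter.

```python
# Illustrative generative model: linear mixing with positivity and
# full-additivity constraints, topography modulation, and system noise.
import numpy as np

rng = np.random.default_rng(3)
bands, p, n_pix, snr_db = 200, 3, 1000, 30

M = rng.random((bands, p))                     # endmember signatures
A = rng.dirichlet(5.0 * np.ones(p), n_pix).T   # a >= 0, sum_i a_i = 1
gamma = rng.uniform(0.8, 1.2, n_pix)           # topography modulation factor

signal = gamma * (M @ A)
noise_pow = signal.var() / 10 ** (snr_db / 10)
X = signal + np.sqrt(noise_pow) * rng.standard_normal(signal.shape)

assert np.allclose(A.sum(axis=0), 1.0)         # full additivity holds
print(X.shape, "observed spectra with SNR of about", snr_db, "dB")
```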

Relevance:

30.00%

Abstract:

Benchmarking is an important tool for organisations to improve their productivity, product quality, process efficiency, or services. Through benchmarking, organisations can compare their performance with competitors and identify their strengths and weaknesses. This study carries out a benchmarking analysis of the main Iberian sea ports, with a special focus on the efficiency of their container terminals. To this end, data envelopment analysis (DEA) is used, since it is considered by several researchers to be the most effective method to quantify a set of key performance indicators. In order to obtain a more reliable diagnosis tool, DEA is used together with data mining in comparing the operational data of the sea ports' container terminals during 2007. Taking into account that sea ports are global logistics networks, performance evaluation is essential for effective decision making aimed at improving their efficiency and, therefore, their competitiveness.
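The abstract does not specify which DEA variant was used; the sketch below shows one standard formulation, the input-oriented CCR model in multiplier form, solved as one linear program per terminal. The input/output matrices are invented, not the 2007 port data.

```python
# Hedged sketch of input-oriented CCR DEA (multiplier form), one LP per DMU.
import numpy as np
from scipy.optimize import linprog

# rows = DMUs (terminals); X: inputs (e.g. quay length, cranes); Y: outputs (TEU)
X = np.array([[1200.0, 10], [800.0, 6], [1500.0, 14], [900.0, 7]])
Y = np.array([[500.0], [320.0], [540.0], [410.0]])

def ccr_efficiency(j):
    m, s = X.shape[1], Y.shape[1]
    # variables z = [u (output weights), v (input weights)], all >= 0
    c = np.concatenate([-Y[j], np.zeros(m)])          # maximise u.y_j
    A_ub = np.hstack([Y, -X])                         # u.y_k - v.x_k <= 0
    b_ub = np.zeros(X.shape[0])
    A_eq = np.concatenate([np.zeros(s), X[j]])[None]  # v.x_j = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (s + m), method="highs")
    return -res.fun                                   # efficiency in (0, 1]

for j in range(X.shape[0]):
    print(f"terminal {j}: efficiency = {ccr_efficiency(j):.3f}")
```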

Relevance:

30.00%

Abstract:

The kraft pulps produced from heartwood and sapwood of Eucalyptus globulus at 130 °C, 150 °C, and 170 °C were characterized by wet chemistry (total lignin as the sum of the Klason and soluble lignin fractions) and by pyrolysis (total lignin denoted as py-lignin). The total lignin content obtained with both methods was similar. In the course of delignification, the py-lignin values were higher (by 2 to 5%) than the Klason values, which is in line with the importance of soluble lignin for total lignin determination. Pyrolysis analysis presents advantages over wet chemical procedures, and it can be applied to wood and pulps to determine lignin contents at different stages of the delignification process. The py-lignin values were used for kinetic modelling of delignification, with very high predictive value and results similar to those obtained with wet chemical determinations.
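The kinetic modelling mentioned above is not specified in the abstract; as a simplified stand-in, the sketch below fits a single first-order model L(t) = L0 * exp(-k t) to lignin measurements by log-linear regression. Times and lignin contents are hypothetical, not the study's data.

```python
# Simplified sketch: first-order delignification kinetics fitted to
# hypothetical py-lignin measurements at one cooking temperature.
import numpy as np

t = np.array([0, 30, 60, 90, 120, 150])                 # min
lignin = np.array([27.0, 18.5, 12.9, 9.1, 6.4, 4.5])    # % py-lignin

slope, intercept = np.polyfit(t, np.log(lignin), 1)
k, L0 = -slope, np.exp(intercept)
pred = L0 * np.exp(-k * t)
print(f"k = {k:.4f} 1/min, L0 = {L0:.1f} %")
print("residuals:", np.round(lignin - pred, 2))
```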

Relevance:

30.00%

Abstract:

Independent component analysis (ICA) has recently been proposed as a tool to unmix hyperspectral data. ICA is founded on two assumptions: 1) the observed spectrum vector is a linear mixture of the constituent spectra (endmember spectra) weighted by the corresponding abundance fractions (sources); 2) the sources are statistically independent. Independent factor analysis (IFA) extends ICA to linear mixtures of independent sources immersed in noise. Concerning hyperspectral data, the first assumption is valid whenever the multiple scattering among the distinct constituent substances (endmembers) is negligible and the surface is partitioned according to the fractional abundances. The second assumption, however, is violated, since the sum of the abundance fractions associated with each pixel is constant due to physical constraints in the data acquisition process. Thus, the sources cannot be statistically independent, which compromises the performance of ICA/IFA algorithms in hyperspectral unmixing. This paper studies the impact of hyperspectral source statistical dependence on ICA and IFA performance. We conclude that the accuracy of these methods tends to improve with increasing signature variability, number of endmembers, and signal-to-noise ratio. In any case, there are always endmembers that are incorrectly unmixed. We arrive at this conclusion by minimizing the mutual information of simulated and real hyperspectral mixtures. The computation of mutual information is based on fitting mixtures of Gaussians to the observed data. A method to sort the ICA and IFA estimates in terms of the likelihood of being correctly unmixed is proposed.
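A small numerical check of the dependence argument above: abundances that sum to one are necessarily negatively correlated, so they cannot be statistically independent. The Dirichlet parameters and sample size below are arbitrary.

```python
# Sum-to-one abundances are dependent: the empirical correlation matrix
# of Dirichlet samples has negative off-diagonal entries.
import numpy as np

rng = np.random.default_rng(4)
A = rng.dirichlet(np.ones(3), size=20000)   # rows sum to 1 by construction

print(np.round(np.corrcoef(A.T), 3))        # off-diagonal entries < 0
# For a1 + a2 + a3 = 1, Var(a1) = Cov(a1, 1 - a2 - a3)
#                               = -Cov(a1, a2) - Cov(a1, a3),
# which forces negative covariance with at least one other fraction.
```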

Relevance:

30.00%

Abstract:

Linear unmixing decomposes a hyperspectral image into a collection of reflectance spectra of the materials present in the scene, called endmember signatures, and the corresponding abundance fractions at each pixel in a spatial area of interest. This paper introduces a new unmixing method, called Dependent Component Analysis (DECA), which overcomes the limitations of unmixing methods based on Independent Component Analysis (ICA) and on geometrical properties of hyperspectral data. DECA models the abundance fractions as mixtures of Dirichlet densities, thus enforcing the constraints on abundance fractions imposed by the acquisition process, namely non-negativity and constant sum. The mixing matrix is inferred by a generalized expectation-maximization (GEM) type algorithm. The performance of the method is illustrated using simulated and real data.
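A sketch of DECA's source model as described above: abundance vectors drawn from a mixture of Dirichlet densities. The mixture weights and Dirichlet parameters are illustrative assumptions, and the GEM inference of the mixing matrix is not reproduced here.

```python
# Sampling from a mixture of Dirichlet densities (DECA's source model).
import numpy as np

rng = np.random.default_rng(5)
weights = np.array([0.6, 0.4])                          # mixture weights
alphas = np.array([[8.0, 2.0, 2.0], [2.0, 2.0, 8.0]])   # Dirichlet parameters

n_pix = 1000
comp = rng.choice(len(weights), size=n_pix, p=weights)  # component per pixel
A = np.vstack([rng.dirichlet(alphas[c]) for c in comp])

# non-negativity and constant sum are satisfied by construction
assert (A >= 0).all() and np.allclose(A.sum(axis=1), 1.0)
print("mean abundances per component:")
for c in range(len(weights)):
    print(c, np.round(A[comp == c].mean(axis=0), 3))
```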

Relevance:

30.00%

Abstract:

Chapter in book proceedings with peer review: First Iberian Conference, IbPRIA 2003, Puerto de Andratx, Mallorca, Spain, June 4-6, 2003. Proceedings.

Relevance:

30.00%

Abstract:

Master's degree in Accounting and Management of Financial Institutions.

Relevance:

30.00%

Abstract:

In recent years, the electricity industry has undergone a restructuring process. Among the aims of this process was an increase in competition, especially in the generation activity, where firms would have an incentive to become more efficient. However, the competitive behavior of generating firms might jeopardize the expected benefits of electricity industry liberalization. This paper proposes a conjectural variations model to study the competitive behavior of generating firms acting in liberalized electricity markets. The model computes a parameter that represents the degree of competition of each generating firm in each trading period. In this regard, the proposed model provides a powerful methodology for regulatory and competition authorities to monitor the competitive behavior of generating firms. As an application of the model, a study of the day-ahead Iberian electricity market (MIBEL) was conducted to analyze the impact of the integration of the Portuguese and Spanish electricity markets on the behavior of generating firms, taking into account the hourly results of June and July 2007. The advantages of the proposed methodology over other approaches used to address market power, namely the Residual Supply Index and the Lerner index, are highlighted.
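For reference, a hedged sketch of the two standard market-power measures the paper compares against. The prices, costs, and capacities below are invented for illustration.

```python
# Textbook definitions of the Lerner index and the Residual Supply Index.
def lerner_index(price, marginal_cost):
    # fraction of price above marginal cost; 0 under perfect competition
    return (price - marginal_cost) / price

def residual_supply_index(total_capacity, firm_capacity, demand):
    # supply available without the firm, relative to demand; a value < 1
    # flags that the firm is pivotal in the trading period
    return (total_capacity - firm_capacity) / demand

print(lerner_index(price=52.0, marginal_cost=41.6))                  # 0.2
print(residual_supply_index(total_capacity=36000.0,
                            firm_capacity=9000.0, demand=30000.0))   # 0.9
```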