826 results for fuzzy logic power system stabilizer


Relevance:

100.00%

Publisher:

Abstract:

A new man-made target tracking algorithm is presented that integrates features from FLIR (Forward Looking InfraRed) image sequences within a particle filter. Firstly, a multi-scale fractal feature (MFF) is used to enhance targets in FLIR images. Secondly, the gray-space feature is defined by the Bhattacharyya distance between the intensity histograms of the reference target and a sample target from the MFF image. Thirdly, the motion feature is obtained by differencing two MFF images. Fourthly, a fusion coefficient for integrating the features is obtained automatically by an online, fuzzy-logic-based feature selection method. Finally, a particle filtering framework is developed to carry out the target tracking. Experimental results show that the proposed algorithm can accurately track weak or small man-made targets in FLIR images with complicated backgrounds, and that it is effective, robust, and fast enough for real-time tracking.
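The gray-space feature above rests on the Bhattacharyya distance between intensity histograms. A minimal sketch of that measure (the function name and the small example histograms are illustrative stand-ins, not the paper's code):

```python
import numpy as np

def bhattacharyya_distance(p, q):
    """Bhattacharyya distance between two histograms (normalized internally)."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    p = p / p.sum()
    q = q / q.sum()
    bc = np.sum(np.sqrt(p * q))      # Bhattacharyya coefficient, in [0, 1]
    return -np.log(max(bc, 1e-12))   # 0 for identical histograms, large when disjoint
```

A distance near 0 means the candidate region's histogram matches the reference target's; larger values mean it diverges, which is what the tracker's gray-space feature penalizes.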

Relevance:

100.00%

Publisher:

Abstract:

Genetic algorithms (GAs) have been introduced into site layout planning, as reported in a number of studies. In these studies, the objective functions were defined so as to employ the GAs in searching for the optimal site layout. However, few studies have investigated the actual closeness of relationships between site facilities, and it is these relationships that ultimately govern the site layout. This study determined that the underlying factors of site layout planning for medium-size projects include work flow, personnel flow, safety and environment, and personal preferences. By finding the weightings on these factors and the corresponding closeness indices between each pair of facilities, a closeness relationship was deduced. Two contemporary mathematical approaches, fuzzy logic theory and an entropy measure, were adopted in finding these results, in order to minimize the uncertainty and vagueness of the collected data and improve the quality of the information. GAs were then applied to search for the optimal site layout in a medium-size government project using the GeneHunter software. The objective function involved minimizing the total travel distance. An optimal layout was obtained within a short time, which suggests that the application of GAs to site layout planning is highly promising and efficient.
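A total-travel-distance objective of this kind can be sketched as follows; the closeness indices, location distances, and facility count are made-up stand-ins, not the study's survey data. A GA such as GeneHunter would search over the assignment permutation to minimize this cost.

```python
import numpy as np

# Hypothetical closeness indices between 4 site facilities (higher = should be closer)
closeness = np.array([
    [0, 3, 1, 2],
    [3, 0, 2, 1],
    [1, 2, 0, 3],
    [2, 1, 3, 0],
], dtype=float)

# Hypothetical pairwise distances between 4 candidate locations on site
dist = np.array([
    [0, 10, 20, 30],
    [10, 0, 10, 20],
    [20, 10, 0, 10],
    [30, 20, 10, 0],
], dtype=float)

def layout_cost(perm):
    """Weighted total travel distance of a layout.

    perm[i] is the location assigned to facility i; a GA would evolve
    such permutations and keep the ones with the lowest cost.
    """
    n = len(perm)
    return sum(closeness[i, j] * dist[perm[i], perm[j]]
               for i in range(n) for j in range(i + 1, n))
```

Swapping which facilities sit at which locations changes the cost, which is exactly the signal the GA's fitness function needs.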

Relevance:

100.00%

Publisher:

Abstract:

Associative memory networks such as Radial Basis Functions, Neurofuzzy and Fuzzy Logic networks used for modelling nonlinear processes suffer from the curse of dimensionality (COD): as the input dimension increases, the parameterization, computation cost, training data requirements, etc. increase exponentially. Here a new algorithm is introduced for constructing optimal piecewise locally linear models over a Delaunay partition of the input space, which overcomes the COD and generates locally linear models directly amenable to linear control and estimation algorithms. The training of the model is configured as a new mixture-of-experts network with a new fast decision rule derived using convex set theory. A very fast simulated reannealing (VFSR) algorithm is used to search for a globally optimal Delaunay input space partition. A benchmark nonlinear time series is used to demonstrate the new approach.
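As a rough illustration of evaluating a piecewise linear model over a Delaunay input-space partition (using SciPy's off-the-shelf triangulation rather than the paper's VFSR-optimized partition; all data below are synthetic):

```python
import numpy as np
from scipy.spatial import Delaunay

# Synthetic 2-D training inputs; the target here happens to be globally linear,
# so barycentric interpolation over any simplex reproduces it exactly.
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.5, 0.5]])
y = 2.0 * pts[:, 0] + 3.0 * pts[:, 1] + 1.0

tri = Delaunay(pts)  # Delaunay partition of the input space

def predict(x):
    """Evaluate the piecewise locally linear model at x: locate the simplex
    containing x and interpolate the vertex targets barycentrically."""
    s = tri.find_simplex(x)
    if s < 0:
        raise ValueError("query point is outside the triangulated input region")
    verts = tri.simplices[s]
    # barycentric coordinates of x within the containing simplex
    T = tri.transform[s]
    b = T[:2].dot(np.asarray(x, dtype=float) - T[2])
    w = np.append(b, 1.0 - b.sum())
    return float(w.dot(y[verts]))
```

Each simplex thus carries its own local linear model, which is the property that makes such partitions directly usable by linear control and estimation algorithms.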

Relevance:

100.00%

Publisher:

Abstract:

Wind generation’s contribution to meeting extreme peaks in electricity demand is a key concern for the integration of wind power. In Great Britain (GB), robustly assessing this contribution directly from power system data (i.e. metered wind-supply and electricity demand) is difficult as extreme peaks occur infrequently (by definition) and measurement records are both short and inhomogeneous. Atmospheric circulation-typing combined with meteorological reanalysis data is proposed as a means to address some of these difficulties, motivated by a case study of the extreme peak demand events in January 2010. A preliminary investigation of the physical and statistical properties of these circulation types suggests that they can be used to identify the conditions that are most likely to be associated with extreme peak demand events. Three broad cases are highlighted as requiring further investigation. The high-over-Britain anticyclone is found to be generally associated with very low winds but relatively moderate temperatures (and therefore moderate peak demands, somewhat in contrast to the classic low-wind cold snap that is sometimes apparent in the literature). In contrast, both longitudinally extended blocking over Scotland/Scandinavia and latitudinally extended troughs over western Europe appear to be more closely linked to the very cold GB temperatures (usually associated with extreme peak demands). In both of these latter situations, wind resource averaged across GB appears to be more moderate.

Relevance:

100.00%

Publisher:

Abstract:

Almost all the electricity currently produced in the UK is generated as part of a centralised power system designed around large fossil fuel or nuclear power stations. This power system is robust and reliable, but the efficiency of power generation is low, resulting in large quantities of waste heat. The principal aim of this paper is to investigate an alternative concept: energy production by small-scale generators in close proximity to the energy users, integrated into microgrids. Microgrids (de-centralised electricity generation combined with on-site production of heat) bear the promise of substantial environmental benefits, brought about by higher energy efficiency and by facilitating the integration of renewable sources such as photovoltaic arrays or wind turbines. By virtue of the good match between generation and load, microgrids have a low impact on the electricity network, despite a potentially significant level of generation by intermittent energy sources. The paper discusses the technical and economic issues associated with this novel concept, giving an overview of the generator technologies, the current regulatory framework in the UK, and the barriers that have to be overcome if microgrids are to make a major contribution to the UK energy supply. The focus of this study is a microgrid of domestic users powered by small Combined Heat and Power (CHP) generators and photovoltaics. Focusing on the energy balance between generation and load, it is found that the optimum combination of generators in the microgrid (around a 1.4 kWp PV array per household and 45% household ownership of micro-CHP generators) will maintain energy balance on a yearly basis if supplemented by energy storage of 2.7 kWh per household. We find that there is no fundamental technological reason why microgrids cannot contribute an appreciable part of the UK energy demand.
Indeed, an estimate of cost indicates that the microgrids considered in this study would supply electricity at a cost comparable with the present electricity supply, provided the current support mechanisms for photovoltaics were maintained. Combining photovoltaics, micro-CHP, and a small battery gives a microgrid that is independent of the national electricity network. In the short term, this has particular benefits for remote communities, but more wide-ranging possibilities open up in the medium to long term. Microgrids could meet the need to replace the current generation of nuclear and coal-fired power stations, greatly reducing the demand on the transmission and distribution network.
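The balance criterion (generation matching load, buffered by a small per-household store) can be sketched as a simple state-of-charge bookkeeping check; the net-generation series below is purely illustrative and only the 2.7 kWh capacity comes from the study.

```python
def storage_balance(net_generation, capacity_kwh, soc0=0.0):
    """Track battery state of charge against a series of net generation
    values (generation minus load, kWh per interval). Returns True if the
    store never over- or under-flows, i.e. the microgrid stays balanced
    without importing from or exporting to the national network."""
    soc = soc0
    for g in net_generation:
        soc += g
        if soc < 0 or soc > capacity_kwh:
            return False
    return True
```

Run over a year of interval data, such a check tells you whether a given generator mix and store size keep the microgrid self-sufficient.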

Relevance:

100.00%

Publisher:

Abstract:

The prospective European Supergrid would consist of an integrated power system network in which electricity demand from one country could be met by generation from another. This paper uses a bi-linear fixed-effects model to analyse the determinants of cross-border electricity trade among 34 countries that would be connected by the European Supergrid. The key question the paper addresses is the extent to which the privatisation of European electricity markets has brought about higher cross-border trade in electricity. The analysis uses distance, price ratios, gate closure times, size of peaks, and aggregate demand as standard determinants. Controlling for these, it is concluded that privatisation in most cases led to higher power exchange, and that the benefits are more significant where privatisation measures have been in place for a longer period.
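A fixed-effects model of this kind is typically estimated with the "within" transformation: observations are demeaned within each group (here, each trading pair) before the regression, absorbing the fixed effects. A hedged sketch with synthetic data (the helper name and numbers are not from the paper):

```python
import numpy as np

def within_transform(y, X, groups):
    """Fixed-effects ('within') estimator: demean y and X inside each
    group, then fit OLS on the demeaned data. Returns the slope vector."""
    y = np.asarray(y, dtype=float)
    X = np.asarray(X, dtype=float)
    groups = np.asarray(groups)
    yd = y.copy()
    Xd = X.copy()
    for g in np.unique(groups):
        m = groups == g
        yd[m] -= y[m].mean()          # remove the group's own mean level
        Xd[m] -= X[m].mean(axis=0)
    beta, *_ = np.linalg.lstsq(Xd, yd, rcond=None)
    return beta
```

Because each group's intercept is differenced away, the slope is identified purely from within-pair variation, which is what makes the estimator robust to pair-specific levels of trade.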

Relevance:

100.00%

Publisher:

Abstract:

Wind generation's contribution to supporting peak electricity demand is one of the key questions in wind integration studies. Unlike conventional units, the available outputs of different wind farms cannot be approximated as statistically independent, and hence near-zero wind output is possible across an entire power system. This paper reviews the risk model structures currently used to assess wind's capacity value, along with the resulting data requirements. A central theme is the benefit of performing statistical estimation of the joint distribution of demand and available wind capacity, focusing attention on uncertainties due to limited histories of wind and demand data; examination of Great Britain data from the last 25 years shows that the data requirements are greater than generally thought. A discussion is therefore presented of how analysing the types of weather system that have historically driven extreme electricity demands can help deliver robust insights into wind's contribution to supporting demand, even in the face of such data limitations. The role of the form of the probability distribution of available conventional capacity in driving wind capacity credit results is also discussed.
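Why the joint demand-wind distribution matters can be shown with a toy risk calculation: the same samples yield a different loss-of-load probability when demand and wind are (wrongly) treated as independent, because high demand tends to coincide with low wind. All numbers below are illustrative.

```python
import numpy as np

def lolp_joint(demand, wind, conv_capacity):
    """Loss-of-load probability from the empirical JOINT distribution:
    paired historical (demand, available wind) samples."""
    demand = np.asarray(demand, dtype=float)
    wind = np.asarray(wind, dtype=float)
    return float(np.mean(demand - wind > conv_capacity))

def lolp_independent(demand, wind, conv_capacity):
    """Same metric, but treating demand and wind as statistically
    independent: every demand sample is paired with every wind sample."""
    d = np.asarray(demand, dtype=float)[:, None]
    w = np.asarray(wind, dtype=float)[None, :]
    return float(np.mean(d - w > conv_capacity))
```

With demand [100, 90, 60] and paired wind [0, 5, 30] against 92 units of conventional capacity, the joint estimate is 1/3 while the independence assumption gives 2/9; the gap is exactly the dependence the paper argues must be estimated rather than assumed away.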

Relevance:

100.00%

Publisher:

Abstract:

We consider the two-level network design problem with intermediate facilities. This problem consists of designing a minimum-cost network respecting some requirements, usually described in terms of the network topology or in terms of a desired flow of commodities between source and destination vertices. Each selected link must receive one of two types of edge facility, and the connection of different edge facilities requires a costly and capacitated vertex facility. We propose a hybrid decomposition approach which heuristically obtains tentative solutions for the number and location of the vertex facilities and uses these solutions to limit the computational burden of a branch-and-cut algorithm. We test our method on instances of the power system secondary distribution network design problem. The results show that the method is efficient both in terms of solution quality and computational time. (C) 2010 Elsevier Ltd. All rights reserved.

Relevance:

100.00%

Publisher:

Abstract:

This report describes the outcome of the first visit to Tanzania within the project "Mini-grids supplied by renewable energy - improving technical and social feasibility". The trip included visits to three different organizations: Ihushi Development Center (IDC) near Mwanza, TIDESO near Bukoba, and the Mavuno Project in Karagwe. At IDC, a brief evaluation of the current power system was done and equipment for long-term measurements was installed. At all three locations, investigations regarding current and future electricity demand were conducted and connections to people relevant to the study were established. The report includes some technical specifications as well as observations regarding the organization and management of the technical systems. The trip included only short visits, and therefore only brief introductions to the different organizations, based on observations made by the author. The report hence describes the author's understanding of the technical systems and social structures after only short visits to each organization, and may differ from observations made at another point in time, over a different time period, or by another person.

Relevance:

100.00%

Publisher:

Abstract:

The ever-increasing spurt in digital crimes such as image manipulation, image tampering, signature forgery, image forgery, and illegal transactions has intensified the demand to combat these forms of criminal activity. In this direction, biometrics, the computer-based validation of a person's identity, is becoming more and more essential, particularly for high-security systems. The essence of biometrics is the measurement of a person's physiological or behavioural characteristics, which enables authentication of that person's identity. Biometric-based authentication is also becoming increasingly important in computer-based applications because the amount of sensitive data stored in such systems is growing. The new demands on biometric systems are robustness, high recognition rates, the capability to handle imprecision and uncertainties of a non-statistical kind, and great flexibility. It is exactly here that soft computing techniques come into play. The main aim of this write-up is to present a pragmatic view of applications of soft computing techniques in biometrics and to analyse their impact. It is found that soft computing has already made inroads, in terms of individual methods or in combination. Applications of varieties of neural networks top the list, followed by fuzzy logic and evolutionary algorithms. In a nutshell, soft computing paradigms are used for biometric tasks such as feature extraction, dimensionality reduction, pattern identification, pattern mapping, and the like.

Relevance:

100.00%

Publisher:

Abstract:

Guides for mineral exploration are usually based on conceptual deposit models. These guides typically draw on geologists' experience, on descriptive data, and on genetic data. Numerical modelling, probabilistic and non-probabilistic, to estimate the occurrence of mineral deposits is a newer procedure whose use and acceptance by the geological community grows every day. This thesis applies recent methodologies for generating mineral favorability maps. The so-called Rivera Crystalline Island, an erosional window of the Paraná Basin located in northern Uruguay, was chosen as the case study for applying the methodologies. The mineral favorability maps were built from the following types of data, information, and prospecting results: 1) orbital images; 2) geochemical prospecting; 3) airborne geophysical prospecting; 4) geo-structural mapping; and 5) altimetry. This information was selected and processed on the basis of a mineral deposit model (conceptual model) developed from the San Gregorio Gold Mine. The conceptual model (San Gregorio model) includes descriptive and genetic characteristics of the San Gregorio Mine, which encompasses the significant characteristic elements of the other known mineral occurrences on the Rivera Crystalline Island. Generating the mineral favorability maps involved building a database, processing the data, and integrating the data. The construction and processing stages comprised collecting, selecting, and treating the data so as to constitute the so-called Information Layers. These Information Layers were generated and processed in organized groupings to constitute the Integration Factors for favorability mapping on the Rivera Crystalline Island.
The data were integrated using two different methodologies: 1) Weights of Evidence (data-driven) and 2) Fuzzy Logic (knowledge-driven). The mineral favorability maps resulting from the two integration methodologies were first analysed and interpreted individually, and the results were then compared. Both methodologies succeeded in identifying the known mineralized areas as areas of high favorability, in addition to other areas not yet worked. The favorability maps from the two methodologies coincided with respect to the areas of highest favorability. The Weights of Evidence methodology produced the more conservative favorability map in terms of areal extent, but the more optimistic one in terms of favorability values, compared with the maps produced with the Fuzzy Logic methodology. New targets for mineral exploration were identified and should be investigated in detail.
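The Weights of Evidence integration named above combines, for each binary evidence layer, a positive and a negative weight derived from how known deposits overlap that layer. A minimal sketch under the textbook definitions (the counts below are invented):

```python
import math

def weights_of_evidence(n_deposits_on, n_deposits, n_area_on, n_area):
    """Positive and negative weights for a binary evidence layer.

    n_deposits_on: deposits falling inside the evidence pattern
    n_deposits:    total known deposits in the study area
    n_area_on:     cell count of the evidence pattern
    n_area:        total cell count of the study area
    """
    p_b_given_d = n_deposits_on / n_deposits
    p_b_given_not_d = (n_area_on - n_deposits_on) / (n_area - n_deposits)
    w_plus = math.log(p_b_given_d / p_b_given_not_d)      # layer present
    w_minus = math.log((1 - p_b_given_d) / (1 - p_b_given_not_d))  # absent
    return w_plus, w_minus
```

A strongly positive W+ with a negative W- marks a layer that concentrates deposits well; summing the weights of all layers at each cell (under conditional independence) gives the data-driven favorability map.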

Relevance:

100.00%

Publisher:

Abstract:

This work presents the structuring of a fuzzy controller for the automation of sequencing batch reactors (SBR) in the biological removal of organic matter and nitrogen from domestic wastewater, using the inferential parameters pH, ORP, and DO; the controlled variables were the durations of the aerobic and anoxic reaction phases. The bench-scale experiment comprised two sequencing batch reactors with a working volume of 10 L, of which 6 L were fed with synthetic sewage having the characteristics of domestic wastewater. The automation system consisted of the acquisition of the electrochemical parameters (pH, ORP, and DO), the actuator devices (pump, aerator, and mixer), and the predetermined or fuzzy control. The CONRSB computer program was implemented to integrate the automation system. The fuzzy controller comprised the steps of normalization, fuzzification, inference, defuzzification, and denormalization. The input variables to the fuzzy controller were dpH/dt, dpH/d(t-1), and pH during the aerobic period, and dORP/dt, dORP/d(t-1), and DO during the anoxic period. The crisp variables were normalized to the interval [0, 1] using the extreme values from cycles 1 to 70. Triangular membership functions were used for fuzzification and represented the parameter uncertainties satisfactorily. Fuzzy inference used a heuristic rule base built with expert support, applying Mamdani implication; eighteen symbolic expressions were used for each period, aerobic and anoxic. Defuzzification used the centre-of-area method, which proved efficient in terms of processing time. The fuzzy controller was tuned using MATLAB, together with the Fuzzy Logic Toolbox and Simulink.
The interval between actuations of the fuzzy controller was set at 5.0 minutes, obtained by trial and error. Operation of SBR 1 over the 85 cycles showed an average BOD/TKN ratio of 4.67 mg BOD/mg N, classifying the process as combined carbon oxidation and nitrification. The average food-to-microorganism ratio was 0.11 kg BOD/kg mixed-liquor volatile suspended solids (MLVSS) per day, placing the system among extended-aeration systems, with a sludge age of 29 days. The average sludge volume index was 117.5 mL/g, indicating settling with average characteristics. The average removal efficiencies for carbon and nitrogen were 90.8% (as COD) and 49.8%, respectively. The average daily specific nitrification and denitrification rates were 24.2 g N/kg MLVSS per day and 15.5 g N/kg MLVSS per day, respectively. Real-time monitoring of pH, ORP, and DO showed great potential for controlling the biological processes; pH was the most representative parameter in the aerobic period, while ORP and DO were the most representative in the anoxic period. Operating the SBR with the fuzzy controller, from cycle 71 to 85, gave average carbon and nitrogen removal efficiencies of 96.4% (as COD) and 76.4%, respectively. The average duration of the aerobic period was 162.1 minutes; against the maximum period of 200.0 minutes, this is a 19.0% reduction. The average duration of the anoxic period was 164.4 minutes; against the maximum of 290.0 minutes, this is a 43.3% reduction, demonstrating the robust action of the fuzzy controller. The study of the temporal profile in cycle 85 showed the effective action of the fuzzy controller, associated with the control points in the biological processes of the SBR.
In that cycle, the maximum observed specific nitrification and denitrification rates were 32.7 g NO3-N/kg MLVSS per day and 43.2 g NO3-N/kg MLVSS per day, respectively.
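The fuzzy control loop described above (triangular membership functions, Mamdani-style inference, centre-of-area defuzzification) can be sketched for a single aerobic-phase input; the two-rule base and the universes below are purely illustrative, not the thesis's eighteen-rule base.

```python
import numpy as np

def trimf(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def mamdani_step(dpH_dt):
    """One Mamdani-style inference step for the aerobic phase.

    Illustrative rule base:
      IF dpH/dt is NEGATIVE THEN remaining aeration is LONG
      IF dpH/dt is ZERO     THEN remaining aeration is SHORT
    Output universe: remaining aeration fraction in [0, 1].
    """
    mu_neg = trimf(dpH_dt, -1.0, -0.5, 0.0)    # pH still falling
    mu_zero = trimf(dpH_dt, -0.2, 0.0, 0.2)    # derivative near zero
    u = np.linspace(0.0, 1.0, 201)
    long_t = trimf(u, 0.5, 1.0, 1.5)           # consequent: LONG
    short_t = trimf(u, -0.5, 0.0, 0.5)         # consequent: SHORT
    # Mamdani implication (min) and aggregation (max)
    agg = np.maximum(np.minimum(mu_neg, long_t), np.minimum(mu_zero, short_t))
    return float((u * agg).sum() / agg.sum())  # centre-of-area defuzzification
```

A steeply falling pH yields a crisp output near 1 (keep aerating), while a flat pH profile, the classic end-of-nitrification signal, yields an output near 0 (stop soon), which is the decision the controller re-evaluates at each actuation interval.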

Relevance:

100.00%

Publisher:

Abstract:

Research on governance in the financial system contributes significantly to the analysis of the many elements that influence performance in the sector. Empirical studies in this area, however, are still limited, partly because of the complexity inherent in the notion of governance in the public sphere. Like complex systems, governance can be described as a system comprising a large number of mutually interdependent entities with varying degrees of relationship. In this research, the meaning of regulatory governance of the SFN (the Brazilian National Financial System) falls within this scope of understanding: governance as a phenomenon that results from the many interactions among the actors who influence, or are influenced by, the regulatory activities of the financial sector. Given the specific characteristics of complex systems, an agent-based model is developed and implemented for analysing the regulatory governance of the SFN through simulation experiments. Agent-based models make it possible to express aspects of the agents' interactions and behaviours (the micro level), that is, the non-linear behaviours of the system, which are difficult to capture with other mathematical formalisms. The agent-based model is integrated with an econometric model whose role is to characterize the macroeconomic environment. The micro environment is modelled with computational agents using the BDI (beliefs-desires-intentions) architecture. These agents interact with one another and with the environment; they hold beliefs about the setting in which they act and desires they want to satisfy, leading them to form intentions to act. The agents' behaviour was modelled using fuzzy logic, with rules built through content analysis of information collected from newspaper articles and from semi-structured interviews with financial-sector experts.
The experimental results demonstrate the potential of agent-based simulation for studying complex regulatory-governance environments.