975 results for Protection optical networks


Relevance:

20.00%

Publisher:

Abstract:

A methodology to increase the probability of delivering power to any load point, through the identification of new investments in distribution network components, is proposed in this paper. The method minimizes both the investment cost and the cost of energy not supplied in the network. A DC optimization model based on mixed-integer non-linear programming is developed, using the Pareto front technique to identify the investments in distribution network components that increase the probability of delivering power to any customer at the minimum possible cost for the system operator, while minimizing the cost of energy not supplied. A multi-objective problem is thus formulated. To illustrate the application of the proposed methodology, the paper includes a case study on a 180-bus distribution network.
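The Pareto-front idea at the core of the multi-objective formulation can be sketched as a non-dominated filter over candidate plans. The function name and the candidate cost pairs below are illustrative, not taken from the paper's 180-bus case study:

```python
# Sketch: extracting the Pareto front from candidate investment plans.
# Each plan is a pair (investment_cost, energy_not_supplied_cost);
# both objectives are minimized. All numbers are invented.

def pareto_front(plans):
    """Return the non-dominated plans (minimizing both objectives)."""
    front = []
    for p in plans:
        dominated = any(q[0] <= p[0] and q[1] <= p[1] and q != p for q in plans)
        if not dominated:
            front.append(p)
    return sorted(front)

candidates = [(10, 50), (20, 30), (15, 45), (30, 10), (25, 35), (40, 9)]
front = pareto_front(candidates)
```

The plan (25, 35) is dropped because (20, 30) is cheaper on both objectives; the remaining plans form the trade-off curve the operator would choose from.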

Relevance:

20.00%

Publisher:

Abstract:

The high penetration of distributed energy resources (DER) in distribution networks and the competitive environment of electricity markets impose the use of new approaches in several domains. Network cost allocation, traditionally used in transmission networks, should be adapted to distribution networks considering the specifications of the connected resources. The main goal is to develop a fairer methodology that distributes the distribution network use costs among all players using the network in each period. In this paper, a model considering different types of costs (fixed, losses, and congestion costs) is proposed, comprising the use of a large set of DER, namely distributed generation (DG), demand response (DR) of the direct load control type, energy storage systems (ESS), and electric vehicles capable of discharging energy to the network, known as vehicle-to-grid (V2G). The proposed model includes three distinct phases of operation. The first phase consists of an economic dispatch based on an AC optimal power flow (AC-OPF); in the second phase, Kirschen's and Bialek's tracing algorithms are used and compared to evaluate the impact of each resource on the network. Finally, the MW-mile method is used in the third phase of the proposed model. A 33-bus distribution network with large penetration of DER is used to illustrate the application of the proposed model.
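The MW-mile step in the third phase can be sketched as follows. The flow each user imposes on each line would come from the tracing phase (Kirschen's or Bialek's algorithm); the rates, lengths, and flows below are invented for illustration:

```python
# MW-mile principle: charge each user for every line in proportion to the
# MW flow it causes on that line, weighted by line length and a per-MW-mile
# rate. The flow attributions would come from a tracing algorithm; all
# numbers here are illustrative.

def mw_mile_charges(lines, usage):
    """lines: {name: (rate_per_mw_mile, length_miles)}
       usage: {user: {line_name: mw_flow_attributed_to_user}}
       Returns the total network-use charge per user."""
    charges = {u: 0.0 for u in usage}
    for user, flows in usage.items():
        for line, mw in flows.items():
            rate, length = lines[line]
            charges[user] += rate * length * mw
    return charges

lines = {"L1": (2.0, 10.0), "L2": (1.5, 20.0)}
usage = {"DG1": {"L1": 5.0}, "V2G1": {"L1": 1.0, "L2": 3.0}}
charges = mw_mile_charges(lines, usage)
```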

Relevance:

20.00%

Publisher:

Abstract:

Most distributed generation and smart grid research is dedicated to the study of network operation parameters, reliability, and related topics. However, much of this research uses traditional test systems, such as the IEEE test systems. This work proposes a voltage magnitude study under fault conditions, considering the realistic specifications found in countries like Brazil. The methodology is a hybrid of fuzzy sets and Monte Carlo simulation based on fuzzy-probabilistic models, together with a remedial action algorithm based on optimal power flow. To illustrate the application of the proposed method, the paper includes a case study on a real 12-bus sub-transmission network.
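The Monte Carlo side of the hybrid method can be illustrated with a deliberately tiny sketch: estimating the probability that a load point served by two redundant feeders loses supply. The two-feeder topology and the failure probabilities are invented, and the paper's fuzzy-probabilistic models are far richer than this:

```python
import random

# Toy Monte Carlo sketch of loss-of-supply probability. A load point is
# assumed to be fed by two redundant feeders that fail independently;
# supply is lost only when both fail in the same trial. The seed makes
# the estimate reproducible.

def mc_loss_of_supply(p_fail_a, p_fail_b, trials, seed=42):
    rng = random.Random(seed)
    lost = sum(1 for _ in range(trials)
               if rng.random() < p_fail_a and rng.random() < p_fail_b)
    return lost / trials

estimate = mc_loss_of_supply(0.1, 0.2, 200_000)  # exact value is 0.02
```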

Relevance:

20.00%

Publisher:

Abstract:

This paper presents several forecasting methodologies for predicting solar radiance intensity, based on the application of Artificial Neural Networks (ANN) and Support Vector Machines (SVM). The methodologies differ in the information used to train the methods, i.e., in the complementary environmental variables considered, such as wind speed, temperature, and humidity. Additionally, different ways of considering the data series information have been explored. Sensitivity tests were performed on all methodologies in order to find the best parameterizations for the proposed approaches. Results show that the SVM approach using the exponential Radial Basis Function (eRBF) kernel achieves the best forecasting results, in half the execution time of the ANN-based approaches.
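The exponential Radial Basis Function mentioned above is commonly defined as k(x, y) = exp(-|x - y| / (2*sigma^2)); a minimal sketch, with sigma as the free parameter that the sensitivity tests would tune:

```python
import math

# Exponential RBF kernel as commonly defined (note |x - y|, not |x - y|^2,
# which would be the Gaussian RBF). sigma is a tunable width parameter.

def erbf(x, y, sigma=1.0):
    return math.exp(-abs(x - y) / (2.0 * sigma * sigma))

def kernel_matrix(xs, sigma=1.0):
    """Symmetric Gram matrix over a list of scalar samples."""
    return [[erbf(a, b, sigma) for b in xs] for a in xs]
```

The Gram matrix is what a kernel method such as an SVM regressor would consume; identical inputs give similarity 1, and similarity decays with distance.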

Relevance:

20.00%

Publisher:

Abstract:

Most distributed generation and smart grid research is dedicated to studies of network operation parameters, reliability, and related topics. However, many of these works use traditional test systems, for instance the IEEE test systems. This paper proposes voltage magnitude and reliability studies under fault conditions, considering realistic conditions found in countries like Brazil. The methodology is a hybrid of fuzzy sets and Monte Carlo simulation based on fuzzy-probabilistic models, together with a remedial action algorithm based on optimal power flow. To illustrate the application of the proposed method, the paper includes a case study on a real 12-bus sub-transmission network.

Relevance:

20.00%

Publisher:

Abstract:

In the past few years, induction of protective immunity to cutaneous leishmaniasis has been attempted by many researchers using a variety of antigenic preparations, such as living promastigotes or promastigote extracts, and partially purified or defined proteins. In this study, eleven proteins from Leishmania (Leishmania) amazonensis (LLa), with estimated molecular masses ranging from 97 to 13.5 kDa, were isolated by polyacrylamide gel electrophoresis and electro-elution. The proteins were combined in different vaccine preparations with gp63 and BCG (Bacillus Calmette-Guérin). The antigenicity of these vaccines was measured by their ability to induce the production of IFN-γ by lymphocytes from subjects vaccinated with Leishvacin. The immunogenicity was evaluated in vaccinated mice. C57BL/10 mice were vaccinated with three doses of each vaccine, consisting of 30 µg of each protein, at 15-day intervals. One hundred µg of live BCG was used in the first dose only. Seven days after the last dose, the mice received a first challenge infection with 10⁵ infective promastigotes, and four months later a second challenge was performed. Two months after the second challenge, 42.86% protection was obtained in the groups of mice vaccinated with the protein associations gp63+46+22 kDa, gp63+13.5+25+42 kDa, gp63+46+42 kDa, gp63+66 kDa, and gp63+97 kDa; 57.14% protection was demonstrated with gp63+46+97+13.5 kDa, gp63+46+97 kDa, and gp63+46+33 kDa; and 71.43% protection was obtained with gp63 plus all proteins. The vaccine of gp63+46+40 kDa did not protect the mice, despite good specific stimulation of lymphocytes (LSI = 7.60) and production of 10.77 IU/ml of IFN-γ. When the crude extract of L. (L.) amazonensis was used with BCG, 57.14% protection was found after the first challenge and 28.57% after the second; the same result was observed for gp63.
The data obtained with these vaccines suggest that a future vaccine will probably have to contain a cocktail of proteins, excluding the 40 kDa protein, to protect mice against cutaneous leishmaniasis.

Relevance:

20.00%

Publisher:

Abstract:

Power Law (PL) distributions, such as the Pareto law and Zipf's law, are statistical distributions in which the size of an event is inversely proportional to its frequency. These power laws are characterized by their long tails. According to Vilfredo Pareto (1896), the Italian engineer, scientist, sociologist, and economist behind Pareto's law, 80% of consequences arise from 20% of causes. In his view, a large part of the world economy follows such a distribution: 80% of the world's wealth is held by 20% of the population, and 80% of the world's pollution is produced by 20% of its countries. These percentages may vary within the intervals [75-85] and [15-25]. The same proportion can be applied to time management, where only 20% of the time devoted to a given subject produces about 80% of the results obtained. Pareto's law, also known as the 80/20 rule, has applications in many sciences and in the physical world, notably in biodiversity. The number of occurrences of extreme phenomena, together with their impact on telecommunications networks in disaster situations, both in the immediate support to populations and in the later reconstruction phase, has increasingly concerned civil protection authorities and telecommunications operators. The objective is to prepare and adapt their structures to provide an effective response to these episodes. This work studies the behaviour of several extreme phenomena (critical events) and fits the data with a Pareto distribution (Pareto's law), or power law. Finally, it speculates on the influence of critical events on the use of mobile networks. It is essential that mobile networks be prepared to deal with the repercussions of phenomena of this kind.
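The 80/20 split corresponds to a specific Pareto tail index. For a Pareto distribution with index alpha > 1, the share of the total held by the richest fraction p is p^(1 - 1/alpha), and alpha = log(5)/log(4) ≈ 1.16 yields exactly the 80/20 rule:

```python
import math

# Share of the total mass held by the top fraction p of a Pareto
# distribution with tail index alpha (alpha > 1). The classic 80/20
# rule corresponds exactly to alpha = log(5)/log(4).

def top_share(p, alpha):
    return p ** (1.0 - 1.0 / alpha)

alpha_8020 = math.log(5) / math.log(4)   # ≈ 1.161
share = top_share(0.2, alpha_8020)       # top 20% hold 80% of the total
```

The identity is exact: with p = 1/5, p^(1 - 1/alpha) = 5^(-ln(5/4)/ln 5) = 4/5.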

Relevance:

20.00%

Publisher:

Abstract:

Earthing (grounding) networks are a source of trouble for many telecommunications operators. While the layperson may think there are only antennas and that all that matters is data transmission, the people involved in building a telecommunications station are concerned not only with the radio and antenna side but, fundamentally, with the energy and infrastructure side. In this project, it is therefore important to understand how earthing networks are implemented, how to reach values that satisfy the operators, how to improve existing earthing networks, how to implement boreholes and trenches, and above all how to do better in the future. The materials used and the prior study of the ground conditions where the radio base station is built do not always yield the desired results, even those we expected to reach easily once the tower was erected. This project therefore follows every step of the process: licensing, construction, testing, and finally the measurement of the final earth resistance. If the values are acceptable, so much the better for the telecommunications operator and for the company responsible for the implementation. If the results fall short of expectations, it is time to select the best techniques, drawing on a set of methods that guarantee results at the lowest possible cost, and proceed with the improvements. The protection regimes for people and for the station are very important. It is also important to understand the soil conditions and to know how to simulate with the ERICO software, which is fundamental for comparing the theoretical design with the practical implementation. In the end, the crucial parts are the final measurements, the cost-effectiveness of the implemented improvements, the analysis of the results with valid arguments, and the good sense of the operator in accepting or rejecting the work.
A contract is a contract, but doing more and better is a constant demand.
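As a rough companion to the earth-resistance measurements discussed above, the resistance of a single driven rod is often estimated with Dwight's classical approximation; the formula is standard, but the soil resistivity and rod dimensions below are purely illustrative, not measurements from this project:

```python
import math

# Dwight's approximation for the earth resistance of a single driven rod:
#   R = (rho / (2*pi*L)) * (ln(4*L/a) - 1)
# rho: soil resistivity (ohm·m), L: rod length (m), a: rod radius (m).
# The values used below are illustrative only.

def rod_resistance(rho, length, radius):
    return rho / (2 * math.pi * length) * (math.log(4 * length / radius) - 1)

# A 3 m rod of 16 mm diameter in 100 ohm·m soil:
r = rod_resistance(100.0, 3.0, 0.008)
```

Doubling the rod length lowers the resistance, which is why deeper boreholes are one of the improvement techniques mentioned in the text.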

Relevance:

20.00%

Publisher:

Abstract:

Dissertation presented at the Faculdade de Ciências e Tecnologia, Universidade Nova de Lisboa, in fulfilment of the requirements for the degree of Master in Conservation and Restoration, specialization in painting on canvas.

Relevance:

20.00%

Publisher:

Abstract:

Based on a report for the seminar Industrial Networks at Goethe Universität Frankfurt am Main, summer semester (SS) 2011. Lecturers: Prof. Dr. Blättel-Mink and Prof. Dr. António Moniz.

Relevance:

20.00%

Publisher:

Abstract:

IEEE 802.11 is one of the best-established and most widely used standards for wireless LANs. Its Medium Access Control (MAC) layer assumes that devices adhere to the standard's rules and timers to ensure fair access to and sharing of the medium. However, the flexibility and configurability of wireless card drivers make it possible for selfish misbehaving nodes to gain an advantage over well-behaving nodes. The existence of selfish nodes degrades the QoS of the other devices in the network and may increase their energy consumption. In this paper we propose a green solution for detecting selfish misbehavior in IEEE 802.11-based wireless networks. The proposed scheme works in two phases: a Global phase, which detects whether the network contains selfish nodes at all, and a Local phase, which identifies which node or nodes within the network are selfish. The network must usually be examined for selfish nodes frequently during its operation, since any node may start acting selfishly at any time. Our solution is green in the sense that it conserves network resources: it avoids wasting the nodes' energy by examining every individual node for selfishness when it is not necessary. The proposed detection algorithm is evaluated using extensive OPNET simulations. The results show that the Global network metric clearly indicates the existence of a selfish node, while the Local node metric successfully identifies the selfish node(s). We also provide a mathematical analysis of selfish misbehavior and derive formulas for the successful channel access probability.
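The advantage a selfish node gains can be illustrated with a simple slotted-contention sketch (this is not the paper's OPNET model or its derived formulas): if node i attempts transmission in a slot with probability tau_i, it succeeds when no other node attempts in the same slot:

```python
# Per-slot success probability in a slotted contention model: node i
# succeeds with probability tau_i * prod_{j != i} (1 - tau_j).
# A selfish node that raises its attempt probability (e.g. by shrinking
# its contention window) captures a larger share of the channel.

def success_probs(taus):
    probs = []
    for i, t in enumerate(taus):
        p = t
        for j, u in enumerate(taus):
            if j != i:
                p *= (1.0 - u)
        probs.append(p)
    return probs

fair = success_probs([0.1, 0.1, 0.1])     # everyone behaves
selfish = success_probs([0.4, 0.1, 0.1])  # node 0 misbehaves
```

In the fair case each node succeeds with probability 0.1 * 0.9 * 0.9 = 0.081; when node 0 turns selfish its success probability rises while the well-behaving nodes' probabilities drop, which is exactly the asymmetry a detection metric can look for.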

Relevance:

20.00%

Publisher:

Abstract:

Radio link quality estimation is essential for protocols and mechanisms such as routing, mobility management, and localization, particularly in low-power wireless networks such as wireless sensor networks. Commodity Link Quality Estimators (LQEs), e.g. PRR, RNP, ETX, four-bit, and RSSI, can only provide a partial characterization of links, as they ignore several link properties such as channel quality and stability. In this paper, we propose F-LQE (Fuzzy Link Quality Estimator), a holistic metric that estimates link quality on the basis of four link properties, namely packet delivery, asymmetry, stability, and channel quality, which are expressed and combined using fuzzy logic. We demonstrate through an extensive experimental analysis that F-LQE is more reliable than existing estimators (e.g., PRR, WMEWMA, ETX, RNP, and four-bit), as it provides a finer-grained link classification. It is also more stable, as it has a lower coefficient of variation of link estimates. Importantly, we evaluate the impact of F-LQE on the performance of tree routing, specifically the Collection Tree Protocol (CTP). For this purpose, we adapted F-LQE to build a new routing metric for CTP, which we dubbed F-LQE/RM. Extensive experimental results obtained on widely used state-of-the-art testbeds show that F-LQE/RM significantly improves CTP routing performance over four-bit (the default LQE of CTP) and ETX (another popular LQE). F-LQE/RM improves end-to-end packet delivery by up to 16%, reduces the number of packet retransmissions by up to 32%, reduces the hop count by up to 4%, and improves topology stability by up to 47%.
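The fuzzy combination of link properties can be sketched as follows. The piecewise-linear memberships, the thresholds, and the min-based fuzzy AND below are common textbook choices chosen for illustration; they are not the actual membership functions or weights of F-LQE:

```python
# Sketch of fuzzy link-quality estimation: map each link property to a
# membership degree in [0, 1] ("how much this looks like a good link"),
# then combine the degrees with a fuzzy AND (min operator here).
# Thresholds are illustrative, not F-LQE's.

def membership_high(value, low, high):
    """Piecewise-linear membership: 0 at/below `low`, 1 at/above `high`."""
    if value <= low:
        return 0.0
    if value >= high:
        return 1.0
    return (value - low) / (high - low)

def fuzzy_link_quality(prr, stability, asymmetry, channel_q):
    members = [
        membership_high(prr, 0.5, 0.95),         # packet delivery
        membership_high(stability, 0.0, 1.0),    # higher = steadier link
        membership_high(1.0 - asymmetry, 0.6, 1.0),
        membership_high(channel_q, 0.3, 0.9),
    ]
    return min(members)  # fuzzy AND: the weakest property dominates
```

A single poor property (e.g. a PRR at or below 0.5) drives the estimate to 0 regardless of the others, which is what makes the holistic metric stricter than any single-property estimator.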

Relevance:

20.00%

Publisher:

Abstract:

In-network storage of data in wireless sensor networks helps reduce communications inside the network and favors data aggregation. In this paper, we consider the use of n-out-of-m codes and data dispersal in combination with in-network storage. In particular, we provide an abstract model of in-network storage to show how n-out-of-m codes can be used, and we discuss how this can be achieved in five case studies. We also define a model for evaluating the probability of correct data encoding and decoding, and we use this model, together with simulations, to show how, in the case studies, the parameters of the n-out-of-m codes and of the network should be configured in order to achieve correct data encoding and decoding with high probability.
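The basic decoding-probability calculation behind such a model can be sketched directly: with an n-out-of-m code whose m dispersed fragments survive independently with probability s, decoding succeeds iff at least n fragments survive. The survival probabilities below are illustrative, not the paper's:

```python
import math

# Probability of recovering data stored with an n-out-of-m code when each
# of the m fragments independently survives with probability s: decoding
# succeeds iff at least n of the m fragments survive (binomial tail).

def decode_probability(n, m, s):
    return sum(math.comb(m, k) * s**k * (1 - s)**(m - k)
               for k in range(n, m + 1))

# With s = 0.9, dispersing 3 fragments of which any 2 suffice beats
# requiring all 3 to survive:
p_2_of_3 = decode_probability(2, 3, 0.9)  # 0.972
p_3_of_3 = decode_probability(3, 3, 0.9)  # 0.729
```

This is the kind of trade-off the paper's model explores: redundancy (m - n extra fragments) buys decoding probability at the cost of storage and communication.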

Relevance:

20.00%

Publisher:

Abstract:

Disaster management is one of the most relevant application fields of wireless sensor networks. In this application, the role of the sensor network usually consists of obtaining a representation, or a model, of a physical phenomenon spreading through the affected area. In this work we focus on forest firefighting operations and propose three fully distributed ways of approximating the actual shape of the fire. In the simplest approach, a circular burnt area is assumed around each node that has detected the fire, and the union of these circles gives the overall fire shape. However, since this approach makes intensive use of the wireless sensor network's resources, we propose two in-network aggregation techniques that do not require the complete set of fire detections. The first technique models the fire by means of a complex shape composed of multiple convex hulls representing different burning areas, while the second uses a set of arbitrary polygons. Performance evaluation against realistic fire models in computer simulations reveals that the method based on arbitrary polygons improves the accuracy of the fire shape approximation by 20%, while reducing the overhead of in-network resources to 10% in the best case.
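The convex-hull aggregation in the first technique can be sketched with a standard monotone-chain hull over the detection coordinates (illustrative centralized code, not the paper's distributed in-network implementation):

```python
# Summarizing fire detections by their convex hull: instead of shipping
# every detection point, a region can be described by its hull vertices.
# Andrew's monotone chain algorithm, plus the shoelace formula for area.

def convex_hull(points):
    """Return hull vertices of a set of 2D points in counter-clockwise order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def polygon_area(vertices):
    """Shoelace formula."""
    n = len(vertices)
    s = sum(vertices[i][0] * vertices[(i + 1) % n][1]
            - vertices[(i + 1) % n][0] * vertices[i][1]
            for i in range(n))
    return abs(s) / 2.0

# Five detections, one of them interior to the burnt region:
detections = [(0, 0), (2, 0), (2, 2), (0, 2), (1, 1)]
hull = convex_hull(detections)
```

Interior detections such as (1, 1) are discarded by the hull, which is precisely the data reduction the aggregation technique relies on.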