1000 results for Sistemas de distribuição


Relevance:

30.00%

Abstract:

Direct and indirect infiltration originating in rainfall events is the main cause of multiple problems in drainage systems, such as surcharging of the network, with possible overflows onto the streets, lack of treatment capacity at the respective treatment plants, and increased pollution of the receiving waters. This work therefore set out to characterize and control the undue inflows in the wastewater drainage system of the town of Lorvão, by building a Geographic Information System in which all the collected cadastral information was stored. A strategy was also developed to assess the performance of this system with respect to infiltration flows, based on the measurement of the flows reaching the Wastewater Treatment Plant, on water distribution data, on data from the supplying water reservoir, and on the recorded rainfall.
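
As a rough illustration of the kind of water balance such an assessment relies on, the sketch below estimates daily infiltration as the WWTP inflow in excess of the sanitary flow expected from billed water; the return coefficient, field names and figures are assumptions for illustration, not values from the thesis.

```python
# Hypothetical daily water-balance sketch: infiltration/undue inflow is estimated as the
# wastewater treatment plant (WWTP) inflow that exceeds the sanitary flow expected from
# billed water consumption (return coefficient assumed).

RETURN_COEFFICIENT = 0.8  # assumed fraction of distributed water returned as sewage

def estimate_infiltration(wwtp_inflow_m3, billed_water_m3, coeff=RETURN_COEFFICIENT):
    """Return the estimated infiltration/undue inflow volume for one day (m3)."""
    expected_sanitary = billed_water_m3 * coeff
    return max(wwtp_inflow_m3 - expected_sanitary, 0.0)

# Example with illustrative figures (not measured data):
daily_records = [
    {"wwtp_inflow": 320.0, "billed_water": 250.0, "rain_mm": 0.0},
    {"wwtp_inflow": 540.0, "billed_water": 245.0, "rain_mm": 18.5},
]
for day in daily_records:
    infiltration = estimate_infiltration(day["wwtp_inflow"], day["billed_water"])
    print(f'rain={day["rain_mm"]:5.1f} mm  infiltration={infiltration:6.1f} m3')
```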

Relevance:

30.00%

Abstract:

This work is based on an internship at the utility INOVA-EM-S.A. aimed at reducing non-revenue water. The study focuses specifically on water losses, with greater emphasis on real losses. The topic is increasingly important, with growing awareness of a problem that affects utilities all over the world. At the national level, the regulator ERSAR sets a main target of 20% non-revenue water, a value well below what is observed in several utilities. In a first phase, in order to reduce water losses, the entire water supply system of the municipality of Cantanhede is analysed, namely reservoirs, mains, consumers and their consumption. Subsequently, all consumer data and the data from the existing metering and control zones (ZMCs) are analysed, resulting in a set of indicators. These indicators support the decision on which ZMC to intervene in; the analysis of the minimum night flows obtained from telemetry is also essential here. The choice of the critical metering zone follows from the values of the indicators mentioned above, as well as from the time needed to study and act on that ZMC. The chosen ZMC (Bolho) is the target of detection and intervention campaigns aimed at reducing real losses; in parallel, the whole meter park is reviewed, with verification and replacement of some meters, thereby tackling part of the apparent losses. These processes are accompanied by constant monitoring, which allows the adopted measures to be verified. Interventions are later suspended, giving way to occasional interventions carried out only when necessary, while constant monitoring is maintained during this period. It was found that both the indicators and the minimum night flows rose considerably. A new leak detection campaign is then carried out, this time less exhaustive than the first, in order to assess the state of the ZMC after the whole intensive intervention process. This second campaign identifies new locations with bursts, locations that were not flagged for intervention in the first campaign because they showed no signs of bursts or leakage.
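
As a minimal sketch of the indicators mentioned above (assumed data and names, not taken from the study), the snippet below computes the non-revenue water percentage of a ZMC and extracts the minimum night flow from hourly telemetry readings.

```python
# Illustrative sketch: non-revenue water percentage for a metering zone (ZMC) and the
# minimum night flow, the usual indicator of real losses, from hourly telemetry.

def non_revenue_water_pct(system_input_m3, billed_m3):
    """Non-revenue water as a percentage of the volume put into the system."""
    return 100.0 * (system_input_m3 - billed_m3) / system_input_m3

def minimum_night_flow(hourly_flows, night_hours=(2, 3, 4)):
    """Minimum flow (m3/h) observed during the low-consumption night window."""
    return min(flow for hour, flow in hourly_flows if hour in night_hours)

# Hypothetical month of a ZMC: 41,000 m3 entered the zone, 31,200 m3 were billed.
print(f"NRW = {non_revenue_water_pct(41_000, 31_200):.1f} %")

# Hypothetical telemetry pairs (hour of day, flow in m3/h).
telemetry = [(1, 9.8), (2, 7.4), (3, 6.9), (4, 7.1), (5, 10.5)]
print(f"Minimum night flow = {minimum_night_flow(telemetry):.1f} m3/h")
```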

Relevance:

30.00%

Abstract:

The purpose of this research is to analyze different daylighting systems in schools in the city of Natal/RN. Although daylight is abundantly available locally, architectural recommendations relating sky conditions, dimensions of daylighting systems, shading, fraction of visible sky, required illuminance, glare, occupation period and depth of the lit area are scarce and diffuse. This research examines a selection of aperture systems to explore the daylight potential of each one. The method is divided into three phases. The first phase is modeling, involving the construction of a three-dimensional model of a classroom in SketchUp 2014 that follows recommendations in the literature for good environmental comfort in school settings. The second phase is the dynamic computer simulation of daylight performance with the Daysim software; the input data are the 2009 climate file for the city of Natal/RN, the classroom volume in 3ds format with optical properties assigned to each surface, the sensor mapping file and the user load file. In the third phase, the results of the simulation are organized in a spreadsheet prepared by Carvalho (2014) to determine the occurrence of useful daylight illuminance (UDI) in the range of 300 to 3000 lux and to build illuminance curves and UDI contour plots that identify the uniformity of light distribution, compliance with the minimum illuminance level and the occurrence of glare.
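
As an illustration of the UDI metric referred to above, the sketch below classifies hourly illuminance values from one sensor into the bands below 300 lux, between 300 and 3000 lux (useful) and above 3000 lux; the sample values are invented and the Daysim output format is not reproduced.

```python
# Minimal UDI sketch: classify hourly illuminance values at one sensor during occupied
# hours into the bands used in the abstract (useful daylight between 300 and 3000 lux).

def udi_fractions(illuminances_lux, low=300.0, high=3000.0):
    """Return the fraction of occupied hours below, inside and above the UDI range."""
    n = len(illuminances_lux)
    below = sum(1 for e in illuminances_lux if e < low) / n
    useful = sum(1 for e in illuminances_lux if low <= e <= high) / n
    above = sum(1 for e in illuminances_lux if e > high) / n
    return {"UDI<300": below, "UDI 300-3000": useful, "UDI>3000": above}

# Illustrative hourly values (lux) for one sensor over an occupied day.
sample = [120, 280, 450, 900, 1500, 2600, 3400, 5000, 2100, 800]
print(udi_fractions(sample))
```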

Relevance:

30.00%

Abstract:

An important problem faced by the oil industry is the distribution of multiple oil products through pipelines. Distribution takes place in a network composed of refineries (source nodes), storage parks (intermediate nodes) and terminals (demand nodes) interconnected by a set of pipelines transporting oil and derivatives between adjacent areas. Constraints related to storage limits, delivery time, source availability, and sending and receiving limits, among others, must be satisfied. Some researchers treat this problem from a discrete viewpoint, in which the flow in the network is seen as the sending of batches. Usually there is no separation device between batches of different products, and the losses due to interfaces may be significant. Minimizing delivery time is a typical objective adopted by engineers when scheduling product shipments in pipeline networks; however, the costs incurred by interface losses cannot be disregarded. The cost also depends on pumping expenses, which are mostly due to electricity, and since the industrial electricity tariff varies over the day, pumping at different times of day has different costs. This work presents an experimental investigation of computational methods designed to deal with the problem of distributing oil derivatives in networks while minimizing three objectives simultaneously: delivery time, losses due to interfaces and electricity cost. The problem is NP-hard and is addressed with hybrid evolutionary algorithms. The hybridizations combine Transgenetic Algorithms with classical multi-objective evolutionary architectures such as MOEA/D, NSGA-II and SPEA2, yielding three architectures named MOTA/D, NSTA and SPETA that are applied to the problem. An experimental study compares the algorithms on thirty test cases. Pareto-compliant quality indicators are used to analyse the results, and the significance of the results is evaluated with non-parametric statistical tests.
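
As a schematic illustration of the multi-objective setting (not the thesis code), the snippet below checks Pareto dominance over the three minimization objectives and filters the non-dominated schedules from a set of candidates with invented objective values.

```python
# Pareto dominance for the three minimization objectives named in the abstract
# (delivery time, interface losses, electricity cost) and a naive non-dominated filter.

def dominates(a, b):
    """True if objective vector a dominates b (all components <= and at least one <)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated(solutions):
    """Return the Pareto front of a list of (time, interface_loss, energy_cost) tuples."""
    return [s for s in solutions
            if not any(dominates(other, s) for other in solutions if other != s)]

# Illustrative objective vectors for four candidate pumping schedules.
candidates = [(10.0, 3.2, 120.0), (12.0, 2.1, 110.0), (11.0, 3.5, 150.0), (10.0, 3.2, 125.0)]
print(non_dominated(candidates))  # the last two candidates are dominated
```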

Relevance:

30.00%

Abstract:

A typical electrical power system is characterized by the centralization of power generation. However, with the restructuring of the electric system, this topology is changing through the insertion of generators in parallel with the distribution system (distributed generation), which provides several benefits by being located close to energy consumers. Hence, the integration of distributed generators, especially from renewable sources, has become more common in the Brazilian system every year. This new system topology, however, may bring new challenges in the fields of power system control, operation and protection. One of the main problems related to distributed generation is islanding, which can pose safety risks to people and to the power grid. Among the several islanding protection techniques, passive techniques stand out for their low implementation cost and simplicity, requiring only voltage and current measurements to detect system problems. This work proposes a protection system based on the wavelet transform with overcurrent and under/overvoltage functions, as well as information on fault-induced transients, in order to provide fast detection and identification of faults in the system. The proposed protection scheme was evaluated through simulation and experimental studies, showing performance similar to the conventional overcurrent and under/overvoltage methods, but with the additional detection of the exact moment of the fault.
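
A rough sketch of the passive detection idea is given below, assuming PyWavelets as the wavelet library and a synthetic voltage signal; the thesis does not disclose its implementation, thresholds or parameters.

```python
# Sketch: fault-induced transients show up as spikes in the finest detail coefficients
# of a discrete wavelet decomposition of the measured voltage (all parameters assumed).

import numpy as np
import pywt

fs = 15_360                               # sampling rate (Hz), 256 samples per 60 Hz cycle (assumed)
t = np.arange(0.0, 0.2, 1.0 / fs)
rng = np.random.default_rng(0)
voltage = np.cos(2 * np.pi * 60 * t) + 0.01 * rng.normal(size=t.size)
voltage[t >= 0.1] *= 0.6                  # synthetic 40 % voltage sag starting at t = 0.1 s

coeffs = pywt.wavedec(voltage, "db4", level=4)
d1 = coeffs[-1]                           # finest-scale detail coefficients
sigma = np.median(np.abs(d1)) / 0.6745    # robust noise estimate (assumed rule)
hits = np.where(np.abs(d1) > 5 * sigma)[0]

if hits.size:
    # each level-1 detail coefficient covers roughly two samples of the original signal
    print(f"transient detected around t = {2 * hits[0] / fs:.4f} s")
```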

Relevance:

30.00%

Abstract:

Several abiotic factors are reported in the literature as regulators of the distribution of fish species in marine environments. Among them, the structural complexity of the habitat, benthic composition, depth and distance from the coast are usually reported as positive influences on the diversity of different species, including reef fish, which are dominant elements of reef systems and considered of high ecological and socioeconomic importance. Understanding how these factors influence the distribution and habitat use of reef fish communities is important for their management and conservation. Thus, this study evaluates the influence of these variables on the reef fish community along an environmental gradient of depth and distance from shore on sandstone reefs off the coast of the state of Rio Grande do Norte, Brazil. These variables are also used to build a simple predictive model of reef fish biomass for the studied environment. Data were collected through in situ visual surveys, recording environmental variables (structural complexity of the habitat, type of substrate cover, benthic invertebrates) and ecological variables (richness, abundance and size classes of reef fish). As a complement, information on diet was gathered from the literature and biomass was estimated from the length-weight relationship of each species. Overall, the reefs showed low coral cover; the Shallow, Intermediate I and Intermediate II reefs were dominated by algae and the Deep reefs by algae and sponges. Complexity increased along the gradient and positively influenced species richness and abundance. Both attributes shaped the structure of the reef fish community, increasing fish richness, abundance and biomass and differentiating the trophic structure of the community along the gradient of depth and distance from the coast. The distribution and habitat use of reef fish were associated with food availability. The predictive model identified depth, rugosity and the cover of foliose algae, calcareous algae and soft corals as the variables that most significantly influence reef fish biomass. In short, describing and understanding these patterns are important steps toward elucidating the underlying ecological processes. In this sense, our approach provides a new understanding of the structure of the reef fish community of Rio Grande do Norte, helping to understand part of a whole and to support future monitoring, evaluation, management and conservation actions on these and other Brazilian reefs.
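
As a toy illustration of a simple predictive model of the kind described (the modelling technique and all figures below are assumptions, not data from the study), the sketch fits an ordinary least-squares regression of biomass on depth, rugosity and foliose algae cover.

```python
# Toy least-squares model relating fish biomass to depth, rugosity and foliose algae cover.

import numpy as np

# Hypothetical survey transects: depth (m), rugosity index, foliose algae cover (%).
X = np.array([
    [3.0, 1.1, 40.0],
    [6.0, 1.4, 35.0],
    [9.0, 1.8, 25.0],
    [12.0, 2.1, 20.0],
    [15.0, 2.5, 10.0],
])
biomass = np.array([120.0, 180.0, 260.0, 310.0, 400.0])   # g/m2, illustrative only

# Add an intercept column and fit by ordinary least squares.
A = np.column_stack([np.ones(len(X)), X])
coefs, *_ = np.linalg.lstsq(A, biomass, rcond=None)
print("intercept and coefficients:", np.round(coefs, 2))

# Predict biomass for a new transect at 10 m depth, rugosity 1.9, 22 % foliose algae.
new = np.array([1.0, 10.0, 1.9, 22.0])
print("predicted biomass:", round(float(new @ coefs), 1), "g/m2")
```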

Relevance:

30.00%

Abstract:

Binary systems are key environments for studying the fundamental properties of stars. In this work, we analyze 99 binary systems identified by the CoRoT space mission. From the study of the phase diagrams of these systems, our sample is divided into three groups: systems whose variability is dominated by the binary eclipses; systems presenting strong modulations, probably due to the presence of spots on the stellar surface; and systems whose variability is associated with the expansion and contraction of the surface layers. For the eclipsing binaries, the phase diagrams are used to estimate a morphological classification based on the study of equipotential surfaces. In this context, we apply the wavelet procedure to determine the rotation period, identify the presence of active regions, investigate whether the star exhibits differential rotation and study stellar pulsation. The wavelet transform has been used as a powerful tool in the treatment of a large number of problems in astrophysics. Through the wavelet transform, one can perform a time-frequency analysis of light curves rich in details that contribute significantly to the study of phenomena associated with rotation, magnetic activity and stellar pulsation. In this work, we apply the 6th-order Morlet wavelet, which offers high time and frequency resolution, and obtain local (energy distribution of the signal) and global (time integration of the local map) wavelet power spectra. Using the wavelet analysis, we identify thirteen systems with periodicities related to rotational modulation, in addition to the beating pattern signature in the local wavelet map of five pulsating stars over the entire time span.
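
A schematic sketch of the wavelet analysis follows, assuming PyWavelets and a synthetic light curve: the Morlet continuous wavelet transform yields the local wavelet power map, and its time average gives the global spectrum from which a dominant period can be read.

```python
# Morlet CWT of a synthetic light curve: local power map and global (time-averaged) spectrum.

import numpy as np
import pywt

dt = 0.02                                  # days between samples (assumed cadence)
t = np.arange(0, 60, dt)                   # ~60-day synthetic light curve
rng = np.random.default_rng(1)
flux = 1.0 + 0.01 * np.sin(2 * np.pi * t / 5.3) + 0.001 * rng.normal(size=t.size)

scales = np.arange(2, 400)
coefs, freqs = pywt.cwt(flux - flux.mean(), scales, "morl", sampling_period=dt)

local_power = np.abs(coefs) ** 2           # local wavelet power map (scale x time)
global_power = local_power.mean(axis=1)    # global spectrum: time integration of the local map

best = np.argmax(global_power)
print(f"dominant period = {1.0 / freqs[best]:.2f} days")   # close to the injected 5.3-day modulation
```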

Relevance:

30.00%

Abstract:

Sandstone-type reservoir rocks are commonly responsible for oil accumulation. Wettability is an important parameter for the physical behaviour of the reservoir, since it affects characteristics such as relative permeability to the aqueous phase, residual oil distribution in the reservoir, waterflood operating characteristics and crude oil recovery. This study applied different types of microemulsion systems (MES) to sandstone reservoirs and evaluated their influence on wettability and residual oil recovery. For this purpose, four microemulsions were prepared by varying the nature of the surfactant (ionic and nonionic). The microemulsions were characterized by surface tension, density, particle diameter and viscosity in the temperature range of 30 °C to 70 °C. The oil studied was classified as light and the sandstone was derived from the Botucatu formation. The influence of the microemulsion systems on sandstone wettability was assessed by contact angle measurements, using as parameters the rock treatment time with the MES and the time after contact of the surface with brine, observing the behaviour of the angle variation. The rock was initially oil-wet and its wettability changed to mixed wettability after treatment with the MES, with a preference for water. Regarding the rock-MES contact time, the rock wettability changed more when the contact between the surface and the microemulsion systems was longer. A significant reduction was also noted only in the first 5 minutes of interaction between the treated surface and brine. The systems based on the synthesized anionic surfactant, the commercial cationic, the commercial anionic and the commercial nonionic surfactant presented the best results, in that order. With regard to enhanced oil recovery performance, all systems showed a significant percentage of recovered oil, with the anionic systems presenting the best results; a recovery of 80% was reached, confirming the wettability study results, which point to the influence of this property on the interaction between fluids and reservoir rock and to the ability of microemulsion systems to perform enhanced oil recovery in sandstone reservoirs.

Relevance:

30.00%

Abstract:

Software bug analysis is one of the most important activities in software quality. Quick and correct implementation of the necessary fix matters both to developers, who must keep the software fully functional, and to users, who need to perform their daily tasks. In this context, an incorrect classification of bugs can lead to unwanted situations. One of the main attributes assigned to a bug when it is initially reported is severity, which reflects the urgency of correcting the problem. In datasets extracted from five open-source systems (Apache, Eclipse, Kernel, Mozilla and Open Office), we identified an irregular distribution of bugs with respect to the existing severities, an early sign of misclassification: in the analyzed dataset, about 85% of the bugs are classified with normal severity. This classification rate can have a negative influence on software development, since a misclassified bug may be allocated to a developer with little experience to solve it, so its correction may take longer or even result in an incorrect implementation. Several studies in the literature have disregarded normal bugs, working only with the portion of bugs initially considered severe or non-severe. This work investigates precisely this portion of the data, with the purpose of identifying whether the normal severity reflects the real impact and urgency, whether there are bugs (initially classified as normal) that could be classified with another severity, and whether there are impacts for developers in this context. To this end, an automatic classifier was developed, based on three algorithms (Naïve Bayes, MaxEnt and Winnow), to assess whether the normal severity is correct for the bugs initially categorized with it. The algorithms reached an accuracy of about 80% and showed that between 21% and 36% of the bugs should have been classified differently (depending on the algorithm), which represents somewhere between 70,000 and 130,000 bugs in the dataset.
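
A minimal sketch of the classification idea is shown below, assuming scikit-learn and fabricated bug reports; it trains a bag-of-words Naive Bayes model that predicts a severity label from the free text of a report.

```python
# Bag-of-words Naive Bayes severity classifier on toy bug-report text (not the thesis code).

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Fabricated training reports and labels, for illustration only.
reports = [
    "crash on startup with segmentation fault",
    "application data loss after saving file",
    "typo in the preferences dialog label",
    "button slightly misaligned in toolbar",
]
labels = ["severe", "severe", "non-severe", "non-severe"]

model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(reports, labels)

print(model.predict(["random crash and data corruption when closing the editor"]))
```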

Relevance:

30.00%

Abstract:

The increasing demand for electricity and the ever-shrinking forecasts for fossil fuel reserves, together with growing environmental concern about their use, have raised concerns about the quality of electricity generation, making new investments in generation from alternative, clean and renewable sources very welcome. Distributed generation is one of the main solutions for independent and self-sufficient generating systems, such as the sugarcane industry. This sector has grown considerably, contributing significantly to the electricity fed into the distribution networks. In this context, one of the main objectives of this study is to propose the implementation of an algorithm to detect islanding disturbances in the electrical system, characterized by under- or overvoltage situations. The algorithm should also quantify the time during which the system operated under these conditions, in order to assess the possible consequences for the electric power system. To achieve this, the wavelet multiresolution analysis (MRA) technique was used to detect the generated disturbances. The resulting data can be processed and used for predictive maintenance of the protection equipment of the electrical network, since this equipment is prone to damage after prolonged operation under abnormal frequency and voltage conditions.
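
As an illustrative sketch of the duration-quantification step (fabricated waveform and thresholds; the detection in the thesis relies on wavelet multiresolution analysis, which is not reproduced here), the snippet below compares the per-cycle RMS voltage with under/overvoltage limits and accumulates the time spent outside the normal band.

```python
# Quantify how long a voltage waveform stays outside the assumed 0.9-1.1 pu band.

import numpy as np

fs, f0 = 7_680, 60                            # sampling rate and fundamental frequency (assumed)
cycle = fs // f0                              # samples per cycle
t = np.arange(0, 1.0, 1 / fs)
v = np.sqrt(2) * np.cos(2 * np.pi * f0 * t)   # 1.0 pu nominal voltage
v[(t >= 0.3) & (t < 0.55)] *= 0.8             # synthetic undervoltage between 0.3 s and 0.55 s

rms = np.array([np.sqrt(np.mean(v[i:i + cycle] ** 2))
                for i in range(0, v.size - cycle + 1, cycle)])
abnormal = (rms < 0.9) | (rms > 1.1)          # under/overvoltage limits in pu (assumed)

duration = abnormal.sum() * cycle / fs
print(f"time outside the 0.9-1.1 pu band: {duration:.3f} s")   # 0.250 s for this example
```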

Relevance:

30.00%

Abstract:

Universities are institutions that generate and manipulate large amounts of data as a result of the multiple functions they perform, the number of professionals involved and the students they serve. Information obtained from these data is used, for example, in operational activities and to support decision-making by managers. To assist managers in accomplishing their tasks, Information Systems (IS) are presented as tools offering features that aim to improve user performance, help with routine tasks and support decision-making. The purpose of this research is to evaluate the influence of user and task characteristics on IS success. The study is descriptive-exploratory; therefore, the constructs used to define the conceptual model of the research are known and previously validated. Individual characteristics of users and of the task, however, are antecedents of IS success. To test the influence of these antecedents, a decision-support IS was developed using the Multicriteria Decision Aid Constructivist (MCDA-C) methodology, with the participation and involvement of users. The sample consisted of managers and former managers of UTFPR Campus Pato Branco who work or have worked in teaching, research, extension and management activities. For data collection, an experiment was conducted in the computer laboratory of Campus Pato Branco to verify the research hypotheses. The experiment consisted of performing a task of distributing teaching positions among the academic departments using the developed IS; the task involved decision-making related to management activities, and the data that fed the system were real data from the Campus itself. A questionnaire was answered by the participants of the experiment to obtain data for testing the research hypotheses. The results of the data analysis partially confirmed the influence of individual characteristics on IS success and fully confirmed the influence of task characteristics. The collected data did not support a significant relationship between individual characteristics and individual impact; for many of the participants, the first contact with the IS took place during the experiment, which indicates a lack of experience with the system. Regarding IS success, the data revealed no significant relationship between Information Quality (IQ) and Individual Impact (II). It is noteworthy that the IS used in the experiment supports decision-making and the information it provides is strictly quantitative, which may have caused some conflict in the analysis of the criteria involved in the decision-making process, since the criteria of teaching, research, extension and management are interconnected in such a way that one reflects on another. Thus, the opinion of the managers does not depend exclusively on quantitative data, but also on the knowledge and value judgment that each manager has about the problem to be solved.

Relevance:

30.00%

Abstract:

Municipal management in any country requires planning and a balanced allocation of resources. In Brazil, the Law of Budgetary Guidelines (LDO) guides municipal managers toward that balance. This research develops a model that seeks a balanced allocation of public resources in Brazilian municipalities, taking the LDO as a parameter. In a first step, statistical techniques and multicriteria analysis are used to define allocation strategies based on the technical judgment of the municipal manager. In a second step, a linear programming optimization is presented, whose objective function is derived from the preferences of the manager and his staff. The statistical treatment supports the multicriteria development in the definition of replacement rates through time series. The multicriteria analysis was structured by defining the criteria and alternatives and by applying the UTASTAR method to calculate the replacement rates. After these initial settings, a linear programming application was developed to find the optimal allocation of the municipal budget execution resources. Data from the budget of a municipality in southwestern Paraná were used in the application of the model and in the analysis of the results.
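
A toy sketch of the optimization step follows, assuming SciPy as the solver; the weights, bounds and budget are invented and only stand in for the preference information that the UTASTAR stage would provide.

```python
# Toy budget-allocation LP: maximize a weighted preference score subject to a fixed total
# budget and minimum/maximum shares per spending area (all figures assumed).

import numpy as np
from scipy.optimize import linprog

budget = 100.0                                 # total budget, in millions (illustrative)
weights = np.array([0.35, 0.30, 0.20, 0.15])   # preference weights (stand-ins for elicited values)

# linprog minimizes, so negate the weights to maximize the weighted allocation.
c = -weights
A_eq = np.ones((1, 4))                         # allocations must sum to the whole budget
b_eq = [budget]
bounds = [(20, 45), (15, 40), (10, 30), (5, 25)]   # per-area lower/upper limits (assumed)

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
print("optimal allocation:", np.round(res.x, 1))
```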

Relevance:

30.00%

Abstract:

In Portugal, the "Sistema Nacional de Sangue" (National Blood System) relies on a blood supply chain based entirely on donations from unpaid volunteer donors, with an exclusively public national blood service. Its organizational model is based on a nationwide structure with functions of regulation, oversight, production and distribution of blood and blood components to public and private health organizations. Complementing this supply network, there are other sources of supply, such as licensed hospitals and other institutions, which also carry out all the activities of collection, processing and validation of blood components for final use. In these situations, the production of blood components is carried out from a perspective of local or regional self-sufficiency, one of the fundamental principles of every national blood service. This dissertation aims to contribute to the knowledge of the blood supply and consumption system, raising some questions about what efficient and effective management of the blood supply chain may mean. Effective management of the supply chain requires deep knowledge of the donor population: who the donors are, what motivates them, and how they behave with respect to donation. Only then is it possible to intervene effectively and tailor the most effective strategies for their motivation and retention as regular donors. Accordingly, a case study was carried out on the donor population of the Immunohemotherapy Service of the Centro Hospitalar de Vila Nova de Gaia-Espinho (CHVNG-Espinho), in order to identify opportunities to improve donor recruitment strategies and the promotion of blood donation, adding value to the service and to CHVNG-Espinho.