Abstract:
The present work aimed to evaluate the influence of several quantities and test parameters on the melt flow index of thermoplastics and to calculate the uncertainty associated with the determinations. In a first stage, the main parameters influencing the determination of the melt flow index were identified, and the plastometer temperature, the applied load, the die diameter, the measurement length, the type of cut and the number of specimens were selected. To evaluate the influence of these parameters on the measurement of the melt flow index, a design of experiments was carried out, divided into three stages. Analysis of variance was used as the tool to treat the results obtained. After the complete analysis of the factorial designs, it was found that the effects of the factors plastometer temperature, applied load and die diameter are statistically significant in the measurement of the melt flow index. In the second stage, the uncertainty associated with the measurements was calculated. For this purpose, one of the most common methods was selected, described in the Guide to the Expression of Uncertainty in Measurement and known as the GUM method, using the "step by step" approach. Initially, it was necessary to build a mathematical model for the measurement of the melt flow index relating the different parameters involved. The behaviour of each parameter was studied through two functions, again resorting to analysis of variance. Through the law of propagation of uncertainties it was possible to determine the combined standard uncertainty and, after estimating the number of degrees of freedom, the value of the coverage factor. Finally, the expanded uncertainty of the measurement was determined, relative to the determination of the melt volume-flow rate.
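As a minimal sketch of the propagation step described above, the combined standard uncertainty can be computed from sensitivity coefficients and input standard uncertainties; the melt volume-flow rate model, the values and the uncertainties below are illustrative assumptions, not those determined in the work.

```python
import math

def combined_standard_uncertainty(terms):
    """GUM law of propagation of uncertainty for independent inputs:
    u_c = sqrt(sum((df/dx_i * u(x_i))^2))."""
    return math.sqrt(sum((c * u) ** 2 for c, u in terms))

# Hypothetical MVR model: MVR = A * L * 600 / t (piston area, piston travel, time)
A, L, t = 0.711, 10.0, 30.0          # cm^2, mm between marks, s (illustrative values)
mvr = A * L * 600 / t                # cm^3 per 10 min

terms = [
    (L * 600 / t, 0.001),            # dMVR/dA paired with u(A)
    (A * 600 / t, 0.05),             # dMVR/dL paired with u(L)
    (-A * L * 600 / t ** 2, 0.1),    # dMVR/dt paired with u(t)
]
u_c = combined_standard_uncertainty(terms)
U = 2.0 * u_c                        # expanded uncertainty with coverage factor k = 2
```

The coverage factor k = 2 used here stands in for the value the GUM procedure derives from the effective degrees of freedom.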
Abstract:
Introduction: After a CNS lesion, the muscle may lose its variability and flexibility, as observed in individuals after stroke. The characterization of muscle tone has been an indicator to consider in clinical diagnosis. Tone alterations may result from a combination of neural changes, as a consequence of the processes inherent to neuroplasticity, and biomechanical changes. Objective: To verify the changes in muscle tone, according to the Tardieu scale, after the application of a neuromotor rehabilitation programme based on the Bobath concept in two individuals with stroke sequelae. It was also intended to verify the repercussions on functional activities. Participants and methods: Two individuals were selected and a rehabilitation programme was applied for eleven weeks, with assessments at two moments, before the intervention (PRE) and after the intervention (POST). Several assessment instruments were applied, namely the Tardieu scale. The rehabilitation programme was based on the Bobath concept. Results: On the Tardieu scale, both individuals showed improvements in the quality of muscle reaction. Both individuals showed improvements in postural control and balance, which were evidenced in the ICF. Conclusion: It was possible to observe changes in muscle tone after the application of a rehabilitation programme and, consequently, changes in load distribution over the base of support, in the alignment of joint and muscle structures and in gait. Throughout the intervention, positive repercussions were observed in both individuals, allowing them to perform activities of daily living with less difficulty.
Abstract:
The purpose of this paper is to present and discuss a general HV topology of the solid-state Marx modulator, for unipolar or bipolar generation, connected to a step-up transformer to increase the output voltage applied to a resistive load. Due to the use of an output transformer, the reset of the transformer is discussed in order to guarantee zero average voltage applied to the primary. The recovery of the transformer magnetizing energy back to the energy storage capacitors is also discussed. Simulation results for a circuit that generates 100 kV pulses using 1000 V semiconductors are presented and discussed with regard to the voltage and current stress on the semiconductors.
Abstract:
Facing the lateral vibration problem of a machine rotor as a beam on elastic supports in bending, the authors deal with the free vibration of elastically restrained Bernoulli-Euler beams carrying a finite number of concentrated elements along their length. Based on Rayleigh's quotient, an iterative strategy is developed to find the approximate torsional stiffness coefficients, which allows reconciling the theoretical model results with the experimental ones, obtained through impact tests. The algorithm treats the vibration of continuous beams under a determined set of boundary and continuity conditions, including different torsional stiffness coefficients and the effect of attached concentrated masses and rotational inertias, not only in the energy terms of Rayleigh's quotient but also in the mode shapes, considering shape functions defined in branches. Several loading cases are examined and examples are given to illustrate the validity of the model and the accuracy of the obtained natural frequencies.
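The core quantity in the iterative strategy above is Rayleigh's quotient, which for a trial mode shape gives an upper bound on the square of the lowest natural frequency. The sketch below uses an assumed 2-DOF stiffness/mass pair, not the beam model of the paper; in the paper the support stiffness coefficients would be adjusted iteratively until the quotient matches the measured frequencies.

```python
import numpy as np

def rayleigh_quotient(K, M, phi):
    """Estimate of omega^2 for trial shape phi: (phi^T K phi) / (phi^T M phi)."""
    return float(phi @ K @ phi) / float(phi @ M @ phi)

# Illustrative 2-DOF stiffness and mass matrices (assumed values)
K = np.array([[2.0, -1.0], [-1.0, 1.0]])
M = np.eye(2)
phi = np.array([1.0, 2.0])                # trial mode shape

omega2 = rayleigh_quotient(K, M, phi)     # upper bound on the lowest eigenvalue
omega2_exact = float(np.linalg.eigvalsh(K).min())   # exact value, since M = I here
```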
Abstract:
The attached document is the pre-print version (the initial version submitted to the publisher).
Abstract:
With the electricity market liberalization, distribution and retail companies are looking for better market strategies based on adequate information on the consumption patterns of their electricity consumers. A fair insight into the consumers' behavior will permit the definition of specific contract aspects based on the different consumption patterns. In order to form the different consumer classes, and to find a set of representative consumption patterns, we use electricity consumption data from a utility client database and two approaches: the Two-Step clustering algorithm and the WEACS approach, based on evidence accumulation (EAC), for combining partitions in a clustering ensemble. While EAC uses a voting mechanism to produce a co-association matrix based on the pairwise associations obtained from N partitions, where each partition has equal weight in the combination process, the WEACS approach uses subsampling and weights the partitions differently. As a complementary step to the WEACS approach, we combine the partitions obtained in the WEACS approach with the ALL clustering ensemble construction method, and we use the Ward link algorithm to obtain the final data partition. The characterization of the obtained consumer clusters was performed using the C5.0 classification algorithm. Experimental results showed that the WEACS approach leads to better results than many other clustering approaches.
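The equal-weight EAC voting step described above can be sketched in a few lines: each partition votes, and the co-association matrix records the fraction of partitions in which each pair of samples shares a cluster. The partitions below are hypothetical.

```python
import numpy as np

def co_association(partitions):
    """Equal-weight EAC voting: C[i, j] is the fraction of the N partitions
    in which samples i and j fall in the same cluster."""
    n = len(partitions[0])
    C = np.zeros((n, n))
    for labels in partitions:
        lab = np.asarray(labels)
        C += (lab[:, None] == lab[None, :]).astype(float)
    return C / len(partitions)

# Three hypothetical partitions of four consumers
parts = [[0, 0, 1, 1],
         [0, 0, 0, 1],
         [1, 1, 0, 0]]
C = co_association(parts)
# C can then be fed as a similarity matrix to a hierarchical method (e.g. Ward link)
```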
Abstract:
Long-term contractual decisions are the basis of efficient risk management. However, those types of decisions have to be supported by a robust price forecast methodology. This paper reports a different approach to long-term price forecasting which tries to answer that need. Making use of regression models, the proposed methodology has as its main objective to find the maximum and minimum Market Clearing Price (MCP) for a specific programming period, with a desired confidence level α. Due to the problem complexity, the meta-heuristic Particle Swarm Optimization (PSO) was used to find the best regression parameters, and the results were compared with those obtained using a Genetic Algorithm (GA). To validate these models, results from realistic data are presented and discussed in detail.
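The idea of using PSO to fit regression parameters can be sketched generically: a minimal swarm minimizes the squared error of a linear model on a few hypothetical MCP observations. This is a generic PSO sketch under assumed parameters, not the authors' exact variant.

```python
import random

random.seed(1)  # reproducible illustration

def pso(objective, dim, n_particles=20, iters=150, w=0.7, c1=1.5, c2=1.5):
    """Minimal PSO minimizer: inertia w, cognitive c1 and social c2 weights."""
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=pbest_val.__getitem__)
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Fit y = a*x + b to hypothetical price observations by minimizing squared error
data = [(1, 3.1), (2, 5.0), (3, 7.2), (4, 8.9)]
sse = lambda p: sum((p[0] * x + p[1] - y) ** 2 for x, y in data)
(a, b), err = pso(sse, dim=2)
```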
Abstract:
The present research paper presents five different clustering methods to identify typical load profiles of medium voltage (MV) electricity consumers. These methods are intended to be used in a smart grid environment to extract useful knowledge about customers' behaviour. The obtained knowledge can be used to support a decision tool, not only for utilities but also for consumers. Load profiles can be used by the utilities to identify the aspects that cause system load peaks and to enable the development of specific contracts with their customers. The framework presented throughout the paper consists of several steps, namely the data pre-processing phase, the application of clustering algorithms and the evaluation of the quality of the partition, which is supported by cluster validity indices. The process ends with the analysis of the discovered knowledge. To validate the proposed framework, a case study with a real database of 208 MV consumers is used.
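One common cluster validity index used to evaluate the quality of a partition is the mean silhouette width; the sketch below computes it from scratch on two well-separated groups of hypothetical daily-load feature vectors (the data and the choice of index are illustrative, not the paper's).

```python
import numpy as np

def mean_silhouette(X, labels):
    """Mean silhouette width: s(i) = (b - a) / max(a, b), averaged over samples,
    where a is the mean intra-cluster distance and b the mean distance to the
    nearest other cluster."""
    X, labels = np.asarray(X, float), np.asarray(labels)
    scores = []
    for i in range(len(X)):
        d = np.linalg.norm(X - X[i], axis=1)
        same = (labels == labels[i]) & (np.arange(len(X)) != i)
        a = d[same].mean()
        b = min(d[labels == k].mean()
                for k in set(labels.tolist()) if k != labels[i])
        scores.append((b - a) / max(a, b))
    return float(np.mean(scores))

# Two well-separated groups of hypothetical load feature vectors
X = [[0.0, 0.0], [0.0, 1.0], [10.0, 10.0], [10.0, 11.0]]
score = mean_silhouette(X, [0, 0, 1, 1])   # close to 1 for a good partition
```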
Abstract:
Short-term risk management is highly dependent on the long-term contractual decisions previously established, on the agent's risk aversion factor and on the accuracy of short-term price forecasts. Trying to answer that problem, this paper provides a different approach to short-term risk management in electricity markets. Based on long-term contractual decisions and making use of a price range forecast method developed by the authors, the short-term risk management tool presented here has as its main concern to find the optimal spot market strategies that a producer should adopt for a specific day, as a function of his risk aversion factor, with the objective of maximizing profits while simultaneously hedging against market price volatility. Due to the complexity of the optimization problem, the authors make use of Particle Swarm Optimization (PSO) to find the optimal solution. Results from realistic data, namely from the OMEL electricity market, are presented and discussed in detail.
Abstract:
This paper describes a methodology developed for the classification of Medium Voltage (MV) electricity customers. Starting from a sample of databases resulting from a monitoring campaign, Data Mining (DM) techniques are used to discover a set of MV consumer typical load profiles and, therefore, to extract knowledge regarding electric energy consumption patterns. In the first stage, several hierarchical clustering algorithms were applied and their clustering performance compared using adequacy measures. In the second stage, a classification model was developed to allow classifying new consumers into one of the clusters resulting from the previous process. Finally, the interpretation of the discovered knowledge is presented and discussed.
Abstract:
The concept of demand response has a growing importance in the context of future power systems. Demand response can be seen as a resource like distributed generation, storage, electric vehicles, etc. All these resources require the existence of an infrastructure able to give players the means to operate and use them in an efficient way. This infrastructure implements in practice the smart grid concept, and should accommodate a large number of diverse types of players in the context of a competitive business environment. In this paper, demand response is optimally scheduled jointly with other resources, such as distributed generation units and the energy provided by the electricity market, minimizing the operation costs from the point of view of a virtual power player who manages these resources and supplies the aggregated consumers. The optimal schedule is obtained using two approaches based on particle swarm optimization (with and without mutation), which are compared with a deterministic approach used as a reference methodology. A case study with two scenarios implemented in DemSi, a demand response simulator developed by the authors, evidences the advantages of the proposed particle swarm approaches.
Fuzzy Monte Carlo mathematical model for load curtailment minimization in transmission power systems
Abstract:
This paper presents a methodology which is based on statistical failure and repair data of the transmission power system components and uses fuzzy-probabilistic modeling for system component outage parameters. Using statistical records makes it possible to develop the fuzzy membership functions of system component outage parameters. The proposed hybrid method of fuzzy sets and Monte Carlo simulation based on the fuzzy-probabilistic models captures both the randomness and the fuzziness of component outage parameters. A network contingency analysis to identify any overloading or voltage violation in the network is performed once the system states have been obtained by Monte Carlo simulation. This is followed by a remedial action algorithm, based on optimal power flow, to reschedule generation and alleviate constraint violations and, at the same time, to avoid any load curtailment, if possible, or, otherwise, to minimize the total load curtailment, for the states identified by the contingency analysis. In order to illustrate the application of the proposed methodology to a practical case, the paper includes a case study for the IEEE 24-bus Reliability Test System (RTS-1996).
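The combination of a fuzzy outage parameter with Monte Carlo state sampling can be sketched as follows: an alpha-cut of a triangular membership function yields an interval of plausible unavailabilities, and the component state is then drawn at random. The fuzzy number, the alpha level and the sampling scheme below are illustrative assumptions.

```python
import random

def triangular_alpha_cut(a, m, b, alpha):
    """Interval of a triangular fuzzy number (a, m, b) at membership level alpha."""
    return a + alpha * (m - a), b - alpha * (b - m)

random.seed(42)

# Hypothetical triangular fuzzy unavailability of one transmission component
lo, hi = triangular_alpha_cut(0.01, 0.02, 0.04, alpha=0.5)

# Monte Carlo: draw an unavailability inside the alpha-cut, then the component state
n_trials = 100_000
outages = sum(random.random() < random.uniform(lo, hi) for _ in range(n_trials))
rate = outages / n_trials   # empirical outage frequency over the sampled states
```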
Abstract:
This paper presents a methodology to choose the distribution network reconfiguration that presents the lowest power losses. The proposed methodology is based on statistical failure and repair data of the distribution power system components and uses fuzzy-probabilistic modeling for system component outage parameters. The proposed hybrid method, using fuzzy sets and Monte Carlo simulation based on the fuzzy-probabilistic models, captures both the randomness and the fuzziness of component outage parameters. Once the system states have been obtained by Monte Carlo simulation, a logic programming algorithm is applied to obtain all possible reconfigurations for each system state. To evaluate the line flows and bus voltages and to identify any overloading and/or voltage violation, an AC load flow is applied to select the feasible reconfiguration with the lowest power losses. To illustrate the application of the proposed methodology, the paper includes a case study that considers a 115-bus distribution network.
Abstract:
With the current increase of energy resource prices and environmental concerns, intelligent load management systems are gaining more and more importance. This paper concerns a SCADA House Intelligent Management (SHIM) system that includes an optimization module using deterministic and genetic algorithm approaches. SHIM undertakes contextual load management based on the characterization of each situation. SHIM considers available generation resources, load demand, supplier/market electricity price, and consumers' constraints and preferences. The paper focuses on the recently developed learning module, which is based on artificial neural networks (ANN). The learning module allows the adjustment of users' profiles over SHIM's lifetime. A case study is presented considering a system with fourteen discrete and four variable loads managed by a SHIM system during five consecutive similar weekends.
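The profile-adjustment idea behind an ANN learning module can be sketched at its smallest scale: a single linear neuron updated by online gradient descent so that its prediction tracks observed user behaviour. The learning rate, the context variable and the samples below are hypothetical, and the paper's module would use a full multi-layer network.

```python
def train_neuron(samples, lr=0.05, epochs=500):
    """Online gradient descent for a single linear neuron y = w*x + b."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in samples:
            err = (w * x + b) - y      # prediction error on this observation
            w -= lr * err * x          # gradient step on the weight
            b -= lr * err              # gradient step on the bias
    return w, b

# Hypothetical learned profile: preferred load level vs. a context variable
samples = [(0.0, 1.0), (1.0, 3.0), (2.0, 5.0)]   # exactly y = 2x + 1
w, b = train_neuron(samples)
```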
Abstract:
This paper proposes a swarm intelligence long-term hedging tool to support electricity producers in competitive electricity markets. This tool investigates the long-term hedging opportunities available to electric power producers through the use of contracts with physical (spot and forward) and financial (options) settlement. To find the optimal portfolio, the producer's risk preference is stated by a utility function (U) expressing the trade-off between the expectation and the variance of the return. The variance estimation and the expected return are based on a forecasted scenario interval determined by a long-term price range forecast model developed by the authors, whose explanation is outside the scope of this paper. The proposed tool makes use of Particle Swarm Optimization (PSO) and its performance has been evaluated by comparing it with a Genetic Algorithm (GA) based approach. To validate the risk management tool, a case study using real historical price data for the Spanish mainland market is presented to demonstrate the effectiveness of the proposed methodology.