936 results for Probabilistic Optimal Power Flow


Relevance:

30.00%

Publisher:

Abstract:

This paper presents an effective decision-making system for leak detection based on multiple generalized linear models and clustering techniques. The training data for the proposed decision system were obtained from an experimental, fully operational pipeline distribution system equipped with data logging for three variables: inlet pressure, outlet pressure, and outlet flow. The experimental setup is designed so that multiple operational conditions of the distribution system, including multiple pressures and flows, can be obtained. We then show statistically that the pressure and flow variables can serve as a leak signature under the designed multi-operational conditions. Detection of leakages based on training and testing the proposed multi-model decision system with prior data clustering, under multi-operational conditions, produces better recognition rates than training based on a single-model approach. The decision system is then equipped with the estimation of confidence limits, and a method is proposed for using these confidence limits to obtain more robust leakage recognition results.
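As a sketch of the decision-system idea described above (cluster the operating conditions first, then fit one linear model per cluster and flag leaks when residuals exceed confidence limits), the following Python fragment uses entirely synthetic data; the two-regime pressure-drop law, noise level, and 3-sigma limit are illustrative assumptions, not the paper's data or exact method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the logged variables under two operating regimes:
# flow and the corresponding inlet/outlet pressure drop (all values invented).
flow_lo = rng.uniform(1.0, 2.0, 200)
flow_hi = rng.uniform(4.0, 5.0, 200)
flow = np.concatenate([flow_lo, flow_hi])
# Pressure drop grows with flow; the two regimes have different laws.
dp = np.where(flow < 3.0, 0.5 * flow, 0.2 + 0.45 * flow) + rng.normal(0, 0.02, 400)

# Step 1: cluster the operating conditions (a 1-D two-means split here).
centers = np.array([flow.min(), flow.max()])
for _ in range(20):
    labels = np.argmin(np.abs(flow[:, None] - centers[None, :]), axis=1)
    centers = np.array([flow[labels == k].mean() for k in (0, 1)])

# Step 2: fit one linear model (dp ~ a + b*flow) per cluster and keep the
# residual standard deviation of each as its confidence limit.
models, sigmas = [], []
for k in (0, 1):
    X = np.column_stack([np.ones(np.sum(labels == k)), flow[labels == k]])
    coef, *_ = np.linalg.lstsq(X, dp[labels == k], rcond=None)
    models.append(coef)
    sigmas.append(np.std(dp[labels == k] - X @ coef))

def is_leak(f, d, z=3.0):
    """Flag a leak when an observation falls outside the z-sigma band
    of the linear model belonging to its operating cluster."""
    k = int(np.argmin(np.abs(f - centers)))
    pred = models[k][0] + models[k][1] * f
    return abs(d - pred) > z * sigmas[k]

normal_flagged = is_leak(1.5, 0.5 * 1.5)        # on-model observation
leak_flagged = is_leak(1.5, 0.5 * 1.5 - 0.5)    # large pressure anomaly
```

The per-cluster residual spread is what makes the confidence limits regime-aware: a deviation that is normal at high flow can still be anomalous at low flow.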

Relevance:

30.00%

Publisher:

Abstract:

Implementation of a Monte Carlo (MC) simulation for the solution of population balance equations (PBEs) requires choosing the initial sample number (N0), the number of replicates (M), and the number of bins for probability distribution reconstruction (n). The squared Hellinger distance, H², is found to be a useful measure of the accuracy of the MC simulation and can be related directly to N0, M, and n. Asymptotic approximations of H² are deduced and tested for both one-dimensional (1-D) and 2-D PBEs with coalescence. The central processing unit (CPU) cost, C, follows a power law, C = a·M·N0^b, where the CPU cost index b indicates the weighting of N0 in the total CPU cost. The bin number n must be chosen to balance accuracy and resolution. For fixed n, the product M × N0 determines the accuracy of the MC prediction; if b > 1, the optimal solution strategy uses many replicates with a small sample size, whereas if 0 < b < 1, a single replicate with a large initial sample size is preferred. © 2015 American Institute of Chemical Engineers AIChE J, 61: 2394–2402, 2015
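The squared Hellinger distance used above as an accuracy measure is straightforward to compute for binned distributions. A minimal Python sketch, with a uniform target distribution and illustrative sample sizes standing in for N0:

```python
import numpy as np

def hellinger2(p, q):
    """Squared Hellinger distance between two discrete distributions
    (histograms normalised to sum to 1): H^2 = 1 - sum_i sqrt(p_i * q_i)."""
    p = np.asarray(p, float) / np.sum(p)
    q = np.asarray(q, float) / np.sum(q)
    return 1.0 - np.sum(np.sqrt(p * q))

# Accuracy of an MC-reconstructed histogram vs. the true distribution,
# for two sample sizes (the role of N0 in the text above).
rng = np.random.default_rng(1)
edges = np.linspace(0.0, 1.0, 21)   # n = 20 bins
true_pdf = np.diff(edges)           # uniform target on [0, 1], mass per bin

h2 = {}
for n0 in (100, 100_000):
    sample = rng.random(n0)
    hist, _ = np.histogram(sample, bins=edges)
    h2[n0] = hellinger2(hist + 1e-12, true_pdf)
```

With the larger sample the empirical histogram hugs the target and H² shrinks, mirroring the accuracy/cost trade-off the power law quantifies.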

Relevance:

30.00%

Publisher:

Abstract:

Agents inhabiting large scale environments are faced with the problem of generating maps by which they can navigate. One solution to this problem is to use probabilistic roadmaps, which rely on selecting and connecting a set of points that describe the interconnectivity of free space. However, the time required to generate these maps can be prohibitive, and agents do not typically know the environment in advance. In this paper we show that the optimal combination of point selection methods used to create the map depends on the environment; no single point selection method dominates. This motivates a novel self-adaptive approach in which an agent combines several point selection methods. The success rate of our approach is comparable to the state of the art, while the generation cost is substantially reduced. Self-adaptation therefore enables more efficient use of the agent's resources. Results are presented both for a set of archetypal scenarios and for large scale virtual environments based in Second Life, representing real locations in London.
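One simple way to realise a self-adaptive combination of point selection methods is a bandit-style scheme that tracks each method's success rate online. The sketch below is a hypothetical ε-greedy stand-in, not the paper's algorithm; the method names and per-method success probabilities are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical per-environment success probabilities of three point-selection
# methods (e.g. uniform, Gaussian-near-obstacle, bridge sampling) -- the
# premise above is that which method wins depends on the environment.
success_prob = {"uniform": 0.2, "gaussian": 0.6, "bridge": 0.4}
methods = list(success_prob)

counts = np.zeros(len(methods))
successes = np.zeros(len(methods))
for t in range(3000):
    if rng.random() < 0.2:                         # explore a random method
        k = int(rng.integers(len(methods)))
    else:                                          # exploit the best rate so far
        rates = successes / np.maximum(counts, 1)
        k = int(np.argmax(rates))
    counts[k] += 1
    # A "useful" sample is one that improved roadmap connectivity; here that
    # outcome is simulated with the method's success probability.
    if rng.random() < success_prob[methods[k]]:
        successes[k] += 1

best = methods[int(np.argmax(successes / np.maximum(counts, 1)))]
```

The agent thus spends most of its sampling budget on whichever method the current environment rewards, without knowing the environment in advance.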

Relevance:

30.00%

Publisher:

Abstract:

2000 Mathematics Subject Classification: 78A50

Relevance:

30.00%

Publisher:

Abstract:

Since wind has an intrinsically complex and stochastic nature, accurate wind power forecasts are necessary for the safe and economic utilization of wind energy. In this paper, we investigate a combination of numerical and probabilistic models: one-day-ahead wind power forecasts are made with Gaussian Processes (GPs) applied to the outputs of a Numerical Weather Prediction (NWP) model. First, the wind speed data from the NWP model are corrected by a GP. Then, because a wind turbine's control strategy imposes a defined limit on the power generated, a Censored GP is used to model the relationship between the corrected wind speed and the power output. To validate the proposed approach, two real-world datasets were used for model construction and testing. The simulation results were compared with the persistence method and Artificial Neural Networks (ANNs); the proposed model achieves about an 11% improvement in forecasting accuracy (Mean Absolute Error) over the ANN model on one dataset, and nearly 5% on the other.
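A full Censored GP is beyond a short snippet, but the censoring idea (the turbine's control strategy clips power at the rated limit, so the model should respect that limit) can be sketched with an ordinary polynomial fit in Python. All numbers here (rated power, the cubic power curve, the noise level) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical turbine: power grows roughly cubically with wind speed and is
# censored (clipped) at rated power by the control strategy.
rated = 2.0                                  # assumed rated power, MW
speed = rng.uniform(3.0, 15.0, 300)
power = np.minimum(rated, 0.003 * speed**3) + rng.normal(0, 0.03, 300)
power = np.clip(power, 0.0, rated)

# Stand-in for the Censored GP: fit a cubic to the clearly uncensored
# observations only, then apply the known censoring limit to predictions.
mask = power < 0.9 * rated
coef = np.polyfit(speed[mask], power[mask], 3)

def predict(v):
    return float(np.clip(np.polyval(coef, v), 0.0, rated))

p_low = predict(6.0)     # inside the uncensored region
p_high = predict(14.0)   # prediction censored at rated power
```

Ignoring the censored points during fitting avoids biasing the curve downward, which is the same concern that motivates a censored likelihood in the GP setting.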

Relevance:

30.00%

Publisher:

Abstract:

Incorporating the Material Balance Principle (MBP) into industrial and agricultural performance measurement systems with pollutant factors has been on the rise in recent years. Many conventional methods of performance measurement have proven incompatible with material flow conditions. This study addresses eco-efficiency measurement adjusted for pollution, taking into account material flow conditions and the MBP requirements, in order to provide 'real' measures of performance that can serve as guides when making policies. We develop a new approach that integrates a slacks-based measure into the Malmquist-Luenberger Index through a material balance condition reflecting the conservation of matter. This model is compared with a similar model that incorporates the MBP using the trade-off approach, to measure productivity and eco-efficiency trends of power plants. Results reveal similar findings for both models, substantiating the robustness and applicability of the proposed model.

Relevance:

30.00%

Publisher:

Abstract:

We present a comprehensive study of the power output characteristics of random distributed feedback Raman fiber lasers. The calculated optimal slope efficiency of backward-wave generation in the one-arm configuration is shown to be as high as ∼90% at a 1 W threshold. Nevertheless, in real applications the presence of even a small reflection at the fiber ends can appreciably degrade the power performance. The developed numerical model describes the experimental data well. © 2012 Optical Society of America.

Relevance:

30.00%

Publisher:

Abstract:

More than 165 induction times of butyl paraben-ethanol solutions in a batch moving-fluid oscillatory baffled crystallizer with various amplitudes (1-9 mm) and frequencies (1.0-9.0 Hz) have been determined to study the effect of continuous oscillatory baffled reactor (COBR) operating conditions on nucleation. The induction time decreases with increasing amplitude and frequency at power densities below about 500 W/m³; however, a further increase of the frequency and amplitude leads to an increase of the induction time. The interfacial energies and pre-exponential factors for both homogeneous and heterogeneous nucleation are determined from classical nucleation theory at an oscillatory frequency of 2.0 Hz and amplitudes of 3 or 5 mm, both with and without net flow. To capture the shear rate conditions in oscillatory flow crystallizers, a large eddy simulation approach within a computational fluid dynamics framework is applied. Under ideal conditions, the shear rate distribution shows spatial and temporal periodicity and radial symmetry. The spatial distributions of the shear rate indicate an increase of the average and maximum shear rates with increasing amplitude and frequency. In continuous operation, net flow enhances the shear rate at most time points, promoting nucleation. The mechanism by which the shear rate influences nucleation is discussed.
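The interfacial-energy determination mentioned above follows the classical nucleation theory relation ln(t_ind) = ln(A) + B/ln²(S), whose slope B contains the interfacial energy γ via B = 16πγ³v²/(3(kT)³). A Python sketch with invented data; the temperature, molecular volume, and induction times are illustrative, not the paper's measurements.

```python
import numpy as np

k = 1.380649e-23      # Boltzmann constant, J/K
T = 298.15            # assumed temperature, K
v = 3.0e-28           # assumed molecular volume, m^3 (hypothetical value)

# Hypothetical induction-time measurements at several supersaturations S,
# generated from the CNT relation itself so the fit can be checked exactly.
S = np.array([1.5, 1.8, 2.2, 2.8, 3.5])
B_true, lnA_true = 4.0, 2.0
t_ind = np.exp(lnA_true + B_true / np.log(S) ** 2)

# Linear fit of ln(t_ind) against 1/ln(S)^2 recovers the slope B and ln(A).
x = 1.0 / np.log(S) ** 2
B_fit, lnA_fit = np.polyfit(x, np.log(t_ind), 1)

# Interfacial energy from B = 16*pi*gamma^3*v^2 / (3*(k*T)^3).
gamma = (3.0 * B_fit * (k * T) ** 3 / (16.0 * np.pi * v ** 2)) ** (1.0 / 3.0)
```

With real induction-time data the same fit is done per operating condition, which is how the amplitude/frequency dependence of the interfacial energy is extracted.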

Relevance:

30.00%

Publisher:

Abstract:

For micro gas turbines (MGTs) of around 1 kW or less, a commercially suitable recuperator must be used to produce a thermal efficiency suitable for UK Domestic Combined Heat and Power (DCHP). This paper uses computational fluid dynamics (CFD) to investigate a recuperator design based on a helically coiled pipe-in-pipe heat exchanger that utilises industry-standard stock materials and manufacturing techniques. A suitable mesh strategy was established by geometrically modelling separate boundary-layer volumes to satisfy y+ near-wall conditions; a higher mesh density was then used to resolve the core flow. A coiled pipe-in-pipe recuperator solution for a 1 kW MGT DCHP unit was established within a volume envelope suitable for a domestic wall-hung boiler. The low MGT pressure ratio (necessitated by the turbocharger's oil-cooled journal bearing platform) meant the unit size was larger than anticipated; raising the MGT pressure ratio from 2.15 to 2.5 could significantly reduce recuperator volume. Dimensional reasoning confirmed the existence of optimum pipe-diameter combinations for minimum pressure drop. Maximum heat exchanger effectiveness was achieved using an optimum (minimum pressure drop) pipe combination with a large pipe length, as opposed to a high pressure drop pipe combination with a shorter pipe length. © 2011 Elsevier Ltd. All rights reserved.
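Heat-exchanger effectiveness for a counterflow arrangement, which a pipe-in-pipe recuperator approximates, follows the standard ε-NTU relations; longer pipe means larger UA and higher effectiveness, matching the trend reported above. A small Python sketch with hypothetical UA and capacity-rate values:

```python
import numpy as np

def counterflow_effectiveness(UA, C_hot, C_cold):
    """Standard effectiveness-NTU relation for a counterflow heat exchanger:
    eps = (1 - exp(-NTU*(1-Cr))) / (1 - Cr*exp(-NTU*(1-Cr))), with the
    balanced limit eps = NTU/(1+NTU) when Cr = 1."""
    C_min, C_max = min(C_hot, C_cold), max(C_hot, C_cold)
    Cr = C_min / C_max
    NTU = UA / C_min
    if abs(Cr - 1.0) < 1e-12:
        return NTU / (1.0 + NTU)
    e = np.exp(-NTU * (1.0 - Cr))
    return (1.0 - e) / (1.0 - Cr * e)

# Hypothetical numbers for a ~1 kW-class recuperator (illustrative only):
# UA in W/K, capacity rates m_dot*cp in W/K.
eff = counterflow_effectiveness(UA=25.0, C_hot=10.0, C_cold=12.0)
# Doubling the pipe length roughly doubles UA and raises effectiveness,
# at the cost of pressure drop -- the trade-off the paper optimises.
eff_long = counterflow_effectiveness(UA=50.0, C_hot=10.0, C_cold=12.0)
```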

Relevance:

30.00%

Publisher:

Abstract:

The increase in renewable energy generators introduced into the electricity grid is putting pressure on its stability and management, as renewable energy sources cannot be predicted accurately or fully controlled. Together with the additional pressure of fluctuations in demand, this presents a problem more complex than the one the current methods of controlling electricity distribution were designed for. A global, approximate, and distributed optimisation method for power allocation that accommodates uncertainties and volatility is suggested and analysed. It is based on a probabilistic method known as message passing [1], which has deep links to statistical physics methodology. This principled optimisation method relies on local calculations and inherently accommodates uncertainties; it is of modest computational complexity and provides good approximate solutions. We consider uncertainty and fluctuations drawn from a Gaussian distribution and incorporate them into the message-passing algorithm. We examine the effect that increasing uncertainty has on the transmission cost, and how the placement of volatile nodes within a grid, such as renewable generators or consumers, affects it.
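The local, distributed flavour of min-sum message passing can be illustrated on a toy chain of generators: each step combines a local cost with a message summarising everything upstream, and the global optimum emerges from purely local updates. This Python sketch uses discrete outputs and invented quadratic costs, and omits the Gaussian uncertainty handled by the algorithm described above.

```python
from itertools import product

# Chain of generators: choose integer outputs x_i in 0..cap_i so that
# sum(x_i) == demand, minimising the sum of local quadratic costs.
caps = [3, 3, 3]
costs = [lambda x: 1.0 * x**2, lambda x: 2.0 * x**2, lambda x: 0.5 * x**2]
demand = 5

# Forward min-sum messages: msg[t] = best cost of producing running total t
# using the generators processed so far (a Viterbi-style local update).
INF = float("inf")
msg = {0: 0.0}
for cap, c in zip(caps, costs):
    new = {}
    for total, best in msg.items():
        for x in range(cap + 1):
            t = total + x
            val = best + c(x)
            if val < new.get(t, INF):
                new[t] = val
    msg = new

best_cost = msg[demand]

# Exhaustive check of the same optimisation.
brute = min(
    sum(c(x) for c, x in zip(costs, xs))
    for xs in product(*(range(cap + 1) for cap in caps))
    if sum(xs) == demand
)
```

Each message depends only on its neighbour's message and a local cost, which is what keeps the computational complexity modest as the network grows.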

Relevance:

30.00%

Publisher:

Abstract:

In this study, the authors investigate the outage-optimal relay strategy under outdated channel state information (CSI) in a decode-and-forward cooperative communication system. They first confirm mathematically that minimising the outage probability under outdated CSI is equivalent to minimising the conditional outage probability on the outdated CSI of all the decodable relays' links. They then propose a multiple-relay strategy with optimised transmit power allocation (MRS-OTPA) that minimises the conditional outage probability. This MRS is shown to be a generalised relay approach that achieves outage optimality under outdated CSI. To reduce complexity, they also propose an MRS with equal transmit power allocation (MRS-ETPA) that achieves near-optimal outage performance. It is proved that full spatial diversity, achievable under ideal CSI, can still be achieved under outdated CSI through MRS-OTPA and MRS-ETPA. Finally, the outage performance and diversity order of MRS-OTPA and MRS-ETPA are evaluated by simulation.
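The diversity benefit of relaying is easy to check by Monte Carlo simulation. The sketch below assumes idealised (perfect-CSI) best-relay selection over independent Rayleigh links, a baseline rather than the outdated-CSI strategies above, with invented SNR and rate values.

```python
import numpy as np

rng = np.random.default_rng(4)

def outage_prob(n_relays, snr_db, rate=1.0, trials=200_000):
    """Monte Carlo outage probability of best-relay selection over
    independent Rayleigh fading links (perfect CSI, no direct link)."""
    snr = 10.0 ** (snr_db / 10.0)
    # |h|^2 is exponentially distributed with unit mean for Rayleigh fading.
    gains = rng.exponential(1.0, size=(trials, n_relays))
    capacity = np.log2(1.0 + snr * gains.max(axis=1))
    return float(np.mean(capacity < rate))

p1 = outage_prob(1, 10.0)   # single relay
p3 = outage_prob(3, 10.0)   # three relays: diversity order ~3
```

With n independent links the outage probability of the single-link case is roughly raised to the n-th power, which is the "full spatial diversity" the strategies above preserve even when the CSI is outdated.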

Relevance:

30.00%

Publisher:

Abstract:

Insulated-gate bipolar transistor (IGBT) power modules find widespread use in numerous power conversion applications, where their reliability is of significant concern. Standard IGBT modules are fabricated for general-purpose applications, while few are designed for bespoke applications. The conventional design of IGBTs can, however, be improved by multiobjective optimization. This paper proposes a novel design method that considers die-attach solder failures induced by short power cycling and baseplate solder fatigue induced by thermal cycling, which are among the major failure mechanisms of IGBTs. Thermal resistance is calculated analytically, and the plastic work is obtained with a high-fidelity finite-element model that has been validated experimentally. The objective of minimizing the plastic work and the constraint functions are formulated with a surrogate model. The nondominated sorting genetic algorithm-II (NSGA-II) is used to search for the Pareto-optimal solutions and the best design. The result is an effective approach to optimizing the physical structure of power electronic modules, taking account of historical environmental and operational conditions in the field.
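The core of NSGA-II referenced above is non-dominated sorting: keeping the designs that no other design beats in every objective. A minimal Python version over hypothetical (plastic work, thermal resistance) pairs; the numbers are purely illustrative.

```python
def non_dominated(points):
    """Return the Pareto-optimal subset, minimising every objective:
    a point is kept unless some other point is <= it in all objectives
    (and differs somewhere). This is the first-front step of NSGA-II."""
    front = []
    for i, p in enumerate(points):
        dominated = any(
            all(q[k] <= p[k] for k in range(len(p))) and q != p
            for j, q in enumerate(points) if j != i
        )
        if not dominated:
            front.append(p)
    return front

# Hypothetical (plastic work, thermal resistance) pairs for candidate
# module layouts -- invented values for illustration only.
designs = [(1.0, 5.0), (2.0, 3.0), (3.0, 1.0), (2.5, 3.5), (4.0, 4.0)]
pareto = non_dominated(designs)
```

Designs like (2.5, 3.5) drop out because (2.0, 3.0) is better in both objectives; the survivors form the trade-off curve from which the best design is then picked.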

Relevance:

30.00%

Publisher:

Abstract:

The classical economic order quantity (EOQ) model captures two major inventory costs: ordering and inventory holding costs. In this paper we investigate the effect of purchasing activity on the cash flow of a firm. In the analysis we use a cash flow identity that closely resembles the balance equations of inventory modeling. In our approach the purchasing and ordering process is analyzed with discounted costs. The cost function of the model consists of a linear cash holding cost, a linear opportunity cost of spending cash, and a linear interest cost. We present the optimal solution of the proposed model and illustrate it with numerical examples.
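For contrast with the discounted cash-flow model above, the classical EOQ baseline can be written in a few lines; folding an interest charge on the cash tied up in stock into the holding cost is only a crude stand-in for the paper's richer treatment. All parameter values are illustrative.

```python
import math

def eoq(demand_rate, order_cost, holding_cost, interest_rate=0.0, unit_cost=0.0):
    """Classical economic order quantity Q* = sqrt(2*K*D / h). Here the
    financing (interest) cost on cash tied up in inventory is folded into
    the holding charge h -- a simplification of the cash-flow model above."""
    h = holding_cost + interest_rate * unit_cost
    return math.sqrt(2.0 * demand_rate * order_cost / h)

# Illustrative parameters: D = 1000 units/yr, K = 50 per order, h = 2/unit/yr.
q_no_interest = eoq(1000, 50.0, 2.0)
# Adding a 10% interest charge on a 20/unit item raises the effective
# holding cost, so the optimal order quantity shrinks.
q_with_interest = eoq(1000, 50.0, 2.0, interest_rate=0.1, unit_cost=20.0)
```

Smaller orders when money is expensive is exactly the qualitative effect of bringing cash flow into the lot-sizing decision.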

Relevance:

30.00%

Publisher:

Abstract:

This dissertation develops a new figure of merit to measure the similarity (or dissimilarity) of Gaussian distributions through a novel concept that relates the Fisher distance to the percentage of data overlap. The derivations are expanded to provide a generalized mathematical platform for determining an optimal separating boundary between Gaussian distributions in multiple dimensions. Real-world data used for implementation and feasibility studies were provided by Beckman-Coulter. Although the data used are flow cytometric in nature, the mathematics is general in its derivation and covers other types of data as long as their statistical behavior approximates Gaussian distributions.

Because this new figure of merit is heavily based on the statistical nature of the data, a new filtering technique is introduced to accommodate the accumulation process involved with histogram data. When data are accumulated into a frequency histogram, they are inherently smoothed in a linear fashion, since an averaging effect takes place as the histogram is generated. This new filtering scheme addresses data accumulated in the uneven resolution of the channels of the frequency histogram.

The qualitative interpretation of flow cytometric data is currently a time-consuming and imprecise method for evaluating histogram data. The present method offers a broader spectrum of capabilities in the analysis of histograms, since the figure of merit derived in this dissertation integrates within its mathematics both a measure of similarity and the percentage of overlap between the distributions under analysis.
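The percentage of overlap between two Gaussian distributions, one ingredient of the figure of merit described above, can be computed numerically as the integral of the pointwise minimum of the two densities. A 1-D Python sketch (the dissertation's multidimensional derivation is more general):

```python
import numpy as np

def gaussian_overlap(mu1, s1, mu2, s2, n=20001):
    """Probability mass shared by two 1-D Gaussians, i.e. the integral of
    min(p(x), q(x)) over a grid covering both distributions."""
    lo = min(mu1 - 8 * s1, mu2 - 8 * s2)
    hi = max(mu1 + 8 * s1, mu2 + 8 * s2)
    x = np.linspace(lo, hi, n)
    p = np.exp(-0.5 * ((x - mu1) / s1) ** 2) / (s1 * np.sqrt(2 * np.pi))
    q = np.exp(-0.5 * ((x - mu2) / s2) ** 2) / (s2 * np.sqrt(2 * np.pi))
    dx = x[1] - x[0]
    return float(np.sum(np.minimum(p, q)) * dx)

same = gaussian_overlap(0.0, 1.0, 0.0, 1.0)   # identical -> overlap ~1
far = gaussian_overlap(0.0, 1.0, 6.0, 1.0)    # well separated -> ~0
mid = gaussian_overlap(0.0, 1.0, 2.0, 1.0)    # equal variances: 2*Phi(-1)
```

For equal variances the overlap has the closed form 2Φ(-|μ1-μ2|/2σ), which makes this quantity a natural partner for a distance measure: as the distance grows, the shared mass shrinks monotonically.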

Relevance:

30.00%

Publisher:

Abstract:

The introduction of phase change material fluids and nanofluids in micro-channel heat sink design can significantly increase the cooling capacity of the heat sink because of the unique features of these two kinds of fluids. To better assist the design of a high-performance micro-channel heat sink using phase change fluids and nanofluids, the heat transfer enhancement mechanism behind flows with such fluids must be completely understood.

A detailed parametric study is conducted to further investigate the heat transfer enhancement of the phase change material particle suspension flow, using the two-phase non-thermal-equilibrium model developed by Hao and Tao (2004). The parametric study covers normal conditions with Reynolds numbers of Re = 90–600 and phase change material particle concentrations of ϵp ≤ 0.25, as well as extreme conditions of very low Reynolds numbers (Re < 50) and high particle concentrations (ϵp = 50%–70%) in slurry flow. Using two newly defined parameters, the effectiveness factor ϵeff and the performance index PI, it is found that there exists an optimal relation between the channel design parameters L and D, the particle volume fraction ϵp, the Reynolds number Re, and the wall heat flux qw. The influence of the particle volume fraction ϵp, particle size dp, and particle viscosity μp on the phase change material suspension flow is investigated and discussed. The model was validated against available experimental data. The conclusions will assist designers in making decisions that relate to the design or selection of a micro-pump suitable for micro- or mini-scale heat transfer devices.

To understand the heat transfer enhancement mechanism of nanofluid flow at the particle level, the lattice Boltzmann method is used because of its mesoscopic nature and its many numerical advantages. Using a two-component lattice Boltzmann model, the heat transfer enhancement of the nanofluid is analyzed by incorporating the different forces acting on the nanoparticles into the model. It is found that the nanofluid gives better heat transfer enhancement at low Reynolds numbers, and that the Brownian motion effect of the nanoparticles is weakened by increasing flow speed.