55 results for Electricity Network Distribution Wastes


Relevance:

40.00%

Publisher:

Abstract:

Uncertainty accompanies almost all real-world quantities, such as oil prices, stock prices, and the sales and demand of products. As a consequence, forecasting problems are becoming increasingly challenging and ridden with uncertainty. Such uncertainties are generally quantified by statistical tools such as prediction intervals (PIs). PIs quantify the uncertainty related to forecasts by estimating the ranges of the targeted quantities. PIs generated by traditional neural network based approaches are limited by a high computational burden and impractical assumptions about the distribution of the data. This paper proposes a novel technique for constructing high-quality PIs using support vector machines (SVMs). The proposed technique directly estimates the upper and lower bounds of the PI in a short time and without any assumptions about the data distribution. The SVM parameters are tuned using particle swarm optimization by minimizing a modified PI-based objective function. Electricity price and demand data from the Ontario electricity market are used to validate the performance of the proposed technique. Several case studies for different months indicate the superior performance of the proposed method in terms of high-quality PI generation and shorter computational times.
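
As a concrete illustration (not the authors' implementation), the sketch below tunes an SVR with a basic particle swarm optimizer by minimizing a coverage-width style PI objective; the interval construction via learned offsets around the SVR forecast, the objective function, and all parameter values are assumptions.

import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)

def pi_cost(theta, X_tr, y_tr, X_val, y_val, nominal=0.90, eta=50.0):
    C, gamma, d_lo, d_hi = theta
    model = SVR(C=C, gamma=gamma).fit(X_tr, y_tr)
    mid = model.predict(X_val)
    lo, hi = mid - d_lo, mid + d_hi
    picp = np.mean((y_val >= lo) & (y_val <= hi))            # coverage
    width = np.mean(hi - lo) / (y_val.max() - y_val.min())   # normalized width
    penalty = np.exp(eta * (nominal - picp)) if picp < nominal else 1.0
    return width * penalty

def pso(cost, bounds, n_particles=20, iters=30, w=0.7, c1=1.5, c2=1.5):
    lo_b, hi_b = np.array(bounds).T
    pos = rng.uniform(lo_b, hi_b, (n_particles, len(bounds)))
    vel = np.zeros_like(pos)
    pbest, pbest_f = pos.copy(), np.array([cost(p) for p in pos])
    g = pbest[pbest_f.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (g - pos)
        pos = np.clip(pos + vel, lo_b, hi_b)
        f = np.array([cost(p) for p in pos])
        better = f < pbest_f
        pbest[better], pbest_f[better] = pos[better], f[better]
        g = pbest[pbest_f.argmin()].copy()
    return g

# toy stand-in for demand data: y = f(x) + noise
X = rng.uniform(0, 10, (300, 1))
y = np.sin(X[:, 0]) + 0.1 * X[:, 0] + rng.normal(0, 0.2, 300)
X_tr, y_tr, X_val, y_val = X[:200], y[:200], X[200:], y[200:]

bounds = [(0.1, 100.0), (0.01, 10.0), (0.01, 2.0), (0.01, 2.0)]
best = pso(lambda t: pi_cost(t, X_tr, y_tr, X_val, y_val), bounds)
print("tuned (C, gamma, lower offset, upper offset):", best)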

Relevance:

40.00%

Publisher:

Abstract:

The aim of this research is to examine the efficiency of different aggregation algorithms applied to the forecasts obtained from the individual neural network (NN) models in an ensemble. In this study an ensemble of 100 NN models is constructed with a heterogeneous architecture. The outputs of the NN models are combined by three different aggregation algorithms: a simple average, a trimmed mean, and Bayesian model averaging. These methods are utilized with certain modifications and are applied to the forecasts obtained from all individual NN models. The output of the aggregation algorithms is analyzed and compared with the individual NN models used in the ensemble and with a Naive approach. Thirty-minute interval electricity demand data from the Australian Energy Market Operator (AEMO) and the New York Independent System Operator (NYISO) websites are used in the empirical analysis. It is observed that the aggregation algorithms perform better than many of the individual NN models. In comparison with the Naive approach, the aggregation algorithms exhibit somewhat better forecasting performance.
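
A minimal sketch of the three combination rules named above, applied to a matrix of ensemble forecasts; the inverse-error weighting used as a stand-in for Bayesian model averaging is an assumption, since the paper applies its own modifications.

import numpy as np
from scipy import stats

def simple_average(F):
    return F.mean(axis=0)

def trimmed_mean(F, proportion=0.1):
    # drop the top/bottom 10% of forecasts at each time step
    return stats.trim_mean(F, proportion, axis=0)

def bma_like(F, errors):
    # weight each model by exp(-validation error), normalized; a crude
    # stand-in for posterior model probabilities
    w = np.exp(-errors)
    w = w / w.sum()
    return w @ F

F = np.random.default_rng(1).normal(1000, 50, (100, 48))  # 100 models, 48 half-hours
val_err = np.random.default_rng(2).uniform(0.5, 2.0, 100)
print(simple_average(F)[:3], trimmed_mean(F)[:3], bma_like(F, val_err)[:3])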

Relevance:

40.00%

Publisher:

Abstract:

Over the past decade, demand for grid-connected photovoltaic (GCPV) systems has been increasing, driven by the extensive use of renewable energy technologies for sustainable power generation and distribution. GCPV systems at high penetration levels can enhance the operation of the network by improving voltage levels and reducing active power losses along the length of the feeder. This paper investigates the voltage variations and Total Harmonic Distortion (THD) of a typical GCPV system, modelled in the power system simulator PSS SINCAL, as the level of PV integration in a Low Voltage (LV) distribution network changes. Five case studies are considered to investigate the impact of PV integration on LV nodes and the corresponding voltage variations and harmonics. In addition, the paper explores and benchmarks voltage improvement techniques, implementing an On Load Tap Changer (OLTC) on the main transformer and adding Shunt Capacitors (SCs) at appropriate node points in the LV network.
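
For reference, THD has the standard definition used in studies of this kind: the RMS of the harmonic voltage components relative to the fundamental. A small sketch with made-up magnitudes:

import math

def voltage_thd(fundamental, harmonics):
    """fundamental: V1 magnitude; harmonics: magnitudes of V2, V3, ..."""
    return math.sqrt(sum(v * v for v in harmonics)) / fundamental

v1 = 230.0                 # fundamental (V)
vh = [1.8, 6.9, 0.9, 4.6]  # 2nd..5th harmonic magnitudes (V), illustrative
print(f"THD = {100 * voltage_thd(v1, vh):.2f} %")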

Relevance:

40.00%

Publisher:

Abstract:

This paper presents a nonlinear controller design for a DSTATCOM connected to a distribution network with distributed generation (DG) to regulate the line voltage by providing reactive power compensation. The controller is designed using partial feedback linearization, which transforms the nonlinear system into a reduced-order linear system and an autonomous system whose dynamics are known as the internal dynamics of the system. The paper also investigates the stability of the internal dynamics of the DSTATCOM, as this is a basic requirement for designing partial feedback linearizing controllers. The performance of the proposed controller is evaluated in terms of reactive power compensation to enhance the voltage stability of distribution networks with DG.
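
For context, the standard single-input partial feedback linearization construction the abstract refers to (the paper's specific DSTATCOM model is not reproduced here): for \dot{x} = f(x) + g(x)u with output y = h(x) of relative degree r < n,

    u = \frac{1}{L_g L_f^{\,r-1} h(x)} \left( -L_f^{\,r} h(x) + v \right) \quad \Rightarrow \quad y^{(r)} = v,

which leaves the remaining n - r states as internal dynamics whose stability (the zero dynamics) must be verified, as the abstract notes.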

Relevance:

30.00%

Publisher:

Abstract:

This paper formulates the problem of learning Bayesian network structures from data as determining the structure that best approximates the probability distribution indicated by the data. A new metric, the Penalized Mutual Information metric, is proposed, and an evolutionary algorithm is designed to search for the best structure among alternatives. The experimental results show that this approach is reliable and promising.
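
A generic sketch of a penalized mutual-information score for a candidate structure, in the spirit of the proposed metric (the paper's exact penalty is not reproduced; the log-n penalty weighting here is an assumption). An evolutionary algorithm would search over parent sets to maximize this score.

import numpy as np

def mutual_information(data, child, parents):
    # empirical MI between a child variable and its joint parent configuration
    n = len(data)
    if not parents:
        return 0.0
    joint, px, ppa = {}, {}, {}
    for row in data:
        key = (row[child], tuple(row[p] for p in parents))
        joint[key] = joint.get(key, 0) + 1
    for (x, pa), c in joint.items():
        px[x] = px.get(x, 0) + c
        ppa[pa] = ppa.get(pa, 0) + c
    mi = 0.0
    for (x, pa), c in joint.items():
        p_joint = c / n
        mi += p_joint * np.log(p_joint / ((px[x] / n) * (ppa[pa] / n)))
    return mi

def penalized_mi_score(data, parents_of, penalty=0.5):
    n = len(data)
    score = 0.0
    for child, parents in parents_of.items():
        r = len(set(row[child] for row in data))
        q = int(np.prod([len(set(row[p] for row in data)) for p in parents])) if parents else 1
        n_params = (r - 1) * q   # free parameters of the local CPT
        score += n * mutual_information(data, child, parents) - penalty * np.log(n) * n_params
    return score  # higher is better; the evolutionary search maximizes this

data = np.random.default_rng(3).integers(0, 2, (500, 3))
print(penalized_mi_score(data, {0: (), 1: (0,), 2: (0, 1)}))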

Relevance:

30.00%

Publisher:

Abstract:

This paper investigates the application of neural networks to the recognition of lubrication defects typical of an industrial cold forging process employed by fastener manufacturers. The accurate recognition of lubrication errors, such as the coating not being applied properly or being damaged during material handling, is very important to the quality of the final product in fastener manufacture. Lubrication errors lead to increased forging loads and premature tool failure, as well as to increased defect sorting and re-processing of the coated rod. The lubrication coating provides a barrier between the work material and the die during the drawing operation; moreover, it needs to be sufficiently robust to remain on the wire during the transfer to the cold forging operation, in which the wire undergoes multi-stage deformation without the application of any additional lubricant. Four types of lubrication error, typical of fastener production, were introduced to a set of sample rods, which were subsequently drawn under laboratory conditions. The drawing force was measured, and a limited set of features was extracted from it. A neural network based model learned from these features is able to recognize all types of lubrication error with high accuracy: the overall accuracy of the model is around 98%, with an almost uniform distribution of errors across the four error types and the normal condition.
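
An illustrative sketch of the pipeline described, under assumed feature choices and synthetic force traces: each trace is reduced to a small feature vector and a neural network classifier is trained over five classes (four error types plus the normal condition).

import numpy as np
from sklearn.neural_network import MLPClassifier

def force_features(trace):
    # simple summary statistics of a drawing-force signal (assumed features)
    return [trace.mean(), trace.std(), trace.max(), np.ptp(trace)]

rng = np.random.default_rng(4)
X, y = [], []
for label in range(5):                  # 0 = normal, 1..4 = lubrication error types
    for _ in range(40):
        trace = rng.normal(100 + 8 * label, 2 + label, 500)  # synthetic force trace
        X.append(force_features(trace))
        y.append(label)

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))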

Relevance:

30.00%

Publisher:

Abstract:

Although the development of geographic information system (GIS) technology and digital data manipulation techniques has enabled practitioners in the geographical and geophysical sciences to make more efficient use of resource information, many of the methods used in forming spatial prediction models are still inherently based on traditional techniques of map stacking in which layers of data are combined under the guidance of a theoretical domain model. This paper describes a data-driven approach by which Artificial Neural Networks (ANNs) can be trained to represent a function characterising the probability that an instance of a discrete event, such as the presence of a mineral deposit or the sighting of an endangered animal species, will occur over some grid element of the spatial area under consideration. A case study describes the application of the technique to the task of mineral prospectivity mapping in the Castlemaine region of Victoria using a range of geological, geophysical and geochemical input variables. Comparison of the maps produced using neural networks with maps produced using a density estimation-based technique demonstrates that the maps can reliably be interpreted as representing probabilities. However, while the neural network model and the density estimation-based model yield similar results under an appropriate choice of values for the respective parameters, the neural network approach has several advantages, especially in high dimensional input spaces.
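
A minimal sketch of the mapping task, with hypothetical input layers and synthetic labels: a network is trained on grid cells with known outcomes and then scores every cell with an event probability.

import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(5)
# hypothetical per-cell inputs: geology class score, magnetic anomaly, As ppm
cells = rng.normal(size=(2000, 3))
deposit = (cells @ [1.2, 0.8, 1.5] + rng.normal(0, 1, 2000) > 2.5).astype(int)

net = MLPClassifier(hidden_layer_sizes=(8,), max_iter=1000, random_state=0)
net.fit(cells[:1500], deposit[:1500])

# probability surface for the held-out cells, read as P(event | inputs)
prob_map = net.predict_proba(cells[1500:])[:, 1]
print(prob_map[:5])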

Relevance:

30.00%

Publisher:

Abstract:

The peer-to-peer content distribution network (PCDN) has recently attracted considerable attention, and it has huge potential for massive data-intensive applications on the Internet. One of the challenges in PCDNs is routing for data sources and data delivery. In this paper, we study a type of network model formed by dynamic autonomy areas, structured source servers and proxy servers. Based on this network model, we propose a number of algorithms to address the routing and data delivery issues. To cope with the high dynamics of the autonomy areas, we establish dynamic tree-structure proliferation system routing, proxy routing and resource searching algorithms. Simulation results show that the performance of the proposed network model and algorithms is stable.

Relevance:

30.00%

Publisher:

Abstract:

DDoS is a spy-versus-spy game between attackers and detectors. Attackers mimic legitimate network traffic patterns to disable detection algorithms based on those features, and discriminating mimicking DDoS attacks from massive legitimate network access remains an open problem. We observed that zombies use controlled functions to pump attack packets towards the victim; therefore, the attack flows to the victim always share some properties, e.g. packet distribution behaviours, which are not possessed by legitimate flows in a short time period. Based on this observation, once suspicious flows towards a server appear, we calculate the distance between the packet distribution behaviours of the suspicious flows. If the distance is less than a given threshold, the traffic is a DDoS attack; otherwise, it is legitimate access. Our analysis and preliminary experiments indicate that the proposed method can discriminate mimicking flooding attacks from legitimate access efficiently and effectively.
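
A toy sketch of the detection idea: each flow is summarized by its packet-size histogram, and an alarm is raised when suspicious flows sit unusually close together. The total-variation distance and the threshold value are assumptions for illustration.

import numpy as np
from itertools import combinations

def size_histogram(packet_sizes, bins):
    h, _ = np.histogram(packet_sizes, bins=bins)
    return h / h.sum()

def mean_pairwise_distance(flows, bins=np.arange(0, 1600, 100)):
    hists = [size_histogram(f, bins) for f in flows]
    d = [0.5 * np.abs(a - b).sum() for a, b in combinations(hists, 2)]  # total variation
    return float(np.mean(d))

rng = np.random.default_rng(6)
zombie_flows = [rng.normal(600, 20, 400) for _ in range(10)]   # near-identical generators
normal_flows = [rng.normal(rng.uniform(200, 1200), rng.uniform(50, 300), 400) for _ in range(10)]

THRESHOLD = 0.2   # would be tuned on observed traffic in practice
for name, flows in [("suspected attack", zombie_flows), ("legitimate", normal_flows)]:
    d = mean_pairwise_distance(flows)
    print(name, round(d, 3), "-> DDoS" if d < THRESHOLD else "-> legitimate")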

Relevance:

30.00%

Publisher:

Abstract:

The presence of nitric oxide synthase (NOS) and the role of nitric oxide (NO) in vascular regulation were investigated in the Australian lungfish, Neoceratodus forsteri. No evidence was found for NOS in the endothelium of large and small blood vessels following processing for NADPH-diaphorase histochemistry. However, both NADPH-diaphorase histochemistry and neural NOS immunohistochemistry demonstrated a sparse network of nitrergic nerves in the dorsal aorta, hepatic artery, and branchial arteries, but there were no nitrergic nerves in the small blood vessels of tissues. In contrast, nitrergic nerves were found in non-vascular tissues of the lung, gut, and kidney. Dual-wire myography was used to determine whether NO signalling occurred in the branchial artery of N. forsteri. Both SNP and SIN-1 had no effect on the pre-constricted branchial artery, but the particulate guanylyl cyclase (GC) activator, C-type natriuretic peptide, always caused vasodilation. Nicotine mediated a dilation that was not inhibited by the soluble GC inhibitor, ODQ, or the NOS inhibitor, L-NNA, but was blocked by the cyclooxygenase inhibitor, indomethacin. These data suggest that NO control of the branchial artery is lacking, but that prostaglandins could be endothelial relaxing factors in the vasculature of lungfish.

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we test for asymmetric behaviour of industrial and residential electricity demand for the G7 countries, using the entropy-based test for symmetry suggested by [Racine, J., and Maasoumi, E., 2007. A versatile and robust metric entropy test of time-reversibility, and other hypotheses. Journal of Econometrics 138(2), 547–567; Racine, J., and Maasoumi, E., 2008. A robust entropy-based test of asymmetry for discrete and continuous processes. Econometric Reviews 28, 246–261], the Triples test of [Randles, R., Flinger, M., Policello, G., and Wolfe, D., 1980. An asymptotically distribution-free test for symmetry versus asymmetry. Journal of the American Statistical Association 75, 168–172] and the [Bai, J., and Ng, S., 2001. A consistent test for conditional symmetry in time series models. Journal of Econometrics 103, 225–258] test for conditional symmetry. Using data spanning more than three decades, we find overwhelming evidence of conditional symmetry in residential and industrial electricity consumption. This finding implies that the use of econometric tests based on linear data generating processes is credible.
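
For illustration, a sketch of the skewness statistic underlying the Triples test: the mean over all triples of a sign kernel whose expectation is zero under symmetry. The full test standardizes this U-statistic by an estimated variance, which is omitted here.

import numpy as np
from itertools import combinations

def triples_statistic(x):
    # kernel of Randles et al. (1980): counts "right" versus "left" triples
    sgn = np.sign
    vals = [(sgn(a + b - 2 * c) + sgn(a + c - 2 * b) + sgn(b + c - 2 * a)) / 3.0
            for a, b, c in combinations(x, 3)]
    return float(np.mean(vals))

rng = np.random.default_rng(7)
print("symmetric sample:", round(triples_statistic(rng.normal(0, 1, 60)), 3))
print("skewed sample:   ", round(triples_statistic(rng.exponential(1, 60)), 3))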

Relevance:

30.00%

Publisher:

Abstract:

This paper presents the Multi-level Virtual Ring (MVR), a new name-routing scheme for sensor networks. MVR uses a selection algorithm to identify sensor nodes' virtual levels and uses a Distributed Hash Table (DHT) to map them to the MVR. Address routing performs well in wired networks, but not in sensor networks: when nodes move, their addresses must change, and address routing also needs servers to allocate addresses to nodes. Name routing, such as Virtual Ring Routing (VRR), has been introduced to solve this problem. MVR is a new name-routing scheme that improves routing performance significantly by introducing the multi-level virtual ring and cross-level routing. Experiments show that this embedded name routing is workable and achieves better routing performance.
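
A toy sketch of the single-ring greedy forwarding idea that VRR-style name routing builds on (multi-level rings, cross-level routing and the DHT mapping are omitted; the topology below is hypothetical).

ID_SPACE = 2 ** 16

def ring_distance(a, b):
    d = abs(a - b)
    return min(d, ID_SPACE - d)

def route(src, dst, neighbours_of, max_hops=32):
    path, node = [src], src
    for _ in range(max_hops):
        if node == dst:
            return path
        # forward to the neighbour whose identifier is closest to the destination
        nxt = min(neighbours_of[node], key=lambda n: ring_distance(n, dst))
        if ring_distance(nxt, dst) >= ring_distance(node, dst):
            break                      # no progress: routing fails in this toy
        node = nxt
        path.append(node)
    return path

# tiny hypothetical topology: each node knows a few ring neighbours
neighbours_of = {
    10: [500, 20000], 500: [10, 9000], 9000: [500, 20000, 41000],
    20000: [10, 9000, 41000], 41000: [9000, 20000],
}
print(route(10, 41000, neighbours_of))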

Relevance:

30.00%

Publisher:

Abstract:

The continuous growth of the user pools of social networking websites such as Facebook and MySpace, and their incessant augmentation of services and capabilities, will in the future meet and compare with today's Content Distribution Networks (CDNs) and peer-to-peer file sharing applications such as Kazaa and BitTorrent. But how can these two mainstream classes of application, which already face their own security problems, cope with the combined issues: trust for social networks, and content and index poisoning in CDNs? We address the problems of social trust and file sharing with an overlay trust model based on social activity and transactions; this can enable users to increase the reliability of their online social life and also enhance content distribution, creating a better file-sharing experience. The aim of this research is to lower the risk of malicious activity on a given social network by applying a correlated trust model to guarantee the validity of a user's identity, privacy and trustworthiness in sharing content.

Relevance:

30.00%

Publisher:

Abstract:

Any attempt to model an economy requires foundational assumptions about the relations between prices, values and the distribution of wealth. These assumptions exert a profound influence over the results of any model. Unfortunately, there are few areas in economics as vexed as the theory of value. I argue in this paper that the fundamental problem with past theories of value is that it is simply not possible to model the determination of value, the formation of prices and the distribution of income in a real economy with analytic mathematical models. All such attempts leave out crucial processes or make unrealistic assumptions which significantly affect the results.

There have been two primary approaches to the theory of value. The first, associated with classical economists such as Ricardo and Marx, comprised substance theories of value, which view value as a substance inherent in an object that is conserved in exchange. For Marxists, the value of a commodity derives solely from the value of the labour power used to produce it, and therefore any profit is due to the exploitation of the workers. The labour theory of value has been discredited because of its assumption that labour was the only 'factor' that contributed to the creation of value, and because of its fundamentally circular argument.

Neoclassical theorists argued that price was identical with value and was determined purely by the interaction of supply and demand. Value, then, was completely subjective. Returns to labour (wages) and capital (profits) were determined solely by their marginal contribution to production, so that each factor received its just reward by definition. Problems with the neoclassical approach include its assumptions concerning representative agents, perfect competition, perfect and costless information and contract enforcement, complete markets for credit and risk, aggregate production functions and infinite, smooth substitution between factors, distribution according to marginal products, firms always operating on the production possibility frontier, firms' pricing decisions, the neglect of money and credit, and perfectly rational agents with infinite computational capacity. Two areas are critical. First, the underappreciated Sonnenschein-Mantel-Debreu results showed that the foundational assumptions of the Walrasian general-equilibrium model imply arbitrary excess demand functions and therefore arbitrary equilibrium price sets. Second, in real economies there is no equilibrium, only continuous change. Equilibrium is never reached because of constant changes in preferences and tastes; technological and organisational innovations; discoveries of new resources and new markets; inaccurate and evolving expectations of businesses, consumers, governments and speculators; changing demand for credit; the entry and exit of firms; the birth, learning, and death of citizens; changes in laws and government policies; imperfect information; generalized increasing returns to scale; random acts of impulse; weather and climate events; changes in disease patterns, and so on.

The problem is not the use of mathematical modelling, but the kind of mathematical modelling used. Agent-based models (ABMs), object-oriented programming and greatly increased computer power, however, are opening up a new frontier. Here a dynamic bargaining ABM is outlined as a basis for an alternative theory of value. A large but finite number of heterogeneous commodities and agents with differing degrees of market power are set in a spatial network. Returns to buyers and sellers are decided at each step in the value chain, and in each factor market, through the process of bargaining. Market power and its potential abuse against the poor and vulnerable are fundamental to how the bargaining dynamics play out. Ethics therefore lie at the very heart of economic analysis, the determination of prices and the distribution of wealth. The neoclassicals are right, then, that price is the enumeration of value at a particular time and place, but wrong to downplay the critical roles of bargaining, power and ethics in determining those same prices.
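
A minimal sketch of one bargaining step such an ABM might use, with the surplus split in proportion to relative market power; the split rule and all parameters are assumptions, not the paper's specification.

from dataclasses import dataclass
import random

@dataclass
class Agent:
    name: str
    power: float        # bargaining power, > 0

def bargain(seller: Agent, buyer: Agent, cost: float, valuation: float):
    """Return the agreed price, or None when no surplus exists."""
    if valuation <= cost:
        return None
    share = seller.power / (seller.power + buyer.power)
    return cost + share * (valuation - cost)

random.seed(8)
firm = Agent("firm", power=3.0)
worker = Agent("worker", power=0.5)      # weak bargaining position
for _ in range(3):
    valuation = random.uniform(15, 25)   # value of a day's labour to the firm
    p = bargain(seller=worker, buyer=firm, cost=10.0, valuation=valuation)
    print(f"wage agreed: {p:.2f} of a possible {valuation:.2f}")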

Relevance:

30.00%

Publisher:

Abstract:

Effective disinfection planning and management in large, complex water distribution systems requires an accurate network water quality model. This model should be based on reaction kinetics, which describes disinfectant loss from bulk water over time, within experimental error. Models in the literature were reviewed for their ability to meet this requirement in real networks. Essential features were identified as accuracy, simplicity, computational efficiency, and ability to describe consistently the effects of initial chlorine dose, temperature variation, and successive rechlorinations. A reaction scheme of two organic constituents reacting with free chlorine was found to be necessary and sufficient to provide the required features. Recent release of the multispecies extension (MSX) to EPANET and MWH Soft's H2OMap Water MSX network software enables users to implement this and other multiple-reactant bulk decay models in real system simulations.
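
A sketch of the two-constituent bulk decay scheme identified as necessary and sufficient: free chlorine reacting in parallel with a fast and a slow organic constituent. Rate constants and initial concentrations are illustrative only; in a rechlorination scenario the chlorine state would simply be topped up and the integration continued.

import numpy as np
from scipy.integrate import solve_ivp

k1, k2 = 2.5, 0.05          # fast / slow rate constants (L/mg/h, assumed)

def rhs(t, y):
    c, x1, x2 = y           # chlorine, fast reactant, slow reactant (mg/L)
    r1, r2 = k1 * c * x1, k2 * c * x2
    return [-(r1 + r2), -r1, -r2]

sol = solve_ivp(rhs, (0, 72), [1.0, 0.15, 1.2], dense_output=True)
for t in (0, 6, 24, 72):
    print(f"t = {t:2d} h, chlorine = {sol.sol(t)[0]:.3f} mg/L")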