892 results for Electricity Network Distribution Wastes


Relevance: 30.00%

Abstract:

Differently from theoretical scale-free networks, most real networks exhibit multi-scale behavior, with nodes structured into different types of functional groups and communities. While the majority of approaches for classifying nodes in a complex network have relied on local measurements of the topology/connectivity around each node, valuable information about node functionality can be obtained by concentric (or hierarchical) measurements. This paper extends previous methodologies based on concentric measurements by studying the possibility of using agglomerative clustering methods to obtain a set of functional groups of nodes in a particular institutional collaboration network, including various known communities (departments of the University of Sao Paulo). Among the interesting findings obtained, we emphasize the scale-free nature of the network, as well as the identification of different patterns of authorship emerging from different areas (e.g. human and exact sciences). Another interesting result concerns the relatively uniform distribution of hubs along concentric levels, in contrast to the non-uniform pattern found in theoretical scale-free networks such as the BA model.
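The two ingredients named above, concentric measurements around each node and agglomerative grouping of the resulting feature vectors, can be sketched in a few lines. This is a minimal pure-Python illustration with a toy graph and naive single-linkage clustering, not the paper's actual pipeline:

```python
from collections import deque

def concentric_signature(adj, node, max_level=2):
    """Ring sizes (number of nodes at each BFS distance) around `node`."""
    dist = {node: 0}
    queue = deque([node])
    while queue:
        u = queue.popleft()
        if dist[u] == max_level:
            continue
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return [sum(1 for d in dist.values() if d == level)
            for level in range(1, max_level + 1)]

def agglomerate(points, n_clusters):
    """Naive single-linkage agglomerative clustering on feature vectors."""
    clusters = [[i] for i in range(len(points))]
    d = lambda a, b: sum((x - y) ** 2 for x, y in zip(points[a], points[b]))
    while len(clusters) > n_clusters:
        i, j = min(((i, j) for i in range(len(clusters))
                    for j in range(i + 1, len(clusters))),
                   key=lambda ij: min(d(a, b) for a in clusters[ij[0]]
                                      for b in clusters[ij[1]]))
        clusters[i] += clusters.pop(j)  # merge the closest pair
    return clusters

# Toy graph: a hub (node 0) plus a short chain, so signatures differ.
adj = {0: [1, 2, 3], 1: [0], 2: [0], 3: [0, 4], 4: [3]}
sigs = [concentric_signature(adj, n) for n in sorted(adj)]
print(sigs)
print(agglomerate(sigs, 2))  # the hub ends up in its own group
```

Grouping on concentric signatures rather than raw degree is what lets nodes with similar neighbourhood structure, not just similar degree, fall into the same functional group.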

Relevance: 30.00%

Abstract:

We consider the two-dimensional version of a drainage network model introduced in Gangopadhyay, Roy and Sarkar (2004), and show that the appropriately rescaled family of its paths converges in distribution to the Brownian web. We do so by verifying the convergence criteria proposed in Fontes, Isopi, Newman and Ravishankar (2002).
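A toy simulation conveys what "family of paths" means here: from each open site, step to the closest open site in the next level, so that paths coalesce. This is a loose sketch in the spirit of a drainage network, with simplified connection rules, not the exact Gangopadhyay-Roy-Sarkar construction:

```python
import random

def drainage_paths(width, height, p=0.5, seed=1):
    """Simplified drainage network: from each site at level t, step to the
    open site at level t+1 that is horizontally closest (ties to the left)."""
    rng = random.Random(seed)
    # Each level keeps its open sites (at least one, to avoid dead ends).
    open_rows = [[x for x in range(width) if rng.random() < p] or [0]
                 for _ in range(height)]
    def step(x, t):
        return min(open_rows[t + 1], key=lambda y: (abs(y - x), y))
    paths = []
    for x0 in open_rows[0]:
        path, x = [x0], x0
        for t in range(height - 1):
            x = step(x, t)
            path.append(x)
        paths.append(path)
    return paths

paths = drainage_paths(width=10, height=6, p=0.4, seed=7)
for p_ in paths:
    print(p_)
```

Because the step rule is deterministic given the open sites, two paths that meet at some level coincide forever after, the coalescence property that underlies convergence to the Brownian web under diffusive rescaling.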

Relevance: 30.00%

Abstract:

In Sweden, there are about 0.5 million single-family houses heated by electricity alone, and rising electricity costs are forcing conversion to other heating sources such as heat pumps and wood pellet heating systems. Pellet heating systems for single-family houses are currently a strongly growing market. A future shortage of wood fuels is possible even in Sweden, and combining wood pellet heating with solar heating will help to conserve bio-fuel resources. The objectives of this thesis are to investigate how electrically heated single-family houses can be converted to pellet and solar heating systems, and how the annual efficiency and solar gains can be increased in such systems. The possible reduction of CO emissions by combining pellet heating with solar heating has also been investigated. Systems with pellet stoves (both with and without a water jacket), pellet boilers and solar heating have been simulated. Different system concepts have been compared in order to identify the most promising solutions. Modifications in system design and control strategies have been carried out in order to increase the system efficiency and the solar gains. Possibilities for increasing the solar gains have been limited to the investigation of DHW units for hot water production and the use of hot water for heating dishwashers and washing machines via a heat exchanger instead of electricity (heat-fed appliances). Computer models of pellet stoves, boilers, DHW units and heat-fed appliances have been developed, and the model parameters have been identified from measurements on real components. The agreement between the models and the measurements has been verified. The systems with wood pellet stoves have been simulated in three different multi-zone buildings, modelled in detail with heat distribution through door openings between the zones. For the other simulations, either a single-zone house model or a load file has been used.
Simulations were carried out for Stockholm, Sweden, and, for the simulations with heat-fed appliances, also for Miami, USA. The foremost result of this thesis is an increased understanding of the dynamic operation of combined pellet and solar heating systems for single-family houses. The results show that electricity savings and annual system efficiency are strongly affected by the system design and the control strategy. Large reductions in pellet consumption are possible by combining pellet boilers with solar heating (a reduction larger than the solar gains if the system is properly designed). In addition, large reductions in carbon monoxide emissions are possible. To achieve these reductions, the hot water production and the connection of the radiator circuit must be moved to a well-insulated, solar-heated buffer store so that the boiler can be turned off during the periods when the solar collectors cover the heating demand. The amount of electricity replaced by systems with pellet stoves depends strongly on the house plan, the system design, whether internal doors are open or closed, and the comfort requirements. Proper system design and control strategies are crucial to obtain high electricity savings and high comfort with pellet stove systems. The investigated technologies for increasing the solar gains (DHW units and heat-fed appliances) do increase them significantly, but for the heat-fed appliances market introduction is difficult due to the limited financial savings and the need for a new heat distribution system. The applications closest to market introduction are likely communal laundries and use in sunny climates, where the dominating part of the heat can be covered by solar heating. The DHW unit is economical but competes with the internal finned-tube heat exchanger, which is the dominant technology for hot water preparation in solar combisystems for single-family houses.
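The claim that the pellet saving can exceed the solar gain follows from a simple energy balance: annual system efficiency relates delivered heat to purchased energy, and switching the boiler off in summer removes standby losses on top of the directly replaced fuel. A sketch with purely hypothetical numbers, not measured values from the thesis:

```python
def annual_system_efficiency(heat_delivered_kwh, pellet_energy_kwh,
                             electricity_kwh):
    """Annual system efficiency: delivered heat over purchased energy
    (pellet fuel plus auxiliary electricity; solar heat is free)."""
    return heat_delivered_kwh / (pellet_energy_kwh + electricity_kwh)

# Hypothetical single-family house: 20 000 kWh/a heat demand.
eff_pellet_only = annual_system_efficiency(20_000, 26_000, 500)
# With solar collectors the boiler burns less fuel; if it can also be
# switched off in summer, standby losses disappear and the pellet saving
# can exceed the solar gain itself.
eff_with_solar = annual_system_efficiency(20_000, 19_000, 500)
print(round(eff_pellet_only, 3), round(eff_with_solar, 3))
```

Note that the efficiency defined this way can exceed 1.0 for the solar-assisted system, precisely because solar gains are not purchased energy.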

Relevance: 30.00%

Abstract:

A solar thermal system with seasonal borehole storage for heating a residential area in Anneberg, Sweden, approximately 10 km north of Stockholm, has been in operation since late 2002. Originally, the project was part of the EU THERMIE project “Large-scale Solar Heating Systems for Housing Developments” (REB/0061/97) and was the first solar heating plant in Europe with borehole storage in rock not utilizing a heat pump. Earlier evaluations of the system showed lower performance than the preliminary simulation study, with residents complaining of high electricity use for domestic hot water (DHW) preparation and auxiliary heating. One explanation mentioned in the earlier evaluations is that the borehole storage had not yet reached “steady-state” temperatures at the time of evaluation. Many years have passed since then, and this paper presents results from a new evaluation. The main aim of this work is to evaluate the current performance of the system based on several key figures, as well as the system function, using available measurement data. The analysis shows that although the borehole storage has now reached a quasi-steady state and operates as intended, the auxiliary electricity consumption is much higher than the original design values, largely due to high losses in the distribution network, higher heat loads and lower solar gains.
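Two of the key figures such an evaluation rests on can be computed directly from metered energy flows. A sketch with hypothetical numbers, not the Anneberg measurement data:

```python
def solar_fraction(solar_heat_kwh, total_heat_kwh):
    """Share of the total heat demand covered by solar gains."""
    return solar_heat_kwh / total_heat_kwh

def distribution_loss_share(heat_into_network_kwh, heat_delivered_kwh):
    """Share of the heat fed into the distribution network that is lost
    before reaching the dwellings."""
    return 1 - heat_delivered_kwh / heat_into_network_kwh

# Hypothetical annual figures for a small residential system.
print(round(solar_fraction(300_000, 550_000), 2))           # 0.55
print(round(distribution_loss_share(400_000, 320_000), 2))
```

High distribution losses depress both figures at once: they inflate the heat that must be supplied while the delivered heat stays fixed, which is exactly the pattern the evaluation reports.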

Relevance: 30.00%

Abstract:

This thesis contributes to the heuristic optimization of the p-median problem and to the study of Swedish population redistribution.

The p-median model is the most representative model in location analysis. When facilities are located to serve a population geographically distributed over Q demand points, the p-median model systematically considers all the demand points, so that each demand point has an effect on the location decision. However, a series of questions arises. How do we measure the distances? Does the number of facilities to be located have a strong impact on the result? What scale of the network is suitable? How good is our solution? We scrutinize a number of such issues, because the solutions carry considerable uncertainty and we cannot guarantee that a solution is good enough for decision making. The technique of heuristic optimization is formulated in the thesis.

Swedish population redistribution is examined by a spatio-temporal covariance model. A descriptive analysis is not always enough to describe the moving effects from the neighbouring population; a correlation or covariance analysis shows the tendencies more explicitly. Similarly, optimization techniques are required for parameter estimation and are carried out within the framework of statistical modeling.
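The p-median model is easy to state: choose p facility sites minimizing the total distance from each demand point to its nearest open facility. A simple greedy heuristic illustrates the kind of heuristic optimization studied (a generic sketch, not the thesis's actual algorithm):

```python
def p_median_greedy(dist, p):
    """Greedy heuristic for the p-median problem: repeatedly open the
    facility that most reduces total assignment distance.
    `dist[i][j]` is the distance from demand point i to candidate site j."""
    n_demand, n_sites = len(dist), len(dist[0])
    chosen = []
    best_assign = [float("inf")] * n_demand  # distance to nearest open site
    for _ in range(p):
        def cost_if_added(j):
            return sum(min(best_assign[i], dist[i][j])
                       for i in range(n_demand))
        j_star = min((j for j in range(n_sites) if j not in chosen),
                     key=cost_if_added)
        chosen.append(j_star)
        best_assign = [min(best_assign[i], dist[i][j_star])
                       for i in range(n_demand)]
    return chosen, sum(best_assign)

# Demand points on a line at 0,1,2,8,9,10; candidate sites at the same spots.
pts = [0, 1, 2, 8, 9, 10]
dist = [[abs(a - b) for b in pts] for a in pts]
print(p_median_greedy(dist, 2))  # one site per cluster of points
```

The sketch also hints at the questions raised above: the answer changes with the distance measure in `dist`, with p, and with the granularity of the candidate-site network, and the greedy result carries no optimality guarantee.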

Relevance: 30.00%

Abstract:

This paper formulates the problem of learning Bayesian network structures from data as determining the structure that best approximates the probability distribution indicated by the data. A new metric, the Penalized Mutual Information metric, is proposed, and an evolutionary algorithm is designed to search for the best structure among alternatives. The experimental results show that this approach is reliable and promising.
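The core quantity is empirical mutual information between candidate parent and child variables, discounted by a complexity penalty so that denser structures must earn their extra edges. Below is a sketch with an illustrative BIC-style penalty; the paper's exact penalty term is not reproduced here:

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """Empirical mutual information (in nats) between two discrete samples."""
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    return sum((c / n) * math.log((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

def penalized_mi(xs, ys, penalty_weight=0.5):
    """MI minus a sample-size-dependent complexity penalty, a BIC-like
    sketch of the penalized-metric idea (illustrative, not the paper's
    exact formula)."""
    n = len(xs)
    k = (len(set(xs)) - 1) * (len(set(ys)) - 1)  # free parameters of the edge
    return mutual_information(xs, ys) - penalty_weight * k * math.log(n) / n

xs = [0, 0, 1, 1, 0, 1, 0, 1]
print(round(mutual_information(xs, xs), 3))       # MI(X;X) = H(X) = ln 2
print(round(mutual_information(xs, [0] * 8), 3))  # constant partner: 0.0
```

An evolutionary search would then score each candidate structure by summing such penalized terms over its edges, mutating and recombining structures to climb the score.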

Relevance: 30.00%

Abstract:

This paper investigates the application of neural networks to the recognition of lubrication defects typical of an industrial cold forging process employed by fastener manufacturers. The accurate recognition of lubrication errors, such as the coating not being applied properly or being damaged during material handling, is very important to the quality of the final product in fastener manufacture. Lubrication errors lead to increased forging loads and premature tool failure, as well as to increased defect sorting and re-processing of the coated rod. The lubrication coating provides a barrier between the work material and the die during the drawing operation; moreover, it needs to be sufficiently robust to remain on the wire during the transfer to the cold forging operation, in which the wire undergoes multi-stage deformation without the application of any additional lubrication. Four types of lubrication errors, typical of fastener production, were introduced to a set of sample rods, which were subsequently drawn under laboratory conditions. The drawing force was measured, and a limited set of features was extracted from it. The neural network model learned from these features recognizes all types of lubrication errors with high accuracy: the overall accuracy is around 98%, with an almost uniform distribution of misclassifications across the four error types and the normal condition.
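A toy version of the idea, classifying hand-extracted drawing-force features with a minimal perceptron, shows the mechanics. The paper's actual network, feature set and five-class output are more elaborate; all numbers below are hypothetical:

```python
import random

def train_perceptron(samples, labels, epochs=20, lr=0.1, seed=0):
    """Minimal perceptron: defective (1) vs normal (0) from force features."""
    rng = random.Random(seed)
    w = [rng.uniform(-0.1, 0.1) for _ in samples[0]]
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = y - pred
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# Hypothetical features per drawn rod: (mean force, force variance).
# Lubrication defects raise the drawing force, so defective rods sit higher.
normal = [(1.0, 0.1), (1.1, 0.12), (0.9, 0.08)]
defect = [(1.6, 0.3), (1.7, 0.35), (1.8, 0.4)]
X = normal + defect
y = [0] * 3 + [1] * 3
w, b = train_perceptron(X, y)
print([predict(w, b, x) for x in X])  # [0, 0, 0, 1, 1, 1]
```

The design point carried over from the paper is that classification happens on a small set of features extracted from the force signal, not on the raw force trace.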

Relevance: 30.00%

Abstract:

Although the development of geographic information system (GIS) technology and digital data manipulation techniques has enabled practitioners in the geographical and geophysical sciences to make more efficient use of resource information, many of the methods used in forming spatial prediction models are still inherently based on traditional techniques of map stacking in which layers of data are combined under the guidance of a theoretical domain model. This paper describes a data-driven approach by which Artificial Neural Networks (ANNs) can be trained to represent a function characterising the probability that an instance of a discrete event, such as the presence of a mineral deposit or the sighting of an endangered animal species, will occur over some grid element of the spatial area under consideration. A case study describes the application of the technique to the task of mineral prospectivity mapping in the Castlemaine region of Victoria using a range of geological, geophysical and geochemical input variables. Comparison of the maps produced using neural networks with maps produced using a density estimation-based technique demonstrates that the maps can reliably be interpreted as representing probabilities. However, while the neural network model and the density estimation-based model yield similar results under an appropriate choice of values for the respective parameters, the neural network approach has several advantages, especially in high dimensional input spaces.

Relevance: 30.00%

Abstract:

The peer-to-peer content distribution network (PCDN) has recently become a hot topic, and it has huge potential for massive data-intensive applications on the Internet. One of the challenges in PCDN is routing for data sources and data delivery. In this paper, we study a type of network model formed by dynamic autonomy areas, structured source servers and proxy servers. Based on this network model, we propose a number of algorithms to address the routing and data delivery issues. To cope with the high dynamics of the autonomy areas, we establish dynamic tree-structure proliferation system routing, proxy routing and resource searching algorithms. The simulation results show that the performance of the proposed network model and algorithms is stable.

Relevance: 30.00%

Abstract:

DDoS is a spy-versus-spy game between attackers and detectors. Attackers mimic normal network traffic patterns to defeat detection algorithms based on those features, and discriminating mimicking DDoS attacks from massive legitimate network accesses remains an open problem. We observed that zombies use controlled functions to pump attack packets to the victim; therefore, the attack flows toward the victim always share some properties, e.g. packet distribution behaviors, that legitimate flows do not possess within a short time period. Based on this observation, once suspicious flows toward a server appear, we calculate the distance between the packet distribution behaviors of the suspicious flows. If the distance is less than a given threshold, it is a DDoS attack; otherwise, it is legitimate access. Our analysis and preliminary experiments indicate that the proposed method can discriminate mimicking flooding attacks from legitimate accesses efficiently and effectively.
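The distance-based discrimination step can be sketched with total-variation distance between empirical packet-size distributions. The metric, threshold and flow data below are illustrative choices, not the paper's exact ones:

```python
from collections import Counter

def packet_size_distribution(flow):
    """Empirical distribution of packet sizes (bytes) in a flow."""
    n = len(flow)
    return {size: c / n for size, c in Counter(flow).items()}

def total_variation(p, q):
    """Total-variation distance between two discrete distributions."""
    keys = set(p) | set(q)
    return 0.5 * sum(abs(p.get(k, 0.0) - q.get(k, 0.0)) for k in keys)

def looks_like_ddos(flows, threshold=0.1):
    """Flag an attack when suspicious flows share near-identical packet
    distributions (zombies driven by the same pump function); legitimate
    flows differ more from one another."""
    dists = [packet_size_distribution(f) for f in flows]
    pairwise = [total_variation(a, b)
                for i, a in enumerate(dists) for b in dists[i + 1:]]
    return max(pairwise) < threshold

bot_flows = [[64, 64, 128, 64] for _ in range(3)]        # identical pattern
human_flows = [[64, 512, 1500], [1500, 1500, 64], [512, 64, 64, 64]]
print(looks_like_ddos(bot_flows), looks_like_ddos(human_flows))
```

The point of the scheme is that it compares suspicious flows to each other rather than to a model of normal traffic, which is exactly what mimicking attacks are built to evade.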

Relevance: 30.00%

Abstract:

The presence of nitric oxide synthase (NOS) and the role of nitric oxide (NO) in vascular regulation were investigated in the Australian lungfish, Neoceratodus forsteri. No evidence was found for NOS in the endothelium of large and small blood vessels following processing for NADPH-diaphorase histochemistry. However, both NADPH-diaphorase histochemistry and neural NOS immunohistochemistry demonstrated a sparse network of nitrergic nerves in the dorsal aorta, hepatic artery and branchial arteries, but there were no nitrergic nerves in the small blood vessels of tissues. In contrast, nitrergic nerves were found in non-vascular tissues of the lung, gut and kidney. Dual-wire myography was used to determine whether NO signalling occurred in the branchial artery of N. forsteri. Both SNP and SIN-1 had no effect on the pre-constricted branchial artery, but the particulate guanylyl cyclase (GC) activator, C-type natriuretic peptide, always caused vasodilation. Nicotine mediated a dilation that was not inhibited by the soluble GC inhibitor, ODQ, or the NOS inhibitor, L-NNA, but was blocked by the cyclooxygenase inhibitor, indomethacin. These data suggest that NO control of the branchial artery is lacking, but that prostaglandins could be endothelial relaxing factors in the vasculature of lungfish.

Relevance: 30.00%

Abstract:

In this paper, we test for asymmetric behaviour of industrial and residential electricity demand in the G7 countries, using the entropy-based test for symmetry suggested by [Racine, J., and Maasoumi, E., 2007. A versatile and robust metric entropy test of time-reversibility, and other hypotheses. Journal of Econometrics 138(2), 547–567; Racine, J., and Maasoumi, E., 2008. A robust entropy-based test of asymmetry for discrete and continuous processes. Econometric Reviews 28, 246–261], the Triples test of [Randles, R., Flinger, M., Policello, G., and Wolfe, D., 1980. An asymptotically distribution-free test for symmetry versus asymmetry. Journal of the American Statistical Association 75, 168–172] and the [Bai, J., and Ng, S., 2001. A consistent test for conditional symmetry in time series models. Journal of Econometrics 103, 225–258] test for conditional symmetry. Using data spanning more than three decades, we find overwhelming evidence of conditional symmetry of residential and industrial electricity consumption. This finding implies that the use of econometric tests based on linear data-generating processes is credible.
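The core of the Triples test of Randles et al. is a U-statistic over all sample triples: each triple votes on whether its middle value sits closer to the smaller or the larger of the other two. A sketch of the statistic (omitting the variance estimate needed for the full asymptotic test):

```python
from itertools import combinations

def sign(x):
    return (x > 0) - (x < 0)

def triples_statistic(xs):
    """Mean over all triples of
    (1/3) * [sign(x_i + x_j - 2*x_k) + sign(x_i + x_k - 2*x_j)
             + sign(x_j + x_k - 2*x_i)].
    Positive values indicate right skew, negative values left skew."""
    total, count = 0.0, 0
    for i, j, k in combinations(range(len(xs)), 3):
        a, b, c = xs[i], xs[j], xs[k]
        total += (sign(a + b - 2 * c) + sign(a + c - 2 * b)
                  + sign(b + c - 2 * a)) / 3
        count += 1
    return total / count

symmetric = [-3, -2, -1, 0, 1, 2, 3]
right_skewed = [0, 0, 1, 1, 2, 5, 9]
print(triples_statistic(symmetric))     # near zero for a symmetric sample
print(triples_statistic(right_skewed))  # positive: right skew
```

The full test standardizes this statistic by a consistent variance estimate to obtain an asymptotically distribution-free z-statistic; the entropy-based and conditional-symmetry tests cited above probe the same hypothesis by quite different routes.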

Relevance: 30.00%

Abstract:

This paper presents the Multi-level Virtual Ring (MVR), a new name-based routing scheme for sensor networks. MVR uses a selection algorithm to identify sensor nodes' virtual levels and uses a Distributed Hash Table (DHT) to map them onto the MVR. Address-based routing performs well in wired networks, but not in sensor networks: when nodes move, their addresses must change, and address routing also requires servers to allocate addresses to nodes. Name routing, such as Virtual Ring Routing (VRR), was introduced to solve this problem. MVR is a new name routing scheme that improves routing performance significantly by introducing the multi-level virtual ring and cross-level routing. Experiments show that this embedded name routing is workable and achieves better routing performance.
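The DHT mapping step can be sketched with ordinary consistent hashing onto a single virtual ring; MVR's multi-level rings and cross-level routing are layered on top of this basic idea. Names and sizes below are illustrative:

```python
import hashlib

RING_SIZE = 2 ** 16  # a small virtual ring for illustration

def ring_position(node_id: str) -> int:
    """Hash a node name onto the virtual ring (DHT-style): no address
    server is needed, and the position survives physical movement."""
    digest = hashlib.sha1(node_id.encode()).digest()
    return int.from_bytes(digest[:2], "big") % RING_SIZE

def successor(key_pos, node_positions):
    """The node responsible for a key: first node clockwise from the key."""
    return min(node_positions, key=lambda p: (p - key_pos) % RING_SIZE)

nodes = ["sensor-%d" % i for i in range(8)]
positions = sorted(ring_position(n) for n in nodes)
owner = successor(ring_position("temperature-reading-42"), positions)
print(len(positions), owner in positions)
```

Because positions derive from names alone, this is exactly the property that makes name routing attractive for mobile sensor nodes: movement changes a node's neighbours but not its identity on the ring.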

Relevance: 30.00%

Abstract:

The continuous growth of the user pools of social networking web sites such as Facebook and MySpace, and their incessant expansion of services and capabilities, will in the future invite comparison with today's Content Distribution Networks (CDNs) and peer-to-peer file-sharing applications such as Kazaa and BitTorrent. How can these two mainstream application classes, which already face their own security problems, cope with the combined issues: trust for social networks, and content and index poisoning in CDNs? We address the problems of social trust and file sharing with an overlay trust model based on social activity and transactions. This can enable users to increase the reliability of their online social lives, enhance content distribution and create a better file-sharing example. The aim of this research is to lower the risk of malicious activity on a given social network by applying a correlated trust model to guarantee the validity of identities, privacy and trustworthiness in sharing content.

Relevance: 30.00%

Abstract:

Any attempt to model an economy requires foundational assumptions about the relations between prices, values and the distribution of wealth. These assumptions exert a profound influence over the results of any model. Unfortunately, there are few areas in economics as vexed as the theory of value. I argue in this paper that the fundamental problem with past theories of value is that it is simply not possible to model the determination of value, the formation of prices and the distribution of income in a real economy with analytic mathematical models. All such attempts leave out crucial processes or make unrealistic assumptions which significantly affect the results. There have been two primary approaches to the theory of value. The first, associated with classical economists such as Ricardo and Marx, comprises substance theories of value, which view value as a substance inherent in an object and conserved in exchange. For Marxists, the value of a commodity derives solely from the value of the labour power used to produce it, and therefore any profit is due to the exploitation of the workers. The labour theory of value has been discredited because of its assumption that labour was the only ‘factor’ that contributed to the creation of value, and because of its fundamentally circular argument. Neoclassical theorists argued that price was identical with value and was determined purely by the interaction of supply and demand. Value, then, was completely subjective. Returns to labour (wages) and capital (profits) were determined solely by their marginal contribution to production, so that each factor received its just reward by definition.
Problems with the neoclassical approach include assumptions concerning representative agents, perfect competition, perfect and costless information and contract enforcement, complete markets for credit and risk, aggregate production functions and infinite, smooth substitution between factors, distribution according to marginal products, firms always on the production possibility frontier, pricing decisions that ignore money and credit, and perfectly rational agents with infinite computational capacity. Two critical areas stand out. The first is the underappreciated Sonnenschein-Mantel-Debreu results, which showed that the foundational assumptions of the Walrasian general-equilibrium model imply arbitrary excess demand functions and therefore arbitrary equilibrium price sets. The second is that in real economies there is no equilibrium, only continuous change. Equilibrium is never reached because of constant changes in preferences and tastes; technological and organisational innovations; discoveries of new resources and new markets; inaccurate and evolving expectations of businesses, consumers, governments and speculators; changing demand for credit; the entry and exit of firms; the birth, learning and death of citizens; changes in laws and government policies; imperfect information; generalized increasing returns to scale; random acts of impulse; weather and climate events; changes in disease patterns, and so on. The problem is not the use of mathematical modelling, but the kind of mathematical modelling used. Agent-based models (ABMs), object-oriented programming and greatly increased computer power, however, are opening up a new frontier. Here a dynamic bargaining ABM is outlined as a basis for an alternative theory of value. A large but finite number of heterogeneous commodities and agents with differing degrees of market power are set in a spatial network.
Returns to buyers and sellers are decided at each step in the value chain, and in each factor market, through the process of bargaining. Market power and its potential abuse against the poor and vulnerable are fundamental to how the bargaining dynamics play out. Ethics therefore lies at the very heart of economic analysis, the determination of prices and the distribution of wealth. The neoclassicals are right, then, that price is the enumeration of value at a particular time and place, but wrong to downplay the critical roles of bargaining, power and ethics in determining those same prices.
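A minimal sketch of such a bargaining ABM, with one commodity, random buyer-seller pairings and power-weighted price settlement (all parameters hypothetical; the model outlined above, with many commodities and a spatial network, is far richer):

```python
import random

def bargain(ask, bid, power_seller, power_buyer):
    """Settle a price between bid and ask, weighted by relative market
    power: the more powerful side pulls the price toward its own offer."""
    w = power_seller / (power_seller + power_buyer)
    return bid + w * (ask - bid)

def run_market(n_rounds=50, seed=3):
    """Toy one-commodity market: each round a random buyer-seller pair
    bargains, then both nudge their ask/bid toward the settled price."""
    rng = random.Random(seed)
    sellers = [{"ask": rng.uniform(8, 12), "power": rng.uniform(0.2, 1)}
               for _ in range(5)]
    buyers = [{"bid": rng.uniform(4, 8), "power": rng.uniform(0.2, 1)}
              for _ in range(5)]
    prices = []
    for _ in range(n_rounds):
        s, b = rng.choice(sellers), rng.choice(buyers)
        p = bargain(s["ask"], b["bid"], s["power"], b["power"])
        s["ask"] += 0.2 * (p - s["ask"])  # adapt toward the settled price
        b["bid"] += 0.2 * (p - b["bid"])
        prices.append(p)
    return prices

prices = run_market()
print(round(prices[0], 2), round(prices[-1], 2))
```

Even in this stripped-down form, the distribution of the surplus between buyer and seller is set by the power weights, not by marginal products, which is the ethical point the argument above turns on.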