48 results for distribution network


Relevance:

30.00%

Publisher:

Abstract:

The presence of nitric oxide synthase (NOS) and the role of nitric oxide (NO) in vascular regulation were investigated in the Australian lungfish, Neoceratodus forsteri. No evidence was found for NOS in the endothelium of large or small blood vessels following processing for NADPH-diaphorase histochemistry. However, both NADPH-diaphorase histochemistry and neural NOS immunohistochemistry demonstrated a sparse network of nitrergic nerves in the dorsal aorta, hepatic artery, and branchial arteries, but no nitrergic nerves in the small blood vessels of tissues. In contrast, nitrergic nerves were found in non-vascular tissues of the lung, gut, and kidney. Dual-wire myography was used to determine whether NO signalling occurs in the branchial artery of N. forsteri. Neither SNP nor SIN-1 had any effect on the pre-constricted branchial artery, but the particulate guanylyl cyclase (GC) activator, C-type natriuretic peptide, always caused vasodilation. Nicotine mediated a dilation that was not inhibited by the soluble GC inhibitor, ODQ, or the NOS inhibitor, L-NNA, but was blocked by the cyclooxygenase inhibitor, indomethacin. These data suggest that NO control of the branchial artery is lacking, but that prostaglandins could be endothelial relaxing factors in the vasculature of lungfish.

Relevance:

30.00%

Publisher:

Abstract:

This paper presents the Multi-level Virtual Ring (MVR), a new name-based routing scheme for sensor networks. MVR uses a selection algorithm to identify each sensor node's virtual level and a Distributed Hash Table (DHT) to map nodes onto the MVR. Address-based routing performs well in wired networks, but not in sensor networks: when nodes move, their addresses must change, and address allocation requires dedicated servers. Name-based routing schemes, such as Virtual Ring Routing (VRR), were introduced to solve this problem. MVR is a new name routing scheme that improves routing performance significantly by introducing the multi-level virtual ring and cross-level routing. Experiments show this embedded name routing is workable and achieves better routing performance.
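
A minimal sketch of the core idea, assuming a simple hash-based ring: each node name is hashed to a position on a virtual ring and packets are forwarded greedily towards the destination's position. The ring size, node names, and the greedy forwarding rule below are illustrative assumptions, not the paper's exact MVR algorithm (which adds multiple ring levels and cross-level routing).

```python
import hashlib

RING_SIZE = 2**16  # assumed identifier space of the virtual ring

def ring_id(name: str) -> int:
    """Map a node name to a position on the virtual ring via hashing."""
    digest = hashlib.sha1(name.encode()).digest()
    return int.from_bytes(digest[:2], "big") % RING_SIZE

def ring_distance(a: int, b: int) -> int:
    """Clockwise distance between two ring positions."""
    return (b - a) % RING_SIZE

def next_hop(destination: str, neighbours: list[str]) -> str:
    """Greedy name routing: forward to the neighbour whose ring position
    is closest (clockwise) to the destination's position."""
    dest = ring_id(destination)
    return min(neighbours, key=lambda n: ring_distance(ring_id(n), dest))

# Example: route one step towards hypothetical node "s9".
print(next_hop("s9", ["s1", "s5", "s7"]))
```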

Relevance:

30.00%

Publisher:

Abstract:

The user base of social networking web sites such as Facebook and MySpace continues to grow, and their ever-expanding services and capabilities will increasingly overlap with today's Content Distribution Networks (CDNs) and peer-to-peer file-sharing applications such as Kazaa and BitTorrent. How can these two mainstream classes of application, each already facing its own security problems, cope with the combined issues: trust for social networks, and content and index poisoning in CDNs? We address the problems of social trust and file sharing with an overlay trust model based on social activity and transactions. This can enable users to increase the reliability of their online social life, enhance content distribution, and create a better file-sharing environment. The aim of this research is to lower the risk of malicious activity on a given social network by applying a correlated trust model that guarantees the validity of users' identity, privacy, and trustworthiness in sharing content.
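
As a rough illustration of an overlay trust model built from social activity and transactions, the sketch below blends a user's own transaction history with ratings from social contacts. The weighting scheme, field names, and blending parameter are assumptions for illustration, not the model proposed in the paper.

```python
from dataclasses import dataclass

@dataclass
class Transaction:
    success: bool   # did the peer deliver valid (non-poisoned) content?
    weight: float   # assumed importance of the transaction (e.g. file size)

def direct_trust(history: list[Transaction]) -> float:
    """Weighted fraction of successful transactions with a peer."""
    total = sum(t.weight for t in history)
    if total == 0:
        return 0.5  # no history: neutral prior (an assumption)
    return sum(t.weight for t in history if t.success) / total

def correlated_trust(direct: float, friend_scores: list[float],
                     alpha: float = 0.7) -> float:
    """Blend a user's own experience with ratings from social contacts."""
    if not friend_scores:
        return direct
    social = sum(friend_scores) / len(friend_scores)
    return alpha * direct + (1 - alpha) * social

history = [Transaction(True, 1.0), Transaction(True, 2.0), Transaction(False, 1.0)]
print(correlated_trust(direct_trust(history), [0.9, 0.6]))
```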

Relevance:

30.00%

Publisher:

Abstract:

Any attempt to model an economy requires foundational assumptions about the relations between prices, values and the distribution of wealth. These assumptions exert a profound influence over the results of any model. Unfortunately, there are few areas in economics as vexed as the theory of value. I argue in this paper that the fundamental problem with past theories of value is that it is simply not possible to model the determination of value, the formation of prices and the distribution of income in a real economy with analytic mathematical models. All such attempts leave out crucial processes or make unrealistic assumptions which significantly affect the results.

There have been two primary approaches to the theory of value. The first, associated with classical economists such as Ricardo and Marx, comprised substance theories of value, which view value as a substance inherent in an object and conserved in exchange. For Marxists, the value of a commodity derives solely from the value of the labour power used to produce it, and therefore any profit is due to the exploitation of the workers. The labour theory of value has been discredited because of its assumption that labour was the only 'factor' that contributed to the creation of value, and because of its fundamentally circular argument. Neoclassical theorists argued that price was identical with value and was determined purely by the interaction of supply and demand; value, then, was completely subjective. Returns to labour (wages) and capital (profits) were determined solely by their marginal contribution to production, so that each factor received its just reward by definition. Problems with the neoclassical approach include assumptions concerning representative agents; perfect competition; perfect and costless information and contract enforcement; complete markets for credit and risk; aggregate production functions and infinite, smooth substitution between factors; distribution according to marginal products; firms always on the production possibility frontier; firms' pricing decisions; the neglect of money and credit; and perfectly rational agents with infinite computational capacity.

Two critical problems stand out. First, the underappreciated Sonnenschein-Mantel-Debreu results showed that the foundational assumptions of the Walrasian general-equilibrium model imply arbitrary excess demand functions and therefore arbitrary equilibrium price sets. Second, in real economies there is no equilibrium, only continuous change. Equilibrium is never reached because of constant changes in preferences and tastes; technological and organisational innovations; discoveries of new resources and new markets; inaccurate and evolving expectations of businesses, consumers, governments and speculators; changing demand for credit; the entry and exit of firms; the birth, learning, and death of citizens; changes in laws and government policies; imperfect information; generalized increasing returns to scale; random acts of impulse; weather and climate events; changes in disease patterns, and so on.

The problem is not the use of mathematical modelling, but the kind of mathematical modelling used. Agent-based models (ABMs), object-oriented programming and greatly increased computer power, however, are opening up a new frontier. Here a dynamic bargaining ABM is outlined as a basis for an alternative theory of value. A large but finite number of heterogeneous commodities and agents with differing degrees of market power are set in a spatial network. Returns to buyers and sellers are decided at each step in the value chain, and in each factor market, through the process of bargaining. Market power, and its potential abuse against the poor and vulnerable, is fundamental to how the bargaining dynamics play out. Ethics therefore lie at the very heart of economic analysis, the determination of prices and the distribution of wealth. The neoclassicals are right, then, that price is the enumeration of value at a particular time and place, but wrong to downplay the critical roles of bargaining, power and ethics in determining those same prices.
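
To make the outlined model concrete, here is a minimal sketch of a bargaining ABM in which heterogeneous agents split the surplus of each trade according to their relative bargaining power, so that the wealth distribution emerges from repeated bargaining rather than from an assumed equilibrium. Agent counts, the price rule, and all numbers are illustrative assumptions only, not the paper's specification.

```python
import random

class Agent:
    def __init__(self, power: float):
        self.power = power   # bargaining power in (0, 1], assumed fixed
        self.wealth = 100.0

def bargain(seller: Agent, buyer: Agent, cost: float, value: float) -> float:
    """Price splits the surplus (value - cost) according to relative
    bargaining power, in the spirit of a generalized Nash bargain."""
    share = seller.power / (seller.power + buyer.power)
    return cost + share * (value - cost)

random.seed(1)
agents = [Agent(random.uniform(0.1, 1.0)) for _ in range(100)]

for step in range(1000):
    s, b = random.sample(agents, 2)        # random seller-buyer pairing
    price = bargain(s, b, cost=5.0, value=10.0)
    s.wealth += price - 5.0                # seller keeps price minus cost
    b.wealth += 10.0 - price               # buyer keeps value minus price

wealth = sorted(a.wealth for a in agents)
print("poorest:", round(wealth[0], 1), "richest:", round(wealth[-1], 1))
```

Even in this toy version, persistent differences in bargaining power translate directly into a skewed wealth distribution, which is the mechanism the paper places at the centre of its alternative theory of value.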

Relevance:

30.00%

Publisher:

Abstract:

Effective disinfection planning and management in large, complex water distribution systems requires an accurate network water quality model. This model should be based on reaction kinetics that describe disinfectant loss from bulk water over time to within experimental error. Models in the literature were reviewed for their ability to meet this requirement in real networks. The essential features were identified as accuracy, simplicity, computational efficiency, and the ability to describe consistently the effects of initial chlorine dose, temperature variation, and successive rechlorinations. A reaction scheme of two organic constituents reacting with free chlorine was found to be necessary and sufficient to provide the required features. The recent release of the multispecies extension (MSX) to EPANET and MWH Soft's H2OMap Water MSX network software enables users to implement this and other multiple-reactant bulk decay models in real system simulations.
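
As an illustration of the kind of two-reactant bulk decay scheme described above, the sketch below integrates second-order reactions of free chlorine with a fast and a slow organic constituent. The rate constants and initial concentrations are assumed values for demonstration, not fitted parameters from the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

k_fast, k_slow = 1.0, 0.05   # assumed second-order rate constants, L/(mg*h)

def decay(t, y):
    """Bulk decay: chlorine reacts with fast (A) and slow (B) constituents."""
    cl, a, b = y                 # free chlorine, A, B (mg/L)
    r_fast = k_fast * cl * a
    r_slow = k_slow * cl * b
    return [-(r_fast + r_slow), -r_fast, -r_slow]

# Assumed initial chlorine dose 1.5 mg/L; simulate 48 h of bulk decay.
sol = solve_ivp(decay, (0, 48), [1.5, 0.3, 2.0], t_eval=np.linspace(0, 48, 7))
for t, cl in zip(sol.t, sol.y[0]):
    print(f"t = {t:4.0f} h, chlorine = {cl:.3f} mg/L")
```

The fast constituent reproduces the initial rapid loss after dosing, while the slow constituent governs long-term residual decay; refitting a single first-order constant cannot capture both, which is why the two-reactant scheme is needed for successive rechlorinations.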

Relevance:

30.00%

Publisher:

Abstract:

This paper examines the structure, function, and role of local business associations in home-based business development within an urban region. The Casey local government area (LGA), Victoria, is the focus: nine local business associations in the area (as well as the local council) are evaluated in the context of support for local business development. The evaluation draws upon primary data collected by surveys of local home-based businesses, followed up by semi-structured interviews with representatives of these business associations and the local council. The paper finds that the local business associations are fragmented, with significant overlap in their activities, the commonest of which is acting as a knowledge-distribution node. The cash-strapped local council is the most important such node. All are restricted by vision and resources. As a result, the services provided have little impact on sustainable business development in Casey.

Relevance:

30.00%

Publisher:

Abstract:

Prediction intervals (PIs) are excellent tools for quantifying the uncertainty associated with point forecasts and predictions. This paper adopts and develops the lower upper bound estimation (LUBE) method for constructing PIs using neural network (NN) models. The method is fast and simple and does not require the heavy matrix calculations of traditional methods; it also makes no assumption about the data distribution. A new width-based index is proposed to quantitatively assess how informative the PIs are. Using this measure together with the coverage probability of the PIs, a multi-objective optimization problem is formulated for training NN models in the LUBE method. The optimization problem is then transformed into a training problem through the definition of a PI-based cost function. Particle swarm optimization (PSO) with a mutation operator is used to minimize the cost function. Experiments with synthetic and real-world case studies indicate that the proposed PSO-based LUBE method constructs higher-quality PIs in a simpler and faster manner.
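
The sketch below shows a PI-based cost of the general form used in LUBE training: a width term plus a penalty whenever coverage falls below the nominal level. The paper's specific width-based index is not reproduced here; PICP combined with the PI normalized average width (PINAW) and the hinge-style penalty shown are common choices and serve only as assumptions.

```python
import numpy as np

def picp(y, lower, upper):
    """PI coverage probability: fraction of targets inside their PI."""
    return np.mean((y >= lower) & (y <= upper))

def pinaw(y, lower, upper):
    """PI normalized average width."""
    return np.mean(upper - lower) / (y.max() - y.min())

def pi_cost(y, lower, upper, mu=0.95, eta=50.0):
    """Penalize wide PIs, and heavily penalize coverage below the
    nominal level mu; this is the fitness a PSO particle (one set of
    NN weights producing lower/upper bounds) would be evaluated on."""
    coverage = picp(y, lower, upper)
    penalty = eta * max(0.0, mu - coverage)
    return pinaw(y, lower, upper) + penalty

y = np.array([1.0, 2.0, 3.0, 4.0])
print(pi_cost(y, y - 0.5, y + 0.5))
```

Because this cost is non-differentiable, a derivative-free optimizer such as PSO is a natural fit, which matches the training strategy the paper adopts.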

Relevance:

30.00%

Publisher:

Abstract:

A Bayes net has qualitative and quantitative aspects: the qualitative aspect is its graphical structure, which corresponds to correlations among the variables in the Bayes net; the quantitative aspects are the net parameters. This paper develops a hybrid criterion for learning Bayes net structures that is based on both aspects. We combine model selection criteria that measure data fit with correlation information from statistical tests: given a sample d, search for a structure G that maximizes score(G, d) over the set of structures G that satisfy the dependencies detected in d. We rely on the statistical tests only to accept conditional dependencies, not conditional independencies. We show how to adapt local search algorithms to accommodate the observed dependencies. Simulation studies with GES search and the BDeu/BIC scores provide evidence that the additional dependency information leads to Bayes nets that better fit the target model in distribution and structure.
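
A minimal sketch of the hybrid search idea: during local search, candidate structures that violate any statistically detected dependency are rejected before scoring. The score function and d-connection test are left abstract as placeholders supplied by the caller; the paper's experiments use GES with BDeu/BIC scores.

```python
def satisfies_dependencies(structure, dependencies, d_connected):
    """Accept only structures in which every detected conditional
    dependency (X dep Y | Z) corresponds to a d-connecting path."""
    return all(d_connected(structure, x, y, z) for (x, y, z) in dependencies)

def constrained_local_search(start, neighbours, score, dependencies, d_connected):
    """Greedy hill-climbing over structures, restricted to the set of
    structures satisfying the observed dependencies."""
    current = start
    improved = True
    while improved:
        improved = False
        for g in neighbours(current):
            if not satisfies_dependencies(g, dependencies, d_connected):
                continue  # reject: violates a detected dependency
            if score(g) > score(current):
                current, improved = g, True
                break
    return current
```

Constraining only with accepted dependencies (never independencies) keeps the search space a superset of the true model whenever the tests are sound, which is the asymmetry the abstract emphasizes.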

Relevance:

30.00%

Publisher:

Abstract:

Isolated distribution systems are dispersed throughout regional Queensland to supply small, isolated communities that are distant from the main supply grid. Maintaining the electricity supply to these areas is costly, mainly because of the cost of diesel fuel. Furthermore, there is a community focus on climate change, and Ergon Energy aims to reduce reliance on fossil fuels while optimising cost efficiency and greenhouse gas emissions. The objective of this study is to examine the impacts of renewable energy sources in isolated power systems. For the locations studied, viable renewable energy sources were integrated into these networks. Anticipated challenges and issues with the integration of the intermittent renewable energy sources were addressed using mitigation techniques, including energy storage solutions. The investigation demonstrated that network improvements can be achieved at an ideal level of renewable penetration, which has been the main focus of the project. The project involved the development and simulation of MATLAB Simulink and SINCAL models of the two isolated networks at Gununa and Bamaga. The subsequent analysis of these systems shows that a modest level of renewable penetration, combined with energy storage solutions, reduces fuel consumption and greenhouse gas emissions at these locations.
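
As a back-of-envelope illustration of how renewable penetration translates into fuel and emissions savings in a diesel-supplied network, consider the sketch below. Every figure in it (demand, specific fuel consumption, emission factor, penetration level) is a hypothetical assumption, not a value from the Gununa or Bamaga studies.

```python
# Hypothetical inputs for an isolated diesel-supplied community.
annual_demand_mwh = 4000.0   # assumed annual energy demand
penetration = 0.30           # assumed renewable share of energy served
fuel_l_per_mwh = 270.0       # assumed diesel use per MWh generated
co2_kg_per_l = 2.68          # typical CO2 emitted per litre of diesel

# Energy displaced by renewables offsets diesel generation directly.
diesel_offset_mwh = annual_demand_mwh * penetration
fuel_saved_l = diesel_offset_mwh * fuel_l_per_mwh
co2_saved_t = fuel_saved_l * co2_kg_per_l / 1000.0

print(f"fuel saved: {fuel_saved_l:,.0f} L/yr, CO2 avoided: {co2_saved_t:,.0f} t/yr")
```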

Relevance:

30.00%

Publisher:

Abstract:

Maintaining the reliability and stability of power systems at the transmission and distribution levels is a major challenge today. Grid operators are responsible for maintaining equilibrium between available generation and end-user demand, and grid balancing becomes harder under unexpected generation shortages, grid disturbances, or the integration of variable renewable sources such as wind and solar into the energy mix. To compensate for such imbalances and to accommodate more renewable energy sources on the grid, energy storage systems (ESS) have begun to play an important role as the state of the art advances. ESS can also help reduce greenhouse gas (GHG) emissions by enabling more renewable energy to be integrated into the grid. A variety of energy storage (ES) technologies are used in power system networks, from large scale (above 50 MW) to small scale (up to 100 kW), and based on its characteristics each storage technology has its own merits and demerits. This paper presents an extensive review of the merits and demerits of each storage technology and identifies the technologies best suited for the future. It also reports a feasibility study, conducted with the aid of the E-Select™ tool, of various ES technologies from an applications point of view at different grid locations. This review helps to evaluate feasible ES technologies for a particular electrical application and to develop efficient smart hybrid storage systems for grid applications.

Relevance:

30.00%

Publisher:

Abstract:

Cross-linked poly(ethylene glycol) diacrylate (PEGDA) hydrogels with uniformly controlled nanoporous structures templated from hexagonal lyotropic liquid crystals (LLC) represent separation membrane materials with potentially high permeability and selectivity due to their high pore density and narrow pore size distribution. However, retaining LLC-templated nanostructures is a challenge, as the polymer gels are not strong enough to sustain the surface tension during the drying process. In the current study, cross-linked PEGDA gels were reinforced with a silica network synthesized via an in situ sol-gel method, which assists in the retention of the hexagonal LLC structure. The silica precursor does not obstruct the formation of hexagonal phases. After surfactant removal and drying, the hexagonal structures in samples with a certain amount of tetraethoxysilane (TEOS) loading are well retained, while the nanostructures collapse in samples without silica reinforcement, leading to the hypothesis that the reinforcement provided by the silica network stabilizes the LLC structure. The study examines the conditions necessary for a sufficient and well-dispersed silica network in PEGDA gels that contributes to the retention of the original LLC structures, which potentially enables broad applications of these gels as biomedical and membrane materials.

Relevance:

30.00%

Publisher:

Abstract:

With the arrival of the big data era, Internet traffic is growing exponentially. A wide variety of applications run on the Internet, and traffic classification has been introduced to help manage them for security monitoring and quality-of-service purposes. A large number of machine learning (ML) algorithms have been applied to traffic classification. A significant challenge to classification performance comes from the imbalanced distribution of data in traffic classification systems. In this paper, we propose an Optimised Distance-based Nearest Neighbor (ODNN) approach, which is capable of improving the classification performance on imbalanced traffic data. We analyse the proposed ODNN approach and its performance benefit from both theoretical and empirical perspectives. A large number of experiments were carried out on a real-world traffic dataset. The results show that the performance on "small classes" can be improved significantly, even with only a small amount of training data, while the performance on "large classes" remains stable.
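
For illustration, the sketch below shows distance-weighted nearest-neighbour voting, the family of techniques to which ODNN belongs: nearer neighbours carry more weight, which can help a close minority-class ("small class") sample outvote several more distant majority-class samples. The inverse-distance weighting is a standard choice, not the paper's optimised distance measure.

```python
import numpy as np
from collections import defaultdict

def weighted_knn_predict(X_train, y_train, x, k=5, eps=1e-9):
    """Classify x by inverse-distance-weighted votes of its k nearest
    training samples."""
    dists = np.linalg.norm(X_train - x, axis=1)
    nearest = np.argsort(dists)[:k]
    votes = defaultdict(float)
    for i in nearest:
        votes[y_train[i]] += 1.0 / (dists[i] + eps)  # nearer counts more
    return max(votes, key=votes.get)

# Toy imbalanced data: 2 "small"-class points vs 3 "large"-class points.
X = np.array([[0.0, 0.0], [0.1, 0.1], [1.0, 1.0], [1.1, 0.9], [0.9, 1.1]])
y = np.array(["small", "small", "large", "large", "large"])
print(weighted_knn_predict(X, y, np.array([0.2, 0.2]), k=3))
```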

Relevance:

30.00%

Publisher:

Abstract:

Networks of marine protected areas (MPAs) are being adopted globally to protect ecosystems and supplement fisheries management. The state of California recently implemented a coast-wide network of MPAs, a statewide seafloor mapping program, and ecological characterizations of the species and ecosystems targeted for protection by the network. The main goals of this study were to use these data to evaluate how well seafloor features, as proxies for habitats, are represented and replicated across an MPA network, and how well ecological surveys representatively sampled fish habitats inside MPAs and adjacent reference sites. Seafloor data were classified into broad substrate categories (rock and sediment) and finer-scale geomorphic classifications standard to marine classification schemes, using surface analyses (slope, ruggedness, etc.) of the digital elevation model derived from multibeam bathymetry data. These classifications were then used to evaluate the representation and replication of seafloor structure within the MPAs and across the ecological surveys. Both the broad substrate categories and the finer-scale geomorphic features were proportionately represented for many of the classes, with deviations of 1-6% and 0-7%, respectively. Within MPAs, however, representation of seafloor features differed markedly from original estimates, with differences of up to 28%. The biological monitoring design had mismatches in seafloor structure between sampling in the MPAs and their corresponding reference sites, and some seafloor structure classes were missed entirely. The geomorphic variables derived from multibeam bathymetry data for these analyses are known determinants of the distribution and abundance of marine species and of coastal marine biodiversity. Thus, analyses like those performed in this study can be a valuable initial method of evaluating and predicting the conservation value of MPAs across a regional network.
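
As an illustration of the surface analyses mentioned above, the sketch below derives slope and a terrain ruggedness index from a small bathymetric DEM using common 3x3-window GIS definitions; these stand in for whichever specific tools the study actually used.

```python
import numpy as np

def slope_deg(dem, cell=1.0):
    """Slope in degrees from central-difference elevation gradients."""
    dzdy, dzdx = np.gradient(dem, cell)
    return np.degrees(np.arctan(np.hypot(dzdx, dzdy)))

def ruggedness(dem):
    """Terrain ruggedness index: mean absolute elevation difference
    between each interior cell and its 8 neighbours."""
    c = dem[1:-1, 1:-1]
    diffs = [np.abs(dem[1+dy:dem.shape[0]-1+dy, 1+dx:dem.shape[1]-1+dx] - c)
             for dy in (-1, 0, 1) for dx in (-1, 0, 1) if (dy, dx) != (0, 0)]
    return np.mean(diffs, axis=0)

# Toy 3x3 bathymetry grid (depths in metres, 1 m cell size).
dem = np.array([[0., 1., 2.], [1., 2., 3.], [2., 3., 5.]])
print(slope_deg(dem).round(1))
print(ruggedness(dem).round(2))
```

Thresholding such surfaces (e.g. high ruggedness and slope indicating rock, low values indicating sediment) is how continuous bathymetry is turned into the discrete substrate and geomorphic classes whose representation the study evaluates.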

Relevance:

30.00%

Publisher:

Abstract:

Privacy is attracting growing concern from various parties, especially consumers, because personal data have become so easy to collect and distribute. This research focuses on preserving privacy in social network data publishing. The study explores data anonymization mechanisms in order to improve the privacy protection of social network users. We identify a new type of privacy breach and propose an effective mechanism for privacy protection.
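
Since the abstract does not specify the proposed mechanism, the sketch below illustrates one widely used anonymization idea, k-anonymity-style generalization, on tabular quasi-identifiers; it shows the general approach only and is not the paper's mechanism for social network data.

```python
from collections import Counter

def generalize_age(age: int, width: int = 10) -> str:
    """Coarsen an exact age into a range, reducing identifiability."""
    lo = (age // width) * width
    return f"{lo}-{lo + width - 1}"

def is_k_anonymous(records, k=2):
    """Every combination of quasi-identifiers must appear at least k times."""
    counts = Counter(records)
    return all(c >= k for c in counts.values())

# Hypothetical user table: (age, gender) are the quasi-identifiers.
users = [(23, "F"), (27, "F"), (45, "M"), (41, "M")]
released = [(generalize_age(a), g) for a, g in users]
print(released, is_k_anonymous(released, k=2))
```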