834 results for Multicommodity capacitated network design problem


Relevance:

30.00%

Publisher:

Abstract:

INTRODUCTION: With the advent of Web 2.0, social networking websites like Facebook, MySpace and LinkedIn have become hugely popular. According to (Nilsen, 2009), the top five social networking websites reach almost 250 million unique users globally, with the time people spend on those networks having increased 63% between 2007 and 2008; Facebook alone saw a massive growth of 566% in number of minutes over the same period. Their appeal is clear: they enable users to easily form persistent networks of friends with whom they can interact and share content, and to use those networks to keep in touch with current friends and reconnect with old ones. However, online social network services have rapidly evolved into highly complex systems that contain a large amount of personally salient information derived from large networks of friends. Since that information ranges from simple links to music, photos and videos, users not only have to deal with the huge amount of data generated by them and their friends, but also with the fact that it is composed of many different media forms. Users face increasing challenges, especially as the number of friends on Facebook rises. One example is a simple task such as finding a specific friend among 100 or more: the user would most likely have to go through several pages and make several clicks until finding the right one. Another example is a user with more than 100 friends, each making a status update or another action per day, resulting in roughly 10 updates per hour to keep up with. That scenario is plausible, especially since Facebook changed direction to rival Twitter by encouraging users to update their status just as frequently.

As a result, better visualizations are essential to present the web of information connected to a user. The visualizations used today on social networking sites have not gone through major changes during their lifetimes: more functionality and tools have been added, but the core of the visualization has not changed. The information is still presented in a flat way, in lists or groups of text and images that cannot show the extra connections between pieces of information. Those extra connections can give new meaning and insights to the user, allowing him to see more easily whether a piece of content is important to him and what information is related to it. However, showing those extra connections while still allowing the user to navigate easily and get the needed information at a glance is difficult; the use of colour coding, clusters and shapes becomes essential to that end. Considering the advances in computer hardware in the last decade and the software platforms available today, there is also the opportunity to take advantage of 3D: the available hardware and software are finally ready for the use of 3D on the web. With the extra dimension brought by 3D, visualizations can show the content and its related information on the same screen in a clear way, while allowing a great deal of interactivity. Another opportunity to create better information visualizations comes in the form of open APIs, specifically the ones made available by the social networking sites. Those APIs allow developers to create their own applications or sites that take advantage of the huge amount of information in those networks; in this case, they open the door to the creation of new social network visualizations.

Nevertheless, the third dimension is by itself not enough to create a better interface for a social networking website; there are challenges to overcome. One of them is making the user understand what the system is doing during the interaction. Although that is important in 2D visualizations, it becomes essential in 3D because of the extra dimension. Overcoming it requires the principles of animation defined by the artists at Walt Disney Studios (Johnston, et al., 1995); by applying those principles in the development of the interface, the actions of the system in response to user input become clear and understandable. Furthermore, a user study needs to be performed to reveal the users' main goals and motivations while navigating the social network. Those goals and motivations are important for building an interface that reflects user expectations, and they also help in the development of appropriate metaphors. Metaphors have an important role in the interface because, if correctly chosen, they help the user understand the elements of the interface instead of having to memorize them. The last challenge is the use of 3D visualization on the web itself: there have been several attempts to bring 3D to the web, mainly through the various versions of VRML, which were doomed to failure by the hardware limitations of the time. In the last couple of years, however, there has been a movement to build the tools that finally allow developers to use 3D in a useful way, through X3D or OpenGL but especially Flash.

This thesis argues that there is a need for a better social network visualization, one that shows all the dimensions of the information connected to the user and allows him to move through it. For the new visualization to present a real gain in usability to Facebook's users it must possess several characteristics: first, friends at the core of its design; second, the metaphor of circles of friends to separate users into groups according to the order of friendship. To achieve that, several methods are used: 3D to gain an extra dimension for presenting relevant information, direct manipulation to make the interface comprehensible, predictable and controllable, and animation to make all the action on the screen perceptible to the user. Moreover, with 3D-enabled hardware, the Flash platform through the 3D engine Papervision3D, and the Facebook platform, everything is in place to make the visualization possible. Even so, there are challenges to overcome, such as making the system's actions in 3D understandable to the user and creating correct metaphors that allow the user to understand the information and options available to him.

This thesis document is divided into six chapters. Chapter 2 reviews the literature relevant to the work described in this thesis. Chapter 3 describes the design stage that resulted in the application presented here. Chapter 4 covers the development stage, describing the architecture and the components that compose the application. Chapter 5 explains the usability test process and presents and analyses the results obtained through it. Finally, Chapter 6 presents the conclusions reached in this thesis.

Relevance:

30.00%

Publisher:

Abstract:

A constraint satisfaction problem is a classical artificial intelligence paradigm characterized by a set of variables (each with an associated domain of possible values) and a set of constraints that specify relations among subsets of these variables. Solutions are assignments of values to all variables that satisfy all the constraints. Many real-world problems may be modelled by means of constraints. The range of problems that can use this representation is very diverse and embraces areas like resource allocation, scheduling, timetabling and vehicle routing. Constraint programming is a form of declarative programming in the sense that, instead of specifying a sequence of steps to execute, it relies on properties of the solutions to be found, which are explicitly defined by constraints. The idea of constraint programming is to solve problems by stating constraints that must be satisfied by the solutions, relying on specialized constraint solvers that take advantage of those constraints to search for solutions. The success and popularity of complex problem-solving tools can be greatly enhanced by the availability of friendly user interfaces. User interfaces cover two fundamental areas: receiving information from the user and communicating it to the system, and getting information from the system and delivering it to the user. Despite their potential impact, adequate user interfaces are uncommon in constraint programming in general. The main goal of this project is to develop a graphical user interface that allows constraint satisfaction problems to be represented intuitively. The idea is to visually represent the variables of the problem, their domains and the problem constraints, and to enable the user to interact with an adequate constraint solver to process the constraints and compute the solutions. Moreover, the graphical interface should be capable of configuring the solver's parameters and of presenting solutions in an appealing, interactive way. As a proof of concept, the developed application – GraphicalConstraints – focuses on continuous constraint programming, which deals with real-valued variables and numerical constraints (equations and inequalities). RealPaver, a state-of-the-art solver in continuous domains, was used in the application. The graphical interface supports all stages of constraint processing, from the design of the constraint network to the presentation of the resulting feasible-space solutions as 2D or 3D boxes.
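For illustration, the sketch below (plain Python, not RealPaver's API) shows how a continuous constraint solver can cover the feasible space with the kind of 2D boxes the interface displays; the constraint x^2 + y^2 <= 1, the bisection depth and the initial box are assumptions made only for the example.

def feasible(box):
    """Classify a box against x^2 + y^2 <= 1: 'in', 'out' or 'maybe'."""
    (xlo, xhi), (ylo, yhi) = box
    # Interval evaluation of x^2 over [xlo, xhi] and y^2 over [ylo, yhi].
    lo = (min(abs(xlo), abs(xhi)) ** 2 if xlo * xhi > 0 else 0.0) \
       + (min(abs(ylo), abs(yhi)) ** 2 if ylo * yhi > 0 else 0.0)
    hi = max(abs(xlo), abs(xhi)) ** 2 + max(abs(ylo), abs(yhi)) ** 2
    if hi <= 1.0:
        return "in"
    if lo > 1.0:
        return "out"
    return "maybe"

def branch_and_prune(box, depth=8):
    """Discard infeasible boxes, keep feasible ones, bisect the undecided."""
    status = feasible(box)
    if status == "out":
        return []
    if status == "in" or depth == 0:
        return [box]
    (xlo, xhi), (ylo, yhi) = box
    xm, ym = (xlo + xhi) / 2, (ylo + yhi) / 2
    boxes = []
    for bx in ((xlo, xm), (xm, xhi)):
        for by in ((ylo, ym), (ym, yhi)):
            boxes += branch_and_prune((bx, by), depth - 1)
    return boxes

# Number of boxes covering the unit disc inside the initial box [-2, 2] x [-2, 2].
print(len(branch_and_prune(((-2.0, 2.0), (-2.0, 2.0)))))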

Relevance:

30.00%

Publisher:

Abstract:

Several positioning techniques have been developed to explore the GPS capability to provide precise coordinates in real time. A significant problem for all of these techniques, however, is the ionospheric effect and the tropospheric refraction. Recent research in Brazil, at São Paulo State University (UNESP), has been trying to tackle these problems. For the ionospheric effects, a model named Mod_Ion has been developed. Concerning tropospheric refraction, a Numerical Weather Prediction (NWP) model has been used to compute the zenithal tropospheric delay (ZTD). These two models have been integrated with two positioning methods investigated at UNESP: DGPS (Differential GPS) and network RTK (Real Time Kinematic). The in-house DGPS software has already been finalized and has provided very good results. The network RTK software is still under development; therefore, only preliminary results from this method, using the VRS (Virtual Reference Station) concept, are presented.

Relevance:

30.00%

Publisher:

Abstract:

This work presents the design, simulation and analysis of two optical interconnection networks for a dataflow parallel computer architecture. To verify the performance of the optical interconnection networks on the dataflow architecture, we analyzed the load balancing among the processors during the execution of parallel programs. Load balancing is a very important parameter because it is directly associated with the degree of dataflow parallelism. This article shows that optical interconnection networks designed with simple optical devices can efficiently meet the dataflow requirements of a high-performance communication system.

Relevance:

30.00%

Publisher:

Abstract:

Systems based on artificial neural networks have high computational rates due to the use of a massive number of simple processing elements and the high degree of connectivity between these elements. This paper presents a novel approach to solving the robust parameter estimation problem for nonlinear models with unknown-but-bounded errors and uncertainties. More specifically, a modified Hopfield network is developed and its internal parameters are computed using the valid-subspace technique. These parameters guarantee the convergence of the network to the equilibrium points. A solution to the robust estimation problem with unknown-but-bounded error corresponds to an equilibrium point of the network. Simulation results are presented as an illustration of the proposed approach. Copyright (C) 2000 IFAC.
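As a rough illustration of the mechanism named in this and the following abstracts, the sketch below shows a discrete-time Hopfield-style iteration confined to a valid subspace by projection (v <- T v + s). The matrices T, s, W, b and all step sizes are illustrative placeholders, not the parameters derived by the valid-subspace technique in the paper.

import numpy as np

def modified_hopfield(T, s, W, b, v0, steps=200, lr=0.1):
    """Iterate a Hopfield-style network while confining the state to T v + s."""
    v = v0.copy()
    for _ in range(steps):
        v = T @ v + s                  # project onto the valid subspace
        v = v + lr * (W @ v + b)       # gradient-like Hopfield update
        v = np.clip(v, 0.0, 1.0)       # piecewise-linear activation
    return T @ v + s                   # return a point in the valid subspace

# Toy usage with arbitrary placeholder matrices (n = 4 states).
n = 4
rng = np.random.default_rng(0)
T = np.eye(n) - np.ones((n, n)) / n    # projection enforcing sum(v) = const
s = np.full(n, 0.25)                   # offset: states sum to 1
W = -np.eye(n)
b = np.zeros(n)
print(modified_hopfield(T, s, W, b, rng.random(n)))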

Relevance:

30.00%

Publisher:

Abstract:

A neural network model for solving the N-Queens problem is presented in this paper. More specifically, a modified Hopfield network is developed and its internal parameters are computed using the valid-subspace technique. These parameters guarantee the convergence of the network to the equilibrium points. The network is shown to be completely stable and globally convergent to the solutions of the N-Queens problem. Simulation results are presented to validate the proposed approach.
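For a concrete feel of the optimization problem, the sketch below minimizes the same violation count that the Hopfield energy function penalizes, but with a simple min-conflicts descent instead of the network dynamics described in the paper; board size, step limit and seed are arbitrary choices for the example.

import random

def conflicts(cols, row, col):
    """Number of queens attacking (row, col); `cols` holds one queen per row."""
    return sum(
        1
        for r, c in enumerate(cols)
        if r != row and (c == col or abs(c - col) == abs(r - row))
    )

def n_queens(n=8, max_steps=10000, seed=0):
    random.seed(seed)
    cols = [random.randrange(n) for _ in range(n)]          # one queen per row
    for _ in range(max_steps):
        bad = [r for r in range(n) if conflicts(cols, r, cols[r]) > 0]
        if not bad:
            return cols                                     # zero-violation state
        r = random.choice(bad)
        # Move the queen in row r to its minimum-conflict column.
        cols[r] = min(range(n), key=lambda c: conflicts(cols, r, c))
    return None                                             # no solution found in time

print(n_queens(8))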

Relevance:

30.00%

Publisher:

Abstract:

Economic dispatch (ED) problems have recently been solved by artificial neural network approaches. Systems based on artificial neural networks have high computational rates due to the use of a massive number of simple processing elements and the high degree of connectivity between these elements. The ability of neural networks to realize complex non-linear functions makes them attractive for system optimization. All ED models solved by neural approaches described in the literature fail to represent the transmission system; therefore, such procedures may calculate dispatch policies that do not take important active power constraints into account. Another drawback pointed out in the literature is that some of the neural approaches fail to converge efficiently toward feasible equilibrium points. A modified Hopfield approach designed to solve ED problems with transmission system representation is presented in this paper. The transmission system is represented through linear load flow equations and constraints on active power flows. The internal parameters of the modified Hopfield network are computed using the valid-subspace technique. These parameters guarantee the network's convergence to feasible equilibrium points, which represent the solution of the ED problem. Simulation results and a sensitivity analysis involving the IEEE 14-bus test system are presented to illustrate the efficiency of the proposed approach. (C) 2004 Elsevier Ltd. All rights reserved.
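The "linear load flow equations and constraints on active power flows" mentioned above correspond to a DC power flow model. The sketch below, with an invented 3-bus system, shows how bus angles and line flows are obtained and checked against limits; it is not the Hopfield formulation of the paper, only the constraint model it embeds.

import numpy as np

# Line list: (from bus, to bus, reactance x, flow limit in MW); illustrative data.
lines = [(0, 1, 0.1, 100.0), (0, 2, 0.2, 100.0), (1, 2, 0.1, 100.0)]
P = np.array([50.0, -20.0, -30.0])      # net injections (gen - load), sum to zero

n = 3
B = np.zeros((n, n))                    # DC power flow susceptance matrix
for i, j, x, _ in lines:
    B[i, i] += 1 / x
    B[j, j] += 1 / x
    B[i, j] -= 1 / x
    B[j, i] -= 1 / x

# Bus 0 is the slack bus: fix theta_0 = 0 and solve the reduced system B' theta = P.
theta = np.zeros(n)
theta[1:] = np.linalg.solve(B[1:, 1:], P[1:])

for i, j, x, fmax in lines:
    flow = (theta[i] - theta[j]) / x    # active power flow on the line
    print(f"line {i}-{j}: {flow:7.2f} MW (limit {fmax} MW, ok={abs(flow) <= fmax})")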

Relevance:

30.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

30.00%

Publisher:

Abstract:

This article presents a well-known interior point method (IPM) used to solve the linear programming problems that appear as sub-problems in the solution of the long-term transmission network expansion planning problem. The linear programming problem appears when the transportation model is used and the planning problem is solved with a constructive heuristic algorithm (CHA) or a branch-and-bound algorithm. This paper shows the application of the IPM within a CHA. The IPM performed well and can therefore be used as a tool inside the algorithm employed to solve the planning problem. Illustrative tests are shown using electrical systems known from the specialized literature. (C) 2005 Elsevier B.V. All rights reserved.
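As a toy illustration of the kind of linear programming sub-problem the IPM solves, the sketch below poses a two-corridor transportation-model instance; SciPy's general-purpose linprog stands in for the paper's interior point method, and the corridor costs, limits and demand are invented.

import numpy as np
from scipy.optimize import linprog

# Decision variables f1, f2 = power sent on each candidate corridor (MW).
c = [8.0, 5.0]                      # per-MW cost of using each corridor
A_ub = [[1, 0], [0, 1]]             # corridor capacity limits
b_ub = [60.0, 40.0]                 # MW limits
A_eq = [[1, 1]]                     # delivered power must equal demand
b_eq = [90.0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None)] * 2)
print(res.x, res.fun)               # expected split: 50 MW and 40 MW, cost 600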

Relevance:

30.00%

Publisher:

Abstract:

The problem of signal tracking, in the presence of a disturbance signal in the plant, is solved using a zero-variation methodology. A state feedback controller is designed in order to minimise the H2 norm of the closed-loop system, such that the effect of the disturbance is attenuated. Then, a state estimator is designed and the modification of the zeros is used to minimise the H∞ norm from the reference input signal to the error signal. The error is taken to be the difference between the reference and the output signals, thereby making it a tracking problem. The design is formulated in a linear matrix inequality framework, such that the optimal solution of the stated control problem is obtained. Practical examples illustrate the effectiveness of the proposed method.
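A rough sketch of the state-feedback ingredient only: here a Riccati-based LQR/H2-style gain stands in for the LMI-based zero-variation design of the paper, and the plant matrices and weights are illustrative assumptions.

import numpy as np
from scipy.linalg import solve_continuous_are

A = np.array([[0.0, 1.0], [-2.0, -3.0]])    # plant dynamics x' = A x + B u
B = np.array([[0.0], [1.0]])
Q = np.eye(2)                               # state weight
R = np.array([[1.0]])                       # input weight

P = solve_continuous_are(A, B, Q, R)        # algebraic Riccati equation
K = np.linalg.solve(R, B.T @ P)             # state feedback u = -K x
print("gain K =", K)
print("closed-loop eigenvalues:", np.linalg.eigvals(A - B @ K))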

Relevance:

30.00%

Publisher:

Abstract:

This work presents a neural network based on the ART (adaptive resonance theory) architecture, named the fuzzy ART&ARTMAP neural network, applied to the electric load forecasting problem. Neural networks based on the ART architecture have two fundamental characteristics that are extremely important for network performance (stability and plasticity), which allow the implementation of continuous training. The fuzzy ART&ARTMAP neural network aims to reduce the imprecision of the forecasting results through a mechanism that separates analog and binary data and processes them separately. This yields a reduction in processing time and improved quality of the results when compared with the back-propagation neural network, and better results than classical forecasting techniques (the Box and Jenkins ARIMA methods). Once training is finished, the fuzzy ART&ARTMAP neural network is able to forecast electrical loads 24 h in advance. To validate the methodology, data from a Brazilian electric company are used. (C) 2004 Elsevier B.V. All rights reserved.
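The sketch below implements the generic fuzzy ART building block (complement coding, choice function, vigilance test, fast learning) to make the mechanism concrete; it is not the fuzzy ART&ARTMAP load forecaster of the paper, and the vigilance, choice and learning parameters are assumptions.

import numpy as np

def complement_code(x):
    return np.concatenate([x, 1.0 - x])

def fuzzy_art(inputs, rho=0.75, alpha=0.001, beta=1.0):
    """Cluster inputs (values in [0, 1]) into fuzzy ART categories."""
    weights, labels = [], []
    for x in inputs:
        I = complement_code(np.asarray(x, dtype=float))
        # Choice function T_j = |I ^ w_j| / (alpha + |w_j|), ranked descending.
        order = sorted(range(len(weights)),
                       key=lambda j: -np.minimum(I, weights[j]).sum()
                       / (alpha + weights[j].sum()))
        for j in order:
            match = np.minimum(I, weights[j]).sum() / I.sum()
            if match >= rho:                     # vigilance test passed: resonance
                weights[j] = beta * np.minimum(I, weights[j]) + (1 - beta) * weights[j]
                labels.append(j)
                break
        else:                                    # no category resonates: create one
            weights.append(I.copy())
            labels.append(len(weights) - 1)
    return labels, weights

labels, _ = fuzzy_art([[0.1, 0.2], [0.15, 0.25], [0.9, 0.8], [0.85, 0.75]])
print(labels)   # the first two inputs share a category, the last two another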

Relevance:

30.00%

Publisher:

Abstract:

In this paper we present the results of a methodology for multinodal load forecasting using a Multilayer Perceptron artificial neural network with radial basis functions as the activation function and the backpropagation algorithm to train the network. This methodology allows forecasts to be made at various points of the power system, considering different types of consumers (residential, commercial, industrial) of the electric grid, and it is applied to the short-term electric load forecasting problem (24 hours ahead). We use a database (Centralised Dataset - CDS) provided by the Electricity Commission of New Zealand in this work.
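A minimal sketch of the forecasting setup described above: lagged load values plus an hour-of-day feature in, the load 24 hours ahead out. scikit-learn's MLPRegressor stands in for the paper's network (it offers no radial-basis activation, so tanh is used here), and the load series is synthetic rather than the CDS data.

import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
hours = np.arange(24 * 60)                                   # 60 days, hourly samples
load = 100 + 20 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 2, hours.size)

X, y = [], []
for t in range(24, load.size - 24):
    X.append(np.r_[load[t - 24:t], t % 24])                  # last 24 h + hour index
    y.append(load[t + 24])                                   # target: load 24 h ahead
X, y = np.array(X), np.array(y)

model = MLPRegressor(hidden_layer_sizes=(20,), activation="tanh",
                     max_iter=2000, random_state=0).fit(X[:-100], y[:-100])
print("test MAE:", np.mean(np.abs(model.predict(X[-100:]) - y[-100:])))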

Relevance:

30.00%

Publisher:

Abstract:

A combinatorial mathematical model in tandem with a metaheuristic technique for solving transmission network expansion planning (TNEP) using an AC model associated with reactive power planning (RPP) is presented in this paper. AC-TNEP is handled through a prior DC model, while additional lines as well as VAr plants are used as reinforcements to cope with real network requirements. A solution to AC-TNEP can be obtained in the reinforcement stage by assuming that all reactive demands are supplied locally; by then neglecting the local reactive sources, a reactive power planning (RPP) stage finds the minimum required reactive power sources. A binary GA as well as a real genetic algorithm (RCA) are employed as metaheuristic optimization techniques for solving the combinatorial TNEP and RPP problems. High-quality results with lower investment costs in case studies on test systems show the usefulness of working directly with the AC model in transmission network expansion planning instead of relaxed models. (C) 2010 Elsevier B.V. All rights reserved.
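To make the binary GA layer concrete, the sketch below evolves bit strings where each bit decides whether a candidate line is built; the toy fitness (investment cost plus a penalty for transfer capacity short of demand) merely stands in for the AC/DC network evaluation of the paper, and all costs, capacities and GA settings are invented.

import random

COST = [10, 14, 25, 30, 45]           # candidate line investment costs
CAP = [40, 50, 80, 90, 120]           # MW each candidate adds to the corridor
DEMAND = 150                          # MW that must be transferable

def fitness(bits):
    cost = sum(c for c, b in zip(COST, bits) if b)
    cap = sum(c for c, b in zip(CAP, bits) if b)
    return cost + 1000 * max(0, DEMAND - cap)     # heavy penalty if infeasible

def binary_ga(pop_size=30, generations=100, pmut=0.1, seed=1):
    random.seed(seed)
    pop = [[random.randint(0, 1) for _ in COST] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]                      # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(COST))            # one-point crossover
            child = a[:cut] + b[cut:]
            child = [bit ^ (random.random() < pmut) for bit in child]   # mutation
            children.append(child)
        pop = parents + children
    best = min(pop, key=fitness)
    return best, fitness(best)

print(binary_ga())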

Relevance:

30.00%

Publisher:

Abstract:

This paper proposes a new approach and coding scheme for solving economic dispatch (ED) problems in power systems through an effortless hybrid method (EHM). The novel coding scheme effectively prevents futile searching and infeasible solutions when stochastic search methods are applied, consequently improving search efficiency and solution quality dramatically. The dominant constraint of an economic dispatch problem is power balance. Operational constraints such as generation limits, ramp rate limits, prohibited operating zones (POZ) and network losses are considered for practical operation. In the EHM procedure, the generator outputs are first obtained with a lambda iteration method without considering POZ; this constraint is then satisfied by a genetic-based algorithm. To demonstrate its efficiency, feasibility and speed, the EHM algorithm was applied to solve constrained ED problems of power systems with 6 and 15 units. The simulation results obtained with the EHM were compared to those reported in previous literature in terms of solution quality and computational efficiency. The results reveal the superiority of this method in terms of both cost and CPU time. (C) 2011 Elsevier Ltd. All rights reserved.
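The lambda-iteration step named above can be sketched as follows for quadratic-cost units with output limits; the prohibited operating zones and losses handled by the genetic stage are left out, and the unit data and demand are illustrative only.

UNITS = [  # (a, b, Pmin, Pmax) for each generator, cost C(P) = a*P^2 + b*P
    (0.008, 7.0, 10.0, 85.0),
    (0.009, 6.3, 10.0, 80.0),
    (0.007, 6.8, 10.0, 70.0),
]
DEMAND = 150.0

def dispatch(lmbda):
    """Output of each unit at incremental cost lambda, clipped to its limits."""
    out = []
    for a, b, pmin, pmax in UNITS:
        p = (lmbda - b) / (2 * a)            # solve dC/dP = 2aP + b = lambda
        out.append(min(max(p, pmin), pmax))
    return out

# Bisection on lambda until total generation matches the demand.
lo, hi = 0.0, 50.0
for _ in range(100):
    lmbda = (lo + hi) / 2
    if sum(dispatch(lmbda)) < DEMAND:
        lo = lmbda
    else:
        hi = lmbda
print(dispatch(lmbda), sum(dispatch(lmbda)))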

Relevance:

30.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)