53 results for Issues in social networks
Abstract:
This project shows how the connections of the users of a social network pose an additional risk to the privacy of the users who belong to it. These connections provide enough information to carry out information aggregation processes across different social networks, allowing an attacker to improve their initial knowledge of the networks. The project walks through all the phases needed to carry out this process, from information gathering to the aggregation of the obtained data.
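As a rough illustration of the kind of cross-network aggregation the abstract refers to, the sketch below matches user profiles across two graphs by the overlap of their already-identified connections. All names, the Jaccard scoring and the threshold are illustrative assumptions, not the project's actual pipeline.

```python
# Minimal sketch: link user profiles across two social graphs by comparing the
# overlap of their (already identified) connections. Names and the threshold
# are illustrative; this is not the project's actual pipeline.
import networkx as nx

def neighbour_similarity(g1, g2, u, v, seed_matches):
    """Jaccard overlap between u's neighbours in g1 and v's neighbours in g2,
    mapped through a seed set of already-known cross-network matches."""
    n1 = {seed_matches.get(x) for x in g1.neighbors(u)} - {None}
    n2 = set(g2.neighbors(v))
    if not n1 and not n2:
        return 0.0
    return len(n1 & n2) / len(n1 | n2)

def aggregate(g1, g2, seed_matches, threshold=0.5):
    """Greedily propose new cross-network matches above a similarity threshold."""
    proposals = {}
    for u in g1:
        if u in seed_matches:
            continue
        best = max(((neighbour_similarity(g1, g2, u, v, seed_matches), v)
                    for v in g2 if v not in seed_matches.values()),
                   default=(0.0, None))
        if best[0] >= threshold:
            proposals[u] = best[1]
    return proposals

if __name__ == "__main__":
    g1 = nx.Graph([("alice", "bob"), ("alice", "carol"), ("dave", "bob")])
    g2 = nx.Graph([("a1", "b1"), ("a1", "c1"), ("d1", "b1")])
    seeds = {"bob": "b1", "carol": "c1"}
    print(aggregate(g1, g2, seeds))  # proposes a match for 'alice', for example
```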
Abstract:
In our project, we consider an urban or interurban scenario where people with mobile devices (smartphones) or vehicles equipped with communication interfaces are interested in sharing files with one another or downloading them when passing Access Points (APs) located near the road. We study the possibility of exploiting cooperation during casual encounters between nodes to increase the overall download speed. To this end, we propose algorithms for selecting which packets, for which destinations and which carriers are chosen at each moment. Through extensive simulations, we show how carry&forward cooperation between nodes significantly increases the users' download speed, and how this result holds for various mobility patterns, AP placements and network loads. On the other hand, devices such as smartphones, whose WiFi card is continuously on, drain their battery in a few hours. In many scenarios, an always-on WiFi card is of little use, because there is often no need to transmit or receive. This is aggravated in Delay Tolerant Networks (DTNs), where nodes exchange data when they meet and have the opportunity to do so. Power-saving management techniques allow the battery lifetime to be extended. Our project analyses the advantages and drawbacks that appear when nodes periodically switch off their wireless card to save energy in DTN scenarios. Our results show the conditions under which a node can switch off its card without affecting the probability of contact with other nodes, and the conditions under which this probability decreases. For example, we show that the node's lifetime can be doubled while keeping the contact probability at 1, and that the probability drops quickly when trying to extend the lifetime further.
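The trade-off between duty-cycling the wireless card and the probability of detecting a contact can be made concrete with a toy Monte Carlo estimate such as the one below. The periodic on/off schedule, the exponential contact durations and all parameter values are assumptions for illustration only, not the model or simulator used in the project.

```python
# Toy Monte Carlo estimate of the probability that a duty-cycled node detects
# a contact, as a function of the fraction of time its WiFi card is on.
import random

def contact_probability(duty_cycle, period=10.0, mean_contact=30.0, trials=20000):
    """P(a contact window overlaps an 'on' interval of the radio)."""
    on_time = duty_cycle * period
    detected = 0
    for _ in range(trials):
        start = random.uniform(0.0, period)          # contact start within one schedule period
        duration = random.expovariate(1.0 / mean_contact)
        end = start + duration
        # The radio is on during [k*period, k*period + on_time) for k = 0, 1, ...
        k = 0
        while k * period < end:
            if max(start, k * period) < min(end, k * period + on_time):
                detected += 1
                break
            k += 1
    return detected / trials

if __name__ == "__main__":
    for dc in (1.0, 0.5, 0.25, 0.1):
        print(f"duty cycle {dc:.2f}: contact probability ~ {contact_probability(dc):.3f}")
```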
Examining the sustainability issues in UKOER projects: Developing a sustainable OER ecosystem in HE
Abstract:
The development of open educational resources (OERs) is becoming a strategic priority for governments and education institutions around the world, in response to funding cuts and rising costs in educational provision. In the United Kingdom, a government-sponsored Pilot Programme on Open Educational Resources (JISC/HEA, 2009) was launched in 2009 with an initial budget of £5.7m. This paper reviews the key sustainability issues identified by the projects, including the different approaches and models that have been adopted in order to sustain the continuing development and release of OER once funding has ended. The analysis also considers the challenges relating to the development and implementation of policies and processes for sustainable OER practice within institutions and among academics. The paper concludes by drawing on the experiences of the wider United Kingdom and international OER communities to develop a sustainable OER ecosystem model that can facilitate discussions on the future development of OER initiatives.
Abstract:
In this paper, a novel methodology is introduced, aimed at minimizing the probability of network failure and the failure impact (in terms of QoS degradation) while optimizing resource consumption. A detailed study of MPLS recovery techniques and their GMPLS extensions is also presented. In this scenario, some features for reducing the failure impact while offering minimum failure probabilities are also analyzed. Novel two-step routing algorithms using this methodology are proposed. Results show that these methods offer high protection levels with optimal resource consumption.
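A two-step route selection in the spirit described above can be sketched as follows: first keep a handful of low-cost candidate paths, then pick the candidate with the lowest accumulated failure probability. The link attributes and the scoring rule are illustrative assumptions, not the paper's actual algorithm.

```python
# Hedged sketch of a two-step route selection: step 1 keeps the k cheapest
# candidate paths (resource consumption), step 2 picks the candidate with the
# lowest accumulated failure probability.
from itertools import islice
import networkx as nx

def two_step_route(g, src, dst, k=5):
    # Step 1: k candidate paths ranked by link cost.
    candidates = list(islice(nx.shortest_simple_paths(g, src, dst, weight="cost"), k))
    # Step 2: minimise the path failure probability, approximated as
    # 1 - prod(1 - p_fail) over the traversed links.
    def fail_prob(path):
        p_ok = 1.0
        for u, v in zip(path, path[1:]):
            p_ok *= 1.0 - g[u][v].get("p_fail", 0.0)
        return 1.0 - p_ok
    return min(candidates, key=fail_prob)

if __name__ == "__main__":
    g = nx.Graph()
    g.add_edge("A", "B", cost=1, p_fail=0.10)
    g.add_edge("B", "D", cost=1, p_fail=0.10)
    g.add_edge("A", "C", cost=1, p_fail=0.01)
    g.add_edge("C", "D", cost=2, p_fail=0.01)
    print(two_step_route(g, "A", "D"))  # prefers the more reliable A-C-D path
```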
Abstract:
Quantitatively assessing the importance or criticality of each link in a network is of practical value to operators, as it can help them to increase the network's resilience, provide more efficient services, or improve some other aspect of the service. Betweenness is a graph-theoretical measure of centrality that can be applied to communication networks to evaluate link importance. However, as we illustrate in this paper, the basic definition of betweenness centrality produces inaccurate estimations, as it does not take into account some aspects relevant to networking, such as the heterogeneity in link capacity or the difference between node pairs in their contribution to the total traffic. A new algorithm for discovering link centrality in transport networks is proposed in this paper. It requires only static or semi-static network and topology attributes, and yet produces estimations of good accuracy, as verified through extensive simulations. Its potential value is demonstrated by an example application. In the example, the simple shortest-path routing algorithm is improved in such a way that it outperforms other more advanced algorithms in terms of blocking ratio.
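A minimal sketch of such a capacity- and traffic-aware link centrality is given below: each node pair's demand is accumulated on the links of its shortest path and then normalised by link capacity. The weighting scheme is an assumption chosen to illustrate the idea, not the algorithm proposed in the paper.

```python
# Traffic- and capacity-aware link centrality sketch: per-pair demand is
# accumulated on the links of its shortest path and normalised by capacity.
from collections import defaultdict
import networkx as nx

def weighted_link_centrality(g, demands):
    """demands: {(src, dst): offered_traffic}; returns {(u, v): load / capacity}."""
    load = defaultdict(float)
    for (s, t), traffic in demands.items():
        path = nx.shortest_path(g, s, t, weight="weight")
        for u, v in zip(path, path[1:]):
            load[frozenset((u, v))] += traffic
    return {tuple(link): l / g.edges[tuple(link)]["capacity"]
            for link, l in load.items()}

if __name__ == "__main__":
    g = nx.Graph()
    g.add_edge("A", "B", weight=1, capacity=10.0)
    g.add_edge("B", "C", weight=1, capacity=2.5)
    g.add_edge("A", "C", weight=3, capacity=10.0)
    demands = {("A", "C"): 4.0, ("B", "C"): 1.0}
    # The low-capacity link B-C comes out as the most critical one here.
    print(weighted_link_centrality(g, demands))
```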
Abstract:
The objective of Traffic Engineering is to optimize network resource utilization. Although several works have been published on minimizing network resource utilization in MPLS networks, few of them have focused on LSR label space reduction. This letter studies Asymmetric Merged Tunneling (AMT) as a new method for reducing the label space in MPLS networks. The proposed method may be regarded as a combination of label merging (proposed in the MPLS architecture) and asymmetric tunneling (proposed recently in our previous works). Finally, simulations are performed comparing AMT with both of its ancestors. The results show a great improvement in the label space reduction factor.
Abstract:
Most network operators have considered reducing LSR label spaces (the number of labels used) as a way of simplifying the management of the underlying virtual private networks (VPNs) and therefore reducing operational expenditure (OPEX). The IETF outlined the label merging feature in MPLS, which allows the configuration of multipoint-to-point (MP2P) connections, as a means of reducing the label space in LSRs. We found two main drawbacks in this label space reduction: a) it must be applied separately to each set of LSPs with the same egress LSR, which decreases the options for better reductions, and b) LSRs close to the edge of the network experience a greater label space reduction than those close to the core. The latter implies that MP2P connections reduce the number of labels asymmetrically.
Abstract:
In this paper, different recovery methods applied at different network layers and time scales are used in order to enhance network reliability. Each layer deploys its own fault management methods; however, current recovery methods are applied to only a specific layer. New protection schemes, based on the proposed partial disjoint path algorithm, are defined in order to avoid protection duplication in a multi-layer scenario. The new protection schemes also encompass shared segment backup computation and shared risk link group identification. A complete set of experiments proves the efficiency of the proposed methods in relation to previous ones, in terms of the resources used to protect the network, the failure recovery time and the request rejection ratio.
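One common way to obtain a backup path that is only partially disjoint from the working path, when a fully disjoint one is too expensive or does not exist, is to penalise rather than remove the links of the working path before re-running the path computation. The sketch below follows that general idea; the penalty factor is an illustrative assumption and the code is not the paper's partial disjoint path algorithm.

```python
# Hedged sketch: compute a backup path that avoids the working path where
# possible. Working-path links are penalised (not removed), so the result may
# be only partially disjoint.
import networkx as nx

def partial_disjoint_backup(g, working_path, src, dst, penalty=100.0):
    h = g.copy()
    for u, v in zip(working_path, working_path[1:]):
        h[u][v]["weight"] = h[u][v].get("weight", 1.0) * penalty
    return nx.shortest_path(h, src, dst, weight="weight")

if __name__ == "__main__":
    g = nx.Graph()
    for u, v in [("A", "B"), ("B", "C"), ("A", "D"), ("D", "C"), ("B", "D")]:
        g.add_edge(u, v, weight=1.0)
    working = nx.shortest_path(g, "A", "C", weight="weight")
    print("working:", working)
    print("backup :", partial_disjoint_backup(g, working, "A", "C"))
```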
Abstract:
In networks with small buffers, such as networks based on optical packet switching (OPS), the convolution approach (CA) is regarded as one of the most accurate methods for connection admission control. Admission control and resource management have been addressed in other works oriented to bursty traffic and ATM. This paper focuses on heterogeneous traffic in OPS-based networks. For heterogeneous traffic and bufferless networks, the enhanced convolution approach (ECA) is a good solution. However, both methods (CA and ECA) have a high computational cost when the number of connections is large. Two new mechanisms based on the Monte Carlo method (UMCA and ISCA) are proposed to overcome this drawback. Simulation results show that our proposals achieve a lower computational cost than the enhanced convolution approach, with only a small stochastic error in the probability estimation.
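The quantity at stake in such an admission decision is the probability that the aggregated traffic of the accepted connections exceeds the (bufferless) link capacity. The sketch below estimates it with a crude Monte Carlo loop over heterogeneous on/off sources; it conveys the flavour of a sampling-based estimator but is not the UMCA or ISCA mechanism of the paper, and all parameters are illustrative.

```python
# Crude Monte Carlo estimate of the overflow probability on a bufferless link,
# the quantity an admission control decision compares against a QoS target.
import random

def overflow_probability(connections, capacity, samples=100000):
    """connections: list of (peak_rate, activity_prob) heterogeneous on/off sources."""
    overflows = 0
    for _ in range(samples):
        load = sum(peak for peak, p_on in connections if random.random() < p_on)
        if load > capacity:
            overflows += 1
    return overflows / samples

if __name__ == "__main__":
    # 30 heterogeneous connections offered to a 100-unit link.
    conns = [(10.0, 0.2)] * 20 + [(5.0, 0.5)] * 10
    p = overflow_probability(conns, capacity=100.0)
    print(f"estimated overflow probability: {p:.4f}")
    print("admit new connection" if p < 1e-2 else "reject new connection")
```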
Abstract:
This paper proposes a multicast implementation based on adaptive routing with anticipated calculation. Three different cost measures can be considered for a point-to-multipoint connection: bandwidth cost, connection establishment cost and switching cost. Applying the method, which is based on pre-evaluated routing tables, makes it possible to reduce the bandwidth cost and the connection establishment cost individually.
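The role of pre-evaluated (anticipated) routing tables can be illustrated with the following sketch: all-pairs shortest paths are computed off-line, and a point-to-multipoint tree is later assembled from those stored paths, so that no route computation is needed at connection-setup time. The grafting heuristic and all names are illustrative assumptions, not the paper's method.

```python
# Hedged sketch: a point-to-multipoint tree assembled from pre-evaluated
# shortest-path tables, so connection setup needs no route computation.
import networkx as nx

def precompute_tables(g):
    """Anticipated calculation: all-pairs shortest paths, done off-line."""
    return dict(nx.all_pairs_dijkstra_path(g, weight="weight"))

def multicast_tree(tables, source, destinations):
    """Graft each destination onto the tree using only the pre-computed paths."""
    tree_edges = set()
    for d in destinations:
        path = tables[source][d]
        tree_edges.update(zip(path, path[1:]))
    return tree_edges

if __name__ == "__main__":
    g = nx.Graph()
    for u, v, w in [("S", "A", 1), ("A", "B", 1), ("A", "C", 1), ("S", "C", 3)]:
        g.add_edge(u, v, weight=w)
    tables = precompute_tables(g)                     # done once, in advance
    print(multicast_tree(tables, "S", ["B", "C"]))    # cheap at setup time
```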
Abstract:
The emergence of uncorrelated growing networks is proved when nodes are removed either uniformly or under the preferential survival rule recently observed in the World Wide Web evolution. To this aim, the rate equation for the joint probability of degrees is derived, and stationary symmetrical solutions are obtained by passing to the continuum limit. When a uniformly random removal of extant nodes and linear preferential attachment of new nodes are at work, we prove that the only stationary solution corresponds to uncorrelated networks for any removal rate r ∈ (0,1). In the more general case of preferential survival of nodes, uncorrelated solutions are also obtained. These results generalize the uncorrelatedness displayed by the (undirected) Barabási-Albert network model to models with uniformly random and selective (against low degrees) removal of nodes.
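For reference, the standard uncorrelatedness condition the abstract alludes to can be written as follows; the notation is the usual one for degree correlations and is not necessarily the paper's.

```latex
% A network is degree-uncorrelated when the joint distribution of the degrees
% (k, k') at the two ends of a randomly chosen edge factorises,
\[
  P(k, k') \;=\; q(k)\, q(k'),
  \qquad
  q(k) \;=\; \frac{k\,P(k)}{\langle k \rangle},
\]
% where $P(k)$ is the degree distribution, $q(k)$ the degree distribution of an
% edge endpoint, and $\langle k \rangle = \sum_k k\,P(k)$ the mean degree.
% Equivalently, the mean degree of the neighbours of a degree-$k$ node does not
% depend on $k$.
```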
Abstract:
Background: We address the problem of studying recombinational variations in (human) populations. In this paper, our focus is on one computational aspect of the general task: given two networks G1 and G2, with both mutation and recombination events, defined on overlapping sets of extant units, the objective is to compute a consensus network G3 with a minimum number of additional recombinations. We describe a polynomial-time algorithm with a guarantee that the number of computed new recombination events is within ϵ = sz(G1, G2) of the optimal number of recombinations, where sz is a well-behaved function of the sizes and topologies of G1 and G2. To date, this is the best known result for a network consensus problem. Results: Although the network consensus problem can be applied to a variety of domains, here we focus on the structure of human populations. With our preliminary analysis of a segment of human Chromosome X data, we are able to infer ancient recombinations, population-specific recombinations and more, which also support the widely accepted 'Out of Africa' model. These results have been verified independently using traditional manual procedures. To the best of our knowledge, this is the first recombination-based characterization of human populations. Conclusion: We show that our mathematical model identifies recombination spots in the individual haplotypes; the aggregate of these spots over a set of haplotypes defines a recombinational landscape that has enough signal to detect the continental as well as the population divide based on a short segment of Chromosome X. In particular, we are able to infer ancient recombinations, population-specific recombinations and more, which also support the widely accepted 'Out of Africa' model. The agreement with the mutation-based analysis can be viewed as an indirect validation of our results and the model. Since the model in principle gives us more information embedded in the networks, in our future work we plan to investigate more non-traditional questions via these structures computed by our methodology.
Abstract:
The use of simple and multiple correspondence analysis is well established in social science research for understanding relationships between two or more categorical variables. By contrast, canonical correspondence analysis, which is a correspondence analysis with linear restrictions on the solution, has become one of the most popular multivariate techniques in ecological research. Multivariate ecological data typically consist of frequencies of observed species across a set of sampling locations, as well as a set of observed environmental variables at the same locations. In this context the principal dimensions of the biological variables are sought in a space that is constrained to be related to the environmental variables. This restricted form of correspondence analysis has many uses in social science research as well, as is demonstrated in this paper. We first illustrate the result that canonical correspondence analysis of an indicator matrix, restricted to be related to an external categorical variable, reduces to a simple correspondence analysis of a set of concatenated (or stacked) tables. Then we show how canonical correspondence analysis can be used to focus on, or partial out, a particular set of response categories in sample survey data. For example, the method can be used to partial out the influence of missing responses, which usually dominate the results of a multiple correspondence analysis.
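The building block that the stacked-table result above reduces to is a simple correspondence analysis of a two-way frequency table, which can be sketched in a few lines of linear algebra. The code below shows only that core SVD step; the canonical (restricted) version and the stacking of tables are not implemented here, and the toy table is invented purely for illustration.

```python
# Minimal sketch of simple correspondence analysis of a two-way frequency table.
import numpy as np

def correspondence_analysis(N):
    """Return row and column principal coordinates of a frequency table N."""
    P = N / N.sum()                      # correspondence matrix
    r = P.sum(axis=1)                    # row masses
    c = P.sum(axis=0)                    # column masses
    # Matrix of standardised residuals.
    S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))
    U, sv, Vt = np.linalg.svd(S, full_matrices=False)
    rows = (U * sv) / np.sqrt(r)[:, None]     # row principal coordinates
    cols = (Vt.T * sv) / np.sqrt(c)[:, None]  # column principal coordinates
    return rows, cols, sv**2                  # coordinates and principal inertias

if __name__ == "__main__":
    # Toy survey table: rows = respondent groups, columns = response categories.
    N = np.array([[30.0, 10.0,  5.0],
                  [20.0, 25.0, 10.0],
                  [ 5.0, 15.0, 30.0]])
    rows, cols, inertias = correspondence_analysis(N)
    print("principal inertias:", np.round(inertias, 4))
```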