883 results for Scale-free network


Relevance: 40.00%

Abstract:

Network virtualisation is seen as a promising approach to overcome the so-called "Internet impasse" and bring innovation back into the Internet, by allowing easier migration towards novel networking approaches as well as the coexistence of complementary network architectures on a shared infrastructure in a commercial context. Recently, interest from operators and mainstream industry in network virtualisation has grown significantly as its potential benefits, both economic and operational, have become clearer. Initially the concept was mainly a research topic, materialised in small-scale testbeds and research network environments. This PhD thesis aims to provide the network operator with a set of mechanisms and algorithms capable of managing and controlling virtual networks. To this end, we propose a framework that allocates, monitors and controls virtual resources in a centralised and efficient manner; to analyse its performance, the framework was implemented and evaluated on a small-scale testbed.

To enable the operator to allocate virtual networks onto the substrate network efficiently, in real time and on demand, a heuristic algorithm is proposed to perform the virtual network mapping. So that the operator can obtain the highest profit from the physical network, a mathematical formulation is also proposed that maximises the number of virtual networks allocated onto the physical network. Since the power consumption of the physical network represents a significant share of the operating costs, it is important to allocate virtual networks onto fewer physical resources, and preferably onto resources that are already active. To address this challenge, we propose a mathematical formulation that minimises the energy consumption of the physical network without affecting the efficiency of virtual network allocation. To minimise fragmentation of the physical network while increasing the operator's revenue, the initial formulation is extended to contemplate the re-optimisation of previously mapped virtual networks, so that the operator makes better use of its physical infrastructure. It is also necessary to address the migration of virtual networks, whether for load balancing or because of imminent failure of physical resources, without affecting the proper functioning of the virtual network. To this end, we propose a method based on cloning techniques to migrate virtual networks across the physical infrastructure transparently and without disrupting the virtual network. To assess the resilience of virtual networks to physical network failures, while obtaining the optimal solution for migrating virtual networks in case of imminent failure of physical resources, the mathematical formulation is further extended to minimise the number of migrated nodes and the relocation of virtual links.

In comparison with our optimisation proposals, existing heuristics for mapping virtual networks were found to perform poorly. We also found that it is possible to minimise energy consumption without penalising the efficiency of the allocation. By applying re-optimisation to the virtual networks, it was shown that more free resources can be obtained and that the physical resources become better balanced. Finally, virtual networks were shown to be quite resilient to failures of the physical network.
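A minimal sketch, in generic virtual network embedding notation assumed here rather than taken from the thesis, of how an energy-aware objective of this kind is commonly written: the binary variable x_i switches substrate node i on, m_i^v maps virtual node v onto substrate node i, c_v is the demand of virtual node v and C_i the capacity of substrate node i.

\begin{align}
\min \quad & \sum_{i \in N^{S}} P^{\mathrm{base}}_{i}\, x_{i} \\
\text{s.t.} \quad & \sum_{i \in N^{S}} m^{v}_{i} = 1 \quad \forall v \in N^{V}
    && \text{(each virtual node mapped exactly once)} \\
& \sum_{v \in N^{V}} c_{v}\, m^{v}_{i} \le C_{i}\, x_{i} \quad \forall i \in N^{S}
    && \text{(capacity respected, hosting only on powered-on nodes)} \\
& x_{i},\, m^{v}_{i} \in \{0,1\}
\end{align}

Link mapping and bandwidth constraints are added analogously; the re-optimisation and failure-driven migration variants described above would extend such a model with terms that penalise the number of remapped nodes and relocated virtual links.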

Relevance: 40.00%

Abstract:

As technology advances, not only do new standards and programming styles appear, but some of the previously established ones also gain relevance. In a new Internet paradigm, where interconnection between small devices is key to the development of new businesses and to scientific advancement, there is a need for simple solutions that anyone can implement, so that ideas can become more than just ideas. Open-source software is alive and well, especially in the area of the Internet of Things. This opens windows for many low-capital entrepreneurs to experiment with their ideas and actually develop prototypes, which can help identify problems with a project or shine light on possible new features and interactions. As programming becomes more and more popular among people from fields not related to software, there is a need for guidance in developing something other than basic algorithms, which is where this thesis comes in: a comprehensive document explaining the challenges of, and the choices available for, developing a sensor data and message delivery system that scales well and implements the delivery of critical messages. Modularity and extensibility were also given much importance, making this an affordable tool for anyone who wants to build a sensor network of this kind.
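Purely as an illustrative sketch, with all names (Dispatcher, publish, CRITICAL) being assumptions rather than anything taken from the thesis, the "critical messages first" behaviour can be expressed with a priority queue:

import heapq
import itertools
from dataclasses import dataclass, field

CRITICAL, NORMAL = 0, 1  # lower value = higher priority

@dataclass(order=True)
class QueuedMessage:
    priority: int
    seq: int                       # tie-breaker keeps FIFO order within a priority
    payload: dict = field(compare=False)

class Dispatcher:
    def __init__(self):
        self._queue = []
        self._seq = itertools.count()

    def publish(self, payload, critical=False):
        priority = CRITICAL if critical else NORMAL
        heapq.heappush(self._queue, QueuedMessage(priority, next(self._seq), payload))

    def drain(self):
        # critical messages are always delivered before routine sensor readings
        while self._queue:
            yield heapq.heappop(self._queue).payload

if __name__ == "__main__":
    d = Dispatcher()
    d.publish({"sensor": "t1", "temp": 21.3})
    d.publish({"sensor": "smoke", "alarm": True}, critical=True)
    print(list(d.drain()))  # the critical alarm comes out first

In a real deployment the queue would feed some transport (sockets, a broker, etc.); the sketch only shows the ordering logic.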

Relevance: 40.00%

Abstract:

The field site network (FSN) plays a central role in conducting joint research within all Assessing Large-scale Risks for biodiversity with tested Methods (ALARM) modules and provides a mechanism for integrating research on different topics in ALARM at the same site, for measuring multiple impacts on biodiversity. The network covers most European climates and biogeographic regions, from Mediterranean through central European and boreal to subarctic. The project links databases with the Europe-wide field site network (FSN), including geographic information system (GIS)-based information that characterises the test locations for ALARM researchers for joint on-site research. Maps are provided in a standardised way and merged with other site-specific information. The application of GIS to these field sites, and the associated information management, promotes the use of the FSN for research and for disseminating the results. We conclude that the ALARM FSN sites, together with other research sites in Europe, could jointly be used as a future backbone for research proposals.

Relevance: 40.00%

Abstract:

Shape provides some of the most relevant information about an object, which makes it one of the most important visual attributes used to characterize objects. This paper introduces a novel approach to shape characterization, which combines modeling the shape as a complex network with an analysis of its complexity in a dynamic evolution context. Descriptors computed through this approach prove efficient for shape characterization and incorporate desirable properties such as scale and rotation invariance. Experiments using two different shape databases (an artificial shape database and a leaf shape database) are presented in order to evaluate the method, and its results are compared to traditional shape analysis methods found in the literature. (C) 2009 Published by Elsevier B.V.
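A hedged sketch of the general idea, not the paper's exact algorithm: boundary points become vertices of a network, edges connect points closer than a threshold, and degree statistics collected while that threshold grows (the "dynamic evolution") form the descriptor; normalising distances by their maximum is what gives scale invariance.

import numpy as np

def shape_descriptor(contour, thresholds=np.linspace(0.05, 0.5, 10)):
    """contour: (n, 2) array of boundary points."""
    pts = np.asarray(contour, dtype=float)
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    d /= d.max()                      # normalised distances -> scale invariance
    n = len(pts)
    feats = []
    for t in thresholds:              # "dynamic evolution" of the network
        adj = (d <= t) & ~np.eye(n, dtype=bool)
        deg = adj.sum(axis=1) / (n - 1)
        feats.extend([deg.mean(), deg.max()])
    return np.array(feats)

if __name__ == "__main__":
    theta = np.linspace(0, 2 * np.pi, 100, endpoint=False)
    circle = np.c_[np.cos(theta), np.sin(theta)]
    print(shape_descriptor(circle).round(3))

Rotation invariance follows because only pairwise distances enter the construction; the specific degree statistics used here are illustrative choices.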

Relevance: 40.00%

Abstract:

The free-carrier absorption cross-section σ of a magnetic colloid composed of magnetite nanoparticles dispersed in oil is obtained using the Z-scan technique under different experimental conditions of the laser beam. We show that it is possible to obtain σ with picosecond pulsed and millisecond chopped beams at pulse frequencies below about 30 Hz. At higher pulse frequencies, the heating of the colloidal system triggers the appearance of the Soret effect, which artificially increases the value of σ calculated from the experimental results. The limits of the different experimental setups are discussed. (C) 2012 Optical Society of America

Relevance: 40.00%

Abstract:

Fluctuation-dissipation theorems can be used to predict characteristics of noise from characteristics of the macroscopic response of a system. In the case of gene networks, feedback control determines the "network rigidity," defined as resistance to slow external changes. We propose an effective Fokker-Planck equation that relates gene expression noise to topology and to time scales of the gene network. We distinguish between two situations referred to as normal and inverted time hierarchies. The noise can be buffered by network feedback in the first situation, whereas it can be topology independent in the latter.
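As a minimal illustration in notation assumed here (the paper's actual equation is multidimensional and network-specific), consider a single gene whose expression level x relaxes towards x_0 with rate k, the "rigidity" set by feedback, while being driven by a noise source of strength D:

\frac{\partial P(x,t)}{\partial t}
  = \frac{\partial}{\partial x}\big[\,k\,(x - x_{0})\,P(x,t)\,\big]
  + D\,\frac{\partial^{2} P(x,t)}{\partial x^{2}},
\qquad
\langle \delta x^{2} \rangle_{\mathrm{st}} = \frac{D}{k}.

The stationary variance D/k illustrates the buffering referred to above: stronger feedback (larger k, i.e. higher rigidity) suppresses the noise for a given noise strength D.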

Relevance: 40.00%

Abstract:

Too Big to Ignore (TBTI; www.toobigtoignore.net) is a research network and knowledge mobilization partnership established to elevate the profile of small-scale fisheries (SSF), to argue against their marginalization in national and international policies, and to develop research and governance capacity to address global fisheries challenges. Network participants and partners are conducting global and comparative analyses, as well as in-depth studies of SSF in the context of local complexity and dynamics, along with a thorough examination of governance challenges, to encourage careful consideration of this sector in local, regional and global policy arenas. Comprising 15 partners and 62 researchers from 27 countries, TBTI conducts activities in five regions of the world. In the Latin America and the Caribbean (LAC) region, we are taking a participative approach to investigate and promote stewardship and self-governance in SSF, seeking best practices and success stories that could be replicated elsewhere. The region will also focus on promoting sustainable livelihoods in coastal communities. Key activities include workshops and stakeholder meetings, facilitation of policy dialogue and networking, and assessment of local capacity needs and training. Currently, LAC members are putting together publications that examine key issues and best practices concerning SSF in the region, with a first focus on ecosystem stewardship. Other planned deliverables include a comparative analysis, a regional profile of the top research issues on SSF, and a synthesis of SSF knowledge in LAC.

Relevance: 40.00%

Abstract:

This dissertation examines the challenges and limits that graph analysis algorithms encounter on distributed architectures built from personal computers. In particular, it analyses the behaviour of the PageRank algorithm as implemented in a popular C++ library for distributed graph analysis, the Parallel Boost Graph Library (Parallel BGL). The results presented here show that the Bulk Synchronous Parallel programming model is unsuitable for an efficient implementation of PageRank on clusters made up of personal computers. The implementation analysed in fact exhibited negative scalability: the execution time of the algorithm increases linearly with the number of processors. These results were obtained by running the Parallel BGL PageRank on a cluster of 43 dual-core PCs with 2 GB of RAM each, using several graphs chosen so as to make it easier to identify the variables that influence scalability. Graphs generated from different models gave different results, showing that there is a relationship between the clustering coefficient and the slope of the line representing execution time as a function of the number of processors. For example, Erdős–Rényi graphs, which have a low clustering coefficient, represented the worst case in the PageRank tests, while Small-World graphs, which have a high clustering coefficient, represented the best case. The size of the graph also showed a particularly interesting influence on the execution time: it was shown that the relationship between the number of nodes and the number of edges determines the total running time.
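As a single-process illustration of the Bulk Synchronous Parallel pattern discussed above (this is not the Parallel BGL code that was benchmarked), each PageRank iteration can be viewed as one superstep: local computation, a message exchange, then a barrier before the ranks are updated.

def bsp_pagerank(out_edges, n_iters=20, d=0.85):
    """out_edges: dict mapping vertex -> list of successor vertices."""
    vertices = list(out_edges)
    rank = {v: 1.0 / len(vertices) for v in vertices}
    for _ in range(n_iters):                       # one superstep per iteration
        # local compute phase: every vertex prepares messages for its successors
        inbox = {v: [] for v in vertices}
        for v, succs in out_edges.items():
            if succs:
                share = rank[v] / len(succs)
                for w in succs:
                    inbox[w].append(share)          # "communication" phase
        # barrier, then update ranks with the received contributions
        rank = {v: (1 - d) / len(vertices) + d * sum(inbox[v]) for v in vertices}
    return rank

if __name__ == "__main__":
    g = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}
    print(bsp_pagerank(g))

On a real cluster the message exchange in the middle becomes traffic between machines, and the BSP barrier forces every processor to wait for it before the next superstep can begin.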

Relevance: 40.00%

Abstract:

Summary of the PhD thesis of Jan Pollmann: This thesis focuses on global-scale measurements of light, reactive non-methane hydrocarbons (NMHC) in the volatility range from ethane to toluene, with a special focus on ethane, propane, isobutane, butane, isopentane and pentane. Even though they occur only at the ppt level (pmol mol-1) in the remote troposphere, these species can yield insight into key atmospheric processes. An analytical method was developed and subsequently evaluated to analyze NMHC from the NOAA ESRL cooperative air sampling network. Potential analytical interferences from other atmospheric trace gases (water vapor and ozone) were carefully examined. The analytical parameters accuracy and precision were analyzed in detail, and it was shown that more than 90% of the data points meet the Global Atmosphere Watch (GAW) data quality objective. Trace gas measurements from 28 measurement stations were used to derive the global atmospheric distribution profile of four NMHC (ethane, propane, isobutane, butane). A close comparison of the derived ethane data with previously published reports showed that the northern hemispheric ethane background mixing ratio has declined by approximately 30% since 1990; no such change was observed for southern hemispheric ethane. The NMHC data and trace gas data supplied by NOAA ESRL were used to estimate local, diurnally averaged hydroxyl radical (OH) mixing ratios by variability analysis. The variability-derived OH was found to be in good agreement with directly measured and modelled OH mixing ratios outside the tropics; tropical OH was on average two times higher than predicted by the model. Variability analysis was also used to assess the effect of chlorine radicals on atmospheric oxidation chemistry, and it was found that Cl is probably not of significant relevance on a global scale.
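One commonly used form of the variability-lifetime relationship behind such an OH estimate, in notation assumed here (the thesis may use a different parameterisation), relates the standard deviation of the log mixing ratio of each hydrocarbon X to its local lifetime against OH:

\sigma_{\ln X} \;=\; A\,\tau_{X}^{-b},
\qquad
\tau_{X} \;\approx\; \frac{1}{k_{\mathrm{OH}+X}\,[\mathrm{OH}]},

so that, with measured rate constants k_{OH+X} for several species, the value of [OH] that best aligns the observed variabilities with this power law provides the estimate.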

Relevance: 40.00%

Abstract:

Drawing on the social network analysis paradigm, this study describes the personal support networks and the social capital of a sample of 80 Italians after a long-term residential therapeutic treatment for drug addiction. After identifying the profiles of the respondents' social support networks, we first measured and compared the ego-centred support networks of drug-free and relapsed subjects, and then investigated the network characteristics and the forms of social capital, closure and brokerage, that contribute to maintaining abstinence or to the risk of relapse after treatment. Subjective factors, such as perceived public discrimination and attitude towards work, were also explored in order to investigate their correlation with renewed substance use. The results show that a lower risk of relapse is positively associated with a stronger attitude towards work, a lower perception of discrimination by society, having support members of higher socio-economic status who mobilise reputational resources, and having networks that are more heterogeneous in occupation and characterised by higher levels of reciprocity. Moreover, brokerage-type social capital contributes to maintaining abstinence, since it gives the subject access to less homogeneous information and exposes them to more numerous and differentiated opportunities. The results of the study therefore demonstrate the important role of personal support networks in preventing or reducing the risk of relapse after treatment, in line with previous research suggesting their incorporation into therapeutic programmes for drug addicts.

Relevance: 40.00%

Abstract:

PURPOSE: There is a need for valid and reliable short scales that can be used to assess social networks and social supports and to screen for social isolation in older persons. DESIGN AND METHODS: The present study is a cross-national and cross-cultural evaluation of the performance of an abbreviated version of the Lubben Social Network Scale (LSNS-6), which was used to screen for social isolation among community-dwelling older adult populations in three European countries. Based on the concept of lack of redundancy of social ties, we defined clinical cut-points of the LSNS-6 for identifying persons deemed at risk for social isolation. RESULTS: Among all three samples, the LSNS-6 and two subscales (Family and Friends) demonstrated high levels of internal consistency, stable factor structures, and high correlations with criterion variables. The proposed clinical cut-points showed good convergent validity, and classified 20% of the respondents in Hamburg, 11% of those in Solothurn (Switzerland), and 15% of those in London as at risk for social isolation. IMPLICATIONS: We conclude that abbreviated scales such as the LSNS-6 should be considered for inclusion in practice protocols of gerontological practitioners. Screening older persons based on the LSNS-6 provides quantitative information on their family and friendship ties, and identifies persons at increased risk for social isolation who might benefit from in-depth assessment and targeted interventions.

Relevance: 40.00%

Abstract:

Rationale: Focal onset epileptic seizures are due to abnormal interactions between distributed brain areas. By estimating the cross-correlation matrix of multi-site intra-cerebral EEG recordings (iEEG), one can quantify these interactions. To assess the topology of the underlying functional network, the binary connectivity matrix has to be derived from the cross-correlation matrix by use of a threshold. Classically, a unique threshold is used, which constrains the topology [1]. Our method aims to set the threshold in a data-driven way by separating genuine from random cross-correlation. We compare our approach to the fixed-threshold method and study the dynamics of the functional topology.

Methods: We investigate the iEEG of patients suffering from focal onset seizures who underwent evaluation for the possibility of surgery. The equal-time cross-correlation matrices are evaluated using a sliding time window. We then compare three approaches for assessing the corresponding binary networks. For each time window:

* Our parameter-free method derives from the cross-correlation strength matrix (CCS) [2]. It aims at disentangling genuine from random correlations (due to the finite length and varying frequency content of the signals). In practice, a threshold is evaluated for each pair of channels independently, in a data-driven way.
* The fixed mean degree (FMD) approach uses a unique threshold on the whole connectivity matrix so as to ensure a user-defined mean degree.
* The varying mean degree (VMD) approach uses the mean degree of the CCS network to set a unique threshold for the entire connectivity matrix.
* Finally, the connectivity (c), the connectedness (given by k, the number of disconnected sub-networks), and the mean global and local efficiencies (Eg and El, respectively) are computed from the FMD, CCS and VMD networks and from their corresponding random and lattice networks.

Results: Compared to FMD and VMD, CCS networks present:

* topologies that differ in terms of c, k, Eg and El;
* from the pre-ictal to the ictal and then post-ictal period, time courses of the topological features that are more stable within a period and more contrasted from one period to the next.

For CCS, pre-ictal connectivity is low, increases to a high level during the seizure, then decreases at seizure offset. k shows a "U-curve" underlining the synchronization of all electrodes during the seizure. The Eg and El time courses fluctuate between the values of the corresponding random and lattice networks in a reproducible manner.

Conclusions: The definition of a data-driven threshold provides new insights into the topology of the epileptic functional networks.
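A rough sketch of the contrast between the two thresholding strategies, using a per-pair surrogate threshold (random circular shifts) only as a stand-in for the CCS estimate of chance-level correlation, which is more involved:

import numpy as np

def corr_matrix(x):
    return np.corrcoef(x)                          # x: channels x samples

def fixed_degree_network(c, mean_degree):
    """Keep the strongest links until the requested mean degree is reached."""
    n = c.shape[0]
    upper = np.abs(np.triu(c, 1))
    n_links = int(round(mean_degree * n / 2))
    thr = np.sort(upper[upper > 0])[::-1][n_links - 1]
    return (np.abs(c) >= thr) & ~np.eye(n, dtype=bool)

def per_pair_threshold_network(x, n_surrogates=50, alpha=0.95, seed=None):
    """Pairwise threshold = alpha-quantile of correlations between shifted copies."""
    rng = np.random.default_rng(seed)
    n, t = x.shape
    c = np.abs(corr_matrix(x))
    null = np.zeros((n_surrogates, n, n))
    for s in range(n_surrogates):
        shifted = np.array([np.roll(row, rng.integers(1, t)) for row in x])
        null[s] = np.abs(np.corrcoef(shifted))
    thr = np.quantile(null, alpha, axis=0)         # one threshold per channel pair
    return (c > thr) & ~np.eye(n, dtype=bool)

if __name__ == "__main__":
    data = np.random.default_rng(0).standard_normal((8, 500))   # 8 channels, 500 samples
    print(fixed_degree_network(corr_matrix(data), mean_degree=3).sum() // 2, "links (fixed degree)")
    print(per_pair_threshold_network(data).sum() // 2, "links (per-pair threshold)")

The fixed-degree variant imposes the same number of links on every window by construction, whereas the per-pair variant lets the data decide how many correlations exceed chance level, which is the property exploited above.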

Relevance: 40.00%

Abstract:

This article examines social network users’ legal defences against content removal under the EU and ECHR frameworks, and their implications for the effective exercise of free speech online. A review of the Terms of Use and content moderation policies of two major social network services, Facebook and Twitter, shows that end users are unlikely to have a contractual defence against content removal. Under the EU and ECHR frameworks, they may demand the observance of free speech principles in state-issued blocking orders and their implementation by intermediaries, but they cannot invoke this ‘fair balance’ test against voluntary removal decisions by the social network service. Drawing on practical examples, this article explores the threat to free speech created by this lack of accountability. Firstly, a shift from legislative regulation and formal injunctions to public-private collaborations allows state authorities to influence these ostensibly voluntary policies, thereby circumventing constitutional safeguards. Secondly, even absent state interference, the commercial incentives of social media cannot be guaranteed to coincide with democratic ideals. In light of the blurring of public and private functions in the regulation of social media expression, this article calls for increased accountability of social media services towards end users regarding the observance of free speech principles.