856 results for Network-based routing
Abstract:
Today's communication networks consist of numerous interdependent network components. To manage these networks and to ensure their reliable and efficient operation in the face of increasing customer usability demands, the service provider requires extensive network management tools. The goal of this study was to integrate the Next Generation Network (NGN) providing VoIP services into a performance-oriented network management system. This study focuses only on the NGN network, and the project was carried out as an assignment for the Network Operations Center of Elisa Corporation. The theoretical part of this study introduces the network environment of the Elisa NGN platform: its components, the signalling protocols it uses, and other exploitable communication protocols. In addition, the Simple Network Management Protocol (SNMP) is closely examined, since it is commonly used as the basis of IP (Internet Protocol) network management. Some primary applications enabled by NGN technology are also introduced. The empirical part of this study contains a short overview of the implemented network performance management system and its properties. The most crucial monitored MIB modules, SNMP parameters and implemented performance measurements are described. The trap topology and the role of traps in managing the NGN platform are considered, and finally conclusions are drawn from these analyses, together with suggestions for future improvements.
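Since the abstract centres on SNMP-based monitoring of MIB parameters, the following is a minimal polling sketch, assuming the pysnmp library; the host, community string and monitored object are illustrative placeholders, not details from the thesis.

```python
# Minimal SNMP polling sketch (assumes the pysnmp library; host, community
# string and the polled object are illustrative placeholders).
from pysnmp.hlapi import (
    getCmd, SnmpEngine, CommunityData, UdpTransportTarget,
    ContextData, ObjectType, ObjectIdentity,
)

def poll_sysuptime(host: str, community: str = "public"):
    """Read SNMPv2-MIB::sysUpTime.0 from a managed network component."""
    error_indication, error_status, error_index, var_binds = next(
        getCmd(
            SnmpEngine(),
            CommunityData(community, mpModel=1),   # SNMP v2c
            UdpTransportTarget((host, 161)),
            ContextData(),
            ObjectType(ObjectIdentity("SNMPv2-MIB", "sysUpTime", 0)),
        )
    )
    if error_indication or error_status:
        raise RuntimeError(error_indication or error_status.prettyPrint())
    return var_binds[0]  # (OID, value) pair

# Example: print(poll_sysuptime("10.0.0.1"))
```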
Abstract:
Defects in FAM161A, a protein of unknown function localized at the cilium of retinal photoreceptor cells, cause retinitis pigmentosa, a form of hereditary blindness. By using different fragments of this protein as baits to screen cDNA libraries of human and bovine retinas, we defined a yeast two-hybrid-based FAM161A interactome, identifying 53 bona fide partners. In addition to statistically significant enrichment in ciliary proteins, as expected, this interactome revealed a substantial bias towards proteins from the Golgi apparatus, the centrosome and the microtubule network. Validation of interaction with key partners by co-immunoprecipitation and proximity ligation assay confirmed that FAM161A is a member of the recently recognized Golgi-centrosomal interactome, a network of proteins interconnecting Golgi maintenance, intracellular transport and centrosome organization. Notable FAM161A interactors included AKAP9, FIP3, GOLGA3, KIFC3, KLC2, PDE4DIP, NIN and TRIP11. Furthermore, analysis of FAM161A localization during the cell cycle revealed that this protein followed the centrosome during all stages of mitosis, likely reflecting a specific compartmentalization related to its role at the ciliary basal body during the G0 phase. Altogether, these findings suggest that FAM161A's activities are probably not limited to ciliary tasks but also extend to more general cellular functions, highlighting possible novel mechanisms for the molecular pathology of retinal disease.
Abstract:
This paper describes Question Waves, an algorithm that can be applied to social search protocols such as Asknext or Sixearch. In this model, queries are propagated through the social network, with faster propagation through more trusted acquaintances. Question Waves uses local information to make decisions and to obtain an answer ranking. With Question Waves, the answers that arrive first are the most likely to be relevant, and we computed the correlation of answer relevance with order of arrival to demonstrate this result. We obtained correlations equivalent to those of heuristics that use global knowledge, such as profile similarity among users or the expertise value of an agent. Because Question Waves is compatible with the social search protocol Asknext, it is possible to stop a search once enough relevant answers have been found; additionally, stopping the search early introduces only a minimal risk of missing the best possible answer. Furthermore, Question Waves does not require a re-ranking algorithm because the results arrive already sorted.
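The trust-weighted propagation idea described above can be illustrated with a small sketch; the delay model (per-hop delay inversely proportional to trust) and the toy data structures are assumptions for illustration, not the paper's actual Question Waves implementation.

```python
# Sketch of trust-weighted query propagation: higher trust -> lower per-hop
# delay -> earlier arrival of the query (and hence of the answer).
import heapq

def propagate(graph, trust, asker, answerers):
    """Return answerers ordered by the arrival time of the query.

    graph: {node: [neighbours]}, trust: {(u, v): value in (0, 1]}.
    """
    arrival = {asker: 0.0}
    queue = [(0.0, asker)]
    while queue:
        t, u = heapq.heappop(queue)
        if t > arrival.get(u, float("inf")):
            continue
        for v in graph.get(u, []):
            t_v = t + 1.0 / trust[(u, v)]        # faster through trusted links
            if t_v < arrival.get(v, float("inf")):
                arrival[v] = t_v
                heapq.heappush(queue, (t_v, v))
    # Answers come back roughly sorted by query arrival time at each agent.
    return sorted((a for a in answerers if a in arrival), key=arrival.get)
```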
Abstract:
Defining digital humanities might be an endless debate if we stick to discussing the boundaries of this concept as an academic "discipline". In an attempt to identify this field and its actors concretely, this paper shows that it is possible to analyse them through Twitter, a social media platform widely used by this "community of practice". Based on a network analysis of 2,500 users identified as members of this movement, the visualisation of the "who's following who?" graph allows us to highlight the structure of the network's relationships and to identify users who occupy distinctive positions. Specifically, we show that linguistic groups are key factors in explaining clustering within a network whose characteristics resemble those of a small world.
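As a rough illustration of the small-world characterisation mentioned above, the sketch below computes a clustering coefficient and an average path length with the networkx library; the tiny follower list is a placeholder, not the 2,500-user dataset analysed in the paper.

```python
# Small-world indicators on a toy "who's following who?" graph (illustrative).
import networkx as nx

follows = [("alice", "bob"), ("bob", "carol"), ("carol", "alice"),
           ("carol", "dave"), ("dave", "alice")]
G = nx.DiGraph(follows).to_undirected()

print("clustering coefficient:", nx.average_clustering(G))
print("average path length:", nx.average_shortest_path_length(G))
# High clustering combined with a short average path length, relative to a
# random graph of the same size, is the usual small-world signature.
```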
Abstract:
Myeloid malignancies (MMs) are a heterogeneous group of hematologic malignancies with differing incidence, prognosis and survival.1–3 Changing classifications (FAB 1994, WHO 2001 and WHO 2008) and the scarcity of available epidemiological data complicate incidence comparisons.4,5 Taking this into account, the aims of the present study were: a) to calculate the incidence rates and trends of MMs in the Province of Girona, northeastern Spain, between 1994 and 2008 according to the WHO 2001 classification; and b) to predict the number of MM cases in Spain during 2013. Data were extracted from the population-based Girona Cancer Registry (GCR), located in the north-east of Catalonia, Spain, and covering a population of 731,864 inhabitants (2008 census). Cases were registered according to the rules of the European Network of Cancer Registries and the Manual for Coding and Reporting Haematological Malignancies (HAEMACARE project). To ensure complete coverage of MMs in the GCR, and especially of myeloproliferative neoplasms (MPN) and myelodysplastic syndromes (MDS), a retrospective search was performed. The ICD-O-2 (1990) codes were converted into their corresponding ICD-O-3 (2000) codes, including MDS, polycythemia vera (PV) and essential thrombocythemia (ET) as malignant diseases. Crude rates (CR) and European standardized incidence rates (ASRE) are expressed per 100,000 inhabitants/year.
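As a rough illustration of the rate calculations mentioned above, the sketch below computes a crude rate and a generic age-standardized rate; the age bands, weights and counts are placeholders, not the European standard population table or the GCR data.

```python
# Illustrative crude and age-standardized incidence rate calculations.
def crude_rate(cases: int, population: int, per: int = 100_000) -> float:
    """Cases per `per` person-years."""
    return cases / population * per

def age_standardized_rate(cases_by_age, pop_by_age, std_weights, per=100_000):
    """ASR = sum_i w_i * (cases_i / pop_i) * per, with weights normalised to 1."""
    total_w = sum(std_weights)
    return per * sum(
        (c / p) * (w / total_w)
        for c, p, w in zip(cases_by_age, pop_by_age, std_weights)
    )

# Example with made-up numbers:
# age_standardized_rate([3, 8, 20], [120_000, 90_000, 40_000], [0.5, 0.3, 0.2])
```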
Abstract:
Contemporary public administrations have become increasingly complex, having to coordinate actions with emerging actors in both the public and the private spheres. In this scenario, modern ICTs have come to be seen as an ideal vehicle for resolving some of the problems of public administration. We argue that there is a clear need to explore the extent to which public administrations are undergoing a process of transformation towards network government, linked to the systematic incorporation of ICTs into their basic activities. Through a critical analysis of a selection of e-government evaluation reports, we conclude that further research is needed if we are to build a solid government assessment framework based on the characteristics of network-like organisation.
Abstract:
The presence of e-portfolios in educational centres, companies and administrations has grown strongly in recent years, giving rise to very different practices stemming from different objectives and purposes. This situation has led researchers and practitioners to design and implement e-portfolios with little reference to previous knowledge of them; consequently, developments are disparate, and many of the processes and dimensions involved in both development and use are unnecessarily complex. In order to minimize these inconveniences, unify developmental processes and improve the results of the implementation and use of e-portfolios, it seemed necessary to create a network of researchers, teachers and trainers from different universities and institutions of various kinds who are interested in the research and practice of e-portfolios in Spain. Therefore, the Network on e-portfolio was created in 2006, funded by the Spanish Ministry of Education and led by the Universitat Oberta de Catalunya. Besides the goals associated with the creation of this network, which we want to share with other European researchers and experts from other continents, this paper also presents data from the first study carried out on the use of e-portfolios in our country, showing where we are and which trends are the most important for the near future.
Abstract:
Wavelength division multiplexing (WDM) networks have been adopted as a near-future solution for the broadband Internet. In previous work we proposed a new architecture, named enhanced grooming (G+), that extends the capabilities of traditional optical routes (lightpaths). In this paper, we compare the operational expenditures incurred by routing a set of demands using lightpaths with those incurred using lighttours. The comparison is done by solving an integer linear programming (ILP) problem based on a path formulation. Results show that, under the assumption of single-hop routing, almost 15% of the operational cost can be saved with our architecture. In multi-hop routing, the operational cost is reduced by 7.1%, and at the same time the ratio of operational cost to the number of optical-electro-optical conversions is lower for our architecture. This means that ISPs could provide the same satisfaction in terms of delay to the end-user with a lower investment in the network architecture.
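As an illustration of a path-formulation ILP of the kind described above, the toy model below assumes the PuLP library; the demands, candidate paths, costs and capacities are invented and it does not reproduce the paper's lightpath/lighttour formulation.

```python
# Toy path-formulation ILP: pick one candidate path per demand at minimum
# cost, subject to link capacities (illustrative data only).
import pulp

demands = {"d1": ["p1", "p2"], "d2": ["p3"]}            # candidate paths per demand
path_links = {"p1": ["e1"], "p2": ["e2", "e3"], "p3": ["e1", "e3"]}
path_cost = {"p1": 3.0, "p2": 5.0, "p3": 4.0}
capacity = {"e1": 1, "e2": 1, "e3": 2}

prob = pulp.LpProblem("routing", pulp.LpMinimize)
x = {(d, p): pulp.LpVariable(f"x_{d}_{p}", cat="Binary")
     for d, paths in demands.items() for p in paths}

prob += pulp.lpSum(path_cost[p] * x[d, p] for d, p in x)          # operational cost
for d, paths in demands.items():                                   # route every demand
    prob += pulp.lpSum(x[d, p] for p in paths) == 1
for e, cap in capacity.items():                                    # link capacities
    prob += pulp.lpSum(x[d, p] for d, p in x if e in path_links[p]) <= cap

prob.solve()
chosen = [(d, p) for (d, p), var in x.items() if var.value() == 1]
```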
Abstract:
The identification of biomarkers of vascular cognitive impairment is urgent for its early diagnosis. The aim of this study was to detect and monitor changes in brain structure and connectivity, and to correlate them with the decline in executive function. We examined the feasibility of early diagnostic magnetic resonance imaging (MRI) to predict cognitive impairment before onset in an animal model of chronic hypertension: the Spontaneously Hypertensive Rat. Cognitive performance was tested in an operant conditioning paradigm that evaluated learning, memory, and behavioral flexibility skills. Behavioral tests were coupled with longitudinal diffusion weighted imaging acquired with 126 diffusion gradient directions and 0.3 mm³ isometric resolution at 10, 14, 18, 22, 26, and 40 weeks after birth. Diffusion weighted imaging was analyzed in two different ways: by regional characterization of diffusion tensor imaging (DTI) indices, and by assessing changes in structural brain network organization based on Q-Ball tractography. Already at the earliest evaluated time points, DTI scalar maps revealed significant differences in many regions, suggesting loss of integrity in the white and gray matter of spontaneously hypertensive rats compared to normotensive control rats. In addition, graph theory analysis of the structural brain network demonstrated a significant decrease in hierarchical modularity and in global and local efficiency, with predictive value as shown by a regional three-fold cross-validation study. Moreover, these decreases were significantly correlated with the behavioral performance deficits observed at subsequent time points, suggesting that diffusion weighted imaging and connectivity studies can reveal neuroimaging alterations even before overt signs of cognitive impairment become apparent.
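As an illustration of the graph-theoretical measures mentioned above, the sketch below computes global efficiency, local efficiency and modularity with the networkx library; the random graph merely stands in for a tractography-derived structural connectivity network.

```python
# Graph-theory measures on a placeholder network (illustrative only).
import networkx as nx
from networkx.algorithms import community

G = nx.erdos_renyi_graph(90, 0.1, seed=0)   # stand-in for a brain network

global_eff = nx.global_efficiency(G)
local_eff = nx.local_efficiency(G)
parts = community.greedy_modularity_communities(G)
modularity = community.modularity(G, parts)

print(global_eff, local_eff, modularity)
```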
Abstract:
Technology scaling has proceeded into dimensions in which the reliability of manufactured devices is becoming endangered. The decrease in reliability is a consequence of physical limitations, the relative increase of variations, and decreasing noise margins, among other factors. A promising solution for bringing the reliability of circuits back to a desired level is the use of design methods that introduce tolerance against possible faults in an integrated circuit. This thesis studies and presents fault tolerance methods for the network-on-chip (NoC), a design paradigm targeted at very large systems-on-chip. In a NoC, resources such as processors and memories are connected to a communication network, comparable to the Internet. Fault tolerance in such a system can be achieved at many abstraction levels. The thesis studies the origin of faults in modern technologies and explains their classification into transient, intermittent and permanent faults. A survey of fault tolerance methods is presented to demonstrate the diversity of available methods. Networks-on-chip are approached by exploring their main design choices: the selection of a topology, routing protocol, and flow control method. Fault tolerance methods for NoCs are studied at different layers of the OSI reference model. The data link layer provides a reliable communication link over a physical channel. Error control coding is an efficient fault tolerance method at this abstraction level, especially against transient faults. Error control coding methods suitable for on-chip communication are studied and their implementations presented. Error control coding loses its effectiveness in the presence of intermittent and permanent faults; therefore, other solutions against them are presented. The introduction of spare wires and split transmissions is shown to provide good tolerance against intermittent and permanent errors, and their combination with error control coding is illustrated. At the network layer, positioned above the data link layer, fault tolerance can be achieved through the design of fault tolerant network topologies and routing algorithms. Both of these approaches are presented in the thesis, together with realizations in both categories. The thesis concludes that an optimal fault tolerance solution contains carefully co-designed elements from different abstraction levels.
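As one concrete example of the error control coding discussed above, the following is a minimal Hamming(7,4) single-error-correcting encoder/decoder; it is illustrative only and not one of the specific codes evaluated in the thesis.

```python
# Hamming(7,4): corrects any single-bit error in a 7-bit codeword (illustrative).
def hamming74_encode(d):                     # d: 4 data bits [d1, d2, d3, d4]
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4                        # parity over positions 1,3,5,7
    p2 = d1 ^ d3 ^ d4                        # parity over positions 2,3,6,7
    p3 = d2 ^ d3 ^ d4                        # parity over positions 4,5,6,7
    return [p1, p2, d1, p3, d2, d3, d4]      # codeword positions 1..7

def hamming74_decode(c):                     # c: 7 received bits
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3          # points at the flipped position (1..7)
    if syndrome:
        c = c[:]
        c[syndrome - 1] ^= 1                 # correct a single-bit error
    return [c[2], c[4], c[5], c[6]]          # recovered data bits
```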
Abstract:
The aim of this thesis is to study and understand the theoretical concept of the Metanational corporation and to understand how Web 2.0 technologies can be used to support the theory. The empirical part of the study compares the theory with the case company's current situation. The goal of the theoretical framework is to show how Web 2.0 technologies can be used on the three levels of the Metanational corporation. In order to do this, knowledge management, and more specifically knowledge transfer, is studied to understand what is needed from Web 2.0 technologies in the different functions and operations of the Metanational corporation. The final synthesis of the theoretical framework is a model in which the Web 2.0 technologies are mapped onto the levels of the Metanational corporation. The empirical part of the study is based on interviews conducted in the case company. The aim of the interviews is to understand the current state of the company in relation to the theoretical framework. Based on the interviews, the differences between the theoretical concept and the case company are presented and studied. Finally, based on the interviews and the theoretical framework, the study presents the identified problem areas and the areas where the adoption of Web 2.0 tools is seen as beneficial.