951 results for Local and Wide Area Network


Relevance: 100.00%

Abstract:

Resource monitoring in distributed systems is required to understand the 'health' of the overall system and to help identify particular problems, such as dysfunctional hardware or faulty system or application software. Monitoring systems such as GridRM can connect to any number of different types of monitoring agents and provide different views of the system, based on a client's particular preferences. Web 2.0 technologies, and in particular 'mashups', are emerging as a promising technique for rapidly constructing rich user interfaces that combine and present data in intuitive ways. This paper describes a Web 2.0 user interface that was created to expose resource data harvested by the GridRM resource monitoring system.
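The core of a mashup back end of this kind is a small aggregation step: records harvested from heterogeneous monitoring agents are merged into one view that a web front end can render. A minimal sketch, with entirely hypothetical agent feeds and field names (the paper does not specify GridRM's data format):

```python
# Minimal mashup-style aggregation sketch (hypothetical data shapes):
# merge resource records harvested from two monitoring agents into a
# single view keyed by resource name.

def merge_agent_views(*agent_feeds):
    """Combine per-agent resource records into one dict keyed by resource."""
    combined = {}
    for feed in agent_feeds:
        for record in feed:
            entry = combined.setdefault(record["resource"], {})
            # Merge every metric except the key itself.
            entry.update({k: v for k, v in record.items() if k != "resource"})
    return combined

# Two illustrative feeds, as might come from different agent types.
ganglia_feed = [{"resource": "node01", "cpu_load": 0.42}]
snmp_feed = [{"resource": "node01", "net_in_kbps": 310.0}]

view = merge_agent_views(ganglia_feed, snmp_feed)
print(view["node01"])  # both metrics now appear in one record
```

A real mashup would expose `view` as JSON to the browser; the merging logic is the same.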


Relevance: 100.00%

Abstract:

In a distributed environment, remote entities, usually the producers or consumers of services, need a means to publish their existence so that clients needing their services can find the appropriate ones and then interact with them directly. The publication of information is via a registry service, and the interaction is via a high-level messaging service. Typically, separate libraries provide these two services. Tycho is an implementation of a wide-area asynchronous messaging framework with an integrated distributed registry. This frees developers from the need to assemble their applications from a range of potentially diverse middleware offerings, which should simplify and speed application development and, more importantly, allow developers to concentrate on their own domain of expertise. In the first part of the paper we outline our motivation for producing Tycho and then review a number of registry and messaging systems popular with the Grid community. In the second part we describe the architecture and implementation of Tycho. In the third part we present and discuss various performance tests that were undertaken to compare Tycho with similar alternative systems. Finally, we summarise, conclude and outline future work.
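The publish/lookup/interact pattern described above can be sketched in a few lines. This is an illustrative toy, not Tycho's actual API; the class and method names are invented for the example:

```python
# Illustrative sketch (not Tycho's real API): a producer publishes its
# endpoint in a registry; a consumer looks it up and then messages it directly.

class Registry:
    """In-memory stand-in for a distributed registry service."""
    def __init__(self):
        self._entries = {}

    def publish(self, name, endpoint):
        self._entries[name] = endpoint

    def lookup(self, name):
        return self._entries.get(name)

class Endpoint:
    """Stands in for an asynchronous message queue owned by one entity."""
    def __init__(self):
        self.inbox = []

    def deliver(self, message):
        self.inbox.append(message)

registry = Registry()
producer = Endpoint()
registry.publish("temperature-service", producer)

# Consumer side: discover via the registry, then interact directly.
target = registry.lookup("temperature-service")
target.deliver({"request": "current-reading"})
print(len(producer.inbox))  # 1
```

The point of integrating the two services, as Tycho does, is that the registry handle and the messaging handle come from one framework rather than two separately assembled libraries.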

Relevance: 100.00%

Abstract:

This research details methods to improve upon current worst-case message response time analysis of CAN networks. In addition, by developing a CAN network model and using modern simulation software, it demonstrates methods that provide more realistic analyses of both sporadic and periodic messages on CAN networks prior to implementation.
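For context, the classical worst-case response time analysis for CAN (in the Tindell/Davis tradition) is a fixed-point iteration: a frame waits out blocking by one lower-priority frame already on the bus plus interference from all higher-priority frames, since CAN arbitration is non-preemptive per frame. A simplified sketch (no queuing jitter, divergence check omitted, times in ms):

```python
import math

# Classical CAN worst-case response time iteration, simplified:
#   R_i = w_i + C_i, where the queuing delay w_i satisfies
#   w_i = B_i + sum over higher-priority j of ceil((w_i + tau_bit) / T_j) * C_j

def can_wcrt(c_i, b_i, higher_prio, tau_bit=0.001):
    """c_i: transmission time of the frame under analysis;
    b_i: blocking by the longest lower-priority frame already transmitting;
    higher_prio: list of (period, transmission time) pairs.
    A real implementation would also abort once w exceeds the deadline."""
    w = b_i
    while True:
        w_next = b_i + sum(
            math.ceil((w + tau_bit) / t_j) * c_j for t_j, c_j in higher_prio
        )
        if w_next == w:          # fixed point reached
            return w + c_i
        w = w_next

# Example: one higher-priority frame (period 5 ms, takes 1 ms), 1 ms blocking.
print(can_wcrt(c_i=1.0, b_i=1.0, higher_prio=[(5.0, 1.0)]))  # 3.0
```

Analyses of this form are exactly what simulation-based approaches refine, since the closed-form bound can be pessimistic for realistic sporadic traffic.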

Relevance: 100.00%

Abstract:

A wide variety of stressors elicit Fos expression in the medial prefrontal cortex (mPFC). No direct attempts, however, have been made to determine the role of the inputs that drive this response. We examined the effects of lesions of mPFC catecholamine terminals on local expression of Fos after exposure to air puff, a stimulus that in the rat acts as an acute psychological stressor. We also examined the effects of these lesions on Fos expression in a variety of subcortical neuronal populations implicated in the control of adrenocortical activation, one classic hallmark of the stress response. Lesions of the mPFC that were restricted to dopaminergic terminals significantly reduced the number of Fos-immunoreactive (Fos-IR) cells seen in the mPFC after air puff, but had no significant effect on stress-induced Fos expression in the subcortical structures examined. Lesions of the mPFC that affected both dopaminergic and noradrenergic terminals also reduced the number of Fos-IR cells observed in the mPFC after air puff. Additionally, these lesions resulted in a significant reduction in stress-induced Fos-IR in the ventral bed nucleus of the stria terminalis. These results demonstrate a role for catecholaminergic inputs to the mPFC in the generation of both local and subcortical responses to psychological stress.

Relevance: 100.00%

Abstract:

In this paper we consider two methods for automatically determining threshold values for edge maps. Rather than using statistical methods, both are based on the figural properties of the edges. First, we investigate applying an edge evaluation measure based on edge continuity and edge thinness to determine the threshold on edge strength. However, this technique is not valid when applied to edge detector outputs that are one pixel wide. In that case, we use a measure, based on work by Lowe, for assessing edges; it scores complete linked edge lists by their length and average strength.
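The Lowe-style measure for one-pixel-wide output can be sketched directly: each linked edge list (chain of edge points) gets a score from its length and average gradient strength, and chains scoring below a threshold are discarded. The score form and threshold value here are illustrative, not the paper's exact formulation:

```python
# Sketch of a length-and-average-strength significance measure for linked
# edge lists, in the spirit of the Lowe-based measure described above.
# Chains scoring below a (here arbitrary) threshold are discarded.

def chain_significance(chain):
    """chain: list of (x, y, strength) edge points produced by edge linking."""
    avg_strength = sum(s for _, _, s in chain) / len(chain)
    return len(chain) * avg_strength

def threshold_chains(chains, min_significance):
    return [c for c in chains if chain_significance(c) >= min_significance]

long_weak = [(x, 0, 10.0) for x in range(20)]    # 20 px, avg 10 -> score 200
short_strong = [(0, y, 60.0) for y in range(3)]  # 3 px, avg 60 -> score 180
kept = threshold_chains([long_weak, short_strong], min_significance=190)
print(len(kept))  # 1: only the long chain survives
```

Note how the measure favours a long coherent edge over a short bright fragment, which per-pixel strength thresholding alone cannot do.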

Relevance: 100.00%

Abstract:

This paper presents two different approaches to detect, locate, and characterize structural damage. Both techniques utilize electrical impedance in a first stage to locate the damaged area; in the second stage, either a neural network or an optimization technique is used to quantify the damage severity. The electrical impedance-based method, which utilizes the electromechanical coupling property of piezoelectric materials, has shown engineering feasibility in a variety of practical field applications. Relying on high-frequency structural excitations, this technique is very sensitive to minor structural changes in the near field of the piezoelectric sensors and is therefore able to detect damage at an early stage. Optimization approaches must be used in the case where a good condensed model is known, while a neural network can also be used to estimate the nature of the damage without prior knowledge of a model of the structure. The paper concludes with an experimental example on a welded cubic aluminum structure, used to verify the performance of the two proposed methodologies.
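The first-stage localization typically reduces to comparing the impedance signature near each sensor against a healthy baseline. The abstract does not name the metric used; a standard choice in impedance-based structural health monitoring is the root-mean-square deviation (RMSD), sketched here with made-up signature values:

```python
import math

# Common first-stage damage metric in impedance-based SHM (RMSD; assumed
# here, since the paper does not specify its exact metric): compare the
# impedance signature of the current state against a healthy baseline.

def rmsd_percent(z_healthy, z_damaged):
    """RMSD (%) between two real impedance signatures sampled at the same
    excitation frequencies; larger values indicate more local change."""
    num = sum((d - h) ** 2 for h, d in zip(z_healthy, z_damaged))
    den = sum(h ** 2 for h in z_healthy)
    return 100.0 * math.sqrt(num / den)

baseline = [100.0, 120.0, 90.0, 110.0]   # illustrative Re(Z) values (ohms)
measured = [100.0, 126.0, 90.0, 110.0]   # local shift near one frequency
print(round(rmsd_percent(baseline, measured), 2))  # ≈ 2.84
```

In a multi-sensor setup, the sensor with the highest metric flags the damaged region; severity quantification is then handed to the second stage.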

Relevance: 100.00%

Abstract:

An enhanced genetic algorithm (EGA) is applied to solve the long-term transmission expansion planning (LTTEP) problem. The proposed EGA has the following characteristics for solving the static and multistage LTTEP problem: (1) generation of an initial population using fast, efficient heuristic algorithms; (2) an improved implementation of the local improvement phase; and (3) efficient solution of the linear programming problems (LPs) involved. A critical comparative analysis is made between the proposed algorithm and traditional genetic algorithms. Results on some well-known test systems show that the proposed EGA is more efficient in solving the static and multistage LTTEP problem, solving a smaller number of linear programming problems to find the optimal solutions and thus finding a better solution to the multistage LTTEP problem. Copyright © 2012 Luis A. Gallego et al.
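Two of the listed enhancements, heuristic seeding of the initial population and a local improvement phase, can be illustrated on a toy 0/1 objective (the real LTTEP evaluation would solve an LP per candidate plan; everything here, including the objective, is a stand-in):

```python
import random

# Toy sketch of the EGA structure: (1) heuristic initial population instead
# of a random one, and (2) a local-improvement (hill-climbing) phase.
# The knapsack-style objective below merely stands in for plan evaluation.

random.seed(1)
WEIGHTS = [3, 5, 2, 7, 4]
VALUES = [4, 6, 3, 9, 5]
CAPACITY = 10

def fitness(plan):
    w = sum(wi for wi, g in zip(WEIGHTS, plan) if g)
    v = sum(vi for vi, g in zip(VALUES, plan) if g)
    return v if w <= CAPACITY else 0   # infeasible plans score 0

def heuristic_individual():
    """Greedy seed by value/weight ratio, then randomly drop one item
    to diversify the population."""
    order = sorted(range(len(WEIGHTS)), key=lambda i: -VALUES[i] / WEIGHTS[i])
    plan, w = [0] * len(WEIGHTS), 0
    for i in order:
        if w + WEIGHTS[i] <= CAPACITY:
            plan[i], w = 1, w + WEIGHTS[i]
    if random.random() < 0.5:
        plan[random.randrange(len(plan))] = 0
    return plan

def local_improvement(plan):
    """Single-bit-flip hill climbing until no flip helps."""
    improved = True
    while improved:
        improved = False
        for i in range(len(plan)):
            trial = plan[:]
            trial[i] = 1 - trial[i]
            if fitness(trial) > fitness(plan):
                plan, improved = trial, True
    return plan

population = [local_improvement(heuristic_individual()) for _ in range(6)]
best = max(population, key=fitness)
print(fitness(best))
```

The design point matches the abstract's claim: good seeds plus local improvement mean far fewer expensive evaluations (LPs, in the real problem) are needed before a high-quality plan appears.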

Relevance: 100.00%

Abstract:

To properly describe the interactions between the ocean and the atmosphere, it is necessary to assess phenomena on a variety of temporal and spatial scales. Here, high-resolution oceanographic and meteorological data collected during an observational campaign carried out aboard a ship in the tropical Atlantic Ocean on May 15-24, 2002, are used to describe the radiation balance at the ocean interface. Data collected by two PIRATA buoys along the equator, at 23°W and 35°W, together with satellite and climate data, are compared with the data obtained during the observational campaign. The comparison indicates remarkable similarity between daily and hourly values of the radiation flux components, a consequence of the temporal and spatial consistency of the air and water temperatures measured in situ and estimated from large-scale information. The discrepancy, mainly in the São Pedro and São Paulo Archipelago area, seems to be associated with local upwelling of cold water, which is not detected in any of the other estimates investigated here. More in situ data are necessary to clarify whether this upwelling flow has a larger-scale effect and what the meteorological and oceanographic implications of the local upwelling area are for the tropical waters off the Brazilian coast.

Relevance: 100.00%

Abstract:

Spatial patterns in assemblage structures are generated by ecological processes that occur on multiple scales. Identifying these processes is important for the prediction of impact, for restoration and for conservation of biodiversity. This study used a hierarchical sampling design to quantify variations in assemblage structures of Brazilian estuarine fish across 2 spatial scales and to reveal the ecological processes underlying the patterns observed. Eight areas separated by 0.7 to 25 km (local scale) were sampled in 5 estuaries separated by 970 to 6000 km (regional scale) along the coast, encompassing both tropical and subtropical regions. The assemblage structure varied significantly in terms of relative biomass and presence/absence of species on both scales, but the regional variation was greater than the local variation for either dataset. The 5 estuaries sampled segregated into 2 major groups largely congruent with the Brazilian and Argentinian biogeographic provinces. Three environmental variables (mean temperature of the coldest month, mangrove area and mean annual precipitation) and the distance between estuaries explained 44.8 and 16.3%, respectively, of the regional-scale variability in the species' relative biomass. At the local scale, the importance of environmental predictors for the spatial structure of the assemblages differed between estuarine systems. Overall, these results support the idea that on a regional scale, the composition of fish assemblages is simultaneously determined by environmental filters and species dispersal capacity, while on a local scale, the effect of environmental factors should vary depending on estuary-specific physical and hydrological characteristics. © 2013 Inter-Research.

Relevance: 100.00%

Abstract:

Computer and telecommunication networks are changing the world dramatically and will continue to do so in the foreseeable future. The Internet, primarily based on packet switches, provides very flexible data services such as e-mail and access to the World Wide Web. The Internet is a variable-delay, variable-bandwidth network that, in its initial phase, provided no guarantees on quality of service (QoS). New services are being added to the pure data-delivery framework of yesterday. Such high demands on capacity could lead to a “bandwidth crunch” at the core wide-area network, resulting in degradation of service quality. Fortunately, technological innovations have emerged which can provide relief to the end user and overcome the Internet’s well-known delay and bandwidth limitations. At the physical layer, a major overhaul of existing networks has been envisaged, from electronic media (e.g., twisted pair and cable) to optical fiber, in wide-area, metropolitan-area, and even local-area settings. In order to exploit the immense bandwidth potential of optical fiber, interesting multiplexing techniques have been developed over the years.

Relevance: 100.00%

Abstract:

This doctoral work gains deeper insight into the dynamics of knowledge flows within and across clusters, unfolding their features, directions and strategic implications. Alliances, networks and personnel mobility are acknowledged as the three main channels of inter-firm knowledge flows, thus offering three heterogeneous measures for analyzing the phenomenon. The interplay between the three channels, together with the richness of available research methods, has allowed for the elaboration of three different papers and perspectives. The common empirical setting is the IT cluster in Bangalore, chosen for its distinctive features as a high-tech cluster and for its steady double-digit yearly growth around the service-based business model. The first paper deploys both a firm-level and a tie-level analysis, exploring the cases of 4 domestic companies and of 2 MNCs active in the cluster, according to a cluster-based perspective. The distinction between business-domain knowledge and technical knowledge emerges from the qualitative evidence and is further confirmed by quantitative analyses at tie level. At firm level, the degree of specialization seems to influence the kind of knowledge shared, while at tie level both the frequency of interaction and the governance mode prove to determine differences in the distribution of knowledge flows. The second paper zooms out to consider inter-firm networks; focusing particularly on the role of the cluster boundary, internal and external networks are analyzed in terms of their size, long-term orientation and degree of exploration. The research method is purely qualitative and allows for the observation of the evolving strategic role of the internal network: from exploitation-based to exploration-based. Moreover, a causal pattern is emphasized, linking the evolution and features of the external network to those of the internal network. The final paper addresses the softer and more micro-level side of knowledge flows: personnel mobility.
A social capital perspective is developed here, which considers both employee acquisition and employee loss as building inter-firm ties, thus enhancing a company's overall social capital. Negative binomial regression analyses at dyad level test the significant impact of cluster affiliation (cluster firms vs non-cluster firms), industry affiliation (IT firms vs non-IT firms) and foreign affiliation (MNCs vs domestic firms) in shaping the uneven distribution of personnel mobility, and thus of knowledge flows, among companies.

Relevance: 100.00%

Abstract:

The Italian radio telescopes are currently undergoing a major upgrade in response to the growing demand for deep radio observations, such as surveys of large sky areas or observations of vast samples of compact radio sources. The optimised employment of the Italian antennas, originally constructed mainly for VLBI activities and provided with a control system (FS – Field System) not tailored to single-dish observations, required important modifications, in particular to the guiding software and data acquisition system. The production of a completely new control system called ESCS (Enhanced Single-dish Control System) for the Medicina dish started in 2007, in synergy with the software development for the forthcoming Sardinia Radio Telescope (SRT). The aim is to produce a system optimised for single-dish observations in continuum, spectrometry and polarimetry. ESCS is also planned to be installed at the Noto site. A substantial part of this thesis work consisted in designing and developing subsystems within ESCS, in order to provide the software with tools to carry out large maps, ranging from the implementation of On-The-Fly fast scans (following both conventional and innovative observing strategies) to the production of single-dish standard output files and the realisation of tools for the quick-look of the acquired data. The test period coincided with the commissioning phase of two devices temporarily installed on the Medicina antenna while waiting for the SRT to be completed: an 18-26 GHz 7-feed receiver and the 14-channel analogue backend developed for its use. It is worth stressing that this is the only K-band multi-feed receiver presently available worldwide. The commissioning of the overall hardware/software system constituted a considerable section of the thesis work.
Tests were carried out to verify the system's stability and capabilities, down to sensitivity levels that had never been reached at Medicina with the previous observing techniques and hardware devices. A further aim was to assess the scientific potential of the multi-feed receiver for the production of wide maps, exploiting its temporary availability on a mid-sized antenna. Dishes like the 32-m antennas at Medicina and Noto, in fact, offer the best conditions for large-area surveys, especially at high frequencies, as they provide a suitable compromise between beam sizes large enough to cover large areas of the sky quickly (typical of small telescopes) and sensitivity (typical of large telescopes). The KNoWS (K-band Northern Wide Survey) project is aimed at the realisation of a full-northern-sky survey at 21 GHz; its pilot observations, performed using the new ESCS tools and a peculiar observing strategy, constituted an ideal test-bed for ESCS itself and for the multi-feed/backend system. The KNoWS group, which I am part of, supported the commissioning activities, also providing map-making and source-extraction tools in order to complete the necessary data reduction pipeline and assess the system's general scientific capabilities. The K-band observations, which were carried out in several sessions over the December 2008-March 2010 period, were accompanied by the realisation of a 5 GHz test survey during the summertime, which is not suitable for high-frequency observations. This activity was conceived in order to check the new analogue backend separately from the multi-feed receiver, and to simultaneously produce original scientific data (the 6-cm Medicina Survey, 6MS, a polar cap survey to complete PMN-GB6 and provide an all-sky coverage at 5 GHz).

Relevance: 100.00%

Abstract:

The work was divided into three macro-areas. The first concerns a theoretical analysis of how intrusions work, of which software tools are used to carry them out, and of how to protect against them (using the devices generically known as firewalls). The second macro-area analyses an intrusion carried out from outside against sensitive servers of a LAN. This analysis is conducted on the files captured by the two network interfaces configured in promiscuous mode on a probe present in the LAN. There are two interfaces so that the probe can attach to two LAN segments with two different subnet masks. The attack is analysed by means of various software tools. Indeed, a third part of the work can be identified: the part where the files captured by the two interfaces are analysed, first with software that handles full-content data, such as Wireshark, then with software that handles session data, processed with Argus, and finally with the statistical data processed with Ntop. The penultimate chapter, the one before the conclusions, covers the installation of Nagios and its configuration for monitoring, through plugins, the remaining disk space on a remote agent machine and the MySql and DNS services. Naturally, Nagios can be configured to monitor any kind of service offered on the network.
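A Nagios setup of the kind described, remote disk space plus MySQL and DNS checks, is expressed as object definitions. A minimal sketch, with a hypothetical host name and address (the thesis does not give its actual configuration; `check_nrpe` assumes the NRPE agent is installed on the remote machine):

```
# Illustrative Nagios object definitions (hypothetical host and address).

define host {
    use        linux-server
    host_name  agent01
    address    192.168.1.50
}

# Remaining disk space, checked on the remote agent via the NRPE plugin.
define service {
    use                  generic-service
    host_name            agent01
    service_description  Disk Space
    check_command        check_nrpe!check_disk
}

define service {
    use                  generic-service
    host_name            agent01
    service_description  MySQL
    check_command        check_mysql
}

define service {
    use                  generic-service
    host_name            agent01
    service_description  DNS
    check_command        check_dns
}
```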