862 results for large-scale network
Abstract:
Virtualisation of cellular networks can be seen as a way to significantly reduce the complexity of the processes required today to provide reliable cellular networks. The Future Communication Architecture for Mobile Cloud Services: Mobile Cloud Networking (MCN) is an EU FP7 Large-scale Integrating Project (IP), funded by the European Commission, that focuses on cloud computing concepts to achieve virtualisation of cellular networks. It aims at the development of a fully cloud-based mobile communication and application platform; more specifically, it aims to investigate, implement and evaluate the technological foundations for the mobile communication system of Long Term Evolution (LTE), based on Mobile Network plus Decentralized Computing plus Smart Storage offered as one atomic service: on-demand, elastic and pay-as-you-go. This paper provides a brief overview of the MCN project and discusses the challenges that need to be solved.
Abstract:
Web-scale knowledge retrieval can be enabled by distributed information retrieval, clustering Web clients into a large-scale computing infrastructure for knowledge discovery from Web documents. Based on this infrastructure, we propose to apply semiotic (i.e., sub-syntactical) and inductive (i.e., probabilistic) methods for inferring concept associations in human knowledge. These associations can be combined to form a fuzzy (i.e., gradual) semantic net representing a map of the knowledge in the Web. We thus propose to provide interactive visualizations of these cognitive concept maps to end users, who can browse and search the Web through a human-oriented, visual and associative interface.
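The inductive inference of gradual concept associations can be illustrated with a minimal co-occurrence model. This is only a sketch under assumed toy data; the paper's actual semiotic methods are richer. Here the association of concept a with concept b is estimated as the conditional probability of seeing b in a document that contains a, yielding an asymmetric, fuzzy (gradual) semantic net:

```python
from collections import Counter
from itertools import permutations

def fuzzy_semantic_net(documents):
    """Estimate gradual concept associations P(b | a) from co-occurrence.

    Returns a dict mapping (a, b) -> association strength in [0, 1].
    """
    doc_freq = Counter()   # number of documents containing each concept
    pair_freq = Counter()  # number of documents containing both a and b
    for doc in documents:
        concepts = set(doc)
        doc_freq.update(concepts)
        pair_freq.update(permutations(concepts, 2))
    return {(a, b): pair_freq[(a, b)] / doc_freq[a]
            for (a, b) in pair_freq}

# Toy corpus of tokenized Web documents (illustrative data only)
docs = [
    ["cloud", "network", "virtualisation"],
    ["cloud", "storage"],
    ["network", "virtualisation"],
]
net = fuzzy_semantic_net(docs)
# "virtualisation" follows "network" in every document containing "network",
# so net[("network", "virtualisation")] is 1.0, while net[("cloud", "network")]
# is 0.5 -- the asymmetry is what makes the net a directed fuzzy graph.
```

A visualization layer, as proposed in the abstract, would then render edges with opacity proportional to these strengths.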
Abstract:
Microstructures and textures of calcite mylonites from the Morcles nappe large-scale shear zone in southwestern Switzerland develop principally as a function of (1) extrinsic physical parameters, including temperature, stress, strain and strain rate, and (2) intrinsic parameters, such as mineral composition. We collected rock samples at a single location from this shear zone, on which laboratory ultrasonic velocities, texture and microstructures were investigated and quantified. The samples had different concentrations of secondary mineral phases (< 5 up to 40 vol.%). Measured seismic P wave anisotropy ranges from 6.5% for polyphase mylonites (~ 40 vol.%) to 18.4% in mylonites with < 5 vol.% secondary phases. Texture strength of calcite is the main factor governing the seismic P wave anisotropy. Measured S wave splitting is generally highest in the foliation plane, but its origin is more difficult to explain solely by calcite texture. Additional texture measurements were made on calcite mylonites with low concentrations of secondary phases (≤ 10 vol.%) along the metamorphic gradient of the shear zone (15 km distance). A systematic increase in texture strength is observed moving from the frontal part of the shear zone (anchimetamorphism; 280 °C) to the higher temperature, basal part (greenschist facies; 350–400 °C). Calculated P wave velocities become increasingly anisotropic towards the high-strain part of the nappe, from an average of 5.8% in the frontal part to 13.2% in the root of the basal part. Secondary phases add further complexity and may act either to increase or decrease the seismic anisotropy of shear zone mylonites. In light of our findings we reinterpret the origin of some seismically reflective layers in the Grône–Zweisimmen line in southwestern Switzerland (PNR20 Swiss National Research Program). We hypothesize that reflections originate in part from the lateral variation in textural and microstructural arrangement of calcite mylonites in shear zones.
Abstract:
Surface temperature is a key aspect of weather and climate, but the term may refer to different quantities that play interconnected roles and are observed by different means. In a community-based activity in June 2012, the EarthTemp Network brought together 55 researchers from five continents to improve the interaction between scientific communities who focus on surface temperature in particular domains, to exploit the strengths of different observing systems and to better meet the needs of different communities. The workshop identified key needs for progress towards meeting scientific and societal requirements for surface temperature understanding and information, which are presented in this community paper. A "whole-Earth" perspective is required with more integrated, collaborative approaches to observing and understanding Earth's various surface temperatures. It is necessary to build understanding of the relationships between different surface temperatures, where presently inadequate, and undertake large-scale systematic intercomparisons. Datasets need to be easier to obtain and exploit for a wide constituency of users, with the differences and complementarities communicated in readily understood terms, and realistic and consistent uncertainty information provided. Steps were also recommended to curate and make available data that are presently inaccessible, develop new observing systems and build capacities to accelerate progress in the accuracy and usability of surface temperature datasets.
Abstract:
Aim To evaluate the climate sensitivity of model-based forest productivity estimates using a continental-scale tree-ring network. Location Europe and North Africa (30–70° N, 10° W–40° E). Methods We compiled close to 1000 annually resolved records of radial tree growth for all major European tree species and quantified changes in growth as a function of historical climatic variation. Sites were grouped using a neural network clustering technique to isolate spatiotemporal and species-specific climate response patterns. The resulting empirical climate sensitivities were compared with the sensitivities of net primary production (NPP) estimates derived from the ORCHIDEE-FM and LPJ-wsl dynamic global vegetation models (DGVMs). Results We found coherent biogeographic patterns in climate response that depend upon (1) phylogenetic controls and (2) ambient environmental conditions delineated by latitudinal/elevational location. Temperature controls dominate forest productivity in high-elevation and high-latitude areas, whereas moisture-sensitive sites are widespread at low elevation in central and southern Europe. DGVM simulations broadly reproduce the empirical patterns, but show less temperature sensitivity in the boreal zone and stronger precipitation sensitivity towards the mid-latitudes. Main conclusions Large-scale forest productivity is driven by monthly to seasonal climate controls, but our results emphasize species-specific growth patterns under comparable environmental conditions. Furthermore, we demonstrate that carry-over effects from the previous growing season can significantly influence tree growth, particularly in areas with harsh climatic conditions, an element not considered in most current DGVMs. Model–data discrepancies suggest that the simulated climate sensitivity of NPP will need refinement before carbon-cycle climate feedbacks can be accurately quantified.
Abstract:
The Earth’s climate system is driven by a complex interplay of internal chaotic dynamics and natural and anthropogenic external forcing. Recent instrumental data have shown a remarkable degree of asynchronicity between Northern Hemisphere and Southern Hemisphere temperature fluctuations, thereby questioning the relative importance of internal versus external drivers of past as well as future climate variability [1–3]. However, large-scale temperature reconstructions for the past millennium have focused on the Northern Hemisphere [4, 5], limiting empirical assessments of inter-hemispheric variability on multi-decadal to centennial timescales. Here, we introduce a new millennial ensemble reconstruction of annually resolved temperature variations for the Southern Hemisphere based on an unprecedented network of terrestrial and oceanic palaeoclimate proxy records. In conjunction with an independent Northern Hemisphere temperature reconstruction ensemble [5], this record reveals an extended cold period (1594–1677) in both hemispheres but no globally coherent warm phase during the pre-industrial (1000–1850) era. The current (post-1974) warm phase is the only period of the past millennium where both hemispheres are likely to have experienced contemporaneous warm extremes. Our analysis of inter-hemispheric temperature variability in an ensemble of climate model simulations for the past millennium suggests that models tend to overemphasize Northern Hemisphere–Southern Hemisphere synchronicity by underestimating the role of internal ocean–atmosphere dynamics, particularly in the ocean-dominated Southern Hemisphere. Our results imply that climate system predictability on decadal to century timescales may be lower than expected based on assessments of external climate forcing and Northern Hemisphere temperature variations [5, 6] alone.
Abstract:
Abstract Cloud computing services have emerged as an essential component of enterprise IT infrastructure. Migration towards a full-range, large-scale convergence of Cloud and network services has become the current trend for addressing the requirements of the Cloud environment. Our approach takes the infrastructure-as-a-service paradigm to build converged virtual infrastructures, which allow offering tailored performance and enable multi-tenancy over a common physical infrastructure. Thanks to virtualization, new exploitation activities of the physical infrastructures may arise for both transport network and Data Centre services. This approach makes network and Data Centre resources dedicated to Cloud computing converge on the same flexible and scalable level. The work presented here is based on the automation of the virtual infrastructure provisioning service. On top of the virtual infrastructures, a coordinated operation and control of the different resources is performed with the objective of automatically tailoring connectivity services to the Cloud service dynamics. Furthermore, in order to support elasticity of the Cloud services through the optical network, dynamic re-planning features have been provided to the virtual infrastructure service, which allow scaling up or down existing virtual infrastructures to optimize resource utilisation and dynamically adapt to users’ demands. Thus, the dynamic re-planning of the service becomes a key component for coordinating Cloud and optical network resources optimally in terms of resource utilisation. The presented work is complemented with a use case of the virtual infrastructure service being adopted in a distributed Enterprise Information System, which scales up and down as a function of the application requests.
Abstract:
The proliferation of multimedia content and the demand for new audio and video services have fostered the development of a new era based on multimedia information, which enabled the evolution of Wireless Multimedia Sensor Networks (WMSNs) and Flying Ad-Hoc Networks (FANETs). Live multimedia services require real-time video transmissions with a low frame loss rate, tolerable end-to-end delay, and tolerable jitter to support video dissemination with Quality of Experience (QoE) support. Hence, a key principle in a QoE-aware approach is the transmission of high-priority frames (protecting them) with a minimum packet loss ratio and minimum network overhead. Moreover, multimedia content must be transmitted from a given source to the destination via intermediate nodes with high reliability in a large-scale scenario. The routing service must cope with dynamic topologies caused by node failure or mobility, as well as wireless channel changes, in order to continue to operate despite topology changes during multimedia transmission. Finally, understanding user satisfaction when watching a video sequence is becoming a key requirement for the delivery of multimedia content with QoE support. With this goal in mind, solutions involving multimedia transmissions must take into account the video characteristics to improve video quality delivery. The main research contributions of this thesis are driven by the research question of how to provide multimedia distribution with high energy-efficiency, reliability, robustness, scalability, and QoE support over wireless ad hoc networks. The thesis addresses several problem domains with contributions on different layers of the communication stack. At the application layer, we introduce a QoE-aware packet redundancy mechanism to reduce the impact of the unreliable and lossy nature of wireless environments on the dissemination of live multimedia content.
At the network layer, we introduce two routing protocols, namely the video-aware multi-hop and multi-path hierarchical routing protocol for efficient video transmission in static WMSN scenarios (MEVI), and the cross-layer link-quality- and geographical-aware beaconless OR protocol for multimedia FANET scenarios (XLinGO). Both protocols enable multimedia dissemination with energy-efficiency, reliability and QoE support. This is achieved by combining multiple cross-layer metrics in routing decisions in order to establish reliable routes.
Abstract:
Cloud Computing has evolved to become an enabler for delivering access to large-scale distributed applications running on managed network-connected computing systems. This makes it possible to host Distributed Enterprise Information Systems (dEISs) in cloud environments, while enforcing strict performance and quality of service requirements, defined using Service Level Agreements (SLAs). SLAs define the performance boundaries of distributed applications, and are enforced by a cloud management system (CMS) dynamically allocating the available computing resources to the cloud services. We present two novel VM-scaling algorithms focused on dEIS systems, which detect the most appropriate scaling conditions using performance models of distributed applications derived from constant-workload benchmarks, together with SLA-specified performance constraints. We simulate the VM-scaling algorithms in a cloud simulator and compare them against trace-based performance models of dEISs. We compare a total of three SLA-based VM-scaling algorithms (one using prediction mechanisms) based on a real-world application scenario involving a large, variable number of users. Our results show that it is beneficial to use autoregressive predictive SLA-driven scaling algorithms in cloud management systems for guaranteeing performance invariants of distributed cloud applications, as opposed to using only reactive SLA-based VM-scaling algorithms.
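The difference between reactive and predictive SLA-driven scaling can be sketched in a few lines. This is a hypothetical simplification, not the paper's algorithms: the forecast uses a plain AR(1) model fitted by lag-1 autocorrelation, and `capacity_per_vm` stands in for an SLA-derived per-VM load budget (both names are assumptions of this sketch):

```python
import math

def ar1_forecast(history, horizon=1):
    """One-step AR(1) forecast: x_{t+1} = mean + phi * (x_t - mean).

    phi is estimated as the lag-1 autocorrelation of the load history.
    """
    n = len(history)
    mean = sum(history) / n
    num = sum((history[t] - mean) * (history[t - 1] - mean) for t in range(1, n))
    den = sum((x - mean) ** 2 for x in history) or 1.0
    phi = num / den
    x = history[-1]
    for _ in range(horizon):
        x = mean + phi * (x - mean)
    return x

def scaling_decision(load_history, capacity_per_vm, predictive=True):
    """VM count needed so that per-VM load stays within the SLA-derived capacity.

    A reactive policy sizes for the last observed load; a predictive policy
    sizes for the forecast load, reacting before the SLA is actually violated.
    """
    load = ar1_forecast(load_history) if predictive else load_history[-1]
    return max(1, math.ceil(load / capacity_per_vm))

vms = scaling_decision([100, 120, 140, 160, 180], capacity_per_vm=50)
```

The predictive variant trades forecast error for lead time; in a CMS this lead time covers the VM boot latency that a purely reactive policy cannot hide.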
Abstract:
The brain is a complex neural network with a hierarchical organization, and the mapping of its elements and connections is an important step towards the understanding of its function. Recent developments in diffusion-weighted imaging have provided the opportunity to reconstruct the whole-brain structural network in vivo at a large scale and to study the brain structural substrate in a framework that is close to the current understanding of brain function. However, methods to construct the connectome are still under development and should be carefully evaluated. To this end, the first two studies included in my thesis aimed at improving the analytical tools specific to the methodology of brain structural networks. The first of these papers assessed the repeatability of the most common global and local network metrics used in the literature to characterize the connectome, while the second paper evaluated the validity of further metrics based on the concept of communicability. Communicability is a broader measure of connectivity which also accounts for parallel and indirect connections. These additional paths may be important for reorganizational mechanisms in the presence of lesions as well as for enhancing integration in the network. These studies showed good to excellent repeatability of global network metrics when the same methodological pipeline was applied, but more variability was detected when considering local network metrics or when using different thresholding strategies. In addition, communicability metrics were found to add insight into the integration properties of the network by detecting subsets of nodes that were highly interconnected or vulnerable to lesions. The other two studies used methods based on diffusion-weighted imaging to obtain knowledge concerning the relationship between functional and structural connectivity and about the etiology of schizophrenia.
The third study integrated functional oscillations measured using electroencephalography (EEG) and functional magnetic resonance imaging (fMRI) as well as diffusion-weighted imaging data. The multimodal approach that was applied revealed a positive relationship between individual fluctuations of the EEG alpha-frequency and diffusion properties of specific connections of two resting-state networks. Finally, in the fourth study diffusion-weighted imaging was used to probe for a relationship between the underlying white matter tissue structure and season of birth in schizophrenia patients. The results are in line with the neurodevelopmental hypothesis of early pathological mechanisms as the origin of schizophrenia. The different analytical approaches selected in these studies also provide arguments for discussion of the current limitations in the analysis of brain structural networks. To sum up, the first studies presented in this thesis illustrated the potential of brain structural network analysis to provide useful information on features of brain functional segregation and integration using reliable network metrics. In the other two studies alternative approaches were presented. The common discussion of the four studies enabled us to highlight the benefits and possibilities for the analysis of the connectome as well as some current limitations.
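Communicability, the measure evaluated in the second study above, has a standard closed form (due to Estrada and Hatano): the communicability between nodes p and q is the (p, q) entry of the matrix exponential of the adjacency matrix, which weights walks of every length by 1/k! and therefore captures indirect and parallel paths. A minimal dependency-free sketch (illustrative only, not the thesis pipeline) computes it via a truncated Taylor series:

```python
def communicability(adj, terms=30):
    """Communicability matrix G = exp(A) via truncated Taylor series.

    G[p][q] = sum over k of (A^k)_{pq} / k!, so walks of all lengths
    contribute, with longer (indirect) walks down-weighted by k!.
    """
    n = len(adj)
    # k = 0 term: identity matrix (walks of length 0)
    G = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    P = [row[:] for row in G]  # running term A^k / k!
    for k in range(1, terms):
        # P <- (P @ A) / k, i.e. A^k / k!
        P = [[sum(P[i][m] * adj[m][j] for m in range(n)) / k
              for j in range(n)] for i in range(n)]
        for i in range(n):
            for j in range(n):
                G[i][j] += P[i][j]
    return G

# Path graph 0-1-2: nodes 0 and 2 share no direct edge, yet their
# communicability is non-zero because walks through node 1 contribute.
A = [[0, 1, 0],
     [1, 0, 1],
     [0, 1, 0]]
G = communicability(A)
```

Deleting node 1 would send G[0][2] to zero, which is exactly the kind of lesion-vulnerability that communicability-based metrics are designed to expose.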
Abstract:
Wireless networks have become more and more popular because of ease of installation, ease of access, and support of smart terminals and gadgets on the move. In the overall life cycle of providing green wireless technology, from production to operation and, finally, removal, this chapter focuses on the operation phase and summarizes insights in energy consumption of major technologies. The chapter also focuses on the edge of the network, comprising network access points (APs) and mobile user devices. It discusses particularities of most important wireless networking technologies: wireless access networks including 3G/LTE and wireless mesh networks (WMNs); wireless sensor networks (WSNs); and ad-hoc and opportunistic networks. Concerning energy efficiency, the chapter discusses challenges in access, wireless sensor, and ad-hoc and opportunistic networks.
Abstract:
We study the sensitivity of large-scale xenon detectors to low-energy solar neutrinos, to coherent neutrino-nucleus scattering and to neutrinoless double beta decay. As a concrete example, we consider the xenon part of the proposed DARWIN (Dark Matter WIMP Search with Noble Liquids) experiment. We perform detailed Monte Carlo simulations of the expected backgrounds, considering realistic energy resolutions and thresholds in the detector. In a low-energy window of 2–30 keV, where the sensitivity to solar pp and ⁷Be neutrinos is highest, an integrated pp-neutrino rate of 5900 events can be reached in a fiducial mass of 14 tons of natural xenon after 5 years of data. The pp-neutrino flux could thus be measured with a statistical uncertainty around 1%, reaching the precision of solar model predictions. These low-energy solar neutrinos will be the limiting background to the dark matter search channel for WIMP-nucleon cross sections below ~2 × 10⁻⁴⁸ cm² and WIMP masses around 50 GeV/c², for an assumed 99.5% rejection of electronic recoils due to elastic neutrino-electron scatters. Nuclear recoils from coherent scattering of solar neutrinos will limit the sensitivity to WIMP masses below ~6 GeV/c² to cross sections above ~4 × 10⁻⁴⁵ cm². DARWIN could reach a competitive half-life sensitivity of 5.6 × 10²⁶ yr for the neutrinoless double beta decay of ¹³⁶Xe after 5 years of data, using 6 tons of natural xenon in the central detector region.
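The quoted ~1% precision on the pp-neutrino flux follows directly from Poisson counting statistics: for N detected events the relative statistical uncertainty is 1/√N, as this quick check (using the event count from the abstract) confirms:

```python
import math

def relative_counting_uncertainty(n_events):
    """Relative (fractional) statistical uncertainty of a Poisson count.

    sigma(N) = sqrt(N), so sigma(N) / N = 1 / sqrt(N).
    """
    return 1.0 / math.sqrt(n_events)

# 5900 pp-neutrino events (14 t fiducial mass, 5 years of data)
precision = relative_counting_uncertainty(5900)  # ~0.013, i.e. ~1.3%
```

Systematic effects (fiducial mass, efficiency, background subtraction) would add to this purely statistical floor, which is why the abstract quotes "around 1%".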
Abstract:
Spontaneous EEG signals can be parsed into sub-second periods of stable functional states (microstates) that presumably correspond to brief large-scale synchronization events. In schizophrenia, a specific class of microstate (class "D") has been found to be shorter than in healthy controls and to be correlated with positive symptoms. To explore potential new treatment options in schizophrenia, we tested in healthy controls whether neurofeedback training to self-regulate microstate D presence is feasible and what learning patterns are observed. Twenty subjects underwent EEG-neurofeedback training to up-regulate microstate D presence. The protocol included 20 training sessions, each consisting of baseline trials (resting state), regulation trials with auditory feedback contingent on microstate D presence, and a transfer trial. Response to neurofeedback was assessed with mixed-effects modelling. All participants increased the percentage of time spent producing microstate D in at least one of the three conditions (p < 0.05). Significant between-subjects, across-sessions results showed an increase in the time spent producing microstate D of 0.42% in baseline (reflecting a sustained change in the resting state), 1.93% during regulation and 1.83% during transfer. Within-session analysis (performed on baseline and regulation trials only) showed a significant 1.65% increase in baseline and a 0.53% increase in regulation. These values are in a range that is expected to have an impact upon psychotic experiences. Additionally, we found a negative correlation between alpha power and microstate D contribution during neurofeedback training. Given that microstate D has been related to attentional processes, this result provides further evidence that the training was to some degree specific to the attentional network. We conclude that microstate-neurofeedback training proved feasible in healthy subjects.
The implementation of the same protocol in schizophrenia patients may promote skills useful to reduce positive symptoms by means of EEG-neurofeedback.
Abstract:
This study presents a comprehensive assessment of high-resolution Southern Hemisphere (SH) paleoarchives covering the last 2000 years. We identified 174 monthly to annually resolved climate proxy records (tree ring, coral, ice core, documentary, speleothem and sedimentary) from across the hemisphere. We assess the interannual and decadal sensitivity of each proxy record to large-scale circulation indices from the Pacific, Indian and Southern Ocean regions over the twentieth century. We then analyse the potential of this newly expanded palaeoclimate network to collectively represent predictands (sea surface temperature, sea level pressure, surface air temperature and precipitation) commonly used in climate reconstructions. The key dynamical centres-of-action of the equatorial Indo-Pacific are well captured by the palaeoclimate network, indicating that there is considerable reconstruction potential in this region, particularly in the post-AD 1600 period, when a number of long coral records are available. Current spatiotemporal gaps in data coverage and regions where significant potential for future proxy collection exists are discussed. We then highlight the need for new and extended records from key dynamical regions of the Southern Hemisphere. Although large-scale climate field reconstructions for the SH are in their infancy, we report that excellent progress in the development of regional proxies now makes plausible estimates of continental- to hemispheric-scale climate variations possible.
Abstract:
The north-eastern escarpment of Madagascar contains the island’s last remaining large-scale humid forest massifs, surrounded by diverse small-scale agricultural mosaics. Deforestation is high, mainly caused by shifting cultivation practiced by local land users to produce upland rice for subsistence. Today, large protected areas restrict land users’ access to forests for collecting wood and other forest products. Moreover, they are no longer able to expand their cultivated land, which leads to shorter shifting cultivation cycles and decreasing plot sizes for irrigated rice and cash crop cultivation. Cash crop production of clove and vanilla is exposed to risks such as extreme inter-annual price fluctuations, pests and cyclones. In the absence of work opportunities, agricultural extension services and micro-finance schemes, people are stuck in a poverty trap. New development strategies are needed to mitigate the trade-offs between forest conservation and human well-being. As landscape composition and livelihood strategies vary across the region, these strategies need to be spatially differentiated to avoid implementing generic solutions that do not fit the local context. However, to date, little is known about the spatial patterns of shifting cultivation and other land use systems at the regional level. This is mainly due to the high spatial and temporal dynamics inherent to shifting cultivation, which make it difficult to monitor this land use system with remote sensing methods. Furthermore, knowledge about land users’ livelihood strategies and the risks and opportunities they face stems from very few local case studies. To overcome this challenge, firstly, we used remote sensing data and a landscape mosaic approach to delineate the main landscape types at the regional level. Secondly, we developed a land user typology based on socio-ecological data from household surveys in 45 villages spread throughout the region.
Combining the land user typology with the landscape mosaic map allowed us to reveal spatial patterns of the interaction between landscapes and people and to better understand the trade-offs between forest conservation and local well-being. While shifting cultivation systems are being transformed into more intensive permanent agricultural systems in many countries around the globe, Madagascar seems to be an exception to this trend. Linking land cover information to human–environmental interactions over large areas is crucial for designing policies and informing decision making for a more sustainable development of this resource-rich but poverty-prone context.