10 results for Hosting
in CentAUR: Central Archive, University of Reading - UK
Abstract:
During many lava dome-forming eruptions, persistent rockfalls and the concurrent development of a substantial talus apron around the foot of the dome are important aspects of the observed activity. An improved understanding of internal dome structure, including the shape and internal boundaries of the talus apron, is critical for determining when a lava dome is poised for a major collapse and how this collapse might ensue. We consider a period of lava dome growth at the Soufrière Hills Volcano, Montserrat, from August 2005 to May 2006, during which a 100 × 10⁶ m³ lava dome developed that culminated in a major dome-collapse event on 20 May 2006. We use an axisymmetric Finite Element Method model to simulate the growth and evolution of the lava dome, including the development of the talus apron. We first test the generic behaviour of this continuum model, which has core lava and carapace/talus components. Our model describes the generation rate of talus, including its spatial and temporal variation, as well as its post-generation deformation, which is important for an improved understanding of the internal configuration and structure of the dome. We then use our model to simulate the 2005 to 2006 Soufrière Hills dome growth using measured dome volumes and extrusion rates to drive the model and generate the evolving configuration of the dome core and carapace/talus domains. The evolution of the model is compared with the observed rockfall seismicity using event counts and seismic energy parameters, which are used here as a measure of rockfall intensity and hence a first-order proxy for volumes. The range of model-derived volume increments of talus aggraded to the talus slope per recorded rockfall event, approximately 3 × 10³–13 × 10³ m³ per rockfall, is high with respect to estimates based on observed events.
From this, it is inferred that some of the volumetric growth of the talus apron (perhaps up to 60–70%) might have occurred in the form of aseismic deformation of the talus, forced by an internal, laterally spreading core. Talus apron growth by this mechanism has not previously been identified, and this suggests that the core, hosting hot gas-rich lava, could have a greater lateral extent than previously considered.
Abstract:
A full assessment of para-virtualization is important, because without knowledge about the various overheads, users cannot understand whether using virtualization is a good idea or not. In this paper we are very interested in assessing the overheads of running various benchmarks on bare metal, as well as on para-virtualization. The idea is to see what the overheads of para-virtualization are, as well as looking at the overheads of turning on monitoring and logging. The knowledge from assessing various benchmarks on these different systems will help a range of users understand the use of virtualization systems. In this paper we assess the overheads of using Xen, VMware, KVM and Citrix, see Table 1. These different virtualization systems are used extensively by cloud users. We are using various Netlib benchmarks, which have been developed by the University of Tennessee at Knoxville (UTK) and Oak Ridge National Laboratory (ORNL). In order to assess these virtualization systems, we run the benchmarks on bare metal, then on the para-virtualization, and finally we turn on monitoring and logging. The latter is important as users are interested in Service Level Agreements (SLAs) used by Cloud providers, and the use of logging is a means of assessing the services bought and used from commercial providers. In this paper we assess the virtualization systems on three different systems. We use the Thamesblue supercomputer, the Hactar cluster and an IBM JS20 blade server (see Table 2), which are all servers available at the University of Reading. A functional virtualization system is multi-layered and is driven by the privileged components. Virtualization systems can host multiple guest operating systems, each of which runs in its own domain, and the system schedules virtual CPUs and memory within each Virtual Machine (VM) to make the best use of the available resources. The guest operating system schedules each application accordingly.
You can deploy virtualization as full virtualization or para-virtualization. Full virtualization provides a total abstraction of the underlying physical system and creates a new virtual system in which the guest operating systems can run. No modifications are needed in the guest OS or application, e.g. the guest OS or application is not aware of the virtualized environment and runs normally. Para-virtualization requires modification of the guest operating systems that run on the virtual machines, e.g. these guest operating systems are aware that they are running on a virtual machine, and provide near-native performance. You can deploy both para-virtualization and full virtualization across various virtualized systems. Para-virtualization is an OS-assisted virtualization, where some modifications are made in the guest operating system to enable better performance. In this kind of virtualization, the guest operating system is aware of the fact that it is running on virtualized hardware and not on the bare hardware. In para-virtualization, the device drivers in the guest operating system coordinate with the device drivers of the host operating system, reducing the performance overheads. The use of para-virtualization [0] is intended to avoid the bottleneck associated with slow hardware interrupts that exists when full virtualization is employed. It has been shown [0] that para-virtualization does not impose significant performance overhead in high performance computing, and this in turn has implications for the use of cloud computing for hosting HPC applications. The “apparent” improvement in virtualization has led us to formulate the hypothesis that certain classes of HPC applications should be able to execute in a cloud environment with minimal performance degradation.
In order to support this hypothesis, it is first necessary to define exactly what is meant by a “class” of application, and secondly it will be necessary to observe application performance, both within a virtual machine and when executing on bare hardware. A further potential complication is associated with the need for Cloud service providers to support Service Level Agreements (SLAs), so that system utilisation can be audited.
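The measurement procedure described above (run each benchmark on bare metal, then under para-virtualization, then again with monitoring and logging enabled) can be sketched as a simple timing loop. This is an illustrative assumption, not the paper's actual harness: the benchmark binary name (`linpackd`) and the run count are hypothetical placeholders.

```shell
#!/bin/sh
# Hypothetical sketch: time repeated runs of a Netlib-style benchmark so
# the same measurement can be taken on bare metal and inside a
# para-virtualized guest, then compared. BENCH and RUNS are assumptions.

BENCH=${BENCH:-./linpackd}   # assumed benchmark executable (hypothetical)
RUNS=${RUNS:-5}

i=1
total=0
while [ "$i" -le "$RUNS" ]; do
    start=$(date +%s)
    "$BENCH" >/dev/null 2>&1 || true   # ignore failures so the loop continues
    end=$(date +%s)
    elapsed=$((end - start))
    total=$((total + elapsed))
    echo "run $i: ${elapsed}s"
    i=$((i + 1))
done
echo "mean: $((total / RUNS))s over $RUNS runs"
```

Running the same script once on the host and once inside the guest, with identical BENCH and RUNS, gives two mean timings whose difference approximates the virtualization overhead for that workload.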
Abstract:
During April-May 2010 volcanic ash clouds from the Icelandic Eyjafjallajökull volcano reached Europe causing an unprecedented disruption of the EUR/NAT region airspace. Civil aviation authorities banned all flight operations because of the threat posed by volcanic ash to modern turbine aircraft. New quantitative airborne ash mass concentration thresholds, still under discussion, were adopted for discerning regions contaminated by ash. This has implications for ash dispersal models routinely used to forecast the evolution of ash clouds. In this new context, quantitative model validation and assessment of the accuracies of current state-of-the-art models is of paramount importance. The passage of volcanic ash clouds over central Europe, a territory hosting a dense network of meteorological and air quality observatories, generated a quantity of observations unusual for volcanic clouds. From the ground, the cloud was observed by aerosol lidars, lidar ceilometers, sun photometers, other remote-sensing instruments and in-situ collectors. From the air, sondes and multiple aircraft measurements also took extremely valuable in-situ and remote-sensing measurements. These measurements constitute an excellent database for model validation. Here we validate the FALL3D ash dispersal model by comparing model results with ground and airplane-based measurements obtained during the initial 14–23 April 2010 Eyjafjallajökull explosive phase. We run the model at high spatial resolution using as input hourly-averaged observed heights of the eruption column and the total grain size distribution reconstructed from field observations. Model results are then compared against remote ground-based and in-situ aircraft-based measurements, including lidar ceilometers from the German Meteorological Service, aerosol lidars and sun photometers from the EARLINET and AERONET networks, and flight missions of the German DLR Falcon aircraft.
We find good quantitative agreement, with an error similar to the spread in the observations (depending, however, on the method used to estimate the mass eruption rate) for both airborne and ground mass concentration. Such verification results help us understand and constrain the accuracy and reliability of ash transport models and are of enormous relevance for designing future operational mitigation strategies at Volcanic Ash Advisory Centers.
Abstract:
By placing axons into polymeric micro-channels hosting embedded electrodes, the extracellular amplitude of action potentials is greatly increased, allowing for robust recording and noise suppression. We are developing such an electrode interface to record electrical activity from bladder afferents to restore bladder control in patients suffering from spinal cord injury. Here we describe our microchannel electrode interface in terms of design, microfabrication and electrode characteristics and report on in vivo bladder function after implantation of teased dorsal rootlets within microchannels.
Abstract:
The narrative of Rosemary’s Baby hinges on a central hesitation between pregnancy-induced madness and the existence of Satanism. Accordingly, the monstrous element is embodied in both the real and the supernatural: Rosemary’s husband Guy (John Cassavetes) is responsible for her victimisation through rape in either explanation. However, I will argue that the inherent ambiguity of the plot makes it difficult to place him as such a figure typical to the archetypal horror binaries of normality/monster, human/inhuman. By displacing generic convention the film complicates the issue of monstrosity, whilst simultaneously offering the possibility for the depiction of female experience of marriage to be at the centre of the narrative, for the real to be possibly of more significance than the supernatural. Previous writing has tended to concentrate on Rosemary and her pregnancy, so through detailed consideration of Cassavetes’ performance and its placement in the mise-en-scène this focus on Guy aims to demonstrate that he changes almost as much as Rosemary does. The chapter will focus on the film’s depiction of rape, during Rosemary’s nightmare and after it, in order to demonstrate how the notion of performance reveals Guy’s monstrousness and the difficulties this represents in our engagement with him.
Abstract:
Soils play a pivotal role in major global biogeochemical cycles (carbon, nutrient and water), while hosting the largest diversity of organisms on land. Because of this, soils deliver fundamental ecosystem services, and management to change a soil process in support of one ecosystem service can either provide co-benefits to other services or can result in trade-offs. In this critical review, we report the state-of-the-art understanding concerning the biogeochemical cycles and biodiversity in soil, and relate these to the provisioning, regulating, supporting and cultural ecosystem services which they underpin. We then outline key knowledge gaps and research challenges, before providing recommendations for management activities to support the continued delivery of ecosystem services from soils. We conclude that although there are knowledge gaps that require further research, enough is known to start improving soils globally. The main challenge is in finding ways to share knowledge with soil managers and policy-makers, so that best-practice management can be implemented. A key element of this knowledge sharing must be in raising awareness of the multiple ecosystem services underpinned by soils, and the natural capital they provide. The International Year of Soils in 2015 presents the perfect opportunity to begin a step-change in how we harness scientific knowledge to bring about more sustainable use of soils for a secure global society.
Abstract:
Soils play a pivotal role in major global biogeochemical cycles (carbon, nutrient, and water), while hosting the largest diversity of organisms on land. Because of this, soils deliver fundamental ecosystem services, and management to change a soil process in support of one ecosystem service can either provide co-benefits to other services or result in trade-offs. In this critical review, we report the state-of-the-art understanding concerning the biogeochemical cycles and biodiversity in soil, and relate these to the provisioning, regulating, supporting, and cultural ecosystem services which they underpin. We then outline key knowledge gaps and research challenges, before providing recommendations for management activities to support the continued delivery of ecosystem services from soils. We conclude that, although soils are complex, knowledge gaps remain, and fundamental research is still needed to better understand the relationships between different facets of soils and the array of ecosystem services they underpin, enough is known to implement best practices now. There is a tendency among soil scientists to dwell on the complexity and knowledge gaps rather than to focus on what we do know and how this knowledge can be put to use to improve the delivery of ecosystem services. A significant challenge is to find effective ways to share knowledge with soil managers and policy makers so that best management can be implemented. A key element of this knowledge exchange must be to raise awareness of the ecosystem services underpinned by soils and thus the natural capital they provide. We know enough to start moving in the right direction while we conduct research to fill in our knowledge gaps. The lasting legacy of the International Year of Soils in 2015 should be for soil scientists to work together with policy makers and land managers to put soils at the centre of environmental policy making and land management decisions.
Abstract:
Detailed observations of the solar system planets reveal a wide variety of local atmospheric conditions. Astronomical observations have revealed a variety of extrasolar planets, none of which resembles any of the solar system planets in full. Instead, the most massive amongst the extrasolar planets, the gas giants, appear very similar to the class of (young) Brown Dwarfs, which are amongst the oldest objects in the universe. Despite this diversity, solar system planets, extrasolar planets and Brown Dwarfs have broadly similar global temperatures between 300 K and 2500 K. In consequence, clouds of different chemical species form in their atmospheres. While the details of these clouds differ, the fundamental physical processes are the same. Further to this, all these objects have been observed to produce radio and X-ray emission. While both kinds of radiation are well studied on Earth and to a lesser extent on the solar system planets, the occurrence of emission that potentially originates from accelerated electrons on Brown Dwarfs, extrasolar planets and protoplanetary disks is not yet well understood. This paper offers an interdisciplinary view on electrification processes and their feedback on their hosting environment in meteorology, volcanology, planetology and research on extrasolar planets and planet formation.
Abstract:
This paper discusses the work of Claude Parent and The Serving Library, considering the critiques generated by their intersecting of architecture, art and editorial design. Through focus on the ways in which hosting environment, architecture and forms of expanded publishing can serve to dissolve disciplinary boundaries and activities of production, spectatorship and reception, it draws on the lineage of 1960s/70s Conceptual Art in considering these practices as a means through which to escape medium specificity and spatial confinement. Relationships between actual and virtual space are then read against this broadening of aesthetic ideas and the theory of critical modernity.