945 results for Runs
Abstract:
We use a stratosphere–troposphere composition–climate model with interactive sulfur chemistry and aerosol microphysics to investigate the effect of the 1991 Mount Pinatubo eruption on stratospheric aerosol properties. Satellite measurements indicate that shortly after the eruption, between 14 and 23 Tg of SO2 (7 to 11.5 Tg of sulfur) was present in the tropical stratosphere. Best estimates of the peak global stratospheric aerosol burden are in the range 19 to 26 Tg, or 3.7 to 6.7 Tg of sulfur assuming a composition of between 59 and 77 % H2SO4. In light of this large uncertainty range, we performed two main simulations with 10 and 20 Tg of SO2 injected into the tropical lower stratosphere. Simulated stratospheric aerosol properties through the 1991 to 1995 period are compared against a range of available satellite and in situ measurements. Stratospheric aerosol optical depth (sAOD) and effective radius from both simulations show good qualitative agreement with the observations, with the timing of peak sAOD and the decay timescale matching the observations well in the tropics and mid-latitudes. However, injecting 20 Tg gives a stratospheric aerosol mass burden a factor of 2 too high compared to the satellite data, with consequent strong high biases in simulated sAOD and surface area density; the 10 Tg injection is in much better agreement. Our model cannot explain the large fraction of the injected sulfur that the satellite-derived SO2 and aerosol burdens indicate was removed within the first few months after the eruption. We suggest either that there is an additional loss pathway for the SO2 not included in our model (e.g. via accommodation onto ash or ice in the volcanic cloud) or that a larger proportion of the injected sulfur was removed via cross-tropopause transport than in our simulations. We also critically evaluate the simulated evolution of the particle size distribution, comparing in detail to balloon-borne optical particle counter (OPC) measurements from Laramie, Wyoming, USA (41° N). Overall, the model captures remarkably well the complex variations in particle concentration profiles across the different OPC size channels. However, for the 19 to 27 km injection height range used here, both runs have a modest high bias in the lowermost stratosphere for the finest particles (radii less than 250 nm), and the decay timescale for these particles is longer in the model, with a much later return to background conditions. Also, whereas the 10 Tg run compares best to the satellite measurements, a significant low bias is apparent in the coarser size channels in the volcanically perturbed lower stratosphere. Overall, our results suggest that, with appropriate calibration, aerosol microphysics models are capable of capturing the observed variation in particle size distribution in the stratosphere across both volcanically perturbed and quiescent conditions. Furthermore, additional sensitivity simulations suggest that model predictions are robust to uncertainties in sub-grid particle formation and nucleation rates in the stratosphere.
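As a cross-check on these figures (a worked conversion from molar masses, not taken from the abstract itself): sulfur is half the mass of SO2 and about a third of the mass of H2SO4, which reproduces the quoted sulfur ranges to within rounding.

\[
\frac{M_{\mathrm{S}}}{M_{\mathrm{SO_2}}} = \frac{32.07}{64.07} \approx 0.50, \qquad
\frac{M_{\mathrm{S}}}{M_{\mathrm{H_2SO_4}}} = \frac{32.07}{98.08} \approx 0.327
\]
\[
0.50 \times (14\ \text{to}\ 23)\ \mathrm{Tg\ SO_2} \approx 7\ \text{to}\ 11.5\ \mathrm{Tg\ S}, \qquad
0.327 \times 0.59 \times 19\ \mathrm{Tg} \approx 3.7\ \mathrm{Tg\ S}, \qquad
0.327 \times 0.77 \times 26\ \mathrm{Tg} \approx 6.5\ \mathrm{Tg\ S}
\]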
Abstract:
Massive Open Online Courses (MOOCs) attract learners with a variety of backgrounds. Engaging them through game development was trialled in a beginner's programming course, "Begin programming: build your first mobile game", on the FutureLearn platform. The course has completed two iterations: the first in autumn 2013 and the second in spring 2014, each with thousands of participants. This paper explores the characteristics of the learner groups attracted by these two consecutive runs of the course and their perceptions of it, using pre- and post-course surveys. Recommendations for practitioners are offered, including for when the audience differs from the one expected. A MOOC is unlikely to please everyone, especially with such large cohorts, and MOOC creators and facilitators should accept this and communicate clearly who the intended audience of a course is. Nevertheless, this course, using game development as a vehicle to teach programming, seems to have offered a balanced learning experience to a diverse group of learners.
Abstract:
There are large uncertainties in the circulation response of the atmosphere to climate change. One manifestation of this is the substantial spread in projections for the extratropical storm tracks made by different state-of-the-art climate models. In this study we perform a series of sensitivity experiments, with the atmosphere component of a single climate model, in order to identify the causes of the differences between storm track responses in different models. In particular, the Northern Hemisphere wintertime storm tracks in the CMIP3 multi-model ensemble are considered. A number of potential physical drivers of storm track change are identified and their influence on the storm tracks is assessed. The experimental design aims to perturb the different physical drivers independently, by magnitudes representative of the range of values present in the CMIP3 model runs, and this is achieved via perturbations to the sea surface temperature and the sea-ice concentration forcing fields. We ask the question: can the spread of projections for the extratropical storm tracks present in the CMIP3 models be accounted for in a simple way by any of the identified drivers? The results suggest that, whilst the changes in the upper-tropospheric equator-to-pole temperature difference have an influence on the storm track response to climate change, the large spread of projections for the extratropical storm track present in the northern North Atlantic in particular is more strongly associated with changes in the lower-tropospheric equator-to-pole temperature difference.
Abstract:
TESSA is a toolkit for experimenting with sensory augmentation. It includes hardware and software to facilitate rapid prototyping of interfaces that enhance one sense using information gathered from another. The toolkit contains a range of sensors (e.g. ultrasonic and temperature sensors) and actuators (e.g. tactors or stereo sound), designed modularly so that inputs and outputs can easily be swapped in and out and customized using TESSA's graphical user interface (GUI), with real-time feedback. The system runs on a Raspberry Pi with a built-in touchscreen, providing a compact and portable form that is well suited to field trials. At CHI Interactivity, the audience will have the opportunity to experience sensory augmentation effects using this system and to design their own sensory augmentation interfaces.
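To make the sensor-to-actuator idea concrete, here is a minimal sketch of the kind of mapping such a toolkit supports: an ultrasonic distance reading drives a tactor's vibration intensity. All names here are hypothetical illustrations; the abstract does not describe TESSA's actual API.

import time

def distance_to_intensity(distance_cm, max_range_cm=200.0):
    """Map an ultrasonic distance reading to a vibration intensity in [0, 1]; nearer objects vibrate harder."""
    clamped = min(max(distance_cm, 0.0), max_range_cm)
    return 1.0 - clamped / max_range_cm

# Hypothetical polling loop; `ultrasonic` and `tactor` are placeholder
# driver objects, not part of any real TESSA interface.
# while True:
#     d = ultrasonic.read_cm()
#     tactor.set_intensity(distance_to_intensity(d))
#     time.sleep(0.05)  # ~20 Hz update, fast enough to feel continuous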
Abstract:
A virtual system that emulates an ARM-based processor machine has been created to replace a traditional hardware-based system for teaching assembly language. The proposed virtual system integrates, in a single environment, all the development tools necessary to deliver introductory or advanced courses on modern assembly language programming. The virtual system runs a Linux operating system in either graphical or console mode on a Windows or Linux host machine. No software licenses or extra hardware are required to use the virtual system, so students are free to carry their own ARM emulator with them on a USB memory stick. Institutions adopting this, or a similar, virtual system can also benefit by reducing capital investment in hardware-based development kits and by enabling distance learning courses.
Abstract:
In general, particle filters need large numbers of model runs in order to avoid filter degeneracy in high-dimensional systems. The recently proposed, fully nonlinear equivalent-weights particle filter overcomes this requirement by replacing the standard model transition density with two different proposal transition densities. The first proposal density is used to relax all particles towards the high-probability regions of state space as defined by the observations. The crucial second proposal density is then used to ensure that the majority of particles have equivalent weights at observation time. Here, the performance of the scheme is explored in a simplified ocean model with a high-dimensional state space of 65 500 variables. The success of the equivalent-weights particle filter in matching the true model state is shown using the mean of just 32 particles in twin experiments. It is of particular significance that this remains true even as the number and spatial variability of the observations are changed. The results from rank histograms are less easy to interpret and can be influenced considerably by the parameter values used. This article also explores the sensitivity of the performance of the scheme to the chosen parameter values, and the effect of using different model error parameters in the truth run compared with the ensemble model runs.
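The relaxation idea behind the first proposal density can be sketched in a few lines of Python. This is a schematic only, assuming a linear observation operator and a simple nudging term; practical equivalent-weights filters weight the nudging by the model error covariance, and none of the names below come from the paper.

import numpy as np

def relaxation_step(particles, y_obs, model_step, H, tau):
    """Propagate each particle and nudge it towards the observations.

    particles  : (n_particles, n_state) array of particle states
    y_obs      : (n_obs,) observation vector at the next analysis time
    model_step : deterministic model propagator, state -> state
    H          : (n_obs, n_state) linear observation operator
    tau        : relaxation strength in [0, 1]
    """
    nudged = np.empty_like(particles)
    for i, x in enumerate(particles):
        forecast = model_step(x)            # free model forecast
        innovation = y_obs - H @ forecast   # misfit to the observations
        # Plain H^T nudging stands in for the covariance-weighted
        # relaxation used in practice.
        nudged[i] = forecast + tau * (H.T @ innovation)
    return nudged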
Abstract:
This study investigates the effects of a short-term pedagogic intervention on the development of L2 fluency among learners studying English for Academic Purposes (EAP) at a university in the UK. It also examines the interaction between the development of fluency and that of complexity and accuracy. Through a pre-test/post-test design, data were collected over a period of four weeks from learners performing monologic tasks. While the Control Group (CG) focused on developing general speaking and listening skills, the Experimental Group (EG) received awareness-raising activities and fluency strategy training in addition to the general speaking and listening practice (i.e. following the syllabus). The data, coded in terms of a range of measures of fluency, accuracy and complexity, were subjected to repeated-measures MANOVA, t-tests and correlations. The results indicate that after the intervention, while the CG achieved some fluency gains, the EG produced significantly more fluent language, demonstrating faster speech and articulation rates, longer runs and higher phonation time ratios. The significant correlations obtained between measures of accuracy and learners' pauses in the CG suggest that pausing opportunities may have been linked to accuracy. The findings of the study have significant implications for L2 pedagogy, highlighting the effective impact of instruction on the development of fluency.
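For reference, the fluency measures named here are conventionally defined as follows (these are the standard definitions from the fluency literature; the paper's exact operationalisations may differ):

\[
\text{speech rate} = \frac{N_{\text{syll}}}{T_{\text{total}}}, \qquad
\text{articulation rate} = \frac{N_{\text{syll}}}{T_{\text{phon}}}, \qquad
\text{phonation time ratio} = \frac{T_{\text{phon}}}{T_{\text{total}}}
\]

where N_syll is the number of syllables produced, T_total the total speaking time including pauses, and T_phon the phonation time excluding pauses. A "run" is a stretch of speech between pauses, so mean length of run is N_syll divided by the number of runs.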
Abstract:
The building industry is often berated for its shortcomings in meeting the demand for new housing. Addressing the need for new housing stock is a challenge that has led to debates among professional bodies, the construction sector, the housing industry and government. The introduction of new manufacturing technologies is often offered as a solution, but the challenges of increasing the amount of off-site construction in residential building are well known and well rehearsed. The modern flying factory (MFF) is a concept that involves the manufacture of specific components or modules in temporary off-site or near-site locations, using technologies and processes that are relatively simple and quick to set up and dismantle. The aim is to produce short batches and hence achieve some of the benefits of off-site manufacture on a much smaller scale than in dedicated factory environments. A case study of a modern flying factory set up to produce pre-assembled utility cupboards for a large residential development in London is presented, involving participant observation and informal interviews with key actors on the design and operationalisation of the process. The case reveals that although there are cost, efficiency, and health and safety benefits to using MFF approaches, there are also challenges to overcome concerning the time required to set up and establish the process for relatively short runs, and in evaluating whether the MFF or traditional site-based production is more effective for particular aspects of projects.
Abstract:
The martian solsticial pause, presented in a companion paper (Lewis et al., this issue), was investigated further through a series of model runs using the UK version of the LMD/UK Mars Global Climate Model. It was found that the pause could not be adequately reproduced if radiatively active water ice clouds were omitted from the model. When clouds were used, along with a realistic time-dependent dust opacity distribution, a substantial minimum in near-surface transient eddy activity formed around solstice in both hemispheres. The net effect of the clouds in the model is, by altering the thermal structure of the atmosphere, to decrease the vertical shear of the westerly jet near the surface around solstice, and thus reduce baroclinic growth rates. A similar effect was seen under conditions of large dust loading, implying that northern midlatitude eddy activity will tend to become suppressed after a period of intense flushing storm formation around the northern cap edge. Suppression of baroclinic eddy generation by the barotropic component of the flow and via diabatic eddy dissipation were also investigated as possible mechanisms leading to the formation of the solsticial pause but were found not to make major contributions. Zonal variations in topography were found to be important, as their presence results in weakened transient eddies around winter solstice in both hemispheres, through modification of the near-surface flow. The zonal topographic asymmetry appears to be the primary reason for the weakness of eddy activity in the southern hemisphere relative to the northern hemisphere, and the ultimate cause of the solsticial pause in both hemispheres. The meridional topographic gradient was found to exert a much weaker influence on near-surface transient eddies.
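The mechanism invoked here, in which reduced near-surface vertical shear implies reduced baroclinic growth, is conventionally quantified by the Eady growth rate (a textbook diagnostic; the abstract does not state which growth-rate measure the authors computed):

\[
\sigma_{E} = 0.31 \,\frac{f}{N}\,\left|\frac{\partial u}{\partial z}\right|
\]

where f is the Coriolis parameter, N the buoyancy frequency and the final factor the vertical shear of the zonal wind; a cloud-induced reduction in near-surface shear therefore lowers the growth rate directly.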
Abstract:
We describe infinitely scalable pipeline machines with perfect parallelism, in the sense that every instruction of an inline program is executed, on successive data, on every clock tick. Programs with shared data effectively execute in less than a clock tick. We show that pipeline machines are faster than single- or multi-core von Neumann machines for sufficiently many runs of a sufficiently time-consuming program. Our pipeline machines exploit the totality of transreal arithmetic and the known waiting time of statically compiled programs to deliver the interesting property that they need no hardware or software exception handling.
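The "sufficiently many runs" condition can be made precise with idealised pipeline arithmetic (a sketch assuming one instruction stage per clock tick; the authors' own analysis may differ). For n runs of a k-instruction program,

\[
T_{\text{pipeline}}(n) = k + n - 1, \qquad T_{\text{sequential}}(n) = nk,
\]

since the pipeline needs k - 1 ticks to fill and thereafter completes one run per tick. The speedup nk/(k + n - 1) exceeds 1 for every n > 1 and approaches k as n grows.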
Abstract:
The advance of the onset of the Indian monsoon is here explained in terms of a balance between the low-level monsoon flow and an over-running intrusion of mid-tropospheric dry air. The monsoon advances, over a period of about 6 weeks, from the south of the country to the northwest. Given that the low-level monsoon winds are westerly or southwesterly, and the mid-level winds northwesterly, the monsoon onset propagates upwind relative to the mid-level flow, and perpendicular to the low-level flow, and is not directly caused by moisture flux toward the northwest. The lack of a conceptual model for the advance has made it hard to understand and correct known biases in weather and climate prediction models. The mid-level northwesterlies form a wedge of dry air that is deep in the far northwest of India and over-runs the monsoon flow. The dry layer is moistened from below by shallow cumulus and congestus clouds, so that toward the southeast of India the profile becomes much closer to moist adiabatic and the dry layer is much shallower in the vertical. The profiles associated with this dry air show how the most favourable environment for deep convection occurs in the south, and onset therefore occurs there first. As the onset advances across India, the advection of moisture from the Arabian Sea becomes stronger, and the mid-level dry air is increasingly moistened from below. This increased moistening makes the wedge of dry air shallower throughout its horizontal extent, and forces the northern limit of moist convection to move toward the northwest. Wetting of the land surface by rainfall further reinforces the northwestward progression, by sustaining the supply of boundary-layer moisture and shallow cumulus. The local advance of the monsoon onset coincides with a weakening of the mid-level northwesterlies, and therefore weakened mid-level dry advection.
Abstract:
The simulated annealing approach to crystal structure determination from powder diffraction data, as implemented in the DASH program, is readily amenable to parallelization at the individual run level. Very large increases in the speed of execution can be achieved by distributing individual DASH runs over a network of computers. The CDASH program delivers this by using scalable on-demand computing clusters built on the Amazon Elastic Compute Cloud service. By way of example, a 360 vCPU cluster returned the crystal structure of racemic ornidazole (Z′ = 3, 30 degrees of freedom) ca. 40 times faster than a typical modern quad-core desktop CPU. Whilst used here specifically for DASH, this approach is of general applicability to other packages that are amenable to coarse-grained parallelism strategies.
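The coarse-grained pattern described here is simply that independent simulated-annealing runs can be farmed out to separate workers with no communication between them. The sketch below illustrates this with local processes and a toy one-dimensional objective; it is not CDASH's implementation (which distributes DASH runs across an EC2 cluster), and the objective is a stand-in for a crystallographic figure of merit.

import math
import random
from multiprocessing import Pool

def sa_run(seed, n_steps=10_000):
    """One independent simulated-annealing run; returns (best_cost, best_x)."""
    rng = random.Random(seed)          # per-run RNG, so runs are independent
    x = rng.uniform(-10.0, 10.0)
    cost = x * x                       # toy objective, minimum at x = 0
    best = (cost, x)
    temp = 1.0
    for _ in range(n_steps):
        x_new = x + rng.gauss(0.0, 0.5)
        c_new = x_new * x_new
        # Metropolis acceptance: always accept improvements, sometimes
        # accept uphill moves, more readily at high temperature.
        if c_new < cost or rng.random() < math.exp(-(c_new - cost) / temp):
            x, cost = x_new, c_new
            best = min(best, (cost, x))
        temp *= 0.9995                 # geometric cooling schedule
    return best

if __name__ == "__main__":
    with Pool() as pool:                        # one worker per core
        results = pool.map(sa_run, range(16))   # 16 independent runs
    print(min(results))                         # best solution overall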
Abstract:
This chapter presents a simple econometric model of the medieval English economy, focusing on the relationship between money, prices and incomes. The model is estimated using annual data for the period 1263–1520, obtained from various sources. The start date is determined by the availability of continuous runs of annual data, while the finishing date immediately precedes the take-off of Tudor price inflation. Accounts from ecclesiastical and monastic estates have survived in great numbers for this period, thereby ensuring that crop yields can be estimated from a regionally representative set of estates.
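For background, models of the money-prices-incomes relationship in this literature conventionally start from the equation of exchange (stated here as context; the chapter's actual specification is not given in the abstract):

\[
MV = PY
\]

where M is the money stock, V its velocity of circulation, P the price level and Y real income.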
Abstract:
We present a selection of methodologies for using the palaeo-climate model component of the Coupled Model Intercomparison Project Phase 5 (CMIP5) to attempt to constrain future climate projections made with the same models. The constraints arise from measures of skill in hindcasting palaeo-climate changes from the present over three periods: the Last Glacial Maximum (LGM, 21 000 yr before present, i.e. 21 ka), the mid-Holocene (MH, 6 ka) and the Last Millennium (LM, 850–1850 CE). The skill measures may be used to validate robust patterns of climate change across scenarios or to distinguish between models that have differing outcomes in future scenarios. We find that the multi-model ensemble of palaeo-simulations is adequate for addressing at least some of these issues. For example, selected benchmarks for the LGM and MH are correlated with the rank of future projections of precipitation/temperature or sea-ice extent, indicating that the models that agree best with the palaeo-climate information give demonstrably different future results from the rest of the models. We also explore cases where comparisons are strongly dependent on uncertain forcing time series or show important non-stationarity, making direct inferences for the future problematic. Overall, we demonstrate that there is strong potential for the palaeo-climate simulations to help inform the future projections, and we urge all the modelling groups to complete this subset of the CMIP5 runs.
Abstract:
Sea-ice concentrations in the Laptev Sea simulated by the coupled North Atlantic-Arctic Ocean-Sea-Ice Model and the Finite Element Sea-Ice Ocean Model are evaluated using sea-ice concentrations derived from Advanced Microwave Scanning Radiometer-Earth Observing System satellite data and a polynya classification method for winter 2007/08. While developed to simulate large-scale sea-ice conditions, both models are analysed here in terms of polynya simulation. The main modification of both models in this study is the implementation of a landfast-ice mask. Simulated sea-ice fields from different model runs are compared, with emphasis placed on the impact of this prescribed landfast-ice mask. We demonstrate that the sea-ice models are not able to simulate flaw polynyas realistically when used without a fast-ice description. Our investigations indicate that, without landfast ice and with coarse horizontal resolution, the models overestimate the fraction of open water in the polynya. This is not because a realistic polynya appears but because of a larger-scale reduction of ice concentrations and smoothed ice-concentration fields. After implementation of a landfast-ice mask, the polynya location is simulated realistically, but the total open-water area is still overestimated in most cases. The study shows that the fast-ice parameterization is essential for model improvement. However, further improvements are necessary in order to progress from the simulation of large-scale features in the Arctic towards a more detailed simulation of smaller-scale features (here polynyas) in an Arctic shelf sea.