809 results for Significance driven computation
Abstract:
The terrestrial magnetopause underwent considerable sudden changes in its location on 9–10 September 1978. These magnetopause motions were accompanied by disturbances of the geomagnetic field on the ground. We present a study of the magnetopause motions and the ground magnetic signatures using, for the latter, 10 s averaged data from 14 high-latitude ground magnetometer stations. Observations in the solar wind (from IMP 8) are employed and the motions of the magnetopause are monitored directly by the spacecraft ISEE 1 and 2. With these coordinated observations we are able to show that it is the sudden changes in the solar wind dynamic pressure that are responsible for the disturbances seen on the ground. At some ground stations we see evidence of a “ringing” of the magnetospheric cavity, while at others only the initial impulse is evident. We note that at some stations field perturbations closely match the hypothesized ground signatures of flux transfer events. In accordance with more recent work in the area (e.g. Potemra et al., 1989, J. geophys. Res., in press), we argue that causes other than impulsive reconnection may produce the twin ionospheric flow vortex originally proposed as a flux transfer event signature.
Abstract:
Neural stem cells (NSCs) are early precursors of neuronal and glial cells. NSCs are capable of generating identical progeny through virtually unlimited numbers of cell divisions (cell proliferation), while also producing daughter cells committed to differentiation. Nuclear factor kappa B (NF-kappaB) is an inducible, ubiquitous transcription factor also expressed in neurones, glia and neural stem cells. Recently, several pieces of evidence have been provided for a central role of NF-kappaB in the control of NSC proliferation. Here, we propose a novel mathematical model for NF-kappaB-driven proliferation of NSCs. We have been able to reconstruct the molecular pathway of activation and inactivation of NF-kappaB, and its influence on cell proliferation, as a system of nonlinear ordinary differential equations. We then use a combination of analytical and numerical techniques to study the model dynamics. The results obtained are illustrated by computer simulations and are, in general, in accordance with biological findings reported by several independent laboratories. The model is able both to explain and to predict experimental data. Understanding the mechanisms of NSC proliferation may provide a novel outlook for both therapeutic applications and basic research.
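The abstract does not reproduce the model equations, but the workflow it describes (encode NF-kappaB activation/inactivation as nonlinear ODEs, then study the dynamics numerically) can be pictured with a minimal sketch. The two-variable system, variable names and rate constants below are illustrative assumptions, not the published model.

import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical two-variable caricature of NF-kappaB activation/inactivation:
# n - active (nuclear) NF-kappaB, i - an IkB-like inhibitor.
# All equations and parameter values are illustrative, not the authors' model.
def nfkb_rhs(t, y, k_act=1.0, k_inh=2.0, k_syn=0.8, k_deg=0.5):
    n, i = y
    dn = k_act * (1.0 - n) / (1.0 + i) - k_inh * i * n   # activation vs. inhibitor-driven inactivation
    di = k_syn * n - k_deg * i                            # active NF-kappaB induces its own inhibitor
    return [dn, di]

sol = solve_ivp(nfkb_rhs, (0.0, 50.0), [0.05, 0.0], dense_output=True)

# A crude proliferation readout: time-averaged level of active NF-kappaB.
t = np.linspace(0.0, 50.0, 500)
n_t = sol.sol(t)[0]
print("mean active NF-kappaB:", n_t.mean())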
A benchmark-driven modelling approach for evaluating deployment choices on a multi-core architecture
Abstract:
The complexity of current and emerging architectures provides users with options about how best to use the available resources, but makes predicting performance challenging. In this work a benchmark-driven model is developed for a simple shallow water code on a Cray XE6 system, to explore how deployment choices such as domain decomposition and core affinity affect performance. The resource sharing present in modern multi-core architectures adds various levels of heterogeneity to the system. Shared resources often include cache, memory, network controllers and, in some cases, floating-point units (as in the AMD Bulldozer), which means that access times depend on the mapping of application tasks and on a core's location within the system. Heterogeneity increases further with the use of hardware accelerators such as GPUs and the Intel Xeon Phi, where many specialist cores are attached to general-purpose cores. This trend towards shared resources and non-uniform cores is expected to continue into the exascale era. The complexity of these systems means that various runtime scenarios are possible, and it has been found that under-populating nodes, altering the domain decomposition and using non-standard task-to-core mappings can dramatically alter performance. Finding this out, however, is often a process of trial and error. To better inform this process, a performance model was developed for a simple regular grid-based kernel code, shallow. The code comprises two distinct types of work: loop-based array updates and nearest-neighbour halo exchanges. Separate performance models were developed for each part, both based on a similar methodology. Application-specific benchmarks were run to measure performance for different problem sizes under different execution scenarios. These results were then fed into a performance model that derives resource usage for a given deployment scenario, interpolating between results as necessary.
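As a rough illustration of the interpolation step described above, the sketch below combines two hypothetical benchmark tables (loop-based array updates timed against local problem size, and halo exchanges timed against message size) into a predicted per-timestep cost for a given domain decomposition. The benchmark numbers, function names and decomposition layout are assumptions for illustration, not measurements from the Cray XE6 study.

import numpy as np

# Hypothetical benchmark tables (timings illustrative, not XE6 measurements).
compute_sizes = np.array([64, 128, 256, 512, 1024])      # local grid edge length
compute_times = np.array([0.4, 1.7, 7.1, 29.0, 118.0])   # ms per timestep for the array updates
halo_bytes    = np.array([1e3, 1e4, 1e5, 1e6])           # halo message size in bytes
halo_times    = np.array([0.02, 0.05, 0.35, 3.1])        # ms per exchange

def predict_timestep(global_n, px, py, word_bytes=8):
    """Predict per-timestep cost (ms) for a global_n x global_n grid on a px x py decomposition."""
    local_n = global_n / max(px, py)                      # local edge length per task (square-ish tiles assumed)
    compute = np.interp(local_n, compute_sizes, compute_times)
    msg = local_n * word_bytes                            # one row/column of halo data
    halo = 4 * np.interp(msg, halo_bytes, halo_times)     # four nearest-neighbour exchanges
    return compute + halo

print(predict_timestep(global_n=2048, px=8, py=8))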
Abstract:
The incorporation of numerical weather predictions (NWP) into a flood warning system can increase forecast lead times from a few hours to a few days. A single NWP forecast from a single forecast centre, however, is insufficient as it involves considerable non-predictable uncertainties and can lead to a high number of false or missed warnings. Weather forecasts using multiple NWPs from various weather centres, applied to catchment hydrology, can provide significantly improved early flood warning. The availability of global ensemble weather prediction systems through the ‘THORPEX Interactive Grand Global Ensemble’ (TIGGE) offers a new opportunity for the development of state-of-the-art early flood forecasting systems. This paper presents a case study using the TIGGE database for flood warning on a meso-scale catchment (4062 km2) located in the Midlands region of England. For the first time, a research attempt is made to set up a coupled atmospheric-hydrologic-hydraulic cascade system driven by the TIGGE ensemble forecasts. A probabilistic discharge and flood inundation forecast is provided as the end product to study the potential benefits of using the TIGGE database. The study shows that precipitation input uncertainties dominate and propagate through the cascade chain. The current NWPs fall short of representing the spatial precipitation variability on such a comparatively small catchment, which indicates the need to improve NWP resolution and/or disaggregation techniques to narrow the spatial gap between meteorology and hydrology. The spread of discharge forecasts varies from centre to centre, but it is generally large and implies a significant level of uncertainty. Nevertheless, the results show the TIGGE database is a promising tool for forecasting flood inundation, comparable with forecasts driven by rain-gauge observations.
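The probabilistic end product described above amounts to running every ensemble precipitation member through the downstream models and reading off a distribution of outcomes. The sketch below shows the idea with a toy linear-reservoir runoff model and a made-up warning threshold; the ensemble values, model and threshold are placeholders, not the TIGGE data or the hydrologic/hydraulic models used in the study.

import numpy as np

rng = np.random.default_rng(0)
n_members, n_hours = 51, 120
precip = rng.gamma(shape=0.4, scale=2.0, size=(n_members, n_hours))  # mm/h, synthetic ensemble

def toy_runoff(p, k=0.9, c=5.0):
    """Linear-reservoir style response: discharge follows smoothed cumulative rainfall."""
    q = np.zeros_like(p)
    for t in range(1, p.shape[-1]):
        q[..., t] = k * q[..., t - 1] + c * p[..., t]
    return q

peak_q = toy_runoff(precip).max(axis=1)   # peak discharge per ensemble member
flood_threshold = 80.0                    # arbitrary warning level, for illustration only
print("P(exceed threshold) =", (peak_q > flood_threshold).mean())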
Abstract:
This paper investigates the feasibility of using approximate Bayesian computation (ABC) to calibrate and evaluate complex individual-based models (IBMs). As ABC evolves, various versions are emerging, but here we only explore the most accessible version, rejection-ABC. Rejection-ABC involves running models a large number of times, with parameters drawn randomly from their prior distributions, and then retaining the simulations closest to the observations. Although well established in some fields, whether ABC will work with ecological IBMs is still uncertain. Rejection-ABC was applied to an existing 14-parameter earthworm energy budget IBM for which the available data consist of body mass growth and cocoon production in four experiments. ABC was able to narrow the posterior distributions of seven parameters, estimating credible intervals for each. ABC’s accepted values produced slightly better fits than the literature values did. The accuracy of the analysis was assessed using cross-validation and coverage, currently the best available tests. Of the seven parameters that were not narrowed, ABC revealed that three were correlated with other parameters, while the remaining four were not estimable given the available data. It is often desirable to compare models to see whether all component modules are necessary. Here we used ABC model selection to compare the full model with a simplified version which removed the earthworm’s movement and much of the energy budget. We are able to show that inclusion of the energy budget is necessary for a good fit to the data. We show how our methodology can inform future modelling cycles, and briefly discuss how more advanced versions of ABC may be applicable to IBMs. We conclude that ABC has the potential to represent uncertainty in model structure, parameters and predictions, and to embed the often complex process of optimizing an IBM’s structure and parameters within an established statistical framework, thereby making the process more transparent and objective.
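The rejection-ABC loop described here is simple enough to sketch directly: draw parameters from their priors, simulate, and keep the runs closest to the observations. The toy simulator, priors, observations and distance below are placeholders standing in for the 14-parameter earthworm IBM and its growth/cocoon data; only the accept-the-closest-simulations logic is the point.

import numpy as np

rng = np.random.default_rng(1)

# Placeholder "observations" and a toy simulator standing in for the earthworm IBM.
observed = np.array([2.0, 3.5, 5.0, 6.0])

def simulate(growth_rate, asymptote, t=np.arange(1, 5)):
    """Hypothetical stand-in for the IBM: saturating growth curve plus noise."""
    return asymptote * (1.0 - np.exp(-growth_rate * t)) + rng.normal(0.0, 0.1, t.size)

# Rejection-ABC: sample from the priors, simulate, retain the closest runs.
n_draws, keep_fraction = 20000, 0.001
growth_prior = rng.uniform(0.1, 2.0, n_draws)
asymptote_prior = rng.uniform(1.0, 10.0, n_draws)

distances = np.array([
    np.linalg.norm(simulate(g, a) - observed)
    for g, a in zip(growth_prior, asymptote_prior)
])
accepted = distances.argsort()[: int(n_draws * keep_fraction)]

print("posterior growth_rate:", np.percentile(growth_prior[accepted], [2.5, 50, 97.5]))
print("posterior asymptote:  ", np.percentile(asymptote_prior[accepted], [2.5, 50, 97.5]))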
Abstract:
We aim to develop an efficient robotic system for stroke rehabilitation, in which a robotic arm moves the hemiplegic upper limb when the patient tries to move it. In order to achieve this goal we have considered a method to detect the patient's intended motion using EEG (electroencephalogram) signals, and have designed a rehabilitation robot based on a Redundant Drive Method. In this paper, we propose an EEG-driven rehabilitation robot system and present initial results evaluating the feasibility of the proposed system.
Abstract:
A strong correlation between the speed of the eddy-driven jet and the width of the Hadley cell is found to exist in the Southern Hemisphere, both in reanalysis data and in twenty-first-century integrations from the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report multimodel archive. Analysis of the space–time spectra of eddy momentum flux reveals that variations in eddy-driven jet speed are related to changes in the mean phase speed of midlatitude eddies. An increase in eddy phase speeds induces a poleward shift of the critical latitudes and a poleward expansion of the region of subtropical wave breaking. The associated changes in eddy momentum flux convergence are balanced by anomalous meridional winds consistent with a wider Hadley cell. At the same time, faster eddies are also associated with a strengthened poleward eddy momentum flux, sustaining a stronger westerly jet in midlatitudes. The proposed mechanism is consistent with the seasonal dependence of the interannual variability of the Hadley cell width and appears to explain at least part of the projected twenty-first-century trends.
Abstract:
We study the effect of a thermal forcing confined to the midlatitudes of one hemisphere on the eddy-driven jet in the opposite hemisphere. We demonstrate the existence of an “interhemispheric teleconnection,” whereby warming (cooling) the Northern Hemisphere causes both the intertropical convergence zone (ITCZ) and the Southern Hemispheric midlatitude jet to shift northward (southward). The interhemispheric teleconnection is effected by a change in the asymmetry of the Hadley cells: as the ITCZ shifts away from the Equator, the cross-equatorial Hadley cell intensifies, fluxing more momentum toward the subtropics and sustaining a stronger subtropical jet. Changes in subtropical jet strength, in turn, alter the propagation of extratropical waves into the tropics, affecting eddy momentum fluxes and the eddy-driven westerlies. The relevance of this mechanism is demonstrated in the context of future climate change simulations, where shifts of the ITCZ are significantly related to shifts of the Southern Hemispheric eddy-driven jet in austral winter. The possible relevance of the proposed mechanism to paleoclimates is discussed, particularly with regard to theories of ice age terminations.
Abstract:
A strong relationship is found between changes in the meridional gradient of absorbed shortwave radiation (ASR) and Southern Hemispheric jet shifts in 21st century climate simulations of CMIP5 (Coupled Model Intercomparison Project phase 5) coupled models. The relationship is such that models with increases in the meridional ASR gradient around the southern midlatitudes, and therefore increases in midlatitude baroclinicity, tend to produce a larger poleward jet shift. The ASR changes are shown to be dominated by changes in cloud properties, with sea ice declines playing a secondary role. We demonstrate that the ASR changes are the cause, and not the result, of the intermodel differences in jet response by comparing coupled simulations with experiments in which sea surface temperature increases are prescribed. Our results highlight the importance of reducing the uncertainty in cloud feedbacks in order to constrain future circulation changes.
Abstract:
This paper explores the grounds of a common criticism of luck egalitarianism in order to make an argument about both the proper subject of theorising about justice and how to approach that subject. It draws a distinction between what it calls basic structure views and a priori baseline views, where the former take the institutional aspects of political prescriptions seriously and the latter do not. It argues that objections to luck egalitarianism on the grounds of its harshness can in part be explained by the latter's blindness to relevant features of institutions. Further, it may be that luck egalitarianism cannot regard its own enactment as just. A related objection to Dworkin’s equality of resources, which claims that it cannot pick a particular institutional background to set the costs of resources and so is radically indeterminate, is also presented. These results, I argue, give us good reason to reject all a priori baseline views.
Abstract:
Conference proceedings paper in Alexander, O. (Ed.) 2007, Proceedings of the 2005 joint BALEAP/SATEFL conference: New Approaches to Materials Development for Language Learning. Bern: Peter Lang.
Abstract:
Blanket bog occupies approximately 6 % of the area of the UK today. The Holocene expansion of this hyperoceanic biome has previously been explained as a consequence of Neolithic forest clearance. However, the present distribution of blanket bog in Great Britain can be predicted accurately with a simple model (PeatStash) based on summer temperature and moisture index thresholds, and the same model correctly predicts the highly disjunct distribution of blanket bog worldwide. This finding suggests that climate, rather than land-use history, controls blanket-bog distribution in the UK and everywhere else. We set out to test this hypothesis for blanket bogs in the UK using bioclimate envelope modelling compared with a database of peat initiation age estimates. We used both pollen-based reconstructions and climate model simulations of climate changes between the mid-Holocene (6000 yr BP, 6 ka) and the modern climate to drive PeatStash and predict areas of blanket bog. We compiled data on the timing of blanket-bog initiation, based on 228 age determinations at sites where peat directly overlies mineral soil. The model predicts that large areas of northern Britain would have had blanket bog by 6000 yr BP, and that the area suitable for peat growth extended to the south after this time. A similar pattern is shown by the basal peat ages: new blanket bog appeared over a larger area during the late Holocene, with the greatest expansion in Ireland, Wales and southwest England, as the model predicts. The expansion was driven by a summer cooling of about 2 °C, shown by both pollen-based reconstructions and climate models. The data show early Holocene (pre-Neolithic) blanket-bog initiation at over half of the sites in the core areas of Scotland and northern England. The temporal patterns, and the concurrence of the bioclimate model predictions with the initiation data, suggest that climate change provides a parsimonious explanation for the early Holocene distribution and later expansion of blanket bogs in the UK, and that it is not necessary to invoke anthropogenic activity as a driver of this major landscape change.
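The bioclimate-envelope step can be pictured as a simple threshold test on gridded climate fields: a cell is classed as suitable for blanket bog when summers are cool enough and the moisture index is high enough. The threshold values, variable names and synthetic grids below are illustrative assumptions; the published PeatStash calibration is not reproduced in the abstract.

import numpy as np

def blanket_bog_suitable(summer_temp_c, moisture_index,
                         max_summer_temp=14.5, min_moisture_index=1.0):
    """PeatStash-style envelope test: cool summers AND sufficient moisture.
    Threshold values here are illustrative, not the published calibration."""
    return (summer_temp_c <= max_summer_temp) & (moisture_index >= min_moisture_index)

# Example: apply the envelope to synthetic gridded climate fields.
rng = np.random.default_rng(2)
summer_temp = rng.uniform(8.0, 20.0, size=(50, 50))
moisture = rng.uniform(0.2, 2.0, size=(50, 50))
mask = blanket_bog_suitable(summer_temp, moisture)
print("fraction of grid cells suitable:", mask.mean())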
Abstract:
Observed and predicted changes in the strength of the westerly winds blowing over the Southern Ocean have motivated a number of studies of the response of the Antarctic Circumpolar Current and Southern Ocean Meridional Overturning Circulation (MOC) to wind perturbations, and led to the discovery of the “eddy-compensation” regime, wherein the MOC becomes insensitive to wind changes. In addition to the MOC, tracer transport also depends on mixing processes. Here we show, in a high-resolution process model, that isopycnal mixing by mesoscale eddies is strongly dependent on the wind strength. This dependence can be explained by mixing-length theory and is driven by increases in eddy kinetic energy; the mixing length does not change strongly in our simulation. Simulation of a passive ventilation tracer (analogous to CFCs or anthropogenic CO2) demonstrates that variations in tracer uptake across experiments are dominated by changes in isopycnal mixing, rather than changes in the MOC. We argue that, to properly understand tracer uptake under different wind-forcing scenarios, the sensitivity of isopycnal mixing to winds must be accounted for.
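The mixing-length argument invoked here is commonly summarised by a scaling of the isopycnal eddy diffusivity K with an eddy velocity scale and a mixing length; with the mixing length roughly fixed, as reported, the diffusivity grows with the square root of the eddy kinetic energy (EKE). A schematic form (notation and the order-one coefficient \Gamma are assumed, not quoted from the paper):

K \;\approx\; \Gamma\, u_{\mathrm{rms}}\, L_{\mathrm{mix}} \;\approx\; \Gamma\, \sqrt{2\,\mathrm{EKE}}\; L_{\mathrm{mix}}, \qquad L_{\mathrm{mix}} \approx \mathrm{const} \;\Rightarrow\; K \propto \sqrt{\mathrm{EKE}}.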