9 results for statistical framework

in CentAUR: Central Archive University of Reading - UK


Relevance:

60.00%

Publisher:

Abstract:

More than thirty years ago, Amari and colleagues proposed a statistical framework for identifying structurally stable macrostates of neural networks from observations of their microstates. We compare their stochastic stability criterion with a deterministic stability criterion based on the ergodic theory of dynamical systems, recently proposed for the scheme of contextual emergence and applied to particular inter-level relations in neuroscience. Stochastic and deterministic stability criteria for macrostates rely on macro-level contexts, which make them sensitive to differences between macro-levels.

Relevance:

60.00%

Publisher:

Abstract:

Transient neural assemblies mediated by synchrony in particular frequency ranges are thought to underlie cognition. We propose a new approach to their detection, using empirical mode decomposition (EMD), a data-driven approach removing the need for arbitrary bandpass filter cut-offs. Phase locking is sought between modes. We explore the features of EMD, including making a quantitative assessment of its ability to preserve phase content of signals, and proceed to develop a statistical framework with which to assess synchrony episodes. Furthermore, we propose a new approach to ensure signal decomposition using EMD. We adapt the Hilbert spectrum to a time-frequency representation of phase locking and are able to locate synchrony successfully in time and frequency between synthetic signals reminiscent of EEG. We compare our approach, which we call EMD phase locking analysis (EMDPL), with existing methods and show it to offer improved time-frequency localisation of synchrony.
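The phase-locking step described in this abstract can be illustrated in isolation. A full EMD sifting loop needs spline envelope fitting, so the sketch below assumes two already-extracted narrowband modes and only shows the phase-locking value (PLV) computed from Hilbert phases; all signal parameters are hypothetical, not taken from the paper.

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via the FFT-based Hilbert transform (numpy only)."""
    n = len(x)
    spec = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(spec * h)

def phase_locking_value(x, y):
    """PLV in [0, 1]; 1 indicates a constant phase relationship."""
    dphi = np.angle(analytic_signal(x)) - np.angle(analytic_signal(y))
    return np.abs(np.mean(np.exp(1j * dphi)))

t = np.linspace(0.0, 1.0, 1000, endpoint=False)
x = np.sin(2 * np.pi * 10 * t)          # 10 Hz mode
y = np.sin(2 * np.pi * 10 * t + 0.7)    # same mode, constant phase lag
print(round(phase_locking_value(x, y), 2))  # → 1.0
```

Two modes with a fixed phase lag give a PLV of 1; uncorrelated phases drive it towards 0, which is what a synchrony statistic can then be tested against.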

Relevance:

60.00%

Publisher:

Abstract:

The response of North Atlantic and European extratropical cyclones to climate change is investigated in the climate models participating in phase 5 of the Coupled Model Intercomparison Project (CMIP5). In contrast to previous multimodel studies, a feature-tracking algorithm is here applied to separately quantify the responses in the number, the wind intensity, and the precipitation intensity of extratropical cyclones. Moreover, a statistical framework is employed to formally assess the uncertainties in the multimodel projections. Under the midrange representative concentration pathway (RCP4.5) emission scenario, the December–February (DJF) response is characterized by a tripolar pattern over Europe, with an increase in the number of cyclones in central Europe and a decreased number in the Norwegian and Mediterranean Seas. The June–August (JJA) response is characterized by a reduction in the number of North Atlantic cyclones along the southern flank of the storm track. The total number of cyclones decreases in both DJF (24%) and JJA (22%). Classifying cyclones according to their intensity indicates a slight basinwide reduction in the number of cyclones associated with strong winds, but an increase in those associated with strong precipitation. However, in DJF, a slight increase in the number and intensity of cyclones associated with strong wind speeds is found over the United Kingdom and central Europe. The results are confirmed under the high-emission RCP8.5 scenario, where the signals tend to be larger. The sources of uncertainty in these projections are discussed.
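The kind of formal multimodel assessment the abstract mentions can be sketched minimally: given one response number per model, estimate the multimodel mean, a rough interval on it, and the fraction of models agreeing on the sign. The values below are illustrative placeholders, not the paper's CMIP5 data.

```python
import numpy as np

# hypothetical DJF changes (%) in regional cyclone number from seven models
changes = np.array([-6.1, -2.3, -4.8, -3.9, -5.5, -1.2, -4.0])

mean = changes.mean()
se = changes.std(ddof=1) / np.sqrt(len(changes))
lo, hi = mean - 2 * se, mean + 2 * se          # crude ~95% interval on the mean
agree = np.mean(np.sign(changes) == np.sign(mean))  # sign agreement across models
print(f"mean {mean:.1f}%, ~95% CI [{lo:.1f}, {hi:.1f}], agreement {agree:.0%}")
```

A projection would usually be called robust when the interval excludes zero and sign agreement is high, as it is for these placeholder numbers.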

Relevance:

60.00%

Publisher:

Abstract:

We apply a new parameterisation of the Greenland ice sheet (GrIS) feedback between surface mass balance (SMB: the sum of surface accumulation and surface ablation) and surface elevation in the MAR regional climate model (Edwards et al., 2014) to projections of future climate change using five ice sheet models (ISMs). The MAR (Modèle Atmosphérique Régional: Fettweis, 2007) climate projections are for 2000–2199, forced by the ECHAM5 and HadCM3 global climate models (GCMs) under the SRES A1B emissions scenario. The additional sea level contribution due to the SMB–elevation feedback averaged over five ISM projections for ECHAM5 and three for HadCM3 is 4.3% (best estimate; 95% credibility interval 1.8–6.9%) at 2100, and 9.6% (best estimate; 95% credibility interval 3.6–16.0%) at 2200. In all results the elevation feedback is significantly positive, amplifying the GrIS sea level contribution relative to the MAR projections in which the ice sheet topography is fixed: the lower bounds of our 95% credibility intervals (CIs) for sea level contributions are larger than the “no feedback” case for all ISMs and GCMs. Our method is novel in sea level projections because we propagate three types of modelling uncertainty – GCM and ISM structural uncertainties, and elevation feedback parameterisation uncertainty – along the causal chain, from SRES scenario to sea level, within a coherent experimental design and statistical framework. The relative contributions to uncertainty depend on the timescale of interest. At 2100, the GCM uncertainty is largest, but by 2200 both the ISM and parameterisation uncertainties are larger. We also perform a perturbed parameter ensemble with one ISM to estimate the shape of the projected sea level probability distribution; our results indicate that the probability density is slightly skewed towards higher sea level contributions.
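Propagating several independent uncertainty sources to a credible interval, as the abstract describes, can be sketched with a Monte Carlo sum. The spreads below are invented for illustration (they are not the paper's ensembles), and the three sources are assumed independent and additive, which is a simplification of the paper's design.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
central = 4.3                           # hypothetical best-estimate feedback (%)
gcm   = rng.normal(0.0, 1.0, n)         # GCM structural uncertainty (illustrative)
ism   = rng.normal(0.0, 0.6, n)         # ISM structural uncertainty (illustrative)
param = rng.normal(0.0, 0.4, n)         # parameterisation uncertainty (illustrative)

total = central + gcm + ism + param     # propagate along the causal chain
lo, hi = np.percentile(total, [2.5, 97.5])
print(f"best estimate {central}%, 95% CI [{lo:.1f}, {hi:.1f}]%")
```

Working with samples rather than analytic error formulas also makes the distribution's shape (for instance the skew mentioned in the abstract) directly inspectable.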

Relevance:

60.00%

Publisher:

Abstract:

This paper investigates the feasibility of using approximate Bayesian computation (ABC) to calibrate and evaluate complex individual-based models (IBMs). As ABC evolves, various versions are emerging, but here we only explore the most accessible version, rejection-ABC. Rejection-ABC involves running models a large number of times, with parameters drawn randomly from their prior distributions, and then retaining the simulations closest to the observations. Although well-established in some fields, whether ABC will work with ecological IBMs is still uncertain. Rejection-ABC was applied to an existing 14-parameter earthworm energy budget IBM for which the available data consist of body mass growth and cocoon production in four experiments. ABC was able to narrow the posterior distributions of seven parameters, estimating credible intervals for each. ABC’s accepted values produced slightly better fits than literature values did. The accuracy of the analysis was assessed using cross-validation and coverage, currently the best available tests. Of the seven unnarrowed parameters, ABC revealed that three were correlated with other parameters, while the remaining four were found to be not estimable given the data available. It is often desirable to compare models to see whether all component modules are necessary. Here we used ABC model selection to compare the full model with a simplified version which removed the earthworm’s movement and much of the energy budget. We are able to show that inclusion of the energy budget is necessary for a good fit to the data. We show how our methodology can inform future modelling cycles, and briefly discuss how more advanced versions of ABC may be applicable to IBMs.
We conclude that ABC has the potential to represent uncertainty in model structure, parameters and predictions, and to embed the often complex process of optimizing an IBM’s structure and parameters within an established statistical framework, thereby making the process more transparent and objective.
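The rejection-ABC recipe in the abstract (sample from the prior, simulate, retain the runs closest to the observations) is simple enough to sketch end to end. A toy one-parameter simulator stands in for the 14-parameter earthworm IBM; the observed summary, prior range, and acceptance fraction are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate(theta, rng):
    """Toy stand-in for the IBM: noisy summary statistic of a model run."""
    return rng.normal(theta, 1.0, 50).mean()

observed = 3.0                                 # hypothetical observed summary
n_sims = 20_000
prior = rng.uniform(0.0, 10.0, n_sims)         # draw parameters from the prior
sims = np.array([simulate(t, rng) for t in prior])
distance = np.abs(sims - observed)             # distance to the observations
keep = np.argsort(distance)[: n_sims // 100]   # retain the closest 1%
posterior = prior[keep]
lo, hi = np.percentile(posterior, [2.5, 97.5])
print(f"posterior mean {posterior.mean():.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```

The retained parameter values approximate the posterior, from which credible intervals follow directly; a real IBM application differs mainly in the cost of `simulate` and in using several summary statistics at once.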

Relevance:

40.00%

Publisher:

Abstract:

We propose a new modelling framework suitable for the description of atmospheric convective systems as a collection of distinct plumes. The literature contains many examples of models for collections of plumes in which strong simplifying assumptions are made, a diagnostic dependence of convection on the large-scale environment and the limit of many plumes often being imposed from the outset. Some recent studies have sought to remove one or the other of those assumptions. The proposed framework removes both, and is explicitly time-dependent and stochastic in its basic character. The statistical dynamics of the plume collection are defined through simple probabilistic rules applied at the level of individual plumes, and van Kampen's system size expansion is then used to construct the macroscopic limit of the microscopic model. Through suitable choices of the microscopic rules, the model is shown to encompass previous studies in the appropriate limits, and to allow their natural extensions beyond those limits.
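The "simple probabilistic rules applied at the level of individual plumes" can be illustrated with a minimal birth–death process simulated exactly (Gillespie-style). The rates below are invented for illustration and the rules are deliberately the simplest possible: constant initiation, independent per-plume decay.

```python
import numpy as np

rng = np.random.default_rng(1)
# hypothetical microscopic rules: constant plume initiation rate `birth`,
# independent per-plume decay rate `death` (values illustrative)
birth, death = 50.0, 1.0
n, t, t_end = 0, 0.0, 200.0
while t < t_end:
    total = birth + death * n
    t += rng.exponential(1.0 / total)      # exponential waiting time to next event
    if rng.random() < birth / total:
        n += 1                             # a new plume is initiated
    else:
        n -= 1                             # an existing plume decays
print(n)  # fluctuates about the macroscopic fixed point birth/death = 50
```

In the many-plume limit the mean obeys the deterministic closure, with fluctuations of order the square root of the plume number — the regime a system size expansion makes precise.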

Relevance:

30.00%

Publisher:

Abstract:

A new technique is described for the analysis of cloud-resolving model simulations, which allows one to investigate the statistics of the lifecycles of cumulus clouds. Clouds are tracked from timestep to timestep within the model run. This allows for a very simple method of tracking, but one which is both comprehensive and robust. An approach for handling cloud splits and mergers is described which allows clouds with simple and complicated time histories to be compared within a single framework. This is found to be important for the analysis of an idealized simulation of radiative-convective equilibrium, in which the moist, buoyant updrafts (i.e., the convective cores) were tracked. Around half of all such cores were subject to splits and mergers during their lifecycles. For cores without any such events, the average lifetime is 30 min, but events can lengthen the typical lifetime considerably.
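One common way to realise timestep-to-timestep tracking is pixel-overlap linking between labelled cloud masks at consecutive times, with mergers showing up as a current cloud inheriting several parents (and splits as one parent feeding several children). The sketch below is a generic illustration of that idea, not the paper's specific algorithm.

```python
import numpy as np

def link_clouds(prev, curr):
    """Link labelled cloud masks between consecutive timesteps by pixel
    overlap; a current cloud with two or more parents marks a merger."""
    links = {}
    for lab in np.unique(curr):
        if lab == 0:                  # label 0 = cloud-free
            continue
        parents = {int(p) for p in np.unique(prev[curr == lab]) if p != 0}
        links[int(lab)] = parents
    return links

prev = np.array([[1, 1, 0, 2, 2],
                 [1, 1, 0, 2, 2]])
curr = np.array([[0, 3, 3, 3, 0],
                 [0, 3, 3, 3, 0]])
print(link_clouds(prev, curr))  # → {3: {1, 2}}: clouds 1 and 2 have merged
```

Chaining these links across all timesteps yields each cloud's lifecycle graph, from which lifetime statistics with and without split/merger events can be compiled.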

Relevance:

30.00%

Publisher:

Abstract:

We consider the general response theory recently proposed by Ruelle for describing the impact of small perturbations to the non-equilibrium steady states resulting from Axiom A dynamical systems. We show that the causality of the response functions entails the possibility of writing a set of Kramers-Kronig (K-K) relations for the corresponding susceptibilities at all orders of nonlinearity. Nonetheless, only a special class of directly observable susceptibilities obey K-K relations. Specific results are provided for the case of arbitrary order harmonic response, which allows for a very comprehensive K-K analysis and the establishment of sum rules connecting the asymptotic behavior of the harmonic generation susceptibility to the short-time response of the perturbed system. These results place previous findings obtained for optical systems and simple mechanical models in a more general theoretical framework, and shed light on the very general impact of considering the principle of causality for testing self-consistency: the described dispersion relations constitute unavoidable benchmarks that any experimental and model generated dataset must obey. The theory exposed in the present paper is dual to the time-dependent theory of perturbations to equilibrium states and to non-equilibrium steady states, and has in principle a similar range of applicability and limitations. In order to connect the equilibrium and the non-equilibrium steady state case, we show how to rewrite the classical response theory by Kubo so that response functions formally identical to those proposed by Ruelle, apart from the measure involved in the phase space integration, are obtained. These results, taking into account the chaotic hypothesis by Gallavotti and Cohen, might be relevant in several fields, including climate research.
In particular, whereas the fluctuation-dissipation theorem does not work for non-equilibrium systems, because of the non-equivalence between internal and external fluctuations, K-K relations might be robust tools for the definition of a self-consistent theory of climate change.
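For orientation, the standard linear-response form of the Kramers-Kronig relations (not the paper's higher-order generalisation) links the real and imaginary parts of a causal susceptibility $\chi(\omega)$ through principal-value integrals:

```latex
\mathrm{Re}\,\chi(\omega) = \frac{1}{\pi}\,\mathcal{P}\!\int_{-\infty}^{\infty}
  \frac{\mathrm{Im}\,\chi(\omega')}{\omega' - \omega}\,\mathrm{d}\omega',
\qquad
\mathrm{Im}\,\chi(\omega) = -\frac{1}{\pi}\,\mathcal{P}\!\int_{-\infty}^{\infty}
  \frac{\mathrm{Re}\,\chi(\omega')}{\omega' - \omega}\,\mathrm{d}\omega'
```

Because causality alone forces each part to determine the other, any measured or simulated susceptibility that violates these relations is internally inconsistent — which is the benchmark role the abstract assigns to them.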

Relevance:

30.00%

Publisher:

Abstract:

Regional climate downscaling has arrived at an important juncture. Some in the research community favour continued refinement and evaluation of downscaling techniques within a broader framework of uncertainty characterisation and reduction. Others are calling for smarter use of downscaling tools, accepting that conventional, scenario-led strategies for adaptation planning have limited utility in practice. This paper sets out the rationale and new functionality of the Decision Centric (DC) version of the Statistical DownScaling Model (SDSM-DC). This tool enables synthesis of plausible daily weather series, exotic variables (such as tidal surge), and climate change scenarios guided, not determined, by climate model output. Two worked examples are presented. The first shows how SDSM-DC can be used to reconstruct and in-fill missing records based on calibrated predictor-predictand relationships. Daily temperature and precipitation series from sites in Africa, Asia and North America are deliberately degraded to show that SDSM-DC can reconstitute lost data. The second demonstrates the application of the new scenario generator for stress testing a specific adaptation decision. SDSM-DC is used to generate daily precipitation scenarios to simulate winter flooding in the Boyne catchment, Ireland. This sensitivity analysis reveals the conditions under which existing precautionary allowances for climate change might be insufficient. We conclude by discussing the wider implications of the proposed approach and research opportunities presented by the new tool.
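The first worked example — calibrating predictor-predictand relationships on surviving data and using them to reconstitute a degraded block of the record — can be sketched with an ordinary least-squares regression on synthetic data. Everything below (predictor count, noise level, the degraded block) is hypothetical and stands in for SDSM-DC's actual calibration machinery.

```python
import numpy as np

rng = np.random.default_rng(0)
days = 1000
X = rng.normal(size=(days, 3))             # hypothetical large-scale predictors
beta_true = np.array([2.0, -1.0, 0.5])
y = X @ beta_true + 10.0 + rng.normal(0.0, 0.3, days)  # station predictand

missing = np.zeros(days, dtype=bool)
missing[200:300] = True                    # deliberately degrade a block

# calibrate the predictor-predictand regression on the surviving record
A = np.column_stack([np.ones(days), X])
beta = np.linalg.lstsq(A[~missing], y[~missing], rcond=None)[0]

y_filled = y.copy()
y_filled[missing] = A[missing] @ beta      # reconstitute the lost block
err = np.abs(y_filled[missing] - (X[missing] @ beta_true + 10.0)).mean()
print(f"mean in-fill error: {err:.3f}")
```

Because the predictors remain available for the degraded period, the calibrated relationship recovers the lost block to within the noise level — the same logic the paper tests by deliberately degrading real station series.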