992 results for Calculation methodology


Relevance: 20.00%

Abstract:

Many modern statistical applications involve inference for complex stochastic models, where it is easy to simulate from the models, but impossible to calculate likelihoods. Approximate Bayesian computation (ABC) is a method of inference for such models. It replaces calculation of the likelihood by a step which involves simulating artificial data for different parameter values, and comparing summary statistics of the simulated data with summary statistics of the observed data. Here we show how to construct appropriate summary statistics for ABC in a semi-automatic manner. We aim for summary statistics which will enable inference about certain parameters of interest to be as accurate as possible. Theoretical results show that optimal summary statistics are the posterior means of the parameters. Although these cannot be calculated analytically, we use an extra stage of simulation to estimate how the posterior means vary as a function of the data; and we then use these estimates of our summary statistics within ABC. Empirical results show that our approach is a robust method for choosing summary statistics that can result in substantially more accurate ABC analyses than the ad hoc choices of summary statistics that have been proposed in the literature. We also demonstrate advantages over two alternative methods of simulation-based inference.
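To make the two-stage idea concrete, here is a minimal Python sketch of the semi-automatic construction described above, using an assumed toy model and a plain rejection-ABC step: pilot simulations are used to fit a linear regression of the parameter on candidate data features, and the fitted values then serve as the summary statistic. The model, features, sample sizes and tolerance are illustrative choices, not those of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta, n=50):
    """Toy model: i.i.d. Gamma data whose scale depends on the parameter theta."""
    return rng.gamma(shape=2.0, scale=np.exp(theta), size=n)

def features(data):
    """Candidate data features used as regressors (moments and order statistics)."""
    return np.array([data.mean(), data.std(), np.median(data), np.log(data).mean()])

# Stage 1: pilot simulations, then a linear fit of theta on the data features.
theta_pilot = rng.normal(0.0, 1.0, size=5000)             # draws from the prior
X = np.array([features(simulate(t)) for t in theta_pilot])
X1 = np.column_stack([np.ones(len(X)), X])
beta, *_ = np.linalg.lstsq(X1, theta_pilot, rcond=None)   # approximates E[theta | data]

def summary(data):
    """Semi-automatic summary statistic: the fitted posterior-mean estimate."""
    return np.concatenate([[1.0], features(data)]) @ beta

# Stage 2: plain rejection ABC using the learned summary statistic.
theta_true = 0.7
s_obs = summary(simulate(theta_true))
theta_prop = rng.normal(0.0, 1.0, size=20000)             # fresh prior draws
s_sim = np.array([summary(simulate(t)) for t in theta_prop])
accepted = theta_prop[np.abs(s_sim - s_obs) < 0.05]
print(f"accepted {accepted.size} draws; ABC posterior mean ~ {accepted.mean():.2f} "
      f"(true theta = {theta_true})")
```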

Relevance: 20.00%

Abstract:

Contrails, and especially their evolution into cirrus-like clouds, are thought to have very important effects on local and global radiation budgets, though they are generally not well represented in global climate models. The lack of contrail parameterisations is due to the limited availability of in situ contrail measurements, which are difficult to obtain. Here we present a methodology for successful sampling and interpretation of contrail microphysical and radiative data using both in situ and remote sensing instrumentation on board the FAAM BAe146 UK research aircraft as part of the COntrails Spreading Into Cirrus (COSIC) study.

Relevance: 20.00%

Abstract:

Although there is evidence that exact calculation recruits left hemisphere perisylvian language systems, recent work has shown that exact calculation can be retained despite severe damage to these networks. In this study, we sought to identify a “core” network for calculation and hence to determine the extent to which left hemisphere language areas are part of this network. We examined performance on addition and subtraction problems in two modalities: one using conventional two-digit problems that can be easily encoded into language; the other using novel shape representations. With regard to numerical problems, our results revealed increased left fronto-temporal activity in addition, and increased parietal activity in subtraction, potentially reflecting retrieval of linguistically encoded information during addition. The shape problems elicited activations of occipital, parietal and dorsal temporal regions, reflecting visual reasoning processes. A core activation common to both calculation types involved the superior parietal lobule bilaterally, right temporal sub-gyral area, and left lateralized activations in inferior parietal (BA 40), frontal (BA 6/8/32) and occipital (BA 18) regions. The large bilateral parietal activation could be attributed to visuo-spatial processing in calculation. The inferior parietal region, and particularly the left angular gyrus, was part of the core calculation network. However, given its activation in both shape and number tasks, its role is unlikely to reflect linguistic processing per se. A possibility is that it serves to integrate right hemisphere visuo-spatial and left hemisphere linguistic and executive processing in calculation.

Relevance: 20.00%

Abstract:

The role of language in exact calculation is the subject of debate. Some behavioral and functional neuroimaging investigations of healthy participants suggest that calculation requires language resources. However, there are also reports of individuals with severe aphasic language impairment who retain calculation ability. One possibility in resolving these discordant findings is that the neural basis of calculation has undergone significant reorganization in aphasic calculators. Using fMRI, we examined brain activations associated with exact addition and subtraction in two patients with severe agrammatic aphasia and retained calculation ability. Behavior and brain activations during two-digit addition and subtraction were compared to those of a group of 11 healthy, age-matched controls. Behavioral results confirmed that both patients retained calculation ability. Imaging findings revealed individual differences in processing, but also a similar activation pattern across patients and controls in bilateral parietal cortices. Patients differed from controls in small areas of increased activation in peri-lesional regions, a shift from left fronto-temporal activation to the contralateral region, and increased activations in bilateral superior parietal regions. Our results suggest that bilateral parietal cortex represents the core of the calculation network and, while healthy controls may recruit language resources to support calculation, these mechanisms are not mandatory in adult cognition.

Relevance: 20.00%

Abstract:

The Wetland and Wetland CH4 Intercomparison of Models Project (WETCHIMP) was created to evaluate our present ability to simulate large-scale wetland characteristics and corresponding methane (CH4) emissions. A multi-model comparison is essential to evaluate the key uncertainties in the mechanisms and parameters leading to methane emissions. Ten modelling groups joined WETCHIMP to run eight global and two regional models with a common experimental protocol using the same climate and atmospheric carbon dioxide (CO2) forcing datasets. We reported the main conclusions from the intercomparison effort in a companion paper (Melton et al., 2013). Here we provide technical details for the six experiments, which included an equilibrium, a transient, and an optimized run plus three sensitivity experiments (temperature, precipitation, and atmospheric CO2 concentration). The diversity of approaches used by the models is summarized through a series of conceptual figures, and is used to evaluate the wide range of wetland extent and CH4 fluxes predicted by the models in the equilibrium run. We discuss relationships among the various approaches and patterns in consistencies of these model predictions. Within this group of models, there are three broad classes of methods used to estimate wetland extent: prescribed based on wetland distribution maps, prognostic relationships between hydrological states based on satellite observations, and explicit hydrological mass balances. A larger variety of approaches was used to estimate the net CH4 fluxes from wetland systems. Even though modelling of wetland extent and CH4 emissions has progressed significantly over recent decades, large uncertainties still exist when estimating CH4 emissions: there is little consensus on model structure or complexity due to knowledge gaps, different aims of the models, and the range of temporal and spatial resolutions of the models.
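As a rough illustration of how the wetland-extent choice feeds into an emission estimate, the short Python sketch below contrasts a prescribed (map-based) extent with a toy hydrological mass-balance extent, each driving the same simple temperature-dependent CH4 flux. All functions, constants and forcing values are invented for illustration and do not correspond to any WETCHIMP model.

```python
import numpy as np

def ch4_flux(extent_frac, soil_temp_c, base_rate=10.0, q10=3.0):
    """Toy CH4 emission (mg CH4 m-2 day-1): a base rate scaled by wetland
    fraction and a Q10 temperature response. All constants are illustrative."""
    return extent_frac * base_rate * q10 ** ((soil_temp_c - 10.0) / 10.0)

# Approach 1: prescribed extent read from a wetland distribution map (here a constant).
prescribed_extent = 0.12            # fraction of the grid cell mapped as wetland

# Approach 2: explicit hydrological mass balance -> extent from a simple water store.
precip = np.array([3.0, 5.0, 1.0, 0.0, 4.0, 6.0])    # mm/day, illustrative
evap   = np.array([2.0, 2.5, 3.0, 3.5, 2.0, 1.5])    # mm/day, illustrative
storage = np.clip(np.cumsum(precip - evap), 0.0, None)
mass_balance_extent = np.clip(storage / 20.0, 0.0, 1.0)   # saturates at 20 mm stored

soil_temp = np.array([8.0, 10.0, 12.0, 14.0, 13.0, 11.0])  # deg C

print("prescribed-map flux:", ch4_flux(prescribed_extent, soil_temp).round(2))
print("mass-balance flux  :", ch4_flux(mass_balance_extent, soil_temp).round(2))
```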

Relevance: 20.00%

Abstract:

Many urban surface energy balance models now exist. These vary in complexity from simple schemes that represent the city as a concrete slab, to those which incorporate detailed representations of momentum and energy fluxes distributed within the atmospheric boundary layer. While many of these schemes have been evaluated against observations, with some models even compared with the same data sets, such evaluations have not been undertaken in a controlled manner to enable direct comparison. For other types of climate model, for instance the Project for Intercomparison of Land-Surface Parameterization Schemes (PILPS) experiments (Henderson-Sellers et al., 1993), such controlled comparisons have been shown to provide important insights into both the mechanics of the models and the physics of the real world. This paper describes the progress that has been made to date on a systematic and controlled comparison of urban surface schemes. The models to be considered, and their key attributes, are described, along with the methodology to be used for the evaluation.
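For readers unfamiliar with the simplest end of this model spectrum, the sketch below shows what a "concrete slab" urban scheme can look like: net all-wave radiation is computed from prescribed shortwave and longwave forcing and partitioned into sensible, latent and storage terms with bulk coefficients. It is a generic illustration with assumed parameter values, not one of the schemes in the comparison.

```python
# Minimal "concrete slab" urban surface energy balance (illustrative constants only).
SIGMA = 5.67e-8        # Stefan-Boltzmann constant, W m-2 K-4

def slab_energy_balance(sw_down, lw_down, t_surf, t_air, albedo=0.15,
                        emissivity=0.95, h_bulk=15.0, bowen=2.0):
    """Return the energy balance terms (W m-2) for a uniform slab surface.

    Q*   = K_down(1 - albedo) + eps*(L_down - sigma*T_s^4)   net all-wave radiation
    Q_H  = h_bulk * (T_s - T_a)                              bulk sensible heat flux
    Q_E  = Q_H / bowen                                       latent heat via Bowen ratio
    dQ_S = Q* - Q_H - Q_E                                    storage as the residual
    """
    q_star = sw_down * (1 - albedo) + emissivity * (lw_down - SIGMA * t_surf**4)
    q_h = h_bulk * (t_surf - t_air)
    q_e = q_h / bowen
    dq_s = q_star - q_h - q_e
    return q_star, q_h, q_e, dq_s

q_star, q_h, q_e, dq_s = slab_energy_balance(sw_down=600.0, lw_down=350.0,
                                             t_surf=305.0, t_air=300.0)
print(f"Q*={q_star:.0f}  Q_H={q_h:.0f}  Q_E={q_e:.0f}  dQ_S={dq_s:.0f} W m-2")
```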

Relevance: 20.00%

Abstract:

We analyse by simulation the impact of model-selection strategies (sometimes called pre-testing) on forecast performance in both constant- and non-constant-parameter processes. Restricted, unrestricted and selected models are compared when either of the first two might generate the data. We find little evidence that strategies such as general-to-specific induce significant over-fitting, or thereby cause forecast-failure rejection rates to greatly exceed nominal sizes. Parameter non-constancies put a premium on correct specification, but in general, model-selection effects appear to be relatively small, and progressive research is able to detect the mis-specifications.
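A minimal Monte Carlo sketch of this kind of pre-testing experiment is given below, assuming a simple static regression: data are generated with the second regressor either absent (a restricted DGP) or present (an unrestricted DGP), and one-step forecasts from restricted, unrestricted and t-test-selected models are compared by RMSE. The design, sample size and critical value are illustrative rather than those used in the study.

```python
import numpy as np

rng = np.random.default_rng(1)

def ols(X, y):
    """Ordinary least squares coefficients."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

def one_replication(T=100, x2_coef=0.5, t_crit=1.96):
    """Generate y_t = 0.5*x1_t + x2_coef*x2_t + e_t and forecast y_{T+1} with
    a restricted (x1 only), an unrestricted (x1, x2) and a selected model that
    drops x2 when its t-ratio is insignificant (crude general-to-specific)."""
    X = rng.normal(size=(T + 1, 2))
    y = 0.5 * X[:, 0] + x2_coef * X[:, 1] + rng.normal(size=T + 1)
    Xt, yt, x_new = X[:T], y[:T], X[T]

    b_unres = ols(Xt, yt)
    b_res = ols(Xt[:, :1], yt)
    resid = yt - Xt @ b_unres
    s2 = resid @ resid / (T - 2)
    var_b2 = s2 * np.linalg.inv(Xt.T @ Xt)[1, 1]       # variance of the x2 coefficient
    keep_x2 = abs(b_unres[1] / np.sqrt(var_b2)) > t_crit

    f_unres = x_new @ b_unres
    f_res = x_new[:1] @ b_res
    f_sel = f_unres if keep_x2 else f_res
    return [(f - y[T]) ** 2 for f in (f_res, f_unres, f_sel)]

for coef in (0.0, 0.5):            # restricted DGP vs unrestricted DGP
    mse = np.mean([one_replication(x2_coef=coef) for _ in range(2000)], axis=0)
    print(f"x2 coef={coef}: RMSE restricted/unrestricted/selected =",
          np.sqrt(mse).round(3))
```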

Relevance: 20.00%

Abstract:

This paper presents a software-based study of a hardware-based non-sorting median calculation method on a set of integer numbers. The method divides the binary representation of each integer element in the set into bit slices in order to find the element located in the middle position. The method exhibits linear complexity, and our analysis shows that the best performance in execution time is obtained when 4-bit slices are used for 8-bit and 16-bit integers, for almost any data set size. Results suggest that a software implementation of the bit-slice method for median calculation outperforms sorting-based methods, with the improvement increasing with data set size. For data set sizes of N > 5, our simulations show an improvement of at least 40%.
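One plausible software reading of the bit-slice idea is a most-significant-slice-first radix selection: at each slice level the remaining candidates are counted into 2^slice_bits buckets and only the bucket containing the middle rank is retained, so the median is found without sorting. The Python sketch below follows that reading with a configurable slice width (the paper reports 4-bit slices performing best); the function name and test data are illustrative, not taken from the paper.

```python
def bitslice_median(values, bits=16, slice_bits=4):
    """Select the lower median of non-negative integers without sorting.

    Scans bit slices from the most significant to the least significant:
    at each level the remaining candidates are distributed into 2**slice_bits
    buckets and only the bucket containing the middle rank is kept.
    """
    k = (len(values) - 1) // 2          # rank of the lower median
    candidates = list(values)
    mask = (1 << slice_bits) - 1

    for shift in range(bits - slice_bits, -1, -slice_bits):
        buckets = [[] for _ in range(1 << slice_bits)]
        for v in candidates:
            buckets[(v >> shift) & mask].append(v)
        for bucket in buckets:           # find the bucket holding rank k
            if k < len(bucket):
                candidates = bucket
                break
            k -= len(bucket)
    return candidates[k]

data = [37, 5, 199, 65535, 42, 5, 1000]
assert bitslice_median(data) == sorted(data)[(len(data) - 1) // 2]
print(bitslice_median(data))            # 42
```

Each pass is linear in the number of surviving candidates and the number of passes is fixed at bits/slice_bits, so the total work grows linearly with the input size, consistent with the linear complexity reported above.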

Relevance: 20.00%

Abstract:

Anticoagulant rodenticides have been known for over half a century as an effective and safe method of rodent control. However, anticoagulant resistance, first discovered in 1958, presents a serious problem for their long-term use. Laboratory tests provide the main method for identifying the different types of anticoagulant resistance, quantifying the magnitude of their effect and helping to choose the best pest control strategy. The most important tests are the lethal feeding period (LFP) and blood clotting response (BCR) tests. These tests can now be used to quantify the likely effect of resistance on treatment outcome by providing an estimate of the 'resistance factor'. In 2004 the gene responsible for anticoagulant resistance (VKORC1) was identified and sequenced. As a result, a new molecular resistance-testing methodology has been developed and a number of resistance mutations have been identified, particularly in Norway rats and house mice. Three mutations of the VKORC1 gene in Norway rats have been identified to date that confer a degree of resistance to bromadiolone and difenacoum sufficient to affect treatment outcome in the field.

Relevance: 20.00%

Abstract:

This paper makes a theoretical case for using these two systems approaches together. The theoretical and methodological assumptions of system dynamics (SD) and soft systems methodology (SSM) are briefly described and a partial critique is presented. SSM generates and represents diverse perspectives on a problem situation and addresses the socio-political elements of an intervention. However, it is weak in ensuring 'dynamic coherence': consistency between the intuitive behaviour resulting from proposed changes and the behaviour deduced from ideas on causal structure. Conversely, SD examines causal structures and dynamic behaviours. However, whilst emphasising the need for a clear issue focus, it has little theory for generating and representing diverse issues. Also, there is no theory for facilitating sensitivity to socio-political elements. A synthesis of the two, called 'Holon Dynamics', is proposed. After an SSM intervention, a second stage continues the socio-political analysis and also operates within a new perspective which values dynamic coherence of the mental construct (the holon) that is capable of expressing the proposed changes. A model of this holon is constructed using SD, and the changes are thus rendered 'systemically desirable' in the additional sense that dynamic consistency has been confirmed. The paper closes with reflections on the proposal, and the need for theoretical consistency when mixing tools is emphasised.

Relevance: 20.00%

Abstract:

This article reviews the experiences of a practising business consultancy division. It discusses the reasons for the failure of the traditional, expert consultancy approach and states the requirements for a more suitable consultancy methodology. An approach called ‘Modelling as Learning’ is introduced, its three defining aspects being: client ownership of all analytical work performed, consultant acting as facilitator and sensitivity to soft issues within and surrounding a problem. The goal of such an approach is set as the acceleration of the client's learning about the business. The tools that are used within this methodological framework are discussed and some case studies of the methodology are presented. It is argued that a learning experience was necessary before arriving at the new methodology but that it is now a valuable and significant component of the division's work.

Relevance: 20.00%

Abstract:

The WFDEI meteorological forcing data set has been generated using the same methodology as the widely used WATCH Forcing Data (WFD) by making use of the ERA-Interim reanalysis data. We discuss the specifics of how changes in the reanalysis and processing have led to improvement over the WFD. We attribute improvements in precipitation and wind speed to the latest reanalysis basis data and improved downward shortwave fluxes to the changes in the aerosol corrections. Covering 1979–2012, the WFDEI will allow more thorough comparisons of hydrological and Earth System model outputs with hydrologically and phenologically relevant satellite products than using the WFD.