920 results for Memory errors


Relevance: 20.00%

Abstract:

Regional to global scale modelling of N flux from land to ocean has progressed to date through the development of simple empirical models representing bulk N flux rates from large watersheds, regions, or continents on the basis of a limited selection of model parameters. Watershed scale N flux modelling has developed a range of approaches, from physically-based models in which N flux rates are predicted through an explicit representation of the processes involved, through to catchment scale models which provide a simplified representation of true system behaviour. Generally, these watershed scale models describe within their structure the dominant process controls on N flux at the catchment or watershed scale, and take into account variations in the extent to which these processes control N flux rates as a function of landscape sensitivity to N cycling and export. This paper addresses the nature of the errors and uncertainties inherent in existing regional to global scale models, and the nature of error propagation associated with upscaling from small catchment to regional scale, through a suite of spatial aggregation and conceptual lumping experiments conducted on a validated watershed scale model, the export coefficient model. Results from the analysis support the findings of other researchers developing macroscale models in allied research fields. Conclusions from the study confirm that reliable and accurate regional scale N flux modelling needs to take account of the heterogeneity of landscapes and the impact that this has on N cycling processes within homogeneous landscape units.
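
The export coefficient model named above lends itself to a compact illustration: total N export from a catchment is modelled as the sum of per-land-use export coefficients multiplied by the area under each use, and the aggregation experiments amount to lumping those areas and coefficients at coarser resolution. A minimal sketch, with land-use classes and coefficient values that are purely illustrative rather than taken from the paper:

```python
# Minimal export coefficient model: catchment N flux as the sum of
# land-use areas weighted by empirically derived export coefficients.
# Coefficient values (kg N / ha / yr) and class names are illustrative only.

export_coefficients = {
    "arable": 25.0,
    "grassland": 10.0,
    "woodland": 2.0,
    "urban": 5.0,
}

def n_flux(areas_ha, coefficients, point_sources_kg=0.0):
    """Total N export (kg/yr) = sum_i E_i * A_i + point-source inputs."""
    diffuse = sum(coefficients[use] * area for use, area in areas_ha.items())
    return diffuse + point_sources_kg

catchment = {"arable": 1200.0, "grassland": 800.0, "woodland": 500.0, "urban": 100.0}
print(n_flux(catchment, export_coefficients))  # 39500.0 kg N/yr
```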

Relevance: 20.00%

Abstract:

The goal of this research was to investigate the changes in neural processing in mild cognitive impairment. We measured phase synchrony, amplitudes, and event-related potentials in veridical and false memory to determine whether these differed in participants with mild cognitive impairment compared with typical, age-matched controls. Empirical mode decomposition phase locking analysis was used to assess synchrony, which is the first time this analysis technique has been applied in a complex cognitive task such as memory processing. The technique allowed assessment of changes in frontal and parietal cortex connectivity over time during a memory task, without a priori selection of frequency ranges, which has been shown previously to influence synchrony detection. Phase synchrony differed significantly in its timing and degree between participant groups in the theta and alpha frequency ranges. Timing differences suggested greater dependence on gist memory in the presence of mild cognitive impairment. The group with mild cognitive impairment had significantly more frontal theta phase locking than the controls in the absence of a significant behavioural difference in the task, providing new evidence for compensatory processing in the former group. Both groups showed greater frontal phase locking during false than true memory, suggesting increased searching when no actual memory trace was found. Significant inter-group differences in frontal alpha phase locking provided support for a role for lower and upper alpha oscillations in memory processing. Finally, fronto-parietal interaction was significantly reduced in the group with mild cognitive impairment, supporting the notion that mild cognitive impairment could represent an early stage in Alzheimer’s disease, which has been described as a ‘disconnection syndrome’.
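
The phase-synchrony measure described here can be sketched with a standard phase-locking value (PLV) computed from Hilbert phases; the empirical mode decomposition step, which extracts intrinsic mode functions without a priori frequency-band selection, is assumed to have been applied already, and the signals below are synthetic stand-ins for EEG channels:

```python
import numpy as np
from scipy.signal import hilbert

def phase_locking_value(x, y):
    """PLV = |mean(exp(i*(phi_x - phi_y)))|: 1 = perfect synchrony, 0 = none."""
    phi_x = np.angle(hilbert(x))
    phi_y = np.angle(hilbert(y))
    return np.abs(np.mean(np.exp(1j * (phi_x - phi_y))))

# Two noisy theta-band oscillations with a fixed phase lag.
t = np.arange(0, 2.0, 1 / 250)                    # 2 s sampled at 250 Hz
frontal = np.sin(2 * np.pi * 6 * t) + 0.3 * np.random.randn(t.size)
parietal = np.sin(2 * np.pi * 6 * t - 0.5) + 0.3 * np.random.randn(t.size)
print(phase_locking_value(frontal, parietal))     # close to 1 (stable lag)
```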

Relevance: 20.00%

Abstract:

Working memory (WM) is not a unitary construct. There are distinct processes involved in encoding information, maintaining it on-line, and using it to guide responses. The anatomical configurations of these processes are more accurately analyzed as functionally connected networks than collections of individual regions. In the current study we analyzed event-related functional magnetic resonance imaging (fMRI) data from a Sternberg Item Recognition Paradigm WM task using a multivariate analysis method that allowed the linking of functional networks to temporally-separated WM epochs. The length of the delay epochs was varied to optimize isolation of the hemodynamic response (HDR) for each task epoch. All extracted functional networks displayed statistically significant sensitivity to delay length. Novel information extracted from these networks that was not apparent in the univariate analysis of these data included involvement of the hippocampus in encoding/probe, and decreases in BOLD signal in the superior temporal gyrus (STG), along with default-mode regions, during encoding/delay. The bilateral hippocampal activity during encoding/delay fits with theoretical models of WM in which memoranda held across the short term are activated long-term memory representations. The BOLD signal decreases in the STG were unexpected, and may reflect repetition suppression effects invoked by internal repetition of letter stimuli. Thus, analysis methods focusing on how network dynamics relate to experimental conditions allowed extraction of novel information not apparent in univariate analyses, and are particularly recommended for WM experiments for which task epochs cannot be randomized.
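
The reason for varying the delay length can be illustrated numerically: with a fixed delay, the hemodynamic responses to encoding and probe overlap identically on every trial and their regressors are nearly collinear, whereas jittered delays decorrelate them. A minimal sketch using a conventional double-gamma HRF (the timings and HRF parameters are illustrative assumptions, not the study's design):

```python
import numpy as np
from scipy.stats import gamma

TR, n_scans = 1.0, 200
t_hrf = np.arange(0, 30, TR)
hrf = gamma.pdf(t_hrf, 6) - 0.35 * gamma.pdf(t_hrf, 12)  # canonical double-gamma

def regressor(onsets):
    """Stick function at the given onsets, convolved with the HRF."""
    stick = np.zeros(n_scans)
    stick[onsets] = 1.0
    return np.convolve(stick, hrf)[:n_scans]

rng = np.random.default_rng(0)
enc = np.arange(5, 180, 18)                        # encoding onsets (scans)
fixed = regressor(enc), regressor(enc + 6)         # constant 6-scan delay
varied = regressor(enc), regressor(enc + rng.integers(4, 12, enc.size))

# Jittering the delay typically lowers the correlation between the
# encoding and probe regressors, making their HDRs separable.
for name, (r1, r2) in [("fixed", fixed), ("varied", varied)]:
    print(name, np.corrcoef(r1, r2)[0, 1])
```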

Relevance: 20.00%

Abstract:

Historical events are interpreted by collectivities in ways that are then instrumentalised in policy-making processes. This creates mythical "truths" and "rules of conduct" which, in 20th- and 21st-century Western civilisations, are not much different from those of pre-Enlightenment societies.

Relevance: 20.00%

Abstract:

We introduce an algorithm (called REDFITmc2) for spectrum estimation in the presence of timescale errors. It is based on the Lomb-Scargle periodogram for unevenly spaced time series, in combination with Welch's Overlapped Segment Averaging procedure, bootstrap bias correction and persistence estimation. The timescale errors are modelled parametrically and included in the simulations used to determine (1) the upper levels of the spectrum of the red-noise AR(1) alternative and (2) the uncertainty of the frequency of a spectral peak. Application of REDFITmc2 to ice core and stalagmite records of palaeoclimate allowed a more realistic evaluation of spectral peaks than when this source of uncertainty is ignored. The results support qualitatively the intuition that the effects on the spectrum estimate (decreased detectability and increased frequency uncertainty) are stronger at higher frequencies; the added value of REDFITmc2 is that it quantifies those effects. Regarding timescale construction, not only the fixpoints, dating errors and the functional form of the age-depth model play a role; the joint distribution of all time points (serial correlation, stratigraphic order) also influences spectrum estimation.
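
The core spectral step, the Lomb-Scargle periodogram for unevenly spaced series, is available in SciPy and easy to sketch; the WOSA segmenting, bootstrap bias correction, AR(1) significance levels and, above all, the parametric timescale-error simulations that distinguish REDFITmc2 are omitted here:

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(1)
# Unevenly spaced synthetic "proxy" record: a 100 kyr cycle plus noise.
t = np.sort(rng.uniform(0, 800, 300))             # ages in kyr
y = np.sin(2 * np.pi * t / 100) + 0.5 * rng.standard_normal(t.size)
y -= y.mean()                                      # lombscargle assumes zero mean

freqs = np.linspace(0.001, 0.05, 500)              # cycles per kyr
pgram = lombscargle(t, y, 2 * np.pi * freqs)       # expects angular frequencies
print(freqs[np.argmax(pgram)])                     # ~0.01, the 100 kyr peak
```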

Relevance: 20.00%

Abstract:

The detection of long-range dependence in time series analysis is an important task to which this paper contributes by showing that, whilst the theoretical definition of a long-memory (or long-range dependent) process is based on the autocorrelation function, it is not possible for long memory to be identified using the sum of the sample autocorrelations, as usually defined. The reason is that the sample sum is a predetermined constant for any stationary time series, a result that is independent of the sample size. Diagnostic or estimation procedures, such as those in the frequency domain, that embed this sum are equally open to this criticism. We develop this result in the context of long memory, extending it to the implications for the spectral density function and the variance of partial sums of a stationary stochastic process. The results are further extended to higher order sample autocorrelations and the bispectral density. The corresponding result is that the sum of the third order sample (auto) bicorrelations at lags h,k≥1 is also a predetermined constant, different from that in the second order case, for any stationary time series of arbitrary length.
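
The predetermined constant in the second order case follows in a few lines from the definition of the sample autocovariance, because deviations from the sample mean sum to zero:

```latex
% Sample autocovariance and autocorrelation:
%   c_k = n^{-1} \sum_{t=1}^{n-k} (x_t - \bar{x})(x_{t+k} - \bar{x}),
%   r_k = c_k / c_0.
\begin{align*}
0 = \Bigl(\sum_{t=1}^{n} (x_t - \bar{x})\Bigr)^{2}
  &= \sum_{t=1}^{n} (x_t - \bar{x})^{2}
   + 2 \sum_{k=1}^{n-1} \sum_{t=1}^{n-k} (x_t - \bar{x})(x_{t+k} - \bar{x}) \\
  &= n c_0 + 2 n \sum_{k=1}^{n-1} c_k
  \quad\Longrightarrow\quad
  \sum_{k=1}^{n-1} r_k = -\frac{1}{2}.
\end{align*}
```

The sum is therefore -1/2 for every realisation and every sample size, so it carries no information about the presence or absence of long-range dependence.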

Relevance: 20.00%

Abstract:

In order to validate the reported precision of space‐based atmospheric composition measurements, validation studies often focus on measurements in the tropical stratosphere, where natural variability is weak. The scatter in tropical measurements can then be used as an upper limit on single‐profile measurement precision. Here we introduce a method of quantifying the scatter of tropical measurements which aims to minimize the effects of short‐term atmospheric variability while maintaining large enough sample sizes that the results can be taken as representative of the full data set. We apply this technique to measurements of O3, HNO3, CO, H2O, NO, NO2, N2O, CH4, CCl2F2, and CCl3F produced by the Atmospheric Chemistry Experiment–Fourier Transform Spectrometer (ACE‐FTS). Tropical scatter in the ACE‐FTS retrievals is found to be consistent with the reported random errors (RREs) for H2O and CO at altitudes above 20 km, validating the RREs for these measurements. Tropical scatter in measurements of NO, NO2, CCl2F2, and CCl3F is roughly consistent with the RREs as long as the effect of outliers in the data set is reduced through the use of robust statistics. The scatter in measurements of O3, HNO3, CH4, and N2O in the stratosphere, while larger than the RREs, is shown to be consistent with the variability simulated in the Canadian Middle Atmosphere Model. This result implies that, for these species, stratospheric measurement scatter is dominated by natural variability, not random error, which provides added confidence in the scientific value of single‐profile measurements.
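
The role played by robust statistics is easy to demonstrate: a few gross outliers inflate the ordinary standard deviation, while a MAD-based estimate of scatter stays near the underlying measurement precision. A minimal sketch with synthetic values (not ACE-FTS data):

```python
import numpy as np
from scipy.stats import median_abs_deviation

rng = np.random.default_rng(2)
# Tropical "measurements": true scatter 0.05 plus a few retrieval outliers.
values = rng.normal(1.0, 0.05, 500)
values[:5] = [2.0, 1.8, 0.1, 2.5, 0.2]               # gross outliers

print(values.std())                                   # inflated by outliers
print(median_abs_deviation(values, scale="normal"))   # ~0.05, robust estimate
```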

Relevance: 20.00%

Abstract:

SST errors in the tropical Atlantic are large and systematic in current coupled general-circulation models. We analyse the growth of these errors in the south-eastern tropical Atlantic in initialised decadal hindcast integrations for three of the models participating in the Coupled Model Intercomparison Project 5. A variety of causes for the initial bias development are identified but, in all cases considered, ocean-atmosphere coupling is found to be crucial for their maintenance. This involves an oceanic “bridge” between the Equator and the Benguela-Angola coastal seas which communicates sub-surface ocean anomalies and couples SSTs in the south-eastern tropical Atlantic to the winds over the Equator. The resulting coupling between SSTs, winds and precipitation represents a positive feedback for warm SST errors in the south-eastern tropical Atlantic.

Relevance: 20.00%

Abstract:

Considerable progress has taken place in numerical weather prediction over the last decade. It has been possible to extend predictive skill in the extra-tropics of the Northern Hemisphere during the winter from less than five days to seven days. Similar improvements, albeit at a lower level, have taken place in the Southern Hemisphere. Another example of improvement in the forecasts is the prediction of intense synoptic phenomena such as cyclogenesis, which on the whole is quite successful with the most advanced operational models (Bengtsson (1989), Gadd and Kruze (1988)). A careful examination shows that there is no single cause for the improvements in predictive skill; instead they are due to several different factors encompassing the forecasting system as a whole (Bengtsson, 1985). In this paper we focus our attention on the role of data assimilation and the effect it may have on reducing the initial error and hence improving the forecast. The first part of the paper contains a theoretical discussion of error growth in simple data assimilation systems, following Leith (1983). In the second part we apply the results to actual forecast data from ECMWF. The potential for further forecast improvements within the framework of the present observing system in the two hemispheres is also discussed.
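
The error-growth discussion in the first part can be caricatured as logistic growth of forecast error towards its saturation value, so that a smaller initial (analysis) error, the payoff of better data assimilation, buys additional forecast range. A minimal sketch; the growth rate, saturation level and initial errors are illustrative assumptions, not values from Leith (1983) or ECMWF:

```python
# Logistic error-growth caricature: dE/dt = alpha * E * (1 - E / E_inf).
alpha, E_inf = 0.35, 1.0      # growth rate (1/day) and saturation error
dt, days = 0.25, 10.0

def forecast_error(E0):
    """Integrate the error-growth equation from initial error E0 (Euler)."""
    E = E0
    for _ in range(int(days / dt)):
        E += dt * alpha * E * (1 - E / E_inf)
    return E

# Halving the analysis error delays saturation and extends useful range.
for E0 in (0.3, 0.15):
    print(E0, round(forecast_error(E0), 3))
```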

Relevance: 20.00%

Abstract:

Recalling information involves the process of discriminating between relevant and irrelevant information stored in memory. Not infrequently, the relevant information needs to be selected from amongst a series of related possibilities. This is likely to be particularly problematic when the irrelevant possibilities are not only temporally or contextually appropriate but also overlap semantically with the target or targets. Here, we investigate the extent to which purely perceptual features which discriminate between irrelevant and target material can be used to overcome the negative impact of contextual and semantic relatedness. Adopting a distraction paradigm, it is demonstrated that when distracters are interleaved with targets presented either visually (Experiment 1) or auditorily (Experiment 2), a within-modality semantic distraction effect occurs; semantically related distracters impact upon recall more than unrelated distracters. In the semantically related condition, the number of intrusions in recall is reduced, whilst the number of correctly recalled targets is simultaneously increased, by the presence of perceptual cues to relevance (color features in Experiment 1 or speaker’s gender in Experiment 2). However, as demonstrated in Experiment 3, even presenting semantically related distracters in a language and a sensory modality (spoken Welsh) distinct from those of the targets (visual English) is insufficient to eliminate false recalls completely, or to restore correct recall to the levels seen with unrelated distracters. Together, the study shows how semantic and non-semantic discriminability shape patterns of both erroneous and correct recall.

Relevance: 20.00%

Abstract:

Objective: To undertake a process evaluation of pharmacists' recommendations arising in the context of a complex IT-enabled pharmacist-delivered randomised controlled trial (PINCER trial) to reduce the risk of hazardous medicines management in general practices.

Methods: PINCER pharmacists manually recorded patients’ demographics, details of interventions recommended, actions undertaken by practice staff and time taken to manage individual cases of hazardous medicines management. Data were coded and double entered into SPSS v15, and then summarised using percentages for categorical data (with 95% CI) and, as appropriate, means (SD) or medians (IQR) for continuous data.

Key findings: Pharmacists spent a median of 20 minutes (IQR 10, 30) reviewing medical records, recommending interventions and completing actions in each case of hazardous medicines management. Pharmacists judged 72% (95% CI 70, 74) (1463/2026) of cases of hazardous medicines management to be clinically relevant. Pharmacists recommended 2105 interventions in 74% (95% CI 73, 76) (1516/2038) of cases and 1685 actions were taken in 61% (95% CI 59, 63) (1246/2038) of cases; 66% (95% CI 64, 68) (1383/2105) of interventions recommended by pharmacists were completed and 5% (95% CI 4, 6) (104/2105) of recommendations were accepted by general practitioners (GPs), but not completed at the end of the pharmacists’ placement; the remaining recommendations were rejected or considered not relevant by GPs.

Conclusions: The outcome measures were used to target pharmacist activity in general practice towards patients at risk from hazardous medicines management. Recommendations from trained PINCER pharmacists were found to be broadly acceptable to GPs and led to ameliorative action in the majority of cases. It seems likely that the approach used by the PINCER pharmacists could be employed by other practice pharmacists following appropriate training.
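
The confidence intervals quoted in the key findings follow from the usual normal approximation for a proportion; a minimal sketch that reproduces the first of them (72%, 95% CI 70 to 74, from 1463/2026):

```python
import math

def proportion_ci(successes, n, z=1.96):
    """Normal-approximation 95% CI for a proportion."""
    p = successes / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p - half, p + half

lo, hi = proportion_ci(1463, 2026)
print(f"{1463/2026:.0%} (95% CI {lo:.0%}, {hi:.0%})")  # 72% (95% CI 70%, 74%)
```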

Relevance: 20.00%

Abstract:

Objective: To determine the prevalence and nature of prescribing and monitoring errors in general practices in England.

Design: Retrospective case note review of unique medication items prescribed over a 12 month period to a 2% random sample of patients. Mixed effects logistic regression was used to analyse the data.

Setting: Fifteen general practices across three primary care trusts in England.

Data sources: Examination of 6048 unique prescription items prescribed over the previous 12 months for 1777 patients.

Main outcome measures: Prevalence of prescribing and monitoring errors, and severity of errors, using validated definitions.

Results: Prescribing and/or monitoring errors were detected in 4.9% (296/6048) of all prescription items (95% confidence interval 4.4% to 5.5%). The vast majority of errors were of mild to moderate severity, with 0.2% (11/6048) of items having a severe error. After adjusting for covariates, patient-related factors associated with an increased risk of prescribing and/or monitoring errors were: age less than 15 years (odds ratio (OR) 1.87, 1.19 to 2.94, p=0.006) or greater than 64 years (OR 1.68, 1.04 to 2.73, p=0.035), and a higher number of unique medication items prescribed (OR 1.16, 1.12 to 1.19, p<0.001).

Conclusion: Prescribing and monitoring errors are common in English general practice, although severe errors are unusual. Many factors increase the risk of error. Having identified the most common and important errors, and the factors associated with these, strategies to prevent future errors should be developed based on the study findings.
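
Odds ratios such as those in the results are the exponentiated coefficients of the logistic regression; a minimal sketch of the back-transformation, with a coefficient and standard error chosen to reproduce the reported OR of 1.16 (1.12 to 1.19) rather than taken from the fitted model:

```python
import math

def odds_ratio(beta, se, z=1.96):
    """OR and 95% CI from a logistic regression coefficient and its SE."""
    return math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se)

# e.g. effect of each additional unique medication item (illustrative values)
or_, lo, hi = odds_ratio(beta=0.145, se=0.0155)
print(f"OR {or_:.2f}, {lo:.2f} to {hi:.2f}")   # OR 1.16, 1.12 to 1.19
```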

Relevance: 20.00%

Abstract:

Remote sensing observations often have correlated errors, but the correlations are typically ignored in data assimilation for numerical weather prediction. The assumption of zero correlations is often used with data thinning methods, resulting in a loss of information. As operational centres move towards higher-resolution forecasting, there is a requirement to retain data providing detail on appropriate scales. Thus an alternative approach to dealing with observation error correlations is needed. In this article, we consider several approaches to approximating observation error correlation matrices: diagonal approximations, eigendecomposition approximations and Markov matrices. These approximations are applied in incremental variational assimilation experiments with a 1-D shallow water model using synthetic observations. Our experiments quantify analysis accuracy in comparison with a reference or ‘truth’ trajectory, as well as with analyses using the ‘true’ observation error covariance matrix. We show that it is often better to include an approximate correlation structure in the observation error covariance matrix than to incorrectly assume error independence. Furthermore, by choosing a suitable matrix approximation, it is feasible and computationally cheap to include error correlation structure in a variational data assimilation algorithm.
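
Among the approximations listed, the Markov matrix is especially convenient: correlations decay exponentially with observation separation, and the resulting matrix has a cheap, tridiagonal inverse. A minimal sketch of a Markov correlation matrix alongside a truncated-eigendecomposition approximation (the matrix size, decay parameter and number of retained modes are illustrative choices):

```python
import numpy as np

def markov_corr(n, rho):
    """Markov correlation matrix R_ij = rho^|i - j| (exponential decay)."""
    idx = np.arange(n)
    return rho ** np.abs(idx[:, None] - idx[None, :])

R = markov_corr(50, 0.7)

# Eigendecomposition approximation: keep the k leading modes, then
# restore a unit diagonal so the result is still a correlation matrix.
w, V = np.linalg.eigh(R)                  # eigenvalues in ascending order
k = 10
R_k = (V[:, -k:] * w[-k:]) @ V[:, -k:].T
R_k /= np.sqrt(np.outer(np.diag(R_k), np.diag(R_k)))

print(np.abs(R - R_k).max())              # truncated-eigendecomposition error
print(np.abs(R - np.eye(50)).max())       # error of the diagonal approximation
```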

Relevance: 20.00%

Abstract:

Bottom-up processes can interrupt ongoing cognitive processing in order to adaptively respond to emotional stimuli of high potential significance, such as those that threaten wellbeing. However, it is vital that this interference can be modulated in certain contexts to focus on current tasks. Deficits in the ability to maintain the appropriate balance between cognitive and emotional demands can severely impact on day-to-day activities. This fMRI study examined the interaction between threat processing and cognition: 18 adult participants performed a visuospatial working memory (WM) task with two load conditions, in the presence and absence of anxiety induction by threat of electric shock. Threat of shock interfered with performance in the low cognitive load condition; however, interference was eradicated under high load, consistent with engagement of emotion regulation mechanisms. Under low load the amygdala showed significant activation to threat of shock, and this activation was modulated by high cognitive load. A directed top-down control contrast identified two regions associated with top-down control: ventrolateral PFC and dorsal ACC. Dynamic causal modeling provided further evidence that under high cognitive load, top-down inhibition is exerted on the amygdala and its outputs to prefrontal regions. Additionally, we hypothesized that individual differences in a separate, non-emotional top-down control task would predict the recruitment of dorsal ACC and ventrolateral PFC during top-down control of threat. Consistent with this, performance on a separate dichotic listening task predicted dorsal ACC and ventrolateral PFC activation during high WM load under threat of shock, though activation in these regions did not directly correlate with WM performance. Together, the findings suggest that under high cognitive load and threat, top-down control is exerted by dACC and vlPFC to inhibit threat processing, thus enabling WM performance without threat-related interference.