157 results for Todd, Doug
Abstract:
In the mid-1990s the subpolar gyre of the North Atlantic underwent a remarkably rapid warming, with sea surface temperatures increasing by around 1 °C in just 2 years. This rapid warming followed a prolonged positive phase of the North Atlantic Oscillation (NAO), but also coincided with an unusually negative NAO index in the winter of 1995/96. By comparing ocean analyses and carefully designed model experiments, we show that this rapid warming can be understood as a delayed response to the prolonged positive phase of the NAO, and not simply an instantaneous response to the negative NAO index of 1995/96. Furthermore, we infer that the warming was partly caused by a surge, and subsequent decline, in the Meridional Overturning Circulation and northward heat transport of the Atlantic Ocean. Our results provide persuasive evidence of significant oceanic memory on multi-annual timescales, and are therefore encouraging for the prospects of developing skillful predictions.
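The distinction between a delayed and an instantaneous response can be illustrated with a lagged correlation between an NAO index and subpolar-gyre SST anomalies. The sketch below uses synthetic placeholder series (not the ocean analyses from the paper) and assumes, purely for illustration, that SST integrates the NAO over the preceding five winters.

```python
# A minimal sketch: lagged correlation between a winter NAO index and
# subpolar-gyre SST anomalies. Both series are hypothetical stand-ins.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1960, 2000)
nao = rng.standard_normal(years.size)  # placeholder NAO index
# Assumption for illustration: SST integrates the NAO over ~5 prior winters
sst = (np.convolve(nao, np.ones(5) / 5, mode="full")[: years.size]
       + 0.3 * rng.standard_normal(years.size))

def lagged_corr(x, y, lag):
    """Correlate x(t) with y(t + lag); positive lag means y responds later."""
    if lag == 0:
        return np.corrcoef(x, y)[0, 1]
    return np.corrcoef(x[:-lag], y[lag:])[0, 1]

for lag in range(6):
    print(f"lag {lag} yr: r = {lagged_corr(nao, sst, lag):+.2f}")
```

In such a setup the correlation peaks at a positive lag rather than at lag zero, which is the signature of delayed, integrated forcing.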
Abstract:
In recent years, there has been a drive to save development costs and shorten the time-to-market of new therapies. Research into novel trial designs to facilitate this goal has led to, amongst other approaches, the development of methodology for seamless phase II/III designs. Such designs allow treatment or dose selection at an interim analysis and comparative evaluation of efficacy with control in the same study. These methods have gained much attention because of their potential advantages compared to conventional drug development programmes with separate trials for individual phases. In this article, we review the various approaches to seamless phase II/III designs based upon the group-sequential approach, the combination test approach and the adaptive Dunnett method. The objective of this article is to describe the approaches in a unified framework and to highlight their similarities and differences, so that a trialist considering such a trial can choose an appropriate methodology.
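To make the combination test approach concrete, here is a minimal sketch of the standard inverse normal combination of stage-wise p-values. The weights and p-values are illustrative, not taken from the article.

```python
# Inverse normal combination test: stage-wise p-values p1 and p2 are
# combined with pre-specified weights w1, w2 (w1 + w2 = 1).
from scipy.stats import norm

def inverse_normal_combination(p1, p2, w1=0.5, w2=0.5):
    """Combine independent one-sided stage-wise p-values."""
    z = (w1 ** 0.5) * norm.isf(p1) + (w2 ** 0.5) * norm.isf(p2)
    return norm.sf(z)  # combined one-sided p-value

print(inverse_normal_combination(0.04, 0.03))  # approx. 0.005
```

Because the weights are fixed in advance, the type I error rate is controlled even if stage 2 is redesigned using unblinded stage 1 data, which is what makes the approach attractive for seamless designs.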
Abstract:
Innovation is easier to describe than it is to systematically analyse, and easier to analyse than it is to effectively promote. Part of the problem, of course, is the imprecise way in which the activity of innovation itself is conceptualised. To achieve more precision, the logic of analysis suggests that innovation should be systematically analysed and then divided into rough categories to produce a working taxonomy based on a number of key dimensions. A major part of the purpose of this paper is to develop such a working taxonomy.
Abstract:
Constrained principal component analysis (CPCA) with a finite impulse response (FIR) basis set was used to reveal functionally connected networks and their temporal progression over a multistage verbal working memory trial in which memory load was varied. Four components were extracted, and all showed statistically significant sensitivity to the memory load manipulation. Additionally, two of the four components (Components 1 and 4) sustained their peak activity for approximately 3 s. The functional networks that showed sustained activity were characterized by increased activations in the dorsal anterior cingulate cortex, right dorsolateral prefrontal cortex, and left supramarginal gyrus, and decreased activations in the primary auditory cortex and "default network" regions. The functional networks that did not show sustained activity were instead dominated by increased activation in occipital cortex, dorsal anterior cingulate cortex, sensorimotor cortical regions, and superior parietal cortex. The response shapes suggest that although all four components appear to be invoked at encoding, the two sustained-peak components are likely to be additionally involved in the delay period. Our investigation provides a unique view of the contributions made by a network of brain regions over the course of a multiple-stage working memory trial.
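The core of CPCA with an FIR basis can be sketched in a few lines: the data are first regressed onto an FIR design matrix, so that only variance predictable from trial timing is retained, and PCA is then applied to the predicted values. The sketch below uses hypothetical dimensions and random data, and simplifies details of the published method.

```python
# A minimal CPCA/FIR sketch under assumed dimensions; not the paper's code.
import numpy as np

rng = np.random.default_rng(1)
n_scans, n_voxels, n_trials, fir_len = 200, 500, 10, 8

Z = rng.standard_normal((n_scans, n_voxels))  # placeholder BOLD data

# FIR design matrix G: one indicator column per post-onset time point
onsets = np.arange(0, n_scans - fir_len, n_scans // n_trials)[:n_trials]
G = np.zeros((n_scans, fir_len))
for t in onsets:
    for j in range(fir_len):
        G[t + j, j] = 1.0

# Constrain: keep only the part of Z predictable from G (least squares)
B, *_ = np.linalg.lstsq(G, Z, rcond=None)
GC = G @ B

# PCA (via SVD) on the constrained data; B @ V gives each component's
# estimated FIR response shape over peristimulus time
U, s, Vt = np.linalg.svd(GC, full_matrices=False)
fir_shapes = B @ Vt.T[:, :4]  # first 4 components
print(fir_shapes.shape)       # (fir_len, 4)
```

The component response shapes (rows of peristimulus time) are what allow statements such as "peak activity sustained for approximately 3 s".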
Abstract:
Current methods for estimating event-related potentials (ERPs) assume stationarity of the signal. Empirical Mode Decomposition (EMD) is a data-driven decomposition technique that does not assume stationarity. We evaluated an EMD-based method for estimating the ERP. On simulated data, EMD substantially reduced background EEG while retaining the ERP. EMD-denoised single trials also estimated the shape, amplitude, and latency of the ERP better than raw single trials. On experimental data, EMD-denoised trials revealed event-related differences between two conditions (conditions A and B) more effectively than trials low-pass filtered at 40 Hz. EMD also revealed event-related differences within both condition A and condition B that were clearer and of longer duration than those revealed by low-pass filtering at 40 Hz. Thus, EMD-based denoising is a promising data-driven, nonstationary method for estimating ERPs and should be investigated further.
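A minimal sketch of EMD-based single-trial denoising follows, using the PyEMD package (installed as EMD-signal). The choice to discard the first (highest-frequency) IMFs as background EEG is an assumption for illustration; the abstract does not specify how IMFs were selected, and any real analysis would tune this against data.

```python
# Sketch: decompose each trial into IMFs, drop the highest-frequency ones
# (assumed to be mostly background EEG), reconstruct, then average.
import numpy as np
from PyEMD import EMD

def emd_denoise(trial, n_drop=2):
    """Remove the first n_drop (highest-frequency) IMFs from one trial."""
    imfs = EMD().emd(trial)
    if imfs.shape[0] <= n_drop:
        return trial  # too few IMFs to safely drop any
    return imfs[n_drop:].sum(axis=0)

# Hypothetical usage: trials is an (n_trials, n_samples) array of epochs
rng = np.random.default_rng(2)
trials = rng.standard_normal((20, 512))
denoised = np.array([emd_denoise(tr) for tr in trials])
erp = denoised.mean(axis=0)  # ERP estimate from denoised single trials
```

Unlike a fixed 40 Hz low-pass filter, the IMFs adapt to each trial's local frequency content, which is what "data-driven and nonstationary" refers to.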
Abstract:
This study investigates the two later-acquired but proficient languages, English and Hindi, of two multilingual individuals with transcortical aphasia (right basal ganglia lesion in GN and brain stem lesion in GS). A dissociation between lexical and syntactic profiles was observed in both languages: performance was uniform across the languages at the lexical level but uneven across the languages at the syntactic level. Their performances are discussed in relation to implicit/explicit language processes (Paradis, 1994 and Paradis, 2004) and the declarative/procedural model (Ullman, 2001b and Ullman, 2005) of bilingual language processing. Additionally, their syntactic performance is interpreted in relation to the salient grammatical contrasts between English and Hindi.
Abstract:
This paper presents practical approaches to the problem of sample size re-estimation in clinical trials with survival data when proportional hazards can be assumed. When data on a full range of survival experiences across the recruited patients are readily available at the time of the review, it is shown that, as expected, performing a blinded re-estimation procedure is straightforward and can help to maintain the trial's pre-specified error rates. Two alternative methods for dealing with the situation where limited survival experiences are available at the time of the sample size review are then presented and compared. In this instance, extrapolation is required in order to undertake the sample size re-estimation. Worked examples, together with results from a simulation study, are described. It is concluded that, as in the standard case, use of either extrapolation approach successfully protects the trial error rates.
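The quantity a sample size review under proportional hazards re-derives is the required number of events. A minimal sketch of the standard calculation (Schoenfeld's formula, 1:1 allocation) is below; the hazard ratio and error rates are illustrative only, and the paper's extrapolation methods are not reproduced here.

```python
# Events needed for a two-sided level-alpha log-rank test, 1:1 allocation.
from math import ceil, log
from scipy.stats import norm

def required_events(hazard_ratio, alpha=0.05, power=0.9):
    """Schoenfeld's formula for the required number of events."""
    z_a = norm.isf(alpha / 2)   # two-sided alpha
    z_b = norm.isf(1 - power)   # power
    return ceil(4 * (z_a + z_b) ** 2 / log(hazard_ratio) ** 2)

print(required_events(0.75))  # 508 events for HR = 0.75, 90% power
```

A blinded review updates the nuisance inputs (e.g., the pooled event rate governing how quickly those events accrue) without unblinding the treatment effect, which is why the pre-specified error rates are preserved.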
Abstract:
Working memory (WM) is not a unitary construct. There are distinct processes involved in encoding information, maintaining it on-line, and using it to guide responses. The anatomical configurations of these processes are more accurately analyzed as functionally connected networks than as collections of individual regions. In the current study we analyzed event-related functional magnetic resonance imaging (fMRI) data from a Sternberg Item Recognition Paradigm WM task using a multivariate analysis method that allowed the linking of functional networks to temporally separated WM epochs. The length of the delay epochs was varied to optimize isolation of the hemodynamic response (HDR) for each task epoch. All extracted functional networks displayed statistically significant sensitivity to delay length. Novel information extracted from these networks that was not apparent in the univariate analysis of these data included involvement of the hippocampus in encoding/probe, and decreases in BOLD signal in the superior temporal gyrus (STG), along with default-mode regions, during encoding/delay. The bilateral hippocampal activity during encoding/delay fits with theoretical models of WM in which memoranda held across the short term are activated long-term memory representations. The BOLD signal decreases in the STG were unexpected, and may reflect repetition suppression effects invoked by internal repetition of letter stimuli. Thus, analysis methods focusing on how network dynamics relate to experimental conditions allowed extraction of novel information not apparent in univariate analyses, and are particularly recommended for WM experiments for which task epochs cannot be randomized.
Abstract:
We present the first climate prediction of the coming decade made with multiple models, initialized with prior observations. This prediction accrues from an international activity to exchange decadal predictions in near real-time, in order to assess differences and similarities, provide a consensus view to prevent over-confidence in forecasts from any single model, and establish current collective capability. We stress that the forecast is experimental, since the skill of the multi-model system is as yet unknown. Nevertheless, the forecast systems used here are based on models that have undergone rigorous evaluation and individually have been evaluated for forecast skill. Moreover, it is important to publish forecasts to enable open evaluation, and to provide a focus on climate change in the coming decade. Initialized forecasts of the year 2011 agree well with observations, with a pattern correlation of 0.62 compared to 0.31 for uninitialized projections. In particular, the forecast correctly predicted La Niña in the Pacific, and warm conditions in the north Atlantic and USA. A similar pattern is predicted for 2012 but with a weaker La Niña. Indices of Atlantic multi-decadal variability and Pacific decadal variability show no signal beyond climatology after 2015, while temperature in the Niño3 region is predicted to warm slightly by about 0.5 °C over the coming decade. However, uncertainties are large for individual years and initialization has little impact beyond the first 4 years in most regions. Relative to uninitialized forecasts, initialized forecasts are significantly warmer in the north Atlantic sub-polar gyre and cooler in the north Pacific throughout the decade. They are also significantly cooler in the global average and over most land and ocean regions out to several years ahead. However, in the absence of volcanic eruptions, global temperature is predicted to continue to rise, with each year from 2013 onwards having a 50 % chance of exceeding the current observed record. Verification of these forecasts will provide an important opportunity to test the performance of models and our understanding and knowledge of the drivers of climate change.
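The skill scores quoted above (0.62 vs 0.31) are pattern correlations between forecast and observed maps. A minimal sketch of the centered, area-weighted version of this metric is below; the grids are hypothetical placeholders, not the forecast data.

```python
# Centered pattern correlation with cos(latitude) area weights.
import numpy as np

def pattern_correlation(forecast, obs, lats):
    """Area-weighted centered correlation between two lat-lon fields."""
    w = np.cos(np.deg2rad(lats))[:, None] * np.ones_like(forecast)
    f = forecast - np.average(forecast, weights=w)
    o = obs - np.average(obs, weights=w)
    cov = np.average(f * o, weights=w)
    return cov / np.sqrt(np.average(f ** 2, weights=w)
                         * np.average(o ** 2, weights=w))

lats = np.linspace(-87.5, 87.5, 36)         # hypothetical 5-degree grid
rng = np.random.default_rng(3)
obs = rng.standard_normal((36, 72))
fcst = obs + rng.standard_normal((36, 72))  # partially skilful forecast
print(round(pattern_correlation(fcst, obs, lats), 2))
```

The cosine weighting prevents the many small grid cells near the poles from dominating the score.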
Abstract:
PURPOSE: Multi-species probiotic preparations have been suggested as having a wide spectrum of application, although few studies have compared their efficacy with that of individual component strains at equal concentrations. We therefore tested the ability of 4 single probiotics and 4 probiotic mixtures to inhibit the urinary tract pathogens Escherichia coli NCTC 9001 and Enterococcus faecalis NCTC 00775. METHODS: We used an agar spot test to assess the ability of viable cells to inhibit pathogens, while a broth inhibition assay was used to assess inhibition by cell-free probiotic supernatants in both pH-neutralised and non-neutralised forms. RESULTS: In the agar spot test, all probiotic treatments showed inhibition: L. acidophilus was the most inhibitory single strain against E. faecalis, and L. fermentum the most inhibitory against E. coli. A commercially available mixture of 14 strains (Bio-Kult®) was the most effective mixture against E. faecalis, and the 3-lactobacillus mixture the most inhibitory against E. coli. Mixtures were not significantly more inhibitory than single strains. In the broth inhibition assays, all probiotic supernatants inhibited both pathogens when pH was not controlled, with only 2 treatments causing inhibition at a neutral pH. CONCLUSIONS: Both viable cells of probiotics and supernatants of probiotic cultures were able to inhibit the growth of two urinary tract pathogens. Probiotic mixtures prevented the growth of urinary tract pathogens but were not significantly more inhibitory than single strains. Probiotics appear to produce metabolites that are inhibitory towards urinary tract pathogens. Probiotics display potential to reduce the incidence of urinary tract infections via inhibition of colonisation.
Abstract:
Seamless phase II/III clinical trials combine traditional phases II and III into a single trial that is conducted in two stages, with stage 1 used to answer phase II objectives such as treatment selection and stage 2 used for the confirmatory analysis, which is a phase III objective. Although seamless phase II/III clinical trials are efficient because the confirmatory analysis includes phase II data from stage 1, inference can pose statistical challenges. In this paper, we consider point estimation following seamless phase II/III clinical trials in which stage 1 is used to select the most effective experimental treatment and to decide if, compared with a control, the trial should stop at stage 1 for futility. If the trial is not stopped, then the phase III confirmatory part of the trial involves evaluation of the selected most effective experimental treatment and the control. We have developed two new estimators for the treatment difference between these two treatments with the aim of reducing bias conditional on the treatment selection made and on the fact that the trial continues to stage 2. We have demonstrated the properties of these estimators using simulations.
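The selection bias these estimators target is easy to demonstrate by simulation. The sketch below is a deliberately simplified setup (control mean treated as a known zero, arbitrary arm counts and sample sizes), not the estimators from the paper: selecting the best-looking of several arms at stage 1 makes the naive pooled estimate biased upwards even when no arm has any true effect.

```python
# Simulate conditional bias of the naive estimator after treatment selection.
import numpy as np

rng = np.random.default_rng(4)
k, n1, n2, true_effect = 3, 50, 100, 0.0  # all arms truly equal to control
n_sims = 20000
naive = []
for _ in range(n_sims):
    stage1 = rng.normal(true_effect, 1, size=(k, n1)).mean(axis=1)
    best = stage1.argmax()                 # treatment selection at stage 1
    stage2 = rng.normal(true_effect, 1, size=n2).mean()
    # naive estimate pools both stages for the selected arm
    naive.append((n1 * stage1[best] + n2 * stage2) / (n1 + n2))

print(f"mean naive estimate: {np.mean(naive):+.3f} (true effect 0.000)")
```

The stage 1 contribution is inflated by the argmax selection, while the stage 2 contribution is unbiased; bias-adjusted estimators work by correcting for exactly this conditioning.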
Abstract:
Contemporary artists exploring Jewish identity in the UK are caught between two exclusions, broadly speaking: an art community that sees itself as ‘post-identity’ and a ‘black’ art scene that revolves around the organizations that emerged out of the identity debates of the 1980s and 1990s, namely Iniva, Third Text and Autograph. These organizations and those debates don’t usually include Jewish identity within their remit, as Jewish artists are considered to be well represented in the British art scene and, in any case, white. Out of these assumptions, questions arise in relation to the position of Jews in Britain and what is at stake for an artist in exploring Jewish identity in their work. There is considerable scholarship, relatively speaking, on art and Jewish identity in the US (such as Lisa Bloom, Norman Kleeblatt and Catherine Sousslouf), which informs the debates on visual culture and Jews. In this chapter, I will be drawing out some of the distinctions between the US and the UK debates within my analysis, building on my own writing over the last ten years as well as the work of Juliet Steyn, Jon Stratton and Griselda Pollock. In short, this chapter aims to explore the problematic of what Jewish identity can offer the viewer as art, what place such art inhabits within a wider artistic context, and how, if at all, it is received. There is a predominance of lens-based work that explores identity arising out of the provenance of feminist practices and the politics of documentary, which will be important in the framing of the work. I do not aim to consider what constitutes a Jewish artist; that has been done elsewhere and is an inadequate and somewhat spurious conversation. Nor will I focus on artists whose intention is to celebrate an unproblematised Jewishness (however that is constituted in any given work). Recent artworks and scholarship have in any case rendered the trumpeting of attachment to any singular identity anachronistic at best. I will focus on artists working in the UK who incorporate questions of Jewishness into a larger visual enquiry, building on Judith Butler’s notion of identity as process or performative, as well as the more recent debates and artwork that consider the intersectionality of identifications that co-constitute provisional identities (Jones, Modood, Sara Ahmed, Braidotti/Nikki S Lee, Glenn Ligon). The case studies used to think through these questions of identity will be artworks by Susan Hiller, Doug Fishbone and Suzanne Treister. In thinking through works by these artists, I will also contextualise them, situating them briefly within the history of the landmark UK exhibition Rubies and Rebels and the work of Ruth Novaczek, Lily Markewitz, Oreet Ashery and myself.
Abstract:
Twenty-five monolingual (L1) children with Specific Language Impairment (SLI), 32 sequential bilingual (L2) children, and 29 L1 controls completed the Test of Active & Passive Sentences-Revised (van der Lely, 1996) and a self-paced listening task with picture verification for actives and passives (Marinis, 2007). Both tasks revealed important between-group differences. The children with SLI showed difficulties with both actives and passives when they had to reanalyse thematic roles on-line. Their error pattern provided evidence for working memory limitations. The L2 children showed difficulties only with passives, both on-line and off-line. We suggest that these difficulties relate to the complex syntactic algorithm in passives and reflect an earlier developmental stage due to reduced exposure to the L2. The results are discussed in relation to theories of SLI and can be best accommodated within accounts proposing that difficulties in the comprehension of passives stem from processing limitations.