Dual-processes in learning and judgment: Evidence from the multiple cue probability learning paradigm
Abstract:
Multiple cue probability learning (MCPL) involves learning to predict a criterion based on a set of novel cues when feedback is provided in response to each judgment made. But to what extent does MCPL require controlled attention and explicit hypothesis testing? The results of two experiments show that this depends on cue polarity. Learning about cues that predict positively is aided by automatic cognitive processes, whereas learning about cues that predict negatively is especially demanding on controlled attention and hypothesis-testing processes. In the studies reported here, negative, but not positive, cue learning was related to individual differences in working memory capacity, both in measures of overall judgment performance and in modelling of the implicit learning process. However, the introduction of a novel method to monitor participants' explicit beliefs about a set of cues on a trial-by-trial basis revealed that participants were engaged in explicit hypothesis testing about positive and negative cues, and explicit beliefs about both types of cues were linked to working memory capacity. Taken together, our results indicate that while people are engaged in explicit hypothesis testing during cue learning, explicit beliefs are applied to judgment only when cues are negative. © 2012 Elsevier Inc.
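For readers unfamiliar with the task structure, the Python sketch below illustrates a generic MCPL trial loop: a criterion is generated as a noisy weighted sum of positive and negative cues, and a simple delta-rule learner updates its cue weights from outcome feedback. The weights, noise level, and learning rule are illustrative assumptions, not the design or model reported in the study.

```python
# Illustrative MCPL trial loop (hypothetical parameters, not the study's design).
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_cues = 200, 4
true_weights = np.array([0.6, 0.3, -0.4, -0.2])  # mix of positive and negative cues

est_weights = np.zeros(n_cues)  # learner's evolving cue weights
learning_rate = 0.05

for _ in range(n_trials):
    cues = rng.normal(size=n_cues)
    criterion = true_weights @ cues + rng.normal(scale=0.5)  # probabilistic outcome
    judgment = est_weights @ cues                            # prediction before feedback
    error = criterion - judgment                             # outcome feedback
    est_weights += learning_rate * error * cues              # delta-rule update

print("Recovered cue weights:", np.round(est_weights, 2))
```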
Abstract:
The biased agonism of G protein-coupled receptors (GPCRs), in which, in addition to the traditional G protein-signalling pathway, a GPCR promotes intracellular signals through β-arrestin, is a novel paradigm in pharmacology. Biochemical and biophysical studies have suggested that a GPCR forms a distinct ensemble of conformations signalling through the G protein and β-arrestin. Here we report on the dynamics of the β2 adrenergic receptor bound to β-arrestin- and G protein-biased agonists, and of the empty receptor, to further characterize the receptor conformational changes caused by biased agonists. We use conventional and accelerated molecular dynamics (aMD) simulations to explore the conformational transitions of the GPCR from the active state to the inactive state. We found that aMD simulations enable monitoring of the transition within the nanosecond timescale while capturing the known microscopic characteristics of the inactive states, such as the ionic lock, the inward position of F6.44, and water clusters. Distinct conformational states are shown to be stabilized by each biased agonist. In particular, in simulations of the receptor with the β-arrestin-biased agonist N-cyclopentylbutanepherine, we observe a different pattern of motions in helix 7 compared with simulations with the G protein-biased agonist Salbutamol, a pattern that involves perturbations of the network of interactions within the NPxxY motif. Understanding the network of interactions induced by biased ligands and the subsequent receptor conformational shifts will lead to the development of more efficient drugs. © 2013 American Chemical Society
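As an illustration of how such trajectories are typically inspected, the sketch below tracks the TM3–TM6 "ionic lock" distance frame by frame with the MDAnalysis package. The input file names and the residue/atom selections (R131/E268, i.e. positions 3.50/6.30 of the β2 adrenergic receptor) are assumptions for illustration, not the study's actual simulation setup.

```python
# Sketch: ionic-lock (TM3-TM6) distance along an MD trajectory with MDAnalysis.
# File names and atom selections are assumptions for illustration only.
import MDAnalysis as mda
from MDAnalysis.analysis import distances

u = mda.Universe("b2ar.psf", "trajectory.dcd")  # hypothetical topology/trajectory
arg = u.select_atoms("resid 131 and name CZ")   # R3.50 guanidinium carbon
glu = u.select_atoms("resid 268 and name CD")   # E6.30 carboxylate carbon

lock_distances = []
for ts in u.trajectory:
    d = distances.distance_array(arg.positions, glu.positions)[0, 0]
    lock_distances.append(d)  # short distances are a hallmark of inactive-like states

print(f"Mean ionic-lock distance: {sum(lock_distances) / len(lock_distances):.2f} Å")
```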
Abstract:
At the core of the sense of agency for self-produced action is the sense that I, and not some other agent, am producing and directing those actions. While there is an ever-expanding body of empirical research investigating the sense of agency for bodily action, there has, to date, been little empirical investigation of the sense of agency for thought. The present study uses the novel Mind-to-Mind paradigm, in which the agentive source of a target thought is ambiguous, to measure misattributions of agency. Seventy-two percent of participants made at least one misattribution of agency during a 5-min trial. Misattributions were significantly more frequent when the target thought was an arousing negative thought as compared to a neutral control. The findings establish a novel protocol for measuring the sense of agency for thought, and suggest that both contextual factors and emotional experience play a role in its generation.
Abstract:
We present three natural language marking strategies based on fast and reliable shallow parsing techniques, and on widely available lexical resources: lexical substitution, adjective conjunction swaps, and relativiser switching. We test these techniques on a random sample of the British National Corpus. Individual candidate marks are checked for goodness of structural and semantic fit, using both lexical resources and the web as a corpus. A representative sample of marks is given to 25 human judges to evaluate for acceptability and preservation of meaning. This establishes a correlation between corpus-based felicity measures and perceived quality, and makes qualified predictions. Grammatical acceptability correlates strongly with our automatic measure (Pearson's r = 0.795, p = 0.001), allowing us to account for about two thirds of the variability in human judgements. A moderate but statistically non-significant correlation (Pearson's r = 0.422, p = 0.356) is found with judgements of meaning preservation, indicating that the contextual window of five content words used for our automatic measure may need to be extended. © 2007 SPIE-IS&T.
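As a quick illustration of the statistic being reported, the snippet below computes Pearson's r with SciPy on invented scores standing in for the corpus-based felicity measure and the human acceptability ratings (the paper's data are not reproduced here); note that r = 0.795 corresponds to r² ≈ 0.63, i.e. roughly two thirds of the variance.

```python
# Pearson correlation between an automatic felicity score and human ratings.
# The numbers below are invented placeholders, not the paper's data.
from scipy.stats import pearsonr

felicity      = [0.82, 0.41, 0.67, 0.90, 0.33, 0.58, 0.75, 0.49]  # automatic measure
acceptability = [4.5, 2.8, 3.9, 4.8, 2.1, 3.2, 4.1, 3.0]          # human judgements

r, p = pearsonr(felicity, acceptability)
print(f"Pearson's r = {r:.3f}, p = {p:.3f}")
print(f"Variance explained r^2 = {r ** 2:.2f}")  # r = 0.795 -> r^2 ~ 0.63 ("two thirds")
```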
Abstract:
Resistance to chemotherapy and molecularly targeted therapies is a major problem facing current cancer research. The mechanisms of resistance to 'classical' cytotoxic chemotherapeutics and to therapies that are designed to be selective for specific molecular targets share many features, such as alterations in the drug target, activation of prosurvival pathways and ineffective induction of cell death. With the increasing arsenal of anticancer agents, improving preclinical models and the advent of powerful high-throughput screening techniques, there are now unprecedented opportunities to understand and overcome drug resistance through the clinical assessment of rational therapeutic drug combinations and the use of predictive biomarkers to enable patient stratification.
Abstract:
Human listeners seem to be remarkably able to recognise acoustic sound sources based on timbre cues. Here we describe a psychophysical paradigm to estimate the time it takes to recognise a set of complex sounds differing only in timbre cues: both in terms of the minimum duration of the sounds and the inferred neural processing time. Listeners had to respond to the human voice while ignoring a set of distractors. All sounds were recorded from natural sources over the same pitch range and equalised to the same duration and power. In a first experiment, stimuli were gated in time with a raised-cosine window of variable duration and random onset time. A voice/non-voice (yes/no) task was used. Performance, as measured by d', remained above chance for the shortest sounds tested (2 ms); d's above 1 were observed for durations longer than or equal to 8 ms. Then, we constructed sequences of short sounds presented in rapid succession. Listeners were asked to report the presence of a single voice token that could occur at a random position within the sequence. This method is analogous to the "rapid sequential visual presentation" paradigm (RSVP), which has been used to evaluate neural processing time for images. For 500-ms sequences made of 32-ms and 16-ms sounds, d' remained above chance for presentation rates of up to 30 sounds per second. There was no effect of the pitch relation between successive sounds: identical for all sounds in the sequence or random for each sound. This implies that the task was not determined by streaming or forward masking, as both phenomena would predict better performance for the random pitch condition. Overall, the recognition of familiar sound categories such as the voice seems to be surprisingly fast, both in terms of the acoustic duration required and of the underlying neural time constants.
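Since performance above is reported as d', the following sketch shows the standard yes/no d' computation (z of the hit rate minus z of the false-alarm rate), with a log-linear correction so extreme rates remain finite; the trial counts are invented and the correction choice is an assumption, not necessarily the one used by the authors.

```python
# Standard yes/no d' computation; counts are invented for illustration.
from scipy.stats import norm

def d_prime(hits, misses, false_alarms, correct_rejections):
    """d' = z(hit rate) - z(false-alarm rate), with a log-linear correction
    so that rates of 0 or 1 do not yield infinite z-scores."""
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

# e.g. 40 voice targets and 40 distractors in one yes/no block
print(f"d' = {d_prime(hits=30, misses=10, false_alarms=8, correct_rejections=32):.2f}")
```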
Abstract:
From the early 1900s, some psychologists have attempted to establish their discipline as a quantitative science. In using quantitative methods to investigate their theories, they adopted their own special definition of measurement of attributes such as cognitive abilities, as though they were quantities of the type encountered in Newtonian science. Joel Michell has presented a carefully reasoned argument that psychological attributes lack additivity, and therefore cannot be quantities in the same way as the attributes of classical Newtonian physics. In the early decades of the 20th century, quantum theory superseded Newtonian mechanics as the best model of physical reality. This paper gives a brief, critical overview of the evolution of current measurement practices in psychology, and suggests the need for a transition from a Newtonian to a quantum theoretical paradigm for psychological measurement. Finally, a case study is presented that considers the implications of a quantum theoretical model for educational measurement. In particular, it is argued that, since the OECD’s Programme for International Student Assessment (PISA) is predicated on a Newtonian conception of measurement, this may constrain the extent to which it can make accurate comparisons of the achievements of different education systems.
Abstract:
The outcomes of educational assessments undoubtedly have real implications for students, teachers, schools and education in the widest sense. Assessment results are, for example, used to award qualifications that determine future educational or vocational pathways of students. The results obtained by students in assessments are also used to gauge individual teacher quality, to hold schools to account for the standards achieved by their students, and to compare international education systems. Given the current high-stakes nature of educational assessment, it is imperative that the measurement practices involved have stable philosophical foundations. However, this paper casts doubt on the theoretical underpinnings of contemporary educational measurement models. Aspects of Wittgenstein’s later philosophy and Bohr’s philosophy of quantum theory are used to argue that a quantum theoretical rather than a Newtonian model is appropriate for educational measurement, and the associated implications for the concept of validity are elucidated. Whilst it is acknowledged that the transition to a quantum theoretical framework would not lead to the demise of educational assessment, it is argued that, where practical, current high-stakes assessments should be reformed to become as ‘low-stakes’ as possible. The paper also undermines some of the pro high-stakes testing rhetoric that has a tendency to afflict education.
Abstract:
Heterogeneous Networks (HetNets) are known to enhance the bandwidth efficiency and throughput of wireless networks by utilizing the network resources more effectively. However, the higher density of users and access points in HetNets introduces significant inter-user interference that needs to be mitigated through complex and sophisticated interference cancellation schemes. Moreover, due to significant channel attenuation and the presence of hardware impairments, e.g., phase noise and amplifier nonlinearities, the vast bandwidth in the millimeter-wave band has not been fully utilized to date. In order to enable the development of multi-gigabit-per-second wireless networks, we introduce a novel millimeter-wave HetNet paradigm, termed the hybrid HetNet, which exploits the vast bandwidth and propagation characteristics of the 60 GHz and 70–80 GHz bands to reduce the impact of interference in HetNets. Simulation results are presented to illustrate the performance advantage of hybrid HetNets with respect to traditional networks. Next, two specific transceiver structures are proposed that enable hand-offs from the 60 GHz band (the V-band) to the 70–80 GHz band (the E-band), and vice versa. Finally, the practical and regulatory challenges for establishing a hybrid HetNet are outlined.
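To give a feel for the propagation characteristics mentioned above, the snippet below compares textbook free-space path loss at 60 GHz (V-band) and roughly 73 GHz (E-band); this is only a back-of-the-envelope illustration, not the channel model used in the paper's simulations.

```python
# Free-space path loss comparison for the V-band (60 GHz) and E-band (~73 GHz).
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss, 20*log10(4*pi*d*f/c), in dB."""
    c = 3.0e8  # speed of light, m/s
    return 20.0 * math.log10(4.0 * math.pi * distance_m * freq_hz / c)

for f_ghz in (60.0, 73.0):
    print(f"{f_ghz:.0f} GHz over 100 m: {fspl_db(100.0, f_ghz * 1e9):.1f} dB")
```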
Abstract:
Next-generation sequencing (NGS) is beginning to show its full potential for diagnostic and therapeutic applications. In particular, it is demonstrating its capacity to contribute to a molecular taxonomy of cancer, to be used as a standard approach for diagnostic mutation detection, and to open new treatment options that are not exclusively organ-specific. If this is the case, how much validation is necessary, and what should the validation strategy be, when bringing NGS into diagnostic/clinical practice? This validation strategy should address key issues such as: what is the overall extent of the validation? Should essential indicators of test performance, such as sensitivity or specificity, be calculated for every target or sample type? Should bioinformatic interpretation approaches be validated with the same rigour? What is a competitive clinical turnaround time for an NGS-based test, and when does it become a cost-effective testing proposition? While we address these and other related topics in this commentary, we also suggest that a single set of international guidelines for the validation and use of NGS technology in routine diagnostics may allow us all to make much more effective use of resources.
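As a reminder of what those test-performance indicators measure, the sketch below computes sensitivity and specificity from a hypothetical confusion matrix of variant calls against a reference truth set; the counts are invented for illustration.

```python
# Sensitivity and specificity from a hypothetical variant-calling confusion matrix.
def sensitivity_specificity(tp: int, fp: int, tn: int, fn: int):
    sensitivity = tp / (tp + fn)  # fraction of true variants that are detected
    specificity = tn / (tn + fp)  # fraction of non-variant sites correctly called negative
    return sensitivity, specificity

sens, spec = sensitivity_specificity(tp=95, fp=3, tn=900, fn=5)
print(f"Sensitivity = {sens:.3f}, specificity = {spec:.3f}")
```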
Hydroclimatic shifts in northeast Thailand during the last two millennia - the record of Lake Pa Kho
Abstract:
The Southeast Asian mainland is located in the central path of the Asian summer monsoon, a region where paleoclimatic data are still sparse. Here we present a multi-proxy (TOC, C/N, δ13C, biogenic silica, and XRF elemental data) study of a 1.5 m sediment/peat sequence from Lake Pa Kho, northeast Thailand, which is supported by 20 AMS 14C ages. Hydroclimatic reconstructions for Pa Kho suggest a strengthened summer monsoon during 170 BC–AD 370 and AD 800–960 and after AD 1450, and a weakening of the summer monsoon during AD 370–800 and AD 1300–1450. Increased run-off and a higher nutrient supply after AD 1700 can be linked to agricultural intensification and land-use changes in the region. This study fills an important gap in data coverage with respect to summer monsoon variability over Southeast Asia during the past 2000 years and enables the mean position of the Intertropical Convergence Zone (ITCZ) to be inferred based on comparisons with other regional studies. Intervals of strengthened/weakened summer monsoon rainfall suggest that the mean position of the ITCZ was located as far north as 35°N during 170 BC–AD 370 and AD 800–960, whereas it likely did not reach above 17°N during the drought intervals of AD 370–800 and AD 1300–1450. The spatial pattern of rainfall variation seems to have changed after AD 1450, when the inferred moisture history for Pa Kho indicates a more southerly location of the mean position of the summer ITCZ.