Abstract:
Human listeners seem to be remarkably able to recognise acoustic sound sources based on timbre cues. Here we describe a psychophysical paradigm to estimate the time it takes to recognise a set of complex sounds differing only in timbre cues, both in terms of the minimum duration of the sounds and the inferred neural processing time. Listeners had to respond to the human voice while ignoring a set of distractors. All sounds were recorded from natural sources over the same pitch range and equalised to the same duration and power. In a first experiment, stimuli were gated in time with a raised-cosine window of variable duration and random onset time. A voice/non-voice (yes/no) task was used. Performance, as measured by d', remained above chance for the shortest sounds tested (2 ms); d' values above 1 were observed for durations of 8 ms or longer. We then constructed sequences of short sounds presented in rapid succession. Listeners were asked to report the presence of a single voice token that could occur at a random position within the sequence. This method is analogous to the "rapid serial visual presentation" (RSVP) paradigm, which has been used to evaluate neural processing time for images. For 500-ms sequences made of 32-ms and 16-ms sounds, d' remained above chance for presentation rates of up to 30 sounds per second. There was no effect of the pitch relation between successive sounds, whether identical for all sounds in the sequence or random for each sound. This implies that performance was not determined by streaming or forward masking, as both phenomena would predict better performance for the random-pitch condition. Overall, the recognition of familiar sound categories such as the voice seems to be surprisingly fast, both in terms of the acoustic duration required and of the underlying neural time constants.
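The performance measure d' used above is the standard signal-detection index for a yes/no task. As a point of reference only, the following Python sketch shows one common way to compute d' from response counts; the function name, the log-linear correction, and the counts are illustrative assumptions, not taken from the study.

```python
# Minimal sketch (not the authors' code): computing d' for a yes/no
# voice detection task from hit and false-alarm counts.
from scipy.stats import norm

def d_prime(hits, misses, false_alarms, correct_rejections):
    """d' = z(hit rate) - z(false-alarm rate), with a log-linear
    correction so that rates of 0 or 1 stay finite (a common choice,
    not necessarily the one used in the study)."""
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

# Hypothetical counts for one gate duration, e.g. 8-ms sounds:
print(d_prime(hits=42, misses=8, false_alarms=15, correct_rejections=35))
```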
Abstract:
Since the early 1900s, some psychologists have attempted to establish their discipline as a quantitative science. In using quantitative methods to investigate their theories, they adopted their own special definition of measurement, treating attributes such as cognitive abilities as though they were quantities of the type encountered in Newtonian science. Joel Michell has presented a carefully reasoned argument that psychological attributes lack additivity and therefore cannot be quantities in the same way as the attributes of classical Newtonian physics. In the early decades of the 20th century, quantum theory superseded Newtonian mechanics as the best model of physical reality. This paper gives a brief, critical overview of the evolution of current measurement practices in psychology and suggests the need for a transition from a Newtonian to a quantum theoretical paradigm for psychological measurement. Finally, a case study is presented that considers the implications of a quantum theoretical model for educational measurement. In particular, it is argued that, since the OECD’s Programme for International Student Assessment (PISA) is predicated on a Newtonian conception of measurement, this may constrain the extent to which it can make accurate comparisons of the achievements of different education systems.
Abstract:
The outcomes of educational assessments undoubtedly have real implications for students, teachers, schools and education in the widest sense. Assessment results are, for example, used to award qualifications that determine the future educational or vocational pathways of students. The results obtained by students in assessments are also used to gauge individual teacher quality, to hold schools to account for the standards achieved by their students, and to compare international education systems. Given the current high-stakes nature of educational assessment, it is imperative that the measurement practices involved have stable philosophical foundations. However, this paper casts doubt on the theoretical underpinnings of contemporary educational measurement models. Aspects of Wittgenstein’s later philosophy and Bohr’s philosophy of quantum theory are used to argue that a quantum theoretical rather than a Newtonian model is appropriate for educational measurement, and the associated implications for the concept of validity are elucidated. Whilst it is acknowledged that the transition to a quantum theoretical framework would not lead to the demise of educational assessment, it is argued that, where practical, current high-stakes assessments should be reformed to become as ‘low-stakes’ as possible. The paper also challenges some of the pro-high-stakes-testing rhetoric that tends to afflict education.
Abstract:
Heterogeneous Networks (HetNets) are known to enhance the bandwidth efficiency and throughput of wireless networks by utilizing network resources more effectively. However, the higher density of users and access points in HetNets introduces significant inter-user interference that needs to be mitigated through complex and sophisticated interference cancellation schemes. Moreover, due to significant channel attenuation and the presence of hardware impairments, e.g., phase noise and amplifier nonlinearities, the vast bandwidth of the millimeter-wave band has not been fully utilized to date. In order to enable the development of multi-Gigabit per second wireless networks, we introduce a novel millimeter-wave HetNet paradigm, termed hybrid HetNet, which exploits the vast bandwidth and the propagation characteristics of the 60 GHz and 70–80 GHz bands to reduce the impact of interference in HetNets. Simulation results are presented to illustrate the performance advantage of hybrid HetNets with respect to traditional networks. Next, two specific transceiver structures are proposed that enable hand-offs from the 60 GHz band (the V-band) to the 70–80 GHz band (the E-band), and vice versa. Finally, the practical and regulatory challenges for establishing a hybrid HetNet are outlined.
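As a rough illustration of why V-band/E-band hand-offs are attractive (not a result from the paper), the toy Python sketch below compares free-space path loss in the two bands, adding the widely cited ~15 dB/km oxygen absorption around 60 GHz, and picks a band per link distance; the loss budget, carrier frequencies and distances are arbitrary assumptions.

```python
# Illustrative sketch only (not from the paper): a toy link-budget
# comparison motivating V-band <-> E-band hand-offs. The oxygen
# absorption value (~15 dB/km near 60 GHz) is a commonly cited figure;
# the threshold and distances are arbitrary assumptions.
import math

C = 3.0e8  # speed of light, m/s

def path_loss_db(freq_hz, dist_m, oxygen_db_per_km=0.0):
    # Friis free-space path loss plus a simple atmospheric-absorption term.
    fspl = (20 * math.log10(dist_m) + 20 * math.log10(freq_hz)
            + 20 * math.log10(4 * math.pi / C))
    return fspl + oxygen_db_per_km * dist_m / 1000.0

def pick_band(dist_m, loss_budget_db=130.0):
    v_loss = path_loss_db(60e9, dist_m, oxygen_db_per_km=15.0)  # V-band
    e_loss = path_loss_db(73e9, dist_m)                          # E-band
    # Toy hand-off rule: stay in the V-band only while its loss beats the
    # E-band and stays within the assumed budget; otherwise hand off.
    return "V-band" if v_loss <= min(e_loss, loss_budget_db) else "E-band"

for d in (10, 100, 500, 1000):
    print(d, "m ->", pick_band(d))
```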
Abstract:
Next-generation sequencing (NGS) is beginning to show its full potential for diagnostic and therapeutic applications. In particular, it is demonstrating its capacity to contribute to a molecular taxonomy of cancer, to be used as a standard approach for diagnostic mutation detection, and to open new treatment options that are not exclusively organ-specific. If this is the case, how much validation is necessary, and what should the validation strategy be, when bringing NGS into diagnostic/clinical practice? This validation strategy should address key issues such as: what is the overall extent of the validation? Should essential indicators of test performance, such as sensitivity or specificity, be calculated for every target or sample type? Should bioinformatic interpretation approaches be validated with the same rigour? What is a competitive clinical turnaround time for an NGS-based test, and when does it become a cost-effective testing proposition? While we address these and other related topics in this commentary, we also suggest that a single set of international guidelines for the validation and use of NGS technology in routine diagnostics may allow us all to make much more effective use of resources.
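To make the per-target question concrete, here is a minimal Python sketch (not tied to any particular NGS pipeline or guideline) of sensitivity and specificity computed from confusion counts against a reference set; the gene names and counts are invented for illustration.

```python
# Minimal sketch (illustrative only): per-target sensitivity and
# specificity from validation calls compared against a reference
# ("truth") set. Counts and target names are made up.

def sensitivity(tp, fn):
    """True-positive rate: detected variants / all variants truly present."""
    return tp / (tp + fn) if (tp + fn) else float("nan")

def specificity(tn, fp):
    """True-negative rate: correct negative calls / all true negatives."""
    return tn / (tn + fp) if (tn + fp) else float("nan")

# Hypothetical per-target confusion counts (tp, fn, tn, fp):
validation = {
    "EGFR": (48, 2, 950, 5),
    "KRAS": (30, 1, 970, 2),
}

for target, (tp, fn, tn, fp) in validation.items():
    print(f"{target}: sensitivity={sensitivity(tp, fn):.3f}, "
          f"specificity={specificity(tn, fp):.3f}")
```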
Abstract:
In this paper we advocate the Loop-of-stencil-reduce pattern as a way to simplify the parallel programming of heterogeneous platforms (multicore + GPUs). Loop-of-stencil-reduce is general enough to subsume map, reduce, map-reduce, stencil, stencil-reduce and, crucially, their usage in a loop. It transparently targets (by using OpenCL) combinations of CPU cores and GPUs, and it makes it possible to simplify the deployment of a single stencil computation kernel on different GPUs. The paper discusses the implementation of Loop-of-stencil-reduce within the FastFlow parallel framework, using a simple iterative data-parallel application (Game of Life) as a running example and a highly effective parallel filter for visual data restoration to assess performance. Thanks to the high-level design of the Loop-of-stencil-reduce pattern, it was possible to run the filter seamlessly on a multicore machine, on multiple GPUs, and on both.
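To show the structure of the pattern rather than FastFlow's actual C++/OpenCL API, here is a minimal sequential Python sketch of Loop-of-stencil-reduce using Game of Life: each iteration maps a stencil over the grid, reduces the per-cell results to a single value, and uses that value to decide whether to keep looping. None of the names below are FastFlow identifiers.

```python
# Sequential sketch of the Loop-of-stencil-reduce pattern (structure only).
# Game of Life is the stencil; the reduce counts changed cells so the loop
# can stop at a fixed point or after max_iter iterations.

def stencil(grid, r, c):
    """Game of Life rule applied to one cell from its 3x3 neighbourhood
    (toroidal wrap-around for simplicity)."""
    rows, cols = len(grid), len(grid[0])
    live = sum(grid[(r + dr) % rows][(c + dc) % cols]
               for dr in (-1, 0, 1) for dc in (-1, 0, 1)
               if (dr, dc) != (0, 0))
    return 1 if (live == 3 or (grid[r][c] and live == 2)) else 0

def loop_of_stencil_reduce(grid, max_iter=100):
    for _ in range(max_iter):
        # "stencil" (map) phase: apply the rule to every cell (parallelisable).
        new = [[stencil(grid, r, c) for c in range(len(grid[0]))]
               for r in range(len(grid))]
        # "reduce" phase: combine per-cell results into one value.
        changed = sum(a != b for row_a, row_b in zip(grid, new)
                      for a, b in zip(row_a, row_b))
        grid = new
        if changed == 0:   # loop condition driven by the reduction
            break
    return grid

blinker = [[0, 0, 0, 0, 0],
           [0, 0, 1, 0, 0],
           [0, 0, 1, 0, 0],
           [0, 0, 1, 0, 0],
           [0, 0, 0, 0, 0]]
print(loop_of_stencil_reduce(blinker, max_iter=4))
```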