958 results for paradigm


Relevance:

20.00%

Publisher:

Abstract:

Human listeners seem to be remarkably able to recognise acoustic sound sources based on timbre cues. Here we describe a psychophysical paradigm to estimate the time it takes to recognise a set of complex sounds differing only in timbre cues: both in terms of the minimum duration of the sounds and the inferred neural processing time. Listeners had to respond to the human voice while ignoring a set of distractors. All sounds were recorded from natural sources over the same pitch range and equalised to the same duration and power. In a first experiment, stimuli were gated in time with a raised-cosine window of variable duration and random onset time. A voice/non-voice (yes/no) task was used. Performance, as measured by d', remained above chance for the shortest sounds tested (2 ms); d' values above 1 were observed for durations longer than or equal to 8 ms. Then, we constructed sequences of short sounds presented in rapid succession. Listeners were asked to report the presence of a single voice token that could occur at a random position within the sequence. This method is analogous to the "rapid serial visual presentation" (RSVP) paradigm, which has been used to evaluate neural processing time for images. For 500-ms sequences made of 32-ms and 16-ms sounds, d' remained above chance for presentation rates of up to 30 sounds per second. There was no effect of the pitch relation between successive sounds: identical for all sounds in the sequence or random for each sound. This implies that the task was not determined by streaming or forward masking, as both phenomena would predict better performance for the random pitch condition. Overall, the recognition of familiar sound categories such as the voice seems to be surprisingly fast, both in terms of the acoustic duration required and of the underlying neural time constants.
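The d' sensitivity index used above comes from signal detection theory: for a yes/no task it is the difference between the z-transformed hit rate and false-alarm rate. A minimal sketch in Python with hypothetical rates (this is an illustration of the standard formula, not the study's analysis code):

```python
from statistics import NormalDist

def d_prime(hit_rate: float, fa_rate: float) -> float:
    """Sensitivity index d' = z(hit rate) - z(false-alarm rate)."""
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    return z(hit_rate) - z(fa_rate)

# Hypothetical yes/no data: 79% hits, 30% false alarms
print(round(d_prime(0.79, 0.30), 2))
```

A d' of 0 corresponds to chance performance (equal hit and false-alarm rates); values above 1, as reported for the 8-ms sounds, indicate reliable discrimination.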

Relevance:

20.00%

Publisher:

Abstract:

From the early 1900s, some psychologists have attempted to establish their discipline as a quantitative science. In using quantitative methods to investigate their theories, they adopted their own special definition of measurement of attributes such as cognitive abilities, as though they were quantities of the type encountered in Newtonian science. Joel Michell has presented a carefully reasoned argument that psychological attributes lack additivity, and therefore cannot be quantities in the same way as the attributes of classical Newtonian physics. In the early decades of the 20th century, quantum theory superseded Newtonian mechanics as the best model of physical reality. This paper gives a brief, critical overview of the evolution of current measurement practices in psychology, and suggests the need for a transition from a Newtonian to a quantum theoretical paradigm for psychological measurement. Finally, a case study is presented that considers the implications of a quantum theoretical model for educational measurement. In particular, it is argued that, since the OECD’s Programme for International Student Assessment (PISA) is predicated on a Newtonian conception of measurement, this may constrain the extent to which it can make accurate comparisons of the achievements of different education systems.

Relevance:

20.00%

Publisher:

Abstract:

The outcomes of educational assessments undoubtedly have real implications for students, teachers, schools and education in the widest sense. Assessment results are, for example, used to award qualifications that determine future educational or vocational pathways of students. The results obtained by students in assessments are also used to gauge individual teacher quality, to hold schools to account for the standards achieved by their students, and to compare international education systems. Given the current high-stakes nature of educational assessment, it is imperative that the measurement practices involved have stable philosophical foundations. However, this paper casts doubt on the theoretical underpinnings of contemporary educational measurement models. Aspects of Wittgenstein’s later philosophy and Bohr’s philosophy of quantum theory are used to argue that a quantum theoretical rather than a Newtonian model is appropriate for educational measurement, and the associated implications for the concept of validity are elucidated. Whilst it is acknowledged that the transition to a quantum theoretical framework would not lead to the demise of educational assessment, it is argued that, where practical, current high-stakes assessments should be reformed to become as ‘low-stakes’ as possible. The paper also undermines some of the pro high-stakes testing rhetoric that has a tendency to afflict education.

Relevance:

20.00%

Publisher:

Abstract:

Heterogeneous Networks (HetNets) are known to enhance the bandwidth efficiency and throughput of wireless networks by more effectively utilizing the network resources. However, the higher density of users and access points in HetNets introduces significant inter-user interference that needs to be mitigated through complex and sophisticated interference cancellation schemes. Moreover, due to significant channel attenuation and the presence of hardware impairments, e.g., phase noise and amplifier nonlinearities, the vast bandwidth in the millimeter-wave band has not been fully utilized to date. In order to enable the development of multi-Gigabit per second wireless networks, we introduce a novel millimeter-wave HetNet paradigm, termed hybrid HetNet, which exploits the vast bandwidth and propagation characteristics of the 60 GHz and 70–80 GHz bands to reduce the impact of interference in HetNets. Simulation results are presented to illustrate the performance advantage of hybrid HetNets with respect to traditional networks. Next, two specific transceiver structures are proposed that enable hand-offs from the 60 GHz band, i.e., the V-band, to the 70–80 GHz band, i.e., the E-band, and vice versa. Finally, the practical and regulatory challenges for establishing a hybrid HetNet are outlined.

Relevance:

20.00%

Publisher:

Abstract:

Next-generation sequencing (NGS) is beginning to show its full potential for diagnostic and therapeutic applications. In particular, it is enunciating its capacity to contribute to a molecular taxonomy of cancer, to be used as a standard approach for diagnostic mutation detection, and to open new treatment options that are not exclusively organ-specific. If this is the case, how much validation is necessary and what should be the validation strategy, when bringing NGS into diagnostic/clinical practice? This validation strategy should address key issues such as: what is the overall extent of the validation? Should essential indicators of test performance such as sensitivity or specificity be calculated for every target or sample type? Should bioinformatic interpretation approaches be validated with the same rigour? What is a competitive clinical turnaround time for an NGS-based test, and when does it become a cost-effective testing proposition? While we address these and other related topics in this commentary, we also suggest that a single set of international guidelines for the validation and use of NGS technology in routine diagnostics may allow us all to make a much more effective use of resources.
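The test-performance indicators named above have standard definitions: sensitivity is the fraction of true variants a test detects, specificity the fraction of non-variant sites it correctly calls negative. A generic sketch with hypothetical validation counts (not tied to any specific NGS pipeline or guideline):

```python
def sensitivity(tp: int, fn: int) -> float:
    """True-positive rate: detected true variants / all true variants."""
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    """True-negative rate: correct negative calls / all non-variant sites."""
    return tn / (tn + fp)

# Hypothetical counts for one target region scored against a reference truth set
print(sensitivity(95, 5))    # 95 detected of 100 true variants -> 0.95
print(specificity(990, 10))  # 990 correct of 1000 negative sites -> 0.99
```

Calculating these per target or per sample type, as the commentary asks, simply means tallying the four counts separately for each stratum before applying the formulas.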

Relevance:

20.00%

Publisher:

Abstract:

Ticks, as vectors of several notorious zoonotic pathogens, represent an important and increasing threat to human and animal health in Europe. Recent application of new technologies has revealed the complexity of the tick microbiome, which might impact the tick's vectorial capacity. Appreciation of these complex systems is expanding our vision of tick-borne pathogens, leading us to evolve a more integrated view that embraces the "pathobiome": the pathogenic agent integrated within its abiotic and biotic environments. In this review, we explore how this new vision will revolutionize our understanding of tick-borne diseases, and we discuss the implications for future research approaches aimed at efficiently preventing and controlling the threat posed by ticks.

Relevance:

20.00%

Publisher:

Abstract:

A passive three-stimulus oddball paradigm was used to investigate Visual Mismatch Negativity (vMMN), a component of the Event-Related Potential (ERP) believed to represent a central pre-attentive change-detection mechanism. Responses to a change in orientation were recorded to monochrome stimuli presented to subjects on a computer screen. One of the infrequent stimuli formed an illusory figure (Kanizsa square) intended to capture spatial attention in the absence of an active task. Nineteen electrodes (10-20 system) were used to record the electroencephalogram in fourteen subjects (ten females; mean age 34.5 years). ERPs to all stimuli consisted of a positive-negative-positive complex recorded maximally over lateral occipital areas. The negative component was greater for deviant and illusory-deviant compared to standard stimuli in a time window of 170-190 ms. A P3a component over frontal/central electrodes to the illusory deviant, but not to the deviant stimulus, suggests the illusory figure was able to capture attention and orient subjects to the recording. Subtraction waveforms revealed visual discrimination responses at occipital electrodes, which may represent vMMN. In a control study with 13 subjects (11 females; mean age 29.23 years), using an embedded active attention task, we confirmed the existence of an earlier (150-170 ms) and attenuated vMMN. Recordings from an intracranial case study confirmed separation of N1 and discrimination components to posterior and anterior occipital areas, respectively. We conclude that although the illusory figure captured spatial attention in its own right, it did not draw sufficient attentional resources from the standard-deviant comparison, as revealed when using a concurrent active task.
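The subtraction waveforms mentioned above are obtained by subtracting the averaged ERP to standard stimuli from the averaged ERP to deviant stimuli, sample by sample; a negative deflection in the resulting difference wave within the reported window is the candidate vMMN. A minimal sketch with hypothetical data (not the study's analysis pipeline):

```python
def difference_wave(deviant_erp, standard_erp):
    """Deviant-minus-standard subtraction waveform, sample by sample."""
    return [d - s for d, s in zip(deviant_erp, standard_erp)]

# Hypothetical averaged ERPs (microvolts) at one occipital electrode
standard = [0.0, 1.2, -0.5, 0.8]
deviant  = [0.0, 1.2, -1.5, 0.8]
print(difference_wave(deviant, standard))  # [0.0, 0.0, -1.0, 0.0]
```

Here the two waveforms differ only at one sample, so the difference wave isolates that extra negativity while responses common to both stimulus types cancel.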

Relevance:

20.00%

Publisher:

Abstract:

The neighbourhood in both the UK and Europe continues to dominate thinking about the quality of life in local communities, representation and empowerment, and how local services can be delivered most effectively. For several decades a series of centrally funded programmes in neighbourhood governance have targeted localities suffering deprivation and social exclusion in England. From these much can be learnt about the strengths and limitations of a local approach to achieving multiple objectives. We review the findings of a case study of neighbourhood governance in the City of Westminster and draw on evaluations of two national programmes. In the conclusions we discuss the problems arising from multiple objectives and examine the prospects for neighbourhood governance as the national paradigm moves away from `big state' solutions towards the less-well-defined `big society' approach and the reinvention of `localism'. While the rationale for neighbourhood governance may change, the `neighbourhood' as a site for service delivery and planning remains as important now as in the past.

Relevance:

20.00%

Publisher:

Abstract:

This paper aims to do three things: first, the authors present a case for their argument that marketing can be viewed as a masculine paradigm; second, a critical perspective on traditional research methodologies within marketing and consumer research is developed; and finally, the authors share their own experiences of adopting alternative approaches and engaging in research reflexivity.

Relevance:

20.00%

Publisher:

Abstract:

At the beginning of the 21st century, a new social arrangement of work poses a series of questions and challenges to scholars who aim to help people develop their working lives. Given the globalization of career counseling, we decided to address these issues and then to formulate potentially innovative responses in an international forum. We used this approach to avoid the difficulties of creating models and methods in one country and then trying to export them to other countries where they would be adapted for use. This article presents the initial outcome of this collaboration, a counseling model and methods. The life-designing model for career intervention endorses five presuppositions about people and their work lives: contextual possibilities, dynamic processes, non-linear progression, multiple perspectives, and personal patterns. Thinking from these five presuppositions, we have crafted a contextualized model based on the epistemology of social constructionism, particularly recognizing that an individual's knowledge and identity are the product of social interaction and that meaning is co-constructed through discourse. The life-design framework for counseling implements the theories of self-constructing [Guichard, J. (2005). Life-long self-construction. International Journal for Educational and Vocational Guidance, 5, 111-124] and career construction [Savickas, M. L. (2005). The theory and practice of career construction. In S. D. Brown & R. W. Lent (Eds.), Career development and counselling: putting theory and research to work (pp. 42-70). Hoboken, NJ: Wiley] that describe vocational behavior and its development. Thus, the framework is structured to be life-long, holistic, contextual, and preventive.