46 results for Eclectic Paradigm


Relevance: 20.00%

Abstract:

At the core of the sense of agency for self-produced action is the sense that I, and not some other agent, am producing and directing those actions. While there is an ever-expanding body of empirical research investigating the sense of agency for bodily action, there has, to date, been little empirical investigation of the sense of agency for thought. The present study uses the novel Mind-to-Mind paradigm, in which the agentive source of a target thought is ambiguous, to measure misattributions of agency. Seventy-two percent of participants made at least one misattribution of agency during a 5-min trial. Misattributions were significantly more frequent when the target thought was an arousing negative thought as compared to a neutral control. The findings establish a novel protocol for measuring the sense of agency for thought, and suggest that both contextual factors and emotional experience play a role in its generation.
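A comparison of this kind would typically be analysed per participant. The following sketch is purely illustrative: the misattribution counts and the choice of a Wilcoxon signed-rank test are assumptions, not details taken from the study.

```python
# Illustrative analysis sketch (not the authors' code): comparing per-participant
# misattribution counts between arousing-negative and neutral target thoughts.
# The counts below are hypothetical placeholders.
import numpy as np
from scipy.stats import wilcoxon

negative = np.array([3, 1, 4, 2, 0, 5, 2, 3])  # misattributions, negative condition
neutral = np.array([1, 0, 2, 1, 0, 2, 1, 1])   # misattributions, neutral condition

stat, p = wilcoxon(negative, neutral)  # paired, non-parametric comparison
print(f"Wilcoxon W = {stat:.1f}, p = {p:.3f}")
print(f"At least one misattribution: {np.mean((negative + neutral) > 0):.0%} of participants")
```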

Relevance: 20.00%

Abstract:

We present three natural language marking strategies based on fast and reliable shallow parsing techniques and on widely available lexical resources: lexical substitution, adjective conjunction swaps, and relativiser switching. We test these techniques on a random sample of the British National Corpus. Individual candidate marks are checked for goodness of structural and semantic fit, using both lexical resources and the web as a corpus. A representative sample of marks is given to 25 human judges to evaluate for acceptability and preservation of meaning. This establishes a correlation between corpus-based felicity measures and perceived quality, and supports qualified predictions of mark quality. Grammatical acceptability correlates strongly with our automatic measure (Pearson's r = 0.795, p = 0.001), allowing us to account for about two thirds of the variability in human judgements. A moderate but statistically non-significant correlation (Pearson's r = 0.422, p = 0.356) is found with judgements of meaning preservation, indicating that the contextual window of five content words used for our automatic measure may need to be extended.
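To make the reported statistics concrete, the sketch below shows how such a correlation, and the "two thirds of variability" figure (r²), might be computed; the paired scores are hypothetical placeholders, not the study's data.

```python
# Illustrative sketch (not the authors' code): correlating an automatic,
# corpus-based felicity score with mean human acceptability judgements.
# The paired scores below are hypothetical placeholders.
import numpy as np
from scipy.stats import pearsonr

felicity = np.array([0.91, 0.72, 0.55, 0.83, 0.34, 0.66, 0.78, 0.47])  # automatic measure
judgement = np.array([4.6, 3.8, 2.9, 4.1, 2.0, 3.5, 4.0, 2.6])         # mean judge rating (1-5)

r, p = pearsonr(felicity, judgement)
print(f"Pearson r = {r:.3f}, p = {p:.3f}")
print(f"Variance explained (r^2) = {r**2:.2f}")  # r = 0.795 would give r^2 of about 0.63
```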

Relevance: 20.00%

Abstract:

Resistance to chemotherapy and molecularly targeted therapies is a major problem facing current cancer research. The mechanisms of resistance to 'classical' cytotoxic chemotherapeutics and to therapies that are designed to be selective for specific molecular targets share many features, such as alterations in the drug target, activation of prosurvival pathways and ineffective induction of cell death. With the increasing arsenal of anticancer agents, improving preclinical models and the advent of powerful high-throughput screening techniques, there are now unprecedented opportunities to understand and overcome drug resistance through the clinical assessment of rational therapeutic drug combinations and the use of predictive biomarkers to enable patient stratification.

Relevance: 20.00%

Abstract:

Human listeners are remarkably adept at recognising acoustic sound sources from timbre cues. Here we describe a psychophysical paradigm to estimate the time it takes to recognise a set of complex sounds differing only in timbre cues, both in terms of the minimum duration of the sounds and the inferred neural processing time. Listeners had to respond to the human voice while ignoring a set of distractors. All sounds were recorded from natural sources over the same pitch range and equalised to the same duration and power. In a first experiment, stimuli were gated in time with a raised-cosine window of variable duration and random onset time, and a voice/non-voice (yes/no) task was used. Performance, as measured by d', remained above chance for the shortest sounds tested (2 ms); d' values above 1 were observed for durations of 8 ms and longer. We then constructed sequences of short sounds presented in rapid succession, and listeners were asked to report the presence of a single voice token that could occur at a random position within the sequence. This method is analogous to the "rapid serial visual presentation" (RSVP) paradigm, which has been used to evaluate neural processing time for images. For 500-ms sequences made of 32-ms and 16-ms sounds, d' remained above chance for presentation rates of up to 30 sounds per second. There was no effect of the pitch relation between successive sounds: identical for all sounds in the sequence or random for each sound. This implies that the task was not performed by means of streaming or forward masking, as both phenomena would predict better performance in the random-pitch condition. Overall, the recognition of familiar sound categories such as the voice seems to be surprisingly fast, both in terms of the acoustic duration required and of the underlying neural time constants.
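For readers unfamiliar with the d' measure, the sketch below shows the standard signal-detection computation for a yes/no task; the trial counts and the log-linear correction are illustrative assumptions, not details from the study.

```python
# Illustrative sketch (not the authors' code): computing d' for a yes/no
# voice/non-voice task from hit and false-alarm rates.
from scipy.stats import norm

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Signal-detection sensitivity: d' = z(hit rate) - z(false-alarm rate)."""
    # Log-linear correction guards against rates of exactly 0 or 1.
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

# Hypothetical counts for one listener at an 8-ms gate duration:
print(f"d' = {d_prime(hits=38, misses=12, false_alarms=9, correct_rejections=41):.2f}")
```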

Relevance: 20.00%

Abstract:

Since the early 1900s, some psychologists have attempted to establish their discipline as a quantitative science. In using quantitative methods to investigate their theories, they adopted their own special definition of measurement for attributes such as cognitive abilities, as though these were quantities of the type encountered in Newtonian science. Joel Michell has presented a carefully reasoned argument that psychological attributes lack additivity, and therefore cannot be quantities in the same way as the attributes of classical Newtonian physics. In the early decades of the 20th century, quantum theory superseded Newtonian mechanics as the best model of physical reality. This paper gives a brief, critical overview of the evolution of current measurement practices in psychology, and suggests the need for a transition from a Newtonian to a quantum theoretical paradigm for psychological measurement. Finally, a case study is presented that considers the implications of a quantum theoretical model for educational measurement. In particular, it is argued that, since the OECD’s Programme for International Student Assessment (PISA) is predicated on a Newtonian conception of measurement, this may constrain the extent to which it can make accurate comparisons of the achievements of different education systems.

Relevance: 20.00%

Abstract:

The outcomes of educational assessments undoubtedly have real implications for students, teachers, schools and education in the widest sense. Assessment results are, for example, used to award qualifications that determine the future educational or vocational pathways of students. The results obtained by students in assessments are also used to gauge individual teacher quality, to hold schools to account for the standards achieved by their students, and to compare international education systems. Given the current high-stakes nature of educational assessment, it is imperative that the measurement practices involved have stable philosophical foundations. However, this paper casts doubt on the theoretical underpinnings of contemporary educational measurement models. Aspects of Wittgenstein’s later philosophy and Bohr’s philosophy of quantum theory are used to argue that a quantum theoretical rather than a Newtonian model is appropriate for educational measurement, and the associated implications for the concept of validity are elucidated. Whilst it is acknowledged that the transition to a quantum theoretical framework would not lead to the demise of educational assessment, it is argued that, where practical, current high-stakes assessments should be reformed to become as ‘low-stakes’ as possible. The paper also undermines some of the rhetoric in favour of high-stakes testing that tends to afflict education.

Relevance: 20.00%

Abstract:

Heterogeneous Networks (HetNets) are known to enhance the bandwidth efficiency and throughput of wireless networks by utilizing the network resources more effectively. However, the higher density of users and access points in HetNets introduces significant inter-user interference that needs to be mitigated through complex and sophisticated interference cancellation schemes. Moreover, due to significant channel attenuation and the presence of hardware impairments, e.g., phase noise and amplifier nonlinearities, the vast bandwidth in the millimeter-wave band has not been fully utilized to date. In order to enable the development of multi-gigabit-per-second wireless networks, we introduce a novel millimeter-wave HetNet paradigm, termed the hybrid HetNet, which exploits the vast bandwidth and propagation characteristics of the 60 GHz and 70–80 GHz bands to reduce the impact of interference in HetNets. Simulation results are presented to illustrate the performance advantage of hybrid HetNets with respect to traditional networks. Next, two specific transceiver structures that enable hand-offs from the 60 GHz band (the V-band) to the 70–80 GHz band (the E-band), and vice versa, are proposed. Finally, the practical and regulatory challenges of establishing a hybrid HetNet are outlined.
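As a rough illustration of why such band hand-offs make sense, the sketch below compares V-band and E-band link budgets using free-space (Friis) path loss plus a flat oxygen-absorption term of roughly 15 dB/km at 60 GHz; the absorption value, distances and selection rule are simplifying assumptions, not details from the paper.

```python
# Illustrative sketch (an assumption, not from the paper): comparing path loss
# in the V-band (60 GHz) and E-band (73 GHz) to motivate band hand-offs.
# Oxygen absorption near 60 GHz (~15 dB/km) is the dominant difference at range.
import math

def path_loss_db(freq_ghz, dist_m, oxygen_db_per_km=0.0):
    """Free-space path loss (Friis) plus a flat atmospheric absorption term."""
    fspl = 20 * math.log10(dist_m) + 20 * math.log10(freq_ghz * 1e9) - 147.55
    return fspl + oxygen_db_per_km * dist_m / 1000.0

def pick_band(dist_m):
    v = path_loss_db(60.0, dist_m, oxygen_db_per_km=15.0)  # V-band, O2 absorption
    e = path_loss_db(73.0, dist_m)                          # E-band, negligible O2 loss
    return ("V-band" if v < e else "E-band"), v, e

for d in (10, 100, 500):
    band, v, e = pick_band(d)
    print(f"{d:4d} m: V-band {v:6.1f} dB, E-band {e:6.1f} dB -> prefer {band}")
```

Under these assumptions the V-band wins at short range and the E-band at longer range, with a crossover near 100 m, which is the kind of distance-dependent trade-off a hand-off mechanism can exploit.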

Relevance: 20.00%

Abstract:

Next-generation sequencing (NGS) is beginning to show its full potential for diagnostic and therapeutic applications. In particular, it is demonstrating its capacity to contribute to a molecular taxonomy of cancer, to be used as a standard approach for diagnostic mutation detection, and to open new treatment options that are not exclusively organ-specific. If this is the case, how much validation is necessary, and what should the validation strategy be, when bringing NGS into diagnostic and clinical practice? This strategy should address key issues such as: what is the overall extent of the validation? Should essential indicators of test performance, such as sensitivity or specificity, be calculated for every target or sample type? Should bioinformatic interpretation approaches be validated with the same rigour? What is a competitive clinical turnaround time for an NGS-based test, and when does it become a cost-effective testing proposition? While we address these and other related topics in this commentary, we also suggest that a single set of international guidelines for the validation and use of NGS technology in routine diagnostics would allow us all to make much more effective use of resources.
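For concreteness, the sketch below computes the two test-performance indicators named above from a confusion matrix of variant calls checked against a reference method; the counts are hypothetical and not taken from the commentary.

```python
# Illustrative sketch (not from the commentary): computing basic test-performance
# indicators for an NGS assay validated against an orthogonal reference method.
def sensitivity(tp, fn):
    return tp / (tp + fn)  # true-positive rate: fraction of real variants detected

def specificity(tn, fp):
    return tn / (tn + fp)  # true-negative rate: fraction of non-variants correctly rejected

tp, fn, tn, fp = 96, 4, 990, 10  # hypothetical counts for one target/sample type
print(f"Sensitivity = {sensitivity(tp, fn):.3f}")  # 0.960
print(f"Specificity = {specificity(tn, fp):.3f}")  # 0.990
```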

Relevance: 20.00%

Abstract:

They’re cheap. They’re in every settlement of significance in Britain, Ireland and elsewhere. We all use them but perhaps do not always admit to it, especially if we are architects.
Over the past decades, Aldi/Lidl low-cost supermarkets have escaped from middle Europe to take over large tracts of the English-speaking world, remaking them according to a formula of mass-produced sheds, buff-coloured cobble-lock car parks, logos in primary colours, bare shelves and eclectic special offers. The response within architectural discourse to this phenomenon has been largely one of indifference, and such places remain, perhaps reiterating Pevsner’s controversial insights into the bicycle shed, on the peripheries of what we might term architecture. This paper seeks to explore the spatial complexities of the discount supermarket and, in doing so, to open up a discussion on the architecture of cheapness. As a road-map, it takes former managing director Dieter Brandes’ treatise on the Aldi formula, Bare Essentials: the Aldi Way to Retailing, and investigates the strategies through which economic exigencies manifest themselves in a series of spatial tactics which involve building. Central to this is the idea of architecture as system rather than form and, in Aldi/Lidl’s case, as the result of a spatial network of flows. To understand the architecture of the supermarket, then, it is necessary to measure the times and spaces of supply across the scales of intersection between global and local.
Evaluating the energy, economy and precision of such systems challenges the liminal position of the commercial, the placeless and, especially, the cheap within architectural discourse. As is well known, architectures of mass production and prefabrication and their origins exercised modernist thinkers such as Sigfried Giedion and Walter Gropius in the early twentieth century, and they have undergone a resurgence in recent times. Meanwhile, the mapping of the hitherto overlooked forms and iconography of commerce in Learning from Las Vegas (1972) was extended by Rem Koolhaas et al. into an investigation of the technologies, systems and precedents of retail in the Harvard Design School Guide to Shopping, some thirty years later in 2001. While cheapness has obviously always been a criterion for building, it is more difficult to find writings on architecture which explicitly celebrate cheapness as a design virtue or, indeed, even utter the word cheap. Walter Gropius’ essay ‘How can we build cheaper, better, more attractive houses?’ (1927), however, situates the cheap within the discussions – articulated, amongst others, by Karl Teige and Bruno Taut – surrounding the minimal dwelling and the moral benefits of absence in the 1920s and 30s.
In our contemporary age of heightened consumption, it is perhaps fitting that an architecture of bare essentials is defined in retail rather than in housing, a commercial existenzminimum where the Miesian paradox of ‘less is more’ is resold as a paradigm of ‘more for less’ in the ubiquitous yet overlooked architectures of the discount supermarket.