129 results for Claims
Abstract:
Background: The computational grammatical complexity (CGC) hypothesis claims that children with G(rammatical)-specific language impairment (SLI) have a domain-specific deficit in the computational system affecting syntactic dependencies involving 'movement'. One type of such syntactic dependencies is filler-gap dependencies. In contrast, the Generalized Slowing Hypothesis claims that SLI children have a domain-general deficit affecting processing speed and capacity. Aims: To test these contrasting accounts of SLI we investigate the processing of syntactic (filler-gap) dependencies in wh-questions. Methods & Procedures: Fourteen G-SLI children aged 10;2-17;2, 14 age-matched and 17 vocabulary-matched controls were studied using the cross-modal picture-priming paradigm. Outcomes & Results: G-SLI children's processing speed was significantly slower than that of the age controls, but not of the younger vocabulary controls. The G-SLI children and vocabulary controls did not differ on memory span. However, the typically developing and G-SLI children showed qualitatively different processing patterns. The age and vocabulary controls showed priming at the gap, indicating that they process wh-questions through syntactic filler-gap dependencies. In contrast, G-SLI children showed priming only at the verb. Conclusions: The findings indicate that G-SLI children fail to establish a syntactic filler-gap dependency reliably and instead interpret wh-questions via lexical thematic information. These data challenge the Generalized Slowing Hypothesis account, but support the CGC hypothesis, according to which G-SLI children have a particular deficit in the computational system affecting syntactic dependencies involving 'movement'. As effective remediation often depends on aetiological insight, the discovery of the nature of the syntactic deficit, alongside a possible compensatory use of semantics to facilitate sentence processing, can be used to direct therapy.
However, the therapeutic strategy to be used, and whether similar strengths and weaknesses within the language system are found in other SLI subgroups, are empirical issues that warrant further research.
Abstract:
Despite increasing empirical data to the contrary, it continues to be claimed that the morphosyntax and face-processing skills of people with Williams syndrome are intact. This purported intactness, which coexists with mental retardation, is used to bolster claims about innately specified, independently functioning modules, as if the atypically developing brain were simply a normal brain with some parts intact and some parts impaired. Yet this is highly unlikely, given the dynamics of brain development and the fact that in a genetic microdeletion syndrome the brain develops differently from the moment of conception, throughout embryogenesis, and during postnatal brain growth. In this article, we challenge the intactness assumptions, using evidence from a wide variety of studies of toddlers, children, and adults with Williams syndrome.
Abstract:
Tendering is one of the stages in construction procurement that requires extensive exchange of information and documents. However, tender documents are not always clear in practice. The aim of this study was to ascertain the clarity and adequacy of tender documents used in practice. Access was negotiated into two UK construction firms and the whole tender process for two projects was shadowed for 6-7 weeks in each firm using an ethnographic approach. A significant number of tender queries, amendments and addenda were recorded. This showed that the quality of tender documentation is still a problem in construction despite the existence of standards such as Co-ordinated Project Information (1987) and British Standard 1192 (1984 and 1990) that are meant to help in producing clear and consistent project information. Poor-quality tender documents are a source of inaccurate estimates, claims and disputes on contracts. Six recommendations are presented to help improve the quality of tender documentation. Further research is needed into these recommendations, perhaps in conjunction with an industry-wide investigation into the level of incorporation of CPI principles in practice.
Abstract:
The last 20 years have seen a huge expansion in the number of additional adults working in classrooms in the UK, USA, and other countries. This paper presents the findings of a series of systematic literature reviews about teaching assistants (TAs). The first two reviews focused on stakeholder perceptions of teaching assistants' contributions to academic and social engagement. Stakeholders were pupils, teachers, TAs, headteachers and parents. Perceptions centred on four principal areas to which teaching assistants contribute: pupils' academic and socio-academic engagement; inclusion; maintenance of stakeholder relations; and support for the teacher. The third review explored training. Against a background of patchy training provision in both the UK and the USA, strong claims are made for the benefits to TAs of the training provided, particularly in building confidence and skills. The conclusions include implications for further training and the need for further research to gain an in-depth understanding of precisely how TAs engage with children.
Abstract:
This article engages with the claims of Anne Brubaker that “[n]ow that the dust has settled after the so-called ‘Science Wars’ […] it is an opportune time to reassess the ways in which poststructural theory both argues persuasively for mathematics as a culturally embedded practice – a method as opposed to a metaphysics – and, at the same time, reinscribes realist notions of mathematics as a noise-free description of a mind independent reality.” Through a close re-reading of Jacques Derrida’s work I argue, in alliance with Vicki Kirby’s critique of the work of Brian Rotman, not only that Brubaker misunderstands Derrida’s “writing” but also that her argument constitutes a typical instance of much wider misreadings of Derrida and “poststructuralism” across a range of disciplines in terms of the ways in which her text re-institutes the very stabilities it itself attributes to Derrida’s texts.
Abstract:
In 1999, Elizabeth Hills pointed up the challenges that physically active women on film still posed, in cultural terms and in relation to certain branches of feminist theory. Since then, a remarkable number of emphatically active female heroes have appeared on screen, from 'Charlie’s Angels' to 'Resident Evil', 'Aeon Flux', and the 'Matrix' and 'X-Men' trilogies. Nevertheless, in a contemporary Western culture frequently characterised as postfeminist, these seem to be the ‘acceptable face’ – and body – of female empowerment: predominantly white, heterosexual, often scantily clad, with the traditional hero’s toughness and resolve re-imagined in terms of gender-biased notions of decorum: grace and dignity alongside perfect hair and make-up, and a body that does not display unsightly markers of physical exertion. The homogeneity of these representations is worth investigating in relation to critical claims that valorise such air-brushed, high-kicking ‘action babes’ for their combination of sexiness and strength, and the feminist and postfeminist discourses that are refracted through such readings. Indeed, this arguably ‘safe’ set of depictions, dovetailing so neatly with certain postfeminist notions of ‘having it all’, suppresses particular kinds of spectacles in relation to the active female body: images of physical stress and extension, biological consequences of violence and dangerous motivations are all absent. I argue that the untidy female exertions refused in popular ‘action babe’ representations are now erupting into view in a number of other contemporaneous movies – 'Kill Bill' Vols 1 & 2, 'Monster', and 'Hard Candy' – that mark the return of that which is repressed in the mainstream vision of female power: a more viscerally realistic physicality, rage and aggression. As such, these films engage directly with the issue of how to represent violent female agency.
This chapter explores what is at stake at a representational level and in terms of spectatorial processes of identification in the return of this particularly visceral rendering of the female avenger.
Abstract:
The existence of a specialized imitation module in humans is hotly debated. Studies suggesting a specific imitation impairment in individuals with autism spectrum disorders (ASD) support a modular view. However, the voluntary imitation tasks used in these studies (which require socio-cognitive abilities in addition to imitation for successful performance) cannot support claims of a specific impairment. Accordingly, an automatic imitation paradigm (a ‘cleaner’ measure of imitative ability) was used to assess the imitative ability of 16 adults with ASD and 16 non-autistic matched control participants. Participants performed a prespecified hand action in response to observed hand actions performed either by a human or a robotic hand. On compatible trials the stimulus and response actions matched, while on incompatible trials the two actions did not match. Replicating previous findings, the Control group showed an automatic imitation effect: responses on compatible trials were faster than those on incompatible trials. This effect was greater when responses were made to human than to robotic actions (‘animacy bias’). The ASD group also showed an automatic imitation effect and a larger animacy bias than the Control group. We discuss these findings with reference to the literature on imitation in ASD and theories of imitation.
Abstract:
The premotor theory of attention claims that attentional shifts are triggered during response programming, regardless of which response modality is involved. To investigate this claim, event-related brain potentials (ERPs) were recorded while participants covertly prepared a left or right response, as indicated by a precue presented at the beginning of each trial. Cues signalled a left or right eye movement in the saccade task, and a left or right manual response in the manual task. The cued response had to be executed or withheld following the presentation of a Go/Nogo stimulus. Although there were systematic differences between ERPs triggered during covert manual and saccade preparation, lateralised ERP components sensitive to the direction of a cued response were very similar for both tasks, and also similar to the components previously found during cued shifts of endogenous spatial attention. This is consistent with the claim that the control of attention and of covert response preparation are closely linked. N1 components triggered by task-irrelevant visual probes presented during the covert response preparation interval were enhanced when these probes were presented close to the cued response hand in the manual task, and at the saccade target location in the saccade task. This demonstrates that both manual and saccade preparation result in spatially specific modulations of visual processing, in line with the predictions of the premotor theory.
Abstract:
I argue for the following claims: [1] all uses of I (the word ‘I’ or the thought-element I) are absolutely immune to error through misidentification relative to I. [2] no genuine use of I can fail to refer. Nevertheless [3] I isn’t univocal: it doesn’t always refer to the same thing, or kind of thing, even in the thought or speech of a single person. This is so even though [4] I always refers to its user, the subject of experience who speaks or thinks, and although [5] if I’m thinking about something specifically as myself, I can’t fail to be thinking of myself, and although [6] a genuine understanding use of I always involves the subject thinking of itself as itself, whatever else it does or doesn’t involve, and although [7] if I take myself to be thinking about myself, then I am thinking about myself.
Abstract:
This paper reviews the evidence relating to the question: does the risk of fungicide resistance increase or decrease with dose? The development of fungicide resistance progresses through three key phases. During the ‘emergence phase’ the resistant strain has to arise through mutation and invasion. During the subsequent ‘selection phase’, the resistant strain is present in the pathogen population and the fraction of the pathogen population carrying the resistance increases due to the selection pressure caused by the fungicide. During the final phase of ‘adjustment’, the dose or choice of fungicide may need to be changed to maintain effective control over a pathogen population in which resistance has developed to intermediate levels. Emergence phase: no experimental publications and only one model study report on the emergence phase, and we conclude that work in this area is needed. Selection phase: all the published experimental work, and virtually all model studies, relate to the selection phase. Seven peer-reviewed and four non-peer-reviewed publications report experimental evidence. All show increased selection for fungicide resistance with increased fungicide dose, except for one peer-reviewed publication that does not detect any selection irrespective of dose and one conference-proceedings publication which claims evidence for increased selection at a lower dose. In the mathematical models published, no evidence has been found that a lower dose could lead to a higher risk of fungicide resistance selection. We discuss areas of the dose-rate debate that need further study. These include further work on pathogen-fungicide combinations where the pathogen develops partial resistance to the fungicide, and work on the emergence phase.
Abstract:
Following the US model, the UK has seen considerable innovation in the funding, finance and procurement of real estate in the last decade. In the growing CMBS market, asset-backed securitisations have included $2.25 billion secured on the Broadgate office development and issues secured on Canary Wharf and the Trafford Centre regional mall. Major occupiers (the retailer Sainsbury’s, the retail bank Abbey National) have engaged in innovative sale-and-leaseback and outsourcing schemes. Strong claims are made concerning the benefits of such schemes – e.g. British Land were reported to have reduced their weighted cost of debt by 150bp as a result of the Broadgate issue. The paper reports preliminary findings from a project funded by the Corporation of London and the RICS Research Foundation examining a number of innovative schemes to identify, within a formal finance framework, sources of added value and hidden costs. The analysis indicates that many of the gains claimed conceal costs – in terms of the market value of debt or flexibility of management – while others result from unusual firm or market conditions (for example, utilising the UK long lease and the unusual shape of the yield curve). Nonetheless, there are real gains resulting from the innovations, reflecting arbitrage and institutional constraints in the direct (private) real estate market.
Abstract:
Deception-detection is the crux of Turing’s experiment to examine machine thinking, conveyed through a capacity to respond with sustained and satisfactory answers to unrestricted questions put by a human interrogator. However, in the 60 years to the month since the publication of Computing Machinery and Intelligence, little agreement exists on a canonical format for Turing’s textual game of imitation, deception and machine intelligence. This research raises, from the trapped mine of philosophical claims, counter-claims and rebuttals, Turing’s own distinct five-minute question-answer imitation game, which he envisioned practicalised in two different ways: a) a two-participant, interrogator-witness viva voce; b) a three-participant comparison of a machine with a human, both questioned simultaneously by a human interrogator. Using Loebner’s 18th Prize for Artificial Intelligence contest and Colby et al.’s 1972 transcript-analysis paradigm, this research practicalised Turing’s imitation game with over 400 human participants and 13 machines across three original experiments. Results show that, at the current state of technology, a deception rate of 8.33% was achieved by machines in 60 human-machine simultaneous comparison tests. Results also show that more than 1 in 3 reviewers succumbed to hidden-interlocutor misidentification after reading transcripts from experiment 2. Deception-detection is essential to uncover the increasing number of malfeasant programmes, such as CyberLover, developed to steal identities and financially defraud users in chatrooms across the Internet. Practicalising Turing’s two tests can assist in understanding natural dialogue and mitigating the risk from cybercrime.
Abstract:
Unhealthy diets can lead to various diseases, which in turn can translate into a bigger burden for the state in the form of health services and lost production. Obesity alone has enormous costs and claims thousands of lives every year. Although diet quality in the European Union has improved across countries, it still falls well short of conformity with the World Health Organization dietary guidelines. In this review, we classify types of policy interventions addressing healthy eating and identify through a literature review what specific policy interventions are better suited to improve diets. Policy interventions are classified into two broad categories: information measures and measures targeting the market environment. Using this classification, we summarize a number of previous systematic reviews, academic papers, and institutional reports and draw some conclusions about their effectiveness. Of the information measures, policy interventions aimed at reducing or banning unhealthy food advertisements generally have had a weak positive effect on improving diets, while public information campaigns have been successful in raising awareness of unhealthy eating but have failed to translate the message into action. Nutritional labeling allows for informed choice. However, informed choice is not necessarily healthier; knowing or being able to read and interpret nutritional labeling on food purchased does not necessarily result in consumption of healthier foods. Interventions targeting the market environment, such as fiscal measures and nutrient, food, and diet standards, are rarer and generally more effective, though more intrusive. Overall, we conclude that measures to support informed choice have a mixed and limited record of success. On the other hand, measures to target the market environment are more intrusive but may be more effective.
Abstract:
The article compares Florian Henckel von Donnersmarck's film Das Leben der Anderen (2006) with Kurt Maetzig's early post-war film Ehe im Schatten (1947). The comparison is based on significant narrative and thematic elements which the films share: both have a ‘theatre couple’, representatives of the ‘Bildungsbürgertum’, at the centre of the story; in both cases the couple faces a crisis caused by the first and second German dictatorships respectively, and both then try to solve the crisis by relying on the classical ‘bürgerliches Erbe’, particularly the ‘bürgerliches Trauerspiel’. The extensive use of the ‘bürgerliches Erbe’ in the films activates the function this heritage had for the definition of the German nation in the nineteenth century. However, while Maetzig's film shows how the ‘heritage’ and its representatives fail in the face of National Socialism, von Donnersmarck's film claims the effectiveness of this ‘heritage’ in the fight against the East German dictatorship. Von Donnersmarck thus inverts a critical film tradition of which Ehe im Schatten is an example; furthermore, as this tradition emerged from dealing with the Third Reich, von Donnersmarck's film, it will be argued, is more interested in the redemption of the Nazi past than the East German past.
Abstract:
Based on a series of collages made from images of mega-yachts, the Future Monument looks at the possibility of taking late capitalism more seriously as an ideology than it takes itself. The project asks whether the private display of wealth and power represented by the yacht can be appropriated for a new language of public sculpture. The choreographed live performance of the construction of the large-scale monument was scripted to a proposed capitalist manifesto and took place in a public square in Herzliya, Israel. It aimed to articulate the ideology latent in capitalism’s claims to a neutral manifestation of human nature. The Future Monument project was developed through reading seminars at Goldsmiths College, as part of a research strand headed by Pil and Galia Kollectiv on irony and overidentification within the Political Currency of Art research group. This research has so far produced a series of silk-screen collage prints, a sculpture commissioned by the Essex Council and a live performance commissioned by the Herzliya Biennale. However, the project is ongoing, with future outputs planned including a curated exhibition and conference in 2012, in collaboration with Matthew Poole, Programme Director of the Centre for Curatorial Studies at the University of Essex.