147 results for Catalan language -- To 1500 -- Word order -- Congresses

in Queensland University of Technology - ePrints Archive


Relevance:

100.00%

Publisher:

Abstract:

My research investigates why nouns are learned disproportionately more frequently than other kinds of words during early language acquisition (Gentner, 1982; Gleitman, et al., 2004). This question must be considered in the context of cognitive development in general. Infants have two major streams of environmental information to make meaningful: perceptual and linguistic. Perceptual information flows in from the senses and is processed into symbolic representations by the primitive language of thought (Fodor, 1975). These symbolic representations are then linked to linguistic input to enable language comprehension and ultimately production. Yet how exactly does perceptual information become conceptualized? Although this question is difficult, there has been progress. One way that children might have an easier job is if they have structures that simplify the data. Thus, if particular sorts of perceptual information could be separated from the mass of input, then it would be easier for children to refer to those specific things when learning words (Spelke, 1990; Pylyshyn, 2003). It would be easier still if linguistic input were segmented in predictable ways (Gentner, 1982; Gleitman, et al., 2004). Unfortunately, the frequency of patterns in lexical or grammatical input cannot explain the cross-cultural and cross-linguistic tendency to favor nouns over verbs and predicates. There are three examples of this failure: 1) a wide variety of nouns are uttered less frequently than a smaller number of verbs and yet are learnt far more easily (Gentner, 1982); 2) word order and morphological transparency offer no insight when one contrasts the sentence structures and word inflections of different languages (Slobin, 1973); and 3) particular language-teaching behaviors (e.g. pointing at objects and repeating names for them) have little impact on children's tendency to prefer concrete nouns in their first fifty words (Newport, et al., 1977). Although the linguistic solution appears problematic, there has been increasing evidence that the early visual system does indeed segment perceptual information in specific ways before the conscious mind begins to intervene (Pylyshyn, 2003). I argue that nouns are easier to learn because their referents directly connect with innate features of the perceptual faculty. This hypothesis stems from work done on visual indexes by Zenon Pylyshyn (2001, 2003). Pylyshyn argues that the early visual system (the architecture of the "vision module") segments perceptual data into pre-conceptual proto-objects called FINSTs. FINSTs typically correspond to physical things such as Spelke objects (Spelke, 1990). Hence, before conceptualization, visual objects are picked out by the perceptual system demonstratively, like a pointing finger indicating ‘this’ or ‘that’. I suggest that this primitive system of demonstration elaborates on Gareth Evans's (1982) theory of nonconceptual content. Nouns are learnt first because their referents attract demonstrative visual indexes. This theory also explains why infants less often name stationary objects such as ‘plate’ or ‘table’, but do name things that attract the focal attention of the early visual system, i.e. small objects that move, such as ‘dog’ or ‘ball’. This view leaves open the question of how blind children learn words for visible objects and why children learn category nouns (e.g. 'dog'), rather than proper nouns (e.g. 'Fido') or higher taxonomic distinctions (e.g. 'animal').

Relevance:

100.00%

Publisher:

Abstract:

The aim of this paper is to provide a comparison of various algorithms and parameters for building reduced semantic spaces. The effect of dimension reduction, the stability of the representation and the effect of word order are examined in the context of five algorithms for producing semantic vectors: random projection (RP), singular value decomposition (SVD), non-negative matrix factorization (NMF), permutations and holographic reduced representations (HRR). The quality of the semantic representation was tested by means of a synonym-finding task using the TOEFL test on the TASA corpus. Dimension reduction was found to improve the quality of the semantic representation, but optimal parameter settings are hard to find. Even though dimension reduction by RP was found to be more generally applicable than SVD, the semantic vectors produced by RP are somewhat unstable. Encoding word order into the semantic vector representation via HRR did not lead to any increase in scores over vectors constructed from contextual word co-occurrence information alone. In this regard, very small context windows resulted in better semantic vectors for the TOEFL test.
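As a rough illustration of two of the reduction methods compared above, the following Python sketch (not the paper's experimental code; the toy co-occurrence matrix, its dimensions and the choice of probe words are assumptions) builds semantic vectors by random projection and by truncated SVD, then scores a TOEFL-style synonym choice by cosine similarity. The seed-dependence of the projection matrix is one source of the RP instability the abstract mentions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a word-by-context co-occurrence matrix (rows = words).
# The paper's experiments use the TASA corpus; these sizes are arbitrary.
X = rng.poisson(1.0, size=(500, 2000)).astype(float)
k = 100  # reduced dimensionality

# Random projection (RP): multiply by a random matrix. Cheap, but the vectors
# change with the seed, which illustrates the instability noted above.
R = rng.standard_normal((X.shape[1], k)) / np.sqrt(k)
words_rp = X @ R

# Truncated SVD: deterministic and costlier; keeps the k strongest dimensions.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
words_svd = U[:, :k] * s[:k]

def cosine(u, v):
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

# A TOEFL-style item: pick the alternative whose vector is closest to the probe.
probe, alternatives = words_svd[0], words_svd[1:5]
best = max(range(len(alternatives)), key=lambda i: cosine(probe, alternatives[i]))
print("chosen alternative:", best)
```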

Relevance:

100.00%

Publisher:

Abstract:

Models of word meaning, built from a corpus of text, have demonstrated success in emulating human performance on a number of cognitive tasks. Many of these models use geometric representations of words to store semantic associations between words. Often, word order information is not captured in these models, and the lack of structural information has been raised as a weakness when performing cognitive tasks. This paper presents an efficient tensor-based approach to modelling word meaning that builds on recent attempts to encode word order information, while providing flexible methods for extracting task-specific semantic information.
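To make the idea of order-sensitive tensor representations concrete, here is a minimal Python sketch of one common scheme: binding word vectors to position (role) vectors with outer products and summing. It illustrates the general technique only, not the specific model proposed in the paper; the toy vocabulary, dimensionality and random vectors are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
d = 64  # assumed dimensionality

# Random vectors for a toy vocabulary, plus fixed position (role) vectors.
vocab = {w: rng.standard_normal(d) for w in ["dog", "bites", "man"]}
positions = [rng.standard_normal(d) for _ in range(10)]

def order_encoding(words):
    """Sum of position-by-word outer products: a tensor encoding that
    distinguishes 'dog bites man' from 'man bites dog'."""
    return sum(np.outer(positions[i], vocab[w]) for i, w in enumerate(words))

def bag_encoding(words):
    """Order-blind baseline: plain sum of word vectors."""
    return sum(vocab[w] for w in words)

s1 = order_encoding(["dog", "bites", "man"])
s2 = order_encoding(["man", "bites", "dog"])
print(np.allclose(bag_encoding(["dog", "bites", "man"]),
                  bag_encoding(["man", "bites", "dog"])))  # True: order lost
print(np.allclose(s1, s2))                                 # False: order kept
```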

Relevance:

100.00%

Publisher:

Abstract:

This paper is part one of a three-part study into the collective regulation processes of players in massively multiplayer online games (MMOG). Traditionally, game playing has not been classed as problematic; however, with the introduction of new media technologies and new ways to play games, certain contexts have become obscure, namely the localised order of ‘playing online’, or how players manage and maintain order between each other as opposed to ‘following the rules’. Principally, this paper will examine concepts of ‘virtual community’. These will be shown to be particularly unhelpful when considering how people conduct themselves in these spaces. Thus, ‘virtual community’ will be seen as critical in implicating various online behaviours as superior to others, causing obscurity and blurring actions. This obscurity is grounded in strong associations in the virtual community as a logic of practice in and of itself; behaviours that fall outside this category become common sense and as such are made invisible for investigation. This paper will draw upon the theories of Basil Bernstein and Pierre Bourdieu to produce a distinction between online behaviours and ultimately make them visible for further investigation. In doing so, this paper seeks to form a basis for future research in which interaction in these spaces can be identified as belonging to a certain framework, to inform the design of online games and applications more effectively.

Relevance:

100.00%

Publisher:

Abstract:

In 1990 the European Community was taken by surprise by the urgency of demands from the newly-elected Eastern European governments to become member countries. Those governments were honouring the mass social movement of the streets the year before, which had demanded free elections and a liberal economic system associated with “Europe”. The mass movement had actually been accompanied by much activity within institutional politics, in Western Europe, the former “satellite” states, the Soviet Union and the United States, to set up new structures, with German reunification and an expanded EC as the centre-piece. This paper draws on the writer’s doctoral dissertation on mass media in the collapse of the Eastern bloc, focused on the Berlin Wall and documenting both public protests and institutional negotiations. For example, the writer, a correspondent in Europe at that time, recounts the interventions of the German Chancellor, Helmut Kohl, at a European summit in Paris nine days after the fall of the “Wall”, and separate negotiations with the French President, Francois Mitterrand, on reunification and EU monetary union after 1992. Through such processes, the “European idea” would receive fresh impetus, though the EU which eventuated came with many altered expectations. It is argued here that, as a result of the shock of 1989, a “social” Europe can be seen emerging as a shared experience of daily life, especially among people born during the last two decades of European consolidation. The paper draws on the author’s major research, in four parts: (1) field observation from the strategic vantage point of a news correspondent, including a treatment of evidence at the time of the wishes and intentions of the mass public (including the unexpected drive to join the European Community) and of governments (e.g. thoughts of a “Tiananmen Square solution” in East Berlin, versus the non-intervention policies of the Soviet leader, Mikhail Gorbachev); (2) a review of coverage of the crisis of 1989 by major news media outlets, treated as a history of the process; (3) as a comparison, and as a test of accuracy and analysis, a review of conventional histories of the crisis appearing a decade later; and (4) a further review, and test, provided by journalists responsible for the coverage at the time, as reflection on practice, obtained from semi-structured interviews.

Relevance:

100.00%

Publisher:

Abstract:

Multilevel converters are used in high-power and high-voltage applications due to their attractive benefits in generating high-quality output voltage. Increasing the number of voltage levels can lead to a reduction in lower-order harmonics. Various modulation and control techniques have been introduced for multilevel converters, such as Space Vector Modulation (SVM), Sinusoidal Pulse Width Modulation (SPWM) and Harmonic Elimination (HE) methods. Multilevel converters may have a DC link with equal or unequal DC voltages. In this paper a new modulation technique based on the harmonic elimination method is proposed for multilevel converters that have unequal DC link voltages. This technique gives better output voltage quality and lower Total Harmonic Distortion (THD) than other modulation techniques. To verify the proposed modulation technique, MATLAB simulations are carried out for a single-phase diode-clamped inverter.
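As a numerical illustration of harmonic elimination with unequal DC link voltages, the sketch below (in Python rather than MATLAB, and not the paper's proposed technique; the per-unit voltages, the fundamental target and the harmonics chosen for elimination are assumptions) solves the standard staircase-waveform equations b_n = (4/(n*pi)) * sum_k V_k * cos(n*theta_k) for three switching angles.

```python
import numpy as np
from scipy.optimize import fsolve

# Assumed per-unit, unequal DC levels and an assumed fundamental target.
V = np.array([1.0, 0.9, 0.8])
b1_target = 2.4

def staircase_harmonic(n, theta):
    # Amplitude of the n-th odd harmonic of a quarter-wave-symmetric
    # staircase with steps V[k] switched on at angles theta[k].
    return 4.0 / (n * np.pi) * np.sum(V * np.cos(n * theta))

def residuals(theta):
    return [staircase_harmonic(1, theta) - b1_target,  # hit the fundamental
            staircase_harmonic(3, theta),              # eliminate the 3rd harmonic
            staircase_harmonic(5, theta)]              # eliminate the 5th harmonic

theta0 = np.array([0.2, 0.6, 1.1])          # initial guess (radians)
theta = np.sort(fsolve(residuals, theta0))
print("switching angles (deg):", np.degrees(theta))
print("residual 3rd/5th harmonics:",
      [round(staircase_harmonic(n, theta), 4) for n in (3, 5)])
```

Whether the solver converges, and to angles inside (0, pi/2), depends on the voltage levels, the target and the initial guess; the paper's technique and its THD comparison are evaluated in MATLAB simulation rather than by a toy calculation like this.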

Relevance:

100.00%

Publisher:

Abstract:

The formation of a venture relies, in part, upon the participants reaching a shared understanding of purpose and process. Yet in circumstances of great complexity and uncertainty, how can such a shared understanding be created? If the response to complexity and uncertainty is to seek simplicity in order to find commonality, then what is lost and what is at risk? Can shared understandings of purpose and process be arrived at by embracing complexity and uncertainty, and if so, how? These questions led us to explore the process of dialogue and communication of a team in its formative stages. Our interests were not centred upon the behavioural characteristics of the individuals in the 'forming' stage of group dynamics, but rather the process of cognitive and linguistic turns, the wax and wane of ideas, and the formation of shared meaning. This process of cognitive and linguistic turns was focused thematically on the areas of foresight, innovation, entrepreneurship, and public policy. This cross-disciplinary exploration sought to identify potential synergies between these domains, in particular in developing a conceptual basis for long-term thinking that can inform wiser public policy.

Relevance:

100.00%

Publisher:

Abstract:

Software development and Web site development techniques have evolved significantly over the past 20 years. The relatively young Web application development area has borrowed heavily from traditional software development methodologies, primarily due to the similarities in the areas of data persistence and User Interface (UI) design. Recent developments in this area propose a new Web Modeling Language (WebML) to facilitate the nuances specific to Web development. WebML is one of a number of implementations designed to enable modeling of Web site interaction flows while being extendable to accommodate new features in Web site development into the future. Our research aims to extend WebML with a focus on stigmergy, a biological term originally used to describe coordination between insects. We see design features in existing Web sites that mimic stigmergic mechanisms as part of the UI. We believe that we can synthesize and embed stigmergy in Web 2.0 sites. This paper focuses on the sub-topic of site UI design and the stigmergic mechanism designs required to achieve this.
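As an indication of the kind of stigmergic UI mechanism referred to above, here is a small Python sketch (purely illustrative; it is not WebML and not a design from the paper) in which links accumulate a pheromone-like trace when users follow them and the trace evaporates over time, so earlier visitors indirectly steer what later visitors see.

```python
import time

class StigmergicLinkRanker:
    """Links accumulate a decaying, pheromone-like trace when clicked;
    ranking by trace lets past activity indirectly shape the UI."""

    def __init__(self, half_life_s=3600.0):
        self.half_life_s = half_life_s
        self.trace = {}  # link id -> (strength, time of last update)

    def _decayed(self, strength, last_t, now):
        return strength * 0.5 ** ((now - last_t) / self.half_life_s)

    def record_click(self, link_id, now=None):
        now = time.time() if now is None else now
        strength, last_t = self.trace.get(link_id, (0.0, now))
        self.trace[link_id] = (self._decayed(strength, last_t, now) + 1.0, now)

    def ranked_links(self, now=None):
        now = time.time() if now is None else now
        return sorted(self.trace,
                      key=lambda lid: self._decayed(*self.trace[lid], now),
                      reverse=True)

ranker = StigmergicLinkRanker()
for link in ["photos", "photos", "forum"]:
    ranker.record_click(link)
print(ranker.ranked_links())  # ['photos', 'forum']
```

The indirection is the point: users never communicate directly, yet the shared, decaying trace coordinates their attention, which is the insect-colony mechanism the term stigmergy originally described.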

Relevance:

100.00%

Publisher:

Abstract:

The decision of Applegarth J in Heartwood Architectural & Joinery Pty Ltd v Redchip Lawyers [2009] QSC 195 (27 July 2009) involved a costs order against solicitors personally. This decision is but one of several recent decisions in which the court has been persuaded that the circumstances justified costs orders against legal practitioners on the indemnity basis. These decisions serve as a reminder to practitioners of their disclosure obligations when seeking any interlocutory relief in an ex parte application. These obligations are now clearly set out in r 14.4 of the Legal Profession (Solicitors) Rule 2007 and r 25 of the 2007 Barristers Rule. Inexperience or ignorance will not excuse breaches of the duties owed to the court.

Relevance:

100.00%

Publisher:

Abstract:

In our rejoinder to Don Weatherburn's paper, "Law and Order Blues", we do not take issue with his advocacy of the need to take crime seriously and to foster a more rational approach to the problems it poses. Where differences do emerge is (1) with his claim that he is willing to do so whilst we (in our different ways) are not; and (2) on the question of what this involves. Of particular concern is the way in which his argument proceeds by a combination of simple misrepresentation of the positions it seeks to disparage, and silence concerning issues of real substance where intellectual debate and exchange would be welcome and useful. Our paper challenges, in turn, the misrepresentation of Indermaur's analysis of trends in violent crime, the misrepresentation of Hogg and Brown's Rethinking Law and Order, the misrepresentation of the findings of some of the research into the effectiveness of punitive policies and the silence on sexual assault in "Law and Order Blues". We suggest that his silence on sexual assault reflects a more widespread unwillingness to acknowledge the methodological problems that arise in the measurement of crime because such problems severely limit the extent to which confident assertions can be made about prevalence and trends.

Relevance:

100.00%

Publisher:

Abstract:

Introduction Many bilinguals will have had the experience of unintentionally reading something in a language other than the intended one (e.g. MUG to mean mosquito in Dutch rather than a receptacle for a hot drink, as one of the possible intended English meanings), of finding themselves blocked on a word for which many alternatives suggest themselves (but, somewhat annoyingly, not in the right language), of their accent changing when stressed or tired and, occasionally, of starting to speak in a language that is not understood by those around them. These instances where lexical access appears compromised and control over language behavior is reduced hint at the intricate structure of the bilingual lexical architecture and the complexity of the processes by which knowledge is accessed and retrieved. While bilinguals might tend to blame word finding and other language problems on their bilinguality, these difficulties per se are not unique to the bilingual population. However, what is unique, and yet far more common than is appreciated by monolinguals, is the cognitive architecture that subserves bilingual language processing. With bilingualism (and multilingualism) the rule rather than the exception (Grosjean, 1982), this architecture may well be the default structure of the language processing system. As such, it is critical that we understand more fully not only how the processing of more than one language is subserved by the brain, but also how this understanding furthers our knowledge of the cognitive architecture that encapsulates the bilingual mental lexicon. The neurolinguistic approach to bilingualism focuses on determining the manner in which the two (or more) languages are stored in the brain and how they are differentially (or similarly) processed. The underlying assumption is that the acquisition of more than one language requires at the very least a change to or expansion of the existing lexicon, if not the formation of language-specific components, and this is likely to manifest in some way at the physiological level. There are many sources of information, ranging from data on bilingual aphasic patients (Paradis, 1977, 1985, 1997) to lateralization (Vaid, 1983; see Hull & Vaid, 2006, for a review), recordings of event-related potentials (ERPs) (e.g. Ardal et al., 1990; Phillips et al., 2006), and positron emission tomography (PET) and functional magnetic resonance imaging (fMRI) studies of neurologically intact bilinguals (see Indefrey, 2006; Vaid & Hull, 2002, for reviews). Following the consideration of methodological issues and interpretative limitations that characterize these approaches, the chapter focuses on how the application of these approaches has furthered our understanding of (1) selectivity of bilingual lexical access, (2) distinctions between word types in the bilingual lexicon and (3) control processes that enable language selection.

Relevance:

100.00%

Publisher:

Abstract:

In this paper, we propose an unsupervised segmentation approach, named "n-gram mutual information" (NGMI), which is used to segment Chinese documents into n-character words or phrases, using language statistics drawn from the Chinese Wikipedia corpus. The approach alleviates the tremendous effort required in preparing and maintaining manually segmented Chinese text for training purposes, and in manually maintaining ever-expanding lexicons. Previously, mutual information was used to achieve automated segmentation into 2-character words; NGMI extends this approach to handle longer n-character words. Experiments with heterogeneous documents from the Chinese Wikipedia collection show good results.
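As a simplified illustration of the mutual-information idea behind NGMI (this is the earlier pairwise formulation mentioned above, not the paper's n-gram extension; the threshold and the corpus placeholder are assumptions), the Python sketch below breaks a string wherever the pointwise mutual information of two adjacent characters drops below a threshold.

```python
import math
from collections import Counter

def pmi_segment(text, corpus, threshold=0.0):
    """Segment `text` by splitting between adjacent characters whose pointwise
    mutual information, estimated from `corpus`, falls below `threshold`."""
    chars = Counter(corpus)
    bigrams = Counter(corpus[i:i + 2] for i in range(len(corpus) - 1))
    n_chars, n_bigrams = sum(chars.values()), sum(bigrams.values())

    def pmi(a, b):
        p_ab = bigrams.get(a + b, 0) / n_bigrams
        p_a, p_b = chars[a] / n_chars, chars[b] / n_chars
        if p_ab == 0 or p_a == 0 or p_b == 0:
            return float("-inf")
        return math.log(p_ab / (p_a * p_b))

    words, current = [], text[0]
    for prev, nxt in zip(text, text[1:]):
        if pmi(prev, nxt) >= threshold:
            current += nxt         # strong association: extend the current word
        else:
            words.append(current)  # weak association: word boundary
            current = nxt
    words.append(current)
    return words

# `corpus` would be a large body of raw Chinese text (the paper uses the
# Chinese Wikipedia); any string works for a quick check.
print(pmi_segment("abcab", corpus="abxabyabzcdcd"))  # -> ['ab', 'c', 'ab']
```

Adjacent characters that co-occur much more often than chance stay together; the NGMI score in the paper generalises this pairwise measure so that words longer than two characters can be recovered.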

Relevance:

100.00%

Publisher:

Abstract:

The word “queer” is a slippery one; its etymology is uncertain, and academic and popular usage attributes conflicting meanings to the word. By the mid-nineteenth century, “queer” was used as a pejorative term for a (male) homosexual. This negative connotation continues when it becomes a term for homophobic abuse. In recent years, “queer” has taken on additional uses: as an all-encompassing term for culturally marginalised sexualities – gay, lesbian, trans, bi, and intersex (“GLBTI”) – and as a theoretical strategy which deconstructs binary oppositions that govern identity formation. Tracing its history, the Oxford English Dictionary notes that the earliest references to “queer” may have appeared in the sixteenth century. These early examples of queer carried negative connotations such as “vulgar,” “bad,” “worthless,” “strange,” or “odd”, and such associations continued until the mid-twentieth century. The early nineteenth century, and perhaps earlier, employed “queer” as a verb, meaning “to put out of order,” “to spoil,” or “to interfere with.” The adjectival form also began to emerge during this time to refer to a person’s condition as being “not normal” or “out of sorts”, or to cause a person “to feel queer”, meaning “to disconcert, perturb, unsettle.” According to Eve Sedgwick (1993), “the word ‘queer’ itself means across – it comes from the Indo-European root – twerkw, which also yields the German quer (traverse), Latin torquere (to twist), English athwart . . . it is relational and strange.” Despite the gaps in the lineage and changes in usage, meaning and grammatical form, “queer” as a political and theoretical strategy has benefited from its diverse origins. It refuses to settle comfortably into a single classification, preferring instead to traverse several categories that would otherwise attempt to stabilise notions of chromosomal sex, gender and sexuality.

Relevance:

100.00%

Publisher:

Abstract:

Two perceptions of the marginality of home economics are widespread across educational and other contexts. One is that home economics and those who engage in its pedagogy are inevitably marginalised within patriarchal relations in education and culture. This is because home economics is characterised as women's knowledge, for the private domain of the home. The other perception is that only orthodox epistemological frameworks of inquiry should be used to interrogate this state of affairs. These perceptions have prompted leading theorists in the field to call for non-essentialist approaches to research in order to re-think the thinking that has produced this cul-de-sac positioning of home economics as a body of knowledge and a site of teacher practice. This thesis takes up the challenge of working to locate a space outside the frame of modernist research theory and methods, recognising that this shift in epistemology is necessary to unsettle the idea that home economics is inevitably marginalised. The purpose of the study is to reconfigure how we have come to think about home economics teachers and the profession of home economics as a site of cultural practice, in order to think it otherwise (Lather, 1991). This is done by exploring how the culture of home economics is being contested from within. To do so, the thesis uses a 'posthumanist' approach, which rejects the conception of the individual as a unitary and fixed entity, viewing it instead as a subject in process, shaped by desires and language which are not necessarily consciously determined. This posthumanist project focuses attention on pedagogical body subjects as the 'unsaid' of home economics research. It works to transcend the modernist dualism of mind/body, and other binaries central to modernist work, including private/public, male/female, paid/unpaid, and valued/unvalued. In so doing, it refuses the simple margin/centre geometry so characteristic of current perceptions of home economics itself. Three studies make up this work. Studies one and two serve to document the disciplined body of home economics knowledge, the governance of which works towards normalisation of the 'proper' home economics teacher. The analysis of these accounts of home economics teachers by home economics teachers reveals that home economics teachers are 'skilled' yet they 'suffer' for their profession. Further, home economics knowledge is seen to be complicit in reinforcing the traditional roles of masculinity and femininity, thereby reinforcing the heterosexual normativity which is central to patriarchal society. The third study looks to four 'atypical' subjects who defy the category of 'proper' and 'normal' home economics teacher. These 'atypical' bodies are 'skilled' but fiercely reject the label of 'suffering'. The discussion of the studies is a feminist poststructural account, using Russo's (1994) notion of the grotesque body, which is emergent from Bakhtin's (1968) theory of the carnivalesque. It draws on the 'shreds' of home economics pedagogy, scrutinising them for their subversive, transformative potential. In this analysis, the giving and taking of pleasure and fun in the home economics classroom presents moments of surprise and of carnival. Foucault's notion of the construction of the ethical individual shows these 'atypical' bodies to be 'immoderate' yet striving hard to be 'continent' body subjects.
This research captures moments of transgression which suggest that transformative moments are already embodied in the pedagogical practices of home economics teachers, and that these can be 'seen' when re-looking through postmodernist lenses. Hence, the cultural practices of home economics as inevitably marginalised are being contested from within. Until now, home economics as a lived culture has failed to recognise possibilities for reconstructing its own field beyond the confines of modernity. This research is an example of how to think about home economics teachers and the profession as a reconfigured cultural practice. Future research about home economics as a body of knowledge and a site of teacher practice need not retell a simple story of oppression. Using postmodernist epistemologies is one way to provide opportunities for new ways of looking.