933 results for Germanic languages.
Abstract:
This abstract is a preliminary discussion of the importance of blending Indigenous cultural knowledges with mainstream knowledges of mathematics to support Indigenous young people. This importance is emphasised in the documents Preparing the Ground for Partnership (Priest, 2005), The Indigenous Education Strategic Directions 2008–2011 (Department of Education, Training and the Arts, 2007) and the National Goals for Indigenous Education (Department of Education, Employment and Workplace Relations, 2008). These documents highlight the contextualising of literacy and numeracy to students’ community and culture (see Priest, 2005). Here, Community describes “a culture that is oriented primarily towards the needs of the group”. Martin Nakata (2007) describes contextualising to culture as concerning that which already exists, that is, Torres Strait Islander community, cultural context and home languages (Nakata, 2007, p. 2). Continuing, Ezeife (2002) cites Hollins (1996) in stating that Indigenous people belong to “high-context culture groups” (p. 185). That is, “high-context cultures are characterized by a holistic (top-down) approach to information processing in which meaning is ‘extracted’ from the environment and the situation. Low-context cultures use a linear, sequential building block (bottom-up) approach to information processing in which meaning is constructed” (p. 185). In this regard, students who use holistic thought processing are more likely to be disadvantaged in mainstream mathematics classrooms. This is because Westernised mathematics is presented as broken into parts, with limited connections made between concepts and with the students’ culture; it potentially conflicts with how they learn. If this is to change, the curriculum needs to be made more culture-sensitive and community orientated, so that students know and understand what they are learning and for what purposes.
Abstract:
How and why visualisations support learning was the subject of this qualitative instrumental collective case study. Five computer programming languages (PHP, Visual Basic, Alice, GameMaker, and RoboLab) supporting differing degrees of visualisation were used as cases to explore the effectiveness of software visualisation in developing fundamental computer programming concepts (sequence, iteration, selection, and modularity). Cognitive theories of visual and auditory processing, cognitive load, and mental models provided a framework in which student cognitive development was tracked and measured by thirty-one 15–17-year-old students, drawn from a Queensland metropolitan secondary private girls’ school, as active participants in the research. Seventeen findings in three sections increase our understanding of the effects of visualisation on the learning process. The study extended the use of mental model theory to track the learning process, and demonstrated the application of student research-based metacognitive analysis of individual and peer cognitive development as a means to support research and as an approach to teaching. The findings also put forward an explanation for failures in previous software visualisation studies; in particular, the study demonstrated that for the cases examined, where complex concepts are being developed, the mixing of auditory (or text) and visual elements can result in excessive cognitive load and impede learning. This finding provides a framework for selecting the most appropriate instructional programming language based on the cognitive complexity of the concepts under study.
Abstract:
Psychologists investigating dreams in non-Western cultures have generally not considered the meanings of dreams within the unique meaning-structure of the person in his or her societal context. The majority of dream studies in African societies are no exception. Researchers approaching dreams within rural Xhosa- and Zulu-speaking societies have adopted either an anthropological or a psychodynamic orientation. The latter approach particularly imposes a Western perspective in the interpretation of dream material. There have been no comparable studies of dream interpretation among urban blacks participating in the African Independent Church Movement. The present study focuses on the rural Xhosa-speaking people and the urban black population who speak one of the Nguni languages and identify with the African Independent Church Movement. The study is concerned with understanding the meanings of dreams within the cultural context in which they occur. The specific aims of the study are: 1. To explicate the indigenous system of dream interpretation as revealed by acknowledged dream experts. 2. To examine the commonalities and the differences between the interpretation of dreams in two groups, drawn from a rural and urban setting respectively. 3. To elaborate upon the life-world of the participants by the interpretations gained from the above investigation. One hundred dreams and interpretations are collected from two categories of participants, referred to as the Rural Group and the Urban Group. The Rural Group is made up of amagqira [traditional healers] and their clients, while the Urban Group consists of prophets and members of the African Independent Churches. Each group includes acknowledged dream experts. A phenomenological methodology is adopted in explicating the data. The methodological procedure involves a number of rigorous stages of explication whereby the original data is reduced to Constituent Profiles leading to the construction of a Thematic Index File.
By searching and reflecting upon the data, interpretative themes are identified. These themes are explicated to provide a rigorous description of the interpretative-reality of each group. Themes explicated within the Rural Group are: the physiognomy of the dreamer's life-world as revealed by ithongo, the interpretation of ithongo as revealed through action, the dream relationship as an anticipatory mode-of-existence, iphupha as disclosing a vulnerable mode-of-being, human bodiliness as revealed in dream interpretations, and the legitimation of the interpretative-reality within the life-world. Themes explicated within the Urban Group are: the physiognomy of the dreamer's life-world revealed in their dream-existence, the interpretative-reality revealed through the enaction of dreams, tension between the newer Christian-based cosmology and the traditional cultural-based cosmology, a moral imperative, prophetic perception and human bodiliness as revealed in dream interpretations, and the legitimation of the interpretative-reality within the life-world. The essence of the interpretative-reality of both groups is very similar and is expressed in the notion of relatedness to a cosmic mode-of-being. The cosmic mode-of-being includes a numinous dimension which is expressed through divine presence in the form of ancestors, Holy Spirit or God. These notions cannot be apprehended by theoretical constructs alone but may be grasped and given form in meaning-disclosing intuitions which are expressed in the life-world in terms of bodiliness, revelatory knowledge, action and healing. Some differences between the two groups are evident and reveal some conflict between the monotheistic Christian cosmology and the traditional cosmology. Unique aspects of the interpretative-reality of the Urban Group are expressed in terms of difficulties in the urban social environment and the notion of a moral imperative.
It is observed that cultural self-expression based upon traditional ideas continues to play a significant role in the urban environment. The apparent conflict revealed between the respective cosmologies underlies an integration of the traditional meanings with Christian concepts. This finding is consistent with the literature suggesting that the African Independent Church is a syncretic movement. The life-world is based upon the immediate and vivid experience of the numinous as revealed in the dream phenomenon. The participants' approach to dreams is not based upon an explicit theory, but upon an immediate and pathic understanding of the dream phenomenon. The understanding is based upon the interpreter's concrete understanding of the life-world, which includes the possibility of cosmic integration and continuity between the personal and transpersonal realms of being. The approach is characterized as an expression of man's primordial attunement with the cosmos. The approach of the participants to dreams may not be consistent with a Western rational orientation but, nevertheless, it is a valid approach. The validity is based upon the immediate life-world of experience, which is intelligible, coherent and, above all, meaning-giving in revealing life-possibility within the context of human existence.
Abstract:
It has been argued that intentional first year curriculum design has a critical role to play in enhancing first year student engagement, success and retention (Kift, 2008). A fundamental first year curriculum objective should be to assist students to make a successful transition to assessment in higher education. Scott (2006) has identified that ‘relevant, consistent and integrated assessment … [with] prompt and constructive feedback’ is particularly relevant to student retention generally; while Nicol (2007) suggests that ‘lack of clarity regarding expectations in the first year, low levels of teacher feedback and poor motivation’ are key issues in the first year. At the very minimum, if we expect first year students to become independent and self-managing learners, they need to be supported in their early development and acquisition of tertiary assessment literacies (Orrell, 2005). Critical to this attainment is alleviating early anxieties around assessment information, instructions, guidance, and performance. This includes, for example: inducting students thoroughly into the academic languages and assessment genres they will encounter as the vehicles for evidencing learning success; and making expectations about the quality of this evidence clear. Most importantly, students should receive regular formative feedback on their work early in their program of study, to aid their learning and to provide information to both students and teachers on progress and achievement.
Leveraging research conducted under an ALTC Senior Fellowship that has sought to articulate a research-based 'transition pedagogy' (Kift & Nelson, 2005) – a guiding philosophy for intentional first year curriculum design and support that carefully scaffolds and mediates the first year learning experience for contemporary heterogeneous cohorts – this paper will discuss theoretical and practical strategies and examples that should be of assistance in implementing good assessment and feedback practices across a range of disciplines in the first year.
Abstract:
Context: The School of Information Technology at QUT has recently undertaken a major restructuring of its Bachelor of Information Technology (BIT) course. The aims of this restructuring include reducing first year attrition and providing an attractive degree course that meets both student and industry expectations. Emphasis has been placed on the first semester in the context of retaining students, by introducing a set of four units that complement one another and provide introductory material on technology, programming and related skills, and generic skills that will aid the students throughout their undergraduate course and in their careers. This discussion relates to one of these four first semester units, namely Building IT Systems. The aim of this unit is to create small Information Technology (IT) systems that use programming or scripting and databases, as either standalone applications or web applications. In the prior history of teaching introductory computer programming at QUT, programming was taught as a standalone subject, and integration of computer applications with other systems such as databases and networks was not undertaken until students had been given a thorough grounding in those topics as well. Feedback has indicated that students do not believe that working with a database requires programming skills. In fact, the teaching of the building blocks of computer applications has been compartmentalized, with each taught in isolation from the others. The teaching of introductory computer programming has been an industry requirement of IT degree courses, as many jobs require at least some knowledge of the topic. Yet computer programming is not a skill that all students have equal capabilities of learning (Bruce et al., 2004), as is clearly shown by the volume of publications dedicated to this topic in the literature over a broad period of time (Eckerdal & Berglund, 2005; Mayer, 1981; Winslow, 1996).
This introductory material has been taught in much the same way over the past thirty years. During the period in which introductory computer programming courses have been taught at QUT, a number of different programming languages and programming paradigms have been used, and different approaches to teaching and learning have been attempted in an effort to find the golden thread that would allow students to learn this complex topic. Unfortunately, computer programming is not a skill that can be learnt in one semester. Some basics can be learnt, but the topic can take many years to master (Norvig, 2001). Faculty data has typically shown a bimodal distribution of results for students undertaking introductory programming courses, with a high proportion of students receiving a high mark and a high proportion receiving a low or failing mark. This indicates that there are students who understand and excel with the introductory material, while another group struggle to understand the concepts and practices required to translate a specification or problem statement into a computer program that achieves what is being requested. The consequence of a large group of students failing the introductory programming course has been a high level of attrition amongst first year students. This attrition does not provide good continuity in student numbers in later years of the degree program, and the current approach is not seen as sustainable.
Abstract:
Oberon-2 is an object-oriented language with a class structure based on type extension. The runtime structure of Oberon-2 is described and the low-level mechanism for dynamic type checking explained. It is shown that the superior type-safety of the language, when used for programming styles based on heterogeneous, pointer-linked data structures, has an entirely negligible cost in runtime performance.
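Oberon-2's dynamic type test consults a type descriptor attached to each record at runtime, which is why its cost can be measured and shown to be negligible. As a rough illustration of the behaviour in question (not Oberon-2 itself), here is a Python sketch in which type extension is modelled with subclassing and the dynamic type test with `isinstance`; all names are hypothetical:

```python
# Hypothetical sketch: Oberon-2-style type extension modelled with Python
# subclassing; isinstance stands in for Oberon-2's dynamic type test (IS).

class Node:                  # base record type
    pass

class IntNode(Node):         # extension of Node, adding a value field
    def __init__(self, value: int):
        self.value = value

def sum_int_nodes(nodes):
    """Walk a heterogeneous, pointer-linked structure, applying a dynamic
    type test before touching the extension's field."""
    total = 0
    for n in nodes:
        if isinstance(n, IntNode):   # analogue of `n IS IntNode`
            total += n.value
    return total

print(sum_int_nodes([IntNode(2), Node(), IntNode(3)]))  # -> 5
```

The point of the abstract is that, with a suitable descriptor layout, each such test reduces to a few comparisons, so type-safe traversal of heterogeneous structures costs essentially nothing at runtime.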
Abstract:
While Business Process Management (BPM) is an established discipline, the increased adoption of BPM technology in recent years has introduced new challenges. One challenge concerns dealing with process model complexity in order to improve the understanding of a process model by stakeholders and process analysts. Features for dealing with this complexity can be classified in two categories: 1) those that are solely concerned with the appearance of the model, and 2) those that in essence change the structure of the model. In this paper we focus on the former category and present a collection of patterns that generalize and conceptualize various existing features. The paper concludes with a detailed analysis of the degree of support of a number of state-of-the-art languages and language implementations for these patterns.
Abstract:
Practice-led or multimodal theses (describing examinable outcomes of postgraduate study which comprise the practice of dancing/choreography with an accompanying exegesis) are an emerging strength of dance scholarship; a form of enquiry that has been gaining momentum for over a decade, particularly in Australia and the United Kingdom. It has been strongly argued that, in this form of research, legitimate claims to new knowledge are embodied predominantly within the practice itself (Pakes, 2003), and that these findings are emergent, contingent and often interstitial, contained within both the material form of the practice and the symbolic languages surrounding the form. In a recent study on ‘dancing’ theses, Phillips, Stock and Vincs (2009) found general agreement among academics and artists that ‘there could be more flexibility in matching written language with conceptual thought expressed in practice’. The authors discuss how the seemingly intangible nature of danced/embodied research, reliant on what Melrose (2003) terms ‘performance mastery’ by the ‘expert practitioner’ (2006, Point 4) involving ‘expert’ intuition (2006, Point 5), might be accessed, articulated and validated in terms of alternative ways of knowing, through exploring an ongoing dialogue in which the danced practice develops emergent theory. They also propose ways in which the danced thesis can be ‘converted’ into the required ‘durable’ artefact which the ephemerality of live performance denies, drawing on Rye’s ‘multi-view’ digital record (2003) and Stapleton’s ‘multi-voiced audio visual document’ (2006, p. 82).
Building on a two-year research project (2007-2008), Dancing Between Diversity and Consistency: Refining Assessment in Postgraduate Degrees in Dance, which examined such issues in relation to assessment in an Australian context, the three researchers have further explored issues around interdisciplinarity, cultural differences and documentation through engaging with the following questions: How do we represent research in which understandings, meanings and findings are situated within the body of the dancer/choreographer? Do these need a form of ‘translating’ into textual form in order to be accessed as research? What kinds of language structures can be developed to effect this translation: metaphor, allusion, symbol? How important is contextualising the creative practice? How do we incorporate differing cultural inflections and practices into our reading and evaluation? What kind of layered documentation can assist in producing a ‘durable’ research artefact from a non-reproducible live event?
Abstract:
This thesis proposes that contemporary printmaking, at its most significant, marks the present through reconstructing pasts and anticipating futures. It argues this through examples in the field, occurring in contexts beyond the Euramerican (Europe and North America). The arguments revolve around how the practice of a number of significant artists in Japan, Australia and Thailand has generated conceptual and formal innovations in printmaking that transcend local histories and conventions, whilst paradoxically also building upon them and creating new meanings. The arguments do not portray the relations between contemporary and traditional art as necessarily antagonistic but rather as productively dialectical. Furthermore, the case studies demonstrate that, in the 1980s and 1990s particularly, the studio practice of these printmakers was informed by other visual arts disciplines and reflected postmodern concerns. Departures from convention witnessed in these countries within the Asia-Pacific region shifted the field of the print into a heterogeneous and hybrid realm. The practitioners concerned (especially in Thailand) produced work that was more readily equated with performance and installation art than with printmaking per se. In Japan, the incursion of photography interrupted the decorative cast of printmaking and delivered it from a straightforward, craft-based aesthetic. In Australia, fixed notions of national identity were challenged by print practitioners through deliberate cultural rapprochements and technical contradictions (speaking across old and new languages). However, time-honoured print methods were not jettisoned by any of the case study artists. Their re-alignment of the fundamental attributes of printmaking, in line with materialist formalism, is a core consideration of my arguments. 
The artists selected for in-depth analysis from these three countries are all innovators whose geographical circumstances and creative praxis drew on local traditions whilst absorbing international trends. In their radical revisionism, they acknowledged the specificity of history and place, conditions of contingency and forces of globalisation. The transformational nature of their work during the late twentieth century connects it to the postmodern ethos and to a broader artistic and cultural nexus than has hitherto been recognised in literature on the print. Emerging from former guild-based practices, they ambitiously conceived their work to be part of a continually evolving visual arts vocabulary. I argue in this thesis that artists from the Asia-Pacific region have historically broken with the hermetic and Euramerican focus that has generally characterised the field. Inadequate documentation and access to print activity outside the dominant centres of critical discourse imply that readings of postmodernism have been too limited in their scope of inquiry. Other locations offer complexities of artistic practice where re-alignments of customary boundaries are often the norm. By addressing innovative activity in Japan, Australia and Thailand, this thesis exposes the need for a more inclusive theoretical framework and wider global reach than currently exists for ‘printmaking’.
Abstract:
Privacy enhancing protocols (PEPs) are a family of protocols that allow secure exchange and management of sensitive user information. They are important in preserving users’ privacy in today’s open environment. Proof of the correctness of PEPs is necessary before they can be deployed. However, the traditional provable security approach, though well established for verifying cryptographic primitives, is not applicable to PEPs. We apply the formal method of Coloured Petri Nets (CPNs) to construct an executable specification of a representative PEP, namely the Private Information Escrow Bound to Multiple Conditions Protocol (PIEMCP). Formal semantics of the CPN specification allow us to reason about various security properties of PIEMCP using state space analysis techniques. This investigation provides us with preliminary insights for modeling and verification of PEPs in general, demonstrating the benefit of applying the CPN-based formal approach to proving the correctness of PEPs.
Abstract:
The Thai written language is one of the languages that do not have word boundaries. In order to discover the meaning of a document, all text must be separated into syllables, words, sentences, and paragraphs. This paper develops a novel method to segment Thai text by combining a non-dictionary-based technique with a dictionary-based technique. The method first applies Thai grammar rules to the text to identify syllables. A hidden Markov model is then used to merge possible syllables into words. The identified words are verified against a lexical dictionary, and a decision tree is employed to discover words not identified by the lexical dictionary. Documents used in the litigation process of Thai court proceedings were used in the experiments. The results (segmented words) obtained by the proposed method outperform those obtained by other existing methods.
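The abstract's pipeline combines grammar rules, a hidden Markov model, a dictionary check and a decision tree; without those details, the following Python sketch illustrates only the simplest dictionary-based stage, greedy longest matching with a fallback for unknown characters. The tiny lexicon and the Latin-letter input are invented for readability and are not from the paper:

```python
# Simplified sketch of dictionary-based longest matching, one stage of the
# kind of hybrid segmentation pipeline described above (HMM and decision-tree
# stages omitted). The lexicon below is hypothetical.

LEXICON = {"the", "thai", "court", "ruled", "rule"}

def longest_match_segment(text, lexicon, max_len=10):
    """Greedy longest-match segmentation of boundary-less text; characters
    not covered by the lexicon become single-character tokens that a later
    stage could merge or classify."""
    tokens, i = [], 0
    while i < len(text):
        for length in range(min(max_len, len(text) - i), 0, -1):
            if text[i:i + length] in lexicon:
                tokens.append(text[i:i + length])
                i += length
                break
        else:
            tokens.append(text[i])   # unknown: emit a single character
            i += 1
    return tokens

print(longest_match_segment("thaicourtruled", LEXICON))  # -> ['thai', 'court', 'ruled']
```

Greedy matching alone mis-segments when a long dictionary entry overlaps a better parse, which is precisely why the paper layers statistical (HMM) and decision-tree stages on top of the dictionary.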
Abstract:
In this paper, we classify, review, and experimentally compare major methods that are exploited in the definition, adoption, and utilization of element similarity measures in the context of XML schema matching. We aim to present a unified view which is useful when developing a new element similarity measure, when implementing an XML schema matching component, when using an XML schema matching system, and when comparing XML schema matching systems.
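As a concrete example of the general kind of element similarity measure such a survey compares, here is a Python sketch of one simple name-based measure: normalised string similarity after stripping common XML naming punctuation. It is illustrative only and is not claimed to be one of the paper's specific measures:

```python
# Illustrative name-based element similarity for XML schema matching.
# One simple measure of the surveyed kind, not a measure from the paper.
from difflib import SequenceMatcher

def name_similarity(a: str, b: str) -> float:
    """Return a similarity score in [0, 1] for two element names,
    after lowercasing and stripping underscores and hyphens."""
    def norm(s: str) -> str:
        return s.lower().replace("_", "").replace("-", "")
    return SequenceMatcher(None, norm(a), norm(b)).ratio()

print(name_similarity("PurchaseOrder", "purchase_order"))  # -> 1.0
print(round(name_similarity("Customer", "Client"), 2))     # below 1.0
```

Purely syntactic measures like this miss synonymy ("Customer" vs "Client"), which is why matching systems typically combine name-based, structural and semantic (thesaurus-backed) measures.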
Abstract:
One of the classic forms of intermediate representation used for communication between compiler front-ends and back-ends is that based on abstract stack machines. It is possible to compile the stack machine instructions into machine code by means of an interpretive code generator, or to simulate the stack machine at runtime using an interpreter. This paper describes an approach intermediate between these two extremes. The front-end of a commercial Modula 2 compiler was ported to the "industry standard PC", and a partially compiling back-end written. The object code runs with the assistance of an interpreter, but may be linked with libraries which are fully compiled. The intent was to provide a programming environment on the PC identical to that of the same compilers on 32-bit UNIX machines. This objective has been met, and the compiler is available to educational institutions as freeware. The design basis of the new compiler is described, and its performance critically evaluated.
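The stack-machine intermediate form described above can be made concrete with a toy example. The following Python sketch interprets a hypothetical three-opcode stack machine of the kind a front-end might emit; the instruction set is invented for illustration and is not the Modula 2 compiler's actual intermediate code:

```python
# Toy abstract stack machine of the kind used as compiler intermediate code,
# simulated at runtime by an interpreter. Opcodes are hypothetical.

def run(program):
    """Execute a list of (opcode, *operands) tuples and return the
    value left on top of the evaluation stack."""
    stack = []
    for op, *args in program:
        if op == "PUSH":
            stack.append(args[0])
        elif op == "ADD":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "MUL":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        else:
            raise ValueError(f"unknown opcode {op}")
    return stack[-1]

# (2 + 3) * 4, as a front-end might emit it:
print(run([("PUSH", 2), ("PUSH", 3), ("ADD",), ("PUSH", 4), ("MUL",)]))  # -> 20
```

A fully interpretive back-end dispatches every such instruction at runtime; the paper's intermediate approach instead compiles part of the code while letting the rest run under an interpreter linked with fully compiled libraries.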
Abstract:
Programs written in languages of the Oberon family usually contain runtime tests on the dynamic type of variables. In some cases it may be desirable to reduce the number of such tests. Typeflow analysis is a static method of determining bounds on the types that objects may possess at runtime. We show that this analysis is able to reduce the number of tests in certain plausible circumstances. Furthermore, the same analysis is able to detect certain program errors at compile time, which would normally only be detected at program execution. This paper introduces the concepts of typeflow analysis and details its use in the reduction of runtime overhead in Oberon-2.
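The core idea of typeflow analysis, statically bounding the set of dynamic types a variable may hold and removing runtime tests that the bound already decides, can be sketched as follows. This Python sketch is a deliberate simplification with hypothetical names; the paper's analysis operates on Oberon-2 programs, not Python:

```python
# Minimal sketch of the typeflow idea: track the possible dynamic types of a
# variable and drop runtime tests the static bound already guarantees.

class Base:
    pass

class Ext(Base):      # analogue of Oberon-style type extension
    pass

def types_after_assign(declared, assigned_types):
    """Typeflow fact: after `x := e`, x's possible dynamic types are the
    possible types of e that conform to x's declared type."""
    return {t for t in assigned_types if issubclass(t, declared)}

def is_test_redundant(possible_types, tested):
    """A runtime test `x IS tested` is redundant when every type the
    analysis allows for x already extends `tested`."""
    return all(issubclass(t, tested) for t in possible_types)

# x: Base := NEW(Ext)  ->  the analysis narrows x's type set to {Ext}
facts = types_after_assign(Base, {Ext})
print(is_test_redundant(facts, Ext))  # -> True: the test can be removed
```

The same bound supports the error detection the abstract mentions: if the possible-type set for a test is disjoint from the tested type, the test can never succeed and the compiler can report the error statically.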
Abstract:
Luna is an object-oriented language. Unlike many other object-oriented languages, it does not have a conventional procedural language as a base. It is strongly typed and modular. The elegance of Luna is that it is entirely reference based: there are no static objects. Luna is similar to Oberon in that inheritance and subtyping are based on type extension.