906 results for epistemic marking
Abstract:
Timely feedback is a vital component of the learning process. It is especially important for beginner students in Information Technology, since many have not yet formed an effective internal model of a computer that they can use to construct viable knowledge. Research has shown that learning efficiency increases when immediate feedback is provided to students. Automatic analysis of student programs has the potential to provide immediate feedback to students and to assist teaching staff in the marking process. This paper describes a “fill in the gap” programming analysis framework which tests students’ solutions, gives feedback on their correctness, detects logic errors and provides hints on how to fix these errors. Currently, the framework is used with the Environment for Learning to Program (ELP) system at Queensland University of Technology (QUT); however, it can be integrated into any existing online learning environment or programming Integrated Development Environment (IDE).
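To make the workflow described above concrete, here is a minimal, hypothetical sketch of how a fill-in-the-gap exercise might be checked automatically: the student's answer is substituted into a template, the completed program is run against test cases, and mismatches are turned into hints. The template, test cases and hint wording below are illustrative assumptions, not part of the ELP system.

```python
# Illustrative sketch only; not the ELP implementation.
# The template, test cases and hint wording are hypothetical.

TEMPLATE = """
def total(values):
    result = 0
    for v in values:
        {gap}
    return result
"""

TEST_CASES = [(([1, 2, 3],), 6), (([],), 0), (([-1, 1],), 0)]

def check_gap(student_line: str):
    """Substitute the student's line into the gap, run the tests, return feedback."""
    namespace = {}
    try:
        exec(TEMPLATE.format(gap=student_line), namespace)   # correctness: does it even compile?
    except SyntaxError as err:
        return [f"Your answer is not valid Python: {err.msg}"]
    feedback = []
    for args, expected in TEST_CASES:
        actual = namespace["total"](*args)
        if actual != expected:                               # a failing test signals a logic error
            feedback.append(f"total({args[0]}) returned {actual}, expected {expected}: "
                            "check how 'result' is updated inside the loop")
    return feedback or ["All tests passed."]

print(check_gap("result += v"))   # correct solution
print(check_gap("result = v"))    # logic error: overwrites instead of accumulating
```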
Abstract:
An information filtering (IF) system monitors an incoming document stream to find the documents that match the information needs specified by the user profiles. Learning to use the user profiles effectively is one of the most challenging tasks when developing an IF system. With the document selection criteria better defined according to the users’ needs, filtering large streams of information can be more efficient and effective. To learn the user profiles, term-based approaches have been widely used in the IF community because of their simplicity and directness, and they are relatively well established. However, these approaches have problems when dealing with polysemy and synonymy, which often lead to an information overload problem. Recently, pattern-based approaches (or Pattern Taxonomy Models (PTM) [160]) have been proposed for IF by the data mining community. These approaches are better at capturing semantic information and have shown encouraging results for improving the effectiveness of IF systems. On the other hand, pattern discovery from large data streams is not computationally efficient, and these approaches also have to deal with low-frequency pattern issues. The measures used by data mining techniques (for example, “support” and “confidence”) to learn the profile have turned out to be unsuitable for filtering and can lead to a mismatch problem. This thesis uses rough set-based (term-based) reasoning and a pattern mining approach in a unified framework for information filtering to overcome the aforementioned problems. The system consists of two stages: a topic filtering stage and a pattern mining stage. The topic filtering stage is intended to minimize information overload by filtering out the most likely irrelevant information based on the user profiles. A novel user-profile learning method and a theoretical model of threshold setting have been developed using rough set decision theory. The second stage (pattern mining) aims at solving the problem of information mismatch and is precision-oriented. A new document-ranking function has been derived by exploiting the patterns in the pattern taxonomy, so that the most likely relevant documents are assigned higher scores. Because relatively few documents remain after the first stage, the computational cost is markedly reduced; at the same time, pattern discovery yields more accurate results. The overall performance of the system is improved significantly. The new two-stage information filtering model has been evaluated by extensive experiments based on well-known IR benchmarking processes, using the latest version of the Reuters dataset, namely the Reuters Corpus Volume 1 (RCV1). The performance of the new two-stage model was compared with both term-based and data mining-based IF models. The results demonstrate that the proposed information filtering system significantly outperforms other IF systems, such as the traditional Rocchio IF model, state-of-the-art term-based models including BM25 and Support Vector Machines (SVM), and the Pattern Taxonomy Model (PTM).
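As a rough illustration of the two-stage architecture only (not the thesis's rough set formulas or PTM algorithms), the sketch below discards documents whose term-profile score falls below a threshold and then re-ranks the survivors with weighted term-set patterns. All profiles, patterns, weights and document identifiers are hypothetical.

```python
# Minimal sketch of the two-stage idea, under assumed scoring functions.
from typing import Dict, FrozenSet, List, Set, Tuple

def term_score(doc_terms: Set[str], profile: Dict[str, float]) -> float:
    """Stage 1: simple additive score from a term-based user profile."""
    return sum(w for term, w in profile.items() if term in doc_terms)

def pattern_score(doc_terms: Set[str], patterns: Dict[FrozenSet[str], float]) -> float:
    """Stage 2: a pattern contributes its weight only if all of its terms co-occur."""
    return sum(w for pattern, w in patterns.items() if pattern <= doc_terms)

def two_stage_filter(docs: List[Tuple[str, Set[str]]],
                     profile: Dict[str, float],
                     patterns: Dict[FrozenSet[str], float],
                     threshold: float) -> List[Tuple[str, float]]:
    survivors = [(doc_id, terms) for doc_id, terms in docs
                 if term_score(terms, profile) >= threshold]       # topic filtering
    ranked = [(doc_id, pattern_score(terms, patterns)) for doc_id, terms in survivors]
    return sorted(ranked, key=lambda pair: pair[1], reverse=True)  # precision-oriented ranking

# Toy example with a hypothetical profile and patterns
docs = [("d1", {"stock", "market", "crash"}),
        ("d2", {"football", "market"}),
        ("d3", {"stock", "exchange"})]
profile = {"stock": 1.0, "market": 0.8}
patterns = {frozenset({"stock", "market"}): 2.0, frozenset({"stock"}): 0.5}
print(two_stage_filter(docs, profile, patterns, threshold=1.0))
# -> [('d1', 2.5), ('d3', 0.5)]; 'd2' is filtered out in the first stage.
```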
Abstract:
We argue that web service discovery technology should help the user navigate a complex problem space by providing suggestions for services which they may not be able to formulate themselves because they lack the epistemic resources to do so. Free text documents in service environments provide an untapped source of information for augmenting the epistemic state of the user and hence their ability to search effectively for services. A quantitative approach to semantic knowledge representation is adopted in the form of semantic space models computed from these free text documents. Knowledge of the user’s agenda is promoted by associational inferences computed from the semantic space. The inferences are suggestive and aim to promote human abductive reasoning to guide the user from fuzzy search goals towards a better understanding of the problem space surrounding the given agenda. Experimental results are discussed based on a complex and realistic planning activity.
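The general idea can be sketched with a much simpler stand-in for the semantic space models the paper uses: build vectors from free-text service descriptions and suggest services related to a vague query by similarity. The sketch below uses scikit-learn TF-IDF vectors as that stand-in; the service names, descriptions and query are hypothetical.

```python
# Simplified stand-in for a semantic space: TF-IDF vectors over hypothetical
# service descriptions, with cosine similarity driving the suggestions.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

services = {
    "BookFlight":   "reserve international flights and airline seats",
    "BookHotel":    "reserve hotel rooms and accommodation for travellers",
    "CurrencyConv": "convert currency and exchange rates for travel budgets",
    "WeatherInfo":  "forecast weather conditions for a destination city",
}

vectorizer = TfidfVectorizer(stop_words="english")
matrix = vectorizer.fit_transform(services.values())   # one row per service description

def suggest(query: str, top_n: int = 3):
    """Rank services by similarity to the user's (possibly fuzzy) goal."""
    query_vec = vectorizer.transform([query])
    scores = cosine_similarity(query_vec, matrix).ravel()
    names = list(services)
    return sorted(zip(names, scores), key=lambda p: p[1], reverse=True)[:top_n]

print(suggest("reserve accommodation and check the weather forecast"))
```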
Abstract:
Early in the practice-led research debate, Steven Scrivener (2000, 2002) identified some general differences in the approach of artists and designers undertaking postgraduate research. His distinctions centered on the role of the artefact in problem-based research (associated with design) and creative-production research (associated with artistic practice). Nonetheless, in broader discussions on practice-led research, 'art and design' often continues to be conflated into a single term. In particular, marked differences between art and design methodologies, theoretical framing, research goals and research claims have been underestimated. This paper revisits Scrivener's work and establishes further distinctions between art and design research. It is informed by our own experiences of postgraduate supervision and research methods training, and an empirical study of over sixty postgraduate, practice-led projects completed at the Creative Industries Faculty of QUT between 2002 and 2008. Our reflections have led us to propose that artists and designers work with differing research goals (the evocative and the effective, respectively), which are played out in the questions asked, the creative process, the role of the artefact and the way new knowledge is evidenced. Of course, research projects will have their own idiosyncrasies but, we argue, marking out the poles at each end of the spectrum of art and design provides useful insights for postgraduate candidates, supervisors and methodologists alike.
Abstract:
The automatic extraction of road features from remotely sensed images has been a topic of great interest within the photogrammetric and remote sensing communities for over three decades. Although various techniques have been reported in the literature, it is still challenging to extract road details efficiently as image resolution increases and the requirement for accurate and up-to-date road data grows. In this paper, we focus on the automatic detection of road lane markings, which are crucial for many applications, including lane-level navigation and lane departure warning. The approach consists of four steps: i) data preprocessing, ii) image segmentation and road surface detection, iii) road lane marking extraction based on the generated road surface, and iv) testing and system evaluation. The proposed approach uses the unsupervised ISODATA image segmentation algorithm, which segments the image into vegetation regions and road surface based only on the Cb component of the YCbCr color space. A shadow detection method based on the YCbCr color space is also employed to detect and recover the shadows cast on the road surface by vehicles and trees. Finally, the lane marking features are detected from the road surface using histogram clustering. Experiments applying the proposed method to an aerial imagery dataset of Gympie, Queensland demonstrate the efficiency of the approach.
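A minimal OpenCV sketch of steps ii) and iii) is shown below, with common stand-ins: k-means replaces ISODATA, an Otsu threshold on luminance replaces the histogram clustering, and the shadow-recovery step is omitted. The file name, number of clusters and the choice of which cluster is road are assumptions, not values from the paper.

```python
# Illustrative sketch of the segmentation and marking-extraction steps.
import cv2
import numpy as np

img = cv2.imread("aerial_tile.png")                        # hypothetical input tile
ycrcb = cv2.cvtColor(img, cv2.COLOR_BGR2YCrCb)             # OpenCV channel order: Y, Cr, Cb
y, cb = ycrcb[:, :, 0], ycrcb[:, :, 2]

# Step ii) segment on the Cb component only (k-means as an ISODATA stand-in)
samples = cb.reshape(-1, 1).astype(np.float32)
criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 20, 1.0)
_, labels, centers = cv2.kmeans(samples, 2, None, criteria, 5, cv2.KMEANS_RANDOM_CENTERS)
road_cluster = int(np.argmax(centers))                     # assume grey road has higher mean Cb
                                                           # than vegetation; scene-dependent
road_mask = (labels.reshape(cb.shape) == road_cluster).astype(np.uint8) * 255

# Step iii) extract bright lane markings inside the detected road surface
road_luma = cv2.bitwise_and(y, y, mask=road_mask)
_, markings = cv2.threshold(road_luma, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

cv2.imwrite("road_mask.png", road_mask)
cv2.imwrite("lane_markings.png", markings)
```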
Abstract:
While the need for teamwork skills consistently appears in job advertisements across all sectors, the development of these skills remains, for many university students (and some academic staff), one of the most painful and often complained-about experiences. This presentation introduces the final phase of a project that has investigated and analysed the design of teamwork assessment across all discipline areas in order to provide a university-wide protocol for this important graduate capability. The protocol brings together best-practice guidelines and resources across a range of approaches to team assessment and includes an online diagnostic tool for evaluating the quality of assessment design. Guidelines are provided for all aspects of the design process, such as the development of real-world relevance; choosing the ideal team structure; planning for intervention and conflict resolution; and selecting appropriate marking options. While still allowing academic staff to exercise creativity in assessment design, the guidelines increase the possibility of students’ experiencing a consistent and explicit approach to teamwork throughout their course. If implementation of the protocol is successful, the project team predicts that the resulting consistency and explicitness in approaches to teamwork will lead to more coherent skill development across units, more realistic expectations for students and staff, and better communication between all those participating in the process.
Abstract:
Rapid advancements in the field of genetic science have engendered considerable debate, speculation, misinformation and legislative action worldwide. While programs such as the Human Genome Project bring the prospect of seemingly miraculous medical advancements within imminent reach, they also create the potential for significant invasions of traditional areas of privacy and human dignity by laying the potential foundation for new forms of discrimination in insurance, employment and immigration regulation. The insurance industry, which has, of course, traditionally been premised on discrimination as part of its underwriting process, is proving to be the frontline of this regulatory battle, with extensive legislation, guidelines and debate marking its progress.
Abstract:
De Certeau (1984) constructs the notion of belonging as a sentiment which develops over time through everyday activities. He explains that simple everyday activities are part of the process of appropriation and territorialisation, and suggests that over time belonging and attachment are established and built on memory, knowledge and the experiences of everyday activities. Following de Certeau, non-Indigenous Australians have developed attachment and belonging to places through the dispossession of Aboriginal people and through their everyday practices over the past two hundred years. During this time non-Indigenous people have marked their appropriation and territorialisation with signs, symbols, representations and images. In marking their attachment, they also define how they position Australia’s Indigenous people, by both our presence and our absence. This paper will explore signs and symbols within spaces and places in health services and showcase how they reflect the historical, political, cultural, social and economic values, and power relations, of broader society. It will draw on the voices of Aboriginal women to demonstrate their everyday experiences of such sites. It will conclude by highlighting how Aboriginal people assert their identities and unceded sovereignty within such health sites and actively resist ongoing white epistemological notions of us and the logic of patriarchal white sovereignty.
Abstract:
Interactional research on advice giving has described advice as normative and asymmetric. In this paper we examine how these dimensions of advice are softened by counselors on a helpline for children and young people through the use of questions. Through what we term “advice-implicative interrogatives,” counselors ask clients about the relevance or applicability of a possible future course of action. The allusion to this possible action by the counselor identifies it as normatively relevant, and displays the counselor’s epistemic authority in relation to dealing with a client’s problems. However, the interrogative format mitigates the normative and asymmetric dimensions typical of advice sequences by orienting to the client’s epistemic authority in relation to their own lives, and by delivering advice in a way that is contingent upon the client’s accounts of their experiences, capacities, and understandings. The demonstration of the use of questions in advice sequences offers an interactional specification of the “client-centered” support that is characteristic of prevailing counseling practice. More specifically, it shows how the values of empowerment and child-centered practice, which underpin services such as Kids Helpline, are embodied in specific interactional devices. Detailed descriptions of this interactional practice offer fresh insights into the use of interrogatives in counseling contexts, and provide practitioners with new ways of thinking about, and discussing, their current practices.
Abstract:
This thesis is a problematisation of the teaching of art to young children. To problematise a domain of social endeavour is, in Michel Foucault's terms, to ask how we come to believe that "something ... can and must be thought" (Foucault, 1985:7). The aim is to document what counts (i.e., what is sayable, thinkable, feelable) as proper art teaching in Queensland at this point of historical time. In this sense, the thesis is a departure from more recognisable research on 'more effective' teaching, including critical studies of art teaching and early childhood teaching. It treats 'good teaching' as an effect of moral training made possible through disciplinary discourses organised around certain epistemic rules at a particular place and time. There are four key tasks accomplished within the thesis. The first is to describe an event which is not easily resolved by means of orthodox theories or explanations, either liberal-humanist or critical ones. The second is to indicate how poststructuralist understandings of the self and social practice enable fresh engagements with uneasy pedagogical moments. What follows this discussion is the documentation of an empirical investigation into texts generated by early childhood teachers, artists and parents about what constitutes 'good practice' in art teaching. Twenty-two participants produced text to tell and re-tell the meaning of 'proper' art education from different subject positions. Rather than attempting to capture 'typical' representations of art education in the early years, a pool of 'exemplary' teachers, artists and parents was chosen, using "purposeful sampling", and from this pool three videos were filmed and later discussed by the audience of participants. The fourth aspect of the thesis involves developing a means of analysing these texts in such a way as to allow a 're-description' of the field of art teaching by attempting to foreground the epistemic rules through which such teacher-generated texts come to count as true, i.e., as propriety in art pedagogy. This analysis drew on Donna Haraway's (1995) understanding of 'ironic' categorisation to hold the tensions within the propositions inside the categories of analysis, rather than setting these up as discursive oppositions. The analysis is therefore ironic in the sense that Richard Rorty (1989) understands the term to apply to social scientific research. Three 'ironic' categories were argued to inform the discursive construction of 'proper' art teaching. It is argued that a teacher should (a) Teach without teaching; (b) Manufacture the natural; and (c) Train for creativity. These ironic categories work to undo modernist assumptions about theory/practice gaps and finding a 'balance' between oppositional binary terms. They were produced through a discourse-theoretical reading of the texts generated by the participants in the study, texts that these same individuals use as a means of discipline and self-training as they work to teach properly. In arguing the usefulness of such approaches to empirical data analysis, the thesis challenges early childhood research in arts education in relation to its capacity to deal with ambiguity and to acknowledge contradiction in the work of teachers and in their explanations for what they do. It works as a challenge at a range of levels: at the level of theorising, of method and of analysis.
In opening up thinking about normalised categories, and questioning traditional Western philosophy and the grand narratives of early childhood art pedagogy, it makes a space for re-thinking art pedagogy as "a game of truth and error" (Foucault, 1985). In doing so, it opens up a space for thinking how art education might be otherwise.
Abstract:
Optimal decision-making requires us to accurately pinpoint the basis of our thoughts, e.g. whether they originate from our memory or our imagination. This paper argues that the phenomenal qualities of our subjective experience provide permissible evidence to revise beliefs, particularly as it pertains to memory. I look to the source monitoring literature to reconcile circumstances where mnemic beliefs and mnemic qualia conflict. By separating the experience of remembering from biological facts of memory, unusual cases make sense, such as memory qualia without memory (e.g. déjà vu, false memories) or a failure to have memory qualia with memory (e.g. functional amnesia, unintentional plagiarism). I argue that a pragmatic, probabilistic approach to belief revision is a way to rationally incorporate information from conscious experience, whilst acknowledging its inherent difficulties as an epistemic source. I conclude with a Bayesian defense of source monitoring based on C.I. Lewis’ coherence argument for memorial knowledge.
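As a purely illustrative rendering of the kind of probabilistic belief revision the paper advocates (the two-hypothesis setup and all numbers are assumptions, not the author's), a Bayesian agent could weigh memory qualia Q as evidence for a genuinely experienced event M as follows:

```latex
% Illustrative only: prior P(M) = 0.5, qualia reliability P(Q \mid M) = 0.9,
% false-alarm rate P(Q \mid \neg M) = 0.1 (e.g. deja vu, false memories).
P(M \mid Q)
  = \frac{P(Q \mid M)\,P(M)}{P(Q \mid M)\,P(M) + P(Q \mid \neg M)\,P(\neg M)}
  = \frac{0.9 \times 0.5}{0.9 \times 0.5 + 0.1 \times 0.5}
  = 0.9
```

On this picture, cases such as déjà vu correspond to a nonzero P(Q | ¬M), so mnemic qualia raise confidence in a memory without guaranteeing it.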
Abstract:
Neo-liberalism has become one of the boom concepts of our time. From its original reference point as a descriptor of the economics of “Chicago School” figures such as Milton Friedman, or of authors such as Friedrich von Hayek, neo-liberalism has become an all-purpose descriptor and explanatory device for phenomena as diverse as Bollywood weddings, standardized testing in schools, violence in Australian cinema, and the digitization of content in public libraries. Moreover, it has become an entirely pejorative term: no-one refers to their own views as “neo-liberal”; rather, it refers to the erroneous views held by others, whether they acknowledge this or not. Neo-liberalism as it has come to be used, then, bears many of the hallmarks of a dominant ideology theory in the classical Marxist sense, even if it is often not explored in these terms. This presentation will take the opportunity provided by the English-language publication of Michel Foucault’s 1978–79 lectures, under the title of The Birth of Biopolitics, to consider how he used the term neo-liberalism, and how this equates with its current uses in critical social and cultural theory. It will be argued that Foucault did not understand neo-liberalism as a dominant ideology in these lectures, but rather as marking a point of inflection in the historical evolution of liberal political philosophies of government. It will also be argued that his interpretation of neo-liberalism was more nuanced and more comparative than the more recent uses of Foucault in the literature on neo-liberalism. Finally, it will look at how Foucault develops comparative historical models of liberal capitalism in The Birth of Biopolitics, arguing that this dimension of his work has been lost in more recent interpretations, which tend to retro-fit Foucault to contemporary critiques of either U.S. neo-conservatism or the “Third Way” of Tony Blair’s New Labour in the UK.
Abstract:
In this paper, we present an automatic system for precise urban road model reconstruction based on aerial images with high spatial resolution. The proposed approach consists of two steps: i) road surface detection and ii) road pavement marking extraction. In the first step, a support vector machine (SVM) is used to classify the images into two categories: road and non-road. In the second step, road lane markings are extracted from the generated road surface using 2D Gabor filters. Experiments using several pan-sharpened aerial images of Brisbane, Queensland validate the proposed method.
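The two steps can be sketched as follows, with the caveat that this is not the paper's trained model or parameter set: an SVM labels pixels as road or non-road from simple colour features, and a small bank of 2D Gabor filters then highlights elongated bright markings on the detected road surface. The training samples, file name and filter parameters are hypothetical.

```python
# Illustrative sketch of SVM road detection followed by Gabor-based marking extraction.
import cv2
import numpy as np
from sklearn.svm import SVC

img = cv2.imread("pan_sharpened_tile.png")                  # hypothetical aerial tile

# Step i) road surface detection: SVM trained on a handful of hand-labelled pixels
road_px    = np.array([[90, 92, 95], [110, 112, 115]], dtype=np.float32)  # grey asphalt samples
nonroad_px = np.array([[40, 120, 50], [200, 60, 70]], dtype=np.float32)   # vegetation / roofs
X = np.vstack([road_px, nonroad_px])
y = np.array([1, 1, 0, 0])
svm = SVC(kernel="rbf", gamma="scale").fit(X, y)

pixels = img.reshape(-1, 3).astype(np.float32)
road_mask = svm.predict(pixels).reshape(img.shape[:2]).astype(np.uint8) * 255

# Step ii) lane marking extraction with a 2D Gabor filter bank on the road surface
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
road_gray = cv2.bitwise_and(gray, gray, mask=road_mask)
responses = []
for theta in np.arange(0, np.pi, np.pi / 4):                # four orientations
    kernel = cv2.getGaborKernel((21, 21), sigma=4.0, theta=theta,
                                lambd=10.0, gamma=0.5, psi=0)
    responses.append(cv2.filter2D(road_gray, cv2.CV_32F, kernel))
markings = (np.max(responses, axis=0) > np.percentile(responses, 99)).astype(np.uint8) * 255

cv2.imwrite("svm_road_mask.png", road_mask)
cv2.imwrite("gabor_markings.png", markings)
```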
Abstract:
PURPOSE: This study investigated the effects of simulated visual impairment on nighttime driving performance and pedestrian recognition under real-road conditions. METHODS: Closed-road nighttime driving performance was measured for 20 young, visually normal participants (mean age 27.5 ± 6.1 years) under three visual conditions incorporated in modified goggles: normal vision, simulated cataracts, and refractive blur. The visual acuity levels for the cataract and blur conditions were matched for each participant. Driving measures included sign recognition, avoidance of low-contrast road hazards, time to complete the course, and lane keeping. Pedestrian recognition was measured for pedestrians wearing either black clothing or black clothing with retroreflective markings on the movable joints to create the perception of biological motion ("biomotion"). RESULTS: Simulated visual impairment significantly reduced participants' ability to recognize road signs and avoid road hazards, and increased the time taken to complete the driving course (p < 0.05); the effect was greatest for the cataract condition, even though the cataract and blur conditions were matched for visual acuity. Although visual impairment also significantly reduced the ability to recognize the pedestrian wearing black clothing, the pedestrian wearing "biomotion" was seen 80% of the time. CONCLUSIONS: Driving performance under nighttime conditions was significantly degraded by modest visual impairment; these effects were greatest for the cataract condition. Pedestrian recognition was greatly enhanced by marking limb joints in the pattern of "biomotion," which was relatively robust to the effects of visual impairment.
Abstract:
Neo-liberalism has become one of the boom concepts of our time. From its original reference point as a descriptor of the economics of the ‘Chicago School’ or authors such as Friedrich von Hayek, neo-liberalism has become an all-purpose concept, explanatory device and basis for social critique. This presentation evaluates Michel Foucault’s 1978–79 lectures, published as The Birth of Biopolitics, to consider how he used the term neo-liberalism, and how this equates with its current uses in critical social and cultural theory. It will be argued that Foucault did not understand neo-liberalism as a dominant ideology in these lectures, but rather as marking a point of inflection in the historical evolution of liberal political philosophies of government. It will also be argued that his interpretation of neo-liberalism was more nuanced and more comparative than more recent contributions. The article points towards an attempt to theorize comparative historical models of liberal capitalism.