277 results for Abductive reasoning
Abstract:
Background: Selection of candidates for clinical psychology programmes is arguably the most important decision made in determining the clinical psychology workforce. However, there are few models to inform the development of selection tools to support selection procedures. Using a factor analytic approach, this study operationalised a model of applicants’ capabilities. Method: Eighty-eight applicants for entry into a postgraduate clinical psychology programme were assessed on a series of tasks measuring eight capabilities: guided reflection, communication skills, ethical decision making, writing, conceptual reasoning, empathy, awareness of mind, and self-observation. Results: Factor analysis revealed three capability factors: “awareness”, accounting for 35.71% of variance; “reflection”, accounting for 20.56%; and “reasoning”, accounting for 18.24%. Fourth-year grade point average (GPA) did not correlate with performance on any of the selection capabilities other than a weak correlation with performance on the ethics capability. Conclusions: Eight selection capabilities are identified for the selection of candidates, independent of GPA. While the model is tentative, it is hoped that the findings will stimulate the development and validation of assessment procedures with good predictive validity that will benefit the training of clinical psychologists and, ultimately, effective service delivery.
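For readers unfamiliar with the method, the sketch below outlines an exploratory factor analysis of the kind this abstract describes. The eight capability names are taken from the abstract; the applicant scores are random placeholders, and the varimax rotation and three-factor extraction settings are assumptions, since the study's software and configuration are not reported here.

```python
# A minimal sketch, assuming random placeholder data for the 88 applicants
# and a varimax-rotated three-factor solution (not the study's actual setup).
import numpy as np
import pandas as pd
from sklearn.decomposition import FactorAnalysis

capabilities = [
    "guided_reflection", "communication", "ethical_decision_making", "writing",
    "conceptual_reasoning", "empathy", "awareness_of_mind", "self_observation",
]
rng = np.random.default_rng(0)
scores = pd.DataFrame(rng.normal(size=(88, 8)), columns=capabilities)

# Extract three factors, matching the reported three-factor solution.
fa = FactorAnalysis(n_components=3, rotation="varimax", random_state=0)
fa.fit(scores)

# The loading matrix shows which tasks cluster onto each factor
# (labelled "awareness", "reflection", and "reasoning" in the study).
loadings = pd.DataFrame(fa.components_.T, index=capabilities,
                        columns=["factor_1", "factor_2", "factor_3"])
print(loadings.round(2))
print("Noise variance per task:", np.round(fa.noise_variance_, 2))
```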
Abstract:
This chapter argues for the need to restructure children’s statistical experiences from the beginning years of formal schooling. The ability to understand and apply statistical reasoning is paramount across all walks of life, as seen in the variety of graphs, tables, diagrams, and other data representations requiring interpretation. Young children are immersed in our data-driven society, with early access to computer technology and daily exposure to the mass media. With the rate of data proliferation have come increased calls for advancing children’s statistical reasoning abilities, commencing with the earliest years of schooling (e.g., Langrall et al. 2008; Lehrer and Schauble 2005; Shaughnessy 2010; Whitin and Whitin 2011). Several articles (e.g., Franklin and Garfield 2006; Langrall et al. 2008) and policy documents (e.g., National Council of Teachers of Mathematics 2006) have highlighted the need for a renewed focus on this component of early mathematics learning, with children working mathematically and scientifically in dealing with real-world data. One approach to this component in the beginning school years is through data modelling (English 2010; Lehrer and Romberg 1996; Lehrer and Schauble 2000, 2007)...
Abstract:
This thesis examines the construction of Aboriginality in recent public policy reasoning through identifying representations deployed by architects and supporters of the Commonwealth’s 2007 Northern Territory Emergency Response (the intervention). Debate about the Northern Territory intervention was explicitly situated in relation to a range of ideas about appropriate Government policy towards Indigenous people, and particularly about the nature, role, status, value and future of Aboriginality and of Aboriginal people and Torres Strait Islanders. This project involves analysis of constructions of Aboriginality deployed in texts created and circulated to explain and justify the policy program. The aim of the project is to identify the ideas about Aboriginality deployed by the intervention’s architects and supporters, and to examine the effects and implications of these discourses for political relationships between Indigenous people and settlers in Australia. This thesis will argue that advocates of the Northern Territory intervention construct Aboriginality in a range of important ways that reassert and reinforce the legitimacy of the settler colonial order and the project of Australian nationhood, and operate to limit Aboriginal claims. Specifically, it is argued that in linking Aboriginality to the abuse of Aboriginal children, the intervention’s advocates and supporters establish a political debate about the nature and future of Aboriginality within a discursive terrain in which the authority and perspectives of Indigenous people are problematised. Aboriginality is constructed in this process as both temporally and spatially separated from settler society, and in need of coercive integration into mainstream economic and political arrangements. Aboriginality is depicted by settler advocates of intervention as an anachronism, with Aboriginal people and cultures understood as primitive and/or savage precursors to settlers who are represented as modern and civilised. As such, the communities seen as the authentic home or location of Aboriginality represent a threat to Aboriginal children as well as to settlers. These constructions function to obscure the violence of the settler order, provide justification or moral rehabilitation for the colonising project, and reassert the sovereignty of the settler state. The resolution offered by the intervention’s advocates is a performance or enactment of settler sovereignty, representing a claim over and through both the territory of Aboriginal people and the discursive terrain of nationhood.
Abstract:
Asking why is an important foundation of inquiry and fundamental to the development of reasoning skills and learning. Despite this, and despite the relentless and often disruptive nature of innovations in information and communications technology (ICT), sophisticated tools that directly support this basic act of learning appear to be undeveloped, not yet recognized, or in the very early stages of development. Why is this so? To this question, there is no single satisfactory answer; instead, numerous plausible explanations and related questions arise. After learning something, however, explaining why can be revealing of a person’s understanding (or lack of it). What then differentiates explanation from information; and, explanatory from descriptive content? What ICT scaffolding might support inquiry instigated by why-questioning? What is the role of reflective practice in inquiry-based learning? These and other questions have emerged from this investigation and underscore that why-questions often propagate further questions and are a catalyst for cognitive engagement and dialogue. This paper reports on a multi-disciplinary, theoretical investigation that informs the broad discourse on e-learning and points to a specific frontier for design and development of e-learning tools. Probing why reveals that versatile and ambiguous semantics present the core challenge – asking, learning, knowing, understanding, and explaining why.
Abstract:
The policy objectives of the continuous disclosure regime augmented by the misleading or deceptive conduct provisions in the Corporations Act are to enhance the integrity and efficiency of Australian capital markets by ensuring equality of opportunity for all investors through public access to accurate and material company information to enable them to make well-informed investment decisions. This article argues that there were failures by the regulators in the performance of their roles to protect the interests of investors in Forrest v ASIC; FMG v ASIC (2012) 247 CLR 486: ASX failed to enforce timely compliance with the continuous disclosure regime and ensure that the market was properly informed by seeking immediate clarification from FMG as to the agreed fixed price and/or seeking production of a copy of the CREC agreement; and ASIC failed to succeed in the High Court because of the way it pleaded its case. The article also examines the reasoning of the High Court in Forrest v ASIC and whether it might have changed previous understandings of the Campomar test for determining whether representations directed to the public generally are misleading.
Abstract:
Recent research from within a neo-Piagetian perspective proposes that novice programmers pass through the sensorimotor and preoperational stages before being able to reason at the concrete operational stage. However, academics traditionally teach and assess introductory programming as if students commence at the concrete operational stage. In this paper, we present results from a series of think aloud sessions with a single student, known by the pseudonym “Donald”. We conducted the sessions mainly over one semester, with an additional session three semesters later. Donald first manifested predominantly sensorimotor reasoning, followed by preoperational reasoning, and finally concrete operational reasoning. This longitudinal think aloud study of Donald is the first direct observational evidence of a novice programmer progressing through the neo-Piagetian stages.
Abstract:
There is no doubt that social engineering plays a vital role in compromising most security defenses and in attacks on people, organizations, companies, and even governments. It is the art of deceiving and tricking people into revealing critical information or performing an action that benefits the attacker in some way. Fraudulent and deceptive people have been using social engineering traps and tactics through information technology such as e-mails, social networks, web sites, and applications to trick victims into obeying them, accepting threats, and falling victim to various crimes and attacks such as phishing, sexual abuse, financial abuse, identity theft, impersonation, physical crime, and many other forms of attack. Although organizations, researchers, practitioners, and lawyers recognize the severe risk of social engineering-based threats, there is a severe lack of understanding and control of such threats. Part of the problem is perhaps the unclear concept of social engineering, as well as the complexity of understanding how humans behave toward, approach, accept, and fail to recognize threats or the deception behind them. The aim of this paper is to explain the definition of social engineering by drawing on theories from related disciplines such as psychology, sociology, information technology, marketing, and behaviourism. We hope, by this work, to help researchers, practitioners, lawyers, and other decision makers to gain a fuller picture of social engineering and, therefore, to open new directions of collaboration toward detecting and controlling it.
Abstract:
Philosophical inquiry in the teaching and learning of mathematics has received continued, albeit limited, attention over many years (e.g., Daniel, 2000; English, 1994; Lafortune, Daniel, Fallascio, & Schleider, 2000; Kennedy, 2012a). The rich contributions that communities of philosophical inquiry can offer school mathematics, however, have not received the recognition they deserve, especially from the mathematics education community. This is a perplexing situation given the close relationship between the two disciplines and their shared values for empowering students to solve a range of challenging problems, often unanticipated, and often requiring broadened reasoning. In this article, I first present my understanding of philosophical inquiry as it pertains to the mathematics classroom, taking into consideration the significant work that has been undertaken on socio-political contexts in mathematics education (e.g., Skovsmose & Greer, 2012). I then consider one approach to advancing philosophical inquiry in the mathematics classroom, namely, through modelling activities that require interpretation, questioning, and multiple approaches to solution. The design of these problem activities, set within life-based contexts, provides an ideal vehicle for stimulating philosophical inquiry.
Abstract:
The Pattern and Structure Mathematics Awareness Project (PASMAP) has investigated the development of patterning and early algebraic reasoning among 4- to 8-year-olds over a series of related studies. We assert that an awareness of mathematical pattern and structure (AMPS) enables mathematical thinking and simple forms of generalization from an early age. This paper provides an overview of key findings of the Reconceptualizing Early Mathematics Learning empirical evaluation study involving 316 Kindergarten students from 4 schools. The study found highly significant differences on PASA scores for PASMAP students. Analysis of structural development showed increased levels for the PASMAP students; those categorised as low-ability developed improved structural responses over a short period of time.
Abstract:
This paper explores how the amalgamated wisdom of East and West can instigate a wisdom-based renaissance of humanistic epistemology (Rooney & McKenna, 2005) to provide a platform of harmony in managing knowledge-worker productivity, one of the biggest management challenges of the 21st century (Drucker, 1999). The paper invites further discussion from the social and business research communities on the significance of the "interpretation realism" technique in comprehending the philosophies of Lao Tzu, Confucius, and Sun Tzu (Lao/Confucius/Sun), written in Classical Chinese. The paper concludes with a call to build prudent, responsible management practices which affect the daily lives of many (Rooney & McKenna, 2005) in today's knowledge-based economy. Interpretation realism is applied to an analysis of three Chinese classics of Lao/Confucius/Sun which have been embodied in Chinese culture for over 2,500 years. Comprehending Lao/Confucius/Sun's philosophies is the first step towards understanding Classical Chinese culture. However, interpreting Chinese subtlety in language and the yin-and-yang circular synthesis in its mode of thinking is very different from understanding Western thought, with its open communication and its linear, analytical pattern of Aristotelian/Platonic wisdom (Zuo, 2012). Furthermore, Eastern ways of communication are relatively indirect and mediatory in culture, whereas Western ways of communication are relatively direct and litigious (Goh, 2002). Lao/Confucius/Sun's philosophies are also difficult to comprehend because there are four written Chinese formats (Pre-classical Chinese, Classical Chinese, Literary Chinese, and modern Vernacular Chinese) and over 250 dialects. Because Classical Chinese is poetic, comprehension requires a mixed approach of interpretation realism combining the logical reasoning behind "word splitting", "word occurrences", "empathetic metaphor", and "poetic appreciation of words".
Abstract:
During the evolution of the music industry, developments in the media environment have required music firms to adapt in order to survive. Changes in broadcast radio programming during the 1950s, the Compact Cassette during the 1970s, and the deregulation of media ownership during the 1990s are all examples of changes which have heavily affected the music industry. This study explores similar contemporary dynamics, examines how decision makers in the music industry perceive and make sense of the developments, and reveals how they revise their business strategies based on their mental models of the media environment. A qualitative system dynamics model is developed in order to support the reasoning brought forward by the study. The model is empirically grounded, but is also based on previous music industry research and a theoretical platform constituted by concepts from evolutionary economics and the sociology of culture. The empirical data primarily consist of 36 personal interviews with decision makers in the American, British and Swedish music industrial ecosystems. The study argues that the proposed model explains contemporary music industry dynamics more effectively than the music industry models presented by previous research initiatives. Supported by the model, the study is able to show how “new” media outlets make old music business models obsolete and challenge the industry’s traditional power structures. It is no longer possible to expose music at one outlet (usually broadcast radio) in the hope that it will lead to sales of the same music at another (e.g. a compact disc). The study shows that many music industry decision makers still have not embraced the new logic, and have not yet challenged their traditional mental models of the media environment. Rather, they remain focused on preserving the pivotal role held by the CD and other physical distribution technologies. Further, the study shows that while many music firms remain attached to the old models, other firms, primarily music publishers, have accepted the transformation and have reluctantly recognised the realities of a virtualised environment.
Abstract:
Recent studies have linked the ability of novice (CS1) programmers to read and explain code with their ability to write code. This study extends earlier work by asking CS2 students to explain object-oriented data structures problems that involve recursion. Results show a strong correlation between ability to explain code at an abstract level and performance on code writing and code reading test problems for these object-oriented data structures problems. The authors postulate that there is a common set of skills concerned with reasoning about programs that explains the correlation between writing code and explaining code. The authors suggest that an overly exclusive emphasis on code writing may be detrimental to learning to program. Non-code writing learning activities (e.g., reading and explaining code) are likely to improve student ability to reason about code and, by extension, improve student ability to write code. A judicious mix of code-writing and code-reading activities is recommended.
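The sketch below illustrates the kind of correlation analysis this abstract reports, under stated assumptions: the per-student scores are random placeholders, and Pearson's r stands in for whatever statistic the study actually used.

```python
# A minimal sketch, assuming synthetic per-student scores and Pearson's r;
# the paper's actual scoring scheme and statistics are not reproduced here.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
explain = rng.uniform(0, 10, size=60)           # "explain the code" scores (placeholder)
writing = explain + rng.normal(0, 2, size=60)   # correlated code-writing scores (placeholder)
reading = explain + rng.normal(0, 2, size=60)   # correlated code-reading scores (placeholder)

for name, skill in [("code writing", writing), ("code reading", reading)]:
    r, p = pearsonr(explain, skill)
    print(f"explain vs {name}: r = {r:.2f}, p = {p:.3g}")
```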
Abstract:
Objective: To develop a system for the automatic classification of pathology reports for Cancer Registry notifications. Method: A two-pass approach is proposed to classify whether pathology reports are cancer notifiable or not. The first pass queries pathology HL7 messages for known report types that are received by the Queensland Cancer Registry (QCR), while the second pass aims to analyse the free-text reports and identify those that are cancer notifiable. Cancer Registry business rules, natural language processing and symbolic reasoning using the SNOMED CT ontology were adopted in the system. Results: The system was developed on a corpus of 500 histology and cytology reports (with 47% notifiable reports) and evaluated on an independent set of 479 reports (with 52% notifiable reports). Results show that the system can reliably classify cancer notifiable reports with a sensitivity, specificity, and positive predictive value (PPV) of 0.99, 0.95, and 0.95, respectively, for the development set, and 0.98, 0.96, and 0.96 for the evaluation set. High sensitivity can be achieved at a slight expense in specificity and PPV. Conclusion: The system demonstrates how medical free-text processing enables the classification of cancer notifiable pathology reports with high reliability for potential use by Cancer Registries and pathology laboratories.
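As a point of reference for the figures quoted, the sketch below shows how sensitivity, specificity, and positive predictive value are computed from a binary classifier's confusion matrix. The labels are made up for illustration and are not drawn from the QCR corpus described in the abstract.

```python
# A minimal sketch, assuming illustrative labels (1 = cancer notifiable, 0 = not).
from sklearn.metrics import confusion_matrix

def notifiability_metrics(y_true, y_pred):
    """Return (sensitivity, specificity, PPV) for a binary notifiability classifier."""
    tn, fp, fn, tp = confusion_matrix(y_true, y_pred, labels=[0, 1]).ravel()
    sensitivity = tp / (tp + fn)   # proportion of notifiable reports correctly flagged
    specificity = tn / (tn + fp)   # proportion of non-notifiable reports correctly passed over
    ppv = tp / (tp + fp)           # proportion of flagged reports that are truly notifiable
    return sensitivity, specificity, ppv

# Ten illustrative reports: true labels vs. classifier output.
y_true = [1, 1, 1, 1, 1, 0, 0, 0, 0, 0]
y_pred = [1, 1, 1, 1, 0, 0, 0, 0, 1, 0]
sens, spec, ppv = notifiability_metrics(y_true, y_pred)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}, PPV={ppv:.2f}")
```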
Abstract:
This project develops and evaluates a model of curriculum design that aims to assist student learning of foundational disciplinary ‘Threshold Concepts’. The project uses phenomenographic action research, cross-institutional peer collaboration and the Variation Theory of Learning to develop and trial the model. Two contrasting disciplines (Physics and Law) and four institutions (two research-intensive and two universities of technology) were involved in the project, to ensure broad applicability of the model across different disciplines and contexts. The Threshold Concepts that were selected for curriculum design attention were measurement uncertainty in Physics and legal reasoning in Law. Threshold Concepts are key disciplinary concepts that are inherently troublesome, transformative and integrative in nature. Once understood, such concepts transform students’ views of the discipline because they enable students to coherently integrate what were previously seen as unrelated aspects of the subject, providing new ways of thinking about it (Meyer & Land 2003, 2005, 2006; Land et al. 2008). However, the integrative and transformative nature of such threshold concepts makes them inherently difficult for students to learn, with resulting misunderstandings of concepts being prevalent...
Abstract:
This chapter addresses data modelling as a means of promoting statistical literacy in the early grades. Consideration is first given to the importance of increasing young children’s exposure to statistical reasoning experiences and how data modelling can be a rich means of doing so. Selected components of data modelling are then reviewed, followed by a report on some findings from the third year of a three-year longitudinal study across grades one through three.