275 results for sociomoral reasoning
Abstract:
The world’s increasing complexity, competitiveness, interconnectivity, and dependence on technology generate new challenges for nations and individuals that cannot be met by continuing education as usual (Katehi, Pearson, & Feder, 2009). With the proliferation of complex systems have come new technologies for communication, collaboration, and conceptualisation. These technologies have led to significant changes in the forms of mathematical and scientific thinking that are required beyond the classroom. Modelling, in its various forms, can develop and broaden children’s mathematical and scientific thinking beyond the standard curriculum. This paper first considers future competencies in the mathematical sciences within an increasingly complex world. Next, consideration is given to interdisciplinary problem solving and models and modelling. Examples of complex, interdisciplinary modelling activities across grades are presented, with data modelling in 1st grade, model-eliciting in 4th grade, and engineering-based modelling in 7th-9th grades.
Abstract:
Australia’s Future Tax System Review, headed by the then head of the Australian Treasury, and the Productivity Commission’s Research Report on the not for profit sector, both examined the state of tax concessions to Australia’s not for profit sector in the light of the High Court’s decision in Commissioner of Taxation v Word Investments Ltd. Despite being unable to quantify with any certainty the pre- or post-Word Investments cost of the tax concessions, both Reports indicated their support for continuation of the income tax exemption. However, the government acted in the 2011 Budget to target the not for profit income tax concessions more precisely, mainly on competitive neutrality grounds. This article examines the income tax exemption by applying the five taxation design principles, proposed in the Australia’s Future Tax System Review, for assessing tax expenditure. The conclusion is that the exemptions can be justified and, further, that a rationale for the exemption can be consistent with the reasoning in the Word Investments case.
Abstract:
Embedded real-time programs rely on external interrupts to respond to events in their physical environment in a timely fashion. Formal program verification theories, such as the refinement calculus, are intended for development of sequential, block-structured code and do not allow for asynchronous control constructs such as interrupt service routines. In this article we extend the refinement calculus to support formal development of interrupt-dependent programs. To do this we: use a timed semantics, to support reasoning about the occurrence of interrupts within bounded time intervals; introduce a restricted form of concurrency, to model composition of interrupt service routines with the main program they may preempt; introduce a semantics for shared variables, to model contention for variables accessed by both interrupt service routines and the main program; and use real-time scheduling theory to discharge timing requirements on interruptible program code.
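For readers unfamiliar with the base formalism being extended here, the following is a minimal sketch of a standard, untimed, sequential refinement-calculus step in Morgan-style notation. It is offered only as background under that assumption; it is not the timed, interrupt-aware extension developed in the article, and the symbols (w, pre, post, E) are generic placeholders.

```latex
% Specification statement: change only w so that, assuming pre holds initially,
% post holds on termination.
%   w : [ pre, post ]
%
% Assignment-introduction law (sequential, untimed refinement calculus):
\[
  w : [\, pre,\ post \,] \;\sqsubseteq\; (w := E)
  \qquad \text{provided } pre \Rightarrow post[w \backslash E]
\]
```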
Abstract:
This chapter discusses a range of issues associated with supporting inquiry and deep reasoning while utilising information and communications technology (ICT). The role of questioning in critical thinking and reflection is considered in the context of scaffolding, and new opportunities for ICT-enabled scaffolding are identified. In particular, why-questioning provides a key point of focus and is presented as an important consideration in the design of systems that not only require cognitive engagement but aim to nurture it. Advances in automated question generation within intelligent tutoring systems are shown to hold promise for both teaching and learning in a range of other applications. While shortening attention spans appear to be a hazard of engaging with digital media, cognitive engagement is presented as something with broader scope than attention span: it is best conceived of as a crucible within which a rich mix of cognitive activities takes place and from which new knowledge is created.
Abstract:
Virtual environments can provide, through digital games and online social interfaces, extremely exciting forms of interactive entertainment. Because of their capability in displaying and manipulating information in natural and intuitive ways, such environments have found extensive applications in decision support, education and training in the health and science domains, amongst others. Currently, the burden of validating both the interactive functionality and the visual consistency of virtual environment content is carried entirely by developers and play-testers. While considerable research has been conducted into assisting the design of virtual world content and mechanics, to date only limited contributions have been made regarding the automatic testing of the underpinning graphics software and hardware. The aim of this thesis is to determine whether the correctness of the images generated by a virtual environment can be quantitatively defined, and automatically measured, in order to facilitate the validation of the content. In an attempt to provide an environment-independent definition of visual consistency, a number of classification approaches were developed. First, a novel model-based object description was proposed in order to enable reasoning about the color and geometry changes of virtual entities during a play-session. From such an analysis, two view-based connectionist approaches were developed to map from geometry and color spaces to a single, environment-independent, geometric transformation space; this mapping was used to predict the correct visualization of the scene. Finally, an appearance-based aliasing detector was developed to show how incorrectness, too, can be quantified for debugging purposes. Since computer games rely heavily on the use of highly complex and interactive virtual worlds, they provide an excellent test bed against which to develop, calibrate and validate our techniques. Experiments were conducted on a game engine and other virtual world prototypes to determine the applicability and effectiveness of our algorithms. The results show that quantifying visual correctness in virtual scenes is a feasible enterprise, and that effective automatic bug detection can be performed through the techniques we have developed. We expect these techniques to find application in large 3D games and virtual world studios that require a scalable solution to testing their virtual world software and digital content.
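As a simplified illustration of the kind of automated check such validation implies (a stand-in only, not the model-based or connectionist techniques developed in the thesis), the sketch below compares a rendered frame against a stored reference image using a per-pixel colour tolerance. The file names and thresholds are hypothetical.

```python
# A minimal visual-consistency regression check: compare a rendered frame with a
# reference image and flag the frame if too many pixels differ beyond a tolerance.
import numpy as np
from PIL import Image


def frames_consistent(rendered_path: str, reference_path: str,
                      per_pixel_tol: float = 8.0,
                      max_bad_fraction: float = 0.01) -> bool:
    """Return True if the rendered frame matches the reference within tolerance."""
    rendered = np.asarray(Image.open(rendered_path).convert("RGB"), dtype=np.float32)
    reference = np.asarray(Image.open(reference_path).convert("RGB"), dtype=np.float32)
    if rendered.shape != reference.shape:
        return False  # a resolution mismatch is treated as an inconsistency
    # Mean absolute per-pixel colour difference across the RGB channels.
    diff = np.abs(rendered - reference).mean(axis=2)
    bad_fraction = float((diff > per_pixel_tol).mean())
    return bad_fraction <= max_bad_fraction


if __name__ == "__main__":
    # Hypothetical file names for a captured frame and its stored reference.
    print(frames_consistent("frame_0042.png", "reference_0042.png"))
```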
Abstract:
It is almost a truism that persons who occupy formal bureaucratic positions in schools may not actually be leaders; that is, they would not be leaders at all if they were not role incumbents in a bureaucracy. It is also clear from studies of grassroots leaders that without the qualities or skills of leadership no one would follow them, because they have no formal, hierarchical role upon which others depend. One of the reasons for re-examining the nature of grassroots leaders is to attempt to recapture those tactics or strategies which might be reconceptualized and utilized within more formal settings, so that role-based leadership becomes more effectual and trustworthy than leadership that is totally dependent on role authority. This reasoning is especially critical if there is a desire to work towards more democratic and collaborative working arrangements between leaders and followers, where more flexible and dynamic relationships promise higher levels of commitment and productivity. Hecksher (1994) speaks of such a reconceptualization as part of a shift from an emphasis on power to one centered on influence. This paper examines the nature of leadership before it was subjected to positivistic science and later behavioural studies. This move follows the advice of Heilbrunn (1996), who trenchantly observed that for leadership studies to grow as a discipline, “it will have to cast a wider net” (p. 11). Willis et al. (2008) make a similar point when they lament that social scientists have favoured understanding bureaucracies rather than grassroots community organizations, yet much can be gained by being aware of the tactics and strategies used by grassroots leaders, who depend on influence as opposed to power. This paper, then, aims to do this by posing a tentative model of grassroots leadership and then considering how this model might inform, and be used by, those responsible for developing school leaders.
Abstract:
This paper argues for a renewed focus on statistical reasoning in the beginning school years, with opportunities for children to engage in data modelling. Results are reported from the first year of a 3-year longitudinal study in which three classes of first-grade children (6-year-olds) and their teachers engaged in data modelling activities. The theme of Looking after our Environment, part of the children’s science curriculum, provided the task context. The goals for the two activities addressed here included engaging children in core components of data modelling, namely, selecting attributes, structuring and representing data, identifying variation in data, and making predictions from given data. Results include the various ways in which children represented and re-represented collected data, including attribute selection, and the metarepresentational competence they displayed in doing so. The “data lenses” through which the children dealt with informal inference (variation and prediction) are also reported.
Abstract:
Privacy issues have hindered the evolution of e-health since its emergence. Patients demand better solutions for the protection of private information. Health professionals demand open access to patient health records. Existing e-health systems find it difficult to fulfill these competing requirements. In this paper, we present an information accountability framework (IAF) for e-health systems. The IAF is intended to address privacy issues and the competing concerns related to e-health. The capabilities of the IAF adhere to information accountability principles and e-health requirements. Policy representation and policy reasoning are key capabilities introduced in the IAF. We investigate how these capabilities can be realised using Semantic Web technologies. Using a case scenario, we discuss how the different types of policies in the IAF can be represented using the Open Digital Rights Language (ODRL).
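As a rough illustration of ODRL-based policy representation (a sketch only, not the IAF implementation described in the paper), the snippet below builds a small permission policy as an RDF graph using rdflib and the ODRL 2 vocabulary. The patient record, clinician and example namespace URIs are hypothetical.

```python
# A minimal ODRL permission policy expressed as RDF: a named clinician may read
# a specific patient record.
from rdflib import Graph, Namespace, RDF

ODRL = Namespace("http://www.w3.org/ns/odrl/2/")           # ODRL 2 vocabulary
EX = Namespace("http://example.org/ehealth/")              # hypothetical namespace

g = Graph()
g.bind("odrl", ODRL)
g.bind("ex", EX)

policy = EX["policy/123"]
permission = EX["policy/123/permission/1"]

g.add((policy, RDF.type, ODRL.Policy))
g.add((policy, ODRL.permission, permission))
g.add((permission, ODRL.target, EX["record/patient-42"]))  # the health record
g.add((permission, ODRL.assignee, EX["clinician/jones"]))  # who may act on it
g.add((permission, ODRL.action, ODRL.read))                # the permitted action

print(g.serialize(format="turtle"))
```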
Abstract:
Consider the concept combination ‘pet human’. In word association experiments, human subjects produce the associate ‘slave’ in relation to this combination. The striking aspect of this associate is that it is not produced as an associate of ‘pet’ or ‘human’ in isolation. In other words, the associate ‘slave’ seems to be emergent. Such emergent associations sometimes have a creative character, and cognitive science is largely silent about how we produce them. Departing from a dimensional model of human conceptual space, this article explores concept combinations and argues that emergent associations are a result of abductive reasoning within conceptual space, that is, below the symbolic level of cognition. A tensor-based approach is used to model concept combinations, allowing such combinations to be formalized as interacting quantum systems. Free association norm data are used to motivate the underlying basis of the conceptual space. It is shown by analogy how some concept combinations may behave like quantum-entangled (non-separable) particles. Two methods of analysis are presented for empirically validating the presence of non-separable concept combinations in human cognition. One method is based on quantum theory and the other on comparing a joint (true theoretic) probability distribution with a distribution based on a separability assumption, using a chi-square goodness-of-fit test. Although these methods were inconclusive in relation to an empirical study of bi-ambiguous concept combinations, avenues for further refinement of these methods are identified.
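As a minimal sketch of the second method described above, the snippet below compares an observed joint distribution over interpretations of a concept combination with the table expected under a separability (independence) assumption, using a chi-square test. The counts are invented for illustration and are not data from the study.

```python
# Test whether a joint distribution over two concepts' interpretations departs
# from the distribution implied by a separability (independence) assumption.
import numpy as np
from scipy.stats import chi2_contingency

# Rows: interpretations of the first concept; columns: interpretations of the second.
observed = np.array([
    [40, 10],
    [15, 35],
])

chi2, p_value, dof, expected = chi2_contingency(observed)
# 'expected' is the table implied by separability (the product of the marginals);
# a small p-value suggests the joint distribution is not separable.
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_value:.4f}")
```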
Abstract:
Now in its eighth edition, Australian Tax Analysis: Cases, Commentary, Commercial Applications and Questions has a proven track record as a high-level work for students of taxation law, written by a team of authors with many years of experience. Recognising that the volume of material to be processed by today’s taxation student can be overwhelming, the well-chosen extracts and thought-provoking commentary in Australian Tax Analysis, 8th edition, provide readers with the depth of knowledge, and the reasoning and analytical skills, that will be required of them as practitioners. As well as the carefully selected case extracts and the helpful commentary, each chapter is supplemented by engaging practice questions involving problem-solving, commercial decision-making, legal analysis and quantitative application. All these elements combined make Australian Tax Analysis an invaluable aid to the understanding of a subject that can be both technical and complex.
Abstract:
It is argued that concerns arise about the integrity and fairness of the taxation regime where charitable organizations, which avail themselves of tax-exempt status while undertaking commercial activities, compete directly with the for-profit sector. The appropriateness of the tax concessions granted to charitable organizations is considered in respect of income derived from commercial activities. It is principally argued that the traditional line of reasoning for imposing limitations on tax concessions focuses on an incorrect underlying inquiry. Traditionally, it is argued that limitations should be imposed because of unfair competition, a lack of competitive neutrality, or an arbitrary decision relating to a lack of deserving. However, it is argued that a more appropriate question on which to base any limitations is one that weighs the value attached to the integrity of the taxation regime as a whole, and to the tax base specifically, against the public good provided by charities. When the correct underlying question is asked, sound taxation policy ensues, as a less arbitrary approach may be adopted to limit the scope of the tax concessions available to charitable organizations.
Abstract:
The opening phrase of the title is from Charles Darwin’s notebooks (Schweber 1977). It is a double reminder: firstly, that mainstream evolutionary theory is not just about describing nature but is particularly looking for mechanisms or ‘causes’; and secondly, that there will usually be several causes affecting any particular outcome. The second part of the title reflects our concern at the almost universal rejection of the idea that biological mechanisms are sufficient for macroevolutionary changes, thus rejecting a cornerstone of Darwinian evolutionary theory. Our primary aim here is to consider ways of making it easier to develop and to test hypotheses about evolution. Formalizing hypotheses can help generate tests. In an absolute sense, some of the discussion by scientists about evolution is little better than the lack of reasoning used by those advocating intelligent design. Our discussion here is in a Popperian framework, where science is defined as that area of study in which it is possible, in principle, to find evidence against hypotheses – they are in principle falsifiable. However, with time, the boundaries of science keep expanding. In the past, some aspects of evolution were outside the boundaries of falsifiable science, but new techniques and ideas are increasingly expanding those boundaries and it is appropriate to re-examine some topics. It often appears that over the last few decades there has been an increasingly strong assumption to look first (and only) for a physical cause. This decision is virtually never formally discussed; it is simply assumed that some physical factor ‘drives’ evolution. It is necessary to examine our assumptions much more carefully: what is meant by physical factors ‘driving’ evolution, or by an ‘explosive radiation’? Our discussion focuses on two of the six mass extinctions, the fifth being the events in the Late Cretaceous and the sixth starting at least 50,000 years ago (and still ongoing). The Cretaceous/Tertiary boundary: the rise of birds and mammals. We have had a long-term interest (Cooper and Penny 1997) in designing tests to help evaluate whether the processes of microevolution are sufficient to explain macroevolution. The real challenge is to formulate hypotheses in a testable way. For example, the number of lineages of birds and mammals that survive from the Cretaceous to the present is one such test. Our first estimate was 22 for birds, and current work is tending to increase this value. This still does not consider lineages that survived into the Tertiary and then went extinct later. Our initial suggestion was probably too narrow in that it lumped four models from Penny and Phillips (2004) into one model. This reduction is too simplistic in that we need to know about survival and about ecological and morphological divergences during the Late Cretaceous, and whether crown groups of avian or mammalian orders may have existed back into the Cretaceous. More recently (Penny and Phillips 2004) we have formalized hypotheses about dinosaurs and pterosaurs, with the prediction that interactions between mammals (and ground-feeding birds) and dinosaurs would be most likely to affect the smallest dinosaurs, and similarly that interactions between birds and pterosaurs would particularly affect the smaller pterosaurs. There is now evidence for both classes of interactions, with the smallest dinosaurs and pterosaurs declining first, as predicted. Thus, testable models are now possible. Mass extinction number six: human impacts.
On a broad scale, there is a good correlation between the time of human arrival and increased extinctions (Hurles et al. 2003; Martin 2005; Figure 1). However, it is necessary to distinguish different time scales (Penny 2005), and on a finer scale there are still large numbers of possibilities. In Hurles et al. (2003) we mentioned habitat modification (including the use of fire) and introduced plants and animals (including kiore), in addition to direct predation (the ‘overkill’ hypothesis). We need also to consider the prey switching that occurs in early human societies, as evidenced by the results of Wragg (1995) on the middens of different ages on Henderson Island in the Pitcairn group. In addition, the presence of human-wary or human-adapted animals will affect the distribution in the subfossil record. A better understanding of human impacts world-wide, in conjunction with pre-scientific knowledge, will make it easier to discuss the issues by removing ‘blame’. While continued spontaneous generation was accepted universally, there was the expectation that animals would continue to reappear. New Zealand is one of the very best locations in the world to study many of these issues: apart from the marine fossil record, some human impact events are extremely recent and the remains are less disrupted by time.
Abstract:
Nowadays, Workflow Management Systems (WfMSs) and, more generally, Process Management Systems (PMSs), which are process-aware Information Systems (PAISs), are widely used to support many human organizational activities, ranging from well-understood, relatively stable and structured processes (supply chain management, postal delivery tracking, etc.) to processes that are more complicated, less structured and may exhibit a high degree of variation (health-care, emergency management, etc.). Every aspect of a business process involves a certain amount of knowledge, which may be complex depending on the domain of interest. The adequate representation of this knowledge is determined by the modeling language used. Some processes behave in a way that is well understood, predictable and repeatable: the tasks are clearly delineated and the control flow is straightforward. Recent discussions, however, illustrate the increasing demand for solutions for knowledge-intensive processes, where these characteristics are less applicable. The actors involved in the conduct of a knowledge-intensive process have to deal with a high degree of uncertainty. Tasks may be hard to perform and the order in which they need to be performed may be highly variable. Modeling knowledge-intensive processes can be complex, as it may be hard to capture at design time what knowledge is available at run time. In realistic environments, for example, actors lack important knowledge at execution time, or this knowledge can become obsolete as the process progresses. Even if each actor (at some point) has perfect knowledge of the world, it may not be certain of its beliefs at later points in time, since tasks by other actors may change the world without those changes being perceived. Typically, a knowledge-intensive process cannot be adequately modeled by classical, state-of-the-art process/workflow modeling approaches. In some respects there is a lack of maturity when it comes to capturing the semantic aspects involved and reasoning about them. The main focus of the 1st International Workshop on Knowledge-intensive Business Processes (KiBP 2012) was investigating how techniques from different fields, such as Artificial Intelligence (AI), Knowledge Representation (KR), Business Process Management (BPM), Service Oriented Computing (SOC), etc., can be combined with the aim of improving the modeling and enactment phases of a knowledge-intensive process. The workshop was held as part of the program of the 2012 Knowledge Representation & Reasoning International Conference (KR 2012) in Rome, Italy, in June 2012. It was hosted by the Dipartimento di Ingegneria Informatica, Automatica e Gestionale Antonio Ruberti of Sapienza Universita di Roma, with financial support of the University, through grant 2010-C26A107CN9 TESTMED, and of the EU Commission through the projects FP7-25888 Greener Buildings and FP7-257899 Smart Vortex. This volume contains the 5 papers accepted and presented at the workshop. Each paper was reviewed by three members of the internationally renowned Program Committee. In addition, a further paper was invited for inclusion in the workshop proceedings and for presentation at the workshop. Two keynote talks, one by Marlon Dumas (Institute of Computer Science, University of Tartu, Estonia) on "Integrated Data and Process Management: Finally?" and the other by Yves Lesperance (Department of Computer Science and Engineering, York University, Canada) on "A Logic-Based Approach to Business Processes Customization", completed the scientific program. We would like to thank all the Program Committee members for their valuable work in selecting the papers, Andrea Marrella for his valuable work as publication and publicity chair of the workshop, and Carola Aiello and the consulting agency Consulta Umbria for the organization of this successful event.
Abstract:
Concerns regarding students' learning and reasoning in chemistry classrooms are well documented. Students' reasoning in chemistry should be characterized by conscious consideration of chemical phenomena from laboratory work at the macroscopic, molecular/sub-micro and symbolic levels. Further, students should develop metacognition in relation to such ways of reasoning about chemistry phenomena. Classroom change that elicits metacognitive experiences and metacognitive reflection is necessary to shift entrenched views of teaching and learning in students. In this study, Activity Theory is used as the framework for interpreting changes to the rules/customs and tools of the activity systems of two different classes of students taught by the same teacher, Frances, who was teaching chemical equilibrium to those classes in consecutive years. An interpretive methodology involving multiple data sources was employed. Frances explicitly changed her pedagogy in the second year to direct students' attention to increasingly consider chemical phenomena at the molecular/sub-micro level. Additionally, she asked students not to use the textbook until toward the end of the equilibrium unit and sought to engage them in using their prior knowledge of chemistry to understand their observations from experiments. Frances' changed pedagogy elicited metacognitive experiences and reflection in students and challenged them to reconsider their metacognitive beliefs about learning chemistry and how it might be achieved. While teacher change is essential for science education reform, students are not passive players in the change efforts, and they need to be convinced of the viability of teacher pedagogical change in the context of their goals, intentions, and beliefs.
Abstract:
Design Science Research (DSR) has emerged as an important approach in Information Systems (IS) research. However, DSR is still in its genesis and has yet to achieve consensus on even the fundamentals, such as what methodology or approach to use for DSR. While there has been much effort to establish DSR methodologies, a complete, holistic and validated approach for the conduct of DSR to guide IS researchers (especially novice researchers) is yet to be established. Alturki et al. (2011) present a DSR ‘Roadmap’, making the claim that it is a complete and comprehensive guide for conducting DSR. This paper aims to further assess this Roadmap by positioning it against the ‘Idealized Model for Theory Development’ (IM4TD) (Fischer & Gregor 2011). The IM4TD highlights the role of discovery and justification and the forms of reasoning used to progress theory development. Fischer and Gregor (2011) have applied the IM4TD’s hypothetico-deductive method to analyze DSR methodologies; this approach is adopted in this study to deductively validate the Alturki et al. (2011) Roadmap. The results suggest that the Roadmap adheres to the IM4TD, is reasonably complete, overcomes most shortcomings identified in other DSR methodologies and also highlights valuable refinements that should be considered within the IM4TD.