915 results for Logical reasoning


Relevance: 10.00%

Abstract:

The world’s increasing complexity, competitiveness, interconnectivity, and dependence on technology generate new challenges for nations and individuals that cannot be met by continuing education as usual (Katehi, Pearson, & Feder, 2009). With the proliferation of complex systems have come new technologies for communication, collaboration, and conceptualisation. These technologies have led to significant changes in the forms of mathematical and scientific thinking that are required beyond the classroom. Modelling, in its various forms, can develop and broaden children’s mathematical and scientific thinking beyond the standard curriculum. This paper first considers future competencies in the mathematical sciences within an increasingly complex world. Next, consideration is given to interdisciplinary problem solving and models and modelling. Examples of complex, interdisciplinary modelling activities across grades are presented, with data modelling in 1st grade, model-eliciting in 4th grade, and engineering-based modelling in 7th-9th grades.

Relevance: 10.00%

Abstract:

Australia’s Future Tax System Review, headed by the then head of the Australian Treasury, and the Productivity Commission’s Research Report on the not for profit sector, both examined the state of tax concessions to Australia’s not for profit sector in the light of the High Court’s decision in Commissioner of Taxation v Word Investments Ltd. Despite being unable to quantify with any certainty the pre- or post-Word Investments cost of the tax concessions, both Reports indicated their support for continuation of the income tax exemption. However, the government acted in the 2011 Budget to target the not for profit income tax concessions more precisely, mainly on competitive neutrality grounds. This article examines the income tax exemption by applying the five taxation design principles, proposed in the Australia’s Future Tax System Review, for assessing tax expenditure. The conclusion is that the exemptions can be justified and, further, that a rationale for the exemption can be consistent with the reasoning in the Word Investments case.

Relevance: 10.00%

Abstract:

Embedded real-time programs rely on external interrupts to respond to events in their physical environment in a timely fashion. Formal program verification theories, such as the refinement calculus, are intended for development of sequential, block-structured code and do not allow for asynchronous control constructs such as interrupt service routines. In this article we extend the refinement calculus to support formal development of interrupt-dependent programs. To do this we: use a timed semantics, to support reasoning about the occurrence of interrupts within bounded time intervals; introduce a restricted form of concurrency, to model composition of interrupt service routines with the main program they may preempt; introduce a semantics for shared variables, to model contention for variables accessed by both interrupt service routines and the main program; and use real-time scheduling theory to discharge timing requirements on interruptible program code.
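
The shared-variable contention described in this abstract can be made concrete with a small example. The following is a minimal Python sketch, not the article's refinement calculus: it simulates an interrupt service routine preempting a non-atomic read-modify-write in the main program, producing the lost-update hazard that a semantics for shared variables must model. All names are hypothetical and the interleaving is made explicit so the hazard is deterministic.

```python
# Illustrative sketch: a lost-update race between a main program and an
# interrupt service routine (ISR) that share a variable.

shared_count = 0          # variable accessed by both main code and the ISR

def isr():
    """Interrupt service routine: increments the shared counter."""
    global shared_count
    shared_count += 1

def main_step(interrupt_fires_mid_update: bool):
    """One main-program update of shared_count, modelled as a
    non-atomic read / modify / write sequence."""
    global shared_count
    local = shared_count              # read
    if interrupt_fires_mid_update:
        isr()                         # preemption between read and write
    shared_count = local + 1          # write (may overwrite the ISR's update)

# Without preemption: both updates are observed.
shared_count = 0
main_step(interrupt_fires_mid_update=False)
isr()
print(shared_count)  # 2

# With preemption mid-update: the ISR's increment is lost.
shared_count = 0
main_step(interrupt_fires_mid_update=True)
print(shared_count)  # 1, not 2 -- the hazard the shared-variable semantics must capture
```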

Relevance: 10.00%

Abstract:

Many modern business environments employ software to automate the delivery of workflows; however, workflow design and generation remain a laborious technical task for domain specialists. Several different approaches have been proposed for deriving workflow models. Some rely on process data mining, whereas others derive workflow models from operational structures, domain-specific knowledge, or workflow model compositions drawn from knowledge bases. Many approaches draw on principles from automatic planning, but are conceptual in nature and lack mathematical justification. In this paper we present a mathematical framework for deducing tasks in workflow models from plans in mechanistic or strongly controlled work environments, with a focus on automatic plan generation. In addition, we define an associative composition operator that permits crisp hierarchical task compositions for workflow models, and establish its properties through a set of mathematical deduction rules. The result is a logical framework that can be used to prove tasks in workflow hierarchies from operational information about work processes and machine configurations in controlled or mechanistic work environments.
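
To illustrate the kind of operator such a framework relies on, here is a minimal Python sketch under the assumption (not taken from the paper) that sequential composition of tasks can be modelled by flattening nested sequences; flattening makes the operator associative by construction, which is exactly the property crisp hierarchical task composition needs. All names are hypothetical.

```python
# Illustrative sketch of an associative sequential-composition operator
# for workflow tasks; the paper's deduction rules are not reproduced here.
from dataclasses import dataclass
from typing import Tuple

@dataclass(frozen=True)
class Task:
    """A workflow task: atomic, or a sequence of sub-tasks."""
    name: str
    steps: Tuple["Task", ...] = ()

def compose(left: Task, right: Task) -> Task:
    """Sequential composition. Flattening nested sequences into one
    tuple makes the operator associative by construction."""
    def flatten(t: Task) -> Tuple[Task, ...]:
        return t.steps if t.name == "seq" else (t,)
    return Task("seq", flatten(left) + flatten(right))

a, b, c = Task("plan"), Task("machine"), Task("inspect")

# Associativity: (a ; b) ; c == a ; (b ; c)
assert compose(compose(a, b), c) == compose(a, compose(b, c))
```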

Relevance: 10.00%

Abstract:

Over the last decade, researchers and legislators have struggled to get an accurate picture of the scale and nature of the problem of human trafficking. In the absence of reliable data, some anti-prostitution activists have asserted that a causal relationship exists between legalised prostitution and human trafficking. They claim that systems of legalised or decriminalised prostitution lead to increases in trafficking into the sex industry. This paper critically analyses attempts to substantiate this claim during the development of anti-trafficking policy in Australia and the United States. These attempts are explored within the context of persistent challenges in measuring the scale and nature of human trafficking. The efforts of abolitionist campaigners to use statistical evidence and logical argumentation are analysed, with a specific focus on the characterisation of demand for sexual services and systems of legalised prostitution as ‘pull’ factors fuelling an increase in sex trafficking. The extent to which policymakers sought to introduce evidence-based policy is also explored.

Relevance: 10.00%

Abstract:

This chapter discusses a range of issues associated with supporting inquiry and deep reasoning while utilising information and communications technology (ICT). The role of questioning in critical thinking and reflection is considered in the context of scaffolding, and new opportunities for ICT-enabled scaffolding are identified. In particular, why-questioning provides a key point of focus and is presented as an important consideration in the design of systems that not only require cognitive engagement but aim to nurture it. Advances in automated question generation within intelligent tutoring systems are shown to hold promise for both teaching and learning in a range of other applications. While shortening attention spans appear to be a hazard of engaging with digital media, cognitive engagement is presented as something with broader scope than attention span: it is best conceived of as a crucible within which a rich mix of cognitive activities takes place and from which new knowledge is created.

Relevance: 10.00%

Abstract:

Virtual environments can provide, through digital games and online social interfaces, extremely exciting forms of interactive entertainment. Because of their capability in displaying and manipulating information in natural and intuitive ways, such environments have found extensive applications in decision support, education and training in the health and science domains, amongst others. Currently, the burden of validating both the interactive functionality and the visual consistency of virtual environment content is carried entirely by developers and play-testers. While considerable research has been conducted into assisting the design of virtual world content and mechanics, to date only limited contributions have been made regarding the automatic testing of the underpinning graphics software and hardware. The aim of this thesis is to determine whether the correctness of the images generated by a virtual environment can be quantitatively defined, and automatically measured, in order to facilitate the validation of the content. In an attempt to provide an environment-independent definition of visual consistency, a number of classification approaches were developed. First, a novel model-based object description was proposed in order to enable reasoning about the color and geometry change of virtual entities during a play-session. From such an analysis, two view-based connectionist approaches were developed to map from geometry and color spaces to a single, environment-independent, geometric transformation space; this mapping was used to predict the correct visualization of the scene. Finally, an appearance-based aliasing detector was developed to show how incorrectness, too, can be quantified for debugging purposes. Since computer games rely heavily on the use of highly complex and interactive virtual worlds, they provide an excellent test bed against which to develop, calibrate and validate our techniques. Experiments were conducted on a game engine and other virtual world prototypes to determine the applicability and effectiveness of our algorithms. The results show that quantifying visual correctness in virtual scenes is a feasible enterprise, and that effective automatic bug detection can be performed through the techniques we have developed. We expect these techniques to find application in large 3D games and virtual world studios that require a scalable solution to testing their virtual world software and digital content.
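
As a concrete, if far simpler, illustration of quantifying visual correctness, the following Python sketch scores a rendered frame against a reference ("golden") frame. It is not the thesis's model-based or connectionist method, and the tolerance value is an assumption made for illustration.

```python
# Illustrative sketch: score a rendered frame against a reference frame.
# Frames are H x W x 3 uint8 arrays.
import numpy as np

def visual_consistency(rendered: np.ndarray, reference: np.ndarray,
                       per_channel_tol: int = 8) -> float:
    """Fraction of pixels whose every channel is within tolerance of the
    reference. 1.0 means visually consistent under this metric."""
    diff = np.abs(rendered.astype(np.int16) - reference.astype(np.int16))
    ok = (diff <= per_channel_tol).all(axis=-1)
    return float(ok.mean())

# A frame that matches except for a small corrupted region scores below 1.0.
reference = np.full((64, 64, 3), 128, dtype=np.uint8)
rendered = reference.copy()
rendered[:8, :8] = 0                      # simulated rendering bug
score = visual_consistency(rendered, reference)
print(f"consistency: {score:.3f}")        # 0.984
assert score < 1.0                        # flagged for debugging
```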

Relevance: 10.00%

Abstract:

It is almost a truism that persons who occupy formal bureaucratic positions in schools may not actually be leaders: many would not be followed if they were not role incumbents in a bureaucracy. It is also clear from studies of grassroots leaders that, without the qualities and skills of leadership, no one would follow them either, because they hold no formal, hierarchical role upon which others depend. One of the reasons for re-examining the nature of grassroots leaders is to attempt to recapture those tactics or strategies which might be reconceptualized and utilized within more formal settings, so that role-dependent leadership becomes more effectual and trustworthy than leadership totally dependent on role authority. This reasoning is especially critical if there is a desire to work towards more democratic and collaborative working arrangements between leaders and followers, where more flexible and dynamic relationships promise higher levels of commitment and productivity. Hecksher (1994) speaks of such a reconceptualization as part of a shift from an emphasis on power to one centered on influence. This paper examines the nature of leadership before it was subjected to positivistic science and later behavioural studies. This move follows the advice of Heilbrunn (1996), who trenchantly observed that for leadership studies to grow as a discipline, “it will have to cast a wider net” (p. 11). Willis et al. (2008) make a similar point when they lament that social scientists have favoured understanding bureaucracies rather than grassroots community organizations, yet much can be gained by being aware of the tactics and strategies used by grassroots leaders, who depend on influence as opposed to power. This paper, then, aims to do this by posing a tentative model of grassroots leadership and then considering how this model might inform, and be used by, those responsible for developing school leaders.

Relevance: 10.00%

Abstract:

This paper argues for a renewed focus on statistical reasoning in the beginning school years, with opportunities for children to engage in data modelling. Results are reported from the first year of a 3-year longitudinal study in which three classes of first-grade children (6-year-olds) and their teachers engaged in data modelling activities. The theme of Looking after our Environment, part of the children’s science curriculum, provided the task context. The goals for the two activities addressed here included engaging children in core components of data modelling, namely selecting attributes, structuring and representing data, identifying variation in data, and making predictions from given data. Results include the various ways in which children represented and re-represented collected data, including attribute selection, and the metarepresentational competence they displayed in doing so. The “data lenses” through which the children dealt with informal inference (variation and prediction) are also reported.

Relevance: 10.00%

Abstract:

Privacy issues have hindered the evolution of e-health since its emergence. Patients demand better solutions for the protection of private information, while health professionals demand open access to patient health records; existing e-health systems find it difficult to fulfill these competing requirements. In this paper, we present an information accountability framework (IAF) for e-health systems. The IAF is intended to address privacy issues in e-health and the competing concerns that surround them. Its capabilities adhere to information accountability principles and e-health requirements. Policy representation and policy reasoning are the key capabilities introduced in the IAF. We investigate how these capabilities can be realised using Semantic Web technologies and discuss, through a case scenario, how the different types of policies in the IAF can be represented using the Open Digital Rights Language (ODRL).
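
As a rough illustration of policy representation, the sketch below encodes an ODRL-style permission for an e-health record as a Python dict shaped like ODRL's JSON-LD serialization, together with a toy accountability check. The resource URIs and the purpose constraint values are hypothetical; the structural keys (permission, target, action, constraint) follow the ODRL core model, and real policy reasoning would use Semantic Web tooling rather than dictionary lookups.

```python
# Illustrative sketch: an ODRL-style usage policy for an e-health record.
policy = {
    "@context": "http://www.w3.org/ns/odrl.jsonld",
    "@type": "Agreement",
    "uid": "http://example.com/policy/ehr/001",          # hypothetical
    "permission": [{
        "target": "http://example.com/ehr/patient42",    # hypothetical record URI
        "action": "read",
        "assignee": "http://example.com/role/treating-clinician",
        "constraint": [{
            "leftOperand": "purpose",
            "operator": "eq",
            "rightOperand": "treatment"
        }]
    }]
}

def permits(policy: dict, actor: str, action: str, purpose: str) -> bool:
    """Toy accountability check: is the request covered by a permission?"""
    for p in policy["permission"]:
        if p["assignee"] == actor and p["action"] == action and all(
            c["rightOperand"] == purpose
            for c in p.get("constraint", [])
            if c["leftOperand"] == "purpose"
        ):
            return True
    return False

print(permits(policy, "http://example.com/role/treating-clinician",
              "read", "treatment"))   # True: covered by the permission
print(permits(policy, "http://example.com/role/treating-clinician",
              "read", "marketing"))   # False: purpose constraint not met
```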

Relevance: 10.00%

Abstract:

Consider the concept combination ‘pet human’. In word association experiments, human subjects produce the associate ‘slave’ in relation to this combination. The striking aspect of this associate is that it is not produced as an associate of ‘pet’ or ‘human’ in isolation; in other words, the associate ‘slave’ seems to be emergent. Such emergent associations sometimes have a creative character, and cognitive science is largely silent about how we produce them. Departing from a dimensional model of human conceptual space, this article explores concept combinations and argues that emergent associations are a result of abductive reasoning within conceptual space, that is, below the symbolic level of cognition. A tensor-based approach is used to model concept combinations, allowing such combinations to be formalized as interacting quantum systems. Free association norm data is used to motivate the underlying basis of the conceptual space. It is shown by analogy how some concept combinations may behave like quantum-entangled (non-separable) particles. Two methods of analysis are presented for empirically validating the presence of non-separable concept combinations in human cognition: one is based on quantum theory, and the other on comparing a joint (true theoretic) probability distribution with another distribution based on a separability assumption, using a chi-square goodness-of-fit test. Although these methods were inconclusive in relation to an empirical study of bi-ambiguous concept combinations, avenues for further refinement of these methods are identified.
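
The second method of analysis can be sketched concretely. Under a separability assumption, the joint distribution over the senses of the two concepts is the outer (tensor) product of its marginals; a chi-square goodness-of-fit test then measures how far the observed associations depart from that product. The Python sketch below uses invented counts for illustration, not the article's free association norm data.

```python
# Illustrative sketch of a separability test for a concept combination.
import numpy as np
from scipy.stats import chi2

# Rows: senses of concept A; columns: senses of concept B.
observed = np.array([[40.0, 10.0],
                     [ 5.0, 45.0]])       # hypothetical association counts
n = observed.sum()

# Separable model: joint = outer (tensor) product of the marginals.
p_a = observed.sum(axis=1) / n
p_b = observed.sum(axis=0) / n
expected = n * np.outer(p_a, p_b)

# Chi-square goodness of fit against the separability assumption.
stat = ((observed - expected) ** 2 / expected).sum()
dof = (observed.shape[0] - 1) * (observed.shape[1] - 1)
p_value = chi2.sf(stat, dof)

print(f"chi2 = {stat:.2f}, p = {p_value:.4f}")
# A small p-value rejects separability, i.e. under this test the combination
# behaves like a non-separable (entangled-like) system.
```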

Relevance: 10.00%

Abstract:

Now in its eighth edition, Australian Tax Analysis: Cases, Commentary, Commercial Applications and Questions has a proven track record as a high-level work for students of taxation law, written by a team of authors with many years of experience. Recognising that the volume of material today’s taxation student needs to process can be overwhelming, the well-chosen extracts and thought-provoking commentary in Australian Tax Analysis, 8th edition, provide readers with the depth of knowledge, and the reasoning and analytical skills, that will be required of them as practitioners. As well as the carefully selected case extracts and helpful commentary, each chapter is supplemented by engaging practice questions involving problem-solving, commercial decision-making, legal analysis and quantitative application. All these elements combined make Australian Tax Analysis an invaluable aid to the understanding of a subject that can be both technical and complex.

Relevance: 10.00%

Abstract:

It is argued that concerns arise about the integrity and fairness of the taxation regime where charitable organizations, which avail themselves of tax exempt status while undertaking commercial activities, compete directly with the for-profit sector. The appropriateness of the tax concessions granted to charitable organizations is considered in respect of income derived from commercial activities. It is principally argued that the traditional line of reasoning for imposing limitations on tax concessions rests on an incorrect underlying inquiry. Traditionally, it is argued that limitations should be imposed because of unfair competition, a lack of competitive neutrality, or an arbitrary judgment that the activities are undeserving of concessions. However, a more appropriate question from which to base any limitations is one which weighs the value attached to the integrity of the taxation regime as a whole, and of the tax base specifically, against the public good provided by charities. When the correct underlying question is asked, sound taxation policy ensues, as a less arbitrary approach may be adopted to limit the scope of tax concessions available to charitable organizations.

Relevance: 10.00%

Abstract:

Anthony Downs’ public choice theory proposes that every rational person will try to meet their own desires in preference to those of others, and will attempt to satisfy those desires in the most efficient manner possible. This paper demonstrates that applying this theory implies that public servants and politicians will perform acts of corruption and maladministration in order to meet their desires efficiently. As such action is unavoidable, political parties must appear to meet the public demand for accountability systems, but must not make these systems viable, lest they expose the corruption and maladministration that would threaten the government’s chance of re-election. It is therefore logical for governments to display a commitment to accountability while simultaneously ensuring the systems cannot interfere with government control or expose the government’s flaws.

Relevance: 10.00%

Abstract:

The opening phrase of the title is from Charles Darwin’s notebooks (Schweber 1977). It is a double reminder: firstly, that mainstream evolutionary theory is not just about describing nature but is particularly looking for mechanisms or ‘causes’; and secondly, that there will usually be several causes affecting any particular outcome. The second part of the title reflects our concern at the almost universal rejection of the idea that biological mechanisms are sufficient for macroevolutionary changes, a rejection of a cornerstone of Darwinian evolutionary theory. Our primary aim here is to consider ways of making it easier to develop and to test hypotheses about evolution. Formalizing hypotheses can help generate tests. In an absolute sense, some of the discussion by scientists about evolution is little better than the lack of reasoning used by those advocating intelligent design. Our discussion here is in a Popperian framework, where science is defined as that area of study where it is possible, in principle, to find evidence against hypotheses – they are in principle falsifiable. However, with time, the boundaries of science keep expanding. In the past, some aspects of evolution lay outside those boundaries, but new techniques and ideas continue to expand them, and it is appropriate to re-examine some topics. Over the last few decades there appears to have been an increasingly strong assumption to look first (and only) for a physical cause. This assumption is virtually never formally discussed; it is simply assumed that some physical factor ‘drives’ evolution. It is necessary to examine our assumptions much more carefully: what is meant by physical factors ‘driving’ evolution, or by an ‘explosive radiation’? Our discussion focuses on two of the six mass extinctions: the fifth, the events of the Late Cretaceous, and the sixth, which started at least 50,000 years ago and is ongoing.

Cretaceous/Tertiary boundary: the rise of birds and mammals. We have had a long-term interest (Cooper and Penny 1997) in designing tests to help evaluate whether the processes of microevolution are sufficient to explain macroevolution. The real challenge is to formulate hypotheses in a testable way. For example, the number of lineages of birds and mammals that survive from the Cretaceous to the present is one such test. Our first estimate was 22 for birds, and current work is tending to increase this value. This still does not consider lineages that survived into the Tertiary and then went extinct later. Our initial suggestion was probably too narrow in that it lumped four models from Penny and Phillips (2004) into one. This reduction is too simplistic: we need to know about survival, and about ecological and morphological divergences, during the Late Cretaceous, and whether crown groups of avian or mammalian orders may have existed back into the Cretaceous. More recently (Penny and Phillips 2004) we have formalized hypotheses about dinosaurs and pterosaurs, with the prediction that interactions between mammals (and ground-feeding birds) and dinosaurs would be most likely to affect the smallest dinosaurs, and similarly that interactions between birds and pterosaurs would particularly affect the smaller pterosaurs. There is now evidence for both classes of interactions, with the smallest dinosaurs and pterosaurs declining first, as predicted. Thus, testable models are now possible.

Mass extinction number six: human impacts. On a broad scale, there is a good correlation between the time of human arrival and increased extinctions (Hurles et al. 2003; Martin 2005; Figure 1). However, it is necessary to distinguish different time scales (Penny 2005), and on a finer scale there are still large numbers of possibilities. In Hurles et al. (2003) we mentioned habitat modification (including the use of fire) and introduced plants and animals (including kiore), in addition to direct predation (the ‘overkill’ hypothesis). We also need to consider the prey switching that occurs in early human societies, as evidenced by the results of Wragg (1995) on the middens of different ages on Henderson Island in the Pitcairn group. In addition, the presence of human-wary or human-adapted animals will affect the distribution in the subfossil record. A better understanding of human impacts world-wide, in conjunction with pre-scientific knowledge, will make it easier to discuss the issues by removing ‘blame’. While continued spontaneous generation was accepted universally, there was the expectation that animals would continue to reappear. New Zealand is one of the very best locations in the world to study many of these issues: apart from the marine fossil record, some human impact events are extremely recent and the remains are less disrupted by time.