963 results for Curriculum theory
Abstract:
Presentation Structure: THEORY; CASE STUDY 1: Southbank Institute of Technology; CASE STUDY 2: QUT Science and Technology Precinct; MORE IDEAS; ACTIVITY
Abstract:
In this annotated guide we offer a reference list, with brief synopses, of possible films for inclusion in schools and linked to the Australian Curriculum: English (AC:E). These films meet one of the three cross-curriculum priorities in the Australian Curriculum, which is Studies of Asia, specifically Australia’s contribution to Asia and Asia’s impact on Australia. This priority was recently introduced to curriculum policy in the 2008 Melbourne Declaration (Ministerial Council for Education, Early Childhood Development and Youth Affairs, 2008). In this guide we include Australian films made by Asian Australian filmmakers, as well as films about people from Asian countries in Australia, where representations of Asia are a significant part of the film’s content.
Abstract:
Power system restoration after a large area outage involves many factors, and the procedure is usually very complicated. A decision-making support system could then be developed so as to find the optimal black-start strategy. In order to evaluate candidate black-start strategies, some indices, usually both qualitative and quantitative, are employed. However, it may not be possible to directly synthesize these indices, and different extents of interactions may exist among them. In the existing black-start decision-making methods, qualitative and quantitative indices cannot be well synthesized, and the interactions among different indices are not taken into account. The vague set, an extended version of the well-developed fuzzy set, could be employed to deal with decision-making problems with interacting attributes. Given this background, the vague set is first employed in this work to represent the indices for facilitating the comparisons among them. Then, a concept of the vague-valued fuzzy measure is presented, and on that basis a mathematical model for black-start decision-making is developed. Compared with the existing methods, the proposed method could deal with the interactions among indices and more reasonably represent the fuzzy information. Finally, an actual power system is used to demonstrate the basic features of the developed model and method.
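The vague-set representation described above can be illustrated with a minimal sketch. A vague value records both a truth-membership degree t and a false-membership degree f (with t + f <= 1), so membership is bounded by the interval [t, 1 - f]. The score function and the candidate strategy data below are illustrative assumptions; the paper's actual vague-valued fuzzy measure and index interactions are not reproduced here.

```python
# Minimal sketch of vague-set index representation for comparing
# candidate black-start strategies. Illustrative only: the score
# function and data are assumptions, not the paper's model.
from dataclasses import dataclass


@dataclass(frozen=True)
class VagueValue:
    t: float  # evidence for membership (truth-membership degree)
    f: float  # evidence against membership; requires t + f <= 1

    def __post_init__(self):
        assert 0.0 <= self.t and 0.0 <= self.f and self.t + self.f <= 1.0

    def score(self) -> float:
        # A common defuzzification score: S = t - f, in [-1, 1].
        return self.t - self.f


# Hypothetical ratings of two candidate strategies on one index.
strategies = {
    "A": VagueValue(t=0.7, f=0.1),
    "B": VagueValue(t=0.6, f=0.3),
}
best = max(strategies, key=lambda name: strategies[name].score())
```

The interval [t, 1 - f] is what distinguishes a vague value from an ordinary fuzzy membership grade: the gap 1 - f - t captures hesitation, i.e., evidence committed to neither side.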
Abstract:
Lankes and Silverstein (2006) introduced the “participatory library” and suggested that the nature and form of the library should be explored. In the last several years, some attempts have been made to develop contemporary library models, often known as Library 2.0. However, little research has been based on empirical data, and such models have had a strong focus on technical aspects but less focus on participation. The research presented in this paper fills this gap. A grounded theory approach was adopted for this study. Six librarians were involved in in-depth individual interviews. As a preliminary result, five main factors of the participatory library emerged: technological, human, educational, social-economic, and environmental. Five factors influencing participation in libraries were also identified: finance, technology, education, awareness, and policy. The study’s findings provide a fresh perspective on the contemporary library and create a basis for further studies in this area.
Abstract:
Whole-body computer control interfaces present new opportunities to engage children with games for learning. Stomp is a suite of educational games that use such a technology, allowing young children to use their whole body to interact with a digital environment projected on the floor. To maximise the effectiveness of this technology, tenets of self-determination theory (SDT) are applied to the design of Stomp experiences. By meeting user needs for competence, autonomy, and relatedness, our aim is to increase children's engagement with the Stomp learning platform. Analysis of Stomp's design suggests that these tenets are met. Observations from a case study of Stomp being used by young children show that they were highly engaged and motivated by Stomp. This analysis demonstrates that continued application of SDT to Stomp will further enhance user engagement. It is also suggested that SDT, when applied more widely to other whole-body multi-user interfaces, could instil similar positive effects.
Abstract:
According to Karl Popper, widely regarded as one of the greatest philosophers of science in the 20th century, falsifiability is the primary characteristic that distinguishes scientific theories from ideologies – or dogma. For example, for people who argue that schools should treat creationism as a scientific theory, comparable to modern theories of evolution, advocates of creationism would need to become engaged in the generation of falsifiable hypotheses, and would need to abandon the practice of discouraging questioning and inquiry. Ironically, scientific theories themselves are accepted or rejected based on a principle that might be called survival of the fittest. So, for healthy theories of development to occur, four Darwinian functions should operate: (a) variation – avoid orthodoxy and encourage divergent thinking, (b) selection – submit all assumptions and innovations to rigorous testing, (c) diffusion – encourage the shareability of new and/or viable ways of thinking, and (d) accumulation – encourage the reusability of viable aspects of productive innovations.
Abstract:
Open the sports or business section of your daily newspaper, and you are immediately bombarded with an array of graphs, tables, diagrams, and statistical reports that require interpretation. Across all walks of life, the need to understand statistics is fundamental. Given that our youngsters’ future world will be increasingly data laden, scaffolding their statistical understanding and reasoning is imperative, from the early grades on. The National Council of Teachers of Mathematics (NCTM) continues to emphasize the importance of early statistical learning; data analysis and probability was the Council’s professional development “Focus of the Year” for 2007–2008. We need such a focus, especially given the results of the statistics items from the 2003 NAEP. As Shaughnessy (2007) noted, students’ performance was weak on more complex items involving interpretation or application of items of information in graphs and tables. Furthermore, little or no gains were made between the 2000 NAEP and the 2003 NAEP studies. One approach I have taken to promote young children’s statistical reasoning is through data modeling. Having implemented in grades 3–9 a number of model-eliciting activities involving working with data (e.g., English 2010), I observed how competently children could create their own mathematical ideas and representations—before being instructed how to do so. I thus wished to introduce data-modeling activities to younger children, confident that they would likewise generate their own mathematics. I recently implemented data-modeling activities in a cohort of three first-grade classrooms of six-year-olds. I report on some of the children’s responses and discuss the components of data modeling the children engaged in.
Abstract:
This article focuses on problem solving activities in a first grade classroom in a typical small community and school in Indiana. But the teacher and the activities in this class were not at all typical of what goes on in most comparable classrooms, and the issues that will be addressed are relevant and important for students from kindergarten through college. Can children really solve problems that involve concepts (or skills) that they have not yet been taught? Can children really create important mathematical concepts on their own – without a lot of guidance from teachers? What is the relationship between problem solving abilities and the mastery of skills that are widely regarded as being “prerequisites” to such tasks? Can primary school children (whose toolkits of skills are limited) engage productively in authentic simulations of “real life” problem solving situations? Can three-person teams of primary school children really work together collaboratively, and remain intensely engaged, on problem solving activities that require more than an hour to complete? Are the kinds of learning and problem solving experiences that are recommended (for example) in the USA’s Common Core State Curriculum Standards really representative of the kind that even young children encounter beyond school in the 21st century? … This article offers an existence proof showing why our answers to these questions are: Yes. Yes. Yes. Yes. Yes. Yes. And: No. … Even though the evidence we present is only intended to demonstrate what’s possible, not what’s likely to occur under any circumstances, there is no reason to expect that the things that our children accomplished could not be accomplished by average ability children in other schools and classrooms.
Abstract:
The Pattern and Structure Mathematics Awareness Project (PASMAP) has investigated the development of patterning and early algebraic reasoning among 4 to 8 year olds over a series of related studies. We assert that an awareness of mathematical pattern and structure enables mathematical thinking and simple forms of generalisation from an early age. The project aims to promote a strong foundation for mathematical development by focusing on critical, underlying features of mathematics learning. This paper provides an overview of key aspects of the assessment and intervention, and analyses of the impact of PASMAP on students’ representation, abstraction and generalisation of mathematical ideas. A purposive sample of four large primary schools, two in Sydney and two in Brisbane, representing 316 students from diverse socio-economic and cultural contexts, participated in the evaluation throughout the 2009 school year and a follow-up assessment in 2010. Two different mathematics programs were implemented: in each school, two Kindergarten teachers implemented the PASMAP and another two implemented their regular program. The study shows that both groups of students made substantial gains on the ‘I Can Do Maths’ assessment and a Pattern and Structure Assessment (PASA) interview, but highly significant differences were found on the latter with PASMAP students outperforming the regular group on PASA scores. Qualitative analysis of students’ responses for structural development showed increased levels for the PASMAP students; those categorised as low ability developed improved structural responses over a relatively short period of time.
Abstract:
As the financial planning industry undergoes a series of reforms aimed at increased professionalism and improved quality of advice, financial planner training in Australia and elsewhere has begun to acknowledge the importance of interdisciplinary knowledge bases in informing both curriculum design and professional practice (e.g. FPA, 2009). This paper underscores the importance of the process of financial planning by providing a conceptual analysis of the six-step financial planning process using key mechanisms derived from theory and research in cognate disciplines such as psychology and well-being. The paper identifies how these mechanisms may operate to impact client well-being in the financial planning context. The conceptual mapping of the mechanisms to process elements of financial planning is a unique contribution to the financial planning literature and offers a further framework in the armamentarium of researchers interested in pursuing questions around the value of financial planning. The conceptual framework derived from the analysis also adds to the growing body of literature aimed at developing an integrated model of financial planning.
Abstract:
PURPOSE: This pilot project’s aim was to trial a tool and process for developing students’ ability to engage in self-assessment using reflection on their clinical experiences, including feedback from workplace learning, in order to aid them in linking theory to practice and to develop strategies to improve performance. BACKGROUND: In nursing education, students can experience a mismatch between performance and theoretical learning; this is referred to as the ‘theory-practice gap’ (Scully, 2011; Chan, Chan & Liu, 2011). One specific contributing factor seems to be students’ inability to engage in meaningful reflection and self-correcting behaviours. A self-assessment strategy was implemented within a third year clinical unit to ameliorate this mismatch, with encouraging results, as students developed self-direction in addressing learning needs. In this pilot project the above strategy was adapted for implementation across different clinical units, to create a whole-of-course approach to integrating workplace learning. METHOD: The methodology underpinning this project is a scaffolded, supported reflective practice process. Improved self-assessment skills are achieved by students reflecting on and engaging with feedback, then mapping this to learning outcomes to identify where performance can be improved. Evaluation of this project includes: collation of student feedback identifying successful strategies along with barriers encountered in implementation; feedback from students and teachers via the above processes and tools; and comparison of the number of learning contracts issued in clinical nursing units with similar cohorts. RESULTS: Results will be complete by May 2012 and include analysis of the data collected via the above evaluation methods. Other outcomes will include the refined process and tool, plus resources that should improve cost effectiveness without reducing student support.
CONCLUSION: Implementing these tools and processes across the student’s entire learning package will assist students to demonstrate progressive development through the course. Students will have learnt to understand feedback and to integrate these skills for life-long learning.
Abstract:
The numerical solution of stochastic differential equations (SDEs) has recently focused on the development of numerical methods with good stability and order properties. These numerical implementations have been made with fixed stepsize, but there are many situations in which a fixed stepsize is not appropriate. In the numerical solution of ordinary differential equations, much work has been carried out on developing robust implementation techniques using variable stepsize. It has been necessary, in the deterministic case, to consider the "best" choice for an initial stepsize, as well as to develop effective strategies for stepsize control; the same, of course, must be carried out in the stochastic case. In this paper, proportional integral (PI) control is applied to a variable stepsize implementation of an embedded pair of stochastic Runge-Kutta methods used to obtain numerical solutions of nonstiff SDEs. For stiff SDEs, the embedded pair of the balanced Milstein and balanced implicit method is implemented in variable stepsize mode using a predictive controller for the stepsize change. The extension of these stepsize controllers from a digital filter theory point of view via PI with derivative (PID) control will also be implemented. The implementations show the improvement in efficiency that can be attained when using these control theory approaches compared with the regular stepsize change strategy.
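The PI stepsize control mentioned above can be sketched in a few lines. The next stepsize is scaled by the current error estimate relative to the tolerance (the integral part) and by the ratio of consecutive error estimates (the proportional part). The gains, safety factor, and clipping bounds below are typical textbook choices, not the values used in the paper:

```python
# Hedged sketch of a PI stepsize controller for an adaptive solver.
# Gains k_i, k_p, the safety factor, and the growth limits are
# common illustrative defaults, not the paper's tuned values.
def pi_step_control(h, err, err_prev, tol,
                    k_i=0.3, k_p=0.1, safety=0.9):
    """Propose the next stepsize from the current and previous
    local error estimates against a tolerance tol."""
    if err == 0.0:
        return 2.0 * h  # error negligible: allow maximal growth
    factor = safety * (tol / err) ** k_i * (err_prev / err) ** k_p
    # Clip the change to damp oscillatory stepsize sequences.
    factor = min(2.0, max(0.2, factor))
    return h * factor
```

With `k_p = 0` this reduces to the classical (integral-only) controller; the proportional term reacts to the trend of the error, which is what gives PI control its smoother stepsize sequences.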
Abstract:
This paper reports a study that explored a new construct: ‘climate of fear’. We hypothesised that climate of fear would vary across work sites within organisations, but not across organisations. This is in contrast to measures of organisational culture, which were expected to vary both within and across organisations. To test our hypotheses, we developed a new 13-item measure of perceived fear in organisations and tested it in 20 sites across two organisations (N = 209). Culture variables measured were innovative leadership culture and communication culture. Results were that climate of fear did vary across sites in both organisations, while differences across organisations were not significant, as we anticipated. Organisational culture, however, varied between the organisations, and within one of the organisations. The climate of fear scale exhibited acceptable psychometric properties.
Abstract:
The content for the school science curriculum has always been an interplay or contest between the interests of a number of stakeholders, who have an interest in establishing it at a new level of schooling or in changing its current form. For most of its history, the interplay was dominated by the interests of academic scientists, but in the 1980s the needs of both future scientists and future citizens began to be more evenly balanced as science educators promoted a wider sense of science. The contest changed again in the 1990s, with super-ordinate control being exerted by government bureaucrats at the expense of the subject experts. This change coincides with the rise in a number of countries of a market view of education, and of science education in particular, accompanied by demands for public accountability via simplistic auditing measures. This shift from expertise to bureaucracy, and its consequences for the quality of science education, is illustrated with five case studies of science curriculum reform in Australia.
Abstract:
This paper proposes a framework to analyse performance on multiple choice questions with a focus on linguistic factors. Item Response Theory (IRT) is deployed to estimate ability and question difficulty levels. A logistic regression model is used to detect Differential Item Functioning questions. Probit models test the relationships between performance and linguistic factors, controlling for the effects of question construction and students’ background. The empirical results have important implications. The lexical density of stems affects performance. The use of non-Economics specialised vocabulary has differing impacts on the performance of students with different language backgrounds. The IRT-based ability and difficulty estimates help explain performance variations.
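The IRT machinery referred to above can be illustrated with the two-parameter logistic (2PL) model, in which the probability of a correct response depends on student ability and on the item's difficulty and discrimination. The abstract does not say which IRT model was fitted, so treat this as a generic example with assumed parameter names:

```python
# Illustrative 2PL IRT item response function. The model choice and
# parameter values are assumptions for demonstration; the paper's
# fitted model is not specified in the abstract.
import math


def p_correct(theta, a, b):
    """Probability that a student of ability theta answers correctly
    an item with discrimination a and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))
```

A student whose ability equals the item's difficulty (theta = b) has a 50% chance of a correct response, and higher discrimination a makes that probability change more sharply around theta = b; this is how IRT separates ability from question difficulty when explaining performance variations.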