114 results for Science teaching methods
in University of Queensland eSpace - Australia
Reflections on context-based science teaching: A case study of physics for students of physiotherapy
Abstract:
Conferences that deliver interactive sessions designed to enhance physician participation, such as role play, small discussion groups, workshops, hands-on training, problem- or case-based learning and individualised training sessions, are effective for physician education.
Abstract:
This website is linked to UNESCO.org and is free to download for educational purposes. It contains a database of school science experiments and investigations in physics, chemistry, biology, astronomy, geology, weather studies, and agriculture projects for primary and secondary schools, as well as sexuality education and drug education. It is based on a revision, updating and expansion of the "New UNESCO source book for science teaching", 1979 edition, UNESCO, Paris. It contains experiments from the "low cost" science teaching movement, simplified versions of classical experiments, experiments using locally available substances and kitchen chemicals, and environmental science. Some experiments anticipate experiments usually done in senior high school or college classes. The experiments should be "student-friendly" and "teacher-friendly" because there is no overwhelming technology. Enough theoretical background is included to remind teachers of the theoretical context of each experiment. Every experiment is based on materials listed in a modern commercial catalogue of chemicals and equipment for use by educational institutions. The procedures and safety standards are consistent with instructions issued by Education Queensland (Ministry of Education), State of Queensland, Australia.
Abstract:
University teaching is a diverse enterprise which encompasses a range of disciplines, curricula, teaching methods, learning tasks and learning approaches. Within this diversity, common themes and issues which reflect academics' understanding of effective teaching may be discerned. Drawing on written data collected from 708 practising teachers who were nominated by their Heads or Deans as exhibiting exemplary teaching practice, and from interviews conducted with 44 of these, a number of these themes and issues are identified and illustrated. The findings offer insights which may stimulate further reflection on, and discussion of, the quality of teaching in higher education.
Abstract:
Purpose. To conduct a controlled trial of traditional and problem-based learning (PBL) methods of teaching epidemiology. Method. All second-year medical students (n = 136) at The University of Western Australia Medical School were offered the chance to participate in a randomized controlled trial of teaching methods for an epidemiology course. Students who consented to participate (n = 80) were randomly assigned to either a PBL or a traditional course. Students who did not consent or did not return the consent form (n = 56) were assigned to the traditional course. Students in both streams took identical quizzes and exams. These scores, a collection of semi-quantitative feedback from all students, and a qualitative analysis of interviews with a convenience sample of six students from each stream were compared. Results. There was no significant difference in performances on quizzes or exams between PBL and traditional students. Students using PBL reported a stronger grasp of epidemiologic principles, enjoyed working with a group, and, at the end of the course, were more enthusiastic about epidemiology and its professional relevance to them than were students in the traditional course. PBL students worked more steadily during the semester but spent only marginally more time on the epidemiology course overall. Interviews corroborated these findings. Non-consenting students were older (p < 0.02) and more likely to come from non-English-speaking backgrounds (p < 0.005). Conclusions. PBL provides an academically equivalent but personally far richer learning experience. The adoption of PBL approaches to medical education makes it important to study whether PBL presents particular challenges for students whose first language is not the language of instruction.
Abstract:
Taking functional programming to its extremities in search of simplicity still requires integration with other development methods (e.g. formal methods). Induction is the key to deriving and verifying functional programs, but it can be simplified by packaging proofs with functions, particularly folds, on data structures. Totally Functional Programming (TFP) avoids the complexities of interpretation by directly representing data structures as platonic combinators - the functions characteristic of the data. The link between the two simplifications is that platonic combinators are a kind of partially-applied fold, which means that platonic combinators inherit fold-theoretic properties, but with some apparent simplifications due to the platonic combinator representation. However, although experience within functional programming suggests that TFP is widely applicable, significant work remains before TFP as such could be widely adopted.
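The idea of representing data as its own characteristic fold can be sketched in a few lines. The following is an illustrative Python sketch, not from the paper (which concerns functional languages): a Church-style natural number *is* the combinator that folds a function over a starting value n times. All names (`zero`, `succ`, `to_int`) are assumptions for illustration.

```python
# Sketch: data represented by its characteristic fold (a "platonic combinator").
# A natural number n is the function n(f, x) = f(f(...f(x))) with n applications.

def zero(f, x):
    # folding zero times just returns the start value
    return x

def succ(n):
    # successor wraps one more application of f around n's fold
    return lambda f, x: f(n(f, x))

def add(m, n):
    # addition: fold succ over m, starting from n
    return m(succ, n)

def to_int(n):
    # interpret the combinator by folding (+1) over 0
    return n(lambda k: k + 1, 0)

two = succ(succ(zero))
three = succ(two)
print(to_int(add(two, three)))  # 5
```

Note that no pattern matching or interpretation of a data structure occurs anywhere: every operation is expressed as an application of the combinator itself, which is the simplification the abstract attributes to TFP.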
Abstract:
The cost of spatial join processing can be very high because of the large sizes of spatial objects and the computation-intensive spatial operations. While parallel processing seems a natural solution to this problem, it is not clear how spatial data can be partitioned for this purpose. Various spatial data partitioning methods are examined in this paper. A framework combining the data-partitioning techniques used by most parallel join algorithms in relational databases and the filter-and-refine strategy for spatial operation processing is proposed for parallel spatial join processing. Object duplication caused by multi-assignment in spatial data partitioning can result in extra CPU cost as well as extra communication cost. We find that the key to overcoming this problem is to preserve spatial locality in task decomposition. We show in this paper that a near-optimal speedup can be achieved for parallel spatial join processing using our new algorithms.
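The partitioning and filter steps described above can be illustrated with a minimal sketch. This is not the paper's algorithm, only a generic grid-partitioned spatial join on bounding boxes: an object overlapping several cells is assigned to each (multi-assignment), so joined pairs must be de-duplicated, which is exactly the duplication cost the abstract discusses. All names and the cell size are illustrative assumptions.

```python
# Sketch: grid partitioning plus the "filter" step of filter-and-refine.
# Objects are axis-aligned bounding boxes (minx, miny, maxx, maxy).

def cells_for(box, cell=10):
    # every grid cell the box overlaps (multi-assignment duplicates the object)
    x0, y0, x1, y1 = box
    for cx in range(int(x0 // cell), int(x1 // cell) + 1):
        for cy in range(int(y0 // cell), int(y1 // cell) + 1):
            yield (cx, cy)

def partition(objs, cell=10):
    grid = {}
    for oid, box in objs.items():
        for c in cells_for(box, cell):
            grid.setdefault(c, []).append((oid, box))
    return grid

def overlaps(a, b):
    return a[0] <= b[2] and b[0] <= a[2] and a[1] <= b[3] and b[1] <= a[3]

def spatial_join(r, s, cell=10):
    gr, gs = partition(r, cell), partition(s, cell)
    result = set()  # the set removes duplicate pairs caused by multi-assignment
    for c, r_objs in gr.items():
        for rid, rbox in r_objs:
            for sid, sbox in gs.get(c, []):
                if overlaps(rbox, sbox):    # filter on bounding boxes;
                    result.add((rid, sid))  # refine on exact geometry omitted
    return result

r = {"r1": (0, 0, 12, 5), "r2": (30, 30, 35, 35)}
s = {"s1": (11, 4, 20, 8), "s2": (100, 100, 110, 110)}
print(spatial_join(r, s))  # {('r1', 's1')}
```

In a parallel setting each grid cell (or group of nearby cells, to preserve spatial locality) would become one task; the per-cell join loop above is the work a single processor performs.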
Abstract:
Previous work on generating state machines for the purpose of class testing has not been formally based. There has also been work on deriving state machines from formal specifications for testing non-object-oriented software. We build on this work by presenting a method for deriving a state machine for testing purposes from a formal specification of the class under test. We also show how the resulting state machine can be used as the basis for a test suite developed and executed using an existing framework for class testing. To derive the state machine, we identify the states and possible interactions of the operations of the class under test. The Test Template Framework is used to formally derive the states from the Object-Z specification of the class under test. The transitions of the finite state machine are calculated from the derived states and the class's operations. The formally derived finite state machine is transformed to a ClassBench testgraph, which is used as input to the ClassBench framework to test a C++ implementation of the class. The method is illustrated using a simple bounded queue example.
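The shape of such a state-machine-based class test can be sketched for the bounded queue example. The sketch below is illustrative only: the paper derives the states formally from an Object-Z specification via the Test Template Framework, whereas here the states ("EMPTY", "PARTIAL", "FULL") and transitions are written out by hand, and all names are assumptions.

```python
# Sketch: testing a bounded queue implementation against a hand-written
# finite state machine, in the spirit of a ClassBench-style testgraph walk.

class BoundedQueue:
    def __init__(self, cap):
        self.cap, self.items = cap, []
    def enqueue(self, x):
        self.items.append(x)
    def dequeue(self):
        return self.items.pop(0)

def state_of(q):
    # abstraction function: map the concrete object to an FSM state
    if not q.items:
        return "EMPTY"
    return "FULL" if len(q.items) == q.cap else "PARTIAL"

# expected transitions (state, operation) -> next state, for capacity 2
TRANSITIONS = {
    ("EMPTY", "enqueue"): "PARTIAL",
    ("PARTIAL", "enqueue"): "FULL",
    ("FULL", "dequeue"): "PARTIAL",
    ("PARTIAL", "dequeue"): "EMPTY",
}

def run_testgraph(path):
    # drive the implementation along a path, checking each expected state
    q = BoundedQueue(cap=2)
    for op in path:
        expected = TRANSITIONS[(state_of(q), op)]
        q.enqueue(0) if op == "enqueue" else q.dequeue()
        assert state_of(q) == expected, (op, expected, state_of(q))
    return "ok"

print(run_testgraph(["enqueue", "enqueue", "dequeue", "dequeue"]))  # ok
```

A testgraph is then just a set of such paths chosen to cover every transition; an implementation bug (say, a queue that silently drops elements when full) would surface as a mismatch between the observed and expected state.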
Abstract:
In this and a preceding paper, we provide an introduction to the Fujitsu VPP range of vector-parallel supercomputers and to some of the computational chemistry software available for the VPP. Here, we consider the implementation and performance of seven popular chemistry application packages. The codes discussed range from classical molecular dynamics to semiempirical and ab initio quantum chemistry. All have evolved from sequential codes, and have typically been parallelised using a replicated data approach. As such they are well suited to the large-memory/fast-processor architecture of the VPP. For one code, CASTEP, a distributed-memory data-driven parallelisation scheme is presented. (C) 2000 Published by Elsevier Science B.V. All rights reserved.
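The replicated-data approach mentioned above can be shown in miniature. This is a generic illustration, not code from any of the seven packages: every worker holds a complete copy of the coordinates and computes the pairwise-interaction terms for its own strided slice of atoms, after which the partial sums are combined. Real codes distribute the workers over processors; here they run in a loop, and all names are illustrative.

```python
# Sketch of replicated-data parallelisation of a pairwise energy sum.

def pair_energy(xi, xj):
    # toy inverse-distance interaction between two 1-D "atoms"
    return 1.0 / abs(xi - xj)

def worker(coords, rank, nworkers):
    # replicated data: 'coords' is the complete array on every worker;
    # each worker owns the atoms i = rank, rank + nworkers, ...
    e = 0.0
    for i in range(rank, len(coords), nworkers):
        for j in range(i + 1, len(coords)):
            e += pair_energy(coords[i], coords[j])
    return e

def total_energy(coords, nworkers=4):
    # combining partial sums stands in for the global reduction step
    return sum(worker(coords, r, nworkers) for r in range(nworkers))

coords = [0.0, 1.0, 2.5, 4.0]
serial = sum(pair_energy(coords[i], coords[j])
             for i in range(len(coords)) for j in range(i + 1, len(coords)))
assert abs(total_energy(coords) - serial) < 1e-12
```

The scheme's appeal for a machine like the VPP is visible even in the sketch: each worker needs the full data set in memory but communicates only one scalar at the end, matching a large-memory/fast-processor architecture.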
Abstract:
In this paper we present a model of specification-based testing of interactive systems. This model provides the basis for a framework to guide such testing. Interactive systems are traditionally decomposed into a functionality component and a user interface component; this distinction is termed dialogue separation and is the underlying basis for conceptual and architectural models of such systems. Correctness involves both proper behaviour of the user interface and proper computation by the underlying functionality. Specification-based testing is one method used to increase confidence in correctness, but it has had limited application to interactive system development to date.
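Dialogue separation, and why it helps testing, can be sketched in a few lines. This is an illustrative toy, not the paper's model: the functionality component is a plain object, the user-interface component only translates events into calls on it, and specification-based checks target each side independently. All class and method names are assumptions.

```python
# Sketch: dialogue separation in an interactive system.

class Counter:                 # functionality component
    def __init__(self):
        self.value = 0
    def increment(self):
        self.value += 1

class CounterUI:               # user-interface (dialogue) component
    def __init__(self, model):
        self.model = model
    def handle(self, event):
        # translate a UI event into a call on the functionality,
        # then return the rendered view
        if event == "click":
            self.model.increment()
        return f"count = {self.model.value}"

# specification-based check of the functionality alone:
c = Counter()
c.increment(); c.increment()
assert c.value == 2            # spec: n increments yield value n

# check of the dialogue: events map to the right calls and view
ui = CounterUI(Counter())
assert ui.handle("click") == "count = 1"
```

Because the two components meet only at the model's interface, correctness splits exactly as the abstract describes: proper computation by the functionality, and proper behaviour of the interface that drives it.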
Abstract:
Normal mixture models are being increasingly used to model the distributions of a wide variety of random phenomena and to cluster sets of continuous multivariate data. However, for a set of data containing a group or groups of observations with longer than normal tails or atypical observations, the use of normal components may unduly affect the fit of the mixture model. In this paper, we consider a more robust approach by modelling the data by a mixture of t distributions. The use of the ECM algorithm to fit this t mixture model is described and examples of its use are given in the context of clustering multivariate data in the presence of atypical observations in the form of background noise.
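The robustness of t components can be seen in a small sketch of the iteration. This follows the standard EM/ECM updates for a t mixture in simplified form (univariate, two components, degrees of freedom held fixed; the paper's setting is multivariate and also estimates nu), and all names are illustrative. The key quantity is the per-point weight u: an atypical point far from a component's mean gets a small u and so barely pulls that mean.

```python
import math

# Sketch: ECM-style fitting of a two-component univariate t mixture.

def t_logpdf(x, mu, sigma2, nu):
    d = (x - mu) ** 2 / sigma2
    return (math.lgamma((nu + 1) / 2) - math.lgamma(nu / 2)
            - 0.5 * math.log(nu * math.pi * sigma2)
            - (nu + 1) / 2 * math.log(1 + d / nu))

def fit_t_mixture(xs, mus, sigma2s, pis, nu=4.0, iters=50):
    for _ in range(iters):
        # E-step: responsibilities tau and robustness weights u
        taus, us = [], []
        for x in xs:
            logp = [math.log(p) + t_logpdf(x, m, s2, nu)
                    for p, m, s2 in zip(pis, mus, sigma2s)]
            mx = max(logp)
            w = [math.exp(lp - mx) for lp in logp]
            tot = sum(w)
            taus.append([wi / tot for wi in w])
            us.append([(nu + 1) / (nu + (x - m) ** 2 / s2)
                       for m, s2 in zip(mus, sigma2s)])
        # CM-steps: update pi, mu, sigma^2 for each component in turn
        n = len(xs)
        for j in range(len(mus)):
            tj = sum(t[j] for t in taus)
            pis[j] = tj / n
            tu = sum(t[j] * u[j] for t, u in zip(taus, us))
            mus[j] = sum(t[j] * u[j] * x
                         for t, u, x in zip(taus, us, xs)) / tu
            sigma2s[j] = sum(t[j] * u[j] * (x - mus[j]) ** 2
                             for t, u, x in zip(taus, us, xs)) / tj
    return mus, sigma2s, pis

# two clear clusters near 0 and 10, plus one atypical observation at 30
data = [-1, -0.5, 0, 0.5, 1, 9, 9.5, 10, 10.5, 11, 30]
mus, s2, pis = fit_t_mixture(data, mus=[0.0, 10.0],
                             sigma2s=[1.0, 1.0], pis=[0.5, 0.5])
```

With normal components the outlier at 30 would inflate the second component's mean and variance; with t components its weight u is small, so the fitted means stay near 0 and 10.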