Abstract:
Concerns raised in educational reports about school science, in terms of students' outcomes and attitudes as well as science teaching practices, prompted investigation into science learning and teaching practices at the foundational level of school science. Without science content and process knowledge, understanding issues of modern society and active participation in decision-making is difficult. This study contended that a focus on the development of the language of science could enable learners to engage more effectively in learning science and enhance their interest and attitudes towards science. Furthermore, it argued that explicit teaching practices, where science language is modelled and scaffolded, would facilitate the learning of science by young children at the beginning of their formal schooling. This study aimed to investigate science language development at the foundational level of school science learning in the preparatory school with students aged five and six years. It focussed on the language of science and science teaching practices in early childhood. In particular, the study focussed on the capacity of young students to engage with and understand science language. Previous research suggests that students have difficulty with the language of science, most likely because of its complexities and ambiguities. Furthermore, the literature indicates that tensions arise between traditional science teaching practices and accepted early childhood teaching practices. This contention prompted investigation into means and models of pedagogy for learning foundational science language, knowledge and processes in early childhood. The study was positioned within qualitative assumptions of research and is reported via descriptive case study. It was located in a preparatory-school classroom, where the class teacher, teacher-aide and nineteen students aged four and five years participated with the researcher in the study. Basil Bernstein's pedagogical theory, coupled with Halliday's Systemic Functional Linguistics (SFL), framed an examination of science pedagogical practices for early childhood science learning. Students' science learning outcomes were gauged by focussing a Hallidayan lens on their oral and reflective language during 12 science-focussed episodes of teaching. Data were collected throughout the 12 episodes and included video- and audio-taped science activities, student artefacts, journal and anecdotal records, semi-structured interviews and photographs. Data were analysed according to Bernstein's visible and invisible pedagogies and performance and competence models. Additionally, Halliday's SFL provided the resource to examine teacher and student language to determine teacher/student interpersonal relationships as well as the specialised science and everyday language used in teacher and student science talk. This analysis established the socio-linguistic characteristics that promoted science competencies in young children and identified those teaching practices that facilitate young children's acquisition of science meanings. Positive indications for modelling science language and science text types to young children emerged. Teaching within the studied setting diverged from perceived notions of common early childhood practices, and the benefits of dynamic shifting pedagogies were validated.
Significantly, the young students demonstrated use of particular specialised components of school-science language, in terms of science language features and vocabulary. Their use of language also demonstrated knowledge of science concepts, processes and text types. The young students made sense of science phenomena by incorporating a variety of science language and text types into explanations during both teacher-directed and independent situations. The study informs early childhood science practices as well as practices for foundational school science teaching and learning. It exposes implications for science education policy, curriculum and practices, and supports other findings in relation to the capabilities of young students. The study contributes to Systemic Functional Linguistic theory through the development of a specific resource for determining the technicality of teacher language used in teaching young students. Furthermore, it contributes to methodological practices relating to Bernsteinian theoretical perspectives and demonstrates new ways of depicting and reporting teaching practices, providing an analytical tool that couples Bernsteinian and Hallidayan theoretical perspectives. Ultimately, it defines directions for further research in terms of foundational science language learning, ongoing learning of the language of science and of learning science, science teaching and learning practices (specifically in foundational school science), and relationships between home and school science language experiences.
Abstract:
The stochastic simulation algorithm was introduced by Gillespie and, in a different form, by Kurtz. There have been many attempts at accelerating the algorithm without deviating from the behavior of the simulated system. The crux of the explicit τ-leaping procedure is the use of Poisson random variables to approximate the number of occurrences of each type of reaction event during a carefully selected time period, τ. This method is acceptable provided the leap condition (that no propensity function changes "significantly" during any time-step) is met. Using this method, there is a possibility that species numbers can artificially become negative. Several recent papers have demonstrated methods that avoid this situation. One such method classifies as critical those reactions in danger of sending species populations negative; at most one of these critical reactions is allowed to occur in the next time-step. We argue that the criticality of a reactant species and its dependent reaction channels should be related to the probability of the species number becoming negative. This way, only reactions that, if fired, produce a high probability of driving a reactant population negative are labeled critical. The number of firings of more reaction channels can then be approximated using Poisson random variables, speeding up the simulation while maintaining accuracy. In implementing this revised method of criticality selection, we make use of the probability distribution from which the random variable describing the change in species number is drawn. We give several numerical examples to demonstrate the effectiveness of our new method.
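Below is a minimal Python sketch of one explicit τ-leaping step with a critical/non-critical partition of reaction channels. The two-reaction system, rate constants, and the threshold-based criticality test are illustrative only; the abstract's proposed refinement replaces that threshold test with a probability-based criterion, which is noted in the comments but not implemented here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative system (not from the paper): A + B -> C,  C -> A + B
V = np.array([[-1, +1],    # change in A per firing of each reaction
              [-1, +1],    # change in B
              [+1, -1]])   # change in C

def propensities(x, k=(0.01, 0.1)):
    a1 = k[0] * x[0] * x[1]   # A + B -> C
    a2 = k[1] * x[2]          # C -> A + B
    return np.array([a1, a2])

def classify_critical(x, V, n_c=10):
    """Standard partition: reaction j is critical if it can fire fewer than
    n_c times before exhausting some reactant. (The paper argues for relating
    criticality to the probability of going negative instead.)"""
    critical = np.zeros(V.shape[1], dtype=bool)
    for j in range(V.shape[1]):
        consumed = -V[:, j] > 0
        if consumed.any():
            L_j = np.min(x[consumed] // (-V[consumed, j]))
            critical[j] = L_j < n_c
    return critical

def tau_leap_step(x, tau):
    a = propensities(x)
    crit = classify_critical(x, V)
    k = np.zeros_like(a)
    # Non-critical channels: Poisson-distributed firing counts over tau.
    k[~crit] = rng.poisson(a[~crit] * tau)
    # Critical channels: at most one firing, chosen by an SSA-like draw.
    a_c = a[crit].sum()
    if a_c > 0 and rng.exponential(1.0 / a_c) < tau:
        j = rng.choice(np.flatnonzero(crit), p=a[crit] / a_c)
        k[j] = 1
    # A full implementation would also check the leap condition and
    # reject/shrink tau if any species would still go negative.
    return x + V @ k.astype(int)

x = np.array([100, 100, 0])
for _ in range(50):
    x = tau_leap_step(x, tau=0.05)
print(x)
```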
Abstract:
International law's capacity to influence state behaviour by regulating recourse to violence has been a longstanding source of debate among international lawyers and political scientists. On the one hand, sceptics assert that frequent violations of the prohibition on the use of force have rendered article 2(4) of the UN Charter redundant. They contend that national self-interest, rather than international law, is the key determinant of state behaviour regarding the use of force. On the other hand, defenders of article 2(4) argue, first, that most states comply with the Charter framework, and second, that state rhetoric continues to acknowledge the existence of the jus ad bellum. In particular, the fact that violators go to considerable lengths to offer legal or factual justifications for their conduct, typically by relying on the right of self-defence, is advanced as evidence that the prohibition on the use of force retains legitimacy in the eyes of states. This paper identifies two potentially significant features of state practice since 2006 which may signal a shift in states' perceptions of the normative authority of article 2(4). The first feature is the recent failure by several states to offer explicit legal justifications for their use of force, or to report action taken in self-defence to the Security Council in accordance with Article 51. Four incidents linked to the global "war on terror" are examined here: Israeli airstrikes in Syria in 2007 and in Sudan in 2009, Turkey's 2006-2008 incursions into northern Iraq, and Ethiopia's 2006 intervention in Somalia. The second, more troubling feature is the international community's apparent lack of concern over the legality of these incidents. Each use of force is difficult to reconcile with the strict requirements of the jus ad bellum; yet none attracted genuine legal scrutiny or debate among other states. While it is too early to conclude that these relatively minor incidents presage long-term shifts in state practice, viewed together the two developments identified here suggest a possible downgrading of the role of international law in discussions over the use of force, at least in conflicts linked to the "war on terror". This, in turn, may represent a declining perception of the normative authority of the jus ad bellum, and a concomitant admission of the limits of international law in regulating violence.
Abstract:
Assurance of learning is a predominant feature in both quality enhancement and quality assurance in higher education. It is a process that articulates explicit program outcomes and standards, and systematically gathers evidence to determine the extent to which performance matches expectations. Benefits accrue to the institution through the systematic assessment of whole-of-program goals: data may be used for continuous improvement, program development, and to inform external accreditation and evaluation bodies. Recent developments, including the introduction of the Tertiary Education Quality and Standards Agency (TEQSA), will require universities to review the methods they use to assure learning outcomes. This project investigates two critical elements of assurance of learning: 1. the mapping of graduate attributes throughout a program; and 2. the collection of assurance of learning data. An audit was conducted with 25 of the 39 Business Schools in Australian universities to identify current methods of mapping graduate attributes and of collecting assurance of learning data across degree programs, as well as to review the key challenges faced in these areas. Our findings indicate that external drivers such as professional body accreditation (for example, the Association to Advance Collegiate Schools of Business (AACSB)) and TEQSA are important motivators for assuring learning, and that schools undertaking AACSB accreditation had more robust assurance of learning systems in place. It was reassuring to see that the majority of institutions (96%) had adopted an embedding approach to assuring learning rather than opting for independent standardised testing. The main challenges evident were the development of sustainable processes that are not considered a burden to academic staff, and obtaining academic buy-in to the benefits of assuring learning itself, rather than assurance of learning being seen as a tick-box exercise. This cultural change is the real challenge in assurance of learning practice.
Abstract:
To address issues of divisive ideologies in the Mathematics Education community and to subsequently advance educational practice, an alternative theoretical framework and operational model is proposed which represents a consilience of constructivist learning theories whilst acknowledging the objective but improvable nature of domain knowledge. Based upon Popper's three-world model of knowledge, the proposed theory supports the differentiation and explicit modelling of both shared domain knowledge and idiosyncratic personal understanding using a visual nomenclature. The visual nomenclature embodies Piaget's notion of reflective abstraction and so may support an individual's experience-based transformation of personal understanding with regard to shared domain knowledge. Using the operational model and visual nomenclature, seminal literature regarding early-number counting and addition was analysed and described. Exemplars of the resultant visual artefacts demonstrate the proposed theory's viability as a tool with which to characterise the reflective abstraction-based organisation of a domain's shared knowledge. Building on such a description of knowledge, future research needs to consider the refinement of the operational model and visual nomenclature to include the analysis, description and scaffolded transformation of personal understanding. A detailed model of knowledge and understanding may then underpin the future development of educational software tools such as computer-mediated teaching and learning environments.
Abstract:
High fidelity simulation as a teaching and learning approach is being embraced by many schools of nursing. Our school embarked on integrating high fidelity (HF) simulation into the undergraduate clinical education program in 2011. Low and medium fidelity simulation had been used for many years, but this did not simplify the integration of HF simulation. Alongside considerations of how and where HF simulation would be integrated, issues arose with: student consent and participation for observed activities; data management of video files; staff development; and conceptualising how methods for student learning could be researched. Simulation for undergraduate student nurses commenced as a formative learning activity, undertaken in groups of eight, where four students undertake the 'doing' role and four are structured observers, who then take a formal role in the simulation debrief. Challenges for integrating simulation into student learning included conceptualising and developing scenarios to trigger students' decision making and the application of skills, knowledge and attitudes explicit to solving clinical 'problems'. Developing and planning scenarios for students to 'try out' skills and make decisions for problem solving lay beyond choosing the pre-existing scenarios built into the software: the supplied scenarios were not concept based but rather focussed on knowledge, skills and the technology of the manikin. The challenge lay in using the technology for the purpose of building conceptual mastery rather than using technology simply because it was available. As we integrated HF simulation into the final year of the program, the focus was on building skills, knowledge and attitudes that went beyond technical skill and provided an opportunity to bridge the gap with theory-based knowledge that students often found difficult to link to clinical reality. We wished to provide opportunities to develop experiential knowledge based on application and clinical reasoning processes in team environments where problems are encountered and, to solve them, the nurse must show leadership and direction. Other challenges included students consenting for simulations to be videotaped and the ethical considerations of this. For example, if one student in a group of eight did not consent, did this mean they missed the opportunity to undertake simulation, or that others in the group might be disadvantaged by being unable to review their performance? This has implications not only for freely given consent but also for equity of access to learning opportunities for students who wished to be taped and those who did not. Alongside this issue were the details of data management, storage and access. Developing staff with varying levels of computer skills to use the software and to undertake a different approach to being the 'teacher' required innovation, and we took an experiential approach. Considering explicit learning approaches to be trialled was not a difficult proposition, but considering how to enact this as research, with issues of blinding, timetabling of blinded groups, and reducing bias when testing the results of different learning approaches, along with gaining ethical approval, was problematic. This presentation presents examples of these challenges and how we overcame them.
Abstract:
Background: We investigated the geographical variation of water supply and sanitation indicators (WS&S) and their role in the risk of schistosomiasis and hookworm infection in school-age children in West Africa. The aim was to predict large-scale geographical variation in WS&S, quantify the attributable risk of S. haematobium, S. mansoni and hookworm infections due to WS&S, and identify communities where sustainable transmission control could be targeted across the region.
Methods: National cross-sectional household-based demographic health surveys were conducted in 24,542 households in Burkina Faso, Ghana and Mali in 2003–2006. We generated spatially explicit predictions of areas without piped water, toilet facilities and finished floors in West Africa, adjusting for household covariates. Using recently published helminth prevalence data, we developed Bayesian geostatistical models of S. haematobium, S. mansoni and hookworm infection in West Africa, including environmental covariates and the mapped outputs for WS&S. Using these models we estimated the effect of WS&S on parasite risk, quantified their attributable fraction of infection, and mapped the risk of infection in West Africa.
Findings: Our maps show that most areas in West Africa are very poorly served by water supply except in major urban centers. There is better geographical coverage for toilet availability and improved household flooring. We estimated smaller attributable risks for water supply in S. mansoni (47%) compared to S. haematobium (71%), and 5% of hookworm cases could be averted by improving sanitation. Greater levels of inadequate sanitation increased the risk of schistosomiasis, and greater levels of unsafe water supply increased the risk of hookworm. The role of floor type for S. haematobium infection (21%) was comparable to that for S. mansoni (16%), but was significantly higher for hookworm infection (86%). The S. haematobium and hookworm maps accounting for WS&S show smaller clusters of maximal prevalence in areas bordering Burkina Faso and Mali. The map of S. mansoni shows that this parasite is much more widespread across the north of the Niger River basin than previously predicted.
Interpretation: Our maps identify areas where the Millennium Development Goal for water and sanitation is lagging behind. Our results show that WS&S are important contributors to the burden of major helminth infections of children in West Africa. Including information about WS&S as well as the "traditional" environmental risk factors in spatial models of helminth risk yielded a substantial gain both in model fit and in explaining the proportion of spatial variance in helminth risk. Mapping the distribution of infection risk adjusted for WS&S allowed the identification of communities in West Africa where integrative preventive chemotherapy and engineering interventions will yield the greatest public health benefits.
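As a small illustration of the attributable-fraction idea used above, the sketch below applies Levin's population attributable fraction formula in Python. The exposure prevalence and relative risk values are placeholders, not the study's estimates, and the study's actual Bayesian geostatistical machinery is far richer than this single formula.

```python
def attributable_fraction(exposure_prevalence, relative_risk):
    """Population attributable fraction (Levin's formula): the proportion of
    cases that could be averted if the exposure (e.g. unsafe water, lack of
    sanitation) were removed from the population."""
    p, rr = exposure_prevalence, relative_risk
    return p * (rr - 1.0) / (1.0 + p * (rr - 1.0))

# Illustrative values only -- not the study's fitted estimates.
print(attributable_fraction(exposure_prevalence=0.8, relative_risk=2.5))
```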
Abstract:
AIMS: This paper reports on the implementation of a research project that trials an educational strategy implemented over six months of an undergraduate third-year nursing curriculum. The project aims to explore the effectiveness of 'think aloud' as a strategy for learning clinical reasoning for students in simulated clinical settings.
BACKGROUND: Nurses are required to apply and utilise critical thinking skills to enable clinical reasoning and problem solving in the clinical setting [1]. Nursing students are expected to develop and display clinical reasoning skills in practice, but may struggle to articulate the reasons behind decisions about patient care. For students learning to manage complex clinical situations, teaching approaches are required that make these instinctive cognitive processes explicit and clear [2-5]. In line with professional expectations, third-year nursing students at Queensland University of Technology (QUT) are expected to display clinical reasoning skills in practice. This can be a complex proposition for students in practice situations, particularly as the degree of uncertainty or decision complexity increases [6-7]. The 'think aloud' approach is an innovative learning/teaching method which can create an environment suitable for developing clinical reasoning skills in students [4, 8]. This project aims to use the 'think aloud' strategy within a simulation context to provide a safe learning environment in which third-year students are assisted to uncover the cognitive approaches that best assist them to make effective patient care decisions, and to improve their confidence, clinical reasoning and active critical reflection on their practice.
METHODS: In Semester 2, 2011 at QUT, third-year nursing students will undertake high fidelity simulation, some for the first time, commencing in September 2011. There will be two cohorts for strategy implementation (group 1: uses think aloud as a strategy within the simulation; group 2: not given a specific strategy beyond nursing assessment frameworks) in relation to problem solving patient needs. Students will be briefed about the scenario, given a nursing handover, and placed into a simulation group and an observer group; the facilitator/teacher will run the simulation from a control room and will not have contact (as a 'teacher') with students during the simulation. Debriefing will then occur as a whole group outside the simulation room, where the session can be reviewed on screen. The think aloud strategy will be described to students in their pre-simulation briefing, allowing for clarification of the strategy at that time. All other aspects of the simulations remain the same (resources, suggested nursing assessment frameworks, simulation session duration, size of simulation teams, preparatory materials).
RESULTS: The methodology of the project and the challenges of implementation will be the focus of this presentation. This will include ethical considerations in designing the project, recruitment of students, and implementation of a voluntary research project within a busy educational curriculum which in third year targets 669 students over two campuses.
CONCLUSIONS: In an environment of increasingly constrained clinical placement opportunities, exploration of alternative strategies to improve critical thinking skills and develop clinical reasoning and problem solving for nursing students is imperative in preparing nurses to respond to changing patient needs.
References:
1. Lasater, K. High-fidelity simulation and the development of clinical judgement: students' experiences. Journal of Nursing Education, 2007; 46(6): 269-276.
2. Lapkin, S., et al. Effectiveness of patient simulation manikins in teaching clinical reasoning skills to undergraduate nursing students: a systematic review. Clinical Simulation in Nursing, 2010; 6(6): e207-e222.
3. Kaddoura, M. New graduate nurses' perceptions of the effects of clinical simulation on their critical thinking, learning, and confidence. The Journal of Continuing Education in Nursing, 2010; 41(11): 506.
4. Banning, M. The think aloud approach as an educational tool to develop and assess clinical reasoning in undergraduate students. Nurse Education Today, 2008; 28: 8-14.
5. Porter-O'Grady, T. Profound change: 21st century nursing. Nursing Outlook, 2001; 49(4): 182-186.
6. Andersson, A.K., Omberg, M., and Svedlund, M. Triage in the emergency department: a qualitative study of the factors which nurses consider when making decisions. Nursing in Critical Care, 2006; 11(3): 136-145.
7. O'Neill, E.S., Dluhy, N.M., and Chin, C. Modelling novice clinical reasoning for a computerized decision support system. Journal of Advanced Nursing, 2005; 49(1): 68-77.
8. Lee, J.E. and Ryan-Wenger, N. The "Think Aloud" seminar for teaching clinical reasoning: a case study of a child with pharyngitis. Journal of Pediatric Health Care, 1997; 11(3): 101-110.
Abstract:
Social networks have proven to be an attractive avenue of investigation for researchers, since humans are social creatures. A large body of literature has explored the term "social networks" from different perspectives and in diverse research fields. With the popularity of the Internet, social networking has taken on a new dimension, and online social communities have become an emerging social avenue for people to communicate in today's information age. People use online social communities to share their interests, maintain friendships, and extend their so-called circle of "friends". Social capital, likewise, is an important theory in sociology, and researchers usually draw on social capital theory when investigating topics relating to social networks. However, little literature provides explicit and strong assertions in this research area, owing to the complexity of social capital. This thesis therefore focuses on providing a better understanding of the relationship between social capital and online social communities. To enhance the value within the scope of this analysis, an online survey was conducted to examine the effects of the dimensions of social capital (relational capital, structural capital, and cognitive capital) on the intensity of using online social communities. The data were derived from a total of 350 self-selected respondents who completed an online survey during the research period. The main results indicate that social capital exists in online social communities under normal circumstances. The thesis presents three contributions for both theory and practice in Chapter 5. First, the main results contribute to the understanding of connectivity in the interrelationships of individual social capital exchange within online social networks. Second, social trust was found to have a weak effect in influencing the intensity of individuals' use of online social communities. Third, the perpetual role of information sharing has an indirect influence on individual users participating in online social communities. This study also benefits online marketing consultants, as marketers can gain consumer information more easily from online social communities, and this understanding assists in designing effective communication within online social communities. The cross-sectional design, the reliability of Internet survey data, and sampling issues are the three major limitations of this research. The thesis provides a new research model and recommends that the mediating effects, the privacy paradox, and social trust in online social communities be further explored in future research.
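The abstract does not specify the modelling approach used, so the sketch below only illustrates, with synthetic data, the general kind of analysis implied: regressing intensity of online community use on the three social capital dimensions. The variable names, Likert-style scales and coefficients are invented for illustration and are not the thesis's data or model.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 350  # the survey collected 350 self-selected respondents

# Synthetic stand-ins for the three social-capital dimensions (1-7 scores).
relational = rng.uniform(1, 7, n)
structural = rng.uniform(1, 7, n)
cognitive = rng.uniform(1, 7, n)
# Hypothetical outcome: intensity of online social community use.
intensity = 1.0 + 0.5 * relational + 0.3 * structural + 0.2 * cognitive + rng.normal(0, 1, n)

X = sm.add_constant(np.column_stack([relational, structural, cognitive]))
model = sm.OLS(intensity, X).fit()
print(model.summary())  # coefficients indicate each dimension's estimated effect
```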
Abstract:
This paper reports a 2-year longitudinal study on the effectiveness of the Pattern and Structure Mathematical Awareness Program (PASMAP) on students' mathematical development. The study involved 316 Kindergarten students in 17 classes from four schools in Sydney and Brisbane. The development of the PASA assessment interview and scale is presented. The intervention program provided explicit instruction in mathematical pattern and structure that enhanced the development of students' spatial structuring, multiplicative reasoning, and emergent generalisations. This paper presents the initial findings on the impact of PASMAP and illustrates students' structural development.
Abstract:
We present a novel, web-accessible scientific workflow system which makes large-scale comparative studies accessible without programming or excessive configuration requirements. GPFlow allows a workflow defined on single input values to be automatically lifted to operate over collections of input values and supports the formation and processing of collections of values without the need for explicit iteration constructs. We introduce a new model for collection processing based on key aggregation and slicing which guarantees processing integrity and facilitates automatic association of inputs, allowing scientific users to manage the combinatorial explosion of data values inherent in large scale comparative studies. The approach is demonstrated using a core task from comparative genomics, and builds upon our previous work in supporting combined interactive and batch operation, through a lightweight web-based user interface.
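The following Python sketch illustrates, in generic terms, what "lifting" a single-value workflow step over keyed collections with automatic key-based association might look like. It is not GPFlow's actual API; all function names, the toy comparison step, and the sequences are illustrative assumptions.

```python
from itertools import product

def lift(step):
    """Lift a single-value workflow step to operate over a keyed collection,
    preserving keys so downstream steps can associate results with their
    originating inputs (illustrative, not GPFlow's API)."""
    def lifted(collection):
        return {key: step(value) for key, value in collection.items()}
    return lifted

def cross(a, b):
    """Form the keyed cross product of two collections -- the kind of
    combinatorial expansion a large-scale comparative study generates."""
    return {(ka, kb): (va, vb) for (ka, va), (kb, vb) in product(a.items(), b.items())}

def compare(pair):
    """Toy single-value step: crude similarity score between two sequences."""
    x, y = pair
    return sum(1 for c1, c2 in zip(x, y) if c1 == c2) / max(len(x), len(y))

genomes_a = {"gA1": "ACGTACGT", "gA2": "ACGTTTTT"}
genomes_b = {"gB1": "ACGTACGA", "gB2": "TTTTACGT"}

pairs = cross(genomes_a, genomes_b)   # keys record the provenance of each combination
scores = lift(compare)(pairs)         # no explicit iteration construct in the "workflow"
print(scores)
```

The point of the sketch is the design choice described in the abstract: the user defines `compare` on single values, and the lifting and key aggregation machinery handles the collection semantics and input association automatically.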
Abstract:
In John Frazer's seminal book An Evolutionary Architecture (1995), from which this essay is extracted, a fundamental approach is established for how natural systems can unfold mechanisms for negotiating the complex design space inherent in architectural systems. In this essay, which forms a critical part of the book, Frazer draws both correlations and distinctions between natural processes as emulated in design processes and form as an active manifestation within natural systems. Form is seen as an evolving agent generated via the rules of descriptive genetic coding, functioning as part of a metabolic environment. Frazer's process model establishes the realm in which computation must manoeuvre to produce a valid solution space, including the operations of self-organisation, complexity and emergent behaviour. Addressing design as an authored practice, he extends the transference of 'creativity' from the explicit impression into form, to the investment of thought, organisation and strategy in the computational processes which produce form. Frazer's text concentrates astutely on the practising of the evolutionary paradigm, the output of which postulates an architecture born of its relationships to dynamic environmental and socio-economic contexts, and realised through morphogenetic materialisation.
Abstract:
Peeling is an essential phase of the post-harvest processing industry; however, the undesirable losses and waste that occur during the peeling stage are a persistent concern of the food processing sector. There are three methods of peeling fruits and vegetables (mechanical, chemical and thermal), the choice depending on the class and type of produce. By comparison, the mechanical method is the most preferred: it keeps edible portions of the produce fresh and creates less damage. Reducing material losses and increasing the quality of the process clearly has a direct effect on the overall efficiency of the food processing industry, which calls for more study of the technological aspects of this industrial segment. In order to enhance the effectiveness of food industrial practices, it is essential to have a clear understanding of material properties and the behaviour of tissues under industrial processes. This paper presents a scheme of research that seeks to examine tissue damage of tough-skinned vegetables under the mechanical peeling process by developing a novel finite element (FE) model of the process using an explicit dynamic finite element analysis approach. In the proposed study, a nonlinear model capable of specifically simulating the peeling process will be developed. It is expected that currently unavailable information such as cutting force, maximum shearing force, shear strength, tensile strength and rupture stress will be quantified using the new FEA model. The outcomes will be used to optimise and improve current mechanical peeling methods for this class of vegetables and thereby enhance the overall effectiveness of processing operations. The paper also reviews the available literature and previous work in this area of research and identifies the current gap in the modelling and simulation of food processes.
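To illustrate the "explicit dynamic" part of the proposed approach, the sketch below runs a central-difference explicit time integration on a trivial 1D spring-mass chain in Python. It shows the integration scheme only; the study's nonlinear tissue behaviour, contact, cutting and failure are not represented, and every parameter value is an arbitrary placeholder.

```python
import numpy as np

# Minimal explicit (central-difference) time integration for a 1D chain of
# linear springs and lumped masses -- illustrates explicit dynamics only,
# not the nonlinear peeling model of the proposed study.
n_nodes = 11
k_elem = 1.0e4          # illustrative axial stiffness per element (N/m)
m = 0.01                # lumped nodal mass (kg)
dt = 1.0e-5             # time step; must stay below the stability limit 2/omega_max

u = np.zeros(n_nodes)   # nodal displacements
v = np.zeros(n_nodes)   # nodal velocities
f_ext = np.zeros(n_nodes)
f_ext[-1] = 5.0         # constant pull at the free end (a stand-in for a peel force)

for step in range(2000):
    elem_force = k_elem * np.diff(u)          # element internal forces from stretch
    f_int = np.zeros(n_nodes)
    f_int[:-1] += elem_force                  # force on the left node of each element
    f_int[1:] -= elem_force                   # reaction on the right node
    a = (f_ext + f_int) / m                   # no global factorisation: the explicit advantage
    a[0] = 0.0                                # fixed end boundary condition
    v += a * dt
    u += v * dt

print(u[-1])  # tip displacement after the transient (oscillates about 5e-3 m here)
```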
Abstract:
In the long term, with the development of skill, knowledge, exposure and confidence within the engineering profession, rigorous analysis techniques have the potential to become a reliable and far more comprehensive method for the design and verification of the structural adequacy of OPS, write Nimal J Perera, David P Thambiratnam and Brian Clark. This paper explores the potential to enhance the safety of operators of self-propelled mechanical plant subjected to roll over and the impact of falling objects, using the non-linear and dynamic response simulation capabilities of analytical processes to supplement the quasi-static testing methods prescribed in International and Australian Codes of Practice for bolt-on Operator Protection Systems (OPS) that are post-fitted. The paper is based on research work carried out by the authors at the Queensland University of Technology (QUT) over a period of three years through instrumentation of prototype tests, scale model tests in the laboratory and rigorous analysis using validated Finite Element (FE) models. The FE codes used were ABAQUS for implicit analysis and LS-DYNA for explicit analysis. The rigorous analysis and dynamic simulation technique described in the paper can be used to investigate the structural response due to accident scenarios such as multiple roll overs, the impact of multiple objects and combinations of such events, and thereby enhance the safety and performance of Roll Over and Falling Object Protection Systems (ROPS and FOPS). The analytical techniques are based on sound engineering principles and well-established practice for the investigation of dynamic impact on all self-propelled vehicles, and are used for many other similar applications where experimental techniques are not feasible.
Abstract:
Modelling the power system load is a challenge, since the load level and composition vary with time. An accurate load model is important because there is a substantial component of load dynamics in the frequency range relevant to system stability. The composition of loads needs to be characterised because the time constants of composite loads affect the damping contributions of the loads to power system oscillations, and their effects vary with the time of day, depending on the mix of motor loads. This chapter has two main objectives: 1) to describe small-signal load modelling using on-line measurements; and 2) to present a new approach to developing models that reflect the load response to large disturbances. Small-signal load characterisation based on on-line measurements allows the composition of the load to be predicted with improved accuracy compared with post-mortem or classical load models. Rather than a generic dynamic model for small-signal modelling of the load, an explicit induction motor is used so that the performance for larger disturbances can be more reliably inferred. The relation between power and frequency/voltage can be explicitly formulated and the contribution of induction motors extracted. One of the main features of this work is that the induction motor component can be associated with nominal powers or equivalent motors.
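As a generic illustration of a composite load representation of the kind described (an exponential static part plus an explicit induction-motor share, each with its own voltage/frequency dependence), the sketch below evaluates load power as a function of per-unit voltage and frequency. The functional form and all parameter values are illustrative assumptions, not the chapter's measurement-based model.

```python
def composite_load_power(V, f, params):
    """Generic composite load sketch: an exponential static component plus an
    induction-motor fraction (illustrative formulation only). V and f are the
    bus voltage and frequency in per unit."""
    p = params
    # Static (exponential) component: P = P0 * V^a * (1 + Kpf * (f - 1))
    P_static = p["P0_static"] * V ** p["a"] * (1.0 + p["Kpf"] * (f - 1.0))
    # Induction-motor component approximated at steady state by its nominal
    # power with a weak voltage dependence.
    P_motor = p["P0_motor"] * V ** p["a_motor"]
    return P_static + P_motor

# Illustrative parameter values (per-unit powers summing to 1.0 at nominal conditions).
params = {"P0_static": 0.6, "a": 1.5, "Kpf": 1.0,
          "P0_motor": 0.4, "a_motor": 0.1}
print(composite_load_power(V=0.95, f=0.998, params=params))
```

Separating the motor share in this way is what allows it to be associated with a nominal power or an equivalent motor, as the chapter highlights, rather than being absorbed into a single generic exponent.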