558 results for real-world


Relevance: 60.00%

Abstract:

Literacy in Early Childhood and Primary Education provides a comprehensive introduction to literacy teaching and learning. The book explores the continuum of literacy learning and children’s transitions from early childhood settings to junior primary classrooms, and then to senior primary and beyond. Reader-friendly and accessible, this book equips pre-service teachers with the theoretical underpinnings and practical strategies and skills needed to teach literacy. It places the ‘reading wars’ firmly in the past as it examines contemporary research and practices. The book covers important topics such as literacy acquisition, family literacies and multiliteracies, foundation skills for literacy learning, reading difficulties, assessment, and supporting diverse literacy learners in early childhood and primary classrooms. It also addresses some of the challenges that teachers may face in the classroom and provides solutions to them. Each chapter includes learning objectives, reflective questions and definitions of key terms to engage and assist readers. Further resources are also available at www.cambridge.edu.au/academic/literacy. Written by an expert author team and featuring real-world examples from literacy teachers and learners, Literacy in Early Childhood and Primary Education will help pre-service teachers feel confident teaching literacy to diverse age groups and abilities.

Relevance: 60.00%

Abstract:

Purpose – Rehearsing practical site operations is without doubt one of the most effective methods for minimising planning mistakes, because of the learning that takes place during the rehearsal activity. However, real rehearsal is not a practical solution for on-site construction activities, as it not only involves a considerable amount of cost but can also have adverse environmental implications. One approach to overcoming this is the use of virtual rehearsals. The purpose of this paper is to investigate an approach to simulating the motion of cranes in order to test the feasibility of associated construction sequencing and to generate construction schedules for review and visualisation. Design/methodology/approach – The paper describes a system involving two technologies, virtual prototyping (VP) and four-dimensional (4D) simulation, to assist construction planners in testing the sequence of construction activities when mobile cranes are involved. The system consists of five modules, comprising input, database, equipment, process and output, and is capable of detecting potential collisions. A real-world trial is described in which the system was tested and validated. Findings – Feedback from the planners involved in the trial indicated that they found the system to be useful in its present form and that they would welcome its further development into a fully automated platform for validating construction sequencing decisions. Research limitations/implications – The tool has the potential to provide a cost-effective means of improving construction planning. However, it is at present limited to the specific case of crane movement considered here. Originality/value – This paper presents a large-scale, real-life case of applying VP technology in planning construction processes and activities.
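
The abstract above does not describe how the collision-detection module works; as an illustrative sketch only, a common approach is to test axis-aligned bounding boxes (AABBs) of the crane boom against static site obstacles at each step of a simulated sequence. All geometry and names below are hypothetical.

```python
def aabb_overlap(a, b):
    """True if two axis-aligned bounding boxes intersect.
    Each box is ((xmin, ymin, zmin), (xmax, ymax, zmax))."""
    (a_min, a_max), (b_min, b_max) = a, b
    return all(a_min[i] <= b_max[i] and b_min[i] <= a_max[i] for i in range(3))

def first_collision(crane_path, obstacles):
    """Scan a sequence of crane boom boxes against named site obstacles;
    return the first (step, obstacle) hit, or None if the path is clear."""
    for step, box in enumerate(crane_path):
        for name, obstacle in obstacles.items():
            if aabb_overlap(box, obstacle):
                return step, name
    return None

# Hypothetical site layout: the boom box sweeps along x toward a scaffold.
path = [((x, 0, 0), (x + 2, 1, 10)) for x in range(0, 8)]
obstacles = {"scaffold": ((5, 0, 0), (6, 1, 12))}
print(first_collision(path, obstacles))  # → (3, 'scaffold')
```

A production system would use the actual swept volumes of the boom and load rather than boxes, but the per-step pairwise test is the same shape of computation.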

Relevance: 60.00%

Abstract:

Authentic assessment tasks enhance the engagement, retention and aspirations of students. This paper explores the discipline-generic features of authentic assessment, which reflect what students need to achieve in the real world. Some assessment tasks are more authentic than others, and this paper proposes a literature-supported framework that helps unit co-ordinators determine the level of authenticity of an assessment task. The framework is applied to three summative assessment tasks in a law unit: tutorial participation, an advocacy exercise and a problem-based exam. The level of authenticity of the assessment tasks is compared and opportunities to improve authenticity are identified.

Relevance: 60.00%

Abstract:

A Flash Event (FE) represents a period of time when a web server experiences a dramatic increase in incoming traffic, either following a newsworthy event that has prompted users to locate and access it, or as a result of redirection from other popular web or social media sites. This usually leads to network congestion and Quality-of-Service (QoS) degradation. These events can be mistaken for Distributed Denial-of-Service (DDoS) attacks aimed at disrupting the server. Accurate detection of FEs and their distinction from DDoS attacks is important, since different actions need to be undertaken by network administrators in the two cases. However, the lack of public-domain FE datasets hinders research in this area. In this paper we present a detailed study of flash events and classify them into three broad categories. In addition, the paper describes FEs in terms of three key components: the volume of incoming traffic, the related source IP addresses, and the resources being accessed. We present such an FE model with minimal parameters and use publicly available datasets to analyse and validate the proposed model. The model can be used to generate different types of FE traffic, closely approximating real-world scenarios, in order to facilitate research into distinguishing FEs from DDoS attacks.
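
The three-component view (traffic volume, source IPs, resources) can be sketched as a simple traffic generator. The three-phase rate shape (ramp-up, sustained peak, decay) is a common way to model an FE; all rates, durations and address ranges below are illustrative assumptions, not parameters from the paper.

```python
import random

def flash_event_rate(t, baseline=50, peak=2000,
                     ramp_end=60, sustain_end=300, decay_end=600):
    """Requests/sec at time t (seconds) for a three-phase flash event:
    linear ramp-up, sustained peak, linear decay back to baseline."""
    if t < 0 or t >= decay_end:
        return baseline
    if t < ramp_end:                                  # ramp-up
        return baseline + (peak - baseline) * t / ramp_end
    if t < sustain_end:                               # sustained peak
        return peak
    frac = (t - sustain_end) / (decay_end - sustain_end)
    return peak - (peak - baseline) * frac            # decay

def draw_sources(n, known_pool, new_ip_share=0.6, rng=random):
    """During an FE, many sources are legitimate newcomers, whereas a DDoS
    tends to reuse a fixed bot pool; new_ip_share controls the mix."""
    out = []
    for _ in range(n):
        if rng.random() < new_ip_share:
            out.append(f"203.0.113.{rng.randrange(256)}")  # fresh client (doc range)
        else:
            out.append(rng.choice(known_pool))
    return out
```

The distinguishing signal for FE-vs-DDoS classifiers lies in how the rate curve and the fraction of previously unseen source addresses evolve together over time.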

Relevance: 60.00%

Abstract:

Background: Foot ulcers are a frequent reason for diabetes-related hospitalisation. Clinical training is known to have a beneficial impact on foot ulcer outcomes. Clinical training using simulation techniques has rarely been used in the management of diabetes-related foot complications or chronic wounds. Simulation can be defined as a device or environment that attempts to replicate the real world. The few non-web-based foot-related simulation courses have focused solely on training for a single skill or “part task” (for example, practicing ingrown toenail procedures on models). This pilot study aimed primarily to investigate the effect of a training program using multiple methods of simulation on participants’ clinical confidence in the management of foot ulcers. Methods: Sixteen podiatrists participated in a two-day Foot Ulcer Simulation Training (FUST) course. The course included pre-requisite web-based learning modules, practicing individual foot ulcer management part tasks (for example, debriding a model foot ulcer), and participating in replicated clinical consultation scenarios (for example, treating a standardised patient (actor) with a model foot ulcer). The primary outcome measure was participants’ clinical confidence, assessed with surveys completed before and after the course using a five-point Likert scale (1 = Unacceptable to 5 = Proficient). Participants’ knowledge, satisfaction and their perception of the relevance and fidelity (realism) of a range of course elements were also investigated. Parametric statistics were used to analyse the data: Pearson’s r for correlation, ANOVA for testing differences between groups, and a paired-sample t-test to determine the significance of differences between pre- and post-workshop scores. A minimum significance level of p < 0.05 was used. Results: An overall 42% improvement in clinical confidence was observed following completion of FUST (mean scores 3.10 compared to 4.40, p < 0.05). 
The lack of an overall significant change in knowledge scores reflected the participant populations’ high baseline knowledge and pre-requisite completion of web-based modules. Satisfaction, relevance and fidelity of all course elements were rated highly. Conclusions: This pilot study suggests simulation training programs can improve participants’ clinical confidence in the management of foot ulcers. The approach has the potential to enhance clinical training in diabetes-related foot complications and chronic wounds in general.
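
The pre/post comparison above rests on a paired-sample t-test. A minimal sketch of that computation follows, using made-up Likert scores rather than the study’s data:

```python
import math

def paired_t(pre, post):
    """Paired-sample t statistic for matched pre/post scores."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)   # sample variance
    se = math.sqrt(var / n)                                # standard error of the mean diff
    return mean / se

# Illustrative confidence scores for eight participants (not the study data).
pre  = [3.0, 2.8, 3.2, 3.1, 3.4, 2.9, 3.0, 3.3]
post = [4.2, 4.5, 4.3, 4.6, 4.4, 4.1, 4.5, 4.6]
t = paired_t(pre, post)   # compare against the t critical value for df = n - 1
```

With n = 8 the statistic is compared against the two-tailed critical value for 7 degrees of freedom (about 2.365 at p < 0.05).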

Relevance: 60.00%

Abstract:

PURPOSE: To examine the basis of previous findings of an association between indices of driving safety and visual motion sensitivity, and to examine whether this association could be explained by low-level changes in visual function. METHODS: 36 visually normal participants (aged 19–80 years) completed a battery of standard vision tests including visual acuity, contrast sensitivity and automated visual fields, and two tests of motion perception: sensitivity for movement of a drifting Gabor stimulus, and sensitivity for displacement in a random-dot kinematogram (Dmin). Participants also completed a hazard perception test (HPT), which measured participants’ response times to hazards embedded in video recordings of real-world driving and which has been shown to be linked to crash risk. RESULTS: Dmin for the random-dot stimulus ranged from -0.88 to -0.12 log minutes of arc, and the minimum drift rate for the Gabor stimulus ranged from 0.01 to 0.35 cycles per second. Both measures of motion sensitivity significantly predicted response times on the HPT. In addition, while the relationship between the HPT and motion sensitivity for the random-dot kinematogram was partially explained by the other visual function measures, the relationship with sensitivity for detection of the drifting Gabor stimulus remained significant even after controlling for these variables. CONCLUSION: These findings suggest that motion perception plays an important role in the visual perception of driving-relevant hazards independent of other areas of visual function and should be further explored as a predictive test of driving safety. Future research should explore the causes of reduced motion perception in order to develop better interventions to improve road safety.
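
The prediction of HPT response times from motion thresholds is a regression problem; the study’s analysis (controlling for other vision measures) is richer than this, but the core step can be sketched as a simple ordinary-least-squares fit. The numbers below are illustrative, not the study data.

```python
def ols_slope_intercept(x, y):
    """Ordinary least-squares fit of y = a + b*x; returns (a, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

# Hypothetical displacement thresholds (log Dmin) vs hazard response times (s),
# constructed to lie exactly on rt = 3 + 2 * dmin for the sketch.
dmin = [-0.9, -0.7, -0.5, -0.3, -0.1]
rt   = [1.2, 1.6, 2.0, 2.4, 2.8]
a, b = ols_slope_intercept(dmin, rt)
```

A positive slope here would mean that worse (larger) motion thresholds go with slower hazard responses; the study additionally partials out acuity, contrast sensitivity and visual fields before drawing that conclusion.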

Relevance: 60.00%

Abstract:

Most current computer systems authorise the user at the start of a session and do not detect whether the current user is still the initial authorised user, a substitute user, or an intruder pretending to be a valid user. Therefore, a system is needed that continuously checks the identity of the user throughout the session without being intrusive to the end-user. Such a system is called a continuous authentication system (CAS). Researchers have applied several approaches to CAS, and most of these techniques are based on biometrics. These continuous biometric authentication systems (CBAS) are supplied by user traits and characteristics. One of the main types of biometrics is keystroke dynamics, which has been widely tried and accepted for providing continuous user authentication. Keystroke dynamics is appealing for many reasons. First, it is less obtrusive, since users will be typing on the computer keyboard anyway. Second, it does not require extra hardware. Finally, keystroke dynamics will be available after the authentication step at the start of the computer session. Currently, there is insufficient research in the field of CBAS with keystroke dynamics. To date, most existing schemes ignore the continuous authentication scenarios, which might affect their practicality in different real-world applications. Also, contemporary CBAS with keystroke dynamics approaches use character sequences as features representative of user typing behavior, but their feature-selection criteria do not guarantee features with strong statistical significance, which may result in a less accurate statistical user representation. Furthermore, their selected features do not inherently incorporate user typing behavior. Finally, the existing CBAS based on keystroke dynamics are typically dependent on pre-defined user-typing models for continuous authentication. 
This dependency restricts the systems to authenticating only known users whose typing samples have been modelled. This research addresses the previous limitations of existing CBAS schemes by developing a generic model to better identify and understand the characteristics and requirements of each type of CBAS and continuous authentication scenario. The research also proposes four statistical feature selection techniques that have the highest statistical significance and encompass different user typing behaviors, representing user typing patterns effectively. Finally, the research proposes a user-independent threshold approach that can authenticate a user accurately without needing any predefined user typing model a priori. The technique is further enhanced to detect an impostor or intruder who may take over at any point during the computer session.
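
As a sketch of the kind of features keystroke dynamics works with: digraph (key-pair) latencies, compared against a session-start baseline via a deviation score that a user-independent threshold could then flag. The exact features and thresholding in the thesis differ, and all timings here are hypothetical.

```python
from collections import defaultdict
from statistics import mean

def digraph_latencies(events):
    """events: list of (key, press_time_ms) in typing order.
    Returns {(key1, key2): [latencies_ms, ...]} for consecutive key pairs."""
    out = defaultdict(list)
    for (k1, t1), (k2, t2) in zip(events, events[1:]):
        out[(k1, k2)].append(t2 - t1)
    return out

def deviation_score(baseline, window):
    """Mean relative deviation of a recent window's digraph latencies from
    the session-start baseline; exceeding a threshold suggests a takeover."""
    common = baseline.keys() & window.keys()
    if not common:
        return None                      # nothing comparable yet
    devs = [abs(mean(window[d]) - mean(baseline[d])) / mean(baseline[d])
            for d in common]
    return mean(devs)

base = digraph_latencies([("t", 0), ("h", 120), ("e", 240)])   # original typist
win  = digraph_latencies([("t", 0), ("h", 180), ("e", 360)])   # slower typist
score = deviation_score(base, win)   # 0.5, i.e. 50% slower on shared digraphs
```

A user-independent threshold, as proposed in the thesis, would fix the cut-off on such a score globally rather than training a model per user.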

Relevance: 60.00%

Abstract:

Traditionally, Science education has stressed the importance of teaching students to conduct ‘scientific inquiry’, with the main focus being the experimental model of inquiry used by real-world scientists. Current educational approaches using constructivist pedagogy recognise the value of inquiry as a method for promoting the development of deep understanding of discipline content. A recent Information Learning Activity undertaken by a Grade Eight Science class was observed to discover how inquiry-based learning is implemented in contemporary Science education. By analysing student responses to questionnaires and assessment task outcomes, the author was able to determine the level of inquiry inherent in the activity and how well the model supported student learning and the development of students’ information literacy skills. Although students achieved well overall, some recommendations are offered that may enable teachers to better exploit the learning opportunities provided by inquiry-based learning. Planning interventions at key stages of the inquiry process can assist students to learn more effective strategies for dealing with cognitive and affective challenges. Allowing students greater input into the selection of the topic or focus of the activity may encourage them to engage more deeply with the learning task. Students are likely to experience greater learning benefit from access to developmentally appropriate resources, increased time to explore topics and multiple opportunities to undertake information searches throughout the learning activity. Finally, increasing the cognitive challenge can enhance both the depth of students’ learning and their information literacy skills.

Relevance: 60.00%

Abstract:

In various industrial and scientific fields, conceptual models are derived from real-world problem spaces to understand and communicate the entities and coherencies they contain. Abstracted models mirror the common understanding and information demand of engineers, who apply conceptual models in performing their daily tasks. However, most standardized models in Process Management, Product Lifecycle Management and Enterprise Resource Planning lack a scientific foundation for their notation. In collaboration scenarios with stakeholders from several disciplines, tailored conceptual models complicate communication processes, as a common understanding is not shared or implemented in the specific models. To support direct communication between experts from several disciplines, a visual language is developed which allows a common visualization of discipline-specific conceptual models. For visual discrimination and to overcome visual complexity issues, conceptual models are arranged in a three-dimensional space. The visual language introduced here follows and extends established principles of Visual Language science.

Relevance: 60.00%

Abstract:

Problem-based learning (PBL) has been used successfully in disciplines such as medicine, nursing, law and engineering. However, a review of the literature shows that there has been little use of this approach to learning in accounting. This paper extends the research in accounting education by reporting the findings of a case study of the development and implementation of PBL at the Queensland University of Technology (QUT) in a new Accountancy Capstone unit that began in 2006. The fundamentals of the PBL approach were adhered to. However, one of the essential elements of the approach adopted was to highlight the importance of questioning as a means of gathering the necessary information upon which decisions are made. This approach can be contrasted with the typical ‘give all the facts’ case studies that are commonly used. Another feature was that students worked together in the same group for an entire semester (similar to how teams in the workplace operate), so there was an intended focus on teamwork in solving unstructured, real-world accounting problems presented to students. Based on quantitative and qualitative data collected from student questionnaires over seven semesters, it was found that students perceived PBL to be effective, especially in terms of developing the skills of questioning, teamwork, and problem solving. The effectiveness of questioning is very important as this is a skill that is rarely the focus of development in accounting education. The successful implementation of PBL in accounting through ‘learning by doing’ could be the catalyst for change to bring about better learning outcomes for accounting graduates.

Relevance: 60.00%

Abstract:

Airports and cities inevitably recognise the value that each brings to the other; however, the separation in decision-making authority for what to build, where, when and how provides a conundrum for both parties. Airports often want a say in what is developed outside of the airport fence, and cities often want a say in what is developed inside the airport fence. Defining how much of a say airports and cities have in decisions beyond their jurisdictional control is likely to remain a live topic so long as airports and cities maintain separate formal decision-making processes for what to build, where, when and how. However, the recent Green and White Papers for a new National Aviation Policy have made early inroads into formalising relationships between Australia’s major airports and their host cities. At present, no clear indication (within practice or literature) is evident as to the appropriateness of different governance arrangements for decisions to develop in situations that bring together the opposing strategic interests of airports and cities, thus leaving decisions for infrastructure development as complex decision-making spaces that hold airport and city/regional interests at stake. The line of enquiry is motivated by a lack of empirical research on networked decision-making domains outside of the realm of institutional theorists (Agranoff & McGuire, 2001; Provan, Fish & Sydow, 2007). That is, governance literature has remained focused on abstract conceptualisations of organisation, without focusing on the minutiae of how organisation influences action in real-world applications. A recent study by Black (2008) has provided an initial foothold for governance researchers into networked decision-making domains. This study builds upon Black’s (2008) work by aiming to explore and understand the problem space of making decisions subjected to complex jurisdictional and relational interdependencies. 
That is, the research examines the formal and informal structures, relationships, and forums that operationalise debates and interactions between decision-making actors as they vie for influence over deciding what to build, where, when and how in airport-proximal development projects. The research mobilises a mixture of qualitative and quantitative methods to examine three embedded cases of airport-proximal development from a network governance perspective. Findings from the research provide a new understanding to the ways in which informal actor networks underpin and combine with formal decision-making networks to create new (or realigned) governance spaces that facilitate decision-making during complex phases of development planning. The research is timely, and responds well to Isett, Mergel, LeRoux, Mischen and Rethemeyer’s (2011) recent critique of limitations within current network governance literature, specifically to their noted absence of empirical studies that acknowledge and interrogate the simultaneity of formal and informal network structures within network governance arrangements (Isett et al., 2011, pp. 162-166). The combination of social network analysis (SNA) techniques and thematic enquiry has enabled findings to document and interpret the ways in which decision-making actors organise to overcome complex problems for planning infrastructure. An innovative approach to using association networks has been used to provide insights to the importance of the different ways actors interact with one another, thus providing a simple yet valuable addition to the increasingly popular discipline of SNA. The research also identifies when and how different types of networks (i.e. formal and informal) are able to overcome currently known limitations to network governance (see McGuire & Agranoff, 2011), thus adding depth to the emerging body of network governance literature surrounding limitations to network ways of working (i.e. 
Rhodes, 1997a; Keast & Brown, 2002; Rethemeyer & Hatmaker, 2008; McGuire & Agranoff, 2011). Contributions are made to practice via the provision of a timely understanding of how horizontal fora between airports and their regions are used, particularly in the context of how they reframe the governance of decision-making for airport-proximal infrastructure development. This new understanding will enable government and industry actors to better understand the structural impacts of governance arrangements before they design or adopt them, particularly for factors such as efficiency of information, oversight, and responsiveness to change.
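
Two of the basic social network analysis (SNA) measures used in studies of this kind, network density and degree centrality, can be sketched for an undirected actor network. The actor names below are hypothetical, not cases from the thesis.

```python
def density(nodes, edges):
    """Density of an undirected network: observed ties / possible ties."""
    n = len(nodes)
    return 2 * len(edges) / (n * (n - 1)) if n > 1 else 0.0

def degree_centrality(nodes, edges):
    """Normalised degree centrality: ties per actor / (n - 1)."""
    deg = {v: 0 for v in nodes}
    for a, b in edges:
        deg[a] += 1
        deg[b] += 1
    n = len(nodes)
    return {v: d / (n - 1) for v, d in deg.items()}

# Hypothetical formal decision-making ties in an airport-proximal project.
actors = ["airport", "council", "state", "developer"]
ties = [("airport", "council"), ("airport", "state"), ("council", "developer")]
dens = density(actors, ties)                 # 0.5: half the possible ties exist
central = degree_centrality(actors, ties)    # the airport is the best-connected actor
```

Comparing such measures across the formal and informal networks of the same case is one way to interrogate their simultaneity, as the thesis sets out to do.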

Relevance: 60.00%

Abstract:

A significant issue encountered when fusing data received from multiple sensors is the accuracy of the timestamp associated with each piece of data. This is particularly important in applications such as Simultaneous Localisation and Mapping (SLAM), where vehicle velocity forms an important part of the mapping algorithms; on fast-moving vehicles, even millisecond inconsistencies in data timestamping can produce errors which need to be compensated for. The timestamping problem is compounded in a robot swarm environment due to the use of non-deterministic readily-available hardware (such as 802.11-based wireless) and inaccurate clock synchronisation protocols (such as the Network Time Protocol (NTP)). As a result, the synchronisation of the clocks between robots can be out by tens to hundreds of milliseconds, making correlation of data difficult and preventing the units from performing synchronised actions such as triggering cameras or intricate swarm manoeuvres. In this thesis, a complete data fusion unit is designed, implemented and tested. The unit, named BabelFuse, is able to accept sensor data from a number of low-speed communication buses (such as RS232, RS485 and CAN Bus) and also timestamp events that occur on General Purpose Input/Output (GPIO) pins, referencing a submillisecond-accurate wirelessly-distributed “global” clock signal. In addition to its timestamping capabilities, it can also be used to trigger an attached camera at a predefined start time and frame rate. This functionality enables the creation of a wirelessly-synchronised distributed image acquisition system over a large geographic area; a real-world application for this functionality is the creation of a platform to facilitate wirelessly-distributed 3D stereoscopic vision. A ‘best-practice’ design methodology is adopted within the project to ensure the final system operates according to its requirements. 
Initially, requirements are generated, from which a high-level architecture is distilled. This architecture is then converted into a hardware specification and low-level design, which is then manufactured. The manufactured hardware is then verified to ensure it operates as designed, and firmware and Linux Operating System (OS) drivers are written to provide the features and connectivity required of the system. Finally, integration testing is performed to ensure the unit functions as per its requirements. The BabelFuse system comprises a single Grand Master unit, which is responsible for maintaining the absolute value of the “global” clock. Slave nodes then determine their local clock offset from that of the Grand Master via synchronisation events which occur multiple times per second. The mechanism used for wirelessly synchronising the clocks between the boards makes use of specific hardware and a firmware protocol based on elements of the IEEE-1588 Precision Time Protocol (PTP). With the key requirement of the system being submillisecond-accurate clock synchronisation (as a basis for timestamping and camera triggering), automated testing is carried out to monitor the offsets between each Slave and the Grand Master over time. A common strobe pulse is also sent to each unit for timestamping; the correlation between the timestamps of the different units is used to validate the clock offset results. Analysis of the automated test results shows that the BabelFuse units are almost three orders of magnitude more accurate than their requirement; the clocks of the Slave and Grand Master units do not differ by more than three microseconds over a running time of six hours, and the mean clock offset of Slaves to the Grand Master is less than one microsecond. The common strobe pulse used to verify the clock offset data yields a positive result, with a maximum variation between units of less than two microseconds and a mean value of less than one microsecond. 
The camera triggering functionality is verified by connecting the trigger pulse output of each board to a four-channel digital oscilloscope and setting each unit to output a 100 Hz periodic pulse with a common start time. The resulting waveform shows a maximum variation between the rising edges of the pulses of approximately 39 μs, well below the target of 1 ms.
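
The offset estimation at the core of IEEE-1588, on whose elements the BabelFuse firmware protocol is based, recovers a slave’s clock offset from a two-way timestamp exchange, assuming a symmetric network path:

```python
def ptp_offset_delay(t1, t2, t3, t4):
    """IEEE-1588 two-way exchange (all times in the same unit, e.g. μs):
      t1: master sends Sync        t2: slave receives it (slave clock)
      t3: slave sends Delay_Req    t4: master receives it (master clock)
    Assuming a symmetric path, the slave's clock offset and the one-way
    path delay follow directly."""
    offset = ((t2 - t1) - (t4 - t3)) / 2
    delay = ((t2 - t1) + (t4 - t3)) / 2
    return offset, delay

# Illustrative exchange: slave clock 5 μs ahead, 3 μs one-way path delay.
# Sync sent at master time 100, received at slave time 108 (103 true + 5);
# Delay_Req sent at slave time 110, received at master time 108 (105 true + 3).
print(ptp_offset_delay(100, 108, 110, 108))  # → (5.0, 3.0)
```

The slave then steers its clock by the estimated offset; repeating the exchange multiple times per second, as BabelFuse does, keeps the residual offset in the microsecond range.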

Relevance: 60.00%

Abstract:

Reliable ambiguity resolution (AR) is essential to Real-Time Kinematic (RTK) positioning and its applications, since incorrect ambiguity fixing can lead to largely biased positioning solutions. A partial ambiguity fixing technique is developed to improve the reliability of AR, involving partial ambiguity decorrelation (PAD) and partial ambiguity resolution (PAR). Decorrelation transformation could substantially amplify the biases in the phase measurements. The purpose of PAD is to find the optimum trade-off between decorrelation and worst-case bias amplification. The concept of PAR refers to the case where only a subset of the ambiguities can be fixed correctly to their integers in the integer least-squares (ILS) estimation system at high success rates. As a result, RTK solutions can be derived from these integer-fixed phase measurements. This is meaningful provided that the number of reliably resolved phase measurements is sufficiently large for least-squares estimation of RTK solutions as well. Considering the GPS constellation alone, partially fixed measurements are often insufficient for positioning. The AR reliability is usually characterised by the AR success rate. In this contribution an AR validation decision matrix is first introduced to understand the impact of the success rate. Moreover, the AR risk probability is included in a more complete evaluation of the AR reliability. We use 16 ambiguity variance-covariance matrices with different levels of success rate to analyse the relation between success rate and AR risk probability. Next, the paper examines how, during the PAD process, a bias in one measurement is propagated and amplified onto many others, leading to more than one wrong integer and affecting the success probability. Furthermore, the paper proposes a partial ambiguity fixing procedure with a predefined success rate criterion and a ratio-test in the ambiguity validation process. 
In this paper, Galileo constellation data is tested with simulated observations. Numerical results from our experiment clearly demonstrate that only when the computed success rate is very high can the AR validation provide decisions about the correctness of AR that are close to the real world, with both low AR risk and low false-alarm probabilities. The results also indicate that the PAR procedure can automatically choose an adequate number of ambiguities to fix at a given high success rate from the multiple constellations, instead of fixing all the ambiguities. This is a benefit that multiple GNSS constellations can offer.
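
The success-rate criterion can be illustrated with the standard integer-bootstrapping success-rate formula, computed from the conditional standard deviations of the (decorrelated) ambiguities. The paper’s ILS procedure and ratio-test are more involved; the subset-selection sketch below is a simplified illustration of the PAR idea, and the sigma values are made up.

```python
import math

def phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bootstrap_success_rate(cond_stds):
    """Integer-bootstrapping success rate:
        P_s = prod_i (2 * Phi(1 / (2 * sigma_i)) - 1)
    where sigma_i are the conditional std devs of the ambiguities."""
    p = 1.0
    for s in cond_stds:
        p *= 2.0 * phi(1.0 / (2.0 * s)) - 1.0
    return p

def select_partial_subset(cond_stds, target=0.999):
    """PAR sketch: fix only the most precise ambiguities (smallest sigma)
    while the joint success rate stays above the criterion."""
    chosen = []
    for s in sorted(cond_stds):
        if bootstrap_success_rate(chosen + [s]) < target:
            break
        chosen.append(s)
    return chosen
```

With illustrative sigmas of 0.01, 0.02 and 0.5 cycles, only the first two survive a 0.999 criterion, mirroring how PAR fixes a reliable subset rather than all ambiguities.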

Relevance: 60.00%

Abstract:

Middle schooling is a crucial area of education where adolescents experiencing physiological and psychological changes require expert guidance. As more research evidence is provided about adolescent learning, teachers are considered pivotal to adolescents’ educational development. Reform measures need to be targeted at two levels: inservice and preservice teachers. This quantitative study employs a 40-item, five-part Likert-scale survey to understand preservice teachers’ (n=142) perceptions of their confidence to teach in the middle school at the conclusion of their tertiary education. The survey instrument was developed from the literature, with connections to the Queensland College of Teachers professional standards. Results indicated that respondents perceived themselves as capable of creating a positive classroom environment (seven items greater than 80%), except for behaviour management (<80% for two items), and they considered their pedagogical knowledge to be adequate (i.e., 7 out of 8 items >84%). Items associated with implementing middle schooling curriculum drew varied responses (e.g., implementing literacy and numeracy were at 74% while implementing learning with real-world connections was at 91%). This information may assist coursework designers. For example, if significant percentages of preservice teachers indicate they believe they were not well prepared for assessment and reporting in the middle school, then course designers can target these areas more effectively.

Relevance: 60.00%

Abstract:

Distributed Genetic Algorithms (DGAs) designed for the Internet have to take its high communication cost into consideration. For island model GAs, the migration topology has a major impact on DGA performance. This paper describes and evaluates an adaptive migration topology optimizer that keeps the communication load low while maintaining high solution quality. Experiments on benchmark problems show that the optimized topology outperforms static or random topologies of the same degree of connectivity. The applicability of the method on real-world problems is demonstrated on a hard optimization problem in VLSI design.
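
The island-model migration step that the topology optimizer acts on can be sketched as follows; the adaptive optimizer itself is the paper’s contribution and is not reproduced here. Population contents and the ring topology are illustrative.

```python
def migrate(islands, topology, k=1):
    """Copy each island's k best individuals to its topology neighbours,
    replacing the neighbours' worst. Individuals are (fitness, genome)
    pairs; higher fitness is better."""
    # Select emigrants from the pre-migration populations.
    emigrants = {i: sorted(pop, reverse=True)[:k] for i, pop in islands.items()}
    for src, dsts in topology.items():
        for dst in dsts:
            pop = sorted(islands[dst], reverse=True)
            islands[dst] = pop[:len(pop) - k] + emigrants[src]
    return islands

# Two tiny hypothetical islands and a ring topology of degree one; a sparse
# topology like this keeps the Internet communication load low.
islands = {0: [(1, "a"), (5, "b")], 1: [(2, "c"), (3, "d")]}
ring = {0: [1], 1: [0]}
migrate(islands, ring)
```

An adaptive optimizer, as described in the paper, would rewire the `topology` dictionary between migration rounds based on observed solution quality, rather than keeping it static or random.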