388 results for lab computers
Abstract:
In recent years there has been a large emphasis placed on the need to use Learning Management Systems (LMS) in the field of higher education, with many universities mandating their use. An important aspect of these systems is their ability to offer collaboration tools to build a community of learners. This paper reports on a study of the effectiveness of an LMS (Blackboard©) in a higher education setting and whether both lecturers and students voluntarily use collaborative tools for teaching and learning. Interviews were conducted with participants (N=67) from the faculties of Science and Technology, Business, Health and Law. Results from this study indicated that participants often use Blackboard© as an online repository of learning materials and that the collaboration tools of Blackboard© are often not utilised. The study also found that several factors have inhibited the use and uptake of the collaboration tools within Blackboard©. These have included structure and user experience, pedagogical practice, response time and a preference for other tools.
Abstract:
BACKGROUND Collaborative and active learning have been clearly identified as ways students can engage in learning with each other and with academic staff. Traditional tier-based lecture theatres and the didactic style they engender are not popular with students today, as evidenced by low attendance rates for lectures. Many universities are installing spaces designed with tables for group interaction, building on models such as TEAL (Technology Enabled Active Learning) (Massachusetts Institute of Technology, n.d.) and SCALE-UP (Student-Centred Activities for Large-Enrolment Undergraduate Programs) (North Carolina State University, n.d.). Advances in large-screen computers and applications have also aided the move to these collaborative spaces. How well have universities structured learning using these spaces, and how have students engaged with the content, technology, space and each other? This paper investigates the application of collaborative learning in such spaces for a cohort of 800+ first-year engineers in the context of learning about and developing professional skills representative of engineering practice. PURPOSE To determine whether moving from tiers to tables enhances the student experience. Does utilising technology-rich, activity-based collaborative learning spaces lead to positive experiences and active engagement of first-year undergraduate engineering students? In developing learning methodology and approach in new learning spaces, what needs to change from a more traditional lecture and tutorial configuration? DESIGN/METHOD A post-delivery review and analysis of outcomes was undertaken to determine how well students and tutors engaged with learning in new collaborative learning spaces. Data were gathered via a tutor focus group and survey, a student survey, and attendance observations. The authors considered the unit delivery approach along with observed and surveyed outcomes, then conducted further review to produce the reported results. RESULTS Results indicate high participation in the collaborative sessions while the accompanying lectures were poorly attended. Students reported a high degree of satisfaction with the learning experience; however, more investigation is required to determine the degree of improvement in retained learning outcomes. Survey feedback from tutors found that students engaged well in the activities during tutorials, and there was an observed improvement in the quality of professional practice modelled by students during sessions. Student feedback confirmed the positive experiences in these collaborative learning spaces, with a 30% improvement in satisfaction ratings from previous years. CONCLUSIONS It is concluded that the right mix of space, technology and appropriate activities does engage students, improve participation and create a rich experience that facilitates the potential for improved learning outcomes. The new Collaborative Teaching Spaces, together with integrated technology and tailored activities, have transformed the delivery of this unit and significantly improved student satisfaction in tutorials.
Abstract:
Mobile telecommunications have become a key lifestyle and technological trend of the twenty-first century. In the context of increased urbanism and pressure on cities for citizen engagement in creating good public places, the potential of these technologies raises critical questions for planning professionals. Even though technology has become integral to all functions within our urban environment, little is known about the perceptions of urban planners and their relationship with the ubiquitous, ever-present digital layer of urban data and information. This paper explores this issue via three focus groups and an additional follow-up interview with planners from local and state government, education and the private sector. It examines the issues of integrating information and communication technologies into planning practice and the affordances these technologies offer for community consultation and placemaking.
Abstract:
The computer is fast becoming part of the furniture in many hospital settings. Increasing reliance on the computer for documentation and dissemination of information in patient-care areas has increased the need to consider this equipment as a potential environmental reservoir for microorganisms. This paper reports on a small experimental study which investigated the potential role of computers in cross-infection. The results indicate that computer surfaces are similar to other environmental surfaces and carry the same risks for cross-infection.
Abstract:
The making of the modern world has long been fuelled by utopian images that are blind to ecological reality. Botanical gardens are but one example – they typically portray themselves as miniature, isolated 'edens on earth', whereas in many cases they are now self-evidently also the vital 'lungs' of crowded cities, as well as critical habitats for threatened biodiversity. In 2010 the 'Remnant Emergency Art lab' set out to question utopian thinking through a creative provocation called the 'Botanical Gardens X-Tension', an imagined city-wide, distributed network of 'ecological gardens' suited to both bat and human needs, in order to ask what now needs to be better understood, connected and therefore ultimately conserved.
Abstract:
A Delay Tolerant Network (DTN) is one where nodes can be highly mobile, with long message delay times forming dynamic and fragmented networks. Traditional centralised network security is difficult to implement in such a network, so distributed security solutions are more desirable in DTN implementations. Establishing effective trust in distributed systems with no centralised Public Key Infrastructure (PKI), such as the Pretty Good Privacy (PGP) scheme, usually requires human intervention. Our aim is to build and compare different decentralised trust systems for implementation in autonomous DTN systems. In this paper, we utilise a key distribution model based on the Web of Trust principle, and employ a simple leverage-of-common-friends trust system to establish initial trust in autonomous DTNs. We compare this system with two other methods of autonomously establishing initial trust by introducing a malicious node and measuring the distribution of malicious and fake keys. Our results show that the new trust system not only reduces the distribution of fake and malicious keys by 40% by the end of the simulation, but also improves key distribution between nodes. This paper contributes a comparison of three decentralised trust systems that can be employed in autonomous DTN systems.
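A minimal sketch of the leverage-of-common-friends idea described above (the class name, threshold and data structures are illustrative assumptions, not the authors' implementation): a node accepts a newly received key only once enough of its already-trusted contacts have endorsed it.

```python
# Illustrative "common friends" trust rule for key acceptance in a decentralised DTN.
# Names, threshold and structures are assumptions for illustration only.

class Node:
    def __init__(self, name, threshold=2):
        self.name = name
        self.trusted_keys = {}      # owner -> key accepted as genuine
        self.endorsements = {}      # owner -> set of nodes vouching for that key
        self.threshold = threshold  # minimum number of common friends required

    def receive_key(self, owner, key, endorser):
        """Record that `endorser` claims `key` belongs to `owner`."""
        self.endorsements.setdefault(owner, set()).add(endorser)
        # Accept the key only if enough already-trusted contacts endorse it.
        common_friends = self.endorsements[owner] & set(self.trusted_keys)
        if len(common_friends) >= self.threshold:
            self.trusted_keys[owner] = key

# Usage: node A already trusts B and C; it accepts D's key only after both
# B and C (two common friends) have endorsed it.
a = Node("A")
a.trusted_keys = {"B": "keyB", "C": "keyC"}
a.receive_key("D", "keyD", endorser="B")
assert "D" not in a.trusted_keys
a.receive_key("D", "keyD", endorser="C")
assert a.trusted_keys["D"] == "keyD"
```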
Abstract:
“Our students have changed radically. Today’s students are no longer the people our educational system was designed to teach” (Prensky, 2001, p. 1). The influx of available new technology has helped to democratise knowledge, transforming when, where and how learning takes place, and changing perceptions of traditional learning landscapes (JISC, 2006; Neary et al., 2010). Mobile computers, combined with wireless technology, have completely transformed the educational world; students have turned nomad[ic], engaging in conversations and thinking across traditional campus spaces (Alexander, 2004; Fisher, 2005). In this workshop we will attempt to demystify a facet of mobile learning by working in small groups to set up and kick-start a number of social media sites which can be used for collaboration and information exchange in the design studio.
Abstract:
In this work we discuss the effects of white and coloured noise perturbations on the parameters of a mathematical model of bacteriophage infection introduced by Beretta and Kuang in [Math. Biosc. 149 (1998) 57]. We numerically simulate the strong solutions of the resulting systems of stochastic ordinary differential equations (SDEs), with respect to the global error, by means of numerical methods of both Euler-Taylor expansion and stochastic Runge-Kutta type.
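As a concrete illustration of the simplest Euler-type scheme for SDEs (Euler-Maruyama), the sketch below integrates a generic Itô system dX = f(X) dt + g(X) dW. The drift, diffusion and parameters are placeholders, not the Beretta-Kuang bacteriophage model or the authors' higher-order methods.

```python
import numpy as np

def euler_maruyama(f, g, x0, t_end, n_steps, rng):
    """Strong Euler-Maruyama approximation of dX = f(X) dt + g(X) dW.

    f, g : callables returning arrays of the same shape as x0.
    Returns the array of approximate solution values at each time grid point.
    """
    dt = t_end / n_steps
    x = np.array(x0, dtype=float)
    path = [x.copy()]
    for _ in range(n_steps):
        dw = rng.normal(0.0, np.sqrt(dt), size=x.shape)  # Brownian increments
        x = x + f(x) * dt + g(x) * dw
        path.append(x.copy())
    return np.array(path)

# Example: logistic drift with small multiplicative noise (placeholder model).
rng = np.random.default_rng(0)
f = lambda x: x * (1.0 - x)   # deterministic drift
g = lambda x: 0.1 * x         # noise intensity
path = euler_maruyama(f, g, x0=[0.5], t_end=10.0, n_steps=1000, rng=rng)
print(path[-1])
```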
Abstract:
Introduction: QC and EQA are integral to good pathology laboratory practice. Medical Laboratory Science students undertake a project exploring internal QC and EQA procedures used in chemical pathology laboratories. Each student represents an individual lab and the class group represents the peer group of labs performing the same assay using the same method. Methods: Using a manual BCG assay for serum albumin, normal and abnormal controls are run with a patient sample over 7 weeks. The QC results are assessed each week using calculated z-scores and both 2S and 3S control rules to determine whether a run is ‘in control’. At the end of the 7 weeks a completed LJ chart is assessed using the Westgard Multirules. Students investigate causes of error and the implications for both lab practice and patient care if runs are not ‘in control’. Twice in the 7 weeks, two EQA samples (with target values unknown) are assayed alongside the weekly QC and patient samples. Results from each student are collated and form the basis of an EQA program. ALP are provided and students complete a Youden Plot, which is used to analyse the performance of each ‘lab’ and of the method to identify bias. Students explore the possible clinical implications of a biased method and address the actions that should be taken if a lab is not in consensus with the peer group. Conclusion: This project is a model of ‘real world’ practice in which students demonstrate an understanding of the importance of QC procedures in a pathology laboratory, apply and interpret statistics and QC rules and charts, apply critical thinking and analytical skills to quality performance data to make recommendations for further practice, and improve their technical competence and confidence.
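The weekly QC assessment can be summarised in a short sketch: compute a z-score for each control result against its assigned mean and SD, then flag the run with simple 1-2S (warning) and 1-3S (rejection) limits. The target mean, SD and results below are invented numbers for illustration, not the assay's actual values.

```python
def z_score(result, mean, sd):
    """Number of standard deviations a control result lies from its target mean."""
    return (result - mean) / sd

def assess_run(results, mean, sd):
    """Apply simple 2S/3S limits to a run of control results (illustrative only)."""
    z = [z_score(r, mean, sd) for r in results]
    if any(abs(v) > 3 for v in z):
        return z, "reject (1-3S rule violated)"
    if any(abs(v) > 2 for v in z):
        return z, "warning (1-2S rule violated)"
    return z, "in control"

# Hypothetical albumin QC: target mean 40 g/L, SD 1.5 g/L.
z, status = assess_run([41.0, 43.2], mean=40.0, sd=1.5)
print([round(v, 2) for v in z], status)   # [0.67, 2.13] warning (1-2S rule violated)
```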
Abstract:
"The second of the Oral History Workshops conducted by Associate Professor Helen Klaebe and the Oral History team from the Queensland University of Technology in Brisbane, was conducted in El Arish on the last weekend in September 2011. The first workshop was held in Cardwell in March 2011. Historical Society members and other researchers from both the Cardwell and El Arish areas combined to organise and fund the workshops, which have produced a growing collection of recordings of personal stories from people with a wide variety of experiences during and after cyclone Yasi. Aside from being productive in documenting history, the workshops have offered a greatly appreciated educational opportunity for many people, most of whom have never before had access to such benefits. Not only were they able to learn history gathering methodologies and the relevant technical skills, but they also gained new experience in the use of computers to apply these skills. These far northern oral history workshops took the form of a shortened version of the 5 series workshops being presented at QUT in Brisbane this year. The agenda was aligned to the wishes and requirements of the participants who attended from the Cassowary Coast and Tableland regions."
Abstract:
Background: Extracorporeal membrane oxygenation (ECMO) is a complex rescue therapy used to provide cardiac and/or respiratory support for critically ill patients who have failed maximal conventional medical management. ECMO is based on a modified cardiopulmonary bypass (CPB) circuit and can provide cardiopulmonary support for up to several months. It can be used in a veno-venous configuration (VV-ECMO) for isolated respiratory failure, or in a veno-arterial configuration (VA-ECMO) where support is necessary for cardiac +/- respiratory failure. The ECMO circuit consists of five main components: large-bore cannulae (access cannulae) for drainage of the venous system, with return cannulae to either the venous (in VV-ECMO) or arterial (in VA-ECMO) system; an oxygenator, with a vast surface area of hollow filaments, which allows addition of oxygen and removal of carbon dioxide; a centrifugal blood pump, which propels blood through the circuit at up to 10 L/minute; a control module; and a thermoregulatory unit, which allows exact temperature control of the extracorporeal blood. Methods: The first successful use of ECMO for ARDS in adults occurred in 1972, and its use has become more commonplace over the last 30 years, supported by improvements in the design and biocompatibility of the equipment, which have reduced the morbidity associated with this modality. Whilst the use of ECMO in the neonatal population has been supported by numerous studies, the evidence upon which ECMO was integrated into adult practice was substantially less robust. Results: Recent data, including the CESAR study (Conventional Ventilatory Support versus Extracorporeal Membrane Oxygenation for Severe Respiratory Failure), have added a degree of evidence to the role of ECMO in such a patient population. The CESAR study analysed 180 patients and confirmed that ECMO was associated with an improved rate of survival. More recently, ECMO has been utilised in numerous situations within the critical care area, including support for high-risk percutaneous interventions in the cardiac catheter lab, the operating room and the emergency department, as well as in specialised inter-hospital retrieval services. The increased understanding of the risk:benefit profile of ECMO, along with a reduction in the morbidity associated with its use, will doubtless lead to a substantial rise in the utilisation of this modality. As with all extracorporeal circuits, ECMO opposes the basic premises of the mammalian inflammation and coagulation cascades: when blood comes into contact with a foreign circuit, both cascades are activated. Anticoagulation is readily dealt with through the use of agents such as heparin, but the inflammatory excess, whilst less macroscopically obvious, continues unabated. Platelet consumption and neutrophil activation occur rapidly, and the clinician is faced with balancing the need for anticoagulation for the circuit against haemostasis in an acutely bleeding patient. Alterations in pharmacokinetics may result in inadequate levels of disease-modifying therapeutics, such as antibiotics, hence paradoxically delaying recovery from conditions such as pneumonia. Key elements of nutrition and the innate immune system may similarly be affected. Summary: This presentation will discuss the basic features of ECMO for the non-specialist, and review the clinical conundrum faced by the team treating these most complex cases.
Abstract:
An optical system which performs the multiplication of binary numbers is described and proof-of-principle experiments are performed. The simultaneous generation of all partial products, optical regrouping of bit products, and optical carry look-ahead addition are novel features of the proposed scheme which takes advantage of the parallel operations capability of optical computers. The proposed processor uses liquid crystal light valves (LCLVs). By space-sharing the LCLVs one such system could function as an array of multipliers. Together with the optical carry look-ahead adders described, this would constitute an optical matrix-vector multiplier.
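The arithmetic underlying the scheme (simultaneous generation of all partial products followed by carry look-ahead addition) can be sketched in software; this illustrates only the logic, not the optical LCLV implementation, and the bit width is an arbitrary choice.

```python
def carry_lookahead_add(a_bits, b_bits):
    """Add two equal-length bit lists (LSB first) using generate/propagate signals."""
    g = [a & b for a, b in zip(a_bits, b_bits)]   # carry generate
    p = [a | b for a, b in zip(a_bits, b_bits)]   # carry propagate
    carries = [0]
    for i in range(len(a_bits)):                  # c[i+1] = g[i] | (p[i] & c[i])
        carries.append(g[i] | (p[i] & carries[i]))
    s = [a ^ b ^ c for a, b, c in zip(a_bits, b_bits, carries)]
    return s + [carries[-1]]                      # sum bits plus final carry

def multiply(x, y, width=4):
    """Form all partial products at once, then sum the shifted rows.

    For brevity the rows are reduced with ordinary integer addition here; in the
    optical scheme the regrouped bit products feed carry look-ahead adders.
    """
    x_bits = [(x >> i) & 1 for i in range(width)]
    partial = [[x_bits[i] & ((y >> j) & 1) for i in range(width)] for j in range(width)]
    result = 0
    for j, row in enumerate(partial):             # each row is x * y_j, shifted by j
        result += sum(bit << (i + j) for i, bit in enumerate(row))
    return result

print(multiply(13, 11))                                  # 143
print(carry_lookahead_add([1, 0, 1, 1], [1, 1, 0, 1]))   # 13 + 11 = 24 -> [0, 0, 0, 1, 1]
```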
Abstract:
The method of lines is a standard method for advancing the solution of partial differential equations (PDEs) in time. In one sense, the method applies equally well to space-fractional PDEs as it does to integer-order PDEs. However, there is a significant challenge when solving space-fractional PDEs in this way, owing to the non-local nature of the fractional derivatives. Each equation in the resulting semi-discrete system involves contributions from every spatial node in the domain. This has important consequences for the efficiency of the numerical solver, especially when the system is large. First, the Jacobian matrix of the system is dense, and hence methods that avoid the need to form and factorise this matrix are preferred. Second, since the cost of evaluating the discrete equations is high, it is essential to minimise the number of evaluations required to advance the solution in time. In this paper, we show that an effective preconditioner is essential for improving the efficiency of the method of lines for solving a quite general two-sided, nonlinear space-fractional diffusion equation. A key contribution is to show how to construct suitable banded approximations to the system Jacobian for preconditioning purposes that permit high orders and large stepsizes to be used in the temporal integration, without requiring dense matrices to be formed. The results of numerical experiments are presented that demonstrate the effectiveness of this approach.
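A minimal sketch of the banded-preconditioning idea, under toy assumptions (a synthetic dense matrix standing in for the Jacobian of a space-fractional operator, and an arbitrary bandwidth) rather than the authors' discretisation: keep only a few diagonals of the dense operator, factorise that banded approximation, and use it as a preconditioner for a Krylov solve.

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import LinearOperator, gmres, splu

n, bandwidth = 200, 2

# Toy stand-in for the dense Jacobian of a space-fractional operator:
# off-diagonal entries decay with distance but never vanish (non-local coupling).
i, j = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
J = -1.0 / (1.0 + np.abs(i - j)) ** 1.5
J[np.diag_indices(n)] = 4.0

# Banded approximation: keep only the diagonals within the chosen bandwidth.
offsets = list(range(-bandwidth, bandwidth + 1))
J_band = diags([np.diag(J, k) for k in offsets], offsets, format="csc")

# Preconditioner: apply the sparse LU factors of the banded approximation.
lu = splu(J_band)
M = LinearOperator((n, n), matvec=lu.solve)

# Preconditioned Krylov solve against the full dense operator.
b = np.ones(n)
x, info = gmres(J, b, M=M)
print(info, np.linalg.norm(J @ x - b))
```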
Abstract:
Motivation: Unravelling the genetic architecture of complex traits requires large amounts of data, sophisticated models and large computational resources. The lack of user-friendly software incorporating all these requisites is delaying progress in the analysis of complex traits. Methods: Linkage disequilibrium and linkage analysis (LDLA) is a high-resolution gene mapping approach based on sophisticated mixed linear models, applicable to any population structure. LDLA can use population history information in addition to pedigree and molecular markers to decompose traits into genetic components. Analyses are distributed in parallel over a large public grid of computers in the UK. Results: We have demonstrated the performance of LDLA with analyses of simulated data. There are real gains in statistical power to detect quantitative trait loci when using historical information compared with traditional linkage analysis. Moreover, the use of a grid of computers significantly increases computational speed, hence allowing analyses that would have been prohibitive on a single computer. © The Author 2009. Published by Oxford University Press. All rights reserved.