96 results for ease of publication
Abstract:
The emergence of ePortfolios is relatively recent in the university sector as a way to engage students in their learning and assessment, and to produce records of their accomplishments. An ePortfolio is an online tool that students can utilise to record, catalogue, retrieve and present reflections and artefacts that support and demonstrate the development of graduate students’ capabilities and professional standards across university courses. The ePortfolio is therefore considered as both process and product. Although ePortfolios show promise as a useful tool and their uptake has grown, they are not yet a mainstream higher education technology. To date, the emphasis has been on investigating their potential to support the multiple purposes of learning, assessment and employability, but less is known about whether and how students engage with ePortfolios in the university setting. This thesis investigates student engagement with an ePortfolio in one university. As the educational designer for the ePortfolio project at the University, I was uniquely positioned as a researching professional to undertake an inquiry into whether students were engaging with the ePortfolio. The participants in this study were a cohort (defined by enrolment in a unit of study) of second and third year education students (n=105) enrolled in a four year Bachelor of Education degree. The students were introduced to the ePortfolio in an introductory lecture and a hands-on workshop in a computer laboratory. They were subsequently required to complete a compulsory assessment task – a critical reflection – using the ePortfolio. Following that, engagement with the ePortfolio was voluntary. A single case study approach arising from an interpretivist paradigm directed the methodological approach and research design for this study. 
The study investigated the participants’ own accounts of their experiences with the ePortfolio, including how and when they engaged with it and the factors that impacted on their engagement. Data collection methods consisted of an attitude survey, student interviews, document collection, a researcher reflective journal and researcher observations. The findings of the study show that, while the students were encouraged to use the ePortfolio as a learning and employability tool, most students ultimately chose to disengage after completing the assessment task. Only six of the forty-five students (13%) who completed the research survey had used the ePortfolio in a sustained manner. The data obtained from the students during this research provide insight into the reasons why they disengaged from the ePortfolio. The findings add to the understandings and descriptions of student engagement with technology and, more broadly, advance the understanding of ePortfolios. They also contribute to the interdisciplinary field of technology implementation. There are three key outcomes from this study: a model of student engagement with technology, a set of criteria for the design of an ePortfolio, and a set of recommendations for effective practice for those implementing ePortfolios. The first, the Model of Student Engagement with Technology (MSET) (Version 2), explores student engagement with technology by highlighting key engagement decision points for students. The model was initially conceptualised by building on the work of previous research (Version 1); however, following data analysis a new model emerged, MSET (Version 2). 
The engagement decision points were identified as:
• Prior Knowledge and Experience, leading to imagined usefulness and imagined ease of use;
• Initial Supported Engagement, leading to supported experience of usefulness and supported ease of use;
• Initial Independent Engagement, leading to actual experience of independent usefulness and actual ease of use; and
• Ongoing Independent Engagement, leading to ongoing experience of usefulness and ongoing ease of use.
The Model of Student Engagement with Technology (MSET) goes beyond numerical usage figures to demonstrate student engagement with an ePortfolio. The explanatory power of the model lies in identifying the types of decisions that students make, and when they make them, during the engagement process. This model presents a greater depth of understanding of student engagement than was previously available and has implications for the direction and timing of future implementation, and for academic and student development activities. The second key outcome from this study is a set of criteria for the re-conceptualisation of the University ePortfolio. The knowledge gained from this research has resulted in a new set of design criteria that focus on the student actions of writing reflections and adding artefacts. The process of using the ePortfolio is reconceptualised in terms of privileging student learning over administrative compliance: the writing of critical reflections is the key function of the ePortfolio, not the selection of capabilities. The third key outcome from this research consists of five recommendations for university practice that have arisen from this study. 
They are: that sustainable implementation is more often achieved through small steps building on one another; that a clear definition of the purpose of an ePortfolio is crucial for students and staff; that ePortfolio pedagogy, not the technology, should be the driving force; that the merit of the ePortfolio must be fostered in students and staff; and, finally, that supporting delayed task performance is crucial. Students do not adopt an ePortfolio just because it is provided. While students must accept responsibility for their own engagement with the ePortfolio, the institution has to accept responsibility for providing the environment, and the technical and pedagogical support, needed to foster engagement. Ultimately, an ePortfolio should be considered a joint venture between student and institution in which strong returns on investment can be realised by both. It is acknowledged that the current implementation strategies for the ePortfolio are just the beginning of a much longer process. The real rewards for students, academics and the university lie in the future.
Abstract:
During 1999 the Department of Industry, Science and Resources (ISR) published four research reports it had commissioned from the Australian Expert Group in Industry Studies (AEGIS), a research centre of the University of Western Sydney, Macarthur. ISR will shortly publish the fifth and final report in this series. The five reports were commissioned by the Department, as part of the Building and Construction Action Agenda process, to investigate the dynamics and performance of the sector, particularly in relation to its innovative capacity. Professor Jane Marceau, PVCR at the University of Western Sydney and Director of AEGIS, led the research team. Dr Karen Manley was the researcher and joint author on three of the five reports. This paper outlines the approach and key findings of each of the five reports. The reports examined five key elements of the ‘building and construction product system’. The term ‘product system’ reflects the very broad range of industries and players we consider to contribute to the performance of the building and construction industries. It also highlights our focus on the systemic qualities of these industries: we were most interested in the inter-relationships between key segments and players and how these affected the innovation potential of the product system. The ‘building and construction product system’ is hereafter referred to as ‘the industry’ for ease of presentation. All the reports are based, at least in part, on an interviewing or survey research phase which involved gathering data from public and private sector players nationally. The first report ‘maps’ the industry to identify and describe its key elements and the inter-relationships between them. The second report focuses specifically on the linkages between public-sector research organisations and firms in the industry. The third report examines the conditions surrounding the emergence of new businesses in the industry. 
The fourth report examines how manufacturing businesses are responding to customer demands for ‘total solutions’ to their building and construction needs, by providing various services to clients. The fifth report investigates the capacity of the industry to encourage and undertake energy efficient building design and construction.
Abstract:
Electronic Blocks are a new programming environment, designed specifically for children aged between three and eight years. As such, the design of the Electronic Block environment is firmly based on principles of developmentally appropriate practices in early childhood education. The Electronic Blocks are physical, stackable blocks that include sensor blocks, action blocks and logic blocks. Evaluation of the Electronic Blocks with both preschool and primary school children shows that the blocks' ease of use and power of engagement have created a compelling tool for the introduction of meaningful technology education in an early childhood setting. The key to the effectiveness of the Electronic Blocks lies in an adherence to theories of development and learning throughout the Electronic Blocks design process.
Abstract:
Maximum-likelihood estimates of the parameters of stochastic differential equations are consistent and asymptotically efficient, but unfortunately difficult to obtain if a closed-form expression for the transitional probability density function of the process is not available. As a result, a large number of competing estimation procedures have been proposed. This article provides a critical evaluation of the various estimation techniques. Special attention is given to the ease of implementation and comparative performance of the procedures when estimating the parameters of the Cox–Ingersoll–Ross and Ornstein–Uhlenbeck equations respectively.
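For the Ornstein–Uhlenbeck equation mentioned above, the transitional density is Gaussian in closed form, so exact (conditional) maximum-likelihood estimation reduces to linear regression on the discretised AR(1) form. The sketch below is illustrative only and is not drawn from the article; the function names and parameter values are hypothetical.

```python
import math
import random

def simulate_ou(kappa, theta, sigma, dt, n, seed=42):
    """Simulate an Ornstein-Uhlenbeck path using its exact Gaussian transition:
    X[k+1] = theta + phi*(X[k] - theta) + eps, phi = exp(-kappa*dt)."""
    rng = random.Random(seed)
    phi = math.exp(-kappa * dt)
    sd = sigma * math.sqrt((1.0 - phi * phi) / (2.0 * kappa))
    x = [theta]
    for _ in range(n):
        x.append(theta + phi * (x[-1] - theta) + sd * rng.gauss(0.0, 1.0))
    return x

def fit_ou(x, dt):
    """Conditional MLE for (kappa, theta, sigma), via OLS on the AR(1) form."""
    n = len(x) - 1
    xs, ys = x[:-1], x[1:]
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((a - mx) ** 2 for a in xs)
    sxy = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    phi = sxy / sxx                       # AR(1) slope estimate
    c = my - phi * mx                     # AR(1) intercept estimate
    resid_var = sum((b - c - phi * a) ** 2 for a, b in zip(xs, ys)) / n
    kappa = -math.log(phi) / dt           # invert phi = exp(-kappa*dt)
    theta = c / (1.0 - phi)               # long-run mean
    sigma = math.sqrt(resid_var * 2.0 * kappa / (1.0 - phi * phi))
    return kappa, theta, sigma
```

On a long simulated path the estimator recovers the generating parameters to within sampling error, which is one reason the Ornstein–Uhlenbeck case serves as a convenient benchmark for the approximate procedures the article compares.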
Abstract:
The uniformization method (also known as randomization) is a numerically stable algorithm for computing transient distributions of a continuous time Markov chain. When the solution is needed after a long run or when the convergence is slow, the uniformization method involves a large number of matrix-vector products. Despite this, the method remains very popular due to its ease of implementation and its reliability in many practical circumstances. Because calculating the matrix-vector product is the most time-consuming part of the method, overall efficiency in solving large-scale problems can be significantly enhanced if the matrix-vector product is made more economical. In this paper, we incorporate a new relaxation strategy into the uniformization method to compute the matrix-vector products only approximately. We analyze the error introduced by these inexact matrix-vector products and discuss strategies for refining the accuracy of the relaxation while reducing the execution cost. Numerical experiments drawn from computer systems and biological systems are given to show that significant computational savings are achieved in practical applications.
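The uniformization scheme described above can be sketched compactly. This is a minimal illustration under stated assumptions, not the paper's inexact-product variant: it uses exact matrix-vector products and truncates the Poisson series once the accumulated weights reach 1 − tol. The function name and the example chain are hypothetical.

```python
import math

def uniformization(q, pi0, t, tol=1e-12):
    """Transient distribution pi(t) of a CTMC with generator matrix q via
    uniformization: pi(t) = sum_k Pois(k; lam*t) * pi0 * P^k, with P = I + Q/lam."""
    n = len(q)
    lam = max(-q[i][i] for i in range(n)) or 1.0     # uniformization rate
    # Uniformized DTMC transition matrix P = I + Q/lam (row-stochastic)
    p = [[(1.0 if i == j else 0.0) + q[i][j] / lam for j in range(n)]
         for i in range(n)]
    v = list(pi0)                  # v = pi0 * P^k, updated each iteration
    w = math.exp(-lam * t)         # Poisson weight Pois(0; lam*t)
    out = [w * x for x in v]
    acc, k = w, 0
    while acc < 1.0 - tol:         # stop once Poisson mass is (almost) exhausted
        k += 1
        # vector-matrix product: the dominant cost the paper seeks to reduce
        v = [sum(v[i] * p[i][j] for i in range(n)) for j in range(n)]
        w *= lam * t / k           # recurrence for the next Poisson weight
        acc += w
        out = [o + w * x for o, x in zip(out, v)]
    return out
```

For a two-state chain the result can be checked against the closed-form solution; the vector-matrix product inside the loop is exactly the step the paper replaces with a cheaper, inexact computation.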
Abstract:
Technologies and languages for integrated processes are a relatively recent innovation, and in that time many divergent waves of innovation have transformed process integration. Like sockets and distributed objects, early workflow systems offered programming interfaces that connected the process modelling layer to any middleware. BPM systems emerged later, connecting the modelling world to middleware through components. While BPM systems increased ease of use (modelling convenience), long-standing and complex interactions involving many process instances remained difficult to model. Enterprise Service Buses (ESBs) followed, connecting process models to heterogeneous forms of middleware. ESBs, however, generally forced modellers to choose a particular underlying middleware and to stick to it, despite their ability to connect with many forms of middleware. Furthermore, ESBs encourage process integrations to be modelled on their own, logically separate from the process model. This can lead to an inability to reason about long-standing conversations at the process layer. Technologies and languages for process integration generally lack formality, which has led to arbitrariness in the underlying language building blocks. Conceptual holes exist in a range of technologies and languages for process integration, and these can lead to customer dissatisfaction and to integration projects failing to reach their potential. Standards for process integration share fundamental flaws with these languages and technologies, and standards are also in direct competition with other standards, causing a lack of clarity. Thus the area of greatest risk in a BPM project remains process integration, despite major advancements in the technology base. This research examines some fundamental aspects of communication middleware and how these fundamental building blocks of integration can be brought to the process modelling layer in a technology-agnostic manner. 
This way process modelling can be conceptually complete without becoming stuck in a particular middleware technology. Coloured Petri nets are used to define a formal semantics for the fundamental aspects of communication middleware. They provide the means to define and model the dynamic aspects of various integration middleware. Process integration patterns are used as a tool to codify common problems to be solved. Object Role Modelling is a formal modelling technique that was used to define the syntax of a proposed process integration language. This thesis provides several contributions to the field of process integration. It proposes a framework defining the key notions of integration middleware. This framework provides a conceptual foundation upon which a process integration language could be built. The thesis defines an architecture that allows various forms of middleware to be aggregated and reasoned about at the process layer. This thesis provides a comprehensive set of process integration patterns. These constitute a benchmark for the kinds of problems a process integration language must support. The thesis proposes a process integration modelling language and a partial implementation that is able to enact the language. A process integration pilot project in a German hospital is briefly described at the end of the thesis. The pilot is based on ideas in this thesis.
Abstract:
The compressed gas industry and government agencies worldwide utilize "adiabatic compression" testing for qualifying high-pressure valves, regulators, and other related flow control equipment for gaseous oxygen service. This test methodology is known by various terms, including adiabatic compression testing, gaseous fluid impact testing, pneumatic impact testing, and BAM testing. The test methodology is described in greater detail throughout this document, but in summary it consists of pressurizing a test article (valve, regulator, etc.) with gaseous oxygen within 15 to 20 milliseconds (ms). Because the driven gas and the driving gas are rapidly compressed to the final test pressure at the inlet of the test article, they are rapidly heated by the sudden increase in pressure to temperatures (thermal energies) sufficient to sometimes ignite the nonmetallic materials (seals and seats) used within the test article. In general, the more rapid the compression process, the more "adiabatic" the pressure surge is presumed to be and the more closely it has been argued to simulate an isentropic process. Generally speaking, adiabatic compression is widely considered the most efficient ignition mechanism for directly kindling a nonmetallic material in gaseous oxygen and has been implicated in many fire investigations. Because many nonmetallic materials are easily ignited by this heating mechanism, many industry standards prescribe this testing. However, the results between various laboratories conducting the testing have not always been consistent. Research into the test method indicated that the thermal profile achieved (i.e., the temperature/time history of the gas) during adiabatic compression testing as required by the prevailing industry standards has not been fully modeled or empirically verified, although attempts have been made. 
This research evaluated the following questions:
1) Can the rapid compression process required by the industry standards be thermodynamically and fluid-dynamically modeled so that predictions of the thermal profiles can be made?
2) Can the thermal profiles produced by the rapid compression process be measured, in order to validate the thermodynamic and fluid-dynamic models and to estimate the severity of the test?
3) Can controlling parameters be recommended so that new guidelines may be established for the industry standards, resolving inconsistencies between the various test laboratories conducting tests according to the present standards?
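As a rough illustration of why rapid compression heats the gas so strongly, the textbook ideal-gas isentropic relation gives the upper-bound temperature reached if the compression were perfectly adiabatic and reversible. This is a standard thermodynamic identity, not the thesis's thermodynamic/fluid-dynamic model, and the pressures in the example are illustrative only.

```python
def isentropic_final_temperature(t1_kelvin, p1, p2, gamma=1.4):
    """Ideal-gas isentropic compression: T2 = T1 * (P2/P1)^((gamma-1)/gamma).
    gamma defaults to ~1.4, the specific heat ratio of diatomic gases like O2
    near room temperature."""
    return t1_kelvin * (p2 / p1) ** ((gamma - 1.0) / gamma)
```

Even a modest pressure ratio of 10 raises 300 K gas to roughly 580 K under this idealisation, which is why millisecond-scale pressurisation can drive seat and seal materials past their ignition temperatures; real surges fall somewhere between this isentropic bound and slower, heat-losing processes.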
Abstract:
As access to networked digital communities increases, a growing number of teens participate in digital communities by creating and sharing a variety of content. The affordances of social media - ease of use, ubiquitous access, and communal nature - have made creating and sharing content an appealing process for teens. Teens primarily learn the practices of encountering and using information through social interaction and participation within digital communities. This article adopts the position that information literacy is the experience of using information to learn. It reports on an investigation into teens' experiences in the United States as they use information to learn how to create content and participate within the context of social media. Teens who share art on sites such as DeviantArt, create websites, blog, and/or post edited videos via YouTube and Vimeo were interviewed. The interviews explored teens' information experiences within particular social and digital contexts. Teens discussed the information they used, how that information was gathered and accessed, and the process of using it to participate in the communities.
Abstract:
Child passenger injury remains a major road safety issue despite advances in biomechanical understanding and child restraint design. In Australia, one intervention with parents to encourage universal and consistent use of the most appropriate restraint, as well as to draw their attention to critical aspects of installation, is the RoadWise Type 1 Child Car Restraints Fitting Service, WA. A mixed methods evaluation of this service was conducted in early 2010. Evaluation results suggest that it has been effective in ensuring good quality training of child restraint fitters. In addition, stakeholder and user satisfaction with the Service is high, with participants agreeing that the Service is valuable to the community, and fitters regarding the training course, materials and post-training support as effective. However, a continuing issue for interventions of this type is whether the parents who need them perceive this need. Evidence from the evaluation suggests that only about 25% of parents who could benefit from the Service actually use it. This may be partly due to parental perceptions that such services are not necessary or relevant to them, or to overconfidence about the ease of installing restraints correctly. Thus there is scope for improving awareness of the Service amongst the groups most likely to benefit from it (e.g. new parents) and for alerting parents to the importance of correct installation and of getting their self-installed restraints checked. Efforts to inform and influence parents should begin when their children are very young, preferably at or prior to birth and/or before the parent installs the first restraint.
Abstract:
A traditional approach centred on weekly lectures, perhaps supported by a tutorial programme, still predominates in modern legal education in Australia. This approach tends to focus on the transmission of knowledge about legal rules and doctrine to students who adopt a largely passive role. Criticisms of the traditional approach have led to law schools expanding their curricula to include the teaching of skills, including the skill of negotiation and an appreciation of legal ethics and professional responsibility. However, in a climate of limited government funding for law schools in Australia, innovation in legal education remains a challenge. This paper considers the successful use of Second Life machinima in two programs, Air Gondwana and Entry into Valhalla and their part in the creation of engaging, effective learning environments. These programs not only engage students in active learning but also facilitate flexibility in their studies and other benefits. The programs yield important lessons concerning the use of machinima innovations in curricula, not only for academics involved in legal education but also those in other disciplines, especially those that rely on traditional passive lectures in their teaching and learning approaches.
Abstract:
As online business thrives, a company’s Web presence holds enormous importance as a source of information, entertainment, and customer service for Internet users. Besides being user-friendly, a Web site should offer interesting and enjoyable content to attract online visitors in an ever-changing multimedia environment. Companies that operate globally must know how cultural differences influence the way potential customers perceive their sites. This paper presents a model that highlights the importance of ease of use, enjoyment, content, and brand trust for Web site loyalty. The model is subsequently tested in four countries: Australia, Japan, Mongolia, and the USA. The results show that perceptual differences exist: while ease of use is crucial for Web site loyalty in all four countries, the importance of content, perceived enjoyment, and brand trust varies across different cultures.
Abstract:
The ubiquity of multimodality in hypermedia environments is undeniable. Bezemer and Kress (2008) have argued that writing has been displaced by image as the central mode for representation. Given the current technical affordances of digital technology and user-friendly interfaces that enable the ease of multimodal design, the conspicuous absence of images in certain domains of cyberspace is deserving of critical analysis. In this presentation, I examine the politics of discourses implicit within hypertextual spaces, drawing textual examples from a higher education website. I critically examine the role of writing and other modes of production used in what Fairclough (1993) refers to as discourses of marketisation in higher education, tracing four pervasive discourses of teaching and learning in the current economy: i) materialization, ii) personalization, iii) technologisation, and iv) commodification (Fairclough, 1999). Each of these arguments is supported by the critical analysis of multimodal texts. The first is a podcast highlighting the new architectonic features of a university learning space. The second is a podcast and transcript of a university Open Day interview with prospective students. The third is a time-lapse video showing the construction of a new science and engineering precinct. These three multimodal texts contrast with a final web-based text that exhibits a predominance of writing and the powerful absence or silencing of the image. I connect the weightiness of words and the function of monomodality to the commodification of discourses, noting their resistance to the multimodal affordances of web-based technologies and how this is used to establish particular sets of subject positions and ideologies that readers are constrained to occupy. 
Applying principles of critical language study from theorists including Fairclough, Kress, Lemke, and others whose semiotic analysis of texts focuses on the connections between language, power, and ideology, I demonstrate how the denial of image and the privileging of written words in the multimodality of cyberspace is an ideological effect that accentuates the dominance of the institution.
Numerical and experimental studies of cold-formed steel floor systems under standard fire conditions
Abstract:
Light gauge cold-formed steel frame (LSF) structures are increasingly used in industrial, commercial and residential buildings because of their non-combustibility, dimensional stability, and ease of installation. A floor-ceiling system is an example of its applications. LSF floor-ceiling systems must be designed to serve as fire compartment boundaries and provide adequate fire resistance. Fire rated floor-ceiling assemblies formed with new materials and construction methodologies have been increasingly used in buildings. However, limited research has been undertaken in the past and hence a thorough understanding of their fire resistance behaviour is not available. Recently a new composite panel, in which an external insulation layer is used between two plasterboards, has been developed at QUT to provide a higher fire rating to LSF floors under standard fire conditions. However, its increased fire rating could not be determined using the currently available design methods. Research on LSF floor systems under fire conditions is relatively recent and the behaviour of floor joists and other components in the systems is not fully understood. The present design methods thus require the use of expensive fire protection materials to protect them from excessive heat increase during a fire. This leads to uneconomical and conservative designs. Fire rating of these floor systems is provided simply by adding more plasterboard sheets to the steel joists, and such an approach is totally inefficient. Hence a detailed fire research study was undertaken into the structural and thermal performance of LSF floor systems, including those protected by the new composite panel system, using full scale fire tests and extensive numerical studies. The experimental study included both the conventional and the new steel floor-ceiling systems under structural and fire loads using a gas furnace designed to deliver heat in accordance with the standard time-temperature curve in AS 1530.4 (SA, 2005). 
Fire tests included the behavioural and deflection characteristics of LSF floor joists until failure as well as related time-temperature measurements across the section and along the length of all the specimens. Full scale fire tests have shown that the structural and thermal performance of the externally insulated LSF floor system was superior to that of traditional LSF floors with or without cavity insulation. Therefore this research recommends the use of the new composite panel system for cold-formed LSF floor-ceiling systems. The numerical analyses of LSF floor joists were undertaken using the finite element program ABAQUS based on the measured time-temperature profiles obtained from fire tests under both steady state and transient state conditions. Mechanical properties at elevated temperatures were considered based on the equations proposed by Dolamune Kankanamge and Mahendran (2011). Finite element models were calibrated using the full scale test results and used to provide a detailed understanding of the structural fire behaviour of the LSF floor-ceiling systems. The models also confirmed the superior performance of the new composite panel system. The validated model was then used in a detailed parametric study. Fire tests and the numerical studies showed that plasterboards provided sufficient lateral restraint to LSF floor joists until their failure. Hence only the section moment capacity of LSF floor joists subjected to local buckling effects was considered in this research. To predict the section moment capacity at elevated temperatures, the effective section modulus of joists at ambient temperature is generally considered adequate. However, this research has shown that it leads to considerable over-estimation of the local buckling capacity of joists subjected to non-uniform temperature distributions under fire conditions. 
Therefore new simplified fire design rules were proposed for LSF floor joists to determine the section moment capacity at elevated temperatures based on AS/NZS 4600 (SA, 2005), NAS (AISI, 2007) and Eurocode 3 Part 1.3 (ECS, 2006). The accuracy of the proposed fire design rules was verified with finite element analysis results. A spreadsheet-based design tool was also developed based on these design rules to predict the failure load ratio versus time, and moment capacity versus time and temperature, for various LSF floor configurations. Idealised time-temperature profiles of LSF floor joists were developed based on fire test measurements. They were used in the detailed parametric study to fully understand the structural and fire behaviour of LSF floor panels. Simple design rules were also proposed to predict both critical average joist temperatures and failure times (fire rating) of LSF floor systems with various floor configurations and structural parameters under any given load ratio. Findings from this research have led to a comprehensive understanding of the structural and fire behaviour of LSF floor systems, including those protected by the new composite panel, and to simple design methods. These design rules were proposed within the guidelines of the Australian/New Zealand, American and European cold-formed steel structures standard codes of practice. They may also lead to further improvements to fire resistance through suitable modifications to the current composite panel system.
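For context, the section moment capacity calculations referred to above rest on effective-width provisions of the kind found in AS/NZS 4600 and similar cold-formed steel codes. The sketch below shows the generic Winter effective-width formula for a stiffened compression element at a single design stress; it illustrates only the ambient-temperature starting point, not the thesis's proposed elevated-temperature rules, and the default material values are assumptions.

```python
import math

def effective_width(b, t, f_star, e_mod=200e3, k=4.0, nu=0.3):
    """Effective width b_e of a stiffened plate element in uniform compression
    (Winter's formula). b, t in mm; stresses in MPa. Defaults assume steel
    (E ~ 200 GPa, nu = 0.3) and a simply supported stiffened element (k = 4)."""
    # Elastic local buckling stress of the plate element
    f_cr = k * math.pi ** 2 * e_mod / (12.0 * (1.0 - nu ** 2)) * (t / b) ** 2
    slenderness = math.sqrt(f_star / f_cr)
    if slenderness <= 0.673:
        rho = 1.0                                    # fully effective element
    else:
        rho = (1.0 - 0.22 / slenderness) / slenderness   # Winter reduction
    return rho * b
```

Under fire conditions both E and the yield stress degrade with temperature, so the slenderness (and hence the effective width) changes along the non-uniformly heated section; this is the behaviour the thesis's simplified rules account for and that a fixed ambient-temperature effective section modulus over-estimates.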
Abstract:
Complex flow datasets are often difficult to represent in detail using traditional vector visualisation techniques such as arrow plots and streamlines. This is particularly true when the flow regime changes in time. Texture-based techniques, which are based on the advection of dense textures, are novel techniques for visualising such flows (i.e., those with complex, time-dependent dynamics). In this paper, we review two popular texture-based techniques and their application to flow datasets sourced from real research projects. The texture-based techniques investigated were Line Integral Convolution (LIC), and Image-Based Flow Visualisation (IBFV). We evaluated these techniques and in this paper report on their visualisation effectiveness (when compared with traditional techniques), their ease of implementation, and their computational overhead.
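A toy version of LIC conveys the underlying idea: each output pixel is an average of a noise texture sampled along the streamline passing through that pixel, so the texture becomes correlated along the flow direction and streaks out the flow structure. The implementation below is a deliberately minimal sketch (fixed-step Euler tracing, nearest-pixel sampling, periodic boundaries), not the paper's implementation.

```python
import math

def lic(u, v, noise, klen, step=1.0):
    """Minimal Line Integral Convolution. u, v: 2D velocity components per
    pixel; noise: 2D input texture; klen: streamline samples per direction.
    Each output pixel averages the noise along the local streamline."""
    h, w = len(noise), len(noise[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total, count = 0.0, 0
            for direction in (1.0, -1.0):          # trace forward and backward
                fx, fy = float(x), float(y)
                for _ in range(klen):
                    total += noise[int(fy) % h][int(fx) % w]
                    count += 1
                    vx = u[int(fy) % h][int(fx) % w]
                    vy = v[int(fy) % h][int(fx) % w]
                    mag = math.hypot(vx, vy) or 1.0
                    fx += direction * step * vx / mag   # Euler streamline step
                    fy += direction * step * vy / mag
            out[y][x] = total / count
    return out
```

For a uniform horizontal field the averaging runs along rows, so with a kernel spanning the (periodic) width every pixel in a row converges to that row's mean - a quick sanity check that the convolution follows the flow.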
Abstract:
Detailed representations of complex flow datasets are often difficult to generate using traditional vector visualisation techniques such as arrow plots and streamlines. This is particularly true when the flow regime changes in time. Texture-based techniques, which are based on the advection of dense textures, are novel techniques for visualising such flows. We review two popular texture-based techniques and their application to flow datasets sourced from active research projects. The techniques investigated were Line Integral Convolution (LIC) [1] and Image-Based Flow Visualisation (IBFV) [18]. We evaluated these and report on their effectiveness from a visualisation perspective. We also report on their ease of implementation and computational overheads.