Abstract:
This ALTC Teaching Fellowship aimed to establish Guiding Principles for Library and Information Science Education 2.0. The aim was achieved by (i) identifying the current and anticipated skills and knowledge required by successful library and information science (LIS) professionals in the age of web 2.0 (and beyond), and (ii) establishing the current state of LIS education in Australia in supporting the development of librarian 2.0 and, in doing so, identifying models of best practice.
The fellowship has contributed to curriculum renewal in the LIS profession. It has helped to ensure that LIS education in Australia continues to meet the changing skills and knowledge requirements of the profession it supports. It has also provided a vehicle through which LIS professionals and LIS educators may find opportunities for greater collaboration and more open communication. This will help bridge the gap between LIS theory and practice and will foster more authentic engagement between LIS education and other parts of the LIS industry in the education of the next generation of professionals. Through this fellowship the LIS discipline has become a role model for other disciplines that will be facing similar issues in the coming years.
Eighty-one members of the Australian LIS profession participated in a series of focus groups exploring the current and anticipated skills and knowledge needed by the LIS professional in the web 2.0 world and beyond. Whilst each focus group tended to draw on specific themes of interest to that particular group of people, there was a great deal of common ground. Eight key themes emerged: technology, learning and education, research or evidence-based practice, communication, collaboration and team work, user focus, business savvy and personal traits.
It was acknowledged that the need for successful LIS professionals to possess transferable skills and interpersonal attributes was not new. It was noted however that the speed with which things are changing in the web 2.0 world was having a significant impact and that this faster pace is placing a new and unexpected emphasis on the transferable skills and knowledge. It was also acknowledged that all librarians need to possess these skills, knowledge and attributes and not just the one or two role models who lead the way.
The most interesting finding however was that web 2.0, library 2.0 and librarian 2.0 represented a ‘watershed’ for the LIS profession. Almost all the focus groups spoke about how they are seeing and experiencing a culture change in the profession. Librarian 2.0 requires a ‘different mindset or attitude’. The Levels of Perspective model by Daniel Kim provides one lens by which to view this finding. The focus group findings suggest that we are witnessing a re-awakening of the Australian LIS profession as it begins to move towards the higher levels of Kim’s model (i.e. mental models, vision).
Thirty-six LIS educators participated in telephone interviews aimed at exploring the current state of LIS education in supporting the development of librarian 2.0. The skills and knowledge of LIS professionals in a web 2.0 world identified and discussed by the LIS educators mirrored those highlighted in the focus group discussions with LIS professionals. Similarly, it was noted that librarian 2.0 needed a focus less on skills and knowledge and more on attitude. However, whilst LIS professionals felt that there was a paradigm shift within the profession, LIS educators did not speak with one voice on this matter, with quite a number of the educators suggesting that this might be ‘overstating it a bit’. This study provides evidence of the “disparate viewpoints” (Hallam, 2007) between LIS educators and LIS professionals that can have significant implications for the future not just of LIS professional education specifically but of the profession generally.
Library and information science education 2.0: guiding principles and models of best practice
Inviting the LIS academics to discuss how their teaching and learning activities support the development of librarian 2.0 was a core part of the interviews conducted. The strategies used and the challenges faced by LIS educators in developing their teaching and learning approaches to support the formation of librarian 2.0 are identified and discussed. A core part of the fellowship was the identification of best practice examples on how LIS educators were developing librarian 2.0. Twelve best practice examples were identified. Each educator was recorded discussing his or her approach to teaching and learning. Videos of these interviews are available via the Fellowship blog at
Abstract:
Transmission smart grids will use a digital platform for the automation of high voltage substations. The IEC 61850 series of standards, released in parts over the last ten years, provide a specification for substation communications networks and systems. These standards, along with IEEE Std 1588-2008 Precision Time Protocol version 2 (PTPv2) for precision timing, are recommended by both the IEC Smart Grid Strategy Group and the NIST Framework and Roadmap for Smart Grid Interoperability Standards for substation automation. IEC 61850, PTPv2 and Ethernet are three complementary protocol families that together define the future of sampled value digital process connections for smart substation automation. A time synchronisation system is required for a sampled value process bus; however, the details are not defined in IEC 61850-9-2. PTPv2 provides the greatest accuracy of network based time transfer systems, with timing errors of less than 100 ns achievable. The suitability of PTPv2 to synchronise sampling in a digital process bus is evaluated, with preliminary results indicating that steady state performance of low cost clocks is an acceptable ±300 ns, but that corrections issued by grandmaster clocks can introduce significant transients. Extremely stable grandmaster oscillators are required to ensure any corrections are sufficiently small that time synchronising performance is not degraded.
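The abstract does not spell out how PTPv2 computes its corrections, but the offset estimate at the heart of the protocol follows the standard Sync/Delay_Req timestamp exchange. The sketch below shows that textbook calculation in Python; the timestamps and the symmetric-path assumption are illustrative, not measurements from the study.

```python
def ptp_offset_and_delay(t1, t2, t3, t4):
    """Offset and mean path delay from one PTPv2 exchange.

    t1: master sends Sync          t2: slave receives Sync
    t3: slave sends Delay_Req      t4: master receives Delay_Req
    All timestamps in nanoseconds; assumes a symmetric network path.
    """
    offset = ((t2 - t1) - (t4 - t3)) / 2           # slave clock error
    mean_path_delay = ((t2 - t1) + (t4 - t3)) / 2  # one-way delay
    return offset, mean_path_delay

# Hypothetical exchange: slave clock 250 ns ahead, 800 ns path delay.
offset, delay = ptp_offset_and_delay(t1=0, t2=1050, t3=2000, t4=2550)
# offset → 250.0 ns, delay → 800.0 ns
```

A slave then steers its clock by the negated offset; the ±300 ns figure above describes how well low cost clocks hold that correction in steady state, while a jump in the grandmaster's own corrections shows up as a transient in the offset term.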
Abstract:
Twitter has become a major instrument for the rapid dissemination and subsequent debate of news stories. It has been instrumental both in drawing attention to events as they unfolded (such as the emergency landing of a plane in New York’s Hudson River in 2009) and in facilitating a sustained discussion of major stories over timeframes measured in weeks and months (including the continuing saga around Wikileaks and Julian Assange), sometimes still keeping stories alive even if mainstream media attention has moved on elsewhere. More comprehensive methodologies for research into news discussion on Twitter – beyond anecdotal or case study approaches – are only now beginning to emerge. This paper presents a large-scale quantitative approach to studying public communication in the Australian Twittersphere, developed as part of a three-year ARC Discovery project that also examines blogs and other social media spaces. The paper will both outline the innovative research tools developed for this work, and present outcomes from an application of these methodologies to recent and present news themes. Our methodology enables us to identify major themes in Twitter’s discussion of these events, trace their development and decline over time, and map the dynamics of the discussion networks formed ad hoc around specific themes (in part with the help of Twitter #hashtags: brief identifiers which mark a tweet as taking part in an established discussion). It is also able to identify links to major news stories and other online resources, and to track their dissemination across the wider Twittersphere.
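The project's actual research tools are not reproduced in the abstract, but the core of hashtag-based theme tracking it describes, identifying themes and tracing their development and decline over time, can be sketched in a few lines. The sample tweets and field layout below are invented for illustration.

```python
import re
from collections import Counter, defaultdict

HASHTAG = re.compile(r"#\w+")

def hashtag_timeseries(tweets):
    """Count hashtag occurrences per day.

    tweets: iterable of (date_string, text) pairs.
    Returns {hashtag: Counter mapping date -> count}.
    Hashtags are lower-cased so '#Wikileaks' and '#wikileaks' merge.
    """
    series = defaultdict(Counter)
    for date, text in tweets:
        for tag in HASHTAG.findall(text.lower()):
            series[tag][date] += 1
    return series

# Invented sample data:
tweets = [
    ("2011-01-10", "Still following #wikileaks closely"),
    ("2011-01-10", "#Wikileaks cables again today"),
    ("2011-01-11", "New #wikileaks story, also #auspol"),
]
series = hashtag_timeseries(tweets)
# series["#wikileaks"]["2011-01-10"] → 2
```

Plotting each hashtag's daily counts gives the rise-and-decline curves the paper traces; the same pass can collect URLs instead of hashtags to track the dissemination of news links.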
Abstract:
Objective: In Australia and comparable countries, case management has become the dominant process by which public mental health services provide outpatient clinical services to people with severe mental illness. There is recognition that caseload size impacts on service provision and that management of caseloads is an important dimension of overall service management. There has been little empirical investigation, however, of caseload and its management. The present study was undertaken in the context of an industrial agreement in Victoria, Australia that required services to introduce standardized approaches to caseload management. The aims of the present study were therefore to (i) investigate caseload size and approaches to caseload management in Victoria's mental health services; and (ii) determine whether caseload size and/or approach to caseload management is associated with work-related stress or case manager self-efficacy among community mental health professionals employed in Victoria's mental health services. Method: A total of 188 case managers responded to an online cross-sectional survey with both purpose-developed items investigating methods of case allocation and caseload monitoring, and standard measures of work-related stress and case manager personal efficacy. Results: The mean caseload size was 20 per full-time case manager. Both work-related stress scores and case manager personal efficacy scores were broadly comparable with those reported in previous studies. Higher caseloads were associated with higher levels of work-related stress and lower levels of case manager personal efficacy. Active monitoring of caseload was associated with lower scores for work-related stress and higher scores for case manager personal efficacy, regardless of size of caseload. Although caseloads were most frequently monitored by the case manager, there was evidence that monitoring by a supervisor was more beneficial than self-monitoring. 
Conclusion: Routine monitoring of caseload, especially by a workplace supervisor, may be effective in reducing work-related stress and enhancing case manager personal efficacy.
Keywords: case management, caseload, stress
Abstract:
Boundary spanning links organisations to one another in order to create mutually beneficial relationships; it is a concept developed and used in organisational theory but rarely used to understand organisational structures in higher education (Pruitt & Schwartz, 1999). Yet understanding boundary spanning activity has the capacity to help universities respond to demands for continuous quality improvement, and to increase capacity to react to environmental uncertainty. At a time of rapid change characterised by a fluctuating economic environment, globalisation, increased mobility, and ecological issues, boundary spanning could be viewed as a key element in assisting institutions in effectively understanding and responding to such change. The literature suggests that effective boundary spanning could help universities improve organisational performance, use of infrastructure and resources, intergroup relations, leadership styles, performance and levels of job satisfaction, technology transfer, knowledge creation, and feedback processes, amongst other things. Our research aims to put a face on boundary spanning (Miller, 2008) by contextualising it within organisational systems and structures in university departments responsible for work related programs i.e. Work Integrated Learning (WIL) and Co-operative Education (Co-op). In this paper these approaches are referred to collectively as work related programs. The authors formed a research team in Victoria, British Columbia in 2009 at a sponsored international research forum, Two Days in June. The purpose of the invitation-only forum was to investigate commonalities and differences across programs and to formulate an international research agenda for work related programs over the next five to ten years. 
Researchers from Queensland University of Technology, University of Cincinnati, Baden-Wuerttemberg Cooperative State University, University of Ottawa, and Dublin City University agreed that further research was needed into the impact stakeholders, organisational systems, structures, policies, and practices have on departments delivering work related programs. This paper illustrates how policy and practice across the five institutions can be better understood through the lens of boundary spanning. It is argued that boundary spanning is an area of theory and practice with great applicability to a better understanding of the activity of these departments. The paper concludes by proposing topics for future research to examine how boundary spanning can be used to better understand practice and change in work related programs.
Abstract:
Real-time networked control systems (NCSs) over data networks are being increasingly implemented on a massive scale in industrial applications. Along with this trend, wireless network technologies have been promoted for modern wireless NCSs (WNCSs). However, popular wireless network standards such as IEEE 802.11/15/16 are not designed for real-time communications. Key issues in real-time applications include limited transmission reliability and poor transmission delay performance. Considering the unique features of real-time control systems, this paper develops a conditional retransmission enabled transport protocol (CRETP) to improve the delay performance of the transmission control protocol (TCP) and also the reliability performance of the user datagram protocol (UDP) and its variants. Key features of the CRETP include a connectionless mechanism with acknowledgement (ACK), conditional retransmission and detection of ineffective data packets on the receiver side.
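The abstract names conditional retransmission as a key CRETP feature without giving the rule itself. One plausible form of the decision, shown here as an assumption rather than the paper's actual algorithm, is to retransmit a lost packet only if the copy could still arrive before the control sample's deadline; otherwise the receiver would discard it as an ineffective (stale) packet anyway.

```python
def should_retransmit(now, sample_time, deadline, rtt_estimate):
    """Decide whether retransmitting a lost control packet is useful.

    now:          current sender time (ms)
    sample_time:  time the control sample was generated (ms)
    deadline:     validity window of the sample after generation (ms)
    rtt_estimate: current round-trip time estimate (ms)
    The one-way delay is approximated as half the RTT.
    """
    expected_arrival = now + rtt_estimate / 2
    return expected_arrival < sample_time + deadline

# Sample generated at t=100 ms with a 20 ms deadline, RTT ~8 ms:
assert should_retransmit(now=110, sample_time=100, deadline=20, rtt_estimate=8)
# At t=118 ms the copy would arrive at ~122 ms, past the 120 ms deadline:
assert not should_retransmit(now=118, sample_time=100, deadline=20, rtt_estimate=8)
```

This is where the protocol sits between TCP and UDP: unlike UDP it retransmits when doing so can still help, and unlike TCP it gives up when retransmission can only add delay.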
Abstract:
Popular wireless networks, such as IEEE 802.11/15/16, are not designed for real-time applications. Thus, supporting real-time quality of service (QoS) in wireless real-time control is challenging. This paper adopts the widely used IEEE 802.11, with the focus on its distributed coordination function (DCF), for soft-real-time control systems. The concept of the critical real-time traffic condition is introduced to characterize the marginal satisfaction of real-time requirements. Then, mathematical models are developed to describe the dynamics of DCF based real-time control networks with periodic traffic, a unique feature of control systems. Performance indices such as throughput and packet delay are evaluated using the developed models, particularly under the critical real-time traffic condition. Finally, the proposed modelling is applied to traffic rate control for cross-layer networked control system design.
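The mathematical models themselves are not given in the abstract. As a deliberately crude illustration of the critical real-time traffic condition, ignore DCF contention and collisions entirely: with n nodes each sending one packet per sampling period, the shared channel is marginally loaded when the period just covers n packet service times. The numbers below are invented.

```python
def critical_period(n_nodes, service_time_ms):
    """Smallest sampling period (ms) at which the channel can just
    carry one packet per node per period: the marginal, or
    'critical', real-time traffic condition. This simplification
    ignores DCF backoff, collisions and retries, all of which push
    the real critical period higher."""
    return n_nodes * service_time_ms

def meets_deadline(n_nodes, service_time_ms, period_ms):
    """True if the configured period sits on the safe side of the
    critical condition."""
    return period_ms >= critical_period(n_nodes, service_time_ms)

# Invented figures: 10 sensor nodes, ~2 ms per packet on air.
assert meets_deadline(10, 2.0, period_ms=25.0)       # below saturation
assert not meets_deadline(10, 2.0, period_ms=15.0)   # past the critical point
```

The paper's traffic rate control can be read as the cross-layer version of this check: adjusting the periodic rate so the network stays on the feasible side of the critical condition.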
Abstract:
Expected satiety has been shown to play a key role in decisions around meal size. Recently it has become clear that these expectations can also influence the satiety that is experienced after a food has been consumed. As such, increasing the expected and actual satiety a food product confers without increasing its caloric content is of importance. In this study we sought to determine whether this could be achieved via product labelling. Female participants (N=75) were given a 223-kcal yoghurt smoothie for lunch. In separate conditions the smoothie was labelled as a diet brand, a highly-satiating brand, or an ‘own brand’ control. Expected satiety was assessed using rating scales and a computer-based ‘method of adjustment’, both prior to consuming the smoothie and 24 hours later. Hunger and fullness were assessed at baseline, immediately after consuming the smoothie, and for a further three hours. Despite the fact that all participants consumed the same food, the smoothie branded as highly-satiating was consistently expected to deliver more satiety than the other ‘brands’; this difference was sustained 24 hours after consumption. Furthermore, post-consumption and over three hours, participants consuming this smoothie reported significantly less hunger and significantly greater fullness. These findings demonstrate that the satiety that a product confers depends in part on information that is present around the time of consumption. We suspect that this process is mediated by changes to expected satiety. These effects may potentially be utilised in the development of successful weight-management products.
Abstract:
Site-specific performance provides choices in audience experience via degrees of scale, proximity, levels of immersion and viewing perspectives. Beyond these choices, multi-site promenade events also form a connected audience/performer relationship in which moving together in time and space can produce a shared narrative and aesthetic sensibility of collective, yet individuated and shifting meanings. This paper interrogates this notion through audience/performer experiences in two separate multi-site, dance-led events. here/there/then/now occurred in four intimate sites within the Brisbane Powerhouse, providing a theatricalised platform for audiences to create linked narratives through open-ended and fragmented intertextuality. Accented Body, based on the concept of “the body as site and in site” and notions of connectivity, provided a more expansive platform for a similar, but heightened, shared engagement. Audiences traversed 6 outdoor and 2 indoor Brisbane sites moving to varying levels of a large complex. Eleven, predominantly interactive, screens provided links to other sites as well as to distributed presences in Seoul and London. The differentiation in scale and travel time between sites deepened the immersive experiences of audiences who reported transformative engagements with both site and architecture, accompanied by a sense of extended and yet quickened time.
Abstract:
This chapter provides an analysis of feedback from key stakeholders, collected as part of a research project, on the problems and tensions evident in the collective work practices of learning advisers employed in learning assistance services at an Australian metropolitan university (Peach, 2003). The term 'learning assistance' is used in the Australian higher education sector generally to refer to student support services that include assistance with academic writing and other study skills. The aim of the study was to help learning advisers and other key stakeholders develop a better understanding of the work activity with a view to using this understanding to generate improvements in service provision. Over twenty problems and associated tensions were identified through stakeholder feedback; however, the focus of this chapter is the analysis of tensions related to a cluster of problems referred to as cost-efficiency versus quality service. Theoretical modelling derived from the tools made available through cultural historical activity theory and expansive visibilisation (Engestrom and Miettinen, 1999) and excerpts from data are used to illustrate how different understandings of the purpose of learning assistance services impact on the work practices of learning advisers and create problems and tensions in relation to the type of service available (including use of technology), the level of service available, and learning adviser workload.
Abstract:
This article augments Resource Dependence Theory with Real Options reasoning in order to explain time bounds specification in strategic alliances. Whereas prior work has found about a 50/50 split between alliances that are time bound and those that are open-ended, their substantive differences and antecedents are ill understood. To address this, we suggest that the two alliance modes present different real options trade-offs in adaptation to environmental uncertainty: ceteris paribus, time-bound alliances are likely to provide abandonment options over open-ended alliances, but require additional investments to extend the alliance when this turns out to be desirable after formation. Open-ended alliances are likely to provide growth options over time-bound alliances, but they demand additional effort to abandon the alliance if post-formation circumstances so desire. Therefore, we expect time bounds specification to be a function of environmental uncertainty: organizations in more uncertain environments will be relatively more likely to place time bounds on their strategic alliances. Longitudinal archival and survey data collected amongst 39 industry clusters provide empirical support for our claims, which contribute to the recent renaissance of resource dependence theory by specifying the conditions under which organizations choose different time windows in strategic partnering.
Abstract:
The research objectives of this thesis were to contribute to Bayesian statistical methodology by contributing to risk assessment statistical methodology, and to spatial and spatio-temporal methodology, by modelling error structures using complex hierarchical models. Specifically, I hoped to consider two applied areas, and use these applications as a springboard for developing new statistical methods as well as undertaking analyses which might give answers to particular applied questions. Thus, this thesis considers a series of models, firstly in the context of risk assessments for recycled water, and secondly in the context of water usage by crops. The research objective was to model error structures using hierarchical models in two problems, namely risk assessment analyses for wastewater, and secondly, in a four dimensional dataset, assessing differences between cropping systems over time and over three spatial dimensions. The aim was to use the simplicity and insight afforded by Bayesian networks to develop appropriate models for risk scenarios, and again to use Bayesian hierarchical models to explore the necessarily complex modelling of four dimensional agricultural data. The specific objectives of the research were to develop a method for the calculation of credible intervals for the point estimates of Bayesian networks; to develop a model structure to incorporate all the experimental uncertainty associated with various constants thereby allowing the calculation of more credible credible intervals for a risk assessment; to model a single day’s data from the agricultural dataset which satisfactorily captured the complexities of the data; to build a model for several days’ data, in order to consider how the full data might be modelled; and finally to build a model for the full four dimensional dataset and to consider the time-varying nature of the contrast of interest, having satisfactorily accounted for possible spatial and temporal autocorrelations.
This work forms five papers, two of which have been published, with two submitted, and the final paper still in draft. The first two objectives were met by recasting the risk assessments as directed, acyclic graphs (DAGs). In the first case, we elicited uncertainty for the conditional probabilities needed by the Bayesian net, incorporated these into a corresponding DAG, and used Markov chain Monte Carlo (MCMC) to find credible intervals, for all the scenarios and outcomes of interest. In the second case, we incorporated the experimental data underlying the risk assessment constants into the DAG, and also treated some of that data as needing to be modelled as an ‘errors-in-variables’ problem [Fuller, 1987]. This illustrated a simple method for the incorporation of experimental error into risk assessments. In considering one day of the three-dimensional agricultural data, it became clear that geostatistical models or conditional autoregressive (CAR) models over the three dimensions were not the best way to approach the data. Instead CAR models are used with neighbours only in the same depth layer. This gave flexibility to the model, allowing both the spatially structured and non-structured variances to differ at all depths. We call this model the CAR layered model. Given the experimental design, the fixed part of the model could have been modelled as a set of means by treatment and by depth, but doing so allows little insight into how the treatment effects vary with depth. Hence, a number of essentially non-parametric approaches were taken to see the effects of depth on treatment, with the model of choice incorporating an errors-in-variables approach for depth in addition to a non-parametric smooth. The statistical contribution here was the introduction of the CAR layered model, the applied contribution the analysis of moisture over depth and estimation of the contrast of interest together with its credible intervals.
These models were fitted using WinBUGS [Lunn et al., 2000]. The work in the fifth paper deals with the fact that with large datasets, the use of WinBUGS becomes more problematic because of its highly correlated term-by-term updating. In this work, we introduce a Gibbs sampler with block updating for the CAR layered model. The Gibbs sampler was implemented by Chris Strickland using pyMCMC [Strickland, 2010]. This framework is then used to consider five days' data, and we show that moisture in the soil for all the various treatments reaches levels particular to each treatment at a depth of 200 cm and thereafter stays constant, albeit with increasing variances with depth. In an analysis across three spatial dimensions and across time, there are many interactions of time and the spatial dimensions to be considered. Hence, we chose to use a daily model and to repeat the analysis at all time points, effectively creating an interaction model of time by the daily model. Such an approach allows great flexibility. However, this approach does not allow insight into the way in which the parameter of interest varies over time. Hence, a two-stage approach was also used, with estimates from the first stage being analysed as a set of time series. We see this spatio-temporal interaction model as being a useful approach to data measured across three spatial dimensions and time, since it does not assume additivity of the random spatial or temporal effects.
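Once MCMC draws are available, whether from WinBUGS or a block-updating Gibbs sampler, an equal-tailed credible interval for a point estimate reduces to trimming the tails of the sorted posterior sample. The sketch below is generic, not code from the thesis, and the uniform "posterior" is only a stand-in for real MCMC output.

```python
def credible_interval(samples, level=0.95):
    """Equal-tailed credible interval from MCMC posterior draws.

    Sorts the draws and trims an equal share of probability mass
    from each tail; with level=0.95 this returns approximately the
    2.5th and 97.5th posterior percentiles.
    """
    ordered = sorted(samples)
    n = len(ordered)
    tail = int(((1 - level) / 2) * n)  # draws to drop from each tail
    return ordered[tail], ordered[n - 1 - tail]

# Illustrative stand-in posterior: 1000 evenly spaced draws on [0, 1).
draws = [i / 1000 for i in range(1000)]
low, high = credible_interval(draws, level=0.95)
# low → 0.025, high → 0.974
```

The same reduction applies per node of the DAG in the risk assessment work: once the net is sampled by MCMC, each point estimate's credible interval comes straight from its chain of draws.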
Abstract:
This paper presents a comprehensive study to find the most efficient bitrate requirement to deliver mobile video that optimizes bandwidth, while at the same time maintains good user viewing experience. In the study, forty participants were asked to choose the lowest quality video that would still provide for a comfortable and long-term viewing experience, knowing that higher video quality is more expensive and bandwidth intensive. This paper proposes the lowest pleasing bitrates and corresponding encoding parameters for five different content types: cartoon, movie, music, news and sports. It also explores how the lowest pleasing quality is influenced by content type, image resolution and bitrate, and by the user's gender, prior viewing experience, and preferences. In addition, it analyzes the trajectory of users’ progression while selecting the lowest pleasing quality. The findings reveal that the lowest bitrate requirement for a pleasing viewing experience is much higher than that of the lowest acceptable quality. Users’ criteria for the lowest pleasing video quality are related to the video’s content features, as well as its usage purpose and the user’s personal preferences. These findings can provide video providers guidance on what quality they should offer to please mobile users.
Abstract:
Statement: Jams, Jelly Beans and the Fruits of Passion Let us search, instead, for an epistemology of practice implicit in the artistic, intuitive processes which some practitioners do bring to situations of uncertainty, instability, uniqueness, and value conflict. (Schön 1983, p40) Game On was born out of the idea of creative community; finding, networking, supporting and inspiring the people behind the face of an industry, those in the midst of the machine and those intending to join. We understood this moment to be a pivotal opportunity to nurture a new emerging form of game making, in an era of change, where the old industry models were proving to be unsustainable. As soon as we started putting people into a room under pressure, to make something in 48hrs, a whole pile of evolutionary creative responses emerged. People refashioned their craft in a moment of intense creativity that demanded different ways of working, an adaptive approach to the craft of making games – small – fast – indie. An event like the 48hrs forces participants’ attention onto the process as much as the outcome. As one game industry professional taking part in a challenge for the first time observed: there are three paths in the genesis from idea to finished work: the path that focuses on mechanics; the path that focuses on team structure and roles, and the path that focuses on the idea, the spirit – and the more successful teams put the spirit of the work first and foremost. The spirit drives the adaptation, it becomes improvisation. As Schön says: “Improvisation consists in varying, combining and recombining a set of figures within the schema which bounds and gives coherence to the performance.” (1983, p55). This improvisational approach is all about those making the games: the people and the principles of their creative process.
This documentation evidences the intensity of their passion, determination and the shit that they are prepared to put themselves through to achieve their goal – to win a cup full of jellybeans and make a working game in 48hrs. 48hr is a project where, on all levels, analogue meets digital. This concept was further explored through the documentation process. All of these pictures were taken with a 1945 Leica III camera. The use of this classic, film-based camera gives the images a granularity and depth; this older, slower technology exposes the very human moments of digital creativity.
Schön, D. A. 1983, The Reflective Practitioner: How Professionals Think in Action, Basic Books, New York
Abstract:
Beryl & Gael discuss the ‘new’ metalanguage for knowledge about language presented in the Australian Curriculum English (ACARA, 2010). Their discussion connects to practice by recounting how one teacher scaffolds her students through detailed understandings of noun and adjective groups in reading activities. The stimulus text is the novel ‘A wrinkle in time’ (L’Engle, 1962, reproduced 2007) and the purpose is to build students’ understandings so they can work towards ‘expressing and developing ideas’ in written text (ACARA, 2010).