996 results for managerial approaches


Relevance:

20.00%

Publisher:

Abstract:

Occupational driving crashes are the most common cause of death and injury in the workplace. The physical and psychological outcomes following injury are also very costly to organizations. Thus, safe driving poses a managerial challenge. Some research has attempted to address this issue by modifying discrete and often simple target behaviors (e.g., driver training programs). However, current intervention approaches in the occupational driving field generally do not consider the role of organizational factors in workplace safety. This study adopts the A-B-C framework to identify the contingencies associated with an effective exchange of safety information within the occupational driving context. Utilizing a sample of occupational drivers and their supervisors, this multi-level study examines these contingencies within the supervisor-driver relationship. Safety values are identified as an antecedent of the safety information exchange, and the quality of the leader-member exchange relationship and safe driving performance are identified as the behavioral consequences. We also examine the function of role overload as a factor influencing the relationship between safety values and the safety information exchange. Hierarchical Linear Modelling showed that role overload moderated the relationship between supervisors’ perceptions of the value given to safety and the safety information exchange. A significant relationship was also found between the safety information exchange and the subsequent quality of the leader-member exchange relationship. Finally, the quality of the leader-member exchange relationship was found to be significantly associated with safe driving performance. Theoretical and practical implications of these results are discussed.
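The moderation effect this abstract describes can be illustrated with a simplified, single-level sketch. The study itself used Hierarchical Linear Modelling across supervisor-driver dyads; the toy example below instead uses ordinary least squares with an interaction term on synthetic data, and every variable name and coefficient is an illustrative assumption, not a result from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200

# Synthetic data: supervisors' perceived safety value, role overload,
# and the resulting safety information exchange (all hypothetical).
safety_value = rng.normal(0, 1, n)
role_overload = rng.normal(0, 1, n)
# Role overload weakens the safety_value -> exchange relationship:
# a negative interaction coefficient encodes the moderation effect.
exchange = (0.5 * safety_value - 0.2 * role_overload
            - 0.3 * safety_value * role_overload
            + rng.normal(0, 0.5, n))

# OLS with an interaction term: a non-zero interaction coefficient
# is the standard regression test for moderation.
X = np.column_stack([np.ones(n), safety_value, role_overload,
                     safety_value * role_overload])
beta, *_ = np.linalg.lstsq(X, exchange, rcond=None)
print(beta)  # interaction estimate beta[3] recovers roughly -0.3
```

In a true multi-level analysis the intercepts and slopes would additionally be allowed to vary by supervisor, but the interaction-term logic is the same.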

Relevance:

20.00%

Publisher:

Abstract:

In the current economy, knowledge has been recognised as a valuable organisational asset, a crucial factor that helps organisations succeed in highly competitive environments. Many organisations have begun projects and special initiatives aimed at fostering better knowledge sharing amongst their employees. Not surprisingly, information technology (IT) has been a central element of many of these projects and initiatives, as the potential of emerging information technologies such as Web 2.0 for enabling the process of managing organisational knowledge is recognised. This technology can be used as a collaborative system for knowledge management (KM) within enterprises. Enterprise 2.0 is the application of Web 2.0 in an organisational context. Enterprise 2.0 technologies are web-based social software that facilitate collaboration, communication and bidirectional information flow: an essential aspect of organisational knowledge management. This chapter explains how Enterprise 2.0 technologies (Web 2.0 technologies within organisations) can support knowledge management. The chapter also explores how such technologies support the codifying (technology-centred) and social network (people-centred) approaches to KM, towards bridging the current gap between these two approaches.

Relevance:

20.00%

Publisher:

Abstract:

For millennia humans have sought, organized, and used information as they learned, evolving patterns of information behavior to resolve problems and survive. However, despite the current focus on living in an "information age," we have a limited evolutionary understanding of human information behavior. In this article the authors examine the three current interdisciplinary approaches to conceptualizing how humans seek information: (a) the everyday life information seeking-sense-making approach, (b) the information foraging approach, and (c) the problem-solution perspective on information seeking. In addition, because the role of information use in information behavior remains unclear, a fourth approach, based on a theory of information use, is provided. The proposed use theory starts from the evolutionary psychology notion that humans are able to adapt to their environment and survive because of their modular cognitive architecture. Finally, the authors begin to conceptualize these diverse approaches, and their various elements, within an integrated model that incorporates information use. An initial integrated model of these approaches with information use is proposed.

Relevance:

20.00%

Publisher:

Abstract:

Unstructured text data, such as emails, blogs, contracts, academic publications, organizational documents, transcribed interviews, and even tweets, are important sources of data in Information Systems research. Various forms of qualitative analysis of the content of these data exist and have revealed important insights. Yet, to date, these analyses have been hampered by limitations of human coding of large data sets, and by bias due to human interpretation. In this paper, we compare and combine two quantitative analysis techniques to demonstrate the capabilities of computational analysis for content analysis of unstructured text. Specifically, we seek to demonstrate how two quantitative analytic methods, viz., Latent Semantic Analysis and data mining, can aid researchers in revealing core content topic areas in large (or small) data sets, and in visualizing how these concepts evolve, migrate, converge or diverge over time. We exemplify the complementary application of these techniques through an examination of a 25-year sample of abstracts from selected journals in Information Systems, Management, and Accounting disciplines. Through this work, we explore the capabilities of two computational techniques, and show how these techniques can be used to gather insights from a large corpus of unstructured text.
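The Latent Semantic Analysis step mentioned above can be sketched in a few lines: build a term-document matrix, take a truncated SVD, and compare documents in the resulting low-rank semantic space. This is a minimal illustration on a made-up four-document corpus (the paper's actual corpus was 25 years of journal abstracts); the tokenisation and rank choices here are assumptions.

```python
import numpy as np

# Toy corpus with two latent topics (information systems vs. accounting).
docs = [
    "information systems research methods",
    "systems research information technology",
    "accounting audit financial reporting",
    "financial accounting audit standards",
]

# Build a simple term-document count matrix (rows: terms, cols: docs).
vocab = sorted({w for d in docs for w in d.split()})
A = np.array([[d.split().count(w) for d in docs] for w in vocab], float)

# LSA: a truncated SVD projects documents into a k-dimensional
# semantic space, smoothing over exact word overlap.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
doc_vecs = (np.diag(s[:k]) @ Vt[:k]).T  # one k-dim vector per document

def cos(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Documents on the same topic end up close in the latent space.
print(cos(doc_vecs[0], doc_vecs[1]), cos(doc_vecs[0], doc_vecs[2]))
```

Tracking how these document vectors cluster within successive time windows is one simple way to visualize topics converging or diverging over time, as the abstract describes.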

Relevance:

20.00%

Publisher:

Abstract:

Virtual environments can provide, through digital games and online social interfaces, extremely exciting forms of interactive entertainment. Because of their ability to display and manipulate information in natural and intuitive ways, such environments have found extensive applications in decision support, education and training in the health and science domains, amongst others. Currently, the burden of validating both the interactive functionality and the visual consistency of virtual environment content is carried entirely by developers and play-testers. While considerable research has been conducted into assisting the design of virtual world content and mechanics, to date only limited contributions have been made regarding the automatic testing of the underpinning graphics software and hardware. The aim of this thesis is to determine whether the correctness of the images generated by a virtual environment can be quantitatively defined, and automatically measured, in order to facilitate the validation of the content. In an attempt to provide an environment-independent definition of visual consistency, a number of classification approaches were developed. First, a novel model-based object description was proposed to enable reasoning about the color and geometry change of virtual entities during a play-session. From such an analysis, two view-based connectionist approaches were developed to map from geometry and color spaces to a single, environment-independent, geometric transformation space; such a mapping was used to predict the correct visualization of the scene. Finally, an appearance-based aliasing detector was developed to show how incorrectness, too, can be quantified for debugging purposes. Since computer games rely heavily on highly complex and interactive virtual worlds, they provide an excellent test bed against which to develop, calibrate and validate our techniques.
Experiments were conducted on a game engine and other virtual world prototypes to determine the applicability and effectiveness of our algorithms. The results show that quantifying visual correctness in virtual scenes is a feasible enterprise, and that effective automatic bug detection can be performed through the techniques we have developed. We expect these techniques to find application in large 3D game and virtual world studios that require a scalable solution to testing their virtual world software and digital content.
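The simplest way to quantify visual (in)consistency, and a common baseline for the automated testing the thesis argues for, is a per-pixel tolerance comparison between a rendered frame and a known-good reference. The sketch below is that generic baseline only, not the thesis's model-based or connectionist approach; images are represented as nested lists of (r, g, b) tuples purely for self-containment.

```python
def mismatch_ratio(reference, rendered, tol=8):
    """Fraction of pixels whose largest channel difference exceeds tol."""
    bad = total = 0
    for row_ref, row_out in zip(reference, rendered):
        for p_ref, p_out in zip(row_ref, row_out):
            total += 1
            if max(abs(a - b) for a, b in zip(p_ref, p_out)) > tol:
                bad += 1
    return bad / total

# 2x2 test frames: one pixel is clearly wrong, another differs only
# within tolerance (e.g. acceptable driver-level rounding).
ref = [[(10, 10, 10), (200, 200, 200)],
       [(10, 10, 10), (200, 200, 200)]]
out = [[(10, 10, 12), (90, 200, 200)],   # second pixel: a rendering bug
       [(10, 10, 10), (200, 200, 200)]]

print(mismatch_ratio(ref, out))  # 0.25: one of four pixels fails
```

A regression test would flag the build whenever the ratio exceeds a chosen threshold; the thesis's contribution is precisely in replacing such brittle pixel-exact references with environment-independent measures.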

Relevance:

20.00%

Publisher:

Abstract:

The International Classification of Diseases (ICD) is used to categorise diseases, injuries and external causes, and is a key epidemiological tool enabling the storage and retrieval of data from health and vital records to produce core international mortality and morbidity statistics. The ICD is updated periodically to ensure the classification remains current, and work is now underway to develop the next revision, ICD-11. Almost 20 years have passed since the last ICD edition was published, and over 60 years since the last substantial structural revision of the external causes chapter. Revision of such a critical tool requires transparency and documentation to ensure that changes made to the classification system are recorded comprehensively for future reference. In this paper, the authors provide a history of external causes classification development and outline the external cause structure. Approaches to managing ICD-10 deficiencies are discussed, and the ICD-11 revision approach regarding the development of, rationale for and implications of proposed changes to the chapter is outlined. Through improved capture of external cause concepts in ICD-11, a stronger evidence base will be available to inform injury prevention, treatment, rehabilitation and policy initiatives, ultimately contributing to a reduction in injury morbidity and mortality.

Relevance:

20.00%

Publisher:

Abstract:

Climate change is the archetypal environmental problem, with short-term economic self-interest operating to the detriment of the long-term sustainability of our society. The scientific reports of the Intergovernmental Panel on Climate Change strongly assert that the stabilisation of emissions in the atmosphere, to avoid the adverse impacts of climate change, requires significant and rapid reductions in ‘business as usual’ global greenhouse gas emissions. The sheer magnitude of emissions reductions required, within this urgent timeframe, will necessitate an unprecedented level of international, multi-national and intra-national cooperation and will challenge conventional approaches to the creation and implementation of international and domestic legal regimes. To meet this challenge, existing international, national and local legal systems must harmoniously implement a strong international climate change regime through a portfolio of traditional and innovative legal mechanisms that swiftly transform current behavioural practices in emitting greenhouse gases. These include the imposition of strict duties to reduce emissions through the establishment of strong command and control regulation (the regulatory approach); mechanisms for the creation and distribution of liabilities for greenhouse gas emissions and climate-related harm (the liability approach); and the use of innovative regulatory tools in the form of the carbon trading scheme (the market approach). The legal relations between these various regulatory, liability and market approaches must be managed to achieve a consistent, compatible and optimally effective legal regime to respond to the threat of climate change.
The purpose of this thesis is to analyse and evaluate the emerging legal rules and frameworks, both international and Australian, required for the effective regulation of greenhouse gas emissions to address climate change in the context of the urgent and deep emissions reductions required to minimise the adverse impacts of climate change. In doing so, this thesis will examine critically the existing and potential role of law in effectively responding to climate change and will provide recommendations on the necessary reforms to achieve a more effective legal response to this global phenomenon in the future.

Relevance:

20.00%

Publisher:

Abstract:

The dynamic interplay between the elements of existing learning frameworks (people, pedagogy, learning spaces and technology) is challenging the traditional lecture. A paradigm is emerging from the correlated change amongst these elements, offering new possibilities for improving the quality of the learning experience. For many universities, the design of physical learning spaces has been the focal point for blending technology and flexible learning spaces to promote learning and teaching. As the pace of technological change intensifies, affording new opportunities for engaging learners, pedagogical practice in higher education is not evolving at a comparable pace. The resulting disparity is an opportunity to reconsider pedagogical practice for increased student engagement in physical learning spaces through active learning. This interplay between students, staff and technology is challenging the value for students of attending physical learning spaces such as the traditional lecture. Why should students attend classes devoted to content delivery when streaming and web technologies afford more flexible learning opportunities? Should we still lecture? Reconsideration of pedagogy is driving learning design at Queensland University of Technology, seeking new approaches that afford increased student engagement via active learning experiences within large lectures. This paper provides an overview and an evaluation of one of these initiatives, Open Web Lecture (OWL), an experimental web-based student response application developed by Queensland University of Technology. OWL seamlessly integrates a virtual learning environment within physical learning spaces, fostering active learning opportunities. This paper evaluates the pilot of this initiative by considering its effectiveness in increasing student engagement through the affordance of web-enabled active learning opportunities in physical learning spaces.


Relevance:

20.00%

Publisher:

Abstract:

In the era of Web 2.0, huge volumes of consumer reviews are posted to the Internet every day. Manual approaches to detecting and analyzing fake reviews (i.e., spam) are not practical due to the problem of information overload. However, the design and development of automated methods of detecting fake reviews is a challenging research problem. The main reason is that fake reviews are specifically composed to mislead readers, so they may appear the same as legitimate reviews (i.e., ham). As a result, discriminatory features that would enable individual reviews to be classified as spam or ham may not be available. Guided by the design science research methodology, the main contribution of this study is the design and instantiation of novel computational models for detecting fake reviews. In particular, a novel text mining model is developed and integrated into a semantic language model for the detection of untruthful reviews. The models are then evaluated based on a real-world dataset collected from amazon.com. The results of our experiments confirm that the proposed models outperform other well-known baseline models in detecting fake reviews. To the best of our knowledge, the work discussed in this article represents the first successful attempt to apply text mining methods and semantic language models to the detection of fake consumer reviews. A managerial implication of our research is that firms can apply our design artifacts to monitor online consumer reviews to develop effective marketing or product design strategies based on genuine consumer feedback posted to the Internet.
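For context on what an automated detector of this kind looks like at its simplest, the sketch below trains a Naive Bayes classifier over word counts to separate spam from ham reviews. This is a generic baseline of the sort the article's models are evaluated against, not the article's text mining and semantic language models, and the four training reviews are entirely made up.

```python
import math
from collections import Counter

# Tiny hypothetical training set of labelled reviews.
train = [
    ("amazing product buy now best ever", "spam"),
    ("best deal ever amazing amazing buy", "spam"),
    ("battery lasted two days screen scratched", "ham"),
    ("solid build but the battery drains fast", "ham"),
]

counts = {"spam": Counter(), "ham": Counter()}
docs = Counter()
for text, label in train:
    counts[label].update(text.split())
    docs[label] += 1

vocab = set(counts["spam"]) | set(counts["ham"])

def score(text, label):
    # Log prior + log likelihood with add-one (Laplace) smoothing,
    # so unseen words do not zero out the probability.
    total = sum(counts[label].values())
    s = math.log(docs[label] / sum(docs.values()))
    for w in text.split():
        s += math.log((counts[label][w] + 1) / (total + len(vocab)))
    return s

def classify(text):
    return max(("spam", "ham"), key=lambda lbl: score(text, lbl))

print(classify("amazing best product buy"))          # -> spam
print(classify("battery drains and screen issues"))  # -> ham
```

As the abstract notes, the hard cases are fake reviews written to mimic legitimate ones, where surface word counts like these stop being discriminative; that is the gap the article's semantic language model targets.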

Relevance:

20.00%

Publisher:

Abstract:

The literature supporting the notion that active, student-centered learning is superior to passive, teacher-centered instruction is encyclopedic (Bonwell & Eison, 1991; Bruning, Schraw, & Ronning, 1999; Haile, 1997a, 1997b, 1998; Johnson, Johnson, & Smith, 1999). Previous action research demonstrated that introducing a learning activity in class improved the learning outcomes of students (Mejias, 2010). People acquire knowledge and skills through practice and reflection, not by watching and listening to others telling them how to do something. In this context, this project aims to provide further insight into the level of interactivity a class's curriculum should have, and its alignment with assessment, so that the intended learning outcomes (ILOs) are achieved. In this project, interactivity is implemented in the form of problem-based learning (PBL). I present the argument that more continuous formative feedback, when implemented with the correct amount of PBL, stimulates student engagement, bringing enormous benefits to student learning. Different levels of practical work (PBL) were implemented together with two different assessment approaches in two subjects. The outcomes were measured using qualitative and quantitative data to evaluate the levels of student engagement and satisfaction in terms of the ILOs.

Relevance:

20.00%

Publisher:

Abstract:

The content and approach of study skills courses are critiqued and alternatives are suggested. It is proposed that an approach providing students with knowledge about the cognitive processes involved in mastering complex material would make the study skills teacher an agent of social change aiming for the enlightenment and emancipation of students and lecturers.

Relevance:

20.00%

Publisher:

Abstract:

Individual-based models describing the migration and proliferation of a population of cells frequently restrict the cells to a predefined lattice. An implicit assumption of this type of lattice based model is that a proliferative population will always eventually fill the lattice. Here we develop a new lattice-free individual-based model that incorporates cell-to-cell crowding effects. We also derive approximate mean-field descriptions for the lattice-free model in two special cases motivated by commonly used experimental setups. Lattice-free simulation results are compared to these mean-field descriptions and to a corresponding lattice-based model. Data from a proliferation experiment is used to estimate the parameters for the new model, including the cell proliferation rate, showing that the model fits the data well. An important aspect of the lattice-free model is that the confluent cell density is not predefined, as with lattice-based models, but an emergent model property. As a consequence of the more realistic, irregular configuration of cells in the lattice-free model, the population growth rate is much slower at high cell densities and the population cannot reach the same confluent density as an equivalent lattice-based model.
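The core mechanism of the lattice-free model, proliferation attempts that fail when the daughter cell would overlap a neighbour, can be sketched in a few lines. This is a minimal illustration of the crowding idea only: the parameter values below are arbitrary assumptions, not the rates estimated from the proliferation experiment, and the paper's mean-field derivations are not reproduced here.

```python
import math
import random

random.seed(0)

RADIUS = 0.05      # effective cell radius (crowding range); illustrative
P_DIVIDE = 0.5     # per-cell division attempt probability per step

def crowded(pos, others):
    """True if pos overlaps any cell in others (centres < 2*RADIUS apart)."""
    return any(math.dist(pos, c) < 2 * RADIUS for c in others)

def step(cells):
    for parent in list(cells):          # iterate over a snapshot
        if random.random() < P_DIVIDE:
            # Place the daughter at contact distance in a random direction.
            angle = random.uniform(0, 2 * math.pi)
            daughter = (parent[0] + 2 * RADIUS * math.cos(angle),
                        parent[1] + 2 * RADIUS * math.sin(angle))
            in_bounds = all(0 <= x <= 1 for x in daughter)
            # The parent sits exactly at contact distance, so exclude it
            # from the overlap test.
            others = [c for c in cells if c is not parent]
            if in_bounds and not crowded(daughter, others):
                cells.append(daughter)  # attempt succeeds only if uncrowded
    return cells

cells = [(0.5, 0.5)]
for _ in range(60):
    step(cells)

# Unlike a lattice-based model, the confluent density emerges from the
# irregular packing rather than being predefined by the lattice.
print(len(cells))
```

Because failed attempts become more frequent as the colony packs irregularly, growth slows at high density and the population saturates below the regular-lattice capacity, which is the qualitative behaviour the abstract reports.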