170 results for "Analysis of teaching process"
Abstract:
- Encompasses the whole BPM lifecycle, including process identification, modelling, analysis, redesign, automation and monitoring
- Class-tested textbook complemented with additional teaching material on the accompanying website
- Covers the relevant conceptual background, industrial standards and actionable skills

Business Process Management (BPM) is the art and science of how work should be performed in an organization in order to ensure consistent outputs and to take advantage of improvement opportunities, e.g. reducing costs, execution times or error rates. Importantly, BPM is not about improving the way individual activities are performed, but rather about managing entire chains of events, activities and decisions that ultimately produce added value for an organization and its customers. This textbook encompasses the entire BPM lifecycle, from process identification to process monitoring, covering along the way process modelling, analysis, redesign and automation. Concepts, methods and tools from business management, computer science and industrial engineering are blended into one comprehensive and interdisciplinary approach. The presentation is illustrated using the BPMN industry standard, defined by the Object Management Group and widely endorsed by practitioners and vendors worldwide. In addition to explaining the relevant conceptual background, the book provides dozens of examples, more than 100 hands-on exercises – many with solutions – as well as numerous suggestions for further reading. The textbook is the result of the authors' many years of combined teaching experience, at both the undergraduate and graduate levels as well as in the context of professional training. Students and professionals from both business management and computer science will benefit from the step-by-step style of the textbook and its focus on fundamental concepts and proven methods. Lecturers will appreciate the class-tested format and the additional teaching material available on the accompanying website, fundamentals-of-bpm.org.
Abstract:
This paper offers an analysis of the character animation in Tangled to develop a deeper understanding of how Disney has approached the extension of their traditional aesthetic into the CG medium.
Abstract:
Business process analysis and process mining, particularly within the health care domain, remain under-utilised. Applied research that employs such techniques on routinely collected health care data enables stakeholders to empirically investigate care as it is delivered by different health providers. However, cross-organisational mining and the comparative analysis of processes present a set of unique challenges in terms of ensuring population and activity comparability, visualising the mined models and interpreting the results. Without addressing these issues, health providers will find it difficult to use process mining insights, and the potential benefits of evidence-based process improvement within health will remain unrealised. In this paper, we present a brief introduction to the nature of health care processes; a review of the literature on process mining in health care; and a case study conducted to explore how health care data and cross-organisational comparisons may be approached with process mining techniques. The case study applies process mining techniques to administrative and clinical data for patients who present with chest pain symptoms at one of four public hospitals in South Australia. We demonstrate an approach that provides detailed insights into clinical (quality of patient health) and fiscal (hospital budget) pressures in health care practice. We conclude by discussing the key lessons learned from our experience in conducting business process analysis and process mining based on the data from four different hospitals.
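A basic primitive behind such process-mining comparisons is the directly-follows relation mined from an event log. The sketch below is illustrative only: the case identifiers and chest-pain activities are invented, and real studies would use a dedicated process-mining toolkit rather than this minimal Python.

```python
from collections import defaultdict

# Hypothetical event log: (case_id, activity) pairs, assumed already ordered
# by timestamp within each case. Activity names are illustrative, not taken
# from the study.
event_log = [
    ("case1", "Triage"), ("case1", "ECG"), ("case1", "Blood test"),
    ("case1", "Discharge"),
    ("case2", "Triage"), ("case2", "Blood test"), ("case2", "ECG"),
    ("case2", "Admit"),
]

def directly_follows(log):
    """Count how often activity a is immediately followed by b within a case."""
    traces = defaultdict(list)
    for case, activity in log:
        traces[case].append(activity)
    dfg = defaultdict(int)
    for trace in traces.values():
        for a, b in zip(trace, trace[1:]):
            dfg[(a, b)] += 1
    return dict(dfg)

dfg = directly_follows(event_log)
print(dfg[("Triage", "ECG")])  # 1
```

Mining one such graph per hospital and comparing edge frequencies is one simple way to surface cross-organisational differences in patient flow.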
Abstract:
The validity of using rainfall characteristics as lumped parameters for investigating the pollutant wash-off process, such as first flush occurrence, is questionable. This research study introduces an innovative concept of using sector parameters to investigate the relationship between the pollutant wash-off process and different sectors of the runoff hydrograph and rainfall hyetograph. The research outcomes indicated that rainfall depth and rainfall intensity are the two key rainfall characteristics which influence the wash-off process, compared to the antecedent dry period. Additionally, the rainfall pattern also plays a critical role in the wash-off process and is independent of the catchment characteristics. The knowledge created through this research study provides the ability to select appropriate rainfall events for stormwater quality treatment design based on the required treatment outcomes, such as the need to target different sectors of the runoff hydrograph or pollutant species. The study outcomes can also contribute to enhancing stormwater quality modelling and prediction, given that conventional approaches to stormwater quality estimation are primarily based on rainfall intensity rather than considering other rainfall parameters, or are based solely on stochastic approaches irrespective of the characteristics of the rainfall event.
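The sector-parameter idea can be conveyed with a toy computation: split a hyetograph into equal sectors and report depth and mean intensity per sector. The interval length, the three-way split and the values below are assumptions for illustration, not the study's actual definitions.

```python
# Hypothetical hyetograph: rainfall depth (mm) per 5-minute interval.
hyetograph = [2.0, 6.0, 12.0, 8.0, 3.0, 1.0]

def sector_parameters(intensities, n_sectors=3, interval_min=5):
    """Split a hyetograph into equal sectors; return depth (mm) and mean
    intensity (mm/h) per sector. Leftover intervals at the end are ignored
    in this sketch."""
    size = len(intensities) // n_sectors
    sectors = []
    for i in range(n_sectors):
        chunk = intensities[i * size:(i + 1) * size]
        depth = sum(chunk)                                   # mm
        hours = len(chunk) * interval_min / 60               # sector duration
        sectors.append({"depth_mm": depth, "intensity_mm_per_h": depth / hours})
    return sectors

for s in sector_parameters(hyetograph):
    print(s)
```

Relating pollutant loads to these per-sector parameters, rather than to whole-event lumped values, is the kind of analysis the abstract describes.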
Abstract:
Temporary Traffic Control Plans (TCPs), which provide construction phasing to maintain traffic during construction operations, are an integral component of highway construction project design. Using the initial design, designers develop estimated quantities for the required TCP devices that become the basis for bids submitted by highway contractors. However, actual as-built quantities are often significantly different from the engineer's original estimate. The total cost of TCP phasing on highway construction projects amounts to 6–10% of the total construction cost. Variations between engineer-estimated quantities and final quantities contribute to reduced cost control, increased chances of cost-related litigation, and distorted bid rankings and selection. Statistical analyses of over 2000 highway construction projects were performed to determine the sources of variation, which were then used as the basis for developing an automated hybrid prediction model that uses multiple regression and heuristic rules to provide accurate TCP quantities and costs. The predictive accuracy of the model was demonstrated through several case studies.
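A hybrid of regression and heuristic rules, in the spirit described above, might look like the following sketch. The single cost predictor, the historical data and the night-work rule are all hypothetical; the actual model combines multiple regressions over many project attributes.

```python
# Minimal hybrid estimator sketch: ordinary least squares fitted to
# (hypothetical) historical projects, with one heuristic rule layered on top.

def fit_slope_intercept(xs, ys):
    """Ordinary least squares for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

# Invented history: (total project cost in $M, as-built TCP device quantity)
costs = [1.0, 2.0, 4.0, 8.0]
quantities = [120.0, 210.0, 400.0, 790.0]
a, b = fit_slope_intercept(costs, quantities)

def predict_tcp_quantity(cost_m, night_work=False):
    q = a * cost_m + b
    if night_work:
        q *= 1.15  # hypothetical heuristic: night work needs extra devices
    return q

print(round(predict_tcp_quantity(5.0)))  # 500
```

The point of the hybrid design is that the regression captures the broad cost-quantity trend while rules encode project-specific conditions the regression cannot see.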
Abstract:
This paper addresses the problem of identifying and explaining behavioral differences between two business process event logs. The paper presents a method that, given two event logs, returns a set of statements in natural language capturing behavior that is present or frequent in one log, while absent or infrequent in the other. This log delta analysis method allows users to diagnose differences between normal and deviant executions of a process or between two versions or variants of a process. The method relies on a novel approach to losslessly encode an event log as an event structure, combined with a frequency-enhanced technique for differencing pairs of event structures. A validation of the proposed method shows that it accurately diagnoses typical change patterns and can explain differences between normal and deviant cases in a real-life log, more compactly and precisely than previously proposed methods.
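The flavour of frequency-based log differencing can be conveyed with a much simpler comparison over directly-follows relations (the paper's actual method relies on lossless event-structure encodings, which this sketch does not attempt). The logs and activity names below are made up.

```python
from collections import Counter

def df_relations(traces):
    """Frequency of each directly-follows pair across a list of traces."""
    c = Counter()
    for t in traces:
        c.update(zip(t, t[1:]))
    return c

# Hypothetical logs: "normal" vs "deviant" executions of the same process.
normal = [["A", "B", "C"], ["A", "B", "C"], ["A", "C"]]
deviant = [["A", "C", "B"], ["A", "C"]]

def delta_statements(log1, log2):
    """Natural-language statements for behaviour present in one log only."""
    r1, r2 = df_relations(log1), df_relations(log2)
    stmts = []
    for a, b in sorted(set(r1) | set(r2)):
        if r1[(a, b)] > 0 and r2[(a, b)] == 0:
            stmts.append(f"{a} is directly followed by {b} only in log 1")
        elif r2[(a, b)] > 0 and r1[(a, b)] == 0:
            stmts.append(f"{a} is directly followed by {b} only in log 2")
    return stmts

for s in delta_statements(normal, deviant):
    print(s)
```

A real log delta analysis additionally reports frequency differences (present in both logs but much more frequent in one) and handles concurrency, which plain directly-follows counting cannot.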
Abstract:
Moderation of assessment constitutes a crucial element of the learning and teaching process at the university. Yet, despite its importance, many academics hold confused beliefs and attitudes towards moderation practices, processes and procedures. This paper reports on a qualitative study conducted in a Science, Technology, Engineering and Mathematics (STEM)-focused faculty at a large Australian higher education institution. The findings of the study revealed a strong need for further investigation into the ways moderation is understood and enacted by academics within a STEM-specific context, and informed the redevelopment of the faculty's internal moderation policy.
Abstract:
Student perceptions of teaching have often been used in tertiary education for evaluation purposes. However, there is a paucity of research on the validity, reliability, and applicability of instruments that cover a wide range of student perceptions of pedagogies and practices in high school settings for descriptive purposes. The study attempts to validate an inventory of pedagogy and practice (IPP) that provides researchers and practitioners with a psychometrically sound instrument that covers the most salient factors related to teaching. Using a sample of students (N = 1515) from 39 schools in Singapore, 14 factors about teaching in English lessons from the students’ perspective were tested with confirmatory factor analysis (classroom task goal, structure and clarity, curiosity and interest, positive class climate, feedback, questioning, quality homework, review of students’ work, conventional teaching, exam preparation, behaviour management, maximizing learning time, student-centred pedagogy, and subject domain teaching). Two external criterion factors were used to further test the IPP factor structure. The inventory will enable teachers to understand more about their teaching and researchers to examine how teaching may be related to learning outcomes.
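Validating such an inventory involves, among other things, checking the internal consistency of each factor's items. A minimal sketch using Cronbach's alpha is shown below; the response matrix is invented, not drawn from the Singapore sample, and a full validation would use confirmatory factor analysis as the abstract describes.

```python
from statistics import variance

# Hypothetical responses: rows = students, columns = items of one IPP factor
# (e.g. three Likert items for "positive class climate").
responses = [
    [4, 5, 4],
    [3, 3, 4],
    [5, 5, 5],
    [2, 3, 2],
    [4, 4, 5],
]

def cronbach_alpha(matrix):
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / total variance)."""
    k = len(matrix[0])
    items = list(zip(*matrix))               # one tuple per item (column)
    totals = [sum(row) for row in matrix]    # per-student total score
    item_var = sum(variance(col) for col in items)
    return k / (k - 1) * (1 - item_var / variance(totals))

print(round(cronbach_alpha(responses), 3))  # 0.918
```

Values above roughly 0.7 are conventionally read as acceptable internal consistency for a factor's items.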
Abstract:
Background Medication safety is a pressing concern for residential aged care facilities (RACFs). Retrospective studies in RACF settings identify inadequate communication between RACFs, doctors, hospitals and community pharmacies as the major cause of medication errors. Existing literature offers limited insight into the gaps in the existing information exchange process that may lead to medication errors. The aim of this research was to explicate the cognitive distribution that underlies RACF medication ordering and delivery, to identify gaps in medication-related information exchange which lead to medication errors in RACFs. Methods The study was undertaken in three RACFs in Sydney, Australia. Data were generated through ethnographic field work over a period of five months (May–September 2011). Triangulated analysis of data primarily focused on examining the transformation and exchange of information between different media across the process. Results The findings of this study highlight the extensive scope and intense nature of information exchange in RACF medication ordering and delivery. Rather than attributing error to individual care providers, the explication of distributed cognition processes enabled the identification of gaps in three information exchange dimensions which potentially contribute to the occurrence of medication errors, namely: (1) the design of medication charts, which complicates order processing and record keeping; (2) a lack of coordination mechanisms between participants, which results in misalignment of local practices; and (3) reliance on restricted-bandwidth communication channels, mainly telephone and fax, which complicates the information processing requirements. The study demonstrates how the identification of these gaps enhances understanding of medication errors in RACFs.
Conclusions Application of the theoretical lens of distributed cognition can assist in enhancing our understanding of medication errors in RACFs through identification of gaps in information exchange. Understanding the dynamics of the cognitive process can inform the design of interventions to manage errors and improve residents’ safety.
Abstract:
Over the past several years, there has been resurgent interest in regional planning in North America, Europe and Australasia. Spurred by issues such as metropolitan growth, transportation infrastructure, environmental management and economic development, many states and metropolitan regions are undertaking new planning initiatives. These regional efforts have also raised significant questions about governance structures, accountability and measures of effectiveness. In this paper, the authors conducted an international review of ten case studies from the United States, Canada, England, Belgium, New Zealand and Australia to explore several critical questions. Using a qualitative data template, the research team reviewed plans, documents, web sites and published literature to address three questions. First, what are the governance arrangements for delivering regional planning? Second, what are the mechanisms linking regional plans with state plans (when relevant) and local plans? Third, what means and mechanisms do these regional plans use to evaluate and measure effectiveness? The case study analysis revealed several common themes. First, there is an increasing focus on governance at the regional level, which is being driven by a range of trends, including regional spatial development initiatives in Europe, regional transportation issues in the US, and the growth of metropolitan regions generally. However, there is considerable variation in how regional governance arrangements are being played out. Similarly, there is a range of processes being used at the regional level to guide planning, from broad-ranging (thick) processes to narrow and limited (thin) approaches. Finally, evaluation and monitoring of regional planning efforts involve compiling data on inputs, processes, outputs and outcomes. Although there is increased attention being paid to indicators and monitoring, most of it falls into outcome evaluations such as Agenda 21 or sustainability reporting.
Based on our review we suggest there is a need for increased attention on input, process and output indicators and clearer linkages of these indicators in monitoring and evaluation frameworks. The focus on outcome indicators, such as sustainability indicators, creates feedback systems that are too long-term and remote for effective monitoring and feedback. Although we found some examples of where these kinds of monitoring frameworks are linked into a system of governance, there is a need for clearer conceptual development for both theory and practice.
Abstract:
More than a century ago in their definitive work “The Right to Privacy” Samuel D. Warren and Louis D. Brandeis highlighted the challenges posed to individual privacy by advancing technology. Today’s workplace is characterised by its reliance on computer technology, particularly the use of email and the Internet to perform critical business functions. Increasingly these and other workplace activities are the focus of monitoring by employers. There is little formal regulation of electronic monitoring in Australian or United States workplaces. Without reasonable limits or controls, this has the potential to adversely affect employees’ privacy rights. Australia has a history of legislating to protect privacy rights, whereas the United States has relied on a combination of constitutional guarantees, federal and state statutes, and the common law. This thesis examines a number of existing and proposed statutory and other workplace privacy laws in Australia and the United States. The analysis demonstrates that existing measures fail to adequately regulate monitoring or provide employees with suitable remedies where unjustifiable intrusions occur. The thesis ultimately supports the view that enacting uniform legislation at the national level provides a more effective and comprehensive solution for both employers and employees. Chapter One provides a general introduction and briefly discusses issues relevant to electronic monitoring in the workplace. Chapter Two contains an overview of privacy law as it relates to electronic monitoring in Australian and United States workplaces. In Chapter Three there is an examination of the complaint process and remedies available to a hypothetical employee (Mary) who is concerned about protecting her privacy rights at work. Chapter Four provides an analysis of the major themes emerging from the research, and also discusses the draft national uniform legislation. 
Chapter Five details the proposed legislation in the form of the Workplace Surveillance and Monitoring Act, and Chapter Six contains the conclusion.
Abstract:
This paper presents a phenomenographic analysis of the conceptions of teaching and learning held by a sample of 16 secondary school teachers in two Australian schools. It provides descriptions of four categories, derived from pooled data, of the ways in which these teachers thought about teaching and about learning, their teaching strategies, and their focus on student or content. The categories for teaching and learning are described with each teacher allocated to the category most typical of their conceptions of teaching and of learning. The lack of congruence, in some cases, between the conceptions of teaching and of learning held by these teachers is discussed.
Abstract:
The Malaysian accounting profession is committed to promoting education that results in a strong ethical culture among accountants. However, some consider ethical training unproductive since trainees may have had their ethical values formed before commencement. This paper investigates the impact of ethics instruction on final-year accounting students, the future accountants of Malaysia. Eighty-five final-year accounting students were given five ethical scenarios and asked what action they considered appropriate. They were then subjected to two ethical training methodologies: a traditional lecture/tutorial process and a group assignment. After a significant interval, students were re-presented with the ethical scenarios and asked what action they now considered appropriate. In all five instances students offered a more ethical response the second time. Participants also rated both training methods, and their combined effect, as effective. The results suggest there is benefit in including ethics teaching, and indeed emphasising its importance, in accountancy courses if the profession's goal of ethical practitioners is to be achieved.
Abstract:
CRTA technology offers better resolution and a more detailed interpretation of the decomposition processes of a clay mineral such as sepiolite by approaching equilibrium decomposition conditions, eliminating the slow transfer of heat to the sample as a controlling parameter on the decomposition process. Constant-rate decomposition processes of a non-isothermal nature reveal changes in the sepiolite as it is converted to an anhydride. In the dynamic experiment, two dehydration steps are observed over the ~20-170 and 170-350°C temperature ranges, and three dehydroxylation steps are observed over the temperature ranges 201-337, 337-638 and 638-982°C. The CRTA technology enables the separation of the thermal decomposition steps.