957 results for organizational models
Abstract:
This presentation addresses issues related to leadership, academic development and the scholarship of teaching and learning, and highlights research funded by the Australian Office for Learning and Teaching (OLT) designed to embed and sustain peer review of teaching within the culture of five Australian universities: Queensland University of Technology, University of Technology Sydney, University of Adelaide, Curtin University, and Charles Darwin University. Peer review of teaching in higher education is emphasised as a professional process for providing feedback on teaching and learning practice which, if sustained, can become an effective ongoing strategy for academic development (Barnard et al., 2011; Bell, 2005; Bolt & Atkinson, 2010; McGill & Beaty, 2001, 1992; Kemmis & McTaggart, 2000). The research affirms that developmental peer review models (Barnard et al., 2011; D'Andrea, 2002; Hammersley-Fletcher & Orsmond, 2004) can bring about successful implementation, especially when implemented within a distributive leadership framework (Spillane & Healey, 2010). The project's aims were to develop leadership capacity and to integrate peer review as a cultural practice in higher education. The research design was a two-stage inquiry process over two years. The project began in July 2011 and encompassed a development and pilot phase followed by a cascade phase, with questionnaire and focus group evaluation processes to support ongoing improvement and to measure outcomes. Leadership development activities included locally delivered workshops, complemented by the identification and support of champions. To optimise long-term sustainability, the project was implemented through existing learning and teaching structures and processes within the respective partner universities.
Research outcomes highlight the fundamentals of peer review of teaching and the broader contextual elements of integration, leadership and development, expressed as a conceptual model for embedding peer review of teaching within higher education. The research opens a communicative space about the introduction of peer review that goes further than simply espousing its worth. The conceptual model highlights the importance of developing distributive leadership capacity, integrating policies and processes, and understanding the values, beliefs, assumptions and behaviours embedded in an organisational culture. The presentation overviews empirical findings demonstrating that progress in advancing peer review requires an 'across-the-board' commitment to embed change, and inherently demands a process that co-creates connection across colleagues, discipline groups, and the university sector. Progress toward peer review of teaching as a cultural phenomenon can be achieved, and has advantages for academic staff, scholarship, teaching evaluation and the organisation, if attention is given to strategies that influence the contexts and cultures of teaching practice. Peer review as a strategy for developing excellence in teaching is considered from a holistic perspective that by necessity encompasses all elements of an educational environment, with a focus on the scholarship of teaching. The work is ongoing, has implications for policy, research, teaching development and student outcomes, and has potential application worldwide.
Abstract:
This research used a multiple-case study approach to empirically investigate the complex relationship between factors influencing inter-project knowledge sharing: trustworthiness, organizational culture, and knowledge-sharing mechanisms. Adopting a competing values framework, we found evidence of patterns between the type of culture at the project management unit level and, at the individual level, project managers' perceptions of the value of trustworthy behaviors and the way they share knowledge. We also found evidence of a mutually reinforcing effect between trust and clan culture, which shapes tacit knowledge-sharing behaviors.
Abstract:
Despite decades of attempts to embed sustainability within higher education, literature clearly suggests that highly regulated disciplines such as engineering have been relatively slow to incorporate sustainability knowledge and skill areas, and are generally poorly prepared to do so. With current efforts, it is plausible that sustainability could take another two decades to be embedded within the curriculum. Within this context, this paper presents a whole system approach to implement systematic, intentional and timely curriculum renewal that is responsive to emerging challenges and opportunities, encompassing curriculum and organizational change. The paper begins by considering the evolution of curriculum renewal processes, documenting a number of whole system considerations that have been empirically distilled from literature, case studies, pilot trials, and a series of workshops with built environment educators from around the world over the last decade. The paper outlines a whole-of-institution curriculum renewal approach to embedding sustainability knowledge and skills within the DNA of the institutional offerings. The paper concludes with a discussion of research and practice implications for the field of education research, within and beyond higher education.
Abstract:
Organizations invest in ways to stimulate new ideas for new products and services, engaging in tournaments and competitions to generate new ideas or to combine existing ideas in new ways (Terwiesch and Ulrich, 2009). Specifically, some large companies have developed platforms for posting intractable problems to tap into the ideas and problem-solving abilities of a broader range of people (Huston and Sakkab, 2006; Morgan and Wang, 2010), and to develop new and elegant solutions, often in an open innovation approach (Chesbrough, 2003). The notion of ingenuity is often applied to individuals who create innovative solutions in situations of constraint, where ingenuity in the form of elegant solutions can be understood as one form of resourcefulness (Young, 2011). However, the notion of organizational ingenuity locates ingenuity more centrally in an organization's strategic decision making and implementation, embedding ingenuity into the company's culture. Studies of organizations displaying ingenuity indicate a range of possibilities from extreme ingenuity (Baker and Nelson, 2005) to less dramatic but substantial changes (Thomke, 2003), sometimes in an experimental phase or as part of a move towards a new and distinct identity for ongoing innovation.
Abstract:
The previous chapters gave an insightful introduction into the various facets of Business Process Management. We now share a rich understanding of the essential ideas behind designing and managing processes for organizational purposes. We have also learned about the various streams of research and development that have influenced contemporary BPM. As a matter of fact, BPM has become a holistic management discipline. As such, it requires that a plethora of facets be addressed for its successful and sustainable application. This chapter provides a framework that consolidates and structures the essential factors that constitute BPM as a whole. Drawing from research in the field of maturity models, we suggest six core elements of BPM: strategic alignment, governance, methods, information technology, people, and culture. These six elements serve as the structure for this BPM Handbook.
Abstract:
Software to create individualised finite element (FE) models of the osseoligamentous spine using pre-operative computed tomography (CT) data-sets for spinal surgery patients has recently been developed. This study presents a geometric sensitivity analysis of this software to assess the effect of intra-observer variability in user-selected anatomical landmarks. User-selected landmarks on the osseous anatomy were defined from CT data-sets for three scoliosis patients and these landmarks were used to reconstruct patient-specific anatomy of the spine and ribcage using parametric descriptions. The intra-observer errors in landmark co-ordinates for these anatomical landmarks were calculated. FE models of the spine and ribcage were created using the reconstructed anatomy for each patient and these models were analysed for a loadcase simulating clinical flexibility assessment. The intra-observer error in the anatomical measurements was low in comparison to the initial dimensions, with the exception of the angular measurements for disc wedge and zygapophyseal joint (z-joint) orientation and disc height. This variability suggested that CT resolution may influence such angular measurements, particularly for small anatomical features, such as the z-joints, and may also affect disc height. The results of the FE analysis showed low variation in the model predictions for spinal curvature with the mean intra-observer variability substantially less than the accepted error in clinical measurement. These findings demonstrate that intra-observer variability in landmark point selection has minimal effect on the subsequent FE predictions for a clinical loadcase.
Abstract:
Autonomous navigation and picture compilation tasks require robust feature descriptions or models. Given the non-Gaussian nature of sensor observations, it will be shown that Gaussian mixture models provide a general probabilistic representation allowing analytical solutions to the update and prediction operations in the general Bayesian filtering problem. Each operation in the Bayesian filter for Gaussian mixture models multiplicatively increases the number of parameters in the representation, leading to the need for a re-parameterisation step. A computationally efficient re-parameterisation step will be demonstrated, resulting in a compact and accurate estimate of the true distribution.
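The re-parameterisation step described in this abstract amounts to mixture reduction: pruning or merging components so the parameter count stays bounded. A minimal 1-D sketch, assuming a greedy moment-preserving merge of the closest pair (function names are illustrative, not from the paper):

```python
import numpy as np

def merge_pair(w1, m1, v1, w2, m2, v2):
    """Moment-preserving merge of two 1-D Gaussian components:
    the merged component keeps the pair's total weight, mean and variance."""
    w = w1 + w2
    m = (w1 * m1 + w2 * m2) / w
    v = (w1 * (v1 + (m1 - m) ** 2) + w2 * (v2 + (m2 - m) ** 2)) / w
    return w, m, v

def reduce_mixture(weights, means, variances, max_components):
    """Greedily merge the closest pair of components until the mixture
    has at most max_components terms."""
    comps = list(zip(weights, means, variances))
    while len(comps) > max_components:
        # Pick the pair with the closest means (a simple criterion;
        # Kullback-Leibler based merge costs are common in practice).
        i, j = min(
            ((a, b) for a in range(len(comps)) for b in range(a + 1, len(comps))),
            key=lambda ab: abs(comps[ab[0]][1] - comps[ab[1]][1]),
        )
        merged = merge_pair(*comps[i], *comps[j])
        comps = [c for k, c in enumerate(comps) if k not in (i, j)] + [merged]
    w, m, v = zip(*comps)
    return np.array(w), np.array(m), np.array(v)
```

Because each merge preserves the zeroth, first and second moments, the reduced mixture stays an unbiased summary of the original in those moments, which is what makes the step cheap yet accurate.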
Abstract:
Automated process discovery techniques aim at extracting process models from information system logs. Existing techniques in this space are effective when applied to relatively small or regular logs, but generate spaghetti-like and sometimes inaccurate models when confronted with logs with high variability. In previous work, trace clustering has been applied in an attempt to reduce the size and complexity of automatically discovered process models. The idea is to split the log into clusters and to discover one model per cluster. This leads to a collection of process models – each one representing a variant of the business process – as opposed to an all-encompassing model. Still, models produced in this way may exhibit unacceptably high complexity and low fitness. In this setting, this paper presents a two-way divide-and-conquer process discovery technique, wherein the discovered process models are split on the one hand by variants and on the other hand hierarchically using subprocess extraction. Splitting is performed in a controlled manner in order to achieve user-defined complexity or fitness thresholds. Experiments on real-life logs show that the technique produces collections of models substantially smaller than those extracted by applying existing trace clustering techniques, while allowing the user to control the fitness of the resulting models.
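The "split the log, discover one model per cluster" idea can be sketched in a few lines. This toy version (not the paper's algorithm) clusters traces greedily by activity-set similarity and then discovers a directly-follows graph per cluster as a stand-in for a full discovery algorithm:

```python
from collections import defaultdict

def jaccard(a, b):
    """Jaccard similarity of the activity sets of two traces."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

def cluster_traces(log, threshold=0.5):
    """Greedy trace clustering: assign each trace to the first cluster
    whose representative trace shares enough activities, else open a new one."""
    clusters = []
    for trace in log:
        for cluster in clusters:
            if jaccard(trace, cluster[0]) >= threshold:
                cluster.append(trace)
                break
        else:
            clusters.append([trace])
    return clusters

def directly_follows(traces):
    """Discover a directly-follows graph (edge -> frequency) from one cluster."""
    dfg = defaultdict(int)
    for trace in traces:
        for a, b in zip(trace, trace[1:]):
            dfg[(a, b)] += 1
    return dict(dfg)
```

Each cluster yields a small model for one process variant; the paper's contribution is to control this splitting (by variant and by subprocess extraction) so that user-defined complexity or fitness thresholds are met.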
Abstract:
Techniques for evaluating and selecting multivariate volatility forecasts are not yet understood as well as their univariate counterparts. This paper considers the ability of different loss functions to discriminate between a set of competing forecasting models which are subsequently applied in a portfolio allocation context. It is found that a likelihood-based loss function outperforms its competitors, including those based on the given portfolio application. This result indicates that considering the particular application of forecasts is not necessarily the most effective basis on which to select models.
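One widely used likelihood-based loss for variance forecasts is the QLIKE loss; whether it is the specific loss studied in this paper is an assumption, but it illustrates the contrast with a distance-based loss such as MSE:

```python
import numpy as np

def qlike(forecast_var, realized_var):
    """Quasi-likelihood (QLIKE) loss for a variance forecast h against a
    realized variance proxy r2: mean of log(h) + r2/h. Its expectation is
    minimised when h equals the true conditional variance."""
    h = np.asarray(forecast_var, dtype=float)
    r2 = np.asarray(realized_var, dtype=float)
    return np.mean(np.log(h) + r2 / h)

def mse(forecast_var, realized_var):
    """Mean squared error between forecast and realized variance."""
    h = np.asarray(forecast_var, dtype=float)
    r2 = np.asarray(realized_var, dtype=float)
    return np.mean((h - r2) ** 2)
```

A biased forecast (e.g. double the true variance) incurs a strictly higher QLIKE loss than an unbiased one, and QLIKE penalises under-prediction more heavily than over-prediction, which is one reason likelihood-based losses discriminate well between volatility models.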
Abstract:
Process Modeling is a widely used concept for understanding, documenting and also redesigning the operations of organizations. The validation and usage of process models is however affected by the fact that only business analysts fully understand them in detail. This is a particular problem because business analysts are typically not domain experts. In this paper, we investigate to what extent the concept of verbalization can be adapted from object-role modeling to process models. To this end, we define an approach which automatically transforms BPMN process models into natural language texts and combines different techniques from linguistics and graph decomposition in a flexible and accurate manner. The evaluation of the technique is based on a prototypical implementation and involves a test set of 53 BPMN process models, showing that natural language texts can be generated in a reliable fashion.
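The core of verbalization is mapping model elements to sentence templates. A toy stand-in for the linguistic pipeline described above, covering only a sequence of tasks (the real approach also handles gateways and graph decomposition):

```python
def verbalize(task_sequence):
    """Turn an ordered list of task labels (e.g. from a BPMN sequence flow)
    into simple English sentences using positional discourse markers."""
    sentences = []
    for i, task in enumerate(task_sequence):
        if i == 0:
            opener = "First,"
        elif i < len(task_sequence) - 1:
            opener = "Then,"
        else:
            opener = "Finally,"
        sentences.append(f"{opener} the '{task}' activity is performed.")
    return " ".join(sentences)
```

For the sequence Check invoice → Approve payment → Archive documents this yields "First, the 'Check invoice' activity is performed. Then, …", illustrating why template choice and sentence aggregation matter for readable output.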
Abstract:
The motion response of marine structures in waves can be studied using finite-dimensional linear-time-invariant approximating models. These models, obtained using system identification with data computed by hydrodynamic codes, find application in offshore training simulators, hardware-in-the-loop simulators for positioning control testing, and also in initial designs of wave-energy conversion devices. Different proposals have appeared in the literature to address the identification problem in both time and frequency domains, and recent work has highlighted the superiority of the frequency-domain methods. This paper summarises practical frequency-domain estimation algorithms that use constraints on model structure and parameters to refine the search of approximating parametric models. Practical issues associated with the identification are discussed, including the influence of radiation model accuracy in force-to-motion models, which are usually the ultimate modelling objective. The illustration examples in the paper are obtained using a freely available MATLAB toolbox developed by the authors, which implements the estimation algorithms described.
Abstract:
This article studies the problem of transforming a process model with an arbitrary topology into an equivalent well-structured process model. While this problem has received significant attention, there is still no full characterization of the class of unstructured process models that can be transformed into well-structured ones, nor an automated method for structuring any process model that belongs to this class. This article fills this gap in the context of acyclic process models. The article defines a necessary and sufficient condition for an unstructured acyclic process model to have an equivalent well-structured process model under fully concurrent bisimulation, as well as a complete structuring method. The method has been implemented as a tool that takes process models captured in the BPMN and EPC notations as input. The article also reports on an empirical evaluation of the structuring method using a repository of process models from commercial practice.
Abstract:
This article deals with time-domain hydroelastic analysis of a marine structure. The convolution terms associated with fluid memory effects are replaced by an alternative state-space representation, the parameters of which are obtained by using realization theory. The mathematical model established is validated by comparison to experimental results of a very flexible barge. Two types of time-domain simulations are performed: dynamic response of the initially inert structure to incident regular waves and transient response of the structure after it is released from a displaced condition in still water. The accuracy and the efficiency of the simulations based on the state-space model representations are compared to those that integrate the convolutions.
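The substitution at the heart of this approach can be checked numerically: the memory term computed as an explicit convolution with the impulse-response kernel equals the output of the state-space recursion whose Markov parameters match that kernel. A minimal sketch with illustrative matrices (not those of the barge model):

```python
import numpy as np

# Toy discrete-time realization (A, B, C) standing in for the
# fluid-memory model obtained via realization theory.
A = np.array([[0.8, 0.1], [0.0, 0.5]])
B = np.array([[1.0], [0.5]])
C = np.array([[1.0, -0.3]])

def impulse_response(A, B, C, n):
    """Markov parameters k[t] = C A^t B of the realization."""
    k, x = [], B
    for _ in range(n):
        k.append((C @ x).item())
        x = A @ x
    return np.array(k)

def via_convolution(kernel, u):
    """Memory term evaluated as an explicit convolution sum."""
    y = np.zeros(len(u))
    for t in range(len(u)):
        for tau in range(min(t + 1, len(kernel))):
            y[t] += kernel[tau] * u[t - tau]
    return y

def via_state_space(A, B, C, u):
    """Same memory term propagated by the state-space recursion
    x <- A x + B u[t], y[t] = C x, with zero initial state."""
    x = np.zeros((A.shape[0], 1))
    y = []
    for ut in u:
        x = A @ x + B * ut
        y.append((C @ x).item())
    return np.array(y)
```

The state-space form replaces an O(t) convolution at each step with a fixed-cost recursion, which is precisely why it makes long time-domain hydroelastic simulations efficient.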
Abstract:
This article describes a Matlab toolbox for parametric identification of fluid-memory models associated with the radiation forces of ships and offshore structures. Radiation forces are a key component of force-to-motion models used in simulators, motion control designs, and also for initial performance evaluation of wave-energy converters. The software described provides tools for preparing non-parametric data and for identification with automatic model-order detection. The identification problem is considered in the frequency domain.