845 results for Dual-process Model
Abstract:
The proliferation of innovative schemes to address climate change at international, national and local levels signals a fundamental shift in the priority and role of the natural environment to society, organizations and individuals. This shift in shared priorities invites academics and practitioners to consider the role of institutions in shaping and constraining responses to climate change at multiple levels of organizations and society. Institutional theory provides an approach to conceptualising and addressing climate change challenges by focusing on the central logics that guide society, organizations and individuals and their material and symbolic relationship to the environment. For example, framing a response to climate change in the form of an emissions trading scheme evidences a practice informed by a capitalist market logic (Friedland and Alford 1991). However, not all responses need necessarily align with a market logic. Indeed, Thornton (2004) identifies six broad societal sectors, each with its own logic (markets, corporations, professions, states, families, religions). Hence, understanding the logics that underpin successful (and unsuccessful) climate change initiatives helps reveal how institutions shape and constrain practices, and provides valuable insights for policy makers and organizations. This paper develops models and propositions to consider the construction of, and challenges to, climate change initiatives based on institutional logics (Thornton and Ocasio 2008). We propose that the challenge of understanding and explaining how climate change initiatives are successfully adopted be examined in terms of their institutional logics and how these logics evolve over time. To achieve this, a multi-level framework of analysis that encompasses society, organizations and individuals is necessary (Friedland and Alford 1991). However, to date most extant studies of institutional logics have tended to emphasize one level over the others (Thornton and Ocasio 2008: 104). In addition, existing studies related to climate change initiatives have largely been descriptive (e.g. Braun 2008) or prescriptive (e.g. Boiral 2006) in terms of the suitability of particular practices. This paper contributes to the literature on logics in two ways. First, it examines multiple levels: the proliferation of the climate change agenda provides a site in which to study how institutional logics are played out across multiple, yet embedded, levels of society through the institutional forums in which change takes place. Second, the paper specifically examines how institutional logics provide society with organising principles (material practices and symbolic constructions) which enable and constrain actions and help define motives and identity. Based on this model, we develop a series of propositions on the conditions required for the successful introduction of climate change initiatives. The paper proceeds as follows. We present a review of the literature on institutional logics and develop a generic model of the process by which institutional logics operate. We then consider how this applies to key initiatives related to climate change. Finally, we develop a series of propositions to guide insights into the successful implementation of climate change practices.
Abstract:
Purpose – The paper aims to describe a workforce-planning model developed in-house in an Australian university library that is based on rigorous environmental scanning of an institution, the profession and the sector. Design/methodology/approach – The paper uses a case study that describes the stages of the planning process undertaken to develop the Library’s Workforce Plan and the documentation produced. Findings – While it has been found that the process has had successful and productive outcomes, workforce planning is an ongoing process. To remain effective, the workforce plan needs to be reviewed annually in the context of the library’s overall planning program. This is imperative if the plan is to remain current and to be regarded as a living document that will continue to guide library practice. Research limitations/implications – Although a single case study, the work has been contextualized within the wider research into workforce planning. Practical implications – The paper provides a model that can easily be deployed within a library without external or specialist consultant skills, and due to its scalability can be applied at department or wider level. Originality/value – The paper identifies the trends impacting on, and the emerging opportunities for, university libraries and provides a model for workforce planning that recognizes the context and culture of the organization as key drivers in determining workforce planning. Keywords: Australia, University libraries, Academic libraries, Change management, Manpower planning. Paper type: Case study
Abstract:
Process modeling grammars are used by analysts to describe information systems domains in terms of the business operations an organization conducts. While prior research has examined the factors that lead to continued usage behavior, little is known about the extent to which characteristics of the users of process modeling grammars inform usage behavior. In this study, a theoretical model is advanced that incorporates determinants of continued usage behavior as well as key antecedent individual difference factors of the grammar users, such as modeling experience, modeling background and perceived grammar familiarity. Findings from a global survey of 529 grammar users support the hypothesized relationships of the model. The study offers three central contributions. First, it provides a validated theoretical model of post-adoptive modeling grammar usage intentions. Second, it discusses the effects of individual difference factors of grammar users in the context of modeling grammar usage. Third, it provides implications for research and practice.
Abstract:
The selection criteria for contractor pre-qualification are characterized by the co-existence of both quantitative and qualitative data. The qualitative data is non-linear, uncertain and imprecise. An ideal decision support system for contractor pre-qualification should be able to handle both quantitative and qualitative data, and to map the complicated non-linear relationships among the selection criteria, so that rational and consistent decisions can be made. In this research paper, an artificial neural network model was developed to assist public clients in identifying suitable contractors for tendering. The pre-qualification criteria (variables) were identified for the model. One hundred and twelve real pre-qualification cases were collected from civil engineering projects in Hong Kong, and eighty-eight hypothetical pre-qualification cases were also generated according to the “if-then” rules used by professionals in the pre-qualification process. Each pre-qualification case consisted of input ratings for candidate contractors’ attributes and their corresponding pre-qualification decisions. The training of the neural network model was accomplished using a purpose-built program, in which a conjugate gradient descent algorithm was incorporated to improve the learning performance of the network. Cross-validation was applied to estimate the generalization errors based on “re-sampling” of the training pairs. The results of the analysis fully comply with the current practice of public developers in Hong Kong. The case studies show that the artificial neural network model is suitable for mapping the complicated non-linear relationship between contractors’ attributes and their corresponding pre-qualification (or disqualification) decisions, and it can therefore be considered an effective alternative for performing the contractor pre-qualification task.
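To make the training approach concrete, here is a minimal sketch (not the paper's program) of a one-hidden-layer network for pre-qualification decisions trained with a conjugate gradient optimiser; the attribute ratings, labels and layer sizes are synthetic placeholders, and scipy's CG routine stands in for the purpose-built implementation described above:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n_samples, n_attrs, n_hidden = 200, 8, 6   # assumed sizes, not from the paper

X = rng.uniform(0.0, 1.0, (n_samples, n_attrs))   # synthetic attribute ratings
y = (X.mean(axis=1) > 0.5).astype(float)          # toy qualify/disqualify labels

def unpack(w):
    """Split the flat parameter vector into layer weights and biases."""
    i = n_attrs * n_hidden
    W1 = w[:i].reshape(n_attrs, n_hidden)
    b1 = w[i:i + n_hidden]
    W2 = w[i + n_hidden:i + 2 * n_hidden]
    b2 = w[-1]
    return W1, b1, W2, b2

def loss(w):
    """Cross-entropy of the one-hidden-layer network on the training cases."""
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(X @ W1 + b1)                  # hidden layer
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))  # output probability
    eps = 1e-9
    return -np.mean(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))

n_params = n_attrs * n_hidden + 2 * n_hidden + 1
w0 = rng.normal(scale=0.1, size=n_params)
result = minimize(loss, w0, method='CG')      # conjugate gradient training
print('final training cross-entropy:', result.fun)
```

In practice the generalization error of such a model would be estimated with cross-validation over re-sampled training pairs, as the abstract describes.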
Abstract:
This study was designed to examine affective leader behaviours, and their impact on cognitive, affective and behavioural engagement. Researchers (e.g., Cropanzano & Mitchell, 2005; Moorman et al., 1998) have called for more research to be directed toward modelling and testing sets of relationships which better approximate the complexity associated with contemporary organisational experience. This research has attempted to do this by clarifying and defining the construct of engagement, and then by examining how each of the engagement dimensions are impacted by affective leader behaviours. Specifically, a model was tested that identifies leader behaviour antecedents of cognitive, affective and behavioural engagement. Data was collected from five public-sector organisations. Structural equation modelling was used to identify the relationships between the engagement dimensions and leader behaviours. The results suggested that affective leader behaviours had a substantial direct impact on cognitive engagement, which in turn influenced affective engagement, which then influenced intent to stay and extra-role performance. The results indicated a directional process for engagement, but particularly highlighted the significant impact of affective leader behaviours as an antecedent to engagement. In general terms, the findings will provide a platform from which to develop a robust measure of engagement, and will be helpful to human resource practitioners interested in understanding the directional process of engagement and the importance of affective leadership as an antecedent to engagement.
Abstract:
It has been suggested that the Internet is the most significant driver of international trade in recent years, to the extent that the term 'internetalisation' has been coined (Bell, Deans, Ibbotson & Sinkovics, 2001; Buttriss & Wilkinson, 2003). This term is used to describe the Internet's effect on the internationalisation process of the firm. Consequently, researchers have argued that the internationalisation process of the firm has altered due to the Internet and hence is in need of further investigation. However, as there is limited research and understanding, ambiguity remains about how the Internet has influenced international market growth. Thus, the purpose of this study was to explore how the Internet influences firms' internationalisation process, specifically international market growth. To this end, Internet marketing and international market growth theories are used to illuminate this ambiguity in the body of knowledge. Thus, the research problem 'How and why does the Internet influence international market growth of the firm?' is justified for investigation. To explore the research question a two-stage approach was used. First, twelve case studies were used to evaluate key concepts, generate hypotheses and develop a model of internetalisation for testing. The participants held key positions within their firms, so that rich data could be drawn from international market growth decision makers. Second, a quantitative confirmation process analysed the identified themes or constructs, using two hundred and twenty-four valid responses. Constructs were evaluated through an exploratory factor analysis, confirmatory factor analysis and structural equation modelling process. Structural equation modelling was used to test the model of 'internetalisation' and to examine the interrelationships between the internationalisation process components: information availability, information usage, interaction communication, international mindset, business relationship usage, psychic distance, the Internet intensity of the firm and international market growth. This study found that the Internet intensity of the firm mediates information availability, information usage, international mindset and business relationships when firms grow in international markets. Therefore, these results provide empirical evidence that the Internet has a positive influence on international information, knowledge, entrepreneurship and networks, and these in turn influence international market growth. The theoretical contributions are threefold. First, the study identifies a holistic model of the impact the Internet has had on the outward internationalisation of the firm. This contribution extends the body of knowledge pertaining to Internet international marketing by mapping and confirming interrelationships between the Internet, internationalisation and growth concepts. Second, the study highlights the broad scope and accelerated rate of international market growth of firms. Evidence that the Internet influences the traditional and virtual networks used in the pursuit of international market growth extends current understanding. Third, this study confirms that international information, knowledge, entrepreneurship and network concepts are valid in a single model. Thus, these three contributions identify constructs, measure them in a multi-item capacity, map interrelationships and confirm a single holistic model of 'internetalisation'.
The main practical contribution is that the findings identified information, knowledge and entrepreneurial opportunities for firms wishing to maximise international market growth. To capitalise on these opportunities, suggestions are offered to assist firms in developing greater Internet intensity and internationalisation capabilities. From a policy perspective, educational institutions and government bodies need to promote more applied programs for Internet international marketing. The study provides future researchers with a platform of identified constructs and interrelationships related to internetalisation with which to investigate further. However, a single study has limited generalisability; thus, future research should replicate this study. Such replication or cross-validation will assist in the verification of the scales used in this research and enhance the validity of causal predictions. Furthermore, this study was undertaken in the Australian outward-bound context. Research in other nations, as well as research into inbound internationalisation, would be fruitful.
Abstract:
The Thai written language is one of the languages that do not have word boundaries. In order to discover the meaning of a document, the text must first be separated into syllables, words, sentences and paragraphs. This paper develops a novel method to segment Thai text by combining a non-dictionary-based technique with a dictionary-based technique. The method first applies Thai grammar rules to the text to identify syllables. A hidden Markov model is then used to merge possible syllables into words. The identified words are verified against a lexical dictionary, and a decision tree is employed to discover the words not covered by the dictionary. Documents used in the litigation process of Thai court proceedings were used in the experiments. The segmented words obtained by the proposed method outperform the results obtained by other existing methods.
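As an illustration of the syllable-merging step, the sketch below runs Viterbi decoding over a two-state HMM in which state 'B' starts a new word and 'I' continues the current one; the transition and emission probabilities here are invented placeholders, not the paper's trained model:

```python
import numpy as np

states = ['B', 'I']
log_start = np.log([0.9, 0.1])
log_trans = np.log([[0.6, 0.4],    # B -> B, B -> I
                    [0.7, 0.3]])   # I -> B, I -> I

def log_emit(state, syllable):
    # Placeholder emission model: uniform over syllables. A real system
    # would estimate these probabilities from a syllable-tagged corpus.
    return np.log(0.5)

def viterbi(syllables):
    n = len(syllables)
    V = np.full((n, 2), -np.inf)        # best log-probability per (position, state)
    back = np.zeros((n, 2), dtype=int)  # backpointers for path recovery
    for s in range(2):
        V[0, s] = log_start[s] + log_emit(states[s], syllables[0])
    for t in range(1, n):
        for s in range(2):
            scores = V[t - 1] + log_trans[:, s]
            back[t, s] = int(np.argmax(scores))
            V[t, s] = scores[back[t, s]] + log_emit(states[s], syllables[t])
    path = [int(np.argmax(V[-1]))]
    for t in range(n - 1, 0, -1):
        path.append(back[t, path[-1]])
    labels = [states[s] for s in reversed(path)]
    # Merge syllables into words: every 'B' opens a new word.
    words, current = [], ''
    for syl, lab in zip(syllables, labels):
        if lab == 'B' and current:
            words.append(current)
            current = ''
        current += syl
    words.append(current)
    return words

print(viterbi(['sa', 'wat', 'dee']))   # toy romanised syllables
```

In the paper's pipeline, the word hypotheses produced by this kind of decoding are then checked against the lexical dictionary, with a decision tree handling out-of-dictionary words.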
Abstract:
Road and highway infrastructure provides the backbone for a nation's economic growth. The diverse dispersion of Australia's population, from sparsely settled communities in remote areas to regenerated inner-city suburbs with high-density living in metropolitan areas, calls for continuing development and improvement of road infrastructure under current federal government policies and state governments' strategic plans. As road infrastructure projects involve large resources and complex mechanisms, achieving sustainability not only on economic scales but also through environmental and social responsibility becomes a crucial issue. Current efforts are often impeded by differing interpretations of the sustainability agenda among the stakeholders involved in these types of projects. As a result, sustainability deliverables at the project level are often not as transparent and measurable as the promises made in project briefs and designs. This paper reviews past studies on sustainable infrastructure construction, focusing on road and highway projects. Through a literature study and consultation with the industry, key sustainability indicators specific to road infrastructure projects have been identified. Based on these findings, this paper introduces an ongoing research project aimed at identifying and integrating the different perceptions and priority needs of the stakeholders, and the issues that underlie the gap between sustainability foci and their actual realization at the project level. The exploration helps generate an integrated decision-making model for sustainable road infrastructure projects. The research will promote to the industry more systematic and integrated approaches to decision-making on the implementation of sustainability strategies, to achieve deliverable goals throughout the development and delivery process of road infrastructure projects in Australia.
Abstract:
Conventional clinical therapies are unable to resolve osteochondral defects adequately; hence, tissue engineering solutions are sought to address the challenge. A biphasic implant seeded with Mesenchymal Stem Cells (MSCs) and coupled with an electrospun membrane was evaluated as an alternative. This dual-phase construct comprised a Polycaprolactone (PCL) cartilage scaffold and a Polycaprolactone - Tri-Calcium Phosphate (PCL-TCP) osseous matrix. Autologous MSCs were seeded into the entire implant via fibrin, and the construct was inserted into critically sized osteochondral defects located at the medial condyle and patellar groove of pigs. The defect was resurfaced with a PCL-collagen electrospun mesh that served as a substitute for the periosteal flap in preventing cell leakage. Controls lacking either the implanted MSCs or the resurfacing membrane were included. After 6 months, cartilaginous repair was observed, with a low occurrence of fibrocartilage at the medial condyle. Osteochondral repair was promoted and host cartilage degeneration was arrested, as shown by the superior glycosaminoglycan (GAG) maintenance. This positive morphological outcome was supported by a higher relative Young's modulus, which indicated functional cartilage restoration. Bone ingrowth and remodeling occurred in all groups, with a higher degree of mineralization in the experimental group. Tissue repair was compromised in the absence of the implanted cells or the resurfacing membrane. Moreover, healing was inferior at the patellar groove compared to the medial condyle, which was attributed to the sites' native biomechanical features.
Abstract:
Introduction The purpose of this study was to develop, implement and evaluate the impact of an educational intervention, comprising an innovative model of clinical decision-making and an educational delivery strategy, for facilitating nursing students' learning and development of competence in paediatric physical assessment practices. Background of the study Nursing students have an undergraduate education that aims to produce graduates of a generalist nature who demonstrate entry-level competence for providing nursing care in a variety of health settings. Consistent with population morbidity and health care roles, paediatric nursing concepts typically form a comparatively small part of undergraduate curricula, and students' exposure to paediatric physical assessment concepts and principles is brief. However, the nursing shortage has changed traditional nursing employment patterns and new graduates form the majority of the recruitment pool for paediatric nursing speciality staff. Paediatric nursing is a popular career choice for graduates, and anecdotal evidence suggests that nursing students who select a paediatric clinical placement in their final year intend to seek employment in paediatrics upon graduation. Although concepts of paediatric nursing are included within the undergraduate curriculum, students' ability to develop the required habits of mind to practice in what is still regarded as a speciality area of practice is somewhat limited. One of the areas of practice where this particularly has an impact is paediatric nursing physical assessment. Physical assessment is a fundamental component of nursing practice, and competence in this area is central to nursing students' development of clinical capability for practice as a registered nurse. Timely recognition of physiologic deterioration of patients is a key outcome of nurses' competent use of physical assessment strategies, regardless of the practice context. In paediatric nursing contexts, children's physical assessment practices must specifically accommodate the child's different physiological composition, function and pattern of clinical deterioration (Hockenberry & Barrera, 2007). Thus, to effectively manage physical assessment of patients within the paediatric practice setting, nursing students need to integrate paediatric nursing theory into their practice. This requires significant information processing, and it is in this process that students are frequently challenged. The provision of rules or models can guide practice and assist novice-level nurses to develop their capabilities (Benner, 1984; Benner, Hooper-Kyriakidis & Stannard, 1999). Nursing practice models are cognitive tools that represent simplified patterns of expert analysis employing concepts that suit the limited reasoning of the inexperienced, and can represent the 'rules' referred to by Benner (1984). Without a practice model of physical assessment, students are likely to be uncertain about how to proceed with data collection, the interpretation of paediatric clinical findings and the appraisal of findings. These circumstances can result in ad hoc and unreliable nursing physical assessment that forms a poor basis for nursing decisions. The educational intervention developed as part of this study sought to resolve this problem and support nursing students' development of competence in paediatric physical assessment.
Methods This study utilised the Context Input Process Product (CIPP) Model by Stufflebeam (2004) as the theoretical framework that underpinned the research design and evaluation methodology. Each of the four elements in the CIPP model was utilised to guide a discrete stage of this study. The Context element informed design of the clinical decision-making process, the Paediatric Nursing Physical Assessment (PNPA) model. The Input element was utilised in appraising relevant literature, identifying an appropriate instructional methodology to facilitate learning and educational intervention delivery to undergraduate nursing students, and development of program content (the CD-ROM kit). Study One employed the Process element and used expert panel approaches to review and refine instructional methods, identifying potential barriers to obtaining an effective evaluation outcome. The Product element guided design and implementation of Study Two, which was conducted in two phases. Phase One employed a quasi-experimental between-subjects methodology to evaluate the impact of the educational intervention on nursing students' clinical performance and self-appraisal of practices in paediatric physical assessment. Phase Two employed a thematic analysis and explored the experiences and perspectives of a sample subgroup of nursing students who used the PNPA CD-ROM kit as preparation for paediatric clinical placement. Results Results from the Process review in Study One indicated that the prototype CD-ROM kit containing the PNPA model met the predetermined benchmarks for face validity and that the impact evaluation instrumentation had adequate content validity in comparison with predetermined benchmarks. In the first phase of Study Two the educational intervention did not result in statistically significant differences in measures of student performance or self-appraisal of practice. However, in Phase Two qualitative commentary from students, and from the expert panel who reviewed the prototype CD-ROM kit (Study One, Phase One), strongly endorsed the quality of the intervention and its potential for supporting learning. This raises questions regarding transfer of learning, and it is likely that, within this study, several factors influenced students' transfer of learning from the educational intervention to the clinical practice environment, where outcomes were measured. Conclusion In summary, the educational intervention employed in this study provides insights into the potential e-learning approaches offer for delivering authentic learning experiences to undergraduate nursing students. Findings in this study raise important questions regarding possible pedagogical influences on learning outcomes, issues within the transfer of theory to practice, and factors that may have influenced findings within the context of this study. This study makes a unique contribution to nursing education, specifically with respect to progressing an understanding of the challenges faced in employing instructive methods to impact upon nursing students' development of competence. The important contribution transfer-of-learning processes make to students' transition into the professional practice context, and to their development of competence within the context of speciality practice, is also highlighted. This study contributes to a greater awareness of the complexity of translating theoretical learning at undergraduate level into clinical practice, particularly within speciality contexts.
Abstract:
In an automotive environment, the performance of a speech recognition system is affected by environmental noise if the speech signal is acquired directly from a microphone. Speech enhancement techniques are therefore necessary to improve the speech recognition performance. In this paper, a field-programmable gate array (FPGA) implementation of dual-microphone delay-and-sum beamforming (DASB) for speech enhancement is presented. As the first step towards a cost-effective solution, the implementation described in this paper uses a relatively high-end FPGA device to facilitate the verification of various design strategies and parameters. Experimental results show that the proposed design can produce output waveforms close to those generated by a theoretical (floating-point) model with modest usage of FPGA resources. Speech recognition experiments are also conducted on enhanced in-car speech waveforms produced by the FPGA in order to compare recognition performance with the floating-point representation running on a PC.
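As a point of reference for the technique, below is a minimal floating-point sketch of dual-microphone delay-and-sum beamforming, analogous to the theoretical model against which the FPGA output is compared; the steering delay and the toy signals are assumptions for illustration:

```python
import numpy as np

def delay_and_sum(mic1, mic2, delay):
    """Shift mic2 by `delay` samples so both channels align on the desired
    source, then average: coherent speech adds up while uncorrelated noise
    partially cancels."""
    shifted = np.zeros_like(mic2)
    if delay >= 0:
        shifted[:len(mic2) - delay] = mic2[delay:]
    else:
        shifted[-delay:] = mic2[:len(mic2) + delay]
    return 0.5 * (mic1 + shifted)

# Toy demo: the same tone reaches mic2 three samples later than mic1.
rng = np.random.default_rng(1)
n, delay = 1000, 3
s = np.sin(2 * np.pi * 5 * np.arange(n) / n)             # stand-in "speech"
mic1 = s + 0.3 * rng.normal(size=n)
mic2 = np.concatenate([np.zeros(delay), s[:-delay]]) + 0.3 * rng.normal(size=n)
out = delay_and_sum(mic1, mic2, delay)
print('noise variance before:', np.var(mic1 - s), 'after:', np.var(out - s))
```

An FPGA implementation such as the one described above performs the same shift-and-average operation in fixed-point arithmetic, which is why its output is compared against a floating-point model like this.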
Abstract:
Modern computer graphics systems are able to construct renderings of such high quality that viewers are deceived into regarding the images as coming from a photographic source. Large amounts of computing resources are expended in this rendering process, using complex mathematical models of lighting and shading. However, psychophysical experiments have revealed that viewers attend to only certain informative regions within a presented image. Furthermore, it has been shown that these visually important regions contain low-level visual feature differences that attract the attention of the viewer. This thesis will present a new approach to image synthesis that exploits these experimental findings by modulating the spatial quality of image regions by their visual importance. Efficiency gains are therefore reaped, without sacrificing much of the perceived quality of the image. Two tasks must be undertaken to achieve this goal: firstly, the design of an appropriate region-based model of visual importance, and secondly, the modification of progressive rendering techniques to effect an importance-based rendering approach. A rule-based fuzzy logic model is presented that computes, using spatial feature differences, the relative visual importance of regions in an image. This model improves upon previous work by incorporating threshold effects induced by global feature difference distributions and by using texture concentration measures. A modified approach to progressive ray-tracing is also presented. This new approach uses the visual importance model to guide the progressive refinement of an image. In addition, this concept of visual importance has been incorporated into supersampling, texture mapping and computer animation techniques. Experimental results are presented, illustrating the efficiency gains reaped from using this method of progressive rendering. This visual importance-based rendering approach is expected to have applications in the entertainment industry, where image fidelity may be sacrificed for efficiency purposes, as long as the overall visual impression of the scene is maintained. Different aspects of the approach should find many other applications in image compression, image retrieval, progressive data transmission and active robotic vision.
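For a flavour of how a rule-based fuzzy model can score a region's visual importance from feature differences, here is a toy sketch; the membership functions, rules and inputs are invented for illustration and are not the thesis' model:

```python
def tri(x, a, b, c):
    """Triangular fuzzy membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def importance(contrast_diff, texture_conc):
    """Score a region from two (assumed) feature differences using
    min for AND and a weighted centroid to defuzzify."""
    # Rule 1: IF contrast difference is high AND texture concentration is
    # low THEN importance is high.
    rule1 = min(tri(contrast_diff, 0.4, 1.0, 1.6),
                tri(texture_conc, -0.6, 0.0, 0.6))
    # Rule 2: IF contrast difference is low THEN importance is low.
    rule2 = tri(contrast_diff, -0.6, 0.0, 0.6)
    return (rule1 * 0.9 + rule2 * 0.1) / max(rule1 + rule2, 1e-9)

print(importance(0.8, 0.2))   # a salient, smooth region scores high
```

In an importance-based renderer, scores like these would drive the allocation of progressive refinement effort (e.g. ray samples) across image regions.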
Abstract:
Games and related virtual environments have been a much-hyped area of the entertainment industry. The classic quote is that games are now approaching the size of Hollywood box office sales [1]. Books are now appearing that talk up the influence of games on business [2], and games are one of the key drivers of present hardware development. Some of this 3D technology is now embedded right down at the operating system level via the Windows Presentation Foundation – hit Windows/Tab on your Vista box to find out... In addition to this continued growth in the area of games, there are a number of factors that affect its development in the business community. Firstly, the average age of gamers is approaching the mid-thirties, so a number of people who are in management positions in large enterprises are experienced in using 3D entertainment environments. Secondly, due to the pressure of demand for more computational power in both CPUs and Graphics Processing Units (GPUs), the average desktop, and any decent laptop, can run a game or virtual environment. In fact, the demonstrations at the end of this paper were developed at the Queensland University of Technology (QUT) on a standard Software Operating Environment, with an Intel Dual Core CPU and a basic Intel graphics option. What this means is that the potential exists for the easy uptake of such technology because 1. a broad range of workers is regularly exposed to 3D virtual environment software via games; and 2. present desktop computing power is now strong enough to roll out a virtual environment solution across an entire enterprise. We believe such visual simulation environments can have a great impact in the area of business process modeling. Accordingly, in this article we outline the communication capabilities of such environments, which open up fantastic possibilities for business process modeling applications, where enterprises need to create, manage and improve their business processes, and then communicate those processes to stakeholders, both process and non-process cognizant. The article concludes with a demonstration of the work we are doing in this area at QUT.
Abstract:
Information Retrieval is an important albeit imperfect component of information technologies. A problem of insufficient diversity of retrieved documents is one of the primary issues studied in this research. This study shows that this problem leads to a decrease of precision and recall, traditional measures of information retrieval effectiveness. This thesis presents an adaptive IR system based on the theory of adaptive dual control. The aim of the approach is the optimization of retrieval precision after all feedback has been issued, which is done by increasing the diversity of retrieved documents. This study shows that the value of recall reflects this diversity. The Probability Ranking Principle is viewed in the literature as the “bedrock” of current probabilistic Information Retrieval theory. Neither the proposed approach nor other methods from the literature for diversifying retrieved documents conform to this principle. This study shows by counterexample that the Probability Ranking Principle does not in general lead to optimal precision in a search session with feedback (a setting for which it may not have been designed but in which it is actively used). Retrieval precision of the search session should be optimized with a multistage stochastic programming model to accomplish this aim. However, such models are computationally intractable. Therefore, approximate linear multistage stochastic programming models are derived in this study, where the multistage improvement of the probability distribution is modelled using the proposed feedback correctness method. The proposed optimization models are based on several assumptions, starting with the assumption that Information Retrieval is conducted in units of topics. The use of clusters is the primary reason why a new method of probability estimation is proposed. The adaptive dual control of the topic-based IR system was evaluated in a series of experiments conducted on the Reuters, Wikipedia and TREC collections of documents. The Wikipedia experiment revealed that the dual control feedback mechanism improves precision and S-recall when all the underlying assumptions are satisfied. In the TREC experiment, this feedback mechanism was compared to a state-of-the-art adaptive IR system based on BM-25 term weighting and the Rocchio relevance feedback algorithm. The baseline system exhibited better effectiveness than the cluster-based optimization model of ADTIR; the main reason for this was the insufficient quality of the clusters generated from the TREC collection, which violated the underlying assumption.
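For context on the baseline, this is a minimal sketch of the standard Rocchio relevance-feedback update used alongside BM-25 in the comparison system; the alpha, beta and gamma values are the common textbook defaults, not taken from the thesis:

```python
import numpy as np

def rocchio(query_vec, relevant, nonrelevant, alpha=1.0, beta=0.75, gamma=0.15):
    """Move the query vector toward the centroid of relevant documents and
    away from the centroid of non-relevant ones."""
    q = alpha * query_vec
    if len(relevant):
        q = q + beta * np.mean(relevant, axis=0)
    if len(nonrelevant):
        q = q - gamma * np.mean(nonrelevant, axis=0)
    return np.maximum(q, 0.0)   # negative term weights are usually clipped

q0 = np.array([1.0, 0.0, 0.5])                       # toy term-weight vector
rel = np.array([[0.9, 0.1, 0.7], [0.8, 0.0, 0.6]])   # judged relevant docs
nonrel = np.array([[0.0, 0.9, 0.1]])                 # judged non-relevant doc
print(rocchio(q0, rel, nonrel))
```

Because this update reinforces whatever the relevant documents already have in common, it tends to concentrate rather than diversify results, which is the behaviour the thesis' dual-control approach is designed to counteract.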
Abstract:
In the quest for shorter time-to-market, higher quality and reduced cost, model-driven software development has emerged as a promising approach to software engineering. The central idea is to promote models to first-class citizens in the development process. Development starts from a set of very abstract models in the early stage, which are refined into more concrete models and finally, as a last step, into code. As early phases of development focus on different concepts compared to later stages, various modelling languages are employed to most accurately capture the concepts and relations under discussion. In light of this refinement process, translating between modelling languages becomes a time-consuming and error-prone necessity. This is remedied by model transformations providing support for reusing and automating recurring translation efforts. These transformations typically can only be used to translate a source model into a target model, but not vice versa. This poses a problem if the target model is subject to change. In this case the models get out of sync and therefore no longer constitute a coherent description of the software system, leading to erroneous results in later stages. This is a serious threat to the promised benefits of quality, cost-saving and time-to-market. Therefore, providing a means to restore synchronisation after changes to models is crucial if the model-driven vision is to be realised. This process of reflecting changes made to a target model back to the source model is commonly known as Round-Trip Engineering (RTE). While there are a number of approaches to this problem, they impose restrictions on the nature of the model transformation. Typically, in order for a transformation to be reversed, for every change to the target model there must be exactly one change to the source model. While this makes synchronisation relatively “easy”, it is ill-suited for many practically relevant transformations as they do not have this one-to-one character. To overcome these issues and to provide a more general approach to RTE, this thesis puts forward an approach in two stages. First, a formal understanding of model synchronisation on the basis of non-injective transformations (where a number of different source models can correspond to the same target model) is established. Second, detailed techniques are devised that allow the implementation of this understanding of synchronisation. A formal underpinning for these techniques is drawn from abductive logic reasoning, which allows the inference of explanations from an observation in the context of a background theory. As non-injective transformations are the subject of this research, there might be a number of changes to the source model that all equally reflect a certain target model change. To help guide the procedure in finding “good” source changes, model metrics and heuristics are investigated. Combining abductive reasoning with best-first search and a “suitable” heuristic enables efficient computation of a number of “good” source changes. With this procedure Round-Trip Engineering of non-injective transformations can be supported.
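To illustrate the search component, here is a small sketch (not the thesis' abductive machinery) of best-first search over candidate source edits guided by a heuristic; it recovers several distinct source changes that are all consistent with one target change under a non-injective transformation, and the toy transformation and edit operators are invented for illustration:

```python
import heapq

def round_trip_search(initial_source, target, transform, candidate_edits,
                      heuristic, max_results=3):
    """Best-first search for source models whose transformation equals the
    observed target; with a non-injective transform, several distinct
    source changes may explain the same target change."""
    frontier = [(heuristic(initial_source, target), 0, initial_source)]
    tiebreak = 1
    seen = {repr(initial_source)}
    results = []
    while frontier and len(results) < max_results:
        _, _, source = heapq.heappop(frontier)
        if transform(source) == target:
            results.append(source)      # one plausible source change found
            continue
        for edit in candidate_edits(source):
            nxt = edit(source)
            key = repr(nxt)
            if key not in seen:
                seen.add(key)
                heapq.heappush(frontier, (heuristic(nxt, target), tiebreak, nxt))
                tiebreak += 1
    return results

# Toy non-injective transformation: the target keeps only the first field,
# so many different source models map to the same target.
transform = lambda src: src[0]
candidate_edits = lambda src: [lambda s: (s[0] + 1, s[1]),
                               lambda s: (s[0], s[1] + 1)]
heuristic = lambda src, tgt: abs(src[0] - tgt)
print(round_trip_search((0, 0), 3, transform, candidate_edits, heuristic))
```

The printed result contains several source models that all map to the same target, mirroring the thesis' point that a "good" source change must be selected among equally valid explanations, which is where model metrics and heuristics come in.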