940 results for Interpreters’ Intercultural Mediation Process
Abstract:
Background: This study explores the moderating and mediating roles of caregiver self-efficacy in the relationships between subjective caregiver burden and (a) the behavioral and psychological symptoms of dementia (BPSD) and (b) social support. Methods: A cross-sectional study of 137 spouse caregivers of dementia patients was conducted in Shanghai. We collected demographic information for the caregiver–patient dyads, together with measures of dementia-related impairments, caregiver social support, caregiver self-efficacy, and the SF-36. Results: Multiple regression analysis showed that caregiver self-efficacy moderated both the relationship between BPSD and subjective caregiver burden and the relationship between social support and subjective caregiver burden. Results also showed that caregiver self-efficacy partially mediated the impact of BPSD on subjective caregiver burden and mediated the impact of social support on subjective caregiver burden. Caregiver self-efficacy and subjective burden significantly influenced BPSD and social support. Conclusion: Caregiver self-efficacy played an important role in the paths by which the two factors influenced subjective burden. Enhancing caregiver self-efficacy for symptom management (particularly of BPSD) can be an essential strategy when designing interventions to support dementia caregivers in China, and possibly in other countries.
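The moderation and mediation analyses described above can be sketched with ordinary regression tooling. A minimal illustration in Python with statsmodels, assuming hypothetical column names (burden, bpsd, support, self_efficacy) and an invented data file; this is not the authors' code or data:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical dataset: one row per caregiver (file name and columns are invented).
df = pd.read_csv("caregiver_dyads.csv")

# Moderation: self-efficacy moderates the BPSD -> burden relationship,
# expressed as an interaction term in an ordinary least-squares regression.
moderation = smf.ols("burden ~ bpsd * self_efficacy + support", data=df).fit()
print(moderation.summary())

# Mediation (Baron & Kenny style): BPSD -> self-efficacy -> burden.
path_a = smf.ols("self_efficacy ~ bpsd", data=df).fit()            # predictor -> mediator
path_bc = smf.ols("burden ~ bpsd + self_efficacy", data=df).fit()  # predictor + mediator -> outcome
indirect = path_a.params["bpsd"] * path_bc.params["self_efficacy"]
print("indirect effect (a*b):", indirect)
```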
Abstract:
This research contributes a fully operational approach for managing business process risk in near real time. The approach consists of a language for defining risks on top of process models, a technique for detecting such risks as they eventuate during the execution of business processes, a recommender system for making risk-informed decisions, and a technique for automatically mitigating detected risks when they are no longer tolerable. By incorporating risk management elements in all stages of the business process lifecycle, this work contributes to the effective integration of the fields of Business Process Management and Risk Management.
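The abstract does not specify the syntax of the risk definition language, so the following is only a toy sketch of the general idea: risks are defined as conditions over running process instances, and a mitigation is triggered once a risk exceeds its tolerance. All names, attributes and thresholds are invented:

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class ProcessInstance:
    case_id: str
    attributes: Dict[str, float] = field(default_factory=dict)

@dataclass
class Risk:
    name: str
    condition: Callable[[ProcessInstance], float]   # returns a risk likelihood in [0, 1]
    tolerance: float                                # above this, mitigation is triggered
    mitigation: Callable[[ProcessInstance], None]

def monitor(instance: ProcessInstance, risks: List[Risk]) -> None:
    """Evaluate each defined risk against a running instance and mitigate if intolerable."""
    for risk in risks:
        if risk.condition(instance) > risk.tolerance:
            risk.mitigation(instance)

# Illustrative example: an overtime risk on an order-fulfilment process.
overtime = Risk(
    name="overtime",
    condition=lambda i: min(i.attributes.get("elapsed_hours", 0) / 48.0, 1.0),
    tolerance=0.8,
    mitigation=lambda i: i.attributes.update(escalated=1.0),
)
monitor(ProcessInstance("case-17", {"elapsed_hours": 45}), [overtime])
```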
Abstract:
The purpose of this paper is to review existing knowledge management (KM) practices within the field of asset management, identify gaps, and propose a new approach to managing knowledge for asset management. Existing approaches to KM in asset management are incomplete, focusing primarily on the application of data and information systems, for example the use of an asset register. It is contended that these approaches provide access to explicit knowledge and overlook the importance of tacit knowledge acquisition, sharing and application. In doing so, current KM approaches within asset management tend to neglect the significance of relational factors, whereas studies in the knowledge management field have shown that relational modes such as social capital are imperative for effective KM outcomes. In this paper, we argue that incorporating a relational approach to KM is more likely to contribute to the exchange of ideas and the development of creative responses necessary to improve decision-making in asset management. This conceptual paper uses extant literature to explain knowledge management antecedents and explore its outcomes in the context of asset management. KM is a component of the new Integrated Strategic Asset Management (ISAM) framework, developed in conjunction with asset management industry associations (AAMCoG, 2012), which improves asset management performance. In this paper we use Nahapiet and Ghoshal’s (1998) model to explain the antecedents of a relational approach to knowledge management. Further, we develop an argument that relational knowledge management is likely to contribute to the improvement of ISAM framework components such as Organisational Strategic Management and Service Planning and Delivery. The main contribution of the paper is a novel and robust approach to managing knowledge that leads to improved asset management outcomes.
Abstract:
Process modelling is an integral part of any process industry. Several sugar factory models have been developed over the years to simulate the unit operations. An enhanced and comprehensive milling process simulation model has been developed to analyse the performance of the milling train and to assess the impact of changes and advanced control options for improved operational efficiency. The developed model is incorporated in the proprietary software package ‘SysCAD’. As an example, the milling process model has been used to predict a significant loss of extraction when the cush from the juice screen is returned before #3 mill instead of before #2 mill, as is more commonly done. Further work is being undertaken to model extraction processes in a milling train more accurately, to examine extraction issues dynamically, and to integrate the model into a whole-factory model.
Abstract:
Purpose – Context-awareness has emerged as an important principle in the design of flexible business processes. The goal of the research is to develop an approach that extends context-aware business process modeling toward location-awareness. The purpose of this paper is to identify and conceptualize location-dependencies in process modeling. Design/methodology/approach – This paper uses a pattern-based approach to identify location-dependencies in process models. The authors design specifications for these patterns, present illustrative examples, and evaluate the identified patterns through a literature review of published process cases. Findings – This paper introduces location-awareness as a new perspective that extends context-awareness in BPM research, introducing relevant concepts such as location-awareness and location-dependencies. The authors identify five basic location-dependent control-flow patterns that can be captured in process models, and they identify location-dependencies in several existing case studies of business processes. Research limitations/implications – The authors focus exclusively on the control-flow perspective of process models. Further work is needed to extend the research to location-dependencies in process data or resources, and further empirical work is needed to explore the determinants and consequences of modeling location-dependencies. Originality/value – As existing literature mostly focuses on the broad context of business processes, location is still treated as a “second-class citizen” in process modeling, in theory and in practice. This paper discusses the vital role of location-dependencies within business processes. The proposed five basic location-dependent control-flow patterns are novel and useful for explaining location-dependency in business process models, and they provide a conceptual basis for further exploration of location-awareness in the management of business processes.
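As a rough illustration of what a location-dependent control-flow decision might look like at execution time, consider the following sketch; the activities and locations are hypothetical and not taken from the paper:

```python
from dataclasses import dataclass

@dataclass
class Case:
    case_id: str
    location: str  # the site where the case currently resides

def next_activity(case: Case) -> str:
    """A location-dependent XOR split: the branch taken depends on where the case is."""
    if case.location == "warehouse":
        return "pick_goods"
    if case.location == "customer_site":
        return "install_on_site"
    return "schedule_visit"   # default branch when no location constraint matches

print(next_activity(Case("c-01", "warehouse")))       # -> pick_goods
print(next_activity(Case("c-02", "customer_site")))   # -> install_on_site
```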
Abstract:
Empirical evidence shows that repositories of business process models used in industrial practice contain significant amounts of duplication. This duplication arises for example when the repository covers multiple variants of the same processes or due to copy-pasting. Previous work has addressed the problem of efficiently retrieving exact clones that can be refactored into shared subprocess models. This article studies the broader problem of approximate clone detection in process models. The article proposes techniques for detecting clusters of approximate clones based on two well-known clustering algorithms: DBSCAN and Hierarchical Agglomerative Clustering (HAC). The article also defines a measure of standardizability of an approximate clone cluster, meaning the potential benefit of replacing the approximate clones with a single standardized subprocess. Experiments show that both techniques, in conjunction with the proposed standardizability measure, accurately retrieve clusters of approximate clones that originate from copy-pasting followed by independent modifications to the copied fragments. Additional experiments show that both techniques produce clusters that match those produced by human subjects and that are perceived to be standardizable.
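A sketch of how such clusters could be retrieved once a pairwise distance between process-model fragments is available (for instance, a normalised graph-edit distance). The distance matrix and parameters below are invented for illustration and are not those used in the article:

```python
import numpy as np
from sklearn.cluster import DBSCAN, AgglomerativeClustering

# Pairwise distances between process-model fragments; values are invented.
D = np.array([
    [0.00, 0.10, 0.15, 0.80, 0.85],
    [0.10, 0.00, 0.12, 0.82, 0.88],
    [0.15, 0.12, 0.00, 0.79, 0.84],
    [0.80, 0.82, 0.79, 0.00, 0.09],
    [0.85, 0.88, 0.84, 0.09, 0.00],
])

# Density-based clustering of approximate clones.
dbscan_labels = DBSCAN(eps=0.2, min_samples=2, metric="precomputed").fit_predict(D)

# Hierarchical agglomerative clustering on the same distance matrix
# (older scikit-learn releases name this parameter `affinity`).
hac_labels = AgglomerativeClustering(
    n_clusters=2, metric="precomputed", linkage="average"
).fit_predict(D)

print(dbscan_labels, hac_labels)  # fragments 0-2 and 3-4 form two clone clusters
```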
Abstract:
The textual turn is a good friend of expert spectating, where it assumes the role of writing-productive apparatus, but no friend at all of expert practices or practitioners (Melrose, 2003). Introduction: the challenge of time-based embodied performance when the artefact is unstable. As a former full-time professional practitioner with an embodied dance practice as performer, choreographer and artistic director for three decades, I somewhat unexpectedly entered the world of academia in 2000 after completing a practice-based PhD, which was described by its examiners as ‘pioneering’. Like many artists, my intention was to deepen and extend my practice through formal research into my work and its context (which was intercultural) and to privilege the artist’s voice in a research world where it was too often silent. Practice as research, practice-based research and practice-led research were not yet fully named; the field was in its infancy, and my biggest challenge was to find a serviceable methodology which did not betray my intention to keep practice at the centre of the research. Over the last 15 years, practice-led doctoral research, where examinable creative work is placed alongside an accompanying (exegetical) written component, has come a long way. It has been extensively debated, with a range of theories and models proposed (Barrett & Bolt, 2007; Pakes, 2003, 2004; Piccini, 2005; Philips, Stock & Vincs, 2009; Stock, 2009, 2010; Riley & Hunter, 2009; Haseman, 2006; Hecq, 2012). Much of this writing is based around epistemological concerns, where the research methodologies proposed normally incorporate a contextualisation of the creative work in its field of practice and, more importantly, validation and interrogation of the processes of the practice as the central ‘data gathering’ method. It is now widely accepted, at least in the Australian creative arts context, that knowledge claims in creative practice research arise from the material activities of the practice itself (Carter, 2004). The creative work explicated as the tangible outcome of that practice is sometimes referred to as the ‘artefact’. Although the making of the artefact, according to Colbert (2009, p. 7), is influenced by “personal, experiential and iterative processes”, mapping them through a research pathway is “difficult to predict [for] the adjustments made to the artefact in the light of emerging knowledge and insights cannot be foreshadowed”. Linking the process and the practice outcome most often occurs through the textual intervention of an exegesis which builds, and/or builds on, theoretical concerns arising in and from the work. This linking produces what Barrett (2007) refers to as “situated knowledge… that operates in relation to established knowledge” (p. 145). But what if those material forms or ‘artefacts’ are not objects or code or digitised forms, but live within the bodies of artist/researchers, where the nature of the practice itself is live, ephemeral and constantly transforming, as in dance and physical performance? Even more unsettling is when the ‘artefact’ is literally embedded and embodied in the work and in the maker/researcher; when subject and object are merged. To complicate matters, the performing arts are necessarily collaborative, relying not only on technical mastery and creative/interpretive processes, but on social and artistic relationships which collectively make up the ‘artefact’.
This chapter explores issues surrounding live dance and physical performance when placed in a research setting, specifically the complexities of being required to translate embodied dance findings into textual form. Exploring how embodied knowledge can be shared in a research context with those who have no experiential knowledge of communicating through and in dance, I draw on theories of “dance enaction” (Warburton, 2011) together with notions of “affective intensities” and “performance mastery” (Melrose, 2003), “intentional activity” (Pakes, 2004) and the place of memory. In seeking ways to capture in another form the knowledge residing in live dance practice, thus making implicit knowledge explicit, I further propose that there is a process of triple translation as the performance (the living ‘artefact’) is documented in multi-faceted ways to produce something durable which can be revisited. This translation becomes more complex if the embodied knowledge resides in culturally specific practices, formed by world views and processes quite different from the accepted norms and conventions (even radical ones) of international doctoral research inquiry. But whatever the combination of cultural, virtual and genre-related dance practices being researched, embodiment is central to the process, outcome and findings, and the question remains of how we will use text and what forms that text might take.
Abstract:
In this chapter we describe a critical fairytales unit taught to 4.5- to 5.5-year-olds in a context of intensifying pressure to raise literacy achievement. The unit was infused with lessons on reinterpreted fairytales followed by process drama activities built around a sophisticated picture book, Beware of the Bears (MacDonald, 2004). The latter entailed a text-analytic approach to critical literacy derived from systemic functional linguistics (Halliday, 1978; Halliday & Matthiessen, 2004). This approach provides a way of analysing how words and discourse are used to represent the world in a particular way and to shape reader relations with the author in a particular field (Janks, 2010).
Abstract:
This article will discuss some real-life case examples of what will be termed “lawyers behaving badly”, where it will be argued that legal representatives have not performed as effectively as they could have in mediation settings. These instances of “lawyer misbehaviour” will be grouped under several broad headings: the Process Thwarter, the Zealous Adversarial Advocate, the Misguided Advisor, the Distributive Bargainer, the Passive Advocate, and the Legal Takeover. Reflecting on these situations will provide guidance to legal educators as to the specific areas of dispute resolution knowledge and skills that future lawyers need to learn and develop.
Abstract:
Structural equation modeling (SEM) is a versatile multivariate statistical technique, and its applications have been increasing since its introduction in the 1980s. This paper provides a critical review of 84 articles that used SEM to address construction-related problems over the period 1998–2012, drawn from, but not limited to, seven top construction research journals. A yearly publication trend analysis shows that SEM applications have been accelerating over time. However, there are inconsistencies across the recorded applications and several recurring problems exist. The important issues that need to be considered in research design, model development and model evaluation are examined and discussed in detail with reference to current applications. A particularly important issue concerns construct validity. Relevant topics for efficient research design also include longitudinal versus cross-sectional studies, mediation and moderation effects, sample size, and software selection. A guideline framework is provided to help future researchers apply SEM in construction research.
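For readers unfamiliar with SEM tooling, a minimal sketch using the Python package semopy; the model specification, variable names and data file are invented for illustration and are unrelated to the reviewed construction studies:

```python
import pandas as pd
from semopy import Model

# Hypothetical survey data with observed indicators x1..x3 (e.g. safety climate items)
# and y1..y2 (e.g. project performance items); file and columns are invented.
data = pd.read_csv("survey.csv")

# Measurement model (latent =~ indicators) and structural model (latent ~ latent).
description = """
Climate =~ x1 + x2 + x3
Performance =~ y1 + y2
Performance ~ Climate
"""

model = Model(description)
model.fit(data)
print(model.inspect())   # parameter estimates, standard errors, p-values
```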
Abstract:
Process models describe someone’s understanding of processes. Processes can be described using unstructured, semi-formal or diagrammatic representation forms. These representations are used in a variety of task settings, ranging from understanding processes to executing or improving them, with the implicit assumption that the chosen representation form will be appropriate for all task settings. We explore the validity of this assumption by empirically examining the preference for different process representation forms depending on the task setting and the cognitive style of the user. Based on data collected from 120 business school students, we show that preferences for process representation formats vary depending on the application purpose and the cognitive styles of the participants. However, users consistently prefer diagrams over other representation formats. Our research informs a broader research agenda on task-specific applications of process modeling, and we offer several recommendations for further research in this area.
Abstract:
To effectively manage the challenges faced by construction organisations in a fast-changing business environment, many organisations are attempting to integrate knowledge management (KM) into their business operations. KM activities interact with each other and form a process which receives input from its internal business environment and produces outputs that should be justified by business performance. This paper aims to provide further understanding of the dynamic nature of the KM process. Through a combination of path analysis and system dynamics simulation, this study found that: 1) improved business performance enables active KM activities and provides feedback and guidance for formulating learning-based policies; and 2) effective human resource recruitment policies can enlarge the pool of individual knowledge, which leads to a more conducive internal business environment as well as a higher level of KM activity. Consequently, the desired business performance level can be reached within a shorter time frame.
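The feedback loop described (better performance enabling more KM activity, which enlarges the knowledge pool and in turn lifts performance) can be illustrated with a toy stock-and-flow simulation; every equation and coefficient below is invented purely for illustration:

```python
# Toy system-dynamics model (Euler integration); all coefficients are illustrative only.
knowledge = 10.0      # stock: pool of individual knowledge
performance = 1.0     # stock: business performance
dt, horizon = 0.25, 40.0

for _ in range(int(horizon / dt)):
    km_activity = 0.5 * performance            # better performance enables more KM activity
    recruitment = 0.5                          # inflow from the HR recruitment policy
    knowledge += dt * (recruitment + 0.2 * km_activity - 0.1 * knowledge)
    performance += dt * (0.05 * knowledge - 0.1 * performance)

print(f"knowledge={knowledge:.2f}, performance={performance:.2f}")
```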
Abstract:
Today’s information systems log vast amounts of data. These collections of data (implicitly) describe events (e.g. placing an order or taking a blood test) and hence provide information on the actual execution of business processes. The analysis of such data provides an excellent starting point for business process improvement. This is the realm of process mining, an area which has produced a rich repertoire of analysis techniques. Despite the impressive capabilities of existing process mining algorithms, dealing with the abundance of data recorded by contemporary systems and devices remains a challenge. Of particular importance is the capability to guide process analysts in the meaningful interpretation of “oceans of data”. To this end, insights from the field of visual analytics can be leveraged. This article proposes an approach in which process states are reconstructed from event logs and visualised in succession, leading to an animated history of a process. The approach is customisable in how a process state, partially defined through a collection of activity instances, is visualised: one can select a map and specify a projection of events onto this map based on the properties of the events. This paper describes a comprehensive implementation of the proposal, realised using the open-source process mining framework ProM, and reports on an evaluation of the approach conducted with Suncorp, one of Australia’s largest insurance companies.
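The core idea of reconstructing a succession of process states from an event log can be sketched with pandas: at each sampled instant, the activity instances that have started but not yet completed constitute the state, which could then be projected onto a map for animation. The tiny log and column names below are invented:

```python
import pandas as pd

# A tiny, invented event log: one row per activity instance with start/complete times.
log = pd.DataFrame({
    "case":     ["c1", "c1", "c2", "c2"],
    "activity": ["register", "assess", "register", "pay"],
    "start":    pd.to_datetime(["2024-01-01 09:00", "2024-01-01 10:00",
                                "2024-01-01 09:30", "2024-01-01 11:00"]),
    "complete": pd.to_datetime(["2024-01-01 09:45", "2024-01-01 12:00",
                                "2024-01-01 10:30", "2024-01-01 11:20"]),
})

# Reconstruct the process state at a succession of sample times: the set of activity
# instances active at each instant. Visualising these states one after another yields
# an animated history of the process.
samples = pd.date_range(log["start"].min(), log["complete"].max(), freq="30min")
for t in samples:
    active = log[(log["start"] <= t) & (log["complete"] > t)]
    print(t, list(zip(active["case"], active["activity"])))
```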
Abstract:
In 2007 some of us were fortunate enough to be in Dundee for the Royal College of Nursing’s Annual International Nursing Research Conference. A highlight of that conference was an enactment of the process and context debate. The chair asked for volunteers, and various members of the audience came forward, giving the impression that they were nurses and that the selection was by chance. The audience accepted these individuals as their representatives, and once they had gathered on stage we all expected the debate to begin. But the large number of researchers in the audience gave little thought to the selection and recruitment process they had just witnessed. Then the selected representatives stood up and sang a cappella. Suddenly the context was different and we questioned the process. The point was made: process or context, or both?
Abstract:
In-memory databases have become a mainstay of enterprise computing, offering significant performance and scalability boosts for online analytical and (to a lesser extent) transactional processing, as well as improved prospects for integration across different applications through an efficient shared database layer. Significant research and development has been undertaken over several years concerning the data management considerations of in-memory databases. However, limited insights are available on the impacts on applications and their supportive middleware platforms, and on how these need to evolve to fully function through, and leverage, in-memory database capabilities. This paper provides a first comprehensive exposition of how in-memory databases impact Business Process Management, as a mission-critical and exemplary model-driven integration and orchestration middleware. Through it, we argue that in-memory databases will render some prevalent uses of legacy BPM middleware obsolete, but will also open up exciting possibilities for tighter application integration, better process automation performance and some entirely new BPM capabilities such as process-based application customization. To validate the feasibility of in-memory BPM, we develop a surprisingly simple BPM runtime embedded in SAP HANA that provides BPMN-based process automation capabilities.
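The paper's runtime is embedded in SAP HANA; as a generic, hypothetical illustration of keeping process-instance state directly in an in-memory relational store (using SQLite's in-memory mode here, not HANA, and not the authors' implementation):

```python
import sqlite3

# Generic illustration only: process-instance state held in an in-memory relational store.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE process_instance (
        case_id  TEXT PRIMARY KEY,
        activity TEXT,   -- currently enabled activity in the (hypothetical) BPMN model
        status   TEXT    -- e.g. running / completed
    )
""")

def start_case(case_id: str, first_activity: str) -> None:
    conn.execute("INSERT INTO process_instance VALUES (?, ?, 'running')",
                 (case_id, first_activity))

def advance(case_id: str, next_activity: str) -> None:
    # The process token moves by updating shared state in place; analytics can query
    # the same table without extracting data into a separate store.
    conn.execute("UPDATE process_instance SET activity = ? WHERE case_id = ?",
                 (next_activity, case_id))

start_case("order-1", "check_credit")
advance("order-1", "ship_goods")
print(conn.execute("SELECT * FROM process_instance").fetchall())
```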