988 results for Dominant process
Abstract:
Today’s information systems log vast amounts of data. These collections of data (implicitly) describe events (e.g. placing an order or taking a blood test) and, hence, provide information on the actual execution of business processes. The analysis of such data provides an excellent starting point for business process improvement. This is the realm of process mining, an area which has provided a repertoire of many analysis techniques. Despite the impressive capabilities of existing process mining algorithms, dealing with the abundance of data recorded by contemporary systems and devices remains a challenge. Of particular importance is the capability to guide the meaningful interpretation of “oceans of data” by process analysts. To this end, insights from the field of visual analytics can be leveraged. This article proposes an approach where process states are reconstructed from event logs and visualised in succession, leading to an animated history of a process. This approach is customisable in how a process state, partially defined through a collection of activity instances, is visualised: one can select a map and specify a projection of events on this map based on the properties of the events. This paper describes a comprehensive implementation of the proposal. It was realised using the open-source process mining framework ProM. Moreover, this paper also reports on an evaluation of the approach conducted with Suncorp, one of Australia’s largest insurance companies.
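To make the state-reconstruction idea concrete, below is a minimal sketch in Python; it is not the ProM plug-in described in the paper, and the event log fields (case_id, activity, start, complete, resource) and the resource-keyed map are assumptions for illustration only.

```python
from datetime import datetime

# Illustrative event log: each record is one activity instance with a case id,
# activity name, start/complete timestamps and an arbitrary property (resource).
log = [
    {"case_id": "order-1", "activity": "Place order",  "start": "2023-05-01T09:00", "complete": "2023-05-01T09:05", "resource": "web"},
    {"case_id": "order-1", "activity": "Check credit", "start": "2023-05-01T09:10", "complete": "2023-05-01T10:00", "resource": "Alice"},
    {"case_id": "order-2", "activity": "Place order",  "start": "2023-05-01T09:30", "complete": "2023-05-01T09:32", "resource": "web"},
]

def state_at(log, t):
    """Reconstruct the process state at time t: all activity instances running at t."""
    return [e for e in log
            if datetime.fromisoformat(e["start"]) <= t < datetime.fromisoformat(e["complete"])]

def project(state, position_of):
    """Project a state onto a chosen 'map' via a property-based mapping of events."""
    return [(e["case_id"], e["activity"], position_of(e)) for e in state]

# Example map: events are placed according to the resource that performs them.
positions = {"web": (0, 0), "Alice": (3, 2)}
snapshot = project(state_at(log, datetime.fromisoformat("2023-05-01T09:31")),
                   lambda e: positions.get(e["resource"]))
print(snapshot)  # the two instances running at 09:31, each with its map coordinates

# An animated history is then just a sequence of such snapshots, one per frame.
```

Under these assumptions, swapping the map or the projection function corresponds to the customisation the abstract describes: the same reconstructed states can be laid out by resource, location, or any other event property.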
Abstract:
In 2007 some of us were fortunate enough to be in Dundee for the Royal College of Nursing’s Annual International Nursing Research Conference. A highlight of that conference was an enactment of the process and context debate. The chair asked for volunteers and various members of the audience came forward, giving the impression that they were nurses and that it was a chance selection. The audience accepted these individuals as their representatives and once they had gathered on stage we all expected the debate to begin. But the large number of researchers in the audience gave little thought to the selection and recruitment process they had just witnessed. Then the selected representatives stood up and sang a cappella. Suddenly the context was different and we questioned the process. The point was made: process or context, or both?
Abstract:
In-memory databases have become a mainstay of enterprise computing, offering significant performance and scalability boosts for online analytical and (to a lesser extent) transactional processing, as well as improved prospects for integration across different applications through an efficient shared database layer. Significant research and development has been undertaken over several years concerning the data management considerations of in-memory databases. However, limited insights are available on the impacts on applications and their supportive middleware platforms, and how they need to evolve to fully function through, and leverage, in-memory database capabilities. This paper provides a first, comprehensive exposition of how in-memory databases impact Business Process Management (BPM), as a mission-critical and exemplary model-driven integration and orchestration middleware. Through it, we argue that in-memory databases will render some prevalent uses of legacy BPM middleware obsolete, but also open up exciting possibilities for tighter application integration, better process automation performance and some entirely new BPM capabilities such as process-based application customization. To validate the feasibility of in-memory BPM, we develop a surprisingly simple BPM runtime embedded into SAP HANA that provides BPMN-based process automation capabilities.
Abstract:
As a key element in their response to new media forcing transformations in mass media and media use, newspapers have deployed various strategies not only to establish online and mobile products and develop healthy business plans, but also to set out to be dominant portals. Their response to change was the subject of an early investigation by one of the present authors (Keshvani 2000). That was part of a set of short studies inquiring into what impact new software applications and digital convergence might have on journalism practice (Tickle and Keshvani 2000), and also looking for demonstrations of the way that innovations, technologies and protocols then under development might produce a “wireless, streamlined electronic news production process” (Tickle and Keshvani 2001). The newspaper study compared the online products of The Age in Melbourne and the Straits Times in Singapore. It provided an audit of the Singapore and Australia Information and Communications Technology (ICT) climate, concentrating on the state of development of carrier networks as a determining factor in the potential strength of the two services within their respective markets. In the outcome, contrary to initial expectations, the early cable roll-out and extensive ‘wiring’ of the city in Singapore had not produced a level of uptake of Internet services as strong as that achieved in Melbourne by more ad hoc and varied strategies. By interpretation, while news websites and online content were at an early stage of development everywhere, and much the same as one another, no determining structural imbalance existed to separate these leading media participants in Australia and South-east Asia. The present research revisits that situation by again studying the online editions of the two large newspapers in the original study, and one other, The Courier Mail (recognising the diversification of types of product in this field, by including it as a representative of Newscorp, now a major participant). The inquiry works through the principle of comparison. It is an exercise in qualitative, empirical research that establishes a comparison between the situation in 2000 as described in the earlier work, and the situation in 2014, after a decade of intense development in digital technology affecting the media industries. It is in that sense a follow-up study on the earlier work, although this time giving emphasis to the content and style of the actual products as experienced by their users. It compares the online and print editions of each of these three newspapers; then the three mastheads as print and online entities, among themselves; and finally it compares one against the other two, as representing a South-east Asian model and Australian models. This exercise is accompanied by a review of literature on the developments in ICT affecting media production and media organisations, to establish the changed context. The new study of the online editions is conducted as a systematic appraisal of the first level, or principal screens, of the three publications, over the course of six days (10-15.2.14 inclusive). For this, categories for analysis were made through conducting a preliminary examination of the products over three days in the week before.
That process identified significant elements of media production, such as: variegated sourcing of materials; randomness in the presentation of items; differential production values among the media platforms considered, whether text, video or still images; the occasional repurposing and repackaging of the top news stories of the day; and the presence of standard news values – once again drawn out of the trial ‘bundle’ of journalistic items. Reduced in this way, the online artefacts become comparable with the companion print editions from the same days. The categories devised and then used in the appraisal of the online products have been adapted to print, to give the closest match of sets of variables. This device, to study the two sets of publications on like standards, essentially production values and news values, has enabled the comparisons to be made. This comparison of the online and print editions of each of the three publications was set up as the first step in the investigation. In recognition of the nature of the artefacts, as ones that carry very diverse information by subject and level of depth, and involve heavy creative investment in the formulation and presentation of the information, the assessment also includes an open section for interpreting and commenting on main points of comparison. This takes the form of a field for text, for the insertion of notes, in the table employed for summarising the features of each product, for each day. When the sets of comparisons outlined above are noted, the process then becomes interpretative, guided by the notion of change. In the context of changing media technology and publication processes, what substantive alterations have taken place in the overall effort of news organisations in the print and online fields since 2001, and in their print and online products separately? Have they diverged or continued along similar lines? The remaining task is to begin to make inferences from that. Will the examination of findings support the proposition that a review of the earlier study, and a forensic review of new models, does provide evidence of the character and content of change, especially change in journalistic products and practice? Will it permit an authoritative description of the essentials of such change in products and practice? Will it permit generalisation, and provide a reliable base for discussion of the implications of change and future prospects? Preliminary observations suggest a more dynamic and diversified product has been developed in Singapore, well themed, and obviously sustained by public commitment and habituation to diversified online and mobile media services. The Australian products suggest a concentrated corporate and journalistic effort and deployment of resources, with a strong market focus, but less settled and ordered, and showing signs of limitations imposed by the delay in establishing a uniform, large broadband network. The scope of the study is limited. It is intended to test, and take advantage of, the original study as evidentiary material from the early days of newspaper companies’ experimentation with online formats. Both are small studies. The key opportunity for discovery lies in the ‘time capsule’ factor: the availability of well-gathered and processed information on major newspaper company production at the threshold of a transformational decade of change in their industry. The comparison stands to identify key changes.
It should also be useful as a reference for further inquiries of the same kind that might be made, and for monitoring the situation in regard to newspaper portals online into the future.
Abstract:
Accurate process model elicitation continues to be a time-consuming task, requiring skill on the part of the interviewer to extract explicit and tacit process information from the interviewee. Many errors occur in this elicitation stage that would be avoided by better activity recall, more consistent specification methods and greater engagement in the elicitation process by interviewees. Metasonic GmbH has developed a process elicitation tool for their process suite. As part of a research engagement with Metasonic, staff from QUT, Australia have developed a 3D virtual world approach to the same problem, viz. eliciting process models from stakeholders in an intuitive manner. This book chapter tells the story of how QUT staff developed a 3D virtual world tool for process elicitation and took the outcomes of their research project to Metasonic for evaluation, and of Metasonic’s response to the initial proof of concept.
Abstract:
This chapter sets out to identify patterns at play in boardroom discussions around the design and adoption of an accountability system in a nonprofit organisation. To this end, it contributes to the scarce literature showing the backstage of management accounting systems (Berry, 2005), the determination of investment policy (Kreander, Beattie & McPhail, 2009; Kreander, McPhail & Molyneaux, 2004), financial planning strategizing (Parker, 2004) and budgeting (Irvine, 2005). The paucity of publications is due to confidentiality issues preventing attendance at those meetings (Irvine, 2003; Irvine & Gaffikin, 2006). However, the implementation of a new control technology often occurs over a long period of time that might exceed the duration of a research project (Quattrone & Hopper, 2001, 2005). Recent trends of having research funded by grants from private institutions or charities have tended to reduce the length of such undertakings to a few months or, rarely, more than a couple of years (Parker, 2013).
Abstract:
Digital innovation is transforming the media and entertainment industries. The professionalization of YouTube’s platform is paradigmatic of that change. The 100 original channel initiative, launched in late 2011, was designed to transform YouTube’s brand through the production of a high volume of quality premium video content that would more deeply engage its audience base and in the process attract big advertisers. An unanticipated by-product has been the rapid growth of a wave of aspiring next-generation digital media companies from within the YouTube ecosystem. Fuelled by early venture capital, some have ambitious goals to become global media corporations in the online video space. A number of larger MCNs (Multi-Channel Networks) - BigFrame, Machinima, Fullscreen, AwesomenessTV, Maker Studios, Revision3 and DanceOn - have attracted interest from media incumbents like Warner Brothers, DreamWorks, Discovery, Bertelsmann, Comcast and AMC, and two larger MCNs, Alloy and Break Media, have merged. This indicates that a shakeout is underway in these new online supply chains, after rapid initial growth. The higher-profile MCNs seek to rapidly develop scale economies in online distribution and facilitate audience growth for their member channels, helping channels optimize monetization, develop sustainable business models and facilitate producer collaboration within a growing online community of like-minded content creators. Some MCNs already attract far larger online audiences than any national TV network. The speed with which these developments have occurred is reminiscent of the 1910s, when Hollywood studios first emerged and within only a few years replaced the incumbent film studios as the dominant force within the film industry.
Abstract:
Fashion Thinking: Creative Approaches to the Design Process, F. Dieffenbacher (2013) London: AVA, 224 pp., ISBN: 9782940411719, p/bk, $79.99
Abstract:
This paper explores how four English teachers position their English language learners for critical literacy within senior high school curriculum in Queensland, Australia. Such learners are often positioned, even by their teachers, within a broader “deficit discourse” that claims they are inherently lacking the requisite knowledge and skills to engage with intransigent school curricula. As such, English language learners’ identity formation is often constrained by deficit views that can ultimately see limited kinds of literacy teaching offered to them. Using Fairclough’s (2003) critical discourse analysis method, analysis of 16 interviews with the teachers was conducted as part of a larger, critical instrumental case study in two state high schools during 2010. Five competing discourses were identified: deficit as lack; deficit as need; learner “difference” as a resource; conceptual capacity for critical literacy; and linguistic, cultural and conceptual difficulty with critical literacy. While a deficit view is present, counter-hegemonic discourses also exist in their talk. The combination of discourses challenges monolithic deficit views of English language learners, and opens up generative discursive territory to position English language learners in ways other than “problematic”. This has important implications for how teachers view and teach English language learners and their capacity for critical literacy work in senior high school classrooms.
Abstract:
Identifying appropriate decision criteria and making optimal decisions in a structured way is a complex process. This paper presents an approach for doing this in the form of a hybrid Quality Function Deployment (QFD) and Cybernetic Analytic Network Process (CANP) model for project manager selection. This involves the use of QFD to translate the owner's project management expectations into selection criteria and the CANP to weight the expectations and selection criteria. The supermatrix approach then prioritises the candidates with respect to the overall decision-making goal. A case study is used to demonstrate the use of the model in selecting a renovation project manager. This involves the development of 18 selection criteria in response to the owner's three main expectations of time, cost and quality.
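As an illustration of the supermatrix step only, the sketch below uses made-up numbers rather than the paper’s QFD-derived criteria or case-study data: a small column-stochastic supermatrix linking the goal, the three owner expectations (time, cost, quality) and two hypothetical candidates is raised to successive powers until it stabilises, and the candidates’ limiting priorities are read from the goal column.

```python
import numpy as np

# Node order: goal, time, cost, quality, candidate A, candidate B.
# Each column holds the local priority vector of the row nodes with respect to
# the column node (columns sum to 1). All values are illustrative assumptions.
W = np.array([
    # goal  time  cost  qual   A     B
    [0.0,  0.0,  0.0,  0.0,  0.0,  0.0],   # goal
    [0.5,  0.0,  0.0,  0.0,  0.0,  0.0],   # time
    [0.3,  0.0,  0.0,  0.0,  0.0,  0.0],   # cost
    [0.2,  0.0,  0.0,  0.0,  0.0,  0.0],   # quality
    [0.0,  0.7,  0.4,  0.6,  1.0,  0.0],   # candidate A
    [0.0,  0.3,  0.6,  0.4,  0.0,  1.0],   # candidate B
])

def limit_supermatrix(W, tol=1e-9, max_iter=100):
    """Raise the supermatrix to successive powers until the result stabilises."""
    prev = W
    for _ in range(max_iter):
        nxt = prev @ W
        if np.allclose(nxt, prev, atol=tol):
            return nxt
        prev = nxt
    return prev

L = limit_supermatrix(W)
# The candidates' limiting priorities with respect to the overall goal.
print("Candidate A:", round(L[4, 0], 3))  # ~0.59
print("Candidate B:", round(L[5, 0], 3))  # ~0.41
```

In the hybrid model described by the paper, the expectation and criterion weights feeding such a matrix would come from the QFD translation and CANP weighting steps rather than being fixed by hand as in this sketch.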
Abstract:
The aim of the paper is to give a feasibility study on the deposition of nanoscale textured morphologies of titanium and titanium oxide layers on titanium and glass substrates. As a recent development in nanoscale deposition, Physical Vapor Deposition (PVD) based DC magnetron sputtering has been chosen for the deposition process. The nanoscale morphology and surface roughness of the samples have been characterized using an Atomic Force Microscope (AFM). The surface roughness values obtained from AFM have been compared with measurements from a surface profiler. The results indicate that the roughness values depend on the surface roughness of the substrate: the glass substrate was relatively smoother than the titanium plate, and hence a lower layer roughness was obtained. From AFM, a unique boomerang-shaped nano-pattern of the titanium oxide layer on the glass substrate has been observed. The boomerang-shaped nanoscale pattern was found to be smaller when the layer was deposited at higher sputtering power, indicating that the morphology of the deposited titanium oxide layer is influenced by the sputtering power.
Abstract:
Although the notion of wellbeing is popular in contemporary literature, it is variously interpreted and has no common definition. Such inconsistencies in definition have particular relevance when considering wellbeing programs designed for children. By developing a broader conceptualisation of wellbeing and its key elements, the range of programs and services developed in the name of wellbeing will achieve a more consistent cross-disciplinary focus to ensure that the needs of the individual, including children, can more accurately be addressed. This paper presents a new perspective on conceptualising wellbeing. The authors argue that conceptualising wellbeing as an accrued process has particular relevance for both adults and children. A definition of accrued wellbeing is presented in an attempt to address some of the current deficiencies in existing understandings of an already complicated construct. The ideas presented here, which treat wellbeing as a process of accrual, may have further application beyond childhood.
Abstract:
Kaolinite naturally occurs in plate form owing to the interlayer hydrogen bonds and the distortion and adaptation of its tetrahedra and octahedra. However, kaolinite sheets can be exfoliated into nanoscrolls artificially in the laboratory through multiple-step displacement intercalation. The driving force for a kaolinite sheet to curl into a nanoscroll originates from the size discrepancy between the Si–O tetrahedral sheet and the Al–O octahedral sheet. The displacement intercalation promotes the platy kaolinite sheets to scroll spontaneously by eliminating the interlayer hydrogen bonds and atomic interactions. Kaolinite nanoscrolls are hollow tubes whose outer face is the tetrahedral sheet and whose inner face is the octahedral sheet. Based on theoretical calculation, it is reported for the first time that the minimum interior diameter for a single kaolinite sheet to be scrolled is about 9.08 nm, the optimal diameter about 24.30 nm and the maximum about 100 nm, which is verified by scanning electron microscope and transmission electron microscope observations. The different adaptation types and degrees of discrepancy between the tetrahedra and octahedra generate different curling forces in different directions. The nanoscroll axes prefer the directions [100], [11̄0], [110] and [31̄0], and the relative curling forces are as follows: [31̄0] > [100] = [11̄0] > [110].
Abstract:
Process modeling – the design and use of graphical documentations of an organization’s business processes – is a key method to document and use information about the operations of businesses. Still, despite current interest in process modeling, this research area faces essential challenges. Key unanswered questions concern the impact of process modeling in organizational practice, and the mechanisms through which impacts are developed. To answer these questions and to provide a better understanding of process modeling impact, I turn to the concept of affordances. Affordances describe the possibilities for goal-oriented action that a technical object offers to a user. This notion has received growing attention from IS researchers. The purpose of my research is to further develop the IS discipline’s understanding of affordances and impacts from information objects, such as process models used by analysts for information systems analysis and design. Specifically, I seek to extend existing theory on the emergence, perception and actualization of affordances. I develop a research model that describes the process by which affordances emerge between an individual and an object, how affordances are perceived, and how they are actualized by the individual. The proposed model also explains the role of available information for the individual, and the influence of perceived actualization effort. I operationalize and test this research model empirically, using a full-cycle, mixed-methods study consisting of a case study and an experiment.
Abstract:
We identified, mapped, and characterized a widespread area (>1,020 km²) of patterned ground in the Saginaw Lowlands of Michigan, a wet, flat plain composed of waterlain tills, lacustrine deposits, or both. The polygonal patterned ground is interpreted as a possible relict permafrost feature, formed in the Late Wisconsin when this area was proximal to the Laurentide ice sheet. Cold-air drainage off the ice sheet might have pooled in the Saginaw Lowlands, which sloped toward the ice margin, possibly creating widespread but short-lived permafrost on this glacial lake plain. The majority of the polygons occur between the Glacial Lake Warren strandline (~14.8 cal. ka) and the shoreline of Glacial Lake Elkton (~14.3 cal. ka), providing a relative age bracket for the patterned ground. Most of the polygons formed in dense, wet, silt loam soils on flat-lying sites and take the form of reticulate nets with polygon long axes of 150 to 160 m and short axes of 60 to 90 m. Interpolygon swales, often shown as dark curvilinears on aerial photographs, are typically slightly lower than are the polygon centers they bound. Some portions of these interpolygon swales are infilled with gravel-free, sandy loam sediments. The subtle morphology and sedimentological characteristics of the patterned ground in the Saginaw Lowlands suggest that thermokarst erosion, rather than ice-wedge replacement, was the dominant geomorphic process associated with the degradation of the Late-Wisconsin permafrost in the study area and, therefore, was primarily responsible for the soil patterns seen there today.