912 results for Business Administration, Management|Information Science|Engineering, System Science


Relevance:

100.00%

Publisher:

Abstract:

The process view concept deploys a partial and temporal representation to adjust the visible view of a business process according to the perception constraints of different users. Process view technology is of practical use for privacy protection and authorization control in process-oriented business management. Owing to complex organizational structures, it is challenging for large companies to accurately specify the diverse perceptions of different users over business processes. To tackle this issue, this article presents a role-based process view model that incorporates role dependencies into process view derivation. Compared to existing process view approaches, ours particularly supports runtime updates to the process view perceivable to a user through specific view merging operations, thereby enabling dynamic tracing of process perception. A series of rules and theorems is established to guarantee the structural consistency and validity of process view transformation. A hypothetical case study illustrates the feasibility of the approach, and a prototype is developed as a proof of concept.
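As a rough illustration of view derivation (not the paper's formal model: the activity names, role table, and hiding rule below are invented), a role-based view can be derived by taking the union of a user's role perceptions and collapsing runs of invisible activities into opaque blocks; a runtime role change is then handled by re-deriving, i.e. merging, the view from the enlarged role set.

```python
# Minimal sketch of role-based process view derivation. Everything here is
# hypothetical: the paper's formal merging operations, consistency rules,
# and theorems are not reproduced.

process = ["receive order", "check credit", "approve loan", "notify customer"]

visible_to_role = {
    "clerk":   {"receive order", "notify customer"},
    "manager": {"receive order", "check credit", "approve loan", "notify customer"},
}

def derive_view(process, roles):
    """Union the perceptions of the user's roles, then collapse each run of
    invisible activities into a single opaque '<hidden>' block."""
    visible = set().union(*(visible_to_role[r] for r in roles))
    view, hiding = [], False
    for activity in process:
        if activity in visible:
            view.append(activity)
            hiding = False
        elif not hiding:  # start of a run of hidden activities
            view.append("<hidden>")
            hiding = True
    return view

print(derive_view(process, ["clerk"]))
# → ['receive order', '<hidden>', 'notify customer']
print(derive_view(process, ["clerk", "manager"]))
# → ['receive order', 'check credit', 'approve loan', 'notify customer']
```

A runtime role update simply enlarges the role set and re-derives the view, which is the flavor of the dynamic tracing the abstract describes.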

Relevance:

100.00%

Publisher:

Abstract:

This is the fourth TAProViz workshop run at the 13th International Conference on Business Process Management (BPM). The intention this year is to build on the results of the previous successful workshops by further developing this important topic and identifying the key research areas of interest to the BPM visualization community. Towards this goal, the workshop topics were extended to human-computer interaction and related domains. Submitted papers were evaluated by at least three program committee members, in a double-blind manner, on the basis of significance, originality, technical quality and exposition. Three full papers and one position paper were accepted for presentation at the workshop. In addition, we invited a keynote speaker, Jakob Pinggera, a postdoctoral researcher at the Business Process Management Research Cluster at the University of Innsbruck, Austria.

Relevance:

100.00%

Publisher:

Abstract:

Overprocessing waste occurs in a business process when effort is spent in a way that adds value neither to the customer nor to the business. Previous studies have identified a recurrent overprocessing pattern in business processes with so-called "knockout checks", meaning activities that classify a case into "accepted" or "rejected", such that if the case is accepted it proceeds forward, while if rejected, it is cancelled and all work performed in the case is considered unnecessary. Thus, when a knockout check rejects a case, the effort spent in other (previous) checks becomes overprocessing waste. Traditional process redesign methods propose to order knockout checks according to their mean effort and rejection rate. This paper presents a more fine-grained approach where knockout checks are ordered at runtime based on predictive machine learning models. Experiments on two real-life processes show that this predictive approach outperforms traditional methods while incurring minimal runtime overhead.
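The knockout-ordering idea can be sketched concisely. The snippet below (check names, efforts, and probabilities are all hypothetical) implements the classic redesign heuristic of running cheap, likely-to-reject checks first; the paper's predictive variant would replace the static rejection rates with per-case predictions from a machine learning model at runtime.

```python
from dataclasses import dataclass

@dataclass
class KnockoutCheck:
    name: str
    effort: float              # mean processing effort (e.g., minutes)
    reject_probability: float  # chance this case is knocked out by the check

def order_checks(checks):
    """Run cheap, likely-to-reject checks first: sort descending by
    reject_probability / effort, minimising expected wasted effort."""
    return sorted(checks, key=lambda c: c.reject_probability / c.effort,
                  reverse=True)

# With static historical rates this is the traditional redesign heuristic;
# the predictive variant re-sorts at runtime using per-case ML predictions.
checks = [
    KnockoutCheck("credit history check", effort=10.0, reject_probability=0.05),
    KnockoutCheck("income verification",  effort=2.0,  reject_probability=0.30),
    KnockoutCheck("identity check",       effort=1.0,  reject_probability=0.02),
]
print([c.name for c in order_checks(checks)])
# → ['income verification', 'identity check', 'credit history check']
```

Because the predicted rejection probabilities differ per case, the predictive approach can yield a different check order for each running case, which is where its advantage over the static ordering comes from.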

Relevance:

100.00%

Publisher:

Abstract:

The paper presents an innovative approach to modelling the causal relationships of human errors in rail crack incidents (RCI) from a managerial perspective. A Bayesian belief network is developed to model RCI by considering the human errors of designers, manufacturers, operators and maintainers (DMOM) and the causal relationships involved. A set of dependent variables, whose combinations express the relevant functions performed by each DMOM participant, is used to model the causal relationships. A total of 14 RCI on Hong Kong’s mass transit railway (MTR) from 2008 to 2011 are used to illustrate the application of the model. Bayesian inference is used to conduct an importance analysis assessing the impact of the participants’ errors. Sensitivity analysis is then employed to gauge the effect of an increased probability of human error on RCI. Finally, strategies for human error identification and RCI mitigation are proposed. The case study identifies the ability of the maintainer as the most important factor influencing the probability of RCI, implying a priority need to strengthen the maintenance management of the MTR system; improving the inspection ability of the maintainer is likely to be an effective strategy for RCI risk mitigation.
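The Bayesian importance analysis the paper describes can be illustrated with a toy model. The sketch below is a minimal noisy-OR network over the four DMOM error nodes with purely hypothetical priors and weights (the paper's actual conditional probabilities, dependent variables, and network structure are not reproduced); the posterior P(error | RCI), computed by exhaustive enumeration, plays the role of the importance measure.

```python
import itertools

# Illustrative priors for a human error by each DMOM participant.
# All numbers are hypothetical, not values estimated from the 14 MTR incidents.
error_prior = {"designer": 0.05, "manufacturer": 0.10,
               "operator": 0.08, "maintainer": 0.15}

# Hypothetical noisy-OR weights: how strongly each error raises RCI risk.
RCI_WEIGHT = {"designer": 0.3, "manufacturer": 0.4,
              "operator": 0.2, "maintainer": 0.6}

def p_rci_given(errors):
    """Noisy-OR conditional: each participant error independently raises RCI risk."""
    p_no_rci = 1.0
    for who, erred in errors.items():
        if erred:
            p_no_rci *= 1.0 - RCI_WEIGHT[who]
    return 1.0 - p_no_rci

def posterior_error_given_rci(target):
    """P(target participant erred | RCI observed), by exhaustive enumeration."""
    p_rci = p_rci_and_error = 0.0
    for combo in itertools.product([False, True], repeat=len(error_prior)):
        errors = dict(zip(error_prior, combo))
        p = 1.0
        for who, erred in errors.items():
            p *= error_prior[who] if erred else 1.0 - error_prior[who]
        p *= p_rci_given(errors)
        p_rci += p
        if errors[target]:
            p_rci_and_error += p
    return p_rci_and_error / p_rci

# Importance analysis: with these inputs the maintainer dominates,
# mirroring the pattern reported in the case study.
for who in error_prior:
    print(who, round(posterior_error_given_rci(who), 3))
```

Sensitivity analysis in the same toy setting amounts to perturbing an entry of `error_prior` and re-reading the marginal probability of RCI.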

Relevance:

100.00%

Publisher:

Abstract:

Business process models have become an effective way of examining business practices to identify areas for improvement. While common information gathering approaches are generally efficacious, they can be quite time consuming and risk introducing inaccuracies when information is forgotten or incorrectly interpreted by analysts. In this study, the potential of a role-playing approach to process elicitation and specification has been examined. This method allows stakeholders to enter a virtual world and role-play actions much as they would in reality. As actions are completed, a model is automatically developed, removing the need for stakeholders to learn and understand a modelling grammar. An empirical investigation comparing both the modelling outputs and participant behaviour of this virtual world role-play elicitor with an S-BPM process modelling tool found that, while the modelling approaches of the two groups varied greatly, the virtual world elicitor may not only improve the number of individual process task steps remembered and the correctness of task ordering, but also reduce the time required for stakeholders to model a process view.

Relevance:

100.00%

Publisher:

Abstract:

This paper addresses the problem of discovering business process models from event logs. Existing approaches to this problem strike various tradeoffs between accuracy and understandability of the discovered models. With respect to the second criterion, empirical studies have shown that block-structured process models are generally more understandable and less error-prone than unstructured ones. Accordingly, several automated process discovery methods generate block-structured models by construction. These approaches, however, intertwine the concern of producing accurate models with that of ensuring their structuredness, sometimes sacrificing the former to ensure the latter. In this paper we propose an alternative approach that separates these two concerns. Instead of directly discovering a structured process model, we first apply a well-known heuristic technique that discovers more accurate but sometimes unstructured (and even unsound) process models, and then transform the resulting model into a structured one. An experimental evaluation shows that our “discover and structure” approach outperforms traditional “discover structured” approaches with respect to a range of accuracy and complexity measures.

Relevance:

100.00%

Publisher:

Abstract:

The business value of information technology (IT) is realized through the continuous use of IT subsequent to users’ adoption. Understanding post-adoptive IT usage is therefore useful in realizing potential IT business value. Most previous research on post-adoptive IT usage, however, dismisses the unintentional and unconscious aspects of usage behavior. This paper advances understanding of the unintentional, unconscious, and thereby automatic usage of IT features during the post-adoptive stage. Drawing from the social psychology literature, we argue that human behaviors can be triggered by environmental cues and directed by a person’s mental goals, thereby operating without the person’s consciousness and intentional will. On this basis, we theorize the role of a user’s innovativeness goal, as the desired state of an act to innovate, in directing the user’s unintentional, unconscious, and automatic post-adoptive IT feature usage behavior. To test the hypothesized mechanisms, a human experiment employing a priming technique is described.

Relevance:

100.00%

Publisher:

Abstract:

Several researchers are of the opinion that there are many benefits to using the object-oriented paradigm in information systems development. If the object-oriented paradigm is used, the development of information systems may, for example, be faster and more efficient. On the other hand, there are also several problems with the paradigm: it is often considered complex, it is often difficult to make use of reuse, and it is still immature in some areas. Although there are several interesting features in the object-oriented paradigm, there is still little comprehensive knowledge of the benefits and problems associated with it. The objective of this study was to investigate and gain more understanding of the benefits and problems of the object-oriented paradigm. A review of previous studies was made, and twelve benefits and twelve problems were established. These benefits and problems were then analysed, studied and discussed. Further, a survey and several case studies were conducted to learn which benefits and problems of the object-oriented paradigm Finnish software companies had experienced. One hundred and four companies answered the survey, which was sent to all Finnish software companies with five or more employees. The case studies were conducted with six large Finnish software companies. The major finding was that Finnish software companies were exceptionally positive towards object-oriented information systems development and had experienced very few of the proposed problems. Finally, two models for further research were developed: the first presents connections between the benefits, and the second between the problems.

Relevance:

100.00%

Publisher:

Abstract:

Copyright questions concerning the online use of scholarly works in libraries and universities have recently caused considerable difficulty. From a copyright perspective, information networks and the digital environment form a special field of application, and familiarity with it requires closer knowledge of data transfer, databases and, more generally, the technical operations involved in information networks. Because the technical solutions applied differ from one context to another, this paper seeks to clarify, at a general level, the copyright and contract law questions that the use of works in information networks raises between users and rightsholders. The aim is to bring out the copyright-relevant issues that should be taken into account in agreements between original authors, publishers and online publishers (for example, a library or a university) when online publications are archived, transmitted and linked to. The questions are examined particularly from the publisher’s perspective. The paper also includes an empirical study of publishers’ permission practices, examining how often, between 2000 and 2003, publishers granted permission to publish a dissertation article as part of a dissertation on the open, non-commercial web server of Helsinki University of Technology (Teknillinen korkeakoulu). Because links often play a significant role in online publishing but their copyright status is unclear, the latter part of the paper examines the copyright status of links.

Relevance:

100.00%

Publisher:

Abstract:

Executive Summary: The Estuary Restoration Act of 2000 (ERA), Title I of the Estuaries and Clean Waters Act of 2000, was created to promote the restoration of habitats along the coast of the United States (including the US protectorates and the Great Lakes). The NOAA National Centers for Coastal Ocean Science was charged with the development of a guidance manual for monitoring plans under this Act. This guidance manual, titled Science-Based Restoration Monitoring of Coastal Habitats, is written in two volumes. It provides technical assistance, outlines necessary steps, and provides useful tools for the development and implementation of sound scientific monitoring of coastal restoration efforts. In addition, this manual offers a means to detect early warning signs of whether a restoration is on track, to gauge how well a restoration site is functioning, to coordinate projects and efforts for consistent and successful restoration, and to evaluate the ecological health of specific coastal habitats both before and after project completion (Galatowitsch et al. 1998). The following habitats have been selected for discussion in this manual: water column, rock bottom, coral reefs, oyster reefs, soft bottom, kelp and other macroalgae, rocky shoreline, soft shoreline, submerged aquatic vegetation, marshes, mangrove swamps, deepwater swamps, and riverine forests. The classification of habitats used in this document is generally based on that of Cowardin et al. (1979) in their Classification of Wetlands and Deepwater Habitats of the United States, as called for in the ERA Estuary Habitat Restoration Strategy. This manual is not intended to be a restoration monitoring “cookbook” that provides templates of monitoring plans for specific habitats. The interdependence of a large number of site-specific factors causes habitat types to vary in physical and biological structure within and between regions and geographic locations (Kusler and Kentula 1990).
Monitoring approaches used should be tailored to these differences. However, even with the diversity of habitats that may need to be restored and the extreme geographic range across which these habitats occur, there are consistent principles and approaches that form a common basis for effective monitoring. Volume One, titled A Framework for Monitoring Plans under the Estuaries and Clean Waters Act of 2000, begins with definitions and background information. Topics such as restoration, restoration monitoring, estuaries, and the role of socioeconomics in restoration are discussed. In addition, the habitats selected for discussion in this manual are briefly described. (PDF contains 116 pages)

Relevance:

100.00%

Publisher:

Abstract:

The Alliance for Coastal Technologies (ACT) Workshop "Making Oxygen Measurements Routine Like Temperature" was convened in St. Petersburg, Florida, January 4th - 6th, 2006. This event was sponsored by the University of South Florida (USF) College of Marine Science, an ACT partner institution, and co-hosted by the Ocean Research Interactive Observatory Networks (ORION). Participants from the research/academia, resource management, industry, and engineering sectors collaborated with the aim of fostering ideas and information on how to make measuring dissolved oxygen a routine part of a coastal or open ocean observing system. Plans are in motion to develop large-scale ocean observing systems as part of the US Integrated Ocean Observing System (IOOS; see http://ocean.us) and the NSF Ocean Observatory Initiative (OOI; see http://www.orionprogram.org/OOI/default.html). These systems will require biological and chemical sensors that can be deployed in large numbers, with high reliability, and for extended periods of time (years). It is also likely that the development cycle for new sensors is sufficiently long that completely new instruments, which operate on novel principles, cannot be developed before these complex observing systems are deployed. The most likely path to development of robust, reliable, high-endurance sensors in the near future is to move the current generation of sensors to a much greater degree of readiness. The ACT Oxygen Sensor Technology Evaluation demonstrated two important facts related to the need for sensors. There is a suite of commercially available sensors that can, in some circumstances, generate high-quality data; however, the evaluation also showed that none of the sensors were able to generate high-quality data in all circumstances for even one-month periods, due to biofouling issues.
Many groups are attempting to use oxygen sensors in large observing programs; however, there often seems to be limited communication between these groups, and they often do not have access to sophisticated engineering resources. Instrument manufacturers also do not have sufficient resources to bring sensors that are marketable, but of limited endurance or reliability, to a higher state of readiness. The goal of this ACT/ORION Oxygen Sensor Workshop was to bring together a group of experienced oceanographers who are now deploying oxygen sensors in extended arrays, along with a core of experienced and interested academic and industrial engineers and manufacturers. The intended direction for this workshop was for this group to exchange information accumulated through a variety of sensor deployments, examine failure mechanisms, and explore a variety of potential solutions to these problems. One anticipated outcome was focused recommendations to funding agencies on development needs and potential solutions for O2 sensors. (pdf contains 19 pages)

Relevance:

100.00%

Publisher:

Abstract:

Science Cafés provide a casual meeting place where people who may have little or no science background can learn about a current scientific topic in an informal and friendly environment. The coffee shop setting is designed to be inviting and informal so that students, faculty, and community members can feel comfortable and engage in lively and meaningful conversations. The café is organized around an interesting scientific topic, with a brief presentation by a scientist, and may include a short video clip. A Science Café can (1) provide an opportunity and venue for increasing science literacy, (2) publicize local scientific endeavors, and (3) establish the library as an epicenter of informal education on the campus and in the community. This presentation describes the development of the Science Café at the University of Southern Mississippi Gulf Coast campus Library in Long Beach and plans for future cafés on the Mississippi coast.

Relevance:

100.00%

Publisher:

Abstract:

On July 12-15, 2008, researchers and resource managers met in Jupiter, Florida to discuss and review the state of knowledge regarding mesophotic coral ecosystems, develop a working definition for these ecosystems, identify critical resource management information needs, and develop a Mesophotic Coral Ecosystems Research Strategy to assist the U.S. National Oceanic and Atmospheric Administration (NOAA) and other agencies and institutions in their research prioritization and strategic planning for mesophotic coral ecosystems. Workshop participants included representatives from international, Federal, and state governments; academia; and nongovernmental organizations. The Mesophotic Coral Ecosystems Workshop was hosted by the Perry Institute for Marine Science (PIMS) and organized by NOAA and the U.S. Geological Survey (USGS). The workshop goals, objectives, schedule, and products were governed by a Steering Committee consisting of members from NOAA (National Centers for Coastal Ocean Science’s Center for Sponsored Coastal Ocean Research, the Office of Ocean Exploration and Research’s NOAA Undersea Research Program, and the National Marine Fisheries Service), USGS, PIMS, the Caribbean Coral Reef Institute, and the Bishop Museum.