823 results for Business -- Data processing -- Management


Relevance: 100.00%

Abstract:

In the field of process mining, the use of event logs for root cause analysis is increasingly studied. In such an analysis, the availability of attributes/features that may explain the root cause of some phenomenon is crucial. Currently, these attributes are obtained from raw event logs more or less on a case-by-case basis: a generalized, systematic approach that captures this process is still lacking. This paper proposes a systematic approach to enriching and transforming event logs in order to obtain the attributes required for root cause analysis using classical data mining techniques, specifically classification. The approach is formalized, and its applicability has been validated using both self-generated and publicly available logs.
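As a minimal illustration of the enrichment step described above (activity names and attributes are hypothetical, not the paper's formalization), raw events can be transformed into one feature vector per case, ready to feed a classifier:

```python
# Enrich a raw event log with case-level attributes usable as features
# for classification-based root cause analysis.
from collections import defaultdict
from datetime import datetime

# Raw event log: one record per event (case id, activity, timestamp).
events = [
    ("c1", "register", datetime(2024, 1, 1, 9, 0)),
    ("c1", "check",    datetime(2024, 1, 1, 9, 30)),
    ("c1", "check",    datetime(2024, 1, 1, 11, 0)),   # rework
    ("c1", "approve",  datetime(2024, 1, 1, 12, 0)),
    ("c2", "register", datetime(2024, 1, 2, 9, 0)),
    ("c2", "check",    datetime(2024, 1, 2, 9, 20)),
    ("c2", "approve",  datetime(2024, 1, 2, 9, 40)),
]

def enrich(events):
    """Transform the event log into one feature vector per case."""
    cases = defaultdict(list)
    for case, activity, ts in events:
        cases[case].append((ts, activity))
    rows = {}
    for case, evts in cases.items():
        evts.sort()
        acts = [a for _, a in evts]
        rows[case] = {
            "duration_min": (evts[-1][0] - evts[0][0]).total_seconds() / 60,
            "n_events": len(evts),
            "n_rework": len(acts) - len(set(acts)),  # repeated activities
        }
    return rows

features = enrich(events)
print(features["c1"])  # {'duration_min': 180.0, 'n_events': 4, 'n_rework': 1}
```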

Relevance: 100.00%

Abstract:

Complexity is a major concern that people aim to overcome through modeling. One way of reducing complexity is separation of concerns, e.g. separating business processes from applications. One sort of concern is the cross-cutting concern, i.e. a concern that is scattered and tangled through one or several models. In business process management, examples of such concerns are security and privacy policies. To deal with these cross-cutting concerns, the aspect-oriented approach was introduced in the software development area and recently also in the business process management area. The work presented in this paper elaborates on aspect-oriented process modelling. It extends earlier work by defining a mechanism for capturing multiple concerns and specifying a precedence order according to which they should be handled in a process. A formal syntax of the notation is presented, precisely capturing the extended concepts and mechanisms. Finally, the relevance of the approach is demonstrated through a case study.
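The precedence mechanism can be sketched in a few lines (concern and activity names are hypothetical, and this is not the paper's formal notation): concerns wrap a core activity and are applied according to a declared order:

```python
# Cross-cutting concerns applied to a process activity in an explicit
# precedence order, in the spirit of the aspect-oriented mechanism above.
def authorize(activity):
    """Security concern: record that authorization precedes the activity."""
    return lambda doc: activity(doc + "|authorized")

def audit_log(activity):
    """Audit concern: record that the step was logged."""
    return lambda doc: activity(doc + "|logged")

def approve(doc):
    """The core process activity."""
    return doc + "|approved"

def weave(activity, concerns):
    """Wrap `activity` with `concerns`; earlier entries have higher
    precedence and therefore take effect first."""
    for concern in reversed(concerns):
        activity = concern(activity)
    return activity

process = weave(approve, [authorize, audit_log])
print(process("claim"))  # claim|authorized|logged|approved
```

Swapping the order of the concern list changes which concern is handled first, which is exactly the choice the precedence order makes explicit.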

Relevance: 100.00%

Abstract:

This paper investigates the critical role of knowledge sharing (KS) in leveraging manufacturing activities, namely integrated supplier management (ISM) and new product development (NPD), to improve business performance (BP) within the context of Taiwanese electronic manufacturing companies. The research adopted a sequential mixed-method research design, which provided both quantitative empirical evidence and qualitative insights into the moderating effect of KS on the relationships between these two core manufacturing activities and BP. First, a questionnaire survey was administered, which resulted in a sample of 170 managerial and technical professionals providing their opinions on KS, NPD, and ISM activities and the BP level within their respective companies. On the basis of the collected data, factor analysis was used to verify the measurement model, followed by correlation analysis to explore factor interrelationships, and finally moderated regression analyses to extract the moderating effects of KS on the relationships of NPD and ISM with BP. Following the quantitative study, six semi-structured interviews were conducted to provide qualitative in-depth insights into the value added by KS practices to the targeted manufacturing activities and the extent of their leveraging power. Results from the quantitative statistical analysis indicated that KS, NPD, and ISM all have a significant positive impact on BP. Specifically, IT infrastructure and open communication were identified as the two types of KS practices that could facilitate enriched supplier evaluation and selection, empower active employee involvement in the design process, and provide support for product simplification and the modular design process, thereby improving manufacturing performance and strengthening company competitiveness. The interviews corroborated many of the empirical findings, suggesting that in the contemporary manufacturing context KS has become an integral part of many ISM and NPD activities and, when embedded properly, can lead to an improvement in BP. The paper also highlights a number of useful implications for manufacturing companies seeking to leverage their BP through innovative and sustained KS practices.
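The moderated regression step can be illustrated on synthetic data (variable names are placeholders, not the survey's actual measures): the moderating effect of KS corresponds to the coefficient of the interaction term.

```python
# Moderated regression sketch: does KS moderate the NPD -> BP relationship?
# The interaction coefficient captures the moderating effect.
import numpy as np

rng = np.random.default_rng(0)
n = 170                      # sample size matching the survey above
npd = rng.normal(size=n)     # new product development score (synthetic)
ks = rng.normal(size=n)      # knowledge sharing score (synthetic)
# Simulated outcome: BP depends on NPD, KS, and their interaction.
bp = 0.4 * npd + 0.3 * ks + 0.5 * npd * ks + rng.normal(scale=0.1, size=n)

# Design matrix: intercept, main effects, interaction term.
X = np.column_stack([np.ones(n), npd, ks, npd * ks])
coef, *_ = np.linalg.lstsq(X, bp, rcond=None)
print(coef)  # last entry estimates the interaction effect (near 0.5 here)
```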

Relevance: 100.00%

Abstract:

The representation of business process models has been a continuing research topic for many years. However, many process model representations have not developed beyond minimally interactive 2D icon-based depictions of directed graphs and networks, with little or no annotation for information overlays. In addition, very few of these representations have undergone a thorough analysis or design process with reference to psychological theories on data and process visualization. We believe this dearth of visualization research has led to problems with BPM uptake in some organizations, as the representations can be difficult for stakeholders to understand; it thus remains an open research question for the BPM community. In addition, business analysts and process modeling experts themselves need visual representations that can assist with key BPM life cycle tasks in the process of generating optimal solutions. With the rise of desktop computers and commodity mobile devices capable of supporting rich interactive 3D environments, we believe that much of the research performed in human-computer interaction, virtual reality, games, and interactive entertainment has great potential in BPM: to engage, provide insight, and promote collaboration amongst analysts and stakeholders alike. This is a timely topic, with relevant research emerging in a number of places around the globe. This is the second TAProViz workshop run at BPM. The intention this year is to consolidate the results of last year's successful workshop by further developing this important topic and identifying the key research topics of interest to the BPM visualization community.

Relevance: 100.00%

Abstract:

It is only in recent years that the critical role spatial data can play in disaster management and in strengthening community resilience has been recognised. This recognition is singularly evident from the fact that in Australia spatial data is considered soft infrastructure. In the aftermath of every disaster this importance is further reinforced, with state agencies paying greater attention to ensuring the availability of accurate spatial data based on the lessons learnt. For example, the major flooding in Queensland during the summer of 2011 resulted in a comprehensive review of responsibilities and accountability for the provision of spatial information during such natural disasters. A high-level commission of enquiry completed a comprehensive investigation of the 2011 Brisbane flood inundation event and made specific recommendations concerning the collection of, and accessibility to, spatial information for disaster management and for strengthening community resilience during and after a natural disaster. The lessons learnt and processes implemented were subsequently tested by natural disasters in the following years. This paper provides an overview of the practical implementation of the commission of enquiry's recommendations. It focuses particularly on the measures adopted by the state agencies with the primary role of managing spatial data, and on the evolution of this role in Queensland, Australia. The paper concludes with a review of the development of this role and the increasing importance of spatial data as an infrastructure for disaster planning and management that promotes the strengthening of community resilience.

Relevance: 100.00%

Abstract:

Process models are usually depicted as directed graphs, with nodes representing activities and directed edges representing control flow. While structured processes with pre-defined control flow have been studied in detail, flexible processes including ad-hoc activities need further investigation. This paper presents the flexible process graph, a novel approach to modelling processes in the context of dynamic environments and adaptive process participant behavior. The approach allows execution constraints to be defined that are more restrictive than traditional ad-hoc processes and less restrictive than traditional control flow, thereby balancing structured control flow with unstructured ad-hoc activities. The flexible process graph focuses on what can be done to perform a process; process participants' routing decisions are based on the current process state. As a formal grounding, the approach uses hypergraphs, in which each edge can associate any number of nodes. Hypergraphs are used to formally define the execution semantics of processes. We provide a process scenario to motivate and illustrate the approach.
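A rough sketch of the hypergraph idea (the API and routing rule below are illustrative assumptions, not the paper's formal semantics): each hyperedge associates any number of activity nodes, and routing decisions are derived from the current process state.

```python
# A flexible process graph as a hypergraph: edges are sets of activity
# nodes, and enabled activities are read off the current process state.
class FlexibleProcessGraph:
    def __init__(self):
        self.nodes = set()
        self.edges = []          # each hyperedge is a frozenset of nodes

    def add_edge(self, *nodes):
        """Add one hyperedge associating any number of activity nodes."""
        self.nodes.update(nodes)
        self.edges.append(frozenset(nodes))

    def enabled(self, state):
        """Activities reachable from the current state: all nodes sharing
        a hyperedge with an already-completed activity (simplified rule)."""
        return {n for e in self.edges if e & state for n in e} - state

g = FlexibleProcessGraph()
g.add_edge("register", "check", "approve")   # one edge, three nodes
g.add_edge("approve", "archive")
print(g.enabled({"register"}))  # {'check', 'approve'}
```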

Relevance: 100.00%

Abstract:

Big data is certainly the buzz term in executive networking circles at the moment. Heralded by management consultancies and research organisations alike as the next big thing in business efficiency, it is shooting up the Gartner hype cycle to the giddy heights of the peak of inflated expectations before it tumbles down into the trough of disillusionment.

Relevance: 100.00%

Abstract:

We formalise and present a new generic multifaceted complex system approach for modelling complex business enterprises. Our method has a strong focus on integrating the various data types available in an enterprise, which represent the diverse perspectives of various stakeholders. We explain the challenges faced and define a novel approach to converting diverse data types into usable Bayesian probability forms. The data types that can be integrated include historic data, survey data, management planning data, expert knowledge, and incomplete data. The structural complexities of the complex system modelling process, based on various decision contexts, are also explained, along with a solution. This new application of complex system models as a management tool for decision making is demonstrated using a railway transport case study. The case study demonstrates how the new approach can be used to develop a customised decision support model for a specific enterprise. Various decision scenarios are also provided to illustrate the versatility of the decision model at different phases of enterprise operations, such as planning and control.
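One simple instance of casting historic data into Bayesian probability form (the categories and counts below are made up for illustration) is estimating a conditional probability table from observed frequencies:

```python
# Convert historic count data into a conditional probability table,
# one way records can be cast into Bayesian form.
from collections import Counter

# Historic records: (track_condition, delay_occurred) observations.
records = [("good", False)] * 80 + [("good", True)] * 20 + \
          [("poor", False)] * 30 + [("poor", True)] * 70

def cpt(records):
    """Estimate P(delay | condition) from counts."""
    joint = Counter(records)
    table = {}
    for cond in {c for c, _ in records}:
        total = joint[(cond, True)] + joint[(cond, False)]
        table[cond] = joint[(cond, True)] / total
    return table

print(cpt(records))  # {'good': 0.2, 'poor': 0.7} (key order may vary)
```

Survey or expert-judgement data can be folded into such tables the same way, e.g. as pseudo-counts added before normalising.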

Relevance: 100.00%

Abstract:

The design and development of a Bottom Pressure Recorder for a Tsunami Early Warning System are described here. The special requirements it should satisfy for the specific application, deployment on the ocean bed and pressure monitoring of the water column above, are dealt with; high-resolution data digitization and low circuit power consumption are typical of these. The implementation details of the data sensing and acquisition part that meet them are also brought out. The data processing part typically encompasses a tsunami detection algorithm that should detect an event of significance against a background of periodic and aperiodic noise signals. Such an algorithm and its simulation are presented. Further, the results of sea trials carried out on the system off the Chennai coast are presented. The high quality and fidelity of the data prove that the system design is robust despite its low cost and, with suitable augmentations, is ready for full-fledged deployment on the ocean bed. (C) 2013 Elsevier Ltd. All rights reserved.
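A detection step of the kind described can be sketched as a simplified threshold test (this is an illustration under generic assumptions, not the authors' algorithm): flag samples that deviate from a baseline predicted from recent samples by more than a set amount.

```python
# Threshold-based event detection on a pressure record: compare each
# sample against the mean of the preceding window.
def detect(samples, window=4, threshold=3.0):
    """Return indices where |sample - mean of preceding window| > threshold."""
    hits = []
    for i in range(window, len(samples)):
        baseline = sum(samples[i - window:i]) / window
        if abs(samples[i] - baseline) > threshold:
            hits.append(i)
    return hits

# Synthetic record: steady tide-like signal with a sudden pressure jump.
record = [100.0, 100.1, 100.0, 99.9, 100.0, 100.1, 108.0, 100.0]
print(detect(record))  # [6]
```

A deployed algorithm would additionally model the tidal component and suppress aperiodic noise, as the abstract indicates.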

Relevance: 100.00%

Abstract:

This report is a detailed description of the data processing of NOAA/MLML spectroradiometry data. It introduces the MLML_DBASE programs, describes the assembly of diverse data files, and explains the general algorithms and how individual routines are used. Definitions of data structures are presented in the appendices. [PDF contains 48 pages]

Relevance: 100.00%

Abstract:

This report outlines the NOAA spectroradiometer data processing system implemented by the MLML_DBASE programs. This is done by presenting the algorithms and graphs showing the effects of each step in the algorithms. [PDF contains 32 pages]

Relevance: 100.00%

Abstract:

Commercially available software packages for IBM PC-compatibles are evaluated for use in data acquisition and processing work. Moss Landing Marine Laboratories (MLML) has acquired computers since 1978 for use in shipboard data acquisition (i.e. CTD, radiometric, etc.) and data processing. Hewlett-Packard desktops were used first, followed by a transition to DEC VAXstations, with software developed mostly by the author and others at MLML (Broenkow and Reaves, 1993; Feinholz and Broenkow, 1993; Broenkow et al., 1993). IBM PCs were at first very slow and limited in available software, so they were not used in the early days. Improved technology, such as higher-speed microprocessors and a wide range of commercially available software, makes use of the PC more reasonable today. MLML is making a transition towards using the PC for data acquisition and processing. The advantages are portability and the availability of outside support.

Relevance: 100.00%

Abstract:

Statistical analysis of diffusion tensor imaging (DTI) data requires a computational framework that is both numerically tractable (to account for the high-dimensional nature of the data) and geometric (to account for the nonlinear nature of diffusion tensors). Building upon earlier studies exploiting a Riemannian framework to address these challenges, the present paper proposes a novel metric and an accompanying computational framework for DTI data processing. The proposed approach grounds the signal processing operations in interpolating curves. Well-chosen interpolating curves are shown to provide a computational framework that is at once tractable and relevant to the information content of DTI processing. In addition, and in contrast to earlier methods, it provides an interpolation method that preserves anisotropy, a central piece of information carried by diffusion tensor data. © 2013 Springer Science+Business Media New York.
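As a point of comparison (this is the widely used log-Euclidean scheme, not the paper's proposed metric), a Riemannian interpolation between two diffusion tensors can be sketched as follows:

```python
# Log-Euclidean interpolation between symmetric positive-definite (SPD)
# tensors: interpolate linearly in the matrix-logarithm domain, then map
# back, a standard geometry-aware alternative to naive linear blending.
import numpy as np

def _logm_spd(A):
    """Matrix logarithm of an SPD matrix via eigendecomposition."""
    w, V = np.linalg.eigh(A)
    return V @ np.diag(np.log(w)) @ V.T

def _expm_sym(A):
    """Matrix exponential of a symmetric matrix via eigendecomposition."""
    w, V = np.linalg.eigh(A)
    return V @ np.diag(np.exp(w)) @ V.T

def interp_log_euclidean(A, B, t):
    """Geodesic-style interpolation: exp((1-t) log A + t log B)."""
    return _expm_sym((1 - t) * _logm_spd(A) + t * _logm_spd(B))

A = np.diag([1.0, 1.0, 1.0])
B = np.diag([4.0, 1.0, 1.0])
mid = interp_log_euclidean(A, B, 0.5)
print(np.round(np.diag(mid), 3))  # [2. 1. 1.]
```

Note the midpoint eigenvalue is the geometric mean 2, not the arithmetic mean 2.5 that linear blending would give; the paper's contribution is a different metric that additionally preserves anisotropy.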

Relevance: 100.00%

Abstract:

The power consumption of wireless sensor network (WSN) modules is an important practical concern in building energy management (BEM) system deployments. A set of metrics is created to assess the power profile of WSNs under real-world conditions. The aim of this work is to understand, and eventually eliminate, the uncertainties in WSN power consumption during long-term deployments, and to assess compatibility with existing and emerging energy harvesting technologies. This paper investigates the key metrics of data processing, wireless data transmission, data sensing, and the duty cycle parameter to understand the system power profile from a practical deployment perspective. Based on the proposed analysis, the impact of each individual metric on power consumption in a typical BEM application is presented, and the subsequent low-power solutions are investigated.
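The duty cycle metric, in particular, admits a simple back-of-envelope model (the figures below are illustrative assumptions, not measured values from the paper):

```python
# Average node power as a duty-cycle-weighted mix of active and sleep power,
# the basic relationship behind the duty cycle metric.
def avg_power_mw(p_active_mw, p_sleep_mw, duty_cycle):
    """Average power for a node active a fraction `duty_cycle` of the time."""
    return duty_cycle * p_active_mw + (1 - duty_cycle) * p_sleep_mw

# E.g. 60 mW while sensing/transmitting, 0.06 mW asleep, 1% duty cycle.
p = avg_power_mw(60.0, 0.06, 0.01)
print(round(p, 4))  # 0.6594
```

Even at a 1% duty cycle the active phase dominates the budget here, which is why transmission and processing metrics matter so much for harvesting-compatible designs.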