924 results for Business Process Model Validation
Abstract:
Over the last few years, Business Process Management (BPM) has achieved increasing popularity and dissemination. An analysis of the underlying assumptions of BPM shows that it pursues two apparently contradictory goals: on the one hand, it aims at formalising work practices into business process models; on the other hand, it intends to confer flexibility on the organization, i.e. to maintain its ability to respond to new and unforeseen situations. This paper analyses the relationship between formalisation and flexibility in business process modelling by means of an empirical case study of a BPM project in an aircraft maintenance company. A qualitative approach is adopted, based on Actor-Network Theory. The paper offers two major contributions: (a) it illustrates the sociotechnical complexity involved in BPM initiatives; (b) it points towards a multidimensional understanding of the relation between formalisation and flexibility in BPM projects.
Abstract:
Background: Tuberculosis (TB) remains a public health issue worldwide. The lack of specific clinical symptoms to diagnose TB makes the correct decision to admit patients to respiratory isolation a difficult task for the clinician. Isolation of patients without the disease is common and increases health costs. Decision models for the diagnosis of TB in patients attending hospitals can increase the quality of care and decrease costs, without the risk of hospital transmission. We present a model for predicting pulmonary TB in hospitalized patients in a high-prevalence area, in order to contribute to a more rational use of isolation rooms without increasing the risk of transmission. Methods: Cross-sectional study of patients admitted to CFFH from March 2003 to December 2004. A classification and regression tree (CART) model was generated and validated. The area under the ROC curve (AUC), sensitivity, specificity, and positive and negative predictive values were used to evaluate the performance of the model. Validation of the model was performed with a different sample of patients admitted to the same hospital from January to December 2005. Results: We studied 290 patients admitted with clinical suspicion of TB. Diagnosis was confirmed in 26.5% of them. Pulmonary TB was present in 83.7% of the patients with TB (62.3% with positive sputum smear) and HIV/AIDS was present in 56.9% of patients. The validated CART model showed sensitivity, specificity, positive predictive value and negative predictive value of 60.00%, 76.16%, 33.33%, and 90.55%, respectively. The AUC was 79.70%. Conclusions: The CART model developed for these hospitalized patients with clinical suspicion of TB had fair to good predictive performance for pulmonary TB. The most important variable for the prediction of TB diagnosis was the chest radiograph result.
Prospective validation is still necessary, but our model offers an alternative for deciding whether to isolate patients with clinical suspicion of TB in tertiary health facilities in countries with limited resources.
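The validation workflow this abstract describes (fit a CART on a development sample, then compute sensitivity, specificity, predictive values and AUC on an independent sample) can be sketched as follows. This is a minimal illustration on synthetic data, not the study's clinical data or its actual predictors.

```python
# Illustrative sketch of CART development and validation, assuming
# synthetic data; predictor names and thresholds are invented.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import roc_auc_score, confusion_matrix

rng = np.random.default_rng(0)
X_train = rng.normal(size=(290, 4))   # stand-ins for clinical/radiographic variables
y_train = (X_train[:, 0] + rng.normal(size=290) > 1).astype(int)
X_valid = rng.normal(size=(150, 4))   # independent validation sample
y_valid = (X_valid[:, 0] + rng.normal(size=150) > 1).astype(int)

cart = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_train, y_train)

# Performance on the independent validation sample
y_pred = cart.predict(X_valid)
tn, fp, fn, tp = confusion_matrix(y_valid, y_pred).ravel()
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
auc = roc_auc_score(y_valid, cart.predict_proba(X_valid)[:, 1])
```

With real data, the same four quantities plus positive and negative predictive values reproduce the table of metrics reported in the abstract.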
Abstract:
The advent of distributed and heterogeneous systems has laid the foundation for the birth of new architectural paradigms, in which many separate and autonomous entities collaborate and interact with the aim of achieving complex strategic goals that would be impossible to accomplish on their own. A non-exhaustive list of systems targeted by such paradigms includes Business Process Management, Clinical Guidelines and Careflow Protocols, and Service-Oriented and Multi-Agent Systems. It is largely recognized that engineering these systems requires novel modeling techniques. In particular, many authors claim that an open, declarative perspective is needed to complement the closed, procedural nature of state-of-the-art specification languages. For example, the ConDec language has recently been proposed to target the declarative and open specification of Business Processes, overcoming the over-specification and over-constraining issues of classical procedural approaches. On the one hand, the success of such novel modeling languages strongly depends on their usability by non-IT-savvy users: they must provide an appealing, intuitive graphical front-end. On the other hand, they must be amenable to verification, in order to guarantee the trustworthiness and reliability of the developed model, as well as to ensure that the actual executions of the system effectively comply with it. In this dissertation, we claim that Computational Logic is a suitable framework for dealing with the specification, verification, execution, monitoring and analysis of these systems. We propose to adopt an extended version of the ConDec language for specifying interaction models with a declarative, open flavor.
We show how all the (extended) ConDec constructs can be automatically translated to the CLIMB Computational Logic-based language, and illustrate how its corresponding reasoning techniques can be successfully exploited to provide support and verification capabilities along the whole life cycle of the targeted systems.
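As a toy illustration of the kind of declarative rule ConDec expresses (written here in plain Python rather than in CLIMB, and not taken from the dissertation), consider the classic "response(a, b)" constraint: every occurrence of activity a must eventually be followed by an occurrence of activity b.

```python
# Hedged sketch: checking an execution trace against a ConDec-style
# response(a, b) constraint. Function name and trace encoding are invented.
def satisfies_response(trace, a, b):
    """True if every occurrence of a is eventually followed by a b."""
    pending = False
    for event in trace:
        if event == a:
            pending = True      # an obligation to see b is now open
        elif event == b:
            pending = False     # the open obligation (if any) is discharged
    return not pending          # compliant iff no obligation is left open

assert satisfies_response(["a", "x", "b"], "a", "b")        # compliant
assert not satisfies_response(["a", "b", "a"], "a", "b")    # trailing a unanswered
```

Monitoring in the dissertation's sense amounts to evaluating such constraints incrementally as events arrive, rather than on a completed trace.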
Abstract:
This work deals with the application of a Quality Management System certification project for an innovative business line in the beverage-packaging machinery sector. It was prepared during a six-month internship at SACMI IMOLA S.C. (Imola, BOLOGNA), prompted by management's need to align the quality management system of the new business line with the ISO 9001:2008 standard, as already done for the company's other business lines. This was achieved through the implementation of a Business Process Management (BPM) system and of all the IT systems connected to it. The thesis is structured in three parts. The first part surveys the reference literature on the historical evolution, current structure and possible future scenarios of the concept of quality, the quality management system and the relevant standards. The second part examines in depth the themes of BPM, whose principles guided the intervention carried out. The research was conducted to highlight the roots and the innovative elements that distinguish this management approach, to describe the aspects that determined its diffusion and evolution, and to point out the link between the process-based approach underlying this management philosophy and the same approach envisaged by the ISO 9001:2008 standard and, more specifically, by the so-called Vision 2000. This section concludes with a formalisation of the methodology and tools actually used to manage the project. The third and final part of the work describes the case study.
The various phases of the project's implementation are presented, from the analysis of the current situation to the construction of the IT infrastructure for enacting BPM, obtained by applying the "design criteria" discussed in the reference literature, passing through the mapping of the processes currently in force and the analysis of the performance of the current process, measured through indicators developed ad hoc. The work is enriched by the analysis of an innovative methodology for creating an integrated management system for certifications (ISO 9001, ISO 14001, OHSAS 18001) built on the IT infrastructure created.
Abstract:
Development of novel implants in orthopaedic trauma surgery is based on limited datasets from cadaver trials or artificial bone models. A method has been developed whereby implants can be constructed in an evidence-based manner, founded on a large anatomic database consisting of more than 2,000 datasets of bones extracted from CT scans. The aim of this study was the development and clinical application of an anatomically pre-contoured plate for the treatment of distal fibular fractures based on the anatomical database. 48 Caucasian and Asian bone models (left and right) from the database were used for the preliminary optimization process and validation of the fibula plate. The implant was constructed to fit bilaterally in a lateral position on the fibula. Then a biomechanical comparison of the designed implant to the current gold standard in the treatment of distal fibular fractures (locking 1/3 tubular plate) was conducted. Finally, a clinical surveillance study to evaluate the grade of implant fit achieved was performed. The results showed that with a virtual anatomic database it was possible to design a fibula plate with an optimized fit for a large proportion of the population. Biomechanical testing showed the novel fibula plate to be superior to 1/3 tubular plates in 4-point bending tests. The clinical application showed a very high degree of primary implant fit; only in a small minority of cases was further intra-operative implant bending necessary. Therefore, the goal of developing an implant for the treatment of distal fibular fractures based on the evidence of a large anatomical database could be attained. Biomechanical testing showed good results regarding stability, and the clinical application confirmed the high grade of anatomical fit.
Abstract:
Indoor radon is regularly measured in Switzerland. However, a nationwide model to predict residential radon levels has not been developed. The aim of this study was to develop a prediction model to assess indoor radon concentrations in Switzerland. The model was based on 44,631 measurements from the nationwide Swiss radon database collected between 1994 and 2004. Of these, 80% randomly selected measurements were used for model development and the remaining 20% for an independent model validation. A multivariable log-linear regression model was fitted and relevant predictors selected according to evidence from the literature, the adjusted R², the Akaike information criterion (AIC), and the Bayesian information criterion (BIC). The prediction model was evaluated by calculating the Spearman rank correlation between measured and predicted values. Additionally, the predicted values were categorised into three categories (<50th, 50th-90th and >90th percentile) and compared with the measured categories using a weighted Kappa statistic. The most relevant predictors for indoor radon levels were tectonic units and year of construction of the building, followed by soil texture, degree of urbanisation, floor of the building where the measurement was taken, and housing type (P-values <0.001 for all). Mean predicted radon values (geometric mean) were 66 Bq/m³ (interquartile range 40-111 Bq/m³) in the lowest exposure category, 126 Bq/m³ (69-215 Bq/m³) in the medium category, and 219 Bq/m³ (108-427 Bq/m³) in the highest category. The Spearman correlation between predictions and measurements was 0.45 (95%-CI: 0.44; 0.46) for the development dataset and 0.44 (95%-CI: 0.42; 0.46) for the validation dataset. Kappa coefficients were 0.31 for the development and 0.30 for the validation dataset, respectively. The model explained 20% of the overall variability (adjusted R²).
In conclusion, this residential radon prediction model, based on a large number of measurements, was demonstrated to be robust through validation with an independent dataset. The model is appropriate for predicting radon level exposure of the Swiss population in epidemiological research. Nevertheless, some exposure misclassification and regression to the mean is unavoidable and should be taken into account in future applications of the model.
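The core of the abstract's validation strategy (fit a log-linear model on an 80% development split, then check the Spearman rank correlation between predictions and held-out measurements) can be sketched compactly. The data and coefficients below are synthetic stand-ins, not the Swiss radon database or the study's fitted model.

```python
# Sketch of a log-linear radon prediction model with hold-out validation.
# Predictors, coefficients, and sample size are invented for illustration.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
n = 5000
tectonic = rng.integers(0, 3, n)                  # stand-in categorical predictor
year_built = rng.integers(1900, 2005, n)
X = np.column_stack([np.ones(n), tectonic, (year_built - 1950) / 50])
log_radon = X @ np.array([4.5, 0.4, -0.3]) + rng.normal(0, 0.8, n)

split = int(0.8 * n)                              # 80% development, 20% validation
beta, *_ = np.linalg.lstsq(X[:split], log_radon[:split], rcond=None)
pred = X[split:] @ beta                           # predictions on held-out data

rho, _ = spearmanr(np.exp(pred), np.exp(log_radon[split:]))
```

Because Spearman correlation is rank-based, it is identical whether computed on the log scale or on back-transformed Bq/m³ values, which is convenient for log-linear exposure models.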
Abstract:
The notion of outsourcing - making arrangements with an external entity for the provision of goods or services to supplement or replace internal efforts - has been around for centuries. The outsourcing of information systems (IS) is however a much newer concept, but one which has been growing dramatically. This book attempts to synthesize what is known about IS outsourcing by dividing the subject into three interrelated parts: (1) Traditional Information Technology Outsourcing, (2) Information Technology Offshoring, and (3) Business Process Outsourcing. The book should be of interest to all academics and students in the field of Information Systems as well as corporate executives and professionals who seek a more profound analysis and understanding of the underlying factors and mechanisms of outsourcing.
Abstract:
The aim of our study was to develop a modeling framework suitable to quantify the incidence, absolute number and economic impact of osteoporosis-attributable hip, vertebral and distal forearm fractures, with a particular focus on change over time, and with application to the situation in Switzerland from 2000 to 2020. A Markov process model was developed and analyzed by Monte Carlo simulation. A demographic scenario provided by the Swiss Federal Statistical Office and various Swiss and international data sources were used as model inputs. Demographic and epidemiologic input parameters were reproduced correctly, confirming the internal validity of the model. The proportion of the Swiss population aged 50 years or over will rise from 33.3% in 2000 to 41.3% in 2020. At the total population level, osteoporosis-attributable incidence will rise from 1.16 to 1.54 per 1,000 person-years in the case of hip fracture, from 3.28 to 4.18 per 1,000 person-years in the case of radiographic vertebral fracture, and from 0.59 to 0.70 per 1,000 person-years in the case of distal forearm fracture. Osteoporosis-attributable hip fracture numbers will rise from 8,375 to 11,353, vertebral fracture numbers will rise from 23,584 to 30,883, and distal forearm fracture numbers will rise from 4,209 to 5,186. Population-level osteoporosis-related direct medical inpatient costs per year will rise from 713.4 million Swiss francs (CHF) to CHF946.2 million. These figures correspond to 1.6% and 2.2% of Swiss health care expenditures in 2000. The modeling framework described can be applied to a wide variety of settings. It can be used to assess the impact of new prevention, diagnostic and treatment strategies. In Switzerland incidences of osteoporotic hip, vertebral and distal forearm fracture will rise by 33%, 27%, and 19%, respectively, between 2000 and 2020, if current prevention and treatment patterns are maintained. Corresponding absolute fracture numbers will rise by 36%, 31%, and 23%. 
Related direct medical inpatient costs are predicted to increase by 33%; however, this estimate is subject to uncertainty due to limited availability of input data.
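The modeling framework described above (a Markov process model analyzed by Monte Carlo simulation) can be illustrated with a minimal cohort simulation. The states and annual transition probabilities below are invented for the sketch and do not come from the Swiss study; person-years are approximated by the full cohort-years product, ignoring censoring by death, which a real analysis would handle properly.

```python
# Minimal Markov cohort model run by Monte Carlo; transition matrix is
# illustrative only, not the study's calibrated inputs.
import numpy as np

rng = np.random.default_rng(42)
# States: 0 = healthy, 1 = post hip fracture, 2 = dead (absorbing)
P = np.array([
    [0.985, 0.005, 0.010],   # healthy  -> healthy / fracture / dead
    [0.000, 0.950, 0.050],   # fracture -> fracture / dead
    [0.000, 0.000, 1.000],   # dead is absorbing
])

n_people, n_years = 10_000, 20
state = np.zeros(n_people, dtype=int)     # everyone starts healthy
fractures = 0
for _ in range(n_years):
    u = rng.random(n_people)
    cum = P[state].cumsum(axis=1)         # per-person cumulative probabilities
    new_state = (u[:, None] < cum).argmax(axis=1)
    fractures += np.sum((state == 0) & (new_state == 1))  # incident fractures
    state = new_state

incidence_per_1000py = 1000 * fractures / (n_people * n_years)
```

Running such a model under two demographic scenarios (e.g. 2000 vs. 2020 age structure) and attaching unit costs to fracture events yields incidence, absolute fracture numbers, and direct cost projections of the kind the abstract reports.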
Abstract:
A dynamic environment requires the continual adaptation of (intra-)logistics processes to maintain companies' performance and competitiveness. The standardisation of processes and cross-company process models are regarded as key factors for efficient business process management, especially in networks. In practice, however, scientifically grounded and detailed reference process models for (intra-)logistics have been lacking. The research and development of a reference process model, together with the prototypical realisation of a process workbench for the generic creation of adaptable process chains, contributes to process standardisation in logistics. The solution path taken is presented below: it consisted, first, of the development of a metamodel for reference modelling; second, of the empirical demonstration that a reference process model can be generated from "atomic" elements; and third, of the evaluation of the model.
Abstract:
The Business and Information Technologies (BIT) project strives to reveal new insights into how modern IT impacts organizational structures and business practices using empirical methods. Due to its international scope, it allows for inter-country comparison of empirical results. Germany — represented by the European School of Management and Technologies (ESMT) and the Institute of Information Systems at Humboldt-Universität zu Berlin — joined the BIT project in 2006. This report presents the results of the first survey conducted in Germany during November–December 2006. The key results are as follows: • The most widely adopted technologies and systems in Germany are websites, wireless hardware and software, groupware/productivity tools, and enterprise resource planning (ERP) systems. The biggest potential for growth exists for collaboration and portal tools, content management systems, business process modelling, and business intelligence applications. A number of technological solutions have not yet been adopted by many organizations but also bear some potential, in particular identity management solutions, Radio Frequency Identification (RFID), biometrics, and third-party authentication and verification. • IT security remains at the top of the agenda for most enterprises: budgets have been increasing over the last three years. • The workplace and work requirements are changing. IT is used to monitor employees' performance in Germany, but less heavily compared to the United States (Karmarkar and Mangal, 2007). The demand for IT skills is increasing at all corporate levels. Executives are asking for more and better structured information and this, in turn, triggers the appearance of new decision-making tools and online technologies on the market.
• The internal reorganization of companies in Germany is underway: organizations are becoming flatter, even though the trend is not as pronounced as in the United States (Karmarkar and Mangal, 2007), and the geographical scope of their operations is increasing. Modern IT plays an important role in enabling this development, e.g. telecommuting, teleconferencing, and other web-based collaboration formats are becoming increasingly popular in the corporate context. • The degree to which outsourcing is being pursued is quite limited, with little change expected. IT services, payroll, and market research are the most widely outsourced business functions. This corresponds to the results from other countries. • Up to now, the adoption of e-business technologies has had a rather limited effect on marketing functions. Companies tend to extract synergies from traditional printed media and online advertising. • The adoption of e-business has not yet had a major impact on marketing capabilities and strategy. Traditional methods of customer segmentation are still dominant. The corporate identity of most organizations does not change significantly when going online. • Online sales channels are mainly viewed as a complement to traditional distribution means. • Technology adoption has caused production and organizational costs to decrease. However, the costs of technology acquisition and maintenance, as well as consultancy and internal communication costs, have increased.
Abstract:
IT has turned out to be a key factor for the purposes of gaining maturity in Business Process Management (BPM). This book presents a worldwide investigation that was conducted among companies from the ‘Forbes Global 2000’ list to explore the current usage of software throughout the BPM life cycle and to identify the companies’ requirements concerning process modelling. The responses from 130 companies indicate that, at the present time, it is mainly software for process description and analysis that is required, while process execution is supported by general software such as databases, ERP systems and office tools. The resulting complex system landscapes give rise to distinct requirements for BPM software, while the process modelling requirements can be equally satisfied by the most common languages (BPMN, UML, EPC).
Explaining Emergence and Consequences of Specific Formal Controls in IS Outsourcing – A Process-View
Abstract:
IS outsourcing projects often fail to achieve project goals. To prevent such failure, managers need to design formal controls that are tailored to the specific contextual demands. However, the dynamic and uncertain nature of IS outsourcing projects makes the design of such specific formal controls at the outset of a project challenging. Hence, the process of translating high-level project goals into specific formal controls becomes crucial for the success or failure of IS outsourcing projects. Based on a comparative case study of four IS outsourcing projects, our study enhances current understanding of such translation processes and their consequences by developing a process model that explains the success or failure to achieve high-level project goals as an outcome of two unique translation patterns. This novel process-based explanation for how and why IS outsourcing projects succeed or fail has important implications for control theory and the IS project escalation literature.
Abstract:
Information systems (IS) outsourcing projects often fail to achieve initial goals. To avoid project failure, managers need to design formal controls that meet the specific contextual demands of the project. However, the dynamic and uncertain nature of IS outsourcing projects makes it difficult to design such specific formal controls at the outset of a project. It is hence crucial to translate high-level project goals into specific formal controls during the course of a project. This study seeks to understand the underlying patterns of such translation processes. Based on a comparative case study of four outsourced software development projects, we inductively develop a process model that consists of three unique patterns. The process model shows that the performance implications of emergent controls with higher specificity depend on differences in the translation process. Specific formal controls have positive implications for goal achievement if only the stakeholder context is adapted, but negative implications if tasks are unintentionally adapted during the translation process. In the latter case, projects incrementally drift away from their initial direction. Our findings help to better understand control dynamics in IS outsourcing projects. We contribute to a process-theoretic understanding of IS outsourcing governance and derive implications for control theory and the IS project escalation literature.
Abstract:
Appropriate field data are required to check the reliability of hydrodynamic models simulating the dispersion of soluble substances in the marine environment. This study deals with the collection of physical measurements and soluble tracer data intended specifically for this kind of validation. The intensity of currents, as well as the complexity of topography and tides around the Cap de La Hague in the center of the English Channel, makes it one of the most difficult areas to represent in terms of hydrodynamics and dispersion. Controlled releases of tritium - in the form of HTO - are carried out in this area by the AREVA-NC plant, providing an excellent soluble tracer. A total of 14,493 measurements were acquired to track dispersion in the hours and days following a release. These data, supplementing previously gathered data and physical measurements (bathymetry, water-surface levels, Eulerian and Lagrangian current studies), allow us to test dispersion models from the hour following release to periods of several years, timescales that are not accessible with dye experiments. The dispersion characteristics are described and methods are proposed for comparing models against measurements. An application is proposed for a two-dimensional, high-resolution numerical model. It shows how an extensive dataset can be used to build, calibrate and validate several aspects of the model in a highly dynamic and macrotidal area: tidal cycle timing, tidal amplitude, fixed-point current data, hodographs. This study presents results concerning the model's ability to reproduce residual Lagrangian currents, along with a comparison between simulation and high-frequency measurements of tracer dispersion. Physical and tracer data are available from the SISMER database of IFREMER (www.ifremer.fr/sismer/catal). This tool for the validation of models in macrotidal seas is intended to be an open and evolving resource, which could provide a benchmark for dispersion model validation.
Abstract:
This paper develops a micro-simulation framework for multinational entry and sales activities across countries. The model is based on Eaton, Kortum, and Kramarz's (2010) quantitative trade model adapted towards multinational production. Using micro data on Japanese manufacturing firms, we first stylize the empirical regularities of multinational entry and sales activity and estimate the model's structural parameters with simulated method of moments. We then demonstrate that our adapted model is able to replicate important dimensions of the in-sample moments conditioned in our estimation strategy. Importantly, it is able to replicate activity under an economic period with a far different level of FDI barriers than was conditioned upon in our estimation sample. Overall, our research highlights the richness of the simulation framework for performing counterfactual analysis of various FDI policies.
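The estimation strategy named in this abstract, the simulated method of moments (SMM), can be shown in a deliberately tiny form: pick a parameter, simulate the model, and minimize the distance between simulated and observed moments. Everything below (the lognormal "sales" process, the moments, the parameter) is a toy stand-in, far simpler than the paper's multi-country entry model.

```python
# Toy simulated-method-of-moments estimation with an identity weighting
# matrix; the data-generating process and moments are invented stand-ins.
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(7)
data = rng.lognormal(mean=1.0, sigma=0.5, size=2000)        # "observed" firm sales
data_moments = np.array([data.mean(), (data > 5).mean()])   # mean sales, share of large firms

def smm_objective(mu, n_sim=20_000):
    sim_rng = np.random.default_rng(0)          # fixed simulation draws across evaluations
    sim = sim_rng.lognormal(mean=mu, sigma=0.5, size=n_sim)
    sim_moments = np.array([sim.mean(), (sim > 5).mean()])
    diff = sim_moments - data_moments
    return diff @ diff                           # quadratic distance, identity weights

res = minimize_scalar(smm_objective, bounds=(0.0, 2.0), method="bounded")
mu_hat = res.x                                   # should recover a value near 1.0
```

Holding the simulation draws fixed across objective evaluations, as above, keeps the objective smooth in the parameter, a standard practical requirement for SMM optimizers.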