882 results for Generalization of Ehrenfest's urn Model
Abstract:
Design Science Research (DSR) has emerged as an important approach in Information Systems (IS) research. However, DSR is still in its genesis and has yet to achieve consensus on even the fundamentals, such as what methodology or approach to use for DSR. While there has been much effort to establish DSR methodologies, a complete, holistic and validated approach for the conduct of DSR to guide IS researchers (especially novice researchers) is yet to be established. Alturki et al. (2011) present a DSR ‘Roadmap’, making the claim that it is a complete and comprehensive guide for conducting DSR. This paper aims to further assess this Roadmap by positioning it against the ‘Idealized Model for Theory Development’ (IM4TD) (Fischer & Gregor 2011). The IM4TD highlights the role of discovery and justification and forms of reasoning in progressing theory development. Fischer and Gregor (2011) have applied IM4TD’s hypothetico-deductive method to analyze DSR methodologies, and that method is adopted in this study to deductively validate the Alturki et al. (2011) Roadmap. The results suggest that the Roadmap adheres to the IM4TD, is reasonably complete, overcomes most shortcomings identified in other DSR methodologies, and also highlights valuable refinements that should be considered within the IM4TD.
Abstract:
Business process management (BPM) is becoming the dominant management paradigm. Business process modelling is central to BPM, and the resultant business process model is the core artefact guiding subsequent process change. Thus, model quality is at the centre, mediating between the modelling effort and the related growing investment in ultimate process improvements. Nonetheless, though research interest in the properties that differentiate high-quality process models is longstanding, there have been no past reports of a valid, operationalised, holistic measure of business process model quality. In response to this gap, this paper reports validation of a Business Process Model Quality measurement model, conceptualised as a single-order, formative index. Such a measurement model has value as the dependent variable in rigorously researching the drivers of model quality; as an antecedent of ultimate process improvements; and potentially as an economical comparator and diagnostic for practice.
Abstract:
Chatrooms, for example Internet Relay Chat, are generally multi-user, multi-channel and multi-server chat systems which run over the Internet and provide a protocol for real-time text-based conferencing between users all over the world. While a well-trained human observer is able to understand who is chatting with whom, there are no efficient and accurate automated tools to determine the groups of users conversing with each other. A precursor to analysing evolving cyber-social phenomena is to first determine what the conversations are and which groups of chatters are involved in each conversation. We consider this problem in this paper. We propose an algorithm to discover all groups of users that are engaged in conversation. Our algorithm is based on a statistical model of a chatroom that is founded on our experience with real chatrooms. Our approach does not require any semantic analysis of the conversations; rather, it is based purely on the statistical information contained in the sequence of posts. We improve the accuracy by applying some graph algorithms to clean the statistical information. We present some experimental results which indicate that one can automatically determine the conversing groups in a chatroom purely on the basis of statistical analysis.
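The idea of grouping chatters purely from the timing of their posts, then cleaning the resulting graph, can be sketched as follows. This is an illustrative simplification, not the authors' actual algorithm: post pairs within a `window` of seconds accumulate edge weight between users, weak edges are pruned (the "cleaning" step), and connected components are read off as conversing groups. The function name and parameters are hypothetical.

```python
from collections import defaultdict

def conversation_groups(posts, window=30.0, min_weight=2):
    """Group chatters by temporal proximity of their posts.

    posts: list of (timestamp, user) tuples, sorted by time.
    Users whose posts repeatedly fall within `window` seconds of each
    other accumulate edge weight; weak edges are pruned and the
    remaining connected components are returned as groups.
    """
    weight = defaultdict(int)
    for i, (t_i, u_i) in enumerate(posts):
        for t_j, u_j in posts[i + 1:]:
            if t_j - t_i > window:
                break            # posts are sorted, so stop early
            if u_i != u_j:
                weight[frozenset((u_i, u_j))] += 1

    # "Clean" the statistical information: keep only strong edges.
    adj = defaultdict(set)
    for pair, w in weight.items():
        if w >= min_weight:
            a, b = pair
            adj[a].add(b)
            adj[b].add(a)

    # Connected components of the cleaned graph = conversing groups.
    seen, groups = set(), []
    for user in {u for _, u in posts}:
        if user in seen or user not in adj:
            continue
        stack, comp = [user], set()
        while stack:
            u = stack.pop()
            if u in comp:
                continue
            comp.add(u)
            stack.extend(adj[u] - comp)
        seen |= comp
        groups.append(comp)
    return groups
```

Two interleaved conversations separated in time would then fall apart into two components, with no inspection of message content.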
Abstract:
Catalytic CO2 reforming of biomass tar on palygorskite-supported nickel catalysts, using toluene as a model compound of biomass tar, was investigated. The experiments were performed in a bench-scale installation with a fixed-bed reactor. All experiments were carried out at 650, 750, and 800 °C and atmospheric pressure. The effect of Ni loading, reaction temperature and concentration of CO2 on H2 yield and carbon deposit was investigated. Ni/Palygorskite (Ni/PG) catalysts with Ni/PG ratios of 0%, 2%, 5% and 8% were tested; the last two showed the best performance. H2 yield and carbon deposit diminished with the increase of reaction temperature, Ni loading, and CO2 concentration.
Abstract:
The need for a house rental model in Townsville, Australia, is addressed. Models developed for predicting house rental levels are described. An analytical model is built upon a priori selected variables and parameters of rental levels. Regression models are generated to provide a comparison to the analytical model. Issues in model development and performance evaluation are discussed. A comparison of the models indicates that the analytical model performs better than the regression models.
Abstract:
Recent years have seen an increased uptake of business process management technology in industry. This has resulted in organizations trying to manage large collections of business process models. One of the challenges facing these organizations concerns the retrieval of models from large business process model repositories. For example, in some cases new process models may be derived from existing models, and finding and adapting these models may be more effective and less error-prone than developing them from scratch. Since process model repositories may be large, query evaluation may be time consuming. Hence, we investigate the use of indexes to speed up this evaluation process. To make our approach more applicable, we consider the semantic similarity between labels. Experiments are conducted to demonstrate that our approach is efficient.
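An index that tolerates label variation can be sketched as follows. This is a minimal illustration of the idea, not the paper's technique: labels are canonicalised through a hypothetical synonym table (standing in for a real semantic similarity measure) and an inverted index maps canonical labels to the models containing them, so a query avoids scanning the whole repository.

```python
from collections import defaultdict

# Hypothetical synonym table standing in for a semantic similarity measure.
SYNONYMS = {"purchase": "buy", "acquire": "buy"}

def canon(label):
    """Canonical form of a task label: lowercase, synonym-mapped, word-order free."""
    return tuple(sorted(SYNONYMS.get(w, w) for w in label.lower().split()))

def build_index(models):
    """models: dict model_id -> list of task labels. Returns an inverted index."""
    index = defaultdict(set)
    for mid, labels in models.items():
        for label in labels:
            index[canon(label)].add(mid)
    return index

def query(index, labels):
    """Return ids of models containing all query labels (up to synonyms)."""
    result = None
    for label in labels:
        hits = index.get(canon(label), set())
        result = hits if result is None else result & hits
    return result or set()
```

A query for "acquire goods" would then also hit models labelled "Purchase goods" or "Buy goods", without any per-query pairwise comparison of models.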
Abstract:
A multi-segment foot model was used to develop an accurate and reliable kinematic model to describe in-shoe foot kinematics during gait.
Abstract:
Discretization of a geographical region is quite common in spatial analysis. There have been few studies into the impact of different geographical scales on the outcome of spatial models for different spatial patterns. This study aims to investigate the impact of spatial scales and spatial smoothing on the outcomes of modelling spatial point-based data. Given a spatial point-based dataset (such as occurrence of a disease), we study the geographical variation of residual disease risk using regular grid cells. The individual disease risk is modelled using a logistic model with the inclusion of spatially unstructured and/or spatially structured random effects. Three spatial smoothness priors for the spatially structured component are employed in modelling, namely an intrinsic Gaussian Markov random field, a second-order random walk on a lattice, and a Gaussian field with a Matérn correlation function. We investigate how changes in grid cell size affect model outcomes under different spatial structures and different smoothness priors for the spatial component. A realistic example (the Humberside data) is analysed and a simulation study is described. Bayesian computation is carried out using an integrated nested Laplace approximation. The results suggest that the performance and predictive capacity of the spatial models improve as the grid cell size decreases for certain spatial structures. It also appears that different spatial smoothness priors should be applied for different patterns of point data.
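The logistic model with the two random-effect components can be written compactly; this is a sketch in standard disease-mapping notation (the symbols are assumptions, not taken from the paper). With $y_i$ the disease indicator for individual $i$ located in grid cell $c(i)$,

$$\operatorname{logit}\, p_i \;=\; \mathbf{x}_i^{\top}\boldsymbol{\beta} \;+\; u_{c(i)} \;+\; v_{c(i)},$$

where $u_c \sim N(0, \sigma_u^2)$ i.i.d. is the spatially unstructured effect and $v_c$ is the spatially structured effect. Under the intrinsic Gaussian Markov random field (ICAR) prior, for example, each cell is smoothed toward the mean of its $n_c$ grid neighbours:

$$v_c \mid v_{-c} \;\sim\; N\!\Big(\tfrac{1}{n_c}\sum_{j \sim c} v_j,\; \tfrac{\sigma_v^2}{n_c}\Big).$$

Shrinking the grid cell size changes both the number of cells and each cell's neighbourhood structure, which is why the choice of scale interacts with the smoothness prior.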
Abstract:
Stagnation-point total heat transfer was measured on a 1:27.7 model of the Flight Investigation of Reentry Environment II flight vehicle. Experiments were performed in the X1 expansion tube at an equivalent flight velocity and static enthalpy of 11 km/s and 12.7 MJ/kg, respectively. Conditions were chosen to replicate the flight condition at a total flight time of 1639.5 s, where radiation contributed an estimated 17-36% of the total heat transfer. This contribution is theorized to reduce to <2% in the scaled experiments, and the heating environment on the test model was expected to be dominated by convection. A correlation between reported flight heating rates and expected experimental heating, referred to as the reduced flight value, was developed to predict the level of heating expected on the test model. At the given flow conditions, the reduced flight value was calculated to be 150 MW/m2. Average stagnation-point total heat transfer was measured to be 140 MW/m2 ± 7%, showing good agreement with the predicted value. Experimentally measured heat transfer was found to agree to within 5-15% with a number of convective heating correlations, confirming that convection dominates the tunnel heating environment, and that useful experimental measurements could be made in weakly coupled radiating flow.
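The claimed agreement can be checked directly from the two quoted values: the measured heating differs from the reduced flight value by

$$\frac{\lvert 150 - 140 \rvert}{150} \;\approx\; 6.7\%,$$

which sits within the stated ±7% experimental uncertainty on the measurement, consistent with the paper's conclusion of good agreement.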
Abstract:
A long query provides more useful hints for searching relevant documents, but it is likely to introduce noise which affects retrieval performance. To mitigate this adverse effect, it is important to reduce noisy terms and to introduce and boost additional relevant terms. This paper presents a comprehensive framework, called the Aspect Hidden Markov Model (AHMM), which integrates query reduction and expansion for retrieval with long queries. It optimizes the probability distribution of query terms by utilizing intra-query term dependencies as well as the relationships between query terms and words observed in relevance feedback documents. Empirical evaluation on three large-scale TREC collections demonstrates that our approach, which is automatic, achieves salient improvements over various strong baselines, and also reaches a performance comparable to a state-of-the-art method based on the user’s interactive query term reduction and expansion.
Abstract:
This thesis presents novel techniques for addressing the problems of continuous change and inconsistencies in large process model collections. The developed techniques treat process models as a collection of fragments and facilitate version control, standardization and automated process model discovery using fragment-based concepts. Experimental results show that the presented techniques are beneficial in consolidating large process model collections, specifically when there is a high degree of redundancy.
Abstract:
A business process is often modeled using some kind of a directed flow graph, which we call a workflow graph. The Refined Process Structure Tree (RPST) is a technique for workflow graph parsing, i.e., for discovering the structure of a workflow graph, which has various applications. In this paper, we provide two improvements to the RPST. First, we propose an alternative way to compute the RPST that is simpler than the one developed originally. In particular, the computation reduces to constructing the tree of the triconnected components of a workflow graph in the special case when every node has at most one incoming or at most one outgoing edge. Such graphs occur frequently in applications. Second, we extend the applicability of the RPST. Originally, the RPST was applicable only to graphs with a single source and single sink such that the completed version of the graph is biconnected. We lift both restrictions. The RPST is then applicable to arbitrary directed graphs in which every node is on a path from some source to some sink. This includes graphs with multiple sources and/or sinks and disconnected graphs.
Abstract:
The ineffectiveness of current design processes has been well studied and has resulted in widespread calls for the evolution and development of new management processes. Even following the advent of BIM, we continue to move from one stage to another without necessarily having resolved all the issues. CAD design technology, if well handled, could have significantly raised the level of quality and efficiency of current processes, but in practice this was not fully realized. Therefore, technology alone cannot solve all the problems, and the advent of BIM could result in a similar bottleneck. For a precise definition of the problem to be solved, we should start by identifying the main current bottlenecks that have yet to be overcome by either new technologies or management processes, and the impact of human behaviour-related issues on the adoption and utilization of new technologies. The fragmented and dispersed nature of the AEC sector, and the huge number of small organizations that comprise it, are a major limiting factor. Several authors have addressed this issue, and more recently IDDS has been defined as the highest level of achievement. However, what is written on IDDS describes a highly idealized state to be achieved; it is a holistic, utopian proposition intended to create the research agenda for moving towards that state. Key to IDDS is the framing of a new management model which should address the problems associated with key aspects: technology, processes, policies and people. One of the primary areas to be further studied is the process of collaborative work and understanding, together with the development of proposals to overcome the many cultural barriers that currently exist and impede the advance of new management methods. The purpose of this paper is to define and delimit the problems to be solved so that it is possible to implement a new management model for a collaborative design process.
Abstract:
This paper presents an object-oriented world model for the road traffic environment of autonomous (driver-less) city vehicles. The developed World Model is a software component of the autonomous vehicle's control system, which represents the vehicle's view of its road environment. Regardless of whether the information is a priori known, obtained through on-board sensors, or obtained through communication, the World Model stores and updates information in real time, notifies the decision-making subsystem about relevant events, and provides access to its stored information. The design is based on software design patterns, and its application programming interface provides both asynchronous and synchronous access to its information. Results from both a 3D simulation and real-world experiments show that the approach is applicable and real-time capable.
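The combination of asynchronous (push) and synchronous (pull) access described above is commonly realized with the observer pattern; a minimal sketch follows. This is an illustration of the pattern, not the paper's actual API: the class name, methods and the simple dict-based store are all hypothetical.

```python
import threading

class WorldModel:
    """Illustrative thread-safe world model: stores road-environment
    objects, updates them, and notifies subscribers about events."""

    def __init__(self):
        self._lock = threading.Lock()
        self._objects = {}      # object_id -> state dict
        self._listeners = []    # callbacks for push-style notification

    def subscribe(self, callback):
        """Register a decision-subsystem callback (asynchronous access)."""
        self._listeners.append(callback)

    def update(self, object_id, state):
        """Store/refresh an object's state and push the event to listeners."""
        with self._lock:
            self._objects[object_id] = state
        for cb in self._listeners:
            cb(object_id, state)

    def get(self, object_id):
        """Synchronous pull of the latest known state."""
        with self._lock:
            return self._objects.get(object_id)
```

The decision-making subsystem can thus react to events as they arrive while still being able to query the current state on demand, whichever suits each control task.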