970 results for model complexity


Relevance: 30.00%

Abstract:

Real-world business process models may consist of hundreds of elements and have sophisticated structure. Although there are tasks for which such models are valuable and appreciated, complexity generally has a negative influence on model comprehension and analysis. Thus, means for managing the complexity of process models are needed. One approach is abstraction of business process models: the creation of a process model that preserves the main features of the initial elaborate process model but leaves out insignificant details. In this paper we study the structural aspects of process model abstraction and introduce an abstraction approach based on process structure trees (PSTs). The developed approach ensures that the abstracted process model preserves the ordering constraints of the initial model. It surpasses pattern-based process model abstraction approaches in that it can handle graph-structured process models of arbitrary structure. We also provide an evaluation of the proposed approach.
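
To make the structural idea concrete, the sketch below shows tree-based abstraction on a toy process structure tree. The node kinds, significance scores and collapse rule are our own illustrative assumptions, not the paper's algorithm; the point is that collapsing a whole subtree into one aggregate task cannot reorder anything outside that subtree, which is why tree-based abstraction preserves ordering constraints.

```python
# Toy process structure tree (PST): leaves are tasks, internal nodes are
# SEQ (sequential) or PAR (parallel) blocks. Collapsing a whole subtree
# into a single aggregate task cannot reorder anything outside it, which
# is the intuition behind order-preserving, tree-based abstraction.
# Node kinds, significance scores and the threshold rule are our own
# illustrative assumptions, not the paper's algorithm.
from dataclasses import dataclass, field

@dataclass
class Node:
    kind: str                      # "TASK", "SEQ", or "PAR"
    label: str = ""
    significance: float = 1.0      # assumed per-subtree importance score
    children: list = field(default_factory=list)

def abstract(node: Node, threshold: float) -> Node:
    """Collapse every non-task subtree whose significance is below threshold."""
    if node.kind != "TASK" and node.significance < threshold:
        return Node("TASK", label=f"[{node.label}]",
                    significance=node.significance)
    node.children = [abstract(c, threshold) for c in node.children]
    return node

pst = Node("SEQ", "order handling", 1.0, [
    Node("TASK", "receive order"),
    Node("PAR", "checks", 0.3, [Node("TASK", "check stock"),
                                Node("TASK", "check credit")]),
    Node("TASK", "ship goods"),
])
print(abstract(pst, 0.5))  # the low-significance PAR block becomes one task
```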

Relevance: 30.00%

Abstract:

The design and development of process-aware information systems is often supported by specifying requirements as business process models. Although this approach is generally accepted as an effective strategy, it remains a fundamental challenge to adequately validate these models given the diverging skill sets of domain experts and system analysts. As domain experts often do not feel confident in judging the correctness and completeness of process models that system analysts create, the validation often has to fall back on a discourse in natural language. In order to support such a discourse appropriately, so-called verbalization techniques have been defined for different types of conceptual models. However, there is currently no sophisticated technique available that is capable of generating natural-looking text from process models. In this paper, we address this research gap and propose a technique for generating natural language texts from business process models. A comparison with manually created process descriptions demonstrates that the generated texts are superior in terms of completeness, structure, and linguistic complexity. An evaluation with users further demonstrates that the texts are very understandable and effectively allow the reader to infer the process model semantics. Hence, the generated texts represent a useful input for process model validation.
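
The authors' technique is linguistically far richer, but a toy template-based verbalizer conveys the basic input/output relationship. The process model and connector templates below are invented for illustration and are not the paper's method:

```python
# Toy template-based verbalization of a linear process model. This is NOT
# the paper's technique (which produces natural-looking text from real
# process models); the model and connector templates are invented to show
# the input/output relationship only: model in, text out.
process = [
    ("the clerk", "receives the order"),
    ("the system", "checks the stock level"),
    ("the warehouse", "ships the goods"),
]
connectors = ["First,", "Then,", "Finally,"]

text = " ".join(f"{c} {actor} {activity}."
                for c, (actor, activity) in zip(connectors, process))
print(text)  # First, the clerk receives the order. Then, ...
```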

Relevance: 30.00%

Abstract:

Design Science is the process of solving ‘wicked problems’ through designing, developing, instantiating, and evaluating novel solutions (Hevner, March, Park and Ram, 2004). Wicked problems are characterised by agent finitude in combination with problem complexity and normative constraint (Farrell and Hooker, 2013). In Information Systems Design Science, determining that problems are ‘wicked’ differentiates Design Science research (DSR) from Solutions Engineering (Winter, 2008) and is a necessary part of establishing the relevance of research to Information Systems Design Science (Hevner, 2007; Iivari, 2007). Problem complexity is characterised by many problem components with nested, dependent and co-dependent relationships interacting through multiple feedback and feed-forward loops. Farrell and Hooker (2013) specifically state that for wicked problems “it will often be impossible to disentangle the consequences of specific actions from those of other co-occurring interactions”. This paper discusses the application of an Enterprise Information Architecture modelling technique to disentangle wicked-problem complexity for one case. It proposes that such a modelling technique can be applied to other wicked problems, laying the foundations for establishing relevance to DSR, providing solution pathways for artefact development, and helping to substantiate the elements required to produce Design Theory.

Relevance: 30.00%

Abstract:

Purpose – Simple linear accounts of prescribing do not adequately address the reasons why doctors prescribe psychotropic medication to people with intellectual disability (ID). A greater understanding of the complex array of factors that influence the decision to prescribe is needed.

Design/methodology/approach – After considering a number of conceptual frameworks with the potential to improve understanding of psychotropic prescribing to adults with ID, an ecological model of prescribing was developed. A case study is used to outline how the model can provide greater understanding of prescribing processes.

Findings – The model aims to capture the complexity and multi-dimensional nature of community-based psychotropic prescribing to adults with ID; its utility is illustrated through the case study.

Research limitations/implications – The model is conceptual and as yet untested.

Practical implications – The model may provide utility for clinicians and researchers as they seek clarification of prescribing decisions.

Originality/value – The paper adds valuable insight into the factors influencing psychotropic prescribing to adults with ID. The ecological model of prescribing extends traditional analyses that focus on patient characteristics, introducing multi-level perspectives that may provide utility for clinicians and researchers.

Relevance: 30.00%

Abstract:

Analytically or computationally intractable likelihood functions can arise in complex statistical inference problems, making them inaccessible to standard Bayesian inferential methods. Approximate Bayesian computation (ABC) methods address such problems by replacing direct likelihood evaluations with repeated sampling from the model. ABC methods have predominantly been applied to parameter estimation problems and less often to model choice problems, owing to the added difficulty of handling multiple model spaces. The ABC algorithm proposed here addresses model choice by extending Fearnhead and Prangle (2012, Journal of the Royal Statistical Society, Series B, 74, 1–28), in which regression-based estimates of the posterior means of the model parameters form the summary statistics used in the discrepancy measure. An additional stepwise multinomial logistic regression is performed on the model indicator variable in the regression step, and the estimated model probabilities are incorporated into the set of summary statistics for model choice purposes. A reversible-jump Markov chain Monte Carlo step is also included in the algorithm to increase model diversity for thorough exploration of the model space. The algorithm was applied to a validation example to demonstrate its robustness across a wide range of true model probabilities. Its subsequent use in three pathogen transmission examples of varying complexity illustrates its utility in inferring the preferred transmission model for each pathogen.
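
For intuition, here is a minimal rejection-ABC model choice sketch. The two toy count models, priors, summary statistics and tolerance are our own assumptions, and the paper's regression-adjusted summaries, multinomial logistic regression and reversible-jump step are all omitted; the sketch only shows how posterior model probabilities emerge from simulation and acceptance counts.

```python
# Minimal rejection-ABC model choice sketch (illustrative assumptions
# throughout; this is plain rejection ABC, not the paper's algorithm).
import numpy as np

rng = np.random.default_rng(0)
y_obs = rng.poisson(4.0, size=50)              # stand-in for observed data
s_obs = np.array([y_obs.mean(), y_obs.var()])  # summary statistics

def simulate_summaries(model, rng):
    """Draw parameters from the prior of `model` and summarise one dataset."""
    if model == 0:                              # Poisson model
        lam = rng.gamma(2.0, 2.0)
        y = rng.poisson(lam, size=50)
    else:                                       # overdispersed alternative
        p = rng.beta(2.0, 2.0)
        y = rng.negative_binomial(5, p, size=50)
    return np.array([y.mean(), y.var()])

accepted = []
for _ in range(20000):
    m = int(rng.integers(2))                    # uniform prior over models
    if np.linalg.norm(simulate_summaries(m, rng) - s_obs) < 1.0:
        accepted.append(m)

counts = np.bincount(np.array(accepted, dtype=int), minlength=2)
print("approximate posterior model probabilities:",
      counts / max(len(accepted), 1))
```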

Relevance: 30.00%

Abstract:

Design deals with improving the lives of people. As such, interactions with products, interfaces, and systems should not only address usability and practical concerns but also mediate emotionally meaningful experiences. This paper presents an integrated and comprehensive model of experience, labeled the 'Unified User Experience Model', covering the most prominent perspectives from across the design field. It is intended to support designers from different disciplines in considering the complexity of user experience. The vision of the model is to support both the analysis of existing products, interfaces, and systems, and the development of new designs that take this complexity into account. In essence, we hope the model can enable designers to develop more marketable, appropriate, and enhanced products that improve experiences and ultimately the lives of people.

Relevance: 30.00%

Abstract:

Biomedical systems involve a large number of entities and intricate interactions between them. Their direct analysis is therefore difficult, and it is often necessary to rely on computational models. These models require significant resources, and parallel computing solutions are particularly well suited given the inherently parallel nature of biomedical systems. Model hybridisation also permits the integration and simultaneous study of multiple aspects and scales of these systems, thus providing an efficient platform for multidisciplinary research.

Relevance: 30.00%

Abstract:

Many researchers in the field of civil structural health monitoring have developed and tested their methods on simple to moderately complex laboratory structures such as beams, plates, frames, and trusses. Field work has also been conducted by many researchers and practitioners on more complex operating bridges. Most laboratory structures, however, do not adequately replicate the complexity of truss bridges. This paper presents preliminary results of experimental modal testing and analysis of the bridge model presented in the companion paper, using the peak-picking method, and compares these results with those of a simple numerical model of the structure. Three dominant modes of vibration were experimentally identified below 15 Hz. The mode shapes and the order of the modes matched those of the numerical model; the frequencies, however, did not.
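
As an illustration of peak picking, the sketch below identifies spectral peaks below 15 Hz in a synthetic response signal. The sampling rate, modal frequencies, damping and noise level are invented and unrelated to the companion paper's bridge model.

```python
# Peak-picking sketch: natural frequencies appear as peaks in the response
# spectrum. All signal parameters here are illustrative assumptions.
import numpy as np
from scipy.signal import find_peaks

fs = 200.0                                   # sampling rate (Hz), assumed
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(1)

# Synthetic free response with three damped modes below 15 Hz plus noise.
response = sum(np.exp(-0.2 * t) * np.cos(2 * np.pi * f * t)
               for f in (3.1, 7.4, 12.8))
response = response + 0.1 * rng.standard_normal(t.size)

spectrum = np.abs(np.fft.rfft(response))
freqs = np.fft.rfftfreq(t.size, 1 / fs)
band = freqs < 15.0                          # inspect only the band below 15 Hz
peaks, _ = find_peaks(spectrum[band], prominence=50)
print("picked natural frequencies (Hz):", freqs[band][peaks])
```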

Relevance: 30.00%

Abstract:

Purpose – Performance heterogeneity between collaborative infrastructure projects is typically examined by considering procurement systems and their governance mechanisms at static points in time. The literature neglects the impact of dynamic learning capability, which is thought to reconfigure governance mechanisms over time in response to evolving market conditions. This conceptual paper proposes a new model to show how continuous joint learning of participant organisations improves project performance.

Design/methodology/approach – The conceptual development proceeds in two stages. In the first stage, the management literature is analysed to explain the Standard Model of dynamic learning capability, which emphasises three learning phases for organisations. This Standard Model is then extended to derive a novel Circular Model of dynamic learning capability featuring a new feedback loop between performance and learning. In the second stage, the construction management literature is consulted, adding project lifecycle, stakeholder diversity and three organisational levels to the analysis, to arrive at the Collaborative Model of dynamic learning capability.

Findings – The Collaborative Model should enable construction organisations to adapt and perform successfully under changing market conditions. The complexity of learning cycles results in capabilities that are imperfectly imitable between organisations, which explains performance heterogeneity on projects.

Originality/value – The Collaborative Model provides a theoretically substantiated description of project performance, driven by the evolution of procurement systems and governance mechanisms. The Model’s empirical value will be tested in future research.

Relevance: 30.00%

Abstract:

A new structured model-following adaptive approach is presented in this paper to achieve large attitude maneuvers of rigid bodies. First, a nominal controller is designed using the dynamic inversion philosophy. Next, a neuro-adaptive design is proposed to augment the nominal design in order to ensure robust performance in the presence of parameter inaccuracies as well as unknown constant external disturbances. The structured approach proposed in this paper, in which kinematic and dynamic equations are handled separately, reduces the complexity of the controller structure. In simulation studies, the adaptive controller is found to be very effective in ensuring robust performance.
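
A minimal sketch of the dynamic-inversion idea behind the nominal controller follows. The inertia matrix and gain are assumed values, and the paper's neuro-adaptive augmentation and the kinematic (attitude) loop are omitted.

```python
# Dynamic inversion for the rotational dynamics J*w_dot = -w x (J w) + u:
# choosing u = w x (J w) + J*nu cancels the gyroscopic term so the closed
# loop reduces to w_dot = nu, and nu is then simple error feedback.
# Inertia and gain are assumed values for illustration only.
import numpy as np

J = np.diag([10.0, 12.0, 8.0])        # assumed inertia matrix (kg m^2)
K = 2.0 * np.eye(3)                   # assumed feedback gain

def dynamic_inversion_control(w, w_ref):
    nu = -K @ (w - w_ref)             # desired angular acceleration
    return np.cross(w, J @ w) + J @ nu

# One Euler integration step of the closed loop as a sanity check.
w, w_ref, dt = np.array([0.3, -0.2, 0.1]), np.zeros(3), 0.01
u = dynamic_inversion_control(w, w_ref)
w = w + dt * np.linalg.solve(J, -np.cross(w, J @ w) + u)
print("angular rate after one step:", w)   # decays toward w_ref
```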

Relevance: 30.00%

Abstract:

The parasitic weed Orobanche crenata inflicts major damage on faba bean, lentil, pea and other crops in Mediterranean environments. The development of methods to control O. crenata is to a large extent hampered by the complexity of host-parasite systems. Using a model of host-parasite interactions can help to explain and understand this intricacy. This paper reports on the evaluation and application of a model simulating host-parasite competition as affected by environment and management, implemented in the framework of the Agricultural Production Systems Simulator (APSIM). Model-predicted faba bean and O. crenata growth and development were evaluated against independent data. The APSIM-Fababean and APSIM-Parasite modules displayed a good capability to reproduce the effects of pedoclimatic conditions, faba bean sowing date and O. crenata infestation on host-parasite competition. The r² values throughout exceeded 0.84 (RMSD: 5.36 days) for phenological parameters, 0.85 (RMSD: 223.00 g m⁻²) for host growth and 0.78 (RMSD: 99.82 g m⁻²) for parasite growth parameters. Inaccuracies in simulated faba bean root growth that caused some bias in predicted parasite number and host yield loss may be addressed by simulating vertical root distribution more flexibly. The model was applied in simulation experiments to determine optimum sowing windows for infected and non-infected faba bean in Mediterranean environments. Simulation results proved realistic and testified to the capability of APSIM to contribute to the development of tactical approaches to parasitic weed control.
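
For reference, the two agreement measures reported above are typically computed from paired predictions and observations as below. The numbers are toy values, not the study's data, and r² is taken here as the squared correlation coefficient (some studies report the coefficient of determination instead).

```python
# RMSD and r^2 between model predictions and observations (toy values).
import numpy as np

obs = np.array([310.0, 450.0, 520.0, 610.0])    # e.g. host biomass, g/m^2
pred = np.array([295.0, 470.0, 500.0, 640.0])   # model predictions

rmsd = np.sqrt(np.mean((pred - obs) ** 2))
r2 = np.corrcoef(pred, obs)[0, 1] ** 2
print(f"RMSD = {rmsd:.2f} g/m^2, r^2 = {r2:.3f}")
```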

Relevance: 30.00%

Abstract:

Using an analysis-by-synthesis (AbS) approach, we develop a soft-decision-based switched vector quantization (VQ) method for high-quality and low-complexity coding of wideband speech line spectral frequency (LSF) parameters. For each switching region, a low-complexity transform-domain split VQ (TrSVQ) is designed. The overall rate-distortion (R/D) performance optimality of the new switched quantizer is addressed in a Gaussian mixture model (GMM) based parametric framework. In the AbS approach, the reduction of quantization complexity is achieved through the use of nearest-neighbor (NN) TrSVQs and by splitting the transform-domain vector into a higher number of subvectors. Compared to current LSF quantization methods, the new method is shown to provide a competitive or better trade-off between R/D performance and complexity.
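
A bare-bones split-VQ sketch illustrates the complexity lever mentioned above: splitting the vector shrinks each nearest-neighbor search. The dimensions and random codebooks are stand-ins, and the paper's transform, GMM framework and soft-decision switching are omitted.

```python
# Bare-bones split VQ: the input vector is partitioned into subvectors and
# each is quantized by nearest-neighbor search in its own small codebook,
# so search cost scales with the per-split codebook size rather than with
# one huge joint codebook. Codebooks here are random stand-ins, not trained.
import numpy as np

rng = np.random.default_rng(2)
dim, n_splits, cb_size = 10, 2, 16
codebooks = [rng.standard_normal((cb_size, dim // n_splits))
             for _ in range(n_splits)]

def split_vq(x):
    """Quantize each subvector to its nearest codebook entry (Euclidean)."""
    out = []
    for part, cb in zip(np.split(x, n_splits), codebooks):
        idx = np.argmin(np.sum((cb - part) ** 2, axis=1))
        out.append(cb[idx])
    return np.concatenate(out)

x = rng.standard_normal(dim)                 # stand-in for an LSF vector
print("quantization error:", np.linalg.norm(x - split_vq(x)))
```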

Relevance: 30.00%

Abstract:

Minimum Description Length (MDL) is an information-theoretic principle that can be used for model selection and other statistical inference tasks. There are various ways to apply the principle in practice. One theoretically valid way is to use the normalized maximum likelihood (NML) criterion, but due to computational difficulties this approach has not been used very often. This thesis presents efficient floating-point algorithms that make it possible to compute the NML for multinomial, Naive Bayes and Bayesian forest models. None of the presented algorithms rely on asymptotic analysis, and for the first two model classes we also discuss how to compute exact rational-number solutions.
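
As a flavour of such algorithms, the sketch below computes the multinomial NML using what we believe is the standard linear-time recurrence for the normalizing sum, C(K, n) = C(K-1, n) + n/(K-2) · C(K-2, n). Treat both the recurrence and the code as a hedged illustration rather than the thesis's exact floating-point algorithm.

```python
# Multinomial NML sketch via the linear-time recurrence for the normalizing
# sum: C(K, n) = C(K-1, n) + n/(K-2) * C(K-2, n), with C(1, n) = 1 and
# C(2, n) computed by direct summation. Hedged reconstruction, not
# necessarily the thesis's exact algorithm.
from math import comb, log

def log_nml_normalizer(K, n):
    """log of the NML normalizing sum for K categories and n samples."""
    c_prev = 1.0                                      # C(1, n)
    c_curr = sum(comb(n, k) * (k / n) ** k * ((n - k) / n) ** (n - k)
                 for k in range(n + 1))               # C(2, n)
    if K == 1:
        return log(c_prev)
    for j in range(3, K + 1):
        c_prev, c_curr = c_curr, c_curr + n / (j - 2) * c_prev
    return log(c_curr)

def stochastic_complexity(counts):
    """-log NML(data), i.e. the MDL code length, for category counts."""
    n = sum(counts)
    max_log_lik = sum(h * log(h / n) for h in counts if h > 0)
    return -max_log_lik + log_nml_normalizer(len(counts), n)

print(stochastic_complexity([12, 5, 3]))  # code length in nats
```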

Relevance: 30.00%

Abstract:

In this two-part series of papers, a generalized non-orthogonal amplify-and-forward (GNAF) protocol which generalizes several known cooperative diversity protocols is proposed. Transmission in the GNAF protocol comprises two phases: the broadcast phase and the cooperation phase. In the broadcast phase, the source broadcasts its information to the relays as well as to the destination. In the cooperation phase, the source and the relays together transmit a space-time code in a distributed fashion. The GNAF protocol relaxes the constraints imposed on the code structure by the protocol of Jing and Hassibi. In Part I of this paper, a code design criterion is obtained and it is shown that the GNAF protocol is both delay-efficient and coding-gain-efficient. Moreover, the GNAF protocol enables the use of sphere decoders at the destination with non-exponential maximum-likelihood (ML) decoding complexity. In Part II, several low-decoding-complexity code constructions are studied and a lower bound on the diversity-multiplexing gain tradeoff of the GNAF protocol is obtained.

Relevance: 30.00%

Abstract:

We address the problem of distributed space-time coding with reduced decoding complexity for wireless relay networks. The transmission protocol follows a two-hop model in which the source transmits a vector in the first hop and, in the second hop, each relay transmits a transformation of its received vector by a relay-specific unitary transformation. Design criteria are derived for this system model and codes are proposed that achieve full diversity. For a fixed number of relay nodes, the general system model considered in this paper admits code constructions with lower decoding complexity than codes based on some earlier system models.
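
In symbols, a two-hop model of this kind is commonly written as follows; the notation is ours for illustration and power normalizations are omitted.

```latex
% Two-hop relay model; s is the source vector, h_i and g_i the hop fading
% coefficients, v_i and w noise, and A_i the relay-specific unitary matrix.
\begin{align*}
  \text{Hop 1 (broadcast):}\quad   & r_i = h_i\, s + v_i, \qquad i = 1, \dots, R,\\
  \text{Hop 2 (cooperation):}\quad & y = \sum_{i=1}^{R} g_i\, A_i r_i + w,
                                     \qquad A_i^{\mathsf{H}} A_i = I.
\end{align*}
```

The choice of the matrices A_i fixes the distributed codeword seen at the destination, and with it both the achievable diversity and the decoding complexity.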