942 results for Model evolution
Abstract:
The Business Model Canvas (BMC) assists in the design of companies' business models. As strategies evolve, so too does the business model. Unfortunately, each BMC is a standalone representation. Thus, there is a need to be able to describe the transformation from one version of a business model to the next, as well as to visualize these operations. To address this issue, and to contribute to computer-assisted business model design, we propose a set of design principles for business model evolution. We also demonstrate a tool that can assist in the creation and navigation of business model versions in a visual and user-friendly way.
Abstract:
The DNDC (DeNitrification and DeComposition) model was first developed by Li et al. (1992) as a rain event-driven, process-oriented simulation model for nitrous oxide, carbon dioxide and nitrogen gas emissions from agricultural soils in the U.S. Over the last 20 years, the model has been modified and adapted by various research groups around the world to suit specific purposes and circumstances. The Global Research Alliance Modelling Platform (GRAMP) is a UK-led initiative for the establishment of a purposeful and credible web-based platform initially aimed at users of the DNDC model. With the aim of improving the predictions of soil C and N cycling in the context of climate change, the objectives of GRAMP are to: 1) document the existing versions of the DNDC model; 2) create a family tree of the individual DNDC versions; 3) provide information on model use and development; and 4) identify strengths, weaknesses and potential improvements for the model.
Abstract:
In this paper, we present a framework for pattern-based model evolution approaches in the MDA context. In the framework, users define patterns using a pattern modeling language that is designed to describe software design patterns, and they can use the patterns as rules to evolve their model. In the framework, design model evolution takes place via two steps. The first step is a binding process of selecting a pattern and defining where and how to apply the pattern in the model. The second step is an automatic model transformation that actually evolves the model according to the binding information and the pattern rule. The pattern modeling language is defined in terms of a MOF-based role metamodel, and implemented using an existing modeling framework, EMF, and incorporated as a plugin to the Eclipse modeling environment. The model evolution process is also implemented as an Eclipse plugin. With these two plugins, we provide an integrated framework where defining and validating patterns, and model evolution based on patterns can take place in a single modeling environment.
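The two-step evolution process described above (a binding step followed by an automatic transformation) can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the pattern, role names, and model representation are all invented for the example.

```python
# Hypothetical sketch of pattern-based model evolution: a binding maps
# pattern roles to model elements (step 1), then a transformation rewrites
# the model according to the pattern rule (step 2). All names are invented.

def apply_pattern(model, pattern, binding):
    """Step 2: evolve the model according to the binding and pattern rule."""
    for role, element in binding.items():
        for feature in pattern[role]:       # the pattern's rule for that role
            model.setdefault(element, set()).add(feature)
    return model

# Step 1: the user selects a (toy) Observer pattern and binds its roles
# to elements of the design model.
observer_pattern = {
    "Subject":  {"attach()", "detach()", "notify()"},
    "Observer": {"update()"},
}
binding = {"Subject": "WeatherStation", "Observer": "Display"}

model = {"WeatherStation": {"measure()"}, "Display": {"render()"}}
evolved = apply_pattern(model, observer_pattern, binding)
```

After the transformation, the bound elements carry both their original features and those prescribed by the pattern rule.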
Abstract:
This research explores the business model (BM) evolution process of entrepreneurial companies and investigates the relationship between BM evolution and firm performance. Recently, it has been increasingly recognised that the innovative design (and re-design) of BMs is crucial to the performance of entrepreneurial firms, as BMs can be associated with superior value creation and competitive advantage. However, there has been limited theoretical and empirical evidence in relation to the micro-mechanisms behind the BM evolution process and the entrepreneurial outcomes of BM evolution. This research seeks to fill this gap by opening up the ‘black box’ of the BM evolution process, exploring the micro-patterns that facilitate the continuous shaping, changing, and renewing of BMs and examining how BM evolution creates and captures value in a dynamic manner. Drawing together the BM and strategic entrepreneurship literature, this research seeks to understand: (1) how and why companies introduce BM innovations and imitations; (2) how BM innovations and imitations interplay as patterns in the BM evolution process; and (3) how BM evolution patterns affect firm performance. This research adopts a longitudinal multiple case study design that focuses on the emerging phenomenon of BM evolution. Twelve entrepreneurial firms in the Chinese Online Group Buying (OGB) industry were selected for their continuous and intensive development of BMs and their varying success rates in this highly competitive market. Two rounds of data collection were carried out between 2013 and 2014, generating 31 interviews with founders/co-founders and a total of 5,034 pages of data. Following a three-stage research framework, the data analysis begins by mapping the BM evolution process of the twelve companies and classifying the changes in the BMs into innovations and imitations.
The second stage focuses on the BM level, addressing BM evolution as a dynamic process by exploring how BM innovations and imitations unfold and interplay over time. The final stage focuses on the firm level, providing theoretical explanations as to the effects of BM evolution patterns on firm performance. This research provides new insights into the nature of BM evolution by elaborating on the missing link between BM dynamics and firm performance. The findings identify four patterns of BM evolution that have different effects on a firm’s short- and long-term performance. This research contributes to the BM literature by presenting what the BM evolution process actually looks like. Moreover, it takes a step towards a process theory of the interplay between BM innovations and imitations, which addresses the role of companies’ actions and, more importantly, their reactions to competitors. Insights are also given into how entrepreneurial companies achieve and sustain value creation and capture by successfully combining the BM evolution patterns. Finally, the findings on BM evolution contribute to the strategic entrepreneurship literature by increasing the understanding of how companies compete in a more dynamic and complex environment. It reveals that achieving superior firm performance is more than a simple question of whether to innovate or imitate; rather, it is a matter of integrating innovation and imitation strategies over time. This study concludes with a discussion of the findings and their implications for theory and practice.
Abstract:
Companies are increasingly dependent on distributed web-based software systems to support their businesses. This increases the need to maintain and extend software systems with up-to-date new features. Thus, the development process to introduce new features usually needs to be swift and agile, and the supporting software evolution process needs to be safe, fast, and efficient. However, this is usually a difficult and challenging task for a developer due to the lack of support offered by programming environments, frameworks, and database management systems. Changes needed at the code level, the database model, and the actual data contained in the database must be planned and developed together and executed in a synchronized way. Even under a careful development discipline, the impact of changing an application data model is hard to predict. The lifetime of an application comprises changes and updates designed and tested using data that is usually far from the real, production data. Coding DDL and DML SQL scripts to update the database schema and data is therefore the usual (and hard) approach taken by developers. Such a manual approach is error-prone and disconnected from the real data in production, because developers may not know the exact impact of their changes. This work aims to improve the maintenance process in the context of Agile Platform by Outsystems. Our goal is to design and implement new data-model evolution features that ensure safe support for change and a sound migration process. Our solution includes impact analysis mechanisms targeting the data model and the data itself. This provides developers with a safe, simple, and guided evolution process.
Abstract:
Understanding the evolution of intraspecific variance is a major research question in evolutionary biology. While its importance to processes operating at individual and population levels is well-documented, much less is known about its role in macroevolutionary patterns. Nevertheless, both experimental and theoretical evidence suggest that intraspecific variance is susceptible to selection, can transform into interspecific variation and, therefore, is crucial for macroevolutionary processes. The main objectives of this thesis were: (1) to investigate which factors impact the evolution of intraspecific variation in Polygonaceae and determine whether the evolution of intraspecific variation influences species diversification; and (2) to develop a novel comparative phylogenetic method to model the evolution of intraspecific variation. Using the buckwheat family, Polygonaceae, as a study system, I demonstrated which life-history and ecological traits are relevant to the evolution of intraspecific variation. I analyzed how differential intraspecific variation drives species diversification patterns. I showed with computer simulations the shortcomings of existing comparative methods with respect to intraspecific variation. I developed a novel comparative model that readily incorporates intraspecific variance into phylogenetic comparative methods. The obtained results are complementary, because they affect both empirical and methodological aspects of comparative analysis. Overall, I highlight that intraspecific variation is an important contributor to macroevolutionary patterns and should be explicitly considered in comparative phylogenetic analyses.
Abstract:
This study develops a simplified model describing the evolutionary dynamics of a population composed of obligate sexually and asexually reproducing, unicellular organisms. The model assumes that the organisms have diploid genomes consisting of two chromosomes, and that the sexual organisms replicate by first dividing into haploid intermediates, which then combine with other haploids, followed by the normal mitotic division of the resulting diploid into two new daughter cells. We assume that the fitness landscape of the diploids is analogous to the single-fitness-peak approach often used in single-chromosome studies. That is, we assume a master chromosome that becomes defective with just one point mutation. The diploid fitness then depends on whether the genome has zero, one, or two copies of the master chromosome. We also assume that only pairs of haploids with a master chromosome are capable of combining so as to produce sexual diploid cells, and that this process is described by second-order kinetics. We find that, in a range of intermediate values of the replication fidelity, sexually reproducing cells can outcompete asexual ones, provided the initial abundance of sexual cells is above some threshold value. The range of values where sexual reproduction outcompetes asexual reproduction increases with decreasing replication rate and increasing population density. We critically evaluate a common approach, based on a group selection perspective, used to study the competition between populations and show its flaws in addressing the evolution of sex problem.
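Two of the model ingredients named above, the single-fitness-peak landscape over diploid genomes and the second-order kinetics of haploid fusion, can be written down directly. The sketch below is an illustrative rendering under assumed numeric values; the fitness levels and rate constant are not taken from the study.

```python
# Illustrative sketch of two ingredients of the abstract's model.
# The numeric fitness values and the rate constant k are assumptions.

def diploid_fitness(n_master_copies, w_peak=1.0, w_one=0.5, w_zero=0.1):
    """Single-fitness-peak landscape: a diploid's fitness depends only on
    whether its genome carries zero, one, or two copies of the master
    chromosome (a chromosome becomes defective with one point mutation)."""
    return {2: w_peak, 1: w_one, 0: w_zero}[n_master_copies]

def sexual_fusion_rate(k, master_haploid_density):
    """Second-order kinetics for sexual reproduction: only pairs of
    master-bearing haploids can combine, so the rate of diploid formation
    is quadratic in their density."""
    return k * master_haploid_density ** 2
```

The quadratic fusion rate is what makes the initial abundance of sexual cells matter: below a threshold density, master-bearing haploids meet too rarely for sexual reproduction to keep pace with asexual division.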
Abstract:
Part 8: Business Strategies Alignment
Abstract:
A truly variance-minimizing filter is introduced and its performance is demonstrated with the Korteweg–de Vries (KdV) equation and with a multilayer quasigeostrophic model of the ocean area around South Africa. It is recalled that Kalman-like filters are not variance minimizing for nonlinear model dynamics and that four-dimensional variational data assimilation (4DVAR)-like methods relying on perfect model dynamics have difficulty with providing error estimates. The new method does not have these drawbacks. In fact, it combines advantages from both methods in that it does provide error estimates while automatically having balanced states after analysis, without extra computations. It is based on ensemble or Monte Carlo integrations to simulate the probability density of the model evolution. When observations are available, the so-called importance resampling algorithm is applied. From Bayes’s theorem it follows that each ensemble member receives a new weight dependent on its ‘distance’ to the observations. Because the weights are strongly varying, a resampling of the ensemble is necessary. This resampling is done such that members with high weights are duplicated according to their weights, while low-weight members are largely ignored. In passing, it is noted that data assimilation is not an inverse problem by nature, although it can be formulated that way. Also, it is shown that the posterior variance can be larger than the prior if the usual Gaussian framework is set aside. However, in the examples presented here, the entropy of the probability densities is decreasing. The application to the ocean area around South Africa, governed by strongly nonlinear dynamics, shows that the method is working satisfactorily. The strong and weak points of the method are discussed and possible improvements are proposed.
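The importance-resampling analysis step described in this abstract (Bayesian reweighting of ensemble members, then duplication of high-weight members) can be sketched in a few lines. This is a minimal one-dimensional sketch assuming a Gaussian observation error; it is not the authors' code.

```python
import numpy as np

def importance_resample(ensemble, observation, obs_std, rng):
    """One analysis step of importance resampling: weight each ensemble
    member by its likelihood given the observation (Bayes' theorem), then
    resample so that high-weight members are duplicated and low-weight
    members are largely dropped. Assumes Gaussian observation error."""
    misfit = ensemble - observation
    logw = -0.5 * (misfit / obs_std) ** 2   # log of the Gaussian likelihood
    logw -= logw.max()                      # stabilise the exponential
    weights = np.exp(logw)
    weights /= weights.sum()
    # Multinomial resampling: members are drawn in proportion to weight.
    idx = rng.choice(len(ensemble), size=len(ensemble), p=weights)
    return ensemble[idx]

rng = np.random.default_rng(0)
ens = rng.normal(0.0, 2.0, size=500)        # prior ensemble
post = importance_resample(ens, observation=1.0, obs_std=0.5, rng=rng)
```

The resampled ensemble is pulled toward the observation and its spread shrinks, in line with the weighting step the abstract describes; note also the abstract's point that outside the Gaussian framework the posterior variance need not shrink.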
Abstract:
A smoother introduced earlier by van Leeuwen and Evensen is applied to a problem in which real observations are used in an area with strongly nonlinear dynamics. The derivation is new, but it resembles an earlier derivation by van Leeuwen and Evensen. Again a Bayesian view is taken in which the prior probability density of the model and the probability density of the observations are combined to form a posterior density. The mean and the covariance of this density give the variance-minimizing model evolution and its errors. The assumption is made that the prior probability density is a Gaussian, leading to a linear update equation. Critical evaluation shows when the assumption is justified. This also sheds light on why Kalman filters, in which the same approximation is made, work for nonlinear models. By reference to the derivation, the impact of model and observational biases on the equations is discussed, and it is shown that Bayes’s formulation can still be used. A practical advantage of the ensemble smoother is that no adjoint equations have to be integrated and that error estimates are easily obtained. The present application shows that for process studies a smoother will give superior results compared to a filter, not only owing to the smooth transitions at observation points, but also because the origin of features can be followed back in time. Also its preference over a strong-constraint method is highlighted. Furthermore, it is argued that the proposed smoother is more efficient than gradient descent methods or than the representer method when error estimates are taken into account.
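The linear update equation that follows from the Gaussian-prior assumption can be illustrated with ensemble statistics. The sketch below is a generic ensemble-based linear (Kalman-type) analysis step consistent with that assumption, with an invented toy setup; it is not the smoother implementation from the paper.

```python
import numpy as np

def ensemble_linear_update(E, y, H, R):
    """Linear analysis step under a Gaussian prior: the prior ensemble E
    (n_members x n_state) supplies the mean and covariance, and every
    member is shifted toward the observations y through the gain matrix.
    H is the (linear) observation operator, R the observation-error
    covariance. A minimal sketch, not the authors' code."""
    A = E - E.mean(axis=0)              # ensemble anomalies
    P = A.T @ A / (len(E) - 1)          # ensemble covariance estimate
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # gain matrix
    return E + (y - E @ H.T) @ K.T      # linear update of each member

rng = np.random.default_rng(1)
E = rng.normal(0.0, 1.0, size=(200, 2))   # prior ensemble, 2-d state
H = np.array([[1.0, 0.0]])                # observe the first component only
R = np.array([[0.1]])                     # observation-error variance
Ea = ensemble_linear_update(E, np.array([2.0]), H, R)
```

Because the update is linear in the innovation, the posterior mean moves toward the observation and the posterior spread shrinks, and no adjoint integrations are needed, which is the practical advantage the abstract points out.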
Abstract:
Purpose – The strategic management literature lacks a comprehensive explanation as to why seemingly similar business models in the same industry perform differently. This paper strives to explain this phenomenon. Design/methodology/approach – The model is conceptualized and accompanied by a case study on the airline industry to explain knowledge brokerage that creates value from the effective utilization of knowledge resources acquired from intra- and inter-firm environments. Findings – The model explains a cyclical view of business model flexibility in which the knowledge-based resource accumulation of the business model is spread across the intra- and inter-firm environments. Knowledge brokerage strategies from the inter- and intra-firm environments result in improved performance of the business model. The flexibility that the business model acquires is determined by how efficiently resource accumulation is aligned with its external environment. Originality/value – The paper effectively integrates the concepts of knowledge brokerage and business models from a resource accumulation-based view and simultaneously arrives at the performance heterogeneity of seemingly similar business models within the same industry. It has performance implications for firms that start out without any distinct resources of their own, or that use an imitated business model, to attain better performance through business model evolution aligned with successful knowledge brokerage strategies. It adds to the resource accumulation literature by explaining how resources can be effectively acquired to create value.
Abstract:
Analogue and finite element numerical models with frictional and viscous properties are used to model thrust wedge development. Comparison between model types yields valuable information about analogue model evolution, scaling laws and the relative strengths and limitations of the techniques. Both model types show a marked contrast in structural style between ‘frictional-viscous domains’ underlain by a thin viscous layer and purely ‘frictional domains’. Closely spaced thrusts form a narrow and highly asymmetric fold-and-thrust belt in the frictional domain, characterized by in-sequence propagation of forward thrusts. In contrast, the frictional-viscous domain shows a wide and low taper wedge and a thrust belt with a more symmetrical vergence, with both forward and back thrusts. The frictional-viscous domain numerical models show that the viscous layer initially simple shears as deformation propagates along it, while localized deformation resulting in the formation of a pop-up structure occurs in the overlying frictional layers. In both domains, thrust shear zones in the numerical model are generally steeper than the equivalent faults in the analogue model, because the finite element code uses a non-associated plasticity flow law. Nevertheless, the qualitative agreement between analogue and numerical models is encouraging. It shows that the continuum approximation used in numerical models can be used to model frictional materials, such as sand, provided caution is taken to properly scale the experiments, and some of the limitations are taken into account.
Abstract:
This paper presents a way to describe design patterns rigorously based on role concepts. Rigorous pattern descriptions are a key aspect for patterns to be used as rules for model evolution in the MDA context, for example. We formalize the role concepts commonly used in defining design patterns as a role metamodel using Object-Z. Given this role metamodel, individual design patterns are specified generically as a formal pattern role model using Object-Z. We also formalize the properties that must be captured in a class model when a design pattern is deployed. These properties are defined generically in terms of role bindings from a pattern role model to a class model. Our work provides a precise but abstract approach for pattern definition and also provides a precise basis for checking the validity of pattern usage in designs.
Abstract:
This paper presents a formal but practical approach for defining and using design patterns. Initially we formalize the concepts commonly used in defining design patterns using Object-Z. We also formalize consistency constraints that must be satisfied when a pattern is deployed in a design model. Then we implement the pattern modeling language and its consistency constraints using an existing modeling framework, EMF, and incorporate the implementation as plug-ins to the Eclipse modeling environment. While the language is defined formally in terms of Object-Z definitions, the language is implemented in a practical environment. Using the plug-ins, users can develop precise pattern descriptions without knowing the underlying formalism, and can use the tool to check the validity of the pattern descriptions and pattern usage in design models. In this work, formalism brings precision to the pattern language definition and its implementation brings practicability to our pattern-based modeling approach.
Abstract:
This work is concerned with approximate inference in dynamical systems, from a variational Bayesian perspective. When modelling real-world dynamical systems, stochastic differential equations appear as a natural choice, mainly because of their ability to model the noise of the system by adding a variation of some stochastic process to the deterministic dynamics. Hence, inference in such processes has drawn much attention. Here a new extended framework is derived that is based on a local polynomial approximation of a recently proposed variational Bayesian algorithm. The paper begins by showing that the new extension of this variational algorithm can be used for state estimation (smoothing) and converges to the original algorithm. However, the main focus is on estimating the (hyper-) parameters of these systems (i.e. drift parameters and diffusion coefficients). The new approach is validated on a range of different systems which vary in dimensionality and non-linearity. These are the Ornstein–Uhlenbeck process, the exact likelihood of which can be computed analytically, the univariate and highly non-linear stochastic double well, and the multivariate chaotic stochastic Lorenz ’63 (3D model). As a special case the algorithm is also applied to the 40-dimensional stochastic Lorenz ’96 system. In our investigation we compare this new approach with a variety of other well-known methods, such as the hybrid Monte Carlo, dual unscented Kalman filter, and full weak-constraint 4D-Var algorithm, and analyse empirically their asymptotic behaviour as the observation density or the length of the time window increases. In particular we show that we are able to estimate parameters in both the drift (deterministic) and the diffusion (stochastic) part of the model evolution equations using our new methods.
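One of the benchmark systems named in this abstract, the univariate stochastic double well, is easy to simulate directly. The sketch below integrates a standard double-well SDE with the Euler–Maruyama scheme; the particular drift form dx = theta·x·(1 − x²) dt + sigma dW and the parameter values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def simulate_double_well(theta, sigma, x0=0.0, dt=0.01, n_steps=5000, seed=0):
    """Euler-Maruyama integration of a stochastic double-well system,
    dx = theta * x * (1 - x**2) dt + sigma dW.
    theta is the drift parameter and sigma the diffusion coefficient --
    the two kinds of quantities the variational method estimates.
    Drift form and parameter values are illustrative assumptions."""
    rng = np.random.default_rng(seed)
    x = np.empty(n_steps + 1)
    x[0] = x0
    for k in range(n_steps):
        drift = theta * x[k] * (1.0 - x[k] ** 2)   # deterministic part
        x[k + 1] = x[k] + drift * dt + sigma * np.sqrt(dt) * rng.normal()
    return x

path = simulate_double_well(theta=4.0, sigma=0.5)
# Trajectories settle near one of the two wells at x = +1 or x = -1,
# with occasional noise-driven transitions between them.
```

Inference methods of the kind the abstract compares would observe such a path (possibly sparsely and with noise) and attempt to recover both theta (drift) and sigma (diffusion).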