25 results for Markov chains. Convergence. Evolutionary Strategy. Large Deviations
Abstract:
A strategy process was completed in the ESF project “Promotion of Work-related Immigration”, implemented at the Centre for Economic Development, Transport and the Environment for North Ostrobothnia, and an immigration strategy for Northern Ostrobothnia was drawn up on the basis of the process. Information was collected about the situation in Northern Ostrobothnia from the point of view of immigration and the future availability of labour, with the intention of using it as background material for the strategy. Employers’ need for support in recruiting foreign labour was investigated with a broad inquiry, to which 1000 respondents replied. The strategy process was carried out together with an outside consultant (Net Effect Oy) by arranging three workshops and a seminar where the workshop results were summarised. A large number of companies, authorities, municipalities, associations, project actors and immigrants engaged in immigration issues participated in the workshops. The draft strategy is based on their experiences of immigration and on statistical data, background inquiries and surveys. To ensure the accuracy of the draft strategy, comments were requested from several parties and received from 64 organisations. The core of the immigration strategy consists of an initial analysis, values, a vision and priorities. The strategy is composed of three priorities. The key aim of the priority Internationalisation and Supporting Diversity is to support diversity in schools, workplaces and people’s everyday lives, e.g. through attitude development and by promoting internationalisation in companies and education institutions. The aim of the priority Supporting Entrepreneurship and Recruiting Foreign Labour is to promote entrepreneurship among immigrants and the recruitment of foreign labour, and to develop the forecasting of educational needs.
The priority Developing Integration Services, Regional Cooperation and Networks, in turn, seeks to develop the service structure and policies of immigrant integration and to increase cooperation and exchange of information between regional actors engaged in integration issues. The aim is to use the strategy as a guideline document for immigration issues in Northern Ostrobothnia. The strategy is used to coordinate the existing organisations and operations dealing with immigration issues. In addition, it contains a future-oriented focus and underlines the management of new immigration projects and operations. The main party responsible for the implementation of the strategy is the Immigration Committee. In addition, responsible parties have been assigned to each measure. The implementation of the immigration strategy will be monitored annually on the basis of indicators.
Abstract:
According to several surveys and observations, the percentage of IT projects completed successfully, without budget overruns or schedule delays, is extremely low. Many projects are also evaluated as failures in terms of delivered functionality. Nuldén (1996) compares IT projects with bad movies: after watching for two hours, one still tries to finish it even though one understands that it is a complete waste of time. The argument is 'I've already invested too much time to terminate it now'. The same happens with IT projects: sometimes a company continues wasting money on such projects for a long time, even though no benefits are expected from them. Eventually these projects are terminated anyway, but until that moment, the company spends a lot. The situation described above is a consequence of “escalation of commitment”: project continuation even after a manager receives negative feedback about the project’s probability of success. According to Keil and Mähring (2010), even though escalation can occur in any type of project, it is more common among complex technological projects, such as IT projects. Escalation of commitment very often results in runaway projects. In order to avoid it, managers use de-escalation strategies, which allow resources to be used more effectively. These strategies lead to project termination or turnaround, which stops the flow of wasted investments. A number of studies explore the escalation of commitment phenomenon based on experiments and business cases. Moreover, during the last decade several frameworks have been proposed for de-escalation strategies. However, there is no evidence in the literature of a successful implementation of a de-escalation of commitment strategy. In addition, despite the fact that IT project management methodologies are widely used in companies, none of them covers the topic of escalation of commitment risks.
At the same time, no research proposes a way to implement a de-escalation of commitment strategy within an existing project management methodology. This research focuses on a single case of a large ERP implementation project at a consulting company. Hence, the main deliverables of the study include suggestions for improving de-escalation methods and techniques in the project and in the company. Moreover, a way to embed these methods into the existing project management methodology and into the company’s general policies is proposed.
Abstract:
Social media is a multidimensional marketing and communications channel which can support and enhance a business’ reputation, sales and even longevity. Social media as a business tool encourages interaction between customers and companies, which gives a company opportunities to better understand its customers, to target them more effectively and to collaborate and create dialogues with them in ways that are not possible through traditional media channels. The aim of a social media strategy is to increase brand awareness, image, loyalty and recognition. The peer networks that social media creates allow a company to disseminate information through loyal customers to new and prospective customers, ultimately increasing reach. The purpose of the study is to understand the marketer’s perspective on social media marketing and how it is currently utilized in marketing and communications activities in Finland. Three companies were interviewed, covering fourteen different implementations of social media marketing campaigns. These were then analysed to ascertain the utilization methods and experience gained on recent campaigns in the Finnish market. The utilization of social media marketing was analysed using the methods of thematic analysis and inductive and abductive reasoning. Elements and themes were drawn out of the separate interviews to create a framework with which to explore, evaluate and match theories that define social media usage by companies. It became clear from all of the interviews that social media as a tool is most effective when it captures the viewer’s interest through rich and entertaining content. This directed the theoretical research towards Engagement Theory and Content Marketing, which emphasize the importance of communities, collaboration, interaction and peer-sharing as the key drivers of a social media marketing campaign.
Abstract:
The behavioural finance literature expects systematic and significant deviations from efficiency to persist in securities markets due to behavioural and cognitive biases of investors. These behavioural models attempt to explain the coexistence of intermediate-term momentum and long-term reversals in stock returns based on systematic violations of rational investor behaviour. The study investigates the anchoring bias of investors and the profitability of the 52-week high momentum strategy (GH henceforward). The relatively highly volatile OMX Helsinki stock exchange is a suitable market for examining the momentum effect, since international investors tend to unwind their positions in the most peripheral securities markets first in times of market turbulence. Empirical data are collected from Thomson Reuters Datastream and the OMX Nordic website. The objective of the study is to provide a thorough analysis by formulating a self-financing GH momentum portfolio. First, the seasonality of the strategy is examined by taking the January effect into account and investigating long-term abnormal returns. The results indicate that the GH strategy yields significantly negative returns in January, but the strategy is not prone to reversals in the long term. Then the predictive proxies of momentum returns are investigated in terms of acquisition prices and 52-week high statistics as anchors. The results show that acquisition prices do not have explanatory power over the GH strategy’s abnormal returns. Finally, the efficacy of the GH strategy is examined after taking transaction costs into account, finding that the robust abnormal returns remain statistically significant despite the transaction costs. In conclusion, the relative distance between a stock’s current price and its 52-week high explains the profits of momentum investing to a high degree. The results indicate that intermediate-term momentum and long-term reversals are separate phenomena.
This presents a challenge to current behavioural theories, which model these aspects of stock returns as subsequent components of how securities markets respond to relevant information.
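The ranking measure behind the GH strategy, a stock's current price divided by its 52-week high, can be sketched as follows. This is an illustrative toy, not the thesis's code: the 252-trading-day window, the price histories and the stock names are all assumptions made for the example.

```python
# Illustrative sketch of the George & Hwang 52-week-high ranking measure:
# current price divided by the highest price over the past 52 weeks
# (approximated here as 252 trading days).

def gh_ratio(prices):
    """Return the last price divided by the 52-week (252-day) high."""
    window = prices[-252:] if len(prices) >= 252 else prices
    return prices[-1] / max(window)

# Toy example: three hypothetical stocks with simple price histories.
histories = {
    "A": [10.0] * 251 + [9.5],        # slightly below its high
    "B": [20.0] * 200 + [12.0] * 52,  # far below its high
    "C": [5.0] * 251 + [5.0],         # exactly at its high
}

# Stocks closest to their 52-week high (the long leg of the
# self-financing portfolio) rank first; those furthest away (the
# short leg) rank last.
ranked = sorted(histories, key=lambda s: gh_ratio(histories[s]), reverse=True)
print(ranked)  # → ['C', 'A', 'B']
```

In the actual strategy, the cross-section of these ratios would be recomputed at each formation date and the extreme deciles held over the subsequent holding period.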
Abstract:
Developing software is a difficult and error-prone activity, and the complexity of modern computer applications is significant. Hence, an organised approach to software construction is crucial. Stepwise Feature Introduction – created by R.-J. Back – is a development paradigm in which software is constructed by adding functionality in small increments. The resulting code has an organised, layered structure and can be easily reused. Moreover, interaction with the users of the software and correctness concerns are essential elements of the development process, contributing to the high quality and functionality of the final product. The paradigm of Stepwise Feature Introduction has been successfully applied in an academic environment to a number of small-scale developments. The thesis examines the paradigm and its suitability for the construction of large and complex software systems by focusing on the development of two software systems of significant complexity. Throughout the thesis we propose a number of improvements and modifications that should be applied to the paradigm when developing or reengineering large and complex software systems. The discussion in the thesis covers various aspects of software development that relate to Stepwise Feature Introduction. More specifically, we evaluate the paradigm against the common practices of object-oriented programming and design and of agile development methodologies. We also outline a strategy for testing systems built with the paradigm of Stepwise Feature Introduction.
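The layered construction that Stepwise Feature Introduction prescribes can be illustrated with a minimal sketch in which each layer introduces one feature on top of a previously working layer. The class and method names below are hypothetical, invented for the example rather than taken from the thesis.

```python
# A minimal sketch of the Stepwise Feature Introduction idea: each layer
# adds one small feature on top of the previous layer, so every layer is
# a complete, working system in its own right.

class CounterLayer1:
    """Base layer: a counter that can only increment."""
    def __init__(self):
        self.value = 0

    def increment(self):
        self.value += 1

class CounterLayer2(CounterLayer1):
    """Second layer: introduces decrement without modifying layer 1."""
    def decrement(self):
        self.value -= 1

class CounterLayer3(CounterLayer2):
    """Third layer: introduces reset, again as a small increment."""
    def reset(self):
        self.value = 0

# Each lower layer remains usable on its own; the top layer inherits
# all previously introduced features.
c = CounterLayer3()
c.increment(); c.increment(); c.decrement()
print(c.value)  # → 1
```

Because every layer must preserve the behaviour of the layers beneath it, testing can proceed layer by layer, which is one reason the paradigm yields a reusable, organised structure.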
Abstract:
The open innovation paradigm states that the boundaries of the firm have become permeable, allowing knowledge to flow inwards and outwards to accelerate internal innovation and to take unused knowledge to the external environment, respectively. The successful implementation of open innovation practices in firms like Procter & Gamble, IBM and Xerox, among others, suggests that it is a sustainable trend which could provide a basis for achieving competitive advantage. However, implementing open innovation can be a complex process which involves several domains of management, and whose terminology, classification and practices have not been fully agreed upon. Thus, with many possible ways to address open innovation, the following research question was formulated: How could Ericsson LMF assess which open innovation mode to select depending on the attributes of the project at hand? The research followed the constructive research approach, which has the following steps: find a practically relevant problem, obtain a general understanding of the topic, innovate the solution, demonstrate that the solution works, show the theoretical contributions, and examine the scope of applicability of the solution. The research involved three phases of data collection and analysis: an extensive literature review of open innovation, strategy, business models, innovation and knowledge management; direct observation of the environment of the case company through participative observation; and semi-structured interviews based on six cases involving multiple and heterogeneous open innovation initiatives. Results from the cases suggest that the selection of modes depends on multiple factors, with a stronger influence of factors related to strategy, business models and resource gaps. Based on these and other factors found in the literature review and observations, it was possible to construct a model that supports approaching open innovation.
The model integrates perspectives from multiple domains of the literature review, observations inside the case company, and factors from the six open innovation cases. It provides steps, guidelines and tools for approaching open innovation and assessing the selection of modes. Measuring the impact of open innovation can take years; thus, implementing and testing the model in its entirety was not possible due to time limitations. Nevertheless, it was possible to validate the core elements of the model with empirical data gathered from the cases. In addition to constructing the model, this research contributed to the literature by increasing the understanding of open innovation, providing suggestions to the case company, and proposing future steps.
Abstract:
This thesis is concerned with state and parameter estimation in state space models. The estimation of states and parameters is an important task when mathematical modeling is applied in many different application areas, such as global positioning systems, target tracking, navigation, brain imaging, the spread of infectious diseases, biological processes, telecommunications, audio signal processing, stochastic optimal control, machine learning, and physical systems. In the Bayesian setting, the estimation of states or parameters amounts to the computation of the posterior probability density function. Except for a very restricted class of models, it is impossible to compute this density function in closed form; hence, we need approximation methods. A state estimation problem involves estimating the states (latent variables) that are not directly observed in the output of the system. In this thesis, we use the Kalman filter, the extended Kalman filter, Gauss–Hermite filters, and particle filters to estimate the states based on the available measurements. Among these filters, particle filters are numerical methods for approximating the filtering distributions of non-linear, non-Gaussian state space models via Monte Carlo. The performance of a particle filter depends heavily on the chosen importance distribution; for instance, an inappropriate choice of importance distribution can lead to failure of convergence of the particle filter algorithm. In this thesis, we analyze the theoretical Lᵖ particle filter convergence with general importance distributions, where p ≥ 2 is an integer. A parameter estimation problem is concerned with inferring the model parameters from measurements. For high-dimensional complex models, the estimation of parameters can be carried out by Markov chain Monte Carlo (MCMC) methods. In its operation, an MCMC method requires the unnormalized posterior distribution of the parameters and a proposal distribution.
In this thesis, we show how the posterior density function of the parameters of a state space model can be computed by filtering-based methods, where the states are integrated out. This type of computation is then applied to estimate the parameters of stochastic differential equations. Furthermore, we compute the partial derivatives of the log-posterior density function and use hybrid Monte Carlo and scaled conjugate gradient methods to infer the parameters of stochastic differential equations. The computational efficiency of MCMC methods depends heavily on the chosen proposal distribution. A commonly used proposal distribution is the Gaussian, in which case the covariance matrix must be well tuned; adaptive MCMC methods can be used for this. In this thesis, we propose a new way of updating the covariance matrix using the variational Bayesian adaptive Kalman filter algorithm.
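The simplest particle filter of the kind discussed above is the bootstrap filter, which uses the state-transition prior as the importance distribution. A minimal sketch for a toy one-dimensional linear-Gaussian model is shown below; the model, its parameters and the observation sequence are illustrative assumptions, not taken from the thesis.

```python
# A minimal bootstrap particle filter sketch for the toy model
#   x_t = a * x_{t-1} + N(0, q),   y_t = x_t + N(0, r).
# The transition prior serves as the importance distribution, so the
# weights reduce to the observation likelihood.
import math
import random

def bootstrap_pf(ys, n=500, a=0.9, q=1.0, r=1.0):
    """Return the filtered mean of x_t for each observation in ys."""
    parts = [random.gauss(0.0, 1.0) for _ in range(n)]  # initial particles
    means = []
    for y in ys:
        # 1. Propagate particles through the transition prior.
        parts = [a * x + random.gauss(0.0, math.sqrt(q)) for x in parts]
        # 2. Weight by the Gaussian observation likelihood.
        ws = [math.exp(-0.5 * (y - x) ** 2 / r) for x in parts]
        total = sum(ws)
        if total == 0.0:              # guard against weight underflow
            ws, total = [1.0] * n, float(n)
        ws = [w / total for w in ws]
        means.append(sum(w * x for w, x in zip(ws, parts)))
        # 3. Multinomial resampling to counter weight degeneracy.
        parts = random.choices(parts, weights=ws, k=n)
    return means

random.seed(0)
est = bootstrap_pf([0.5, 1.0, 1.5, 1.2])
print(est)
```

With a better-matched importance distribution (e.g. one conditioned on the current observation), the weight variance shrinks, which is exactly why the choice of importance distribution drives the convergence behaviour analysed in the thesis.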
Abstract:
The goal of this thesis is to identify problems and bottlenecks related to value chains and networks in the initiation and implementation of intelligent packaging. The research is based on interviews in different case companies and is qualitative by nature. The interview results are examined through a framework built upon relevant theory, with the aim of presenting a useful recommendation to a supplier company for advancing the intelligent packaging business. The perspective attained through the research questions demonstrates the potential customer companies’ views of possibilities and problems. The key results suggest that, from the customers’ perspective, the intellectual property of the relevant products plays an important role. If the supplier does not own a product technology, a sufficiently large company can consider acting as an integrator in a network in which smaller companies contribute to a compiled offering drawn from other smaller actors. The foundation for these networks and company relationships is value creation, which has to be based on profound customer knowledge and research. The framework created for this study builds upon earlier research to provide a model that better serves intelligent packaging implementation and incorporates the importance of the value proposition and of continuous value co-creation.
Abstract:
This thesis concerns the analysis of epidemic models. We adopt the Bayesian paradigm and develop suitable Markov chain Monte Carlo (MCMC) algorithms. This is done by considering the 1995 Ebola outbreak in the Democratic Republic of Congo (former Zaïre) as a case of the SEIR class of epidemic models. We model the Ebola epidemic deterministically using ODEs and stochastically through SDEs, the latter to take into account possible bias in each compartment. Since the model has unknown parameters, we use different methods to estimate them, such as least squares, maximum likelihood and MCMC. The motivation for choosing MCMC over the other methods in this thesis is its ability to tackle complicated nonlinear problems with a large number of parameters. First, in the deterministic Ebola model, we compute the likelihood function by the sum-of-squared-residuals method and estimate the parameters using the LSQ and MCMC methods. We sample the parameters and then use them to calculate the basic reproduction number and to study the disease-free equilibrium. From the chain sampled from the posterior, we run convergence diagnostics and confirm the viability of the model. The results show that the Ebola model fits the observed onset data with high precision, and all the unknown model parameters are well identified. Second, we convert the ODE model into an SDE Ebola model. We compute the likelihood function using the extended Kalman filter (EKF) and estimate the parameters again. The motivation for using the SDE formulation here is to account for the impact of modelling errors; moreover, the EKF approach allows us to formulate a filtered likelihood for the parameters of such a stochastic model. We use the MCMC procedure to obtain the posterior distributions of the parameters of the drift and diffusion parts of the SDE Ebola model. In this thesis, we analyse two cases: (1) the model error covariance matrix of the dynamic noise is close to zero, i.e. only a small amount of stochasticity is added into the model.
The results are then similar to those obtained from the deterministic Ebola model, even though the methods of computing the likelihood function are different. (2) The model error covariance matrix is different from zero, i.e. considerable stochasticity is introduced into the Ebola model. This accounts for the situation where we know that the model is not exact. As a result, we obtain parameter posteriors with larger variances; consequently, the model predictions show larger uncertainties, in accordance with the assumption of an incomplete model.
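A deterministic SEIR model of the kind described can be sketched as a simple forward-Euler integration of the compartment ODEs. The parameter values below are invented for illustration; they are not the thesis's fitted Ebola estimates, and a real fit would integrate with a proper ODE solver.

```python
# Illustrative deterministic SEIR model, integrated with forward Euler.
#   S' = -beta*S*I/N,  E' = beta*S*I/N - sigma*E,
#   I' = sigma*E - gamma*I,  R' = gamma*I.
# Parameter values are made up for the sketch.

def seir(beta=0.3, sigma=0.2, gamma=0.1, n=1_000_000, days=200, dt=0.1):
    """Integrate the SEIR ODEs and return the final (S, E, I, R)."""
    s, e, i, r = n - 1.0, 0.0, 1.0, 0.0   # start with one infectious case
    for _ in range(int(days / dt)):
        new_exposed = beta * s * i / n     # force of infection
        ds = -new_exposed
        de = new_exposed - sigma * e
        di = sigma * e - gamma * i
        dr = gamma * i
        s += ds * dt; e += de * dt; i += di * dt; r += dr * dt
    return s, e, i, r

s, e, i, r = seir()
# The four derivatives sum to zero, so the scheme conserves N exactly
# (up to floating-point rounding), and R0 = beta/gamma here.
print(s + e + i + r)
```

In the thesis's setting, such a solver sits inside the likelihood: each proposed parameter vector is integrated forward and compared with the observed onset data (via sums of squared residuals, or via the EKF-filtered likelihood in the SDE case) before the MCMC accept/reject step.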
Abstract:
There are more than 7000 languages in the world, and many of these have emerged through linguistic divergence. While questions related to the drivers of linguistic diversity have been studied before, including studies with quantitative methods, there is no consensus as to which factors drive linguistic divergence, and how. In the thesis, I have studied linguistic divergence with a multidisciplinary approach, applying the framework and quantitative methods of evolutionary biology to language data. With quantitative methods, large datasets may be analyzed objectively, while approaches from evolutionary biology make it possible to revisit old questions (related to, for example, the shape of the phylogeny) with new methods, and adopt novel perspectives to pose novel questions. My chief focus was on the effects exerted on the speakers of a language by environmental and cultural factors. My approach was thus an ecological one, in the sense that I was interested in how the local environment affects humans and whether this human-environment connection plays a possible role in the divergence process. I studied this question in relation to the Uralic language family and to the dialects of Finnish, thus covering two different levels of divergence. However, as the Uralic languages have not previously been studied using quantitative phylogenetic methods, nor have population genetic methods been previously applied to any dialect data, I first evaluated the applicability of these biological methods to language data. I found the biological methodology to be applicable to language data, as my results were rather similar to traditional views as to both the shape of the Uralic phylogeny and the division of Finnish dialects. I also found environmental conditions, or changes in them, to be plausible inducers of linguistic divergence: whether in the first steps in the divergence process, i.e. dialect divergence, or on a large scale with the entire language family. 
My findings concerning the Finnish dialects led me to conclude that the functional connection between linguistic divergence and environmental conditions may arise through human cultural adaptation to varying environmental conditions. This is also one possible explanation on the scale of the Uralic language family as a whole. The results of the thesis bring insights into several different issues in both a local and a global context. First, they shed light on the emergence of the Finnish dialects. If the approach used in the thesis is applied to the dialects of other languages, broader generalizations may be drawn as to the inducers of linguistic divergence. This again brings us closer to understanding the global patterns of linguistic diversity. Secondly, the quantitative phylogeny of the Uralic languages, with estimated times of language divergences, yields another hypothesis as to the shape and age of the language family tree. In addition, the Uralic languages can now be added to the growing list of language families studied with quantitative methods. This will allow broader inferences as to global patterns of language evolution, and more language families can be included in constructing the tree of the world’s languages. Studying history through language, however, is only one way to illuminate the human past. Therefore, thirdly, the findings of the thesis, when combined with studies of other language families and with those in, for example, genetics and archaeology, bring us closer to an understanding of human history.